Review

Embodying Emotion

Science  18 May 2007:
Vol. 316, Issue 5827, pp. 1002-1005
DOI: 10.1126/science.1136930

Abstract

Recent theories of embodied cognition suggest new ways to look at how we process emotional information. The theories suggest that perceiving and thinking about emotion involve perceptual, somatovisceral, and motoric reexperiencing (collectively referred to as “embodiment”) of the relevant emotion in one's self. The embodiment of emotion, when induced in human participants by manipulations of facial expression and posture in the laboratory, causally affects how emotional information is processed. Congruence between the recipient's bodily expression of emotion and the sender's emotional tone of language, for instance, facilitates comprehension of the communication, whereas incongruence can impair comprehension. Taken together, recent findings provide a scientific account of the familiar contention that “when you're smiling, the whole world smiles with you.”

Here is a thought experiment: A man goes into a bar to tell a new joke. Two people are already in the bar. One is smiling and one is frowning. Who is more likely to “get” the punch line and appreciate his joke? Here is another: Two women are walking over a bridge. One is afraid of heights, so her heart pounds and her hands tremble. The other is not afraid at all. On the other side of the bridge, they encounter a man. Which of the two women is more likely to believe that she has just met the man of her dreams?

You probably guessed that the first person of the pair described in each problem was the right answer. Now consider the following experimental findings:

  1. While adopting either a conventional working posture or one of two so-called ergonomic postures, one in which the back was straight and the shoulders were held high and back, and the other in which the shoulders and head were slumped, experimental participants learned that they had succeeded on an achievement test completed earlier. Those who received the good news in the slumped posture felt less proud and reported being in a worse mood than participants in the upright or working posture (1).

  2. Images that typically evoke emotionally “positive” and “negative” responses were presented on a computer screen. Experimental participants were asked to indicate when a picture appeared by quickly moving a lever. Some participants were instructed to push a lever away from their body, whereas others were told to pull a lever toward their body. Participants who pushed the lever away responded to negative images faster than to positive images, whereas participants who pulled the lever toward themselves responded faster to positive images (2).

  3. Under the guise of studying the quality of different headphones, participants were induced either to nod in agreement or to shake their heads in disagreement. While they were “testing” their headphones with one of these two movements, the experimenter placed a pen on the table in front of them. Later, a different experimenter offered the participants the pen that had been placed on the table earlier or a novel pen. Individuals who were nodding their heads preferred the old pen, whereas participants who had been shaking their heads preferred the new one (3).

All of these studies show that there is a reciprocal relationship between the bodily expression of emotion and the way in which emotional information is attended to and interpreted (Fig. 1). Charles Darwin himself defined attitude as a collection of motor behaviors (especially posture) that conveys an organism's emotional response toward an object (4). Thus, it would not have come as any surprise to him that the human body is involved in the acquisition and use of attitudes and preferences. Indeed, one speculates that Darwin would be satisfied to learn that research reveals that (i) when individuals adopt emotion-specific postures, they report experiencing the associated emotions; (ii) when individuals adopt facial expressions or make emotional gestures, their preferences and attitudes are influenced; and (iii) when individuals' motor movements are inhibited, interference in the experience of emotion and processing of emotional information is observed (5). The causal relationship between embodying emotions, feeling emotional states, and acquiring and using information about emotion is currently the subject of a substantial amount of research in psychology and neuroscience. The way to understand this relationship between bodily states of emotion and the manner in which humans encode, represent, and use emotional information is the topic of this article. In particular, I discuss insights that have been stimulated by theories of embodied cognition and show how such theories account for the embodiment effects that you and Darwin might have been able to intuit.

Fig. 1.

Two ways in which facial expression has been manipulated in behavioral experiments. (Top) In order to manipulate contraction of the brow muscle in a simulation of negative affect, researchers have affixed golf tees to the inside of participants' eyebrows (42). Participants in whom negative emotion was induced were instructed to bring the ends of the golf tees together, as in the right panel. [Photo credit: Psychology Press]. (Bottom) In other research, participants either held a pen between the lips to inhibit smiling, as in the left panel, or else held the pen between the teeth to facilitate smiling (39).

Emotions and Theories of Embodied Cognition

Until recently, psychologists and cognitive scientists spent little effort on developing complete models of the mental processing of emotional information. This is true despite the fact that such information prioritizes attention (6), access to word meaning (7), and the organization of material in memory (8). For many scientists, emotion has simply seemed too fraught with difficulties to be a tractable topic of study.

One way to avoid the problems in studying emotions is to make them go away. Classic models of information processing in the cognitive sciences represent sensory, motor, and emotional experience in a form stripped of its perceptual and experiential basis. In such models, largely inspired by the metaphor of “mind as computer,” information taken in by the different sense modalities is preserved in memory in the form of abstract symbols. These are stored in a manner that is functionally separated from the original neural systems (those involved in vision, olfaction, and audition, for example) that encoded them in the first place [(9, 10); see (11) and (12) for discussion]. Such information-processing models render what individuals know about emotion equivalent to what they know about most other things. Conveniently, the models also do away with the priority of emotion in information processing. On this view, the sensory, motor, and affective systems are not required for thinking or language use.

There are other ways to think about information processing, and these ways are clustered under the label “theories of embodied cognition.” Although this approach provides an original perspective and is based on methodological and technological innovation, the basic idea is actually very old (13). The assertion common to recent instantiations of such theories is that high-level cognitive processes (such as thought and language) use partial reactivations of states in sensory, motor, and affective systems to do their jobs (14). Put another way, the grounding for knowledge—what it refers to—is the original neural state that occurred when the information was initially acquired. If this is true, then using knowledge is a lot like reliving past experience in at least some (and sometimes all) of its sensory, motor, and affective modalities: The brain captures modality-specific states during perception, action, and interoception and then reinstantiates parts of the same states to represent knowledge when needed.

Theories of embodied cognition have now been applied to provide rigorous accounts of emotion and the processing of information about emotion (5, 15). In this regard, experiencing an emotion, perceiving an emotional stimulus, and retrieving an emotional memory all involve highly overlapping mental processes. One schematic way that this might work is illustrated in Fig. 2. As depicted, the perception of an emotional stimulus, such as a snarling bear, involves, among other responses, seeing, hearing, and feeling consciously afraid of the bear. Altogether, the neural, bodily, and subjective feeling state might be called “fear” for the perceiver (although the same patterns might be called “exhilaration” for another perceiver or for the same perceiver in a different context). Populations of neurons in the modality-specific sensory, motor, and affective systems are highly interconnected, and their activation supports the integrated, multimodal experience of the bear.

Fig. 2.

(Left) Activation of populations of neurons in the visual, auditory, and affective systems upon perception of the snarling bear is illustrated schematically. (Right) Later, when remembering the appearance of the bear, parts of the original states of the visual system are reinstated. These can then act to reactivate the parts of the states that were originally active in the other systems (5). [Photo credit: Jim Zuckerman/CORBIS]

Later, in just thinking about stumbling on the bear, the neural states that represent (for example) the visual impression of the bear can be reactivated. The reinstantiation of a pattern of neurons in one system can then cascade to complete the full pattern in the others. Through the interconnections of the populations of neurons that were active during the original experience, a partial multimodal reenactment of the experience is produced (16, 17). Critically for such an account, one reason that only parts of the original neural states are reactivated is that attention is selectively focused on the aspects of the experience that are most salient and important for the individual. These then are the aspects that are most likely to be stored for later reactivation (12). Because emotions are salient and functional, this aspect of experience will certainly be preserved (8).
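
To make this pattern-completion idea concrete, the sketch below uses a toy associative memory (a Hopfield-style network). It is purely illustrative and is not a model proposed in the work cited here; the split into "visual," "auditory," and "affective" units, the pattern size, and all values are assumptions made only for the demonstration. One multimodal pattern is stored, and reinstating just its visual portion is enough for the learned connections to complete the auditory and affective portions.

```python
import numpy as np

# Toy illustration (not from the article): a Hopfield-style associative memory
# storing one "multimodal" experience split across visual, auditory, and
# affective units.
rng = np.random.default_rng(0)

n_visual, n_auditory, n_affective = 8, 8, 8
n = n_visual + n_auditory + n_affective

# One stored experience: +1/-1 activation across all three "modalities".
experience = rng.choice([-1, 1], size=n)

# Hebbian weights (outer product, no self-connections) encode the
# interconnections among units that were co-active during the experience.
W = np.outer(experience, experience)
np.fill_diagonal(W, 0)

# Later recall: only the visual portion of the pattern is reinstated;
# the remaining units start in an uninformative (zero) state.
state = np.zeros(n)
state[:n_visual] = experience[:n_visual]

# Recurrent updates let the partial cue cascade and complete the full pattern.
for _ in range(5):
    state = np.sign(W @ state)
    state[state == 0] = 1

print("auditory units recovered:",
      np.array_equal(state[n_visual:n_visual + n_auditory],
                     experience[n_visual:n_visual + n_auditory]))
print("affective units recovered:",
      np.array_equal(state[-n_affective:], experience[-n_affective:]))
```

Running the sketch prints True for both checks: the cued visual units alone restore the auditory and affective portions of the stored pattern, which is the sense in which a partial reinstatement can cascade to complete the full multimodal state.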

In theories of embodied cognition, using knowledge—as in recalling memories, drawing inferences, and making plans—is thus called “embodied” because an admittedly incomplete but cognitively useful reexperience is produced in the originally implicated sensory-motor systems, as if the individual were there in the very situation, the very emotional state, or with the very object of thought (18). The embodiment of anger might involve tension in muscles used to strike, the innervation of certain facial muscles to form a scowl, and even the rise in diastolic blood pressure and in peripheral resistance, for example. The concept of reenactment and related concepts such as simulation, resonance, and emulation are widely accepted in theories of embodied cognition, but many different mechanistic neural accounts of reenactment have been proposed (19). One promising possibility is that simulation is supported by specialized “mirror neurons” or even an entire “mirror neuron system,” which maps the correspondences between observed and performed actions. However, there is much disagreement about the exact location of the mirror neurons, whether these neurons actually constitute a “system” (in the sense of interconnected elements), and whether there actually are specialized neurons dedicated to mirroring (or whether regular neurons can simply perform a mirroring function). Some of the original work on mirror neurons in monkeys emphasized a distinctive role of neurons located in the inferior parietal and inferior frontal cortex, which discharge both when a monkey performs an action and when it observes another individual's action (20). The implications of this work were quickly extended to humans. Some scientists argue that humans have a dedicated “mirror neuron area,” located around Brodmann's area 44 (the human homolog of the monkey F5 region). This mirror neuron area may compute complex operations, such as mapping the correspondence between self and others or differentiating between goal-oriented and nonintentional actions (20). But more questions about an architecture for embodied cognition have been raised than have been answered. The specifics of the underlying architecture will be one of the defining projects for neuroscience and neurophysiology in the coming years.

Perceiving Emotion

One hypothesis regarding the application of theories of embodied cognition to emotion is that the perception of emotional meaning—recognizing a facial expression of emotion or the words “snarling bear”—involves the embodiment of the implied emotion (21). There is now substantial empirical support for this hypothesis. Neuroimaging studies have revealed that recognizing a facial expression of emotion in another person and experiencing that emotion oneself involve overlapping neural circuits. In an illustrative study, researchers had participants inhale odors that generated feelings of disgust (22). The same participants then watched videos of other individuals expressing disgust. Results showed that areas of the anterior insula and, to some extent, the anterior cingulate cortex were activated both when individuals observed disgust in others and when they experienced disgust themselves [related findings are reported in (23, 24)].

Similarly, behavioral studies demonstrate that emotional expressions and gestures are visibly imitated by observers and that this imitation is accompanied by self-reports of the associated emotional state (25). Theories of embodied cognition provide a theoretical account of why this is so: The imitation of other individuals' emotional expressions is part of the bodily reenactment of the experience of the other's state. When emotional imitation goes smoothly, there is a strong foundation for empathy (26) and, therefore, even for good marriages. Mimicking the facial expressions of your partner is good for your relationship, even if this means that you will grow to resemble each other because you repeatedly use the same facial muscles, as the findings of one study suggest (27). In contrast, there is evidence relating failures of emotional imitation, such as those that occur in autism, to substantial problems in social interaction (28).

One important implication of this type of emotional resonance across individuals is its probable role in observational learning. In observational learning, the positive or negative consequences of a given behavior are learned by watching another individual experience those consequences. A recent functional magnetic resonance imaging study revealed similar changes in the brain activity of a female participant when painful stimulation was applied to her own hand and when it was applied to her partner's hand (29). A related study used single-cell recording and found activation of pain-related neurons both when a painful stimulus was applied to the participant's own hand and when the participant watched the painful stimulus being applied to the experimenter's hand (30). This suggests that observational learning is supported by a reenactment of the emotional experience of the model in the observer. Although a direct test of such a claim is required, the same mechanism should underlie instructed learning. In instructed learning, neither the self nor another person ever experiences pain or pleasure. Rather, learning occurs through the transmission of language. When children learn not to put their fingers in electrical outlets or run carelessly into the street, their behavior is guided by verbal instruction, not direct experience. They must, therefore, be able to reexperience an emotion when that emotional consequence is described in language. Published comparisons of amygdala activation during conditioned, observational, and instructed fear-learning in humans are consistent with just such a view (31). The findings suggest that the emotional processes that support all three types of learning share important similarities.

Thinking About Emotion

In my own laboratory, we have demonstrated that using emotional information stored in memory involves embodiment (32). In one study, experimental participants made judgments (they provided a “yes” or “no” response) about whether words referring to concrete objects (e.g., “baby,” “slug”) were associated with an emotion. The objects had been rated by other individuals as being strongly associated with the emotions of joy, disgust, anger, or no particular emotion. During the task, the activation of four facial muscles (Fig. 3) was recorded with a technique called electromyographic recording. In another study, the same method was followed but the words now referred to abstract concepts; they were adjectives that denoted affective states and conditions (e.g., “joyful,” “enraged”).

Fig. 3.

The muscles associated with the facial expressions measured in recent work are shown. The orbicularis oculi and zygomaticus are activated to produce a smile, the corrugator is activated during frowning in anger, and the levator is used to produce the grimace of disgust.

Results of both studies showed that, in making their judgments, individuals embodied the relevant, discrete emotion, as indicated by their facial expressions. The findings indicate that in the very brief time it took participants to decide that a “slug” was related to an emotion (less than 3 s), they expressed disgust on their faces. They appeared to make their judgments on the basis of the embodiment of the referent (objects for the first study and emotional states for the second). Further support for such a conclusion comes from the results of a second condition of each study. In that condition, the experimenter instructed half of the participants to make a different judgment about the words: They indicated (“yes” or “no”) whether the words were written in capital letters. In order to make such judgments, these participants would not have to embody the emotional meaning of the words; indeed, findings revealed that these participants showed no systematic activation of the facial musculature whatsoever. The point that embodiment does not occur when the information can be processed on the basis of association or perceptual features has been made in other research as well (33, 34).
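
To make the logic of the electromyographic comparison concrete, the following sketch tabulates hypothetical muscle-activation values by judgment task and word category. The column names and numbers are invented for illustration and are not the actual data or analysis pipeline of the studies described; the embodiment prediction is emotion-congruent activation (for example, levator activity for disgust-related words) in the emotion-judgment condition but not in the letter-judgment condition.

```python
import pandas as pd

# Hypothetical trial-level data: activation (arbitrary standardized units) of
# one facial muscle while a participant judged one word. Invented for
# illustration only.
emg = pd.DataFrame({
    "task":       ["emotion", "emotion", "emotion", "letters", "letters", "letters"],
    "category":   ["disgust", "joy", "neutral", "disgust", "joy", "neutral"],
    "muscle":     ["levator", "zygomaticus", "levator", "levator", "zygomaticus", "levator"],
    "activation": [0.42, 0.55, 0.08, 0.07, 0.09, 0.06],
})

# Mean activation per muscle and word category, split by judgment task.
# The embodiment account predicts elevated emotion-congruent activation only
# in the emotion-judgment column.
print(emg.pivot_table(index=["muscle", "category"], columns="task",
                      values="activation", aggfunc="mean"))
```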

Further evidence of the embodiment of emotional concepts comes from extensions to the area of emotion of research on the costs of switching processing between sensory modalities. Researchers have shown that shifting from processing in one modality to another involves temporal processing costs (35): Individuals take longer to judge the location of a visual stimulus after having just detected the location of an auditory one, for example, than if both stimuli arrive in the same modality. For the present concerns, it is of interest that similar “switching costs” are also found when participants engage in conceptual tasks: Individuals are slower to say that typical instances of object categories have certain features if those features are processed in different modalities (36). They are slower to verify that a “bomb” can be “loud” when they have just confirmed that a “lemon” can be “tart” than when, for example, they have just confirmed that “leaves” can be “rustling.” This provides support for the general assertion made by theories of embodied cognition that individuals simulate objects in the relevant modalities when they use them in thought and language.

Vermeulen and colleagues (37) examined switching costs in verifying properties of positive and negative concepts such as “triumph” and “victim.” Properties of these concepts were taken from vision, audition, and the affective system. Parallel to switching costs observed for neutral concepts, the study showed that, for positive and negative concepts, verifying properties from different modalities produced costs such that reaction times were longer and error rates were higher than if no modality switching was required. This effect was observed when participants had to switch from the affective system to sensory modalities and vice versa. In other words, participants were less efficient in verifying that a “victim” can be “stricken” if the previous trial involved verifying that a “spider” can be “black” than they were if that previous trial involved verifying that an “orphan” can be “hopeless.” And participants were less efficient in verifying that a “spider” can be “black” when that trial was preceded by the judgment that an “orphan” can be “hopeless” than if preceded by the judgment that a “wound” can be “open.” This provides evidence that affective properties of concepts are simulated in the emotional system when the properties are the subject of active thought.
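
As a concrete illustration of how such switching costs are typically quantified, here is a minimal sketch. The trial data, column names, and numbers are hypothetical stand-ins rather than the materials or analysis of the study just described: each trial is labeled a "switch" when the modality of the verified property differs from that of the preceding trial for the same participant, and mean reaction time and error rate are then compared across switch and no-switch trials.

```python
import pandas as pd

# Hypothetical property-verification trials; "modality" is the modality of the
# verified property (visual, auditory, or affective). Invented for illustration.
trials = pd.DataFrame({
    "participant": [1, 1, 1, 1, 2, 2, 2, 2],
    "modality":    ["affective", "visual", "visual", "affective",
                    "visual", "affective", "affective", "auditory"],
    "rt_ms":       [812, 905, 799, 930, 850, 940, 805, 915],
    "correct":     [True, True, True, False, True, True, True, True],
})

# A trial is a "switch" when its modality differs from the previous trial of
# the same participant; each participant's first trial has no predecessor and
# is dropped from the comparison.
prev = trials.groupby("participant")["modality"].shift()
trials["switch"] = prev.ne(trials["modality"])
trials = trials[prev.notna()]

# Switching costs: longer mean reaction times and higher error rates on
# switch trials than on no-switch trials.
summary = trials.groupby("switch").agg(
    mean_rt_ms=("rt_ms", "mean"),
    error_rate=("correct", lambda c: 1.0 - c.mean()),
)
print(summary)
```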

Comprehending Emotional Language

Developments in theories of embodied cognition to account for language make the claim that language comprehension relies in part on embodied conceptualizations of the situations that language describes (38). The first step in language comprehension, then, is to index words or phrases to embodied states representing their referents. Next, the comprehender simulates possible interactions with those referents. Finally, the message is understood when a coherent set of actions is created.

Some evidence in support of such an account of understanding emotional language was published almost 20 years ago, though no fully developed model was available at the time to interpret the findings. In the study, some participants held a pen between their front teeth while performing a laboratory task that involved rating the funniness of different cartoons (39). Holding the pen in the mouth this way covertly led the individuals to smile. Other participants were instructed to hold a pen between their lips, without touching it with their teeth, and this prevented them from smiling (Fig. 1). Results revealed that, as suggested in the thought problem that began this article, individuals who were led to smile evaluated the cartoons as funnier than did participants whose smiles were blocked. It appeared that those individuals who were smiling somehow “got” the comic meaning of the cartoons better or more easily than did the individuals who were prevented from smiling.

More evidence for simulation of emotions in sentence comprehension is now available (40). The reasoning that motivated the research was that if the comprehension of sentences with emotional meaning requires the partial reenactment of emotional bodily states, then reenactment of congruent (or incongruent) emotions should facilitate (or inhibit) language comprehension. Participants had to judge whether sentences described a pleasant or an unpleasant event while holding a pen between the teeth (again, to induce smiling) or between the lips (to inhibit smiling). Reading times for sentences describing pleasant events were faster when participants were smiling than when they were prevented from smiling. Sentences that described unpleasant events were understood faster when participants were prevented from smiling than when they were smiling. The same effect was observed in a second experiment in which participants had to evaluate whether the sentences were easy or hard to understand.

Conclusions

Early critics of theories of embodied cognition argued that bodily feedback is too undifferentiated and too slow to represent emotional experience (41). In fact, the motor system alone can support extremely subtle distinctions. More importantly, recent theories of embodied cognition avoid such criticisms by focusing on the brain's modality-specific systems, not only on muscles and viscera. The circuits in modality-specific brain areas are fast, refined, and able to flexibly process a large number of states, and these states can be reactivated without their output being observable in overt behavior. This account is therefore well placed to generate research that can further the understanding of learning, language comprehension, psychotherapeutic techniques, and attitudes and prejudice, just to name a few psychological phenomena. These days, those few seem to be pretty important.

References and Notes
