This paper discusses how modern brain research shows that emotions are key to effective communication, as our decision-making process mostly does not involve rational reasoning. From the same research, we know that people's own accounts of what they do or feel are heavily biased, and that traditional research methods therefore need to be supplemented by more indirect methods that measure what really happens. Methods like eye tracking and facial coding minimize the risk of biased results and can thus create a more accurate base for decisions on, for example, communication and advertising. The Sticky platform allows users to measure both visual engagement (via webcam-based eye tracking) and emotions (via webcam-based facial coding). Research and validation studies show that the facial coding accurately identifies six universal emotions (based on Paul Ekman's research) and that videos which elicit strong emotions from viewers, whether positive or negative, are much more likely to be shared than those that produce a weak emotional response.
Why emotions matter
Feeling is a form of thinking. Both are ways we process information, but feeling is faster. That's the crux of Nobel Prize winner Daniel Kahneman's mind-clarifying work. In everyday life, we are constantly bombarded with huge amounts of sensory information. We can't process it all because the capacity of the brain is limited. The brain must decide what to pay attention to and what to ignore.
In Thinking, Fast and Slow, Kahneman (Kahneman, Daniel (2011), Thinking, Fast and Slow. Farrar, Straus and Giroux) uses the terms "System 1" and "System 2" to describe and explain how the mind works. System 1 "is the brain's fast, automatic, intuitive approach"; System 2 is "the mind's slower, analytical mode, where reason dominates." According to Kahneman, System 1 is the more influential of the two, guiding and steering System 2 to a large extent. More simply put, attention is prioritized for emotional information: emotional things draw attention faster and hold it longer than nonemotional elements.
In turn, information that is perceptually prioritized is easier to remember, both directly after the event and over time (i.e. highly emotional events are more likely to be remembered later). The reason is evolutionary: it has been beneficial to remember and retrieve information connected with well-being (things that are good, or things to steer clear of). Indeed, a central role of emotion is to emphasize things in the environment that are significant to us, and thus influence how we direct our attention and actions.
Today, modern brain research shows that rationality and judgment are connected to emotional ability. From the same research, it becomes clear that our consciousness has a narrow range. At least 95 percent, probably more, of all external sensory input is received at a low level of consciousness, and much of what we do is thus steered subconsciously (Damasio, Antonio R. (2005), Descartes' Error: Emotion, Reason and the Human Brain. Penguin Books Ltd). Only a fraction of our decision making happens on conscious and rational grounds; a decision is more akin to processed emotions and rough estimates than to a careful rational analysis. This means that emotions are key in communication, as our decision-making process mostly does not include rational reasoning.
The Measurement Problem
In the end, successful communication is all about getting attention – without attention nothing will happen; no awareness, no brand lifts, no considerations, no purchases... nothing!
But already some 100 years ago, the businessman John Wanamaker beautifully summed up the difficulties in evaluating marketing in the now classic saying: "Half the money I spend on advertising is wasted; the problem is I do not know which half."
Since then, there have been numerous attempts to identify which half works and which half is wasted. At best, these efforts have been benign; at worst, they have been directly misleading. This is in large part because Plato's old dualistic view of the human mind set the agenda for how we have approached the question of effect.
Plato viewed consciousness as an eternal struggle between rationality and emotion, where the rational brain needed to control unruly emotions (Plato (ca 370 BC), The Phaedrus dialogue). This image shaped Western thought for centuries, and in the attempts to translate Plato's binary psychology into practice, the conscious and rational was given the prominent role. Not least in the market research industry, the focus has been on the rational and conscious brain, and traditional methods of measuring consumer behavior have therefore relied on consumers' recollection and personal accounts of their actions. Most people find this difficult. In fact, none of us really knows, nor can know, how and why we do things; we just do them. Gerald Zaltman (Zaltman, Gerald (2003), How Customers Think: Essential Insights into the Mind of the Market. Harvard Business School Press), professor at Harvard Business School, has found that "the correlation between stated intent and actual behavior is usually low and often negative."
This means that anything that involves opinions or people's own accounts of what they do or feel is heavily biased toward what the individual thinks the researcher wants to hear. Traditional research methods need to be supplemented by more indirect methods that can measure what really happens.
Methods like eye tracking and facial coding allow you to identify consumers' attention and spontaneous reactions without asking questions or having to rely on volatile memories. This minimizes the risk of biased results caused by respondents trying to answer in a socially acceptable way, or simply misremembering what they did.
Eye tracking and facial coding are research methods that not only tell us what people claim to remember, but what they have done, seen and felt. These research methods consider how the human mind works, enabling you to measure the important stuff instead of making what is easy to measure important.
Facial coding - method and history
Origins
In 1872, Charles Darwin (Darwin, Charles (1872), The Expression of the Emotions in Man and Animals. John Murray) was the first to suggest that there are universal facial expressions of emotion. These ideas about emotions were a centerpiece of his theory of evolution, suggesting that emotions and their expressions are biologically innate and evolutionarily adaptive. There have been arguments both for and against ever since. Several studies have since attempted to classify human emotions and demonstrate how your face can give away your emotional state. A significant contribution was the "universality studies" carried out by Paul Ekman et al. In these studies, Ekman and his team showed participants photos of faces displaying different emotional states and asked them to classify the emotion they saw. The research determined that there is high cross-cultural agreement in judgments of emotions in faces by people in both literate (Ekman, 1972, 1973; Ekman & Friesen, 1971; Ekman, Sorenson, & Friesen, 1969; Izard, 1971) and preliterate cultures (Ekman & Friesen, 1971; Ekman, et al., 1969).
Specifically, they identified six core emotions, which Ekman termed universal emotions:
Anger - symbolized by the eyebrows lowering, the lips pressing firmly together and the eyes bulging
Fear - symbolized by the upper eyelids raising, eyes opening and the lips stretching horizontally
Sadness - symbolized by lowering of the mouth corners, the eyebrows descending to inner corners and the eyelids drooping
Disgust - symbolized by the upper lip raising, nose bridge wrinkling and cheeks raising
Joy - symbolized by raising of the mouth corners (an obvious smile) and tightening of the eyelids
Surprise - symbolized by eyebrows arching, eyes opening wide and with the jaw dropping slightly
Sometimes a seventh emotion - contempt - is considered universal, but all in all, there is strong evidence for the universal facial expressions of at least six emotions – anger, fear, sadness, disgust, joy and surprise.
Facial Action Coding System (FACS)
To systematically and accurately encode facial expressions, Ekman et al. developed the Facial Action Coding System (FACS). The first manual, which built on the work of Hjortsjo (1970), was published in 1978 and has since been updated several times, with the latest version dating from 2002. The manual is exhaustive, spanning over 500 pages, and enables human coders to manually code nearly any anatomically possible facial expression. This is a time-consuming process and ties the research to a lab environment. Therefore, automated algorithms that detect facial expressions from video feeds (e.g. from a webcam) are starting to take over the analysis, a development in part originating with Rosalind Picard's 1995 paper on affective computing. Today, automated software makes it possible to collect and analyze data through webcam recordings. These videos are analyzed using an algorithm and can provide insights into how people react emotionally to communication. And because emotion is central to the way we behave and the decisions we make, this is important information for marketers. Moreover, this is data that is very difficult to obtain through traditional survey methods, which, of course, rely on respondents describing their own feelings. So, which emotions does your communication evoke? Facial coding helps answer this critical question by measuring emotions through facial expressions.
Application and interpretation
Driving effective communication
The importance of emotions is not just of academic interest, but of huge importance for driving effective communication and advertising. Over the last few years the evidence for emotion-based research has grown and there are now many case studies and statistics showing how emotion is strongly correlated to success. For instance:
According to Unruly ShareRank data, around 70% of viewers who experienced an intense emotional response to an ad were very likely to buy the product, more than double the 30% of viewers who were very likely to buy the product having experienced merely a moderate emotional response, representing an uplift of up to 144%. (Unruly ShareRank data; Feb 2013 to March 2015; n=84,000)
Karen Nelson-Field, a senior research associate at the Ehrenberg-Bass Institute for Marketing Science, found that videos which elicit strong emotions from viewers — whether positive or negative — are twice as likely to be shared as those that provide a weak emotional response. (https://contently.com/2013/12/16/the-emotions-that-trigger-video-sharing/)
Affectiva shows evidence that emotions in advertising, captured with facial coding, can predict ad success, with two dimensions of positive and negative emotion separating good and bad ads in over 70% of cases. Says Affectiva: "The dynamics of how emotion responses unfold over time yields insights into the emotional profile of a 'good' ad. Increasing amusement throughout an ad was a predictor of success, while the presence of negative emotions was inversely correlated to sales." (http://www.affectiva.com/wpcontent/uploads/2014/09/Do_Emotions_in_Advertising_Drive_Sales_Use_of_Facial_Coding_to_Understand_The_Relati.pdf)
Example 1: Heineken Walk-In Fridge
This commercial elicits strong positive emotions during branding; however, it takes more than 10 seconds to create an emotional response at the beginning of the ad, which indicates low engagement. Therefore, while it may perform well on TV, it is expected to perform worse in digital formats where users can skip pre-roll commercials or move to a new tab.
The women screaming in the closet create the first small response, which leads to a much larger response when the men start screaming, with a well-placed branding moment timed to the rise in positive emotion.
Link to study results: https://share.sticky.ai/#/e/E7hnGQf7OQ9D3q5ZaHryK/d/DGXv3xIv5BMjKLqxFdOY4
Example 2: Kia - Hero's Journey
The video evokes positive emotions, with peaks at the expected, humorous moments. It also shows a high, positive emotional response at the branding moment. However, the first emotional response comes well past the skip line (5 sec), meaning there is a risk that viewers will skip or disengage from the video. Also, the eye tracking shows that in the final scene of the video, people engage more with the person than with the logo and product image.
Editing the video for a quicker emotional response in the beginning and moving the logo and product image closer to center of attention in the final scene would be two ways to optimize the video's effect even further.
Link to study results: https://share.sticky.ai/#/e/E8oU7K6JrxUsA6soZe67b/d/DGIj8gZEnIfZyQUsHWpFu
Sticky Emotion Analysis
The Sticky Emotion Analysis tool works by having opt-in participants watch your media while their facial expressions are tracked through their webcams, pinpointing emotional activity in response to frames and elements in your media.
Used in conjunction with eye tracking, you’ll be able to tell not just what people are feeling as they watch the video, but exactly what they’re looking at on-screen when they feel that emotion. If someone feels happy one second and sad a few seconds later, you’ll be able to tell exactly which character, logo, music or other element causes happiness or sadness and what changed on-screen to elicit the emotional shift.
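To make that pairing concrete, here is a minimal sketch, in illustrative Python rather than the actual Sticky API, of how per-second emotion labels could be joined with eye-tracking fixations to attribute an emotional shift to what was on screen at that moment. The data layout (`Fixation`, `attribute_emotions`) is an assumption made for this example only.

```python
from dataclasses import dataclass

@dataclass
class Fixation:
    second: int       # timestamp in the video, in seconds
    element: str      # the on-screen element the viewer looked at

def attribute_emotions(emotion_by_second, fixations):
    """Pair each second's dominant emotion with the fixated element.

    emotion_by_second: dict of second -> dominant emotion label.
    Returns (second, emotion, element) triples; "unknown" when no
    fixation was recorded for that second.
    """
    looked_at = {f.second: f.element for f in fixations}
    return [(sec, emotion, looked_at.get(sec, "unknown"))
            for sec, emotion in emotion_by_second.items()]

# Example: joy at second 3 while looking at a character,
# sadness at second 7 while looking at the logo.
emotions = {3: "joy", 7: "sadness"}
gaze = [Fixation(3, "character"), Fixation(7, "logo")]
print(attribute_emotions(emotions, gaze))
# -> [(3, 'joy', 'character'), (7, 'sadness', 'logo')]
```

The join is deliberately simple: in practice gaze data is far denser than one fixation per second, but the principle of aligning the two timelines by timestamp is the same.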
What do the numbers mean?
The results in the Sticky app are presented as a line graph, mapping the emotional response to the media second by second. You'll also see a representation of the visual attention in the media, making it possible to see which elements earn the most attention while also seeing the emotional response to each element.
In the Emotions graph, the x-axis represents time in the media, in seconds, and the y-axis represents emotion intensity. Thus, the number on the y-axis shows the intensity level of a specific emotion for each time frame - i.e. not necessarily the number of people expressing the emotion.
Simply put, the number is the compound output of two variables: (a) the number of people expressing an emotion and (b) the intensity with which they are expressing it.
Basically, this means that the number shown is the sum of each person’s individual emotion intensity at a specific time divided by the total number of persons in the test (n), where the individual emotion intensity is a number between 0 and 100.
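As a sketch of that arithmetic (an illustrative calculation, not Sticky's actual implementation), the per-second score for one emotion is the mean of the individual intensities across all participants:

```python
# Illustrative sketch of the compound score described above:
# the sum of each person's emotion intensity (0-100) at a given
# second, divided by the total number of participants n.
def emotion_score(intensities):
    """intensities: one 0-100 value per participant for a single
    emotion at a single second (0 = no expression detected)."""
    if not intensities:
        return 0.0
    return sum(intensities) / len(intensities)

# Example: 3 of 10 viewers smile with intensities 60, 40 and 20;
# the other 7 show no joy at that second.
print(emotion_score([60, 40, 20] + [0] * 7))  # -> 12.0
```

Note how this definition implies that a few people reacting strongly and many people reacting mildly can produce the same score, which is why the number reflects neither variable alone.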
What’s a peak?
A peak is a relative reference: the highest measured point for each emotion. You can expect peak heights to differ between the emotions measured. Some facial expressions, for example a smile, are more distinct and easier for the algorithm to pick up, resulting in higher peaks. Others are more nuanced and thus often result in lower scores. Normally, expressions of joy, disgust and surprise are the most accurately detected. Expressions of anger, sadness and fear tend to be more nuanced and subtle and are therefore harder to detect, resulting in scores at the lower end of the range. This pattern holds for all algorithms, but the absolute maximums can obviously differ slightly. Based on the true data test in the validation process, we have analyzed and mapped this for Sticky's emotion algorithm. For all emotions, we have found that anything below an emotion intensity level of 5 is insignificant and should be disregarded. The other thresholds depend on how strong a response can be expected for each measured emotion.
Below are the thresholds you can use as guidance to get a sense of what the different peaks mean:
Joy:
<5 = noise (disregard)
5-15 = response
16-25 = intense response
26+ = max response
Surprise and Disgust:
<5 = noise (disregard)
5-15 = response
16-20 = intense response
21+ = max response
Anger/Puzzlement:
<5 = noise (disregard)
5-10 = response
11-15 = intense response
16+ = max response
Fear and Sadness:
<5 = noise (disregard)
5-10 = response
11+ = max response
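The thresholds above can be collected into a small lookup, sketched here in Python. The assignment of the two unlabeled threshold groups to Joy and Anger/Puzzlement is an assumption inferred from the surrounding text (joy produces the highest peaks, fear and sadness the lowest), not something the list states explicitly.

```python
# Peak-interpretation thresholds as listed above: lower bounds for
# "response", "intense response" and "max response". None means the
# emotion has no separate "intense" band (Fear and Sadness).
THRESHOLDS = {
    "joy":        (5, 16, 26),   # assumed: unlabeled high group
    "surprise":   (5, 16, 21),
    "disgust":    (5, 16, 21),
    "puzzlement": (5, 11, 16),   # assumed: unlabeled middle group
    "fear":       (5, 11, None),
    "sadness":    (5, 11, None),
}

def classify_peak(emotion, intensity):
    resp, intense, mx = THRESHOLDS[emotion]
    if intensity < resp:
        return "noise (disregard)"
    if mx is None:  # Fear/Sadness: 11+ is already the max band
        return "max response" if intensity >= intense else "response"
    if intensity >= mx:
        return "max response"
    return "intense response" if intensity >= intense else "response"

print(classify_peak("joy", 20))   # -> intense response
print(classify_peak("fear", 12))  # -> max response
```

Because the bands are per-emotion, the same raw intensity reads differently depending on which emotion produced it: 20 is an intense response for Joy but a maximum response for Fear.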
Validating the performance and quality of our data, both the facial coding and the eye tracking, is an ongoing process and one we take seriously. Multiple things at different levels of detail make up the final output: for instance, how well the algorithms detect faces, facial key points and eyes, and then translate that into emotions and gaze data points. These things are continuously checked and improved, but in the end the proof is in the pudding, i.e. how well the algorithms measure what they say they measure. For facial coding, a good way of validating its ability to measure facial expressions and emotions is through a 'true data test'. This means we control what emotions we expect to be expressed at every point in time and see how well the algorithm picks that up.
In this study, we showed participants pictures of facial expressions from Paul Ekman's original studies and asked them to mimic the pictured emotion: "You will now see a sequence with facial expressions of emotions. Please try to mimic these facial expressions and emotions as they are shown." The images were displayed one after the other in the order below, starting with Joy and ending with Anger, and shown for five seconds each.
The participants were recruited through online panels to represent a US general population between 18 and 70 years of age. In total, 95 recordings with usable data were analyzed.
Below is a screenshot of the results as presented on the Sticky platform. The colors in the graph correspond to the various emotions, and pictured above the graph is the face participants were told to mimic. The results show that the algorithm works and accurately identifies the correct facial expressions. The graph peaks with the correct emotion in response to each facial expression displayed, i.e. in each case, the system recognized the emotion it was supposed to recognize. As discussed earlier, the maximum values differ between the emotions. The graph shows Joy as the most expressive emotion, whereas the peaks for Sadness and Fear are much lower. Also worth noting is that fear and surprise are related facial expressions and can therefore sometimes be hard to distinguish.
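The pass criterion of such a true data test can be stated simply: within each five-second segment, the emotion with the highest mean measured intensity should be the one participants were asked to mimic. A hypothetical sketch of that check follows; the data layout and numbers are illustrative, not Sticky's actual results.

```python
# For one segment of a true data test: did the algorithm's
# strongest-detected emotion match the emotion being mimicked?
def segment_correct(expected, mean_intensities):
    """mean_intensities: dict of emotion -> mean intensity over
    the segment. Returns True if the top emotion matches."""
    detected = max(mean_intensities, key=mean_intensities.get)
    return detected == expected

# Illustrative segment where participants mimicked Joy.
segment = {"joy": 28.0, "surprise": 4.5, "sadness": 1.2,
           "fear": 0.8, "disgust": 2.1, "puzzlement": 1.0}
print(segment_correct("joy", segment))  # -> True
```

Running this check over every segment and every recording gives a straightforward accuracy figure for the algorithm against known ground truth.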
A live version of the emotion analysis for this study is available here: https://dev-share.sticky.ai/#/e/EGcRTgCmC5HCCVU0L6vSk/d/DGf3EpPGDEsgQLaKZXqhR
Measured emotions - definitions
Joy
In contrast to negative emotions, positive emotions seldom occur in response to life-threatening situations. Therefore, joy and other positive emotions did not need to elicit focused attention and focused responses. Joy widens the mind, creating flexible responses and broadening the array of possible thoughts and options. Joy makes people want to push limits, play and be creative in physical, social, intellectual and artistic terms.
This manifests in a variety of ways. An eye tracking study found that positive emotions lead to more attention on the periphery and wide visual search patterns (Wadlinger et al., 2006). People experiencing joy tend to be more inclusive, receptive to new information, and perform better on tests of creativity (Cohn and Frederickson, 2008).
These are all specific examples of a bigger behavioral pattern that builds enduring personal resources. This phenomenon is described by the broaden-and-build theory, which holds that the creative and open thinking elicited by joy allows people to broaden their scope of attention and action and build lasting resources in their environment. The takeaway from this 'broadening' is that joyful people are more open in their choices and may be willing to try a brand they haven't yet tried. Kahn and Isen's classic experiment on positive affect and choice showed that mild positive affect facilitates variety-seeking among different consumer brands (Kahn and Isen, 1993). Therefore, videos or ads featuring a new brand or product should strive to elicit joy.
Sadness
Since sadness is associated with loss, the behavior it produces manifests like loss as well. From an evolutionary perspective, sadness occurred when an individual was separated from the group -- lost. As mentioned earlier, sadness is a hybrid between seeking and avoidance, a point worth expanding on. Picture an animal separated from its mother. You would expect the animal initially to look for its mother itself. After not finding her, it would withdraw (Solms and Turnbull, 2002). Every child separated from their mother in a mall knows it is easier to be found if you stay still, even though the instinct in that situation is to seek.
This behavioral pattern explains a few things about the emotion. Light sadness experienced for a short period can increase motivation. Experimenters showed participants either a happy film or a sad film and then gave them an untimed cognitive ability test. Since the test was untimed, the amount of time people spent on it was indicative of perseverance and motivation. People who were shown the sad film spent 1.5x as long on the test and got 1.5x as many questions correct as people who were shown the happy film (Forgas, 2016). This is the seeking phase of the emotion. It is also widely accepted that when people are extremely sad, or experience sadness for long periods, motivation suffers; this is the avoidance phase.
Because sadness is a loss-prevention emotion, people who are experiencing sadness have a higher buying price for objects they desire, as Jennifer Lerner's famous 'Heart strings and purse strings' experiment shows (Lerner, 2004). In simplistic evolutionary-psychology terms, this is because people experiencing sadness want to make up for a loss by acquiring more goods.
People often view sadness as a 'bad' emotion, but a light sadness for a limited period can be effective for people who make videos. Video ads can make people sad to increase motivation and willingness to pay, just not *too* sad, lest they decrease motivation.
Fear
Fear is a powerful motivator, changing priority weightings for goals and motivations by making safety the highest priority. Focus widens to find and avoid the danger, and long-term planning becomes nearly impossible. Fear can be thought of as both a seeking and an avoidance emotion, depending on the circumstances; this duality has famously been codified as the "fight or flight" response. Fear with a solution elicits the 'fight' response and subsequent approach behavior, whereas fear without a solution elicits the 'flight' response and subsequent avoidance behavior.
Even though fear widens a person's field of vision to scan for potential dangers, a study on emotions and attention found that fear makes people less distractible: people experiencing fear were more likely to notice a second stimulus (Vermeulen, 2009). Therefore, fear can be used effectively to capture attention; however, a solution must be presented, or people will avoid rather than seek the stimulus. The best way to leverage fear in visual content is to present a specific danger and a specific solution. In the case of specific dangers, fear increases purchase intent only if the product is perceived to reduce the perception of danger (McDaniels et al., 1984). The risk of using fear in marketing is that nonspecific messaging can elicit the flight instead of the fight response.
It is also worth noting that fear activates specialized learning systems. When people are afraid, they associate everything in their surroundings with that emotion, and they remember the specific sequences of events that led to that emotion. Therefore, fear plays especially well with educational efforts, à la Mothers Against Drunk Driving and anti-smoking campaigns.
Surprise
As you can probably guess, surprise is good at capturing attention. In his article "Surprise: A shortcut for attention?", Pierre Baldi makes a mathematical argument for surprise's importance in capturing attention. Baldi argues that while searching for information, there are competing inputs at play: those that are relevant to the objective of the search, and those that are unexpected or surprising while searching.
Using a Bayesian approach supported by several mathematical models and equations, Baldi concluded that surprise can act as a shortcut to relevance but is not a perfect substitute. In other words, surprise takes precedence over relevance in the short term.
Surprise can therefore be the best emotional cue for capturing attention in the short term, especially in the case of irrelevant information (such as pre-roll video ads that play over what you wanted to watch). Surprise is also very useful in building an object's shareability. In their 2015 book Surprise: Embrace the Unpredictable and Engineer the Unexpected, Tania Luna and LeeAnn Renninger put forth a four-step process a person undergoes when experiencing surprise: 1) Freeze, when the unexpected stops us in our tracks. 2) Find, when we attempt to explain what is going on. 3) Shift, when we change our perspective based on the new findings. 4) Share, when we feel the need to share our surprise (and the object of our surprise) with others. Surprise is therefore also the best emotion for shareability and for measuring a video's potential to go viral.
Disgust
Disgust is the most basic avoidance emotion, triggering strong goals to expel the stimulus. Because of this, disgust should generally be avoided in communication. In the same 'Heart strings and purse strings' article mentioned in the sadness discussion, Lerner found that disgust decreases both buying and selling prices (Lerner, 2004). And in a separate study on emotions and attention, researchers found that disgust makes people more distractible: people experiencing disgust were less likely to notice a second stimulus (Vermeulen, 2009). The authors theorized that this is because the disgust response reduces sensory exposure. Disgust is therefore kryptonite when trying to capture attention or sell something, as it lowers attention and makes the object unpalatable and undesired. Unless you are making a comedy, video marketers should avoid disgust, as it sends a strong avoidance signal to viewers.
Puzzlement
Puzzlement is not one of Ekman's original universal emotions; the corresponding universal emotion is Anger. Anger is normally defined as a strong feeling of annoyance, displeasure or hostility, and usually something to steer clear of. In the Sticky app, we use the label Puzzlement to describe the facial expressions (a furrowed brow, etc.) that are also associated with anger. The expression could equally be anger or puzzlement depending on the context. In most cases, media evoke puzzlement far more often than anger, hence the label, which makes the analysis more appropriate. Puzzlement and anger share cognitive processes (e.g. confusion, frustration), so they are not worlds apart. The same can be true of, say, Disgust, which might reflect someone disagreeing with something rather than being disgusted. But again, the two share a similar process (e.g. rejection, disliking, avoidance).