What is emotion analytics and why is it important?
Interactions, facts, and feelings shape our relationships. A truism: it's not what you say, but how you say it. Expressions matter, as does the sentiment behind each encounter and the emotions it raises. Emotion is entwined with the literal meaning of the words used.
This fact/feeling principle applies to both interpersonal and business relationships. "Emotional and factual appeals cannot be easily separated," writes Nigel Hollis of Kantar Millward Brown in an analysis of advertising approaches. "[A] distinction between emotional and rational is one that exists only in the minds of marketers, not consumers," according to Hollis.
The fact/feeling equation is central to corporate customer experience (CX) initiatives. CX practitioners map customer journeys that are defined by both the what and the how-did-it-make-you-feel? of customer-brand interactions. "Emotion drives loyalty," according to CX visionary Bruce Temkin, and loyalty drives profit.
Another truism: You can't improve what you don't measure, not systematically, on a corporate scale.
Sentiment analysis and the varieties of emotion AI
Enter sentiment analysis, software technology that quantifies mood, attitude, opinion, and emotion in digital media, in images, video, audio, and text. One subspecies infers emotion via facial-expression analysis. Providers include Affectiva, CrowdEmotion, Eyeris, Kairos, nViso, Noldus Information Technology, RealEyes, and Sightcorp. Another variety analyses emotion in speech. Check out audEERING, Beyond Verbal, EMOspeech, Good Vibrations, NICE, Verint, and Vokaturi. On the text front, natural language processing (NLP) techniques can identify and extract emotion in online, social, and enterprise sources, delivered by companies that include Clarabridge, Crimson Hexagon, Feedback Ferret, IBM (AlchemyAPI and Watson Tone Analyzer), indico, Receptiviti, and an advisee of mine, Heartbeat AI Technologies.
This article aims to get a handle on the state of emotion analytics — specifically, emotion in text — via an interview with Heartbeat founder Lana Novikova. Lana describes herself as a marketer by training and a market researcher by trade, never satisfied with numbers and observations, always pushing to understand the "deep why" behind consumer (and her own) decisions. She'll be presenting at LT-Accelerate, a conference I organise, November 21-22 in Brussels, alongside Odile Jagsch, a consultant at global market research consultancy Kantar TNS; their topic is "The 'Why' Behind Customer Loyalty."
Seth Grimes. Heartbeat designs "emotionally intelligent technologies." OK, what's an "emotionally intelligent technology"?
Lana Novikova. Let's start with the concept of Emotional Intelligence (EQ), popularised by psychologist Daniel Goleman in the 1990s.
Imagine a newborn human who comes with basic wiring for recognising and expressing key emotions, and with an enormous capacity to learn. She can cry or stay calm, smell and turn her head towards her mother's breast. Once she can see faces, her mirror neurons start learning and mimicking facial expressions; then she develops more and more capacity to read and express emotions — from touch, to tone and voice recognition, to basic language, to more complex if-then scenarios. In a perfect world, she grows into a secure and happy person who can recognise and name a wide range of her own emotions, understands what other people feel from multiple expressions, and has the capacity to express and manage her emotions.
SG. So you apply the EQ concept to and via technology.
LN. Technology today has a super high IQ — it can beat the best human chess, Jeopardy, and Go players — yet it has a very low EQ. At Heartbeat, we want to be a part of an academic and business community that changes this. Emotionally intelligent technology is never going to feel emotions or express them like our baby can, but it will eventually become very good at perceiving and understanding human emotions from data.
One day, this technology might surpass humans in understanding human emotions because it will tap into data that humans can not perceive on their own: biometrics, brain waves, subtle cues from body language and facial expressions, and more.
SG. Relating the tech to Heartbeat...
LN. We are focusing on training the (metaphorical) technology infant to recognise explicit feelings from language, from text, and to guess the range of emotions it communicates. Just as some people can intuitively differentiate between many emotions, our growing algorithm can tell what kind of Joy or Anger is expressed in language. Affect words and phrases can make up as little as 2-3% or as much as 50% of any given unstructured text. We find these words and assign them to a cluster of emotions. This process mimics how our brain deciphers emotions from language.
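Heartbeat hasn't published its implementation, but the lexicon-based approach Lana describes — find affect words, assign each to an emotion cluster, and measure what share of the text is affective — can be sketched in a few lines. The tiny lexicon below is an illustrative assumption, not Heartbeat's taxonomy, which runs to thousands of professionally coded entries:

```python
# Minimal sketch of lexicon-based affect-word detection.
# The lexicon here is a toy stand-in for a professionally coded taxonomy.
from collections import Counter

AFFECT_LEXICON = {
    "love": "Love",
    "happy": "Joy",
    "furious": "Anger",
    "worried": "Fear",
}

def detect_emotions(text):
    """Return primary-emotion counts and the share of affect words in the text."""
    tokens = [t.strip(".,!?").lower() for t in text.split()]
    hits = [AFFECT_LEXICON[t] for t in tokens if t in AFFECT_LEXICON]
    counts = Counter(hits)
    affect_share = len(hits) / len(tokens) if tokens else 0.0
    return counts, affect_share

counts, share = detect_emotions("I love this brand but I was worried at first.")
# counts: {'Love': 1, 'Fear': 1}; share: 0.2 (2 affect words out of 10 tokens)
```

The `affect_share` figure corresponds to Lana's observation that affect words can range from a few percent of a text to half of it, depending on the question asked.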
SG. What aspects of emotion does Heartbeat detect and measure? Do you adopt a particular emotion model?
LN. There are a few models and classifications of emotions developed by brilliant psychologists like Paul Ekman and Robert Plutchik, and even by the Human-Machine Interaction Network on Emotion (HUMAINE). I was inspired by a more intuitive model of W. G. Parrott (2001), originally described by Shaver in 1987. It has a tree structure and includes Primary, Secondary, and Tertiary emotions. I also did a lot of reading about affective neuroscience, and tried to combine Parrott's model with what I took from the work of J. LeDoux, R. Davidson, and J. Panksepp. Then my "practical life-long quant researcher" side took over and asked, "How is this segmentation going to be useful to a brand of chocolate, or a bank, or a political party?"
We ended up with a two-level clustering of 99 complex emotions and feelings into nine primary emotions: Joy, Love, Trust, Anger, Fear, Disgust, Sadness, Surprise, and Void (an explicit lack of emotion, as in "I don't care"). We also added Body Sense (positive, negative, and neutral) as a way to analyse words and phrases that don't point to a particular emotion, yet are useful for understanding human perception overall, especially for marketing food and body-care products.
Many words and phrases are coded into multiple emotion clusters. Would you agree that there is a large overlap between Disgust and Anger, or Love and Joy, or even Anger and Fear? For example, we put the word "terrific" into both Joy and Fear, and let context decide which emotion it is more likely to represent. This is the most challenging part of our journey — understanding how different contexts colour "terrific" into a happy or unhappy expression. It's the domain of machine learning, which needs lots of training data. We are just scratching the surface here, but this is also the most exciting part of my job.
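Lana's "terrific" example can be illustrated with a toy sketch: the word is coded into both Joy and Fear, and a deliberately naive context rule picks between them. The lexicon and the context-cue lists below are illustrative assumptions, not Heartbeat's actual model, which learns such distinctions from training data:

```python
# Toy sketch: an ambiguous affect word coded into multiple emotion
# clusters, disambiguated by counting co-occurring context cues.
# The lexicon and cue lists are hypothetical, for illustration only.
AMBIGUOUS_LEXICON = {
    "terrific": {"Joy", "Fear"},
}
CONTEXT_CUES = {
    "Joy": {"party", "gift", "holiday"},
    "Fear": {"storm", "crash", "accident"},
}

def disambiguate(word, context_tokens):
    """Pick the emotion cluster whose cue words co-occur most often."""
    candidates = AMBIGUOUS_LEXICON.get(word, set())
    if len(candidates) <= 1:
        return next(iter(candidates), None)
    scores = {
        emotion: sum(t in CONTEXT_CUES.get(emotion, set()) for t in context_tokens)
        for emotion in candidates
    }
    # Sort candidates first so ties resolve deterministically.
    return max(sorted(candidates), key=lambda e: scores[e])

print(disambiguate("terrific", ["what", "a", "terrific", "storm"]))    # Fear
print(disambiguate("terrific", ["a", "terrific", "holiday", "gift"]))  # Joy
```

A production system would replace the hand-written cue lists with a trained classifier, which is exactly the machine-learning challenge Lana describes.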
SG. How does the tech work?
LN. Today, our tech is very simple yet very accurate. It’s called "bag of words": 8,000+ words and phrases (including negation, metaphor, and other multigrams) professionally coded into categories and validated by skilled psychologists and psycholinguists. Our software consumes unstructured text from survey responses and social media, and produces a set of visualisations and charts in a simple elegant dashboard.
We do our best work when we analyse data from survey responses that focus on people's feelings. Data like that produces over 30% affect words, and the context is controlled by the researcher. Once the report is ready (which is almost instantly), we curate results by removing words that do not apply to the report. This is how we deal with another industry challenge: ambiguity. Finally, to prove that we are very good at what we do, we show all words and phrases for each primary emotion. You can click on any word and see exactly how it appeared in the original text — no "black boxes" here. Since our taxonomy reached 5,000 entries, our match rate — the percentage of affect words that we recognise — is over 95%. Heartbeat is committed to accuracy, depth, and transparency.
SG. How is Heartbeat different or better than the competition?
LN. Heartbeat is different because it was created by a market researcher (myself) who spent hundreds of hours coding open-ended survey data. My team built our award-winning app for researchers and marketers who appreciate the depth of consumer insight. I love working with good-quality data, and would choose quality over quantity any time.
Survey data is under-used and often abused. The art of analysing good-quality text data lies in understanding (a) how to ask a good question, and (b) how to infer meaning from people's answers. I believe Heartbeat is better at distilling emotions from open-ended survey questions than any other company on the market today, including IBM/Alchemy and other powerful APIs. They can do a lot of advanced text analytics with huge amounts of data. We made it simple, transparent, and fun. Just check out our dashboards — clients love them! Another big differentiator is that we are 100% focused on emotions — not sentiment or basic emotions, but fine-grained feelings. Our reports can be useful for anyone, from a CEO and CMO to a brand manager to a CX analyst to an agency creative director.
SG. How do Heartbeat emotion text analytics findings complement or compare with insights discovered via neuroscience and biometrics, via facial image recognition and brain-activity measurement?
LN. I strongly believe in cooperation over competition. I think putting our best technologies together will create a better future for our businesses and for this planet. A graphic will communicate how I see the future of emotion AI integration. (See the image below.)
[Image: How Heartbeat AI sees the progress of emotion analytics.]
SG. Could you please sketch out 2 or 3 use cases? What's your target market — clients and applications?
LN. Let’s start with the customer, the consumer or shopper. What do all customers have in common? They're human, they feel emotions all the time, and those emotions drive many (if not most) of their decisions.
Why not measure emotions at every step of your customer journey? It's not easy to put eye-tracking and EEG devices on thousands of people, but it's easy to ask one simple question: "Please share a few words that best describe how you feel about X." This natural human question fits nicely into any survey and customer feedback tool, and it's surprisingly powerful. It not only invites a wide range of unaided and unbiased feelings on the subject; it can also help predict what people will do in the future.
We're a start-up, but we have already published two case studies to show the predictive power of emotions measured with HEARTBEAT emotion analytics: banking and political elections. The best application of HEARTBEAT is in ongoing customer experience measurement and foresight.
SG. What's on your product roadmap?
LN. We are driving in the fast lane! We launched our first prototype last December, and won an international startup competition (Insight Innovation Competition by GreenBook) in March 2016 in Amsterdam. All Spring and Summer, we tested our tool with some of the best research companies in the world. Finally, we launched enterprise SaaS this fall, which will enable anyone — brands, market-research suppliers, consultancies, marketing agencies — to access the engine. No more manual coding: leave it to a machine, fast and accurate.
Next, we're venturing into a long complex journey of NLP and machine learning, to crack the challenge of context and meaning.
Here's a quote that resonates with me, especially when it comes to trying to solve one of the biggest puzzles in the Universe, the puzzle of human consciousness: "The most influential thinkers in our own era live at the nexus of the cognitive sciences, evolutionary psychology, and information technology." That's New York Times columnist David Brooks.
The mission of Heartbeat AI is to design emotionally intelligent technologies and tools to help machines understand people's feelings and improve our emotional wellbeing. I don't know exactly how we'll get there in the end, but I know that we are on the right track today.
SG. Thanks Lana!
Readers, I've written up a couple of other emotion analytics interviews — an IBM Watson blog contribution, Sentiment, Emotion, Attitude, and Personality, via Natural Language Processing, based on a conversation with IBMer Rama Akkiraju, and On Facial Coding, Emotion Analytics, and Emotion Aware Applications with Affectiva principal scientist Daniel McDuff — and of course you can learn more about Lana's Heartbeat work at LT-Accelerate in November. If you can swing a trip to Brussels, see you there!
Seth Grimes is the leading industry analyst covering natural language processing (NLP), text analytics, and sentiment analysis technologies and their business applications. He founded Washington, DC-based Alta Plana Corporation, an information technology strategy consultancy, in...
Thanks Seth. Unfortunately I tend not to agree, sorry. So a few challenges for you. (1) Psychology is pretty much in agreement that emotions are grounded in prior appraisal (cognition, both conscious and unconscious) and that emotions can be adapted on reflection. In essence, emotions are not drivers of loyalty, but indirectly influential, and need to be taken in the round with cognition (ref: Baumeister): you might as well say cognition is a driver of loyalty. (2) Hence, as per my last article, it is understanding the situations that lead to an emotional response — consumer well-being — that is critical. (3) On text analytics, I would have thought that the person offering the narrative, since it is their subjective state, would be in the best place to say how they feel and the meaning behind their feelings. Hence, I am supportive of non-text algorithm approaches such as SenseMaker that get the customer to rate and state what their own text means. (4) I am dead against sum-of-everything approaches — customers do not calculate in this way. In short, my argument would be appraisal first. Interested in how you respond. Emotion, I believe, is on the agenda since text enables a response closer to the fleeting moment — a good thing. But that's not emotion; that's closer to the whole cognitive-motivational-relational event (Lazarus).
Seth — as a psychologist working on emotion and with particular experience in CX, I would have to agree with Steven. I have to force myself not to shudder at the mention of 'automated' emotion analytic engines. If psychologists themselves are still in full-blown debate about how the [***] we actually measure emotions, I am somewhat puzzled that 'analytic experts' can raise their hands and say, "yep, we've solved that problem". Most of emotional experience is actually hidden and non-conscious. The number of times I have asked customers how they feel, the picture that tends to emerge is a) they are very bad at actually knowing, and b) they are even worse at verbalising it (as they are using post-rationalised views and are restricted by their language and semantic capability). So, for example, the so-called 'terrible' experience described by a customer flying with an airline brand was in reality, once picked apart, not terrible at all — just 'not as expected'. Just as there is a problem with person A using the words 'super service' and comparing it with person B saying the same. In actual fact they probably mean completely different experiences, but in trying to describe how they feel they have used the same restricted semantic terms. I actually think automated emotional semantic analysis can be misleading and dangerous for brands. Think about your own face-to-face interactions with people — do you just listen to what words they use, or do you fit it with how they say it, what their face looks like, and who is in the environment when they are saying it? If we all just responded to the words that are spoken at us, we'd all get in a lot of trouble quickly! If you truly want to know the complex emotional value of an experience, you need to deploy slightly more robust methods. You also need to capture the non-conscious drivers of the emotion, so you need to dig beyond just the words that appear in a text/email.
Hi Simon and Steven! I found both your comments on Seth's interview very useful. In my experience, customers rarely mention emotive words; rather, they explain what happened during their interaction with a company. Lana solves this problem by probing customers to pinpoint specific emotions, and for those who can express them, she can capture them using her tools. I propose an alternative way of addressing this: sticking to detecting themes that aren't exactly emotional, but which translate to human behaviours that trigger emotions in customers. Here is a link to my post describing this in more detail, if you are interested: https://www.linkedin.com/pulse/missing-link-emotional-analysis-customer-...
Steven, thanks for the comment. Frankly, my concern is systematic emotion measurement and analysis at scale — what I termed "emotion AI" — rather than the sources, and like you (I believe), I am interested in how emotion models might be turned to predictive use, which is roughly equivalent to your "situations that lead to an emotional response." I will say that, as a technologist, I subscribe to the attitude that you should aim for adequate and useful and then iteratively refine — or abandon and restart if necessary — in preference to requiring complete theoretical grounding from the start.
Simon, I won't dispute that "Most of emotional experience is actually 'hidden and non conscious'." Yet "nothing is hidden that will not be made manifest, nor is anything secret that will not be known and come to light." So my attitude is, measure what you can, and see what you can make of what you measure. Our descriptions will be representations rather than absolutes (cf Wittgenstein on color). Their correctness will resist proof. But they may be useful nonetheless, particularly as we develop and move to the robust methods that we agree are needed.
I find this to be a really interesting discussion. If we look beyond the binary right/wrong POV and admit that we all collectively know probably less than 10% about how the human brain/mind operates, we can let new ideas and approaches in. No one really knows where the boundary between conscious and sub-conscious is, whether it's a CX experience or a conversation with your mother-in-law. Luckily, there are many tools that can help uncover deep insights, learn, explain, and even predict outcomes. Heartbeat is one such tool, detecting themes is another, and there will be many more coming. The field of emotion analytics is growing, and we need to buckle up and join the ride, or be ready to be tossed to the sidelines. This is coming to CX soon: "Emotionally intelligent computers may already have a higher EQ than you" https://techcrunch.com/2016/12/02/emotionally-intelligent-computers-may-.... My suggestion is to expand existing horizons and start testing and integrating our tools for the benefit of clients and consumers.
Seth, I am not in disagreement on 'aim for adequate', nor on going for scale. But there is a serious problem if you go for scale with something that claims to measure emotion but does nothing of the sort. That's a general comment, not a critique of your approaches. Once again, it is standard cognitive psychology: emotion requires cognition. It requires the person who feels to rate and state how they feel. I am not saying an algorithm says nothing, but it is deeply myopic. Only a human can interpret their own context effectively. Humans at the front, machines in the middle, humans at the end. All else is software sales.