Real-World Examples of Emotion AI in Action


In our last blog, we looked at examples of sentiment analysis, which classifies expressed intent as positive, negative, or neutral. That classification alone may not provide enough insight into what you hope to learn from what is being communicated. Maya Angelou once said, “People will forget what you said, people will forget what you did, but people will never forget how you made them feel.” Emotion AI is all about understanding feelings. In this blog, we look at real-world examples of Emotion AI in action.

Image-Based Emotion Analysis Measuring Customer Response

Disney pioneered early work using image analysis to detect audience emotions during movie screenings. The company outfitted a 400-seat theatre with four infrared cameras to monitor 150 showings of nine mainstream movies, including Star Wars, Zootopia, and Jungle Book. This allowed them to identify emotions, their intensity, and how consistently they were felt. Even more impressive was how accurate the system became at predicting emotion. The resulting dataset of 16 million facial indicators across 3,179 audiences trained a machine learning algorithm that could predict audience reactions across a spectrum of emotions, from joy, happiness, and humour through sadness and surprise.


With this advanced analysis, the company derives usable insights to generate a better emotional connection with audiences in upcoming films.

Text-Based Emotion Analysis for Market Research or Customer Service

More information is readily available for analysis in text form, whether from social media reviews or proprietary information gathered via surveys and other sources. Going beyond sentiment analysis, understanding the type and depth of emotion expressed can provide greater insight into an audience’s perceptions. The valence-arousal model illustrated below shows how emotional data points derived from emotion AI can be plotted.

In this model, valence on the horizontal axis indicates the degree of negativity or positivity of the analyzed text. The vertical axis represents arousal, the degree of excitement or enthusiasm, ranging from a low-arousal, relaxed state at the bottom to a highly excitable state at the top. This creates a quadrant effect: enthusiastically positive emotions fall in the upper right quadrant and resigned negative emotions in the lower left, while animatedly negative emotions plot in the upper left quadrant and calm yet positive emotions in the lower right.
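The quadrant logic described above can be sketched in a few lines of code. This is a minimal illustration, assuming hypothetical (valence, arousal) scores in the range -1 to 1, as an emotion AI service might return them; the function name and score scale are assumptions, not part of any specific product.

```python
def emotion_quadrant(valence: float, arousal: float) -> str:
    """Map a hypothetical (valence, arousal) point to one of the four quadrants."""
    if valence >= 0 and arousal >= 0:
        return "enthusiastically positive"   # upper right quadrant
    if valence < 0 and arousal >= 0:
        return "animatedly negative"         # upper left quadrant
    if valence < 0 and arousal < 0:
        return "resigned negative"           # lower left quadrant
    return "calm positive"                   # lower right quadrant


# A highly excited, positive reaction lands in the upper right quadrant:
print(emotion_quadrant(0.8, 0.7))
# A subdued, negative reaction lands in the lower left:
print(emotion_quadrant(-0.6, -0.4))
```

Plotting many such points for a body of feedback makes the overall emotional distribution visible at a glance.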


Consider text analyses such as these Reddit examples, drawn from Google Research’s GoEmotions dataset and mapped against this model.


Applying emotion AI in this way and plotting the instances of feedback allows you to better understand the depth of reaction to a brand, product, or marketing message. It also lets you evaluate the responses received in customer service threads in a truly objective way.
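One way to plot labelled feedback like this is to map each emotion label onto an approximate position in the valence-arousal plane. The coordinates below are rough illustrative assumptions for a handful of GoEmotions labels, not values published with the dataset.

```python
# Assumed (valence, arousal) coordinates for a few GoEmotions labels.
# These placements are illustrative, chosen to match the quadrant
# description above, not an official mapping.
EMOTION_COORDS = {
    "excitement": (0.7, 0.8),   # enthusiastically positive
    "joy": (0.8, 0.5),          # enthusiastically positive
    "relief": (0.5, -0.4),      # calm yet positive
    "sadness": (-0.7, -0.5),    # resigned negative
    "anger": (-0.7, 0.7),       # animatedly negative
    "neutral": (0.0, 0.0),      # origin
}


def feedback_points(labels):
    """Return plottable (valence, arousal) points for a batch of labelled feedback."""
    return [EMOTION_COORDS[label] for label in labels if label in EMOTION_COORDS]


points = feedback_points(["joy", "anger", "sadness"])
```

Feeding the resulting points into any scatter-plot library would reproduce the kind of quadrant visualization shown in the examples above.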


Inference

All businesses today aspire to enable their people to make better decisions faster. Analyzing sentiment and emotion has been difficult to do at scale, causing many to see it as subjective and anecdotal. Yet the tools and know-how now exist to analyze sentiment and emotion with precision and objectivity at scale. Making this kind of high-value insight available to more of your staff enables easier decision-making, grounded in objective facts and data rather than educated guesses or “hunches”.

If you have a use case for sentiment or emotion analysis and would like to explore further, contact us at support@qualetics.com.