Facial recognition technology is pretty well-known at this point, but some developers are taking it one step further to recognize not only faces but also the emotions that those faces display. While this could be useful for a variety of industries, this technology also has some limitations that organizations need to address before they incorporate it.
Emotion recognition technology overview
- What is emotion recognition technology?
- Use cases for emotion recognition AI
- Limits of emotion detection AI
- Deciding if emotion recognition software is right for your business
What is Emotion Recognition Technology?
Emotion recognition technology is a type of artificial intelligence related to facial recognition that attempts to identify how a human subject is feeling based on their facial expressions and bodily cues, including heart rate and brain activity. The software can also track eye movement to identify which parts of stimuli the subject is paying the most attention to. It requires deep learning technology in order to improve as it gathers more data.
Kate Krosschell, marketing automation manager at iMotions, explains, “The most commonly referenced methodology is Facial Expression Analysis, which works on algorithms trained on human facial coders and millions of human faces to detect facial action coding units like lip curl, eyebrow raise, brow furrow, etc. The technology puts these action units together to assess the expressivity of seven core emotions (joy, anger, fear, disgust, contempt, sadness, and surprise). These output measures provide probability values to represent the likelihood that the expected emotion is being expressed.”
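The pipeline Krosschell describes can be sketched in a few lines: detect the intensity of individual facial action units, combine them with learned weights per emotion, and squash each score into a probability. The action-unit codes below follow the Facial Action Coding System, but the weights and the bias term are purely illustrative placeholders, not values from any real product. Real systems learn these from millions of coded faces.

```python
import math

# The seven core emotions mentioned in the article.
EMOTIONS = ["joy", "anger", "fear", "disgust", "contempt", "sadness", "surprise"]

# Hypothetical weight per action unit (AU) for each emotion. AU codes follow
# the Facial Action Coding System (AU12 = lip corner pull, AU4 = brow furrow,
# etc.); the numeric weights here are made up for illustration.
AU_WEIGHTS = {
    "joy":      {"AU6": 1.5, "AU12": 2.0},              # cheek raise, lip corner pull
    "anger":    {"AU4": 1.8, "AU7": 1.0, "AU23": 1.2},  # brow furrow, lid tighten, lip press
    "fear":     {"AU1": 1.0, "AU4": 0.8, "AU20": 1.5},  # inner brow raise, lip stretch
    "disgust":  {"AU9": 2.0, "AU15": 0.8},              # nose wrinkle, lip corner drop
    "contempt": {"AU12": 0.5, "AU14": 1.8},             # dimpler (typically unilateral)
    "sadness":  {"AU1": 1.2, "AU4": 0.6, "AU15": 1.5},  # inner brow raise, lip corner drop
    "surprise": {"AU1": 1.0, "AU2": 1.2, "AU26": 1.5},  # brow raise, jaw drop
}

def emotion_probabilities(au_intensities: dict) -> dict:
    """Score each emotion from detected AU intensities (0..1) and pass each
    score through a logistic function, yielding an independent per-emotion
    probability of the kind the quoted tools report."""
    probs = {}
    for emotion in EMOTIONS:
        score = sum(weight * au_intensities.get(au, 0.0)
                    for au, weight in AU_WEIGHTS[emotion].items())
        # The -1.0 bias keeps a near-neutral face at low probability.
        probs[emotion] = 1.0 / (1.0 + math.exp(-(score - 1.0)))
    return probs

# A face showing a strong cheek raise and lip corner pull should
# come out with joy as the most probable emotion.
probs = emotion_probabilities({"AU6": 0.9, "AU12": 0.8})
print(max(probs, key=probs.get))  # joy
```

In a deployed system the weights would come from a model trained on human-coded faces, and the AU intensities from a computer-vision front end; this sketch only shows how per-emotion probabilities fall out of the action-unit scores.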
Use Cases for Emotion Recognition AI
Emotion recognition technology has use cases across a variety of industries, including healthcare, marketing, and manufacturing.
Healthcare providers can use emotion recognition AI to prioritize their patients by analyzing facial expressions in the waiting room, especially in urgent care centers where people don’t schedule appointments. Those who are in the most discomfort could receive the highest priority, while those with lesser ailments could wait for an opening.
However, researchers are also using emotion recognition technology in more experimental ways. Using Google Glass and a custom smartphone app, researchers at the Stanford University School of Medicine found a way to help autistic children better identify the emotions and facial expressions they encountered. The app gives the child real-time feedback on the facial expressions of other people while the child is wearing Google Glass.
Talking about her son Alex, a participant in the study, Donji Cullenbine said, “It was a game environment in which he wanted to win — he wanted to guess right.” The Stanford researchers gamified their app to make it more appealing to children, and after an average of ten weeks, 12 out of the 14 trial families saw marked improvement in the amount of eye contact their autistic children were showing.
Marketers always want to know how products will perform before releasing them, but that's easier said than done. Focus groups aren't always as reliable as companies would hope: some participants, worried about hurting someone's feelings, sugarcoat their feedback rather than being honest. Emotion recognition technology can help companies get more from their focus groups by analyzing the facial expressions of testers as they use a product or watch an advertisement.
“Recently our client Unravel Research used EEG to detect the hit potential of streaming music on Spotify,” Krosschell said. “They found that brain activity among a small sample of listeners can indicate something called neural synchrony, which has a pretty interesting correlation with how popular a song ends up being on Spotify over the span of weeks and months.”
Also read: The Overlooked and Undervalued Importance of Marketing
Automotive manufacturers can also make good use of emotion recognition technology. Cars that alert drivers when they nod off or are getting drowsy can help prevent dangerous accidents. Road rage or other extreme emotions could also trigger the alert. This could be especially helpful in cars with self-driving or autopilot features: if the human driver becomes overly emotional or sleepy, the autopilot can engage while alerting the driver. The alert could startle them awake or give them a few moments to calm down before resuming control of the vehicle.
Limits of Emotion Detection AI
As with most AI models, accuracy is a concern with emotion detection. Researchers have to train the AI extensively and include subjects from a variety of cultures because not every culture expresses emotions in the same way. Even with this diverse training, the AI would have to correctly determine which culture a subject belongs to in order to provide an accurate reading, which may not always be possible.
In her article for The Atlantic, Kate Crawford, research professor at USC Annenberg and a senior principal researcher at Microsoft Research, writes, “Given that facial expressions are culturally variable, using them to train machine-learning systems would inevitably mix together all sorts of different contexts, signals, and expectations.” Researchers can’t expect a smile in one country to portray the exact same emotion in another. For example, many people laugh when they get nervous or simply to be polite. It doesn’t always convey joy.
Also read: The Struggles & Solutions to Bias in AI
Gather Feedback and Improve Products with Emotion Recognition Technology (and Permission!)
Emotion recognition technology has many uses in the business world, but it also comes with limitations. It’s a great tool for gathering feedback about products and services, and it may also help companies make large strides in both healthcare and manufacturing. Companies looking to gather research on their customers or improve their products should look into emotion detection AI. However, it’s important to keep in mind that not everyone displays their emotions in the same way, so the software shouldn’t be used to make life-altering decisions, like ascribing guilt in the case of a crime or determining if someone is lying to their insurance provider. Emotion recognition technology should only be used with the subject’s consent.
Read next: Google Makes Case for Managing AI Models