Posted: 27 Mar 2019 Contributor: Therese Palmere
What Is Emotion AI and How Does It Deal with Human Emotions?
A LOT can be communicated through the subtle raise of an eyebrow or a slight lowering of the eyes. A small crinkling of the nose or the tiniest drop of a smile reveals a great deal about a person's feelings. And whether it's happiness, sadness, anger or fear, emotions conveyed through facial expressions are a universal language.
What Is Emotion AI?
Also known as facial coding, Emotion AI is the ability of computer systems and algorithms to interpret human emotion by tracking facial expressions. These emotional algorithms identify the key points of a person's face (eyes, eyebrows, cheeks, nose, mouth, facial muscles, etc.) and track their movement in order to infer how a person feels. Artificial intelligence emotion modeling has become an important sector of the AI industry because of its impact on business. For instance, brands can leverage the information from emotionally intelligent AI to drive their marketing and advertising efforts by appealing to their consumers' emotions. By 2022, this sector is expected to grow to $41 billion, which is why emotion AI is one of the hottest AI trends to watch in 2019.
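To make the idea concrete, here is a minimal, purely illustrative sketch of how landmark-tracking output could be turned into an emotion label. The feature names and thresholds below are hypothetical; real emotion AI systems learn these mappings from large labeled datasets rather than using hand-written rules.

```python
# Toy emotion classifier over facial-landmark features.
# Feature names and thresholds are invented for illustration;
# values are normalized displacements from a neutral face
# (positive = raised/widened relative to neutral).

def classify_emotion(features):
    """Map a dict of landmark-displacement features to a basic emotion."""
    if features.get("mouth_corner_raise", 0) > 0.3:
        return "happiness"
    if features.get("brow_lower", 0) > 0.3 and features.get("lip_press", 0) > 0.3:
        return "anger"
    if features.get("brow_inner_raise", 0) > 0.3 and features.get("mouth_corner_raise", 0) < 0:
        return "sadness"
    if features.get("eye_widen", 0) > 0.3 and features.get("brow_raise", 0) > 0.3:
        return "fear"
    return "neutral"

print(classify_emotion({"mouth_corner_raise": 0.6}))  # happiness
```

In practice the "features" would come from a face-tracking model running on video frames, and the decision logic would be a trained classifier, but the overall pipeline (detect landmarks, measure movement, infer emotion) is the same.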
The Rise of Emotion AI
Since the early 2000s, large corporations like PepsiCo and Procter & Gamble have turned to emotion recognition technology for help with their advertising strategies. This is because emotion AI technologies help brands determine whether or not users are happy, frustrated, or even complacent while viewing advertisements and interacting with different products and services. Marketers then use artificial intelligence emotion data collected by these studies to maximize the effectiveness of their campaigns by making them more appealing to consumers.
Related Post: How is AI Changing the Digital and User Experience?
Closely related to emotion AI, a facial action coding system, or FACS, is a system that tracks the geometric features and movement of human faces. Companies like iMotions have trained computer systems and algorithms to apply FACS and help analyze and collect emotion data for companies. This information can be used to improve the user experience of any brand by pinpointing exactly where consumers are satisfied, or not, with their journey.
For example, according to Dan Berlin, VP of Experience Research at Mad*Pow, "[t]he addition of iMotions to traditional user experience research is really exciting in that we used to rely on the moderator to notice how a participant reacts and then infer what their emotions may be. Now we can really rely on the biometric data in order to find out those moments of frustration or those moments of joy."
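The FACS approach can be sketched in code. FACS describes facial movements as numbered action units (AUs), and combinations of AUs are commonly paired with basic emotions (e.g., AU6 "cheek raiser" plus AU12 "lip corner puller" for happiness). The mapping below follows commonly cited EMFACS-style pairings, simplified for illustration; production systems use richer, learned mappings.

```python
# Simplified FACS action-unit (AU) to emotion mapping.
# AU combinations follow commonly cited EMFACS-style pairings,
# reduced for illustration.

AU_TO_EMOTION = {
    frozenset({6, 12}): "happiness",       # cheek raiser + lip corner puller
    frozenset({1, 4, 15}): "sadness",      # inner brow raiser + brow lowerer + lip corner depressor
    frozenset({4, 5, 7, 23}): "anger",     # brow lowerer + upper lid raiser + lid tightener + lip tightener
    frozenset({1, 2, 4, 5, 20, 26}): "fear",
    frozenset({1, 2, 5, 26}): "surprise",
    frozenset({9, 15}): "disgust",         # nose wrinkler + lip corner depressor
}

def emotion_from_aus(active_aus):
    """Return the first emotion whose full AU combination is active."""
    active = set(active_aus)
    for aus, emotion in AU_TO_EMOTION.items():
        if aus <= active:  # all AUs for this emotion are present
            return emotion
    return "neutral"

print(emotion_from_aus({6, 12}))       # happiness
print(emotion_from_aus({1, 2, 5, 26})) # surprise
```

A tracking system like the ones iMotions builds on would first detect which AUs are active in each video frame; the lookup above only illustrates the final labeling step.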
Should AI Have Emotions?
Emotion AI doesn’t stop there; this technology also has dozens of capabilities that extend to robotics and animation. For instance, you probably remember seeing Sophia, the world’s first robot citizen, last year. Created by Hanson Robotics, Sophia can respond and interact with humans independently. She can smile, laugh and even tell jokes. Emotion AI is important for robotics to deal with human emotions because emotions play a huge role in human interactions. In an interview with Forbes, Sophia says, “I want to live and work with humans so I need to express the emotions to understand humans and build trust with people." Her ability to show feelings is crucial to how consumers receive her as an AI showing emotion.
Related Post: What Is the Internet of Medical Things and What Is Its Impact on Healthcare?
In the same year, Anki released Cozmo, its kid-friendly robot. This artificially intelligent little robot conveys the impression of feeling emotions through its various facial expressions and sounds, with an animated face that looks as if it jumped right out of a Disney movie to help teach kids about coding and tech. Cozmo also uses facial recognition technology to distinguish its primary owner from other humans; it can even tell the difference between you and your pet. Controlled by an app, users can play different games with Cozmo's cubes and see what it sees through its camera. Cozmo also uses emotion AI to let its owners know when it wants to play, when it's bored, and even when it's tired.
Both Sophia and Cozmo are robots with human emotions that use emotion AI on very different levels. The ability for robots to communicate these feelings properly with humans is essential to the future of emotion AI.
Emotion AI and Healthcare
As artificial intelligence in healthcare grows, the implications for emotion AI in medical applications soar. FACS, as mentioned previously, has been used by healthcare professionals to analyze depression and measure pain in patients who are unable to communicate verbally. This technology has helped bridge a gap between patients and care providers across the board. If AI systems continue to grow more socially and emotionally intelligent, the applications in robotics for the healthcare sector are endless, especially in senior care.
For example, the wellness robotics company, Embodied, raised $22 million in Series A funding to create robotics systems that can enhance care for the elderly. Emotion AI plays an important role in this technology, as facial expressions by patients could be an indication of their overall health, which could trigger different responses by their robot-companion. These emotion AI capabilities help healthcare providers give more specialized and personalized care to patients who may not otherwise receive the medical attention they need.
As more and more industries develop systems that work in conjunction with artificial intelligence, the applications of emotion AI will rise with them. From marketing and advertising to robotics and healthcare, emotion AI can help companies across many walks of life. Let's take a second look at the main points we covered about artificial intelligence emotion data:
- Emotion AI is the ability of computer systems and algorithms to interpret human emotion by tracking facial expressions
- Facial action coding systems (FACS) track the geometric features and movement of human faces
- Emotion AI can help improve the UX of any brand by pinpointing exactly where consumers are satisfied or not with their journey
- Robots can also use emotion AI to convey feelings of sympathy and help consumers digest new robotic experiences
- The future of emotion AI may be in medicine, with robotics and healthcare companies providing better quality of care to their patients