Are Facial Datasets Sufficient to Perform Sentiment Analysis Using Emotional AI?

December 19, 2019

From efficiency-focused computing to affective computing, AI has come a long way in transforming our personal and professional spaces. Affective computing gathers data from an individual's personal sphere and leverages it for professional gains. The technology, commonly known as Emotional AI, enables the study and development of devices and applications able to recognize, interpret, process, and simulate human affect. Emotional AI is an interdisciplinary field spanning computer science, psychology, and cognitive science.

According to Harvard Business Review, the days when AI technology will be able to “recognize, process, and simulate” human emotions are not far away. As emotional AI has already carved out a unique place in the affective computing market, its market size is predicted to grow to about US$41 billion by 2022.

The technology of sentiment analysis, or emotion analysis, offers valuable insights into rapidly growing customer service issues, making it easier to identify and act on the root cause of problems, or even mitigate them before they reach critical mass. It can also help monitor and improve a brand’s reputation; provide insight into customer behavior and attitudes toward particular services, products, campaigns, or other topics; and deliver a unified view of the full customer journey.
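As a toy illustration of how a text-based sentiment pipeline might triage contact-center messages, the sketch below scores a message against small word lists and flags strongly negative ones for a human agent. The word lists, threshold, and function names are invented for illustration; they are not any vendor's product or a production-grade model.

```python
# Minimal lexicon-based sentiment scoring for customer-service messages.
# POSITIVE/NEGATIVE word lists are illustrative placeholders only.
POSITIVE = {"great", "helpful", "thanks", "resolved", "love"}
NEGATIVE = {"broken", "angry", "waiting", "refund", "terrible"}

def sentiment_score(text: str) -> float:
    """Return a score in [-1, 1]; positive values mean positive sentiment."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

def flag_for_escalation(text: str, threshold: float = -0.5) -> bool:
    """Route strongly negative messages to a human agent first."""
    return sentiment_score(text) <= threshold

print(flag_for_escalation("Still waiting for my refund, this is terrible!"))
```

Real systems replace the hand-built lexicon with trained models, but the triage pattern (score, then route by threshold) is the same mechanism by which sentiment signals feed customer-service workflows.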

According to some statistics, around 46 percent of callers simply want to be listened to during a conversation with a contact center agent. Here, emotional AI can help agents build an emotional connection with customers and cultivate empathy toward them, helping organizations push customer experience to the next level.

Currently, however, the technology is being used to assess job applicants and people suspected of crimes. Emotional AI is also being tested for further applications, such as decoding gamers’ emotional states through VR headsets.

On the darker side, the technology is also suspected of amplifying race and gender disparities. Regulators should step in to impose significant restrictions on its uses, and AI companies need to address biases in their systems to mitigate such issues. According to a study by the Association for Psychological Science, it is quite hard to tell accurately how someone is feeling from facial expressions alone. The association spent two years reviewing more than 1,000 papers on emotion detection.

Lisa Feldman Barrett, a professor of psychology at Northeastern University who worked on the study, said: “About 20 to 30 percent of the time, people make the expected facial expression,” such as smiling when happy. But the rest of the time, they don’t. “They’re not moving their faces in random ways. They’re expressing emotion in ways that are specific to the situation.”

The AI Now institute says in its December 2019 report that “regulators should ban the use of affect recognition in important decisions that impact people’s lives and access to opportunities.”

The report also states: “there remains little to no evidence that these new affect-recognition products have any scientific validity.” In February, researchers at Berkeley found that detecting emotions with accuracy and high agreement requires context beyond the face and body.