Identifying what someone is feeling, or even anticipating their reactions from nonverbal cues, is no longer a skill reserved for sensitive and astute people.
With advances in emotional intelligence technology, this capability gains new dimensions: machines can now recognize human emotions for a variety of purposes.
Facial analysis algorithms have become powerful enough to detect and measure emotions captured in real-world situations, so powerful that serious ethical questions are being raised.
How does Emotion Recognition work?
Emotion Recognition is based on facial expression recognition, a computer-based technology that employs algorithms to detect faces, code facial expressions, and recognize emotional states in real-time.
It accomplishes this by analyzing faces in images or video using computer-powered cameras embedded in laptops, mobile phones, and digital signage systems, as well as cameras mounted on computer screens.
In general, computer-powered cameras are used for facial analysis in three steps:
- Recognition of faces: Identifying faces in a scene, an image, or video footage.
- Recognition of facial landmarks: Extracting facial feature information from the detected faces, for example, detecting the shape of facial components or describing the skin's texture in a facial area.
- Emotion classification from facial expression: Analyzing the movement and appearance changes of facial features and mapping them to interpretive categories: facial muscle activations (such as a smile or frown), emotion categories (such as happiness or anger), and attitude categories (such as (dis)liking or ambivalence).
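The last step of the pipeline above can be sketched in a few lines: a CNN trained for emotion classification emits one probability per emotion class, and the classifier simply picks the most likely label. This is a minimal sketch in plain Python; the label order shown follows the usual fer2013 dataset convention.

```python
# Sketch of the final classification step: a model's softmax output
# (one probability per emotion class) is mapped to the most likely label.
# Label order follows the fer2013 convention.
FER2013_LABELS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def classify_emotion(probabilities):
    """Map a softmax output vector to a (label, confidence) pair."""
    if len(probabilities) != len(FER2013_LABELS):
        raise ValueError("expected one probability per emotion class")
    best = max(range(len(probabilities)), key=lambda i: probabilities[i])
    return FER2013_LABELS[best], probabilities[best]

# Example: a model that is fairly sure it sees a smile.
label, confidence = classify_emotion([0.02, 0.01, 0.03, 0.85, 0.04, 0.03, 0.02])
print(label, confidence)  # → happy 0.85
```

In a real system these probabilities come from the CNN's output layer; everything before this step (detection, cropping, preprocessing) only exists to produce a well-formed input for that network.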

A practical example
Some time ago, to learn more about this technology, I implemented a facial sentiment detector capable of real-time face detection and emotion/gender classification, using the fer2013 and IMDB datasets with a Keras CNN model and OpenCV.
The system reached 96% accuracy on the IMDB gender classification test and 66% accuracy on the fer2013 emotion classification test.

The system is based on a paper by the B-IT-BOTS robotics team that proposes a general framework for designing real-time convolutional neural networks (CNNs) for a vision system that performs face detection, gender classification, and emotion classification.
For further details, please refer to my GitHub.
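To give a feel for what such a system does internally, the input preprocessing a fer2013-style Keras model expects can be sketched as follows. This is a hedged sketch assuming only NumPy: the crude nearest-neighbor resize keeps the example self-contained, whereas a real pipeline would use OpenCV's cv2.resize with proper interpolation.

```python
import numpy as np

def preprocess_face(gray_face, size=48):
    """Prepare a grayscale face crop for a fer2013-style Keras CNN:
    nearest-neighbor resize to 48x48, scale pixels to [0, 1], and add
    batch and channel dimensions (a real pipeline would use cv2.resize)."""
    h, w = gray_face.shape
    rows = np.arange(size) * h // size   # nearest-neighbor row indices
    cols = np.arange(size) * w // size   # nearest-neighbor column indices
    resized = gray_face[np.ix_(rows, cols)].astype("float32") / 255.0
    return resized[np.newaxis, :, :, np.newaxis]  # shape (1, 48, 48, 1)

# A dummy 120x160 "face crop" stands in for a face detector's output.
face = np.random.randint(0, 256, (120, 160), dtype=np.uint8)
batch = preprocess_face(face)
print(batch.shape)  # → (1, 48, 48, 1)
```

The resulting tensor is what would be fed to the Keras model's predict call; the detector (for example, an OpenCV face detector) supplies the grayscale crop.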
Raising awareness with a game
There are many ways to start discussions about the ethical aspects of AI emotion recognition.
One of them is letting people make faces at their webcam or smartphone camera to see an AI in action, through a game that uses emotion recognition.
Recently, a team of researchers from the University of Cambridge and UCL has created a website designed to help people understand how computers can scan facial expressions to detect emotion and start discussions about technology and its social implications.
This project was created by a group of social scientists, citizen scientists, and designers who want to start a conversation about emotion recognition systems, from the science behind the technology to its social implications – and everything in between.
When you visit the website, you can play a game to make faces at your device’s camera to get the AI emotion recognition system to recognize six emotions: happiness, sadness, fear, surprise, disgust, and anger.
The project allows people to interact with these systems and gain a better understanding of how powerful they are, as well as how flawed they are.
If you want, you can also answer a series of optional questions to help researchers, such as whether you have used the technology before and whether you find it useful or concerning.
The optional answers will be used in an academic paper on citizen-science approaches to better understand the societal implications of emotion recognition.
The ethical aspects of Emotion Recognition
Emotion recognition is unreliable because it presumes that our facial expressions perfectly reflect our inner emotions.
Anyone who has ever faked a smile knows that this is often not the case.
This kind of initiative is an engaging way to spark discussions about technology and its social implications.
The project may look like a simple game, but it is also a science-backed way to make people think about what is at stake with this technology, given its worrying potential for discrimination and surveillance.
Applications of Sentiment Analysis
With strong commercial appeal, businesses can use this technique to assess the general mood of their customers and define marketing strategies accordingly. A company with a social media presence can use such tools to understand what is being said about its brand and how its customers behave.
The technique can also be used in call centers: with artificial intelligence, it is possible to identify feelings in real time and, as a result, provide more targeted care.
The field of sentiment analysis is extensive, and its techniques can be used for a variety of purposes, including:
- Market research to capture the emotion expressed by the general public;
- Immersive experience creation: the ability to change the virtual reality environment based on the emotions transmitted by the user;
- User evaluations;
- Virtual assistants: language can be customized based on the user’s emotions.
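The text side of sentiment analysis, as used in the market-research and social-media cases above, can be illustrated in its simplest form with a lexicon lookup. This is a toy sketch: the word lists are illustrative only, and production systems use trained models rather than hand-picked vocabularies.

```python
# Toy lexicon-based sentiment scorer: counts positive vs. negative words.
# The tiny word lists are illustrative; real systems use trained models.
POSITIVE = {"great", "love", "excellent", "happy", "good"}
NEGATIVE = {"bad", "hate", "terrible", "sad", "awful"}

def sentiment(text):
    """Return 'positive', 'negative', or 'neutral' for a piece of text."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("I love this product, it is great"))  # → positive
print(sentiment("terrible support, I hate waiting"))  # → negative
```

Even this naive approach hints at how a brand could summarize the mood of thousands of social media mentions; facial emotion recognition plays the analogous role for video and camera feeds.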
In some countries, such as China, AI emotion recognition technology has been used in many sectors, including police interrogation and school behavior monitoring.
Other potential applications include:
- Border control.
- Evaluation of candidates during job interviews.
- Collection of customer insights for businesses.
Conclusion
Facial analysis algorithms are now powerful enough to detect and measure emotions captured in real-world situations, and that power raises serious ethical questions.
We therefore need a more open and democratic process to determine how the technology can be used and, especially, to decide how it should be used.
The user experience combined with information security needs to be strengthened, and that goes beyond implementing technical mechanisms.
Increasingly, users will need the means to manage which data they want to share and which they do not.
Reference
- Emojify, an online game to show risks of AI emotion recognition – https://www.technologytimes.pk/2021/04/04/emojify-an-online-game-to-show-risks-of-ai-emotion-recognition/
- Face classification and detection – https://github.com/gitliber/Facial_Emotion_Detector
- Test the artificial intelligence emotion recognition technology – https://emojify.info/
- Scientists create an online game to show risks of AI emotion recognition – https://jerseyeveningpost.com/news/uk-news/2021/04/04/scientists-create-online-game-to-show-risks-of-ai-emotion-recognition/
- Introduction to Facial Emotion Recognition – https://algorithmia.com/blog/introduction-to-emotion-recognition
- Facial emotion recognition using deep learning: review and insights – https://www.sciencedirect.com/science/article/pii/S1877050920318019
- Artificial intelligence advances threaten privacy of … – https://www.eurekalert.org/pub_releases/2019-01/uoc–aia010319.php
- Real-time Convolutional Neural Network for Emotion and Gender Detection – https://github.com/gitliber/Facial_Emotion_Detector/blob/master/report.pdf