
Will AI Ever Understand Our Emotions?

The use of machine learning to recognise human emotions

Affective computing

Artificial Intelligence has transformed our lives, and its applications are everywhere: from the face recognition software on your phone and smart digital assistants like Siri, Alexa or the Google Assistant to the systems that recommend your next favourite song or TV series. AI is also being used in another area of computing, namely Affective Computing. What does this mean, however? In simple terms, it is a machine’s ability to deal with something we believed machines would never be capable of, except perhaps in science fiction movies: emotions!

Could our Apple Watch or Fitbit ever know how we feel?

Well, in reality, this is not as simple as a yes or a no. In short, however, there are clues that can be extracted from physiological sensors to help us infer an individual’s emotional state. For instance, one of the most clinically validated approaches is the use of a galvanic skin response sensor to assess stress. In simple terms, the sensor detects micro-levels of sweat on the skin, which can indicate that we are in a stressful situation. How many times have your palms become sweaty, and when did it happen? Perhaps before an exam or another stressful event?

Another example is nervousness and how different people deal with it. Some might move around more than usual, while others might remain unusually quiet and still. In both cases, physical activity changes from its usual state, and the heart rate is potentially elevated.

Another clinically validated observation relates heart rate to stress. More specifically, even if the heart beats at, say, 60 beats per minute on average, within that minute it will not beat 60 times at equal intervals. The interval from one beat to the next (the beat-to-beat interval) varies slightly, e.g., 828 ms, 845 ms, 754 ms, 742 ms; the variation is usually a few tens of milliseconds up to a hundred or so. This is an important observation because it has been correlated with the function of the autonomic nervous system. Specifically, when we are in fight-or-flight mode the beat-to-beat intervals tend to be more constant, while when we are relaxed there tends to be greater variation. This is formalised as Heart Rate Variability (HRV), and it gives an insight into how we feel from a physiological perspective. HRV is also used by professional athletes to optimise their training regimes, and it is recommended in some mindfulness practices and in the area of biofeedback in general.
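To make HRV concrete, here is a minimal Python sketch that computes two standard HRV statistics, SDNN and RMSSD, from a list of beat-to-beat intervals. These are common textbook metrics, not necessarily the exact ones used in the studies discussed later; the example intervals are the ones mentioned above.

```python
import numpy as np

def hrv_features(rr_intervals_ms):
    """Compute two common HRV metrics from beat-to-beat (RR) intervals in milliseconds.

    SDNN:  standard deviation of all RR intervals.
    RMSSD: root mean square of successive differences, often associated with
           parasympathetic ("rest and digest") activity.
    """
    rr = np.asarray(rr_intervals_ms, dtype=float)
    sdnn = rr.std(ddof=1)
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))
    return {"sdnn_ms": sdnn, "rmssd_ms": rmssd}

# More variation between beats -> higher HRV -> typically a more relaxed state.
print(hrv_features([828, 845, 754, 742]))
```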

The common denominator of these approaches is the use of special sensors to monitor the activity of the human body in various ways. “Special” does not mean uncommon, not anymore: sensors that measure heart rate, temperature, and movement are embedded in commercial devices many people already own, such as smartwatches and fitness bands.

But how is all this relevant to AI?

In short, these sensors generate (time-series) signals that algorithms can use to identify patterns. Practically, from a machine learning perspective, we can create features (characteristics) from those signals that capture information which can then be classified as a specific emotional state. We are essentially looking for the cues in those signals that capture emotions. For example, from a heart rate signal you could calculate the HRV, which, as discussed above, plays a role in stress; from the accelerometer you could calculate the amount of time spent being active, and so on.
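As a rough illustration of this pipeline (not the studies’ actual feature set or model), the sketch below turns one window of heart rate and accelerometer data into a small feature vector and fits an off-the-shelf classifier on two toy, hand-labelled windows. The `active_threshold` cut-off and the labels are hypothetical.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(rr_ms, accel_magnitude, active_threshold=1.1):
    """Summarise one window of raw signals as a small feature vector.

    rr_ms:           beat-to-beat intervals (ms) within the window.
    accel_magnitude: accelerometer magnitude samples (in g) within the window.
    active_threshold is a hypothetical cut-off for counting a sample as "active".
    """
    rr = np.asarray(rr_ms, dtype=float)
    acc = np.asarray(accel_magnitude, dtype=float)
    return [
        60000.0 / rr.mean(),                     # mean heart rate (bpm)
        np.sqrt(np.mean(np.diff(rr) ** 2)),      # RMSSD, an HRV feature
        float(np.mean(acc > active_threshold)),  # fraction of samples "active"
    ]

# Toy example: a "relaxed" window (varied RR, little movement) and a
# "stressed" window (steadier RR, more movement), labelled 0 and 1.
relaxed = window_features([828, 845, 754, 742, 810], [0.98, 1.00, 1.02, 0.99])
stressed = window_features([640, 642, 638, 641, 639], [1.25, 1.30, 0.95, 1.28])

X = np.array([relaxed, stressed])
y = np.array([0, 1])  # self-reported labels for each window

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(clf.predict(X))
```

In practice there would be many such windows per participant, and the model would be evaluated on held-out data rather than the windows it was trained on.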

This was exactly the idea when, back in 2015, a user study was run to collect physiological data from wearable devices. In particular, participants wore a body sensor that started collecting data when they arrived at work and stopped when they left. The study ran for two weeks, and many hours of signal data were collected. To develop a supervised machine learning model, we also need labels. That is why each participant had to download and use a companion Android app and report how they were feeling every two hours throughout the workday, for the full two weeks. An example of the app’s interface is shown below.

After the two-week period, the data was assessed and participants’ manual labels (their mood) were aligned with the corresponding multimodal signals. A visualisation of the end-to-end process, from obtaining the various wearable signals to predicting and inferring a likely mood, is presented below.
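One way such an alignment can be done (this is a sketch under assumptions, not necessarily the authors’ exact pipeline) is to match each windowed feature vector to the nearest self-report in time, within some tolerance. The data frames, column names, and one-hour tolerance below are all hypothetical.

```python
import pandas as pd

# Hypothetical tables: one row of sensor features per time window, and
# one row per self-report entered in the companion app.
features = pd.DataFrame({
    "window_end": pd.to_datetime(["2015-03-02 10:00", "2015-03-02 12:00",
                                  "2015-03-02 14:00"]),
    "mean_hr": [72, 88, 75],
    "rmssd": [48.0, 22.0, 41.0],
})
reports = pd.DataFrame({
    "reported_at": pd.to_datetime(["2015-03-02 10:05", "2015-03-02 12:10",
                                   "2015-03-02 14:02"]),
    "mood": ["calm", "stressed", "calm"],
})

# Attach each window to the nearest self-report within one hour, so every
# feature vector gets a mood label for supervised learning.
labelled = pd.merge_asof(
    features.sort_values("window_end"),
    reports.sort_values("reported_at"),
    left_on="window_end", right_on="reported_at",
    direction="nearest", tolerance=pd.Timedelta("1h"),
)
print(labelled[["window_end", "mean_hr", "rmssd", "mood"]])
```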

As different people might perceive mood differently or react differently to the same situation, further investigation was carried out to explore this intuition. As we can see from the figure below, there is an additional step compared to the approach above: data signals are grouped together, where appropriate, based on how similar they are, so that different perceptions of the same emotion fall into the same bucket (Perception Clusters).
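A minimal sketch of that extra grouping step, assuming a simple k-means clustering over hypothetical per-user summaries (the real system’s clustering method and features may differ), might look like this:

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical per-user summaries, e.g. each user's typical HRV and fraction
# of time active during self-reported "stressed" windows. Users who react
# similarly should end up in the same perception cluster.
user_profiles = np.array([
    [22.0, 0.60],  # user A: low HRV, fidgety when stressed
    [24.5, 0.55],  # user B: reacts much like A
    [45.0, 0.05],  # user C: stays still, keeps higher HRV
    [43.0, 0.08],  # user D: reacts much like C
])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(user_profiles)
print(kmeans.labels_)  # e.g. [0 0 1 1]: two groups with similar reactions

# A separate mood model can then be trained per cluster, rather than one
# model for everyone, which is the intuition behind Perception Clusters.
```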

All in all, this research, and affective computing in general, has made great leaps in understanding human emotions. This is evident nowadays, as commercial smartwatches come with the ability to provide feedback on people’s stress levels. And this is just the beginning. Just imagine the possibilities this technology could have: understanding not just moods but mental states in general, such as fatigue and calmness, could potentially be life-saving. Consider a lorry driver before a long journey, or a surgeon before a long and difficult operation. A tool that assesses fitness in terms of mental state could be hugely useful, if not critical. Of course, more research is needed to improve and validate these models, as well as to take into account any ethical and privacy considerations.


You can find both research papers discussed in this post here:

https://www.researchgate.net/publication/301583412_HealthyOffice_Mood_recognition_at_work_using_smartphones_and_wearable_sensors

https://www.researchgate.net/publication/348109115_Perception_Clusters_Automated_Mood_Recognition_Using_a_Novel_Cluster-Driven_Modelling_System

