Can AI identify emotions better than humans?


If you have any thoughts/feedback or wish to participate in our research, please email me at ajit.jaokar at feynlabs.ai


Affective Computing

Image source: Emotion Research Labs

Affective computing is the study and development of systems and devices that can recognize, interpret, process, and simulate human affects (emotions). It is an interdisciplinary field spanning computer science, psychology, and cognitive science. The more modern branch of computer science originated with Rosalind Picard’s 1995 paper on affective computing.

A motivation for the research is the ability to simulate empathy. The machine should interpret the emotional state of humans and adapt its behavior to them, giving an appropriate response to those emotions.

The difference between sentiment analysis and affective analysis is that the latter detects the distinct emotions present, rather than identifying only the polarity of the phrase, as the sketch below illustrates.

(The above is adapted from Wikipedia.)
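To make the distinction concrete, here is a toy sketch in Python. The lexicons and functions (POLARITY_LEXICON, EMOTION_LEXICON, sentiment, affect) are made-up illustrations, not a real model: sentiment analysis collapses a phrase into a single polarity, while affective analysis reports the distinct emotions present.

```python
# Toy illustration of sentiment analysis (polarity only) vs
# affective analysis (distinct emotions). Tiny hypothetical lexicons.

POLARITY_LEXICON = {"love": 1, "great": 1, "hate": -1, "awful": -1, "scared": -1}

EMOTION_LEXICON = {
    "love": "joy", "great": "joy",
    "hate": "anger", "awful": "disgust",
    "scared": "fear",
}

def sentiment(text: str) -> str:
    """Return only the polarity of the phrase."""
    score = sum(POLARITY_LEXICON.get(w, 0) for w in text.lower().split())
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

def affect(text: str) -> dict:
    """Return counts per distinct emotion rather than a single polarity."""
    emotions = {}
    for w in text.lower().split():
        if w in EMOTION_LEXICON:
            e = EMOTION_LEXICON[w]
            emotions[e] = emotions.get(e, 0) + 1
    return emotions

phrase = "I hate waiting and I am scared of the outcome"
print(sentiment(phrase))  # -> negative (polarity collapses everything)
print(affect(phrase))     # -> {'anger': 1, 'fear': 1} (distinct emotions)
```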

Artificial Intelligence and Facial Emotion Recognition – some background

Facial emotion recognition is an important topic in the field of computer vision and artificial intelligence. Facial expressions are one of the main information channels in interpersonal communication.

Verbal components convey only about one third of human communication, and hence nonverbal components such as facial expressions are important for emotion recognition.

Facial emotion recognition is based on the fact that humans display subtle but noticeable changes in skin color, caused by blood flow within the face, to communicate how they are feeling. Darwin was the first to suggest that facial expressions are universal, and other studies have shown that the same applies to communication in primates.

 

Microexpressions – Is AI better at facial emotion recognition than humans?

Independent of AI, there is a long history of work on facial emotion recognition. Plutchik’s wheel of emotions illustrates the relationships among the emotions, and Ekman and Friesen pioneered the study of emotions and their relation to facial expressions.

 

So, is AI better than humans at recognising facial emotions?

 

AI’s advantage over humans in detecting emotion from facial expressions lies in its ability to pick up microexpressions.

Macroexpressions last between 0.5 and 4 seconds and hence are easy to see. In contrast, microexpressions last as little as 1/30 of a second. AI is better than humans at detecting microexpressions, as the rough calculation below suggests.
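A back-of-the-envelope calculation shows why frame rates matter here. The durations are the ones quoted above; the function is just illustrative arithmetic.

```python
# How many video frames does an expression span?
# Durations as above: macroexpressions ~0.5-4 s, microexpressions ~1/30 s.

def frames_spanned(duration_s: float, fps: float) -> float:
    """Number of frames an event of the given duration covers at a frame rate."""
    return duration_s * fps

for fps in (30, 120):  # a standard webcam vs a high-speed camera
    macro = frames_spanned(0.5, fps)      # the shortest macroexpression
    micro = frames_spanned(1 / 30, fps)   # a microexpression
    print(f"{fps} fps: macro >= {macro:.0f} frames, micro ~ {micro:.1f} frame(s)")

# Output:
# 30 fps: macro >= 15 frames, micro ~ 1.0 frame(s)
# 120 fps: macro >= 60 frames, micro ~ 4.0 frame(s)
```

At a standard 30 fps, a microexpression may occupy a single frame: fleeting for a human viewer watching at normal speed, but perfectly inspectable by an algorithm analysing every frame.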

Haggard & Isaacs (1966) verified the existence of  microexpressions  while scanning films of psychotherapy  sessions in slow motion.

Microexpressions occurred when individuals attempted to be deceitful about their  emotional expressions.

Emotion Research Labs uses the concept of the facial action coding system (FACS), a system based on facial muscle changes that can characterize the facial actions expressing individual human emotions, as defined by Basic emotions, Compound emotions, Secondary emotions (ecstasy, rejection, self-rejection), etc.
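As a minimal sketch of the FACS idea: facial muscle movements are coded as Action Units (AUs), and combinations of AUs characterize emotions. The prototype AU combinations below follow commonly cited EMFACS-style pairings and are illustrative assumptions on my part; Emotion Research Labs’ actual system is not public, and a production system would use learned detectors rather than a lookup table.

```python
# Illustrative FACS-style mapping from Action Units (AUs) to basic emotions.
# AU combinations approximate commonly cited EMFACS prototypes.

EMOTION_PROTOTYPES = {
    "happiness": {6, 12},        # cheek raiser + lip corner puller
    "sadness":   {1, 4, 15},     # inner brow raiser + brow lowerer + lip corner depressor
    "surprise":  {1, 2, 5, 26},  # brow raisers + upper lid raiser + jaw drop
    "anger":     {4, 5, 7, 23},  # brow lowerer + lid raiser/tightener + lip tightener
    "disgust":   {9, 15},        # nose wrinkler + lip corner depressor
}

def classify(detected_aus: set) -> list:
    """Rank emotions by how completely their AU prototype is present."""
    scores = {
        emotion: len(aus & detected_aus) / len(aus)
        for emotion, aus in EMOTION_PROTOTYPES.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Suppose an upstream vision model detected AU6 and AU12 in a frame:
print(classify({6, 12})[0])  # -> ('happiness', 1.0)
```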

 

A model of artificial empathy for long engagements between humans and AI

How do we apply this to AI?

One of the areas I am working on is artificial empathy.

The concept of artificial empathy has been proposed by researchers such as Asada and others.

Empathy is the capacity to understand or feel what another person is experiencing from within their frame of reference, that is, the capacity to place oneself in another’s position.

All three types of empathy (cognitive, emotional, and compassionate) require the sharing of experiences. Hence, if AI is to understand human empathy, it needs people to share experiences with it. This is especially true in a longer relationship between humans and AI, such as coaching.

The research I am working on is based on the idea of creating an “AI coach” and on creating a model for engagement between humans and AI in longer interactions. It builds on the propensity of humans to share experiences with the AI coach. Ultimately, artificial empathy could even be modelled as a reinforcement learning problem.
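As a purely hypothetical sketch of that framing: the states, actions, and reward below are my own assumptions for illustration, not a published model. The AI coach is “rewarded” when its response keeps the user sharing experiences.

```python
# Hypothetical sketch: artificial empathy framed as reinforcement learning.
# The coach's reward is the user's continued willingness to share.
import random

ACTIONS = ["acknowledge_feeling", "ask_follow_up", "give_advice"]

# Tabular Q-values: (inferred_user_emotion, action) -> expected reward.
q_table = {}

def choose_action(emotion: str, epsilon: float = 0.1) -> str:
    """Epsilon-greedy choice of coaching response for the inferred emotion."""
    if random.random() < epsilon:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: q_table.get((emotion, a), 0.0))

def update(emotion: str, action: str, reward: float, lr: float = 0.1) -> None:
    """Single-step Q-learning-style update (no bootstrapping)."""
    key = (emotion, action)
    q_table[key] = q_table.get(key, 0.0) + lr * (reward - q_table.get(key, 0.0))

# One simulated interaction: the user appears sad; if the coach
# acknowledges the feeling, the user shares more (reward = +1).
emotion = "sadness"
action = choose_action(emotion)
reward = 1.0 if action == "acknowledge_feeling" else 0.0
update(emotion, action, reward)
print(action, q_table)
```

Over many interactions, a policy learned this way would come to prefer responses that sustain the user’s sharing, which is one concrete way to operationalize empathy in a long coaching engagement.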

 

If you have any thoughts/feedback or wish to participate in our research, please email me at ajit.jaokar at feynlabs.ai

Image source: Emotion Research Labs
