Emotion AI

Published by Wranga | October 13, 2022

Written by Venkatesh Ramamrat

Navarasa means nine emotions; rasa means an emotional state of mind. The nine emotions are Shringara (love/beauty), Hasya (laughter), Karuna (sorrow), Raudra (anger), Veera (heroism/courage), Bhayanaka (terror/fear), Bibhatsa (disgust), Adbutha (surprise/wonder), and Shantha (peace/tranquility).

After watching the Netflix series of the same name, I began to think more deeply about the concept of emotion. Since we at Wranga study the effects of technology on children, a crucial question emerged: how could we understand feelings with AI?

In the context of human-computer interaction, a modality is a single independent channel of sensory input/output between a computer and a human. A system is designated unimodal if it implements only one modality, and multimodal if it implements more than one. Modalities fall into two broad groups: computer-human and human-computer.
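To make the distinction concrete, here is a minimal Python sketch of a system and its modalities; the class and example names are illustrative, not taken from any particular framework:

```python
from dataclasses import dataclass, field

@dataclass
class InteractiveSystem:
    """Toy model of a system and its sensory input/output channels."""
    name: str
    modalities: set[str] = field(default_factory=set)

    @property
    def is_multimodal(self) -> bool:
        # Unimodal = exactly one channel; multimodal = more than one.
        return len(self.modalities) > 1

kiosk = InteractiveSystem("ticket kiosk", {"touchscreen"})
assistant = InteractiveSystem("home assistant", {"speech", "vision"})
print(kiosk.is_multimodal)      # False -> unimodal
print(assistant.is_multimodal)  # True  -> multimodal
```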


Computer-human modalities

Computers utilize a wide range of technologies to communicate and send information to humans. Common modalities include vision, audition, and taction (touch), whereas more complex modalities include taste, smell, heat, pain, and balance.

Human-computer modalities

Computers can be equipped with various types of input devices and sensors to allow them to receive information from humans. Certain modalities can provide a richer interaction depending on the context, and having options for implementation allows for more robust systems.

Simple modalities include the keyboard, pointing device, and touchscreen, whereas complex modalities include computer vision, speech recognition, motion, orientation, and, most recently, emotion.

Emotion Recognition

Emotion recognition is the process of identifying human emotion. People vary widely in their accuracy at recognizing the emotions of others. The use of technology to help people with emotion recognition is a relatively nascent research area.
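As a minimal illustration of machine emotion recognition, the sketch below scores the emotions expressed in a sentence. It assumes the Hugging Face transformers library; the model named is one publicly shared emotion classifier used here as a plausible stand-in, and any comparable model could be swapped in:

```python
# Minimal sketch: emotion recognition on text via a pretrained classifier.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="j-hartmann/emotion-english-distilroberta-base",  # assumed available
    top_k=None,  # return a score for every emotion label, not just the top one
)

scores = classifier("I can't believe we finally won the match!")[0]
for result in sorted(scores, key=lambda r: r["score"], reverse=True):
    print(f"{result['label']}: {result['score']:.2f}")
# Typically ranks 'joy' or 'surprise' highest for a sentence like this.
```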

Very interesting research by Hume AI

The research defines 27 emotions: admiration, adoration, aesthetic appreciation, amusement, anger, anxiety, awe, awkwardness, boredom, calmness, confusion, craving, disgust, empathic pain, entrancement, excitement, fear, horror, interest, joy, nostalgia, relief, romance, sadness, satisfaction, sexual desire, surprise.

Watch this interactive map

Affective computing

Affective computing is the study and development of systems and devices that can recognize, interpret, process, and simulate human affects. It is an interdisciplinary field spanning computer science, psychology, and cognitive science. While some core ideas in the field may be traced as far back as early philosophical inquiries into emotion, the more modern branch of computer science originated with Rosalind Picard's 1995 paper on affective computing and her book Affective Computing, published by MIT Press.

Emotive Internet

Emotive Internet is a conceptualization of the Internet as an emergent emotional public space, such as how it serves as a space for the social sharing of emotions. It can also denote the quality of the Internet that allows it to be used to communicate in an emotive fashion or with emotional intent. Since it is an expressive medium, it also enables users to construct and represent their identities online. This is evident in the way emotional responses have been integrated with online communication and interactions.

The concept is also linked to emotional analytics and emotion-sensing applications, particularly those technologies that power the Internet of Things (IoT), such as smart home devices that can store and process a user's emotional profile to deliver services.

Emotion AI

Emotional AI refers to technologies that use affective computing and artificial intelligence techniques to sense, learn about and interact with human emotional life. It is a weak form of AI in that these technologies aim to read and react to emotions through text, voice, computer vision, biometric sensing, and, potentially, information about a person’s context.

While the effectiveness of current methods is highly debatable, we believe that the use of human-state measurement to engage with qualitative dimensions of human life is still in its infancy. The following techniques are used to try to sense and discern people’s states, emotions, and expressions:

  • Sentiment analysis of online language, emojis, images, and video for evidence of moods, feelings, and emotions
  • Facial coding of expressions
  • Speech analytics: includes elements such as the rate of speech, increases and decreases in pauses, and tone (a small pause-analysis sketch follows this list)
  • Virtual Reality (VR) allows remote viewers to understand and feel what the wearer is experiencing. Headwear may also contain EEG and face muscle sensors
  • Augmented Reality (AR): remote viewers can track attention, reactions, and interaction with digital objects
  • Eye-tracking: measures gaze, eye position, and eye movement
  • Wearables sense skin responses, muscle activity, heart activity, skin temperature, respiration, and brain activity
  • Gesture, behavior, and internal physiology: cameras track hands, faces, external bodily behavior, and remote heart rate tracking
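
As promised above, here is a rough sketch of one speech-analytics signal: pause behaviour, estimated by splitting a recording into non-silent spans. It assumes the librosa audio library; the file path and silence threshold are illustrative placeholders:

```python
# Minimal sketch: estimate speaking time vs. pause time in a recording.
import librosa

y, sr = librosa.load("speech.wav", sr=None)    # audio samples + sample rate
voiced = librosa.effects.split(y, top_db=30)   # [start, end] sample indices of non-silence

speech_time = sum(end - start for start, end in voiced) / sr
total_time = len(y) / sr
pause_time = total_time - speech_time

print(f"total: {total_time:.1f}s, speaking: {speech_time:.1f}s, "
      f"pauses: {pause_time:.1f}s ({pause_time / total_time:.0%})")
```

A real speech-analytics system would combine signals like this with pitch, tempo, and tonal features before attempting any emotional inference.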

Emotion AI and Children

Children exercise their social skills in five Social and Emotional Learning (SEL) areas:

  • Self-Awareness of one’s own beliefs and goals
  • Self-Management of emotions
  • Social Awareness of the goals and feelings of others
  • Relationship Skills, including cooperation and communication
  • Responsible Decision Making

Emotion recognition makes use of facial expressions and biometric capture, but it is more than this. We use the term 'emotional AI' as a catch-all for affective computing and AI techniques that aim to sense and 'feel into' human emotional life. This is accomplished through cameras, microphones, skin-contact and other sensors, detecting things like facial expression, timbre of voice, heart rate, movement patterns, and other biometrics.
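
To make the camera half of that sensing pipeline concrete, the sketch below detects a face with OpenCV's bundled Haar cascade and crops it for a downstream expression classifier. The `classify_expression` call is a hypothetical stand-in for whatever model a real system would use, not an actual API:

```python
# Minimal sketch: locate faces in a frame, then hand each crop to a
# (hypothetical) expression classifier.
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

frame = cv2.imread("frame.jpg")  # placeholder input image
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    face_crop = frame[y:y + h, x:x + w]
    # label = classify_expression(face_crop)  # hypothetical model call
    print(f"face at ({x}, {y}), size {w}x{h}")
```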

These technologies do not authentically understand emotion. They can see, count, compute, react, and output, but they do not know sadness or elation. Yet although emotional AI systems have neither sentience nor human-like feeling, they still signal a new relationship between people and technology: a new vector for social life to be machine-readable, a way to turn our inner life into data.

Emotional AI promises a better experience with services, devices, and technologies. However, some considerations give cause to mistrust and question the rollout of these technologies. A very important study, which resonates with our work at Wranga, is by Professor Andrew McStay; it highlights four critical impact areas for children.

  • Promoting fairness for children: Emotional technology can potentially lead to unfair treatment of children, during childhood and later in adulthood.
  • Support for parents: Emotional AI can affect the role of parents and how they interact with their children. These effects should be explored through various lenses to mitigate negative or unethical outcomes.
  • Care for the experience of childhood: Child-focused emotional AI enables new depths and dimensions to the commercialization of childhood. Potential ethical and developmental problems must be highlighted and addressed.
  • Need for good governance: Generic and adult-oriented governance strategies are inadequate for child-focused emotional AI. Existing laws need to be reviewed and adapted to include child biometrics, and to account for the possible harms to children’s privacy and children themselves.
Emotional AI and Children 2020 Report

At Wranga, with an understanding of cutting-edge technology as well as knowledge and experience of the safety tech used by big technology firms, we bring the best solutions to keep our children safe and to help parents and children experience healthy growth in the technology age. We seek to reach far and wide, to every parent in the world.