What Happens When Artificial Intelligence Can Read Our Emotion in Virtual Reality

--

Apple: Animoji

Being surrounded by machines that understand our emotions is one of those ‘what ifs’ that is a little creepy even to think about. Don’t be surprised, though: thanks to technological advances, we will reach that future sooner or later. But how?

How does a machine ‘sense’ our emotion?

At Apple’s September keynote, the iPhone X showed off its slick design to the world for the first time, and iPhone lovers couldn’t help but shout “hooray!” with enthusiasm. What unexpectedly caught people’s eyes, among other features, was Animoji: a dozen different animal emojis that mirror users’ facial expressions and can be shared with others. Animoji is certainly fun, but what does it really mean for our communication in a digital world?

Nowadays, an overwhelming amount of human-to-human communication happens every second across digital platforms, yet it is quite often devoid of the essence of human nature: emotion. To make machine-mediated communication more expressive, many tech giants are spending a great deal of time and effort finding sensors that can empower digital devices to interpret our emotions. For smartphones, since we take pictures and talk on the phone on a daily basis, it comes naturally to engineers to use the camera (facial recognition) and the microphone (virtual assistants such as Siri, Google Assistant, or Amazon Alexa) to ‘sense’ our emotions.

What about in VR?

Facebook Social VR

Social Virtual Reality (VR) is an emerging digital platform that offers a virtual space where people, represented by avatars, can interact with one another. But how do we add emotional texture to VR? That brings us to the Massachusetts Institute of Technology (MIT) Media Lab.

A: circuit board with Bluetooth connection, B: PPG sensor, C: GSR electrode

The MIT Media Lab decided to add an extra layer of emotional skin to a virtual avatar. The researchers created an ‘emotional beast’ in VR that changes its appearance in response to a user’s emotional state. To detect a user’s emotion in VR, the team integrated a physiological sensing module into the face mask of a VR headset, combining electrodes for galvanic skin response (GSR) data collection and a photoplethysmogram (PPG) sensor for heart-rate data collection. GSR data reflects a user’s emotional arousal, but on its own it cannot tell whether that arousal is positive or negative. The PPG sensor, which uses light to track blood flow and gauge a user’s anxiety and stress levels (negative arousal), therefore complements the GSR data. In effect, these physiological sensors act as the medium for emotion recognition, just as the camera and microphone do in smartphones.
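To make the idea concrete, here is a minimal sketch (not the MIT team’s implementation) of how raw GSR and PPG windows might be reduced to a rough arousal/valence estimate. The sampling rate, window size, full-scale conductance rise, and resting heart rate are all assumptions for illustration.

```python
import numpy as np

SAMPLE_RATE_HZ = 50   # assumed sensor sampling rate
WINDOW_SECONDS = 5    # assumed analysis window

def arousal_from_gsr(gsr_window: np.ndarray) -> float:
    """Map the skin-conductance rise within a window to a 0-1 arousal score."""
    rise = max(gsr_window.max() - gsr_window.min(), 0.0)
    return float(np.clip(rise / 2.0, 0.0, 1.0))  # 2.0 µS assumed as full-scale rise

def heart_rate_from_ppg(ppg_window: np.ndarray) -> float:
    """Estimate beats per minute by counting rising edges in the PPG waveform."""
    threshold = ppg_window.mean() + ppg_window.std()
    above = ppg_window > threshold
    beats = np.count_nonzero(above[1:] & ~above[:-1])
    return beats * 60.0 / WINDOW_SECONDS

def estimate_state(gsr_window, ppg_window, resting_hr=70.0):
    """Return (arousal, valence): GSR drives arousal, while an elevated heart
    rate is treated here as a crude sign of negative valence (stress/anxiety)."""
    arousal = arousal_from_gsr(gsr_window)
    hr = heart_rate_from_ppg(ppg_window)
    valence = float(np.clip(1.0 - (hr - resting_hr) / 40.0, 0.0, 1.0))
    return arousal, valence
```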

The researchers crafted two types of ‘emotional beasts’: fur-based and particle-based.

A fur-based emotional beast

The fur-based ‘emotional beast’ can grow and retract its fur to visually express a user’s happiness. Based on Lang’s model of emotion, the team scored each of the four emotional states on a scale of 0 to 1. The beast grows its fur to full length when the evaluated emotion is ‘happy’, whereas the fur stays beneath the inner skin, leaving the outer skin smooth, when the state is evaluated as ‘neutral’.
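As a hypothetical illustration of that mapping (the maximum fur length is an assumption, not a figure from the paper), the 0-to-1 happiness score can simply be interpolated into a fur length:

```python
MAX_FUR_LENGTH_CM = 5.0  # assumed maximum fur length

def fur_length(happiness: float) -> float:
    """Linearly map a 0-1 happiness score to fur length; 0 keeps the outer skin smooth."""
    happiness = min(max(happiness, 0.0), 1.0)
    return happiness * MAX_FUR_LENGTH_CM
```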

A particle-based emotional beast

The particle-based ‘emotional beast’, on the other hand, works with two variables: brightness and color. The user’s arousal level is estimated on a scale of 0 to 1; at high arousal the particles light up, while in a neutral state they become almost invisible. Similarly, a user can express frustration and anger to other avatars as the particles shift in color from blue to red.
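A comparable sketch for the particle-based beast, again purely illustrative: the arousal score sets brightness, and a hypothetical anger score blends the color from blue toward red.

```python
def particle_appearance(arousal: float, anger: float):
    """Map 0-1 arousal and anger scores to particle brightness and an RGB color."""
    arousal = min(max(arousal, 0.0), 1.0)
    anger = min(max(anger, 0.0), 1.0)
    brightness = arousal                    # near 0 -> particles almost invisible
    color_rgb = (anger, 0.0, 1.0 - anger)   # blue (calm) shifting to red (angry)
    return brightness, color_rgb
```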

Indeed, the MIT Media Lab has crafted visually scintillating artwork. These colorful, vibrant creatures let users express their emotions in the most vivid way possible, adding emotional texture to what would otherwise be a surface-level VR experience (see the video here).

How Can Emotion AI Revolutionize VR?

What works behind the ‘emotional beast’, however, is a machine learning algorithm. The researchers trained the system on physiological data sets so it could predict a person’s emotional state. Without this step, GSR and PPG readings are just a pile of numbers that tell us nothing. In fact, any system that aims to detect emotion from user-provided data inevitably involves machine learning.
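Here is a minimal sketch of that kind of supervised pipeline: a classifier trained on labelled GSR/PPG feature vectors to predict an emotional state. The feature layout, label scheme, file names, and the choice of scikit-learn’s RandomForestClassifier are assumptions for illustration, not the researchers’ actual setup.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Each row: [mean GSR, GSR peak count, mean heart rate, heart-rate variability]
X = np.load("physio_features.npy")   # hypothetical prepared feature matrix
y = np.load("emotion_labels.npy")    # e.g. 0=neutral, 1=happy, 2=stressed

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))

# At runtime, each new window of sensor data is reduced to the same features
# and mapped to a predicted emotional state for the avatar to express.
```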

Although the “emotional beast” project has successfully shown how emotion detection technology can be used in VR, human-to-human communication within VR may come to seem like a small matter once Artificial Intelligence (AI) enters the picture, because VR coupled with Emotion AI will eventually touch every part of our lives and raise countless ‘what ifs’.

“What if AI could gauge your preferences for every product you’ve seen in a virtual shopping mall, then suggest a shopping list of your favorites or even purchase them for you automatically?”

“What if AI could measure a middle school student’s concentration and excitement while listening to a lecture in VR and build a curriculum customized specifically for that student?”

“What if…”

There are plenty more we could imagine right now, and believe it or not, these ‘what if’ scenarios of AI reading our emotions are no longer just a creepy pipe dream.

References

  1. Emotional Beasts: Visually Expressing Emotions through Avatars in VR
  2. Apple: Animoji

--


A tech start-up developing a VR cognitive care solution that aims to detect older people at risk of dementia early by collecting and analyzing users’ bio-signals.