
Sensing Human Emotions


#1

There are several companies on the market, such as Emotient (www.imotions.com) and Affectiva (www.affectiva.com), that use camera and biometric data to discern (sense and classify) human emotions.

This differs slightly from sentiment analysis (Sentiment analysis - Wikipedia), where the primary input is voice/sound rather than visual (camera) data.

https://nordicapis.com/20-emotion-recognition-apis-that-will-leave-you-impressed-and-concerned/

Regardless of the means and mechanism by which human mood, emotion, or sentiment is derived, I’m wondering whether it would be helpful for your Misty robot to be equipped with such technology.

If you had the ability to query your robot, for example through an SDK or API, to return the current emotion of any faces in view and/or the mood/sentiment of any voices currently speaking, would you use it? How would you incorporate human emotion, mood, or sentiment into the skills that you build for your Misty robot? Can you think of any skills that you would not be able to build if your Misty robot didn’t have human emotion-sensing capability?

Have you ever used any of these technologies? Do you have a favorite “emotion recognition” API?
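To make the idea concrete, here is a minimal sketch of what such a query might look like, assuming a generic cloud emotion-recognition endpoint; the URL, API key, and response shape are placeholders, not any specific vendor’s API:

```python
# Hypothetical sketch: send one camera frame to a generic emotion-recognition
# HTTP endpoint and print the dominant emotion per face. The endpoint, key,
# and response schema are placeholders, not a real vendor API.
import requests
import cv2  # OpenCV, used only to grab a frame from a local camera

EMOTION_API_URL = "https://api.example.com/v1/emotion"   # placeholder endpoint
API_KEY = "YOUR_API_KEY"                                  # placeholder credential

def read_frame_jpeg() -> bytes:
    """Capture one frame from the default camera and encode it as JPEG."""
    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError("could not read a camera frame")
    ok, jpeg = cv2.imencode(".jpg", frame)
    return jpeg.tobytes()

def query_emotions(jpeg_bytes: bytes) -> list[dict]:
    """POST the frame; assume the response contains a list of
    {face_id, emotion, confidence} dicts under "faces"."""
    resp = requests.post(
        EMOTION_API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        files={"image": ("frame.jpg", jpeg_bytes, "image/jpeg")},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json().get("faces", [])

if __name__ == "__main__":
    for face in query_emotions(read_frame_jpeg()):
        print(face["face_id"], face["emotion"], face["confidence"])
```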


#2

I’ve actually been thinking about this a ton lately, especially after watching this. I’ve used Affectiva and a few other similar SDKs/APIs, and while they work okay for classifying “expressions,” there really isn’t a good way of understanding “emotions” because of the complexity behind them. So as a developer it’s nice to have the option to, say, trigger events based on a smile or a frown, but I haven’t found it very practical for understanding the emotional state of a user.
I would like to see something that uses audio for sentiment analysis, though; I think that could be more interesting. For example, say the robot is on but not being interacted with directly. It hears and detects crying, and could then try to pinpoint that location and seek to comfort whoever is making the sound. That would be pretty rad, almost like what you might get from a dog (definitely not a cat, though :stuck_out_tongue:).
So at the end of it all, I would say yes to audio and visual sentiment/expression classification of some sort.
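As a rough illustration of the expression-triggered events mentioned above, here is a small sketch; the classifier and the robot-animation call are stubs standing in for whatever SDK and robot API would actually be used:

```python
# Minimal sketch of triggering robot behaviors from expression events.
# classify_expression() stands in for an expression SDK (Affectiva or similar);
# misty_play_animation() is a hypothetical placeholder for a real robot call.
import random
import time

def classify_expression(frame=None) -> str:
    """Stub: pretend to classify the current camera frame."""
    return random.choice(["smile", "frown", "neutral"])

def misty_play_animation(name: str) -> None:
    print(f"[robot] playing animation: {name}")   # placeholder for a real API call

REACTIONS = {
    "smile": "happy_wiggle",
    "frown": "concerned_head_tilt",
}

def run_expression_loop(seconds: float = 5.0, period: float = 1.0) -> None:
    """Poll the classifier and fire a reaction for any non-neutral expression."""
    deadline = time.time() + seconds
    while time.time() < deadline:
        expression = classify_expression()
        if expression in REACTIONS:
            misty_play_animation(REACTIONS[expression])
        time.sleep(period)

if __name__ == "__main__":
    run_expression_loop()
```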


#3

Ultimately, it needs to be a higher-level thought process on the robot: take all of these different sensor inputs and guess what emotion is being displayed. The next step is guessing whether it’s feigned or genuine.


#4

Gah, I’ve been wanting this since day 1 on the project. Being able to sense a person’s emotional state would be so useful for automatic robot adaptation: the robot can try a (subtle) variation on a behavior or a plan, see whether the human is pleased or displeased, and use that as an implicit reinforcement signal. Or the robot could observe the human’s responses to external stimuli to determine what the human likes/dislikes and use that when choosing which skill to execute in which context.
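One way to picture the implicit-reinforcement idea is as a bandit over behavior variations, with the sensed reaction as the reward. This is only a sketch; sense_reaction() is a stand-in for whatever emotion signal the robot actually has:

```python
# Sketch of "implicit reinforcement": treat behavior variations as bandit arms
# and use a sensed pleased/displeased reaction as the reward signal.
import random

class BehaviorBandit:
    """Epsilon-greedy selection over candidate behavior variations."""

    def __init__(self, behaviors, epsilon=0.2):
        self.behaviors = list(behaviors)
        self.epsilon = epsilon
        self.counts = {b: 0 for b in self.behaviors}
        self.values = {b: 0.0 for b in self.behaviors}   # running mean reward

    def choose(self) -> str:
        if random.random() < self.epsilon:
            return random.choice(self.behaviors)
        return max(self.behaviors, key=lambda b: self.values[b])

    def update(self, behavior: str, reward: float) -> None:
        self.counts[behavior] += 1
        n = self.counts[behavior]
        self.values[behavior] += (reward - self.values[behavior]) / n

def sense_reaction() -> float:
    """Stub: +1 if the human looks pleased, -1 if displeased, 0 otherwise."""
    return random.choice([1.0, 0.0, -1.0])

if __name__ == "__main__":
    bandit = BehaviorBandit(["greeting_a", "greeting_b", "greeting_c"])
    for _ in range(50):
        behavior = bandit.choose()
        # ... execute the chosen behavior on the robot here ...
        bandit.update(behavior, sense_reaction())
    print(bandit.values)
```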

My issue with most of the current work is that it only really works well on full-frontal facial images, which our robot doesn’t tend to have. But I’m with @Scotty - I think if we want this on a Misty we’ll need to integrate multiple signals.


#5

I find it somewhat amusing that we think machines will soon be able to tell our emotions when humans can barely do this. We can’t even create good training data, because it’s hard for people to guess emotions from images alone. We can’t even agree on what a series of stylized eyes means.

That being said, with machines we can go beyond the sensory mediums that we as humans use. Why limit ourselves to visible imagery when we could combine it with other sensors (e.g., thermal, NIR) and extract extra information from video (like this Video Magnification work)? By tracking things like skin temperature, reflectance, pulse, body language and pose, and face landmarks, we may be able to get pretty decent results.
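For example, pulse can in principle be read from ordinary video in the spirit of that video-magnification work. A simplified sketch of the core signal-processing step, assuming you already have a per-frame mean green value over a tracked face region (real systems need face tracking, detrending, and motion robustness):

```python
# Simplified pulse-from-video sketch: average the green channel over a face
# region for several seconds, then read the dominant frequency in the normal
# heart-rate band. Only the core FFT step is shown here.
import numpy as np

def estimate_pulse_bpm(green_means: np.ndarray, fps: float) -> float:
    """green_means: 1-D array of per-frame mean green intensity over the face ROI."""
    signal = green_means - green_means.mean()          # remove DC offset
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)  # Hz
    band = (freqs >= 0.7) & (freqs <= 4.0)             # roughly 42-240 bpm
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0

if __name__ == "__main__":
    # Synthetic 72 bpm signal sampled at 30 fps for 10 seconds, plus noise.
    fps, seconds, bpm = 30.0, 10.0, 72.0
    t = np.arange(int(fps * seconds)) / fps
    fake = 0.5 * np.sin(2 * np.pi * (bpm / 60.0) * t) + 0.1 * np.random.randn(t.size)
    print(f"estimated pulse: {estimate_pulse_bpm(fake, fps):.1f} bpm")
```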

Whether we’ll actually want our robots to sense our emotions will be another matter.


#6

Why limit ourselves to distal sensing? Many people these days wear biometric sensors by choice (sports watches), which could give a robot direct access to heart rate, GSR, temperature, blood oxygenation, acceleration, etc. I’ve been wanting to get a project going with some of the new EEG sensors (think Emotiv headband) for a while now.


#7

Good point. And that’s actually one of the potential applications for Misty: connecting to your workout devices and providing coaching, etc. Maybe we’ll make a lie-detector mode for Misty :wink:

I might be biased, but for me the more sensors the better :slight_smile:


#8

Agreed. I want all the sensors.


#9

Do you think it will be possible/practical to add sensors to Misty via hardware attachments, etc?


#10

The backpack should be very useful for some sensors. I would want a smoke detector and a gas detector as potential backpacks.
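If the backpack simply forwards whatever an attached microcontroller prints over serial (an assumption; the port name and message format below are made up for illustration), a watcher for a smoke/gas reading might look roughly like this:

```python
# Hedged sketch of reading a gas/smoke sensor wired to a serial backpack.
# Assumes the attached microcontroller prints one reading per line, e.g.
# "smoke:123"; the port name and message format are not a documented interface.
import serial  # pyserial

PORT = "/dev/ttyUSB0"          # placeholder; depends on how the backpack enumerates
SMOKE_ALARM_THRESHOLD = 400    # arbitrary example threshold

def watch_backpack(port: str = PORT) -> None:
    with serial.Serial(port, 115200, timeout=1) as link:
        while True:
            line = link.readline().decode(errors="ignore").strip()
            if not line.startswith("smoke:"):
                continue
            value = int(line.split(":", 1)[1])
            if value > SMOKE_ALARM_THRESHOLD:
                print("smoke level high, raising alert:", value)

if __name__ == "__main__":
    watch_backpack()
```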


#11

If the robot talks WiFi and Bluetooth, there’s no reason I can’t grab data from your smartwatch, smart thermostat, smart oven, smart shoe…
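For wearables that expose the standard Bluetooth Heart Rate service, streaming heart rate might look like the sketch below; the device address is a placeholder, and whether a given smartwatch actually exposes this service varies:

```python
# Sketch of streaming heart rate from a BLE wearable via the standard
# Heart Rate Measurement characteristic (0x2A37), using the bleak library.
import asyncio
from bleak import BleakClient

HR_MEASUREMENT_UUID = "00002a37-0000-1000-8000-00805f9b34fb"
DEVICE_ADDRESS = "AA:BB:CC:DD:EE:FF"   # placeholder device address

def parse_heart_rate(data: bytearray) -> int:
    """Per the HR Measurement spec: flag bit 0 selects 8- vs 16-bit values."""
    if data[0] & 0x01:
        return int.from_bytes(data[1:3], "little")
    return data[1]

async def stream_heart_rate(address: str = DEVICE_ADDRESS, seconds: float = 30.0):
    def on_notify(_sender, data: bytearray):
        print("heart rate:", parse_heart_rate(data), "bpm")

    async with BleakClient(address) as client:
        await client.start_notify(HR_MEASUREMENT_UUID, on_notify)
        await asyncio.sleep(seconds)
        await client.stop_notify(HR_MEASUREMENT_UUID)

if __name__ == "__main__":
    asyncio.run(stream_heart_rate())
```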


#12

I think it’s important to mention that emotion classification matters in applications such as those mentioned in this article:

When the robot is involved in some form of therapy, or when it is working with people who are considered part of protected or vulnerable populations, sensing a human partner’s emotions can be critical to the robot successfully performing its primary function.


#13

I could imagine people being more comfortable telling their robot how they are feeling, as part of the training process. And, if the robot thinks it sees an emotion, it could learn further by asking, “Are you feeling x?”
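A small sketch of that ask-to-confirm loop: when the classifier is unsure, the robot asks and keeps the answer as a labeled example for later retraining. The classifier and speech I/O below are stubs:

```python
# Ask-to-confirm sketch: low-confidence predictions trigger a spoken question,
# and confirmed answers are stored as labeled training examples.
import random

CONFIDENCE_THRESHOLD = 0.8
training_examples: list[tuple[str, str]] = []   # (observation_id, confirmed_label)

def classify(observation_id: str) -> tuple[str, float]:
    """Stub emotion classifier returning (label, confidence)."""
    return random.choice([("happy", 0.95), ("sad", 0.55), ("angry", 0.62)])

def ask_human(question: str) -> bool:
    """Stub for asking aloud and listening for a yes/no answer."""
    print("[robot]", question)
    return random.random() < 0.5

def observe_and_maybe_ask(observation_id: str):
    label, confidence = classify(observation_id)
    if confidence >= CONFIDENCE_THRESHOLD:
        return label
    if ask_human(f"Are you feeling {label}?"):
        training_examples.append((observation_id, label))   # confirmed label
        return label
    return None   # answer was no; skip this observation

if __name__ == "__main__":
    for i in range(5):
        print(observe_and_maybe_ask(f"obs-{i}"))
    print("collected labels:", training_examples)
```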

That said… have you ever had a friend say, “Did you feel that person’s energy?” (It might have been calm, agitated, anxious, angry, etc.) And the energy felt might have been very different from the presented affect. Perhaps a person’s vital signs can predict “the energy” (@Dan, is there research showing people’s vital signs correlating strongly with specific emotions?), but I’m suspicious of that predictive ability.

@adam1 thoughts?


#14

yes, we need Misty to be able to sense “the elephant in the room”!


#15

A quick look online turns up https://arxiv.org/pdf/1702.02510.pdf, which is a database of EEG, ECG, and GSR matched to valence and arousal, among other affective markers. They also use personality as a covariate, which is pretty interesting.
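That kind of dataset supports models along these lines: regress valence and arousal from physiological features. The sketch below doesn’t load the actual data; the feature vectors and ratings are random placeholders just to show the pipeline shape:

```python
# Sketch: regress valence/arousal from physiological features (heart rate,
# GSR level, EEG band power, ...). Data here is random placeholder values;
# real feature extraction and dataset loading are assumed.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))          # placeholder physiological feature vectors
y = rng.uniform(1, 9, size=(500, 2))   # placeholder valence/arousal ratings (1-9 scale)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

pred = model.predict(X_test)
print("valence MAE:", mean_absolute_error(y_test[:, 0], pred[:, 0]))
print("arousal MAE:", mean_absolute_error(y_test[:, 1], pred[:, 1]))
```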

There is also some research that indicates that people are more comfortable disclosing their thoughts and feelings to a robot: A Robot that Encourages Self-disclosure by Hug | SpringerLink


#16

We know that we have named emotions as shorthand for describing a physiological state. Anger, frustration, and irritation are different words and describe different emotional states; physiologically, however, they may not be that different.


#17

In the mental health community - say, a group-home environment - it might be interesting to have Misty or another bot interact with residents. I don’t mean just emotion detection, but also the ability for residents to program the robots, giving them something to do, and then seeing it work with them.


#18

Delivering medications?


#19

Maybe happier tasks than meds? :slight_smile:

Every human – be they considered well or in need of health assistance – needs solid, achievable challenges to feel good. I’d say that successfully getting a robot to perform fun tasks (dancing, greeting your friends) is good for anyone’s mental health!


#20

Follow-up: here’s an article about some issues with the most common style of emotion recognition: