
Sensing Human Emotions



I really liked this article you posted @Dan. Very informative.

I think this article supports the idea that Misty can best infer a person’s emotions by collecting and analyzing information about each specific person over time. This within-person method can be used simultaneously on multiple individuals. I think this approach has been mentioned previously in one or more of these posts and/or accompanying articles…

In fact, it might be interesting to program Misty to ask each person in her household about their mood on a semi-regular basis (e.g., “Can you rate how happy you are right now, using a 1–10 scale?”; the scale would have been previously defined). I think this idea relates to @Terrainpark’s two posts, among others.
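A minimal sketch of that per-person survey idea, assuming a simple Python store rather than any actual Misty API (the class and method names here are hypothetical, invented for illustration):

```python
from datetime import date

class MoodLog:
    """Hypothetical per-person log of self-reported 1-10 happiness ratings,
    so trends are tracked within each individual rather than across people."""

    def __init__(self):
        self.ratings = {}  # name -> list of (date, rating) pairs

    def record(self, name, rating, when=None):
        # Enforce the previously defined 1-10 scale.
        if not 1 <= rating <= 10:
            raise ValueError("rating must be on the 1-10 scale")
        self.ratings.setdefault(name, []).append((when or date.today(), rating))

    def average(self, name, last_n=7):
        # Average of the most recent `last_n` ratings, or None if no data yet.
        recent = self.ratings.get(name, [])[-last_n:]
        return sum(r for _, r in recent) / len(recent) if recent else None
```

For example, after `log.record("Alice", 7)` and `log.record("Alice", 5)`, `log.average("Alice")` returns 6.0.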

Another thought: Vector, Misty’s upcoming “companion,” is supposed to have tons of personality. In fact, I think Vector’s current ratings on his Emotion Engine are available for inspection at any time. Could Misty be trained to recognize Vector’s moods to a reasonably accurate extent? It could be an interesting project on the way to having Misty sense human emotions.


I also liked this article, @Dan.

I have not studied emotions as much as I have personality types.

After reading this article, I would most certainly agree that robots should not only identify type but also continually monitor people’s emotions.

If the robot knew emotions, it would know when and for how long someone wanted to interact. If the robot knew type, it would know how and why to interact with someone.

Yes, I would like to learn more about facial-to-mood recognition programming.


Coming late to this party, but I wanted to point out that a second part was written to the article Dan posted in September: see Part Two. I found the observations in Part Two more in alignment with life: we exhibit a multitude of emotions, and each individual exhibits them in their own manner. It’s no wonder we have dozens of emojis to pick from when trying to substitute for in-person reactions. I find it interesting that none of these articles reference living humans with Asperger’s, who often have an inability to read others’ feelings/emotions. There is a great ‘This American Life’ episode about a couple who figured out how to deal with this: Play the Part.

All this aside, and realizing a perfect solution is not near at hand, my thoughts turn toward a simple approach for Misty. Whatever the method of detecting emotion, Misty may need to summon a human to intervene. There is a large and growing elderly population who live alone, and many suffer from depression, mostly caused by loneliness. If Misty can accumulate data (simplest by asking, “Hey, good morning, how are you feeling today?”), and then, if there seems to be a progression of depression, trigger an event to call a caregiver or friend to say, “John could use some company,” that is a good step forward. If Misty can also provide some company, great, but alerting others to step up the contact is even better. Along with some circumstantial information and a feedback loop for ML over time, knowing how to respond will gradually improve.
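One simple way to sketch that “progression of depression” trigger in Python: compare the average of the most recent daily ratings to the average of the earlier ones, and flag when the drop exceeds a threshold. The window and threshold values here are illustrative assumptions, not tuned clinical values:

```python
def needs_checkin(ratings, window=7, threshold=2.0):
    """Flag a possible downward mood trend from a list of daily 1-10 ratings.

    Compares the mean of the most recent `window` ratings against the mean
    of all earlier ratings; returns True when the drop meets `threshold`.
    Both parameters are illustrative assumptions.
    """
    if len(ratings) < 2 * window:
        return False  # not enough history to judge a trend yet
    recent = ratings[-window:]
    baseline = ratings[:-window]
    drop = sum(baseline) / len(baseline) - sum(recent) / len(recent)
    return drop >= threshold
```

When this returns True, Misty could fire the alert event (call or message a caregiver); the outcome of that contact could then feed the ML feedback loop mentioned above.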


I like this suggestion about Misty collecting data a lot. I think there are many situations where this could be very beneficial. In addition to collecting open-ended data, Misty might even have the person being addressed rate himself or herself on a simple scale.
Ideally, Misty would do the analysis of, say, depression herself, without requiring a person to read or listen to the responses.