
Misty Community Forum

Develop interpersonal skills

As a social psychologist I’d like to see Misty given the ability to casually touch a person. When done by a librarian, for example, this is generally seen by recipients as positive. Even more interesting if the type of casual touch could vary by a person’s expression.

I can also imagine Misty being given the ability to function as a Rogerian therapist by using some recent version of Eliza software.


Very interesting idea.

Big step beyond an understanding or caring facial expression.

Since Misty doesn’t look overly “humanoid,” I could see that kind of interaction being great, with no creepy factor.

The Eliza idea, though. That could be a slippery slope chock full of unintended consequences from a simple decision-tree bug.

Very interesting! I’m a bit cautious to have a robot have so much agency and wonder what @michael thinks about this.

A similar concept is to have people actively touch the robot for an emotional connection, e.g., this pets concept:


Research shows that people generally liked their interactions with Eliza as a software program. This experience could be even more engaging if we could get it to work through Misty.

For those of you not familiar with Rogerian therapy: responses consist of comments/questions like “Tell me more about this” or “Tell me more about {keyword from person’s previous comment}.”
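To make this concrete, here’s a minimal sketch of an Eliza-style Rogerian responder of the kind described above. The patterns and reflection table are illustrative placeholders, not taken from any particular Eliza implementation or from a Misty API:

```python
import random
import re

# Swap first- and second-person words so a phrase can be echoed back
# ("I am tired" -> "you are tired"). Table is illustrative only.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my",
}

def reflect(fragment: str) -> str:
    words = fragment.lower().split()
    return " ".join(REFLECTIONS.get(w, w) for w in words)

# Ordered (pattern, response templates); the final catch-all gives the
# classic Rogerian "Tell me more about this."
PATTERNS = [
    (re.compile(r"i feel (.*)", re.I),
     ["Tell me more about feeling {0}.", "Why do you feel {0}?"]),
    (re.compile(r"i am (.*)", re.I),
     ["How long have you been {0}?", "Why do you say you are {0}?"]),
    (re.compile(r"(.*)", re.I),
     ["Tell me more about this.", "Please go on."]),
]

def respond(statement: str) -> str:
    for pattern, responses in PATTERNS:
        match = pattern.match(statement.rstrip(".!?"))
        if match:
            template = random.choice(responses)
            return template.format(*(reflect(g) for g in match.groups()))
    return "Tell me more about this."

print(respond("I feel anxious about robots"))
```

Hooked up to Misty’s speech input and text-to-speech, even a toy loop like this could produce the “Tell me more about {keyword}” behavior described above.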

I like the idea of touch going both from Misty to a person and vice versa. Ideally there would be a soft area of Misty to touch. I was already thinking that Misty’s arm could be made longer with a soft overlay.


There have been a few interesting studies on robot touch, a recent one being:

“Observing Robot Touch in Context” from HRI 2018, which looked at just this issue, having a robot casually touch people during an interaction.

Also, going the other way, there is:
“I am a warm robot: the effects of temperature in physical human–robot interaction” in Robotica 2014, looking at how the sensation of touching a robot (particularly temperature) influences interactions.

For our Misty robot, we had talked about and explored a wide variety of options. Everything from fur to soft-touch plastic or rubber to routing waste heat from the processors to make the robot warm to the touch. Like @BoulderAl, I wanted a soft squishy warm place on the robot to pet (belly? head? back?). Perhaps we’ll see something like this in future versions…


Interestingly, “develop interpersonal skills” has been on my personal to-do list for some time now. I suspect Misty may have an easier time of it. :smirk:


I like the idea of having a soft squishy place on Misty to touch, even if it isn’t warm. Perhaps someone could develop an attractive add-on that could be attached to Misty II’s skin.

And thanks for the reference to the article on robot touch. Given the study’s design limitations (robot is large, with massive arms and a male voice) I think it’s safe to conclude that Misty’s touch will be seen as positive as long as it is done well.

I also did a bit of research on what robots currently exist that are designed as robot companions. The three that stood out to me are:

• Paro (Japanese, seal-shaped, non-verbal)
• Mabu (Catalia Health, Personal Healthcare Companion, verbal)
• ElliQ (Intuition Robotics, social robot for the elderly, verbal, paired with tablet)

I think Misty has the potential to incorporate a lot of what Mabu and ElliQ in particular have to offer.


Interesting topic for sure. All I can say is, if you wanted to get me to touch a robot, throw a plasma ball on its head and I’m all over it :wink:


I agree with @Scott, that “touch” is a slippery slope for robot-human interaction, and it should be approached cautiously.

I believe that touch is something that is very personal and also context-dependent. Touch requires permission, and so, before robots go around touching humans, robots need the ability to ask for and acknowledge consent. They need the ability to resolve the difference between yes and no. Robots also need the ability to understand that consent in one context does not extrapolate to mean that consent has been granted for touching in all contexts. Similarly, before humans go around touching robots, robots need to be equipped with the ability to give & revoke consent and possess some level of understanding of what that means.
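As a thought experiment, the consent bookkeeping described above (per-person, per-context, and revocable) could be sketched as a tiny state store. Everything here, including the class and method names, is purely hypothetical and is not a Misty API:

```python
# Hypothetical sketch: per-context, revocable touch consent.
# Consent in one context never carries over to another context.

class ConsentRegistry:
    def __init__(self):
        # (person, context) -> bool; absence means consent was never given
        self._grants = {}

    def grant(self, person: str, context: str) -> None:
        """Record that `person` consented to touch in `context`."""
        self._grants[(person, context)] = True

    def revoke(self, person: str, context: str) -> None:
        """Consent can be withdrawn at any time."""
        self._grants[(person, context)] = False

    def may_touch(self, person: str, context: str) -> bool:
        # Only an explicit grant for this exact context counts.
        return self._grants.get((person, context), False)

registry = ConsentRegistry()
registry.grant("alex", "greeting")
print(registry.may_touch("alex", "greeting"))    # True
print(registry.may_touch("alex", "comforting"))  # False: different context
registry.revoke("alex", "greeting")
print(registry.may_touch("alex", "greeting"))    # False: revoked
```

The default-deny lookup is the key design choice: silence or an unknown context is treated as “no,” which matches the idea that consent must be asked for rather than assumed.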

This article from HRI 2011 is the one that comes to my mind when I think about robots touching humans.

I imagine that in the future, robots will be similar to humans in many facets. In modern society, there are laws that dictate (socially) acceptable human-human touching. So too, in the future, maybe robot-robot and robot-human touching will be governed by similar rules.

And then there are individual differences to consider also. Consider animals as examples. Some cats are friendly and like to be touched by humans. Other cats prefer to be left alone and do not want humans touching them. I believe that in the future, there will also be variance in robot preference about being touched by humans. Each robot will regard being touched differently, and so, it won’t be acceptable for humans (or robots) to walk around randomly touching robots.


Thanks for giving me another opportunity to clarify what I had in mind in this line of development. I realized that I hadn’t made clear that I see Misty developing interpersonal skills within the “family” Misty is part of. Thus my interest is in helping Misty be a social companion. In this context I think touch and touching can be seen as appropriate.

Of course rules would have to be developed so that Misty can interpret feedback on whether and in what situations touch would be received positively within Misty’s “family”. These rules might also apply to strangers, but I see these occasions as occurring a lot less frequently than Misty’s interactions with the people s/he lives with.

I agree that the norms for robotic actions in the future still need to be worked out. To respond to one particular issue you raised: “… before humans go around touching robots, robots need to be equipped with the ability to give & revoke consent and possess some level of understanding of what that means.” This could be taken to imply that Misty needs to give consent before a family member picks him/her up or otherwise engages in physical interaction. This may also imply that we don’t really “own” Misty but are rather Misty’s custodians. I doubt you’d get much agreement on this issue from Misty’s developers, however.

The research article you referenced and the article that Dan referenced in an earlier post both have methodological flaws that greatly temper any conclusions one might want to reach. The authors do recognize this in the discussion, but the issues they appropriately raise are easy to forget. I conducted research and published in this area (impression formation; attribution theory) back in the ’60s and ’70s. I doubt readers would be interested in these issues, but I’d be happy to talk with you (and/or Dan) about them if you’d like. Note that I live in Boulder, so there are many ways this could happen if you are interested.



Totally agree that there are some issues in the referenced papers. I put them forth mostly to indicate that work is going on in this area, not to say that anything in particular has been determined. Mostly they serve to point the way to future work that needs to be done before we fully understand touch and how it applies to human-human, let alone human-robot interaction.

As to your point:

You might be surprised by how many people might be OK with that. Here in Boulder, for example, we don’t own our pets (see “6 Weird and Unique Boulder Laws” on the About Boulder County site).

Always happy to chat - either here or in person if you want to stop by the office.


Thanks, Dan. I’ll plan to drop in, perhaps sometime next week.

How should I connect to see what time would be convenient?
