
Letting Misty recognize sensation and touch


#1


I recently met Dana Hughes just before his move to Pittsburgh. He recently earned his doctoral degree at CU in robot sensation and touch, and has just started a post-doc at Carnegie Mellon continuing this work. While I know virtually nothing about designing or evaluating robotic hardware, I think his work might be useful for Misty in future iterations.

Here is an edited abstract from a 2018 conference paper that gives the flavor of his work. Note that he also worked on developing the robotic skin mentioned in the abstract.
Abstract:
Deep learning approaches have been used to perform classification in several applications with high-dimensional input data. In this paper, we investigate the potential for deep learning for classifying affective touch on robotic skin in a social setting. …The deep learning approaches … allow gestures to be predicted in real-time at a rate of 6 to 9 Hertz. The memory requirements of the models demonstrate that they can be implemented on small, inexpensive micro-controllers, showing that classification can be performed in the skin itself by collocating computing elements with the sensor array.

The URL for all his articles is: https://scholar.google.com/citations?hl=en&user=mv_fbkkAAAAJ&view_op=list_works&sortby=pubdate
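
To make the microcontroller point from the abstract concrete, here is a rough sketch (not Dana's actual model, just my own illustration in PyTorch) of how small a touch-gesture classifier over a tactile sensor array can be. The taxel count, window length, and number of gestures are made-up numbers for illustration:

```python
import torch
import torch.nn as nn

# Hypothetical sketch: a tiny 1D CNN that classifies touch gestures from a
# window of tactile-sensor frames (e.g. 50 time steps x 16 taxels). This is
# NOT the model from the paper, only an illustration of how small such a
# classifier can be, which is why it could plausibly fit on a microcontroller.
class TouchGestureNet(nn.Module):
    def __init__(self, n_taxels=16, n_gestures=6):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_taxels, 16, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # average over the time dimension
        )
        self.classifier = nn.Linear(32, n_gestures)

    def forward(self, x):  # x: (batch, n_taxels, time_steps)
        h = self.features(x).squeeze(-1)
        return self.classifier(h)

model = TouchGestureNet()
n_params = sum(p.numel() for p in model.parameters())
print(f"parameters: {n_params} (~{n_params * 4 / 1024:.1f} KB as float32)")

# One 50-frame window of simulated sensor data -> gesture logits
window = torch.randn(1, 16, 50)
print(model(window).shape)  # torch.Size([1, 6])
```

A model this size comes out to only a few thousand parameters (tens of kilobytes), which is the kind of footprint that makes in-skin classification plausible.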


#2

Hi @BoulderAl, that is awesome, and it's a super exciting field to be in!
Like you said, while it's not possible for Misty II at this point, I'd love to see where that type of technology takes us! Thanks for sharing!


#3

Another neat technology we’ve had our eye on for touch is:


#4

It looks like there are a lot of very promising options. Interesting that this work comes from a group at Carnegie Mellon, where Dana is now based.