I recently met Dana Hughes just before his move to Pittsburgh. He just earned his doctoral degree at CU in robot sensation and touch. While I know virtually nothing about designing or evaluating robotic hardware, I think his work might be useful for Misty in future iterations. He has just started a post-doc continuing this work at Carnegie Mellon.
Here is an edited abstract from a 2018 conference paper that gives the flavor of his work. Note that he has worked on developing the robotic skin mentioned as well.
Deep learning approaches have been used to perform classification in several applications with high-dimensional input data. In this paper, we investigate the potential for deep learning for classifying affective touch on robotic skin in a social setting. …The deep learning approaches … allow gestures to be predicted in real-time at a rate of 6 to 9 Hz. The memory requirements of the models demonstrate that they can be implemented on small, inexpensive microcontrollers, demonstrating that classification can be performed in the skin itself by collocating computing elements with the sensor array.
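To give a rough sense of the kind of pipeline the abstract describes, here is a minimal sketch of gesture classification from a touch-sensor array. This is not Hughes's actual model: the network size, the 8x8 sensor grid, and the gesture labels are all illustrative assumptions, and the weights are random stand-ins for trained parameters. The point is to show why such a model's memory footprint can be tiny enough for a microcontroller co-located with the skin.

```python
import numpy as np

# Hypothetical sketch (not the paper's actual model): a tiny fully-connected
# network classifying touch gestures from an 8x8 pressure-sensor array.
# All sizes and labels below are illustrative assumptions.

rng = np.random.default_rng(0)

N_SENSORS = 8 * 8                             # flattened sensor grid
N_HIDDEN = 16
GESTURES = ["pat", "poke", "stroke", "slap"]  # example gesture classes

# Randomly initialized weights stand in for trained parameters.
W1 = rng.standard_normal((N_SENSORS, N_HIDDEN)).astype(np.float32)
b1 = np.zeros(N_HIDDEN, dtype=np.float32)
W2 = rng.standard_normal((N_HIDDEN, len(GESTURES))).astype(np.float32)
b2 = np.zeros(len(GESTURES), dtype=np.float32)

def classify(pressure_frame):
    """Forward pass: one sensor frame -> predicted gesture label."""
    x = pressure_frame.reshape(-1).astype(np.float32)
    h = np.maximum(x @ W1 + b1, 0.0)   # ReLU hidden layer
    logits = h @ W2 + b2
    return GESTURES[int(np.argmax(logits))]

# Total parameter memory in bytes (float32 = 4 bytes each):
n_params = W1.size + b1.size + W2.size + b2.size
print(n_params * 4)  # prints 4432 -- a few KB, plausible for a microcontroller
```

Even this toy model needs only about 4 KB for its weights, which is the kind of budget that makes "classification in the skin itself" believable; running the forward pass a handful of times per second matches the 6 to 9 Hz rate the abstract reports.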
The URL for all his articles is: https://scholar.google.com/citations?hl=en&user=mv_fbkkAAAAJ&view_op=list_works&sortby=pubdate