
Mixed Reality Misty


#1

I was actually quite excited when I saw that Misty was using the structure core sensor, I was an original backer of the sensor on Kickstarter and have used it quite a bit over the years. I was glad to see this:

https://bridge.occipital.com/blog/announcing-bridge-engine-10/

I wasn’t sure how far it would get after testing the alpha; I think I had unit 0002 of the Bridge Engine sets :stuck_out_tongue:
Anyway, after seeing the new release I was looking at the robot character they include and thinking about how those interactions could be very similar to Misty’s in the real world. Perhaps there is something there? Maybe you could get a low-poly model of Misty and have Occipital use it in the MR demos? The MR Misty could be part of a UX simulator?


#2

Hey @station, we’re REALLY excited to bring some of these MR interactions to Misty. I’m personally not as familiar with the capabilities of the Bridge Engine; is there a specific interaction that you’re MOST excited about?


#3

Sure @CHRIS_IS_MISTICAL!

The first thing that comes to mind is something like “virtual tagging.” If a user had a tablet with a Structure Sensor, or a Bridge Engine headset, you could load the map Misty created of a space with her sensor. That map could be used for localization, and the user could then “pin” or “tag” items so that they occupy a 1:1 virtual and real-world space. So on top of computer vision tagging the location of items on a map, you could manually add them in 3D space as well. This could be useful if you wanted Misty to look up or down on patrol to perform more complex tasks.
In theory you could just load the 3D map into a GUI and let users tag things there instead of in MR, which leads to the next potential case.
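The tagging workflow above could be sketched roughly like this. Everything here is hypothetical (the `MapTag`/`TagStore` names and shapes are made up for illustration, not part of any Misty or Occipital API); it just shows the core idea of pinning labeled points in the map’s coordinate frame and looking them up by proximity:

```python
from dataclasses import dataclass, field
from math import dist

@dataclass
class MapTag:
    """A labeled point pinned in the shared map's coordinate frame (meters)."""
    label: str
    position: tuple  # (x, y, z) in map coordinates

@dataclass
class TagStore:
    """Holds manually placed tags alongside a 3D map, 1:1 with the real space."""
    tags: list = field(default_factory=list)

    def pin(self, label, position):
        """Add a tag where the user tapped/placed it in MR or in a GUI."""
        self.tags.append(MapTag(label, position))

    def nearest(self, position):
        """Return the tag closest to a query point, e.g. the robot's pose."""
        return min(self.tags, key=lambda t: dist(t.position, position))

store = TagStore()
store.pin("coffee machine", (2.0, 0.5, 1.1))
store.pin("whiteboard", (0.0, 3.0, 1.4))
print(store.nearest((1.8, 0.6, 1.0)).label)  # coffee machine
```

Because the tags carry full 3D positions rather than just 2D map cells, a tag above or below the robot’s camera height is exactly the case where she would need to look up or down on patrol.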

Misty could share things she sees in the virtual space that can only be seen in MR. Similar to the existing MR robot character in the default Bridge Engine app, Misty could be a guide or companion in MR, allowing for more engaging physical interaction in an MR experience. It also opens the door to a new kind of training. We’ve probably all seen demos of VR training sims, so what about HRI sims? That is, training that involves working with robots but is still mainly a virtual interface.

One last case is more utilitarian and has to do with some work I did attempting to create a solution for the visually impaired. Imagine Misty created a map of the environment and automatically “tagged” the bathrooms and exits in a building. That map could then be shared with anyone who visits the space (via geolocation-based app updates), and the wearer of the Bridge headset could be guided to a desired location through voice interaction and haptics, since the localization and path finding can be done virtually. Similar to this, but cheaper. Misty could act as the maintainer of this map for a given office, and each time she updates her map it would serve another purpose.
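The guidance idea above boils down to path finding on the shared map toward a tagged location. A minimal sketch, assuming the map has been reduced to a 2D occupancy grid (the grid, cells, and function are illustrative only; nothing here is a real Misty or Bridge Engine API):

```python
from collections import deque

# 0 = free, 1 = wall; a tiny occupancy grid standing in for Misty's map
grid = [
    [0, 0, 0, 1],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
]

def bfs_path(grid, start, goal):
    """Breadth-first search over grid cells; returns a shortest list of
    (row, col) steps from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([(start, [start])])
    seen = {start}
    while queue:
        (r, c), path = queue.popleft()
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 \
                    and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append(((nr, nc), path + [(nr, nc)]))
    return None

# Guide a headset wearer from their localized cell to a tagged "exit" cell.
route = bfs_path(grid, (0, 0), (2, 3))
print(route)
```

Each step along the returned route could then be turned into a voice or haptic cue for the wearer, while Misty only has to keep the grid and its tags up to date.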


#4

This is great. Thanks @station!

Did you see Occipital’s latest experiments?