@Vlad How do you view the .occ file? I couldn’t find any free programs that let me view it.
Bump. Being able to view the OCC files would be really helpful, is there a specific program I should download?
Moved this topic to a new thread
OCC files are in a proprietary format; however, you can also pull the .ply files.
adb pull /sdcard/occ/mapping_sessions/<MAP_NAME.UTC>
^ contains all the .ply files
You can view these files with a program like MeshLab (https://www.meshlab.net)
Thank you, that is helpful to at least be able to see those. I guess the only issue is that the .ply files don’t get created unless a map can actually be built, and my Misty for some reason can’t make maps at all right now. I was surprised by how many mapping sessions were stored in that folder, though. Do those not get deleted by the “deleteSlamMap” function?
Stored maps can be deleted using REST and adb.
Example: deleting ‘Map_20200412_00.19.09.UTC’
'http://<misty_ip>/api/slam/map?key=Map_20200412_00.19.09.UTC' method: 'DELETE'
rm -r Map_20200412_00.19.09.UTC
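If you prefer to script the REST side, a minimal Python sketch (stdlib only) that builds the same DELETE call could look like the following. The IP address and map key here are placeholder examples; pass the request to `urllib.request.urlopen` on a network where the robot is reachable to actually delete the map.

```python
import urllib.request

def delete_map_request(misty_ip: str, map_key: str) -> urllib.request.Request:
    """Build the DELETE request for a stored SLAM map, matching the
    endpoint shown above. Nothing is sent over the network here."""
    url = f"http://{misty_ip}/api/slam/map?key={map_key}"
    return urllib.request.Request(url, method="DELETE")

# Placeholder IP and map key -- substitute your robot's values:
req = delete_map_request("192.168.1.50", "Map_20200412_00.19.09.UTC")
```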
Yeah, I usually delete them using REST, but even though “GET http://<ip_address>/api/slam/map/ids” returns an empty list, there were still a lot of old map folders in the mapping sessions folder (most of them were empty except for a JSON file, though). Not a big issue, I just thought that was strange.
So your robot was able to make maps and now it isn’t able to in the same locations? If that’s true, I wonder if your sensor lost calibration.
That’s what I was wondering. I’ve been mapping, deleting, and remapping the same room for a few weeks now, and just this Friday it stopped working altogether. How would I go about recalibrating?
We would have to recalibrate it on our side since it takes some special equipment. Looping in @Bryan to help with that.
From @morgan: “The easiest way to tell [if the sensor is calibrated] is to point the robot towards a large flat surface (wall), take a depth image, then look at the uniformity of that image. Additionally, a series of images from the fish eye could be captured to ensure that there aren’t any artifacts.”
Hm, okay. If I use the REST call to take a depth picture, should I just be looking at the array of depth values it returns?
I tried the “GET http://<ip_address>/api/cameras/depth” call twice. The first time, almost all of the results were NaN, but the ones that weren’t were actually pretty consistent distances (around 780). The second time, the entire image was NaN. I would attach the files, but I don’t think I have that option. For reference, this is the wall I am aimed at.
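Assuming the endpoint returns a flat array of floats with NaN for pixels that got no data (as described above), a quick stdlib-only sketch for summarizing a frame might look like this. The sample data is a toy stand-in, not real sensor output.

```python
import math
from statistics import median

def summarize_depth(values):
    """Return the fraction of valid (non-NaN) pixels and the median
    of the valid readings -- a rough uniformity check for a frame
    aimed at a flat wall."""
    valid = [v for v in values if not math.isnan(v)]
    frac_valid = len(valid) / len(values) if values else 0.0
    return frac_valid, (median(valid) if valid else math.nan)

# Toy example: 90% NaN, valid pixels clustered around 780
sample = [math.nan] * 90 + [778, 780, 781, 779, 780, 782, 780, 779, 781, 780]
frac, med = summarize_depth(sample)
# A healthy sensor pointed at a flat wall should give a high valid
# fraction; a frame that is almost entirely NaN suggests a problem.
```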
While we’re talking about calibration: the IMU data says the z-acceleration is at -9.8 m/s^2 (the acceleration due to gravity), and the x and y accelerations are usually somewhere between 0 and 0.3 when Misty is not moving. Right now it says the y acceleration is 0.28 and x is 0.04. I didn’t think much of this at first, but is that abnormal?
@morgan Any thoughts?
Looks fine to me on the acceleration data
When we’re testing calibration at the factory, the last test is a depth test. To perform the depth test, we elevate the robot and take ten depth images from the sensor. The robot is placed around a meter from a wall, and a meter off of the floor, really just to ensure that the image doesn’t pick up the floor or ceiling. From those images, we validate that the data is reporting ~1.0m for most of the pixels. Given the way the sensor works, some of the outer pixels commonly don’t get data from that test, and that’s alright.
With respect to that IMU, there are actually two in Misty. One is built into the Occipital sensor, the other into the base of the robot. When it comes to obtaining pose, the IMU in the Occipital sensor is the only one that matters (and I’m not sure if there’s an easy way to get to that data at the moment). That IMU gets calibrated in the factory as well, and is done with a robot arm and visual target.
The values you’re getting from the other IMU are a little suspect, but aren’t affecting pose. That IMU does self-calibrate, so I’d expect the numbers to stabilize fairly quickly after power up.
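One generic way to sanity-check stationary accelerometer readings (this is basic physics, not a Misty API call) is to compare the magnitude of the acceleration vector with g, using the numbers quoted earlier in the thread:

```python
import math

def accel_magnitude(ax, ay, az):
    """Magnitude of the acceleration vector in m/s^2; for a robot
    at rest this should be close to g (~9.81)."""
    return math.sqrt(ax * ax + ay * ay + az * az)

# The stationary readings quoted above: x=0.04, y=0.28, z=-9.8
mag = accel_magnitude(0.04, 0.28, -9.8)
# mag comes out around 9.80, within a few hundredths of g, so the
# small x/y offsets are consistent with noise or a slight tilt.
```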
As for your pose issue, when I’m initially testing, I tend to start with a scene that contains a lot of visual features, with those features at a reasonable distance. Posters and pictures tend to work pretty well, as do furniture items like couches, chairs, etc. Also, if you haven’t discovered them, there are some advanced controls (like exposure) in the mapping section of the Command Center.
Finally, if you could DM me your robot’s serial number, I could check our error logs, just in case we’ve caught anything error-wise that might be helpful on our end.
Thanks for the explanation; I’ll send you the serial number. I’ll try to get some depth pictures with the robot elevated and aimed at a wall to see how those turn out. Based on what I’ve said, do you think something is wrong with my Misty? Mapping was working pretty well until last Friday, but ever since then I haven’t even been able to find pose, so I’m not sure what could’ve changed.
Edit: Man, I hate when this happens… it just started working again today. Not sure what changed; it was right after an update, but reading the update notes, it doesn’t seem like that really should’ve changed anything. At least it’s working now.