
Misty Community Forum

Training Mission: Obstacle Avoidance & Autonomous Locomotion

Briefing

Mission Objective: Learn about Misty’s locomotion commands and built-in hazards system
Mission Type: Training
Capabilities Exercised: Robust locomotion, obstacle avoidance, hazards system
Launch Date: 10/18/2019
End Date: Ongoing

Mission Overview

This training mission is all about Misty’s locomotion and obstacle avoidance capabilities. Misty’s SDK provides an interface for coding your robot to drive around her environment, and her built-in hazards system stops her from running commands that could cause her harm. As Misty’s developers, you can combine these capabilities to build your own implementation of autonomous locomotion.

To complete the mission, learn about Misty’s hazards system and locomotion commands by exploring the docs, code samples, and tutorials linked below. If something isn’t clear or you need more information, let us know by posting a reply to this topic. Then, when you’re ready, join the challenge to code the best obstacle course navigation skill!

Details

When Misty boots up, her hazards system runs in the background to analyze sensor data and determine whether the robot is operating safely. The hazards system triggers what’s called a “hazard state” when a sensor returns a value outside a safe range. While in a hazard state, Misty ignores any commands that would make that state worse.

To picture this in action, imagine running a skill that calls the misty.Drive() method to move Misty forward in a straight line. Now imagine that when she runs this command, Misty happens to be facing the top of a staircase. With the hazards system configured to its default settings, Misty continues driving straight until one of her front downward-facing (“edge”) time-of-flight sensors pokes over the top of the stairs. When this happens, the edge sensor suddenly returns a distance value much greater than the height of a drop-off Misty could safely drive over. This value triggers a hazard state for that sensor, and the system cancels all forward locomotion. To exit this hazard state, Misty must drive in a safe direction (backwards, in this example) until the front edge sensor returns a safe distance value. Until then, Misty ignores forward drive commands.

A quick way to implement autonomous locomotion is to register for HazardNotification event messages in your skills. Use HazardNotification events to determine the location of a hazard and programmatically change course when Misty enters a hazard state.
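As a rough sketch of that pattern, the snippet below registers for HazardNotification events and picks a recovery maneuver based on which sensors are blocking motion. The `driveStopped` field and the `sensorName` values are assumptions based on the HazardNotification message format in Misty’s documentation; verify both against the messages your robot actually sends.

```javascript
// Sketch: pick a recovery maneuver from a HazardNotification message.
// Assumption: the message includes a driveStopped array of
// { sensorName, inHazard } objects, with names like "TOF_DownFrontRight".
function chooseRecovery(hazardData) {
  var blocked = (hazardData.driveStopped || [])
    .filter(function (s) { return s.inHazard; })
    .map(function (s) { return s.sensorName; });
  if (blocked.length === 0) {
    return null; // nothing is blocking motion
  }
  var frontBlocked = blocked.some(function (name) {
    return name.indexOf("Front") !== -1;
  });
  var backBlocked = blocked.some(function (name) {
    return name.indexOf("Back") !== -1;
  });
  if (frontBlocked && !backBlocked) {
    return { linearVelocity: -20, angularVelocity: 0 }; // back away
  }
  if (backBlocked && !frontBlocked) {
    return { linearVelocity: 20, angularVelocity: 0 };  // move forward
  }
  return { linearVelocity: 0, angularVelocity: 30 };    // turn in place
}

// Inside a skill, the helper would be wired up roughly like this
// (guarded so the file also loads outside the robot's runtime):
if (typeof misty !== "undefined") {
  misty.RegisterEvent("Hazard", "HazardNotification", 0, true);
}
function _Hazard(data) {
  var move = chooseRecovery(data);
  if (move) {
    misty.Drive(move.linearVelocity, move.angularVelocity);
  }
}
```

Keeping the decision logic in a pure function like `chooseRecovery` makes it easy to test off the robot before loading the skill.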

Resources

Misty’s SDK provides access to several commands for coding Misty to drive around her environment. Use these commands with the sensor data in HazardNotification event messages (and messages from other event types) to code autonomous locomotion into your skills.
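To give a feel for these commands, here is a minimal sketch of driving a square with misty.DriveHeading(). This is not the official driveSquare sample linked below, and the parameter values (0.5 meters per side, 4 seconds per leg) are placeholder assumptions; check the API reference for the exact misty.DriveHeading() signature on your firmware version.

```javascript
// Sketch: compute the four headings of a square, 90 degrees apart,
// then drive each leg. Side length and timings are placeholders.
function squareHeadings(startHeading) {
  var headings = [];
  for (var i = 0; i < 4; i++) {
    headings.push((startHeading + i * 90) % 360);
  }
  return headings;
}

// Guarded so the file also loads outside the robot's runtime.
if (typeof misty !== "undefined") {
  squareHeadings(0).forEach(function (heading) {
    // Assumed signature: DriveHeading(heading, distanceMeters, timeMs, reverse)
    misty.DriveHeading(heading, 0.5, 4000, false);
    misty.Pause(4500); // let each leg finish before turning
  });
}
```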

Concepts

Reference

JavaScript API Code Samples

  • driveCircle - Code Misty to drive in a circle using the misty.DriveArc() method.
  • driveSquare - Code Misty to drive in a square using the misty.DriveHeading() and misty.DriveArc() methods.
  • hazardNotification - Register for HazardNotification events and print debug messages with details about which sensors are in a hazard state. (Tip: Use this as a starting point for your autonomous locomotion skill!)
  • timeOfFlights - Code Misty to react when she detects an object within a certain distance of her range (outward-facing) time-of-flight sensors.
  • bumpSensor - Play different sounds when you trigger the bump sensors on Misty’s base.
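In the same spirit as the bumpSensor sample above (but not a copy of it), the sketch below maps each bump sensor to a sound. The `sensorId` property and its short codes (e.g. "bfr" for the front-right bump sensor) are assumptions to verify against your robot’s BumpSensor messages, and the audio file names are Misty’s default system sounds; use misty.GetAudioList() to see what is actually on your robot.

```javascript
// Sketch: choose a default system sound for each bump sensor.
// The sensorId codes and file names here are assumptions.
function soundFor(sensorId) {
  var sounds = {
    bfr: "s_Joy.wav",        // front right
    bfl: "s_Awe.wav",        // front left
    brr: "s_Annoyance.wav",  // rear right
    brl: "s_Boredom.wav"     // rear left
  };
  return sounds[sensorId] || "s_Acceptance.wav";
}

// Guarded so the file also loads outside the robot's runtime.
if (typeof misty !== "undefined") {
  misty.AddReturnProperty("Bumped", "sensorId");
  misty.RegisterEvent("Bumped", "BumpSensor", 200, true);
}
function _Bumped(data) {
  misty.PlayAudio(soundFor(data.AdditionalResults[0]), 75);
}
```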

Mission Support

Hit a roadblock? Find a bug? Have feedback? Let us know! As Misty’s earliest developers, you are the real superstars on the journey to making personal robots a reality. Your creativity makes all this possible, and we’re eager to hear your feedback. Reply to this post to start a conversation with a member of the Misty team.


Hi Johnathan,
Here is the link to the GitHub repository for the code:
https://github.com/pvinhha/Misty_General_Checkup_Obstacle_Avoidance
and the video is at the Dropbox link I sent to you.

Thanks,
Phillip