Mission Objective: Learn about Misty’s ActuatorPosition event type and understand how to use head and arm movement commands in your skill code
Mission Type: Training
Capabilities Exercised: Spatial Awareness, Programmable Personality
Est. Time to Complete: 1 - 2 hours
Launch Date: 11/1/2019
End Date: Ongoing
This training mission is all about familiarizing yourself with Misty’s actuator position data and head and arm movement commands.
Misty’s APIs provide several commands for controlling your robot’s head and arm movement. Additionally, you can get the current position of Misty’s arm and neck motors (relative to her coordinate frame) in the body of ActuatorPosition event messages, which you can subscribe to in on-robot skills or remotely via Misty’s WebSocket server. By using actuator position data together with Misty’s movement commands, you can code coordinated head and arm movements in your skills and robot applications.
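If you prefer to start in code, here’s a minimal sketch of what the movement commands look like in an on-robot JavaScript skill. The argument values are just examples and assume Misty’s default angular units (degrees); check the API documentation for the full list of optional arguments.

```javascript
// A minimal sketch of Misty's head and arm movement commands in an
// on-robot JavaScript skill. Values assume degrees (the default units).
misty.MoveHead(-15, 0, 0, 60);    // pitch, roll, yaw, velocity: tilt the head up slightly
misty.MoveArm("right", -20, 60);  // arm, position, velocity: raise the right arm
misty.MoveArms(70, 70, 60, 60);   // left position, right position, left velocity, right velocity
```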
To complete the training mission, walk through the materials below to learn about Misty’s movement commands, actuator position data, and her coordinate frame. If something isn’t clear or you need more information, let us know by posting a reply to this topic. Then, when you’re ready, join the challenge to code the best dance skill for Misty!
Here are a few ways to experiment with and learn about Misty’s movement commands, actuator position data, and her coordinate frame.
Explore movement & position data with the Command Center
The quickest way to start sending discrete head and arm movement commands is through Misty’s Command Center. Start by booting up your robot and connecting her to the Command Center web page. Then, scroll down to the Movement section. The commands you see here are wrappers around the movement endpoints in Misty’s REST API, and you can use them to quickly get a sense of the movements your robot is capable of.
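If you’d like to try the same thing outside your browser, here’s a minimal sketch of calling those endpoints directly from Node.js (version 18 or later, for the built-in fetch). The /api/head and /api/arms endpoint names and payload fields are assumptions based on the REST documentation for MoveHead and MoveArm; replace <robot-ip> with your robot’s IP address.

```javascript
// A sketch of calling the REST endpoints that the Command Center wraps.
// Endpoint paths and payload fields are assumptions from the REST docs.
const robot = "http://<robot-ip>";

async function main() {
    // MoveHead: pitch/roll/yaw in degrees, velocity as a percentage of max speed.
    await fetch(`${robot}/api/head`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ Pitch: -15, Roll: 0, Yaw: 0, Velocity: 60, Units: "degrees" })
    });

    // MoveArm: raise the right arm.
    await fetch(`${robot}/api/arms`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ Arm: "right", Position: -20, Velocity: 60, Units: "degrees" })
    });
}

main();
```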
As you experiment with movement commands, be sure to use the Sensor Data section to stream Actuator Position data via Misty’s WebSocket connection. This stream of data updates to show the current position of Misty’s arm and neck motors as you issue movement commands.
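If you’re curious what the Sensor Data section is doing under the hood, here’s a minimal sketch of subscribing to ActuatorPosition events from Node.js with the third-party ws package. The /pubsub path and the shape of the subscribe message are assumptions based on Misty’s WebSocket documentation; replace <robot-ip> with your robot’s IP address.

```javascript
// A sketch of streaming ActuatorPosition data over Misty's WebSocket server.
const WebSocket = require("ws");

const socket = new WebSocket("ws://<robot-ip>/pubsub");

socket.on("open", () => {
    // Subscribe to ActuatorPosition messages; the field names here are
    // assumptions based on Misty's WebSocket documentation.
    socket.send(JSON.stringify({
        Operation: "subscribe",
        Type: "ActuatorPosition",
        DebounceMs: 250,  // how often Misty sends updated position data
        EventName: "ActuatorPositionStream"
    }));
});

socket.on("message", (raw) => {
    // Each message is JSON; after an initial registration confirmation,
    // messages carry an actuator's ID and current position.
    console.log(JSON.parse(raw.toString()));
});
```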
Learn about Misty’s coordinate frame
Misty uses a right-handed coordinate frame when orienting her body, head, and arms relative to her surroundings, when interpreting movement and locomotion commands, and when publishing sensor data. The ActuatorPosition event message includes data that describes the position of Misty’s arm and neck motors relative to this coordinate frame.
Misty’s developer documentation provides details about the positional values and range of movement in the description for each movement command. Additionally, we’ve prepared a set of visuals to help you better understand Misty’s coordinate frame and movement ranges. Download the Misty II Coordinate Frame & Movement Ranges PDF for a handy guide to the values Misty’s movement commands accept, and to learn how Misty understands her current position and the location of incoming data she senses in her environment.
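One quick way to build intuition for these values is to log them as you move Misty around. Here’s a minimal on-robot sketch; the sensorId and value property names are assumptions based on the ActuatorPosition event documentation.

```javascript
// A sketch that logs each actuator's ID and current position so you can
// compare the values against the coordinate frame and movement-range guide.
misty.AddReturnProperty("ActuatorLog", "sensorId");
misty.AddReturnProperty("ActuatorLog", "value");
misty.RegisterEvent("ActuatorLog", "ActuatorPosition", 250, true);

// Default callback name is "_" + the registered event name.
function _ActuatorLog(data) {
    // AdditionalResults holds the return properties in the order they were added.
    var sensorId = data.AdditionalResults[0];
    var position = data.AdditionalResults[1];
    misty.Debug(sensorId + ": " + position + " degrees");
}
```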
Try some sample code
As a very basic example of how all this fits together in your skills, we’ve prepared a piece of sample code you can run on your own robot. For the last part of this training mission, download the simpleDance skill and install it on Misty. The simpleDance skill works as follows: When Misty lifts her arm to a certain position, an actuator position event listener triggers a callback that invokes a command for Misty to lift her head. When her head reaches its destination, a separate actuator position event listener triggers another callback that invokes a command to lower Misty’s arm. We alternate between listening for data from Misty’s head pitch and right arm actuators in this way to keep these movements running on a loop. The skill demonstrates how you can register listeners for ActuatorPosition events and parse the position data to chain movements together for coordinated action.
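The skill file itself is the best reference, but if you want to see the shape of the pattern first, here’s a minimal sketch (not the skill’s exact code). It assumes the on-robot JavaScript skill API and the ActuatorPosition sensorId values "ara" (right arm) and "ahp" (head pitch); the position thresholds and velocities are arbitrary values chosen for illustration.

```javascript
function startDanceLoop() {
    misty.MoveArm("right", -20, 60);  // raise the right arm (degrees; negative is up)
    listenFor("ArmUp", "ara");        // wait for the right arm actuator
}

// Register a keep-alive ActuatorPosition listener filtered to one actuator.
function listenFor(eventName, sensorId) {
    misty.AddPropertyTest(eventName, "sensorId", "==", sensorId, "string");
    misty.AddReturnProperty(eventName, "value");
    misty.RegisterEvent(eventName, "ActuatorPosition", 100, true);
}

// Callbacks are named "_" + the registered event name by default.
function _ArmUp(data) {
    if (data.AdditionalResults[0] <= -15) {   // arm is up; raise the head
        misty.UnregisterEvent("ArmUp");
        misty.MoveHead(-20, 0, 0, 60);
        listenFor("HeadUp", "ahp");
    }
}

function _HeadUp(data) {
    if (data.AdditionalResults[0] <= -15) {   // head is up; lower the arm
        misty.UnregisterEvent("HeadUp");
        misty.MoveArm("right", 80, 60);
        listenFor("ArmDown", "ara");
    }
}

function _ArmDown(data) {
    if (data.AdditionalResults[0] >= 75) {    // arm is down; lower the head and repeat
        misty.UnregisterEvent("ArmDown");
        misty.MoveHead(10, 0, 0, 60);
        startDanceLoop();
    }
}

startDanceLoop();
```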
Note: While you can use Misty’s API to specify head and arm movement using radians or a relative “positional” unit of measurement, we recommend using degrees. By default, most of Misty’s event messages and movement commands return and accept angular values in degrees, so sticking to degrees makes it easier to integrate movement commands with sensor and event data in your skill code.
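For example, assuming the optional duration and units arguments in the current MoveHead signature, the same motion looks like this in degrees and radians:

```javascript
// Same head movement in degrees (the default) and radians. The optional
// duration/units arguments are assumptions about the current signature:
// misty.MoveHead(pitch, roll, yaw, velocity, duration, units).
misty.MoveHead(-15, 0, 0, 60);                    // degrees
misty.MoveHead(-0.26, 0, 0, null, 1, "radians");  // roughly the same pitch in radians
```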
Hit a roadblock? Find a bug? Have feedback? Let us know! As Misty’s earliest developers, you are the real superstars on the journey to making personal robots a reality. Your creativity makes all this possible, and we’re eager to hear your feedback. Reply to this post to start a conversation with a member of the Misty team.