
Misty Community Forum

Training Mission: Coordinating Movement with Actuator Position Data

Mission Objective: Learn about Misty’s ActuatorPosition event type and understand how to use head and arm movement commands in your skill code
Mission Type: Training
Capabilities Exercised: Spatial Awareness, Programmable Personality
Est. Time to Complete: 1 - 2 hours
Launch Date: 11/1/2019
End Date: Ongoing

Mission Overview

This training mission is all about familiarizing yourself with Misty’s actuator position data and head and arm movement commands.

Misty’s APIs provide several commands for controlling your robot’s head and arm movement. Additionally, you can get the current position of Misty’s arm and neck motors (relative to her coordinate frame) in the body of ActuatorPosition event messages, which you can subscribe to in on-robot skills or remotely via Misty’s WebSocket server. By using actuator position data together with Misty’s movement commands, you can code coordinated head and arm movements in your skills and robot applications.
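As a quick sketch of what registering for this data looks like in an on-robot JavaScript skill: the registration calls below (`misty.AddPropertyTest`, `misty.RegisterEvent`) follow Misty's JavaScript API, but the mock `misty` object, the `"ahp"` sensorId filter, and the exact field path inside the event message are stand-ins so the sketch runs off-robot — verify them against the Event Types reference.

```javascript
// Stand-in for the global `misty` object that exists on the robot,
// so this sketch can run anywhere.
var misty = {
  AddPropertyTest: function () {},
  RegisterEvent: function () {},
  Debug: function (msg) { console.log(msg); }
};

// Filter the stream so we only hear from the head pitch actuator.
// ("ahp" as the head pitch sensorId is an assumption; check the docs.)
misty.AddPropertyTest("HeadPitch", "sensorId", "==", "ahp", "string");

// Debounce to one message per 100 ms; keepAlive = true keeps the
// subscription open after the first message.
misty.RegisterEvent("HeadPitch", "ActuatorPosition", 100, true);

// By convention, the callback is the event name prefixed with an underscore.
function _HeadPitch(data) {
  // The field path below is illustrative; the real message may differ.
  var degrees = data.PropertyTestResults[0].PropertyParent.value;
  misty.Debug("Head pitch: " + degrees + " degrees");
  return degrees;
}
```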

To complete the training mission, walk through the materials below to learn about Misty’s movement commands, actuator position data, and her coordinate frame. If something isn’t clear or you need more information, let us know by posting a reply to this topic. Then, when you’re ready, join the challenge to code the best dance skill for Misty!

Details

Here are a few ways to experiment with and learn about Misty’s movement commands, actuator position data, and her coordinate frame.

Explore movement & position data with the Command Center

The quickest way to start sending discrete head and arm movement commands is through Misty’s Command Center. Start by booting up your robot and connecting her to the Command Center web page. Then, scroll down to the Movement section. The commands you see here are a wrapper around the endpoints for movement commands in Misty’s REST API, and you can use them to quickly get an idea of what kind of movement your robot is capable of.
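Under the hood, those buttons post to REST endpoints on the robot. A minimal sketch of calling one directly — the `/api/head` path and the parameter names are based on Misty's REST API docs, but treat them as assumptions and check the current reference; `<robot-ip>` is a placeholder:

```javascript
// Build the JSON body for a head movement request. Parameter names
// (Pitch, Roll, Yaw, Velocity, Units) follow Misty's REST API.
function buildHeadRequest(pitch, roll, yaw, velocity) {
  return { Pitch: pitch, Roll: roll, Yaw: yaw, Velocity: velocity, Units: "degrees" };
}

// Usage sketch (Node 18+ or a browser):
// fetch("http://<robot-ip>/api/head", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(buildHeadRequest(-20, 0, 0, 75))
// });
```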

As you experiment with movement commands, be sure to use the Sensor Data section to stream Actuator Position data via Misty’s WebSocket connection. This stream of data updates to show the current position of Misty’s arm and neck motors as you issue movement commands.
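You can also open that same WebSocket stream yourself, outside the Command Center. A sketch of building the subscription message — the message shape follows Misty's WebSocket documentation, but the debounce value and event name here are just examples, and `<robot-ip>` is a placeholder:

```javascript
// Build a subscription message for Misty's WebSocket server.
function buildSubscribeMessage(eventName, debounceMs) {
  return {
    Operation: "subscribe",
    Type: "ActuatorPosition",
    DebounceMs: debounceMs,
    EventName: eventName,
    Message: "",
    ReturnProperty: null
  };
}

// Usage sketch (browser WebSocket):
// const socket = new WebSocket("ws://<robot-ip>/pubsub");
// socket.onopen = () =>
//   socket.send(JSON.stringify(buildSubscribeMessage("ActuatorPosition", 100)));
// socket.onmessage = (e) => console.log(JSON.parse(e.data).message);
```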

Learn about Misty’s coordinate frame

Misty uses a right-handed coordinate frame when orienting her body, head, and arms relative to her surroundings, when interpreting movement and locomotion commands, and when publishing sensor data. The ActuatorPosition event message includes data that describes the position of Misty’s arm and neck motors relative to this coordinate frame.

Misty’s developer documentation provides details about the positional values and range of movement in the description for each movement command. Additionally, we’ve prepared a set of visuals to help you better understand Misty’s coordinate frame and movement ranges. Download the Misty II Coordinate Frame & Movement Ranges PDF for a handy guide to the values Misty’s movement commands accept, and to learn how Misty understands her current position and the location of incoming data she senses in her environment.

Try some sample code

As a very basic example of how all this fits together in your skills, we’ve prepared a piece of sample code you can run on your own robot. For the last part of this training mission, download the simpleDance JavaScript skill from the Misty Community Sandbox repo on GitHub.

The simpleDance skill works as follows: When Misty lifts her arm to a certain position, an actuator position event listener triggers a callback that invokes a command for Misty to lift her head. When her head reaches its destination, a separate actuator position event listener triggers another callback that invokes a command to lower Misty’s arm. We alternate between listening for data from Misty’s head pitch and right arm actuators in this way to keep these movements running on a loop.
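The alternation described above boils down to two event callbacks that each trigger the other's precondition. A stripped-down sketch, with the robot calls mocked so the control flow is visible off-robot (the positions and speeds are illustrative, not the values in the actual skill):

```javascript
// Record calls so the ping-pong between head and arm is visible.
var calls = [];
var misty = {
  MoveHead: function (pitch, roll, yaw, velocity) { calls.push("MoveHead"); },
  MoveArm: function (arm, position, velocity) { calls.push("MoveArm"); }
};

// Fires when the right arm actuator reports the arm has been raised.
function _RightArmUp(data) {
  misty.MoveHead(-30, 0, 0, 60); // arm is up: raise the head
}

// Fires when the head pitch actuator reports the head has arrived.
function _HeadUp(data) {
  misty.MoveArm("right", 80, 60); // head is up: lower the arm, looping back
}

// Simulate one cycle of event messages arriving:
_RightArmUp({});
_HeadUp({});
```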

Open up the JavaScript file in your text editor to see an example of how you can send movement commands, register for ActuatorPosition events, and parse the position data to chain movements together for coordinated action.

Links

Concepts

Note: While you can use Misty’s API to specify head and arm movement using radians or a relative “positional” unit of measurement, we recommend using degrees. By default, most of Misty’s event messages and movement commands return and accept angular values in degrees, so sticking to degrees makes it easier to integrate movement commands with sensor and event data in your skill code.
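If part of your skill does its math in radians anyway, a pair of small converters keeps the movement commands themselves in degrees, per the recommendation above:

```javascript
// Convert between degrees (Misty's default angular unit) and radians.
function degToRad(deg) { return deg * Math.PI / 180; }
function radToDeg(rad) { return rad * 180 / Math.PI; }
```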

Reference

JavaScript Code Sample

Mission Support

Hit a roadblock? Find a bug? Have feedback? Let us know! As Misty’s earliest developers, you are the real superstars on the journey to making personal robots a reality. Your creativity makes all this possible, and we’re eager to hear your feedback. Reply to this post to start a conversation with a member of the Misty team.


First, thanks for the well-laid-out documentation. I have been looking for tricky nuggets, but regretfully for me (kudos to you), I have not been able to find any more. I do have some questions, if you can point me in the right direction. These are based on the pretendCoinToss skill; I thought it would be easier just to point to the code instead of copying it here: Misty Skill: Flips a pretend coin

Question 1: In this skill, Misty reaches the max pitch position (looking down at her right hand) before the desired yaw position is fully reached. I tried intermediate points, but my method produced choppy results. I want Misty’s head to reach the “Look at Right Hand” point in pitch, yaw, and roll at the exact same time. Please point me in a direction you think would help.

Question 2: Most commands have prePause and postPause. How should I think about these? Maybe a simpler question: are these two sections of code equivalent?

misty.Pause(3000);
misty.MoveArm("both", 45, 70, 0); // Note: prePause and postPause are not included
misty.Pause(3000);

misty.MoveArm("both", 45, 70, 0, 3000, 3000); // prePause and postPause included in one command

Question 3: The actuator positions shown in the Command Center are displayed to 2 decimal places. Why the precision? Playing with this using the TestMove function in the pretendCoinToss code, I changed the value to move the head back and forth in very small increments (<1 degree). There is no discernible position change at 0.1, and in the tutorials these position values are typically given to a whole degree. Is there a benefit to defining positions at higher resolution?

Question 4: Is there a head position where the measurements are more accurate, or a position where they are less accurate? Does the accuracy change with different head positions?

Question 5: I mistyped a value and commanded Misty to an unobtainable head position. Besides freaking me out when the clutch engages, is there another way to let me know I did something wrong? Is there software that checks that the commanded positions are within the possible range before sending them on, or is that code supposed to reside in my skill-level code?

As always, thanks for your help.


Thanks for the great questions, @MorningR! I’ll answer what I can and try to track down more details for the things I’m not sure about.

Question 1: In this skill, Misty reaches the max pitch position (looking down at her right hand) before the desired yaw position is fully reached. I tried intermediate points, but my method produced choppy results. I want Misty’s head to reach the “Look at Right Hand” point in pitch, yaw, and roll at the exact same time. Please point me in a direction you think would help.

We’ve actually been working on some changes to the system responsible for actuating head movement commands. My understanding is that after the next update (scheduled for 2019-11-19T07:00:00Z), the default behavior will be such that the pitch, roll, and yaw motors all reach their designated positions at close to the same time. (These changes will also bring head movement speeds closer to where they were before Misty’s 2019.10.08 system update.)

Question 2: Most commands have prePause and postPause. How should I think about these? Maybe a simpler question: are these two sections of code equivalent?

misty.Pause(3000);
misty.MoveArm("both", 45, 70, 0); // Note: prePause and postPause are not included
misty.Pause(3000);
misty.MoveArm("both", 45, 70, 0, 3000, 3000); // prePause and postPause included in one command

I can answer your simpler question here :slight_smile: You’re thinking about it correctly – those two pieces of code should behave the same.

Question 3: The actuator positions shown in the Command Center are displayed to 2 decimal places. Why the precision? Playing with this using the TestMove function in the pretendCoinToss code, I changed the value to move the head back and forth in very small increments (<1 degree). There is no discernible position change at 0.1, and in the tutorials these position values are typically given to a whole degree. Is there a benefit to defining positions at higher resolution?

I don’t know the answer to this – I’ll have to chat with our team to learn more about the design choices here. @steven also seems like someone who could potentially shed some light here. I’ve flagged him in case he’d like to weigh in.

Question 4: Is there a head position where the measurements are more accurate, or a position where they are less accurate? Does the accuracy change with different head positions?

I am not aware of any position where the measurements are more accurate, or where the movement will be more precise. I typically expect an error range within a few degrees for head / arm movement commands. In the simpleDance sample linked above, the conditional block looks for a measurement within a range, rather than a precise position.
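That range check can be written as a small helper: treat a movement as complete when the reported position is within a tolerance of the target, rather than testing exact equality. (The 2-degree default here is just an example, not the value used in the sample skill.)

```javascript
// True when the reported actuator position is close enough to the target.
function hasArrived(reportedDeg, targetDeg, toleranceDeg) {
  toleranceDeg = toleranceDeg === undefined ? 2 : toleranceDeg;
  return Math.abs(reportedDeg - targetDeg) <= toleranceDeg;
}
```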

Question 5: I mistyped a value and commanded Misty to an unobtainable head position. Besides freaking me out when the clutch engages, is there another way to let me know I did something wrong? Is there software that checks that the commanded positions are within the possible range before sending them on, or is that code supposed to reside in my skill-level code?

As far as I know we don’t expect the neck motor clutches to engage even when you pass in a value outside the range. I’m asking around in case I’m wrong on that, but I’d like to get some more information from you in case there’s a bug or hardware issue at play. Could you follow these steps to generate a diagnostic report and attach it in an email to support@mistyrobotics.com?

(Edit) Update: After investigating a bit, it looks like the clutch issues you’re seeing are not expected behavior, and are related to a bug we’ve been tracking internally for a few weeks. There is a fix that’s scheduled to go out with next week’s update. Until then, if you move the neck outside the allowed range without specifying either a duration or velocity, then the system defaults to a speed that can cause the clutch to engage. I’m told you can work around this bug by supplying a value for either duration or velocity when you invoke a head movement command.

@MorningR I can’t answer all of your questions but can weigh in on these:

Question 3: The actuator positions shown in the Command Center are displayed to 2 decimal places. Why the precision? Playing with this using the TestMove function in the pretendCoinToss code, I changed the value to move the head back and forth in very small increments (<1 degree). There is no discernible position change at 0.1, and in the tutorials these position values are typically given to a whole degree. Is there a benefit to defining positions at higher resolution?

There are several factors which affect the resolution of the head movements. We use variable resistors (VRs) to provide position feedback, and these come in several standard ranges. For example, the Yaw VR is 180 degrees, which means that as the head moves, the resistance increases linearly from 0 ohms to 10k ohms. Our microcontroller reads the VR using a 12-bit analog-to-digital converter (ADC). This functionally divides the 180-degree sweep of the VR into 4096 discrete sections, so a resolution of about 0.04 degrees for Yaw. Not too shabby! However, Pitch and Roll have lower resolution due to their VRs.
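The resolution arithmetic above, spelled out:

```javascript
// A 12-bit ADC divides the VR's sweep into 2^12 = 4096 steps.
function adcResolutionDegrees(sweepDegrees, adcBits) {
  return sweepDegrees / Math.pow(2, adcBits);
}
// For the 180-degree yaw VR: 180 / 4096 ≈ 0.044 degrees per step.
```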

The next factor is the motor/gearbox. The head gearboxes have relatively low gear ratios, which makes accurate positioning more challenging; large gear ratios let the motor spin at high speed while the output moves slowly, giving finer control.

This brings us to the last factor: the control system. The control system must ingest the feedback from the VRs and control the motors to reach the specified position. Steven can provide a more accurate description of our control system, but it is a variation of a PID loop, which allows it to take into account the velocity and position of the joint. Tuning PIDs requires trade-offs between sensitivity and overshoot. If the system is too sensitive, an input of 0.1 degrees will cause it to move quickly, overshooting the desired position; in extreme cases it could oscillate the joint forever, overshooting each time. In theory, a difference of 0.1 degrees will cause motion eventually, but it takes time for the error term to build up enough for the system to react. Our system has a deadband to eliminate unwanted movements due to noise in the VRs. I can’t recall how large it is though :stuck_out_tongue:

Apologies if that was too long and pedantic; I really enjoy this sort of thing.
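The deadband idea above can be sketched in a line or two: errors smaller than the deadband are treated as zero so sensor noise doesn't twitch the motor. (The threshold value in the test is arbitrary, not Misty's actual deadband.)

```javascript
// Zero out position errors smaller than the deadband.
function applyDeadband(errorDeg, deadbandDeg) {
  return Math.abs(errorDeg) < deadbandDeg ? 0 : errorDeg;
}
```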

Question 4: Is there a head position where the measurements are more accurate, or a position where they are less accurate? Does the accuracy change with different head positions?

Positioning accuracy should be the same throughout the motion of any given degree of freedom (Pitch, Yaw, Roll, etc.). The accuracy differs between degrees of freedom due to the factors from Question 3. Roughly in order of most accurate to least accurate: Arms, Yaw, Pitch, Roll.

Hope that helps!


The movement dead-band is currently 2 degrees. As Miles says this is due to noise in the ADCs. I have been working on different filtering techniques that should hopefully allow a smaller dead-band (less than 1 degree). There is a balance between filtering and dead-band, as filtering adds lag in the feedback, which results in overshoot and jerky movements in the controller.


@steven and @miles Thanks for the insight, and don’t apologize for getting into the technical details. It is good to know this info when trying to get Misty to do a move. I like that there is a lot of resolution, even if it is not all available (yet). In playing around with it, small movements can carry significant information and make Misty’s actions seem more realistic.

There are tradeoffs in the skill code I write. I can either get to a head position fast and risk overshoot, or I can go slow and ensure an accurate movement. I am looking at how best to implement this. Right now I am focusing on smooth movements. Given position and velocity levers (and, in the future, Duration), there are many ways to get from point A to B. Also, if I physically attach something to Misty’s head (like the mohawk), this may change how I need to tune the movements. The same occurs with the arms, I think.

Question 6: Is the expectation in the long-term that I should just put in the position, velocity, and duration values in a one line command, and Misty controls take care of everything, or are there strategies I should implement at the skill level to balance the tradeoffs?

As an example of what I was trying, the code below is me playing with nodding “no” in the yesnoAPI skill code:

Code 2 different ways.

Trying to overly define:

    //misty.MoveHead(double pitch, double roll, double yaw, double velocity, [int prePauseMs], [int postPauseMs]);
    misty.MoveHead(0, 0, -15, 100, 500, 3000);  //Move head right - fastest speed
    misty.MoveHead(0, 0, -30, 30, 500, 1000);  //Move head - last 15 degrees slower
    misty.Pause(700);
    misty.MoveHead(0, 0, 15, 100, 500, 3000);  //Move head left - fastest speed
    misty.MoveHead(0, 0, 30, 30, 500, 1000); 
    misty.Pause(700);

In the above code, I tried a number of different yaw positions, speeds, pause times.

Simple:

    misty.MoveHead(0, 0, -30, 80, 500, 1000);  //Move head right
    misty.Pause(700);
    misty.MoveHead(0, 0, 30, 80, 500, 1000); //move head left
    misty.Pause(700);

My Findings: The simple code both looks and works better. It seems that I am overthinking this. If I make the steps small, or try to map out points, Misty moves in a choppy motion. I think the answer is to just define the positions where I want the head to go and not try to do anything to “help” the positioning. Your comments and insight into this are appreciated.


@MorningR thanks for this update. I’m scheduled this sprint and next to do some more refinement on head and arm movements, so some of the things you mention will be addressed in that effort.

Small movements, great for relaying information and emotion, are quite difficult from a control perspective. I’m definitely going to be pushing to make movements as small as possible over the coming months. Smoothing out the feedback is step one (it’s the first thing on my plate when I start back on torso movements), and some tweaks to our controller will be step two. Not sure how small I’ll be able to make a step, but I’ll push to make it as small and controlled as possible.

Another BIG part of this next push on torso movements is to implement a movement profile in the movements. This allows me to control the acceleration/deceleration of the movement intentionally, rather than it being a “side effect” of the PPI controller. I will start with trapezoidal movements (constant acceleration for phase 1, constant velocity for phase 2, constant deceleration for phase 3), but I do hope to play around with sinusoidal functions (humans move in mostly sinusoidal movements). The two really big advantages to having this profile in the movement are that it will eliminate overshoot on fast movements and it will appear more lifelike right out of the box. I am considering ways to give control of profile parameters to the user.
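A toy version of the trapezoidal profile described above: constant acceleration, constant velocity, constant deceleration. This is a sketch of the general technique, not Misty's implementation, and it assumes the move is long enough to actually reach cruise speed (units are arbitrary).

```javascript
// Position along a trapezoidal velocity profile at time t.
// accel: constant acceleration rate; vmax: cruise velocity;
// distance: total travel. Assumes distance >= vmax^2 / accel.
function trapezoidalPosition(t, accel, vmax, distance) {
  var tAccel = vmax / accel;                    // time to reach cruise speed
  var dAccel = 0.5 * accel * tAccel * tAccel;   // distance covered accelerating
  var dCruise = distance - 2 * dAccel;          // distance at constant velocity
  var tCruise = dCruise / vmax;
  if (t <= tAccel) return 0.5 * accel * t * t;                     // phase 1
  if (t <= tAccel + tCruise) return dAccel + vmax * (t - tAccel);  // phase 2
  var td = t - tAccel - tCruise;                                   // phase 3
  return dAccel + dCruise + vmax * td - 0.5 * accel * td * td;
}
```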

Right now, when you assign a new command to the torso motors, it clears out some of the information the PPI cascade controller is using to hold a constant velocity, which is why the movement appears choppy when you assign multiple smaller commands and smoother when you assign just one big one. This is another thing that is going to change in this coming effort. Like I did with drive movements, there will be a controlled deceleration/acceleration, and that will allow me to better hold all the values between moves. This should result in smoother transitions between move commands, such that you can assign multiple, quick moves and it should appear very natural.

Somewhere in the roadmap for future functionality there should be a streaming velocity command that will let the user just say “go at this speed” and then it will be the responsibility of the skill to stream new commands down regularly. This will allow for better control from a joystick, for example.

So in your case, I’d suggest that the best bet is to assign a move and let the onboard controller handle it. Once I get this next push on torso movements into the release, you’ll see an improvement there.
