Abstract
The Roemotion* Roy robotic arm is the result of a successfully funded Kickstarter project launched in 2012, which was described as a “project to create a human sized animatronic character from only laser cut mechanics and off the shelf hobby servos.” In this experiment, software has been developed using the Intel® RealSense™ SDK for Windows* to control the Roy hand using the SDK’s hand tracking APIs (Figure 1).
Figure 1. Robotic hand control software.
The code for this project was developed in C#/XAML using Microsoft Visual Studio* Community 2015, and works with both the Intel RealSense F200 and SR300 (coming soon) cameras. To see the software-controlled robotic hand in action, check out the YouTube* video: https://youtu.be/VQ93jw4Aocg
About the Roy Arm
The Roy arm assembly is currently available for purchase from the Roemotion, Inc. website in kit form, which includes:
- Laser cut pieces
- All necessary hardware
- 8 hobby-grade servos
- 6 servo extension cables
As stated on the Roemotion website, the kit does not include any control electronics. This is because the initial concept of the project was to supply cool mechanical systems for people to use with whatever controller they want. As such, this experiment incorporates a third-party servo controller for driving the motors in the robotic hand (Figure 2).
Figure 2. Roy robotic arm.
The hand incorporates six servo motors: one for each of the fingers (index, middle, ring, and pinky) and two for the thumb. (Note: there are two additional servos located in the base of the arm for controlling wrist movements, but these are not controlled in this experiment.)
Intel® RealSense™ SDK Hand Tracking APIs
As stated in the Intel RealSense SDK online documentation, the hand tracking module provides real-time 3D hand motion tracking and can track one or two hands, providing precise joint-level locations and positions. Of particular interest to this real-time device control experiment is the finger’s “foldedness” value acquired through calls to the QueryFingerData() method.
Control Electronics
This experiment incorporated a Pololu Micro Maestro* 6-channel USB servo controller (Figure 3) to control the six motors located in the Roy hand. This device includes a fairly comprehensive SDK for developing control applications targeting different platforms and programming languages.
Figure 3. Pololu Micro Maestro* servo controller.
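In addition to its SDK, the Maestro can be driven over its virtual COM port using Pololu's documented "compact" serial protocol, in which the Set Target command is a four-byte sequence (0x84, channel number, target low 7 bits, target high 7 bits) with the target expressed in quarter-microseconds. A minimal C# sketch is shown below; the class and method names are illustrative, and this project could equally use the higher-level wrappers in the Pololu SDK:

```csharp
using System.IO.Ports;

// Sketch: command one Maestro channel over the board's virtual COM port
// using Pololu's documented "compact" serial protocol.
class MaestroSerial
{
    // Set Target command: 0x84, channel, target low 7 bits, target high 7 bits.
    // The Maestro expects the target in units of quarter-microseconds.
    public static void SetTarget(SerialPort port, byte channel, ushort pulseWidthUs)
    {
        ushort target = (ushort)(pulseWidthUs * 4);
        byte[] cmd =
        {
            0x84,
            channel,
            (byte)(target & 0x7F),
            (byte)((target >> 7) & 0x7F)
        };
        port.Write(cmd, 0, cmd.Length);
    }
}
```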
Servo Controller Settings
Before custom software could be developed to directly control the robotic hand, it was essential to understand each finger's full range of motion in terms of servo control parameters. Unlike high-end robotic servos with integrated controllers, whose position encoders can be queried before torque is applied, the low-cost servos used in the Roy hand had to be energized cautiously; rapid motor movements could bind the fingers and potentially strip the motor gears.
Fortunately, the Pololu Micro Maestro SDK includes a Control Center app that allows a user to configure firmware-level parameters and save them to flash memory on the control board. The settings that were determined experimentally for this application are shown in Figure 4.
Figure 4. Pololu Maestro Control Center app.
Once the Min and Max position settings are fixed, the servo controller firmware will not allow the servos to be accidentally software-driven to a position that exceeds the desired range of motion. This is critical for this type of application, which has mechanical hard-stops (that is, fingers fully open or closed) that could cause a motor to burn out or strip gears if over-driven.
Another important setting for an application such as this is the “On startup or error” parameter, which in this case ensures the default starting (and error) position for all of the fingers is “open” to prevent binding of the index finger and thumb if they were allowed to close indiscriminately.
The final two noteworthy settings are the Speed and Acceleration parameters. These settings allow for motion smoothing at the firmware level, which is often preferable to higher-level filtering algorithms that can add latency and overhead to the main software application.
Note: In more advanced robotic servos that include integrated controllers, a proportional–integral–derivative controller (PID) algorithm is often implemented that allows each term to be flashed in firmware for low-level (that is, closer to the metal) feedback tuning to facilitate smooth motor translations without burdening the higher-level software.
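For illustration only, the discrete form of such a PID update might look like the following sketch; the gains and time step are placeholders, not values from any particular controller:

```csharp
// Minimal discrete PID update, illustrating the kind of feedback loop
// such controllers run in firmware. Gains (Kp, Ki, Kd) are hypothetical.
class Pid
{
    public double Kp = 1.0, Ki = 0.1, Kd = 0.01;
    double integral, prevError;

    // Returns a control output given the desired and measured positions
    // and the elapsed time dt (seconds) since the last update.
    public double Update(double setpoint, double measured, double dt)
    {
        double error = setpoint - measured;
        integral += error * dt;
        double derivative = (error - prevError) / dt;
        prevError = error;
        return Kp * error + Ki * integral + Kd * derivative;
    }
}
```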
Custom Control Software
In this experiment, custom software (Figure 5) was developed that leverages many of the hand tracking features that are currently present in the SDK samples.
Figure 5. Custom control software.
Although real-time fingertip tracking data is presented in the user interface, this particular experiment ultimately relied on the following three parameters for controlling the Roy hand:
- Alert data
- Foldedness data
- Scaled data
Alert Data
Alerts are the most important information to monitor in a real-time device control application such as this. It is paramount to understand (and control) how a device will behave when its set-point values become unreliable or unavailable.
In this experiment the following alert information is being monitored:
- Hand detected
- Hand calibrated
- Hand inside borders
The design of this software app precludes control of the robotic hand servos in the event of any alert condition. In order for the software to control the robotic hand, the user’s hand must be successfully calibrated and within the operating range of the camera.
As shown in the code snippet below, the custom app loops over the total number of fired alerts and sets three Boolean member variables: detectionStatusOk, calibrationStatusOk, and borderStatusOk (note that handOutput is an instance of PXCMHandData):
for (int i = 0; i < handOutput.QueryFiredAlertsNumber(); i++)
{
    PXCMHandData.AlertData alertData;

    if (handOutput.QueryFiredAlertData(i, out alertData) != pxcmStatus.PXCM_STATUS_NO_ERROR)
    {
        continue;
    }

    switch (alertData.label)
    {
        case PXCMHandData.AlertType.ALERT_HAND_DETECTED:
            detectionAlert = "Hand Detected";
            detectionStatusOk = true;
            break;
        case PXCMHandData.AlertType.ALERT_HAND_NOT_DETECTED:
            detectionAlert = "Hand Not Detected";
            detectionStatusOk = false;
            break;
        case PXCMHandData.AlertType.ALERT_HAND_CALIBRATED:
            calibrationAlert = "Hand Calibrated";
            calibrationStatusOk = true;
            break;
        case PXCMHandData.AlertType.ALERT_HAND_NOT_CALIBRATED:
            calibrationAlert = "Hand Not Calibrated";
            calibrationStatusOk = false;
            break;
        case PXCMHandData.AlertType.ALERT_HAND_INSIDE_BORDERS:
            bordersAlert = "Hand Inside Borders";
            borderStatusOk = true;
            break;
        case PXCMHandData.AlertType.ALERT_HAND_OUT_OF_BORDERS:
            bordersAlert = "Hand Out Of Borders";
            borderStatusOk = false;
            break;
    }
}
A test to determine if detectionStatusOk, calibrationStatusOk, and borderStatusOk are all true is performed before any attempt is made in the software to control the hand servos. If at any time one of these flags is set to false, the fingers will be driven to their default Open positions for safety.
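This gating logic can be sketched as follows; apart from the three status flags, the method and helper names here are assumptions, not the project's actual identifiers:

```csharp
// Sketch: only drive the servos when every monitored alert condition is clear.
// DriveFingersFromFoldedness() and Hand.MoveFingersToDefault() are
// hypothetical helpers standing in for the app's actual control methods.
private bool TrackingReliable()
{
    return detectionStatusOk && calibrationStatusOk && borderStatusOk;
}

private void UpdateHand()
{
    if (TrackingReliable())
    {
        DriveFingersFromFoldedness(); // apply the scaled set-points
    }
    else
    {
        Hand.MoveFingersToDefault();  // fail safe: fingers fully open
    }
}
```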
Foldedness Data
The custom software developed in this experiment makes calls to the QueryFingerData() method, which returns the finger’s “foldedness” value and fingertip radius. The foldedness value is in the range of 0 (finger folded) to 100 (finger extended).
The foldedness data for each finger is retrieved within the acquire/release frame loop as shown in the following code snippet (where handData is an instance of PXCMHandData.IHand):
PXCMHandData.FingerData fingerData;

handData.QueryFingerData(PXCMHandData.FingerType.FINGER_THUMB, out fingerData);
thumbFoldeness = fingerData.foldedness;
lblThumbFold.Content = string.Format("Thumb Fold: {0}", thumbFoldeness);

handData.QueryFingerData(PXCMHandData.FingerType.FINGER_INDEX, out fingerData);
indexFoldeness = fingerData.foldedness;
lblIndexFold.Content = string.Format("Index Fold: {0}", indexFoldeness);

handData.QueryFingerData(PXCMHandData.FingerType.FINGER_MIDDLE, out fingerData);
middleFoldeness = fingerData.foldedness;
lblMiddleFold.Content = string.Format("Middle Fold: {0}", middleFoldeness);

handData.QueryFingerData(PXCMHandData.FingerType.FINGER_RING, out fingerData);
ringFoldeness = fingerData.foldedness;
lblRingFold.Content = string.Format("Ring Fold: {0}", ringFoldeness);

handData.QueryFingerData(PXCMHandData.FingerType.FINGER_PINKY, out fingerData);
pinkyFoldeness = fingerData.foldedness;
lblPinkyFold.Content = string.Format("Pinky Fold: {0}", pinkyFoldeness);
Scaled Data
After acquiring the foldedness data for each of the user’s fingers, scaling equations are processed to map these values to the full-scale ranges of each robotic finger. Each full-scale value (that is, the control pulse width, in microseconds, required to move the finger either fully opened or fully closed) is defined as a constant in the servo.cs class:
// Index finger
public const int INDEX_OPEN = 1808;
public const int INDEX_CLOSED = 800;
public const int INDEX_DEFAULT = 1750;
. . .
Individual constants are defined for each finger on the robotic hand, which match the Min and Max servo parameters that were flashed in the Micro Maestro controller board (see Figure 4). Similarly, the full-scale range of the finger foldedness data is defined in the software:
int fingerMin = 0;
int fingerMax = 100;
Since the finger foldedness range is the same for all fingers (that is, 0 to 100), the range only needs to be defined once and can be used for the data scaling operation performed for each finger as shown below:
// Index finger
int indexScaled = Convert.ToInt32((Servo.INDEX_OPEN - Servo.INDEX_CLOSED) *
    (index - fingerMin) / (fingerMax - fingerMin) + Servo.INDEX_CLOSED);
lblIndexScaled.Content = string.Format("Index Scaled: {0}", indexScaled);
Hand.MoveFinger(Servo.HandJoint.Index, Convert.ToUInt16(indexScaled));
. . .
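Because the same linear mapping applies to every finger, it could equally be factored into a single helper; a sketch (the method name is an assumption, and the example values are the index-finger constants defined above):

```csharp
// Map a foldedness value (0 = folded, 100 = extended) onto a servo pulse
// width in microseconds, where "open" corresponds to an extended finger.
static int ScaleFoldedness(int foldedness, int servoOpen, int servoClosed)
{
    const int fingerMin = 0, fingerMax = 100;
    return (servoOpen - servoClosed) * (foldedness - fingerMin)
        / (fingerMax - fingerMin) + servoClosed;
}

// A foldedness of 100 maps to the "open" pulse width (1808 for the index
// finger), and a foldedness of 0 maps to the "closed" pulse width (800).
```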
Check Out the Video
To see the robotic hand in action, check out the YouTube video here: https://youtu.be/VQ93jw4Aocg
Summary
This software experiment took only a few hours to implement, once the basic control constraints of the servo motors were tested and understood. The Windows 10 desktop app was developed in C#/XAML, and it leveraged many of the features present in the Intel RealSense SDK APIs and code samples.
About Intel® RealSense™ Technology
To learn more about the Intel RealSense SDK for Windows, go to https://software.intel.com/en-us/intel-realsense-sdk.
About the Author
Bryan Brown is a software applications engineer in the Developer Relations Division at Intel.