by MarcoP » Fri Jun 22, 2012 11:35 am
Hello
For those who did not know, some time ago we did a small demo with a Robobuilder being controlled by a Kinect sensor. That demo only read the position of the hand and moved the robot arm up and down to mimic the hand movement.
We have now improved on that to include full upper body tracking. This means the Robobuilder should be able to mimic the position of the arms much more closely by tracking the shoulder, elbow and wrist (within its own DoF limitations):
[Edit by Pedro, July 9th 2012] The source code and executable for the new version with Full Upper Body Tracking can be downloaded here: http://robosavvy.com/RoboSavvyPages/Rob ... _SDKv1.zip
- If you just want to run the software (without editing any source code), follow the same instructions as for the previous version: http://robosavvy.com/forum/viewtopic.php?p=33879#33943
- If you want to edit the code you'll need some more dependencies; check here: http://robosavvy.com/forum/viewtopic.php?p=33879#33892
An important heads-up: this code is designed for Kinect SDK v1.0.
How we've implemented it:
WARNING! Math content ahead!
A human shoulder has three degrees of freedom, meaning it can move in three different ways: pitch, roll and yaw. A human elbow has one degree of freedom: pitch. More info here.
Our objective was to copy those movements onto the Robobuilder arm. However, the Robobuilder arm does not have shoulder roll, so we had to sacrifice one degree of freedom.
For those more familiar with these subjects: what we are doing here is not inverse kinematics. We do not want to move the robot hand to a specific position in space, but rather move the robot arm in a way similar to the human movement. This means we only need to work with angles.
The Kinect outputs joint positions in the coordinate system described at the end of this site.
Since we need two angles for the two shoulder servos, a spherical coordinate system seems adequate.
The steps performed for each side of the body are:
- The 3D position of the elbow is subtracted from the 3D position of the shoulder. This gives a 3D vector describing the orientation of the upper arm relative to the body.
- That vector is then expressed in a coordinate system whose Z axis points outward from the shoulder joint.
- This lets us obtain the azimuth and inclination angles using these formulas (a sketch follows after this list).
(All of this assumes the person is facing the Kinect, so some strange results may occur if that is not the case.)
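To make those steps concrete, here is a minimal Python sketch of the shoulder-angle computation. It is illustrative only, not the project code (the download targets Kinect SDK v1.0); in particular, the axis relabelling is an assumption about what "outward from the shoulder" means for each arm.

```python
import math

def shoulder_angles(shoulder, elbow, right_side=True):
    """Sketch of the shoulder-angle computation described above.

    shoulder, elbow: (x, y, z) joint positions in Kinect skeleton
    space.  The axis relabelling below is an illustrative assumption.
    """
    # Step 1: subtract the elbow position from the shoulder position
    # to get a vector along the upper arm (as described in the post).
    vx = shoulder[0] - elbow[0]
    vy = shoulder[1] - elbow[1]
    vz = shoulder[2] - elbow[2]

    # Step 2: relabel the axes so the new Z points outward from the
    # shoulder.  Assumption: outward is +x for one arm and -x for the
    # other, which only holds while the person is facing the Kinect.
    zp = vx if right_side else -vx   # new Z: outward from the shoulder
    xp = vz                          # new X (assumed): along the sensor axis
    yp = vy                          # new Y: up

    # Step 3: standard spherical coordinates of the relabelled vector:
    #   r           = sqrt(x'^2 + y'^2 + z'^2)
    #   inclination = acos(z' / r)   -- angle away from the outward axis
    #   azimuth     = atan2(y', x')  -- rotation about the outward axis
    r = math.sqrt(xp * xp + yp * yp + zp * zp)
    inclination = math.acos(max(-1.0, min(1.0, zp / r)))
    azimuth = math.atan2(yp, xp)
    return azimuth, inclination
```

The two returned angles are what would be mapped onto the two shoulder servos.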
For the elbow angle, another approach is used: subtracting the wrist position from the elbow position gives the vector corresponding to the forearm. We can then use this method to obtain the angle between the upper arm and the forearm, which is used for the elbow (see the sketch below).
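A matching sketch for the elbow, assuming "this method" is the standard angle-between-two-vectors formula, cos(theta) = (a . b) / (|a| |b|):

```python
import math

def elbow_angle(shoulder, elbow, wrist):
    """Angle between the upper-arm and forearm vectors, via the
    standard dot-product formula."""
    # Upper arm: elbow subtracted from shoulder (as above).
    a = [s - e for s, e in zip(shoulder, elbow)]
    # Forearm: wrist subtracted from elbow (as in the post).
    b = [e - w for e, w in zip(elbow, wrist)]
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    # Clamp to [-1, 1] to guard against floating-point drift.
    return math.acos(max(-1.0, min(1.0, dot / (norm_a * norm_b))))
```

With this convention a straight arm gives an angle near 0 and a fully bent arm an angle near pi.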
Because of the limited degrees of freedom, only the elbow angle is copied, not the forearm direction. This means that in some cases the movement is not correct. Still, we think this works out nicely.
This can also be seen in the video of another demo we prepared, in which the Kinect rotates to track a person.
I did the math for this, so if you have any questions, let me know. Pedro did most of the programming, so he will follow up with details on that.
Regards