Gesture Recognition

Hi everybody,

Can someone help me? I want to add another pose, such as LABEL_POSE_THUMB_UP, but I cannot find how to proceed.



At the moment I'm just creating it using hand and finger positions...


I want to retrieve the finger/hand positions with the SDK. Can you help me?


When you download the SDK, you get a lot of samples. Among them, there is one for gesture recognition. It has what you are looking for.


I've already seen it, but it doesn't have what I'm looking for.

Does the SDK do some kind of image processing, for example a comparison between stored images and the ones captured by the camera?

The nodes that you receive are discrete positions, so you have to decide which nodes you want to receive. I say "node" because you aren't actually tracking fingers; you're tracking points that just happen to coincide with certain points on your hand. I'm currently working on an article that will show how you can create a custom gesture using these nodes; hopefully this will clarify things for you.

Peter, can you please send me the article that shows how to create a custom gesture?


Hi everybody,

I would like to know the purpose of the hexadecimal numbers in this code:

    struct Gesture {
        typedef pxcEnum Label;          // The pose/gesture identifier (bit-OR'ed value of set and detailed identifiers)
        enum {
            LABEL_ANY=0,                                // No gesture
            LABEL_MASK_SET          =   0xffff0000,     // AND this mask with the pose/gesture identifier to get the pose/gesture set.
            LABEL_MASK_DETAILS      =   0x0000ffff,     // AND this mask with the pose/gesture identifier to get the pose/gesture details within a set.

            LABEL_SET_HAND          =   0x00010000,     // Set of hand gestures
            LABEL_SET_NAVIGATION    =   0x00020000,     // Set of navigation gestures
            LABEL_SET_POSE          =   0x00040000,     // Set of poses
            LABEL_SET_CUSTOMIZED    =   0x00080000,     // Set of customized poses/gestures

            /* predefined navigation gestures */
            LABEL_NAV_SWIPE_LEFT    =   LABEL_SET_NAVIGATION+1,     // The swipe left navigation gesture
            LABEL_NAV_SWIPE_RIGHT,                                  // The swipe right navigation gesture
            LABEL_NAV_SWIPE_UP,                                        // The swipe up navigation gesture
            LABEL_NAV_SWIPE_DOWN,                                   // The swipe down navigation gesture

            /* predefined common hand gestures */
            LABEL_HAND_WAVE         =    LABEL_SET_HAND+1,          // The wave hand gesture
            LABEL_HAND_CIRCLE,                                      // The circle hand gesture
            /* predefined common hand poses */
            LABEL_POSE_THUMB_UP     =    LABEL_SET_POSE+1,          // The thumb up pose
            LABEL_POSE_THUMB_DOWN,                                  // The thumb down pose
            LABEL_POSE_PEACE,                                       // The peace/victory pose
            LABEL_POSE_BIG5,                                        // The big5 pose
        };
    };


Each gesture enumerator value combines two values: the set value (the upper 16 bits, at or above 0x10000) and the gesture identifier within that set (the lower 16 bits, below 0x10000). You can define a customized set and add labels to it.


What do you mean when you said: "You can define a customized set and add labels to it"?

And how can I do this?


This looks complicated. I was planning to just use Hidden Markov Models to recognize gestures. I am sure you have heard of them; if not, here is something you can start with.


Seif Eddine B. wrote:

What do you mean when you said: "You can define a customized set and add labels to it"?

And how can I do this?

Meaning that you could extend the existing list in your own space by adding new ones (this should be done by continuing the numeric IDs from the predefined ones). But defining the new enum tag itself is not sufficient, since it is only a symbolic tag; you would have to write your own recognition algorithm for the new gesture too. ;-)

Thanks akhilesh for the information, but I don't think we need Hidden Markov Models to recognize gestures. As per my understanding of the library, we would need to store a depth image and compare the depth image of every frame against it. If a frame matches the stored depth image, then the custom gesture is matched.

I am also trying to create a custom gesture, but the lack of clear documentation is making it difficult.

Shyam, it is not necessary to compare depth images.

If you, for instance, want a gesture that shows all 5 fingers with the palm facing the camera, then you would look for a hand which has all its fingers open and the palm normal vector pointing at the camera. Or you could have a gesture which is activated when the hand is held in one place for (say) 2 seconds, which means that (for example) the palm coordinates do not change a lot over that timespan (just some simple calculations).

Write a program which lets you see all the parameters available, and then show your hand the way you want your gesture to be. Take the most important parameters, add a tolerance of something like +/- 20%, and there you go.

This is just how I am doing my gestures, and probably not very well, but I hope it gives you a basic understanding.

WolfTW, can you please share your code? ;)

I'd like to, but if you are working with Intel's SDK it won't be of much use to you, as I am primarily working with SoftKinetic's iisu Interaction Designer. The programs do the same thing, but my script is written in Lua (that's a scripting language).

I don't have a lot of experience with C++ or C#, so I find their SDK easier to use. Give it a try yourself!
