Archived - Working with Hands/Gestures

The Intel® RealSense™ SDK has been discontinued. No ongoing support or updates will be available.

There are multiple Hand Modalities in the Intel® RealSense™ SDK 2014 Gold R1 release. They include:

Skeleton Tracking (articulated hand and finger joint tracking)

  • 22 joints per hand
  • Includes Left vs Right hand recognition.
  • R1 SDK can track 2 hands (can be 2 right hands from 2 users).
  • Joint tracking doesn't require calibration; it uses generic hand proportions.
  • Provides a normalized skeleton even if part of the tracked hand passes
    out of the FOV or is occluded (or if there is too much noise).
  • Allows the background to be segmented from the hand (e.g., to replace the background).
  • Provides Hand openness and Finger foldedness values from fully closed to
    fully open (0-100); only available in Skeletal Tracking mode.
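The 0-100 openness value lends itself to simple thresholding in application code. A minimal sketch (the helper name and thresholds are illustrative, not part of the SDK):

```cpp
#include <algorithm>
#include <string>

// Hypothetical helper: interpret the SDK's 0-100 hand-openness value
// (0 = fully closed fist, 100 = fully open hand) as a coarse app state.
// The thresholds are illustrative, not SDK-defined.
std::string ClassifyOpenness(int openness) {
    openness = std::max(0, std::min(100, openness));  // clamp to valid range
    if (openness < 25) return "closed";
    if (openness < 75) return "partial";
    return "open";
}
```

Clamping first keeps out-of-range sensor readings from mapping to a nonsense state.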

 

Contour Mode - aka Blob tracking.

  • Blobs are any connected components (fist, open hand, hand holding something, 2 hands connected).
  • This mode provides segmenting which helps with tracking the hand without including the forearm.
  • Contour Mode has its own smoothing.
  • Each blob has a mask, a contour line, a pixel count and extremity points.
  • R1 SDK can track up to 2 blobs.
  • You can set the minimum blob size, the maximum depth of the object, and the distance of the 'virtual wall'.
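The blob limits above amount to a simple filter pass. A self-contained sketch (the `Blob` struct and parameter names are hypothetical stand-ins for what Contour Mode reports):

```cpp
#include <vector>

// Hypothetical blob record mirroring what Contour Mode reports per blob:
// a pixel count and a representative depth in millimetres.
struct Blob {
    int   pixelCount;
    float depthMm;
};

// Keep only blobs that pass the limits the text describes: a minimum blob
// size, a maximum object depth, and a 'virtual wall' distance beyond which
// blobs are ignored. R1 tracks at most 2 blobs, so stop after two.
std::vector<Blob> FilterBlobs(const std::vector<Blob>& blobs,
                              int minPixels, float maxDepthMm,
                              float virtualWallMm) {
    std::vector<Blob> kept;
    for (const Blob& b : blobs) {
        if (b.pixelCount < minPixels) continue;   // too small
        if (b.depthMm > maxDepthMm) continue;     // too far from the camera
        if (b.depthMm > virtualWallMm) continue;  // behind the virtual wall
        kept.push_back(b);
        if (kept.size() == 2) break;              // R1 limit: 2 blobs
    }
    return kept;
}
```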

 

Gesture Recognition - 10 preset gestures provided in R1 (see below).

  • Static (non-moving 'poses') and Active (a pose in motion).
    Example: a static open hand can become Active as a Wave.
  • Note: Active preset gestures require fairly specific motions
    (see especially swipe and tap).
  • All dynamic gestures require recognition/registration and a termination pose. The dynamic motion can be set by movement or time.
  • Gestures usually peak at 30 FPS.
  • Multiple gestures: Don't use similar gestures together
    (for example: thumbs up/down and fist).
  • If both a static and an Active gesture are based on the same 'pose',
    both will be recognized, which will confuse interactions.
  • See also Touchless Controller Gestures
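The "don't enable similar gestures together" rule can be enforced with a small guard at registration time. A sketch (the gesture names and the conflict table are illustrative, not SDK constants):

```cpp
#include <set>
#include <string>
#include <utility>
#include <vector>

// Hypothetical guard: enable only the gestures a given app section needs,
// and refuse pairs known to conflict (e.g. a static pose plus an Active
// gesture built on the same pose).
class GestureSet {
public:
    GestureSet() {
        // Illustrative conflict pairs, per the guidance above.
        conflicts_ = {{"spreadfingers", "wave"},   // wave starts from an open hand
                      {"thumb_up", "thumb_down"}}; // too similar to distinguish
    }
    bool Enable(const std::string& gesture) {
        for (const auto& c : conflicts_) {
            if (c.first == gesture && enabled_.count(c.second)) return false;
            if (c.second == gesture && enabled_.count(c.first)) return false;
        }
        enabled_.insert(gesture);
        return true;
    }
    bool IsEnabled(const std::string& g) const { return enabled_.count(g) > 0; }
private:
    std::set<std::string> enabled_;
    std::vector<std::pair<std::string, std::string>> conflicts_;
};
```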

 

Touchless Controller

  • Useful for navigation, cursor replacement. Swipes are in this module.
  • tc->AddGestureActionMapping(L"swipeLeft", PXCTouchlessController::Action_NextTrack);
    tc->AddGestureActionMapping(L"swipeRight", PXCTouchlessController::Action_PrevTrack);
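The gesture-to-action mapping pattern can be sketched without the SDK types; the `ActionMapper` class below is a hypothetical stand-in that dispatches a recognized gesture name to a registered action:

```cpp
#include <functional>
#include <map>
#include <string>
#include <utility>

// Self-contained sketch of the Touchless Controller's mapping pattern:
// each recognized gesture name dispatches to a registered action.
// Mirrors AddGestureActionMapping without depending on the SDK.
class ActionMapper {
public:
    void AddGestureActionMapping(const std::wstring& gesture,
                                 std::function<void()> action) {
        actions_[gesture] = std::move(action);
    }
    // Called when the recognizer reports a gesture; returns false if unmapped.
    bool OnGesture(const std::wstring& gesture) {
        auto it = actions_.find(gesture);
        if (it == actions_.end()) return false;
        it->second();
        return true;
    }
private:
    std::map<std::wstring, std::function<void()>> actions_;
};
```

Keeping the mapping table per app section makes it easy to enable only the gestures that section actually uses.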

 

Best Practices:

  • User interactions should begin with the user briefly making the Big 5 ("spreadfingers") gesture to give the camera time to recognize and calibrate the hand (true for all hand modalities). Identifying the hand is slow and resource intensive, so work to avoid re-acquisition: warn users when they're about to leave the FOV, warn users to slow their motions down, etc.
  • ONLY enable the gestures used per section (don't just enable all)
  • LIMIT the number of joints tracked to those required for the motion.
  • Folding fingers are interdependent: you can use only two tracked joints and still make the hand's four fingers bend realistically. Only the index finger can be bent individually, or remain unbent while other fingers bend, so it needs its own joint tracking. For the other fingers: bend the middle finger and the ring and pinky are pulled down; bend the ring finger and the middle and pinky are pulled down; bend the pinky finger and the middle and ring are pulled down.
  • Rotation: The degrees that a particular section of a human finger can rotate is determined by the length of the section (not the joint). The fingertip of the middle finger bends less than the finger's middle section, and the middle section bends less than the base section. You only need a tracking script covering the index (Joint_Index_IT1 in all joints) and the middle, ring and pinky fingers (Joint_Middle_ITI in all joints).
  • Children's Hands: Small hands are difficult to acquire. In Gold R1, please allow 5 seconds to lock in and acquire.
  • Turn off unused modalities and only call the specific functions required. You don't usually need skeletal tracking when using gestures (and vice versa). Don't use contour while using skeletal or gestures. Especially watch hand_facade.
  • Use a separate thread for each modality (except 2-hand gestures). This helps run only the required modality for each section, but remember that thread switching can be CPU intensive.
  • Don't wait for frames from the hand modalities.
  • Provide visual feedback to the user - Hand Not Detected, Hand Not Calibrated, Hand Out of Borders, Hand moving too fast - a viewport is good for this. Also help the user calibrate their range and know the gestures by providing an upfront tutorial.
  • Unity tips: Make use of the hand_lost rule. Also reduce jerkiness/separations by setting Virtual and Real World Box values to 0.
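The finger-interdependence rule above (two tracked values driving four fingers) can be sketched as follows; the struct, function name, and 0-100 fold scale are illustrative, not SDK-defined:

```cpp
#include <algorithm>

// Fold values run 0 (straight) to 100 (fully folded).
struct FingerFolds { int index, middle, ring, pinky; };

// Two tracked values drive all four fingers: the index fold is independent,
// while a single fold value is shared by the middle/ring/pinky group, since
// bending any one of those pulls the other two down with it.
FingerFolds CoupleFolds(int indexFold, int groupFold) {
    auto clamp = [](int v) { return std::max(0, std::min(100, v)); };
    FingerFolds f;
    f.index  = clamp(indexFold);  // independent; needs its own joint tracking
    f.middle = clamp(groupFold);
    f.ring   = clamp(groupFold);  // pulled down with the group
    f.pinky  = clamp(groupFold);
    return f;
}
```

A full rig could weight the three coupled fingers slightly differently, but sharing one value already gives a realistic-looking fold with minimal joint tracking.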
