Welcome back to the fourth-week blog. In the previous posts, we discussed the proposal, the approach of using AR for the system, the sensor, the use of the Leap Motion and the software classes, and how the sensor is interfaced with the smartphone. In this week's post, the supported gestures are presented in detail.
AR was chosen over VR because it provides a far more interactive user experience. It is now time to understand how inputs are given to the system. All hand gestures are fed in through the Leap Motion sensor. There are two types of gestures: default and customized. The default gestures, pinch and grab, are easy to use. Programmers can also create customized gestures when needed; position, rotation and scaling are discussed under that heading. As with much else in technology, the customized gestures here are derived from the default ones. The position gesture, for example, is derived from the default pinch.
When you wish to move an object, simply pinch your fingers in front of the screen and the object moves virtually along with your hand.
The gestures are presented below with a brief note.
A pinch is performed with two fingers. One of them must be the thumb; the other can be any of the remaining four. When the two fingertips meet, the selected object turns green. The gesture is shown in Fig. 1.
Fig. 1 Pinching.
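The pinch check above can be sketched in a few lines of Python. This is an illustrative sketch only, not the Leap SDK API: the point coordinates and the `PINCH_THRESHOLD` value are hypothetical assumptions, standing in for the fingertip positions the sensor reports.

```python
import math

PINCH_THRESHOLD = 25.0  # mm; hypothetical value, tune for your sensor

def distance(a, b):
    """Euclidean distance between two 3-D points (x, y, z), in mm."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def is_pinch(thumb_tip, other_tip, threshold=PINCH_THRESHOLD):
    """A pinch is registered when the thumb tip and any other
    fingertip come within `threshold` millimetres of each other."""
    return distance(thumb_tip, other_tip) < threshold

# Thumb and index almost touching: a pinch
print(is_pinch((0, 100, 0), (10, 105, 0)))
```

In a real frame loop you would run this check against the thumb tip and each of the other four fingertips, and highlight the selected object green while any pair stays under the threshold.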
The next gesture is the grab, another default gesture. All five fingers are used: they are brought in close to the center of the palm. Figure 2 shows the grab gesture.
Fig. 2 A grab.
While this gesture is held, the hand turns blue; when it is released, the hand returns to its default color. The grab allows us to rotate and translate the object.
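A grab can be detected in much the same spirit: all five fingertips close on the palm center. Again, this is a hedged sketch rather than the sensor's actual API; the palm and fingertip coordinates and the `GRAB_THRESHOLD` are hypothetical.

```python
import math

GRAB_THRESHOLD = 40.0  # mm; hypothetical value, tune for your sensor

def distance(a, b):
    """Euclidean distance between two 3-D points (x, y, z), in mm."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def is_grab(palm_center, fingertips, threshold=GRAB_THRESHOLD):
    """A grab is registered when all five fingertips are within
    `threshold` millimetres of the palm centre."""
    return len(fingertips) == 5 and all(
        distance(tip, palm_center) < threshold for tip in fingertips
    )
```

The UI response described above (hand turns blue while grabbing, reverts on release) would be driven by the transitions of this boolean from frame to frame.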
Now it is important to understand the user-defined gestures. Beyond the defaults, developers are free to define gestures of their own choice, which increases flexibility and gives the programmer more options.
The first such derived gesture is position.
The position gesture is governed by the position of the palm, which acts as the reference point for tracking object movement. Another interesting detail: the direction of the hand is defined by where the middle finger points. To position an object, first pinch or grab it; it can then be moved and placed anywhere on the available screen. The diagram below shows how positioning is done.
Fig. 3 Positioning.
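Using the palm as the reference point, positioning reduces to applying the palm's frame-to-frame displacement to the held object. A minimal sketch, assuming simple (x, y, z) tuples for positions (a simplification of whatever the tracking layer actually provides):

```python
def update_position(obj_pos, palm_pos, prev_palm_pos):
    """While a pinch or grab is held, move the object by the same
    displacement the palm made since the previous frame."""
    return tuple(o + (p - q) for o, p, q in zip(obj_pos, palm_pos, prev_palm_pos))

# Palm moved 10 mm right and 5 mm up, so the object follows
print(update_position((0, 0, 0), (10, 5, 0), (0, 0, 0)))
```

Called once per frame while the selection gesture is held, this keeps the virtual object locked to the hand.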
As discussed a little earlier, rotation is also a derived gesture, accomplished with the grab. All five fingers are closed as shown in the figure below, and a color change can be seen immediately. Refer to Fig. 4 for a clearer view.
Fig. 4 Rotation.
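Since the hand's direction is defined by where the middle finger points, one way to drive rotation is to apply the change in that direction's yaw to the grabbed object. This is a sketch under that assumption, not the project's actual implementation; direction vectors are plain (x, y, z) tuples.

```python
import math

def yaw_angle(direction):
    """Yaw (rotation about the vertical axis) of a direction vector,
    computed from its x and z components."""
    x, _, z = direction
    return math.atan2(x, z)

def rotation_delta(prev_dir, curr_dir):
    """Angle in radians to apply to the grabbed object this frame."""
    return yaw_angle(curr_dir) - yaw_angle(prev_dir)
```

For example, a hand turning from pointing straight ahead to pointing right yields a delta of about +90 degrees, which is then added to the object's rotation.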
Finally, scaling can also be done.
Scaling, as everyone knows, is stretching (varying) the size of an object. Two hands are used for this task: both are placed in front of the screen, the object is selected, and as the distance between the two hands increases or decreases, the object stretches accordingly. The figure below gives a clearer picture of scaling.
Fig. 5 Scaling – A simple stretch.
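The two-hand stretch can be modelled as a ratio: the object's scale factor is the current distance between the palms divided by the distance at the moment of selection. A small sketch, with palm positions as hypothetical (x, y, z) tuples in millimetres:

```python
import math

def hand_distance(left_palm, right_palm):
    """Euclidean distance between the two palm centres, in mm."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(left_palm, right_palm)))

def scale_factor(initial_distance, current_distance):
    """Stretch factor relative to the hand separation at selection time.
    Values > 1 enlarge the object; values < 1 shrink it."""
    return current_distance / initial_distance

# Hands started 100 mm apart and are now 200 mm apart: object doubles in size
print(scale_factor(100.0, 200.0))
```

Multiplying the object's original size by this factor each frame reproduces the stretching behaviour described above.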
With this, we have covered all of the supported gestures, and they can be seen in the video demos shortly.
With that, I sign off for the week. Bye, take care!