The Intel® RealSense™ SDK has been discontinued. No ongoing support or updates will be available.
By Mark Forbes
Created by AFTER-MOUSE.COM USA INC., the Clean’Move mouse driver provides a gesture-based, no-touch PC interface that allows users to view and manipulate data while working in a sterile environment, such as an operating room or clean room. For example, with Clean’Move, users in operating rooms employ their hands to make mouse-input motions directly in front of a PC located outside the room. One hand moves the mouse pointer, and the other hand clicks the mouse button. Using these two simple gestures enables easy access to critical information without users having to leave the clean room environment.
This case study describes how AFTER-MOUSE.COM developed Clean’Move to meet the needs of users who have a wide range of computer experience and work in extremely challenging computing environments.
The App Request
AFTER-MOUSE.COM was approached by a pharmaceutical company to create a hands-free mouse driver app that would allow users in sterile environments easier access to computing resources. Currently, in many cases, workers in a sterile environment must go to a separate room to use a computer and then re-scrub and re-dress before returning to the sterile room. In other instances, the computer operator is outside the operating room and instructed by the doctor as to what data is needed. Obviously, these methods are quite inefficient.
The company wanted an app that could be used on a Windows* 8 desktop platform stationed behind a glass wall that separated it from the sterile environment. The app needed a gesture-controlled interface so that surgical assistants, for example, could easily access data such as medical records and diagnostic images.
In addition to placement of the PC outside the sterile room, accuracy was critical. Misinterpretation of a gesture could lead to lost time, and data-access errors during a surgery or clean-room procedure could bring catastrophic results. Moreover, the gestures had to be recognizable even when made by workers wearing sterile garb, and they had to closely mimic the intuitive gestures that tablet and smartphone users were already accustomed to.
Although the sponsoring company had experimented with traditional touch screens, it found that keeping touch screens clean to the level required by a sterile environment was very difficult. Moving the computer outside of the sterile environment is a key feature of Clean’Move because it eliminates extra cleaning steps for workers and allows servicing without the need to enter the sterile room.
The Intel® RealSense™ SDK
After developing the original Clean’Move app using Microsoft Kinect* technology, the AFTER-MOUSE.COM team ported it to the Intel® RealSense™ SDK to improve the tracking and close-in interaction with users.
To activate the app, a user faces the camera and raises one hand. At that point, the user’s hand becomes the mouse and moves the pointer around the screen. When a button-click selection is needed, the other hand clicks the mouse by “pushing” toward the camera.
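The two-hand scheme described above can be sketched in pseudocode-style Python: one hand’s tracked position drives the cursor, and the other hand triggers a click when it moves toward the camera. This is a minimal illustration only; the thresholds, resolution, and hand-tracking input are assumptions, not the actual Clean’Move implementation or RealSense SDK calls.

```python
# Illustrative sketch (assumptions, not Clean'Move's real code):
# map a normalized hand position to screen pixels, and detect a
# "push" click from a decrease in the clicking hand's depth.

SCREEN_W, SCREEN_H = 1920, 1080   # assumed display resolution
PUSH_THRESHOLD_M = 0.08           # assumed depth change that counts as a click

def hand_to_cursor(x_norm, y_norm):
    """Map a normalized hand position (0..1 in each axis) to pixel
    coordinates, clamping values that stray outside the camera frame."""
    x = min(max(x_norm, 0.0), 1.0)
    y = min(max(y_norm, 0.0), 1.0)
    return int(x * (SCREEN_W - 1)), int(y * (SCREEN_H - 1))

def is_push_click(rest_depth_m, current_depth_m):
    """A click fires when the clicking hand moves toward the camera
    (i.e., its depth decreases) by more than the threshold."""
    return (rest_depth_m - current_depth_m) > PUSH_THRESHOLD_M
```

In a real driver, the cursor position and click events would then be handed to the operating system’s input-injection API; that wiring is omitted here.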
The Intel RealSense SDK provided increased accuracy and usability close to the separation glass. (Users previously had to maintain some distance from the glass, which was often difficult in the tight working space of a clean room or operating room.) Additional benefits of Intel® RealSense™ technology include higher gesture-recognition accuracy and a greatly reduced number of false gesture recognitions, which can cause delays and possibly result in data-retrieval errors.
Further, the Intel RealSense SDK provided better subject tracking. This is critical in operating rooms and clean rooms, which tend to be crowded. Because Clean’Move tracks the correct subject and ignores other movement, the result is better control of the mouse.
“The clean room limits the type and number of gestures that can be employed,” said Nicolas Chaillan, AFTER-MOUSE.COM founder and CEO. “It’s an exciting use of the technology because Clean’Move lets scientists interact where they couldn’t before. And this app isn’t even using all the features of Intel RealSense technology. That’s what I like about the technology… it has many layers with a lot of capabilities.”
Challenges and Solutions
AFTER-MOUSE.COM recognized several challenges and extensively tested prototype solutions with users. The primary challenge was tracking; they wanted Clean’Move to track the current user as he or she moved and, just as important, not track the person nearby. Next, the number and types of gestures had to be established and verified. Finally, because gesture-controlled computing can be physically fatiguing if users must hold up their arms or hands for long periods of time, the development team had to make sure the Clean’Move gestures were simple and fast.
The Clean’Move team spent a great deal of time developing and adjusting the tracking features. In addition to investigating things like how the app responded when a user held an object in the hand that wasn’t performing a gesture, the team strove to learn how scientific and surgical staff users actually work. They wanted Clean’Move to recognize and track users only when they wanted to interact with the PC. Normal movements could not be interpreted as gestures; only specific targeted gestures would be recognized and acted upon.
During the development process, AFTER-MOUSE.COM conducted several gesture tests and learned that too many gestures were counter-productive. The users either became frustrated at having to learn them, or they grew tired after using several different gestures throughout the work day. In both cases, users wanted to give up on the concept. This fact further validated AFTER-MOUSE.COM’s hypothesis: fewer, simpler gestures would make Clean’Move significantly more usable.
Additionally, they found that the scientific and surgical groups operated the Clean’Move prototype in a very similar way. “We saw that most users liked to stop what they were doing and go to the computer. They typically had both hands free, faced the screen directly, and assumed a standstill position that said, ‘I need the computer now’,” said Chaillan.
To keep Clean’Move from tracking the wrong person, they kept the interface very simple. That is, the app only tracks when a user faces the computer with empty hands. The app stops tracking when the user turns away from the screen or turns their back to it. The team also defined a working range to encompass a specific area where users actually interact with the computer so Clean’Move would ignore movements outside that range.
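The tracking rules above reduce to a simple gate: track only a user who faces the screen with empty hands and stands inside the defined working range. The sketch below illustrates that logic; all names and limits are hypothetical values chosen for illustration, not the actual Clean’Move parameters.

```python
# Illustrative sketch of the tracking rules (assumed values):
# the app tracks only when the user faces the camera with empty
# hands and is inside the defined working range.

MIN_RANGE_M, MAX_RANGE_M = 0.5, 1.5   # assumed working-range bounds (meters)
MAX_FACE_YAW_DEG = 30.0               # assumed tolerance for "facing the screen"

def should_track(face_yaw_deg, hands_empty, distance_m):
    """Return True only when all three tracking conditions hold:
    the user faces the camera (small head yaw), has empty hands,
    and stands within the working range."""
    facing = abs(face_yaw_deg) <= MAX_FACE_YAW_DEG
    in_range = MIN_RANGE_M <= distance_m <= MAX_RANGE_M
    return facing and hands_empty and in_range
```

A user who turns away (large yaw), picks up an object, or steps outside the working range immediately fails the gate, which matches the behavior described above.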
The Clean’Move app recognizes very simple gestures. To activate the app, users simply face the computer screen and move one hand. The activation hand, whether left or right, also acts as the computer mouse to move the pointer; it can also be used to scroll through data. The other hand performs the “select” action (i.e., a mouse click) with a push-forward motion.
The interface is completely intuitive. “Face the device, move one hand… so simple. It’s not waving, it’s not complicated; users just look at the device and move a hand,” said Chaillan. Further, to ensure success, AFTER-MOUSE.COM provides face-to-face training once Clean’Move is installed.
AFTER-MOUSE.COM learned that the more gestures used and the more complex they were, the greater the user fatigue. Thus, the Clean’Move gestures are limited to an activation motion, mouse movement, and mouse clicks so that fatigue isn’t an issue.
“You want to limit the type of gestures that require raising the shoulders. Instead we just use quick movements of the arm,” explained Chaillan.
Working with Intel
AFTER-MOUSE.COM and Intel have been working together for several years, starting with all-in-one multi-touch screens and devices. When Intel approached them with Intel RealSense technology, AFTER-MOUSE.COM was eager to take part in the development program.
While they have enjoyed the process, there was one issue with the Intel RealSense SDK that had to be resolved to make Clean’Move viable. In the alpha version, tracking would sometimes break lock with the subject and move too slowly. AFTER-MOUSE.COM sent Intel a video of the problem and then worked closely with Intel engineers to resolve it for the Intel® RealSense™ SDK Gold release. The tracking is now smooth and quick.
Advice for Developers
For developers working in challenging environments, such as clean rooms and sterile environments, Chaillan offers valuable advice.
- Keep gestures simple, intuitive, and easy-to-use so that extensive training isn’t required for users to be quickly successful with the app.
- Use gestures that mimic natural movements and mouse usage, such as using a hand motion to move the pointer and using the other hand in a pushing motion to execute a mouse click. Gestures should allow users to maintain their same working routines with minimal additional learning.
- When using gestures in a work environment, fatigue can become a significant factor—to the point where users reject the app. Limit the number of gestures that require users to raise their shoulders, and limit the range of the gesture motions.
Chaillan noted, “Know when you’re done and don’t use too many gestures. Intel RealSense technology is capable of many, many gestures—from voice to precise finger actions. If you include too many gestures in an app, or include gestures that aren’t intuitive, you’ll have to teach users how to use them and they will have to remember all those gestures. If your interface is too complex, people won’t use the app.”
Other Uses for Intel® RealSense™ Technology
AFTER-MOUSE.COM envisions many opportunities in the healthcare field that could incorporate Intel RealSense technology. Of particular interest are applications in emergency rooms, which they view as an underserved market. In an ER environment, Intel RealSense technology could be used on computers in the triage area. For example, nurses might use gestures to enter vital-sign readings without leaving the patients. (Because the ER is noisy and its sounds potentially confusing, AFTER-MOUSE.COM suggests that gesture input is preferable to voice input there.) Further, facial recognition would ensure security and limit access to patient records by unauthorized personnel. Additional intake information, such as patient history, symptoms, and other medical and insurance data, could be entered via voice or gestures, again keeping the nurse in contact with the patient.
Inside the ER, the medical staff could access a similar gesture-controlled system to get patient information and add findings and diagnoses. Medical devices that move with the patient, such as blood pressure and respiration monitors, could also integrate Intel RealSense technology, unifying the ER model. Those devices could also provide access to medical images and data in much the same way the surgeons gain access with Clean’Move.
AFTER-MOUSE.COM has many apps in markets outside of the healthcare industry and they see a great future for Intel RealSense technology in those markets as well. They have already ported most of their business-to-business apps from Kinect to the Intel RealSense SDK to leverage the benefits of facial and voice recognition, as well as other RealSense technology features. In fact, the layering of features in Intel RealSense technology is one of the things Chaillan likes best. “Our Retail’Move app uses all those features—face recognition, close-up interaction, voice recognition—everything that makes sense for that app. We use a lot of gesture-based technologies to catch people’s attention and bring them closer to the device. We’re working on a lot of exciting projects with Intel.”
About AFTER-MOUSE.COM and Nicolas Chaillan
Nicolas Chaillan founded AFTER-MOUSE.COM in 2008. The company specializes in the development of innovative solutions such as multi-touch, wearables, and motion-recognition business applications. AFTER-MOUSE.COM has offices in 10 countries around the world.
Chaillan, who is also chairman of Holding AKT, is more than a self-described “serial entrepreneur” and business executive; he is a successful software developer and early contributor to the PHP language. Born in Marseilles, France, he taught himself programming at the age of 7, sold the code for his first video game when he was 11, and developed computer security systems as a teenager.
Get the Intel® RealSense™ SDK UX Design Guidelines document and learn best practices in software development for all input modalities, including gesture.