Archived - Intel® RealSense™ Blog Series: Music, game development, and more with developer Stefan Sadchikov

Published: 10/22/2014   Last Updated: 10/22/2014

The Intel® RealSense™ SDK has been discontinued. No ongoing support or updates will be available.

Intel® RealSense™ technology makes it possible for our digital worlds to interact with our physical, organic worlds in meaningful ways. Many of the projects that developers are creating step across boundaries that just a few years ago would have been impossible to imagine.

The RealSense™ App Challenge 2014 is an exciting competition meant to spur developers from all over the world to forge new ground in the field of perceptual computing. Voice control, gesture recognition, facial analysis, and augmented reality are just some of the innovations we’re looking forward to in this challenge. An evolution of the 2013 Perceptual Computing Challenge, the competition tasks developers with designing innovative new ways to integrate RealSense™ technology into our computing experiences, using the Intel® RealSense™ SDK for Windows Beta and Developer Kit along with the Intel 3D gesture camera.

This definitely makes for some exciting developer possibilities. These technologies are changing the way we develop apps; for example, a user could open a YouTube video simply by using a hand gesture, or post a new Facebook status with a blink of an eye and voice recognition. With the release of the Intel® RealSense™ SDK for Windows Beta and Developer Kit and the Creative* Interactive Gesture Camera Developer Kit, which bundles RGB and 3D depth capture together, developers have a unique opportunity to usher in a whole new era of PC interaction.

Perceptual Computing Challenge 2013 2nd prize winner Stefan Sadchikov graciously took some time out of his busy schedule to talk to us about his experience with that competition, his thoughts on perceptual computing, and his development plans for the future. Mr. Sadchikov won second place in the overall competition with his entry “Drummer”:

More about this project:

“The application is a music synthesizer that lets you create drum sounds using gestures and head tilts, playing "imaginary drums". The user's hands (hand cursors) are rendered on the screen along with the drums, which can be selected from a slide-out menu (all musical instruments are configured via an external file and can be added and modified at will). Instruments are played when hand cursors collide with them (the cursors follow the user's hands). A rhythm bar for each instrument is displayed in the center of the screen and can be written to an external file as text. All application controls are performed with gestures: moving instruments around, adding new ones, navigating the drum menu, and recording tracks.”
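To make that interaction concrete, here is a minimal, hypothetical C++ sketch of the collision-triggering idea described above. The names (Drum, HandCursor, playSample) are illustrative placeholders, not taken from the actual Drummer source:

```cpp
#include <cmath>
#include <iostream>
#include <string>
#include <vector>

// Hypothetical types: a tracked hand cursor and an on-screen drum.
struct HandCursor { float x, y; };  // screen position, updated from hand tracking
struct Drum {
    std::string sample;             // sound file, loaded from the external config
    float x, y, radius;             // on-screen hitbox
    bool wasInside;                 // previous-frame state, so a hit fires once
};

// Placeholder for the audio backend.
void playSample(const std::string &sample) { std::cout << "play " << sample << "\n"; }

// Trigger each drum once when a cursor enters its hitbox (edge-triggered,
// so a hand resting over a drum does not retrigger the sample every frame).
void updateDrums(std::vector<Drum> &drums, const std::vector<HandCursor> &cursors) {
    for (Drum &d : drums) {
        bool inside = false;
        for (const HandCursor &c : cursors) {
            float dx = c.x - d.x, dy = c.y - d.y;
            if (std::sqrt(dx * dx + dy * dy) < d.radius) { inside = true; break; }
        }
        if (inside && !d.wasInside) playSample(d.sample);  // new collision: hit!
        d.wasInside = inside;
    }
}

int main() {
    std::vector<Drum> drums = { {"kick.wav", 100, 300, 40, false},
                                {"snare.wav", 250, 300, 40, false} };
    std::vector<HandCursor> cursors = { {110, 310} };  // hand over the kick drum
    updateDrums(drums, cursors);                       // prints "play kick.wav"
}
```

The edge-triggered check is the key design detail here: without tracking the previous frame's state, every frame a hand spent inside a drum's hitbox would fire the sample again.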

What got you interested in coding, particularly perceptual computing?

Computers and everything related to them – particularly video games – fascinated me when I was young. When it was time to decide what I wanted to study and work on, programming seemed like an obvious choice, so I went to study Math and Programming at Samara State Aerospace University.

While studying, I worked as a programmer at a couple of local companies unrelated to the game industry, spending my free time gathering specific knowledge about game development.

Tell us about your history with the Perceptual Computing Challenge. What were the highlights for you in 2013?

Intel’s PerC SDK got me interested because of the unique way the user can interact with a computer. Gestures and hand movements are somewhat intuitive – all people use them – so the idea of making them an input for a computer application sounds promising. A new scheme of user input may give a twist to existing game genres or even create new ones.

For the 2013 challenge I wanted to make a small project that would start from scratch, pass through all development phases, and see a release in some form. It was not the first project I ever attempted, but it was the one I wanted to finish in a proper way. That’s how we came up with the idea of the Drummer app.

The project served several purposes for me: to get more practice in C++ (I had previously worked as a Java programmer), to build an architecture for a small game-like app, and to learn to use a new tool, the Perceptual SDK, that could aid me in future projects.

We did not use other Intel developer tools then. In fact, I personally learned about them only after the challenge.

Tell us about the overall development experience: did everything go smoothly? What challenges did you come across, and how did you solve them?

The main challenges came from not being familiar with the SDK. It was only a beta then and could not, for example, tell the difference between a left hand and a right hand. Such features were not very thoroughly documented, so it took a bit of effort to learn what the SDK could and could not do. But the development process overall went smoothly, and there were no challenges we could not overcome.

Since the last Challenge, tell us what you’ve been working on. How did the Challenge help you in your professional or personal development projects?

I now work on a game called Insomnia at a local company, Studio Mono (the Kickstarter page can be found here: https://www.kickstarter.com/projects/1892480689/insomnia-an-rpg-set-in-a-brutal-dieselpunk-univers). As I said before, one of the purposes of our 2013 app was for me to study C++, and it really helped me a lot. I use the things I learned while making the Drummer app in my everyday work.

Are you competing in the current Intel RealSense App Challenge (https://realsenseappchallenge.intel.com/landing/)?

Yes, we have submitted our idea to the competition. Last year’s challenge was a really interesting and useful experience, and I am planning to work with the technology further, this time with a more game-like application.

The new SDK has many more features than the 2013 version, with more robust detection and more information captured. Our application last year was not very easy to use because it relied on accurate positioning, which the SDK could not offer back then. We’ve learned our lessons, and we’ll see what the new SDK can offer.

We have not decided yet which tools to use, but they will probably be ones we are already comfortable with, like Visual Studio. It’s not the best IDE ever, but it’s hard to find something better for C++ development. JetBrains just released early access builds of their CLion IDE, and, knowing their Java IDEs, it must be something solid, but it has been in development for a long time and it will take time to polish.

How would you define what RealSense technology is?

I see RealSense technology as an alternative to common input methods like the mouse and keyboard. It can hardly take their place, but it can complement their abilities. Every device has its own comfort zone and its own problems it solves, and RealSense technology solves different problems than a mouse or keyboard does.

Where do you see perceptual computing going in the next five years?

It might become a common input device, in line with the mouse and keyboard. Or it might become something specialized, like drawing tablets for artists, used only in certain situations.

What are you looking forward to in the current RealSense Challenge? What made this challenge compelling to you?

The new challenge features a new, better SDK, and it will be interesting for me to play with its possibilities. Also, the Ambassador Track, where I can compete with people I already know from the previous challenge, sounds like a good idea.

Were there new capabilities in the latest RealSense software or hardware that piqued your interest or inspired you?

Yes. From the demos I’ve seen at MWC and the features on Intel Developer Zone, the SDK now exposes much more data to work with, and that made it possible to create a game like the one we submitted for the challenge.

What advice would you give your fellow Challengers?

Always check the capabilities of the SDK before trying to implement something. Also, in my experience, apps that rely on physical feedback (like our Drummer app, which mimics the process of playing drums without providing the actual physical feedback of a drum) always feel crippled in some way. There are a lot of ideas like this that seem interesting to implement; some of them can become something awesome, but others may completely ruin the user experience.

Which other tools, documentation, and/or websites were the most helpful to the dev effort?

A lot of my questions had already been answered on the forums. That was the only resource I used, apart from the official documentation.

Give us an example of a “breakthrough moment” in your perceptual computing development.

I think for me it was the moment I actually learned how to properly set up the camera and get some results. It took time to do properly, but it was probably a key moment in the development.
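For context, initializing the camera pipeline with the RealSense™ SDK for Windows Beta looked roughly like the C++ sketch below. This is written from memory of the 2014-era PXCSenseManager API; exact header names, signatures, and return-code conventions may differ between SDK releases, so treat it as an assumption rather than a reference:

```cpp
// Minimal sketch (assumed API): open the camera, enable hand tracking,
// and poll frames for hand data.
#include "pxcsensemanager.h"   // header names as shipped with the SDK beta (assumption)
#include "pxchanddata.h"

int main() {
    // Create the pipeline and turn on the hand-tracking module.
    PXCSenseManager *sm = PXCSenseManager::CreateInstance();
    if (!sm) return 1;
    sm->EnableHand();
    if (sm->Init() < PXC_STATUS_NO_ERROR) return 1;      // opens and configures the camera

    PXCHandModule *handModule = sm->QueryHand();
    PXCHandData *handData = handModule->CreateOutput();

    // Poll frames; AcquireFrame(true) blocks until all enabled modules have data.
    while (sm->AcquireFrame(true) >= PXC_STATUS_NO_ERROR) {
        handData->Update();                              // refresh tracking results
        pxcI32 hands = handData->QueryNumberOfHands();
        (void)hands;
        // ... query per-hand data here and drive the app's on-screen cursors ...
        sm->ReleaseFrame();                              // let the pipeline advance
    }

    handData->Release();
    sm->Release();
    return 0;
}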

Thanks again to Stefan Sadchikov for participating, and we wish him the best of luck in this year’s Intel RealSense App Challenge!

