Archived - Interview with Developer at Global Game Jam Yogyakarta: Creating an Intel® RealSense™ Game within 48 Hours

The Intel® RealSense™ SDK has been discontinued. No ongoing support or updates will be available.

In January 2016 I participated in Global Game Jam 2016 as a speaker at Jogja Game Jam #jgj48, a jam site located in Yogyakarta, Indonesia. I talked to the developers to find out whether they were interested in creating Intel RealSense games, because I would be very happy to help them. After the event, I was surprised that one team had succeeded in creating an Intel RealSense game within 48 hours (actually two games, on different platforms, in 48 hours).

So I did an interview with Ariska Hidayat, one of the team members.


Ariska, can you tell me about your games and the team?

Our team has four members: myself as the RealSense programmer, Fatah Rona as the artist, Hariyanov as the Android programmer, and Farid as the game designer. We made two games: “Legendary Egg”, which is on the Android platform, and “Swipey Monsters”, which is on the PC platform with Intel RealSense technology. Basically, these two games have similar gameplay but run on different platforms.


What about the gameplay?

The gameplay is quite simple. The player needs to keep the hatched egg from being attacked by monsters, and in Legendary Egg they do this by tapping the monsters. In Swipey Monsters, the player only needs to move their hand (thanks to the RealSense technology) to attack the monsters.


Can you tell us how you created an Intel RealSense-based game in 48 hours?

To create an Intel RealSense-based game, we need software and hardware that support Intel RealSense technology. You can use the F200 Dev Kit or any laptop with a built-in Intel RealSense camera. I used a laptop that already has one, and installed the Intel RealSense camera driver (DCM) along with the Intel RealSense SDK R5 (Gold). Both of these games were developed using Unity 5.3.

After designing the gameplay, we created its core, and this part of development took most of the jam time. Our team had to implement the monster behavior, create particles, set up and place colliders, and create the random monster spawn behavior, all at the same time.
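As an illustration of the random spawn behavior mentioned above, a simple spawner in Unity C# might look something like the sketch below. The prefab reference, spawn interval, and spawn radius are hypothetical values chosen for the example, not taken from the team's project.

```csharp
using UnityEngine;

// Hypothetical sketch of a random monster spawner (not the team's actual code).
public class MonsterSpawner : MonoBehaviour
{
    public GameObject monsterPrefab;   // assumed monster prefab, with its own collider
    public float spawnInterval = 2f;   // assumed seconds between spawns
    public float spawnRadius = 8f;     // assumed distance from the egg at which monsters appear

    void Start()
    {
        // Spawn one monster at a fixed interval for the whole round.
        InvokeRepeating("SpawnMonster", spawnInterval, spawnInterval);
    }

    void SpawnMonster()
    {
        // Pick a random point on a circle around the egg (this spawner's position).
        float angle = Random.Range(0f, Mathf.PI * 2f);
        Vector3 offset = new Vector3(Mathf.Cos(angle), Mathf.Sin(angle), 0f) * spawnRadius;
        Instantiate(monsterPrefab, transform.position + offset, Quaternion.identity);
    }
}
```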

The next step was creating the RealSense control; this step was the easiest one, and it was where I did my part. The last step of development was to complete the layout, such as the main menu and the score menu. Sound effects and animation also needed to be done.


Tell me about the Intel RealSense part of your game.

The Intel RealSense SDK includes many examples, and I was very glad to discover that. We looked at the Intel RealSense hand tracking capabilities and chose which controls needed to be adapted to the gameplay requirements. In our game, we only needed single-point hand tracking as a controller.


How did you develop it?

There are two ways to develop with Intel RealSense in Unity. The first is to use the toolkit, where you only need to drag and drop the prefabs that implement Intel RealSense hand tracking.

The second way is to call the sensor values directly from the library, and it was easier for us to use this technique. Of the 22 tracked hand points, we only used one in this development: the hand center position. From that tracked position, we get an X value (horizontal) and a Y value (vertical) to control the position of a pointer object. Every monster has a collider for object detection. When a monster object collides with the pointer object, the monster is destroyed. With some special effects added, it looked good.
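To make this concrete, here is a minimal sketch of the second approach, assuming the PXCM* C# wrapper classes (PXCMSenseManager, PXCMHandData) that shipped with the RealSense SDK R5 Unity bindings. The exact calls, the "Monster" tag, and the pixel-to-world scale factor are assumptions made for the example, not the team's actual code.

```csharp
using UnityEngine;

// Hypothetical sketch: drive a pointer object from the RealSense hand center
// and destroy monsters on contact. PXCM* names follow the (now discontinued)
// Intel RealSense SDK R5 C#/Unity bindings and may differ slightly.
public class HandPointer : MonoBehaviour
{
    private PXCMSenseManager senseManager;
    private PXCMHandData handData;

    public float scale = 0.02f;   // assumed image-pixel-to-world scale factor

    void Start()
    {
        senseManager = PXCMSenseManager.CreateInstance();
        senseManager.EnableHand();                          // enable the hand tracking module
        senseManager.Init();                                // start the camera pipeline
        handData = senseManager.QueryHand().CreateOutput();
    }

    void Update()
    {
        // Blocking call, kept simple for the sketch; real code would use the
        // non-blocking or event-driven path to avoid stalling the frame.
        if (senseManager.AcquireFrame(true) >= pxcmStatus.PXCM_STATUS_NO_ERROR)
        {
            handData.Update();

            PXCMHandData.IHand hand;
            if (handData.QueryHandData(
                    PXCMHandData.AccessOrderType.ACCESS_ORDER_BY_TIME, 0, out hand)
                == pxcmStatus.PXCM_STATUS_NO_ERROR)
            {
                // The single tracked point: the hand center in image coordinates.
                PXCMPointF32 center = hand.QueryMassCenterImage();

                // X (horizontal) and Y (vertical) drive the pointer; Y is negated
                // because image coordinates grow downward.
                transform.position = new Vector3(center.x * scale, -center.y * scale, 0f);
            }

            senseManager.ReleaseFrame();
        }
    }

    // The pointer carries a trigger collider; every monster has a collider too.
    // Use OnTriggerEnter2D/Collider2D instead if the game uses 2D physics.
    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Monster"))
            Destroy(other.gameObject);   // special effects omitted in this sketch
    }

    void OnDestroy()
    {
        if (senseManager != null) senseManager.Dispose();
    }
}
```

For the trigger callback to fire, at least one of the two colliding objects also needs a Rigidbody (a kinematic one on the pointer is enough).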


Sounds great, but can we see your project and learn from it?

Yes, sure. The game project and the executable can be downloaded here, and please also don't forget to download the Android version here or on the Google Play store.
