Simian Squared Go Perceptual: Ultimate Coder

Hello all and welcome to our very first Ultimate Coder: Going Perceptual blogpost!

I thought we'd use this opportunity to introduce ourselves and tell you about what we will be doing with the Perceptual Computing platform and where we're coming from with our ideas:

Who are Simian Squared?

We are an independent developer based in London, England. You may recognise us as the developers of the upcoming platformer The Other Brothers and of Physynth, a musical experience for iPad (which we'll be telling you a little more about later on in this post).

What is Perceptual Computing?

Intel have provided us with a lovely piece of kit called the Interactive Gesture Camera, a stunningly powerful Ultrabook portable computer, and their Perceptual Computing (PerC) SDK. In a nutshell, PerC allows us to capture and interpret physical gestures and movements and use them to immerse the user in our interactive experience.


Why did we decide to jump aboard the challenge?

The excitement of working with something powerful and unknown appeals to us, and it's the main reason we decided to take on a project like this. We see Perceptual Computing as an opportunity to create a new kind of immersive experience, one that is almost tactile, and that is very exciting to us.

We should probably start by talking to you a little bit about what we've done before in this area, and how it will translate into this project: 

Back at the end of 2011 we took some time out of working on games (in the traditional sense of the word) to work on an experimental idea we had been bouncing around for a while.

So what was it?

As with the Ultrabooks and Perceptual Computing today, Apple at the time had presented us with an exciting new canvas to work on and made developers' lives easier with great hardware and support.

We had a large-screened device with a number of sensors at our disposal and a blank canvas to come up with something unique. Having this, we took the opportunity to wander away from our comfort zone of games and proposed an idea we'd had for a new kind of musical instrument - one that drew on our experience as games developers, with tweakable physics simulations triggering its sounds.

So how does that relate to this?

We looked at what other music app developers had been doing with their user interfaces and saw that for the most part everything was quite utilitarian, which is fine, but there was nothing really there that created an experience, nothing that pulled the user in and took them away from reality for a while.

So thinking about doing things a little differently in terms of user-experience, we started off with the visuals - being games developers, we knew that we could bring our knowledge of realtime 3D over to making an interface that really stunned our users. 

We had been thinking that a traditional interface really wouldn't cut it - this idea conjured up visions of old 1960s hardware, of finding an ancient bit of kit in your dad's garage and firing it up, its half-working displays and temperamental sound units inspiring wonder and a little sprinkling of intrigue.


After some brainstorming, we came up with a number of ways that we could immerse the player:

1. We decided that we would emulate physical hardware in realtime 3D. No pop-ups or drop downs allowed that could break the immersion, and it needed to be beautiful. High resolution, realistic textures and a fully three dimensional interface.

2. Following on from that, we came up with the idea of creating a pseudo-3D depth effect by tilting the camera to match the angle at which the actual physical device is held. This gave the illusion of it being an actual, physical piece of hardware.

3. In conjunction with this, we lit the virtual hardware and wrote custom shaders which reacted with the lights in combination with the physical angle of the iPad to create a realistic surface that shone as you moved your iPad.

4. Finally, since it was a rather complex app, we wanted to create a manual whilst adhering to our rule of emulating a physical experience, so we made our own simulation of a physical book, with page turning and hand-illustrated diagrams.
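The tilt-driven camera from point 2 can be sketched in a few lines. This is a minimal illustration, not our actual Unity code - the function name and the constants are invented for the example - showing how device tilt readings might be mapped to a small, clamped camera offset so the virtual hardware appears to have real depth:

```python
def camera_offset(pitch_deg, roll_deg, max_offset=0.05, max_tilt_deg=45.0):
    """Map device tilt to a small virtual-camera offset.

    Tilt is clamped to +/- max_tilt_deg, then scaled linearly into a
    lateral and vertical offset, so the on-screen 'hardware' appears
    to sit behind the glass as the device is rotated.
    """
    def scale(angle):
        clamped = max(-max_tilt_deg, min(max_tilt_deg, angle))
        return (clamped / max_tilt_deg) * max_offset

    # Roll shifts the camera left/right, pitch shifts it up/down.
    return (scale(roll_deg), scale(pitch_deg))

# A level device leaves the camera centred...
print(camera_offset(0.0, 0.0))     # (0.0, 0.0)
# ...while extreme tilt pins it at the maximum offset.
print(camera_offset(90.0, -90.0))  # (-0.05, 0.05)
```

Clamping matters here: without it, an enthusiastic tilt would swing the camera far enough to shatter the illusion that the device itself is the hardware.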


What does that mean for this project?

We want to take that kind of experience to the next level with PerC. Letting the user physically interact with our virtual world strikes us as the perfect way to make something special. We want to expand upon our initial ideas for the visuals and bring something completely new to the table with the incredible gestural and motion controls, hopefully giving the user an almost tactile experience.

Physynth was a niche product with a steep learning curve, which we realised limited its accessibility - for this project we will be learning from that and creating something that anyone can use, be they eight years old or eighty.

Right, enough background - give me the details: what are Simian Squared actually making here?

We will be creating a virtual pottery wheel, set in a beautiful location that will give users the chance to use their hands to sculpt digital clay into beautiful works of art. The user will use physical gestures and motions to mould, manipulate and then paint the clay.
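To make the moulding idea a little more concrete, here is one possible way to represent the clay - again a rough Python sketch with invented names, not our engine code. The pot is a lathe-style profile of radii, one per horizontal slice, and a hand pressing in at a given height pushes nearby slices inward with a falloff:

```python
def press(profile, height_idx, depth, spread=2, min_radius=0.1):
    """Press a 'fingertip' into a lathe-style clay profile.

    profile    -- list of radii, one per horizontal slice of the pot
    height_idx -- index of the slice the hand is touching
    depth      -- how far the hand pushes in
    spread     -- neighbouring slices are pushed in too, with falloff
    """
    new = list(profile)
    for i in range(len(profile)):
        dist = abs(i - height_idx)
        if dist <= spread:
            falloff = 1.0 - dist / (spread + 1)  # linear falloff
            new[i] = max(min_radius, profile[i] - depth * falloff)
    return new

# Pressing into the middle of a plain cylinder narrows it into a waist.
cylinder = [1.0] * 7
waisted = press(cylinder, height_idx=3, depth=0.6)
```

The `min_radius` floor stops the clay collapsing through itself; a real implementation would spin this profile around the wheel's axis to generate the mesh each frame.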

We will put a large emphasis on making it feel as natural and look as beautiful as possible, to give the user a wonderful experience that also fully demonstrates the power of the PerC and Ultrabook hardware. We will be leveraging our experience with rendering and physics to get the most out of both.


So how far along are we?

Well, so far we've been split between doing R&D for the tech we will be writing for this project and wrapping up development on our upcoming platformer, The Other Brothers. We're about ready to kick into full gear with the challenge, though, and will be keeping you updated on all goings-on over the coming seven weeks.

We've taken a long hard look at the Intel PerC SDK, and it's incredibly exciting. They've stripped out the main technical hurdles and left us with a creative level playing field - and we can utilise Unity for this. The cool part of Unity is being able to leverage an awesome workflow and great special effects.

We'll be talking in depth about our process of writing custom shaders, and sharing source code as the blog progresses - so keep checking back for an in-depth look at how we work!


Giuseppe and Rob
Simian Squared Ltd

This blog post is mirrored on our own website at 
