We've had an extremely hectic week, but we're still here, and Rob's going to talk about doing the sculpty stuff. Before we jump in, we just want to say thank you again to Intel for inviting us up to GDC - having the opportunity to meet the Intel team and a fantastic group of developers all interested in pushing the boundaries was great. I managed to introduce PerC (or Percy, as we like to call it here) to another developer who is actually rather experienced in a similar area, so it should bring about something fun!
For the app we had two choices:
1. use tessellation and deform the mesh via DX11, but risk incompatibility
2. use an extremely high-res subdivided mesh with bones.
We went with option 2 because it looks almost as good but will work on anyone's hardware.
We felt it was important to work on as many machines as possible because the real focus here is getting Perceptual Computing out there into the hands of those pioneers who want to try something fresh and new with their computer and the PerC.
The object is first set up in 3DS Max, though you could use Blender for much the same job: the mesh is skinned, with bones placed at regular height intervals from bottom to top.
In the code we detect how close the fingers are to the middle of the screen. PerC has an excellent command for this -
if (pp.QueryGeoNode(PXCMGesture.GeoNode.Label.LABEL_BODY_HAND_LEFT | PXCMGesture.GeoNode.Label.LABEL_HAND_MIDDLE, out ndata))
If this returns true, ndata will contain what you requested from QueryGeoNode. The label argument is a bitmask, so you can combine labels to retrieve a wider spectrum of information. You can then read rudimentary positioning from:
ndata.positionImage.x and ndata.positionImage.y
Although this is generally the position of the entire hand, that's what we want for now.
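Putting the pieces above together, a minimal polling loop might look like the sketch below. It assumes pp is an initialized PXCMGesture instance and that the camera image is 320x240; the screenCentreX value and the exact label names follow the snippet above but may differ between SDK versions.

```csharp
// Hedged sketch: query the left hand's middle node each frame and measure
// how close it is to the centre of the camera image.
PXCMGesture.GeoNode ndata;
bool found = pp.QueryGeoNode(
    PXCMGesture.GeoNode.Label.LABEL_BODY_HAND_LEFT |
    PXCMGesture.GeoNode.Label.LABEL_HAND_MIDDLE,
    out ndata);

if (found)
{
    // positionImage is in camera-image pixel coordinates (assumed 320x240 here)
    float screenCentreX = 160.0f;
    float distFromCentre = Mathf.Abs(ndata.positionImage.x - screenCentreX);
    // ... the closer to centre, the harder the "press" on the clay
}
```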
In addition, we start pulling out the finger information for fine tuning. But how do you simulate pottery with just positional data? The behaviour of pottery depends on:
1. the pressure against the clay - press the middle and it forces the clay to grow taller
2. how many fingers are touching the clay - will this cause it to be bumpy?
Because the raw input is noisy, we need to dampen, or smooth, the input values.
To dampen we could simply lerp the value:
smooth.x = Mathf.Lerp(smooth.x, ndata.positionImage.x, 10.0f*Time.deltaTime);
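To show how the smoothed value might actually drive the bone-based mesh, here is an illustrative sketch (not the shipped code): the vertical position picks which bone ring to squeeze, and the horizontal distance from the pot's axis becomes the "pressure". The names bones, potCentreX, pushRange, maxSqueeze, and imageHeight are all assumptions for the sake of the example.

```csharp
// Smooth both axes of the raw hand position (as above)
smooth.x = Mathf.Lerp(smooth.x, ndata.positionImage.x, 10.0f * Time.deltaTime);
smooth.y = Mathf.Lerp(smooth.y, ndata.positionImage.y, 10.0f * Time.deltaTime);

// Pressure falls off with distance from the pot's centre line
float pressure = Mathf.Max(0.0f, 1.0f - Mathf.Abs(smooth.x - potCentreX) / pushRange);

// Pick the bone ring nearest the hand's height (bones run bottom to top)
int ring = Mathf.Clamp((int)((smooth.y / imageHeight) * bones.Length), 0, bones.Length - 1);
Transform bone = bones[ring];

// Squeeze the ring radially in proportion to the pressure, easing over time
float radius = Mathf.Lerp(bone.localScale.x, 1.0f - pressure * maxSqueeze, 5.0f * Time.deltaTime);
bone.localScale = new Vector3(radius, bone.localScale.y, radius);
```

Because the mesh is skinned to these bones, scaling one ring pinches the clay at that height, which is what makes the heavily subdivided mesh in option 2 a workable stand-in for real tessellation.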
We also wanted a way to texture the clay with visual feedback. PerC, for all its fantastic potential, still lacks tactile feedback. You can't feel thin air very well, and this is an integral part of using your hands. Those thousands of nerve endings are often taken for granted.
So what we must do instead is provide feedback using audio and visuals - in this case we have a lined clay texture that breaks up when a finger or hand touches one side.
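One simple way to drive that breakup effect is to feed the touch strength into a material parameter. This is only a sketch: pressure stands in for however hard the hand is pressing, and _LineBreakup is a hypothetical shader property, not something from the actual project.

```csharp
// Hedged sketch: fade the lined clay texture toward a "disturbed" look
// in proportion to how hard the hand is pressing.
float contact = Mathf.Clamp01(pressure);         // 0 = no touch, 1 = full press
clayMaterial.SetFloat("_LineBreakup", contact);  // hypothetical shader property
```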
In order to make it fun as well as creative, it's a good idea to use two hands, pushing an impulse from both sides of the pottery. This means it's reasonably possible to create a wobbly mess, as you probably would in real life if you only used one hand! The code is simple, like most code generally is - what is not simple is devising a way that FEELS NATURAL to human beings. This is the crux of developing for perceptual hardware. While the maths sounds geeky and exciting, it is rarely the most important part of perceptual computing.
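The two-handed idea above could be sketched like this: query both hands, and only let the clay take shape when pressure comes from both sides, so one hand alone just wobbles it. The edge values potLeftEdge and potRightEdge and the helper ApplySqueeze are assumed names for illustration; the labels follow the PerC naming used earlier.

```csharp
// Hedged sketch: opposing impulses from both hands shape the clay.
PXCMGesture.GeoNode left, right;
bool haveLeft  = pp.QueryGeoNode(PXCMGesture.GeoNode.Label.LABEL_BODY_HAND_LEFT,  out left);
bool haveRight = pp.QueryGeoNode(PXCMGesture.GeoNode.Label.LABEL_BODY_HAND_RIGHT, out right);

if (haveLeft && haveRight)
{
    // How far each hand has pushed past the pot's edge, in image pixels
    float impulseL = Mathf.Max(0.0f, potLeftEdge - left.positionImage.x);
    float impulseR = Mathf.Max(0.0f, right.positionImage.x - potRightEdge);

    // Only the balanced portion of the force shapes the clay cleanly;
    // an unbalanced push (one hand) produces the wobbly mess instead.
    float squeeze = Mathf.Min(impulseL, impulseR);
    ApplySqueeze(squeeze);  // hypothetical helper deforming the bone rings
}
```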
The most important part is translating human expectation into a meaningful result; Intel's engineers have already done the heavy lifting with the SDK.
If you are a programmer or developer interested in what we are doing with perceptual computing and Intel, you will be happy to know we are including the full source code, models, art, and complete Unity project files for you to use as you see fit. This is part of our ongoing commitment to Intel, as we feel there is a fantastic future for perceptual computing and this is just the beginning.
-- That's all from Rob,
everyone is doing some really exciting things, we look forward to the end of the week!