After the PerC Challenge was over, I was so immersed in the development that I felt I needed to share my experience building the Shadow Art Show Maker app, so I wrote a detailed blog post. Thanks to David Liu for helping to get it published. You can read the blog post here:
A couple of things I found while playing around with hand tracking in the SDK and Unity that may be helpful:
1) Even if you are using only one hand, you must check the geonodes for both the PRIMARY and SECONDARY (or LEFT and RIGHT) tracks. Usually on initial startup the PRIMARY track will be used, but tracking can easily shift over to the SECONDARY.
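The fallback pattern above can be sketched as a small helper that tries each track label in order and returns the first one with valid data. This is a minimal sketch, not the SDK API: `query_node` is a hypothetical placeholder for whatever call fetches geonode data for a given label.

```python
def query_active_hand(query_node, labels=("PRIMARY", "SECONDARY")):
    """Return (label, data) for the first hand track with valid data.

    query_node is a placeholder for the SDK call that fetches geonode
    data for a label; it is assumed to return None when the track is
    unavailable. Checking both labels avoids losing the hand when
    tracking silently switches from PRIMARY to SECONDARY.
    """
    for label in labels:
        data = query_node(label)
        if data is not None:
            return label, data
    return None, None  # no hand currently tracked
```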
I want to transform the coordinates returned by QueryFaceLandmarkData, which are in RGB camera space, into the depth map's coordinate space, so I can find where the eyes and mouth are in the depth map.
The function QueryUVMap() gives the correspondence from depth map coordinates to RGB coordinates, but I need the reverse mapping. Any idea how to do it?
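One common workaround is to build the reverse lookup yourself: walk the forward UV map once, and record for each RGB pixel which depth pixel maps onto it. A minimal sketch, assuming the UV map stores normalized (u, v) RGB coordinates per depth pixel and marks invalid entries with negative values (the exact layout and sentinel are assumptions, not SDK guarantees):

```python
def invert_uv_map(uv_map, depth_w, depth_h, rgb_w, rgb_h):
    """Build a reverse lookup table: RGB pixel -> depth pixel.

    uv_map[dy][dx] is assumed to hold normalized (u, v) RGB
    coordinates for depth pixel (dx, dy); entries with negative
    components are treated as unmapped.
    """
    inverse = {}
    for dy in range(depth_h):
        for dx in range(depth_w):
            u, v = uv_map[dy][dx]
            if u < 0 or v < 0:
                continue  # no valid mapping for this depth pixel
            cx = min(int(u * rgb_w), rgb_w - 1)
            cy = min(int(v * rgb_h), rgb_h - 1)
            # Last writer wins; since the RGB image is denser than
            # the depth map, the table will have holes that a small
            # nearest-neighbour search around (cx, cy) can fill.
            inverse[(cx, cy)] = (dx, dy)
    return inverse
```

To locate an eye landmark in the depth map you would then look up its RGB pixel in `inverse`, searching a small neighbourhood if the exact pixel is a hole.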
I am sure that the SDK supports the circle gesture.
But if I want to know whether a circle gesture is clockwise or anticlockwise, how should I implement that?
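One way to determine the direction yourself is to record the hand positions while the circle gesture is active and compute the signed area of the traced path (the shoelace formula). This is a generic sketch, independent of the SDK; note the sign convention flips between y-up and y-down (image) coordinates:

```python
def circle_direction(points):
    """Classify a closed 2-D stroke as clockwise or anticlockwise.

    points: list of (x, y) hand positions sampled during the gesture.
    Uses the shoelace signed area; in a y-down image coordinate
    system, positive area means the stroke was clockwise.
    """
    area = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]  # wrap around to close the loop
        area += x1 * y2 - x2 * y1
    return "clockwise" if area > 0 else "anticlockwise"
```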
The Intel® Perceptual Computing SDK 2013 R6 is now available! We’ve been hard at work resolving issues and improving performance and are excited to share the results. So what’s new?
The PCSDK is exciting, especially for face tracking. I use the function "QueryLandmarkData" to get the z position of an eye, but z is always 0. Another problem: even the x and y values display differently from the gesture coordinates. I don't know what they are or where the origin (0, 0) is in their coordinate system.
Please tell me, and thank you very much.
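When a landmark's z comes back as 0, a common workaround is to map the landmark's (x, y) into depth-map space and read the depth value there directly. A minimal sketch, assuming you already have the landmark in depth-map pixel coordinates; the invalid sentinel values below are assumptions for illustration, not SDK constants:

```python
def depth_at(depth_map, x, y, invalid=(0, 32001, 32002)):
    """Read depth at landmark pixel (x, y), averaging a small
    neighbourhood and skipping invalid/saturated sentinel codes.

    depth_map is a row-major 2-D list of raw depth values.
    """
    h, w = len(depth_map), len(depth_map[0])
    samples = []
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            px, py = int(x) + dx, int(y) + dy
            if 0 <= px < w and 0 <= py < h:
                v = depth_map[py][px]
                if v not in invalid:
                    samples.append(v)
    # Average the valid neighbours to smooth single-pixel noise.
    return sum(samples) / len(samples) if samples else None
```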
Do the PercSDK and Senz3D camera work on a Windows RT device? If not, does anybody know whether there are any plans for this?
I tested the linked code below and noticed that black objects are treated as background. I wonder why. Does the Creative camera have a problem segmenting black-colored objects?
I would like to comment on that.
I'm working on a Windows service that detects some gestures and also moves the mouse pointer with the movement of the primary hand node. The problem is that when I query for the node data, it always returns PXCM_STATUS_ITEM_UNAVAILABLE, so I'm unable to get the node position, etc.
I'm pretty new at this; is there a reason why querying for the node data always returns that status? Do I need to configure something?
Is it possible to use the Intel perceptual camera on the Android platform?
If it is not possible right now, what about the near future?
Android phones support micro-USB (or standard USB via an OTG adapter).