working outdoors in sunlight

Does the new RealSense camera work outdoors in the presence of sunlight?    After attending the recent webinar and happy hour meetings, I'm getting conflicting information about this.

     ...Tim...

Hi Tim!  I was also at the webinar.

The public version of the RealSense camera is initially going to be built into tablets and Ultrabooks.  The tablet version will be positioned to face outwards, away from the user (rather than inwards towards the user, as the PC version does), so that the environment in front of the user can be incorporated into applications.  Given that it is going into mobile tablets (unlike Kinect), I think it is highly likely that it is designed to cope with outdoor sunlight.

Here's the product info page on the tablet version of the camera.  It shows a picture of a guy playing with his kid on a beach while someone films him with the tablet.  That's just about as typical a bright-light scenario as you can get if the picture reflects (no pun intended) real-world conditions.

http://www.intel.com/content/www/us/en/architecture-and-technology/reals...

I can see real potential for a rear-facing camera; however, this is very likely to split development effort between gesture interpretation and scene interpretation.  I suspect more effort will be spent on scene interpretation, meaning Ultrabooks will be fitted with cameras facing the wrong way for most applications.

Just my thoughts of course.

David

I would suppose that gesture recognition on a tablet is less practical than with a PC / Ultrabook camera, because unless you stand the tablet on a surface to use it, it'd be very likely to fall out of your grip while doing a gesture and shatter on the ground!

Plus, at the moment, making hand signs to your mobile in public is likely to attract friendly men in white coats to drag you away.  Then again, filming folk with your rear camera doesn't get you a sweet reception either  :)  Like the webinar said, perceptual computing is still a very new tech, even to developers.  Hence having the App Challenge contests, hackathons and such to promote awareness of the tech.

Just as the Wii had to be born to popularize motion control, and Kinect built on that mainstream consumer foundation, perceptual computing should become widely accepted in the next couple of years.  After all, the kinds of abilities it offers are definitely not fads, but the realization of things that people have wanted to do without physical controls for a long time.

> still a very new tech, even to developers.  Hence having the App Challenge contests, hackathons and such to promote awareness of the tech.

Sure.  I'd like to be aware of whether the tech works in sunlight or not.  While it's interesting to conjecture things based on pictures in their marketing that show people outdoors, I'd like to hear something a bit more definitive from Intel itself.  Based on the fact that the new version of the camera still relies on infrared, it doesn't seem promising.  If it does work in sunlight, it would be a major selling point for certain applications.

      ...Tim...

Tim, when I'm using a vision-based tech and want to alter the light / color properties of what is being looked at, I tend to put a filter over the camera lens to screen out what I don't want (e.g. a slightly tinted see-through material to give the lens "sunglasses" in bright light).

Hah, it reminds me of when the original version of the PC game 'Deus Ex: Human Revolution' had a gold filter to make everything in the game world tinted gold, and it gained internet nicknames such as "pee-o-vision" and "urine goggles".  :)  They removed the filter in the later Director's Cut version.

Anyway, an external lens filter is of course a far from ideal solution if you are making an app that relies on the outdoors, since you don't want all your users to have to stick tinted material over their tablet camera!  But it can be helpful when you're doing a project for your own use and benefit.  It would be handy if tablet casings were designed with a little tinted-plastic slider that you could flick over the lens in bright weather.  It would probably only cost the manufacturer a few cents.

At the moment I can't find any direct statements from Intel about the camera's ability to cope with bright light.  But as another piece of third-party evidence, here's an article from June's Computex show that shows a person using RealSense while a bright show-floor light is beaming onto the screen.

http://www.stuff.tv/sg/3d/intel-s-realsense-project-now-enables-its-tech...

8< I would suppose that gesture recognition on a tablet is less practical than a PC / Ultrabook camera, because unless you stand the tablet on a surface to use it then it'd be very likely to fall out of your grip while doing a gesture and shatter on the ground!  

I totally agree that non-touch gestures are impractical for a device operating in tablet mode; however, it seems you missed my point.  There are a finite number of developers, and having devices with forward-facing xor rear-facing cameras will split development effort.  My suspicion is that there will be killer apps for both, which will make devices with single-direction cameras a frustration for users.

The obvious answer would be to fit two cameras, or to have a rotatable pod with a RealSense camera on one side and a hi-res 2D camera on the other so you get the best of both worlds, although both options would increase the production cost of a device.

David

I saw and understood your point about fragmentation of developers, I just didn't write anything in reply to it.  My bad, sorry.

Fragmentation is rarely a good thing.  It's a particular problem with Android-based tablets.  An example of multi-platform hardware done well is Activision's digital / toy combination Skylanders, where the physical device you place the toy figure on (the 'Portal of Power') is the same for all console and mobile platforms, and the only thing that differs between platforms is the software.  So you can take your Portal of Power and figurines to a friend's house and continue playing on their device.

I imagine a RealSense camera costs a fair bit more than an ordinary camera.  If that's the case, the most cost-effective way for tablet manufacturers to give users the option of using both front and rear-facing camera applications might be to put a single camera on a pivot so that it can be flipped round depending on what application the user is using.

Hi Marty,

8< the most cost-effective way for tablet manufacturers to give users the option of using both front and rear-facing camera applications might be to put a single camera on a pivot so that it can be flipped round depending on what application the user is using.

Great minds....

8< or have a rotatable pod with RealSense camera on one side and Hi res 2D camera on the other so you have the best of both worlds although both would increase the production cost of a device.

Alternatively, how about a clip-on prism to act like a reverse-view periscope?

David

Cool ideas, David, but they tend to be limited to personal mod projects or specialist builds for certain applications rather than widespread retail releases, sadly.  Manufacturers don't want to return to the era where you could put enough attachments on a Game Boy to make it as easy to handle as a brick  :)

http://i.imgur.com/XKe1K.jpg

Maybe you could do a Kickstarter campaign for a clip-on prismscope!

> here's an article from June's Computex show that shows a person using RealSense while a bright show-floor light is beaming onto the screen.

My question isn't about bright light, it's about sunlight.  I know that infrared depth-sensing works fine indoors in bright light.  The problem with use in sunlight, I believe, is that sunlight contains lots of infrared, which swamps the infrared that the camera itself is putting out.
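
If anyone with camera hardware wants to put a number on this rather than eyeball it, here's a minimal sketch of the kind of test I'd run.  It's plain NumPy and assumes you can already grab a depth frame as a 2D array, with 0 marking pixels where the sensor lost its IR pattern (that convention, and the fake frames below, are just assumptions for illustration; nothing here is from an Intel SDK):

    import numpy as np

    def depth_dropout_fraction(depth_frame):
        # Fraction of pixels with no valid depth reading.  A zero is
        # taken to mean the sensor couldn't resolve its projected IR
        # pattern at that pixel -- a rough proxy for ambient-IR washout.
        return np.count_nonzero(depth_frame == 0) / float(depth_frame.size)

    # Fake frames standing in for real captures:
    indoor = np.random.randint(400, 4000, size=(480, 640))  # all pixels valid
    outdoor = indoor.copy()
    outdoor[np.random.rand(480, 640) < 0.7] = 0             # heavy washout
    print(depth_dropout_fraction(indoor))   # ~0.0
    print(depth_dropout_fraction(outdoor))  # ~0.7

Compare the fraction indoors versus in direct sun; if it shoots up outdoors, the ambient infrared is winning.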

     ...Tim...

No harm in indulging in well-reasoned scientific speculation based on pieces of information available.  Science is fun!   But until there's official tech spec info (I imagine RealSense alpha testers have it but can't disclose it due to NDA agreements), educated guessing is all we can do.

In this case, if you're designing an application for RealSense, the sensible thing to do is focus on developing the parts of the app that Intel has released solid details about, and leave the unknown bits until the camera is released, so that you don't lose time building stuff that turns out to be unusable.
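
To make that concrete, the usual trick is to hide the camera behind a small interface and develop against a stub until the real hardware and SDK arrive.  A minimal sketch in Python; DepthSource, StubDepthSource and the rest are hypothetical names of my own, not anything from Intel's SDK:

    class DepthSource(object):
        # Hypothetical interface: the only thing the app code talks to.
        def next_frame(self):
            raise NotImplementedError

    class StubDepthSource(DepthSource):
        # Replays canned frames so app logic can be built and tested
        # before the camera even ships.
        def __init__(self, frames):
            self._frames = list(frames)
            self._i = 0

        def next_frame(self):
            frame = self._frames[self._i % len(self._frames)]
            self._i += 1
            return frame

    # The app depends only on DepthSource; once the camera is out, a
    # RealSense-backed implementation can be dropped in without touching it.
    source = StubDepthSource(frames=["frame0", "frame1"])
    print(source.next_frame())  # "frame0"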

This is a long way of saying "I don't know if RealSense will turn to dust like Dracula when exposed to deadly sun beams".  :)

I've already designed and implemented my application using the Creative Senz3D camera.  Here's a video showing my installation at Burning Man last year, which used four Senz3D cameras (two cameras on each of two laptops):

     https://www.youtube.com/watch?v=M4Xd9TrcPE0

As a result, I know from actual experience that the Senz3D does not work in sunlight.  When I heard about the "new" RealSense camera, I was hoping that it used a different technology than the Senz3D, but it looks like it'll be pretty much the same (with higher resolution).

     ...Tim...

What an awesome installation, Tim - thanks so much for sharing!!

As the 2014 RealSense SDK supports the gesture functions that were in the 2013 Perceptual Computing SDK used with the Creative camera (this was confirmed in the Q&A section of last week's webinar), it would make sense if RealSense were based on that tech for legacy support.  But it's possible to make new hardware that emulates the functions of an older one.  So I don't know.  As Star Trek VI said, the future is the undiscovered country.

> it would make sense if RealSense was based on that tech for legacy support.

I seem to recall them saying at the happy hour meeting that the new SDK would not support the legacy hardware (i.e. the Senz3D), and that the APIs would not be the same, so it doesn't appear that they're making any attempt to be backward compatible.   "Supporting the gesture functions" and "Having the same API" are two different things.
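
To illustrate the difference with a toy example (every class and method name below is invented for the sketch; neither the 2013 nor the 2014 API is actually being shown): two SDKs can offer the same gesture capability through incompatible calls, and then a thin adapter is what keeps app code portable:

    # Invented stand-in for an old-style API:
    class OldGestureModule(object):
        def query_gesture(self):
            return "thumbs_up"

    # Invented stand-in for a new-style API -- same capability,
    # completely different shape:
    class NewHandTracker(object):
        def poll(self):
            return {"gesture": "thumbs_up", "confidence": 0.9}

    class GestureAdapter(object):
        # Normalizes either backend to the one call the app relies on.
        def __init__(self, backend):
            self._backend = backend

        def current_gesture(self):
            if hasattr(self._backend, "query_gesture"):
                return self._backend.query_gesture()
            return self._backend.poll()["gesture"]

    for backend in (OldGestureModule(), NewHandTracker()):
        print(GestureAdapter(backend).current_gesture())  # "thumbs_up" twice

Both backends "support the gesture function", but code written against one API won't run against the other without that shim.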

     ...Tim...

I seem to remember that some alpha testers found that camera code that worked with the 2013 SDK didn't work properly with the early version of the 2014 SDK.  The advice given at the webinar was to wait until the new camera and SDK come out and *then* work on incorporating RealSense into one's apps, instead of trying to adapt the 2013 code.
