We're working on a telepresence app. The user at a scene of interest has a camera feed that is displayed locally and streamed to a remote user. The remote user can watch the feed and send information back by drawing on it; the drawing is shown to the user at the scene. For example, a remote nurse can draw on a home health aide's camera feed of a patient, showing the aide exactly where the patient needs support to avoid bed sores, or where a closer look is needed. Other use cases include a girlfriend helping a boyfriend shop on a business trip, and other exciting telepresence scenarios!
Use of Intel Tools/SDK:
We've downloaded and installed Beacon Mountain and Intel XDK. We're porting a WebRTC video-call native library and sample app ( http://webrtc.googlecode.com/svn/trunk ) from armv7 Android to x86. The app currently builds and runs on devices like the HTC One, Samsung Galaxy Note, and the Intel-powered Samsung Galaxy Tab 3. It uses a native library, armeabi-v7a/libwebrtc-video-demo-jni.so, that we're trying to compile for Intel x86 for better performance. Performance is very important for a good user experience in video apps! We also have a shared drawing demo written in HTML5 that runs in Chrome, which is available on platforms like Android and laptops.
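Since the x86 .so isn't built yet, the app has to cope with whichever ABI folder the APK packager installed. Here is a minimal sketch, assuming nothing beyond the standard JDK: the class and method names are ours (not from the sample app), and on a real device `System.loadLibrary` picks the right `libs/<abi>/` folder automatically; the fallback just keeps the app from crashing when the library is absent.

```java
public class NativeLoader {
    /** Guess which ABI folder we expect the .so to come from, based on os.arch. */
    static String expectedAbi() {
        String arch = System.getProperty("os.arch").toLowerCase();
        if (arch.contains("x86") || arch.contains("amd64") || arch.contains("i686")) {
            return "x86";
        }
        return "armeabi-v7a";
    }

    /** Try to load the JNI library; return false instead of crashing if it's missing. */
    static boolean loadVideoJni() {
        try {
            // Same call for every ABI; Android resolves it to libs/<abi>/libwebrtc-video-demo-jni.so
            System.loadLibrary("webrtc-video-demo-jni");
            return true;
        } catch (UnsatisfiedLinkError e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("expected ABI dir: " + expectedAbi());
        System.out.println("native library loaded: " + loadVideoJni());
    }
}
```

On an Intel device the loader would then transparently pick up the x86 build once we ship it, with no Java-side changes.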
Stand out features:
This app is helping to bring native-performance video calling to all developers, and puts together a functional use case that can really help people work together effectively, save lives, and live happier. We're testing on cutting-edge devices like Google Glass and the Galaxy Tab 3. Right now the only video calling available on Google Glass is the very plain and limited Google Hangouts, so this is very exciting.
Gaps / Areas Required to Further this App:
We're still working on getting the compilation to produce the native x86 library. WebRTC uses a complex build system based on gclient and ninja, which makes the port more involved than just changing the CPU ABI setting in the Android.mk file.
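For reference, retargeting the trunk build from armv7 to x86 roughly looks like the sketch below. This assumes the GYP-era WebRTC/Chromium build; the define names and output directory are our assumptions from that toolchain and should be checked against the checkout's build docs.

```shell
# Sketch: retarget the WebRTC trunk checkout from armv7 to x86.
# Assumption: GYP-era build; define/target names may differ in your checkout.
export GYP_DEFINES="OS=android target_arch=ia32"
echo "building with: $GYP_DEFINES"
# From the trunk checkout, the remaining steps would then be:
#   gclient runhooks        # regenerate the ninja build files for x86
#   ninja -C out/Release    # build, yielding the .so to ship under libs/x86/
```

The point is that the ABI choice lives in the GYP defines consumed by gclient's hooks, not in Android.mk, which is why flipping the ABI setting alone doesn't work.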