This project explores intuitive methods for user interaction and feedback in a volumetric 3D environment. Applications could potentially include navigation through web content, social media, or file browsing. Target platforms could potentially include standalone desktop applications, browser apps, or mobile devices with embedded hardware.
The development environment is C# (Mono .NET) in the Unity3D engine. A 3D volumetric particle field is generated within the bounds of the touch area. Touch input from one or both hands drives Newtonian force fields (intensified by hand velocity) that disturb the particle field, providing spatial feedback. The particles resist the input physics by returning to their natural distribution.

The prototype also compares static camera positions against camera views that trail the primary hand. The trailing option was found to provide superior spatial feedback and surprisingly precise input control.

To demonstrate UI functionality, the prototype includes content from live RSS streams, which is serialized and displayed context-sensitively. Live content is also pulled from a preset Twitter feed (this may require an overhaul as Twitter moves to a new API). Both sources require formatting to remove XML / HTML markup, which the 3D engine does not natively recognize. 3D geometry for the UI assets was modeled in Autodesk 3D Studio Max, and the 2D artwork was developed in Photoshop.
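The touch-driven disturbance described above could be sketched as a per-particle Unity behaviour. This is a minimal, hypothetical sketch, not the project's actual implementation: it assumes an inverse-square repulsion from a touch-driven hand transform, scaled by hand velocity, with a damped spring pulling each particle back to its rest position. All class, field, and constant names are illustrative.

```csharp
using UnityEngine;

// Hypothetical sketch: a particle repelled by a touch-driven "hand"
// transform that springs back toward its natural rest position.
// Constants and the velocity scaling are illustrative assumptions.
public class FieldParticle : MonoBehaviour
{
    public Transform hand;             // driven by touch input
    public float fieldStrength = 2f;   // base repulsion strength
    public float returnStiffness = 4f; // pull back to rest position
    public float damping = 0.95f;

    Vector3 restPosition;
    Vector3 velocity;
    Vector3 lastHandPos;

    void Start()
    {
        restPosition = transform.position; // natural distribution point
        lastHandPos = hand.position;
    }

    void Update()
    {
        float dt = Mathf.Max(Time.deltaTime, 1e-5f);

        // Hand velocity intensifies the disturbance.
        Vector3 handVel = (hand.position - lastHandPos) / dt;
        lastHandPos = hand.position;

        // Inverse-square repulsion away from the hand.
        Vector3 offset = transform.position - hand.position;
        float distSq = Mathf.Max(offset.sqrMagnitude, 0.01f);
        Vector3 repulsion =
            offset.normalized * fieldStrength * (1f + handVel.magnitude) / distSq;

        // Spring back toward the rest position, with damping.
        Vector3 spring = (restPosition - transform.position) * returnStiffness;

        velocity += (repulsion + spring) * dt;
        velocity *= damping;
        transform.position += velocity * dt;
    }
}
```

A real particle field would more likely update many particles in one manager (or a particle system) rather than one MonoBehaviour each, but the force model is the same.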
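The trailing-camera option could be approximated by easing the camera toward a point offset from the primary hand each frame. A hypothetical sketch, assuming a simple Lerp-based follow; the offset and smoothing values are illustrative, not measured from the prototype:

```csharp
using UnityEngine;

// Hypothetical sketch of a camera that trails the primary hand
// instead of remaining static. Offset and smoothing are assumptions.
public class TrailingCamera : MonoBehaviour
{
    public Transform primaryHand;
    public Vector3 offset = new Vector3(0f, 0.5f, -2f);
    public float smoothing = 3f;

    void LateUpdate()
    {
        Vector3 target = primaryHand.position + offset;
        transform.position =
            Vector3.Lerp(transform.position, target, smoothing * Time.deltaTime);
        transform.LookAt(primaryHand);
    }
}
```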
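Stripping the XML / HTML markup from feed text before handing it to the engine's text renderer could look something like the following. This is a naive, hypothetical sketch: it assumes feed markup is simple enough for a regex tag strip plus entity decoding, rather than a full HTML parser.

```csharp
using System.Net;
using System.Text.RegularExpressions;

static class FeedText
{
    // Removes tags and decodes entities (e.g. &amp;) so the 3D engine
    // receives plain text. A naive single-pass approach; assumes the
    // RSS/Twitter markup contains no nested CDATA or script content.
    public static string ToPlainText(string markup)
    {
        string noTags = Regex.Replace(markup, "<[^>]+>", string.Empty);
        return WebUtility.HtmlDecode(noTags).Trim();
    }
}
```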