This article, Part 1 of a series, describes how to add Intel RealSense SDK features to a game character in massively multiplayer online role-playing games (MMORPGs) on UE3 by using C++ rather than UnrealScript. We focus on using the Intel RealSense SDK to manipulate the character's facial bone structure (face rigging), to address performance and workload concerns.
This article is Part 2 of a series on how to add Intel RealSense SDK features to a game character in massively multiplayer online role-playing games (MMORPGs) on UE3 by using C++ rather than UnrealScript. Learn how to set up Visual Studio for the game example here.
Follow Pawel L. to learn about Intel's graphics driver support for the emerging Vulkan* graphics API. He'll be providing several tutorials along with GitHub source code.
The Intel® RealSense™ camera is a vital tool for creating VR and AR projects. Part 2 of this article lays out how to use the Intel RealSense camera nodes in TouchDesigner to set up a render or real-time projections for single-screen, multi-screen, 180-degree (FullDome), and 360-degree VR renders. In addition, the Intel RealSense camera information can be sent to an Oculus Rift* through the TouchDesigner Oculus Rift TOP node. Includes sample .TOE files.
This paper provides an overview of some key aspects in developing Intel® RealSense™ applications that are portable across the different front-facing cameras: Intel® RealSense™ cameras F200 and SR300. It also details several methods for detecting the set of front- and rear-facing camera devices featured in the Intel RealSense SDK.
RealPerspective is a code sample that utilizes Intel® RealSense™ technology to create a unique game experience in which users can move their head around and have the game's perspective correctly computed. Traditionally this has been done with an RGB camera or IR trackers, but with the Intel RealSense camera's depth information, the developer gets accurate face tracking without any additional hardware on the user's system. The sample accomplishes the effect by implementing an off-axis perspective projection, using as inputs the face's spatial X, Y position and the face's average depth.
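The off-axis projection mentioned above can be sketched as an asymmetric view frustum whose bounds shift with the tracked head position. The following is a minimal illustrative sketch, not the sample's actual code: it assumes the screen is a `w` x `h` rectangle centered on the Z axis, and that `ex`, `ey`, `ez` are the face's tracked X/Y position and average depth relative to the screen center. All names here are hypothetical.

```cpp
#include <cassert>
#include <cmath>

// Hypothetical sketch of an off-axis (asymmetric) frustum.
// Assumes: screen is a w x h rectangle at distance ez from the eye,
// (ex, ey) is the tracked head offset from the screen center,
// nearZ is the near clipping plane distance.
struct Frustum { float left, right, bottom, top, nearZ; };

Frustum OffAxisFrustum(float ex, float ey, float ez,
                       float w, float h, float nearZ)
{
    // Project the screen edges, as seen from the eye,
    // onto the near plane by similar triangles.
    float s = nearZ / ez;
    return { (-w * 0.5f - ex) * s,   // left edge relative to eye
             ( w * 0.5f - ex) * s,   // right edge relative to eye
             (-h * 0.5f - ey) * s,   // bottom edge relative to eye
             ( h * 0.5f - ey) * s,   // top edge relative to eye
             nearZ };
}
```

With the head centered, this reduces to an ordinary symmetric frustum; as the head moves, the frustum skews in the opposite direction, which is what produces the "looking through a window" effect. The resulting bounds can be fed to a standard asymmetric projection matrix (e.g. a `glFrustum`-style construction).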
This article provides an introduction to autonomous navigation and its use in augmented reality applications, with a focus on agents that move and navigate. Autonomous agents are entities that act independently using artificial intelligence, which defines the operational parameters and rules by which the agent must abide. The agent responds dynamically in real time to its environment, so even a simple design can result in complex behavior. An example is developed that uses the Intel RealSense camera R200 and the Unity* 3D Game Engine.
In game development, middleware can be the software between the kernel and the UX, or it can be software that makes game development easier by providing additional services, features, and functionality. Whether you are looking for an entire game engine to develop your idea into a game, or an efficient easy-to-use video codec to deploy full motion video, this list will guide you to the best middleware to use while developing your game for Intel® architecture.