If you just got here and don't yet know that this post is part of our series for the Intel Ultimate Code Challenge VR, check out our last article!
Curious to see what we have been doing this past week? Just watch the video!
Now that you know what we are talking about, I have a question for you:
Are you familiar with the 70-20-10 Model? In short, it's a learning and development model that breaks down how people learn most effectively: roughly 70% of learning comes from hands-on practice, 20% from interaction with others, and only 10% from formal instruction.
This is one of the reasons Virtual Reality is so amazing. Using the latest generation of headsets, like the Windows Mixed Reality devices, we have access to 6DOF (six degrees of freedom), enabling us not only to look into the virtual world but also to walk around inside it. This creates an even stronger sense of presence and, combined with motion controllers, gives the user an experience very close to real life.
As stated before, our main focus in this project is to create a virtual laboratory that can teach students about several subjects. We believe a crime investigation is the perfect case study, since it mixes several areas of knowledge, such as Biology, Chemistry, and Physics.
The user will be taken from a crime scene, a teenager's bedroom, to the forensic laboratory. Using Virtual Reality, the player can actually reach for the clues around the room, flagging and collecting important objects and materials. After this meticulous process, they can go to the lab and analyze the clues they gathered.
In order to achieve a photorealistic experience and still deliver this proof of concept within an eight-week timeframe, we chose Epic Games*' Unreal* Engine 4. This was not only due to the team's extensive experience with the tool, but also due to the impressive number of features it provides.
For instance, since version 4.13, UE4 has shipped with a template for VR experiences. A very nice guide for this template can be found here.
As noted, one of the most useful features is object grabbing. Grabbable objects implement a common interface, so the hands can call specific methods on whatever they touch, enabling interaction with the world around the player.
Blueprints are a visual scripting language that makes it very easy to prototype interactions quickly. It is common to see artists using it without even realizing they are writing code. These graphs can also be copied and pasted, since they are serialized as plain text, and dedicated websites make it easy to share snippets with other developers.
As you can see from the video, we have already implemented some methods and interactions, but most are still in an early stage. In our next posts we will dig deeper into the technical challenges and how we solved them.
If you are also starting to develop for VR, I recommend taking a look at this article on ways of minimizing cyber sickness and best practices for placing interfaces. Another great source of knowledge about Virtual Reality is the Intel Guidelines; if you haven't checked them out, do it right now.