Can basic coding concepts be taught in virtual reality? We believe the answer is yes, and in fact we are building the world's first VR application to teach computer science basics in a virtual environment. This initiative has been made possible thanks to the generous support of Intel and our extensive experience teaching coding at Zenva Academy.
How can we make teaching computer science fun and engaging? Robots that teach kids how to code have been an inspiration. Presenting computer science to the public in a way that isn't intimidating is certainly a challenge. The beauty of virtual reality is that it lets us create an entire universe without the limitations and constraints of traditional teaching methods or physical hardware.
Zenva Sky has taken a turn this week by focusing on the idea of programming a robotic vehicle which will introduce users to progressively more advanced computer science concepts. Keep on reading to find out what changed from our last update and what areas we’ll be developing next!
Each challenge/level consists of a grid-based area that a robotic vehicle must navigate in order to reach the goal. It's important to mention that the models you see below are placeholders used for development purposes – the final art style of Zenva Sky hasn't been defined yet, and it will most likely look radically different from what you see here.
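To make the grid-based idea concrete, here is a minimal sketch of how such a level could be represented. This is purely illustrative – Zenva Sky's actual level format is not public, and all the names and values below are assumptions:

```python
# Hypothetical representation of a grid-based level: a start cell,
# a goal cell, and a set of cells the vehicle cannot enter.
LEVEL_1 = {
    "width": 5,
    "height": 5,
    "start": (0, 0),              # robot's starting cell (x, y)
    "goal": (4, 4),               # cell the robot must reach
    "blocked": {(2, 2), (3, 1)},  # impassable cells
}

def is_walkable(level, cell):
    """A cell is walkable if it lies inside the grid and is not blocked."""
    x, y = cell
    return (0 <= x < level["width"]
            and 0 <= y < level["height"]
            and cell not in level["blocked"])
```

With a helper like `is_walkable`, the game loop can reject moves that would drive the vehicle off the grid or into an obstacle.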
The user is placed inside the vehicle. They can enter different commands into their robot from the cabin's dashboard, and each command is added to the Program panel. The program's job is to drive the robot all the way to the level's goal.
We built the first level, which can be solved using only movement and rotation. The solution consists of the following moves:
These translate into the following commands:
In order to create such a program, the user needs to press the buttons on their dashboard using their hand-tracked Mixed Reality controllers -- just like they would in a real vehicle!
Previously, we used "laser pointers" for UI interactions. However, it feels more natural in VR to actually press buttons and use your hands to interact with the different elements, rather than relying on a "mouse cursor" analogy (which is what VR laser pointers are, in a way):
If the user makes a mistake, they can remove a command from the Program by touching its "X" button with their hand-tracked controller:
When the Program is ready, the user presses the "Execute" button on the Program panel, which makes the vehicle move and run every command on the list.
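The loop described above – queue commands from the dashboard, then execute them in order – can be sketched as follows. Note this is an illustrative model, not Zenva Sky's actual code; the command names and class design are assumptions:

```python
# Headings as (dx, dy) unit vectors; rotating right cycles through them.
HEADINGS = [(0, 1), (1, 0), (0, -1), (-1, 0)]  # N, E, S, W

class Robot:
    def __init__(self, x=0, y=0, heading=0):
        self.x, self.y = x, y
        self.heading = heading        # index into HEADINGS
        self.program = []             # the "Program panel"

    def add_command(self, command):
        self.program.append(command)  # pressing a dashboard button

    def remove_command(self, index):
        del self.program[index]       # pressing a command's "X" button

    def execute(self):
        """Run every queued command in order, like the Execute button."""
        for command in self.program:
            if command == "MOVE":
                dx, dy = HEADINGS[self.heading]
                self.x += dx
                self.y += dy
            elif command == "ROTATE_LEFT":
                self.heading = (self.heading - 1) % 4
            elif command == "ROTATE_RIGHT":
                self.heading = (self.heading + 1) % 4

# Queue a short program, then execute it:
robot = Robot()
for cmd in ["MOVE", "ROTATE_RIGHT", "MOVE", "MOVE"]:
    robot.add_command(cmd)
robot.execute()
# robot ends at (2, 1), facing east
```

Keeping the program as a plain list makes both editing (remove by index) and execution (iterate in order) trivial, which mirrors the Program panel's behavior.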
Simulator sickness (also known as "motion sickness") is a problem for many VR users. These are some of the steps we've taken so far to mitigate it:
Since we started this project, having a playable level has been a main concern. If you've followed our past updates, you'll know we hit a few roadblocks with our previous prototype. The current gameplay feels right, and finally having that first level complete is a big milestone for us!
The next thing we’ll implement is the ability to activate boolean gates, so that we can develop a few more challenges that incorporate boolean logic on top of the existing move and rotate commands.