Breathing Life into VR: Low-Cost Animation Systems!

Our journey so far on the development of Minerva Lab has been amazing! What? You do not know the deeds of our adventures? Check it here to understand what we have been building. Now that the finish line has been reached, I want to show you how amazing the work produced by our dedicated team is:

The subject I want to highlight this week is animation. Surely you saw our little robot owl (BUBO) moving and "talking" in the video above. These movements are what really put life into a digital character: a digital expression of a being that once existed only in our imaginations, finally brought to life.

Pixar Lamp Line Art GIF by Tsuriel (Dribbble)

Is it not amazing that even a simple lamp can have this much personality? Of course, this is the genius of Pixar, nothing less than the company behind the world's first fully computer-animated feature film, Toy Story.

But animation has a long history behind it, from old flip books, where the same drawing had to be copied with minimal differences between pages to create a sense of movement, to the digital era, where copy and paste is as simple as pressing two buttons.

Dragon Ball's Goku (Bandai) vs. DC's Superman in a flipbook animation by etoilec1

This process gave us amazing results for decades, from animated cartoons to most of the great 2D game franchises, like Nintendo's Mario or Sega's Sonic. Drawing an image for each frame and then playing the sequence back quickly creates the illusion of movement: attacking, defending, casting spells, and so on.
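To make the idea concrete, here is a minimal sketch of frame-by-frame playback: pick which frame of a cycle to draw based on elapsed time. The frame file names and the 12 fps rate are illustrative, not from any specific engine.

```python
# A toy frame-by-frame animation sketch: cycling through a list of frame
# images at a fixed rate creates the illusion of movement.
WALK_CYCLE = ["walk_0.png", "walk_1.png", "walk_2.png", "walk_3.png"]
FPS = 12  # classic hand-drawn animation often runs at 12 or 24 fps

def frame_at(frames, elapsed_seconds, fps=FPS):
    """Pick which frame of the cycle to draw at a given moment."""
    index = int(elapsed_seconds * fps) % len(frames)
    return frames[index]

# After half a second at 12 fps we are on frame 6 % 4 = 2 of the cycle.
print(frame_at(WALK_CYCLE, 0.5))  # walk_2.png
```

The cost of this approach is exactly the problem described below: every pose of every action needs its own image, so the asset size grows linearly with the number of animations.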

Mario and Yoshi from Nintendo

Nevertheless, when this process was adapted to 3D engines the results were not that great: the need for much higher framerates meant more animation data, which kept inflating the size of the models. That was something to be strongly avoided in the days of physical distribution, when a CD imposed a hard limit of about 640 MB.

As an alternative, developers began to use a technique known as skeletal animation: instead of animating each model frame by frame, you create a skeleton (a rig) that can be assigned to several models and animated once. The movements of the skeleton are translated into deformations of the model's mesh, and can produce much more natural motion.
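The standard way those bone movements deform the mesh is linear blend skinning: each vertex follows a weighted mix of its bones' transforms. Here is a toy 2D sketch of that idea; the bone angles and weights are made up for illustration.

```python
import math

def rotate(point, angle):
    """Rotate a 2D point around the origin (one bone's transform)."""
    x, y = point
    c, s = math.cos(angle), math.sin(angle)
    return (c * x - s * y, s * x + c * y)

def skin_vertex(vertex, influences):
    """Linear blend skinning for one vertex.

    influences: list of (weight, bone_angle) pairs; weights sum to 1.
    The deformed vertex is the weighted sum of the per-bone results.
    """
    x = y = 0.0
    for weight, angle in influences:
        px, py = rotate(vertex, angle)
        x += weight * px
        y += weight * py
    return (x, y)

# A vertex influenced equally by two bones follows both: one bone stays
# still, the other rotates 90 degrees, so the vertex lands in between.
v = skin_vertex((1.0, 0.0), [(0.5, 0.0), (0.5, math.pi / 2)])
```

Real engines use full 3D bone matrices and a bind pose rather than simple rotations, but the weighted-sum principle is the same, and it is why one set of animations can drive many different meshes.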

Furthermore, skeletons are pretty useful for inverse kinematics: I can lock the hand of a character onto an object, let's say a shield, and all the other bones recalculate their positions whenever the shield moves.
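One popular way to solve this is the FABRIK algorithm: repeatedly snap the end of the chain to the target, then re-anchor the root, pulling the joints along each time. This is a sketch of that idea in 2D, not the solver any particular engine uses; the arm layout is invented for the example.

```python
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def fabrik(joints, target, iterations=20):
    """Move the last joint of a chain to `target`, preserving bone lengths."""
    joints = [list(j) for j in joints]
    lengths = [dist(joints[i], joints[i + 1]) for i in range(len(joints) - 1)]
    root = list(joints[0])
    for _ in range(iterations):
        # Backward pass: snap the hand to the target, pull parents along.
        joints[-1] = list(target)
        for i in range(len(joints) - 2, -1, -1):
            t = lengths[i] / dist(joints[i], joints[i + 1])
            joints[i] = [joints[i + 1][k] + t * (joints[i][k] - joints[i + 1][k])
                         for k in range(2)]
        # Forward pass: re-anchor the shoulder, pull children along.
        joints[0] = list(root)
        for i in range(len(joints) - 1):
            t = lengths[i] / dist(joints[i], joints[i + 1])
            joints[i + 1] = [joints[i][k] + t * (joints[i + 1][k] - joints[i][k])
                             for k in range(2)]
    return joints

arm = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]  # shoulder, elbow, hand
posed = fabrik(arm, target=(1.2, 1.2))      # "grab the shield" at (1.2, 1.2)
```

After solving, the hand sits on the target while the shoulder stays anchored and both bone lengths are unchanged, which is exactly the "lock the hand, let the arm follow" behaviour described above.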

One of the first games to achieve great success with this kind of technology was Valve's Half-Life, which was also a very moddable engine, spawning other well-known franchises such as Counter-Strike, Team Fortress and Portal. In all these games, especially their older versions, you can see that many animations were shared; for instance, the Counter-Terrorists and Terrorists in CS use exactly the same animations for moving and shooting.

Nowadays, most game engines and 3D modelling programs support skeletal animation, and it has been widely adopted for 2D as well. Programs like Spine or Marionette Studio make it possible to have similar development pipelines for 2D games and are changing how many developers approach 2D animation.

OK, but how can knowing this help you make the best of your VR experience? To start, you should look for an experienced animator for your team. Remember: just because someone can create 3D models does not mean the same person knows how to animate properly. These are two separate crafts, and it is rare to find someone who truly enjoys and is good at both.

Second, rejoice! Yes, rejoice! Thanks to this amazing skeletal system, you can reuse other people's animations in your own experience at a very low cost! Never heard of Mixamo? It is a free tool that not only offers an enormous number of animations, most of them created with motion-capture systems that give very authentic movement, but also has an auto-rigging tool that lets you upload your own model and see it moving with their animations.

Not only that: Unreal Engine ships with a lot of animations in its learning content and content examples. In fact, Epic's documentation on how powerful the engine is regarding animation is overwhelming. You can blend animations, change aspects of the skeleton in real time to precisely attach objects to it (using sockets), and build very complex inverse kinematics.
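The core idea behind animation blending (what Unreal's blend spaces build on) is simple interpolation between poses. Here is a minimal sketch with poses reduced to per-bone angles; the bone names and values are made up, and real engines interpolate quaternions rather than raw angles.

```python
# A toy sketch of animation blending: interpolate two poses, given here
# as per-bone rotation angles in degrees, by a blend weight alpha.
def blend_poses(pose_a, pose_b, alpha):
    """Linearly blend two poses; alpha=0 gives pose_a, alpha=1 gives pose_b."""
    return {bone: (1 - alpha) * pose_a[bone] + alpha * pose_b[bone]
            for bone in pose_a}

walk = {"hip": 10.0, "knee": 20.0}  # illustrative values
run = {"hip": 30.0, "knee": 60.0}

# Halfway between walking and running, e.g. driven by the character's speed.
half = blend_poses(walk, run, 0.5)  # {'hip': 20.0, 'knee': 40.0}
```

Driving `alpha` from a gameplay parameter such as movement speed is what lets a character shift smoothly from a walk cycle into a run cycle instead of snapping between them.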

In previous work, the VRMonkey team has used the IKinema plugin, which integrates easily with Unreal Engine and allows very precise and realistic inverse kinematics. Its first use is to make characters walk more naturally over terrain with height variation, but the plugin can do much more.

With one of their latest releases, Orion, you can even build a low-cost motion-capture system. That means you can now produce custom animations for less than US$ 2,000.00, using an HTC Vive and some Vive Trackers.

Last but not least, with the new iPhone X IR cameras we can now do face tracking with a relatively low-cost system. Both Unreal and Unity have released tutorials showing how to do both real-time tracking and recording an animation for later use. There are even apps on the App Store that record facial movement and save it as an FBX file.

That is it for today, guys, I really hope you have enjoyed it! Please remember to follow our Facebook page to stay updated with the latest news from our projects, and do not hesitate to comment below and share your thoughts! Thank you for reading!