2016 was a landmark year for virtual reality, but 2017 will be nothing short of surreal. As we look to CES and beyond, it’s also a good time to look back. Here are the top 10 stories from our blog in 2016.
DEC. 5: The next generation of mobile VR headsets will feature new sensors with higher performance, lower power consumption, and a 180×180-degree field of view.
FEB. 17: The rise of VR means that our dreams of interacting with digital content on a physical level are coming to life. But to make that happen, you need a more natural interface. You need the power and complexity of the human hand.
Interacting with digital content on a physical level starts with the human hand. This is why we created Orion, software built from the ground up to tackle the unique challenges of hand tracking for VR. The reaction from our community was incredible.
Five releases later, we continue to improve our hand tracking technology for the next generation of headsets.
MARCH 2: To match the new capabilities of Leap Motion Orion with the performance demands of VR, we gave our Unity toolset a ground-up overhaul.
MARCH 31: Originally an independent project, getnamo’s plugin got an Epic seal of approval with an official integration in Unreal Engine 4.11. Stay tuned for more updates in 2017.
JUNE 1: The Hands Module adds a range of example hands to your Unity toolkit. Version 2.0 features an autorigging function so you can bring hand designs to life in two minutes or less.
JUNE 11: Featuring buttons, toggles, sliders, and experimental interactions, the UI Input Module makes it simple to create tactile user interfaces in VR.
AUG. 23: Game physics engines were never designed for human hands. But by exploring the grey areas between real-world and digital physics, we can build a more human experience: a virtual world where you can reach out and grab something – a block, a teapot, a planet – and simply pick it up.
However, the Interaction Engine unlocks more than just grabbing interactions. It also takes in the dynamic context of the objects your hands are near. Learn more in Martin Schubert’s post on building with the Interaction Engine.
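The idea of grabbing that depends on the dynamic context of nearby objects can be illustrated with a simplified, hypothetical sketch (this is not the actual Interaction Engine API – all names here are invented for illustration): each object defines its own grab radius, and the grab target is the closest object a hand is actually in reach of.

```python
# Conceptual sketch only - not the Leap Motion Interaction Engine API.
# Illustrates context-dependent grabbing: each object carries its own
# grab radius, and the hand's target is the nearest object in reach.
from dataclasses import dataclass
import math


@dataclass
class InteractionObject:
    name: str
    position: tuple       # (x, y, z) in meters
    grab_radius: float    # how close a hand must be to grab this object


def distance(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))


def nearest_grabbable(hand_pos, objects):
    """Return the closest object within its own grab radius, or None."""
    in_reach = [
        (distance(hand_pos, obj.position), obj)
        for obj in objects
        if distance(hand_pos, obj.position) <= obj.grab_radius
    ]
    return min(in_reach, key=lambda pair: pair[0])[1] if in_reach else None


# Example scene: a small block and a larger teapot.
scene = [
    InteractionObject("block", (0.1, 0.0, 0.0), grab_radius=0.05),
    InteractionObject("teapot", (0.0, 0.2, 0.0), grab_radius=0.08),
]

# A hand near the teapot grabs the teapot, not the (closer-radius) block.
target = nearest_grabbable((0.02, 0.15, 0.0), scene)
```

The key design point the sketch mirrors is that grab behavior is a property of the object as well as the hand, so different objects can respond differently to the same gesture.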
SEPT. 2: This rapid overview of our Unity Core Assets and Modules covers everything from custom-designed hands and user interfaces to event triggers. Do you know what each Module can do?
SEPT. 8: Breaking into VR development doesn’t need to break the bank. If you have a newer Android phone and a good gaming computer, it’s possible to prototype, test, and bring your hands into mobile VR.
Explorations will continue through 2017 with sound design, avatars, locomotion, and more. See you in the new year!