2016 was a landmark year for virtual reality, but 2017 will be nothing short of surreal. As we look to CES and beyond, it’s also a good time to look back. Here are the top 10 stories from our blog in 2016.

The future of #VR is mobile. This week @LeapMotion will be at @CES w/ their next-gen reference design.

DEC. 5: The next generation of mobile headsets will feature new sensors with higher performance, lower power consumption, and a 180×180-degree field of view for hand tracking.

Our team will be at CES January 5-8 with our Leap Motion Mobile Platform reference design. You can join us on the show floor or follow @LeapMotion to experience the future of VR/AR.
 

FEB. 17: The rise of VR means that our dreams of interacting with digital content on a physical level are coming to life. But to make that happen, you need a more natural interface. You need the power and complexity of the human hand.

Interacting with digital content on a physical level starts with the human hand. This is why we created Orion, which is built from the ground up to tackle the unique challenges of hand tracking for VR. The reaction from our community was incredible.

Five releases later, we continue to improve our hand tracking technology for the next generation of headsets.
 

MARCH 2: To match the new capabilities of Leap Motion Orion with the performance demands of VR, we gave our Unity toolset an overhaul from the ground up.
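To give a sense of what working with the updated toolset looks like, here is a minimal sketch that reads tracked hand data from a provider in the scene each frame. The namespaces, the LeapServiceProvider component, and the ToVector3() helper are assumptions based on the Unity Core Assets of this era rather than a definitive reference.

using UnityEngine;
using Leap;        // assumed core tracking namespace
using Leap.Unity;  // assumed Unity integration namespace

public class HandLogger : MonoBehaviour
{
    public LeapServiceProvider provider; // assumed provider component from the Core Assets

    void Update()
    {
        // Pull the latest tracking frame and report each visible hand's palm position.
        Frame frame = provider.CurrentFrame;
        foreach (Hand hand in frame.Hands)
        {
            Debug.Log((hand.IsLeft ? "Left" : "Right") +
                      " palm at " + hand.PalmPosition.ToVector3());
        }
    }
}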
 

MARCH 1: Originally an independent project, getnamo’s plugin got an Epic seal of approval with an official integration in Unreal Engine 4.11. Stay tuned for more updates in 2017.
 

JUNE 1: The Hands Module adds a range of example hands to your Unity toolkit. Version 2.0 features an autorigging function so you can bring hand designs to life in two minutes or less.
 

JUNE 11: Featuring buttons, toggles, sliders, and experimental interactions, the UI Input Module makes it simple to build tactile user interfaces in VR.
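Here is a minimal sketch of how that can slot into an existing project, assuming the module routes hand input through Unity’s standard UI event system so that ordinary Button callbacks fire when a tracked hand presses them. The component and field names below are illustrative, not part of the module itself.

using UnityEngine;
using UnityEngine.UI;

public class ToggleLight : MonoBehaviour
{
    public Button lightButton; // a world-space UI button, assumed to be pressed by a tracked hand
    public Light sceneLight;   // the light we toggle

    void Start()
    {
        // Subscribe exactly as you would for mouse or gaze input;
        // the hand-driven press is expected to raise the same onClick event.
        lightButton.onClick.AddListener(() =>
        {
            sceneLight.enabled = !sceneLight.enabled;
        });
    }
}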
 

AUG. 23: Game physics engines were never designed for human hands. But by exploring the grey areas between real-world and digital physics, we can build a more human experience: a virtual world where you can reach out and grab something – a block, a teapot, a planet – and simply pick it up.

However, the Interaction Engine unlocks more than just grabbing interactions. It also takes into account the dynamic context of the objects your hands are near. Learn more in Martin Schubert’s post on building with the Interaction Engine.
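As a rough illustration of what that looks like in a scene, the sketch below makes an object grabbable at runtime. The Leap.Unity.Interaction namespace and the InteractionBehaviour component name are assumptions about the Interaction Engine’s Unity integration; treat this as a sketch rather than a recipe.

using UnityEngine;
using Leap.Unity.Interaction; // assumed Interaction Engine namespace

public class MakeGrabbable : MonoBehaviour
{
    void Start()
    {
        // The Interaction Engine is assumed to drive objects through a per-object
        // behaviour that registers the Rigidbody so hands can grab, hold, and throw it.
        if (GetComponent<Rigidbody>() == null)
        {
            gameObject.AddComponent<Rigidbody>();
        }
        gameObject.AddComponent<InteractionBehaviour>();
    }
}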
 

SEPT. 2: This rapid overview of our Unity Core Assets and Modules covers everything from custom-designed hands and user interfaces to event triggers. Do you know what each Module can do?
 

SEPT. 8: Breaking into VR development doesn’t need to break the bank. If you have a newer Android phone and a good gaming computer, it’s possible to prototype, test, and bring your hands into mobile VR.
 

NOV. 17: Explorations in VR Design is a journey through the bleeding edge of VR design – from architecting a space and designing interfaces to making users feel powerful.

Explorations will continue through 2017 with sound design, avatars, locomotion, and more. See you in the new year!




