True hand presence in VR is incredibly powerful – and easier than ever. With the Leap Motion Unity Core Assets and Modules, you can start building right away with features like custom-designed hands, user interfaces, and event triggers. Each Module is designed to unlock new capabilities in your VR project, and to work with the others for more advanced combinations.
In this post, we’ll take a quick look at our Core Assets, followed by the Modules. Each section includes links to more information, including high-level overviews, documentation, and examples. The Core Assets and Modules themselves all include demo scenes, which are often the best way to get started.
Leap Motion Core Assets
The Leap Motion Unity assets provide an easy way to bring hands into a Unity game. Since they’re built on the native VR integration included in Unity 5.4, they support both the Oculus Rift and HTC Vive. Setup is fast and easy, taking less than a minute.
Our new Orion Core Assets have been massively optimized for VR, with features like persistent hands in the Editor, greatly simplified workflows, and the ability to easily toggle through different sets of hands. For a more in-depth perspective on how the Core Assets are architected, see our posts Redesigning Our Unity Core Assets: Part 1 and Part 2.
Modules are powerful extensions built on the Core Assets. With Modules, you can unlock a wide range of capabilities in your project.
Interaction Engine
Imagine an experience where you can reach out and grab any object – a block, a teapot, a planet – and simply pick it up. Your fingers phase through the material, but the object still feels real. Like it has weight.
The Interaction Engine is a fundamental tool that enables Unity developers to rapidly build object interactions that feel more human. It implements an alternate set of customizable physics rules that take over when your hands are embedded inside a virtual object.
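As a minimal sketch, making an object graspable with the Interaction Engine could look like the following. This assumes an InteractionManager already exists in the scene, as in the module’s demo scenes, and uses the InteractionBehaviour component to hand the object’s physics over to the engine.

```csharp
using UnityEngine;
using Leap.Unity.Interaction;

// Sketch: make this GameObject graspable at runtime.
// Assumes an InteractionManager is already present in the scene.
public class MakeGraspable : MonoBehaviour {
  void Start() {
    // The Interaction Engine works on rigidbodies, so ensure one exists.
    if (GetComponent<Rigidbody>() == null) {
      gameObject.AddComponent<Rigidbody>();
    }
    // InteractionBehaviour applies the engine's alternate physics rules
    // whenever a hand is embedded inside this object.
    gameObject.AddComponent<InteractionBehaviour>();
  }
}
```

In practice you would usually add these components in the Editor rather than at runtime; the script above just shows the minimum setup involved.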
With the Leap Motion Orion software, we’ve moved away from touchscreen-like gestures – such as swipe and circle – and towards more physical interactions designed for VR, like pinching and grabbing. Pinching is a powerful interaction that lies at the core of our Blocks demo, and has the ability to drive a wide variety of experiences.
Detectors
Pinching and other hand poses are detected and managed through Detectors – not actually a Module, but a set of scripts within the Core Assets themselves. With Detectors, you can:
- use pinch gestures within your project
- take advantage of hand poses like “thumbs-up”
- create custom hand pose detectors with logic recipes like whether:
- the fingers of a hand are curled or extended
- a finger or palm is pointing in a particular direction
- a hand or fingertip is close to one of a set of target objects
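As a hedged sketch, here’s how a script might respond to a pinch via a PinchDetector. This assumes the Detector scripts expose OnActivate/OnDeactivate UnityEvents, as they do in the Detection Examples scene included with the Core Assets.

```csharp
using UnityEngine;
using Leap.Unity;

// Sketch: react to pinch start/end from a PinchDetector in the scene.
public class PinchLogger : MonoBehaviour {
  [SerializeField] private PinchDetector pinchDetector;

  void OnEnable() {
    // Detectors fire OnActivate when the pose begins and
    // OnDeactivate when it ends.
    pinchDetector.OnActivate.AddListener(OnPinchStart);
    pinchDetector.OnDeactivate.AddListener(OnPinchEnd);
  }

  void OnDisable() {
    pinchDetector.OnActivate.RemoveListener(OnPinchStart);
    pinchDetector.OnDeactivate.RemoveListener(OnPinchEnd);
  }

  void OnPinchStart() { Debug.Log("Pinch started"); }
  void OnPinchEnd()   { Debug.Log("Pinch ended"); }
}
```

For compound poses like “thumbs-up,” the same pattern applies: the Detection Examples scene combines finger-extension and palm-direction detectors through a logic gate, and your script listens to the gate’s events instead.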
Hands Module
In just a few minutes, the Hands Module gives you the power to select from different hand assets or bring your own hand models to life in VR. With the Hands Module, you can:
- access a range of example hands, including:
- highly optimized rigged meshes
- abstract geometric hands
- dynamically generated hands (based on the real-world proportions of the user’s hand)
- autorig a wide array of FBX hand assets
UI Input Module
Fully interactive menus – ones that you can touch with your bare hands – can be enormously compelling. The UI Input Module provides a simplified interface for physically interacting with World Space Canvases in Unity’s UI System. With the UI Input Module, you can:
- build interfaces with buttons and sliders
- design and customize your interface’s appearance and animation effects
- easily set up and modify an event system for your interface
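Because the UI Input Module routes hand input through Unity’s standard UI event system, ordinary UI callbacks work unchanged – a world-space button pressed with a bare hand fires the same events as one clicked with a mouse. A minimal sketch:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch: standard Unity UI callbacks on a world-space canvas.
// The UI Input Module delivers hand presses through the normal
// event system, so no Leap-specific code is needed here.
public class MenuResponder : MonoBehaviour {
  [SerializeField] private Button button;
  [SerializeField] private Slider slider;

  void OnEnable() {
    button.onClick.AddListener(() => Debug.Log("Button pressed by hand"));
    slider.onValueChanged.AddListener(v => Debug.Log("Slider value: " + v));
  }
}
```

The interesting work – raycasting from fingertips, press depth, hover states – happens inside the Module; your scripts only see familiar Unity UI events.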
Attachments Module
Last but not least, the Attachments Module is designed in part to augment and extend the capabilities of the other Modules. With the Attachments Module, you can:
- attach Unity game objects to a hand
- trigger events in the virtual world, using scripts for turning on and off attached game objects (designed to work with Detectors)
- create a wearable menu attached to your arm (in combination with the UI Input Module)
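Combining a Detector with an attached object, a wearable-menu toggle might be sketched like this. The field names are illustrative, not part of the Module’s API; the menu object is assumed to be parented to one of the hand attachment transforms the Module creates in the scene hierarchy.

```csharp
using UnityEngine;
using Leap.Unity;

// Sketch: show/hide a menu attached to the hand when a Detector fires.
// Assumes Detector exposes OnActivate/OnDeactivate UnityEvents, as in
// the Detection Examples scene.
public class AttachedMenuToggle : MonoBehaviour {
  [SerializeField] private Detector detector;       // e.g. a PinchDetector
  [SerializeField] private GameObject attachedMenu; // child of a palm attachment

  void OnEnable() {
    detector.OnActivate.AddListener(ShowMenu);
    detector.OnDeactivate.AddListener(HideMenu);
  }

  void OnDisable() {
    detector.OnActivate.RemoveListener(ShowMenu);
    detector.OnDeactivate.RemoveListener(HideMenu);
  }

  void ShowMenu() { attachedMenu.SetActive(true); }
  void HideMenu() { attachedMenu.SetActive(false); }
}
```

Because the menu follows the attachment transform, it stays anchored to the user’s arm while the UI Input Module handles presses on it – exactly the kind of Module combination described above.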
What new Module would you like to see next? What kinds of experiences can you imagine from combining the existing Modules? Let us know in the comments!