The world is changing – can you hack it? At Leap Motion, we believe that the next wave of technological interfaces will rely on the original human operating system: your hands. Whether you’re giving people the power to grab a skeleton, reach into a human heart, or learn how to program, hands are powerful.
With HackingEDU just around the corner, Leap Motion is sponsoring the world’s largest education hackathon with over 100 Leap Motion Controllers for attendees to use. In the past, our community has built some incredible educational projects that bring a new level of interaction (and fun) to classroom activities. Now it’s your time to hit the ground running and build an awesome project of your own.
While you can find all of our platforms for artists, creative coders, designers, and more on our Platform Integrations & Libraries page, this post covers only some of the most popular hackathon platforms. After all, with just 36 hours to build, you need to ramp up fast!
Getting Started with Leap Motion
The Leap Motion Controller is a small USB sensor that tracks how you naturally move your hands, so you can reach into the world beyond the screen – in virtual reality, in augmented reality, or on your Mac or PC. The hardware itself is fairly simple: three LEDs and two infrared cameras. It can track your hands up to about two feet away, converting the raw image feed into a rich array of tracking data. You even have access to the raw infrared camera feed, letting you create augmented reality experiences.
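To give a feel for that tracking data, here’s a rough sketch of what a single frame carries. The field names (`frame.hands`, `palmPosition` in millimeters) follow LeapJS conventions, but the shapes shown are illustrative rather than the full API:

```javascript
// A rough sketch of per-frame tracking data. Field names follow LeapJS
// conventions (frame.hands, hand.palmPosition in millimeters above the
// sensor), but the exact shapes here are illustrative, not the full API.
const mockFrame = {
  hands: [
    {
      type: "right",
      palmPosition: [12.5, 210.0, -30.2],     // [x, y, z] in mm
      fingers: [{ type: 1, extended: true }], // e.g. an extended index finger
    },
  ],
};

// Summarize a frame: how many hands are visible, and where the first palm is.
function describeFrame(frame) {
  if (frame.hands.length === 0) return "no hands in view";
  const hand = frame.hands[0];
  const [x, y, z] = hand.palmPosition;
  return `${frame.hands.length} hand(s); ${hand.type} palm at ` +
    `(${x.toFixed(0)}, ${y.toFixed(0)}, ${z.toFixed(0)}) mm`;
}

console.log(describeFrame(mockFrame)); // "1 hand(s); right palm at (13, 210, -30) mm"
```

In a real app, frames like this arrive from the device dozens of times per second, so your handler should stay cheap and stateless where possible.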
Design 101: Escaping from Flatland
As a new way of interacting with technology, designing for motion control also involves new ways of thinking about interactions. Physics engines aren’t designed with hands in mind, and traditional UIs are built for 2D screens. Here are some tips that will help you build compelling experiences that feel natural:
Don’t settle for air pokes. Imagine how you would control your computer with your bare hands. Rather than simply using them in the place of a mouse or touchscreen, you can push, pull, and manipulate the digital world in three dimensions!
The sensor is always on. Motion control offers a lot of nuance and power, but unlike with mouse clicks or screen taps, your hand doesn’t have the ability to disappear at will. Avoid the “Midas touch” by including safe poses and zones to allow users to comfortably move their hands around without interacting.
Use easily tracked poses. Whenever possible, encourage users to keep their fingers splayed and hands perpendicular to the field of view. Grab, pinch, and pointing gestures tend to perform well, as long as they’re clearly visible to the controller.
For more tips, check out our Introduction to Motion Control, VR Best Practices Guidelines, and 4 Design Problems for VR Tracking (And How to Solve Them).
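The “Midas touch” and pose tips above can be sketched as a small gating function: only treat a hand as interacting when it makes a deliberate grab or pinch, and ignore hands resting outside an interaction zone. The `grabStrength`/`pinchStrength` names (0 to 1) follow LeapJS conventions; the zone bounds and thresholds are made-up values you would tune per app:

```javascript
// Pose gating sketch: a hand only triggers an action when it is inside
// an interaction zone AND making a deliberate grab or pinch. The zone
// bounds and 0.8 thresholds are illustrative values to tune per app.
const ZONE = { minY: 100, maxY: 400, maxAbsX: 150 }; // millimeters

function inInteractionZone(hand) {
  const [x, y] = hand.palmPosition;
  return y >= ZONE.minY && y <= ZONE.maxY && Math.abs(x) <= ZONE.maxAbsX;
}

function classifyPose(hand) {
  if (!inInteractionZone(hand)) return "idle"; // safe zone: no "Midas touch"
  if (hand.grabStrength > 0.8) return "grab";
  if (hand.pinchStrength > 0.8) return "pinch";
  return "hover"; // visible, but not triggering anything
}

// A hand resting low and off to the side is ignored, even if closed:
const restingHand = { palmPosition: [200, 80, 0], grabStrength: 0.9, pinchStrength: 0.1 };
// A closed hand in the middle of the zone counts as a grab:
const grabbingHand = { palmPosition: [0, 250, 0], grabStrength: 0.95, pinchStrength: 0.2 };

console.log(classifyPose(restingHand));  // "idle"
console.log(classifyPose(grabbingHand)); // "grab"
```

The key design choice is that *doing nothing* is always available: users can relax or reposition their hands without fear of accidental input.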
Building a 3D Desktop App with Unity
Unity is a powerful game engine that makes it easy to rapidly build and develop desktop and VR projects. Here’s a quick video that shows you how to make a VR demo from scratch in just four minutes:
You can also check out our Unity setup guide to see how you can start building.
Building a Web App
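In the browser, the usual route is the LeapJS JavaScript library, which delivers tracking frames to a callback (typically via `Leap.loop`). Here’s a minimal sketch of such a callback; it’s shown against a mock frame so the logic runs anywhere, and the screen-mapping constants are illustrative, not part of any API:

```javascript
// Minimal frame handler for a LeapJS-style web app. In a real page you
// would load the LeapJS library and pass this callback to Leap.loop;
// here it runs against a mock frame so the mapping logic is testable.
function onFrame(frame) {
  // Map palm position (mm above the sensor) to rough screen coordinates.
  // The scale and offsets below are illustrative values to tune.
  return frame.hands.map((hand) => {
    const [x, y] = hand.palmPosition;
    return {
      type: hand.type,
      screenX: Math.round(x * 2 + 400),         // palm at x=0 -> pixel 400
      screenY: Math.round(600 - (y - 100) * 2), // higher hand -> higher on screen
    };
  });
}

const mockFrame = {
  hands: [{ type: "left", palmPosition: [-50, 250, 10] }],
};
console.log(onFrame(mockFrame)); // [{ type: "left", screenX: 300, screenY: 300 }]
```

From there, you can drive a canvas cursor, a Three.js scene, or plain DOM elements with the mapped coordinates.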
Visual Programming for Artists and Musicians
Available on Mac, Vuo is a visual programming language that lets you easily prototype, mix, and mash up multimedia experiments. By treating code like building blocks, artists and designers can quickly create amazing experiences that combine visuals and sound. You can weave music from the air or create a physics simulation like this gravity mesh example:
For hardware hackers, boards like the Arduino and Raspberry Pi are the essential building blocks for mixing and matching devices. And while these boards don’t have the processing power to run our core tracking software, there are many ways to bridge hand tracking on your computer with robots, drones, and more. Check out this quick getting-started tutorial for Cylon.js, which lets you connect just about any device you can imagine:
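The bridging usually boils down to mapping a tracked quantity onto a control signal. As a sketch, here’s a hypothetical mapping from palm height to a throttle value you could feed to a drone or servo; the 100–400 mm range is an assumption to tune, and in a Cylon.js robot you would call something like this from the Leap frame event:

```javascript
// Sketch of bridging hand tracking to hardware: convert palm height
// (mm above the sensor) into a 0..1 throttle value for a drone or
// servo. The 100-400 mm working range is an assumed value to tune.
function heightToThrottle(palmY, minY = 100, maxY = 400) {
  const clamped = Math.min(maxY, Math.max(minY, palmY));
  return (clamped - minY) / (maxY - minY); // 0.0 (low hand) .. 1.0 (high hand)
}

console.log(heightToThrottle(100)); // 0   - hand at the bottom of the range
console.log(heightToThrottle(250)); // 0.5 - hand in the middle
console.log(heightToThrottle(500)); // 1   - above the range, clamped
```

Clamping matters here: hands routinely drift out of the working range, and you don’t want a stray reading to slam your hardware to an extreme.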