
VR Development Guide


Leap Motion's mission is to remove the barriers between people and technology. This guide has everything you need to jumpstart your VR project.


Technology Overview

Our technology tracks the movement of your hands and fingers, so you can reach into virtual and augmented reality to interact with new worlds.

VR Developer Kit

Leap Motion technology is designed to track hands and fingers with high accuracy, low processing power, and near-zero latency. Our PC developer kit is designed to attach to the Oculus Rift, HTC Vive, or other Windows-based VR headset. This brings powerful, natural, human interaction to virtual reality.

Capabilities and Limitations

As an optical tracking platform, Leap Motion can only track what it can see (or infer). It is best suited for:

  • pinch, grab, and other physical interactions
  • interacting with objects and interfaces that the user is looking at
  • self-expression in social VR

We recommend you avoid:

  • touchscreen-style interactions
  • sword and gun interactions
  • interactions that regularly take hands out of tracking range

Suggested VR Projects

  • Stroke rehabilitation. Encourage stroke patients to achieve their treatment goals with entertaining gameplay mechanics.
  • Industrial training. Create a virtual flight cockpit or guide users through a complicated procedure.
  • Education. Guide users through an interactive data visualization that teaches them a complex concept.

The rest of this guide covers the key development resources you need to get started, along with design best practices.

Setup Guide

  1. Download and install the Leap Motion Orion SDK.
  2. If your Leap Motion VR Developer Kit isn't attached to a VR headset, see our installation guides.
  3. Visit our Gallery to try recommended Leap Motion VR experiences and examples.

Development Assets

The two main VR game engines are Unity and Unreal. Leap Motion provides extensive resources for Unity, with greater support for Unreal in development.

Unity

All Unity assets can be found at developer.leapmotion.com/unity.

Unity Core Assets

  • Basic framework for Unity development
  • Built on the engine’s native VR integration
  • Includes many example scenes and scripts, including how to attach objects and event conditions to hands (see the sketch after this list)
  • Quick setup guide
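
To make "attaching behavior to hands" concrete, here is a minimal sketch that logs pinch and grab strength for every tracked hand. It assumes the Core Assets' LeapProvider and Hand API, with a LeapServiceProvider in the scene assigned in the Inspector; the class name is illustrative, not part of the assets.

    using Leap;
    using Leap.Unity;
    using UnityEngine;

    // Minimal sketch: logs pinch and grab strength for every tracked hand.
    // Assumes a LeapProvider (e.g. the Core Assets' LeapServiceProvider) exists
    // in the scene and is assigned to the "provider" field in the Inspector.
    public class HandLogger : MonoBehaviour
    {
        [SerializeField] private LeapProvider provider;

        void Update()
        {
            Frame frame = provider.CurrentFrame;   // latest tracking frame
            foreach (Hand hand in frame.Hands)
            {
                string side = hand.IsLeft ? "Left" : "Right";
                // PinchStrength and GrabStrength are normalized 0..1 values.
                Debug.Log(side + " hand - pinch: " + hand.PinchStrength +
                          ", grab: " + hand.GrabStrength);
            }
        }
    }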

Interaction Engine

  • Handles complex interactions with objects and interfaces!
  • Rapidly implement and customize grab and throw mechanics
  • Detect hand states like pinch and “thumbs-up” gestures
  • Easily create interfaces with buttons and sliders, including wearable interfaces
  • Compatible with Oculus Touch and Vive controllers for cleaner workflow
  • Blog post: Interaction Engine 1.0
  • Documentation and Button Builder example
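
For grab and throw mechanics, the usual workflow is to give an object an InteractionBehaviour component (plus a Rigidbody and Collider) and keep an InteractionManager in the scene. Below is a hedged sketch that reacts to grasp events; the event names shown (OnGraspBegin/OnGraspEnd) may differ between Interaction Engine versions, so treat them as assumptions to check against your installed module.

    using Leap.Unity.Interaction;
    using UnityEngine;

    // Sketch: reacts when an Interaction Engine object is grabbed or released.
    // Assumes the GameObject already has an InteractionBehaviour, a Rigidbody,
    // and a Collider, and that an InteractionManager exists in the scene.
    [RequireComponent(typeof(InteractionBehaviour))]
    public class GraspFeedback : MonoBehaviour
    {
        void Awake()
        {
            var interaction = GetComponent<InteractionBehaviour>();

            // Event names may vary between Interaction Engine versions.
            interaction.OnGraspBegin += () => Debug.Log("Grabbed " + name);
            interaction.OnGraspEnd   += () => Debug.Log("Released " + name);
        }
    }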

Hands Module

  • Choose from different hand assets (e.g. highly optimized meshes, abstract hands, etc.)
  • Autorig your own FBX hand assets
  • Blog post: Introducing the Hands Module (Part 1, Part 2)

Unreal Engine

Leap Motion support for Unreal Engine 4 is provided through a native plugin that is packaged with the engine. Currently only Windows development is supported. More information can be found at github.com/getnamo/leap-ue4.

VR Design Guidelines

For more insights, see our full series on Explorations in VR Design.

World Design and Storytelling

  • Think about how your space is experienced at the human level. Where does the user's attention focus? Does the space feel claustrophobic or confusing?
  • What is the user doing? Why are they doing it? And how will they know this?
  • Objects and scene elements should communicate purposes and statuses to the user.
  • Prototype, test, and iterate. There is no substitute for putting on the headset.

Interactive Design

In the real world, we never think twice about using our hands to control objects. We instinctively know how. The “physical” design of UI elements in VR should guide the user in using the interface. Good interactions include:

  • Pinch for precise physical control (e.g. stretching or pulling at a small target, such as a small object or part of a larger object).
  • Grab to interact directly with a larger object.
  • Basic hand poses such as “fingers out, palm up” to activate a menu, or “thumbs-up” to trigger actions.
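
As an illustration of the last bullet, here is a minimal sketch that shows a menu while the left hand is open with the palm facing up. The Core Assets also ship detector scripts (such as PalmDirectionDetector and ExtendedFingerDetector) that can do this without code; the thresholds and the menu field below are illustrative assumptions.

    using Leap;
    using Leap.Unity;
    using UnityEngine;

    // Sketch: shows a menu while the left hand is open with the palm facing up.
    // The 0.7 and 0.2 thresholds are illustrative, not canonical values.
    public class PalmUpMenuTrigger : MonoBehaviour
    {
        [SerializeField] private LeapProvider provider;
        [SerializeField] private GameObject menu;   // hypothetical menu object

        void Update()
        {
            bool open = false;
            foreach (Hand hand in provider.CurrentFrame.Hands)
            {
                if (!hand.IsLeft) continue;
                // ToVector3() comes from the Core Assets' vector extensions;
                // newer module versions may expose Unity vectors directly.
                bool palmUp   = Vector3.Dot(hand.PalmNormal.ToVector3(), Vector3.up) > 0.7f;
                bool handOpen = hand.GrabStrength < 0.2f;   // fingers extended
                open = palmUp && handOpen;
            }
            menu.SetActive(open);
        }
    }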

Text and tutorial prompts are often essential elements of interactive design:

  • Clearly describe intended interactions.
  • Attach prompts to the user's hands, or set them to appear near objects and interfaces contextually (see the sketch after this list).
  • Audio and narrative cues are powerful.
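
One way to attach a prompt to the hand is to reposition a world-space text object over the palm each frame. This sketch assumes frames from the provider are already in Unity world space (as with LeapServiceProvider); the prompt object and offset are illustrative.

    using Leap;
    using Leap.Unity;
    using UnityEngine;

    // Sketch: keeps a world-space prompt floating just above the right palm
    // and hides it when the hand is not tracked.
    public class HandPrompt : MonoBehaviour
    {
        [SerializeField] private LeapProvider provider;
        [SerializeField] private Transform prompt;   // e.g. a TextMesh object
        [SerializeField] private Vector3 offset = new Vector3(0f, 0.08f, 0f);

        void LateUpdate()
        {
            Hand hand = provider.CurrentFrame.Hands.Find(h => h.IsRight);
            prompt.gameObject.SetActive(hand != null);
            if (hand == null) return;

            prompt.position = hand.PalmPosition.ToVector3() + offset;
            // Face the prompt toward the headset camera.
            prompt.rotation = Quaternion.LookRotation(
                prompt.position - Camera.main.transform.position);
        }
    }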

Designing for Orion Tracking

  • The sensor is always on. There is no tactile barrier that separates interaction from non-interaction, so more abstract interactions should be limited in their impact and should not be triggered by casual movement.
  • Dynamic feedback. All interactions should have a distinct initiation and completion state, reflected through dynamic feedback that responds to the user’s motions. The more ambiguous the start and stop, the more likely that users will do it incorrectly.
  • Keep hands in range. If the user can't see their hand, they may not be able to use it. Encourage users to look at their hands and design interactions that won't fail if they move their hands out of range (see the sketch after this list).
  • Finger occlusion. Avoid interactions that depend on the position of fingers when they are out of the device’s line of sight.
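
A simple way to act on the "keep hands in range" point is to show a hint whenever no hands have been tracked for a short grace period. A hedged sketch, assuming a hint GameObject of your own:

    using Leap.Unity;
    using UnityEngine;

    // Sketch: shows a "bring your hands into view" hint when no hands have
    // been tracked for a short grace period.
    public class HandsInViewHint : MonoBehaviour
    {
        [SerializeField] private LeapProvider provider;
        [SerializeField] private GameObject hint;      // hypothetical hint graphic
        [SerializeField] private float graceSeconds = 1.0f;

        private float lastSeenTime;

        void Update()
        {
            if (provider.CurrentFrame.Hands.Count > 0)
            {
                lastSeenTime = Time.time;
            }
            hint.SetActive(Time.time - lastSeenTime > graceSeconds);
        }
    }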

User Safety and Comfort

  • The display should respond to the user’s movements at all times.
  • Never instigate any movement without user input.
  • Don’t artificially rotate or move the horizon line or other large components of the scene.
  • Reduce neck strain – don’t require a significant degree of looking around.
  • Mix up different types of interactions so that users can work different muscle groups.
  • For optimal comfort, position user interfaces around the level of the breastbone.
  • User testing is essential to identify possible fatigue and comfort issues.

Projectiles from the Weightless: Remastered Training Room were designed to suggest how they should be held. This led to a higher rate of users grabbing by placing their fingers in the indents, making it much easier to successfully release the projectiles.

Object Interactions

The Leap Motion Interaction Engine is designed to handle low-level physics interactions in VR. With it, users can grab objects of a variety of shapes and textures, as well as multiple objects. To design grabbable objects for VR, build with affordances – physical characteristics that guide the user in using that object.

  • Good affordances ensure that your users understand what they can do.
  • They also make it easier for you to anticipate how your demo will be used.
  • The more specific the interaction, the more specific the affordance should appear.
  • For inspiration, look for real-world affordances that you can reflect in your own projects.

Everything should be reactive. People enjoy playing with game physics, pushing the boundaries and seeing what’s possible. With that in mind, every interactive object should respond to any casual movement, especially if the user might pick it up.
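
One lightweight way to make objects reactive is to change their appearance on hover and contact using the Interaction Engine's callbacks. A sketch, assuming the object's material has a color property; the hover/contact event names may vary between module versions:

    using Leap.Unity.Interaction;
    using UnityEngine;

    // Sketch: tints an object while a hand hovers near it or touches it, so
    // even casual movement gets a visible response.
    [RequireComponent(typeof(InteractionBehaviour), typeof(Renderer))]
    public class ReactiveHighlight : MonoBehaviour
    {
        [SerializeField] private Color idleColor  = Color.white;
        [SerializeField] private Color hoverColor = new Color(0.7f, 0.9f, 1f);
        [SerializeField] private Color touchColor = Color.cyan;

        void Awake()
        {
            var interaction = GetComponent<InteractionBehaviour>();
            var rend = GetComponent<Renderer>();

            // Event names may vary between Interaction Engine versions.
            interaction.OnHoverBegin   += () => rend.material.color = hoverColor;
            interaction.OnHoverEnd     += () => rend.material.color = idleColor;
            interaction.OnContactBegin += () => rend.material.color = touchColor;
            interaction.OnContactEnd   += () => rend.material.color = hoverColor;
        }
    }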

User Interface Design

User interfaces designed for VR should combine 3D effects and satisfying sound effects with dynamic visual cues. VR menus tend to fall into two categories:

  • Fixed in 3D space. This is a fast and easy way to create a compelling user experience. Floating buttons and sliders are stable, reliable, and easy for users to understand.
  • Attached to the hand. This creates a “wearable menu” that can be locked to your hand, wrist, or arm. Use our Interaction Engine's user interface toolkit to build a wearable interface in just a few minutes. For optimal tracking, buttons should float in space next to the hand, rather than on the hand itself.

The hand interface from Blocks. Buttons only appear when your left hand is in this position.
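
For wearable menus built with the Interaction Engine's UI toolkit, the usual pattern is to hook the press events on its button component. A sketch assuming an InteractionButton from the Button Builder and UI examples; the OnPress/OnUnpress event names should be checked against your module version:

    using Leap.Unity.Interaction;
    using UnityEngine;

    // Sketch: logs presses on an Interaction Engine button, e.g. one anchored
    // in a wearable menu floating next to the left hand.
    [RequireComponent(typeof(InteractionButton))]
    public class MenuButtonHandler : MonoBehaviour
    {
        void Awake()
        {
            var button = GetComponent<InteractionButton>();
            // Press event names may vary between Interaction Engine versions.
            button.OnPress   += () => Debug.Log("Menu button pressed");
            button.OnUnpress += () => Debug.Log("Menu button released");
        }
    }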

Avatar Design

  • Avoid representing anything that cannot be tracked. Abstract or stylized hands and bodies are often preferable and less resource-intensive.
  • Virtual hand positions should match their real-world counterparts as closely as possible. Their movement should never be blocked by virtual obstacles like walls.
  • Body avatars are generally unnecessary, except for social VR.

Locomotion

Any form of simulated locomotion has the potential to make users feel sick. However, there are a few effective techniques when your users need to navigate a large virtual world:

  • Teleportation. Used in a variety of games and incorporated directly into the game narrative.
  • Moving on rails. Effective with zero acceleration (constant velocity), movement in straight lines only, short bursts of motion, and a consistently maintained framerate.
  • Cockpit. Virtual vehicles and spaceships make it possible to explore a larger virtual world while the user remains in a limited space.

Sound Design

Sound is an essential aspect of truly immersive VR. Combined with hand tracking and visual feedback, it even has the power to create the illusion of tactile sensation. Use sound for:

  • realistic sound effects
  • mood and atmosphere
  • giving depth to 3D space
  • building and reinforcing interactions
  • tutorial audio
  • setting boundaries
  • evoking touch with “textured” sounds

Case Study: Blocks

Blocks includes a wide range of interactions. For this reason, we built it with a tutorial stage that walks the user through different elements of the experience. Here’s how these interactions combine to create a magical experience.

Thumbs up to continue. The “thumbs up” is consistently used around the world to mean “OK.” Just in case, our tutorial robot shows you how.

Pinch with both hands to create a block. This interaction can be difficult to describe in words, but the user understands instantly when they see how the robot does it. The entire interaction is built with visual and sound cues:

  • Upon detecting a pinch, a small blue circle appears at your thumb and forefinger.
  • A low-pitched sound effect plays, indicating potential.
  • When spawning, the block glows red in your hands. It appears to be made of energy. (Imagine how unsatisfying it would be if the block first appeared grey and fully formed!)
  • Upon release, a higher-pitched sound plays to indicate that the interaction is over. The glow on the block cools as it assumes its final physical shape.
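
A hedged sketch of the audio half of this feedback loop, using the Core Assets' PinchDetector (which raises UnityEvents when a pinch starts and ends); the audio sources are placeholders for your own clips:

    using Leap.Unity;
    using UnityEngine;

    // Sketch: plays a low "potential" tone when a pinch begins and a higher
    // "done" tone when it ends, echoing the block-spawning cues in Blocks.
    public class PinchSpawnFeedback : MonoBehaviour
    {
        [SerializeField] private PinchDetector pinch;   // from Detection Utilities
        [SerializeField] private AudioSource lowTone;   // placeholder audio sources
        [SerializeField] private AudioSource highTone;

        void OnEnable()
        {
            pinch.OnActivate.AddListener(PlayLowTone);
            pinch.OnDeactivate.AddListener(PlayHighTone);
        }

        void OnDisable()
        {
            pinch.OnActivate.RemoveListener(PlayLowTone);
            pinch.OnDeactivate.RemoveListener(PlayHighTone);
        }

        void PlayLowTone()  { lowTone.Play(); }
        void PlayHighTone() { highTone.Play(); }
    }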

Grab a block. This is as direct and natural as it gets – something we’ve all done since childhood. Reach out and grab it with your hand.

Turn gravity on and off. Deactivating gravity is a broad, sweeping interaction for something that massively affects the world around you. The act of raising up with your hands feels like it fits with the “lifting up” of the blocks you’ve created. Similarly, restoring gravity requires the opposite – bringing both of your hands down.

Change the block shape. This can be done with an arm interface that only appears when the palm of your left hand is facing up. This is an abstract interaction, so the interface has to be very clean and simple. With only three large buttons, spaced far apart, users can play and explore without making irreversible changes.

From there, players have the ability to create stacks, catapults, chain reactions, and more. We can’t wait to see what kinds of experiences you’ll build. If you have any questions, please contact our team directly at community@leapmotion.com.