Until the rise of VR, we lived on the edges of a digital universe that was trapped behind glass screens. Immensely powerful and infinitely portable, but still distant and inaccessible.

Now the glass is breaking. We can see and reach into new worlds, and the digital is taking substance in our reality. You are now one of its many artists, architects, sculptors, and storytellers.

Designing a fluid and seamless experience for VR/AR is impossible without a deeper understanding of the medium. But VR/AR is still largely unexplored. There are no hard-and-fast rules.

This guide will take you through the bleeding edge of VR/AR design – how to architect a space, design groundbreaking interactions, and make users feel powerful. Here’s everything you need to build a more human reality:

  • World Design
  • Interactive Design
  • Designing for Orion Tracking
  • User Safety and Comfort
  • Object Interactions
  • User Interface Design
  • Storytelling and Narrative
  • Avatar Design
  • Locomotion
  • Sound Design
  • Case Study: Blocks

Note: This is a limited-release early version. Please do not share externally.

World Design

Like an architect or a set designer, you have the power to create moods and experiences through a physical environment. In developing your space, you can also think about how it’s experienced at the human scale – in terms of attention, structure, and affordance.

As you move your gaze through the scene, where does it land? Where do you focus? How does the structure of the space around you make you feel and how does it influence you to move? How do the objects and scene elements you focus on communicate their purposes and statuses?

The most important thing is to prototype, test, and iterate. We’ve often encountered spaces and sets that look great on a monitor, but feel weird or claustrophobic in VR. There is no substitute for actually getting your eyes into the space and looking around.

Interactive Design

Designing for hands in VR starts with thinking about the real world and our expectations. In the real world, we never think twice about using our hands to control objects. We instinctively know how. The “physical” design of UI elements in VR should build on these expectations and guide the user in using the interface.

There are three types of interactions, ranging from easy to difficult to learn:

  • Direct interactions follow the rules of the physical world. They occur in response to the ergonomics and affordances of specific objects. As a result, they are grounded and specific, making them easy to distinguish from other types of hand movements. Once the user understands that these interactions are available, there is little or no extra learning required. (For example, pushing an on/off button in virtual reality.)
  • Metaphorical interactions are partially abstract but still relate in some way to the real world. For example, pinching the corners of an object and stretching it out. They occupy a middle ground between direct and abstract interactions.
  • Abstract interactions are totally separate from the real world and have their own logic, which must be learned. Some are already familiar, inherited from desktop and mobile operating systems, while others will be completely new. Abstract interactions should be designed with our ideas about the world in mind. While these ideas may vary widely from person to person, it’s important to understand their impact on meaning to the user. (For example, pointing at oneself when referring to another person would feel strange.)

Direct interactions can be implied and continually reinforced through the use of affordance in physical design. Use them as frequently as possible. Higher-level interactions require more careful treatment, and may need to be introduced and reinforced throughout the experience. All three kinds of interactions can be incredibly powerful.

Leap Motion Orion tracking was designed with simple physical interactions in mind, starting with pinch and grab. The pinch interaction allows for precise physical control of an object, and corresponds with user expectations for stretching or pulling at a small target, such as a small object or part of a larger object. Grab interactions are broader and allow users to interact directly with a larger object.
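To make this concrete, here’s a minimal sketch in Python of how a pinch might be detected from the distance between the thumb and index fingertips. The millimeter thresholds are illustrative assumptions, not values taken from the Leap Motion API, and the hysteresis (separate engage and release thresholds) keeps tracking jitter from rapidly toggling the state:

    import math

    # Assumed thresholds (tune for your application): the pinch engages
    # below 25 mm and releases above 35 mm, so jitter near a single
    # cutoff can't rapidly toggle the state.
    PINCH_ON_MM = 25.0
    PINCH_OFF_MM = 35.0

    def distance(a, b):
        """Euclidean distance between two 3D points (x, y, z) in mm."""
        return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

    class PinchDetector:
        """Tracks the pinch state of one hand with hysteresis."""
        def __init__(self):
            self.pinching = False

        def update(self, thumb_tip, index_tip):
            d = distance(thumb_tip, index_tip)
            if not self.pinching and d < PINCH_ON_MM:
                self.pinching = True   # initiation: fire "pinch start" feedback
            elif self.pinching and d > PINCH_OFF_MM:
                self.pinching = False  # completion: fire "pinch end" feedback
            return self.pinching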

Towards the more abstract end of the spectrum, we’ve also developed a toolkit for basic hand poses, such as the “thumbs up gesture.” These should be used sparingly and accompanied by tutorials or text cues.

Text and tutorial prompts are often essential elements of interactive design. Be sure to clearly describe intended interactions, as this will greatly impact how users perform them. You may want to design prompts that are attached to the user’s hands, or that appear contextually near objects and interfaces. Audio and narrative cues can be enormously helpful in driving user interactions.

Learn how these principles were applied in our Blocks demo in the case study at the end of this document.

Designing for Orion Tracking

Here are some quick tips on building for the strengths of Leap Motion technology, while avoiding common pitfalls.

The sensor is always on. Unlike a touchscreen or game controller, there is no tactile barrier that separates interaction from non-interaction. This means that more abstract interactions should be both extremely limited in their impact and unlikely to be triggered by casual movement.

Dynamic feedback. The absence of binary tactile feedback also means that your experience should eliminate ambiguity wherever possible. All interactions should have a distinct initiation and completion state, reflected through dynamic feedback that responds to the user’s motions. The more ambiguous the start and stop, the more likely that users will perform the interaction incorrectly.
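As a rough illustration, here’s a Python sketch that pairs continuous feedback with discrete engage and release events. The progress value and thresholds are assumptions; the point is that the user always sees the system responding, and the start and stop of the interaction are unmistakable:

    # "progress" is a normalized 0..1 value computed upstream from hand
    # tracking, e.g. how far a fingertip has pressed into a button.
    ENGAGE_AT = 1.0     # progress needed to trigger the interaction
    DISENGAGE_AT = 0.7  # must retreat below this before re-triggering

    class FeedbackChannel:
        def __init__(self, on_engage, on_disengage):
            self.engaged = False
            self.on_engage = on_engage
            self.on_disengage = on_disengage

        def update(self, progress):
            # Continuous cue: map this to glow, scale, or pitch so the
            # user sees a response to every small motion.
            intensity = max(0.0, min(progress, 1.0))
            if not self.engaged and progress >= ENGAGE_AT:
                self.engaged = True
                self.on_engage()      # distinct completion cue (e.g. click)
            elif self.engaged and progress < DISENGAGE_AT:
                self.engaged = False
                self.on_disengage()   # distinct release cue
            return intensity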

Keeping hands in sight. If the user can’t see their hand, they can’t use it. While this might seem obvious to developers, it isn’t always obvious to users – especially when they’re focused on the object they’re trying to manipulate, rather than looking at their hand.

Finger occlusion. It’s important to encourage users to keep their hands in view, and to guide them through interactions. Be sure to avoid interactions that depend on the position of fingers when they are out of the device’s line of sight, and reward correct behaviors.
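One way to guide this behavior is to check whether the hand sits inside the tracking volume and prompt the user when it drifts out. Here’s a rough Python sketch that treats the volume as a cone around the headset’s forward vector; the 60-degree half-angle is an assumed placeholder, not the device’s actual field of view:

    import math

    def normalize(v):
        mag = math.sqrt(sum(c * c for c in v))
        return tuple(c / mag for c in v)

    def hand_in_view(head_pos, head_forward, hand_pos, half_fov_deg=60.0):
        """True if the hand falls inside the assumed tracking cone."""
        to_hand = normalize(tuple(h - p for h, p in zip(hand_pos, head_pos)))
        forward = normalize(head_forward)
        cos_angle = sum(a * b for a, b in zip(to_hand, forward))
        return cos_angle >= math.cos(math.radians(half_fov_deg))

If the hand stays outside the cone for more than a moment, fade in a gentle visual or audio reminder rather than letting the interaction fail silently.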

User Safety and Comfort

What’s the most important rule in VR? Never make your users sick. The Oculus Best Practices guide, Designing for Google Cardboard, and other resources cover this issue in great detail, but no guide to VR design and development would be complete without it.

  • The display should respond to the user’s movements at all times. Without exception. Even in menus, when the game is paused, or during cutscenes, users should be able to look around.
  • Do not instigate any movement without user input (including changing head orientation, translation of view, or field of view). This includes shaking the camera to reflect an explosion, or artificially bobbing the head while the user walks through a scene.
  • Avoid rotating or moving the horizon line or other large components of the environment unless it corresponds with the user’s real-world motions.
  • Reduce neck strain with experiences that reward (but don’t require) a significant degree of looking around. Try to restrict movement in the periphery.
  • Ensure that the virtual cameras rotate and move in a manner consistent with head and body movements.

Ergonomics

One way to avoid user fatigue is to mix up different types of interactions. These allow your user to interact with the world in different ways and use different muscle groups. More frequent interactions should be brief, simple, and achieved with a minimum of effort, while less frequent interactions can be broader or require more effort.

No one enjoys feeling cramped or boxed in. Our bodies tend to move in arcs, rather than straight lines, so it’s important to compensate by allowing for arcs in 3D space. Your interactions should also be fairly forgiving of inaccurate motions. Always keep your user’s comfort in mind from the perspective of hand, arm and shoulder fatigue. User testing is essential in identifying possible fatigue and comfort issues.

Ideal Height Range

Interactive elements within your scene should typically rest in the “Goldilocks zone” between desk height and eye level. For ergonomic reasons, the best place to put user interfaces is typically around the level of the breastbone.
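As a sketch of how this might be computed at runtime, here’s a few lines of Python; the breastbone ratio and desk height are assumptions to tune, not published constants:

    def ui_anchor_height(eye_height_m, ratio=0.70):
        """Place interactive panels near breastbone level, below the eye line."""
        return eye_height_m * ratio

    def clamp_to_goldilocks(y_m, desk_height_m=0.75, eye_height_m=1.60):
        """Keep an interactive element between desk height and eye level."""
        return max(desk_height_m, min(y_m, eye_height_m))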

Object Interactions

Touchless interaction in VR is an unprecedented design challenge that requires a human-centered solution. This is the philosophy behind the Leap Motion Interaction Engine, which is built to handle low-level physics interactions and make them feel familiar. The Interaction Engine makes it possible for you to reach out and grab an object and have it respond. When you close your hand around it, your fingers phase through the material, but it still feels real. You can grab objects of a variety of shapes and textures, and even pick out one of several objects near each other that would otherwise make an ambiguous grab target.
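The Interaction Engine itself is far more sophisticated, but a naive Python sketch shows the kind of ambiguity it resolves: when the hand closes, pick the single closest object within reach, so two neighboring objects can’t both claim the grab. The radius and threshold here are assumptions:

    import math

    GRAB_RADIUS_M = 0.10   # assumed reach of a closing hand
    GRAB_THRESHOLD = 0.8   # assumed 0..1 "hand closed" value from tracking

    def dist(a, b):
        return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

    def pick_grab_target(palm_pos, grab_strength, objects):
        """objects: list of (name, position). Returns the grabbed name or None."""
        if grab_strength < GRAB_THRESHOLD:
            return None
        in_reach = [(dist(palm_pos, pos), name) for name, pos in objects
                    if dist(palm_pos, pos) <= GRAB_RADIUS_M]
        return min(in_reach)[1] if in_reach else None  # closest object wins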

Building Affordances

In the field of industrial design, “affordances” refers to the physical characteristics of an object that guide the user in using that object. Solid affordances are critical in VR interactive design. They ensure that your users understand what they can do, and make it easier for you to anticipate how your demo will be used.

The more specific the interaction, the more specific the affordance should appear. This effectively “tricks” the user into making the right movements. For inspiration, look for real-world affordances that you can reflect in your own projects.

In designing Weightless: Remastered, Martin Schubert found that people tend to grab basic shapes in different ways – from holding them lightly, to closing their fists through them. He took inspiration from baseballs and bowling balls to suggest how the projectiles in the Training Room should be held. As a result, far more users grabbed the projectiles by placing their fingers in the indents, making it much easier to successfully release them.

Everything Should Be Reactive

People enjoy playing with game physics, pushing the boundaries and seeing what’s possible. With that in mind, every interactive object should respond to any casual movement. This is especially true with any object that the user might pick up and look at, since people love to pick up and play with tools and other objects.

User Interface Design

In the early days of the PC, GUIs often relied heavily on skeuomorphic 3D elements, like buttons that appeared to compress when clicked. These faded away in favor of color state changes, reflecting a flat design aesthetic. With the rise of VR, many of those old skeuomorphs meant to represent three-dimensionality – the stark shadows, the compressible behaviors – are gaining new life.

Windows users in 1992 needed 3D effects on buttons to understand that they were meant to be pressed, just like buttons on other media like radios, televisions, and VCRs. In 2016, active and passive states in the OS are communicated entirely through color states – no more drop shadows. All major operating systems and the modern web are now built with a flat minimalist design language.

But this doesn’t mean that skeuomorphism is the answer – because the flat-skeuomorphic spectrum is just another form of flat thinking. Instead, VR design will converge on essential cues that communicate structure and relationships between different UI elements. The design process behind the UI Input Module was driven by many of these insights, combining 3D effects and satisfying sound effects with dynamic visual cues.

Wearable Interfaces

Fixing the user interface in 3D space is a fast and easy way to create a quick, compelling user experience. Floating buttons and sliders are stable, reliable, and easy for users to understand. However, they can also feel obtrusive, especially when their use is limited.

At Leap Motion, we’ve been experimenting internally with a range of different interfaces that are part of the user. This “wearable device” can be locked to your hand, wrist, or arm, and revealed automatically or through a gesture. A simple form of this interface can be seen in Blocks, which features a three-button menu that allows you to toggle between different shapes. It remains hidden unless your left palm is facing towards you.

These early experiments point towards wearable interfaces where the user always has instant access to notifications and statuses, such as the time of day. More powerful options may be unlocked through a trigger gesture, such as tapping a virtual wristwatch. By combining our Attachments Module and UI Input Module, it’s possible to build a wearable interface in just a few minutes.
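The palm-facing trigger behind this kind of interface reduces to a simple dot product. Here’s a minimal Python sketch, assuming the tracking system provides a palm normal and positions for the palm and head; the 0.7 cutoff (roughly 45 degrees) is an assumption:

    import math

    def normalize(v):
        mag = math.sqrt(sum(c * c for c in v))
        return tuple(c / mag for c in v)

    def palm_faces_user(palm_normal, palm_pos, head_pos, cutoff=0.7):
        """True while the palm normal points toward the user's head."""
        to_head = normalize(tuple(h - p for h, p in zip(head_pos, palm_pos)))
        n = normalize(palm_normal)
        return sum(a * b for a, b in zip(n, to_head)) > cutoff

Show the wearable interface only while this returns true, ideally with a short fade so brief tracking noise doesn’t make the menu flicker.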

Storytelling and Narrative

As human beings, stories are how we make sense of the world. In building on your experience’s core concept, you’ll want to give the user a story. Just ask yourself three questions. What is the user doing? Why are they doing it? And how will they know this?

Hand presence brings you into VR in a fundamental way. The player’s hands might reinforce presence within the scene, enable basic controls, or drive the story by affecting objects and events. It all depends on what kind of story you want to tell. But since the player now has a physical presence within the scene, you need to identify what that means from a story perspective.

Much of the narrative behind an experience will be told in the first few seconds, as the user becomes accustomed to the scene. The world and sound design should all reinforce that core narrative. If there’s something immediately in front of the user, it will be the first thing they try to grab. From there, everything is a learning process.

Avatar Design

In VR, it’s often best to avoid representing whatever cannot be tracked. In many cases, abstract or stylized hands and bodies are preferable to more realistically rendered hands that may fall into the uncanny valley. They are also less resource-intensive, which is important for mobile VR.

Our Hands Module features a range of example hands. This includes abstract geometric hands, which are dynamically generated based on the real-world proportions of the user’s hand, and more classic rigged meshes. You can also autorig a wide array of FBX hand assets with one or two button presses.

Hand Position

Just as accurate head tracking can place you inside a virtual world, hand tracking can reinforce the sense that you’ve actually traveled to another place. Conversely, when hands appear in the wrong spot, it can be very disorienting.

To enhance immersion and help users rely on their proprioception to control movements and understand depth, it’s essential that virtual hand positions match their real-world counterparts as closely as possible. Their movement should never be blocked by virtual obstacles like walls.

Arm and Body Position

By providing a bodily relationship to the elements in the scene, you can increase immersion and help ground your user. But even in the best circumstances, body animation without an external sensor can be unreliable, so you should only create an avatar for the user if their real body is likely to be in alignment with it. For example, if the avatar is sitting in a chair in the game, you can expect that the user will be doing the same in real life.

People are usually comfortable with a lack of a body, due to experiences with disembodied observation (movies and gaming) and due to the minimal presence of one’s own body in one’s normal field of view (standing looking forward). However, adding a second body that moves independently is unfamiliar and at odds with the user’s proprioception.

Locomotion

World navigation is one of the greatest challenges in VR. Beyond actually walking around in a Holodeck-style space, there are no truly seamless solutions, and any form of simulated locomotion has the potential to make users feel sick. However, the VR community is constantly pushing the boundaries with a variety of experimental approaches evolving alongside the hardware.

Generally, the best VR applications that use the hands for navigation aren’t centered around hand interactions that move the camera in a non-physical way, but around transitions between different states. Below are some interesting experiments in locomotion, both from Leap Motion developers and the broader VR community.

Teleportation

Teleportation has been used to great effect in games like Cloudhead Games’ Blink VR, Neat Corporation’s Budget Cuts, and Epic’s Bullet Train, where the animation that would normally appear while moving between two points has been eliminated altogether. Similarly, Frooxius’ World of Comenius is a seated experience where players can travel between different educational scenes. It features glowing orbs that, when tapped, grow and envelop the player, taking them into the new scene.

In each case, the specific method of locomotion is incorporated directly into the game narrative, and a variety of design considerations go into how the teleportation mechanic plays out. Because teleportation is a high-impact mechanic, make sure that the method used to trigger it is reliable and not susceptible to false positives.
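At its core, a “blink” teleport is just a fade, an instantaneous move, and a fade back – no interpolated camera motion ever reaches the eyes. Here’s a minimal Python sketch, where the rig and fade callables are hypothetical stand-ins for whatever your engine provides:

    def blink_teleport(rig, destination, fade_to_black, fade_from_black,
                       fade_s=0.1):
        """rig: any object with a .position attribute. The fade callables
        are assumed to block until the fade completes."""
        fade_to_black(fade_s)        # hide the transition entirely
        rig.position = destination   # move in a single frame
        fade_from_black(fade_s)      # reveal the new location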

Moving on Rails

The classic rail shooter mechanic has a lot of clear benefits for a hand tracking platform – and judging by games like Aboard the Lookinglass and the Gear VR title Dead Secret, it can be very successful. By handling locomotion for the player, it frees them to play and explore in more subtle (or more spectacular) ways. However, this mechanic also sacrifices the player’s ability to explore an open world, and can cause simulator sickness (“sim sickness”) if not used carefully.

Robot Invader, the developer behind Dead Secret, developed their rails locomotion system based around zero acceleration (linear movement only), moving only in straight lines, using short bursts of motion, and maintaining framerate. As always, the camera should never under any circumstances rotate without direct user input.
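Those constraints translate into very little code. Below is a Python sketch of constant-velocity movement broken into short bursts; the speed and burst length are assumptions, not values from Dead Secret:

    import math

    SPEED_M_S = 1.5  # constant speed: zero acceleration, ever

    def plan_bursts(start, end, max_burst_m=2.0):
        """Split a long straight path into short hops, with pauses between,
        so each burst of motion stays brief."""
        delta = tuple(e - s for e, s in zip(end, start))
        total = math.sqrt(sum(d * d for d in delta))
        n = max(1, math.ceil(total / max_burst_m))
        return [tuple(s + d * (i / n) for s, d in zip(start, delta))
                for i in range(1, n + 1)]

    def step_along_rail(pos, target, dt):
        """Advance pos toward target at constant speed; returns new position."""
        delta = tuple(t - p for t, p in zip(target, pos))
        remaining = math.sqrt(sum(d * d for d in delta))
        if remaining < 1e-6:
            return target
        step = min(SPEED_M_S * dt, remaining)
        return tuple(p + d / remaining * step for p, d in zip(pos, delta))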

Two-Handed Flight

Two-handed flight is a great way to give your users superpowers. With Orion’s extended tracking range, this approach is more stable and reliable than ever. However, it can get tiring unless used in a short demo, or alongside other interactive elements. This technique will also only work in a limited number of narratives, and should never introduce unwanted camera motions. Use it carefully.
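One possible mapping, sketched in Python below, is an assumption for illustration rather than a documented scheme: drive velocity from the offset between the midpoint of the two palms and a rest point in front of the chest, with a dead zone so resting hands never drift the camera:

    MAX_SPEED_M_S = 5.0
    DEAD_ZONE_M = 0.05  # ignore tiny offsets near the rest point
    GAIN = 4.0          # assumed linear gain from offset to speed

    def flight_velocity(left_palm, right_palm, rest_point):
        mid = tuple((l + r) / 2 for l, r in zip(left_palm, right_palm))
        offset = tuple(m - c for m, c in zip(mid, rest_point))
        mag = sum(o * o for o in offset) ** 0.5
        if mag < DEAD_ZONE_M:
            return (0.0, 0.0, 0.0)   # no input, no camera motion
        speed = min(mag * GAIN, MAX_SPEED_M_S)
        return tuple(o / mag * speed for o in offset)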

Cockpit

Virtual vehicles such as the spaceships in Elite Dangerous and EVE: Valkyrie make it possible to explore a larger virtual world while the user remains in a limited space. By design, VR cockpits also place lots of unmoving objects in the foreground, which can reduce sim sickness.

Sound Design

Sound is an essential aspect of truly immersive VR. The twin streams of light from your headset’s screen can fool your eyes, but it takes sound to transport people to new worlds. Sound can give depth and emotion to an experience, build and reinforce interactions, and guide users through an alien landscape. Combined with hand tracking and visual feedback, it even has the power to create the illusion of tactile sensation.

Realistic sound effects. Engines like Unity and Unreal are constantly getting better at representing sound effects in 3D space. The more realistic that zombie right behind you sounds, the more your hair will stand on end. Ambient sound can establish the mood and atmosphere of your game. Unlike with desktop gaming, where audio may be non-positional, sounds in VR should almost always be rooted in one spot. This allows for positional 3D audio which varies as the user turns their head and moves around the scene.
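Engines handle full spatialization (including HRTF) for you, but the core idea fits in a few lines. Here’s a simplified Python sketch: gain falls off with distance, and left/right balance follows the source’s direction relative to the listener:

    import math

    def positional_audio(listener_pos, listener_right, source_pos, ref_dist=1.0):
        """Returns (gain, pan): gain in 0..1, pan in -1 (left)..1 (right).
        listener_right is the listener's right-pointing unit vector."""
        to_src = tuple(s - l for s, l in zip(source_pos, listener_pos))
        d = math.sqrt(sum(c * c for c in to_src)) or 1e-6
        gain = min(1.0, ref_dist / d)              # inverse-distance falloff
        direction = tuple(c / d for c in to_src)
        pan = sum(a * b for a, b in zip(direction, listener_right))
        return gain, max(-1.0, min(1.0, pan))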

Mood and atmosphere. Music also plays a crucial role in setting the mood for an experience. Weightless features soft piano tracks that feel elegant and contemplative. Blocks has a techno vibe with a deep bass. Land’s End combines a dreamy and surreal quality with hard edges like tape saturation and vinyl noise. If you imagine shuffling those soundtracks, it massively changes how people experience those demos.

Giving depth to 3D space. When it comes to depth cues, stereoscopic vision is a massive improvement on traditional monitors. But it’s not perfect. For this reason, sound is more than just an immersive tool – how (and where) objects around you sound has an enormous effect on your understanding of where they are. This applies to everything from background noises to user interfaces.

Building and reinforcing interactions. Sound can be essential in communicating the inception, success, failure, and overall nature of interactions and game physics, especially when the user’s eyes are drawn elsewhere. It’s also a double-edged sword that relies on careful timing, as even being off by a quarter second can disrupt the experience.

Tutorial audio. It’s sad but true – most users do not read instructions. Fortunately, while written instructions have to compete with a huge variety of visual stimuli, you have a lot more control over what your user hears. Just make sure that the cues work within the narrative and don’t become repetitive.

Setting boundaries. Virtual reality is an exciting medium, and for first-time users, it can take a few minutes to master its limitations. Just as you can use audio cues to discourage head clipping within your game, you can help your users master interactions faster by guiding them to keep their hands in view. Auditory cues can serve as powerful but invisible reminders.

Evoking touch. In the absence of touch feedback, visual and auditory feedback are essential. They can fill in that cognitive gap and reinforce which elements of a scene are interactive, and what happens when the user “touches” them. Some users describe phantom sensations in VR, which are almost always associated with compelling sound design.

Case Study: Blocks

In our earlier section on Interactive Design, we discussed the three parts of the interactive spectrum – direct, metaphorical, and abstract. Blocks includes several interactions that fall along different points of this spectrum. For this reason, we built it with a tutorial stage that walks the user through different elements of the experience. The most basic elements of the experience are either direct or metaphorical, while the abstract elements are optional.

This case study examines the different interactions in Blocks and how they combine to create a magical experience.

Thumbs up to continue. While abstract gestures are usually dangerous, the “thumbs up” is consistently used around the world to mean “OK.” Just in case, our tutorial robot makes a thumbs up, and encourages you to imitate it.

Pinch with both hands to create a block. This metaphorical interaction can be difficult to describe in words (“bring your hands together, pinch with your thumbs and index fingers, then separate your hands, then release the pinch!”) but the user “gets it” instantly when seeing how the robot does it. The entire interaction is built with visual and sound cues along the way (sketched in code after this list):

  • Upon detecting a pinch, a small blue circle appears at your thumb and forefinger.
  • A low-pitched sound effect plays, indicating potential.
  • When spawning, the block glows red in your hands. It’s not yet solid, but instead appears to be made of energy. (Imagine how unsatisfying it would be if the block appeared grey and fully formed when spawning!)
  • Upon release, a higher-pitched sound plays to indicate that the interaction is over. The glow on the block cools as it assumes its final physical shape.
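Here’s a simplified Python sketch of this flow, with hypothetical callback hooks standing in for the actual sound and visual cues in Blocks (pinch detection itself comes from hand tracking):

    import math

    def dist(a, b):
        return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

    class BlockSpawner:
        def __init__(self, cues):
            self.cues = cues       # dict of feedback callbacks
            self.spawning = False

        def update(self, left_pinch, right_pinch, left_pos, right_pos):
            both = left_pinch and right_pinch
            if both and not self.spawning:
                self.spawning = True
                self.cues["pinch_start"]()    # blue circles + low-pitched tone
            elif self.spawning and both:
                size = dist(left_pos, right_pos)
                self.cues["stretching"](size) # red "energy" block scales live
            elif self.spawning and not both:
                self.spawning = False
                self.cues["spawned"]()        # higher pitch; block solidifies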

Grab a block. This is as direct and natural as it gets – something we’ve all done since childhood. Reach out, grab with your hand, and the block follows it. This kind of immersive, lifelike interaction in VR is actually enormously complicated, as digital physics was never designed for human hands reaching into it. Blocks achieves this with an early prototype of our Interaction Engine.

Turn gravity on and off. Deactivating gravity is a broad, sweeping interaction for something that massively affects the world around you. The act of raising up with your hands feels like it fits with the “lifting up” of the blocks you’ve created. Similarly, restoring gravity requires the opposite – bringing both of your hands down. While abstract, the action still feels like it makes sense. In both cases, completing the interaction causes the blocks to emit a warm glow. This glow moves outwards in a wave, showing both that (1) you have created the effect, and (2) it specifically affects the blocks and their behavior.
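Here’s a rough Python sketch of how such a deliberate two-handed sweep might be detected, assuming per-hand vertical velocities from tracking. The thresholds are guesses; requiring both hands to travel together is what keeps casual gestures from flipping gravity:

    SWEEP_SPEED_M_S = 0.5  # both palms must move this fast vertically
    MIN_TRAVEL_M = 0.25    # ...over at least this much total travel

    class GravityToggle:
        def __init__(self):
            self.travel = 0.0
            self.direction = 0  # +1 rising, -1 falling

        def update(self, left_vy, right_vy, dt, gravity_on):
            if left_vy > SWEEP_SPEED_M_S and right_vy > SWEEP_SPEED_M_S:
                d = 1                          # both hands sweeping up
            elif left_vy < -SWEEP_SPEED_M_S and right_vy < -SWEEP_SPEED_M_S:
                d = -1                         # both hands sweeping down
            else:
                self.travel, self.direction = 0.0, 0
                return gravity_on
            if d != self.direction:
                self.travel, self.direction = 0.0, d
            self.travel += abs(left_vy + right_vy) / 2 * dt
            if self.travel >= MIN_TRAVEL_M:
                self.travel = 0.0
                return d < 0   # raise hands: gravity off; lower: back on
            return gravity_on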

Change the block shape. Virtual reality gives us the power to augment our digital selves with capabilities that mirror real-world wearable technologies. We are all cyborgs in VR. For Blocks, we built an arm interface that only appears when the palm of your left hand is facing up. This is a combination of metaphorical and abstract interactions, so the interface has to be very clean and simple. With only three large buttons, spaced far apart, users can play and explore their options without making irreversible changes.

Revisiting our three interaction types, we find that the essential interactions are direct or metaphorical, while abstract interactions are optional and can be easily learned:

  • Direct: grab a block
  • Metaphorical: create a block, press a button
  • Abstract: thumbs up to continue, turn gravity on and off, summon the arm interface

From there, players have the ability to create stacks, catapults, chain reactions, and more. Even when you’ve mastered all the interactions in Blocks, it’s still a fun place to revisit.

Want to see the full Explorations in VR Design series? Go to developer.leapmotion.com/explorations, where you'll find new explorations every week.