
Virtual Reality User Experience and Design

Virtual reality’s journey from high-tech, futurist dream to mass-market consumer product has been long and bumpy. In fact, without the patience and enthusiasm of dedicated experimenters and researchers, we probably still wouldn’t be here.

Even the astounding success of recent VR products like the Oculus Rift shows just how hard it’s been to get the tech ready for the mainstream. When Palmer Luckey sought out funding in 2012, it had already been about two decades since early headsets like the Sega VR and the Virtual Boy had tried (and failed) to bring the virtual reality user experience to the commercial market. Outside of well-funded research labs and enterprising hardware hackers, pretty much no one had access to the technology, and it took a Kickstarter campaign to get an affordable developer kit in the hands of enthusiasts.

Even now, headsets like the Rift and the HTC Vive are still a pretty niche market. If not for the spread of smartphone VR options like Google Cardboard, most consumers still wouldn’t be able to use VR.

This reveals a difficult truth for mobile app design and development professionals: VR is still a new and uncertain technology. Yes, it’s received a ton of buzz, and there’s been a rapid growth in sales and enthusiasm, but consumers are fickle, and this isn’t the first time we’ve all been told VR was the key to the future.

Will VR really change everything, or is it destined to remain a niche market? Well, that depends on us, the mobile app design and development community.

Virtual Reality and Immersive Experience
VR is synonymous with giving users an immersive experience. Instead of interacting with a flat screen that’s separate from them, the user dives into a simulated world that surrounds them. It asks a lot more of the user. Physically, the display is literally strapped to their face, covering their whole field of vision and (in many cases) their ears as well. Psychologically, they’re not using an app you designed; they’re inside a world you created.

To make that demand worthwhile, you need to create a world that feels compelling, real (although not necessarily realistic), and immersive. However, the tools you have to do that vary between devices.

At its most immersive, VR would enable your users to interact with the virtual world in much the same way they interact with the real world around them. They’d be able to:

Look around to see objects in three dimensions.
Perceive sound, shadows, light and so on as if they were located in three-dimensional space.
See objects interact with each other and the user in a fluid and believable way.
Physically manipulate and interact with virtual objects and characters using their actual body movements.
Move around in physical reality, and have those movements reflected within the virtual world.

However, there are obvious obstacles to this. If you’re walking around and can only see the virtual world, you’re liable to trip and fall or walk into traffic. In a completely immersive experience, you’d need to be able to interact with the virtual world and your physical environment simultaneously, with the virtual environment layered on top of the physical world around you.

This model, called mixed reality, is already under development. Technologies like the Microsoft HoloLens can project Minecraft creations on top of real tables, or let you battle insectoid robots that come out of the walls.

However, for the vast majority of users, that type of immersive technology isn’t available yet. Each virtual reality tool places its own constraints on how users can interact with software and, by extension, on what VR UX design can accomplish. Dedicated VR headsets like the HTC Vive, Oculus Rift, and PlayStation VR offer a fairly immersive virtual reality experience. They can all track head movement, and they can track body movement within limits that depend on the headset, peripherals, and configuration. The HTC Vive in particular offers six degrees of freedom (6DoF) with room-scale VR, meaning it can track the user’s movement through an entire room, up to 15 feet by 15 feet.
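
To make the tracking side of this concrete, here is a minimal sketch using the WebXR Device API, which is one way (an assumption here, since each headset also ships its own native SDK) to request a room-scale session from a browser and read the headset’s six-degree-of-freedom pose each frame. The function name startRoomScaleSession is made up for the example, and the snippet assumes WebXR type definitions are available to the TypeScript compiler.

```typescript
// Minimal sketch: request a room-scale VR session and log the headset's
// 6DoF pose (position relative to the floor-level origin) every frame.
async function startRoomScaleSession(): Promise<void> {
  const xr = navigator.xr;
  if (!xr) {
    console.log("WebXR is not available in this browser.");
    return;
  }

  // Normally this must be called from a user gesture, e.g. a button click handler.
  const session = await xr.requestSession("immersive-vr", {
    optionalFeatures: ["local-floor", "bounded-floor"], // room-scale tracking, if supported
  });

  // Prefer room-scale ('bounded-floor'); fall back when the device can't report bounds.
  const refSpace = await session
    .requestReferenceSpace("bounded-floor")
    .catch(() => session.requestReferenceSpace("local-floor"));

  const onFrame = (_time: number, frame: XRFrame): void => {
    const pose = frame.getViewerPose(refSpace);
    if (pose) {
      const { x, y, z } = pose.transform.position; // metres from the floor-level origin
      console.log(`Head at (${x.toFixed(2)}, ${y.toFixed(2)}, ${z.toFixed(2)})`);
    }
    session.requestAnimationFrame(onFrame);
  };
  session.requestAnimationFrame(onFrame);
}
```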

These headsets offer sophisticated controllers that let the user interact with objects in the world through a combination of motion and button presses, though the exact capabilities also depend on the technology. However, even at the high end, VR headsets have limits. The user generally needs to be in a room set up with a clear area, and most headsets need to be connected by a cord to a powerful computer or gaming console.
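
Continuing the same hedged sketch, motion-plus-button controllers surface in WebXR as tracked input sources: each one has a grip pose you can query per frame and a standard Gamepad object for its buttons. The helper name readControllers, and the assumption that the trigger maps to button 0 (per the xr-standard mapping), are illustrative.

```typescript
// Sketch: inside the per-frame callback above, read each controller's pose
// and check whether its trigger is pressed.
function readControllers(session: XRSession, frame: XRFrame, refSpace: XRReferenceSpace): void {
  for (const source of session.inputSources) {
    if (!source.gripSpace) continue; // e.g. gaze-only input has no tracked grip

    const gripPose = frame.getPose(source.gripSpace, refSpace);
    const trigger = source.gamepad?.buttons[0]; // trigger is button 0 in the xr-standard mapping

    if (gripPose && trigger?.pressed) {
      const p = gripPose.transform.position;
      console.log(`${source.handedness} trigger pressed at (${p.x.toFixed(2)}, ${p.y.toFixed(2)}, ${p.z.toFixed(2)})`);
    }
  }
}
```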

Understand the Optics of the Virtual Reality Experience
All virtual reality headsets depend on a screen that’s held very close to the user’s eyes. Your UX design needs to factor in how your users’ vision works in that environment, and avoid practices that cause disorientation and eye strain.

One example is what the Oculus UI guidelines refer to as “The Infinity Problem.” This occurs with simulated heads-up displays (HUDs), where the same image is displayed to each eye. HUDs have a lot of obvious uses in VR. You could have stats, like your score or health, displayed continuously within your field of vision, or post important messages in a corner where users can see them no matter how they’re oriented within the VR world.

In VR, however, this practice is verboten.

The problem has to do with how your eyes focus. If an object is relatively close to you, each of your eyes sees a slightly different image. Your brain processes the difference between those images to judge how close the object is and to build a three-dimensional picture of it. The only time both eyes see the same image is when you’re looking at an object very, very far away, at visual infinity.

A HUD creates the impression of an object that is both behind everything else and in front of everything else. It’s located focally at visual infinity, but it’s also layered on top of the world. Neither your brain nor your eyes deal with this paradox well, and it can cause disorientation, eye strain, and a generally unpleasant experience.
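
A quick back-of-envelope calculation shows why “same image in both eyes” reads as “infinitely far away.” Assuming a typical interpupillary distance of roughly 64 mm (an illustrative figure, not something taken from the Oculus guidelines), the angle the eyes converge through to fuse a point shrinks toward zero as the point recedes:

```typescript
// Rough sketch: the convergence angle needed to fuse a point at a given
// distance, assuming a typical ~64 mm interpupillary distance (IPD).
const IPD_METRES = 0.064;

function vergenceAngleDegrees(distanceMetres: number): number {
  // Each eye rotates inward by atan((IPD / 2) / distance).
  const halfAngle = Math.atan(IPD_METRES / 2 / distanceMetres);
  return (2 * halfAngle * 180) / Math.PI;
}

[0.5, 2, 10, 100, 10000].forEach((d) =>
  console.log(`${d} m -> ${vergenceAngleDegrees(d).toFixed(3)} degrees`)
);
// ~7.3° at 0.5 m, ~1.8° at 2 m, ~0.37° at 10 m, ~0.04° at 100 m, effectively 0° at 10 km.
// Zero disparity between the eyes is what the brain reads as "very far away",
// which is why a HUD drawn identically to both eyes sits at visual infinity.
```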

If you want to display data, it needs to be on a surface modeled in 3D space. One way to do this is by simply displaying a flat screen in front of the user — like Virtual Desktop does. However, in many applications, it may make sense to use more creative ways of displaying data. For example, you could have a control room, or put plaques under objects that glow when the user looks at them.
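
As a sketch of the “surface modeled in 3D space” approach, the snippet below builds a small text plaque as a mesh that lives in the scene, anchored to the world rather than to the camera. It uses three.js purely for illustration (the article doesn’t prescribe an engine), and makePlaque is a made-up helper name.

```typescript
import * as THREE from "three";

// Sketch: render status text onto a canvas and put it on a plane that exists
// in the 3D world (a plaque), instead of compositing it over both eyes as a HUD.
function makePlaque(text: string): THREE.Mesh {
  const canvas = document.createElement("canvas");
  canvas.width = 512;
  canvas.height = 128;
  const ctx = canvas.getContext("2d")!;
  ctx.fillStyle = "#222";
  ctx.fillRect(0, 0, canvas.width, canvas.height);
  ctx.fillStyle = "#fff";
  ctx.font = "48px sans-serif";
  ctx.fillText(text, 20, 80);

  const material = new THREE.MeshBasicMaterial({ map: new THREE.CanvasTexture(canvas) });
  return new THREE.Mesh(new THREE.PlaneGeometry(0.5, 0.125), material);
}

// Usage: anchor the plaque in the world, not to the camera.
const plaque = makePlaque("Score: 120");
plaque.position.set(0, 0.8, -2); // about 2 m ahead of the start position, below eye level
// scene.add(plaque);            // assumes an existing three.js scene
```

Because the plaque is an ordinary object at a real depth, both eyes see it with consistent disparity, and it can be occluded naturally by anything in front of it.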

Depth occlusion is another problem VR designers need to take into account. In traditional mobile app design, you can generally stick a menu wherever it’s needed without worrying too much about what it blocks. For example, if users are playing a game and want to turn off the music, or change some other setting, you can just pause the game and slide an “Options” menu on top.

In VR UX design, however, if you put a menu in front of the player, it might end up inside a wall or stuck halfway through an object. This breaks the visual coherence of the world and disorients the viewer, causing problems similar to those of a HUD. There are a lot of different ways to address this, such as:

Projecting the menu closer to the user than other objects are displayed (see the sketch after this list).
Fading to another scene when the user loads the menu.
Temporarily making objects closer than the menu disappear.
Changing to flat, two-dimensional display when the player calls up a menu.
Storing the menu in a particular location that the user can move to.
Displaying the menu on an in-world object — for example, a notebook or wrist computer that the user carries.
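
Here is a rough sketch of the first option in the list above: before spawning the menu, check how much clear space is in front of the user and place the panel just inside it, so it can’t end up buried in a wall or a prop. It is written against three.js purely for illustration; the placeMenu helper and the specific distances are assumptions, not a prescribed recipe.

```typescript
import * as THREE from "three";

// Sketch: cast a ray where the user is looking and position the menu closer
// to the user than anything it would otherwise intersect.
function placeMenu(menu: THREE.Object3D, camera: THREE.Camera, scene: THREE.Scene): void {
  const forward = new THREE.Vector3();
  camera.getWorldDirection(forward);

  // Only consider geometry between 10 cm and 2 m in front of the user.
  const raycaster = new THREE.Raycaster(camera.position.clone(), forward, 0.1, 2.0);
  const hits = raycaster.intersectObjects(scene.children, true);

  // Default to 2 m away; if something is closer, tuck the menu 10 cm in front
  // of it, but never nearer than 0.4 m (too close is uncomfortable to read).
  const preferred = 2.0;
  const clearance = hits.length > 0 ? Math.max(hits[0].distance - 0.1, 0.4) : preferred;
  const distance = Math.min(preferred, clearance);

  menu.position.copy(camera.position).addScaledVector(forward, distance);
  menu.lookAt(camera.position); // keep the panel facing the user
}
```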
