In EVE: Valkyrie, players can experience an exhilarating sense of speed as their ship flies through structures in simulated deep space. A user gazing up at the Eiffel Tower during a virtual tour of Paris may feel dwarfed by the architecture. Virtual roller coasters defy expectations and feel true-to-life. These are features of VR, and the particular intensity of these sensations is unique to the medium. This feat is made possible by engineers and developers who study perceptual psychology to understand the interplay between VR technology and the human mind.
“Everything comes down to a perceptual experience,” says Kimberly Voll, a senior technical designer at Riot Games with a background in computer science and perceptual psychology. Part of her job is to determine how memory, expectations and attention shape the way the brain interprets sensory input. And she isn’t alone in this endeavour: Oculus VR has an entire department of perceptual psychologists conducting this kind of research.
One of the first people to work on perception at Oculus VR was Steven LaValle, who was hired as the company’s chief scientist only days after the Rift’s successful Kickstarter campaign. In this role, he led a team of perceptual psychologists to develop perceptually tuned headtracking. The goal of all this work? To give the user a sense of presence.
“The feeling of presence seems to occur when enough of the sensory cues provided by VR are in agreement so they overwhelm the sensory input from the real world that hasn’t been co-opted by VR,” says LaValle. When the sensory cues aren’t in agreement, the body will find ways to let the player know, potentially inducing nausea and often breaking immersion.
For this reason, developers must understand the role of the vestibular system in perception. If the cues from our vestibular system, which senses motion and orientation, don’t match the cues from our eyes, the body is quick to let the player know that what they’re perceiving is in some way inconsistent with the reality they’re used to experiencing. ‘Perceptually tuned headtracking’ is designed to circumvent this problem.
The vestibular system is also behind mismatched locomotion and acceleration cues. One way to handle locomotion without requiring real movement from the user is ‘tunnelling’, which blurs or masks the peripheral vision to diminish the visual perception of motion. Another is the Vive’s body tracking, which translates real-world movement directly into the virtual world.
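The tunnelling technique described above is often implemented as a motion-driven vignette: the faster the artificial movement, the more the periphery is masked. The sketch below illustrates one plausible mapping; the function names, thresholds and the specific linear scaling are illustrative assumptions, not taken from any particular engine or the developers quoted here.

```python
# A minimal sketch of 'tunnelling' for VR comfort: mask the peripheral
# vision in proportion to artificial (stick-driven) motion, so the eyes
# receive weaker motion cues that the vestibular system can't confirm.
# All names and constants below are hypothetical, for illustration only.

def vignette_strength(linear_speed, angular_speed,
                      max_linear=5.0, max_angular=90.0):
    """Return a 0..1 vignette intensity from the current artificial motion.

    linear_speed  : metres per second of stick-driven movement
    angular_speed : degrees per second of stick-driven turning
    """
    linear_term = min(linear_speed / max_linear, 1.0)
    angular_term = min(angular_speed / max_angular, 1.0)
    # Whichever motion component is stronger drives the tunnelling effect.
    return max(linear_term, angular_term)

def effective_fov(base_fov_deg, strength, min_fov_deg=40.0):
    """Shrink the visible field of view as vignette strength rises."""
    return base_fov_deg - strength * (base_fov_deg - min_fov_deg)
```

In use, a renderer would call `vignette_strength` each frame and feed `effective_fov` (or an equivalent screen-space mask radius) into a post-process vignette, so a stationary player sees the full field of view and a fast artificial turn sees a narrowed one.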
Presence is further maintained when developers convey proper expectations. Continuing on the pitfalls of acceleration in VR, LaValle tells me of an encounter in an unspecified game demo in which a menu rushes from the background towards the player, making some feel nauseous. The primary cause of the issue was the users’ perception that they were accelerating towards the menu. When they were told they were, in fact, stationary and the menu was coming to them, the problem was alleviated. As evidenced by a variety of optical illusions, expectations are a powerful force in shaping perception, and a virtual environment that’s consistent with our expectations is one that’s capable of commanding our attention.
Expectations also play a large role in something Voll calls “the fidelity contract”. The term refers to the way expectations carried over from reality manifest within a simulation as a system of rules. Gratuitous detail, for example, can cause a breach of the fidelity contract: if a user can’t interact with the bulk of what they see, the illusion VR provides starts to fade. “We need to work with the brain,” says Voll. “The simpler we stay, the more we can figure out what we can get for free.”
And the brain does, indeed, give developers a shocking amount for ‘free’. “When it comes to your own representation of yourself, your brain knows where you are in space. It doesn’t need to look at your hands to know where they are. And it turns out your arms don’t play a huge role in this,” says Voll. “If you have no arms and just hand models, your brain is like, ‘Yeah, I got arms, it’s cool.’ But if you model them and they don’t behave the right way, your brain is like, ‘Something is wrong with these arms!’ and now you’re distracted by these arms instead of just not thinking about them.”
To discover these habits of mind, the onus may be on software developers to experiment as much as possible. But they need to do so scientifically, staying aware of the mind’s ability to accommodate new ideas: designers can become so adapted to their software that they stop noticing critical flaws, and users can adapt past the initial novelty of being in VR.
In the case of adapted users, traditional aspects of game design become important. When users are engaged emotionally or creatively, VR commandeers their attention, itself a component of perception; while that attention is focused on the details of the game, external signals that might impede presence are muted.
It’s scary to think, but what we perceive as reality is somewhat illusory in nature: our perception is composed of a constantly shifting series of signals and fluctuating priorities. As developers continue to experiment with the mix, they’ll learn more about us and how to better manipulate our perception, creating stronger feelings of presence. In essence, the main selling point of VR is that it will only get more compelling.
By Benjamin Maltbie