Razer's haptic ecosystem for PC works, but it overloaded my senses

At CES last week, Razer announced that it's bringing the HyperSense technology from its Nari Ultimate headset to other peripherals. The plan is for keyboards and mice, and maybe other peripherals in the future, to provide haptic feedback for explosions, frontal assaults, and the pitter-patter of enemies running up from behind—all timed to what is happening on screen.

During the show, we got to try Razer's full HyperSense ecosystem firsthand. The haptic feedback system uses your peripherals' positioning on a traditional PC gaming setup, along with positional audio and specific in-game sound cues, to provide the tactile feedback needed to create a feeling of full 360-degree immersion. The entire setup included Razer's Nari Ultimate headset, a mouse and wrist rest with haptics by Lofelt—the same company that partnered with Razer on the Nari—and a chair with haptics by Subpac. We played Overwatch and Doom to test out the tech's varying levels of intensity.

To put it simply, the haptic feedback works, and it works well. Left-click and you'll feel your mouse jerk at the same time your character fires their gun. Turn your back on an enemy and your chair will alert you to the consequences. It's one of the more interesting and polished 'immersion' systems I experienced at CES, but it left me with mixed feelings.

Everything was well-timed and synced. I could feel footsteps in my left wrist. When I was struck by incoming fire, I felt it in my back and on the sides of my head. I had more fun trying this kind of gaming 'immersion' for the first time than I've had with VR, and felt more immersed in the game overall. But I wanted a way to dial down the intensity of some of the peripherals in the moment, namely the Nari Ultimate. My experience was reminiscent of The Addams Family Generator at an amusement park: the headset vibrations radiated from my ears and temples all the way down both sides of my jaw, leaving a tingling feeling in my face for several minutes.

But the rest of the haptic ecosystem felt kind of cool while playing Overwatch, running around the training ground as Pharah, steadily firing at enemy robots. After a while, I got used to the tactile feedback coming from the mouse, wrist rest, and chair. I can't say whether competitive players would want the haptic feedback turned on all the time, but Overwatch itself seems like a game that meshes well with the whole haptic ecosystem—granted, I have yet to try it during an actual match.

Playing Doom was like cranking the vibrations up to 11, and unfortunately, it took me out of the game. The bombardment of tactile feedback on top of loud sounds and on-screen action was sensory overload. For Doom in particular, I couldn't see myself using the haptic ecosystem for an extended period of time. It was a neat experience, but the rumbles were too much.

Razer is currently working with several developers to bring its haptic ecosystem to other games, and I imagine the system would feel different depending on the game. I would honestly love to try Razer's system again with a walking sim, to see how it responds to a slower-paced game focused on small actions instead of loud explosions. It'd be interesting to see that level of haptic subtlety, as that's my preferred level of intensity.

The Nari Ultimate headset is already available, but the wrist rest, mouse, and chair are still in the prototyping phase. There's no word on when the full ecosystem will be available for consumer purchase, but considering how well-synchronized all the peripherals seemed, maybe a full launch isn't terribly far away.

When Joanna's not writing about gaming desktops, cloud gaming, or other hardware-related things, she's doing terrible stuff in The Sims 4, roleplaying as a Malkavian, or playing horror games that would give normal people nightmares. She also likes narrative adventures.