Razer’s big announcement for CES 2015 was Open Source Virtual Reality, a project aimed at building an ecosystem for VR development that would support all sorts of VR hardware. But what does that actually mean? We weren’t sure, so we spent some time talking to Razer about what OSVR means for the development of VR software and hardware. While it’s still not entirely clear if OSVR will make VR development easier for developers in the long run, Razer cleared up some of the confusion by explaining the benefits of their OSVR plugins for Unreal and Unity, and how they hope to make platform-agnostic VR development easier.
We also talked a bit about what Razer gets out of this whole project. They want to see VR succeed, and think a platform-agnostic ecosystem is the best way to ease VR development for game developers. When VR is thriving, Razer wants to enter the fray with peripherals, not HMDs. We may see the Razer Hydra come back in some evolved form a year or two from now.
Wes Fenlon, PC Gamer: Can you explain what OSVR is doing on the software side to help with VR development?
Razer: On the software side, OSVR is an open source framework that lets developers interact with a variety of devices, from HMDs, including the OSVR headset itself, to peripherals like the Nod, Hydra, Leap Motion, and about a dozen others right now. We’re looking to expand that using libraries like VRPN, which can bring about a hundred or so peripherals to work out of the box with OSVR.
For game developers using our framework, they can create unique events that can be integrated into their game or application. So you can use a combination of devices to create things like gestures: if you’re using the Nod, for example, and you want a hand-rotation gesture to trigger one action, and a Leap Motion gesture to trigger another, you can define that, and the software will automatically notify your application through an event-driven system when it happens.
PCG: So you’re talking about simplifying it so that a game developer can say “if you’re using…”
Razer: The idea is to abstract the complexities of game development right now. If we don’t build this, the ecosystem will never survive. We need a forum for developers to focus on the content and not worry about the actual execution of how to do it.
PCG: Abstracting is the word I was looking for. So if you’re trying to write a command in a game, say for hand gestures of some kind, OSVR takes that input: if a Leap Motion is connected, it supplies the code to control that, and if it’s some other implementation, it detects what the hardware is and picks the right code, so you’re not doing it manually every time.
Razer: Exactly. We have the concept of device classes: if you have a device like a Hydra or Nod, you can create filters that interpret the events being generated and pass them to the application. We’re not exactly calling them drivers, but they basically provide the support for talking to the peripherals and bringing that sensor data into OSVR. There’s an event model where logic can be written to interpret events in ways that are meaningful for developers. Then there’s support for the game engines, Unity and Unreal, so a game developer who says “I don’t want to mess with all that stuff, I just want it to work” can incorporate our Unity plugin and get positional tracking and the rest of OSVR built in.
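To make the abstraction Razer describes more concrete, here is a minimal sketch of the device-class and event-filter idea in Python. All names and data shapes here are hypothetical, purely for illustration; OSVR’s actual client API is a C/C++ library, not this code.

```python
# Illustrative sketch of "device classes" and "filters": the game
# subscribes to named events, never to specific hardware.
from typing import Callable, Dict, List


class HandTracker:
    """A 'device class': any hand-tracking peripheral (Hydra, Nod,
    Leap Motion...) exposes the same interface to the application."""
    def poll(self) -> dict:
        raise NotImplementedError


class LeapMotionBackend(HandTracker):
    """One concrete backend. A real one would read sensor data."""
    def poll(self) -> dict:
        return {"rotation_deg": 95.0}  # fake sample for the sketch


class EventBus:
    """Delivers named events to whatever handlers the game registers."""
    def __init__(self) -> None:
        self.handlers: Dict[str, List[Callable[[dict], None]]] = {}

    def on(self, event: str, handler: Callable[[dict], None]) -> None:
        self.handlers.setdefault(event, []).append(handler)

    def emit(self, event: str, data: dict) -> None:
        for handler in self.handlers.get(event, []):
            handler(data)


def rotation_gesture_filter(tracker: HandTracker, bus: EventBus) -> None:
    """A 'filter': interprets raw sensor data as a gesture event."""
    sample = tracker.poll()
    if sample["rotation_deg"] > 90.0:
        bus.emit("hand_rotated", sample)


# The game code below never mentions Leap Motion directly; swapping in
# a Hydra- or Nod-backed HandTracker would require no game changes.
bus = EventBus()
bus.on("hand_rotated", lambda d: print("gesture!", d["rotation_deg"]))
rotation_gesture_filter(LeapMotionBackend(), bus)  # prints: gesture! 95.0
```

The point of the design is the middle layer: the filter owns the hardware-specific interpretation, so adding support for a new peripheral means writing one new backend, not touching every game.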
PCG: So on the game engine side, you’re basically talking about an OSVR plugin in Unity and Unreal. They’re there already?
Razer: We have a Unity plugin already, and we have an Unreal plugin in the source code, so that developers can access a tracker and some of the variables coming from the HMD.
PCG: Most open source projects start pretty barebones, as a community thing. And then open source developers rally around it and make it bigger. Or not, if it’s not successful. How much has Razer put into this already, versus where it’s going to go, if the open source community rallies around it?
Razer: OSVR right now is still at a very early stage, although it is usable right now. The end goal is to make it so simple that developers don’t have to spend much time writing code. We also hope to get contributions from the community, so that when new devices come out, we don’t have to be the first to implement support; the community can add it. We’re not there yet; it’s fair to say we’re at an alpha stage. But it’s a working alpha.
PCG: What about Oculus? Is Oculus Rift DK2 compatible with OSVR right now?
Razer: We’re supporting DK2 already. It’s supported via the Oculus SDK.
PCG: Right now, Oculus is kind of the only game in town for VR. You’ve got Sony doing Project Morpheus. I know there’s some VR at the show, but I don’t know if anyone is a real competitor for Oculus right now. Do you think you’re going to have to put forth some effort to convince people to work within OSVR rather than going straight to the Oculus dev kit?
Razer: From what we’ve seen, there’s not a lot of convincing that's required. Ultimately what we’re giving to developers is one solution that fits everything, including Oculus. As a developer you’re interested in an install base, in simple development, and in a great experience. They can get all these things in OSVR.
They get the maximum install base and a great experience, and the experience can differ a lot between different HMDs, right? It can still be the best experience on one specific HMD. But it’s up to the HMD guys to deliver the best solution.
But it creates an open market where, if today you’re not the best solution, but you come up with a better solution six months down the road, you can compete within that space. It’s not exclusive, it’s inclusive, which is the important aspect of it. That’s why a lot of developers were immediately interested in it, and we’ve got a ton of developers really looking forward to coming on board with OSVR. You’ll see a lot of announcements between now and GDC, and some others leading up to E3, where people will come on board depending on where they are with their VR development.
The response from the developer side has been amazing. It means they’re independent of hardware. They can focus on their game. They don’t have to decide, okay, we’re going to bet on this horse or that one, and they don’t put their fate in someone else’s hands either.
PCG: What drove your decisions for the pieces you put in your HMD? You call it the hacker dev kit, so it’s not going to be a consumer product.
Razer: It’s not at all a consumer unit. We had a couple of cool things available on the hardware side. We’ve done research and development for VR for a long time, and one thing that came out of that is, for example, the double lens system we use, which reduces distortion. You see a pretty clear image, because there’s almost no distortion through the lenses, so the software has to do almost no distortion correction at all. So ours is a crisper, cleaner image, and at the same time it reduces development effort. The other thing is the adjustable optics, which on the mechanical side allow for different interpupillary distances, so no matter how large your face is, you can make it work for you. Those are things we thought we could bring to the table. There’s also stuff like having the belt box on there, so you don’t have cables going from your head to the PC; they go to your belt, so you have full freedom of movement. We wanted to make it available for anyone. We’d be happy for anyone doing HMDs right now to take anything that’s in there and incorporate it in their designs. That’s what it’s there for.
...We want to find the best technologies for all these things. On the peripheral side and the HMD side, there are a ton of different technologies being thrown around, and it’s very difficult to figure out which is best for each of those things. Once we roll this out and it’s in the market for a while, we’ll see the community coming in and trying different technologies. That’s when we figure out what the best technology is, and then bring it to the consumer space, which is our ultimate goal.
For us, our expertise is on the peripheral side of things, so that’s where we’ll go first. The first announcement you’ll see from us in terms of a consumer product is going to be on the peripheral side. We’re not interested in HMDs at the moment. This is a developer unit, not a consumer unit; it’s just to help the competition get in there. Ultimately what we’re interested in is making virtual reality a reality. That’s cliche and cheesy as hell, but that’s what we want to do: get it to the consumer space, get it ready, and, because we’re not entirely altruistic, be part of the ecosystem on the peripheral side of things. It’s not crucial for us to be in the HMD space.
PCG: So when you said you ultimately wanted to move into the consumer space, you didn’t mean with an HMD?
Razer: No. We’re primarily interested in the peripheral side of things...We have demos with the Hydra and the Nod Bluetooth ring, so there are different types of UI technologies we’re evaluating. On the HMD side, other people can do the evaluation; we’re delivering the framework for it. Our focus is going to be delivering the best way to interact.
We’re also looking at more exotic stuff like the Virtuix Omni or 3D Runner, which was announced just a day or two ago: all these exotic types of virtual reality input mechanisms and devices. What gives you the most immersive experience? That’s what we’re interested in, giving the best possible game experience from that side of things.
If Oculus takes anything from this, we’d be happy. If they get a better HMD out of this because they incorporate any of our stuff, great, fantastic.
PCG: The dual lenses, reducing distortion, brings up an interesting question for me when it comes to platform-agnostic development. If someone is using this headset to test out their game, and they’re programming their game based on how the game is rendered on that screen, say they’re using an Oculus and it is doing some distortion correction because it doesn’t have the same lens system...how does OSVR on the software side compensate for those differences?
Razer: That’s where device detection comes in again, having that layer in between where the HMD knows whether to distort or not. Basically, the game engine and the HMD together need to know: do we do distortion correction with this HMD or not? The game developer puts out the video output, and before it goes to the display, the decision needs to be made: are there any corrections that need to be applied?
On our hacker dev kit, no it’s all good. But on other HMDs you may need brightness adjustment, distortion correction, whatever it is. But that should be decided by the HMD itself as opposed to the game, so the game can be agnostic of the hardware, and the HMD makes sure that whatever they get, they make it look the best possible way. So there’s nothing wrong with distortion as long as you make it look right on the HMD.
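The division of labor Razer describes can be sketched in a few lines: the HMD ships a descriptor of its own correction needs, and the render pipeline consults it, so the game submits the same undistorted frame regardless of headset. Everything below is hypothetical and illustrative; the device IDs, fields, and pass names are not OSVR’s real display schema.

```python
# Illustrative sketch: per-HMD descriptors decide which correction
# passes run before scan-out. The game never branches on hardware.
from typing import Dict, List

# Hypothetical descriptors an HMD vendor might publish.
HMD_DESCRIPTORS: Dict[str, dict] = {
    # A dual-lens design needing little software correction:
    "low_distortion_hdk": {"distortion_correction": False,
                           "brightness_gain": 1.0},
    # A single-lens design needing correction and a brightness tweak:
    "single_lens_hmd":    {"distortion_correction": True,
                           "brightness_gain": 1.2},
}


def correction_passes(hmd_id: str) -> List[str]:
    """Return the post-render correction passes for this headset."""
    desc = HMD_DESCRIPTORS[hmd_id]
    passes: List[str] = []
    if desc["distortion_correction"]:
        passes.append("barrel_distortion")
    if desc["brightness_gain"] != 1.0:
        passes.append("brightness")
    return passes


# The game submits an identical frame either way; only the
# display layer's pass list differs per headset.
print(correction_passes("low_distortion_hdk"))  # []
print(correction_passes("single_lens_hmd"))
# ['barrel_distortion', 'brightness']
```

This is the answer to the interviewer’s question in code form: developing on a low-distortion headset doesn’t bake its characteristics into the game, because corrections live in the descriptor-driven layer, not in game code.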
That segments the work as well. Everyone does the work they should be doing. As an HMD guy I should be the expert on how to make it look amazing and how my head tracker can give the fastest possible information back to the game. That’s what I should be focusing on. As a UI guy, I need to give the most immersive experience in terms of controls. As a game guy, I should be focusing on the game, not how we detect devices and get into the game and make it look nice on an HMD.
That’s our vision behind it at the end of the day. Everyone focuses on what they’re good at, and combining that expertise will push the technology forward. Because there are still tons of issues with VR today. To overcome all of them, if you have one company doing it, or everyone for themselves, god knows how long it’s going to take. We thought VR was going to be ready two years ago, then one year ago, then now, and it’s still not ready. That’s kind of why we went into this space and made it all open, because it’s not getting there at the pace we’d like it to.
PCG: After using the most recent Oculus headset, I feel like they’re getting close to getting the HMD stuff down. One of the big question marks left is controls. A controller works for some things but is not really the right input. Razer had the Hydra a few years ago, but you don’t still sell units, do you?
Chris: We didn’t. I think, actually, a lot of the stuff we learned from the Hydra influences how we approach OSVR. It was a technology that was interesting, but it lacked adoption in the big picture. Now, with virtual reality, there’s high demand for motion-sensing controls. We’re looking very closely and evaluating technologies, whether it’s something like the Hydra or something else entirely, but we are looking at that space. Whatever we announce as an actual consumer product will be on the peripheral side.