The New Cinema
While Oculus Connect is primarily a developers' conference, there were plenty of content creators and creative types in attendance, too. For the second talk on the first day, people crammed into the theater where Facebook's David Pio had just spoken about streaming, this time to see Rob Bredow explain how Lucasfilm and Industrial Light & Magic are using VR to design experiences and plan shots.
I think most of the people came to the talk because, you know, Star Wars. They weren't far off, since Bredow showed lots of video of how Lucasfilm and ILM were using VR technology to tell stories within George Lucas's galaxy far, far away.
One of the first tools that Bredow showed off was the use of VR to scout locations, using an in-house tool called V-Scout. The scouting tool allows directors to plop down digital assets on a landscape and move around within it to find the best angles and block out the action. The tool, he said, can mimic various Panavision lenses to give the user an idea of how the scene will actually look when shot. Scouting locations is an expensive part of filmmaking, and being able to do it remotely using topographical data and VR imagery could cut costs for film production, Bredow said.
The other really impressive tool that ILM showed off was the live rendering process they developed. The live rendering can capture human actors in motion capture suits and plop those performances into a digital scene. This isn't too unlike what games already do: dynamically render 3D scenes as they play out. But Bredow was quick to point out that this tech was quite different, and didn't focus on viewer interaction.
In one demonstration, Bredow showed a scene that had been scripted using this technology. A squad of stormtroopers patrol a desert village, looking for Rebel droids, of course. R2-D2 and C-3PO emerge from a shadowy house, requesting a pickup from a Rebel ship captain. In the distance, a ship holds off an approaching AT-AT. As the droids turn to find an alternate route, they are confronted by none other than Boba Fett, and the scene ends.
While the video already looked close to film quality as it was, Bredow backed up to show what made this so cool: with VR, you can see any part of the scene you like, and you aren't tied to what the "camera" shows you. He restarted the demo, paused the "video," and moved the camera perspective around different sides of the stormtroopers. From there, he looked over to the house where the droids were hiding. After moving the perspective to include the droids, he pressed play, and we saw and heard Princess Leia giving instructions to the droids before they emerged. Jumping around, we saw what Boba Fett was doing (blasting some folks) before he ran into the droids.
Bredow says that this technology is great for telling shorts in which viewers can examine multiple storylines that happen at the same time. The Boba Fett and droid scenes normally would have "been left on the cutting room floor," he says. But with VR and dynamic rendering, viewers can explore these hidden plot points to get a better understanding of the story.
While ILM was showing off its in-house tools, Oculus was busy giving stuff away to filmmakers. During the keynote, Oculus announced that it would be basically open-sourcing the assets used to create its VR experience Henry. While the story of Henry, a cute, melancholy hedgehog who just wants a friend who can hug him without fear of being impaled by his spines, wasn't particularly deep, the experience itself was a great proof of concept. It was pretty much like being inside a Pixar short.
Henry, the hedgehog. (Oculus)
Offering up the experience as a boilerplate for other creators to learn how to create VR experiences, while sounding very open, is somewhat analogous to giving away code examples to programmers. By seeing how an entire application works, a developer can use some of the same methodologies or tools to solve their particular problem. Creators who are interested in VR production will be able to use Henry in much the same way.
I got a chance to talk to Eugene Chung, founder of Penrose Studios, in the developer lounge at the conference. He was the only VR film developer in the room, which was dominated by game developers. I wanted to pick his brain about what he, as an artist, thought about VR.
Chung showed me a short called The Rose and I that Penrose had produced, based on the classic story of Le Petit Prince (The Little Prince). Much like Henry, it was more of a film than a game, though it was rendered dynamically and you could move around in the environment and look in all directions.
From what I can tell, what to call passive VR “films” is still up in the air. “VR experience” seems to be the popular term, but “VR film” has been used here and there, too.
“It truly is a new art form,” Chung told me. He likened the rise of VR to the emergence of film. “I can tell you that just as cinema is its own language, VR is its own language.” VR as a medium, Chung said, is just as different from film as film was from theater. The storytelling remains much the same, but how you visually represent stories is quite different.
That doesn’t mean that VR will kill the movies. People still go to plays, after all. The VR experience is wildly different from seeing a film on the big screen. As it is now, there’s no real replacement for hugging your significant other or laughing with your friends while watching a film together.
ILM's programs weren't the only creative tools that were highlighted at Connect. If there was one non-game that stole the show at Connect, it was Medium. The program serves as Oculus's "paint program for VR."
I got to play with Medium, and I thought it was actually the best use of Oculus Touch that I experienced (to be fair to other devs, there are only so many demos you can try in a given time). While I played in Medium with an Oculus software engineer who works on the project, I felt a childlike joy as I discovered the tools needed to create an (admittedly poor) sculpture. As much as I love the Eve: Valkyrie demo for its fulfillment of a childhood fantasy of being a starfighter pilot, Medium touched me on another level.
Wes Fenlon from PC Gamer tries out Medium at Oculus Connect.
It seems straightforward enough: You’re in a room, and you can create 3D objects with a palette of tools found on your off-hand. Once your tool and color are set, you can add material to the 3D space. It just floats there, defying gravity, which should make creating virtual pottery much easier than the real thing. When you’re done, Medium allows you to save the object as an .obj file or another 3D object format. You can then send that file to a 3D printer, plop it into a game, or use it as a 3D asset in filmmaking.
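Part of why .obj export is so handy is that the Wavefront .obj format is just plain text, readable by nearly every 3D tool. As a rough illustration (this is a generic sketch of the format, not Medium's actual output), here is how a tiny tetrahedron mesh can be written out in Python:

```python
# Sketch of the Wavefront .obj text format: "v x y z" lines list
# vertices, and "f i j k" lines list triangular faces, where each
# index is a 1-based reference into the vertex list above it.

vertices = [
    (0.0, 0.0, 0.0),
    (1.0, 0.0, 0.0),
    (0.0, 1.0, 0.0),
    (0.0, 0.0, 1.0),
]
faces = [  # four triangles forming a tetrahedron
    (1, 2, 3),
    (1, 2, 4),
    (1, 3, 4),
    (2, 3, 4),
]

def write_obj(path, vertices, faces):
    """Write a minimal .obj file: vertex lines first, then face lines."""
    with open(path, "w") as f:
        for x, y, z in vertices:
            f.write(f"v {x} {y} {z}\n")
        for a, b, c in faces:
            f.write(f"f {a} {b} {c}\n")

write_obj("tetrahedron.obj", vertices, faces)
```

A file this simple is exactly why the pipeline Bredow and the Medium team describe works: any 3D printer slicer, game engine, or film tool can ingest it.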
The engineer had admin control of the experience, and at one point she revoked my use of the tools to demonstrate a different one, the symmetry plane. As useful as the plane is for creating objects that won’t look lopsided, I was more struck by the admin abilities of the program. I instantly thought of a teacher showing sculpting or digital art students how to create a specific shape, without fear of the students altering the object before she was ready.
While great for artists, Medium in its current state is limited for designers. It’s really tough to get straight lines or exact curves like you can get in Maya or AutoCAD. Oculus software engineer Lydia Choy said that straight-edge tools and the like are in development.
Even with its limitations, my thoughts wandered to people with disabilities. I have a friend who gets severe pain in his hands, which precludes him from doing basic things like typing for long periods. This friend is an artist too, and the loss of his hands as useful tools has been particularly hard on him. Medium with Oculus Touch offers a way for a person to create 3D art with very little physical effort. Besides the fact that you don’t have to hold up a heavy object to carve it, the Touch controllers are easy to actuate and light enough to use for long stretches.
Creative applications like Medium could have far-reaching implications, especially if developers integrate a social element into them. Working on an object by yourself is cool, but it’s way better if you can do it with someone else. Having an extra set of eyes and other ideas could help engineers, artists, and even doctors.