In May 2012, a programmer named John Carmack -- who,
as a cofounder of id Software and the man behind games like Doom and Wolfenstein
3D, is widely seen as the father of 3D gaming -- tweeted a
picture of what looked like steampunk bifocals made out of a black
shoebox. "This is a lot cooler than it looks," read the caption.
He was right.
Since then, that awkward contraption -- now more
streamlined, and known as the Oculus Rift -- has become the most anticipated
new product in gaming since the Nintendo Wii got people off the couch. It's a
head-mounted display that promises to be a gigantic step toward what many had
dismissed as an unrealisable dream: virtual reality.
The Rift is the brainchild of a 19-year-old tinkerer and VR enthusiast named
Palmer Luckey. A collector of old VR headsets, Luckey was all too familiar with
the shortcomings every system had faced -- small fields of vision, unwieldy
form factors, horrific resolution. He was also uniquely suited to do something
about it: Years of modding videogame consoles and refurbishing iPhones for fun
and profit had given him enough fine-soldering skills to start Frankensteining
pieces from his existing headset collection. Eventually, chronicling
his efforts on a message board devoted to 3D gaming, he figured out a
way to cobble together a headset with a field of vision that dwarfed anything
else on the market and allowed people to become completely immersed in a
360-degree playspace.

Luckey originally envisioned his creation as a DIY kit for hobbyists; after joining up with two partners and officially incorporating, though, they realised they could have a game-changing consumer peripheral on their hands. They began pre-selling $300 (£183) prototype headsets to software developers on Kickstarter in August 2012, just weeks after Carmack had taken his early version (on loan from Luckey) to the E3 gaming trade show. They pulled in nearly $2.5 million (£1.5 million) -- and in spring of 2013, when those units began shipping to developers, virtual reality started to seem a lot less virtual.
Since then, the hype has only gotten louder. Oculus brought an improved version of the Rift to E3 in June and showed off its 1080p resolution as well as demos that placed wearers inside a virtual movie theater watching a trailer for Man of Steel. For the first time, applications beyond gaming began to suggest themselves. People, to use the clinical term, freaked out. Hell, I freaked out. Media coverage was rapturous; devoted Oculus forums and subreddits proliferated. Oculus launched a project depository for game demos, and developers responded, creating completely new experiences for a completely new medium.
Now they just need to build a consumer version that can pay off the promise of Luckey's early experiments. And now that the company has received $75 million (£45.9 million) in Series B funding from the likes of Andreessen Horowitz -- a round that sources say values the company at more than $250 million (£153 million) -- the stakes are even higher. Before the official consumer version of the Rift (known internally as V1) becomes available in 2014, they'll have to iron out seemingly innumerable kinks, from finalising the display tech to deciding exactly which features will be included. But first and foremost, they'll have to solve a problem that has plagued VR since the days of Tron: how not to make people sick.
There are a number of hurdles to creating a seamless virtual experience. Tracking, resolution, motion blur; sometimes it seems like the list never ends. But underneath them all is the most visceral obstacle of all. It's the one that makes people feel dizzy, or hot, or nauseated. It's latency.
Latency is lag. It's the delay between inputting a command and seeing the effects of that command. You see it today in online gaming when your broadband connection gets unreliable: suddenly, you're a half-step behind the action. Your button pushes and thumbstick moves don't kick in immediately, and by the time you've seen another player he's already killed you. In that kind of situation, the only price of latency is frustration. Wearing a VR headset, though, the price is something akin to motion sickness. Oculus CEO Brendan Iribe calls it "the Uncomfortable Valley" -- that queasy feeling players get when they move their head and experience a barely perceptible delay while the world catches up. People are willing to put up with many things in the name of novelty, but nausea isn't one of them.
Of course, you can't fully remove latency -- there will always be some delay. The question is how low latency needs to be. As processing power has progressed, head-mounted displays and VR systems have claimed to solve the latency problem at ever-lower thresholds: 100 milliseconds! 40 milliseconds! Those thresholds might do away with the most frustrating delays, but they can't guarantee comfort. "It's easier to get sick from latency than it is to perceive it," Luckey says. "People in the VR industry have been disagreeing on what humans can perceive -- and that number always seems to match up to what their system is just barely able to do."
For Oculus, that magic number is somewhere under 20 milliseconds. "When you cross that, you get to something magical. It really enhances the sense of presence," says Nate Mitchell, the company's vice president of product. "I'm very confident we'll be sub-15." That's roughly half the latency the devkit version can attain, and it will require a bunch of innovations -- some that cut the actual latency, and some that just convince you that they have. To see where the milliseconds go, follow a single head movement through the motion-to-photons pipeline:
1. User input
2. A USB cable delivering that command from the Rift to a computer
3. The game engine translating that command for the graphics processor (GPU)
4. The processor issuing a "write" command for a new image
5. The Rift's display switching pixels to display the new image
6. The new image appearing in full
Oculus' task is shaving as many milliseconds of latency from as many of these stages as possible.
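To see how those stages add up, here's a rough tally in Python, using the kind of per-stage figures discussed in the rest of this piece. The numbers are illustrative estimates, not Oculus' own measurements, and the last two stages partly overlap in practice:

```python
# Back-of-the-envelope motion-to-photons budget for a 60fps devkit-era game.
# Per-stage figures are illustrative, not official Oculus measurements.
pipeline_ms = {
    "sensor read (IMU fusion)":  1.0,    # "below a millisecond"
    "USB transfer to the PC":    2.0,    # another 1-2 ms
    "game engine + GPU render":  16.67,  # one frame at 60fps
    "pixel switching / scanout": 20.0,   # the 20-30 ms "smear" described below
}

for stage, ms in pipeline_ms.items():
    print(f"{stage:<28}{ms:>7.2f} ms")
print(f"{'rough worst-case total':<28}{sum(pipeline_ms.values()):>7.2f} ms")
```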
The first step is minimising input latency -- speeding the process of translating action into digital commands. That's a matter of hardware. While Luckey's original prototypes used an off-the-shelf sensor, current Rift headsets utilise a proprietary inertial measurement unit (IMU) tracker that combines a gyroscope, an accelerometer, and a magnetometer, fusing their readings to track head motion in real time. That "sensor fusion," as Oculus calls it, drops latency for this first stage -- which was often around 15 milliseconds for the original tracker -- to below a millisecond. Getting that input command from the Rift to the computer -- where all the actual processing takes place -- adds another 1-2 milliseconds, a number that can't be reduced without reinventing the USB cable entirely.
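Oculus hasn't published the maths behind its sensor fusion, but the general idea is well known from hobbyist IMU work: integrate the fast-but-drifting gyroscope for responsiveness and correct it with the accelerometer's (and magnetometer's) absolute readings. A minimal complementary-filter sketch, with invented variable names, looks something like this:

```python
import math

def fuse_orientation(prev_pitch, prev_roll, gyro, accel, dt, alpha=0.98):
    """Minimal complementary filter: integrate the fast-but-drifting gyro,
    then nudge the result toward the accelerometer's gravity estimate.
    (A real IMU tracker would also fold in the magnetometer for yaw.)"""
    # Gyro angular rates (rad/s) integrated over one sample interval
    pitch_gyro = prev_pitch + gyro["x"] * dt
    roll_gyro = prev_roll + gyro["y"] * dt

    # Absolute -- but noisy -- orientation from the gravity vector
    pitch_accel = math.atan2(accel["y"], math.hypot(accel["x"], accel["z"]))
    roll_accel = math.atan2(-accel["x"], accel["z"])

    # Blend: trust the gyro short-term, the accelerometer long-term
    pitch = alpha * pitch_gyro + (1 - alpha) * pitch_accel
    roll = alpha * roll_gyro + (1 - alpha) * roll_accel
    return pitch, roll

# At a 1,000 Hz sample rate, each update covers 0.001 s.
print(fuse_orientation(0.0, 0.0,
                       gyro={"x": 0.5, "y": 0.0, "z": 0.0},
                       accel={"x": 0.0, "y": 0.0, "z": 9.81},
                       dt=0.001))
```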
At that point, the burden of latency shifts from the Rift itself to the game developer -- in particular, how high a frame rate the game is able to deliver. Most modern games play at 60 frames per second; at that speed, a single game image takes 16.67 milliseconds to render and get pushed to the computer's graphics processor. If a developer is able to double that speed, of course, they can halve the latency introduced by render time (a game that plays at 120fps takes about 8 milliseconds to render a single image). That's not unheard of for PC games, but for now Oculus has to assume the lowest common denominator of 16.67 milliseconds.
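The arithmetic behind that trade-off is simple enough to spell out:

```python
def render_latency_ms(fps):
    """Time spent rendering (and handing off) a single frame at a given frame rate."""
    return 1000.0 / fps

print(round(render_latency_ms(60), 2))   # 16.67 ms -- the baseline Oculus has to assume
print(round(render_latency_ms(120), 2))  # 8.33 ms  -- doubling the frame rate halves it
```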
If you're keeping track, we're already close to 20 milliseconds of latency -- and the image in your headset still hasn't changed. For that to happen, the computer's GPU needs to send the new image back to the headset and write it to each pixel in the Rift's display. Some pixel switches happen very quickly -- a black pixel can turn white in less than 10 milliseconds -- but a gray pixel becoming slightly grayer can take close to 30 milliseconds. To save time, each individual pixel starts switching as soon as it receives its "write" command from the GPU; the Rift writes from bottom to top, so by the time the GPU is sending a command to the top lines of the display, the bottom lines have already switched.
All told, that process "smears out," in Mitchell's words, to between 20 and 30 milliseconds overall; since the processing and writing happen somewhat concurrently, that adds up to a total lag of up to 40 milliseconds for a 60fps game. (For reference, Luckey's original duct-tape prototype was in the 50-millisecond range -- and Sony's $1,000 (£612) HMZ-T3W has 52 milliseconds of latency in the device itself, not counting the latency introduced by a game.) That's still too much.
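A toy model makes the smear concrete. The panel numbers here are invented for illustration, not the Rift's actual specs:

```python
def pixel_settle_ms(row, rows=800, scanout_ms=16.7, response_ms=15.0):
    """Toy display model: rows receive their 'write' one after another over the
    scanout period, then each pixel needs its own response time to finish
    switching. All constants are invented for illustration."""
    write_starts = scanout_ms * (row / rows)  # when this row's write arrives
    return write_starts + response_ms

# The Rift writes bottom to top, so the bottom rows settle early and the top
# rows settle late -- the 20-30 ms "smear" Mitchell describes.
print(pixel_settle_ms(row=0))    # first (bottom) row written: ~15 ms
print(pixel_settle_ms(row=800))  # last (top) row written:     ~31.7 ms
```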
Fortunately, Oculus has a few tricks left to employ.
Nips and tucks
So that time-consuming pixel-switching process that hogs so much of the latency pipeline? Turns out it's only time-consuming on LCD screens. OLED technology, such as what's found on a Samsung Galaxy phone, can switch pixels in a matter of microseconds. So will the consumer version of the Rift utilise OLED panels? "We haven't decided anything, but it would make a lot of sense to consider," says Mitchell (with a smile, it should be noted). "The real trick to OLED is that there's only one manufacturer in the world making OLED technology in the form factor we want." That would be Samsung -- and they've never sold their tech to any third party.
That doesn't mean Oculus isn't considering other display technologies that outperform LCD -- and it doesn't mean they're not tinkering. As part of their ongoing collaborative work with Valve Software, the two companies worked up a prototype that, along with a number of other improvements that Wired saw but can't discuss, utilised an OLED panel. That was the prototype that got them below 20 milliseconds for the first time. It was also the first time that Iribe was able to use the Rift without feeling sick. "I'm one of the most sensitive in the office," Iribe says. "People always joke that 'it's not ready to show Brendan.' And for the first time, I felt incredibly good." So where did the OLED panel come from? "We gutted a phone," says Luckey. They won't say which phone they gutted, but it's clearly a Galaxy S4.
By all accounts, that was a turning point. Andreessen Horowitz cofounder Marc Andreessen had been asking how long it would take for Oculus to conquer the Uncomfortable Valley. "We've had to be honest and say it's soon, but not yet," Iribe says. Once he'd seen the new prototype himself, though, Iribe emailed Andreessen: "I said, 'Okay, we're ready -- you need to come down here on the next flight.'"
Chris Dixon was among four people from Andreessen Horowitz who saw the newest prototype. "I'd tried the devkit and thought it was really impressive, but the latency wasn't quite there in my opinion," Dixon says. "Going and seeing the new prototype gave me confidence that they were going to solve all of those problems. I think I've seen five or six computer demos in my life that made me think the world was about to change: Apple II, Netscape, Google, iPhone…then Oculus. It was that kind of amazing." (The prototype and functionality that Andreessen Horowitz saw will not be shown at CES, as has been speculated elsewhere.)
It's not all thanks to the OLED display, though. The Rift's three-sensor head-tracking unit also samples motion data 1,000 times a second -- four times faster than the original tracker from Luckey's prototype. That high-speed data collection hasn't just let them shave off even more latency; it lets them actually predict where a player will move his head before he moves it. "There's research that says you can't predict where the head's going well enough to render ahead of it, but nobody's ever had a thousand samples a second to be able to try," says Luckey.
He goes on: "If a head's moving very quickly, you know it cannot instantly stop, and you know that it can only start slowing down at a certain speed, so you can say 'well, if they're turning their head, I know that they can't slow down their head beyond a certain amount, so I'm going to render a little bit ahead of where it's reporting they are.' Then, as your head starts to slow, it lowers the prediction back down to zero, and during a very high-speed turn it can ramp it back up, which is when you need it as well." This doesn't actually reduce the processing latency, but it lets the user see an image that matches their movement sooner, cutting the perceived latency by nearly 10 milliseconds.
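The quote describes the logic; a toy version of that velocity-scaled prediction (with invented constants, not Oculus' tuning) might look like this:

```python
def predicted_yaw(yaw, yaw_rate, max_lookahead_s=0.010, full_speed_rad_s=3.0):
    """Render slightly ahead of the reported orientation when the head is
    turning fast, and taper the look-ahead back to zero as it slows.
    The 10 ms cap and 3 rad/s 'fast turn' threshold are invented for illustration."""
    lookahead_s = max_lookahead_s * min(abs(yaw_rate) / full_speed_rad_s, 1.0)
    return yaw + yaw_rate * lookahead_s

print(predicted_yaw(0.0, 0.0))  # stationary head: no prediction at all
print(predicted_yaw(0.0, 3.0))  # fast turn (~170 deg/s): renders ~10 ms ahead
```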
And then there's Carmack, the graphics wizard who officially joined Oculus in August as CTO. Before he joined the team, Carmack wrote on his own blog about mitigating latency by changing the way the GPU writes frames to the display. (If you're not a programmer, don't even bother trying to parse it: just know that it mentions things like "late frame scheduling" and "asynchronous time warp," and be thankful there's not a quiz.) Since then, Oculus has met with GPU manufacturers NVIDIA, Qualcomm, and AMD in an effort to ensure that off-the-shelf PCs (and next-generation Android phones) will be able to pump the proper experience to the headset. "Making people buy a particular graphics card just to use Rift is totally off the table," Mitchell says.
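For the curious, the gist of the "time warp" idea can be caricatured in a few lines. This is a toy illustration of the concept, not Carmack's or Oculus' code:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    yaw: float  # head orientation in radians; rotation-only to keep the toy simple

def render_scene(pose):
    # Stand-in for the slow part: a full scene render keyed to this pose.
    return {"rendered_yaw": pose.yaw}

def timewarp(frame, rendered_pose, display_pose):
    # The fast part: just before scanout, re-project the finished image by the
    # small rotation the head made while the frame was being rendered.
    frame["warp_shift_rad"] = display_pose.yaw - rendered_pose.yaw
    return frame

pose_at_render = Pose(yaw=0.10)
frame = render_scene(pose_at_render)    # takes roughly a full frame at 60fps
pose_at_display = Pose(yaw=0.13)        # the head kept moving in the meantime
frame = timewarp(frame, pose_at_render, pose_at_display)
print(frame["warp_shift_rad"])          # ~0.03 rad of last-instant correction
```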
There's still one more piece of the puzzle: ensuring the games and applications people are developing for the Rift are as streamlined as possible. In October, Oculus began shipping the Latency Tester, a tiny tool that lets developers measure the latency of their own motion-to-photons pipeline. By placing it in one of the headset's two eyecups and pressing the test button, developers send a signal that tells their game engine to draw a colored square on the screen; the tester reports how many milliseconds it takes for the square to appear.
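Conceptually, the measurement reduces to two timestamps -- one when the test signal fires, one when the square actually lights up in the eyecup. The hardware hooks below are assumed for illustration, not documented by Oculus:

```python
import time

def motion_to_photons_ms(send_test_signal, square_is_visible):
    """Conceptual latency test: fire the signal, then poll a (hypothetical)
    eyecup photosensor until the colored square actually appears on the panel."""
    t0 = time.perf_counter()
    send_test_signal()              # asks the game engine to draw the square
    while not square_is_visible():  # the tester watches the display itself
        pass
    return (time.perf_counter() - t0) * 1000.0

# Toy usage with software stand-ins for the hardware hooks:
deadline = time.perf_counter() + 0.040
print(motion_to_photons_ms(lambda: None,
                           lambda: time.perf_counter() >= deadline))  # ~40 ms
```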
Granted, it's all still a work in progress, but after attacking it from so many angles, Oculus doesn't consider latency the bugaboo it once was. "We've crossed the Uncomfortable Valley," Iribe concludes. So it's on to other things. Mitchell points out that the Oculus team still needs to make significant progress on a number of challenges -- tracking precision, image quality, resolution -- before the consumer unit finally ships. But with an extra $75 million (£46 million) and guidance from one of the most hands-on VC firms around, they're confident they'll get there.
In the meantime, there are always more
milliseconds to lose. "We're shooting for zero," Mitchell says.
"We're super-close."