VR is either going to upend our lives in a way nothing has since the smartphone, or it’s the technological equivalent of trying to make “fetch” happen. The poles of that debate were established in 2012, when VR first re-emerged from obscurity at a videogame trade show; they’ve persisted through Facebook’s $3 billion acquisition of headset maker Oculus in 2014, through years of refinement and improvement, and well into the first-and-a-halfth generation of consumer hardware.
The truth, of course, is likely somewhere in between. But either way, virtual reality represents an extraordinary shift in the way humans experience the digital realm. Computing has always been a mediated experience: People pass information back and forth through screens and keyboards. VR promises to do away with that pesky middle layer altogether. By enveloping you in an artificial world, or bringing virtual objects into your real-world environment, "spatial computing" allows you to interact with those objects and that information far more intuitively. The same goes for VR's cousin, augmented reality, which is sometimes called mixed reality; VR, AR, and MR can all be lumped under the umbrella term XR, for "extended reality." And while VR depends on headsets, AR is (for now, at least) most commonly experienced through your phone. Got all that? Don't worry: For the purposes of this guide, we're generally just going to stick with VR.
Now VR is finally beginning to come of age, having survived the troublesome stages of the famous "hype cycle"—the Peak of Inflated Expectations, even the so-called Trough of Disillusionment. But it's doing so at a time when people are warier about technology than they've ever been. Privacy breaches, internet addiction, toxic online behavior: These ills are all at the forefront of the cultural conversation, and they all have the potential to be amplified many times over by VR/AR. As with the technology itself, of course, "potential" is only one road of many. But, since VR/AR is poised to make significant leaps in the next two years (for real this time!), there's no better time to engage with their promise and their pitfalls.
The current life cycle of virtual reality may have begun when the earliest prototypes of the Oculus Rift showed up at the E3 videogame trade show in 2012, but it’s been licking at the edges of our collective consciousness for more than a century. The idea of immersing ourselves in 3-D environments dates all the way back to the stereoscopes that captivated people's imaginations in the 19th century. If you present an almost identical image to each eye, your brain will combine them and find depth in their discrepancies; it's the same mechanism that View-Masters used to become a childhood staple.
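That depth-from-discrepancy trick is simple enough to write down. Under an idealized pinhole model, an object's distance falls directly out of the horizontal offset (the "disparity") between where it appears to each eye; the closer the object, the bigger the offset. A minimal sketch of the classic relation, with an eye-spacing and focal-length value chosen purely for illustration:

```python
def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Classic stereo relation: depth = f * B / d.

    disparity_px: horizontal offset (in pixels) of the same point
                  between the left-eye and right-eye images
    focal_length_px: focal length, expressed in pixels
    baseline_m: distance between the two viewpoints (the "eyes")
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_length_px * baseline_m / disparity_px

# A nearby object shifts more between the two views than a distant one
# (0.064 m is roughly the average human interpupillary distance):
near = depth_from_disparity(disparity_px=100, focal_length_px=800, baseline_m=0.064)
far = depth_from_disparity(disparity_px=10, focal_length_px=800, baseline_m=0.064)
assert near < far  # larger disparity means a closer object
```

This is the same geometry a stereoscope or View-Master exploits in reverse: Bake a fixed disparity into a pair of photographs, and the brain infers depth that was never there.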
When actual VR took root in our minds as an all-encompassing simulacrum is a little fuzzier. As with most technological breakthroughs, the vision likely began with science fiction—specifically Stanley G. Weinbaum’s 1935 short story “Pygmalion’s Spectacles,” in which a scientist devises a pair of glasses that can "make it so that you are in the story, you speak to the shadows, and the shadows reply, and instead of being on a screen, the story is all about you, and you are in it."
Moving beyond stereoscopes and toward those magical glasses took a little more time, however. In the late 1960s, a University of Utah computer science professor named Ivan Sutherland—who had invented Sketchpad, the predecessor of the first graphical computer interface, as an MIT student—created a contraption called the Sword of Damocles. The name was fitting: The Sword of Damocles was so large it had to be suspended from the ceiling. Nonetheless, it was the first "head-mounted display"; users who had its twin screens attached to their head could look around the room and see a virtual 3-D cube hovering in midair. (Because you could also see your real-world surroundings, this was more like AR than VR, but it remains the inspiration for both technologies.)
Sutherland and his colleague David Evans eventually joined the private sector, adapting their work to flight simulator products. The Air Force and NASA were both actively researching head-mounted displays as well, leading to massive helmets that could envelop pilots and astronauts in the illusion of a 360-degree space. Inside the helmets, pilots could see a digital simulation of the world outside their plane, with their instruments superimposed in 3-D over the display; when they moved their heads the display would shift, reflecting whatever part of the world they were "looking" at.
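The head-tracking loop those helmets pioneered boils down to a single operation repeated every frame: Read the head's orientation, rotate the camera's viewing direction to match, and redraw the scene. A toy sketch of the yaw-only case (the coordinate conventions here are illustrative, not from any particular system):

```python
import math

def view_direction(yaw_radians):
    """Rotate the forward-facing unit vector (0, 0, -1) about the
    vertical axis by the head's yaw angle. A renderer would then draw
    whatever part of the scene lies along the returned direction."""
    return (-math.sin(yaw_radians), 0.0, -math.cos(yaw_radians))

# Head level and facing forward: still looking straight down -z.
assert view_direction(0.0)[2] == -1.0

# Head turned 90 degrees: now looking along -x (the viewer's left,
# in a right-handed coordinate system).
assert abs(view_direction(math.pi / 2)[0] + 1.0) < 1e-12
```

Real systems track all three rotation axes (and, later, position too), but the principle is the same: The display doesn't move the world, it moves the window onto it.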
None of this technology had a true name, though—at least not until the 1980s, when a twenty-something college dropout named Jaron Lanier dubbed it "virtual reality." (The phrase was first used by French playwright Antonin Artaud in a 1933 essay.) The company Lanier had co-founded, VPL Research, created the first official products that could deliver VR: the EyePhone (yup), the DataGlove, and the DataSuit. They delivered a compelling, if graphically primitive, experience, but they were slow, uncomfortable, and—at more than $350,000 for a full setup for two people, including the computer to run it all—prohibitively expensive.
Yet, led by VPL’s promise, and fueled by sci-fi writers, VR captured the popular imagination in the first half of the 1990s. If you didn't read Neal Stephenson's 1992 novel Snow Crash, you may have seen the movie The Lawnmower Man that same year—a divine piece of schlock that featured VPL's gear (and was so far removed from the Stephen King short story it purported to adapt that King sued to have his name removed from the poster). And it wasn't just colonizing genre movies or speculative fiction: VR figured prominently in syndicated live-action kiddie fare like VR Troopers, and even popped up in episodes of Murder, She Wrote and Mad About You.
In the real world, virtual reality was promised to gamers everywhere. In arcades and malls, Virtuality pods let people play short VR games (remember Dactyl Nightmare?); in living rooms, Nintendo called its 3-D videogame system "Virtual Boy," conveniently ignoring the fact that the headsets delivered headaches rather than actual VR. (The Virtual Boy was discontinued six months after release.) VR proved unable to deliver on its promise, and its cultural presence eventually dried up. Research continued in academia and private-sector labs, but VR simply ceased to exist as a viable consumer technology.
Then the smartphone came along.
Phones featured compact high-resolution displays; they contained tiny gyroscopes and accelerometers; they boasted mobile processors that could handle 3-D graphics. And all of a sudden, the hardware limitations that had stood in the way of VR weren't a problem anymore.
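Those tiny gyroscopes are what make phone-powered head tracking possible: Sample the angular rate every frame and accumulate it into an orientation. A deliberately simplified sketch (the function name and update rate are illustrative):

```python
def integrate_yaw(yaw_deg, gyro_rate_dps, dt):
    """Dead-reckon head orientation by integrating one gyroscope
    reading (in degrees per second) over a single frame. Real headsets
    fuse this with accelerometer data, because pure integration
    accumulates drift over time."""
    return (yaw_deg + gyro_rate_dps * dt) % 360.0

# Turn your head at a steady 90 deg/s for one second at 90 frames/s:
yaw = 0.0
for _ in range(90):
    yaw = integrate_yaw(yaw, gyro_rate_dps=90.0, dt=1.0 / 90.0)
assert abs(yaw - 90.0) < 1e-6  # ends up facing 90 degrees to the side
```

The catch, and the reason sensor fusion matters, is that every tiny measurement error in that loop compounds; left uncorrected, the virtual world slowly rotates out from under you.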
In 2012, id Software cofounder and virtual-reality aficionado John Carmack came to the E3 videogame trade show with a special surprise: He had borrowed a prototype of a headset created by a 19-year-old VR enthusiast named Palmer Luckey and hacked it to run a VR version of the game Doom 3. Its face was covered with duct tape, and a strap ripped from a pair of Oakley ski goggles was all that held it to your head, but it worked. When people put on the headset, they found themselves surrounded by the 3-D graphics they'd normally see on a TV or monitor. They weren't just playing Doom—they were inside it.
Things happened fast after that. Luckey's company, Oculus, raised more than $2 million on Kickstarter to produce the headset, which he called the Oculus Rift. In 2014, Facebook purchased Oculus for nearly $3 billion. ("Oculus has the chance to create the most social platform ever, and change the way we work, play and communicate," Mark Zuckerberg said at the time.) In 2016, the first wave of dedicated consumer VR headsets was released, though all three were effectively peripherals rather than full systems: The Oculus Rift and the HTC Vive each connected to high-powered PCs, and the PlayStation VR system ran off a PlayStation 4 game console. And in 2018, the first "stand-alone" headsets hit the market. They don't connect to a computer or depend on your smartphone to supply the display and processing; they're self-contained, all-in-one devices that make VR truly easy to use for the first time ever.
What all this is for is a question that doesn't have a single answer. The easiest but least satisfying response is that it's for everything. Beyond games and other interactive entertainment, VR shows promising applications for pain relief and PTSD, for education and design, for both telecommuting and office work. Thanks to "embodied presence"—you occupy an avatar in virtual space—social VR is not just more immersive than any digitally mediated communication we've ever experienced, but more affecting as well. The experiences we have virtually, from our reactions to our surroundings to the quality of our interactions, are stored and retrieved in our brains like any other experiential memory.
Yet, for all the billions of dollars poured into the field, nothing has yet emerged as the iPhone of VR: the product that combines compelling technology with an intuitive, desirable form. And while augmented and mixed reality are still a few years behind VR, it stands to reason that these related technologies won't remain distinct for long, instead merging into a single device that can deliver immersive, shut-out-the-world VR experiences—and then become transparent to let you interact with the world again. That device may end up coming from Apple; the Cupertino company is reportedly at work on a headset that could launch as early as 2020. Meanwhile, Magic Leap, the incredibly well-funded and even more incredibly secretive Florida-based company, has recently emerged from years of guarded development to launch the first developer-only version of its own AR headset; the company has said its device will be able to deliver traditional VR as well as hologram-driven mixed reality.
But even with that sort of device, we're at the beginning of a long, uncertain road—not because of what the technology can do, but because of how people could misuse it. The internet is great; how people treat each other on the internet, not so much. Apply that logic to VR, where being embodied as an avatar means you have personal boundaries that can be violated, and where spatialized audio and haptic feedback lets you hear and feel what other people are saying and doing to you, and you're looking at a potential for harassment and toxic behavior that's exponentially more visceral and traumatizing than anything on conventional social media.
And then there's the question of authentication. The internet has given us phishing and catfishing, deep fakes and fake news. Transpose any one of those into an all-encompassing experiential medium, and it's not hard to imagine what a bad actor (or geopolitical entity) could accomplish.
Those are the darkest timelines, for sure—and despite what the creators of Black Mirror seem to think, there's no guarantee things will swing that way. But if we've learned anything from how our lawmakers think about technology, it's that they don't think about it hard enough, and they don't think about it soon enough. So it's better to have these conversations now, before we find ourselves trying to answer questions no one saw coming.
Besides, the way things are going, there's going to be a lot of good coming at us in the next few years. Let's try to keep it that way.