All seems to indicate that new, fully immersive next-generation virtual worlds and user interfaces may be around the corner. Let’s go back to the future in the Metaverse!
The first press reports after the announcement of the acquisition on March 25 mentioned the anger of many Oculus Rift enthusiasts and backers of the wildly successful Kickstarter project that launched the Oculus Rift in September 2012, many of whom don’t seem to have much love for Facebook.
I am no unconditional fan of Facebook – I just use it like everyone else – but only big money can produce VR gear like the Oculus Rift with good industrial quality at a price point easily affordable to the average consumer. Like it or not, Facebook has made a major step towards consumer VR possible. Acceptance of VR requires suspension of disbelief, and when you can step into a 360-degree VR world with fully immersive, intuitive, well-designed interfaces, suspending disbelief is much easier. It is no surprise that other companies – Google, Valve, Sony – are also developing their own VR hardware and software solutions.
I fell totally in love with Second Life one minute after joining in 2005. A few weeks later I left a very boring but very well paid senior management post in the public sector to become a technology entrepreneur.
As we all know, Snow Crash crashed in only a few years. In August 2007, as CEO of a Second Life development company in Spain with some high-profile clients, I was interviewed on national TV to comment on why companies were wasting millions on deserted corporate VR spaces without visitors. This meme, started by an article in Wired, propagated through the blogosphere in a clamor of mockery, and Second Life started to fade out.
Looking back, I see several reasons for the fall of SL. The main reasons are [adapted from “Snow Crash(ed) in Second Life (end 2012)“]:
1. The interface is far, far too difficult for today’s casual users, who think that the Internet is that little box with lights that flicker when you are on Facebook. Many users, including many who use the Internet daily for work, do not really know how to copy and paste, or the difference between a left and a right click. Add to this the terminal attention deficit that is endemic these days, and you see why the Second Life interface is too difficult for mass adoption. Today, you need to design one-click user interfaces, because two clicks are too many [or, even better, no-click – see below].
2. Related to 1, SL is too heavy for most users’ computers. Those with powerful gaming systems and modern graphics cards never notice it, but Second Life is just not usable on low-performance computers, including the new computers of users who don’t know how to switch off all the resource-hungry antivirus software, firewalls, background software clutter, and assorted useless trash left on today’s computers by the manufacturers.
3. A 3D interface that imitates reality can be a great and intuitive user interface (if you see a door, you go through it; if you see a chair, you sit down; etc.), but 3D on a flat 2D screen is not really immersive 3D, and may make things harder for the user – most people do not see a 2D slide show as a representation of 3D space, and tend to dismiss it as irrelevant and fake – especially if problems 1 and 2 increase the mental distance.
[See “Snow Crash(ed) in Second Life (end 2012)” for more.]
Let’s backtrack to the idea of a “no-click interface” – no mouse, no keyboard, just life-like interaction with a virtual world that surrounds you at 360 degrees. Let’s face it: a keyboard is an awful Victorian atavism of the typewriter. A mouse is better, but still a relatively awkward solution. In the original vision of the novel Snow Crash, Hiro and friends don’t bother with archaic keyboards, picture-frame screens, or clattery mice. Instead:
“[The] beam is made to sweep back and forth across the lenses of Hiro’s goggles, in much the same way as the electron beam in a television paints the inner surface of the eponymous Tube. The resulting image hangs in space in front of Hiro’s view of Reality. By drawing a slightly different image in front of each eye, the image can be made three-dimensional.
By changing the image seventy-two times a second, it can be made to move. By drawing the moving three-dimensional image at a resolution of 2K pixels on a side, it can be as sharp as the eye can perceive, and by pumping stereo digital sound through the little earphones, the moving 3-D pictures can have a perfectly realistic soundtrack. So Hiro’s not actually here at all. He’s in a computer-generated universe that his computer is drawing onto his goggles and pumping into his earphones.”
In Ernest Cline’s Ready Player One (2011), the Metaverse is called the OASIS – a massively multiplayer, high-fidelity virtual world used by most of the world’s population for all sorts of activities, from gaming to business and education. Cline describes the OASIS user interface gear and game engine:
“The keys to the success of the OASIS were the two new pieces of interface hardware that GSS had created, both of which were required to access the simulation: the OASIS visor and haptic gloves. The wireless one-size-fits-all OASIS visor was slightly larger than a pair of sunglasses. It used harmless low-powered lasers to draw the stunningly real environment of the OASIS right onto its wearer’s retinas, completely immersing their entire field of vision in the online world. The visor was light-years ahead of the clunky virtual-reality goggles available prior to that time, and it represented a paradigm shift in virtual-reality technology – as did the lightweight OASIS haptic gloves, which allowed users to directly control the hands of their avatar and to interact with their simulated environment as if they were actually inside it.
When you picked up objects, opened doors, or operated vehicles, the haptic gloves made you feel these nonexistent objects and surfaces as if they were really right there in front of you. The gloves let you, as the television ads put it, ‘reach in and touch the OASIS.’ Working together, the visor and the gloves made entering the OASIS an experience unlike anything else available, and once people got a taste of it, there was no going back.
The software that powered the simulation, Halliday’s new OASIS Reality Engine, also represented a huge technological breakthrough. It managed to overcome limitations that had plagued previous simulated realities. In addition to restricting the overall size of their virtual environments, earlier MMOs had been forced to limit their virtual populations, usually to a few thousand users per server. If too many people were logged in at the same time, the simulation would slow to a crawl and avatars would freeze in midstride as the system struggled to keep up. But the OASIS utilized a new kind of fault-tolerant server array that could draw additional processing power from every computer connected to it. At the time of its initial launch, the OASIS could handle up to five million simultaneous users, with no discernible latency and no chance of a system crash.”
Forbes reports that Ernest Cline is very impressed with the Oculus Rift. “I realized all the things that I had written in Ready Player One were happening a lot faster than I had estimated,” he says. Palmer Luckey, the main creator of the Oculus Rift, recommends that everyone working at Oculus read Ready Player One. “I’d love to see us build something quite similar to what [Ready Player One] describes, but probably with less central agency,” says Second Life creator Philip Rosedale. “A global virtual world seems very likely to be (un)governed in a manner similar to the internet itself.”
Rosedale’s new project, High Fidelity, is “building a new virtual world enabling rich avatar interactions driven by sensor-equipped hardware, simulated and served by devices (phones, tablets and laptops/desktops) contributed by end-users.” Read: a distributed Metaverse powered by end-user devices, with Oculus Rift support out of the box, total immersion, live avatars with body language and non-verbal expression, and real 3D VR user interfaces. High Fidelity is in alpha at the moment, but it has awesome potential. Hiro and Wade, here we come.
According to Rosedale, High Fidelity has a latency of 100 milliseconds, much less than the 500-millisecond latency typical of mobile phones and VoIP services. A latency of 100 milliseconds for voice and body language is a critical threshold for online personal interaction, below which the magic of real, immediate communication happens. In other words, High Fidelity will have less latency than videoconferencing – not surprising, since it doesn’t need to transport real-time video – and will permit suspending disbelief and “being there,” for real. Watch the videos below, with Rosedale’s introduction to High Fidelity at the Silicon Valley Virtual Reality Meetup and the Virtual Reality Los Angeles Meetup:
Epic Games co-founder Tim Sweeney told Develop that virtual reality is approaching its ‘iPhone moment’: “This is a trend that’s going to take over the world slowly,” he says. “In ten years, I think there will be VR hardware being worn by billions of people.” The revolution starts here, says the Develop article. This may be the beginning of a virtual renaissance that will make the Metaverse young and sexy again, with emotionally immersive VR interfaces and new VR world architectures like High Fidelity’s, whose technology may well be imported into future versions of Second Life. A VR tide seems to be rising again that will lift all virtual boats, and consumers may soon start to flock back to the Metaverse.
And so should we. Futurists, transhumanists, techno-progressives, singularity enthusiasts, and techno-spiritual seekers have been early adopters and evangelists of VR. In 2006-2009 we organized many virtual and mixed-reality talks and events in Second Life and other VR worlds. We more or less gave up when Second Life went out of fashion, but our most popular event, the Humanity+ conference “Leadership: MINDS” on the Terasem sim in Second Life, took place in September 2011. “More than 80 transhumanist avatars stormed the virtual world of Second Life for a community event organized by Humanity+ on September 15,” I wrote in the post-conference report. “This has been by far the largest virtual transhumanist event that I have seen, and I believe I have seen them all.”

Natasha Vita-More said: “The Second Life event was a success. Why? Because we were packed with avatars and the presentations were focused and inspiring. Giulio [Prisco]‘s talk revved up the audience by expanding on how we need to return to the enthusiasm of the 1990s and reinstate that visionary, can-do attitude that we once had. Martine [Rothblatt]‘s talk introduced the specifics of organizational management and argued that to succeed we need to promote diversity. Howard [Bloom]‘s talk reminded us that passion is the key not only to getting our own work accomplished, but to invigorating others to work with us. Linda [MacDonald Glenn]‘s talk emphasized the ideas of generosity, respect and appreciation within organizations. Ben [Goertzel]‘s talk drew an analogy between the phenomenal self and social organizations. My own talk asked what we can do to improve what we are already doing and how we can face change head-on and thrive.”
The Oculus Rift, data gloves, hand sensors, and other immersive VR hardware interface devices are temporary fixes, and soon we won’t need them – I am sure that, by the end of the next decade, high-performance neural interfaces will stream VR worlds directly to our brains. We will be able to send our minds to roam the Metaverse at the flicker of a thought, but there is a lot of work to do to prepare the way.
This is a call to arms: let’s go back to the future in the Metaverse. I am organizing a meeting in Second Life to discuss the virtual renaissance and future initiatives, talks, workshops, conferences, transhumanist entertainment, events in virtual and mixed reality, and educational projects. I hope to see you there and, since this is beginning to sound like a manifesto, let me conclude with “We have nothing to lose but our chains to meatspace and mediocrity. We have infinite worlds to win.”
Thanks to Khannea Suntzu for useful ideas and editing help.