A light flickers from two distinct points in time. As a child in the early 1970s, one of my toys was a View-Master, a binoculars-like device for viewing 3D images (called stereograms); each reel was essentially a mini-program excerpted from popular destinations, TV shows, cartoons, events and the like.
The View-Master completely predated the advent of electronic toys (it was light powered and human click driven), but it was dumb simple to operate, and the 3D viewing experience was quirky cool. Plus, the content was customizable (just pop in a different program card) and for its time, it was engaging (sound could play on top of each image, making it even more so).
Flash forward, and it’s 1992. I am reading Mondo 2000, a long-since-deceased magazine that was at the bleeding edge of the technology wave to come. A total reboot in terms of re-thinking and re-imagining the schema of the possible.
In one issue, Jaron Lanier is prophesying his vision of virtual reality (VR). It’s a revelation, but it’s also too early, and so despite Mattel-scale efforts to commercialize “something” that embraces VR concepts, the VR industry is stillborn (except as verbally rendered in Snow Crash and Neuromancer, two novels that remain seminal to this day).
Alas, the MetaVerse would have to wait. In the intervening years, there would be RPGs (Role-Playing Games), MUDs (Multi-User Dungeons) and, of course, Second Life, but virtual reality was just that: more virtual than reality.
Flash forward to the present, and we are suddenly on the cusp of a game-changing event, one that I believe kicks the door open for 3D and VR apps to become mainstream. I am talking about the release of iPhone OS version 3.0.
Huh? Did I lose you? Well, let me take a step back.
Hardware Accessories Innovation: A Phoenix Rises
I have harped repeatedly (HERE and HERE) on the fact that the next version of the iPhone OS (and the underlying SDK) will allow third-party hardware accessory makers to build external accessory offerings that take advantage of the software, service and hardware capabilities of the iPhone and iPod touch platforms.
Specifically, Apple is opening up the 30-pin connector at the base of the iPhone and iPod touch such that hardware accessory makers can create a software layer that is optimized to take advantage of the capabilities of the hardware in a way that integrates with and extends the capabilities of the iPhone Platform. (Side note: Bluetooth-accessible hardware accessories can also take advantage of these new capabilities.)
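To make the idea concrete, here is a minimal sketch of the kind of message framing an accessory maker might layer on top of the raw byte stream between an iPhone app and an external accessory. The command IDs, frame layout and function names are all illustrative assumptions, not any real Apple API:

```python
import struct

# Hypothetical framing for messages exchanged between an iPhone app and an
# external accessory over the dock connector (or Bluetooth). The accessory
# link is treated as a plain byte stream; the frame format below
# (1-byte command, 2-byte big-endian payload length, payload) is an
# invented convention for illustration only.

CMD_STATUS = 0x01    # assumed command: report accessory status
CMD_SET_MODE = 0x02  # assumed command: switch accessory mode

def encode_frame(command: int, payload: bytes) -> bytes:
    """Pack a command byte and a length-prefixed payload into one frame."""
    return struct.pack(">BH", command, len(payload)) + payload

def decode_frame(frame: bytes):
    """Unpack a frame back into (command, payload)."""
    command, length = struct.unpack(">BH", frame[:3])
    return command, frame[3:3 + length]
```

The point is simply that once the connector is programmable, the protocol between app and accessory becomes an ordinary software design problem.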
What this means is a rebirth of hardware-based innovation, a segment that Apple has historically played a leading role in fostering, first with PostScript-based printing, then with AppleTalk networking (which predated widespread Ethernet adoption) and now with Mobile Broadband Computing.
Look at it this way: very soon, hardware accessory makers will be able to leverage the same tools and marketplace functions that have resulted in more than 35K applications being built and more than 1 billion apps downloaded, all the while tapping into a 37 million device global footprint.
What’s that worth? Consider this. The iPod accessory business itself is already a $2B market, and there has really been no such thing as “software value-add” to the hardware accessory itself. With iPhone 3.0, this changes. That’s a big deal.
3D Mobility: The Rise of the Meta-Platform
Connecting the pieces: there is 3D/Virtual Reality, and there is the macro trend of the iPhone Platform becoming programmatically hardware-extensible.
My assertion is this: beginning with the assumption that View-Master 2.0 is an achievable baseline in terms of a “good enough” 3D viewing experience, with fully immersive VR as the design goal, 3D Glasses are a compelling hardware accessory to harness the power, extensibility and mobility functions of the iPhone Platform.
Framing the point a bit further, 3D Glasses are a kind of “meta” platform built to embrace and extend the iPhone Platform via application-specific libraries, open APIs, custom tools and, of course, the glasses themselves, enabling third parties to create their own 3D/VR-powered applications.
The clearest way to think about the 3D Glasses Meta-Platform – let’s call it X-Ray Specs – is as being a composite of a Runtime Layer, a Physical Glasses Layer and a Cloud Service Layer.
The Runtime Layer is an iPhone application that handles media control (search, activate, play/pause/forward/rewind, bookmark) and communication between the 3D Glasses, the iPhone and the Internet, and that executes custom-built X-Ray Specs applications.
The Physical Glasses Layer is the actual hardware glasses that support viewing, playback and output resolution in a “good enough” fashion relative to mass-market cost economics. Augmenting the glasses would be headphone and microphone inputs, with physical controls for simple navigation, mode and settings adjustment.
Finally, the Cloud Service Layer is a hosted service where third-party applications are sourced and hosted, and social communications are sent, tracked, federated, filtered and routed back to the X-Ray Specs Runtime.
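The three layers and their division of labor can be sketched as a toy model. Every class and method name here is invented for illustration; nothing reflects a real API, just the partitioning described above:

```python
from dataclasses import dataclass, field

@dataclass
class GlassesLayer:
    """Physical glasses: viewing, playback, simple physical controls."""
    resolution: tuple = (800, 600)  # assumed "good enough" output resolution

    def render(self, frame: str) -> str:
        # Stand-in for pushing a frame to the glasses' displays.
        return f"render {frame} at {self.resolution[0]}x{self.resolution[1]}"

@dataclass
class CloudLayer:
    """Hosted service: third-party app hosting plus social message routing."""
    apps: dict = field(default_factory=dict)

    def fetch_app(self, name: str):
        # Stand-in for sourcing a third-party X-Ray Specs application.
        return self.apps.get(name)

@dataclass
class RuntimeLayer:
    """iPhone app: media control, communications, app execution."""
    glasses: GlassesLayer
    cloud: CloudLayer

    def run(self, app_name: str) -> str:
        # Fetch an app from the cloud, execute it, render its output.
        app = self.cloud.fetch_app(app_name)
        if app is None:
            return "app not found"
        return self.glasses.render(app())
```

The design choice worth noticing is that the Runtime Layer is the only piece that touches both the hardware and the cloud, which keeps the glasses cheap and the service independently upgradable.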
In other words, 3D Glasses would do more than 3D viewing. They would also support overlay and immersive viewing applications.
By overlay applications, I mean viewing applications that act as filters, enhancing, modifying or shaping whatever you are doing/viewing by adding information or inserting characters into viewed content.
Imagine watching a ballgame and being served up feeds, like sports scores, news items or other animated tickers, or having video highlights pop up in your view when key events occur (home run hit by a watched player, key match-up between players).
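An overlay application of that kind is essentially a filter over an event stream, deciding which events surface as tickers or pop-ups in the wearer's view. A minimal sketch, where the event shapes and the watchlist concept are my own assumptions:

```python
def overlay_feed(events, watched_players):
    """Yield overlay items for a stream of game events.

    Home runs by watched players become video pop-ups; score updates
    become ticker items; everything else is passed over silently.
    """
    for event in events:
        if event["type"] == "home_run" and event["player"] in watched_players:
            yield {"kind": "video_popup", "text": f"HR: {event['player']}"}
        elif event["type"] == "score_update":
            yield {"kind": "ticker", "text": event["score"]}
```

Because the filter only adds to the view rather than replacing it, the same pattern generalizes to news feeds, captions or inserted characters.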
By contrast, immersive applications consume 100% of your viewing perspective, making them ideal for virtual reality types of applications, not to mention shared (social) and passive viewing experiences.
At its most basic, this means View-Master types of applications (i.e., 3D stereogram content), akin to slide shows on steroids, tuned to take advantage of easy access to the local media libraries resident on your iPhone/iPod touch and shared libraries accessible via the Internet (and App Store/iTunes).
But how exactly does one navigate virtual worlds in such an immersive realm?
One, they can use touch, tilt and shake as the primary controller mechanism.
Two, voice-based control can be embedded in the glasses, supporting a simple voice dictionary of audible action words, such as “GET,” “PLAY,” “NEXT,” “PREVIOUS,” “PAUSE,” “SEARCH,” “FORWARD,” “BACKWARD,” “TURN,” “GRAB” and so on. Think: the BASIC programming language for humans.
On top of this, one could easily envision voice-powered shortcuts for accessing and controlling popular apps like YouTube, Hulu, Flickr, Scribd, SlideShare and the like.
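A voice vocabulary like that reduces to a small dispatch table mapping recognized tokens to player actions. The mapping and the `Player` model below are illustrative assumptions; a real system would sit on top of a speech recognizer:

```python
class Player:
    """Toy media player targeted by the voice commands."""
    def __init__(self):
        self.state = "stopped"
        self.position = 0  # index into the current media queue

    def play(self): self.state = "playing"
    def pause(self): self.state = "paused"
    def next(self): self.position += 1
    def previous(self): self.position = max(0, self.position - 1)

# Each spoken token maps to one player action (a subset of the
# dictionary above, for brevity).
VOICE_COMMANDS = {
    "PLAY": Player.play,
    "PAUSE": Player.pause,
    "NEXT": Player.next,
    "PREVIOUS": Player.previous,
}

def dispatch(player, token):
    """Route one recognized voice token to a player action.

    Returns True if the token was handled, False for unknown words,
    which are ignored rather than treated as errors.
    """
    action = VOICE_COMMANDS.get(token.upper())
    if action is None:
        return False
    action(player)
    return True
```

The BASIC analogy holds up: a tiny fixed vocabulary with unambiguous semantics is exactly what makes hands-free control tractable.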
So there you have it. The MetaVerse is virtually within reach (pun intended). No less, there is a proven platform that enables fairly creative application partitioning scenarios (between hardware, iPhone and cloud).
And oh yeah, that same platform plugs into a marketplace that can support subscriptions, chapters or levels, not to mention a digital distribution center and e-wallet that reaches 37M users.
It doesn’t get much cooler than that. What do you think?