3D Glasses: Virtual Reality, Meet the iPhone

A light flickers from two distinct points in time. As a child in the early 1970s, one of my toys was a View-Master, a binoculars-like device for viewing 3D images (called stereograms), each reel essentially a mini-program excerpted from popular destinations, TV shows, cartoons, events and the like.

The View-Master completely predated the advent of electronic toys (it was light powered and human click driven), but it was dumb simple to operate, and the 3D viewing experience was quirky cool. Plus, the content was customizable (just pop in a different program card) and for its time, it was engaging (sound could play on top of each image, making it even more so).

Flash forward, and it’s 1992. I am reading Mondo 2000, a long-since-deceased magazine that was at the bleeding edge of the technology wave that was to come. Total reboot in terms of re-thinking and re-imagining the schema of the possible.

In one issue, Jaron Lanier is prophesying his vision of virtual reality (VR). It’s a revelation, but it’s also too early, and so despite Mattel-scale efforts to commercialize “something” that embraces VR concepts, the VR industry is stillborn (except, as verbally rendered in Snow Crash and Neuromancer, two seminal novels to this day).

Alas, the MetaVerse would have to wait. In the intervening years, there would be RPGs (Role-Playing Games), MUDs (Multi-User Dungeons) and of course, Second Life, but virtual reality was just that: more virtual than reality.

Flash forward to the present, and we are suddenly on the cusp of a game-changing event, one that I believe kicks the door open for 3D and VR apps to become mainstream. I am talking about the release of iPhone OS version 3.0.

Huh? Did I lose you? Well, let me take a step back.

Hardware Accessories Innovation: A Phoenix Rises

I have harped repeatedly (HERE and HERE) about the fact that the next version of the iPhone OS (and the underlying SDK) will allow third-party hardware accessory makers to build external hardware accessory offerings that take advantage of the software, service and hardware capabilities of the iPhone and iPod touch platforms.

Specifically, Apple is opening up the 30-pin connector at the base of the iPhone and iPod touch such that hardware accessory makers can create a software layer that is optimized to take advantage of the capabilities of the hardware in a way that integrates with and extends the capabilities of the iPhone Platform. (Side note: Bluetooth-accessible hardware accessories can also take advantage of these new capabilities.)

What this means is a rebirth of hardware-based innovation, a segment that Apple has historically played a leading role in fostering, first with PostScript-based printing, then with AppleTalk networking (pre-dates Ethernet) and now with Mobile Broadband Computing.

Look at it this way: very soon, hardware accessory makers will be able to leverage the same tools and marketplace functions that have resulted in more than 35K applications being built and more than 1 billion apps downloaded, all the while tapping into a 37 million device global footprint.

What’s that worth? Consider this. The iPod accessory business itself is already a $2B market, and there has really been no such thing as “software value-add” to the hardware accessory itself. With iPhone 3.0, this changes. That’s a big deal.

3D Mobility: The Rise of the Meta-Platform

Connecting the pieces, there is 3D/Virtual Reality and there is this macro trend of the iPhone Platform becoming more hardware extensible in a programmatic fashion.

My assertion is this: beginning with the assumption that View-Master 2.0 is an achievable baseline in terms of a “good enough” 3D viewing experience, and full immersive VR the design goal, 3D Glasses are a compelling hardware accessory to harness the power, extensibility and mobility functions of the iPhone Platform.

Framing the point a bit further, 3D Glasses are a kind of “meta” platform built to embrace and extend the iPhone Platform vis-à-vis application-specific libraries, open APIs, custom tools and, of course, the glasses themselves, to enable third parties to create their own 3D/VR-powered applications.

The clearest way to think about the 3D Glasses Meta-Platform – let’s call it X-Ray Specs – is as being a composite of a Runtime Layer, a Physical Glasses Layer and a Cloud Service Layer.

The Runtime Layer is an iPhone application that handles media control (search, activate, play/pause/forward/rewind, bookmark); manages communication among the 3D Glasses, the iPhone and the Internet; and executes custom-built X-Ray Specs applications.
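To make the Runtime Layer's media control concrete, here is a minimal sketch of what such a controller might look like. All class and method names are hypothetical, chosen only to mirror the commands listed above (search, activate, play/pause, bookmark).

```python
# Hypothetical sketch of the X-Ray Specs Runtime Layer's media control,
# modeled as a small state machine. Names are illustrative, not an API.

class MediaRuntime:
    def __init__(self, library):
        self.library = library    # list of media item names on the device
        self.current = None       # index of the active item, if any
        self.playing = False
        self.position = 0.0       # playback position in seconds
        self.bookmarks = {}       # item name -> saved position

    def search(self, term):
        """Return library items whose names contain the search term."""
        return [item for item in self.library if term.lower() in item.lower()]

    def activate(self, item):
        """Load an item from the library and reset playback state."""
        self.current = self.library.index(item)
        self.position = 0.0
        self.playing = False

    def play(self):
        self.playing = True

    def pause(self):
        self.playing = False

    def bookmark(self):
        """Save the current position so a viewer can resume later."""
        if self.current is not None:
            self.bookmarks[self.library[self.current]] = self.position
```

In practice the runtime would sit between the glasses hardware and the network, but even this toy version captures the command surface a voice or touch controller would drive.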

The Physical Glasses Layer is the actual hardware glasses that support viewing, playback and output resolution in a “good enough” fashion relative to mass-market cost economics. Augmenting the glasses would be headphones and microphone inputs, with physical controls for simple navigation, mode switching and settings adjustment.

Finally, the Cloud Service Layer is a hosted service where third-party applications are sourced and hosted, and social communications are sent, tracked, federated, filtered and routed back to the X-Ray Specs Runtime.
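The “filtered and routed” part of the Cloud Service Layer can be sketched as a simple topic-based router: each runtime subscribes to the feeds it cares about, and the service delivers matching social messages back to it. The function and parameter names below are assumptions for illustration.

```python
# Illustrative sketch of the Cloud Service Layer's social routing:
# incoming messages are delivered to every runtime subscribed to their topic.

def route_messages(messages, subscriptions):
    """Route messages to subscribed runtimes.

    messages:      list of (topic, payload) tuples
    subscriptions: dict mapping runtime_id -> set of subscribed topics
    Returns a dict mapping runtime_id -> list of payloads to deliver.
    """
    outbox = {rid: [] for rid in subscriptions}
    for topic, payload in messages:
        for rid, topics in subscriptions.items():
            if topic in topics:
                outbox[rid].append(payload)
    return outbox
```

A real service would add federation, persistence and spam filtering, but the core contract is the same: the cloud fans messages out, and each X-Ray Specs Runtime receives only what its user asked for.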

In other words, 3D Glasses would do more than 3D viewing. They would also support overlay and immersive viewing applications.


By overlay applications, I mean viewing applications that act as filters, enhancing, modifying or shaping whatever you are doing/viewing by adding information or inserting characters into viewed content.

Imagine watching a ballgame and being served up feeds, like sports scores, news items or other animated tickers, or having video highlights pop up in your view when key events occur (home run hit by a watched player, key match-up between players).

By contrast, immersive applications consume 100% of your viewing perspective, making them ideal for virtual reality types of applications, not to mention shared (social) and passive viewing experiences.

At its most basic, this means View-Master types of applications (i.e., 3D stereogram content), akin to slide shows on steroids, tuned to take advantage of easy access to the local media libraries resident on your iPhone/iPod touch and shared libraries accessible via the Internet (and App Store/iTunes).

But how exactly does one navigate virtual worlds in such an immersive realm?

One option is to use touch, tilt and shake as the primary controller mechanism.

A second is voice-based control embedded in the glasses, supporting a simple voice dictionary of spoken actions, such as “GET,” “PLAY,” “NEXT,” “PREVIOUS,” “PAUSE,” “SEARCH,” “FORWARD,” “BACKWARD,” “TURN,” “GRAB” and so on. Think: a BASIC programming language for humans.
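The BASIC-for-humans idea boils down to a fixed verb dictionary plus a dispatcher. A minimal sketch, assuming a speech recognizer has already produced a word, might look like this (the verb set comes from the list above; the handler interface is a placeholder):

```python
# A minimal sketch of the proposed voice dictionary: each recognized word
# maps to a runtime action. The handler wiring is an assumption.

VOICE_VERBS = {"GET", "PLAY", "NEXT", "PREVIOUS", "PAUSE",
               "SEARCH", "FORWARD", "BACKWARD", "TURN", "GRAB"}

def dispatch(utterance, handlers):
    """Match a spoken word against the dictionary and invoke its handler.

    handlers: dict mapping a verb to a zero-argument callable.
    Returns the handler's result, or None for unrecognized speech.
    """
    verb = utterance.strip().upper()
    if verb in VOICE_VERBS and verb in handlers:
        return handlers[verb]()
    return None
```

For example, `dispatch("play", {"PLAY": start_playback})` would trigger playback, while anything outside the dictionary is ignored, which is exactly the constrained-vocabulary trade-off that makes on-device recognition tractable.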

On top of this, one could easily envision voice-powered shortcuts for accessing and controlling popular apps like YouTube, Hulu, Flickr, Scribd, SlideShare and the like.

So there you have it. The MetaVerse is virtually within reach (pun intended). No less, there is a proven platform that enables fairly creative application-partitioning scenarios (between hardware, iPhone and cloud).

And oh yeah, that same platform plugs into a marketplace that can support subscriptions, chapters or levels, not to mention a digital distribution center and e-wallet that reaches 37M users.

It doesn’t get much cooler than that. What do you think?

  • Great post Mark. In fact one of the best I have seen in a while! We’re in violent agreement with you on this one and are eager to see more platform development hit the mainstream.

    This is the most interesting area in technology right now and for the next several years in our view.

  • Clint Kennedy

    Great article... have you read “Daemon”? http://www.amazon.com/Daemon-Daniel-Suarez/dp/0525951113/ref=sr_1_1?ie=UTF8&s=books&qid=1244217756&sr=8-1

    I am much more interested in virtual world “layers” over the real world thru a heads up display built into glasses than I am a completely different virtual world. Can’t wait to see what is coming!

    Please keep writing on this topic.

  • Jerome McDonough

    I think you’re definitely right, and if add-on hardware included some dedicated graphics processing, you could get some pretty sophisticated immersive media displays.

    The gaming app people could have a field day with this. I do worry about standards though. The 3D world has always had a bit of a problem developing standards that actually get traction, and developers who might develop 3D systems don’t want to have to develop to 100 different hardware interfaces.

  • AppleTalk doesn’t predate Ethernet. Ethernet was developed in the early ’70s, AppleTalk came out with the first Mac in 1984. AppleTalk does predate 10BASE-T Ethernet which was standardized in 1990.

  • Mike

    Your article sounded vaguely familiar, and that’s when I remembered that Apple has very likely been working on this themselves, given these patents:


  • @Kris, thanks for the note. It sure feels like a segment that could have some killer apps behind it (not to mention being ready for prime time).

    @Clint, thanks for the recommendation. I have added it to my Amazon Wish List. Wrt the overlay v. immersive fork, it’s interesting that the people I talk to are pretty split down the middle in terms of which approach is most compelling to them, and have pretty well-formed perspectives as to why they prefer a given path.

    @Jerome, I think that the standards factor is actually one that Apple could drive/push if they see that as a segment of interest; namely, let a community of developers form around the segment, pick an approach that seems to have the healthiest balance of solving the right problem and user adoption, and then formally support it as a set of libraries within the SDK.

    @Kent, I appreciate the distinction, and am familiar with the history, but for practical purposes AppleTalk brought networking into the mainstream well in advance of Ethernet; mainly because it was cheap/free (built into Macs, Apple II). My first tech gig was selling networking hardware, and at that point it was hardly clear whether Token Ring, 10B-T Ethernet or ARCNet was going to prevail.

  • personne

    I think it’s quite sad that all these facilities are being built on a proprietary bus and platform. It could all be done with USB host and Bluetooth. We’re already seeing these kinds of reality overlays in Android applications (street view and two travel applications), since it has a built-in compass and the camera. For example, http://crave.cnet.co.uk/software/0,39029471,49302451-3,00.htm But the media will happily play along with the impression that only Apple can do this, and we’re going to see an influx of proprietary (and probably patented) add-ons. Gee whiz.

  • @Mike, an important distinction should be made between filing patents and “working on.”

    I have been pretty vocal in raising a red flag that Apple is amassing a big-time patent portfolio around every app under the sun wrt iPhone.

    The concern is that it is unclear whether Apple’s long-term goal is protecting themselves from getting sued by competitors (with developers as the deep pockets); protecting iPhone developers from getting sued by that same group (since they can point to prior art and offer cheap/free licenses to iPhone developers); or implementing a toll road where they can squeeze developers as the platform continues to proliferate in other segments.

    In fact, here is a post that I wrote on the topic some time back:

    Upward Mobility, Land Grabs and the iPhone Universe

    Check it out if interested.

    I like the implications of the overlay of information. IIRC Boeing did work with this for the construction of airplane fuselages. Everyone had to wear the glasses and headpieces that would map information onto what they were riveting and drilling on the production line. My reservations are regarding the processing required to mathematically match up what you see in real life versus the data overlay from the glasses.

    Is there enough CPU in any mobile processor to accomplish this?

    Will the iPhone have the battery power to accomplish this?

  • Bleyddyn

    On the input side, I would love to see a combination Wii remote and chording keyboard, with at most ten buttons (two for each finger and thumb) plus the orientation and movement sensors of the Wii remote. Maybe two versions, with the other one having an X/Y stick instead of buttons for the thumb, like the ‘nunchuk’ Wii controller.

    Lightweight, wireless, hopefully not too expensive.

  • Roger Weeks

    Just a quibble: Neuromancer was published in 1984, long before Mondo 2000 or Jaron Lanier had started talking about VR.

    We might as well mention William Gibson’s “Virtual Light” which took the concept a different direction in the 90s, around the same time as “Snow Crash”.

  • @Roger. Thanks for the correction. It’s ironic how we remember these things with the passage of time. When I read Neuromancer, I had this one dimensional, somewhat abstract sense of the grid, but when I read Snow Crash, it was vivid, three dimensional, perhaps aided by the advances in technology that grounded and expanded the vocabulary and imagery of what was possible. Beyond wacky to think of Neuromancer as a 25 YEAR OLD novel. Cheers, Mark



  • It looks like the default platform for the 3D world you’re talking about is OpenSim — it uses the same interface (though it’s different on the inside) as Second Life, is open source, and has companies like IBM, Microsoft and Intel lining up behind it — in addition to colleges and universities. Unlike previous attempts at creating a virtual reality platform, OpenSim worlds can be hyperlinked, and avatars can already travel between worlds, bringing their appearance and belongings with them.

  • haig

    Ya, this post is not really that original, but it’s a good one anyway. For some really far-out visions of a fully ubiquitous, augmented/immersive world, go read the book Rainbows End by Vinge. For a more technical and pragmatic look at the technologies needed, check out the book The End of Hardware by Hainich. And definitely you have to go check out the father of wearable computing who has been doing this sort of thing since the 80s, Steve Mann.

    Now, for my opinion, I think the ‘3d glasses’ like LCD HMDs et al are not there yet. There are many problems with human visual biomechanics that need to be considered for any type of 3d display to be worn for a significant period of time. The best prospect seems to be VRD (virtual retinal display) type displays invented at U of Washington’s HIT Lab and commercialized (monopolized, really) by the company Microvision (makers of those nifty pico projectors for the iPhone as well).

    For augmented reality to become useful outside niche industrial/military applications, we need better machine vision software, which is progressing. Markers are fine, but will not scale or provide the right experience.

    As for the cloud computing infrastructure, I’m going to disagree with the previous poster regarding OpenSim and have to put my money on Croquet (opencroquet.org) and the newer technologies coming out of VPRI (vpri.org).

    All these plus a true social graph layer and you get some very funky future scenarios.

  • Fully agree that augmented reality can provide value and entertainment. My concern would be the rendering limitations, being based on iPhone platform. Might need some external chips from NVIDIA, IBM cell, etc.

  • @Jeremy and @haig, I am not a hardware engineer who can speak to the limits of the technology, although I am assuming some level of processing/playback intelligence to work in tandem with the capabilities of iPhone and the ability to shuttle information between cloud and iPhone/Glasses.

    The core thesis is not that this solution is perfect but that it’s good enough for a subset of application scenarios.

    My assumption is that we will see lots of hardware accessory extensions to iPhone/iPod touch so long as Apple takes a comparable approach to accessory makers (in terms of software support within SDK) that they took with app developers, so this is a logical bucket (I think) for innovation, inasmuch as the tools are robust, the iPhone has proven its mettle in terms of robust graphics for gaming apps (and will get better) and there is an installed base of 37M devices ready for the targeting.

  • Russell de Silva

    I agree that VR and mobile devices are a compelling combination.
    With the VR glasses, one of the limitations of current mobile technology (display definition) is effectively removed.
    It will be interesting to see what other technologies are developed to overcome the other great drawback of current mobile devices, i.e. the appalling interface for user input.
    I imagine voice control has a part to play, but hand control of some description is necessary to get a truly easy-to-use virtual desktop experience.

    Great days ahead.

  • I agree with Russell. Project Natal-style input would be the best. The glasses would need to have cameras on them anyway to overlay virtual things on your real-world environment. Why not also use those cameras to see what your hands are doing? Hand gestures together with voice could make a really immersive experience.

  • Great article! I read about this on Apple’s website a few weeks ago. But, as someone said before, will the iPhone have the necessary battery power?

  • @iPhoneLand, thanks for the comments. Battery intensity is a legitimate issue. One wonders how much more so than a graphically intensive RPG, but logic suggests that you would probably plug in the device with a longish cord if planning an extended session.

  • Mark

    I like the passion and imagination in this post, and really what’s ‘new’ here is the ability to take a decently powerful, location-aware, internet-connected processor with you vs. at your desk.

    My concern with the unfolding of this as a real phenomenon would be simple, though: given that we rarely interact with our powerful home PCs or even laptops in this fashion today, do you really think the added dimension of mobility will radically change 3d immersion technologies?

    I’d look at the next best alternative (substitute) of simply checking one’s iPhone for a batter’s stats. Compared to the social stigma of wearing headgear in public, I think it might be the ‘dominant design’ of the UI now and for a while.

    Just a thought about change levers or lack thereof. Love the imagination here.


  • I agree with Coleman. While Microsoft is hard-pressed to come up with anything truly “original”, they certainly know how to rework a concept. Project Natal is a perfect support platform for any type of main AR/VR platform, at least in its current incarnation. Eventually, all of the cameras that are mounted at city intersections monitoring traffic will come into play as AR support. It’s a question of licensing and leasing from the municipalities. But that’s a headache for another day. When a true set of vision displays are made available for the iPhone/Touch, there’s going to be another paradigm shift.

  • @James, fair knee jerk, but that’s like saying that iPod wouldn’t be successful because PC based media players never took off in a big way. Sometimes, the increment of convenience (portability/pocket sized) is what’s needed to make a new innovation successful. Plus, a key piece is the platform/SDK/reach of iPhone, which already has a strong tilt towards gaming, immersive graphics around tilt and touch. Also, to be clear, my sense is that people will use such devices in living rooms and fixed locations not wandering the streets.

    @Joey1058, I am definitely warming up to the Project Natal-style approach, specifically because it doesn’t require the user to wear something immersive, and because a friend who is a popular iPhone app developer was all hot and bothered about the concept.

  • WesH

    I have been thinking along these lines since the days of the Palm Pilot and the Sony Clie’.

    An easier, very useful form of overlay would be one that overlays the iPhone held in the hand with almost anything – for instance, a 3D reconstruction of a human heart CT study. Accelerometer signals would allow examination from all angles. And with a ‘virtual scalpel’ …

    I had earlier thought that by now we would have PDA and Smartphone CPUs several orders of magnitude more powerful than we do, but I now think cloud computing will be needed for a very long time.

  • Excellent observation – I think you’re absolutely correct. Similarly, if add-on hardware included some dedicated graphics processing, we could see a revolution in the sophistication and depth of media displays. Obviously the issue of standardisation raises its ugly head again here in that there will need to be some sort of co-ordination across applications to ensure that there is a basic standard set. This will avoid a repeat of the DVD/Blu-ray type standards battle. All in all though a very exciting concept!

  • @Tanzanite, thanks for the note. Definitely exciting potential in this segment.

  • Wow
    I have been searching the web for a few years now hoping to find a decent pair of virtual specs that I can use to enhance my productivity on my laptop. There is just too much going on around me most of the time, and I would love to be able to just immerse myself in my programming. One of the things I have been able to do is use my PDA for creating web pages and py scripts, but it’s not that good when you can’t run them. Being able to sit on the train with my laptop in my bag and my Bluetooth keyboard on my lap and see only an 80″ HD linux desktop in front of me would be a dream come true!

  • @Tristan, for you and me both. Let me know if/when you find some workable specs. Mark

  • Hi Mark,
    Great lateral thinking.

    Anything new on whether the 30-pin connector on the iPhone/iPod touch is being used to control some external device like a display?

    It would seem very simple to have the screen shown on an external device. Is something like this available?


  • Wayne

    Vuzix Announces World’s First Fashionable Sunglass-Style Video Eyewear with Revolutionary “See-Thru” Quantum Optics or Wrap 920 find here:

  • Mark Sigal

    @Dom, I was just about to say that I have heard very little on that front, and in fact, feel like Apple has not marketed the hardware accessory side of their story very aggressively, when I saw @Wayne’s comment, which I will check out. Thanks Wayne.

