You say you want a revolution? It's called post-PC computing

An examination of the post-PC wave and its major players.

“You say you want a revolution,
Well, you know,
We all want to change the world.”
— The Beatles

I loved Google engineer Steve Yegge’s rant about A) Google not grokking how to build and execute platforms; and B) how his former employer, Amazon, does.

First off, it bucks conventional wisdom. How could Google, the high priest of the cloud and the parent of Android, Analytics and AdWords/AdSense, not be a standard-setter for platform creation?

Second, as Amazon’s strategy seems to be to embrace “open” Android and use it to make a platform that’s proprietary to Amazon, that’s a heck of a story to watch unfold in the months ahead. Even more so, knowing that Amazon has serious platform mojo.

But mostly, I loved the piece because it underscores the granular truth about just how hard it is to execute a coherent platform strategy in the real world.

Put another way, Yegge’s rant, and what it suggests about Google’s and Amazon’s platform readiness, provides the best insider’s point of reference for appreciating how Apple has played chess to everyone’s checkers in the post-PC platform wars.

Case in point, what company other than Apple could have executed something even remotely as rich and well-integrated as the simultaneous release of iOS 5, iCloud and iPhone 4S, the latter of which sold four million units in its first weekend of availability?

Let me answer that for you: No one.

Post-PC: Putting humans into the center of the computing equation

Each computing wave dwarfs and disrupts its predecessor

There is a truism that each wave of computing not only disrupts, but dwarfs its predecessor.

The mainframe was dwarfed by the PC, which in turn has been subordinated by the web. But now, a new kind of device is taking over. It’s mobile, lightweight, simple to use and connected, with long battery life: a digital machine for running native apps, browsing the web, playing all kinds of media, gaming, taking photos and communicating.

Given its multiplicity of capabilities, it’s not hard to imagine a future where post-PC devices dot every nook and cranny of the planet (an estimated 10 billion devices by 2020, according to Morgan Stanley).

But, an analysis of evolving computing models suggests a second, less obvious moral of the story. Namely, when you solve the right core problems central to enabling the emergent wave (as opposed to just bolting on more stuff), all sorts of lifecycle advantages come your way.

In the PC era, for example, the core problems were centered on creating homogeneity to get to scale and to give developers a singular platform to program around, something that the Wintel hardware-software duopoly addressed with bull’s-eye accuracy. As a result, Microsoft and Intel captured the lion’s share of the industry’s profits.

By contrast, the wonderful thing about the way that the web emerged is that HTML initially made it so simple to “write once, run anywhere” that any new idea — brilliant or otherwise — could rapidly go from napkin to launch to global presence. The revolution was completely decentralized, and suddenly, web-based applications were absorbing more and more of the PC’s reason for being.

Making all of this new content discoverable via search and monetizable (usually via advertising) thus became the core problem where the lion’s share of profits flowed, and Google became the icon of the web.

The downside of this is that because the premise of the web is about abstracting out hardware and OS specificity, browsers are prone to crashing, slowdowns and sub-optimal performance. Very little about the web screams out “great design” or “magical user experience.”

Enter Apple. It brought back a fundamental appreciation of the goodness of “native” experiences built around deeply integrated hardware, software and service platforms.

Equally important, Apple’s emphasis on outcomes over attributes led it to marry design, technology and liberal arts in ways that brought humans into the center of the computing equation, such that for many, an iPhone, iPod Touch or iPad is the most “personal” computer they have ever owned.

The success of Apple in this regard is best appreciated by how it took a touch-based interfacing model and made it seamless and invisible across different device types and interaction methods. Touch facilitated the emotional bond that users have with their iPhones, iPads and the like. Touch is one of the human senses, after all.

Thus, it’s little surprise that the lion’s share of profits in the post-PC computing space are flowing to the company that is delivering the best, most human-centric user experience: Apple.

Now, Apple is opening a second formal interface into iOS through Siri, a voice-based helper system that is enmeshed in the land of artificial intelligence and automated agents. This was noted by Daring Fireball’s John Gruber in an excellent analysis of the iPhone 4S:

… Siri is indicative of an AI-focused ambition that Apple hasn’t shown since before Steve Jobs returned to the company. Prior to Siri, iOS struck me as being designed to make it easy for us to do things. Siri is designed to do things for us.

Once again, Apple is looking to one of the human senses — this time, sound — to provide a window for users into computing. While many look at Siri as a concept that’s bound to fail, if Apple gets Siri right, it could become even more transformational than touch — particularly as Siri’s dictionary, grammar and contextual understanding grow.

Taken together, a new picture of the evolution of computing starts to emerge. An industry that was once defined by the singular goal of achieving power (the mainframe era), morphed over time into the noble ambition of achieving ubiquity via the “PC on every desktop” era. It then evolved into the ideal of universality, vis-à-vis the universal access model of the web, which in turn was aided by lots of free, ad-supported sites and services. Now, human-centricity is emerging as the raison d’être for computing, and it seems clear that the inmates will never run the asylum again. That may quite possibly be the greatest legacy of Steve Jobs.

Do technology revolutions drive economic revolutions?

Sitting in these difficult economic times, it is perhaps fair to ask if the rise of post-PC computing is destined to be a catalyst for economic revival. After all, we’ve seen the Internet disrupt industry after industry with a brutal efficiency that has arguably wiped out more jobs than it has created.

Before answering that, though, let me note that while the seminal revolutions always appear in retrospect to occur in one magical moment, in truth, they play out as a series of compounding innovations, punctuated by a handful of catalytic, game-changing events.

For example, it may seem that the Industrial Revolution occurred spontaneously, but the truth is that for the revolution to realize its destiny, multiple concurrent innovations had to occur in manufacturing, energy utilization, information exchange and machine tools. And all of this was aided by significant public infrastructure development. It took continuous, measurable improvements in the products, markets, suppliers and sales channels participating in the embryonic wave before things sufficiently coalesced to transform society, launch new industries, create jobs, and rain serious material wealth on the economy.

It’s often a painful, messy process going from infancy to maturation, and it may take still more time for this latest wave to play out in our society. But, I fully believe that we are approaching what VC John Doerr refers to as the “third wave” in technology:

We are at the beginning of a third wave in technology (the prior two were the commercialization of the microprocessor, followed 15 years later by the advent of the web), which is this convergence of mobile and social technologies made possible by the cloud. We will see the creation of multiple multi-billion-dollar businesses, and equally important, tens maybe hundreds of thousands of smaller companies.

For many folks, the revolution can’t come soon enough. But it is coming.

Quantifying the post-PC “standard bearers”

A couple years back, I wrote an article called “Built-to-Thrive — The Standard Bearers,” where I argued that Apple was the gold standard company (i.e., the measuring stick by which all others are judged), Google was the silver and Amazon was the bronze.

The only re-thinking I have with respect to that medal stand is that Amazon and Google have now flipped places.

Most fundamentally, this exemplifies:

  1. How well Apple has succeeded in actually solving the core problems of its constituency base through an integrated, human-centered platform.
  2. How Amazon has gained religion about the importance of platform practice.
  3. How, as Yegge noted, Google doesn’t always “eat its own dog food.”

If you doubt this, check out the adjacent charts, which spotlight the relative stock performance of Apple, Amazon and Google after each company’s strategic foray into post-PC computing: namely, iPod, Kindle and Android, respectively.

This is one of those cases where the numbers may surprise, but they don’t lie.

Amazon, Google, Apple stock charts in the post-PC era


  • Nice presentation of numbers. Now who’s up or down?

  • Dick Applebaum


    Good article… as far as it went.

    But it just kinda ended — it feels as if it is missing a summation or predictions of how human-computer interaction will be different 10 years from now.

  • As Mr. Applebaum mentioned, it feels like you’re missing another paragraph or three. Also, the “post-PC” world does not mean that desktop computers will go away, nor will mainframes. Siri itself is not powered by your tiny iPhone, but by a mainframe similar to IBM’s “Watson.”

    Cloud computing is, in my book, a huge step backwards. When I was in school, computers consisted of a massive mainframe with dozens of dumb terminals attached to it. Cloud computing is exactly the same thing. The terminal is smarter and sexier, but you’re still putting all of your data in a central warehouse, and relying on someone else’s computer to do the bulk of the work.

    Amazon’s new setup with Fire is the same deal. The Kindle Fire’s browser is fast because a mainframe hosted at Amazon is fetching pages, condensing graphics for the Fire’s low resolution, and delivering a customized version to fit your screen. Useful yes, but when the server goes down, you have a paperweight.

    btw, this isn’t even new. I had a web browser on my ancient Microsoft PDA that worked exactly the same way… 15 years ago.

  • This was great Mark!
    I have to admit that I used the post before I knew it was yours, and didn’t refer to you. Your opening just coincided with Tim Cook citing Steve on the Beatles as a business model.
    I just now made a new post citing you properly at

  • @Dick, fair take. I had A LOT more thoughts on the topic, and re-structured the piece a few times so it wouldn’t run 2,500 words. To me, the larger essence was trying to work from the center of how waves of computing evolve, show how Apple is strategically an order of magnitude deeper in its platform approach, provide some perspective on past technical revolutions (beyond computing) in terms of how long they take to bake, and close with a little “aperitif” on the conventional wisdom that Google is winning by virtue of the success of Android, and how that’s NOT reflected in their stock price. I’d argue that they are not winning per se. Rather, they are defending turf they otherwise risk losing.

    @Brian, disruptive waves don’t mean that the predecessor disappears or dies. Just that it loses its relevance and centricity. Microsoft, after all, is a HUGELY profitable legacy business, but few would argue that they are driving the next decade of growth, and their stock price is certainly a reflection of that. As to cloud computing, everything old is new again. 15 years ago people were arguing that the web was just the dumb terminal revisited. We’ve made progress in making the browser experience better, but frankly the biggest disappointment of the web is that the browser is such a kluge.

    @Morten, thanks for giving props and for the post. Glad you enjoyed.

    @Bus, thanks. I am not clear on your question.

  • Jim Noble

    I think that it is interesting to show Apple, Amazon and Google on the same page, with the “similar” charts.

    However, what is NOT shown, is that Amazon also launched many different businesses and services during that same period of time, and placing the words “Since the Kindle” seems to be a little bit deceptive. That stock growth was not directly attributable to the Kindle sales, unless you’re specifically showing Kindle sales… which you are not. The same is true of the Apple and Google charts. You’re implying that the stock share prices of the companies are directly related to the single products in your article. However, the truth is FAR from that. It would be nice to see a clarification, and perhaps real numbers of sales that ARE attributable to the devices mentioned.

  • Jonathan Watson

    Nice chart showing “that each wave of computing not only disrupts, but dwarfs its predecessor”. The numbers shown are 1M, 100M, 1B and 10B. It would be a geometric progression if there were also a datapoint at 10M. That datapoint could represent the minicomputer boom of the 1970s.

  • @Jonathan Watson: it’s good that you noticed the missing data point. The good thing about charts like this one (since the underlying data points are so generalized and used in a metaphorical sense) is that it’s easy to interpolate a new number and just assign it to anything you want.

    For example, what sense does it make to say the web is one billion units? What does that mean? One billion pages? One billion scripts? One billion downloadable elements?

    Not to mention the “apples and orangutans” effect in making the comparison across those four categories.

  • anonymous3

    What the charts don’t tell you is that Apple’s stock achieved that phenomenal rise while undergoing unprecedented PE compression.

    Adjust for that, and the difference in the charts may be even more striking.

  • You will appreciate “Let’s Talk TV: Steve Jobs Was Right & Wrong. How Post-PC Era Enables Mac/PC To Dominate Post-TV Era.” If you are involved or interested in the future of PCs and/or TV, your thoughts are welcome. Thank you.

    Imran Anwar

  • Thanks for this great article. I used your chart to write a follow up about the defining applications of the different computing eras:

    I included a link to your article. Please let me know if you need more attribution.

  • fred

    This is truly a step forward. I think technology has made many steps forward in recent times and will continue to do so. I was expecting another technology boom, which is good. This version is superior to the previous one anyway.

  • Kamlesh Kumar

    Nice one. All the data provided here is very impressive and effective. We all know how effective modern technology is, so we expect more in the future.