Google, WalMart, and The Power of the Real Time Enterprise

What do Google, WalMart, and MyBarackObama.com have in common, besides their extraordinary success? They are organizations so infused with IT that it produces a qualitative change in their entire business.

I get frustrated when I see people highlighting use of social media–blogging, wikis, twitter, customer feedback systems like Dell IdeaStorm or MyStarbucksIdea–as if they were exemplars of what has come to be called “Enterprise 2.0.”

As I said in my keynote at the Web 2.0 Expo NY (and in a followup radar post), WalMart is a better example of Enterprise 2.0 than any of these more trendy examples of user contribution systems. If Google’s key innovation with PageRank was to recognize that a link was a vote, which could be counted and measured to get better search results, so too, WalMart recognized early on that a purchase was a vote. Each company built real-time information systems to capture and respond to that vote. WalMart built a supply chain in which goods are automatically re-ordered as they go out the door, with algorithms based on rate of sale controlling the reorders. Google built a better search engine, in which pages that were “better linked” were given priority over the ones produced by pure keyword matches. They went on to build real-time systems to measure what John Battelle called the database of intentions, as expressed by people’s queries and subsequent clickstream data, as well as an ad auction system that prices ads in real-time based on the predicted likelihood of the ad being clicked on.
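WalMart's actual replenishment algorithms are proprietary, but the purchase-as-vote idea can be sketched as a toy rate-of-sale reorder rule. Everything here (function names, the safety factor, the thresholds) is a hypothetical illustration, not WalMart's real system:

```python
def reorder_quantity(sale_days, on_hand, lead_time_days, safety_factor=1.5):
    """Toy rate-of-sale reorder rule: each purchase is a 'vote' to restock.

    sale_days: timestamps (in days) of recent sales of one item, oldest first.
    Returns how many units to reorder so stock covers expected demand over
    the supplier lead time, or 0 if current stock already suffices.
    """
    if len(sale_days) < 2:
        return 0  # not enough votes to estimate a rate of sale
    window = sale_days[-1] - sale_days[0]
    rate = (len(sale_days) - 1) / window           # units sold per day
    expected_demand = rate * lead_time_days * safety_factor
    return max(0, round(expected_demand - on_hand))

# One sale per day for six days, 2 units on the shelf, 4-day lead time:
# expected demand = 1.0 * 4 * 1.5 = 6 units, so reorder 4.
print(reorder_quantity([0, 1, 2, 3, 4, 5], on_hand=2, lead_time_days=4))
```

The point of the sketch is only that the reorder decision is driven entirely by the stream of purchase "votes," with no human in the ordering loop.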

I came to see just how closely MyBarackObama.com emulated these ideas of the real-time enterprise in accounts of the Houdini project, a bold program in which poll watchers eliminated the names of voters who had actually made it to the polling station from the “get out the vote” call lists:

While the hot line was too overwhelmed to be of much use, the source said the program itself still proved a smashing success….the campaign was able to clean 1.6 million voters from the call lists they distributed to canvassers that afternoon, making those lists 25 percent shorter on average.

While the infrastructure for data reporting broke down under the pressure of the election, the general trend is clear here: competitive advantage comes from capturing data more quickly, and building systems to respond automatically to that data.

Consider MyBarackObama.com as a kind of vast machine, with humans as extensions of the programmatic brain: volunteers log in to get their get-out-the-vote call lists. They place their calls, then use the web to report back their results. Those results modify the call lists for the next volunteer. At the other end, the Houdini volunteers are taking note of who is actually coming out to vote, allowing the system to dispatch additional attention to hot spots, for example, where there is an undervote compared to the campaign’s projections. Meanwhile, the pruned call lists make the volunteers more effective. Inside the machine, programmers are tuning the algorithms, while top campaign staffers are making key decisions to adjust the resource mix.
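The Houdini feedback step is easy to sketch. Here is a minimal, hypothetical version in Python; the real system's data model is of course unknown to me:

```python
def prune_call_list(call_list, already_voted):
    """Drop anyone a poll watcher has reported as having voted, so the
    next volunteer's list contains only calls that are still needed."""
    already_voted = set(already_voted)
    return [voter for voter in call_list if voter not in already_voted]

call_list = ["alice", "bob", "carol", "dave"]
# Poll watchers report that bob and dave have been to the polls:
call_list = prune_call_list(call_list, {"bob", "dave"})
print(call_list)  # the next volunteer dials only alice and carol
```

Each report shrinks the list handed to the next volunteer, which is exactly the "25 percent shorter" effect the quoted account describes.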

Now put these three examples, Google, WalMart, and MyBarackObama.com, together, and ask yourself what they tell you about the future of business, military operations, or any large organization.

Sensing, processing, and responding (based on pre-built models of what matters, “the database of expectations,” so to speak) is arguably the hallmark of living things. We’re now starting to build computers that work the same way. And we’re building enterprises around this new kind of sense-and-respond computing infrastructure. In this sense, you can argue that Microsoft’s term “Live Software” is the best name yet for the kind of software-infused enterprise we’re building.
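As a schematic of such a sense-and-respond loop, consider flagging precincts whose observed turnout falls short of a pre-built model of expectations (the “database of expectations” above). This is an illustrative toy with invented names and numbers, not any real campaign's code:

```python
def flag_undervotes(observed, expected, tolerance=10):
    """Sense: observed turnout per precinct.  Process: compare each
    observation to the expectation model.  Respond: return the precincts
    that need additional attention, with the size of the shortfall."""
    alerts = []
    for precinct, turnout in observed.items():
        surprise = turnout - expected.get(precinct, 0)
        if surprise < -tolerance:   # undervote: dispatch more volunteers
            alerts.append((precinct, surprise))
    return alerts

observed = {"precinct_7": 40, "precinct_9": 58}
expected = {"precinct_7": 60, "precinct_9": 60}
print(flag_undervotes(observed, expected))  # precinct_7 is 20 votes short
```

The structure is the same whether the "sensor" is a poll watcher, a cash register, or a clickstream: measure, compare to expectations, respond automatically.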

It’s essential to recognize that each of these systems is a hybrid human-machine system, in which human actions are part of the computational loop. Back in 1998, when I was trying to understand just how people were using Perl and other scripting languages on the web, I came to recognize that web applications, unlike desktop applications, still have the programmers inside them. Perl was called “the duct tape of the internet” precisely because it was used for programming that was only expected to last a short time; the programmers were still there, constantly tweaking the application. (I first began using the image of “the Mechanical Turk” in my talks about this aspect of web applications in 2003.)

What became clear in the ensuing decade is that humans are not just part of the programming, but also sensors and actuators for computers. Our aggregate behavior is measured, monitored, and becomes feedback that improves the overall intelligence of the system. That is why I’ve said that the defining characteristic of Web 2.0 applications is that they “harness collective intelligence.”

Aside: I seem to have lost the battle to define Web 2.0 as “the use of the network as a platform to build systems that get better the more people use them.” Perhaps it’s the lure of the obvious: companies and products that harness explicit user contribution are easier to recognize than those that pursue the more subtle and difficult task of harnessing implicit contribution. Or perhaps it’s the persistent gravitational tug of the idea that the heart of Web 2.0 is ad-supported business models; therefore, enterprise features that look like those of well-known companies featuring user contribution and ad-supported business models must by definition also be “2.0.” For me, the far more profound and powerful systems come from harnessing both explicit and implicit human contribution.

Again, consider MyBarackObama.com. It definitely harnessed explicit contribution, providing a platform for volunteers to organize and host local calling parties, to blog, or to perform other campaign activities. But ultimately, Obama’s ground game–old-fashioned precinct-level organizing, amped up to a new level by an army of distributed volunteers armed with mobile phones and coordinated via a web application–was the key to his victory. The “explicit” social media elements of MyBarackObama.com paled in impact compared to the development of a next-generation electronic nervous system, in which volunteers were trained, deployed, and managed by a web application that used them, in Sean McMullen’s memorable phrase, as “souls in the great machine.”

• Great observations, Tim. Previous organizational structures like hierarchies were about delivering information to a single decision-maker in a timely fashion and distributing the decisions made to the employee actuators. Better IT infrastructure is allowing organizations to distribute their thinking and decision-making across the organization, ensuring that the person best equipped to make the decisions has access to the necessary knowledge and the ability to publish their thoughts accordingly. I believe that we are in an age where we are creating new models for how organizations think, and the companies like Google and WalMart that innovate most effectively in new organizational cognition models will be tremendously successful.

  • Tim,

    What exactly is the contribution you are trying to make here?

The concept of cybernetic systems that blend man and machine is pretty old. You could read Wiener’s “God and Golem, Inc.,” for example, but really it goes back much farther:

Consider a system of slavery under a singular sovereign based on the technology of an accounting system comprised of clay tablets and rules of bureaucracy: a cybernetic system as old as literacy itself. You asked “What do Google, Walmart, and MyBarackObama.com have in common,” but the answer you give equally well describes the IT of the clay tablets and system of thugs and slaves.

To be sure, the modern systems are faster and capable of greater, more complex computations. To be sure, volunteers processing a call list are not wielding whips. Yet all of the essential elements that you cite as part of the “trend” are found as well in the older slavery systems.

And the commonalities are deeper than you might realize at first. In the slavery system, the record keeping machine and its human “peripherals” determine the built environment and the social and economic environment of the slaves. Here in the modern system that control of the slave experience is partitioned: Walmart in charge of commerce in real goods, Google in charge of information, MyBarackObama.com mediating participation in the political process. In the slave system power is consolidated through vast asymmetries of information … just as in the modern system. In the slave system the information shadow cast by a slave defines a place for the slave in the juridical process of the society; in the modern system, government access to the databases of Google et al. serves a similar role.

In short, your answer to what these three examples have in common is at best uninteresting in its lack of specificity. Arguably, your account is pernicious precisely because of its lack of specificity – its uncritical, celebratory, gee-whiz approach.

    So, what is added by taking the analytic tack that you’re taking? On the one hand, I suppose that some may find and recognize new business opportunities just as an earlier entrepreneur might have gazed upon the enormous expense of digging up clay and shaping tablets and recognized the opportunity for advantage within the slave society by promoting the adoption of paper record keeping. On the other hand, I suppose that some will be distracted from consideration of the differentiating details of each real case by a vague fascination with the “big theme” of a society with cybernetic control elements as its main organizing principle. Thus distracted away from the details, people will tend to see instances like Google or Walmart as somehow inevitable – as phenomena outside the sphere of moral analysis. It’s just the Big Picture, man; it’s Trippy, isn’t it?

    On the other hand, the criticism that you are distracted from the details could be turned around to take your questions in a different direction:

    So, these firms have cybernetic “nervous systems” with human peripherals – that’s fine. By definition, signals are collected from the “users” – from the objects of control. This surveillance data is collected in centralized ways and processed to emit control signals as part of the overall feedback circuit.

    With that analytic framework in place we can rediscover the details and return to moral questions rather than questions of economic opportunity for a small elite:

    For example, the use of surveillance data is nearly unconstrained. The teleology of the control mechanism is constrained only by the need to arrange the production of the surveillance data it takes as input — beyond that it can do anything it likes.

    Therefore, if we want to consider the morality of the details of something like a Google we can look at what data is collected and how that can be used, regardless of how it happens to be used on a given day. We can ask, given that this is a cybernetic system, what kinds of control operations is it capable of? And then we can ask, do we really want those operations to be so casually – or even enthusiastically – potentiated? Is Google really a use of resources we want? What could possibly go wrong?

If there is such profound power in the asymmetry of these control systems – the centralization of information and the centralization of control – then what checks are appropriate here? For example, should we have in place some mechanism to prevent Walmart from driving its profits so hard as to effect the dismantling of domestic manufacturing of many kinds of goods? Should its efficiency be unchecked to the point it drives the dismantling of less IT-dependent, less oil-dependent forms of retail? Or of Google: if we are to permit the collection of our “click streams” and our searches, should we not have insight into the processing of these, the models conveyed to advertisers, the perspectives afforded the government?

I hope we can get past a warmed-over Wiener and maybe instead pick up in the same general directions he ended in with works like “God and Golem, Inc.” And I’ll borrow from him here, as he borrowed from W. W. Jacobs’ “The Monkey’s Paw”:

    If there is, today, a consensus teleology to those three organizations it is, indeed, “success”: revenues for each of the three organizations, market share for each, and so on. These are the metrics which are in turn used for other forms of cybernetic control such as stock trading or vote counting.

    Yet, clearly, to achieve good “scores” on those metrics the three firms engage in a great deal of activities with much broader impact. The ranking of a page can be life or death for a small business. The fate of a click stream can be life or death for a dissident. A roster of political supporters can mean job or business discrimination or the denial of presidential pardon.

    How odd that the only cybernetic controls to which those three firms are meaningfully subjected have to do with the narrow, sterile, non-human metrics of revenues and votes.

    New frontiers of computing and communications capability are thus our “Monkey’s Paw”: a technology capable of granting wishes like “200 pounds sterling, please” but likely to grant them in unanticipated and rarely desirable ways (“Sorry, your son died at work today. Without admitting fault his employer has sent you this 200 pounds sterling as condolences.”).


  • Off topic

For the links people add to your blog comments you add a rel=”nofollow”, so nobody gets any Google PageRank from writing on this blog. Isn’t that pretty old world for your company?

    Love the content though

  • Mads – rel=nofollow in comments is not old school at all. It’s a defense against comment spam. Another sad instance of the tragedy of the commons, in which bad actors ruin things for everyone. It’s a bummer, though.

  • Joe H

    Another post that reminds me that your work has always drawn parallels for me to Stuart Kauffman’s writings.

    These are all great examples of complex systems in which both humans and software contribute to new emergent behavior when we view these actors as part of a single, whole system.

The result, in the cases you mention, is perhaps both unexpected and generates “better behavior” than in more simple systems. The emergent behavior of the Google system is to create a better search system and a related intent-based ad system that fuels it; the emergent behavior of MyBarackObama.com is to create a more effective vote-generating system and a new political regime.

The dynamics of Web 2.0 (and its relatives) aren’t just about network effects, as you say, but also about the power of new emergent behaviors and the impact these “systems” have on the ecosystems they exist in.

    I’m not a student of complex systems, but it seems like the complex systems viewpoint might be a broader platform to understand the dynamics here.

  • Joe Hayashi – Thanks. Good perspective that there’s lots of research and thinking on complex/emergent systems that applies here.

  • Carm McDuff

I hope this gets to you. I’ve read your bio and a couple of your articles and I am just as passionate about contributing to the betterment of mankind, specifically the under-served.
    I must connect with you in some way. Please prove that you live what you preach and give me a few moments of your time — it is critical to the state of the current jobless crisis. I’ve developed a solution and presented to IT experts, marketing experts, DVR experts serving the disabled and across the board I’m told there has never been anything like my solution and I need to find a way to get it to the unemployed population and the disabled unemployed because this is a solution that will revolutionize their lives. I need your help, please.
I’m a well-educated woman, alone, with Leukemia, lost everything, unemployed for 2 years and filing for disability. I don’t know how much time I have in this life and my 23 y.o. daughter is now also homeless and unemployed. I have to do something to take care of her by launching this tool.
    I’ve generated capital for development. To demonstrate how incredible this tool is; I have a prominent patent attorney providing legal services pro-bono, my developer is providing most of the development pro-bono, and every IT person I’ve presented to has joined my marketing team. I need your help desperately — you are the lobbyist who can help me get this to the disabled. Please contact me at the above e-mail address. I must speak with you immediately because there is an initiative I’m trying to get involved with that is going before the legislature mid-January 09. Please, please contact me.

• You may find my essay on how Obama’s use of internet tools in his campaign could affect the power of a sitting president interesting. The ideas in it were discussed in the NY Times.

• I have often found myself frustrated by people who confuse “web browser as platform” with “web as platform.” When designing the IdeaExchange (the forerunner of IdeaStorm & MyStarbucksIdea), we consciously applied the principles we learned from large scale social applications to an enterprise function that has been broken for decades. While doing so, we realized that it was very hard for people to grasp the underlying principles of building community -> harnessing collective intelligence -> solving problems. It’s simply easier to brand something “web 2.0” or “enterprise 2.0” and leave it up to interpretation than to bother explaining it. But I’m definitely always on the lookout for people who do understand the principles.

  • Tim,

    I agree with Thomas: your post doesn’t seem to add much to any debate. I encourage everyone to write as many blog posts as they can muster, understanding that the genuinely informed opinions will come floating to the top. This, alas, is a false positive.

    One observation you make is particularly striking: Obama’s old fashioned precinct-level organizing is key to his victory. Tim, you couldn’t be more wrong.

There is nothing old-fashioned about Obama’s precinct organizing, and Obama did not win by organizing on the precinct level either. The way the Obama campaign maintained relationships with their volunteers is unique in the history of political campaigns. The IT systems they deployed were actually designed to make those relationships between people stronger.

When one is not a student of Obama’s campaign organization it’s easy to misunderstand the workings of their volunteer communities. One might think that knocking on doors, or working the phones, somehow got all those votes in. But that’s not how things worked. Every interaction between people in the campaign, and between the campaign and outsiders, was explicitly designed by the campaign to build and maintain relationships, and through those relationships spread a story. It is that story that got Obama the victory, and I won’t go into the detail of that story here.

Their IT systems did their bit, but here’s a question for you, Tim: do you personally believe that if Obama’s IT systems had crashed completely on election day he would still have won?

Your answer to that question may help you to better understand the characteristics of the Obama campaign.

    Kind regards

    Steven Devijver

  • Tim,

    Interesting thoughts. As another poster has pointed out, this perspective is an extension of the feedback loop theory that is at the heart of cybernetics. However your broader point – that “intelligence” in the interest of sharply defined business objectives is more impactful, every day, than blogs and wikis – is quite right. Blogs, wikis, etc. come in handy for human driven creative activities. But the money, routinely, derives from more structured processes – increasingly driven by business intelligence techniques. The outcomes may be “emergent”, but the decision rules are made by hierarchies, and in their interests. This is neither intrinsically bad, nor good. It’s just the way things work. Good or bad depends on the interests and motives of the institutional owner of the system. But it’s why dreamy idealism about “mass collaboration” misses the mark.

• Zephyr Teachout wrote: “decentralized power is different than decentralized tasks. The internet enables both, but the former increases democracy, whereas the latter increases hierarchical control… Power is decentralized when participants have a meaningful chance to change the structure—what Jonathan Zittrain calls ‘generativity.’ Power is not decentralized every time a person participates. A supporter can make phone calls, door knock, forward emails, but not be encouraged to strategize on her own; she has little more power than a person sending in a video entry to a Cheerios contest for a new ad campaign.”

  • “competitive advantage comes from capturing data more quickly, and building systems to respond automatically to that data”

Kinda why I like ESBs and event-driven solutions.

    It’s been said business is war a few times I think ;-)

  • i know this is a statement riddled with oversimplification, but the internet [or interweb as i’ve affectionately heard it called] reminds me a LOT of the old mainframe days. gimme a programmable terminal and plug me in to the old VAX/VMS main-brain. [not to exclude the IBM or UNIX main-brains – but i cut my teeth on JCL – slight preference]. i was with the computing services group, led by a shy, quiet brilliant man named Mahendra. he was paranoid – we had great security.

    anyway, my first experience with a “PC” was a standalone – coming from a mainframe environment, i considered it “useless” when i was told i could not use it to message anyone. [NOVELL came along, and i was a bit happier – but only having 10 or so people compared to hundreds still seemed a little… limiting].

    there are, of course, many differences between a mainframe or network environment, and “The Web” [remember, here, that i am grossly oversimplifying]. the biggest difference, though, might be the lack of one central organization or agency “in charge”. oh, there are groups that write standards, and browser-makers that both follow and ignore standards, and encryptions to ease our concerns about electronic and virtual security. but “The Web” is, and will continue to be, a massive hodge-podge of global virtual identities. not just hundreds, but millions. more. we really are becoming a “collective” virtual culture.

    so is there an answer to “who’s in charge of the web?” isn’t it kind of like asking “who’s in charge of the world?”


  • Scott Heiferman –

    Really good point. You’re absolutely right about the difference between decentralized power and decentralized tasks.

  • Thanks for giving the Republicans a shovel to dig out of the ditch with! :-)

    Two weeks ’til the flood gates open…

  • renato

    i had never read anything on these topics until tonite; extremely interesting adaptation of cognitive psychology as i remember it.
    i liked Lord’s take even more – more provocative, far-reaching and humanistic. attributes exactly what IT needs.

  • muslim Dinmahamed

Hope having power is not a privilege, but an immense courage to bring the beloved country in a right prospective direction. Being UNITED I think is the best target. Barack Obama has the courage to bring a good example that life is like love, not colour problems.
    Hope to hear from you soon……
    Faithfully yours

  • Tim,

    I think one of the issues is simply explicit versus implicit data. More people see contributions such as reviews, comments, ideas. However the insights that follow from small, seemingly mundane actions such as links, purchases or votes do seem to be more interesting.

    I like Flickr Interestingness in this regard, too – I am constantly amazed by the beautiful images surfaced without any knowledge of the image content (as I understand things).

Similarly, analysis of location data is likely to yield more such implicit “votes,” as companies like Sense Networks are showing (disclosure: I am an investor in Sense).

    Thanks for reminding everyone, that while the conversations at the party might sound interesting, who people are standing with, how they are standing and gesturing, might be even more interesting.

• Sorry I’m late! ;-) Had to think about this one for a while. I’m intrigued by this post as well as your report on Web Squared. The remark that gets me most is: infused with IT. I think you have a big point here. But what does this mean? Yes, this means that the Obama campaign was much more than web 2.0 concepts and tools. Of course they were new and used them effectively. It was also about culture. But wasn’t it even more so about the underlying ICT? The Obama campaign is ‘infused with IT’. I’d really like to learn more about what this means. Does it relate to what I’ve been writing about? That structured and unstructured information need to be managed under one architecture? That unstructured information is easily collected and generated by web 2.0 tools (sensors), for instance, and is subsequently piped into more formal systems, like product data management and enterprise resource tools? It would be really nice to see an overview of all the IT Obama used, not ‘just’ the new media tools.

• I couldn’t agree more. I’d like to add that the education sector–both K-12 and adult–should be added to the list of types of organizations that will benefit from real-time feedback and better analytics. ePortfolio platforms like ours help track, learn from and publish such data. These data can be transformed into knowledge graphs that may represent one learner, one class, or a whole college campus.