Google, WalMart, and The Power of the Real Time Enterprise

What do Google, WalMart, and MyBarackObama.com have in common, besides their extraordinary success? They are organizations infused with IT in such a way that it leads to a qualitative change in their entire business.

I get frustrated when I see people highlighting the use of social media–blogging, wikis, Twitter, customer feedback systems like Dell IdeaStorm or MyStarbucksIdea–as if they were exemplars of what has come to be called “Enterprise 2.0.”

As I said in my keynote at the Web 2.0 Expo NY (and in a followup Radar post), WalMart is a better example of Enterprise 2.0 than any of these trendier user contribution systems. Just as Google’s key innovation with PageRank was to recognize that a link was a vote, which could be counted and measured to produce better search results, so WalMart recognized early on that a purchase was a vote. Each company built real-time information systems to capture and respond to that vote. WalMart built a supply chain in which goods are automatically re-ordered as they go out the door, with algorithms based on rate of sale controlling the reorders. Google built a better search engine, in which pages that were “better linked” were given priority over those produced by pure keyword matching. It went on to build real-time systems to measure what John Battelle called the database of intentions, as expressed in people’s queries and subsequent clickstream data, as well as an ad auction system that prices ads in real time based on the predicted likelihood that each ad will be clicked.
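The “link as a vote” idea can be sketched in a few lines of code. This is a hypothetical, heavily simplified illustration of PageRank-style power iteration — each page distributes its current score across its outbound links, so well-linked pages accumulate higher scores — and not Google’s actual algorithm or parameters:

```python
DAMPING = 0.85  # standard damping factor from the original PageRank paper

def pagerank(links, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = set(links) | {p for outs in links.values() for p in outs}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # every page keeps a small baseline score...
        new_rank = {p: (1 - DAMPING) / len(pages) for p in pages}
        for page, outs in links.items():
            if not outs:
                continue
            # ...and casts its current score as "votes" for the pages it links to
            share = DAMPING * rank[page] / len(outs)
            for target in outs:
                new_rank[target] += share
        rank = new_rank
    return rank

# toy web: "c" receives links (votes) from both "a" and "b"
scores = pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"]})
assert max(scores, key=scores.get) == "c"
```

The point of the toy example is the shape of the system, not the math: a cheap, observable signal (a link, a purchase) is counted continuously and fed back into the ranking.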

I came to see just how closely MyBarackObama.com emulated these ideas of the real-time enterprise in accounts of the Houdini project, a bold program in which poll watchers eliminated the names of voters who had actually made it to the polling station from the “get out the vote” call lists:

While the hot line was too overwhelmed to be of much use, the source said the program itself still proved a smashing success….the campaign was able to clean 1.6 million voters from the call lists they distributed to canvassers that afternoon, making those lists 25 percent shorter on average.

While the infrastructure for data reporting broke down under the pressure of the election, the general trend is clear: competitive advantage comes from capturing data more quickly and building systems that respond automatically to that data.
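The core of the Houdini mechanism described above is a simple set operation. The sketch below is illustrative — the names, data structures, and function are mine, not the campaign’s — but it shows how reports from poll watchers shorten the next shift’s call lists:

```python
def prune_call_list(call_list, already_voted):
    """Drop anyone reported as having voted; return the pruned list
    and how much shorter (in percent) it became."""
    pruned = [voter for voter in call_list if voter not in already_voted]
    pct_shorter = 100 * (len(call_list) - len(pruned)) / len(call_list)
    return pruned, pct_shorter

call_list = ["alice", "bob", "carol", "dan"]
reported = {"bob"}  # a poll watcher saw Bob at the polling station
pruned, pct = prune_call_list(call_list, reported)
assert pruned == ["alice", "carol", "dan"]
assert pct == 25.0  # the list is 25 percent shorter, as in the account above
```

The computation is trivial; the hard part, as the broken hot line shows, is the real-time reporting pipeline that feeds it.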

Consider MyBarackObama.com as a kind of vast machine, with humans as extensions of the programmatic brain: volunteers log in to get their get-out-the-vote call lists. They place their calls, then use the web to report back their results. Those results modify the call lists for the next volunteer. At the other end, the Houdini volunteers are taking note of who is actually coming out to vote, allowing the system to dispatch additional attention to hot spots — for example, where there is an undervote compared to the campaign’s projections. Meanwhile, the pruned call lists make the volunteers more effective. Inside the machine, programmers are tuning the algorithms, while top campaign staffers are making key decisions to adjust the resource mix.
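The “dispatch attention to hot spots” step can also be sketched. This is a hedged guess at the logic, with made-up precinct names, field names, and threshold: compare actual turnout reported by volunteers against the campaign’s projections, and flag the precincts with the largest undervote for extra canvassing:

```python
def undervote_hotspots(projected, actual, threshold=0.10):
    """Return precincts where turnout trails projection by more than
    `threshold` (as a fraction of the projection), worst first."""
    gaps = {}
    for precinct, expected in projected.items():
        shortfall = (expected - actual.get(precinct, 0)) / expected
        if shortfall > threshold:
            gaps[precinct] = shortfall
    return sorted(gaps, key=gaps.get, reverse=True)

projected = {"ward-1": 500, "ward-2": 400, "ward-3": 300}
actual = {"ward-1": 480, "ward-2": 280, "ward-3": 240}
# ward-2 is 30% below projection, ward-3 is 20% below; ward-1 is on pace
assert undervote_hotspots(projected, actual) == ["ward-2", "ward-3"]
```

Again, the algorithm is the easy part; the organizational feat was wiring humans in as the sensors (poll watchers) and actuators (canvassers) of the loop.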

Now put these three examples, Google, WalMart, and MyBarackObama together, and ask yourself what they tell you about the future of business, military operations, or any large organization.

Sensing, processing, and responding (based on pre-built models of what matters, “the database of expectations,” so to speak) is arguably the hallmark of living things. We’re now starting to build computers that work the same way. And we’re building enterprises around this new kind of sense-and-respond computing infrastructure. In this sense, you can argue that Microsoft’s term “Live Software” is the best name yet for the kind of software-infused enterprise we’re building.

It’s essential to recognize that each of these systems is a hybrid human-machine system, in which human actions are part of the computational loop. Back in 1998, when I was trying to understand just how people were using Perl and other scripting languages on the web, I came to recognize that web applications, unlike desktop applications, still have the programmers inside them. Perl was called “the duct tape of the internet” precisely because it was used for programming that was only expected to last a short time; the programmers were still there, constantly tweaking the application. (I first began using the image of “the Mechanical Turk” in my talks about this aspect of web applications in 2003.)

What became clear in the ensuing decade is that humans are not just part of the programming, but also sensors and actuators for computers. Our aggregate behavior is measured, monitored, and becomes feedback that improves the overall intelligence of the system. That is why I’ve said that the defining characteristic of Web 2.0 applications is that they “harness collective intelligence.”

Aside: I seem to have lost the battle to define Web 2.0 as “the use of the network as platform to build systems that get better the more people use them.” Perhaps it’s the lure of the obvious: companies and products that harness explicit user contribution are easier to recognize than those that pursue the more subtle and difficult task of harnessing implicit contribution. Or perhaps it’s the persistent gravitational tug of the idea that the heart of Web 2.0 is ad-supported business models; therefore, enterprise features that look like those of well-known companies featuring user contribution and ad-supported business models must by definition also be “2.0.” For me, the far more profound and powerful systems come from harnessing both explicit and implicit human contribution.

Again, consider MyBarackObama.com. It definitely harnessed explicit contribution, providing a platform for volunteers to organize and host local calling parties, to blog, or to perform other campaign activities. But ultimately, Obama’s ground game–old-fashioned precinct-level organizing, amped up to a new level by an army of distributed volunteers armed with mobile phones and coordinated via a web application–was the key to his victory. The “explicit” social media elements of MyBarackObama.com paled in impact compared to the development of a next-generation electronic nervous system, in which volunteers were trained, deployed, and managed by a web application that used them, in Sean McMullen’s memorable phrase, as “souls in the great machine.”
