Ubicomp and Web 2.0: Connecting the Dots

I’ve been saying for some time that the next stage of Web 2.0 is the application of collective intelligence techniques to sensor data, not just to data input directly by humans.

Two stories this weekend illustrate this point nicely. On Saturday, The New York Times published a story entitled Billboards That Look Back, about a new generation of electronic billboards that use cameras to track who looks at them; yesterday, TechCrunch ran a story about Like.com’s contextual ads triggered by Facebook photos.

Most people will immediately recognize the first story as a ubiquitous computing (ubicomp) story: a next generation display equipped with sensors bringing computing to an arena that was previously analog and uninstrumented. But connecting the dots between that story and the second one is really important.

Many of the most important breakthroughs in Web 2.0 have come through finding new meaning in data that already exists, often through statistical methods and related algorithms, not by gathering new data or adding metadata and structure to existing data. (PageRank is the canonical example.) If Like.com really is able to do a good job of matching ads to photos via clever algorithms, they’ve effectively turned a wealth of existing user-generated photos into sensors for their application, without having to deploy a single camera of their own.
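
To make that pattern concrete, here’s a minimal PageRank sketch (a toy three-page graph of my own invention, not Google’s production system). Each link was placed by an author for his or her own reasons; power iteration over the link graph turns those existing links into something new, a collective measure of importance that nobody explicitly contributed:

```python
# Minimal PageRank sketch over a hypothetical toy graph.
# links maps each page to the pages it links to.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / len(pages)
            else:
                for target in outlinks:
                    new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
print(pagerank(links))  # "c" scores highest: more of the graph points to it
```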

In my talks, I’ve long argued, following Dan Bricklin’s Cornucopia of the Commons, that there is a hierarchy in architectures of participation, with the most powerful literally building systems in which participation is automatic, driven by the design of the system itself rather than by any explicit request for user contribution. Methods for extracting additional layers of meaning from activities that users perform for their own self-interest fall into this category.

Thus, it’s important to include richer interpretation of photos and audio/video streams in the category of sensor data. Photosynth, for example, is an application that extracts additional data, after the fact, from user-contributed photos. Similarly, Last.fm’s Audioscrobbler turns your playlist into a sensor, and Wesabe is effectively turning the credit card into a collective intelligence sensor. (Disclosure: Wesabe is an OATV investment.)
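
In the same spirit, here’s a minimal “playlist as sensor” sketch (invented listening histories and a deliberately crude method, nothing like Last.fm’s actual system). Each user scrobbles tracks purely for their own enjoyment; counting which artists co-occur across users’ histories yields a collective taste signal as a free by-product:

```python
# Hypothetical scrobble data: each user's listening history,
# recorded as a side effect of normal listening.
from collections import Counter
from itertools import combinations

scrobbles = {
    "alice": ["Radiohead", "Portishead", "Massive Attack"],
    "bob":   ["Radiohead", "Massive Attack", "Aphex Twin"],
    "carol": ["Portishead", "Massive Attack"],
}

# Count how often each pair of artists appears in the same user's history.
cooccurrence = Counter()
for history in scrobbles.values():
    for pair in combinations(sorted(set(history)), 2):
        cooccurrence[pair] += 1

# The most frequently co-occurring pairs are a crude similarity signal
# that no listener explicitly contributed.
for pair, count in cooccurrence.most_common(3):
    print(pair, count)
```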

Take away two messages:

  1. Think about ubiquitous computing not just as the move from the computer to the cellphone and other mobile devices, but as the recognition that those devices are becoming sensors for cloud applications harnessing collective intelligence.

  2. Remember that “data is the Intel Inside” of Web 2.0, and that databases driven by network effects and applications deriving meaning from that data via statistical methods will continue to be the key to competitive advantage in the ongoing network era.

P.S. I’ve been calling this trend ambient computing, because I like the sense of computing encountered while walking around, and because I found Peter Morville’s Ambient Findability so thought-provoking, but ubiquitous computing or ubicomp seems to be the winning buzzword.
