Recently, whenever people ask me “What’s Web 3.0?” I’ve been saying that it’s when we take all the principles we’re learning about aggregating human-generated data and turning it into collective intelligence, and apply them to sensor-generated (machine-generated) data.
A good example of this trend showed up this morning on Slashdot: “An anonymous reader writes: ‘Signals from mobile phone masts have been used to measure rainfall patterns in Israel, scientists report.’ From the BBC article: ‘The University of Tel-Aviv analyzed information routinely collected by mobile networks and say their technique is more accurate than current methods used by meteorological services. The data is a by-product of mobile network operators’ need to monitor signal strength. If bad weather causes a signal to drop, an automatic system analyzing the data boosts the signal to make sure that people can still use their mobile phones. The amount of reduction in signal strength gave the researchers an indication of how much rain had fallen.’”
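To make the technique concrete: rain attenuates microwave signals, and the attenuation per kilometer follows a well-known power law in the rain rate (A = a·R^b, the relation standardized in ITU-R P.838). A minimal sketch of the inversion, with illustrative placeholder coefficients rather than anything the Tel Aviv researchers actually used:

```python
# Sketch: recovering rain rate from the drop in signal strength on a
# cell-tower microwave link. The power law A = a * R**b (specific
# attenuation in dB/km vs. rain rate R in mm/h) is the standard model;
# the coefficients a and b below are illustrative assumptions, as is
# the function name -- they vary with link frequency and polarization.

def rain_rate_mm_per_hour(baseline_dbm, observed_dbm, link_km,
                          a=0.12, b=1.1):
    """Estimate rain rate from signal loss relative to a dry baseline."""
    attenuation_db = baseline_dbm - observed_dbm      # total loss on link
    if attenuation_db <= 0:
        return 0.0                                    # no rain-induced loss
    specific_db_per_km = attenuation_db / link_km     # normalize by length
    return (specific_db_per_km / a) ** (1.0 / b)      # invert A = a * R**b

# Example: a 5 km link whose received signal fell 3 dB below its dry baseline
print(round(rain_rate_mm_per_hour(-40.0, -43.0, 5.0), 2))  # → 4.32
```

The operators’ monitoring systems already log the baseline and observed signal strengths for free, which is exactly why the data is such a cheap by-product.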
There are so many non-obvious sources of data being generated by current technology. Some data collection will be explicit, but there are other sources that will depend on creative data mining. Entire business sectors will be disrupted as someone realizes how to cheaply program a service that was once manual and expensive.
This isn’t quite what David Weinberger was talking about when he coined the lovely phrase “the semantic earth” (he was talking specifically about GIS), but I think it’s what he meant! The implications are just now coming into focus. Location technologies such as those discussed at Where 2.0 will be one backbone for organizing this data, but there will be others.