Guest blogger Dylan Field is an intern at O’Reilly and a senior at Technology High School in Rohnert Park, CA, where he is a member of the FIRST Robotics team. Dylan is especially interested in computer science, mathematics, and statistics.
In his “Web Meets World” talk at the Web 2.0 Expo in New York last September, Tim O’Reilly described where he saw the web heading. “The next stage of Web 2.0 is going to be driven by sensors,” he said. “We are moving out of the world in which people typing on keyboards are going to be driving collective intelligence applications.”
Like all transitions, the incorporation of data from the physical world onto existing web platforms is gradual. We are just beginning to see applications surface, and the best is still ahead of us. Below are a few observations, predictions, and implementations of this emerging trend.
Sensors Help Keep Elderly Safe
This New York Times article highlights how seniors are taking advantage of sensors so they can continue to live independently. Sensor systems are able to detect everything from neglected pills to glucose levels to falls. Seniors seem to like the systems, as do their relatives. “In the past, I tried to spend more time on, ‘How are you feeling?’ ” Marvin Joss says. “I still ask those questions, but now it’s more to an idea of having a conversation, not trying to listen for clues about whether she’s O.K.”
The Demon-Haunted World
If I had to use one word to describe this presentation by Dopplr’s Matt Jones, it would be “psychogeography,” a term developed by French theorist Guy Debord. Psychogeography is defined informally as “a whole toy box full of playful, inventive strategies for exploring cities…just about anything that takes pedestrians off their predictable paths and jolts them into a new awareness of the urban landscape.” Jones cites examples like twittering bridges and pollution-sensing robotic dogs to back up a claim by architect Richard Rogers that “Our cities are increasingly linked, and learning.” “It seems to me like there are a bunch of hackers reclaiming information from the city,” says Jones. “[They are] gardening it without permission.”
SENSEable City Laboratory
MIT’s SENSEable City Laboratory uses sensors to understand the macro-dynamics of cities. For example, in one experiment the lab collected all cellphone usage in Rome for one night. They then aggregated the data and produced a visualization showing how people moved around and where events were taking place. If we had real-time access to this kind of information, how would it affect our choices? Would we decide not to eat at a particular restaurant because it is too crowded? Would we choose our entertainment based on the flow of the crowd?
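At its core, the kind of aggregation the lab performed can be imagined as a simple count over location pings. Here is a minimal Python sketch of the idea, with entirely made-up tower names, hours, and data (the lab’s actual pipeline is far richer):

```python
from collections import Counter

# Hypothetical, simplified records: each ping is (hour, cell_tower_id).
# Counting pings per tower per hour approximates crowd density over time,
# the raw signal behind a "where is the city right now?" visualization.
pings = [
    (20, "trastevere"), (20, "trastevere"), (20, "colosseo"),
    (21, "trastevere"), (21, "testaccio"), (21, "testaccio"),
]

density = Counter(pings)  # (hour, tower) -> number of pings

# Busiest location each hour -- a crude answer to "where is the crowd?"
hours = sorted({h for h, _ in pings})
hotspots = {h: max((t for hr, t in density if hr == h),
                   key=lambda t: density[(h, t)])
            for h in hours}
print(hotspots)  # {20: 'trastevere', 21: 'testaccio'}
```

Feed the same counts into a map renderer instead of a dictionary and you have the skeleton of the Rome visualization: density over space, animated over time.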
AMEE and Google PowerMeter
AMEE and Google PowerMeter are two ways the “here’s your data, do something with it” methodology can be used to make people aware of their carbon footprints. Both use sensors such as smart meters to track and display energy consumption over time. (Disclosure: OATV is an investor in AMEE.)
In a previous partnership between the two companies, Google used AMEE’s profiling engine to let users calculate their carbon footprints. After completing the web form, users were taken to a Google Map mashed up with the carbon footprints of those nearby. Soon, we’ll be able to do this without the web form. As O’Reilly said, we are slowly transitioning out of a world where people typing on keyboards are driving collective intelligence.
What role do you see sensors playing in your life? How do you interact with them now? Does the possibility of sensor-driven collective intelligence frighten or excite you? Post a comment and let us know.