The Results of Reality Mining at Where 2.0

Your cellphone and laptop leave an invisible trail. Collecting this data is known as Reality Mining, formally defined as “the collection of machine-sensed environmental data pertaining to human social behavior.” We had two such collection projects this year at Where 2.0. Each took advantage of the signals that our inadvertent sensors (primarily mobile phones and laptops) broadcast. Several times throughout the conference I told attendees about the projects and showed them the same visualizations shared in this post.

The goal in having them at Where 2.0 was two-fold. First, Where 2.0 is a conference about location; its goal is to show the latest work at the intersection of location and technology, and Reality Mining applications definitely fit that description. Though our attendees' devices are undoubtedly tracked without their knowledge all the time, we wanted to make these applications visible to them. Secondly, as an event organizer, knowing even a little more about attendees' group actions is very valuable. We received a little bit of useful data, but there would definitely need to be refinements before this could be considered one of our tools for judging how well a talk was received (the number of new HTTP requests might be a more accurate signal than the movement of attendees).


Path Intelligence is able to detect GSM signals from cellphones. We used it to learn more about the attendees of Where 2.0. The embedded slides show where our attendees were from and when they were in the session rooms versus the hallway. One of the slides also shows the path of a single cellphone. All of the data that was collected is anonymous. The team behind Path Intelligence was on hand to discuss the application, and they were usually busy explaining it. See the TechCrunch post for more information on the company. (Disclosure: OATV has invested in the company.)

Leonard Lin, co-founder of Upcoming, put together a fun reality mining project for the conference. A couple of weeks ago Leonard created Fireball (also profiled on TechCrunch) to let attendees check in at locations during the Web 2.0 Expo (much like Dodgeball). We got to talking about how one could use Bluetooth to do the same thing at a more contained conference. He took on the challenge and turned three Mac Minis into sensors for the conference. He wasn't able to detect signal strength via the native Bluetooth stack on the Minis, so there is no proximity or geolocation data, but he was able to collect some interesting information, like the number of devices present. He used NodeBox to create the visualizations. Future versions might run on the much cheaper and smaller Gumstix hardware.
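To give a flavor of the kind of counting such a sensor does, here is a minimal sketch in Python. The scan records (timestamps, addresses, device names) are invented for illustration, and this is not Leonard's actual code; it just shows how periodic Bluetooth inquiry logs could be reduced to a per-interval device count like the ones in the visualizations.

```python
from collections import defaultdict

# Hypothetical records a Bluetooth sensor might log during periodic
# inquiry scans: (minute offset of the scan, device address, name).
scans = [
    (0, "00:11:22:33:44:55", "Nokia 6600"),
    (0, "AA:BB:CC:DD:EE:01", "Leonard's Phone"),
    (5, "00:11:22:33:44:55", "Nokia 6600"),
    (5, "AA:BB:CC:DD:EE:02", "Pocket PC"),
]

def devices_per_interval(records):
    """Count distinct device addresses seen in each scan interval."""
    seen = defaultdict(set)
    for minute, addr, _name in records:
        seen[minute].add(addr)  # sets dedupe repeat sightings
    return {minute: len(addrs) for minute, addrs in sorted(seen.items())}

print(devices_per_interval(scans))  # {0: 2, 5: 2}
```

Keying on the hardware address rather than the name means a device that changes its advertised name is still counted once, which matters if you want the device count rather than the name count.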


The BlueBall data is more revealing of people's identities in some ways. As you can see from the image below, sometimes the Bluetooth name has been modified to reveal more about the person. Other times it is just a device number. As these sensors proliferate (or as people realize that any computer they walk by can detect their Bluetooth signal), I wonder how much the device names will change.

Thanks to Toby, Sharon, and Leonard for your hard work.
