Dynamic Time-Travel Maps From MySociety and Stamen

[Image: London dynamic commute map]

UK-based non-profit mySociety teamed up with Stamen Design to develop some innovative time-travel maps. The snapshot of the map above shows where you can live in London with a commute of between 30 and 60 minutes and a median house price over £230,000. As you adjust the sliders, the map changes in real time, letting you vary the commute time from 0 to 90 minutes and the house price from £0 to £990,000. The Department for Transport, which requested the work, is the map’s center (and the basis for the commute times).

You can try out the map after the jump. They also made dynamic maps with the Olympic Stadium and the BBC as the center.

[Image: London commute map]

These maps are an update of Chris Lightfoot’s 2006 travel-time maps project. The focus of that project was how to present commute-time data (see the static commute map of London to the right, with contour lines at half-hour intervals). For this 2007 version, the focus was on making the maps interactive. The mapping data comes from OpenStreetMap (converted from the Ordnance Survey data used in 2006).

mySociety is a tech-centric non-profit that focuses on building websites for the civic good and teaching others about the internet. Some of their previous projects have had a more political bent. TheyWorkForYou, their first project, is a searchable site that gives British citizens a way to find out what is happening in Parliament. Their most recent site, FixMyStreet, gives neighborhoods tools to discuss local problems.

To generate the maps, mySociety screen-scraped the Transport Direct website. At first they queried for routes at each public transport stop one at a time; with this method, Cardiff took 4 hours and 15 minutes to generate. They then tried scraping in parallel and got Cardiff down to 45 minutes, and they estimate that a better algorithm could bring it down to 15 minutes. Fifteen minutes is still a long time to wait for a map.
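The parallel speed-up they describe is the classic move for I/O-bound scraping: keep several requests in flight at once. A minimal sketch of the idea, assuming a hypothetical `fetch_route_time` lookup (the real scraper hit the Transport Direct site; here a deterministic stub stands in for one HTTP request per stop):

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_route_time(stop_id):
    # Stand-in for one scrape: in the real thing this would be an
    # HTTP request and an HTML parse. We fake a travel time in minutes.
    return (stop_id * 7) % 90

def scrape_serial(stops):
    # One stop at a time -- the "4 hours 15 minutes for Cardiff" approach.
    return {s: fetch_route_time(s) for s in stops}

def scrape_parallel(stops, workers=8):
    # Several lookups in flight at once -- the 45-minute version.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        times = pool.map(fetch_route_time, stops)
        return dict(zip(stops, times))

stops = list(range(100))
assert scrape_parallel(stops) == scrape_serial(stops)
```

Threads are enough here because the work is waiting on the network, not computing; the wall-clock win scales roughly with the number of workers until the remote site becomes the bottleneck.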

In the future they want to generate these maps on the fly for users. That would require direct access to the data (Google Maps, mySociety estimates, with its lightning-fast routing servers and direct access to the data, would take only 2 minutes to generate Cardiff). To achieve this, mySociety is considering building a client app or getting dedicated servers from Transport Direct (as this was a government-sponsored project, anything is possible).

Time is a difficult thing to represent on maps, but it will become more common in the future. Should it be a loop, like the Stamen project Trulia Hindsight, which plays back 20th-century housing data (Radar post)? Sliders, like this project? Or more like a video? (Google Earth lets you “play” your GPS tracks.) We’ll explore this more at Where 2.0.

  • Michael R. Bernstein

    Um, Brady, these are ‘travel time’ maps, not ‘time travel’ maps.

  • There’s something deeply perverted about these algorithms, including the ‘slightly smarter’ ones they discuss.

    A single brute force run through all the possible points in a region would be several orders of magnitude more efficient.

    Algorithms to compute the shortest path between any two points avoid doing this, usually with very tricky optimizations. Running this hundreds or thousands of times, each time avoiding several branches – all to reconstruct what could be done in a single pass… does that strike anyone else as stupid?

    I am NOT saying MySociety developers are stupid – I think they are quite brilliant. Blame for the inefficiency has to be laid at the feet of the government, whose policy it is to not share all the data needed for such projects.
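[The “single brute force run” Daniel describes is a single-source shortest-path pass: one run of Dijkstra’s algorithm from the center yields travel times to every node at once, instead of repeating a point-to-point search per destination. A toy sketch, with made-up stop names and minutes as edge weights:]

```python
import heapq

def travel_times_from(source, graph):
    """One Dijkstra pass: travel time from `source` to every node.

    `graph` maps node -> list of (neighbour, minutes). One pass covers
    all destinations, versus re-running point-to-point routing per stop.
    """
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float('inf')):
            continue  # stale heap entry; a shorter path was already found
        for nbr, minutes in graph.get(node, []):
            nd = d + minutes
            if nd < dist.get(nbr, float('inf')):
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return dist

# Toy network (names and times invented for illustration)
graph = {
    'Whitehall': [('Waterloo', 10), ('Victoria', 8)],
    'Waterloo': [('Brixton', 15)],
    'Victoria': [('Brixton', 20)],
}
print(travel_times_from('Whitehall', graph))
# Brixton is reached in 25 minutes via Waterloo, not 28 via Victoria
```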

  • Michael, you’re right! It’s a great idea though: “how far back in time would I have to go to be able to afford a house in Chelsea on a teacher’s salary?”

    Daniel makes a good point though – mySociety has a knack for writing bullet-proof scraper code, but wouldn’t it be nice to put those code ninja skills to use optimising a real-time routing algorithm on the underlying data instead?

Daniel, in terms of the generating algorithm, there is one extra niggle. The situation for public transport is very different from car journeys. If there is only a train once an hour, the exact time you arrive at a node really matters.

    For this reason we rather anally like to calculate exact routes, based on arrival at a certain time at the destination. If we had access to the underlying transport model, we’d have to decide what aspects of that are worth keeping, in return for making more maps, faster.

    The same timing problem occurs for road journeys as well, if you allow for congestion. Imagine that in addition to route finding, we also had typical journey times on each road at each minute of the day for each day of the week.

    One other thing – for the purpose of working out where to live, there are of course endless other data sets you might like to overlay and adjust with sliders. For example, more single women / more single men, more republican / more democrat, journey time to nearest hospital, wind speed for your personal turbine, air pollution levels for asthmatics.

    And you might also like to combine them – add up the travel times to the place of work of both you and your boyfriend. And factor in the journey time to school as well.
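[Francis’s hourly-train niggle is what makes transit routing time-dependent: an edge’s cost depends on when you reach its tail, because you must wait for the next departure. A sketch of a forward earliest-arrival search over a timetable (mySociety actually works backwards from a target arrival time at the destination, but the time-dependence is the same; stop names and times here are invented):]

```python
import heapq
import bisect

def earliest_arrival(start, depart_at, timetable):
    """Earliest-arrival search over a timetable.

    `timetable` maps (from_stop, to_stop) -> sorted list of
    (departure_minute, arrival_minute) pairs. Unlike a road graph,
    edge cost depends on when you arrive: you wait for the next run.
    """
    edges_from = {}
    for (a, b), runs in timetable.items():
        edges_from.setdefault(a, []).append((b, runs))
    best = {start: depart_at}
    heap = [(depart_at, start)]
    while heap:
        t, stop = heapq.heappop(heap)
        if t > best.get(stop, float('inf')):
            continue  # stale entry
        for nbr, runs in edges_from.get(stop, []):
            # First service departing at or after our arrival time t.
            i = bisect.bisect_left(runs, (t, -1))
            if i == len(runs):
                continue  # no more services today
            arrive = runs[i][1]
            if arrive < best.get(nbr, float('inf')):
                best[nbr] = arrive
                heapq.heappush(heap, (arrive, nbr))
    return best

# Hourly train A->B, frequent bus B->C: arriving at A at minute 10
# means a 50-minute wait for the 60-minute departure.
timetable = {
    ('A', 'B'): [(0, 30), (60, 90), (120, 150)],
    ('B', 'C'): [(m, m + 10) for m in range(0, 180, 5)],
}
print(earliest_arrival('A', 10, timetable))
# B is reached at minute 90, C at minute 100
```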

  • And in case you think Francis is kidding about the endless other data sets, here’s the kind of thing someone does when the data is at their finger-tips:

    “…concentrations of popular bars, low crime rate, lots of tech workers and digerati, newer homes, female motorcycle riders…” – http://www.geocommons.com/workspace/show/157