Four short links: 6 October 2009

Birdwatching Technology, Transportation Data, Multitouch in Python, and Face Detection on the iPhone

  1. Bird-watching Turns To Technology (BBC) — CCTV-esque automated bird watching. Sensor networks + computer vision for an ecological purpose. In a bid to track the guillemots' behaviour, Dr Dickinson is refining established work that involves modelling the visual structure of an area around a nest. The computer system will be able to use this model to identify changing elements in the scene, and determine if they correspond to movement by a guillemot. “That is the typical way of doing surveillance,” said Dr Dickinson, “work out what’s moving, that gives you an idea about what is interesting in a scene.” (A rough OpenCV sketch of this kind of motion detection follows the list.)
  2. The Case for Open MTA Data — If you live in Portland, there are dozens of mobile applications that help fill gaps in transit information. You can check your phone to see when the next bus is supposed to come. You can plan a trip from one unfamiliar part of town to another. You can even have your mobile device buzz if you fall asleep before reaching your destination. For the basic stuff, there’s no iPhone necessary (although that certainly helps for information luxuries). Anyone who has a plain old cell phone with text messaging can ride the train or the bus with greater ease thanks to these apps. (via Making Light)
  3. PyMT — a Python module for developing multi-touch enabled, media-rich applications. Currently the aim is to allow for quick and easy interaction design and rapid prototype development. There is also a focus on logging tasks or sessions of user interaction to quantitative data and the analysis/visualization of such data. (A hello-world sketch appears after the list.)
  4. Near Realtime Face Detection on the iPhone with OpenCV Port — we’re probably only one or two revisions of iPhone hardware away from being able to do some serious computer vision tasks on the handset. The proof of concept adds a tie to the face you’re pointing the camera at. (A desktop equivalent of the detection loop is sketched below.)
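
For the first link, here is a rough sketch of the "work out what's moving" approach using OpenCV's Python bindings, not Dr Dickinson's actual system: a background-subtraction model flags pixels that deviate from the learned scene, and connected blobs of change become candidate bird movements. The video path, thresholds, and blob-size cutoff are placeholders.

```python
import cv2

# Open a video source; "nest_cam.mp4" is a placeholder path.
cap = cv2.VideoCapture("nest_cam.mp4")

# MOG2 keeps a per-pixel statistical model of the background, roughly the
# "model of the visual structure of the scene" the article describes.
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Pixels that deviate from the background model come back as foreground.
    fg_mask = subtractor.apply(frame)
    fg_mask = cv2.morphologyEx(fg_mask, cv2.MORPH_OPEN, kernel)  # drop speckle noise

    # Connected blobs of change are the "interesting" regions to inspect.
    # (OpenCV 4.x returns (contours, hierarchy) here.)
    contours, _ = cv2.findContours(fg_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) > 200:  # size cutoff is a guess, not a calibrated value
            x, y, w, h = cv2.boundingRect(c)
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

    cv2.imshow("motion", frame)
    if cv2.waitKey(30) & 0xFF == 27:  # Esc quits
        break

cap.release()
cv2.destroyAllWindows()
```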
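For the third link, a minimal PyMT hello-world sketch along the lines of the project's own examples; the widget and function names follow early PyMT documentation, so treat the exact signatures as assumptions rather than a definitive API reference.

```python
# Minimal PyMT sketch: a touch-enabled window containing one button.
from pymt import *

win = MTWindow()                           # top-level window that receives touch events
win.add_widget(MTButton(label='Hello, multitouch'))
runTouchApp()                              # start PyMT's touch event loop
```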
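And for the fourth link, a desktop OpenCV face-detection loop in Python that mirrors what the iPhone port does on-device: a Haar cascade scanned over each grayscale frame. The camera index and detection parameters here are illustrative; the actual iPhone demo drives OpenCV through its C API.

```python
import cv2

# The frontal-face Haar cascade shipped with the opencv-python package.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
face_cascade = cv2.CascadeClassifier(cascade_path)

cap = cv2.VideoCapture(0)  # default webcam; the demo uses the iPhone camera instead

while True:
    ok, frame = cap.read()
    if not ok:
        break

    gray = cv2.equalizeHist(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))

    # Loose parameters trade accuracy for speed, the same compromise a
    # near-realtime mobile port has to make.
    faces = face_cascade.detectMultiScale(
        gray, scaleFactor=1.2, minNeighbors=4, minSize=(40, 40))

    for (x, y, w, h) in faces:
        # The iPhone demo overlays a tie below each detected face; here we
        # just draw the bounding box it would be anchored to.
        cv2.rectangle(frame, (x, y), (x + w, y + h), (255, 0, 0), 2)

    cv2.imshow("faces", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
        break

cap.release()
cv2.destroyAllWindows()
```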