Networked sensors and machine learning make it easy to spot when something is amiss.
Much of health care — particularly for the elderly — is about detecting change, and, as the mobile health movement would have it, computers are very good at that. Given enough sensors, software can model an individual’s behavior patterns and then figure out when things are out of the ordinary — when gait slows, posture stoops or bedtime moves earlier.
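The article doesn't spell out how such change detection works, but the core idea can be sketched with a toy example: learn a baseline for one daily metric (here, a hypothetical bedtime recorded in minutes after noon) and flag days that deviate sharply from it. The metric, numbers, and z-score threshold are all illustrative, not from the researchers' system.

```python
from statistics import mean, stdev

def flag_changes(baseline, new_values, z_threshold=3.0):
    """Flag each new value whose z-score against the baseline exceeds the threshold."""
    mu, sigma = mean(baseline), stdev(baseline)
    return [abs(v - mu) / sigma > z_threshold for v in new_values]

# A week of bedtimes clustered around 10 p.m. (600 minutes after noon).
baseline = [600, 610, 605, 595, 600, 615, 605]
print(flag_changes(baseline, [608, 500]))  # → [False, True]: 8:20 p.m. stands out
```

Real systems would track many such metrics at once (gait speed, posture, room transitions) and use richer models, but the principle is the same: model the routine, then alert on deviation.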
Technology already exists that lets users set parameters for households they're monitoring. Systems are available that send an alert if someone leaves the house in the middle of the night or sleeps past a preset time. Those systems involve context-specific hardware (e.g., a bed-pressure sensor) and conscientious modeling (you have to know what time your grandmother usually wakes up).
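Such preset-parameter systems amount to hand-written rules. A minimal sketch, with entirely hypothetical event names and thresholds (no specific commercial product works exactly this way):

```python
from datetime import time

def should_alert(event_type, event_time, wake_deadline=time(9, 0)):
    """Return True if a sensor event matches a hand-set alert rule."""
    # Rule 1: the front door opens in the middle of the night.
    if event_type == "door_open" and time(0, 0) <= event_time < time(5, 0):
        return True
    # Rule 2: the bed-pressure sensor is still active past the preset wake time.
    if event_type == "still_in_bed" and event_time >= wake_deadline:
        return True
    return False
```

The weakness the article points to is visible here: every rule, sensor placement, and threshold (the 9 a.m. wake deadline, say) must be configured by someone who already knows the household's routine.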
The next step would be a generic system: one that, after a simple setup, would learn the habits of the people it monitors and then detect the sorts of problems that beset elderly people living alone (falls, disorientation, and so forth) as well as more subtle changes in behavior that could signal other health problems.
A group of researchers from Austria and Turkey has developed just such a system, which they presented at the IEEE’s Industrial Electronics Society meeting in Montreal in October.*
Activity in different rooms, as inferred by the researchers' machine-learning algorithms. Source: "Activity Recognition Using a Hierarchical Model."
In their approach, the researchers train a machine-learning algorithm on several days of routine household activity, using door and motion sensors distributed through the living space. The sensors aren't associated with any particular room at the outset: the software first infers the sensors' relative positions, then classifies the rooms they're in based on activity patterns over the course of the day.
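The paper's hierarchical model isn't detailed in the article, but the first step (grouping unlabeled sensors by location) can be illustrated with a much simpler stand-in: treat sensors that repeatedly fire close together in time as belonging to the same room. The window, threshold, and union-find grouping below are my own illustrative choices, not the researchers' method.

```python
from collections import defaultdict

def group_sensors(events, window=30.0, min_cooccur=3):
    """events: list of (timestamp_seconds, sensor_id).
    Returns sets of sensors that co-activate often, as a proxy for rooms."""
    events = sorted(events)
    cooccur = defaultdict(int)
    # Count how often each pair of distinct sensors fires within the window.
    for i, (t_i, s_i) in enumerate(events):
        for t_j, s_j in events[i + 1:]:
            if t_j - t_i > window:
                break  # events are sorted, so no later pair can qualify
            if s_i != s_j:
                cooccur[frozenset((s_i, s_j))] += 1
    # Union-find: merge sensors whose co-activation count meets the threshold.
    parent = {}
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    for pair, n in cooccur.items():
        if n >= min_cooccur:
            a, b = pair
            parent[find(a)] = find(b)
    groups = defaultdict(set)
    for _, s in events:
        groups[find(s)].add(s)
    return list(groups.values())

# Sensors A and B fire together three times; C fires alone elsewhere.
evs = [(0, 'A'), (5, 'B'), (100, 'A'), (102, 'B'),
       (200, 'A'), (205, 'B'), (1000, 'C'), (2000, 'C')]
print(sorted(sorted(g) for g in group_sensors(evs)))  # → [['A', 'B'], ['C']]
```

A second stage, as the article describes, would then label each inferred group by its daily activity profile, e.g. a group active mostly at night is probably a bedroom.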