The end of integrated systems

The programmable world will increasingly rely on machine-learning techniques that can interact with human interfaces

I always travel with a pair of binoculars and, to the puzzlement of my fellow airline passengers, spend part of every flight gazing through them at whatever happens to be below us: Midwestern towns, Pennsylvania strip mines, rural railroads that stretch across the Nevada desert. Over the last 175 years or so, industrialized America has been molded into a collection of human patterns–not just organic New England villages in haphazard layouts, but also the Colorado farm settlements surveyed in strict grids. A close look at the rectangular shapes below reveals minor variations, though. Every jog that a street takes is a testament to some compromise between mankind and our environment that results in a deviation from mathematical perfection. The world, and our interaction with it, can’t conform strictly to top-down ideals.

We’re about to enter a new era in which computers will interact aggressively with our physical environment. Part of the challenge in building the programmable world will be finding ways to make it interact gracefully with the world as it exists today. In what you might call the virtual Internet, human information has been squeezed into the formulations of computer scientists: rigid relational databases, glyphs drawn from UTF-8, information addressed through the Domain Name System. To participate in it, we converse in the language and structures of computers. The physical Internet will need to understand much more flexibly the vagaries of human behavior and the conventions of the built environment.


A few weeks ago, I found myself explaining the premise of the programmable world to a stranger. He caught on when the conversation turned to driverless cars and intelligent infrastructure. “Aha!” he exclaimed, “They have these smart traffic lights in Los Angeles now; they could talk to the cars!”

The idea that centralized traffic computers will someday communicate wirelessly with vehicles, measuring traffic conditions and directing cars, was widely discussed a few years ago, but this time it struck me as outdated. Computers can easily read traffic signals with machine vision the same way that humans do–by interpreting red, yellow, and green lights in the context of an intersection’s layout. On the other side of the exchange, traffic signals often use cameras to analyze traffic volumes and adjust their timing accordingly. Once these kinds of flexible-sensing technologies are perfected, there will be no need for rigid, integrated systems to link cars to traffic lights. The pattern will repeat itself in other cases where software gathers data from sensors and controls physical devices.
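Color classification of a located signal lamp is one small, tractable piece of that machine-vision task. Here's a minimal sketch in Python using only the standard library, with made-up RGB values; a real system would first find the lamp in a camera frame, typically with a computer-vision library doing the heavy lifting:

```python
import colorsys

def classify_light(r, g, b):
    """Classify a lamp's dominant pixel color as red, yellow, or green.

    A toy stand-in for the machine-vision step: real systems locate the
    lamp in the camera frame first, then classify its color.  RGB values
    are 0-255.
    """
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    if v < 0.3 or s < 0.3:
        return "off"          # too dark or too washed out to call
    hue = h * 360             # hue in degrees
    if hue < 30 or hue > 330:
        return "red"
    if hue < 75:
        return "yellow"
    if hue < 170:
        return "green"
    return "unknown"

# A bright red lamp and a bright green lamp:
print(classify_light(230, 40, 30))   # red
print(classify_light(40, 220, 90))   # green
```

The interesting engineering, of course, is everything around this function–finding the signal head, handling glare and occlusion, and reading the intersection’s layout–but the point stands: the car adapts to the light, rather than the light being rebuilt for the car.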

The world is becoming programmable, with physical devices drawing intelligence from layers of software that can connect to widespread sensor networks and optimize entire systems. Cars drive themselves, thermostats silently adjust to our apparent preferences, wind turbines gird themselves for stormy gusts based on data collected from other turbines. These services are increasingly enabled not by rigid software infrastructure that connects giant systems end-to-end, but by ambient data-gathering, Web-like connections, and machine-learning techniques that help computers interact with the same user interfaces that humans rely on.

We’re headed for a rehash of the API-versus-Web-scraping debate, and scraping is poised for an even greater advantage in the physical world, since its conventions have been under refinement for millennia. It’s remarkably easy to differentiate a men’s room door from a women’s room door, or to tell whether a door is locked by looking at the deadbolt, and software that can understand these distinctions is already available.

There will, of course, still be a very big place for integrated systems in the physical Internet. Environments built from scratch; machines that operate in formalized, well-contained environments; and systems whose first priority is reliability and clarity will all make good use of traditional APIs. Trains will always communicate with dispatchers through formal integrated systems, just as Twitter’s API is likely the best foundation for a Twitter app regardless of how sophisticated your Twitter-scraping software might be.

But building a dispatching system for a railroad is nothing like developing a car that could communicate, machine-to-machine, with any traffic light it might encounter. Governments would need to invest billions in retrofitting their signaling systems, cars would need to accommodate the multitude of standards that would emerge from the process, and for adoption to be practical, traffic-signal retrofits would need to be universal. Better to build systems gently, layer by layer, so that they accept human convention as it already exists–and the technology to do that is emerging now.

A couple of other observations about the superiority of loosely-coupled services in the physical world:

  • Google Maps, Waze, and Inrix all harvest traffic data ambiently from connected devices that constantly report location and speed. The alternative, described before networked navigation devices were common, was to gather this data through connected infrastructure. State departments of transportation, spurred by the SAFETEA-LU transportation bill of 2005 (surely the most tortured acronym in Washington), invested heavily in “smart transportation” infrastructure like roadway sensors and digital overhead signs, with the aim of, for instance, directing traffic around congestion. Compared to any of the Web services, these systems are slow to react, have poor coverage, are inflexible, and are incredibly expensive. The Web services make collateral use of infrastructure–gathering data quietly from devices that were principally installed for other purposes.
  • A few weeks ago I spent a storm-drenched six hours in one of O’Hare Airport’s lower-ring terminals. As we waited for the storm to clear and flights to resume, we watched the data screen over the gate–the Virgil in our inferno–which periodically showed a weather map and updated our predicted departure time. It was clear to anyone with a smartphone, though, that the information it displayed, at the end of a long chain of data transmission and reformatting that went from weather services and the airline’s management system through the airport’s display system, was out of date. The flight updates arrived on the airline’s Web site 10 to 15 minutes before they appeared on the gate screen; the weather map, which showed our storm crossing the Mississippi River even as it appeared over the airfield, was a consistent hour behind. By following updates on the Internet, customers were able to subvert an otherwise-rigid and hierarchical information flow, finding data where it was freshest and most accurate among loosely-coupled services online. Software that can automatically seek out the highest-quality information, flexibly switching between sources, will in turn provide the highest-quality intelligence.
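That last idea–software that prefers whichever source updated most recently–can be sketched in a few lines. A toy example in Python, with hypothetical feeds, timestamps, and field names (in practice each report would come from a separate API or scrape):

```python
from datetime import datetime, timedelta

# Hypothetical status reports for one flight from three sources.
reports = [
    {"source": "gate display", "departure": "20:45",
     "as_of": datetime(2014, 6, 30, 19, 10)},
    {"source": "airline website", "departure": "21:05",
     "as_of": datetime(2014, 6, 30, 19, 24)},
    {"source": "flight tracker", "departure": "21:00",
     "as_of": datetime(2014, 6, 30, 19, 22)},
]

def freshest(reports, max_age=timedelta(minutes=30),
             now=datetime(2014, 6, 30, 19, 25)):
    """Prefer the most recently updated source, discarding stale ones."""
    usable = [r for r in reports if now - r["as_of"] <= max_age]
    return max(usable, key=lambda r: r["as_of"])

best = freshest(reports)
print(best["source"], best["departure"])   # airline website 21:05
```

A stranded traveler with a smartphone runs this algorithm by hand; the point is that software can run it continuously, across many more sources, without caring how rigid or hierarchical any one source’s internal data flow is.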

The API represents the world on the computers’ terms: human output squeezed into and constrained by programming conventions. Computers are on their way to participating in the world on human terms, accepting winding roads and handwritten signs. In other words, they’ll soon meet us halfway.


  • jcanosa

    Jon – excellent insight. There has been a lot of myopic vision that the Internet of Things is some giant centralized big data system with sensors feeding into it. As you correctly point out, this couldn’t be further from reality or practicality. The future belongs to distributed application intelligence *and* distributed data using a decentralized/federated model. With Moore’s law and some phenomenal advances in connected application development such as ThingWorx helping us along, we believe it is the only practical way for the IoT to really take off.

  • wilbur

    If the (sub) system’s not integrated, then the job of integration just moves to the app. OK, big win for app capability, but likely a suboptimal result. Think of this as a bridge to better times.

  • utilitus

    From today’s EETimes:

    “NIST and the interested companies laid out five areas the consortium’s framework could address in a meeting in March. They included defining an architecture for:

    Co-engineering cyber and physical systems

    Identifying cyber-security issues and solutions

    Addressing concerns about interoperability

    Identifying ways to maintain robust wireless connections

    Setting standards for real-time data collection and analytics

    “The trick is to look at all these issues holistically rather than domain by domain,…”

    See: http://www.eetimes.com/document.asp?doc_id=1319162&

  • jpattinson

    A thought-provoking article, which is really about data sources – how do you efficiently get the data you need to help you resolve your problem at the right time and place? Sometimes you can use your eyes and ears (or locally sourced data in the case of isolated traffic signals) and sometimes you need more help (urban traffic control systems using area-wide data and, maybe when useful, eventually linked to vehicle systems).
    I often have the same problem you had at the airport. When waiting for a train on my daily commute, the platform signs say the train is on time, but my 10 years’ experience says the train is late – because it is past the time it should have left. The system is delivering information that is useless, because it has a weak link somewhere.
    Your weather app is also at the end of a long chain of integrated systems; at the time, it just happened to be a more efficient chain than the airport could muster. With current technology the language of integration is necessarily simplistic and slow. This limits the richness and timeliness of the conversation, but this is changing over time.
    Integration is needed now and in the future as the world gets more complex. Its objective is to deliver timely data that is useful information when and where we need it. No easy task, but one that we continue to try to deliver, with small improvements all the time.

  • kylesamani

    reminds me of a post I wrote some time ago, Quantified Civilization

    http://kylesamani.com/blog/2013/4/21/quantified-earth-and-civilziation