Carl Malamud gave the opening keynote at the Government 2.0 Summit yesterday and greatly magnified my disappointment at having missed the event. If you’ve seen Carl speak, you know he is one person on the agenda who won’t give a “presentation” or a “talk.” Carl is an orator from a different era. He gives speeches. Rousing, moving, elevating speeches that turn our shared history into a kind of sermon; that inform us and inspire our better angels in equal measure. This is the kind of speech whose stirring coda lifts an audience to its feet and leaves them hitting replay in their heads — not just to pilfer the richest sound bites for their tweet streams, but to gather it all in.
I absolutely agree with Carl that the failure in government IT is a failure to govern. We simply can’t do without these things that we are trying so hard to build. Information technology is the infrastructure of governance, yet we fail to deliver it over and over again, and at great cost.
Carl’s prescriptions for open data, open systems, and government copyright reform are also right on. However, after years of observing the system from the inside, I have come to believe they aren’t enough.
While much can be said about the business practices of the “beltway bandits,” they are not exactly the modern analogs to those turn-of-the-century pharma companies Carl describes. In the world before the FDA those exploiters waged an asymmetric and undefended war on American consumers. The beltway bandits are waging a war too, but one with more conventional symmetry — where the government on the other side is heavily armed with a bureaucracy of its own.
We are witnesses to an arms race of growing bureaucratic complexity, where absent market forces are inadequately replaced with the Federal Acquisition Regulation and binders full of subordinate rules, and where each additional regulation, intended to de-risk the process of building software, instead adds risk by delaying the delivery of anything actually useful.
The end result is hardly a lion preying on sheep. A better analogy might be Mothra and Godzilla locked in a death embrace. If your view is from the side of the government, it is easy to place the blame on the beltway bandits. However, get inside their world a little bit and you may see them as a bit less rapacious and a lot more hogtied. When the customer is more interested in earned value reports and paperwork than working software, well, that’s what they get. The beltway bandits are without a doubt a Frankenstein’s monster, but it took a Dr. Frankenstein to build them.
Really though, I don’t care one whit who is to blame. The real issue is that all of this complexity of product and process keeps out participants, innovation, and success. Open access systems can experiment, innovate, and deliver things while closed or limited access systems evolve to deliver rents — in the economic sense.
In attempting to regulate a market that has inadequate competition, the government has inadvertently erected a bureaucracy that burdens market entry and facilitates the taking of uneconomic rents by the same beltway bandits it is trying to regulate. We should not be surprised that our creation, essentially a regulated quadopoly, is neither efficient nor innovative.
Carl’s focus on open data, open systems, and misuse of intellectual property is important and relevant, as they will all contribute to moving government IT back into open access territory. But we will have to deal with the market and incentive factors as well — and they are probably bigger. In other words, demonizing the bandits without addressing the root cause — the lock-in incentives inherent in a single-customer market — will just lead to new ways to lock them in.
Shifting gears a bit, there is another area I would like to parse a bit further. The other major problem with government IT is the problem of enterprise IT in general, but at even larger scale. Some of this stuff is just really friggin’ complicated. And I’m just not convinced we know how to build some of these systems, at this scale, inside this rate of technological change. Eliminate the added complexity of working with the government for a moment and ask yourself, do repeatable best practices even exist for specifying, planning, delivering, and operating systems at the scale of the Navy Marine Corps Intranet (400,000 nodes and thousands of applications)? Do they even exist for the example Carl used, the National Archives system?
Before you jump all over me, notice exactly how I worded the question: specify, plan, deliver, and operate. This is the classic systems engineering approach to developing IT. It is reductionist, assumes a reasonably stable technology ecosystem and problem space, and relies on complex planning and execution to deliver.
I am beginning to be of the opinion that this approach fundamentally won’t work for systems at the scale that government often finds itself building, especially as they interconnect into ever larger and more complex wholes — even if bureaucracy weren’t an impediment.
Consider this analogy: Our economy consists of countless businesses, each one to a large degree planned, hierarchical, and reductionist in outlook. And while they often acquire each other and grow very large, so far no single company has grown to swallow our entire economy. And if there is a lesson in the Soviets’ five-year plans, none will. These planned entities have natural limits (based on current management science and systems) to how large they can become before they grow unwieldy, and beyond that we rely on market mechanisms to coordinate them in an emergent way.
So, if large scale software systems are like that, what do you do if you want one that is bigger or more complex than our plans-oriented methodologies can deliver? Well, our government has been busily demonstrating that you don’t do it by planning harder and de-risking more aggressively.
What Carl intuitively grasps when he suggests open data and open systems, even if he doesn’t say it outright, is that government IT must recognize that it is entering a different realm, one where we need to abandon planning beyond a certain scale and adopt an approach that intentionally facilitates emergence. Intentional emergence isn’t planning faster or harder; it’s about structuring incentives, policies, and ecosystems to encourage the complex to emerge from the simple. This may not mean a fully atomized simplicity, but may come to look like our economy, where pockets of planning coexist in an emergent ecosystem.
As on the web, software/systems engineering will still exist in the nodes, but the coordinating signals across the whole must be economic. Systems engineering simply isn’t equipped to operate at that scale and complexity. To cope we have to make a cognitive shift from planning, reductionism, and hierarchy to flattened networks and emergence, and put specifics such as open source, open systems, and intellectual property policy into this broader framework.
I’m not exactly certain how to replace systems engineering as the basis for large system emergence, but I have some ideas. They draw inspiration from the transition from a planned to a market economy. Reforms to government IT should look less like a more comprehensive CMMI and more like China’s market reforms of the 1980s — less about systems engineering and more about ecosystem engineering focused on incentives and policy.
As a starting point, we might ask the question: how might a NARA emerge in bits and pieces if a decentralized meta-organization of government entities and citizens had budgetary and cultural incentives to contribute along a path of least resistance that encouraged interoperability? And what should the FAR and other policy say to encourage rather than prohibit such an outcome?