There are bigger issues surrounding the .gov review

The efforts behind .gov reform go beyond domain management.

The United States federal government is reforming its Internet strategy. In the context of looming concerns about the debt ceiling, high unemployment, wars abroad, rising healthcare costs, and the host of other issues that the White House and Congress should be addressing, that might seem like a side issue.

It’s not. The federal government spends some $80 billion every year on information technology. If you’re paying any attention at all to government IT, you know that Uncle Sam is not getting his money’s worth from that investment. Furthermore, the federal government has not been getting the kinds of returns in productivity or efficiency that the private sector has enjoyed over the past decade or so. Former White House OMB director Peter Orszag called that disparity the “IT gap” last year. So when the Obama administration launched a new initiative on Web reform this summer, it might have seemed a little overdue.

Better late than never, and better now than later.

Citizens are turning to the Internet for government data and services in unprecedented numbers, and they’re expecting to find answers, applications and, increasingly, people. While public servants can join the conversations on social networks to address the latter demand, delivering improved information and e-services confronts the federal government with some tough choices, given budget constraints. That challenge is one reason federal agencies are looking to the general public and the private sector for ideas on how to improve their strategy.

This week, in service of that goal, the White House hosted a livechat on improving federal websites with Macon Phillips, White House director of digital strategy, Vivek Kundra, the federal chief information officer, and Sheila Campbell, director of the GSA’s Center for Excellence in Digital Government. The chat, which has become a standard tool in the White House’s online communications toolkit over the last year, included a livestream from WhiteHouse.gov/live, a Facebook chat and an active Twitter backchannel at the #dotgov hashtag. The White House also took questions through a form on WhiteHouse.gov and its Facebook wall.

These issues aren’t new, of course, even if the tools for discussion have improved. If you’ve been following the Gov 2.0 movement over the years, the question of how government can use the Internet and associated technologies to work better has been at the core of the discussion throughout. Success online used to be measured by having a website, said federal chief information officer Vivek Kundra. As he observed immediately afterwards, “those days are long gone.”

If the federal government is going to reform how it uses the Internet, it will need to learn and apply the lessons that Web 2.0 offers to Gov 2.0, whether it’s standing up open government platforms, leveraging the cloud, crowdsourcing, or making data-driven policy.

Government also needs to stop creating a new .gov website for every new initiative, particularly when those sites aren’t optimized for search engines. There’s some good news here: “Every month, historically, federal agencies would register 50 new domain names,” said Kundra on Tuesday. “That’s been halted.”

This proliferation of federal .gov websites has been an issue for some time — call it “.gov sprawl” — and that’s what’s driven the .gov reform effort in the context of the Obama administration’s campaign to cut government waste. This week, for the first time, a dataset of federal executive branch Internet domains has been published as open government data online. The dataset of federal .gov domains is hosted on Data.gov and has been embedded below:

Federal Executive Branch Internet Domains

“This dataset lists all of the executive branch second-level domains within the top-level .gov domain, and which agencies own them,” commented General Services Administration new media specialist Dan Munz in the Community page for the dataset. “As White House Director of Digital Strategy Macon Phillips has pointed out (see “TooManyWebsites.gov“), while many of these domain names point to sites that are valuable, some are duplicative or unnecessary. That makes it harder to manage the .gov domain, impairs agencies’ ability to distribute information, and creates a user experience for citizens that just isn’t as good as it could or should be. How can we fix that? Over the coming months, we’ll have a plan for streamlining the federal executive branch webspace, and we want to invite you into the conversation. We’re releasing this dataset as a first step, so that you can explore, comment, remix, and maybe even use the data to map the .gov domain in ways we haven’t seen before.”
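
For a sense of the kind of exploration Munz is inviting, here is a minimal sketch that tallies second-level .gov domains by owning agency. It assumes the dataset has been downloaded from Data.gov as a local CSV; the filename and the “Domain Name”/“Agency” column headers below are assumptions for illustration, not the published schema, so adjust them to match the actual export.

```python
# Sketch: count second-level .gov domains per agency from a local copy of the dataset.
# Assumes a CSV named "gov_domains.csv" with an "Agency" column -- both are
# placeholders; check the file actually published on Data.gov for the real names.
import csv
from collections import Counter

domains_per_agency = Counter()

with open("gov_domains.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        agency = (row.get("Agency") or "").strip() or "Unknown"
        domains_per_agency[agency] += 1

# Agencies with the most registered .gov domains -- a rough map of ".gov sprawl".
for agency, count in domains_per_agency.most_common(10):
    print(f"{count:4d}  {agency}")
```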

Why reforming .gov matters

This effort is not impressing all observers. Micah Sifry, the co-founder of the Personal Democracy Forum, has called the move to delete redundant websites “cheap, dumb and cynical” at techPresident. “Redundant government websites probably cost the taxpayer a fraction of what we spend on military bands, let alone what we spend on duplicative and unnecessary government websites promoting the Army’s, Navy’s, Air Force’s, Merchant Marine’s, Naval Academy’s, and Coast Guard’s bands’ websites! (According to NPR, the Marines spend $50 million a year on their bands, and the Army $198 million.)” In a larger sense, Sifry argued, “if you are really serious about eliminating stupid and pointless spending, then you’d be pushing for laws to strengthen protections for government whistleblowers (instead of going on a stupid and pointless rampage to prosecute them!), since insiders know where the real waste is hidden.”

Sifry is absolutely right on one count: the amount of money up for savings through reducing federal .gov websites is dwarfed by what could be saved by, say, reducing Medicare fraud with new data analytics tools or finding cost savings in defense spending. Reducing the number of federal .gov websites by 90% would not significantly address the federal deficit. The biggest federal cost savings cited in this week’s .gov livechat likely came from Kundra, who said that 137 federal data centers, each of which consumes immense amounts of energy, will be closed by the end of this calendar year.

Where Sifry may have been overly harsh in his critique is in not acknowledging how progressive a perspective the White House appears to have embraced here. (Progressive meaning “forward-thinking,” not political ideology, in this case.) Democratizing healthcare data so that it shows up in search engine results or is integrated into applications makes it more useful, argued Kundra, citing the improvements to hospitalcompare.gov. Moving from a static website to a universe of applications and services provisioned by open government data is shifting from a Web 1.0 vision to a Web 2.0 reality. In a country where 35% of citizens have a smartphone, delivering services and providing information to a mobile audience has to be factored into any online strategy, whether in the public or private sector. And, in most cases, it’s the private sector that will be able to create the best applications that use that data, if government acts as a platform to empower civic coders. Phillips acknowledged that explicitly. “The best mobile apps,” he said, “are going to be driven by the private sector making use of public data.”

If e-government is going to move toward “We-government” — as Sifry has described the growing ecosystem of civic media, technology-fueled transparency advocates and empowered citizens — government data and services will need to be discoverable where and when people are looking for them. That, ultimately, is part of what getting .gov reform right needs to be about, beyond straightforward cost savings.

Kundra asked the broader community to “help us think through how we’re going to provide services over the mobile Internet.” If, as he said, search is now the default way people look for information, then releasing high-quality open data about government spending, the financial industry, healthcare, energy, education, transportation, legislation and campaign finance would be a reasonable next step. Tim O’Reilly has been sharing a simple piece of advice with the architects of platforms for years: “Don’t make people find data. Make data find the people.”

The .gov reform, in that context, isn’t just about reducing the number of websites and saving associated design or maintenance costs. It’s about reducing the need to ever visit a website to retrieve the information or services a citizen requires. In the years ahead, it will be up to Congress and Kundra’s successor as federal CIO — along with whomever he or she reports to in the Oval Office — to get that part of “web reform” done.

UPDATE: Mike Rupert, who works on open government at Washington-based PhaseOne Consulting, raised an important point about the .gov review: content and analytics are key. Specifically, Rupert highlighted the importance of good communicators who explain what matters in plain language.

In all of the blog posts, tweets, and news items around this effort, however, there is little discussion about ensuring government websites have fresh and relevant content in plain language that people can relate to/understand. Nor is there any discussion about using real-time analytics to determine what content isn’t connecting with citizens, making revisions based on what people are looking for, and creating/expanding content based on demand.

There is very little discussion about ensuring basic search engine optimization tactics – metadata, friendly URLs, links, site maps, key words in headers, key words, tags – are being used so customers can find the information they want and need. Whether we have 2 or 2,000 or even 200,000 .gov domains, if the citizens can’t find our great stuff, we are wasting precious time and resources.

With Google, Bing, and other search engines crawling more than 130 million active domains worldwide, we need to be sure citizens can find what they need. It doesn’t matter what we name the web site, how many apps we build, or how intuitive our e-service interface is… because no matter what, if people can’t find it, they can’t find it.
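
As a rough illustration of the kind of basic check Rupert is describing, the sketch below fetches a page and looks for a few of the SEO signals he lists (a title, a meta description, a canonical link). The URL is a placeholder and the parsing is deliberately simple; this is a sketch of the idea, not an audit tool.

```python
# Sketch: check a page for a few of the basic SEO elements Rupert mentions.
# The URL below is a placeholder -- swap in any page you want to inspect.
from html.parser import HTMLParser
from urllib.request import urlopen

class SEOCheck(HTMLParser):
    def __init__(self):
        super().__init__()
        self.found = {"title": False, "meta description": False, "canonical link": False}
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.found["meta description"] = True
        elif tag == "link" and (attrs.get("rel") or "").lower() == "canonical":
            self.found["canonical link"] = True

    def handle_data(self, data):
        if self._in_title and data.strip():
            self.found["title"] = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

url = "https://www.usa.gov/"  # placeholder page to inspect
checker = SEOCheck()
checker.feed(urlopen(url).read().decode("utf-8", errors="replace"))
for element, present in checker.found.items():
    print(f"{element}: {'present' if present else 'missing'}")
```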

Rupert is not alone in raising these issues. Vanessa Fox has written here about how government transparency relates to search data, including tips for making government websites searchable. White House OIRA Administrator Cass Sunstein has written that plain writing should be seen as an essential part of open government. If citizens are going to be asked to participate in government, “what” is being asked of them needs to be clear — and it needs to be findable.
