There are bigger issues surrounding the .gov review

The efforts behind .gov reform go beyond domain management.

The United States federal government is reforming its Internet strategy. In the context of looming concerns about the debt ceiling, high unemployment, wars abroad, rising healthcare costs, and the host of other issues that the White House and Congress should be addressing, that might seem like a side issue.

It’s not. The federal government spends some $80 billion every year on information technology. If you’re paying any attention at all to government IT, you know that Uncle Sam is not getting his money’s worth from that investment. Furthermore, the federal government has not been getting the kinds of returns in productivity or efficiency that the private sector has enjoyed over the past decade or so. Former White House OMB director Peter Orszag called that disparity the “IT gap” last year. So when the Obama administration launched a new initiative on Web reform this summer, it might have seemed a little overdue.

Better late than never, and better now than later.

Citizens are turning to the Internet for government data and services in unprecedented numbers, and they’re expecting to find answers, applications and, increasingly, people. While public servants can join the conversations on social networks to address the latter demand, delivering improved information and e-services confronts the federal government with some tough choices, given budget constraints. That challenge is one reason agencies are looking to the general public and the private sector for ideas on how to improve their strategy.

This week, in service of that goal, the White House hosted a livechat on improving federal websites with Macon Phillips, White House director of digital strategy, Vivek Kundra, the federal chief information officer, and Sheila Campbell, director of the GSA’s Center for Excellence in Digital Government. The chat, which has become a standard tool in the White House’s online communications toolkit over the last year, included a livestream from WhiteHouse.gov/live, a Facebook chat and an active Twitter backchannel at the #dotgov hashtag. The White House also took questions through a form on WhiteHouse.gov and its Facebook wall.

These issues aren’t new, of course, even if the tools for discussion have improved. And if you’ve been following the Gov 2.0 movement over the years, the question of how government can use the Internet and associated technologies to work better has been at the core of the discussion throughout. Success online used to be measured by having a website, said federal chief information officer Vivek Kundra. As he observed immediately afterwards, “those days are long gone.”

If the federal government is going to reform how it uses the Internet, it will need to learn and apply the lessons that Web 2.0 offers to Gov 2.0, whether it’s standing up open government platforms, leveraging the cloud, crowdsourcing, or making data-driven policy.

Government is also going to need to stop creating a new .gov website for every new initiative, particularly when those sites aren’t optimized for search engines. There’s some good news here: “Every month, historically, federal agencies would register 50 new domain names,” said Kundra on Tuesday. “That’s been halted.”

This proliferation of federal .gov websites has been an issue for some time — call it “.gov sprawl” — and it’s what’s driven the .gov reform effort in the context of the Obama administration’s campaign to cut government waste. This week, for the first time, a dataset of federal executive branch Internet domains has been published as open government data online. The dataset is hosted on Data.gov and embedded below:

Federal Executive Branch Internet Domains

“This dataset lists all of the executive branch second-level domains within the top-level .gov domain, and which agencies own them,” commented General Services Administration new media specialist Dan Munz on the Community page for the dataset. “As White House Director of Digital Strategy Macon Phillips has pointed out (see “TooManyWebsites.gov“), while many of these domain names point to sites that are valuable, some are duplicative or unnecessary. That makes it harder to manage the .gov domain, impairs agencies’ ability to distribute information, and creates a user experience for citizens that just isn’t as good as it could or should be. How can we fix that? Over the coming months, we’ll have a plan for streamlining the federal executive branch webspace, and we want to invite you into the conversation. We’re releasing this dataset as a first step, so that you can explore, comment, remix, and maybe even use the data to map the .gov domain in ways we haven’t seen before.”
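
For anyone who wants to take Munz up on that invitation, exploring the dataset doesn’t require much code. The snippet below is only a rough sketch, not part of the official release: it assumes the CSV has been downloaded from Data.gov as “federal-domains.csv” and that it includes an “Agency” column — both names are placeholders, so check the real file’s header row and adjust accordingly.

    # Tally second-level .gov domains per agency from the Data.gov CSV.
    # Assumptions: a local file named "federal-domains.csv" with an
    # "Agency" column -- both are placeholders, not the official names.
    import csv
    from collections import Counter

    domains_per_agency = Counter()

    with open("federal-domains.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            agency = (row.get("Agency") or "").strip() or "Unknown"
            domains_per_agency[agency] += 1

    print("Total second-level .gov domains:", sum(domains_per_agency.values()))
    print("Agencies with the most registered domains:")
    for agency, count in domains_per_agency.most_common(10):
        print("  %4d  %s" % (count, agency))

Even a simple tally like that makes the “sprawl” conversation more concrete, showing at a glance which agencies hold the bulk of the registrations the review will have to sort through.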

Why reforming .gov matters

This effort is not impressing all observers. Micah Sifry, the co-founder of the Personal Democracy Forum, has called the move to delete redundant websites “cheap, dumb and cynical” at techPresident. “Redundant government websites probably cost the taxpayer a fraction of what we spend on military bands, let alone what we spend on duplicative and unnecessary government websites promoting the Army’s, Navy’s, Air Force’s, Merchant Marine’s, Naval Academy’s, and Coast Guard’s bands’ websites! (According to NPR, the Marines spend $50 million a year on their bands, and the Army $198 million.)” In a larger sense, Sifry argued, “if you are really serious about eliminating stupid and pointless spending, then you’d be pushing for laws to strengthen protections for government whistleblowers (instead of going on a stupid and pointless rampage to prosecute them!), since insiders know where the real waste is hidden.”

Sifry is absolutely right on one count: the amount of money up for savings through reducing federal .gov websites is dwarfed by what could be saved by, say, reducing Medicare fraud using new data analytics tools, or by finding cost savings in defense spending. Reducing the number of federal .gov websites by 90% would not significantly address the federal deficit. The biggest federal cost savings mentioned in this week’s .gov livechat were likely those cited by Kundra, when he said that 137 federal data centers would be closed by the end of this calendar year, each of which consumes immense amounts of energy.

Where Sifry may have been overly harsh in his critique is in not acknowledging how progressive a perspective the White House appears to have embraced here. (Progressive meaning “forward-thinking,” not political ideology, in this case.) Democratizing healthcare data so that it shows up in search engine results or is integrated into applications makes it more useful, argues Kundra, citing the improvements to hospitalcompare.gov. Moving from a static website to a universe of applications and services provisioned by open government data is shifting from a Web 1.0 vision to a Web 2.0 reality. In a country where 35% of citizens have a smartphone, delivering services and providing information to a mobile audience has to be factored into any online strategy, whether in the public or private sector. And, in most cases, it’s the private sector that will be able to create the best applications that use that data, if government acts as a platform to empower civic coders. Phillips acknowledged that explicitly. “The best mobile apps,” he said, “are going to be driven by the private sector making use of public data.”

If e-government is going to move toward “We-government” — as Sifry has described the growing ecosystem of civic media, technology-fueled transparency advocates and empowered citizens — government data and services will need to be discoverable where and when people are looking for them. That, ultimately, is part of what getting .gov reform right needs to be about, beyond straightforward cost savings.

Kundra asked the broader community to “help us think through how we’re going to provide services over the mobile Internet.” If, as he said, search is the default way that people look for information now, then releasing high-quality open data about government spending, the financial industry, healthcare, energy, education, transportation, legislation and campaign finance would be a reasonable next step. Tim O’Reilly has been sharing a simple piece of advice with the architects of platforms for years: “Don’t make people find data. Make data find the people.”

The .gov reform, in that context, isn’t just about reducing the number of websites and saving associated design or maintenance costs. It’s about reducing the need to ever visit a website to retrieve the information or access the services a citizen requires. In the years ahead, it will be up to Congress and Kundra’s successor as federal CIO — along with whomever he or she reports to in the Oval Office — to get that part of “web reform” done.

UPDATE: Mike Rupert, who works on open government at Washington-based PhaseOne Consulting, raised an important point about the .gov review: content and analytics are key. Specifically, Rupert highlighted the importance of good communicators who explain what matters in plain language.

In all of the blog posts, tweets, and news items around this effort, however, there is little discussion about ensuring government websites have fresh and relevant content in plain language that people can relate to/understand. Nor is there any discussion about using real-time analytics to determine what content isn’t connecting with citizens, making revisions based on what people are looking for, and creating/expanding content based on demand.

There is very little discussion about ensuring basic search engine optimization tactics – metadata, friendly URLs, links, site maps, keywords in headers, keywords, tags – are being used so customers can find the information they want and need. Whether we have 2 or 2,000 or even 200,000 .gov domains, if the citizens can’t find our great stuff, we are wasting precious time and resources.

With Google, Bing, and other search engines crawling more than 130 million active domains worldwide, we need to be sure citizens can find what they need. It doesn’t matter what we name the web site, how many apps we build, or how intuitive our e-service interface is… because no matter what, if people can’t find it, they can’t find it.

Rupert is not alone in raising these issues. Vanessa Fox has written here about how government transparency relates to search data, including tips for making government websites searchable. White House OIRA Administrator Cass Sunstein has written that plain writing should be seen as an essential part of open government. If citizens are going to be asked to participate in government, “what” is being asked of them needs to be clear — and it needs to be findable.
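
Rupert’s checklist also lends itself to a quick automated spot check. The snippet below is only a sketch of that idea using the Python standard library, not a tool anyone quoted here has endorsed: it fetches a page (USA.gov is used purely as an example) and reports whether it exposes a title tag and a meta description, two of the basics on his list.

    # Rough SEO spot check: does a page have a <title> and a meta description?
    # The URL is an arbitrary example; point it at any .gov page you manage.
    from html.parser import HTMLParser
    from urllib.request import urlopen


    class SEOCheck(HTMLParser):
        def __init__(self):
            super().__init__()
            self.in_title = False
            self.title = ""
            self.meta_description = None

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "title":
                self.in_title = True
            elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
                self.meta_description = attrs.get("content", "")

        def handle_endtag(self, tag):
            if tag == "title":
                self.in_title = False

        def handle_data(self, data):
            if self.in_title:
                self.title += data


    url = "https://www.usa.gov/"
    page = urlopen(url).read().decode("utf-8", errors="replace")

    checker = SEOCheck()
    checker.feed(page)

    print("Checked:", url)
    print("  <title>:", checker.title.strip() or "MISSING")
    print("  meta description:", checker.meta_description or "MISSING")

A check like this doesn’t replace an analytics program, but run across a list of an agency’s domains it gives a fast, first-pass answer to Rupert’s question of whether citizens can even find the “great stuff.”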

  • Sanchezjb

    You refer to “data” 20 times and “information” (in the context of what people would look for) five times. This can create a perception that, based on your article, .gov reform is more focused on data vs. information.

    There’s a big difference between data and information. Information provides context around data. It’s this context that makes data valuable and what creates resonance.

    Citizens still need to have access to information and today, given the choice between having access to government data or information, most citizens would probably say that they prefer being able to have efficient access to the information that they deem important.

    It’s not just about the data.

  • charlz

    Minor quibble: A .gov registration does not equal a website. Many agencies registered second-level domains defensively so other agencies couldn’t get them. Eliminating, say, 10,000 of those would save the taxpayer ~$1.25 million, but it will probably take at least that in gov’t personnel costs to find and process that 10K.

  • Monty

    The dollar savings may not be all that significant, but is there some value to be added by the fact that ordinary citizens will more likely land at the appropriate site if (seemingly) redundant sites aren’t dividing up their search engine rankings and potentially pushing other sites above them in the results?

  • http://mixtmediastrategies.com mixtmedia

    Not only is .gov reform about the literal — eliminating useless and duplicative destination sites — it’s also about changing the mentality of .gov so that web managers across the agencies are driven by meeting citizens’ needs (“delighting” the customer…?) rather than owning and decorating more “web land.”

    At the core of this reform must be a shift in incentives for agencies’ web and program managers: more is not better. So often the validity of a new program or government initiative is marked by the stand-up of a snazzy, flashy website. The rationale is, “we are not real until we have our website — our storefront sign — up.” This was the case in the ’90s days of “brochure-ware” sites. We should be way beyond that.

    One of the core principles of Gov 2.0 is that information doesn’t equal power; rather, information SHARING amplifies power. Embracing this philosophy in the world of government websites means that program and web managers will not be rewarded for launching their own destination sites. They will be rewarded for achieving the mission of their program. If the web provides efficient and effective means to that end, great; if not, that’s fine too.

    Dot-gov reform is more about shifting agencies’ orientation from “proving” to “doing.” Like t-shirts sporting slogans like, “I’m a great soccer player” or “Sexy Gal” — if you really were, would you need to say so…? So government, follow one of my favorite principles: show, don’t tell when it comes to meeting citizens’ needs.

  • http://www.techpresident.com Micah Sifry

    Alex: Thanks for your always valuable review of the state of play on open government. And thanks also for citing my critique of Obama’s stupid little campaign against wasteful .gov website bloat.

    But let me be clear: I am all for efforts like Kundra’s to democratize access to vital government data, like health care data. Whenever I’m asked about the overall state of open data/open government under Obama, I cite the cutting edge work of Todd Park at HHS, along with the less well-known efforts at many govt agencies to share more information in up-to-date ways.

    My criticism of this new campaign against supposedly wasteful .gov bloat is of its cynicism and misdirection. As I wrote in my post about it, if you were really serious about pressing for changes in how the federal government manages information resources and policy, not only would you strengthen protections for whistleblowers, “you’d be expanding government use of the web to shed light on who is trying to influence it and obtain special favors from it, say, by creating a ‘centralized Internet database of lobbying reports, ethics records, and campaign finance filings in a searchable, sortable and downloadable format,’ as some guy who ran for President in 2008 promised back then.” Unfortunately, for anyone who had hopes that Obama would indeed drive change in how Washington works, “with Monday’s video attacking pointless and stupid government websites, President Obama has spent more time personally going after Fiddlin’ Foresters website than he has in pressing for an Ethics.gov website, despite his campaign promises” (quoting myself again).

    The White House communications team has probably spent more taxpayer money ferreting out that poor Fiddlin Foresters website for attack than they saved us in forcing it to shut down. It’s cheap politics and nasty symbolism that I’m criticizing, not the value of other efforts, like Kundra’s.

    I’d be far more impressed if the comm team got the president to brag about using open government and improved data collection and sharing to actually save us real money by denying corrupt federal contractors the ability to keep bidding on and winning billion-dollar contracts, or, more minimally, to announce that going forward the govt’s “excluded parties” database will start showing which companies an excluded individual worked for, so we can better tell which government contractors are breeding grounds for misconduct and thus raise the cost to those companies of tolerating corrupt practices. (Looking at the EPLS database, you’d never know that Lockheed Martin, for example, has ever run afoul of the law.)

    But doing that kind of data improvement is hard and requires spending political capital. Getting the president to whack at “welfare queens” or “fiddlin foresters” is easy. Or, as I said, cheap, dumb and cynical.

  • http://radar.oreilly.com/alexh Alex

    Thanks to everyone here for the comments, both here and out on the distributed social Web.

    @charlz Good point on registering a .gov domain vs. standing up a new website. (I’m glad I didn’t write something that created a false equivalency here!) I’ve heard a figure of around 24,000 registered federal .gov domains, but perhaps 2,000 sites.

    @monty I updated the post to include a note on SEO. In general, I don’t think a lot of these sites are particularly well optimized for search at all, which goes back to people getting information or services when and where they want them.

    @sanchezjb I’m not sure a frequency distribution is the right analysis of the underlying point here, but fair enough. With respect to your point – “Citizens still need to have access to information and today, given the choice between having access to government data or information, most citizens would probably say that they prefer being able to have efficient access to the information that they deem important.” – just so. Developers, NGOs, watchdogs, media or businesses are all much more likely to want data. Citizens want information. The point that Phillips and Kundra have made is that if data is released in structured form, all of those constituencies can integrate it into forms that provide citizens with the desired information or services. For instance, if someone wants information about local hospitals, releasing data can allow nearby hospitals to be compared, along with performance data if that’s available. If someone else wants to find their local representative, they can use a mobile service that has integrated directory data. If they google a pill, an image and a link to an NIH page pop up. In the future, if they want to check a product recall, they might scan an object and do a search against government data from a mobile app, instead of going to a .gov site and trying to type in an ISBN. And so on. If that translation wasn’t clear in the article, I apologize.

    @Mixtmedia Your points on information sharing in a networked world and tying the Web to customer service and agency mission speak to central value propositions for the Internet and government. Thank you for sharing.

    @micah I wish every post prompted these kinds of comments! Thank you for adding more context here, for those who hadn’t clicked through. You simultaneously highlight how much progress remains and what might yet be possible.

  • http://www.accio-shop.de Kinderkleidung

    There’s still a myth that backlinks from .gov homepages will boost somebody’s site. Personally, I can’t see any effect. Yes, Google still gives .gov and .edu domains extra trust and reputation, but you won’t earn it just by getting a backlink from their site to yours.

  • Paul Ashley

    I searched and found this site after hearing the umpteenth radio spot for another most-likely-useless nanny-state website – brought to you by the “Ad Council”.

    I understand that in this day and age government at all levels needs to make needed information available on the web and that the effort must be coordinated to make the presentation as efficient as possible. But do we really need sites the likes of MyPlate.gov? How much are we paying people to produce this drivel (I’ll bet it’s a lot) and how many unique hits do these sites get? I would bet that a cost-per-hit analysis would weed out a ton of sites, allowing that much more to be returned to the taxpayers to help the market create worthwhile jobs creating valuable websites for businesses.