"Gov 2.0" entries

Tracking the data storm around Hurricane Sandy

When natural disasters loom, public open government data feeds become critical infrastructure.

Just over fourteen months ago, social, mapping and mobile data told the story of Hurricane Irene. As a larger, more unusual late October storm churns its way up the East Coast, the people in its path are once again acting as sensors and media, creating crisis data as this “Frankenstorm” moves over them.

Hurricane Sandy is seen on the east coast of the United States in this NASA handout satellite image taken at 0715 GMT, October 29, 2012.

[Photo Credit: NASA]

As citizens look for hurricane information online, government websites are in high demand. In late 2012, media, government, the private sector and citizens will all play important roles in sharing information about what’s happening and in providing help to one another.

In that context, it’s key to understand that it’s government weather data, gathered by satellites high above the Earth and shared openly, that a huge number of infomediaries are using to forecast the storm and tell people what to expect and what to do. In perhaps the most impressive mashup of social and government data now online, an interactive Google Crisis Map for Hurricane Sandy, pictured below, maps the predicted path of the “Frankenstorm” in real time; there is also a NYC-specific version.

If you’re looking for public data put to public good, maps like these, along with Weather Underground’s interactive, are a canonical example of what’s possible.

Read more…

San Francisco looks to tap into the open data economy

With revised legislation and a chief data officer, San Francisco is iterating on its platform goals.

As interest in open data continues to grow around the world, cities have become laboratories for participatory democracy. They’re also ground zero for new experiments in spawning civic startups that deliver city services or enable new relationships between the people and city government. San Francisco was one of the first municipalities in the United States to embrace the city as a platform paradigm in 2009, with the launch of an open data platform.

Years later, the city government is pushing to use its open data to accelerate economic development. On Monday, San Francisco announced revised open data legislation to enable that change and highlighted civic entrepreneurs who are putting the city’s data to work in new mobile apps.

City staff have already published the revised open data legislation on GitHub. (If other cities want to “fork” it, clone away.) David Chiu, the chairman of the San Francisco Board of Supervisors, the city’s legislative body, introduced the new version on Monday and submitted it on Tuesday. A vote is expected before the end of the year.

Speaking at the offices of the Hatchery in San Francisco, Chiu observed that, by and large, the data that San Francisco has put out has shown the city in a positive light. In the future, he suggested, that should change. Chiu challenged the city and the smartest citizens of San Francisco to release more data, figure out where the city could take risks, be more entrepreneurial and use data to hold the city accountable. In his remarks, he said that San Francisco is working on open budgeting but is still months away from getting the data it needs. Read more…

Data from health care reviews could power “Yelp for health care” startups

Data-driven decision engines will need patient experience to complete the feedback loop.

[Photo: A hospital in Maine]

Given where my work and health have taken me this year, I’ve been thinking much more about the relationship of the Internet and health data to accountability and patient-driven health care.

When I was looking for a place in Maine to go for care this summer, I went online to look at my options. I consulted hospital data from the government at HospitalCompare.HHS.gov and patient feedback data on Yelp, and then made a decision based upon proximity and those ratings. If I had been closer to where I live in Washington D.C., I would also have consulted friends, peers or neighbors for their recommendations of local medical establishments.
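
To make that decision process concrete, here is a minimal sketch of the kind of scoring a “Yelp for health care” engine might perform, combining a government quality measure, a patient rating and a distance penalty. The facilities, numbers, field names and weights below are hypothetical; they are not drawn from HospitalCompare or Yelp.

```python
# Hypothetical ranking of hospitals by quality, patient ratings and distance.
# All figures are invented for illustration; they are not real
# HospitalCompare or Yelp data.

hospitals = [
    {"name": "Hospital A", "gov_quality": 0.82, "patient_rating": 4.5, "miles_away": 12},
    {"name": "Hospital B", "gov_quality": 0.91, "patient_rating": 3.8, "miles_away": 30},
    {"name": "Hospital C", "gov_quality": 0.75, "patient_rating": 4.9, "miles_away": 5},
]

def score(hospital, w_quality=0.5, w_rating=0.4, w_distance=0.1):
    """Weighted score: higher quality and ratings help, distance hurts."""
    rating_norm = hospital["patient_rating"] / 5.0              # scale to 0-1
    distance_penalty = min(hospital["miles_away"] / 50.0, 1.0)  # cap at 50 miles
    return (w_quality * hospital["gov_quality"]
            + w_rating * rating_norm
            - w_distance * distance_penalty)

for h in sorted(hospitals, key=score, reverse=True):
    print(f"{h['name']}: {score(h):.3f}")
```

The interesting design question is the weighting: government quality measures and patient reviews capture different things, and a real decision engine would need patient experience data to close that feedback loop.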

My brush with needing to find health care when I was far from home reminded me of the prism that collective intelligence can now provide for the treatment choices we make, if we have access to the Internet.

Patients today are sharing more of their health data and experiences online voluntarily, which in turn means that the Internet is shaping health care. There’s a growing phenomenon of “e-patients” and caregivers going online to find communities and information about illness and disability.

Aided by search engines and social media, newly empowered patients are discussing health conditions with others facing the same diseases, and they’re taking that peer-to-peer health care knowledge into their doctors’ offices with them, frequently on mobile devices. E-patients are sharing their health data of their own volition because they have a serious health condition, want to get healthy, and are willing to do so.

From the perspective of practicing physicians and hospitals, the trend of patients contributing to and consulting on online forums adds the potential for errors, fraud, or misunderstanding. And yet, I don’t think there’s any going back from a networked future of peer-to-peer health care, any more than we can turn back the dial on networked politics or disaster response. Read more…

The missing ingredient from hyperwired debates: the feedback loop

The 2012 Presidential debates show how far convergence has come and how far we have yet to go.

[Photo: Podium]

What a difference a season makes. A few months after widespread online frustration with a tape-delayed Summer Olympics, the 2012 Presidential debates will feature the most online livestreams and the most wired, up-to-the-second digital coverage in history.

Given the pace of technological change, it’s inevitable that each election season will bring with it new “firsts,” as candidates and campaigns set precedents by trying new approaches and platforms. This election has been no different: the Romney and Obama campaigns have been experimenting with mobile applications, social media, live online video and big data all year.

Tonight, one of the biggest moments in the presidential campaign to date is upon us, and there are several new digital precedents to acknowledge.

The biggest tech news is that YouTube, in a partnership with ABC, will stream the debates online for the first time. The stream will be on YouTube’s politics channel, and it will be embeddable.

With more and more livestreamed sports events, concerts and now debates available online, tuning in to what’s happening no longer means passively “watching TV.” The number of other ways people can tune in online in 2012 has skyrocketed, as you can see in GigaOm’s post listing debate livestreams or Mashable’s ways to watch the debates online.

This year, in fact, the biggest challenge people will have will not be finding an online alternative to broadcast or cable news but deciding which one to watch.

Read more…

DataMarket charges up with open energy data

Want to build a business on open data? Add value by solving a problem for your users.

Hjalmar Gislason commented earlier this year that open data has been all about apps. In the future, it should be about much more than consumer-facing tools. “Think also about the less sexy cases that can help a few people save us millions of dollars in aggregate, generate new insights and improve decision making on various levels,” he suggested.

Today, the founder and CEO of DataMarket told the audience of the first White House Energy Datapalooza that his company would make energy data more discoverable and usable. In doing so, DataMarket will be tapping into an emerging data economy of businesses using open government data.

“We are honored to have been invited to take part in this fantastic initiative,” said Gislason in a prepared statement. “At DataMarket we focus on doing one thing well: aggregating vast amounts of heterogeneous data to help business users with their planning and decision-making. Our new energy portal applies this know-how to the US government’s energy data, for the first time enabling these valuable resources to be searched, visualized and shared through one gateway and in combination with other domestic and worldwide open data sources.”

Energy.datamarket.com, which won’t go live officially until mid-October, will offer search across 10,000 data sets, 2 million time series and 50 million energy facts. The energy portal is based upon data from thirteen different providers, including the U.S. Department of Energy’s Energy Information Administration (EIA), Oak Ridge National Laboratory, the Energy Efficiency and Renewable Energy program, the National Renewable Energy Laboratory, the Environmental Protection Agency (EPA), the Bureau of Transportation Statistics, the World Bank and the United Nations.
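
The core of that pitch is aggregation: heterogeneous series from different providers, in different units, lined up on a common index so they can be compared and combined. Here is a minimal sketch of that idea in pandas; the series names, values and unit conversion are invented for illustration, and this is not DataMarket’s actual API or schema.

```python
import pandas as pd

# Two hypothetical monthly energy series from different providers, in
# different units. The numbers are illustrative only; they are not real
# EIA or EPA figures.
generation = pd.Series(
    [310.2, 298.7, 305.1],
    index=pd.to_datetime(["2012-06-01", "2012-07-01", "2012-08-01"]),
    name="net_generation_twh",
)
emissions = pd.Series(
    [155000, 149500, 152300],
    index=pd.to_datetime(["2012-06-01", "2012-07-01", "2012-08-01"]),
    name="co2_kilotons",
)

# Align both series on the same monthly index and convert CO2 to megatons
# so the two columns read in comparable magnitudes.
combined = pd.concat([generation, emissions / 1000.0], axis=1)
combined.columns = ["net_generation_twh", "co2_megatons"]
print(combined)
```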

Last week, I interviewed Gislason about his company and why they’re focusing on energy data.

Read more…

Four key trends changing digital journalism and society

Commonalities between the Knight Foundation's News Challenge winners hint at journalism's networked future.

[Image: See something or say something: Los Angeles]

It’s not just a focus on data that connects the most recent class of Knight News Challenge winners. They are all part of a distributed civic media community that works on open source code, collects and improves data, and collaborates across media organizations.

These projects are “part of an infrastructure that helps journalists better understand and serve their communities through data,” commented Chris Sopher, Knight Foundation Journalism Program Associate, in an interview last week. To apply a coding metaphor, the Knight Foundation is funding the creation of patches for the source code of society. This isn’t a new focus: in 2011, Knight chose to help build the newsroom stack, from editorial search engines to data cleaning tools.

Following are four themes that jumped out when I looked across the winners of the latest Knight News Challenge round.

Networked accountability

An intercontinental project that bridged citizen science, open data, open source hardware, civic hacking and the Internet of things to monitor, share and map radiation data? Safecast is in its own category. Adapting the system to focus on air quality in Los Angeles — a city that’s known for its smog — will be an excellent stress test for seeing if this distributed approach to networked accountability can scale.

If it does — and hacked Chumbys, LED signs, Twitter bots, smartphone apps and local media reports start featuring the results — open data is going to be baked into how residents of Los Angeles understand their own atmosphere. If this project delivers on some of its promise, the value of this approach will be clearer.
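
As a rough illustration of what “citizens as sensors” means in practice, the sketch below averages hypothetical geotagged air-quality readings into coarse map cells, the kind of aggregation such a project needs before the data can be mapped or compared across neighborhoods. The readings and grid size are invented; this is not Safecast’s actual data format or pipeline.

```python
from collections import defaultdict

# Hypothetical geotagged particulate readings: (latitude, longitude, ug/m3).
# Values are invented for illustration; they are not real Safecast data.
readings = [
    (34.05, -118.24, 41.0),
    (34.06, -118.25, 38.5),
    (34.10, -118.30, 22.0),
    (34.11, -118.31, 25.5),
]

def cell(lat, lon, size=0.05):
    """Snap a coordinate to a coarse grid cell so nearby readings can be averaged."""
    return (round(lat / size) * size, round(lon / size) * size)

totals = defaultdict(lambda: [0.0, 0])
for lat, lon, value in readings:
    key = cell(lat, lon)
    totals[key][0] += value
    totals[key][1] += 1

for key, (total, count) in sorted(totals.items()):
    print(f"cell {key}: mean {total / count:.1f} ug/m3 from {count} readings")
```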

If this project delivers on all of its potential, the air itself might improve. For that to happen, the people who are looking at the realities of air pollution will need to advocate for policy makers to improve it. In the future, the success or failure of this project will inform similar efforts that seek to enlist communities in data collection, including whether governments embrace “citizensourcing” beyond natural disasters and crises. The idea of citizens as sensors continues to have legs. Read more…

Knight winners are putting data to work

The common thread among the Knight Foundation's latest grants: practical application of open data.

Data, on its own, locked up or muddled with errors, does little good. Cleaned up, structured, analyzed and layered into stories, data can enhance our understanding of the most basic questions about our world, helping journalists to explain who, what, where, how and why changes are happening.

Last week, the Knight Foundation announced the winners of its first news challenge on data. These projects are each excellent examples of working on stuff that matters: they’re collective investments in our digital civic infrastructure. In the 20th century, civil society and media published the first websites. In the 21st century, civil society is creating, cleaning and publishing open data.

The grants not only support open data but validate its place in the media ecosystem of 2012. The Knight Foundation is funding data science, accelerating innovation in journalism and media to help inform and engage communities, work it considers “vital to democracy.”

Why? Consider the projects. Safecast creates networked accountability using sensors, citizen science and open source hardware. LocalData is a mobile method for communities to collect information about themselves and make sense of it. Open Elections will create a free, standardized database stream of election results. Development Seed will develop better tools to contribute to and use OpenStreetMap, the “Wikipedia of maps.” Pop Up Archive will develop an easier way to publish and archive multimedia data to the Internet. And Census.IRE.org will improve the ability of a connected nation and its data editors to access and use the work of the U.S. Census Bureau.

The projects hint at a future of digital open government, journalism and society founded upon the principles that built the Internet and World Wide Web and strengthened by peer networks between data journalists and civil society. A river of open data flows through them all. The elements and code in them — small pieces, loosely joined by APIs, feeds and the social web — will extend the plumbing of digital democracy in the 21st century.

Read more…

Congress launches Congress.gov in beta, doesn’t open the data

The Library of Congress launched a new website to give an increasingly mobile public access to legislative information.

The Library of Congress is now more responsive — at least when it comes to web design. Today, the nation’s repository for its laws launched a new beta website at Congress.gov and announced that it would eventually replace Thomas.gov, the 17-year-old website that represented one of the first significant forays online for Congress. The new website will help a public that increasingly looks for information on mobile devices learn about the lawmaking process, but it falls short of the full promise of embracing the power of the Internet. (More on that later.)

Tapping into a growing trend in government new media, the new Congress.gov features responsive design, adapting to desktop, tablet or smartphone screens. It’s also search-centric: it offers Boolean search and, in an acknowledgement that most of its visitors show up looking for information, puts a search field front and center in the interface. The site includes member profiles for U.S. Senators and Representatives, with associated legislative work. In a nod to a mainstay of social media and media websites, the new Congress.gov also has a “most viewed bills” list that lets visitors see at a glance what laws or proposals are gathering interest online. (You can download a fact sheet on all the changes as a PDF).

On the one hand, the new Congress.gov is a dramatic update to a site that desperately needed one, particularly in a historic moment where citizens are increasingly connecting to the Internet (and one another) through their mobile devices.

On the other hand, the new Congress.gov beta has yet to realize the potential of Congress publishing bulk open legislative data. There is no application programming interface (API) for open government developers to build upon. In many ways, the new Congress.gov replicates what was already available to the public at sites like Govtrack.us and OpenCongress.org.
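
To see what is missing, consider what even a simple legislative data API would let developers build. The sketch below queries a hypothetical JSON endpoint for bills and ranks them by page views; the URL and field names are invented, since Congress.gov exposes no such interface, and this is not GovTrack’s or OpenCongress’ API either.

```python
import json
from urllib.request import urlopen

# Hypothetical endpoint and fields -- Congress.gov offers no such API, so this
# only illustrates what an open legislative data feed could enable.
BILLS_URL = "https://example.congress.gov/api/v1/bills?congress=112"

def most_viewed_bills(url=BILLS_URL, limit=5):
    """Fetch bill records as JSON and return the most-viewed ones."""
    with urlopen(url) as response:
        bills = json.load(response)["bills"]
    top = sorted(bills, key=lambda bill: bill.get("views", 0), reverse=True)
    return [(bill["number"], bill["title"]) for bill in top[:limit]]

if __name__ == "__main__":
    for number, title in most_viewed_bills():
        print(number, title)
```

Read more…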

President Obama participates in first Presidential AMA on Reddit

The President's participation in a user-driven Q&A was a notable precedent in digital democracy.

Starting around 4:30 PM ET today, President Barack Obama made history by going onto Reddit to answer questions about anything for an hour. Reddit, one of the most popular social news sites on the Internet, has been hosting “Ask Me Anything” forums, or AMAs, for years, including sessions with prominent legislators like Representative Darrell Issa (R-CA), but hosting a sitting President of the United States will elevate Reddit’s prominence at the intersection of technology and politics. AllThingsD has the story of how Reddit got the President onto the site. Reddit co-founder Alexis Ohanian told Peter Kafka that “there are quite a few redditors at 1600 Pennsylvania Ave and at the campaign HQ — given the prominence of reddit, it’s an easy sell.”

President Obama made some news in the process, with respect to the Supreme Court decision that allowed super political action committees, or “Super PACs,” to become part of the campaign finance landscape.

“Over the longer term, I think we need to seriously consider mobilizing a constitutional amendment process to overturn Citizens United (assuming the Supreme Court doesn’t revisit it),” commented President Obama. “Even if the amendment process falls short, it can shine a spotlight on the super-PAC phenomenon and help apply pressure for change.”

President Obama announced in a tweet that he’d be participating in the AMA, and he provided photographic evidence that he was actually answering questions in an image posted to Reddit (above) and in a second tweet during the session.

Read more…

Palo Alto looks to use open data to embrace ‘city as a platform’

Palo Alto CIO Jonathan Reichental talks about the city's vision for open data.

In the 21st century, one of the strategies cities around the world are embracing to improve services, increase accountability and stimulate economic activity is to publish open data online. The vision for New York City as a data platform earned wider attention last year, when the Big Apple’s first chief digital officer, Rachel Sterne, pitched the idea to the public.

This week, the city of Palo Alto in California joined more than a dozen cities around the United States and the globe when it launched its own open data platform. The platform includes an application programming interface (API) that gives developers direct access, through a RESTful interface, to open government data published as JSON. Datasets can also be embedded like YouTube videos.
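
For developers, the practical payoff of a REST/JSON platform looks something like the sketch below: request a dataset over HTTP and get structured records back. The endpoint URL and field names here are hypothetical placeholders, not Palo Alto’s documented API.

```python
import json
from urllib.request import urlopen

# Hypothetical endpoint and fields -- placeholders for illustration,
# not Palo Alto's actual open data API.
DATASET_URL = "https://data.example-paloalto.org/api/v1/datasets/tree-inventory.json"

def fetch_records(url=DATASET_URL):
    """Fetch a JSON dataset from a RESTful open data endpoint."""
    with urlopen(url) as response:
        payload = json.load(response)
    return payload["records"]

if __name__ == "__main__":
    records = fetch_records()
    print(f"{len(records)} records; first: {records[0]}")
```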

Read more…