"data journalism" entries
A data-driven investigation of emergency response times by the Los Angeles Data Desk found larger issues.
Here’s an insight that will endure well beyond the “era of big data”: poor collection practices and aging IT will derail any institutional effort to use data analysis to improve performance.
According to an investigation by the Los Angeles Times, poor record-keeping is holding back state government efforts to upgrade California’s 911 system. As with any database project, beware “garbage in, garbage out,” or “GIGO.”
As Ben Welsh and Robert J. Lopez reported for the L.A. Times in December, California’s Emergency Medical Services Authority has been working to centralize performance data since 2009.
Unfortunately, it’s difficult to achieve data-driven improvements, or to manage against perceived issues by applying big data in the public sector, if the data collection itself is flawed. The L.A. Times reported quality issues ranging from inconsistencies in how response times were measured, to record keeping on paper, to a failure to keep records at all. Read more…
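The “garbage in, garbage out” problem is easy to demonstrate. Here is a minimal, hypothetical sketch — the field names and records below are invented for illustration, not the EMS Authority’s actual data — showing how mixed units and unrecorded entries skew a reported average response time:

```python
# Hypothetical 911 response-time records illustrating "garbage in, garbage out".
# Some agencies log seconds, some minutes, and some leave the field blank;
# all records here are invented, not real EMS Authority data.
raw_records = [
    {"incident": "A-1", "response_time": "480", "unit": "seconds"},
    {"incident": "A-2", "response_time": "7.5", "unit": "minutes"},
    {"incident": "A-3", "response_time": "",    "unit": "seconds"},  # never recorded
    {"incident": "A-4", "response_time": "9",   "unit": "minutes"},
]

def clean(records):
    """Normalize every value to minutes and flag unusable rows."""
    usable, dropped = [], []
    for r in records:
        value = r["response_time"].strip()
        if not value:
            dropped.append(r["incident"])  # e.g., a paper record never entered
            continue
        minutes = float(value) / 60 if r["unit"] == "seconds" else float(value)
        usable.append(minutes)
    return usable, dropped

times, missing = clean(raw_records)
average = sum(times) / len(times)
print(f"Usable records: {len(times)}, dropped: {missing}, avg: {average:.1f} min")
```

Any statewide average computed from records like these depends entirely on how the blanks and unit mismatches are handled, which is exactly why flawed collection undermines the analysis built on top of it.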
The Guardian's Simon Rogers on why data journalism caught on and where it goes from here.
In the following interview, Rogers discusses the changes he’s seen in data journalism over the last five years and how new tools and increased notoriety will shape the data journalism space.
Why has data become the story for some journalists like yourself?
Simon Rogers: It’s a big change for reporters, to go from being suspicious of numbers to noticing that often data journalism is the only way to get stories from them. I think it’s a combination — the huge growth in published data out there combining with things like WikiLeaks, which changed the game for news editors to realize this was a new way to get stories. Read more…
Early responses from our investigation into data-driven journalism had an international flavor.
When I wrote that Radar was investigating data journalism and asked for your favorite examples of good work, we heard back from around the world.
I received emails from Los Angeles, Philadelphia, Canada and Italy that featured data visualization, explored the role of data in government accountability, and shared how open data can revolutionize environmental reporting. A tweet pointed me to a talk about how R is being used in the newsroom. Another tweet, from Journalist’s Resource (@JournoResource) on November 26, 2012, linked to relevant interviews on social science and the media.
Several other responses are featured at more length below. After you read through, make sure to also check out this terrific Ignite talk on data journalism recorded at this year’s Newsfoo in Arizona. Read more…
An interactive timeline visualization of The Rolling Stones' touring history.
From a data display perspective, the visualization is an interesting approach to a timeline story: it conveys progression through the visualization itself rather than through the more traditional static images arranged along a bar or time graph.
Justin Arenstein is building the capacity of African media to practice data-driven journalism.
This interview is part of our ongoing look at the people, tools and techniques driving data journalism.
I first met Justin Arenstein (@justinarenstein) in Chişinău, Moldova, where the media entrepreneur and investigative journalist was working as a trainer at a “data boot camp” for journalism students. The long-haired, bearded South African instantly makes an impression with his intensity, good humor and focus on creating work that gives citizens actionable information.
Whenever we’ve spoken about open data and open government, Arenstein has been a fierce advocate for data-driven journalism that not only makes sense of the world for readers and viewers, but also provides them with tools to become more engaged in changing the conditions they learn about in the work.
He’s relentlessly focused on how open data can be made useful to ordinary citizens, from Africa to Eastern Europe to South America. For instance, in November he highlighted how data journalism boosted voter registration in Kenya through a simple website built with modern web tools.
For the last 18 months, Arenstein has been working as a Knight International Fellow embedded with the African Media Initiative (AMI), a group of the 800 largest media companies on the continent, as a director for digital innovation. In that role, he has been creating an innovation program for the AMI, building digital capacity in countries that need effective accountability from the Fourth Estate as much as any in the world. The digital disruption that has upended media elsewhere hasn’t yet played out in Africa, for a number of reasons, explained Arenstein, but he estimates it will arrive within five years.
“Media wants to be ready for this,” he said, “to try and avoid as much of the business disintegration as possible. The program is designed to help them grapple with and potentially leapfrog coming digital disruption.”
Scraping together the best tools, techniques and tactics of the data journalism trade.
Great journalism has always been based on adding context, clarity and compelling storytelling to facts. While the tools have improved, the art is the same: explaining the who, what, where, when and why behind the story. The explosion of data, however, provides new opportunities to think about reporting, analysis and publishing stories.
As you may know, there’s already a Data Journalism Handbook to help journalists get started. (I contributed some commentary to it). Over the next month, I’m going to be investigating the best data journalism tools currently in use and the data-driven business models that are working for news startups. We’ll then publish a report that shares those insights and combines them with our profiles of data journalists.
Why dig deeper? Getting to the heart of what’s hype and what’s actually new and noteworthy is worth doing. I’d like to know, for instance, whether tutorials specifically designed for journalists can be useful, as Joe Brockmeier suggested at ReadWrite. On a broader scale, how many data journalists are working today? How many will be needed? What are the primary tools they rely upon now? What will they need in 2013? Who are the leaders or primary drivers in the area? What are the most notable projects? What organizations are embracing data journalism, and why?
This isn’t a new interest for me, but it’s one I’d like to ground in more research. When I was offered an opportunity to give a talk at the second International Open Government Data Conference at the World Bank this July, I chose to talk about open data journalism and invited practitioners on stage to share what they do. If you watch the talk and the ensuing discussion in the video below, you’ll pick up great insight from the work of the Sunlight Foundation, the experience of Homicide Watch and why the World Bank is focused on open data journalism in developing countries.
The Washington Post developed an interactive map using data on area homicides from 2000 through 2011.
Residents of Washington, D.C., and those considering a move to the city, have a new tool to assess the city’s homicide rate. As part of a 15-month investigation, The Washington Post has created an interactive map of the homicides in D.C. from 2000 through 2011. The interactive tool lets users drill down into the information by demographic, motive and manner of murder, for instance — all of which can also be isolated by neighborhood or by individual homicide.
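The drill-down interaction described above comes down to filtering one dataset along several dimensions at once. Here is a minimal sketch of that idea in Python — the field names and sample records are hypothetical, not the Post’s actual data or code:

```python
# Hypothetical homicide records; all field names and values are invented
# for illustration, not drawn from The Washington Post's dataset.
homicides = [
    {"year": 2004, "neighborhood": "Columbia Heights", "motive": "robbery"},
    {"year": 2007, "neighborhood": "Anacostia",        "motive": "dispute"},
    {"year": 2007, "neighborhood": "Columbia Heights", "motive": "dispute"},
]

def drill_down(records, **criteria):
    """Return the records matching every supplied field=value criterion."""
    return [r for r in records
            if all(r.get(field) == value for field, value in criteria.items())]

# Isolate one neighborhood, then narrow further by motive.
in_ch = drill_down(homicides, neighborhood="Columbia Heights")
disputes_in_ch = drill_down(homicides, neighborhood="Columbia Heights",
                            motive="dispute")
print(len(in_ch), len(disputes_in_ch))
```

Each additional criterion narrows the result set, which is the same pattern an interactive map applies when a reader stacks filters for neighborhood, motive and demographic.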
Commonalities between the Knight Foundation's News Challenge winners hint at journalism's networked future.
It’s not just a focus on data that connects the most recent class of Knight News Challenge winners. They all are part of a distributed civic media community that works on open source code, collects and improves data, and collaborates across media organizations.
These projects are “part of an infrastructure that helps journalists better understand and serve their communities through data,” commented Chris Sopher, Knight Foundation Journalism Program Associate, in an interview last week. To apply a coding metaphor, the Knight Foundation is funding the creation of patches for the source code of society. This isn’t a new focus: in 2011, Knight chose to help build the newsroom stack, from editorial search engines to data cleaning tools.
Following are four themes that jumped out when I looked across the winners of the latest Knight News Challenge round.
An intercontinental project that bridged citizen science, open data, open source hardware, civic hacking and the Internet of things to monitor, share and map radiation data? Safecast is in its own category. Adapting the system to focus on air quality in Los Angeles — a city that’s known for its smog — will be an excellent stress test for seeing if this distributed approach to networked accountability can scale.
If it does — and hacked Chumbys, LED signs, Twitter bots, smartphone apps and local media reports start featuring the results — open data is going to be baked into how residents of Los Angeles understand their own atmosphere. If this project delivers on some of its promise, the value of this approach will be clearer.
If this project delivers on all of its potential, the air itself might improve. For that to happen, the people who are looking at the realities of air pollution will need to advocate for policy makers to improve it. In the future, the success or failure of this project will inform similar efforts that seek to enlist communities in data collection, including whether governments embrace “citizensourcing” beyond natural disasters and crises. The idea of citizens as sensors continues to have legs. Read more…
The common thread among the Knight Foundation's latest grants: practical application of open data.
Data, on its own, locked up or muddled with errors, does little good. Cleaned up, structured, analyzed and layered into stories, data can enhance our understanding of the most basic questions about our world, helping journalists to explain who, what, where, how and why changes are happening.
Last week, the Knight Foundation announced the winners of its first news challenge on data. These projects are each excellent examples of working on stuff that matters: they’re collective investments in our digital civic infrastructure. In the 20th century, civil society and media published the first websites. In the 21st century, civil society is creating, cleaning and publishing open data.
The grants not only support open data but validate its place in the media ecosystem of 2012. The Knight Foundation is funding data science, accelerating innovation in journalism and media to help inform and engage communities, work that it considers “vital to democracy.”
Why? Consider the projects. Safecast creates networked accountability using sensors, citizen science and open source hardware. LocalData is a mobile method for communities to collect information about themselves and make sense of it. Open Elections will create a free, standardized database stream of election results. Development Seed will develop better tools to contribute to and use OpenStreetMap, the “Wikipedia of maps.” Pop Up Archive will develop an easier way to publish and archive multimedia data to the Internet. And Census.IRE.org will improve the ability of a connected nation and its data editors to access and use the work of the U.S. Census Bureau.
The projects hint at a future of digital open government, journalism and society founded upon the principles that built the Internet and World Wide Web and strengthened by peer networks between data journalists and civil society. A river of open data flows through them all. The elements and code in them — small pieces, loosely joined by APIs, feeds and the social web — will extend the plumbing of digital democracy in the 21st century.