Open Source Metrics, BitTorrent to TV, Tumblr Value, and Variable Fiction
- Open Source Metrics — Judging the health of a project by any single metric is meaningless, and judging it by metrics like download counts and mailing-list activity is an even bigger waste of time. Amen!
- BitTorrent To Your TV — The first ever certified BitTorrent Android box goes on sale today, allowing users to stream files downloaded with uTorrent wirelessly to their television. The new set-top box supports playback of all popular video formats and can also download torrents by itself, fully anonymously if needed. (via Andy Baio)
- Tumblr URL Culture — The FOO.tumblr.com namespace is scarce and there’s non-financial speculation. People hoard and trade URLs, whose value is that they say “I’m cool and quirky”. I’m interested because it’s a weird, largely invisible Internet barter economy. Here’s a rant against it. (via Beta Knowledge)
- Design-Fiction Slider Bar of Disbelief (Bruce Sterling) — I love the list as much as the diagram. He lays out a sliding scale from “objective reality” to “holy relics” and positions black propaganda, 419 frauds, design pitches, user feedback, and software code on that scale (among many other things). Bruce is an avuncular Loki, pulling you aside and messing with your head for your own good.
A data-driven investigation of emergency response times by the Los Angeles Times Data Desk uncovered larger problems with how the underlying data is collected and kept.
Here’s an ageless insight that will endure well beyond the “era of big data”: poor collection practices and aging IT will derail any institutional efforts to use data analysis to improve performance.
According to an investigation by the Los Angeles Times, poor record-keeping is holding back state government efforts to upgrade California’s 911 system. As with any database project, beware “garbage in, garbage out,” or “GIGO.”
As Ben Welsh and Robert J. Lopez reported for the L.A. Times in December, California’s Emergency Medical Services Authority has been working to centralize performance data since 2009.
Unfortunately, it’s difficult to achieve data-driven improvements, or to manage against perceived issues, by applying big data to the public sector if the data collection itself is flawed. The L.A. Times reported quality issues ranging from how response times were measured, to records kept only on paper, to a failure to keep records at all. Read more…
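To see why GIGO matters here, consider a deliberately hypothetical sketch (the records and field names below are invented for illustration, not drawn from the EMS Authority’s data): when paper logs never make it into the database and some incidents have no arrival time at all, a naive average response time quietly excludes the worst cases unless the analysis also reports how much of the data was unusable.

```python
from datetime import datetime

# Hypothetical dispatch records: some have no arrival time at all
# (never digitized), illustrating how gaps skew a naive average.
records = [
    {"dispatched": "2012-11-01 08:00:00", "arrived": "2012-11-01 08:07:30"},
    {"dispatched": "2012-11-01 09:15:00", "arrived": "2012-11-01 09:21:10"},
    {"dispatched": "2012-11-01 10:02:00", "arrived": None},  # paper log, never entered
    {"dispatched": "2012-11-01 11:40:00", "arrived": None},  # no record kept
]

FMT = "%Y-%m-%d %H:%M:%S"

def response_minutes(rec):
    """Return the response time in minutes, or None if the record is unusable."""
    if not rec["dispatched"] or not rec["arrived"]:
        return None
    delta = datetime.strptime(rec["arrived"], FMT) - datetime.strptime(rec["dispatched"], FMT)
    return delta.total_seconds() / 60

times = [response_minutes(r) for r in records]
usable = [t for t in times if t is not None]

# Report the average *and* how much data had to be thrown out --
# without the second number, the first one is misleading.
print(f"average response: {sum(usable) / len(usable):.1f} min")
print(f"unusable records: {len(times) - len(usable)} of {len(times)}")
```

The point is the second print statement: any performance figure computed from a flawed collection process needs to carry its own caveat about coverage.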
The Guardian's Simon Rogers on why data journalism caught on and where it goes from here.
In the following interview, Rogers discusses the changes he’s seen in data journalism over the last five years and how new tools and the field’s growing prominence will shape the data journalism space.
Why has data become the story for some journalists like yourself?
Simon Rogers: It’s a big change for reporters, to go from being suspicious of numbers to noticing that often data journalism is the only way to get stories from them. I think it’s a combination — the huge growth in published data out there combining with things like WikiLeaks, which changed the game for news editors to realize this was a new way to get stories. Read more…
Early responses from our investigation into data-driven journalism had an international flavor.
When I wrote that Radar was investigating data journalism and asked for your favorite examples of good work, we heard back from around the world.
I received emails from Los Angeles, Philadelphia, Canada and Italy that featured data visualization, explored the role of data in government accountability, and shared how open data can revolutionize environmental reporting. A tweet pointed me to a talk about how R is being used in the newsroom. Another tweet, from Journalist’s Resource (@JournoResource), linked to relevant interviews on social science and the media.
Several other responses are featured at greater length below. After you read through, make sure to also check out this terrific Ignite talk on data journalism recorded at this year’s Newsfoo in Arizona. Read more…
Justin Arenstein is building the capacity of African media to practice data-driven journalism.
This interview is part of our ongoing look at the people, tools and techniques driving data journalism.
I first met Justin Arenstein (@justinarenstein) in Chişinău, Moldova, where the media entrepreneur and investigative journalist was working as a trainer at a “data boot camp” for journalism students. The long-haired, bearded South African instantly makes an impression with his intensity, good humor and focus on creating work that gives citizens actionable information.
Whenever we’ve spoken about open data and open government, Arenstein has been a fierce advocate for data-driven journalism that not only makes sense of the world for readers and viewers, but also provides them with tools to become more engaged in changing the conditions they learn about in the work.
He’s relentlessly focused on how open data can be made useful to ordinary citizens, from Africa to Eastern Europe to South America. For instance, in November he highlighted how data journalism boosted voter registration in Kenya through a simple website built with modern web-based tools.
For the last 18 months, Arenstein has been working as a Knight International Fellow embedded with the African Media Initiative (AMI) as a director for digital innovation. The AMI is an association of the 800 largest media companies on the African continent. In that role, Arenstein has been building an innovation program for the AMI, developing digital capacity in countries that need effective accountability from the Fourth Estate as much as any in the world. The digital disruption that has upended media elsewhere hasn’t yet played out in Africa for a number of reasons, Arenstein explained, but he estimates it will arrive within five years.
“Media wants to be ready for this,” he said, “to try and avoid as much of the business disintegration as possible. The program is designed to help them grapple with and potentially leapfrog coming digital disruption.”
Scraping together the best tools, techniques and tactics of the data journalism trade.
Great journalism has always been based on adding context, clarity and compelling storytelling to facts. While the tools have improved, the art is the same: explaining the who, what, where, when and why behind the story. The explosion of data, however, provides new opportunities to think about reporting, analysis and publishing stories.
As you may know, there’s already a Data Journalism Handbook to help journalists get started. (I contributed some commentary to it). Over the next month, I’m going to be investigating the best data journalism tools currently in use and the data-driven business models that are working for news startups. We’ll then publish a report that shares those insights and combines them with our profiles of data journalists.
Why dig deeper? Getting to the heart of what’s hype and what’s actually new and noteworthy is worth doing. I’d like to know, for instance, whether tutorials specifically designed for journalists can be useful, as Joe Brockmeier suggested at ReadWrite. On a broader scale, how many data journalists are working today? How many will be needed? What are the primary tools they rely upon now? What will they need in 2013? Who are the leaders or primary drivers in the area? What are the most notable projects? What organizations are embracing data journalism, and why?
This isn’t a new interest for me, but it’s one I’d like to ground in more research. When I was offered an opportunity to give a talk at the second International Open Government Data Conference at the World Bank this July, I chose to talk about open data journalism and invited practitioners on stage to share what they do. If you watch the talk and the ensuing discussion in the video below, you’ll pick up great insight from the work of the Sunlight Foundation, the experience of Homicide Watch and why the World Bank is focused on open data journalism in developing countries.
A/B with Google Analytics, Lego Rubik's Solver, TV Torrents, and Performance Tools
- ABalytics — dead simple A/B testing with Google Analytics; a sketch of the underlying idea follows this list. (via Dan Mazzini)
- Fastest Rubik's Cube Solver is Made of Lego — it takes less than six seconds to solve the cube. Watch the video; it’s … wow. Also cool is watching it fail. (via Hacker News)
- Fairfax Watches BitTorrent (TorrentFreak) — At a government broadband conference in Sydney, Fairfax’s head of video Ricky Sutton admitted that in a country with one of the highest percentages of BitTorrent users worldwide, his company determines what shows to buy based on the popularity of pirated videos online.
- Web Performance Tools (Steve Souders) — compilation of popular web performance tools. Reminds me of nmap’s list of top security tools.
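The ABalytics entry above is a handy excuse to sketch what an A/B test actually does: bucket each visitor deterministically into a variant, then compare conversion rates across variants. The Python below is a generic illustration of that idea with made-up visitor IDs and variant names; it is not ABalytics’ JavaScript API, which instead reports the chosen variant to Google Analytics and lets GA do the counting.

```python
import hashlib
from collections import Counter

# Hypothetical variant names; the bucketing scheme below is a generic
# illustration of A/B assignment, not ABalytics' actual implementation.
VARIANTS = ["control", "new_headline"]

def assign_variant(visitor_id: str) -> str:
    """Deterministically bucket a visitor so they always see the same variant."""
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

# A tool like ABalytics would report the assigned variant to Google Analytics;
# here we simply tally hypothetical visits and conversions locally.
visits, conversions = Counter(), Counter()

for visitor_id, converted in [("u1", True), ("u2", False), ("u3", True), ("u4", False)]:
    variant = assign_variant(visitor_id)
    visits[variant] += 1
    if converted:
        conversions[variant] += 1

for variant in VARIANTS:
    if visits[variant]:
        rate = conversions[variant] / visits[variant]
        print(f"{variant}: {visits[variant]} visits, {rate:.0%} conversion")
```

Deterministic hashing is what makes the client-side approach "dead simple": a returning visitor keeps seeing the same variant without any server-side state.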
The 2012 Presidential debates show how far convergence has come and how far we have yet to go.
What a difference a season makes. A few months after widespread online frustration with a tape-delayed Summer Olympics, the 2012 Presidential debates will feature the most online livestreams and wired, up-to-the-second digital coverage in history.
Given the pace of technological change, it’s inevitable that each election season will bring with it new “firsts,” as candidates and campaigns set precedents by trying new approaches and platforms. This election has been no different: the Romney and Obama campaigns have been experimenting with mobile applications, social media, live online video and big data all year.
Tonight, one of the biggest moments in the presidential campaign to date is upon us, and there are several new digital precedents to acknowledge.
The biggest tech news is that YouTube, in a partnership with ABC, will stream the debates online for the first time. The stream will be on YouTube’s politics channel, and it will be embeddable.
With more and more livestreamed sports events, concerts and now debates available online, tuning in to what’s happening no longer means passively “watching TV.” The number of other ways people can tune in online in 2012 has skyrocketed, as you can see in GigaOm’s post listing debate livestreams or Mashable’s ways to watch the debates online.
This year, in fact, the biggest challenge people will have will not be finding an online alternative to broadcast or cable news but deciding which one to watch.
Open Publishing, Theatre Sensing, Reddit First, and Math Podcasts
- Open Monograph Press — an open source software platform for managing the editorial workflow required to see monographs, edited volumes, and scholarly editions through internal and external review, editing, cataloguing, production, and publication. OMP will also operate as a press website with catalog, distribution, and sales capacities. (via OKFN)
- Sensing Activity in Royal Shakespeare Theatre (NLTK) — sensing activity in the theatre, for graphing. Raw data available. (via Infovore)
- Why Journalists Love Reddit (GigaOM) — “Stories appear on Reddit, then half a day later they’re on Buzzfeed and Gawker, then they’re on the Washington Post, The Guardian and the New York Times. It’s a pretty established pattern.”
- Relatively Prime: The Toolbox — Kickstarted podcasts on mathematics. (via BoingBoing)
- Seriesly — a time-series database written in Go.
- Tablets and TV (Luke Wroblewski) — In August 2012, 77% of TV viewers used another device at the same time in a typical day. 81% used a smartphone and TV at the same time. 66% used a laptop and TV at the same time.
- Tiny Transactions on Computer Science — computer science research in 140 characters or fewer.