Profile of the Data Journalist: The Data Editor

Meghan Hoyer is a data editor at The Virginian-Pilot in Norfolk, Virginia.

Around the globe, the bond between data and journalism is growing stronger. In an age of big data, the growing importance of data journalism lies in the ability of its practitioners to provide context and clarity and, perhaps most important, to find truth in the expanding amount of digital content in the world. In that context, data journalism has profound importance for society.

To learn more about the people who are doing this work and, in some cases, building the newsroom stack for the 21st century, I conducted a series of email interviews during the 2012 NICAR Conference.

Meghan Hoyer (@MeghanHoyer) is a data editor based in Virginia. Our interview follows.

Where do you work now? What is a day in your life like?

I work in an office within The Virginian-Pilot’s newsroom. I’m a one-person team, so there’s no such thing as typical.

What I might do: Help a reporter pull Census data, work with IT on improving our online crime report app, create a DataTable of city property assessment changes, and plan training for a group of co-workers who’d like to grow their online skills. At least, that’s what I’m doing today.

Tomorrow, it’ll be helping with our online election report, planning a strategy to clean a dirty database, and working with a reporter to crunch data for a crime trend story.

How did you get started in data journalism? Did you get any special degrees or certificates?

I have a journalism degree from Northwestern, but I got started the same way most reporters probably got started – I had questions about my community and I wanted quantifiable answers. How had the voting population in a booming suburb changed? Who was the region’s worst landlord? Were our localities going after delinquent taxpayers? Anecdotes are nice, but it’s an amazingly powerful thing to be able to get the true measure of a situation. Numbers and analysis help provide a better focus – and sometimes, they entirely upend your initial theory.

Did you have any mentors? Who? What were the most important resources they shared with you?

I haven’t collected a singular mentor as much as a group of people whose work I keep tabs on, for inspiration and follow-up. The news community is pretty small. A lot of people have offered suggestions, guidance, cheat sheets and help over the years. Data journalism – from analysis to building apps – is definitely not something you can or need to learn in a bubble all on your own.

What does your personal data journalism “stack” look like? What tools could you not live without?

In terms of daily tools, I keep it basic: Google Docs, Fusion Tables and Refine, QGIS, SQLite and Excel are all in use pretty much every day.

I’ve learned some Python and JavaScript for specific projects and to automate some of the newsroom’s daily tasks, but I definitely don’t have the programming or technical background that a lot of people in this field have. That’s left me trying to learn as much as I can as quickly as I can.

In terms of a data stack, we keep information such as public employee salaries, land assessment databases and court record databases (among others) updated in a shared drive in our newsroom. It’s amazing how often reporters use them, even if it’s just to find out which properties a candidate owns or how long a police officer caught at a DUI checkpoint has been on the force.
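As a rough sketch of what that kind of newsroom automation might look like – not the Pilot’s actual scripts – here is a small Python example that pulls a public salary CSV and rebuilds a SQLite table on a shared drive. The URL, file path, table and column names are all invented for illustration.

```python
# A simplified sketch of a newsroom data-refresh task (hypothetical source and paths).
import csv
import io
import sqlite3
import urllib.request

SOURCE_URL = "https://data.example.gov/employee_salaries.csv"  # hypothetical open-data URL
SHARED_DB = r"\\newsroom-share\data\salaries.db"               # hypothetical shared-drive path

def refresh_salaries():
    """Download the latest salary CSV and rebuild the shared SQLite table."""
    with urllib.request.urlopen(SOURCE_URL) as resp:
        text = resp.read().decode("utf-8")
    rows = list(csv.DictReader(io.StringIO(text)))

    conn = sqlite3.connect(SHARED_DB)
    conn.execute("DROP TABLE IF EXISTS salaries")
    conn.execute("CREATE TABLE salaries (name TEXT, department TEXT, salary REAL)")
    conn.executemany(
        "INSERT INTO salaries VALUES (?, ?, ?)",
        [(r["name"], r["department"], r["salary"]) for r in rows],
    )
    conn.commit()
    conn.close()

if __name__ == "__main__":
    refresh_salaries()
```

A script like this can be run on a schedule so reporters always query reasonably fresh data without anyone re-downloading files by hand.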

What data journalism project are you the most proud of working on or creating?

I’m proud of using regional records to do an analysis that forced Norfolk to revamp its whole real estate tax collection process.

A few years ago, I combined property ownership records, code enforcement citations, real estate tax records and rental inspection information from all our local cities into “Cashing Blight” – and found a company with hundreds of derelict properties.

Their properties seemed to change hands often, so a partner and I then hand-built a database from thousands of land deeds that proved the company was flipping houses among investors in a $26 million mortgage fraud scheme. None of the cities in our region had any idea this was going on because they were dealing with each parcel as a separate entity.

That’s what combining sets of data can get you – a better overall view of what’s really happening. While government agencies are great at collecting piles of data, it’s that kind of larger analysis that’s missing.
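That kind of cross-city matching is essentially a database join. As a rough illustration of the approach described above – not the Pilot’s actual code – here is a Python/SQLite sketch that stacks parcel and code-enforcement exports from several cities into one database and ranks owners by citations region-wide. Every file name, table layout and column name here is hypothetical.

```python
# Sketch: combine parcel-level records from several cities and look for
# owners with many cited properties across the whole region.
import csv
import sqlite3

conn = sqlite3.connect("regional_property.db")
conn.execute("""CREATE TABLE IF NOT EXISTS parcels
                (city TEXT, parcel_id TEXT, owner TEXT, assessed_value REAL)""")
conn.execute("""CREATE TABLE IF NOT EXISTS citations
                (city TEXT, parcel_id TEXT, citation_date TEXT, violation TEXT)""")

def load(table, path, columns):
    """Append one city's CSV export into the combined table."""
    with open(path, newline="") as f:
        rows = [tuple(row[c] for c in columns) for row in csv.DictReader(f)]
    placeholders = ",".join("?" * len(columns))
    conn.executemany(f"INSERT INTO {table} VALUES ({placeholders})", rows)

# One call per city export (hypothetical file names).
load("parcels", "norfolk_parcels.csv", ["city", "parcel_id", "owner", "assessed_value"])
load("citations", "norfolk_citations.csv", ["city", "parcel_id", "citation_date", "violation"])
conn.commit()

# Owners with the most code-enforcement citations region-wide -- a view
# no single city's records would show on their own.
query = """
SELECT p.owner,
       COUNT(DISTINCT p.city || '-' || p.parcel_id) AS properties,
       COUNT(c.violation) AS citations
FROM parcels p
LEFT JOIN citations c ON c.city = p.city AND c.parcel_id = p.parcel_id
GROUP BY p.owner
ORDER BY citations DESC
LIMIT 25
"""
for owner, properties, citations in conn.execute(query):
    print(owner, properties, citations)
```

The point of the sketch is the join across jurisdictions: once every city’s parcels live in one table keyed the same way, the region-wide pattern falls out of a single query.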

Where do you turn to keep your skills updated or learn new things?

To be honest – Twitter. I get a lot of ideas and updates on new tools there. And the NICAR conference and listserv. Usually when you run up against a problem – whether it’s dealing with a dirty dataset or figuring out how best to visualize your data – it’s something that someone else has already faced.

I also learn a lot from the people within our newsroom. We have a talented group of web producers who are all eager to try new things and learn.

Why are data journalism and “news apps” important, in the context of the contemporary digital environment for information?

Data is everywhere, but in most cases it’s just stockpiled and warehoused without a second thought given to analyzing it or using it to solve larger problems.

Journalists are in a unique position to make sense of it, to find the stories in it, to make sure that governments and organizations are considering the larger picture.

I think, too, that people in our field need to truly push for open government – not in the sense of government building interfaces for data, but of simply releasing raw data streams. Government is still far too stuck in the “Here’s a PDF of a spreadsheet” mentality. That doesn’t create informed citizens, and it doesn’t lead to innovative ways of thinking about government.

I’ve been involved recently in a community effort to create an API and then apps out of the regional transit authority’s live bus GPS stream. It has been a really fun project – and one I hope makes local governments rethink their practices.
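As a loose illustration of that idea – not the project’s actual code, and with a hypothetical feed URL and field names – here is a minimal Python sketch of wrapping a live vehicle-position feed in a simple JSON API using Flask:

```python
# Minimal sketch: proxy a transit agency's raw bus-position feed as clean JSON.
# The feed URL and field names are assumptions for illustration only.
from flask import Flask, jsonify
import requests

app = Flask(__name__)
FEED_URL = "https://example.org/transit/bus_positions.json"  # hypothetical raw feed

@app.route("/api/buses")
def buses():
    """Fetch the raw feed and return a simplified list of bus positions."""
    raw = requests.get(FEED_URL, timeout=10).json()
    positions = [
        {
            "route": b.get("route"),
            "lat": b.get("lat"),
            "lon": b.get("lon"),
            "timestamp": b.get("timestamp"),
        }
        for b in raw.get("buses", [])
    ]
    return jsonify(positions)

if __name__ == "__main__":
    app.run(port=5000)
```

Once a raw stream is exposed through a stable, documented endpoint like this, anyone in the community can build rider-facing apps on top of it without waiting for the agency to do so.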
