Privacy, contexts and Girls Around Me

Problems arise when data is taken out of social contexts.

Last weekend, I read two excellent articles on the problems of privacy in a mobile, digital age. The Atlantic presented a summary of Helen Nissenbaum’s thoughts on privacy and social norms: When we discuss online privacy, we too often forget the social context in which data exists, even when we’re talking about social media. And Amit Runchal posted a TechCrunch article about the Girls Around Me fiasco, “Creating Victims and Blaming Them,” in which he points out that the victims of a service like Girls Around Me shouldn’t be blamed for not understanding the arcane privacy settings of services like Facebook:

“But … the women signed up to be a part of this when they signed up to be on Facebook. No. What they signed up for was to be on Facebook. Our identities change depending on our context, no matter what permissions we have given to the Big Blue Eye. Denying us the right to this creates victims who then get blamed for it. ‘Well … you shouldn’t have been on Facebook if you didn’t want to…’ No. Please recognize them as a person. Please recognize what that means.”

Runchal’s powerful “no” underscores the problem: People sign up with Facebook and Foursquare (which quickly blocked Girls Around Me’s access to their API) to communicate with friends, to play games, to find former classmates, and so on. They don’t sign up to have their data sold to the highest bidder. And while Facebook and Foursquare have a legitimate right to run a profitable business, their users have a legitimate right to be treated with some respect, and it’s hard to construe hundreds of inscrutable privacy settings as “respect.” Even if you understand the settings, it’s next to impossible to block apps that you don’t even know about. Perhaps the only way to protect yourself is a complete retreat into privacy, which defeats the purpose of Facebook.

Runchal’s article demonstrates the principles for which Nissenbaum is arguing. Privacy and data don’t exist in the abstract. Privacy and data always exist in social contexts, and problems occur when data is taken out of that context. Users give data to Facebook all the time; that’s normal, and the service couldn’t exist without it. Hundreds of millions of people use and enjoy Facebook, so the company is clearly doing a lot of things right. However, handing that same data to another application rips it out of context: Facebook data on its own might be fine; Facebook data crossed with location data from Foursquare is getting fishy (almost any use of location data quickly becomes “fishy”); and that combination, published via an app designed to encourage stalking, has crossed the line. Nissenbaum has articulated the general principle; Runchal has provided an excellent case study.

In a similar vein, Tim O’Reilly has argued that we should regulate the use of data, and expect data collectors to obey cultural norms about reasonable and unreasonable uses of data. A doctor could share your medical history with researchers, but not with an insurance company that might use it to cancel your policy. That’s the only way to get the medical progress that comes from sharing data without the chilling side effect of making medical care inaccessible to anyone who actually needs it. Tim has defended Facebook for being willing to push the limits of privacy because that’s the only way to find out what the new norms should be and what benefits we can derive from new applications. That’s fair enough, and in this case (as I already pointed out), Foursquare was quick to yank API access.

It’s useful to imagine the same software with a slightly different configuration. Girls Around Me has undeniably crossed a line. But what if, instead of finding women, the app was Hackers Around Me? That might be borderline creepy, but most people could live with it, and it might even lead to some wonderful impromptu hackathons. EMTs Around Me could save lives. I doubt that you’d need to change a single line of code to implement either of these apps, just some search strings. The problem isn’t the software itself, nor is it the victims, but what happens when you move data from one context into another. Moving data about EMTs into a context where EMTs are needed is socially acceptable; moving data into a context that facilitates stalking isn’t acceptable, and shouldn’t be.
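To make that thought experiment concrete, here is a minimal, purely hypothetical sketch. None of these names, fields, or functions come from the actual app; the point is only structural: the same query over combined profile and location data becomes a hackathon finder or an EMT locator depending on nothing but the search string passed in. Nothing in the code encodes acceptability; that lives entirely in the context the results are moved into.

```python
# Hypothetical sketch only: the class, function, and data are invented for
# illustration and do not reflect the real app's code or any real API.

from dataclasses import dataclass


@dataclass
class CheckIn:
    name: str          # pulled from a social profile
    interests: str     # free-text profile field
    lat: float         # location from a check-in service
    lon: float


def people_around_me(checkins, here, radius_deg, keyword):
    """Return nearby people whose profile text matches a search string."""
    lat0, lon0 = here
    return [
        c for c in checkins
        if abs(c.lat - lat0) <= radius_deg
        and abs(c.lon - lon0) <= radius_deg
        and keyword.lower() in c.interests.lower()
    ]


sample = [
    CheckIn("Alex", "python hacker, coffee", 37.78, -122.41),
    CheckIn("Sam", "EMT, cycling", 37.79, -122.40),
]

# Identical code; only the search string changes the social context.
print(people_around_me(sample, (37.78, -122.41), 0.05, "hacker"))
print(people_around_me(sample, (37.78, -122.41), 0.05, "EMT"))
```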

The Atlantic’s article about Nissenbaum ends with some pessimism about our ability to define social norms surrounding privacy: “It’s quite difficult to figure out what the norms for a given situation might be.” And that’s true. We don’t yet know what the cultural norms for privacy are, let alone how to regulate for them, or how regulations should evolve as technology evolves and cultural norms change. Locking in our present norms through some badly thought-out regulation strikes me as a recipe for disaster. I care much more about the TSA’s scanners at an airport than about Google photographing my house for Street View, but I’d be ecstatically surprised to see legislation that reflected my priorities. The New York Times reports that cell phone tracking is routinely used by local law enforcement agencies with little or no court oversight, and in the current climate, I’d be surprised to see privacy regulation that challenges the widespread use and abuse of surveillance by the police.

But this isn’t the time to throw up our hands. It isn’t as if we’re completely lacking in clue. With that in mind, I’ll give Amit Runchal the last word:

“The line is this: When you begin speaking for another person without their permission, you are doing something wrong. When you create another identity for them without their permission, you are doing something wrong. When you make people feel victimized who previously did not feel that way, you are doing something wrong.”

Those are words I can live by.
