Privacy, contexts and Girls Around Me

Problems arise when data is taken out of social contexts.

Last weekend, I read two excellent articles on the problems that privacy presents in a mobile, digital age. The Atlantic presented a summary of Helen Nissenbaum’s thoughts on privacy and social norms: when we discuss online privacy, we too often forget the social context in which data exists, even when we’re talking about social media. And Amit Runchal posted a TechCrunch article about the Girls Around Me fiasco, “Creating Victims and Blaming Them,” in which he points out that the victims of a service like Girls Around Me shouldn’t be blamed for not understanding the arcane privacy settings of services like Facebook:

“But … the women signed up to be a part of this when they signed up to be on Facebook. No. What they signed up for was to be on Facebook. Our identities change depending on our context, no matter what permissions we have given to the Big Blue Eye. Denying us the right to this creates victims who then get blamed for it. ‘Well … you shouldn’t have been on Facebook if you didn’t want to…’ No. Please recognize them as a person. Please recognize what that means.”

Runchal’s powerful “no” underscores the problem: People sign up with Facebook and Foursquare (which quickly blocked Girls Around Me’s access to their API) to communicate with friends, to play games, to find former classmates, and so on. They don’t sign up to have their data sold to the highest bidder. And while Facebook and Foursquare have a legitimate right to run a profitable business, their users have a legitimate right to be treated with some respect, and it’s hard to construe hundreds of inscrutable privacy settings as “respect.” Even if you understand the settings, it’s next to impossible to block apps that you don’t even know about. Perhaps the only way to protect yourself is a complete retreat into privacy, which defeats the purpose of Facebook.

Runchal’s article demonstrates the principles for which Nissenbaum is arguing. Privacy and data don’t exist in the abstract. Privacy and data always exist in social contexts, and problems occur when data is taken out of that context. Users give data to Facebook all the time; that’s normal, and the service couldn’t exist without that happening. Hundreds of millions of people use and enjoy Facebook, so the company is clearly doing a lot of things right. However, handing that same data to another application rips it out of context: Facebook data on its own might be fine, Facebook data crossed with location data from Foursquare is getting fishy (almost any use of location data quickly becomes “fishy”), and that combination published via an app that’s designed to encourage stalking has crossed the line. Nissenbaum has articulated the general principle; Runchal has provided an excellent case study.

In a similar vein, Tim O’Reilly has argued that we should regulate the use of data, and expect data collectors to obey cultural norms about reasonable and unreasonable uses of data. A doctor could share your medical history with researchers, but not with an insurance company that might use it to cancel your policy. That’s the only way to get the medical progress that comes from sharing data without the chilling side effect of making medical care inaccessible to anyone who actually needs it. Tim has defended Facebook for being willing to push the limits of privacy because that’s the only way to find out what the new norms should be and what benefits we can derive from new applications. That’s fair enough, and in this case (as I already pointed out), Foursquare was quick to yank API access.

It’s useful to imagine the same software with a slightly different configuration. Girls Around Me has undeniably crossed a line. But what if, instead of finding women, the app was Hackers Around Me? That might be borderline creepy, but most people could live with it, and it might even lead to some wonderful impromptu hackathons. EMTs Around Me could save lives. I doubt that you’d need to change a single line of code to implement either of these apps, just some search strings. The problem isn’t the software itself, nor is it the victims, but what happens when you move data from one context into another. Moving data about EMTs into a context where EMTs are needed is socially acceptable; moving data into a context that facilitates stalking isn’t acceptable, and shouldn’t be.
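To make the point concrete, here is a minimal sketch of why “just some search strings” is plausible. Everything in it is invented for illustration: the profile records, the field names, and the function are hypothetical stand-ins, not any real Facebook or Foursquare API. The idea is that the entire difference between the three imagined apps is a single keyword parameter.

```python
# Hypothetical sketch: the "app" reduces to one parameterized query over
# public check-in data. All data and names here are made up.

def people_around_me(profiles, keyword):
    """Return names of nearby public profiles whose interests match keyword."""
    return [p["name"] for p in profiles
            if keyword.lower() in (i.lower() for i in p["interests"])]

# Toy data standing in for public profiles crossed with location check-ins.
checkins = [
    {"name": "Alice", "interests": ["EMT", "hiking"]},
    {"name": "Bob",   "interests": ["hacking", "chess"]},
    {"name": "Carol", "interests": ["EMT", "photography"]},
]

# "EMTs Around Me" and "Hackers Around Me": same code, different string.
print(people_around_me(checkins, "emt"))      # the EMT version
print(people_around_me(checkins, "hacking"))  # the hacker version
```

The code is context-blind by construction, which is exactly the problem: nothing in the software distinguishes a benign query from a predatory one. The social acceptability lives entirely outside the program.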

The Atlantic’s article about Nissenbaum ends with some pessimism about our ability to define social norms surrounding privacy: “It’s quite difficult to figure out what the norms for a given situation might be.” And that’s true. We don’t yet know what cultural norms for privacy are, let alone how to regulate for them, or how regulations should evolve as technology evolves and cultural norms change. Locking in our present norms through some badly thought out regulation strikes me as a recipe for disaster. I care much more about the TSA’s scanners at an airport than about Google photographing my house for Street View, but I’d be ecstatically surprised to see legislation that reflected my priorities. The New York Times reports that cell phone tracking is routinely used by local law enforcement agencies, with little or no court oversight; and in the current climate, I’d be surprised to see privacy regulation that challenges the widespread use and abuse of surveillance by the police.

But this isn’t the time to throw up our hands. It isn’t as if we’re completely lacking in clue. With that in mind, I’ll give Amit Runchal the last word:

“The line is this: When you begin speaking for another person without their permission, you are doing something wrong. When you create another identity for them without their permission, you are doing something wrong. When you make people feel victimized who previously did not feel that way, you are doing something wrong.”

Those are words I can live by.


  • Gwen Jenkins

    The “social norm” isn’t an adequate standard. Take those TSA screenings, for example. They respond to (and encourage) the current, decidedly un-American, social norm that says we have a right not only to BE safe, but to FEEL safe for the few hours we’re in flight. In furtherance of that illusion, we’re entitled to demand that everyone else submit to inconvenience, indignity and loss of privacy, or stay home.

  • Great article. Context is too often left out of the discussion. The comment about what one expects when one signs up for Facebook (vs. what one gets) is too true!

  • Gar

    They should be testing the waters of privacy to know where the limits should be set. Of course, by the time we yell that they went too far, it’s way too late; our private data is already out there. Acting responsibly is a great thing to say, but it means very little to a company that needs to make money because it’s down for the year, behind expectations, or just in it to get rich. Begging for forgiveness for going too far is not a data-sharing model we can live with.

  • rj

    In many ways.. if I look at all the hand wringing about personal privacy, I’m amazed that people can be in such denial about the results of their own actions.. yet I’m a student of history.. and if history is any measure, there are simply no limits to the depths to which human denial can sink..

    Facebook, if viewed objectively, can ONLY be about using private data in ways that would not be agreeable to its users (I won’t call them customers.. because Facebook’s customers are ad agencies and folks looking to use consumer-based data). To state it most directly: no service is ever free. If you are getting something for free, then you are most typically providing something else of value to the provider of those free services. When you sign up for Facebook, this is the contract you are making.. yet most folks act like the proverbial homeowners who buy a house in an airport landing pattern, and then after the fact complain about the noise.

    An ancient Latin phrase captures my perspective the best.. Caveat Emptor. Like it or not, when we take actions as adults, those actions have consequences. This phrase goes to the heart of the matter. If you expose your personal data without considering the motivations of those who get that information, you get exactly what you deserve..

    Now.. to all those who might say.. ‘what about Amazon’, ‘what about ’. There is also a risk there.. truly! BUT.. Amazon and its ilk take your money for products.. they have a vested interest in playing fairly with any private data you may provide.. and stand to lose greatly if discovered playing fast and loose with personal info.. Again.. nothing is free.. folks must eat day to day.. so if you as a ‘customer’ aren’t paying for a service, then you must consider who IS paying.. and why. That is the essence of being conscious of threats to privacy and such on the internet.

  • Eve

    Most of the fine print reads that they have a right to use everything that you post on their sites. Maybe they feel that because you agree to this term, they own you and can do what they please.

  • Gar

    Facebook pushes the next ‘limit’ with its purchase of Instagram’s user list and their history.