The essence of my argument is that there’s enormous advantage for users in giving up some privacy online and that we need to be exploring the boundary conditions – asking ourselves when is it good for users, and when is it bad, to reveal their personal information. I’d rather have entrepreneurs making high-profile mistakes about those boundaries, and then correcting them, than silently avoiding controversy while quietly taking advantage of public ignorance of the subject, or avoiding a potentially contentious area of innovation because they are afraid of backlash. It’s easy to say that this should always be the user’s choice, but entrepreneurs from Steve Jobs to Mark Zuckerberg are in the business of discovering things that users don’t already know that they will want, and sometimes we only find the right balance by pushing too far, and then recovering.
The world is changing. We give up more and more of our privacy online in exchange for undoubted benefits. We give up our location in order to get turn-by-turn directions on our phone; we give up our payment history in return for discounts or reward points; we give up our images to security cameras equipped with increasingly sophisticated machine learning technology. As medical records go online, we'll increase both the potential and the risks of having private information used and misused.
We need to engage deeply with these changes, and we best do that in the open, with some high-profile missteps to guide us. In an odd way, Facebook is doing us a favor by bringing these issues to the fore, especially if (as they have done in the past) they react by learning from their mistakes. It's important to remember that there was a privacy brouhaha when Facebook first introduced the News Feed back in 2006!
What we’re really trying to figure out are the right tradeoffs. And there’s no question that there will be tradeoffs. The question is whether, in the end, Facebook is creating more value than they capture. I’m finding Facebook increasingly useful. And I think a lot of other people are too. Does anyone else see the irony in the screenshot below, from ReadWriteWeb’s article “More Web Industry Leaders Quit Facebook, Call for an Open Alternative”:
Almost an order of magnitude more people have used Facebook’s “Like” feature to approve of the suggestion to quit Facebook than have commented on the blog post!
That being said, T.S. Eliot’s judgment (from Murder in the Cathedral) that
The last temptation is the greatest treason:
To do the right deed for the wrong reason.
is hauntingly apt. Facebook is not pushing these boundaries for user benefit but for their own.
danah boyd goes to the heart of the matter when she writes:
The battle that is underway is not a battle over the future of privacy and publicity. It’s a battle over choice and informed consent. It’s unfolding because people are being duped, tricked, coerced, and confused into doing things where they don’t understand the consequences. Facebook keeps saying that it gives users choices, but that is completely unfair. It gives users the illusion of choice and hides the details away from them “for their own good.”
There’s a fabulous New York Times infographic that demonstrates just how complex Facebook has made its privacy controls: more than 50 settings with a total of 170 options. Now, Facebook may think you need a dashboard of that much complexity in order to properly manage your privacy. But much of the complexity is of Facebook’s own making.
Users have the right to:
1. Honesty: Tell the truth. Don’t make our information public against our will and call it “giving users more control.” Call things what they are.
2. Accountability: Keep your word. Honor the deals you make and the expectations they create. If a network asks users to log in, users expect that it’s private. Don’t get us to populate your network based on one expectation of privacy, and then change the rules once we’ve connected with 600 friends.
3. Control: Let us decide what to do with our data. Get our permission before you make any changes that make our information less private. We should not have data cross-transmitted to other services without our knowledge. We should always be asked to opt in before a change, rather than being told we have the right to opt out after a change is unilaterally imposed.
4. Transparency: We deserve to know what information is being disclosed and to whom. When there has been a glitch or a leak that involves our information, make sure we know about it.
5. Freedom of movement: If we want to leave your network, let us. If we want to take our data with us, let us do that, too. This will encourage competition through innovation and service, instead of hostage-taking. If we want to delete our data, let us. It’s our data.
6. Simple settings: If we want to change something, let us. Use intuitive, standard language. Put settings in logical places. Give us a “maximize privacy settings” button and a “delete my account” button.
7. Be treated as a community, not a data set: We join communities because we like them, not “like” them. Advertise to your community if you want. But don’t sell our data out from under us.
Everyone is right to hold Facebook’s feet to the fire as long as they fail to meet those guidelines. But let’s not make privacy a third-rail issue, pillorying any company that makes a mistake on the privacy front. If we do that, we’ll never get the innovation we need to solve the thorny nest of issues around privacy and data ownership that are intrinsic to the network era.
We need to heed the advice of management gurus Tom Peters and Esther Dyson. Tom reminds us to “Fail. Forward. Fast.” Esther’s tag line is “Always make new mistakes.” With that in mind, I’m willing to cut Facebook some slack. For now.