We want to share. We want to buy. We want help. We want to talk.
At the end of the day, though, we want to be able to go to sleep without worrying that all of those great conversations on the open web will endanger the rest of what we do.
Making the web work has always been a balancing act between enabling and forbidding, remembering and forgetting, and public and private. Managing identity, security, and privacy has always been complicated, both because of the challenges in each of those pieces and the tensions among them.
Complicating things further, the web has succeeded in large part because people — myself included — have been willing to lock their paranoias away so long as nothing too terrible happened.
I talked for years about expecting that the NSA was reading all my correspondence, but finding out that yes, indeed they were filtering pretty much everything, opened the door to a whole new set of conversations and concerns about what happens to my information. I made my home address readily available in an IETF RFC document years ago. In an age of doxxing and SWATting, I wonder whether I was smart to do that. As the costs move from my imagination to reality, it’s harder to keep the door to my paranoia closed.
Identity used to seem easy — I am me and you are you, right? Get us some accounts, preferably quickly, and all will be well. OAuth seemed like a great way to concentrate account-making in places where people thought it was worth the trouble. I thought anonymity was useful for some key cases — whistleblowers in particular — but otherwise a mistake. I’ve avoided using anonymity myself, as the temptations are too great. “Real name” policies felt like overkill to me, but I could see the point. Permanent accounts created with minimum friction seemed like the right way to go.
Today, when identity is the point the snoops use to track you and quick accounts are the tools trolls use to make attacks look bigger than they are, it’s harder to tell any of those stories. Real name policies cause real harm to many people, and that’s before the NSA comes looking. The business web’s seemingly perpetual goal of minimizing transactional friction creates its own problems around identity. That goal combines with a lack of interest in policing conversations (perhaps changing as I write?) to create sock puppet festivals of abuse.
Security has never seemed easy, but the sheer exhaustion of trying to keep up with it has meant that non-specialists have long had to hope that someone else was helping them. HTML itself can’t do much harm, but all the surrounding technologies of the web have weak spots. The NSA was our friend, helping develop SELinux and cryptographic tools for protecting information. Black Hats and White Hats fought it out, while most hoped that updating our infrastructure regularly would keep us safe…enough.
Enough is a hard word. The NSA idea didn’t work out too well. Relying on a small core of developers to create secure code that is widely used didn’t work out so well, either, and the bugs keep coming in both open source and proprietary software. Starting over is hard, especially in a world where we are “faced with the hilariously terrifying possibility that technology is now moving so fast that the US government can no longer distinguish a rogue state with nuclear weapons from a gang of trolls in it for the lulz.” Well, maybe not hilarious.
Web security used to seem like a programmer’s problem. Programmers dealt with cryptography and other libraries, while the rest of us simply applied their code, added some passwords and certificates, and off we went. The last decade has taught us that programming is part of the story, but so are user interface, database administration, network management, and identity management. Passwords no longer seem like a good answer. Certificate authorities have proven expensive, inconvenient, and fragile, though perhaps the EFF and friends can help with that this year.
Privacy combines the challenges of identity and security and asks what we want to do with the information attached to identities behind secure walls. We can mock privacy, or suggest that “This word, privacy? — it’s a problem.” Many tech folks see privacy’s decline as inevitable. Humans outside the tech world, though — and many inside it — still find it a useful concept, and aren’t necessarily “data-positive.”
“Data-negativity” may peak in a country where left, right, and center were all spied upon, with terrible results and eventual acknowledgment of the disaster: Germany. I visited twice last year, once for the amazing LinuxTag conference, and once to visit family. I wasn’t surprised by conversations about security and privacy at LinuxTag, but they weren’t limited to the show. Both technically hyper-literate and less technical folks wondered what I, as an American technologist, could be thinking of Silicon Valley’s lack of interest in their story. Die Zeit ran an extended piece on “Die Vereinigten Staaten von Google” — “The United States of Google” — trying to explain where Silicon Valley’s attitudes and approaches came from.
I should be able to explain all that — after all, I wrote a book on Cookies long ago. I was one of the people who thought tools were neutral, and the benefits of tracking (at least for people writing applications, my primary audience) vastly outweighed the costs. Unfortunately, especially given the Snowden revelations, I can’t. Commercial and government surveillance come with costs we haven’t begun to estimate, again trying to leave our paranoia locked in a separate room while we reduce friction on the web.
Security and identity failures compromise privacy quickly, but even when those two pieces are working smoothly, we still have to figure out how privacy should really work.
Programmers are capable of creating code that forgets information or encapsulates it. We deliberately lose credit card information all the time, though maybe not as frequently as we should. Snapchat has demonstrated the business viability of models that forget data, and businesses are taking up those options, though there always seem to be people and companies that want to enforce memory. Many retention policies are shifting from “keep all email forever” to “incinerate after 90 days whenever possible, because you never know what a hostile prosecutor or public can find.” I’m also intrigued by frameworks that emphasize transparency, unlinkability, and intervenability.
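Code that forgets can be surprisingly simple. As a minimal sketch (not any particular company's implementation), here's a hypothetical key-value store in Python that enforces a retention window — the `ttl_seconds` parameter stands in for a policy like "incinerate after 90 days":

```python
import time

class ForgetfulStore:
    """A key-value store that deliberately forgets.

    Entries older than `ttl_seconds` are dropped when accessed,
    mimicking a retention policy (90 days would be 90 * 24 * 3600).
    This is an illustrative sketch, not a production design.
    """

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._data = {}  # key -> (value, stored_at timestamp)

    def put(self, key, value):
        # Record when the data arrived so its age can be checked later.
        self._data[key] = (value, time.time())

    def get(self, key):
        entry = self._data.get(key)
        if entry is None:
            return None
        value, stored_at = entry
        if time.time() - stored_at > self.ttl:
            # Past the retention window: incinerate rather than return.
            del self._data[key]
            return None
        return value

# Demonstration with a very short window so expiry is observable.
store = ForgetfulStore(ttl_seconds=0.1)
store.put("email:42", "message body")
assert store.get("email:42") == "message body"  # still within the window
time.sleep(0.2)
assert store.get("email:42") is None  # forgotten after the window passes
```

A real system would also have to forget backups, logs, and caches — which is exactly why "whenever possible" does so much work in those retention policies.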
“Keep me safe” means different things to different people. The questions involved definitely go beyond the web, but the web is the visible surface of technology. We need to figure out how to balance reducing transactional friction with minimizing surprise. That surprise won’t always be a SWAT team at the door or strange credit card charges, fortunately. We need to address these questions — at the technology level, in our own sites and apps, and in our conversations with the outside world.
Editor’s note: If you’re interested in learning more about the tools, processes, and policies shaping online privacy today, check out The Architecture of Privacy, by Courtney Bowman, Ari Gesher, John K. Grant, and Daniel Slate.
This post is part of our exploration into web privacy and security.