This post is part three of a series raising questions about the mass adoption of social technologies. Here are links to part one and two. These posts will be opened to live discussion in an upcoming webcast on May 27 (special guest to be announced shortly).
In 1785 the utilitarian philosopher Jeremy Bentham proposed architectural plans for the Panopticon, a prison Bentham described as “a new mode of obtaining power of mind over mind, in a quantity hitherto without example.” Its method was a circular grid of surveillance: jailors housed in a central tower were given a 360-degree view of the imprisoned, while prisoners could never tell whether a jailor was actually watching. The premise was that under the possibility of total surveillance — since you might be observed at any moment of the waking day — the prisoners would self-regulate their behavior to conform to prison norms. The perverse genius of the Panopticon was that even the jailor existed within this grid of surveillance; he could be viewed at any time, without knowing it, by a still higher authority within the central tower. So the circle was complete, and the surveillance — and thus conformance to authority — total.
In 1811 the King refused to authorize the sale of land for the purpose, and Bentham was left frustrated in his vision to build the Panopticon. But the concept endured — not just as a literal architecture for controlling physical subjects (many prisons now bear Bentham’s stamp), but as a metaphor for understanding the function of power in modern times. The French philosopher Michel Foucault dedicated a whole section of his book Discipline and Punish to the significance of the Panopticon. His take was essentially this: the same mechanism at work in the Panopticon — making subjects totally visible to authority — leads those subjects to internalize the norms of power. In Foucault’s words, “…the major effect of the Panopticon: to induce in the inmate a state of conscious and permanent visibility that assures the automatic functioning of power. So to arrange things that the surveillance is permanent in its effects, even if it is discontinuous in its action; that the perfection of power should tend to render its actual exercise unnecessary.” In short, under the possibility of total surveillance the inmate becomes self-regulating.
The social technologies we see in use today are fundamentally panoptical – the architecture of participation is inherently an architecture of surveillance.
In the age of social networks we find ourselves coming under a vast grid of surveillance — of permanent visibility. The routine self-reporting of what we are doing, reading, and thinking via status updates makes our every action and location visible to the crowd. This visibility has a normative effect on behavior: we conform our behavior, or our speech about that behavior, when we know we are being observed.
In many cases we are opting into automated reporting structures (Google Latitude, Loopt, etc.) that detail our location at any given point in time. We are doing this in exchange for small conveniences (finding local sushi more quickly, gaining “ambient intimacy”) without ever considering the bargain that we are striking. In short, we are creating the ultimate Panopticon: with our data centrally housed in the cloud (see the previous post on the Captivity of the Commons), our every movement and up-to-the-minute status is a matter of public record. In the same way that networked communications moved us from a one-to-many broadcast model to a many-to-many one, so we are seeing the move to a many-to-many surveillance model — a global community of voyeurs ceaselessly confessing to “What are you doing?” (Twitter) or “What’s on your mind?” (Facebook).
Captivity of the Commons focused on the risks of corporate ownership of personal data. This post is concerned with how, as individuals, we have grown comfortable giving our information away; how our sense of privacy is changing under the small conveniences that disclosure brings; how our identity changes as an effect of constant self-disclosure. Many previous comments have rightly noted that privacy is often cultural — if you don’t expect it, there is no such thing as an infringement. Yet it is important to reckon with the changes we see occurring around us and to argue for what kind of culture we wish to create (or contribute to).
Jacques Ellul’s book Propaganda had a thesis that was at once startling and obvious: propaganda’s end goal is not to change your mind at any one point in time, but to create a changeable mind. Thus, when invoked at the necessary time, humans could be manipulated into action. In the U.S. this language was expressed in catchphrases like “communism in our backyard” and “enemies of freedom,” or the current manufactured hysteria about Obama as a “socialist.”
Similarly, the significance of status updates and location-based services may lie not in any individual disclosure but in a culture that has become accustomed to constant disclosure.