Skinner Box? There's an App for That

If you are reading this post it means that after countless misfires, I finally kept my attention focused long enough to finish it. That may seem like no big deal, a mere trifling effort, but I’m basking in the moment. In fact, I’ll probably tweet it.

It didn’t start out to be about digital Skinner boxes. It was a Radar backchannel email about the infamous Web 2.0 Expo Twitterfall incident. I got all curmudgeonly and ranted about continuous partial attention, Twitter as a snark amplifier, and the “Ignite’ification” of conferences (with apologies to Brady). In short, I demonstrated myself unfit to contribute to a blog called Radar.

I swear I’m not a Luddite. I’m not moving to Florida to bitch about the government full time and I’m not in some remote shack banging this out on an ancient Underwood. However, I guess I count myself among the skeptics when it comes to the unmitigated goodness of progress. Or at least its distant cousin, trendiness.

Anyway, I sent the email, inexplicably Jesse said “post!”, and I tried reworking it. I still am. This piece has been grinding away like sand in my cerebral gears ever since, and along the way it has become about something else.

In The Anthologist, Nicholson Baker describes writing poetry as the process of starting with a story and building a poem around it. I try to do that with photography and build pictures around narrative and metaphor. After the work takes shape the story is carved back out and what remains hints at the story’s existence, like a smoke ring without the mouth.

He says it better: “If you listen to them, the stories and fragments of your stories you hear can sometimes slide right into your poem and twirl around in it. Then later you cut out the story and the poem has a mysterious feeling of charged emptiness, like the dog after the operation.” Don’t worry about the dog, it lived and it isn’t relevant. My point is that this post isn’t about the Twitterfall fail story, that was just a catalyst. The inchoate uneasiness still twirling around in here is what’s left of it.

This all began with these lingering questions: “Why are we conference attendees paying good money, traveling long distances, and sitting for hours in chairs narrower than our shoulders only to stare at our laptops? Why do we go to all that trouble and then spend the time Twittering and wall posting on the overwhelmed conference wifi? Or, more specifically, why are we so fascinated with our own 140 character banalities pouring down the stage curtain that we ignore, or worse, mob up on, the speakers that drew us there in the first place?”

As I kept working away on what has become this overlong post, the question eventually turned into, “why the hell can’t I finish this?” This has become the post about distraction that I’ve been too distracted to complete. It’s also about ADHD and the digital Skinner box that makes it worse, narcissism’s mirror, network collectivism and the opt-in borg, and an entropic counter-argument for plugging in anyway. So, here goes…

My name is Jim, and I’m a digital stimulusaholic

A few weeks ago I was watching TV from across the room in the airport and I couldn’t hear the sound. The missing sound track made the cuts more obvious so I timed them. They averaged about 1.5 seconds, ranging from roughly a quarter of a second to at most three. The standard deviation was pretty tight but there was plenty of random jitter and the next cut was always a surprise. Even during the shortest clips the camera zoomed or panned (or both). Everything was always in motion, like a drunk filming dancers. Even though I’ve known this was the trend for a while, it surprised me. Without the dialog to provide continuity it was disconcerting and vertigo inducing.

In his book In the Blink of an Eye, Walter Murch describes movie editing as a process akin to adding eye blinks where they naturally belong so that film works for the brain like a dream. It’s a lovely book, by the way. I think these frenetic transitions alter how we experience that film-induced dream state, and for me at least, they can make film feel like a nightmare during exam week. Unfortunately, much of my daily experience mirrors this new cultural reality.

Before I discovered Twitter I used to joke that coffee was at the root of a personal productivity paradox. I could drink it and stay alert while wearing a path in the carpet back and forth to the men’s room. Or I could stay stimulant free and sleep at my desk. That was a joke, but the information-sphere we live in now really is like that. I can drink liberally from the fire hose and stimulate my intellect with quick-cutting trends, discoveries, and memes, but struggle to focus. Or I can sign off, deactivate, and opt out; focus blissfully and completely on the rapidly aging and increasingly entropic contents of my brain, and maybe finish stuff. Stuff of rapidly declining relevance.

We have such incredible access to information; I just wish it wasn’t so burdened with this payload of distraction. Also, I wish my brain wasn’t being trained to need these constant microbursts of stimulation.

Email was the first electronic medium to raise my clock speed, and also my first digital distraction problem. After some “ding, you have mail,” I turned off the BlackBerry notification buzz, added rationing to my kit bag of coping strategies, and kept on concentrating. Then RSS came along and it was like memetic crystal meth. The pursuit of novelty in super-concentrated form, delivered like the office coffee service. Plus, no one had to worry about all that behind-the-counter pseudoephedrine run-around. “Hey, read as much as you want, no houses were blown up in Indiana to make your brain buzz.”

It was a RUSH to know all this stuff, and to know it soonest; but it came like a flood. That unread counter was HARD to keep at zero and there was always one more blog to add. Read one interesting post and you’re stuck with the feed forever. In time, keeping up with my RSS reader came to be like Lucy in the chocolate factory with the conveyor belt streaming by. From my vantage point today, RSS seems quaint. The good old days. I gave it up for good last year when I finally bought an iPhone and tapped Twitter straight into the vein. Yeah, I went real time.

Now I can get a hit at every stop light. Between previews at the movies. Waiting for the next course at a restaurant. While you are talking to me on a conference call (it’s your fault, be interesting). When you look down at dinner to check yours. Last thing before I go to sleep. The moment I wake up. Sitting at a bar. Walking home. While opening presents on Christmas morning (don’t judge me, you did it too). In between the sentences of this paragraph.

I am perfectly informed (I will know it before it hits the New York Times home page) and I’m utterly distracted.

Here are just a few of the things I learned yesterday while I was working on this post. Scientists are tracking malaria with cell phone data, there is an open source GSM base station project, I need to bend over (and touch my toes) more, WWII 8th Air Force bomber crews had brass ones (seriously, read this pdf), Erik Prince is probably graymailing the CIA, and electric motorcycles seem to be on the verge of being popular.

So here I am at the nexus of ADHD, AMS*, and digital Narcissism. I’m in a Skinner box alright, but I don’t smack the bar and wait for pellets, I tweet into the void and listen for echoes. There it is now, that sweet sweet tweet of instant 140 char affirmation. Feels good. RT means validation. I think I’m developing a Pavlovian response to the @ symbol that borders on the sexual.

And I remember to give RT love too. Even if the tweet didn’t really grab me as much as I let on. After all, you have to grease the machine to keep its pellet chute clear. Give to get. I won’t RT cheeky_geeky though, gotta draw the line somewhere. No preferential attachment help from this tweeter. Better to RT the ones that really need it; they’ll be more grateful and they’ll come through later when I’m jonesing hard for 140 characters of meaningful interaction.

And Twitterfall! I’ve only experienced it once, but holy shit, it’s a Skinner Box spitting champagne truffles. It’s real time plus real place. Back channel my ass, this is narcissism’s mirror mirror on the wall, who’s the twitteringest mofo of them all? And it’s big. Don’t have to wait for the echo, I can see it right there! And so can everyone else. A perfect cybernetic feedback loop of self. A self licking ice cream cone of the mind. I didn’t know it till I experienced Twitterfall, but ASCII echo isn’t enough. We’re still flesh with pumping hearts after all and we want to feel the response. Listen to them shift in their seats as my last twitticism wends its way down the wall. Slow down you bastards, let it hang there a bit, it was a good one. Hear that? Yeah, they saw it.

This brave new inter-networked, socially-mediated, post-industrial, cybernetically-interwoven world is an integrated web of Pavlovian stimulus and response and I’m barking at the bell. Turns out, this isn’t a Skinner Box. No, “box” is too confining for this metaphor. This is a fully networked, digitally rendered, voluntarily joined Skinner Borg. It doesn’t embed itself in us, we embed ourselves in it. It’s Clockwork Orange, self served.

For the last couple of years I’ve jacked in to this increasing bit rate of downloadable intellectual breadth and I’ve traded away the slow conscious depth of my previous life. And you know what? Now I’m losing my self. I used to be a free standing independent cerebral cortex. My own self. But not any more. Now I’m a dumb node in some uber-net’s basal ganglia. Tweet, twitch, brief repose; repeat. My autonomic nervous system is plugged in, in charge, and interrupt ready while the gray wrinkly stuff is white knuckled from holding on.

The singularity is here, and it’s us… also it’s dumb, snarky, and in love with itself.

Everyone is worried that the singularity will be smart, I’m worried that it will be dumb, with a high clock speed. Any dumb ass can beat you at chess if it gets ten moves to your one. In fact, what if the singularity already happened, we are its neurons, and it’s no smarter than a C. elegans worm? Worse, after the Twitterfall incident, I’m worried about what it will do when it discovers its motor neural pathways.

The human brain is brilliance derived from dumb nerves. Out of those many billions of simple connections came our Threshold of Reflection and everything that followed. But consciousness is going meta and we’re being superseded by a borg-like singularity; intelligence turned upside down. Smart nodes subordinating ourselves to a barely conscious, #fail-obsessed network. It’s dumb as a worm, fast as a photomultiplier tube, and ready to rage on at the slightest provocation. If you’re on stage (or build a flawed product, or ever ever mention politics), watch out.

We don’t plan to go mob rules any more than a single transistor on your computer intends to download porn. We participate in localized stimulus and response. Macro digital collectivism from local interaction. Macro sentiment from local pellet bar smacking.

We’re pre-implant so I plug into the Skinner Borg with fingers and eyes that are low bandwidth synapses. When I try to unplug (or when I’m forced to in an airplane at altitude), my fingers tingle and I feel it still out there. I’m a stimulus seeking bundle of nerves. I experience the missing network like a phantom limb.

So where’s this going? Like I said, I’m not a Luddite but I’m no Pollyanna Digitopian either. Age of spiritual machines? Whatever. Show me spiritual people. When the first machine or machine-assisted meta-consciousness arrives on the scene, it’s going to be less like the little brother that you played Battleship with and more like a dumb digital version of poor Joe from Johnny Got His Gun. Barely sentient but isolated from sensation. Do we think that a fully formed functional consciousness is going to spring to life the first time sufficient processing power is there to enable it? I’m not worried about it replicating and taking over the world, I’m worried about it going completely bat shit crazy and stumbling around breaking stuff in an impotent rage.

My Dilemma, Evolution, and Entropy

All this talk of borgs, singularities, and addiction doesn’t address my very real and right now dilemma. The world is changing and we all have to keep up. Mainlining memes is AWESOME for that, but at what cost? It’s a bargain that I’m trying not to see as Faustian.

We don’t have parallel ports so we have to choose. Lots of bite sized pellets or slow down and go deep? Frenetic pursuit of the novel or quiet concentration? Can I stay plugged in without giving up my ability to focus? I don’t want to be a donor synapse to the worm and I don’t want to have to intravenously drip Adderall to cope.

At root, this is a question of breadth vs. depth and finding the right balance. This conversation was started by a conference. Organizers have to choose too, and they base their choices on what they think we prefer. Do we want to listen to Sandy Pentland for an hour and come away with a nuanced understanding of his work on honest signals, or would we rather have six twitter-overlaid ten minute overviews in the same hour? Are we looking for knowledge? Or suggestions of what to investigate more deeply later (assuming we can find some “later” to work with)? Can we sit still for an hour even if we want to?

We humans and our organizations are open dissipative systems, evolving memetically in the information realm and genetically on intergenerational time scales. Living organisms beat entropy by remaining in a perpetual state of disequilibrium – they absorb energy from their environment and exhaust disorder back into it. The greater their disequilibrium, the more energy is required to maintain an internally ordered state, but paradoxically, the more adaptive they are to changing surroundings.
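A compact way to state that claim is the textbook entropy balance for an open system (this is the standard Prigogine-style decomposition from non-equilibrium thermodynamics, not anything from the post itself):

\[
\frac{dS}{dt} \;=\; \frac{d_i S}{dt} + \frac{d_e S}{dt}, \qquad \frac{d_i S}{dt} \ge 0
\]

Internal entropy production, the d_iS term, never goes negative; a living system stays ordered only by making the exchange term d_eS sufficiently negative, exporting disorder to its surroundings faster than it produces it. The deeper the disequilibrium, the bigger that export bill, which is the energy cost the paragraph above is pointing at.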

If we work in a domain in flux we require a higher rate of information consumption to maintain our ability to adapt while preserving an internally ordered state. The U.S. Army is experiencing this now as it tries to adapt to the difference between a Fulda Gap standoff and the current counterinsurgency mission. Moving from a period of relative stasis to a tighter evolutionary time scale, it’s adapt or lose. As a learning organization, its emphasis has to shift from transmitting and conforming to existing knowledge toward consuming and processing new knowledge.

The pursuit of novelty in this context isn’t just fun; it’s the foundation for a stocked library of adaptive schemata that support intellectual evolution. Since memes and the seeds of synthesis can come in compact packages, a broad, fast, and shallow headspace can work in times of rapid change. This isn’t just an argument for fast-paced conferences with lots of breadth; it also explains why Twitter, RSS feeds, and broad weakly-connected social networks (e.g. Foo) are so valuable. It’s also one of the arguments I make to enterprises like the DoD for why they should promote rather than discourage social media use.

However, I don’t think the evolutionary / entropic argument is the only one in play. The cultural and cognitive domains are relevant too, and speaking personally, I feel like I’m bumping hard up against some relevant limits. My memetic needs are increasing faster than genetic barriers can evolve. Obviously, in the cultural domain we are becoming more accustomed to fast paced transitions and partial attention. However, anecdotally it seems like I’m not the only one wondering about the impact of all this stuff. Early signals are popping up in the strangest places. When I attended the Gov 2.0 Summit more than one participant commented that the fast paced format was intellectually exhausting.

By nature I’m an abstainer more than a moderator. It’s hard for me to limit a behavior by doing just a little bit of it. Just check the empty quart ice cream container in my trash can if you doubt me. So, frankly, I am stumped on what to do. I simply don’t know how to proceed in a way that will keep the information flow going but in a manner that doesn’t damage my ability to produce work of my own.

Which Singularity?

In the early years of the last century the Dadaists observed America’s technological progress from their Parisian perch and recoiled artistically from the dehumanizing eruptions of concrete and steel in the machine age capital of Manhattan. Paintings like Picabia’s Universal Prostitution were comments on how our culture (and perhaps selves) seemed to be merging with machines. Having observed the early machine age’s ascendance first hand, Duchamp would have understood our uneasy fascination with the singularity.

I’m trapped in a cybernetic feedback loop. That much is clear. However, these loops operate at different scales in both time and space and maybe Twitter itself solves at least one larger conundrum. As we join the Skinner Borg in droves, and our ability to concentrate is compromised at scale, who is going to develop the technologies that evolve the worm?

When astrophysicists use the term “singularity” they mean the infinitely dense point at a black hole’s center, but the image I keep coming back to is the event horizon: that edge where gravity just balances light’s ability to escape. Along the surface of that sphere, some distance out from the singularity itself, light just hangs there in perpetual stop motion.

The very technology that makes our collective integration possible also distracts us from advancing it. In equilibrium, distraction and ambition square off at the singular point of failed progress. If the next generation of Moores, Joys, and Kurzweils are half as distracted as I am, we are going to find ourselves frozen right here, nodes in a wormy borg that never becomes a butterfly. (yeah, I know, worms don’t become butterflies, but I’m desperate to finish…). Anyway, maybe Twitter is just God’s way of making sure we never manage to finish creating our future oppressor.

p.s. There really is an app for that.

*AMS = Afraid to miss something
