
Skinner Box? There's an App for That

If you are reading this post it means that after countless misfires, I finally kept my attention focused long enough to finish it. That may seem like no big deal, a mere trifling effort, but I’m basking in the moment. In fact, I’ll probably tweet it.

It didn’t start out to be about digital Skinner boxes. It was a Radar backchannel email about the infamous Web 2.0 Expo Twitterfall incident. I got all curmudgeonly and ranted about continuous partial attention, Twitter as a snark amplifier, and the “Ignite’ification” of conferences (with apologies to Brady). In short, I demonstrated myself unfit to contribute to a blog called Radar.

I swear I’m not a Luddite. I’m not moving to Florida to bitch about the government full time and I’m not in some remote shack banging this out on an ancient Underwood. However, I guess I count myself among the skeptics when it comes to the unmitigated goodness of progress. Or at least its distant cousin, trendiness.

Anyway, I sent the email, inexplicably Jesse said “post!”, and I tried reworking it. I still am. This piece has been grinding away like sand in my cerebral gears since, and along the way it has become about something else.

In The Anthologist, Nicholson Baker describes writing poetry as the process of starting with a story and building a poem around it. I try to do that with photography and build pictures around narrative and metaphor. After the work takes shape the story is carved back out and what remains hints at the story’s existence, like a smoke ring without the mouth.

He says it better: “If you listen to them, the stories and fragments of your stories you hear can sometimes slide right into your poem and twirl around in it. Then later you cut out the story and the poem has a mysterious feeling of charged emptiness, like the dog after the operation.” Don’t worry about the dog, it lived and it isn’t relevant. My point is that this post isn’t about the Twitterfall fail story, that was just a catalyst. The inchoate uneasiness still twirling around in here is what’s left of it.

This all began with these lingering questions: “Why are we conference attendees paying good money, traveling long distances, and sitting for hours in chairs narrower than our shoulders only to stare at our laptops? Why do we go to all that trouble and then spend the time Twittering and wall posting on the overwhelmed conference wifi? Or, more specifically, why are we so fascinated with our own 140 character banalities pouring down the stage curtain that we ignore, or worse, mob up on, the speakers that drew us there in the first place?”

As I kept working away on what has become this overlong post, the question eventually turned into, “why the hell can’t I finish this?” This has become the post about distraction that I’ve been too distracted to complete. It’s also about ADHD and the digital Skinner box that makes it worse, narcissism’s mirror, network collectivism and the opt-in borg, and an entropic counter-argument for plugging in anyway. So, here goes…

My name is Jim, and I’m a digital stimulusaholic

A few weeks ago I was watching TV from across the room in the airport and I couldn’t hear the sound. The missing soundtrack made the cuts more obvious, so I timed them. They averaged about 1.5 seconds, ranging from about a quarter of a second to at most three. The standard deviation was pretty tight, but there was plenty of random jitter and the next cut was always a surprise. Even during the shortest clips the camera zoomed or panned (or both). Everything was always in motion, like a drunk filming dancers. Even though I’ve known this was the trend for a while, it surprised me. Without the dialog to provide continuity it was disconcerting and vertigo-inducing.
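For what it’s worth, the arithmetic was nothing fancier than this sketch; the timings below are invented for illustration, not my actual stopwatch numbers:

    # Back-of-the-envelope stats on hand-timed shot lengths.
    # The data here is made up; substitute your own stopwatch numbers.
    import statistics

    cut_lengths = [1.2, 0.9, 1.7, 2.4, 0.3, 1.5, 1.1, 2.9, 1.6, 1.4]  # seconds

    print(f"mean: {statistics.mean(cut_lengths):.2f}s")
    print(f"stdev: {statistics.stdev(cut_lengths):.2f}s")
    print(f"range: {min(cut_lengths)}s to {max(cut_lengths)}s")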

In his book In the Blink of an Eye, Walter Murch describes movie editing as a process akin to adding eye blinks where they naturally belong so that film works for the brain like a dream. It’s a lovely book, by the way. I think these frenetic transitions alter how we experience that film-induced dream state, and for me at least, can make film feel like a nightmare during exam week. Unfortunately, much of my daily experience mirrors this new cultural reality.

Before I discovered Twitter, I used to joke that coffee was at the root of a personal productivity paradox. I could drink it and stay alert while wearing a path in the carpet back and forth to the men’s room. Or I could stay stimulant-free and sleep at my desk. That was a joke, but the information-sphere that we live in now is like that. I can drink liberally from the fire hose and stimulate my intellect with quick-cutting trends, discoveries, and memes, but struggle to focus. Or I can sign off, deactivate, and opt out. Then focus blissfully and completely on the rapidly aging and increasingly entropic contents of my brain, but maybe finish stuff. Stuff of rapidly declining relevance.

We have such incredible access to information, I just wish it wasn’t so burdened with this payload of distraction. Also, I wish my brain wasn’t being trained to need these constant microbursts of stimulation.

Email was the first electronic medium to raise my clock speed, and also my first digital distraction problem. After some “ding, you have mail,” I turned off the BlackBerry notification buzz, added rationing to my kit bag of coping strategies, and kept on concentrating. Then RSS came along and it was like memetic crystal meth. The pursuit of novelty in super-concentrated form, delivered like the office coffee service. Plus, no one had to worry about all that behind-the-counter pseudoephedrine runaround. “Hey, read as much as you want, no houses were blown up in Indiana to make your brain buzz.”

It was a RUSH to know all this stuff, and know it soonest; but it came like a flood. That unread counter was HARD to keep at zero and there was always one more blog to add. Read one interesting post and you’re stuck with that blog forever. In time, keeping up with my RSS reader came to be like Lucy in the chocolate factory with the conveyor belt streaming by. From my vantage point today, RSS seems quaint. The good old days. I gave it up for good last year when I finally bought an iPhone and tapped Twitter straight into the vein. Yeah, I went real time.

Now I can get a hit at every stop light. Between previews at the movies. Waiting for the next course at a restaurant. While you are talking to me on a conference call (it’s your fault, be interesting). When you look down at dinner to check yours. Last thing before I go to sleep. The moment I wake up. Sitting at a bar. Walking home. While opening presents on Christmas morning (don’t judge me, you did it too). In between the sentences of this paragraph.

I am perfectly informed (I will know it before it hits the New York Times home page) and I’m utterly distracted.

Here are just a few of the things I learned yesterday while I was working on this post. Scientists are tracking malaria with cell phone data, there is an open source GSM base station project, I need to bend over (and touch my toes) more, WWII 8th Air Force bomber crews had brass ones (seriously, read this pdf), Erik Prince is probably graymailing the CIA, and electric motorcycles seem to be on the verge of being popular.

So here I am at the nexus of ADHD, AMS*, and digital Narcissism. I’m in a Skinner box alright, but I don’t smack the bar and wait for pellets, I tweet into the void and listen for echoes. There it is now, that sweet sweet tweet of instant 140 char affirmation. Feels good. RT means validation. I think I’m developing a Pavlovian response to the @ symbol that borders on the sexual.

And I remember to give RT love too. Even if the tweet didn’t really grab me as much as I let on. After all, you have to grease the machine to keep its pellet chute clear. Give to get. I won’t RT cheeky_geeky though, gotta draw the line somewhere. No preferential attachment help from this tweeter. Better to RT the ones that really need it; they’ll be more grateful and they’ll come through later when I’m jonesing hard for 140 characters of meaningful interaction.

And Twitterfall! I’ve only experienced it once, but holy shit, it’s a Skinner Box spitting champagne truffles. It’s real time plus real place. Back channel my ass, this is narcissism’s mirror mirror on the wall, who’s the twitteringest mofo of them all? And it’s big. Don’t have to wait for the echo, I can see it right there! And so can everyone else. A perfect cybernetic feedback loop of self. A self licking ice cream cone of the mind. I didn’t know it till I experienced Twitterfall, but ASCII echo isn’t enough. We’re still flesh with pumping hearts after all and we want to feel the response. Listen to them shift in their seats as my last twitticism wends its way down the wall. Slow down you bastards, let it hang there a bit, it was a good one. Hear that? Yeah, they saw it.

This brave new inter-networked, socially-mediated, post-industrial, cybernetically-interwoven world is an integrated web of Pavlovian stimulus and response and I’m barking at the bell. Turns out, this isn’t a Skinner Box. No, “box” is too confining for this metaphor. This is a fully networked, digitally rendered, voluntarily joined Skinner Borg. It doesn’t embed itself in us, we embed ourselves in it. It’s Clockwork Orange, self served.
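If you want the mechanism reduced to its essentials, it’s a variable-ratio reinforcement schedule: reward the response unpredictably and it becomes nearly impossible to extinguish. Here’s a toy sketch of the loop, with an invented 30% hit rate standing in for the @-replies and RTs (nothing here is a real API):

    # Toy model of a variable-ratio reward schedule, the kind Skinner
    # found produces the most persistent bar smacking. The hit rate is
    # invented for illustration.
    import random

    def check_the_stream(checks=25, hit_rate=0.3):
        since_last_pellet = 0
        for _ in range(checks):
            since_last_pellet += 1
            if random.random() < hit_rate:  # an @ or RT arrives, unpredictably
                print(f"pellet! (after {since_last_pellet} checks)")
                since_last_pellet = 0  # gratified; the craving resets

    check_the_stream()

Run it a few times: the pellets arrive just often enough, and just unpredictably enough, that the checking never stops.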

For the last couple of years I’ve jacked in to this increasing bit rate of downloadable intellectual breadth and I’ve traded away the slow conscious depth of my previous life. And you know what? Now I’m losing my self. I used to be a free standing independent cerebral cortex. My own self. But not any more. Now I’m a dumb node in some uber-net’s basal ganglia. Tweet, twitch, brief repose; repeat. My autonomic nervous system is plugged in, in charge, and interrupt ready while the gray wrinkly stuff is white knuckled from holding on.

The singularity is here, and it’s us… also it’s dumb, snarky, and in love with itself.

Everyone is worried that the singularity will be smart; I’m worried that it will be dumb, with a high clock speed. Any dumb ass can beat you at chess if it gets ten moves to your one. In fact, what if the singularity already happened, we are its neurons, and it’s no smarter than a C. elegans worm? Worse, after the Twitterfall incident, I’m worried about what it will do when it discovers its motor neural pathways.

The human brain is brilliance derived from dumb nerves. Out of those many billions of simple connections came our Threshold of Reflection and everything that followed. But consciousness is going meta and we’re being superseded by a borg-like singularity; intelligence turned upside down. Smart nodes subordinating ourselves to a barely conscious #fail-obsessed network. It’s dumb as a worm, fast as a photomultiplier tube, and ready to rage on at the slightest provocation. If you’re on stage (or build a flawed product, or ever ever mention politics), watch out.

We don’t plan to go mob rules any more than a single transistor on your computer intends to download porn. We participate in localized stimulus and response. Macro digital collectivism from local interaction. Macro sentiment from local pellet bar smacking.

We’re pre-implant so I plug into the Skinner Borg with fingers and eyes that are low bandwidth synapses. When I try to unplug (or when I’m forced to in an airplane at altitude), my fingers tingle and I feel it still out there. I’m a stimulus seeking bundle of nerves. I experience the missing network like a phantom limb.

So where’s this going? Like I said, I’m not a Luddite but I’m no Pollyanna Digitopian either. Age of spiritual machines? Whatever. Show me spiritual people. When the first machine or machine-assisted meta-consciousness arrives on the scene, it’s going to be less like the little brother that you played Battleship with and more like a dumb digital version of poor Joe from Johnny Got His Gun. Barely sentient but isolated from sensation. Do we think that a fully formed functional consciousness is going to spring to life the first time sufficient processing power is there to enable it? I’m not worried about it replicating and taking over the world, I’m worried about it going completely bat shit crazy and stumbling around breaking stuff in an impotent rage.

My Dilemma, Evolution, and Entropy

All this talk of borgs, singularities, and addiction doesn’t address my very real and right now dilemma. The world is changing and we all have to keep up. Mainlining memes is AWESOME for that, but at what cost? It’s a bargain that I’m trying not to see as Faustian.

We don’t have parallel ports so we have to choose. Lots of bite sized pellets or slow down and go deep? Frenetic pursuit of the novel or quiet concentration? Can I stay plugged in without giving up my ability to focus? I don’t want to be a donor synapse to the worm and I don’t want to have to intravenously drip Adderall to cope.

At root, this is a question of breadth vs. depth and finding the right balance. This conversation was started by a conference. Organizers have to choose too, and they base their choices on what they think we prefer. Do we want to listen to Sandy Pentland for an hour and come away with a nuanced understanding of his work on honest signals, or would we rather have six Twitter-overlaid ten-minute overviews in the same hour? Are we looking for knowledge? Or suggestions of what to investigate more deeply later (assuming we can find some “later” to work with)? Can we sit still for an hour even if we want to?

We humans and our organizations are open dissipative systems, evolving memetically in the information realm and genetically on intergenerational time scales. Living organisms beat entropy by remaining in a perpetual state of disequilibrium – they absorb energy from their environment and exhaust disorder back into it. The greater their disequilibrium, the more energy is required to maintain an internally ordered state, but paradoxically, the more adaptive they are to changing surroundings.
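For the thermodynamically inclined, that’s roughly Prigogine’s entropy balance for open systems; standard textbook material, nothing original here:

    \frac{dS}{dt} = \frac{d_i S}{dt} + \frac{d_e S}{dt}, \qquad \frac{d_i S}{dt} \ge 0

The first term is internal entropy production, which the second law keeps non-negative; the second is entropy exchanged with the environment. A living system stays ordered by importing energy and driving the exchange term negative enough to offset its own disorder, and the farther from equilibrium it operates, the bigger that export bill gets.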

If we work in a domain in flux, we require a higher rate of information consumption to maintain our ability to adapt while preserving an internally ordered state. The U.S. Army is experiencing this now as it tries to adapt to the difference between a Fulda Gap standoff and the current counterinsurgency mission. Moving from a period of relative stasis to a tighter evolutionary time scale, it’s adapt or lose. As a learning organization, its emphasis has to shift from transmitting and conforming to existing knowledge, to consuming and processing the new.

The pursuit of novelty in this context isn’t just fun, it is the foundation for a stocked library of adaptive schemata that support intellectual evolution. Since memes and the seeds of synthesis can come in compact packages, a broad, fast, and shallow headspace can work in times of rapid change. This isn’t just an argument for fast-paced conferences with lots of breadth; it also explains why Twitter, RSS feeds, and broad weakly-connected social networks (e.g. Foo) are so valuable. It’s also one of the arguments I make in enterprises like the DoD for why they should promote rather than discourage social media use.

However, I don’t think the evolutionary / entropic argument is the only one in play. The cultural and cognitive domains are relevant too, and speaking personally, I feel like I’m bumping hard up against some relevant limits. My memetic needs are increasing faster than genetic barriers can evolve. Obviously, in the cultural domain we are becoming more accustomed to fast paced transitions and partial attention. However, anecdotally it seems like I’m not the only one wondering about the impact of all this stuff. Early signals are popping up in the strangest places. When I attended the Gov 2.0 Summit more than one participant commented that the fast paced format was intellectually exhausting.

By nature I’m an abstainer more than a moderator. It’s hard for me to limit a behavior by doing just a little bit of it. Just check the empty quart ice cream container in my trash can if you doubt me. So, frankly, I am stumped on what to do. I simply don’t know how to proceed in a way that will keep the information flow going but in a manner that doesn’t damage my ability to produce work of my own.

Which Singularity?

In the early years of the last century the Dadaists observed America’s technological progress from their Parisian perch and recoiled artistically from the dehumanizing eruptions of concrete and steel in the machine age capital of Manhattan. Paintings like Picabia’s Universal Prostitution were comments on how our culture (and perhaps selves) seemed to be merging with machines. Having observed the early machine age’s ascendance first hand, Duchamp would have understood our uneasy fascination with the singularity.

I’m trapped in a cybernetic feedback loop. That much is clear. However, these loops operate at different scales in both time and space and maybe Twitter itself solves at least one larger conundrum. As we join the Skinner Borg in droves, and our ability to concentrate is compromised at scale, who is going to develop the technologies that evolve the worm?

When astrophysicists use the term “singularity” they mean that edge of a black hole where the decaying-from-the-center gravitational force just balances the ability of light to escape. Along the surface of that sphere, some distance out from the hole itself, light just hangs there in perpetual stop motion.

The very technology that makes our collective integration possible also distracts us from advancing it. In equilibrium, distraction and ambition square off at the singular point of failed progress. If the next generation of Moores, Joys, and Kurzweils are half as distracted as I am, we are going to find ourselves frozen right here, nodes in a wormy borg that never becomes a butterfly. (yeah, I know, worms don’t become butterflies, but I’m desperate to finish…). Anyway, maybe Twitter is just God’s way of making sure we never manage to finish creating our future oppressor.

p.s. There really is an app for that.

*AMS = Afraid to miss something

  • Linda Stone

    A couple of years ago, Dr. Mark Liponis wrote an interesting book that deserved more attention than it got. The book, Ultralongevity, suggests that there are seven things we can do in support of our health. Twitter is _not_ one of the seven things. Liponis suggests: breathing, eating, sleeping, loving, soothing, enhancing, and dancing.

    When I realized I had email apnea (temporary cessation of breath, or shallow breathing, while in front of _any_ screen), and that this had the potential to seriously impact my nervous system, emotional regulation, and cognition, I started to pay more attention to breathing, and more importantly, to how I was feeling.

    The key take away: our bodies may be smarter than our minds. Our bodymind, integrated, has true genius. Liponis’ formula is one of the clearest paths.

    I find Twitter delightful and valuable. It’s a filter, a tool that uncovers a few of the glittering gems on the web more readily than I would on my own. I miss more than I catch. I just don’t want to miss my morning walk, dancing to the Kinks, or that full exhale.

  • Dave Kaye

    Two things: 1) Socrates argued in the Phaedrus that the emerging use of writing would destroy the human mind. So we’ve been having this conversation for at least that long. 2) My life got a lot more interesting once I stopped believing I could multitask. To see something and engage in it deeply is far more satisfying than to engage shallowly in many things.

  • Aaron

    It sounds like this T-shirt was designed for you:

    Social Media Venn Diagram
    http://www.despair.com/somevedi.html

  • BmoreKarl

    Wow. I’m stunned. I read the entire post (with a digression or two into WWII bomber-speak and a break to touch my toes).

    Where do we go from here? Hey, look! build your own bamboo bicycle!!!

  • Brian Ahier

    Great post!!!

    Welcome to the future (and don’t forget to breathe :-)

  • Jennifer Pahlka

    Great post, Jim. It was long enough that I was proud of reading it all the way through with only a few interruptions ;-) and thoroughly entertaining. I fully agree about the validation of the RT. I think I became addicted to twhirl’s reply sound, exactly like Pavlov’s dogs. A little happy pill whenever you want to feel important, connected, or needed. Extra points if the RT was from Tim. ;-) I’ve been off the drug for over a month now. Got a new computer and don’t like my new Twitter client. That’s all it took. Oh, and a lot of pressure to get some stuff done. Turns out there are other distractions too, though.

    I just finished reading NurtureShock, which is basically Freakonomics for parents. One of the studies the authors discuss is a preschool curriculum that has gotten crazy positive results. It’s built around creative self-directed play but asks the kids to commit to a play scenario for a full 45 minutes at a stretch. Apparently getting kids to stick with one idea for a long time makes a massive difference in their executive function and their future academic performance. The other key is that they have to make a plan for each play scenario and then stick to it. Planning and concentration. Who’d have thought? ;-)

  • Todd Geist

    Great post! I wage much the same battle every day. Recently the movie Avatar inspired a new take on the matter for me.

    The massive amount of information flowing at us is NOT the problem. Our attachment to it is. In other words, the problem is that we get distracted by it.

    I don’t think there is a biological barrier here. The human brain seems quite capable of handling huge quantities of information. Just think about how much information my brain has to process just to type this comment. I don’t know for sure, but I would guess it’s much more than what Twitter throws at me in a whole year.

    I think there are two core problems. One, our current crop of gadgets are too crude to allow us to integrate very effectively with the web. Web integration requires too much of my attention. When my iPhone beeps, I can’t even tell if it is Twitter, or just plain SMS, or who it is. What if different patterns of vibration could signal what the source was? Or who it was? That would be a start. We need Augmented Reality.

    The second problem is much more subtle and personal. Humans are often unsatisfied with their current situation. This dissatisfaction with the present moment predisposes us towards chasing the next shiny thing that comes into view. We love to escape. And the web is the ultimate escape.

    Luckily the Buddha and others have been teaching us how to deal with this problem for a couple of thousand years, so we have something to go on there. Eckhart Tolle does a great job of describing this phenomenon in modern language in his book, “The Power of Now”. If we are going to achieve effective web integration we are going to need to learn how to not leap at every new tweet that comes into view, but rather absorb it the way a tree takes the wind. The branches sway, but the trunk remains still.

    Some say our consciousness and our sense of self is a side effect of the brain handling the massive amount of information flowing at it from its own body and immediate environment. If so, and if we are on the verge of effectively integrating with information on a global scale, what does that mean for our sense of self?

    -
    Todd Geist

    For more of my thoughts on this, see “Avatar, Earthquakes, and The Real Time Web”.

  • Tim O'Reilly

    Brilliant -

    Are you sure you’re not Thomas Pynchon?

    This post belongs in a book somewhere. We’ll have to create a book to preserve it!

  • Frank Gilroy

    I often struggle to find the balance you’re looking for. I’ve been thinking a lot about discipline lately. I recently read a definition of discipline I liked: basically, that “discipline is the ability to put off what we want in the moment for something we want in the future”. Of course the philosopher in me then goes on to wonder, “What do I want in the future?”, “What do I want to be when I grow up?”, “What am I here for?”

    This is the rabbit hole I’m going down right now. Figure out what your core values are, what you really want people saying about you at your funeral (if you’re in the majority of people, unlike Kurzweil, who still believe in death), and use that to find discipline and then find balance.

  • Nicolau Werneck

    This post has just too many characters to express too-little-if-any ideas. I’m afraid I wouldn’t “get it” even if it had just 140 characters. Is it just a singular high-entropy post-Luddite dadaistic clueless college collage?

  • Kathy

    Fantastic post and it took me forever to get through because I kept having to check the incoming stream on both of my Twitter accounts. Meanwhile, there were videos loading from the facebook stream and email to catch up on. AND I only get 1 mb from the wireless it’s so far away. The phone rang and my hot water and honey needed warming. Then I forgot I was reading the post and went for a walk in the woods because I can’t stand being in my house where everything seems to shout at me; like all the tweets saying look at me, love me because I love you, buy my stuff, I’m going to be famous (which makes me feel I am in the middle of Grand Central Station).
    Like Linda (above) I still enjoy Twitter and some of the people I’ve met there. And I do things that I enjoy, though I need to pay attention sometimes, which can be hard to do, to how I feel about what I’m doing in the moment compared with what else I might be pursuing.
    That said, I scanned a lot of what you wrote, picking up the intelligence, the snarkiness, the search for truth therein, because I just don’t have the focus or time to really savor it. Could you please put it into 140 characters? Or make it an audio I can listen to while doing something else?
    Truly – thank you!

  • Android

    The Twingularity is here…

  • Tyler

    I’m also an abstainer trying desperately to become a moderator.

    I tell myself I’ll be set if I follow just the right group of people on Twitter, who tweet with just the right frequency and only about the right topics. I tell myself if I set my BlackBerry to silent then I’ll only check it when I have time. I tell myself to curate the perfect collection of RSS feeds so I won’t waste time reading anything that isn’t essential.

    But then I micromanage who I follow on Twitter so I’ll always have that perfect stream of information. I reach for my BlackBerry every time I think I see the LED light blink out of the corner of my eye; sometimes I subconsciously check it while doing something else. And I’m constantly evaluating whether I think each RSS feed is really necessary or if I could live without it.

    Since we can’t seem to manage things ourselves, there needs to be a smarter way to filter the stream of information so we’re only getting what we truly need/want. We’ve created all these social platforms and technologies with reckless abandon, and now we need a way to manage them more effectively. Or manage them at all.

    Or maybe we need to fundamentally rethink the way it all works in the first place.

    Side note: I quit watching cable news and now when I happen upon it, all the quick cuts, in-your-face graphics, and melodramatic music drive me crazy. Some things might be worth just abstaining from.

  • Andrew Wooldridge

    I wrote about this in a different way myself back in December:

    http://andrewwooldridge.com/blog/2009/12/07/met-someone-from-the-future/

    I think what is going to happen that is revolutionary is this merging of “play” with “work” and every interaction you have will result in some sort of change in “status”. You’ll learn to segment your attention in an almost schizophrenic way…

  • Ernest Adams

    There’s an important distinction between enthusiasm and breathless enthusiasm. I’m enthusiastic about technology, but I’m not breathlessly enthusiastic. This distinction is why I stopped reading Wired.

    I like technology that cures people or enables disabled ones or helps to create art or music or books. I don’t care about technology that encourages the flow of mindless prattle, like Twitter, and I actively dislike technology that helps people to be crappy to each other, like viruses and H-bombs.

  • siobhan bulfin

    Hi from New Zealand Jim, and thanks for articulating what so many of us are experiencing. I don’t know the answer but part of it must lie with being conscious of the ‘problem’, and you have highlighted it in a way that has been a long time coming.
    Top effort and cheers to you for your honesty. Tim’s not wrong when he says there’s a book in this. Distraction and the paranoia of not keeping up are going to be an enduring frustration in this information age. Collectively perhaps we can crowdsource a solution :) I also like Linda’s comment, which effectively promotes balance. Yes, remember that… sigh…
    Keep writing.
    Siobhan
    twitter @siobhanbulfin

  • Tetsujin 28

    Jim my friend, where does one start? I’ll try to be brief. You suggest:

    “When astrophysicists use the term “singularity” they mean that edge of a black hole where the decaying-from-the-center gravitational force just balances the ability of light to escape. Along the surface of that sphere, some distance out from the hole itself, light just hangs there in perpetual stop motion.”

    No. That would be the “event horizon” not a singularity. And therein lies a clue to the real value of the mind-torrent overwhelming you.

    This is not, I fear, about “learning” many things – it is about reading many things. And my fear only escalates at the prospect of writing many things based on what you have read.

    It is a fine example of once-sound information decaying into junk. Readers may, and probably will, pass it on thereby converting yet more signal into noise.

    Please stop now, while there is still time.

  • AbbydonKrafts

    Spectacular post! I’m going to share this on FB for all the people I know that are addicted to Twitter and texting. I’m also going to blog about it as your post is pure art!

    Luckily, I’m not as hooked into the stream anymore; it burned me out years ago. I had to disconnect or die. 50+ feeds were dumped. Now I only use newsletter subscriptions to get my fix. lol

  • TearsTheWingsOffAngels

    @Tetsujin 28 – no need to be quite so snarky, I think. While you’re absolutely correct, I think this mistake did not start with Jim.

  • Tetsujin 28

    @TearsTheWingsOffAngels

    I don’t mean to be “snarky”. Whether Jim understands his references or not, my point stands. And there are other mistakes (as you call them) in his piece, but life is too short.

    If our analyses are to mean anything they must be internally sound. The text is overlong and frequently unsound, while bemoaning information overload.

    The irony is glaring. Perhaps if we shut out distractions and ensure that we publish accurate data there would be less diverting babble.

  • Tetsujin 28

    Jim, I have another question. I read your piece very carefully, trying to extract its meaning. The Internet is big and more than one person can read, that’s true. People have a lot of ideas. Also true.

    But the world was always big. More happens in a second than we could comprehend in a lifetime. Nobody feels compelled to visit every country, read every magazine, meet every individual, speak every language, listen to every broadcast or learn every skill. Fortunately humans have, among other things, good shortcut techniques.

    During our waking day we don’t attempt to give full attention to every stimulus. We don’t try to follow every path or drain every interaction for its full significance. And this includes people at the very top of their game. Do you imagine Steve Jobs sweats over breakfast because he can’t hear the news in Hindustani as well?

    There’s a very old apocryphal story about a philosophy student who dies of starvation because he can’t see any good reason to put one shoe on before the other. So, that idea is not new.

    But it’s just a mindgame. How could anybody really feel ease of access to a vast amount of data is in some way a conundrum? Are you paralysed by indecision when you walk into a library? How on Earth are you ever going to read all those books?

    Of course there is no general solution but that doesn’t mean there is a specific problem. So what are we left with? Did I miss something?

  • Jim Stogdill

    @Tetsujin 28

    First, thanks for the correction regarding singularity vs. event horizon. I should have been more careful.

    Wrt your question, this post isn’t about information overload. It stems from my realization a while ago that I was an unwitting participant in a very real operant conditioning experiment. It’s about the loss of control and agency I experience as my newly connected-to-the-web autonomic nervous system ignores the pleas of my rational self and checks each of my email, Twitter, Yammer, and Facebook accounts in a racetrack pursuit of stimulation and addiction. It’s also about how, as we all jack in to this, we seem to take on a snarky, angry, and frankly stupid collective persona that isn’t a scaled up manifestation of our individual selves.

    There are certain information streams, though, that I need to be plugged into to remain good at my job. And my conundrum is how to stay current while somehow breaking this cycle.

    I NEVER feel this way in a library. A library for me is an oasis filled with anticipation and the pleasure of a serendipitous find. There are no latent mechanisms of operant conditioning and capture there, at least not for me.

    I guess this post made no sense to you because you are lucky and haven’t experienced this feeling of addiction.

    Btw, you are right about the irony inherent in this post’s length. It was intentional in the sense that I wanted to prove to myself that I could still produce something thoughtful and more deeply analytical than a headline followed by one or two breathless paragraphs. Also, it was an intentional challenge to readers to make it past the gauntlet of their own distractions and finish it. 4000 people have averaged a stunning 8.5 minutes time on page since this thing was posted. I’m thrilled that the challenge was accepted and crushed by so many people.

    @Aaron, I LOVE that tee shirt. In fact, I had considered including the picture of it in this post but in the end I decided not to confuse things by introducing stalking into the mix.

    @Andrew Wooldridge I read your post about someone from the future and that is one awesome dream. Lucid and freaky and a logical conclusion of where we are headed. Although, it seems to me that it will be a long way in the future before we are able to evolve physiologically to handle such an experience with the apparent calm that your friend from the future displayed. Gonna need a multicore brain, I think.

    Thanks to everyone for all of the comments here, on twitter, and elsewhere. It’s been a pleasure snapping up the pellets as they arrive.

  • Tetsujin 28

    Jim, thank you for the reply. Two tiny worlds collide, possibly.

    I have never literally been any kind of addict. I was a programmer in the 1970s; I am moderately geekoid and have mild gear lust. But the web, email swamp and mobile phone fetters are familiar.

    Two things strike me in the light of your clarification.

    You mention ‘needs’ (we must keep up, memetic needs, etc.) and mere titillation (your own tweet onscreen). You also observe how the profound and trivial jostle for attention. It seems to me that as long as we know the difference, things will always be fine.

    Also, I wonder if the signal to noise ratio might not be far worse than you imagine. A meme, for practical purposes, must have some threshold and a reasonable half-life. A good test for significance is to switch everything off and listen. If you don’t hear about it I suggest it probably doesn’t matter.

    We all know that being connected doesn’t make us or the information more important. Incidentally, a good way to get your Inbox counter to zero is to drag everything to the little dustbin. A thin address book and a fat music playlist are the correct configuration for a mobile phone.

  • Tom Callard

    Thanks for this. It describes the panic all of us who use Twitter feel almost daily.

    I was reading an interesting article recently on social conformity and YouTube, and it struck me that it might be linked to what you are saying here. It was a study on the importance of networks in YouTube videos gaining views. It was all about networks of users sharing with each other, the social influence this gave those individuals, and how the communities functioned.

    ANYWAY, the argument was very similar to yours but was based on a psychological insight that humans have a basic need to be part of a community, and feel the need to conform and fit in. By being connected to a lot of people in your network, and receiving a larger flow of information, you are essentially encouraging homogeneity. You are sacrificing thinking for yourself because of a deep human need to be part of a community. So, as you say, the only way to really escape is to cut yourself off from the community so you are freed from the tyranny of trying to please it.

    I liked it as it seemed to dovetail well with your ideas, and it is obvious when you think about it. The reason we obsessively check Twitter and retweet things is to ingratiate ourselves with a community within which most people are doing the same: just trying to please others in it by retweeting. Not exactly a recipe for creativity or independence, is it?

  • Dan

    Great post, but I don’t think it’s wise to jump to conclusions about the effects of a global brain or global consciousness based on one example.
    Twitter is just one of the many co-evolving and co-existing collective entities around us. There’s the Gaia Theory and the Global Consciousness Project, for instance, but there may be many more we don’t know of or haven’t yet done anything about.
    What we need is a system in which we can analyze and compare such entities and anticipate their ups and downs, so we can avoid turning into blind cells inside a global mastermind.
    I’m exploring the subject in my blog: http://collectiveweb.wordpress.com/category/global-consciousness/

    cheers
    @DanielStocker

  • Marc Jarman

    I much enjoyed reading your post, Jim. Are we in a perpetual state of fight or flight? Able only to process info & ideas in a very cursory and shallow manner?

    Without going all spiritual on ya I’d strongly recommend taking time out from the Borg. And not for just a few minutes in the midst of the miasma. And more than just some toe-touching and deep breaths. More like a whole morning ‘away’ every now and then, moving up to a weekend when the anxiety has decreased and you can comfortably ignore Mater Borg’s teat. Be reflective. Join some dots. Do it lazily. Take your eyes off the ball.

    To some extent your post is exactly that. It doesn’t reek of a bar-smacking rodent. It does go deeper than 1.5-second jump cuts with panning and zooming.

    I’d suggest watching a movie like Tarkovsky’s Stalker – in fact, abandon yourself to it. Beautiful, lyrical, no “Answer”, and 160 minutes long with 140 shots. With the average shot being over a minute long, it could be the perfect foil for the cybernetic feedback loop.

    Cheers
    Marc

  • SDC

    Bravo on this post, I very much enjoyed it. I read it all eagerly, uninterrupted, but that may be because I was reading it on a certain eBook reader which, while wonderful for that purpose, doesn’t handle email or web or Twitter very well at all.

    This is where I recall with some embarrassment my addiction to newsgroup feeds back in grad school, which I could blame for topping out at an MS instead of a Ph.D. were I the bitter type. The problem has been around for a while. Part of the solution may be detaching, as a reader commented above.

    Another solution may be just ‘selfishly’ turning off the fire hose from time to time. There’s the old saying: ‘the phone is there for my convenience, not yours’. Sounds like sacrilege today, but the truth lies somewhere between that and always-on, which is egocentric and selfish (disrespectful to those in meatspace who’d like to talk to us) in its own way.

  • Sam Knox

    At the very end of your article you mention a “singularity”, but I think you mean to say “event horizon”. A singularity is a point in space of infinite mass, the center of the black hole itself. The sphere around the black hole where the speed of light is balanced with the gravitational pull of the singularity is called the event horizon. I won’t provide a Wikipedia link – I read that in a book *gasp* !

  • VIktor Venson

    Civilization advances by extending the number of important operations which we can perform without thinking of them.
    - Alfred North Whitehead

  • NickN

    When advertising was young almost any ad would work. An endorsement from a famous person would work spectacularly well. As advertising saturated the market, people rapidly evolved defenses to ignore more advertising, so advertising got more sophisticated, funny, memorable. The co-evolution continues today.
    The same will happen with Twitter and the internet. I get lost in Wikipedia in a way I don’t get lost in a library anymore, although I used to get lost that way in libraries.
    Continued satiety will blunt your initial binging. It sounds like you’ve discovered that the Hedonist’s Paradox applies to new media.

  • Sara Williams

    Great post, Jim — thought-provoking, immediate and written with such brio, I loved it.

    We’ve been talking about this post over at http://www.madebymany.co.uk — your stimulusaholic confession has us looking at our own addictions and wondering just how hooked we all are.

    To put my own ten pence into the pool, I think the hooked-on-Twitter-etc thing is less about being hooked on constant — or near-constant — stimulus, and more about getting constant — or near-constant — affirmation that yes, we are still part of this community, and yes, this community still wants to hear what we have to say. Needy? Hugely. But oh, so human.