Social Science Moves from Academia to the Corporation

This is the latest of a series of posts addressing questions regarding social technologies. Previous posts: The Evangelist Fallacy, Captivity of the Commons and The Digital Panopticon. These topics will be opened to live discussion in an upcoming webcast on May 27 with a special guest to be announced.

In order to control a thing you must first classify a thing — and we are seeing a massive classification of social behavior. While that classification falls under the guise of making life easier (targeted ads, locating a nearby pizza joint using your mobile), history tells us that we should be leery of motives and masters of our social data (see Captivity of the Commons).

Social sciences (behavioral psychology, sociology, organizational development), whose historical lack of data and scientific method left them open to ridicule from the “hard” sciences, finally have enough data, analytics, and processing power (see Big Data) to make “social” much more scientific. But this time social science will be coming to you not courtesy of Princeton, but courtesy of Google. Not through small studies on willing subjects, but through massive multivariate testing and optimization upon (largely) unwitting test subjects. The corporation, in other words, will hold the keys to social science at a level of precision only dreamed of by the academic and state institutions of yore.

This recent New York Times article highlights just how much social science, psychology, and personal data converge when a credit card company wants its debts repaid (via Andy Oram’s Radar post).

Should we be concerned about this shift from academia to the corporation?

I hold the current structure of government and corporations in equal regard in terms of how well they adhere to Google’s maxim, “Don’t be Evil.” So in some regard, I shouldn’t really be troubled that social science has moved from academia (which has often been a handmaiden of government) to the corporation (which really just wants to understand what moves you to click that “buy” button, or bump up your average order size by $10, etc.). Except…

Except if you believe that consumer culture is wreaking havoc upon the systems that support life, and that the application of social science on behalf of the corporation is intended simply to turbocharge the status quo…

We find ourselves in 2009 facing deep, structural challenges — peak oil, environmental degradation, climate change, and financial meltdown.

That’s why the notion of social science in service of accelerating the existing system troubles me. Tim has spoken about the need to “Work on Stuff that Matters.” How might we apply social science toward “stuff that matters” instead of toward “buying more stuff that doesn’t matter?”

  • Joshua-Michéle,

    When I was a kid, someone told me about an AI experiment that was easy enough for me to code up myself. I did it. It worked.

    It’s a neat trick and I think it tells us something about our self-perceptions of what information we reveal vs. what we really do – albeit in a highly abstract way.

    The experiment: Write a program that is an infinite loop. On each iteration, this loop should:

    1. Print a prompt asking the user “heads or tails” – the user is instructed to make up the answer in his head, not use a real coin.
    2. Compute a guess of what the user will type, using a Markov chain process trained on the history of the user’s past entries.
    3. Read the user’s answer.
    4. Print “i win” or “i lose” depending on whether the guess from step 2 was right.
    5. Update the Markov matrix.
    6. Report the running score of human vs. computer.

    Lather, rinse, repeat.

    Obviously you get to make up some heuristic there for steps (2) and (5) but even very simple-minded heuristics work pretty well.

    Predicted result: while the user will self-report that they are randomly generating answers or even trying to fool the computer, most times, the computer will quickly learn to guess correctly significantly more than 50% of the time.

    Try it.
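    The loop described above is easy to sketch. Here is a minimal Python version, assuming an order-2 context and a simple majority-vote heuristic for steps 2 and 5 (the commenter leaves the heuristic open), with the interactive prompt replaced by a pre-scripted sequence of entries so it can run unattended:

```python
import random
from collections import defaultdict

class MarkovPredictor:
    """Guesses the user's next "h"/"t" entry from an order-k context
    over the history of past entries (steps 2 and 5 of the loop)."""

    def __init__(self, order=2):
        self.order = order
        self.counts = defaultdict(lambda: [0, 0])  # context -> [h_count, t_count]
        self.history = ""

    def guess(self):
        h, t = self.counts[self.history[-self.order:]]
        if h == t:
            return random.choice("ht")  # no evidence yet for this context: flip a coin
        return "h" if h > t else "t"

    def update(self, actual):
        # Record what the user actually typed after the current context,
        # then extend the history.
        self.counts[self.history[-self.order:]][0 if actual == "h" else 1] += 1
        self.history += actual

def play(entries):
    """Run the guess / reveal / update loop over scripted entries
    and return the computer's win rate."""
    predictor = MarkovPredictor()
    wins = 0
    for entry in entries:
        if predictor.guess() == entry:
            wins += 1  # "i win"
        predictor.update(entry)
    return wins / len(entries)
```

    Feed it a patterned sequence – say, a long run of alternating “h”/“t” – and the win rate climbs well above 50% once each context has been seen a few times, which is the predicted result for human input as well.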

    So now, there’s a follow-on experiment I didn’t do but that is easy to work out as a thought experiment:

    Attach a peripheral device to the computer. The peripheral can, under direction of a program, dispense some … let’s say … cocaine. The user is presumed to like cocaine.

    One variation: every time the computer prints “i lose”, dispense some cocaine.

    Another variation: every time the computer prints “i win”, dispense some cocaine.

    I think that after enough iterations, the outcome in both variations is about the same.

    That trivial AI program is a scientific sociologist.

    An interesting observation about such a program is that it is so *simple* in structure that it easily arises as an emergent property of larger systems. Nobody actually needs to code it up very directly – it just tends to arise by accident wherever programs arise by accident. You know, like on-line ad placement, for one example. Or like Twitter when considered in combination with things like limbic system responses.

    Google alarms me in part because of all I’ve read about how they manage things – how they manage “innovation” at that firm. The founders’ hand is visible, I think: they built the management culture as an AI algorithm, highly stochastic. The workers are search agents, each chasing down individual branches of a tree, roughly speaking. The feedback is that when a worker successfully prunes away other branches of the tree, they get a treat. The biggest treats come from search agents that impact the front page. Set it and forget it: they just run it that way.

    George Dyson was more correct than I suspect he realized, about Google.



  • I believe that what really matters in social media is the relationship with developing countries and the web’s bridging opportunities, not marketing and advertising.

  • Good post. But the social science “shift from academia to the corporation” doesn’t mean that it’s left academia, any more than research on biotechnology or engineering has ceased inside universities now that they are established in commerce. In fact, it’s the same quantitative social science research that has revolutionized the academic world which is now also transforming the commercial world. Groups like our Institute for Quantitative Social Science at Harvard are devoted to promoting this kind of progress. Among the recent business startups that have resulted is Crimson Hexagon, which extracts meaning from large quantities of unstructured text, like blog posts and Twitter updates. Fortunately, work continues in both realms.

  • Han Xu

    I find it extremely fascinating the kinds of things that Google is doing with the data they have. Generally, I think this trend of increasing use of large-n datasets is good. But on the other hand, I can only suspend my disbelief for so long before some of these statistical regressions start looking absurd.

    As for your article, I am not quite sure what you are really worried about. The way I see it, America’s consumer culture seems to be the heart of the problem, not social science methods. In fact, I’m sure that there are plenty of examples where the exact same excess of data and regression analysis is being used for “good” (whatever we believe that to be) by non-academics.
