Credit card company data mining makes us all instances of a type

The New York Times has recently published one of its in-depth,
riveting descriptions of how credit card companies use everything
they can learn about us. Any detail can be meaningful: what time of
day you buy things, or the quality of the objects you choose.

The way credit collectors use psychology reminds me of CIA
interrogators (without the physical aspects of pressure). In fact,
they’re probably more effective than CIA interrogators because they
stick to the basic insight that kindness elicits more cooperation than
threats.

So who gave them permission to use our purchase information against
us? What law could possibly address this kind of power play?

There’s another disturbing aspect to the data mining: it treats us
all as examples of a pattern rather than as individuals.
Almost eleven years ago I wrote an article criticizing this
trend. The New York Times article shows how much we’ve lost from what
we consider essential to our identity–our individuality.


This article drew six comments in a few hours–thoughtful and valid
comments, which have made me set down my attitudes in words. Now we
can put the attitudes under a light and see what makes sense, or
doesn’t, to readers.

The article contained two levels of criticism: a criticism of data
mining to build up composite pictures of individuals, and a criticism
of the use of data accumulated from routine transactions to manipulate
those individuals.

Building up a composite picture

Of course, a company that reaches out and does any marketing has to
target people. Someone who bought the O’Reilly book Even Faster Web
Sites (sorry about the plug) might appreciate a notification about our
upcoming Velocity conference, which was founded by the book’s author
and covers the same topics. Someone who bought a book on a totally
different subject wouldn’t want the notification or respond to
it. O’Reilly does this kind of targeting,
like most companies, and until everybody participates in truly
frictionless information exchanges, companies will have to continue
doing it.
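To make that kind of targeting concrete, here is a minimal sketch of interest-based matching. The book title and conference name come from the text above; the topic tags, data structures, and matching rule are purely hypothetical, not O’Reilly’s actual system:

```python
# Hypothetical sketch of interest-based targeting: notify only customers
# whose past purchases share a topic with the event being promoted.
# The topic tags and the overlap rule are illustrative assumptions.

BOOK_TOPICS = {
    "Even Faster Web Sites": {"web performance"},
    "Gardening Basics": {"gardening"},
}

EVENT_TOPICS = {"Velocity conference": {"web performance"}}

def should_notify(purchases, event):
    """Return True if any purchased book shares a topic with the event."""
    interests = set()
    for title in purchases:
        interests |= BOOK_TOPICS.get(title, set())
    return bool(interests & EVENT_TOPICS[event])

print(should_notify(["Even Faster Web Sites"], "Velocity conference"))  # True
print(should_notify(["Gardening Basics"], "Velocity conference"))       # False
```

Even a rule this crude narrows a mailing to plausibly interested customers; real systems refine the same overlap test with far more data.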

Aggregated information is useful too. Organizations that mine public
data for evidence of health epidemics can identify likely sites and
investigate them further. The data mining is understood to provide an
approximation of the truth.
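As a rough illustration of that kind of aggregate mining, here is a hedged sketch that flags regions whose case counts run unusually far above their historical baseline. The threshold and data are invented for illustration, not any real agency’s method:

```python
# Hypothetical sketch of aggregate data mining for epidemic surveillance:
# flag regions whose latest weekly count exceeds the historical mean by
# more than z_threshold standard deviations. The flags are understood to
# be approximations that merit further investigation, not diagnoses.

from statistics import mean, stdev

def flag_outbreak_sites(weekly_counts, history, z_threshold=3.0):
    """Return regions whose latest count is a statistical outlier."""
    flagged = []
    for region, count in weekly_counts.items():
        past = history[region]
        mu, sigma = mean(past), stdev(past)
        if sigma > 0 and (count - mu) / sigma > z_threshold:
            flagged.append(region)
    return flagged

history = {"north": [10, 12, 11, 9, 10], "south": [8, 9, 10, 9, 8]}
latest = {"north": 11, "south": 40}
print(flag_outbreak_sites(latest, history))  # ['south']
```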

Where I see a problem is when the constantly increasing quantity of
information refinement shades over into a qualitative change. There’s
a difference between a campaign targeted to 500 likely customers and a
campaign targeted to one.

At some point the composite portrait starts to look so much like a
person that corporate decision makers can begin to believe it
is the person. The portrait becomes like a replicant, or like
the statues that came to life in myths from Pygmalion to Pinocchio.

Joseph Weizenbaum, creator of the classic Eliza program, was shocked
to see that people treated his “doctor” program like a human
interviewer. There were plenty of computer programs that prompted the
user with questions and gave varied responses based on the answers,
but none had imitated a person so realistically.

Nowadays, nobody would be drawn in by Eliza. And perhaps companies and
customers alike will get used to composite portraits. Perhaps the
companies will send their composite to each of us and we can update it
to make it more accurate. That will be a very different world, though.

Now we can turn to the next level, manipulation.


Manipulation

I’ve read numerous accounts in biographies and articles about
interrogations, and talked to a couple people who have undergone
interrogations. I haven’t been on either side of an interrogation, but
I’ve been deposed for a court case. All these situations remind me
vividly of the exchanges reported in the New York Times article.

In these exchanges, a well-armed caller is laying, like a silkscreen,
a composite over the real person and trying to manipulate the result.
It’s not exactly a case of asymmetric knowledge (because at least in
theory, a customer could also learn a lot about a company and use that
knowledge to manipulate it). It’s more insidious: an employee carrying
out a precise initiative on behalf of a company–a machine in the
service of a goal–approaching the targeted customer in an informal
manner that brings out a natural, human, empathetic reaction in the
customer.

Interrogation always takes place in the context of an open or implied
threat–there would be no reason for making the contact otherwise–but
as I mentioned in the article, the interrogation goes best when the
threat is raised only rarely and strategically. Feigned sympathy and
heart-to-heart engagement are the path to the most desired outcome.

In a sense, now, the employee has become the replicant, using a
careful counterfeit of human responses to induce the behavior he or
she is paid to induce. This is ethical when dealing with a criminal,
although even then US law limits (based on the Fourth Amendment) the
gathering of relevant information by the interrogator beforehand. I
question how ethical it is in a business situation, especially when
exploiting information given by the customer for entirely different
purposes.

  • David Strauss

    “So who gave them permission to use our purchase information against us?”

    It’s a dangerous road to assume someone (read: the government) must grant company X permission to do Y (for all X and all Y).

  • David

    How much do you really know about interrogations?

  • Jacem Yorob

    Hi Andy,

    Thanks for bringing the article to our attention, I found it interesting. That said, I have some concerns about your commentary about it.

    Your comparison to CIA interrogators seems a bit random, and you don’t bother explaining it. Exactly how does what these companies do resemble CIA interrogation tactics? I’m not really an expert on such tactics (I just know about waterboarding, which doesn’t bear much resemblance to what is described in the article), but I guess you are familiar with them, so please expand on this resemblance.

    As for the claim that “data mining … treats us all as examples of a pattern rather than as individuals,” I don’t see why we should find that either surprising or terrifying. Of course a large organization that barely knows you is going to treat you as an instance of a type, what else could they do? As long as they treat you accurately, do you really have cause to complain? And data mining techniques don’t serve to diminish individuality, they promote it. Which is more depersonalizing, a company having a detailed profile of you, or having them know you only as customer #59214862? With data mining, organizations can identify the unique combination of characteristics in their clients and customers. It’s *more* individualized, not less. So where exactly is the problem, what is the downfall? Specifically what is evil about these companies’ actions in getting to know you better and acting on that information? Yes, people who aren’t savvy will be manipulated by people who are, but that’s a timeless truth, data mining didn’t create that.

    Looking forward to your response,


  • bowerbird

    jacem said:
    > Specifically what is evil about these companies actions
    > in getting to know you better and acting on that information?

    most of these companies are not interested in our well-being.
    they want to move money from our pockets into their pockets.
    as that implies, they define the situation as a zero-sum game.
    they have no moral compass to prevent them from being evil,
    just as they have no moral compass nudging them to be good.
    so occasionally they’ll be good and occasionally they’ll be evil.
    if that doesn’t bother you, fine. but it does bother some of us.

    especially troublesome is when they extract information on us,
    because we didn’t give them _permission_ to use it against us,
    as zero-sum implies, and have no clue that they’re doing that.


  • @bowerbird: I don’t know of many companies that have the zero-sum mentality you’re describing. If they do, they probably don’t last long. If a company really wants to grow quickly (or to usurp their markets before someone else does), they realize that the best way to do that is to create markets that don’t already exist, not to try to get more and more money from their existing consumers. That way lies bankruptcy, when all your customers decide they don’t want you around anymore.

    I agree that many companies don’t have much of a moral compass, but honestly that’s not unique to companies.

  • Mark Fuqua

    Blah, Blah, Blah…they could get more from facebook and twitter.

  • It seems to me that the direction of the conversation here is moving towards the area of “ethics” and “philosophy” (authentic interaction, symbolic interaction, personhood, etc.), yet how many companies are really ready to “explore these questions” when at the end of the day, you have to deliver x result by end of quarter y? (i.e. financial and reporting structure). SAP announced a big push into sustainability that might be seen as measuring corporate responsibility. Perhaps one such “dashboard element” might be VRM (Vendor Relationship Management) where a third party broker holds your data, and “trades” it with the company you interact with/trade with for mutual benefit.

    This article and thread has shown a clear market failure, and a clear market opportunity. Why does your credit card company get “all your data”? Because you give it to them. Surely, a “Google card/Skype card” might hold/validate your purchase information, while passing on the “amount” + “specific” data fields to the credit card company?

    I also see another opportunity for companies to offer services, where they compare your spending with others “like you” to make inferences about your spend pattern (i.e. “you are spending too much on clothes vis-a-vis your comparators, would you like to see other vendors/brands?” etc. etc). There are also Finance2.0 services that help you manage your money across multiple accounts so that you have maximum ROI for YOU!

    I do believe that there is a plug-in on mozilla weave that is a start in this direction? Anyway, it is a conversation worth having, because I don’t want my credit rating, available to those that will pay for it, to be affected by a spur-of-the-moment purchase of beer, in a supermarket, at ten in the evening ;)

  • Brett

    Whenever someone approaches me directly – on the street, via sms, via email or via telephone call – they’re doing it for their benefit, first and foremost. Whatever technique they use: heart-rending story, informal chat, value-add offer, etc, they’re all trying to get me to give them some money. The more insidious the technique, the more annoyed I get since I feel that they are most definitely trying to manipulate me and the situation to their advantage. At no time do they really have my best interests at heart.
    Do I need these advances? Not really – I am perfectly capable of researching and purchasing anything I require, in my own time.

  • Jacem Yorob

    Thanks for the added detail Andy, you’ve identified some good issues. The mistake of confusing a model with the entity being modeled is a real danger in a lot of fields. It is a challenge which members of society are increasingly being confronted with, as abstract models of real and composite phenomena increase in real world importance. And unfortunately people’s track record of meeting these challenges is rather poor. No example of this is more timely and apt than our present financial mess: To a large extent this was brought on by individuals in the financial industry blankly assuming that their models of the economy and of financial instruments would closely match reality, an assumption which did not stand up to the harsh light of day. The NY times article mentions that many credit card companies are suffering today from the overly cheery models of client behavior they relied upon during the housing boom.

    Just as people today are no longer fooled by Eliza, we’ll need people to become more savvy about models and simulations, and their limitations in describing reality. This will not be easy though, as it requires understanding of subtle concepts, which only get more complicated as models become more sophisticated. And mentioning ‘statistics’, ‘unbiased estimators’ and ‘confidence intervals’ seems to send most people running. As difficult as it will be to educate broad society on statistical modeling savvy, I believe it is the only answer, because sophisticated models are only going to increase in prevalence.

    A similar thing will have to happen to deal with the second issue you expanded upon: Manipulation. Here though people will need to develop more social savvy, an ability to detect when someone is attempting to take advantage of them with social engineering. This is another difficult task, because the psychological manipulation techniques companies increasingly employ to induce compliance appear to tap into basic social rules-of-thumb which are probably wired into people’s brains at a low level ( I highly recommend Dr. Robert Cialdini’s “Influence: The Psychology of Persuasion” as an intro to these compliance techniques).

    Legislation may help to limit the methods companies can employ, but legislators may be hard pressed to achieve this without stepping on free speech and freedom to do business (which at least somewhat falls under the rubric of The Pursuit of Happiness).

    In summary, I believe the ideal solution to these issues is for businesses and consumers to become more savvy, such that they don’t blindly follow models and are able to resist manipulation. I’m not wholly against legislative protections, but these may have the negative side-effects of limiting the ability of organizations to innovate; and I believe there are innovations of value, both to businesses and their consumers, which are made possible when each party can get to know the other better.

    Bowerbird: I agree that companies are basically amoral, (though ultimately they are made up of people which presumably have the moral compasses you speak of), and that they shouldn’t be trusted more than a random stranger on the street. But most of us are already capable of dealing with strangers on the street without being swindled blind, and we should be capable of the same with strange businesses. As for companies using information in ways we did not give them permission to use, it is naive to assume they won’t. Naivete of this sort is unfortunately ubiquitous, as anyone who sees their friends installing silly, information stealing facebook applications is aware.

    Let’s hope both citizens and statistical models can keep up with our changing landscape.

  • Andy,

    It seems like you are noticing the increasingly panoptic nature of our society.

    The panopticon is a technology of imprisonment. In this technology, the prisoners are of course denied liberty and are exposed to intervention while the guards are granted a position of unrestricted but unobservable surveillance over the prisoners. The guards can see the prisoners at any moment. The prisoners can see that the guards might be looking. But the prisoners can not know if at a given moment they are being watched.

    There is a kind of industrial efficiency at the heart of the panopticon: a small number of guards can manage a larger number of prisoners.

    There is also a kind of social theory or theory of moral behavior to the panopticon: that the solution to societal problems of “bad behavior” is to identify bad actors and ramp up both the restrictions on their bodies and the threat of observation and interrogation.

    The question arises, what modern institutions are not, increasingly, panoptic? Are there any?

    Surveillance cameras on city streets and, in England, some equipped with speakers (“Citizen, pick up that litter you just dropped!”). Schools with their cameras, yes, but also universal testing regimens and record sharing. Aggregation of private photos on Flickr is another fine example. Sure, the credit card system, as you note. How about all of and its features (and partners) in “behavioral tracking”? How about the legal fast footwork that gives us warrantless state data-mining of calls-placed records? How about the emerging space of location-aware devices? How about employer monitoring of employee email and web usage?

    Someone above remarked that the CC companies, looking at purchase records, weren’t getting anything they couldn’t get off of a Facebook or Myspace page. While that’s obviously false, the point is understood: people are seen today to be a lot more publicly confessional than in the past.

    In some sense, “the new confessionalism” is just a natural part of the panoptic tendencies of power. The confessionalism on social networks is constructed, first and foremost, by choices about what ventures to fund and what firms to promote and what business models to erect around them and so forth. There is a de facto conspiracy to elicit these confessions.

    The prisoner in a classic panopticon prison learns quickly the rewards afforded him for keeping in easy sight rather than, say, hiding in a dark corner at the back of the cell. That prisoner becomes confessional as to his whereabouts within the cell out of self-interest, yet it is a self-interest that didn’t exist until the powers over the prisoner built that specific situation.

    In some sense, then, it’s legitimate to say that the prisoner’s “self” is constructed by the panoptic environment in which he dwells. He adopts a prisoner’s mentality, vis a vis his conceptualization of his own self-interest. He begins to proactively present himself for ready inspection and he begins to internalize the consequences of observation as his moral norms.

    It is from that we get the description of ours as a disciplinary society where the social order is a relentless advancement of technical disciplines of surveillance and punishment, intervention, control, and so forth conducted by a few upon a many. We have industrialized the formation of the self.

    And, if we became confessional animals over quite a long time thanks to institutions like the church and then psychology, perhaps today there is a shift to our becoming animals of the spectacle: increasingly doing our confessing in public. If so, there is some continuity in that – it’s a “logical” extension of the penal system. If the penal system can be regarded as a diluting and superficial democratization of the right to punish, and if it can be properly regarded as one element of a larger panoptic system of managing bodies regarded as demographic populations, it follows that the confessional aspects of surveillance would become similarly, superficially democratized.

    Again, in the comments above, someone suggests that the cc companies aren’t using any data they couldn’t get from what people put on-line, in public, themselves. There seems to be some kind of error in reasoning to that comment – an implicit belief that if people are “freely” putting all of this information out there that somehow that means it is not an element in asymmetric, oppressing power relations.

    That kind of “it’s out there anyway” reasoning – the whole discourse people have around web 2.0 and (so-called, superficial) “user control of their own data” – seems to want to say that the CC firms aren’t doing anything that everyone else can’t do, so that makes it ok. It’s not true, of course. The asymmetries run deep, as when advertisers or employers can data-mine a social networking site. But to the extent it *is* true, what happens? Well, I guess the CC firms are somewhat equally capable as, say, cyberstalkers and grifters.

    I miss cultures where people may indeed be “investigatable” to one another but where that investigation is itself the main object of observation and control. I am thinking of societies where the moral norm is to maintain mystery between people, to avert one’s gaze in many circumstances, to show respect by granting privacy and freedom, and to scrutinize and judge any apparently necessary transgression of these boundaries.

    I have no trouble imagining–only in finding–a world in which, whether we are talking about CC collections practices or Google’s behavioral tracking, the answer could plausibly be to sue or criminalize the perpetrators out of existence on the grounds that they are practicing psychology without a license, invading privacy, and conducting involuntary human subject experimentation on a massive scale.


  • Hi, I hope you don’t mind, but I used my browser to block this page’s call out to when I visited.

  • Falafulu Fisi

    Datamining by credit card companies is something useful and an advantage for the user to protect credit card fraud, such as someone using a stolen credit card. So, datamining is something good.

  • Falafulu Fisi

    bowerbird said…
    they want to move money from our pockets into their pockets.

    What a dumb comment. That’s what a business is all about. It exists in the first place to make money, not to be there as a Santa Claus. If they come with a gun and force you to move money from your pocket (ie, buy from them), then that’s illegal and they should be prosecuted for doing so. But since you (the consumer/buyer) go to the producer or owner of the business to buy the goods/services of your own free volition, you should blame yourself for being a sucker. Don’t blame the vendor, since it was your free will to buy from them; they didn’t force you or physically threaten you to buy their services/goods.

  • Jacem Yorob

    Hi Thomas Lord, your post is interesting and captures a lot of the uneasiness people have towards this type of profiling. I have some comments:

    “The confessionalism on social networks is constructed, first and foremost, by choices about what ventures to fund and what firms to promote and what business models to erect around them and so forth. There is a de facto conspiracy to elicit these confessions.”

    While social networking sites, like all sites which depend upon user generated content, wish users to contribute their information freely, I don’t see any malicious conspiracy in this. The sites don’t provide any direct incentives to put up accurate personal information, nor do they penalize withholding it. The pressure to provide personal info is from the real life social network itself (friends want to know about you), and to maximize the value of using the social networking application. If anything, to me the trend appears to be for these sites to provide more finely nuanced user control over privacy; for example, facebook has provided users with increasingly granular controls to manage the flow of their personal information to others.

    “And, if we became confessional animals over quite a long time thanks to institutions like the church and then psychology, perhaps today there is a shift to our becoming animals of the spectacle: increasingly doing our confessing in public.”

    While I am uncertain as to the origins of this culture of the ‘confessional’, I agree there is a clear trend towards people becoming more open in sharing aspects of their lives with the barest of acquaintances, if not to complete strangers. For the most part, I think this is wonderful. It is a much better world where each person can directly perceive the rich diversity of their fellows than one in which people hide behind a much more limited set of socially sanctioned personas.

    I believe a lot of uneasiness people have with this trend stems from fears of embarrassment and disgrace (for themselves and for others), ultimately rooted in the remembered experience of such states. What this perspective fails to take into account is that there is no embarrassment when there is commonality.

    Let me illustrate with a simple analogy: Imagine you are standing amongst your neighbors. You think everything is normal and then you look down only to discover that you are completely naked. You would probably be mortified, yes?

    But what if you lived in a nudist colony. Everyone is naked, so no one is embarrassed.

    The world we’re creating for ourselves is less a panopticon than it is a nudist colony. ‘Confessional’ implies that what you are revealing is sinful, but people aren’t revealing these things out of shame nor for hope of redemption, they are simply being open about who they are and how they live. The pressure is from the social networks themselves, because the more you share, the more people can feel they know you. Only to a lesser extent is the pressure from the social networking apps.

    You express a nostalgia for cultures where people live within tightly prescribed social norms, but I think you are in a shrinking minority in that regard. These social networking innovations make sharing a more rewarding experience, so people will continue to embrace them and in turn share more.

  • Falafulu,

    You wrote: “Datamining by credit card companies is something useful and an advantage for the user to protect credit card fraud, such as someone using a stolen credit card. So, datamining is something good.”

    You did not read the NYT article carefully.

    The article clearly articulates that:

    The card companies took improvements in fraud detection capability as an excuse to incite fraud.

    In pursuit of fraud, the credit card companies are using their data-mining for psychological profiling in order to cause troubled but not fraudulent customers to harm themselves under a false impression of a meaningful emotional connection to the company.

    In other words, in the name of fighting some forms of credit card fraud, the CC companies both incited that fraud themselves and also defrauded innocent customers.


  • I’m getting a lot from this discussion, and I’ll just add a couple
    more pointers. First, an article I wrote last year about people who
    share personal information on social networks and how to think about
    their privacy rights:

    Second, an article in this morning’s Washington Post about combining
    records on the medical treatment of millions of patients to find more
    efficient treatments:

    I think streamlining medicine and reducing unnecessary procedures is
    crucial. But it would be ironic (and tragic) if insurance companies
    could create a composite patient that substitutes for the real one,
    and refuse to consider procedures because of low success rates. This
    is what happens in many countries with public health systems. They
    refuse to recognize that each human is different.

  • bowerbird

    ken said:
    > I don’t know of many companies that have
    > the zero-sum mentality you’re describing.

    falafulu said:
    > That’s what a business is all about.

    it’s only on matters of severe hypocrisy and denial
    that one can provoke such a variance in responses.


    > for example, facebook has provided users with
    > increasing granular controls to manage the flow
    > of their personal information to others.

    so now facebook knows not only your secrets,
    but from whom you are keeping those secrets…


  • J Mellon

    Use Cash, live within your means and the rest is moot.

  • bowerbird said:
    > it’s only on matters of severe hypocrisy and denial
    > that one can provoke such a variance in responses.

    I don’t see the conflict. I agree, of course, that businesses intend to make money.

  • thomas graham

    Can a merchant access data about the size of a purchaser’s credit line?

  • Bhupendrasinh Thakre

    Hi Andy,

    Sorry for all those threats which you might be assuming. But I have completed various data mining models of the kind you describe as “investigation.”

    In the case of banks, 1000% of the time we don’t really know who the customer is–no name, SSN, or any other personal information. Even the address information is very cryptic, and in most cases it is untouched during modeling. Also, our data mining models have at least 500,000 rows of customer information without any personal info.

    So do you still think that banks are going to interpret any information, unless it is someone as big as Bill Gates?



  • dan

    My question here is where they can buy this data from other people; it will be interesting to see if there is a market for that.