On marketing’s terminal addiction to personal data fracking and bad guesswork

Doc Searls
6 min read · Jan 10, 2015

Quit fracking our lives to extract data that’s none of your business and that your machines misinterpret. — New Clues, #58

That’s the blunt advice David Weinberger and I give to marketers who still make it hard to talk, sixteen years after many of them started failing to get what we meant by Markets are Conversations.

For a look at modern marketing at its wurst (pun intended), here’s one part of something called The Big Datastillery, by IBM and Aberdeen:

Those beakers on the conveyor belt are you and me. Really. They think we’re just empty vessels, moving along under their control, waiting to get filled with marketing goop, and farting feedback gas back up into the plumbing. (That’s the gray flame rising from the green beaker on the right.) Can you get more irony-free than that?

New Clues again:

61 When personalizing something is creepy, it’s a pretty good indication that you don’t understand what it means to be a person.

62 Personal is human. Personalized isn’t.

And being human is far more complicated than any marketing research can comprehend. Kim Cameron, an authority on digital identity, isn’t joking when he calls himself “the committee of the whole.”

Think about it. Sanity requires that we line up many different personalities behind a single first person pronoun: I, me, mine. And also behind multiple identifiers. In my own case, I am Doc to most of those who know me, David to various government agencies (and most of the entities that bill me for stuff), Dave to many (but not all) family members, @dsearls to Twitter, and no name at all to the rest of the world, wherein I remain, like most of us, anonymous (literally, nameless), because that too is a civic grace. (And if you doubt that, ask any person who has lost their anonymity through the Faustian bargain called celebrity.)

Michael Ventura explained the challenge here, way back in 1985, in an essay called Shadow Dancing in the USA:

…there may be no more important project of our time than displacing the … fiction of monopersonality. This fiction is the notion that each person has a central and unified “I” which determines his or her acts. “I” have been writing this to say that I don’t think people experience life that way. I do think they experience language that way, and hence are doomed to speak about life in structures contrary to their experience.

And that was when our main issue was the disconnect between the sane self that Devon Loffreto calls our sovereign source authority and the countless administrative identifiers that computing systems mistake for our actual selves. Today, administrative identifiers exist just to make sense of us in databases, which are increasingly obsolete in the distributed world where our sovereign selves actually live. (HTs for those two links go to @Leashless and @PortiaCrowe / @BusinessInsider.)

And now what happens when what marketers call our “experience” of the Net is anchored by Facebook chatter and Google searches, when online life and language (“liking,” “friending” and so on) soak up time formerly spent around tables, in bars or in cars, and when our environment is “personalized” through guesswork by companies whose robotic filtering systems constantly customize everything, supposedly to satisfy a singular self, while their real purpose is actually to sell shit to somebody/anybody?

One thing that happens is that we get changed. In the closing sentences of The Shallows: What the Internet Is Doing to Our Brains, Nicholas Carr writes,

In the world of 2001 (the movie), people have become so machinelike that the most human character turns out to be a machine. That’s the essence of Kubrick’s dark prophecy: as we come to rely on computers to mediate our understanding of the world, it is our own intelligence that flattens into artificial intelligence.

Even if our own intelligence is not yet artificialized, what marketers feed it surely is.

In The Filter Bubble, after explaining Google’s and Facebook’s very different approaches to personalized “experience” filtration, and the assumptions behind both, Eli Pariser says both companies’ approximations are based on “a bad theory of you,” and yield “pretty poor representations of who we are, in part because there is no one set of data that describes who we are.” He says the ideal of perfect personalization dumps us into what animators, puppeteers and robotics engineers call the uncanny valley: a “place where something is lifelike but not convincingly alive, and it gives people the creeps.” On a graph it looks like this:

That valley is where perfect personalized advertising lives.

All you need to do is look at that red line below the X axis to see why searches for “how to block ads” coincide with searches for what marketers call “retargeting”:

In case you don’t know what retargeting is, The Onion explains it here:

Here are four assumptions by personal data frackers that are certifiably nuts, because they are disconnected from the reality called the marketplace, which is filled with human beings called customers. You know: us —

1) We are always in the market to buy something. We are not. (Are you shopping right now? And are you open to being distracted this very instant by an ad that thinks you are, one placed by a machine guided by big data guesswork based on knowledge gained by following you around? Didn’t think so.)

2) We don’t mind being fracked. In fact we do, because it violates our privacy. That’s why one stat on the Web looks like this:

Source: TRUSTe 2014 U.S. Consumer Confidence Survey.

3) Machines can know people well — sometimes better than they know themselves. They can’t, especially when the machines are made to sell you something.

4) Consumers are subjects and marketers are experimenters. In fact we are all applied behavioral economists here. (Via Don Marti.) Another clue: That’s why we block ads.

So, where do we go from here?

First we need to continue expanding individual agency through VRM and similar efforts. Here’s a list of developers.

Second, marketing needs to stop excusing the harms caused by personalizing advertising with frack-fed Big Data methods. For guidance from history, read Tim Walsh's Big Data: the New Big Tobacco.

Third, advertising needs to return to what it does best: straightforward brand messaging that is targeted at populations, and doesn’t get personal. For help with that, start reading Don Marti and don’t stop until his points sink in. Begin here and continue here.

Original version published at blogs.harvard.edu on January 10, 2015. Current version published on March 8, 2016 and edited down to something better on March 9, 2016.


Doc Searls

Author of The Intention Economy, co-author of The Cluetrain Manifesto, Fellow of CITS at UCSB, alumnus Fellow of the Berkman Klein Center at Harvard.