Register's Googlewashing Story Overblown

I’m disappointed by the pile-on of people rising to Andrew Orlowski’s classic bit of yellow journalism (or trolling, as it’s more often referred to today), Google Cranks Up the Consensus Engine. If so many other people weren’t taking it seriously, I’d just ignore it. (I just picked this story up via Jim Warren’s alarmed forward to Dave Farber’s IP list.)

Orlowski breathlessly “reports”: “Google this week admitted that its staff will pick and choose what appears in its search results. It’s a historic statement – and nobody has yet grasped its significance.”

Orlowski has divined this fact based on the following “evidence,” a report by Techcrunch’s Michael Arrington on comments made by Marissa Mayer at the Le Web Conference in Paris:

Mayer also talked about Google’s use of user data created by actions on Wiki search to improve search results on Google in general. For now that data is not being used to change overall search results, she said. But in the future it’s likely Google will use the data to at least make obvious changes. An example is if “thousands of people” were to knock a search result off a search page, they’d be likely to make a change.

While I agree that, if true, Google’s manipulation of search results would be a serious problem, I don’t see any evidence in this comment of a change in Google’s approach to search. I fail to see how tuning Google’s algorithms based on input from thousands of people about which search results they prefer differs from Google’s original algorithms like PageRank, in which Google weights links from sites based on a calculated value that reflects, guess what, the opinions of the thousands of people linking to each of those sites in turn.

The idea that Google’s algorithms are somehow magically neutral to human values misses their point entirely. What distinguished Google from its peers in 1998 was precisely that it exploited an additional layer of implicit human values as expressed by link behavior, rather than relying on purely mechanistic analysis of the text contained on pages.
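The "links as votes" mechanism described above can be caricatured in a few lines of code. This is a minimal sketch of the idea behind PageRank, not Google's implementation; the link graph and parameter values here are entirely hypothetical.

```python
# Toy illustration of the "links as votes" idea behind PageRank.
# Each page starts with equal rank, then repeatedly passes a share of
# its rank to the pages it links to. The link graph is hypothetical.
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page keeps a small baseline; the rest flows along links.
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            share = rank[page] / len(outlinks)
            for target in outlinks:
                new[target] += damping * share
        rank = new
    return rank

ranks = pagerank(links)
# Page "c" collects the most inbound "votes" (from a, b, and d),
# so it ends up with the highest rank.
```

The point is that the ranking is mechanistic only in execution: the inputs are human choices about what to link to.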

Google is always tuning its algorithms to produce what it considers to be better results. What makes a better result? More people click on it.

There’s a feedback loop here that has always guided Google. Google’s algorithms have never been purely mechanistic. They are an attempt to capture the flow of human meaning that is expressed via choices like linking, clicking on search results, and perhaps in the future, gasp, whether people using the wikified version of the search engine de-value certain links.
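The feedback loop described above can be sketched as re-ranking by observed clicks. Again, this is a hypothetical illustration, not anything Google actually does; the scores, click counts, and the 50/50 blending weight are all invented for the example.

```python
# Toy illustration of click behavior as a ranking signal.
# Results, scores, and click counts are hypothetical.
results = [
    {"url": "example.com/a", "base_score": 0.9, "impressions": 1000, "clicks": 50},
    {"url": "example.com/b", "base_score": 0.7, "impressions": 1000, "clicks": 400},
    {"url": "example.com/c", "base_score": 0.8, "impressions": 1000, "clicks": 100},
]

def rerank(results, weight=0.5):
    """Blend the original relevance score with observed click-through rate."""
    def score(r):
        ctr = r["clicks"] / r["impressions"]
        return (1 - weight) * r["base_score"] + weight * ctr
    return sorted(results, key=score, reverse=True)

# Result "b" had the weakest base score but by far the most clicks,
# so the blended ranking moves it to the top.
for r in rerank(results):
    print(r["url"])
```

Whether the signal comes from links, clicks, or wiki-style voting, the structure is the same: human choices feed back into the ranking.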

This is not to say that Google’s search quality team doesn’t actually make human interventions from time to time. In fact, search for O’Reilly (my name) and you’ll see one of them: an unusual split page, with the top half given over to organic search results dominated by yours truly and my namesake Bill O’Reilly, and the second half given over to Fortune 500 company O’Reilly Auto Parts.

Why? Because Google’s algorithms pushed O’Reilly Auto Parts off the first page in favor of lots more Tim O’Reilly and Bill O’Reilly links, and Google judged, based on search behavior, that folks looking for O’Reilly Auto Parts were going away frustrated. Google uses direct human intervention when it believes that there is no easy way to accomplish the same goal by tuning the algorithms to solve the general case of providing the best results.

(I should note that my only inside knowledge of this subject comes from a few conversations with Peter Norvig, plus a few attempts to persuade Google to give more prominence to book search results, which failed due to the resistance of the search quality team to mucking with the algorithms in ways that they don’t consider justified by the goal of providing greater search satisfaction.)

Even if Google were to become manipulative for their own benefit in the way Orlowski implies, I don’t think we have to worry. They’d soon start losing share to someone who gives better results.

P.S. Speaking of the dark underbelly of editorial bias, consider this: Orlowski doesn’t even bother to link to his source, the Techcrunch article. There’s only one external link in his piece, and it’s done in such a way as to minimize the search engine value of the link (i.e., with no key search terms in the anchor text). Orlowski either doesn’t understand how search engines work, or he understands them all too well, and is trying not to lead anyone away from his own site. A good lesson in how human judgment can be applied to search results: consider the source.
