Tuesday, April 5, 2011

Google Still Doesn't Get It

September is a long time ago in our history-averse culture.  Since Google seems to be part of that culture, they may be unaware that they have just “bumped into” (to use Ken Auletta’s turn of phrase) a repetition of history.  On the other hand it may just be that what happened in September did not have enough impact.

For those unaware of what did happen in September, it was when a Paris court ruled that the actions of the software driving autocomplete suggestions for search were defamatory.  The suit was triggered when it turned out that typing a particular name brought up “rapist” and “satanist” as suggestions for refining the search.  The good news was that the court ruled in favor of the plaintiff that defamation of character had taken place.  The bad news may be why this episode seems to have slipped the collective Google memory.  Here is how it was reported by Agence France-Press at the time:

The court ordered Google to make a symbolic payment of one euro in damages and take measures to ensure there could be no repeat of the offence.

Apparently, “he that filches from me my good name” is only out for one euro, which is probably why this episode was quickly relegated to mere noise in the Google collective memory.

Indeed, the insignificance of the episode has now been validated by what may be taken as abject disregard of the second half of the ruling.  According to a Business Tech story for CNET News by David Meyer, Google has just lost another ruling on the same grounds, this time before the Court of Milan in Italy.  In this case the plaintiff’s name was autocompleted in Italian with the suggestion to add the keywords “truffatore” (con man) and “truffa” (fraud).  Meyer also reported that Google issued a statement in response to this repetition of history:

We believe that Google should not be held liable for terms that appear in autocomplete as these are predicted by computer algorithms based on searches from previous users, not by Google itself.  We are currently reviewing our options.

This strikes me as a decision on Google’s part to invoke the logic of the National Rifle Association:

Guns don't kill people; people kill people.

True as this may be, one must also remember the dramatic logic of Anton Chekhov:

One must not put a loaded rifle on the stage if no one is thinking of firing it.

The reason that Auletta was so insistent about bumping into reality is that reality has a way of dishing out nasty consequences.  Those consequences are the results of actions that people take, whether or not those actions happen to be implemented by objective algorithms.  However compelling Auletta’s reasoning may have been, it is clear that Google has never given it much attention.  Why worry about bumping into reality when you can crush it with your own steamroller?