Thursday, March 29, 2012

Should Google Block Autocomplete if Your Name is Dick Hurtz?

A Japanese man recently sought a court injunction against Google, alleging that its autocompletion of search queries got him fired. The man's lawyer, Hiroyuki Tomita, says that typing his client's name into Google produces thousands of automatic suggestions of "criminal acts" with which the man is not familiar, "defaming or disparaging him."

The man's name is unknown; the Japanese court withheld it. But imagine your name were "Jenny Kills," and every time someone Googled you, Google suggested "Jenny Kills kittens." Does Google have any responsibility here? Autocomplete uses a predictive algorithm based on the overall popularity of searches. Statistically, it helps the vast majority of people. Are people with edge-case names harmed by this practice?


The Japanese man alleges that Google's autocomplete suggestions contributed to him getting fired and subsequently rejected when applying for several new jobs, Kyodo News reports. He petitioned the Tokyo District Court to demand that Google remove the offending terms from autocomplete. The court approved his petition, but Google has not complied.

"Google is currently reviewing the order," a Google spokesperson told ReadWriteWeb. Autocomplete results are "produced by a number of factors, including the popularity of search terms. Google does not determine these terms manually - all of the queries shown in autocomplete have been typed previously by other Google users."

Autocomplete suggestions for a name search are not based on data about that person. They're assembled from queries typed by other people. If your name happens to be close to something awful that people often search for, Google doesn't think it should penalize its other users to spare your reputation.
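To see why that is, here's a minimal sketch of how popularity-based autocomplete works in general. This is not Google's actual algorithm, and the query log and counts are invented for illustration: suggestions are simply past queries that share a prefix with what you've typed, ranked by how often other users searched for them.

```python
# Illustrative sketch only -- not Google's implementation.
# Suggestions come from what *other* users have searched for,
# ranked by popularity, not from any data about the person named.

from collections import Counter

# Hypothetical log of queries typed by other users, with counts.
query_log = Counter({
    "jenny kills kittens": 5400,   # popular query, unrelated to any real person
    "jenny kills review": 320,
    "jenny kilgore": 90,
})

def autocomplete(prefix, log, k=3):
    """Return the top-k past queries starting with the prefix, most popular first."""
    matches = [(q, n) for q, n in log.items() if q.startswith(prefix.lower())]
    matches.sort(key=lambda pair: pair[1], reverse=True)
    return [q for q, _ in matches[:k]]

print(autocomplete("jenny kills", query_log))
# ['jenny kills kittens', 'jenny kills review']
```

Nothing in that data refers to the unlucky person named "Jenny Kills"; the suggestions only reflect what the crowd has typed before.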

(Screenshot: Google autocomplete suggestions for "Jenny Kills")

But that doesn't make the hapless victims feel any better. Google lost a similar case in Italy a year ago. Another anonymous person complained that Google searches for his name produced autocomplete suggestions including truffa ("fraud") and truffatore ("con man"). The Court of Milan ruled in his favor and ordered Google to filter out the defamatory autocomplete suggestions.

Google tried to argue that it wasn't liable for these results as a "hosting provider." But the plaintiff's side successfully argued that this is Google's own content. Google filters out alleged piracy sites from autocomplete, the plaintiff argued, so it should filter out defamation as well.

In a statement after the ruling, Google said it was "disappointed" by the results. "We believe that Google should not be held liable for terms that appear in Autocomplete as these are predicted by computer algorithms based on searches from previous users, not by Google itself. We are currently reviewing our options."

When asked for an update today, Google offered the same statement it gave a year ago: it is still reviewing its options in response to the ruling. Because Google is based in the U.S., there's not much these local courts can do to force its hand.

ReadWriteWeb hasn't found any cases on this matter in the U.S. A U.S. case would make things more interesting, since rulings in favor of victims would be easier to enforce there.

What Should Google Do?

But what is the right thing to do in this situation? Piracy and libel are both illegal, but the cases are different. Piracy searches point to links to illegal content (which Google still indexes); by removing piracy terms from autocomplete, Google is merely capitulating to copyright holders. Arguably it shouldn't even do that if it wants to provide the best possible search results, but at least the links remain in the index.

These searches for people's names don't point to defamatory content about them. It's just a sad coincidence that an algorithm links their names to other suggested keywords. It's terrible that the world is still so technologically ignorant that auto-suggested search terms with no real connection to a person could get them fired. But that can't be Google's problem.

The slope is too slippery. Though these automated suggestions might hurt a few people with unfortunate names, they're suggested precisely because they help lots of others.

What do you think? Should Google fix autocomplete for these people, or are they out of luck?

Lead image courtesy of Shutterstock



