What about the users? Shouldn't they be sued?
Surely, Google only repeats queries written by previous users?
Google has been fined $65,000 by a French court after its search engine suggested the French word for 'crook' when users typed in the name of an insurance company, according to reports. The court said Google had ignored requests to remove the suggestion from its 'autocomplete' search engine technology when users searched for …
Although, as ever with Google, that's not quite the case, is it? They choose not to show certain topics in the predictive list (porn being one), and they also grade the list so that it appears differently depending on whether you have safe search on or off. So they clearly can modify the list, but choose to do so on their own terms, rather than for anything so inconvenient as local legal requirements.
It's a big ask complying with local requirements in comparison to the examples you've given;
compare
if (($keywords['porn'] != 1) || ($safesearch != 1)) {
    suggest($data);
} else {
    return false;
}
to
if ($location == 'france') {
    switch ($data) {
        case "insurance company":
            return false;
        case "frog eating surr":
            return false;
        default:
            suggest($data);
    }
}
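To be fair, a hard-coded switch per court ruling wouldn't survive a second lawsuit; a data-driven filter is closer to plausible. A rough sketch in Python (the names and blocklist contents are entirely made up for illustration):

```python
# Hypothetical blocklist: jurisdiction -> phrases a court has ordered
# must not be suggested. Each new ruling becomes a data change,
# not a code change.
BLOCKED_SUGGESTIONS = {
    "france": {"lyonnaise de garantie escroc"},
}

def filter_suggestions(suggestions, location):
    """Drop any suggestion blocked in the user's jurisdiction."""
    blocked = BLOCKED_SUGGESTIONS.get(location, set())
    return [s for s in suggestions if s.lower() not in blocked]

print(filter_suggestions(
    ["lyonnaise de garantie escroc", "lyonnaise de garantie avis"],
    "france"))
```

Still a big ask at Google's scale, of course, but it shows the shape of the problem: maintaining one blocklist per legal system, not one code branch per complaint.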
As tempting as it might be to bash Google, it's nigh on impossible to comply with every local law in the world when you operate on the WWW. I think you're probably trolling, but if not then you're being quite unreasonable in comparing filtering porn to not automatically appending 'crook' to the name of an insurance company.
As others have pointed out, there's usually a reason words are suggested, so it's not as if Google have deliberately targeted this insurance company. People have either been searching for that, or it's commonly being used on sites.
Except «the internet» is not always right, and suggesting negative terms only adds to the feedback loop: more people will search for them and add to their weight, with no regard to the actual facts.
Except that Google is already happily censoring those suggestions to avoid sexual terms, which the Court noticed.
Except that in the end, Google *suggestions* are not *search results*, and thus very different from stopping people from accessing the result of those searches.
Or are you just so lazy that you don't want to type anything into the search box, and would rather let Google feed you whatever crap «the Internet» is regurgitating at the moment?
I agree, AC - Google's suggestion box is not "the internet", as much as some people might claim that it's all just algorithms which are perfectly and spontaneously in metaphysical tune with the natural state of the universe.
What it is, is a Google Prism on the internet, constantly altered and adjusted in various ways for various Google-centric purposes. With the latter in mind, it is perfectly reasonable for the judge to reason that Google can also alter it to conform with local defamation law, and perfectly reasonable to expect it to do so.
Which implies that changing the suggestion box is not the same as "censoring the internet". That's just being hysterical.
By the time the input reaches 'Lyonnaise de G' and the search has worked out it's an insurance company...
Google does not just autocomplete words, it autocompletes phrases (or more accurately appends secondary keywords), but, as others have pointed out, why is this company so closely linked to the term 'crook' in the first place?
And if a conviction does exist against them what right does a court have to help hide this fact? (Can company convictions end up 'spent' in France?)
It suggests words based on what has gone before. If other people have typed "Lyonnaise de Garantie escroc", then, when you start to type "Lyonnaise..." it will, at some point, suggest things that others have typed.
It's worth pointing out that it's quite common to type "some_company_I_want_to_deal_with sucks" just to see what complaints there are and how they're dealt with.
Google's autocomplete suggests phrases and combinations of search terms based on what you type so far.
So anyone searching for Lyonnaise de Garantie with autocomplete on will see the term come up in a drop-down list of options. Which isn't going to be encouraging.
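A toy model of that, sketched in Python with an invented query log: suggestions for a prefix are simply the most frequent past queries that start with it, so a phrase other people keep typing will surface for everyone.

```python
from collections import Counter

# Invented query log for illustration. In this toy model, whatever
# phrase has been typed most often wins the top suggestion slot.
query_log = [
    "lyonnaise de garantie",
    "lyonnaise de garantie escroc",
    "lyonnaise de garantie escroc",
    "lyonnaise de garantie avis",
]

def suggest(prefix, log, k=3):
    """Return up to k past queries matching the prefix, most frequent first."""
    counts = Counter(q for q in log if q.startswith(prefix))
    return [q for q, _ in counts.most_common(k)]

print(suggest("lyonnaise de g", query_log))
```

With even a slight majority of people typing the 'escroc' phrase, it floats to the top of the drop-down; no editorial decision required, just counting.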
I'm in two minds about this:
On the one hand, if the autocomplete suggestion is only there because there are substantial numbers of discussion threads/blog posts/websites making the allegation about Lyonnaise de Garantie, it's hardly Google's fault that a load of other people think the company's misbehaving.
On the other hand, autocomplete is one of those functions in Google that has long struck me as an answer to a problem nobody has - an extension beyond necessity of the ability to suggest alternative options for search queries containing typographical errors.
As someone else said above, though, I'm sure Google are quaking in their boots over a $65,000 fine; their lawyers, on the other hand, will be worried about the precedent this sets.
Google "Lyonnaise de Garantie escroc" now generates "About 368,000 results (0.23 seconds)"
So I'd say Lyonnaise de Garantie have well and truly cemented the association of their name with the term 'crook'. Top work by that insurance company, unless they were trying to drown a signal in noise, in which case they might be very clever.
Even better, googling just "Lyonnaise de Garantie" and the top entry is the corporate site and the second is a news article about this case.
Legal query: if Google remove the auto-complete facility from www.google.fr (that being, in my view, a gesture of good faith towards the French legal system, to avoid inadvertently falling foul of its laws again), is their French operation still liable for the behaviour of www.google.com?
1) With enough proxies (or even an iframe on a busy website) you can easily abuse the suggest feature and get it to suggest things like '[competitor name] scam'
2) We used to contact a lot of different webmasters, and those who weren't sure if we were legit would search for '[our company] spam' or '[our company] scam'. We were neither scammers nor spammers, but people do a quick search to check. After a while, it sticks.
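That feedback loop is easy to demonstrate with a toy counter, again using an invented company name and made-up numbers: a handful of people "just checking" is enough to push the scam phrase to the top suggestion for everyone who follows.

```python
from collections import Counter

# Invented starting state: the legitimate query leads, the wary
# "scam" check trails behind.
log = Counter({"acme widgets reviews": 5, "acme widgets scam": 2})

def top_suggestion(prefix, log):
    """Most frequent logged query extending the prefix, or None."""
    candidates = {q: n for q, n in log.items()
                  if q.startswith(prefix) and q != prefix}
    return max(candidates, key=candidates.get) if candidates else None

# Nine people do a quick '[company] scam' search "just to check"...
for _ in range(9):
    log["acme widgets scam"] += 1

# ...and now the checked-for phrase is what everyone else gets shown.
print(top_suggestion("acme widgets", log))
```

Each person who sees the suggestion and clicks it adds another tally, which is exactly the self-reinforcing loop the earlier comment describes.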
I have wondered about that, you know. As a more likely example:
Your car auto-parks but fucks it up and smashes into one of the cars (say the one behind you). You are in control of the car, so you are at fault, but do you then have recourse to sue Ford?
If/when we get auto-driving cars and there's an accident, who's at fault?
If an unknown bug in your traction control/ABS unit surfaces and slams the brakes on, who's at fault (well, technically the guy behind you, for not leaving sufficient stopping distance)?
With all these questions I feel like a lawyer hunting for someone to sue! Where there's blame, there's a..... ambulance chaser!
I've often wondered if Google and other services should just strip results from vexatious litigants altogether. People, particularly people in governments, seem to not understand how the internet works, or how search works.
It's like suing the publisher of a word processor because autocomplete didn't provide the word you were thinking of. Back in college we used to liven up an otherwise boring class by subtly replacing common autocorrect words while people weren't looking. Had someone taken offence, would Microsoft have been responsible if it was Word, for example?
Yes, but it's one thing to have a search result for the term (which can presumably be justified by providing some of the sample data that Google have aggregated and processed to create the page ranking). It's another to suggest the term in an autocompleted field.
As I understand it the issue is with the autocomplete suggestion, not the search results themselves (though no doubt the firm would like to be shot of those results too if they could find a way to justify it...)