I have already reported on Google Suggest from a trademark / unfair competition law point of view. Ott and Goldberg recently reported on a case in which a woman is suing Google because, when the claimant's name (“Beverly Stayart“) was typed in, the search engine suggested “Beverly Stayart Levitra“. “Levitra”, however, is the name of a drug treating erectile dysfunction (Stayart v. Google, Inc., 2:10-cv-00336-LA (E.D. Wis. complaint filed April 20, 2010)).
What makes this case even more interesting (as Goldberg and Ott again reported) is that the same woman had already sued Yahoo (Stayart v. Yahoo! Inc., 2009 WL 2840478 (E.D. Wis. Aug. 28, 2009)) after the search engine presented users searching for “Beverly Stayart” with results that also led to porn websites. The claimant argued that this established a connection of sorts between her person and these websites. The court, however, found that there was no chance users could assume such a connection: “No one who accessed these links could reasonably conclude that Bev Stayart endorsed the products at issue.”
Evidently undeterred by this loss, Ms Stayart sued again, this time targeting Google. Because Google suggests the name of a sex-performance-enhancing drug alongside her name, she sees her rights violated, on grounds similar to those in the previous case. Presumably the case will face a similar outcome.
As SearchEngineLand has reported, Google recently disabled a number of search suggestions. Suggestions for queries such as “muslims are” (“xyz are“) are thus no longer shown. Interestingly enough, for the query “islam is” (“xyz is“) a number of suggestions are shown, but the suggestions for this query are mostly negative or offensive.
A Google Support page explains how the suggestion function actually works:
“As you type, Google Suggest returns search queries based on other users’ search activities. These searches are algorithmically determined based on a number of purely objective factors (including popularity of search terms) without human intervention. All of the queries shown in Suggest have been typed previously by other Google users. The Suggest dataset is updated frequently to offer fresh and rising search queries.”
So, if I understood that correctly, the suggestions are based on previous searches, or let’s say on information previously submitted by other users. This made me wonder whether different Google versions would display different suggestions. I therefore took the phrase ‘Germans are‘ [“Deutsche sind“] and entered it into Google.at and subsequently into Google.de. To my surprise, the suggestions did differ. Not by much, but they did. While Google.de showed “deutsche sind die besten” [‘Germans are the best‘] as the 8th suggestion, Google.at had “deutsche sind arrogant” [‘Germans are arrogant‘] at the same position.
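The mechanism described above (past queries, ranked by popularity, with a separate dataset per regional site) can be sketched very roughly in a few lines of Python. To be clear, this is a hypothetical toy model for illustration only: the query logs, the region split, and the counts below are invented, and Google’s actual algorithm is certainly far more complex.

```python
from collections import Counter

# Invented, simplified per-region query logs. The counts here are
# illustrative assumptions, not real Google data.
QUERY_LOGS = {
    "google.de": ["deutsche sind die besten"] * 3 + ["deutsche sind arrogant"],
    "google.at": ["deutsche sind arrogant"] * 3 + ["deutsche sind die besten"],
}

def suggest(region, prefix, limit=10):
    """Return past queries starting with `prefix`, most popular first."""
    counts = Counter(q for q in QUERY_LOGS[region] if q.startswith(prefix))
    return [query for query, _ in counts.most_common(limit)]

print(suggest("google.de", "deutsche sind"))
print(suggest("google.at", "deutsche sind"))
```

Even in this toy version, the same prefix yields a different ranking on each regional site simply because the underlying logs differ, which would explain the Google.de / Google.at discrepancy without any editorial intervention.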
Without going into detail on the interesting relationship between Germans and Austrians, I then went on to check Italians, Greeks, people from Poland, Christians and, as a last point, members of minorities such as homosexuals.
What was interesting to see is that the suggestions are, sadly, fairly accurate reflections of the respective stereotypes and prejudices. I suppose I shouldn’t be too surprised, as the suggestions are nothing other than bullet-pointed search results, but somehow the whole thing makes me wonder: how many people need to type “Italians are there to be kissed” into Google, or how many websites have to contain the words “Beverly Stayart” and “Levitra”, before Google adds it to its suggestions? Quite a few, I reckon 😉
One thing, however, is clear: Google Suggest, perhaps just because it’s new and users are not yet used to it, contains a certain confusing element, and Google might thus be well advised to explain to users a bit more how this service works. In a similar case, Google showed “Explanatory Ads” in connection with its picture search after the search term “Michelle Obama” led to a clearly racist and inappropriate image.