Earlier this week, Conservative leader Andrew Scheer tweeted about some unusual search results on Google. A search for the term "Canadian soldiers" returned a photograph of Omar Khadr, a former Guantanamo Bay prisoner who was charged with killing an American soldier in 2002.
Scheer asked Google to take action, and it did not take long for another user to suggest that the whole thing was the work of a Russian troll.
Omar Khadr is a convicted terrorist who killed a medic and blinded another soldier. He is not a victim, nor should he be shown this way alongside real Canadian heroes. @googlecanada: correct this. pic.twitter.com/qywUGQihVb
The truth? It's far more subtle.
However, the episode is another reminder of how even well-intentioned algorithms can inadvertently spur the spread of disinformation on the Internet. And with Canada's federal election just a few months away, the stakes are even higher when politics is involved.
How did Khadr get there?
Khadr's name appeared in what Google calls a Knowledge Graph result. These sometimes appear above or beside the usual Google search results when a user asks a general-knowledge question or searches for a well-known place or public figure.
The Knowledge Graph pulls its data from a variety of sources – one of them is Wikidata, an openly editable data store hosted by the same organization that hosts Wikipedia. Think of Wikipedia as the final report, and Wikidata as the raw data used to write it. Like Wikipedia, anyone can contribute to Wikidata, for better and for worse.
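To make "raw data" concrete: Wikidata stores each item's facts as machine-readable claims keyed by property IDs, and a person's occupation lives under property P106. Here is a minimal sketch of how a downstream consumer might read occupations out of an entity record. The JSON fragment and the item values in it are illustrative stand-ins, not Khadr's actual entry.

```python
import json

# A trimmed fragment shaped like a Wikidata entity record. The values are
# invented for illustration; real records are served at
# https://www.wikidata.org/wiki/Special:EntityData/<item-id>.json
SAMPLE_ENTITY = json.loads("""
{
  "claims": {
    "P106": [
      {"mainsnak": {"datavalue": {"value": {"id": "Q4991371"}}}}
    ]
  }
}
""")

def occupations(entity: dict) -> list[str]:
    """Extract the item IDs listed under P106 (occupation) claims."""
    claims = entity.get("claims", {}).get("P106", [])
    return [c["mainsnak"]["datavalue"]["value"]["id"] for c in claims]

print(occupations(SAMPLE_ENTITY))
```

The point is that a consumer like a search engine sees only opaque structured claims, with no surrounding prose or context to sanity-check them against.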
Twitter user Stephen Punwasi pointed out that the data that placed Omar Khadr among the Canadian soldiers was pulled from Khadr's Wikidata entry – and that a "Russian troll" was the one who put it there.
This is embarrassing.
The Google Knowledge Graph collects data from a variety of sources, so I checked the edit that put Omar Khadr under "Canadian soldier."
The change came from a Russian account a month ago. 🤦‍♂️
Was this the work of a Russian troll?
It does not seem to be the case.
The modifications to Khadr's Wikidata entry were made by the user Ghuron. The user appears to be an active contributor to data on the site and, according to their GitHub account, lives in St. Petersburg, Russia.
But Ghuron's activity does not target data related to any particular person, ideology, country, or political subject. Instead, it looks like automated cleanup work aimed at improving the quality of Wikidata far faster than any human could.
According to discussions between Ghuron and other Wikidata contributors, Ghuron runs a script that uses machine learning to automatically add and modify large amounts of data on Wikidata (for example, a person's occupation). Essentially, it is designed to sort data into buckets.
His script ensures that Street Fighter is correctly classified as a video game, that the Faroe Islands are grouped under the broader category of "island," or that Renaissance artists are properly classified as painters. Just look for yourself.
From time to time, his script also gets things wrong – and other Wikidata users have not been shy about letting him know.
Is that what happened to Khadr?
Yes! Using the edit history of Khadr's Wikidata entry as a guide, here is a short timeline that goes back even further than Scheer's tweet:
On July 26, 2018, Ghuron's script categorized Omar Khadr's occupation on Wikidata as "soldier" – part of a larger automated effort to assign occupations to everyone from the Zodiac Killer to Danish priests.
On September 24, 2018, users began posting in the discussion section of Omar Khadr's Wikipedia article, asking why Google search results for his name described him as a Canadian soldier. The phrase "Canadian soldier," however, never appeared on his Wikipedia page – which means it was probably pulled from his Wikidata entry instead. Google has yet to explicitly confirm this.
Later that day, the data was removed from Khadr's Wikidata entry. A user in the discussion section of the Wikipedia article on Khadr wrote: "It was a Google error, linked only to their search engine and not supported by Wikipedia. Google corrected the error tonight, and Omar Khadr is no longer shown as a Canadian soldier."
On September 30, 2018, Ghuron's script categorized Khadr's occupation on Wikidata as "soldier" once more. It's unclear why Google's Knowledge Graph appears to have ignored the change this time.
Then, on December 8, 2018, Ghuron's script categorized Khadr's Wikidata occupation as "soldier" yet again – with data just different enough that it likely made its way back into the Knowledge Graph results. Before long, people spotted Khadr among the results for "Canadian soldiers" once again.
What did Google do about it?
Danny Sullivan, the closest thing Google has to a search engine ombudsman, responded on Twitter to Canadian journalist Jesse Brown.
I'm not familiar with who claimed what. As we said, we simply saw the concerns raised widely. We reviewed, and because it was a Knowledge Graph issue, we took action there (we did not take action on search listings). And yes, to the last, because it's true.
"We looked at it, and since that was a problem with the knowledge graph, we took action there," Sullivan writes.
It appears that Khadr's Wikidata entry itself has not been changed – only how the Knowledge Graph processes that data.
PSV News has reached out to Google, and will update this story if we hear more.
Is this normal?
As Sullivan also noted on Twitter, Google does not change its search results – at least not unless it is legally compelled to remove information from its index. Instead, Google changed its Knowledge Graph results.
Whether that distinction is obvious to most users, given how the results are presented – especially in situations where political tensions are high – is less clear.