This article was published on June 25, 2021

A Google algorithm misidentified a software engineer as a serial killer

The knowledge graph strikes again


Image by: Topher McCulloch (edited)

Google’s algorithmic failures can have dreadful consequences, from directing racist search terms to the White House in Google Maps to labeling Black people as gorillas in Google Photos.

This week, the Silicon Valley giant added another algorithmic screw-up to the list: misidentifying a software engineer as a serial killer.

The victim of this latest botch was Hristo Georgiev, an engineer based in Switzerland. Georgiev discovered that a Google search for his name returned a photo of him attached to a Wikipedia entry on a notorious murderer.

“My first reaction was that somebody was trying to pull off some sort of an elaborate prank on me, but after opening the Wikipedia article itself, it turned out that there’s no photo of me there whatsoever,” said Georgiev in a blog post.

Georgiev believes the error was caused by Google’s knowledge graph, which generates infoboxes next to search results.

He suspects the algorithm matched his picture to the Wikipedia entry because the now-dead killer shared his name.
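To see why that kind of mix-up can happen, here's a toy sketch (purely hypothetical, and in no way Google's actual pipeline): if an entity-matching step keys photos and encyclopedia entries on the name string alone, two unrelated people who share a name get merged into one infobox.

```python
# Toy sketch of name-only entity matching -- hypothetical,
# not Google's actual knowledge graph pipeline.

# Two unrelated records that happen to share a name.
wiki_summaries = {"Hristo Georgiev": "Wikipedia entry on a serial killer"}
photos = {"Hristo Georgiev": "photo of the software engineer"}

def build_infobox(name):
    # Matching on the bare name string conflates the two people:
    # the engineer's photo gets attached to the killer's summary.
    return {
        "summary": wiki_summaries.get(name),
        "image": photos.get(name),
    }

print(build_infobox("Hristo Georgiev"))
# {'summary': 'Wikipedia entry on a serial killer',
#  'image': 'photo of the software engineer'}
```

Real entity-resolution systems guard against exactly this by disambiguating on signals beyond the name, such as dates, locations, and the provenance of each source.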

Georgiev is far from the first victim of the knowledge graph misfiring. The algorithm has previously generated infoboxes that falsely registered actor Paul Campbell as deceased and listed the California Republican Party’s ideology as “Nazism”.

In Georgiev’s case, the issue was swiftly resolved. After reporting the bug to Google, the company removed his image from the killer’s infobox. Georgiev gave credit to the HackerNews community for accelerating the response.

Other victims, however, may not be so lucky. If they never find the error — or struggle to resolve it — the misinformation could have troubling consequences.

I certainly wouldn’t want a potential employer, client, or partner to see my face next to an article about a serial killer.
