In the very first thing I wrote about Revealing Errors — an article published in the journal Media/Culture — one of my core examples was Google News. In that discussion, I described how the fact that Google News aggregates articles without human intervention can become quite visible through the site’s routine mistakes — errors that human editors would never commit. I gave the example of the aggregation of two articles: one from Al Jazeera headlined “Iran offers to share nuclear technology” and another from the Guardian headlined “Iran threatens to hide nuclear program.” Were they really discussing the same event? Maybe. But few human editors would have made the same call that Google News did.
Yesterday, I saw this article from Network World describing an even more egregious error, one that the article’s author had apparently predicted ahead of time.
In this case, Google News listed a parody by McNamara as the top story about the recent lawsuit filed by the MBTA (the Boston mass transit authority) against security researchers at MIT. In the past, McNamara has pointed to other examples of Google News being duped by obvious spoofs. His long list of examples includes a story about Congress enlisting the help of YouTube to grill the Attorney General (it was listed as the top story on Google News) and this story (which I dug up) about Paris Hilton’s genitals being declared a wonder of the modern world!
McNamara has devoted an extraordinary amount of time to finding and discussing other shortcomings of Google News. For example, he’s talked about how Google News has trouble filtering out highly local takes on stories of broader interest when presenting them to its general audience, about its sluggishness and inability to react to changing news circumstances, and about the sometimes hilarious and wildly inappropriate image mismatches on the Google News website. Here’s one example I dug up. Imagine what it looked like before it was censored!
As McNamara points out repeatedly, all of these errors are possible only because Google News employs no human editors. Computers remain pretty horrible at matching images to the news stories they accompany and at discerning over-the-top parody from the real thing — two tasks that most humans don’t have much trouble with. The more grossly inappropriate errors wouldn’t have made it past a human editor for multiple reasons!
As I mentioned in my original Revealing Errors article, the decision to use a human editor is an important one with profound effects on the way that users are exposed to news and, as a result, experience and understand one important part of the world around them. Google News’ frequent mistakes give us repeated opportunities to consider the way that our choice of technology — and of editors — frames this understanding.
Any error that includes a picture of bare… uhm, “big-box”es(?)… is okay by me :-)
I for one welcome our new mistaken overlords.
Hi, enjoying your blog; however, your Google News/Al Jazeera/Guardian example given here doesn’t really support your point.
In your own screenshot, the excerpt from the Al Jazeera article has “offers to share” in the headline and “while warning it will hide its nuclear programme” in the body text. So Google News is correct here, and your comment about a human making a different call is dependent on the human making a judgement based purely on the headlines (not saying that wouldn’t happen though).