Thanks to everyone who pointed me to the flub below. It was reported all over the place today.
The error occurred on One News Now, a news website run by the conservative Christian American Family Association (AFA) that provides news and commentary reflecting the group's views. One of the things the site does, apparently, is offer a version of the standard Associated Press news feed. Rather than republishing it verbatim, they run the feed through software that changes the language so it more closely reflects their values and choice of terminology.
The error is a pretty straightforward variant of the clbuttic effect: a runaway filter trying to clean up text by replacing offensive terms with theoretically more appropriate ones. Among other substitutions, AFA/ONN replaced the term “gay” with “homosexual.” In this case, that rule changed the name of champion sprinter and U.S. Olympic hopeful Tyson Gay to “Tyson Homosexual.” In fact, it did so quite a few times, as you can see in the screenshot below.
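The underlying mechanism is almost certainly a blind find-and-replace. Here is a minimal sketch in Python; the function name and exact rule are my assumptions for illustration, not AFA/ONN's actual code:

```python
# Hypothetical sketch of a feed-rewriting filter like the one this
# error suggests. A blind substitution cannot tell the adjective
# "gay" from the surname "Gay".
def rewrite_story(text):
    # Handle both capitalizations so headlines are rewritten too.
    return text.replace("Gay", "Homosexual").replace("gay", "homosexual")

print(rewrite_story("Tyson Gay easily won his semifinal"))
# -> Tyson Homosexual easily won his semifinal
```

Avoiding this would require the filter to know something about context, such as whether the word is part of a proper name, which is exactly what a simple substitution lacks.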
Now, from a technical perspective, the technology this error reveals is identical to the clbuttic mistake. What’s different, however, is the values that the error reveals.
AFA doesn’t advertise the fact that it changes words in its AP stories; it just does it. Most of its readers probably never realize that the terminology in the messages they read is being intentionally manipulated. AFA prefers the term “homosexual,” which sounds clinical, to “gay,” which sounds much more casual. Their substitution, and the error it created, reflects a set of values that AFA and ONN hold about the terminology around homosexuality.
It’s possible that AFA/ONN readers already know about AFA’s values. Even so, this error provides an important reminder and shows, quite clearly, the importance that AFA attaches to terminology. It reveals their values and some of the actions they are willing to take to protect them.
Medireview is a reference to what has become a classic revealing error. The error was noticed in 2001 and 2002 when people started seeing a series of implausibly misspelled words on a wide variety of websites. In particular, website authors were frequently swapping the nonsense word medireview for medieval. Eventually, the errors were traced back to Yahoo: each webpage containing medireview had been sent as an attachment over Yahoo’s free email system.
The explanation of this error shares a lot with previous discussions of the difficulty of searching for AppleScript on Apple’s website and my recent description of the term clbuttic. Medireview was caused, yet again, by an overzealous filter. Like the AppleScript error, the filter was an attempt to defeat cross-site scripting. Nefarious users were sending HTML attachments that, when clicked, might run scripts and cause bad things to happen; for example, they might gain access to passwords or data without a user’s permission or knowledge. To protect its users, Yahoo scanned all HTML attachments and simply rewrote any references to “problematic” terms frequently used in cross-site scripting attacks. Yahoo made the following changes to HTML attachments; each line shows a term that can be used to invoke a script and the “safe” synonym it was replaced with:
- jscript → j-script
- vbscript → vb-script
- livescript → live-script
- eval → review
- mocha → espresso
- expression → statement
This caused problems because, like in the clbuttic error, Yahoo didn’t check for word boundaries. This meant that any word containing eval (for example) would have that substring changed to review. The term evaluate was rendered reviewuate. The term medieval was rendered medireview.
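Yahoo's actual code was never published, but the observed behavior is consistent with a simple substring substitution along these lines (a sketch; the substitution table is taken from the list above):

```python
# Sketch of a filter that, like Yahoo's, replaces script-related
# terms without checking word boundaries.
REPLACEMENTS = {
    "jscript": "j-script",
    "vbscript": "vb-script",
    "livescript": "live-script",
    "eval": "review",
    "mocha": "espresso",
    "expression": "statement",
}

def naive_filter(text):
    # str.replace matches substrings anywhere, even inside larger
    # words, which is exactly what produced "medireview".
    for risky, safe in REPLACEMENTS.items():
        text = text.replace(risky, safe)
    return text

print(naive_filter("medieval"))  # -> medireview
print(naive_filter("evaluate"))  # -> reviewuate
```

A word-boundary check (for instance, a regular expression anchored with `\b`) would have left "medieval" untouched while still catching a bare `eval`.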
Of course, neither sender nor receiver knew that their attachments had been changed! Many people emailed webpages or HTML fragments which, complete with errors introduced by Yahoo, were then put online. The Indian newspaper The Hindu published an article referring to “medireview Mughal emperors of India.” Hundreds of others made similar mistakes.
The flawed script was in effect on Yahoo’s email system from at least March 2001 through July 2002 before the error was reported by the BBC, New Scientist and others.
Like a growing number of errors that I’ve covered here, this error pointed out the presence and power of an often hidden intermediary. The person who controls the technology one uses to write, send, and read email has power over one’s messages. This error forced some users of Yahoo’s system to consider that power and to make a choice about their continued use of the system. Quite a few stopped using Yahoo after the news hit the press. Others turned to technologies like public-key cryptography to help themselves and others verify that their future messages had not been accidentally or intentionally corrupted.
I’ve recently given two talks on Revealing Errors at LUG Radio Live USA and at Penguicon. I’ll be giving at least two more in the near future: today (June 18, 2008 — sorry for the late notice!) at MIT for Boston Linux Unix and at OSCON in Portland, Oregon on July 25.
There’s a post on Copyrighteous (my personal blog) with more details and links from my talks page to notes and slides.
So far, they’ve been successful and lots of fun. I travel frequently and am interested in doing more. Contact me if you’d be interested in organizing something.
Revealing errors are often most powerful when they reveal the presence of, or details about, a technology’s designer. One of my favorite classes of revealing errors are those that go one step further and reveal the values of the designers of systems. I’ve touched on these twice before, in my post about T9 input systems and when I talked about profanity in wordlists.
Another wonderful example surfaced in this humorous anecdote about what was supposed to be an invisible anti-profanity system that instead filled a website with nonsensical terms like “clbuttic.”
Basically, the script in question tried to look through user input and swap out instances of profanity for less offensive synonyms. For example, “ass” might become “butt”, “shit” might become “poop” or “feces”, and so on. To work correctly, the script should have looked for instances of profanity between word boundaries, i.e., profanity surrounded on both sides by spaces or punctuation. The script in question did not.
The result was hilarious. Not only was “ass” changed to “butt,” but any word that contained the letters “ass” was transformed as well! The word “classic” was mangled into “clbuttic.”
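The difference between the buggy substitution and a boundary-aware one can be sketched in a few lines of Python (illustrative code, not the site's actual script):

```python
import re

def naive_clean(text):
    # Replaces the substring anywhere it appears, even inside words.
    return text.replace("ass", "butt")

def boundary_clean(text):
    # \b matches only at word boundaries, so "classic" is left alone.
    return re.sub(r"\bass\b", "butt", text)

print(naive_clean("a classic ass"))     # -> a clbuttic butt
print(boundary_clean("a classic ass"))  # -> a classic butt
```

The fix is one regular-expression anchor, which is exactly why the bug is so easy to make and so easy to miss in testing.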
The mistake was an easy one to make. In fact, other programmers made the same mistake and searches for “clbuttic” turn up thousands of instances of the term on dozens of independent websites. Searching around, one can find references to a mbuttive music quiz, a mbuttive multiplayer online game, references to how the average consumer is a pbutterby, a transit pbuttenger executed by Singapore, Fermin Toro Jimenez (Ambbuttador of Venezuela), the correct way to deal with an buttailant armed with a banana, and much, much more.
You can even find a reference to how Hinckley tried to buttbuttinate Ronald Reagan!
Each error reveals the presence of an anti-profanity script; obviously, no human would accidentally misspell or mistake the words in question in any other situation! In each case, the existence of a designer and an often hidden intermediary is revealed. What’s perhaps more shocking than this error is the fact that most programmers won’t make this mistake when implementing similar systems. On thousands of websites, our posts and messages and interactions are “cleaned up” and edited without our consent or knowledge. As a matter of routine, our words are silently and invisibly changed by these systems. Few of us, and even fewer of our readers, ever know the difference. While switching “ass” to “butt” may be harmless enough, it’s a stark reminder of the power that technology gives the designers of technical systems to force their own values on their users and to frame, and perhaps to substantively change, the messages that their technologies communicate.
The blog Photoshop Disasters recently wrote a story about a small fiasco regarding cover art for the popular video game Okami.
Okami was originally released for the Sony PlayStation 2 (PS2) in 2006. The developers of the game, Clover Studios, closed up shop several months later. Here is the cover art for the PS2 game, which is indicative of the game’s unique sumi-e-inspired art.
Despite Clover’s demise, Okami won many awards and was a commercial success. It was ported (i.e., made to run on a different platform) to the Nintendo Wii by a video game production house called Ready at Dawn and by the PS2 version’s distributor, Capcom. The Wii version was released in April 2008. Here is the cover art for that version:
People looking closely at the cover of the Wii game noticed something strange right near the wolf’s mouth. Here’s a highlight with the area circled.
The blurry symbol near Okami’s mouth was a watermark — an artifact intentionally added to an image to denote the source of the picture and often to prevent others from taking undue credit. In fact, it was the logo for IGN — a very large video game website and portal. As part of writing reviews, IGN frequently takes screenshots of games, watermarks them, and posts them on their website.
Sure enough, a little bit of digging on the IGN website revealed this high-resolution image from the cover, complete with IGN watermark in the appropriate place. Apparently, a designer working for Capcom had found it easier to use the images posted by IGN than to go and get the original art from the game itself.
This error revealed quite a bit about the process and constraints that the cover designers for the Wii version were working under. Rather than getting original source images, which Capcom presumably owned, they found it easier to take them from an Internet-available source. Through the error, the usually invisible process, people, and technologies involved in this type of artwork preparation were revealed.
Embarrassed by the whole affair, Capcom offered to replace the covers with non-watermarked ones — free of charge.