Akamai and SSL

SSL stands for “Secure Sockets Layer” and refers to a protocol for using the web in a secure, encrypted manner. Every time you connect to a website at an address that begins with https://, instead of just http://, you’re connecting over SSL. Almost all banks and e-commerce sites, for example, use SSL exclusively.

SSL helps provide security for users in at least two ways. First, it helps keep communication encoded in such a way that only you and the site you are communicating with can read it. The Internet is designed in a way that makes messages susceptible to eavesdropping; SSL helps prevent this. But sending coded messages offers protection only if you trust that the person you are communicating in code with really is who they say they are. For example, if I’m banking, I want to make sure the website I’m using really is my bank’s and not some phisher trying to get my account information. The fact that we’re talking in a secret code will protect me from eavesdroppers but won’t help me if I can’t trust the person I’m talking in code with.

To address this, web browsers come with a list of trusted organizations that verify or vouch for websites. When one of these trusted organizations vouches that a website really is who it says it is, it issues what is called a “certificate” that attests to this fact. A certificate for revealingerrors.com would help users verify that they really are viewing Revealing Errors, and not some intermediary, impostor, or stand-in. If someone were to redirect traffic meant for Revealing Errors to an intermediary, users connecting using SSL would get an error message warning them that the certificate offered is invalid and that something might be awry.
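To make the mechanics concrete, here is a minimal Python sketch of the check a browser performs when it connects over SSL; the hostname is only a placeholder, and this is an illustration of the protocol rather than any particular browser’s code.

```python
import socket
import ssl

# A minimal sketch of the check a browser performs when connecting over
# SSL/TLS. The hostname is a placeholder; swap in any https:// site.
hostname = "www.example.com"
context = ssl.create_default_context()
try:
    with socket.create_connection((hostname, 443)) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            print("certificate subject:", tls.getpeercert()["subject"])
except ssl.SSLCertVerificationError as err:
    # The programmatic equivalent of the browser warning shown below:
    # the certificate offered is not for the site the user asked for.
    print("certificate verification failed:", err)
```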

That bit of background provides the first part of the explanation for this error message.

whitehouse.gov error message claiming the host is a248.e.akamai.net

In this image, a user attempted to connect to the Whitehouse.gov website over SSL — visible from the https in the URL bar. Instead of a secure version of the White House website, however, the user saw an error explaining that the certificate attesting to the identity of the website was not from the United States White House, but rather from some other website called a248.e.akamai.net.

This is a revealing error, of course. The SSL system, normally represented by little more than a lock icon in the status bar of a browser, is thrust awkwardly into view. But this particularly revealing error has more to tell. Who is a248.e.akamai.net? Why is their certificate being offered to someone trying to connect to the White House website?

a248.e.akamai.net is the name of a server that belongs to a company called Akamai. Akamai, while unfamiliar to most Internet users, serves between 10 and 20 percent of all web traffic. The company operates a vast network of servers around the world and rents space on these servers to customers who want their websites to work faster. Rather than serving content from their own computers in centralized data centers, Akamai’s customers can distribute content from locations close to every user. When a user goes to, say, Whitehouse.gov, their computer is silently redirected to one of Akamai’s copies of the Whitehouse.gov website. Often, the user will receive the web page much more quickly than if they had connected directly to the White House’s own servers. And although Akamai’s network delivers more than 650 gigabits of data per second around the world, it is almost entirely invisible to the vast majority of its users. Nearly everyone reading this uses Akamai repeatedly throughout the day and never realizes it. Except when Akamai doesn’t work.
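One way to glimpse this normally invisible redirection is to look at how a site’s hostname actually resolves. The sketch below uses a placeholder hostname; which names appear depends entirely on how a given site is configured, but hosts served by Akamai typically resolve through names ending in akamai.net or akamaiedge.net, which is how a248.e.akamai.net ends up in the picture.

```python
import socket

# Resolve a (placeholder) hostname and print where it actually points.
# For CDN-fronted sites, the canonical name and aliases can reveal the
# edge network standing between the user and the "real" site.
canonical, aliases, addresses = socket.gethostbyname_ex("www.example.com")
print("canonical name:", canonical)
print("aliases:", aliases)
print("IP addresses:", addresses)
```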

Akamai is an invisible Internet intermediary on a massive scale. But because SSL is designed to detect and highlight hidden intermediaries, Akamai has struggled to make SSL work with its service. Although Akamai offers a product designed to let customers use its network with SSL, many customers do not take advantage of it. The result is that SSL remains one place where, through error messages like the one shown above, Akamai’s normally hidden network is thrust into view. An attempt to connect to a popular website over SSL will often reveal Akamai. The White House is hardly the only victim; Microsoft’s Bing search engine launched with an identical SSL error revealing Akamai’s behind-the-scenes role.

Akamai plays an important role as an intermediary for a large chunk of all activity online. Not unlike Google, Akamai has enormous power to monitor users’ Internet usage and to control or even alter the messages that users send and receive. But while Google repeatedly — if not often enough — has its feet held to the fire by privacy and civil liberties advocates, Akamai is mostly ignored.

We appreciate the power that Google has because Google is visible — right there in our URL bar — every time we connect to Google Search, GMail, Google Calendar, or any of Google’s growing stable of services. Akamai’s very existence, on the other hand, is hidden and its power obscured. But Akamai’s role as an intermediary is no less important due to its invisibility. Errors provide one opportunity to highlight Akamai’s role and the power it retains.

Faces of Google Street View

This error was revealed and written up by Fred Benenson and first published on his blog.

Google Streetview Blurred Face Example

After receiving criticism for the privacy-violating “feature” of Google Street View that enabled anyone to easily identify people who happened to be on the street as Google’s car drove by, the search giant started blurring faces.

What is interesting, and what Mako would consider a “Revealing Error,” is when the auto-blur algorithm cannot distinguish between a face in an advertisement and the face of a real passer-by. For the ad, the model has been compensated to have his likeness (and privacy) commercially exploited for the brand being advertised. On the other hand, there is a legal grey area as to whether Google can do the same for random people on the street, and rather than face more privacy criticism, Google chooses to blur their identities to avoid raising the issue of whether it has the right to do so, at least in America.
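As a rough illustration of why the algorithm cannot tell the difference, here is a minimal face-blurring sketch using OpenCV’s stock Haar-cascade detector; the filenames are placeholders and this is not Google’s actual pipeline. A detector like this matches patterns of pixels and has no notion of whether those pixels belong to a pedestrian or to a model on a billboard.

```python
import cv2

# Load OpenCV's bundled frontal-face detector and a placeholder image.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
image = cv2.imread("street_scene.jpg")
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Blur every region that looks like a face, whether it belongs to a
# passer-by or to a face printed on an advertisement.
for (x, y, w, h) in cascade.detectMultiScale(gray, scaleFactor=1.1,
                                             minNeighbors=5):
    image[y:y + h, x:x + w] = cv2.GaussianBlur(
        image[y:y + h, x:x + w], (51, 51), 0)

cv2.imwrite("street_scene_blurred.jpg", image)
```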

So who cares that the advertisement has been modified? The advertiser, probably. If a 2002 case is any indication, advertisers do not like it when their carefully placed and expensive Manhattan advertisements get digitally altered. While the advertisers lost a case against Sony for changing (and charging for) advertisements in the background of Spider-Man scenes set in Times Square, it’s clear that they were expecting their ads to actually show up in whatever work happened to be created in that space. There are interesting copyright implications here, too, as the case demonstrates an implicit desire by big media for work like advertising to be reappropriated and recontextualized, because doing so serves the point of getting a name “out there.”

To put my undergraduate philosophy degree to use, I believe these cases raise deep ethical and ontological questions about the right to control and exhibit realities (Google Street View being one reality, Spider-Man’s Times Square being another) as they relate to the real one. Is it just the difference between a fictional and a non-fictional reality? I don’t think so, as no one uses Google Maps expecting to retrieve information that is fictional. Regardless, expect these kinds of issues to come up more and more frequently as Google increases its resolution and virtual worlds merge closer to real worlds.

Speed Camera

In the past, I’ve talked about how certain errors can reveal a human in what we may imagine is an entirely automated process. I’ve also shown quite a few errors that reveal the absence of a human just as clearly. Here’s a photograph attached to a speeding ticket given by an automated speed camera that shows the latter.

Photograph of a tow-truck towing a car down a road.

The Daily WTF published this photograph, which was sent in by Thomas, one of their readers. The photograph came attached to this summons, which arrived in the mail and explained that Thomas had been caught traveling 72 kilometers per hour in a 60 KPH zone. The photograph above was attached as evidence of his crime. He was asked to pay a fine or show up in court to contest it.

Thomas, obviously, should never have been fined or threatened. It’s clear from the picture that his car is being towed. Somebody was going 72 KPH, but it was the tow-truck driver, not Thomas! Anybody who looked at the image could see this.

In fact, Thomas was the first person to see the image. The photograph was taken by a speed camera: a radar gun measured a vehicle moving in excess of the speed limit and triggered a camera, which took a photograph. A computer subsequently analyzed the image to read the license plate number and look up the driver in a vehicle registration database. The system then printed a fine and summons notice and mailed it to the vehicle’s owner. The Daily WTF editor points out that proponents of these automated systems often guarantee human oversight in their implementation. This error reveals that the human oversight in the application of this particular speed camera is either very little or none at all.
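As a purely hypothetical illustration of what such a pipeline looks like, consider the sketch below. Every name and value in it is invented; the point is only that no step between the radar trigger and the mailed summons asks a person to look at the photograph.

```python
from dataclasses import dataclass

@dataclass
class SpeedEvent:
    photo_path: str       # photograph taken when the radar triggered
    measured_kph: float
    limit_kph: float

def read_plate(photo_path):
    # Stand-in for automated license-plate recognition.
    return "ABC-123"

def lookup_owner(plate):
    # Stand-in for the vehicle registration database.
    return "Thomas"

def process(event):
    # Note what is missing: no step asks a person to look at the photo.
    if event.measured_kph <= event.limit_kph:
        return None
    owner = lookup_owner(read_plate(event.photo_path))
    return "Summons for %s: %.0f KPH in a %.0f KPH zone" % (
        owner, event.measured_kph, event.limit_kph)

print(process(SpeedEvent("towed_car.jpg", 72, 60)))
```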

Of course, Thomas will be able to avoid paying the fine — the evidence that exonerates him is literally printed on his court summons. But it will take work and time. The completely automated nature of this system, revealed by this error, has deep implications for the way that justice is carried out. The system is one where people are watched, accused, fined, and processed without any direct human oversight. That has some benefits — e.g., computers are unlikely to let people of a certain race, gender, or background off more easily than others.

But in addition to creating the possibility of new errors, the move from a human to a non-human process has important economic, political, and social consequences. Police departments can give more tickets with cameras — and generate more revenue — than they could ever do with officers in squad cars. But no camera will excuse a man speeding to the hospital with his wife in labor or a hurt child in the passenger seat. As work-to-rule or “rule-book slowdowns” — types of labor protest in which workers cripple production by following rules to the letter — show, many rules are only productive for society because they are selectively enforced. The complex calculus that goes into deciding when not to apply the rules, second nature to humans, is still impossibly out of reach for most computerized expert systems. Errors like the one described here are an increasingly important reminder of that fact.

Tyson Homosexual

Thanks to everyone who pointed me to the flub below. It was reported all over the place today.

Screenshot showing Tyson Homosexual instead of Tyson Gay

The error occurred on One News Now, a news website run by the conservative Christian American Family Association. The site provides Christian conservative news and commentary. One of the things it does, apparently, is offer a version of the standard Associated Press news feed. Rather than just republishing it, ONN runs software that automatically rewrites the language so it more accurately reflects the organization’s values and choice of terminology.

The error is a pretty straightforward variant of the clbuttic effect — a runaway filter trying to clean up text by replacing offensive terms with theoretically more appropriate ones. Among other substitutions, AFA/ONN replaced the term “gay” with “homosexual.” In this case, the filter changed the name of champion sprinter and U.S. Olympic hopeful Tyson Gay to “Tyson Homosexual.” In fact, it did so quite a few times, as you can see in the screenshot below.

Screenshot showing Tyson Homosexual instead of Tyson Gay.

Now, from a technical perspective, the technology this error reveals is identical to the clbuttic mistake. What’s different, however, are the values that the error reveals.
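A few lines of Python are enough to reproduce the mechanism. The substitution list below is illustrative — ONN’s actual rules are not public — but it shows how a blanket, case-insensitive replacement with no check for proper nouns turns a name into a headline-worthy flub.

```python
import re

# Illustrative substitution list; the real filter's rules are not public.
SUBSTITUTIONS = {"gay": "homosexual"}

def sanitize(text):
    for term, preferred in SUBSTITUTIONS.items():
        # Case-insensitive replacement, with no check for proper nouns.
        text = re.sub(term, preferred, text, flags=re.IGNORECASE)
    return text

print(sanitize("Tyson Gay easily won his semifinal"))
# prints: Tyson homosexual easily won his semifinal
```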

AFA doesn’t advertise the fact that it changes words in its AP stories — it just does it. Most of its readers probably never notice the difference or realize that the terminology in the stories they read has been intentionally manipulated. AFA prefers the term “homosexual,” which sounds clinical, to “gay,” which sounds much less serious. Its substitution, and the error it created, reflects a set of values that AFA and ONN hold about the terminology around homosexuality.

It’s possible that AFA/ONN readers already know about AFA’s values. Even so, this error provides an important reminder and shows, quite clearly, the importance that AFA gives to terminology. It reveals the organization’s values and some of the actions it is willing to take to protect them.

Medireview

Medireview is a reference to what has become a classic revealing error. The error was noticed in 2001 and 2002, when people started seeing a series of implausibly misspelled words on a wide variety of websites. In particular, the nonsense word medireview kept appearing where the word medieval should have been. Eventually, the errors were traced back to Yahoo: each webpage containing medireview had been sent as an attachment over Yahoo’s free email system.

The explanation of this error shares a lot in common with previous discussions of the difficulty of searching for AppleScript on Apple’s website and my recent description of the term clbuttic. Medireview was caused, yet again, by an overzealous filter. Like the AppleScript error, the filter was an attempt to defeat cross-site scripting. Nefarious users were sending HTML attachments that, when clicked, might run scripts and cause bad things to happen — for example, they might gain access to passwords or data without a user’s permission or knowledge. To protect its users, Yahoo scanned through all HTML attachments and simply replaced any references to “problematic” terms frequently used in cross-site scripting. Yahoo made the following changes to HTML attachments — each line shows a term that can be used to invoke a script and the “safe” synonym it was replaced with:

  • javascript → java-script
  • jscript → j-script
  • vbscript → vb-script
  • livescript → live-script
  • eval → review
  • mocha → espresso
  • expression → statement

This caused problems because, as in the clbuttic error, Yahoo didn’t check for word boundaries. This meant that the eval inside any word containing it (for example) would be changed to review. The term evaluate was rendered reviewuate. The term medieval was rendered medireview.
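Here is a short Python sketch of the substitution as reported, alongside a word-boundary-aware variant that would have left medieval and evaluate alone. It is only an illustration of the mechanism, not Yahoo’s actual code.

```python
import re

# The substitution table reported for Yahoo's filter, as a dictionary.
REPLACEMENTS = {
    "javascript": "java-script",
    "jscript": "j-script",
    "vbscript": "vb-script",
    "livescript": "live-script",
    "eval": "review",
    "mocha": "espresso",
    "expression": "statement",
}

def filter_naive(html):
    # What the filter effectively did: plain substring replacement.
    for term, safe in REPLACEMENTS.items():
        html = html.replace(term, safe)
    return html

def filter_bounded(html):
    # A word-boundary-aware variant that leaves "medieval" alone but
    # still catches a bare "eval" used to invoke a script.
    for term, safe in REPLACEMENTS.items():
        html = re.sub(r"\b" + term + r"\b", safe, html)
    return html

print(filter_naive("medieval scholars evaluate manuscripts"))
# -> medireview scholars reviewuate manuscripts
print(filter_bounded("medieval scholars evaluate manuscripts"))
# -> medieval scholars evaluate manuscripts
```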

Of course, neither sender nor receiver knew that their attachments had been changed! Many people emailed webpages or HTML fragments which, complete with errors introduced by Yahoo, were then put online. The Indian newspaper The Hindu published an article referring to “medireview Mughal emperors of India.” Hundreds of others made similar mistakes.

The flawed script was in effect on Yahoo’s email system from at least March 2001 through July 2002, when the error was finally reported by the BBC, New Scientist, and others.

Like a growing number of errors that I’ve covered here, this error pointed out the presence and power of an often hidden intermediary. The person who controls the technology one uses to write, send, and read email has power over one’s messages. This error forced some users of Yahoo’s system to consider this power and to make a choice about their continued use of the system. Quite a few stopped using Yahoo after this news hit the press. Others turned to technologies like public-key cryptography so that they and their correspondents could verify that future messages had not been corrupted, accidentally or intentionally.