Saturday, April 21, 2012

Prove That You Are Human

Or "The Reverse Turing Test"

As stated previously, the Turing Test was proposed by the late, great thinker Alan Turing as a way to explore the possibility of artificial intelligence in machines.

While it is a most impressive feat for a machine to pass the Turing Test, whatever shall we do in the event that this amazing pantomime is... misapplied?



I'm not speaking of highly intelligent robot doppelgangers hellbent on revenge against the fleshy, stinky, and altogether flawed race that developed them. No, nothing so outwardly nefarious. Rather, the computer-for-human bait-and-switch is being carried out every day, and it has been a problem ever since household personal computers and the internet became widespread.

As most of us already know by now, a "bot" is a program designed to mimic human conversation in an effort to coax important information out of users. For example, a common sort of bot nowadays poses as a pretty female type to entice humans into filling out a form to join what would amount to a porn site, were it not buggy and entirely fictional. By doing this, certain parties can collect that info and use it for their own dastardly purposes, whether it be theft or blackmail or making Private Information Stirfry (I don't know what these sorts of people do).

The bot problem was initially identified by an employee of the now mostly-defunct Yahoo!. The employee presented the problem to a team of scientists, who sought out methods that would block machines while still allowing access to genuine humans. Hence, CAPTCHA.

In order to develop such methods, the scientists devised what is essentially a "reverse" Turing Test: the judge is a machine rather than a human, and its goal is to reliably tell the two apart. The test works by displaying a series of Roman letters that have been distorted, visually filtered, and jumbled in such a way that they remain immediately recognizable to a somewhat literate human, but are utterly confounding to a scanner attached to an artificial mind.
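For the curious, here is a rough idea of what generating one of these challenges might look like. This is only a minimal sketch in Python using the Pillow imaging library (my own assumption for illustration; it is not how the researchers in the linked paper actually built their system): render a few letters with random jitter, sprinkle in noise, and blur the result so the clean character edges that a scanner relies on disappear.

    # Hypothetical sketch of a CAPTCHA-style challenge generator (requires Pillow).
    import random
    import string
    from PIL import Image, ImageDraw, ImageFilter, ImageFont

    def make_captcha(text=None, size=(200, 80)):
        """Render distorted text a human can read but naive OCR may not."""
        text = text or "".join(random.choices(string.ascii_uppercase, k=5))
        img = Image.new("L", size, color=255)      # grayscale, white background
        draw = ImageDraw.Draw(img)
        font = ImageFont.load_default()            # a real system would use a larger TTF font

        # Draw each letter with random vertical jitter and uneven spacing.
        x = 20
        for ch in text:
            y = random.randint(10, 40)
            draw.text((x, y), ch, fill=0, font=font)
            x += random.randint(25, 35)

        # Speckle noise so the letters do not segment cleanly.
        for _ in range(300):
            px = (random.randrange(size[0]), random.randrange(size[1]))
            img.putpixel(px, random.randint(0, 255))

        # A mild blur smears the sharp edges OCR depends on.
        img = img.filter(ImageFilter.GaussianBlur(radius=1))
        return text, img

    if __name__ == "__main__":
        answer, challenge = make_captcha()
        challenge.save("captcha.png")
        print("Expected answer:", answer)

A real deployment would use much heavier warping and, of course, keep the expected answer on the server so the bot never sees it.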

Thus far, such efforts have been reasonably successful. Very, very mildly inconvenient, but absolutely necessary.

Article here: http://www.aladdin.cs.cmu.edu/papers/pdfs/y2001/pessimalprint.pdf


