While it is easy to study your website’s traffic stats and assume that all those figures represent human visitors, the reality is that a growing share of traffic is non-human: robots (bots), search engine spiders, and other automated agents, some benignly seeking to index your site and others maliciously harvesting its content or otherwise exploiting its services.
One way to prevent unauthorized access to a website is to require a user login; but with machines becoming increasingly clever, a way to keep them from logging in is required, a method that distinguishes man from machine.
Computer scientist Alan Turing performed groundbreaking research in this endeavor, and the resulting Turing tests bear his name; but it was a team at Carnegie Mellon University that developed the most familiar form of the test: CAPTCHA (Completely Automated Public Turing test to tell Computers and Humans Apart).
Wikipedia notes that a CAPTCHA “is sometimes described as a reverse Turing test, because it is administered by a machine and targeted at a human, in contrast to the standard Turing test that is typically administered by a human and targeted at a machine.”
Legitimate user difficulties have plagued the use of CAPTCHA systems since their inception, and the World Wide Web Consortium has addressed the issue of accessibility and Turing tests (www.w3.org/TR/turingtest/).
“The most popular solution at present is the use of graphical representations of text in registration or comment areas,” the standards body explains. “The site attempts to verify that the user in question is in fact a human by requiring the user to read a distorted set of characters from a bitmapped image, [and] then enter those characters into a form.”
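To make that mechanism concrete, the following is a minimal sketch of the server-side flow the W3C describes: generate a random string, render it as a distorted bitmapped image, and later compare the user's form input against the stored answer. It assumes the Pillow imaging library is available; the function names and drawing parameters are illustrative, not taken from any particular CAPTCHA library.

```python
# Minimal sketch of a text-distortion CAPTCHA flow (illustrative only).
import random
import string

from PIL import Image, ImageDraw, ImageFont  # assumes Pillow is installed


def generate_challenge(length=6):
    """Pick a random string and render it as a noisy bitmapped image."""
    text = "".join(random.choices(string.ascii_uppercase + string.digits, k=length))

    image = Image.new("RGB", (200, 70), "white")
    draw = ImageDraw.Draw(image)
    font = ImageFont.load_default()

    # Draw each character at a slightly randomized vertical offset.
    for i, ch in enumerate(text):
        draw.text((15 + i * 30, 20 + random.randint(-8, 8)), ch, fill="black", font=font)

    # Add noise lines to frustrate simple OCR.
    for _ in range(8):
        start = (random.randint(0, 200), random.randint(0, 70))
        end = (random.randint(0, 200), random.randint(0, 70))
        draw.line([start, end], fill="gray", width=1)

    return text, image


def verify_response(expected, submitted):
    """Compare the user's form input against the stored challenge text."""
    return submitted.strip().upper() == expected


if __name__ == "__main__":
    answer, img = generate_challenge()
    img.save("captcha.png")  # served to the browser as a bitmapped image
    # The expected answer is kept server-side (e.g., in the session),
    # never embedded in the page markup where a bot could read it.
    print(verify_response(answer, input("Type the characters you see: ")))
```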
According to the W3C, this type of visual and textual verification comes at a huge price to users who are blind, visually impaired, or dyslexic, because the images have no accompanying text equivalent; providing one would allow automated systems to read it and thus defeat the effort to secure the website.