Saturday, August 19, 2006

Snap Judgments

Clive Thompson posted some interesting reflections about Amazon's Mechanical Turk program, in which ordinary people -- often at their day jobs -- can earn money from piece work doing digital tasks that even sophisticated computers can't manage. In "Why Humans Have the Best Artificial Intelligence," Thompson explains how our capacity as humans to make simple snap judgments makes us better categorizers than high-tech indexing machines.

What I love about the Mechanical Turk is that it capitalizes on an interesting limitation in artificial intelligence: Computers suck at many tasks that are super-easy for humans. Any idiot can look at a picture and instantly recognize that it's a picture of a pink shoe. Any idiot can listen to a .wav file and realize it's the sound of a dog barking. But computer scientists have spent billions trying to train software to do this, and they've utterly failed.

More generally, this metadata industry is booming in many parts of the country in which traditional industries have failed or natural resources have become exhausted. For example, in Chester, Vermont, I watched digital archivists patiently categorizing pictures and choosing between misspellings in ways that only a human can do efficiently. Unlike the concentrated work practices at the scrupulous Chester plant, however, Mechanical Turk encourages multitasking.

Thompson also notes the negative implications for collective bargaining created by programs like Mechanical Turk: "Mind you, while the cognitive-science aspects of the Mechanical Turk are incredibly cool, the labor dimensions freak the hell out of high-tech labor unions."

Of course, there are several online programs that celebrate our subjective tastes rather than objective observations. For example, you can vote for the cutest kittens at Kitten War (I love the "Losingest Kittens" category) or indicate the kind of sky you prefer at Cloud Shape Classifier. All this data can be aggregated so that you can watch a collective aesthetic emerge, or generate an ideal object of contemplation for a specific individual.
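For the curious, here is a minimal sketch of how pairwise votes like Kitten War's might be tallied into a collective ranking. The kitten names and votes below are invented for illustration; this is just one simple aggregation scheme (ranking by win rate), not a description of how any of these sites actually work.

```python
from collections import Counter

def rank_by_win_rate(votes):
    """Aggregate head-to-head votes into a ranking.

    votes: list of (winner, loser) tuples, one per matchup vote.
    Returns contestants sorted by win rate, cutest (most wins) first.
    """
    wins = Counter()
    matchups = Counter()
    for winner, loser in votes:
        wins[winner] += 1
        matchups[winner] += 1
        matchups[loser] += 1
    return sorted(matchups, key=lambda k: wins[k] / matchups[k], reverse=True)

# Hypothetical votes: each tuple records which kitten won a matchup.
votes = [("Mittens", "Socks"), ("Mittens", "Whiskers"), ("Socks", "Whiskers")]
print(rank_by_win_rate(votes))  # → ['Mittens', 'Socks', 'Whiskers']
```

Here Mittens wins both matchups (win rate 1.0), Socks wins one of two (0.5), and Whiskers loses both (0.0) -- the "Losingest Kitten" of this tiny dataset.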

Our online behaviors tell us a lot about our human vulnerabilities as well, as the recent AOL scandal involving the release of highly personal search terms shows us. The New York Times story, "A Face Is Exposed for AOL Searcher No. 4417749," makes manifest how simple it can be to work backwards to the individual from search terms, even without elaborate data mining tools. To see why this might be a problem for privacy, you can see some of the more embarrassing online searches immortalized at Something Awful. (Watch out, since some of the words and phrases aren't "work-safe.")


