How computers find naked people

We are all just collections of limbs and appendages, naked before the Internet. Some of us—or some of our devices—just may not know this yet. Curiously, it’s our devices that lag behind in this regard. The Internet, as was correctly noted in Avenue Q, may be for porn, but identifying nudity is not technology’s forte. As such, content moderation remains tedious human labor, and as Adrian Chen documented in his excellent 2014 Wired feature, it’s a soul-crushing search for dick pics—among other things.

So, here we are. It’s 2016, and as Clarifai “data scientist and NSFW enthusiast” Ryan Compton (yes, that’s his title) puts it: “the discovery of nude pictures has been a central problem in computer vision for over two decades.” Potter Stewart’s “I know it when I see it” maxim doesn’t apply to technology. Devices don’t really see things—not in any conventional sense. Computers can recognize patterns, but that raises the question of what nudity looks like as a pattern. Humanity, as anyone with an Internet connection can tell you, has conjured all sorts of imaginative ways of being nude, so good luck with that one.


Early approaches to nudity pattern recognition, Compton notes, searched for large stretches of skin-toned color in images and then narrowed the field by focusing on limb-like shapes. This technique, outlined in the wonderfully titled academic text “Finding Naked People,” worked a little more than half the time. The more recent (and seemingly effective) approach that Compton identifies teaches artificial intelligence to recognize “not safe for work” (NSFW) imagery from historical patterns as opposed to strict prescriptions. (Clarifai works in the artificial intelligence space, and if you want more technical details, their blog post is a great resource.) In short, the system “correctly learned penis, anus, vulva, nipple, and buttocks—objects which our model should flag.”
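To get a feel for how crude the early skin-detection stage was, here is a minimal sketch of that first step in Python. The specific RGB thresholds below are a commonly cited rule of thumb from the early skin-detection literature, not the exact model used in “Finding Naked People” (which used a more sophisticated color-and-texture model followed by limb grouping, omitted here):

```python
import numpy as np

def skin_fraction(image):
    """Fraction of pixels matching a classic RGB skin-tone heuristic.

    `image` is an (H, W, 3) uint8 RGB array. The thresholds are a
    widely quoted rule of thumb, used here purely for illustration.
    """
    img = image.astype(int)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    mask = (
        (r > 95) & (g > 40) & (b > 20)
        & (img.max(axis=-1) - img.min(axis=-1) > 15)  # enough color spread
        & (abs(r - g) > 15) & (r > g) & (r > b)       # reddish dominance
    )
    return mask.mean()

def looks_nsfw(image, threshold=0.3):
    """Crude first-pass flag: large stretches of skin-toned pixels.

    Real systems then searched those regions for limb-like shapes;
    this sketch stops at the color stage, which is exactly why the
    original technique only worked a little more than half the time.
    """
    return skin_fraction(image) > threshold
```

A heuristic like this fails in obvious ways—sand dunes and wooden furniture read as “skin,” while nudity in unusual lighting does not—which is why the field moved to classifiers trained on labeled examples instead of hand-written rules.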

Is this not safe for work?

This model of NSFW content appears to be effective, but it carries with it much of the baggage of human moderation. After all, the term ‘not safe for work’ is more of a moral judgment than an objective determination. If the AI learns that red lips are indicators of more adult content, which they apparently are, it is simply projecting human mores onto images. Similarly, one cannot expect deep learning or AI to solve the running problem of social networks treating breastfeeding as sexual content. (That is also to say nothing of the double standards for male and female bodies.) Technology can make content moderation easier, but it can’t fix our deeper, more human struggles with the naked human form.