Smut Filter Blocks All But Smut

This story from Wired tells about a filterware product that could supposedly tell the difference between a naughty picture and one that wasn’t. The company, Exotrope Inc., introduced its “BAIR” program last year to much fanfare, but Wired ran some tests, and it turns out the software does not perform as advertised.

“I agree with you. There’s something wrong,” says Dave Epler, Exotrope operations manager. “That’s not the way our image server is supposed to be working.”

In tests of hundreds of images, BAIR incorrectly blocked dozens of photographs, including portraits, landscapes, animals, and street scenes. It barred readers from viewing news photos at time.com and newsweek.com, yet rated images of oral sex, group sex, and masturbation as acceptable for youngsters.

Company representatives say they can’t explain the program’s seemingly random behavior.