Christian Haschek, who lives in Austria, came up with the solution after he discovered an image showing child sex abuse had been uploaded to his image-hosting platform Pictshare.
He called the police, who told him to print it out and bring it to them.
However, it is illegal to possess images of child abuse, whether digital or printed.
“Erm… not what I planned to do,” Mr Haschek said.
Instead he put together a homegrown solution for identifying and removing explicit images.
Mr Haschek used three Raspberry Pis driving two Intel Movidius Neural Compute Sticks, which can be trained to classify images. He also used Open NSFW (Not Safe For Work), an open-source algorithm for identifying explicit material, available free of charge from Yahoo.
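The basic idea of such a pipeline is straightforward: a classifier like Yahoo's Open NSFW model assigns each image a score between 0 and 1 for how likely it is to be explicit, and the host quarantines anything above a chosen threshold. A minimal sketch of that filtering step, with a stubbed classifier standing in for the real model (the `classify` callable, the `0.8` threshold, and the file names here are illustrative assumptions, not details from Mr Haschek's setup):

```python
def moderate(images, classify, threshold=0.8):
    """Split image paths into (kept, quarantined) lists by NSFW score.

    `classify` is any callable returning a 0..1 explicitness score
    for a given image path -- in a real deployment this would wrap
    inference on the Open NSFW model.
    """
    kept, quarantined = [], []
    for path in images:
        score = classify(path)  # probability the image is explicit
        if score >= threshold:
            quarantined.append(path)
        else:
            kept.append(path)
    return kept, quarantined

# Demo with a stub classifier: a lookup table of made-up scores.
fake_scores = {"cat.jpg": 0.02, "beach.jpg": 0.35, "bad.jpg": 0.97}
kept, quarantined = moderate(fake_scores, fake_scores.get)
```

In the demo, `cat.jpg` and `beach.jpg` score below the threshold and are kept, while `bad.jpg` is quarantined. A real deployment would run the model inference on the Movidius sticks rather than a lookup table, but the thresholding logic is the same.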
Each Friday is PiDay here at Adafruit! Be sure to check out our posts, tutorials and new Raspberry Pi related products. Adafruit has the largest and best selection of Raspberry Pi accessories and all the code & tutorials to get you up and running in no time!
Stop breadboarding and soldering – start making immediately! Adafruit’s Circuit Playground is jam-packed with LEDs, sensors, buttons, alligator clip pads and more. Build projects with Circuit Playground in a few minutes with the drag-and-drop MakeCode programming site, learn computer science using the CS Discoveries class on code.org, jump into CircuitPython to learn Python and hardware together, or even use Arduino IDE. Circuit Playground Express is the newest and best Circuit Playground board, with support for MakeCode, CircuitPython, and Arduino. It has a powerful processor, 10 NeoPixels, mini speaker, InfraRed receive and transmit, two buttons, a switch, 14 alligator clip pads, and lots of sensors: capacitive touch, IR proximity, temperature, light, motion and sound. A whole wide world of electronics and coding is waiting for you, and it fits in the palm of your hand.