It has long been stressed that once you send an image to someone else, or post it online, you lose virtually all control over how that image is used and where it might end up. While this is still mostly true, the National Center for Missing and Exploited Children (NCMEC), with funding from Meta, has developed a free tool that makes it easier to remove explicit images of minors from certain online environments. If you are under 18 and believe a nude or semi-nude image of you has been posted online (or could be), you can use this service anonymously to remove the image (or prevent it from being posted in the first place).
The new service, called “Take It Down,” works by assigning a unique digital fingerprint or “hash value” to an image so that it can be easily and automatically detected when it ends up online. Essentially, the pixels of the image are analyzed to create an identifier unique to that particular image. You do not need to upload the image to an online database; you simply select the image to be analyzed by the tool (e.g., from your camera roll, photo gallery, or File Explorer). The image never leaves your device. The hash value that is created is then added to the NCMEC database, and any time that value appears on a participating online platform, the image is flagged and removed (and the user who posted it could be punished by the platform). The tool is currently only available to those who were under the age of 18 when a nude, partially nude, or otherwise explicit image of themselves was (or might be) shared online.
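To make the privacy point concrete, here is a minimal sketch of what client-side fingerprinting looks like. This is a hypothetical illustration: the service does not publicly document its hashing algorithm, so SHA-256 stands in here. The key idea is that only the short fingerprint string, never the image itself, would need to leave the device.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fixed-length digest of raw image data on the device.

    SHA-256 is a stand-in for the service's undocumented hashing scheme.
    The digest is a one-way value: it identifies the image for matching
    purposes but cannot be reversed to reconstruct the picture.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# Placeholder for image bytes read locally (e.g., from a camera-roll file).
image = bytes(range(256)) * 16

# Only this 64-character hex string would be submitted to the database.
print(fingerprint(image))
```

Because the digest is fixed-length and one-way, a platform can check uploads against a database of fingerprints without anyone ever storing or viewing the underlying image.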
At the moment, only a handful of online platforms are participating (Facebook, Instagram, Yubo, OnlyFans, and Pornhub), but this is expected to increase over time. The tool also isn’t currently able to stop people from sharing your image via messaging apps. There are other limitations. For example, only identical copies of the image will be detected by the online platforms. Any change at all to the image will change its digital fingerprint. Simply cropping the image or adding a sticker or other visual alteration will disrupt the identification process. I suspect the technology will continue to advance, though, and perhaps in the not-too-distant future even better detection models will be developed.
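The exact-match limitation described above is easy to demonstrate. Continuing the hypothetical SHA-256 stand-in from before (the service's real algorithm is not documented here), even a one-bit change to the file produces a completely different fingerprint, which is why cropping or adding a sticker defeats detection:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # SHA-256 stands in for the service's undocumented hashing scheme.
    return hashlib.sha256(data).hexdigest()

# Stand-in bytes for an image file.
original = b"\x89PNG placeholder" + bytes(1000)

edited = bytearray(original)
edited[500] ^= 0x01           # flip a single bit (e.g., one pixel tweaked)
cropped = original[:-1]       # drop a single byte (crude "crop")

print(fingerprint(original) == fingerprint(bytes(edited)))  # False
print(fingerprint(original) == fingerprint(cropped))        # False
```

Perceptual hashing schemes exist that tolerate this kind of alteration by fingerprinting visual features rather than raw bytes, which is presumably the direction the "better detection models" mentioned above would take.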
When I first heard about this tool, I immediately wondered about other possible applications for the technology. For now it can only be used for explicit images of minors. But could it also be used to address cyberbullying? For example, if someone took a humiliating image of me and posted it online without my permission, the same technology could be used to remove that image. Presumably any online content (including audio, video, and text) could be fingerprinted in the same way to allow easier removal. We’re regularly contacted by people who are desperate to remove embarrassing or hurtful content from the web, and a tool like this could make that easier.
Obviously the best way to keep explicit images from being shared online is to prevent them from being created in the first place. But our research shows that at least one in five high schoolers has sent a sext, and more than a third have received one. And situations can occur where a teen shares an image with someone they think they can trust, and that person turns on them and shares the image online (or threatens to do so). Or maybe someone takes your picture without your knowledge, or hacks into your device and steals some personal images. Whatever happened, it is important that platforms and apps provide tools to address these kinds of situations and minimize the damage. This is a step in the right direction for the prevention of sextortion (the threatened distribution of private images), a problem that only seems to be growing based on our research.
If you need help dealing with a problem involving explicit images of yourself or someone you care about who is under the age of 18, you can also contact the CyberTipline. If you are an adult and someone is threatening to share your intimate images without your permission, you can visit https://stopncii.org/.