Take It Down, a new online tool, gives teens the power to remove their own nude photos from certain platforms and block their photos from being uploaded.
The program was developed by the US National Center for Missing and Exploited Children (NCMEC) and funded in part by Facebook’s parent company Meta. The tool is specifically intended for minors.
A spokesperson for NCMEC told Britain’s The Guardian newspaper that the program is aimed at people who know that nude photos of them are circulating on the internet, or who fear they will end up there, for example because sexually explicit photos were sent to a partner or because someone is blackmailing them with nude pictures.
How does it work?
Users can submit images anonymously, after which the site creates a “hash”, a kind of unique mathematical fingerprint. The nude photo or video itself is never uploaded.
What’s new about the program is that teens who suspect their nude pictures are circulating can also create a preventive digital fingerprint. These hashes are added to a blacklist that participating social media platforms check their images against. If someone later tries to post the photo, it can be detected and blocked, as sketched below.
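To make the mechanism concrete, here is a minimal sketch of hash-based matching. The article does not say which hashing algorithm Take It Down uses, so SHA-256 stands in for the “mathematical fingerprint”, and the function names are illustrative, not the service’s actual API.

```python
import hashlib
from pathlib import Path

# Illustrative sketch only: the actual hashing algorithm used by
# Take It Down is not specified in the article.

def fingerprint(image_path: str) -> str:
    """Hash the raw bytes of an image; the image itself never leaves the device."""
    return hashlib.sha256(Path(image_path).read_bytes()).hexdigest()

# A participating platform keeps a blacklist of reported hashes.
blacklist: set[str] = set()

def report_image(image_path: str) -> None:
    """The user submits only the fingerprint, never the photo or video."""
    blacklist.add(fingerprint(image_path))

def allow_upload(image_path: str) -> bool:
    """The platform compares each new upload against the blacklist."""
    return fingerprint(image_path) not in blacklist
```

The key design point the article describes is preserved here: only the fingerprint is shared, so neither NCMEC nor the platforms ever receive the image itself.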
Antigone Davis, Meta’s global head of safety, told The Guardian that the tool also works with “deepfakes”: artificial videos in which someone’s likeness is manipulated, sometimes convincingly lifelike.
What can’t it do?
A major limitation of Take It Down is that it relies on platforms voluntarily joining the initiative. So far, only Facebook, Instagram, OnlyFans, Pornhub and Yubo have committed to the tool. Images cannot be removed from other platforms such as TikTok and Telegram. The blacklist also doesn’t work on end-to-end encrypted messaging services like WhatsApp and iMessage.
If the picture has been edited, for example because a filter has been applied or an emoji has been added, the hash no longer matches and the photo can evade detection.
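This limitation follows directly from how exact hashing works: changing even a single byte of a file produces a completely different digest. A tiny sketch (with hypothetical byte strings standing in for image files) illustrates the point, assuming a cryptographic hash like the one above.

```python
import hashlib

# Hypothetical byte strings standing in for an original and a lightly edited image.
original = b"...image bytes..."
edited = original + b"\x00"  # e.g. an added emoji changes the file's bytes

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(edited).hexdigest())
# The two digests share nothing, so an exact-hash blacklist misses the edited copy.
```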
It’s also not clear how the service prevents people from blacklisting legal images so they can no longer be shared. And because the image or video itself is never uploaded, NCMEC cannot review which material has been blacklisted.
Traumatic
According to NCMEC, the program specifically aims to address child abuse and exploitation online: “Having your own nude pictures online can be very traumatic, especially for young people,” an NCMEC spokesperson told The Guardian.
This is not the first time Meta has been involved in such initiatives. In 2017, Facebook, as the company was then called, attempted to build a similar tool for adults. That effort failed because the company asked volunteers to submit their nude photos directly. In 2021, Meta partnered with the UK’s Revenge Porn Helpline to launch StopNCII (Stop Non-Consensual Intimate Images), a program aimed specifically at the distribution of revenge porn.
Source: NOS
