Today, we’re announcing that Instagram and Facebook are founding members of Take It Down — a new platform by NCMEC to help prevent young people’s intimate images from being posted online in the future.
Having a personal intimate image shared with others can be scary and overwhelming, especially for young people. It can feel even worse when someone threatens to share those images unless they receive additional images, sexual contact or money — a crime known as sextortion.
Take It Down lets young people take back control of their intimate images. People can go to TakeItDown.NCMEC.org and follow the instructions to submit a case that will proactively search for their intimate images on participating apps. Take It Down assigns a unique hash value (a numerical code) to the image or video, privately and directly on the person's own device. Once they submit the hash to NCMEC, companies like ours can use it to find any copies of the image, take them down and prevent the content from being posted on our apps in the future.
Built in a way that respects young people's privacy and data security, Take It Down allows people to submit only a hash — rather than the intimate image or video itself — to NCMEC. Hashing converts an image or video into a coded string that serves as a secure digital fingerprint; the original content cannot be viewed or reconstructed from the hash.
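To make the "digital fingerprint" idea concrete, here is a minimal sketch using a standard cryptographic hash (SHA-256). This is an illustration only: `hash_file` is a hypothetical helper, not the actual Take It Down implementation, and real image-matching systems typically use perceptual hashing so that re-encoded or slightly altered copies of the same image still match.

```python
import hashlib

def hash_file(path: str) -> str:
    """Return a SHA-256 hex digest of a file's bytes.

    The digest is a fixed-length fingerprint: the same file always
    produces the same digest, but the original image or video cannot
    be recovered from it, so only the digest needs to leave the device.
    """
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large video files don't need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()
```

A platform that has received the digest can then compare it against the digests of newly uploaded files: an exact match flags a copy of the same file, without the platform ever having received the original image.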
With the launch of Take It Down, people of all ages can stop the spread of their intimate images online.
Take It Down was designed with Meta's financial support. We are working with NCMEC to promote Take It Down across our platforms, and we are integrating it into Facebook and Instagram so people can easily access it when reporting potentially violating content. Take It Down builds on the success of StopNCII, a platform we launched in 2021 with South West Grid for Learning (SWGfL) and more than 70 NGOs worldwide, which helps adults stop the spread of their intimate images online — a practice commonly referred to as "revenge porn."
Meta doesn’t allow content or behavior that exploits young people, including the posting of intimate images or sextortion activities. We work to prevent this content as well as inappropriate interactions between young people and suspicious accounts attempting to take advantage of them. For example, we default teens into the most private settings on Facebook and Instagram, we work to restrict suspicious adults from connecting with teens on those apps, and we educate teens about the dangers of engaging with adults they do not know online. We’ve also made it easier for people to report potentially harmful content, particularly if it involves a child.
On Instagram, we recently introduced new features to make it even more difficult for suspicious adults to interact with teens. Now, these adults will no longer be able to see teen accounts when scrolling through the list of people who have liked a post or when looking at an account's Followers or Following list. If a suspicious adult follows a teen account, we will send that teen a notification prompting them to review and remove the new follower. We are also prompting teens to review and restrict their privacy settings. When someone comments on a teen's post, tags or mentions them in another post, or includes their content in Reels Remixes or Guides, the teen will receive a notification prompting them to review their privacy settings, with the option to stop people from interacting with them.
We’ve developed more than 30 tools to support the safety of teens and families across our apps, including supervision tools for parents and age-verification technology that helps teens have age-appropriate experiences online. We also provide resources that inform teens of the potential harms of taking intimate images and the other ways they can find help if they want to stop the spread of that content. Our Safety Center and Family Center offer additional resources to help parents talk to their teens about staying safe online.