Today, we’re sharing an update on how we protect young people from harm and seek to create safe, age-appropriate experiences for teens on Facebook and Instagram.
Last year, we shared some of the measures we take to protect teens from interacting with potentially suspicious adults. For example, we restrict adults from messaging teens they aren’t connected to or from seeing teens in their People You May Know recommendations.
In addition to our existing measures, we’re now testing ways to protect teens from messaging suspicious adults they aren’t connected to, and we won’t show suspicious adults in teens’ People You May Know recommendations. A “suspicious” account is one belonging to an adult who may have recently been blocked or reported by a young person, for example. As an extra layer of protection, we’re also testing removing the message button altogether on teens’ Instagram accounts when they’re viewed by suspicious adults.
We’ve developed a number of tools so teens can let us know if something makes them feel uncomfortable while using our apps, and we’re introducing new notifications that encourage them to use these tools.
For example, we’re prompting teens to report accounts to us after they block someone, and sending them safety notices with information on how to navigate inappropriate messages from adults. In just one month in 2021, more than 100 million people saw safety notices on Messenger. We’ve also made it easier for people to find our reporting tools and, as a result, we saw a more than 70% increase in reports sent to us by minors on Messenger and Instagram DMs in Q1 2022 compared to the previous quarter.
Starting today, everyone under the age of 16 (or under 18 in certain countries) will be defaulted into more private settings when they join Facebook, and we’ll encourage teens already on the app to choose these more private settings.
We’re also sharing an update on the work we’re doing to stop the spread of teens’ intimate images online, particularly when these images are used to exploit them — commonly known as “sextortion.” The non-consensual sharing of intimate images can be extremely traumatic and we want to do all we can to discourage teens from sharing these images on our apps in the first place.
We’re working with the National Center for Missing and Exploited Children (NCMEC) to build a global platform for teens who are worried intimate images they created might be shared on public online platforms without their consent. This platform will be similar to work we have done to prevent the non-consensual sharing of intimate images for adults. It will allow us to help prevent a teen’s intimate images from being posted online and can be used by other companies across the tech industry. We’ve been working closely with NCMEC, experts, academics, parents and victim advocates globally to help develop the platform and ensure it responds to the needs of teens so they can regain control of their content in these horrific situations. We’ll have more to share on this new resource in the coming weeks.
We’re also working with Thorn and their NoFiltr brand to create educational materials that reduce the shame and stigma surrounding intimate images, and empower teens to seek help and take back control if they’ve shared them or are experiencing sextortion.
We found that more than 75% of the people we reported to NCMEC for sharing child exploitative content shared it out of outrage, poor humor, or disgust, with no apparent intention of harm. Sharing this content violates our policies regardless of intent. We’re planning to launch a new PSA campaign that encourages people to stop and think before resharing those images online, and to report them to us instead.
Anyone seeking support and information related to sextortion can visit our education and awareness resources, including the Stop Sextortion hub on the Facebook Safety Center, developed with Thorn. We also have our guide for parents on how to talk to their teens about intimate images on the Education hub of our Family Center.