Our new educational campaign to help protect teens from sextortion scams

Sextortion is a horrific crime, in which financially motivated scammers target teens and young adults around the world, threatening to reveal their intimate images if they don’t get what they want. Today, we’re announcing new measures in our fight against these criminals, including an education campaign to raise awareness among teens and parents about how to spot sextortion scams and what steps they can take to regain control if they’re targeted by one of these scams. We’re also announcing important new safety features to further help prevent sextortion on our apps.

New campaign to help teens and parents spot sextortion scams

As well as developing safety features to help protect our community, we want to help young people feel confident they can spot the signs of a sextortion scam. That’s why we worked with leading child safety experts, the National Center for Missing & Exploited Children (NCMEC) and Thorn – including Thorn’s Youth Advisory Council – to develop an educational video that helps teens recognize the signs that someone might be a sextortion scammer. These red flags include coming on too strong, asking to exchange photos, or suggesting moving the conversation to another app.

“The dramatic rise in sextortion scams is taking a heavy toll on children and adolescents, with reports of online enticement increasing by more than 300% between 2021 and 2023. Campaigns like this bring essential education to help families recognize these threats as early as possible. By equipping young people with knowledge and directing them to resources like NCMEC’s CyberTipline and Take It Down, we can better protect them from online exploitation.”

John Shehan, Senior Vice President at the National Center for Missing & Exploited Children

Embarrassment or fear can stop teens from seeking help, which is why our video reassures teens that sextortion is never their fault and tells them what they can do to take back control from scammers. The video directs teens to Instagram.com/PreventSextortion, which includes tips – co-developed by Meta and Thorn – for teens affected by sextortion scams, a link to NCMEC’s Take It Down tool, which helps prevent teens’ intimate images from being shared online, and live chat support from Crisis Text Line in the US.

We will show this video to millions of teens and young adults on Instagram in the US, UK, Canada and Australia – countries commonly targeted by sextortion scammers.

“Our research at Thorn has shown that sextortion is on the rise and poses a growing risk to young people. This is a devastating threat – and joint initiatives like this, aimed at informing children about the risks and empowering them to take action, are crucial.”

Kelbi Schnabel, Senior Director at Thorn

We’ve teamed up with creators that teens love to raise awareness about these scams while reassuring them that sextortion is never their fault and that help is available. Starting today, these creators will add their voice to the campaign by sharing educational content with their followers on Instagram.

We’re also partnering with a group of parent creators to help parents understand what sextortion is, how to recognize the signs of a sextortion scam, and what steps to take if their teen becomes a victim of this crime. The creators will direct parents to helpful resources we developed with Thorn, including conversation guides.

New safety features to disrupt sextortion

In addition to this campaign, Meta is announcing a series of new safety features designed to further protect people from sextortion and make it even harder for sextortion criminals to succeed. These features build on our recent announcement of Teen Accounts, which provide tens of millions of teens with built-in protections that limit who can contact them, what content they see, and how much time they spend online. Teens under 16 can’t change their Teen Account settings without a parent’s permission.

With Instagram Teen Accounts, teens under 18 are placed into stricter messaging settings by default, meaning they can’t receive messages from people they don’t follow or aren’t connected to – but accounts can still request to follow them, and teens can choose whether to accept. Now we’re also making it harder for accounts showing signs of potentially scammy behavior to send follow requests to teens. Depending on the strength of these signals – which include how new an account is – we’ll either block the follow request entirely or send it to the teen’s spam folder.

Sextortion scammers often lie about their location to trick teens into trusting them. To help prevent this, we’re testing new safety notices in Instagram DMs and Messenger to let teens know when they’re chatting with someone who may be located in a different country.

Sextortion scammers often use their targets’ follower and following lists to try to blackmail them. Now, accounts we detect as showing signs of scammy behavior won’t be able to see people’s follower or following lists, removing their ability to exploit this feature. These potential sextorters also won’t be able to see lists of accounts that have liked someone’s posts, photos they’ve been tagged in, or other accounts tagged in their photos.

Soon, we’ll no longer allow people to use their device to directly screenshot or screen record ephemeral images or videos sent in private messages. This means that if someone sends a photo or video in Instagram DMs or Messenger using our “view once” or “allow replay” feature, they don’t have to worry about it being screenshotted or recorded in the app without their consent. We also won’t allow people to open “view once” or “allow replay” images or videos on Instagram on the web, to avoid circumventing this screenshot prevention.

Video showing a blocked screenshot attempt

After first announcing the test in April, we’re now rolling out our nudity protection feature globally in Instagram DMs. This feature, which is turned on by default for teens under 18, blurs images we detect as containing nudity when they’re sent or received in Instagram DMs and warns people of the risks of sending sensitive images. We also worked with Larry Magid of ConnectSafely to develop a video for parents, available on the Stop Sextortion page of the Meta Safety Center, that explains how the feature works.

Providing more in-app support

We’re also partnering with Crisis Text Line in the US to provide people with free, 24/7 mental health support when they need it. Now, when people report issues related to sextortion or child safety – like nudity, threats to share private images, or sexual exploitation or solicitation – they’ll see an option to chat live with a volunteer crisis counselor from Crisis Text Line.

Image of a phone screen showing support options

Taking action against sextortion criminals

Last week, we removed more than 1,620 assets – 800 Facebook groups and 820 accounts – affiliated with the Yahoo Boys that were attempting to organize, recruit and train new sextortion scammers. This follows our announcement in July that we had removed around 7,200 Facebook assets engaging in similar behavior. The Yahoo Boys are banned under Meta’s Dangerous Organizations and Individuals policy – one of our strictest policies. While we already aggressively take down Yahoo Boys’ violating accounts, we’re putting new processes in place that will allow us to identify and remove these accounts more quickly.

We’re constantly working to improve the techniques we use to identify scammers, remove their accounts, and prevent them from coming back. When our experts spot patterns in sextortion attempts, such as certain commonalities between scammers’ profiles, we train our technology to recognize these patterns. This allows us to quickly find and take action against sextortion accounts, and to make meaningful progress in detecting both new and returning scammers. We also share aspects of these patterns with the Tech Coalition’s Lantern program, so other companies can investigate their use on their own platforms.

These updates represent a big step forward in our fight against sextortion scammers, and we will continue to evolve our defenses to help protect our community.