
Meta unveils features to combat “sextortion” among teenagers

Meta is stepping up its efforts to combat sextortion on Instagram with the launch of new features designed to prevent young users from being blackmailed into sharing nude photos.

Starting Thursday, Instagram will automatically block follow requests sent to teen users from accounts displaying certain “fraudulent behavior,” according to Meta. It will also hide teen users’ follower and following lists from these accounts, in an attempt to prevent fraudsters from using those lists to blackmail their targets.

The social media platform is also rolling out its “nudity protection” feature globally in Instagram direct messages, which blurs any images the platform detects as containing nudity. Users who try to send or forward images in which nudity is detected will also be prompted to think twice. The feature will be turned on automatically for users under 18, according to Meta’s latest announcement. The company said it is also removing the ability to take screenshots or screen recordings of disappearing photos and videos sent via Instagram direct messages or Facebook Messenger.

Antigone Davis, Meta’s head of global safety, said sextortion – defined by the FBI as “a crime in which adults coerce children and teens into sending explicit images online” – has become more common on social media platforms.

“A lot of people who commit this crime are motivated by money, which means we are seeing more people engaging in it, and we have seen an increase in this type of crime,” she told NBC News.

The announcement marks Meta’s latest effort to combat child safety concerns on its platforms, amid continued scrutiny from lawmakers and parents, who fear social media platforms don’t have enough safeguards in place to protect their children from harm.

During a Senate hearing on online child safety in January, Meta CEO Mark Zuckerberg apologized to parents who said Instagram had contributed to their children’s suicide or exploitation.

The social media giant has been criticized for declining to endorse the Kids Online Safety Act (KOSA) and the Children and Teens’ Online Privacy Protection Act (COPPA 2.0), which passed the Senate in July.

The legislation, which has not yet passed the House, is part of a series of bills that lawmakers have considered in recent years that focus on the privacy and safety of American children online.

A Meta spokesperson previously told NBC News that the tech giant supports “the development of age-appropriate standards for teens online” but believes “federal law should require that app stores get parental approval every time their teens under 16 download apps.”

The new features are also the latest safeguards Meta has put in place for young users of the app. Last month, the company announced that it would begin automatically placing all users under the age of 18 into “teen accounts” with stricter privacy settings.

Haley Hinkle, policy counsel at Fairplay, a nonprofit that works with families who have lost children to social media-related harm, said features like those unveiled Thursday are Band-Aids for a systemic problem that requires more action from lawmakers.

“I don’t think the safety features are enough when ultimately what families and advocates are calling for is federal regulation that will require platforms to implement, at every step, safe design for young people,” Hinkle said.

Although the platforms have repeatedly taken “incremental steps in order to improve public relations and try to appease lawmakers,” Hinkle said, she questions the effectiveness of leaving it to private companies to propose solutions on their own timelines.

“Blocking screenshots on a platform is a good thing, but a determined, organized criminal could simply use a workaround,” Hinkle said. “They could use another device and take a photo of the photo that they then use to extort a child.”

Meanwhile, Instagram will display an educational PSA about sextortion to users in the United States, Canada, the United Kingdom and Australia.

The video, according to Davis, will explain to viewers what sextortion is, how to spot it and how to seek help if they are targeted. It’s common for shame to prevent teens from seeking help if they’re victims of sextortion, she said. The campaign therefore aims to reassure victims that they are not at fault.

Hinkle said she would like to see platforms focus on crime prevention without putting the onus on children or their parents. But it’s difficult to know whether Meta is doing enough, she said, while the platform’s inner workings remain a “black box” to the public.

“If they really want to promote meaningful change and online safety for children and teens, they can stop opposing federal solutions like the Kids Online Safety Act,” Hinkle said. “We need transparency and auditing requirements like those in the bill to truly evaluate for ourselves as a society, as advocates, and as families.”