Regulatory scrutiny: Meta restricts more teen content


Meta Platforms has announced it will tighten content restrictions for teenagers on both Instagram and Facebook.

The decision follows growing global regulatory pressure urging the social media company to strengthen measures that protect minors from potentially harmful content across its platforms.

Content control settings

Meta will place all teenagers under the strictest content control settings on both Instagram and Facebook, and Instagram will also restrict additional search terms, the company said in its latest blog post.

The move is designed to make it harder for teenagers to come across sensitive content related to suicide, self-harm and eating disorders through features such as Search and Explore on Instagram, reinforcing a safer digital environment for young users.

The company said the measures, expected to roll out over the coming weeks, would help deliver a more “age-appropriate” experience.

Global regulatory pressure

Meta faces mounting scrutiny in both the United States and Europe over allegations that its apps are addictive and have contributed to a youth mental health crisis.

In October, attorneys general from 33 U.S. states, including California and New York, filed a lawsuit against the company, alleging that Meta consistently deceived the public by downplaying the risks associated with its platforms.

In Europe, the European Commission has sought information on how Meta protects children from illegal and harmful content.

Regulatory scrutiny intensified after a former Meta employee testified before the U.S. Senate, asserting that the company was aware of harassment and other harms affecting teenagers on its platforms but failed to take decisive action against them.


The employee, Arturo Bejar, called for the company to make design changes on Facebook and Instagram to nudge users toward more positive behaviors and provide better tools for young people to manage unpleasant experiences.

On Tuesday, Bejar expressed dissatisfaction, stating that Meta’s modifications failed to address his concerns. He criticized the company for employing a “‘grade your own homework’ approach to defining harm” and emphasized that there still isn’t a streamlined mechanism for teenagers to report unwanted advances.

“This should be a conversation about goals and numbers, about harm as experienced by teens,” he said.

Children have historically been an appealing demographic for advertisers on Facebook and Instagram. Brands want to reach them as consumers during their formative years, when they are more impressionable, in hopes of building lasting brand loyalty.

Market competition

Meta is locked in an intense rivalry with TikTok for the attention of young users. Facebook, once the dominant platform among teens, has seen its usage steadily decline in recent years.

According to a Pew Research Center survey conducted in 2023, 63% of U.S. teens reported using TikTok and 59% reported using Instagram, while only 33% said they used Facebook.
