Facebook, Instagram will hide more sensitive content from teens, Meta says
January 9, 2024

Meta said on Tuesday it would hide more sensitive content from teenagers on Instagram and Facebook amid global pressure from regulators for the social media giant to protect children from harmful content on its apps.

The move will make it harder for teenagers to come across content related to suicide, self-harm and eating disorders when they use features such as Search and Explore on Instagram, according to Meta. All teen accounts will be placed under the most restrictive content control settings on Instagram and Facebook by default, and additional search terms will be limited on Instagram, Meta said in a blog post.

“We want teens to have safe, age-appropriate experiences on our apps,” the blog post reads. “Today, we’re announcing additional protections that are focused on the types of content teens see on Instagram and Facebook.”

Even if a teenager follows an account that posts about sensitive topics, those posts will be removed from the teenager’s feed, per Meta’s blog. The company said the measures, expected to roll out over the coming weeks, would help deliver a more “age-appropriate” experience.

“Take the example of someone posting about their ongoing struggle with thoughts of self-harm. This is an important story, and can help destigmatize these issues, but it’s a complex topic and isn’t necessarily suitable for all young people. Now, we’ll start to remove this type of content from teens’ experiences on Instagram and Facebook,” the company’s blog post reads.

Meta is under pressure in both the United States and Europe over allegations that its apps are addictive and have helped fuel a youth mental health crisis. Attorneys general from 33 US states, including California and New York, sued the company in October, saying it repeatedly misled the public about the dangers of its platforms. In Europe, the European Commission has sought information on how Meta protects children from illegal and harmful content.

The regulatory pressure followed testimony in the US Senate by former Meta employee Arturo Bejar, who alleged the company was aware of harassment and other harms facing teenagers on its platforms but failed to address them.

Bejar called for the company to make design changes on Facebook and Instagram to nudge users toward more positive behaviors and provide better tools for young people to manage unpleasant experiences. Bejar said his own daughter had received unwanted advances on Instagram, a problem that he brought to the attention of the company’s senior leadership. Meta’s top brass ignored his pleas, he testified.

Children have long been an appealing demographic for businesses, which hope to attract them as consumers at ages when they may be more impressionable and solidify brand loyalty.

For Meta, which has been locked in fierce competition with TikTok for young users over the past few years, teenagers may help secure more advertisers, who hope children will keep buying their products as they grow up.
