June 30, 2024

Instagram, Facebook to hide content harmful to teens


The move comes as global regulators urge Meta to safeguard children from harmful posts on its platforms.

On Tuesday, Meta announced that it will hide more sensitive content from teenagers on Instagram and Facebook, as regulators around the world press the social media company to shield children from harmful material on its platforms.

The change is designed to make it harder for teens to find content related to suicide, self-harm, and eating disorders through features such as search and explore on Instagram, Meta said. According to a company blog post, all teen accounts will default to the most restrictive content control settings on both Instagram and Facebook, and Instagram will restrict additional search terms.

“We aim to ensure teens have secure, age-appropriate experiences on our platforms,” the blog post states. “Today, we’re introducing extra safeguards specifically geared towards the content teens encounter on Instagram and Facebook.”

Even if a teen follows an account that posts about sensitive topics, those posts will be excluded from the teen’s feed, Meta’s blog post notes. Meta expects the measures, set to roll out in the coming weeks, to make for a more “age-appropriate” experience.

“Consider someone sharing their ongoing struggle with thoughts of self-harm. While this narrative is significant and aids in destigmatizing such issues, it’s a nuanced subject that may not be suitable for all young people. Hence, we will now begin to remove this type of content from teens’ experiences on Instagram and Facebook,” the company’s blog post states.

Meta faces scrutiny in both the United States and Europe amid allegations that its apps are addictive and have contributed to a youth mental health crisis. In October, attorneys general from 33 US states, including California and New York, sued the company, accusing it of repeatedly misleading the public about the risks its platforms pose. In Europe, the European Commission has requested information on how Meta protects children from illegal and harmful content.

This regulatory scrutiny intensified after a former Meta employee, Arturo Bejar, testified before the US Senate, claiming that the company was aware of harassment and other harms faced by teenagers on its platforms but failed to take appropriate action.

Bejar advocated design changes on Facebook and Instagram that would steer users toward more positive behavior and give young people better tools to manage unpleasant experiences. He testified that he had flagged his own daughter receiving unwanted advances on Instagram to Meta’s senior leadership, but said his warnings went unheeded.

Children have long been a coveted demographic for businesses, which seek to reach them as consumers at an impressionable age and build lasting brand loyalty.

For Meta, which has spent recent years in intense competition with TikTok for young users, teenagers could also attract more advertisers, who expect them to keep buying products as they mature.
