Meta will begin removing some sensitive and “age-inappropriate” content from teenagers’ feeds, the company said in an announcement published Tuesday.
Meta already restricted topics such as self-harm, eating disorders and mental illnesses from being recommended to teens on Reels and the Explore pages of its apps. The new update also restricts these topics from appearing in young users’ feeds and Stories, even if the content was posted by people they follow.
The company said that in the coming weeks it will hide more search results and terms relating to suicide, self-harm and eating disorders for all users, and will direct people who search those topics to resources for help.
“We want teens to have safe, age-appropriate experiences on our apps,” the company wrote in a blog post announcing the changes.
Teens will be placed in the most restrictive content control settings on Facebook and Instagram by default, a measure intended to reduce the amount of sensitive content they are exposed to on Meta’s apps.
Meta will also send notifications and prompts to teens to encourage them to update their settings to make their accounts more private.
The shift comes as the social networking giant has remained the target of intense scrutiny over how its products affect young people.
James P. Steyer, founder and CEO of Common Sense, a children’s media nonprofit organization, said the new protections are a “short-term fix” for child safety on Meta’s platforms.
“A closer look at Meta’s policies shows that they are still putting the onus on teens to navigate their own privacy, and are inventing their own standards for how content will be limited,” Steyer said in an email statement.
Steyer added that legislation is needed to “reduce the harm on millions of children who are served inappropriate content on a daily basis.”
Meta was sued in 2022 by one family for recommending their teen daughter content that glorified anorexia and self-harm on Instagram. The company was also criticized following the release of the “Facebook Papers,” internal Meta research documents leaked in 2021, which revealed that the company was aware of Instagram’s harmful effects on teen girls.
More recently, a former engineering director and consultant for the social media company testified in a congressional hearing in November that Meta needed to do more to keep children safe. In October, a bipartisan group of 33 state attorneys general sued Meta in connection with what they said were addictive features aimed at young people.
Meta CEO Mark Zuckerberg responded to backlash over the company’s policies for children in a Facebook post in 2021.
“I’ve spent a lot of time reflecting on the kinds of experiences I want my kids and others to have online, and it’s very important to me that everything we build is safe and good for kids,” he said in the post.
In its announcement Tuesday, Meta said it has “developed more than 30 tools and resources to support teens and their parents” and regularly consults with experts to make its platforms safer.