Facebook logo. Image: Bastian Riccardi via Unsplash

Under pressure, Meta says it will change how it delivers some content to children

Meta announced Tuesday that it will place more restrictions on content served to children and will make it easier for minors to strengthen their privacy settings.

In an announcement made public Tuesday morning, the tech giant said its Instagram and Facebook platforms will begin to hide more troubling content from teens by applying the “most restrictive content control settings” to their accounts.

Meta also said it will restrict more search terms than it currently does on Facebook and Instagram, and will no longer recommend content about self-harm and other potentially unsafe topics in Instagram feeds and stories.

“Now, when people search for terms related to suicide, self-harm and eating disorders, we’ll start hiding these related results and will direct them to expert resources for help,” the announcement said. “We already hide results for suicide and self harm search terms that inherently break our rules and we’re extending this protection to include more terms.”

The company said it also will do more to protect children’s privacy by nudging teens to update their Instagram privacy settings and making it possible to do so with the push of a button.

“We want teens to have safe, age-appropriate experiences on our apps,” the Meta announcement said, pointing to more than 30 tools and resources it has already developed to protect children online, including by addressing the problem of teens receiving harmful content about eating disorders, suicide and other charged topics.

Children’s online safety advocates called the changes too little too late.

“I just don’t want anybody to be fooled by this — this is Meta trying to avoid regulation, trying to have something to point to when their CEO testifies in three weeks,” said Josh Golin, the executive director of the children’s advocacy group Fairplay, referring to Meta CEO Mark Zuckerberg’s subpoena to testify before Congress. “Honestly, that it took until 2024 to do this is appalling.”

Golin added that tech platforms must be held liable for their practices if the online environment is to truly improve for kids. He pointed out that Meta made similar promises in 2019 but said the company failed to change its behavior.

A Meta spokesperson pushed back on that assessment.

"We've worked on these issues for over a decade and have over 30 tools and resources to support teens and parents," the spokesperson said. "Today's updates make young people safer online, it’s unfortunate that these groups oppose them.”

The announcement comes as Meta faces pressure on multiple fronts to better control the content it feeds children and how it uses children’s data.

The Federal Trade Commission (FTC) announced last May that it would seek a blanket prohibition on Meta’s monetization of youth data. The company has since sued the agency in federal court, saying the FTC overreached and does not have the authority to impose the new rule.

Children’s advocates backed the FTC action, saying Meta’s practices violate the Children’s Online Privacy Protection Act (COPPA). That law, enacted in 1998, requires parents to grant permission before websites collect and use personal information from kids under 13.

The FTC proposed more restrictive COPPA rules last month, which would shift the burden of protecting children online from parents to platforms.

Separately, a Senate bill meant to rein in Meta’s content delivery to children, the Kids Online Safety Act, is currently languishing in Congress.

Arturo Béjar, a former Facebook engineering director and Meta consultant, came forward in November, telling Congress that Meta did nothing after he reported to executives that, in a given week, 13% of Instagram users under the age of 16 were subjected to unwanted sexual advances on the platform.

Meta said its new rules are meant to mitigate these harms. The announcement said the tech giant will specifically begin to remove posts about self-harm from teens’ feeds and stories, even if the posts are shared by someone the teen follows. It said the changes will apply to teens under age 18 and will take effect in the “coming months.”

The platform’s new privacy measures will prompt teens to routinely vet their Instagram privacy and safety settings through new notifications that let them restrict who sees their content with the push of a button.

“If teens choose to ‘Turn on recommended settings’, we will automatically change their settings to restrict who can repost their content, tag or mention them, or include their content in Reels Remixes,” the blog post said. “We’ll also ensure only their followers can message them and help hide offensive comments.”

Suzanne Smalley is a reporter covering privacy, disinformation and cybersecurity policy for The Record. She was previously a cybersecurity reporter at CyberScoop and Reuters. Earlier in her career Suzanne covered the Boston Police Department for the Boston Globe and two presidential campaign cycles for Newsweek. She lives in Washington with her husband and three children.