Instagram head pressed on lengthy delay to launch teen safety features, like a nudity filter, court filing reveals
Plaintiffs' lawyers in a lawsuit over whether social media apps like Instagram are addictive and harmful wanted to know why it took Meta so long to roll out basic safety tools, such as a nudity filter for private messages sent to teens.
In a newly unsealed deposition in a federal lawsuit, Instagram head Adam Mosseri was asked about an August 2018 email chain with Meta VP and Chief Information Security Officer Guy Rosen, where he mentioned that “horrible” things could happen via Instagram private messages, also known as DMs.
Those horrible things could include dick pics, the plaintiff’s lawyer said, and Mosseri agreed.
“I think that it’s pretty clear that you can message problematic content in any messaging app, whether it’s Instagram or otherwise,” Mosseri said. He said the company tried to balance people’s interest in privacy with its own interests in safety.
The testimony also revealed new stats about harmful activity on Instagram: 19.2% of survey respondents ages 13 to 15 said they had seen nudity or sexual images on Instagram that they didn’t want to see.
Mosseri was also questioned about other topics, including a 2017 email from a Facebook intern who said he wanted to find “addicted” Facebook users and figure out if there were ways to help them. The 2018 email chain was meant to serve as one example that Meta was aware of the risks to minors, yet it took the company until 2024 to release a product that addressed the problem of sexual images sent to teens.
Reached for comment, Meta spokesperson Liza Crenshaw pointed to the other ways the company has worked to keep teens safe over the years, noting that, “for over a decade, we’ve listened to parents, worked with experts and law enforcement, and conducted in-depth research to understand the issues that matter most. We use these insights to make meaningful changes—like introducing Teen Accounts with built-in protections and providing parents with tools to manage their teens’ experiences. We’re proud of the progress we’ve made, and we’re always working to do better,” she said.
This particular case, taking place in the U.S. District Court for the Northern District of California, involves plaintiffs alleging that social media platforms are defective because they’re designed to maximize screen time, which encourages addictive behavior in teens. The defendants include Meta, Snap, TikTok, and YouTube (Google).
Similar lawsuits are also underway in the Los Angeles County Superior Court and in New Mexico.
The timing of these trials comes amid a growing number of laws restricting teens’ social media use in several U.S. states.

Updated after publication with Meta’s comment.