Teenagers pose for a photo while holding smartphones in front of a Facebook logo in this illustration taken September 11, 2025. (Reuters/Dado Ruvic/Illustration/File Photo)
- Instagram chief Adam Mosseri defended the platform’s decisions around features critics say harmed young users, testifying in a trial over alleged social media addiction.
- A California woman who began using Instagram at age 9 claims the platform contributed to her depression and body dysmorphia.
- The case could test the scope of U.S. legal protections that shield social media companies from liability for user-generated content.
LOS ANGELES, Feb 11 – The top executive at Meta Platforms’ Instagram defended the social media platform’s choices around features that some company insiders called harmful to young users, at a trial on claims the app helped fuel a youth mental-health crisis.
Adam Mosseri, the head of Instagram, is testifying in Los Angeles as part of a trial on what plaintiffs call “social media addiction” in children and young adults. Meta CEO Mark Zuckerberg is also expected to testify in the coming weeks.
A California woman who began using Instagram at age 9 is suing Meta and Google’s YouTube, saying the companies sought to profit by hooking young children on their services despite knowing social media could harm their mental health. She alleges the platforms contributed to her depression and body dysmorphia.
Several parents who say social media platforms led to their children’s deaths sat in the front row of the courtroom gallery.
Children’s access to social media has become an issue globally, with Australia in December becoming the first nation to ban platform use by children younger than 16. Spain, Greece, Britain and France are among the many countries considering similar action.
Internal Debates Over Teen Safety
In 2019, Mosseri and others at Instagram were discussing whether to lift a ban on photo filters that mimicked the effects of plastic surgery, according to emails shown in court.
Instagram teams working on policy, communications and well-being preferred to keep the ban in place while gathering more data on potential harms to teen girls.
“We would – rightly – be accused of putting growth over responsibility,” said Nick Clegg, who was then Meta’s vice president of global affairs, according to emails shown in court.
Mosseri and Zuckerberg preferred to reverse the ban but remove the filters from the app’s recommendation section, an option described in emails as presenting “a notable well-being risk” but having a lower impact on user growth.
“I was trying to balance all the different considerations,” Mosseri said in court, adding that he agreed with the ultimate decision to prohibit filters that promote plastic surgery.
“Our policies, like our products, evolve all the time. We try to focus on the most important issues,” he said.
The company’s early motto, “Move fast and break things,” coined by Zuckerberg and long emblematic of Silicon Valley, is no longer appropriate, Mosseri said.
The case is a key test of Section 230, the U.S. law that shields online platforms from liability for user-generated content and has long insulated social media companies from lawsuits. The outcome will influence how the companies respond to hundreds of similar lawsuits across the United States.
Meta’s lawyers cited the law in objecting to some evidence presented in court. The company could raise the issue on appeal if it loses at trial.
A Meta spokesperson said on Tuesday that the main question in the case is whether Instagram was a substantial factor in the woman’s mental health struggles, and that “the evidence will show she faced many significant, difficult challenges well before she ever used social media.”
—
(Reporting by Jody Godoy in Los Angeles and Courtney Rozen in Washington; Editing by Jamie Freed and Matthew Lewis)