By Colin Lecher

The Meta researcher’s tone was alarmed.
“oh my gosh yall IG is a drug,” the user experience specialist allegedly wrote to a colleague, referring to the social media platform Instagram. “We’re basically pushers… We are causing Reward Deficit Disorder bc people are binging on IG so much they can’t feel reward anymore.”
The researcher concluded that users’ addiction was “biological and psychological” and that company management was keen to exploit the dynamic. “The top down directives drive it all towards making sure people keep coming back for more,” the researcher added.
The conversation was included recently as part of a long-simmering lawsuit in a California-based federal court. Consolidating complaints from hundreds of school districts and state attorneys general, including California’s, the suit alleges that social media companies knew about risks to children and teens but pushed ahead with marketing their products to them, putting profits above kids’ mental health. The suit seeks monetary damages and changes to companies’ business practices.
The suit and a similar one filed in Los Angeles Superior Court target Facebook, Instagram, YouTube, TikTok, and Snap. The cases are exposing embarrassing internal conversations and findings at the companies, particularly Facebook and Instagram owner Meta, further tarnishing their brands in the public eye. They are also testing a particular vector of attack against the platforms, one that targets not so much alarming content as the design and marketing decisions that accelerated harms. The upshot, some believe, could be new forms of regulation, including at the federal level.
One document discussed during a hearing this week included a 2016 email from Mark Zuckerberg about Facebook’s live videos feature. In the email, the Meta chief wrote, “we’ll need to be very good about not notifying parents / teachers” about teens’ videos.
“If we tell teens’ parents about their live videos, that will probably ruin the product from the start,” he wrote, according to the email.
In slides summarizing internal tech company documents, released this week as part of the litigation, an internal YouTube discussion suggested that accounts from minors who violated YouTube policies remained active on the platform for years, producing content for an average of “938 days before detection – giving them plenty of time to create content and continue putting themselves and the platform at risk.”
A spokesperson for Meta didn’t immediately respond to requests for comment.
A YouTube spokesperson, José Castañeda, described the slide released this week as “a cherry-picked view of a much larger safety framework” and said the company uses more than one tool to detect underage accounts, while taking action every time it finds an underage account.
In court, the companies have argued that they are making editorial decisions permitted by the First Amendment. The federal trial is set for June, while the state court litigation moved into jury selection this week, increasing the pressure on social media companies.
While the state and federal cases differ slightly, the core argument is the same: that social media companies deliberately designed their products to hook young people, leading to disastrous but foreseeable consequences.
“It’s led to mental health issues, serious anxiety, depression, for many. For some, eating disorders, suicidality,” said Previn Warren, co-lead counsel on the case in federal court. “For the schools, it’s been lost control over the educational environment, inability of teachers to really control their classrooms and teach.”
A federal suit
Meta and other companies have faced backlash for years over their treatment of kids on their platforms, including Facebook and Instagram. Parents, lawmakers, and privacy advocates have argued that social media contributed to a mental health crisis among young people and that tech companies failed to act when that fact became clear.
Those allegations gained new scrutiny last month when a brief citing still-sealed documents in the federal suit became public.
While the suit also names TikTok, Snap, and Google as defendants, the filing includes allegations against Meta that are especially detailed.
In the more than 200-page filing, for example, the plaintiffs argue that Meta deliberately misled the public about how damaging its platforms were.
Warren pointed to claims in the brief that Meta researchers found that 55% of Facebook users had “mild” problematic use of the platform, while 3.1% had “severe” problems. Zuckerberg, according to the brief, pointed out that 3% of billions would still be millions of people.
But the brief claims the company published research noting only that “we estimate (as an upper bound) that 3.1% of Facebook users in the US experience problematic use.”
“That’s a lie,” Warren said.
In response to recent interest in the suits, Meta published a blog post this month arguing that the litigation “oversimplifies” the issue of youth mental health and pointing to past instances in which it has worked with parents and families on features designed to protect kids.
The federal case faced a key hearing this week, as the defendants argued that a judge should dismiss the case outright. A decision on that motion is expected in the next few weeks, Warren said.
Social media companies, like other web-based services, receive protection from some legal claims under federal law. Section 230 of the Communications Decency Act gives website operators legal immunity for potentially illegal content that users post on their platforms.
Mary Anne Franks, a legal scholar at George Washington University who specializes in First Amendment issues and has long studied Section 230, said that rather than targeting online content in and of itself, the recent social media cases focus on the design of the platforms and their marketing.
“The litigation strategy is saying it’s the way that you’re providing that space and you’re pushing this toward individuals that are vulnerable that is really an issue here,” she said. “It’s your own conduct, not somebody else’s.”
The companies are making key decisions behind the scenes, she said, and could be held responsible for them.
“You were manipulating things,” she said the plaintiffs are arguing. “You were deliberately making choices about what comes to the top or what is directly accessible or may be tempting to vulnerable users.”
A California state trial begins
Meanwhile, the related state lawsuit went to jury selection this week.
The case, which makes similar claims about personal injury caused by the social media companies, has also drawn nationwide attention, and major industry figures like Zuckerberg are expected to take the stand.
The personal injury case focuses on an unnamed plaintiff who claims her mental health was damaged by an addiction to social media.
In a last-minute development this week, TikTok and Snap reportedly reached undisclosed settlements in the case. Meta and Google are continuing as defendants.
Franks said these trials could be a tipping point in regulating how tech companies design and market their products. While the companies have faced scrutiny in the past, she said, the glare of examination at trial could be especially bright.
“There’s always been talk of it and the members of Congress have kind of said, ‘maybe we’ll regulate you,’” she said. “I think now the platforms are really getting nervous about what this is going to mean if they look really bad on the stand.”
This article was originally published on CalMatters and was republished under the Creative Commons Attribution-NonCommercial-NoDerivatives license.
