Arguments in a Landmark Social Media Addiction Trial Start Next Week. This Is What’s at Stake
Google and Meta both deny the allegations in the complaint. “Providing young people with a safer, healthier experience has always been at the core of our work,” Google spokesman José Castañeda said in a statement. “Working with youth, mental health and parenting experts, we've built services and policies to provide young people with age-appropriate experiences, and parents with robust controls.”
“For more than a decade, we've listened to parents, worked with experts and law enforcement, and conducted in-depth research to understand the issues that matter most,” Meta spokeswoman Stephanie Otway said in a statement. “We're using these insights to make meaningful changes — like introducing Teen Accounts with built-in protections and giving parents tools to manage their teens' experiences.”
The Bellwether Case
KGM started watching YouTube at the age of six, had an Instagram account when she was 11, got on Snapchat at 13, and TikTok one year later — with each app “sending her spiraling into anxiety and depression, fueled by low self-esteem and body dysmorphia,” according to her attorney Joseph VanZandt. She, along with her mother Karen Glenn, filed a lawsuit against Meta, Google's YouTube, Snap, and TikTok claiming that features like “autoplay” and “infinite scroll” contributed to her social media addiction, and that her social media use fueled her anxiety, depression, and insecurity. (Snap and TikTok settled the case with KGM before the trial. Terms were not disclosed.)
Glenn said last year that she had not realized the damage these platforms could do to her daughter, and that she would not have given her a phone had she known. Bergman says KGM's lawsuit was chosen as the “bellwether” case because she “represents so many other young women who have suffered serious mental health and emotional ailments and disorders as a result of social media.”
“The goal of the lawyers who bring these cases is not just to win and receive compensation for their individual clients,” says Benjamin Zipursky, a law professor at Fordham University School of Law. “They intend to achieve a series of victories in this sampling of so-called 'bellwether trials.' Then they will try to pressure the companies into a massive settlement in which they pay potentially billions of dollars and also agree to change their practices.”
KGM's is the first of 22 such bellwether trials to be held in Los Angeles Superior Court. A verdict in the plaintiff's favor could significantly strengthen the roughly 1,600 remaining lawsuits – and potentially force technology companies to adopt new protections. The trial also promises to raise wider awareness of social media business models and practices. “If the public has a very negative reaction to what comes forward, or what a jury finds, then this can affect legislation at the state or federal level,” Zipursky adds.
Bergman, who has represented asbestos victims for 25 years, says this trial feels like a repeat of what happened in the past. “When Frances Haugen testified before Congress and revealed for the first time what social media companies know their platforms are doing to vulnerable young people, I realized this was asbestos again,” says Bergman.
Dividing Lines
Drawing parallels to product liability cases against Big Tobacco and the auto industry, the plaintiffs argue principally that big tech companies designed their social media platforms in a negligent manner, meaning they didn't take reasonable steps to prevent harm. “Specifically, the plaintiffs allege that design features such as infinite scrolling and autoplay caused certain injuries to minors, including disordered eating, self-harm and suicide,” said Mary Anne Franks, a law professor at George Washington University.
On the other hand, the tech companies will likely lean on free speech defenses. “The defendants will argue that it was third-party content that caused the plaintiffs' injuries, not the access to that content that was provided by the platforms,” Franks says. The companies will also likely argue, she says, “that to the extent that companies' decision-making about content moderation is involved, that decision-making is protected by the First Amendment,” referring to the US Supreme Court's 2024 ruling in Moody v. NetChoice.