Thirty years ago, Congress passed a law that offered internet companies a powerful shield from liability over content posted by users of their web platforms. The companies, lawmakers suggested, were like bookstores. They distributed content; they didn’t create it.
The courts have broadly interpreted the law, Section 230 of the Communications Decency Act of 1996, and internet companies have used it to bat away suits that would have held them responsible for activity on their sites. Plaintiffs’ lawyers have tried for decades to pierce Section 230’s protections—and have mostly failed.
That is, until now. In a pivotal case decided March 26, a state court jury in Los Angeles held two internet giants—Meta Platforms and YouTube—liable for a young woman’s social media addiction and the harm she suffered as a result. The verdict came only two days after a New Mexico jury found that Meta had failed to protect children from online predators.
How are lawyers overcoming Section 230? In the social media addiction case, at least, they did so by sidestepping it. Instead of arguing that social media content was to blame, they alleged Meta and YouTube deliberately and negligently developed addictive design features to hook young users. In essence, the lawyers sought to make social media harm a product liability problem rather than a content issue that might run into Section 230 and First Amendment roadblocks.
Seismic Impact
The Los Angeles jury awarded $6 million in damages to the young woman—identified as K.G.M. The size of the individual award poses no serious financial threat to the companies. Together, Meta Platforms, owner of Facebook and Instagram, and Alphabet, parent company of Google’s YouTube, generated $600 billion in revenue last year.
Yet the verdict may have a seismic impact. Thousands of similar cases have been brought by individuals and school districts in state and federal courts. K.G.M. v. Meta & YouTube was a bellwether trial, an initial case selected to test legal theories, gauge juror reactions, and assess potential damages. Plaintiffs in other pending cases now have a potential roadmap that could help them circumvent Section 230.
As one digital media lawyer told TechCrunch, $6 million is “nothing to the Metas of the world. But when you take that $6 million and you multiply it by all of the cases that they have against them, that becomes a huge number.”
In California alone, some 2,000 state cases have been consolidated in Los Angeles County Superior Court. Another 10,000 federal cases are consolidated in the U.S. District Court for the Northern District of California.
Big Tobacco Parallel?
Some advocates are calling the bellwether verdict a “Big Tobacco moment” for Big Tech. They argue that social media companies, like cigarette makers, have long denied their products are addictive. Now, “thousands of cases will follow, bringing Meta, Snap, TikTok, and YouTube to court,” one prominent activist told CNN.
A loss in a bellwether case can ramp up pressure on companies to reach a settlement. And it may also embolden state and federal lawmakers to press the companies to do more to protect minors. “Big Tech’s Big Tobacco moment has arrived,” Democratic Sen. Ed Markey said in the wake of the verdict, CNN reported. “We cannot rely on the courthouse alone—Congress must do its part to impose real guardrails on these platforms.”
Whether social media companies currently have the appetite for a multibillion-dollar settlement like the one Big Tobacco reached with state attorneys general in 1998, however, remains an open question. Meta and YouTube will likely appeal the jury verdict, and it is not yet clear how the appellate courts and federal judges will react to the product liability arguments.
Still, 40 state attorneys general are pursuing cases against Meta for allegedly endangering child safety. And some tech players have been willing to settle individual social media addiction cases, which may augur well for a global settlement. Snapchat and TikTok, for instance, both reached confidential settlements with K.G.M. just days before the start of the bellwether trial. Neither company admitted liability, however, and other cases against them are still pending.
Duty to Warn
The jury in Los Angeles found that Meta and YouTube should have reasonably known that the design and operation of their apps would be dangerous to minors. Jurors said the companies knew young users would be unable to recognize potential threats and failed to adequately warn them.
Design elements like beauty filters, infinite scroll, and algorithmic amplification fuel compulsive behavior, the plaintiffs asserted. In court documents, the young woman said she spent as many as 16 hours a day on Instagram and that her entire self-image hinged on the number of likes and followers she received. She also alleged that she suffered clinical depression, anxiety, and eating disorders, and engaged in self-harm, among other issues, as a result of her addiction.
Lawyers for Meta and YouTube argued that compulsive social media use does not meet the clinical definition of addiction and that scientists have yet to conclusively establish a link between social media and mental health issues. Meta CEO Mark Zuckerberg testified that his company has invested heavily in safety measures to protect young users.
K.G.M.’s lawyers countered with an expert witness, Dr. Anna Lembke, author of “Dopamine Nation,” who said the rewards of social media use stimulated the same neural pathways as gambling or drug addiction. The plaintiffs also alleged that internal company documents showed employees knew that young users were vulnerable and exhibited a compulsive response to social media.
A Path to the Supreme Court?
Product liability arguments around social media have been developing since at least 2019. In Lemmon v. Snap, a federal case filed that year, the parents of two boys who died in a high-speed accident alleged that a filter feature on Snapchat encouraged their sons to drive at dangerous speeds. While a U.S. district court said Section 230 barred the plaintiffs’ suit, the U.S. Court of Appeals for the Ninth Circuit partially reversed the decision, ruling the statute did not prevent internet companies from being sued over design flaws.
Some 200 cases have been filed in state and federal courts that rely on product liability arguments, Jonathan Cedarbaum, a professor at George Washington University Law School, wrote in an article for Lawfare. The claims, he said, typically “fall into two broad categories—what might be called defects of commission and of omission.” On one hand are design features that lead directly to harm. On the other are failures by companies to build guardrails that protect users.
While “the K.G.M. verdict has made headlines, and rightly so,” Cedarbaum expects any state appeal to reach the California Supreme Court. “As to the crucial federal law issues—Section 230 and the First Amendment—the appellate path will likely lead to the U.S. Supreme Court as well,” he wrote.
For claims against social media companies to ultimately succeed, Cedarbaum said, courts “will have to rethink not just the scope of Section 230 but also basic elements of product liability law and the proper way to apply the First Amendment to many online activities.”
Mega Verdict in New Mexico
While the bellwether case in California hinged on a novel products liability argument, it is not the only strategy plaintiffs are using to pursue social media companies.
On March 24, just two days before the Los Angeles decision, a state court jury in Santa Fe, N.M., found that Meta endangered children and misled consumers about the safety of its social media platforms. The jury ordered the company to pay $375 million, or $5,000 per violation, the maximum allowed by state law.
A two-year investigation by The Guardian newspaper into child sex trafficking on Facebook and Instagram led New Mexico Attorney General Raúl Torrez to launch a sting—“Operation MetaPhile”—conducted by undercover agents. Three men were later arrested for preying on children, and Torrez filed suit against Meta.
Internal company documents presented in court showed that child safety experts and company employees had “repeatedly warned about risks and harmful conditions on Meta’s platforms,” according to The Guardian. In taped depositions, Zuckerberg and Adam Mosseri, head of Instagram, said criminal behavior targeting children was “inevitable on the company’s platforms due to their vast user bases,” The Guardian reported.
Law enforcement officials and representatives from the National Center for Missing and Exploited Children (NCMEC) testified that the company was “deficient” in reporting about crimes committed against children on its apps. In its testimony, NCMEC said Meta generated “junk reports” and relied on artificial intelligence to moderate its platforms. This made the company’s reporting “useless,” NCMEC said, and hampered law enforcement efforts to stop criminal activity.
More Trials Ahead
Meta has said it will appeal the New Mexico verdict. “We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content,” the company said in a statement to The Guardian.
The company will not need to wait long to face off again with Torrez. A May 4 bench trial is scheduled to determine whether Meta must fund programs to address the harms caused to the public. The state is also asking a judge to decide if the company should be required to implement stronger age verification, kick predators off its platforms, and end encryption for messages sent by minors.
Meanwhile, another eight bellwether social media addiction cases are set for trial in Los Angeles. And a slate of federal cases is expected to go to trial over the summer. “There is a long road ahead, but [the Los Angeles] decision is quite significant,” Clay Calvert, a nonresident senior fellow at the American Enterprise Institute and expert on media law, told The New York Times. “If there are a series of verdicts for plaintiffs, it will force the defendants to reconsider how they design social media platforms and how they deliver content to minors.”
---
David L. Brown is a legal affairs writer and consultant who has served as head of editorial at ALM Media, editor-in-chief of The National Law Journal and Legal Times, and executive editor of The American Lawyer. He consults on thought leadership strategy, creates in-depth content for legal industry clients, and works closely with Best Law Firms as a senior content consultant.