Meta and YouTube ordered to pay $3 million to young woman in social media addiction trial

After nine days of deliberation, a Los Angeles jury found Google and Meta liable for harms stemming from the design of their social media products on Wednesday and ordered them to pay $3 million in compensatory damages to a plaintiff who said that Instagram and YouTube caused depression, body dysmorphia and suicidal thoughts. 

Meta was ordered to pay 70 percent of damages and YouTube the rest. The amount owed the plaintiff may rise, and the jury will continue to deliberate over potential punitive damages for egregious conduct, per The New York Times. 

This is the first trial tackling the legal question of whether features of social media, like autoplay, infinite scroll and beauty filters, can cause harm to users. 

“This momentous verdict shows that tech companies will be held accountable for the harm they cause. These companies have spent years choosing profit over people’s well-being, and now a jury has decided they must pay the price for their actions,” said Maddy Batt, a legal fellow at Tech Justice Project, a law firm specializing in suits against AI chatbots.

The plaintiff, KGM, filed her lawsuit under a pseudonym in 2023. Now 20, she says she has been addicted to social media since she was a child. Her case was one of three selected out of thousands as “bellwether trials” to test a new theory of liability.

Batt cautioned that the outcome of this trial doesn’t mean “an automatic legal win” for the thousands of pending cases, as determining causation varies greatly given the circumstances. “Each individual plaintiff still does have to show, if they go to trial, that any negative mental health outcomes they personally experienced were linked to social media,” she said.

It is a huge boon to tech accountability advocates to see this success though, Batt said, and could lead to tech companies changing their products because of the amount of money in play to settle cases or pay damages. This jury decision, coupled with a $375 million verdict against Meta announced yesterday, is the first step to achieving that goal.

The New Mexico Attorney General Raúl Torrez sued Meta in 2023, alleging the company misled constituents over how safe its platforms are for children. State prosecutors focused specifically on Instagram’s potential to facilitate the sexual exploitation of kids. 

On Tuesday, a jury sided with New Mexico, saying the company also engaged in deceptive trade practices. Meta was ordered to pay $5,000 per violation — $375 million total. Torrez plans to pursue more damages at a future bench trial, and hopes to compel changes to the platform. Meta said it plans to appeal. 

Batt pointed out that this trial is the first time tech leaders like Mark Zuckerberg have had to make a case and submit to questioning in front of a jury of their peers. (The CEO did not take the stand in the New Mexico case.) Large tech companies have faced a public backlash over the past decade, and much of it has revolved around their products’ impact on the mental health of young people. 

Frances Haugen, a whistleblower, leaked internal research documents from the company previously known as Facebook showing employees were aware girls reported their eating disorders worsening after using Instagram. Social media use can prompt girls to compare and criticize their own bodies, and many companies struggle to moderate influencers promoting eating disorders on their platforms.

More than two-thirds of teenage girls report using Instagram, a higher share than boys. About a quarter of Black teens and a quarter of Latinx teens say they use Instagram and YouTube “constantly,” according to a 2024 survey by Pew Research Center. 

Google argued that YouTube was not social media, while Meta pressed on the question of whether social media was the cause of KGM’s anxiety, depression and body dysmorphia. Meta’s lawyers deconstructed KGM’s home environment, alleging her parents’ divorce and treatment by her mother were the root cause of her emotional pain. The companies also argued that it wasn’t the way their products were designed that caused problems, but rather the specific content seen. 

KGM originally named the companies behind Snapchat and TikTok in the lawsuit, but those parties settled for an undisclosed sum before the trial started. The trial focused on Instagram and Facebook, both Meta products, and YouTube, which is owned by Google.

The burden was on KGM’s lawyers to prove that Meta and Google were negligent in their design of social media products and show that those same products caused the plaintiff’s mental health issues. The jury agreed with those arguments. 

KGM testified that features like notifications made the app addictive, and she was unable to stop whenever she tried to limit her usage. She said she started her first Instagram account at age 9 and joined YouTube at age 10, even though legally kids aren’t supposed to have online accounts before they’re 13. Almost all of her Instagram posts had image filters on them, and KGM said she didn’t feel bad about her body until she began using the platform.

The tech accountability watchdogs who rallied behind KGM are ecstatic over this win. “The era of Big Tech invincibility is over,” said Sacha Haworth, executive director of The Tech Oversight Project, in a statement. 

For parents who have lost their kids to what many describe as social media-related harms, this is a moment of vindication. 

“For years, families have been told this was a parenting issue, but the jury saw the truth: these companies made deliberate decisions to prioritize growth and profit over kids’ safety,” said Shelby Knox, director of online safety campaigns at nonprofit ParentsTogether.

Social media companies have been battling allegations of harm, particularly to kids, for years. Most of the claims are easily dismissed under Section 230, the law that says a platform isn’t held liable for third-party content it hosts. But these bellwether cases are testing whether the design of products like YouTube, Facebook and Instagram is inherently harmful. Plaintiffs have pointed to the impacts of features such as infinite scroll and face filters as harmful regardless of the content being shared. 

The case concludes as Congress works to pass a package of internet bills that is aimed at protecting kids online but that critics say may lead to the removal of digital LGBTQ+ and abortion content — a particular concern given the Trump administration’s policy positions. 

In her statement, Haworth at The Tech Oversight Project called on lawmakers to pass the Kids Online Safety Act, one of the most hotly debated pieces of tech legislation in recent years. It has failed to pass the House since it was first introduced in 2022, but it is now being considered as part of the aforementioned package. 

“It’s good that people are suing these companies and winning in court to reduce their power and force them to change their policies,” said Evan Greer, director of digital rights nonprofit Fight For The Future, to The 19th. But she’s concerned how the verdict in KGM’s case will be used to advocate for laws that she says could threaten free speech online. 

Greer pointed to the way activists are using social platforms to monitor abuses by Immigration and Customs Enforcement, advocate for human rights and discuss accusations of sexual abuse against people like Jeffrey Epstein. “We need policies that address corporate abuse without kneecapping the ability of front-line activists to use social media to change the world,” she said.

Jess Miers, associate professor of law at the University of Akron School of Law, is concerned about the long-term consequences of the verdict. While these cases focus on the way platforms are designed, she said, in practice there isn’t a strong delineation between content and feature design. 

“Autoplay is only engaging because of what it plays,” she told The 19th. “Infinite scroll only retains users because of what it surfaces.” She pointed out many apps use these kinds of features, but those aren’t the ones being sued.

Thus, liability tied to design will inevitably trickle down to judgments about content. “The only practical way to reduce the risks alleged in these suits is to restrict or suppress categories of content that might later be characterized as harmful or ‘addictive,’” she noted. 

And what’s the content most likely to be labeled as harmful? “History shows they expand to cover disfavored speech—whether that’s reproductive health information, gender-affirming care, or speech about policing and immigration enforcement,” she said. 

“The people most likely to be affected are those who already rely on the Internet as a primary space for connection and support,” Miers said — like disabled people, LGBTQ+ youth or people looking for accurate information on contraception.
