Schools sue social media companies for targeting children

Even if the Supreme Court were to clear the way for lawsuits like Seattle's, the district has a daunting challenge in proving the industry's liability.

SEATTLE — Like the tobacco, oil, gun, opioid and vaping industries before them, the big U.S. social media companies are now facing lawsuits brought by public entities that seek to hold them accountable for a huge societal problem — in their case, the mental health crisis among youth.

But the new lawsuits — one by the public school district in Seattle last week, with a second filed by a suburban district on Monday and almost certainly more to come — face an uncertain legal road.

The U.S. Supreme Court is scheduled to hear arguments next month over the extent to which federal law protects the tech industry from such claims when social media algorithms push potentially harmful content.

Even if the high court were to clear the way for lawsuits like Seattle's, the district has a daunting challenge in proving the industry's liability.

And the tech industry insists there are many ways social media's effects on teen mental health differ from, say, big pharma's role in pushing opioid addiction.

“The underlying argument is that the tech industry is to blame for the emotional state of teenagers, because they made recommendations on content that has caused emotional harm,” said Carl Szabo, vice president and general counsel of the tech industry trade association NetChoice. “It would be absurd to sue Barnes & Noble because an employee recommended a book that caused emotional harm or made a teenager feel bad. But that's exactly what this lawsuit is doing.”

Seattle Public Schools on Friday sued the tech giants behind TikTok, Instagram, Facebook, YouTube and Snapchat, alleging they have created a public nuisance by targeting their products to children. The Kent School District south of Seattle followed suit on Monday.

The districts blame the companies for worsening mental health and behavioral disorders including anxiety, depression, disordered eating and cyberbullying; making it more difficult to educate students; and forcing schools to take steps such as hiring additional mental health professionals, developing lesson plans about the effects of social media and providing additional training to teachers.

“Our students — and young people everywhere — face unprecedented learning and life struggles that are amplified by the negative impacts of increased screen time, unfiltered content, and potentially addictive properties of social media,” Seattle Superintendent Brent Jones said in an emailed statement Tuesday. “We are confident and hopeful that this lawsuit is a significant step toward reversing this trend for our students.”

Federal law — Section 230 of the Communications Decency Act of 1996 — helps protect online companies from liability arising from what third-party users post on their platforms. But the lawsuits argue the provision, which predates all the social media platforms, does not protect the tech giants’ behavior in this case, where their own algorithms promote harmful content.

That's also the issue in Gonzalez v. Google, a case against YouTube's parent company set for argument at the Supreme Court on Feb. 21. In that case, the family of an American woman killed in an Islamic State group attack in Paris in 2015 alleges that YouTube's algorithms aided the terror group's recruitment.

If the high court's decision makes clear that tech companies can be held liable in such cases, the school districts will still have to show that social media was in fact to blame. Seattle's lawsuit says that from 2009 to 2019, there was on average a 30% increase in the number of its students who reported feeling “so sad or hopeless almost every day for two weeks or more in a row” that they stopped doing some typical activities.

But Szabo pointed out that Seattle's graduation rates have been on the rise since 2019, a period when many kids relied on social media to keep in touch with friends during the pandemic. If social media were truly so harmful to the district's educational efforts, the graduation rate wouldn't be rising, he suggested.

“The complaint focuses on only how social media harms kids, and there might be evidence of that,” said Eric Goldman, a professor at Santa Clara University School of Law in Silicon Valley. “But there’s also a lot of evidence that social media benefits teenagers and other kids. What we don’t know is what the distress rate would look like without social media. It’s possible the distress rate would be higher, not lower.”

The companies have insisted that they take the safety of their users, especially kids, seriously, and they have introduced tools to make it easier for parents to know whom their children are contacting; made mental health resources, including the new 988 crisis hotline, more prominent; and improved age verification and screen time limits.

“We automatically set teens’ accounts to private when they join Instagram, and we send notifications encouraging them to take regular breaks,” Antigone Davis, Meta's global head of safety, said in an emailed statement. “We don’t allow content that promotes suicide, self-harm or eating disorders, and of the content we remove or take action on, we identify over 99% of it before it’s reported to us.”

Facebook whistleblower Frances Haugen revealed internal studies in 2021 showing the company knew Instagram negatively affected teenagers by harming their body images and worsening eating disorders and suicidal thoughts. She alleged the platform prioritized profits over safety and hid its research from investors and the public.

Even if social media benefits some students, that doesn't erase the serious harm to many others, said Josh Golin, executive director of Fairplay for Kids, a nonprofit working to insulate children from commercialization and marketing.

“The mental health costs to students, the amount of time schools have to spend monitoring and responding to social media drama, is exorbitant,” Golin said. “It is ridiculous that schools are responsible for the damages caused by these social media platforms to young people. Nobody is seeing the kinds of cumulative effects that social media is causing to the extent school districts are.”

Both cases were filed in U.S. District Court in Seattle, but they are based on state public nuisance law — a broad, vaguely defined legal concept whose origins date back at least to 13th-century England. In Washington, public nuisance is defined, in part, as “every act unlawfully done and every omission to perform a duty” which “shall annoy, injure or endanger the safety, health, comfort, or repose of any considerable number of persons.”

Most famously, public nuisance claims helped prompt the tobacco industry’s $246 billion, 25-year settlement with the states in 1998. But public nuisance law also has been at least part of the basis for litigation by state, city, county or tribal governments seeking to hold oil companies responsible for climate change, the gun industry for gun violence, the pharmaceutical industry for the opioid crisis and vaping companies like Juul for teen vaping.

Much of the litigation is ongoing. Juul Labs last month agreed to settle thousands of lawsuits — including 1,400 from school districts, cities and counties — for a reported $1.2 billion.

The Seattle litigation has the potential to bring about sweeping change, prompting questions about the appropriateness of addressing big societal issues in court rather than through lawmaking. Yet there is little risk to the school district: a private law firm filed the complaint on a contingency basis, meaning the firm is paid only if the case succeeds.

Jolina Cuaresma, senior counsel for privacy and tech policy at Common Sense Media, which aims to make media safer for children, said she was thrilled to see a school district make a public nuisance claim against the tech companies.

“Folks have become tired of waiting for Congress to do something,” she said.
