Even if the Supreme Court cleared the way for lawsuits like the one in Seattle, the district would face an uphill task in proving the industry’s liability.

SEATTLE — Just like the tobacco, oil, gun, opioid and vape industries before them, big American social media companies are now facing lawsuits brought by government entities trying to hold them accountable for a huge social problem — in their case, the mental health crisis among young people.

But the new lawsuits — one filed by a Seattle school district last week, another by a suburban district on Monday and likely more to come — face an uncertain legal path.

Arguments are scheduled to be heard at the U.S. Supreme Court next month over the extent to which federal law protects the tech industry from such claims when social media algorithms promote potentially harmful content.

Even if the high court cleared the way for lawsuits like Seattle’s, the district would face an uphill task in proving the industry’s liability.

And the tech industry insists that social media’s impact on teen mental health is in many ways different from, say, the role of a big pharmaceutical company in promoting opioid addiction.

“The main argument is that the technology industry is to blame for the emotional state of teenagers because they have recommended content that has caused emotional harm,” said Carl Szabo, vice president and general counsel of the technology industry trade association NetChoice. “It would be absurd to sue Barnes & Noble because an employee recommended a book that caused emotional harm or made a teenager feel bad. But that’s exactly what this lawsuit does.”

Seattle Public Schools on Friday sued the tech giants behind TikTok, Instagram, Facebook, YouTube and Snapchat, alleging that they created a public nuisance by targeting their products at children. The Kent School District, south of Seattle, followed suit on Monday.

The districts blame the companies for worsening mental health and behavioral disorders, including anxiety, depression, disordered eating and cyberbullying; complicating students’ learning; and forcing schools to take steps such as hiring additional mental health professionals, developing lesson plans about the effects of social media, and providing additional training for teachers.

“Our students — and young people around the world — face unprecedented challenges in learning and life, exacerbated by the negative impact of increased screen time, unfiltered content and potential social media addiction,” Seattle Superintendent Brent Jones said in an emailed statement Tuesday. “We are confident and hopeful that this lawsuit is a significant step toward reversing this trend for our students.”

A federal law — Section 230 of the Communications Decency Act of 1996 — helps protect Internet companies from liability for what third-party users post on their platforms. But the lawsuits argue that the provision, which predates all social media platforms, doesn’t protect the tech giants’ behavior in this case when their own algorithms promote harmful content.

Section 230 is also at issue in Gonzalez v. Google, a case against YouTube’s parent company being heard at the Supreme Court on February 21. In that case, the family of an American woman killed in a 2015 Islamic State attack in Paris alleges that YouTube’s algorithms aided the terror group’s recruiting efforts.

If the high court’s ruling makes it clear that tech companies can be held liable in such cases, school districts will still have to show that social media is to blame. Seattle’s lawsuit says that from 2009 to 2019, there was an average 30% increase in the number of students who reported feeling “so sad or hopeless almost every day for two or more consecutive weeks” that they stopped doing some typical activities.

But Szabo noted that Seattle’s graduation rate has been on the rise since 2019, a period when many kids relied on social media to keep in touch with their friends throughout the pandemic. If social media were really that detrimental to the district’s educational efforts, graduation rates wouldn’t be rising, he suggested.

“The complaint only focuses on how social media harms children, and there may be evidence of that,” said Eric Goldman, a professor at Santa Clara University’s School of Law in Silicon Valley. “But there is a lot of evidence that social media is good for teenagers and other children. What we don’t know is what the level of distress would look like without social media. Perhaps the level of distress would be higher, not lower.”

The companies insist they take the safety of their users, especially children, seriously. They have introduced tools to make it easier for parents to monitor who their children are interacting with; made mental health resources, including the new 988 crisis hotline, more visible; and improved age verification and screen-time limits.

“We automatically make teens’ accounts private when they join Instagram and send notifications encouraging them to take regular breaks,” Antigone Davis, Meta’s global head of safety, said in an emailed statement. “We don’t allow content promoting suicide, self-harm or eating disorders, and of the content we remove or take action on, we identify over 99% of it before it’s reported to us.”

Facebook whistleblower Frances Haugen revealed internal research in 2021 that showed the company knew Instagram was having a negative impact on teenagers, damaging their body images and exacerbating eating disorders and suicidal thoughts. She claimed the platform prioritized profits over security and hid its research from investors and the public.

Even if social media benefits some students, it doesn’t undo the serious harms to many others, said Josh Golin, executive director of Fairplay for Kids, a nonprofit that works to insulate kids from commercialization and marketing.

“The cost to student mental health, the amount of time schools have to spend monitoring and responding to social media drama, is excessive,” Golin said. “It is ridiculous that schools are left to deal with the harm these social media platforms cause to young people. No one sees the kind of cumulative effects that social media causes to the extent that school districts do.”

Both cases were filed in U.S. District Court in Seattle, but they are based on the state’s public nuisance statute, a broad, vaguely defined legal concept with origins in 13th-century England. In Washington, a public nuisance is defined, in part, as “every unlawful act and every omission” that annoys, injures, or endangers the safety, health, comfort, or recreation of a substantial number of persons.

Most famously, public nuisance claims helped states win a 25-year, $246 billion settlement with the tobacco industry in 1998. But public nuisance law is also at least part of the basis for lawsuits by state, city, county and tribal governments seeking to hold oil companies responsible for climate change, the gun industry for gun violence, the pharmaceutical industry for the opioid crisis and vaping companies, such as Juul, for teenage vaping.

Most of the legal proceedings are ongoing. Juul Labs last month agreed to settle thousands of lawsuits — including 1,400 from school districts, cities and counties — for $1.2 billion.

The lawsuit in Seattle could lead to massive change, raising questions about the advisability of solving big public problems in court rather than through legislation. Still, the risk to the school district is small because a private law firm filed the complaint on a contingency basis, meaning the firm only gets paid if the case succeeds.

Jolina Quaresma, senior policy and technology advisor for Common Sense Media, a nonprofit that works to make media safer for children, said she was thrilled that the school districts filed public nuisance claims against the tech companies.

“People are tired of waiting for Congress to do something,” she said.
