Technology
Social media companies must face youth addiction lawsuits
A federal judge on Tuesday rejected efforts by major social media companies to dismiss nationwide litigation accusing them of illegally enticing and then addicting millions of children to their platforms, damaging their mental health.
US District Judge Yvonne Gonzalez Rogers in Oakland, California, ruled against Alphabet, which operates Google and YouTube; Meta Platforms, which operates Facebook and Instagram; ByteDance, which operates TikTok; and Snap, which operates Snapchat.
The decision covers hundreds of lawsuits filed on behalf of individual children who allegedly suffered negative physical, mental and emotional health effects from social media use, including anxiety, depression and, occasionally, suicide.
The litigation seeks, among other remedies, damages and a halt to the defendants' alleged wrongful practices.
"Today’s decision is a significant victory for the families that have been harmed by the dangers of social media," the plaintiffs' lead lawyers - Lexi Hazam, Previn Warren and Chris Seeger - said in a joint statement.
More than 140 school districts have filed similar lawsuits against the industry, which are also before Gonzalez Rogers, and 42 states plus the District of Columbia last month sued Meta over youth addiction to its social media platforms.
Alphabet, through a spokesperson, called the allegations "simply not true" and said that protecting children "has always been core to our work." A TikTok spokesperson said the company had "robust safety policies and parental controls."
Snap declined to comment. Meta did not respond to a request for comment.
In her 52-page ruling, Rogers rejected arguments that the companies were immune from being sued under the US Constitution's First Amendment and a provision of the federal Communications Decency Act.
The companies said that provision, Section 230, provides immunity from liability for anything users publish on their platforms and requires the dismissal of all the claims.
But Rogers said the plaintiffs' claims went beyond third-party content, and that the defendants had not addressed why they should not be liable for providing defective parental controls, failing to help users limit screen time and creating barriers to deactivating accounts.
She cited as an example allegations that companies could have used age-verification tools to warn parents when their children were online.
"Accordingly, they pose a plausible theory under which failure to validly verify user age harms users that is distinct from harm caused by consumption of third-party content on defendants' platforms," Rogers wrote.
Rogers said the companies legally owed a duty to their users arising from their status as product makers and could be sued for negligence over their duty to design reasonably safe products and to warn users of known defects.
But the judge said the companies owed no legal obligation to protect users from harm from third-party users of their platforms, and she narrowed the litigation by dismissing some of the claims the plaintiffs were pursuing.