US jury verdicts against Meta, Google tee up fight over tech liability shield

By Diana Novak Jones
March 26, 2026 10:02 AM UTC Updated 1 hour ago


Lawyer Mark Lanier, who represents plaintiff Kaley G.M., speaks with the media outside the court after the jury found Meta and Google liable in a key test case accusing Meta and Google’s YouTube of harming children’s mental health through addictive social media platforms, in Los Angeles, California, U.S., March 25, 2026. REUTERS/Mike Blake

  • Summary
  • Companies
  • Meta, Google plan to appeal, may invoke Section 230 protection
  • Thousands of lawsuits target tech firms over platform design choices
  • Potential implications for other online platforms beyond social media
  • Supreme Court may weigh in on Section 230 scope, experts say

March 26 (Reuters) – Jurors in the first two trials in the U.S. from a growing wave of lawsuits targeting social media firms over harm to children have found Meta (META.O) and Alphabet’s (GOOGL.O) Google liable, potentially teeing up an appeals fight that could reshape how U.S. law shields tech companies from lawsuits.

In California, a Los Angeles jury on Wednesday found Meta and Google liable for a young woman’s depression and suicidal thoughts after she said she became addicted to Instagram and YouTube at a young age, ordering them to pay a combined $6 million in damages. In a separate New Mexico case, jurors on Tuesday ordered Meta to pay $375 million after finding the company misled users about the safety of its products for young users and enabled the sexual exploitation of children on its platforms.


The verdicts pierce a legal shield that plaintiffs suing tech companies have long struggled to overcome: Section 230 of the Communications Decency Act, a 1996 federal law that generally protects online platforms from liability over user-generated content. In both cases, the plaintiffs sidestepped that hurdle by arguing the companies harmed young users through decisions they made about the platforms’ design rather than the content itself.

“Courts are increasingly trying to distinguish claims about platform functionality or platform conduct from claims that would really just impose liability for third-party speech,” said Gregory Dickinson, an assistant professor at the University of Nebraska College of Law who studies the intersection of tech and the law.


Meta and Google have denied the claims, arguing they have taken actions to protect young people.

META, GOOGLE CLAIMED LIABILITY SHIELD

In both cases, Meta urged the judge to dismiss the lawsuit, as did Google in the Los Angeles case, claiming they were shielded from liability by Section 230. The judges rejected the argument, saying the cases could move to trial.

A Meta spokesperson declined to comment beyond noting that Meta plans to appeal in both cases. Google has said it plans to appeal in the Los Angeles case, but did not immediately respond to a request for comment.

Those appeals are almost certain to center on Section 230 – and they could have broad implications.

Meta, Google, Snapchat parent Snap Inc, and TikTok parent ByteDance are facing thousands of lawsuits in both state and federal court over claims their design choices have led to a mental health crisis for teens and young people. More than 2,400 cases have been centralized before a single judge in California federal court, while thousands of cases are consolidated in California state court.

Legal experts say courts have been moving toward a narrower view of Section 230’s liability shield. Several lower courts have ruled that companies’ platform design choices are not protected by the law, but no appellate court has weighed in. Appellate courts, not trial judges, are the ones whose rulings bind other courts.

IMPLICATIONS BEYOND SOCIAL MEDIA

An appellate ruling on Section 230 could have implications beyond social media, legal experts say, shaping lawsuits against other online platforms that host content used by children. More than 130 lawsuits are pending in federal court against Roblox Corporation, for example, accusing the popular gaming site of failing to protect users from sexual exploitation. Roblox denies the claims.

“I think the internet is on trial, not social media,” said Eric Goldman, co-director of the High Tech Law Institute at Santa Clara University School of Law. “If the theories work, they will be deployed elsewhere.”

Appeals in both cases would be heard first by state-level appeals courts, but they could go to higher courts after that.

The U.S. Supreme Court has shown a willingness to potentially decide the scope of Section 230. In 2023, the court heard a challenge involving Google’s video-sharing platform YouTube, but ultimately sidestepped a ruling on the legal protections for internet companies.

In 2024, the high court declined to hear a Texas teen’s bid to revive his lawsuit accusing Snapchat owner Snap of failing to protect underage users of its social media platform from sexual predators. Two conservative justices – Clarence Thomas and Neil Gorsuch – dissented from that decision, however, warning of further delays in addressing the issue. “Social-media platforms have increasingly used (Section) 230 as a get-out-of-jail free card,” they wrote in a dissent.

Meetali Jain, director of the Tech Justice Law Project, which brings litigation against tech companies, said she thinks the U.S. Supreme Court may now be open to weighing in on the scope of Section 230.

“I personally think that the Supreme Court is even ready for a case like this, for the right case,” Jain said.

Reporting by Diana Novak Jones in Chicago, additional reporting by Andrew Chung in New York, Editing by Alexia Garamfalvi and Rod Nickel

Our Standards: The Thomson Reuters Trust Principles.
