Meta told to pay $375m for misleading users over child safety



Kali Hays, Technology reporter


Meta chairman and chief executive Mark Zuckerberg (Getty Images)

A court in New Mexico has ordered Meta to pay $375m (£279m) for misleading users over the safety of its platforms for children.

A jury found that Meta, which owns Facebook, Instagram and WhatsApp, was liable for the way in which its platforms endangered children and exposed them to sexually explicit material and contact with sexual predators.

New Mexico Attorney General Raul Torrez said the verdict is “historic” and marks the first time that a state has successfully sued Meta over child safety issues.

A spokeswoman for Meta, led by chairman and chief executive Mark Zuckerberg, said the company disagrees with the verdict and intends to appeal.

She said: “We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors and harmful content. We remain confident in our record of protecting teens online.”

The jury found that Meta was responsible for violating New Mexico’s Unfair Practices Act because it misled the public about the safety of its platforms for young users.

The total civil penalty of $375m was reached after the jury decided there were thousands of violations of the act, each with a maximum penalty of $5,000.

Meta is also a defendant in a separate trial in Los Angeles, in which a young woman claims she became addicted as a child to platforms such as Meta's Instagram and Google-owned YouTube because of the way they are intentionally designed.

There are thousands of similar lawsuits winding their way through the US courts.

New Mexico sued Meta in 2022, claiming the company “steered” young users to content that was sexually explicit, showed child sexual abuse, or even exposed them to solicitation of such material and sex trafficking.

It said the company did so through its recommendation algorithms, which are essentially tools that Meta uses to automatically curate the content a user sees on its platforms.

“Meta executives knew their products harmed children, disregarded warnings from their own employees, and lied to the public about what they knew,” Torrez said.

“Today the jury joined families, educators, and child safety experts in saying enough is enough.”


