Meta to Cut 40% of Outsourced Moderators as AI Reduces Moderation Errors by 25% Compared to Humans

Fast Technology, March 24 — According to media reports, Meta announced on March 19 local time that it will undertake a major overhaul of its content moderation strategy, significantly reducing reliance on external contractors and shifting toward advanced artificial intelligence systems.

For a long time, Meta has depended on a large network of third-party content moderators, often located in low-cost regions such as the Philippines and India, who review billions of posts daily to identify hate speech, misinformation, and inappropriate content.

Meta stated that advancements in AI technology are the primary driver of this change.

Under the plan, Meta will gradually cut up to 40% of its external moderation staff over the next 12 months and reallocate resources to internal AI tools, such as a Llama model fine-tuned for moderation tasks.

Third-party data indicates that Meta’s outsourced content moderation workforce is approximately 15,000 people, meaning about 6,000 jobs could be eliminated.

Meanwhile, Meta plans to invest $500 million in AI safety features, including real-time human oversight of high-risk decisions and strengthened testing protocols for responding to discovered vulnerabilities. Early pilots suggest the AI systems make 25% fewer moderation errors than human reviewers.

This decision by Meta reflects a broader industry trend: leveraging AI to handle scalable, repetitive tasks while allowing human employees to focus on work that AI cannot easily replace.


Editor: Deer Antler
