A Meta spokesperson confirmed that external vendors responsible for content moderation on Meta’s platforms, including Facebook, were affected by a global tech outage on Friday that disrupted airports, banks, and hospitals.
According to a source familiar with the situation, the social media giant experienced a SEV1 incident, a “code red”-style alert indicating high-stakes system problems requiring immediate attention.
In a statement, the Meta spokesperson acknowledged the issues, stating that they had been resolved earlier in the day. “The global CrowdStrike outage earlier today temporarily impacted several of our vendors’ tools. While this caused a small impact on some of our support operations, there was minimal to no impact on our content moderation efforts,” the spokesperson said.
Like most social media companies, Meta uses a combination of artificial intelligence and human review to moderate the billions of posts on its platforms, which also include Instagram, WhatsApp, and Threads. While Meta employees handle some content moderation, most of the work is outsourced to vendors that employ low-paid workers to assess posts for hate speech, violence, and other rule violations.
Friday's incident affected vendor access to two Meta systems used to route flagged content for review, known as SRT and HumanOps, the source said.