Brunswick, ME • (207) 245-1010 • contact@johnzblack.com
OpenAI, Anthropic, and Meta all used the same company to help train their models. That company just got breached.
Mercor, a $10 billion AI training data startup, confirmed it was hit through a trojanized version of LiteLLM, an open-source API proxy tool. A group called TeamPCP planted the malicious code, and Mercor was one of thousands of organizations that pulled the poisoned package into production.
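The defense against exactly this kind of poisoned-package attack is to pin and verify artifact digests before anything gets installed. Here's a minimal sketch of that idea in plain Python; `verify_artifact` is a hypothetical helper, and the digest you pin would come from a release you've already vetted, not from this example:

```python
# Minimal sketch: reject a downloaded package artifact whose SHA-256
# digest doesn't match a pinned, previously vetted value. A trojanized
# swap of the file changes the digest and fails the check.
import hashlib

def verify_artifact(path: str, expected_sha256: str) -> bool:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Hash in chunks so large wheels/tarballs don't load into memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest() == expected_sha256
```

pip ships the same idea natively: put `--hash=sha256:...` entries in `requirements.txt` and install with `--require-hashes`, and a swapped artifact is refused.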
Meta has paused all work with Mercor indefinitely. OpenAI says it’s investigating. Anthropic hasn’t said anything.
TeamPCP published samples it claims came from Mercor's systems, including Slack messages, internal tickets, and recordings of contractor conversations. The group says it holds 4TB in total. That number is unverified. What IS confirmed: internal communications and contractor data were exposed, and thousands of AI data-labeling workers had their personal information and pay data caught in the blast.
These are the people doing the low-paid, tedious work of tagging data that trains billion-dollar models. They’re the most vulnerable people in the AI supply chain, and nobody’s talking about them.
Three of the biggest AI labs concentrated critical training operations on a single vendor. When that vendor got popped through one compromised dependency, all three took the hit simultaneously. Classic supply chain concentration risk.
TeamPCP has publicly stated they plan to partner with ransomware groups to target affected companies at scale. That’s the MOVEit playbook: compromise one link, then methodically extort everyone downstream.
If your organization uses third-party vendors for AI training or data labeling, this is your wake-up call. Most companies haven’t mapped their AI supply chain dependencies. They don’t know what open-source tools their vendors rely on, and they’re concentrating critical operations on providers they’ve never stress-tested.
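Mapping those dependencies doesn't have to start with a procurement exercise. For the Python layer alone, the standard library can enumerate what's installed and what each package declares it needs. This is a rough first-pass sketch (`dependency_map` is an illustrative helper, and it won't see transitive native libraries or anything your vendors run):

```python
# Rough sketch: enumerate installed Python packages and their declared
# dependencies as one layer of a supply-chain map. Uses only the
# standard library; real coverage needs SBOM tooling beyond this.
from importlib import metadata

def dependency_map() -> dict:
    deps = {}
    for dist in metadata.distributions():
        name = dist.metadata["Name"]
        # dist.requires is a list of requirement strings, or None.
        deps[name] = list(dist.requires or [])
    return deps
```

Running this across your own services tells you immediately whether something like LiteLLM is in your tree, and which packages pulled it in.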
Three labs. One vendor. One poisoned dependency. That’s all it took.
Read the full breakdown, including what TeamPCP is planning next