Brunswick, ME • (207) 245-1010 • contact@johnzblack.com
Michael Smith, a 54-year-old musician from North Carolina, pleaded guilty to collecting somewhere between $8 and $10 million in fraudulent royalty payments over several years. The scheme wasn’t complicated. He just did it at machine scale.
Hundreds of thousands of AI-generated songs. Thousands of fake streaming accounts. Spotify, Apple Music, Amazon Music, YouTube Music – each fake stream generating a fraction of a cent. Multiplied across a massive catalog played constantly by bots, those fractions became millions.
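The arithmetic is worth making explicit. A rough sketch, using an assumed blended per-stream rate (the $0.004 figure is illustrative, commonly cited as a ballpark, not a number from the case), and an assumed catalog size in the "hundreds of thousands" range:

```python
# Back-of-envelope economics of the scheme. Both numbers below are
# assumptions for illustration, not figures from court filings.
PER_STREAM_PAYOUT = 0.004  # assumed blended payout per stream, USD

target = 10_000_000  # upper end of the reported haul
streams_needed = target / PER_STREAM_PAYOUT
print(f"{streams_needed:,.0f} total streams")  # 2,500,000,000 total streams

# Spread across an assumed catalog of 500,000 AI-generated tracks,
# each track only needs a few thousand plays -- low enough that no
# single song ever looks like a hit.
catalog_size = 500_000
per_track = streams_needed / catalog_size
print(f"{per_track:,.0f} streams per track")  # 5,000 streams per track
```

That last line is the whole trick: spreading billions of streams thinly across a huge catalog keeps every individual track below the threshold where anyone pays attention.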
He did this for years before anyone caught up.
Streaming royalties are proportional. A song that accounts for 1% of all streams gets 1% of the royalty pool. The system has one obvious weakness: it can’t easily distinguish real engagement from manufactured engagement at scale. A real stream and a bot stream look identical in the calculation.
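A minimal sketch of that pro-rata math (simplified; real platform accounting involves market segments, subscription tiers, and label deals) makes the weakness concrete. A stream is a stream: nothing in the calculation distinguishes a human from a bot.

```python
# Pro-rata royalty split, simplified. The function and names are
# illustrative, not any platform's actual accounting.
def payout(pool: float, streams_by_artist: dict[str, int]) -> dict[str, float]:
    """Split a royalty pool in proportion to stream counts."""
    total = sum(streams_by_artist.values())
    return {artist: pool * n / total for artist, n in streams_by_artist.items()}

# 1,000,000 bot streams against 99,000,000 organic ones still take
# exactly 1% of the pool -- the arithmetic cannot tell them apart.
shares = payout(1_000_000.0, {"organic": 99_000_000, "bots": 1_000_000})
print(shares["bots"])  # 10000.0
```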
What made Smith’s version different from older streaming fraud was AI generation. By producing a massive catalog cheaply, he could absorb bot traffic without triggering content-based anomaly detection. The attack adapted to the defenses.
Catching this isn’t a content problem. It’s a behavioral one. A thousand accounts that all started listening to the same album on the same day, from the same IP blocks, with uniform 30-second listening windows – that should look different from organic engagement that grew over months.
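One of those signals can be sketched in a few lines. This is a toy heuristic, not any platform's real detector, and the threshold is an assumption: organic listeners produce varied play durations, while scripted accounts tend toward uniform ones.

```python
# Toy behavioral check: flag accounts whose play lengths are
# suspiciously uniform. Threshold and function name are illustrative.
import statistics

def looks_scripted(play_durations_sec: list[float],
                   min_spread_sec: float = 5.0) -> bool:
    """Flag an account whose play durations barely vary."""
    if len(play_durations_sec) < 10:
        return False  # too little data to judge
    return statistics.stdev(play_durations_sec) < min_spread_sec

# A bot hammering ~30-second windows vs. a human wandering a playlist:
print(looks_scripted([30.1, 30.0, 30.2] * 5))         # True
print(looks_scripted([12.0, 240.5, 95.3, 33.1] * 3))  # False
```

A real system would combine many such signals: account creation dates, IP-block clustering, catalog overlap across accounts. But the shape of the problem is the same: variance where humans are varied, uniformity where scripts are uniform.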
Platforms have gotten better at behavioral analytics. Still, the scheme worked for years, which says something about either the quality of the detection or how little economic incentive platforms have to look closely.
Every digital platform with algorithmic payouts, engagement-based monetization, or rankings driven by user behavior faces the same structural exposure. Fake reviews, click fraud, ad impression fraud, social media manipulation – same underlying pattern. When content creation becomes cheap through AI, detection has to move from “does this look real?” to “does this behavior look real?”
Smith’s operation is the first AI-assisted streaming-fraud case to reach a guilty plea. It sets precedent, raises the risk for copycats, and puts pressure on platforms to show their detection actually works.
The detail worth sitting with: it ran for years.