North Carolina Resident Admits to Multi-Million Dollar AI Music Fraud Scheme
In a landmark case that underscores the growing threat of artificial intelligence in creative industries, a North Carolina man has pleaded guilty to defrauding music streaming platforms and legitimate artists out of millions of dollars. Michael Smith, 52, from Cornelius, North Carolina, entered his plea on Friday as part of an agreement with federal prosecutors in the Southern District of New York.
How the Elaborate AI Fraud Operation Worked
Smith's scheme involved flooding popular music streaming services with thousands of AI-generated songs, then using automated "bot farms" to artificially inflate play counts into the billions. This sophisticated manipulation allowed him to fraudulently collect royalty payments intended for human musicians and legitimate copyright holders.
"Michael Smith generated thousands of fake songs using artificial intelligence and then streamed those fake songs billions of times," stated U.S. Attorney Jay Clayton. "Although the songs and listeners were fake, the millions of dollars Smith stole was real."
The Financial Scale of the Fraud
Court documents reveal the staggering scope of Smith's operation:
- He accumulated as many as 661,440 streams daily between 2017 and 2024
- Generated annual royalties of approximately $1,027,128
- Fraudulently obtained more than $10 million in total royalty payments
- Will forfeit $8,091,843.64 as part of his plea agreement
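The reported figures are roughly self-consistent. A back-of-envelope sketch (the per-stream payout rate below is an implied value derived from the court figures, not a number from the filings):

```python
# Rough consistency check of the figures cited in court documents.
DAILY_STREAMS = 661_440        # peak daily streams reported
ANNUAL_ROYALTIES = 1_027_128   # approximate annual royalties reported

daily_royalties = ANNUAL_ROYALTIES / 365
implied_rate = daily_royalties / DAILY_STREAMS  # implied payout per stream

print(f"Daily royalties: ${daily_royalties:,.2f}")
print(f"Implied payout per stream: ${implied_rate:.5f}")
```

The implied rate of roughly $0.004 per stream sits within the range commonly reported for major streaming services, which is consistent with prosecutors' description of the operation.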
Former U.S. Attorney Damian Williams emphasized that Smith had stolen "millions in royalties that should have been paid to musicians, songwriters, and other rights holders whose songs were legitimately streamed."
A Watershed Moment for AI-Related Prosecutions
This case represents one of the first successful prosecutions of AI-related fraud in the music business, coming at a time when the industry faces unprecedented challenges from artificial intelligence. The music industry had largely recovered from the Napster piracy era of the early 2000s, only to confront this new AI-based threat to streaming revenue.
Social media commentary highlighted the irony of Smith's scheme. As one X user noted, he had used "AI to make the music AND the audience" and earned $1.2 million annually "for music no human ever actually listened to."
The Broader Impact on the Music Industry
The Smith case illuminates a systemic vulnerability in the streaming economy. Under current business models, platforms such as Amazon Music, Apple Music, Spotify, and YouTube Music distribute royalties from a collective pool in proportion to stream counts. This system, already criticized by musicians for paying subsistence-level earnings to all but the biggest stars, is particularly vulnerable to AI-driven manipulation.
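The pool mechanism also explains why fake streams directly harm legitimate artists. A minimal sketch of pro-rata distribution, with purely illustrative numbers and hypothetical artist names:

```python
# Hedged sketch of the pro-rata royalty model: a fixed royalty pool is split
# in proportion to each rights holder's share of total streams.

def pro_rata_payouts(pool: float, streams: dict[str, int]) -> dict[str, float]:
    """Split a royalty pool proportionally to stream counts."""
    total = sum(streams.values())
    return {holder: pool * n / total for holder, n in streams.items()}

pool = 1_000_000.0  # hypothetical monthly royalty pool

honest = {"artist_a": 400_000, "artist_b": 600_000}
with_bots = {**honest, "bot_farm": 1_000_000}  # bots double the stream total

print(pro_rata_payouts(pool, honest))     # artists split the full pool
print(pro_rata_payouts(pool, with_bots))  # each real payout is cut in half
```

Because the pool is fixed, every fraudulent stream does not create new money; it redirects royalties that would otherwise have gone to human musicians and rights holders.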
The scale of AI-generated music is becoming overwhelming:
- French streaming service Deezer reports receiving 60,000 fully AI-generated tracks daily
- Spotify removed 75 million spam tracks in the past year alone
- Suno, an AI music generation company, produces approximately 7 million songs daily
- Deezer research suggests 97% of people cannot distinguish between human and AI music
Industry Responses and Regulatory Developments
The music industry's struggle with AI extends beyond fraudulent schemes. The UK government recently abandoned controversial plans that would have allowed AI companies to use copyrighted works without permission, following strong opposition from thousands of artists including Elton John, Dua Lipa, and Paul McCartney.
Even AI industry leaders express ambivalence about the technology's impact. Suno CEO Paul Sinclair told Billboard, "Truly, every single day I'm conflicted. This stuff is complicated... I want to make sure there's whole future generations of the beauty of art and music and the ability to build careers around it."
Legal Consequences and Future Implications
Under his plea agreement, Michael Smith now faces up to five years in federal prison when sentenced in July, in addition to the substantial financial forfeiture. His case serves as both a warning to would-be fraudsters and a signal that law enforcement is developing strategies to combat AI-assisted crimes.
As the music industry grapples with this new technological frontier, the Smith prosecution establishes important legal precedent while highlighting the urgent need for updated safeguards in digital content distribution systems. The case demonstrates how AI, while offering creative possibilities, also presents unprecedented opportunities for fraud that threaten the livelihoods of human creators worldwide.