52-year-old Michael Smith of North Carolina was indicted on federal charges on Thursday, September 5, 2024, for allegedly making millions of dollars from streaming services over the last seven years using AI-generated music. The charges include wire fraud and money laundering conspiracy.
At first, Mr. Smith simply made basic loops on his computer, uploaded them to Spotify, and created tens of thousands of fake accounts that streamed the songs on repeat all day. From The New York Times:
According to a financial breakdown that he emailed himself in 2017 — the year that prosecutors say he began the scheme — Mr. Smith calculated that he could stream his songs 661,440 times each day. At that rate, he estimated, he could bring in daily royalty payments of $3,307.20 and as much as $1.2 million in a year.
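Those figures are internally consistent: they imply a royalty rate of roughly half a cent per stream, which is inferred from Mr. Smith's own numbers rather than any official Spotify payout figure. A quick back-of-the-envelope check:

```python
# Rough check of the figures in Mr. Smith's 2017 email to himself.
# The per-stream rate is inferred from his numbers, not an official payout rate.
streams_per_day = 661_440
daily_royalties = 3_307.20

per_stream_rate = daily_royalties / streams_per_day
annual_royalties = daily_royalties * 365

print(f"Implied per-stream rate: ${per_stream_rate:.4f}")  # ~ $0.0050
print(f"Projected annual take: ${annual_royalties:,.2f}")  # ~ $1.21 million
```

The projected annual take comes out to about $1,207,128, matching the "as much as $1.2 million in a year" estimate from the indictment.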
This worked well enough for a while. But starting in 2018, he teamed up with an executive from an AI music company to automate even the loop-making process. Also from the Times:
The compositions would arrive to Mr. Smith with file names like "n_7a2b2d74-1621-4385-895d-b1e4af78d860.mp3." Then, he generated plausible names for the songs and their artists: "Zygopteris," "Zygopteron," "Zygopterous," "Zygosporic" and so on.
In a world with real bands called Dirty Projectors, Neutral Milk Hotel and Sunn O))), real albums like "Yankee Hotel Foxtrot" and real songs like "MMMBop," the titles did not stand out.
By June 2019, Mr. Smith was earning about $110,000 each month, with a portion going to co-conspirators, the indictment said. In an email in February of this year, Mr. Smith bragged that he had reached 4 billion streams and $12 million in royalties since 2019.
This is hardly the only fake-music problem plaguing the streaming ecosystem. As writer Andy Vasoyan reported in Slate last month, there has also been a recent surge in uploads of cover songs by fake bands, many of which seem to be AI-generated as well:
Covers of popular songs were being inserted into large, publicly available playlists, hidden among dozens of other covers by real artists while racking up millions of listens and getting paid.
The artists "performing" the covers—the Highway Outlaws, Waterfront Wranglers, Saltwater Saddles—all fit a certain pattern, with monthly listeners in the hundreds of thousands, zero social media footprint, and some very ChatGPT-sounding bios. A group of vigilante Redditors initially found the pattern in bands covering country classics, but a wider look showed that there were groups covering songs across decades and genres. None of the bands had originals, but a group might cover the Red Hot Chili Peppers and Third Eye Blind and then pivot to "Linger" by the Cranberries in the same record. If you didn't think the song was A.I., you probably wouldn't suspect a thing.
Spotify doesn't technically have a policy against the use of generative AI in music creation; in fact, some people believe that Spotify may be rigging the game by padding playlists with its own fake artists. However, these fake cover bands could potentially be violating the company's policies against deceptive content.
Meanwhile, Spotify has stopped paying out for songs by actual humans that get fewer than 1,000 plays, even though those songs still generate money for the company. This might not sound too crazy until you realize that such songs account for two-thirds of the content on the streaming service. That's a lot of money, and it's now getting paid out to executives and, it seems, AI scammers.
The Bands and the Fans Were Fake. The $10 Million Was Real. [Maia Coleman / The New York Times]
Spotify Has a Fake-Band Problem. It's a Sign of Things to Come. [Andy Vasoyan / Slate]
Disclosure: I also write for Wirecutter, which is owned by the New York Times Company, which also publishes The New York Times.