Is AI Ruining Music? King Gizzard, Spotify’s Doppelgängers, and the Streaming Era’s Creative Panic

By the time Charlie Warzel pressed record on the February 13, 2026 episode of Galaxy Brain, the question in the title already felt less like a thought experiment and more like an incident report: Is AI ruining music? The Atlantic staff writer frames the conversation around a reality artists have been muttering about for years—streaming payouts that rarely feel sustainable, discovery systems that behave like black boxes, and a platform economy that turns musicians into full-time content strategists. Then generative AI arrives as a multiplier: more tracks, more speed, more impersonation, more “music” that isn’t made so much as manufactured to fit.

Warzel’s guest is Stu Mackenzie, the frontman of Australia’s famously prolific King Gizzard & the Lizard Wizard—an ideal interview subject because his band has spent a decade stress-testing the modern music business from the outside. King Gizzard are the rare rock success story that still reads like a DIY fable: relentless touring, genre-hopping albums, obsessive fans, and a philosophy that treats the internet as a tool for community rather than a scoreboard. That ethos is exactly what’s under threat in Warzel’s telling—because the new fight isn’t just about getting paid. It’s about keeping a band’s identity intact in an ecosystem where your “artist page” can be hijacked, your sound can be imitated, and your catalog can be replaced by something algorithmically adjacent and legally slippery.

What follows is a clear-eyed tour through the ways streaming—and now AI—has reshaped music into something closer to an attention market than an art form. Not because anyone decided it should be that way, but because the incentives are built to reward volume, frictionless listening, and plausible sameness.

The streaming hangover: when “discovery” becomes the job

Warzel opens from a familiar place: the streaming era’s greatest hits of musician frustration. Albums get atomized into tracks. Catalogs become endless feeds. The path to real money feels reserved for the tiny slice of artists who hit stadium-level scale. And even to approach that scale, artists must “play the platform game,” where getting slotted into playlists and recommendation systems can matter as much as the music itself.

That last point is where the tone darkens. Warzel describes a psychological grind artists increasingly share: the sense of “shadowboxing an algorithm”—fighting for scraps of attention while the rules change without notice. In this world, one weird deep-cut can suddenly outperform the rest of an album because a recommendation system likes it, and the reward isn’t just more plays; it’s a kind of creative misclassification. The algorithm decides who you are, and your future discovery starts routing through that assumption.

That pressure doesn’t only shape how artists promote music; it shapes what they feel permitted to make. Warzel sketches a climate where musicians feel pushed to release faster and more often, hoping to stay visible in a system that punishes silence—an arms race that turns “work” into churn.

Then the episode pivots to the accelerant: generative AI.

Warzel’s framing is blunt: AI makes it possible to create “entire complex songs” with a prompt, or by humming into a phone. The result isn’t just novelty. It’s supply—mass supply—entering a market already defined by abundance.

In the transcript, Warzel cites Suno CEO Mikey Shulman discussing the appeal of automating parts of music-making, including a line that captures the cultural vibe of AI adoption: it has been compared to "the Ozempic of the music industry," meaning widely used but rarely admitted to.

Whether you find that metaphor funny, bleak, or both, the downstream effect is measurable: AI tracks aren’t staying in demo folders. They’re getting uploaded, distributed, and monetized—often in the most algorithm-friendly corners of streaming (instrumentals, mood playlists, ambient backgrounds). Warzel points to the emergence of AI music on the charts as a sign the pipeline is real, not theoretical.

One example frequently cited in the broader debate is Xania Monet, an AI-created “artist” whose song “How Was I Supposed to Know?” debuted on Billboard’s Adult R&B Airplay chart—an industry milestone because it shows how quickly synthetic work can enter traditional gatekept spaces.

But the most alarming AI story in Warzel's episode isn't "AI music exists." It's "AI music impersonates."

Warzel references reporting about voice-cloning and impersonation, including a case involving reggaeton superstar Bad Bunny: a voice clone used to create a song that briefly reached Spotify’s Top 100 in Chile before it was removed.

That incident matters because it clarifies what’s at stake. If a global superstar’s identity can be copied and distributed at scale, the problem isn’t just legal—it’s structural. The same architecture that makes global distribution easy also makes global fraud easy.

Which brings the episode to King Gizzard.

King Gizzard pulled their catalog from Spotify in July 2025. The trigger, as covered in multiple outlets and referenced in the episode’s transcript, was outrage over Spotify CEO Daniel Ek’s role in a major investment in Helsing, a German defense company associated with military drone and AI-defense tech.

The decision made headlines because it wasn’t a token protest from an artist who relies on Spotify for discovery—it was a meaningful withdrawal from the world’s dominant streaming platform. And it raised a practical question: what happens when a band willingly vacates the biggest stage?

Part of the answer, as the newsletter Platformer reported—and as Warzel summarizes—is that fans soon noticed “King Gizzard” tracks still appearing on Spotify, but they weren’t actual King Gizzard recordings. They were Muzak-like, ringtone-style instrumental covers paired with lookalike metadata and artwork. According to Platformer, some of these tracks collectively amassed more than 10 million streams before being removed after fans raised alarms.

That detail is the gut punch of the episode: even leaving doesn’t protect you. The platform’s discovery system still has to answer the query “King Gizzard,” and if the real catalog is gone, the system can be gamed to fill the vacuum with knockoffs.

It’s a perfect illustration of Warzel’s larger thesis: streaming has turned music into a commodity interface. If what matters is that the listener stays in the app, then anything that resembles the requested vibe can become “close enough”—and close enough can be profitable.

Warzel doesn’t just present King Gizzard as victims of the AI era. He positions them as a band that’s been quietly building an alternative value system for years—one rooted in fan community and permission rather than platform optimization.

A key example is the band’s “bootlegger” philosophy, which Mackenzie describes in the episode as emerging from early scarcity: older records pressed in small quantities, demand exploding internationally, and fans creating their own physical versions. Instead of waging war on bootlegs, King Gizzard leaned in—most famously with Polygondwanaland, which the band released with explicit permission for people to download, press, and distribute.

The band’s official materials around Polygondwanaland have long framed it as genuinely free—free to download and free to copy—an approach that helped turn fandom into infrastructure.

In Warzel’s reading, this is the philosophical opposite of algorithmic culture. Instead of competing for playlist placement, you build a world where fans do the spreading because they feel ownership, not because a recommendation engine rewarded a hook.

It’s also a reminder that “control” is complicated. King Gizzard gave up certain kinds of control (distribution, physical pressings, resale ecosystems) to preserve what Mackenzie calls the band’s “creative core”—the internal dynamic that lets them keep taking weird swings without becoming content robots.

To the episode’s credit, The Atlantic includes Spotify’s response on the record. Warzel notes that Spotify acknowledged AI is “accelerating problems that already exist,” including impersonation and fraud, and points to new policies announced in September 2025—covering stronger impersonation enforcement, spam filtering, and standardized AI disclosure in credits (aligned with the DDEX ecosystem).

Spotify has publicly described these measures as part of broader “AI protections” for artists and listeners.

Spotify also pushes back on a claim you hear in the conversation: that Spotify pays worse than other services. In the transcript, Spotify’s spokesperson argues other services pay less, and notes Spotify paid out more than $11 billion to the music industry in the prior year.

That $11B figure is consistent with Spotify’s own January 2026 public statements about 2025 payouts.

Still, the episode’s underlying point remains: policy statements don’t automatically solve an enforcement problem in a system processing massive quantities of uploads through distributors and intermediaries. Even when content is removed, new versions appear. Even when rules exist, bad actors find seams.

That's why Mackenzie's comments land less like a rant and more like exhaustion: it feels like a game individual artists can't win by endlessly filing takedowns.

The real fear isn’t “AI will write the hits.” It’s that music becomes wallpaper by default.

The cleanest takeaway from Warzel and Mackenzie’s conversation is that the biggest threat may not be an AI-generated pop star replacing human headliners. It’s a slow shift in what music is in everyday life.

Streaming already pushed listening toward frictionless consumption—endless playlists, passive discovery, “lean back” culture. Add generative AI, and the supply of functional, mood-fitting tracks becomes effectively infinite. In that world, the most rewarded music may not be the most daring or personal; it may be the most useful—the least interruptive soundtrack for work, sleep, studying, driving, shopping.

That’s what Warzel means by “diet music”: not necessarily bad music, but music engineered to be consumed the way an app wants you to consume—quickly, endlessly, with minimal resistance.

King Gizzard’s story becomes a case study in refusing that bargain. They’re not pretending they can stop the tide. They’re trying to keep their band’s internal logic alive: real people, real risk, real community, and a willingness to walk away from the platform that “everyone uses,” even if it means inconvenience and weird new forms of fraud.

Which is why the episode’s title works as clickbait and diagnosis at the same time. The question isn’t whether AI can make music. The question is whether the systems distributing music—and the incentives behind them—still leave room for art that feels human, surprising, and worth protecting.

And if you’re Mackenzie, staring at a fake version of your band on the world’s biggest streamer, the answer starts to sound less like debate and more like a warning flare.
