Hi everyone,
I came across this YouTube video:
So to sum up what’s going on:
Context of the lawsuit:
- Warner (plus its labels) sued Suno in June 2024, accusing them of training their AI on Warner’s catalog (decades of masters) without permission or payment.
- Suno’s defense was the usual AI line: “it’s transformative, it’s fair use, we’re not copying, we’re teaching a model.”
- Everyone in the industry watched this case because it could become the “can AI train on our music for free?” precedent.
What the settlement actually does:
1- Artist opt-in for training: Warner artists can now choose whether their voice/name/likeness/catalog is used to train Suno's models. That's huge symbolically: it introduces consent-based training at a major-label level. But he points out the power imbalance: will smaller/younger artists really feel free to say "no" to their label?
2- New licensed models from scratch: Suno will build new models (aiming for 2026) trained only on licensed Warner material. The old “possibly trained on everything without permission” model still exists in the wild, and Warner doesn’t get retroactive compensation for that.
3- New usage rules / monetization caps: Free users can generate/stream/share. Paid users can download, with new limits on how much they can generate and how they can use it commercially. But it's not clear whether a paid user can safely generate music and license it for a TV show, ad, game, etc. That lack of clarity is a nightmare for music supervisors trying to clear rights.
4- Lawsuits dropped, no one admits fault: Warner drops all claims, Suno doesn't admit guilt. Classic settlement.
Why this matters for musicians / sync people
It proves that labels can force AI companies into licensing deals. So the “scrape everything and claim fair use” era is getting pushed back, at least politically and commercially.
For composers and libraries, this is partially good: it shows AI companies can’t just hoover up your catalog with zero consequences.
But he warns that independent composers, small libraries, and session musicians don’t have Warner’s lawyers. They could still be steamrolled by future deals that mostly protect majors.
Big headaches coming for sync & licensing
If Suno's new model is trained on Warner's catalog and someone generates a track influenced by a Warner artist: who owns it? Suno? The user? Warner? The artist? Who do you actually pay and clear for sync? Until there's a clean, standardized system, he says music supervisors will treat AI tracks as too risky to touch.
Market impact: “good enough for free”
He stresses that we're heading into a world where tons of AI tracks will be "good enough" for background use and cost almost nothing. That puts pressure on composers who charge real money for custom work. So your real "edge" won't be just technical skill; it'll be:
- Relationships
- Taste & unique voice
- Speed / reliability
Basically: the human parts AI can't fully copy (yet).
His final vibe
This is a win for major labels and top artists (more control and leverage). For working composers and sync people, it's more of a warning shot than a celebration. The settlement isn't the end of the AI story; it's the start of a more complex phase where AI is legalized but not necessarily fair.
I'm so confused by all this stuff. But I'm wondering what your take is on all of this, and also: how can we protect our work from AI training?