(Demo included)
Not a forced, robotic “ha ha”.
A small, subtle one that matched the emotion in the line I fed it.
And for a second, I just sat there thinking...
“Holy sh*t. We’re really getting close.”
Both voices I tested (Kira and Ariana) sounded surprisingly natural.
Kira’s still a bit volatile. You can feel the variability when she shifts emotion.
But Ariana?
That one felt consistent.
She adapted to tone, rhythm, and emotional subtleties in a way that almost made it feel like a real conversation.
You can still tell it's AI if you've been in voice long enough.
But the emotional nuance?
That’s new.
That’s progress.
Cartesia’s Sonic 3 isn’t perfect yet.
But it’s a reminder that voice AI is crossing the uncanny valley faster than people realise.
And when AI starts not just talking like us, but feeling like us...
the game changes.
What do you think?
Are we ready for AI that sounds human and understands emotion?
And for all those curious minds...
Here's a quick demo for y'all