Good morning
Here's what's happening in AI today:
1. Chinese AI Firm MiniMax Rockets 78 Percent on Hong Kong Debut
What happened: MiniMax Group, a Chinese AI startup, surged 78 percent on its first day of Hong Kong trading, reaching an $11.6 billion valuation. Founded in 2022, the company makes popular consumer apps like Hailuo AI (video generation) and Talkie (AI character chat).
Why it matters: This is the second Chinese "AI tiger" to go public this week, following Zhipu AI, which climbed 13 percent on debut. Investors are hungry for Chinese AI exposure, especially to consumer-focused apps rather than enterprise tools.
The contrast: MiniMax's consumer apps drove excitement, while Zhipu's enterprise and government focus was "more stable but less exciting." This tells you what investors actually want: viral consumer hits, not boring B2B deals.
Our take: China's AI IPO pipeline is hot. DeepSeek (the model everyone's talking about) hasn't announced IPO plans yet, but Huawei's AI server spinoff, Baidu's chip arm, and others are all lining up. Expect more Chinese AI listings to flood markets in 2026.
2. AI Compute Shifts from Training to Inference at CES 2026
What happened: At CES, the big money is moving from training large models to running inference (actually using the models). This marks a fundamental shift in where AI spending flows.
Why it matters: For years, companies threw billions at training bigger and bigger models. Now they're realizing inference (making models actually do work) is where the bottleneck is. This changes which companies win.
Who benefits: Inference requires different hardware than training. Companies optimizing for inference efficiency (not just raw training power) become more valuable. This is why everyone's talking about edge AI and smaller models.
Our take: The training arms race is slowing. The new race is who can run AI cheapest and fastest for actual users. Expect 2026 to be the year of inference optimization over training scale.
3. Google Predicted to Default 10 to 15 Percent of Searches to AI Mode by End of 2026
What happened: Analysts predict Google will start defaulting users into AI search results (instead of traditional links) for longer, more complex queries. The shift could cover 10 to 15 percent of all searches by year-end.
Why it matters: This is how Google kills the "10 blue links" model. They'll slowly train users into AI mode without asking permission, starting with power users, then expanding to everyone.
The strategy: Google is good at rolling out changes gradually. They'll use the search box to detect query type and default you into AI results when they think it fits. Most users won't notice the shift happening.
Our take: This is the real threat to traditional web publishers. Not that AI mode exists, but that Google will make it the default. Expect publishers to scream about traffic loss by Q4 2026.
4. MIT Professor Says AI Investments Don't Match Actual Energy Benefits
What happened: MIT's Priya Donti says current AI investment priorities are mismatched with what would actually benefit the energy sector. Too much money in large language models, not enough in grid optimization AI.
Why it matters: AI could help make power grids cleaner and more efficient through predictive maintenance, better planning, and demand management. But that's not where the money is flowing.
The problem: Resource-intensive LLMs get billions in funding. Practical grid AI that could prevent blackouts gets scraps. The benefits don't align with where capital is deployed.
Our take: This is the AI investment bubble in a nutshell. Glamorous chatbots get funding. Boring infrastructure AI that could actually save the planet gets ignored. Priorities are backwards.
BY THE NUMBERS
78 percent: MiniMax stock surge on first day of trading
$11.6B: MiniMax valuation after Hong Kong IPO
10 to 15 percent: share of Google searches that could default to AI mode by end of 2026
2022: year MiniMax was founded (less than four years ago)
WHAT WE'RE WATCHING
DeepSeek IPO announcement (if it happens)
First major publisher lawsuit against Google over AI search traffic loss
Power grid strain reports as inference workloads scale
The infrastructure layer of AI is becoming more important than the model layer. That's the shift happening in 2026.
What do you think matters more? Training bigger models or running existing ones efficiently?
Drop your take below.
See you tomorrow,
The AI Pulse Team