Stop tuning the model. The harness rewrites itself.
jcode boots Claude Opus in 14 ms, runs at 27.8 MB, and edits its own Rust source mid-session. Same model inside. I believe this is the start of a sub-genre nobody is naming yet.

Quick context: the first era of LLMs was prompt engineering. Era two was context engineering. Era three is what we've all been arguing about all year: model picking, Opus 4.7 vs Gemini 3 Pro vs Kimi K2.6. Now someone has rebuilt the wrapper around Claude in Rust, and the gap on cold start, RAM, and per-session scaling is bigger than any model gap I've seen this year.

What jcode actually is:

- Coding agent harness, 94% Rust
- Custom terminal called handterm, plus a custom Rust mermaid renderer
- Native logins for Claude, OpenAI, Gemini, Copilot, Azure, plus aggregator providers
- MCP works out of the box, and falls back to your existing .claude/mcp.json, so anything you've already wired up keeps running
- Install via brew, curl, or cargo build. Single binary
- 3.3k stars on GitHub, 57 releases, actively maintained

The numbers (jcode vs Claude Code, from the README):

- Cold start: 14 ms vs 590 ms to 3.4 s, i.e. 42 to 245 times faster
- Idle RAM with local embedding off: 27.8 MB
- 10 parallel sessions: 260 MB total vs 334 MB to 3.2 GB for Claude Code
- Per added session: 9.9 MB vs 76 to 318 MB
- A custom mermaid renderer the author claims is 1800x faster than browser-based versions

That last one is the kind of detail that tells you what they're really doing: someone is going through every layer of the Claude Code experience, rebuilding it natively, and letting the gains compound.

What's actually different at the harness layer:

- Self-dev mode. Agents inside jcode can edit the harness's own Rust source, run cargo build, and hot-reload the binary across active sessions without dropping you. The wrapper is recursively modifiable from inside the agent loop.
- Memory as semantic vectors per turn. Recall is automatic via cosine similarity, not "remember to update CLAUDE.md".
There's an ambient mode that consolidates memory and resolves conflicts in the background while you work.
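The per-turn vector memory described above can be sketched in a few lines. This is a minimal illustration, not jcode's actual implementation: the `Turn` struct, the embeddings, and the `recall` function are all hypothetical, but the mechanism is the one the post names, cosine similarity over stored turn embeddings.

```rust
// Minimal sketch of cosine-similarity recall over per-turn memory.
// All names and data structures here are hypothetical, not jcode's API.

struct Turn {
    text: String,
    embedding: Vec<f32>,
}

fn cosine(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let nb: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if na == 0.0 || nb == 0.0 { 0.0 } else { dot / (na * nb) }
}

/// Return the top-k past turns most similar to the current turn's embedding.
fn recall<'a>(memory: &'a [Turn], query: &[f32], k: usize) -> Vec<&'a Turn> {
    let mut scored: Vec<(f32, &Turn)> = memory
        .iter()
        .map(|t| (cosine(&t.embedding, query), t))
        .collect();
    // Sort descending by similarity; NaN-free embeddings assumed.
    scored.sort_by(|a, b| b.0.partial_cmp(&a.0).unwrap());
    scored.into_iter().take(k).map(|(_, t)| t).collect()
}

fn main() {
    let memory = vec![
        Turn { text: "set up cargo workspace".into(), embedding: vec![1.0, 0.0, 0.0] },
        Turn { text: "fix mermaid renderer".into(),   embedding: vec![0.0, 1.0, 0.0] },
        Turn { text: "add cargo feature flag".into(), embedding: vec![0.9, 0.1, 0.0] },
    ];
    let query = vec![1.0, 0.0, 0.0]; // embedding of the current turn
    for t in recall(&memory, &query, 2) {
        println!("{}", t.text);
    }
}
```

The point of the automatic-recall design is that nothing is written back to a markdown file; relevant turns surface because their vectors are close, not because the model remembered to persist them.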
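The self-dev loop, rebuild your own source and swap in the new binary, can be sketched conceptually. This is an assumption-heavy illustration (Unix only): the binary path, the function names, and the bare exec-based restart are my guesses at the shape, not jcode's actual mechanism, which also has to carry live session state across the swap.

```rust
// Conceptual sketch of a self-rebuilding harness (Unix only).
// The path below and the restart strategy are assumptions for
// illustration, not jcode's actual self-dev implementation.

use std::os::unix::process::CommandExt; // provides Command::exec()
use std::process::Command;

/// Where the freshly built binary lands (hypothetical path).
const REBUILT_BIN: &str = "target/release/jcode";

/// Recompile the harness's own source tree.
/// Returns true only if `cargo build` ran and succeeded.
fn rebuild() -> bool {
    Command::new("cargo")
        .args(["build", "--release"])
        .status()
        .map(|s| s.success())
        .unwrap_or(false)
}

/// Replace the current process image with the new binary.
/// exec() only returns on error, so session state must be
/// persisted (e.g. to disk) before calling this.
fn hot_restart() -> std::io::Error {
    Command::new(REBUILT_BIN).exec()
}

fn main() {
    if rebuild() {
        // If exec() returns at all, the new binary could not be
        // started, so we keep running the old one.
        let err = hot_restart();
        eprintln!("hot-reload failed, staying on old binary: {err}");
    }
}
```

The key safety property is in the ordering: the old binary keeps running until the build succeeds and the exec goes through, so a broken self-edit never takes down the session.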
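The MCP fallback mentioned earlier is essentially a config-resolution order. A minimal sketch, assuming a hypothetical jcode-native config location; the only detail taken from the post is that .claude/mcp.json is used as the fallback so existing Claude Code setups keep working:

```rust
use std::path::{Path, PathBuf};

/// Resolve which MCP config file to load. The .jcode path is a
/// hypothetical native location; the .claude/mcp.json fallback is
/// the behavior described in the post.
fn resolve_mcp_config(project_root: &Path) -> Option<PathBuf> {
    let candidates = [
        project_root.join(".jcode/mcp.json"),  // assumed native config
        project_root.join(".claude/mcp.json"), // existing Claude Code config
    ];
    candidates.into_iter().find(|p| p.is_file())
}

fn main() {
    match resolve_mcp_config(Path::new(".")) {
        Some(p) => println!("loading MCP servers from {}", p.display()),
        None => println!("no MCP config found; starting with built-ins only"),
    }
}
```

First match wins, so a native config would shadow the Claude one, while a project that only ever configured Claude Code still gets all its servers.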