How to run Claude Code for free using a local Gemma model.
This is the full setup. My reel on this got over 10k views and heaps of people asked for the actual steps, so here they are.
What you need:
- Ollama installed
- A Gemma model pulled locally (Gemma is Google's open-weight family — Gemini itself is cloud-only, you can't run it on your machine)
- Claude Code CLI
- A config tweak to point Claude Code at your local model instead of the API
Steps:
- Install Ollama using 'brew install ollama' or go to ollama.com
- Run 'ollama pull gemma2' in terminal (or whichever Gemma variant fits your machine's RAM)
- Install Claude Code if you haven't — npm install -g @anthropic-ai/claude-code
- Set up the proxy/config so Claude Code routes to localhost instead of Anthropic's API. Claude Code speaks Anthropic's API format, so you need something in between that translates to Ollama (a LiteLLM proxy is one option). I'll drop the exact config in the comments because formatting here is a pain
- Run it. No API key needed, no subscription charge
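To sanity-check the Ollama side before touching Claude Code — these are the install and pull steps from above plus a quick smoke test (macOS/Homebrew shown; Linux has an install script on ollama.com):

```shell
# Install and start Ollama (default server port is 11434)
brew install ollama
ollama serve &

# Pull a Gemma variant sized to your RAM and confirm it actually loads
ollama pull gemma2
ollama run gemma2 "say hi"

# Lists your pulled models if the server is up
curl http://localhost:11434/api/tags
```

If that curl returns JSON with gemma2 in it, the local side is good.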
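Here's the rough shape of the config step while you wait for the pinned comment. Claude Code reads ANTHROPIC_BASE_URL from the environment, so you point it at whatever local endpoint speaks Anthropic's API format. The port 4000 and the dummy token below are placeholders for my setup, not gospel — swap in whatever your proxy actually exposes:

```shell
# Point Claude Code at localhost instead of api.anthropic.com.
# Assumes something on port 4000 speaks the Anthropic Messages API
# (e.g. a LiteLLM proxy in front of Ollama) -- adjust to your setup.
export ANTHROPIC_BASE_URL="http://localhost:4000"

# The proxy ignores this, but Claude Code wants a token set
export ANTHROPIC_AUTH_TOKEN="not-a-real-key"

# Then launch as normal
claude
```

Put the exports in your shell profile if you want it permanent.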
Things to know:
- It's slower than real Claude obviously. Local models aren't Sonnet
- Works best for small-medium tasks, not huge refactors
- Great for learning, side projects, or when you're rate limited
If you hit issues drop them below. I've probably seen it.