HeadlinesBriefing.com

Claude Code Now Runs on Local Ollama Models

DEV Community

Ollama's new compatibility with Anthropic's API lets developers bypass cloud costs. Users can now connect Claude Code to local or self-hosted models by setting two environment variables, shifting the tool from a paid cloud service to a private, offline-capable assistant.

The setup requires pointing ANTHROPIC_BASE_URL at your Ollama server. For local use, localhost works; for remote setups, Ollama must be configured to accept external connections. That flexibility matters for privacy-focused teams and for anyone avoiding subscription fees for model inference.
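A minimal configuration sketch, assuming Ollama's default port (11434) and an auth-token variable whose value Ollama ignores for local use (the token value "ollama" here is a placeholder, not a required string):

```shell
# Point Claude Code at a local Ollama server (default port 11434).
export ANTHROPIC_BASE_URL=http://localhost:11434

# Claude Code expects a token to be set; a local Ollama server
# does not validate it, so any placeholder works.
export ANTHROPIC_AUTH_TOKEN=ollama

# For a remote setup, Ollama itself must accept external connections.
# On the server, before starting Ollama:
#   export OLLAMA_HOST=0.0.0.0
```

With these variables exported, launching `claude` in the same shell routes its requests to the Ollama server instead of Anthropic's cloud.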

Developers can pull specific local models such as gpt-oss:120b, or use cloud-hosted ones such as minimax-m2.1, which require signing in to an Ollama account. Testing confirms the integration works: Claude Code reports its active model in responses, effectively turning it into a versatile local AI tool.
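The full workflow can be sketched as follows, assuming the model tags mentioned in the article and the standard ANTHROPIC_BASE_URL, ANTHROPIC_AUTH_TOKEN, and ANTHROPIC_MODEL environment variables recognized by Claude Code (the token value is a placeholder):

```shell
# Download a local model (gpt-oss:120b is a large download).
ollama pull gpt-oss:120b

# Launch Claude Code against the local server, selecting that model.
ANTHROPIC_BASE_URL=http://localhost:11434 \
ANTHROPIC_AUTH_TOKEN=ollama \
ANTHROPIC_MODEL=gpt-oss:120b \
claude

# Cloud-hosted models (e.g. minimax-m2.1) additionally require an
# Ollama account sign-in before they can be used:
#   ollama signin
```

Once inside the session, asking Claude Code which model it is running is a quick way to confirm the requests are hitting the Ollama server rather than Anthropic's API.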