OpenPawz works with multiple LLM providers. Here's how to set each one up.
Anthropic Claude (Recommended)
Claude is our recommended provider: great reasoning, strong instruction following, and excellent tool use.
Setup
- Get an API key at console.anthropic.com
- Add to your .env:
LLM_PROVIDER=anthropic
ANTHROPIC_API_KEY=sk-ant-xxxxx
Models Available
- claude-sonnet-4-20250514 (default, best balance)
- claude-opus-4-20250514 (most capable)
- claude-3-5-haiku-20241022 (fastest)
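OpenPawz's own startup checks aren't shown in this guide, so here is a minimal sketch of how you might sanity-check the .env values above before launching. `load_env` and `check_anthropic` are hypothetical helper names for illustration, not part of OpenPawz:

```python
from pathlib import Path

def load_env(path=".env"):
    """Minimal .env parser: KEY=VALUE lines; blanks and '#' comments are skipped."""
    env = {}
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

def check_anthropic(env):
    """Return a list of problems with the Anthropic settings (empty list = looks OK)."""
    problems = []
    if env.get("LLM_PROVIDER") != "anthropic":
        problems.append("LLM_PROVIDER is not set to 'anthropic'")
    # Anthropic keys start with the 'sk-ant-' prefix shown in the setup step above.
    if not env.get("ANTHROPIC_API_KEY", "").startswith("sk-ant-"):
        problems.append("ANTHROPIC_API_KEY is missing or doesn't look like an sk-ant- key")
    return problems

# Usage: problems = check_anthropic(load_env()); an empty list means the settings look right.
```

A passing check only confirms the values are present and shaped correctly; it doesn't verify the key against Anthropic's servers.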
OpenAI GPT
GPT-4 works great with OpenPawz, especially for coding tasks.
Setup
- Get an API key at platform.openai.com
- Add to your .env:
LLM_PROVIDER=openai
OPENAI_API_KEY=sk-xxxxx
Models Available
- gpt-4o (default, recommended)
- gpt-4-turbo (previous generation)
- gpt-3.5-turbo (faster, cheaper)
Ollama (Local/Free)
Run completely locally with no API costs. Great for privacy and experimentation.
Setup
- Install Ollama: ollama.ai
- Pull a model:
ollama pull llama3
- Add to your .env:
LLM_PROVIDER=ollama
OLLAMA_MODEL=llama3
OLLAMA_HOST=http://localhost:11434
Recommended Models
- llama3 (best overall)
- mistral (fast and capable)
- codellama (great for coding)
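Before pointing OpenPawz at Ollama, you can confirm the local server at OLLAMA_HOST is actually running. This sketch queries Ollama's `/api/tags` endpoint, which lists the models you've pulled; `ollama_models` is an illustrative helper name, not an OpenPawz function:

```python
import json
import urllib.error
import urllib.request

def ollama_models(host="http://localhost:11434"):
    """Return the names of locally pulled models, or None if the server is unreachable."""
    try:
        with urllib.request.urlopen(f"{host}/api/tags", timeout=3) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        return None

# Usage: if ollama_models() is None, start Ollama (or fix OLLAMA_HOST) before launching.
```

If the result is None, the server isn't up; if your model (e.g. llama3) isn't in the returned list, run `ollama pull` for it first.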
Switching Providers
Just update your .env and restart OpenPawz. You can even set up multiple providers and switch between them!
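Conceptually, the switch boils down to reading LLM_PROVIDER and picking that provider's default model. Here is a sketch of that dispatch using the defaults listed above; `resolve_provider` and `DEFAULT_MODELS` are illustrative names, not OpenPawz internals:

```python
# Default model per provider, as documented in the sections above.
DEFAULT_MODELS = {
    "anthropic": "claude-sonnet-4-20250514",
    "openai": "gpt-4o",
    "ollama": "llama3",
}

def resolve_provider(env):
    """Pick (provider, model) from env-style settings; reject unknown providers."""
    provider = env.get("LLM_PROVIDER", "anthropic")
    if provider not in DEFAULT_MODELS:
        raise ValueError(f"unsupported LLM_PROVIDER: {provider!r}")
    # Only the Ollama section documents an explicit model override (OLLAMA_MODEL).
    model = env.get("OLLAMA_MODEL") if provider == "ollama" else None
    return provider, model or DEFAULT_MODELS[provider]
```

For example, `resolve_provider({"LLM_PROVIDER": "ollama", "OLLAMA_MODEL": "mistral"})` yields `("ollama", "mistral")`.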
Provider not working? Post in Questions with your error logs!