πŸ€– LLM Provider Setup β€” Claude, GPT, Ollama

OpenPawz works with multiple LLM providers. Here’s how to set each one up.

Anthropic Claude (Recommended)

Claude is our recommended provider: strong reasoning, reliable instruction-following, and excellent tool use.

Setup

  1. Get an API key at console.anthropic.com
  2. Add to your .env:
     LLM_PROVIDER=anthropic
     ANTHROPIC_API_KEY=sk-ant-xxxxx

Models Available

  • claude-sonnet-4-20250514 (default, best balance)
  • claude-opus-4-20250514 (most capable)
  • claude-3-5-haiku-20241022 (fastest)
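To show the request shape Claude's Messages API expects, here is a minimal sketch. `build_claude_request` is a hypothetical helper for illustration, not part of OpenPawz; the default model matches the table above.

```python
def build_claude_request(prompt, model="claude-sonnet-4-20250514", max_tokens=1024):
    """Build (but don't send) a Messages API payload for the given prompt."""
    return {
        "model": model,
        "max_tokens": max_tokens,  # required by the Messages API
        "messages": [{"role": "user", "content": prompt}],
    }
```

Send the returned payload to the API with your `ANTHROPIC_API_KEY` in the request headers, or pass the same fields to Anthropic's official SDK.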

OpenAI GPT

OpenAI's GPT models work well with OpenPawz, especially for coding tasks.

Setup

  1. Get an API key at platform.openai.com
  2. Add to your .env:
     LLM_PROVIDER=openai
     OPENAI_API_KEY=sk-xxxxx

Models Available

  • gpt-4o (default, recommended)
  • gpt-4-turbo (previous gen)
  • gpt-3.5-turbo (faster, cheaper)
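As a sketch of how the key from your .env is used, here's a hypothetical helper that assembles the headers and body for a Chat Completions request; the function name is illustrative, not OpenPawz's actual code.

```python
import os

def build_openai_call(prompt, model="gpt-4o"):
    """Return (headers, payload) for a Chat Completions request."""
    headers = {
        # The key comes from the same OPENAI_API_KEY set in .env above.
        "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",
        "Content-Type": "application/json",
    }
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return headers, payload
```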

Ollama (Local/Free)

Run completely locally with no API costs. Great for privacy and experimentation.

Setup

  1. Install Ollama: ollama.ai
  2. Pull a model:
     ollama pull llama3
  3. Add to your .env:
     LLM_PROVIDER=ollama
     OLLAMA_MODEL=llama3
     OLLAMA_HOST=http://localhost:11434

Recommended Models

  • llama3 β€” Best overall
  • mistral β€” Fast and capable
  • codellama β€” Great for coding
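Ollama exposes a plain HTTP API on the host from your .env, so no SDK is needed. A minimal sketch (the helper name is hypothetical) that builds a request to Ollama's `/api/generate` endpoint:

```python
import json
import urllib.request

def build_ollama_request(prompt, model="llama3", host="http://localhost:11434"):
    """Build (but don't send) a request to Ollama's /api/generate endpoint."""
    body = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # return one complete response instead of a stream
    }).encode()
    return urllib.request.Request(
        f"{host}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
```

Pass the result to `urllib.request.urlopen` (with Ollama running) to get a JSON response containing the model's completion.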

Switching Providers

Just update your .env and restart OpenPawz. You can even configure multiple providers and switch between them!
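Provider selection boils down to reading LLM_PROVIDER and falling back to a per-provider default model. A rough sketch, assuming defaults and `<PROVIDER>_MODEL` override names that may differ from OpenPawz's actual ones:

```python
import os

# Assumed defaults per provider; OpenPawz's real defaults may differ.
DEFAULT_MODELS = {
    "anthropic": "claude-sonnet-4-20250514",
    "openai": "gpt-4o",
    "ollama": "llama3",
}

def resolve_provider(env=os.environ):
    """Pick the active provider and model from .env-style settings."""
    provider = env.get("LLM_PROVIDER", "anthropic")
    if provider not in DEFAULT_MODELS:
        raise ValueError(f"Unsupported LLM_PROVIDER: {provider}")
    # e.g. OLLAMA_MODEL=mistral overrides the llama3 default.
    model = env.get(f"{provider.upper()}_MODEL", DEFAULT_MODELS[provider])
    return provider, model
```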


Provider not working? Post in :red_question_mark: Questions with your error logs!