Model Providers
Clawdia can use many LLM providers. Pick a provider, authenticate, then set the default model as `provider/model`.
Looking for chat channel docs (WhatsApp/Telegram/Discord/Slack/Mattermost (plugin)/etc.)? See Channels.
Highlight: Venius (Venice AI)
Venius is our recommended Venice AI setup for privacy-first inference, with the option to use Opus for hard tasks.

- Default: `venice/llama-3.3-70b`
- Best overall: `venice/claude-opus-45` (Opus remains the strongest)
Quick start
- Authenticate with the provider (usually via `clawdia onboard`).
- Set the default model (see the sketch after this list).
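
The exact command or config key depends on your Clawdia install; as a minimal sketch, assuming the default model is stored in a JSON config under a `model` key (a hypothetical name), the value uses the `provider/model` form described above:

```jsonc
{
  // hypothetical key name; check your install's config reference
  "model": "venice/llama-3.3-70b"
}
```

Swap the value for any provider/model pair listed below, e.g. `venice/claude-opus-45` when you want Opus for a hard task.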
Provider docs
- OpenAI (API + Codex)
- Anthropic (API + Claude Code CLI)
- Qwen (OAuth)
- OpenRouter
- Vercel AI Gateway
- Moonshot AI (Kimi + Kimi Code)
- OpenCode Zen
- Amazon Bedrock
- Z.AI
- GLM models
- MiniMax
- Venius (Venice AI, privacy-focused)
- Ollama (local models)
