
# LLM Providers

Zeph supports multiple LLM backends. Choose based on your needs:

| Provider   | Type  | Embeddings | Vision | Best For                     |
|------------|-------|------------|--------|------------------------------|
| Ollama     | Local | Yes        | Yes    | Privacy, free, offline       |
| Claude     | Cloud | No         | Yes    | Quality, reasoning           |
| OpenAI     | Cloud | Yes        | Yes    | Ecosystem, compatibility     |
| Compatible | Cloud | Varies     | Varies | Together AI, Groq, Fireworks |
| Candle     | Local | No         | No     | Minimal footprint            |

Claude does not support embeddings natively. Use the orchestrator to combine Claude chat with Ollama embeddings.
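A minimal sketch of such a split configuration. The `[llm.orchestrator]` table and the `chat_provider`/`embedding_provider` key names are assumptions for illustration; see the orchestrator deep dive for the actual schema:

```toml
# Hypothetical orchestrator config: Claude for chat, Ollama for embeddings.
# Key names below are illustrative, not confirmed by this page.
[llm]
provider = "orchestrator"

[llm.orchestrator]
chat_provider = "claude"        # reasoning and generation
embedding_provider = "ollama"   # local embeddings (e.g. qwen3-embedding)
```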

## Quick Setup

Ollama (the default; no API key needed):

```shell
ollama pull mistral:7b
ollama pull qwen3-embedding
zeph
```
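Before launching, you can confirm the Ollama server is reachable and the models are pulled. This assumes Ollama is listening on its default port, 11434:

```shell
# List locally available models via the HTTP API and the CLI
curl -s http://localhost:11434/api/tags
ollama list
```

If the `curl` call fails to connect, start the server with `ollama serve` first.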

Claude:

```shell
ZEPH_CLAUDE_API_KEY=sk-ant-... zeph
```

OpenAI:

```shell
ZEPH_LLM_PROVIDER=openai ZEPH_OPENAI_API_KEY=sk-... zeph
```

## Switching Providers

Switching is a single config change: set `provider` in the `[llm]` section. All skills, memory, and tools behave identically regardless of which provider is active.

```toml
[llm]
provider = "claude"   # ollama, claude, openai, candle, compatible, orchestrator, router
```

Or set the `ZEPH_LLM_PROVIDER` environment variable instead.
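For one-off runs, the variable can be set inline for a single invocation. These examples assume the environment variable takes precedence over the config file value, which is worth confirming in the configuration docs:

```shell
# Per-invocation provider override, leaving the config file untouched
ZEPH_LLM_PROVIDER=ollama zeph

# Or export for the whole shell session
export ZEPH_LLM_PROVIDER=claude
export ZEPH_CLAUDE_API_KEY=sk-ant-...
zeph
```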

## Deep Dives