Manage and use local Ollama models: model management (list/pull/remove), chat/completions, embeddings, and tool use with local LLMs. Also covers OpenClaw sub-agent integration and model selection guidance.
Works with any Ollama setup: the host is configurable via the OLLAMA_HOST environment variable (defaults to localhost:11434).
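A minimal sketch of the host convention described above, using only the standard library. The OLLAMA_HOST fallback and the `/api/chat` endpoint are from the Ollama API; the model name `llama3.2` is an assumption for illustration.

```python
import json
import os
import urllib.request

# Resolve the Ollama base URL from OLLAMA_HOST, defaulting to localhost:11434.
host = os.environ.get("OLLAMA_HOST", "localhost:11434")
base_url = host if host.startswith("http") else f"http://{host}"

def chat(model: str, messages: list) -> str:
    """POST a non-streaming request to Ollama's /api/chat endpoint."""
    body = json.dumps({"model": model, "messages": messages, "stream": False}).encode()
    req = urllib.request.Request(
        f"{base_url}/api/chat",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]

# Example call (requires a running Ollama server with the model pulled):
# reply = chat("llama3.2", [{"role": "user", "content": "Hello!"}])
```

The same base URL works for the other endpoints mentioned above, e.g. `/api/tags` (list), `/api/pull`, `/api/delete`, and `/api/embed`.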