How to Set Up Ollama in OpenClaw (Free AI Model)
This guide walks you through connecting Ollama Cloud (completely free) to your OpenClaw agent. Once configured, your agent will be powered by open-source language models such as Llama and Mistral, at no cost.
Available Ollama Models in OpenClaw
OpenClaw supports multiple leading AI providers. When you run `setup-ai` and choose Ollama Cloud, you can select from these free models:
| No. | Model Name | Description |
|---|---|---|
| 1 | Qwen3.5 | Prioritizes stability and real-world utility, offering developers a more intuitive, responsive, and genuinely productive coding experience. |
| 2 | MiniMax-M2.7 | Actively evolves itself, building complex agents and handling advanced tasks using agent teams, skills, and dynamic tools. |
| 3 | Kimi K2.6 | Open-source multimodal agentic model for long-horizon coding, autonomous execution, and task orchestration. |
| 4 | Ministral-3 | Lightweight, efficient model from Mistral AI with vision and tool support. |
| 5 | GLM-5.1 | Z.AI’s next-generation flagship model for agentic engineering, with significantly stronger coding capabilities than its predecessor. |
| 6 | GPT-OSS | OpenAI’s open-weight models designed for powerful reasoning, agentic tasks, and versatile developer use cases. |
| 7 | Nemotron-3-Super | Large language model (LLM) trained by NVIDIA, designed to deliver strong agentic, reasoning, and conversational capabilities. |
| 8 | Gemma4 | Multimodal model that handles text and image input and generates text output. |
How to Set Up Ollama in OpenClaw
Follow these steps to configure your chosen AI provider via SSH.
Prerequisites
- An active OpenClaw VPS
- SSH access to your VPS (see the Essential Command Guide)
- No API key needed – Ollama Cloud is free
- Run the setup command: `setup-ai`
- When prompted, choose your AI provider by entering the corresponding number from the displayed list of options (1–8). Enter `8` for Ollama Cloud.

- Wait for the initial setup to complete. During this step the system automatically downloads and installs the necessary Ollama Cloud components and prepares the connection to the free cloud service.
- Click the provided link to review and confirm permissions, then return to your SSH session and press Enter to proceed.


- After authorization, the script will display a list of available cloud models; enter the numbers of the models you wish to enable, or enter `9` to enable all models.

- You will be prompted to decide whether to set each selected model as the primary or to designate it as a fallback.
- Primary: The main model your agent will use by default.
- Fallback: The model your agent will use if the primary is unavailable.

- Once the configuration is complete, the OpenClaw service will automatically restart.
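The primary/fallback behaviour described above can be sketched in shell. This is purely an illustrative stand-in, not OpenClaw's actual implementation; `call_model` is a hypothetical placeholder for whatever request your agent makes.

```shell
# Illustrative primary/fallback selection.
# call_model is a hypothetical stand-in, not a real OpenClaw command.
call_model() {
  # Pretend the primary model is unavailable so the fallback path is exercised.
  if [ "$1" = "primary-model" ]; then
    return 1
  fi
  echo "response from $1"
}

# Try the primary first; only if that fails, use the fallback.
call_model "primary-model" || call_model "fallback-model"
# prints: response from fallback-model
```

In other words, the fallback is only contacted when the primary call fails, so everyday traffic still goes to the model you marked as primary.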
Verifying Your Configuration
After the restart, you can verify the setup:
- Send a test message to your agent; if you set the model as primary, you will see the agent respond using that model.

- Check the configuration file to view your configured provider and models:
cat /opt/openclaw/data/.openclaw/openclaw.json
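If you only want a single field out of that JSON, `python3`'s `json` module works anywhere Python is installed. The file below is an assumed shape used purely for illustration — the real `openclaw.json` keys may differ, so adapt the key names to whatever `cat` actually shows you.

```shell
# Assumed shape for illustration only -- your openclaw.json keys may differ.
cat > /tmp/openclaw-sample.json <<'EOF'
{
  "provider": "ollama-cloud",
  "primaryModel": "gpt-oss",
  "fallbackModel": "gemma4"
}
EOF

# Extract a single field without installing extra tools:
python3 -c 'import json; print(json.load(open("/tmp/openclaw-sample.json"))["provider"])'
# prints: ollama-cloud
```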

Important Notes
- No API keys needed - skip any .env editing for Ollama.
- Provider not listed? If you don't see Ollama Cloud or other recent providers as options, your OpenClaw version may be outdated. Run an update first: `openclaw-update`
- Usage limits apply - Ollama Cloud is free but has rate limits and fair usage policies. For detailed limits, see Ollama's usage policies.
- To change models later, just run `setup-ai` again and re-select your models.
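If your agent does bump into those rate limits, a simple retry with backoff usually smooths things over. The sketch below is a generic pattern, not an OpenClaw feature; `do_request` is a hypothetical stand-in for whatever call is being throttled.

```shell
# Generic retry-with-backoff pattern for rate-limited calls.
# do_request is a hypothetical stand-in, not a real OpenClaw or Ollama command.
do_request() {
  # Simulate being rate-limited on the first two attempts.
  if [ "$attempt" -lt 3 ]; then
    return 1
  fi
  echo "ok on attempt $attempt"
}

attempt=1
until do_request; do
  sleep "$attempt"          # wait a little longer before each retry
  attempt=$((attempt + 1))
done
```

The growing `sleep` keeps retries from hammering a service that is already telling you to slow down.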
Need Further Assistance?
If you face any issues or need assistance, don’t hesitate to reach out — our support team is always ready to help!
🔧 Need help? Submit a Support Ticket
💬 Chat with us on Live Chat via our website
Updated on: 23/04/2026
Thank you!