How to Set Up Ollama in OpenClaw (Free AI Model)
This guide walks you through connecting Ollama Cloud (completely free) to your OpenClaw agent. Once configured, your agent will be powered by open-source language models such as Llama and Mistral at no cost.
Available Ollama Models in OpenClaw
OpenClaw supports multiple leading AI providers. When you run `setup-ai` and choose Ollama Cloud, you can select from these free models:
| No. | Model Name | Description |
|---|---|---|
| 1 | Qwen 3.5 | Vision + tools + thinking, 397B parameters from Alibaba. Best all-round model. |
| 2 | MiniMax M2.5 | Productivity & coding focused model from MiniMax. |
| 3 | Kimi K2.5 | Vision + tools + thinking from Moonshot AI. Supports 1M token context. |
| 4 | Ministral 3 | Vision + tools, 14B lightweight model from Mistral. |
| 5 | GLM-5 | Agentic engineering + long-horizon tool use from Z.ai. |
| 6 | GPT-OSS | Tools + thinking, 120B open-source model from OpenAI. |
| 7 | Nemotron-3-Super | Agentic + reasoning, 120B MoE (12B active) from NVIDIA. |
| 8 | Gemma 4 | Multimodal + reasoning, 31B model from Google DeepMind. |
How to Set Up Ollama in OpenClaw
Follow these steps to configure your chosen AI provider via SSH.
Prerequisites
- An active OpenClaw VPS
- SSH access to your VPS (see the Essential Command Guide)
- No API key needed – Ollama Cloud is free
- Run the setup command: `setup-ai`
- When prompted, choose your AI provider by entering the corresponding number from the displayed list of options (1–8). Enter `8` for Ollama Cloud.

- Wait for the initial setup to complete. The system automatically downloads and installs the necessary Ollama Cloud components and prepares the connection to the free cloud service.
- Click the provided link to review and confirm permissions, then return to your SSH session and press Enter to proceed.


- After authorization, the script will display a list of available cloud models. Enter the numbers of the models you wish to enable, or enter `9` to enable all models.

- You will be prompted to decide whether to set the selected model as primary or to designate it as a fallback:
- Primary: The main model your agent will use by default.
- Fallback: The model your agent will use if the primary is unavailable.

- Once the configuration is complete, the OpenClaw service will automatically restart.
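The primary/fallback relationship works like an ordered list: the agent tries the primary model first and only moves to the fallback if that call fails. Below is a minimal, purely illustrative shell sketch of that pattern (the `call_model` function and model names are stand-ins, not OpenClaw's actual internals; here the primary is scripted to fail so the fallback answers):

```shell
# Stand-in for a model call; fails for "qwen3.5" to simulate a primary outage.
call_model() {
  if [ "$1" = "qwen3.5" ]; then
    echo "qwen3.5 unavailable" >&2
    return 1
  fi
  echo "$1: ok"
}

# Try the primary first, then the fallback, keeping the first success.
answer=""
for model in qwen3.5 gpt-oss; do
  if answer=$(call_model "$model"); then
    break
  fi
done
echo "agent reply via: $answer"
```

In this sketch the loop prints `agent reply via: gpt-oss: ok`, mirroring how your agent would silently switch to the fallback model when the primary is unavailable.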
Verifying Your Configuration
After the restart, you can verify the setup:
- Send a message to your agent; if you set the model as primary, the agent will respond using that model.

- Check the configuration file to view your configured provider and models:
cat /opt/openclaw/data/.openclaw/openclaw.json
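If the full file is noisy, you can filter for just the provider and model entries. The snippet below builds a hypothetical sample config to demonstrate the filter; the field names (`provider`, `primaryModel`, `fallbackModel`) and values are assumptions, so match them to whatever keys your actual `openclaw.json` contains:

```shell
# Illustrative only: the real openclaw.json schema may differ from this sample.
cat > /tmp/openclaw-sample.json <<'EOF'
{
  "provider": "ollama-cloud",
  "primaryModel": "qwen3.5",
  "fallbackModel": "gpt-oss"
}
EOF

# Filter just the provider/model lines (swap in the real path on your VPS):
grep -E '"(provider|primaryModel|fallbackModel)"' /tmp/openclaw-sample.json
```

On your VPS you would point the same `grep` at `/opt/openclaw/data/.openclaw/openclaw.json` instead of the sample file.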

Important Notes
- No API keys needed: skip any `.env` editing for Ollama Cloud.
- Provider not listed? If you don't see Ollama Cloud or other recent providers as options, your OpenClaw version may be outdated. Run an update first: `openclaw-update`
- Usage limits apply: Ollama Cloud is free but has rate limits and fair usage policies. For detailed limits, refer to Ollama Cloud's documentation.
- To change models later, just run `setup-ai` again and re-select.
Need Further Assistance?
If you face any issues or need assistance, don’t hesitate to reach out — our support team is always ready to help!
🔧 Need help? Submit a Support Ticket
💬 Chat with us on Live Chat via our website
Updated on: 20/04/2026
Thank you!