
How to Set Up Ollama in OpenClaw (Free AI Model)

This guide walks you through connecting Ollama Cloud (completely free) to your OpenClaw agent. Once configured, your agent will be powered by open-source language models such as Llama and Mistral at no cost.


Available Ollama Models in OpenClaw


OpenClaw supports multiple leading AI providers. When you run setup-ai and choose Ollama Cloud, you can select from the following free models.


| No. | Model Name | Description |
| --- | --- | --- |
| 1 | Qwen 3.5 | Vision + tools + thinking, 397B parameters from Alibaba. Best all-round model. |
| 2 | MiniMax M2.5 | Productivity- and coding-focused model from MiniMax. |
| 3 | Kimi K2.5 | Vision + tools + thinking from Moonshot AI. Supports 1M token context. |
| 4 | Ministral 3 | Vision + tools, 14B lightweight model from Mistral. |
| 5 | GLM-5 | Agentic engineering + long-horizon tool use from Z.ai. |
| 6 | GPT-OSS | Tools + thinking, 120B open-source model from OpenAI. |
| 7 | Nemotron-3-Super | Agentic + reasoning, 120B MoE (12B active) from NVIDIA. |
| 8 | Gemma 4 | Multimodal + reasoning, 31B model from Google DeepMind. |


💡 Tip: Ollama Cloud is completely free and great for testing or as a backup. Use it as a fallback provider to ensure your agent always has a model available.


How to Set Up Ollama in OpenClaw


Follow these steps to configure your chosen AI provider via SSH.


Prerequisites


  • An active OpenClaw VPS
  • SSH access to your VPS (see the Essential Command Guide)
  • No API key needed – Ollama Cloud is free


If you're new to OpenClaw, feel free to check out this guide.


  1. Run the setup command: setup-ai.


  2. When prompted, choose your AI provider by entering its number from the displayed list of options (1–8). Enter 8 to select Ollama Cloud.


  3. Wait for the initial setup to complete. The system automatically downloads and installs the necessary Ollama Cloud components and prepares the connection to the free cloud service.


  4. Click the provided link to review and confirm permissions, then return to your SSH session and press Enter to proceed.


If the page does not open automatically, copy the link and paste it into your browser.


Click "Connect" to enable the Ollama Cloud model on the VPS.


  5. After authorization, the script displays a list of available cloud models. Enter the numbers of the models you wish to enable, or enter 9 to enable all models.


  6. You will be prompted to set the selected model as primary or designate it as a fallback.
  • Primary: the main model your agent uses by default.
  • Fallback: the model your agent uses if the primary is unavailable.


  7. Once the configuration is complete, the OpenClaw service restarts automatically.
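After the setup script finishes, your selection is written to OpenClaw's configuration file. As a rough sketch only, the relevant entry might look something like this (the field names and model identifiers below are assumptions for illustration, not OpenClaw's actual schema):

```json
{
  "provider": "ollama-cloud",
  "models": {
    "primary": "qwen-3.5",
    "fallback": "gpt-oss"
  }
}
```

Whatever the exact schema, the key idea is the same: one model is marked as the default and the others remain available as fallbacks.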


Verifying Your Configuration


After the restart, you can verify the setup:


  1. Send a message to your agent. If you set the model as primary, you will see the agent responding with that model.


  2. Check the configuration file to view your configured provider and models:


cat /opt/openclaw/data/.openclaw/openclaw.json
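If you prefer a quick scripted check over reading the whole file, you can pull a single field out with sed. The sketch below runs against a hypothetical sample config written to /tmp, since the real openclaw.json schema may differ from this illustration:

```shell
# Write a hypothetical sample config (field names are assumptions,
# not necessarily OpenClaw's actual schema).
cat > /tmp/openclaw-sample.json <<'EOF'
{
  "provider": "ollama-cloud",
  "primaryModel": "qwen-3.5",
  "fallbackModel": "gpt-oss"
}
EOF

# Extract the "provider" value with sed (no jq dependency needed).
provider=$(sed -n 's/.*"provider": *"\([^"]*\)".*/\1/p' /tmp/openclaw-sample.json)
echo "provider=$provider"
```

To check the real file, swap /tmp/openclaw-sample.json for /opt/openclaw/data/.openclaw/openclaw.json. If jq is installed on your VPS, jq -r '.provider' does the same thing more robustly than sed.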



Important Notes

  • No API keys needed – skip any .env editing for Ollama.
  • Provider not listed? If you don't see Ollama Cloud or other recent providers among the options, your OpenClaw version may be outdated. Run an update first: openclaw-update.
  • Usage limits apply – Ollama Cloud is free but has rate limits and fair-usage policies. For detailed limits, refer to Ollama's documentation.
  • To change models later – just run setup-ai again and re-select.


Need Further Assistance?


If you face any issues or need assistance, don’t hesitate to reach out — our support team is always ready to help!


🔧 Need help? Submit a Support Ticket

💬 Chat with us on Live Chat via our website

Updated on: 20/04/2026
