How to Set Up an AI Provider on OpenClaw VPS (Ollama Cloud)

This guide walks you through connecting an AI provider to your OpenClaw agent. Once configured, your agent will be powered by the latest language models to handle conversations, tasks, and integrations.


Available AI Providers


OpenClaw supports multiple leading AI providers. For every provider except Ollama Cloud, you will need to obtain an API key from the provider's website. Choose the one that best fits your needs:


  1. Anthropic (Claude) – Advanced reasoning and safety-focused models, ideal for complex tasks.
  2. OpenAI – GPT-4 and GPT-3.5 models, versatile for general-purpose AI interactions.
  3. OpenRouter – A unified API giving access to multiple models (Claude, GPT, Llama, and more).
  4. Google (Gemini) – Google's multimodal models, excellent for understanding text, images, and more.
  5. Kimi (Moonshot AI) – A powerful model optimized for long-context understanding.
  6. MiniMax – Known for strong performance in conversational AI and text generation.
  7. Mistral – Efficient, open-weight models with a great performance-to-cost ratio.
  8. Ollama Cloud [free] – Run open-source models (like Llama, Mistral) in the cloud. Completely free API keys are available, perfect as a fallback option.


πŸ’‘ Tip: Ollama Cloud is completely free and great for testing or as a backup. Use it as a fallback provider to ensure your agent always has a model available.


How to Set Up an AI Provider


Follow these steps to configure your chosen AI provider via SSH.


Prerequisites


  • An active OpenClaw VPS
  • SSH access to your VPS (see the Essential Command Guide)
  • Your API key ready (if using a paid provider)


If you're new to OpenClaw, feel free to check out this guide first.


  1. Run the setup command: setup-ai.


  2. When prompted, choose your AI provider by entering its number from the displayed list (1–8). Enter 8 for Ollama Cloud.


  3. Wait for the initial setup to complete. During this step the system automatically downloads and installs the necessary Ollama Cloud components and prepares the connection to the free cloud service.


  4. Click the provided link to review and confirm permissions, then return to your SSH session and press Enter to proceed.


If the page does not open automatically, copy the link and paste it into your browser.


  5. After authorization, the script displays a list of available cloud models. Enter the numbers of the models you wish to enable, or enter 7 to enable all models.


  6. You will be prompted to set the selected model as your primary model or to designate it as a fallback.
  • Primary: The main model your agent uses by default.
  • Fallback: The model your agent uses if the primary is unavailable.


  7. Once the configuration is complete, the OpenClaw service restarts automatically.
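The primary/fallback choice above follows a simple try-then-fall-back pattern. The sketch below is a conceptual shell illustration only, not OpenClaw's actual implementation; ask_model and the model names are made-up stand-ins for a real model API call.

```shell
# Conceptual sketch of primary/fallback selection -- NOT OpenClaw's code.
# ask_model is a stand-in for a real model API call.
ask_model() {
  model="$1"
  # Simulate the primary model being unavailable:
  if [ "$model" = "primary-model" ]; then
    return 1
  fi
  echo "answered by $model"
}

# Try the primary first; if it fails, fall back:
ask_model "primary-model" || ask_model "fallback-model"
```

With the primary simulated as down, the fallback answers, which is exactly the safety net the free Ollama Cloud option is meant to provide.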


Verifying Your Configuration


After the restart, you can verify the setup:


  1. Send a message to your agent. If you set the model as primary, you should see the agent respond using that model.


  2. Check the configuration file to view your configured provider and models:


cat /opt/openclaw/data/.openclaw/openclaw.json
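If you prefer a programmatic check, you can pull individual fields out of the JSON with Python's standard json module. The file layout below is a guess for illustration only; the real openclaw.json schema may differ, so adapt the field name to what you actually see in your file.

```shell
# Illustration only: this sample mimics what a provider config MIGHT look
# like; the real openclaw.json fields may differ.
cat > /tmp/openclaw-sample.json <<'EOF'
{"provider": "ollama-cloud", "primary": "llama3", "fallback": "mistral"}
EOF

# Print a single field using Python's stdlib json module:
python3 -c "import json; print(json.load(open('/tmp/openclaw-sample.json'))['provider'])"
```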



Important Notes


  • API Keys - Your API keys (for paid providers) are stored securely in /opt/openclaw/.env. You can edit this file manually if needed, but using setup-ai is recommended to avoid errors. If you do edit manually, you must restart the service for changes to take effect: openclaw-restart.
  • Provider Not Listed? - If you don't see Ollama Cloud or other recent providers as options, your OpenClaw version may be outdated. Run an update first: openclaw-update.
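If you do edit the .env file by hand, a sed one-liner replaces a single value without risking typos elsewhere in the file. The variable name below is an assumption made for illustration (check your actual /opt/openclaw/.env for the real key names), and the example operates on a throwaway copy so nothing real is touched.

```shell
# Work on a throwaway copy; OPENAI_API_KEY is an ASSUMED variable name --
# check your real /opt/openclaw/.env for the actual keys.
printf 'OPENAI_API_KEY=old-value\n' > /tmp/env-sample

# Replace the value for one key without touching any other lines:
sed -i 's/^OPENAI_API_KEY=.*/OPENAI_API_KEY=new-value/' /tmp/env-sample
cat /tmp/env-sample

# After editing the real file, restart the service: openclaw-restart
```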


Need Further Assistance?


If you face any issues or need assistance, don’t hesitate to reach out β€” our support team is always ready to help!


πŸ”§ Need help? Submit a Support Ticket

πŸ’¬ Chat with us on Live Chat via our website

Updated on: 27/02/2026
