How to Integrate Z.AI Models in Openclaw


Want to use Z.AI GLM models with Openclaw? This guide shows you how to connect your Z.AI API key and start using models like zai/glm-5 in minutes.


Quick Start

Z.AI provides access to GLM models through a simple API. Once configured, you can use Z.AI models in Openclaw alongside other providers, such as GLM or LiteLLM.

Step 1: Get Your Z.AI API Key

First, generate an API key from the Z.AI console. Openclaw reads this key from the ZAI_API_KEY environment variable.

Step 2: Configure Openclaw

You have two options for setup:

Option A: Interactive Setup (Recommended)

openclaw onboard --auth-choice zai-api-key

Option B: Non-Interactive Setup (for scripting)

openclaw onboard --zai-api-key "$ZAI_API_KEY"

Or with full non-interactive mode:

openclaw onboard --non-interactive \
  --auth-choice zai-coding-global \
  --zai-api-key "$ZAI_API_KEY"
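In CI, the non-interactive command pairs naturally with a secret store. Here is a hypothetical GitHub Actions step; the secret name and job layout are illustrative examples, not part of Openclaw itself:

```yaml
# Hypothetical CI step: pull the Z.AI key from repository secrets and
# onboard Openclaw without prompting.
steps:
  - name: Onboard Openclaw with Z.AI
    run: |
      openclaw onboard --non-interactive \
        --auth-choice zai-coding-global \
        --zai-api-key "$ZAI_API_KEY"
    env:
      ZAI_API_KEY: ${{ secrets.ZAI_API_KEY }}
```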

Option C: Manual Configuration

Edit your ~/.Openclaw/Openclaw.json file directly:

{
  "env": {
    "ZAI_API_KEY": "sk-..."
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "zai/glm-5"
      }
    }
  }
}
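Hand-edited JSON is easy to break. A quick syntax check with Python's standard library (any JSON validator works equally well) catches typos before Openclaw reads the file:

```shell
# Validate the config file's JSON syntax: prints the parsed JSON on success,
# or a parse error with a line number on failure.
python3 -m json.tool ~/.Openclaw/Openclaw.json
```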

Testing Your Setup

Once configured, test your connection:

openclaw models list | grep zai

You should see Z.AI models available. Try a quick prompt:

openclaw ask --model zai/glm-5 "Hello, can you confirm you're working?"

Tip: Z.AI models support the same features as other OpenAI-compatible providers. You can use them for chat, code generation, and more.

Troubleshooting

  • Authentication errors: Verify your ZAI_API_KEY is correct and exported in your environment.
  • Model not found: Restart the gateway after configuration changes with openclaw gateway restart.
  • Rate limiting: Check your Z.AI console for usage limits and quotas.
Note: If you're new to Openclaw model integration, you might also want to read about Elevated Mode in Openclaw for advanced permissions when configuring providers.
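A quick shell check for the first troubleshooting bullet — it confirms the key is visible to the current shell without echoing the secret:

```shell
# Report whether ZAI_API_KEY is set, without printing its value
if [ -z "${ZAI_API_KEY:-}" ]; then
  echo "ZAI_API_KEY is not set in this shell"
else
  echo "ZAI_API_KEY is set (${#ZAI_API_KEY} characters)"
fi
```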

Best Practices

  • Store your API key in environment variables, not hardcoded in config files
  • Use ~/.Openclaw/.env for sensitive credentials if running as a daemon
  • Set a fallback model in case Z.AI is temporarily unavailable
  • Monitor usage through the Z.AI console to track costs
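The fallback suggestion might look like this in ~/.Openclaw/Openclaw.json. Note that the `fallbacks` key and the fallback model name are hypothetical illustrations; check the Openclaw configuration reference for the actual field name:

```json
{
  "agents": {
    "defaults": {
      "model": {
        "primary": "zai/glm-5",
        "fallbacks": ["some-other-provider/some-model"]
      }
    }
  }
}
```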

That's it! You're now ready to use Z.AI GLM models with Openclaw.