Integrating OpenClaw with B.AI

From Zero to Your Private AI Agent: Deploy OpenClaw with B.AI in 15 Minutes

OpenClaw (formerly ClawdBot or Moltbot) is an open-source personal AI assistant. Unlike cloud-based SaaS tools, OpenClaw runs locally on your own machine, giving you full control over your data, workflows, and operating environment.

You can interact with OpenClaw through familiar messaging platforms such as Telegram, WhatsApp, Lark, and DingTalk to handle tasks like managing email, organizing calendars, writing code, and automating everyday workflows.

OpenClaw is more than a chatbot. It is a functional AI agent designed for real-world execution. It supports persistent memory, access to your local file system and the internet, and extensibility through installable skills.

Because it is open-source and self-hosted, OpenClaw has attracted a vibrant community of developers and tech enthusiasts. Users have already built creative use cases ranging from business automation to personal productivity, demonstrating the potential of a truly personal AI assistant.

This guide walks you through the full setup process, including installation, initialization, and connecting OpenClaw to the B.AI API. By the end, you will have your own private AI agent running locally.


Step 1: Get Your B.AI API Key

Before configuring OpenClaw, you need a B.AI API key.

  1. Log in to B.AI Chat
  2. Go to the API key management page
  3. Create or copy your api_key

Keep this key safe. You will use it in the configuration step below.
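One common way to keep the key handy for later steps without pasting it into files you might commit is a shell environment variable. This is only a sketch: B_AI_API_KEY is an illustrative variable name, not one OpenClaw itself reads, and "sk-your-key-here" is a placeholder.

```shell
# Keep the key in an environment variable instead of pasting it into
# files you might commit. B_AI_API_KEY is an illustrative name, not a
# variable OpenClaw itself reads.
export B_AI_API_KEY="sk-your-key-here"

# Confirm it is set without printing the whole value:
echo "key starts with: $(echo "$B_AI_API_KEY" | cut -c1-6)"
```

Note that `export` only affects the current shell session; add the line to your shell profile if you want it to persist.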


Step 2: Prepare Your System

Before installation, make sure your system meets the following requirements.

  • Node.js: Node.js 24 is recommended. Node.js 22.14+ is also supported.
  • Operating System: macOS, Linux, or Windows. Native Windows and WSL2 are both supported, but WSL2 is recommended for better compatibility.
  • Package Manager: npm is supported for standard installation. pnpm is commonly used when building from source.

To check your Node.js version, open a terminal and run:

node --version

If Node.js is not installed, or your version is below the 22.14 minimum, install or upgrade it from the official website:

https://nodejs.org/
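The version check can also be scripted. This sketch uses only standard shell tools (`sort -V` for version comparison) and assumes `node` is on your PATH; `version_ge` is a small helper defined here, not part of any tool.

```shell
# version_ge succeeds if version $1 >= version $2, comparing dotted
# version numbers with sort -V (version sort).
version_ge() {
  [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

required="22.14.0"
current="$(node --version 2>/dev/null | sed 's/^v//')"

if [ -z "$current" ]; then
  echo "Node.js not found; install it from https://nodejs.org/"
elif version_ge "$current" "$required"; then
  echo "Node.js $current meets the $required minimum"
else
  echo "Node.js $current is older than $required; please upgrade"
fi
```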


Step 3: Install OpenClaw

OpenClaw supports multiple installation methods. For most users, the easiest option is to install the latest CLI globally.

Option 1: Install with npm

npm install -g openclaw@latest

Option 2: Use the Official Install Script

macOS / Linux

curl -fsSL https://openclaw.ai/install.sh | bash

Windows PowerShell

iwr -useb https://openclaw.ai/install.ps1 | iex

Choose either method above. After installation, continue to the onboarding step.
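Whichever method you chose, you can confirm the CLI landed on your PATH before moving on. `check_cmd` below is just a small local helper, not an OpenClaw command.

```shell
# check_cmd prints where a command lives, or a hint if it is missing.
check_cmd() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1 found at: $(command -v "$1")"
  else
    echo "$1 not found; re-run the install step or open a new shell"
  fi
}

check_cmd openclaw
```

If the command is not found immediately after installing, opening a fresh terminal often fixes it, since PATH changes only apply to new shells.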


Step 4: Complete the Initialization Wizard

After installing OpenClaw, run the onboarding command:

openclaw onboard --install-daemon

This command starts the initialization wizard and installs the local Gateway service.

During onboarding, OpenClaw will guide you through several setup sections, such as:

  • AI model provider setup
  • Communication channels
  • Skills and optional features
  • Local background services

If you plan to configure B.AI manually, you can skip the default model provider setup during onboarding and continue with the manual configuration below.

Once onboarding is complete, OpenClaw will be ready for custom model configuration.


Step 5: Configure the B.AI Model

After onboarding, you need to manually add the B.AI model provider to the OpenClaw configuration file and set it as the default model.

There are two ways to do this:

  • One-Click Script: Follow the official one-click tutorial if available
  • Manual Configuration: Follow the steps below

Configuration note: The manual configuration below uses the current b.ai provider namespace and api.b.ai host so it stays aligned with the latest B.AI branding and API endpoint.


5.1 Open the Configuration File

Open the configuration file:

vim ~/.openclaw/openclaw.json

OpenClaw reads this file at startup to load model providers and agent settings.


5.2 Add the B.AI Provider Configuration

Locate the "models" section and merge in the following configuration.

Replace {B.AI_API_KEY} with your actual API key.

{
  "models": {
    "mode": "merge",
    "providers": {
      "b.ai": {
        "baseUrl": "https://api.b.ai/v1/",
        "apiKey": "{B.AI_API_KEY}",
        "api": "openai-completions",
        "models": [
          { "id": "gpt-5.2", "name": "gpt-5.2" },
          { "id": "gpt-5-mini", "name": "gpt-5-mini" },
          { "id": "gpt-5-nano", "name": "gpt-5-nano" },
          { "id": "claude-opus-4.6", "name": "claude-opus-4.6" },
          { "id": "claude-sonnet-4.6", "name": "claude-sonnet-4.6" },
          { "id": "claude-haiku-4.5", "name": "claude-haiku-4.5" }
        ]
      }
    }
  }
}
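A stray comma or unmatched brace in openclaw.json is an easy mistake to make while editing by hand. Before moving on, you can validate the file's syntax; this sketch assumes python3 is installed (its bundled json.tool module is a pure syntax checker).

```shell
# Validate the edited config before restarting the Gateway. This only
# checks JSON syntax, not whether the keys are ones OpenClaw expects.
if python3 -m json.tool ~/.openclaw/openclaw.json >/dev/null; then
  echo "openclaw.json is valid JSON"
else
  echo "openclaw.json has a syntax error; fix it before restarting"
fi
```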

5.3 Set the Default Model

In the same openclaw.json file, set the default model using the current OpenClaw structure:

{
  "agents": {
    "defaults": {
      "model": {
        "primary": "b.ai/gpt-5-nano"
      },
      "models": {
        "b.ai/gpt-5-nano": {
          "alias": "gpt-5-nano"
        }
      }
    }
  }
}

This tells OpenClaw to use the b.ai provider namespace by default. If you expose more than one B.AI model, add each of them under agents.defaults.models as an alias so commands like openclaw models status and openclaw models set ... can resolve them correctly.
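For example, the same block with a second model registered alongside the default might look like this (the extra model ID is taken from the provider list configured earlier; adjust it to whichever models you actually enabled):

```json
{
  "agents": {
    "defaults": {
      "model": {
        "primary": "b.ai/gpt-5-nano"
      },
      "models": {
        "b.ai/gpt-5-nano": {
          "alias": "gpt-5-nano"
        },
        "b.ai/claude-sonnet-4.6": {
          "alias": "claude-sonnet-4.6"
        }
      }
    }
  }
}
```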


5.4 Restart the Gateway

After saving the configuration file, restart the Gateway so the changes take effect:

openclaw gateway restart

If the CLI tells you the Gateway service is not installed yet, run this first:

openclaw gateway install
openclaw gateway restart

Then verify the service state:

openclaw gateway status

5.5 Test the Connection

Send a simple test message from your terminal:

openclaw agent --agent main --message "How are you doing today?"

If OpenClaw returns a valid response, the connection to B.AI is working correctly. If you instead see a billing or insufficient-balance error, the provider is already connected but your API key does not currently have enough balance to complete the request.
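If the agent test fails, it can help to call the B.AI endpoint directly and take OpenClaw out of the picture. This sketch assumes B.AI exposes an OpenAI-compatible /chat/completions route (implied by "api": "openai-completions" in the config above) and that your key is exported in an illustrative B_AI_API_KEY variable.

```shell
# Call the B.AI endpoint directly to isolate network or key problems.
# Assumes an OpenAI-compatible /chat/completions route and a key
# exported as B_AI_API_KEY (illustrative name, set it yourself).
curl -s --max-time 15 https://api.b.ai/v1/chat/completions \
  -H "Authorization: Bearer $B_AI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-5-nano", "messages": [{"role": "user", "content": "ping"}]}' \
  || echo "request failed; check network, key, and base URL"
```

A JSON response with a choices field means the endpoint and key are fine, and any remaining problem is on the OpenClaw side.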


Step 6: Understand Gateway and Diagnostics

If you run into issues during setup, it is helpful to understand the Gateway and the built-in diagnostic commands.

What Is the Gateway?

The Gateway is the local service layer that powers OpenClaw’s runtime features. It manages model access, local services, and background processes.

Common Gateway commands:

  • Install the Gateway: openclaw gateway install
  • Start the Gateway: openclaw gateway start
  • Stop the Gateway: openclaw gateway stop
  • Restart the Gateway: openclaw gateway restart
  • Uninstall the Gateway: openclaw gateway uninstall
  • Check Gateway status: openclaw gateway status

Diagnostic Commands

After onboarding and configuration, run the following command to check your environment:

openclaw doctor

You can also check the Gateway status directly:

openclaw gateway status

If everything is working properly, the Gateway should show a healthy or running status.


Step 7: Launch OpenClaw

Once setup is complete, you can interact with your AI agent using either the web dashboard or the terminal interface.

Option 1: Web Dashboard

Start the dashboard:

openclaw dashboard

Then open the following address in your browser:

http://127.0.0.1:18789

From the dashboard, you can:

  • Chat with your AI
  • View conversation history
  • Configure models
  • Monitor system status
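If the browser cannot reach the dashboard, you can probe the port from a second terminal. `probe_url` is just a local helper around curl, and 18789 is the default port shown above; change it if you configured a different one.

```shell
# probe_url reports whether an HTTP endpoint answers within 5 seconds.
probe_url() {
  if curl -s --max-time 5 -o /dev/null "$1"; then
    echo "reachable: $1"
  else
    echo "not reachable: $1 (is 'openclaw dashboard' running?)"
  fi
}

probe_url http://127.0.0.1:18789
```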

Option 2: Terminal UI (TUI)

Start the terminal interface:

openclaw tui

Useful TUI commands:

  • /status: View the current system status
  • /session <key>: Switch to a specific chat session
  • /model <name>: Switch the current LLM
  • /help: View available commands

Step 8: Learn Essential Commands

1. Check Model Status

openclaw models status

2. Manage Channels

openclaw channels list

3. Search Memory

openclaw memory search "keyword"

4. View Documentation

openclaw docs

Done

You now have a working private AI agent powered by OpenClaw + B.AI.

You can now:

  • Build automation workflows
  • Connect Telegram bots
  • Extend capabilities with skills
  • Create your own AI agent product

Welcome to your personal AI infrastructure powered by B.AI.