OpenClaw - Self-hosted AI Assistant Platform
OpenClaw Tutorial — install OpenClaw, integrate New API, and quickly set up a self-hosted AI assistant. An open-source project supporting multi-channel integration with Telegram, Discord, WhatsApp, and more.
Project Introduction
OpenClaw is an open-source, self-hosted personal AI assistant platform that connects messaging apps to AI agents running on your own hardware. Designed for developers and advanced users, it allows you to have an autonomous AI assistant without giving up control of your data.
- Official Homepage: https://openclaw.ai
- Project Documentation: https://docs.openclaw.ai
- GitHub: https://github.com/openclaw/openclaw
OpenClaw is completely open-source. You can browse the source code, submit issues, or contribute on OpenClaw's GitHub repository. This tutorial covers the complete steps for installing, configuring, and integrating OpenClaw with New API.
🌟 Core Features
Multi-channel Integration
- Multi-channel Integration: Supports various messaging channels like Telegram, Discord, WhatsApp, iMessage, and can be extended to more platforms via plugins.
- Single Gateway: Unified management of all channels through a single Gateway process.
- Voice Support: Supports macOS/iOS/Android voice interaction.
- Canvas Interface: Can render interactive Canvas interfaces.
Self-hosting and Data Security
- Fully Self-hosted: Runs on your own machine or server.
- Open Source Transparency: MIT open-source license, completely transparent code.
- Local Data: Context and skills are stored on your local computer, not in the cloud.
Intelligent Agent Capabilities
- Continuous Operation: Supports persistent background operation with long-term memory.
- Scheduled Tasks: Supports cron scheduled tasks.
- Session Isolation: Isolates sessions by agent/workspace/sender.
- Multi-agent Routing: Supports collaborative work among multiple agents.
- Tool Calling: Native support for tool calling and code execution.
📦 Preparation Before Integration
Preparation Information
- Node.js 22 or higher
- An available New API address (usually ending with `/v1`)
- An available New API key
- Please use your own deployed New API instance, or confirm that your service provider has legitimate upstream authorization and meets compliance obligations for their New API service. Do not connect unknown API addresses or keys to a production environment.
Before integrating New API, it's recommended to first get the Gateway and Control UI running, following OpenClaw's current official recommended process. This makes it easier to tell later, when troubleshooting, whether OpenClaw itself is failing to start or the model provider configuration is wrong.
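Before wiring anything into OpenClaw, you can also sanity-check the New API endpoint directly. This is a minimal sketch with a placeholder domain and key (replace both with your own); the `/v1/models` route is the standard model-listing endpoint for OpenAI-compatible gateways:

```bash
# Placeholder values — substitute your own deployment (assumptions).
NEWAPI_BASE_URL="https://your-newapi-domain/v1"
NEWAPI_API_KEY="sk-your-newapi-key"

# An OpenAI-compatible gateway should answer with a JSON object whose
# "data" array lists the model IDs it exposes.
curl -sS "$NEWAPI_BASE_URL/models" \
  -H "Authorization: Bearer $NEWAPI_API_KEY"
```

If this returns a model list, the address and key are good. Note the exact `id` values in the response, since they must match the entries you later declare under the provider's `models`.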
1. Install OpenClaw (macOS/Linux)
```bash
curl -fsSL https://openclaw.ai/install.sh | bash
```

For other installation methods, refer to the OpenClaw official documentation: Getting Started.
2. Run the Onboarding Wizard
```bash
openclaw onboard --install-daemon
```

This wizard will complete basic authentication, Gateway setup, and optional channel initialization. The goal here is to get OpenClaw running first, then switch the default model to New API later.
3. Check Gateway and Control UI
```bash
openclaw gateway status
openclaw dashboard
```

If the browser can open the Control UI, OpenClaw's basic operation is normal. At this stage, there's no need to configure Telegram, Discord, Feishu, or other messaging channels yet.
4. Locate the Configuration File
OpenClaw's configuration file is usually located at `~/.openclaw/openclaw.json`. You can continue to modify it based on what the onboarding wizard generated.
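Before hand-editing, it's worth keeping a backup so a bad edit can be rolled back. A simple sketch, assuming the default config location:

```bash
# Back up the current config before editing (default path assumed).
cp ~/.openclaw/openclaw.json ~/.openclaw/openclaw.json.bak
```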
Path-related Environment Variables
If you run OpenClaw under a dedicated service account, or wish to customize the configuration/state directory, you can use:
- `OPENCLAW_HOME`
- `OPENCLAW_STATE_DIR`
- `OPENCLAW_CONFIG_PATH`
For detailed instructions, see the official environment variables documentation: Environment Variables.
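For example, a dedicated service account could keep all OpenClaw state under a single directory. The paths below are illustrative assumptions, not required values:

```bash
# Illustrative layout for a dedicated service account (assumed paths).
export OPENCLAW_HOME="/srv/openclaw"
export OPENCLAW_STATE_DIR="/srv/openclaw/state"
export OPENCLAW_CONFIG_PATH="/srv/openclaw/openclaw.json"
```

Set these in the service's environment (not just an interactive shell) so the Gateway process sees them when it starts.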
🚀 Using New API as a Model Provider
OpenClaw supports integrating custom or OpenAI-compatible model gateways via `models.providers`. For New API, the most common approach is to add it as a custom provider in the configuration, then point the default model at `newapi/<model-id>`.
Integration Approach
- Declare a `newapi` provider under `models.providers`.
- Point `baseUrl` to your New API address, ensuring it includes `/v1`.
- Set `api` to `openai-completions`.
- List the model IDs you want OpenClaw to use in `models`.
- Switch the default model in `agents.defaults.model.primary` to `newapi/...`.
Recommended Practice: Store API Key in Environment Variables
First, provide your New API key in the current shell, service environment, or an .env file readable by OpenClaw:
```bash
export NEWAPI_API_KEY="sk-your-newapi-key"
```

Then, add or modify the following snippet in `openclaw.json`:
```json5
{
  models: {
    mode: "merge",
    providers: {
      newapi: {
        baseUrl: "https://<your-newapi-domain>/v1",
        apiKey: "${NEWAPI_API_KEY}",
        api: "openai-completions",
        models: [
          { id: "gemini-2.5-flash", name: "Gemini 2.5 Flash" },
          { id: "kimi-k2.5", name: "Kimi K2.5" },
        ],
      },
    },
  },
  agents: {
    defaults: {
      model: {
        primary: "newapi/gemini-2.5-flash",
        fallbacks: ["newapi/kimi-k2.5"],
      },
      models: {
        "newapi/gemini-2.5-flash": { alias: "flash" },
        "newapi/kimi-k2.5": { alias: "kimi" },
      },
    },
  },
}
```

This is not a complete configuration that must be copied exactly, but rather the part most critical for integrating New API. As long as the provider, model IDs, and default model references are correct, OpenClaw will be able to call the model resources you expose via New API.
Key Configuration Details
| Configuration Item | Description |
|---|---|
| `models.mode` | Recommended to set to `merge`, which appends `newapi` while retaining OpenClaw's built-in providers |
| `models.providers.newapi.baseUrl` | Your New API address, which usually needs to include `/v1` |
| `models.providers.newapi.apiKey` | Your New API key; recommended to inject via `${NEWAPI_API_KEY}` |
| `models.providers.newapi.api` | For OpenAI-compatible gateways like New API, use `openai-completions` |
| `models.providers.newapi.models` | The model IDs listed here must match the model names actually exposed by your New API |
| `agents.defaults.model.primary` | Default primary model; the format must be `provider/model-id` |
| `agents.defaults.model.fallbacks` | List of fallback models; OpenClaw switches automatically if the primary model fails |
| `agents.defaults.models` | Optional; creates aliases for models, convenient for referencing in the UI or sessions |
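To confirm that a listed model ID really resolves on the New API side, you can call the chat completions endpoint directly, bypassing OpenClaw entirely. A sketch using a placeholder domain and the `gemini-2.5-flash` ID from the example config (both are assumptions; adjust to your deployment):

```bash
# Placeholder base URL — substitute your own gateway (assumption).
NEWAPI_BASE_URL="https://your-newapi-domain/v1"

# A minimal chat completion request; a JSON "choices" array in the
# response means the model ID is valid on the gateway.
curl -sS "$NEWAPI_BASE_URL/chat/completions" \
  -H "Authorization: Bearer $NEWAPI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gemini-2.5-flash",
    "messages": [{"role": "user", "content": "ping"}]
  }'
```

An error response here (unknown model, 401, etc.) points at the New API side rather than at OpenClaw's configuration.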
Verify Successful Integration
After completing the configuration, return to the Control UI or reopen it:
```bash
openclaw dashboard
```

If you can initiate conversations normally in OpenClaw and the default model has become `newapi/...`, the integration is successful. You can also run:

```bash
openclaw models list
```

to confirm that models with the `newapi/` prefix appear in the selectable list.
Common Issues
- `baseUrl` missing `/v1`: this is one of the most common integration errors.
- Incorrect model ID: `primary` and `fallbacks` must correspond to the `id` values in `models.providers.newapi.models`.
- API key only effective in the current terminal: if the Gateway runs as a background service, make sure the service process can also read `NEWAPI_API_KEY`.
- Foreground troubleshooting: run the Gateway in the foreground with `openclaw gateway --port 18789` to observe logs and errors.