
OpenClaw API Keys Guide

OpenClaw needs API access to AI models. You have two options: use bundled credits through OpenRouter (the PlugAndClaw default), or bring your own keys from Anthropic, OpenAI, or Google. Here's how each option works and when to use which.

How OpenClaw Connects to AI Models

OpenClaw is the infrastructure layer - the server, the memory system, the tool execution engine, the Telegram interface. It doesn't include AI models itself. Instead, it makes API calls to external model providers and passes the responses back to you.

This architecture means you need an API key to use OpenClaw. The key tells the model provider who you are, how to bill you, and what rate limits to apply. Without a valid API key, OpenClaw can run but can't generate any responses.

OpenClaw is model-agnostic. It can talk to any model that implements the OpenAI-compatible API format (which most major providers do). You can run Claude one day, switch to GPT the next, or set up different models for different tasks. The assistant's personality and memory are stored in your workspace files - they persist across model changes.
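As a sketch of what that compatibility means in practice, here is the standard OpenAI-style chat completion request body that such providers accept (the model identifier and messages are illustrative; exact model ID formats vary by provider):

```json
{
  "model": "anthropic/claude-sonnet-4-6",
  "messages": [
    { "role": "system", "content": "You are a helpful assistant." },
    { "role": "user", "content": "Summarize my notes from yesterday." }
  ]
}
```

Because every compatible provider accepts this same shape, switching models is a configuration change, not a code change.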

Two distinct approaches exist for providing this API access: using a centralized API gateway with pooled credits, or connecting directly to individual providers with your own keys. Each has tradeoffs worth understanding.

Option 1: Bundled Credits via OpenRouter (PlugAndClaw Default)

PlugAndClaw's $39.50/month plan includes $20 in AI credits delivered through OpenRouter. OpenRouter is an API gateway that provides unified access to models from Anthropic, OpenAI, Google, and dozens of other providers through a single API endpoint.

The practical benefit: you don't need accounts with individual AI providers. No Anthropic console, no OpenAI platform, no Google Cloud setup. You pay PlugAndClaw, we handle the OpenRouter integration, and your assistant has immediate access to all supported models.

With bundled credits, you can use Claude Opus 4.6, Claude Sonnet 4.6, GPT-5.2, Gemini 3 Flash, Kimi K2.5, Minimax M2.5, and many others. The model selector in OpenClaw lets you specify which model to use per conversation, so you can use the fast, cheap Gemini 3 Flash for quick lookups and Claude Opus 4.6 for complex reasoning - all from the same assistant.
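As a rough sketch of what the OpenRouter configuration looks like, here is an `openclaw.json` fragment mirroring the BYOK example shown later in this guide (the exact field names and key format are assumptions; on PlugAndClaw this is preconfigured for you):

```json
{
  "model": {
    "provider": "openrouter",
    "apiKey": "sk-or-...",
    "model": "google/gemini-3-flash"
  }
}
```

OpenRouter model IDs are namespaced by provider (e.g. `google/...`, `anthropic/...`), which is how one key can route to many vendors.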

Once you've used up the $20 in credits, you can top up through your PlugAndClaw account. Usage is tracked per model, so you can see exactly what each model costs and optimize accordingly. Most users find $20/month covers extensive daily assistant use - the credits go further than you'd expect because conversational AI is cheaper per token than most people assume.

This option is best for: most users, especially those who want multiple model access without the overhead of managing separate provider accounts.

Option 2: Bring Your Own Key (BYOK)

BYOK means configuring OpenClaw with API keys from your own accounts at Anthropic, OpenAI, Google, or other providers. Requests go directly from your server to the provider - no intermediary.

To configure BYOK on OpenClaw, open your `openclaw.json` and add your key to the model configuration:

`{ "model": { "provider": "anthropic", "apiKey": "sk-ant-...", "model": "claude-sonnet-4-6" } }`

For OpenAI, set `"provider": "openai"` and use your OpenAI key. For Google's Gemini models, use `"provider": "google"` and your Gemini API key. OpenClaw handles the provider-specific API format differences automatically.
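For illustration, the same configuration shape with the provider swapped might look like the following (key placeholders are truncated; the model IDs are the ones named elsewhere in this guide):

```json
{
  "model": {
    "provider": "openai",
    "apiKey": "sk-...",
    "model": "gpt-5.2"
  }
}
```

The Google variant is identical apart from `"provider": "google"`, a Gemini API key, and a Gemini model ID such as `gemini-3-flash`. Only the `model` block changes; the rest of your `openclaw.json` stays the same.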

BYOK has different billing implications. You pay your AI provider directly, at their published API rates, on your own account. For high-volume users, this can be cheaper or more expensive than bundled credits depending on which models you use and how heavily you use them. Anthropic's Claude Opus is expensive per token; Gemini 3 Flash is very cheap.

BYOK also changes the privacy picture. With bundled credits via OpenRouter, your requests pass through OpenRouter's infrastructure before reaching the model provider. With BYOK, the request goes from your PlugAndClaw server directly to Anthropic (or OpenAI, or Google) - one fewer company in the data path.

This option is best for: users with existing provider accounts, high-volume users who've run the math on direct API pricing, or users with strict requirements about which companies see their conversation data.

Privacy Implications of Each Approach

Your choice of API key and request routing determines which companies handle your conversation data, so privacy is worth thinking through carefully.

With PlugAndClaw's bundled credits: your conversations flow from your dedicated Hetzner VPS to OpenRouter's API gateway, then to the final model provider (Anthropic, OpenAI, etc.). Two companies' infrastructure handles your requests. OpenRouter's privacy policy covers their data handling; model providers' policies cover theirs. PlugAndClaw's server stores no conversation history beyond your workspace memory files.

With BYOK: your conversations flow from your dedicated Hetzner VPS directly to your chosen model provider. One fewer company in the path. If you're using your own Anthropic key, only Anthropic sees the API request content. This is the maximum privacy option available without running models locally.

With local models (advanced): OpenClaw supports locally-run models via Ollama. If you install Ollama on your PlugAndClaw server and configure OpenClaw to use it, no conversation content leaves your server at all. This trades response quality (local models are less capable than frontier models) for maximum privacy.
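Assuming the same configuration schema as the BYOK example above, pointing OpenClaw at a local Ollama instance might look like this (the `provider` and `baseUrl` field names are assumptions; `11434` is Ollama's default port, and no API key is needed for a local server):

```json
{
  "model": {
    "provider": "ollama",
    "baseUrl": "http://localhost:11434",
    "model": "llama3.1"
  }
}
```

With this setup, inference happens entirely on your VPS; the tradeoff is that the model must fit in your server's available RAM.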

For most users, the bundled credits option provides a good balance. OpenRouter's business model depends on being a trustworthy intermediary, and they have standard data handling practices. But if your use case involves confidential business information, medical details, or anything where minimizing data exposure is critical, BYOK or local models is the right choice.

Your PlugAndClaw server is under your control in either case. The VPS is yours - you can audit what's on it, what processes are running, and what network calls are being made.

Configuring API Keys on PlugAndClaw

PlugAndClaw comes pre-configured with bundled OpenRouter credits. You don't need to do anything to start using AI models - your assistant is ready from the moment provisioning completes.

To switch to BYOK: SSH into your server (connection details are in your dashboard) and edit the OpenClaw configuration file. Your assistant can also walk you through this: ask via Telegram 'How do I configure my own Anthropic API key?' and it will give you the exact steps for your server.

If you want to use your own key for specific conversations without changing the default: OpenClaw supports per-session model configuration. You can tell your assistant 'use my Anthropic key for this conversation' and it will switch for that session only.

For teams or business users who want to track AI usage separately: BYOK gives you full visibility in your provider's dashboard. You can see token counts per request, set spending limits, and generate usage reports - all through your provider's standard tooling.

The bundled $20/month credits are a starting point, not a ceiling. If you need more, you can top up or configure BYOK for unlimited usage without plan changes. The flexibility to switch between approaches - or combine them - is built into OpenClaw's architecture and fully supported on PlugAndClaw.

Frequently Asked Questions

What is BYOK in OpenClaw?

BYOK stands for Bring Your Own Key. In OpenClaw's context, it means using your own API key from Anthropic, OpenAI, Google, or another provider instead of going through an intermediary like OpenRouter. With BYOK, requests go directly from your server to the AI provider's API, billed to your own account.

What is OpenRouter and how does it relate to OpenClaw?

OpenRouter is an API gateway that provides unified access to dozens of AI models through a single API key and billing account. Instead of maintaining separate keys for Anthropic, OpenAI, and Google, you use one OpenRouter key and get access to all their models. PlugAndClaw uses OpenRouter to provide the $20/month bundled AI credits, giving you access to Claude, GPT, Gemini, and more without managing individual provider accounts.

Which AI models can I use with OpenClaw?

With PlugAndClaw's bundled credits via OpenRouter, you can access Claude Opus 4.6, Claude Sonnet 4.6, GPT-5.2, Gemini 3 Flash, Kimi K2.5, and Minimax M2.5, plus dozens of other models. With BYOK, you can use any model from the provider whose key you're using. OpenClaw lets you set a default model and switch models per-conversation.

Is BYOK more private than using OpenRouter credits?

With BYOK, your conversations go directly to your chosen AI provider - no intermediary sees the requests. With OpenRouter, your requests pass through OpenRouter's infrastructure before reaching the model provider. OpenRouter has a privacy policy that covers this, but if your privacy requirement is that only one company sees your data, BYOK is the right choice. PlugAndClaw's server itself never stores your conversation content.

Can I use both bundled credits and my own API key?

Yes. OpenClaw lets you configure a primary API source and override it for specific sessions. You might use the bundled credits for everyday tasks and switch to your own Anthropic key for sensitive conversations. This flexibility is available on PlugAndClaw through the openclaw.json configuration, which you can edit via SSH access to your dedicated server.

Your AI assistant. Live in under 1 minute.

Get Started with Included Credits

$39.50/month includes $20 AI credits - BYOK always supported - 7-day money-back guarantee