This guide covers the current Rust workspace under `rust/` and the `claw` CLI binary. If you are brand new, make the doctor health check your first run: start `claw`, then run `/doctor`.
Run this before prompts, sessions, or automation:

```bash
cd rust
cargo build --workspace
./target/debug/claw
# first command inside the REPL
/doctor
```

`/doctor` is the built-in setup and preflight diagnostic. Once you have a saved session, you can rerun it with `./target/debug/claw --resume latest /doctor`.
- Rust toolchain with `cargo`
- One of:
  - `ANTHROPIC_API_KEY` for direct API access
  - `claw login` for OAuth-based auth
- Optional: `ANTHROPIC_BASE_URL` when targeting a proxy or local service
```bash
cd rust
cargo build --workspace
```

The CLI binary is available at `rust/target/debug/claw` after a debug build. Make the doctor check above your first post-build step.
```bash
cd rust
./target/debug/claw
/doctor
```

```bash
cd rust
./target/debug/claw
```

```bash
cd rust
./target/debug/claw prompt "summarize this repository"
```

```bash
cd rust
./target/debug/claw "explain rust/crates/runtime/src/lib.rs"
```

```bash
cd rust
./target/debug/claw --output-format json prompt "status"
```

```bash
cd rust
./target/debug/claw --model sonnet prompt "review this diff"
./target/debug/claw --permission-mode read-only prompt "summarize Cargo.toml"
./target/debug/claw --permission-mode workspace-write prompt "update README.md"
./target/debug/claw --allowedTools read,glob "inspect the runtime crate"
```

Supported permission modes:

- `read-only`
- `workspace-write`
- `danger-full-access`
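For illustration, the three permission modes could be modeled as an enum. This is a hypothetical sketch, not `claw`'s actual internals; `PermissionMode` and its methods are invented names:

```rust
// Hypothetical sketch: parsing the three supported --permission-mode values.
#[derive(Debug, PartialEq)]
enum PermissionMode {
    ReadOnly,
    WorkspaceWrite,
    DangerFullAccess,
}

impl PermissionMode {
    fn parse(s: &str) -> Option<Self> {
        match s {
            "read-only" => Some(Self::ReadOnly),
            "workspace-write" => Some(Self::WorkspaceWrite),
            "danger-full-access" => Some(Self::DangerFullAccess),
            _ => None, // unknown modes are rejected rather than guessed
        }
    }

    /// Whether tools may write inside the workspace under this mode.
    fn allows_workspace_write(&self) -> bool {
        !matches!(self, Self::ReadOnly)
    }
}

fn main() {
    let mode = PermissionMode::parse("workspace-write").expect("known mode");
    assert!(mode.allows_workspace_write());
    assert_eq!(PermissionMode::parse("read-only"), Some(PermissionMode::ReadOnly));
    assert_eq!(PermissionMode::parse("yolo"), None);
    println!("ok");
}
```

Rejecting unknown mode strings (rather than silently defaulting) keeps a typo from accidentally granting broader access.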
Model aliases currently supported by the CLI:
- `opus` → `claude-opus-4-6`
- `sonnet` → `claude-sonnet-4-6`
- `haiku` → `claude-haiku-4-5-20251213`
```bash
export ANTHROPIC_API_KEY="sk-ant-..."
```

```bash
cd rust
./target/debug/claw login
./target/debug/claw logout
```

`claw` accepts two Anthropic credential env vars and they are not interchangeable: the HTTP header Anthropic expects differs per credential shape. Putting the wrong value in the wrong slot is the most common 401 we see.
| Credential shape | Env var | HTTP header | Typical source |
|---|---|---|---|
| `sk-ant-*` API key | `ANTHROPIC_API_KEY` | `x-api-key: sk-ant-...` | console.anthropic.com |
| OAuth access token (opaque) | `ANTHROPIC_AUTH_TOKEN` | `Authorization: Bearer ...` | `claw login` or an Anthropic-compatible proxy that mints Bearer tokens |
| OpenRouter key (`sk-or-v1-*`) | `OPENAI_API_KEY` + `OPENAI_BASE_URL=https://openrouter.ai/api/v1` | `Authorization: Bearer ...` | openrouter.ai/keys |
Why this matters: if you paste an `sk-ant-*` key into `ANTHROPIC_AUTH_TOKEN`, Anthropic's API will return `401 Invalid bearer token` because `sk-ant-*` keys are rejected over the Bearer header. The fix is a one-line env var swap: move the key to `ANTHROPIC_API_KEY`. Recent `claw` builds detect this exact shape (401 + `sk-ant-*` in the Bearer slot) and append a hint to the error message pointing at the fix.
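The shape check described above can be sketched as follows. This is a hedged illustration; `misplaced_key_hint` is an invented name and the real `claw` logic may differ:

```rust
// Hypothetical sketch of the misplaced-key detection: a 401 plus an
// sk-ant-* value in the Bearer slot means the key belongs in
// ANTHROPIC_API_KEY instead of ANTHROPIC_AUTH_TOKEN.
fn misplaced_key_hint(status: u16, bearer_token: Option<&str>) -> Option<&'static str> {
    match (status, bearer_token) {
        (401, Some(t)) if t.starts_with("sk-ant-") => {
            Some("hint: sk-ant-* keys are API keys; move this value from ANTHROPIC_AUTH_TOKEN to ANTHROPIC_API_KEY")
        }
        _ => None,
    }
}

fn main() {
    // Only the exact shape (401 AND sk-ant-* in the Bearer slot) triggers the hint.
    assert!(misplaced_key_hint(401, Some("sk-ant-abc123")).is_some());
    assert!(misplaced_key_hint(401, Some("opaque-oauth-token")).is_none());
    assert!(misplaced_key_hint(200, Some("sk-ant-abc123")).is_none());
    println!("ok");
}
```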
If you meant a different provider: if `claw` reports missing Anthropic credentials but you already have `OPENAI_API_KEY`, `XAI_API_KEY`, or `DASHSCOPE_API_KEY` exported, you most likely forgot to prefix the model name with the provider's routing prefix. Use `--model openai/gpt-4.1-mini` (OpenAI-compat / OpenRouter / Ollama), `--model grok` (xAI), or `--model qwen-plus` (DashScope) and the prefix router will select the right backend regardless of the ambient credentials. The error message now includes a hint that names the detected env var.
`claw` can talk to local servers and provider gateways through either Anthropic-compatible or OpenAI-compatible endpoints. Use `ANTHROPIC_BASE_URL` with `ANTHROPIC_AUTH_TOKEN` for Anthropic-compatible services, or `OPENAI_BASE_URL` with `OPENAI_API_KEY` for OpenAI-compatible services. OAuth is Anthropic-only, so when `OPENAI_BASE_URL` is set you should use API-key style auth instead of `claw login`.
```bash
export ANTHROPIC_BASE_URL="http://127.0.0.1:8080"
export ANTHROPIC_AUTH_TOKEN="local-dev-token"
cd rust
./target/debug/claw --model "claude-sonnet-4-6" prompt "reply with the word ready"
```

```bash
export OPENAI_BASE_URL="http://127.0.0.1:8000/v1"
export OPENAI_API_KEY="local-dev-token"
cd rust
./target/debug/claw --model "qwen2.5-coder" prompt "reply with the word ready"
```

```bash
export OPENAI_BASE_URL="http://127.0.0.1:11434/v1"
unset OPENAI_API_KEY
cd rust
./target/debug/claw --model "llama3.2" prompt "summarize this repository in one sentence"
```

```bash
export OPENAI_BASE_URL="https://openrouter.ai/api/v1"
export OPENAI_API_KEY="sk-or-v1-..."
cd rust
./target/debug/claw --model "openai/gpt-4.1-mini" prompt "summarize this repository in one sentence"
```

For Qwen models via Alibaba's native DashScope API (higher rate limits than OpenRouter):

```bash
export DASHSCOPE_API_KEY="sk-..."
cd rust
./target/debug/claw --model "qwen/qwen-max" prompt "hello"
# or bare:
./target/debug/claw --model "qwen-plus" prompt "hello"
```

Model names starting with `qwen/` or `qwen-` are automatically routed to the DashScope compatible-mode endpoint (`https://dashscope.aliyuncs.com/compatible-mode/v1`). You do not need to set `OPENAI_BASE_URL` or unset `ANTHROPIC_API_KEY`: the model prefix wins over the ambient credential sniffer.

Reasoning variants (`qwen-qwq-*`, `qwq-*`, `*-thinking`) automatically strip `temperature`/`top_p`/`frequency_penalty`/`presence_penalty` before the request hits the wire (these params are rejected by reasoning models).
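A minimal sketch of that stripping rule, assuming the patterns listed above are the whole matcher (the real implementation may differ; function names here are invented):

```rust
use std::collections::BTreeMap;

// Detect reasoning variants by the name patterns described in the text.
fn is_reasoning_model(name: &str) -> bool {
    name.starts_with("qwen-qwq-") || name.starts_with("qwq-") || name.ends_with("-thinking")
}

// Drop the sampling parameters that reasoning models reject.
fn strip_sampling_params(model: &str, params: &mut BTreeMap<String, f64>) {
    if is_reasoning_model(model) {
        for key in ["temperature", "top_p", "frequency_penalty", "presence_penalty"] {
            params.remove(key);
        }
    }
}

fn main() {
    let mut params = BTreeMap::from([
        ("temperature".to_string(), 0.7),
        ("top_p".to_string(), 0.9),
        ("max_tokens".to_string(), 1024.0),
    ]);
    strip_sampling_params("qwq-32b", &mut params);
    // Only max_tokens survives for a reasoning model.
    assert_eq!(params.len(), 1);
    assert!(params.contains_key("max_tokens"));
    assert!(is_reasoning_model("qwen-qwq-32b"));
    assert!(!is_reasoning_model("qwen-plus"));
    println!("ok");
}
```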
claw has three built-in provider backends. The provider is selected automatically based on the model name, falling back to whichever credential is present in the environment.
| Provider | Protocol | Auth env var(s) | Base URL env var | Default base URL |
|---|---|---|---|---|
| Anthropic (direct) | Anthropic Messages API | `ANTHROPIC_API_KEY` or `ANTHROPIC_AUTH_TOKEN` or OAuth (`claw login`) | `ANTHROPIC_BASE_URL` | `https://api.anthropic.com` |
| xAI | OpenAI-compatible | `XAI_API_KEY` | `XAI_BASE_URL` | `https://api.x.ai/v1` |
| OpenAI-compatible | OpenAI Chat Completions | `OPENAI_API_KEY` | `OPENAI_BASE_URL` | `https://api.openai.com/v1` |
| DashScope (Alibaba) | OpenAI-compatible | `DASHSCOPE_API_KEY` | `DASHSCOPE_BASE_URL` | `https://dashscope.aliyuncs.com/compatible-mode/v1` |
The OpenAI-compatible backend also serves as the gateway for OpenRouter, Ollama, and any other service that speaks the OpenAI `/v1/chat/completions` wire format — just point `OPENAI_BASE_URL` at the service.
Model-name prefix routing: if a model name starts with `openai/`, `gpt-`, `qwen/`, or `qwen-`, the provider is selected by the prefix regardless of which env vars are set. This prevents accidental misrouting to Anthropic when multiple credentials exist in the environment.
These are the models registered in the built-in alias table with known token limits:
| Alias | Resolved model name | Provider | Max output tokens | Context window |
|---|---|---|---|---|
| `opus` | `claude-opus-4-6` | Anthropic | 32 000 | 200 000 |
| `sonnet` | `claude-sonnet-4-6` | Anthropic | 64 000 | 200 000 |
| `haiku` | `claude-haiku-4-5-20251213` | Anthropic | 64 000 | 200 000 |
| `grok` / `grok-3` | `grok-3` | xAI | 64 000 | 131 072 |
| `grok-mini` / `grok-3-mini` | `grok-3-mini` | xAI | 64 000 | 131 072 |
| `grok-2` | `grok-2` | xAI | — | — |
Any model name that does not match an alias is passed through verbatim. This is how you use OpenRouter model slugs (`openai/gpt-4.1-mini`), Ollama tags (`llama3.2`), or full Anthropic model IDs (`claude-sonnet-4-20250514`).
You can add custom aliases in any settings file (`~/.claw/settings.json`, `.claw/settings.json`, or `.claw/settings.local.json`):

```json
{
  "aliases": {
    "fast": "claude-haiku-4-5-20251213",
    "smart": "claude-opus-4-6",
    "cheap": "grok-3-mini"
  }
}
```

Local project settings override user-level settings. Aliases resolve through the built-in table, so `"fast": "haiku"` also works.
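Alias chaining can be sketched like this (illustrative only; `resolve_model` is an invented helper, and the hop limit is an assumption to guard against cycles):

```rust
use std::collections::HashMap;

// Follow an alias chain until a non-alias name is reached; unknown names
// pass through verbatim, matching the behaviour described above.
fn resolve_model(aliases: &HashMap<&str, &str>, name: &str) -> String {
    let mut current = name;
    // Small hop limit as a cycle guard (assumption, not documented behaviour).
    for _ in 0..8 {
        match aliases.get(current) {
            Some(next) => current = next,
            None => break,
        }
    }
    current.to_string()
}

fn main() {
    let mut aliases = HashMap::new();
    // Built-in table (subset).
    aliases.insert("haiku", "claude-haiku-4-5-20251213");
    // User settings: "fast" resolves through the built-in table.
    aliases.insert("fast", "haiku");

    assert_eq!(resolve_model(&aliases, "fast"), "claude-haiku-4-5-20251213");
    // Unknown names are passed through verbatim.
    assert_eq!(resolve_model(&aliases, "llama3.2"), "llama3.2");
    println!("ok");
}
```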
- If the resolved model name starts with `claude` → Anthropic.
- If it starts with `grok` → xAI.
- Otherwise, `claw` checks which credential is set: `ANTHROPIC_API_KEY`/`ANTHROPIC_AUTH_TOKEN` first, then `OPENAI_API_KEY`, then `XAI_API_KEY`.
- If nothing matches, it defaults to Anthropic.
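Putting the prefix rules and the credential fallback together, the selection order might look like this sketch (invented names; not `claw`'s actual code):

```rust
// Helper standing in for "is this env var set?"; in real code this would
// consult std::env::var.
fn env(set_vars: &[&str], var: &str) -> bool {
    set_vars.contains(&var)
}

fn select_provider(model: &str, set_vars: &[&str]) -> &'static str {
    // Explicit prefixes win over ambient credentials.
    if model.starts_with("openai/") || model.starts_with("gpt-") {
        return "openai-compatible";
    }
    if model.starts_with("qwen/") || model.starts_with("qwen-") {
        return "dashscope";
    }
    if model.starts_with("claude") {
        return "anthropic";
    }
    if model.starts_with("grok") {
        return "xai";
    }
    // Otherwise fall back to whichever credential is present.
    if env(set_vars, "ANTHROPIC_API_KEY") || env(set_vars, "ANTHROPIC_AUTH_TOKEN") {
        "anthropic"
    } else if env(set_vars, "OPENAI_API_KEY") {
        "openai-compatible"
    } else if env(set_vars, "XAI_API_KEY") {
        "xai"
    } else {
        "anthropic" // default when nothing matches
    }
}

fn main() {
    // Prefix wins even though an Anthropic key is present.
    assert_eq!(select_provider("qwen-plus", &["ANTHROPIC_API_KEY"]), "dashscope");
    assert_eq!(select_provider("claude-sonnet-4-6", &[]), "anthropic");
    assert_eq!(select_provider("llama3.2", &["OPENAI_API_KEY"]), "openai-compatible");
    assert_eq!(select_provider("llama3.2", &[]), "anthropic");
    println!("ok");
}
```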
The name "codex" appears in the Claw Code ecosystem but it does not refer to OpenAI Codex (the code-generation model). Here is what it means in this project:
- `oh-my-codex` (OmX) is the workflow and plugin layer that sits on top of `claw`. It provides planning modes, parallel multi-agent execution, notification routing, and other automation features. See PHILOSOPHY.md and the oh-my-codex repo.
- `.codex/` directories (e.g. `.codex/skills`, `.codex/agents`, `.codex/commands`) are legacy lookup paths that `claw` still scans alongside the primary `.claw/` directories.
- `CODEX_HOME` is an optional environment variable that points to a custom root for user-level skill and command lookups.
`claw` does not support OpenAI Codex sessions, the Codex CLI, or Codex session import/export. If you need to use OpenAI models (like GPT-4.1), configure the OpenAI-compatible provider as shown above in the OpenAI-compatible endpoint and OpenRouter sections.
`claw` honours the standard `HTTP_PROXY`, `HTTPS_PROXY`, and `NO_PROXY` environment variables (both upper- and lower-case spellings are accepted) when issuing outbound requests to Anthropic, OpenAI-, and xAI-compatible endpoints. Set them before launching the CLI and the underlying `reqwest` client will be configured automatically.
```bash
export HTTPS_PROXY="http://proxy.corp.example:3128"
export HTTP_PROXY="http://proxy.corp.example:3128"
export NO_PROXY="localhost,127.0.0.1,.corp.example"
cd rust
./target/debug/claw prompt "hello via the corporate proxy"
```

As an alternative to per-scheme environment variables, the `ProxyConfig` type exposes a `proxy_url` field that acts as a single catch-all proxy for both HTTP and HTTPS traffic. When `proxy_url` is set it takes precedence over the separate `http_proxy` and `https_proxy` fields.
```rust
use api::{build_http_client_with, ProxyConfig};

// From a single unified URL (config file, CLI flag, etc.)
let config = ProxyConfig::from_proxy_url("http://proxy.corp.example:3128");
let client = build_http_client_with(&config).expect("proxy client");

// Or set the field directly alongside NO_PROXY
let config = ProxyConfig {
    proxy_url: Some("http://proxy.corp.example:3128".to_string()),
    no_proxy: Some("localhost,127.0.0.1".to_string()),
    ..ProxyConfig::default()
};
let client = build_http_client_with(&config).expect("proxy client");
```

- When both `HTTPS_PROXY` and `HTTP_PROXY` are set, the secure proxy applies to `https://` URLs and the plain proxy applies to `http://` URLs.
- `proxy_url` is a unified alternative: when set, it applies to both `http://` and `https://` destinations, overriding the per-scheme fields.
- `NO_PROXY` accepts a comma-separated list of host suffixes (for example `.corp.example`) and IP literals.
- Empty values are treated as unset, so leaving `HTTPS_PROXY=""` in your shell will not enable a proxy.
- If a proxy URL cannot be parsed, `claw` falls back to a direct (no-proxy) client so existing workflows keep working; double-check the URL if you expected the request to be tunnelled.
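`NO_PROXY` suffix matching can be sketched as follows. This is a hedged approximation of the rules above; `bypass_proxy` is an invented helper and `reqwest`'s exact matching algorithm may differ:

```rust
// Return true when the host should skip the proxy: exact matches cover
// hostnames and IP literals, leading-dot entries match as domain suffixes.
fn bypass_proxy(no_proxy: &str, host: &str) -> bool {
    no_proxy
        .split(',')
        .map(str::trim)
        .filter(|entry| !entry.is_empty()) // empty values are treated as unset
        .any(|entry| host == entry || (entry.starts_with('.') && host.ends_with(entry)))
}

fn main() {
    let no_proxy = "localhost,127.0.0.1,.corp.example";
    assert!(bypass_proxy(no_proxy, "localhost"));
    assert!(bypass_proxy(no_proxy, "127.0.0.1"));          // IP literal
    assert!(bypass_proxy(no_proxy, "git.corp.example"));   // host suffix
    assert!(!bypass_proxy(no_proxy, "api.anthropic.com")); // goes via proxy
    println!("ok");
}
```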
```bash
cd rust
./target/debug/claw status
./target/debug/claw sandbox
./target/debug/claw agents
./target/debug/claw mcp
./target/debug/claw skills
./target/debug/claw system-prompt --cwd .. --date 2026-04-04
```

REPL turns are persisted under `.claw/sessions/` in the current workspace.
```bash
cd rust
./target/debug/claw --resume latest
./target/debug/claw --resume latest /status /diff
```

Useful interactive commands include `/help`, `/status`, `/cost`, `/config`, `/session`, `/model`, `/permissions`, and `/export`.
Runtime config is loaded in this order, with later entries overriding earlier ones:
1. `~/.claw.json`
2. `~/.config/claw/settings.json`
3. `<repo>/.claw.json`
4. `<repo>/.claw/settings.json`
5. `<repo>/.claw/settings.local.json`
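The later-overrides-earlier rule can be sketched as a simple layered merge. This is illustrative only: `claw`'s real config loader handles full JSON documents, not flat string maps, and `merge_layers` is an invented name:

```rust
use std::collections::HashMap;

// Apply layers in load order; later inserts overwrite earlier keys.
fn merge_layers(layers: &[HashMap<&str, &str>]) -> HashMap<String, String> {
    let mut merged = HashMap::new();
    for layer in layers {
        for (k, v) in layer {
            merged.insert(k.to_string(), v.to_string());
        }
    }
    merged
}

fn main() {
    // Hypothetical keys: a user-level file loaded first, a repo-local file last.
    let user = HashMap::from([("model", "sonnet"), ("theme", "dark")]);
    let repo_local = HashMap::from([("model", "haiku")]);

    let merged = merge_layers(&[user, repo_local]);
    assert_eq!(merged.get("model").map(String::as_str), Some("haiku")); // local wins
    assert_eq!(merged.get("theme").map(String::as_str), Some("dark")); // inherited
    println!("ok");
}
```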
The workspace includes a deterministic Anthropic-compatible mock service and parity harness.
```bash
cd rust
./scripts/run_mock_parity_harness.sh
```

Manual mock service startup:

```bash
cd rust
cargo run -p mock-anthropic-service -- --bind 127.0.0.1:0
```

```bash
cd rust
cargo test --workspace
```

Current Rust crates:

- `api`
- `commands`
- `compat-harness`
- `mock-anthropic-service`
- `plugins`
- `runtime`
- `rusty-claude-cli`
- `telemetry`
- `tools`