Claude Code is Anthropic's official agentic coding tool. By setting a custom API base URL, you can use LLM Resayil's models as the backend for all Claude Code completions and conversations — the same tool, pointed at your own credits. This guide covers installation, environment configuration, persistent shell setup, and troubleshooting for both bash and zsh users. For background on authentication, see the Authentication guide.
Confirm Node.js is installed with node --version, then install Claude Code globally:
npm install -g @anthropic-ai/claude-code
Claude Code reads ANTHROPIC_BASE_URL to redirect all requests.
Set it alongside your key:
export ANTHROPIC_BASE_URL=https://llmapi.resayil.io/v1
export ANTHROPIC_API_KEY=YOUR_LLM_RESAYIL_API_KEY
Why llmapi.resayil.io? This domain bypasses Cloudflare's
100-second proxy timeout, which matters for long agentic tasks. Both
llm.resayil.io and llmapi.resayil.io share the
same API keys and credit balance.
Run claude in any project directory. It will connect to LLM Resayil
automatically and consume credits from your account.
claude
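Before launching, it can help to fail fast if either variable is missing from the current shell. A minimal sketch (the key value is a placeholder):

```shell
export ANTHROPIC_BASE_URL=https://llmapi.resayil.io/v1
export ANTHROPIC_API_KEY=YOUR_LLM_RESAYIL_API_KEY   # placeholder value

# The ${VAR:?message} expansion aborts with the message if the variable
# is unset or empty, catching a missing export before Claude Code does.
: "${ANTHROPIC_BASE_URL:?set ANTHROPIC_BASE_URL first}"
: "${ANTHROPIC_API_KEY:?set ANTHROPIC_API_KEY first}"
echo "environment ok, run: claude"
```

If either check fires, re-run the exports above before starting Claude Code.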
Add both exports to your shell profile so they persist across terminal sessions:
# LLM Resayil — Claude Code
export ANTHROPIC_BASE_URL=https://llmapi.resayil.io/v1
export ANTHROPIC_API_KEY=YOUR_LLM_RESAYIL_API_KEY
Reload with source ~/.bashrc or source ~/.zshrc.
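If you prefer to script the profile change, the following sketch appends the exports idempotently. It writes to a stand-in file so you can inspect the result first; point PROFILE at your real ~/.bashrc or ~/.zshrc (and substitute your real key) when you are ready:

```shell
# Stand-in for ~/.bashrc or ~/.zshrc; change once you trust the output.
PROFILE=profile_demo.sh

# grep -q guards against appending the same block twice on re-runs.
if ! grep -q 'ANTHROPIC_BASE_URL' "$PROFILE" 2>/dev/null; then
  cat >> "$PROFILE" <<'EOF'
# LLM Resayil - Claude Code
export ANTHROPIC_BASE_URL=https://llmapi.resayil.io/v1
export ANTHROPIC_API_KEY=YOUR_LLM_RESAYIL_API_KEY
EOF
fi
```

Running the script a second time leaves the profile unchanged, so it is safe to include in a dotfiles setup step.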
Claude Code sends requests in Anthropic's native message format to
ANTHROPIC_BASE_URL/v1/messages. LLM Resayil exposes a
/v1/messages endpoint
that accepts the same format — system prompts, multi-turn conversations, tool calls,
and streaming are all supported natively. Credits are deducted per token, the same as
any other API call through LLM Resayil.
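For reference, this is roughly what a raw request to the endpoint looks like in Anthropic's native format. The sketch only prints the curl command rather than executing it, so no key or network is needed; the model name and prompt are illustrative:

```shell
BASE_URL=https://llmapi.resayil.io/v1

# Anthropic-native payload: a top-level system prompt plus a messages array.
BODY=$(cat <<'EOF'
{
  "model": "qwen3:14b",
  "max_tokens": 256,
  "system": "You are a concise assistant.",
  "messages": [{"role": "user", "content": "Say hello"}]
}
EOF
)

# Printed, not executed; paste it with a real key to try it yourself.
echo "curl $BASE_URL/messages \\"
echo "  -H 'x-api-key: YOUR_LLM_RESAYIL_API_KEY' \\"
echo "  -H 'anthropic-version: 2023-06-01' \\"
echo "  -H 'content-type: application/json' \\"
echo "  -d '$BODY'"
```

The x-api-key and anthropic-version headers follow Anthropic's Messages API convention, which is what Claude Code itself sends.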
The model Claude Code uses is determined by its own defaults or your
ANTHROPIC_MODEL environment variable. LLM Resayil maps unknown Anthropic
model names to the most capable available model automatically — or you can set an
explicit model name from the
models list.
Claude Code is the only integration that uses the /v1/messages
endpoint. All other integrations (Python SDK, Cline, Open WebUI, etc.) use
/v1/chat/completions and the standard OPENAI_BASE_URL
variable.
Once environment variables are set, Claude Code works identically to its default mode. Any task that would normally hit Anthropic's API now routes through LLM Resayil:
claude "Refactor this function to use async/await and add error handling"
You can also run Claude Code in interactive mode with claude and use it
exactly as documented in
Anthropic's Claude Code docs.
The only difference is where the requests go.
If authentication errors appear, Claude Code is likely not sending your LLM Resayil key. Verify ANTHROPIC_API_KEY
is set in the current shell: echo $ANTHROPIC_API_KEY. If it prints blank, re-export
it or add it to your shell profile (Step 3 above).
If requests fail with a 404, check that ANTHROPIC_BASE_URL is exactly
https://llmapi.resayil.io/v1 with no trailing slash.
Run echo $ANTHROPIC_BASE_URL to confirm; a trailing slash causes a 404 on
some Claude Code versions.
A slow first response is normal: the first request after a period of inactivity warms up the model on the GPU server. Subsequent requests in the same session are fast; this is expected behaviour.
LLM Resayil auto-selects a capable model when it receives an Anthropic model name it
does not recognise. To pin a specific model, set
ANTHROPIC_MODEL=qwen3:14b (or any name from the
/docs/models page)
before launching Claude Code.
Claude Code returns a 402 error when your balance reaches zero. Top up at Dashboard → Billing.