Learn how to integrate LLM Resayil into your applications. Our API is OpenAI-compatible, supports 45+ models, and offers pay-per-token pricing. Get started in minutes with clear documentation and code examples.
| URL | Description |
|---|---|
| `https://llmapi.resayil.io/v1/` (Preferred) | Primary shorthand — OpenAI-compatible |
| `https://llmapi.resayil.io/v1/` (New) | Alternate domain — works identically |
| `https://llmapi.resayil.io/v1/` | Full path — legacy compatibility |
All three URLs work correctly. We recommend https://llmapi.resayil.io/v1/ for the shortest OpenAI-compatible base path.
```bash
curl -X POST https://llmapi.resayil.io/v1/chat/completions \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "mistral",
    "messages": [{"role": "user", "content": "Hello!"}],
    "max_tokens": 100
  }'
```
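The same request can be issued from Python with only the standard library. This is a minimal sketch assuming the base URL above; `YOUR_API_KEY` is a placeholder for your real key:

```python
import json
import urllib.request

BASE_URL = "https://llmapi.resayil.io/v1"

def build_chat_request(api_key, model, messages, max_tokens=100):
    """Build an OpenAI-compatible chat completion request (not yet sent)."""
    body = json.dumps({
        "model": model,
        "messages": messages,
        "max_tokens": max_tokens,
    }).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("YOUR_API_KEY", "mistral",
                         [{"role": "user", "content": "Hello!"}])
# with urllib.request.urlopen(req) as resp:       # sends the request
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Any OpenAI-compatible client (for example the official OpenAI SDK pointed at this base URL) works the same way.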
You can also use https://llmapi.resayil.io/v1/chat/completions or /api/v1/chat/completions — both resolve identically.
Learn the basics. Discover how to register, obtain your API key, and make your first request to the LLM Resayil API.
Understand API key authentication, Bearer token format, and how to securely manage your credentials.
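One common pattern for keeping the key out of source code is to load it from an environment variable. A minimal sketch; the variable name `LLM_RESAYIL_API_KEY` is illustrative, not an official convention:

```python
import os

def load_api_key(env_var="LLM_RESAYIL_API_KEY"):
    """Read the API key from the environment so it never lives in source code."""
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(f"Set {env_var} before calling the API")
    return key

def auth_headers(api_key):
    """Bearer token format expected by the API."""
    return {"Authorization": f"Bearer {api_key}"}
```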
Explore our 45+ available models including Mistral, Llama 2, Neural Chat, and more. Learn their capabilities.
Learn how our credit system works, token consumption rates, and how to manage your account balance.
Understand request limits, quota resets, and best practices for handling rate limit responses.
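A common way to handle rate-limit responses is exponential backoff on HTTP 429. This is a general-practice sketch, not an official client; `send` stands for any function that returns a status code and payload:

```python
import time

def with_backoff(send, max_retries=5, base_delay=1.0, sleep=time.sleep):
    """Retry a request function while it returns HTTP 429, doubling the wait."""
    for attempt in range(max_retries):
        status, payload = send()
        if status != 429:
            return status, payload
        sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...
    return status, payload
```

In production you would also honor a `Retry-After` header if the API returns one.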
Reference guide for common errors, status codes, and how to debug and resolve API issues.
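As a debugging aid, a status-code triage helper can route each class of response to an action. The codes below are standard HTTP; the suggested actions (including treating 402 as an out-of-credits signal) are general practice and assumptions, not text from this guide:

```python
# Server-side and rate-limit statuses that are usually safe to retry.
RETRYABLE = {429, 500, 502, 503, 504}

def classify(status):
    """Map an HTTP status code to a suggested next step."""
    if 200 <= status < 300:
        return "ok"
    if status == 401:
        return "check API key"          # bad or missing Bearer token
    if status == 402:
        return "top up credits"         # assumption: payment-required style code
    if status in RETRYABLE:
        return "retry with backoff"
    return "inspect response body"
```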
Track your API usage, token consumption, and request history with detailed analytics and reporting.
How to purchase additional credits, manage your balance, and understand available payment methods.
Complete guide to how credits work, pricing tiers, and per-model token consumption rates.
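The arithmetic behind per-model token rates can be sketched as follows. The rates in the example are hypothetical placeholders, not LLM Resayil's actual pricing; consult the pricing guide for real figures:

```python
def estimate_credits(prompt_tokens, completion_tokens,
                     rate_per_1k_prompt, rate_per_1k_completion):
    """Estimate credit cost given per-1K-token rates (rates are inputs, not pricing)."""
    return (prompt_tokens / 1000) * rate_per_1k_prompt \
         + (completion_tokens / 1000) * rate_per_1k_completion

# e.g. 500 prompt + 200 completion tokens at hypothetical
# rates of 0.5 / 1.0 credits per 1K tokens:
cost = estimate_credits(500, 200, 0.5, 1.0)
```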
Use the Anthropic Messages API format with the same API keys and credit system. Supports streaming and tool calling.
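A Messages-format request body differs from the OpenAI format mainly in that `max_tokens` is required and the system prompt is a top-level field rather than a message. A sketch of the body only (the exact endpoint path on this service is not stated here, so none is assumed):

```python
import json

def build_messages_body(model, user_text, max_tokens=100, system=None):
    """Build an Anthropic Messages API request body: max_tokens is
    mandatory, and the system prompt is a top-level field."""
    body = {
        "model": model,
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": user_text}],
    }
    if system:
        body["system"] = system
    return json.dumps(body)
```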
Connect LLM Resayil with n8n workflows. Covers setup, streaming, timeouts, error handling, and best practices.
Connect LLM Resayil to Claude Code, n8n, Open WebUI, Python SDK, and more.