One API to rule them all.
Unify OpenAI, Anthropic, Google, DeepSeek, and more behind a single endpoint.
A dead-simple, single-service gateway that turns dozens of LLM providers into one clean API.
Run it once, forget about provider differences. Keys, models, endpoints — all abstracted.
Why? Because juggling multiple SDKs and API formats is a pain.
This just works.
```sh
# 1️⃣ Clone & install
git clone https://github.com/shenald-dev/one-api
cd one-api
npm install

# 2️⃣ Configure (copy .env.example → .env and add your keys)
cp .env.example .env
# Edit .env with your provider API keys

# 3️⃣ Run
npm start

# 4️⃣ Use
curl -H "Content-Type: application/json" \
  -d '{"model":"gpt-4","messages":[{"role":"user","content":"Hello!"}]}' \
  http://localhost:3000/v1/chat/completions
```

That's it. Your app now talks to any LLM provider through the same `/v1` OpenAI-compatible interface.
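The same request can be built from code. Here is a minimal Python sketch using only the standard library; it mirrors the curl call above and nothing needs to be running just to construct the request (the port and path match one-api's defaults):

```python
import json
import urllib.request

GATEWAY = "http://localhost:3000"  # one-api's default port

def build_chat_request(model: str, user_message: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completions request aimed at the gateway."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }
    return urllib.request.Request(
        f"{GATEWAY}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("gpt-4", "Hello!")
# Once the gateway is up, send it with urllib.request.urlopen(req).
```
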
- OpenAI
- Anthropic Claude
- Google Gemini
- Azure OpenAI
- DeepSeek
- Groq
- Together AI
- Mistral AI
- Local inference (Ollama, LM Studio)
- +20 more...
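Behind one endpoint, picking a provider comes down to the `model` string in the request. A hypothetical prefix-to-provider table sketches the idea (one-api's real routing rules and model identifiers may differ):

```python
# Hypothetical prefix-to-provider routing table, illustrating how one
# front door can fan out to many backends. Prefixes here are examples,
# not one-api's actual configuration.
ROUTES = {
    "gpt-": "openai",
    "claude-": "anthropic",
    "gemini-": "google",
    "deepseek-": "deepseek",
}

def pick_provider(model: str) -> str:
    """Map a model name to a backend provider by prefix."""
    for prefix, provider in ROUTES.items():
        if model.startswith(prefix):
            return provider
    return "openai"  # fallback default

print(pick_provider("claude-3-5-sonnet"))  # → anthropic
```
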
```sh
docker run -d \
  -p 3000:3000 \
  -v $(pwd)/data:/app/data \
  -e OPENAI_API_KEY=sk-... \
  -e ANTHROPIC_API_KEY=sk-... \
  shenald/one-api:latest
```

Single command, up forever.
Environment variables in `.env`:

```sh
# Port
PORT=3000

# Add any provider keys you need
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-...
GOOGLE_API_KEY=...
DEEPSEEK_API_KEY=...
# ... see .env.example for the full list
```

No database, no migrations. JSON file storage. Just run.
- Zero config — starts, works, stays
- Single service — just Node: `npm i && npm start`, or Docker
- OpenAI-compatible — your existing code works unchanged
- Lightweight — no admin UI bloat, just the API
- Free & open — MIT license, no SaaS tier
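Compatibility cuts both ways: responses come back in the OpenAI chat-completions shape no matter which provider answered, so the extraction code never changes. A sketch against a canned response body (field names follow the OpenAI format):

```python
import json

def extract_reply(body: str) -> str:
    """Pull the assistant message out of an OpenAI-format response body."""
    return json.loads(body)["choices"][0]["message"]["content"]

# Canned example of the shape every provider's answer is normalized to.
canned = json.dumps({
    "choices": [{"message": {"role": "assistant", "content": "Hi there!"}}]
})
print(extract_reply(canned))  # → Hi there!
```
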
```sh
# Install deps
npm install

# Run dev with hot reload
npm run dev

# Test
npm test

# Build Docker image
docker build -t shenald/one-api .
```

MIT — do whatever you want.
Built by a vibe coder who got tired of rewriting integrations.
If it's useful, star it ⭐ — if not, open an issue and tell me why.
Keep it simple. 🧘