# llm-proxy

Here are 113 public repositories matching this topic...

Open-source LLM router & AI cost optimizer. Routes simple prompts to cheap/local models, complex ones to premium — automatically. Drop-in OpenAI-compatible proxy for Claude Code, Codex, Cursor, OpenClaw. Saves 40-70% on AI API costs. Self-hosted, no middleman.

  • Updated Apr 16, 2026
  • Python
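The cost-based routing this project describes can be sketched with a simple heuristic: send short, simple prompts to a cheap or local model and longer or code-heavy ones to a premium model. The thresholds, marker strings, and model names below are illustrative assumptions, not the project's actual routing logic.

```python
# Hypothetical model identifiers -- stand-ins for a local and a premium backend.
CHEAP_MODEL = "local-llama"
PREMIUM_MODEL = "gpt-premium"


def route(prompt: str) -> str:
    """Pick a model using a crude complexity heuristic (assumed, not the
    project's real policy): long prompts or prompts containing markers of
    complex work go to the premium model, everything else stays cheap/local."""
    complex_markers = ("```", "step by step", "prove", "refactor")
    is_long = len(prompt.split()) > 200
    looks_complex = any(m in prompt.lower() for m in complex_markers)
    return PREMIUM_MODEL if (is_long or looks_complex) else CHEAP_MODEL
```

A real router would also weigh per-model pricing and latency, but the shape of the decision is the same: classify the prompt, then pick the cheapest backend that can handle it.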

Zero-trust LLM gateway. OpenAI-compatible proxy with semantic routing and load balancing across OpenAI, Anthropic, Ollama, vLLM, and any compatible backend. Identity-based access, virtual API keys, and end-to-end encryption via OpenZiti.

  • Updated Apr 16, 2026
  • Go

Self-hosted LLM gateway that routes requests across AI providers (OpenAI, Anthropic, Gemini, Mistral, Ollama) using intelligent multi-policy scoring — including an LLM-native routing policy. Drop-in compatible: just swap the base URL. No database required, built-in cost tracking, budget enforcement and multi-tenant isolation.

  • Updated Apr 8, 2026
  • TypeScript
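"Drop-in compatible: just swap the base URL" means any OpenAI-style client keeps working once it is pointed at the gateway instead of the official endpoint. A minimal sketch of that idea, using only the standard library (the localhost address and API key are placeholder assumptions):

```python
import json
import urllib.request

# Hypothetical local gateway address. Against the official API this would be
# https://api.openai.com/v1 -- changing this one string is the whole migration.
BASE_URL = "http://localhost:8080/v1"


def build_chat_request(base_url: str, api_key: str,
                       model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible /chat/completions request against any base URL."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


req = build_chat_request(BASE_URL, "sk-example", "some-model", "Hello")
```

Because the request shape is identical either way, the gateway can intercept, score, and reroute it to any provider without the client noticing.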
AI-Worker-Proxy

OpenAI-compatible AI proxy: Anthropic Claude, Google Gemini, GPT-5, Cloudflare AI. Free hosting, automatic failover, token rotation. Deploy in 1 minute.

  • Updated Apr 17, 2026
  • TypeScript
