Stateless dynamic request relay for CORS-safe forwarding on Cloudflare Workers.
relayx is intentionally thin. The caller puts the upstream URL in the request path, sends the upstream headers it needs, and relayx forwards the request without storing anything.
- Dynamic target URL in the path: `/https://api.example.com/v1/...`
- Optional explicit prefix: `/relay/https://api.example.com/v1/...`
- Browser CORS preflight handling
- Optional browser origin allowlist with `ALLOWED_ORIGINS`
- HTTPS-only upstream targets
- Caller-provided auth headers; no Worker-side API key injection
- No KV, D1, R2, Durable Objects, Queues, cache storage, logs, or persistence
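The path-based target resolution above can be sketched in a few lines. This is a hypothetical illustration of the rules (optional `relay/` prefix, HTTPS-only upstreams, preserved query string), not relayx's actual source:

```typescript
// Hypothetical sketch (not relayx's real implementation) of resolving the
// upstream target from the Worker request path.
function extractUpstream(pathname: string, search: string): string | null {
  // Drop the single leading slash, then an optional explicit "relay/" prefix.
  let rest = pathname.replace(/^\//, "");
  if (rest.startsWith("relay/")) rest = rest.slice("relay/".length);
  // HTTPS-only upstream targets: reject anything else.
  if (!rest.startsWith("https://")) return null;
  // The outer Worker URL's query string is carried over unchanged.
  return rest + search;
}
```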
Install dependencies:

```shell
pnpm install
```

Run locally:

```shell
pnpm dev
```

Send a test request through the local Worker:

```shell
curl -X POST 'http://localhost:8787/https://api.openai.com/v1/chat/completions' \
  -H 'Authorization: Bearer YOUR_API_KEY' \
  -H 'Content-Type: application/json' \
  -d '{"model":"gpt-4o-mini","messages":[{"role":"user","content":"hi"}]}'
```

Put the full upstream URL after the Worker origin:
```
https://relayx.bax.workers.dev/https://api.openai.com/v1/chat/completions
```

The `/relay/` prefix is also accepted if you prefer a clearer route:

```
https://relayx.bax.workers.dev/relay/https://api.openai.com/v1/chat/completions
```

Query strings are preserved from the outer Worker URL:

```
https://relayx.bax.workers.dev/https://api.openai.com/v1/chat/completions?stream=false
```

For an OpenAI-compatible frontend, set `BASE_URL` to the Worker URL plus the upstream base:

```
https://relayx.bax.workers.dev/https://api.openai.com/v1
```

relayx does not append API endpoints automatically. If the client requests only the URL above, the upstream request also stops at `/v1`. OpenAI-compatible clients usually append paths such as `/chat/completions` themselves.
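As a concrete illustration of that split of responsibilities, the final URL is simply the client's base URL plus the endpoint path the client appends itself:

```typescript
// The client, not relayx, appends endpoint paths to BASE_URL.
const BASE_URL = "https://relayx.bax.workers.dev/https://api.openai.com/v1";
const endpoint = "/chat/completions"; // appended by an OpenAI-compatible client
const finalUrl = BASE_URL + endpoint;
```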
Callers provide upstream headers directly. For OpenAI-compatible APIs, send headers such as:
```
Authorization: Bearer YOUR_API_KEY
Content-Type: application/json
OpenAI-Organization: OPTIONAL_ORG
OpenAI-Project: OPTIONAL_PROJECT
```

relayx forwards allowed request headers and adds CORS headers to responses. It strips browser/session and edge forwarding headers before the upstream fetch:

- `Cookie`
- `CF-*`
- `X-Forwarded-*`
- `Host`
- `Access-Control-*`
- hop-by-hop headers such as `Connection` and `Transfer-Encoding`
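The strip list can be sketched as a simple name filter. This is a hypothetical illustration of the rules listed above; relayx's actual filter may differ in detail:

```typescript
// Hypothetical sketch of the request-header strip list described above.
const BLOCKED = [
  /^cookie$/i,           // browser/session state
  /^cf-/i,               // Cloudflare edge metadata
  /^x-forwarded-/i,      // edge forwarding headers
  /^host$/i,             // set for the upstream by fetch() itself
  /^access-control-/i,   // CORS headers are added on the response side
  /^connection$/i,       // hop-by-hop
  /^transfer-encoding$/i // hop-by-hop
];

function forwardable(name: string): boolean {
  return !BLOCKED.some((re) => re.test(name));
}
```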
By default, relayx reflects any browser Origin. This is convenient for local testing and caller-supplied credentials.
To allow only specific browser origins, configure ALLOWED_ORIGINS in wrangler.jsonc:
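For example (hypothetical origins; substitute your own):

```jsonc
{
  "vars": {
    "ALLOWED_ORIGINS": "https://app.example.com, https://admin.example.com"
  }
}
```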
Requests from other browser origins receive 403. Server-to-server requests without an Origin header are still allowed.
To allow local development on any port, use wildcard rules:
```jsonc
{
  "vars": {
    "ALLOWED_ORIGINS": "http://localhost:*, http://127.0.0.1:*"
  }
}
```

These wildcards only match `localhost` and `127.0.0.1`, not similar-looking hostnames.
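The port-wildcard behavior can be sketched as an exact scheme-and-hostname check. This is a hypothetical illustration of the rule, not relayx's actual matcher:

```typescript
// Hypothetical sketch: a "host:*" rule matches any port, but only on that
// exact scheme and hostname; all other rules require an exact origin match.
function matchesRule(origin: string, rule: string): boolean {
  if (!rule.endsWith(":*")) return origin === rule;
  const base = rule.slice(0, -2); // e.g. "http://localhost"
  const url = new URL(origin);
  return `${url.protocol}//${url.hostname}` === base;
}
```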
Log in to Cloudflare:

```shell
pnpm wrangler login
```

Deploy the Worker:

```shell
pnpm run deploy
```

Wrangler prints the deployed Worker URL, for example:

```
https://relayx.example.workers.dev
```

Use it as your AI base URL:

```
https://relayx.example.workers.dev/https://api.openai.com/v1
```

```shell
pnpm test        # run unit tests
pnpm typecheck   # run TypeScript checks
pnpm dev         # run locally with Wrangler
pnpm run deploy  # deploy to Cloudflare Workers
```

relayx deliberately does not include:
- `UPSTREAM_BASE_URL`
- Worker-side API key injection
- request or response persistence
- request logging
- user accounts
- quota tracking
- model routing
- prompt or completion inspection
The Worker is a transport relay only.
Use the package script instead:

```shell
pnpm run deploy
```

Use a backslash (`\`) for line continuation in bash, not the Windows cmd caret (`^`).
The upstream did not receive a valid auth header. Check that your request includes one of the headers required by that upstream, such as:

```
Authorization: Bearer YOUR_API_KEY
```

The request reached the upstream service. Try the same request directly against the upstream URL to compare behavior.
{ "vars": { "ALLOWED_ORIGINS": "https://rpx.pages.dev, https://app.example.com" } }