This repository was archived by the owner on Feb 18, 2026. It is now read-only.
Releases · 9j/claude-code-mux
v0.6.3
Full Changelog: v0.6.2...v0.6.3
v0.6.2
Full Changelog: v0.6.1...v0.6.2
v0.6.1
Full Changelog: v0.6.0...v0.6.1
v0.6.0
Full Changelog: v0.5.3...v0.6.0
v0.5.2
Full Changelog: v0.5.1...v0.5.2
v0.5.1
Full Changelog: v0.5.0...v0.5.1
v0.5.0
Full Changelog: v0.4.5...v0.5.0
v0.4.5
Full Changelog: v0.4.4...v0.4.5
v0.4.2
Full Changelog: v0.4.1...v0.4.2
v0.4.1: Fix Codex Streaming
🐛 Bug Fixes
Fixed Streaming Requests for Codex Models
This patch release fixes a critical issue where streaming requests to Codex models were using the wrong API endpoint, causing 404 errors and forcing fallback to secondary providers.
What was broken:
- ❌ Streaming requests to Codex models used `/v1/chat/completions`
- ❌ The OpenAI API returned 404 errors
- ❌ The system fell back to the zenmux provider
What's fixed:
- ✅ Streaming requests now use the `/v1/responses` endpoint
- ✅ No more 404 errors from the OpenAI API
- ✅ All requests succeed with the primary provider
📝 Changes
- Modified `send_message_stream()` to detect Codex models
- Route streaming requests to the correct endpoint based on model type
- Added debug logging for streaming endpoint selection
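The routing described above can be sketched as a small, self-contained example. Note this is an illustration, not the repository's actual code: the helper names (`is_codex_model`, `streaming_endpoint`) and the detection rule (a case-insensitive substring match on `codex`) are assumptions; the real logic lives in `send_message_stream()` in `src/providers/openai.rs`.

```rust
/// Hypothetical detection rule: treat any model whose name contains
/// "codex" as a Codex model. The real check may differ.
fn is_codex_model(model: &str) -> bool {
    model.to_ascii_lowercase().contains("codex")
}

/// Pick the streaming endpoint based on the model type:
/// Codex models go to the Responses API, everything else to
/// Chat Completions.
fn streaming_endpoint(base_url: &str, model: &str) -> String {
    if is_codex_model(model) {
        // Codex models → /v1/responses
        format!("{}/responses", base_url)
    } else {
        // Standard models → /v1/chat/completions
        format!("{}/chat/completions", base_url)
    }
}

fn main() {
    assert_eq!(
        streaming_endpoint("https://api.openai.com/v1", "gpt-5.1-codex"),
        "https://api.openai.com/v1/responses"
    );
    assert_eq!(
        streaming_endpoint("https://api.openai.com/v1", "gpt-4o"),
        "https://api.openai.com/v1/chat/completions"
    );
}
```

Keeping the endpoint choice in one function makes it easy to log the selected URL for debugging, as the release notes mention.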
🔧 Technical Details
Code Changes:
// Before: Always used /v1/chat/completions
let url = format!("{}/chat/completions", self.base_url);

// After: Smart routing based on model
let (url, request_body) = if is_codex {
    // Codex models → /v1/responses
} else {
    // Standard models → /v1/chat/completions
};

✅ Testing
- Verified streaming works with the `gpt-5.1-codex` model
- Confirmed no 404 errors in production logs
- Both streaming and non-streaming requests work correctly
📦 Files Changed
src/providers/openai.rs: +17 lines, -4 lines
🤖 Generated with Claude Code