Describe the bug
After several hours of heavy use, one session reached 64,408.5 k tokens and then became unusable: every request returned a 502 error. Only starting a fresh session lets me continue, which makes long, in-depth sessions impractical, and context handling seems poor. I can't tell whether the issue lies with Codex itself; I have already switched LLM endpoints twice without improvement.
Steps to reproduce
- Run a single session until it reaches about 64,408.5 k tokens (in my case, over 400 minutes of use).
- Every subsequent request fails with a 502 error, and retrying keeps returning 502.
Expected behavior
The session should continue to work normally instead of returning 502 errors.
Screenshots or logs
No response
Operating System
macOS 15.7.3
OpenCow Version
0.3.9
Node.js Version
22.11.0
AI Engine
OpenAI Codex