LLMSee is a lightweight, cross-platform proxy for logging OpenAI-compatible API calls, with real-time monitoring.
- Platform Compatibility: Single executable for Linux, macOS, and Windows, or run as a Docker container
- Real-time Web Interface: Vue 3 web UI with live request monitoring
- Multiple Provider Support: OpenAI, Ollama, Anthropic, Gemini, Groq, and more
- Enhanced Model Configuration: Automatic parameter injection for models
- No Dependencies: Embedded SQLite database, no external services required
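Because LLMSee speaks the OpenAI wire format, an existing client only needs its base URL repointed at the proxy; requests are logged and forwarded to the upstream provider unchanged. A minimal sketch using only the Python standard library (the proxy address and port here are assumptions, not LLMSee's documented defaults; see the Quick Start Guide for the real values):

```python
import json
import urllib.request

# Assumed proxy address; LLMSee's actual default port may differ.
PROXY_BASE = "http://localhost:8080/v1"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a standard OpenAI-style chat completion request.

    Sending this to LLMSee instead of the provider directly lets the
    proxy log it and show it in the web UI in real time.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{PROXY_BASE}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("gpt-4o-mini", "Hello")
print(req.full_url)  # http://localhost:8080/v1/chat/completions
# urllib.request.urlopen(req) would send it once the proxy is running.
```

The same repointing works for any OpenAI-compatible SDK that accepts a custom base URL, which is how the proxy stays transparent to existing code.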
See the Quick Start Guide for complete setup instructions.
All documentation is organized in the docs/ folder:
- Quick Start Guide - Installation and setup
- Configuration Guide - Complete configuration reference
- Developer Guide - Backend - Go backend development
- Developer Guide - Frontend - Vue 3 frontend development
- Testing Guide - Testing procedures and best practices
- Database Guide - SQLite operations and schema
- Project Summary - Architecture overview
Contributions are welcome! Please see the Developer Guides for detailed information on the codebase.
LLMSee is made by @yz778 and licensed under the MIT License. See LICENSE for more details.