diff --git a/.env.example b/.env.example index e867fb0..c235e77 100644 --- a/.env.example +++ b/.env.example @@ -17,8 +17,21 @@ FLOW_PROXY_LOG_LEVEL=INFO # Path to secrets.json file (default: secrets.json) FLOW_PROXY_SECRETS_FILE=secrets.json -# Optional: Custom log file path -# FLOW_PROXY_LOG_FILE=flow_proxy_plugin.log +# Log directory path (default: logs) +FLOW_PROXY_LOG_DIR=logs + +# Log Cleanup Settings +# Enable automatic log cleanup (default: true) +FLOW_PROXY_LOG_CLEANUP_ENABLED=true + +# Number of days to retain log files (default: 7) +FLOW_PROXY_LOG_RETENTION_DAYS=7 + +# Interval in hours between log cleanup runs (default: 24) +FLOW_PROXY_LOG_CLEANUP_INTERVAL_HOURS=24 + +# Maximum size of log directory in MB, 0 = unlimited (default: 100) +FLOW_PROXY_LOG_MAX_SIZE_MB=100 # Optional: Target Flow LLM Proxy URL # FLOW_PROXY_TARGET_URL=https://flow.ciandt.com/flow-llm-proxy diff --git a/README.md b/README.md index 5da9d49..dba5d86 100644 --- a/README.md +++ b/README.md @@ -15,6 +15,7 @@ Flow Proxy Plugin 是一个强大的代理插件,为 Flow LLM Proxy 服务提 - **透明代理**: 无缝转发请求到 Flow LLM Proxy 服务,保持原始请求内容 - **自动故障转移**: 当某个配置失败时,自动切换到下一个可用配置 - **全面的错误处理**: 详细的错误日志和自动重试机制 +- **日志自动清理**: 自动清理过期日志文件,管理磁盘空间 ## 架构概述 @@ -280,7 +281,13 @@ FLOW_PROXY_NUM_WORKERS= # Worker 数量(默认:CPU 核心数 FLOW_PROXY_THREADED=1 # 线程模式(1=启用,0=禁用,默认:1) FLOW_PROXY_LOG_LEVEL=INFO # 日志级别 FLOW_PROXY_SECRETS_FILE=secrets.json # 配置文件路径 -FLOW_PROXY_LOG_FILE=flow_proxy_plugin.log # 日志文件路径 +FLOW_PROXY_LOG_DIR=logs # 日志目录路径 + +# 日志清理配置(可选,详见 docs/log-cleanup.md) +FLOW_PROXY_LOG_CLEANUP_ENABLED=true # 是否启用自动清理(默认:true) +FLOW_PROXY_LOG_RETENTION_DAYS=7 # 日志保留天数(默认:7) +FLOW_PROXY_LOG_CLEANUP_INTERVAL_HOURS=24 # 清理间隔小时数(默认:24) +FLOW_PROXY_LOG_MAX_SIZE_MB=100 # 日志目录最大大小MB(默认:100) ``` **性能优化**: @@ -327,9 +334,11 @@ git commit -m "feat: add new feature" ## 文档 -- **[使用指南](docs/使用指南.md)** - 完整的用户使用指南,包括安装、配置、使用和故障排除 -- **[开发指南](docs/开发指南.md)** - 开发者文档,包括 API 文档、架构说明和扩展指南 -- **[部署运维](docs/部署运维.md)** - 生产环境部署和运维指南 +### 功能文档 +- **[日志自动清理](docs/log-cleanup.md)** - 自动清理过期日志文件的功能说明 +- **[日志过滤](docs/log-filtering.md)** - 日志输出过滤和级别控制 + +### 外部资源 - **[Flow LLM Proxy 官方文档](https://flow.ciandt.com/help/en/help/articles/8421153-overview-and-configuration)** - Flow LLM Proxy 概述与配置 ## 许可证 diff --git a/docker-compose.yml b/docker-compose.yml index adaccc1..5f10808 100644 --- a/docker-compose.yml +++ b/docker-compose.yml @@ -18,7 +18,7 @@ services: - FLOW_PROXY_HOST=0.0.0.0 - FLOW_PROXY_LOG_LEVEL=INFO - FLOW_PROXY_SECRETS_FILE=/app/secrets.json - - FLOW_PROXY_LOG_FILE=/app/logs/flow_proxy_plugin.log + - FLOW_PROXY_LOG_DIR=/app/logs/ restart: unless-stopped healthcheck: test: ["CMD", "curl", "-f", "http://localhost:8899/"] diff --git a/docs/log-cleanup.md b/docs/log-cleanup.md new file mode 100644 index 0000000..d5eb623 --- /dev/null +++ b/docs/log-cleanup.md @@ -0,0 +1,305 @@ +# 日志自动清理功能 + +## 概述 + +Flow Proxy Plugin 提供了自动清理日志文件的功能,可以帮助管理磁盘空间,防止日志文件无限增长。 + +## 功能特性 + +### 1. 日志按天分文件 +- 使用 `TimedRotatingFileHandler` 实现日志按天轮转 +- 每天午夜自动创建新的日志文件 +- 文件命名格式: + - 当前日志:`flow_proxy_plugin.log` + - 历史日志:`flow_proxy_plugin.log.2026-02-01`、`flow_proxy_plugin.log.2026-01-31` 等 + +### 2. 按时间清理 +- 自动删除超过指定天数的日志文件 +- 默认保留最近 7 天的日志 + +### 3. 按大小清理 +- 当日志目录总大小超过限制时,自动删除最旧的文件 +- 默认限制为 100 MB(可设置为 0 表示不限制) + +### 4. 定期执行 +- 在后台线程中定期执行清理任务 +- 默认每 24 小时清理一次 +- 应用启动时立即执行一次清理 + +### 5. 
灵活配置 +- 通过环境变量灵活配置清理策略 +- 可以完全禁用自动清理功能 + +## 配置选项 + +所有配置选项都可以通过环境变量设置。在 `.env` 文件中添加以下配置: + +### FLOW_PROXY_LOG_CLEANUP_ENABLED + +是否启用日志自动清理功能。 + +- **类型**: boolean (true/false) +- **默认值**: `true` +- **示例**: + ```bash + FLOW_PROXY_LOG_CLEANUP_ENABLED=true + ``` + +### FLOW_PROXY_LOG_RETENTION_DAYS + +日志文件保留天数。超过此天数的日志文件将被删除。 + +- **类型**: integer +- **默认值**: `7` +- **示例**: + ```bash + FLOW_PROXY_LOG_RETENTION_DAYS=7 + ``` + +### FLOW_PROXY_LOG_CLEANUP_INTERVAL_HOURS + +日志清理任务执行间隔(小时)。 + +- **类型**: integer +- **默认值**: `24` +- **示例**: + ```bash + FLOW_PROXY_LOG_CLEANUP_INTERVAL_HOURS=24 + ``` + +### FLOW_PROXY_LOG_MAX_SIZE_MB + +日志目录最大大小(MB)。设置为 0 表示不限制。 + +- **类型**: integer +- **默认值**: `100` +- **示例**: + ```bash + FLOW_PROXY_LOG_MAX_SIZE_MB=100 + ``` + +## 使用示例 + +### 默认配置 + +默认情况下,日志清理功能已启用,使用以下配置: + +```bash +FLOW_PROXY_LOG_CLEANUP_ENABLED=true +FLOW_PROXY_LOG_RETENTION_DAYS=7 +FLOW_PROXY_LOG_CLEANUP_INTERVAL_HOURS=24 +FLOW_PROXY_LOG_MAX_SIZE_MB=100 +``` + +### 保留更多天数的日志 + +如果需要保留更长时间的日志(例如 30 天): + +```bash +FLOW_PROXY_LOG_RETENTION_DAYS=30 +``` + +### 增加日志目录大小限制 + +如果需要更大的日志存储空间(例如 500 MB): + +```bash +FLOW_PROXY_LOG_MAX_SIZE_MB=500 +``` + +### 更频繁的清理 + +如果希望更频繁地清理日志(例如每 6 小时): + +```bash +FLOW_PROXY_LOG_CLEANUP_INTERVAL_HOURS=6 +``` + +### 禁用大小限制 + +如果只想按时间清理,不限制总大小: + +```bash +FLOW_PROXY_LOG_MAX_SIZE_MB=0 +``` + +### 完全禁用自动清理 + +如果不需要自动清理功能: + +```bash +FLOW_PROXY_LOG_CLEANUP_ENABLED=false +``` + +## 工作原理 + +### 启动流程 + +1. 应用启动时,`setup_logging()` 函数会初始化日志系统 +2. 如果启用了日志清理功能,会自动创建并启动 `LogCleaner` 实例 +3. 立即执行一次清理任务 +4. 在后台线程中定期执行清理任务 + +### 清理逻辑 + +#### 按时间清理 + +- 扫描日志目录中的所有 `.log*` 文件 +- 检查每个文件的修改时间 +- 删除修改时间早于 `retention_days` 天前的文件 + +#### 按大小清理 + +- 如果设置了 `max_size_mb` 限制(非 0 值) +- 计算日志目录的总大小 +- 如果超过限制,按修改时间排序(最旧的在前) +- 依次删除最旧的文件,直到总大小低于限制 + +### 线程安全 + +- 清理任务在独立的后台线程中运行 +- 不会阻塞主应用程序 +- 线程设置为 daemon 模式,应用退出时自动停止 + +## 监控和调试 + +### 日志输出 + +清理任务会输出详细的日志信息: + +``` +INFO flow_proxy_plugin.utils.log_cleaner - 日志清理任务已启动,保留 7 天,每 24 小时清理一次 +INFO flow_proxy_plugin.utils.log_cleaner - 开始清理日志,删除 2026-01-25 12:00:00 之前的文件 +DEBUG flow_proxy_plugin.utils.log_cleaner - 删除过期日志文件: old_log.log +INFO flow_proxy_plugin.utils.log_cleaner - 日志清理完成: 删除 3 个文件,释放 15.23 MB 空间 +``` + +### 获取统计信息 + +可以通过编程方式获取日志统计信息: + +```python +from flow_proxy_plugin.utils.log_cleaner import get_log_cleaner + +cleaner = get_log_cleaner() +if cleaner: + stats = cleaner.get_log_stats() + print(f"日志文件数量: {stats['total_files']}") + print(f"总大小: {stats['total_size_mb']} MB") + print(f"最旧文件: {stats['oldest_file']}") + print(f"最新文件: {stats['newest_file']}") +``` + +### 手动触发清理 + +如果需要手动触发清理: + +```python +from flow_proxy_plugin.utils.log_cleaner import get_log_cleaner + +cleaner = get_log_cleaner() +if cleaner: + result = cleaner.cleanup_logs() + print(f"删除了 {result['deleted_files']} 个文件") + print(f"释放了 {result['freed_space_mb']} MB 空间") +``` + +## 最佳实践 + +### 生产环境建议 + +1. **保留天数**: 根据合规要求设置,通常 7-30 天 +2. **大小限制**: 根据可用磁盘空间设置,建议至少 100 MB +3. 
**清理间隔**: 每天清理一次通常足够 + +示例配置: + +```bash +FLOW_PROXY_LOG_CLEANUP_ENABLED=true +FLOW_PROXY_LOG_RETENTION_DAYS=30 +FLOW_PROXY_LOG_CLEANUP_INTERVAL_HOURS=24 +FLOW_PROXY_LOG_MAX_SIZE_MB=500 +``` + +### 开发环境建议 + +开发环境可以使用更宽松的设置: + +```bash +FLOW_PROXY_LOG_CLEANUP_ENABLED=true +FLOW_PROXY_LOG_RETENTION_DAYS=3 +FLOW_PROXY_LOG_CLEANUP_INTERVAL_HOURS=6 +FLOW_PROXY_LOG_MAX_SIZE_MB=100 +``` + +### 高负载环境 + +对于日志产生量大的环境: + +```bash +FLOW_PROXY_LOG_CLEANUP_ENABLED=true +FLOW_PROXY_LOG_RETENTION_DAYS=7 +FLOW_PROXY_LOG_CLEANUP_INTERVAL_HOURS=12 # 更频繁 +FLOW_PROXY_LOG_MAX_SIZE_MB=1000 # 更大的空间 +``` + +## 故障排查 + +### 日志文件没有被清理 + +1. 检查是否启用了清理功能: + ```bash + echo $FLOW_PROXY_LOG_CLEANUP_ENABLED + ``` + +2. 检查日志文件的修改时间是否超过保留天数 + +3. 查看应用日志中的清理信息 + +### 清理过于激进 + +如果日志被过早删除: + +1. 增加 `FLOW_PROXY_LOG_RETENTION_DAYS` 的值 +2. 增加或禁用 `FLOW_PROXY_LOG_MAX_SIZE_MB` 限制 + +### 磁盘空间仍然不足 + +1. 检查是否有其他应用在同一目录产生日志 +2. 减少 `FLOW_PROXY_LOG_RETENTION_DAYS` 的值 +3. 减少 `FLOW_PROXY_LOG_MAX_SIZE_MB` 的值 +4. 增加清理频率(减少 `FLOW_PROXY_LOG_CLEANUP_INTERVAL_HOURS`) + +## 技术细节 + +### 实现架构 + +- **LogCleaner 类**: 核心清理逻辑 +- **后台线程**: 定期执行清理任务 +- **全局实例**: 通过 `init_log_cleaner()` 创建全局实例 +- **集成点**: 在 `setup_logging()` 中自动初始化 + +### 日志轮转机制 + +使用 Python 标准库的 `TimedRotatingFileHandler`: +- **轮转时机**: 每天午夜(`when='midnight'`) +- **日期后缀**: `%Y-%m-%d` 格式(如 `2026-02-01`) +- **编码**: UTF-8 +- **备份数量**: 不限制(`backupCount=0`),由清理器管理 + +### 文件匹配模式 + +清理器使用 glob 模式 `*.log*` 匹配文件,包括: +- `flow_proxy_plugin.log` - 当前日志文件 +- `flow_proxy_plugin.log.2026-02-01` - 按天轮转的历史日志 +- `*.log.gz` - 如果手动压缩的日志文件 + +### 时间判断 + +使用文件的修改时间(mtime)来判断文件年龄,而不是创建时间。 + +## 相关资源 + +- [Python logging 文档](https://docs.python.org/3/library/logging.html) +- [日志过滤文档](./log-filtering.md) +- [配置管理文档](../README.md#配置) diff --git a/docs/refactoring-summary.md b/docs/refactoring-summary.md deleted file mode 100644 index 308b440..0000000 --- a/docs/refactoring-summary.md +++ /dev/null @@ -1,294 +0,0 @@ -# 插件重构总结 - -## 概述 - -成功重构了 `proxy_plugin.py` 和 `web_server_plugin.py`,使代码更加优雅、简洁和可维护。 - -## 重构目标 - -1. **消除代码重复**:提取公共逻辑到基类 -2. **提高可读性**:方法拆分,单一职责 -3. **增强可维护性**:统一的初始化和错误处理 -4. **保持向后兼容**:所有测试通过 - -## 主要改进 - -### 1. 创建基类 `BaseFlowProxyPlugin` - -**位置**:`flow_proxy_plugin/plugins/base_plugin.py` - -**提取的公共功能**: -- ✅ 日志设置和过滤器配置 -- ✅ 组件初始化(SecretsManager, LoadBalancer, JWTGenerator, RequestForwarder) -- ✅ 配置选择和 JWT 令牌生成(带故障转移) -- ✅ 字节解码和头部值提取工具方法 - -**代码对比**: - -**重构前**: -```python -# 在两个插件中重复的初始化代码(~50 行) -self.logger = logging.getLogger(__name__) -log_level_str = os.getenv("FLOW_PROXY_LOG_LEVEL", "INFO") -# ... 更多重复代码 -setup_colored_logger(self.logger, log_level_str) -setup_proxy_log_filters(...) -``` - -**重构后**: -```python -# 基类中统一实现 -def _setup_logging(self) -> None: - """Set up logging with colored output and filters.""" - # 6 行简洁代码 - -def _initialize_components(self) -> None: - """Initialize core components for request processing.""" - # 统一的初始化逻辑 -``` - -### 2. 
重构 `FlowProxyWebServerPlugin` - -**改进点**: - -#### 方法拆分和简化 - -**重构前** `handle_request`(~100 行): -```python -def handle_request(self, request: HttpParser) -> None: - # 配置选择 - # 令牌生成 - # 请求转发 - # 响应发送 - # 所有逻辑混在一起 -``` - -**重构后** `handle_request`(~20 行): -```python -def handle_request(self, request: HttpParser) -> None: - """Handle web server request.""" - method = self._decode_bytes(request.method) if request.method else "GET" - path = self._decode_bytes(request.path) if request.path else "/" - - self.logger.info("→ %s %s", method, path) - - try: - config, config_name, jwt_token = self._get_config_and_token() - response = self._forward_request(request, method, path, jwt_token) - self._send_response(response) - - log_func = self.logger.info if response.status_code < 400 else self.logger.warning - log_func("← %d %s [%s]", response.status_code, response.reason, config_name) - except Exception as e: - self.logger.error("✗ Request failed: %s", str(e), exc_info=True) - self._send_error() -``` - -#### 新增的辅助方法 - -- `_forward_request()`: 处理请求转发逻辑 -- `_build_headers()`: 构建请求头 -- `_get_request_body()`: 提取请求体 -- `_log_request_details()`: DEBUG 日志 -- `_send_response_headers()`: 发送响应头 -- `_stream_response_body()`: 流式响应体 - -**代码行数对比**: -- 重构前:~370 行(包含重复的初始化逻辑) -- 重构后:~270 行(共享基类后) -- **减少约 27%** - -### 3. 重构 `FlowProxyPlugin` - -**改进点**: - -#### 简化的请求处理 - -**重构前** `before_upstream_connection`(~90 行): -```python -def before_upstream_connection(self, request: HttpParser) -> HttpParser | None: - # 路径转换 - # 请求验证 - # 配置选择 - # 令牌生成 - # 故障转移逻辑 - # 请求修改 - # 所有逻辑耦合在一起 -``` - -**重构后** `before_upstream_connection`(~40 行): -```python -def before_upstream_connection(self, request: HttpParser) -> HttpParser | None: - """Process request before establishing upstream connection.""" - try: - self._convert_reverse_proxy_request(request) - - if not self.request_forwarder.validate_request(request): - self.logger.error("Request validation failed") - return None - - config, config_name, jwt_token = self._get_config_and_token() - - modified_request = self.request_forwarder.modify_request_headers( - request, jwt_token, config_name - ) - - target_url = self._decode_bytes(request.path) if request.path else "unknown" - self.logger.info("Request processed with config '%s' → %s", config_name, target_url) - - return modified_request - except (RuntimeError, ValueError) as e: - self.logger.error("Request processing failed: %s", str(e)) - return None -``` - -#### 新增的辅助方法 - -- `_convert_reverse_proxy_request()`: 处理反向代理请求转换 - -**代码行数对比**: -- 重构前:~240 行 -- 重构后:~140 行 -- **减少约 42%** - -### 4. 代码质量指标 - -#### 圈复杂度降低 - -| 方法 | 重构前 | 重构后 | 改进 | -|------|--------|--------|------| -| `handle_request` | 12 | 4 | ↓67% | -| `before_upstream_connection` | 15 | 6 | ↓60% | -| `_send_response` | 10 | 5 | ↓50% | - -#### 可维护性提升 - -- ✅ 单一职责:每个方法专注一个任务 -- ✅ 易于测试:方法更小更独立 -- ✅ 易于扩展:基类可供未来插件复用 -- ✅ 代码复用:消除 ~100 行重复代码 - -### 5. 测试结果 - -```bash -============================= 160 passed in 1.33s ============================== -``` - -**测试覆盖率**: -- ✅ 所有 160 个单元测试通过 -- ✅ 保持向后兼容性 -- ✅ 新增日志过滤器测试(14 个) - -## 技术亮点 - -### 1. 多重继承的优雅使用 - -```python -class FlowProxyWebServerPlugin(HttpWebServerBasePlugin, BaseFlowProxyPlugin): - """Combines proxy.py base with our shared logic.""" -``` - -### 2. 
统一的错误处理和故障转移 - -```python -def _get_config_and_token(self) -> tuple[dict[str, Any], str, str]: - """Get next config and generate JWT token with failover support.""" - try: - jwt_token = self.jwt_generator.generate_token(config) - return config, config_name, jwt_token - except ValueError as e: - self.logger.error("Token generation failed for '%s': %s", config_name, str(e)) - self.load_balancer.mark_config_failed(config) - # 自动故障转移 - config = self.load_balancer.get_next_config() - # ... -``` - -### 3. 流式响应的优雅处理 - -```python -def _stream_response_body(self, response: requests.Response) -> tuple[int, int]: - """Stream response body to client. - - Returns: - Tuple of (bytes_sent, chunks_sent) - """ - bytes_sent = 0 - chunks_sent = 0 - - for chunk in response.iter_content(chunk_size=8192): - # 检查连接、发送数据、处理错误 - # ... - - return bytes_sent, chunks_sent -``` - -### 4. 日志过滤器集成 - -```python -def _setup_logging(self) -> None: - """Set up logging with colored output and filters.""" - setup_colored_logger(self.logger, log_level) - setup_proxy_log_filters(suppress_broken_pipe=True, suppress_proxy_noise=True) -``` - -## 向后兼容性 - -为保持向后兼容,保留了以下内容: - -1. **属性名称**: - - `self.secrets_manager` - 虽然作为局部变量已足够,但保留供测试使用 - -2. **方法别名**: - ```python - def _prepare_headers(self, request: HttpParser, jwt_token: str) -> dict[str, str]: - """Deprecated: Use _build_headers instead.""" - return self._build_headers(request, jwt_token) - ``` - -## 代码结构 - -``` -flow_proxy_plugin/plugins/ -├── __init__.py # 导出所有插件 -├── base_plugin.py # ✨ 新增:基类 -├── proxy_plugin.py # 重构:从 ~240 行 → ~140 行 -└── web_server_plugin.py # 重构:从 ~370 行 → ~270 行 -``` - -## 性能影响 - -- ✅ **无性能损失**:重构仅改变代码组织,不影响运行时性能 -- ✅ **内存使用相同**:对象结构未改变 -- ✅ **启动时间相同**:初始化逻辑保持一致 - -## 未来扩展性 - -基类 `BaseFlowProxyPlugin` 为未来插件提供了标准模板: - -```python -class NewCustomPlugin(SomeBasePlugin, BaseFlowProxyPlugin): - """Future plugin can easily reuse shared logic.""" - - def __init__(self, *args, **kwargs): - super().__init__(*args, **kwargs) - self._setup_logging() # 复用 - self._initialize_components() # 复用 - - def process_request(self, request): - config, name, token = self._get_config_and_token() # 复用 - # 自定义逻辑 -``` - -## 总结 - -这次重构成功地: - -1. ✅ **减少代码重复**:消除 ~100 行重复代码 -2. ✅ **提高可读性**:方法更小、更专注 -3. ✅ **降低复杂度**:圈复杂度降低 50-67% -4. ✅ **保持兼容性**:所有测试通过 -5. ✅ **增强可维护性**:统一的模式和结构 -6. 
✅ **提升扩展性**:基类可供未来复用 - -**代码质量显著提升,同时保持了功能完整性和测试覆盖率!** diff --git a/flow_proxy_plugin/cli.py b/flow_proxy_plugin/cli.py index 4359c47..47958e1 100644 --- a/flow_proxy_plugin/cli.py +++ b/flow_proxy_plugin/cli.py @@ -52,10 +52,10 @@ def main() -> None: ) parser.add_argument( - "--log-file", + "--log-dir", type=str, - default=os.getenv("FLOW_PROXY_LOG_FILE", "flow_proxy_plugin.log"), - help="Path to log file (default: flow_proxy_plugin.log, env: FLOW_PROXY_LOG_FILE)", + default=os.getenv("FLOW_PROXY_LOG_DIR", "logs"), + help="Log directory path (default: logs, env: FLOW_PROXY_LOG_DIR)", ) parser.add_argument( @@ -74,7 +74,7 @@ def main() -> None: args = parser.parse_args() # Setup logging - setup_logging(args.log_level, args.log_file) + setup_logging(args.log_level, args.log_dir) logger = logging.getLogger(__name__) # Check if secrets file exists @@ -106,9 +106,10 @@ def main() -> None: logger.info(f" Log level: {args.log_level}") logger.info("=" * 60) - # Store secrets file path and log level in environment for plugin to access + # Store secrets file path, log level, and log dir in environment for plugin to access os.environ["FLOW_PROXY_SECRETS_FILE"] = args.secrets_file os.environ["FLOW_PROXY_LOG_LEVEL"] = args.log_level + os.environ["FLOW_PROXY_LOG_DIR"] = args.log_dir # Build proxy.py arguments proxy_args = [ diff --git a/flow_proxy_plugin/plugins/base_plugin.py b/flow_proxy_plugin/plugins/base_plugin.py index a551644..bacd97b 100644 --- a/flow_proxy_plugin/plugins/base_plugin.py +++ b/flow_proxy_plugin/plugins/base_plugin.py @@ -5,7 +5,7 @@ from typing import Any from ..utils.log_filter import setup_proxy_log_filters -from ..utils.logging import setup_colored_logger +from ..utils.logging import setup_colored_logger, setup_file_handler_for_child_process from ..utils.plugin_base import initialize_plugin_components @@ -24,7 +24,14 @@ def _setup_logging(self) -> None: if not os.getenv("FLOW_PROXY_LOG_LEVEL") and isinstance(flags_level, str): log_level = flags_level - setup_colored_logger(self.logger, log_level) + # Setup logger with console output only (no propagation to avoid duplicate logs) + setup_colored_logger(self.logger, log_level, propagate=False) + + # In multi-process environment, CREATE NEW file handler in child process + # This is necessary because file handlers from parent process don't work after fork() + log_dir = os.getenv("FLOW_PROXY_LOG_DIR", "logs") + setup_file_handler_for_child_process(self.logger, log_level, log_dir) + setup_proxy_log_filters(suppress_broken_pipe=True, suppress_proxy_noise=True) def _initialize_components(self) -> None: diff --git a/flow_proxy_plugin/utils/log_cleaner.py b/flow_proxy_plugin/utils/log_cleaner.py new file mode 100644 index 0000000..f3179b4 --- /dev/null +++ b/flow_proxy_plugin/utils/log_cleaner.py @@ -0,0 +1,285 @@ +"""日志清理工具模块。 + +提供自动清理过期日志文件的功能。 +""" + +import logging +import threading +from datetime import datetime, timedelta +from pathlib import Path + +logger = logging.getLogger(__name__) + + +class LogCleaner: + """日志清理器,负责定期清理过期的日志文件。""" + + def __init__( + self, + *, + log_dir: Path, + retention_days: int = 7, + cleanup_interval_hours: int = 24, + max_size_mb: int = 0, + enabled: bool = True, + ): + """初始化日志清理器。 + + Args: + log_dir: 日志目录路径 + retention_days: 日志保留天数 + cleanup_interval_hours: 清理间隔(小时) + max_size_mb: 日志目录最大大小(MB),0 表示不限制 + enabled: 是否启用自动清理 + """ + self.log_dir = Path(log_dir) + self.retention_days = retention_days + self.cleanup_interval_hours = cleanup_interval_hours + self.max_size_mb = max_size_mb + self.enabled 
= enabled + self._stop_event = threading.Event() + self._thread: threading.Thread | None = None + + def start(self) -> None: + """启动日志清理任务。""" + if not self.enabled: + logger.info("日志自动清理功能已禁用") + return + + if self._thread is not None and self._thread.is_alive(): + logger.warning("日志清理任务已在运行中") + return + + # 确保日志目录存在 + self.log_dir.mkdir(parents=True, exist_ok=True) + + # 立即执行一次清理 + self.cleanup_logs() + + # 启动定期清理线程 + self._stop_event.clear() + self._thread = threading.Thread(target=self._cleanup_loop, daemon=True) + self._thread.start() + logger.info( + f"日志清理任务已启动,保留 {self.retention_days} 天," + f"每 {self.cleanup_interval_hours} 小时清理一次" + ) + + def stop(self) -> None: + """停止日志清理任务。""" + if self._thread is None or not self._thread.is_alive(): + return + + logger.info("正在停止日志清理任务...") + self._stop_event.set() + self._thread.join(timeout=5) + self._thread = None + logger.info("日志清理任务已停止") + + def _cleanup_loop(self) -> None: + """清理循环,定期执行日志清理。""" + while not self._stop_event.is_set(): + # 等待指定的时间间隔 + if self._stop_event.wait(self.cleanup_interval_hours * 3600): + break + + # 执行清理 + try: + self.cleanup_logs() + except Exception as e: + logger.error(f"日志清理失败: {e}", exc_info=True) + + def cleanup_logs(self) -> dict: + """清理过期的日志文件。 + + Returns: + 清理结果统计,包含删除的文件数量和释放的空间 + """ + if not self.log_dir.exists(): + logger.warning(f"日志目录不存在: {self.log_dir}") + return {"deleted_files": 0, "freed_space_mb": 0} + + deleted_files = 0 + freed_space = 0 + cutoff_time = datetime.now() - timedelta(days=self.retention_days) + + logger.info(f"开始清理日志,删除 {cutoff_time.strftime('%Y-%m-%d %H:%M:%S')} 之前的文件") + + # 按修改时间清理 + for log_file in self.log_dir.glob("*.log*"): + try: + # 获取文件修改时间 + mtime = datetime.fromtimestamp(log_file.stat().st_mtime) + + if mtime < cutoff_time: + file_size = log_file.stat().st_size + log_file.unlink() + deleted_files += 1 + freed_space += file_size + logger.debug(f"删除过期日志文件: {log_file.name}") + except Exception as e: + logger.error(f"删除日志文件 {log_file} 失败: {e}") + + # 按总大小清理(如果设置了限制) + if self.max_size_mb > 0: + deleted, freed = self._cleanup_by_size() + deleted_files += deleted + freed_space += freed + + freed_space_mb = freed_space / (1024 * 1024) + logger.info(f"日志清理完成: 删除 {deleted_files} 个文件,释放 {freed_space_mb:.2f} MB 空间") + + return { + "deleted_files": deleted_files, + "freed_space_mb": round(freed_space_mb, 2), + } + + def _cleanup_by_size(self) -> tuple[int, int]: + """按总大小清理日志文件。 + + 如果日志目录总大小超过限制,删除最旧的文件直到满足限制。 + + Returns: + (删除的文件数量, 释放的空间字节数) + """ + max_size_bytes = self.max_size_mb * 1024 * 1024 + deleted_files = 0 + freed_space = 0 + + # 获取所有日志文件及其大小和修改时间 + log_files = [] + total_size = 0 + for log_file in self.log_dir.glob("*.log*"): + try: + stat = log_file.stat() + log_files.append((log_file, stat.st_size, stat.st_mtime)) + total_size += stat.st_size + except Exception as e: + logger.error(f"获取文件信息失败 {log_file}: {e}") + + # 如果总大小未超过限制,无需清理 + if total_size <= max_size_bytes: + return 0, 0 + + # 按修改时间排序(最旧的在前) + log_files.sort(key=lambda x: x[2]) + + # 删除最旧的文件直到满足大小限制 + logger.info( + f"日志目录大小 {total_size / (1024 * 1024):.2f} MB 超过限制 " + f"{self.max_size_mb} MB,开始清理最旧的文件" + ) + + for log_file, size, _ in log_files: + if total_size <= max_size_bytes: + break + + try: + log_file.unlink() + deleted_files += 1 + freed_space += size + total_size -= size + logger.debug(f"删除日志文件以满足大小限制: {log_file.name}") + except Exception as e: + logger.error(f"删除日志文件 {log_file} 失败: {e}") + + return deleted_files, freed_space + + def get_log_stats(self) -> dict: + """获取日志目录统计信息。 + + 
Returns: + 日志统计信息,包含文件数量、总大小等 + """ + if not self.log_dir.exists(): + return { + "total_files": 0, + "total_size_mb": 0, + "oldest_file": None, + "newest_file": None, + } + + log_files = list(self.log_dir.glob("*.log*")) + total_size = 0 + oldest_time = None + newest_time = None + + for log_file in log_files: + try: + stat = log_file.stat() + total_size += stat.st_size + mtime = datetime.fromtimestamp(stat.st_mtime) + + if oldest_time is None or mtime < oldest_time: + oldest_time = mtime + if newest_time is None or mtime > newest_time: + newest_time = mtime + except Exception: + pass + + return { + "total_files": len(log_files), + "total_size_mb": round(total_size / (1024 * 1024), 2), + "oldest_file": oldest_time.strftime("%Y-%m-%d %H:%M:%S") if oldest_time else None, + "newest_file": newest_time.strftime("%Y-%m-%d %H:%M:%S") if newest_time else None, + } + + +class _LogCleanerState: + """日志清理器状态管理类,避免使用 global 语句。""" + + def __init__(self) -> None: + self.cleaner: LogCleaner | None = None + + +# 全局日志清理器状态实例 +_state = _LogCleanerState() + + +def init_log_cleaner( + log_dir: Path, + retention_days: int = 7, + cleanup_interval_hours: int = 24, + max_size_mb: int = 0, + enabled: bool = True, +) -> LogCleaner: + """初始化全局日志清理器。 + + Args: + log_dir: 日志目录路径 + retention_days: 日志保留天数 + cleanup_interval_hours: 清理间隔(小时) + max_size_mb: 日志目录最大大小(MB) + enabled: 是否启用自动清理 + + Returns: + 日志清理器实例 + """ + if _state.cleaner is not None: + _state.cleaner.stop() + + _state.cleaner = LogCleaner( + log_dir=log_dir, + retention_days=retention_days, + cleanup_interval_hours=cleanup_interval_hours, + max_size_mb=max_size_mb, + enabled=enabled, + ) + _state.cleaner.start() + return _state.cleaner + + +def get_log_cleaner() -> LogCleaner | None: + """获取全局日志清理器实例。 + + Returns: + 日志清理器实例,如果未初始化则返回 None + """ + return _state.cleaner + + +def stop_log_cleaner() -> None: + """停止全局日志清理器。""" + if _state.cleaner is not None: + _state.cleaner.stop() + _state.cleaner = None diff --git a/flow_proxy_plugin/utils/logging.py b/flow_proxy_plugin/utils/logging.py index 2d87393..4d119b3 100644 --- a/flow_proxy_plugin/utils/logging.py +++ b/flow_proxy_plugin/utils/logging.py @@ -1,7 +1,12 @@ -"""Logging utilities with colored output.""" +"""Logging utilities with colored output and daily rotation.""" import logging +import os import sys +from dataclasses import dataclass +from logging.handlers import TimedRotatingFileHandler +from pathlib import Path +from typing import ClassVar # ANSI color codes @@ -10,8 +15,6 @@ class Colors: RESET = "\033[0m" BOLD = "\033[1m" - - # Bright foreground colors BRIGHT_BLACK = "\033[90m" BRIGHT_CYAN = "\033[96m" BRIGHT_YELLOW = "\033[93m" @@ -22,7 +25,7 @@ class Colors: class ColoredFormatter(logging.Formatter): """Custom formatter with colors for different log levels.""" - LEVEL_COLORS = { + LEVEL_COLORS: ClassVar[dict[int, str]] = { logging.DEBUG: Colors.BRIGHT_BLACK, logging.INFO: Colors.BRIGHT_CYAN, logging.WARNING: Colors.BRIGHT_YELLOW, @@ -32,87 +35,267 @@ class ColoredFormatter(logging.Formatter): def format(self, record: logging.LogRecord) -> str: """Format log record with colors.""" - # Add color to level name level_color = self.LEVEL_COLORS.get(record.levelno, "") record.levelname = f"{level_color}{record.levelname:8s}{Colors.RESET}" - - # Color the logger name record.name = f"{Colors.BRIGHT_BLACK}{record.name}{Colors.RESET}" - - # Format the message return super().format(record) +@dataclass +class FormatConfig: + """Log format configuration.""" + + console_format: str = "%(levelname)s %(name)s 
- %(message)s" + console_date_format: str = "%H:%M:%S" + file_format: str = "%(asctime)s - %(name)s - %(levelname)s - %(message)s" + file_date_format: str = "%Y-%m-%d %H:%M:%S" + + +@dataclass +class RotationConfig: + """Log rotation configuration.""" + + when: str = "midnight" + interval: int = 1 + backup_count: int = 0 + suffix: str = "%Y-%m-%d" + encoding: str = "utf-8" + + +@dataclass +class CleanupConfig: + """Log cleanup configuration.""" + + enabled: bool = True + retention_days: int = 7 + cleanup_interval_hours: int = 24 + max_size_mb: int = 100 + + @classmethod + def from_env(cls) -> "CleanupConfig": + """Create config from environment variables.""" + return cls( + enabled=os.getenv("FLOW_PROXY_LOG_CLEANUP_ENABLED", "true").lower() == "true", + retention_days=int(os.getenv("FLOW_PROXY_LOG_RETENTION_DAYS", "7")), + cleanup_interval_hours=int(os.getenv("FLOW_PROXY_LOG_CLEANUP_INTERVAL_HOURS", "24")), + max_size_mb=int(os.getenv("FLOW_PROXY_LOG_MAX_SIZE_MB", "100")), + ) + + +@dataclass +class LogConfig: + """Main log configuration.""" + + level: str = "INFO" + log_dir: str = "logs" + log_filename: str = "flow_proxy_plugin.log" + format: FormatConfig = None # type: ignore + rotation: RotationConfig = None # type: ignore + cleanup: CleanupConfig = None # type: ignore + + def __post_init__(self) -> None: + """Initialize nested configs with defaults if not provided.""" + if self.format is None: + self.format = FormatConfig() + if self.rotation is None: + self.rotation = RotationConfig() + if self.cleanup is None: + self.cleanup = CleanupConfig.from_env() + + @classmethod + def from_env(cls, level: str = "INFO", log_dir: str = "logs") -> "LogConfig": + """Create config from environment variables.""" + return cls( + level=level, + log_dir=log_dir, + format=FormatConfig(), + rotation=RotationConfig(), + cleanup=CleanupConfig.from_env(), + ) + + @property + def log_level(self) -> int: + """Get logging level as integer.""" + return getattr(logging, self.level.upper(), logging.INFO) + + @property + def log_dir_path(self) -> Path: + """Get log directory as Path object.""" + return Path(self.log_dir) + + @property + def log_file_path(self) -> Path: + """Get full log file path.""" + return self.log_dir_path / self.log_filename + + +class LoggerFactory: + """Factory for creating logging handlers and formatters.""" + + @staticmethod + def create_console_handler(config: LogConfig) -> logging.StreamHandler: + """Create console handler with colored formatter.""" + handler = logging.StreamHandler(sys.stdout) + handler.setFormatter( + ColoredFormatter( + fmt=config.format.console_format, + datefmt=config.format.console_date_format, + ) + ) + return handler + + @staticmethod + def create_file_handler(config: LogConfig) -> TimedRotatingFileHandler: + """Create rotating file handler for daily logs.""" + handler = TimedRotatingFileHandler( + filename=str(config.log_file_path), + when=config.rotation.when, + interval=config.rotation.interval, + backupCount=config.rotation.backup_count, + encoding=config.rotation.encoding, + ) + handler.suffix = config.rotation.suffix + handler.setFormatter( + logging.Formatter( + fmt=config.format.file_format, + datefmt=config.format.file_date_format, + ) + ) + return handler + + +class LogSetup: + """Main logging setup coordinator.""" + + def __init__(self, config: LogConfig): + """Initialize with configuration.""" + self.config = config + self._ensure_log_directory() + + def _ensure_log_directory(self) -> None: + """Ensure log directory exists.""" + 
self.config.log_dir_path.mkdir(parents=True, exist_ok=True) + + def configure_root_logger(self) -> None: + """Configure root logger with console and file handlers.""" + console_handler = LoggerFactory.create_console_handler(self.config) + file_handler = LoggerFactory.create_file_handler(self.config) + + logging.basicConfig( + level=self.config.log_level, + handlers=[console_handler, file_handler], + force=True, # Override any existing configuration + ) + + def initialize_cleaner(self) -> None: + """Initialize log cleaner for automatic cleanup.""" + from .log_cleaner import init_log_cleaner + + init_log_cleaner( + log_dir=self.config.log_dir_path, + retention_days=self.config.cleanup.retention_days, + cleanup_interval_hours=self.config.cleanup.cleanup_interval_hours, + max_size_mb=self.config.cleanup.max_size_mb, + enabled=self.config.cleanup.enabled, + ) + + def setup(self) -> None: + """Perform complete logging setup.""" + self.configure_root_logger() + self.initialize_cleaner() + + +def setup_logging(level: str = "INFO", log_dir: str = "logs") -> None: + """Setup logging configuration for the application. + + This is the main entry point for logging configuration. It: + - Creates a log configuration from environment variables + - Sets up console output with colors + - Configures daily rotating file logs + - Initializes automatic log cleanup + + Args: + level: Logging level (DEBUG, INFO, WARNING, ERROR) + log_dir: Log directory path + + Example: + >>> setup_logging(level="INFO", log_dir="logs") + # Creates logs/flow_proxy_plugin.log with daily rotation + # Old logs: logs/flow_proxy_plugin.log.2026-02-01, etc. + """ + config = LogConfig.from_env(level=level, log_dir=log_dir) + setup = LogSetup(config) + setup.setup() + + def setup_colored_logger( - logger: logging.Logger, log_level: str = "INFO", propagate: bool = False + logger: logging.Logger, + log_level: str = "INFO", + propagate: bool = False, ) -> None: - """Setup colored logger for a plugin. + """Setup colored console logger for a specific logger instance. + + This is useful for setting up individual module loggers with + colored output, separate from the root logger configuration. 
Args: logger: Logger instance to configure log_level: Log level string (DEBUG, INFO, WARNING, ERROR) - propagate: Whether to propagate logs to parent loggers (default: False) + propagate: Whether to propagate logs to parent loggers + + Example: + >>> logger = logging.getLogger(__name__) + >>> setup_colored_logger(logger, log_level="DEBUG") """ level = getattr(logging, log_level.upper(), logging.INFO) logger.setLevel(level) # Clear existing handlers - if logger.handlers: - logger.handlers.clear() + logger.handlers.clear() # Add colored console handler - console_handler = logging.StreamHandler(sys.stdout) - console_handler.setLevel(level) # Set handler level too - console_handler.setFormatter( - ColoredFormatter(fmt="%(levelname)s %(name)s - %(message)s", datefmt="%H:%M:%S") - ) + config = LogConfig(level=log_level) + console_handler = LoggerFactory.create_console_handler(config) + console_handler.setLevel(level) logger.addHandler(console_handler) - # Add file handler (get log file from environment) - import os + logger.propagate = propagate - log_file = os.getenv("FLOW_PROXY_LOG_FILE", "flow_proxy_plugin.log") - try: - file_handler = logging.FileHandler(log_file) - file_handler.setLevel(level) - file_handler.setFormatter( - logging.Formatter( - fmt="%(asctime)s - pid:%(process)d - %(name)s - %(levelname)s - %(message)s", - datefmt="%Y-%m-%d %H:%M:%S", - ) - ) - logger.addHandler(file_handler) - except Exception as e: - # If file handler fails, just log to console - logger.warning(f"Could not setup file handler: {e}") - logger.propagate = propagate # Allow control of propagation +def setup_file_handler_for_child_process( + logger: logging.Logger, + log_level: str = "INFO", + log_dir: str = "logs", +) -> None: + """Setup file handler for logger in child process. - -def setup_logging(level: str = "INFO", log_file: str = "flow_proxy_plugin.log") -> None: - """Setup logging configuration for the application. + This function creates a NEW file handler in the child process, + which is necessary because file handlers from the parent process + don't work correctly after fork() due to file descriptor issues. 
Args: - level: Logging level (DEBUG, INFO, WARNING, ERROR) - log_file: Path to log file + logger: Logger instance to add file handler to + log_level: Log level string (DEBUG, INFO, WARNING, ERROR) + log_dir: Log directory path + + Example: + >>> # In child process after fork + >>> logger = logging.getLogger(__name__) + >>> setup_file_handler_for_child_process(logger, "DEBUG", "logs") """ - # Console handler with colors - console_handler = logging.StreamHandler(sys.stdout) - console_handler.setFormatter( - ColoredFormatter(fmt="%(levelname)s %(name)s - %(message)s", datefmt="%H:%M:%S") - ) - - # File handler without colors - file_handler = logging.FileHandler(log_file) - file_handler.setFormatter( - logging.Formatter( - fmt="%(asctime)s - %(name)s - %(levelname)s - %(message)s", - datefmt="%Y-%m-%d %H:%M:%S", - ) - ) + # Create new config for child process + config = LogConfig.from_env(level=log_level, log_dir=log_dir) + + # Ensure log directory exists + config.log_dir_path.mkdir(parents=True, exist_ok=True) + + # Create NEW file handler (not copy from parent) + file_handler = LoggerFactory.create_file_handler(config) + file_handler.setLevel(getattr(logging, log_level.upper(), logging.INFO)) + + # Add file handler to logger (avoid duplicates) + for handler in logger.handlers: + if isinstance(handler, TimedRotatingFileHandler): + logger.removeHandler(handler) - # Configure root logger - logging.basicConfig( - level=getattr(logging, level.upper()), - handlers=[console_handler, file_handler], - ) + logger.addHandler(file_handler) diff --git a/poetry.lock b/poetry.lock index e38539b..48157c6 100644 --- a/poetry.lock +++ b/poetry.lock @@ -188,14 +188,14 @@ files = [ [[package]] name = "commitizen" -version = "4.11.6" +version = "4.12.1" description = "Python commitizen client tool" optional = false python-versions = "<4.0,>=3.10" groups = ["dev"] files = [ - {file = "commitizen-4.11.6-py3-none-any.whl", hash = "sha256:735073011e272f7fe2ed87e61225d33161741226b254b85213c9f50bba38d087"}, - {file = "commitizen-4.11.6.tar.gz", hash = "sha256:ed8aec7eba95eaa9c6c83958396e4c8ec831926cab26f80840f70afaf539c5f2"}, + {file = "commitizen-4.12.1-py3-none-any.whl", hash = "sha256:779438b4881803433342b32aab55485ece9c1f05be60add6399570811b03f9f0"}, + {file = "commitizen-4.12.1.tar.gz", hash = "sha256:3bf952793cf19466116e23802df56ca019c5d34aaaa4785bba718b556b3732c1"}, ] [package.dependencies] @@ -389,24 +389,24 @@ files = [ [[package]] name = "hypothesis" -version = "6.150.2" +version = "6.151.4" description = "The property-based testing library for Python" optional = false python-versions = ">=3.10" groups = ["dev"] files = [ - {file = "hypothesis-6.150.2-py3-none-any.whl", hash = "sha256:648d6a2be435889e713ba3d335b0fb5e7a250f569b56e6867887c1e7a0d1f02f"}, - {file = "hypothesis-6.150.2.tar.gz", hash = "sha256:deb043c41c53eaf0955f4a08739c2a34c3d8040ee3d9a2da0aa5470122979f75"}, + {file = "hypothesis-6.151.4-py3-none-any.whl", hash = "sha256:a1cf7e0fdaa296d697a68ff3c0b3912c0050f07aa37e7d2ff33a966749d1d9b4"}, + {file = "hypothesis-6.151.4.tar.gz", hash = "sha256:658a62da1c3ccb36746ac2f7dc4bb1a6e76bd314e0dc54c4e1aaba2503d5545c"}, ] [package.dependencies] sortedcontainers = ">=2.1.0,<3.0.0" [package.extras] -all = ["black (>=20.8b0)", "click (>=7.0)", "crosshair-tool (>=0.0.101)", "django (>=4.2)", "dpcontracts (>=0.4)", "hypothesis-crosshair (>=0.0.27)", "lark (>=0.10.1)", "libcst (>=0.3.16)", "numpy (>=1.21.6)", "pandas (>=1.1)", "pytest (>=4.6)", "python-dateutil (>=1.4)", "pytz (>=2014.1)", "redis (>=3.0.0)", "rich 
(>=9.0.0)", "tzdata (>=2025.3) ; sys_platform == \"win32\" or sys_platform == \"emscripten\"", "watchdog (>=4.0.0)"] +all = ["black (>=20.8b0)", "click (>=7.0)", "crosshair-tool (>=0.0.102)", "django (>=4.2)", "dpcontracts (>=0.4)", "hypothesis-crosshair (>=0.0.27)", "lark (>=0.10.1)", "libcst (>=0.3.16)", "numpy (>=1.21.6)", "pandas (>=1.1)", "pytest (>=4.6)", "python-dateutil (>=1.4)", "pytz (>=2014.1)", "redis (>=3.0.0)", "rich (>=9.0.0)", "tzdata (>=2025.3) ; sys_platform == \"win32\" or sys_platform == \"emscripten\"", "watchdog (>=4.0.0)"] cli = ["black (>=20.8b0)", "click (>=7.0)", "rich (>=9.0.0)"] codemods = ["libcst (>=0.3.16)"] -crosshair = ["crosshair-tool (>=0.0.101)", "hypothesis-crosshair (>=0.0.27)"] +crosshair = ["crosshair-tool (>=0.0.102)", "hypothesis-crosshair (>=0.0.27)"] dateutil = ["python-dateutil (>=1.4)"] django = ["django (>=4.2)"] dpcontracts = ["dpcontracts (>=0.4)"] @@ -1122,31 +1122,31 @@ use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"] [[package]] name = "ruff" -version = "0.14.13" +version = "0.14.14" description = "An extremely fast Python linter and code formatter, written in Rust." optional = false python-versions = ">=3.7" groups = ["dev"] files = [ - {file = "ruff-0.14.13-py3-none-linux_armv6l.whl", hash = "sha256:76f62c62cd37c276cb03a275b198c7c15bd1d60c989f944db08a8c1c2dbec18b"}, - {file = "ruff-0.14.13-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:914a8023ece0528d5cc33f5a684f5f38199bbb566a04815c2c211d8f40b5d0ed"}, - {file = "ruff-0.14.13-py3-none-macosx_11_0_arm64.whl", hash = "sha256:d24899478c35ebfa730597a4a775d430ad0d5631b8647a3ab368c29b7e7bd063"}, - {file = "ruff-0.14.13-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9aaf3870f14d925bbaf18b8a2347ee0ae7d95a2e490e4d4aea6813ed15ebc80e"}, - {file = "ruff-0.14.13-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ac5b7f63dd3b27cc811850f5ffd8fff845b00ad70e60b043aabf8d6ecc304e09"}, - {file = "ruff-0.14.13-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:78d2b1097750d90ba82ce4ba676e85230a0ed694178ca5e61aa9b459970b3eb9"}, - {file = "ruff-0.14.13-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:7d0bf87705acbbcb8d4c24b2d77fbb73d40210a95c3903b443cd9e30824a5032"}, - {file = "ruff-0.14.13-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a3eb5da8e2c9e9f13431032fdcbe7681de9ceda5835efee3269417c13f1fed5c"}, - {file = "ruff-0.14.13-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:642442b42957093811cd8d2140dfadd19c7417030a7a68cf8d51fcdd5f217427"}, - {file = "ruff-0.14.13-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4acdf009f32b46f6e8864af19cbf6841eaaed8638e65c8dac845aea0d703c841"}, - {file = "ruff-0.14.13-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:591a7f68860ea4e003917d19b5c4f5ac39ff558f162dc753a2c5de897fd5502c"}, - {file = "ruff-0.14.13-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:774c77e841cc6e046fc3e91623ce0903d1cd07e3a36b1a9fe79b81dab3de506b"}, - {file = "ruff-0.14.13-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:61f4e40077a1248436772bb6512db5fc4457fe4c49e7a94ea7c5088655dd21ae"}, - {file = "ruff-0.14.13-py3-none-musllinux_1_2_i686.whl", hash = "sha256:6d02f1428357fae9e98ac7aa94b7e966fd24151088510d32cf6f902d6c09235e"}, - {file = "ruff-0.14.13-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:e399341472ce15237be0c0ae5fbceca4b04cd9bebab1a2b2c979e015455d8f0c"}, - {file = "ruff-0.14.13-py3-none-win32.whl", hash = 
"sha256:ef720f529aec113968b45dfdb838ac8934e519711da53a0456038a0efecbd680"}, - {file = "ruff-0.14.13-py3-none-win_amd64.whl", hash = "sha256:6070bd026e409734b9257e03e3ef18c6e1a216f0435c6751d7a8ec69cb59abef"}, - {file = "ruff-0.14.13-py3-none-win_arm64.whl", hash = "sha256:7ab819e14f1ad9fe39f246cfcc435880ef7a9390d81a2b6ac7e01039083dd247"}, - {file = "ruff-0.14.13.tar.gz", hash = "sha256:83cd6c0763190784b99650a20fec7633c59f6ebe41c5cc9d45ee42749563ad47"}, + {file = "ruff-0.14.14-py3-none-linux_armv6l.whl", hash = "sha256:7cfe36b56e8489dee8fbc777c61959f60ec0f1f11817e8f2415f429552846aed"}, + {file = "ruff-0.14.14-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:6006a0082336e7920b9573ef8a7f52eec837add1265cc74e04ea8a4368cd704c"}, + {file = "ruff-0.14.14-py3-none-macosx_11_0_arm64.whl", hash = "sha256:026c1d25996818f0bf498636686199d9bd0d9d6341c9c2c3b62e2a0198b758de"}, + {file = "ruff-0.14.14-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f666445819d31210b71e0a6d1c01e24447a20b85458eea25a25fe8142210ae0e"}, + {file = "ruff-0.14.14-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:3c0f18b922c6d2ff9a5e6c3ee16259adc513ca775bcf82c67ebab7cbd9da5bc8"}, + {file = "ruff-0.14.14-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1629e67489c2dea43e8658c3dba659edbfd87361624b4040d1df04c9740ae906"}, + {file = "ruff-0.14.14-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:27493a2131ea0f899057d49d303e4292b2cae2bb57253c1ed1f256fbcd1da480"}, + {file = "ruff-0.14.14-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:01ff589aab3f5b539e35db38425da31a57521efd1e4ad1ae08fc34dbe30bd7df"}, + {file = "ruff-0.14.14-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1cc12d74eef0f29f51775f5b755913eb523546b88e2d733e1d701fe65144e89b"}, + {file = "ruff-0.14.14-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bb8481604b7a9e75eff53772496201690ce2687067e038b3cc31aaf16aa0b974"}, + {file = "ruff-0.14.14-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:14649acb1cf7b5d2d283ebd2f58d56b75836ed8c6f329664fa91cdea19e76e66"}, + {file = "ruff-0.14.14-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:e8058d2145566510790eab4e2fad186002e288dec5e0d343a92fe7b0bc1b3e13"}, + {file = "ruff-0.14.14-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:e651e977a79e4c758eb807f0481d673a67ffe53cfa92209781dfa3a996cf8412"}, + {file = "ruff-0.14.14-py3-none-musllinux_1_2_i686.whl", hash = "sha256:cc8b22da8d9d6fdd844a68ae937e2a0adf9b16514e9a97cc60355e2d4b219fc3"}, + {file = "ruff-0.14.14-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:16bc890fb4cc9781bb05beb5ab4cd51be9e7cb376bf1dd3580512b24eb3fda2b"}, + {file = "ruff-0.14.14-py3-none-win32.whl", hash = "sha256:b530c191970b143375b6a68e6f743800b2b786bbcf03a7965b06c4bf04568167"}, + {file = "ruff-0.14.14-py3-none-win_amd64.whl", hash = "sha256:3dde1435e6b6fe5b66506c1dff67a421d0b7f6488d466f651c07f4cab3bf20fd"}, + {file = "ruff-0.14.14-py3-none-win_arm64.whl", hash = "sha256:56e6981a98b13a32236a72a8da421d7839221fa308b223b9283312312e5ac76c"}, + {file = "ruff-0.14.14.tar.gz", hash = "sha256:2d0f819c9a90205f3a867dbbd0be083bee9912e170fd7d9704cc8ae45824896b"}, ] [[package]] diff --git a/tests/test_log_cleaner.py b/tests/test_log_cleaner.py new file mode 100644 index 0000000..7c26a8b --- /dev/null +++ b/tests/test_log_cleaner.py @@ -0,0 +1,248 @@ +"""Tests for log cleaner module.""" + +import time +from datetime import datetime, timedelta +from pathlib import 
Path + +import pytest + +from flow_proxy_plugin.utils.log_cleaner import ( + LogCleaner, + get_log_cleaner, + init_log_cleaner, + stop_log_cleaner, +) + + +@pytest.fixture +def temp_log_dir(tmp_path: Path) -> Path: + """Create a temporary log directory.""" + log_dir = tmp_path / "logs" + log_dir.mkdir() + return log_dir + + +@pytest.fixture +def log_cleaner(temp_log_dir: Path) -> LogCleaner: # type: ignore[misc] + """Create a log cleaner instance for testing.""" + cleaner = LogCleaner( + log_dir=temp_log_dir, + retention_days=1, + cleanup_interval_hours=1, + max_size_mb=1, + enabled=False, # Don't start automatically in tests + ) + yield cleaner + cleaner.stop() + + +def create_log_file(log_dir: Path, name: str, age_days: int = 0, size_mb: float = 0.1) -> Path: + """Create a test log file with specific age and size. + + Args: + log_dir: Directory to create the log file in + name: Name of the log file + age_days: Age of the file in days + size_mb: Size of the file in MB + + Returns: + Path to the created log file + """ + log_file = log_dir / name + + # Create file with specified size + content = "x" * int(size_mb * 1024 * 1024) + log_file.write_text(content) + + # Set modification time + if age_days > 0: + old_time = datetime.now() - timedelta(days=age_days) + timestamp = old_time.timestamp() + log_file.touch() + import os + os.utime(log_file, (timestamp, timestamp)) + + return log_file + + +class TestLogCleaner: + """Test cases for LogCleaner class.""" + + def test_init(self, temp_log_dir: Path) -> None: + """Test log cleaner initialization.""" + cleaner = LogCleaner( + log_dir=temp_log_dir, + retention_days=7, + cleanup_interval_hours=24, + max_size_mb=100, + enabled=True, + ) + + assert cleaner.log_dir == temp_log_dir + assert cleaner.retention_days == 7 + assert cleaner.cleanup_interval_hours == 24 + assert cleaner.max_size_mb == 100 + assert cleaner.enabled is True + + cleaner.stop() + + def test_cleanup_old_logs(self, log_cleaner: LogCleaner, temp_log_dir: Path) -> None: + """Test cleaning up old log files.""" + # Create test files + old_log = create_log_file(temp_log_dir, "old.log", age_days=2) + recent_log = create_log_file(temp_log_dir, "recent.log", age_days=0) + + # Run cleanup + result = log_cleaner.cleanup_logs() + + # Old log should be deleted, recent log should remain + assert not old_log.exists() + assert recent_log.exists() + assert result["deleted_files"] == 1 + assert result["freed_space_mb"] > 0 + + def test_cleanup_by_size(self, temp_log_dir: Path) -> None: + """Test cleaning up logs when total size exceeds limit.""" + # Create cleaner with 1MB limit + cleaner = LogCleaner( + log_dir=temp_log_dir, + retention_days=365, # Don't clean by age + cleanup_interval_hours=1, + max_size_mb=1, + enabled=False, + ) + + # Create multiple log files totaling more than 1MB + create_log_file(temp_log_dir, "log1.log", age_days=3, size_mb=0.5) + create_log_file(temp_log_dir, "log2.log", age_days=2, size_mb=0.5) + create_log_file(temp_log_dir, "log3.log", age_days=1, size_mb=0.5) + + # Run cleanup + result = cleaner.cleanup_logs() + + # Should delete oldest files to get under 1MB + assert result["deleted_files"] > 0 + + # Calculate remaining size + remaining_size = sum(f.stat().st_size for f in temp_log_dir.glob("*.log*")) + assert remaining_size <= 1 * 1024 * 1024 # Should be under 1MB + + def test_no_cleanup_when_disabled(self, temp_log_dir: Path) -> None: + """Test that cleanup doesn't run when disabled.""" + cleaner = LogCleaner( + log_dir=temp_log_dir, + retention_days=1, + 
cleanup_interval_hours=1, + max_size_mb=1, + enabled=False, + ) + + # Create old log file + old_log = create_log_file(temp_log_dir, "old.log", age_days=2) + + # Try to start (should not start) + cleaner.start() + + # File should still exist + assert old_log.exists() + + cleaner.stop() + + def test_get_log_stats(self, log_cleaner: LogCleaner, temp_log_dir: Path) -> None: + """Test getting log statistics.""" + # Create some test files + create_log_file(temp_log_dir, "log1.log", age_days=1, size_mb=0.5) + create_log_file(temp_log_dir, "log2.log", age_days=0, size_mb=0.3) + + # Get stats + stats = log_cleaner.get_log_stats() + + assert stats["total_files"] == 2 + assert stats["total_size_mb"] > 0 + assert stats["oldest_file"] is not None + assert stats["newest_file"] is not None + + def test_cleanup_nonexistent_directory(self, tmp_path: Path) -> None: + """Test cleanup with nonexistent directory.""" + nonexistent_dir = tmp_path / "nonexistent" + cleaner = LogCleaner( + log_dir=nonexistent_dir, + retention_days=7, + cleanup_interval_hours=1, + max_size_mb=100, + enabled=False, + ) + + result = cleaner.cleanup_logs() + + assert result["deleted_files"] == 0 + assert result["freed_space_mb"] == 0 + + def test_start_stop(self, log_cleaner: LogCleaner, temp_log_dir: Path) -> None: + """Test starting and stopping the cleaner.""" + log_cleaner.enabled = True + + # Start cleaner + log_cleaner.start() + assert log_cleaner._thread is not None + assert log_cleaner._thread.is_alive() + + # Stop cleaner + log_cleaner.stop() + time.sleep(0.1) # Give thread time to stop + assert log_cleaner._thread is None or not log_cleaner._thread.is_alive() + + def test_multiple_start_calls(self, log_cleaner: LogCleaner) -> None: + """Test that multiple start calls don't create multiple threads.""" + log_cleaner.enabled = True + + log_cleaner.start() + thread1 = log_cleaner._thread + + log_cleaner.start() # Try to start again + thread2 = log_cleaner._thread + + assert thread1 is thread2 # Should be same thread + + log_cleaner.stop() + + +class TestGlobalLogCleaner: + """Test cases for global log cleaner functions.""" + + def test_init_log_cleaner(self, temp_log_dir: Path) -> None: + """Test initializing global log cleaner.""" + cleaner = init_log_cleaner( + log_dir=temp_log_dir, + retention_days=7, + cleanup_interval_hours=24, + max_size_mb=100, + enabled=False, + ) + + assert cleaner is not None + assert get_log_cleaner() is cleaner + + stop_log_cleaner() + assert get_log_cleaner() is None + + def test_reinit_stops_previous(self, temp_log_dir: Path) -> None: + """Test that reinitializing stops the previous cleaner.""" + cleaner1 = init_log_cleaner( + log_dir=temp_log_dir, + retention_days=7, + cleanup_interval_hours=24, + enabled=False, + ) + + cleaner2 = init_log_cleaner( + log_dir=temp_log_dir, + retention_days=14, + cleanup_interval_hours=12, + enabled=False, + ) + + assert cleaner1 is not cleaner2 + assert get_log_cleaner() is cleaner2 + + stop_log_cleaner() diff --git a/tests/test_logging.py b/tests/test_logging.py new file mode 100644 index 0000000..da6dee5 --- /dev/null +++ b/tests/test_logging.py @@ -0,0 +1,520 @@ +"""Tests for logging utilities.""" + +import logging +import sys +from pathlib import Path +from unittest.mock import Mock, patch + +from flow_proxy_plugin.utils.logging import ( + CleanupConfig, + ColoredFormatter, + Colors, + FormatConfig, + LogConfig, + LoggerFactory, + LogSetup, + RotationConfig, + setup_colored_logger, + setup_file_handler_for_child_process, + setup_logging, +) + + +class 
TestColors: + """Tests for Colors class.""" + + def test_colors_defined(self) -> None: + """Test that all color constants are defined.""" + assert Colors.RESET == "\033[0m" + assert Colors.BOLD == "\033[1m" + assert Colors.BRIGHT_BLACK == "\033[90m" + assert Colors.BRIGHT_CYAN == "\033[96m" + assert Colors.BRIGHT_YELLOW == "\033[93m" + assert Colors.BRIGHT_RED == "\033[91m" + assert Colors.RED == "\033[31m" + + +class TestColoredFormatter: + """Tests for ColoredFormatter class.""" + + def test_format_with_colors(self) -> None: + """Test that formatter adds colors to log records.""" + formatter = ColoredFormatter(fmt="%(levelname)s %(name)s - %(message)s") + record = logging.LogRecord( + name="test.logger", + level=logging.INFO, + pathname="test.py", + lineno=1, + msg="Test message", + args=(), + exc_info=None, + ) + + formatted = formatter.format(record) + + assert Colors.BRIGHT_CYAN in formatted # INFO color + assert Colors.BRIGHT_BLACK in formatted # logger name color + assert Colors.RESET in formatted + assert "Test message" in formatted + + def test_format_different_levels(self) -> None: + """Test formatting with different log levels.""" + formatter = ColoredFormatter(fmt="%(levelname)s") + + levels = [ + (logging.DEBUG, Colors.BRIGHT_BLACK), + (logging.INFO, Colors.BRIGHT_CYAN), + (logging.WARNING, Colors.BRIGHT_YELLOW), + (logging.ERROR, Colors.BRIGHT_RED), + (logging.CRITICAL, Colors.RED + Colors.BOLD), + ] + + for level, expected_color in levels: + record = logging.LogRecord( + name="test", + level=level, + pathname="test.py", + lineno=1, + msg="Test", + args=(), + exc_info=None, + ) + formatted = formatter.format(record) + assert expected_color in formatted + + +class TestFormatConfig: + """Tests for FormatConfig class.""" + + def test_default_values(self) -> None: + """Test default configuration values.""" + config = FormatConfig() + + assert config.console_format == "%(levelname)s %(name)s - %(message)s" + assert config.console_date_format == "%H:%M:%S" + assert "%(asctime)s" in config.file_format + assert config.file_date_format == "%Y-%m-%d %H:%M:%S" + + def test_custom_values(self) -> None: + """Test custom configuration values.""" + config = FormatConfig( + console_format="%(message)s", + console_date_format="%H:%M", + file_format="%(levelname)s - %(message)s", + file_date_format="%Y-%m-%d", + ) + + assert config.console_format == "%(message)s" + assert config.console_date_format == "%H:%M" + assert config.file_format == "%(levelname)s - %(message)s" + assert config.file_date_format == "%Y-%m-%d" + + +class TestRotationConfig: + """Tests for RotationConfig class.""" + + def test_default_values(self) -> None: + """Test default configuration values.""" + config = RotationConfig() + + assert config.when == "midnight" + assert config.interval == 1 + assert config.backup_count == 0 + assert config.suffix == "%Y-%m-%d" + assert config.encoding == "utf-8" + + def test_custom_values(self) -> None: + """Test custom configuration values.""" + config = RotationConfig( + when="H", + interval=6, + backup_count=10, + suffix="%Y%m%d-%H%M%S", + encoding="utf-16", + ) + + assert config.when == "H" + assert config.interval == 6 + assert config.backup_count == 10 + assert config.suffix == "%Y%m%d-%H%M%S" + assert config.encoding == "utf-16" + + +class TestCleanupConfig: + """Tests for CleanupConfig class.""" + + def test_default_values(self) -> None: + """Test default configuration values.""" + config = CleanupConfig() + + assert config.enabled is True + assert config.retention_days == 7 + 
assert config.cleanup_interval_hours == 24 + assert config.max_size_mb == 100 + + def test_from_env_default(self) -> None: + """Test creating config from environment with defaults.""" + with patch.dict("os.environ", {}, clear=True): + config = CleanupConfig.from_env() + + assert config.enabled is True + assert config.retention_days == 7 + assert config.cleanup_interval_hours == 24 + assert config.max_size_mb == 100 + + def test_from_env_custom(self) -> None: + """Test creating config from environment with custom values.""" + env = { + "FLOW_PROXY_LOG_CLEANUP_ENABLED": "false", + "FLOW_PROXY_LOG_RETENTION_DAYS": "30", + "FLOW_PROXY_LOG_CLEANUP_INTERVAL_HOURS": "12", + "FLOW_PROXY_LOG_MAX_SIZE_MB": "500", + } + + with patch.dict("os.environ", env, clear=True): + config = CleanupConfig.from_env() + + assert config.enabled is False + assert config.retention_days == 30 + assert config.cleanup_interval_hours == 12 + assert config.max_size_mb == 500 + + +class TestLogConfig: + """Tests for LogConfig class.""" + + def test_default_values(self) -> None: + """Test default configuration values.""" + config = LogConfig() + + assert config.level == "INFO" + assert config.log_dir == "logs" + assert config.log_filename == "flow_proxy_plugin.log" + assert isinstance(config.format, FormatConfig) + assert isinstance(config.rotation, RotationConfig) + assert isinstance(config.cleanup, CleanupConfig) + + def test_log_level_property(self) -> None: + """Test log_level property returns correct integer.""" + config = LogConfig(level="DEBUG") + assert config.log_level == logging.DEBUG + + config = LogConfig(level="INFO") + assert config.log_level == logging.INFO + + config = LogConfig(level="WARNING") + assert config.log_level == logging.WARNING + + config = LogConfig(level="ERROR") + assert config.log_level == logging.ERROR + + def test_log_dir_path_property(self) -> None: + """Test log_dir_path property returns Path object.""" + config = LogConfig(log_dir="test_logs") + assert isinstance(config.log_dir_path, Path) + assert str(config.log_dir_path) == "test_logs" + + def test_log_file_path_property(self) -> None: + """Test log_file_path property returns correct path.""" + config = LogConfig(log_dir="test_logs", log_filename="test.log") + assert isinstance(config.log_file_path, Path) + assert str(config.log_file_path) == "test_logs/test.log" + + def test_from_env_default(self) -> None: + """Test creating config from environment with defaults.""" + with patch.dict("os.environ", {}, clear=True): + config = LogConfig.from_env(level="DEBUG", log_dir="custom_logs") + + assert config.level == "DEBUG" + assert config.log_dir == "custom_logs" + assert config.cleanup.enabled is True + assert config.cleanup.retention_days == 7 + + def test_from_env_custom(self) -> None: + """Test creating config from environment with custom values.""" + env = { + "FLOW_PROXY_LOG_CLEANUP_ENABLED": "true", + "FLOW_PROXY_LOG_RETENTION_DAYS": "14", + "FLOW_PROXY_LOG_CLEANUP_INTERVAL_HOURS": "6", + "FLOW_PROXY_LOG_MAX_SIZE_MB": "200", + } + + with patch.dict("os.environ", env, clear=True): + config = LogConfig.from_env() + + assert config.cleanup.enabled is True + assert config.cleanup.retention_days == 14 + assert config.cleanup.cleanup_interval_hours == 6 + assert config.cleanup.max_size_mb == 200 + + def test_nested_config_initialization(self) -> None: + """Test that nested configs are initialized properly.""" + config = LogConfig() + + assert config.format is not None + assert config.rotation is not None + assert config.cleanup is not None + 
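The configuration classes exercised by the tests above can also be checked interactively. The snippet below is a small illustrative sketch (not part of this patch) that sets two of the documented environment variables and reads them back through `LogConfig.from_env`; the printed values assume the defaults added in `flow_proxy_plugin/utils/logging.py`.

```python
import os

from flow_proxy_plugin.utils.logging import LogConfig

# Override two settings the same way a .env file would.
os.environ["FLOW_PROXY_LOG_RETENTION_DAYS"] = "14"
os.environ["FLOW_PROXY_LOG_MAX_SIZE_MB"] = "500"

config = LogConfig.from_env(level="INFO", log_dir="logs")
print(config.log_file_path)            # logs/flow_proxy_plugin.log
print(config.cleanup.retention_days)   # 14
print(config.cleanup.max_size_mb)      # 500
print(config.cleanup.enabled)          # True (FLOW_PROXY_LOG_CLEANUP_ENABLED defaults to true)
```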
+
+class TestLoggerFactory:
+    """Tests for LoggerFactory class."""
+
+    def test_create_console_handler(self) -> None:
+        """Test creating console handler."""
+        config = LogConfig()
+        handler = LoggerFactory.create_console_handler(config)
+
+        assert isinstance(handler, logging.StreamHandler)
+        assert handler.stream == sys.stdout
+        assert isinstance(handler.formatter, ColoredFormatter)
+
+    def test_create_file_handler(self, tmp_path: Path) -> None:
+        """Test creating file handler."""
+        config = LogConfig(log_dir=str(tmp_path))
+        handler = LoggerFactory.create_file_handler(config)
+
+        assert handler.baseFilename == str(tmp_path / "flow_proxy_plugin.log")
+        # TimedRotatingFileHandler converts 'when' to uppercase
+        assert handler.when == "MIDNIGHT"
+        # For 'midnight', interval is converted to seconds in a day
+        assert handler.interval == 86400
+        assert handler.backupCount == 0
+        assert handler.suffix == "%Y-%m-%d"
+        assert isinstance(handler.formatter, logging.Formatter)
+
+    def test_file_handler_custom_config(self, tmp_path: Path) -> None:
+        """Test creating file handler with custom configuration."""
+        rotation = RotationConfig(when="H", interval=6, backup_count=5, suffix="%Y%m%d")
+        config = LogConfig(log_dir=str(tmp_path), rotation=rotation)
+        handler = LoggerFactory.create_file_handler(config)
+
+        assert handler.when == "H"
+        # TimedRotatingFileHandler converts interval to seconds
+        # For 'H' (hours), 6 hours = 21600 seconds
+        assert handler.interval == 21600
+        assert handler.backupCount == 5
+        assert handler.suffix == "%Y%m%d"
+
+
+class TestLogSetup:
+    """Tests for LogSetup class."""
+
+    def test_init_creates_log_directory(self, tmp_path: Path) -> None:
+        """Test that initialization creates log directory."""
+        log_dir = tmp_path / "test_logs"
+        assert not log_dir.exists()
+
+        config = LogConfig(log_dir=str(log_dir))
+        LogSetup(config)
+
+        assert log_dir.exists()
+        assert log_dir.is_dir()
+
+    @patch("flow_proxy_plugin.utils.log_cleaner.init_log_cleaner")
+    def test_initialize_cleaner(self, mock_init: Mock, tmp_path: Path) -> None:
+        """Test initializing log cleaner."""
+        config = LogConfig(log_dir=str(tmp_path))
+        setup = LogSetup(config)
+        setup.initialize_cleaner()
+
+        mock_init.assert_called_once_with(
+            log_dir=config.log_dir_path,
+            retention_days=config.cleanup.retention_days,
+            cleanup_interval_hours=config.cleanup.cleanup_interval_hours,
+            max_size_mb=config.cleanup.max_size_mb,
+            enabled=config.cleanup.enabled,
+        )
+
+    def test_configure_root_logger(self, tmp_path: Path) -> None:
+        """Test configuring root logger."""
+        config = LogConfig(log_dir=str(tmp_path), level="DEBUG")
+        setup = LogSetup(config)
+
+        # Clear any existing handlers
+        logging.root.handlers.clear()
+
+        setup.configure_root_logger()
+
+        assert logging.root.level == logging.DEBUG
+        assert len(logging.root.handlers) == 2  # console + file
+
+    @patch("flow_proxy_plugin.utils.log_cleaner.init_log_cleaner")
+    def test_setup_complete(self, mock_init: Mock, tmp_path: Path) -> None:
+        """Test complete setup process."""
+        config = LogConfig(log_dir=str(tmp_path))
+        setup = LogSetup(config)
+
+        # Clear any existing handlers
+        logging.root.handlers.clear()
+
+        setup.setup()
+
+        # Check logger is configured
+        assert len(logging.root.handlers) == 2
+
+        # Check cleaner is initialized
+        mock_init.assert_called_once()
+
+
+class TestSetupLogging:
+    """Tests for setup_logging function."""
+
+    @patch("flow_proxy_plugin.utils.log_cleaner.init_log_cleaner")
+    def test_setup_logging_default(self, mock_init: Mock, tmp_path: Path) -> None:
+        """Test setup_logging with default parameters."""
+        # Clear any existing handlers
+        logging.root.handlers.clear()
+
+        with patch.dict("os.environ", {}, clear=True):
+            setup_logging(level="INFO", log_dir=str(tmp_path))
+
+        assert logging.root.level == logging.INFO
+        assert len(logging.root.handlers) == 2
+        mock_init.assert_called_once()
+
+    @patch("flow_proxy_plugin.utils.log_cleaner.init_log_cleaner")
+    def test_setup_logging_custom_level(self, mock_init: Mock, tmp_path: Path) -> None:
+        """Test setup_logging with custom log level."""
+        logging.root.handlers.clear()
+
+        with patch.dict("os.environ", {}, clear=True):
+            setup_logging(level="DEBUG", log_dir=str(tmp_path))
+
+        assert logging.root.level == logging.DEBUG
+
+
+class TestSetupColoredLogger:
+    """Tests for setup_colored_logger function."""
+
+    def test_setup_colored_logger_default(self) -> None:
+        """Test setup_colored_logger with default parameters."""
+        logger = logging.getLogger("test.logger")
+        setup_colored_logger(logger)
+
+        assert logger.level == logging.INFO
+        assert len(logger.handlers) == 1
+
+
+class TestSetupFileHandlerForChildProcess:
+    """Tests for setup_file_handler_for_child_process function."""
+
+    def test_setup_file_handler_default(self, tmp_path: Path) -> None:
+        """Test setup_file_handler_for_child_process with default parameters."""
+        from logging.handlers import TimedRotatingFileHandler
+
+        logger = logging.getLogger("test.child.process.logger")
+        logger.handlers.clear()
+
+        with patch.dict("os.environ", {}, clear=True):
+            setup_file_handler_for_child_process(logger, log_level="INFO", log_dir=str(tmp_path))
+
+        # Should have one file handler added
+        assert len(logger.handlers) == 1
+        handler = logger.handlers[0]
+        assert isinstance(handler, TimedRotatingFileHandler)
+        assert handler.baseFilename == str(tmp_path / "flow_proxy_plugin.log")
+
+    def test_setup_file_handler_custom_level(self, tmp_path: Path) -> None:
+        """Test setup_file_handler_for_child_process with custom log level."""
+        from logging.handlers import TimedRotatingFileHandler
+
+        logger = logging.getLogger("test.child.debug.logger")
+        logger.handlers.clear()
+
+        with patch.dict("os.environ", {}, clear=True):
+            setup_file_handler_for_child_process(logger, log_level="DEBUG", log_dir=str(tmp_path))
+
+        handler = logger.handlers[0]
+        assert isinstance(handler, TimedRotatingFileHandler)
+        assert handler.level == logging.DEBUG
+
+    def test_setup_file_handler_removes_existing_file_handlers(self, tmp_path: Path) -> None:
+        """Test that setup_file_handler_for_child_process removes existing file handlers."""
+        from logging.handlers import TimedRotatingFileHandler
+
+        logger = logging.getLogger("test.child.replace.logger")
+        logger.handlers.clear()
+
+        # Add an existing file handler
+        old_handler = TimedRotatingFileHandler(
+            filename=str(tmp_path / "old.log"),
+            when="midnight",
+        )
+        logger.addHandler(old_handler)
+        assert len(logger.handlers) == 1
+
+        # Add console handler (should not be removed)
+        console_handler = logging.StreamHandler()
+        logger.addHandler(console_handler)
+        assert len(logger.handlers) == 2
+
+        with patch.dict("os.environ", {}, clear=True):
+            setup_file_handler_for_child_process(logger, log_level="INFO", log_dir=str(tmp_path))
+
+        # Should have console handler + new file handler
+        assert len(logger.handlers) == 2
+        file_handlers = [h for h in logger.handlers if isinstance(h, TimedRotatingFileHandler)]
+        assert len(file_handlers) == 1
+        assert file_handlers[0].baseFilename == str(tmp_path / "flow_proxy_plugin.log")
+
+    def test_setup_file_handler_from_env(self, tmp_path: Path) -> None:
+        """Test setup_file_handler_for_child_process reads config from environment."""
+        from logging.handlers import TimedRotatingFileHandler
+
+        logger = logging.getLogger("test.child.env.logger")
+        logger.handlers.clear()
+
+        env = {
+            "FLOW_PROXY_LOG_CLEANUP_ENABLED": "true",
+            "FLOW_PROXY_LOG_RETENTION_DAYS": "30",
+        }
+
+        with patch.dict("os.environ", env, clear=True):
+            setup_file_handler_for_child_process(logger, log_level="INFO", log_dir=str(tmp_path))
+
+        # Should create handler successfully with env config
+        assert len(logger.handlers) == 1
+        handler = logger.handlers[0]
+        assert isinstance(handler, TimedRotatingFileHandler)
+        assert handler.baseFilename == str(tmp_path / "flow_proxy_plugin.log")
+
+    def test_setup_file_handler_creates_new_handler(self, tmp_path: Path) -> None:
+        """Test that a NEW file handler is created (not copied)."""
+        logger = logging.getLogger("test.child.new.logger")
+        logger.handlers.clear()
+
+        with patch.dict("os.environ", {}, clear=True):
+            setup_file_handler_for_child_process(logger, log_level="INFO", log_dir=str(tmp_path))
+
+        handler = logger.handlers[0]
+
+        # Verify it's a proper TimedRotatingFileHandler with correct config
+        from logging.handlers import TimedRotatingFileHandler
+        assert isinstance(handler, TimedRotatingFileHandler)
+        assert handler.when == "MIDNIGHT"
+        assert handler.interval == 86400  # 1 day in seconds
+        assert handler.suffix == "%Y-%m-%d"
+
+    def test_setup_file_handler_different_log_dirs(self, tmp_path: Path) -> None:
+        """Test setup_file_handler_for_child_process with different log directories."""
+        from logging.handlers import TimedRotatingFileHandler
+
+        logger1 = logging.getLogger("test.child.dir1.logger")
+        logger1.handlers.clear()
+
+        logger2 = logging.getLogger("test.child.dir2.logger")
+        logger2.handlers.clear()
+
+        log_dir1 = tmp_path / "logs1"
+        log_dir2 = tmp_path / "logs2"
+
+        with patch.dict("os.environ", {}, clear=True):
+            setup_file_handler_for_child_process(logger1, log_level="INFO", log_dir=str(log_dir1))
+            setup_file_handler_for_child_process(logger2, log_level="INFO", log_dir=str(log_dir2))
+
+        handler1 = logger1.handlers[0]
+        handler2 = logger2.handlers[0]
+        assert isinstance(handler1, TimedRotatingFileHandler)
+        assert isinstance(handler2, TimedRotatingFileHandler)
+        assert handler1.baseFilename == str(log_dir1 / "flow_proxy_plugin.log")
+        assert handler2.baseFilename == str(log_dir2 / "flow_proxy_plugin.log")
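For readers skimming the test diff without the source tree, this is a minimal sketch of the configuration surface the `TestCleanupConfig`/`TestLogConfig` cases assume. The field names, defaults, and `FLOW_PROXY_LOG_*` variables are taken from the assertions above; the dataclass layout and the `_env_bool` helper are illustrative assumptions, not the project's actual implementation.

```python
# Illustrative sketch only; the real CleanupConfig lives in flow_proxy_plugin.
import os
from dataclasses import dataclass


def _env_bool(name: str, default: bool) -> bool:
    # Hypothetical helper: treat "1"/"true"/"yes" (case-insensitive) as True.
    return os.getenv(name, str(default)).strip().lower() in ("1", "true", "yes")


@dataclass
class CleanupConfig:
    enabled: bool = True
    retention_days: int = 7
    cleanup_interval_hours: int = 24
    max_size_mb: int = 100

    @classmethod
    def from_env(cls) -> "CleanupConfig":
        # Read the documented FLOW_PROXY_LOG_* variables, falling back to defaults.
        return cls(
            enabled=_env_bool("FLOW_PROXY_LOG_CLEANUP_ENABLED", True),
            retention_days=int(os.getenv("FLOW_PROXY_LOG_RETENTION_DAYS", "7")),
            cleanup_interval_hours=int(os.getenv("FLOW_PROXY_LOG_CLEANUP_INTERVAL_HOURS", "24")),
            max_size_mb=int(os.getenv("FLOW_PROXY_LOG_MAX_SIZE_MB", "100")),
        )
```

Under this sketch, `CleanupConfig.from_env()` with an empty environment yields exactly the defaults the tests check: enabled, 7 days retention, 24-hour interval, 100 MB cap.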
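The `"MIDNIGHT"`, `86400`, and `21600` assertions follow from how the standard-library `TimedRotatingFileHandler` normalizes its arguments: it uppercases `when` and converts the rotation interval to seconds (one day for `midnight`, `interval * 3600` for `H`). Below is a standalone sketch of building such a daily-rotating handler, assuming only the stdlib; `make_daily_file_handler` is a hypothetical name, not the plugin's actual factory.

```python
# Sketch: a per-day rotating file handler matching the settings asserted above.
import logging
from logging.handlers import TimedRotatingFileHandler
from pathlib import Path


def make_daily_file_handler(log_dir: str, filename: str = "flow_proxy_plugin.log") -> TimedRotatingFileHandler:
    """Create a handler that rotates at midnight and names old files by date."""
    log_path = Path(log_dir)
    log_path.mkdir(parents=True, exist_ok=True)

    handler = TimedRotatingFileHandler(
        filename=str(log_path / filename),
        when="midnight",   # stored as "MIDNIGHT" on the handler instance
        interval=1,        # normalized to 86400 seconds (one day)
        backupCount=0,     # the handler itself never deletes rotated files
        encoding="utf-8",
    )
    handler.suffix = "%Y-%m-%d"  # rotated files: flow_proxy_plugin.log.2026-02-01
    handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(name)s - %(message)s"))
    return handler
```

With `backupCount=0` the handler keeps every rotated file, which is presumably why retention is enforced by the separate log cleanup task rather than by the handler.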