
Phase 4.4: Extract finetune endpoints to domain modules#570

Merged
shikhalev merged 1 commit into `main` from `server/phase-4.4-finetune` on Mar 13, 2026

Conversation

@shikhalev
Collaborator

Summary

  • Extract 17 LLM finetune endpoints (dataset CRUD, training control, LoRA adapters) → `modules/llm/router_finetune.py` (~260 lines)
  • Extract 13 TTS finetune endpoints (samples, transcription, training, models) → `modules/speech/router_finetune.py` (~150 lines)
  • Conditional registration: both routers are loaded only when `DEPLOYMENT_MODE != "cloud"` (GPU-only)
  • Orchestrator reduced: 2875 → 2471 lines (-404 lines, -14%)

Key changes

| File | Action | Lines |
| --- | --- | --- |
| `modules/llm/router_finetune.py` | New: 4 Pydantic models + 17 endpoints | ~260 |
| `modules/speech/router_finetune.py` | New: 13 endpoints, try/except for optional manager | ~150 |
| `orchestrator.py` | Removed extracted code + unused imports | -404 |

Design decisions

  • Used `prefix="/admin/finetune"` and `prefix="/admin/tts-finetune"` on the routers to keep URL paths identical
  • `tts_finetune_manager` remains optional (try/except import) in the new module
  • `finetune_manager` is always available (non-optional import), matching the orchestrator's previous behavior
  • The training-log SSE endpoint preserves its `require_permission("system", "view")` auth guard

Test plan

  • `ruff check` passes on all changed files
  • `ruff format --check` passes
  • `pytest tests/` — all 65 tests pass
  • Verify `/admin/finetune/config` returns config (GPU mode)
  • Verify `/admin/tts-finetune/config` returns config (GPU mode)

Closes #549

🤖 Generated with Claude Code

Phase 4.4 of orchestrator decomposition (Strangler Fig):
- Extract 17 LLM finetune endpoints (dataset, training, adapters) to
  modules/llm/router_finetune.py (~260 lines)
- Extract 13 TTS finetune endpoints (samples, training, models) to
  modules/speech/router_finetune.py (~150 lines)
- Move 4 Pydantic models (DatasetProcessRequest, GenerateProjectDatasetRequest,
  FinetuneConfigRequest, AdapterRequest) to LLM finetune router
- Conditional registration: GPU-only (DEPLOYMENT_MODE != "cloud")
- Remove unused imports: get_finetune_manager, get_tts_finetune_manager,
  TTS_FINETUNE_AVAILABLE, File, UploadFile
- Orchestrator: 2875 → 2471 lines (-404)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@shikhalev shikhalev merged commit 10cdb25 into main Mar 13, 2026
3 checks passed


Development

Successfully merging this pull request may close these issues.

Phase 4.4: Admin finetune endpoints → domain modules
