Production-ready FastAPI backend with async LLM generation via Celery + Redis, JWT auth, and Docker Compose orchestration.
- JWT authentication based on fastapi-users
- User registration and user management endpoints
- Async feature specification generation with Celery tasks
- Redis as Celery broker/result backend
- Ollama integration for LLM responses
- Readiness and health probes for runtime checks
- Alembic database migrations
- Security middleware baseline:
  - CORS
  - Trusted Host
  - Optional HTTPS redirect
  - Request ID propagation
  - Security headers (CSP, HSTS in production, etc.)
  - In-memory rate limiting for auth paths
  - Suspicious-request logging
Tech stack:
- Python 3.10
- FastAPI
- SQLAlchemy 2.0
- Alembic
- fastapi-users
- Celery
- Redis
- PostgreSQL
- Ollama
- Pytest
Request flow:
- Client calls `POST /api/v1/feature-spec/generate`.
- API stores a run row and enqueues a Celery task to Redis.
- API returns immediately with `task_id` and `processing` status.
- Celery worker calls Ollama and persists success/error in the DB.
- Client polls `GET /api/v1/feature-spec/tasks/{task_id}` for the result.
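A minimal client-side sketch of this flow (illustrative only; assumes the API is reachable on the host port 8005 used below, that a JWT access token was already obtained via the login endpoint, and that the API accepts a standard `Authorization: Bearer` header):

```python
import time

import requests  # any HTTP client works; requests is used here for brevity

BASE_URL = "http://localhost:8005/api/v1"             # host port from the Compose mapping
HEADERS = {"Authorization": "Bearer <access-token>"}  # token from /api/v1/auth/jwt/login

# 1. Enqueue generation; the API answers immediately with a task id.
resp = requests.post(
    f"{BASE_URL}/feature-spec/generate",
    json={"feature_idea": "payment for premium posts"},
    headers=HEADERS,
)
resp.raise_for_status()
task_id = resp.json()["task_id"]

# 2. Poll the task endpoint until the Celery worker has finished.
while True:
    task = requests.get(f"{BASE_URL}/feature-spec/tasks/{task_id}", headers=HEADERS).json()
    if task["status"] != "PENDING":  # e.g. SUCCESS once the worker has persisted the result
        break
    time.sleep(2)

print(task)
```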
Project structure:
- app/main.py: FastAPI app bootstrap and router registration
- app/core/: settings, database wiring, startup/lifespan, bootstrap logic
- app/api/: health, readiness, OpenAPI customization
- app/middlewares/: security middleware composition and implementations
- app/modules/auth/: auth domain (models, schemas, dependencies, router)
- app/modules/feature_spec/: API layer + application services for feature spec
- app/infrastructure/: Celery app, task workers, Ollama client
- app/scripts/: utility scripts (admin and prompt/model bootstrap)
- alembic/: migration config and versions
- docker-compose.yml: app + celery-worker + redis + ollama
Requirements:
- Python 3.10+
- PostgreSQL database
- Docker + Docker Compose (recommended)
Create .env in the project root and set required values.
Required minimum:
- DATABASE_URL
- SECRET_KEY
- CELERY_BROKER_URL
- CELERY_RESULT_BACKEND
Recommended auth bootstrap values:
- AUTH_USERNAME
- AUTH_EMAIL
- AUTH_PASSWORD (or AUTH_PASSWORD_HASH)
LLM values:
- OLLAMA_BASE_URL
- OLLAMA_MODEL
- OLLAMA_TIMEOUT
Compose ports (host -> container):
- FASTAPI_HOST_PORT=8005
- FASTAPI_CONTAINER_PORT=8001
- REDIS_HOST_PORT=6380
- REDIS_CONTAINER_PORT=6379
For Docker Compose in this project use:
- OLLAMA_BASE_URL=http://ollama:11434
- CELERY_BROKER_URL=redis://redis:6379/0
- CELERY_RESULT_BACKEND=redis://redis:6379/1
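For reference, a possible `.env` for the Docker Compose setup. All values below are placeholders or assumptions (including the database driver prefix and the model name); adjust them to your environment:

```env
# Database (external to Compose) — driver prefix shown is an assumption
DATABASE_URL=postgresql+asyncpg://app_user:app_password@your-db-host:5432/app_db

# Security
SECRET_KEY=change-me-to-a-long-random-string

# Celery / Redis (service name "redis" inside the Compose network)
CELERY_BROKER_URL=redis://redis:6379/0
CELERY_RESULT_BACKEND=redis://redis:6379/1

# Admin bootstrap
AUTH_USERNAME=admin
AUTH_EMAIL=admin@example.com
AUTH_PASSWORD=change-me

# LLM
OLLAMA_BASE_URL=http://ollama:11434
OLLAMA_MODEL=llama3
OLLAMA_TIMEOUT=120

# Compose port mapping (host -> container)
FASTAPI_HOST_PORT=8005
FASTAPI_CONTAINER_PORT=8001
REDIS_HOST_PORT=6380
REDIS_CONTAINER_PORT=6379
```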
Local run:

```
python -m venv venv

# Windows
venv\Scripts\activate
# Linux/macOS
source venv/bin/activate

pip install --upgrade pip
pip install -r requirements.txt
pip install -r requirements-dev.txt

python -m alembic upgrade head
python -m app.scripts.bootstrap_admin
python -m app.scripts.bootstrap_prompt_template
```

```
# terminal 1: api
python -m uvicorn app.main:app --host 0.0.0.0 --port 8005

# terminal 2: worker
celery -A app.infrastructure.celery_app:celery_app worker --loglevel=INFO
```

Run with Docker Compose:

```
docker compose up -d --build
```

Notes:
- Container entrypoint automatically runs:
  - migration + DB head check (`python -m app.scripts.migrate_and_check`)
  - admin bootstrap script (`python -m app.scripts.bootstrap_admin`)
  - prompt template bootstrap script (`python -m app.scripts.bootstrap_prompt_template`)
  - Ollama model bootstrap (`python -m app.scripts.ensure_ollama_model`)
  - uvicorn app startup
- Celery worker runs in a dedicated container (`celery-worker`).
- Redis runs only in Docker Compose and is used internally by the service name `redis`.
- On first deploy, startup may take longer while the configured `OLLAMA_MODEL` is downloaded.
- FastAPI container reaches Ollama via the internal Docker network URL: http://ollama:11434
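To check whether the configured model has finished downloading on first deploy (a quick check, assuming the Compose service is named `ollama` as in docker-compose.yml above):

```
docker compose logs -f ollama
docker compose exec ollama ollama list
```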
Verify services:
```
docker compose ps
docker compose logs -f app
docker compose logs -f celery-worker
```

When the server is running locally (custom port):
- Swagger UI: http://localhost:8005/docs
- ReDoc: http://localhost:8005/redoc (pinned ReDoc 2.x script)
- OpenAPI JSON: http://localhost:8005/openapi.json
For Docker Compose deployment (default host mapping):
- Swagger UI: http://localhost:8005/docs
- ReDoc: http://localhost:8005/redoc
- OpenAPI JSON: http://localhost:8005/openapi.json
Base API prefix: /api/v1
- GET /health
- GET /ready
- POST /api/v1/auth/jwt/login
- POST /api/v1/auth/jwt/refresh
- POST /api/v1/auth/jwt/logout
- POST /api/v1/auth/register
- GET /api/v1/auth/users/me
- PATCH /api/v1/auth/users/me
- GET /api/v1/auth/users/{id}
- PATCH /api/v1/auth/users/{id}
- DELETE /api/v1/auth/users/{id}
Login note:
- Endpoint `/api/v1/auth/jwt/login` uses `application/x-www-form-urlencoded`.
- In form field `username`, pass only `AUTH_USERNAME`.
- Login response returns `access_token` in the body and sets an HttpOnly refresh cookie.
- Use `/api/v1/auth/jwt/refresh` to get a new access token and refresh cookie.
- `/api/v1/auth/jwt/logout` clears the refresh cookie on the client side.
- Swagger `Authorize` value must contain only the raw JWT token (without the `Bearer` prefix).
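A concrete example of this login flow (a sketch with placeholder credentials; assumes the API runs on localhost:8005 and expects a standard `Authorization: Bearer` header for API calls — the "no Bearer prefix" note above applies only to the Swagger Authorize dialog):

```python
import requests

BASE_URL = "http://localhost:8005/api/v1"

session = requests.Session()  # keeps the HttpOnly refresh cookie between calls

# Login is form-encoded, not JSON; pass AUTH_USERNAME in the "username" field.
resp = session.post(
    f"{BASE_URL}/auth/jwt/login",
    data={"username": "admin", "password": "change-me"},  # placeholder credentials
)
resp.raise_for_status()
access_token = resp.json()["access_token"]

# Call a protected endpoint with the access token in the Authorization header.
me = session.get(
    f"{BASE_URL}/auth/users/me",
    headers={"Authorization": f"Bearer {access_token}"},
)
print(me.json())

# When the access token expires, exchange the refresh cookie for a new one.
new_tokens = session.post(f"{BASE_URL}/auth/jwt/refresh")
print(new_tokens.json()["access_token"])
```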
- POST /api/v1/feature-spec/generate
- GET /api/v1/feature-spec/tasks/{task_id}
- GET /api/v1/feature-spec/history?limit=10
Generate request:

```json
{
  "feature_idea": "payment for premium posts"
}
```

Generate response:

```json
{
  "task_id": "3b44daff-1e83-4328-925f-62c22a9163d2",
  "status": "processing"
}
```

Task status response examples:

```json
{
  "task_id": "3b44daff-1e83-4328-925f-62c22a9163d2",
  "status": "PENDING"
}
```

```json
{
  "task_id": "3b44daff-1e83-4328-925f-62c22a9163d2",
  "status": "SUCCESS",
  "result": {
    "run_id": 10,
    "status": "success",
    "feature_idea": "payment for premium posts",
    "feature_summary": {
      "user_stories": [],
      "acceptance_criteria": [],
      "db_models_and_api_endpoints": {
        "db_models": [],
        "api_endpoints": []
      },
      "risk_assessment": []
    }
  }
}
```

Run linter:

```
python -m flake8 .
```

Run auth unit tests:

```
pytest tests/modules/auth -q
```

Run all tests:

```
pytest -q
```

Security notes:
- Replace the default SECRET_KEY in production.
- Use explicit ALLOWED_ORIGINS and SECURITY_TRUSTED_HOSTS in production.
- Enable SECURITY_ENABLE_HTTPS_REDIRECT behind proper TLS/reverse proxy setup.
- In-memory rate limiting is a baseline; for high-load use Redis-based limiting.
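If you outgrow the in-memory limiter, a Redis-backed fixed-window counter is a common replacement. A minimal sketch (not part of this codebase; assumes redis-py is installed and a Redis DB separate from the broker/result backend):

```python
import redis

# Placeholder URL: a dedicated Redis DB for rate limiting.
r = redis.Redis.from_url("redis://redis:6379/2")

def allow_request(client_id: str, limit: int = 10, window_seconds: int = 60) -> bool:
    """Fixed window: allow at most `limit` requests per `window_seconds` per client."""
    key = f"ratelimit:{client_id}"
    count = r.incr(key)                    # atomic counter shared by all app instances
    if count == 1:
        r.expire(key, window_seconds)      # first hit in the window starts the expiry clock
    return count <= limit
```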
Troubleshooting:

If migrations fail:

```
python -m alembic upgrade head
```

If the app cannot connect to the DB:
- Verify DATABASE_URL
- Verify DB network access and sslmode if needed

If Celery tasks stay in PENDING:
- Check the worker is healthy: `docker compose ps`
- Check worker logs: `docker compose logs -f celery-worker`
- Verify Redis URLs in `.env` point to the `redis` service inside the Docker network (see the connectivity sketch at the end of this section)

If LLM requests fail or time out:
- Verify OLLAMA_BASE_URL
- Ensure Ollama is running and the model is available
- For Docker deployment, ensure OLLAMA_BASE_URL is http://ollama:11434
- Increase `OLLAMA_TIMEOUT` for long generations

If compose prints "variable is not set" for random token-like names:
- Your `.env` likely has `$` inside secret values
- Escape `$` as `$$` in `.env` values used by Docker Compose
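For the Celery/Redis checks above, a quick connectivity probe can help narrow things down. This is only a sketch; it assumes it is run inside the app or worker container (or any environment with the project's dependencies and `.env` variables available):

```python
import os

import redis
from celery import Celery

# Ping the broker and result backend using the URLs from .env.
for name in ("CELERY_BROKER_URL", "CELERY_RESULT_BACKEND"):
    url = os.environ[name]
    print(name, "->", redis.Redis.from_url(url).ping())

# Ask running workers to respond; an empty list means no worker is consuming tasks.
app = Celery(broker=os.environ["CELERY_BROKER_URL"])
print("workers:", app.control.ping(timeout=2))
```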
