# FastAPI Boilerplate

A production-ready FastAPI boilerplate with async PostgreSQL, SQLAlchemy, and Alembic migrations. This template provides a solid foundation for building scalable REST APIs with modern Python practices.
## Features

- FastAPI with async/await support and automatic API documentation
- PostgreSQL with async SQLAlchemy ORM
- Alembic for database migrations
- Environment-based configuration with type safety
- Pydantic schemas for request/response validation
- CORS middleware with configurable origins
- CRUD operations with pagination support
- Modular architecture with clear separation of concerns
- API documentation with Swagger UI and ReDoc
- Production-ready settings and configurations
## Project Structure

```
fastapi-boilerplate/
├── alembic/                  # Database migrations
│   ├── versions/             # Migration files (auto-generated)
│   ├── env.py                # Alembic environment configuration
│   ├── script.py.mako        # Migration template
│   └── README                # Alembic readme
├── core/                     # Core application modules
│   ├── __init__.py
│   ├── config.py             # Centralized configuration
│   └── router.py             # Main router configuration
├── database/                 # Database configuration
│   ├── __init__.py
│   └── session.py            # Async SQLAlchemy setup
├── models/                   # SQLAlchemy data models
│   ├── __init__.py           # Model imports and exports
│   ├── common_model.py       # Base model with common fields
│   └── user.py               # User-specific models (example)
├── schemas/                  # Pydantic schemas
│   ├── __init__.py
│   └── user.py               # User request/response schemas (example)
├── utils/                    # Utility modules
│   ├── __init__.py
│   └── response_helpers.py   # API helper functions
├── api/                      # API endpoints
│   ├── __init__.py
│   └── users.py              # User-related API endpoints (example)
├── main.py                   # FastAPI application entry point
├── requirements.txt          # Python dependencies
├── alembic.ini               # Alembic configuration file
├── .env.example              # Environment variables template
├── .gitignore                # Git ignore patterns
└── LICENSE                   # License file
```
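As a rough sketch of what `database/session.py` might do, the async engine URL can be assembled from the same variables that `.env` defines. The helper name `build_database_url` and its defaults are illustrative assumptions, not part of the template:

```python
import os


def build_database_url() -> str:
    """Assemble the asyncpg connection URL from environment variables.

    Falls back to the defaults shown in .env.example below.
    """
    user = os.getenv("DB_USER", "postgres")
    password = os.getenv("DB_PASSWORD", "password")
    host = os.getenv("DB_HOST", "localhost")
    port = os.getenv("DB_PORT", "5432")
    name = os.getenv("DB_NAME", "fastapi_db")
    return f"postgresql+asyncpg://{user}:{password}@{host}:{port}/{name}"


# The resulting URL is what the async engine would be built from, e.g.:
# engine = create_async_engine(build_database_url(), pool_pre_ping=True, pool_recycle=300)
```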
## Prerequisites

- Python 3.8+
- PostgreSQL (local installation or Docker)
## Quick Start

1. Clone and set up the project:

   ```bash
   git clone <your-repo-url>
   cd fastapi-boilerplate
   python -m venv venv
   # Windows
   venv\Scripts\activate
   # Linux/Mac
   source venv/bin/activate
   ```

2. Set up PostgreSQL:

   Option A: Local PostgreSQL
   - Install PostgreSQL locally
   - Create a user and database

   Option B: Docker
   ```bash
   docker run --name postgres -e POSTGRES_PASSWORD=password -p 5432:5432 -d postgres
   ```

3. Configure environment variables:

   ```bash
   cp .env.example .env
   # Edit .env with your database configuration
   ```

4. Install dependencies and set up the database:

   ```bash
   make setup
   ```

5. Start the API server:

   ```bash
   make run
   ```
## Manual Setup

1. Clone and set up the project:

   ```bash
   git clone <your-repo-url>
   cd fastapi-boilerplate
   python -m venv venv
   # Windows
   venv\Scripts\activate
   # Linux/Mac
   source venv/bin/activate
   pip install -r requirements.txt
   ```

2. Set up PostgreSQL:

   Option A: Local PostgreSQL
   - Install PostgreSQL locally
   - Create a user and database

   Option B: Docker
   ```bash
   docker run --name postgres -e POSTGRES_PASSWORD=password -p 5432:5432 -d postgres
   ```

3. Configure environment variables:

   ```bash
   cp .env.example .env
   # Edit .env with your database configuration
   ```

4. Run the initial migration:

   ```bash
   alembic revision --autogenerate -m "Initial migration"
   alembic upgrade head
   ```

   Or use the setup script:

   ```bash
   python scripts/setup.py
   ```

5. Start the API server:

   ```bash
   python main.py
   ```
## Configuration

Copy `.env.example` to `.env` and configure your settings:

```env
DB_USER=postgres
DB_PASSWORD=password
DB_NAME=fastapi_db
DB_HOST=localhost
DB_PORT=5432
DB_ECHO=true
DB_POOL_PRE_PING=true
DB_POOL_RECYCLE=300

API_TITLE=FastAPI Boilerplate
API_DESCRIPTION=A production-ready FastAPI boilerplate with async PostgreSQL
API_VERSION=1.0.0
API_HOST=0.0.0.0
API_PORT=8000
API_RELOAD=true

CORS_ALLOW_ORIGINS=*
CORS_ALLOW_CREDENTIALS=true
CORS_ALLOW_METHODS=*
CORS_ALLOW_HEADERS=*

ENVIRONMENT=development
LOG_LEVEL=INFO
```

The application automatically adjusts its behavior based on the `ENVIRONMENT` variable:
- **Development** (`ENVIRONMENT=development`):
  - API documentation enabled at `/docs` and `/redoc`
  - Auto-reload enabled if `API_RELOAD=true`
  - Database echoing enabled if `DB_ECHO=true`

- **Production** (`ENVIRONMENT=production`):
  - API documentation disabled for security
  - Auto-reload disabled regardless of the `API_RELOAD` setting
  - More restrictive default settings
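A minimal sketch of how this switching could look in `main.py`. The helper `app_settings` and its exact return shape are illustrative assumptions; the template's actual logic may differ:

```python
import os


def app_settings(environment: str, reload_flag: bool) -> dict:
    """Derive app/server options from the environment.

    In production, docs are disabled and auto-reload is forced off.
    """
    if environment == "production":
        return {"docs_url": None, "redoc_url": None, "reload": False}
    return {"docs_url": "/docs", "redoc_url": "/redoc", "reload": reload_flag}


settings = app_settings(
    os.getenv("ENVIRONMENT", "development"),
    os.getenv("API_RELOAD", "true").lower() == "true",
)
```

The resulting keyword arguments would then be split between `FastAPI(docs_url=..., redoc_url=...)` and the server's `reload` option.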
## API Documentation

Once the server is running, visit (in development mode):
- Swagger UI: http://localhost:8000/docs
- ReDoc: http://localhost:8000/redoc
Note: Documentation is automatically disabled in production for security.
## API Endpoints

The boilerplate includes example CRUD operations for users:

- `GET /` - Root endpoint with API information and environment details
- `POST /api/v1/users/register` - Create a new user
- `GET /api/v1/users/` - List all users (with pagination)
- `GET /api/v1/users/{user_id}` - Get a user by ID
- `PUT /api/v1/users/{user_id}` - Update a user
- `DELETE /api/v1/users/{user_id}` - Delete a user
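The list endpoint's pagination boils down to offset/limit arithmetic. The parameter names (`skip`, `limit`) and the cap of 100 below are assumptions; check `api/users.py` for the real signature:

```python
def clamp_pagination(skip: int = 0, limit: int = 10, max_limit: int = 100) -> tuple:
    """Normalize pagination query parameters before they reach the database query."""
    skip = max(skip, 0)                    # no negative offsets
    limit = min(max(limit, 1), max_limit)  # at least 1 row, at most max_limit
    return skip, limit


# e.g. GET /api/v1/users/?skip=20&limit=10 -> rows 21-30
```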
## User Model

The users table includes:

- `id` (integer, primary key)
- `username` (string, unique, required)
- `email` (string, unique, required)
- `full_name` (string, optional)
- `is_active` (boolean, default: `True`)
- `is_superuser` (boolean, default: `False`)
- `created_at` (timestamp, auto-generated)
- `updated_at` (timestamp, auto-updated)
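A sketch of how `models/user.py` might define this table. The column list follows the fields above; `Base` and the exact default/timestamp choices are assumptions (the template keeps its base in `common_model.py`):

```python
from datetime import datetime

from sqlalchemy import Boolean, Column, DateTime, Integer, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()


class User(Base):
    __tablename__ = "users"

    id = Column(Integer, primary_key=True)
    username = Column(String, unique=True, nullable=False)
    email = Column(String, unique=True, nullable=False)
    full_name = Column(String, nullable=True)
    is_active = Column(Boolean, default=True)
    is_superuser = Column(Boolean, default=False)
    created_at = Column(DateTime, default=datetime.utcnow)
    updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)
```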
## Database Migrations

This project uses Alembic for database migrations with async SQLAlchemy support.

```bash
# Create a new migration
alembic revision --autogenerate -m "Description of changes"

# Apply migrations
alembic upgrade head

# Check migration status
alembic current

# Downgrade one revision
alembic downgrade -1
```

## Development Commands

Use the Makefile for common development tasks:
```bash
make help     # Show all available commands
make setup    # Install dependencies and setup database
make run      # Start the development server
make migrate  # Apply database migrations
make clean    # Clean up cache files
```

Or use the setup script directly:

```bash
python scripts/setup.py
```

The setup process will:

- Check prerequisites (Python version, `.env` file)
- Install dependencies
- Create the initial database migration
- Apply migrations to set up the database
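The prerequisite check in a script like `scripts/setup.py` might boil down to something like this. This is a stdlib-only sketch under assumed behavior; the real script's checks are not shown in this README:

```python
import os
import sys


def check_prerequisites(min_version: tuple = (3, 8), env_file: str = ".env") -> list:
    """Return a list of human-readable problems; an empty list means ready to proceed."""
    problems = []
    if sys.version_info < min_version:
        problems.append(f"Python {min_version[0]}.{min_version[1]}+ required")
    if not os.path.exists(env_file):
        problems.append(f"{env_file} not found - copy .env.example first")
    return problems
```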
## Adding New Models

1. Create the model in the `models/` directory
2. Export the model in `models/__init__.py`
3. Add Pydantic schemas to the `schemas/` directory
4. Create API endpoints in the `api/` directory
5. Include the router in `core/router.py`
6. Generate and run migrations:

   ```bash
   alembic revision --autogenerate -m "Add new model"
   alembic upgrade head
   ```
## Architecture

- **Models** (`models/`): SQLAlchemy ORM models
- **Schemas** (`schemas/`): Pydantic models for request/response validation
- **API** (`api/`): API endpoints and business logic
- **Core** (`core/`): Application configuration and main router
- **Database** (`database/`): Database connection and session management
- **Utils** (`utils/`): Helper functions and utilities
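For example, `utils/response_helpers.py` might wrap payloads in a consistent envelope. The function and field names here are illustrative assumptions:

```python
def success_response(data, message: str = "Success") -> dict:
    """Wrap a payload in a uniform success envelope for API responses."""
    return {"success": True, "message": message, "data": data}


def error_response(message: str, status_code: int = 400) -> dict:
    """Uniform error envelope; pairs with an HTTPException or custom JSON response."""
    return {"success": False, "message": message, "status_code": status_code}
```

A consistent envelope like this lets clients branch on a single `success` flag instead of inspecting HTTP status codes alone.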
## Production Deployment

For production deployment:

- Set `ENVIRONMENT=production`
- Configure proper database credentials
- Set appropriate CORS origins instead of `*`
- Set `DB_ECHO=false` for better performance
- Configure proper logging levels
- Use a production ASGI server setup, e.g. Uvicorn workers managed by Gunicorn
Example production `.env`:

```env
ENVIRONMENT=production
API_HOST=0.0.0.0
API_PORT=8000
API_RELOAD=false
DB_ECHO=false
CORS_ALLOW_ORIGINS=https://yourdomain.com,https://www.yourdomain.com
LOG_LEVEL=WARNING
```

## Docker

The easiest way to run the application is with Docker:
```bash
make docker-run   # Start with Docker Compose
make docker-logs  # View logs
make docker-stop  # Stop containers
```

Or use Docker Compose directly:

```bash
# Start the application with PostgreSQL
docker-compose up -d

# View logs
docker-compose logs -f

# Stop the application
docker-compose down
```

To build and run the image manually:

```bash
# Build the image
docker build -t fastapi-boilerplate .

# Run the container
docker run -p 8000:8000 fastapi-boilerplate
```

## Contributing

- Fork the repository
- Create a feature branch
- Make your changes
- Add tests if applicable
- Submit a pull request
## License

MIT License - see the LICENSE file for details.
## Support

If you encounter any issues or have questions:

- Check the troubleshooting section
- Review the API documentation at `/docs`
- Open an issue on GitHub
## Troubleshooting

**Database Connection Issues:**

- Ensure PostgreSQL is running
- Check the database credentials in the `.env` file
- Verify that the database exists

**Migration Issues:**

- Make sure all models are imported in `alembic/env.py`
- Check the database connection before running migrations
- Review the migration files in `alembic/versions/`

**Configuration Issues:**

- Check that the `.env` file exists and has correct syntax
- Verify that environment variable names match the expected format

**Port Already in Use:**

- Change `API_PORT` in the `.env` file if 8000 is occupied
- Or kill the process using the port