# SupportIQ

SupportIQ is a multi-tenant SaaS backend for support ticketing with AI-powered features, built with FastAPI, PostgreSQL, and Redis for high performance and scalability.
## Features

### Multi-Tenancy

- Complete tenant isolation with row-level security
- Flexible pricing plans: Free, Starter, Professional, Enterprise
- Per-tenant quotas for users, tickets, and API calls
- Custom branding and webhook integrations
### Ticket Management

- Full lifecycle support: New → Open → In Progress → Pending → Resolved → Closed
- Priority levels: Critical, High, Medium, Low
- SLA management with breach detection and escalation
- Comments and attachments with audit trails
- Bulk operations for efficient management
### AI-Powered Features

- Sentiment analysis for incoming tickets
- Auto-categorization based on content
- Smart routing to appropriate agents
- Response suggestions for faster resolution
- Entity extraction for structured data
### Performance & Scalability

- Async/await throughout for high concurrency
- Celery workers for background processing
- Redis caching for frequently accessed data
- Rate limiting per tenant and endpoint
- Connection pooling for database efficiency
## Project Structure

```
SupportIQ/
├── app/
│   ├── api/v1/           # API endpoints
│   │   ├── endpoints/    # Route handlers
│   │   ├── deps.py       # Dependencies
│   │   └── router.py     # API router
│   ├── core/             # Core utilities
│   │   ├── cache.py      # Redis caching
│   │   └── rate_limit.py # Rate limiting
│   ├── models/           # SQLAlchemy models
│   ├── schemas/          # Pydantic schemas
│   ├── services/         # Business logic
│   ├── workers/          # Celery tasks
│   ├── config.py         # Configuration
│   ├── database.py       # Database setup
│   └── main.py           # FastAPI app
├── tests/                # Test suite
├── scripts/              # Utility scripts
├── docker-compose.yml    # Docker services
├── Dockerfile            # Container image
└── pyproject.toml        # Project config
```
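`app/core/cache.py` presumably implements a read-through pattern over Redis; a dependency-free sketch of that idea (the real code talks to Redis, this stand-in uses a dict, and the names are illustrative):

```python
import json
import time

class ReadThroughCache:
    """Read-through cache sketch: look the key up first, fall back to the
    loader on a miss, and store the result with a TTL."""

    def __init__(self):
        self._store: dict[str, tuple[float, str]] = {}
        self.misses = 0

    def get_or_load(self, key: str, loader, ttl: int = 300):
        now = time.time()
        entry = self._store.get(key)
        if entry and entry[0] > now:
            return json.loads(entry[1])   # cache hit, still fresh
        self.misses += 1
        value = loader()                  # cache miss: hit the database
        self._store[key] = (now + ttl, json.dumps(value))
        return value
```

Serializing to JSON mimics what a Redis-backed version has to do anyway, since Redis stores strings, not Python objects.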
## Tech Stack

| Component | Technology |
|---|---|
| Framework | FastAPI 0.109+ |
| Database | PostgreSQL 16 |
| ORM | SQLAlchemy 2.0 (async) |
| Cache/Broker | Redis 7 |
| Task Queue | Celery 5.3 |
| AI | OpenAI GPT-4 |
| Auth | JWT (python-jose) |
| Validation | Pydantic v2 |
## Prerequisites

- Python 3.11+
- Docker & Docker Compose
- OpenAI API key (for AI features)
## Quick Start

### Clone and Configure

```bash
git clone https://github.com/yourusername/supportiq.git
cd supportiq

# Copy environment file
cp .env.example .env

# Edit .env with your configuration
# Especially: OPENAI_API_KEY, SECRET_KEY
```

### Start the Services

```bash
# Start all services
docker-compose up -d

# View logs
docker-compose logs -f api
```

The API will be available at http://localhost:8000.

### Run Migrations

```bash
# Enter the API container
docker-compose exec api bash

# Run migrations
alembic upgrade head
```

### Create a Tenant

```bash
# Using the API (after migrations)
curl -X POST http://localhost:8000/api/v1/tenants \
  -H "Content-Type: application/json" \
  -d '{
    "name": "My Company",
    "slug": "my-company",
    "plan": "professional"
  }'
```

## API Documentation

Once running, access the interactive API documentation:
- Swagger UI: http://localhost:8000/docs
- ReDoc: http://localhost:8000/redoc
- OpenAPI JSON: http://localhost:8000/openapi.json
## Authentication

Most endpoints require JWT authentication:

```bash
# Login
curl -X POST http://localhost:8000/api/v1/auth/login \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -d "username=admin@example.com&password=your_password"

# Use the access_token in subsequent requests
curl http://localhost:8000/api/v1/tickets \
  -H "Authorization: Bearer YOUR_ACCESS_TOKEN"
```

### Key Endpoints

| Endpoint | Method | Description |
|---|---|---|
| `/api/v1/auth/login` | POST | User authentication |
| `/api/v1/tenants` | GET/POST | Tenant management |
| `/api/v1/users` | GET/POST | User management |
| `/api/v1/tickets` | GET/POST | Ticket operations |
| `/api/v1/tickets/{id}/assign` | POST | Assign ticket |
| `/api/v1/tickets/{id}/resolve` | POST | Resolve ticket |
| `/api/v1/ai/analyze` | POST | AI analysis |
| `/api/v1/webhooks/ingest` | POST | External ticket ingestion |
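The real stack signs tokens with python-jose; purely to illustrate what an HS256 access token contains, here is a stdlib-only sketch of issuing and verifying one. This is not the project's actual helper, and the claim set (`sub`, `exp`) is the minimal assumed shape.

```python
import base64
import hashlib
import hmac
import json
import time

def _b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWT requires."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def create_access_token(sub: str, secret: str, ttl_seconds: int = 900) -> str:
    """Create a signed HS256 JWT carrying `sub` and `exp` claims."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = _b64url(json.dumps({"sub": sub, "exp": int(time.time()) + ttl_seconds}).encode())
    signing_input = f"{header}.{payload}".encode()
    signature = _b64url(hmac.new(secret.encode(), signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{signature}"

def verify_token(token: str, secret: str) -> dict:
    """Check the signature and expiry; return the claims or raise ValueError."""
    header, payload, signature = token.split(".")
    signing_input = f"{header}.{payload}".encode()
    expected = _b64url(hmac.new(secret.encode(), signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(signature, expected):
        raise ValueError("bad signature")
    claims = json.loads(base64.urlsafe_b64decode(payload + "=" * (-len(payload) % 4)))
    if claims["exp"] < time.time():
        raise ValueError("token expired")
    return claims
```

Note the constant-time `hmac.compare_digest` for the signature check; a plain `==` comparison would leak timing information.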
## Usage Examples

### Create a Ticket with AI Analysis

```bash
# Create a ticket
curl -X POST http://localhost:8000/api/v1/tickets \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "subject": "Unable to login to dashboard",
    "description": "I keep getting an error message when trying to access my account. This is urgent as I have a presentation tomorrow!",
    "priority": "high",
    "customer_email": "customer@example.com"
  }'

# The AI automatically analyzes sentiment, categorizes, and suggests routing
```

### Auto-Assign a Ticket

```bash
# Auto-assign based on skills and workload
curl -X POST http://localhost:8000/api/v1/tickets/{ticket_id}/assign \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"strategy": "hybrid"}'
```
### Check SLA Status

```bash
# Get ticket with SLA info
curl http://localhost:8000/api/v1/tickets/{ticket_id} \
  -H "Authorization: Bearer $TOKEN"

# Response includes:
# - sla_deadline
# - sla_breached
# - time_to_breach
```
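The SLA fields in that response suggest a deadline derived from the ticket's priority; a sketch under assumed per-priority windows (the real targets are presumably configurable per plan or tenant, and the field names follow the response above):

```python
from datetime import datetime, timedelta, timezone

# Illustrative response-time targets per priority level.
SLA_WINDOWS = {
    "critical": timedelta(hours=1),
    "high": timedelta(hours=4),
    "medium": timedelta(hours=24),
    "low": timedelta(hours=72),
}

def sla_fields(created_at: datetime, priority: str, now: datetime) -> dict:
    """Compute the SLA fields returned alongside a ticket."""
    deadline = created_at + SLA_WINDOWS[priority]
    return {
        "sla_deadline": deadline,
        "sla_breached": now > deadline,
        # Clamp at zero so a breached ticket reports no remaining time.
        "time_to_breach": max(deadline - now, timedelta(0)),
    }
```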
### Bulk Operations

```bash
# Bulk update tickets
curl -X POST http://localhost:8000/api/v1/tickets/bulk-update \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "ticket_ids": ["uuid1", "uuid2", "uuid3"],
    "updates": {
      "status": "open",
      "assigned_to_id": "agent-uuid"
    }
  }'
```

## Testing

```bash
# Run all tests
pytest

# Run with coverage
pytest --cov=app --cov-report=html

# Run specific test file
pytest tests/test_tickets.py -v

# Run with parallel execution
pytest -n auto
```
## Local Development

```bash
# Create virtual environment
python -m venv venv
source venv/bin/activate  # or `venv\Scripts\activate` on Windows

# Install dependencies
pip install -e ".[dev]"

# Start PostgreSQL and Redis (manually or via Docker)
docker-compose up -d db redis

# Run migrations
alembic upgrade head

# Start the API
uvicorn app.main:app --reload

# Start Celery worker (separate terminal)
celery -A app.workers.celery_app worker --loglevel=info

# Start Celery beat (separate terminal)
celery -A app.workers.celery_app beat --loglevel=info
```

### Code Quality

```bash
# Format code
black app tests

# Lint
ruff check app tests

# Type checking
mypy app
```

## Architecture

```
┌────────────────────────────────────────────────────────┐
│                        Clients                         │
│     (Web App, Mobile App, External Systems, Email)     │
└───────────────────────────┬────────────────────────────┘
                            │
                            ▼
┌────────────────────────────────────────────────────────┐
│                     Load Balancer                      │
│                  (Rate Limiting, SSL)                  │
└───────────────────────────┬────────────────────────────┘
                            │
                            ▼
┌────────────────────────────────────────────────────────┐
│                     FastAPI Server                     │
│  ┌─────────────┐  ┌─────────────┐  ┌─────────────┐     │
│  │    Auth     │  │   Tickets   │  │     AI      │     │
│  │  Endpoints  │  │  Endpoints  │  │  Endpoints  │     │
│  └─────────────┘  └─────────────┘  └─────────────┘     │
│                                                        │
│  ┌──────────────────────────────────────────────────┐  │
│  │                  Service Layer                   │  │
│  │   (Business Logic, Validation, Authorization)    │  │
│  └──────────────────────────────────────────────────┘  │
└──────────────┬─────────────────────┬───────────────────┘
               │                     │
               ▼                     ▼
┌─────────────────────┐   ┌────────────────────┐
│     PostgreSQL      │   │       Redis        │
│   (Primary Data)    │   │   (Cache, Queue)   │
└─────────────────────┘   └─────────┬──────────┘
                                    │
                                    ▼
                          ┌────────────────────┐
                          │   Celery Workers   │
                          │ (Background Jobs)  │
                          └─────────┬──────────┘
                                    │
                                    ▼
                          ┌────────────────────┐
                          │     OpenAI API     │
                          │   (AI Analysis)    │
                          └────────────────────┘
```
## Security

- JWT authentication with access/refresh tokens
- Password hashing using bcrypt
- Row-level tenant isolation: users can only access their tenant's data
- Rate limiting per tenant and endpoint
- Audit logging for compliance
- Input validation with Pydantic
- SQL injection protection via the SQLAlchemy ORM
## Monitoring

Enable the Flower dashboard for Celery monitoring:

```bash
docker-compose --profile monitoring up -d

# Access at http://localhost:5555
```

Health check endpoints:

- `/health` - Overall health
- `/ready` - Readiness (database connectivity)
- `/live` - Liveness probe
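A framework-free sketch of what the three probes might return (in the app these are FastAPI routes; `check_database` here is a hypothetical injected dependency check, not a real function in the codebase):

```python
def liveness() -> dict:
    # /live: the process is up and able to answer at all.
    return {"status": "ok"}

def readiness(check_database) -> dict:
    # /ready: only report ready once dependencies (here, the DB) respond.
    try:
        check_database()
        return {"status": "ready"}
    except Exception as exc:
        return {"status": "not_ready", "reason": str(exc)}

def health(check_database) -> dict:
    # /health: overall summary combining the probes above.
    ready = readiness(check_database)["status"] == "ready"
    return {"status": "ok" if ready else "degraded"}
```

Keeping liveness dependency-free matters: if `/live` checked the database, a DB outage would cause the orchestrator to restart healthy API pods for no benefit.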
## Deployment

Set production environment variables:

```bash
ENVIRONMENT=production
DEBUG=false
DATABASE_URL=postgresql+asyncpg://user:pass@host:5432/db
REDIS_URL=redis://host:6379/0
SECRET_KEY=your-production-secret-key
OPENAI_API_KEY=your-openai-key
```

Build the production image:

```bash
docker build --target production -t supportiq:latest .
```

See the k8s/ directory for Kubernetes manifests (if applicable).
## License

This project is licensed under the MIT License - see the LICENSE file for details.
## Contributing

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
## Support

For questions or support, please open an issue on GitHub.

Built with ❤️ using FastAPI, PostgreSQL, and OpenAI