SupportIQ

Python 3.11+ · FastAPI · License: MIT

SupportIQ is a comprehensive multi-tenant SaaS backend for support ticketing with AI-powered features. Built with FastAPI, PostgreSQL, and Redis for high performance and scalability.

🚀 Features

Multi-tenant Architecture

  • Complete tenant isolation with row-level security (see the sketch after this list)
  • Flexible pricing plans: Free, Starter, Professional, Enterprise
  • Per-tenant quotas for users, tickets, and API calls
  • Custom branding and webhook integrations
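
Tenant isolation can be enforced with PostgreSQL row-level security policies or by scoping every query to the caller's tenant. Below is a minimal sketch of the query-scoping half with async SQLAlchemy; the model and helper names are illustrative, not the repository's actual code:

# Illustrative sketch: every query is filtered by the caller's tenant_id.
# `Ticket` and its `tenant_id` column are assumptions about app/models.
from uuid import UUID

from sqlalchemy import select
from sqlalchemy.ext.asyncio import AsyncSession

from app.models import Ticket


async def list_tickets_for_tenant(db: AsyncSession, tenant_id: UUID) -> list[Ticket]:
    """Return only the rows that belong to the caller's tenant."""
    result = await db.execute(select(Ticket).where(Ticket.tenant_id == tenant_id))
    return list(result.scalars().all())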

Ticket Management

  • Full lifecycle support: New → Open → In Progress → Pending → Resolved → Closed (see the enum sketch after this list)
  • Priority levels: Critical, High, Medium, Low
  • SLA management with breach detection and escalation
  • Comments and attachments with audit trails
  • Bulk operations for efficient management
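
The status and priority values map naturally onto enums. A sketch of how they could be modelled (names are illustrative; the real definitions live in app/models/):

import enum


class TicketStatus(str, enum.Enum):
    # Lifecycle stages listed above
    NEW = "new"
    OPEN = "open"
    IN_PROGRESS = "in_progress"
    PENDING = "pending"
    RESOLVED = "resolved"
    CLOSED = "closed"


class TicketPriority(str, enum.Enum):
    CRITICAL = "critical"
    HIGH = "high"
    MEDIUM = "medium"
    LOW = "low"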

AI-Powered Features

  • Sentiment analysis for incoming tickets (see the sketch after this list)
  • Auto-categorization based on content
  • Smart routing to appropriate agents
  • Response suggestions for faster resolution
  • Entity extraction for structured data
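
A rough sketch of what the analysis step could look like with the OpenAI Python client; the prompt, model choice, and function name are illustrative, while the real logic lives in app/services/ and the Celery workers:

# Illustrative only: ask GPT-4 for sentiment and a suggested category.
from openai import AsyncOpenAI

client = AsyncOpenAI()  # reads OPENAI_API_KEY from the environment


async def analyze_ticket(subject: str, description: str) -> str:
    response = await client.chat.completions.create(
        model="gpt-4",
        messages=[
            {
                "role": "system",
                "content": "Classify support tickets. Reply with JSON "
                           "containing 'sentiment' and 'category'.",
            },
            {"role": "user", "content": f"{subject}\n\n{description}"},
        ],
    )
    return response.choices[0].message.content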

Scalability & Performance

  • Async/await throughout for high concurrency
  • Celery workers for background processing
  • Redis caching for frequently accessed data (see the cache sketch after this list)
  • Rate limiting per tenant and endpoint
  • Connection pooling for database efficiency
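
The caching layer lives in app/core/cache.py. As a hedged sketch, a cache-aside helper built on redis.asyncio might look like this (key naming and TTL are assumptions):

import json

from redis.asyncio import Redis

redis = Redis.from_url("redis://localhost:6379/0")


async def cached_json(key: str, loader, ttl: int = 300):
    """Cache-aside: return the cached value, or load it and store it with a TTL."""
    cached = await redis.get(key)
    if cached is not None:
        return json.loads(cached)
    value = await loader()
    await redis.set(key, json.dumps(value), ex=ttl)
    return value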

πŸ“ Project Structure

SupportIQ/
├── app/
│   ├── api/v1/              # API endpoints
│   │   ├── endpoints/       # Route handlers
│   │   ├── deps.py          # Dependencies
│   │   └── router.py        # API router
│   ├── core/                # Core utilities
│   │   ├── cache.py         # Redis caching
│   │   └── rate_limit.py    # Rate limiting
│   ├── models/              # SQLAlchemy models
│   ├── schemas/             # Pydantic schemas
│   ├── services/            # Business logic
│   ├── workers/             # Celery tasks
│   ├── config.py            # Configuration
│   ├── database.py          # Database setup
│   └── main.py              # FastAPI app
├── tests/                   # Test suite
├── scripts/                 # Utility scripts
├── docker-compose.yml       # Docker services
├── Dockerfile               # Container image
└── pyproject.toml           # Project config

🛠️ Technology Stack

Component      Technology
Framework      FastAPI 0.109+
Database       PostgreSQL 16
ORM            SQLAlchemy 2.0 (async)
Cache/Broker   Redis 7
Task Queue     Celery 5.3
AI             OpenAI GPT-4
Auth           JWT (python-jose)
Validation     Pydantic v2

🚀 Quick Start

Prerequisites

  • Python 3.11+
  • Docker & Docker Compose
  • OpenAI API key (for AI features)

1. Clone & Setup

git clone https://github.com/cliff-de-tech/SupportIQ.git
cd SupportIQ

# Copy environment file
cp .env.example .env

# Edit .env with your configuration
# Especially: OPENAI_API_KEY, SECRET_KEY

2. Start with Docker Compose

# Start all services
docker-compose up -d

# View logs
docker-compose logs -f api

# The API will be available at http://localhost:8000

3. Run Database Migrations

# Enter the API container
docker-compose exec api bash

# Run migrations
alembic upgrade head

4. Create Initial Admin User

# Using the API (after migrations)
curl -X POST http://localhost:8000/api/v1/tenants \
  -H "Content-Type: application/json" \
  -d '{
    "name": "My Company",
    "slug": "my-company",
    "plan": "professional"
  }'

📖 API Documentation

Once running, access the interactive API documentation (served by FastAPI by default):

  • Swagger UI: http://localhost:8000/docs
  • ReDoc: http://localhost:8000/redoc

Authentication

Most endpoints require JWT authentication:

# Login
curl -X POST http://localhost:8000/api/v1/auth/login \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -d "username=admin@example.com&password=your_password"

# Use the access_token in subsequent requests
curl http://localhost:8000/api/v1/tickets \
  -H "Authorization: Bearer YOUR_ACCESS_TOKEN"

Key Endpoints

Endpoint                       Method     Description
/api/v1/auth/login             POST       User authentication
/api/v1/tenants                GET/POST   Tenant management
/api/v1/users                  GET/POST   User management
/api/v1/tickets                GET/POST   Ticket operations
/api/v1/tickets/{id}/assign    POST       Assign ticket
/api/v1/tickets/{id}/resolve   POST       Resolve ticket
/api/v1/ai/analyze             POST       AI analysis
/api/v1/webhooks/ingest        POST       External ticket ingestion

🎯 Demo Workflows

1. Create and Process a Ticket

# Create a ticket
curl -X POST http://localhost:8000/api/v1/tickets \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "subject": "Unable to login to dashboard",
    "description": "I keep getting an error message when trying to access my account. This is urgent as I have a presentation tomorrow!",
    "priority": "high",
    "customer_email": "customer@example.com"
  }'

# The AI automatically analyzes sentiment, categorizes, and suggests routing

2. Smart Assignment

# Auto-assign based on skills and workload
curl -X POST http://localhost:8000/api/v1/tickets/{ticket_id}/assign \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"strategy": "hybrid"}'
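
The "hybrid" strategy combines skill matching with current workload. A deliberately simplified illustration of the idea (not the actual routing code; field names are assumptions):

# Pick the least-loaded agent whose skills cover the ticket's category;
# fall back to all agents if nobody matches.
def pick_agent(agents: list[dict], category: str) -> dict | None:
    skilled = [a for a in agents if category in a["skills"]] or agents
    return min(skilled, key=lambda a: a["open_tickets"], default=None)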

3. Check SLA Status

# Get ticket with SLA info
curl http://localhost:8000/api/v1/tickets/{ticket_id} \
  -H "Authorization: Bearer $TOKEN"

# Response includes:
# - sla_deadline
# - sla_breached
# - time_to_breach
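
Those SLA fields are derived from the ticket's priority and creation time. A hedged sketch of the calculation; the per-priority targets below are made up for illustration and would normally come from tenant configuration:

from datetime import datetime, timedelta, timezone

# Example response-time targets (hours) per priority; illustrative values only.
SLA_HOURS = {"critical": 1, "high": 4, "medium": 8, "low": 24}


def sla_fields(priority: str, created_at: datetime) -> dict:
    deadline = created_at + timedelta(hours=SLA_HOURS[priority])
    now = datetime.now(timezone.utc)
    return {
        "sla_deadline": deadline,
        "sla_breached": now > deadline,
        "time_to_breach": max(deadline - now, timedelta(0)),
    }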

4. Bulk Operations

# Bulk update tickets
curl -X POST http://localhost:8000/api/v1/tickets/bulk-update \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "ticket_ids": ["uuid1", "uuid2", "uuid3"],
    "updates": {
      "status": "open",
      "assigned_to_id": "agent-uuid"
    }
  }'

🧪 Testing

# Run all tests
pytest

# Run with coverage
pytest --cov=app --cov-report=html

# Run specific test file
pytest tests/test_tickets.py -v

# Run with parallel execution
pytest -n auto

🔧 Development

Local Development (without Docker)

# Create virtual environment
python -m venv venv
source venv/bin/activate  # or `venv\Scripts\activate` on Windows

# Install dependencies
pip install -e ".[dev]"

# Start PostgreSQL and Redis (manually or via Docker)
docker-compose up -d db redis

# Run migrations
alembic upgrade head

# Start the API
uvicorn app.main:app --reload

# Start Celery worker (separate terminal)
celery -A app.workers.celery_app worker --loglevel=info

# Start Celery beat (separate terminal)
celery -A app.workers.celery_app beat --loglevel=info
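
Background work such as SLA sweeps and AI analysis runs on the Celery app started above. A minimal sketch of how a periodic SLA check could be wired up; the task name and schedule are illustrative, and the real tasks live in app/workers/:

from celery import Celery
from celery.schedules import crontab

celery_app = Celery("supportiq", broker="redis://localhost:6379/0")


@celery_app.task(name="check_sla_breaches")
def check_sla_breaches() -> None:
    """Scan open tickets and escalate any that are past their SLA deadline."""
    ...  # query tickets, flag breaches, notify assignees


celery_app.conf.beat_schedule = {
    "sla-sweep-every-5-minutes": {
        "task": "check_sla_breaches",
        "schedule": crontab(minute="*/5"),
    },
}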

Code Quality

# Format code
black app tests

# Lint
ruff check app tests

# Type checking
mypy app

📊 Architecture Overview

┌──────────────────────────────────────────────────────────────┐
│                        Clients                               │
│     (Web App, Mobile App, External Systems, Email)           │
└─────────────────────────┬────────────────────────────────────┘
                          │
                          ▼
┌──────────────────────────────────────────────────────────────┐
│                    Load Balancer                             │
│                 (Rate Limiting, SSL)                         │
└─────────────────────────┬────────────────────────────────────┘
                          │
                          ▼
┌──────────────────────────────────────────────────────────────┐
│                     FastAPI Server                           │
│  ┌─────────────┐  ┌─────────────┐  ┌─────────────┐           │
│  │   Auth      │  │   Tickets   │  │     AI      │           │
│  │  Endpoints  │  │  Endpoints  │  │  Endpoints  │           │
│  └─────────────┘  └─────────────┘  └─────────────┘           │
│                                                              │
│  ┌──────────────────────────────────────────────────┐        │
│  │              Service Layer                       │        │
│  │  (Business Logic, Validation, Authorization)     │        │
│  └──────────────────────────────────────────────────┘        │
└─────────────┬───────────────────┬────────────────────────────┘
              │                   │
              ▼                   ▼
┌─────────────────────┐  ┌────────────────────┐
│    PostgreSQL       │  │      Redis         │
│  (Primary Data)     │  │  (Cache, Queue)    │
└─────────────────────┘  └─────────┬──────────┘
                                   │
                                   ▼
                         ┌────────────────────┐
                         │   Celery Workers   │
                         │  (Background Jobs) │
                         └─────────┬──────────┘
                                   │
                                   ▼
                         ┌────────────────────┐
                         │    OpenAI API      │
                         │   (AI Analysis)    │
                         └────────────────────┘

πŸ” Security

  • JWT Authentication with access/refresh tokens
  • Password hashing using bcrypt
  • Row-level tenant isolation - users can only access their tenant's data
  • Rate limiting per tenant and endpoint
  • Audit logging for compliance
  • Input validation with Pydantic
  • SQL injection protection via SQLAlchemy ORM
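
A sketch of the token and password primitives named above, using python-jose for JWTs and passlib for bcrypt hashing (passlib is an assumption here; the README only states that bcrypt is used):

from datetime import datetime, timedelta, timezone

from jose import jwt
from passlib.context import CryptContext

SECRET_KEY = "change-me"  # comes from SECRET_KEY in .env
pwd_context = CryptContext(schemes=["bcrypt"], deprecated="auto")


def hash_password(plain: str) -> str:
    return pwd_context.hash(plain)


def create_access_token(subject: str, tenant_id: str, minutes: int = 30) -> str:
    """Access token carrying the user and their tenant; tenant_id scopes every request."""
    claims = {
        "sub": subject,
        "tenant_id": tenant_id,
        "exp": datetime.now(timezone.utc) + timedelta(minutes=minutes),
    }
    return jwt.encode(claims, SECRET_KEY, algorithm="HS256")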

📈 Monitoring

Enable the Flower dashboard for Celery monitoring:

docker-compose --profile monitoring up -d
# Access at http://localhost:5555

Health check endpoints:

  • /health - Overall health
  • /ready - Readiness (database connectivity)
  • /live - Liveness probe
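
A minimal sketch of what these probes typically look like in FastAPI; the readiness check assumes the async session factory from app/database.py, and the actual handlers may differ:

from fastapi import FastAPI
from sqlalchemy import text

from app.database import async_session  # assumed session factory

app = FastAPI()


@app.get("/health")
async def health() -> dict:
    return {"status": "ok"}


@app.get("/ready")
async def ready() -> dict:
    # Readiness: verify database connectivity with a trivial query.
    async with async_session() as session:
        await session.execute(text("SELECT 1"))
    return {"status": "ready"}


@app.get("/live")
async def live() -> dict:
    return {"status": "alive"}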

🚀 Production Deployment

Environment Variables

ENVIRONMENT=production
DEBUG=false
DATABASE_URL=postgresql+asyncpg://user:pass@host:5432/db
REDIS_URL=redis://host:6379/0
SECRET_KEY=your-production-secret-key
OPENAI_API_KEY=your-openai-key
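
These variables are typically loaded once at startup. A sketch using pydantic-settings that mirrors the names above (the class itself is illustrative of what app/config.py might contain):

from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_file=".env")

    environment: str = "development"
    debug: bool = False
    database_url: str
    redis_url: str = "redis://localhost:6379/0"
    secret_key: str
    openai_api_key: str = ""


settings = Settings()  # reads the variables above from the environment or .env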

Docker Production Build

docker build --target production -t supportiq:latest .

Kubernetes

See the k8s/ directory for Kubernetes manifests, if present in the repository.

πŸ“ License

This project is licensed under the MIT License - see the LICENSE file for details.

🤝 Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

📧 Support

For questions or support, please open an issue on GitHub.


Built with ❤️ using FastAPI, PostgreSQL, and OpenAI
