
SupportIQ

Python 3.11+ FastAPI License: MIT

SupportIQ is a comprehensive multi-tenant SaaS backend for support ticketing with AI-powered features. Built with FastAPI, PostgreSQL, and Redis for high performance and scalability.

🚀 Features

Multi-tenant Architecture

  • Complete tenant isolation with row-level security
  • Flexible pricing plans: Free, Starter, Professional, Enterprise
  • Per-tenant quotas for users, tickets, and API calls
  • Custom branding and webhook integrations
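Row-level isolation means every query is implicitly scoped to the authenticated caller's tenant. The real scoping happens in the SQLAlchemy layer; the sketch below is only an in-memory illustration of the idea (`Ticket` and `scoped_tickets` are hypothetical names, not the actual models):

```python
from dataclasses import dataclass

# Illustrative stand-in for a SQLAlchemy model row.
@dataclass
class Ticket:
    id: str
    tenant_id: str
    subject: str

def scoped_tickets(all_tickets: list[Ticket], tenant_id: str) -> list[Ticket]:
    """Return only rows belonging to the caller's tenant —
    the in-memory equivalent of adding `WHERE tenant_id = :tid`
    to every query."""
    return [t for t in all_tickets if t.tenant_id == tenant_id]

tickets = [
    Ticket("t1", "acme", "Login broken"),
    Ticket("t2", "globex", "Billing question"),
    Ticket("t3", "acme", "Feature request"),
]

acme_view = scoped_tickets(tickets, "acme")
```

In the real service the `tenant_id` comes from the JWT claims, so a client can never widen the filter by hand.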

Ticket Management

  • Full lifecycle support: New → Open → In Progress → Pending → Resolved → Closed
  • Priority levels: Critical, High, Medium, Low
  • SLA management with breach detection and escalation
  • Comments and attachments with audit trails
  • Bulk operations for efficient management
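The lifecycle above can be modelled as a small state machine. This sketch uses a hypothetical transition table consistent with the arrows shown; the actual service layer may permit additional edges:

```python
from enum import Enum

class Status(str, Enum):
    NEW = "new"
    OPEN = "open"
    IN_PROGRESS = "in_progress"
    PENDING = "pending"
    RESOLVED = "resolved"
    CLOSED = "closed"

# Hypothetical transition table matching the lifecycle arrows above.
ALLOWED = {
    Status.NEW: {Status.OPEN},
    Status.OPEN: {Status.IN_PROGRESS, Status.PENDING},
    Status.IN_PROGRESS: {Status.PENDING, Status.RESOLVED},
    Status.PENDING: {Status.IN_PROGRESS, Status.RESOLVED},
    Status.RESOLVED: {Status.CLOSED, Status.OPEN},  # reopen on customer reply
    Status.CLOSED: set(),
}

def can_transition(current: Status, target: Status) -> bool:
    """True if the move is a legal lifecycle edge."""
    return target in ALLOWED[current]
```

Validating transitions in one place keeps bulk operations and API updates from ever producing an impossible status.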

AI-Powered Features

  • Sentiment analysis for incoming tickets
  • Auto-categorization based on content
  • Smart routing to appropriate agents
  • Response suggestions for faster resolution
  • Entity extraction for structured data
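The real pipeline sends ticket text to GPT-4; the keyword heuristic below is only a stand-in to illustrate the *shape* of the analysis result an endpoint might return (all field names and word lists here are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class AIAnalysis:
    sentiment: str           # "negative" | "neutral" | "positive"
    category: str            # e.g. "authentication", "general"
    suggested_priority: str  # feeds smart routing

# Trivial stand-in for the GPT-4 call — illustrative only.
NEGATIVE_WORDS = {"urgent", "error", "broken", "angry", "unacceptable"}

def analyze(text: str) -> AIAnalysis:
    words = set(text.lower().split())
    negative = bool(words & NEGATIVE_WORDS)
    category = "authentication" if "login" in words else "general"
    return AIAnalysis(
        sentiment="negative" if negative else "neutral",
        category=category,
        suggested_priority="high" if negative else "medium",
    )
```

Keeping the result in a typed object means the routing and SLA layers can consume AI output without caring which model produced it.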

Scalability & Performance

  • Async/await throughout for high concurrency
  • Celery workers for background processing
  • Redis caching for frequently accessed data
  • Rate limiting per tenant and endpoint
  • Connection pooling for database efficiency
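The production limiter in `app/core/rate_limit.py` is Redis-backed so limits hold across API replicas; this in-process token bucket just illustrates the per-tenant idea (the rate and capacity numbers are made up):

```python
import time
from collections import defaultdict

class TokenBucket:
    """Classic token bucket: refills continuously, spends one token per call."""

    def __init__(self, rate: float, capacity: int) -> None:
        self.rate = rate                  # tokens refilled per second
        self.capacity = capacity
        self.tokens = float(capacity)
        self.updated = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        elapsed = now - self.updated
        self.tokens = min(self.capacity, self.tokens + elapsed * self.rate)
        self.updated = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# One bucket per tenant; illustrative limits.
buckets: dict[str, TokenBucket] = defaultdict(
    lambda: TokenBucket(rate=5, capacity=10)
)

def check_rate_limit(tenant_id: str) -> bool:
    return buckets[tenant_id].allow()
```

Because each tenant gets its own bucket, one noisy tenant exhausting its quota never starves the others.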

📁 Project Structure

SupportIQ/
├── app/
│   ├── api/v1/              # API endpoints
│   │   ├── endpoints/       # Route handlers
│   │   ├── deps.py          # Dependencies
│   │   └── router.py        # API router
│   ├── core/                # Core utilities
│   │   ├── cache.py         # Redis caching
│   │   └── rate_limit.py    # Rate limiting
│   ├── models/              # SQLAlchemy models
│   ├── schemas/             # Pydantic schemas
│   ├── services/            # Business logic
│   ├── workers/             # Celery tasks
│   ├── config.py            # Configuration
│   ├── database.py          # Database setup
│   └── main.py              # FastAPI app
├── tests/                   # Test suite
├── scripts/                 # Utility scripts
├── docker-compose.yml       # Docker services
├── Dockerfile               # Container image
└── pyproject.toml           # Project config

🛠️ Technology Stack

Component      Technology
---------      ----------
Framework      FastAPI 0.109+
Database       PostgreSQL 16
ORM            SQLAlchemy 2.0 (async)
Cache/Broker   Redis 7
Task Queue     Celery 5.3
AI             OpenAI GPT-4
Auth           JWT (python-jose)
Validation     Pydantic v2

🚀 Quick Start

Prerequisites

  • Python 3.11+
  • Docker & Docker Compose
  • OpenAI API key (for AI features)

1. Clone & Setup

git clone https://github.com/yourusername/supportiq.git
cd supportiq

# Copy environment file
cp .env.example .env

# Edit .env with your configuration
# Especially: OPENAI_API_KEY, SECRET_KEY

2. Start with Docker Compose

# Start all services
docker-compose up -d

# View logs
docker-compose logs -f api

# The API will be available at http://localhost:8000

3. Run Database Migrations

# Enter the API container
docker-compose exec api bash

# Run migrations
alembic upgrade head

4. Create Initial Admin User

# Using the API (after migrations)
curl -X POST http://localhost:8000/api/v1/tenants \
  -H "Content-Type: application/json" \
  -d '{
    "name": "My Company",
    "slug": "my-company",
    "plan": "professional"
  }'

📖 API Documentation

Once the stack is running, open the interactive API documentation that FastAPI serves by default:

  • Swagger UI: http://localhost:8000/docs
  • ReDoc: http://localhost:8000/redoc

Authentication

Most endpoints require JWT authentication:

# Login
curl -X POST http://localhost:8000/api/v1/auth/login \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -d "username=admin@example.com&password=your_password"

# Use the access_token in subsequent requests
curl http://localhost:8000/api/v1/tickets \
  -H "Authorization: Bearer YOUR_ACCESS_TOKEN"
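The app issues JWTs via python-jose; the stdlib sketch below shows what an HS256 access token looks like under the hood. The claim names (`sub`, `tenant_id`, `exp`) are illustrative, not a guarantee of the actual token schema:

```python
import base64
import hashlib
import hmac
import json
import time

def b64url(data: bytes) -> str:
    """Unpadded URL-safe base64, as used in JWTs."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_token(secret: str, sub: str, tenant_id: str, ttl: int = 900) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps({
        "sub": sub, "tenant_id": tenant_id, "exp": int(time.time()) + ttl,
    }).encode())
    signing_input = f"{header}.{payload}".encode()
    sig = hmac.new(secret.encode(), signing_input, hashlib.sha256).digest()
    return f"{header}.{payload}.{b64url(sig)}"

def verify(token: str, secret: str) -> dict:
    header, payload, sig = token.split(".")
    expected = hmac.new(
        secret.encode(), f"{header}.{payload}".encode(), hashlib.sha256
    ).digest()
    if not hmac.compare_digest(b64url(expected), sig):
        raise ValueError("bad signature")
    claims = json.loads(base64.urlsafe_b64decode(payload + "=="))
    if claims["exp"] < time.time():
        raise ValueError("expired")
    return claims
```

The `tenant_id` claim is what lets every downstream query be scoped without an extra lookup.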

Key Endpoints

Endpoint                      Method     Description
--------                      ------     -----------
/api/v1/auth/login            POST       User authentication
/api/v1/tenants               GET/POST   Tenant management
/api/v1/users                 GET/POST   User management
/api/v1/tickets               GET/POST   Ticket operations
/api/v1/tickets/{id}/assign   POST       Assign ticket
/api/v1/tickets/{id}/resolve  POST       Resolve ticket
/api/v1/ai/analyze            POST       AI analysis
/api/v1/webhooks/ingest       POST       External ticket ingestion

🎯 Demo Workflows

1. Create and Process a Ticket

# Create a ticket
curl -X POST http://localhost:8000/api/v1/tickets \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "subject": "Unable to login to dashboard",
    "description": "I keep getting an error message when trying to access my account. This is urgent as I have a presentation tomorrow!",
    "priority": "high",
    "customer_email": "customer@example.com"
  }'

# The AI automatically analyzes sentiment, categorizes, and suggests routing

2. Smart Assignment

# Auto-assign based on skills and workload
curl -X POST http://localhost:8000/api/v1/tickets/{ticket_id}/assign \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"strategy": "hybrid"}'
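A "hybrid" strategy typically blends skill match with current workload. The weighting used by the service is internal, so the 60/40 split and field names below are purely illustrative:

```python
from dataclasses import dataclass

@dataclass
class Agent:
    id: str
    skills: set[str]
    open_tickets: int

def hybrid_score(agent: Agent, required: set[str], max_load: int = 20) -> float:
    """Blend skill fit (60%) with spare capacity (40%) — illustrative weights."""
    skill_match = len(agent.skills & required) / max(len(required), 1)
    load_factor = 1 - min(agent.open_tickets, max_load) / max_load
    return 0.6 * skill_match + 0.4 * load_factor

def pick_agent(agents: list[Agent], required: set[str]) -> Agent:
    return max(agents, key=lambda a: hybrid_score(a, required))
```

With this blend, a perfectly skilled but overloaded agent loses out to a slightly less specialized colleague with free capacity.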

3. Check SLA Status

# Get ticket with SLA info
curl http://localhost:8000/api/v1/tickets/{ticket_id} \
  -H "Authorization: Bearer $TOKEN"

# Response includes:
# - sla_deadline
# - sla_breached
# - time_to_breach

4. Bulk Operations

# Bulk update tickets
curl -X POST http://localhost:8000/api/v1/tickets/bulk-update \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "ticket_ids": ["uuid1", "uuid2", "uuid3"],
    "updates": {
      "status": "open",
      "assigned_to_id": "agent-uuid"
    }
  }'

🧪 Testing

# Run all tests
pytest

# Run with coverage
pytest --cov=app --cov-report=html

# Run specific test file
pytest tests/test_tickets.py -v

# Run with parallel execution
pytest -n auto

🔧 Development

Local Development (without Docker)

# Create virtual environment
python -m venv venv
source venv/bin/activate  # or `venv\Scripts\activate` on Windows

# Install dependencies
pip install -e ".[dev]"

# Start PostgreSQL and Redis (manually or via Docker)
docker-compose up -d db redis

# Run migrations
alembic upgrade head

# Start the API
uvicorn app.main:app --reload

# Start Celery worker (separate terminal)
celery -A app.workers.celery_app worker --loglevel=info

# Start Celery beat (separate terminal)
celery -A app.workers.celery_app beat --loglevel=info

Code Quality

# Format code
black app tests

# Lint
ruff check app tests

# Type checking
mypy app

📊 Architecture Overview

┌─────────────────────────────────────────────────────────────┐
│                        Clients                               │
│     (Web App, Mobile App, External Systems, Email)          │
└─────────────────────────┬───────────────────────────────────┘
                          │
                          ▼
┌─────────────────────────────────────────────────────────────┐
│                    Load Balancer                             │
│                 (Rate Limiting, SSL)                         │
└─────────────────────────┬───────────────────────────────────┘
                          │
                          ▼
┌─────────────────────────────────────────────────────────────┐
│                     FastAPI Server                           │
│  ┌─────────────┐  ┌─────────────┐  ┌─────────────┐         │
│  │   Auth      │  │   Tickets   │  │     AI      │         │
│  │  Endpoints  │  │  Endpoints  │  │  Endpoints  │         │
│  └─────────────┘  └─────────────┘  └─────────────┘         │
│                                                              │
│  ┌─────────────────────────────────────────────────┐       │
│  │              Service Layer                       │       │
│  │  (Business Logic, Validation, Authorization)    │       │
│  └─────────────────────────────────────────────────┘       │
└─────────────┬───────────────────┬───────────────────────────┘
              │                   │
              ▼                   ▼
┌─────────────────────┐  ┌────────────────────┐
│    PostgreSQL       │  │      Redis         │
│  (Primary Data)     │  │  (Cache, Queue)    │
└─────────────────────┘  └─────────┬──────────┘
                                   │
                                   ▼
                         ┌────────────────────┐
                         │   Celery Workers   │
                         │  (Background Jobs) │
                         └─────────┬──────────┘
                                   │
                                   ▼
                         ┌────────────────────┐
                         │    OpenAI API      │
                         │   (AI Analysis)    │
                         └────────────────────┘

🔐 Security

  • JWT Authentication with access/refresh tokens
  • Password hashing using bcrypt
  • Row-level tenant isolation - users can only access their tenant's data
  • Rate limiting per tenant and endpoint
  • Audit logging for compliance
  • Input validation with Pydantic
  • SQL injection protection via SQLAlchemy ORM

📈 Monitoring

Enable the Flower dashboard for Celery monitoring:

docker-compose --profile monitoring up -d
# Access at http://localhost:5555

Health check endpoints:

  • /health - Overall health
  • /ready - Readiness (database connectivity)
  • /live - Liveness probe

🚀 Production Deployment

Environment Variables

ENVIRONMENT=production
DEBUG=false
DATABASE_URL=postgresql+asyncpg://user:pass@host:5432/db
REDIS_URL=redis://host:6379/0
SECRET_KEY=your-production-secret-key
OPENAI_API_KEY=your-openai-key

Docker Production Build

docker build --target production -t supportiq:latest .

Kubernetes

Kubernetes manifests, when provided, live in the k8s/ directory.

📝 License

This project is licensed under the MIT License - see the LICENSE file for details.

🤝 Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

📧 Support

For questions or support, please open an issue on GitHub.


Built with ❤️ using FastAPI, PostgreSQL, and OpenAI