An AI-powered backend service that analyzes customer support tickets in real time. Send raw ticket text through the API and receive structured JSON with category classification, priority assignment, order ID extraction, sentiment analysis, and a concise summary, all powered by OpenAI's GPT-4.1 Mini.
Built with FastAPI, Pydantic, and the OpenAI Responses API.
Customer support teams handle hundreds of tickets daily. Manually reading, categorizing, and prioritizing each one is slow, inconsistent, and expensive.
AI Support Ticket Analyzer solves this by:
- Classifying tickets into 8 business categories automatically
- Assigning priority based on deterministic rules, with no hallucinated urgency
- Extracting structured data (order IDs, summaries) from unstructured text
- Detecting sentiment to flag frustrated customers early
- Integrating with Slack for real-time team notifications
This is the kind of service that plugs directly into helpdesk pipelines (Zendesk, Freshdesk, Intercom) or custom CRM backends.
After starting the server you can explore the interactive API documentation:
This Swagger interface allows you to test the AI ticket analyzer directly from the browser.
```
Client → POST /analyze-ticket { "text": "..." }
  → FastAPI router
  → Analyzer (loads prompt, calls LLM, maps priority)
  → OpenAI Responses API
  → Validated TicketAnalysis
  → JSON response
```
| Layer | Responsibility |
|---|---|
| `app/main.py` | FastAPI app, endpoints, CORS, lifespan |
| `app/config.py` | Environment variable management via pydantic-settings |
| `app/schemas.py` | Pydantic models, enums, validation |
| `app/analyzer.py` | Orchestration: prompt → LLM → parse → priority mapping |
| `app/llm_client.py` | Async OpenAI Responses API client (httpx) |
| `app/integrations/slack.py` | Decoupled Slack webhook sender |
| `prompts/analyze_ticket.txt` | Externalized LLM system prompt |
Key design decisions:
- Priority is derived deterministically from category, not by the LLM
- Slack integration is fully decoupled from the analysis endpoint and can be used independently
- The prompt is stored as a plain text file for easy iteration without code changes
- Ticket text is submitted through the API
- The prompt template is loaded from `prompts/analyze_ticket.txt`
- The ticket content is sent to the OpenAI Responses API
- The LLM extracts structured fields
- Priority is deterministically derived from category
- The validated result is returned via Pydantic schema
```bash
git clone https://github.com/andrei-ameliugin/ai-support-ticket-analyzer.git
cd ai-support-ticket-analyzer
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
cp .env.example .env
```

Edit `.env` and add your OpenAI API key:

```
OPENAI_API_KEY=sk-your-actual-key
OPENAI_API_URL=https://api.openai.com/v1/responses
OPENAI_MODEL=gpt-4.1-mini
SLACK_WEBHOOK_URL=https://hooks.slack.com/services/your/webhook/url
```
```bash
uvicorn app.main:app --reload
```

Navigate to http://localhost:8000/docs in your browser.
Analyze a customer support ticket.
Request:
```json
{
  "text": "Hello, my order #45123 hasn't arrived yet. Can you check the delivery status?"
}
```

Response:
```json
{
  "category": "delivery_issue",
  "priority": "medium",
  "order_id": "45123",
  "summary": "Customer reports missing delivery.",
  "customer_sentiment": "neutral"
}
```

Forward an analyzed ticket to Slack (requires `SLACK_WEBHOOK_URL`).
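As a quick client-side sanity check, a consumer of the API can validate the response fields with nothing but the standard library. The field names come from the sample response above; the allowed values mirror the category, priority, and sentiment lists in this README (assumed exhaustive):

```python
import json

# Allowed values as listed in this README (assumed exhaustive).
CATEGORIES = {
    "billing_issue", "payment_failed", "account_access", "delivery_issue",
    "refund_request", "technical_problem", "product_question", "general_inquiry",
}
PRIORITIES = {"low", "medium", "high"}
SENTIMENTS = {"positive", "neutral", "negative"}

def validate_analysis(raw: str) -> dict:
    """Parse an /analyze-ticket response body and check its enum fields."""
    data = json.loads(raw)
    assert data["category"] in CATEGORIES, f"unknown category: {data['category']}"
    assert data["priority"] in PRIORITIES
    assert data["customer_sentiment"] in SENTIMENTS
    return data

sample = json.dumps({
    "category": "delivery_issue",
    "priority": "medium",
    "order_id": "45123",
    "summary": "Customer reports missing delivery.",
    "customer_sentiment": "neutral",
})
print(validate_analysis(sample)["order_id"])  # → 45123
```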
Request: A full TicketAnalysis JSON object.
Health check – returns `{"status": "ok"}`.
| Category | Priority |
|---|---|
| `billing_issue` | 🔴 High |
| `payment_failed` | 🔴 High |
| `account_access` | 🔴 High |
| `delivery_issue` | 🟡 Medium |
| `refund_request` | 🟡 Medium |
| `technical_problem` | 🟡 Medium |
| `product_question` | 🟢 Low |
| `general_inquiry` | 🟢 Low |
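Because priority is derived deterministically, the table above boils down to a plain dictionary lookup. A minimal sketch, with names assumed rather than taken from the actual `app/analyzer.py`:

```python
from enum import Enum

class Priority(str, Enum):
    HIGH = "high"
    MEDIUM = "medium"
    LOW = "low"

# Mirrors the Category → Priority table above.
CATEGORY_PRIORITY = {
    "billing_issue": Priority.HIGH,
    "payment_failed": Priority.HIGH,
    "account_access": Priority.HIGH,
    "delivery_issue": Priority.MEDIUM,
    "refund_request": Priority.MEDIUM,
    "technical_problem": Priority.MEDIUM,
    "product_question": Priority.LOW,
    "general_inquiry": Priority.LOW,
}

def priority_for(category: str) -> Priority:
    # Fallback for unknown categories is an assumption made for this sketch.
    return CATEGORY_PRIORITY.get(category, Priority.MEDIUM)

print(priority_for("payment_failed").value)  # → high
```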
Every ticket is classified as:
- positive – grateful, satisfied, complimentary tone
- neutral – factual, informational, no strong emotion
- negative – frustrated, angry, disappointed tone
```
ai-support-ticket-analyzer/
├── app/
│   ├── __init__.py
│   ├── main.py             # FastAPI application & endpoints
│   ├── config.py           # Settings from environment variables
│   ├── schemas.py          # Pydantic models & enums
│   ├── analyzer.py         # Core analysis orchestrator
│   ├── llm_client.py       # OpenAI Responses API client
│   └── integrations/
│       ├── __init__.py
│       └── slack.py        # Slack webhook integration
├── prompts/
│   └── analyze_ticket.txt  # LLM system prompt
├── examples/               # Sample tickets for every category
├── .env.example            # Environment variable template
├── .gitignore
├── requirements.txt
├── LICENSE
└── README.md
```
The examples/ folder contains realistic sample tickets for every category:
| File | Category |
|---|---|
| `billing_issue.txt` | Incorrect charges |
| `payment_failed.txt` | Declined payments |
| `delivery_issue.txt` | Missing packages |
| `refund_request.txt` | Return & refund requests |
| `account_access.txt` | Locked accounts |
| `product_question.txt` | Pre-purchase questions |
| `technical_problem.txt` | App bugs & crashes |
| `general_inquiry.txt` | Store hours & general info |
Use any of these in the Swagger UI to test the API.
This service is designed to plug into larger support workflows:
- Gmail / Outlook – incoming email webhook triggers `/analyze-ticket`
- Zendesk / Freshdesk – auto-tag and prioritize new tickets
- Intercom / HubSpot – enrich conversation metadata with AI analysis
- Slack – push high-priority tickets to `#urgent-support` channels
- PagerDuty – auto-escalate `account_access` or `payment_failed` tickets
- Internal dashboards – aggregate sentiment trends and category distributions
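To illustrate the decoupled Slack sender, here is a minimal sketch of how a webhook payload might be built from an analysis result. The payload shape targets Slack's standard incoming-webhook `text` field; the helper name and message layout are assumptions for this sketch, not the project's actual `app/integrations/slack.py`:

```python
import json

PRIORITY_EMOJI = {"high": "🔴", "medium": "🟡", "low": "🟢"}

def build_slack_payload(analysis: dict) -> str:
    """Render a TicketAnalysis-shaped dict as a Slack incoming-webhook body."""
    emoji = PRIORITY_EMOJI.get(analysis["priority"], "⚪")
    lines = [
        f"{emoji} *{analysis['category']}* (priority: {analysis['priority']})",
        f"Sentiment: {analysis['customer_sentiment']}",
        f"Summary: {analysis['summary']}",
    ]
    if analysis.get("order_id"):
        lines.append(f"Order: #{analysis['order_id']}")
    # Slack incoming webhooks accept a JSON body with a "text" field.
    return json.dumps({"text": "\n".join(lines)})

payload = build_slack_payload({
    "category": "payment_failed",
    "priority": "high",
    "order_id": "45123",
    "summary": "Card declined at checkout.",
    "customer_sentiment": "negative",
})
```

Keeping payload construction separate from the HTTP POST (httpx in the real service) is what lets the Slack integration be tested and reused independently of the analysis endpoint.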
| Technology | Purpose |
|---|---|
| Python 3.11+ | Runtime |
| FastAPI | Web framework & Swagger UI |
| Pydantic v2 | Data validation & serialization |
| pydantic-settings | Environment configuration |
| httpx | Async HTTP client |
| OpenAI Responses API | LLM inference |
- Streaming responses for large ticket analysis
- Batch ticket processing endpoint
- Queue-based processing (Redis / Celery)
- Observability with OpenTelemetry
- Unit and integration tests
- Rate limiting and API key authentication
This project demonstrates:
- Clean Python backend architecture with clear module boundaries
- AI/LLM integration in a production-style API
- Prompt engineering for structured JSON extraction
- Pydantic-first data modeling and validation
- Async programming with FastAPI and httpx
- Decoupled third-party integrations (Slack)
- Professional documentation and project structure
Ideal for: AI/ML engineer portfolios, backend developer showcases, Upwork client proposals, and recruiter demonstrations.
This service can be used by companies that receive high volumes of customer support emails and want to automate:
- ticket triage
- priority routing
- CRM enrichment
- sentiment monitoring
- support team notifications
This project is licensed under the MIT License. See the LICENSE file for details.