This project demonstrates how to build, containerize, and deploy a robust Machine Learning application using Docker and Docker Compose. It features a microservices architecture that separates concerns between the web frontend, the ML inference engine, and the analytics backend.
The core application is a Pomegranate Disease Detection System that uses Deep Learning to identify diseases in pomegranate leaves. However, the primary focus of this repository is to showcase Docker-based ML deployment, including:
- Microservices Architecture: Decoupling services for scalability and maintainability.
- Containerization: Using Dockerfiles to create reproducible environments for Python (Flask/FastAPI) and Postgres.
- Orchestration: Using `docker-compose` to manage multi-container applications, networking, and volumes.
- Service Discovery: Configuring inter-service communication using Docker network aliases.
- Monitoring: Integrating Grafana for real-time analytics visualization.
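To make the orchestration and service-discovery points concrete, a trimmed `docker-compose.yml` for such a stack could look like the sketch below. Service names, the network name, and the alias are illustrative, not necessarily those used in this repo:

```yaml
services:
  web:
    build: ./web
    ports:
      - "5000:5000"          # expose the frontend on the host
    networks:
      app-net:
        aliases:
          - web.internal     # other containers can reach this service by alias
  ml:
    build: ./ml
    networks:
      - app-net              # reachable from web as "ml" via Docker DNS

networks:
  app-net:                   # one user-defined network for all services
```

On a user-defined network, Compose service names (and any extra aliases) resolve via Docker's embedded DNS, so the frontend can call `http://ml:8000` without hard-coded IPs.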
The system follows a microservices architecture and is composed of the following containerized services:
- Web Frontend (`web`):
  - Tech: Flask, Jinja2, Bootstrap.
  - Role: User interface for image uploads. It acts as the gateway, forwarding requests to the ML service and logging interactions to the Analytics service.
  - Docker: Uses a Python 3.9 image.
- ML Inference Engine (`ml`):
  - Tech: FastAPI, TensorFlow/Keras, SQLite (caching).
  - Role: Loads the heavy DL model and provides a REST API for inference. It isolates the heavy compute and dependencies (TensorFlow) from the lightweight frontend.
  - Docker: Optimized image with system dependencies for image processing (`libgl1`).
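As a sketch, installing the image-processing system dependencies might look like the following Dockerfile fragment (the package list beyond `libgl1`, and the entrypoint, are illustrative assumptions):

```dockerfile
FROM python:3.9-slim

# OpenCV/Pillow need native libraries that slim images lack
RUN apt-get update && apt-get install -y --no-install-recommends \
        libgl1 libglib2.0-0 \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .

CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]
```

Cleaning the apt lists in the same `RUN` layer keeps the final image small.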
- Analytics Backend (`analytics`):
  - Tech: FastAPI, SQLAlchemy, PostgreSQL.
  - Role: Centralized logging service. Stores prediction results and visit stats, allowing the ML and Web services to remain stateless regarding analytics.
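The shape of such a prediction log can be sketched with Python's built-in `sqlite3`; the real service uses PostgreSQL via SQLAlchemy, and the table, column, and class names below are illustrative:

```python
import sqlite3

# In-memory database purely for illustration
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE prediction_logs (
        id         INTEGER PRIMARY KEY AUTOINCREMENT,
        disease    TEXT NOT NULL,        -- predicted class label
        confidence REAL NOT NULL,        -- top class probability
        created_at TEXT DEFAULT CURRENT_TIMESTAMP
    )
""")

def log_prediction(disease: str, confidence: float) -> int:
    """Insert one prediction and return its log ID."""
    cur = conn.execute(
        "INSERT INTO prediction_logs (disease, confidence) VALUES (?, ?)",
        (disease, confidence),
    )
    conn.commit()
    return cur.lastrowid

log_id = log_prediction("bacterial_blight", 0.97)
print(log_id)  # first insert gets ID 1
```

Because only this service touches the database, the web and ML containers stay stateless and can be rebuilt or scaled freely.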
- Database (`postgres`):
  - Role: Persistent storage for the Analytics service.
  - Docker: Official `postgres:15` image with named volumes for data persistence.
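A minimal sketch of how the named volume could be declared (the volume name `pgdata` and the environment value are illustrative; the real values live in `docker-compose.yml`):

```yaml
services:
  postgres:
    image: postgres:15
    environment:
      POSTGRES_PASSWORD: example        # illustrative placeholder
    volumes:
      - pgdata:/var/lib/postgresql/data # survives container recreation

volumes:
  pgdata:                               # named volume managed by Docker
```

Because the data lives in the named volume rather than the container's writable layer, `docker-compose down` keeps it and only `docker-compose down -v` deletes it.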
- Visualization (`grafana`):
  - Role: Operational dashboards to monitor disease trends, upload stats, and system usage.
  - Docker: Official `grafana/grafana` image with pre-provisioned dashboards and datasources.
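Pre-provisioning in Grafana works by mounting YAML files under `/etc/grafana/provisioning/`. A hedged example of a datasource file (the datasource name, database name, and credentials are illustrative assumptions):

```yaml
# e.g. grafana/provisioning/datasources/postgres.yml
apiVersion: 1
datasources:
  - name: Analytics DB
    type: postgres
    url: postgres:5432        # the Compose service name resolves on the shared network
    user: postgres
    jsonData:
      database: analytics
      sslmode: disable
```

This way the dashboard's datasource exists on first boot, with no manual clicking in the Grafana UI.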
Prerequisites:

- Docker Desktop installed and running.
- Clone the repository:

  ```bash
  git clone <repository-url>
  cd <repository-name>
  ```
- Setup the Model:

  Note: The actual deep learning model (`model.h5`) is not included in this repository to keep it lightweight.

  - Place your trained model file (e.g., `model.h5`) in the `ml/Model/` directory.
  - Ensure the filename matches `MODEL_PATH` in `ml/main.py` (default: `model.h5`).
- Build and Run: Use Docker Compose to build the images and start the entire stack with a single command:

  ```bash
  docker-compose up -d --build
  ```

  - `-d`: Runs containers in the background (detached mode).
  - `--build`: Forces a rebuild of images to ensure code changes are applied.
- Access the Services:

  - Web App: http://localhost:5000
  - Grafana Dashboards: http://localhost:3000
    - Default Credentials: `admin` / `admin`
- Stop the Application:

  ```bash
  docker-compose down
  ```

  To stop and also remove the named volumes (resetting the database):

  ```bash
  docker-compose down -v
  ```
- Admin Maintenance (Reset Data): Only for administrators. To wipe all project data (ML cache, analytics log, and uploads) while the system is running:

  PowerShell:

  ```powershell
  curl -Method POST -Uri http://localhost:5000/admin/reset-project -Headers @{ "X-Admin-Secret" = "supersecretadmin" }
  ```

  Bash:

  ```bash
  curl -X POST http://localhost:5000/admin/reset-project -H "X-Admin-Secret: supersecretadmin"
  ```

  (Note: The secret key `supersecretadmin` is defined in `docker-compose.yml`.)
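Server-side, the reset endpoint presumably compares the `X-Admin-Secret` header against the configured secret. A minimal framework-agnostic sketch (the function name and `ADMIN_SECRET` environment variable are illustrative) using a constant-time comparison:

```python
import hmac
import os

# Illustrative: in this stack the real secret would be injected via docker-compose.yml
ADMIN_SECRET = os.environ.get("ADMIN_SECRET", "supersecretadmin")

def is_authorized(headers: dict) -> bool:
    """Constant-time check of the X-Admin-Secret request header."""
    supplied = headers.get("X-Admin-Secret", "")
    return hmac.compare_digest(supplied, ADMIN_SECRET)

print(is_authorized({"X-Admin-Secret": "supersecretadmin"}))  # True
print(is_authorized({"X-Admin-Secret": "wrong"}))             # False
```

`hmac.compare_digest` avoids short-circuiting on the first mismatched character, which closes off timing side channels that a plain `==` would leave open.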
The directory structure acts as a monorepo for the microservices:

```
├── web/        # Flask Web Service
├── ml/         # FastAPI ML Service
├── analytics/  # FastAPI Analytics Service
├── grafana/    # Dashboard Configuration
└── docker-compose.yml
```
To work on a specific service (e.g., `web`), you can rebuild just that container:

```bash
docker-compose up -d --build web
```

API endpoints:

- `POST /predict`: Input: `file` (image). Output: JSON with class probabilities.
- `POST /log-prediction`: Input: JSON prediction data. Output: Log ID.
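A client call to `/log-prediction` can be sketched with the standard library alone. The payload field names and the analytics service URL below are assumptions, since the exact schema lives in the analytics service itself:

```python
import json
import urllib.request

def build_log_request(disease: str, confidence: float,
                      url: str = "http://localhost:8000/log-prediction"):
    """Build (but do not send) a JSON POST request for the analytics service."""
    payload = json.dumps({"disease": disease, "confidence": confidence}).encode()
    return urllib.request.Request(
        url,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_log_request("healthy", 0.93)
# urllib.request.urlopen(req) would then return the response containing the log ID
```

Separating request construction from sending keeps the ML service testable without a running analytics container.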
