This repository contains the official code and resources for the research presented in the following papers:
- "Quantized YOLO for Real-Time Weed Detection: An Affordable Embedded System for Precision Agriculture", Proceedings of SBIAGRO 2025
- "Real-Time Weed Detection on Low-Cost Embedded Devices Using Quantized YOLO", 2025 16th IEEE International Conference on Industry Applications (INDUSCON)
This research addresses challenges in precision agriculture through the development of an affordable embedded system for real-time weed detection. By applying advanced deep learning optimization techniques to YOLOv5 neural networks deployed on edge computing platforms, we demonstrate how agricultural computer vision can be made accessible for producers of all sizes. Our methodology combines post-training quantization (16-bit floating point, 8-bit integer) and ONNX format conversion with systematic performance benchmarking on the Raspberry Pi 4B platform.
- Affordable Hardware: Designed for Raspberry Pi 4B (~$60), democratizing precision agriculture.
- Deep Learning Optimization: Utilizes Quantization (FP16, INT8) and ONNX for real-time performance.
- Agricultural Relevance: Trained on the 4Weed Dataset covering Cocklebur, Foxtail, Redroot Pigweed, and Giant Ragweed.
- Real-Time Performance: Achieves up to 2.66 FPS (ONNX) on RPi 4B, suitable for tractor speeds up to 10 km/h.
We used the 4Weed Dataset containing 618 images of four weed species:
- Cocklebur (Xanthium strumarium)
- Foxtail (Setaria viridis)
- Redroot Pigweed (Amaranthus retroflexus)
- Giant Ragweed (Ambrosia trifida)
The dataset was split into training (518) and validation (100) sets.
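The 518/100 split can be reproduced with a simple deterministic shuffle; the sketch below is illustrative (the file names and seed are placeholder assumptions, not taken from the repository's actual preparation notebook):

```python
import random

def split_dataset(image_paths, n_train=518, n_val=100, seed=42):
    """Deterministically split image paths into train/val lists."""
    assert len(image_paths) == n_train + n_val
    paths = sorted(image_paths)          # stable order before shuffling
    random.Random(seed).shuffle(paths)   # seeded shuffle for reproducibility
    return paths[:n_train], paths[n_train:]

# Placeholder file names; the real 4Weed dataset has 618 annotated images.
images = [f"img_{i:03d}.jpg" for i in range(618)]
train, val = split_dataset(images)
print(len(train), len(val))  # 518 100
```

Fixing the seed keeps the split identical across runs, which matters when comparing quantized variants of the same trained model.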
We explored YOLOv5n, s, m, l, and x models. Training was conducted at 640x640 resolution for 150 epochs.
- FP16: 16-bit floating point quantization.
- INT8: 8-bit integer quantization (Post-Training Quantization).
- ONNX: Open Neural Network Exchange format.
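As a rough illustration of the idea behind INT8 post-training quantization, the toy sketch below maps a float tensor to 8-bit integers with an affine scale and zero point. This is a pure-Python illustration with made-up weight values, not the export/quantization tooling the notebooks actually use:

```python
def quantize_int8(values):
    """Affine (asymmetric) quantization: floats -> integers in [0, 255]."""
    lo, hi = min(values), max(values)
    scale = (hi - lo) / 255.0            # float step represented by one integer step
    zero_point = round(-lo / scale)      # integer that maps back to 0.0
    q = [max(0, min(255, round(v / scale) + zero_point)) for v in values]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Recover approximate floats from the quantized integers."""
    return [(qi - zero_point) * scale for qi in q]

weights = [-0.8, -0.25, 0.0, 0.4, 1.2]            # toy "weights"
q, scale, zp = quantize_int8(weights)
recovered = dequantize(q, scale, zp)
max_err = max(abs(a - b) for a, b in zip(weights, recovered))
print(q)                                          # [0, 70, 102, 153, 255]
print(max_err < scale)                            # True: error bounded by one step
```

The same trade-off drives the benchmark table below: INT8 storage is 4x smaller than FP32 and integer arithmetic is cheaper on the Raspberry Pi's CPU, at the cost of a bounded rounding error per weight.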
| Model | Format | mAP@0.5 | FPS | Inference (s) |
|---|---|---|---|---|
| YOLOv5n | ONNX | 0.727 | 2.66 | 0.376 |
| YOLOv5s | ONNX | 0.785 | 1.15 | 0.869 |
| YOLOv5l | INT8 | 0.815 | 0.18 | 5.685 |
Conclusion:
- YOLOv5n-ONNX is best for real-time mobile applications (spraying at ~10 km/h).
- YOLOv5l-INT8 is best for high-accuracy stationary analysis.
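The ~10 km/h figure can be sanity-checked with a back-of-the-envelope calculation: at 2.66 FPS, each inference corresponds to roughly one metre of forward travel, so spraying in motion is feasible provided the camera's forward field of view covers at least that distance (our assumption here, not a number from the papers):

```python
def ground_advance_per_frame(speed_kmh: float, fps: float) -> float:
    """Metres the vehicle advances between consecutive inferences."""
    speed_ms = speed_kmh / 3.6    # km/h -> m/s
    return speed_ms / fps

advance = ground_advance_per_frame(10.0, 2.66)   # YOLOv5n-ONNX on RPi 4B
print(f"{advance:.2f} m between frames")         # 1.04 m between frames
```

By the same formula, YOLOv5l-INT8 at 0.18 FPS would cover over 15 m per inference at that speed, which is why it is recommended for stationary analysis only.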
- Python 3.8+
- Git
- Clone the repository:

  ```bash
  git clone https://github.com/LuizCMarquesJr/Smart-Agricultural-Weed-Management-Real-Time-Detection-System-Using-Quantized-YOLO.git
  cd Smart-Agricultural-Weed-Management-Real-Time-Detection-System-Using-Quantized-YOLO
  ```

- Install dependencies:

  ```bash
  pip install -r requirements.txt
  ```

- (Optional) Clone the YOLOv5 repository if you plan to retrain:

  ```bash
  git clone https://github.com/ultralytics/yolov5
  cd yolov5
  pip install -r requirements.txt
  cd ..
  ```
Run the dataset preparation notebook to download the 4Weed dataset and organize it for YOLOv5:
`notebooks/01_Dataset_Preparation.ipynb`
To reproduce the training results:
`notebooks/02_YOLOv5_Training.ipynb`
To convert trained models to ONNX and apply quantization:
`notebooks/03_Quantization_and_Export.ipynb`
If you use this work, please cite our papers:
```bibtex
@article{marques2025quantized,
  title={Quantized YOLO for Real-Time Weed Detection: An Affordable Embedded System for Precision Agriculture},
  author={Marques Jr, L. C. and Papa, J. P. and Ulson, J. A. C.},
  journal={Proceedings of SBIAGRO},
  year={2025}
}

@inproceedings{marques2025realtime,
  title={Real-Time Weed Detection on Low-Cost Embedded Devices Using Quantized YOLO},
  author={Marques Jr, L. C. and Ulson, J. A. C.},
  booktitle={2025 16th IEEE International Conference on Industry Applications (INDUSCON)},
  pages={1--6},
  year={2025},
  organization={IEEE},
  doi={10.1109/INDUSCON66435.2025.11241472}
}
```

This project is part of a broader research initiative detailed in the PhD thesis. For additional scripts, implementations, and the full thesis context, please refer to:
- Phd-Thesis-Embedded-Yolov5: Contains the comprehensive scripts and implementation of "Redes Neurais Profundas com Redução de Pesos para uso em Hardwares de Baixo Custo Aplicadas à Identificação de Espécies Invasoras em..." (Deep Neural Networks with Weight Reduction for Use on Low-Cost Hardware Applied to the Identification of Invasive Species in...)
This project is licensed under the GPL-3.0 License.