# TensorRT YOLO
TensorRT YOLO is an inference acceleration project that supports a wide range of YOLO models, from YOLOv3 and YOLOv5 through YOLO11, as well as PP-YOLOE and PP-YOLOE+. It leverages NVIDIA TensorRT for optimization, aiming to provide a fast and efficient object detection solution.
## Key Features
- Model Support: supports numerous YOLO models, from YOLOv3 and YOLOv5 up to YOLO11, including PP-YOLOE and PP-YOLOE+.
- Detection Capabilities: handles both standard Detection and Oriented Bounding Box (OBB) Detection models.
- In-depth Optimization: integrates TensorRT plugins to speed up post-processing, uses CUDA kernels for fast pre-processing, and employs CUDA Graphs to further accelerate inference.
- Versatility in Programming: offers inference support for both C++ and Python, providing flexibility for developers.
- User-Friendly: includes a command-line interface (CLI) for easy model export and inference execution.
- Easy Deployment: supports Docker for straightforward deployment setups.
## System Requirements
To get the most out of TensorRT YOLO, it is recommended to use:
- CUDA version 11.6 or higher
- TensorRT version 8.6 or higher
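If you want to verify these prerequisites programmatically, a small helper can compare dotted version strings (as reported by `nvcc --version` or `tensorrt.__version__`) against the recommended minimums. This sketch is illustrative and not part of the project:

```python
def meets_minimum(version: str, minimum: str) -> bool:
    """Compare dotted version strings numerically, e.g. '8.10.1' >= '8.6'."""
    def parse(v):
        # Keep only the numeric components so suffixes like '11.8.r11.8' don't break parsing.
        return tuple(int(p) for p in v.split(".") if p.isdigit())
    return parse(version) >= parse(minimum)

# Example checks against the recommended minimums above.
assert meets_minimum("11.8", "11.6")      # CUDA OK
assert meets_minimum("8.6.1", "8.6")      # TensorRT OK
assert not meets_minimum("8.5.3", "8.6")  # too old
```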
## How to Use
TensorRT YOLO provides comprehensive documentation to guide users through the setup and implementation process:
- Quick Compilation and Installation: Instructions can be found here.
- Model Exportation via CLI: Guidance available here.
- Model Inference Examples: Demonstrations can be accessed here.
- Video Analysis Examples: Visit here for more information.
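The linked guides cover the project's own CLI in detail. For orientation, a typical TensorRT deployment flow looks like the following; the file names are placeholders, and the project's export command may differ from what is shown, so consult the export guide for the exact invocation:

```shell
# 1. Export the trained model to ONNX (via the project's CLI or your
#    training framework's exporter -- see the export guide above).

# 2. Build a TensorRT engine from the ONNX file with trtexec, the
#    benchmarking/build tool that ships with TensorRT. FP16 usually
#    gives a large speedup on GPUs with fast half-precision support.
trtexec --onnx=yolo.onnx --saveEngine=yolo.engine --fp16

# 3. Run inference against the engine using the project's C++ or
#    Python examples (see the inference examples above).
```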
## Additional Resources
For visual learners, several videos are available on BiliBili:
- A comprehensive deployment tool overview
- Demonstrations of custom plugin acceleration
- VideoPipe integration examples
- Utilizing CUDA Graphs for inference acceleration
- Docker deployment guides
## Support the Developer
Creating and maintaining open-source projects like TensorRT YOLO is challenging. If the project has helped you, consider buying the developer a cup of coffee as a token of appreciation. Your support is instrumental in keeping this project alive and thriving.
## License
TensorRT YOLO is released under the GPL-3.0 License, an OSI-approved open-source license that encourages open collaboration and knowledge sharing, making it well suited for students and enthusiasts. For details, please refer to the LICENSE file.
## Contact
For reporting bugs or making feature requests, visit the GitHub Issues page of TensorRT YOLO. Your feedback is invaluable for the continuous improvement of the project.