Machine Learning Using YOLOv5
Published on January 16, 2025
One of the finest examples of an accessible yet powerful machine learning framework is YOLO (You Only Look Once), in particular YOLOv5.
This framework provides a neural-network-based computer vision system for object detection and real-time frame analysis.
(GitHub: https://github.com/ultralytics/yolov5)
Some of YOLOv5's features include:
- training custom vision models
- multi-GPU training (see the example below)
- multi-stream inference
- custom training datasets
- specialized models for specific or custom object classes
- detailed logging and analytics
Comet ML logging and visualization are also available during model training.
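As a quick illustration of custom and multi-GPU training, the command below is a minimal sketch using the repository's train.py entry point; it assumes two local GPUs (device IDs 0 and 1) and uses the coco128.yaml dataset config bundled with the repo as a stand-in for your own dataset file.
python -m torch.distributed.run --nproc_per_node 2 train.py --data coco128.yaml --weights yolov5s.pt --device 0,1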
Running
python detect.py --weights yolov5n.pt --source 0
is enough to run inference with the nano-scale YOLOv5 model, pretrained on the COCO dataset, on a device as low-powered as a Raspberry Pi, where most of the per-frame processing delay comes from the Python interpreter.
Source 0 corresponds to /dev/video0 on a Linux system.
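For scripted use instead of the command line, YOLOv5 also exposes a PyTorch Hub entry point. The snippet below is a minimal sketch; the example image URL comes from the Ultralytics documentation, and a local file path, PIL image, or numpy array would work equally well.

import torch

# Load the pretrained nano model via the official PyTorch Hub entry point
# (weights are downloaded on first use).
model = torch.hub.load('ultralytics/yolov5', 'yolov5n', pretrained=True)

# Run inference on a single image.
results = model('https://ultralytics.com/images/zidane.jpg')

results.print()  # print detected classes and confidences
results.save()   # save an annotated copy under runs/detect/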
On a more powerful system, even a laptop CPU, it is quite possible to analyze and detect objects in real time from a webcam or similar video source at 60 fps at Full HD resolution.
Running inference on CUDA-enabled GPUs, and especially on RTX-based systems with Tensor Cores, makes the processing dramatically faster.
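To take advantage of such hardware, detect.py accepts a --device flag to select a GPU and a --half flag for FP16 inference; the line below is a sketch assuming a single CUDA device with ID 0.
python detect.py --weights yolov5s.pt --source 0 --device 0 --half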
For instance, YOLOv5 can effectively detect and track objects in real time from various video sources. YOLO's functionality also extends to human pose detection and more advanced, complete feature sets in newer versions, which improve training and inference times as well as the variety of practical applications for the model.
The short video below demonstrates how YOLOv5, running on a laptop with a connected webcam, can detect a complex object such as a cat.
Video 1: Example YOLOv5s application (single-class object detection).
All mentioned technologies, including open source projects, brand names, and product names or descriptions, are the intellectual property/trademarks of their respective owners.