A Comparative Study of YOLOv5 and YOLOv7 Object Detection Algorithms
DOI: https://doi.org/10.33736/jcsi.5070.2023
Keywords: YOLOv5, YOLOv7, Object detection, Computer Vision, Detection Algorithm
Abstract
This paper presents a comparative analysis of the widely adopted YOLOv5 and the more recent YOLOv7. A custom model was trained independently with each algorithm to determine which of the two performs better in terms of precision, recall, mAP@0.5 and mAP@0.5:0.95. The experiments use a custom Remote Weapon Station dataset of 9,779 images containing 21,561 annotations across four classes, obtained from the Google Open Images Dataset, the Roboflow Public Dataset and a locally sourced dataset. The four classes are Persons, Handguns, Rifles and Knives. YOLOv7 achieved a precision of 52.8%, a recall of 56.4%, an mAP@0.5 of 51.5% and an mAP@0.5:0.95 of 31.5%, while YOLOv5 achieved a precision of 62.6%, a recall of 53.4%, an mAP@0.5 of 55.3% and an mAP@0.5:0.95 of 34.2%. YOLOv5 thus outperformed YOLOv7 in precision, mAP@0.5 and mAP@0.5:0.95, while YOLOv7 achieved a higher recall during testing. Overall, YOLOv5 recorded a 4.0% increase in accuracy compared to YOLOv7.
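For context, the metrics reported above follow the standard object-detection definitions, sketched below under the usual convention: a prediction is a true positive (TP) when its intersection-over-union (IoU) with a ground-truth box meets the threshold, a false positive (FP) otherwise, and each undetected ground-truth box is a false negative (FN).

\[
\text{Precision} = \frac{TP}{TP + FP}, \qquad \text{Recall} = \frac{TP}{TP + FN}
\]

\[
\text{mAP} = \frac{1}{N}\sum_{c=1}^{N} AP_c, \qquad \text{mAP@0.5:0.95} = \frac{1}{10}\sum_{t \in \{0.50,\, 0.55,\, \ldots,\, 0.95\}} \text{mAP@}t
\]

Here \(AP_c\) is the area under the precision-recall curve for class \(c\) and \(N\) is the number of classes (four in this study). mAP@0.5 evaluates detections at a single IoU threshold of 0.5, whereas mAP@0.5:0.95 averages over thresholds from 0.50 to 0.95 in steps of 0.05 and therefore also rewards accurate box localisation, which is why its values are consistently lower.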
References
Bochkovskiy, A., Wang, C. Y., & Liao, H. Y. M. (2020). YOLOv4: Optimal speed and accuracy of object detection. arXiv preprint arXiv:2004.10934.
Banerjee, A. (2022). YOLOv5 vs YOLOv6 vs YOLOv7. Retrieved October 12, 2022, from https://www.learnwitharobot.com/p/yolov5-vs-yolov6-vs-yolov7/.
Cengil, E., & Cinar, A. (2021). Poisonous mushroom detection using YOLOV5. Turkish Journal of Science and Technology, 16(1), 119-127.
Li, C., Li, L., Jiang, H., Weng, K., Geng, Y., Li, L., Ke, Z., Li, Q., Cheng, M., Nie, W., Li, Y., Zhang, B., Liang, Y., Zhou, L., Xu, X., Chu, X., Wei, X., & Wei, X. (2022). YOLOv6: A single-stage object detection framework for industrial applications. arXiv preprint arXiv:2209.02976.
Dima, T. F., & Ahmed, M. E. (2021, July). Using YOLOv5 Algorithm to Detect and Recognize American Sign Language. In 2021 International Conference on Information Technology (ICIT) (pp. 603-607). IEEE.
https://doi.org/10.1109/ICIT52682.2021.9491672
Google Open Images. (n.d.). Google Open Images Dataset of Person, Handgun, Rifle and Knife. Retrieved from https://storage.googleapis.com/openimages/web/visualizer/index.html.
Górriz, J. M., Ramírez, J., Ortíz, A., Martínez-Murcia, F. J., Segovia, F., Suckling, J. & Ferrández, J. M. (2020). Artificial intelligence within the interplay between natural and artificial computation: Advances in data science, trends and applications. Neurocomputing, 410, 237-270.
https://doi.org/10.1016/j.neucom.2020.05.078
Hao, X., Bo, L., & Fei, Z. (2021). Light-YOLOv5: A Lightweight Algorithm for Improved YOLOv5 in Complex Fire Scenarios.
Hussain, M., Al-Aqrabi, H., Munawar, M., Hill, R., & Alsboui, T., (2022). Domain Feature Mapping with YOLOv7 for Automated Edge-Based Pallet Racking Inspections. Sensors, 22, 6927.
https://doi.org/10.3390/s22186927
Jia, W., Xu, S., Liang, Z., Zhao, Y., Min, H., Li, S., & Yu, Y. (2021). Real‐time automatic helmet detection of motorcyclists in urban traffic using improved YOLOv5 detector. IET Image Processing, 15(14), 3623-3637.
https://doi.org/10.1049/ipr2.12295
Kasper-Eulaers, M., Hahn, N., Berger, S., Sebulonsen, T., Myrland, Ø. & Kummervold, P. E. (2021). Detecting heavy goods vehicles in rest areas in winter conditions using YOLOv5. Algorithms, 14(4), 114.
https://doi.org/10.3390/a14040114
Liu, W., Wang, Z., Zhou, B., Yang, S., & Gong, Z. (2021, May). Real-time signal light detection based on YOLOv5 for railway. In IOP Conference Series: Earth and Environmental Science (Vol. 769, No. 4, p. 042069). IOP Publishing.
https://doi.org/10.1088/1755-1315/769/4/042069
Malta, A., Mendes, M., & Farinha, T. (2021). Augmented reality maintenance assistant using YOLOv5. Applied Sciences, 11(11), 4758.
https://doi.org/10.3390/app11114758
Nepal, U., & Eslamiat, H. (2022). Comparing YOLOv3, YOLOv4 and YOLOv5 for Autonomous Landing Spot Detection in Faulty UAVs. Sensors, 22(2), 464.
https://doi.org/10.3390/s22020464
Padilla, R., Passos, W. L., Dias, T. L., Netto, S. L., & da Silva, E. A. (2021). A comparative analysis of object detection metrics with a companion open-source toolkit. Electronics, 10(3), 279.
https://doi.org/10.3390/electronics10030279
Patel, D., Patel, S., & Patel, M. (2022). Application of image-to-image translation in improving pedestrian detection.
Ramya, A., Venkateswara, G. P., Amrutham, B. V., & Sai, S. K. (2021). Comparison of YOLOv3, YOLOv4 and YOLOv5 Performance for Detection of Blood Cells. International Research Journal of Engineering and Technology (IRJET), 8(4), 4225-4229.
Redmon, J., Divvala, S., Girshick, R., & Farhadi, A. (2016). You only look once: Unified, real-time object detection. In Proceedings of the IEEE conference on computer vision and pattern recognition (pp. 779-788).
https://doi.org/10.1109/CVPR.2016.91
Redmon, J., & Farhadi, A. (2018). YOLOv3: An incremental improvement. arXiv preprint arXiv:1804.02767.
Roboflow. (n.d.). Roboflow Public Dataset of Pistols. Retrieved from https://public.roboflow.com/object-detection/pistols
Sahal, M. A. (2021). Comparative Analysis of YOLOv3, YOLOv4 and YOLOv5 for Sign Language Detection. IJARIIE, 7(4).
Wan, J., Chen, B., & Yu, Y. (2021). Polyp Detection from Colorectum Images by Using Attentive YOLOv5. Diagnostics, 11(12), 2264.
https://doi.org/10.3390/diagnostics11122264
Wang, C. Y., Bochkovskiy, A., & Liao, H. Y. M. (2022). YOLOv7: Trainable bag-of-freebies sets new state-of-the-art for real-time object detectors. arXiv preprint arXiv:2207.02696.
Yang, F., Zhang, X., & Liu, B. (2022). Video object tracking based on YOLOv7 and DeepSORT. arXiv preprint arXiv:2207.12202.
Yao, J., Qi, J., Zhang, J., Shao, H., Yang, J., & Li, X. (2021). A real-time detection algorithm for Kiwifruit defects based on YOLOv5. Electronics, 10(14), 1711.