Publication Type : Journal Article
Publisher : Pattern Recognition and Image Analysis
Source : Pattern Recognition and Image Analysis, 32(2), pp.351-362.
Url : https://link.springer.com/article/10.1134/S1054661822020110
Campus : Coimbatore
School : School of Engineering
Department : Electronics and Communication
Verified : No
Year : 2022
Abstract : Object detection and recognition is a significant task in computer vision applications. Advanced driver assistance systems (ADAS) rely predominantly on computer vision as their primary tool. To improve the performance of ADAS, traffic signs are among the important objects that must be detected and recognized to assist drivers in driving safely. Under real-time conditions, this becomes extremely challenging due to varying illumination, image resolution, external weather conditions, the position of the sign board, and occlusions. This article proposes an efficient algorithm that can detect and classify (recognize) traffic signs. Traffic sign processing is carried out in two phases: sign detection and sign recognition through classification. In the first phase, traffic signs are detected using the YOLOv3 architecture by generating seven classes based on shape, color, and background. In the second phase, traffic sign classification is performed using a newly proposed architecture based on convolutional neural networks, operating on the output of the first phase. The German Traffic Sign Detection Benchmark (GTSDB) and German Traffic Sign Recognition Benchmark (GTSRB) datasets have been used for experimentation. The proposed method gives a mean average precision of 89.56% for traffic sign detection and an accuracy of 86.6% for traffic sign recognition, demonstrating the efficacy of the proposed architecture.
Cite this Research Publication : Karthika, R. and Parameswaran, L., 2022. A Novel Convolutional Neural Network Based Architecture for Object Detection and Recognition with an Application to Traffic Sign Recognition from Road Scenes. Pattern Recognition and Image Analysis, 32(2), pp.351-362.
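To make the two-phase pipeline described in the abstract concrete, the following is a minimal Python/PyTorch sketch: phase 1 detects sign regions (here a stand-in for the YOLOv3 detector) and phase 2 classifies each cropped region with a small CNN. The layer sizes, the 43-class output (as in GTSRB), and the `detect_signs` stub are illustrative assumptions, not the architecture proposed in the paper.

```python
# Illustrative two-phase sketch: detection stub + CNN classification of crops.
# NOT the authors' architecture; sizes and the detector stub are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SignClassifier(nn.Module):
    """Small CNN for classifying cropped traffic-sign images (48x48 RGB)."""

    def __init__(self, num_classes: int = 43):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 48 -> 24
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 24 -> 12
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2), # 12 -> 6
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(128 * 6 * 6, 256), nn.ReLU(), nn.Dropout(0.5),
            nn.Linear(256, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))


def detect_signs(image: torch.Tensor) -> list:
    """Placeholder for the YOLOv3 detection phase: returns (x1, y1, x2, y2) boxes.

    In the paper this phase groups signs into seven shape/color/background
    classes; here a single dummy box is returned for illustration only.
    """
    _, h, w = image.shape
    return [(0, 0, min(48, w), min(48, h))]


def recognize(image: torch.Tensor, classifier: SignClassifier) -> list:
    """Phase 2: crop each detected region, resize to 48x48, and classify it."""
    labels = []
    for x1, y1, x2, y2 in detect_signs(image):
        crop = image[:, y1:y2, x1:x2].unsqueeze(0)                 # (1, 3, h, w)
        crop = F.interpolate(crop, size=(48, 48), mode="bilinear")
        with torch.no_grad():
            logits = classifier(crop)
        labels.append(int(logits.argmax(dim=1)))
    return labels


if __name__ == "__main__":
    road_scene = torch.rand(3, 480, 640)                           # dummy RGB frame
    print(recognize(road_scene, SignClassifier().eval()))
```

In practice the detection stub would be replaced by a trained YOLOv3 model producing the seven shape/color/background classes, and the classifier would be trained on GTSRB crops before use.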