Please use this identifier to cite or link to this item: http://studentrepo.iium.edu.my/handle/123456789/5321
Title: Vision based single-shot real-time ego-lane estimation and vehicle detection for forward collision warning system
Authors: Matin, Muhammad Anwar Alhaq A
Year: 2019
Publisher: Kuala Lumpur: International Islamic University Malaysia, 2019
Abstract in English: A vision-based Forward Collision Warning System (FCWS) is a promising driver-assistance feature for alleviating road accidents and making roads safer. In practice, it is exceptionally hard to develop an accurate and efficient FCWS algorithm because of the complexity of the steps involved: vehicle detection, target vehicle verification, and time-to-collision (TTC) estimation. Implementing these steps with classical computer vision methods requires an elaborate pipeline, which limits the robustness of the overall system and the scalability of the algorithm. Advances in deep neural networks (DNNs) have shown unprecedented performance on vision-based object detection, opening the possibility of using them as an effective perception tool for automotive applications. This thesis presents a DNN-based architecture for single-shot vehicle detection and ego-lane estimation, which detects vehicles and estimates ego-lanes simultaneously in a single shot. The SSD-MobileNetv2 architecture was used as the backbone network. Ego-lanes were defined in two ways: first as a second-degree polynomial, and second as semantic regression points. A dataset of 59,068 ego-lane images was collected and labelled, and the MobileNetv2 feature extractor was trained to estimate the ego-lanes. Once the feature extractor was trained for ego-lane estimation, the single-shot detector (SSD) meta-architecture was trained to detect vehicles. The thesis demonstrates that this method achieves real-time performance, with test results of 88% total precision for ego-lane estimation on the CULane dataset and 91% on our own dataset, and 63.7% mAP for vehicle detection on our own dataset. The proposed architecture eliminates the elaborate multi-step pipeline otherwise needed to develop an FCWS algorithm. It runs in real time at 60 fps on a standard PC with an Nvidia GTX 1080, demonstrating its potential to run on an embedded device for a Forward Collision Warning System.
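Two ideas at the core of the abstract, an ego-lane boundary expressed as a second-degree polynomial and a time-to-collision estimate for the lead vehicle, can be illustrated with a minimal sketch. The snippet below is not the thesis implementation: the function names (fit_ego_lane, time_to_collision), the synthetic lane points, and the constant-closing-speed TTC model are assumptions made for illustration only.

```python
# Hypothetical sketch (not the thesis code): fit a second-degree polynomial
# x = a*y^2 + b*y + c to ego-lane points, and compute a naive TTC estimate
# from the change in distance to a lead vehicle between two frames.

import numpy as np


def fit_ego_lane(points):
    """Fit x = a*y^2 + b*y + c to (x, y) lane points.

    `points` is an (N, 2) array of image coordinates; y (the image row)
    serves as the independent variable.
    """
    points = np.asarray(points, dtype=float)
    # np.polyfit returns [a, b, c], highest degree first.
    return np.polyfit(points[:, 1], points[:, 0], deg=2)


def eval_ego_lane(coeffs, ys):
    """Evaluate the fitted lane polynomial at the given row coordinates."""
    return np.polyval(coeffs, ys)


def time_to_collision(distance_now_m, distance_prev_m, dt_s):
    """Constant-closing-speed TTC: current gap divided by closing speed.

    Returns infinity when the gap is not closing.
    """
    closing_speed = (distance_prev_m - distance_now_m) / dt_s
    if closing_speed <= 0:
        return float("inf")
    return distance_now_m / closing_speed


if __name__ == "__main__":
    # Synthetic points along a curved lane boundary (illustrative only).
    ys = np.linspace(300, 700, 20)
    xs = 0.0004 * (ys - 300) ** 2 + 0.1 * ys + 200
    coeffs = fit_ego_lane(np.column_stack([xs, ys]))
    print("lane polynomial coefficients:", coeffs)

    # Lead vehicle was 30.0 m ahead one frame ago, 29.5 m now, at 30 fps.
    print("TTC (s):", time_to_collision(29.5, 30.0, dt_s=1 / 30))
```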
Degree Level: Master
Kulliyyah: Kulliyyah of Engineering
Programme: Master of Science (Mechatronics Engineering).
URI: http://studentrepo.iium.edu.my/jspui/handle/123456789/5321
URL: https://lib.iium.edu.my/mom/services/mom/document/getFile/KZGFv9hBzopB1SZoBFehi95cdM59yCcb20200312160404594
Appears in Collections:KOE Thesis

Files in This Item:
File: t11100409616MohdAnwarAlhaqAMatin_SEC_24.pdf (24 pages file, 434.65 kB, Adobe PDF)
File: t11100409616MohdAnwarAlhaqAMatin_SEC.pdf (Full text secured file, Restricted Access, 2.62 MB, Adobe PDF)

Page view(s): 12 (checked on May 20, 2021)
Download(s): 22 (checked on May 20, 2021)

Items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated. Please give due acknowledgement and credits to the original authors and IIUM where applicable. No items shall be used for commercialization purposes except with written consent from the author.