Please use this identifier to cite or link to this item: http://studentrepo.iium.edu.my/handle/123456789/5321
Full metadata record
DC Field | Value | Language
dc.contributor.author | Matin, Muhammad Anwar Alhaq A | en_US
dc.date.accessioned | 2020-08-20T11:26:02Z | -
dc.date.available | 2020-08-20T11:26:02Z | -
dc.date.issued | 2019 | -
dc.identifier.uri | http://studentrepo.iium.edu.my/jspui/handle/123456789/5321 | -
dc.description.abstract | A vision-based Forward Collision Warning System (FCWS) is a promising driver-assist feature that can help reduce road accidents and make roads safer. In practice, it is exceptionally hard to develop an accurate and efficient algorithm for FCWS because of the complexity of the steps involved: vehicle detection, target vehicle verification, and time-to-collision (TTC) estimation. Implementing these steps with classical computer vision methods requires an elaborate pipeline, which limits both the robustness and the scalability of the overall system. Advances in deep neural networks (DNNs) have delivered unprecedented performance in vision-based object detection, opening up their potential as an effective perception tool for automotive applications. This thesis presents a DNN-based architecture that detects vehicles and estimates ego-lanes simultaneously in a single shot, using the SSD-MobileNetv2 architecture as its backbone network. Ego-lanes are defined in two ways in this thesis: first as a second-degree polynomial and second as semantic regression points. We collected and labelled 59,068 ego-lane images and trained the MobileNetv2 feature extractor to estimate where the ego-lanes are. Once the feature extractor was trained for ego-lane estimation, the single-shot detector (SSD) meta-architecture was trained to detect vehicles. The thesis demonstrates that this method achieves real-time performance, with test results of 88% total precision on the CULane dataset and 91% on our own dataset for ego-lane estimation. Moreover, we achieve 63.7% mAP for vehicle detection on our own dataset. The proposed architecture eliminates the elaborate multi-step pipeline otherwise required to develop an FCWS algorithm, and it runs in real time at 60 fps on a standard PC with an Nvidia GTX 1080, demonstrating its potential to run on an embedded device for a Forward Collision Warning System. | en_US
dc.language.iso | en | en_US
dc.publisher | Kuala Lumpur : International Islamic University Malaysia, 2019 | en_US
dc.rights | Copyright International Islamic University Malaysia
dc.title | Vision based single-shot real-time ego-lane estimation and vehicle detection for forward collision warning system | en_US
dc.type | Master Thesis | en_US
dc.identifier.url | https://lib.iium.edu.my/mom/services/mom/document/getFile/KZGFv9hBzopB1SZoBFehi95cdM59yCcb20200312160404594 | -
dc.description.identity | t11100409616MohdAnwarAlhaqAMatin | en_US
dc.description.identifier | Thesis : Vision based single-shot real-time ego-lane estimation and vehicle detection for forward collision warning system / by Muhammad Anwar Alhaq A Matin | en_US
dc.description.kulliyah | Kulliyyah of Engineering | en_US
dc.description.programme | Master of Science (Mechatronics Engineering). | en_US
dc.description.degreelevel | Master | en_US
dc.description.notes | Thesis (MSMCT)--International Islamic University Malaysia, 2019. | en_US
dc.description.physicaldescription | xiv, 63 leaves : colour illustrations ; 30 cm. | en_US
item.openairetype | Master Thesis | -
item.grantfulltext | open | -
item.fulltext | With Fulltext | -
item.languageiso639-1 | en | -
item.openairecristype | http://purl.org/coar/resource_type/c_18cf | -
item.cerifentitytype | Publications | -
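
The abstract above notes that ego-lanes are represented as a second-degree polynomial and that time-to-collision (TTC) is one of the FCWS steps. The snippet below is a minimal, hypothetical Python sketch of those two ideas only; it is not taken from the thesis, and the function names, input format, and the constant-velocity TTC formula are assumptions made purely for illustration.

# Illustrative sketch only: the thesis represents the ego-lane as a
# second-degree polynomial and lists time-to-collision (TTC) among the
# FCWS steps; the exact formulation used in the thesis may differ.
import numpy as np

def fit_ego_lane(points_xy):
    """Fit x = a*y^2 + b*y + c to lane points (x, y) in image coordinates.

    points_xy: (N, 2) array-like of lane points, e.g. the semantic
    regression points predicted by the network (hypothetical format).
    """
    pts = np.asarray(points_xy, dtype=float)
    # Fit x as a function of y, since the ego-lane runs roughly vertically
    # through the image.
    a, b, c = np.polyfit(pts[:, 1], pts[:, 0], deg=2)
    return a, b, c

def time_to_collision(distance_m, ego_speed_mps, lead_speed_mps):
    """Constant-velocity TTC: range divided by closing speed."""
    closing_speed = ego_speed_mps - lead_speed_mps
    if closing_speed <= 0:
        return float("inf")  # not closing in on the lead vehicle
    return distance_m / closing_speed

if __name__ == "__main__":
    # Toy numbers for demonstration only.
    lane_points = [(300, 700), (320, 600), (350, 500), (390, 400)]
    print("lane polynomial (a, b, c):", fit_ego_lane(lane_points))
    print("TTC [s]:", time_to_collision(distance_m=25.0,
                                         ego_speed_mps=20.0,
                                         lead_speed_mps=15.0))

In the thesis itself, the lane points would come from the trained MobileNetv2 feature extractor rather than hand-picked coordinates.
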
Appears in Collections: KOE Thesis
Files in This Item:
File | Description | Size | Format
t11100409616MohdAnwarAlhaqAMatin_SEC_24.pdf | 24 pages file | 434.65 kB | Adobe PDF
t11100409616MohdAnwarAlhaqAMatin_SEC.pdf (Restricted Access) | Full text secured file | 2.62 MB | Adobe PDF
Items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated. Please give due acknowledgement and credit to the original authors and IIUM where applicable. No item shall be used for commercial purposes except with written consent from the author.