Please use this identifier to cite or link to this item: http://studentrepo.iium.edu.my/handle/123456789/5309
Full metadata record
DC Field | Value | Language
dc.contributor.author | Nor Nadirah binti Abdul Aziz | en_US
dc.date.accessioned | 2020-08-20T11:25:48Z | -
dc.date.available | 2020-08-20T11:25:48Z | -
dc.date.issued | 2016 | -
dc.identifier.uri | http://studentrepo.iium.edu.my/jspui/handle/123456789/5309 | -
dc.description.abstract | The rise of crime has increased demand for automated video surveillance systems, since humans cannot vigilantly monitor surveillance footage for long periods. Video surveillance is an important tool for detecting snatch-theft crime: it involves detecting and tracking objects, which can provide detailed information about their appearance and biometric characteristics. Tracking moving objects across multiple cameras is more challenging than tracking in a single camera view because of variations in illumination, pose and viewing angle, and cameras with non-overlapping views lack spatial continuity between them, which makes the task harder still. Most existing tracking methods perform well for a single camera but not for multiple cameras, and the available trackers that do work well in multi-camera environments tend to have high computational cost. This thesis builds on prior studies to select the optimal features from an object's appearance and to develop a tracking algorithm for multiple non-overlapping camera views that offers the best trade-off between accuracy and speed. A method based on an adaptive Gaussian Mixture Model and background subtraction is presented to extract foreground objects. The proposed tracking algorithm is formulated on visual appearance, including Hue colour, YCbCr colour, texture, shape and edge features extracted from the upper and lower parts of the body, for correspondence management, and a position cue is used in single-camera tracking to reduce computational cost. A comparison of the effectiveness of these features is presented in the results section. Based on frame-based performance, the proposed framework achieves an accuracy of 95.97 percent at 43.967 frames per second (fps) for a single camera; for two and three non-overlapping cameras, the overall accuracy is 99.29 percent and 99.73 percent at 26.30 fps and 17.54 fps respectively. The experimental results show that the proposed algorithm is suitable for real-time operation. | en_US
dc.language.iso | en | en_US
dc.publisher | Kuala Lumpur : International Islamic University Malaysia, 2016 | en_US
dc.rights | Copyright International Islamic University Malaysia
dc.subject.lcsh | Video surveillance | en_US
dc.subject.lcsh | Closed-circuit television | en_US
dc.subject.lcsh | Electronic surveillance | en_US
dc.title | Tracking moving objects across distributed cameras | en_US
dc.type | Master Thesis | en_US
dc.identifier.url | https://lib.iium.edu.my/mom/services/mom/document/getFile/kmDIeqSkwgmUs4b2yRXu9HbAx1xCz6xp20170504093022562 | -
dc.description.identity | t11100350581NorNadirah | en_US
dc.description.identifier | Thesis : Tracking moving objects across distributed cameras / by Nor Nadirah binti Abdul Aziz | en_US
dc.description.kulliyah | Kulliyyah of Engineering | en_US
dc.description.programme | Master of Science (Mechatronics Engineering) | en_US
dc.description.degreelevel | Master
dc.description.callnumber | t TK 6680.3 N822T 2016 | en_US
dc.description.notes | Thesis (MSMCT)--International Islamic University Malaysia, 2016. | en_US
dc.description.physicaldescription | xxi, 213 leaves : color illustrations ; 30 cm. | en_US
item.openairetype | Master Thesis | -
item.grantfulltext | open | -
item.fulltext | With Fulltext | -
item.languageiso639-1 | en | -
item.openairecristype | http://purl.org/coar/resource_type/c_18cf | -
item.cerifentitytype | Publications | -
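
The abstract above describes a pipeline of adaptive Gaussian Mixture Model (GMM) background subtraction for foreground extraction, followed by appearance matching on colour, texture, shape and edge features taken from the upper and lower body. The snippet below is a minimal illustrative sketch of such a pipeline, not the thesis's actual implementation: it uses OpenCV's MOG2 subtractor as a stand-in for the adaptive GMM, and the input file name, histogram bin counts, thresholds and helper function names are hypothetical.

```python
# Illustrative sketch only: adaptive-GMM-style background subtraction plus
# upper/lower-body colour signatures for cross-camera correspondence.
# All parameter values and names below are assumptions, not the thesis's settings.
import cv2
import numpy as np

def extract_foreground(frame, subtractor):
    """Apply the GMM background subtractor and clean the resulting mask."""
    mask = subtractor.apply(frame)                                # per-pixel GMM classification
    mask = cv2.medianBlur(mask, 5)                                # suppress salt-and-pepper noise
    _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)    # drop shadow pixels (value 127)
    return mask

def appearance_signature(frame, bbox):
    """Hue and Cr histograms of the upper and lower halves of a detected person."""
    x, y, w, h = bbox
    person = frame[y:y + h, x:x + w]
    upper, lower = person[: h // 2], person[h // 2 :]
    signature = []
    for part in (upper, lower):
        hsv = cv2.cvtColor(part, cv2.COLOR_BGR2HSV)
        ycrcb = cv2.cvtColor(part, cv2.COLOR_BGR2YCrCb)
        hue_hist = cv2.calcHist([hsv], [0], None, [32], [0, 180])
        cr_hist = cv2.calcHist([ycrcb], [1], None, [32], [0, 256])
        for hist in (hue_hist, cr_hist):
            cv2.normalize(hist, hist)
            signature.append(hist)
    return signature

def match_score(sig_a, sig_b):
    """Mean histogram correlation between two signatures (higher = more similar)."""
    return float(np.mean([cv2.compareHist(a, b, cv2.HISTCMP_CORREL)
                          for a, b in zip(sig_a, sig_b)]))

if __name__ == "__main__":
    cap = cv2.VideoCapture("camera1.mp4")                         # hypothetical input clip
    subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=True)
    ok, frame = cap.read()
    while ok:
        fg = extract_foreground(frame, subtractor)
        # Contours of the foreground mask give candidate person bounding boxes.
        contours, _ = cv2.findContours(fg, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        boxes = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 500]
        signatures = [appearance_signature(frame, b) for b in boxes]
        # In a multi-camera setting, these signatures would be compared against
        # stored tracks from other cameras via match_score().
        ok, frame = cap.read()
    cap.release()
```

Splitting each detection into upper and lower halves keeps the shirt and trouser colour signatures separate, which is a common way to make appearance matching more discriminative when the same person must be re-identified in another camera's view.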
Appears in Collections: KOE Thesis
Files in This Item:
File | Description | Size | Format
t11100350581NorNadirah_SEC_24.pdf | 24 pages file | 826.48 kB | Adobe PDF
t11100350581NorNadirah_SEC.pdf (Restricted Access) | Full text secured file | 8.34 MB | Adobe PDF

Items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated. Please give due acknowledgement and credits to the original authors and IIUM where applicable. No items shall be used for commercialization purposes except with written consent from the author.