
Occlusion vehicle segmentation algorithm in crowded scene for traffic surveillance system

Phan H.N. (School of Computer Science and Engineering, International University, Vietnam National University HCMC, Block 6, Linh Trung, Thu Duc District, Ho Chi Minh City, Viet Nam) | Ha S.V.-U. | Tran D.N.-N. | Pham L.H.

Advances in Intelligent Systems and Computing, 2018 (Vol. 672, pp. 584-595)

ISSN: 2194-5357

DOI: 10.1007/978-981-10-7512-4_58

Indexed in: Scopus

Adv. Intell. Syst. Comput.

Language: English

Keywords: Cameras; Developing countries; Image segmentation; Information systems; Information use; Monitoring; Security systems; Systems analysis; Blob splitting; Moving vehicles; Occlusion detection; Surveillance cameras; Tracking and classification; Traffic surveillance; Vehicle detection; Vehicle segmentation; Vehicles
Abstract
A traffic surveillance system (TSS) is an essential tool for extracting information (count, type, speed, etc.) from cameras for further analysis. Within such systems, vehicle detection is among the most important problems, as it is the vital process upon which modules such as vehicle tracking and classification are built. However, detecting moving vehicles in urban areas is difficult because the inter-vehicle space is significantly reduced, which increases the occlusion among vehicles. The problem is even more challenging in developing countries, where the roads are crowded with two-wheeled motorbikes during rush hour. This paper proposes a method to improve the detection of occluded vehicles from static surveillance cameras. The main contribution is an overlapping-vehicle segmentation algorithm in which undefined blobs of occluded vehicles are examined to extract the vehicles individually, based on the geometry and ellipticity characteristics of the objects' shapes. Experiments on real-world data have shown promising results, with a detection rate of 84.10% in daytime scenes. © Springer Nature Singapore Pte Ltd. 2018.
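The abstract only sketches the splitting step. As a rough illustration of the general idea (not the authors' actual algorithm), the Python/OpenCV snippet below shows how an ellipticity test can flag foreground blobs that likely contain merged, occluded vehicles, and how a generic distance-transform watershed can then split them. The function names, the 0.85 threshold, and the watershed stand-in are all illustrative assumptions; the paper's own geometry-based cut would replace the watershed step.

```python
import cv2
import numpy as np


def ellipticity(contour):
    """Contour area divided by the area of its best-fit ellipse."""
    if len(contour) < 5:                      # cv2.fitEllipse needs >= 5 points
        return 0.0
    (_, _), (w, h), _ = cv2.fitEllipse(contour)
    ellipse_area = np.pi * (w / 2.0) * (h / 2.0)
    return cv2.contourArea(contour) / ellipse_area if ellipse_area else 0.0


def segment_vehicles(fg_mask, ellipticity_thresh=0.85):
    """Split foreground blobs that appear to contain occluded vehicles.

    Blobs whose shape fits a single ellipse well are accepted as one
    vehicle; irregular blobs are re-segmented with a distance-transform
    watershed, used here as a generic stand-in for the paper's
    geometry-based splitting. Returns one binary mask per vehicle.
    """
    contours, _ = cv2.findContours(fg_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    vehicles = []
    for cnt in contours:
        blob = np.zeros_like(fg_mask)
        cv2.drawContours(blob, [cnt], -1, 255, thickness=cv2.FILLED)
        if ellipticity(cnt) >= ellipticity_thresh:
            vehicles.append(blob)             # likely a single vehicle
            continue
        # Occluded blob: seed a watershed from distance-transform peaks.
        dist = cv2.distanceTransform(blob, cv2.DIST_L2, 5)
        _, sure_fg = cv2.threshold(dist, 0.6 * dist.max(), 255,
                                   cv2.THRESH_BINARY)
        sure_fg = sure_fg.astype(np.uint8)
        unknown = cv2.subtract(blob, sure_fg)
        n, markers = cv2.connectedComponents(sure_fg)
        markers = markers + 1                 # background label becomes 1
        markers[unknown == 255] = 0           # region watershed must resolve
        markers = cv2.watershed(cv2.cvtColor(blob, cv2.COLOR_GRAY2BGR),
                                markers)
        for label in range(2, n + 1):         # labels 2..n are blob parts
            part = np.where(markers == label, 255, 0).astype(np.uint8)
            if cv2.countNonZero(part):
                vehicles.append(part)
    return vehicles
```

In practice the ellipticity and peak thresholds would need tuning per camera and scene, especially for mixed car/motorbike traffic where blob shapes vary widely.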
