PTZ camera-based adaptive panoramic and multi-layered background model

Kang Xue*, Gbolabo Ogunmakin, Yue Liu, Patricio A. Vela, Yongtian Wang

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

14 Citations (Scopus)

Abstract

In this paper, we present a novel approach for constructing an adaptive panoramic and multi-layered background model for pan-tilt-zoom (PTZ) cameras that provides fast registration of the observed frame and localizes foreground targets at arbitrary camera positions and scales (optical zoom). Our method consists of two stages. (1) An adaptive panoramic background mixture model is generated off-line for foreground detection. (2) A layered correspondence is generated off-line from frames captured at different optical zoom values of the camera, and a correspondence propagation method is used to register the observed frame with the panoramic background online. We demonstrate the advantages of the proposed adaptive panoramic and multi-layered background model over a wide field of view (FOV) and a large scale range.
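The abstract does not detail the background model itself. As a purely illustrative sketch (not the authors' implementation, which uses an adaptive panoramic *mixture* model plus multi-layered correspondence for PTZ registration), a simplified per-pixel running-Gaussian background model with foreground thresholding could look like this in NumPy; all class and parameter names here are hypothetical:

```python
import numpy as np

class PanoramicBackground:
    """Simplified per-pixel single-Gaussian background model.

    Illustrative only: the paper maintains a panoramic background
    *mixture* model and registers each PTZ frame to it before
    foreground detection; registration is omitted here.
    """

    def __init__(self, shape, alpha=0.05, k=2.5):
        self.mean = np.zeros(shape, dtype=np.float64)  # per-pixel mean
        self.var = np.full(shape, 15.0 ** 2)           # per-pixel variance
        self.alpha = alpha                             # learning rate
        self.k = k                                     # threshold in std devs
        self._initialized = False

    def apply(self, frame):
        """Update the model with `frame`; return a boolean foreground mask."""
        frame = np.asarray(frame, dtype=np.float64)
        if not self._initialized:
            self.mean[:] = frame                       # bootstrap from first frame
            self._initialized = True
        diff = frame - self.mean
        # a pixel is foreground if it lies more than k std devs from the mean
        foreground = diff ** 2 > (self.k ** 2) * self.var
        # adapt the background statistics only where the pixel matched the model
        bg = ~foreground
        self.mean[bg] += self.alpha * diff[bg]
        self.var[bg] += self.alpha * (diff[bg] ** 2 - self.var[bg])
        return foreground
```

Feeding the model a sequence of near-constant frames and then a frame containing a bright blob should flag only the blob as foreground; a full pipeline in the spirit of the paper would first warp each observed frame into panorama coordinates before this per-pixel test.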

Original language: English
Title of host publication: ICIP 2011
Subtitle of host publication: 2011 18th IEEE International Conference on Image Processing
Pages: 2949-2952
Number of pages: 4
DOIs
Publication status: Published - 2011
Event: 2011 18th IEEE International Conference on Image Processing, ICIP 2011 - Brussels, Belgium
Duration: 11 Sept 2011 - 14 Sept 2011

Publication series

Name: Proceedings - International Conference on Image Processing, ICIP
ISSN (Print): 1522-4880

Conference

Conference: 2011 18th IEEE International Conference on Image Processing, ICIP 2011
Country/Territory: Belgium
City: Brussels
Period: 11/09/11 - 14/09/11

Keywords

  • Object detection
  • PTZ camera
  • multi-layered propagation
  • panoramic background
