Quality-based fusion of multiple video sensors for video surveillance
SNIDARO, Lauro; FORESTI, Gian Luca
2007-01-01
Abstract
In this correspondence, we address the problem of fusing data for object tracking in video surveillance. The fusion process is dynamically regulated to take into account the performance of the sensors in detecting and tracking the targets. This is performed through a function that adjusts the measurement error covariance associated with the position information of each target according to the quality of its segmentation. In this manner, localization errors due to incorrect segmentation of the blobs are reduced, thus improving tracking accuracy. Experimental results on video sequences of outdoor environments show the effectiveness of the proposed approach. © 2007 IEEE.
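The core idea of the abstract — inflating a target's measurement error covariance when its segmentation quality is low, so that a standard Kalman-style fusion automatically down-weights unreliable sensors — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the inverse-quality scaling `R0 / q` and the quality score in (0, 1] are assumptions made for the example, and the update assumes a direct position measurement (H = I).

```python
import numpy as np

def adjust_covariance(R0, quality, eps=1e-3):
    """Inflate the nominal measurement error covariance R0 when
    segmentation quality is low: a poorly segmented blob yields a
    less reliable position measurement. The inverse-quality scaling
    is an illustrative choice, not the paper's exact function."""
    q = max(quality, eps)          # guard against division by zero
    return R0 / q

def kalman_position_update(x, P, z, R):
    """Standard Kalman measurement update for a direct position
    measurement (measurement matrix H = I)."""
    S = P + R                      # innovation covariance
    K = P @ np.linalg.inv(S)       # Kalman gain
    x_new = x + K @ (z - x)        # corrected state estimate
    P_new = (np.eye(len(x)) - K) @ P
    return x_new, P_new

# A well-segmented target (quality near 1) keeps its nominal
# covariance, while a poorly segmented one (quality 0.2) has its
# covariance inflated and so contributes a smaller correction.
R0 = np.eye(2) * 4.0
x, P, z = np.zeros(2), np.eye(2), np.array([1.0, 1.0])
x_good, _ = kalman_position_update(x, P, z, adjust_covariance(R0, 1.0))
x_bad, _ = kalman_position_update(x, P, z, adjust_covariance(R0, 0.2))
```

With these numbers the badly segmented measurement pulls the estimate toward `z` by less than the well-segmented one, which is exactly the down-weighting behavior the fusion scheme relies on.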
File | Type | License | Size | Format
---|---|---|---|---
Quality_based_fusion.pdf | Other attached material | Not public | 1.44 MB | Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.