Tracking human motion from monocular sequences

L. Snidaro; G. L. Foresti; L. Chittaro
2008-01-01

Abstract

In recent years, the analysis of human motion has become an increasingly relevant research topic, with applications as diverse as animation, virtual reality, security, and advanced human-machine interfaces. In particular, motion capture systems are now well known thanks to their use in the movie industry; however, they require expensive multi-camera setups or markers to be worn by the user. This paper describes an attempt to provide a markerless, low-cost, real-time solution for home users. We propose a novel approach for robust detection and tracking of the user's body joints that exploits different algorithms as different sources of information and fuses their estimates with particle filters. The system can be employed for real-time animation of VRML or X3D avatars using an off-the-shelf digital camera and a standard PC.
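
The abstract describes fusing the estimates of several joint detectors with particle filters. The following is a minimal sketch, not the authors' implementation, of how such a fusion step for a single 2D joint could look: it assumes a random-walk motion model, Gaussian measurement noise for each source, and hypothetical detector outputs.

```python
# Minimal sketch: particle-filter fusion of multiple joint-position estimators.
# The motion model, noise levels, and detector names are assumptions, not the
# paper's actual parameters.
import numpy as np

def particle_filter_step(particles, weights, measurements, noise_stds,
                         motion_std=5.0, rng=None):
    """One predict/update/resample cycle for a single 2D joint.

    particles:    (N, 2) array of candidate (x, y) joint positions (pixels)
    weights:      (N,) normalized particle weights
    measurements: list of (x, y) estimates, one per detection algorithm
    noise_stds:   assumed measurement std deviation (pixels) for each source
    """
    rng = rng or np.random.default_rng()
    n = len(particles)

    # Predict: random-walk motion model (assumed).
    particles = particles + rng.normal(0.0, motion_std, size=(n, 2))

    # Update: multiply the likelihoods of the independent sources (fusion step).
    for z, std in zip(measurements, noise_stds):
        d2 = np.sum((particles - np.asarray(z)) ** 2, axis=1)
        weights = weights * np.exp(-0.5 * d2 / std ** 2)
    weights = weights / np.sum(weights)

    # Systematic resampling to avoid weight degeneracy.
    positions = (np.arange(n) + rng.random()) / n
    idx = np.minimum(np.searchsorted(np.cumsum(weights), positions), n - 1)
    particles = particles[idx]
    weights = np.full(n, 1.0 / n)

    estimate = particles.mean(axis=0)
    return particles, weights, estimate

# Usage example with two hypothetical detectors (e.g. color-based and edge-based).
rng = np.random.default_rng(0)
particles = rng.normal([160, 120], 20, size=(500, 2))
weights = np.full(500, 1.0 / 500)
particles, weights, est = particle_filter_step(
    particles, weights,
    measurements=[(158, 124), (163, 118)],  # per-detector joint estimates
    noise_stds=[6.0, 10.0], rng=rng)
print("fused joint estimate:", est)
```

Weighting each measurement by its own noise level is what lets a less reliable detector contribute without dominating the fused estimate; the actual algorithms and their reliability models are described in the paper itself.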
File: IJIG_FINAL.pdf (Adobe PDF, 931.87 kB), publisher's version, access restricted.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11390/877680
Citations
  • PMC: n/a
  • Scopus: 5
  • Web of Science: n/a