
Acoustic selfies for extraction of external ear features in mobile audio augmented reality

Geronazzo M. (First author)
2016-01-01

Abstract

Virtual and augmented realities are expected to become increasingly important in everyday life in the near future, and the role of spatial audio technologies over headphones will be pivotal for application scenarios that involve mobility. This paper introduces the SelfEar project, aimed at low-cost acquisition and personalization of Head-Related Transfer Functions (HRTFs) on mobile devices. This first version focuses on capturing the individual spectral features that characterize external ear acoustics through a self-adjustable procedure which guides users in collecting such information: the mobile device is held at arm's length and positioned at several specific elevation points, while acoustic data are acquired by an audio augmented reality headset that embeds a pair of microphones at the listener's ear canals. A preliminary measurement session assesses the ability of the system to capture spectral features that are crucial for elevation perception. Moreover, a virtual experiment using a computational auditory model predicts clear vertical localization cues in the measured features.
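The elevation-dependent spectral features the abstract refers to are typically the notches that the pinna introduces into the response measured at the ear canal. As a minimal illustrative sketch (not the SelfEar implementation; the function names, sample rate, and simple free-field compensation are assumptions), one could estimate a compensated magnitude response from an ear-canal recording and locate its first notch in the 4–16 kHz region:

```python
# Hypothetical sketch: estimate a pinna-related magnitude response from an
# ear-canal recording and locate its first spectral notch, a feature commonly
# associated with elevation perception. Names, sample rate, and the simple
# reference compensation are assumptions, not the SelfEar implementation.

import numpy as np
from scipy.signal import find_peaks

def ear_magnitude_db(ear_ir, ref_ir, fs=48000, n_fft=1024):
    """Magnitude response (dB) at the ear canal, compensated by a reference
    measurement (e.g. the phone loudspeaker recorded without the listener)."""
    ear = np.fft.rfft(ear_ir, n_fft)
    ref = np.fft.rfft(ref_ir, n_fft)
    mag_db = 20.0 * np.log10(np.abs(ear) / (np.abs(ref) + 1e-12) + 1e-12)
    freqs = np.fft.rfftfreq(n_fft, 1.0 / fs)
    return freqs, mag_db

def first_notch_hz(freqs, mag_db, f_lo=4000.0, f_hi=16000.0):
    """Frequency of the first local minimum (notch) in the given band,
    or None if no sufficiently prominent notch is found."""
    band = (freqs >= f_lo) & (freqs <= f_hi)
    # Notches are local minima of the magnitude: peaks of the inverted curve.
    idx, _ = find_peaks(-mag_db[band], prominence=3.0)
    if len(idx) == 0:
        return None
    return float(freqs[band][idx[0]])
```

Repeating such an estimate for each elevation point of the guided procedure would be expected to show the notch frequency shifting with source elevation, consistent with the vertical localization cues that the computational auditory model in the paper evaluates.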
Year: 2016
ISBN: 9781450344913
Files in this record:
No files are associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11390/1211230
Citations
  • PMC: not available
  • Scopus: 7
  • Web of Science (ISI): 4