Classification of compressed full-waveform airborne lidar data

Maset E.; Fusiello A.
2024-01-01

Abstract

Airborne laser scanners produce 3D data that can be used for a range of applications, such as urban planning, facility monitoring, flood mapping, and forest management. Additional information on the surveyed area can be obtained from the backscattered waveforms recorded by modern light detection and ranging (lidar) sensors. However, the high-dimensional representation of full-waveform data has hindered its adoption due to difficulties in processing and storage. This paper develops a quantized convolutional autoencoder network that compresses lidar waveform data into a condensed feature representation, achieving a compression rate of up to 20:1. The compressed representation, together with height information, is then fed into a U-Net convolutional neural network that achieves an accuracy of 93.7% on six classes.
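The abstract describes a two-stage pipeline: a quantized convolutional autoencoder compresses each backscattered waveform into a short code, and a U-Net classifier then consumes the codes together with height information. The snippet below is a minimal sketch of the first stage only, assuming a PyTorch implementation; the waveform length (160 samples), latent size (8 values), layer widths, and the simple 8-bit uniform quantizer are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch (not the authors' code) of a quantized 1D convolutional
# autoencoder for lidar waveform compression. All sizes are illustrative.
import torch
import torch.nn as nn

class WaveformAutoencoder(nn.Module):
    def __init__(self, waveform_len=160, latent_dim=8):
        super().__init__()
        # Encoder: strided 1D convolutions shrink the waveform to latent_dim values.
        self.encoder = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(32 * (waveform_len // 4), latent_dim),
        )
        # Decoder mirrors the encoder to reconstruct the waveform.
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 32 * (waveform_len // 4)),
            nn.Unflatten(1, (32, waveform_len // 4)),
            nn.ConvTranspose1d(32, 16, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose1d(16, 1, kernel_size=4, stride=2, padding=1),
        )

    def quantize(self, z):
        # 8-bit uniform quantization of the latent code; storing latent_dim bytes
        # instead of the raw waveform provides the compression. Note that
        # torch.round has zero gradient, so actual training would need to route
        # gradients around it (e.g. with a straight-through estimator).
        z = torch.sigmoid(z)                      # map codes to (0, 1)
        return torch.round(z * 255.0) / 255.0     # simulate 8-bit codes

    def forward(self, x):
        z = self.quantize(self.encoder(x))
        return self.decoder(z), z

model = WaveformAutoencoder()
waveforms = torch.rand(4, 1, 160)                 # batch of 4 toy waveforms
reconstruction, codes = model(waveforms)
loss = nn.functional.mse_loss(reconstruction, waveforms)
print(reconstruction.shape, codes.shape, loss.item())
```

With these illustrative sizes, storing 8 one-byte codes in place of a 160-sample waveform digitized at 8 bits per sample would give a 20:1 ratio, the same order of magnitude as the rate quoted in the abstract; the paper's actual waveform length, latent size, and quantization scheme may differ.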
Files in this item:
File: RSL_2024_paper.pdf
Size: 4.6 MB
Format: Adobe PDF
License: Not public (file not available)

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11390/1280306
Citations
  • PMC: N/A
  • Scopus: 0
  • Web of Science (ISI): 0