- ¹Professorship of Remote Sensing Applications, Technical University of Munich, Germany
- ²LETG Rennes, UMR 6554 CNRS, Université Rennes 2, France
- ³Geosciences Rennes, UMR 6118 CNRS, Univ Rennes, France
Bathymetry is a critical component in many geomorphological and ecological studies of coastal and fluvial environments. Bathymetric lidar remote sensing enables the acquisition of high-resolution, 3D data on shallow sea- and riverbeds, thus providing precise modeling of their topography. However, in certain contexts, the optical interactions between light and water make extracting bathymetry from lidar signals particularly challenging, resulting in incomplete bed coverage. This is the case in very shallow waters at the transition between land and water – where the signals of the water surface, column, and bottom become entangled – and in deep or turbid waters, where signal extinction hinders the detection of the sea- or riverbed.
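The entanglement of these contributions can be illustrated with a toy waveform model. The sketch below is our own simplification for illustration, not the simulator used in this study: it assumes Gaussian surface and bottom returns, an exponentially decaying water-column contribution, and additive Gaussian noise, and all function and parameter names are hypothetical.

```python
import numpy as np

def simulate_waveform(n_bins, surface_bin, bottom_bin,
                      surface_amp=1.0, bottom_amp=0.3,
                      column_decay=0.05, pulse_width=3.0,
                      noise_std=0.02, rng=None):
    """Toy bathymetric lidar waveform: Gaussian surface and bottom
    returns, an exponential water-column tail, and Gaussian noise.
    This is an illustrative model, not the study's simulator."""
    rng = np.random.default_rng() if rng is None else rng
    t = np.arange(n_bins, dtype=float)
    surface = surface_amp * np.exp(-0.5 * ((t - surface_bin) / pulse_width) ** 2)
    bottom = bottom_amp * np.exp(-0.5 * ((t - bottom_bin) / pulse_width) ** 2)
    column = np.where(t > surface_bin,
                      0.2 * surface_amp * np.exp(-column_decay * (t - surface_bin)),
                      0.0)
    return surface + column + bottom + rng.normal(0.0, noise_std, n_bins)

# In very shallow water the surface and bottom returns sit only a few
# bins apart and merge into one broadened peak, which is precisely what
# makes bottom detection difficult.
wf = simulate_waveform(n_bins=256, surface_bin=60, bottom_bin=68)
```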
In this work, we explore new approaches for bathymetry extraction from lidar waveforms across diverse environments. Using temporal convolutional neural networks, we show improved detection of the sea- or riverbed position in bathymetric lidar waveforms. Our experiments, conducted on simulated data representing a wide range of environmental conditions – varying turbidity, depth, and reflectance – yield promising results. They highlight the potential to extract the position of the sea- or riverbed even in simulated waveforms with low signal-to-noise ratios or highly overlapping signals – cases that have posed challenges for existing processing methods. Given the increasing popularity of bathymetric lidar, particularly with the advent of UAV-mounted sensors, enhancing waveform processing methods could help advance the surveying of submerged areas.
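The abstract does not detail the network architecture, so the following is only a minimal sketch of a temporal (1D) convolutional network of the kind described: stacked dilated 1D convolutions that score each time bin of a waveform, with the argmax giving the predicted bottom-return position. The framework (PyTorch), layer sizes, dilations, and all names are our assumptions, not the authors' exact model.

```python
import torch
import torch.nn as nn

class WaveformTCN(nn.Module):
    """Minimal 1D temporal CNN that scores each time bin of a lidar
    waveform; the argmax over bins is the predicted bottom position.
    Depths, widths, and dilations are illustrative choices."""
    def __init__(self, channels=32, n_blocks=4):
        super().__init__()
        layers, in_ch = [], 1
        for i in range(n_blocks):
            d = 2 ** i  # exponentially growing dilation widens the receptive field
            layers += [nn.Conv1d(in_ch, channels, kernel_size=5,
                                 padding=2 * d, dilation=d),  # "same" length output
                       nn.ReLU()]
            in_ch = channels
        self.backbone = nn.Sequential(*layers)
        self.head = nn.Conv1d(channels, 1, kernel_size=1)  # per-bin score

    def forward(self, x):  # x: (batch, 1, n_bins)
        return self.head(self.backbone(x)).squeeze(1)  # (batch, n_bins)

model = WaveformTCN()
waveforms = torch.randn(8, 1, 256)     # stand-in batch of simulated waveforms
scores = model(waveforms)
bottom_bin = scores.argmax(dim=1)      # predicted bottom-return bin per waveform
# Training would minimize, e.g., cross-entropy between the per-bin scores
# and the true bottom bin of each simulated waveform.
```

Framing bottom detection as per-bin classification is one plausible design; a regression head predicting the bottom position directly would be an equally reasonable alternative under the same backbone.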
How to cite: Letard, M., Corpetti, T., and Lague, D.: Extracting bathymetric information from LiDAR waveforms with 1D neural networks, EGU General Assembly 2025, Vienna, Austria, 27 Apr–2 May 2025, EGU25-20513, https://doi.org/10.5194/egusphere-egu25-20513, 2025.