EGU General Assembly 2023
© Author(s) 2023. This work is distributed under
the Creative Commons Attribution 4.0 License.

3DMASC: accessible, explainable 3D point clouds classification. Application to bi-spectral topo-bathymetric LiDAR data.

Mathilde Letard1, Dimitri Lague1,2, Arthur Le Guennec1,3, Sébastien Lefevre4, Baptiste Feldmann1,2, Paul Leroy1,2, Daniel Girardeau-Montaut5, and Thomas Corpetti3
  • 1Univ Rennes, Geosciences Rennes, UMR 6118 CNRS, France
  • 2Univ Rennes, Plateforme LiDAR Topo-Bathymétrique Nantes-Rennes, OSUR, UAR 3343 CNRS, France
  • 3LETG UMR 6554, CNRS, F-35000 Rennes, France
  • 4IRISA UMR 6074, Université Bretagne Sud, F-56000 Vannes, France
  • 5Johnson and Johnson, France

Three-dimensional data have become increasingly present in Earth observation over the last decades, most recently with the development of accessible 3D sensing technologies. However, many 3D surveys remain underexploited for lack of accessible and explainable automatic classification methods. In this work, we introduce 3DMASC, a new workflow for explainable machine-learning classification of 3D data using Multiple Attributes, Scales, and Clouds. It handles multiple point clouds at once, with or without spectral and multiple-return attributes. Through 3DMASC, we use classical multi-scale 3D descriptors and new ones based on the spatial variations of geometrical, spectral, and height-based features of the local point cloud. We also introduce dual-cloud features, encoding local spectral and geometrical ratios and differences, which improve the interpretation of multi-cloud surveys. 3DMASC thus offers new possibilities for point cloud classification, notably for the interpretation of bi-spectral lidar data. Here, we experiment on topo-bathymetric lidar data, which are acquired using two lasers at infrared and green wavelengths and yield two irregular point clouds with different samplings of vegetated and flooded areas, which 3DMASC can exploit. By exploring the contributions of 88 features and 30 scales, including two types of neighbourhoods, we identify a core set of features and scales particularly relevant to the description of coastal and riverine scenes, and give indications on how to build an optimal predictor vector to train 3D data classifiers. Our findings highlight the predominance of lidar return-based attributes over classical features based on dimensionality or eigenvalues, and the significant contribution of spectral information to the detection of more than a dozen land and sea covers: artificial/vegetated/rocky/bare ground, rocky/sandy seabed, intermediate/high vegetation, buildings, vehicles, and power lines.
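The idea of a dual-cloud feature combining the two lidar channels can be sketched as follows. This is a minimal illustration, not the released 3DMASC implementation: the function name, the choice of a neighbour-count ratio as the feature, and the radii are assumptions made for demonstration only.

```python
import numpy as np
from scipy.spatial import cKDTree

def dual_cloud_density_ratio(core_points, cloud_a, cloud_b, radii):
    """For each core point, compute the ratio of neighbour counts in cloud A
    versus cloud B within each radius (one simple example of a dual-cloud,
    multi-scale feature; e.g. green vs. infrared channel point densities)."""
    tree_a, tree_b = cKDTree(cloud_a), cKDTree(cloud_b)
    features = np.empty((len(core_points), len(radii)))
    for j, r in enumerate(radii):
        n_a = np.array([len(nb) for nb in tree_a.query_ball_point(core_points, r)])
        n_b = np.array([len(nb) for nb in tree_b.query_ball_point(core_points, r)])
        features[:, j] = n_a / np.maximum(n_b, 1)  # guard against empty neighbourhoods
    return features
```

Stacking such ratios over several radii gives one column per scale in the predictor vector, alongside single-cloud geometrical and spectral descriptors.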
The experimental results show that 3DMASC competes with state-of-the-art methods in terms of classification performance while demanding lower complexity, and thus remains accessible to non-specialist users. Relying on a random forest algorithm, it generalizes well and applies quickly to large datasets, and offers the possibility of filtering out misclassified points based on their prediction confidence. We observe classification accuracies between 91% for complex scene classification and 98% for lower-level processing, with average prediction confidences above 90% and models relying on fewer than 2000 samples per class and at most 30 descriptors (features and scales combined). Although dual-cloud features systematically outperform their single-cloud equivalents, 3DMASC also performs well on single-cloud lidar data or structure-from-motion point clouds. Our contributions are made available through a self-contained CloudCompare plugin allowing non-specialist users to create and apply a classifier, and an open-source labelled dataset of topo-bathymetric data.
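The confidence-based filtering described above can be illustrated with scikit-learn's random forest, where the per-class vote fraction serves as a prediction confidence. This is a hedged sketch on synthetic data; the feature matrix, labels, and the 0.9 threshold are illustrative assumptions, not values taken from the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical predictor matrix: 2000 points x 30 descriptors
# (features x scales), with a synthetic binary label for illustration.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 30))
y = (X[:, 0] > 0).astype(int)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

proba = clf.predict_proba(X)                  # per-class vote fractions
confidence = proba.max(axis=1)                # prediction confidence per point
labels = clf.classes_[proba.argmax(axis=1)]   # predicted class per point
keep = confidence >= 0.9                      # drop low-confidence predictions
```

Filtering on `confidence` trades coverage for accuracy: points below the threshold are left unclassified rather than risked as misclassifications.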

How to cite: Letard, M., Lague, D., Le Guennec, A., Lefevre, S., Feldmann, B., Leroy, P., Girardeau-Montaut, D., and Corpetti, T.: 3DMASC: accessible, explainable 3D point clouds classification. Application to bi-spectral topo-bathymetric LiDAR data., EGU General Assembly 2023, Vienna, Austria, 24–28 Apr 2023, EGU23-7115, 2023.

Supplementary materials

Supplementary material file