EGU2020-19357
https://doi.org/10.5194/egusphere-egu2020-19357
EGU General Assembly 2020
© Author(s) 2022. This work is distributed under
the Creative Commons Attribution 4.0 License.

Glacier Front Detection at Tidewater Glaciers from Radar Images

AmirAbbas Davari1, Thorsten Seehaus2, Matthias Braun2, and Andreas Maier1
  • 1Pattern Recognition Lab, Friedrich-Alexander-University Erlangen-Nürnberg, Erlangen, Germany
  • 2Institute of Geography, Friedrich-Alexander-University Erlangen-Nürnberg, Erlangen, Germany

Glaciers and ice sheets currently contribute about two thirds of the observed global sea level rise of roughly 3.2 mm a⁻¹. Many of these glaciated regions (Antarctica, the sub-Antarctic islands, Greenland, the Russian and Canadian Arctic, Alaska, Patagonia) drain into the ocean through calving ice fronts. Many glaciers in these regions already show considerable ice mass loss, with an observed acceleration over the last decade [1]. Most of this mass loss is caused by the dynamic adjustment of glaciers, with considerable glacier retreat and elevation change being the major observables. The continuous and precise extraction of glacier calving fronts is hence of paramount importance for monitoring these rapid glacier changes. Detecting and monitoring ice shelves and glacier fronts in optical and Synthetic Aperture Radar (SAR) satellite imagery requires well-identified spectral and physical properties of the glaciers.

Earth Observation (EO) missions produce massive amounts of data that are currently processed either by slow and expensive manual digitization or by simple, unreliable methods such as heuristically derived rule-based systems. Due to the variable occurrence of sea ice and icebergs and the visual similarity of calving fronts to crevasses, exact mapping of the glacier front position poses considerable difficulties for existing algorithms. Deep learning techniques have been applied successfully to many image analysis tasks [2]. Recently, Zhang et al. [3] adopted a state-of-the-art deep learning-based image segmentation method, the U-net [4], for glacier front segmentation on TerraSAR-X images. The main motivation for using the SAR modality instead of optical imagery is the ability of SAR waves to penetrate cloud cover, which allows year-round acquisitions.
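
To make the segmentation setup concrete, the sketch below shows a minimal U-net-style encoder-decoder in PyTorch for a single-channel SAR patch. It is an illustrative assumption only: the channel widths, depth, input size, and class setup are placeholders and do not reproduce the original U-net [4] or the configuration used by Zhang et al. [3].

    # Minimal U-net-style encoder-decoder for binary front/background segmentation.
    # Illustrative sketch only; channel sizes and depth are assumptions.
    import torch
    import torch.nn as nn

    def conv_block(in_ch, out_ch):
        """Two 3x3 convolutions with ReLU, as in a typical U-net stage."""
        return nn.Sequential(
            nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )

    class MiniUNet(nn.Module):
        def __init__(self, in_ch=1, num_classes=1):
            super().__init__()
            self.enc1 = conv_block(in_ch, 32)
            self.enc2 = conv_block(32, 64)
            self.bottleneck = conv_block(64, 128)
            self.pool = nn.MaxPool2d(2)
            self.up2 = nn.ConvTranspose2d(128, 64, kernel_size=2, stride=2)
            self.dec2 = conv_block(128, 64)   # 64 (skip) + 64 (upsampled)
            self.up1 = nn.ConvTranspose2d(64, 32, kernel_size=2, stride=2)
            self.dec1 = conv_block(64, 32)    # 32 (skip) + 32 (upsampled)
            self.head = nn.Conv2d(32, num_classes, kernel_size=1)

        def forward(self, x):
            e1 = self.enc1(x)                   # full resolution
            e2 = self.enc2(self.pool(e1))       # 1/2 resolution
            b = self.bottleneck(self.pool(e2))  # 1/4 resolution
            d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
            d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
            return self.head(d1)                # logits; apply sigmoid for probabilities

    # Example: one single-channel 256x256 SAR patch.
    if __name__ == "__main__":
        model = MiniUNet()
        logits = model(torch.randn(1, 1, 256, 256))
        print(logits.shape)  # torch.Size([1, 1, 256, 256])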

We intend to bridge the gap towards fully automatic, end-to-end deep learning-based glacier front detection from SAR image time series. U-net has performed extremely well in image segmentation, in particular in the medical image processing community [5]. However, it is a large and complex model and is rather slow to train. The Fully Convolutional Network (FCN) [6] can be considered an architecturally less complex variant of U-net with faster training and inference times. In this work, we investigate the suitability of the FCN for glacier front segmentation and compare its performance with U-net. Our preliminary results on glacier segmentation yield Dice coefficients of 92.96% for the FCN and 93.20% for U-net, which indicates that the FCN is suitable for this task and performs comparably to U-net.
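
The Dice coefficient quoted above is the standard overlap measure 2|A∩B| / (|A| + |B|) between a predicted and a reference mask. The NumPy sketch below illustrates this definition on a toy example; the helper name is ours, and the exact evaluation protocol behind the reported 92.96% and 93.20% (thresholding, per-image vs. dataset-wide averaging) is not specified here.

    # Dice coefficient for binary segmentation masks (standard definition sketch).
    import numpy as np

    def dice_coefficient(pred, target, eps=1e-7):
        """Dice = 2|A ∩ B| / (|A| + |B|) for binary masks pred and target."""
        pred = np.asarray(pred, dtype=bool)
        target = np.asarray(target, dtype=bool)
        intersection = np.logical_and(pred, target).sum()
        return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

    # Example: two 4x4 masks that mostly agree.
    pred = np.array([[0, 1, 1, 0],
                     [0, 1, 1, 0],
                     [0, 1, 1, 0],
                     [0, 0, 0, 0]])
    target = np.array([[0, 1, 1, 0],
                       [0, 1, 1, 0],
                       [0, 1, 0, 0],
                       [0, 0, 0, 0]])
    print(f"Dice: {dice_coefficient(pred, target):.3f}")  # 2*5/(6+5) ≈ 0.909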

 

References:

[1] Vaughan et al. "Observations: Cryosphere." In Climate Change 2013: The Physical Science Basis (2013): 317-382.

[2] LeCun et al. "Deep learning." Nature 521, no. 7553 (2015): 436-444.

[3] Zhang et al. "Automatically delineating the calving front of Jakobshavn Isbræ from multitemporal TerraSAR-X images: a deep learning approach." The Cryosphere 13, no. 6 (2019): 1729-1741.

[4] Ronneberger et al. "U-net: Convolutional networks for biomedical image segmentation." MICCAI 2015.

[5] Vesal et al. "A multi-task framework for skin lesion detection and segmentation." In OR 2.0 Context-Aware Operating Theaters, Computer Assisted Robotic Endoscopy, Clinical Image-Based Procedures, and Skin Image Analysis, 2018.

[6] Long et al. "Fully convolutional networks for semantic segmentation." CVPR 2015.

How to cite: Davari, A., Seehaus, T., Braun, M., and Maier, A.: Glacier Front Detection at Tidewater Glaciers from Radar Images, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-19357, https://doi.org/10.5194/egusphere-egu2020-19357, 2020.
