EGU General Assembly 2023
© Author(s) 2023. This work is distributed under
the Creative Commons Attribution 4.0 License.

Framework for creating daily semantic segmentation maps of classified eddies using SLA along-track altimetry data 

Eike Bolmer1, Adili Abulaitijiang2, Luciana Fenoglio-Marc2, Jürgen Kusche2, and Ribana Roscher1,3
  • 1Remote Sensing Group, Institute of Geodesy and Geoinformation (IGG), University of Bonn, Bonn, Germany
  • 2Astronomical, Physical and Mathematical Geodesy (APMG) Group, Institute of Geodesy and Geoinformation (IGG), University of Bonn, Bonn, Germany
  • 3Institute for Bio- and Geosciences Plant Sciences (IBG-2), Forschungszentrum Jülich, Jülich, Germany

Mesoscale eddies are rotating currents in the ocean with horizontal scales from 10 km up to 100 km and beyond. They transport water mass, heat, and nutrients and are therefore of interest to, among others, marine biologists, oceanographers, and geodesists. Usually, gridded sea level anomaly (SLA) maps, processed from several radar altimetry missions, are used to detect eddies. However, operational processors create multi-mission (processing level 4) SLA grid maps whose effective spatiotemporal resolution is far lower than their grid spacing and temporal sampling suggest.

This drawback leads to erroneous eddy detection. We therefore investigate whether the higher-resolution along-track data can be used instead to classify SLA observations into cyclonic eddies, anticyclonic eddies, or no eddy more accurately than with processed SLA grid map products. With our framework, we aim to infer a daily two-dimensional segmentation map of classified eddies. Owing to repeat cycles of 10 to 35 days and cross-track spacings from a few tens to a few hundreds of kilometres, ocean eddies are clearly visible in altimeter observations but are typically covered by only a few ground tracks, so the spatiotemporal context within the input data varies strongly from day to day. Conventional convolutional neural networks (CNNs), however, rely on data without varying gaps or jumps in time and space to exploit the intrinsic spatial or temporal context of the observations. This challenge calls for a deep neural network that, on the one hand, utilizes the spatiotemporal context within the along-track data and, on the other hand, can output a two-dimensional segmentation map from data of varying sparsity. Our architecture Teddy uses a transformer module to encode and process the spatiotemporal information along with the ground tracks' sea level anomaly data, producing a sparse feature map. This feature map is then fed into a sparsity-invariant convolutional neural network to infer a two-dimensional segmentation map of classified eddies. The reference data used to train Teddy are produced by an open-source geometry-based approach (py-eddy-tracker [1]).
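The core operation of a sparsity-invariant convolution can be illustrated compactly: each output pixel is a weighted mean of the kernel window taken over observed pixels only, and the validity mask dilates wherever the window touched at least one observation. The following is a minimal pure-Python sketch of that standard normalised-convolution formulation, not Teddy's actual implementation; all function and variable names here are illustrative.

```python
def sparse_conv(values, mask, kernel, ksize=3):
    """Convolve a sparse 2D grid, normalising by the observed weights.

    values: 2D list of floats (e.g. along-track SLA rasterised onto a grid)
    mask:   2D list of 0/1 flags marking where observations exist
    kernel: ksize x ksize list of non-negative weights
    Returns (new_values, new_mask); new_mask is 1 wherever the kernel
    window covered at least one observed pixel.
    """
    h, w = len(values), len(values[0])
    r = ksize // 2
    out_vals = [[0.0] * w for _ in range(h)]
    out_mask = [[0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            num, den = 0.0, 0.0
            for di in range(-r, r + 1):
                for dj in range(-r, r + 1):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < h and 0 <= jj < w and mask[ii][jj]:
                        wgt = kernel[di + r][dj + r]
                        num += wgt * values[ii][jj]
                        den += wgt
            if den > 0:
                # Weighted mean over observed pixels only, so the response
                # does not shrink where the input grid is sparsely sampled.
                out_vals[i][j] = num / den
                out_mask[i][j] = 1  # mask propagation (max-pool of the window)
    return out_vals, out_mask
```

Stacking such layers lets the valid region grow with depth, so information from isolated ground-track pixels spreads into the gaps between tracks, which is what makes the segmentation of a full two-dimensional map from sparse inputs feasible.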

This presentation focuses on how we implemented this approach to derive two-dimensional segmentation maps of classified eddies from along-track altimetry with our deep neural network architecture Teddy. We show results and limitations for the classification of eddies using only along-track SLA data from the multi-mission level 3 product of the Copernicus Marine Environment Monitoring Service (CMEMS) for the Gulf Stream region over the period 2017–2019. We find that, with our methodology, we can create two-dimensional maps of classified eddies from along-track data without using preprocessed SLA grid maps.

[1] Evan Mason, Ananda Pascual, and James C. McWilliams, “A new sea surface height–based code for oceanic mesoscale eddy tracking,” Journal of Atmospheric and Oceanic Technology, vol. 31, no. 5, pp. 1181–1188, 2014.

How to cite: Bolmer, E., Abulaitijiang, A., Fenoglio-Marc, L., Kusche, J., and Roscher, R.: Framework for creating daily semantic segmentation maps of classified eddies using SLA along-track altimetry data, EGU General Assembly 2023, Vienna, Austria, 24–28 Apr 2023, EGU23-13367, 2023.