NP4.1 | Time series analysis: New developments and lessons learned from applications across the geosciences
EDI
Co-organized by BG2/CL5/EMRP2/ESSI1/G1/GI2/HS13/SM3/ST2
Convener: Reik Donner | Co-conveners: Tommaso Alberti (ECS), Giorgia Di Capua (ECS), Simone Benella (ECS), Nina Kukowski
Orals | Tue, 16 Apr, 16:15–18:00 (CEST) | Room K2
Posters on site | Attendance Wed, 17 Apr, 10:45–12:30 (CEST) | Display Wed, 17 Apr, 08:30–12:30 | Hall X4
Time series are among the most common types of data sets generated by observational and modeling efforts across all fields of Earth, environmental and space sciences. However, the characteristics of such time series may differ vastly between applications – short vs. long, linear vs. nonlinear, univariate vs. multivariate, single- vs. multi-scale, etc. – calling for specifically tailored methodologies as well as generalist approaches. Similarly, the specific tasks of time series analysis span a vast body of problems, including
- dimensionality/complexity reduction and identification of statistically and/or dynamically meaningful modes of (co-)variability,
- statistical and/or dynamical modeling of time series using stochastic or deterministic time series models or empirical components derived from the data,
- characterization of variability patterns in time and/or frequency domain,
- quantification of various aspects of time series complexity and predictability,
- identification and quantification of different flavors of statistical interdependencies within and between time series, and
- discrimination between mere correlation and true causality among two or more time series.
Matching this broad range of potential analysis goals, there is a continuously expanding plethora of time series analysis concepts, many of which are known only to domain experts and have hardly found application beyond narrow fields, despite being potentially relevant to others as well.

Given the broad relevance and rather heterogeneous application of time series analysis methods across disciplines, this session shall serve as a knowledge incubator fostering cross-disciplinary knowledge transfer and cross-fertilization among the different disciplines gathering at the EGU General Assembly. We equally solicit contributions on methodological developments and theoretical studies as well as applications and case studies highlighting the potential and the limitations of different techniques across all fields of Earth, environmental and space sciences and beyond.

Orals: Tue, 16 Apr | Room K2

Chairpersons: Reik Donner, Tommaso Alberti, Simone Benella
16:15–16:20
16:20–16:30 | EGU24-18197 | ECS | Highlight | On-site presentation
Karim Zantout, Katja Frieler, and Jacob Schewe and the ISIMIP team

Climate variability gives rise to many different kinds of extreme impact events, including heat waves, crop failures, or wildfires. The frequency and magnitude of such events are changing under global warming. However, it is less well known to what extent such events occur with some regularity, and whether this regularity is also changing as a result of climate change. Here, we present a novel method to systematically study the time-autocorrelation of these extreme impact events, that is, whether they occur with a certain regularity. In studies of climate change impacts, different types of events are often studied in isolation, although in reality they interact. We use ensembles of global biophysical impact simulations from the Inter-Sectoral Impact Model Intercomparison Project (ISIMIP) driven with climate models to assess current conditions and projections. The time series analysis is based on a discrete Fourier transformation that accounts for the stochastic fluctuations of the climate model. Our results show that some climate impacts, such as crop failure, indeed exhibit a dominant frequency of recurrence, and also that these regularity patterns change over time due to anthropogenic climate forcing.
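
As a rough illustration of the kind of spectral screening described above (not the authors' ISIMIP workflow), the following Python sketch computes the periodogram of a synthetic annual extreme-event indicator series and compares it against a shuffled-series null distribution; the 7-year cycle, event probabilities and 95% level are arbitrary illustrative choices.

# Hypothetical sketch: screen an annual extreme-event indicator series for a
# dominant recurrence frequency with a discrete Fourier transform. The event
# series below is synthetic; it is NOT the ISIMIP simulation output.
import numpy as np

rng = np.random.default_rng(0)
n_years = 120
t = np.arange(n_years)
# Synthetic binary series: events preferentially every ~7 years plus noise
p_event = 0.15 + 0.25 * (np.cos(2 * np.pi * t / 7.0) > 0.8)
events = rng.random(n_years) < p_event

x = events.astype(float) - events.mean()          # remove the mean occurrence rate
spec = np.abs(np.fft.rfft(x)) ** 2                # periodogram of the indicator series
freqs = np.fft.rfftfreq(n_years, d=1.0)           # cycles per year

# Null distribution: periodograms of randomly shuffled event sequences
n_surr = 1000
null = np.empty((n_surr, spec.size))
for i in range(n_surr):
    null[i] = np.abs(np.fft.rfft(rng.permutation(x))) ** 2
threshold = np.percentile(null, 95, axis=0)       # pointwise 95% significance level

significant = (spec > threshold) & (freqs > 0)
for f, s in zip(freqs[significant], spec[significant]):
    print(f"period ~{1.0/f:5.1f} yr, power {s:.2f}")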

How to cite: Zantout, K., Frieler, K., and Schewe, J. and the ISIMIP team: The regularity of climate-related extreme events under global warming, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-18197, https://doi.org/10.5194/egusphere-egu24-18197, 2024.

16:30–16:40 | EGU24-9367 | ECS | On-site presentation
Susan Mariam Rajesh, Muraleekrishnan Bahuleyan, Arathy Nair GR, and Adarsh Sankaran

Evaluation of scaling properties and fractal formalisms is one of the potential approaches for modelling complex series. Understanding the complexity and fractal characterization of drought index time series is essential for better preparedness against drought disasters. This study presents a novel visibility-graph-based evaluation of the fractal characterization of droughts in three meteorological subdivisions of India. In this method, the horizontal visibility graph (HVG) and the upside-down visibility graph (UDVG) are used to evaluate the network properties of standardized precipitation index (SPI) series at 3-, 6- and 12-month time scales, representing short-, medium- and long-term droughts. The relative magnitude of the fractal estimates is controlled by the drought characteristics of wet-dry transitions. The estimates of the degree distribution clearly deciphered the self-similar properties of droughts in all subdivisions. For an insightful depiction of drought dynamics, the fractal exponents and spectra are evaluated by the concurrent application of the Sand Box Method (SBM) and the Chhabra and Jensen Method (CJM). The analysis was performed for the overall series as well as for the periods before and after the 1976-77 global climate shift. The complexity is more evident in the short-term drought series, and the UDVG formulation yielded higher fractal exponents for different moment orders irrespective of the drought type and locations considered in this study. Useful insights into the relationship between complex networks and fractality emerge from the study, which may help improve drought forecasting. The visibility-graph-based fractality estimation is efficient in capturing drought behavior and has vast potential for drought prediction in a changing environment.

Keywords:  Drought, Fractal, SPI, Visibility Graph
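
For readers unfamiliar with visibility graphs, the following minimal Python sketch builds a horizontal visibility graph (HVG) from a time series and prints its degree distribution; the random placeholder series, the UDVG-by-negation shortcut and the 500-sample length are illustrative assumptions, not the authors' SPI processing.

# Minimal sketch of a horizontal visibility graph (HVG): two samples i < j are
# connected if every sample strictly between them is lower than both. The SPI
# series here is a random placeholder, not the Indian subdivision data.
import numpy as np
from collections import Counter

def hvg_degrees(x):
    n = len(x)
    degree = np.zeros(n, dtype=int)
    for i in range(n):
        tallest_between = -np.inf
        for j in range(i + 1, n):
            # i sees j if every sample strictly between them is below both
            if tallest_between < min(x[i], x[j]):
                degree[i] += 1
                degree[j] += 1
            tallest_between = max(tallest_between, x[j])
            if x[j] >= x[i]:        # nothing beyond j can see i any more
                break
    return degree

rng = np.random.default_rng(1)
spi = rng.normal(size=500)                   # placeholder for an SPI-3/6/12 series
deg = hvg_degrees(spi)
deg_udvg = hvg_degrees(-spi)                 # "upside-down" variant: same routine on -spi

print("mean degree HVG :", deg.mean())
print("mean degree UDVG:", deg_udvg.mean())
# Degree distribution: for uncorrelated series the HVG tail decays exponentially
hist = Counter(deg)
for k in sorted(hist):
    print(k, hist[k] / len(deg))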

How to cite: Rajesh, S. M., Bahuleyan, M., Nair GR, A., and Sankaran, A.: Fractal complexity evaluation of meteorological droughts over three Indian subdivisions using visibility Graphs, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-9367, https://doi.org/10.5194/egusphere-egu24-9367, 2024.

16:40–16:50 | EGU24-10258 | On-site presentation
Norbert Marwan and Tobias Braun

A wide range of geoprocesses manifest as observable events in a variety of contexts, including shifts in palaeoclimate regimes, evolutionary milestones, tectonic activities, and more. Many prominent research questions, such as synchronisation analysis or power spectrum estimation of discrete data, pose considerable challenges to linear tools. We present recent advances using a specific similarity measure for discrete data and the method of recurrence plots for different applications in the field of highly discrete event data. We illustrate their potential for palaeoclimate studies, particularly in detecting synchronisation between signals of discrete extreme events and continuous signals, estimating power spectra of spiky signals, and analysing data with irregular sampling.
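
A generic recurrence-plot construction is sketched below in Python for orientation; it uses a plain absolute-difference similarity with a fixed threshold and a waiting-time representation of events, which is only a stand-in for the specific event-similarity measure developed by the authors.

# Generic recurrence-plot sketch (thresholded pairwise distances). The authors'
# specific similarity measure for discrete event sequences is not reproduced
# here; for illustration, events are compared through their waiting times.
import numpy as np

def recurrence_matrix(x, eps):
    # R[i, j] = 1 where the states at times i and j are closer than eps
    d = np.abs(x[:, None] - x[None, :])
    return (d <= eps).astype(int)

# Continuous example: noisy sine
t = np.linspace(0, 20 * np.pi, 800)
signal = np.sin(t) + 0.1 * np.random.default_rng(2).normal(size=t.size)
R_cont = recurrence_matrix(signal, eps=0.2)

# Discrete-event example: recurrence of inter-event (waiting) times
event_times = np.sort(np.random.default_rng(3).uniform(0, 1000, size=60))
waiting = np.diff(event_times)
eps_events = np.percentile(np.abs(waiting[:, None] - waiting[None, :]), 10)
R_events = recurrence_matrix(waiting, eps=eps_events)

# A simple recurrence quantification: recurrence rate
print("recurrence rate (continuous):", R_cont.mean())
print("recurrence rate (events):    ", R_events.mean())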

How to cite: Marwan, N. and Braun, T.: New concepts on quantifying event data, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-10258, https://doi.org/10.5194/egusphere-egu24-10258, 2024.

16:50–17:00 | EGU24-24 | Virtual presentation
Gael Kermarrec, Federico Maddanu, Anna Klos, and Tommaso Proietti

In the analysis of sub-annual climatological or geodetic time series – such as tide gauge records, precipitable water vapor, GNSS vertical displacements, temperatures or gas concentrations – seasonal cycles are often found to have a time-varying amplitude and phase.

These time series are usually modelled with a deterministic approach that includes a trend and annual and semi-annual periodic components with constant amplitude and phase lag. This approach can lead to inadequate interpretations, such as an overestimation of Global Navigation Satellite System (GNSS) station velocities, and may even mask important geophysical phenomena that are related to the amplitude variability and are important for deriving trustworthy interpretations for climate change assessment.

We address that challenge by proposing a novel linear additive model called the fractional Sinusoidal Waveform process (fSWp), accounting for possible nonstationary cyclical long memory, a stochastic trend that can evolve over time and an additional serially correlated noise capturing the short-term variability. The model has a state space representation and makes use of the Kalman filter (KF). Suitable enhancements of the basic methodology enable handling data gaps, outliers, and offsets. We demonstrate our method using various climatological and geodetic time series to illustrate its potential to capture the time-varying stochastic seasonal signals.
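
To make the state-space idea concrete, here is a heavily simplified Python sketch (not the fSWp model: no fractional long-memory component) in which a trend level and the cosine/sine coefficients of an annual cycle evolve as random walks and are tracked with a hand-written Kalman filter, so the seasonal amplitude and phase can drift; all noise variances and the synthetic series are illustrative assumptions.

# Hedged sketch: a much-simplified state-space analogue of the idea in the
# abstract (NOT the fSWp model itself). State = [trend level, cosine and sine
# coefficients of an annual cycle], all evolving as random walks, estimated
# with a Kalman filter so that amplitude and phase can drift over time.
import numpy as np

def kalman_seasonal(y, period, q_level=1e-4, q_seas=1e-5, r=1.0):
    n = len(y)
    w = 2 * np.pi / period
    x = np.zeros(3)                       # [level, a, b]
    P = np.eye(3) * 1e3                   # diffuse initial covariance
    Q = np.diag([q_level, q_seas, q_seas])
    states = np.zeros((n, 3))
    for t in range(n):
        # Prediction: random-walk transition (F = identity)
        P = P + Q
        # Observation y_t = level + a*cos(w t) + b*sin(w t) + noise
        H = np.array([1.0, np.cos(w * t), np.sin(w * t)])
        S = H @ P @ H + r
        K = P @ H / S                     # Kalman gain
        x = x + K * (y[t] - H @ x)
        P = P - np.outer(K, H @ P)
        states[t] = x
    amplitude = np.hypot(states[:, 1], states[:, 2])
    return states, amplitude

# Synthetic monthly series with a slowly growing annual amplitude
rng = np.random.default_rng(4)
t = np.arange(360)
y = 0.002 * t + (1 + 0.003 * t) * np.sin(2 * np.pi * t / 12) + 0.3 * rng.normal(size=t.size)
states, amp = kalman_seasonal(y, period=12)
print("estimated annual amplitude, first vs last year:", amp[:12].mean(), amp[-12:].mean())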

How to cite: Kermarrec, G., Maddanu, F., Klos, A., and Proietti, T.: The fractional Sinusoidal wavefront Model (fSwp) for time series displaying persistent stationary cycles, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-24, https://doi.org/10.5194/egusphere-egu24-24, 2024.

17:00–17:10 | EGU24-5268 | ECS | On-site presentation
Hang Qian, Meng Tian, and Nan Zhang

Owing to its immunity to post-formation geological modification, zircon is widely utilized as a chronological time capsule and provides critical time series data with the potential to unravel key events in Earth's geological history, such as supercontinent cycles. Fourier analysis, which assumes stationary periodicity, has been applied to zircon-derived time series to find the cyclicity of supercontinents, and wavelet analysis, which allows non-stationary periodicity, corroborates the results of Fourier analysis in addition to detecting finer-scale signals. Nonetheless, both methods presuppose periodicity in the zircon-derived time-domain data. To avoid the periodicity assumption and extract more objective information from zircon data, we opt for a Bayesian approach and treat zircon preservation as a composite stochastic process in which the number of preserved zircon grains per magmatic event obeys a logarithmic series distribution and the number of magmatic events during a geological time interval obeys a Poisson distribution. An analytical solution allows us to efficiently invert for the number and distribution(s) of changepoints hidden in the globally compiled zircon data, as well as for the zircon preservation potential (encoded as a model parameter) between two neighboring changepoints. If the distributions of changepoints temporally overlap with those of known supercontinents, then our results serve as an independent, mathematically robust test of the cyclicity of supercontinents. Moreover, our statistical approach inherently provides a sensitivity parameter whose tuning allows probing changepoints at various temporal resolutions. The constructed Bayesian framework thus has significant potential to detect other types of trend swings in Earth's history, such as shifts of geodynamic regimes, moving beyond the cyclicity detection that limits the application of conventional Fourier/wavelet analysis.
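
As a toy illustration of Bayesian changepoint inference on count data (a drastic simplification of the hierarchical zircon model: a single changepoint, Poisson counts only, no logarithmic-series layer), the following Python sketch computes the exact posterior over the changepoint location using conjugate Gamma priors; all priors and the synthetic counts are assumptions.

# Hedged toy sketch: exact posterior over a single changepoint in Poisson-
# distributed counts per time bin, with conjugate Gamma priors on the rates.
import numpy as np
from scipy.special import gammaln

def log_marginal(counts, a=1.0, b=1.0):
    # log of the integral of Poisson(counts | lam) * Gamma(lam | a, b) d lam
    # for one constant-rate segment
    n, s = len(counts), counts.sum()
    return (a * np.log(b) - gammaln(a) + gammaln(a + s)
            - (a + s) * np.log(b + n) - gammaln(counts + 1).sum())

def changepoint_posterior(counts):
    n = len(counts)
    logp = np.full(n, -np.inf)
    for tau in range(1, n - 1):           # changepoint between bins tau-1 and tau
        logp[tau] = log_marginal(counts[:tau]) + log_marginal(counts[tau:])
    logp -= logp.max()
    post = np.exp(logp)
    return post / post.sum()

rng = np.random.default_rng(5)
counts = np.concatenate([rng.poisson(2.0, 60), rng.poisson(6.0, 40)])  # rate shift at bin 60
post = changepoint_posterior(counts)
print("most probable changepoint bin:", post.argmax())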

How to cite: Qian, H., Tian, M., and Zhang, N.: Unveiling Geological Patterns: Bayesian Exploration of Zircon-Derived Time Series Data, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-5268, https://doi.org/10.5194/egusphere-egu24-5268, 2024.

17:10–17:20 | EGU24-18061 | ECS | On-site presentation
Davide Consoli, Leandro Parente, and Martijn Witjes

Several machine learning algorithms and analytical techniques do not allow gaps or missing values in input data. Unfortunately, Earth observation (EO) datasets, such as satellite images, are severely affected by cloud contamination and sensor artifacts that create gaps in the time series of collected images. This limits the usage of several powerful techniques for modeling and analysis. To overcome these limitations, several works in the literature propose different imputation methods to reconstruct gappy image time series, providing complete time-space datasets and enabling their usage as input for many techniques.

However, among the time-series reconstruction methods available in the literature, only a few are publicly available (open source code), applicable without any external source of data, and suitable for application to petabyte (PB) sized datasets like the full Landsat archive. The few methods that match all these characteristics are usually quite simple (e.g. linear interpolation) and, as a consequence, often show poor performance in reconstructing the images.

For this reason, we propose a new methodology for time series reconstruction designed to meet all these requirements. Like some other methods in the literature, the new method, named seasonally weighted average generalization (SWAG), works purely in the time dimension, reconstructing the time series of each pixel separately. In particular, the method uses a weighted average of the samples available in the original time series to reconstruct the missing values. Enforcing the annual seasonality of each band as a prior, the reconstruction of each missing sample gives higher weight to images collected at integer multiples of a year from the gap. To avoid propagating land cover changes into future or past images, higher weights are given to more recent images. Finally, to obtain a method that respects causality, only images from the past of each sample in the time series are used.

To achieve computational performance suitable for PB-sized datasets, the method has been implemented in C++ using a sequence of fast convolutions and element-wise (Hadamard) products and divisions. The method has been applied to a bimonthly aggregated version of the global GLAD Landsat ARD-2 collection from 1997 to 2022, producing a 400 terabyte output dataset. The produced dataset will be used to generate maps of several biophysical parameters, such as the fraction of absorbed photosynthetically active radiation (FAPAR), the normalized difference water index (NDWI) and the bare soil fraction (BSF). The code is available as open source, and the result is fully reproducible.
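
A per-pixel Python sketch of the weighting idea described above is given below for illustration; it is not the published C++ SWAG implementation, and the seasonal and recency weighting constants, the bimonthly sampling and the synthetic NDVI-like series are arbitrary assumptions.

# Illustrative per-pixel sketch of the weighting scheme described above: each
# gap is filled with a weighted average of PAST observations only (causality),
# giving larger weights to samples lying close to integer multiples of a year
# before the gap and to more recent samples. This is NOT the published C++
# SWAG implementation; the weighting constants below are arbitrary choices.
import numpy as np

def swag_like_fill(series, steps_per_year=6, season_sigma=0.5, recency_decay=0.02):
    filled = series.copy()
    for t in np.where(np.isnan(series))[0]:
        past = np.arange(t)
        valid = past[~np.isnan(series[past])]     # only original past observations
        if valid.size == 0:
            continue
        lag = t - valid                                    # in time steps
        phase = (lag % steps_per_year) / steps_per_year    # 0 at exact multiples of a year
        phase = np.minimum(phase, 1.0 - phase)             # distance to nearest anniversary
        w_season = np.exp(-(phase / season_sigma) ** 2)    # favour the same season in past years
        w_recent = np.exp(-recency_decay * lag)            # favour recent observations
        w = w_season * w_recent
        filled[t] = np.sum(w * series[valid]) / np.sum(w)
    return filled

# Bimonthly NDVI-like series with gaps (synthetic placeholder data)
rng = np.random.default_rng(6)
t = np.arange(6 * 26)                                     # 26 years, 6 observations/year
ndvi = 0.5 + 0.3 * np.sin(2 * np.pi * t / 6) + 0.05 * rng.normal(size=t.size)
ndvi[rng.random(t.size) < 0.3] = np.nan                   # ~30% cloud gaps
print("remaining gaps:", np.isnan(swag_like_fill(ndvi)).sum())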

References:

Potapov, Hansen, Kommareddy, Kommareddy, Turubanova, Pickens, ... & Ying  (2020). Landsat analysis ready data for global land cover and land cover change mapping. Remote Sensing, 12(3), 426.

Julien, & Sobrino (2019). Optimizing and comparing gap-filling techniques using simulated NDVI time series from remotely sensed global data. International Journal of Applied Earth Observation and Geoinformation, 76, 93-111.

Radeloff, Roy, Wulder, Anderson, Cook, Crawford, ... & Zhu (2024). Need and vision for global medium-resolution Landsat and Sentinel-2 data products. Remote Sensing of Environment, 300, 113918.

How to cite: Consoli, D., Parente, L., and Witjes, M.: A new methodology for time-series reconstruction of global scale historical Earth observation data, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-18061, https://doi.org/10.5194/egusphere-egu24-18061, 2024.

17:20–17:30 | EGU24-6065 | ECS | On-site presentation
Kendra R. Gilmore, Sarah N. Bentley, and Andy W. Smith

Forecasting the aurora and its location accurately is important to mitigate potential harm to vital infrastructure such as communications and electricity grid networks. Current auroral prediction models rely on our understanding of the interaction between the magnetosphere and the solar wind, or on geomagnetic indices. Both approaches predict well but have limitations, either for forecasting (in the case of geomagnetic-index-based models) or because of the underlying assumptions driving the model (a simplification of the complex interaction). By applying machine learning algorithms to this problem, gaps in our understanding can be identified, investigated, and closed. Finding the important time scales for driving empirical models provides the necessary basis for our long-term goal of predicting the aurora using machine learning.

Periodicities of the Earth’s magnetic field have been extensively studied on a global scale or in regional case studies. Using a suite of different time series analysis techniques, including frequency analysis and the investigation of long-term changes of the median/mean, we examine the dominant periodicities of ground magnetic field measurements at selected locations. A selection of stations from the SuperMAG network (Gjerloev, 2012), a global network of magnetometer stations across the world, is the focus of this investigation.

The periodicities retrieved from the different magnetic field components are compared to each other as well as to other locations. In the context of auroral predictions, an analysis of the dominating periodicities in the auroral boundary data derived from the IMAGE satellite (Chisham et al., 2022) provides a counterpart to the magnetic field periodicities.

Ultimately, we can constrain the length of time history sensible for forecasting.

How to cite: Gilmore, K. R., Bentley, S. N., and Smith, A. W.: Magnetospheric time history:  How much do we need for forecasting?, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-6065, https://doi.org/10.5194/egusphere-egu24-6065, 2024.

17:30–17:40 | EGU24-16981 | ECS | On-site presentation
Sho Sato and Hiroaki Toh

We present a machine-learning-based approach for predicting the geomagnetic main field changes, known as secular variation (SV), over a 5-year range for use in the 14th generation of the International Geomagnetic Reference Field (IGRF-14). The training and test datasets of the machine learning (ML) models are geomagnetic field snapshots derived from magnetic observatory hourly means and from CHAMP and Swarm-A satellite data (MCM model; Ropp et al., 2020). The geomagnetic field data are not used as-is but are differenced twice before training. Because SV is strongly influenced by the geodynamo process occurring in the Earth's outer core, challenges persist despite efforts to model and forecast the realistic nonlinear behaviors of the geodynamo (such as geomagnetic jerks) through data assimilation. We compare three physics-uninformed ML models, namely an autoregressive (AR) model, a vector autoregressive (VAR) model, and a recurrent neural network (RNN) model, to represent the short-term temporal evolution of the geomagnetic main field on the Earth's surface. The quality of the 5-year predictions is tested by hindcast results for the learning window from 2004.50 to 2014.25. These tests show that the forecast performance of our ML models is comparable with that of candidate models of IGRF-13 in terms of data misfits after the release epoch (year 2014.75). All three ML models give 5-year prediction errors of less than 100 nT, among which the RNN model shows slightly better accuracy. The tests also suggest that the RNN model is prone to overfitting, an undesirable machine learning behavior in which the model reproduces the training data accurately but performs worse on the forecasting targets.
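
For orientation, the following Python sketch shows the simplest of the three model classes: an AR(p) model fitted by least squares to a twice-differenced synthetic series, iterated forward for 5 years and re-integrated; the synthetic "Gauss coefficient", the quarterly time step and the AR order are illustrative assumptions, not the authors' setup.

# Hedged sketch of the simplest of the three model classes: an AR(p) model
# fitted to a twice-differenced series by least squares, then iterated forward
# and re-integrated for a multi-step forecast. The series is synthetic and
# stands in for one Gauss coefficient; it is not the MCM model data.
import numpy as np

def fit_ar(x, p):
    # Least-squares estimate of AR(p): x_t ~ c + sum_i a_i * x_{t-i}
    X = np.column_stack([x[p - i - 1:len(x) - i - 1] for i in range(p)])
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, x[p:], rcond=None)
    return coef                                  # [c, a_1, ..., a_p]

def forecast_ar(x, coef, steps):
    p = len(coef) - 1
    hist = list(x[-p:])
    out = []
    for _ in range(steps):
        nxt = coef[0] + np.dot(coef[1:], hist[::-1])
        out.append(nxt)
        hist = hist[1:] + [nxt]
    return np.array(out)

rng = np.random.default_rng(7)
t = np.arange(200)                               # e.g. quarterly snapshots
g = -30000 + 15 * t + 0.02 * t**2 + np.cumsum(rng.normal(0, 2, t.size))  # synthetic coefficient (nT)

d2 = np.diff(g, n=2)                             # difference twice, as in the abstract
coef = fit_ar(d2, p=4)
d2_fc = forecast_ar(d2, coef, steps=20)          # 5 years of quarterly steps

# Undo the double differencing to recover the forecast of g itself
d1_fc = np.cumsum(np.concatenate([[g[-1] - g[-2]], d2_fc]))[1:]
g_fc = g[-1] + np.cumsum(d1_fc)
print("forecast 5 years ahead:", g_fc[-1])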

How to cite: Sato, S. and Toh, H.: A machine-learning-based approach for predicting the geomagnetic secular variation, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-16981, https://doi.org/10.5194/egusphere-egu24-16981, 2024.

17:40–17:50 | EGU24-4280 | ECS | On-site presentation
Iram Parvez

Iram Parvez1, Massimiliano Cannata2, Giorgio Boni1, Rossella Bovolenta1 ,Eva Riccomagno3 , Bianca Federici1

1 Department of Civil, Chemical and Environmental Engineering (DICCA), Università degli Studi di Genova, Via Montallegro 1, 16145 Genoa, Italy (iram.parvez@edu.unige.it, bianca.federici@unige.it, giorgio.boni@unige.it, rossella.bovolenta@unige.it).

2 Institute of Earth Sciences (IST), Department for Environment Constructions and Design (DACD), University of Applied Sciences and Arts of Southern Switzerland (SUPSI), CH-6952 Canobbio, Switzerland (massimiliano.cannata@supsi.ch).

3 Department of Mathematics, Università degli Studi di Genova, Via Dodecaneso 35, 16146 Genova, Italy (riccomag@dima.unige.it).

The deployment of hydrometeorological sensors significantly contributes to generating real-time big data. The quality and reliability of such large datasets pose considerable challenges, as flawed analyses and decision-making processes can result. This research addresses the issue of anomaly detection in real-time data by exploring machine learning models. Time-series data are collected from IstSOS (Sensor Observation Service), an open-source software that stores, collects and disseminates sensor data. The methodology consists of Gated Recurrent Unit (GRU) recurrent neural networks, along with corresponding prediction intervals, applied both to individual sensors and collectively across all temperature sensors within the Ticino region of Switzerland. Additionally, non-parametric methods such as the bootstrap and the mean absolute deviation are employed instead of standard prediction intervals to tackle the non-normality of the data. The results indicate that GRU-based recurrent neural networks, coupled with non-parametric forecast intervals, perform well in identifying erroneous data points. The application of the model to multivariate sensor time series establishes a pattern, or baseline, of normal behavior for the area (Ticino). When a new sensor is installed in the same region, the recognized pattern is used as a reference to identify outliers in the data gathered from the new sensor.

How to cite: Parvez, I.: Exploring Machine Learning Models to Detect Outliers in HydroMet Sensors, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-4280, https://doi.org/10.5194/egusphere-egu24-4280, 2024.

17:50–18:00 | EGU24-10415 | ECS | On-site presentation
Marek Lóderer, Michal Sandanus, Peter Pavlík, and Viera Rozinajová

Nowadays photovoltaic panels are becoming more affordable, efficient, and popular due to their low carbon footprint. PV panels can be installed in many places providing green energy to the local grid reducing energy cost and transmission losses. Since the PV production is highly dependent on the weather conditions, it is extremely important to estimate expected output in advance in order to maintain energy balance in the grid and provide enough time to schedule load distribution. The PV production output can be calculated by various statistical and machine learning prediction methods. In general, the more data available, the more precise predictions can be produced. This poses a problem for recently installed PV panels for which not enough data has been collected or the collected data are incomplete. 

A possible solution to the problem can be the application of an approach called Transfer Learning which has the inherent ability to effectively deal with missing or insufficient amounts of data. Basically, Transfer Learning is a machine learning approach which offers the capability of transferring knowledge acquired from the source domain (in our case a PV panel with a large amount of historical data) to different target domains (PV panels with very little collected historical data) to resolve related problems (provide reliable PV production predictions). 

In our study, we investigate the application, benefits and drawbacks of Transfer Learning for one-day-ahead PV production prediction. The model used in the study is based on a complex neural network architecture, feature engineering and data selection. Moreover, we focus on exploring multiple approaches to adjusting the weights in the target-model retraining process, which affect the minimum amount of training data required, the final prediction accuracy and the model's overall robustness. Our models use historical meteorological forecasts from Deutscher Wetterdienst (DWD) and photovoltaic measurements from the PVOutput project, which collects data from installed solar systems across the globe. Evaluation is performed on more than 100 installed PV panels in Central Europe.
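
A minimal PyTorch sketch of the weight-adjustment step is shown below: a copy of a source-site model is made, its early layers are frozen, and only the head is retrained on a small target-site dataset. The architecture, layer split and synthetic data are placeholders and do not correspond to the authors' network.

# Hedged sketch of the transfer-learning step only: copy a model trained on a
# data-rich source PV site, freeze its early layers, and retrain the head on a
# small target-site dataset.
import copy
import torch
import torch.nn as nn

n_features = 16                                    # e.g. weather forecast features
source_model = nn.Sequential(
    nn.Linear(n_features, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 24),                             # 24 hourly PV outputs for the next day
)
# ... assume source_model has been trained on the data-rich source panel ...

target_model = copy.deepcopy(source_model)
for layer in list(target_model.children())[:-1]:   # freeze everything but the last layer
    for p in layer.parameters():
        p.requires_grad = False

opt = torch.optim.Adam(
    (p for p in target_model.parameters() if p.requires_grad), lr=1e-3)
loss_fn = nn.MSELoss()

# Tiny synthetic target-site dataset standing in for a recently installed panel
X_tgt = torch.randn(50, n_features)
y_tgt = torch.randn(50, 24)
for epoch in range(200):
    opt.zero_grad()
    loss = loss_fn(target_model(X_tgt), y_tgt)
    loss.backward()
    opt.step()
print("target-site training loss:", float(loss))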

How to cite: Lóderer, M., Sandanus, M., Pavlík, P., and Rozinajová, V.: Application of Transfer Learning techniques in one day ahead PV production prediction, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-10415, https://doi.org/10.5194/egusphere-egu24-10415, 2024.

Posters on site: Wed, 17 Apr, 10:45–12:30 | Hall X4

Display time: Wed, 17 Apr 08:30–Wed, 17 Apr 12:30
Chairpersons: Reik Donner, Tommaso Alberti, Simone Benella
Nonlinear Time Series Analysis
X4.114 | EGU24-6151
Javier Amezcua and Nachiketa Chakraborty

Dynamical systems can display a range of dynamical regimes (e.g. attraction to fixed points, limit cycles, intermittency, chaotic behaviour) depending on the values of the parameters in the system. In this work we demonstrate how non-parametric entropy estimation codes (in particular NPEET) based on the Kraskov method can be applied to find regime transitions in a 3D chaotic model (the Lorenz 1963 system) when varying the values of the parameters. These information-theory-based methods are simpler and cheaper to apply than more traditional metrics from dynamical systems (e.g. computation of Lyapunov exponents). The non-parametric nature of the method allows for handling long time series without a prohibitive computational burden.
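
The following Python sketch illustrates the underlying estimator family: a Kozachenko-Leonenko k-nearest-neighbour entropy estimate (the building block of Kraskov-type methods such as those in NPEET) applied to Lorenz-1963 trajectories at two parameter values; the integration settings, the neighbour count k and the use of the entropy estimate as a regime indicator are illustrative assumptions.

# Hedged sketch: a Kozachenko-Leonenko k-nearest-neighbour entropy estimate
# applied to Lorenz-63 trajectories; a drop in the estimate when varying rho
# can flag a transition to more ordered dynamics.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.spatial import cKDTree
from scipy.special import digamma

def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

def kl_entropy(samples, k=4):
    # Kozachenko-Leonenko estimator of differential entropy (nats), Chebyshev norm
    n, d = samples.shape
    tree = cKDTree(samples)
    dist, _ = tree.query(samples, k=k + 1, p=np.inf)   # first neighbour is the point itself
    eps = dist[:, -1]
    return digamma(n) - digamma(k) + d * np.log(2) + d * np.mean(np.log(eps))

def lorenz_entropy(rho, t_max=60.0, dt=0.01, k=4):
    sol = solve_ivp(lorenz, (0, t_max), [1.0, 1.0, 1.0], args=(10.0, rho, 8.0 / 3.0),
                    t_eval=np.arange(0, t_max, dt), rtol=1e-6, atol=1e-6)
    traj = sol.y.T[2000:]                              # discard the transient
    return kl_entropy(traj, k=k)

# rho = 28 is the classical chaotic case; rho = 350 lies in a large-rho regime
# often reported to be periodic -- shown here only to contrast the estimates.
for rho in (28.0, 350.0):
    print(f"rho = {rho:6.1f}  KL entropy estimate = {lorenz_entropy(rho):.2f} nats")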

How to cite: Amezcua, J. and Chakraborty, N.: Using information-theory metrics to detect regime changes in dynamical systems, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-6151, https://doi.org/10.5194/egusphere-egu24-6151, 2024.

Applications in Geophysics: Geomagnetism, Volcanism and Seismology
X4.115 | EGU24-582 | ECS
Benedek Koszta and Gábor Timár

On some maps of the first military survey of the Habsburg Empire, the upper direction of the sections does not face cartographic north but makes an angle of about 15° with it. This may be because the sections were subsequently rotated towards the magnetic north of the time. Neither their projection nor their projection origin is known yet.

In my research, I am dealing with maps of Inner Austria, the Principality of Transylvania and Galicia (nowadays Poland and Ukraine), and I am trying to determine their projection origins. For this purpose, it is assumed, based on the archival documentation of the survey, that these are Cassini projection maps. My hypothesis is that the origins are Graz, Cluj-Napoca or Alba Iulia, and Lviv, respectively. I also consider the position of Vienna in each case, since it was the main centre of the survey.

The angle of rotation was taken in part from the gufm1 historical magnetic model, which extends back to 1590, for the assumed starting points and the year of mapping. In addition, as a theoretical case, I calculated the rotation angle of the map sections using coordinate geometry. I then calculated the longitude of the projection origin for each case using univariate minimization. Since the method is insensitive to latitude, the latitude can only be determined from archival data.

Based on these, the origin for Inner Austria derived from the map rotation is Vienna, which is not excluded by the archival sources and is partly logical since the baseline through Graz also started from there. The map rotation for Galicia and Transylvania also confirmed the hypothesized origins. Since both Alba Iulia and Cluj-Napoca lie at about the same longitude, the method cannot distinguish between them, and the archival data did not provide enough evidence. In comparison, the magnetic declination rotations yielded differences of about 1°, which may be due to an error in the magnetic model.

On this basis, I have given the assumed projections of the three maps with their projection origins, and developed a method for determining the projection origins of the other rotated-grid maps. The results suggest that the section network was very probably rotated towards the magnetic north direction, and thus provide a way to refine the magnetic declination data of that time.

With this method I was able to derive new indirect magnetic declination data for Central-Eastern Europe, which can help to improve the historical magnetic field models. This is particularly valuable because we do not have any measurements from that region.

Furthermore, the difference between the angle of the section north and the declination data from gufm1 is always 0.8-1°, which may indicate a systematic error in the data for that region.

Supported by the ÚNKP-23-6 New National Excellence Program of the Ministry for Culture and Innovation from the source of the National Research, Development and Innovation Fund.

How to cite: Koszta, B. and Timár, G.: A possible cartographical data source for historical magnetic field improvement: The direction of the section north of the Habsburg first military survey, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-582, https://doi.org/10.5194/egusphere-egu24-582, 2024.

X4.116 | EGU24-11897
Beibit Zhumabayev, Ivan Vassilyev, Zhasulan Mendakulov, Inna Fedulina, and Vitaliy Kapytin

At each magnetic observatory, the magnetic field is recorded in a local Cartesian coordinate system associated with the geographic coordinates of the observatory's location. To observe extraterrestrial magnetic field sources, such as the interplanetary magnetic field or magnetic clouds, a method of joint processing of data from magnetic observatories of the international Intermagnet network was implemented. In this method, the constant component is removed from the observations of the individual observatories, their measurement data are converted into the ecliptic coordinate system, and the results obtained from all observatories are averaged after the coordinate transformation.

We present first results of jointly processing measurements from the international network of Intermagnet magnetic observatories in the periods before the onset of magnetic storms of various types, during these storms, and after their end. There is a significant improvement in the signal-to-noise ratio after combining the measurement results from all observatories, which makes it possible to isolate weaker external magnetic fields. A change in the shape of the magnetic field variations is shown, which can provide new knowledge about the mechanism of development of magnetic storms.

How to cite: Zhumabayev, B., Vassilyev, I., Mendakulov, Z., Fedulina, I., and Kapytin, V.: Results of joint processing of magnetic observatory data of international Intermagnet network in a unified coordinate system, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-11897, https://doi.org/10.5194/egusphere-egu24-11897, 2024.

X4.117 | EGU24-19601 | ECS
Palangio Paolo Giovanni and Santarelli Lucia

Discrimination of geomagnetic quasi-periodic signals by using the SSA Transform

P. Palangio and L. Santarelli

Istituto Nazionale di Geofisica e Vulcanologia, L'Aquila
Istituto Nazionale di Geofisica e Vulcanologia, Roma

Correspondence to: lucia.santarelli@ingv.it

Abstract

In this paper we present an application of the SSA Transform to the detection and reconstruction of very weak geomagnetic signals hidden in noise. In the SSA Transform, multiple subspaces are used for representing and reconstructing signals and noise. This analysis allows us to reconstruct, in the time domain, the different harmonic components contained in the original signal by using orthogonal functions. The objective is to identify the dominant subspaces that can be attributed to the signals and the subspaces that can be attributed to the noise, assuming that all these subspaces are orthogonal to each other, which implies that the signals and noise are independent of one another. The subspace of the signals is mapped simultaneously onto several spaces of lower dimension, favoring the dimensions that best discriminate the patterns. Each subspace of the signal space is used to encode different subsets of functions having common characteristics, such as the same periodicities. The subspace identification was performed by using singular value decomposition (SVD) techniques, known as SVD-based identification methods, classified in a subspace-oriented scheme. The quasi-periodic variations of the geomagnetic field have been investigated over scales spanning from 22 years to 8.9 days, such as the Sun's polarity reversal cycle (22 years), the sunspot cycle (11 years), the equinoctial effect (6 months), and the synodic rotation of the Sun (27 days) and its harmonics. The strength of these signals varies from fractions of a nT to tens of nT. The phase and frequency variability of these cycles has been evaluated from the variations of the geomagnetic field recorded at a mid-latitude site (covering roughly 4.5 sunspot cycles). Magnetic data recorded at the L'Aquila Geomagnetic Observatory (geographic coordinates: 42° 23' N, 13° 19' E; geomagnetic coordinates: 36.3° N, 87.2° E; L-shell = 1.6) are used from 1960 to 2009.
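
For readers new to SSA, a basic Python sketch is given below: the series is embedded in a trajectory matrix, decomposed by SVD, and selected components are reconstructed by diagonal averaging. The window length, the component grouping and the synthetic 27-day/6-month test signal are illustrative choices, not the observatory processing used by the authors.

# Hedged sketch of basic singular spectrum analysis (SSA): embed the series in
# a trajectory matrix, take the SVD, and reconstruct selected components by
# diagonal (Hankel) averaging.
import numpy as np

def ssa_decompose(x, window):
    k = len(x) - window + 1
    X = np.column_stack([x[i:i + window] for i in range(k)])   # trajectory matrix (window x k)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U, s, Vt

def ssa_reconstruct(U, s, Vt, components, n):
    window, k = U.shape[0], Vt.shape[1]
    Xr = (U[:, components] * s[components]) @ Vt[components]
    rec = np.zeros(n)
    counts = np.zeros(n)
    for i in range(window):                # anti-diagonal averaging back to a 1-D series
        for j in range(k):
            rec[i + j] += Xr[i, j]
            counts[i + j] += 1
    return rec / counts

# Synthetic "geomagnetic" series: 27-day and 6-month cycles buried in noise
rng = np.random.default_rng(8)
t = np.arange(3650)                                   # daily values, 10 years
x = (2.0 * np.sin(2 * np.pi * t / 27.0)
     + 5.0 * np.sin(2 * np.pi * t / 182.6)
     + 4.0 * rng.normal(size=t.size))
U, s, Vt = ssa_decompose(x, window=365)
print("leading singular values:", np.round(s[:6], 1))
clean = ssa_reconstruct(U, s, Vt, components=[0, 1, 2, 3], n=len(x))  # first two oscillatory pairs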

 

 

How to cite: Paolo Giovanni, P. and Lucia, S.: Discrimination of  geomagnetic quasi-periodic signals by using SSA Transform, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-19601, https://doi.org/10.5194/egusphere-egu24-19601, 2024.

X4.118 | EGU24-12928
Fadil Inceoglu and Paul Loto'aniu

We introduce the CLEAN algorithm to identify narrowband ultra-low-frequency (ULF) Pc5 plasma waves in Earth's magnetosphere. The CLEAN method was first used for constructing 2D images in astronomical radio interferometry but has since been applied to a huge range of areas, including adaptations for time series analysis. The algorithm performs a nonlinear deconvolution in the frequency domain (equivalent to a least-squares fit in the time domain), allowing for the identification of multiple individual wave spectral peaks within the same power spectral density. The CLEAN method also produces real amplitudes instead of model fits to the peaks and retains phase information. We applied the method to GOES magnetometer data spanning 30 years to study the distribution of narrowband Pc5 ULF waves at geosynchronous orbit. We found close to 30,0000 wave events in each of the vector magnetic field components in field-aligned coordinates. We discuss wave occurrence and amplitudes distributed in local time and frequency. The distribution of the waves under different solar wind conditions is also presented. With some precautions, which are applicable to other event identification methods, the CLEAN technique can be utilized to detect wave events and their harmonics in the magnetosphere and beyond. We also discuss limitations of the method, mainly the detection of unrealistic peaks due to aliasing and the Gibbs phenomenon.
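
As a hedged illustration only, the Python sketch below implements a simplified time-domain analogue of CLEAN-style deconvolution: it repeatedly locates the strongest peak in the residual periodogram, fits a single sinusoid there by least squares and subtracts it. This is not the full frequency-domain CLEAN algorithm applied to the GOES data, and the synthetic Pc5-like record and the fixed number of iterations are assumptions.

# Simplified, time-domain analogue of CLEAN-style spectral deconvolution:
# iteratively locate the strongest sinusoid in the residual, fit its amplitude
# and phase by least squares, and subtract it.
import numpy as np

def clean_like(t, x, n_components=3, oversample=8):
    resid = x - x.mean()
    freqs = np.fft.rfftfreq(len(t) * oversample, d=t[1] - t[0])[1:]
    found = []
    for _ in range(n_components):
        # Locate the strongest peak of the residual's (zero-padded) periodogram
        spec = np.abs(np.fft.rfft(resid, n=len(t) * oversample))[1:]
        f0 = freqs[np.argmax(spec)]
        # Least-squares fit of a single sinusoid at f0, then subtract it
        A = np.column_stack([np.cos(2 * np.pi * f0 * t), np.sin(2 * np.pi * f0 * t)])
        coef, *_ = np.linalg.lstsq(A, resid, rcond=None)
        found.append((f0, np.hypot(*coef)))
        resid = resid - A @ coef
    return found, resid

# Synthetic Pc5-like record: two narrowband waves plus noise (placeholder data)
dt = 10.0                                             # seconds
t = np.arange(0, 6 * 3600, dt)                        # 6 hours of 10-s samples
x = (3.0 * np.sin(2 * np.pi * 3e-3 * t + 0.4)         # ~3 mHz
     + 1.5 * np.sin(2 * np.pi * 5e-3 * t)             # ~5 mHz
     + 2.0 * np.random.default_rng(9).normal(size=t.size))
for f, amp in clean_like(t, x, n_components=2)[0]:
    print(f"f = {f*1e3:.2f} mHz, amplitude = {amp:.2f} nT")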

How to cite: Inceoglu, F. and Loto'aniu, P.: Using the CLEAN Algorithm to Determine the Distribution of Ultra Low Frequency Waves at Geostationary Orbit, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-12928, https://doi.org/10.5194/egusphere-egu24-12928, 2024.

X4.119 | EGU24-12938
Marisol Monterrubio-Velasco, Xavier Lana, Raúl Arámbula-Mendoza, and Ramón Zúñiga

Understanding volcanic activity through time series data analysis is crucial for uncovering the fundamental physical mechanisms governing this natural phenomenon.

In this study, we show the application of multifractal and fractal methodologies, along with statistical analysis, to investigate time series associated with volcanic activity. We aim to make use of these approaches to identify significant variations within the physical processes related to changes in volcanic activity. These methodologies offer the potential to identify pertinent changes preceding a high-energy explosion or a significant volcanic eruption.

In particular, we apply them to two case studies. First, the evolution of the multifractal structure of volcanic emissions of low, moderate, and high energy explosions is analyzed for Volcán de Colima (México, 2013-2015). The results provide quite evident signs of the imminence of possible dangerous high-energy emissions, close to 8.0x10^8 J. Additionally, the evolution of the Gutenberg-Richter law adapted to volcanic energy emissions helps confirm the results obtained using the multifractal analysis. Secondly, we also studied the time series of the Gutenberg-Richter b-parameter of seismic activity associated with volcanic emissions in Iceland, Hawaii, and the Canary Islands, through the concept of disparity (degree of irregularity), the fractal Hurst exponent H, and several multifractal parameters. The results obtained should facilitate a better understanding of the relationships between volcanic emission activity and the corresponding seismic activity.
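
One simple building block of such analyses, sketched below in Python, is a sliding-window maximum-likelihood (Aki) estimate of the Gutenberg-Richter b-value; the synthetic magnitude catalogue, completeness magnitude and window length are illustrative assumptions rather than the Colima, Iceland, Hawaii or Canary Islands data.

# Sliding-window maximum-likelihood (Aki) estimate of the Gutenberg-Richter
# b-value, with Utsu's correction for magnitude binning.
import numpy as np

def b_value(mags, mc, dm=0.1):
    m = mags[mags >= mc]
    return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

def sliding_b(mags, mc, window=200, step=50):
    idx = np.arange(0, len(mags) - window + 1, step)
    return idx + window // 2, np.array([b_value(mags[i:i + window], mc) for i in idx])

# Synthetic catalogue: exponential magnitudes (b ~ 1.0 then b ~ 0.7), Mc = 2.0
rng = np.random.default_rng(10)
mc = 2.0
mags = np.round(np.concatenate([
    mc + rng.exponential(1.0 / (1.0 * np.log(10)), 2000),   # b = 1.0
    mc + rng.exponential(1.0 / (0.7 * np.log(10)), 2000),   # b = 0.7
]), 1)                                                      # bin to 0.1 magnitude units
centers, b = sliding_b(mags, mc)
print("b at start =", round(b[0], 2), " b at end =", round(b[-1], 2))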

How to cite: Monterrubio-Velasco, M., Lana, X., Arámbula-Mendoza, R., and Zúñiga, R.: Applying Multifractal Theory and Statistical Techniques for High Energy Volcanic Explosion Detection and Seismic Activity Monitoring in Volcanic Time Series, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-12938, https://doi.org/10.5194/egusphere-egu24-12938, 2024.

X4.120 | EGU24-17344
Éric Beucler, Mickaël Bonnin, and Arthur Cuvier

The quality of the seismic signal recorded at permanent and temporary stations is sometimes degraded, either abruptly or over time. The most likely cause is a high level of humidity, leading to corrosion of the connectors, but environmental changes can also alter recording conditions in various frequency ranges, and not necessarily for all three components in the same way. Assuming that the continuous seismic signal can be described by a normal distribution, we present a new approach to quantify seismogram quality and to point out any time sample that deviates from this Gaussian assumption. To this end, we introduce the notion of a background Gaussian signal (BGS) to statistically describe a set of samples that follows a normal distribution. The discrete function obtained by sorting the samples in ascending order of amplitude is compared to a modified probit function to retrieve the elements composing the BGS and its statistical properties, mostly the Gaussian standard deviation, which can then differ from the classical standard deviation. The ratio of the two standard deviations directly quantifies the dominant Gaussianity of the continuous signal, and any variation reflects a statistical modification of the signal quality. We present examples showing daily variations in this ratio for stations known to have been affected by humidity, resulting in signal degradation. The theory developed can be used to detect subtle variations in the Gaussianity of the signal, but also to point out any samples that do not match the Gaussianity assumption, which can then be used for other seismological purposes, such as coda determination.
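
A simplified analogue of this quality measure is sketched below in Python: a "Gaussian-core" standard deviation is estimated from central quantiles of the sorted samples via the inverse normal CDF (the probit function) and compared with the classical standard deviation. This is not the authors' modified-probit fitting procedure, and the quantile range and synthetic signals are assumptions.

# Simplified analogue: estimate a Gaussian-core sigma from central quantiles
# and compare it with the classical standard deviation; a ratio near 1 means
# the signal is dominated by its Gaussian background.
import numpy as np
from scipy.stats import norm

def gaussianity_ratio(x, q_lo=0.25, q_hi=0.75):
    # Width of the central quantile range, converted to a Gaussian sigma
    core_sigma = (np.quantile(x, q_hi) - np.quantile(x, q_lo)) / (norm.ppf(q_hi) - norm.ppf(q_lo))
    return core_sigma / np.std(x)

rng = np.random.default_rng(11)
clean = rng.normal(0, 1, 86400)                       # one day of 1-Hz "quiet" noise
spiky = clean.copy()
spiky[rng.integers(0, clean.size, 200)] += rng.normal(0, 30, 200)   # transient disturbances

print("ratio, clean signal   :", round(gaussianity_ratio(clean), 3))   # ~1.0
print("ratio, degraded signal:", round(gaussianity_ratio(spiky), 3))   # drops below 1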

How to cite: Beucler, É., Bonnin, M., and Cuvier, A.: Introducing a new statistical theory to quantify the Gaussianity of the continuous seismic signal, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-17344, https://doi.org/10.5194/egusphere-egu24-17344, 2024.

X4.121 | EGU24-17566 | ECS
Yichen Zhong, Chen Gu, Michael Fehler, German Prieto, Peng Wu, Zhi Yuan, Zhuoyu Chen, and Borui Kang

Climate events may induce abnormal ocean wave activity that can be detected by seismic arrays on nearby coastlines. We collected long-term continuous array seismic data in the Groningen area and the coastal areas of the North Sea and conducted a comprehensive analysis to extract valuable climate information hidden within the ambient noise. Through long-term spectral analysis, we identified a frequency band around approximately 0.2 Hz that appears to be associated with swell waves within the region, exhibiting a strong correlation with the significant wave height (SWH). Additionally, wind waves with a frequency of approximately 0.4 Hz and gravity waves with periods exceeding 100 seconds were detected in the seismic ambient noise. We performed a correlation analysis between the ambient noise and various climatic indices across different frequency bands. The results revealed a significant correlation between the North Atlantic Oscillation (NAO) index and the ambient noise around 0.17 Hz.

Subsequently, we extracted the annual variation curves of the SWH frequency from the ambient noise at each station around the North Sea and assembled them into a sparse spatial grid time series (SGTS). An empirical orthogonal function (EOF) analysis was conducted, and the principal component (PC) time series derived from the EOF analysis were subjected to a correlation analysis with the WAVEWATCH III (WW3) model simulation data, thereby confirming the wave patterns. Moreover, we studied the spatial distribution of the SGTS. The spatial features revealed that the southern regions of the North Sea exhibit higher wind-wave energy components, influenced by the Icelandic Low pressure system and topography, which explains the correlation between ambient noise in the region and the NAO index. Furthermore, the spatial features disclosed a correlation between the first EOF mode of the North Sea ocean waves and the third mode of sea surface temperature anomalies. This research shows the potential of utilizing existing offshore seismic monitoring systems to study global climate variation and physical oceanography.
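
The EOF/PC step can be summarized by the short Python sketch below, in which anomalies of a space-time matrix are decomposed with an SVD to obtain spatial patterns, principal-component time series and explained variances; the random placeholder grid stands in for the actual North Sea SGTS.

# EOF analysis via SVD of the anomaly matrix (time x stations).
import numpy as np

rng = np.random.default_rng(12)
n_time, n_stations = 365, 40
data = rng.normal(size=(n_time, n_stations))          # placeholder SGTS (time x stations)

anom = data - data.mean(axis=0)                       # remove the temporal mean at each station
U, s, Vt = np.linalg.svd(anom, full_matrices=False)

eofs = Vt                                             # rows: spatial patterns (EOF modes)
pcs = U * s                                           # columns: PC time series of each mode
explained = s**2 / np.sum(s**2)
print("variance explained by the first three modes:", np.round(explained[:3], 3))
# The PC time series can then be correlated with, e.g., WW3 output or the NAO index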

How to cite: Zhong, Y., Gu, C., Fehler, M., Prieto, G., Wu, P., Yuan, Z., Chen, Z., and Kang, B.: Unveiling Climate-Induced Ocean Wave Activities Using Seismic Array Data in the North Sea Region, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-17566, https://doi.org/10.5194/egusphere-egu24-17566, 2024.

Applications in Climate, Atmospheric Sciences and Hydrology
X4.122 | EGU24-1988 | ECS
Santiago Zazo, José Luis Molina, Carmen Patino-Alonso, and Fernando Espejo

The alteration of traditional hydrological patterns due to global warming is leading to a modification of the hydrological cycle. This situation creates a complex scenario for the sustainable management of water resources. However, it also offers a challenge for the development of innovative approaches that capture in depth the logical temporal-dependence structure of these modifications and thereby advance the sustainable management of water resources, mainly through reliable predictive models. In this context, Bayesian Causality (BC), addressed through Causal Reasoning (CR) and supported by Bayesian Networks (BNs), here called Bayesian Causal Reasoning (BCR), is a novel hydrological research area that can help identify those temporal interactions efficiently.

This contribution aims to assess the ability of BCR to discover the logical and non-trivial temporal-dependence structure of hydrological series, as well as their predictability. For this, a BN that conceptually synthesizes the time series is defined, and the conditional probability is propagated over time throughout the BN by means of an innovative Dependence Mitigation Graph. This is done by coupling an autoregressive parametric approach with a causal model. The analytical ability of the BCR highlighted the logical temporal structure, latent in the time series, which defines the general behavior of the runoff. This logical structure allowed us to quantify, through a dependence matrix summarizing the strength of the temporal dependencies, the two temporal fractions that compose the runoff: one due to time (Temporally Conditioned Runoff) and one not (Temporally Non-conditioned Runoff). Based on this temporal conditionality, a predictive model is implemented for each temporal fraction, and its reliability is assessed from a double probabilistic and metrological perspective.

This methodological framework is applied to two unregulated Spanish sub-basins: the Voltoya river, in the Duero River Basin, and the Mijares river, in the Jucar River Basin. The two cases show clearly opposite temporal behavior – Voltoya independent and Mijares dependent – and both face increasingly severe drought-related problems.

The findings of this study may have important implications for the knowledge of the temporal behavior of river basin water resources and their adaptation. In addition, the TCR and TNCR predictive models would allow advances in the optimal dimensioning of storage infrastructure (reservoirs), with substantial economic and environmental savings. A more sustainable management of river basins, through more reliable reservoir operation, is also expected to be achieved. Finally, these results open new possibilities for developing predictive hydrological models within a BCR framework.

How to cite: Zazo, S., Molina, J. L., Patino-Alonso, C., and Espejo, F.: Predictive ability assessment of Bayesian Causal Reasoning (BCR) on runoff temporal series, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-1988, https://doi.org/10.5194/egusphere-egu24-1988, 2024.

X4.123 | EGU24-13593 | ECS
Conor Doherty and Weile Wang

Spatially interpolated meteorological data products are widely used in the geosciences as well as in disciplines like epidemiology, economics, and others. Recent work has examined methods for quantifying uncertainty in gridded estimates of near-surface air temperature that produce distributions rather than simply point estimates at each location. However, meteorological variables are correlated not only in space but also in time, and sampling without accounting for temporal autocorrelation produces unrealistic time series and potentially underestimates cumulative errors. This work first examines how uncertainty in air temperature estimates varies in time, both seasonally and at shorter timescales. It then uses data-driven, spectral, and statistical methods to better characterize uncertainty in time series of estimated air temperature values. Methods for sampling that reproduce spatial and temporal autocorrelation are presented and evaluated. The results of this work are particularly relevant to domains like agriculture and ecology: physical processes including evapotranspiration and primary production are sensitive to variables like near-surface air temperature, and errors in these important meteorological inputs accumulate in model outputs over time.
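
One common way to sample errors with temporal autocorrelation, sketched below in Python, is to draw AR(1) noise whose variance and lag-1 correlation are matched to the error series and to compare the spread of cumulative errors against independent sampling; the daily error standard deviation and autocorrelation used here are placeholders, not values estimated in this study.

# AR(1) error sampling vs independent sampling: correlated errors accumulate
# into a much wider spread over a season.
import numpy as np

def sample_ar1_errors(n_days, sigma, rho, n_samples, rng):
    # AR(1): e_t = rho * e_{t-1} + w_t, with stationary variance sigma^2
    w_std = sigma * np.sqrt(1.0 - rho**2)
    e = np.zeros((n_samples, n_days))
    e[:, 0] = rng.normal(0.0, sigma, n_samples)
    for t in range(1, n_days):
        e[:, t] = rho * e[:, t - 1] + rng.normal(0.0, w_std, n_samples)
    return e

rng = np.random.default_rng(13)
sigma_t = 1.5          # daily air-temperature error std (K), placeholder
rho_t = 0.6            # lag-1 autocorrelation of the errors, placeholder
errors = sample_ar1_errors(365, sigma_t, rho_t, n_samples=1000, rng=rng)

# Cumulative error over a growing season: independent sampling underestimates its spread
independent = rng.normal(0.0, sigma_t, size=(1000, 365))
print("std of 90-day cumulative error, AR(1):      ", errors[:, :90].sum(axis=1).std())
print("std of 90-day cumulative error, independent:", independent[:, :90].sum(axis=1).std())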

How to cite: Doherty, C. and Wang, W.: Characterizing Uncertainty in Spatially Interpolated Time Series of Near-Surface Air Temperature, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-13593, https://doi.org/10.5194/egusphere-egu24-13593, 2024.

X4.124 | EGU24-9537
Elise Faulx, Xavier Fettweis, Georges Mabille, and Samuel Nicolay

The Wavelet-Induced Mode Extraction procedure (WIME) [2] was developed drawing inspiration from Empirical Mode Decomposition. The concept involves decomposing the signal into modes, each presenting a characteristic frequency, using continuous wavelet transform. This method has yielded intriguing results in climatology [3,4]. However, the initial algorithm did not account for the potential existence of slight frequency fluctuations within a mode, which could impact the reconstruction of the original signal [4]. The new version (https://atoms.scilab.org/toolboxes/toolbox_WIME/0.1.0) now allows for the evolution of a mode in the space-frequency half-plane, thus considering the frequency evolution of a mode [2]. A natural application of this tool is in the analysis of Milankovitch cycles, where subtle changes have been observed throughout history. The method also refines the study of solar activity, highlighting the role of the "Solar Flip-Flop." Additionally, the examination of temperature time series confirms the existence of cycles around 2.5 years. It is now possible to attempt to correlate solar activity with this observed temperature cycle, as seen in speleothem records [1].

[1] Allan, M., Deliège, A., Verheyden, S., Nicolay S. and Fagel, N. Evidence for solar influence in a Holocene speleothem record, Quaternary Science Reviews, 2018.
[2] Deliège, A. and Nicolay, S., Extracting oscillating components from nonstationary time series: A wavelet-induced method, Physical Review. E, 2017.
[3] Nicolay, S., Mabille, G., Fettweis, X. and Erpicum, M., A statistical validation for the cycles found in air temperature data using a Morlet wavelet-based method, Nonlinear Processes in Geophysics, 2010.
[4] Nicolay, S., Mabille, G., Fettweis, X. and Erpicum, M., 30 and 43 months period cycles found in air temperature time series using the Morlet wavelet, Climate Dynamics, 2009.

How to cite: Faulx, E., Fettweis, X., Mabille, G., and Nicolay, S.: Wavelet-Induced Mode Extraction procedure: Application to climatic data, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-9537, https://doi.org/10.5194/egusphere-egu24-9537, 2024.

Applications in Biogeosciences
X4.125 | EGU24-5733 | ECS
Aminah Kaharuddin and Rokhya Kaligatla

Semi-enclosed freshwater and brackish ecosystems, characterised by restricted water outflow and prolonged residence times, often accumulate nutrients, influencing their productivity and ecological dynamics. These ecosystems exhibit significant variations in bio-physical-chemical attributes, ecological importance, and susceptibility to human impacts. Untangling the complexities of their interactions remains challenging, necessitating a deeper understanding of effective management strategies adapted to their vulnerabilities. This research focuses on the bio-physical aspects, investigating the differential effects of spring and summer light on phytoplankton communities in semi-enclosed freshwater and brackish aquatic ecosystems.

Through extensive field sampling and comprehensive environmental parameter analysis, we explore how phytoplankton respond to varying light conditions in these distinct environments. Sampling campaigns were conducted at Müggelsee, a freshwater lake on Berlin's eastern edge, and Barther Bodden, a coastal lagoon northeast of Rostock on the German Baltic Sea coast, during the springs and summers of 2022 and 2023, respectively. Our analysis integrates environmental factors such as surface light intensity, diffuse attenuation coefficients, nutrient availability, water column dynamics, meteorological data, Chlorophyll-a concentration, and phytoplankton communities. Sampling encompassed multiple depths at continuous intervals lasting three days.

Preliminary findings underscore significant differences in seasonal light availability, with summer exhibiting extended periods of substantial light penetration. These variations seem to impact phytoplankton abundance and diversity uniquely in each ecosystem. While analyses are ongoing, early indications suggest distinct phytoplankton responses in terms of species composition and community structure, influenced by the changing light levels. In 2022, the clear-water phase during spring indicated that bloom events had occurred under ice cover well before spring, while in summer there were weak and short-lived cyanobacteria blooms. The relationship between nutrient availability and phytoplankton dynamics, however, remains uncertain according to our data.

This ongoing study contributes to understanding the role of light as a primary driver shaping phytoplankton community structures and dynamics in these environments.  Our research findings offer insights for refining predictive models, aiding in ecosystem-specific eutrophication management strategies, and supporting monitoring efforts of Harmful Algal Blooms.

How to cite: Kaharuddin, A. and Kaligatla, R.: Comparative Study of Spring and Summer Light Effects on Phytoplankton Communities in Semi-Enclosed Fresh- and Brackish Aquatic Ecosystems., EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-5733, https://doi.org/10.5194/egusphere-egu24-5733, 2024.

X4.126 | EGU24-18210 | ECS
Katharina Ramskogler, Moritz Altmann, Sebastian Mikolka-Flöry, and Erich Tasser

Comprehensive aerial photography is only available from the mid-20th century onwards, posing a challenge for quantitatively analyzing long-term surface changes in proglacial areas. This creates a gap of approximately 100 years back to the end of the Little Ice Age (LIA). Employing digital monoplotting and historical terrestrial images, our study reveals quantitative surface changes in a LIA lateral moraine section dating back to the second half of the 19th century, encompassing a total study period of 130 years (1890 to 2020). With this long-term analysis of the steep lateral moraines of Gepatschferner (Kauner Valley, Tyrol, Austria), we aimed to identify changes in vegetation development in the context of morphodynamic processes and the changing climate.

In 1953, there was an expansion in the area covered by vegetation, notably encompassing scree communities, alpine grassland, and dwarf shrubs. However, the destabilization of the system after 1980, triggered by rising temperatures and the resulting thawing of permafrost, led to a decline in vegetation cover by 2020. Notably, our observations indicated that, in addition to morphodynamic processes, the overarching trends in temperature and precipitation exerted a substantial influence on vegetation development. Furthermore, areas with robust vegetation cover, once stabilised, were reactivated and subjected to erosion, possibly attributed to rising temperatures post-1980.

This study demonstrates the capability of historical terrestrial images to enhance the reconstruction of vegetation development and associated morphodynamics in high alpine environments in the context of climate change. However, it is important to note that long-term mapping of vegetation development through digital monoplotting has limitations, contingent on the accessibility and quality of historical terrestrial images, as well as the challenges posed by shadows in high alpine regions. Despite these limitations, this long-term approach offers fundamental data on vegetation development for future modelling efforts.

How to cite: Ramskogler, K., Altmann, M., Mikolka-Flöry, S., and Tasser, E.: Long-term vegetation development in context of morphodynamic processes since mid-19th century, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-18210, https://doi.org/10.5194/egusphere-egu24-18210, 2024.

X4.127 | EGU24-3857 | ECS | Highlight
Sara Alibakhshi

Climate-induced forest mortality poses an increasing threat worldwide, which calls for developing robust approaches to generate early warning signals of upcoming forest state changes. This research explores the potential of satellite imagery, utilizing advanced spatio-temporal indicators and methodologies, to assess the state of forests preceding mortality events. Traditional approaches, such as techniques based on purely temporal analyses, suffer from limitations related to window size selection and detrending methods, potentially leading to false alarms. To tackle these challenges, our study introduces two new approaches, namely the Spatial-Temporal Moran (STM) and Spatial-Temporal Geary (STG) approaches, both focusing on local spatial autocorrelation measures. These approaches can effectively address the shortcomings inherent in traditional methods. The research findings were assessed across three study sites within California national parks, and Kendall's tau was employed to quantify the significance of positive and false alarms. To facilitate the measurement of ecosystem state change, trend estimation, and the identification of early warning signals, this study also provides the "stew" R package. The implications of this research extend to various groups, such as ecologists, conservation practitioners, and policymakers, providing them with the means to address emerging environmental challenges in global forest ecosystems.

How to cite: Alibakhshi, S.: Spatial-Temporal Analysis of Forest Mortality, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-3857, https://doi.org/10.5194/egusphere-egu24-3857, 2024.

X4.128 | EGU24-13879 | ECS
Eunhye Choi and Josh Gray

Vegetation phenology is the recurring timing of plant growth, including the cessation and resumption of growth, and plays a significant role in shaping terrestrial water, nutrient, and carbon cycles. Changes in temperature and precipitation have already induced phenological changes around the globe, and these trends are likely to continue or even accelerate. While warming has advanced spring arrival in many places, the effects on autumn phenology are less clear-cut, with evidence for an earlier, delayed, or even unchanged end of the growing season (EOS). Meteorological droughts are intensifying in duration and frequency because of climate change. Droughts intricately impact changes in vegetation, contingent upon whether the ecosystem is limited by water or energy, and they have the potential to influence EOS changes. Despite this, the influence of drought on EOS remains largely unexplored. This study examined moisture's role in controlling EOS by analyzing the relationship between precipitation anomalies, vegetation's sensitivity to precipitation (SPPT), and EOS. We also assess regional variations in the response of EOS to SPPT.

The study utilized multiple vegetation and water satellite products to examine the patterns of SPPT in drought and its impact on EOS across aridity gradients and vegetation types. By collectively evaluating diverse SPPT estimates from various satellite datasets, this work offers a comprehensive understanding and a critical basis for assessing the impact of drought on EOS. We focused on the Northern Hemisphere from 2000 to 2020, employing robust statistical methods. This work found that, in many places, there was a stronger relationship between EOS and drought in areas with higher SPPT. Additionally, a non-linear negative relationship was identified between EOS and SPPT in drier regions, contrasting with a non-linear positive relationship observed in wetter regions. These findings were consistent across a range of satellite-derived vegetation products. Our findings provide valuable insights into the effects of SPPT on EOS during drought, enhancing our understanding of vegetation responses to drought and its consequences for EOS, and aiding in identifying drought-vulnerable areas.

How to cite: Choi, E. and Gray, J.: Understanding the role of vegetation responses to drought in regulating autumn senescence, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-13879, https://doi.org/10.5194/egusphere-egu24-13879, 2024.

X4.129 | EGU24-22262 | ECS
Mate Simon, Mátyás Richter-Cserey, Vivien Pacskó, and Dániel Kristóf

Over the past decades, and especially since 2014, large quantities of Earth Observation (EO) data have become available in high spatial and temporal resolution, thanks to ever-developing constellations (e.g. Sentinel, Landsat) and open data policies. However, in the case of optical images, which are affected by cloud coverage and the spatially changing overlap of relative satellite orbits, creating temporally generalized and dense time series using only measured data is challenging, especially when studying larger areas.

Several papers investigate the question of spatio-temporal gap filling and present different interpolation methods to calculate missing values consistent with the measurements. In recent years, more products and technologies have been developed and published in this field, for example the Copernicus HR-VPP Seasonal Trajectories (ST) product. Such temporally generalized data structures are essential for the comparative analysis of different time periods or areas and improve the reliability of data analysis methods such as the Fourier transform or correlation analysis. Temporally harmonized input data are also necessary in order to improve the results of machine learning classification algorithms such as Random Forests or Convolutional Neural Networks (CNNs). These are among the most efficient methods to separate land cover categories like arable land, forests, grasslands and built-up areas, or crop types within the arable category.

This study analyzes the efficiency of different interpolation methods on Sentinel-2 multispectral time series in the context of land cover classification with machine learning. We compare several types of interpolation, e.g. linear, cubic and cubic spline, and also examine and optimize more advanced methods like Inverse Distance Weighting (IDW) and Radial Basis Functions (RBF). We quantify the accuracy of each method by calculating the mean squared error between measured and interpolated data points. The role of interpolation of the input dataset in Deep Learning (CNN) is investigated by comparing the overall, kappa and categorical accuracies of land cover maps created from measured-only and from interpolated time series. First results show that interpolation has a relevant positive effect on the accuracy statistics. This step is also essential for moving towards robust pretrained Deep Learning models that are transferable between different time intervals and agro-ecological regions.
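
A hedged sketch of such an interpolation benchmark is given below in Python using SciPy: part of a synthetic NDVI-like series is held out, filled with linear, cubic, cubic-spline and RBF interpolators, and scored by mean squared error; the synthetic series, gap fraction and RBF kernel are illustrative assumptions, and IDW is omitted for brevity.

# Compare interpolators on held-out samples of a synthetic NDVI-like series.
import numpy as np
from scipy.interpolate import interp1d, CubicSpline, RBFInterpolator

rng = np.random.default_rng(14)
doy = np.arange(0, 730, 5)                                        # two years, 5-day sampling
ndvi = 0.5 + 0.3 * np.sin(2 * np.pi * (doy - 120) / 365) + 0.03 * rng.normal(size=doy.size)

mask = rng.random(doy.size) < 0.35                                # simulated cloud gaps
mask[[0, -1]] = False                                             # keep the endpoints observed
t_obs, y_obs, t_gap, y_true = doy[~mask], ndvi[~mask], doy[mask], ndvi[mask]

estimates = {
    "linear":       interp1d(t_obs, y_obs, kind="linear")(t_gap),
    "cubic":        interp1d(t_obs, y_obs, kind="cubic")(t_gap),
    "cubic spline": CubicSpline(t_obs, y_obs)(t_gap),
    "RBF":          RBFInterpolator(t_obs[:, None], y_obs,
                                    kernel="thin_plate_spline")(t_gap[:, None]),
}
for name, est in estimates.items():
    print(f"{name:12s} MSE = {np.mean((est - y_true) ** 2):.5f}")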

The research has been implemented with the support provided by the Ministry of Culture and Innovation of Hungary from the National Research, Development and Innovation Fund, financed under the KDP-2021 funding scheme.

 

Keywords: time series analysis, Machine Learning, interpolation, Sentinel

How to cite: Simon, M., Richter-Cserey, M., Pacskó, V., and Kristóf, D.: Temporal Interpolation of Sentinel-2 Multispectral Time Series in Context of Land Cover Classification with Machine Learning Algorithms, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-22262, https://doi.org/10.5194/egusphere-egu24-22262, 2024.