- 1University of Salerno, Salerno, Italy (e.roberto@studenti.unisa.it)
- 2University of Salerno, Salerno, Italy (fmadonna@unisa.it)
- 3University of Salerno, Salerno, Italy (fkarimiansarakhs@unisa.it)
- 4National Research Council - Institute of Methodologies for Environmental Analysis, Tito, Italy (gessicacosimato@cnr.it)
The study of global warming using near-surface temperature measurements requires homogenization algorithms to detect and correct inhomogeneities in time series through statistical analysis. Homogenization improves data quality and stability, enabling more reliable climate analysis and modeling. However, homogenization poses challenges, particularly the need for benchmarking datasets to ensure consistent adjustments. The recent development of reference networks, as defined by the GCOS framework, provides datasets where homogenization algorithms can be tested using metadata and measurement uncertainties that identify potential discontinuities.
This study is part of the project "Strumenti per la Mitigazione dell’Isola di Calore e il Recupero delle Aree Boschive" (SMICRAB, "Tools for Heat Island Mitigation and the Recovery of Wooded Areas"), which integrates georeferenced data and advanced statistical modelling to develop geostatistical models and to assess the vulnerability of urban and wooded areas, supporting urban planning, biodiversity conservation and natural disaster risk reduction.
This study aims to optimize homogenization algorithms using data from the U.S. Climate Reference Network (USCRN). While the USCRN datasets themselves lack uncertainty estimates, these have been made available through a Copernicus project via the Climate Data Store (CDS).
The focus is on detecting breakpoints, i.e. abrupt changes, in monthly near-surface (2 m) temperature series using the Standard Normal Homogeneity Test (SNHT) in combination with the measurement uncertainties.
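As an illustration of the detection step, the SNHT statistic can be sketched as follows. This is a minimal, generic implementation of the single-breakpoint SNHT (maximum of the two-segment statistic on the standardized series), not the authors' actual code or configuration:

```python
import numpy as np

def snht(series):
    """Single-breakpoint Standard Normal Homogeneity Test.

    Computes T(k) = k*z1^2 + (n-k)*z2^2 on the standardized series,
    where z1, z2 are the mean standardized anomalies before and after
    candidate break k. Returns the T curve and the index maximizing it.
    """
    x = np.asarray(series, dtype=float)
    n = x.size
    z = (x - x.mean()) / x.std(ddof=1)   # standardize the series
    T = np.full(n, np.nan)
    for k in range(1, n):                # candidate break after position k-1
        z1 = z[:k].mean()                # mean anomaly of the first segment
        z2 = z[k:].mean()                # mean anomaly of the second segment
        T[k] = k * z1**2 + (n - k) * z2**2
    kmax = int(np.nanargmax(T))          # most likely breakpoint position
    return T, kmax

# Synthetic example: a 120-month series with a +2.0 shift at month 60
rng = np.random.default_rng(0)
x = rng.normal(0.0, 0.3, 120)
x[60:] += 2.0
T, k = snht(x)                           # k lands at (or very near) 60
```

In practice the maximum of T(k) is compared against a critical value that depends on the series length before a break is declared; that thresholding step is omitted here.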
This work presents the results of the optimization process, which examined the original series alongside trends derived using moving-average decomposition, ARIMA, and LOESS models with various spans. By comparing SNHT-detected breakpoints with those in the uncertainty series, the analysis evaluates the accuracy of and delay in breakpoint detection. Metadata are used to explain the nature of the detected breakpoints. Analysing the measurement uncertainty helps to identify breakpoints, since the uncertainty increases or decreases when the instruments or measurement setup change.
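Of the trend models listed above, the moving-average decomposition is the simplest to sketch. The snippet below shows a generic centered moving-average trend (the residual, series minus trend, is what a breakpoint test would then examine); the 12-month window and the function name are illustrative assumptions, not the study's actual settings:

```python
import numpy as np

def moving_average_trend(x, window=12):
    """Centered moving-average trend; edge values are left as NaN.

    A 12-month window is a common choice for monthly climate series,
    as it averages out the annual cycle.
    """
    x = np.asarray(x, dtype=float)
    kernel = np.ones(window) / window
    valid = np.convolve(x, kernel, mode="valid")  # length n - window + 1
    trend = np.full(x.size, np.nan)
    half = window // 2
    trend[half:half + valid.size] = valid         # center the valid window
    return trend

# Example: extract the trend, then detrend before breakpoint testing
x = np.full(120, 15.0)                 # a flat 15 degC monthly series
trend = moving_average_trend(x)        # interior equals 15.0, edges NaN
residual = x - trend                   # this is what SNHT would examine
```

ARIMA and LOESS trends (the other models mentioned) would replace `moving_average_trend` here; they require a time-series library such as statsmodels and are not sketched.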
The ultimate goal is to reduce false positives and enhance the reliability of adjustments, contributing to more accurate climate datasets for modeling and analysis.
How to cite: Roberto, E., Madonna, F., Karimian Sarakhs, F., and Cosimato, G.: Optimization of homogenization algorithms to detect discontinuities in near-surface (2m) temperature time series, EGU General Assembly 2025, Vienna, Austria, 27 Apr–2 May 2025, EGU25-6408, https://doi.org/10.5194/egusphere-egu25-6408, 2025.