ESSI2.11 | Open Data Spaces & Copernicus satellite data Infrastructures – new opportunities for evidence-based decisions
EDI
Co-organized by NP8
Convener: András Zlinszky | Co-conveners: Paolo Mazzetti, Jurry De La Mar, Weronika Borejko, Joanna Małaśnicka, Joan Masó, Razvan Cosac
Orals
| Fri, 19 Apr, 10:45–12:30 (CEST) | Room G2
Posters on site
| Attendance Fri, 19 Apr, 16:15–18:00 (CEST) | Display Fri, 19 Apr, 14:00–18:00 | Hall X2
Data plays a crucial role in driving innovation and making informed decisions. The European strategy for data is a vision of data spaces that aims to foster creativity and open data, while also prioritizing personal data protection, consumer safeguards, and FAIR principles. Satellite imagery has transformative potential, but limitations of data size and access have previously constrained applications.

Additional thematic or geographical data spaces are being developed, such as the European sectorial data spaces, the Copernicus Data Space Ecosystem and the Green Deal Data Space. These provide access to high-quality, interoperable data through streamlined access, on-board processing and online visualization, generating actionable knowledge and supporting more effective decision-making. The novel tools of this digital ecosystem create a vast range of opportunities for geoscience research, development and communication at local to global scales. Operational applications such as monitoring networks and early warning systems are built on top of these infrastructures, facilitating governance and sustainability in the face of global challenges. Worldwide satellite imagery time series can be accessed through API systems, creating analysis-ready data for advanced machine learning applications. Put together, these advances in data availability, analysis tools and processing capacity are transformative for geoscience research. There is a growing demand for a deeper understanding of their design, establishment, integration, and evolution within the environmental and Earth sciences. As a geoscience community, it is imperative that we explore how data spaces can revolutionize our work and actively contribute to their development.

This session connects developers and users of the Copernicus Data Space Ecosystem and other European satellite data infrastructures and Data Spaces, showing how data spaces facilitate the sharing, integration, and flexible processing of environmental and Earth system data from diverse sources. The speakers will discuss how ongoing efforts to build data spaces will connect with existing initiatives on data sharing and processing, and present examples of innovative services that can be developed using data spaces. By leveraging these cutting-edge tools within the digital ecosystem, the geoscience community will gain access to a vast range of opportunities for research, development, and communication at local and global scales.

Orals: Fri, 19 Apr | Room G2

Chairpersons: András Zlinszky, Weronika Borejko, Paolo Mazzetti
10:45–10:50
10:50–11:00
|
EGU24-4942
|
Highlight
|
On-site presentation
Grega Milcinski, Jan Musial, Jacek Leszczenski, Dennis Clarijs, and Jurry de la Mar

The Copernicus Data Space Ecosystem (CDSE), a pivotal initiative implemented by the European Space Agency (ESA) and funded by the European Commission, represents a groundbreaking advancement in Earth Observation (EO). It is an integral part of the ambitious Copernicus Programme, which has revolutionized EO data access and utilization across various sectors. The CDSE is more than a mere data repository; it is a collaborative platform where knowledge, expertise, and resources are shared and enhanced. This ecosystem approach is transforming the EO landscape, encouraging a more inclusive and participatory model where researchers, policymakers, businesses, and citizen scientists contribute to and benefit from the wealth of EO data. The CDSE stands as a beacon of international cooperation and a driver for sustainable development, setting new standards for leveraging EO data in addressing some of the most pressing global challenges.

Central to the CDSE is its commitment to federation and cooperation, offering a flexible framework for integration and participation to data providers, remote sensing experts, and application developers. This approach fosters innovation and collaboration, ensuring that the ecosystem benefits from a diverse range of contributions. Data providers can upload their data to the ecosystem’s object storage, making it available to CDSE users under free or commercial conditions. This not only enhances the CDSE’s data repository but also expands the providers' reach. Remote sensing experts contribute their algorithms and workflows to the openEO Algorithm Plaza, a collaborative space for sharing and enhancing EO technologies and applications. Application developers can utilize CDSE's infrastructure and data to create solutions for specific needs like agricultural monitoring, demonstrating the ecosystem’s potential for application development.

The CDSE's principle of federation extends beyond individual contributors to collaborations with other platform providers, enhancing the range of services and capabilities available to users. These partnerships are crucial for the ecosystem's growth, ensuring access to the best tools and data. The governance of the Data Space Ecosystem incorporates several third-party commercial providers and is designed to grow with additional contributions. It is based on European values of data sovereignty and privacy, ensuring trust, collaboration, and sustainability. The ecosystem maintains a clear distinction between publicly funded services and third-party offerings, both free and commercial. This open and federated approach makes CDSE a comprehensive solution for EO needs and solidifies its position as a leader in the EO community.

How to cite: Milcinski, G., Musial, J., Leszczenski, J., Clarijs, D., and de la Mar, J.: Copernicus Data Space Ecosystem - Platform That Enables Federated Earth Observation Services and Applications, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-4942, https://doi.org/10.5194/egusphere-egu24-4942, 2024.

11:00–11:10
|
EGU24-20830
|
On-site presentation
Sébastien Denvil, Marta Gutierrez, Mark Dietrich, Nevena Raczko, Mattia Santoro, Christian Briese, and Charis Chatzikyriakou

The European Commission has a program to accelerate the Digital Transition and is putting forward a vision based on cloud, common European Data Spaces and AI. As the data space paradigm unfolds across Europe, the Green Deal Data Space emerges. Its foundational pillars are to be built by the GREAT project.

GREAT, the Green Deal Data Space Foundation and its Community of Practice, has the ambitious goal of defining how data with the potential to help combat climate and environmental related challenges, in line with the European Green Deal, can be shared more broadly among many stakeholders, sectors and boundaries, according to European values such as fair access, privacy and security.

The Green Deal Data Space stands at the intersection of two major European policy initiatives: the EU Strategy for Data and the European Green Deal. The GDDS will be designed and implemented to exploit the potential of data to effectively support the Green Deal priority actions, empowering policy makers, businesses, researchers, and citizens, from Europe and around the world, to jointly tackle issues such as climate change, circular economy, zero pollution, biodiversity protection, deforestation and compliance assurance.

Out of the many European Green Deal strategic actions, the GREAT project focussed on three priorities (Biodiversity 2030, Zero Pollution and Climate Change) in order to effectively capture the diversity of requirements across the full range of the European Green Deal. These three initiatives are interlinked with other EGD strategic actions and approximate the full scope of the GDDS, while complementing actions being addressed by other thematic data spaces (such as the "Farm to Fork Strategy", addressed by the agricultural data space). During the final stage of the roadmap elaboration, circular economy aspects, together with the digital product passport concept, were considered, so that, in addition to the binding targets of the Green Deal policies, circular economy and digital product pilots and use cases can be instrumental in driving implementation decisions.

The implementation roadmap will guide the efforts of multiple actors to converge toward a blueprint technical architecture, a data governance scheme that enables innovative business cases, and an inventory of high-value datasets. Data sharing by design and data sovereignty are among the main principles that will apply from the early stages. This talk will present the final version of the roadmap, with its pillars, goals and strategic actions, and will seek feedback and collaboration from the EGU community.

How to cite: Denvil, S., Gutierrez, M., Dietrich, M., Raczko, N., Santoro, M., Briese, C., and Chatzikyriakou, C.: The European Green Deal Data Space roadmap, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-20830, https://doi.org/10.5194/egusphere-egu24-20830, 2024.

11:10–11:20
|
EGU24-9921
|
On-site presentation
Mattia Santoro, Alessandro Scremin, Gregory Giuliani, Joost van Bemmelen, Eliana Li Santi, Daniele Giordani, Małgorzata Herman, Aleksandra Marczuk, and Piotr Krupa

The GEOSS Platform Plus (GPP) project, funded by the European Union’s Horizon 2020 Framework Programme, aims to contribute to the implementation of the Global Earth Observation System of Systems (GEOSS) by evolving the European GEOSS Platform components to allow access to tailor-made information and actionable knowledge.

In this context, the GPP project is defining a vision for evolving GEOSS based on the concept of a Digital Ecosystem (DE). Designing GEOSS as a Digital Ecosystem requires, first, the identification of its ecosystem service(s). As advocated in several documents, including the "Concept Paper for the Next Phase of the GEOSS Infrastructure" released by the GEOSS Infrastructure Development Task Team (GIDTT) and the concept of Earth Intelligence introduced in the GEO Post 2025 Strategy, GEOSS should shift from the traditional data sharing paradigm to a model where knowledge generation and sharing is facilitated.

Therefore, we can identify, although not exclusively, the GEOSS Ecosystem Service as the provisioning of trusted EO-based knowledge for decision-making. Considering the GEO mission and user base, if such an ecosystem service is provided by GEOSS, a virtuous cycle can be initiated. End users will utilize the tools and services of the GEOSS DE because they find what they need and can use the desired information to obtain actionable knowledge; intermediate users (e.g., developers, scientists, data intermediaries) will build new tools on top of (and contributing to) the DE, attracting new end users and contributing to the benefits of the GEO communities; providers will have a greater interest in participating in (belonging to) the DE to spread the use of their resources and enlarge their user base.

Such a framework enables a more flexible environment, which facilitates (and encourages) the extension of GEOSS capabilities through new components and strengthens the involvement and contribution of the GEO communities at large. To demonstrate this concept with practical examples, in this presentation we will introduce some of the use cases developed by the GPP project: (i) the SDG 15.3.1 Land Degradation use case, and (ii) the Nutrient Pollution in European Inland and Coastal Waters use case. These use cases use (and extend) the current GEOSS Platform for the generation of actionable knowledge (and beyond, towards Earth Intelligence).

How to cite: Santoro, M., Scremin, A., Giuliani, G., van Bemmelen, J., Li Santi, E., Giordani, D., Herman, M., Marczuk, A., and Krupa, P.: Towards a GEOSS Digital Ecosystem: the GEOSS Platform Plus Vision, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-9921, https://doi.org/10.5194/egusphere-egu24-9921, 2024.

11:20–11:30
|
EGU24-21733
|
On-site presentation
Peter James Zellner, Rufai Omowunmi Balogun, Robert Eckhardt, Henryk Hodam, Tyna Dolezalova, Stefan Meißl, Jonas Eberle, Michele Claus, Mattia Callegari, Alexander Jacob, and Anca Anghelea

Earth Observation scientists are confronted with unprecedented data volumes that grow constantly as the number of satellite missions increases and as spatial and temporal resolutions improve. Traditional working modes, such as downloading satellite data and processing it on a local computer, no longer apply, and EO science is moving quickly towards cloud technologies and open science practices. Even though these new technologies and practices are evolving quickly and becoming community standards, little educational material is available to train the next generation of EO scientists.

The Massive Open Online Course Cubes & Clouds - Cloud Native Open Data Sciences for Earth Observation teaches the concepts of data cubes, cloud platforms and open science in the context of Earth observation. It targets Earth science students and researchers who want to bring their technical capabilities up to the newest standards in EO computing, as well as data scientists who want to dive into the world of EO and apply their technical background to a new field. The prerequisites are a general knowledge of EO and Python programming. The course explains the concepts of data cubes, EO cloud platforms and open science by applying them to a typical EO workflow, from data discovery and data processing up to sharing the results in an open and FAIR (Findable, Accessible, Interoperable, Reusable) way. An engaging mixture of videos, animated content, lectures, hands-on exercises and quizzes delivers the content. After finishing, participants will understand the theoretical concepts of cloud-native EO processing and will have gained practical experience by conducting an end-to-end EO workflow. They will be capable of independently using cloud platforms to approach EO-related research questions and confident in sharing research according to the concepts of open science.
The hands-on exercises are carried out on the EO cloud platform Copernicus Data Space Ecosystem, leveraging the Copernicus data available through the STAC catalogue and the cloud processing API openEO. In the final exercise, the participants collaborate on a community mapping project adhering to open science and FAIR standards. A snow cover map is jointly created in which every participant maps a small area of the Alps and submits it to a STAC catalogue and web viewer. Ultimately, the map grows in space and time, and every participant has contributed, proving they are capable of EO cloud computing and open science practices.
The talk will walk through the topics covered in the MOOC and show how they are presented in the EO College e-learning platform; the links to the exercises carried out on CDSE will be explored, and the open science aspect will be shown in the community mapping project and the invitation to collaborate on the course's completely open GitHub repository.
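As an illustration of the kind of exercise described above, the snow-cover classification at the heart of the community mapping project can be sketched in a few lines of NumPy. The 0.4 threshold on the Normalised-Difference Snow Index is a commonly used value, not necessarily the one taught in the course, and the band arrays are hypothetical inputs:

```python
import numpy as np

def snow_cover(green, swir, ndsi_threshold=0.4):
    """Binary snow map from the Normalised-Difference Snow Index.

    NDSI = (green - SWIR) / (green + SWIR). Pixels above the threshold
    are classified as snow. The 0.4 threshold is a widely used default,
    given here only for illustration.
    """
    ndsi = (green - swir) / (green + swir + 1e-9)  # eps avoids 0/0
    return ndsi > ndsi_threshold

# Two example pixels: bright snow (high green, low SWIR) and bare ground
green = np.array([0.8, 0.2])
swir = np.array([0.1, 0.3])
mask = snow_cover(green, swir)  # → [True, False]
```

In the course workflow, a participant would compute such a mask over their assigned tile and publish the result as a STAC item; here the classification step alone is shown.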

How to cite: Zellner, P. J., Balogun, R. O., Eckhardt, R., Hodam, H., Dolezalova, T., Meißl, S., Eberle, J., Claus, M., Callegari, M., Jacob, A., and Anghelea, A.: Cubes & Clouds – A Massive Open Online Course for Cloud Native Open Data Sciences in Earth Observation, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-21733, https://doi.org/10.5194/egusphere-egu24-21733, 2024.

11:30–11:40
|
EGU24-9662
|
ECS
|
On-site presentation
Raniero Beber, Piergiorgio Cipriano, Oscar Corcho, Michel Gabriel, Giacomo Martirano, Francesca Noardo, Daniela Poli, Fabio Remondino, and Danny Vandenbroucke

Data spaces are a new data-sharing paradigm based on a distributed ecosystem, allowing data owners to maintain control over how their data are used and unlocking the data's potential in line with the Findability, Accessibility, Interoperability and Reusability (FAIR) principles. For urban areas, data spaces provide geospatial, environmental and climate data to everyone. Various international and interdisciplinary projects and initiatives are developing data spaces, allowing policy-makers, researchers, citizens and the private sector to access high-quality, interoperable data in order to generate actionable knowledge, support the green transition of cities and enable more effective decision-making processes.

The Urban Data Space for Green Deal (USAGE) EU project aims to offer innovative governance mechanisms, consolidated frameworks, best practices, AI-based tools and data analytics to share, access and use city-level data from satellite and airborne platforms, Internet of Things (IoT) sensors, and authoritative and crowdsourced sources.

Within USAGE, a series of geospatial, thematic and other datasets have been newly acquired or generated over four pilot areas and will be shared through a standard-based web ecosystem to test and evaluate solutions (i) to better understand issues and trends on how our planet and its climate are changing; (ii) to support decision making intended to mitigate the effects of changes and environmental issues on life; and (iii) to address the role that humans play in these changes, e.g., with behaviour adaptation and mitigation actions.

Urban areas are the focus of USAGE since most human activities are concentrated there, making them the main source of some of the issues considered Green Deal priorities (e.g. energy use, pollution, climate change effects); solutions in USAGE are developed by an interdisciplinary team analysing geospatial data and meeting multiple and diverse local requirements.

In this work we will present the relevant datasets collected in our pilot areas, reporting processing methodologies and applications of analysis-ready and decision-ready geospatial data. In particular we will report experiences related to:

- detection of urban heat islands (UHI) and production of UHI maps, utilizing open data like high-resolution satellite imagery, meteorological ground sensor data, surface properties and a hybrid model based on machine learning and geostatistics;

- generation of semantic 3D city models for photovoltaic solar potential estimation and energy efficiency purposes;

- generation of high-resolution thematic maps (surface materials, ecological indexes, etc.) from hyperspectral airborne imagery using a multi-level machine learning approach and supported by training data provided by the municipalities;

- realization of canopy thematic layers combining 3D point clouds and hyperspectral images to monitor the health and growth of trees over time, to estimate biomass and to map species distribution;

- initiation of multi-temporal (night/days, summer/winter) thermal analyses based on high-resolution aerial thermal images, deriving proper land surface temperatures (LST) by correcting raw sensor data with thematic maps of surface materials. 

The presentation will highlight the importance of shared urban data spaces to enable visualization, sharing, fusion and processing of environmental and Earth Observation data from heterogeneous sources, ultimately facilitating more effective decision-making processes, besides advances in scientific research.

How to cite: Beber, R., Cipriano, P., Corcho, O., Gabriel, M., Martirano, G., Noardo, F., Poli, D., Remondino, F., and Vandenbroucke, D.: Urban Data Space to Support Green Deal Priority Actions, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-9662, https://doi.org/10.5194/egusphere-egu24-9662, 2024.

11:40–11:50
|
EGU24-20886
|
On-site presentation
Bente Lilja Bye and Marie-Francoise Voidrot

The European Commission has launched the European strategy for data, introducing an exciting vision of data spaces for many domains. This data space concept and its expected benefits have been so attractive that we already observe many ongoing initiatives building data spaces with large thematic overlaps, as well as data spaces at different geographic scales and covering different periods of time. Interoperability across data spaces in such a context is of increasing importance, and in order to prepare for the necessary interoperability between the numerous thematic and domain-specific data spaces, a minimum requirement to begin with is good data management.

In this presentation, an overview of key data management principles, such as the Group on Earth Observations (GEO) data sharing and data management principles and the FAIR, CARE and TRUST principles, will be given. A comparison, based on experience from the GEO dialogue series in 2022 and 2023, of how the principles in various ways support the creation of efficient, usable data spaces will be offered. The focus will be on interoperability, from preparation through the creation and management of data spaces, and on how this is pivotal to avoiding data fragmentation and thereby contributing to efficiently implementing part of the European data strategy, namely the data spaces concept.

How to cite: Bye, B. L. and Voidrot, M.-F.: The importance of data management principles for efficient data spaces, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-20886, https://doi.org/10.5194/egusphere-egu24-20886, 2024.

11:50–12:00
|
EGU24-21701
|
ECS
|
On-site presentation
William Ray

Traditionally, processing and analysing satellite imagery has not been a straightforward task, with a lot of specialist knowledge required to make the data analysis-ready. In addition, limited computing power made it feasible to process and analyse only individual satellite image acquisitions.

This was less of an issue a decade ago, when data was not as abundant as it is today. For example, back in 2013 you would likely have used Landsat 8, with a 16-day revisit time at 30 m resolution. Today we have access to Sentinel-2, with a revisit time of 5 days at 10 m resolution. And if scientists wished to develop any multitemporal application, they would likely have relied on supercomputers funded and managed by their institution.

However, with advances in computing power and improvements in satellite revisit times producing ever more data, processing imagery on a desktop or laptop is no longer an effective approach.

The introduction of the Copernicus Data Space Ecosystem is now making cloud processing methods accessible and open to earth observation scientists across Europe and the world. With its scalability, ease of use, and powerful data processing capabilities, the Copernicus Data Space Ecosystem is designed to be flexible and adaptable to the needs of all types of users, from researchers to intergovernmental institutions. 

In this talk, we will run a comparison exercise generating a multi-temporal satellite-imagery-derived product with a traditional workflow, before demonstrating an optimized version of the workflow using cloud processing APIs. A use case will be presented in which a simple harvest detection map is derived using the Normalized Difference Vegetation Index (NDVI) and the Bare Soil Index (BSI). Using the difference in NDVI between the most recent and least recent timestamps, the BSI in the most recent timestamp, and thresholding these values, we classify pixels as either harvested or not harvested.
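The thresholding logic described above can be sketched with NumPy. The threshold values and the index formulas for NDVI and BSI are standard, but the specific thresholds chosen here are illustrative assumptions, not the values used in the talk:

```python
import numpy as np

def harvest_map(red_t0, nir_t0, red_t1, nir_t1, blue_t1, swir_t1,
                ndvi_drop_thresh=0.3, bsi_thresh=0.0):
    """Classify pixels as harvested (True) or not harvested (False).

    t0 = least recent acquisition, t1 = most recent acquisition.
    NDVI = (NIR - red) / (NIR + red)
    BSI  = ((SWIR + red) - (NIR + blue)) / ((SWIR + red) + (NIR + blue))
    Thresholds are illustrative placeholders.
    """
    eps = 1e-9  # avoid division by zero
    ndvi_t0 = (nir_t0 - red_t0) / (nir_t0 + red_t0 + eps)
    ndvi_t1 = (nir_t1 - red_t1) / (nir_t1 + red_t1 + eps)
    bsi_t1 = ((swir_t1 + red_t1) - (nir_t1 + blue_t1)) / \
             ((swir_t1 + red_t1) + (nir_t1 + blue_t1) + eps)
    # Harvested: strong NDVI drop AND bare soil visible in the latest image
    return (ndvi_t0 - ndvi_t1 > ndvi_drop_thresh) & (bsi_t1 > bsi_thresh)
```

Applied to two example pixels, one cleared to bare soil and one still vegetated, the function flags only the first as harvested; in the traditional workflow this computation would run over full downloaded scenes, whereas the cloud APIs evaluate it server-side.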

The same workflow will then be run using the Sentinel Hub APIs that are available to all users of the Copernicus Data Space Ecosystem. The lines of code, the amount of data downloaded and the time taken to process the data will then be compared, demonstrating the efficiencies that can be gained from moving satellite imagery processing to the cloud.

How to cite: Ray, W.: Why it's time to switch your EO data processing and analysis to the cloud, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-21701, https://doi.org/10.5194/egusphere-egu24-21701, 2024.

12:00–12:10
|
EGU24-20084
|
On-site presentation
Marcin Niemyjski and Jan Musiał

The Copernicus Data Space Ecosystem provides open access to a petabyte-scale EO data repository and to a wide range of tools and services, limited by predefined quotas. For users who would like to develop commercial services, or who need larger quotas or unlimited access to services, the CREODIAS platform is the solution. In this study, an example of such a (pre)commercial service will be presented, which publishes the Copernicus Sentinel-1 and Sentinel-2 products in the form of a Web Map Service (WMS) and a Web Coverage Service (WCS). The architecture of the services, based on a Kubernetes cluster, allows horizontal scaling of a service with the number of user requests. The WMS/WCS services to be presented combine data discovery, access, (pre-)processing, publishing (rendering) and dissemination capabilities within a single RESTful (Representational State Transfer) query. This gives users great flexibility in terms of on-the-fly data extraction across a specific AOI (Area Of Interest), mosaicing, reprojection, simple band processing (cloud masking, normalized difference vegetation index) and rendering. The performance of the Copernicus Data Space Ecosystem and the CREODIAS platform, combined with efficient software (Postgres 15 with the PostGIS extension, MapServer with a GDAL backend and the Kakadu JPEG2000 driver), allows a WMS/WCS service response time below 1 second on average. This, in turn, offers the potential for massive parallelization of the computations given the horizontal scaling of the Kubernetes cluster.
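As a sketch of what such a single RESTful query looks like, the following builds a standard WMS 1.3.0 GetMap request URL. The base URL, layer name and TIME value are placeholders for illustration, not the actual CDSE/CREODIAS endpoint or layer identifiers:

```python
from urllib.parse import urlencode

def getmap_url(base_url, layer, bbox, width, height,
               crs="EPSG:4326", fmt="image/png", time=None):
    """Build a WMS 1.3.0 GetMap request URL.

    bbox is (min_y, min_x, max_y, max_x) for EPSG:4326 axis order in
    WMS 1.3.0. All endpoint-specific values here are hypothetical.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        "BBOX": ",".join(f"{v:g}" for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    if time:  # optional multi-temporal selection
        params["TIME"] = time
    return f"{base_url}?{urlencode(params)}"

url = getmap_url("https://example.org/wms", "SENTINEL2_TRUE_COLOR",
                 (46.0, 15.0, 47.0, 16.0), 512, 512, time="2024-04-01")
```

Everything the abstract describes (AOI extraction, mosaicing, reprojection, band processing, rendering) is resolved server-side from one such query; the client only composes parameters.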

How to cite: Niemyjski, M. and Musiał, J.: Scaling up the Copernicus Data Space Ecosystem services using the CREODIAS cloud computing platform example of the WMS server deployed on the Kubernetes cluster, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-20084, https://doi.org/10.5194/egusphere-egu24-20084, 2024.

12:10–12:20
|
EGU24-21708
|
On-site presentation
Heiko Balzter, Nezha Acil, Ivan Reading, Konstantina Bika, Toby Drakesmith, Chris McNeill, Sarah Cheesbrough, and Justin Byrne

At COP26, the Glasgow Leaders Declaration committed the majority of the world's nations to ending deforestation by 2030. On 29 June 2023, the EU Regulation on deforestation-free products (EU) 2023/1115 entered into force. The main driver of deforestation and forest degradation is the expansion of agricultural land for the production of commodities such as cattle, wood, cocoa, soy, palm oil, coffee, rubber and derived products. Any trader wishing to sell these commodities on the EU single market, or export them from within it, must prove that the products do not originate from recently deforested land and have not contributed to forest degradation. Satellite imagery provides the means of supporting the implementation of the EU Regulation on deforestation-free supply chains, and of strengthening forest governance through the provision of timely information to national forest services. We present the PyEO near-real-time forest alert system from Sentinel-2, an operational application currently used to reduce illegal logging in Kenya, and its potential, developed by the ForestMind project, to support importers and exporters in demonstrating deforestation-free supply chains.

The software implementation used the Python for Earth Observation (PyEO) library to automatically extract information on forest loss from Sentinel-2 satellite imagery. It queries the Copernicus Data Space Ecosystem for new imagery, downloads the automatically selected Sentinel-2 images, applies a previously trained random forest machine learning model to detect forest loss, and generates a multi-layer analyst report.
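The data discovery step of such a pipeline can be sketched as a STAC API search request. The endpoint, collection name and cloud-cover query below are assumptions for illustration, not taken from the PyEO configuration:

```python
import json

# Hypothetical endpoint; CDSE exposes a STAC-compatible catalogue, but the
# exact URL and collection identifier here are placeholders.
STAC_SEARCH_URL = "https://example.org/stac/search"

def build_search(aoi_bbox, start, end, max_cloud=20):
    """Build a STAC API /search body for new Sentinel-2 scenes over an AOI.

    aoi_bbox is (min_lon, min_lat, max_lon, max_lat); the eo:cloud_cover
    filter assumes the server supports the STAC query extension.
    """
    return {
        "collections": ["SENTINEL-2"],
        "bbox": list(aoi_bbox),
        "datetime": f"{start}/{end}",
        "query": {"eo:cloud_cover": {"lt": max_cloud}},
        "limit": 100,
    }

# Example: scenes over a Kenyan coastal AOI for one month
body = build_search((39.0, -4.7, 39.7, -3.9),
                    "2024-03-01T00:00:00Z", "2024-03-31T23:59:59Z")
payload = json.dumps(body)  # POSTed to the search endpoint in practice
```

The subsequent steps (download, random forest classification, report generation) operate on the returned scene list; only the discovery request is sketched here.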
For forest law enforcement in Kenya, the latest forest alerts are sifted and prioritised by the Kenya Forest Service's Forest Information Centre in Nairobi, and delegated to forest rangers in the field for investigation. Forest rangers navigate to the field site inside the forest reserve, accompanied by a local community scout, and report back to head office with their observations and whether any arrests for illegal logging were made. Since its introduction in Kwale County in 2019, over 2000 forest alerts have been investigated. The dominant cause of the deforestation is illegal logging, followed by charcoal production.

For the due diligence application, a Forest Analyst can then use the analyst report and additional software tools to create company reports suitable for communication to importers and exporters for monitoring the impact of their supply chains on deforestation and forest degradation.

How to cite: Balzter, H., Acil, N., Reading, I., Bika, K., Drakesmith, T., McNeill, C., Cheesbrough, S., and Byrne, J.: Due diligence for deforestation-free supply chains with Sentinel-2 imagery from the Copernicus Data Space Ecosystem, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-21708, https://doi.org/10.5194/egusphere-egu24-21708, 2024.

12:20–12:25
12:25–12:30

Posters on site: Fri, 19 Apr, 16:15–18:00 | Hall X2

Display time: Fri, 19 Apr 14:00–Fri, 19 Apr 18:00
Chairpersons: Joanna Małaśnicka, Jurry De La Mar, Razvan Cosac
X2.1
|
EGU24-628
|
ECS
Barbara Stimac Tumara and Taras Matselyukh

In the context of the AGEMERA project (funded by the European Union's Horizon Europe research and innovation programme under grant agreement No 101058178), OPT/NET introduced the AGEMERA Platform as an extension to its flagship product, MONITORED AI™. AGEMERA's core objectives include unlocking the EU's resource potential, enhancing public awareness of critical raw materials, and fostering environmentally and socially responsible mineral exploration.

OPT/NET solutions deliver an exceptional user experience through an intuitive AI-driven interface, streamlining the complex processes of remote sensing data acquisition and data fusion to boost productivity by merging AI processing and automation with human problem-solving skills. The AI engine optimally harnesses heterogeneous Earth Observation data (predominantly from the Copernicus Programme) and analysis-ready data from multiple heterogeneous and multimodal sources, promoting responsible mineral exploration and addressing environmental concerns, resulting in an interactive, user-friendly web platform.

The platform is structured around three main modules:

  • OCLI 

Serving as the backend, OCLI manages data mining processes and automates functions such as data acquisition, preprocessing and image processing through workflows known as AI Knowledge Packs (AIKPs). OCLI collects relevant satellite data products, prepares analysis-ready data (ARD) cubes, and acts as the foundation for multi-source data fusion.

  • AGEMERA Geo-Suite Graphical User Interface 

As the frontend module, this web-based platform offers a general data repository and visualization in cloud-optimized data formats. Powered by OGC-standards-compliant web services, it enables interactive exploration of extensive datasets for the public as well as mine site operators and owners. A conversational AI agent assists easy navigation through the platform's extensive collection of data and insights, while the data itself is secured with various authentication and authorization access levels.

  • Data Repository

Comprising a bulk data warehouse enriched with INSPIRE-nomenclature-compliant metadata, the Data Repository employs a Cloud Object Storage Service as its primary storage solution. This service provides web-based access to the object storage, ensuring efficient data access from any web browser.

OPT/NET's AI platform, exemplified in the AGEMERA project, stands as a pioneering solution seamlessly integrating AI processing, heterogeneous data fusion, human problem-solving, and advanced data management for generation of impactful insights and responsible resource exploration.

How to cite: Stimac Tumara, B. and Matselyukh, T.: AGEMERA AI: Innovative AI solution for responsible resource exploration, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-628, https://doi.org/10.5194/egusphere-egu24-628, 2024.

X2.2
|
EGU24-2538
|
Vasiliki Charalampopoulou, Eirini Marinou, Christos Kontopoulos, and Pol Kolokoussis

The Horizon project OCEANIDS aims at building user-driven applications and tools which act as an enabling technological layer for regional authorities and stakeholders, in order to achieve a more resilient and inclusive systemic pathway to a Blue Economy in coastal regions. Under a single-access-window platform for Climate-Informed Maritime Spatial Planning, the project will allow a more integrated seascape management of coastal regions. The overarching concept is to collect, harmonize and curate existing climate data services, making data accessible, reusable and interoperable for the development of local adaptation strategies. On top of the Euro Data Cube [1] and the Copernicus Coastal Hub [2], the OCEANIDS data cubes will be fused with EO data and further processed and analyzed using advanced deep learning and AI methods and techniques. The purpose of these deep learning and AI software tools is to extract meaningful information from the high-dimensional datasets and to efficiently structure the collected data into a semantically enriched framework through supervised and unsupervised learning. Generative adversarial networks (GANs) are exploited to fill data inconsistencies and incompleteness where they exist, so that the OCEANIDS data are well structured and consistent, and optimal processing, analysis and classification performance is achieved.

A holistic hazard & risk assessment platform will be elaborated, assimilating data from all available sources in OCEANIDS, including asset exposure datasets, population statistics, long-term hazard simulations, short-term hazard forecasting, vulnerability information (both historical and simulation based) as well as impact assessment data from past and forecasted events. The modelling framework for assessing the magnitude of impacts will ensure accurate propagation of aleatory and epistemic uncertainties from all pertinent sources, e.g., data, methods, models, and parameters, all the way to the final quantification of risk. The applied modelling and simulation tools will estimate the state of assets, either single or in portfolios, depending on their currently reported state and/or the states of interconnected assets, where available. The state of an interconnected asset is thus a result of the nature of the hazard pressure affecting the originating asset, the characteristics of the asset under consideration (risk mitigation, means of immediate response, safety equipment) and the type of interconnection between the assets. This approach is the basis for accurately quantifying the risk over a region, allowing the improvement and optimization of the safety of the complex infrastructures related to their operation processes and their inside and outside interactions, while offering actionable metrics for regional planning, insurance, and natural catastrophe prevention/mitigation.

OCEANIDS Decision Support Platform will be implemented to give reliable recommendations to the end-users regarding spatiotemporal changes and the impact of climate change on the environment. The Digital Twin Earth model will be used and be able to monitor Climate Change, perform simulations of Earth’s interconnected system and human behavior, and help support environmental policies. This high-level front-end platform will be providing the end-users with an assessment of hazards in their respective region and leading to subsequent multi-scale planning.

[1] https://eurodatacube.com/

[2] https://www.coastal.hub.copernicus.eu/

How to cite: Charalampopoulou, V., Marinou, E., Kontopoulos, C., and Kolokoussis, P.: OCEANIDS: Utilizing Copernicus Satellite Imagery Open Data Infrastructures for User-driven applications and Tools for Climate-Informed Maritime Spatial Planning and Integrated Seascape Management, towards a Resilient & Inclusive Blue Economy., EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-2538, https://doi.org/10.5194/egusphere-egu24-2538, 2024.

X2.3
|
EGU24-3391
BEACON - A high-performance data lake solution
(withdrawn)
Peter Thijsse, Tjerk Krijger, Dick Schaap, and Robin Kooyman
X2.4
|
EGU24-4848
|
ECS
|
Ming Lin, Meng Jin, Juanzi Li, and Yuqi Bai

The advancement of satellite remote sensing technology has transformed our capacity to monitor and address global challenges. This technology provides global coverage, frequent observation revisits, and consistent monitoring, thus providing critical data support. Since the first Earth observation satellite was launched in the 1960s, more than a thousand Earth observation satellites have been deployed by various countries and organizations. However, the substantial accumulation of Earth observation assets is maintained independently by different organizations using varying methodologies. This poses a significant challenge in effectively utilizing and maximizing the value of these global observation resources.

This study introduces GEOSatDB, a comprehensive semantic database specifically tailored for civil Earth observation satellites. The foundation of the database is an ontology model conforming to standards set by the International Organization for Standardization (ISO) and the World Wide Web Consortium (W3C). This conformity enables data integration and promotes the reuse of accumulated knowledge. Our approach advocates a novel method for integrating Earth observation satellite information from diverse sources. It notably incorporates a structured prompt strategy utilizing a large language model to derive detailed sensor information from vast volumes of unstructured text.
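The structured prompt strategy mentioned above can be sketched as follows; the field schema and wording here are illustrative assumptions, not GEOSatDB's actual prompts:

```python
# Sketch: a structured prompt of the kind used to pull sensor attributes
# out of unstructured text with a large language model. The field names
# and template wording are hypothetical.

PROMPT_TEMPLATE = """Extract the following fields from the text below and
return them as a JSON object: sensor_name, spectral_bands, spatial_resolution_m.
If a field is not mentioned, use null.

Text:
{passage}
"""

def build_prompt(passage: str) -> str:
    """Fill the template with one passage of satellite documentation."""
    return PROMPT_TEMPLATE.format(passage=passage)

prompt = build_prompt(
    "The MSI instrument on Sentinel-2 images 13 spectral bands at "
    "10-60 m spatial resolution."
)
```

Constraining the model to a fixed JSON schema is what makes the extracted records mergeable into the ontology-backed database.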

To demonstrate the capabilities of GEOSatDB, we performed a comprehensive analysis of the distribution of Earth observation satellites in 195 countries. This analysis unveiled the global distribution and diversity of these assets. Furthermore, two distinct case studies showcase the practical application and robust data mining potential of GEOSatDB. With information on 2,340 remote sensing satellites and 1,018 sensors, this database represents a significant advancement in semantically sharing and applying Earth observation resources. Its establishment encourages enhanced international cooperation and more efficient environmental monitoring and management.

How to cite: Lin, M., Jin, M., Li, J., and Bai, Y.: Global Civil Earth Observation Satellite Semantic Database, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-4848, https://doi.org/10.5194/egusphere-egu24-4848, 2024.

X2.5
|
EGU24-6231
Support for the combined agricultural application of EO and non-EO data with the Danube Data Cube platform
(withdrawn)
Márton Tolnai
X2.6
|
EGU24-10109
Jan Musial, Jacek Leszczenski, Jedrzej Bojanowski, Grega Milcinski, Anja Vrecko, Dennis Clarijs, Jeroen Dries, and Uwe Marquard

The Copernicus Data Space Ecosystem introduces an innovative approach to accessing large Earth Observation (EO) data through streaming services and APIs. This study elaborates on the advantages of the novel data discovery, streaming and access APIs within the Data Space, emphasizing key components such as the STAC catalog, the OData and OpenSearch querying APIs, the Browser, the S3 interface, JupyterHub, openEO, and Sentinel Hub's APIs.


OData (the Open Data Protocol) is an ISO/IEC-approved standard that outlines best practices for designing and consuming REST APIs. It harmonizes elements such as request and response headers, status codes, HTTP methods, URL conventions, media types, payload formats, and query options. The OpenSearch catalogue enables searching for Copernicus data through a standardized web service; for technical details, the OpenSearch specification serves as the reference. Results from this web service are delivered as GeoJSON feature collections, with each feature representing an Earth observation 'product' and referencing the location of the actual data.
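Consuming such a response amounts to walking the feature collection; a minimal sketch (the payload below and its field names are illustrative, not the actual catalogue schema):

```python
# Sketch: parsing an OpenSearch-style GeoJSON response into
# (product title, data location) pairs. Sample payload is mock data.

sample_response = {
    "type": "FeatureCollection",
    "features": [
        {
            "type": "Feature",
            "properties": {
                "title": "S2A_MSIL2A_20240301T100031",
                "services": {"download": {"url": "https://example.org/products/1"}},
            },
            "geometry": {"type": "Point", "coordinates": [16.37, 48.21]},
        }
    ],
}

def list_products(feature_collection):
    """Extract product titles and download locations from a feature collection."""
    return [
        (f["properties"]["title"], f["properties"]["services"]["download"]["url"])
        for f in feature_collection["features"]
    ]

products = list_products(sample_response)
```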


STAC (Spatio-Temporal Asset Catalog) emerges as a cutting-edge EO data management standard with robust data discovery and access capabilities. Built on simple JSON entries with resource-pointing URLs, it boasts flexibility in the types of data stored and accommodates new extensions and capabilities seamlessly. The hierarchical structure includes collections (e.g., Sentinel-2 L2A), subdivided into items representing satellite products. These items further break down into assets corresponding to spectral bands. This detailed STAC catalogue facilitates user searches for specific spectral bands and enables streaming of spatial data subsets using HTTP range requests.
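The search-then-stream pattern described above can be sketched as follows; the collection id and byte range are assumptions for illustration:

```python
# Sketch: a STAC API /search request body (per the STAC API spec) and the
# HTTP Range header used to stream a byte subset of a COG asset.

def build_stac_search(collection, bbox, start, end):
    """Body for POST /search against a STAC API endpoint."""
    return {
        "collections": [collection],
        "bbox": bbox,                      # [west, south, east, north]
        "datetime": f"{start}/{end}",
        "limit": 10,
    }

body = build_stac_search(
    "sentinel-2-l2a",                      # collection id is an assumption
    [16.1, 48.1, 16.6, 48.4],
    "2024-03-01T00:00:00Z", "2024-03-31T23:59:59Z",
)

# Streaming a spatial subset relies on plain HTTP range requests:
range_header = {"Range": "bytes=0-65535"}  # e.g. first 64 KiB of an asset
```

Because each item lists its assets (spectral bands) individually, a client can fetch exactly the bands and byte ranges it needs instead of whole products.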


JupyterHub is a central platform for managing Jupyter Notebooks, simplifying tasks like resource allocation and user authentication. Copernicus Data Space Ecosystem users get free access, but with limited resources. Jupyter Notebooks offer a user-friendly interface for EO data prototyping, integrating with Python SDKs like Sentinel Hub and OpenEO. They grant direct access to the EO data repository, eliminating dependency hassles with pre-configured kernels for immediate prototyping. 


Sentinel Hub is a satellite imagery processing service that excels in on-the-fly operations such as gridding, re-projection and mosaicking. It efficiently fetches satellite data for web applications or analytical processes such as ML, without replication or pre-processing. Several APIs, including OGC services, contribute to the Data Space Browser for advanced data visualization. Key APIs within the Copernicus Data Space Ecosystem include the BYOC (Bring Your Own COG) API, the (Batch) Processing API and the (Batch) Statistical API. These APIs empower users to perform diverse tasks, from data ingestion to advanced statistical analyses.

openEO revolutionizes geospatial data processing and analysis by offering a unified, interoperable platform for developers, researchers, and scientists. This framework empowers users to address complex geospatial challenges collaboratively, leveraging distributed computing environments and cloud-based resources. The collaborative nature of openEO enables seamless sharing of code, workflows, and data processing methods across platforms, advancing the accessibility, scalability, and reproducibility of Earth Observation (EO) data. openEO's intuitive libraries in Python, R, and JavaScript simplify the analysis of diverse datasets. In addition to the SDKs, openEO features the Web Editor, an online tool for visually constructing workflows.
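What the openEO client libraries share across platforms is ultimately a JSON process graph; a minimal sketch of one (the collection id and band names are assumptions):

```python
# Sketch: an openEO process graph — load a collection, reduce over time,
# save the result. Nodes reference each other via "from_node".

process_graph = {
    "load": {
        "process_id": "load_collection",
        "arguments": {
            "id": "SENTINEL2_L2A",
            "spatial_extent": {"west": 16.1, "south": 48.1,
                               "east": 16.6, "north": 48.4},
            "temporal_extent": ["2024-03-01", "2024-03-31"],
            "bands": ["B04", "B08"],
        },
    },
    "reduce": {
        "process_id": "reduce_dimension",
        "arguments": {
            "data": {"from_node": "load"},
            "dimension": "t",
            "reducer": {"process_graph": {
                "mean": {"process_id": "mean",
                         "arguments": {"data": {"from_parameter": "data"}},
                         "result": True},
            }},
        },
    },
    "save": {
        "process_id": "save_result",
        "arguments": {"data": {"from_node": "reduce"}, "format": "GTiff"},
        "result": True,
    },
}
```

Because the graph is plain JSON, the same workflow can be submitted unchanged to any backend implementing the openEO API, which is what enables the cross-platform reproducibility described above.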

How to cite: Musial, J., Leszczenski, J., Bojanowski, J., Milcinski, G., Vrecko, A., Clarijs, D., Dries, J., and Marquard, U.: Overview of the Copernicus Data Space Ecosystem APIs, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-10109, https://doi.org/10.5194/egusphere-egu24-10109, 2024.

X2.7
|
EGU24-11477
Implementing Data Space Integration and Interoperability in the Earth System Sciences 
(withdrawn)
Claus Weiland, Jonas Grieb, Leyla Jael Castro, Tim Schäfer, and Stian Soiland-Reyes
X2.8
|
EGU24-17472
|
ECS
Donatello Elia, Fabrizio Antonio, Sandro Fiore, Emanuele Donno, Gabriele Accarino, Paola Nassisi, and Giovanni Aloisio

Several scientific fields, including climate science, have undergone radical changes in recent years due to the increase in data volumes and the emergence of data science and Machine Learning (ML) approaches. In this context, providing fast data access and analytics has become of paramount importance. The data space concept has emerged to address some of the key challenges and to support scientific communities towards a more sustainable and FAIR use of data.

The ENES Data Space (EDS) represents a domain-specific example of this concept for the climate community, developed under the umbrella of the European Open Science Cloud (EOSC) initiative by the European Commission. EDS provides an open, scalable, cloud-enabled data science environment for climate data analysis on top of the EOSC Compute Platform, made available through a user-friendly JupyterLab GUI. The service integrates into a single environment climate datasets from well-known initiatives (e.g., CMIP6), compute resources, and data science and ML tools. Launched in the context of the EGI-ACE project, it is accessible through the EOSC Catalogue and Marketplace (https://marketplace.eosc-portal.eu/services/enes-data-space) and also provides a web portal (https://enesdataspace.vm.fedcloud.eu) with information, tutorials and training materials. It has recently been added to the Data Spaces radar, an initiative launched by the IDSA (International Data Space Association) with the main goal of mapping data spaces from several domains into one easy-to-use tool.

The EDS is being employed in climate applications targeting big data processing, interactive analytics and visualization, and it has recently been extended to support more advanced scientific applications based on ML. In particular, it has been enhanced to provide a cloud-based development and testing environment for the implementation of data-driven Digital Twin applications for extreme climate events in the context of the interTwin project. Finally, the ENES Data Space will also be one of the pilots in EOSC Beyond, a Horizon Europe project in which the EDS will integrate and validate the new EOSC Core capabilities developed by the project.

This work has been supported in part by interTwin; interTwin is funded by the European Union (Horizon Europe) under grant agreement No 101058386.

How to cite: Elia, D., Antonio, F., Fiore, S., Donno, E., Accarino, G., Nassisi, P., and Aloisio, G.: A Data Space environment for big data and ML-based climate applications in the European Open Science Cloud, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-17472, https://doi.org/10.5194/egusphere-egu24-17472, 2024.

X2.9
|
EGU24-18047
Joan Masó, Alba Brobia, Malte Zamzov, Ivette Serral, Thomas Hodson, Raul Palma, Lucy Bastin, and Victoria Lush

The Green Deal Data Space is born in the big data paradigm, where sensors produce constant streams of Earth observation data (remote sensing or in-situ). The traditional, manual organization of data in layers is no longer efficient, as data is constantly evolving and mixed together in new ways. A new organization of the data is needed that favors users and, in particular, can be considered ready for use. The AD4GD project proposes a threefold solution: a new information model, dynamic multidimensional datacubes, and OGC APIs for data query and filtering.

The use of data in the context of the Green Deal is difficult due to the wide heterogeneity of data sources (often expressed in different data models), the different semantics used to represent data, and the lack of sufficient interoperability mechanisms to connect existing data models. The solution is a framework composed of a suite of ontologies implemented in line with best practices, reusing existing standards and well-scoped models as much as possible and establishing alignments between them to enable their interoperability and the integration of existing data. This approach was successfully applied to object-based data stored in an RDF triple store (e.g. vector data and sensor data) in the agricultural sector.

The traditional representation of static two-dimensional layers needs to be replaced by a dynamic view where time becomes an extra dimension. In addition, other dimensions emerge, such as height in atmospheric and oceanic models, frequency in hyperspectral remote sensing data, or species in a biodiversity distribution model. These dimensions define a data cube that offers an entry point to a constantly evolving world. The dimensions of the datacube, as well as the values stored in the cube cells (a.k.a. attributes), should be connected to the concepts in the Information Model.
In the big data paradigm, access to individual layers is difficult due to the dynamic nature of the data. Instead, we need a set of modern APIs as an entry point to the data in the data space. The OGC APIs offer a set of building blocks that can be combined with other web API design principles to build the Green Deal Data Space data access APIs. Users will be able to get the data necessary for modeling and simulating reality through the OGC API endpoints described in the OpenAPI description document. The OGC APIs include the concept of collections as an initial filtering mechanism, and a filtering extension using CQL2 is in advanced draft status. It is fundamental that the OGC APIs are also capable of generating detailed metadata about the extracted subset, including data sources, producer information and data quality estimations.
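A minimal sketch of such a filtered request against an OGC API - Features endpoint, using CQL2 text; the endpoint, collection id and property names are hypothetical:

```python
from urllib.parse import urlencode

# Sketch: an items request with bbox/datetime subsetting plus a CQL2
# text filter. All names below are illustrative assumptions.

base = "https://example.org/ogcapi/collections/air_quality/items"
params = {
    "bbox": "0.1,40.5,3.4,42.9",
    "datetime": "2023-01-01T00:00:00Z/2023-12-31T23:59:59Z",
    "filter": "pollutant = 'NO2' AND value > 10",   # CQL2 comparison predicates
    "filter-lang": "cql2-text",
    "limit": "100",
}
url = f"{base}?{urlencode(params)}"
```

The collection path gives the coarse filter; the `filter` parameter then narrows the subset by attribute values, which is the mechanism the draft CQL2 extension standardizes.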

AD4GD is a Horizon Europe project co-funded by the European Union, Switzerland and the United Kingdom.

How to cite: Masó, J., Brobia, A., Zamzov, M., Serral, I., Hodson, T., Palma, R., Bastin, L., and Lush, V.: OGC Web APIs to Make Data Available in the Green Deal Data Space, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-18047, https://doi.org/10.5194/egusphere-egu24-18047, 2024.

X2.10
|
EGU24-19101
Ivette Serral, Lucy Bastin, Vitalii Kriukov, and Joan Masó

The integrity of natural ecosystems, including terrestrial ones, and their connectivity is one of the main concerns of current European and global green policies, e.g. the European Green Deal. Thus, public administration managers need reliable, long-term information for better monitoring of ecosystem evolution and to inform decision-making. Data Spaces are intended to become the EC's comprehensive solution for integrating data from different sources, with the aim of generating and providing more ready-to-use knowledge on climate change, circular economy, pollution, biodiversity, and deforestation.

The AD4GD project researches the co-creation of the European Green Deal Data Space as an open space for FAIR data and standards-based services, tested in 3 pilot cases providing testbeds in terms of data, standards, sharing and interoperability. One of these pilots is focused on Ecological Terrestrial Connectivity in Catalonia (NE Spain).

The challenges are: (1) monitoring ecological connectivity in terrestrial ecosystems through the integration of state-of-the-art multi-sensor remote sensing imagery, ecological models, in-situ biodiversity observations and sensors; and (2) forecasting ecological connectivity to help to define effective actions to reduce terrestrial biodiversity loss.

To this end, solutions are being proposed and tested to integrate data from different sources using modern standards: raster-based land cover maps and connectivity maps structured as data cubes (Open Data Cube and Rasdaman), GBIF species occurrences exposed via OGC STAplus, and data from low-cost automatic sensors (camera traps with species identification software). All these data are linked together by semantic tagging, with references pointing to vocabularies of GEO Essential Variables stored in the OGC RAINBOW Definition Server.

AD4GD is a Horizon Europe project co-funded by the European Union, Switzerland and the United Kingdom.

How to cite: Serral, I., Bastin, L., Kriukov, V., and Masó, J.: Data Spaces as an exploratory solution for big data biodiversity challenges in support of the European Green Deal: the case of terrestrial habitat connectivity, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-19101, https://doi.org/10.5194/egusphere-egu24-19101, 2024.

X2.11
|
EGU24-20031
Jędrzej Bojanowski, Jan Musiał, Jan Małaśnicki, Jacek Chojnacki, and Wojciech Bylica

Identification of crops over large areas is necessary for monitoring agricultural production, establishing food security policies, or controlling compliance with the Common Agricultural Policy. Data acquired by the Sentinel-1 and Sentinel-2 satellite constellations, with their combination of high temporal and spatial resolutions, have already demonstrated, separately or fused, their ability to classify crops. Here, however, we propose a methodology for crop type classification based on Sentinel-2 data, focusing on a few specific aspects that have often not been addressed simultaneously in previous works:

  • Comparison of processing steps for generating a single-field representation, i.e. multiple-pixel representation (Pixel-Set Encoder, https://doi.org/10.48550/arXiv.1911.07757) vs. zonal statistics
  • Application of various machine and deep learning classifiers, from Random Forest as a baseline, through Long Short-Term Memory and Gated Recurrent Unit networks, to novel Transformers
  • Provision of a reliable full probability distribution over all crops for each classified field, allowing flexible, user-specific minimization of omission or commission errors
  • Applicability in operational mode at a large, beyond-country scale, i.e. for tens of millions of crop fields
  • Efficient use of open EO-data cloud infrastructures such as the Copernicus Data Space Ecosystem or CREODIAS

The methodology has already been implemented and validated for Austria and Poland using the Land Parcel Identification System as a reference, and the list of countries will continue to grow. For Poland, 41 crops were successfully classified in 2022, with precision ranging from 0.64 (pepper) to 0.97 (maize, beetroot, winter rape) and an overall classification accuracy (kappa) of 0.88. Applying a threshold of 0.85 on the probabilities, however, yielded a kappa of 0.95 and per-crop precision from 0.88 to 1.00, with 26 crops classified with a precision above 0.95. For Austria, for the years 2018-2021, a kappa coefficient of 0.85 to 0.9 was obtained without applying thresholds on the probabilities. For Poland, classification was also carried out during the growing season, reaching kappa values of 0.78 (in May), 0.84 (in June), 0.87 (in July) and 0.88 at the end of the season.
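The precision gain from thresholding the per-field probability distributions can be sketched as follows (mock probabilities, not the classifier's actual output):

```python
# Sketch: keep only fields whose top-class probability exceeds a threshold,
# trading coverage for precision. Field ids and probabilities are mock data.

def threshold_predictions(fields, threshold=0.85):
    """Return (kept, abstained) given {field_id: {crop: probability}}."""
    kept, abstained = {}, []
    for field_id, probs in fields.items():
        crop, p = max(probs.items(), key=lambda kv: kv[1])
        if p >= threshold:
            kept[field_id] = crop          # confident label
        else:
            abstained.append(field_id)     # left out / flagged for review
    return kept, abstained

fields = {
    "f1": {"maize": 0.96, "beetroot": 0.03, "pepper": 0.01},
    "f2": {"pepper": 0.55, "maize": 0.30, "beetroot": 0.15},
}
kept, abstained = threshold_predictions(fields)
```

Raising the threshold shifts errors from commission (wrong labels kept) to omission (fields left unlabelled), which is exactly the user-specific trade-off the full probability distribution enables.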

With this approach, we also demonstrate how to use the Copernicus Data Space Ecosystem to efficiently extract information from large volumes of satellite data - here from Sentinel-2 and tens of millions of agricultural parcels.

How to cite: Bojanowski, J., Musiał, J., Małaśnicki, J., Chojnacki, J., and Bylica, W.: Optimizing Country-Scale Crop Type Classification: A Comprehensive Study on Preprocessing Techniques and Deep Learning Classifiers within the Copernicus Data Space Ecosystem, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-20031, https://doi.org/10.5194/egusphere-egu24-20031, 2024.

X2.12
|
EGU24-21722
Garin Smith, Eddie Glover, Alasdair Kyle, Stefan Meißl, Annekatrien Debien, and Anca Anghelea

The EarthCODE vision is to provide a cloud-based, user-centric development environment which can be used to support ESA science activities and projects. Our solution is built around existing open-source solutions and building blocks, primarily the Open Science Catalogue, EOxHub and the Exploitation Platform Common Architecture (EOEPCA+). EarthCODE capability will be hosted on the Copernicus Data Space Ecosystem (CDSE). Because EarthCODE will adopt a federated approach, it will also facilitate processing on other platforms; these may include the Deep Earth System Data Lab (DeepESDL), the ESA EURO Data Cube (EDC), openEO Cloud / openEO Platform and perhaps AIOPEN / AI4DTE.

1. GOALS

EarthCODE will support FAIR Open Science. The main use cases will allow users to “Access Compute Resource” in the cloud and “Conduct Science” using their preferred approach to create reproducible workflows. “Manage Workflows” will allow users to execute workflows on a cloud platform of their choice and enable federated workflow and federated data solutions. “Publish Results” will allow scientists to share experiments and data within the community, and “Interact with Community” will enable a collaborative approach throughout this lifecycle. EarthCODE will allow scientists to concentrate on science and collaboration, hiding the complexity of managing and implementing workflows when required.

2. OPEN SCIENCE PLATFORMS

EarthCODE will provide users with a portal to the underlying platform services. Broadly speaking, this will include FAIR Open Science services that support the delivery of scientific workflows, Infrastructure Services that support the execution of workflows, and Community Services that support underlying collaboration and exploitation.

3. BUILDING BLOCKS

EarthCODE will be created using existing building blocks. These will include EOEPCA+, EOxHub, the Open Science Catalogue and other existing platform services.

4. PLATFORM FREEDOM

EarthCODE will be designed using open standards to help facilitate platform freedom and platform interoperability. This means that more than one type of Development Tooling can be used to conduct science, and more than one Integrated Platform service can be used to Manage Workflows.

5. PLATFORM INTEGRATION

Using the above approach, EarthCODE will facilitate the use of data and services from a wide variety of platforms, as illustrated below. This shows how EarthCODE will promote integration with different platforms while still providing freedom of platform implementation. STAC will be used at the heart of the Science Catalogue to describe resources.

Platform integration will also be supported by a DevOps best-practice approach.

6. WORKFLOW AND DATA INTEGRATION

EarthCODE will prioritise the management of workflow and data modelling to ensure successful platform integration and successful collaboration. This approach will further support the integration of platforms and data. FAIR Data principles will be applied.

ACKNOWLEDGEMENTS

This work is funded by the European Space Agency.

 

How to cite: Smith, G., Glover, E., Kyle, A., Meißl, S., Debien, A., and Anghelea, A.: EarthCODE, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-21722, https://doi.org/10.5194/egusphere-egu24-21722, 2024.

X2.14
|
EGU24-22217
|
ECS
Patryk Grzybowski, Krzysztof Markowicz, and Jan Musiał

Nitrogen dioxide (NO2) pollution is one of the most dangerous threats to public health. It is responsible for respiratory diseases, cardiovascular diseases, asthma, and many others. The objective of the presented study was to verify the potential of the Sentinel-5 Precursor Tropospheric NO2 Column Number Density (NO2 TVCD), provided by the Copernicus Data Space Ecosystem, to support air pollution monitoring in Poland. The implementation of the objective was divided into three stages:

  • to develop a model which estimates ground-level NO2 concentration;
  • to find the spatial representativeness area (SR area) of existing ground stations;
  • to propose locations for new ground stations.

A model estimating ground-level NO2 concentration was developed in order to obtain the spatial distribution of NO2 pollution over the whole of Poland. The input data comprised the NO2 TVCD provided by the Copernicus Data Space Ecosystem, as well as meteorological factors (air temperature, wind speed, planetary boundary layer height, pressure, atmospheric circulation type, wind direction, solar radiation flux) from ERA5, provided by the European Centre for Medium-Range Weather Forecasts (ECMWF), and anthropogenic conditions (nightlight intensity, population, road density). Due to the need for high computing power and a constantly available working environment, CREODIAS resources were used. Several machine learning approaches were tested, among which random forest was found to be the most accurate. The results revealed that the model demonstrated MAE values of 3.4 μg/m3 (MAPE ~37%) and 3.2 μg/m3 (MAPE ~31%) for the hourly and weekly estimates, respectively.
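The error metrics reported above can be sketched as follows (the observation and prediction values are mock data, not the study's):

```python
# Sketch: MAE and MAPE, the two validation metrics used for the NO2 model.
# Values below are illustrative, not the study's measurements.

def mae(obs, pred):
    """Mean absolute error, in the units of the observations."""
    return sum(abs(o - p) for o, p in zip(obs, pred)) / len(obs)

def mape(obs, pred):
    """Mean absolute percentage error, in percent."""
    return 100 * sum(abs(o - p) / o for o, p in zip(obs, pred)) / len(obs)

observed  = [10.0, 8.0, 12.0, 20.0]   # ground-station NO2, ug/m3
predicted = [12.0, 7.0, 15.0, 18.0]   # model estimates

hourly_mae = mae(observed, predicted)    # absolute error, ug/m3
hourly_mape = mape(observed, predicted)  # relative error, %
```

Reporting both matters here: MAE weights all stations equally in absolute terms, while MAPE penalizes errors at low-concentration (typically rural) stations more strongly.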
Obtaining the NO2 ground concentration for the whole of Poland allowed for investigation of the spatial autocorrelation of the air pollution phenomenon and determination of the SR areas. Five methods were used:

  • Global and Local Moran's I: strong spatial autocorrelation was found (Global Moran's I = 0.99, p-value < 0.05); also, ~1% of Poland's NO2 pollution does not depend on the neighbouring area;
  • Correlation with respect to distance: it was observed that ca. 10% of Poland's population exposed to high levels of pollution (higher than the yearly World Health Organization (WHO) recommendation of 10 μg/m3) is not covered by an SR area;
  • Semivariance: ca. 12% of Poland's population exposed to high levels of pollution was found not to be covered by an SR area;
  • Similarity threshold: ca. 7% of Poland's population exposed to high levels of pollution was found not to be covered by an SR area.
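The Global Moran's I statistic used in the first method above can be sketched on toy data (a short 1-D chain with binary adjacency, not the study's gridded NO2 field):

```python
# Sketch: Global Moran's I with symmetric binary weights.
# I = (n / S0) * sum_ij w_ij (x_i - mean)(x_j - mean) / sum_i (x_i - mean)^2

def morans_i(values, neighbours):
    """values: measurements; neighbours: undirected (i, j) adjacency pairs."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    num = 2 * sum(dev[i] * dev[j] for i, j in neighbours)  # both directions
    s0 = 2 * len(neighbours)                               # total weight
    denom = sum(d * d for d in dev)
    return (n / s0) * num / denom

values = [1.0, 1.0, 5.0, 5.0]      # clustered low/high pollution (toy)
chain = [(0, 1), (1, 2), (2, 3)]   # each cell linked to its neighbour
i_stat = morans_i(values, chain)   # positive -> spatial clustering
```

Values of I near +1 indicate clustering (as in the study's 0.99), near 0 spatial randomness, and negative values a checkerboard-like dispersion.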


Based on the findings above, it was possible to determine the number and locations of the new stations. Depending on the method, from 7 to 22 new stations were found to be needed in order to cover the whole population threatened by high NO2 concentrations.

To sum up, the combination of Sentinel-5P data with meteorological and anthropogenic data yielded a solution that could be very useful for the design or improvement of an air quality measurement network.

 

How to cite: Grzybowski, P., Markowicz, K., and Musiał, J.: Support of air pollution monitoring using Sentinel-5P data – how to improve air quality measurement network, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-22217, https://doi.org/10.5194/egusphere-egu24-22217, 2024.