ESSI1 – Next Generation Analytics for Scientific Discovery: Data Science, Machine Learning, AI
Programme group scientific officers:
Spatio-temporal Data Science: Theoretical Advances and Applications in AI and ML
Big data analytics will play a primary role in addressing modern challenges such as climate change, disaster management, public health and safety, resource management, and logistics. Most of these phenomena are characterized by spatio-temporal patterns that have traditionally been investigated using linear statistical approaches, as in the case of physically-based models and geostatistical models. Additionally, the rising attention toward machine learning, the variety of modern technologies generating massive volumes of geospatial data at local and global scales, and the rapid growth of computational resources open new horizons in understanding, modelling, and forecasting complex spatio-temporal systems using stochastic non-linear models.
This session aims to explore the new challenges and opportunities opened by the spread of big geospatial datasets and data-driven statistical learning approaches in the Earth and Soil Sciences. We invite cutting-edge contributions related to methods of spatio-temporal geostatistics or data mining on topics that include, but are not limited to:
- advances in spatio-temporal modeling using geostatistics and machine learning;
- software and infrastructure development for geospatial data;
- uncertainty quantification and representation;
- innovative techniques of knowledge extraction based on clustering, pattern recognition and, more generally, data mining.
The main applications will be closely related to the research in environmental sciences and quantitative geography. A non-complete list of possible applications includes:
- natural and anthropogenic hazards (e.g. floods; landslides; earthquakes; wildfires; soil, water, and air pollution);
- interaction between geosphere and anthroposphere (e.g. land degradation; urban sprawl);
- socio-economic sciences, characterized by the spatial and temporal dimension of the data (e.g. public health management, census data; transport; commuter traffic).
This session collects the abstracts submitted to the sessions “Strategies and Applications of AI and ML in a Spatiotemporal Context” and “Spatio-temporal Data Science: Theoretical Advances and Applications in Computational Geosciences”.
Novel Methods and Applications of Satellite and Aerial Imagery
Understanding the natural processes of the Earth system, especially in the context of global climate change, has been recognized globally as an urgent and central research direction that needs further exploration. With the launch of new satellite platforms with high revisit times, combined with the increasing capability for collecting repetitive ultra-high-resolution aerial images through unmanned aerial vehicles, the scientific community has new opportunities for developing and applying new image processing algorithms to solve old and new environmental issues.
The purpose of the proposed session is to gather scientific researchers related to this topic, aiming to highlight ongoing research and new applications in the field of satellite and aerial time-series imagery. The session focuses on presenting studies aimed at the development or exploitation of novel satellite time-series processing algorithms, and applications to different types of remote sensing data for investigating long-term processes in all branches of the Earth system (sea, ice, land, atmosphere).
The conveners encourage both applied and theoretical research contributions focusing on novel methods and applications of satellite and aerial time-series imagery in all disciplines of the geosciences, including both aerial and satellite platforms and data acquired in all regions of the electromagnetic spectrum.
Remote sensing big data analysis and applications in geosciences
Remote sensing techniques, such as radar (e.g., synthetic aperture radar - SAR), optical, Lidar and hyperspectral imagery, together with hydroclimatic, geological, and geophysical data, as well as in-situ observations, have been widely employed for monitoring and responding to natural and anthropogenic hazards and for assessing environmental resources. Especially with the unprecedented spatio-temporal resolution and the rapid accumulation of remote sensing data collections from various spaceborne and airborne missions, we have many more opportunities to exploit hazard- and environment-related signals, to classify the associated spatio-temporal surface changes such as deformations and landform alterations, and to interpret the primary and secondary driving mechanisms. Yet, when archiving, processing, and analyzing such abundant remote sensing data, dedicated artificial intelligence (AI) methods, such as machine/deep learning and computer vision, are urgently required.
In this session, we welcome contributions that focus on new AI-based algorithms to retrieve remote sensing products related to environmental resources and hazards in an accurate, automated, and efficient framework. We particularly welcome contributions for applications in (1) mining, oil/gas production, fluid injection/extraction, civil infrastructure, sinkholes, land degradation, peatlands, glaciers, permafrost, and coastal subsidence; (2) emergency response based on remote sensing data to landslides, floods, winter storms, wildfires, pandemics, earthquakes, and volcanoes; and (3) mathematical and physical modeling of the remote sensing products for a better understanding of the surface and subsurface processes.
"Enter Zoom Meeting" button for the session will show up 8:15 am (CEST), 15 minutes before the start time. Our solicited speaker Dr. Sigrid Roessner is unable to participate in EGU. Instead, Prof. Ramon Hanssen from Delft University of Technology will give us a talk entitled “InSAR time series ambiguity resolution using recurrent neural networks” to start our session today. Looking forward to "seeing" you :-)
Cryospheric Data Science and Artificial Intelligence: Opportunities and Challenges
Machine learning, artificial intelligence and big data approaches have recently emerged as key tools in understanding the cryosphere. These approaches are being increasingly applied to answer long-standing questions in cryospheric science, including those relating to remote sensing, forecasting, and improving process understanding across Antarctic, Arctic and Alpine regions. In doing so, data science and AI techniques are being used to gain insight into system complexity, analyse data on unprecedented temporal and spatial scales, and explore much wider parameter spaces than were previously possible.
In this session we invite submissions that utilise data science and/or AI techniques that address research questions relating to glaciology, sea ice, permafrost and/or polar climate science. Approaches used may include (but are not limited to) machine learning, artificial intelligence, big data processing/automation techniques, advanced statistics, and innovative software/computing solutions. These could be applied to any (or combinations) of data sources including remote sensing, numerical model output and field/lab observations. We particularly invite contributions that apply techniques and approaches that reveal new insights into cryospheric research problems that would not otherwise be achievable using traditional methods, and those that discuss how or if approaches can be applied or adapted to other areas of cryospheric science. Given the rapid development of this field by a diverse group of international researchers, we convene this session to help foster future collaboration amongst session contributors, attendees, and international stakeholders and help address the most challenging questions in cryospheric science.
Analysis of complex geoscientific time series: linear, nonlinear, and computer science perspectives
This interdisciplinary session welcomes contributions on novel conceptual and/or methodological approaches and methods for the analysis and statistical-dynamical modeling of observational as well as model time series from all geoscientific disciplines.
Methods to be discussed include, but are not limited to: linear and nonlinear methods of time series analysis; time-frequency methods; statistical inference for nonlinear time series, including empirical inference of causal linkages from multivariate data; nonlinear statistical decomposition and related techniques for multivariate and spatio-temporal data; nonlinear correlation analysis and synchronisation; surrogate data techniques; filtering approaches and nonlinear methods of noise reduction; and artificial intelligence and machine learning based analysis and prediction for univariate and multivariate time series.
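The surrogate data idea listed above can be illustrated with a toy example. The following sketch, using only the Python standard library, tests a series for temporal structure by comparing its lag-1 autocorrelation against shuffle surrogates; the AR(1) series and the choice of test statistic are illustrative assumptions, not taken from any particular contribution:

```python
import random
import statistics

def lag1_autocorr(x):
    """Lag-1 autocorrelation of a sequence."""
    mean = statistics.fmean(x)
    var = sum((v - mean) ** 2 for v in x)
    cov = sum((x[i] - mean) * (x[i + 1] - mean) for i in range(len(x) - 1))
    return cov / var

random.seed(42)

# Toy "observed" series with genuine temporal structure: an AR(1) process.
series = [0.0]
for _ in range(999):
    series.append(0.8 * series[-1] + random.gauss(0, 1))
observed = lag1_autocorr(series)

# Shuffle surrogates destroy the temporal ordering while preserving the
# value distribution; any linear memory should vanish in the surrogates.
surrogate_stats = []
for _ in range(200):
    s = series[:]
    random.shuffle(s)
    surrogate_stats.append(lag1_autocorr(s))

# If the observed statistic lies far outside the surrogate distribution,
# the temporal structure cannot be explained by an i.i.d. process.
print(f"observed: {observed:.2f}")
print(f"surrogates: [{min(surrogate_stats):.2f}, {max(surrogate_stats):.2f}]")
```

More sophisticated variants (e.g. phase-randomized surrogates that also preserve the power spectrum) follow the same recipe: choose a discriminating statistic, generate an ensemble of constrained randomizations, and compare.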
Contributions on methodological developments and applications to problems across all geoscientific disciplines are equally encouraged. We particularly aim at fostering a transfer of new methodological data analysis and modeling concepts among different fields of the geosciences.
Advances in diagnostics, inversion, sensitivity, uncertainty analysis, and hypothesis testing of Earth and Environmental Systems models
Proper characterization of uncertainty remains a major research and operational challenge in the Environmental Sciences and is inherent to many aspects of modelling, impacting model structure development; parameter estimation; adequate representation of the data (input data and data used to evaluate the models); initial and boundary conditions; and hypothesis testing. To address this challenge, two closely related families of methods have proved very helpful: a) uncertainty analysis (UA), which seeks to identify, quantify and reduce the different sources of uncertainty, as well as to propagate them through a system/model, and b) sensitivity analysis (SA), which evaluates the role and significance of uncertain factors in the functioning of systems/models.
This session invites contributions that discuss advances, in theory and/or application, in methods for SA/UA applicable to all Earth and Environmental Systems Models (EESMs), which embrace all areas of hydrology, such as classical hydrology, subsurface hydrology and soil science.
Topics of interest include (but are not limited to):
1) Novel methods for effective characterization of sensitivity and uncertainty
2) Analyses of over-parameterised models enabled by AI/ML techniques
3) Single- versus multi-criteria SA/UA
4) Novel approaches for parameter estimation, data inversion and data assimilation
5) Novel methods for spatial and temporal evaluation/analysis of models
6) The role of information and error on SA/UA (e.g., input/output data error, model structure error, parametric error, regionalization error in environments with no data etc.)
7) The role of SA in evaluating model consistency and reliability
8) Novel approaches and benchmarking efforts for parameter estimation
9) Improving the computational efficiency of SA/UA (efficient sampling, surrogate modelling, parallel computing, model pre-emption, model ensembles, etc.)
Earth System Model Evaluation with ESMValTool in the Jupyter notebook
This Short Course is aimed at researchers in climate-related domains, who have an interest in working with climate data. We will introduce the ESMValTool, a Python project developed to facilitate the analysis of climate data through so-called recipes. An ESMValTool recipe specifies which input data will be used, which preprocessor functions will be applied, and which analytics should be computed. As such, it enables readable and reproducible workflows. The tool takes care of finding, downloading, and preparing data for analysis. It includes a suite of preprocessing functions for commonly used operations on the input data, such as regridding or computation of various statistics, as well as a large collection of established analytics.
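A recipe of the kind described above is a YAML file. A minimal, hypothetical sketch might look as follows; the model name, author entry, and diagnostic script path are illustrative placeholders, though the top-level sections mirror the documented recipe format:

```yaml
# Minimal, hypothetical ESMValTool recipe sketch
documentation:
  title: Example global mean temperature analysis
  description: Compute annual global means of near-surface air temperature.
  authors:
    - doe_jane              # hypothetical author entry

datasets:
  - {dataset: EXAMPLE-MODEL, project: CMIP6, exp: historical,
     ensemble: r1i1p1f1, start_year: 1990, end_year: 2010}

preprocessors:
  annual_global_mean:
    area_statistics:
      operator: mean
    annual_statistics:
      operator: mean

diagnostics:
  example_diagnostic:
    variables:
      tas:                  # near-surface air temperature
        preprocessor: annual_global_mean
    scripts:
      plot_timeseries:
        script: examples/diagnostic.py   # hypothetical diagnostic script
```

Because the input data, preprocessing chain, and diagnostics are all declared in one file, rerunning or sharing the analysis amounts to sharing the recipe.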
In this course, we will run some of the available example recipes using ESMValTool’s convenient Jupyter notebook interface. You will learn how to customize the examples, in order to get started with implementing your own analysis. A number of core developers of ESMValTool will be present to answer any and all questions you may have.
The ESMValTool has been designed to analyze the data produced by Earth System Models participating in the Coupled Model Intercomparison Project (CMIP), but it also supports commonly used observational and re-analysis climate datasets, such as ERA5. Version 2 of the ESMValTool has been specifically developed to target the increased data volume and complexity of CMIP Phase 6 (CMIP6) datasets. ESMValTool comes with a large number of well-established analytics, such as those in Chapter 9 of the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5) (Flato et al., 2013) and has been extensively used in preparing the figures of the Sixth Assessment Report (AR6). In this way, the evaluation of model results can be made more efficient, thereby enabling scientists to focus on developing more innovative methods of analysis rather than constantly having to "reinvent the wheel".
Course material will be made available at https://github.com/ESMValGroup/EGU22-short-course
ML for Earth System modelling
Unsupervised, supervised, and semi-supervised learning, as well as reinforcement learning, are now increasingly used to address Earth system related challenges.
Machine learning can help extract information from the wealth of Earth system data, such as in-situ and satellite observations, as well as improve model fidelity through novel parameterisations or speed-ups. This session invites submissions spanning modelling and observational approaches towards providing an overview of the state of the art in applying these novel methods to predicting and monitoring our Earth system. This includes (but is not restricted to) using machine learning to:
- improve forecast skill
- generate significant speedups
- design new parameterization schemes
- emulate numerical models.
Please consider submitting abstracts focussed on ML applied to observations and modelling of climate processes to the companion "ML for Climate Science" session.
Room N1, Tue, 24 May, 08:30–11:50 (CEST), 13:20–14:50 (CEST)
Machine Learning in Planetary Sciences and Heliophysics
The increasing amount of data from an increasing number of spacecraft in our solar system calls for new data analysis strategies. There is a need for frameworks that can rapidly and intelligently extract information from these data sets in a manner useful for scientific analysis. The community is starting to respond to this need. Machine learning, with all of its different facets, provides a viable playground for tackling a wide range of research questions in planetary and heliospheric physics.
We encourage submissions dealing with machine learning approaches of all levels in planetary sciences and heliophysics. The aim of this session is to provide an overview of current efforts to integrate machine learning technologies into data-driven space research, to highlight state-of-the-art developments, and to generate a wider discussion on further possible applications of machine learning.
Artificial Intelligence for Natural Hazard and Disaster Management
Through a wealth of geospatial data, growing computational power, and demonstrated success of application across many fields, artificial intelligence (in particular, machine learning) promises to advance our understanding of natural hazards and our ability to predict and respond to natural disasters. The ITU/WMO/UNEP Focus Group on AI for Natural Disaster Management (FG-AI4NDM) is building a community of experts and stakeholders to identify best practices in the use of AI for data processing, improved modeling across spatiotemporal scales, and effective communication. This multidisciplinary FG-AI4NDM session invites contributions addressing challenges related to floods, landslides, earthquakes, volcanic eruptions, and tsunamis, among others, as well as multi-hazard events. It also welcomes presentations on novel AI methods (including advances in automated annotation, explainability, etc.) that are hazard-agnostic.
Wed, 25 May, 11:05–11:47 (CEST), 13:20–16:36 (CEST)
Short-term Earthquakes Forecast (StEF) and multi-parametric time-Dependent Assessment of Seismic Hazard (t-DASH)
The major contribution to the development of operational t-DASH systems, suitable for supporting decision makers with continuously updated seismic hazard scenarios, is expected to come from the real-time integration of multi-parametric observations. A very preliminary step in this direction is the identification of those parameters (seismological, chemical, physical, biological, etc.) whose space-time dynamics and/or anomalous variability can be, to some extent, associated with the complex process of preparation of major earthquakes.
This session therefore encourages studies devoted to demonstrating the added value of introducing specific observations and/or data analysis methods within the t-DASH and StEF perspectives. Studies based on long-term data analyses, including different conditions of seismic activity, are particularly encouraged. Equally welcome are presentations of infrastructures devoted to maintaining and further developing our present observational capabilities for earthquake-related phenomena, thereby also contributing to building a global multi-parametric Earthquakes Observing System (EQuOS) to complement the existing GEOSS initiative.
To this aim, this session is addressed not just to seismologists and natural hazard scientists but also to geologists and to atmospheric science and electromagnetism researchers, whose collaboration is particularly important for fully understanding the mechanisms of earthquake preparation and their possible relation to other measurable quantities. For this reason, all contributions devoted to the description of genetic models of earthquake precursory phenomena are equally welcome.
Fri, 27 May, 08:30–11:47 (CEST), 13:20–14:05 (CEST)
Deep learning in hydrological science
Machine learning (ML) and Deep Learning (DL) have seen accelerated adoption across Hydrology and the broader Earth Sciences. This session highlights the continued integration of ML, and its many variants, including DL, into traditional and emerging hydrology-related workflows. Abstracts are solicited on novel theory development, novel methodology, or practical applications of ML in hydrological modeling. This might include, but is not limited to, the following:
(1) Development of novel DL models or modeling workflows.
(2) Integrating DL with process-based models and/or physical understanding.
(3) Improving understanding of the (internal) states/representations of ML/DL models.
(4) Understanding the reliability of ML/DL, e.g., under non-stationarity.
(5) Deriving scaling relationships or process-related insights with ML/DL.
(6) Modeling human behavior and impacts on the hydrological cycle.
(7) Hazard analysis, detection, and mitigation.
(8) Natural Language Processing in support of models and/or modeling workflows.
ML for Climate Science
Recent developments in machine learning (ML) are transforming Earth observation data analysis and modelling of the Earth system and its constituent processes. While statistical models have been used for a long time, state-of-the-art machine and deep learning algorithms allow encoding non-linear, spatio-temporal relationships robustly without sacrificing interpretability. These advances have the potential to accelerate climate science by improving our understanding of the underlying processes, reducing and better quantifying uncertainty, and even making predictions directly from observations across different spatio-temporal scales.
This session aims to provide a venue to present the latest progress in the use of ML applied to all aspects of climate science including, but not limited to:
- Causal discovery and inference
- Learning (causal) process and feature representations in observations
- Hybrid models (physically informed ML)
- Novel detection and attribution approaches
- Probabilistic modelling and uncertainty quantification
- Explainable AI applications to climate science
Please consider submitting abstracts focussed on ML for model improvement, particularly for near-term (including seasonal) forecasting to the companion “ML for Earth System modelling” session.
Mon, 23 May, 08:30–11:50 (CEST), 13:20–14:50 (CEST), 15:10–16:40 (CEST)
Advances in geomorphometry and landform mapping: possibilities, challenges and perspectives
Geomorphometry and landform mapping are important tools used for understanding landscape processes and dynamics on Earth and other planetary bodies. Recent rapid advances in technology and data collection methods have made available vast quantities of geospatial data offering unprecedented spatio-temporal range, density, and resolution, but they have also created new challenges in terms of data processing and analysis.
This inter-disciplinary session on geomorphometry and landform mapping aims to bridge the gap between process-focused research fields and the technical domain where geospatial products and analytical methods are developed. The increasing availability of a wide range of geospatial datasets requires the continued development of new tools and analytical approaches as well as landform/landscape classifications. However, a potential lack of communication across disciplines results in efforts being focused mainly on problems within individual fields. We aim to foster collaboration and the sharing of ideas across subject boundaries, between technique developers and users, enabling us as a community to fully exploit the wealth of geospatial data that is now available.
We welcome perspectives on geomorphometry and landform mapping from ANY discipline (e.g. geomorphology, planetary science, natural hazard assessment, computer science, remote sensing). This session aims to showcase both technical and applied studies, and we welcome contributions that present (a) new techniques for collecting or deriving geospatial data products, (b) novel tools for analysing geospatial data and extracting innovative geomorphometric variables, (c) mapping and/or morphometric analysis of specific landforms as well as whole landscapes, and (d) mapping and/or morphometric analysis of newly available geospatial datasets. Contributions that demonstrate multi-method or inter-disciplinary approaches are particularly encouraged. We also actively encourage contributors to present tools/methods that are “in development”.
Tue, 24 May, 10:20–11:44 (CEST), 13:20–14:37 (CEST)
Novel data, methods and applications in Geomorphometry
Geomorphometry, the science of quantitative land surface analysis, gathers various mathematical, statistical and image processing techniques to quantify morphological, hydrological, ecological and other aspects of a land surface. The typical input to geomorphometric analysis is a square-grid representation of the land surface: a digital elevation model (DEM) or one of its derivatives. DEMs provide the backbone for many studies in the geosciences, hydrology, land use planning and management, Earth observation and natural hazards.
One topic of active research concerns compromises between the use of global DEMs at 1-3 arc-second (~30-90 m) grid spacing and local LiDAR/structure-from-motion (SfM) elevation models at 1 m or finer grid spacing. Point clouds from LiDAR, either ground-based or from airborne vehicles, are a generally accepted reference tool to assess the accuracy of other DEMs. SfM data have a resolution comparable to LiDAR point clouds, but can cost significantly less to acquire for smaller areas. Globally available DEMs include the recently published Copernicus GLO-90 and GLO-30. This session provides an exciting forum to show the potential applications of these new DEMs and their improvements over SRTM. We would like to investigate the tradeoff between the two kinds of data, and applications which can benefit from data at both (local and global) scales.
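The "1-3 arc second, ~30-90 m" correspondence quoted above can be checked with a quick back-of-the-envelope conversion; the sketch below assumes a spherical Earth with the usual 6,371 km mean radius:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius, spherical approximation

def arcsec_to_meters(arcsec, latitude_deg=0.0):
    """Ground distance spanned by an arc of `arcsec` seconds of longitude
    at a given latitude, on a spherical Earth."""
    arc_rad = math.radians(arcsec / 3600.0)
    return EARTH_RADIUS_M * arc_rad * math.cos(math.radians(latitude_deg))

# 1 arcsec at the equator is ~31 m and 3 arcsec is ~93 m, matching the
# ~30-90 m range quoted for global DEMs; east-west spacing shrinks with
# the cosine of latitude, which is why grid cells are anisotropic at
# high latitudes.
print(f"1 arcsec at equator: {arcsec_to_meters(1):.1f} m")
print(f"3 arcsec at equator: {arcsec_to_meters(3):.1f} m")
print(f"1 arcsec at 60 deg:  {arcsec_to_meters(1, 60):.1f} m")
```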
The improvements in the global DEMs, as well as the increasing availability of much finer resolution LiDAR and SFM DEMs, call for new analytical methods and advanced geo-computation techniques, necessary to cope with diverse application contexts. We aim at investigating new methods of analysis and advanced geo-computation techniques, including high-performance and parallel computing implementations of specific approaches.
Commercial applications of DEM data and of geomorphometric techniques can benefit important business sectors. Besides the proliferation of applications that can tolerate low-accuracy geographical data and simple GIS tools, a large base of professionals uses high-resolution, high-accuracy elevation data and high-performance GIS processing. We would like to survey and investigate professional, commercial and industrial applications, including software packages, from small enterprises to large companies, to ascertain how academic researchers and industry can work together.
ESSI2 – Data, Software and Computing Infrastructures Across Earth and Space Sciences
Programme group scientific officers:
Established and Establishing Disciplinary International Frameworks that will Ultimately Enable Real-Time Interdisciplinary Sharing of Data.
As we increasingly face global challenges such as climate change, pandemics, and the environmentally sustainable exploitation of our resources, there is a greater urgency to bring together the multiple existing data/information infrastructure systems distributed around the world to create machine-actionable, interoperable, reusable, real-time data sharing frameworks.
The problem is that research can be a ‘competitive’ process, and there is a tendency for this competition to focus on which data sharing system or data standard is considered the best, the one that supposedly everyone SHOULD use.
An alternative approach is to build loosely coupled frameworks that allow multiple existing systems to interoperate while still preserving their deeper disciplinary specialization. For this approach to work, there will need to be agreement on 1) the minimum core variables for sharing data content, and 2) the technical standards/technologies required to enable real-time data interoperability.
There are well-established examples of groups facilitating global data sharing (e.g., the Federation of Digital Seismograph Networks (FDSN), OneGeology, the Earth System Grid Federation (ESGF), OGC, W3C, GEO). Many new groups are starting to form global disciplinary data networks, and some are already trying to link frameworks together to enable global interdisciplinary sharing of data (e.g., the CODATA/DDI Cross-Domain Data Initiative).
This session seeks contributions from any group that has established or is establishing a data-sharing infrastructure system/framework regardless of scale, as well as those that are attempting global and/or interdisciplinary networking. Topics may range from (meta)data standards, defining minimum core content variables, or be focused on technologies or organizational setups for enabling data sharing. Papers on the social dynamics of building sharing systems/frameworks are also welcome.
Meeting Exascale Computing Challenges with Compression and Pangeo
Current pre-exascale computing systems, and the strong push towards exascale warrant substantial efforts to improve the geoscientific software infrastructure used for Earth System Model (ESM) development, data analysis, and storage. The Exascale era opens a range of opportunities, including increased domain size, simulation duration, model resolution, large ensembles, and new physics. This session will discuss challenges and solutions involving domain scientists, applied mathematicians, computer scientists, HPC, and compression experts.
Contributions address challenges and advances in achieving exascale readiness across geoscience disciplines, methods, and technologies.
Pangeo (pangeo.io) is a community of researchers and developers who tackle these issues collaboratively using a growing Python ecosystem whose core tools include Xarray, Iris, Dask, Jupyter, Zarr, and Intake. Many contributors to this session will share novel tools within the Pangeo ecosystem devoted to atmosphere, ocean and land models, satellite observations, HPC, cloud computing, machine learning, and scalable scientific computing.
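The chunked, out-of-core computation pattern that Dask brings to this ecosystem can be sketched in miniature with the standard library alone. This is a toy illustration of the blocked-reduction idea only; real Pangeo workflows would apply it via `dask.array` over Zarr or netCDF stores:

```python
from itertools import islice

def chunked(iterable, chunk_size):
    """Yield successive lists of up to chunk_size items from an iterable."""
    it = iter(iterable)
    while chunk := list(islice(it, chunk_size)):
        yield chunk

def streaming_mean(values, chunk_size=1000):
    """Mean of a (possibly huge) stream, reduced chunk by chunk.

    Each chunk contributes only to a running (sum, count) pair, so peak
    memory is bounded by the chunk size - the same blocked-reduction idea
    Dask applies to array chunks."""
    total, count = 0.0, 0
    for chunk in chunked(values, chunk_size):
        total += sum(chunk)
        count += len(chunk)
    return total / count

# The input is a lazy generator: the full million-element "dataset"
# is never held in memory at once.
print(streaming_mean(0.5 * x for x in range(1_000_000)))  # prints 249999.75
```

Dask generalizes this to multi-dimensional arrays, builds the reduction as a task graph, and executes the chunks in parallel across threads, processes, or a cluster.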
This session also considers how geoscientists can shift towards greener computing by adopting modern data compression techniques including, though not limited to: algorithmic advances, assessments of data storage sustainability, compression efficiency and speed in software and/or hardware, interoperability issues, remote sensing applications, and support in widely used languages (e.g., C/C++, Fortran, Java, Python), data storage formats (e.g., HDF, netCDF, Zarr), and Open Source workflows (e.g., CDO, NCO, Pangeo, Ruby, Xarray).
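As a small illustration of the lossless end of the compression spectrum discussed above, standard-library tools already let one measure storage savings on numeric data. This sketch uses Python's built-in DEFLATE codec on a synthetic field; production geoscience workflows would instead use the compressors built into netCDF/HDF5 or Zarr:

```python
import array
import math
import zlib

# A smooth synthetic "field": 10,000 float64 samples of a slowly
# varying signal, serialized to raw bytes.
field = array.array("d", (math.sin(i / 100.0) for i in range(10_000)))
raw = field.tobytes()

# Lossless compression with the standard library's DEFLATE implementation.
compressed = zlib.compress(raw, level=9)
print(f"raw: {len(raw)} bytes, compressed: {len(compressed)} bytes, "
      f"ratio: {len(raw) / len(compressed):.2f}x")

# Decompression restores the original bytes exactly - the defining
# property of a lossless codec.
restored = array.array("d")
restored.frombytes(zlib.decompress(compressed))
assert restored.tolist() == field.tolist()
```

Lossy floating-point compressors push ratios much further by discarding low-order mantissa bits, which is exactly why the session calls for assessments of compression efficiency against scientific fidelity.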
All authors in this session have the option to submit Jupyter notebooks of their work; the best five will be selected as part of the Pangeo applications gallery of EGU22. Examples of previous galleries are at http://gallery.pangeo.io.
Software tools and semantics for geospatial research
This session invites contributions focusing on modern software tools developed to facilitate the analysis of mainly geospatial data in any branch of the geosciences, for the purpose of better understanding Earth’s natural environment. We encourage contributions of any kind of tool, including open source tools and those built on top of globally used commercial GIS solutions. The session also invites contributions exploring developments in interoperable data sharing and the representation of semantic meaning to enable interpretation of geoscientific information.
Potential topics for the session include software tools developed for displaying, processing and analysing geospatial data; modern cloud webGIS platforms and services used for geographical data analysis and cartographic purposes; and information systems using web-based (W3C, ISO, OGC) standards that are leading to advances in data interoperability. We also welcome contributions presenting tools that make use of parallel processing on high-performance computers (HPC) and graphics processing units (GPUs), as well as topics linked to interoperable data sharing and the use of metadata and semantics in an interoperability context, applied in any field of the geosciences.
Mon, 23 May, 13:20–14:44 (CEST), 15:10–15:38 (CEST)
Crafting beautiful maps with PyGMT
In many scientific disciplines, accurate, intuitive, and aesthetically pleasing display of geospatial information is a critical tool. PyGMT (https://www.pygmt.org) - a Python interface to the Generic Mapping Tools (GMT) - is a mapping toolbox designed to produce publication-quality figures and maps for insertion into posters, reports, and manuscripts. This short course is geared towards geoscientists interested in creating beautiful maps using Python. Only basic Python knowledge is needed, and a background in cartography is not required to use PyGMT effectively! By the end of this tutorial, students will be able to:
- Craft basic maps with geographic map frames using different projections
- Add context to their figures, such as legends, colorbars, and inset overview maps
- Use PyGMT to process PyData data structures (xarray/pandas/geopandas) and plot them on maps
- Understand how PyGMT can be used for various applications in the Earth sciences and beyond!
The 1.5-hour short course will be based on content adapted from https://github.com/GenericMappingTools/2021-unavco-course and https://github.com/GenericMappingTools/foss4g2019oceania. Each of the 30-minute sessions will involve a quick (~10 minute) walkthrough by the speaker, followed by a more hands-on session in breakout rooms, where tutorial participants work on the topic (using interactive Jupyter notebooks) in a guided environment with one of four instructors on hand to answer questions.
We expressly welcome students and geoscientists working in any geo-related field (e.g. Earth observation, geophysics, marine, magnetics, gravity, planetary science) to join. Come and find out what PyGMT can do to level up your geoprocessing workflow!
Course materials are available as a Jupyter Book at https://www.generic-mapping-tools.org/egu22pygmt. The GitHub repository is at https://github.com/GenericMappingTools/egu22pygmt
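As a flavour of what the course covers, a minimal PyGMT sketch (assuming PyGMT and its GMT backend are installed; the projection and styling choices here are illustrative, not prescribed by the course) might look like:

```python
import pygmt

fig = pygmt.Figure()
# Global coastlines on a 12 cm wide Robinson projection,
# with an automatic map frame
fig.coast(
    region="g",           # global domain
    projection="N12c",    # Robinson projection
    land="lightgray",
    water="skyblue",
    frame=True,
)
fig.savefig("global_map.png")
```

From this starting point, the course builds up legends, colorbars, insets, and plotting of xarray/pandas/geopandas data on the same `Figure` object.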
ESSI3 – Open Science Informatics for Earth and Space Sciences
Programme group scientific officers:
Best Practices and Realities of Research Data Repositories: Balancing the needs of Repositories, Researchers and Publishers
As funders and publishers increasingly require that research data be made publicly available, research data repositories, especially in the Earth and environmental sciences, play a new and major role in the publication process. On the one hand, they support researchers during the data publication process; on the other, they need to present the data in a way that is fully integrable into the ecosystem of modern scientific communication, as required by the FAIR Data Principles. More recent developments, like the CoreTrustSeal certification and the Enabling FAIR Data Commitment Statement, have defined additional benchmarks and expectations for the capabilities of repositories.
How do repositories comply with increasing expectations for machine accessibility of their data and the requirements for machine learning (particularly for long tail data)? How do researchers know which repositories meet these benchmarks and future expectations? How should publishers work together with repositories and researchers to ensure a more complete record of science? What role can data journals and editors play?
This session will showcase the range of practices in research data repositories, data publication and the integration of data, software, samples, models and notebooks into the scholarly publication process. It invites repositories, researchers, information scientists, journals, and editors to discuss challenges they are facing in meeting community best practice.
Making Geoanalytical Data FAIR: Managing Data from Field to Laboratory to Archive to Publication
Globally, geoscience and research analytical laboratories collect ever increasing volumes of data: an acute challenge now is how to collate, store and make these data accessible in a standardised, interoperable and machine-accessible form that is FAIR. Many solutions today are bespoke and inefficient, lacking, for example, unique identification of samples, instruments, and data sets needed to trace the analytical history of the data; and there are few community agreed standards to facilitate sharing and interoperability between systems.
The push for a solution is being driven by publishers and journals who increasingly require researchers to provide access to the supporting data from a trusted repository prior to publication of manuscripts or finalisation of grants. We urgently need community development of systems to facilitate easy and efficient management of geoanalytical laboratory data. We need to address the lack of global standards, best practices and protocols for analytical data management and exchange, in order for scientists to better share their data in a global network of distributed databases. Buy-in from users and laboratory managers/technicians is essential in order to develop efficient and supported mechanisms.
This session seeks a diversity of papers from any initiative around the world that organises and structures sample/field metadata and research laboratory data at any scale to facilitate sharing and processing of geoanalytical data. We welcome papers on data and metadata standardisation efforts and papers on data management and systems that transfer data/metadata from instruments to shared data systems and relevant persistent repositories. Efforts on how to collate, curate, share and publicise sample/data collections as well as papers on the social dynamics of building sharing systems/frameworks are also welcome.
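As a concrete illustration of the standardisation discussed above, a machine-readable analysis record might carry persistent identifiers for the sample and instrument alongside the measured values. The field names and identifier below are hypothetical, loosely inspired by IGSN/DataCite-style metadata, and do not represent an agreed community standard.

```python
import json

# Hypothetical, minimal sample-analysis record; field names are
# illustrative, not a real schema.
record = {
    "sample_igsn": "10.58052/IEXXX0001",  # persistent sample identifier (placeholder)
    "instrument": "XRF-spectrometer-01",  # traceable instrument identifier
    "analyte": "SiO2",
    "value": 59.3,
    "unit": "wt%",
    "license": "CC-BY-4.0",
}
print(json.dumps(record, indent=2))
```

Records in such a form can be validated, exchanged between laboratory systems and repositories, and harvested by machines, which is the essence of making geoanalytical data FAIR.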
Fri, 27 May, 11:05–11:48 (CEST), 13:20–14:49 (CEST)
Free and Open Source Software (FOSS), Cloud-based Technologies and HPC to Facilitate Collaborative Science
Earth science research has become increasingly collaborative through shared code and shared platforms. Researchers work together on data, software and algorithms to answer cutting-edge research questions. Teams also share these data and software with other collaborators to refine and improve these products. This work is supported by Free and Open Source Software (FOSS) and by shared virtual research infrastructures utilising cloud and high-performance computing.
Software is critical to the success of science. Creating and using FOSS enhances collaboration and innovation in the scientific community, creates a peer-reviewed and consensus-oriented environment, and promotes the sustainability of science infrastructures.
This session will showcase solutions and applications based on Free and Open Source Software (FOSS), cloud-based architectures and high-performance computing that support information sharing, scientific collaboration, and large-scale data analytics.
Participatory Citizen Science and Open Science as a new era of environmental observation for society
Citizen science (the involvement of the public in scientific processes) is gaining momentum across multiple disciplines, increasing multi-scale data production on Earth Sciences that is extending the frontiers of knowledge. Successful participatory science enterprises and citizen observatories can potentially be scaled-up in order to contribute to larger policy strategies and actions (e.g. the European Earth Observation monitoring systems), for example to be integrated in GEOSS and Copernicus. Making credible contributions to science can empower citizens to actively participate as citizen stewards in decision making, helping to bridge scientific disciplines and promote vibrant, liveable and sustainable environments for inhabitants across rural and urban localities.
Often, citizen science is seen in the context of Open Science, which is a broad movement embracing Open Data, Open Technology, Open Access, Open Educational Resources, Open Source, Open Methodology, and Open Peer Review. Before 2003, the term Open Access referred only to free access to peer-reviewed literature (e.g., the Budapest Open Access Initiative, 2002). In 2003, with the “Berlin Declaration on Open Access to Knowledge in the Sciences and Humanities”, the definition was given a wider scope that includes raw research data, metadata, source materials, and scholarly multimedia material. Increasingly, access to research data has become a core issue in the advance of science. Both open science and citizen science pose great challenges for researchers seeking to facilitate effective participatory science, yet they are of critical importance to modern research and decision-makers.
We want to ask and find answers to the following questions:
Which approaches and tools can be used in Earth and planetary observation?
What are the biggest challenges in bridging between scientific disciplines and how to overcome them?
What kinds of participatory citizen-scientist involvement (e.g. how citizen scientists are involved in research, and which groups are involved) and open science strategies exist?
How to ensure transparency in project results and analyses?
What kind of critical perspectives on the limitations, challenges, and ethical considerations exist?
How can citizen science and open science approaches and initiatives be supported on different levels (e.g. institutional, organizational, national)?
ESSI4 – Advanced Technologies and Informatics Enabling Transdisciplinary Science
Programme group scientific officers:
Faster Uptake of Earth Observation Based Services for the Geosciences
One of today's challenges in the Earth sciences is the continuous evolution of technologies, making it hard for users and developers to be up to date and take advantage of the most advanced solutions.
On the other hand, there have never been such favorable conditions for the development of Earth observation based services and applications targeting both the public and private sectors.
For this session we invite abstracts that present operational applications for scientific services across the geosciences, including ocean research, forest monitoring, crop monitoring and more. By presenting these success stories we hope to initiate a discussion between different actors involved in making and using services addressing gaps and needs for faster uptake of Earth observation based services.
New frontiers of multiscale monitoring, analysis, modeling and decisional support (DSS) of environmental systems
Environmental systems often span spatial and temporal scales covering different orders of magnitude. The session is oriented toward collecting studies relevant to understanding the multiscale aspects of these systems and to proposing adequate multi-platform, inter-disciplinary monitoring networks and tools. It especially aims to emphasize the interaction between environmental processes occurring at different scales. In particular, special attention is devoted to studies focused on the development of new techniques and integrated instrumentation for multiscale monitoring of high natural-risk areas, such as those affected by volcanic and seismic activity, energy exploitation, slope instability, floods, coastal instability and climate change, as well as other environmental contexts.
We expect contributions from several disciplines, such as applied geophysics, geology, seismology, geodesy, geochemistry, remote and proximal sensing, volcanology, geotechnics, soil science, marine geology, oceanography, climatology, and meteorology. Contributions on analytical and numerical modeling of geological and environmental processes are also expected.
Finally, we stress that inter-disciplinary studies highlighting the multiscale properties of natural processes, analyzed and monitored using several methodologies, are welcome.
Remote sensing measurements, acquired from different platforms - ground, UAV, aircraft and satellite - have become rapidly developing technologies for studying and monitoring the Earth's surface and for performing comprehensive analysis and modeling, with the final goal of supporting decision systems for ecosystem management. The spectral, spatial and temporal resolutions of remote sensors have been continuously improving, making environmental remote sensing more accurate and comprehensive than ever before. Such progress enables understanding of the multiscale aspects of high-risk natural phenomena and the development of multi-platform, inter-disciplinary surveillance and monitoring tools. The session welcomes contributions focusing on present and future perspectives in environmental remote sensing, from multispectral/hyperspectral optical to thermal sensors. Applications are encouraged to cover, but are not limited to, the monitoring and characterization of environmental changes and natural hazards from volcanic and seismic processes, landslides, and soil processes. Specifically, we are looking for novel solutions and approaches covering the following topics: (i) state-of-the-art techniques focusing on novel quantitative methods; (ii) new applications for state-of-the-art sensors, including UAVs and other close-range systems; (iii) techniques for multi-platform data fusion.
Detailed seabed maps portraying the distribution of geomorphic features, substrates, and habitats are used for a wide range of scientific, maritime industry, and government applications. These maps provide essential information for ocean industry sectors and are used to guide local and regional conservation action. Fundamental to seabed mapping are acoustic remote sensing technologies, including single beam and multibeam echosounders and sidescan, interferometric, and synthetic-aperture sonars. These are deployed on a variety of crewed and robotic surface and underwater platforms. In shallow clear waters, optical sensors including LiDAR, multispectral, and hyperspectral cameras are also increasingly employed from aircraft, drones, and satellites to create maps of the seabed. Innovative data processing, image analysis, and statistical approaches for classification are advancing the field of seabed mapping. These methods are yielding increasingly comprehensive and detailed maps. We welcome submissions that provide insights into the use of advanced technologies, novel processing and analytical approaches, and current and emerging applications in the field of seabed mapping and classification – from shallow coastal waters to the deep seafloor.
Geo-infrastructure monitoring: complex data analysis and instrument application
Continuous monitoring of infrastructure systems is essential to ensure the reliable movement of people and goods, which underpins economic growth and human interaction. The wide variety of instruments available allows diverse applications that increase data availability for a better understanding of the geotechnical surroundings, which are directly linked to the safe operation of infrastructure and to preventing catastrophes such as soil erosion, settlement, liquefaction, landslides, seismic activity, flooding and even wildfires close to highways. Understanding these events is vital to providing safe infrastructure under extreme climate conditions. This session focuses on the application of geoscientific and geophysical instrumentation, including sensors, to infrastructure monitoring and on data analysis from critical infrastructures (e.g., roadways, railway systems, bridges, tunnels, water supply, underground utilities, electrical grids, and other embedded facilities in cities). The session aims to increase knowledge on geo-infrastructure management to overcome future challenges associated with societal and human interaction, and to present advanced research and novel approaches from various disciplines, with a vibrant connection to economic and human-interaction studies, in order to provide an efficient infrastructure management system. The session is considered an inter- and transdisciplinary (ITS) session. The applications and topics include, but are not limited to: (1) Advanced knowledge of destructive and non-destructive geoscientific and geophysical techniques, including contact and contactless sensing techniques. (2) Intelligent data analysis approaches for accurate and precise interpretation of big data sets derived from various technologies (e.g., computer vision, image and signal processing).
(3) Influence of the surrounding areas on infrastructure management systems linked to natural events such as soil erosion, settlement, liquefaction, landslides, seismic activity, flooding, wildfires and extreme weather conditions. (4) Continuous real-time monitoring to provide smart tools, such as the integration of geoscience data with BIM models, the Internet of Things, digital twins, robotic monitoring, artificial intelligence, and machine-learning-based automation and computational modelling for better decision-making by infrastructure owners/operators. (5) Human-computer interaction approaches to help generate reliable infrastructure.
Interferometric Synthetic Aperture Radar added value products for Natural & Anthropogenic hazard assessment at local, regional and national scale.
Synthetic aperture radar (SAR) remote sensing is an established tool for natural and anthropogenic hazards mapping and monitoring. The new generation of radar satellite constellations along with a consistent repository of historical observations is fostering comprehensive multi-sensor hazard analyses. New constellations’ capabilities rely on innovative techniques based on high-resolution/wide-swath and short-temporal Interferometric SAR (InSAR). While acknowledging the benefits brought by these recent developments, the scientific community is now defining a new paradigm of techniques capable of: extracting relevant information from SAR imagery, designing proper methodologies for specific hazards, managing large SAR datasets (e.g. National ground motion services, Copernicus EGMS), and integrating radar data with multispectral satellite observations.
Application of remote sensing and Earth-observation data in natural hazard and risk studies
Remote sensing and Earth observation (EO) are increasingly used in the different phases of risk management and in development cooperation, owing to the challenges posed by contemporary issues such as climate change and increasingly complex social interactions. The advent of new, more powerful sensors and more finely tuned detection algorithms provides the opportunity to assess and quantify natural hazards, their consequences, and vulnerable regions more comprehensively than ever before.
Several agencies have now permanently inserted applications of EO data to risk management into their programmes. During the preparedness and prevention phase, EO has proved fundamental for hazard, vulnerability, and risk mapping. EO data contribute both to emergency forecasting and to early emergency response, thanks to the potential of rapid mapping. EO data are also increasingly being used to map information useful for planning interventions in the recovery phase, and thus provide assessment and analysis of natural hazards from small to large regions around the globe. In this framework, the Committee on Earth Observation Satellites (CEOS) has been working for several years on disaster management related to natural hazards (e.g., volcanic, seismic, landslide and flooding events), including pilots, demonstrators, recovery observatory concepts, the Geohazard Supersites and Natural Laboratories (GSNL) initiative, and multi-hazard management projects.
The session is dedicated to multidisciplinary contributions focused on the demonstration of the benefit of the use of EO for natural hazards and risk management.
The research presented might focus on:
- Added value of EO data in hazard/risk forecasting models
- Innovative applications of EO data for rapid hazard, vulnerability and risk mapping, the post-disaster recovery phase, and in support of disaster risk reduction strategies
- Development of tools for assessment and validation of hazard/risk models
The use of different types of remote sensing (e.g. thermal, visual, radar, laser, and/or the fusion of these) is highly recommended, with an evaluation of their respective pros and cons focusing also on future opportunities (e.g. new sensors, new algorithms).
Early-stage researchers are strongly encouraged to present their research. Moreover, contributions from international cooperation, such as CEOS and GEO initiatives, are welcome.
Geoscience problems related to massive release of radioactive materials by nuclear accidents and other human activities
The session gathers geoscientific aspects such as the dynamics, reactions, and environmental/health consequences of radioactive materials that are massively released accidentally (e.g., the Chernobyl and Fukushima nuclear power plant accidents, wildfires, etc.) or by other human activities (e.g., nuclear tests).
Radioactive materials are known as pollutants hazardous to human society, but they are also ideal markers for understanding dynamics and physical/chemical/biological reaction chains in the environment. Thus, the radioactive contamination problem is multi-disciplinary. In fact, this topic involves the regional and global transport and local reactions of radioactive materials through the atmosphere, soil and water systems, the ocean, and organisms and ecosystems, and their relations with human and non-human biota. The topic also involves hazard prediction and nowcast technology.
By combining 35 years (more than the half-life of caesium-137) of monitoring data after the Chernobyl accident in 1986, 10 years of dense measurement data from the most advanced instrumentation after the Fukushima accident in 2011, and data from other events, we can improve our knowledge base on the environmental behavior of radioactive materials and their environmental/biological impact. This should lead to improved monitoring systems in the future, including emergency response systems, acute sampling/measurement methodology, and remediation schemes for any future nuclear accidents.
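The parenthetical claim above (35 years exceeds the half-life of caesium-137, roughly 30 years) can be checked with a one-line exponential-decay computation:

```python
# Half-life of caesium-137 in years (approximately 30.17 years)
HALF_LIFE_CS137 = 30.17
years_since_chernobyl = 35  # 1986 to ~2021

# Fraction of the original 137Cs inventory remaining after t years:
# N/N0 = (1/2) ** (t / half_life)
remaining = 0.5 ** (years_since_chernobyl / HALF_LIFE_CS137)
print(f"{remaining:.2f}")  # roughly 0.45, i.e. less than half remains
```

So slightly under half of the originally deposited 137Cs is still present in the environment, which is why long-term monitoring remains scientifically valuable.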
The following specific topics have traditionally been discussed:
(a) Atmospheric Science (emissions, transport, deposition, pollution);
(b) Hydrology (transport in surface and ground water system, soil-water interactions);
(c) Oceanology (transport, bio-system interaction);
(d) Soil System (transport, chemical interaction, transfer to organic system);
(e) Natural Hazards (warning systems, health risk assessments, geophysical variability);
(f) Measurement Techniques (instrumentation, multipoint data measurements);
(g) Ecosystems (migration/decay of radionuclides).
The session consists of updated observations, new theoretical developments including simulations, and improved methods or tools which could improve observation and prediction capabilities during eventual future nuclear emergencies. New evaluations of existing tools, past nuclear contamination events and other data sets are also welcome.
Data fusion, integration, correlation and advances of non-destructive testing methods and numerical developments for engineering and geosciences applications
Non-destructive testing (NDT) methods are employed in a variety of engineering and geosciences applications, and their stand-alone use has been thoroughly investigated to date. New theoretical developments, technological advances and progress in surveying, data processing and interpretation have led to tremendous growth in equipment reliability, allowing outstanding data quality and accuracy.
Nevertheless, the requirements of comprehensive site and material investigations may be complex and time-consuming, involving multiple expertise and multiple equipment. The challenge is to step forward and provide an effective integration between data outputs with different physical quantities, scale domains and resolutions. In this regard, enormous development opportunities relating to data fusion, integration and correlation between different NDT methods and theories are to be further investigated.
This Session primarily aims at disseminating contributions from state-of-the-art NDT methods and new numerical developments, promoting the integration of existing equipment and the development of new algorithms, surveying techniques, methods and prototypes for effective monitoring and diagnostics. NDT techniques of interest are related–but not limited to–the application of acoustic emission (AE) testing, electromagnetic testing (ET), ground penetrating radar (GPR), geoelectric methods (GM), laser testing methods (LM), magnetic flux leakage (MFL), microwave testing, magnetic particle testing (MT), neutron radiographic testing (NR), radiographic testing (RT), thermal/infrared testing (IRT), ultrasonic testing (UT), seismic methods (SM), vibration analysis (VA), visual and optical testing (VT/OT).
The Session will focus on the application of different NDT methods and theories, relating to – but not limited to – the following investigation areas:
- advanced data fusion;
- advanced interpretation methods;
- design and development of new surveying equipment and prototypes;
- real-time and remote assessment and monitoring methods for material and site inspection (real-life and virtual reality);
- comprehensive and inclusive information data systems for the investigation of survey sites and materials;
- numerical simulation and modelling of data outputs with different physical quantities, scale domains and resolutions;
- advances in NDT methods, numerical developments and applications (stand-alone use of existing and state-of-the-art NDTs).
Geoscience and health during the Covid-19 pandemic
The virus is still with us, with more potent variants. It remains the most immediate challenge for geosciences and health, including its impacts on geoscience development (data collection, training, dissemination) and the achievement of the UN Sustainable Development Goals, in particular that urban systems should increase well-being and health.
Long-term visions based on transdisciplinary scientific advances are therefore essential. As a consequence, this session, like the ITS1.1 session in 2021, calls for contributions based on data-driven and theory-based approaches to health in the context of global change. This includes:
- What are the main lessons from lockdowns?
- How can the best scientific results be obtained during a pandemic?
- How should field work, geophysical monitoring and planetary missions be managed?
- qualitative improvements in epidemic modelling, with nonlinear, stochastic, and complex system science approaches;
- possible interactions between weather and/or climate factors and epidemic/health problems;
- new surveillance capabilities (including contact tracing), data access, assimilation and multidimensional analysis techniques;
- a fundamental revision of our urban systems, their greening and their need for mobility;
- a special focus on urban biodiversity, especially to better manage virus vectors;
- urban resilience must include resilience to epidemics, and therefore requires revisions of urban governance.
ESSI5 – General Contributions to Earth and Space Science Informatics
Programme group scientific officers: