ESSI1 – Next Generation Analytics for Scientific Discovery: Data Science, Machine Learning, AI
Programme group scientific officers:
Spatio-temporal Data Science: Theoretical Advances and Applications in Computational Geosciences
Most of the processes studied by geoscientists are characterized by variations in both space and time. These spatio-temporal phenomena have traditionally been investigated using linear statistical approaches, as in the case of physically-based models and geostatistical models. Additionally, the rising attention toward machine learning, as well as the rapid growth of computational resources, opens new horizons in understanding, modeling and forecasting complex spatio-temporal systems through the use of stochastic non-linear models.
This session aims at exploring the new challenges and opportunities opened by the spread of data-driven statistical learning approaches in Earth and Soil Sciences. We invite cutting-edge contributions related to methods of spatio-temporal geostatistics or data mining on topics that include, but are not limited to:
- advances in spatio-temporal modeling using geostatistics and machine learning;
- uncertainty quantification and representation;
- innovative techniques of knowledge extraction based on clustering, pattern recognition and, more generally, data mining.
The main applications will be closely related to research in environmental sciences and quantitative geography. A non-exhaustive list of possible applications includes:
- natural and anthropogenic hazards (e.g. floods; landslides; earthquakes; wildfires; soil, water, and air pollution);
- interaction between geosphere and anthroposphere (e.g. land degradation; urban sprawl);
- socio-economic sciences, characterized by the spatial and temporal dimension of the data (e.g. census data; transport; commuter traffic).
Strategies and Applications of AI and ML in a Spatiotemporal Context
Modern challenges of climate change, disaster management, public health and safety, resource management, and logistics can only be addressed through big data analytics. A variety of modern technologies are generating massive volumes of conventional and non-conventional geospatial data at local and global scales. Most of these data include geospatial components and are analysed using spatial algorithms. Ignoring the geospatial component of big data can lead to inappropriate interpretation of the extracted information. This gap has been recognised and has led to the development of new spatiotemporally aware strategies and methods.
This session discusses advances in spatiotemporal machine learning methods and the software and infrastructures that support them.
Novel Methods and Applications of Satellite and Aerial Imagery
Understanding the natural processes of the Earth system, especially in the context of global climate change, has been recognized globally as an urgent and central research direction that needs further exploration. With the launch of new satellite platforms with high revisit times, combined with the increasing capability to collect repetitive ultra-high-resolution aerial images through unmanned aerial vehicles, the scientific community has new opportunities for developing and applying new image processing algorithms to solve old and new environmental issues.
The purpose of this session is to bring together researchers working on this topic, aiming to highlight ongoing research and new applications in the field of satellite and aerial time-series imagery. The session focuses on studies aimed at the development or exploitation of novel satellite time-series processing algorithms, and on applications of different types of remote sensing data for investigating long-term processes in all branches of the Earth system (sea, ice, land, atmosphere).
The conveners encourage both applied and theoretical research contributions focusing on novel methods and applications of satellite and aerial time-series imagery across all disciplines of the geosciences, including both aerial and satellite platforms and data acquired in all regions of the electromagnetic spectrum.
Remote sensing big data analysis and applications in geosciences
Remote sensing techniques, such as radar (e.g., synthetic aperture radar - SAR), optical, Lidar and hyperspectral imagery, together with hydroclimatic, geological, and geophysical data, as well as in-situ observations, have been widely employed for monitoring and responding to natural and anthropogenic hazards and for assessing environmental resources. Especially with the unprecedented spatio-temporal resolution and rapid accumulation of remote sensing data collections from various spaceborne and airborne missions, we have many more opportunities to exploit hazard- and environment-related signals, to classify the associated spatio-temporal surface changes such as deformations and landform alterations, and to interpret the primary and secondary driving mechanisms. Yet, to archive, process, and analyze such abundant remote sensing data, tailored artificial intelligence (AI) approaches, such as machine/deep learning and computer vision, are urgently required.
In this session, we welcome contributions that focus on new AI-based algorithms to retrieve remote sensing products related to environmental resources and hazards in an accurate, automated, and efficient framework. We particularly welcome contributions on applications in (1) mining, oil/gas production, fluid injection/extraction, civil infrastructure, sinkholes, land degradation, peatlands, glaciers, permafrost, and coastal subsidence; (2) emergency response based on remote sensing data to landslides, floods, winter storms, wildfires, pandemics, earthquakes, and volcanoes; and (3) mathematical and physical modeling of the remote sensing products for a better understanding of the surface and subsurface processes.
Cryospheric Data Science and Artificial Intelligence: Opportunities and Challenges
Machine learning, artificial intelligence and big data approaches have recently emerged as key tools in understanding the cryosphere. These approaches are being increasingly applied to answer long-standing questions in cryospheric science, including those relating to remote sensing, forecasting, and improving process understanding across Antarctic, Arctic and Alpine regions. In doing so, data science and AI techniques are being used to gain insight into system complexity, analyse data on unprecedented temporal and spatial scales, and explore much wider parameter spaces than were previously possible.
In this session we invite submissions that use data science and/or AI techniques to address research questions relating to glaciology, sea ice, permafrost and/or polar climate science. Approaches used may include (but are not limited to) machine learning, artificial intelligence, big data processing/automation techniques, advanced statistics, and innovative software/computing solutions. These could be applied to any (or combinations) of data sources including remote sensing, numerical model output and field/lab observations. We particularly invite contributions that apply techniques and approaches that reveal new insights into cryospheric research problems that would not otherwise be achievable using traditional methods, and those that discuss how or whether approaches can be applied or adapted to other areas of cryospheric science. Given the rapid development of this field by a diverse group of international researchers, we convene this session to help foster future collaboration amongst session contributors, attendees, and international stakeholders, and to help address the most challenging questions in cryospheric science.
Analysis of complex geoscientific time series: linear, nonlinear, and computer science perspectives
This interdisciplinary session welcomes contributions on novel conceptual and/or methodological approaches and methods for the analysis and statistical-dynamical modeling of observational as well as model time series from all geoscientific disciplines.
Methods to be discussed include, but are not limited to:
- linear and nonlinear methods of time series analysis;
- time-frequency methods;
- statistical inference for nonlinear time series, including empirical inference of causal linkages from multivariate data;
- nonlinear statistical decomposition and related techniques for multivariate and spatio-temporal data;
- nonlinear correlation analysis and synchronisation;
- surrogate data techniques;
- filtering approaches and nonlinear methods of noise reduction;
- artificial intelligence and machine learning based analysis and prediction for univariate and multivariate time series.
Contributions on methodological developments and applications to problems across all geoscientific disciplines are equally encouraged. We particularly aim at fostering a transfer of new methodological data analysis and modeling concepts among different fields of the geosciences.
Advances in diagnostics, sensitivity, uncertainty analysis, and hypothesis testing of Earth and Environmental Systems models
Proper characterization of uncertainty remains a major research and operational challenge in the Environmental Sciences and is inherent to many aspects of modelling, impacting model structure development; parameter estimation; adequate representation of the data (input data and data used to evaluate the models); initial and boundary conditions; and hypothesis testing. To address this challenge, two closely related families of methods have proved very helpful: a) uncertainty analysis (UA) methods, which seek to identify, quantify and reduce the different sources of uncertainty, as well as propagate them through a system/model, and b) sensitivity analysis (SA) methods, which evaluate the role and significance of uncertain factors in the functioning of systems/models.
This session invites contributions that discuss advances, in theory and/or application, in methods for SA/UA applicable to all Earth and Environmental Systems Models (EESMs), embracing all areas of hydrology, such as classical hydrology, subsurface hydrology and soil science.
Topics of interest include (but are not limited to):
1) Novel methods for effective characterization of sensitivity and uncertainty
2) Analyses of over-parameterised models enabled by AI/ML techniques
3) Single- versus multi-criteria SA/UA
4) Novel methods for spatial and temporal evaluation/analysis of models
5) The role of information and error on SA/UA (e.g., input/output data error, model structure error, parametric error, regionalization error in environments with no data etc.)
6) The role of SA in evaluating model consistency and reliability
7) Novel approaches and benchmarking efforts for parameter estimation
8) Improving the computational efficiency of SA/UA (efficient sampling, surrogate modelling, parallel computing, model pre-emption, model ensembles, etc.)
Earth System Model Evaluation with ESMValTool in the Jupyter notebook
This Short Course is aimed at researchers in climate-related domains, who have an interest in working with climate data. We will introduce the ESMValTool, a Python project developed to facilitate the analysis of climate data through so-called recipes. An ESMValTool recipe specifies which input data will be used, which preprocessor functions will be applied, and which analytics should be computed. As such, it enables readable and reproducible workflows. The tool takes care of finding, downloading, and preparing data for analysis. It includes a suite of preprocessing functions for commonly used operations on the input data, such as regridding or computation of various statistics, as well as a large collection of established analytics.
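To make the recipe concept above concrete, the sketch below shows the typical shape of an ESMValTool recipe; the specific dataset, preprocessor settings, variable and script names are illustrative assumptions for the sake of the example (and some required bookkeeping fields are omitted), so consult the ESMValTool documentation for runnable example recipes:

```yaml
# Schematic ESMValTool recipe: which data, which preprocessing, which analytics.
# Dataset, preprocessor settings and script path below are illustrative only.
documentation:
  title: Example annual-mean near-surface temperature analysis
  description: Minimal sketch of the recipe structure.

datasets:            # which input data will be used
  - {dataset: BCC-ESM1, project: CMIP6, exp: historical,
     ensemble: r1i1p1f1, grid: gn}

preprocessors:       # which preprocessor functions will be applied
  annual_mean:
    regrid:
      target_grid: 1x1
      scheme: linear
    annual_statistics:
      operator: mean

diagnostics:         # which analytics should be computed
  example_diagnostic:
    variables:
      tas:
        mip: Amon
        preprocessor: annual_mean
        start_year: 2000
        end_year: 2014
    scripts:
      example:
        script: examples/diagnostic.py
```

Because the recipe declares the data, preprocessing and analysis in one place, rerunning it reproduces the whole workflow, which is what makes recipes readable and reproducible.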
In this course, we will run some of the available example recipes using ESMValTool’s convenient Jupyter notebook interface. You will learn how to customize the examples, in order to get started with implementing your own analysis. A number of core developers of ESMValTool will be present to answer any and all questions you may have.
The ESMValTool has been designed to analyze the data produced by Earth System Models participating in the Coupled Model Intercomparison Project (CMIP), but it also supports commonly used observational and re-analysis climate datasets, such as ERA5. Version 2 of the ESMValTool has been specifically developed to target the increased data volume and complexity of CMIP Phase 6 (CMIP6) datasets. ESMValTool comes with a large number of well-established analytics, such as those in Chapter 9 of the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5) (Flato et al., 2013) and has been extensively used in preparing the figures of the Sixth Assessment Report (AR6). In this way, the evaluation of model results can be made more efficient, thereby enabling scientists to focus on developing more innovative methods of analysis rather than constantly having to "reinvent the wheel".
Unsupervised, supervised, semi-supervised as well as reinforcement learning are now increasingly used to address Earth system related challenges.
Machine learning could help extract information from the wealth of Earth system data, such as in-situ and satellite observations, as well as improve model fidelity through novel parameterisations or speed-ups. This session invites submissions spanning modelling and observational approaches, towards providing an overview of the state of the art in the application of these novel methods. This includes (but is not restricted to) the use of machine learning to:
- improve forecast skill;
- generate significant speed-ups;
- design new parameterization schemes;
- emulate numerical models.
Please consider submitting abstracts focussed on ML applied to observations and modelling of climate processes to the companion "ML for Climate Science" session.
Machine Learning in Planetary Sciences and Heliophysics
The ever-growing volume and complexity of data in planetary sciences and heliophysics call for new data analysis strategies. There is a need for frameworks that can rapidly and intelligently extract information from these data sets in a manner useful for scientific analysis. The community is starting to respond to this need. Machine learning, with all of its different facets, provides a viable playground for tackling a wide range of research questions in planetary and heliospheric physics.
We encourage submissions dealing with machine learning approaches of all levels in planetary sciences and heliophysics. The aim of this session is to provide an overview of the current efforts to integrate machine learning technologies into data driven space research, to highlight state-of-the art developments and to generate a wider discussion on further possible applications of machine learning.
Artificial Intelligence for Natural Hazard and Disaster Management
Through a wealth of geospatial data, growing computational power, and demonstrated success across many fields of application, artificial intelligence (in particular, machine learning) promises to advance our understanding of natural hazards and our ability to predict and respond to natural disasters. The ITU/WMO/UNEP Focus Group on AI for Natural Disaster Management (FG-AI4NDM) is building a community of experts and stakeholders to identify best practices in the use of AI for data processing, improved modeling across spatiotemporal scales, and effective communication. This multidisciplinary FG-AI4NDM session invites contributions addressing challenges related to floods, landslides, earthquakes, volcanic eruptions, and tsunamis, among others, as well as multi-hazard events. It also welcomes presentations on novel AI methods (including advances in automated annotation, explainability, etc.) that are hazard-agnostic.
Short-term Earthquakes Forecast (StEF) and multi-parametric time-Dependent Assessment of Seismic Hazard (t-DASH) | Virtual PICO
The major contribution to the development of operational t-DASH systems, suitable for supporting decision makers with continuously updated seismic hazard scenarios, is expected to come from the real-time integration of multi-parametric observations. A very preliminary step in this direction is the identification of those parameters (seismological, chemical, physical, biological, etc.) whose space-time dynamics and/or anomalous variability can be, to some extent, associated with the complex process of preparation of major earthquakes.
This session therefore encourages studies devoted to demonstrating the added value of introducing specific observations and/or data analysis methods within the t-DASH and StEF perspectives. Studies based on long-term data analyses, including different conditions of seismic activity, are particularly encouraged. Equally welcome are presentations of infrastructures devoted to maintaining and further developing our present observational capabilities for earthquake-related phenomena, thereby also contributing to building a global multi-parametric Earthquakes Observing System (EQuOS) to complement the existing GEOSS initiative.
To this aim, this session is addressed not just to seismology and natural hazards scientists but also to geologists and to researchers in atmospheric sciences and electromagnetism, whose collaboration is particularly important for fully understanding the mechanisms of earthquake preparation and their possible relation to other measurable quantities. For this reason, all contributions devoted to the description of genetic models of earthquake precursory phenomena are equally welcome.
Machine learning (ML) and Deep Learning (DL) have seen accelerated adoption across Hydrology and the broader Earth Sciences. This session highlights the continued integration of ML, and its many variants, including DL, into traditional and emerging hydrology-related workflows. Abstracts are solicited related to novel theory development, novel methodology, or practical applications of ML in hydrological modeling. This might include, but is not limited to, the following:
(1) Development of novel DL models or modeling workflows.
(2) Integrating DL with process-based models and/or physical understanding.
(3) Improving understanding of the (internal) states/representations of ML/DL models.
(4) Understanding the reliability of ML/DL, e.g., under non-stationarity.
(5) Deriving scaling relationships or process-related insights with ML/DL.
(6) Modeling human behavior and impacts on the hydrological cycle.
(7) Hazard analysis, detection, and mitigation.
(8) Natural Language Processing in support of models and/or modeling workflows.
Clustering in hydrology: methods, applications and challenges | Virtual PICO
Clustering analysis is a well-known exploratory task for partitioning databases into smaller groups based on patterns or inherent similarity in the data. Clustering methods have found applications in many disciplines, owing to growing interest in unravelling the hidden, meaningful patterns that exist in large amounts of available data. Due to its unsupervised nature, clustering is a complex task that requires careful choices regarding methods, model parameters and performance metrics. However, the suitability of clustering algorithms depends on the application, and many different methods and approaches co-exist. The challenge is to obtain application-specific insights while enabling a practical knowledge perspective for benchmarking. There are still research gaps in the wider clustering literature, and hydrology-specific knowledge is fragmented and difficult to find.
In hydrology, unsupervised classification of multivariate data is often used but typically in rather basic forms and as an intermediate step. Recently, the number of studies using clustering methods has rapidly increased. However, a clear and integrative vision on clustering algorithms is currently missing. Despite advances in other fields, the scope of hydrological studies is limited. Knowledge exchange on hydrology-specific ways of dealing with the issues related to clustering is needed.
The aim of this session is to explore theoretical and conceptual underpinnings of well-known clustering methods, offer fresh insights into applications of new clustering methods, gain thorough understanding of pearls and pitfalls in clustering algorithms, provide a critical overview of the main challenges associated with data clustering process, discuss major research trends and highlight open research issues. It is expected to improve scientific practice within the hydrology domain, and foster scientific debate on benchmarking in cluster analysis.
We invite contributions (case studies, comparative analyses, theoretical experiments) on a wide range of topics including (but not limited to): hard vs fuzzy clustering; comparison of clustering algorithms; benchmarking in cluster analysis; clustering as an exploratory tool vs clustering as a hypothesis testing tool; determination of number of clusters; selecting variables to cluster upon; evaluation of clustering performance; alternative clustering methods (sequential, evolutionary, deep, ensemble, etc.)
Recent developments in machine learning (ML) are transforming Earth observation data analysis and modelling of the Earth system and its constituent processes. While statistical models have been used for a long time, state-of-the-art machine and deep learning algorithms allow encoding non-linear, spatio-temporal relationships robustly without sacrificing interpretability. These advances have the potential to accelerate climate science by improving our understanding of the underlying processes, reducing and better quantifying uncertainty, and even making predictions directly from observations across different spatio-temporal scales.
This session aims to provide a venue to present the latest progress in the use of ML applied to all aspects of climate science including, but not limited to:
- Causal discovery and inference
- Learning (causal) process and feature representations in observations
- Hybrid models (physically informed ML)
- Novel detection and attribution approaches
- Probabilistic modelling and uncertainty quantification
- Explainable AI applications to climate science
Please consider submitting abstracts focussed on ML for model improvement, particularly for near-term (including seasonal) forecasting to the companion “ML for Earth System modelling” session.
Tom Beucler, Christina Heinze-Deml
Advances in geomorphometry and landform mapping: possibilities, challenges and perspectives | Virtual PICO
Geomorphometry and landform mapping are important tools used for understanding landscape processes and dynamics on Earth and other planetary bodies. The recent rapid advances in technology and data collection methods have made available vast quantities of geospatial data offering unprecedented spatio-temporal range, density, and resolution, but this has also created new challenges in terms of data processing and analysis.
This inter-disciplinary session on geomorphometry and landform mapping aims to bridge the gap between process-focused research fields and the technical domain where geospatial products and analytical methods are developed. The increasing availability of a wide range of geospatial datasets requires the continued development of new tools and analytical approaches as well as landform/landscape classifications. However, a potential lack of communication across disciplines results in efforts to be mainly focused on problems within individual fields. We aim to foster collaboration and the sharing of ideas across subject-boundaries, between technique developers and users, enabling us as a community to fully exploit the wealth of geospatial data that is now available.
We welcome perspectives on geomorphometry and landform mapping from ANY discipline (e.g. geomorphology, planetary science, natural hazard assessment, computer science, remote sensing). This session aims to showcase both technical and applied studies, and we welcome contributions that present (a) new techniques for collecting or deriving geospatial data products, (b) novel tools for analysing geospatial data and extracting innovative geomorphometric variables, (c) mapping and/or morphometric analysis of specific landforms as well as whole landscapes, and (d) mapping and/or morphometric analysis of newly available geospatial datasets. Contributions that demonstrate multi-method or inter-disciplinary approaches are particularly encouraged. We also actively encourage contributors to present tools/methods that are “in development”.
Novel data, methods and applications in Geomorphometry
Geomorphometry, the science of quantitative land surface analysis, gathers various mathematical, statistical and image processing techniques to quantify morphological, hydrological, ecological and other aspects of a land surface. The typical input to geomorphometric analysis is a square-grid representation of the land surface: a digital elevation model (DEM) or one of its derivatives. DEMs provide the backbone for many studies in the geosciences, hydrology, land use planning and management, Earth observation and natural hazards.
One topic of active research concerns compromises between the use of global DEMs at 1-3 arc second, ~30-90 m grid spacing, and local LiDAR/structure from motion (SFM) elevation models at 1 m or finer grid spacing. Point clouds from LiDAR, either ground-based or from airborne vehicles, are a generally accepted reference against which to assess the accuracy of other DEMs. SFM data have a resolution comparable to LiDAR point clouds but can cost significantly less to acquire for smaller areas. Globally available DEMs include the recently published Copernicus GLO-90 and GLO-30. This session provides an exciting forum to show potential applications of these new DEMs and their improvements over SRTM. We would like to investigate the trade-off between the two kinds of data, and applications which can benefit from data at both (local and global) scales.
The improvements in the global DEMs, as well as the increasing availability of much finer resolution LiDAR and SFM DEMs, call for new analytical methods and advanced geo-computation techniques, necessary to cope with diverse application contexts. We aim at investigating new methods of analysis and advanced geo-computation techniques, including high-performance and parallel computing implementations of specific approaches.
Commercial applications of DEM data and of geomorphometric techniques can benefit important business sectors. Besides a proliferation of applications that can tolerate low accuracy geographical data and simple GIS applications, a large base of professionals use high-resolution, high-accuracy elevation data and high-performance GIS processing. We would like to survey and investigate professional, commercial and industrial applications, including software packages, from small enterprises to large companies, to ascertain how academic researchers and industry can work together.
ESSI2 – Data, Software and Computing Infrastructures Across Earth and Space Sciences
Programme group scientific officers:
Solutions and Applications on HPC and Cloud Infrastructures
This session aims to highlight Earth Science research concerned with state-of-the-art computational and data infrastructures, such as HPC (supercomputers, clusters, accelerator-based systems), data platform solutions, or portable infrastructure-as-a-service for public and commercial cloud offerings. The session presents an opportunity for everyone to present and learn from results achieved, success stories, and experience gathered during the process of study, adaptation and exploitation of these systems.
Further contributions are welcome that showcase platforms and services supporting Earth Science applications on HPC systems, Cloud infrastructures, or a combination of both, e.g. to increase effectiveness, robustness or ease of use.
Topics of interest include:
- Data intensive Earth Science applications and how they have been adapted to different HPC infrastructures
- Data mining software stacks in use for large environmental data
- HPC simulation and High Performance Data Analytics e.g. code coupling, in-situ workflows
- Experience with Earth Science applications in Cloud environments e.g. solutions on Amazon EC2, Microsoft Azure, and Earth Science simulation codes in private and European Cloud infrastructures (Open Science Cloud)
- Infrastructure-as-a-Service solutions in various Clouds, commercial and public. We would like to see microservice architectures combining access to data centres with the scheduling of jobs to HPC systems
- Tools and services for Earth Science data management, workflow execution, web services and portals to ease access to compute resources
Cloud-based Workflows for Big Earth Data for Operational Decision Making
In recent years, new cloud-based systems for Big Earth data have emerged, aiming to support easier data access as well as workflow and application development. The current landscape of systems for Big Earth data ranges from pure cloud-based services, e.g. the Copernicus Data and Information Access Services (DIAS), to more ‘user-friendly’ platforms, including Google Earth Engine, the Open Data Cube initiative, and the CASEarth platform. All claim to address user needs by making large volumes of data more easily accessible and providing powerful processing resources. And yet, from a user perspective, these systems differ in their underlying technologies, level of openness, level of abstraction, and the data and functionalities they offer.
We are interested in practical application examples of developing a workflow with the help of a cloud-based service or platform. Aspects we are interested in include, but are not limited to:
- Are data and functionalities sufficient for complex application needs?
- Can existing workflows easily be transferred to a cloud-based system? If yes, what are the benefits compared to traditional, local processing? If no, what are obstacles or limitations (organisational, technical, ...)?
- What type of system is preferred? Pure cloud-based services or more ‘user-friendly’ platforms?
The purpose of this session is to discuss the advantages and limitations of current (cloud-based) services and platforms for Big Earth data from a user perspective and to collect feedback on future requirements and needs. We are specifically interested in contributions related to:
- Successful (or unsuccessful) applications of current cloud-based services or platforms;
- Experience in migrating existing applications and workflows to a cloud-based system/service, including benefits, challenges and limitations;
- Perspectives on requirements or needs for the (short and long-term) evolution of cloud-based services.
Established and Establishing Disciplinary International Frameworks that will Ultimately Enable Real-Time Interdisciplinary Sharing of Data
As we increasingly face global challenges such as climate change, pandemics, and the environmentally sustainable exploitation of our resources, there is a greater urgency to bring together the multiple existing data/information infrastructure systems distributed around the world to create machine-actionable, interoperable, reusable, real-time data sharing frameworks.
The problem is that research can be a ‘competitive’ process, and there is a tendency for this competition to focus on which data sharing system or data standard is the best, supposedly THE one that everyone SHOULD use.
An alternative approach is to build loosely coupled frameworks that allow multiple existing systems to interoperate, but still, preserve their deeper disciplinary specialization. For this approach to work, there will need to be agreement on 1) the minimum core variables for sharing data content, and 2) the technical standards/technologies required to enable real-time data interoperability.
There are well-established examples of groups facilitating global data sharing (e.g., Federation of Digital Seismograph Networks (FDSN), OneGeology, Earth System Grid Federation (ESGF), OGC, W3C, GEO). Many new groups are starting to form global disciplinary data networks: some are already trying to link frameworks together to enable global interdisciplinary sharing of data (e.g., CODATA/DDI Cross-Domain Data Initiative).
This session seeks contributions from any group that has established or is establishing a data-sharing infrastructure system/framework regardless of scale, as well as those that are attempting global and/or interdisciplinary networking. Topics may range from (meta)data standards, defining minimum core content variables, or be focused on technologies or organizational setups for enabling data sharing. Papers on the social dynamics of building sharing systems/frameworks are also welcome.
Metadata, Data Models, Semantics, and Collaboration
Earth systems science is fundamentally cross-disciplinary, and increasingly this requires sharing and exchange of geoscientific information across discipline boundaries. This information can be both rich and complex, and content is not always readily interpretable by either humans or machines. Difficulties arise through differing exchange formats, lack of common semantics, divergent access mechanisms, etc.
Recent developments in distributed, service-oriented, information systems using web-based (W3C, ISO, OGC) standards are leading to advances in data interoperability. At the same time, work is underway to understand how meaning may be represented using ontologies and other semantic mechanisms, and how this can be shared with other scientists.
This session aims to explore developments in interoperable data sharing, and the representation of semantic meaning to enable interpretation of geoscientific information. Topics may include, but are not limited to:
- standards-based information modelling
- interoperable data sharing
- use of metadata
- knowledge representation
- use of semantics in an interoperability context
- application of semantics to discovery and analysis
- metadata and collaboration
Challenges and Advances Towards Exascale Computing in Earth System Modelling
Current pre-exascale computing systems and the strong push towards exascale warrant a substantial effort from Earth System Model (ESM) developers. Exascale ESMs are expected to open up a new range of opportunities: larger domain sizes, longer simulations, higher model resolution, larger ensemble simulations, and new components and physics.
The exascale challenges are manifold: leveraging accelerators, coupling models for different compartments, adding processes, parametrisations, and feedbacks relevant at higher resolutions, ensuring accuracy and stability, dealing with increased I/O, and enabling big-data workflows. Solutions to these challenges must also future-proof codes, pursue performance portability across current and future architectures (e.g. through domain-specific languages), ensure code sustainability (including readability, maintenance, etc.), and facilitate the development of new and improved ESM codes. Additionally, exascale-ready ESM codes are expected to be performant in terms of both computational efficiency and energy efficiency. This expectation requires an intense cross-disciplinary effort between geoscientists of all Earth system compartments, computer scientists, applied mathematicians, and software engineers.
Harnessing exascale resources requires a range of solutions, from improving the parallel efficiency and scalability of numerical methods and coupling strategies, to porting code to accelerators, to leveraging heterogeneous and modular computing. Exascale ESMs will have a stronger relationship to Machine Learning (ML) in many applications, such as ML-based ensemble and probabilistic modelling, inverse modelling, data assimilation, uncertainty quantification, ML-based model surrogates and ML-driven model operation.
This session aims to be a forum to discuss challenges and solutions involving domain scientists, applied mathematicians, computer scientists, and HPC experts. We welcome contributions addressing open challenges and advances to achieve exascale-readiness for ESM across all Earth System compartments, methods and technologies. We especially encourage contributions discussing challenges and advances that may be applicable to modelling efforts across Earth system compartments (and coupling) towards exascale and digital twins. The session intends to be broad and transverse, but we also acknowledge that solutions are done with specific applications in mind, which are welcome to stimulate cross-fertilisation with other applications.
Pangeo: The Modern Solution to Challenges in Earth & Space Science Using a Community Driven Approach
Earth & Space Science Informatics (ESSI) comprises, among other things, the science and technology behind Earth and Space Observation. In particular, Earth Observation (EO) is a fundamental requirement for, among other things, keeping track of a fast-changing climate and its increasingly negative effects on society.
Such an effort is interdisciplinary by definition and requires both rigorous scientific research and proper software development. The challenge at hand can be summarized as: dataset growth (i.e. Big Data), the gap between industrial and research approaches (i.e. the Technological Gap), and fragmentation between different solutions (i.e. Reproducibility).
Pangeo (pangeo.io) is a community of researchers and developers who tackle these issues in a collaborative manner; a Python ecosystem has been developed with support from partners including the US NSF EarthCube Program, NASA, LDEO Columbia University, NCAR, the Met Office, and Anaconda.
This session's aim is twofold:
(1) to motivate researchers who are using or developing within the Pangeo framework to share their endeavors with a broader community that can benefit from these new tools.
(2) to contribute to the Pangeo community in terms of potential new applications for the Pangeo ecosystem, whose core packages include xarray, Iris, Dask, Jupyter, Zarr and Intake.
Topics are broad in content: if you are using at least one of Pangeo’s core packages, we want to hear from you:
- Atmosphere, Ocean and Land Models
- Satellite Observations
- Infrastructure and Deployment: HPC, Cloud computing (publicly funded and commercial clouds)
- Scientific Software Practices
- Machine Learning using Pangeo
- Scalable scientific computing
- And other related applications
Authors will have the option to submit Jupyter notebooks of their work; the best 5 will be selected for the Pangeo applications gallery of EGU22. Examples of galleries can be found at http://gallery.pangeo.io/. This is an optional step and not a requirement for submission.
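As an illustration of the kind of workflow these core packages enable, the following sketch computes a monthly climatology with xarray from a small synthetic dataset. The variable name, grid, and values are invented for the example; a real analysis would typically open netCDF or Zarr data (e.g. with xr.open_zarr) and chunk it with Dask.

```python
# Illustrative Pangeo-style analysis: a monthly climatology with xarray.
# The dataset is synthetic; in practice it would be opened from netCDF
# or Zarr and chunked with Dask for out-of-core, parallel computation.
import numpy as np
import pandas as pd
import xarray as xr

time = pd.date_range("2000-01-01", periods=24, freq="MS")
seasonal = 15 + 8 * np.sin(2 * np.pi * time.month.values / 12)
noise = np.random.default_rng(0).normal(0, 1, (24, 4, 5))

temp = xr.DataArray(
    seasonal[:, None, None] + noise,
    dims=("time", "lat", "lon"),
    coords={"time": time,
            "lat": np.linspace(-45, 45, 4),
            "lon": np.linspace(0, 144, 5)},
    name="t2m",
)

# Group the two years of data by calendar month and average over time
climatology = temp.groupby("time.month").mean("time")
print(dict(climatology.sizes))  # {'month': 12, 'lat': 4, 'lon': 5}
```

The same groupby call scales unchanged from this in-memory toy to terabyte-scale archives once the underlying array is Dask-backed, which is the central design idea of the Pangeo stack.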
Lossy and Lossless Compression for Greener Geoscientific Computing and Data Storage | Virtual PICO
The ongoing explosion in size of geoscientific model and measurement datasets has generally been accommodated by a combination of increased storage space (i.e., brute force) and older lossless data compression techniques. Newer, more efficient, and faster lossy and lossless compression techniques can significantly mitigate storage growth and accelerate workflows without sacrificing data of scientific value. Reduced storage requirements lower data center power consumption and its attendant consequences for greenhouse gas emissions and environmental sustainability. Thus modern data compression techniques allow researchers to analyze and/or generate more data with a greener climate footprint.
This session invites presentations on all aspects of how geosciences can shift towards greener computing by adopting modern data compression techniques including, though not limited to: algorithmic advances, assessments of geoscientific computing and data storage sustainability, compression efficiency and speed in software and/or hardware, implementation in MIPs (e.g., CMIP7), interoperability issues, metadata standards (e.g., CF), remote sensing applications, and support in widely used languages (e.g., C/C++, Fortran, Java, Python), data storage formats (e.g., HDF, netCDF, Zarr), and Open Source workflows (e.g., CDO, NCO, R, Xarray).
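As a concrete illustration of why lossy preprocessing helps, the sketch below truncates float32 mantissas before applying a standard lossless codec. The number of kept bits and all names here are illustrative choices for the example, not a community standard.

```python
# Sketch: lossy mantissa truncation followed by lossless compression.
# Zeroing the trailing mantissa bits discards false precision, so a
# lossless codec (zlib here) compresses the result far better than the
# raw array. The choice of 10 kept bits is illustrative, not a standard.
import zlib
import numpy as np

def bitround(a: np.ndarray, keepbits: int) -> np.ndarray:
    """Keep only `keepbits` explicit mantissa bits of a float32 array."""
    drop = 23 - keepbits                        # float32 has 23 mantissa bits
    mask = np.uint32(0xFFFFFFFF ^ ((1 << drop) - 1))
    return (a.view(np.uint32) & mask).view(np.float32)

rng = np.random.default_rng(0)
data = rng.normal(size=100_000).astype(np.float32)

raw = data.tobytes()
lossless_only = zlib.compress(raw, level=9)
lossy_then_lossless = zlib.compress(bitround(data, 10).tobytes(), level=9)

print(f"lossless-only ratio:    {len(raw) / len(lossless_only):.2f}")
print(f"truncate-then-compress: {len(raw) / len(lossy_then_lossless):.2f}")
```

Keeping 10 mantissa bits bounds the relative truncation error below 2^-10 (about 0.1%), while the zeroed bits let the lossless stage achieve a substantially higher compression ratio than on the raw array.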
Remote Sensing and GIS for climate-related hazards in natural and man-made environments
The occurrence of climate-related hazards, and their consequences, has increased in recent decades, driven by climate change and by growing human activity and infrastructure development, particularly in vulnerable areas. To reduce damage and losses, more effort should be directed towards effective disaster risk management, with a focus on hazard, vulnerability and elements-at-risk mapping.
Remote Sensing (RS) and Geographic Information Systems (GIS) have proved to be powerful tools for monitoring and mapping change, and rates of change, in relation to hydrological hazards, particularly in data-scarce environments, thanks to the great advantage of sensing extended areas at low cost and with regular revisit capability. Furthermore, RS offers the opportunity to gain fresh insights into biophysical environments through the spatial, temporal, spectral and radiometric resolutions of satellite systems. The advantages of RS are further supported by the analytical and geospatial data integration capabilities of GIS.
The main goal of this session is to present the recent advancements and range of applications in the fields of hazard monitoring and early warning, using RS (active and passive sensors, Lidar, UAVs, thermal, etc.) supported by GIS for the successful assessment and management of climate-related hazards. In particular, this session intends to give the floor to novel studies and applications in the analysis of Earth Observation (EO) and other geospatial data for the detection, monitoring, modeling and mapping of phenomena such as floods, landslides, soil erosion, droughts, etc. Water resources management, urban and cultural heritage management, and agriculture adaptation to address extreme conditions will be thoroughly discussed.
The session aims to serve a diverse community of research scientists, practitioners, end-users and decision-makers. Early Career Scientists (ECS) are strongly encouraged to present their research.
In recent decades, advances in geoinformation technology have played an increasingly important role in determining the various parameters that characterize the Earth's environment. This session invites contributions focusing on modern open-source software tools developed to facilitate the analysis of mainly geospatial data in any branch of the geosciences, for the purpose of better understanding Earth’s natural environment. We encourage contributions on any kind of open source tool, including those built on top of globally used commercial GIS solutions. Potential topics for the session include software tools developed for displaying, processing and analysing geospatial data, and modern cloud webGIS platforms and services used for geographical data analysis and cartographic purposes. We also welcome contributions that focus on tools that make use of parallel processing on high-performance computers (HPC) and graphics processing units (GPUs), as well as on simulation process models applied in any field of the geosciences.
In many scientific disciplines, accurate, intuitive, and aesthetically pleasing display of geospatial information is a critical tool. PyGMT (https://www.pygmt.org) - a Python interface to the Generic Mapping Tools (GMT) - is a mapping toolbox designed to produce publication-quality figures and maps for insertion into posters, reports, and manuscripts. This short course is geared towards geoscientists interested in creating beautiful maps using Python. Only basic Python knowledge is needed, and a background in cartography is not required to use PyGMT effectively! By the end of this tutorial, students will be able to:
- Craft basic maps with geographic map frames using different projections
- Add context to their figures, such as legends, colorbars, and inset overview maps
- Use PyGMT to process PyData data structures (xarray/pandas/geopandas) and plot them on maps
- Understand how PyGMT can be used for various applications in the Earth sciences and beyond!
The 3.5 hour long short course will be based on content adapted from https://github.com/GenericMappingTools/2021-unavco-course and https://github.com/GenericMappingTools/foss4g2019oceania. Each of the 30–45 minute sessions will involve a quick (~10 minute) walkthrough by the speaker, followed by a more hands-on session in breakout rooms where tutorial participants work on the topic (using interactive Jupyter notebooks) in a guided environment with one of four instructors on hand to answer questions.
We expressly welcome students and geoscientists working in any geo-related field (e.g. Earth observation, geophysics, marine, magnetic, gravity, planetary, etc.) to join. Come and find out what PyGMT can do to level up your geoprocessing workflow!
ESSI3 – Open Science Informatics for Earth and Space Sciences
Programme group scientific officers:
Best Practices and Realities of Research Data Repositories: Balancing the needs of Repositories, Researchers and Publishers | Virtual PICO
As funders and publishers increasingly require that research data be made publicly available, research data repositories, especially in the Earth and environmental sciences, play a new and major role in the publication process. They are on one hand supporting researchers during the data publication process and, on the other hand, need to present the data in a way that they are fully integrable in the ecosystem of modern scientific communication as required by the FAIR Data Principles. More recent developments, like the CoreTrustSeal Certification and the Enabling FAIR Data Commitment Statement have defined additional benchmarks and expectations for the capabilities of repositories.
How do repositories comply with increasing expectations for machine accessibility of their data and the requirements for machine learning (particularly for long tail data)? How do researchers know which repositories meet these benchmarks and future expectations? How should publishers work together with repositories and researchers to ensure a more complete record of science? What role can data journals and editors play?
This session will showcase the range of practices in research data repositories, data publication and the integration of data, software, samples, models and notebooks into the scholarly publication process. It invites repositories, researchers, information scientists, journals, and editors to discuss challenges they are facing in meeting community best practice.
Making Geoanalytical Data FAIR: Managing Data from Field to Laboratory to Archive to Publication
Globally, geoscience and research analytical laboratories collect ever increasing volumes of data: an acute challenge now is how to collate, store and make these data accessible in a standardised, interoperable and machine-accessible form that is FAIR. Many solutions today are bespoke and inefficient, lacking, for example, unique identification of samples, instruments, and data sets needed to trace the analytical history of the data; and there are few community agreed standards to facilitate sharing and interoperability between systems.
The push for a solution is being driven by publishers and journals who increasingly require researchers to provide access to the supporting data from a trusted repository prior to publication of manuscripts or finalisation of grants. We urgently need community development of systems to facilitate easy and efficient management of geoanalytical laboratory data. We need to address the lack of global standards, best practices and protocols for analytical data management and exchange, in order for scientists to better share their data in a global network of distributed databases. Buy-in from users and laboratory managers/technicians is essential in order to develop efficient and supported mechanisms.
This session seeks a diversity of papers from any initiative around the world that organises and structures sample/field metadata and research laboratory data at any scale to facilitate sharing and processing of geoanalytical data. We welcome papers on data and metadata standardisation efforts and papers on data management and systems that transfer data/metadata from instruments to shared data systems and relevant persistent repositories. Efforts on how to collate, curate, share and publicise sample/data collections as well as papers on the social dynamics of building sharing systems/frameworks are also welcome.
Free and Open Source Software (FOSS), Cloud-based Technologies and HPC to Facilitate Collaborative Science | Virtual PICO
Earth science research has become increasingly collaborative through shared code and shared platforms. Researchers work together on data, software and algorithms to answer cutting-edge research questions. Teams also share these data and software with other collaborators to refine and improve these products. This work is supported by Free and Open Source Software (FOSS) and by shared virtual research infrastructures utilising cloud and high-performance computing.
Software is critical to the success of science. Creating and using FOSS enhances collaboration and innovation in the scientific community, creates a peer-reviewed and consensus-oriented environment, and promotes the sustainability of science infrastructures.
This session will showcase solutions and applications based on Free and Open Source Software (FOSS), cloud-based architectures and high-performance computing that support information sharing, scientific collaboration, and large-scale data analytics.
Improving the state of Research Software across the EGU community
Software lies at the heart of all research conducted across the EGU community, from the use of edge computing in bespoke laboratory environments to earth system models running on high performance computing centres. A growing community of researchers and software developers have started to gather under the umbrella of Research Software Engineering (RSEng) and argue that research software is not merely a by-product of science, but effective and sustainable development of research software needs a skillset and resources beyond current academic education or management plans. Following the great debate ‘Improving Research Software in the Geosciences’ at EGU21, it is clear that challenges remain. However, there are also clear opportunities to enable positive change. With this in mind, in this session we want to hear from the EGU community on the entire spectrum of emerging and required developments that act to improve the state of Research Software across the EGU community. We welcome any contributions. This could include, but is not limited to:
Improved software citation mechanisms.
Policy changes to better support Research Software at funders, conferences, organizations, and journals.
Training and career progression initiatives.
Success stories of sustainable Research Software.
Approaches for sustainable development of Research Software.
Cultural changes to improve the state of Research Software.
Innovative Evaluation Frameworks and Platforms for Weather and Climate Research | Virtual PICO
Comprehensive evaluations of Earth Systems Science Prediction (ESSP) systems (e.g., numerical weather prediction, hydrologic prediction, climate prediction and projection, etc.) are essential to understand sources of prediction error and to improve earth system models. However, numerous roadblocks limit the extent and depth of ESSP system performance evaluations. Observational data used for evaluation are often not representative of the physical structures being predicted. Satellite and other large spatial and temporal observational datasets can help provide this information, but the community lacks tools to adequately integrate these large datasets and provide meaningful physical insights into the strengths and weaknesses of predicted fields. ESSP system evaluations also require large storage volumes to handle model simulations, large spatial datasets, and verification statistics, which are difficult to maintain. Standardization, infrastructure, and communication within one scientific field are already a challenge; bridging different communities to allow knowledge transfer is even harder. The development of innovative methods in open frameworks and platforms is needed to enable meaningful and informative model evaluations and comparisons for many large Earth science applications, from weather to climate.
The purpose of this Open Science 2.0 session is to bring experts together to discuss innovative methods for integrating, managing, evaluating, and disseminating information about the quality of ESSP fields in a meaningful way. Presentations of these innovative methods applied to Earth science applications are encouraged. The session should generate interest among communities and research projects building and maintaining such systems (e.g. ESMVal, Copernicus, Climaf, Freva, Birdhouse, MDTF, UV-CDAT, CMEC - PCMDI Metrics Package, Doppyo, MET-TOOLS, CDO, NCO, etc.). The session allows room for the exchange of ideas. An intended outcome of this session is to connect scientists and to develop a list of tools and techniques that could be provided to the community in the future.
Showcases of Implementations and Benefits Based on GEO Data Management and Data Sharing Principles Supporting Open Science in the Earth and Space Research Domain
Since its inception, the intergovernmental Group on Earth Observations (GEO) has been a strong advocate for Open Science, especially open access to Earth observation data, information and knowledge generation. Openness in science is fundamental to ensuring equality of access, accelerating scientific progress, and disseminating knowledge. Since 2005, GEO has created a set of principles for sharing content that make information and data across the research lifecycle findable, accessible and reusable.
GEO provides the GEOSS Data Sharing Policy, which, together with the GEO Data Management Principles (DMP), forms a comprehensive set of principles guiding data sharing and data management. In 2020, a new GEO Data Working Group was formed, which focuses on evolving these documents to address cloud-based systems, machine-actionable processing, Analysis Ready Data, etc. Carrying out analytic studies of the TRUST, CARE and FAIR Principles, developing a GEO In-Situ Strategy, and dealing with new challenges regarding legal and ethical issues are further actions within this Working Group.
With guest conveners from the US, this session enables a broader view of the cultural frameworks surrounding Open Science.
The aim of this session is to invite a broad community to share success stories and implementations that assess or measure data sharing and data management frameworks in support of Open Science for the Earth observation sciences. Of special interest are comparative analyses of different data management and sharing principles or directives that can deliver showcases demonstrating how barriers are lowered and benefits created.
Proposed topics could be, but are not limited to:
- relationship analyses between different Open Science data management frameworks in the Earth sciences (e.g. EOSC, the European Open Science Cloud)
- the interaction of European directives (e.g. the Open Data Directive, INSPIRE, SEVESO, etc.) with the Earth sciences, including legal and ethical issues
- approaches to assessing and measuring success (e.g. metrics, KPIs)
- benefit mechanisms for data providers demonstrated through proof-of-concept implementations
Participatory Citizen Science and Open Science as a new era of environmental observation for society
Citizen science (the involvement of the public in scientific processes) is gaining momentum across multiple disciplines, increasing multi-scale data production in the Earth sciences and extending the frontiers of knowledge. Successful participatory science enterprises and citizen observatories can potentially be scaled up to contribute to larger policy strategies and actions (e.g. the European Earth Observation monitoring systems), for example by being integrated in GEOSS and Copernicus. Making credible contributions to science can empower citizens to actively participate as citizen stewards in decision making, helping to bridge scientific disciplines and promote vibrant, liveable and sustainable environments for inhabitants across rural and urban localities.
Often, citizen science is seen in the context of Open Science, which is a broad movement embracing Open Data, Open Technology, Open Access, Open Educational Resources, Open Source, Open Methodology, and Open Peer Review. Before 2003, the term Open Access was related only to free access to peer-reviewed literature (e.g., Budapest Open Access Initiative, 2002). In 2003 and during the “Berlin Declaration on Open Access to Knowledge in the Sciences and Humanities”, the definition was considered to have a wider scope that includes raw research data, metadata, source materials, and scholarly multimedia material. Increasingly, access to research data has become a core issue in the advance of science. Both open science and citizen science pose great challenges for researchers to facilitate effective participatory science, yet they are of critical importance to modern research and decision-makers.
We want to ask and find answers to the following questions:
Which approaches and tools can be used in Earth and planetary observation?
What are the biggest challenges in bridging between scientific disciplines and how to overcome them?
What kinds of participatory citizen scientist involvement (e.g. how citizen scientists are involved in research, and which kinds of groups are involved) and open science strategies exist?
How to ensure transparency in project results and analyses?
What kind of critical perspectives on the limitations, challenges, and ethical considerations exist?
How can citizen science and open science approaches and initiatives be supported on different levels (e.g. institutional, organizational, national)?
ESSI4 – Advanced Technologies and Informatics Enabling Transdisciplinary Science
Programme group scientific officers:
Bridging the spatial scales, from surface sensors to satellite sensors: Innovative approaches towards the construction of Earth’s digital twin
To drive step-changes in our understanding of global environmental change, and improve prospective analysis (predictions), intelligent new methods to combine large-scale satellite and local, detailed surface sensor data within physics-based environmental simulators are needed.
This session aims to present innovative data science approaches combining information across spatial scales and various modalities. The session intends to showcase advances in mechanisms and tools tackling this common, crucial problem observed in a broad range of environmental issues. By covering a diverse range of environmental domains and modalities, we propose that it will be easier to identify research synergies and opportunities for future collaborations.
We encourage abstracts which present innovative data science solutions, including, but not limited to:
• Uncertainty and probabilistic modelling in environmental science
• Intelligent placement of Earth sensors
• Physically aware data-driven methods using existing environmental datasets
• Computer vision methodologies and tools for detecting and tracking planetary change from remote sensors
• Agent-based systems for modelling and feedback in digital replicas of Earth’s systems
The session is organized as a standard (oral) session.
Earth and Space Science Sensing: Informatics, Challenges and Innovation
The aim of this session is to bring together researchers investigating the most recent technology and informatics to sense different environments. There is a lot of experimental work carried out every year to obtain data, ranging from trying new sensors or telemetry to improving the necessary craft to deploy them. We welcome presentations on all aspects of sensing: techniques, engineering, radio networks and the whole pipeline from sensing to data analysis and integration.
We particularly invite papers from students and researchers across all fields of earth and space science. The session will provide an opportunity for cross-topic exchanges which will help us learn from each other.
Faster Uptake of Earth Observation Based Services for the Geosciences
One of today's challenges in the Earth sciences is the continuous evolution of technologies, making it hard for users and developers to be up to date and take advantage of the most advanced solutions.
On the other hand, there have never been such favorable conditions for the development of Earth observation based services and applications targeting both the public and private sectors.
For this session we invite abstracts that present operational applications for scientific services across the geosciences, including ocean research, forest monitoring, crop monitoring and more. By presenting these success stories we hope to initiate a discussion between different actors involved in making and using services addressing gaps and needs for faster uptake of Earth observation based services.
New frontiers of multiscale monitoring, analysis, modeling and decisional support (DSS) of environmental systems
Environmental systems often span spatial and temporal scales covering different orders of magnitude. The session is oriented toward collecting studies relevant to understanding the multiscale aspects of these systems and toward proposing adequate multi-platform, interdisciplinary surveillance networks and monitoring tools. It especially aims to emphasize the interaction between environmental processes occurring at different scales. In particular, special attention is devoted to studies focused on the development of new techniques and integrated instrumentation for the multiscale monitoring of high natural risk areas, such as those affected by volcanic and seismic activity, energy exploitation, slope instability, floods, coastal instability and climate change, as well as other environmental contexts.
We expect contributions from several disciplines, such as applied geophysics, geology, seismology, geodesy, geochemistry, remote and proximal sensing, volcanology, geotechnics, soil science, marine geology, oceanography, climatology, and meteorology. In this context, contributions on the analytical and numerical modeling of geological and environmental processes are also expected.
Finally, we stress that interdisciplinary studies highlighting the multiscale properties of natural processes, analyzed and monitored using several methodologies, are welcome.
Building an observing system in Svalbard and associated waters using remotely sensed observations | Virtual PICO
Remotely sensed observations from ground, air (aircraft, UAVs), and satellite instruments are an important means of comprehensively understanding processes within the Earth system. This includes understanding processes in the cryosphere (e.g. glaciological processes, icebergs), the atmosphere (e.g. climatology, aerosols), the terrestrial component (e.g. vegetation, hydrology) and the oceans (e.g. sea ice, ocean colour properties, circulation patterns). The development of new remote sensing techniques and data products is vital to ensuring the long-term sustainability of the Svalbard observing system and its contribution to answering Earth system science questions. Further, Svalbard is uniquely positioned between the mild (Atlantic) oceanic and harsh Polar conditions, a feature which attracts international researchers seeking to conduct cal/val studies for various space agencies (e.g. ESA).
This session invites contributions from the community that develops and extends remote sensing observations around the Svalbard archipelago. We welcome studies that investigate new remote sensing applications in Svalbard, the development of new sensors/instruments for Svalbard research, cal/val activities, the expansion of existing monitoring networks and long-term measurements of key environmental parameters. The spatial scale of contributions may vary between localised field studies with ground-based instruments to regional monitoring from space-based platforms. Numerical modelling studies are also welcome, particularly if they utilise data sets acquired through the Svalbard observing system. This session will help identify gaps across the existing observational networks which organisations such as the Svalbard Integrated Arctic Earth Observing System (SIOS) will use to improve the monitoring system in the future.
Long-term interdisciplinary in-situ observations in the world’s mountains: challenges and opportunities
The world's mountains hold enormous societal and ecological importance. Long-term efforts to effectively monitor the complex socio-ecological systems that are embedded within such regions – many of which are rapidly changing – are urgently required to understand the driving mechanisms and processes involved, and ultimately develop sound quantitative future predictions and management strategies. Indeed, key trends can be missed if long-term observations are not sustained. However, the generally rugged, inhospitable, and inaccessible nature of mountainous terrain, coupled with the strong influence of topography on conditions, represents a persistent challenge to obtaining informative in-situ mountain observations and sustaining these measurements across the time scales on which many underlying system dynamics operate (i.e. decades, not years). That said, the development of improved environmental sensors and options for their remote management (e.g. direct data transfer), as well as the growth of scientific networks (e.g. LTER and GEO Mountains) and their associated standards, infrastructure (eLTER RI & services like DEIMS-SDR, the GEO Mountains in-situ inventory, etc.), and knowledge sharing opportunities are improving the situation in many regards. In this context, we welcome disciplinary and interdisciplinary contributions related to mountainous regions that summarize persistent monitoring experiences, exploit long-term datasets to answer pressing research questions, and provide ideas on how common challenges associated with the development of formal research infrastructure might best be overcome.
Remote sensing measurements, acquired from different platforms - ground, UAV, aircraft and satellite - have become rapidly developing technologies for studying and monitoring the Earth's surface and for performing comprehensive analysis and modeling, with the final goal of supporting decision systems for ecosystem management. The spectral, spatial and temporal resolutions of remote sensors have been continuously improving, making environmental remote sensing more accurate and comprehensive than ever before. Such progress enables understanding of the multiscale aspects of high-risk natural phenomena and the development of multi-platform, inter-disciplinary monitoring tools. The session welcomes contributions focusing on present and future perspectives in environmental remote sensing, from multispectral/hyperspectral optical and thermal sensors. Applications may cover, but are not limited to, the monitoring and characterization of environmental changes and natural hazards from volcanic and seismic processes, landslides, and soil science. Specifically, we are looking for novel solutions and approaches covering the following topics: (i) state-of-the-art techniques focusing on novel quantitative methods; (ii) new applications for state-of-the-art sensors, including UAVs and other close-range systems; (iii) techniques for multi-platform data fusion.
Detailed seabed maps portraying the distribution of geomorphic features, substrates, and habitats are used for a wide range of scientific, maritime industry, and government applications. These maps provide essential information for ocean industry sectors and are used to guide local and regional conservation action. Fundamental to seabed mapping are acoustic remote sensing technologies, including single beam and multibeam echosounders and sidescan, interferometric, and synthetic-aperture sonars. These are deployed on a variety of crewed and robotic surface and underwater platforms. In shallow clear waters, optical sensors including LiDAR, multispectral, and hyperspectral cameras are also increasingly employed from aircraft, drones, and satellites to create maps of the seabed. Innovative data processing, image analysis, and statistical approaches for classification are advancing the field of seabed mapping. These methods are yielding increasingly comprehensive and detailed maps. We welcome submissions that provide insights into the use of advanced technologies, novel processing and analytical approaches, and current and emerging applications in the field of seabed mapping and classification – from shallow coastal waters to the deep seafloor.
Geo-infrastructure monitoring: complex data analysis and instrument application | Virtual PICO
Continuous monitoring of infrastructure systems is essential to ensure the reliable movement of people and goods, which underpins economic growth and human interaction. The wide variety of instruments available allows diverse applications that increase data availability for a better understanding of the geotechnical surroundings directly linked to the safe operation of infrastructure, helping to prevent catastrophes such as soil erosion, settlements, liquefaction, landslides, seismic activity, flooding and even wildfires close to highways. Understanding these events is vital to providing safe infrastructure under extreme climate conditions. This session focuses on the application of geoscience and geophysical instrumentation, including sensors, to infrastructure monitoring and on data analysis from critical infrastructures (e.g., roadways, railway systems, bridges, tunnels, water supply, underground utilities, electrical grids, and other embedded facilities in cities). The session aims to increase knowledge of geo-infrastructure management to overcome future challenges associated with societal and human interaction, and to present advanced research and novel approaches from various disciplines, in vibrant interaction with economic and human-interaction studies, towards an efficient infrastructure management system. The session is considered an inter- and transdisciplinary (ITS) session. The applications and topics include, but are not limited to: (1) Advanced knowledge of destructive and non-destructive geoscience and geophysical techniques, including contact and contactless techniques such as sensors. (2) Intelligent data analysis approaches for accurate and precise interpretation of big data sets derived from various technologies (e.g., computer vision, image and signal processing).
(3) Influence of the surrounding areas on infrastructure management systems linked to natural events such as soil erosion, settlements, liquefaction, landslides, seismic activity, flooding, wildfires and extreme weather conditions. (4) Continuous real-time monitoring to provide smart tools, such as the integration of geoscience data with BIM models, the Internet of Things, digital twins, robotic monitoring, artificial intelligence, and automation systems based on machine learning and computational modelling, for better decision-making by infrastructure owners/operators. (5) Computer-aided human-interaction approaches to generate reliable infrastructures.
Thanks to the increasing availability of low-cost hardware and open-source software, recent years have seen a fast-growing number of climate experiments involving large numbers of sensors and instruments. These experiments, often designed to be incorporated into wireless sensor networks, deal with establishing, maintaining and managing fixed environmental sensor networks for near-surface measurements. One of the major achievements of these experiments is to provide data quality metrics for citizen-science projects. This session, open to works about existing or planned environmental wireless sensor networks, has the final goal of reviewing best practices and common problems when dealing with such networks. Our goal is to lay the foundations of a community and create an open repository to collect and harmonize best practices and tools that have proven useful in this new and promising research field.
Climate-related experiments and ground-based monitoring networks are getting bigger, and the number of sensors and instruments involved is growing very fast. Experiments like SPRUCE, ICOS, and NEON have to deal with hundreds of sensors and instruments. The most effective way to manage such large installations is to incorporate all equipment into a network. In this session we would like people to share their experience in establishing, maintaining, and managing fixed sensor networks to monitor atmospheric chemical composition, on- or near-surface measurements, and meteorology (it does not cover remotely sensed data - satellite imagery, aerial photography, etc.). This session is open to all work on an existing system, planning a completely new network, upgrading an existing system, improving streaming data management, and archiving data.
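As a concrete illustration of one recurring streaming-data-management task in such networks, the sketch below applies simple range and step (spike) checks to incoming readings. The thresholds, record layout, and flag names are hypothetical, not taken from any particular network's standard.

```python
# Illustrative quality-control pass for streaming sensor records.
# Thresholds and record layout are hypothetical examples.

def qc_flag(record, valid_range=(-40.0, 60.0), max_step=5.0, last_value=None):
    """Return a QC flag for one air-temperature reading (deg C)."""
    value = record["value"]
    lo, hi = valid_range
    if value < lo or value > hi:
        return "fail_range"       # outside physically plausible bounds
    if last_value is not None and abs(value - last_value) > max_step:
        return "suspect_spike"    # implausible jump between consecutive readings
    return "pass"

stream = [{"value": v} for v in (12.1, 12.3, 30.9, 12.4, -55.0)]
flags, last = [], None
for rec in stream:
    flag = qc_flag(rec, last_value=last)
    flags.append(flag)
    if flag == "pass":            # only accepted values update the reference
        last = rec["value"]

print(flags)  # ['pass', 'pass', 'suspect_spike', 'pass', 'fail_range']
```

In practice such flags would be attached to archived records rather than used to discard data, so that downstream users can apply their own screening criteria.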
Instrumentation and measurement technologies are currently playing a key role in the monitoring, assessment and protection of water resources.
This session focuses on measurement techniques, sensing methods and data science implications for the observation of water systems, emphasizing the strong link between measurement aspects and computational aspects that characterises the water sector.
This session aims at providing an updated framework of the observational techniques, data processing approaches and sensing technologies for water management and protection, giving also attention to today’s data science aspects, e.g. data analytics, big data, cloud computing and Artificial Intelligence.
We welcome contributions about field measurement approaches, the development of new sensing techniques, low-cost sensor systems, and measurement methods enabling crowdsourced data collection, including through social sensing. Water quantity and quality measurements as well as water characterization techniques are therefore within the scope of this session.
Remote sensing techniques for the monitoring of water resources and/or the related infrastructures are also welcome.
Contributions dealing with the integration of data from multiple sources are solicited, as well as the design of ICT architectures (including IoT concepts) and of computing systems for the user-friendly monitoring of the water resource and the related networks.
Studies of signal and data processing techniques (including AI approaches) and of the integration between sensor networks and large data systems are also strongly encouraged.
Interferometric Synthetic Aperture Radar added value products for Natural & Anthropogenic hazard assessment at local, regional and national scale.
Synthetic aperture radar (SAR) remote sensing is an established tool for natural and anthropogenic hazards mapping and monitoring. The new generation of radar satellite constellations along with a consistent repository of historical observations is fostering comprehensive multi-sensor hazard analyses. New constellations’ capabilities rely on innovative techniques based on high-resolution/wide-swath and short-temporal Interferometric SAR (InSAR). While acknowledging the benefits brought by these recent developments, the scientific community is now defining a new paradigm of techniques capable of: extracting relevant information from SAR imagery, designing proper methodologies for specific hazards, managing large SAR datasets (e.g. National ground motion services, Copernicus EGMS), and integrating radar data with multispectral satellite observations.
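For orientation, the basic relationship that InSAR techniques exploit is that one cycle of unwrapped differential phase corresponds to half a radar wavelength of line-of-sight motion (the signal travels the path twice). The sketch below assumes a C-band wavelength of ~5.55 cm (e.g. Sentinel-1) and ignores the sign convention, which varies between processors.

```python
import math

# Sketch: converting unwrapped differential interferometric phase to
# line-of-sight (LOS) displacement. The wavelength is an assumption
# for a C-band sensor such as Sentinel-1.

WAVELENGTH_M = 0.0555  # C-band radar wavelength in metres (assumed)

def phase_to_los_displacement(delta_phi_rad):
    """One 2*pi phase cycle maps to half a wavelength of LOS motion,
    because the radar signal traverses the path twice."""
    return (WAVELENGTH_M / (4.0 * math.pi)) * delta_phi_rad

# One interferometric fringe (2*pi) -> lambda/2 of LOS motion
d = phase_to_los_displacement(2.0 * math.pi)
print(round(d * 1000.0, 2))  # 27.75 (millimetres)
```

This millimetre-per-fringe sensitivity is what makes InSAR suitable for ground-motion services such as the Copernicus EGMS mentioned above.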
Application of remote sensing and Earth-observation data in natural hazard and risk studies
Remote sensing and Earth Observation (EO) are increasingly used in the different phases of risk management and in development cooperation, due to the challenges posed by contemporary issues such as climate change and increasingly complex social interactions. The advent of new, more powerful sensors and more finely tuned detection algorithms provides the opportunity to assess and quantify natural hazards, their consequences, and vulnerable regions more comprehensively than ever before.
Several agencies have now permanently inserted applications of EO data to risk management into their programmes. During the preparedness and prevention phase, EO has proven fundamental for hazard, vulnerability, and risk mapping. EO data supports both emergency forecasting and early emergency response, thanks to the potential of rapid mapping. EO data is also increasingly being used to map information useful for planning interventions in the recovery phase, and thus provides the assessment and analysis of natural hazards, from small to large regions around the globe. In this framework, the Committee on Earth Observation Satellites (CEOS) has been working for several years on disaster management related to natural hazards (e.g., volcanic, seismic, landslide and flooding events), including pilots, demonstrators, recovery observatory concepts, Geohazard Supersites and Natural Laboratories (GSNL) initiatives, and multi-hazard management projects.
The session is dedicated to multidisciplinary contributions demonstrating the benefits of the use of EO for natural hazard and risk management.
The research presented might focus on:
- Added value of EO data in hazard/risk forecasting models
- Innovative applications of EO data for rapid hazard, vulnerability and risk mapping, the post-disaster recovery phase, and in support of disaster risk reduction strategies
- Development of tools for assessment and validation of hazard/risk models
The use of different types of remote sensing (e.g. thermal, visual, radar, laser, and/or the fusion of these) is highly recommended, with an evaluation of their respective pros and cons focusing also on future opportunities (e.g. new sensors, new algorithms).
Early-stage researchers are strongly encouraged to present their research. Moreover, contributions from international cooperation, such as CEOS and GEO initiatives, are welcome.
Geoscience problems related to massive release of radioactive materials by nuclear accidents and other human activities
The session gathers geoscientific aspects such as the dynamics, reactions, and environmental/health consequences of radioactive materials that are massively released accidentally (e.g., the Chernobyl and Fukushima nuclear power plant accidents, wildfires, etc.) and by other human activities (e.g., nuclear tests).
Radioactive materials are known as pollutants hazardous to human society, but they are also ideal markers for understanding dynamics and physical/chemical/biological reaction chains in the environment. The radioactive contamination problem is thus multi-disciplinary. In fact, this topic involves the regional and global transport and local reactions of radioactive materials through the atmosphere, soil and water systems, the ocean, and organisms and ecosystems, and their relations with human and non-human biota. The topic also involves hazard prediction and nowcasting technology.
By combining 35 years (longer than the ~30-year half-life of cesium-137) of monitoring data after the Chernobyl accident in 1986, 10 years of dense measurement data from the most advanced instrumentation after the Fukushima accident in 2011, and other events, we can improve our knowledge base on the environmental behavior of radioactive materials and their environmental/biological impact. This should lead to improved monitoring systems in the future, including emergency response systems, acute sampling/measurement methodology, and remediation schemes for any future nuclear accidents.
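As a back-of-envelope check of the half-life framing above (assuming a Cs-137 half-life of 30.17 years and pure physical decay, ignoring environmental transport and remediation):

```python
# Fraction of an initial Cs-137 inventory remaining after t years,
# using N(t)/N0 = 2^(-t / T_half). Half-life value is an assumption.

HALF_LIFE_Y = 30.17  # Cs-137 half-life in years

def remaining_fraction(years):
    return 2.0 ** (-years / HALF_LIFE_Y)

print(round(remaining_fraction(35.0), 2))  # 0.45 -> under half remains
```

So 35 years after Chernobyl, somewhat less than half of the originally released Cs-137 remains undecayed, which is why such long records are valuable for separating decay from environmental redistribution.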
The following specific topics have traditionally been discussed:
(a) Atmospheric Science (emissions, transport, deposition, pollution);
(b) Hydrology (transport in surface and ground water system, soil-water interactions);
(c) Oceanology (transport, bio-system interaction);
(d) Soil System (transport, chemical interaction, transfer to organic system);
(e) Natural Hazards (warning systems, health risk assessments, geophysical variability);
(f) Measurement Techniques (instrumentation, multipoint data measurements);
(g) Ecosystems (migration/decay of radionuclides).
The session consists of updated observations, new theoretical developments including simulations, and improved methods or tools which could improve observation and prediction capabilities during any future nuclear emergencies. New evaluations of existing tools, past nuclear contamination events, and other data sets are also welcome.
Data fusion, integration, correlation and advances of non-destructive testing methods and numerical developments for engineering and geosciences applications
Non-destructive testing (NDT) methods are employed in a variety of engineering and geoscience applications, and their stand-alone use has been extensively investigated to date. New theoretical developments, technological advances and the progress achieved in surveying, data processing and interpretation have led to tremendous growth in equipment reliability, allowing outstanding data quality and accuracy.
Nevertheless, the requirements of comprehensive site and material investigations may be complex and time-consuming, involving multiple expertise and multiple equipment. The challenge is to step forward and provide an effective integration between data outputs with different physical quantities, scale domains and resolutions. In this regard, enormous development opportunities relating to data fusion, integration and correlation between different NDT methods and theories are to be further investigated.
This Session primarily aims at disseminating contributions from state-of-the-art NDT methods and new numerical developments, promoting the integration of existing equipment and the development of new algorithms, surveying techniques, methods and prototypes for effective monitoring and diagnostics. NDT techniques of interest are related–but not limited to–the application of acoustic emission (AE) testing, electromagnetic testing (ET), ground penetrating radar (GPR), geoelectric methods (GM), laser testing methods (LM), magnetic flux leakage (MFL), microwave testing, magnetic particle testing (MT), neutron radiographic testing (NR), radiographic testing (RT), thermal/infrared testing (IRT), ultrasonic testing (UT), seismic methods (SM), vibration analysis (VA), visual and optical testing (VT/OT).
The Session will focus on the application of different NDT methods and theories and will be related –but not limited to– the following investigation areas:
- advanced data fusion;
- advanced interpretation methods;
- design and development of new surveying equipment and prototypes;
- real-time and remote assessment and monitoring methods for material and site inspection (real-life and virtual reality);
- comprehensive and inclusive information data systems for the investigation of survey sites and materials;
- numerical simulation and modelling of data outputs with different physical quantities, scale domains and resolutions;
- advances in NDT methods, numerical developments and applications (stand-alone use of existing and state-of-the-art NDTs).
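As a minimal illustration of what data fusion between NDT outputs can mean in practice, the sketch below combines two hypothetical co-located measurements of the same quantity (e.g. a layer thickness estimated independently by GPR and by UT) using inverse-variance weighting. Real NDT fusion must additionally reconcile differing physical quantities, scale domains and resolutions, which this toy example deliberately sets aside.

```python
# Toy sensor fusion: inverse-variance weighting of two independent
# measurements of the same quantity. All values are hypothetical.

def fuse(measurements):
    """measurements: list of (value, std_dev) pairs.
    Returns the fused value and its standard deviation."""
    weights = [1.0 / (s * s) for _, s in measurements]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, measurements)) / total
    return value, (1.0 / total) ** 0.5

# e.g. GPR says 0.30 m +/- 0.02 m, UT says 0.27 m +/- 0.01 m
fused, sigma = fuse([(0.30, 0.02), (0.27, 0.01)])
# The fused estimate is pulled toward the more precise measurement,
# and its uncertainty is smaller than either input's.
```

The design choice here (weighting by inverse variance) is the statistically optimal linear combination when the measurement errors are independent and Gaussian; correlated errors between NDT methods would require a full covariance treatment.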
Geoscience and health during the Covid-19 pandemic
The virus is still with us, with more potent variants. It remains the most immediate challenge for geosciences and health, including its impacts on geoscience development (data collection, training, dissemination) and the achievement of the UN Sustainable Development Goals, in particular the goal that urban systems should increase well-being and health.
Long-term visions based on transdisciplinary scientific advances are therefore essential. As a consequence, this session, like the ITS1.1 session in 2021, calls for contributions based on data-driven and theory-based approaches to health in the context of global change. This includes:
- main lessons from lockdowns?
- how to get the best scientific results during the corona pandemic?
- how to manage fieldwork, geophysical monitoring and planetary missions?
- qualitative improvements in epidemic modelling, with nonlinear, stochastic, and complex system science approaches;
- possible interactions between weather and/or climate factors and epidemic/health problems;
- new surveillance capabilities (including contact tracing), data access, assimilation and multidimensional analysis techniques;
- a fundamental revision of our urban systems, their greening and their need for mobility;
- a special focus on urban biodiversity, especially to better manage virus vectors;
- urban resilience must include resilience to epidemics, and therefore requires revisions of urban governance.
The corona pandemic manifested very differently across regions (from town to continent level), seasons, and climates, including the ocean. Lockdowns and other measures decreased seismic noise and pollution, offering lessons on environmental issues. Geophysical measurements can sometimes monitor the pandemic and its precursors, for example through sludge and river water. Infection models require consideration of multi-parameter, non-linear and feedback effects on the reproduction rate, to which knowledge of the non-linear analysis and modelling of geophysical phenomena can contribute, and vice versa.
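To illustrate the kind of non-linear feedback such infection models involve, here is a minimal SIR sketch with a forward-Euler step: the infection term couples the susceptible and infectious fractions non-linearly, so the effective reproduction rate falls as susceptibles are depleted. Parameter values are purely illustrative, not fitted to Covid-19.

```python
# Minimal SIR compartment model, forward-Euler integration.
# beta/gamma = R0 = 3 here; all parameters are illustrative only.

def sir(beta=0.3, gamma=0.1, s0=0.999, i0=0.001, days=160, dt=1.0):
    s, i, r = s0, i0, 0.0
    peak_i = i
    for _ in range(int(days / dt)):
        new_inf = beta * s * i * dt   # non-linear S*I coupling
        new_rec = gamma * i * dt
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        peak_i = max(peak_i, i)
    return s, i, r, peak_i

s, i, r, peak = sir()
# The epidemic grows while beta*s > gamma, peaks, then fades as
# susceptibles are exhausted - the feedback on the reproduction rate.
```

This is the simplest possible caricature; the session's interest lies precisely in where such models break down (spatial heterogeneity, stochasticity, behavioural feedback), for which geophysical non-linear analysis techniques are relevant.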
While urban systems and non-linear analyses/modelling will be discussed in the related session ("Geoscience and health during the Covid-19 pandemic" in ITS-3/NP), this session stresses geoscientific phenomena related to the pandemic, for example:
- Geophysical effects on Covid-19: the virus has preferred geographic/climate/pollution/ocean conditions, including their variation.
- Covid-19 effects on geophysical phenomena: Covid-19 caused many environmental effects (sludge, emissions, traffic, etc.).
- Monitoring Covid-19 using geo-data: use of remote instruments (drones, satellites) and ground instruments (pollution, sludge).
Sustainable Architecture, What Future? | Virtual PICO
The desire today is to rethink the human way of life and the human biotope to reduce expenditure on energy and the acquisition of raw materials, mainly in the construction sector.
Our session deals with sustainable architecture and responsible architecture as an environmental solution against global warming and energy expenditure by introducing renewable energies, bio-air-conditioning in the construction project, use of materials with low environmental impact, and guaranteeing a better quality of life in spaces, lighting, air and human well-being.
We accept contributions related to the study of the aspects that condition durability in constructions through time and space, namely:
- Water efficiency
- Bioclimatization and Energy
- Materials and resources
- Internal and external environmental quality.
ESSI5 – General Contributions to Earth and Space Science Informatics
Programme group scientific officers:
ESSI General Session
Earth and Space Sciences Informatics (ESSI) is concerned with evolving issues of data management and analysis, technologies and methodologies, large-scale computational experimentation and modeling, and hardware and software infrastructure needs. Together, these elements provide us with the capability to change data into knowledge that can be applied to advance our understanding of the Earth and Space Sciences. This session is for presentations on all aspects of Earth and Space Science Informatics, but especially those topics not represented by other ESSI sessions.