
ESSI2.12

This session provides a multi-disciplinary overview of geoscience research and applied case studies involving MATLAB, and further discusses technical resources and new capabilities available to researchers and educators. MATLAB is a multi-paradigm numerical computing environment and programming language developed by MathWorks and supported by a large community of skilled toolbox developers and active users. It allows matrix manipulation, data plotting, implementation of algorithms, creation of user interfaces, and interfacing with programs written in other programming languages. These capabilities have attracted various projects in geoscientific fields of academia and industry, particularly in data analysis, 2D/3D visualization and program development. Many scientific articles featuring MATLAB-based applications have been published in international journals. This session encourages studies introducing or applying MATLAB-based programs and applications. Contributions from all related fields of Earth Science are welcome.
In this session, we would like to provide an overview of the MATLAB ecosystem for geoscientists and engineers and to discuss recent technological developments. Useful techniques will be introduced for managing large distributed files and leveraging cluster solutions for geoscientific computations. We will focus on the visualization of results for scientific publication and present state-of-the-art capabilities for visualizing geo-referenced data.

Convener: Eun Young Lee (ECS) | Co-convener: Steve Schäfer
Displays | Attendance Thu, 07 May, 10:45–12:30 (CEST)


Chat time: Thursday, 7 May 2020, 10:45–12:30

D809 | EGU2020-2479
Dimitrios Piretzidis and Michael Sideris

We present a collection of MATLAB tools for the post-processing of temporal gravity field solutions from the Gravity Recovery and Climate Experiment (GRACE) satellite mission. GRACE final products are in the form of monthly sets of spherical harmonic coefficients and have been extensively used by the scientific community to study the land surface mass redistribution that is predominantly due to ice melting, glacial isostatic adjustment, seismic activity and hydrological phenomena. Since the launch of the GRACE satellites, a substantial effort has been made to develop processing strategies and improve the surface mass change estimates.

The MATLAB software presented in this work is developed and used by the Gravity and Earth Observation group at the Department of Geomatics Engineering, University of Calgary. A variety of techniques and tools for the processing of GRACE data are implemented, tested and analyzed. Some of the software capabilities are: filtering of GRACE data using decorrelation and smoothing techniques, conversion of gravity changes into mass changes on the Earth’s spherical, ellipsoidal and topographical surface, implementation of forward modeling techniques for the estimation and removal of long-term trends due to ice mass melting, basin-specific spatial averaging in the spatial and spectral domain, time series smoothing and decomposition techniques, and data visualization.

All tools use different levels of parameterization in order to assist both expert users and non-specialists. Such software makes it easier to compare different GRACE processing methods and parameter choices, leading to optimal strategies for the estimation of surface mass changes and to the standardization of GRACE data post-processing. It could also facilitate the use of GRACE data by non-geodesists.
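
As an illustration of the kind of processing step implemented in such tools, the following minimal MATLAB sketch applies Gaussian (Jekeli-type) smoothing weights to a set of spherical harmonic coefficient anomalies. It is not part of the Calgary software, and the coefficient matrices here are random placeholders.

% Minimal sketch: degree-wise Gaussian smoothing of spherical harmonic
% coefficient anomalies (placeholder data, illustrative only).
lmax   = 60;                                    % maximum degree
radius = 300e3;                                 % smoothing radius [m]
R      = 6371e3;                                % mean Earth radius [m]
b      = log(2) / (1 - cos(radius/R));

W    = zeros(lmax+1,1);                         % Gaussian weights per degree
W(1) = 1;                                       % degree 0
W(2) = (1 + exp(-2*b)) / (1 - exp(-2*b)) - 1/b; % degree 1
for l = 2:lmax
    W(l+1) = -(2*l - 1)/b * W(l) + W(l-1);      % recursion for higher degrees
end

dC = 1e-12 * randn(lmax+1);                     % placeholder coefficient anomalies
dS = 1e-12 * randn(lmax+1);                     % (degree x order)
dC_smooth = W .* dC;                            % scale each degree (row) by W(l)
dS_smooth = W .* dS;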

How to cite: Piretzidis, D. and Sideris, M.: MATLAB tools for the post-processing of GRACE temporal gravity field solutions, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-2479, https://doi.org/10.5194/egusphere-egu2020-2479, 2020.

D810 | EGU2020-4803 | Highlight
Martin H. Trauth

Geoscientists from the University of Potsdam reconstruct environmental changes in East Africa over the past five million years. Micro-organisms such as diatoms and rotifers, clay minerals and pollen, thousands of years old, help to reconstruct large lakes and braided rivers, dense forests and hot deserts, high mountains and deep valleys. This is the habitat of our ancestors, members of a complicated family tree or network, of which only one single species, Homo sapiens, has survived. MATLAB is the tool of choice for analyzing these complicated and extensive data sets, extracted from up to 300 m long drill cores, from satellite images, and from the fossil remains of humans and other animals. The software is used to detect and classify important transitions in climate time series, to detect objects and quantify materials in microscope and satellite imagery, to predict river networks from digital terrain models, and to model lake-level fluctuations from environmental data. The advantage of MATLAB is the ability to apply multiple methods within a single tool. Not least because of this, the software is also becoming increasingly popular in Africa, as shown by the program of an international summer school series in Africa and Germany for collecting, processing, and presenting geo-bio-information.
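
The detection of abrupt transitions in long proxy time series can be sketched in a few lines of MATLAB. The example below uses findchangepts from the Signal Processing Toolbox on a synthetic record; it is only an illustration, not the actual Potsdam workflow.

% Hedged sketch: change-point detection in a synthetic proxy record
% (Signal Processing Toolbox).
t = linspace(5, 0, 500)';                               % age [Ma]
x = [randn(200,1); 2 + randn(150,1); 4 + randn(150,1)]; % proxy with two mean shifts
ipt = findchangepts(x, 'MaxNumChanges', 2, 'Statistic', 'mean');

figure, plot(t, x, 'k'), hold on
for k = 1:numel(ipt)
    xline(t(ipt(k)), 'r--');                            % mark detected transitions
end
set(gca, 'XDir', 'reverse')
xlabel('Age (Ma)'), ylabel('Proxy value (synthetic)')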

How to cite: Trauth, M. H.: Dust storms, blackouts and 50°C in the shade: Exploring the Roots of Humankind with MATLAB, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-4803, https://doi.org/10.5194/egusphere-egu2020-4803, 2020.

D811 | EGU2020-7253
Douaa Fathy and Eun Young Lee

Middle Miocene sediments are the most important productive oil zone in the Sidri Member within the Belayim oil field. The Belayim oil field is one of the well-known oil fields in Egypt, located on the eastern side of the Gulf of Suez. The Sidri Member consists of shales, sandstones and limestone, with net pay thickness ranging from 5 to 60 m. The oil-saturated sandstone layers are coarse grained and poorly sorted, and are classified into sub-litharenite, lithic arkose and arkose microfacies with several diagenetic features. This study measured and collected petrophysical data from sandstone core samples and well logs of drilling sites to evaluate the oil potential and reservoir characteristics of the Sidri Member. The collected petrophysical data are porosity, permeability, water and oil saturation, resistivity, and grain and bulk density. MATLAB tools were used to analyze the extensive dataset, quantify the correlation trends and visualize the spatial distribution. The porosity values range from 2% to 30% and show a very good positive correlation with horizontal permeability (0 to 1,300 md). The porosity, as well as the type and radius of pore throats, shows an important relationship with permeability and fluid saturation. The petrophysical characteristics of the Sidri sandstones are controlled by the depositional texture, clay-rich matrix and diagenetic features. This study distinguished poor, fair, and good to excellent reservoir intervals in the Sidri Member. The best reservoir quality is recorded in the well-sorted sand layers with little clay matrix in the lower part of the Sidri Member. Their petrophysical characteristics are high porosity (20% to 30%), high permeability (140 to 1,250 md), high oil saturation (20% to 78%), low water saturation (13% to 36%), moderate to high resistivity and relatively low grain density. The hydrocarbon production rates reported from the Sidri reservoirs correlate strongly with the petrophysical characteristics described in this study.
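
A correlation analysis of the kind described above can be sketched as follows. The porosity and permeability values are synthetic placeholders, and the script is only a schematic illustration, not the workflow used for the Sidri data.

% Illustrative sketch: porosity-permeability cross-plot with a trend fitted
% in log-permeability space (synthetic data).
phi  = 2 + 28*rand(100,1);                       % porosity [%]
logk = -1 + 0.12*phi + 0.3*randn(100,1);         % synthetic log10(permeability [md])
k    = 10.^logk;

p = polyfit(phi, log10(k), 1);                   % linear trend in log space
r = corrcoef(phi, log10(k));                     % correlation coefficient

figure, semilogy(phi, k, 'ko'), hold on
phiFit = linspace(min(phi), max(phi), 50);
semilogy(phiFit, 10.^polyval(p, phiFit), 'r-')
xlabel('Porosity (%)'), ylabel('Horizontal permeability (md)')
title(sprintf('Correlation in log space: r = %.2f', r(1,2)))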

How to cite: Fathy, D. and Lee, E. Y.: Petrophysical data analysis using MATLAB tools for the middle Miocene sediments in the Gulf of Suez, Egypt, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-7253, https://doi.org/10.5194/egusphere-egu2020-7253, 2020.

D812 | EGU2020-7368
Wolfgang Knierzinger, Michael Wagreich, and Eun Young Lee

We present a new interactive MATLAB-based visualization and calculation tool (TETGAR_C) for assessing the provenance of detrital garnets in a four-component (tetrahedral) plot system (almandine–pyrope–grossular–spessartine). The chemistry of more than 2,600 garnet samples was evaluated and used to create various subfields in the tetrahedron that correspond to calc-silicate rocks, felsic igneous rocks (granites and pegmatites) as well as metasedimentary and metaigneous rocks of various metamorphic grades. These subfields act as reference structures facilitating assignments of garnet chemistries to source lithologies. An integrated function calculates whether a point is located in a subfield or not. Moreover, TETGAR_C determines the distance to the closest subfield. Compared with conventional ternary garnet discrimination diagrams, this provenance tool enables a more accurate assessment of potential source rocks by reducing the overlap of specific subfields and offering quantitative testing of garnet compositions. In particular, a much clearer distinction between garnets from greenschist-facies rocks, amphibolite-facies rocks, blueschist-facies rocks and felsic igneous rocks is achieved. Moreover, TETGAR_C enables a distinction between metaigneous and metasedimentary garnet grains. In general, metaigneous garnet tends to have higher grossular content than metasedimentary garnet formed under similar P–T conditions.
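
The core geometric operations, mapping a four-component composition into a tetrahedron and testing membership in a reference subfield, can be sketched as below. This is not TETGAR_C code; the composition and the subfield point cloud are hypothetical placeholders.

% Hedged sketch: barycentric mapping of an alm-prp-grs-sps composition into a
% regular tetrahedron and a point-in-subfield test (placeholder reference data).
V = [0 0 0; 1 0 0; 0.5 sqrt(3)/2 0; 0.5 sqrt(3)/6 sqrt(6)/3];  % tetrahedron vertices

c = [0.62 0.18 0.12 0.08];            % almandine, pyrope, grossular, spessartine
c = c / sum(c);                       % normalize to unit sum
p = c * V;                            % barycentric -> Cartesian coordinates

F = rand(200,4); F = F ./ sum(F,2);   % hypothetical subfield samples (n x 4)
P = F * V;

T     = delaunayn(P);                           % triangulate the reference cloud
inSub = ~isnan(tsearchn(P, T, p));              % true if p lies inside the cloud
dMin  = min(sqrt(sum((P - p).^2, 2)));          % distance to nearest subfield sample

figure
scatter3(P(:,1), P(:,2), P(:,3), 5, [0.7 0.7 0.7]), hold on
scatter3(p(1), p(2), p(3), 60, 'r', 'filled')
axis equal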

How to cite: Knierzinger, W., Wagreich, M., and Lee, E. Y.: TETGAR_C: a three-dimensional (3D) provenance plot and calculation tool for detrital garnets, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-7368, https://doi.org/10.5194/egusphere-egu2020-7368, 2020.

D813 | EGU2020-8825
Eun Young Lee

This study quantifies compaction trends of Jurassic-Quaternary sedimentary units in the Perth Basin and applies the trends to reconstruct the sedimentation and subsidence history with 2D and 3D models. BasinVis 2.0, a MATLAB-based program, together with MATLAB 3D surface plotting functions and the ‘Symbolic Math’ and ‘Curve Fitting’ toolboxes, is used to analyze well data. The data were collected from fourteen industry wells and IODP Site U1459 in a study area (200 × 70 km) on an offshore part of the basin and arranged into four successive stratigraphic units: the Cattamarra, Cadda, Yarragadee and post-breakup sequences. The Perth Basin is a large north-south elongated sedimentary basin extending offshore and onshore along the rifted continental margin of southwestern Australia. It is a relatively under-explored region, despite being an established hydrocarbon-producing basin. The basin has developed through multiple episodes of rifting, drifting and breakup of the Greater Indian, Australian and Antarctic plates since the Permian. The basin consists of faulted structures, which are filled by Late Paleozoic to Cenozoic sedimentary rocks and sediments. After deltaic-fluvial and shallow marine deposition until early Cretaceous time, carbonate sedimentation has prevailed in the basin, related to post-rift subsidence and the long-term northward drift of the Australian plate.

High-resolution porosity data from Site U1459 and well Houtman-1 were examined to estimate best-fitting compaction trends with linear, single-term and two-term exponential equations. In the compaction trend plot of Site U1459 (post-breakup Cenozoic carbonates), the linear and single-term exponential trends are relatively alike, while the two-term exponential trend changes abruptly near the seafloor due to highly varying porosity. The compaction trends at well Houtman-1 (Jurassic sandstones) are alike in the estimated interval; however, the initial porosities are quite low and differ between fits. In the compilation plot of the two wells, the two-term exponential trend represents the porosity distribution better by accommodating, albeit as estimation overfitting, a trend change at the lithologic transition from carbonates to sandstones. The abrupt trend change suggests that a multiple, piece-wise compaction trend is suitable for the Perth Basin. The compaction trends are used to quantify the sedimentation profile and subsidence curves at Site U1459. 2D and 3D models of unit thickness, sedimentation rate and subsidence of the study area are reconstructed by applying the exponential trend to the stratigraphic data of the industry wells. The models are visualized using ordinary kriging spatial interpolation. The results allow us to compare differences between compacted (present) and decompacted (original) units through depth and age. The compaction trend has an impact on thickness restoration as well as subsidence analysis. The differences become larger with increasing depth due to the growing compaction effect during burial, and other factors can deflect the compaction trend further through time. This highlights the fact that the restoration of strongly compacted (usually deeper or older) layers is crucial for reconstructing sedimentation systems and basin evolution, a point that has often been underestimated in academia and industry. This study suggests that researchers apply an appropriate compaction trend estimated from on-site data for basin reconstruction and modelling.
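
A minimal example of the compaction trend estimation step, here a single-term exponential fit with the Curve Fitting Toolbox, is sketched below on synthetic porosity-depth data (not the U1459 or Houtman-1 logs).

% Minimal sketch: single-term exponential compaction trend phi(z) = phi0*exp(-z/c)
% fitted to synthetic porosity-depth data (Curve Fitting Toolbox).
z   = linspace(0, 2000, 200)';                     % burial depth [m]
phi = 0.55*exp(-z/1500) + 0.02*randn(size(z));     % synthetic porosity [fraction]

ft  = fittype('phi0*exp(-z/c)', 'independent', 'z', 'coefficients', {'phi0','c'});
mdl = fit(z, phi, ft, 'StartPoint', [0.5 1000]);   % initial guesses for phi0 and c

figure, plot(mdl, z, phi)
xlabel('Depth (m)'), ylabel('Porosity (fraction)')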

How to cite: Lee, E. Y.: Quantitative analysis for compaction trend and basin reconstruction of the Perth Basin, Australia: Limitations, uncertainties and requirements, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-8825, https://doi.org/10.5194/egusphere-egu2020-8825, 2020.

D814 | EGU2020-9232
Sebastian Bomberg and Neha Goel

The presented work focuses on disaster risk management for cities that are prone to natural hazards. Based on aerial imagery of Caribbean island regions captured by drones, we show how to process the images and automatically identify the roof material of individual structures using a deep learning model. Deep learning refers to a machine learning technique using deep artificial neural networks. Unlike other techniques, deep learning does not necessarily require feature engineering but may process raw data directly. The outcome of this assessment can be used for steering risk mitigation measures, creating hazard and risk maps, or advising municipal bodies and aid organizations on investing their resources in rebuilding reinforcements. The data at hand consist of images in BigTIFF format and GeoJSON files including the building footprints, unique building IDs and roof material labels. We demonstrate how to use MATLAB and its toolboxes for processing large image files that do not fit into computer memory. Based on this, we train a deep learning model to classify the roof material present in the images. We achieve this by subjecting a pretrained ResNet-18 neural network to transfer learning. Training is further accelerated by means of GPU computing. The validation accuracy achieved by this baseline model is 74%. Further tuning of hyperparameters is expected to improve accuracy significantly.
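
The transfer-learning step can be sketched as follows with the Deep Learning Toolbox. The folder name, split ratio and training options are illustrative assumptions, not the settings used in this work.

% Hedged sketch: transfer learning with a pretrained ResNet-18 for roof
% material classification (illustrative settings).
imds = imageDatastore('roofPatches', 'IncludeSubfolders', true, ...
    'LabelSource', 'foldernames');
[imdsTrain, imdsVal] = splitEachLabel(imds, 0.8, 'randomized');

net    = resnet18;                            % pretrained on ImageNet
lgraph = layerGraph(net);
numCls = numel(categories(imdsTrain.Labels));

% swap the final fully connected and classification layers for the new classes
lgraph = replaceLayer(lgraph, lgraph.Layers(end-2).Name, ...
    fullyConnectedLayer(numCls, 'Name', 'fcRoof'));
lgraph = replaceLayer(lgraph, lgraph.Layers(end).Name, ...
    classificationLayer('Name', 'roofClass'));

augTrain = augmentedImageDatastore([224 224 3], imdsTrain);
augVal   = augmentedImageDatastore([224 224 3], imdsVal);

opts = trainingOptions('sgdm', 'InitialLearnRate', 1e-3, 'MaxEpochs', 8, ...
    'ValidationData', augVal, 'ExecutionEnvironment', 'auto');
trainedNet = trainNetwork(augTrain, lgraph, opts);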

How to cite: Bomberg, S. and Goel, N.: Supporting risk management in the Caribbean by application of Deep Learning for object classification of aerial imagery with MATLAB, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-9232, https://doi.org/10.5194/egusphere-egu2020-9232, 2020.

D815 | EGU2020-15695
Johannes Novotny

Geoscience is a highly interdisciplinary field of study, drawing upon conclusions from physics, chemistry and many other academic disciplines. Over the course of the last decades, computer science has become an integral component of geoscientific research. This coincides with the rising popularity of the open-source movement, which helped to develop better tools for collaboration on complex software projects across physical distances and academic boundaries.

However, while the technical frameworks supporting interdisciplinary work between geoscience and computer science exist, there are still several hurdles to overcome in order to achieve successful collaborations. This work summarizes the lessons learned from the development of BasinVis from the perspective of a computer science collaborator. BasinVis is a modular open-source application that aims to allow geoscientists to analyze and visualize sedimentary basins in a comprehensive workflow. A particular development goal was to introduce the advances of 2D and 3D visualization techniques to the quantitative analysis of the stratigraphic setting and subsidence of sedimentary basins based on well data and/or stratigraphic profiles.

Development of BasinVis started in 2013, with its first release as a MATLAB GUI application in 2016. Apart from functionality, one of the major problems to solve in this period was the alignment of research goals and methodology, which may diverge greatly between geoscience and computer science. Examples of this would be clarifying the scientific terminology of each field early on and clearly establishing the expected results of the application in terms of mathematical accuracy and uncertainty (a concept that may catch computer scientists off guard).

How to cite: Novotny, J.: The Development of BasinVis: Lessons learned from an open-source collaboration of geoscience and computer science, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-15695, https://doi.org/10.5194/egusphere-egu2020-15695, 2020.

D816 | EGU2020-17594
Insun Song and Chandong Chang

We report a MATLAB code for the stochastic optimization of in situ horizontal stress magnitudes from wellbore wall image and sonic logging data in a vertical borehole. In undeformed sedimentary formations, one of the principal stresses is commonly assumed to be vertical, and its magnitude (σv) is simply related to the gravitational overburden. The two horizontal far-field principal stresses (σH and σh) are then theoretically constrained by the relationship between the breakout width (or angular span) and rock compressive strength at a given depth. However, the deterministic relationship yields indeterminate solutions for the two unknown stresses. Instead of using the deterministic relationship between their average values in an interval of borehole, we introduce probabilistic distributions of rock strength and breakout width in the interval. This method optimizes the complete set of in situ principal stresses (σH, σh, and σv) by minimizing the objective function. For the rock failure model, we use a true triaxial failure criterion referred to as the modified Wiebols and Cook criterion that incorporates all three principal stresses. This criterion is expressed in the form of an implicit function with two equation parameters: the uniaxial compressive strength UCS and the internal friction coefficient μ. The Weibull distribution model of UCS in a borehole section (~30 m interval) is obtained from the wellbore sonic logging data using the relation between UCS and P-wave velocity. The value of μ is assumed to be constant at 0.6 based on a previous experimental study. The breakout model is established based on the probabilistic distribution of rock strength at the margins of the breakout for a uniform set of far-field stresses. The inverse problem is solved with a MATLAB algorithm for the optimization by choosing the best-fit set of far-field stresses in a stress polygon. This process also enables one to evaluate the statistical reliability in terms of sensitivity and uncertainty. The stochastic optimization process is demonstrated using borehole images and sonic logging data obtained from the Integrated Ocean Drilling Program (IODP) Hole C0002A, a vertical hole near the seaward margin of the Kumano basin offshore from the Kii Peninsula, southwest Japan.
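
One ingredient of such a workflow, estimating UCS from sonic P-wave velocity and fitting a Weibull distribution over a logging interval, can be sketched as below. The Vp-UCS relation and its coefficients are placeholders, not the relation used in this study (Statistics and Machine Learning Toolbox required).

% Hedged sketch: Weibull distribution of UCS derived from synthetic sonic data.
Vp  = 3200 + 400*randn(500,1);            % synthetic P-wave velocity [m/s]
UCS = 1e-3 * Vp.^1.5;                     % hypothetical empirical Vp-UCS relation [MPa]

pd = fitdist(UCS, 'Weibull');             % Weibull scale (A) and shape (B) parameters

figure, histogram(UCS, 'Normalization', 'pdf'), hold on
u = linspace(min(UCS), max(UCS), 200);
plot(u, pdf(pd, u), 'r', 'LineWidth', 1.5)
xlabel('UCS (MPa)'), ylabel('Probability density')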

How to cite: Song, I. and Chang, C.: Stochastic optimization using MATLAB code for the determination of in situ horizontal stress magnitudes from wellbore logging data, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-17594, https://doi.org/10.5194/egusphere-egu2020-17594, 2020.

D817 | EGU2020-18878 | Highlight
Celso Reyes and Stefan Wiemer

Science is a transient discipline. Research priorities change with funding, students come and go, and faculty moves on. What remains are the published and unpublished articles, as well as the data and code they depend upon. Few projects outlive their immediate use, and even projects with thousands of hours of investment grow stagnant and irrelevant, causing confusion and draining time from the following generation of researchers.

However, a moderate investment in maintenance may save many hours of headache down the road. As a case study, I present practical lessons from the recent overhaul of ZMAP v6 to ZMAP7. ZMAP is a set of MATLAB tools driven by a graphical user interface, designed to help seismologists analyze catalog data. It debuted in 1994 as a collection of scripts written in MATLAB 4. The last official update came 7 years later, with the formal release of ZMAP v6 in 2001. This version was the agglomeration of code written by a host of scientists and scientists-in-training as they tackled their various independent projects. In a way, ZMAP is already a success story, having survived 26 years of alternating development and stagnation, and it is still in use around the world. Dozens of research papers have used ZMAP over the years, with the 2001 publication having 825 Google Scholar citations.

With the release of MATLAB R2014b, changes to the graphics engine had rendered ZMAP v6 largely unusable. In the interim, not only had the MATLAB language and its toolboxes changed significantly, but so had common programming practices. The ZMAP7 project started as a “simple” graphical retrofit, but has evolved to leverage modern MATLAB’s new capabilities and updated toolboxes. Targeting a recent version of MATLAB (R2018a) while foregoing backwards language compatibility opened up a wide array of tools for use and also extended the expected lifetime of ZMAP. A subset of the techniques employed follows:

All changes were tracked in Git, providing snapshots and a safety net while avoiding a proliferation of folders. Code was changed simultaneously across the entire project using regular expressions. Variables were renamed according to their purpose. Unreachable files were removed to reduce the maintenance burden. Scripts and global variables were transformed into functions and classes, providing robust error checking and improving readability. Quoted scripts were extracted into functions where MATLAB itself could help evaluate their correctness. Wrapper functions allowed for global behavior changes and instrumentation. Time-consuming activities, such as determining UI placement for dialog boxes, were automated.
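
As an example of the project-wide, regular-expression-driven changes mentioned above, a rename of a legacy variable across all source files might look like the sketch below; the folder and variable names are purely illustrative, not taken from the ZMAP code base.

% Hedged sketch: renaming a legacy variable across a MATLAB code base using
% word-boundary regular expressions (illustrative names).
files = dir(fullfile('zmap_src', '**', '*.m'));    % recursive listing (R2016b+)
for k = 1:numel(files)
    fname = fullfile(files(k).folder, files(k).name);
    txt   = fileread(fname);
    txt   = regexprep(txt, '\<a\>', 'catalog');    % \< and \> match word boundaries
    fid   = fopen(fname, 'w');
    fwrite(fid, txt);                              % fwrite avoids fprintf escape issues
    fclose(fid);
end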

Though irregular updates still occur, approximately 2000 hours have been devoted to the project, or one year of work by a full-time employee. This time may be amortized through application speed-ups and improved reliability for researchers across the globe, as well as through the transparency and reproducibility provided by open-source, version-controlled code. ZMAP7 is hosted on GitHub, where the community is welcome to keep up with the latest developments. More information about ZMAP7 can be accessed from the main SED page: http://www.seismo.ethz.ch/en/research-and-teaching/products-software/software/ZMAP/.

How to cite: Reyes, C. and Wiemer, S.: From ZMAP to ZMAP7: Fast-forwarding 25 years of software evolution, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-18878, https://doi.org/10.5194/egusphere-egu2020-18878, 2020.

D818 | EGU2020-19260 | Highlight
Dirk Scherler and Wolfgang Schwanghart

The TopoToolbox v2 (TT2; available at https://github.com/wschwanghart/topotoolbox) (Schwanghart and Scherler, 2014) is a set of functions for the analysis of digital elevation models (DEM) in the MATLAB programming environment. Its functionality is mainly developed along the lines of hydrological and geomorphic terrain analysis, complemented with a wide range of functions for visual display, including a class for swath profiles. Fast and efficient algorithms in TopoToolbox form the backbone of the numerical landscape evolution model TTLEM (Campforts et al., 2017). In this presentation, we will demonstrate new functionalities that are part of the upcoming release v 2.4: DIVIDEobj and PPS.

DIVIDEobj is a numerical class to store, analyze and visualize drainage divide networks. Drainage networks are derived from flow directions and a stream network. We will present the extraction and analysis of the drainage divide network of the Big Tujunga catchment, CA, to illustrate it functionality and associated analysis tools. PPS is a class to explore, analyze and model spatial point processes on or alongside river networks. Specifically, PPS provides access to a set of statistical tools to work with inhomogeneous Poisson point processes that facilitate the statistical modelling of phenomena such as river bank failures, landslide dams, or wood jams at the regional scale.
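
A minimal usage sketch, following the TopoToolbox documentation (exact class and option names should be checked against the v2.4 release), could look like this for the Big Tujunga example DEM that ships with the toolbox:

% Hedged sketch: deriving and plotting a drainage divide network with TopoToolbox.
DEM = GRIDobj('srtm_bigtujunga30m_utm11.tif');   % example DEM included with TT2
FD  = FLOWobj(DEM, 'preprocess', 'carve');       % flow directions
ST  = STREAMobj(FD, 'minarea', 1000);            % stream network (threshold in cells)
D   = DIVIDEobj(FD, ST);                         % drainage divide network
plot(D)                                          % visualize the divide network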

Campforts, B., Schwanghart, W., and Govers, G.: Accurate simulation of transient landscape evolution by eliminating numerical diffusion: the TTLEM 1.0 model, Earth Surface Dynamics, 5, 47-66. https://doi.org/10.5194/esurf-5-47-2017, 2017.

Schwanghart, W., and Scherler, D.: Short Communication: TopoToolbox 2 – MATLAB-based software for topographic analysis and modeling in Earth surface sciences, Earth Surf. Dynam., 2, 1-7, https://doi.org/10.5194/esurf-2-1-2014, 2014.

How to cite: Scherler, D. and Schwanghart, W.: The TopoToolbox v2.4: new tools for topographic analysis and modelling, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-19260, https://doi.org/10.5194/egusphere-egu2020-19260, 2020.

D819 | EGU2020-20519 | Highlight
Annarita D'Addabbo, Alberto Refice, Francesco Lovergine, and Guido Pasquariello

DAFNE (Data Fusion by Bayesian Network) is a MATLAB-based open-source toolbox conceived to produce flood maps from remotely sensed and other ancillary information through a data fusion approach [1]. It is based on Bayesian networks and is composed of five modules, which can be easily modified or upgraded to meet different user needs. As output products, DAFNE provides probabilistic flood maps, i.e., for each pixel in a given output map it reports the probability that the corresponding area has been reached by the inundation. Moreover, if remotely sensed images have been acquired on different days during a flood event, DAFNE allows the temporal evolution of the inundation to be followed.

It is well known that flood scenarios are typical examples of complex situations in which different factors have to be considered to provide an accurate and robust interpretation of the situation on the ground [2]. In particular, the combined analysis of multi-temporal and multi-frequency SAR intensity and coherence trends, together with optical data and other ancillary information, can be particularly useful to map flooded areas characterized by different land cover and land use [3]. Here, a recent upgrade is presented that allows multi-frequency SAR intensity images, such as X-band, C-band and L-band images, to be considered as input data.

Three different inundation events have been considered as application examples: for each one, multi-temporal probabilistic flood maps have been produced by combining multi-temporal and multi-frequency SAR intensity images (such as COSMO-SkyMed, Sentinel-1 and ALOS-2 images), InSAR coherence and optical data (such as Landsat 5 or high-resolution images), together with geomorphic and other ground information. Experimental results show good capabilities of producing accurate flood maps with computational times compatible with near-real-time applications.
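
The Bayesian fusion idea can be illustrated, in a heavily simplified form that is not the DAFNE network itself, by a per-pixel naive Bayes combination of two synthetic likelihood maps:

% Heavily simplified sketch (not DAFNE): naive Bayes fusion of per-pixel flood
% likelihoods from two sensors into a posterior probability map (synthetic data).
prior = 0.1;                            % prior probability of flooding
L_sar = rand(200, 300);                 % placeholder P(SAR evidence | flooded)
L_opt = rand(200, 300);                 % placeholder P(optical evidence | flooded)

num     = prior .* L_sar .* L_opt;
den     = num + (1 - prior) .* (1 - L_sar) .* (1 - L_opt);  % (1-L) used as the
postMap = num ./ den;                   % non-flooded likelihood for illustration only

figure, imagesc(postMap), axis image, colorbar
title('Posterior flood probability (synthetic example)')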

 

[1] A. D’Addabbo, A. Refice, F. Lovergine, and G. Pasquariello: DAFNE: A Matlab toolbox for Bayesian multi-source remote sensing and ancillary data fusion, with application to flood mapping, Computers & Geosciences, 112, 64–75, 2018.

[2] A. Refice et al., IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 7(7), 2711–2722, 2014.

[3] A. D’Addabbo et al.: A Bayesian Network for Flood Detection Combining SAR Imagery and Ancillary Data, IEEE Transactions on Geoscience and Remote Sensing, 54(6), 3612–3625, 2016.

 

How to cite: D'Addabbo, A., Refice, A., Lovergine, F., and Pasquariello, G.: Examples of DAFNE application to multi-temporal and multi-frequency remote sensed images and geomorphic data for accurate flood mapping., EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-20519, https://doi.org/10.5194/egusphere-egu2020-20519, 2020.

D820 | EGU2020-22607
Stéphane Rondenay, Lucas Sawade, and Peter Makus

Project GLImER (Global Lithospheric Imaging using Earthquake Recordings) aims to conduct a global survey of lithospheric interfaces using converted teleseismic body waves. Data from permanent and temporary seismic networks worldwide are processed automatically to produce global maps of key interfaces (crust-mantle boundary, intra-lithospheric interfaces, lithosphere-asthenosphere boundary). In this presentation, we reflect on the challenges associated with automating the analysis of converted waves and the potential of the resulting data products to be used in novel imaging approaches. A large part of the analysis and the visualization are carried out via MATLAB-based applications. The main steps of the workflow include signal processing for quality control of the input data and earthquake source normalization, mapping of the data to depth for image generation, and interactive 2-D/3-D plotting for visualization. We discuss how these various tools, especially the visualization ones, can be used for both research and education purposes.
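
A schematic example of the depth-section visualization step, using synthetic amplitudes rather than GLImER output, is sketched below:

% Schematic sketch: plotting a depth section of stacked converted-wave
% amplitudes along a profile (synthetic values only).
dist  = 0:5:1000;                        % distance along profile [km]
depth = (0:1:300)';                      % depth [km]
[~, Z] = meshgrid(dist, depth);
A = exp(-((Z - 35).^2)/50) - 0.5*exp(-((Z - 210).^2)/400);  % synthetic Moho and LAB signals

figure, imagesc(dist, depth, A)
xlabel('Distance along profile (km)'), ylabel('Depth (km)')
colorbar
title('Stacked converted-wave amplitude (synthetic)')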

How to cite: Rondenay, S., Sawade, L., and Makus, P.: GLImER: a MATLAB-based tool to image global lithospheric structure, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-22607, https://doi.org/10.5194/egusphere-egu2020-22607, 2020.

D821 | EGU2020-22668
Lisa Kempler and Steve Schäfer

Data visualization plays an essential role in conveying complex relationships in real-world physical systems. As geoscience and atmospheric data quantities and data sources have increased, so too have the corresponding capabilities in MATLAB and Mapping Toolbox for analyzing and visualizing them. The talk will present end-to-end geospatial analysis workflows, from data access to visualization to publication. It will include software demonstrations of how to process and visualize large out-of-memory data, including accessing remote and cloud-based file systems; working with multiple data formats and sources, such as Web Map Service (WMS), for visualizing publicly accessible geospatial information; and applying new 2-D and 3-D high-resolution mapping visualizations. MATLAB live notebooks combining code, pictures, graphics, and equations will demonstrate the use of interactive notebooks for capturing, teaching, and sharing research.
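
A minimal WMS example with Mapping Toolbox (the layer returned depends on which WMS servers are reachable at run time) might look like:

% Hedged sketch: finding and displaying a layer from a Web Map Service (WMS)
% with Mapping Toolbox. A wmsupdate call may be needed to refresh stale metadata.
layers = wmsfind('elevation', 'SearchField', 'layertitle');  % query the WMS database
[A, R] = wmsread(layers(1));                                 % image and raster reference

figure
geoshow(A, R)                    % display the geo-referenced image
title(layers(1).LayerTitle)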

How to cite: Kempler, L. and Schäfer, S.: Advances in Scientific Visualization of Geospatial Data with MATLAB, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-22668, https://doi.org/10.5194/egusphere-egu2020-22668, 2020.