3D representation technology is a fast-developing field, driven by progress in computing and reflected in the gaming and video industries. Notably, 3D printing provides a cost-effective and tactile way to illustrate research concepts, while Augmented Reality (AR) and Virtual Reality (VR) allow users to grasp complex processes and geometries using common smartphones or handheld devices. With these technologies, three-dimensional objects and datasets can be developed that are well suited for outreach, teaching and wider public engagement.

Designing 3D models of the Earth on different scales, based on photogrammetry, mapping and imagery, modelling and inverse modelling, is a challenging task. Before 3D datasets (physical or virtual) can be explored in outreach or teaching settings, several steps have to be taken, each with its own issues: how to export data into the object formats used in the 3D engineering community; how to feed objects into software suitable for 3D printing; and how to manipulate virtual objects easily on handheld devices.

We welcome contributions that are focused on technical aspects of real or virtual realisations, as well as their use in pedagogy, outreach or public communication of Earth Sciences research topics.

Convener: Renaud Toussaint | Co-convener: Paula Koelemeijer
Attendance Thu, 07 May, 16:15–18:00 (CEST)

Chat time: Thursday, 7 May 2020, 16:15–18:00

Chairperson: Renaud Toussaint, Paula Koelemeijer
D3537 | solicited | Highlight
Markus Wiedemann, Bernhard S.A. Schuberth, Lorenzo Colli, Hans-Peter Bunge, and Dieter Kranzlmüller

Precise knowledge of the forces acting at the base of tectonic plates is of fundamental importance, but models of mantle dynamics to date are still often qualitative in nature. One particular problem is that we cannot access the deep interior of our planet and therefore cannot make direct in situ measurements of the relevant physical parameters. Fortunately, modern software and powerful high-performance computing infrastructures allow us to generate complex three-dimensional models of the time evolution of mantle flow through large-scale numerical simulations.

In this project, we aim to visualize the resulting convective patterns that occur thousands of kilometres below our feet and to make them "accessible" using high-end virtual reality techniques.

Models with several hundred million grid cells are nowadays possible using modern supercomputing facilities, such as those available at the Leibniz Supercomputing Centre. These models provide quantitative estimates of inaccessible parameters, such as buoyancy and temperature, as well as predictions of the associated gravity field and seismic wavefield that can be tested against Earth observations.

3D visualizations of the computed physical parameters allow us to inspect the models as if we were actually travelling down into the Earth. In this way, convective processes that occur thousands of kilometres below our feet become virtually accessible by combining the simulations with high-end VR techniques.

The large dataset used here poses severe challenges for real-time visualization: it cannot fit into graphics memory, yet it must be rendered under strict deadlines. This necessitates balancing the amount of displayed data against the time needed to render it.

As a solution, we introduce a rendering framework and describe the workflow that allows us to visualize this geoscientific dataset. Our example exceeds 16 TByte in size, which is beyond the capabilities of most visualization tools. To display this dataset in real time, we reduce and declutter it through isosurfacing and mesh optimization techniques.
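The isosurfacing step mentioned above can be illustrated in a few lines. The following is a minimal sketch, not the authors' rendering framework: it replaces a dense scalar volume with a triangle mesh of a single level set, using marching cubes from scikit-image (an assumed dependency; the function and variable names are illustrative).

```python
import numpy as np
from skimage import measure


def isosurface_from_volume(volume, level):
    """Reduce a dense scalar volume to a triangle mesh of the surface
    volume == level, the kind of data reduction isosurfacing provides.
    """
    # marching_cubes returns vertices, triangle indices, vertex normals
    # and the interpolated field values at the vertices.
    verts, faces, normals, _ = measure.marching_cubes(volume, level=level)
    return verts, faces, normals


# Synthetic stand-in for a temperature anomaly: a radially symmetric blob.
x, y, z = np.mgrid[-1:1:64j, -1:1:64j, -1:1:64j]
field = np.exp(-(x**2 + y**2 + z**2) / 0.1)

# The mesh of one isosurface is far smaller than the 64^3 volume,
# which is what makes real-time display of huge datasets feasible.
verts, faces, normals = isosurface_from_volume(field, level=0.5)
```

A production pipeline would follow this with mesh decimation and stream the resulting geometry to the GPU, as the abstract describes.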

Our rendering framework relies on multithreading and data-decoupling mechanisms that allow us to upload data to graphics memory while maintaining high frame rates. The final visualization application can be executed in a CAVE installation as well as on head-mounted displays such as the HTC Vive or Oculus Rift. The latter devices will allow for viewing our example on-site at the EGU conference.

How to cite: Wiedemann, M., Schuberth, B. S. A., Colli, L., Bunge, H.-P., and Kranzlmüller, D.: Visualising large-scale geodynamic simulations: How to Dive into Earth's Mantle with Virtual Reality, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-5714, https://doi.org/10.5194/egusphere-egu2020-5714, 2020.

D3538 | Highlight
Paula Koelemeijer, Jeff Winterbourne, Renaud Toussaint, and Christophe Zaroli

3D-printing techniques allow us to visualise geophysical concepts that are difficult to grasp, making them perfect for incorporation into teaching and outreach packages. Abstract models, often represented as 2D coloured maps, become more tactile when represented as 3D physical objects. In addition, new questions tend to be asked and different features noticed when handling such objects, while they also make outreach and education more inclusive to the visually impaired.

Some of our most effective models are simply exaggerated planetary topography in 3D, including Earth, Mars and the Moon. The resulting globes provide a powerful way to explain the importance of plate tectonics in shaping a planet and linking surface features to deeper dynamic processes. In addition, we have developed a simple method for portraying abstract global models by 3D printing globes of surface topography, representing the parameter of interest as additional, exaggerated long-wavelength topography. This workflow has been applied to models of dynamic topography, the geoid and seismic tomography. In analogy to Russian nesting dolls, the resulting “seismic matryoshkas” have multiple layers that can be removed by the audience to explore the structures present deep within our planet and learn about the ongoing dynamic processes.
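The idea of encoding an abstract global model as exaggerated radial topography can be sketched numerically. The snippet below is a generic illustration of the principle, not the authors' exact workflow: it maps a latitude-longitude scalar field onto globe vertices whose radius is perturbed by the (normalised, exaggerated) field value. Parameter names are illustrative.

```python
import numpy as np


def field_to_globe(field, base_radius=1.0, exaggeration=0.2):
    """Turn a 2D (lat, lon) scalar field into 3D globe vertices,
    encoding the field as exaggerated radial relief.
    """
    nlat, nlon = field.shape
    lat = np.linspace(-np.pi / 2, np.pi / 2, nlat)
    lon = np.linspace(0.0, 2.0 * np.pi, nlon, endpoint=False)
    lon_g, lat_g = np.meshgrid(lon, lat)

    # Normalise the field to [-1, 1], then add it as radial relief.
    relief = field / np.max(np.abs(field))
    r = base_radius * (1.0 + exaggeration * relief)

    x = r * np.cos(lat_g) * np.cos(lon_g)
    y = r * np.cos(lat_g) * np.sin(lon_g)
    z = r * np.sin(lat_g)
    return np.stack([x, y, z], axis=-1)


# Example: a simple degree-2 pattern, loosely evocative of
# long-wavelength geoid anomalies (purely synthetic data).
nlat, nlon = 90, 180
lat = np.linspace(-np.pi / 2, np.pi / 2, nlat)[:, None]
lon = np.linspace(0, 2 * np.pi, nlon, endpoint=False)[None, :]
field = np.cos(2 * lon) * np.cos(lat) ** 2
verts = field_to_globe(field, base_radius=1.0, exaggeration=0.2)
```

Triangulating these vertices and exporting them as STL would then give a printable globe of the field.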

While these 3D objects are easily printed on a cheap (<300 GBP / 400 USD) desktop 3D printer, printing times still prohibit large-scale production. To ensure that there is sufficient material in a teaching setting, we have therefore also developed complementary paper equivalents. By projecting the coloured maps onto a dodecahedron, we developed cut-out-and-fold models to be handed out in a classroom setting to complement the 3D-printed globes used for demonstration purposes. Together with animations, suggested questions and instructor "cheat-sheets", these materials form a complete teaching and outreach package that is both interactive and inclusive.

How to cite: Koelemeijer, P., Winterbourne, J., Toussaint, R., and Zaroli, C.: 3D printing the world: developing geophysical teaching materials and outreach packages, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-3515, https://doi.org/10.5194/egusphere-egu2020-3515, 2020.

D3539
Renaud Toussaint, Paula Koelemeijer, and Christophe Zaroli

Globe representations of the Earth have a long history in pedagogy and outreach. By helping people picture global processes, they support the conception and manipulation of global fields and planetary geography. Producing a physical representation of such global fields is demanding. 3D printing is well suited to representing scalar data at a fixed time, for example via deformed elevation maps. We propose here an alternative that makes it easy to represent dynamic fields, reproducing in simple form the experience of the first astronauts who saw planet Earth as a "pale blue dot". To that end, we use virtual reality to represent time-varying fields on a globe, associated with a physical object that permits spatial manipulation. The Unity engine, common in video game development, and the Vuforia library, which provides augmented-reality tracking, are used for the development. The fields represented relate to the solid Earth and to oceanic and atmospheric dynamics: seismic velocity fields, global seismicity catalogues, the geoid, the geothermal gradient, and oceanic and atmospheric currents. The software can be easily deployed on tablets and phones, complementing printed images.

How to cite: Toussaint, R., Koelemeijer, P., and Zaroli, C.: Inside blue dots - Grasping dynamic global fields thanks to Virtual Reality, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-11708, https://doi.org/10.5194/egusphere-egu2020-11708, 2020.

D3540
Alain Steyer, Tom Vincent-Dospital, and Renaud Toussaint

Anisotropic phenomena have long been studied in the vicinity of seismic faults. It has for instance been shown that both in situ pore fluids and seismic mechanical waves travel at different velocities along different directions of a fault zone. Yet, while more and more complexity and disorder are introduced into seismic models to better understand earthquakes, frictional anisotropy is only rarely considered. In many domains other than geophysics, however, such anisotropy in solid friction is believed to be crucial. For instance, the tribology of rubber tires, skis or advanced adhesives is improved when they are designed to have a preferential frictional direction. Numerous natural systems also benefit from such anisotropy: it is notably essential in the motion of many animal skins and in the efficient hydration of some plants. In most cases, these frictional anisotropies derive from preferential topographic orientations on at least one of the contact surfaces, with structural directivity at multiple and various scales. Seismic faults also exhibit such preferential directions in their topography: single rock crystals, such as antigorite, can already display some frictional anisotropy; fault zones are initiated by early fractures that often propagate through layered sediments, generating ramp-flat morphology on their surfaces; and, finally, mature faults are marked by grooves of various wavelengths due to slip-induced erosion.


In this work, we study how the morphology of faults affects their stability, as characterized by their static Coulomb coefficient of friction. In particular, we study its anisotropy with respect to the slip direction. To do so, we make use of 3D-printing technology to print actual fault surfaces measured in the field. We perform friction experiments with gypsum casts of these 3D-printed faults, as mineral-like materials might deform differently under shear than plastic materials. With these experiments, we show that the friction coefficient along seismic faults is highly anisotropic, with slip motions that are easier in, but not limited to, the direction of the main grooves. This anisotropy could be paramount to better predicting the next direction of rupture along some faults under a varying stress state: the rupture direction may depend not only on the orientation of the main regional stress, but also on the structural anisotropy, as stress and friction anisotropy together determine along which orientation a rupture criterion is first exceeded.
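The quantity measured in such experiments can be made concrete with a small sketch. The static Coulomb friction coefficient along each slip azimuth is the ratio of the tangential force at slip onset to the applied normal load; its variation with azimuth quantifies the anisotropy. The numbers below are made up for illustration only and are not the authors' data.

```python
import numpy as np

# Hypothetical measurements: slip azimuth relative to the main grooves,
# constant normal load, and tangential force at the onset of sliding.
azimuth_deg = np.array([0, 30, 60, 90, 120, 150])
normal_force = 10.0                                              # N
tangential_at_onset = np.array([6.0, 6.5, 7.4, 8.2, 7.5, 6.4])   # N

# Static Coulomb friction coefficient per azimuth: mu = F_t / F_n.
mu = tangential_at_onset / normal_force

# A simple scalar measure of anisotropy: ratio of hardest to easiest
# direction. A value of 1 would mean isotropic friction.
anisotropy = mu.max() / mu.min()

# Slip is easiest along the azimuth with the smallest mu,
# here the direction of the main grooves (0 degrees).
easiest = azimuth_deg[np.argmin(mu)]
```

In this illustrative dataset, friction is lowest parallel to the grooves and highest perpendicular to them, the qualitative behaviour the abstract reports.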

How to cite: Steyer, A., Vincent-Dospital, T., and Toussaint, R.: Frictional anisotropy in casted seismic faults: or how to 3D print a fault to better characterize it, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-19131, https://doi.org/10.5194/egusphere-egu2020-19131, 2020.

D3541 | solicited | Highlight
Jordi Cortés, Daniel Garcia-Castellanos, Angel Valverde, and Samadrita Karmakar

Augmented-reality sandboxes are increasingly used for outreach purposes in many fields. Here we show the benefits of modifying a standard AR sandbox to significantly improve its teaching capability in Earth Science.

The first prototypes of AR sandboxes date back to at least the Microsoft Fest conference in Prague; they consist of a Kinect sensor scanning the surface of the sand and producing a digital elevation model (DEM) of it in real time. This DEM is used to compute the flow of virtual water on the surface and to produce an image combining the DEM and the water, which is projected back onto the surface through a standard image projector, also in real time. In this way, water appears to flow on top of the actual sand topography, responding to any manipulation of the surface with a time lag shorter than 1 s. The idea was popularized thanks to the open-source ARSandbox distribution published by Reed & Kreylos (2014).

In our portable sARndbox device, we have modified the original GLSL (OpenGL Shading Language) SARndbox code with the purpose of teaching experimentally how erosion and geodynamics interact during the development of Earth's topography and relief. Our version of the fragment shader file allows us to visualize the areas of enhanced erosion and sedimentation driven by high and low water energy, respectively, to better communicate their role in shaping the landscape. This is done by colour-shading the water as a function of water flow energy, which is approximated as proportional to water depth and velocity at each location. The modified scripts and other information are available on GitHub (https://github.com/danigeos/sARndbox).
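The shading rule described above can be sketched outside the shader. The following Python snippet is a hedged illustration of the classification logic, not the project's GLSL code: the energy proxy (depth times speed) follows the abstract, while the threshold values and names are assumptions chosen for the example.

```python
import numpy as np


def shade_flow(depth, velocity, erosion_thresh=0.6, deposition_thresh=0.1):
    """Classify each DEM cell by a water-flow 'energy' proxy.

    Returns an integer map: 2 = enhanced erosion (high energy),
    1 = neutral flow, 0 = sedimentation (wet but low energy),
    -1 = dry sand. Thresholds are illustrative.
    """
    speed = np.linalg.norm(velocity, axis=-1)
    energy = depth * speed                       # proxy from the abstract

    shading = np.full(depth.shape, 1, dtype=int)  # neutral by default
    shading[energy >= erosion_thresh] = 2         # erosion zones
    shading[energy < deposition_thresh] = 0       # deposition zones
    shading[depth <= 0.0] = -1                    # dry cells
    return shading


# Tiny synthetic example: a 2x3 patch of water depths (arbitrary units)
# under a uniform unit flow with components (vx, vy) = (1, 1).
depth = np.array([[0.0, 0.2, 1.0],
                  [0.5, 0.05, 0.8]])
velocity = np.ones(depth.shape + (2,))
shading = shade_flow(depth, velocity)
```

In the actual sandbox, this per-cell classification is evaluated per fragment on the GPU and mapped to colours before projection onto the sand.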

The setting has proved useful in conveying basic principles of landscape evolution to students ranging from primary school to Master's level. We used it in combination with 3D prints of real tectonic plates and concept explanations, in sessions typically lasting between 30 and 60 minutes.


How to cite: Cortés, J., Garcia-Castellanos, D., Valverde, A., and Karmakar, S.: Teaching erosion and landscape evolution with an Augmented Reality Sandbox, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-7069, https://doi.org/10.5194/egusphere-egu2020-7069, 2020.

D3542 | solicited
Dominique Frizon de Lamotte, Pascale Leturmy, Pauline Souloumiac, and Adrien Frizon de Lamotte

Geology is a scientific discipline where a 3D view is important, even essential. When starting to learn geology, one of the first exercises students should master is gaining a 3D vision of geological maps, which, like all maps, are 2D objects, and interpreting them. Many people have a genuine difficulty in "seeing in 3D", that is, in achieving a mental representation of a dimension that is not shown. To help them in this task, we propose a wide range of objects, which anyone can use or make, in line with an educational approach that combines digital creation and object manipulation. Our computer-designed prototypes are saved in a format from which they can be 3D-printed. Three types of objects are presented:

(1) models, which help to see things in 3D and thus understand particular structures;

(2) models where the third dimension offers an approach to successive geometries (kinematics) during the formation of particular geological structures;

(3) models that provide the opportunity to move different parts relative to each other to generate structures like faults.

We expect that using our models, and possibly creating other objects by themselves, will help students find their way in this 3D world, which is often confusing at first sight. We will also present printed models of natural examples in different geological contexts. The target audience is students from first degree to Master's level, trainee teachers, secondary school science teachers and amateur or professional geologists. We also want to reach the growing network of 'fablabs', whether or not they are university-based.

How to cite: Frizon de Lamotte, D., Leturmy, P., Souloumiac, P., and Frizon de Lamotte, A.: Using 3D printed models to help the understanding of geological maps, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-8757, https://doi.org/10.5194/egusphere-egu2020-8757, 2020.

D3543
Ryan L Payton

Researchers often have to carefully select data for figures to best show their results for a static 2D format such as a conference poster or outreach handout. This can result in the scientific message being harder to understand or only part of the story being visualised. Augmented reality can help in improving the clarity of temporal data as well as the understanding of 3D structures which may be challenging to otherwise visualise.

A series of software packages may be used to take video files (MP4, AVI, etc.) and 3D model files (OBJ, STL, PLY, etc.) and pair them with a target image detectable by a mobile app for Android or iOS. The Vuforia engine plug-in for Unity allows target images to be imported for use with AR and paired with a 3D model or video in Unity. Manipulation of the AR element is achieved using the Lean Touch asset in Unity, allowing scaling, rotation and movement.

The incorporation of AR in science communication at a professional and public level creates a memorable interaction, enriched by greater scientific clarity. The interactive element of AR, especially using Lean Touch, makes it an appealing tool for the public and children, resulting in greater engagement with science. The ability to show more data, such as full simulations or experiment time lapses rather than a select series of still images, also makes this an appealing tool for researchers in a variety of fields, including modellers, experimentalists and anyone using digital data.

How to cite: Payton, R. L.: Dynamic and Interactive Scientific Posters: Visualising 3D Models and Simulation Data Using AR, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-19037, https://doi.org/10.5194/egusphere-egu2020-19037, 2020.

D3544
Raluca Ilie, Eric Shaffer, Cynthia D’Angelo, Erhan Kudeki, Olivia Coiado, Lucas Wagner, and Marcia Pool

A solid understanding of electromagnetic theory is key to the education of electrical engineering students. However, concepts in electricity and magnetism (E&M) are notoriously challenging for students to learn, due to the difficulty in grasping abstract concepts such as the electric force as an invisible force that is acting at a distance, or how electromagnetic radiation is permeating and propagating in physical space. Building physical intuition to manipulate these abstractions requires means to visualize electromagnetism concepts in a three-dimensional space. This project involves the development of 3D visualizations of abstract E&M concepts in Virtual Reality (VR), in an immersive, exploratory, and engaging environment, with the potential to be adopted by Engineering, Science, Mathematics and Medical college curricula across the country.

VR provides a disruptive platform for teaching and learning in a realistic and, most importantly, interactive three-dimensional environment. There are many advantages to using VR as a teaching tool: it has the potential to address many challenges traditional teaching faces, and it can increase student engagement while removing some of the anxiety students experience in active learning environments. Virtual Reality provides the means of exploration, allowing visuals and manipulable objects to be constructed to represent knowledge, which in turn leads to a constructivist way of learning, in the sense that students build their own knowledge from meaningful experiences.

The VR labs for E&M courses in the ECE department are generated by Electrical Engineering and Computer Science students enrolled in the "Virtual Reality" course at the same university, as part of their course term projects. This reflects the strong educational impact of this project, as it allows students to contribute to the educational experiences of their peers. Student competencies around conceptual understanding of electromagnetism topics, as well as their understanding of mathematical concepts, are measured via formative and summative assessments. To evaluate the effectiveness of VR learning, each VR experience is followed by a short 10-minute multiple-choice test, designed primarily to measure conceptual understanding of the various topics rather than the ability to simply manipulate equations, and tied to the specific contexts and topics of that lab's instruction.

This paper discusses the implementation and pedagogy of the Virtual Reality laboratory experiences for visualizing concepts in E&M, with examples from specific labs, as well as challenges and student feedback on the new approach. We also discuss the integration of the 3D visualizations into lab exercises and the design of the student assessment tools used to assess the knowledge gained when the VR technology is employed. In addition, we discuss the development of VR labs to visualize concepts pertaining to elements of vector calculus, designed to enhance student understanding of operators such as the gradient, curl and divergence, as well as VR labs to visualize concepts pertaining to spatial geometry and coordinate transformations.
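The kind of field data such a vector-calculus VR lab visualizes can be generated in a few lines. This is a minimal sketch under stated assumptions, not the project's code: it samples the (scaled) potential of a point charge on a 3D grid and derives the electric field as the negative gradient, the sort of dataset one would then render as arrows or streamlines in VR.

```python
import numpy as np

# Sample a point-charge potential V ~ 1/r on a regular 3D grid
# (units and softening are illustrative choices, not physical constants).
n = 32
axis = np.linspace(-1.0, 1.0, n)
x, y, z = np.meshgrid(axis, axis, axis, indexing="ij")
r = np.sqrt(x**2 + y**2 + z**2) + 1e-6   # soften the r = 0 singularity
potential = 1.0 / r

# The electric field is the negative gradient of the potential:
# E = -grad(V), evaluated here by central finite differences.
h = 2.0 / (n - 1)                        # grid spacing
gx, gy, gz = np.gradient(potential, h)
Ex, Ey, Ez = -gx, -gy, -gz               # field points away from the charge
```

In the VR lab, students would walk through this field and watch the arrows shrink with distance, making the abstract inverse-square behaviour tangible.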



How to cite: Ilie, R., Shaffer, E., D’Angelo, C., Kudeki, E., Coiado, O., Wagner, L., and Pool, M.: Learning by Immersion: Developing Virtual reality Labs for Engineering Courses, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-6114, https://doi.org/10.5194/egusphere-egu2020-6114, 2020.

D3545
Petra Žvab Rožič, Matevž Novak, Boštjan Rožič, Nace Pušnik, and Helena Gabrijelčič Tomc

The Montanistika building, which houses the Department of Geology of the University of Ljubljana, is a remarkable building entered in the register of cultural heritage. The interior (walls, floors and other elements) is adorned with numerous stone elements that emphasize its monumentality and also carry important information about the extraction and use of natural stone in the past. From a geological point of view, the corridors and lobby of the building represent a special geological museum that provides a place for education and combines natural and cultural heritage.

Local architectural stone (mostly Slovenian, partly Croatian) was used to decorate the building interior. The rocks used cover representatives of all three basic rock types (sedimentary, igneous and metamorphic), which offers the opportunity to use them for the dissemination of geological content to the wider public.

The main objective of the research was to present this natural heritage to a wider audience in a narrative way using pictured dialogues, augmented reality (AR) and virtual reality (VR). In addition to the practical aspects of planning and designing a digital representation of natural heritage, the research also covered the learning process of graphic design students, whose task was to find the best digital presentation of the natural heritage with new media.

Three approaches were implemented to digitally and interactively present the representative rocks in the Montanistika building. The workflow included the following creative steps: defining a digital strategy for the natural heritage presentation, defining content types and functionalities of the interactive media, planning the information architecture and designing wireframes, content creation (character design, 3D acquisition of the rocks, text and graphics creation), graphic design (layout, composition of elements), interaction and navigation design, development of the AR and VR applications, and testing and optimization.

In the AR apps, the rocks are interpreted and described through stories featuring their main characters, such as fossils and minerals. Comics were drawn around these stylized characters, based on geological knowledge and facts. The characters were included in animated, video and sound storytelling that augmented the building's walls, staircases and floors made of rock. These approaches present the main rock properties to the observer in a more attractive way. In the VR app, 360° scenes and 360° video recordings of the rocks were included. Here, detailed information about each rock is additionally presented in info boxes, and the navigation allows participants to move interactively from one virtual room to another. Elements such as stickers, tabs and overlays were added to make the materials even more interactive and appealing to a younger audience.

The results of the research present three approaches to the digitalization of natural heritage, involving different levels of presentative and/or interpretative principles. They demonstrate that VR presentations and stylized animated interpretations of rocks are valuable communicative media for digital natural heritage, enabling an immersive experience of geological content.

How to cite: Žvab Rožič, P., Novak, M., Rožič, B., Pušnik, N., and Gabrijelčič Tomc, H.: Stories of Montanistika – experience through comics, AR and VR, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-4777, https://doi.org/10.5194/egusphere-egu2020-4777, 2020.