EGU2020-21494
https://doi.org/10.5194/egusphere-egu2020-21494
EGU General Assembly 2020
© Author(s) 2020. This work is distributed under
the Creative Commons Attribution 4.0 License.

Using Blender for Earth Science’s visualization

stella paronuzzi ticco, oriol tinto primis, and Thomas Arsouze
  • Barcelona Supercomputing Center

Blender is an open-source 3D creation suite with a wide range of applications and users. Even though it is not a tool specifically designed for scientific visualization, it has proved to be very valuable for producing stunning visual results. We will show how, in our workflow, we go from model output written in netCDF to a finished visual product, relying only on open-source software. The visualization formats that can be produced range from static images to 2D/3D/360°/Virtual Reality videos, enabling a wide span of potential outcomes. These products are highly suitable for dissemination and scientific outreach.
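The netCDF-to-texture step of such a workflow can be sketched as follows. This is an editor's illustration, not the authors' actual code: the function name and the 8-bit grayscale scaling are assumptions, and in practice the 2D field would be read with a library such as netCDF4 before being handed to Blender as an image texture.

```python
import numpy as np

def field_to_texture(field):
    """Normalize a 2D model field (e.g. read from a netCDF file) to an
    8-bit grayscale array that Blender can load as an image texture.
    NaNs (e.g. land/ocean masks) are mapped to 0 after scaling."""
    field = np.asarray(field, dtype=float)
    lo, hi = np.nanmin(field), np.nanmax(field)
    if hi > lo:
        scaled = (field - lo) / (hi - lo)   # rescale to 0..1
    else:
        scaled = np.zeros_like(field)       # constant field: all black
    return (np.nan_to_num(scaled) * 255).astype(np.uint8)
```

From here the array would typically be written out as an image (e.g. with imageio) and mapped onto a UV sphere inside Blender.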

How to cite: paronuzzi ticco, S., tinto primis, O., and Arsouze, T.: Using Blender for Earth Science’s visualization, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-21494, https://doi.org/10.5194/egusphere-egu2020-21494, 2020


Comments on the display

AC: Author Comment | CC: Community Comment

Displays version 2 – uploaded on 05 May 2020
Added system requirements
  • CC1: Comment on EGU2020-21494, Eelco Doornbos, 05 May 2020

    Thanks for the excellent presentation. I am an enthusiastic user of Blender for scientific visualisation and have some examples in my own EGU presentation here: https://meetingorganizer.copernicus.org/EGU2020/EGU2020-7386.html

    I have been working on converting my python scripts into a Blender add-on to work directly with the NetCDF data from the Blender interface, but this is still in very early stages, and I am not able to spend a lot of time on this. So I am happy to hear that you have been working on this as well. Let me know if you see a possibility to cooperate (exchange ideas, beta testing of the code, etc.).

  • CC2: Comment on EGU2020-21494, Julia Hargreaves, 05 May 2020

    I am not sure what you are doing. Are you taking a flat rectangular image and putting it on to a sphere? Anything more than that? 

    • AC2: Reply to CC2, Oriol Tinto, 05 May 2020

      Hi Julia,

Well, I think what I'm showing here is that there's a free and open-source tool that lets us create nice visualizations by simply taking a flat rectangular image and putting it onto a sphere.
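The flat-image-to-sphere mapping described here is just an equirectangular projection: longitude and latitude map linearly to the texture's UV coordinates. A minimal sketch (an editor's illustration, assuming latitude and longitude in degrees with the usual -180..180 / -90..90 conventions):

```python
def latlon_to_uv(lat_deg, lon_deg):
    """Map geographic coordinates to equirectangular UV coordinates
    in [0, 1], as used when wrapping a flat world map onto a sphere."""
    u = (lon_deg + 180.0) / 360.0   # longitude -180..180 -> u in 0..1
    v = (lat_deg + 90.0) / 180.0    # latitude   -90..90  -> v in 0..1
    return u, v
```

Blender's built-in UV sphere with a "sphere projection" unwrap effectively applies this same mapping.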

  • CC3: is there a website where we can follow your updates?, Maxime Bernard, 05 May 2020

I also use Blender to render my model outputs. I wrote a Python script to import a suite of model outputs into Blender and render them at the same time, which is very convenient! Here is an example of an animation.

I am therefore very interested in your work in Blender! Is there a website where we can follow your updates? Thank you in advance.

    • AC3: Reply to CC3, Oriol Tinto, 05 May 2020

      Hi Maxime,

it is nice to see that there are other people using Blender and willing to collaborate in making it easier to use for other scientists.

      I don't really have a place in which I share this work, please contact me by mail and we will find a way to keep you updated.

  • CC4: Comment on EGU2020-21494, Paula Koelemeijer, 05 May 2020

    Hi Oriol,

Very nice to see Blender being used for these visualisations! We have taken a similar approach - using a rectangular image to deform the surface of a sphere - with the primary goal then to export these for 3D printing purposes: https://meetingorganizer.copernicus.org/EGU2020/EGU2020-3515.html

    There will be more on this on Thursday in session EOS4.3 (which is also talking about AR visualisation): https://meetingorganizer.copernicus.org/EGU2020/displays/34772

    We will look forward to your netCDF tool, as so far we generate images ourselves with grids from GMT.

    Best wishes,

    Paula
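The approach Paula describes, deforming a sphere's surface with a rectangular image, amounts to radial displacement: each vertex is pushed outward in proportion to the heightmap value sampled at its latitude/longitude. A minimal sketch (an editor's illustration; the function name, the normalized 0..1 height convention, and the displacement scale are assumptions):

```python
import math

def displace_vertex(lat_deg, lon_deg, height, radius=1.0, scale=0.1):
    """Return the XYZ position of a sphere vertex pushed outward by a
    normalized height value in 0..1, as used for relief globes."""
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    r = radius * (1.0 + scale * height)     # radial displacement
    x = r * math.cos(lat) * math.cos(lon)
    y = r * math.cos(lat) * math.sin(lon)
    z = r * math.sin(lat)
    return x, y, z
```

In Blender itself this is what a Displace modifier driven by an equirectangular texture does; the displaced mesh can then be exported (e.g. as STL) for 3D printing.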

    • AC4: Reply to CC4, Oriol Tinto, 06 May 2020

      Hi Paula,

      thanks!

      I took a look at your work and it is really nice to see how you bring this scientific data into a real world object.

      Great to know that using netCDF directly in Blender can have other uses too.

      Cheers,

      Oriol

      • CC6: Reply to AC4, Paula Koelemeijer, 06 May 2020

        Hi Oriol,

        Thanks! They are really useful for teaching and outreach, particularly things like the geoid model or the crustal thickness one! All our designs are open online on https://www.thingiverse.com/jeffwinterbourne/designs, so if you have a 3D printer accessible, you can print them relatively easily.

The only limitation for us at the moment is that we can only do mono-colour prints, so the 3D visualisations from Blender are really complementary!

        Cheers,

        Paula

    • CC5: Reply to CC4, Eelco Doornbos, 06 May 2020

      Oh, those 3D printing examples are very nice!

      It would be really nice to be able to 3D print the magnetospheric shells that I have in my presentation. Direct link to the magnetosphere Blender viz is here: https://www.youtube.com/watch?v=P4LCuqbLTYU

      • CC8: Reply to CC5, Paula Koelemeijer, 06 May 2020

        Hi Eelco,

        Thanks! They are fun to make and have!

        I had a look at your video - very cool visualisation! Printing something like that is quite hard though the way we do things (PLA printer), as we cannot have huge gaps or overhangs easily.

Lots of cool 3D printing models are shared on the Thingiverse website - take a look! Ours are available here: https://www.thingiverse.com/jeffwinterbourne/designs

        Cheers,

        Paula


  • CC9: Blender netCDF Tool, Stefan Versick, 06 May 2020

    I never thought about using Blender for scientific visualisation before. Now I am interested in it. What do you think is the time scale to finish your netCDF Tool?

    • AC5: Reply to CC9, Oriol Tinto, 06 May 2020

      Thank you for the interest. I'm afraid that I don't have an answer for your question yet.

Let's see how we can coordinate with the people interested in collaborating, and how much time I have left from other, more pressing responsibilities.

Displays version 1 – uploaded on 04 May 2020
  • CC1: Hardware requirements?, Markus Ungersböck, 05 May 2020

If someone comes from a supercomputing facility and recommends software, you have to ask yourself whether you are equipped accordingly. I have made my first attempts with Blender, but when it comes to rendering, a simple PC or laptop needs hours or even days to finish even a short sequence. Happy is whoever is equipped with a good graphics card. How was the beautiful video shown here made? Do I need a supercomputer, a workstation, or an average PC with a graphics card to create a similar animation (in hours, not weeks)?

    Very nice work with a lot of potential for the future!

    Regards Markus

    • AC1: Hardware requirements, Oriol Tinto, 05 May 2020

      Hi Markus,

      thanks for the feedback, I'll add this information in the presentation.

Most parts of this presentation were produced on a laptop, so no, you don't need a supercomputer.

      Blender comes with two render engines, Cycles and EEVEE.

Cycles is a physically based renderer that can produce photorealistic results but requires time, while EEVEE is a real-time render engine focused on speed which can also achieve great results. Most parts of the presentation were done using EEVEE.

EEVEE was introduced in version 2.8, so if you tried Blender in the past, it probably was not an option yet.

On a laptop, the time required to produce each frame ranged from less than a second to around 20 seconds for the most complex scenes.
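Those per-frame times translate into a simple back-of-envelope render budget (an editor's sketch; the clip length and frame rate below are assumptions, not figures from the presentation):

```python
def render_time_minutes(n_frames, seconds_per_frame):
    """Total render time in minutes for an animation."""
    return n_frames * seconds_per_frame / 60.0

# A hypothetical 30-second clip at 25 fps:
frames = 30 * 25                          # 750 frames
fast = render_time_minutes(frames, 1)     # ~12.5 minutes at 1 s/frame
slow = render_time_minutes(frames, 20)    # ~250 minutes at 20 s/frame
```

So even the worst case quoted (around 20 s per frame) keeps a short animation within hours on a laptop, not weeks.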


      Best Regards,

      Oriol

      • CC2: Reply to AC1, Markus Ungersböck, 05 May 2020

        Thank you very much for the quick answer - sounds very good! All the best, Markus