EGU26-11026, updated on 14 Mar 2026
https://doi.org/10.5194/egusphere-egu26-11026
EGU General Assembly 2026
© Author(s) 2026. This work is distributed under
the Creative Commons Attribution 4.0 License.
Oral | Monday, 04 May, 09:35–09:45 (CEST)
 
Room 0.94/95
Characteristic Interior Structures of Jupiter and Saturn Revealed with Machine Learning
Maayan Ziv, Eli Galanti, and Yohai Kaspi
  • Weizmann Institute of Science, Department of Earth and Planetary Sciences, Rehovot, Israel (maayan.ziv@weizmann.ac.il)
Jupiter and Saturn provide complementary observational windows into giant-planet interiors: Jupiter through in situ atmospheric measurements and Juno gravity measurements, and Saturn through Cassini gravity data together with ring seismology, offering a critical Solar-System benchmark for exoplanet studies. Yet even with these recent, highly precise data, inferring interior structures remains a fundamentally degenerate inverse problem.
 
To address this, we develop a unified framework that retains the accuracy of the concentric Maclaurin spheroid (CMS) method for computing hydrostatic interior models of rapidly rotating planets, while dramatically improving efficiency using NeuralCMS, a machine-learning surrogate trained on CMS solutions. NeuralCMS enables rapid exploration of broad interior parameter spaces and is coupled to a self-consistent wind model that links the atmosphere and deep interior via wind-induced gravity, allowing atmosphere–interior interactions to be treated consistently.
 
We apply this approach to Jupiter and Saturn under the same modeling assumptions, enabling a like-for-like comparison between the planets. Using clustering analysis on the multidimensional model ensembles, we identify four characteristic classes of interior structures for each planet, reflecting differences in envelope properties and core configuration. We further show that the diversity of solutions can be captured by two effective parameters: one describing the envelope and one describing the deep planetary core. With tighter observational constraints, the solutions collapse to a single class for each planet, revealing similar architectures yet distinct most-plausible interiors.
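The clustering step can be sketched as follows. The synthetic ensemble, the three parameter names, and the choice of k-means with four clusters are illustrative assumptions, not the published analysis.

```python
# Sketch: group an ensemble of accepted interior models into
# characteristic classes. The synthetic ensemble stands in for the
# multidimensional set of models consistent with the gravity data.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# Synthetic ensemble: four loose groups in a 3-D parameter space
# (hypothetical axes: core mass, envelope metallicity, transition pressure).
centers = np.array([[0.2, 0.1, 0.5],
                    [0.8, 0.3, 0.4],
                    [0.5, 0.7, 0.2],
                    [0.3, 0.9, 0.8]])
ensemble = np.vstack([c + 0.05 * rng.standard_normal((500, 3))
                      for c in centers])

# Standardize so each parameter contributes comparably to the distances.
X = StandardScaler().fit_transform(ensemble)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

# Each cluster centroid (in original units) is one "characteristic structure".
for k in range(4):
    print(k, ensemble[labels == k].mean(axis=0).round(2))
```

In practice the number of clusters would be chosen from the data (e.g. via silhouette scores) rather than fixed in advance, and the centroids summarize each class of interior structure.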
 
This work shows that machine learning can accelerate comprehensive, accurate interior modeling and distill it into representative structures and effective parameters, an approach especially valuable for exoplanets, where interior inference is even more degenerate given wider parameter spaces and fewer measurements.
 

How to cite: Ziv, M., Galanti, E., and Kaspi, Y.: Characteristic Interior Structures of Jupiter and Saturn Revealed with Machine Learning, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-11026, https://doi.org/10.5194/egusphere-egu26-11026, 2026.