EGU General Assembly 2020
© Author(s) 2021. This work is distributed under
the Creative Commons Attribution 4.0 License.

Developing a data-driven ocean model

Rachel Furner1,2, Peter Haynes1, Dan Jones2, Dave Munday2, Brooks Paige3,4, and Emily Shuckburgh1
  • 1University of Cambridge, Cambridge, UK
  • 2British Antarctic Survey, Cambridge, UK
  • 3UCL, London, UK
  • 4Alan Turing Institute, London, UK

The recent boom in machine learning and data science has led to a number of new opportunities in the environmental sciences. In particular, climate models represent the best tools we have to predict, understand and potentially mitigate climate change, however these process-based models are incredibly complex and require huge amounts of high-performance computing resources. Machine learning offers opportunities to greatly improve the computational efficiency of these models.

Here we discuss our recent efforts to reduce the computational cost associated with running a process-based model of the physical ocean by developing an analogous data-driven model. We train statistical and machine learning algorithms using the outputs from a highly idealised sector configuration of a general circulation model (MITgcm). Our aim is to develop an algorithm which is able to predict the future state of the general circulation model to a similar level of accuracy in a more computationally efficient manner.

We first develop a linear regression model to investigate the sensitivity of data-driven approaches to various inputs, e.g. temperature on different spatial and temporal scales, and meta-variables such as location information. Following this, we develop a neural network model to replicate the general circulation model, as in the work of Dueben and Bauer 2018, and Scher 2018.
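As an illustrative sketch of the first step, the snippet below trains a linear regression to predict the one-step temperature change at each grid point from the surrounding 3x3x3 stencil of values. The synthetic field, grid sizes, and the helper `stencil_samples` are stand-ins assumed here for illustration, not the actual MITgcm data or the authors' training code.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic stand-in for GCM temperature snapshots (time, z, y, x);
# the real study trains on MITgcm sector-configuration output.
field = rng.normal(size=(40, 8, 16, 16)).cumsum(axis=0) * 0.01

def stencil_samples(field):
    """Pair each interior point's 3x3x3 neighbourhood at time t
    with the temperature change at that point from t to t+1."""
    X, y = [], []
    for t in range(field.shape[0] - 1):
        for k in range(1, field.shape[1] - 1):
            for j in range(1, field.shape[2] - 1):
                for i in range(1, field.shape[3] - 1):
                    X.append(field[t, k-1:k+2, j-1:j+2, i-1:i+2].ravel())
                    y.append(field[t+1, k, j, i] - field[t, k, j, i])
    return np.array(X), np.array(y)

X, y = stencil_samples(field)
model = LinearRegression().fit(X, y)
print(model.coef_.shape)  # 27 weights, one per stencil point
```

A neural network version replaces the linear map with a small nonlinear network over the same stencil inputs.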

We present a discussion on the sensitivity of data-driven models and preliminary results from the neural network based model.


Dueben, P. D., & Bauer, P. (2018). Challenges and design choices for global weather and climate models based on machine learning. Geoscientific Model Development, 11(10), 3999-4009.

Scher, S. (2018). Toward Data-Driven Weather and Climate Forecasting: Approximating a Simple General Circulation Model With Deep Learning. Geophysical Research Letters, 45(22), 12,616-12,622.

How to cite: Furner, R., Haynes, P., Jones, D., Munday, D., Paige, B., and Shuckburgh, E.: Developing a data-driven ocean model, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-19339, 2020.


Comments on the display material

AC: Author Comment | CC: Community Comment

Display material version 1 – uploaded on 01 May 2020
  • CC1: Comment on EGU2020-19339, Alexander Barth, 04 May 2020

Thanks for your interesting poster! The model does not change much between two days (about 0.001 degrees Celsius) at the considered spatial scale (2° resolution). Did you consider using longer prediction time scales? For very short time scales (a day), the model dynamics are probably still quite close to linear.

    • AC1: Reply to CC1, Rachel Furner, 04 May 2020

      Hi Alexander, thanks for your question, and I'm glad you found the poster interesting!

I actually originally began by looking at forecasting over month-long time steps (using monthly mean output from MITgcm, and so training on, and predicting, the change in monthly mean temperature from one month to the next). The results from this were qualitatively similar in terms of the sensitivity analysis, although the actual statistics differed (as is to be expected, since what we are forecasting is also different, and the baseline persistence skill also changed). The scatter plots etc. for forecasting over monthly steps were similar to those shown here, so this change does not appear to fundamentally alter whether the dynamics can be modelled as linear to first order.

I did find that when using monthly steps I couldn't iteratively forecast well with the regressor. I haven't shown any of the iterative results here (for either daily or monthly time stepping), but I did run the model such that the regressor's temperature predictions were fed back in as inputs (along with MITgcm output for the other variables) to forecast future steps iteratively. When doing this with monthly time stepping we saw what looked a lot like numerical noise (checkered extreme values). My thinking is that, since we only use a 3x3x3 stencil of inputs, we run into a CFL-like issue: over month-long steps the likely change in temperature is influenced by the ocean state at points far outside the input stencil, and I think this causes the noise. By comparison, when running iteratively with daily time stepping the errors grow over time (as would be expected), but never in the same noisy way. The catch here, however, is that this requires more iterations to cover the same forecast period, so the errors accumulate faster than with large steps, and we move away from 'reality' within a few weeks of forecast time.
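The iterative feedback loop described above can be sketched in a few lines. Everything here is an illustrative stand-in (a 1-D periodic field and a diffusion-like `one_step` function in place of the trained regressor); the point is only the structure of the loop, in which each step's input is itself a prediction, so errors can compound.

```python
import numpy as np

# Toy 1-D periodic "ocean" temperature profile.
n = 32
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
state = np.sin(x)

# Stand-in for a trained one-step regressor: a local 3-point update,
# analogous to predicting each point from a small stencil of inputs.
def one_step(temp):
    return temp + 0.1 * (np.roll(temp, 1) - 2 * temp + np.roll(temp, -1))

# Iterative forecast: feed each prediction back in as the next input.
# With more, smaller steps, small per-step errors get more chances
# to accumulate over a fixed forecast period.
forecast = state.copy()
for _ in range(30):
    forecast = one_step(forecast)
```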

Looking into this further, and finding an 'optimum' step size over which errors are kept small, would be interesting further work.

      Thanks, Rachel

      • CC2: Reply to AC1, Alexander Barth, 04 May 2020

Thanks, Rachel, for your detailed response! What you are doing seems quite close to a recurrent neural network (without activation functions). It might be useful to look in this direction (in particular LSTMs, GRUs, etc., which address the problem of "exploding" gradients). In the context of data assimilation, one also sometimes needs to make a linear approximation of the model (in particular in 4D-Var). But running these linearized models (or their adjoints) leads to unbounded error growth, which might be similar to the effect that you are seeing.
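A minimal sketch of that unbounded error growth, using a toy linear one-step operator whose largest eigenvalue sits just outside the unit circle (the matrix and sizes are purely illustrative, not any actual tangent-linear model):

```python
import numpy as np

# Toy linearized one-step model: upper triangular, so its eigenvalues
# are the diagonal entries; the largest (1.05) exceeds one, meaning
# repeated application amplifies perturbations without bound.
A = np.triu(np.full((5, 5), 0.1))
np.fill_diagonal(A, [1.05, 0.9, 0.8, 0.7, 0.6])

err = np.full(5, 1e-6)  # small initial perturbation
norms = [np.linalg.norm(err)]
for _ in range(200):
    err = A @ err           # iterate the linear(ized) model
    norms.append(np.linalg.norm(err))

# Once the unstable mode dominates, growth is roughly 1.05**n per step n.
print(norms[-1] / norms[0])
```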

        Keep up your nice work!



        • AC2: Reply to CC2, Rachel Furner, 04 May 2020

Thanks for your comments - I hadn't considered it from a DA perspective before. And looking at recurrent networks certainly feels like the best way forward!