EGU26-17470, updated on 14 Mar 2026
https://doi.org/10.5194/egusphere-egu26-17470
EGU General Assembly 2026
© Author(s) 2026. This work is distributed under
the Creative Commons Attribution 4.0 License.
Oral | Wednesday, 06 May, 17:50–18:00 (CEST)
 
Room 2.15
Hybrid Numerical–Machine Learning Framework for Predicting Aquifer Drainage
Sami Ghordoyee Milan1, Mehdi Rasti2, and Ali Torabi Haghighi1
  • 1Water, Energy, and Environmental Engineering Research Unit, University of Oulu, Oulu, Finland (sami.ghordoyeemilan@oulu.fi, ali.torabihaghighi@oulu.fi)
  • 2Faculty of Information Technology and Electrical Engineering, University of Oulu, Oulu, Finland, (mehdi.rasti@oulu.fi)

Drainage from the aquifer is one of the most important components of the groundwater balance in aquifers situated in humid and extremely humid climates. Modeling this interaction is essential for managing groundwater and adjusting the groundwater balance, yet it has rarely been considered, and no comprehensive approach has been put forward, owing to the complexity of simulating and forecasting the two coupled systems. Aquifer drainage occurs through two mechanisms: in the first, a reach of the river serves as a natural drain; in the second, artificial drains are excavated to a depth of one to three meters to control the rise of the groundwater table. Drainage regulates groundwater levels, prevents waterlogging of agricultural land, limits land salinization and groundwater contamination, and protects plant roots. Based on the outcomes of numerical modeling, machine learning models can forecast the amount of drainage from the aquifer. This strategy led to a combined numerical-modeling and machine-learning method that first simulates the aquifer–drainage system and then estimates drainage from the aquifer. The approach was evaluated on the Guilan Plain in northern Iran. The aquifer–drainage system was simulated using MODFLOW in GMS software, and the drained volume was then predicted using Gaussian process regression (GPR), a probabilistic, kernel-based machine learning technique. Features such as surface recharge, groundwater level, topography, and aquifer discharge were extracted from the simulated model and used as inputs to the machine learning models to predict the amount of aquifer drainage. The MODFLOW simulation results showed that the drains discharge a significant volume of groundwater each year, which cannot be disregarded when evaluating the groundwater balance.
In addition, GPR performed best in predicting the volume of aquifer drainage, with an RMSE of 1.772 thousand cubic meters per month, an MAE of 0.70 thousand cubic meters per month, and an NSE of 0.78. The proposed approach can be investigated in other comparable regions to forecast aquifer drainage and adjust the groundwater balance based on the results obtained.
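As a minimal sketch of the learning step described above, the code below implements Gaussian process regression from first principles, mapping four feature columns (stand-ins for surface recharge, groundwater level, topography, and aquifer discharge) to a drainage target. All data here are synthetic, and the RBF kernel, length scale, and noise level are illustrative assumptions, not the study's actual configuration, in which the inputs would come from the calibrated MODFLOW model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for the four simulated features, scaled to [0, 1]:
# recharge, groundwater level, topography, aquifer discharge.
X_train = rng.uniform(0.0, 1.0, size=(50, 4))
# Synthetic drainage target: driven mainly by recharge and water level,
# plus a small observation noise.
y_train = 2.0 * X_train[:, 0] + 1.5 * X_train[:, 1] + 0.05 * rng.standard_normal(50)

def rbf_kernel(A, B, length_scale=0.5, variance=1.0):
    """Squared-exponential covariance between the row vectors of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / length_scale**2)

noise = 1e-2  # assumed observation-noise variance
K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
L = np.linalg.cholesky(K)  # K = L @ L.T
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))  # K^-1 y

def predict(X_new):
    """GPR posterior mean and variance at new feature vectors."""
    K_s = rbf_kernel(X_new, X_train)
    mean = K_s @ alpha
    v = np.linalg.solve(L, K_s.T)
    var = np.diag(rbf_kernel(X_new, X_new)) - (v**2).sum(axis=0)
    return mean, var

mean, var = predict(X_train[:5])
```

The posterior variance is what makes GPR attractive here: alongside each predicted drainage volume it reports an uncertainty, which is useful when the predictions feed back into a groundwater-balance assessment.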

How to cite: Ghordoyee Milan, S., Rasti, M., and Torabi Haghighi, A.: Hybrid Numerical–Machine Learning Framework for Predicting Aquifer Drainage, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-17470, https://doi.org/10.5194/egusphere-egu26-17470, 2026.