Shale microstructure is critical for predicting shale-oil “sweet spots” and improving reservoir evaluation. As a typical unconventional reservoir rock, shale exhibits ultra-low permeability, complex pore systems, strong heterogeneity, and diverse mineral compositions with highly uneven spatial distributions; pore morphology and connectivity, together with mineral assemblages, strongly control fluid migration, mechanical behaviour, and acoustic–electrical responses. However, conventional rock-physics experiments and image-analysis workflows are often time-consuming and insufficiently accurate for such complex materials. Here we develop a multi-scale, multi-modal segmentation workflow based on nnU-Net v2. Using paired 0.3 µm-resolution SEM and QEMSCAN images, we perform multi-class segmentation of pores, organic matter, clays, felsic minerals, carbonates, and heavy-metal-bearing minerals, achieving a weighted Dice score of 0.95 and clearly outperforming threshold-based segmentation. The SEM-trained network is then transferred to 0.3 µm-resolution CT data to enable cross-modality prediction and to reconstruct the three-dimensional distributions of the segmented phases. We further extend the model to 4 nm-resolution CT images for cross-scale, cross-modality segmentation; three denoising filters are evaluated to suppress noise and improve nanoscale segmentation accuracy. Finally, we compare 3D digital rock volumes generated from single-direction inference with those obtained by tri-axial inference and fusion, highlighting differences in volumetric consistency and structural representation. This workflow provides a robust basis for future multi-scale digital rock construction and for simulations of porosity, permeability, and saturation, thereby supporting more comprehensive shale-oil reservoir assessment.
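Two quantities in the abstract can be made concrete with a short sketch: a class-frequency-weighted Dice score for multi-class segmentation maps, and a voxel-wise majority vote as one possible form of "tri-axial inference and fusion". The weighting scheme and the fusion rule shown here are assumptions for illustration; the abstract does not specify either, and the function names (`weighted_dice`, `fuse_triaxial`) are hypothetical.

```python
import numpy as np

def weighted_dice(pred, truth, labels):
    """Frequency-weighted mean of per-class Dice scores.

    Each class is weighted by its fraction of ground-truth voxels,
    so abundant phases dominate the average (an assumed scheme;
    the abstract does not state how the 0.95 score is weighted).
    """
    scores, weights = [], []
    for k in labels:
        p, t = (pred == k), (truth == k)
        inter = np.logical_and(p, t).sum()
        denom = p.sum() + t.sum()
        scores.append(2.0 * inter / denom if denom else 1.0)
        weights.append(t.sum())
    w = np.asarray(weights, dtype=float)
    w /= w.sum()
    return float(np.dot(w, scores))

def fuse_triaxial(pred_xy, pred_xz, pred_yz, n_classes):
    """Voxel-wise majority vote over three single-axis label volumes
    (one plausible fusion rule; ties resolve to the lowest class index)."""
    stack = np.stack([pred_xy, pred_xz, pred_yz])
    counts = np.stack([(stack == k).sum(axis=0) for k in range(n_classes)])
    return counts.argmax(axis=0)
```

A perfect prediction yields a weighted Dice of 1.0, and when two of the three axis-wise volumes agree at a voxel, the majority vote keeps their label.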
How to cite: Huang, X., Guo, Y., Li, X., and Li, Y.: AI-Driven Automatic Segmentation and Quantitative Characterization of Shale Microstructures, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-3653, https://doi.org/10.5194/egusphere-egu26-3653, 2026.