- Federal Institute for Geosciences and Natural Resources (BGR), Hannover, Germany (jewgenij.torizin@bgr.de)
Semantic segmentation of texture-rich Earth-science imagery (e.g. UAV and outcrop photographs) is a common task, yet supervised segmentation workflows are often assembled from disconnected tools and still rely on labour-intensive, dense pixel-wise annotation. We present SegFlow, an end-to-end pipeline that integrates texture-patch curation, dataset synthesis, model training, experiment tracking, and inference for texture-centric segmentation.
SegFlow defines classes through curated texture patches and generates synthetic training composites and label masks using parameterised, procedural mask generation. This supports rapid model bootstrapping for initial training, reduces the amount of dense pixel-wise annotation required on real imagery, and helps keep label definitions consistent via versioned datasets and repeatable train/validation/test splits in a portable project structure.
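The idea of building composites from curated patches under procedural masks can be sketched as follows. This is a minimal illustration, not SegFlow's actual generator: the random-polygon rasteriser, the function names, and the patch-tiling strategy are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(42)

def random_blob_mask(size, n_vertices=8, rng=rng):
    """Rasterise a random star-shaped polygon as a boolean mask
    (an illustrative stand-in for a parameterised procedural mask)."""
    h, w = size
    cy, cx = rng.uniform(0.3, 0.7, 2) * (h, w)
    angles = np.sort(rng.uniform(0, 2 * np.pi, n_vertices))
    radii = rng.uniform(0.15, 0.45, n_vertices) * min(h, w)
    ys, xs = cy + radii * np.sin(angles), cx + radii * np.cos(angles)
    # even-odd (ray-casting) point-in-polygon test on the pixel grid
    yy, xx = np.mgrid[0:h, 0:w]
    mask = np.zeros((h, w), dtype=bool)
    j = n_vertices - 1
    for i in range(n_vertices):
        crossing = ((ys[i] > yy) != (ys[j] > yy)) & (
            xx < (xs[j] - xs[i]) * (yy - ys[i]) / (ys[j] - ys[i] + 1e-9) + xs[i]
        )
        mask ^= crossing
        j = i
    return mask

def synthesise(patches, size=(128, 128), rng=rng):
    """Tile each class's texture patch under a procedural mask to build
    one training composite and its integer label mask (0 = background)."""
    h, w = size
    image = np.zeros((h, w, 3), dtype=np.uint8)
    labels = np.zeros((h, w), dtype=np.uint8)
    for class_id, patch in enumerate(patches, start=1):
        tiled = np.tile(patch, (h // patch.shape[0] + 1,
                                w // patch.shape[1] + 1, 1))[:h, :w]
        m = random_blob_mask(size, rng=rng)
        image[m] = tiled[m]
        labels[m] = class_id  # later classes overwrite earlier ones
    return image, labels

# two dummy 16x16 texture patches standing in for curated texture classes
patches = [rng.integers(0, 255, (16, 16, 3), dtype=np.uint8) for _ in range(2)]
image, labels = synthesise(patches)
```

Because the composite and its label mask are produced in one pass, image and annotation cannot drift apart, which is what makes this kind of synthesis attractive for bootstrapping before any real imagery is annotated.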
For model development, SegFlow includes a PyTorch training interface centred on a configurable U-Net, with various segmentation losses and metrics. Training and inference are organised as scripted, job-based runs that capture data and model provenance in standardised run reports. For assisted segmentation, SegFlow combines the texture-focused U-Net with a prompt-based segmenter (Segment Anything Model, SAM) driven by sparse prompts (points or boxes). In our use case, SAM is helpful for object-like structures, whereas the U-Net is better suited to extended texture regions where prompting can be less stable. Outputs can be refined interactively, and corrected masks can be added back for iterative fine-tuning on real imagery.
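The abstract does not name the segmentation losses SegFlow ships; the soft Dice loss is one common choice for this class of problem and illustrates the kind of overlap-based objective involved. A minimal, framework-agnostic sketch (NumPy here; SegFlow's training interface is PyTorch-based, and this is not its implementation):

```python
import numpy as np

def soft_dice_loss(probs, target, eps=1e-6):
    """Soft Dice loss for binary segmentation: 1 - 2|P.T| / (|P| + |T|).
    probs: predicted foreground probabilities in [0, 1], shape (H, W)
    target: binary ground-truth mask, shape (H, W)
    eps: small constant that keeps empty masks from dividing by zero"""
    inter = (probs * target).sum()
    denom = probs.sum() + target.sum()
    return 1.0 - (2.0 * inter + eps) / (denom + eps)

perfect = np.ones((4, 4))
print(soft_dice_loss(perfect, perfect))  # 0.0 for a perfect prediction
```

Unlike plain per-pixel cross-entropy, an overlap-based loss of this form is insensitive to class imbalance between small texture regions and large backgrounds, which is one reason such losses are popular for segmentation training.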
We demonstrate the workflow on UAV imagery for geological outcrop mapping (e.g. chalk, glacial till, vegetation) and discuss how provenance tracking, label consistency, and hybrid assistance support reproducible iteration in Earth-science segmentation projects. SegFlow will be made available under GPLv3.
How to cite: Torizin, J. and Schüßler, N.: SegFlow: an end-to-end workflow for texture-centric image segmentation, from texture-patch curation to hybrid assistance, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-19144, https://doi.org/10.5194/egusphere-egu26-19144, 2026.