EGU24-18759, updated on 11 Mar 2024
EGU General Assembly 2024
© Author(s) 2024. This work is distributed under
the Creative Commons Attribution 4.0 License.

Detection and identification of environmental faunal proxies in digital images and video footage from northern Norwegian fjords and coastal waters using deep learning object detection algorithms

Steffen Aagaard Sørensen1, Eirik Myrvoll-Nielsen2, Iver Martinsen2, Fred Godtliebsen2, Stamatia Galata3, Juho Junttila4, and Tone Vassdal4
  • 1Department of Geosciences, UiT The Arctic University of Norway, Norway
  • 2Department of Mathematics and Statistics, UiT The Arctic University of Norway, Norway
  • 3School of Biological and Environmental Sciences, Liverpool John Moores University, UK
  • 4Multiconsult Norge AS, Norway

The ICT+ project "Transforming ocean surveying by the power of DL and statistical methods", hosted by UiT The Arctic University of Norway, aims to employ machine learning techniques to improve and streamline methods currently used in ocean surveying by the project's private-sector partners, Multiconsult and Argeo. The tasks include detection and identification of µm-sized (e.g. foraminifera, microplastics) to m-sized (e.g. boulders, shipwrecks) objects and elements at and in the seabed, in data that are presently processed manually by skilled workers but that ideally could be wholly or partially processed using an automated approach.

Here we present preliminary work and results on the application of YOLO (You Only Look Once) algorithms to the detection and identification of meiofauna (foraminifera) in, and macrofauna (molluscs) at, the seabed. Both proxies are used in evaluating the environmental state of the seabed. YOLO is a real-time deep learning object detection algorithm that identifies and locates objects in images or video in a single pass through the neural network.
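Object detection algorithms such as YOLO localize objects with bounding boxes, and a predicted box is matched to a ground-truth box via its intersection-over-union (IoU). The function below is our own minimal sketch of this standard metric, not project code:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    # Corners of the intersection rectangle.
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    # Clamp to zero when the boxes do not overlap.
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

A detection is typically counted as correct when its IoU with a ground-truth box exceeds a chosen threshold (often 0.5).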

Presently, the year-on-year growth or shrinkage of protected mollusc banks in northern Norwegian fjords is evaluated manually from seabed video sequences captured annually by remotely operated vehicles. Preliminary results suggest that, with moderate training, the YOLO algorithm can identify the presence or absence of mollusc bank formations in such video sequences, thus supporting, and eventually minimizing, the task of inspecting the footage manually.
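Frame-level presence/absence screening of this kind could be sketched as follows; the `flag_presence` helper, the `mollusc_bank` label, and the thresholds are hypothetical illustrations, not the project's actual pipeline:

```python
def flag_presence(frames, detect, conf_threshold=0.5, min_hits=3):
    """Flag a video sequence as containing mollusc-bank formations if at least
    `min_hits` frames have a detection above the confidence threshold.

    `detect` is any per-frame detector returning a list of (label, confidence)
    pairs; requiring several positive frames damps spurious single-frame hits.
    """
    hits = 0
    for frame in frames:
        if any(label == "mollusc_bank" and conf >= conf_threshold
               for label, conf in detect(frame)):
            hits += 1
            if hits >= min_hits:
                return True
    return False
```

Sequences flagged positive could then be prioritized for manual inspection rather than replacing it outright.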

Foraminifera are abundant marine meiofauna living in the water column or at and in the seabed. They are utilized in research into both modern and past marine environments, as they have high turnover rates and their shells have high preservation potential. Foraminiferal shells accumulate in the sediments and, after sample processing, can be manually detected and identified under the microscope. This work is very labour-intensive, demands skilled expertise, and is subject to the errors and biases of the individual expert.

Preliminary results show that a YOLO network trained on ca. 4100 individuals (20 subgroups: benthic calcareous foraminifera (n=19) and planktic foraminifera (n=1)) in 346 images achieves model performances of up to 0.96 mAP (mean average precision) when trained, validated and tested on the training set. These promising results will next be tested on real-world samples. This testing is complicated by real-world samples containing many foraminiferal species/groups that were not trained upon, overlapping or closely set specimens, and non-foraminiferal material (e.g. sediment grains, other meiofauna or -flora). Thus, additional training focused on these complicating aspects will likely be necessary, and the most recent results will be presented.
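mAP averages, over classes, the average precision (AP) of confidence-ranked detections. The following is a simplified AP computation on toy data, our own sketch rather than the evaluation code used in the project:

```python
def average_precision(detections, n_ground_truth):
    """AP as the area under the precision-recall curve.

    `detections` is a list of (confidence, is_true_positive) pairs for one
    class; precision is accumulated at each recall step as detections are
    consumed in descending confidence order.
    """
    detections = sorted(detections, key=lambda d: d[0], reverse=True)
    tp = fp = 0
    ap = 0.0
    prev_recall = 0.0
    for _conf, is_tp in detections:
        if is_tp:
            tp += 1
        else:
            fp += 1
        precision = tp / (tp + fp)
        recall = tp / n_ground_truth
        ap += precision * (recall - prev_recall)  # rectangle under the PR curve
        prev_recall = recall
    return ap
```

mAP is then the mean of these per-class AP values, here taken over the 20 foraminiferal subgroups.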

How to cite: Aagaard Sørensen, S., Myrvoll-Nielsen, E., Martinsen, I., Godtliebsen, F., Galata, S., Junttila, J., and Vassdal, T.: Detection and identification of environmental faunal proxies in digital images and video footage from northern Norwegian fjords and coastal waters using deep learning object detection algorithms, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-18759, 2024.