EGU24-18868, updated on 11 Mar 2024
https://doi.org/10.5194/egusphere-egu24-18868
EGU General Assembly 2024
© Author(s) 2024. This work is distributed under
the Creative Commons Attribution 4.0 License.

A comparison of point and bounding box annotation methods to detect wild animals using remote sensing and deep learning

Zeyu Xu, Tiejun Wang, Andrew Skidmore, and Richard Lamprey
  • University of Twente, Faculty of Geo-Information Science and Earth Observation, Natural Resources, Enschede, Netherlands (z.xu-1@utwente.nl)

Point and bounding box annotations are the two most widely used annotation techniques for deep learning-based wild animal detection using remote sensing. However, the impact of these two annotation methods on model performance remains unknown. Here, using the publicly available Aerial Elephant Dataset, we evaluate the effect of the two annotation methods on the accuracy of two commonly used neural networks, YOLO and U-Net. The results show that with YOLO there is no statistically significant difference between the point-based and bounding box-based annotation methods, with overall F1-scores of 82.7% and 82.8%, respectively (df = 4, P = 0.683, t-test). With U-Net, however, bounding box-based annotation (overall F1-score of 82.7%) yields significantly higher accuracy than point-based annotation (overall F1-score of 80.0%) (df = 4, P < 0.001, t-test). Our study demonstrates that the effectiveness of the two annotation methods depends on the choice of deep learning model, and suggests that the deep learning method should be taken into account when selecting an annotation technique for animal detection in remote sensing imagery.
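To make the statistical comparison concrete, the sketch below (not the authors' code) illustrates how overall F1-scores from repeated training runs could be compared between the two annotation methods with a paired t-test, assuming five runs per method (consistent with df = 4). The F1 values and detection counts are illustrative placeholders, not the study's data.

```python
# A minimal illustrative sketch (not the authors' code): overall F1 is computed
# from detection counts, and per-run F1-scores of the two annotation methods are
# compared with a paired t-test, assuming five repeated runs (consistent with
# df = 4). All numbers below are placeholders, not the study's data.
import numpy as np
from scipy import stats


def overall_f1(tp: int, fp: int, fn: int) -> float:
    """Overall F1-score: harmonic mean of precision and recall over all test images."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0


# Example: 827 true positives, 173 false positives, 173 false negatives -> F1 ~ 0.827
print(f"example overall F1 = {overall_f1(827, 173, 173):.3f}")

# Hypothetical per-run overall F1-scores (one value per repeated run) for one
# detector, under point-based and bounding box-based annotation respectively.
f1_point = np.array([0.799, 0.801, 0.800, 0.798, 0.802])
f1_bbox = np.array([0.826, 0.828, 0.827, 0.825, 0.829])

# Paired t-test across the five runs (df = n - 1 = 4).
t_stat, p_value = stats.ttest_rel(f1_bbox, f1_point)
print(f"t = {t_stat:.3f}, df = {len(f1_bbox) - 1}, P = {p_value:.4f}")
```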

How to cite: Xu, Z., Wang, T., Skidmore, A., and Lamprey, R.: A comparison of point and bounding box annotation methods to detect wild animals using remote sensing and deep learning, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-18868, https://doi.org/10.5194/egusphere-egu24-18868, 2024.