Volume 22, issue 21
https://doi.org/10.5194/bg-22-6545-2025
Research article | Highlight paper | 06 Nov 2025

Automated mask generation in citizen science smartphone photos and their value for mapping plant species in drone imagery

Salim Soltani, Lauren E. Gillespie, Moises Exposito-Alonso, Olga Ferlian, Nico Eisenhauer, Hannes Feilhauer, and Teja Kattenborn

Related authors

Litter vs. Lens: Evaluating LAI from Litter Traps and Hemispherical Photos Across View Zenith Angles and Leaf Fall Phases
Simon Lotz, Teja Kattenborn, Julian Frey, Salim Soltani, Anna Göritz, Tom Jakszat, and Negin Katal
EGUsphere, https://doi.org/10.5194/egusphere-2025-1496, 2025

Co-editor-in-chief
Soltani et al. present an automated workflow for transforming weakly annotated citizen science plant photographs into robust training data for drone-based remote sensing of vegetation. Their approach demonstrates that plant images collected by non-specialists can be effectively leveraged to train machine learning models that improve the accuracy of drone-based species mapping.
Short summary
We introduce an automated approach for generating segmentation masks for citizen science plant photos, making them usable for training computer vision models. This framework effectively turns citizen science data into a rich source of training data for segmentation models for plant species identification in aerial imagery. Using the automatically labeled photos, we train segmentation models for mapping tree species in drone imagery, showcasing their potential for forestry, agriculture, and biodiversity monitoring.
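
To make the general idea concrete, the following is a minimal, hedged sketch (in Python) of how a foreground plant mask might be derived automatically from a citizen science smartphone photo using a simple colour heuristic (the Excess Green vegetation index). This is an illustration under assumptions, not the authors' actual method; the file name and threshold are hypothetical, and the published pipeline may rely on learned segmentation models rather than a colour index.

import numpy as np
from PIL import Image

def plant_mask_from_photo(path, threshold=0.05):
    """Return a boolean mask of pixels that are likely vegetation."""
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.float32) / 255.0
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    exg = 2.0 * g - r - b        # Excess Green index: high for green foliage
    return exg > threshold       # True where a pixel is probably plant material

# Hypothetical usage: the mask, paired with the species label attached to the
# citizen science observation, yields a weakly supervised training example for
# a segmentation model that can later be applied to drone imagery.
mask = plant_mask_from_photo("citizen_science_photo.jpg")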