Articles | Volume 22, issue 21
https://doi.org/10.5194/bg-22-6545-2025
Research article | Highlight paper | 06 Nov 2025

Automated mask generation in citizen science smartphone photos and their value for mapping plant species in drone imagery

Salim Soltani, Lauren E. Gillespie, Moises Exposito-Alonso, Olga Ferlian, Nico Eisenhauer, Hannes Feilhauer, and Teja Kattenborn

Viewed

Total article views: 1,623 (including HTML, PDF, and XML)
  • HTML: 1,362
  • PDF: 213
  • XML: 48
  • Total: 1,623
  • BibTeX: 36
  • EndNote: 53
Cumulative views and downloads (calculated since 24 Feb 2025)

Viewed (geographical distribution)

Total article views: 1,623 (including HTML, PDF, and XML). Of these, 1,623 have a defined geographic origin and 0 are of unknown origin.
Latest update: 27 Nov 2025
Co-editor-in-chief
Soltani et al. present an automated workflow for transforming weakly annotated citizen science plant photographs into robust training data for drone-based remote sensing of vegetation. Their approach demonstrates that plant images collected by non-specialists can be effectively leveraged in machine learning models to improve the accuracy of species mapping using drones.
Short summary
We introduce an automated approach for generating segmentation masks for citizen science plant photos, making them usable by computer vision models. This framework transforms citizen science data into a valuable training resource for segmentation models that identify plant species in aerial imagery. Using the automatically labeled photos, we train segmentation models to map tree species in drone imagery, showcasing their potential for forestry, agriculture, and biodiversity monitoring.