Biogeosciences, Volume 21, issue 11
Research article | 14 Jun 2024
https://doi.org/10.5194/bg-21-2909-2024

From simple labels to semantic image segmentation: leveraging citizen science plant photographs for tree species mapping in drone imagery

Salim Soltani, Olga Ferlian, Nico Eisenhauer, Hannes Feilhauer, and Teja Kattenborn

Related authors

AngleCam V2: Predicting leaf inclination angles across taxa from daytime and nighttime photos
Luis Kremer, Jan Pisek, Ronny Richter, Julian Frey, Daniel Lusk, Christiane Werner, Christian Wirth, and Teja Kattenborn
EGUsphere, https://doi.org/10.1101/2025.09.17.676742, 2025
This preprint is open for discussion and under review for Biogeosciences (BG).
Automated mask generation in citizen science smartphone photos and their value for mapping plant species in drone imagery
Salim Soltani, Lauren E. Gillespie, Moises Exposito-Alonso, Olga Ferlian, Nico Eisenhauer, Hannes Feilhauer, and Teja Kattenborn
Biogeosciences, 22, 6545–6561, https://doi.org/10.5194/bg-22-6545-2025, 2025
The ECOSENSE forest: A distributed sensor and data management system for real-time monitoring of ecosystem processes and stresses
Jasmin Tesch, Kathrin Kühnhammer, Delon Wagner, Andreas Christen, Carsten Dormann, Julian Frey, Rüdiger Grote, Teja Kattenborn, Markus Sulzer, Ulrike Wallrabe, Markus Weiler, Christiane Werner, Samaneh Baghbani, Julian Brzozon, Laura Maria Comella, Lea Dedden, Stefanie Dumberger, Yasmina Frey, Matthias Gassilloud, Timo Gerach, Anna Göritz, Simon Haberstroh, Johannes Klüppel, Luis Kremer, Jürgen Kreuzwieser, Hojin Lee, Joachim Maack, Julian Müller, Oswald Prucker, Sanam Kumari Rajak, Jürgen Rühe, Stefan J. Rupitsch, Helmer Schack-Kirchner, Christian Scharinger, Uttunga Shinde, Till Steinmann, Clara Stock, and Josef Strack
EGUsphere, https://doi.org/10.5194/egusphere-2025-4979, 2025
This preprint is open for discussion and under review for Geoscientific Instrumentation, Methods and Data Systems (GI).
Impact of stratiform liquid water clouds on vegetation albedo quantified by coupling an atmosphere and a vegetation radiative transfer model
Kevin Wolf, Evelyn Jäkel, André Ehrlich, Michael Schäfer, Hannes Feilhauer, Andreas Huth, Alexandra Weigelt, and Manfred Wendisch
Biogeosciences, 22, 2909–2933, https://doi.org/10.5194/bg-22-2909-2025, 2025
Litter vs. Lens: Evaluating LAI from Litter Traps and Hemispherical Photos Across View Zenith Angles and Leaf Fall Phases
Simon Lotz, Teja Kattenborn, Julian Frey, Salim Soltani, Anna Göritz, Tom Jakszat, and Negin Katal
EGUsphere, https://doi.org/10.5194/egusphere-2025-1496, 2025

Short summary
In this research, we developed a novel method that uses citizen science data as alternative training data for computer vision models to map plant species in unoccupied aerial vehicle (UAV) imagery. Citizen science plant photographs are used to train the models, which are then applied to UAV images. We tested the approach on UAV imagery of a test site with 10 different tree species and obtained accurate species maps. This research shows the potential of citizen science data to advance our ability to monitor plant species.
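
To make the general workflow concrete, the sketch below illustrates the idea in a heavily simplified form: an ImageNet-pretrained classifier is fine-tuned on citizen science plant photographs sorted into per-species folders and then applied in a sliding-window fashion to a UAV orthomosaic. This is not the authors' pipeline; the folder path, tile size, species count handling, and training settings are illustrative assumptions.

# Hedged sketch, not the published method: fine-tune a pretrained classifier
# on citizen-science plant photos (one folder per species), then score
# sliding-window tiles of a UAV orthomosaic. Paths and settings are placeholders.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms
from torch.utils.data import DataLoader

NUM_SPECIES = 10   # the test site in the study has 10 tree species
TILE = 224         # window size fed to the classifier (assumption)

# 1) Training data: citizen-science photos sorted into one folder per species.
train_tf = transforms.Compose([
    transforms.RandomResizedCrop(TILE),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
])
train_ds = datasets.ImageFolder("citizen_science_photos/", transform=train_tf)
train_dl = DataLoader(train_ds, batch_size=32, shuffle=True)

# 2) Model: pretrained backbone with a new species classification head.
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, NUM_SPECIES)
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for x, y in train_dl:
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()

# 3) Inference: slide the classifier over a UAV orthomosaic tensor
#    (C x H x W, already loaded and normalised) to get a coarse species map.
@torch.no_grad()
def predict_species_map(ortho: torch.Tensor, stride: int = TILE) -> torch.Tensor:
    model.eval()
    _, h, w = ortho.shape
    rows, cols = h // stride, w // stride
    out = torch.zeros(rows, cols, dtype=torch.long)
    for i in range(rows):
        for j in range(cols):
            window = ortho[:, i*stride:(i+1)*stride, j*stride:(j+1)*stride]
            out[i, j] = model(window.unsqueeze(0)).argmax(dim=1)
    return out

As the article title indicates, the actual study goes further by moving from such simple photo-level labels to semantic image segmentation, which yields per-pixel species maps rather than the coarse tile-level map produced by this sketch.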