Articles | Volume 22, issue 21
https://doi.org/10.5194/bg-22-6545-2025
© Author(s) 2025. This work is distributed under
the Creative Commons Attribution 4.0 License.
Automated mask generation in citizen science smartphone photos and their value for mapping plant species in drone imagery
Salim Soltani
CORRESPONDING AUTHOR
Chair of Sensor-based Geoinformatics (geosense), University of Freiburg, Freiburg, Germany
Remote Sensing Centre for Earth System Research (RSC4Earth), Leipzig University, Leipzig, Germany
Department of Plant Biology, Carnegie Science, Stanford, California, USA
Lauren E. Gillespie
Department of Plant Biology, Carnegie Science, Stanford, California, USA
Department of Integrative Biology, University of California, Berkeley, Berkeley, California, USA
Department of Computer Science, Stanford University, Stanford, California, USA
Moises Exposito-Alonso
Department of Plant Biology, Carnegie Science, Stanford, California, USA
Department of Integrative Biology, University of California, Berkeley, Berkeley, California, USA
Olga Ferlian
German Centre for Integrative Biodiversity Research (iDiv), Halle-Jena-Leipzig, Leipzig, Germany
Institute of Biology, Leipzig University, Leipzig, Germany
Nico Eisenhauer
German Centre for Integrative Biodiversity Research (iDiv), Halle-Jena-Leipzig, Leipzig, Germany
Institute of Biology, Leipzig University, Leipzig, Germany
Hannes Feilhauer
Remote Sensing Centre for Earth System Research (RSC4Earth), Leipzig University, Leipzig, Germany
German Centre for Integrative Biodiversity Research (iDiv), Halle-Jena-Leipzig, Leipzig, Germany
Helmholtz Centre for Environmental Research, Leipzig, Germany
Teja Kattenborn
Chair of Sensor-based Geoinformatics (geosense), University of Freiburg, Freiburg, Germany
Related authors
Simon Lotz, Teja Kattenborn, Julian Frey, Salim Soltani, Anna Göritz, Tom Jakszat, and Negin Katal
EGUsphere, https://doi.org/10.5194/egusphere-2025-1496, https://doi.org/10.5194/egusphere-2025-1496, 2025
Short summary
Digital hemispherical photography (DHP) is a valuable tool for monitoring leaf area index (LAI), a key factor in ecosystem productivity and climate interactions. We compared DHP with litter traps in a temperate forest and found that at a 20° view angle, both methods aligned well. We applied a calibration model to assess site variability, which significantly improved accuracy. Our findings enhance the reliability of ground-based LAI monitoring, supporting better ecosystem assessments.
Luis Kremer, Jan Pisek, Ronny Richter, Julian Frey, Daniel Lusk, Christiane Werner, Christian Wirth, and Teja Kattenborn
EGUsphere, https://doi.org/10.1101/2025.09.17.676742, https://doi.org/10.1101/2025.09.17.676742, 2025
This preprint is open for discussion and under review for Biogeosciences (BG).
Short summary
To adapt to changing environmental conditions, plants can adjust their leaf angles. We developed AngleCam V2, an AI method that estimates leaf inclination angles from photos taken during both day and night. Trained on thousands of images from about 200 species, it monitors daily changes in leaf angle, aligns with laser-scanning data, and detects systematic shifts under water limitation. AngleCam V2 provides an open-source tool for monitoring leaf angle dynamics over time, taxa, and environments.
Jasmin Tesch, Kathrin Kühnhammer, Delon Wagner, Andreas Christen, Carsten Dormann, Julian Frey, Rüdiger Grote, Teja Kattenborn, Markus Sulzer, Ulrike Wallrabe, Markus Weiler, Christiane Werner, Samaneh Baghbani, Julian Brzozon, Laura Maria Comella, Lea Dedden, Stefanie Dumberger, Yasmina Frey, Matthias Gassilloud, Timo Gerach, Anna Göritz, Simon Haberstroh, Johannes Klüppel, Luis Kremer, Jürgen Kreuzwieser, Hojin Lee, Joachim Maack, Julian Müller, Oswald Prucker, Sanam Kumari Rajak, Jürgen Rühe, Stefan J. Rupitsch, Helmer Schack-Kirchner, Christian Scharinger, Uttunga Shinde, Till Steinmann, Clara Stock, and Josef Strack
EGUsphere, https://doi.org/10.5194/egusphere-2025-4979, https://doi.org/10.5194/egusphere-2025-4979, 2025
This preprint is open for discussion and under review for Geoscientific Instrumentation, Methods and Data Systems (GI).
Short summary
In the ECOSENSE forest, we developed a robust infrastructure for distributed forest sensing. Reliable power supply, stable network connection, and smart data collection systems enable the operation of hundreds of sensors under challenging conditions. By detailing the infrastructure design and implementation, we provide a transferable blueprint for building complex monitoring sites that support high-resolution, long-term ecosystem observations.
Kevin Wolf, Evelyn Jäkel, André Ehrlich, Michael Schäfer, Hannes Feilhauer, Andreas Huth, Alexandra Weigelt, and Manfred Wendisch
Biogeosciences, 22, 2909–2933, https://doi.org/10.5194/bg-22-2909-2025, https://doi.org/10.5194/bg-22-2909-2025, 2025
Short summary
This paper reports an investigation of the influence of clouds on vegetation albedo using a coupled atmosphere–vegetation radiative transfer model. Both models are iteratively linked to simulate cloud–vegetation–radiation interactions over canopies more realistically. Solar, spectral, and broadband irradiances have been simulated under varying cloud conditions. The simulated irradiances were used to investigate the spectral and broadband effect of clouds on vegetation albedo.
Kevin Wolf, Evelyn Jäkel, André Ehrlich, Michael Schäfer, Hannes Feilhauer, Andreas Huth, and Manfred Wendisch
EGUsphere, https://doi.org/10.5194/egusphere-2025-2082, https://doi.org/10.5194/egusphere-2025-2082, 2025
Short summary
This paper presents combined atmosphere-vegetation radiative transfer simulations to systematically investigate cloud-induced biases in remotely sensed vegetation indices (VIs) derived from below-cloud measurements. The biases in VIs have been investigated for the general case of two-band VIs, and for the special cases of the normalized difference vegetation index (NDVI), the normalized difference water index (NDWI), and the enhanced vegetation index (EVI).
Eya Cherif, Teja Kattenborn, Luke A. Brown, Michael Ewald, Katja Berger, Phuong D. Dao, Tobias B. Hank, Etienne Laliberté, Bing Lu, and Hannes Feilhauer
EGUsphere, https://doi.org/10.5194/egusphere-2025-1284, https://doi.org/10.5194/egusphere-2025-1284, 2025
Short summary
Hyperspectral imagery combined with machine learning enables accurate large-scale mapping of plant traits but struggles with uncertainty when facing unfamiliar environmental conditions. This study introduces a distance-based method that measures dissimilarities between new and training data to reliably quantify uncertainty. Results show it effectively identifies uncertain predictions, greatly improving the reliability of global vegetation monitoring compared to traditional methods.
Simon Scheiter, Sophie Wolf, and Teja Kattenborn
Biogeosciences, 21, 4909–4926, https://doi.org/10.5194/bg-21-4909-2024, https://doi.org/10.5194/bg-21-4909-2024, 2024
Short summary
Biomes are widely used to map vegetation patterns at large spatial scales and to assess impacts of climate change, yet there is no consensus on a generally valid biome classification scheme. We used crowd-sourced species distribution data and trait data to assess whether trait information is suitable for delimiting biomes. Although the trait data were heterogeneous and had large gaps with respect to the spatial distribution, we found that a global trait-based biome classification was possible.
Salim Soltani, Olga Ferlian, Nico Eisenhauer, Hannes Feilhauer, and Teja Kattenborn
Biogeosciences, 21, 2909–2935, https://doi.org/10.5194/bg-21-2909-2024, https://doi.org/10.5194/bg-21-2909-2024, 2024
Short summary
In this research, we developed a novel method using citizen science data as alternative training data for computer vision models to map plant species in unoccupied aerial vehicle (UAV) images. We use citizen science plant photographs to train models and apply them to UAV images. We tested our approach on UAV images of a test site with 10 different tree species, yielding accurate results. This research shows the potential of citizen science data to advance our ability to monitor plant species.
Cited articles
Affouard, A., Goëau, H., Bonnet, P., Lombardo, J.-C., and Joly, A.: Pl@ntNet app in the era of deep learning, in: ICLR: International Conference on Learning Representations [data set], https://www.gbif.org/dataset/7a3679ef-5582-4aaa-81f0-8c2545cafc81 (last access: 10 February 2025), 2017. a
Bah, M. D., Hafiane, A., and Canals, R.: Deep Learning with Unsupervised Data Labeling for Weed Detection in Line Crops in UAV Images, Remote Sens., 10, 1690, https://doi.org/10.3390/rs10111690, 2018. a
Bayraktar, E., Basarkan, M. E., and Celebi, N.: A low-cost UAV framework towards ornamental plant detection and counting in the wild, ISPRS Journal of Photogrammetry and Remote Sensing, 167, 1–11, 2020. a
Bouguettaya, A., Zarzour, H., Kechida, A., and Taberkit, A. M.: Deep learning techniques to classify agricultural crops through UAV imagery: A review, Neural Computing and Applications, 34, 9511–9536, 2022. a
Brandt, M., Tucker, C. J., Kariryaa, A., Rasmussen, K., Abel, C., Small, J., Chave, J., Rasmussen, L. V., Hiernaux, P., Diouf, A. A., Kergoat, L., Mertz, O., Igel, C., Gieseke, F., Schöning, J., Li, S., Melocik, K., Meyer, J., Sinno, S., Romero, E., Glennie, E., Montagu, A., Dendoncker, M., and Fensholt, R.: An unexpectedly large count of trees in the West African Sahara and Sahel, Nature, 587, 78–82, 2020. a
Brodrick, P. G., Davies, A. B., and Asner, G. P.: Uncovering ecological patterns with convolutional neural networks, Trends in Ecology & Evolution, 34, 734–745, 2019. a
Curnick, D. J., Davies, A. J., Duncan, C., Freeman, R., Jacoby, D. M., Shelley, H. T., Rossi, C., Wearn, O. R., Williamson, M. J., and Pettorelli, N.: SmallSats: a new technological frontier in ecology and conservation?, Remote Sensing in Ecology and Conservation, https://doi.org/10.1002/rse2.239, 2021. a
Di Cecco, G. J., Barve, V., Belitz, M. W., Stucky, B. J., Guralnick, R. P., and Hurlbert, A. H.: Observing the observers: How participants contribute data to iNaturalist and implications for biodiversity science, BioScience, 71, 1179–1188, 2021. a
Fassnacht, F. E., Latifi, H., Stereńczak, K., Modzelewska, A., Lefsky, M., Waser, L. T., Straub, C., and Ghosh, A.: Review of studies on tree species classification from remotely sensed data, Remote Sensing of Environment, 186, 64–87, 2016. a
Ferlian, O., Cesarz, S., Craven, D., Hines, J., Barry, K. E., Bruelheide, H., Buscot, F., Haider, S., Heklau, H., Herrmann, S., Kühn, P., Pruschitzki, U., Schädler, M., Wagg, C., Weigelt, A., Wubet, T., and Eisenhauer, N.: Mycorrhiza in tree diversity–ecosystem function relationships: conceptual framework and experimental implementation, Ecosphere, 9, e02226, https://doi.org/10.1002/ecs2.2226, 2018. a, b
Galuszynski, N. C., Duker, R., Potts, A. J., and Kattenborn, T.: Automated mapping of Portulacaria afra canopies for restoration monitoring with convolutional neural networks and heterogeneous unmanned aerial vehicle imagery, PeerJ, 10, e14219, https://doi.org/10.7717/peerj.14219, 2022. a
GBIF: GBIF: the global biodiversity information facility [data set], https://www.gbif.org/occurrence/search?q=plantae&taxon_key=6 (last access: 10 February 2025), 2019. a
Gillespie, L. E., Ruffley, M., and Expósito-Alonso, M.: Deep Learning Models Map Rapid Plant Species Changes from Citizen Science and Remote Sensing Data, Proceedings of the National Academy of Sciences, https://doi.org/10.1073/pnas.2318296121, 2024. a, b
Hoeser, T. and Kuenzer, C.: Object detection and image segmentation with deep learning on earth observation data: A review-part i: Evolution and recent trends, Remote Sensing, 12, 1667, https://doi.org/10.3390/rs12101667, 2020. a
iNaturalist: iNaturalist Observations – Plantae, iNaturalist Website, https://www.inaturalist.org/observations?view=species&iconic_taxa=Plantae (last access: 10 February 2025), 2025. a
Illarionova, S., Shadrin, D., and Ignatiev, V.: A Survey of Computer Vision Techniques for Forest Characterization and Carbon Monitoring Tasks, Remote Sensing, https://doi.org/10.3390/rs14225861, 2022. a
Katal, N., Rzanny, M., Mäder, P., and Wäldchen, J.: Deep Learning in Plant Phenological Research: A Systematic Literature Review, Frontiers in Plant Science, https://doi.org/10.3389/fpls.2022.805738, 2022. a, b
Kattenborn, T. and Soltani, S.: CrowdVision2TreeSegment, Zenodo [data set], https://doi.org/10.5281/zenodo.10019552, 2023. a
Kattenborn, T., Leitloff, J., and Schiefer, F.: Review on Convolutional Neural Networks in Vegetation Remote Sensing, ISPRS Journal of Photogrammetry and Remote Sensing, https://doi.org/10.1016/j.isprsjprs.2020.12.010, 2021a. a, b, c
Kirillov, A., Mintun, E., Ravi, N., Mao, H., Rolland, C., Gustafson, L., Xiao, T., Whitehead, S., Berg, A. C., Lo, W.-Y., Dollár, P., and Girshick, R. B.: Segment Anything, arXiv [preprint], https://doi.org/10.48550/arXiv.2304.02643, 2023. a, b, c
Leitão, P. J., Schwieder, M., Pötzschner, F., Pinto, J. R. R., Teixeira, A. M. C., Pedroni, F., Sanchez, M., Rogass, C., van der Linden, S., Bustamante, M. M. C., and Hostert, P.: From sample to pixel: multi-scale remote sensing data for upscaling aboveground carbon data in heterogeneous landscapes, Ecosphere, 9, e02298, https://doi.org/10.1002/ecs2.2298, 2018. a
Li, J., Cai, Y., Li, Q., Kou, M., and Zhang, T.: A review of remote sensing image segmentation by deep learning methods, International Journal of Digital Earth, 17, 2328827, https://doi.org/10.1080/17538947.2024.2328827, 2024. a
Lopatin, J., Dolos, K., Kattenborn, T., and Fassnacht, F. E.: How canopy shadow affects invasive plant species classification in high spatial resolution remote sensing, Remote Sensing in Ecology and Conservation, 5, 302–317, 2019. a
Maes, W. H. and Steppe, K.: Perspectives for remote sensing with unmanned aerial vehicles in precision agriculture, Trends in Plant Science, 24, 152–164, 2019. a
Martins, V. S., Kaleita, A. L., Gelder, B. K., da Silveira, H. L., and Abe, C. A.: Exploring multiscale object-based convolutional neural network (multi-OCNN) for remote sensing image classification at high spatial resolution, ISPRS Journal of Photogrammetry and Remote Sensing, 168, 56–73, 2020. a
Maß, V. and Alirezazadeh, P.: Annotated Image Dataset with Different Stages of European Pear Rust for UAV-Based Automated Symptom Detection in Orchards, Data in Brief, https://doi.org/10.1016/j.dib.2025.111271, 2025. a, b
Möhring, J., Kattenborn, T., Mahecha, M. D., Cheng, Y., Schwenke, M. B., Cloutier, M., Denter, M., Frey, J., Gassilloud, M., Göritz, A., Hempel, J., Horion, S., Jucker, T., Junttila, S., Khatri-Chhetri, P., Korznikov, K., Kruse, S., Laliberté, E., Maroschek, M., Neumeier, P., Pérez-Priego, O., Potts, A., Schiefer, F., Seidl, R., Vajna-Jehle, J., Zielewska-Büttner, K., and Mosig, C.: Global, multi-scale standing deadwood segmentation in centimeter-scale aerial images, Authorea [preprint], https://doi.org/10.36227/techrxiv.174137781.13803217/v1, 2025. a
Mosig, C., Vajna-Jehle, J., Mahecha, M. D., Cheng, Y., Hartmann, H., Montero, D., Junttila, S., Horion, S., Schwenke, M. B., Adu-Bredu, S., Al-Halbouni, D., Allen, M., Altman, J., Angiolini, C., Astrup, R., Barrasso, C., Bartholomeus, H., Brede, B., Buras, A., Carrieri, E., Chirici, G., Cloutier, M., Cushman, K. C., Dalling, J. W., Dempewolf, J., Denter, M., Ecke, S., Eichel, J., Eltner, A., Fabi, M., Fassnacht, F., Ferreira, M. P., Frey, J., Frick, A., Ganz, S., Garbarino, M., García, M., Gassilloud, M., Ghasemi, M., Giannetti, F., Gonzalez, R., Gosper, C., Greinwald, K., Grieve, S., Aguirre-Gutierrez, J., Göritz, A., Hajek, P., Hedding, D., Hempel, J., Hernández, M., Heurich, M., Honkavaara, E., Jucker, T., Kalwij, J. M., Khatri-Chhetri, P., Klemmt, H.-J., Koivumäki, N., Korznikov, K., Kruse, S., Krüger, R., Laliberté, E., Langan, L., Latifi, H., Lehmann, J., Li, L., Lines, E., Lopatin, J., Lucieer, A., Ludwig, M., Ludwig, A., Lyytikäinen-Saarenmaa, P., Ma, Q., Marino, G., Maroschek, M., Meloni, F., Menzel, A., Meyer, H., Miraki, M., Moreno-Fernández, D., Müller-Landau, H. C., Mälicke, M., Möhring, J., Müllerova, J., Neumeier, P., Näsi, R., Oppgenoorth, L., Palmer, M., Paul, T., Potts, A., Prober, S., Puliti, S., Pérez-Priego, O., Reudenbach, C., Rossi, C., Rühr, N. K., Ruiz-Benito, P., Runge, C. M., Scherer-Lorenzen, M., Schiefer, F., Schladebach, J., Schmehl, M.-T., Schwarz, S., Seidl, R., Shafeian, E., de Simone, L., Sohrabi, H., Sotomayor, L., Sparrow, B., Steer, B. S. C., Stenson, M., Stöckigt, B., Su, Y., Suomalainen, J., Torresani, M., Umlauft, J., Vargas-Ramírez, N., Volpi, M., Vásquez, V., Weinstein, B., Tagle-Casapia, X., Zdunic, K., Zielewska-Büttner, K., de Oliveira, R. A., van Wagtendonk, L., von Dosky, V., and Kattenborn, T.: deadtrees. earth-An Open-Access and Interactive Database for Centimeter-Scale Aerial Imagery to Uncover Global Tree Mortality Dynamics, bioRxiv, 2024–10, https://doi.org/10.1101/2024.10.18.619094, 2024. a
Müllerová, J., Brundu, G., Große-Stoltenberg, A., Kattenborn, T., and Richardson, D. M.: Pattern to process, research to practice: remote sensing of plant invasions, Biological Invasions, https://doi.org/10.1007/s10530-023-03150-z, 2023. a
Plantnet: Stats – Pl@ntNet – Plant Identifier, Pl@ntNet Website, https://identify.plantnet.org/stats (last access: 10 February 2025), 2025. a
Ronneberger, O., Fischer, P., and Brox, T.: U-net: Convolutional networks for biomedical image segmentation, in: International Conference on Medical image computing and computer-assisted intervention, Springer, 234–241, https://doi.org/10.1007/978-3-319-24574-4_28, 2015. a, b
Schiefer, F., Kattenborn, T., Frick, A., Frey, J., Schall, P., Koch, B., and Schmidtlein, S.: Mapping forest tree species in high resolution UAV-based RGB-imagery by means of convolutional neural networks, ISPRS Journal of Photogrammetry and Remote Sensing, 170, 205–215, https://doi.org/10.1016/j.isprsjprs.2020.10.015, 2020a. a
Schiller, C., Schmidtlein, S., Boonman, C., Moreno-Martínez, A., and Kattenborn, T.: Deep learning and citizen science enable automated plant trait predictions from photographs, Scientific Reports, 11, 1–12, 2021. a
Selvaraju, R. R., Cogswell, M., Das, A., Vedantam, R., Parikh, D., and Batra, D.: Grad-CAM: Visual explanations from deep networks via gradient-based localization, in: Proceedings of the IEEE International Conference on Computer Vision, 618–626, https://doi.org/10.1109/ICCV.2017.74, 2017. a, b, c, d
Singh, K. K. and Surasinghe, T. D.: Systematic Review and Best Practices for Drone Remote Sensing of Invasive Plants, Methods in Ecology and Evolution, https://doi.org/10.1111/2041-210X.14330, 2024. a
Smith, L. N.: A disciplined approach to neural network hyper-parameters: Part 1 – learning rate, batch size, momentum, and weight decay, arXiv [preprint] arXiv:1803.09820, https://doi.org/10.48550/arXiv.1803.09820, 2018. a
Soltani, S.: salimsoltani28/Flora_Mask: Flora_Mask (Final_release), Zenodo [code], https://doi.org/10.5281/zenodo.17456239, 2025. a
Soltani, S., Feilhauer, H., Duker, R., and Kattenborn, T.: Transfer learning from citizen science photographs enables plant species identification in UAV imagery, ISPRS Open Journal of Photogrammetry and Remote Sensing, 100016, https://doi.org/10.1016/j.ophoto.2022.100016, 2022. a, b, c, d, e, f, g
Soltani, S., Ferlian, O., Eisenhauer, N., Feilhauer, H., and Kattenborn, T.: From simple labels to semantic image segmentation: leveraging citizen science plant photographs for tree species mapping in drone imagery, Biogeosciences, 21, 2909–2935, https://doi.org/10.5194/bg-21-2909-2024, 2024. a, b, c, d, e, f, g
Sun, Z., Wang, X., Wang, Z., Yang, L., Xie, Y., and Huang, Y.: UAVs as remote sensing platforms in plant ecology: review of applications and challenges, Journal of Plant Ecology, 14, 1003–1023, 2021. a
Tan, M. and Le, Q. V.: EfficientNetV2: Smaller Models and Faster Training, arXiv [preprint], https://doi.org/10.48550/arXiv.2104.00298, 2021. a
Van Horn, G., Mac Aodha, O., Song, Y., Cui, Y., Sun, C., Shepard, A., Adam, H., Perona, P., and Belongie, S.: The inaturalist species classification and detection dataset, in: Proceedings of the IEEE conference on computer vision and pattern recognition, 8769–8778, https://doi.org/10.1109/CVPR.2018.00914, 2018. a
Wagner, F. H.: The flowering of Atlantic Forest Pleroma trees, Scientific Reports, 11, 1–20, 2021. a
Co-editor-in-chief
Soltani et al. present an automated workflow for transforming weakly annotated citizen science plant photographs into robust training data for drone-based remote sensing of vegetation. Their approach demonstrates that plant images collected by non-specialists can be effectively leveraged in machine learning models to improve the accuracy of species mapping using drones.
Short summary
We introduce an automated approach for generating segmentation masks for citizen science plant photos, making them applicable to computer vision models. This framework effectively transforms citizen science data into a data treasure for segmentation models for plant species identification in aerial imagery. Using automatically labeled photos, we train segmentation models for mapping tree species in drone imagery, showcasing their potential for forestry, agriculture, and biodiversity monitoring.