Automatic flower detection and phenology monitoring using time-lapse cameras and deep learning |
|
Author: | Mann, Hjalte M. R.1,2; Iosifidis, Alexandros2; Jepsen, Jane U.3; |
Organizations: |
1Department of Ecoscience and Arctic Research Center, Aarhus University, C.F. Møllers Allé 8, 8000 Aarhus C, Denmark
2Department of Electrical and Computer Engineering – Signal Processing and Machine Learning, Aarhus University, Finlandsgade 22, 8200 Aarhus N, Denmark
3Department of Arctic Ecology, Fram Centre, Norwegian Institute for Nature Research, Hjalmar Johansens gt. 14, 9007 Tromsø, Norway
4Department of Ecology and Genetics, University of Oulu, Oulu, Finland
5University of the Arctic, Rovaniemi, Finland
6Department of Biological Sciences, University of Alaska, Anchorage, Alaska, USA
7Arctic Centre, University of Groningen, Groningen, the Netherlands |
Format: | article |
Version: | published version |
Access: | open |
Online Access: | PDF Full Text (PDF, 3.6 MB) |
Persistent link: | http://urn.fi/urn:nbn:fi-fe2022071451687 |
Language: | English |
Published: | John Wiley & Sons, 2022 |
Publish Date: | 2022-07-14 |
Description: |
Abstract: The advancement of spring is a widespread biological response to climate change observed across taxa and biomes. However, species-level responses to warming are complex and the underlying mechanisms are difficult to disentangle. This is partly due to a lack of data, which are typically collected by direct observation and are thus very time-consuming to obtain. Data deficiency is especially pronounced in the Arctic, where warming is particularly severe. We present a method for automated monitoring of the flowering phenology of specific plant species at very high temporal resolution through full growing seasons and across geographical regions. The method consists of image-based monitoring of field plots using near-surface time-lapse cameras and subsequent automated detection and counting of flowers in the images using a convolutional neural network. We demonstrate the feasibility of collecting flower phenology data with automatic time-lapse cameras and show that the temporal resolution of the results surpasses what can be collected by traditional observation methods. We focus on two Arctic mountain avens species, Dryas octopetala and Dryas integrifolia, in 20 image series from four sites. Our flower detection model proved capable of detecting flowers of the two species with a remarkable precision of 0.918 (adjusted to 0.966) and a recall of 0.907. Thus, the method can automatically quantify the seasonal dynamics of flower abundance at a fine scale and return reliable estimates of traditional phenological variables such as the timing of onset, peak, and end of flowering. We describe the system and compare manual and automatic extraction of flowering phenology data from the images. Our method can be applied directly to sites containing mountain avens using our trained model, or the model can be fine-tuned to other species. We discuss the potential of automatic image-based monitoring of flower phenology and how the method can be improved and expanded for future studies. |
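The abstract describes deriving traditional phenological variables (timing of onset, peak, and end of flowering) from the per-image flower counts produced by the detection model. A minimal sketch of that last step, assuming a simple threshold definition of flowering (the function name, threshold, and example counts are illustrative, not taken from the paper):

```python
# Hypothetical sketch: deriving onset, peak, and end of flowering
# from a daily flower-count series such as a detection model might
# produce. The threshold and example data are illustrative only.

def phenology_metrics(daily_counts, threshold=1):
    """Return (onset, peak, end) day indices from per-day flower counts.

    onset: first day the count reaches `threshold`
    peak:  day with the maximum count
    end:   last day the count reaches `threshold`
    Returns None if no day reaches the threshold.
    """
    flowering_days = [d for d, n in enumerate(daily_counts) if n >= threshold]
    if not flowering_days:
        return None
    onset, end = flowering_days[0], flowering_days[-1]
    peak = max(range(len(daily_counts)), key=lambda d: daily_counts[d])
    return onset, peak, end

# Example: one season of daily detected-flower counts
counts = [0, 0, 1, 4, 9, 14, 12, 7, 3, 1, 0, 0]
print(phenology_metrics(counts))  # (2, 5, 9)
```

In practice the high temporal resolution of time-lapse imagery would let such thresholds be applied to smoothed daily counts rather than raw per-image detections.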
Series: |
Remote sensing in ecology and conservation |
ISSN: | 2056-3485 |
ISSN-E: | 2056-3485 |
ISSN-L: | 2056-3485 |
Volume: | 8 |
Issue: | 6 |
Pages: | 765 - 777 |
DOI: | 10.1002/rse2.275 |
OADOI: | https://oadoi.org/10.1002/rse2.275 |
Type of Publication: |
A1 Journal article – refereed |
Field of Science: |
1181 Ecology, evolutionary biology
213 Electronic, automation and communications engineering, electronics |
Funding: |
TTH acknowledges funding from Independent Research Fund Denmark Grant 8021-00423B, and JUJ from the Fram Centre Terrestrial Flagship Program. Research at the Thule, Greenland site was supported by NSF grant 1836837 awarded to JMW, in addition to support from his UArctic Research Chairship, University of Oulu, Finland. |
Copyright information: |
© 2022 The Authors. Remote Sensing in Ecology and Conservation published by John Wiley & Sons Ltd on behalf of Zoological Society of London. This is an open access article under the terms of the Creative Commons Attribution-NonCommercial-NoDerivs License, which permits use and distribution in any medium, provided the original work is properly cited, the use is non-commercial and no modifications or adaptations are made. |
https://creativecommons.org/licenses/by-nc-nd/4.0/ |