
Early and on-ground image-based detection of poppy (Papaver rhoeas) in wheat using YOLO architectures

Published online by Cambridge University Press:  15 December 2022

Fernando J. Pérez-Porras
Affiliation:
Postdoctoral Researcher, Department of Graphic and Geomatics Engineering of the University of Cordoba, Cordoba, Spain
Jorge Torres-Sánchez
Affiliation:
Postdoctoral Researcher, imaPing Group, Plant Protection Department, Institute for Sustainable Agriculture-CSIC-Córdoba, Spain
Francisca López-Granados
Affiliation:
Research Scientist, imaPing Group, Plant Protection Department, Institute for Sustainable Agriculture-CSIC-Córdoba, Spain
Francisco J. Mesas-Carrascosa*
Affiliation:
Full Professor, Department of Graphic and Geomatics Engineering of the University of Cordoba, Cordoba, Spain
*
Author for correspondence: Francisco J. Mesas-Carrascosa, Campus Rabanales, Department of Graphic and Geomatics Engineering of the University of Cordoba, Building C5, N-IV km. 396, 14071 Cordoba, Spain. Email: ig2mecaf@uco.es

Abstract

Poppy (also common poppy or corn poppy; Papaver rhoeas L., PAPRH) is one of the most harmful weeds in winter cereals. Knowing the precise and accurate location of weeds is essential for developing effective site-specific weed management (SSWM) for optimized herbicide use. Among the available tools for weed mapping, deep learning (DL) is used for its accuracy and ability to work in complex scenarios. Crops represent intricate situations for weed detection, as crop residues, occlusion of weeds, or spectral similarities between crop and weed seedlings are frequent. Timely discrimination of weeds is needed, because postemergence herbicides are used just when weeds and crops are at an early growth stage. This study addressed P. rhoeas early detection in wheat (Triticum spp.) by comparing the performance of six DL-based object-detection models focused on the “You Only Look Once” (YOLO) architecture (v3 to v5) using proximal RGB images to train the models. The models were assessed using open-source software, and evaluation offered a range of results for quality of recognition of P. rhoeas as well as computational capacity during the inference process. Of all the models, YOLOv5s performed best in the testing phase (75.3%, 76.2%, and 77% for F1-score, mean average precision, and accuracy, respectively). These results indicated that under real field conditions, DL-based object-detection strategies can identify P. rhoeas at an early stage, providing accurate information for developing SSWM.
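The abstract reports detection quality with F1-score, mean average precision, and accuracy. As a minimal illustration of how the F1-score relates to detection counts, the sketch below computes precision, recall, and F1 from true-positive, false-positive, and false-negative counts; the counts used are hypothetical examples, not values from the study.

```python
def detection_metrics(tp, fp, fn):
    """Precision, recall, and F1-score from true-positive (tp),
    false-positive (fp), and false-negative (fn) detection counts."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return precision, recall, f1

# Hypothetical counts, for illustration only:
p, r, f1 = detection_metrics(tp=80, fp=20, fn=25)
print(round(p, 3), round(r, 3), round(f1, 3))  # → 0.8 0.762 0.78
```

Mean average precision extends this idea by averaging precision over recall levels (and, for object detection, over intersection-over-union thresholds), which is why it is reported alongside F1 in the testing results.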

Information

Type
Research Article
Creative Commons
CC BY
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2022. Published by Cambridge University Press on behalf of the Weed Science Society of America

Figure 1. Flowchart used for Papaver rhoeas detection.


Figure 2. Example of Papaver rhoeas in wheat, both at growth stages 12 to 14 of the BBCH scale. The upper right image includes a Euro coin as a reference to the plant’s small size.


Table 1. Definition of evaluation parameters.


Table 2. Optimized hyperparameters by “You Only Look Once” (YOLO) version.a


Table 3. Quality metrics (%) obtained in testing by “You Only Look Once” (YOLO) architectures.


Table 4. Comparative use of computing resources between “You Only Look Once” (YOLO) architectures.


Figure 3. Examples of Papaver rhoeas detection in wheat using six YOLO models: (A) v3, (B) v4-CSP, (C) v4-P5, (D) v5l, (E) v5m, and (F) v5s. Different situations regarding P. rhoeas are also shown: (1) isolated, (2) next to wheat plants, and (3) next to crop residues.