
New directions in weed management and research using 3D imaging

Published online by Cambridge University Press:  14 October 2022

April M. Dobbs
Affiliation:
Graduate Student, Department of Crop and Soil Sciences, North Carolina State University, Raleigh, NC, USA
Daniel Ginn
Affiliation:
Postdoctoral Research Scholar, Department of Soil and Crop Sciences, Texas A&M University, College Station, TX, USA
Søren Kelstrup Skovsen
Affiliation:
Researcher, Department of Electrical and Computer Engineering, Aarhus University, Aarhus, Denmark
Muthukumar V. Bagavathiannan
Affiliation:
Associate Professor, Department of Soil and Crop Sciences, Texas A&M University, College Station, TX, USA
Steven B. Mirsky
Affiliation:
Research Ecologist, Sustainable Agricultural Systems Laboratory, USDA-ARS, Beltsville, MD, USA
Chris S. Reberg-Horton
Affiliation:
Professor and University Faculty Scholar, Department of Crop and Soil Sciences and Center for Environmental Farming Systems, North Carolina State University, Raleigh, NC, USA
Ramon G. Leon*
Affiliation:
Professor and University Faculty Scholar, Department of Crop and Soil Sciences, Center for Environmental Farming Systems, and Genetic Engineering and Society Center, North Carolina State University, Raleigh, NC, USA
*
Author for correspondence: Ramon G. Leon, 4402C Williams Hall, Raleigh, NC 27695. (Email: rleon@ncsu.edu)

Abstract

Recent innovations in 3D imaging technology have created unprecedented potential for better understanding weed responses to management tactics. Although traditional 2D imaging methods for mapping weed populations can be limited in the field by factors such as shadows and tissue overlap, 3D imaging mitigates these challenges by using depth data to create accurate plant models. Three-dimensional imaging can be used to generate spatiotemporal maps of weed populations in the field and target weeds for site-specific weed management, including automated precision weed control. This technology will also help growers monitor cover crop performance for weed suppression and detect late-season weed escapes for timely control, thereby reducing seedbank persistence and slowing the evolution of herbicide resistance. In addition to its many applications in weed management, 3D imaging offers weed researchers new tools for understanding spatial and temporal heterogeneity in weed responses to integrated weed management tactics, including weed–crop competition and weed community dynamics. This technology will provide simple and low-cost tools for growers and researchers alike to better understand weed responses in diverse agronomic contexts, which will aid in reducing herbicide use, mitigating herbicide-resistance evolution, and improving environmental health.

Information

Type
Review
Creative Commons
Creative Commons License - CC BY
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2022. Published by Cambridge University Press on behalf of the Weed Science Society of America

Table 1. Comparison of photogrammetric techniques and light detection and ranging (LIDAR) for 3D imaging.


Figure 1. Use of images taken from different angles to create a 3D reconstruction in structure-from-motion (SfM; top) vs. stereo-vision photogrammetry (bottom).
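The stereo-vision approach in Figure 1 recovers depth from the pixel offset (disparity) between corresponding points in two camera views. As a minimal sketch of that triangulation step, assuming a rectified camera pair (the focal length and baseline values below are hypothetical, not from the article):

```python
def depth_from_disparity(focal_length_px: float,
                         baseline_m: float,
                         disparity_px: float) -> float:
    """Triangulate depth for one matched point pair: Z = f * B / d.

    focal_length_px : camera focal length, in pixels
    baseline_m      : distance between the two camera centers, in meters
    disparity_px    : horizontal pixel offset between the matched points
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px


# Example: 700 px focal length, 12 cm baseline, 35 px disparity
# gives a point 2.4 m from the cameras.
z = depth_from_disparity(700.0, 0.12, 35.0)
```

Structure-from-motion applies the same triangulation principle, but estimates the camera poses themselves from many overlapping images rather than relying on a fixed, calibrated baseline.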


Figure 2. Red, green, and blue (RGB) image of soybeans and weeds (A) and corresponding 3D point cloud reconstruction (B). Lower panels show point cloud reconstructions from different angles, including a top view (C), top view offset 45° from vertical (D), front view (E), under canopy and offset 45° (F), directly under canopy (G), facing canopy from behind (H), facing canopy offset 45° right (I), side view (J), and facing canopy offset 45° left (K).


Figure 3. Data pipeline for calculating canopy height and estimating biomass in the field using red, green, and blue (RGB) images and depth data.
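A common way to derive canopy height from such depth data is to treat a low percentile of the point elevations as the ground level and a high percentile as the canopy top, which damps the influence of stray points. This sketch is a hypothetical simplification of the pipeline in Figure 3, not the authors' implementation; the percentile cutoffs are illustrative assumptions:

```python
import numpy as np

def canopy_height(points: np.ndarray,
                  ground_percentile: float = 5.0,
                  top_percentile: float = 95.0) -> float:
    """Estimate canopy height (m) from an N x 3 point cloud (x, y, z).

    Ground level is approximated by a low percentile of the z values
    and the canopy top by a high percentile, so isolated outlier
    points do not dominate the estimate.
    """
    z = points[:, 2]
    ground = np.percentile(z, ground_percentile)
    top = np.percentile(z, top_percentile)
    return float(top - ground)
```

Biomass can then be estimated from such height maps (for example, by regressing measured biomass against canopy height and cover), although any such calibration is crop- and site-specific.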


Figure 4. Three-dimensional point cloud reconstructions of soybean (A, top view; B, front view) and cereal rye (Secale cereale L.) (C, top view; D, front view). Note the voids in the soybean point cloud (B) caused by dense canopy cover. Such voids are largely absent in cereal rye (D) due to a more even canopy with greater light penetration.