
A robust simultaneous localization and mapping method for agricultural environments: integrating spatial modeling and noisy estimation

Published online by Cambridge University Press:  22 April 2026

Ziyang Wang
Affiliation:
State Key Laboratory of Precision Manufacturing for Extreme Service Performance, College of Mechanical and Electrical Engineering, Central South University, Changsha, China
Wenfu Tong
Affiliation:
State Key Laboratory of Precision Manufacturing for Extreme Service Performance, College of Mechanical and Electrical Engineering, Central South University, Changsha, China
Haibo Zhou*
Affiliation:
State Key Laboratory of Precision Manufacturing for Extreme Service Performance, College of Mechanical and Electrical Engineering, Central South University, Changsha, China
Mingjun Li
Affiliation:
Guangdong Institute of Modern Agricultural Equipment, Guangzhou, China
Xiaozhuo Wang
Affiliation:
Xi’an Institute of Electromechanical Information Technology, Xi'an, China
Biyao Cheng
Affiliation:
Xi’an Institute of Electromechanical Information Technology, Xi'an, China
Corresponding author: Haibo Zhou; Email: zhouhaibo@csu.edu.cn

Abstract

To address the challenges posed by highly repetitive structures, unstructured terrain, and pronounced sensor disturbances in agricultural environments, this paper proposes a simultaneous localization and mapping framework that integrates explicit spatial feature modeling with time-varying noise estimation. First, a curved-voxel-based intensity–geometry joint probability model is constructed to transform conventional point features into spatial features capable of effectively capturing local structural patterns. These features are then aligned by minimizing the joint probability distance using the normal distributions transform, thereby enhancing feature discriminability and registration stability in highly repetitive scenes. Second, a maximum a posteriori-based recursive noise estimator is designed. By employing a dual sub-filter architecture, the proposed estimator enables joint modeling and online optimization of the noise parameters associated with LiDAR odometry and the inertial measurement unit, thereby improving the system’s adaptability to terrain-induced perturbations and sensor uncertainties. Experimental results demonstrate that the proposed method achieves a mean relative translation error of 1.42% and a mean relative rotation error of 1.53°/100 m in agricultural scenarios. In addition, the average error in row spacing estimation derived from the reconstructed map is constrained within 0.071 m. Compared with baseline methods, the proposed system exhibits significant advantages in both pose estimation accuracy and agronomic structural perception capability.
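The time-varying noise estimation described above updates sensor noise statistics online rather than fixing them a priori. As a rough illustration of that idea (not the paper's dual sub-filter estimator), the sketch below shows a generic Sage–Husa-style recursive measurement-noise update inside a one-dimensional Kalman filter: the innovation sequence drives a fading-memory estimate of the measurement-noise variance, so the filter adapts as disturbance levels change. All names and parameters here are illustrative assumptions.

```python
import numpy as np

def adaptive_kf(zs, q0=1e-3, r0=1e-1, b=0.97):
    """1-D Kalman filter with a Sage-Husa-style recursive (MAP-inspired)
    estimate of the measurement-noise variance r.

    Illustrative sketch only -- not the paper's dual sub-filter design.
    zs : 1-D array of scalar measurements of a (roughly) constant state.
    b  : forgetting factor; smaller b weights recent innovations more.
    """
    x, p = 0.0, 1.0          # state estimate and its variance
    q, r = q0, r0            # process / measurement noise variances
    xs = []
    for k, z in enumerate(zs):
        # fading-memory weight for the k-th innovation
        d = (1.0 - b) / (1.0 - b ** (k + 1))
        # predict (identity motion model)
        x_pred, p_pred = x, p + q
        # innovation
        e = z - x_pred
        # recursive measurement-noise update, clamped to stay positive
        r = max((1.0 - d) * r + d * (e * e - p_pred), 1e-9)
        # standard Kalman update with the adapted r
        kgain = p_pred / (p_pred + r)
        x = x_pred + kgain * e
        p = (1.0 - kgain) * p_pred
        xs.append(x)
    return np.array(xs), r
```

On a noisy constant signal, the filtered state converges to the true value while `r` tracks the measurement-noise level, which is the core behavior a recursive noise estimator provides to a LiDAR–IMU fusion pipeline.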

Information

Type
Research Article
Copyright
© The Author(s), 2026. Published by Cambridge University Press
Figure 1. System overview.

Figure 2. Schematic of spatial feature modeling.

Figure 3. Experimental platform.

Table I. Relative pose errors on the KITTI dataset.

Table II. Relative pose errors on the NCLT dataset.

Figure 4. Mapping and trajectory results on the KITTI dataset. (a–d) Mapping results on Seq. 01, 03, 06, and 10; (e–h) corresponding trajectory results.

Figure 5. Mapping and trajectory results on the NCLT dataset. (a–c) Mapping results on Seq. N1, N2, and N3; (d–f) corresponding trajectory results.

Figure 6. Error statistics. (a) Translation error; (b) rotation error.

Figure 7. Simulated and real agricultural scenes. (a) Simulated agricultural scene; (b) real agricultural scene.

Table III. Relative pose errors on the agricultural scenes.

Figure 8. Mapping and trajectory results on the simulated agricultural scene. (a) Mapping result; (b) trajectory result.

Figure 9. Mapping and trajectory results on the real agricultural scene. (a) Mapping result; (b) trajectory result.

Table IV. Row-spacing estimation results on the agricultural scenes.

Figure 10. Row-spacing estimation in agricultural scenes.

Figure 11. Statistical distribution of row-spacing estimation error.