
Stereo-Image Matching Using a Speeded Up Robust Feature Algorithm in an Integrated Vision Navigation System

Published online by Cambridge University Press:  12 June 2012

Chun Liu*
Affiliation:
(College of Surveying and Geo-Informatics, Tongji University, Shanghai, China) (Key Laboratory of Advanced Engineering Surveying of NASMG, Shanghai, China)
Fagen Zhou
Affiliation:
(College of Surveying and Geo-Informatics, Tongji University, Shanghai, China)
Yiwei Sun
Affiliation:
(Institute of Remote Sensing Application, Chinese Academy of Science, Beijing, China)
Kaichang Di
Affiliation:
(Institute of Remote Sensing Application, Chinese Academy of Science, Beijing, China)
Zhaoqin Liu
Affiliation:
(Institute of Remote Sensing Application, Chinese Academy of Science, Beijing, China)

Abstract

Visual navigation is a comparatively advanced technique for positioning without a Global Positioning System (GPS). It obtains environmental information by processing, in real time, data acquired through visual sensors. Compared with other methods, visual navigation is passive: it emits no light or other radiation, making the platform easier to conceal. The navigation system described in this paper combines stereo-matching with an Inertial Measurement Unit (IMU). Applying photogrammetric theory and a matching algorithm, it identifies corresponding points in two images of the same scene taken from different views and computes their 3D coordinates. Integrated with the orientation information output by the IMU, the system reduces accumulated model errors and improves point accuracy.
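The triangulation step the abstract describes, recovering a point's 3D coordinates from its matched positions in two views, can be sketched with standard linear (DLT) triangulation. This is a minimal illustration, not the authors' implementation: the intrinsic matrix `K`, the 0.5 m baseline, and the test point are all invented for the example, and the matched pixel coordinates stand in for what the SURF matcher would supply.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one point seen in two views.
    P1, P2: 3x4 projection matrices; x1, x2: pixel coordinates (u, v)."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The homogeneous solution is the right singular vector with the
    # smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Illustrative calibrated stereo rig: identical intrinsics K,
# right camera offset 0.5 m along the x-axis (the stereo baseline).
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])

X_true = np.array([0.2, -0.1, 4.0])          # ground-truth 3D point
x1 = P1 @ np.append(X_true, 1); x1 = x1[:2] / x1[2]
x2 = P2 @ np.append(X_true, 1); x2 = x2[:2] / x2[2]

X_est = triangulate(P1, P2, x1, x2)
print(np.round(X_est, 6))                    # recovers the 3D point
```

With noise-free correspondences the DLT solution reproduces the point exactly; with real SURF matches, the residual of the least-squares system gives a per-point quality measure that can feed the outlier rejection stage.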

Information

Type
Research Article
Copyright
Copyright © The Royal Institute of Navigation 2012
Figure 1. Pinhole projection model.

Figure 2. Camera calibration result.

Figure 3. Flow chart of SURF.

Figure 4. SURF-matching results.

Figure 5. Flow chart of the improved SURF algorithm.

Figure 6. Matching results of the improved SURF.

Table 1. Comparison between the efficiencies of the original and improved SURF.

Figure 7. Image sequence-based coordinate measurement.

Figure 8. Exterior parameters of the image sequence.

Table 2. Coordinates on the left image.

Table 3. Coordinates on the right image.

Table 4. Retrieved posture results.

Figure 9. Spatial relationship between IMU and the camera.

Figure 10. Visual odometry/IMU integration Kalman filter.

Figure 11. Navigation platform in an experiment.

Figure 12. Trajectories obtained from the different solutions.

Table 5. Solution errors compared to GPS.

Figure 13. Position innovation of Kalman filtering.

Figure 14. Attitude innovation of Kalman filtering.