Homing with stereovision

Published online by Cambridge University Press: 28 May 2015

Paramesh Nirmal
Affiliation: Robotics and Computer Vision Lab, Fordham University, Bronx, NY, USA

Damian M. Lyons*
Affiliation: Robotics and Computer Vision Lab, Fordham University, Bronx, NY, USA
*Corresponding author. E-mail: dlyons@fordham.edu

Summary

Visual homing is a navigation method in which a stored image of a goal location is compared with the current image to determine how to navigate to that goal. Insects such as ants and bees are theorized to employ visual homing to return to their nest or hive, and this has inspired several researchers to develop elegant robot visual homing algorithms. Depth information, whether recovered from visual scale or from another modality such as laser ranging, can improve the quality of homing. While insects are not well equipped for stereovision, stereovision is an effective robot sensor. We describe the challenges involved in using stereovision-derived depth in visual homing and our proposed solutions. Our algorithm, Homing with Stereovision (HSV), uses a stereo camera mounted on a pan-tilt unit to build composite wide-field stereo images and to estimate the distance and orientation from the robot to the goal location. HSV is evaluated in a set of 200 indoor trials using two Pioneer 3-AT robots, showing that it effectively leverages stereo depth information when compared with a depth-from-scale approach.
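
To make the general idea concrete, the sketch below (Python, OpenCV, NumPy) illustrates a single stereo-depth homing step: features are matched between the stored goal image and the current image, each match is converted to a planar range-and-bearing landmark position using stereo depth, and the per-feature displacements are combined into one home vector. This is not the authors' HSV implementation; it omits the pan-tilt composite wide-field images entirely, and the camera parameters (FOCAL_PX, BASELINE_M, CX), the ORB matching, the simple averaging step, and the function names stereo_depth and homing_step are all illustrative assumptions.

# Illustrative sketch of one stereo-depth homing step. NOT the paper's HSV
# implementation: the camera parameters, ORB matching, and averaging of
# per-feature displacement vectors are assumptions made for illustration.
import cv2
import numpy as np

FOCAL_PX = 600.0    # assumed focal length in pixels
BASELINE_M = 0.12   # assumed stereo baseline in metres
CX = 320.0          # assumed principal point (x) in pixels

def stereo_depth(left_gray, right_gray):
    """Dense depth map (metres) from a rectified grey-scale stereo pair."""
    stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = stereo.compute(left_gray, right_gray).astype(np.float32) / 16.0
    depth = np.zeros_like(disparity)
    valid = disparity > 0
    depth[valid] = FOCAL_PX * BASELINE_M / disparity[valid]
    return depth

def homing_step(goal_gray, goal_depth, cur_gray, cur_depth):
    """Return (turn_angle_rad, distance_m) from the current pose toward the goal."""
    orb = cv2.ORB_create(nfeatures=500)
    kp_g, des_g = orb.detectAndCompute(goal_gray, None)
    kp_c, des_c = orb.detectAndCompute(cur_gray, None)
    if des_g is None or des_c is None:
        return 0.0, 0.0

    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des_g, des_c)

    displacements = []
    for m in matches:
        xg, yg = kp_g[m.queryIdx].pt
        xc, yc = kp_c[m.trainIdx].pt
        rg = goal_depth[int(yg), int(xg)]   # stereo range at the goal snapshot
        rc = cur_depth[int(yc), int(xc)]    # stereo range in the current view
        if rg <= 0 or rc <= 0:
            continue                        # skip features with no valid depth
        # Horizontal bearing of the feature in each view (pinhole model).
        bg = np.arctan2(xg - CX, FOCAL_PX)
        bc = np.arctan2(xc - CX, FOCAL_PX)
        # Planar landmark position in the robot frame (x forward, y rightward).
        pg = np.array([rg * np.cos(bg), rg * np.sin(bg)])
        pc = np.array([rc * np.cos(bc), rc * np.sin(bc)])
        # For a fixed landmark, pc - pg approximates the vector from the
        # current robot pose to the goal pose (robot rotation is ignored here).
        displacements.append(pc - pg)

    if not displacements:
        return 0.0, 0.0
    home = np.mean(displacements, axis=0)
    return float(np.arctan2(home[1], home[0])), float(np.linalg.norm(home))

In such a sketch, a robot would call homing_step repeatedly, turning by the returned angle and advancing part of the returned distance, until the home vector shrinks below a threshold.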

Type
Articles
Information
Robotica, Volume 34, Issue 12, December 2016, pp. 2741-2758

Copyright
© Cambridge University Press 2015

