
Occupancy-elevation grid: an alternative approach for robotic mapping and navigation

Published online by Cambridge University Press:  15 April 2015

Anderson Souza*
Affiliation:
Department of Informatics, State University of Rio Grande do Norte, Natal, Rio Grande do Norte, Brazil
Luiz M. G. Gonçalves
Affiliation:
Department of Computer Engineering and Automation, Federal University of Rio Grande do Norte, Natal, Rio Grande do Norte, Brazil. E-mail: lmarcos@natalnet.br
*Corresponding author. E-mail: andersonabner@uern.br

Summary


This paper proposes an alternative environment mapping method for accurate robotic navigation based on 3D information. Typical techniques for 3D mapping with occupancy grids require intensive computational workloads to both build and store the map. This work introduces an Occupancy-Elevation Grid (OEG) mapping technique, a discrete mapping approach in which each cell stores the occupancy probability, the height of the terrain and the variance of that height. This representation allows a mobile robot to know, with a quantified degree of certainty, whether a place in the environment is occupied by an obstacle and how tall that obstacle is. Thus, based on its hardware characteristics, the robot can decide whether that specific place is traversable. In general, the map representation introduced can be used in conjunction with any kind of distance sensor. In this work, we use laser range data and stereo data with a probabilistic treatment. The resulting maps support tasks such as decision making for autonomous navigation, exploration, localization and path planning, taking into account both the existence and the height of obstacles. Experiments carried out with real data demonstrate that the proposed approach yields useful maps for autonomous navigation.
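
Although only the summary is reproduced here, a minimal sketch may help illustrate the cell structure it describes: each grid cell carries an occupancy estimate (kept in log-odds form below), a terrain/obstacle height, and the variance of that height, and a traversability test compares the stored height against the robot's ground clearance. The class and function names, the inverse-sensor-model constants and the simple one-dimensional Kalman-style height fusion are illustrative assumptions for this sketch, not the authors' implementation.

import math
from dataclasses import dataclass

@dataclass
class OEGCell:
    """One occupancy-elevation grid cell (sketch): occupancy in log-odds,
    plus a terrain/obstacle height estimate and its variance."""
    log_odds: float = 0.0      # occupancy belief in log-odds form
    height: float = 0.0        # estimated height of the cell's content (m)
    height_var: float = 1.0    # variance of the height estimate (m^2)
    initialized: bool = False

    @property
    def occupancy(self) -> float:
        # Convert log-odds back to a probability in [0, 1].
        return 1.0 - 1.0 / (1.0 + math.exp(self.log_odds))

    def update(self, hit: bool, z_height: float, z_var: float) -> None:
        # Occupancy: standard log-odds Bayes update with a hypothetical
        # inverse sensor model (constants chosen for illustration only).
        self.log_odds += 0.85 if hit else -0.40
        # Height: fuse the new reading with the stored estimate using a
        # one-dimensional Kalman-style update.
        if not self.initialized:
            self.height, self.height_var = z_height, z_var
            self.initialized = True
        else:
            k = self.height_var / (self.height_var + z_var)  # Kalman gain
            self.height += k * (z_height - self.height)
            self.height_var *= (1.0 - k)

def traversable(cell: OEGCell, robot_clearance: float,
                occ_threshold: float = 0.7) -> bool:
    # A cell is traversable if it is probably free, or if whatever occupies
    # it is low enough for the robot's hardware to pass over.
    return cell.occupancy < occ_threshold or cell.height <= robot_clearance

# Usage: a cell repeatedly observed as occupied at roughly 0.12 m becomes
# non-traversable for a robot with only 0.05 m of ground clearance.
cell = OEGCell()
for z in (0.11, 0.13, 0.12):
    cell.update(hit=True, z_height=z, z_var=0.01)
print(round(cell.occupancy, 2), round(cell.height, 3),
      traversable(cell, robot_clearance=0.05))
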

Type
Articles
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/3.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.
Copyright
Copyright © Cambridge University Press 2015
