Active camera stabilization to enhance the vision of agile legged robots

  • Stéphane Bazeille, Jesus Ortiz, Francesco Rovida, Marco Camurri, Anis Meguenani, Darwin G. Caldwell and Claudio Semini

Legged robots have the potential to navigate more challenging terrain than wheeled robots. Unfortunately, their control is more demanding, because they have to deal with the common tasks of mapping and path planning as well as issues specific to legged locomotion, such as balancing and foothold planning. In this paper, we present the development and integration of a stabilized vision system on the fully torque-controlled, hydraulically actuated quadruped robot HyQ. The active head added to the robot consists of a fast pan-and-tilt unit (PTU) and a high-resolution wide-angle stereo camera. The PTU enables the camera gaze to be shifted toward a specific area of the environment (both to extend and to refine the map) or to track an object while navigating. Moreover, since quadruped locomotion induces strong regular vibrations, as well as impacts and slippage on rough terrain, we took advantage of the PTU to mechanically compensate for the robot's motion. We demonstrate the influence of legged locomotion on the quality of the visual data stream by providing a detailed study of HyQ's motions, compared against those of a rough-terrain wheeled robot of the same size. Our proposed Inertial Measurement Unit (IMU)-based controller allows us to decouple the camera from the robot's motion. We show through experiments that stabilizing the image feedback improves the onboard vision-based processes of tracking and mapping. In particular, during outdoor tests on the quadruped robot, our camera stabilization system improved the accuracy of the 3D maps by 25% and reduced mapping failures by 50%.
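The core idea of the IMU-based compensation described above can be sketched as follows: the base's orientation, estimated from the IMU, is subtracted from a fixed desired gaze direction, and the residual is sent to the PTU as pan (countering yaw) and tilt (countering pitch) commands. This is a minimal illustrative sketch, not the authors' controller; the class name, joint limits, and the simple proportional structure are assumptions for illustration only.

```python
import math


class GazeStabilizer:
    """Illustrative IMU-driven pan/tilt compensation.

    Keeps the camera pointed at a fixed world-frame direction by
    commanding the PTU to counter the base's yaw (pan axis) and
    pitch (tilt axis). Roll is not compensated by a 2-DOF PTU.
    """

    def __init__(self,
                 pan_limit=math.radians(90),   # hypothetical joint limits
                 tilt_limit=math.radians(30)):
        self.pan_limit = pan_limit
        self.tilt_limit = tilt_limit
        self.target_yaw = 0.0    # desired gaze direction (world frame, rad)
        self.target_pitch = 0.0

    @staticmethod
    def _clamp(value, limit):
        return max(-limit, min(limit, value))

    def update(self, base_yaw, base_pitch):
        """Map IMU base orientation (rad) to PTU pan/tilt commands (rad)."""
        pan = self._clamp(self.target_yaw - base_yaw, self.pan_limit)
        tilt = self._clamp(self.target_pitch - base_pitch, self.tilt_limit)
        return pan, tilt


stab = GazeStabilizer()
# Base yaws 0.05 rad and pitches up 0.1 rad: the PTU counters both.
pan, tilt = stab.update(base_yaw=0.05, base_pitch=0.1)
```

In practice such a loop would run at the PTU's command rate and would also need to account for sensor latency and the PTU's own dynamics; a pure kinematic inversion like this only removes the low-frequency component of the base motion.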

  • ISSN: 0263-5747
  • EISSN: 1469-8668
  • URL: /core/journals/robotica