
Panoramic visual system for spherical mobile robots

Published online by Cambridge University Press:  05 February 2024

Muhammad Affan Arif
Affiliation: Institute of Robotics and Intelligent Systems, Xi'an Jiaotong University, Xi'an, 710049, China; Shaanxi Key Laboratory of Intelligent Robots, Xi'an, 710049, China

Aibin Zhu*
Affiliation: Institute of Robotics and Intelligent Systems, Xi'an Jiaotong University, Xi'an, 710049, China; Shaanxi Key Laboratory of Intelligent Robots, Xi'an, 710049, China

Han Mao
Affiliation: Institute of Robotics and Intelligent Systems, Xi'an Jiaotong University, Xi'an, 710049, China; Shaanxi Key Laboratory of Intelligent Robots, Xi'an, 710049, China

Yao Tu
Affiliation: Institute of Robotics and Intelligent Systems, Xi'an Jiaotong University, Xi'an, 710049, China; Shaanxi Key Laboratory of Intelligent Robots, Xi'an, 710049, China

*Corresponding author: Aibin Zhu; Email: abzhu@mail.xjtu.edu.cn

Abstract

To address the challenge of wide-angle visual perception for mobile robots in diverse field applications, we present a spherical robot visual system that provides a 360° field of view (FOV) for real-time object detection. A model of the spherical robot's image acquisition system is developed with optimized parameters, including camera spacing, camera axis angle, and the distance to the target image plane. Two 180°-wide panoramic FOVs, front view and rear view, are formed using four on-board cameras. The SURF algorithm is accelerated for feature extraction and matching. For seamless fusion of the images, an improved fade-in and fade-out algorithm is used, which improves both seam quality and object detection performance. The speed of dynamic image stitching is significantly enhanced by a cache-based sequential image fusion method. On the acquired panoramic views, the YOLO algorithm performs real-time object detection. The panoramic visual system is then tested on the spherical robot in real time: it outputs panoramic views of the scene at an average frame rate of 21.69 fps, and panoramic views with object detection at an average of 15.39 fps.
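The fade-in and fade-out fusion named in the abstract is, at its core, linear feathering: across the overlap strip shared by two adjacent camera views, the left image's weight decays from 1 to 0 while the right image's weight grows from 0 to 1. The sketch below illustrates that classic scheme with NumPy only; the function name, the fixed vertical-strip overlap, and the purely linear weights are illustrative assumptions, not the authors' improved algorithm.

```python
import numpy as np

def fade_blend(left, right, overlap):
    """Stitch two images sharing an `overlap`-pixel-wide vertical strip,
    blending the strip with linear fade-in/fade-out (feathering) weights.
    Works for grayscale (H, W) or color (H, W, C) arrays of equal height.
    """
    h, wl = left.shape[:2]
    wr = right.shape[1]
    out_w = wl + wr - overlap
    out = np.zeros((h, out_w) + left.shape[2:], dtype=np.float64)

    # Non-overlapping parts are copied through unchanged.
    out[:, :wl - overlap] = left[:, :wl - overlap]
    out[:, wl:] = right[:, overlap:]

    # Across the seam: left fades out (1 -> 0), right fades in (0 -> 1).
    alpha = np.linspace(1.0, 0.0, overlap)
    if left.ndim == 3:
        alpha = alpha[:, None]  # broadcast over color channels
    out[:, wl - overlap:wl] = (alpha * left[:, wl - overlap:]
                               + (1.0 - alpha) * right[:, :overlap])
    return out.astype(left.dtype)
```

In a multi-camera pipeline this step runs after feature-based registration (e.g., SURF matching plus a homography warp) has aligned the views, so the overlap region already depicts the same scene content in both inputs.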

Type
Research Article
Copyright
© The Author(s), 2024. Published by Cambridge University Press

