
Automatic extrinsic calibration for structured light camera and repetitive LiDARs

Published online by Cambridge University Press: 11 April 2024

Yangtao Ge, Chen Yao, Zirui Wang, Bangzhen Huang, Haoran Kang, Wentao Zhang, Zhenzhong Jia and Jing Wu*

Affiliation:
Department of Mechanical and Energy Engineering, Southern University of Science and Technology, Shenzhen, Guangdong Province, P. R. China

*Corresponding author: Jing Wu; Email: wuj@sustech.edu.cn

Abstract

The integration of camera and LiDAR technologies has the potential to significantly enhance the perception capabilities of construction robots by providing complementary construction information. Structured light cameras (SLCs) are a desirable choice because they provide comprehensive information on construction defects. However, fusing these two types of information depends largely on the sensors' relative positions, which can only be established through extrinsic calibration. This paper introduces a novel calibration algorithm for SLCs and repetitive-scanning LiDARs built around a customized calibration board, designed to facilitate the automation of construction robots. The board carries four symmetrically distributed hemispheres, whose centers are recovered by sphere fitting combined with the board's geometric constraints. These spherical centers then serve as reference features for estimating the rigid transformation between the sensors. Because the features are distinctive, the proposed method requires only a single calibration-board pose and minimal human intervention. We conducted both simulation and real-world experiments to assess the algorithm's performance; the results demonstrate improved accuracy and robustness.
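The pipeline the abstract describes, recovering hemisphere centers by sphere fitting and then aligning the matched centers across sensors, can be illustrated with standard tools. The sketch below is not the authors' implementation: it assumes a linear least-squares sphere fit and an SVD-based (Kabsch) rigid alignment of the matched centers, and it omits the paper's symmetry-based geometric constraints. Function names and array shapes are illustrative.

```python
import numpy as np

def fit_sphere(points):
    """Least-squares sphere fit to an (N, 3) array of surface points.

    Linearizes |p - c|^2 = r^2 into  2 c . p + (r^2 - |c|^2) = |p|^2
    and solves the overdetermined system for the center c and the
    constant term, from which the radius follows.
    """
    A = np.hstack([2.0 * points, np.ones((points.shape[0], 1))])
    b = np.sum(points ** 2, axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    return center, radius

def align_centers(lidar_centers, camera_centers):
    """Closed-form rigid alignment (Kabsch/SVD) of matched 3D centers.

    Both inputs are (N, 3) arrays of corresponding sphere centers.
    Returns (R, t) mapping LiDAR coordinates into the camera frame.
    """
    p_bar = lidar_centers.mean(axis=0)
    q_bar = camera_centers.mean(axis=0)
    H = (lidar_centers - p_bar).T @ (camera_centers - q_bar)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the SVD solution.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = q_bar - R @ p_bar
    return R, t
```

Three or more non-collinear correspondences already determine a unique rigid transform, so the four board centers extracted from the SLC and LiDAR point clouds yield the extrinsics in closed form from a single board pose.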

Type
Research Article
Copyright
© The Author(s), 2024. Published by Cambridge University Press


Footnotes

Yangtao Ge and Chen Yao contributed equally to this paper.
