
Soft Robotic Finger Embedded with Visual Sensor for Bending Perception

Published online by Cambridge University Press:  17 June 2020

Shixin Zhang
Affiliation:
College of Mechanical Engineering, Anhui University of Technology, Maanshan 243002, China E-mails: zsx1723190077@163.com, 379751793@qq.com
Jianhua Shan
Affiliation:
College of Mechanical Engineering, Anhui University of Technology, Maanshan 243002, China E-mails: zsx1723190077@163.com, 379751793@qq.com
Bin Fang*
Affiliation:
Department of Computer Science and Technology, Tsinghua National Laboratory for Information Science and Technology, Tsinghua University, Beijing 100084, China
Fuchun Sun*
Affiliation:
Department of Computer Science and Technology, Tsinghua National Laboratory for Information Science and Technology, Tsinghua University, Beijing 100084, China
*
*Corresponding author. E-mails: fangbin@mail.tsinghua.edu.cn; fcsun@tsinghua.edu.cn

Summary

Various vision-based tactile sensors have been developed for robotic perception in recent years. In this paper, a novel soft robotic finger embedded with a visual sensor is proposed for bending perception. The finger consists of a colored soft inner chamber, an outer structure, and an endoscope camera. A bending perception algorithm based on image preprocessing and deep learning is proposed: the boundary of the color regions and the positions of the marker dots are extracted from the inner-chamber image and the label image, respectively, and a convolutional neural network with multi-task learning is then trained to estimate the bending states of the finger. Finally, experiments are carried out to verify the effectiveness of the proposed method.
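To illustrate the kind of pipeline the summary describes, the following minimal sketch (not the authors' code) shows how a color-region boundary and marker-dot centroids might be extracted with OpenCV, and how a small convolutional network with two task heads could be trained for multi-task bending perception. The HSV thresholds, image size, number of bending states, and network layout are hypothetical placeholders, not values taken from the paper.

import cv2
import numpy as np
import torch
import torch.nn as nn


def extract_features(bgr_image: np.ndarray):
    """Return the largest color-region contour and marker-dot centroids."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    # Hypothetical HSV range for the colored inner chamber.
    mask = cv2.inRange(hsv, (35, 60, 60), (85, 255, 255))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    boundary = max(contours, key=cv2.contourArea) if contours else None
    # Marker dots: detect dark blobs and take their enclosing-circle centers.
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    _, dots = cv2.threshold(gray, 50, 255, cv2.THRESH_BINARY_INV)
    dot_contours, _ = cv2.findContours(dots, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centroids = [cv2.minEnclosingCircle(c)[0] for c in dot_contours]
    return boundary, centroids


class MultiTaskBendingNet(nn.Module):
    """Shared convolutional backbone with two heads: a classifier for the
    discrete bending state and a regressor for a continuous bending angle."""

    def __init__(self, num_states: int = 5):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.state_head = nn.Linear(32, num_states)  # classification head
        self.angle_head = nn.Linear(32, 1)           # regression head

    def forward(self, x):
        feat = self.backbone(x)
        return self.state_head(feat), self.angle_head(feat)


if __name__ == "__main__":
    net = MultiTaskBendingNet()
    dummy = torch.randn(1, 3, 128, 128)  # placeholder camera frame
    state_logits, angle = net(dummy)
    print(state_logits.shape, angle.shape)  # torch.Size([1, 5]) torch.Size([1, 1])

In a multi-task setup of this kind, the two heads would typically be trained jointly with a weighted sum of a cross-entropy loss (bending state) and a mean-squared-error loss (bending angle); the weighting is again an assumption, since the paper's exact loss design is not reproduced here.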

Type
Articles
Copyright
Copyright © The Author(s), 2020. Published by Cambridge University Press


Footnotes

Shixin Zhang and Jianhua Shan contributed equally to this work.
