A low-cost non-intrusive spatial hand tracking pipeline for product-process interaction

Published online by Cambridge University Press: 16 May 2024

James Gopsill*, Aman Kukreja, Christopher Michael Jason Cox and Chris Snider
Affiliation: University of Bristol, United Kingdom

Abstract

Hands are the sensors and actuators for many design tasks. While several tools exist to capture human interaction and pose, many are expensive, require intrusive measurement devices to be placed on participants, and often take them out of their natural working environment. This paper reports a novel workflow that combines computer vision, several machine learning algorithms, and geometric transformations to provide a low-cost, non-intrusive means of spatially tracking hands. A ±3 mm position accuracy was attained across a series of three-dimensional follow-the-path studies.
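
The abstract gives only the outline of the workflow. As an illustration, below is a minimal sketch of such a pipeline in Python, assuming MediaPipe Hands for per-frame landmark detection and OpenCV for video capture (both tools are cited by the paper), with a pre-calibrated planar homography standing in for the geometric transformation into workspace coordinates; the homography step is an illustrative assumption, not the paper's documented method.

    import cv2
    import mediapipe as mp
    import numpy as np

    # Homography mapping normalised image coordinates onto a calibrated
    # workspace plane. The identity is a placeholder; in practice it would
    # be estimated from known reference points via cv2.findHomography.
    H = np.eye(3, dtype=np.float32)

    cap = cv2.VideoCapture(0)
    with mp.solutions.hands.Hands(max_num_hands=1,
                                  min_detection_confidence=0.5) as hands:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB input; OpenCV captures BGR.
            results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.multi_hand_landmarks:
                for hand in results.multi_hand_landmarks:
                    # 21 landmarks per hand, normalised to [0, 1] image space.
                    pts = np.array([[[lm.x, lm.y]] for lm in hand.landmark],
                                   dtype=np.float32)
                    # Geometric transformation into workspace coordinates.
                    workspace = cv2.perspectiveTransform(pts, H)
    cap.release()

Note that a single homography only recovers position on a calibrated plane; the three-dimensional, ±3 mm tracking reported in the paper implies a fuller spatial reconstruction (for example, multi-camera triangulation), which this sketch does not attempt.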

Type
Artificial Intelligence and Data-Driven Design
Creative Commons
CC BY-NC-ND 4.0
This is an Open Access article, distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives licence (http://creativecommons.org/licenses/by-nc-nd/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is unaltered and is properly cited. The written permission of Cambridge University Press must be obtained for commercial re-use or in order to create a derivative work.
Copyright
© The Author(s), 2024.
