A comparative study of in-field motion capture approaches for body kinematics measurement in construction

  • JoonOh Seo (a1), Abdullatif Alwasel (a2), SangHyun Lee (a3), Eihab M. Abdel-Rahman (a2) and Carl Haas (a4)
Summary

Due to the physically demanding nature of construction tasks, workers are exposed to significant safety and health risks. Measuring and evaluating body kinematics while workers perform tasks helps to identify the fundamental causes of excessive physical demands, enabling practitioners to implement appropriate interventions to reduce them. Recently, non-invasive or minimally invasive motion capture approaches such as vision-based motion capture systems and angular measurement sensors have emerged that can be used for in-field kinematics measurement with minimal interference to on-going work. Given that each approach has pros and cons for kinematic measurement owing to the sensors and algorithms it adopts, an in-depth understanding of the performance of each approach will support better decisions about their adoption in construction. With this background, the authors experimentally evaluate the performance of vision-based approaches (RGB-D sensor-, stereovision camera-, and multiple camera-based) and an angular measurement sensor-based approach (i.e., an optical encoder) for measuring body angles. Specifically, body angles measured by these approaches were compared with those obtained from a marker-based motion capture system with less than 0.1 mm of error. The results showed that the vision-based approaches have about 5–10 degrees of error in body angles, while the angular measurement sensor-based approach measured body angles with about 3 degrees of error during diverse tasks. These results indicate that, in general, the approaches are applicable to diverse ergonomic methods for identifying potential safety and health risks, such as rough postural assessment, time and motion study, or trajectory analysis, where some error in motion data would not significantly compromise reliability.
Combined with relatively accurate angular measurement sensors, vision-based motion capture approaches also have great potential for in-depth physical demand analysis, such as biomechanical analysis that requires full-body motion data, even though further improvement in accuracy is necessary. Additionally, an understanding of workers' body kinematics would enable the ergonomic mechanical design of automated machines and assistive robots that help to reduce physical demands while supporting workers' capabilities.
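At its core, the evaluation described above reduces, per joint and frame, to computing a body angle from 3D landmark positions (as vision-based skeletal tracking provides) and comparing the resulting angle series against the reference system. The following is a minimal sketch of that computation; the keypoint coordinates and angle values are hypothetical and purely illustrative, not data from the study:

```python
import math

def joint_angle(a, b, c):
    """Angle (degrees) at joint b formed by 3D points a-b-c."""
    ba = [a[i] - b[i] for i in range(3)]
    bc = [c[i] - b[i] for i in range(3)]
    dot = sum(ba[i] * bc[i] for i in range(3))
    norm = math.dist(a, b) * math.dist(c, b)
    # Clamp to guard against floating-point values slightly outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def mean_absolute_error(measured, reference):
    """Mean absolute angular error (degrees) between two angle series."""
    return sum(abs(m - r) for m, r in zip(measured, reference)) / len(measured)

# Hypothetical example: elbow angle from shoulder-elbow-wrist keypoints (metres).
shoulder, elbow, wrist = (0.0, 1.4, 0.0), (0.0, 1.1, 0.1), (0.2, 0.9, 0.1)
angle = joint_angle(shoulder, elbow, wrist)
```

Applied frame by frame over a recorded task, `mean_absolute_error` between the estimated and marker-based angle series yields an error figure comparable to the 5–10 degree (vision-based) and ~3 degree (encoder-based) results reported above.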

*Corresponding author. E-mail: joonoh.seo@polyu.edu.hk
Robotica
  • ISSN: 0263-5747
  • EISSN: 1469-8668