Published in

2014 IEEE International Conference on Image Processing (ICIP)

DOI: 10.1109/icip.2014.7025463

Upper limb movement analysis via marker tracking with a single-camera system

Proceedings article published in 2014 by Cheng Yang, Andy Kerr, Vladimir Stankovic, Lina Stankovic, and Philip Rowe
This paper is available in a repository.

Preprint: archiving allowed
Postprint: archiving allowed
Published version: archiving forbidden
Data provided by SHERPA/RoMEO

Abstract

Optical motion capture systems are widely adopted for human motion analysis in stroke rehabilitation because of their real-time processing and high accuracy. However, these systems require a large laboratory space and multiple cameras, and are therefore expensive and difficult to transport. In this paper, we propose a portable, low-cost, single-camera motion analysis system for upper limb movement analysis. The proposed system comprises video acquisition, camera calibration, marker tracking, autonomous joint angle calculation, visualization, validation, and classification. Validation against a state-of-the-art optical motion analysis system using Bland-Altman plots, a standard clinical measure of agreement, indicates that the proposed system accurately captures elbow, trunk-tilt, and shoulder movement for diagnosis. Furthermore, volunteers are classified into healthy and stroke groups via a support vector machine trained on statistics of the trunk-tilt and shoulder movements. Experimental results show that the proposed system accurately captures upper limb movement patterns, automatically classifies stroke survivors consistent with ordinal scale classification of upper limb impairment, and offers a convenient, inexpensive solution for upper limb movement analysis.
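
To make the pipeline description above concrete, the sketch below is a minimal Python illustration of two of the stages the abstract mentions: joint angle calculation from tracked marker positions and SVM classification on summary statistics of the trunk-tilt and shoulder angles. The helper names (joint_angle, trunk_tilt, trial_features), the six-statistic feature vector, and the RBF-kernel SVC are assumptions made for illustration only, not the paper's implementation.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def joint_angle(a, b, c):
    """Angle (degrees) at marker b formed by segments b->a and b->c,
    e.g. the elbow angle from shoulder, elbow, and wrist markers."""
    a, b, c = (np.asarray(p, dtype=float) for p in (a, b, c))
    u, v = a - b, c - b
    cos_t = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0)))

def trunk_tilt(shoulder, hip):
    """Trunk tilt (degrees from vertical) of the hip->shoulder segment,
    assuming calibrated 2D coordinates with the y-axis vertical."""
    dx, dy = np.asarray(shoulder, float) - np.asarray(hip, float)
    return np.degrees(np.arctan2(abs(dx), abs(dy)))

def trial_features(trunk_angles, shoulder_angles):
    """Per-trial summary statistics used as the SVM feature vector
    (a hypothetical six-number feature set, not the paper's)."""
    return [np.mean(trunk_angles), np.std(trunk_angles), np.max(trunk_angles),
            np.mean(shoulder_angles), np.std(shoulder_angles), np.max(shoulder_angles)]

# Toy example: random feature vectors standing in for real trials,
# labelled 0 = healthy, 1 = stroke.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 6))
y = np.array([0, 1] * 10)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)
print(clf.predict(X[:3]))
```

Similarly, the Bland-Altman validation mentioned in the abstract can be sketched as follows; the axis labels and the 1.96 SD limits of agreement reflect the standard formulation of the method, not the paper's specific figures.

```python
import numpy as np
import matplotlib.pyplot as plt

def bland_altman(a, b):
    """Plot the difference between paired measurements against their mean,
    with the mean difference and 95% limits of agreement (+/- 1.96 SD)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    mean, diff = (a + b) / 2.0, a - b
    md, sd = diff.mean(), diff.std(ddof=1)
    plt.scatter(mean, diff)
    for level in (md, md + 1.96 * sd, md - 1.96 * sd):
        plt.axhline(level, linestyle="--")
    plt.xlabel("Mean of the two systems (deg)")
    plt.ylabel("Single-camera minus reference (deg)")
    plt.show()
```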