Published in

Proceedings. The 7th International IEEE Conference on Intelligent Transportation Systems (IEEE Cat. No.04TH8749)

DOI: 10.1109/itsc.2004.1398960

Recognition of arm gestures using multiple orientation sensors: repeatability assessment

Proceedings article published in 2004 by Martin Urban, Peter Bajcsy, Rob Kooper, J.-C. Lementec
This paper is available in a repository.

Preprint: archiving allowed
Postprint: archiving allowed
Published version: archiving forbidden
Data provided by SHERPA/RoMEO

Abstract

We present a solution to repeatability assessment of an arm gesture recognition system using multiple orientation sensors. We focus specifically on the problem of controlling unmanned aerial vehicles (UAVs) in the presence of manned aircraft on an aircraft deck. Our goal is to design a robust UAV control that uses the same gesture signals currently used by flight directors to control manned vehicles. Because such a system has to operate 24 hours a day in a noisy and harsh environment, for example on a Navy carrier deck, our approach to this problem is based on arm gesture recognition rather than on speech recognition. We have investigated real-time and system design issues for a particular choice of active sensors, namely the orientation sensors of the IS-300 Pro Precision Motion Tracker manufactured by InterSense. Our work consists of (1) scrutinizing sensor data acquisition parameters and reported arm orientation measurements, (2) choosing the optimal attachment and placement of sensors, (3) measuring the repeatability of movements using a dynamic time warping (DTW) metric, and (4) testing the performance of a template-based gesture classification algorithm and robot control mechanisms, where the robot represents a UAV surrogate in a laboratory environment.
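The repeatability measurement in step (3) relies on dynamic time warping, which aligns two gesture recordings of different durations before comparing them. The sketch below shows a minimal DTW distance computation; the 2-D feature vectors and the Euclidean local cost are illustrative assumptions, not the paper's exact orientation features or cost function.

```python
# Minimal DTW sketch for comparing two repetitions of a gesture.
# NOTE: the feature representation (2-D tuples) and Euclidean local
# cost are assumptions for illustration only.
import math

def dtw_distance(a, b):
    """DTW distance between two sequences of feature vectors."""
    n, m = len(a), len(b)
    INF = float("inf")
    # cost[i][j] = min accumulated cost of aligning a[:i] with b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = math.dist(a[i - 1], b[j - 1])      # local cost
            cost[i][j] = d + min(cost[i - 1][j],    # insertion
                                 cost[i][j - 1],    # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

# Two repetitions of the same gesture, the second performed more slowly:
rep1 = [(0.0, 0.0), (0.3, 0.1), (0.7, 0.2), (1.0, 0.3)]
rep2 = [(0.0, 0.0), (0.1, 0.0), (0.4, 0.1), (0.7, 0.2), (1.0, 0.3)]
print(dtw_distance(rep1, rep2))
```

A low DTW distance between repetitions indicates a repeatable gesture, and the same distance can score an incoming gesture against stored templates, as in the template-based classifier of step (4).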