Published in

Wireless Communications and Mobile Computing (Hindawi), 2021, pp. 1-10

DOI: 10.1155/2021/8213946

Real-Time Precise Human-Computer Interaction System Based on Gaze Estimation and Tracking

Journal article published in 2021 by Junhao Huang, Zhicheng Zhang, Guoping Xie, and Hui He.
This paper is made freely available by the publisher.


Abstract

Noncontact human-computer interaction is of significant value in wireless sensor networks. This work aims to achieve accurate computer interaction through automatic eye control, using an inexpensive webcam as the video source. A real-time, accurate human-computer interaction system based on eye state recognition, rough gaze estimation, and tracking is proposed. First, the eye state (open or closed) is classified by an SVM trained on HOG features of the input eye image. Second, rough appearance-based gaze estimation is implemented with a simple CNN model, and the head pose is estimated to judge whether the user is facing the screen. Based on these recognition results, noncontact mouse control and character input methods are designed and developed to replace the standard mouse and keyboard hardware. The accuracy and speed of the proposed interaction system are evaluated with four subjects. The experimental results show that, with only a common monocular camera, users can perform gaze estimation and tracking and carry out most functions of real-time, precise human-computer interaction through automatic eye control.
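
To make the first step of the pipeline concrete, the sketch below shows how eye-state classification with HOG features and an SVM might look. It is not the authors' code; it assumes cropped eye images labelled open/closed and uses OpenCV, scikit-image, and scikit-learn, with illustrative parameter choices (image size, HOG cell layout, SVM kernel) that the paper does not specify.

# Minimal sketch (assumption, not the paper's implementation):
# eye open/closed classification with HOG features and a linear SVM.
import cv2
import numpy as np
from skimage.feature import hog
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def hog_descriptor(eye_img, size=(48, 32)):
    """Resize a cropped eye image to a fixed size and compute its HOG feature vector."""
    gray = cv2.cvtColor(eye_img, cv2.COLOR_BGR2GRAY) if eye_img.ndim == 3 else eye_img
    resized = cv2.resize(gray, size)
    return hog(resized, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), block_norm="L2-Hys")

def train_eye_state_svm(images, labels):
    """Train a binary SVM (0 = closed, 1 = open) on HOG features of eye crops."""
    X = np.array([hog_descriptor(img) for img in images])
    y = np.array(labels)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
    clf = SVC(kernel="linear", C=1.0)
    clf.fit(X_tr, y_tr)
    print("held-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))
    return clf

def predict_eye_state(clf, eye_img):
    """Return 'open' or 'closed' for a single eye crop."""
    return "open" if clf.predict([hog_descriptor(eye_img)])[0] == 1 else "closed"

In such a system the predicted eye state would typically be smoothed over a few consecutive frames before being mapped to a mouse or keyboard event, so that a single misclassified frame does not trigger a spurious click.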