Directional eye movement detection system for virtual keyboard controller

Author(s): Watcharin Tangsuksant, Chittaphon Aekmunkhongpaisal, Patthiya Cambua, Theekapun Charoenpong, Theerasak Chanwimalueang
2019, Vol 47, pp. 159-167
Author(s): Nathaniel Barbara, Tracey A. Camilleri, Kenneth P. Camilleri
Biosensors, 2021, Vol 11 (9), pp. 343
Author(s): Chin-Teng Lin, Wei-Ling Jiang, Sheng-Fu Chen, Kuan-Chih Huang, Lun-De Liao

In the assistive research area, human–computer interface (HCI) technology helps people with disabilities convey their intentions and thoughts to the outside world. Many eye-movement-based HCI systems have been proposed for this purpose. However, because of the complexity of the necessary algorithms and the difficulty of hardware implementation, few general-purpose designs address practicality and stability in real life. To address these limitations, this study proposes an HCI system based on electrooculography (EOG). The proposed classification algorithm detects eye states, including fixation, saccade, and blinking, and can distinguish among ten kinds of saccade movement (up, down, left, right, farther left, farther right, up-left, down-left, up-right, and down-right). Building on this eye-movement classification algorithm, we developed an HCI system with an eye-dialing interface that can improve the lives of people with disabilities. The results demonstrate the good performance of the proposed classification algorithm, and the EOG-based system, which can detect ten different eye-movement features, can be used in real-life applications.
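The ten-direction scheme in the abstract can be illustrated with a minimal threshold-based sketch. This is not the authors' algorithm: the two-channel layout, window shape, and threshold values are assumptions made for illustration only.

```python
# Illustrative thresholds in microvolts; real values are device- and
# subject-specific (assumed, not taken from the paper).
SACCADE_T = 50.0   # net deflection that counts as a saccade
BLINK_T = 150.0    # large vertical spike typical of a blink

def classify_window(h, v):
    """Classify one window of two-channel EOG samples.

    h: horizontal-channel samples, v: vertical-channel samples.
    Returns 'blink', 'fixation', or one of the ten saccade
    directions listed in the abstract above.
    """
    dh = h[-1] - h[0]                     # net horizontal deflection
    dv = v[-1] - v[0]                     # net vertical deflection

    if max(abs(x) for x in v) > BLINK_T:  # a blink dominates everything
        return "blink"
    if abs(dh) < SACCADE_T and abs(dv) < SACCADE_T:
        return "fixation"

    # 'Farther' left/right are modeled here as deflections past twice
    # the base threshold (an assumption for this sketch).
    horiz = ""
    if dh <= -2 * SACCADE_T:
        horiz = "farther-left"
    elif dh <= -SACCADE_T:
        horiz = "left"
    elif dh >= 2 * SACCADE_T:
        horiz = "farther-right"
    elif dh >= SACCADE_T:
        horiz = "right"

    vert = "up" if dv >= SACCADE_T else ("down" if dv <= -SACCADE_T else "")

    # Diagonal classes combine a vertical label with plain left/right.
    if vert and horiz in ("left", "right"):
        return f"{vert}-{horiz}"
    return horiz or vert
```

For example, a window whose horizontal channel rises by 60 µV while the vertical channel stays flat would be labeled "right", while a 200 µV vertical spike is labeled "blink" regardless of the horizontal channel.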


JSAE Review, 1995, Vol 16 (1), pp. 74-76
Author(s): T Nakano

1994, Vol 41 (10), pp. 990-995
Author(s): G.M. Hatzilabrou, N. Greenberg, R.J. Sclabassi, T. Carroll, R.D. Guthrie, ...

2006, Vol 3 (1), pp. 29-41
Author(s): J. J. Gu, M. Meng, A. Cook, P. X. Liu

Loss of an eye is a tragedy that can cause both psychological and physical suffering. This paper concerns the design, sensing, and control of a robotic prosthetic eye that moves horizontally in synchronization with the natural eye. Two generations of robotic prosthetic eye models have been developed. The first-generation model uses an external infrared sensor array, mounted on the frame of a pair of eyeglasses, to detect natural eye movement and feed the control system that drives the artificial eye to follow it. The second-generation model removes the impractical eyeglass frame and instead uses the electrooculography (EOG) signal, picked up by electrodes placed at the temples, to carry out the same eye-movement detection and control tasks. Theoretical issues in sensor-failure detection and recovery, together with the signal-processing techniques used in sensor data fusion, are studied using statistical methods and artificial-neural-network-based techniques. In addition, a practical control system is designed and implemented on micro-controllers to perform natural eye-movement detection and artificial robotic eye control. Simulation and experimental studies are performed, and the results are included to demonstrate the effectiveness of the research project reported in this paper.
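The sensor-fusion and control ideas summarized above can be sketched in outline. The fusion weights, proportional gain, and boolean failure flags below are illustrative assumptions, not the paper's actual statistical failure tests or neural-network fusion.

```python
def fuse_angle(ir_angle, eog_angle, ir_ok=True, eog_ok=True,
               w_ir=0.7, w_eog=0.3):
    """Fuse two horizontal gaze-angle estimates (degrees).

    A weighted average that falls back to whichever sensor is still
    healthy; the boolean flags stand in for the statistical
    failure-detection tests described in the abstract.
    """
    if ir_ok and eog_ok:
        return w_ir * ir_angle + w_eog * eog_angle
    if ir_ok:
        return ir_angle
    if eog_ok:
        return eog_angle
    raise RuntimeError("both gaze sensors failed")

def servo_step(target_deg, current_deg, gain=0.5, max_step=5.0):
    """One rate-limited proportional update toward the fused target,
    as a micro-controller control tick might compute it."""
    step = gain * (target_deg - current_deg)
    step = max(-max_step, min(max_step, step))
    return current_deg + step
```

Running `servo_step` once per tick walks the prosthetic eye toward the fused estimate without exceeding the per-tick rate limit, a common pattern for small actuators driven by a micro-controller loop.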

