Compensation Method of Natural Head Movement for Gaze Tracking System Using an Ultrasonic Sensor for Distance Measurement

Sensors
2016, Vol. 16 (1), pp. 110
Author(s): Dongwook Jung, Jong Lee, Su Gwon, Weiyuan Pan, Hyeon Lee, ...
2010, Vol. 36 (8), pp. 1051-1061
Author(s): Chuang ZHANG, Jian-Nan CHI, Zhao-Hui ZHANG, Zhi-Liang WANG

2021, Vol. 11 (2), pp. 851
Author(s): Wei-Liang Ou, Tzu-Ling Kuo, Chin-Chieh Chang, Chih-Peng Fan

In this study, a pupil tracking methodology based on deep learning is developed for visible-light wearable eye trackers. By applying deep-learning object detection based on the You Only Look Once (YOLO) model, the proposed pupil tracking method can effectively estimate and predict the center of the pupil in visible-light mode. When the developed YOLOv3-tiny-based model is used to test pupil tracking performance, the detection accuracy reaches 80% and the recall rate is close to 83%. In addition, the average visible-light pupil tracking errors of the proposed YOLO-based deep-learning design are smaller than 2 pixels in the training mode and 5 pixels in the cross-person test, which are much smaller than those of the previous ellipse-fitting design without deep learning under the same visible-light conditions. After combination with the calibration process, the average gaze tracking errors of the proposed YOLOv3-tiny-based pupil tracking models are smaller than 2.9 and 3.5 degrees in the training and testing modes, respectively, and the proposed visible-light wearable gaze tracking system runs at up to 20 frames per second (FPS) on the GPU-based embedded software platform.
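As a rough illustration of this kind of pipeline (a sketch under stated assumptions, not the authors' implementation), the following Python snippet uses OpenCV's dnn module with a hypothetical single-class YOLOv3-tiny network trained to detect the pupil; the file names, 416x416 input size, camera index, and confidence threshold are all assumptions.

# Minimal sketch of YOLO-based pupil-center estimation in visible light.
# "yolov3-tiny-pupil.cfg" / ".weights" are hypothetical single-class model files.
import cv2
import numpy as np

CONF_THRESHOLD = 0.5  # assumed detection threshold

net = cv2.dnn.readNetFromDarknet("yolov3-tiny-pupil.cfg", "yolov3-tiny-pupil.weights")
layer_names = net.getUnconnectedOutLayersNames()

def pupil_center(frame):
    """Return the (x, y) pixel center of the most confident pupil detection, or None."""
    h, w = frame.shape[:2]
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)
    net.setInput(blob)
    outputs = net.forward(layer_names)

    best_conf, best_center = 0.0, None
    for output in outputs:
        for det in output:
            conf = float(det[4] * det[5:].max())  # objectness * best class score
            if conf > CONF_THRESHOLD and conf > best_conf:
                cx, cy = det[0] * w, det[1] * h   # YOLO outputs normalized box centers
                best_conf, best_center = conf, (int(cx), int(cy))
    return best_center

cap = cv2.VideoCapture(0)  # visible-light eye camera (index assumed)
ok, frame = cap.read()
if ok:
    print("Estimated pupil center:", pupil_center(frame))
cap.release()

In a complete system, the estimated pupil center would then be mapped to a gaze point through the calibration process mentioned above, and per-frame pixel errors would be measured against annotated ground truth.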


2009, Vol. 30 (12), pp. 1144-1150
Author(s): Diego Torricelli, Michela Goffredo, Silvia Conforto, Maurizio Schmid

Our aim is to develop a project that will benefit society. Unmanned ground vehicles (UGVs) are nowadays employed to assist humans in surveillance, rescue, and recovery missions. This paper presents the prototype model of a UGV that is operated wirelessly through manual navigation commands based on live video captured from an IP camera mounted on board. The distance to an obstacle is measured by an ultrasonic sensor and displayed on the LCD. Target tracking as well as attacking is performed based on the obstacle and environment situation monitored in the live video. The complete setup and working of the UGV are described further in this paper.
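For the ultrasonic ranging step, the usual time-of-flight relation is distance = (echo pulse duration x speed of sound) / 2. The sketch below shows this measurement in Python on a Raspberry Pi with an HC-SR04-style sensor; the controller choice, GPIO pin numbers, and polling loop are assumptions, since the paper's exact hardware is not detailed in the abstract.

# Minimal sketch of HC-SR04-style ultrasonic ranging (assumed Raspberry Pi wiring).
import time
import RPi.GPIO as GPIO

TRIG, ECHO = 23, 24  # assumed BCM pin numbers for trigger and echo

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG, GPIO.OUT)
GPIO.setup(ECHO, GPIO.IN)

def distance_cm():
    """Return obstacle distance in centimeters from one echo time-of-flight reading."""
    # A 10-microsecond trigger pulse starts the measurement.
    GPIO.output(TRIG, True)
    time.sleep(0.00001)
    GPIO.output(TRIG, False)

    # Time how long the echo pin stays high (round-trip travel time of the burst).
    pulse_start = pulse_end = time.time()
    while GPIO.input(ECHO) == 0:
        pulse_start = time.time()
    while GPIO.input(ECHO) == 1:
        pulse_end = time.time()

    round_trip = pulse_end - pulse_start
    return round_trip * 34300 / 2  # speed of sound ~34300 cm/s, halved for the one-way distance

try:
    while True:
        print(f"Distance: {distance_cm():.1f} cm")  # on the UGV this value would go to the LCD
        time.sleep(0.5)
finally:
    GPIO.cleanup()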

