Eye Tracking for Avatar Eye Gaze Control During Object-Focused Multiparty Interaction in Immersive Collaborative Virtual Environments

Author(s):  
W. Steptoe ◽  
O. Oyekoya ◽  
A. Murgia ◽  
R. Wolff ◽  
J. Rae ◽  
...  
Author(s):  
Arun Sivananthan ◽  
Alexandros Kogkas ◽  
Ben Glover ◽  
Ara Darzi ◽  
George Mylonas ◽  
...  

Abstract Background Interventional endoluminal therapy is rapidly advancing as a minimally invasive surgical technique. The expanding remit of endoscopic therapy necessitates precision control. Eye tracking is an emerging technology which allows intuitive control of devices. This was a feasibility study to establish whether a novel eye gaze-controlled endoscopic system could be used to control an endoscope intuitively. Methods An eye gaze-control system consisting of eye tracking glasses, specialist cameras and a joystick was used to control a robotically driven endoscope, allowing steering, advancement, withdrawal and retroflexion. Eight experienced endoscopists and eight non-endoscopists used both the eye gaze system and a conventional endoscope to identify ten targets in two simulated environments: a sphere and an upper gastrointestinal (UGI) model. Completion of tasks was timed. Subjective feedback was collected from each participant on task load (NASA Task Load Index) and acceptance of technology (Van der Laan scale). Results Non-endoscopists were significantly quicker using gaze-control than conventional endoscopy (sphere task 3:54 ± 1:17 vs. 9:05 ± 5:40 min, p = 0.012, and UGI model task 1:59 ± 0:24 vs. 3:45 ± 0:53 min, p < 0.001). Non-endoscopists reported significantly higher NASA-TLX total workload scores using conventional endoscopy versus gaze-control (80.6 ± 11.3 vs. 22.5 ± 13.8, p < 0.001). Endoscopists reported significantly higher NASA-TLX total workload scores using gaze-control versus conventional endoscopy (54.2 ± 16 vs. 26.9 ± 15.3, p = 0.012). Across all subjects, the gaze-control system received positive ‘usefulness’ and ‘satisfaction’ scores of 0.56 ± 0.83 and 1.43 ± 0.51, respectively. Conclusions The novel eye gaze-control system was significantly quicker to use and subjectively lower in workload when used by non-endoscopists. Further work is needed to see whether this would translate into a shallower learning curve to proficiency versus conventional endoscopy. The eye gaze-control system appears feasible as an intuitive endoscope control system. Hybrid gaze and hand control may prove a beneficial technology for evolving endoscopic platforms.
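The abstract does not detail how gaze positions are translated into endoscope motion beyond steering, advancement, withdrawal and retroflexion via the glasses, cameras and joystick. Purely as a hypothetical illustration of one such scheme, the Python sketch below maps a gaze point on the endoscopic video frame to proportional tip-bending commands, leaving advancement and withdrawal to the joystick; all names and thresholds (GazeSample, gaze_to_steering, DEAD_ZONE) are invented for this sketch and do not describe the authors' implementation.

```python
# Hypothetical gaze-to-steering mapping for a robotically driven endoscope.
# Illustration only: names, ranges and thresholds are assumptions, not the
# authors' implementation.
from dataclasses import dataclass

DEAD_ZONE = 0.1      # ignore gaze jitter near the centre of the video frame
MAX_BEND_RATE = 1.0  # normalised maximum tip-bending rate (assumed)

@dataclass
class GazeSample:
    x: float  # horizontal gaze position in the frame, normalised to [-1, 1]
    y: float  # vertical gaze position in the frame, normalised to [-1, 1]

def gaze_to_steering(gaze: GazeSample) -> tuple[float, float]:
    """Map a gaze point to (left/right, up/down) tip-bending commands.

    Looking at the frame centre holds the tip still; looking towards an
    edge bends the tip proportionally in that direction.
    """
    def axis(v: float) -> float:
        if abs(v) < DEAD_ZONE:
            return 0.0
        scaled = (abs(v) - DEAD_ZONE) / (1.0 - DEAD_ZONE)  # rescale to [0, 1]
        return min(MAX_BEND_RATE, scaled) * (1.0 if v > 0 else -1.0)

    return axis(gaze.x), axis(gaze.y)

if __name__ == "__main__":
    # Gaze slightly right of centre and well above it: bend vertically only.
    print(gaze_to_steering(GazeSample(0.05, -0.6)))  # ≈ (0.0, -0.56)
```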


2015 ◽  
Vol 1 (6) ◽  
pp. 276
Author(s):  
Maria Rashid ◽  
Wardah Mehmood ◽  
Aliya Ashraf

Eye movement tracking is a method that is now widely used to identify usability problems in Human-Computer Interaction (HCI). We first present eye tracking technology and its key elements, and evaluate user behaviour when interacting through an eye-gaze interface. The survey covers different measurement techniques, i.e. electro-oculography, infrared oculography and video oculography; image processing and scrolling techniques; and different modelling approaches, i.e. shape-based, appearance-based and 2D/3D model-based approaches, together with various software algorithms for pupil detection. We compare the surveyed methods on their geometric properties and reported accuracies, and conclude the study with some predictions regarding the future of eye-gaze interaction. We also highlight techniques that exploit various eye properties, including nature, appearance and gesture, or some combination of these, for eye tracking and detection. Results show that eye-gaze selection is a faster and better approach than mouse selection. Error rates indicate that there were no errors when selecting from main menus with either eye gaze or the mouse, but errors could occur when selecting from sub-menus with eye gaze, so the head has to be kept steady in front of the eye-gaze monitor.
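The survey lists shape-based pupil detection among the approaches it covers. As a minimal sketch of that family (not an algorithm taken from the paper), the following Python/OpenCV snippet locates the pupil by dark-region thresholding and contour fitting; the threshold and circularity values are illustrative assumptions.

```python
# Minimal shape-based pupil detector: dark-region thresholding followed by
# contour fitting. Parameter values are illustrative, not from the paper.
import cv2
import numpy as np

def detect_pupil(eye_gray: np.ndarray, thresh: int = 40):
    """Return (cx, cy, radius) of the most pupil-like dark blob, or None."""
    blurred = cv2.GaussianBlur(eye_gray, (7, 7), 0)
    # The pupil is usually the darkest region of a near-infrared eye image.
    _, binary = cv2.threshold(blurred, thresh, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    # Take the largest blob and check that it is roughly circular.
    best = max(contours, key=cv2.contourArea)
    (cx, cy), radius = cv2.minEnclosingCircle(best)
    circularity = cv2.contourArea(best) / (np.pi * radius ** 2 + 1e-6)
    if circularity < 0.6:  # reject elongated blobs such as eyelashes or shadows
        return None
    return int(cx), int(cy), int(radius)
```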


2020 ◽  
Vol 11 (1) ◽  
pp. 99-106
Author(s):  
Marián Hudák ◽  
Štefan Korečko ◽  
Branislav Sobota

Abstract Recent advances in the field of web technologies, including the increasing support of virtual reality hardware, have allowed for shared virtual environments, reachable by just entering a URL in a browser. One contemporary solution that provides such a shared virtual reality is LIRKIS Global Collaborative Virtual Environments (LIRKIS G-CVE). It is a web-based software system, built on top of the A-Frame and Networked-Aframe frameworks. This paper describes LIRKIS G-CVE and introduces its two original components. The first one is the Smart-Client Interface, which turns smart devices, such as smartphones and tablets, into input devices. The advantage of this component over the standard way of user input is demonstrated by a series of experiments. The second component is the Enhanced Client Access layer, which provides access to positions and orientations of clients that share a virtual environment. The layer also stores a history of connected clients and provides limited control over the clients. The paper also outlines an ongoing experiment aimed at an evaluation of LIRKIS G-CVE in the area of virtual prototype testing.
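LIRKIS G-CVE itself is implemented in JavaScript on top of A-Frame and Networked-Aframe, so the following Python fragment is only a language-agnostic conceptual sketch of the bookkeeping an Enhanced Client Access layer performs: tracking each connected client's position and orientation and retaining a history of connections. All class and method names here are invented for illustration.

```python
# Conceptual sketch (Python, for illustration only) of the bookkeeping an
# Enhanced Client Access layer performs: tracking each connected client's
# pose and retaining a history of connections. The real layer is part of the
# JavaScript LIRKIS G-CVE stack; names here are invented.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ClientPose:
    position: tuple[float, float, float]  # x, y, z in scene units
    rotation: tuple[float, float, float]  # pitch, yaw, roll in degrees

@dataclass
class ClientRecord:
    client_id: str
    connected_at: datetime
    disconnected_at: datetime | None = None
    last_pose: ClientPose | None = None

class EnhancedClientAccess:
    def __init__(self) -> None:
        self.active: dict[str, ClientRecord] = {}   # clients currently in the CVE
        self.history: list[ClientRecord] = []       # every client ever connected

    def on_connect(self, client_id: str) -> None:
        record = ClientRecord(client_id, datetime.now(timezone.utc))
        self.active[client_id] = record
        self.history.append(record)

    def on_pose_update(self, client_id: str, pose: ClientPose) -> None:
        self.active[client_id].last_pose = pose

    def on_disconnect(self, client_id: str) -> None:
        record = self.active.pop(client_id)
        record.disconnected_at = datetime.now(timezone.utc)
```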


Author(s):  
Federico Cassioli ◽  
Laura Angioletti ◽  
Michela Balconi

Abstract Human–computer interaction (HCI) is particularly interesting because full-immersive technology may be approached differently by users, depending on the complexity of the interaction, users’ personality traits, and their motivational systems inclination. Therefore, this study investigated the relationship between psychological factors and attention towards specific tech-interactions in a smart home system (SHS). The relation between personal psychological traits and eye-tracking metrics is investigated through self-report measures [locus of control (LoC), user experience (UX), behavioral inhibition system (BIS) and behavioral activation system (BAS)] and a wearable, wireless near-infrared illumination-based eye-tracking system applied to an Italian sample (n = 19). Participants were asked to activate and interact with five tech-interaction areas of different levels of complexity (entrance, kitchen, living room, bathroom, and bedroom) in the SHS, while their eye-gaze behavior was recorded. Data showed significant differences between a simpler interaction (entrance) and a more complex one (living room) in terms of number of fixations. Moreover, a slower time to first fixation was found in a multifaceted interaction (bathroom) compared to simpler ones (kitchen and living room). Additionally, in two interaction conditions (living room and bathroom), negative correlations were found between external LoC and fixation count, and between BAS reward responsiveness scores and fixation duration. Findings led to the identification of a two-way process, where both the complexity of the tech-interaction and subjects’ personality traits are important factors impacting the user’s visual exploration behavior. This research contributes to understanding user responsiveness, adding first insights that may help create more human-centered technology.
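The abstract reports fixation count, time to first fixation and fixation duration but does not state which fixation filter produced them. The sketch below shows how such metrics can be derived from raw gaze samples with a standard dispersion-based (I-DT) filter; the dispersion and duration thresholds are illustrative assumptions, not values from the study. Fixation count for a condition is then simply the length of the returned list.

```python
# Dispersion-based (I-DT) fixation detection and the metrics reported above.
# Thresholds are illustrative assumptions, not values from the study.
from dataclasses import dataclass

@dataclass
class Fixation:
    start_ms: float
    end_ms: float

    @property
    def duration_ms(self) -> float:
        return self.end_ms - self.start_ms

def _dispersion(window):
    xs = [s[1] for s in window]
    ys = [s[2] for s in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def idt_fixations(samples, dispersion_px=35.0, min_duration_ms=100.0):
    """samples: list of (t_ms, x_px, y_px) gaze points. Returns a Fixation list."""
    fixations, i = [], 0
    while i < len(samples):
        # Build an initial window spanning at least the minimum duration.
        j = i
        while j < len(samples) and samples[j][0] - samples[i][0] < min_duration_ms:
            j += 1
        if j >= len(samples):
            break
        if _dispersion(samples[i:j + 1]) <= dispersion_px:
            # Extend the window while the points stay tightly clustered.
            while j + 1 < len(samples) and _dispersion(samples[i:j + 2]) <= dispersion_px:
                j += 1
            fixations.append(Fixation(samples[i][0], samples[j][0]))
            i = j + 1
        else:
            i += 1
    return fixations

def time_to_first_fixation_ms(fixations, onset_ms=0.0):
    """Latency from stimulus onset to the first detected fixation."""
    return fixations[0].start_ms - onset_ms if fixations else None
```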


1997 ◽  
Vol 29 (15) ◽  
pp. 1751-1761 ◽  
Author(s):  
Steve Benford ◽  
Dave Snowdon ◽  
Chris Brown ◽  
Gail Reynard ◽  
Rob Ingram

2010 ◽  
Vol 14 (4) ◽  
pp. 229-240 ◽  
Author(s):  
Nasser Nassiri ◽  
Norman Powell ◽  
David Moore

BMC Neurology ◽  
2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Petra Karlsson ◽  
Tom Griffiths ◽  
Michael T. Clarke ◽  
Elegast Monbaliu ◽  
Kate Himmelmann ◽  
...  

Abstract Background Limited research exists to guide clinical decisions about trialling, selecting, implementing and evaluating eye-gaze control technology. This paper reports on the outcomes of a Delphi study that was conducted to build international stakeholder consensus to inform decision making about trialling and implementing eye-gaze control technology with people with cerebral palsy. Methods A three-round online Delphi survey was conducted. In Round 1, 126 stakeholders responded to questions identified through an international stakeholder Advisory Panel and systematic reviews. In Round 2, 63 respondents rated the importance of 200 statements generated in Round 1. In Round 3, 41 respondents rated the importance of the 105 highest-ranked statements retained from Round 2. Results Stakeholders achieved consensus on 94 of the original 200 statements. These statements related to person factors, support networks, the environment, and technical aspects to consider during assessment, trial, implementation and follow-up. Findings reinforced the importance of an individualised approach and that information gathered from the user, their support network and professionals is central when measuring outcomes. Information required to support an application for funding was obtained. Conclusion This Delphi study has identified issues which are unique to eye-gaze control technology and will enhance its implementation with people with cerebral palsy.
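The abstract does not report the consensus criterion applied between rounds. As a sketch of the kind of filtering a Delphi analysis performs, the snippet below retains statements rated as important (here assumed to mean at least 4 on a 5-point scale) by at least an assumed 70% of respondents; both thresholds are illustrative, not the study's actual criteria.

```python
# Illustrative Delphi consensus filter. The abstract does not report the exact
# criterion; the rating cut-off and agreement level below are assumptions.
def reaches_consensus(ratings, cutoff=4, agreement=0.70):
    """ratings: importance ratings (1-5) for one statement from one round."""
    if not ratings:
        return False
    share = sum(1 for r in ratings if r >= cutoff) / len(ratings)
    return share >= agreement

def retain_statements(round_ratings, **kwargs):
    """round_ratings: dict mapping a statement to its list of ratings."""
    return [s for s, ratings in round_ratings.items()
            if reaches_consensus(ratings, **kwargs)]
```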

