Eye-tracking for avatar eye-gaze and interactional analysis in immersive collaborative virtual environments

Author(s): William Steptoe, Robin Wolff, Alessio Murgia, Estefania Guimaraes, John Rae, …
2015 · Vol 1 (6) · pp. 276

Author(s): Maria Rashid, Wardah Mehmood, Aliya Ashraf

Eye-movement tracking is a method now widely used to identify usability problems in the context of human–computer interaction (HCI). We first present eye-tracking technology and its key elements, and then evaluate user behavior when interacting through an eye-gaze interface. Various techniques are reviewed, including electro-oculography, infrared oculography, video oculography, image-processing and scrolling techniques, different models, and probable approaches such as shape-based methods, appearance-based methods, 2D and 3D model-based approaches, and software algorithms for pupil detection. We compare the surveyed systems on the basis of their geometric properties and reported accuracies, and conclude the study with some predictions regarding the future of eye-gaze interaction. We highlight techniques that exploit various properties of the eye, including its nature, appearance, and movement, or combinations of these, for eye tracking and detection. Results show that eye-gaze selection is faster than mouse selection. Error rates across all subjects indicate that no errors occurred when selecting from main menus, whether by eye gaze or by mouse, but errors did occur when selecting from sub-menus by eye gaze. The head must therefore be kept steady in front of the eye-gaze monitor.
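Several of the pupil-detection algorithms mentioned above for video oculography reduce to locating the darkest connected region of an eye image. A minimal sketch of that dark-pupil thresholding idea, assuming a grayscale eye image held in a NumPy array (the function name and threshold value are illustrative, not taken from any surveyed system):

```python
import numpy as np

def pupil_centroid(eye_gray: np.ndarray, threshold: int = 40):
    """Estimate the pupil centre in a grayscale eye image.

    Dark-pupil method: the pupil is typically the darkest region,
    so threshold the image and take the centroid of the dark pixels.
    Returns (row, col) or None if no pixel is dark enough.
    """
    mask = eye_gray < threshold          # True where pixels are dark
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()      # centroid of the dark region

# Synthetic example: bright background with one dark "pupil" blob.
img = np.full((60, 80), 200, dtype=np.uint8)
img[20:30, 35:45] = 10                   # dark square, centre (24.5, 39.5)
print(pupil_centroid(img))
```

A production tracker would additionally reject corneal-reflection glints and fit an ellipse to the pupil boundary; the centroid step above is only the core of the idea.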


2020 · Vol 11 (1) · pp. 99-106
Author(s): Marián Hudák, Štefan Korečko, Branislav Sobota

Abstract
Recent advances in the field of web technologies, including the increasing support of virtual reality hardware, have allowed for shared virtual environments, reachable by just entering a URL in a browser. One contemporary solution that provides such a shared virtual reality is LIRKIS Global Collaborative Virtual Environments (LIRKIS G-CVE). It is a web-based software system, built on top of the A-Frame and Networked-Aframe frameworks. This paper describes LIRKIS G-CVE and introduces its two original components. The first one is the Smart-Client Interface, which turns smart devices, such as smartphones and tablets, into input devices. The advantage of this component over the standard way of user input is demonstrated by a series of experiments. The second component is the Enhanced Client Access layer, which provides access to positions and orientations of clients that share a virtual environment. The layer also stores a history of connected clients and provides limited control over the clients. The paper also outlines an ongoing experiment aimed at evaluating LIRKIS G-CVE in the area of virtual prototype testing.
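The abstract describes the Enhanced Client Access layer only at this level of detail. As a language-neutral illustration of what such a layer tracks (live client poses plus a connection history), and emphatically not the actual LIRKIS G-CVE or Networked-Aframe API, a minimal registry might look like:

```python
from dataclasses import dataclass

@dataclass
class ClientState:
    """Pose of one connected client in the shared scene (hypothetical)."""
    position: tuple = (0.0, 0.0, 0.0)   # x, y, z in scene units
    rotation: tuple = (0.0, 0.0, 0.0)   # pitch, yaw, roll in degrees

class ClientRegistry:
    """Illustrative registry: live poses plus a history of connected clients."""

    def __init__(self):
        self.live = {}       # client id -> ClientState for connected clients
        self.history = []    # ids of every client that ever connected

    def connect(self, client_id):
        self.live[client_id] = ClientState()
        self.history.append(client_id)

    def update_pose(self, client_id, position, rotation):
        state = self.live[client_id]
        state.position, state.rotation = tuple(position), tuple(rotation)

    def disconnect(self, client_id):
        self.live.pop(client_id, None)   # the history entry is kept

reg = ClientRegistry()
reg.connect("hmd-1")
reg.update_pose("hmd-1", (1.0, 1.6, -2.0), (0.0, 90.0, 0.0))
reg.disconnect("hmd-1")
print(reg.history)   # "hmd-1" stays in the history after disconnecting
```

Keeping the history separate from the live map is what lets the layer answer "who has ever joined this environment" after clients leave, as the abstract suggests.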


Author(s): Federico Cassioli, Laura Angioletti, Michela Balconi

Abstract
Human–computer interaction (HCI) is particularly interesting because full-immersive technology may be approached differently by users, depending on the complexity of the interaction, users' personality traits, and the inclination of their motivational systems. This study therefore investigated the relationship between psychological factors and attention towards specific tech-interactions in a smart home system (SHS). The relation between personal psychological traits and eye-tracking metrics was investigated through self-report measures [locus of control (LoC), user experience (UX), behavioral inhibition system (BIS) and behavioral activation system (BAS)] and a wearable, wireless, near-infrared-illumination-based eye-tracking system applied to an Italian sample (n = 19). Participants were asked to activate and interact with five tech-interaction areas of differing complexity (entrance, kitchen, living room, bathroom, and bedroom) in the SHS while their eye-gaze behavior was recorded. The data showed significant differences between a simpler interaction (entrance) and a more complex one (living room) in terms of number of fixations. Moreover, a slower time to first fixation was found in a multifaceted interaction (bathroom) compared to simpler ones (kitchen and living room). Additionally, in two interaction conditions (living room and bathroom), negative correlations were found between external LoC and fixation count, and between BAS reward-responsiveness scores and fixation duration. The findings point to a two-way process in which both the complexity of the tech-interaction and the subject's personality traits shape the user's visual exploration behavior. This research contributes to understanding user responsiveness, adding first insights that may help to create more human-centered technology.
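Metrics such as fixation count and time to first fixation are typically derived from raw gaze samples by a fixation filter. A minimal sketch of the common dispersion-threshold (I-DT) approach, with illustrative thresholds that are not taken from this study or its eye-tracking system:

```python
def fixations_idt(samples, max_dispersion=1.0, min_duration=3):
    """Dispersion-threshold (I-DT) fixation detection.

    samples: time-ordered list of (t, x, y) gaze points.
    max_dispersion: max (x-range + y-range) inside a fixation window.
    min_duration: minimum number of samples forming a fixation.
    Returns a list of (start_time, end_time, centroid_x, centroid_y).
    """
    fixations = []
    i = 0
    while i + min_duration <= len(samples):
        if _dispersion(samples[i:i + min_duration]) <= max_dispersion:
            # Grow the window while dispersion stays under the threshold.
            j = i + min_duration
            while j < len(samples) and _dispersion(samples[i:j + 1]) <= max_dispersion:
                j += 1
            xs = [x for _, x, _ in samples[i:j]]
            ys = [y for _, _, y in samples[i:j]]
            fixations.append((samples[i][0], samples[j - 1][0],
                              sum(xs) / len(xs), sum(ys) / len(ys)))
            i = j
        else:
            i += 1
    return fixations

def _dispersion(window):
    xs = [x for _, x, _ in window]
    ys = [y for _, _, y in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

# Five steady samples (one fixation) followed by a rapid saccade away.
gaze = [(0, 10.0, 10.0), (1, 10.2, 9.9), (2, 10.1, 10.1),
        (3, 9.9, 10.0), (4, 10.0, 10.2), (5, 25.0, 30.0), (6, 40.0, 50.0)]
print(fixations_idt(gaze))   # one fixation spanning t = 0..4
```

Fixation count is then `len(fixations_idt(...))`, and time to first fixation is the start time of the first detected fixation relative to stimulus onset.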


1997 · Vol 29 (15) · pp. 1751-1761
Author(s): Steve Benford, Dave Snowdon, Chris Brown, Gail Reynard, Rob Ingram

2010 · Vol 14 (4) · pp. 229-240
Author(s): Nasser Nassiri, Norman Powell, David Moore
