A Study on the Development of the Psychological Assessment Using Eye-Tracking: Focused on Eye Gaze Processing of Literacy Text

Author(s):  
Joon Hyun Jeon ◽  
Gyoung Kim ◽  
Jeong Ae Kim
2015 ◽  
Vol 1 (6) ◽  
pp. 276
Author(s):  
Maria Rashid ◽  
Wardah Mehmood ◽  
Aliya Ashraf

Eye-movement tracking is a method now commonly used to investigate usability problems in human–computer interaction (HCI). We first present eye-tracking technology and its key elements, and evaluate user behavior during interaction with an eye-gaze interface. We survey the different techniques employed, including electro-oculography, infrared oculography, video-oculography, image-processing and scrolling techniques, various models and approaches (shape-based, appearance-based, and 2D/3D model-based), and software algorithms for pupil detection. We compare the surveyed systems on their geometric properties and reported accuracies, and conclude with predictions about future eye-gaze research. We highlight techniques that exploit various eye properties, including shape, appearance, and motion, or some combination of these, for eye detection and tracking. Results show that eye-gaze selection is faster than mouse selection. Error rates across all subjects show that no errors occurred when selecting from main menus with either eye mark or mouse, but errors did occur with eye mark when selecting from sub-menus; the head must therefore be kept steady in front of the eye-gaze monitor.


Author(s):  
Federico Cassioli ◽  
Laura Angioletti ◽  
Michela Balconi

Abstract
Human–computer interaction (HCI) is particularly interesting because full-immersive technology may be approached differently by users, depending on the complexity of the interaction, users’ personality traits, and their motivational systems inclination. Therefore, this study investigated the relationship between psychological factors and attention towards specific tech-interactions in a smart home system (SHS). The relation between personal psychological traits and eye-tracking metrics is investigated through self-report measures [locus of control (LoC), user experience (UX), behavioral inhibition system (BIS) and behavioral activation system (BAS)] and a wearable and wireless near-infrared illumination based eye-tracking system applied to an Italian sample (n = 19). Participants were asked to activate and interact with five different tech-interaction areas with different levels of complexity (entrance, kitchen, living room, bathroom, and bedroom) in a smart home system (SHS), while their eye-gaze behavior was recorded. Data showed significant differences between a simpler interaction (entrance) and a more complex one (living room) in terms of number of fixations. Moreover, a slower time to first fixation was found in a multifaceted interaction (bathroom) compared to simpler ones (kitchen and living room). Additionally, in two interaction conditions (living room and bathroom), negative correlations were found between external LoC and fixation count, and between BAS reward responsiveness scores and fixation duration. Findings led to the identification of a two-way process, where both the complexity of the tech-interaction and subjects’ personality traits are important factors impacting the user’s visual exploration behavior. This research contributes to understanding user responsiveness, adding first insights that may help to create more human-centered technology.


Author(s):  
Gavindya Jayawardena ◽  
Sampath Jayarathna

Eye-tracking experiments involve areas of interest (AOIs) for the analysis of eye gaze data. While there are tools to delineate AOIs to extract eye movement data, they may require users to manually draw boundaries of AOIs on eye tracking stimuli or use markers to define AOIs. This paper introduces two novel techniques to dynamically filter eye movement data from AOIs for the analysis of eye metrics at multiple levels of granularity. The authors incorporate pre-trained object detectors and object instance segmentation models for offline detection of dynamic AOIs in video streams. This research presents the implementation and evaluation of object detectors and object instance segmentation models to find the best model to be integrated into a real-time eye movement analysis pipeline. The authors filter gaze data that falls within the polygonal boundaries of detected dynamic AOIs and apply an object detector to find bounding boxes in a public dataset. The results indicate that the dynamic AOIs generated by object detectors capture 60% of eye movements, and object instance segmentation models capture 30% of eye movements.
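The core filtering step described above — keeping only gaze samples that fall within a detected AOI's polygonal boundary — can be sketched as follows. This is a minimal illustration under assumed data shapes (gaze samples as `(x, y, timestamp)` tuples, an AOI as a list of vertices), not the authors' implementation; it uses a standard ray-casting point-in-polygon test.

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test; polygon is a list of (x, y) vertices in order."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Toggle on each edge crossed by a horizontal ray going right from (x, y).
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def filter_gaze_by_aoi(gaze_samples, aoi_polygon):
    """Keep only (x, y, timestamp) samples lying inside the AOI polygon."""
    return [s for s in gaze_samples
            if point_in_polygon(s[0], s[1], aoi_polygon)]
```

For dynamic AOIs, the same filter would be applied per video frame against that frame's detected polygon (or bounding box, treated as a four-vertex polygon).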


Author(s):  
Chandni Parikh

Eye movements and gaze direction have been utilized to make inferences about perception and cognition since the 1800s. The driving factor behind recording overt eye movements stems from the fundamental idea that one's gaze provides tremendous insight into the information processing that takes place early on during development. One of the key deficits seen in individuals diagnosed with Autism Spectrum Disorders (ASD) involves eye gaze and social attention processing. The current chapter focuses on the use of eye-tracking technology with high-risk infants who are siblings of children diagnosed with ASD, in order to highlight potential bio-behavioral markers that can inform the ascertainment of red flags and atypical behaviors associated with ASD within the first few years of development.


Author(s):  
Priya Seshadri ◽  
Youyi Bi ◽  
Jaykishan Bhatia ◽  
Ross Simons ◽  
Jeffrey Hartley ◽  
...  

This study is the first stage of a research program aimed at understanding differences in how people process 2D and 3D automotive stimuli, using psychophysiological tools such as galvanic skin response (GSR), eye tracking, electroencephalography (EEG), and facial expressions coding, along with respondent ratings. The current study uses just one measure, eye tracking, and one stimulus format, 2D realistic renderings of vehicles, to reveal where people expect to find information about brand and other industry-relevant topics, such as sportiness. The eye-gaze data showed differences in the percentage of fixation time that people spent on different views of cars while evaluating the “Brand” and the degree to which they looked “Sporty/Conservative”, “Calm/Exciting”, and “Basic/Luxurious”. The results of this work can give designers insights on where they can invest their design efforts when considering brand and styling cues.
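The central metric reported above, the percentage of total fixation time spent on each view of a car, reduces to a simple aggregation. A minimal sketch under an assumed record format of `(view_label, fixation_duration_ms)` pairs (hypothetical names, not the study's actual pipeline):

```python
from collections import defaultdict

def fixation_time_percentages(fixations):
    """Share of total fixation time per view, as percentages.

    fixations: iterable of (view_label, fixation_duration_ms) records.
    """
    totals = defaultdict(float)
    for view, duration_ms in fixations:
        totals[view] += duration_ms
    grand_total = sum(totals.values())
    return {view: 100.0 * t / grand_total for view, t in totals.items()}
```

Comparing these per-view percentages across rating tasks ("Brand", "Sporty/Conservative", and so on) is what reveals where viewers expect to find each kind of styling information.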


Author(s):  
Alexander L. Anwyl-Irvine ◽  
Thomas Armstrong ◽  
Edwin S. Dalmaijer

Abstract
Psychological research is increasingly moving online, where web-based studies allow for data collection at scale. Behavioural researchers are well supported by existing tools for participant recruitment, and for building and running experiments with decent timing. However, not all techniques are portable to the Internet: While eye tracking works in tightly controlled lab conditions, webcam-based eye tracking suffers from high attrition and poorer quality due to basic limitations like webcam availability, poor image quality, and reflections on glasses and the cornea. Here we present MouseView.js, an alternative to eye tracking that can be employed in web-based research. Inspired by the visual system, MouseView.js blurs the display to mimic peripheral vision, but allows participants to move a sharp aperture that is roughly the size of the fovea. Like eye gaze, the aperture can be directed to fixate on stimuli of interest. We validated MouseView.js in an online replication (N = 165) of an established free viewing task (N = 83 existing eye-tracking datasets), and in an in-lab direct comparison with eye tracking in the same participants (N = 50). MouseView.js proved as reliable as gaze, and produced the same pattern of dwell time results. In addition, dwell time differences from MouseView.js and from eye tracking correlated highly, and related to self-report measures in similar ways. The tool is open-source, implemented in JavaScript, and usable as a standalone library, or within Gorilla, jsPsych, and PsychoJS. In sum, MouseView.js is a freely available instrument for attention-tracking that is both reliable and valid, and that can replace eye tracking in certain web-based psychological experiments.


2021 ◽  
Vol 14 (2) ◽  
Author(s):  
Xin Liu ◽  
Bin Zheng ◽  
Xiaoqin Duan ◽  
Wenjing He ◽  
Yuandong Li ◽  
...  

Eye-tracking can help decode the intricate control mechanisms in human performance. In healthcare, physicians-in-training require extensive practice to improve their skills. When trainees encounter difficulty in practice, they need feedback from experts to improve their performance. Such personal feedback is time-consuming and subject to bias. In this study, we tracked the eye movements of trainees during their colonoscopic performance in simulation. We applied deep learning algorithms to detect eye-tracking metrics at moments of navigation lost (MNL), a signature sign of performance difficulty during colonoscopy. Basic human eye gaze and pupil characteristics were learned and verified by deep convolutional generative adversarial networks (DCGANs); the generated data were fed to Long Short-Term Memory (LSTM) networks with three different data feeding strategies to classify MNLs from the entire colonoscopic procedure. Outputs from deep learning were compared to the expert’s judgment of the MNLs based on colonoscopic videos. The best classification outcome was achieved when we fed human eye data together with 1000 synthesized eye data, where accuracy (90%), sensitivity (90%), and specificity (88%) were optimized. This study builds an important foundation for our work of developing a self-adaptive education system for training healthcare skills using simulation.


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Minah Kim ◽  
Woncheol Shin ◽  
Tak Hyung Lee ◽  
Taekwan Kim ◽  
Wu Jeong Hwang ◽  
...  

Abstract
The symptoms of obsessive–compulsive disorder (OCD) are largely related to impaired executive functioning due to frontostriatal dysfunction. To better treat OCD, the development of biomarkers to bridge the gap between the symptomatic-cognitive phenotype and brain abnormalities is warranted. Therefore, we aimed to identify biomarkers of impaired organizational strategies during visual encoding processes in OCD patients by developing an eye tracking-based Rey–Osterrieth complex figure test (RCFT). In 104 OCD patients and 114 healthy controls (HCs), eye movements were recorded during memorization of the RCFT figure, and organizational scores were evaluated. Kullback–Leibler divergence (KLD) scores were calculated to evaluate the distance between a participant’s eye gaze distribution and a hypothetical uniform distribution within the RCFT figure. Narrower gaze distributions within the RCFT figure, which yielded higher KLD scores, indicated that the participant was more obsessed with detail and had less organizational strategy. The OCD patients showed lower organizational scores than the HCs. Although no group differences in KLD scores were noted, KLD scores were significantly associated with organization T scores in the OCD group. The current study findings suggest that eye tracking biomarkers of visual memory encoding provide a rapidly determined index of executive functioning, such as organizational strategies, in OCD patients.
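The KLD score described above can be illustrated with a short sketch. This is a hedged reconstruction of the idea, with binning and data shapes assumed rather than taken from the paper: gaze samples are binned over a grid covering the figure, and D_KL(P || U) is computed from the empirical gaze distribution P to a uniform distribution U over the same bins.

```python
import math

def gaze_kld_vs_uniform(bin_counts):
    """KLD from an empirical gaze distribution to a uniform distribution.

    bin_counts: gaze-sample counts per spatial bin (every bin listed,
    zeros included). Higher values indicate a narrower gaze distribution.
    """
    n_bins = len(bin_counts)
    total = sum(bin_counts)
    uniform_p = 1.0 / n_bins
    kld = 0.0
    for c in bin_counts:
        if c > 0:  # 0 * log(0) contributes nothing, by convention
            p = c / total
            kld += p * math.log(p / uniform_p)
    return kld
```

A perfectly even scan over the figure yields a KLD of zero, while gaze concentrated in a few bins yields progressively higher scores, which is the property the paper uses as a detail-focus index.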

