Evidence for a system in the auditory periphery that may contribute to linking sounds and images in space

2020
Author(s): David LK Murphy, Cynthia D King, Stephanie N Schlebusch, Christopher A Shera, Jennifer M Groh

Abstract: Eye movements alter the relationship between the visual and auditory spatial scenes. Signals related to eye movements affect the brain’s auditory pathways from the ear through auditory cortex and beyond, but how these signals might contribute to computing the locations of sounds with respect to the visual scene is poorly understood. Here, we evaluated the information contained in the signals observed at the earliest processing stage, eye movement-related eardrum oscillations (EMREOs). We report that human EMREOs carry information about both horizontal and vertical eye displacement as well as initial/final eye position. We conclude that all of the information necessary to contribute to a suitable coordinate transformation of auditory spatial cues into a common reference frame with visual information is present in this signal. We hypothesize that the underlying mechanism causing EMREOs could impose a transfer function on any incoming sound signal, which could permit subsequent processing stages to compute the positions of sounds in relation to the visual scene.
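To make the reference-frame transformation described above concrete, the sketch below converts a head-centred sound direction into eye-centred coordinates by subtracting current eye position, which is the kind of computation the EMREO signal could in principle support. The additive-angle simplification and all names are illustrative assumptions, not the authors' model; real oculomotor geometry is three-dimensional and non-additive.

```python
def head_to_eye_centered(sound_az, sound_el, eye_az, eye_el):
    """Map a sound direction from head-centred to eye-centred coordinates.

    All angles in degrees. Subtracting the current eye-in-head position
    from the head-centred sound direction is the simplest illustrative
    approximation of the coordinate transformation the abstract refers
    to; it is not the paper's model.
    """
    return sound_az - eye_az, sound_el - eye_el

# A sound 20 deg right of the head, heard while gaze is 10 deg right,
# lies 10 deg right of the current gaze direction.
print(head_to_eye_centered(20.0, 0.0, 10.0, 0.0))  # (10.0, 0.0)
```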

Author(s): Angie M. Michaiel, Elliott T.T. Abe, Cristopher M. Niell

Abstract: Many studies of visual processing are conducted in unnatural conditions, such as head- and gaze-fixation. Because this radically limits natural exploration of the visual environment, much less is known about how animals actively use their sensory systems to acquire visual information in natural, goal-directed contexts. Recently, prey capture has emerged as an ethologically relevant behavior that mice perform without training and that engages vision for accurate orienting and pursuit. However, it is unclear how mice target their gaze during such natural behaviors, particularly since, in contrast to many predatory species, mice have a narrow binocular field and lack the foveate vision that would entail fixating a specific point in the visual field. Here we measured head and bilateral eye movements in freely moving mice performing prey capture. We find that the majority of eye movements compensate for head movements, thereby acting to stabilize the visual scene. During head turns, however, these periods of stabilization are interspersed with non-compensatory saccades that abruptly shift gaze position. Analysis of eye movements relative to cricket position shows that the saccades do not preferentially select a specific point in the visual scene. Rather, orienting movements are driven by the head, with the eyes following in coordination to sequentially stabilize and recenter the gaze. These findings help relate eye movements in the mouse to those of other species, and provide a foundation for studying active vision during ethological behaviors in the mouse.
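A minimal sketch of how the compensatory/saccadic decomposition described above can be computed from head and eye recordings: gaze is the sum of head and eye angles, compensation holds gaze velocity near zero, and brief high-velocity gaze shifts are labelled saccades. The 1-D simplification and the 200 deg/s threshold are illustrative assumptions, not parameters from the study.

```python
import numpy as np

def classify_eye_movements(eye_pos, head_pos, dt, saccade_thresh=200.0):
    """Label each sample as 'saccade' or 'compensatory'.

    eye_pos, head_pos : 1-D arrays of horizontal angle (deg) over time.
    dt : sample interval (s). saccade_thresh : gaze speed (deg/s) above
    which a sample counts as a non-compensatory saccade (illustrative value).
    """
    gaze = eye_pos + head_pos            # gaze position in world coordinates
    gaze_vel = np.gradient(gaze, dt)     # gaze velocity, deg/s
    return np.where(np.abs(gaze_vel) > saccade_thresh, "saccade", "compensatory")

# During compensation the eye counter-rotates against the head, so gaze
# velocity stays near zero; an injected jump shows up as a saccade.
t = np.arange(0, 1, 0.01)
head = 30 * t                               # steady head turn
eye = -30 * t                               # compensatory counter-rotation
eye[50:53] += np.array([5.0, 10.0, 15.0])   # brief non-compensatory gaze shift
print(classify_eye_movements(eye, head, dt=0.01)[48:55])
```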


2021, Vol 12 (1)
Author(s): Amir Akbarian, Kelsey Clark, Behrad Noudoost, Neda Nategh

Abstract: Saccadic eye movements (saccades) disrupt the continuous flow of visual information, yet our perception of the visual world remains uninterrupted. Here we assess the representation of the visual scene across saccades from single-trial spike trains of extrastriate visual areas, using a combined electrophysiology and statistical modeling approach. Using a model-based decoder we generate a high-temporal-resolution readout of visual information, and identify the specific changes in neurons’ spatiotemporal sensitivity that underlie an integrated perisaccadic representation of visual space. Our results show that by maintaining a memory of the visual scene, extrastriate neurons produce an uninterrupted representation of the visual world. Extrastriate neurons exhibit a late response enhancement close to the time of saccade onset, which preserves the latest pre-saccadic information until the post-saccadic flow of retinal information resumes. These results show how our brain exploits available information to maintain a representation of the scene while visual inputs are disrupted.
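The core idea of the decoding analysis above, reading the stimulus back out of population spike counts bin by bin to get a time-resolved readout, can be illustrated with a generic linear decoder. The sketch below uses ridge regression per time bin; it is a stand-in for, not a reimplementation of, the authors' statistical model of each neuron's spatiotemporal sensitivity, and all data here are synthetic.

```python
import numpy as np
from sklearn.linear_model import Ridge

def decode_position(train_counts, train_pos, test_counts):
    """Linear readout of stimulus position from spike counts in one time bin.

    train_counts : (n_trials, n_neurons) spike counts in a perisaccadic bin.
    train_pos    : (n_trials,) stimulus position on each trial.
    Repeating the fit across bins yields a time-resolved readout, loosely
    in the spirit of the decoding analysis described above.
    """
    return Ridge(alpha=1.0).fit(train_counts, train_pos).predict(test_counts)

# Synthetic demo: 200 trials, 50 Poisson neurons tuned linearly to position.
rng = np.random.default_rng(0)
pos = rng.uniform(-10, 10, size=200)
weights = rng.normal(size=50)
counts = rng.poisson(np.clip(5 + np.outer(pos, weights), 0, None))
print(decode_position(counts[:150], pos[:150], counts[150:])[:5])
```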


2007
Author(s): Marco Sperduti, Ralf Veit, Andrea Caria, Paolo Belardinelli, Niels Birbaumer, ...

2020
Author(s): David Harris, Mark Wilson, Tim Holmes, Toby de Burgh, Samuel James Vine

Head-mounted eye tracking has been fundamental to developing an understanding of sporting expertise, as the way in which performers sample visual information from the environment is a major determinant of successful performance. There is, however, a long-running tension between the desire to study realistic, in-situ gaze behaviour and the difficulty of acquiring accurate ocular measurements in dynamic, fast-moving sporting tasks. Here, we describe how immersive technologies, such as virtual reality, offer an increasingly compelling approach for conducting eye movement research in sport. The possibility of studying gaze behaviour in representative and realistic environments, but with high levels of experimental control, could enable significant strides forward for eye tracking in sport and improve understanding of how eye movements underpin sporting skills. By providing a rationale for virtual reality as an optimal environment for eye tracking research, as well as outlining practical considerations related to hardware, software and data analysis, we hope to guide researchers and practitioners in the use of this approach.
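As one example of the data-analysis considerations mentioned above, fixation/saccade classification in VR is typically run on the unit gaze-direction vectors that headset eye trackers report. The sketch below implements the standard velocity-threshold (I-VT) algorithm; the 30 deg/s default is a common value from the eye-tracking literature, not a recommendation from this paper.

```python
import numpy as np

def angular_velocity(gaze_dirs, dt):
    """Angular gaze velocity (deg/s) from unit gaze-direction vectors.

    gaze_dirs : (n_samples, 3) unit vectors; dt : sample interval (s).
    Velocity is the angle between successive samples divided by dt.
    """
    dots = np.clip(np.sum(gaze_dirs[1:] * gaze_dirs[:-1], axis=1), -1.0, 1.0)
    return np.degrees(np.arccos(dots)) / dt

def ivt_fixation_mask(gaze_dirs, dt, thresh=30.0):
    """Velocity-threshold (I-VT) classification: True where gaze is fixating."""
    return angular_velocity(gaze_dirs, dt) < thresh  # one entry per sample pair
```

The velocity threshold is the main design choice here: in fast sporting tasks with large head and body movements, gaze velocity should be computed in a head-free reference frame (as the direction vectors above allow) rather than from screen coordinates.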


Cortex, 2016, Vol 85, pp. 182-193
Author(s): Rosanna K. Olsen, Vinoja Sebanayagam, Yunjo Lee, Morris Moscovitch, Cheryl L. Grady, ...
