The Neurobiology of Reading Differs for Deaf and Hearing Adults

Author(s):  
Karen Emmorey

Recent neuroimaging and electrophysiological studies reveal how the reading system successfully adapts when phonological codes are relatively coarse-grained due to reduced auditory input during development. New evidence suggests that the optimal end-state for the reading system may differ for deaf versus hearing adults and indicates that certain neural patterns that are maladaptive for hearing readers may be beneficial for deaf readers. This chapter focuses on deaf adults who are signers and have achieved reading success. Although the left-hemisphere-dominant reading circuit is largely similar in both deaf and hearing individuals, skilled deaf readers exhibit a more bilateral neural response to written words and sentences than their hearing peers, as measured by event-related potentials and functional magnetic resonance imaging. Skilled deaf readers may also rely more on neural regions involved in semantic processing than hearing readers do. Overall, emerging evidence indicates that the neural markers for reading skill may differ for deaf and hearing adults.


2020 ◽  
Vol 1 (2) ◽  
pp. 249-267 ◽  
Author(s):  
Karen Emmorey ◽  
Kurt Winsler ◽  
Katherine J. Midgley ◽  
Jonathan Grainger ◽  
Phillip J. Holcomb

To investigate possible universal and modality-specific factors that influence the neurophysiological response during lexical processing, we recorded event-related potentials while a large group of deaf adults (n = 40) viewed 404 signs in American Sign Language (ASL) that varied in ASL frequency, concreteness, and iconicity. Participants performed a go/no-go semantic categorization task (does the sign refer to people?) on video clips of ASL signs (clips began with the signer’s hands at rest). Linear mixed-effects regression models were fit to per-participant, per-trial, and per-electrode data, allowing us to identify unique effects of each lexical variable. We observed an early effect of frequency (greater negativity for less frequent signs) beginning at 400 ms post-video onset at anterior sites, which we interpreted as reflecting form-based lexical processing. This effect was followed by a more widely distributed posterior response that we interpreted as reflecting lexical-semantic processing. Paralleling spoken language, more concrete signs elicited greater negativities, beginning 600 ms post-video onset with a wide scalp distribution. Finally, there were no effects of iconicity (except for a weak effect in the latest epochs, 1,000–1,200 ms), suggesting that iconicity does not modulate the neural response during sign recognition. Despite the perceptual and sensorimotor differences between signed and spoken languages, the overall results indicate that very similar neurophysiological processes underlie lexical access for both signs and words.
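
The per-trial mixed-effects analysis described above can be sketched roughly as follows. This is a minimal illustration, not the authors' code: the column names (amplitude, log_frequency, concreteness, iconicity, participant, electrode), the simulated data, and the use of Python's statsmodels are all assumptions made for the example.

```python
# Minimal sketch: per-trial mixed-effects regression on ERP amplitudes.
# Column names and the simulated data are hypothetical; the original study
# used its own pipeline and predictors.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_subj, n_items, n_elec = 10, 50, 4  # small simulated design
rows = []
for s in range(n_subj):
    subj_offset = rng.normal(0, 1.0)          # random participant intercept
    for i in range(n_items):
        log_freq = rng.normal(0, 1)
        concrete = rng.normal(0, 1)
        iconic = rng.normal(0, 1)
        for e in range(n_elec):
            # Simulated amplitude: frequency and concreteness effects plus noise.
            amp = (-0.5 * log_freq - 0.8 * concrete + 0.0 * iconic
                   + subj_offset + rng.normal(0, 2.0))
            rows.append(dict(participant=f"s{s}", electrode=f"e{e}",
                             log_frequency=log_freq, concreteness=concrete,
                             iconicity=iconic, amplitude=amp))
df = pd.DataFrame(rows)

# Fixed effects for the three lexical variables; random intercepts for
# participants (groups) plus a variance component for electrode.
model = smf.mixedlm(
    "amplitude ~ log_frequency + concreteness + iconicity",
    data=df,
    groups=df["participant"],
    vc_formula={"electrode": "0 + C(electrode)"},
)
print(model.fit().summary())
```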


2003 ◽  
Vol 15 (8) ◽  
pp. 1135-1148 ◽  
Author(s):  
Annett Schirmer ◽  
Sonja A. Kotz

The present study investigated the interaction of emotional prosody and word valence during emotional comprehension in men and women. In a prosody-word interference task, participants listened to positive, neutral, and negative words that were spoken with a happy, neutral, or angry prosody. Participants were asked to rate word valence while ignoring emotional prosody, or vice versa. Congruent stimuli were responded to faster and more accurately than incongruent emotional stimuli. This behavioral effect was more salient for the word valence task than for the prosodic task and was comparable between men and women. The event-related potentials (ERPs) revealed a smaller N400 amplitude for congruent than for emotionally incongruent stimuli. This ERP effect, however, was significant only for the word valence judgment and only for female listeners. The present data suggest that, in both men and women, the word valence judgment was more difficult and more easily influenced by task-irrelevant emotional information than the prosodic task. Furthermore, although emotional prosody and word valence may have a similar influence on an emotional judgment in both sexes, ERPs indicate sex differences in the underlying processing: women, but not men, show an interaction between prosody and word valence during a semantic processing stage.


1990 ◽  
Vol 13 (2) ◽  
pp. 201-233 ◽  
Author(s):  
Risto Näätänen

This article examines the role of attention and automaticity in auditory processing as revealed by event-related potential (ERP) research. An ERP component called the mismatch negativity, generated by the brain's automatic response to changes in repetitive auditory input, reveals that physical features of auditory stimuli are fully processed whether or not they are attended. It also suggests that there exist precise neuronal representations of the physical features of recent auditory stimuli, perhaps the traces underlying acoustic sensory (“echoic”) memory. A mechanism of passive attention switching in response to changes in repetitive input is also implicated.

Conscious perception of discrete acoustic stimuli might be mediated by some of the mechanisms underlying another ERP component (N1), one sensitive to stimulus onset and offset. Frequent passive attentional shifts might account for the effect cognitive psychologists describe as “the breakthrough of the unattended” (Broadbent 1982), that is, that even unattended stimuli may be semantically processed, without assuming automatic semantic processing or late selection in selective attention.

The processing negativity supports the early-selection theory and may arise from a mechanism for selectively attending to stimuli defined by certain features. This stimulus selection occurs in the form of a matching process in which each input is compared with the “attentional trace,” a voluntarily maintained representation of the task-relevant features of the stimulus to be attended. The attentional mechanism described might underlie the stimulus-set mode of attention proposed by Broadbent. Finally, a model of automatic and attentional processing in audition is proposed that is based mainly on the aforementioned ERP components and some other physiological measures.


2016 ◽  
Vol 23 (1) ◽  
pp. 78-89 ◽  
Author(s):  
Anthony J. Angwin ◽  
Nadeeka N.W. Dissanayaka ◽  
Alison Moorcroft ◽  
Katie L. McMahon ◽  
Peter A. Silburn ◽  
...  

Objectives: Cognitive-linguistic impairments in Parkinson’s disease (PD) have been well documented; however, few studies have explored the neurophysiological underpinnings of semantic deficits in PD. This study investigated semantic function in PD using event-related potentials. Methods: Eighteen people with PD and 18 healthy controls performed a semantic judgement task on written word pairs that were either congruent or incongruent. Results: The mean amplitude of the N400 for new incongruent word pairs was similar for both groups; however, the onset latency of the N400 was delayed in the PD group. Further analysis of the data revealed that both groups demonstrated attenuation of the N400 for repeated incongruent trials, as well as attenuation of the P600 component for repeated congruent trials. Conclusions: The presence of N400 congruity and N400 repetition effects in the PD group suggests that semantic processing is generally intact, but with a slower time course, as evidenced by the delayed N400. Additional research will be required to determine whether N400 and P600 repetition effects are sensitive to further cognitive decline in PD. (JINS, 2017, 23, 78–89)
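
One common way to quantify an onset-latency delay like the one reported here is a fractional-area latency measure computed on a condition difference wave. The sketch below is purely illustrative; the article does not state which latency measure was used, and the example waveform, time window, and fraction are assumptions.

```python
# Illustrative sketch (not the authors' method): estimating N400 onset latency
# with a fractional-area measure on a congruent-minus-incongruent difference wave.
import numpy as np

def fractional_area_latency(diff_wave, times, window=(0.3, 0.6), fraction=0.5):
    """Return the time (s) at which `fraction` of the negative area inside
    `window` has accumulated. `diff_wave` and `times` are 1-D arrays."""
    mask = (times >= window[0]) & (times <= window[1])
    seg = np.minimum(diff_wave[mask], 0.0)   # keep only the negative deflection
    area = np.cumsum(-seg)
    if area[-1] == 0:
        return np.nan                        # no negative area in the window
    idx = np.searchsorted(area, fraction * area[-1])
    return times[mask][idx]

# Hypothetical usage: one difference wave per participant, then compare latencies
# across groups. Here a fake N400-like deflection centered at 450 ms.
times = np.linspace(-0.1, 0.9, 501)
example_wave = -np.exp(-((times - 0.45) ** 2) / (2 * 0.05 ** 2))
print(fractional_area_latency(example_wave, times))
```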


Author(s):  
Guoying Lu ◽  
Guanhua Hou

Objective: The purpose of this study was to investigate the effects of semantic congruence and incongruence on sign identification by using event-related potentials (ERPs). Background: Sign systems have crucial roles in public spaces and traffic facilities. Poorly designed signs can easily confuse pedestrians and drivers and reduce the efficiency of public activities and urban administration. Method: Thirty-one participants completed a sign identification experiment independently in a laboratory setting. Experimental materials were selected from GB/T 10001, a Chinese national recommendation standard officially named Public Information Graphical Symbols for Use on Signs. All ERP data were processed using MATLAB 13b, and behavioral data were analyzed using Stata 14. Results: N170, P200, N300, and N400 components were elicited during semantic processing. Statistical analysis revealed that semantic congruence had a main effect on the N300 in the frontal region and on the N400 at FZ in the frontal region, CPZ in the parietal-central region, and PZ in the parietal region. Amplitudes of the N300 induced by picture–word matching differed considerably between the two experimental conditions at electrodes FZ and FCZ. Amplitudes of the N400 were significantly larger in the incongruent condition than in the congruent condition. Conclusion: The study demonstrated that the N300 and N400 are promising indicators for measuring semantic congruence in future sign design. Application: Our findings provide ERP indicators for measuring the semantic congruence of sign design, which can be easily applied to improve the efficiency of sign design and sign comprehension.
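
To make the N400 amplitude contrast concrete, the sketch below shows one typical way such a comparison is computed: mean amplitude in an assumed N400 window at a single electrode, compared across conditions with a paired test. It is illustrative only, written in Python rather than the MATLAB/Stata pipeline described above, and the electrode, time window, and simulated waveforms are assumptions.

```python
# Illustrative sketch (Python, not the MATLAB/Stata pipeline described above):
# comparing mean N400 amplitude between congruent and incongruent signs.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(1)
n_participants = 31
times = np.linspace(-0.2, 0.8, 251)
win = (times >= 0.35) & (times <= 0.50)       # assumed N400 window (e.g., at PZ)

# Hypothetical per-participant average waveforms for each condition (µV).
congruent = rng.normal(0.0, 1.0, size=(n_participants, times.size))
incongruent = (congruent + np.where(win, -1.5, 0.0)
               + rng.normal(0, 0.5, size=(n_participants, times.size)))

mean_cong = congruent[:, win].mean(axis=1)     # per-participant mean amplitude
mean_incong = incongruent[:, win].mean(axis=1)
t, p = ttest_rel(mean_incong, mean_cong)       # paired comparison across participants
print(f"t({n_participants - 1}) = {t:.2f}, p = {p:.4f}")
```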

