Tactile Attention
Recently Published Documents

TOTAL DOCUMENTS: 40 (five years: 4)
H-INDEX: 14 (five years: 0)

2021
Author(s): Clara Fritz, Mayra Flick, Eckart Zimmermann

A doorbell sounds quieter to us when we ring it ourselves than when someone else pushes the button. Self-produced stimuli appear attenuated compared to stimuli generated by others (Weiss et al., 2011), an effect known as sensory attenuation for external events. Here, we asked whether this effect results from a competition between sensory events for attentional resources. We first tested whether tactile attention is boosted at the time of pushing a button. We presented a button in a virtual reality setup that allowed us to manipulate the timing of tactile feedback. We found that a tactile impulse was perceived as more intense at the moment the hand pushed the button. In a second experiment, participants pushed a button and estimated the loudness of sounds. We found sensory attenuation of loudness only when tactile feedback was provided at the time the movement goal was reached. In a third experiment, we found that this interaction between a tactile and an auditory event did not occur when the hands remained passive, without movement. These data reveal that sensory attenuation for external events occurs because tactile attention is boosted at the time of a button-press movement, thereby drawing attention away from the auditory modality.


Author(s): Winko W. An, Hakim Si-Mohammed, Nicholas Huang, Hannes Gamper, Adrian KC Lee, ...

2019
Vol 41 (10), pp. 1088-1096
Author(s): Tigran Kesayan, Hamlet Gasoyan, Damon G. Lamb, John B. Williamson, Kenneth M. Heilman

Author(s): Allison Gabouer, John Oghalai, Heather Bortfeld

Parent-child dyads in which the child is deaf but the parent is hearing present a unique opportunity to examine parents' use of non-auditory cues, particularly vision and touch, to establish communicative intent. This study examines the multimodal communication patterns of hearing parents during a free-play task with their hearing (N=9) or deaf (N=9) children. Specifically, we coded parents' use of multimodal cues in the service of establishing joint attention with their children. Dyad types were compared for overall use of multimodal (auditory, visual, and tactile) attention-establishing cues and for the overall number of successful and failed bids by a parent for a child's attention. Parents' multimodal behaviors were tracked to determine whether they resulted in successful or failed initiation of joint attention. We focus our interpretation of the results on how hearing parents differentially accommodate their hearing and deaf children to engage them in joint attention. The findings can inform recommendations for hearing parents of deaf children who are candidates for cochlear implantation regarding communication strategies to use prior to a child's implantation. Moreover, these findings expand our understanding of how joint attention is established between parents and their preverbal children, regardless of children's hearing status.
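As a loose illustration of the kind of coding analysis described in this abstract, the following minimal Python sketch tallies successful and failed parental attention bids per cue modality. The record structure, field names, and example data are hypothetical, not taken from the study.

```python
# Minimal sketch, assuming a hypothetical coding scheme: each parental bid
# for attention is recorded with the cue modalities used and whether joint
# attention was achieved. Example data are illustrative, not study data.
from collections import Counter

bids = [
    {"modalities": {"auditory"}, "success": True},
    {"modalities": {"visual", "tactile"}, "success": True},
    {"modalities": {"auditory", "visual"}, "success": False},
    {"modalities": {"tactile"}, "success": True},
]

# Count successful and failed bids separately for each cue modality.
success, failure = Counter(), Counter()
for bid in bids:
    target = success if bid["success"] else failure
    for modality in bid["modalities"]:
        target[modality] += 1

for modality in ("auditory", "visual", "tactile"):
    s, f = success[modality], failure[modality]
    total = s + f
    rate = s / total if total else float("nan")
    print(f"{modality}: {s}/{total} successful bids ({rate:.0%})")
```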


Death Studies
2017
Vol 42 (7), pp. 426-431
Author(s): Jonathan Beyrak-Lev, Zach Gerber, Tsachi Ein-Dor, Gilad Hirschberger

Author(s): Francesco Cerritelli, Piero Chiacchiaretta, Francesco Gambi, Antonio Ferretti

2016
Vol 116 (3), pp. 1218-1231
Author(s): Manuel Gomez-Ramirez, Kristjana Hysaj, Ernst Niebur

Selective attention allows organisms to extract behaviorally relevant information while ignoring distracting stimuli that compete for the limited resources of their central nervous systems. Attention is highly flexible, and it can be harnessed to select information based on sensory modality, within-modality feature(s), spatial location, object identity, and/or temporal properties. In this review, we discuss the body of work devoted to understanding mechanisms of selective attention in the somatosensory system. In particular, we describe the effects of attention on tactile behavior and corresponding neural activity in somatosensory cortex. Our focus is on neural mechanisms that select tactile stimuli based on their location on the body (somatotopic-based attention) or their sensory feature (feature-based attention). We highlight parallels between selection mechanisms in touch and other sensory systems and discuss several putative neural coding schemes employed by cortical populations to signal the behavioral relevance of sensory inputs. Specifically, we contrast the advantages and disadvantages of using a gain code versus a spike-spike correlation code for representing attended sensory stimuli. We favor a neural network model of tactile attention in which frontal, parietal, and subcortical areas control the somatosensory cells encoding the relevant stimulus features, enabling preferential processing throughout the somatosensory hierarchy. Our review draws on noninvasive electrophysiological and imaging data in humans as well as single-unit recordings in nonhuman primates.
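To make the contrast between the two coding schemes concrete, here is a minimal simulation sketch (Python/NumPy) of a gain code, where attention raises mean firing rates, versus a spike-spike correlation code, where rates are unchanged but shared noise between neurons drops. All parameter values (population size, rates, correlation levels) are illustrative assumptions, not figures from the review.

```python
# Minimal sketch contrasting two putative attentional codes: (1) a
# multiplicative gain on mean firing rates, and (2) a change in pairwise
# spike-count correlations at fixed rates. Gaussian approximation to
# spike counts; all parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_trials = 50, 2000
base_rate = 10.0  # mean spike count per trial

def population_counts(gain=1.0, noise_corr=0.0):
    """Draw correlated spike counts for a population (Gaussian approx.)."""
    cov = np.full((n_neurons, n_neurons), noise_corr)
    np.fill_diagonal(cov, 1.0)
    z = rng.multivariate_normal(np.zeros(n_neurons), cov, size=n_trials)
    return base_rate * gain + np.sqrt(base_rate * gain) * z

def mean_pairwise_corr(x):
    """Average correlation across all neuron pairs (upper triangle)."""
    c = np.corrcoef(x.T)
    return c[np.triu_indices_from(c, k=1)].mean()

# Gain code: attended stimuli evoke higher mean rates.
attended = population_counts(gain=1.3)
unattended = population_counts(gain=1.0)
print(f"gain code, mean rate: attended {attended.mean():.1f} "
      f"vs unattended {unattended.mean():.1f}")

# Correlation code: mean rates match, but attention lowers shared noise.
attended_c = population_counts(noise_corr=0.05)
unattended_c = population_counts(noise_corr=0.20)
print(f"correlation code, noise corr: attended {mean_pairwise_corr(attended_c):.2f} "
      f"vs unattended {mean_pairwise_corr(unattended_c):.2f}")
```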


2016
Author(s): Jonathan T.W. Schubert, Stephanie Badde, Brigitte Röder, Tobias Heed

Task demands modulate tactile localization in sighted humans, presumably through weight adjustments in the spatial integration of anatomical (skin-based) and external (posture-based) information. In contrast, previous studies have suggested that congenitally blind humans, by default, refrain from automatic spatial integration and localize touch using only skin-based information. Here, sighted and congenitally blind participants localized tactile targets on the palm or back of one hand while ignoring simultaneous tactile distractors at congruent or incongruent locations on the other hand. We probed the interplay of anatomical and external location codes for spatial congruency effects by varying hand posture: the palms either both faced down, or one faced down and one up. In the latter posture, externally congruent target and distractor locations were anatomically incongruent, and vice versa. Target locations had to be reported either anatomically ("palm" or "back" of the hand) or externally ("up" or "down" in space). Under anatomical instructions, performance was better for anatomically congruent than incongruent target-distractor pairs. In contrast, under external instructions, performance was better for externally congruent than incongruent pairs. These modulations were evident in both sighted and blind individuals. Notably, distractor effects were overall far smaller in blind than in sighted participants, despite comparable target-distractor identification performance. Thus, the absence of developmental vision seems to be associated with an increased ability to focus tactile attention on a non-spatially defined target. Nevertheless, the fact that blind individuals exhibited effects of hand posture and task instructions in their congruency effects suggests that, like the sighted, they automatically integrate anatomical and external information during tactile localization. Spatial integration in tactile processing is thus flexibly adapted by top-down information (here, task instruction) even in the absence of developmental vision.
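The posture manipulation at the heart of this design can be made concrete with a small sketch: mapping each skin site plus hand posture to an external ("up"/"down") location shows how an anatomically congruent target-distractor pair becomes externally incongruent once one hand is flipped palm-up. The encoding below is an illustration of the logic, not the authors' analysis code.

```python
# Minimal sketch of the congruency manipulation: with one hand flipped
# palm-up, anatomically congruent skin sites (e.g., palm-palm) become
# externally incongruent, and vice versa. Encoding is illustrative.
def external_location(skin_site: str, palm_down: bool) -> str:
    """Map a skin site ('palm' or 'back') plus hand posture to 'up'/'down'."""
    if palm_down:
        return "down" if skin_site == "palm" else "up"
    return "up" if skin_site == "palm" else "down"

# Target hand stays palm-down; distractor hand posture varies.
target_site, distractor_site = "palm", "palm"   # anatomically congruent pair
for distractor_palm_down in (True, False):
    anat = target_site == distractor_site
    ext = (external_location(target_site, palm_down=True)
           == external_location(distractor_site, distractor_palm_down))
    posture = "both palms down" if distractor_palm_down else "one hand flipped"
    print(f"{posture}: anatomically congruent={anat}, externally congruent={ext}")
```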

