Cross-Modal Interaction between Vision and Touch: The Role of Synesthetic Correspondence

Perception ◽  
10.1068/p2984 ◽  
2000 ◽  
Vol 29 (6) ◽  
pp. 745-754 ◽  
Author(s):  
Gail Martino ◽  
Lawrence E Marks

At each moment, we experience a melange of information arriving at several senses, and often we focus on inputs from one modality and ‘reject’ inputs from another. Does input from a rejected sensory modality modulate one's ability to make decisions about information from a selected one? When the modalities are vision and hearing, the answer is “yes”, suggesting that vision and hearing interact. In the present study, we asked whether similar interactions characterize vision and touch. As with vision and hearing, results obtained in a selective attention task show cross-modal interactions between vision and touch that depend on the synesthetic relationship between the stimulus combinations. These results imply that similar mechanisms may govern cross-modal interactions across sensory modalities.

PARADIGMI ◽  
2009 ◽  
pp. 147-162
Author(s):  
Davide Monopoli ◽  
Cristina Cacciari

The Role of Literal and Figurative Language

Olfaction is still the least investigated of the sensory modalities. This also reflects the fact that olfaction is the most subjective and emotional sensory modality, and the one with the fewest ties to verbal language. Since metaphors are cognitive bridges between perception and language, in principle they might be more effective in giving voice to olfaction, the "speechless sense". However, research in this fascinating field is still in its infancy, and the linguistic and psychological results are still scarce and contradictory.

Key Words: Cross-modal interactions, Language, Metaphor, Olfaction, Perception, Synaesthetic metaphors.


2020 ◽  
Author(s):  
Anna-Katharina R. Bauer ◽  
Freek van Ede ◽  
Andrew J. Quinn ◽  
Anna C. Nobre

Abstract
At any given moment our sensory systems receive multiple, often rhythmic, inputs from the environment. Processing of temporally structured events in one sensory modality can guide both behavioural and neural processing of events in other sensory modalities, but how this occurs remains unclear. Here, we used human electroencephalography (EEG) to test the cross-modal influences of a continuous auditory frequency-modulated (FM) sound on visual perception and visual cortical activity. We report systematic fluctuations in perceptual discrimination of brief visual stimuli in line with the phase of the FM sound. We further show that this rhythmic modulation in visual perception is related to an accompanying rhythmic modulation of neural activity recorded over visual areas. Importantly, in our task, perceptual and neural visual modulations occurred without any abrupt and salient onsets in the energy of the auditory stimulation and without any rhythmic structure in the visual stimulus. As such, the results provide a critical validation for the existence and functional role of cross-modal entrainment and demonstrate its utility for organising the perception of multisensory stimulation in the natural environment.

Highlights
- Cross-modal influences are mediated by the synchronisation of neural oscillations
- Visual performance fluctuates in line with the phase of a frequency-modulated sound
- Cross-modal entrainment of neural activity predicts fluctuations in visual performance
- Cross-modal entrainment organises the perception of multisensory stimuli


2019 ◽  
Author(s):  
Bar Lambez ◽  
Galit Agmon ◽  
Paz Har-Shai ◽  
Yuri Rassovsky ◽  
Elana Zion Golumbic

Abstract
Managing attention in multi-speaker environments is a challenging feat that is critical for human performance. However, why some people are better than others at allocating attention appropriately remains largely unknown. Here we investigated the contribution of two factors – Cognitive Capacity and Acquired Experience – to performance on two different types of attention task: Selective Attention to one speaker and Distributed Attention among multiple concurrent speakers. We compared performance across three groups: individuals with low (n=20) and high cognitive capacity (n=26), and aircraft pilots (n=25), who have gained extensive experience in both selective and distributed attention to speech through their training and profession. Results indicate that both types of attention benefit from higher cognitive capacity, suggesting reliance on common capacity-limited resources. However, only Selective Attention was further improved in the pilots, pointing to its flexible and trainable nature, whereas Distributed Attention seems to suffer from more fixed and hard-wired processing bottlenecks.


Author(s):  
Alice Bollini ◽  
Davide Esposito ◽  
Claudio Campus ◽  
Monica Gori

Abstract
The human brain creates a representation of the external world based on magnitude judgments, estimating distance, numerosity, or size. Magnitude and spatial representation are hypothesized to rely on common mechanisms shared by different sensory modalities. We explored the relationship between magnitude and spatial representation using two different sensory systems. We hypothesized that space and magnitude are combined differently depending on the sensory modality. Furthermore, we aimed to understand the role of the spatial reference frame in magnitude representation. We used stimulus–response compatibility (SRC) to investigate these processes, assuming that performance improves when stimulus and response share common features. We designed auditory and tactile SRC tasks with conflicting spatial and magnitude mappings. Our results showed that sensory modality modulates the relationship between space and magnitude. A larger effect of magnitude over spatial congruency occurred in the tactile task. However, magnitude and space carried similar weight in the auditory task, with neither spatial congruency nor magnitude congruency having a significant effect. Moreover, we observed that the spatial frame activated during the tasks was elicited by the sensory inputs. Participants' performance reversed between the uncrossed and crossed hands postures in the tactile task, suggesting an internal coordinate system. In contrast, crossing the hands did not alter performance in the auditory task (i.e., an allocentric frame of reference was used). Overall, these results suggest that the interaction between space and magnitude differs in the auditory and tactile modalities, supporting the idea that these sensory modalities use different magnitude and spatial representation mechanisms.


2017 ◽  
Author(s):  
Silvia Convento ◽  
Md. Shoaibur Rahman ◽  
Jeffrey M. Yau

Summary
Cortical sensory systems often activate in parallel, even when stimulation is experienced through a single sensory modality [1–3]. Critically, the functional relationship between co-activated cortical systems is unclear: co-activations may reflect the interactive coupling between information-linked cortical systems or merely parallel but independent sensory processing. Here, we report causal evidence consistent with the hypothesis that human somatosensory cortex (S1), which co-activates with auditory cortex during the processing of vibrations and textures [4–9], interactively couples to cortical systems that support auditory perception. In a series of behavioural experiments, we used transcranial magnetic stimulation (TMS) to probe interactions between the somatosensory and auditory perceptual systems as we manipulated attention state. Acute manipulation of S1 activity using TMS impairs auditory frequency perception when subjects simultaneously attend to auditory and tactile frequency, but not when attention is directed to audition alone. Auditory frequency perception is unaffected by TMS over visual cortex, thus confirming the privileged interactions between the somatosensory and auditory systems in temporal frequency processing [10–13]. Our results provide a key demonstration that selective attention can modulate the functional properties of cortical systems thought to support specific sensory modalities. The gating of crossmodal coupling by selective attention may critically support multisensory interactions and feature-specific perception.


2018 ◽  
Author(s):  
Nicola Jane Holt ◽  
Leah Furbert ◽  
Emily Sweetingham

The current research sought to replicate and extend work suggesting that coloring can reduce anxiety, asking whether coloring can also improve cognitive performance. In two experiments, undergraduates (N = 47; N = 52) both colored and completed a control condition. Subjective and performance measures of mood and mindfulness were included: an implicit mood test (Experiment 1) and a selective attention task (Experiment 2), along with a divergent thinking test. In both experiments, coloring significantly reduced anxiety and increased mindfulness compared with control and baseline scores. Following coloring, participants scored significantly lower on implicit fear than in the control condition, and significantly higher on selective attention and original ideation. Coloring may not only reduce anxiety, but also improve mindful attention and creative cognition.


1998 ◽  
Vol 30 (1-2) ◽  
pp. 191-192
Author(s):  
S. Hayashida ◽  
S.-I. Niwa ◽  
K. Kobayashi ◽  
K. Itoh
