Why There Is a Vestibular Sense, or How Metacognition Individuates the Senses

2020 ◽  
pp. 1-20
Author(s):  
Isabelle Garzorz ◽  
Ophelia Deroy

Abstract Should the vestibular system be counted as a sense? This basic conceptual question remains surprisingly controversial. While it is possible to distinguish specific vestibular organs, it is not clear that this suffices to identify a genuine vestibular sense, because of the supposed absence of a distinctive vestibular personal-level manifestation. The vestibular organs instead contribute to more general multisensory representations, whose names nonetheless suggest a distinct ‘sensory’ contribution. The vestibular case is a good example of the challenge of individuating the senses when multisensory interactions are the norm, neurally, representationally and phenomenally. Here, we propose that an additional metacognitive criterion can be used to single out a distinct sense, besides the existence of specific organs and despite the fact that the information coming from these organs is integrated with other sensory information. We argue that it is possible for human perceivers to monitor information coming from distinct organs, despite their integration, as exhibited and measured through metacognitive performance. Based on the vestibular case, we suggest that metacognitive awareness of the information coming from sensory organs constitutes a new criterion for individuating a sense through both physiological and personal criteria. This new way of individuating the senses accommodates both the specialised nature of sensory receptors and the intricate multisensory character of neural processes and experience, while maintaining the idea that each sense contributes something special to how we monitor the world and ourselves, at the subjective level.
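The metacognitive performance the abstract invokes is commonly quantified from per-trial accuracy and confidence ratings. Below is a minimal illustrative sketch (not from the paper; the trial data are invented) of one standard measure, the type-2 ROC area: the probability that a randomly chosen correct trial received higher confidence than a randomly chosen incorrect one, with 0.5 indicating no metacognitive sensitivity.

```python
def type2_auroc(correct, confidence):
    """Rank-based estimate of the type-2 ROC area (0.5 = chance)."""
    hits = [c for ok, c in zip(correct, confidence) if ok]
    misses = [c for ok, c in zip(correct, confidence) if not ok]
    if not hits or not misses:
        raise ValueError("need both correct and incorrect trials")
    # Count pairs where a correct trial outranks an incorrect one (ties = 0.5).
    wins = sum((h > m) + 0.5 * (h == m) for h in hits for m in misses)
    return wins / (len(hits) * len(misses))

# Invented example: confidence tracks accuracy fairly well.
correct    = [True, True, False, True, False, True, False, True]
confidence = [4,    3,    1,     4,    2,     3,    1,     2]
print(round(type2_auroc(correct, confidence), 3))  # → 0.967
```

In a multisensory setting, computing this measure separately for judgements attributed to each organ's signal is one way to test whether perceivers can monitor the vestibular contribution despite integration.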

Author(s):  
Bruno and

Within the traditional notion of the senses, the perception of time is especially puzzling. There is no specific physical energy carrying information about time, and hence no sensory receptors can transduce a ‘temporal stimulus.’ Time-related properties of events can instead be shown to emerge from specific perceptual processes involving multisensory interactions. In this chapter, we will examine five such properties: the awareness that two events occur at the same time (simultaneity) or one after the other (succession); the coherent time-stamping of events despite inaccuracies and imprecisions in coding simultaneity and succession (temporal coherence); the awareness of the temporal extent occupied by events (duration); the organization of events in regular temporal units (rhythm).


2001 ◽  
Vol 24 (2) ◽  
pp. 232-233
Author(s):  
Kelvin S. Oie ◽  
John J. Jeka

The propositions that the senses are separate and that the global array may be sufficient for adequate perception are questioned. There is evidence that certain tasks may be primarily “input-driven,” but these are a special case along the behavioral continuum. Many tasks involve sensory information that is ambiguous, and other sources of information may be required for adequate perception.


Author(s):  
Lotfi Merabet ◽  
Alvaro Pascual-Leone

In the brain, information from all the senses interacts and is integrated to create a unified sensory percept. Some percepts appear unimodal, and some crossmodal. Unimodal percepts can be modified by crossmodal interactions, given that our brains process multiple streams of sensory information in parallel and promote extensive interactions between them. TMS can provide valuable insights into the neural substrates associated with multisensory processing in humans. TMS is commonly described as a ‘relatively painless’ method of stimulating the brain noninvasively. However, TMS itself is a strongly multisensory stimulus, and this should be considered when interpreting results. With regard to the crossmodal sensory changes that follow sensory deprivation, these changes can be revealed using a variety of methods, including the combination of TMS with neuroimaging.


2020 ◽  
Author(s):  
Graham Martin

Graham Martin takes the reader deep into the world of birds from a new perspective, with a ‘through birds’ eyes’ approach to ornithology that goes beyond the traditional habitat or ecological point of view. There is a lot more to a bird’s world than what it receives through its eyes. This book shows how all of the senses complement one another to provide each species with a unique suite of information that guides their daily activities. The senses of each bird have been fine-tuned by natural selection to meet the challenges of its environment and optimise its behaviour: from spotting a carcase on a hillside to pecking at minute insects, from catching fish in murky waters to navigating around the globe. The reader is also introduced to the challenges posed to birds by the obstacles with which humans have cluttered their worlds, from power lines to windowpanes. All of these challenges need explaining from the birds’ sensory perspectives so that effective mitigations can be put in place. The book leads the reader through a wealth of diverse information presented in accessible text, with over 100 colour illustrations and photographs. The result is a highly readable and authoritative account, which will appeal to birdwatchers and other naturalists, as well as researchers in avian biology. The author has researched the senses of birds throughout a 50-year career in ornithology and sensory science. He has always attempted to understand birds from the perspective of how sensory information helps them to carry out different tasks in different environments. He has published papers on more than 60 bird species, from albatrosses and penguins to spoonbills and kiwi. His first fascination was with owls and night-time, and owls have remained special to him throughout his career.
He has collaborated and travelled widely, and pondered the diverse sensory challenges that birds face in the conduct of different tasks in different habitats, from mudflats and murky waters to forests, deserts and caves. In recent years he has focused on how understanding bird senses can help to reduce the very high levels of bird deaths caused by human artefacts, particularly wind turbines, power lines, and gill nets.


2007 ◽  
Vol 274 (1613) ◽  
pp. 1035-1041 ◽  
Author(s):  
Ben Mitchinson ◽  
Chris J Martin ◽  
Robyn A Grant ◽  
Tony J Prescott

Rats sweep their facial whiskers back and forth to generate tactile sensory information through contact with environmental structure. The neural processes operating on the signals arising from these whisker contacts are widely studied as a model of sensing in general, even though detailed knowledge of the natural circumstances under which such signals are generated is lacking. We used digital video tracking and wireless recording of mystacial electromyogram signals to assess the effects of whisker–object contact on whisking in freely moving animals exploring simple environments. Our results show that contact leads to reduced protraction (forward whisker motion) on the side of the animal ipsilateral to an obstruction and increased protraction on the contralateral side. Reduced ipsilateral protraction occurs rapidly and in the same whisk cycle as the initial contact. We conclude that whisker movements are actively controlled so as to increase the likelihood of environmental contacts while constraining such interactions to involve a gentle touch. That whisking pattern generation is under strong feedback control has important implications for understanding the nature of the signals reaching upstream neural processes.


2004 ◽  
Vol 14 (02) ◽  
pp. 515-530 ◽  
Author(s):  
WALTER J. FREEMAN

Semantics is the essence of human communication. It concerns the manufacture and use of symbols as representations to exchange meanings. Information technology is faced with the problem of using intelligent machines as intermediaries for interpersonal communication. The problem of designing such semantic machines has been intractable because brains and machines work on very different principles. Machines process information that is fed to them. Brains construct hypotheses and test them by acting and sensing. Brains do not process information because the intake through the senses is infinite. Brains sample information, hold it briefly, construct meaning, and then discard the information. A solution to the problem of communication with machines is to simulate how brains create meaning and express it as information by making a symbol to represent the meaning to another brain in pairwise communication. An understanding of the neurodynamics by which brains create meaning and represent it may enable engineers to build devices with which they can communicate pairwise, as they do now with colleagues.


2019 ◽  
Author(s):  
Karen Emmorey

We investigated linguistic codability for sensory information (colour, taste, shape, touch, smell, and sound) and the use of iconic labels in American Sign Language (ASL) by deaf native signers. Colour was highly codable in ASL, but few iconic labels were produced. Shape labels were highly iconic (lexical signs and classifier constructions), and touch descriptions relied on iconic classifier constructions that depicted the shape of the tactile source object. Lexical taste-specific signs also exhibited iconic properties (articulated near the mouth), but taste codability was relatively low. No smell-specific lexical signs were elicited (all descriptions were source-based). Sound stimuli, presented as tactile vibrations, were often described using classifier constructions that visually depicted different sound qualities. Results indicated that the iconicity of linguistic forms was not constant across the senses; rather, iconicity was observed most frequently for shape, touch, and sound stimuli, and least frequently for colour and smell.


2021 ◽  
Vol 15 ◽  
Author(s):  
Ute Korn ◽  
Marina Krylova ◽  
Kilian L. Heck ◽  
Florian B. Häußinger ◽  
Robert S. Stark ◽  
...  

Processing of sensory information is embedded in ongoing neural processes that contribute to brain states. Electroencephalographic microstates are short-lived, semi-stable scalp power distributions that have been associated with subsystem activity such as auditory, visual and attention networks. Here we explore changes in electrical brain states in response to an audiovisual perception and memorization task under conditions of auditory distraction. We discovered changes in brain microstates reflecting a weakening of states representing activity of the auditory system and a strengthening of salience networks, supporting the idea that salience networks are active after audiovisual encoding and during memorization to protect memories and to concentrate on the upcoming behavioural response.
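Microstate analyses of this kind typically cluster the momentary scalp topographies into a few template maps, ignoring field polarity. The sketch below is our simplified illustration on synthetic data (not the authors' pipeline): a polarity-invariant k-means in which each map is assigned to the template with the highest squared projection, and templates are updated as the first principal component of their cluster.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 200 time points x 8 channels built from two template maps.
templates = rng.normal(size=(2, 8))
states = rng.integers(0, 2, size=200)
eeg = templates[states] + 0.1 * rng.normal(size=(200, 8))

def microstate_kmeans(X, k, n_iter=20, seed=0):
    """Polarity-invariant k-means clustering of EEG topographies."""
    rng = np.random.default_rng(seed)
    X = X - X.mean(axis=1, keepdims=True)            # average-reference each map
    T = X[rng.choice(len(X), size=k, replace=False)] # random maps as initial templates
    T = T / np.linalg.norm(T, axis=1, keepdims=True)
    for _ in range(n_iter):
        corr = (X @ T.T) ** 2                        # squared projection: polarity-free
        labels = corr.argmax(axis=1)
        for j in range(k):
            Xj = X[labels == j]
            if len(Xj):
                # Update template as the cluster's first principal component.
                _, _, vt = np.linalg.svd(Xj, full_matrices=False)
                T[j] = vt[0]
    return labels, T

labels, maps = microstate_kmeans(eeg, k=2)
print(np.bincount(labels, minlength=2))              # time points per microstate
```

Real pipelines usually cluster only at global field power peaks and then back-fit the templates to the continuous record; this sketch keeps just the core clustering step.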


Author(s):  
Martin V. Butz ◽  
Esther F. Kutter

This chapter shows that multiple sensory information sources can generally be integrated in a similar fashion. However, seeing that different modalities are grounded in different frames of reference, integrations will focus on space or on identities. Body-relative spaces integrate information about the body and the surrounding space in body-relative frames of reference, integrating the available information across modalities in an approximately optimal manner. Simple topological neural population encodings are well-suited to generate estimates about stimulus locations and to map several frames of reference onto each other. Self-organizing neural networks are introduced as the basic computational mechanism that enables the learning of such mappings. Multisensory object recognition, on the other hand, is realized most effectively in an object-specific frame of reference – essentially abstracting away from body-relative frames of reference. Cognitive maps, that is, maps of the environment, are learned by connecting locations over space and time. The hippocampus strongly supports the learning of cognitive maps, as it supports the generation of new episodic memories, suggesting a strong relation between these two computational tasks. In conclusion, multisensory integration yields internal predictive structures about spaces and object identities, which are well-suited to plan, decide on, and control environmental interactions.
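The self-organizing networks mentioned above can be illustrated with a toy example (ours, not the chapter's model): a one-dimensional self-organizing map whose units learn a topological encoding of stimulus positions on a line, the kind of population code invoked for mapping frames of reference onto each other.

```python
import numpy as np

rng = np.random.default_rng(1)
n_units, n_steps = 10, 2000
weights = rng.uniform(0.0, 1.0, size=n_units)   # each unit's preferred position

for t in range(n_steps):
    x = rng.uniform(0.0, 1.0)                   # random stimulus position
    winner = np.argmin(np.abs(weights - x))     # best-matching unit
    lr = 0.5 * (1 - t / n_steps)                # decaying learning rate
    sigma = max(2.0 * (1 - t / n_steps), 0.5)   # shrinking neighbourhood width
    idx = np.arange(n_units)
    h = np.exp(-((idx - winner) ** 2) / (2 * sigma ** 2))
    weights += lr * h * (x - weights)           # pull winner and neighbours toward x

# After training, neighbouring units tend to prefer neighbouring positions,
# so the map forms a topology-preserving encoding of the interval.
print(np.round(weights, 2))
```

The same mechanism, with two input coordinates instead of one, can learn a mapping between two frames of reference, e.g. associating retinal with body-relative positions.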


2020 ◽  
Vol 8 (3-4) ◽  
pp. 279-298
Author(s):  
Jason Clarke ◽  
Michaela Porubanova

For a given physical duration, certain events can be experienced as subjectively longer than others, an illusion referred to as subjective time dilation. Many factors have been shown to produce this illusion in perceived duration, including low-level visual properties of the stimulus (e.g., increases in motion, brightness, and flicker), an unexpected stimulus in a sequence of events, as well as affective factors. Here we report the results of two experiments in which we tested for the influence of scene and object knowledge on subjective time dilation. Based on the results of earlier studies, we predicted that visual scenes and objects containing semantic violations would be judged as lasting longer than control stimuli. The findings from both experiments indicate that stimuli containing semantic violations were judged to be present for longer than stimuli without semantic violations. We interpret our results to mean that the neural processes encoding duration can operate on conceptually integrated scene and object representations. We further conjecture that the underlying neural code for duration perception is likely correlated with the amplitude of neural activity expended on the processing of incoming sensory information.

