Academic literature on the topic 'Crossmodal Modulation'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Crossmodal Modulation.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Crossmodal Modulation"

1

Convento, Silvia, Chiara Galantini, Nadia Bolognini, and Giuseppe Vallar. "Neuromodulation of crossmodal influences on visual cortex excitability." Seeing and Perceiving 25 (2012): 149. http://dx.doi.org/10.1163/187847612x647810.

Full text
Abstract:
Crossmodal interactions occur not only within brain regions deemed to be heteromodal, but also within primary sensory areas, traditionally considered as modality-specific. So far, mechanisms of crossmodal interactions in primary visual areas remain largely unknown. In the present study, we explored the effect of crossmodal stimuli on phosphene perception, induced by single-pulse transcranial magnetic stimulation (sTMS) delivered to the occipital visual cortex. In three experiments, we showed that redundant auditory and/or tactile information facilitated the detection of phosphenes induced by occipital sTMS, applied at sub-threshold intensity, which also increased their level of brightness, with the maximal enhancement occurring for trimodal stimulus combinations. Such crossmodal enhancement can be further boosted by the brain polarization of heteromodal areas mediating crossmodal links in spatial attention. Specifically, anodal transcranial direct current stimulation (tDCS) of both the occipital and the parietal cortices facilitated phosphene detection under unimodal conditions, whereas anodal tDCS of the parietal and temporal cortices enhanced phosphene detection selectively under crossmodal conditions, when auditory or tactile stimuli were combined with occipital sTMS. Overall, crossmodal interactions can enhance neural excitability within low-level visual areas, and tDCS can be used for boosting such crossmodal influences on visual responses, likely affecting mechanisms of crossmodal spatial attention involving feedback modulation from heteromodal areas on sensory-specific cortices. TDCS can effectively facilitate the integration of multisensory signals originating from the external world, hence improving visual perception.
APA, Harvard, Vancouver, ISO, and other styles
2

Gobara, Akihiko, Yuki Yamada, and Kayo Miura. "Crossmodal Modulation of Spatial Localization by Mimetic Words." i-Perception 7, no. 6 (December 2016): 204166951668424. http://dx.doi.org/10.1177/2041669516684244.

3

Macaluso, E. "Modulation of Human Visual Cortex by Crossmodal Spatial Attention." Science 289, no. 5482 (August 18, 2000): 1206–8. http://dx.doi.org/10.1126/science.289.5482.1206.

4

Zhang, Ming, Xiaogang Wu, and Aijun Wang. "Crossmodal Nonspatial Repetition Inhibition Due to Modality Shift." Perception 50, no. 2 (January 18, 2021): 116–28. http://dx.doi.org/10.1177/0301006620988209.

Abstract:
Previous studies have found that processing of a second stimulus is slower when the modality of the first stimulus differs, which is termed the modality shift effect. Moreover, people tend to respond more slowly to the second stimulus when the two stimuli are similar in the semantic dimension, which is termed the nonspatial repetition inhibition effect. This study aimed to explore the modality shift effect on nonspatial repetition inhibition and whether such modulation was influenced by different temporal intervals. A cue–target paradigm was adopted in which modality priming and identity priming were manipulated at three interstimulus intervals. The results showed that response times under the modality shift condition were slower than those under the modality repeat condition. In trials with modality shift, responses to congruent cues and targets were slower than to incongruent cue–target combinations, indicating crossmodal nonspatial repetition inhibition. The crossmodal nonspatial repetition inhibition effect decreased with increasing interstimulus interval. These results provide evidence that the additional intervening event proposed in previous studies is not necessary for the occurrence of crossmodal nonspatial repetition inhibition.
5

Tan, Joo Huang, and Po-Jang Hsieh. "Context dependent crossmodal associations between visual spatial frequencies and auditory amplitude modulation rates." Journal of Vision 17, no. 10 (August 31, 2017): 193. http://dx.doi.org/10.1167/17.10.193.

6

Haegens, Saskia, José Vergara, Román Rossi-Pool, Luis Lemus, and Ranulfo Romo. "Beta oscillations reflect supramodal information during perceptual judgment." Proceedings of the National Academy of Sciences 114, no. 52 (December 11, 2017): 13810–15. http://dx.doi.org/10.1073/pnas.1714633115.

Abstract:
Previous work on perceptual decision making in the sensorimotor system has shown population dynamics in the beta band, corresponding to the encoding of stimulus properties and the final decision outcome. Here, we asked how oscillatory dynamics in the medial premotor cortex (MPC) contribute to supramodal perceptual decision making. We recorded local field potentials (LFPs) and spikes in two monkeys trained to perform a tactile–acoustic frequency discrimination task, including both unimodal and crossmodal conditions. We studied the role of oscillatory activity as a function of stimulus properties (frequency and sensory modality), as well as decision outcome. We found that beta-band power correlated with relevant stimulus properties: there was a significant modulation by stimulus frequency during the working-memory (WM) retention interval, as well as modulation by stimulus modality—the latter was observed only in the case of a purely unimodal task, where modality information was relevant to prepare for the upcoming second stimulus. Furthermore, we found a significant modulation of beta power during the comparison and decision period, which was predictive of decision outcome. Finally, beta-band spike–field coherence (SFC) matched these LFP observations. In conclusion, we demonstrate that beta power in MPC is reflective of stimulus features in a supramodal, context-dependent manner, and additionally reflects the decision outcome. We propose that these beta modulations are a signature of the recruitment of functional neuronal ensembles, which encode task-relevant information.
7

Kennett, Steffan, Chris Rorden, Masud Husain, and Jon Driver. "Crossmodal visual-tactile extinction: Modulation by posture implicates biased competition in proprioceptively reconstructed space." Journal of Neuropsychology 4, no. 1 (March 2010): 15–32. http://dx.doi.org/10.1348/174866409x415942.

8

Lyons, Georgina, Daniel Sanabria, Argiro Vatakis, and Charles Spence. "The modulation of crossmodal integration by unimodal perceptual grouping: a visuotactile apparent motion study." Experimental Brain Research 174, no. 3 (May 23, 2006): 510–16. http://dx.doi.org/10.1007/s00221-006-0485-8.

9

Pedersen, Nicolai F., Torsten Dau, Lars Kai Hansen, and Jens Hjortkjær. "Modulation transfer functions for audiovisual speech." PLOS Computational Biology 18, no. 7 (July 19, 2022): e1010273. http://dx.doi.org/10.1371/journal.pcbi.1010273.

Abstract:
Temporal synchrony between facial motion and acoustic modulations is a hallmark feature of audiovisual speech. The moving face and mouth during natural speech is known to be correlated with low-frequency acoustic envelope fluctuations (below 10 Hz), but the precise rates at which envelope information is synchronized with motion in different parts of the face are less clear. Here, we used regularized canonical correlation analysis (rCCA) to learn speech envelope filters whose outputs correlate with motion in different parts of the speaker's face. We leveraged recent advances in video-based 3D facial landmark estimation allowing us to examine statistical envelope-face correlations across a large number of speakers (∼4000). Specifically, rCCA was used to learn modulation transfer functions (MTFs) for the speech envelope that significantly predict correlation with facial motion across different speakers. The AV analysis revealed bandpass speech envelope filters at distinct temporal scales. A first set of MTFs showed peaks around 3-4 Hz and were correlated with mouth movements. A second set of MTFs captured envelope fluctuations in the 1-2 Hz range correlated with more global face and head motion. These two distinctive timescales emerged only as a property of natural AV speech statistics across many speakers. A similar analysis of fewer speakers performing a controlled speech task highlighted only the well-known temporal modulations around 4 Hz correlated with orofacial motion. The different bandpass ranges of AV correlation align notably with the average rates at which syllables (3-4 Hz) and phrases (1-2 Hz) are produced in natural speech. Whereas periodicities at the syllable rate are evident in the envelope spectrum of the speech signal itself, slower 1-2 Hz regularities thus only become prominent when considering crossmodal signal statistics. This may indicate a motor origin of temporal regularities at the timescales of syllables and phrases in natural speech.
10

Kreutzfeldt, Magali, Denise N. Stephan, Klaus Willmes, and Iring Koch. "Shifts in target modality cause attentional reset: Evidence from sequential modulation of crossmodal congruency effects." Psychonomic Bulletin & Review 23, no. 5 (January 26, 2016): 1466–73. http://dx.doi.org/10.3758/s13423-016-1001-1.


Dissertations / Theses on the topic "Crossmodal Modulation"

1

Jia, Lina. "Crossmodal emotional modulation of time perception." Diss., Ludwig-Maximilians-Universität München, 2013. http://nbn-resolving.de/urn:nbn:de:bvb:19-165138.

Abstract:
This thesis, which consists of three studies, investigated how visual affective stimuli or action contexts influence crossmodal time processing, with a particular focus on the role of the crossmodal/sensorimotor linkage in time perception. By using different types of emotional stimuli (e.g., threat, disgust, and neutral pictures) and manipulating the possibility of near-body interactions, the three studies dissociated the impact of embodied action from that of the emotional dimensions (arousal and valence) on crossmodal emotional modulation in time perception. The thesis thus offers the first behavioral evidence that embodied action is an important factor that expands subjective tactile duration and facilitates tactile selection (modality-specific temporal processing) in emotion and action contexts. Moreover, the subjective expansion of duration by threat and action contexts may reflect the evolutionary coupling of our perceptual and motor systems to adapt to specific environments for survival and success.
2

Jia, Lina [Verfasser], and Hermann J. [Akademischer Betreuer] Müller. "Crossmodal emotional modulation of time perception / Lina Jia. Betreuer: Hermann J. Müller." München : Universitätsbibliothek der Ludwig-Maximilians-Universität, 2013. http://d-nb.info/1047062399/34.

3

Di Filippo, Alessandro. "Perceptual Strategies and Neuronal Underpinnings underlying Pattern Recognition through Visual and Tactile Sensory Modalities in Rats." Doctoral thesis, SISSA, 2015. http://hdl.handle.net/20.500.11767/3914.

Abstract:
The aim of my PhD project was to investigate multisensory perception and multimodal recognition abilities in the rat, to better understand the underlying perceptual strategies and neuronal mechanisms. I chose to carry out this project on the laboratory rat for two reasons. First, the rat is a flexible and highly accessible experimental model, in which it is possible to combine state-of-the-art neurophysiological approaches (such as multi-electrode neuronal recordings) with behavioral investigation of perception and, more generally, cognition. Second, extensive research concerning multimodal integration has already been conducted in this species, both at the neurophysiological and behavioral level. My thesis work was organized in two projects: a psychophysical assessment of object categorization abilities in rats, and a neurophysiological study of neuronal tuning in the primary visual cortex of anaesthetized rats. In both experiments, unisensory (visual and tactile) and multisensory (visuo-tactile) stimulation was used for training and testing, depending on the task. The first project required the development of a new experimental rig for the study of object categorization in rats, using solid objects, so as to assess their recognition abilities under different modalities: vision, touch, and both together. The second project involved an electrophysiological study of rat primary visual cortex during visual, tactile, and visuo-tactile stimulation, with the aim of understanding whether any interaction between these modalities exists in an area that is primarily devoted to one of them. The results of both studies are still preliminary, but they already offer some interesting insights into the defining features of these abilities.
4

Sanabria, Daniel. "Modulating crossmodal interactions: evidence from the perception of apparent motion." Thesis, University of Oxford, 2005. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.419541.

5

Qian, Cheng. "Crossmodal Modulation as a Basis for Visual Enhancement of Auditory Performance." Thesis, 2009. http://hdl.handle.net/1807/18831.

Abstract:
The human sensory system processes many modalities simultaneously. It was long believed that each modality is processed individually first, with their combination deferred to higher-level cortical areas. Recent neurophysiological investigations indicate interconnections between early visual and auditory cortices, areas putatively considered unimodal, but their function remains unclear. The present work explores how this cross-modality might contribute to a visual enhancement of auditory performance, using a combined theoretical and experimental approach. The enhancement of sensory performance was studied through a signal detection framework. A model was constructed using principles from signal detection theory and neurophysiology, demonstrating enhancements of roughly 1.8 dB both analytically and through simulation. Several experiments were conducted to observe the effects of visual cues on a 2-alternative forced-choice detection task of an auditory tone in noise. Results of the main experiment showed an enhancement of 1.6 dB. Greater enhancement also tended to occur for more realistic relationships between audio and visual stimuli.

Book chapters on the topic "Crossmodal Modulation"

1

Takahashi, Kohske, and Katsumi Watanabe. "Crossmodal Interactions in Visual Competition." In Advances in Bioinformatics and Biomedical Engineering, 64–72. IGI Global, 2013. http://dx.doi.org/10.4018/978-1-4666-2113-8.ch007.

Abstract:
Visual competition is one of the long-standing mysteries in vision science. The image that arises in a person's visual awareness of a constant visual input can spontaneously and stochastically change between two or more possible interpretations. Visual competition is largely defined by the actual visual experience. However, recent studies have suggested that the process of resolving visual ambiguity is not limited to the domain of vision. Rather, the process is likely susceptible to various types of nonvisual modulation (e.g., auditory and haptic/tactile). Here, the authors review recent studies that investigate the crossmodal interactions found in visual competition. These studies highlight significant crossmodal effects in visual competition, including the bias toward visual interpretations that are congruent with other modalities and the temporal synchronization of the transition between two (or more) visual interpretations with nonvisual events. These nonvisual modulations of visual competition reveal that visual perception is built upon several levels of crossmodal synchronization.

Conference papers on the topic "Crossmodal Modulation"

1

Barros, Pablo, German I. Parisi, Di Fu, Xun Liu, and Stefan Wermter. "Expectation Learning and Crossmodal Modulation with a Deep Adversarial Network." In 2018 International Joint Conference on Neural Networks (IJCNN). IEEE, 2018. http://dx.doi.org/10.1109/ijcnn.2018.8489303.
