Journal articles on the topic "Crossmodal Modulation"

To see the other types of publications on this topic, follow the link: Crossmodal Modulation.

Cite your source in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the top 21 journal articles for your research on the topic "Crossmodal Modulation".

Next to every source in the list of references you will find an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the research publication in .pdf format and read the abstract online, when such details are provided in the source's metadata.

Browse journal articles on a wide variety of disciplines and compile your bibliography correctly.

1

Convento, Silvia, Chiara Galantini, Nadia Bolognini, and Giuseppe Vallar. "Neuromodulation of crossmodal influences on visual cortex excitability." Seeing and Perceiving 25 (2012): 149. http://dx.doi.org/10.1163/187847612x647810.

Full text
Abstract:
Crossmodal interactions occur not only within brain regions deemed to be heteromodal, but also within primary sensory areas, traditionally considered as modality-specific. So far, mechanisms of crossmodal interactions in primary visual areas remain largely unknown. In the present study, we explored the effect of crossmodal stimuli on phosphene perception, induced by single-pulse transcranial magnetic stimulation (sTMS) delivered to the occipital visual cortex. In three experiments, we showed that redundant auditory and/or tactile information facilitated the detection of phosphenes induced by occipital sTMS, applied at sub-threshold intensity, which also increased their level of brightness, with the maximal enhancement occurring for trimodal stimulus combinations. Such crossmodal enhancement can be further boosted by the brain polarization of heteromodal areas mediating crossmodal links in spatial attention. Specifically, anodal transcranial direct current stimulation (tDCS) of both the occipital and the parietal cortices facilitated phosphene detection under unimodal conditions, whereas anodal tDCS of the parietal and temporal cortices enhanced phosphene detection selectively under crossmodal conditions, when auditory or tactile stimuli were combined with occipital sTMS. Overall, crossmodal interactions can enhance neural excitability within low-level visual areas, and tDCS can be used for boosting such crossmodal influences on visual responses, likely affecting mechanisms of crossmodal spatial attention involving feedback modulation from heteromodal areas on sensory-specific cortices. TDCS can effectively facilitate the integration of multisensory signals originating from the external world, hence improving visual perception.
Citation styles: APA, Harvard, Vancouver, ISO, etc.
2

Gobara, Akihiko, Yuki Yamada, and Kayo Miura. "Crossmodal Modulation of Spatial Localization by Mimetic Words." i-Perception 7, no. 6 (December 2016): 204166951668424. http://dx.doi.org/10.1177/2041669516684244.

Full text
3

Macaluso, E. "Modulation of Human Visual Cortex by Crossmodal Spatial Attention." Science 289, no. 5482 (August 18, 2000): 1206–8. http://dx.doi.org/10.1126/science.289.5482.1206.

Full text
4

Zhang, Ming, Xiaogang Wu, and Aijun Wang. "Crossmodal Nonspatial Repetition Inhibition Due to Modality Shift." Perception 50, no. 2 (January 18, 2021): 116–28. http://dx.doi.org/10.1177/0301006620988209.

Full text
Abstract:
Previous studies have found that processing of a second stimulus is slower when the modality of the first stimulus differs, which is termed the modality shift effect. Moreover, people tend to respond more slowly to the second stimulus when the two stimuli are similar in the semantic dimension, which is termed the nonspatial repetition inhibition effect. This study aimed to explore the modality shift effect on nonspatial repetition inhibition and whether such modulation was influenced by different temporal intervals. A cue–target paradigm was adopted in which modality priming and identity priming were manipulated at three interstimuli intervals. The results showed that the response times under the modality shift condition were slower than those under the modality repeat condition. In trials with modality shift, responses to congruent cues and targets were slower than to incongruent cue–target combinations, indicating crossmodal nonspatial repetition inhibition. The crossmodal nonspatial repetition inhibition effect decreased with increasing interstimuli interval. These results provide evidence that the additional intervening event proposed in previous studies is not necessary for the occurrence of crossmodal nonspatial repetition inhibition.
5

Tan, Joo Huang, and Po-Jang Hsieh. "Context dependent crossmodal associations between visual spatial frequencies and auditory amplitude modulation rates." Journal of Vision 17, no. 10 (August 31, 2017): 193. http://dx.doi.org/10.1167/17.10.193.

Full text
6

Haegens, Saskia, José Vergara, Román Rossi-Pool, Luis Lemus, and Ranulfo Romo. "Beta oscillations reflect supramodal information during perceptual judgment." Proceedings of the National Academy of Sciences 114, no. 52 (December 11, 2017): 13810–15. http://dx.doi.org/10.1073/pnas.1714633115.

Full text
Abstract:
Previous work on perceptual decision making in the sensorimotor system has shown population dynamics in the beta band, corresponding to the encoding of stimulus properties and the final decision outcome. Here, we asked how oscillatory dynamics in the medial premotor cortex (MPC) contribute to supramodal perceptual decision making. We recorded local field potentials (LFPs) and spikes in two monkeys trained to perform a tactile–acoustic frequency discrimination task, including both unimodal and crossmodal conditions. We studied the role of oscillatory activity as a function of stimulus properties (frequency and sensory modality), as well as decision outcome. We found that beta-band power correlated with relevant stimulus properties: there was a significant modulation by stimulus frequency during the working-memory (WM) retention interval, as well as modulation by stimulus modality—the latter was observed only in the case of a purely unimodal task, where modality information was relevant to prepare for the upcoming second stimulus. Furthermore, we found a significant modulation of beta power during the comparison and decision period, which was predictive of decision outcome. Finally, beta-band spike–field coherence (SFC) matched these LFP observations. In conclusion, we demonstrate that beta power in MPC is reflective of stimulus features in a supramodal, context-dependent manner, and additionally reflects the decision outcome. We propose that these beta modulations are a signature of the recruitment of functional neuronal ensembles, which encode task-relevant information.
7

Kennett, Steffan, Chris Rorden, Masud Husain, and Jon Driver. "Crossmodal visual-tactile extinction: Modulation by posture implicates biased competition in proprioceptively reconstructed space." Journal of Neuropsychology 4, no. 1 (March 2010): 15–32. http://dx.doi.org/10.1348/174866409x415942.

Full text
8

Lyons, Georgina, Daniel Sanabria, Argiro Vatakis, and Charles Spence. "The modulation of crossmodal integration by unimodal perceptual grouping: a visuotactile apparent motion study." Experimental Brain Research 174, no. 3 (May 23, 2006): 510–16. http://dx.doi.org/10.1007/s00221-006-0485-8.

Full text
9

Pedersen, Nicolai F., Torsten Dau, Lars Kai Hansen, and Jens Hjortkjær. "Modulation transfer functions for audiovisual speech." PLOS Computational Biology 18, no. 7 (July 19, 2022): e1010273. http://dx.doi.org/10.1371/journal.pcbi.1010273.

Full text
Abstract:
Temporal synchrony between facial motion and acoustic modulations is a hallmark feature of audiovisual speech. The moving face and mouth during natural speech is known to be correlated with low-frequency acoustic envelope fluctuations (below 10 Hz), but the precise rates at which envelope information is synchronized with motion in different parts of the face are less clear. Here, we used regularized canonical correlation analysis (rCCA) to learn speech envelope filters whose outputs correlate with motion in different parts of the speaker's face. We leveraged recent advances in video-based 3D facial landmark estimation allowing us to examine statistical envelope-face correlations across a large number of speakers (∼4000). Specifically, rCCA was used to learn modulation transfer functions (MTFs) for the speech envelope that significantly predict correlation with facial motion across different speakers. The AV analysis revealed bandpass speech envelope filters at distinct temporal scales. A first set of MTFs showed peaks around 3-4 Hz and were correlated with mouth movements. A second set of MTFs captured envelope fluctuations in the 1-2 Hz range correlated with more global face and head motion. These two distinctive timescales emerged only as a property of natural AV speech statistics across many speakers. A similar analysis of fewer speakers performing a controlled speech task highlighted only the well-known temporal modulations around 4 Hz correlated with orofacial motion. The different bandpass ranges of AV correlation align notably with the average rates at which syllables (3-4 Hz) and phrases (1-2 Hz) are produced in natural speech. Whereas periodicities at the syllable rate are evident in the envelope spectrum of the speech signal itself, slower 1-2 Hz regularities thus only become prominent when considering crossmodal signal statistics. This may indicate a motor origin of temporal regularities at the timescales of syllables and phrases in natural speech.
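The modulation transfer functions above are estimated with regularized CCA. A minimal self-contained sketch of that family of methods follows, using synthetic data and a ridge-style penalty; the variable names and regularization form are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def rcca_first_pair(X, Y, reg=0.1):
    """First canonical pair via ridge-regularized CCA.

    X: (n_samples, p) features, e.g. filterbank outputs of a speech envelope
    Y: (n_samples, q) features, e.g. facial-landmark motion signals
    reg: ridge penalty added to the auto-covariances for stability
    """
    X = X - X.mean(0)
    Y = Y - Y.mean(0)
    n = X.shape[0]
    Cxx = X.T @ X / n + reg * np.eye(X.shape[1])
    Cyy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
    Cxy = X.T @ Y / n
    # Whiten each side, then take leading singular vectors of the cross-covariance
    Wx = np.linalg.inv(np.linalg.cholesky(Cxx))
    Wy = np.linalg.inv(np.linalg.cholesky(Cyy))
    U, s, Vt = np.linalg.svd(Wx @ Cxy @ Wy.T)
    a = Wx.T @ U[:, 0]   # weights over X features (an "MTF" in the paper's terms)
    b = Wy.T @ Vt[0]     # weights over Y features
    return a, b, s[0]    # s[0] is the (regularized) first canonical correlation
```

On data sharing a common latent signal, the projections `X @ a` and `Y @ b` correlate at roughly the returned value `s`.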
10

Kreutzfeldt, Magali, Denise N. Stephan, Klaus Willmes, and Iring Koch. "Shifts in target modality cause attentional reset: Evidence from sequential modulation of crossmodal congruency effects." Psychonomic Bulletin & Review 23, no. 5 (January 26, 2016): 1466–73. http://dx.doi.org/10.3758/s13423-016-1001-1.

Full text
11

Skals, N. "Her odours make him deaf: crossmodal modulation of olfaction and hearing in a male moth." Journal of Experimental Biology 208, no. 4 (February 15, 2005): 595–601. http://dx.doi.org/10.1242/jeb.01400.

Full text
12

Macaluso, E., J. Driver, J. van Velzen, and M. Eimer. "Influence of gaze direction on crossmodal modulation of visual ERPS by endogenous tactile spatial attention." Cognitive Brain Research 23, no. 2-3 (May 2005): 406–17. http://dx.doi.org/10.1016/j.cogbrainres.2004.11.003.

Full text
13

Verma, Tushar, Scott C. Aker, and Jeremy Marozeau. "Effect of Vibrotactile Stimulation on Auditory Timbre Perception for Normal-Hearing Listeners and Cochlear-Implant Users." Trends in Hearing 27 (January 2023): 233121652211383. http://dx.doi.org/10.1177/23312165221138390.

Full text
Abstract:
The study tests the hypothesis that vibrotactile stimulation can affect timbre perception. A multidimensional scaling experiment was conducted. Twenty listeners with normal hearing and nine cochlear implant users were asked to judge the dissimilarity of a set of synthetic sounds that varied in attack time and amplitude modulation depth. The listeners were simultaneously presented with vibrotactile stimuli, which varied also in attack time and amplitude modulation depth. The results showed that alterations to the temporal waveform of the tactile stimuli affected the listeners’ dissimilarity judgments of the audio. A three-dimensional analysis revealed evidence of crossmodal processing where the audio and tactile equivalents combined accounted for their dissimilarity judgments. For the normal-hearing listeners, 86% of the first dimension was explained by audio impulsiveness and 14% by tactile impulsiveness; 75% of the second dimension was explained by the audio roughness or fast amplitude modulation, while its tactile counterpart explained 25%. Interestingly, the third dimension revealed a combination of 43% of audio impulsiveness and 57% of tactile amplitude modulation. For the CI listeners, the first dimension was mostly accounted for by the tactile roughness and the second by the audio impulsiveness. This experiment shows that the perception of timbre can be affected by tactile input and could lead to the developing of new audio-tactile devices for people with hearing impairment.
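The dissimilarity judgments above are analysed with multidimensional scaling. As a rough illustration of the general technique, here is a sketch of classical (Torgerson) MDS, which recovers a spatial configuration from a dissimilarity matrix; this is not the study's actual analysis pipeline.

```python
import numpy as np

def classical_mds(D, k=3):
    """Classical (Torgerson) MDS: embed items in k dimensions from an
    (n x n) symmetric dissimilarity matrix D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n       # centering matrix
    B = -0.5 * J @ (D ** 2) @ J               # double-centred squared dissimilarities
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:k]             # largest eigenvalues first
    # Clip tiny negative eigenvalues arising from noise/non-Euclidean input
    return V[:, idx] * np.sqrt(np.clip(w[idx], 0, None))
```

For exactly Euclidean dissimilarities the recovered configuration reproduces the input distances up to rotation and reflection; interpreting the resulting dimensions (e.g. as impulsiveness or roughness) is then a separate step.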
14

Grabowecky, Marcia, Aleksandra Sherman, and Satoru Suzuki. "Natural scenes have matched amplitude-modulated sounds that systematically influence visual scanning." Seeing and Perceiving 25 (2012): 121. http://dx.doi.org/10.1163/187847612x647540.

Full text
Abstract:
We have previously demonstrated a linear perceptual relationship between auditory amplitude-modulation (AM) rate and visual spatial-frequency using gabors as the visual stimuli. Can this frequency-based auditory–visual association influence perception of natural scenes? Participants consistently matched specific auditory AM rates to diverse visual scenes (nature, urban, and indoor). A correlation analysis indicated that higher subjective density ratings were associated with faster AM-rate matches. Furthermore, both the density ratings and AM-rate matches were relatively scale invariant, suggesting that the underlying crossmodal association is between visual coding of object-based density and auditory coding of AM rate. Based on these results, we hypothesized that concurrently presented fast (7 Hz) or slow (2 Hz) AM-rates might influence how visual attention is allocated to dense or sparse regions within a scene. We tested this hypothesis by monitoring eye movements while participants examined scenes for a subsequent memory task. To determine whether fast or slow sounds guided eye movements to specific spatial frequencies, we computed the maximum contrast energy at each fixation across 12 spatial frequency bands ranging from 0.06–10.16 cycles/degree. We found that the fast sound significantly guided eye movements toward regions of high spatial frequency, whereas the slow sound guided eye movements away from regions of high spatial frequency. This suggests that faster sounds may promote a local scene scanning strategy, acting as a ‘filter’ to individuate objects within dense regions. Our results suggest that auditory AM rate and visual object density are crossmodally associated, and that this association can modulate visual inspection of scenes.
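As a sketch of the kind of analysis described above (contrast energy in log-spaced spatial-frequency bands around a fixation), one can sum FFT power within radial frequency bands. The patch size and pixels-per-degree resolution below are hypothetical placeholders, not values from the study.

```python
import numpy as np

def band_energies(patch, n_bands=12, f_min=0.06, f_max=10.16, px_per_deg=30.0):
    """Contrast energy of a square grayscale patch in log-spaced
    spatial-frequency bands (cycles/degree), one value per band."""
    patch = patch - patch.mean()                     # contrast, not luminance
    n = patch.shape[0]
    F = np.fft.fftshift(np.fft.fft2(patch))
    # Sample spacing is 1/px_per_deg degrees, so frequencies come out in c/deg
    fx = np.fft.fftshift(np.fft.fftfreq(n, d=1.0 / px_per_deg))
    r = np.hypot(*np.meshgrid(fx, fx, indexing="ij"))  # radial frequency map
    edges = np.geomspace(f_min, f_max, n_bands + 1)
    power = np.abs(F) ** 2
    return np.array([power[(r >= lo) & (r < hi)].sum()
                     for lo, hi in zip(edges[:-1], edges[1:])])
```

A grating patch peaks in the band containing its spatial frequency, so comparing band energies across fixated regions distinguishes high- from low-frequency content.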
15

Nuku, Pines, and Harold Bekkering. "When one sees what the other hears: Crossmodal attentional modulation for gazed and non-gazed upon auditory targets." Consciousness and Cognition 19, no. 1 (March 2010): 135–43. http://dx.doi.org/10.1016/j.concog.2009.07.012.

Full text
16

Jamal, Yaseen, Simon Lacey, Lynne Nygaard, and K. Sathian. "Interactions Between Auditory Elevation, Auditory Pitch and Visual Elevation During Multisensory Perception." Multisensory Research 30, no. 3-5 (2017): 287–306. http://dx.doi.org/10.1163/22134808-00002553.

Full text
Abstract:
Cross-modal correspondences refer to associations between apparently unrelated stimulus features in different senses. For example, high and low auditory pitches are associated with high and low visual elevations, respectively. Here we examined how this crossmodal correspondence between visual elevation and auditory pitch relates to auditory elevation. We used audiovisual combinations of high- or low-frequency bursts of white noise and a visual stimulus comprising a white circle. Auditory and visual stimuli could each occur at high or low elevations. These multisensory stimuli could be congruent or incongruent for three correspondence types: cross-modal featural (auditory pitch/visual elevation), within-modal featural (auditory pitch/auditory elevation) and cross-modal spatial (auditory and visual elevation). Participants performed a 2AFC speeded classification (high or low) task while attending to auditory pitch, auditory elevation, or visual elevation. We tested for modulatory interactions between the three correspondence types. Modulatory interactions were absent when discriminating visual elevation. However, the within-modal featural correspondence affected the cross-modal featural correspondence during discrimination of auditory elevation and pitch, while the reverse modulation was observed only during discrimination of auditory pitch. The cross-modal spatial correspondence modulated the other two correspondences only when auditory elevation was being attended, was modulated by the cross-modal featural correspondence only during attention to auditory pitch, and was modulated by the within-modal featural correspondence while performing discrimination of either auditory elevation or pitch. We conclude that the cross-modal correspondence between auditory pitch and visual elevation interacts strongly with auditory elevation.
17

Guo, Lu, Ming Bao, Luyang Guan, and Lihan Chen. "Cognitive Styles Differentiate Crossmodal Correspondences Between Pitch Glide and Visual Apparent Motion." Multisensory Research 30, no. 3-5 (2017): 363–85. http://dx.doi.org/10.1163/22134808-00002556.

Full text
Abstract:
Crossmodal correspondences are the automatic associations that most people have between different basic sensory stimulus attributes, dimensions, or features. For instance, people often show a systematic tendency to associate moving objects with changing pitches. Cognitive styles are defined as an individual’s consistent approach to think, perceive, and remember information, and they reflect qualitative rather than quantitative differences between individuals in their thinking processes. Here we asked whether cognitive styles played a role in modulating the crossmodal interaction. We used the visual Ternus display in our study, since it elicits two distinct apparent motion percepts: element motion (with a shorter interval between the two Ternus frames) and group motion (with a longer interval between the two frames). We examined the audiovisual correspondences between the visual Ternus movement directions (upward or downward) and the changes of pitches of concurrent glides (ascending frequency or descending frequency). Moreover, we measured the cognitive styles (with the Embedded Figure Test) for each participant. The results showed that congruent correspondence between pitch-ascending (decreasing) glides and moving upward (downward) visual directions led to a more dominant percept of ‘element motion’, and such an effect was typically observed in the field-independent group. Importantly, field-independent participants demonstrated a high efficiency for identifying the properties of audiovisual events and applying the crossmodal correspondence in crossmodal interaction. The results suggest cognitive styles could differentiate crossmodal correspondences in crossmodal interaction.
18

Gallace, Alberto, Malika Auvray, and Charles Spence. "The Modulation of Haptic Line Bisection by a Visual Illusion and Optokinetic Stimulation." Perception 36, no. 7 (July 2007): 1003–18. http://dx.doi.org/10.1068/p5457.

Full text
Abstract:
Research has shown that a variety of different sensory manipulations, including visual illusions, transcutaneous nerve stimulation, vestibular caloric stimulation, optokinetic stimulation, and prism adaptation, can all influence people's performance on spatial tasks such as line bisection. It has been suggested that these manipulations may act upon the ‘higher-order’ levels of representation used to code spatial information. We investigated whether we could influence haptic line bisection in normal participants crossmodally by varying the visual background that participants viewed. In experiment 1, participants haptically bisected wooden rods while looking at a variant of the Oppel–Kundt visual illusion. Haptic-bisection judgments were influenced by the orientation of the visual illusion (in line with previous unimodal visual findings). In experiment 2, haptic-bisection judgments were also influenced by the presence of a leftward or rightward moving visual background. In experiments 3 and 4, the position of the to-be-bisected stimuli was varied with respect to the participant's body midline. The results confirmed an effect of optokinetic stimulation, but not of the Oppel–Kundt illusion, on participants' tactile-bisection errors, suggesting that the two manipulations might differentially affect haptic processing. Taken together, these results suggest that the ‘higher-order’ levels of spatial representation upon which perceptual judgments and/or motor responses are made may have multisensory or amodal characteristics.
19

Wan, Yingqi, and Lihan Chen. "Temporal Reference, Attentional Modulation, and Crossmodal Assimilation." Frontiers in Computational Neuroscience 12 (June 5, 2018). http://dx.doi.org/10.3389/fncom.2018.00039.

Full text
20

Chen, Yi-Chuan, Su-Ling Yeh, and Charles Spence. "Crossmodal Constraints on Human Perceptual Awareness: Auditory Semantic Modulation of Binocular Rivalry." Frontiers in Psychology 2 (2011). http://dx.doi.org/10.3389/fpsyg.2011.00212.

Full text
21

Ćwiek, Aleksandra, Susanne Fuchs, Christoph Draxler, Eva Liina Asu, Dan Dediu, Katri Hiovain, Shigeto Kawahara, et al. "The bouba/kiki effect is robust across cultures and writing systems." Philosophical Transactions of the Royal Society B: Biological Sciences 377, no. 1841 (November 15, 2021). http://dx.doi.org/10.1098/rstb.2020.0390.

Full text
Abstract:
The bouba/kiki effect—the association of the nonce word bouba with a round shape and kiki with a spiky shape—is a type of correspondence between speech sounds and visual properties with potentially deep implications for the evolution of spoken language. However, there is debate over the robustness of the effect across cultures and the influence of orthography. We report an online experiment that tested the bouba/kiki effect across speakers of 25 languages representing nine language families and 10 writing systems. Overall, we found strong evidence for the effect across languages, with bouba eliciting more congruent responses than kiki. Participants who spoke languages with Roman scripts were only marginally more likely to show the effect, and analysis of the orthographic shape of the words in different scripts showed that the effect was no stronger for scripts that use rounder forms for bouba and spikier forms for kiki. These results confirm that the bouba/kiki phenomenon is rooted in crossmodal correspondence between aspects of the voice and visual shape, largely independent of orthography. They provide the strongest demonstration to date that the bouba/kiki effect is robust across cultures and writing systems. This article is part of the theme issue ‘Voice modulation: from origin and mechanism to social impact (Part II)’.