Journal articles on the topic 'Auditory-motor integration'

The following journal articles address the topic 'Auditory-motor integration'. Abstracts are included where available in the source metadata.

1

Jay, M. F., and D. L. Sparks. "Sensorimotor integration in the primate superior colliculus. I. Motor convergence." Journal of Neurophysiology 57, no. 1 (January 1, 1987): 22–34. http://dx.doi.org/10.1152/jn.1987.57.1.22.

Abstract:
Orienting movements of the eyes and head are made to both auditory and visual stimuli even though in the primary sensory pathways the locations of auditory and visual stimuli are encoded in different coordinates. This study was designed to differentiate between two possible mechanisms for sensory-to-motor transformation. Auditory and visual signals could be translated into common coordinates in order to share a single motor pathway or they could maintain anatomically separate sensory and motor routes for the initiation and guidance of orienting eye movements. The primary purpose of the study was to determine whether neurons in the superior colliculus (SC) that discharge before saccades to visual targets also discharge before saccades directed toward auditory targets. If they do, this would indicate that auditory and visual signals, originally encoded in different coordinates, have been converted into a single coordinate system and are sharing a motor circuit. Trained monkeys made saccadic eye movements to auditory or visual targets while the activity of visual-motor (V-M) cells and saccade-related burst (SRB) cells was monitored. The pattern of spike activity observed during trials in which saccades were made to visual targets was compared with that observed when comparable saccades were made to auditory targets. For most (57 of 59) V-M cells, sensory responses were observed only on visual trials. Auditory stimuli originating from the same region of space did not activate these cells. Yet, of the 72 V-M and SRB cells studied, 79% showed motor bursts prior to saccades to either auditory or visual targets. This finding indicates that visual and auditory signals, originally encoded in retinal and head-centered coordinates, respectively, have undergone a transformation that allows them to share a common efferent pathway for the generation of saccadic eye movements. 
Saccades to auditory targets usually have lower velocities than saccades of the same amplitude and direction made to acquire visual targets. Since fewer collicular cells are active prior to saccades to auditory targets, one determinant of saccadic velocity may be the number of collicular neurons discharging before a particular saccade.
2

Loucks, Torrey M. J., Edward Ofori, Christopher M. Grindrod, Luc F. De Nil, and Jacob J. Sosnoff. "Auditory Motor Integration in Oral and Manual Effectors." Journal of Motor Behavior 42, no. 4 (July 2010): 233–39. http://dx.doi.org/10.1080/00222895.2010.492723.

3

Tran, Huynh-Truc, Yao-Chuen Li, Hung-Yu Lin, Shin-Da Lee, and Pei-Jung Wang. "Sensory Processing Impairments in Children with Developmental Coordination Disorder." Children 9, no. 10 (September 22, 2022): 1443. http://dx.doi.org/10.3390/children9101443.

Abstract:
The two objectives of this systematic review were to examine the following: (1) the difference in sensory processing areas (auditory, visual, vestibular, touch, proprioceptive, and multi-sensory) between children with and without developmental coordination disorder (DCD), and (2) the relationship between sensory processing and motor coordination in DCD. The following databases were comprehensively searched for relevant articles: PubMed, Science Direct, Web of Science, and Cochrane Library. There were 1107 articles (published 2010 to 2021) found in the initial search. Full-text articles of all possibly relevant citations were obtained and inspected for suitability by two authors. The outcome measures were sensory processing impairments and their relationship with motor coordination. A total of 10 articles met the inclusion criteria. Children with DCD showed significant impairments in visual integration, tactile integration, proprioceptive integration, auditory integration, vestibular integration, and oral integration processes when compared with typically developing children. Evidence also supported that sensory processing impairments were associated with poor motor coordination in DCD. Preliminary support indicated that children with DCD have sensory processing impairments in visual, tactile, proprioceptive, auditory, and vestibular areas, which might contribute to participation restriction in motor activities. It is important to apply sensory integration therapy in rehabilitation programs for children with DCD in order to facilitate participation in daily activities.
4

Jay, M. F., and D. L. Sparks. "Sensorimotor integration in the primate superior colliculus. II. Coordinates of auditory signals." Journal of Neurophysiology 57, no. 1 (January 1, 1987): 35–55. http://dx.doi.org/10.1152/jn.1987.57.1.35.

Abstract:
Based on the findings of the preceding paper, it is known that auditory and visual signals have been translated into common coordinates at the level of the superior colliculus (SC) and share a motor circuit involved in the generation of saccadic eye movements. It is not known, however, whether the translation of sensory signals into motor coordinates occurs prior to or within the SC. Nor is it known in what coordinates auditory signals observed in the SC are encoded. The present experiment tested two alternative hypotheses concerning the frame of reference of auditory signals found in the deeper layers of the SC. The hypothesis that auditory signals are encoded in head coordinates predicts that, with the head stationary, the response of auditory neurons will not be affected by variations in eye position but will be determined by the location of the sound source. The hypothesis that auditory responses encode the trajectory of the eye movement required to look to the target (motor error) predicts that the response of auditory cells will depend on both the position of the sound source and the position of the eyes in the orbit. Extracellular single-unit recordings were obtained from neurons in the SC while monkeys made delayed saccades to auditory or visual targets in a darkened room. The coordinates of auditory signals were studied by plotting auditory receptive fields while the animal fixated one of three targets placed 24 degrees apart along the horizontal plane. For 99 of 121 SC cells, the spatial location of the auditory receptive field was significantly altered by the position of the eyes in the orbit. In contrast, the responses of five sound-sensitive cells isolated in the inferior colliculus were not affected by variations in eye position. 
The possibility that systematic variations in the position of the pinnae associated with different fixation positions could account for these findings was controlled for by plotting auditory receptive fields while the pinnae were mechanically restrained. Under these conditions, the position of the eyes in the orbit still had a significant effect on the responsiveness of collicular neurons to auditory stimuli. The average magnitude of the shift of the auditory receptive field with changes in eye position (12.9 degrees) did not correspond to the magnitude of the shift in eye position (24 degrees). Alternative explanations for this finding were considered. One possibility is that, within the SC, there is a gradual transition from auditory signals in head coordinates to signals in motor error coordinates. [Abstract truncated at 400 words.]
5

Matchin, William, Kier Groulx, and Gregory Hickok. "Audiovisual Speech Integration Does Not Rely on the Motor System: Evidence from Articulatory Suppression, the McGurk Effect, and fMRI." Journal of Cognitive Neuroscience 26, no. 3 (March 2014): 606–20. http://dx.doi.org/10.1162/jocn_a_00515.

Abstract:
Visual speech influences the perception of heard speech. A classic example of this is the McGurk effect, whereby an auditory /pa/ overlaid onto a visual /ka/ induces the fusion percept of /ta/. Recent behavioral and neuroimaging research has highlighted the importance of both articulatory representations and motor speech regions of the brain, particularly Broca's area, in audiovisual (AV) speech integration. Alternatively, AV speech integration may be accomplished by the sensory system through multisensory integration in the posterior STS. We assessed the claims regarding the involvement of the motor system in AV integration in two experiments: (i) examining the effect of articulatory suppression on the McGurk effect and (ii) determining if motor speech regions show an AV integration profile. The hypothesis regarding experiment (i) is that if the motor system plays a role in McGurk fusion, distracting the motor system through articulatory suppression should result in a reduction of McGurk fusion. The results of experiment (i) showed that articulatory suppression results in no such reduction, suggesting that the motor system is not responsible for the McGurk effect. The hypothesis of experiment (ii) was that if the brain activation to AV speech in motor regions (such as Broca's area) reflects AV integration, the profile of activity should reflect AV integration: AV > AO (auditory only) and AV > VO (visual only). The results of experiment (ii) demonstrate that motor speech regions do not show this integration profile, whereas the posterior STS does. Instead, activity in motor regions is task dependent. The combined results suggest that AV speech integration does not rely on the motor system.
6

Westermann, Gert, and Eduardo Reck Miranda. "Modelling the Development of Mirror Neurons for Auditory-Motor Integration." Journal of New Music Research 31, no. 4 (December 1, 2002): 367–75. http://dx.doi.org/10.1076/jnmr.31.4.367.14166.

7

Cardin, Jessica A., Jonathan N. Raksin, and Marc F. Schmidt. "Sensorimotor Nucleus NIf Is Necessary for Auditory Processing But Not Vocal Motor Output in the Avian Song System." Journal of Neurophysiology 93, no. 4 (April 2005): 2157–66. http://dx.doi.org/10.1152/jn.01001.2004.

Abstract:
Sensorimotor integration in the avian song system is crucial for both learning and maintenance of song, a vocal motor behavior. Although a number of song system areas demonstrate both sensory and motor characteristics, their exact roles in auditory and premotor processing are unclear. In particular, it is unknown whether input from the forebrain nucleus interface of the nidopallium (NIf), which exhibits both sensory and premotor activity, is necessary for both auditory and premotor processing in its target, HVC. Here we show that bilateral NIf lesions result in long-term loss of HVC auditory activity but do not impair song production. NIf is thus a major source of auditory input to HVC, but an intact NIf is not necessary for motor output in adult zebra finches.
8

Kagerer, Florian A., Priya Viswanathan, Jose L. Contreras-Vidal, and Jill Whitall. "Auditory–motor integration of subliminal phase shifts in tapping: better than auditory discrimination would predict." Experimental Brain Research 232, no. 4 (January 22, 2014): 1207–18. http://dx.doi.org/10.1007/s00221-014-3837-9.

9

Lagarrigue, Yannick, Céline Cappe, and Jessica Tallet. "Regular rhythmic and audio-visual stimulations enhance procedural learning of a perceptual-motor sequence in healthy adults: A pilot study." PLOS ONE 16, no. 11 (November 15, 2021): e0259081. http://dx.doi.org/10.1371/journal.pone.0259081.

Abstract:
Procedural learning is essential for the effortless execution of many everyday life activities. However, little is known about the conditions influencing the acquisition of procedural skills. The literature suggests that the sensory environment may influence the acquisition of perceptual-motor sequences, as tested by a Serial Reaction Time Task. In the current study, we investigated the effects of auditory stimulations on procedural learning of a visuo-motor sequence. Given that the literature shows that regular rhythmic auditory stimulation and multisensory stimulations improve motor speed, we expected repeated practice with auditory stimulations presented either simultaneously with visual stimulations or with a regular tempo to improve procedural learning (reaction times and errors), compared to control conditions (e.g., with irregular tempo). Our results suggest that both congruent audio-visual stimulations and regular rhythmic auditory stimulations promote procedural perceptual-motor learning. In contrast, auditory stimulations with an irregular or very quick tempo impair learning. We discuss how regular rhythmic multisensory stimulations may improve procedural learning with respect to a multisensory rhythmic integration process.
10

Peschke, C., W. Ziegler, J. Kappes, and A. Baumgaertner. "Auditory–motor integration during fast repetition: The neuronal correlates of shadowing." NeuroImage 47, no. 1 (August 2009): 392–402. http://dx.doi.org/10.1016/j.neuroimage.2009.03.061.

11

Loucks, Torrey M., Heecheong Chon, Shelly Kraft, and Nicoline Ambrose. "Individual differences in auditory-motor integration revealed by speech fluency manipulations." Journal of the Acoustical Society of America 133, no. 5 (May 2013): 3518. http://dx.doi.org/10.1121/1.4806306.

12

Klatt, Stefanie, and Nicholas J. Smeeton. "Visual and Auditory Information During Decision Making in Sport." Journal of Sport and Exercise Psychology 42, no. 1 (February 1, 2020): 15–25. http://dx.doi.org/10.1123/jsep.2019-0107.

Abstract:
In 2 experiments, the authors investigated the effects of bimodal integration in a sport-specific task. Beach volleyball players were required to make a tactical decision, responding either verbally or via a motor response, after being presented with visual, auditory, or both kinds of stimuli in a beach volleyball scenario. In Experiment 1, players made the correct decision in a game situation more often when visual and auditory information were congruent than in trials in which they experienced only one of the modalities or incongruent information. Decision-making accuracy was greater when motor, rather than verbal, responses were given. Experiment 2 replicated this congruence effect using different stimulus material and showed a decreasing effect of visual stimulation on decision making as a function of shorter visual stimulus durations. In conclusion, this study shows that bimodal integration of congruent visual and auditory information results in more accurate decision making in sport than unimodal information.
13

Du, Yi, and Robert J. Zatorre. "Musical training sharpens and bonds ears and tongue to hear speech better." Proceedings of the National Academy of Sciences 114, no. 51 (December 4, 2017): 13579–84. http://dx.doi.org/10.1073/pnas.1712223114.

Abstract:
The idea that musical training improves speech perception in challenging listening environments is appealing and of clinical importance, yet the mechanisms of any such musician advantage are not well specified. Here, using functional magnetic resonance imaging (fMRI), we found that musicians outperformed nonmusicians in identifying syllables at varying signal-to-noise ratios (SNRs), which was associated with stronger activation of the left inferior frontal and right auditory regions in musicians compared with nonmusicians. Moreover, musicians showed greater specificity of phoneme representations in bilateral auditory and speech motor regions (e.g., premotor cortex) at higher SNRs and in the left speech motor regions at lower SNRs, as determined by multivoxel pattern analysis. Musical training also enhanced the intrahemispheric and interhemispheric functional connectivity between auditory and speech motor regions. Our findings suggest that improved speech in noise perception in musicians relies on stronger recruitment of, finer phonological representations in, and stronger functional connectivity between auditory and frontal speech motor cortices in both hemispheres, regions involved in bottom-up spectrotemporal analyses and top-down articulatory prediction and sensorimotor integration, respectively.
14

Rojas, J., and R. A. Peters II. "Sensory Integration with Articulated Motion on a Humanoid Robot." Applied Bionics and Biomechanics 2, no. 3-4 (2005): 171–78. http://dx.doi.org/10.1155/2005/295816.

Abstract:
This paper describes the integration of articulated motion with auditory and visual sensory information that enables a humanoid robot to achieve certain reflex actions that mimic those of people. Reflexes such as reach-and-grasp behavior enable the robot to learn, through experience, its own state and that of the world. A humanoid robot with binaural audio input, stereo vision, and pneumatic arms and hands exhibited tightly coupled sensory-motor behaviors in four different demonstrations. The complexity of successive demonstrations was increased to show that the reflexive sensory-motor behaviors combine to perform increasingly complex tasks. The humanoid robot executed these tasks effectively and established the groundwork for the further development of hardware and software systems, sensory-motor vector-space representations, and coupling with higher-level cognition.
15

Chen, Joyce L., Virginia B. Penhune, and Robert J. Zatorre. "Moving on Time: Brain Network for Auditory-Motor Synchronization is Modulated by Rhythm Complexity and Musical Training." Journal of Cognitive Neuroscience 20, no. 2 (February 2008): 226–39. http://dx.doi.org/10.1162/jocn.2008.20018.

Abstract:
Much is known about the motor system and its role in simple movement execution. However, little is understood about the neural systems underlying auditory-motor integration in the context of musical rhythm, or the enhanced ability of musicians to execute precisely timed sequences. Using functional magnetic resonance imaging, we investigated how performance and neural activity were modulated as musicians and nonmusicians tapped in synchrony with progressively more complex and less metrically structured auditory rhythms. A functionally connected network was implicated in extracting higher-order features of a rhythm's temporal structure, with the dorsal premotor cortex mediating these auditory-motor interactions. In contrast to past studies, musicians recruited the prefrontal cortex to a greater degree than nonmusicians, whereas secondary motor regions were recruited to the same extent. We argue that the superior ability of musicians to deconstruct and organize a rhythm's temporal structure relates to the greater involvement of the prefrontal cortex mediating working memory.
16

Salbenblatt, James A., Deborah C. Meyers, Bruce G. Bender, Mary G. Linden, and Arthur Robinson. "Gross and Fine Motor Development in 47,XXY and 47,XYY Males." Pediatrics 80, no. 2 (August 1, 1987): 240–44. http://dx.doi.org/10.1542/peds.80.2.240.

Abstract:
Neuromuscular deficits described in early childhood as motor awkwardness or slow movements are still clinically present in school-aged boys with XXY and XYY sex chromosome aneuploidy. A control group of 14 boys (6 to 19 years of age) and 14 XXY and four XYY boys (6 to 15 years of age), identified by newborn screening, were blindly evaluated by a physical therapist. The Bruininks-Oseretsky Test of Motor Proficiency (BOTMP) was administered and a clinical rating of neurologic status and sensory-motor integration was assigned. On the motor proficiency test, the XXY boys had significantly lower mean scores for upper limb coordination, speed and dexterity, and on gross motor and battery composites. The neuromuscular status of the aneuploid boys was deficient, with hypotonia, apraxia, primitive reflex retention, and problems with bilateral coordination and visual-perceptual-motor integration. This mild to moderate dysfunctional sensory-motor integration, as well as previously described auditory-processing deficits and dyslexia, contributed to school performance below that expected from their cognitive potential.
17

Behroozmand, Roozbeh, Hanjun Liu, and Charles R. Larson. "Time-dependent Neural Processing of Auditory Feedback during Voice Pitch Error Detection." Journal of Cognitive Neuroscience 23, no. 5 (May 2011): 1205–17. http://dx.doi.org/10.1162/jocn.2010.21447.

Abstract:
The neural responses to sensory consequences of a self-produced motor act are suppressed compared with those in response to a similar but externally generated stimulus. Previous studies in the somatosensory and auditory systems have shown that the motor-induced suppression of the sensory mechanisms is sensitive to delays between the motor act and the onset of the stimulus. The present study investigated time-dependent neural processing of auditory feedback in response to self-produced vocalizations. ERPs were recorded in response to normal and pitch-shifted voice auditory feedback during active vocalization and passive listening to the playback of the same vocalizations. The pitch-shifted stimulus was delivered to the subjects' auditory feedback after a randomly chosen time delay between the vocal onset and the stimulus presentation. Results showed that the neural responses to delayed feedback perturbations were significantly larger than those in response to the pitch-shifted stimulus occurring at vocal onset. Active vocalization was shown to enhance neural responsiveness to feedback alterations only for nonzero delays compared with passive listening to the playback. These findings indicated that the neural mechanisms of auditory feedback processing are sensitive to timing between the vocal motor commands and the incoming auditory feedback. Time-dependent neural processing of auditory feedback may be an important feature of the audio-vocal integration system that helps to improve the feedback-based monitoring and control of voice structure through vocal error detection and correction.
18

Rosati, Giulio, Antonio Rodà, Federico Avanzini, and Stefano Masiero. "On the Role of Auditory Feedback in Robot-Assisted Movement Training after Stroke: Review of the Literature." Computational Intelligence and Neuroscience 2013 (2013): 1–15. http://dx.doi.org/10.1155/2013/586138.

Abstract:
The goal of this paper is to address a topic that is rarely investigated in the literature of technology-assisted motor rehabilitation, that is, the integration of auditory feedback in the rehabilitation device. After a brief introduction on rehabilitation robotics, the main concepts of auditory feedback are presented, together with relevant approaches, techniques, and technologies available in this domain. Current uses of auditory feedback in the context of technology-assisted rehabilitation are then reviewed. In particular, a comparative quantitative analysis over a large corpus of the recent literature suggests that the potential of auditory feedback in rehabilitation systems is currently and largely underexploited. Finally, several scenarios are proposed in which the use of auditory feedback may contribute to overcome some of the main limitations of current rehabilitation systems, in terms of user engagement, development of acute-phase and home rehabilitation devices, learning of more complex motor tasks, and improving activities of daily living.
19

Molholm, Sophie, Pejman Sehatpour, Ashesh D. Mehta, Marina Shpaner, Manuel Gomez-Ramirez, Stephanie Ortigue, Jonathan P. Dyke, Theodore H. Schwartz, and John J. Foxe. "Audio-Visual Multisensory Integration in Superior Parietal Lobule Revealed by Human Intracranial Recordings." Journal of Neurophysiology 96, no. 2 (August 2006): 721–29. http://dx.doi.org/10.1152/jn.00285.2006.

Abstract:
Intracranial recordings from three human subjects provide the first direct electrophysiological evidence for audio-visual multisensory processing in the human superior parietal lobule (SPL). Auditory and visual sensory inputs project to the same highly localized region of the parietal cortex with auditory inputs arriving considerably earlier (30 ms) than visual inputs (75 ms). Multisensory integration processes in this region were assessed by comparing the response to simultaneous audio-visual stimulation with the algebraic sum of responses to the constituent auditory and visual unisensory stimulus conditions. Significant integration effects were seen with almost identical morphology across the three subjects, beginning between 120 and 160 ms. These results are discussed in the context of the role of SPL in supramodal spatial attention and sensory-motor transformations.
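The additive criterion described in this abstract — comparing the response to simultaneous audio-visual stimulation against the algebraic sum of the unisensory auditory and visual responses — can be sketched numerically. A minimal illustration follows; the waveforms, latencies, and threshold below are invented for demonstration and are not data from the study.

```python
import numpy as np

# Illustrative evoked-response time courses (arbitrary units); not study data.
t = np.arange(0, 300, 10)                    # time in ms
resp_a = np.exp(-((t - 90) / 40) ** 2)       # auditory-only response (earlier latency)
resp_v = np.exp(-((t - 150) / 40) ** 2)      # visual-only response (later latency)
# AV response with an extra (super-additive) component peaking at 140 ms
resp_av = resp_a + resp_v + 0.3 * np.exp(-((t - 140) / 30) ** 2)

# Additive-model test: integration effect = AV - (A + V)
effect = resp_av - (resp_a + resp_v)
onset_idx = np.argmax(effect > 0.1)          # first sample above an illustrative threshold
print(f"integration effect peaks at {t[np.argmax(effect)]} ms, onset ~{t[onset_idx]} ms")
```

Under the additive model, a nonzero residual (`effect`) marks genuine multisensory integration rather than a mere superposition of the two unisensory responses; its onset latency is what the study localizes to 120–160 ms in SPL.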
20

Tachibana, Ryosuke, Masuzo Yanagida, and Hiroshi Riquimaroux. "Temporo-frontal activities involved in auditory-motor integration: A functional MRI study." Neuroscience Research 68 (January 2010): e274. http://dx.doi.org/10.1016/j.neures.2010.07.1221.

21

Liu, Hanjun, Ying Liu, and Zhiqiang Guo. "Perceptual load of divided attention modulates auditory-motor integration of voice control." Journal of the Acoustical Society of America 138, no. 3 (September 2015): 1811. http://dx.doi.org/10.1121/1.4933748.

22

Tesche, C. D., and J. Karhu. "Interactive Processing of Sensory Input and Motor Output in the Human Hippocampus." Journal of Cognitive Neuroscience 11, no. 4 (July 1999): 424–36. http://dx.doi.org/10.1162/089892999563517.

Abstract:
Recent studies of visuomotor integration suggest that the motor system may be intimately involved in the detection of salient features of the sensory scene. The final stages of sensory processing occur in hippocampal structures. We measured human neuromagnetic responses during motor reaction to an auditory cue embedded in high-speed multimodal stimulation. Our results demonstrate that large-scale cognitive networks may recruit additional resources from the hippocampus during sensorimotor integration. Hippocampal activity from 300 msec before to 200 msec after cued movements was enhanced significantly over that observed during self-paced movements. The dominant hippocampal activity appeared equally synchronized to both sensory input and motor output, consistent with timing by an intrinsic mechanism, possibly provided by ongoing theta oscillations.
23

Alain, Claude, Yu He, and Cheryl Grady. "The Contribution of the Inferior Parietal Lobe to Auditory Spatial Working Memory." Journal of Cognitive Neuroscience 20, no. 2 (February 2008): 285–95. http://dx.doi.org/10.1162/jocn.2008.20014.

Abstract:
There is strong evidence for dissociable “what” and “where” pathways in the auditory system, but considerable debate remains regarding the functional role of these pathways. The sensory-motor account of spatial processing posits that the dorsal brain regions (e.g., inferior parietal lobule, IPL) mediate sensory-motor integration required during “where” responding. An alternative account suggests that the IPL plays an important role in monitoring sound location. To test these two models, we used a mixed-block and event-related functional magnetic resonance imaging (fMRI) design in which participants responded to occasional repetitions in either sound location (“where” task) or semantic category (“what” task). The fMRI data were analyzed with the general linear model using separate regressors for representing sustained and transient activity in both listening conditions. This analysis revealed more sustained activity in right dorsal brain regions, including the IPL and superior frontal sulcus, during the location than during the category task, after accounting for transient activity related to target detection and the motor response. Conversely, we found greater sustained activity in the left superior temporal gyrus and left inferior frontal gyrus during the category task compared to the location task. Transient target-related activity in both tasks was associated with enhanced signal in the left pre- and postcentral gyrus, prefrontal cortex and bilateral IPL. These results suggest dual roles for the right IPL in auditory working memory—one involved in monitoring and updating sound location independent of motor responding, and another that underlies the integration of sensory and motor functions.
24

Tierney, Adam, and Nina Kraus. "Getting back on the beat: links between auditory-motor integration and precise auditory processing at fast time scales." European Journal of Neuroscience 43, no. 6 (February 9, 2016): 782–91. http://dx.doi.org/10.1111/ejn.13171.

25

Ahn, Min-Hee, Nour Alsabbagh, Hyo-Jeong Lee, Hyung-Jong Kim, Myung-Hun Jung, and Sung-Kwang Hong. "Neurobiological Signatures of Auditory False Perception and Phantom Perception as a Consequence of Sensory Prediction Errors." Biology 11, no. 10 (October 13, 2022): 1501. http://dx.doi.org/10.3390/biology11101501.

Abstract:
In this study, we hypothesized that top-down sensory prediction error due to peripheral hearing loss might influence sensorimotor integration using the efference copy (EC) signals as functional connections between auditory and motor brain areas. Using neurophysiological methods, we demonstrated that the auditory responses to self-generated sound were not suppressed in a group of patients with tinnitus accompanied by significant hearing impairment and in a schizophrenia group. However, the response was attenuated in a group with tinnitus accompanied by mild hearing impairment, similar to a healthy control group. The bias of attentional networks to self-generated sound was also observed in the subjects with tinnitus with significant hearing impairment compared to those with mild hearing impairment and healthy subjects, but it did not reach the notable disintegration found in those in the schizophrenia group. Even though the present study had significant constraints in that we did not include hearing loss subjects without tinnitus, these results might suggest that auditory deafferentation (hearing loss) may influence sensorimotor integration process using EC signals. However, the impaired sensorimotor integration in subjects with tinnitus with significant hearing impairment may have resulted from aberrant auditory signals due to sensory loss, not fundamental deficits in the reafference system, as the auditory attention network to self-generated sound is relatively well preserved in these subjects.
APA, Harvard, Vancouver, ISO, and other styles
26

Liu, Dongxu, Guangyan Dai, Churong Liu, Zhiqiang Guo, Zhiqin Xu, Jeffery A. Jones, Peng Liu, and Hanjun Liu. "Top–Down Inhibitory Mechanisms Underlying Auditory–Motor Integration for Voice Control: Evidence by TMS." Cerebral Cortex 30, no. 8 (March 7, 2020): 4515–27. http://dx.doi.org/10.1093/cercor/bhaa054.

Full text
Abstract:
The dorsolateral prefrontal cortex (DLPFC) has been implicated in auditory–motor integration for accurate control of vocal production, but its precise role in this feedback-based process remains largely unknown. To this end, the present event-related potential study applied a transcranial magnetic stimulation (TMS) protocol, continuous theta-burst stimulation (c-TBS), to disrupt cortical activity in the left DLPFC as young adults vocalized vowel sounds while hearing their voice unexpectedly shifted upwards in pitch. The results showed that, as compared to the sham condition, c-TBS over left DLPFC led to significantly larger vocal compensations for pitch perturbations that were accompanied by significantly smaller cortical P2 responses. Source localization analyses revealed that this brain activity pattern was the result of reduced activation in the left superior frontal gyrus and right inferior parietal lobule (supramarginal gyrus). These findings demonstrate c-TBS-induced modulatory effects of DLPFC on the neurobehavioral processing of vocal pitch regulation, suggesting that disrupting prefrontal function may impair top–down inhibitory control mechanisms that prevent speech production from being excessively influenced by auditory feedback, resulting in enhanced vocal compensations for feedback perturbations. This is the first study that provides direct evidence for a causal role of the left DLPFC in auditory feedback control of vocal production.
APA, Harvard, Vancouver, ISO, and other styles
27

Murgia, Mauro, and Alessandra Galmonte. "Editorial: The Role of Sound in Motor Perception and Execution." Open Psychology Journal 8, no. 1 (December 31, 2015): 171–73. http://dx.doi.org/10.2174/1874350101508010171.

Full text
Abstract:
“Perception and action” is one of the main research fields in which experimental psychologists work together with experts of other disciplines, such as medicine, physiotherapy, engineering, and sport. Traditionally, researchers have mainly focused on visual perception and on its influences on motor processes, while less attention has been dedicated to the role of auditory perception. However, in the last decade, the interest towards the influence of sounds on both action perception and motor execution has increased significantly. On the one hand, researchers have been interested in determining how humans can represent motor actions through the sounds associated with movements, as well as which auditory cues are salient for recognizing and discriminating different features of movement [1-10]. On the other hand, researchers have studied how auditory stimuli affect the production of complex movements in different domains [11-21]. The general aim of this special issue is to provide an overview of the relationship between sounds and movements by addressing theoretical, methodological, and applied issues from a multidisciplinary perspective.

ORGANIZATION OF THE VOLUME

At the beginning of this special issue we report the contributions that deal with theoretical (Steenson & Rodger; Pizzera & Hohmann) and methodological (Dyer, Stapleton & Rodger) issues regarding auditory perception and action. After providing a theoretical and methodological background, we report those contributions that focus on possible applications of auditory training in the domain of sport and exercise psychology (O, Law & Rymal; Sors, Murgia, Santoro & Agostini), rehabilitation (Murgia, Corona, Pili, Sors, Agostini, Casula, Pau & Guicciardi), and motor learning (Effenberg, Schmitz, Baumann, Rosenhahn & Kroeger).
In the first article, Steenson and Rodger highlight that despite the fact that sounds are helpful in executing many day-to-day and context-specific movements and skills in everyday life, there is a surprising lack of exploration of this topic in psychological studies. In fact, the authors review the auditory perception literature and note that auditory perception theories mainly describe the rules governing the processing and representation of sounds in memory, and largely disregard the meaning that sounds have to individuals engaged in movement and the subsequent use of movement sounds in movement priming and execution. Steenson and Rodger’s work can be framed in the context of Gibson’s ecological psychology, as they emphasize the role of sound as a very important affordance that we use to interact with our environment. In the second contribution, Pizzera and Hohmann extensively review studies that address the relevance of the mutual interactions between perception and motor control. Again, these authors highlight the scarcity of research on acoustic information, especially when comparing it with the amount of evidence available in the visual domain. Pizzera and Hohmann offer their perspective on the role of auditory information in controlling and integrating the perception and action cycle. The authors present both behavioral and neurophysiological evidence in support of the importance of auditory information in perception and action, and propose valuable suggestions that future investigators should consider in order to advance the state of knowledge in this domain. The methodological contribution of Dyer, Stapleton and Rodger highlights the feasibility of movement sonification as an effective feedback tool for enhancing motor skill learning and performance, particularly in novices. The authors critically discuss the strengths and weaknesses of movement sonification in the context of providing efficient perceptual feedback information to learners.
Dyer, Stapleton and Rodger conclude that a well-defined framework for sonification mapping has yet to be established and that there is still need for controlled trials in motor learning. However, the authors do suggest that new technologies relevant to movement sound recording, mapping, and sonification are available to researchers and can facilitate meaningful and much-needed future research on this promising perceptual feedback method. With regards to the possible applications of audio-based interventions, the fourth article of the issue by O, Law, and Rymal provides an overview of imagery and modeling research in sport psychology and motor learning, documenting evidence supporting the cognitive processing similarities between imagery and modeling. Within this background, the authors critically examine the role of the auditory sense in modeling and imagery, analyzing both theoretical issues and empirical evidence. From a bio-informational theory perspective, O, Law, and Rymal offer several examples of potential applications of the deliberate integration of the auditory sense in movement teaching and instruction, but also offer a strong caveat regarding the severe lack of applied research on the auditory sense focused on sport populations, especially in the domain of imagery. In their conclusions the authors propose detailed recommendations for future research. A second contribution on audio-based interventions in sports is provided by Sors, Murgia, Santoro and Agostini. The authors extensively define the concepts of augmented feedback and modeling, and review studies demonstrating the effectiveness of sounds in improving the execution of simple rhythmic motor tasks. Then, Sors and colleagues describe both a theoretical background and neurophysiological evidence illustrating the mechanisms that are possibly influenced by audio-based interventions. 
Finally, they provide a complete description of the literature on auditory modeling and auditory augmented feedback in sports, specifying the methodological details of previous studies and proposing future directions for both application and research. In the sixth article, Murgia, Corona, Pili, Sors, Agostini, Casula, Pau and Guicciardi illustrate the perceptual-motor impairments of patients affected by Parkinson’s disease and new frontiers in assessment and interventions. They extensively review the empirical evidence concerning the Rhythmic Auditory Stimulation (RAS) method, describing the mechanisms underpinning its effectiveness. The authors propose possible methods for integrating auditory cues into physical therapy interventions as well as assessments. Last, Murgia and colleagues describe the biomechanical advantages of three-dimensional quantitative gait analysis, and discuss the potential impact of the incorporation of ecological footstep sounds in the modulation of patients’ gait. In the seventh and last contribution of this special issue, Effenberg, Schmitz, Baumann, Rosenhahn and Kroeger present a new method based on sonification called “SoundScript”, which aims to facilitate the acquisition of writing. This method consists of the sonification of handwriting, that is, the conversion of physical parameters (i.e., position of the pen, pressure) into movement sounds, which provides children with auditory information which correlates with visual information of their handwriting performance. The authors report pilot data, showing that the multisensory integration elicited by SoundScript leads to a more adequate reproduction of writing kinematics. Effenberg and colleagues conclude by highlighting the potential of this new method and suggesting future steps for research.
In sum, we hope that the papers presented in this special issue constitute a useful reference for movement researchers in the field of auditory perception and action, as well as for practitioners in the domains of sport, rehabilitation, and motor learning.
APA, Harvard, Vancouver, ISO, and other styles
28

Mates, Jiří, Ulrike Müller, Tomáš Radil, and Ernst Pöppel. "Temporal Integration in Sensorimotor Synchronization." Journal of Cognitive Neuroscience 6, no. 4 (July 1994): 332–40. http://dx.doi.org/10.1162/jocn.1994.6.4.332.

Full text
Abstract:
The concept of a temporal integration process in the timing mechanisms in the brain, postulated on the basis of experimental observations from various paradigms (for a review see Pöppel, 1978), has been explored in a sensorimotor synchronization task. Subjects synchronized their finger taps to sequences of auditory stimuli with interstimulus-onset intervals (ISIs) between 300 and 4800 msec in different trials. Each tonal sequence consisted of 110 stimuli; the tones had a frequency of 500 Hz and a duration of 100 msec. As observed previously, response onsets preceded onsets of the stimuli by some tens of milliseconds for ISIs in the range from about 600 to 1800 msec. For ISIs longer than or equal to 2400 msec, the ability to time the response sequence in such a way that the responses were placed right ahead of the stimuli started to break down, i.e., the task was fulfilled by reactions to the stimuli rather than by advanced responses. This observation can be understood within the general framework of a temporal integration process that is supposed to have a maximal capacity (integration interval) of approximately 3 sec. Only if successive stimuli fall within one integration period can motor programs be initiated properly by a prior stimulus and thus lead to an appropriate synchronization between the stimulus sequence and corresponding motor acts.
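The anticipation-versus-reaction distinction described in this abstract can be sketched in a few lines of code. This is an illustrative sketch, not the authors' analysis: the function names, the example onset values, and the 100 ms reaction threshold are assumptions chosen for demonstration.

```python
# Illustrative sketch (not the study's code): classifying synchronization
# performance from tap-minus-stimulus onset asynchronies. Negative mean
# asynchrony (taps leading the stimuli) indicates anticipatory timing;
# a large positive asynchrony indicates mere reactions to the stimuli.

def mean_asynchrony(tap_onsets, stimulus_onsets):
    """Mean tap-minus-stimulus onset difference in milliseconds."""
    diffs = [t - s for t, s in zip(tap_onsets, stimulus_onsets)]
    return sum(diffs) / len(diffs)

def classify_timing(mean_async_ms, reaction_threshold_ms=100.0):
    """Anticipatory if taps lead the stimuli on average; reactive if they
    lag by more than an assumed simple-reaction threshold."""
    if mean_async_ms < 0:
        return "anticipatory"
    return "reactive" if mean_async_ms > reaction_threshold_ms else "near-synchronous"

# Example: taps leading the stimuli by some tens of milliseconds (ISI = 600 ms)
stimuli = [0, 600, 1200, 1800]
taps = [-40, 570, 1165, 1770]
print(classify_timing(mean_asynchrony(taps, stimuli)))  # anticipatory
```

Under the paper's account, sequences with ISIs up to roughly 1800 msec would yield the "anticipatory" pattern, while ISIs of 2400 msec and beyond would drift into the "reactive" one.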
APA, Harvard, Vancouver, ISO, and other styles
29

Hoke, Kim L., Michael J. Ryan, and Walter Wilczynski. "Integration of sensory and motor processing underlying social behaviour in túngara frogs." Proceedings of the Royal Society B: Biological Sciences 274, no. 1610 (December 19, 2006): 641–49. http://dx.doi.org/10.1098/rspb.2006.0038.

Full text
Abstract:
Social decision making involves the perception and processing of social stimuli, the subsequent evaluation of that information in the context of the individual's internal and external milieus to produce a decision, and then culminates in behavioural output informed by that decision. We examined brain networks in an anuran communication system that relies on acoustic signals to guide simple, stereotyped motor output. We used egr-1 mRNA expression to measure neural activation in male túngara frogs, Physalaemus pustulosus , following exposure to conspecific and heterospecific calls that evoke competitive or aggressive behaviour. We found that acoustically driven activation in auditory brainstem nuclei is transformed into activation related to sensory–motor interactions in the diencephalon, followed by motor-related activation in the telencephalon. Furthermore, under baseline conditions, brain nuclei typically have correlated egr-1 mRNA levels within brain divisions. Hearing conspecific advertisement calls increases correlations between anatomically distant brain divisions; no such effect was observed in response to calls that elicit aggressive behaviour. Neural correlates of social decision making thus take multiple forms: (i) a progressive shift from sensory to motor encoding from lower to higher stages of neural processing and (ii) the emergence of correlated activation patterns among sensory and motor regions in response to behaviourally relevant social cues.
APA, Harvard, Vancouver, ISO, and other styles
30

Loui, Psyche. "A Dual-Stream Neuroanatomy of Singing." Music Perception 32, no. 3 (February 1, 2015): 232–41. http://dx.doi.org/10.1525/mp.2015.32.3.232.

Full text
Abstract:
Singing requires effortless and efficient use of auditory and motor systems that center around the perception and production of the human voice. Although perception and production are usually tightly coupled functions, occasional mismatches between the two systems inform us of dissociable pathways in the brain systems that enable singing. Here I review the literature on perception and production in the auditory modality, and propose a dual-stream neuroanatomical model that subserves singing. I will discuss studies surrounding the neural functions of feedforward, feedback, and efference systems that control vocal monitoring, as well as the white matter pathways that connect frontal and temporal regions that are involved in perception and production. I will also consider disruptions of the perception-production network that are evident in tone-deaf individuals and poor pitch singers. Finally, by comparing expert singers against other musicians and nonmusicians, I will evaluate the possibility that singing training might offer rehabilitation from these disruptions through neuroplasticity of the perception-production network. Taken together, the best available evidence supports a model of dorsal and ventral pathways in auditory-motor integration that enables singing and is shared with language, music, speech, and human interactions in the auditory environment.
APA, Harvard, Vancouver, ISO, and other styles
31

Wikman, Patrik, and Teemu Rinne. "Interaction of the effects associated with auditory-motor integration and attention-engaging listening tasks." Neuropsychologia 124 (February 2019): 322–36. http://dx.doi.org/10.1016/j.neuropsychologia.2018.11.006.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Huang, Xiyan, Hao Fan, Jingting Li, Jeffery A. Jones, Emily Q. Wang, Ling Chen, Xi Chen, and Hanjun Liu. "External cueing facilitates auditory-motor integration for speech control in individuals with Parkinson's disease." Neurobiology of Aging 76 (April 2019): 96–105. http://dx.doi.org/10.1016/j.neurobiolaging.2018.12.020.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Rummukainen, Olli S., Thomas Robotham, and Emanuël A. P. Habets. "Head-Related Transfer Functions for Dynamic Listeners in Virtual Reality." Applied Sciences 11, no. 14 (July 20, 2021): 6646. http://dx.doi.org/10.3390/app11146646.

Full text
Abstract:
In dynamic virtual reality, visual cues and motor actions aid auditory perception. With multimodal integration and auditory adaptation effects, generic head-related transfer functions (HRTFs) may yield no significant disadvantage to individual HRTFs regarding accurate auditory perception. This study compares two individual HRTF sets against a generic HRTF set by way of objective analysis and two subjective experiments. First, auditory-model-based predictions examine the objective deviations in localization cues between the sets. Next, the HRTFs are compared in a static subjective (N=8) localization experiment. Finally, the localization accuracy, timbre, and overall quality of the HRTF sets are evaluated subjectively (N=12) in a six-degrees-of-freedom audio-visual virtual environment. The results show statistically significant objective deviations between the sets, but no perceived localization or overall quality differences in the dynamic virtual reality.
APA, Harvard, Vancouver, ISO, and other styles
34

Rutherford, Helena J. V., Marc N. Potenza, Linda C. Mayes, and Dustin Scheinost. "The Application of Connectome-Based Predictive Modeling to the Maternal Brain: Implications for Mother–Infant Bonding." Cerebral Cortex 30, no. 3 (November 5, 2019): 1538–47. http://dx.doi.org/10.1093/cercor/bhz185.

Full text
Abstract:
Maternal bonding early postpartum lays an important foundation for child development. Changing brain structure and function during pregnancy and postpartum may underscore maternal bonding. We employed connectome-based predictive modeling (CPM) to measure brain functional connectivity and predict self-reported maternal bonding in mothers at 2 and 8 months postpartum. At 2 months, CPM predicted maternal anxiety in the bonding relationship: Greater integration between cerebellar and motor–sensory–auditory networks and between frontoparietal and motor–sensory–auditory networks were associated with more maternal anxiety toward their infant. Furthermore, greater segregation between the cerebellar and frontoparietal, and within the motor-sensory-auditory networks, was associated with more maternal anxiety regarding their infant. We did not observe CPM prediction of maternal bonding impairments or rejection/anger toward the infant. Finally, considering 2 and 8 months of data, changes in network connectivity were associated with changes in maternal anxiety in the bonding relationship. Our results suggest that changing connectivity among maternal brain networks may provide insight into the mother–infant bond, specifically in the context of anxiety and the representation of the infant in the mother’s mind. These findings provide an opportunity to mechanistically investigate approaches to enhance the connectivity of these networks to optimize the representational and behavioral quality of the caregiving relationship.
APA, Harvard, Vancouver, ISO, and other styles
35

Grzywniak, Celestyna. "INTEGRATION EXERCISE PROGRAMME FOR CHILDREN WITH LEARNING DIFFICULTIES WHO HAVE PRESERVED VESTIGIAL PRIMITIVE REFLEXES." Acta Neuropsychologica 15, no. 3 (September 12, 2017): 0. http://dx.doi.org/10.5604/01.3001.0010.5491.

Full text
Abstract:
Background: The main goal of the research was to determine the usefulness of the Integration exercise programme stimulating development in children with learning difficulties who have preserved vestigial primitive reflexes. Their symptoms included weak motor and visual-motor coordination, lowered visual and auditory analysis and synthesis which resulted in difficulties in reading and writing, disrupted emotional development, psychomotor hyperactivity, weak concentration and other symptoms. Material/Methods: 104 children with learning difficulties and other accompanying symptoms took part in the experiment. The children were trained in the shape of the Integration exercise programme at school under a therapist’s supervision and additionally at home under parental supervision. The children who went through the whole programme were qualified to the experimental group and those who resigned from the programme after a short period of time – to the control group. A pre-test and a post-test, before and after completion of the Integration exercise programme, were used to evaluate the results. Results: It was found that the Integration exercise programme is useful in therapy involving facilitation of development in children with learning difficulties, who exhibit various symptoms. Almost all the obtained results were statistically significant. The Integration exercise programme is particularly effective in the case of children exhibiting a whole set of symptoms along with learning difficulties, problems with concentration, weak emotion control, weak motor development, abnormal muscle tension, weak motor coordination. Conclusions: The Integration exercise programme widens the range of methods stimulating development and the range of possibilities to apply the therapy practiced in psychology, pedagogy and physiotherapy.
APA, Harvard, Vancouver, ISO, and other styles
36

Giurgola, Serena, Carlotta Casati, Chiara Stampatori, Laura Perucca, Flavia Mattioli, Giuseppe Vallar, and Nadia Bolognini. "Abnormal multisensory integration in relapsing–remitting multiple sclerosis." Experimental Brain Research 240, no. 3 (January 30, 2022): 953–68. http://dx.doi.org/10.1007/s00221-022-06310-0.

Full text
Abstract:
Temporal Binding Window (TBW) represents a reliable index of efficient multisensory integration process, which allows individuals to infer which sensory inputs from different modalities pertain to the same event. TBW alterations have been reported in some neurological and neuropsychiatric disorders and seem to negatively affect cognition and behavior. So far, it is still unknown whether deficits of multisensory integration, as indexed by an abnormal TBW, are present even in Multiple Sclerosis. We addressed this issue by testing 25 participants affected by relapsing–remitting Multiple Sclerosis (RRMS) and 30 age-matched healthy controls. Participants completed a simultaneity judgment task (SJ2) to assess the audio-visual TBW; two unimodal SJ2 versions were used as control tasks. Individuals with RRMS showed an enlarged audio-visual TBW (width range = from −166 ms to +198 ms), as compared to healthy controls (width range = −177/+66 ms), thus showing an increased tendency to integrate temporally asynchronous visual and auditory stimuli. Instead, simultaneity perception of unimodal (visual or auditory) events overall did not differ from that of controls. These results provide first evidence of a selective deficit of multisensory integration in individuals affected by RRMS, besides the well-known motor and cognitive impairments. The reduced multisensory temporal acuity is likely caused by a disruption of the neural interplay between different sensory systems caused by multiple sclerosis.
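As a rough illustration of the TBW measure used above, one can estimate the window from simultaneity-judgment data as the range of stimulus onset asynchronies (SOAs) at which "simultaneous" responses stay above a criterion. This is a simplified sketch under assumed values: the SOA grid, the 0.75 criterion, and the thresholding approach are illustrative, and published analyses typically fit psychometric (e.g., Gaussian) functions instead.

```python
# Simplified sketch of audio-visual temporal binding window (TBW) estimation
# from a simultaneity judgment (SJ2) task. Negative SOAs: auditory leads;
# positive SOAs: visual leads. All numbers below are illustrative.

def tbw_bounds(soa_ms, p_simultaneous, criterion=0.75):
    """Return (left, right) SOA bounds where the proportion of
    'simultaneous' responses meets the criterion, or (None, None)."""
    inside = [s for s, p in zip(soa_ms, p_simultaneous) if p >= criterion]
    return (min(inside), max(inside)) if inside else (None, None)

soas = [-300, -200, -100, 0, 100, 200, 300]
p_sim = [0.10, 0.40, 0.80, 0.95, 0.85, 0.55, 0.15]
left, right = tbw_bounds(soas, p_sim)
print(left, right, right - left)  # -100 100 200
```

An enlarged TBW, as reported for the RRMS group, would show up here as criterion-level responding persisting at larger absolute SOAs, widening the (left, right) interval.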
APA, Harvard, Vancouver, ISO, and other styles
37

Doucet, Gaelle E., Sarah Baker, Tony W. Wilson, and Max J. Kurz. "Weaker Connectivity of the Cortical Networks Is Linked with the Uncharacteristic Gait in Youth with Cerebral Palsy." Brain Sciences 11, no. 8 (August 13, 2021): 1065. http://dx.doi.org/10.3390/brainsci11081065.

Full text
Abstract:
Cerebral palsy (CP) is the most prevalent pediatric neurologic impairment and is associated with major mobility deficiencies. This has led to extensive investigations of the sensorimotor network, with far less research focusing on other major networks. The aim of this study was to investigate the functional connectivity (FC) of the main sensory networks (i.e., visual and auditory) and the sensorimotor network, and to link FC to the gait biomechanics of youth with CP. Using resting-state functional magnetic resonance imaging, we first identified the sensorimotor, visual and auditory networks in youth with CP and neurotypical controls. Our analysis revealed reduced FC among the networks in the youth with CP relative to the controls. Notably, the visual network showed lower FC with both the sensorimotor and auditory networks. Furthermore, higher FC between the visual and sensorimotor cortices was associated with larger step length (r = 0.74, pFDR = 0.04) in youth with CP. These results confirm that CP is associated with functional brain abnormalities beyond the sensorimotor network, suggesting abnormal functional integration of the brain’s motor and primary sensory systems. The significant association between abnormal visuo-motor FC and gait could indicate a link with visuomotor disorders in this patient population.
APA, Harvard, Vancouver, ISO, and other styles
38

Segado, Melanie, Robert J. Zatorre, and Virginia B. Penhune. "Effector-independent brain network for auditory-motor integration: fMRI evidence from singing and cello playing." NeuroImage 237 (August 2021): 118128. http://dx.doi.org/10.1016/j.neuroimage.2021.118128.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Guo, Zhiqiang, Xiuqin Wu, Weifeng Li, Jeffery A. Jones, Nan Yan, Stanley Sheft, Peng Liu, and Hanjun Liu. "Top-Down Modulation of Auditory-Motor Integration during Speech Production: The Role of Working Memory." Journal of Neuroscience 37, no. 43 (September 26, 2017): 10323–33. http://dx.doi.org/10.1523/jneurosci.1329-17.2017.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Ross, Gail, Evelyn Lipper, and Peter A. M. Auld. "Cognitive Abilities and Early Precursors of Learning Disabilities in Very-low-birthweight Children with Normal Intelligence and Normal Neurological Status." International Journal of Behavioral Development 19, no. 3 (September 1996): 563–80. http://dx.doi.org/10.1177/016502549601900307.

Full text
Abstract:
Fifty-nine of 88 children with birthweights ≤1500 grams had normal Full Scale IQ scores (≥80) and were judged to have normal neurological status at 7 to 8 years old. Twenty-two (37%) of these children were classified as being learning-disabled, as they had academic achievement scores ≤25th percentile. The children with learning disabilities had significantly lower Full Scale and Verbal IQ scores on the Wechsler Intelligence Scale for Children-Revised (1974), but they did not differ significantly from the normal children without learning disabilities on Performance IQ. Learning-disabled children also scored significantly lower on some tests of auditory processing and auditory memory, but not on visuo-motor abilities. Discriminant function analysis indicated that it was possible to correctly predict classification of 81% of the children as learning-disabled or not, based on measures of neonatal respiratory distress and social class level, 1-year mental and neuromotor abilities, and 3-year-old measures of language and visuo-motor integration. Results suggest that verbal deficits, rather than visuo-motor ones, underlie learning disabilities at school age in prematurely born children and that these children exhibit signs of subtle neurological impairment at earlier ages.
APA, Harvard, Vancouver, ISO, and other styles
41

Rastatter, Michael P., Andrew Stuart, and Joseph Kalinowski. "Quantitative Electroencephalogram of Posterior Cortical Areas of Fluent and Stuttering Participants during Reading with Normal and Altered Auditory Feedback." Perceptual and Motor Skills 87, no. 2 (October 1998): 623–33. http://dx.doi.org/10.2466/pms.1998.87.2.623.

Full text
Abstract:
In the left and right hemisphere, posterior quantitative electroencephalogram Beta band activity (13.5–25.5 Hz) of seven adult participants who stutter and seven age-matched normal controls was obtained while subjects read text under three experimental conditions of normal auditory feedback, delayed auditory feedback, and frequency-altered feedback. Data were obtained from surface electrodes affixed to the scalp using a commercial electrode cap. Electroencephalogram activity was amplified, band-pass analog-filtered, and then digitized. During nonaltered auditory feedback, stuttering participants displayed Beta band hyperreactivity, with the right temporal-parietal lobe region showing the greatest activity. Under conditions of delayed auditory feedback and frequency-altered auditory feedback, the stuttering participants displayed a decrease in stuttering behavior accompanied by a strong reduction in Beta activity for the posterior-temporal-parietal electrode sites, and the left hemisphere posterior sites evidenced a larger area of reactivity. Such findings suggest that an alteration in the electrical fields of the cortex occurred in the stuttering participants under both conditions, possibly reflecting changes in neurogenerator status or current dipole activity. Further, one could propose that stuttering reflects an anomaly of the sensory-linguistic motor integration wherein each hemisphere generates competing linguistic messages at hyperreactive amplitudes.
APA, Harvard, Vancouver, ISO, and other styles
42

Cabib, Christopher, Sara Llufriu, Jordi Casanova-Molla, Albert Saiz, and Josep Valls-Solé. "Defective sensorimotor integration in preparation for reaction time tasks in patients with multiple sclerosis." Journal of Neurophysiology 113, no. 5 (March 1, 2015): 1462–69. http://dx.doi.org/10.1152/jn.00591.2014.

Full text
Abstract:
Slowness of voluntary movements in patients with multiple sclerosis (MS) may be due to various factors, including attentional and cognitive deficits, delays in motor conduction time, and impairment of specific central nervous system circuits. In 13 healthy volunteers and 20 mildly disabled, relapsing-remitting MS patients, we examined simple reaction time (SRT) tasks requiring sensorimotor integration in circuits involving the corpus callosum and the brain stem. A somatosensory stimulus was used as the imperative signal (IS), and subjects were requested to react with either the ipsilateral or the contralateral hand (uncrossed vs. crossed SRT). In 33% of trials, a startling auditory stimulus was presented together with the IS, and the percentage reaction time change with respect to baseline SRT trials was measured (StartReact effect). The difference between crossed and uncrossed SRT, which requires interhemispheric conduction, was significantly larger in patients than in healthy subjects ( P = 0.021). The StartReact effect, which involves activation of brain stem motor pathways, was reduced significantly in patients with respect to healthy subjects (uncrossed trials: P = 0.015; crossed trials: P = 0.005). In patients, a barely significant correlation was found between SRT delay and conduction abnormalities in motor and sensory pathways ( P = 0.02 and P = 0.04, respectively). The abnormalities found specifically in trials reflecting interhemispheric transfer of information, as well as the evidence for reduced subcortical motor preparation, indicate that a delay in reaction time execution in MS patients cannot be explained solely by conduction slowing in motor and sensory pathways but suggest, instead, defective sensorimotor integration mechanisms in at least the two circuits examined.
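The StartReact measure described in this abstract is a simple percentage change computation, which can be sketched as follows. The function name and the example reaction times are illustrative assumptions, not values from the study.

```python
# Sketch of the StartReact effect measure: the percentage change in simple
# reaction time (SRT) on trials where a startling auditory stimulus (SAS)
# accompanies the imperative signal, relative to baseline SRT trials.
# Example values are hypothetical, not taken from the paper.

def startreact_percent_change(baseline_rt_ms, sas_rt_ms):
    """Negative values mean the startling stimulus shortened reaction time."""
    return 100.0 * (sas_rt_ms - baseline_rt_ms) / baseline_rt_ms

# A healthy-like pattern: marked RT shortening on startle trials.
print(startreact_percent_change(250.0, 175.0))  # -30.0
```

In the study's terms, a reduced StartReact effect in MS patients corresponds to this percentage being closer to zero than in healthy subjects, reflecting weaker subcortical motor preparation.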
APA, Harvard, Vancouver, ISO, and other styles
43

Sparks, D. L. "Translation of sensory signals into commands for control of saccadic eye movements: role of primate superior colliculus." Physiological Reviews 66, no. 1 (January 1, 1986): 118–71. http://dx.doi.org/10.1152/physrev.1986.66.1.118.

Full text
Abstract:
Afferent signals that guide orienting movements converge in the deeper layers of the SC in a wide variety of animals. The sensory cells are arranged topographically according to their receptive-field locations and, thereby, form maps of sensory space. Maps of visual, somatosensory, and/or auditory space have been obtained in the iguana, mouse, hamster, barn owl, chinchilla, cat, and monkey. The deeper layers of the SC also contain neurons involved in the generation of movements of the eyes, head, vibrissae, and pinnae. Thus the SC, a site containing multiple sensory maps and perhaps multiple motor maps, has been selected by many investigators as a structure for investigating the problem of sensorimotor integration. In the mammalian nervous system, emphasized in this review, much remains to be learned about the structure, organization, and function of the SC. While anatomical studies continue to add to the knowledge of the sources of afferent projections, their pattern of laminar termination, and the source and destination of efferent projections, relatively little is known about the intrinsic organization of the colliculus, especially the deeper layers. Recently, electrophysiological studies have moved from an emphasis on the sensory and motor properties of collicular neurons to an examination of the maps of auditory and somatosensory space and the correspondence of these maps. In the future, major efforts aimed at identifying the functional properties of cells that project to the SC from diverse brain regions as well as the functional properties that project to the various structures receiving input from the colliculus are needed. A combination of anatomical and electrophysiological methods is required to describe the signal transforms that occur between the SC and motor areas (such as the paramedian pontine reticular formation) closer to the final common pathway. 
Conceptual and empirical work is needed to develop and test models of how the dynamic visual and auditory maps found in the primate SC are generated. In general, new and/or improved models of the role of the SC in sensorimotor integration are needed as guides for future research. A point of view emphasized here is that it may be fruitful to examine the function of the SC from a motor perspective. The nature of the motor command imposes constraints on the configuration of signals that can initiate movements and thereby determines the required transformation of sensory signals.
44

McKenna, Victoria S., Jennifer A. Hylkema, Monique C. Tardif, and Cara E. Stepp. "Voice Onset Time in Individuals With Hyperfunctional Voice Disorders: Evidence for Disordered Vocal Motor Control." Journal of Speech, Language, and Hearing Research 63, no. 2 (February 26, 2020): 405–20. http://dx.doi.org/10.1044/2019_jslhr-19-00135.

Abstract:
Purpose This study examined vocal hyperfunction (VH) using voice onset time (VOT). We hypothesized that speakers with VH would produce shorter VOTs, indicating increased laryngeal tension, and more variable VOTs, indicating disordered vocal motor control. Method We enrolled 32 adult women with VH (aged 20–74 years) and 32 age- and sex-matched controls. All were speakers of American English. Participants produced vowel–consonant–vowel combinations that varied by vowel (ɑ/u) and plosive (p/b, t/d, k/g). VOT, measured from the release of the plosive to the initiation of voicing, was averaged over three repetitions of each vowel–consonant–vowel combination. The coefficient of variation (CoV), a measure of VOT variability, was also computed for each combination. Results The mean VOTs were not significantly different between the two groups; however, the CoVs were significantly greater in speakers with VH compared to controls. Voiceless CoV values were moderately correlated with clinical ratings of dysphonia (r = .58) in speakers with VH. Conclusion Speakers with VH exhibited greater variability in phonemic voicing targets compared to vocally healthy speakers, supporting the hypothesis for disordered vocal motor control in VH. We suggest future work incorporate VOT measures when assessing auditory discrimination and auditory–motor integration deficits in VH.
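The coefficient of variation reported in this study is a standard dispersion measure: the standard deviation divided by the mean. A minimal sketch of computing mean VOT and CoV over three repetitions (the measurement values and the function name are illustrative, not taken from the paper):

```python
from statistics import mean, stdev

def vot_summary(vots):
    """Return mean VOT and its coefficient of variation (CoV = SD / mean)
    over repeated productions of one vowel-consonant-vowel combination."""
    m = mean(vots)
    return m, stdev(vots) / m

# Three hypothetical VOT measurements (in seconds) for one token
mean_vot, cov = vot_summary([0.062, 0.071, 0.055])
```

A higher CoV at the same mean VOT indicates more variable voicing targets, which is the pattern the authors report for speakers with VH.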
45

Butler, Andrew J., and Karin Harman James. "Active Learning of Novel Sound-producing Objects: Motor Reactivation and Enhancement of Visuo-motor Connectivity." Journal of Cognitive Neuroscience 25, no. 2 (February 2013): 203–18. http://dx.doi.org/10.1162/jocn_a_00284.

Abstract:
Our experience with the world commonly involves physical interaction with objects enabling us to learn associations between multisensory information perceived during an event and our actions that create an event. The interplay among active interactions during learning and multisensory integration of object properties is not well understood. To better understand how action might enhance multisensory associative recognition, we investigated the interplay among motor and perceptual systems after active learning. Fifteen participants were included in an fMRI study during which they learned visuo-auditory-motor associations between novel objects and the sounds they produce, either through self-generated actions on the objects (active learning) or by observing an experimenter produce the actions (passive learning). Immediately after learning, behavioral and BOLD fMRI measures were collected while perceiving the objects used during unisensory and multisensory training in associative perception and recognition tasks. Active learning was faster and led to more accurate recognition of audiovisual associations than passive learning. Functional ROI analyses showed that in motor, somatosensory, and cerebellar regions there was greater activation during both the perception and recognition of actively learned associations. Finally, functional connectivity between visual- and motor-related processing regions was enhanced during the presentation of actively learned audiovisual associations. Overall, the results of the current study clarify and extend our own previous work [Butler, A. J., James, T. W., & Harman James, K. Enhanced multisensory integration and motor reactivation after active motor learning of audiovisual associations. Journal of Cognitive Neuroscience, 23, 3515–3528, 2011] by providing several novel findings and highlighting the task-based nature of motor reactivation and retrieval after active learning.
46

Diederich, Adele, and Hans Colonius. "Multisensory Integration and Exogenous Spatial Attention: A Time-window-of-integration Analysis." Journal of Cognitive Neuroscience 31, no. 5 (May 2019): 699–710. http://dx.doi.org/10.1162/jocn_a_01386.

Abstract:
Although it is well documented that the occurrence of an irrelevant and nonpredictive sound facilitates motor responses to a subsequent target light appearing nearby, the cause of this “exogenous spatial cuing effect” has been under discussion. On the one hand, it has been postulated to be the result of a shift of visual spatial attention possibly triggered by parietal and/or cortical supramodal “attention” structures. On the other hand, the effect has been considered to be due to multisensory integration based on the activation of multisensory convergence structures in the brain. Recent RT experiments have suggested that multisensory integration and exogenous spatial cuing differ in their temporal profiles of facilitation: When the nontarget occurs 100–200 msec before the target, facilitation is likely driven by crossmodal exogenous spatial attention, whereas multisensory integration effects are still seen when target and nontarget are presented nearly simultaneously. Here, we develop an extension of the time-window-of-integration model that combines both mechanisms within the same formal framework. The model is illustrated by fitting it to data from a focused attention task with a visual target and an auditory nontarget presented at horizontally or vertically varying positions. Results show that both spatial cuing and multisensory integration may coexist in a single trial in bringing about the crossmodal facilitation of RT effects. Moreover, the formal analysis via time window of integration allows one to predict and quantify the contribution of either mechanism as they occur across different spatiotemporal conditions.
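The time-window-of-integration rule can be illustrated with a small Monte Carlo sketch: on each simulated focused-attention trial, integration occurs only if peripheral processing of the auditory nontarget terminates before that of the visual target, and the target terminates within the window. The exponential processing-time means and the window width below are illustrative assumptions, not parameters fitted by the authors:

```python
import random

def twin_integration_prob(soa_ms, window_ms=200.0, n=10000, seed=1):
    """Monte Carlo sketch of the time-window-of-integration rule:
    count a trial as integrated when the auditory nontarget's peripheral
    processing finishes before the visual target's, and the target
    finishes within `window_ms` of the nontarget."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        aud = rng.expovariate(1 / 30.0)            # nontarget onset at t = 0
        vis = soa_ms + rng.expovariate(1 / 60.0)   # target onset at t = soa_ms
        if aud < vis <= aud + window_ms:
            hits += 1
    return hits / n
```

With these assumed parameters, a moderate nontarget lead (SOA around 100 msec) makes it more likely that the nontarget wins the peripheral race, raising the simulated probability of integration relative to simultaneous presentation.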
47

Drijvers, Linda, Asli Özyürek, and Ole Jensen. "Alpha and Beta Oscillations Index Semantic Congruency between Speech and Gestures in Clear and Degraded Speech." Journal of Cognitive Neuroscience 30, no. 8 (August 2018): 1086–97. http://dx.doi.org/10.1162/jocn_a_01301.

Abstract:
Previous work revealed that visual semantic information conveyed by gestures can enhance degraded speech comprehension, but the mechanisms underlying these integration processes under adverse listening conditions remain poorly understood. We used MEG to investigate how oscillatory dynamics support speech–gesture integration when integration load is manipulated by auditory (e.g., speech degradation) and visual semantic (e.g., gesture congruency) factors. Participants were presented with videos of an actress uttering an action verb in clear or degraded speech, accompanied by a matching (mixing gesture + “mixing”) or mismatching (drinking gesture + “walking”) gesture. In clear speech, alpha/beta power was more suppressed in the left inferior frontal gyrus and motor and visual cortices when integration load increased in response to mismatching versus matching gestures. In degraded speech, beta power was less suppressed over posterior STS and medial temporal lobe for mismatching compared with matching gestures, showing that integration load was lowest when speech was degraded and mismatching gestures could not be integrated and disambiguate the degraded signal. Our results thus provide novel insights on how low-frequency oscillatory modulations in different parts of the cortex support the semantic audiovisual integration of gestures in clear and degraded speech: When speech is clear, the left inferior frontal gyrus and motor and visual cortices engage because higher-level semantic information increases semantic integration load. When speech is degraded, posterior STS/middle temporal gyrus and medial temporal lobe are less engaged because integration load is lowest when visual semantic information does not aid lexical retrieval and speech and gestures cannot be integrated.
48

Scurry, Alexandra N., Daniela M. Lemus, and Fang Jiang. "Temporal Alignment but not Complexity of Audiovisual Stimuli Influences Crossmodal Duration Percepts." Multisensory Research 35, no. 2 (October 8, 2021): 131–49. http://dx.doi.org/10.1163/22134808-bja10062.

Abstract:
Reliable duration perception is an integral aspect of daily life that impacts everyday perception, motor coordination, and subjective passage of time. The Scalar Expectancy Theory (SET) is a common model that explains how an internal pacemaker, gated by an external stimulus-driven switch, accumulates pulses during sensory events and compares these accumulated pulses to a reference memory duration for subsequent duration estimation. Second-order mechanisms, such as multisensory integration (MSI) and attention, can influence this model and affect duration perception. For instance, diverting attention away from temporal features could delay the switch closure or temporarily open the accumulator, altering pulse accumulation and distorting duration perception. In crossmodal duration perception, auditory signals of unequal duration can induce perceptual compression and expansion of durations of visual stimuli, presumably via auditory influence on the visual clock. The current project aimed to investigate the role of temporal (stimulus alignment) and nontemporal (stimulus complexity) features on crossmodal, specifically auditory over visual, duration perception. While temporal alignment revealed a larger impact on the strength of crossmodal duration percepts compared to stimulus complexity, both features showcase auditory dominance in processing visual duration.
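The SET pacemaker–accumulator mechanism described above can be sketched as a simple simulation: a switch closes after a latency, pulses accumulate while the stimulus is on, and the count is read out as a duration estimate. The pulse rate and switch latency here are illustrative assumptions, not parameters from the paper:

```python
def estimate_duration(stimulus_ms, pulse_rate_hz=100.0, switch_latency_ms=30.0):
    """Scalar Expectancy Theory sketch: a stimulus-driven switch closes
    after a latency, the pacemaker's pulses then accumulate while the
    stimulus is on, and the accumulated count is decoded back into an
    estimated duration in milliseconds."""
    gated_ms = max(0.0, stimulus_ms - switch_latency_ms)
    pulses = gated_ms * pulse_rate_hz / 1000.0   # count reaching the accumulator
    return pulses / pulse_rate_hz * 1000.0       # decoded duration estimate (ms)

# Diverting attention from temporal features can be modeled as a delayed
# switch closure: fewer pulses accumulate, compressing perceived duration.
attended = estimate_duration(500.0)
distracted = estimate_duration(500.0, switch_latency_ms=90.0)
```

In this toy version the distracted estimate is shorter than the attended one for the same physical stimulus, mirroring the attentional distortion of perceived duration the abstract describes.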
49

Gori, Monica. "Multisensory Integration and Calibration in Children and Adults with and without Sensory and Motor Disabilities." Multisensory Research 28, no. 1-2 (2015): 71–99. http://dx.doi.org/10.1163/22134808-00002478.

Abstract:
During the first years of life, sensory modalities communicate with each other. This process is fundamental for the development of unisensory and multisensory skills. The absence of one sensory input impacts the development of the other modalities. Since 2008 we have studied these aspects and developed our cross-sensory calibration theory. This theory emerged from the observation that children start to integrate multisensory information (such as vision and touch) only after 8–10 years of age. Before this age the more accurate sense teaches (calibrates) the others; when one calibrating modality is missing, the other modalities are impaired. Children with visual disabilities have problems understanding the haptic or auditory perception of space, and children with motor disabilities have problems understanding the visual dimension of objects. This review presents our recent studies on multisensory integration and cross-sensory calibration in children and adults with and without sensory and motor disabilities. The goal of this review is to show the importance of interaction between sensory systems during the early period of life in order for correct perceptual development to occur.
50

Ravulakollu, Kiran Kumar, Harry Erwin, and Kevin Burn. "Improving Robot-Human Communication by Integrating Visual Attention and Auditory Localization Using a Biologically Inspired Model of Superior Colliculus." Advanced Materials Research 403-408 (November 2011): 4711–17. http://dx.doi.org/10.4028/www.scientific.net/amr.403-408.4711.

Abstract:
Effective agent communication has always been an important area of modern research. This paper focuses on achieving greater precision in agent-human communication with the help of visual attention and auditory localization, based on a simple model of the superior colliculus in the human brain. The model receives individual visual and auditory sensory stimuli and combines them to generate an integrated stimulus predicting the location of the sound source. This integrated stimulus is used to generate a motor saccade of the visual system to attend to the sound. The computational model is based on a neural network approach with learning and is explored in experiments reflecting varied conditions to determine whether it mimics the performance of the superior colliculus in auditory and visual stimulus integration. Finally, an evaluation strategy comparing unimodal and multimodal data is used to determine the efficiency of the computational model of the superior colliculus. The neural-network-based computational model has proven effective in terms of learning, the superior performance of the integrated response over the unimodal response, and the provision of a realistic communication experience.
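The paper's model is a trained neural network; as a much simpler stand-in, the benefit of combining the two cues can be illustrated with standard variance-weighted (maximum-likelihood) fusion of unimodal azimuth estimates, where the fused estimate is more reliable than either cue alone. The variance values below are illustrative assumptions, not figures from the paper:

```python
def integrate_azimuth(visual_deg, auditory_deg, visual_var=4.0, auditory_var=25.0):
    """Variance-weighted fusion of unimodal location estimates:
    each cue is weighted by its inverse variance (its reliability),
    and the fused variance is smaller than either unimodal variance."""
    w_vis = 1.0 / visual_var
    w_aud = 1.0 / auditory_var
    fused = (w_vis * visual_deg + w_aud * auditory_deg) / (w_vis + w_aud)
    fused_var = 1.0 / (w_vis + w_aud)
    return fused, fused_var

# Hypothetical unimodal estimates: visual cue at 10 deg, auditory at 20 deg.
fused, fused_var = integrate_azimuth(10.0, 20.0)
```

With these assumed reliabilities the fused estimate lands near the more precise visual cue, which is the qualitative behavior an integrated audiovisual localizer is meant to capture.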