Journal articles on the topic 'Binaural localization'

Consult the top 50 journal articles for your research on the topic 'Binaural localization.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Snik, Ad F. M., Andy J. Beynon, Catharina T. M. van der Pouw, Emmanuel A. M. Mylanus, and Cor W. R. J. Cremers. "Binaural Application of the Bone-Anchored Hearing Aid." Annals of Otology, Rhinology & Laryngology 107, no. 3 (March 1998): 187–93. http://dx.doi.org/10.1177/000348949810700301.

Abstract:
Most, but not all, hearing-impaired patients with air conduction hearing aids prefer binaural amplification instead of monaural amplification. The binaural application of the bone conduction hearing aid is more disputable, because the attenuation (in decibels) of sound waves across the skull is so small (10 dB) that even one bone conduction hearing aid will stimulate both cochleas approximately to the same extent. Binaural fitting of the bone-anchored hearing aid was studied in three experienced bone-anchored hearing aid users. The experiments showed that sound localization, and speech recognition in quiet and also under certain noisy conditions improved significantly with binaural listening compared to the monaural listening condition. On the average, the percentage of correct identifications (within 45°) in the sound localization experiment improved by 53% with binaural listening; the speech reception threshold in quiet improved by 4.4 dB. The binaural advantage in the speech-in-noise test was comparable to that of a control group of subjects with normal hearing listening monaurally versus binaurally. The improvements in the scores were ascribed to diotic summation (improved speech recognition in quiet) and the ability to separate sounds in the binaural listening condition (improved sound localization and improved speech recognition in noise whenever the speech and noise signals came from different directions). All three patients preferred the binaural bone-anchored hearing aids and used them all day.
2

Wu, Xinyi, Zhenyao Wu, Lili Ju, and Song Wang. "Binaural Audio-Visual Localization." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 4 (May 18, 2021): 2961–68. http://dx.doi.org/10.1609/aaai.v35i4.16403.

Abstract:
Localizing sound sources in a visual scene has many important applications, and quite a few traditional or learning-based methods have been proposed for this task. Humans have the ability to roughly localize sound sources within or beyond the range of vision using their binaural system. However, most existing methods use monaural audio, instead of binaural audio, as a modality to help the localization. In addition, prior works usually localize sound sources in the form of object-level bounding boxes in images or videos and evaluate the localization accuracy by examining the overlap between the ground-truth and predicted bounding boxes. This is too rough, since a real sound source is often only a part of an object. In this paper, we propose a deep learning method for pixel-level sound source localization by leveraging both binaural recordings and the corresponding videos. Specifically, we design a novel Binaural Audio-Visual Network (BAVNet), which concurrently extracts and integrates features from binaural recordings and videos. We also propose a point-annotation strategy to construct pixel-level ground truth for network training and performance evaluation. Experimental results on the Fair-Play and YT-Music datasets demonstrate the effectiveness of the proposed method and show that binaural audio can greatly improve the performance of localizing the sound sources, especially when the quality of the visual information is limited.
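For readers who want a concrete picture of the two-stream idea sketched in this abstract, the following is a minimal, illustrative PyTorch sketch of a network that encodes a two-channel (left/right) spectrogram and a video frame separately and fuses them into a pixel-level localization map. It is not the BAVNet architecture from the paper; the layer sizes, fusion strategy, and input shapes are assumptions made only for the demonstration.

```python
# Minimal sketch of a two-stream binaural audio-visual localizer.
# NOTE: this is NOT the BAVNet architecture from the paper; layer sizes,
# fusion strategy, and tensor shapes are illustrative assumptions only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TwoStreamLocalizer(nn.Module):
    def __init__(self):
        super().__init__()
        # Audio encoder: input is a 2-channel (left/right) spectrogram.
        self.audio_enc = nn.Sequential(
            nn.Conv2d(2, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),                # -> (B, 64, 1, 1)
        )
        # Visual encoder: input is an RGB frame.
        self.video_enc = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        # Fusion head producing a single-channel localization map.
        self.head = nn.Conv2d(128, 1, 1)

    def forward(self, spec, frame):
        a = self.audio_enc(spec)                    # (B, 64, 1, 1)
        v = self.video_enc(frame)                   # (B, 64, H/4, W/4)
        a = a.expand(-1, -1, v.shape[2], v.shape[3])
        heat = self.head(torch.cat([a, v], dim=1))  # (B, 1, H/4, W/4)
        # Upsample back to frame resolution; sigmoid gives pixel-level scores.
        heat = F.interpolate(heat, size=frame.shape[2:], mode="bilinear",
                             align_corners=False)
        return torch.sigmoid(heat)

model = TwoStreamLocalizer()
spec = torch.randn(1, 2, 128, 64)    # left/right log-mel spectrogram (toy shape)
frame = torch.randn(1, 3, 224, 224)  # one video frame
print(model(spec, frame).shape)      # torch.Size([1, 1, 224, 224])
```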
3

Koehnke, Janet, and Patrick M. Zurek. "Localization and binaural detection with monaural and binaural amplification." Journal of the Acoustical Society of America 88, S1 (November 1990): S169. http://dx.doi.org/10.1121/1.2028743.

4

Razak, Khaleel A., and Zoltan M. Fuzessery. "Functional Organization of the Pallid Bat Auditory Cortex: Emphasis on Binaural Organization." Journal of Neurophysiology 87, no. 1 (January 1, 2002): 72–86. http://dx.doi.org/10.1152/jn.00226.2001.

Abstract:
This report maps the organization of the primary auditory cortex of the pallid bat in terms of frequency tuning, selectivity for behaviorally relevant sounds, and interaural intensity difference (IID) sensitivity. The pallid bat is unusual in that it localizes terrestrial prey by passively listening to prey-generated noise transients (1–20 kHz), while reserving high-frequency (>30 kHz) echolocation for obstacle avoidance. The functional organization of its auditory cortex reflects the need for specializations in echolocation and passive sound localization. Best frequencies were arranged tonotopically with a general increase in the caudolateral to rostromedial direction. Frequencies between 24 and 32 kHz were under-represented, resulting in a hypertrophied representation of frequencies relevant for prey localization and echolocation. Most neurons (83%) tuned <30 kHz responded preferentially to broadband or band-pass noise over single tones. Most neurons (62%) tuned >30 kHz responded selectively or exclusively to the 60- to 30-kHz downward frequency-modulated (FM) sweep used for echolocation. Within the low-frequency region, neurons were placed in two groups that occurred in two separate clusters: those selective for low- or high-frequency band-pass noise and suppressed by broadband noise, and neurons that showed no preference for band-pass noise over broadband noise. Neurons were organized in homogeneous clusters with respect to their binaural response properties. The distribution of binaural properties differed in the noise- and FM sweep-preferring regions, suggesting task-dependent differences in binaural processing. The low-frequency region was dominated by a large cluster of binaurally inhibited neurons with a smaller cluster of neurons with mixed binaural interactions. The FM sweep-selective region was dominated by neurons with mixed binaural interactions or monaural neurons. Finally, this report describes a cortical substrate for systematic representation of a spatial cue, IIDs, in the low-frequency region. This substrate may underlie a population code for sound localization based on a systematic shift in the distribution of activity across the cortex with sound source location.
5

Koehnke, Janet, Joan Besing, Christine Goulet, Marla Allard, and Patrick M. Zurek. "Speech intelligibility, localization, and binaural detection with monaural and binaural amplification." Journal of the Acoustical Society of America 92, no. 4 (October 1992): 2434. http://dx.doi.org/10.1121/1.404588.

6

Ge, Zheng. "SVM-based Binaural Sound Source Localization." Journal of Information and Computational Science 12, no. 14 (September 20, 2015): 5459–67. http://dx.doi.org/10.12733/jics20106459.

7

Craig, Rushby C., and Timothy R. Anderson. "Binaural sound localization using neural networks." Journal of the Acoustical Society of America 91, no. 4 (April 1992): 2415. http://dx.doi.org/10.1121/1.403229.

8

Van Wanrooij, Marc M., and A. John Van Opstal. "Sound Localization Under Perturbed Binaural Hearing." Journal of Neurophysiology 97, no. 1 (January 2007): 715–26. http://dx.doi.org/10.1152/jn.00260.2006.

Abstract:
This paper reports on the acute effects of a monaural plug on directional hearing in the horizontal (azimuth) and vertical (elevation) planes of human listeners. Sound localization behavior was tested with rapid head-orienting responses toward brief high-pass filtered (>3 kHz; HP) and broadband (0.5–20 kHz; BB) noises, with sound levels between 30 and 60 dB, A-weighted (dBA). To deny listeners any consistent azimuth-related head-shadow cues, stimuli were randomly interleaved. A plug immediately degraded azimuth performance, as evidenced by a sound level–dependent shift (“bias”) of responses contralateral to the plug, and a level-dependent change in the slope of the stimulus–response relation (“gain”). Although the azimuth bias and gain were highly correlated, they could not be predicted from the plug's acoustic attenuation. Interestingly, listeners performed best for low-intensity stimuli at their normal-hearing side. These data demonstrate that listeners rely on monaural spectral cues for sound-source azimuth localization as soon as the binaural difference cues break down. Also the elevation response components were affected by the plug: elevation gain depended on both stimulus azimuth and on sound level and, as for azimuth, localization was best for low-intensity stimuli at the hearing side. Our results show that the neural computation of elevation incorporates a binaural weighting process that relies on the perceived, rather than the actual, sound-source azimuth. It is our conjecture that sound localization ensues from a weighting of all acoustic cues for both azimuth and elevation, in which the weights may be partially determined, and rapidly updated, by the reliability of the particular cue.
9

Razak, Khaleel A., and Zoltan M. Fuzessery. "GABA Shapes a Systematic Map of Binaural Sensitivity in the Auditory Cortex." Journal of Neurophysiology 104, no. 1 (July 2010): 517–28. http://dx.doi.org/10.1152/jn.00294.2010.

Abstract:
A consistent organizational feature of auditory cortex is a clustered representation of binaural properties. Here we address two questions: what is the intrinsic organization of binaural clusters, and to what extent does intracortical processing contribute to binaural representation? We address these issues in the auditory cortex of the pallid bat. The pallid bat listens to prey-generated noise transients to localize and hunt terrestrial prey. As in other species studied, binaural clusters are present in the auditory cortex of the pallid bat. One cluster contains neurons that require binaural stimulation to be maximally excited, and are commonly termed predominantly binaural (PB) neurons. These neurons do not respond to monaural stimulation of either ear but show a peaked sensitivity to interaural intensity differences (IID) centered near 0 dB IID. We show that the peak IID varies systematically within this cluster. The peak IID is also correlated with the best frequency (BF) of neurons within this cluster. In addition, the IID selectivity of PB neurons is shaped by intracortical GABAergic input. Iontophoresis of GABA-A receptor antagonists on PB neurons converts a majority of them to binaurally inhibited (EI) neurons that respond best to sounds favoring the contralateral ear. These data indicate that the cortex does not simply inherit binaural properties from lower levels but instead sharpens them locally through intracortical inhibition. The IID selectivity of the PB cluster indicates that the pallid bat cortex contains an increased representation of the frontal space that may underlie increased localization accuracy in this region.
10

Samson, Annie-Hélène, and Gerald S. Pollack. "Encoding of Sound Localization Cues by an Identified Auditory Interneuron: Effects of Stimulus Temporal Pattern." Journal of Neurophysiology 88, no. 5 (November 1, 2002): 2322–28. http://dx.doi.org/10.1152/jn.00119.2002.

Abstract:
An important cue for sound localization is binaural comparison of stimulus intensity. Two features of neuronal responses, response strength, i.e., spike count and/or rate, and response latency, vary with stimulus intensity, and binaural comparison of either or both might underlie localization. Previous studies at the receptor-neuron level showed that these response features are affected by the stimulus temporal pattern. When sounds are repeated rapidly, as occurs in many natural sounds, response strength decreases and latency increases, resulting in altered coding of localization cues. In this study we analyze binaural cues for sound localization at the level of an identified pair of interneurons (the left and right AN2) in the cricket auditory system, with emphasis on the effects of stimulus temporal pattern on binaural response differences. AN2 spike count decreases with rapidly repeated stimulation and latency increases. Both effects depend on stimulus intensity. Because of the difference in intensity at the two ears, binaural differences in spike count and latency change as stimulation continues. The binaural difference in spike count decreases, whereas the difference in latency increases. The proportional changes in response strength and in latency are greater at the interneuron level than at the receptor level, suggesting that factors in addition to decrement of receptor responses are involved. Intracellular recordings reveal that a slowly building, long-lasting hyperpolarization is established in AN2. At the same time, the level of depolarization reached during the excitatory postsynaptic potential (EPSP) resulting from each sound stimulus decreases. Neither these effects on membrane potential nor the changes in spiking response are accounted for by contralateral inhibition. Based on comparison of our results with earlier behavioral experiments, it is unlikely that crickets use the binaural difference in latency of AN2 responses as the main cue for determining sound direction, leaving the difference in response strength, i.e., spike count and/or rate, as the most likely candidate.
11

Kimberley, Barry P., Rob Dymond, and Abram Gamer. "Bilateral Digital Hearing Aids for Binaural Hearing." Ear, Nose & Throat Journal 73, no. 3 (March 1994): 176–79. http://dx.doi.org/10.1177/014556139407300311.

Abstract:
The rehabilitation of binaural hearing performance in hearing impaired listeners has received relatively little attention to date. Both localization ability and speech understanding in noise are affected in the impaired listener. When localization performance is tested in impaired ears with conventional hearing aid fittings, it is found to be worse than in the unaided condition. Advances in electronic design now permit speculation about the implementation of complex digital filters within the confines of an in-the-ear hearing aid. We have begun exploring strategies to enhance the localization performance of impaired listeners with bilateral digital signal processing. We are examining three strategies in bilateral hearing aid design to improve localization performance in hearing impaired listeners, namely 1) more accurate fitting of individual ear losses, 2) equalization of the effect of the hearing aid itself on the acoustics within the ear canal, and 3) binaural fitting strategies which in effect modify individual ear fittings to enhance localization performance. The results of early psychophysical testing suggest that localization performance can be improved with these strategies.
12

Sharma, Snandan, Lucas H.M. Mens, Ad F.M. Snik, A. John van Opstal, and Marc M. van Wanrooij. "Hearing Asymmetry Biases Spatial Hearing in Bimodal Cochlear-Implant Users Despite Bilateral Low-Frequency Hearing Preservation." Trends in Hearing 27 (January 2023): 233121652211439. http://dx.doi.org/10.1177/23312165221143907.

Abstract:
Many cochlear implant users with binaural residual (acoustic) hearing benefit from combining electric and acoustic stimulation (EAS) in the implanted ear with acoustic amplification in the other. These bimodal EAS listeners can potentially use low-frequency binaural cues to localize sounds. However, their hearing is generally asymmetric for mid- and high-frequency sounds, perturbing or even abolishing binaural cues. Here, we investigated the effect of a frequency-dependent binaural asymmetry in hearing thresholds on sound localization by seven bimodal EAS listeners. Frequency dependence was probed by presenting sounds with power in low-, mid-, high-, or mid-to-high-frequency bands. Frequency-dependent hearing asymmetry was present in the bimodal EAS listening condition (when using both devices) but was also induced by independently switching devices on or off. Using both devices, hearing was near symmetric for low frequencies, asymmetric for mid frequencies with better hearing thresholds in the implanted ear, and monaural for high frequencies with no hearing in the non-implanted ear. Results show that sound-localization performance was poor in general. Typically, localization was strongly biased toward the better hearing ear. We observed that hearing asymmetry was a good predictor for these biases. Notably, even when hearing was symmetric a preferential bias toward the ear using the hearing aid was revealed. We discuss how frequency dependence of any hearing asymmetry may lead to binaural cues that are spatially inconsistent as the spectrum of a sound changes. We speculate that this inconsistency may prevent accurate sound-localization even after long-term exposure to the hearing asymmetry.
13

Lorens, Artur, Anita Obrycka, Piotr Henryk Skarzynski, and Henryk Skarzynski. "Benefits of Binaural Integration in Cochlear Implant Patients with Single-Sided Deafness and Residual Hearing in the Implanted Ear." Life 11, no. 3 (March 23, 2021): 265. http://dx.doi.org/10.3390/life11030265.

Abstract:
The purpose of the study is to gauge the benefits of binaural integration effects (redundancy and squelch) due to preserved low-frequency residual hearing in the implanted ear of cochlear implant users with single-sided deafness. There were 11 cochlear implant users (age 18–61 years old) who had preserved low-frequency hearing in the implanted ear; they had normal hearing or a mild hearing loss in the contralateral ear. Patients were tested with monosyllabic words, under different spatial locations of speech and noise and with the cochlear implant activated and deactivated, in two listening configurations: one in which low frequencies in the implanted ear were masked and another in which they were unmasked. We also investigated how cochlear implant benefit due to binaural integration depended on unaided sound localization ability. Patients benefited from the binaural integration effects of redundancy and squelch only in the unmasked condition. Pearson correlations between binaural integration effects and unaided sound localization error showed significance only for squelch (r = −0.67; p = 0.02). Hearing preservation after cochlear implantation has considerable benefits because the preserved low-frequency hearing in the implanted ear contributes to binaural integration, presumably through the preserved temporal fine structure.
14

Moeller, Henrik, Dorte Hammershoei, Clemen B. Jensen, and Michael F. Soerensen. "Localization with individual and nonindividual binaural recordings." Journal of the Acoustical Society of America 102, no. 5 (November 1997): 3116. http://dx.doi.org/10.1121/1.420566.

15

Ferber, Maike, Bernhard Laback, and Norbert Kopco. "Vision-induced reweighting of binaural localization cues." Journal of the Acoustical Society of America 143, no. 3 (March 2018): 1813. http://dx.doi.org/10.1121/1.5035942.

16

Willert, V., J. Eggert, J. Adamy, R. Stahl, and E. Korner. "A Probabilistic Model for Binaural Sound Localization." IEEE Transactions on Systems, Man and Cybernetics, Part B (Cybernetics) 36, no. 5 (October 2006): 982–94. http://dx.doi.org/10.1109/tsmcb.2006.872263.

17

Zhong, Xuan, Liang Sun, and William Yost. "Active binaural localization of multiple sound sources." Robotics and Autonomous Systems 85 (November 2016): 83–92. http://dx.doi.org/10.1016/j.robot.2016.07.008.

18

Kumon, Makoto, and Shuji Uozumi. "Binaural Localization for a Mobile Sound Source." Journal of Biomechanical Science and Engineering 6, no. 1 (2011): 26–39. http://dx.doi.org/10.1299/jbse.6.26.

19

Dunn, Camille C., Richard S. Tyler, and Shelley A. Witt. "Benefit of Wearing a Hearing Aid on the Unimplanted Ear in Adult Users of a Cochlear Implant." Journal of Speech, Language, and Hearing Research 48, no. 3 (June 2005): 668–80. http://dx.doi.org/10.1044/1092-4388(2005/046).

Abstract:
The purpose of this investigation was to document performance of participants wearing a cochlear implant and hearing aid in opposite ears on speech-perception and localization tests. Twelve individuals who wore a cochlear implant and a hearing aid on contralateral ears were tested on their abilities to understand words in quiet and sentences in noise, and to localize everyday sounds. All speech stimuli were presented from the front, with the noise stimuli presented from the front, the right, or the left at a 90° angle. Binaural summation in quiet and in noise, binaural squelch effects, and localization were studied to determine bilateral advantages. The magnitude of the monaural head shadow effect (the difference in unilateral performance when noise was facing the unilateral device vs. when the noise was opposite the unilateral device) also was studied. The test setup for localization was composed of an 8-speaker array spanning an arc of approximately 108° in front of each participant. Group results yielded a statistically significant combined benefit of wearing a hearing aid in conjunction with a cochlear implant on opposite ears in noise conditions. Those participants who received a binaural advantage in 1 condition did not necessarily show a binaural advantage in another. Only 2 participants out of 12 were able to localize when wearing 2 devices. Further efforts are required to improve the integration of information from combined use of cochlear implant and hearing aid devices for enhancement of speech perception in noise and localization.
20

Pausch, Florian, and Janina Fels. "Localization Performance in a Binaural Real-Time Auralization System Extended to Research Hearing Aids." Trends in Hearing 24 (January 2020): 233121652090870. http://dx.doi.org/10.1177/2331216520908704.

Abstract:
Auralization systems for auditory research should ideally be validated by perceptual experiments, as well as objective measures. This study employed perceptual tests to evaluate a recently proposed binaural real-time auralization system for hearing aid (HA) users. The dynamic localization of real sound sources was compared with that of virtualized ones, reproduced binaurally over headphones, loudspeakers with crosstalk cancellation (CTC) filters, research HAs, or combined via loudspeakers with CTC filters and research HAs under free-field conditions. System-inherent properties affecting localization cues were identified and their effects on overall horizontal localization, reversal rates, and angular error metrics were assessed. The general localization performance in combined reproduction was found to fall between what was measured for loudspeakers with CTC filters and research HAs alone. Reproduction via research HAs alone resulted in the highest reversal rates and angular errors. While combined reproduction helped decrease the reversal rates, no significant effect was observed on the angular error metrics. However, combined reproduction resulted in the same overall horizontal source localization performance as measured for real sound sources, while improving localization compared with reproduction over research HAs alone. Collectively, the results with respect to combined reproduction can be considered a performance indicator for future experiments involving HA users.
21

Brungart, Douglas S. "Three-Dimensional Auditory Localization of Nearby Sources." Proceedings of the Human Factors and Ergonomics Society Annual Meeting 42, no. 21 (October 1998): 1531–34. http://dx.doi.org/10.1177/154193129804202114.

Abstract:
Although the head-related transfer function (HRTF) is known to change substantially with distance when a source is located within 1 m of the listener's head, very little is known about auditory localization performance in this region. In this experiment, an acoustic point source was used to measure auditory localization accuracy in azimuth, elevation and distance. The overall directional error (16.5°) was similar to that measured in previous localization experiments with more distant sources, although the number of front-back reversals increased when the source was near the head. Distance localization was relatively accurate for lateral sources (stimulus-response correlation r>0.80) and relatively inaccurate for sources near the median plane, indicating that binaural difference cues are important to auditory distance perception for nearby sources. These results suggest that distance-dependent HRTFs measured for nearby sources could be used to provide robust binaural distance information in a virtual audio display.
22

Wu, Xiang, Dumidu S. Talagala, Wen Zhang, and Thushara D. Abhayapala. "Individualized Interaural Feature Learning and Personalized Binaural Localization Model." Applied Sciences 9, no. 13 (June 30, 2019): 2682. http://dx.doi.org/10.3390/app9132682.

Abstract:
The increasing importance of spatial audio technologies has demonstrated the need and importance of correctly adapting to the individual characteristics of the human auditory system, and illustrates the crucial need for humanoid localization systems for testing these technologies. To this end, this paper introduces a novel feature analysis and selection approach for binaural localization and builds a probabilistic localization mapping model, especially useful for localization in the vertical dimension. The approach uses mutual information as a metric to evaluate the most significant frequencies of the interaural phase difference and interaural level difference. Then, by using the random forest algorithm and embedding mutual information as a feature-selection criterion, the feature selection procedures are encoded with the training of the localization mapping. The trained mapping model is capable of using interaural features more efficiently, and, because of the multiple-tree-based model structure, the localization model shows robust performance to noise and interference. By integrating the direct path relative transfer function estimation, we devise a novel localization approach with improved performance in the presence of noise and reverberation. The proposed mapping model is compared with the state-of-the-art manifold learning procedure in different acoustical configurations, and a more accurate and robust output can be observed.
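The general recipe described in this abstract (rank interaural features by mutual information, then train a tree-ensemble mapping from the selected features to direction) can be illustrated with a short scikit-learn sketch on synthetic data. This is only a sketch of the idea; the paper's actual feature extraction, HRTF data, and model details are not reproduced here.

```python
# Illustrative sketch only: mutual-information feature ranking plus a
# random-forest direction classifier, in the spirit of the approach the
# abstract describes. Data, feature definitions, and parameters are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_samples, n_freq_bins = 2000, 64
azimuth_classes = np.arange(-90, 91, 30)          # toy set of directions (deg)
y = rng.choice(len(azimuth_classes), size=n_samples)

# Synthetic "ILD spectra": only some frequency bins actually carry
# direction-dependent information, the rest are noise.
informative = np.arange(20, 40)
X = rng.normal(0.0, 1.0, size=(n_samples, n_freq_bins))
X[:, informative] += np.sin(np.radians(azimuth_classes[y]))[:, None] * 5.0

# 1) Rank frequency bins by mutual information with the direction label.
mi = mutual_info_classif(X, y, random_state=0)
selected = np.argsort(mi)[::-1][:16]              # keep the 16 best bins

# 2) Train a random forest on the selected interaural features.
X_tr, X_te, y_tr, y_te = train_test_split(X[:, selected], y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("selected bins:", np.sort(selected))
print("test accuracy:", clf.score(X_te, y_te))
```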
23

Litovsky, Ruth Y., Brad Rakerd, Tom C. T. Yin, and William M. Hartmann. "Psychophysical and Physiological Evidence for a Precedence Effect in the Median Sagittal Plane." Journal of Neurophysiology 77, no. 4 (April 1, 1997): 2223–26. http://dx.doi.org/10.1152/jn.1997.77.4.2223.

Abstract:
Litovsky, Ruth Y., Brad Rakerd, Tom C. T. Yin, and William M. Hartmann. Psychophysical and physiological evidence for a precedence effect in the median sagittal plane. J. Neurophysiol. 77: 2223–2226, 1997. A listener in a room is exposed to multiple versions of any acoustical event, coming from many different directions in space. The precedence effect is thought to discount the reflected sounds in the computation of location, so that a listener perceives the source near its true location. According to most auditory theories, the precedence effect is mediated by binaural differences. This report presents evidence that the precedence effect operates in the median sagittal plane, where binaural differences are virtually absent and where spectral cues provide information regarding the location of sounds. Parallel studies were conducted in psychophysics by measuring human listeners' performance, and in neurophysiology by measuring responses of single neurons in the inferior colliculus of cats. In both experiments the precedence effect was found to operate similarly in the azimuthal and sagittal planes. It is concluded that precedence is mediated by binaurally based and spectrally based localization cues in the azimuthal and sagittal planes, respectively. Thus, models that attribute the precedence effect entirely to processes that involve binaural differences are no longer viable.
24

Asakura, Takumi. "Bone Conduction Auditory Navigation Device for Blind People." Applied Sciences 11, no. 8 (April 8, 2021): 3356. http://dx.doi.org/10.3390/app11083356.

Abstract:
A navigation system using a binaural bone-conducted sound is proposed. This system has three features to accurately navigate the user to the destination point. First, the selection of the bone-conduction device and the optimal contact conditions between the device and the human head are discussed. Second, the basic performance of sound localization reproduced by the selected bone-conduction device with binaural sounds is confirmed considering the head-related transfer functions (HRTFs) obtained in the air-borne sound field. Here, a panned sound technique that may emphasize the localization of the sound is also validated. Third, to ensure the safety of the navigating person, which is the most important factor in the navigation of a visually impaired person by voice guidance, an appropriate warning sound reproduced by the bone-conduction device is investigated. Finally, based on the abovementioned conditions, we conduct an auditory navigation experiment using bone-conducted guide announcement. The time required to reach the destination of the navigation route is shorter in the case with voice information including the binaural sound reproduction, as compared to the case with only voice information. Therefore, a navigation system using binaural bone-conducted sound is confirmed to be effective.
25

Koch, U., and B. Grothe. "Interdependence of Spatial and Temporal Coding in the Auditory Midbrain." Journal of Neurophysiology 83, no. 4 (April 1, 2000): 2300–2314. http://dx.doi.org/10.1152/jn.2000.83.4.2300.

Abstract:
To date, most physiological studies that investigated binaural auditory processing have addressed the topic rather exclusively in the context of sound localization. However, there is strong psychophysical evidence that binaural processing serves more than only sound localization. This raises the question of how binaural processing of spatial cues interacts with cues important for feature detection. The temporal structure of a sound is one such feature important for sound recognition. As a first approach, we investigated the influence of binaural cues on temporal processing in the mammalian auditory system. Here, we present evidence that binaural cues, namely interaural intensity differences (IIDs), have profound effects on filter properties for stimulus periodicity of auditory midbrain neurons in the echolocating big brown bat, Eptesicus fuscus. Our data indicate that these effects are partially due to changes in strength and timing of binaural inhibitory inputs. We measured filter characteristics for the periodicity (modulation frequency) of sinusoidally frequency modulated sounds (SFM) under different binaural conditions. As criteria, we used 50% filter cutoff frequencies of modulation transfer functions based on discharge rate as well as synchronicity of discharge to the sound envelope. The binaural conditions were contralateral stimulation only, equal stimulation at both ears (IID = 0 dB), and more intense at the ipsilateral ear (IID = −20, −30 dB). In 32% of neurons, the range of modulation frequencies the neurons responded to changed considerably comparing monaural and binaural (IID =0) stimulation. Moreover, in ∼50% of neurons the range of modulation frequencies was narrower when the ipsilateral ear was favored (IID = −20) compared with equal stimulation at both ears (IID = 0). In ∼10% of the neurons synchronization differed when comparing different binaural cues. Blockade of the GABAergic or glycinergic inputs to the cells recorded from revealed that inhibitory inputs were at least partially responsible for the observed changes in SFM filtering. In 25% of the neurons, drug application abolished those changes. Experiments using electronically introduced interaural time differences showed that the strength of ipsilaterally evoked inhibition increased with increasing modulation frequencies in one third of the cells tested. Thus glycinergic and GABAergic inhibition is at least one source responsible for the observed interdependence of temporal structure of a sound and spatial cues.
26

Stevenson-Hoare, Joshua O., Tom C. A. Freeman, and John F. Culling. "The pinna enhances angular discrimination in the frontal hemifield." Journal of the Acoustical Society of America 152, no. 4 (October 2022): 2140–49. http://dx.doi.org/10.1121/10.0014599.

Abstract:
Human sound localization in the horizontal dimension is thought to be dominated by binaural cues, particularly interaural time delays, because monaural localization in this dimension is relatively poor. Remaining ambiguities of front versus back and up versus down are distinguished by high-frequency spectral cues generated by the pinna. The experiments in this study show that this account is incomplete. Using binaural listening throughout, the pinna substantially enhanced horizontal discrimination in the frontal hemifield, making discrimination in front better than discrimination at the rear, particularly for directions away from the median plane. Eliminating acoustic effects of the pinna by acoustically bypassing them or low-pass filtering abolished the advantage at the front without affecting the rear. Acoustic measurements revealed a pinna-induced spectral prominence that shifts smoothly in frequency as sounds move from 0° to 90° azimuth. The improved performance is discussed in terms of the monaural and binaural changes induced by the pinna.
27

Dunai, Larisa, Ismael Lengua, Guillermo Peris-Fajarnés, and Fernando Brusola. "Virtual Sound Localization by Blind People." Archives of Acoustics 40, no. 4 (December 1, 2015): 561–67. http://dx.doi.org/10.1515/aoa-2015-0055.

Abstract:
The paper demonstrates that blind people localize sounds more accurately than sighted people by using monaural and/or binaural cues. In the experiment, blind people participated in two tests; the first one took place in the laboratory and the second one in the real environment under different noise conditions. A simple click sound was employed and processed with non-individual head related transfer functions. The sounds were delivered by a system with a maximum azimuth of 32° to the left side and 32° to the right side of the participant’s head at a distance ranging from 0.3 m up to 5 m. The present paper describes the experimental methods and results of virtual sound localization by blind people through the use of a simple electronic travel aid based on an infrared laser pulse and the time-of-flight distance measurement principle. The lack of vision is often compensated by other perceptual abilities, such as the tactile or hearing ability. The results show that blind people easily perceive and localize binaural sounds and assimilate them with sounds from the environment.
28

Preece, John P., Richard S. Tyler, Jay T. Rubinstein, Bruce J. Gantz, and Richard J. M. van Hoesel. "Localization of sound by binaural cochlear implant users." Journal of the Acoustical Society of America 109, no. 5 (May 2001): 2377. http://dx.doi.org/10.1121/1.4744374.

29

Horihata, Satoshi, Takehiko Katayama, Tetsuo Miyake, and Zhong Zhang. "3-D Sound Localization Using the Binaural Model." Transactions of the Japan Society of Mechanical Engineers Series C 72, no. 723 (2006): 3567–75. http://dx.doi.org/10.1299/kikaic.72.3567.

30

Schauer, Carsten, and Peter Paschke. "A Spike-Based Model of Binaural Sound Localization." International Journal of Neural Systems 9, no. 5 (October 1999): 447–52. http://dx.doi.org/10.1142/s0129065799000460.

Abstract:
This paper describes a spike-based model of binaural sound localization using interaural time differences (ITDs). To handle the problem of temporal coding and to facilitate a hardware implementation, all neurons are simulated by a spike response model, which includes postsynaptic potentials (PSPs) and a refractory period. A winner-take-all (WTA) network selects the dominant source from the representation of the sound's angles of incidence, and can be biased by multisensory support. We use simulations on real audio data to investigate the function and the practical application of the system.
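As a loose, non-spiking analogue of the ITD-plus-winner-take-all scheme this abstract describes, the sketch below estimates the interaural time difference by cross-correlating the left and right signals and then takes the best-matching azimuth with a simple arg-max over candidate directions. The sample rate, inter-ear distance, and source angle are assumptions for the demo, and no spike response model is simulated.

```python
# Loose, non-spiking analogue of an ITD-based localizer with a
# winner-take-all readout. Parameters below are illustrative assumptions.
import numpy as np

fs = 44100                 # sample rate (Hz)
c = 343.0                  # speed of sound (m/s)
ear_distance = 0.18        # approximate inter-ear distance (m)

def simulate_binaural(azimuth_deg, duration=0.1):
    """Delay a noise burst between the ears according to a simple ITD model."""
    itd = ear_distance * np.sin(np.radians(azimuth_deg)) / c
    n = int(duration * fs)
    src = np.random.default_rng(1).normal(size=n)
    shift = int(round(itd * fs))                  # positive -> right ear lags
    left, right = src.copy(), np.roll(src, shift)
    return left, right

def localize(left, right, candidates=np.arange(-90, 91, 5)):
    """Cross-correlate the ears, then winner-take-all over candidate azimuths."""
    max_lag = int(ear_distance / c * fs) + 1
    lags = np.arange(-max_lag, max_lag + 1)
    xcorr = np.array([np.dot(left, np.roll(right, -lag)) for lag in lags])
    best_itd = lags[np.argmax(xcorr)] / fs
    # Map each candidate azimuth to its predicted ITD and pick the closest.
    predicted = ear_distance * np.sin(np.radians(candidates)) / c
    return candidates[np.argmin(np.abs(predicted - best_itd))]

left, right = simulate_binaural(azimuth_deg=30)
print("estimated azimuth:", localize(left, right), "degrees")
```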
31

Musicant, Alan D., and Robert A. Butler. "Influence of monaural spectral cues on binaural localization." Journal of the Acoustical Society of America 77, no. 1 (January 1985): 202–8. http://dx.doi.org/10.1121/1.392259.

32

Katayama, Takehiko, Shuushi Makita, Satoshi Horihata, and Tetuwo Miyake. "3-D Sound Localization Using the Binaural Model." Proceedings of Conference of Tokai Branch 2004.53 (2004): 223–24. http://dx.doi.org/10.1299/jsmetokai.2004.53.223.

33

Butler, Robert A. "The bandwidth effect on monaural and binaural localization." Hearing Research 21, no. 1 (January 1986): 67–73. http://dx.doi.org/10.1016/0378-5955(86)90047-x.

34

Wang, Jie, Xikun Lu, Jinqiu Sang, Juanjuan Cai, and Chengshi Zheng. "Effects of Stimulation Position and Frequency Band on Auditory Spatial Perception with Bilateral Bone Conduction." Trends in Hearing 26 (January 2022): 233121652210971. http://dx.doi.org/10.1177/23312165221097196.

Abstract:
Virtual sound localization tests were conducted to examine the effects of stimulation position (mastoid, condyle, supra-auricular, temple, and bone-anchored hearing aid implant position) and frequency band (low frequency, high frequency, and broadband) on bone-conduction (BC) horizontal localization. Non-individualized head-related transfer functions were used to reproduce virtual sound through bilateral BC transducers. Subjective experiments showed that stimulation at the mastoid gave the best performance while the temple gave the worst performance in localization. Stimulation at the mastoid and condyle did not differ significantly from that using air-conduction (AC) headphones in localization accuracy. However, binaural reproduction at all BC stimulation positions led to similar levels of front-back confusion (FBC), which were also comparable to that with AC headphones. Binaural BC reproduction with high-frequency stimulation led to significantly higher localization accuracy than with low-frequency stimulation. When transcranial attenuation (TA) was measured, the attenuation became larger at the condyle and mastoid, and increased at high frequencies. The experiments imply that larger TAs may improve localization accuracy but do not improve FBC. The present study indicates that the BC stimulation at the mastoid and condyle can effectively convey spatial information, especially with high-frequency stimulation.
35

Hartley, Douglas E. H., Johannes C. Dahmen, Andrew J. King, and Jan W. H. Schnupp. "Binaural sensitivity changes between cortical on and off responses." Journal of Neurophysiology 106, no. 1 (July 2011): 30–43. http://dx.doi.org/10.1152/jn.01070.2010.

Abstract:
Neurons exhibiting on and off responses with different frequency tuning have previously been described in the primary auditory cortex (A1) of anesthetized and awake animals, but it is unknown whether other tuning properties, including sensitivity to binaural localization cues, also differ between on and off responses. We measured the sensitivity of A1 neurons in anesthetized ferrets to 1) interaural level differences (ILDs), using unmodulated broadband noise with varying ILDs and average binaural levels, and 2) interaural time delays (ITDs), using sinusoidally amplitude-modulated broadband noise with varying envelope ITDs. We also assessed fine-structure ITD sensitivity and frequency tuning, using pure-tone stimuli. Neurons most commonly responded to stimulus onset only, but purely off responses and on-off responses were also recorded. Of the units exhibiting significant binaural sensitivity nearly one-quarter showed binaural sensitivity in both on and off responses, but in almost all (∼97%) of these units the binaural tuning of the on responses differed significantly from that seen in the off responses. Moreover, averaged, normalized ILD and ITD tuning curves calculated from all units showing significant sensitivity to binaural cues indicated that on and off responses displayed different sensitivity patterns across the population. A principal component analysis of ITD response functions suggested a continuous cortical distribution of binaural sensitivity, rather than discrete response classes. Rather than reflecting a release from inhibition without any functional significance, we propose that binaural off responses may be important to cortical encoding of sound-source location.
36

Zeitler, Daniel, and Michael Dorman. "Cochlear Implantation for Single-Sided Deafness: A New Treatment Paradigm." Journal of Neurological Surgery Part B: Skull Base 80, no. 02 (February 4, 2019): 178–86. http://dx.doi.org/10.1055/s-0038-1677482.

Abstract:
Unilateral severe-to-profound sensorineural hearing loss (SNHL), also known as single-sided deafness (SSD), is a problem that affects both children and adults, and can have severe and detrimental effects on multiple aspects of life including music appreciation, speech understanding in noise, speech and language acquisition, performance in the classroom and/or the workplace, and quality of life. Additionally, the loss of binaural hearing in SSD patients affects those processes that rely on two functional ears including sound localization, binaural squelch and summation, and the head shadow effect. Over the last decade, there has been increasing interest in cochlear implantation for SSD to restore binaural hearing. Early data are promising: cochlear implantation for SSD can help restore binaural functionality, improve quality of life, and may facilitate reversal of neuroplasticity related to auditory deprivation in the pediatric population. Additionally, this new patient population has allowed researchers the opportunity to investigate the age-old question “what does a cochlear implant (CI) sound like?”
37

Liu, Hong, Yongheng Sun, Ge Yang, and Yang Chen. "Binaural sound source localization based on weighted template matching." CAAI Transactions on Intelligence Technology 6, no. 2 (March 10, 2021): 214–23. http://dx.doi.org/10.1049/cit2.12009.

38

Zhou, Lin, Kangyu Ma, Lijie Wang, Ying Chen, and Yibin Tang. "Binaural Sound Source Localization Based on Convolutional Neural Network." Computers, Materials & Continua 60, no. 2 (2019): 545–57. http://dx.doi.org/10.32604/cmc.2019.05969.

39

Schillebeeckx, Filips, Fons De Mey, Dieter Vanderelst, and Herbert Peremans. "Biomimetic Sonar: Binaural 3D Localization using Artificial Bat Pinnae." International Journal of Robotics Research 30, no. 8 (September 24, 2010): 975–87. http://dx.doi.org/10.1177/0278364910380474.

40

Zhang, Zhong, Kazuaki I, Tetsuo Miyake, Takashi Imamura, and Satoshi Horihata. "Localization of Sound Source Direction Using the Binaural Model." Transactions of the Japan Society of Mechanical Engineers Series C 74, no. 739 (2008): 642–49. http://dx.doi.org/10.1299/kikaic.74.642.

41

Zhou, Yi, Leslie Balderas, and Emily Jo Venskytis. "Binaural ambiguity amplifies visual bias in sound source localization." Journal of the Acoustical Society of America 144, no. 6 (December 2018): 3118–23. http://dx.doi.org/10.1121/1.5079568.

42

Hartmann, William M., Zachary A. Constan, and Brad Rakerd. "Binaural coherence and the localization of sound in rooms." Journal of the Acoustical Society of America 103, no. 5 (May 1998): 3081. http://dx.doi.org/10.1121/1.422904.

43

Colburn, H. Steven. "Recent advances in models of binaural detection and localization." Journal of the Acoustical Society of America 105, no. 2 (February 1999): 964. http://dx.doi.org/10.1121/1.425283.

44

Hofman, P., and A. Van Opstal. "Binaural weighting of pinna cues in human sound localization." Experimental Brain Research 148, no. 4 (February 2003): 458–70. http://dx.doi.org/10.1007/s00221-002-1320-5.

45

Horihata, Satoshi, Shuushi Makita, and Tetuwo Miyake. "105 3-D Sound Localization Using the Binaural Model." Proceedings of the Symposium on Evaluation and Diagnosis 2005.4 (2005): 25–28. http://dx.doi.org/10.1299/jsmesed.2005.4.25.

46

Zhang, Zhong, Masahiro Nakatani, Takashi Imamura, and Tetsuo Miyake. "Real World Sound Localization by Using the Binaural Model." Transactions of the JSME (in Japanese) 81, no. 827 (2015): 15–00026. http://dx.doi.org/10.1299/transjsme.15-00026.

47

Macpherson, Ewan A., and Andrew T. Sabin. "Binaural weighting of monaural spectral cues for sound localization." Journal of the Acoustical Society of America 121, no. 6 (2007): 3677. http://dx.doi.org/10.1121/1.2722048.

48

Parisi, Raffaele, Flavia Camoes, Michele Scarpiniti, and Aurelio Uncini. "Cepstrum Prefiltering for Binaural Source Localization in Reverberant Environments." IEEE Signal Processing Letters 19, no. 2 (February 2012): 99–102. http://dx.doi.org/10.1109/lsp.2011.2180376.

49

Burke, Kimberly A., Anna Letsos, and Robert A. Butler. "Asymmetric performances in binaural localization of sound in space." Neuropsychologia 32, no. 11 (November 1994): 1409–17. http://dx.doi.org/10.1016/0028-3932(94)00074-3.

50

Razak, K. A., Z. M. Fuzessery, and T. D. Lohuis. "Single Cortical Neurons Serve Both Echolocation and Passive Sound Localization." Journal of Neurophysiology 81, no. 3 (March 1, 1999): 1438–42. http://dx.doi.org/10.1152/jn.1999.81.3.1438.

Abstract:
Single cortical neurons serve both echolocation and passive sound localization. The pallid bat uses passive listening at low frequencies to detect and locate terrestrial prey and reserves its high-frequency echolocation for general orientation. While hunting, this bat must attend to both streams of information. These streams are processed through two parallel, functionally specialized pathways that are segregated at the level of the inferior colliculus. This report describes functionally bimodal neurons in auditory cortex that receive converging input from these two pathways. Each brain stem pathway imposes its own suite of response properties on these cortical neurons. Consequently, the neurons are bimodally tuned to low and high frequencies, and respond selectively to both noise transients used in prey detection, and downward frequency modulation (FM) sweeps used in echolocation. A novel finding is that the monaural and binaural response properties of these neurons can change as a function of the sound presented. The majority of neurons appeared binaurally inhibited when presented with noise but monaural or binaurally facilitated when presented with the echolocation pulse. Consequently, their spatial sensitivity will change, depending on whether the bat is engaged in echolocation or passive listening. These results demonstrate that the response properties of single cortical neurons can change with behavioral context and suggest that they are capable of supporting more than one behavior.