Selected scientific literature on the topic "Auditory spatial perception"


Consult the list of current articles, books, theses, conference proceedings, and other scientific sources relevant to the topic "Auditory spatial perception".

Next to each source in the reference list you will find an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic citation of the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scientific publication in .pdf format and read the abstract (summary) of the work online if it is included in the metadata.

Journal articles on the topic "Auditory spatial perception":

1

Recanzone, Gregg H. "Auditory Influences on Visual Temporal Rate Perception". Journal of Neurophysiology 89, no. 2 (February 1, 2003): 1078–93. http://dx.doi.org/10.1152/jn.00706.2002.

Abstract:
Visual stimuli are known to influence the perception of auditory stimuli in spatial tasks, giving rise to the ventriloquism effect. These influences can persist in the absence of visual input following a period of exposure to spatially disparate auditory and visual stimuli, a phenomenon termed the ventriloquism aftereffect. It has been speculated that the visual dominance over audition in spatial tasks is due to the superior spatial acuity of vision compared with audition. If that is the case, then the auditory system should dominate visual perception in a manner analogous to the ventriloquism effect and aftereffect if one uses a task in which the auditory system has superior acuity. To test this prediction, the interactions of visual and auditory stimuli were measured in a temporally based task in normal human subjects. The results show that the auditory system has a pronounced influence on visual temporal rate perception. This influence was independent of the spatial location, spectral bandwidth, and intensity of the auditory stimulus. The influence was, however, strongly dependent on the disparity in temporal rate between the two stimulus modalities. Further, aftereffects were observed following approximately 20 min of exposure to temporally disparate auditory and visual stimuli. These results show that the auditory system can strongly influence visual perception and are consistent with the idea that bimodal sensory conflicts are dominated by the sensory system with the greater acuity for the stimulus parameter being discriminated.
2

Haas, Ellen C. "Auditory Perception". Proceedings of the Human Factors Society Annual Meeting 36, no. 3 (October 1992): 247. http://dx.doi.org/10.1518/107118192786751817.

Abstract:
Auditory perception involves the human listener's awareness or apprehension of auditory stimuli in the environment. Auditory stimuli, which include speech communications as well as non-speech signals, occur in the presence and absence of environmental noise. Non-speech auditory signals range from simple pure tones to complex signals found in three-dimensional auditory displays. Special hearing protection device (HPD) designs, as well as additions to conventional protectors, have been developed to improve speech communication and auditory perception capabilities of those exposed to noise. The thoughtful design of auditory stimuli and the proper design, selection, and use of HPDs within the environment can improve human performance and reduce accidents. The purpose of this symposium will be to discuss issues in auditory perception and to describe methods to improve the perception of auditory stimuli in environments with and without noise. The issues of interest include the perception of non-speech auditory signals and the improvement of auditory perception capabilities of persons exposed to noise. The first three papers of this symposium describe the perception of non-speech auditory signals. Ellen Haas defines the extent to which certain signal elements affect the perceived urgency of auditory warning signals. Michael D. Good and Dr. Robert H. Gilkey investigate free-field masking as a function of the spatial separation between signal and masker sounds within the horizontal and median planes. Jeffrey M. Gerth explores the discrimination of complex auditory signal components that differ by sound category, temporal pattern, density, and component manipulation. The fourth paper of this symposium focuses upon the improvement of auditory perception capabilities of persons exposed to hazardous noise, and who must wear hearing protection. 
Special HPD designs, as well as additions to conventional protectors, have been developed to improve speech communication and auditory perception capabilities of persons exposed to noise. Dr. John G. Casali reviews several new HPD technologies and describes construction features, empirical performance data, and applications of each device. These papers illustrate current research issues in the perception of auditory signals. The issues are all relevant to the human factors engineering of auditory signals and personal protective gear. The perception of auditory stimuli can be improved by the thoughtful human factors design of auditory stimuli and by the proper use of HPDs.
3

Best, Virginia, Jorg M. Buchholz, and Tobias Weller. "Measuring auditory spatial perception in realistic environments". Journal of the Acoustical Society of America 141, no. 5 (May 2017): 3692. http://dx.doi.org/10.1121/1.4988040.

4

Lau, Bonnie K., Tanya St. John, Annette Estes, and Stephen Dager. "Auditory processing in neurodiverse children". Journal of the Acoustical Society of America 155, no. 3_Supplement (March 1, 2024): A75. http://dx.doi.org/10.1121/10.0026855.

Abstract:
Many neurodiverse individuals experience auditory processing differences including hyper- or hyposensitivity to sound, attraction or aversions to sound, and difficulty listening under noisy conditions. However, the origins of these auditory symptoms are not well understood. In this study, we tested 7-to-10-year-old autistic children and age and sex-matched neurotypical comparison participants. To simulate a realistic classroom situation where many people are often speaking simultaneously, we obtained neural and behavioral measures of speech perception in both quiet and noise conditions. Using electroencephalography, we recorded neural responses to naturalistic, continuous speech to assess the cortical encoding of the speech envelope. We also obtained behavioral multitalker speech perception thresholds and estimates of spatial release from masking, a binaural hearing phenomenon in which speech perception improves when distracting speakers are spatially separated from the target speaker. Our preliminary results from both neural and behavioral measures suggest that the autistic group shows worse speech perception in noise and less spatial release from masking than the neurotypical group. These findings suggest that autistic children may benefit from environments with reduced noise to facilitate speech perception. These findings also warrant further investigation into speech perception under real-world conditions and the neural mechanisms underlying sound processing in autistic children.
5

Peng, Z. Ellen. "School-age children show poor use of spatial cues in reverberation for speech-in-speech perception". Journal of the Acoustical Society of America 151, no. 4 (April 2022): A169. http://dx.doi.org/10.1121/10.0011001.

Abstract:
Understanding speech is particularly difficult for children when there is competing speech in the background. When the target and masker talkers are spatially separated, as compared to co-located, the access to corresponding auditory spatial cues can provide release from masking, resulting in an intelligibility gain for speech-in-speech perception. When tested in free-field environments, previous work showed that children demonstrate adult-like spatial release from masking (SRM) by 9–10 years of age. However, in indoor environments where most critical communications occur such as classrooms, reverberation distorts the critical auditory spatial cues that lead to reduced SRM among adults. Little is known about how children process distorted auditory spatial cues for SRM to aid speech-in-speech perception. In this work, we measure SRM in children in virtual reverberant environments that mimic typical learning spaces. We show free-field measurements overestimate SRM maturation in realistic indoor environments. Children show a more protracted development of SRM in reverberation, likely due to immaturities in using distorted auditory spatial cues.
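Spatial release from masking (SRM), the quantity measured in the study above, is conventionally reported as the difference between speech reception thresholds (SRTs) in the co-located and spatially separated configurations. The following minimal sketch illustrates that definition only; the threshold values used are hypothetical and are not data from the study.

```python
def spatial_release_from_masking(srt_colocated_db, srt_separated_db):
    """SRM in dB: the intelligibility gain obtained when maskers are
    spatially separated from the target talker. A positive value means
    separation improved the speech reception threshold (SRT)."""
    return srt_colocated_db - srt_separated_db

# Hypothetical SRTs in dB target-to-masker ratio (lower is better):
srm = spatial_release_from_masking(srt_colocated_db=-2.0, srt_separated_db=-8.0)
print(f"SRM = {srm:.1f} dB")
```

Under this convention, reverberation that distorts binaural cues shrinks the separated-condition advantage and therefore reduces the computed SRM.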
6

Koohi, Nehzat, Gilbert Thomas-Black, Paola Giunti, and Doris-Eva Bamiou. "Auditory Phenotypic Variability in Friedreich’s Ataxia Patients". Cerebellum 20, no. 4 (February 18, 2021): 497–508. http://dx.doi.org/10.1007/s12311-021-01236-9.

Abstract:
Auditory neural impairment is a key clinical feature of Friedreich’s Ataxia (FRDA). We aimed to characterize the phenotypical spectrum of the auditory impairment in FRDA in order to facilitate early identification and timely management of auditory impairment in FRDA patients and to explore the relationship between the severity of auditory impairment and genetic variables (the expansion size of GAA trinucleotide repeats, GAA1 and GAA2), when controlled for variables such as disease duration, severity of the disease and cognitive status. Twenty-seven patients with genetically confirmed FRDA underwent baseline audiological assessment (pure-tone audiometry, otoacoustic emissions, auditory brainstem response). Twenty of these patients had additional psychophysical auditory processing evaluation including an auditory temporal processing test (gaps in noise test) and a binaural speech perception test that assesses spatial processing (Listening in Spatialized Noise-Sentences Test). Auditory spatial and auditory temporal processing ability were significantly associated with the repeat length of GAA1. Patients with GAA1 greater than 500 repeats had more severe auditory temporal and spatial processing deficits, leading to poorer speech perception. Furthermore, the spatial processing ability was strongly correlated with the Montreal Cognitive Assessment (MoCA) score. To our knowledge, this is the first study to demonstrate an association between genotype and auditory spatial processing phenotype in patients with FRDA. Auditory temporal processing, neural sound conduction, spatial processing and speech perception were more severely affected in patients with GAA1 greater than 500 repeats. The results of our study may indicate that auditory deprivation plays a role in the development of mild cognitive impairment in FRDA patients.
7

Cui, Qi N., Babak Razavi, William E. O'Neill, and Gary D. Paige. "Perception of Auditory, Visual, and Egocentric Spatial Alignment Adapts Differently to Changes in Eye Position". Journal of Neurophysiology 103, no. 2 (February 2010): 1020–35. http://dx.doi.org/10.1152/jn.00500.2009.

Abstract:
Vision and audition represent the outside world in spatial synergy that is crucial for guiding natural activities. Input conveying eye-in-head position is needed to maintain spatial congruence because the eyes move in the head while the ears remain head-fixed. Recently, we reported that the human perception of auditory space shifts with changes in eye position. In this study, we examined whether this phenomenon is 1) dependent on a visual fixation reference, 2) selective for frequency bands (high-pass and low-pass noise) related to specific auditory spatial channels, 3) matched by a shift in the perceived straight-ahead (PSA), and 4) accompanied by a spatial shift for visual and/or bimodal (visual and auditory) targets. Subjects were tested in a dark echo-attenuated chamber with their heads fixed facing a cylindrical screen, behind which a mobile speaker/LED presented targets across the frontal field. Subjects fixated alternating reference spots (0, ±20°) horizontally or vertically while either localizing targets or indicating PSA using a laser pointer. Results showed that the spatial shift induced by ocular eccentricity is 1) preserved for auditory targets without a visual fixation reference, 2) generalized for all frequency bands, and thus all auditory spatial channels, 3) paralleled by a shift in PSA, and 4) restricted to auditory space. Findings are consistent with a set-point control strategy by which eye position governs multimodal spatial alignment. The phenomenon is robust for auditory space and egocentric perception, and highlights the importance of controlling for eye position in the examination of spatial perception and behavior.
8

Strybel, Thomas Z. "Auditory Spatial Information and Head-Coupled Display Systems". Proceedings of the Human Factors Society Annual Meeting 32, no. 2 (October 1988): 75. http://dx.doi.org/10.1177/154193128803200215.

Abstract:
Developments of head-coupled control/display systems have focused primarily on the display of three-dimensional visual information, as the visual system is the optimal sensory channel for the acquisition of spatial information in humans. The auditory system improves the efficiency of vision, however, by obtaining spatial information about relevant objects outside of the visual field of view. This auditory information can be used to direct head and eye movements. Head-coupled display systems can also benefit from the addition of auditory spatial information, as it provides a natural method of signaling the location of important events outside of the visual field of view. This symposium will report on current efforts in the development of head-coupled display systems, with an emphasis on the auditory spatial component. The first paper, “Virtual Interface Environment Workstations” by Scott S. Fisher, will report on the development of a prototype virtual environment. This environment consists of a head-mounted, wide-angle, stereoscopic display system which is controlled by operator position, voice, and gesture. With this interface, an operator can virtually explore a 360 degree synthesized environment and viscerally interact with its components. The second paper, “A Virtual Display System For Conveying Three-Dimensional Acoustic Information” by Elizabeth M. Wenzel, Frederic L. Wightman and Scott H. Foster, will report on the development of a method of synthetically generating three-dimensional sound cues for the above-mentioned interface. The development of simulated auditory spatial cues is limited to some extent by our knowledge of auditory spatial processing. The remaining papers will report on two areas of auditory space perception that have received little attention until recently. “Perception of Real and Simulated Motion in the Auditory Modality”, by Thomas Z. Strybel, will review recent research on auditory motion perception, because a natural acoustic environment must contain moving sounds. This review will consider applications of this knowledge to head-coupled display systems. The last paper, “Auditory Psychomotor Coordination”, will examine the interplay between the auditory, visual and motor systems. The specific emphasis of this paper is the use of auditory spatial information in the regulation of motor responses so as to provide efficient application of the visual channel.
9

Upadhya, Sushmitha, Rohit Bhattacharyya, Ritwik Jargar, and K. Nisha Venkateswaran. "Closed-field Auditory Spatial Perception and Its Relationship to Musical Aptitude". Journal of Indian Speech Language & Hearing Association 37, no. 2 (2023): 61–65. http://dx.doi.org/10.4103/jisha.jisha_20_23.

Abstract:
Introduction: Musical aptitude is the innate ability of an individual to understand, appreciate, improvise, and have a good sense of pitch and rhythm, even without undergoing formal musical training. The present study aimed to understand the effect of musical aptitude on auditory spatial perception. Method: Forty nonmusicians were subjected to a musical aptitude test Mini Profile of Music Perception Skills (Mini-PROMS) based on which they were divided into two groups. Group I included 20 nonmusicians with good musical aptitude (NM-GA) and Group II comprised 20 nonmusicians with poor musical aptitude (NM-PA). Auditory spatial tests included interaural time difference (ITD) and interaural level difference (ILD) threshold tests and a closed-field spatial test called virtual acoustic space identification (VASI) test. Results: Kruskal–Wallis test revealed a significant difference between Group I (NM-GA) and Group II (NM-PA) in ITD (p < 0.001), ILD (p = 0.002), and VASI (p = 0.012) tests, suggesting the role of musical aptitude in auditory spatial perception. Correlational analyses showed a moderate positive correlation between Mini-PROMS scores with VASI (r = 0.31, p = 0.04) and a moderate negative correlation with ILD (r = −0.3, p = 0.04) and ITD (r = −0.5, p = 0.001). Conclusion: This study defines a positive association between musical aptitude and auditory spatial perception. Further research should include a comparison of spatial skills among musicians and nonmusicians with good musical aptitude.
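The interaural time difference (ITD) and interaural level difference (ILD) thresholds measured in the study above are probed with binaural stimuli in which one ear's signal is delayed and attenuated relative to the other. The sketch below shows, in simplified form, how such a stimulus pair can be constructed; the sample rate, tone frequency, and ITD/ILD values are illustrative assumptions, not parameters from the study.

```python
import numpy as np

def apply_itd_ild(signal, fs, itd_s, ild_db):
    """Return a (left, right) channel pair in which the right channel is
    delayed by itd_s seconds (rounded to whole samples) and attenuated by
    ild_db decibels relative to the left channel."""
    delay = int(round(itd_s * fs))                       # ITD in samples
    right = np.concatenate([np.zeros(delay), signal])[: len(signal)]
    right *= 10 ** (-ild_db / 20)                        # ILD attenuation
    return signal.copy(), right

fs = 44100                                   # assumed sample rate (Hz)
t = np.arange(int(0.1 * fs)) / fs
tone = np.sin(2 * np.pi * 500 * t)           # 500-Hz probe tone, 100 ms
left, right = apply_itd_ild(tone, fs, itd_s=500e-6, ild_db=6.0)
```

In an adaptive threshold procedure, `itd_s` or `ild_db` would be varied across trials until the listener can just detect the lateralization of the image.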
10

Terrence, Peter I., J. Christopher Brill, and Richard D. Gilson. "Body Orientation and the Perception of Spatial Auditory and Tactile Cues". Proceedings of the Human Factors and Ergonomics Society Annual Meeting 49, no. 17 (September 2005): 1663–67. http://dx.doi.org/10.1177/154193120504901735.

Abstract:
This study investigated the effects of five body orientations (supine, kneeling, sitting, standing, and prone) on perception of spatial auditory and spatial tactile cues along eight equidistant points (45° separation) of the azimuth, using a within-participant design. Participants (N = 30) used a graphics tablet and stylus to indicate the perceived direction indicated by either vibrotactile stimuli applied to the abdomen, or spatial auditory stimuli presented via headphones. Response time data show responses to spatial tactile cues were significantly faster than spatial auditory cues at each body position and for each point along the azimuth, with no significant effects of body orientation. Absolute angle differences between presented and perceived cues were significantly smaller in the tactile condition for five of the eight stimulus positions, with no significant effects of body orientation. Results are discussed in terms of designing multi-sensory directional cues for reducing visual search space for dismounted soldiers.

Theses on the topic "Auditory spatial perception":

1

Keating, Peter. "Plasticity and integration of auditory spatial cues". Thesis, University of Oxford, 2011. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.561113.

Abstract:
Although there is extensive evidence that auditory spatial processing can adapt to changes in auditory spatial cues both in infancy and adulthood, the mechanisms underlying adaptation appear to differ across species. Whereas barn owls compensate for unilateral hearing loss throughout development by learning abnormal mappings between cue values and spatial position, adult mammals seem to adapt by ignoring the acoustical input available to the affected ear and learning to rely more on unaltered spatial cues. To investigate these differences further, ferrets were raised with a unilateral earplug and their ability to localize sounds was assessed. Although these animals did not fully compensate for the effects of an earplug, they performed considerably better than animals that experienced an earplug for the first time, indicating that adaptation had taken place. We subsequently found that juvenile-plugged (JP) ferrets learned to adjust both cue mappings and weights in response to changes in acoustical input, with the nature of these changes reflecting the expected reliability of different cues. Thus, the auditory system may be able to rapidly update the way in which individual cues are processed, as well as the way in which different cues are integrated, thereby enabling spatial cues to be processed in a context-specific way. In attempting to understand the mechanisms that guide plasticity of spatial hearing, previous studies have raised the possibility that changes in auditory spatial processing may be driven by mechanisms intrinsic to the auditory system. To address this possibility directly, we measured the sensitivity of human subjects to ITDs and ILDs following transient misalignment of these cues. We found that this induces a short-term recalibration that acts to compensate for the effects of cue misalignment. These changes occurred in the absence of error feedback, suggesting that mutual recalibration can occur between auditory spatial cues.
The nature of these changes, however, was consistent with models of cue integration, suggesting that plasticity and integration may be inextricably linked. Throughout the course of this work, it became clear that future investigations would benefit from the application of closed-field techniques to the ferret. For this reason, we developed and validated methods that enable stimuli to be presented to ferrets over earphones, and used these methods to assess ITD and ILD sensitivity in ferrets using a variety of different stimuli. We found that the Duplex theory is able to account for binaural spatial sensitivity in these animals, and that sensitivity is comparable with that found in humans, thereby confirming the ferret as an excellent model for understanding binaural spatial hearing.
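The Duplex theory mentioned above holds that low-frequency sounds are localized mainly by interaural time differences and high-frequency sounds mainly by interaural level differences. A classic way to predict the ITD available for a given source azimuth is Woodworth's spherical-head approximation; the sketch below implements it with a nominal human head radius and speed of sound, purely for illustration (these values are assumptions, not measurements from the thesis).

```python
import math

def woodworth_itd(azimuth_deg, head_radius=0.0875, c=343.0):
    """Woodworth's spherical-head approximation of the interaural time
    difference in seconds: ITD = (a / c) * (theta + sin(theta)), where
    a is the head radius (m), c the speed of sound (m/s), and theta the
    source azimuth in radians. Nominal values, not subject-specific."""
    theta = math.radians(azimuth_deg)
    return (head_radius / c) * (theta + math.sin(theta))

# ITD grows monotonically from zero at the midline toward the side
for az in (0, 30, 60, 90):
    print(f"{az:3d} deg -> {woodworth_itd(az) * 1e6:6.1f} us")
```

For a smaller head radius, as in the ferret, the same formula predicts a proportionally smaller maximum ITD, which is one reason species differ in how heavily they can weight this cue.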
2

Geeseman, Joseph W. "The influence of auditory cues on visual spatial perception". OpenSIUC, 2010. https://opensiuc.lib.siu.edu/theses/286.

Abstract:
Traditional psychophysical studies have been primarily unimodal experiments due to the ease in which a single sense can be isolated in a laboratory setting. This study, however, presents participants with auditory and visual stimuli to better understand the interaction of the two senses in visuospatial perception. Visual stimuli, presented as Gaussian distributed blobs, moved laterally across a computer monitor to a central location and "bounced" back to their starting position. During this passage across the screen, a brief auditory "click" was presented via headphones. Participants were asked to respond to the bounce of the ball, and response latency was recorded. Response latency to the bounce position varied as a function of baseline (no sound) and the varying sound offset locations.
3

Griffiths, Shaaron S. "Spatial and temporal disparities in aurally aided visual search". Deakin University. School of Psychology, 2001. http://tux.lib.deakin.edu.au./adt-VDU/public/adt-VDU20061207.134032.

Abstract:
Research over the last decade has shown that auditorily cuing the location of visual targets reduces the time taken to locate and identify targets for both free-field and virtually presented sounds. The first study conducted for this thesis confirmed these findings over an extensive region of free-field space. However, the number of sound locations that are measured and stored in the data library of most 3-D audio spatial systems is limited, so that there is often a discrepancy in position between the cued and physical location of the target. Sampling limitations in the systems also produce temporal delays in which the stored data can be conveyed to operators. To investigate the effects of spatial and temporal disparities in audio cuing of visual search, and to provide evidence to alleviate concerns that psychological research lags behind the capabilities to design and implement synthetic interfaces, experiments were conducted to examine (a) the magnitude of spatial separation, and (b) the duration of temporal delay that intervened between auditory spatial cues and visual targets to alter response times to locate targets and discriminate their shape, relative to when the stimuli were spatially aligned, and temporally synchronised, respectively. Participants listened to free-field sound localisation cues that were presented with a single, highly visible target that could appear anywhere across 360° of azimuthal space on the vertical mid-line (spatial separation), or extended to 45° above and below the vertical mid-line (temporal delay). A vertical or horizontal spatial separation of 40° between the stimuli significantly increased response times, while separations of 30° or less did not reach significance. Response times were slowed at most target locations when auditory cues occurred 770 msecs prior to the appearance of targets, but not with similar durations of temporal delay (i.e., 440 msecs or less). 
When sounds followed the appearance of targets, the stimulus onset asynchrony that affected response times was dependent on target location, and ranged from 440 msecs at higher elevations and rearward of participants, to 1,100 msecs on the vertical mid-line. If targets appeared in the frontal field of view, no delay of acoustical stimulation affected performance. Finally, when conditions of spatial separation and temporal delay were combined, visual search times were degraded with a shorter stimulus onset asynchrony than when only the temporal relationship between the stimuli was varied, but responses to spatial separation were unaffected. The implications of the results for the development of synthetic audio spatial systems to aid visual search tasks were discussed.
4

Elias, Bartholomew. "Cross-modal facilitation of spatial frequency discriminations through auditory frequency cue presentations". Thesis, Georgia Institute of Technology, 1991. http://hdl.handle.net/1853/28611.

5

Best, Virginia Ann. "Spatial Hearing with Simultaneous Sound Sources: A Psychophysical Investigation". University of Sydney. Medicine, 2004. http://hdl.handle.net/2123/576.

Abstract:
This thesis provides an overview of work conducted to investigate human spatial hearing in situations involving multiple concurrent sound sources. Much is known about spatial hearing with single sound sources, including the acoustic cues to source location and the accuracy of localisation under different conditions. However, more recently interest has grown in the behaviour of listeners in more complex environments. Concurrent sound sources pose a particularly difficult problem for the auditory system, as their identities and locations must be extracted from a common set of sensory receptors and shared computational machinery. It is clear that humans have a rich perception of their auditory world, but just how concurrent sounds are processed, and how accurately, are issues that are poorly understood. This work attempts to fill a gap in our understanding by systematically examining spatial resolution with multiple sound sources. A series of psychophysical experiments was conducted on listeners with normal hearing to measure performance in spatial localisation and discrimination tasks involving more than one source. The general approach was to present sources that overlapped in both frequency and time in order to observe performance in the most challenging of situations. Furthermore, the role of two primary sets of location cues in concurrent source listening was probed by examining performance in different spatial dimensions. The binaural cues arise due to the separation of the two ears, and provide information about the lateral position of sound sources. The spectral cues result from location-dependent filtering by the head and pinnae, and allow vertical and front-rear auditory discrimination. Two sets of experiments are described that employed relatively simple broadband noise stimuli. In the first of these, two-point discrimination thresholds were measured using simultaneous noise bursts. 
It was found that the pair could be resolved only if a binaural difference was present; spectral cues did not appear to be sufficient. In the second set of experiments, the two stimuli were made distinguishable on the basis of their temporal envelopes, and the localisation of a designated target source was directly examined. Remarkably robust localisation was observed, despite the simultaneous masker, and both binaural and spectral cues appeared to be of use in this case. Small but persistent errors were observed, which in the lateral dimension represented a systematic shift away from the location of the masker. The errors can be explained by interference in the processing of the different location cues. Overall these experiments demonstrated that the spatial perception of concurrent sound sources is highly dependent on stimulus characteristics and configurations. This suggests that the underlying spatial representations are limited by the accuracy with which acoustic spatial cues can be extracted from a mixed signal. Three sets of experiments are then described that examined spatial performance with speech, a complex natural sound. The first measured how well speech is localised in isolation. This work demonstrated that speech contains high-frequency energy that is essential for accurate three-dimensional localisation. In the second set of experiments, spatial resolution for concurrent monosyllabic words was examined using similar approaches to those used for the concurrent noise experiments. It was found that resolution for concurrent speech stimuli was similar to resolution for concurrent noise stimuli. Importantly, listeners were limited in their ability to concurrently process the location-dependent spectral cues associated with two brief speech sources. In the final set of experiments, the role of spatial hearing was examined in a more relevant setting containing concurrent streams of sentence speech. 
It has long been known that binaural differences can aid segregation and enhance selective attention in such situations. The results presented here confirmed this finding and extended it to show that the spectral cues associated with different locations can also contribute. As a whole, this work provides an in-depth examination of spatial performance in concurrent source situations and delineates some of the limitations of this process. In general, spatial accuracy with concurrent sources is poorer than with single sound sources, as both binaural and spectral cues are subject to interference. Nonetheless, binaural cues are quite robust for representing concurrent source locations, and spectral cues can enhance spatial listening in many situations. The findings also highlight the intricate relationship that exists between spatial hearing, auditory object processing, and the allocation of attention in complex environments.
6

Jin, Craig T. "Spectral analysis and resolving spatial ambiguities in human sound localization". Connect to full text, 2001. http://hdl.handle.net/2123/1342.

Abstract:
Thesis (Ph. D.)--School of Electrical and Information Engineering, Faculty of Engineering, University of Sydney, 2001.
Title from title screen (viewed 13 January 2009). Submitted in fulfilment of the requirements for the degree of Doctor of Philosophy to the School of Electrical and Information Engineering, Faculty of Engineering. Includes bibliographical references. Also available in print form.
7

Nuckols, Richard. "Localization of Auditory Spatial Targets in Sighted and Blind Subjects". VCU Scholars Compass, 2013. http://scholarscompass.vcu.edu/etd/3286.

Abstract:
This research was designed to investigate the fundamental ways in which blind people utilize audible cues to attend to their surroundings. Knowledge of how blind people respond to external spatial stimuli is expected to assist in the development of better tools for helping people with visual disabilities navigate their environment. There was also interest in determining how blind people compare to sighted people in auditory localization tasks. The ability of sighted individuals, blindfolded individuals, and blind individuals to localize spatial auditory targets was assessed. An acoustic display board allowed the researcher to provide multiple sound presentations to the subjects. The subjects’ responses in localization tasks were measured using a combination of kinematic head tracking and eye tracking hardware. Data were collected and analyzed to determine the ability of the groups to localize spatial auditory targets. Significant differences were found among the three groups in spatial localization error and temporal patterns.
8

Euston, David Raymond. "From spectrum to space: the integration of frequency-specific intensity cues to produce auditory spatial receptive fields in the barn owl inferior colliculus". [Eugene, Or.: University of Oregon Library System], 2000. http://libweb.uoregon.edu/UOTheses/2000/eustond00.pdf.

9

Euston, David Raymond. "From spectrum to space: the integration of frequency-specific intensity cues to produce auditory spatial receptive fields in the barn owl inferior colliculus". Thesis, University of Oregon, 2000. http://hdl.handle.net/1794/143.

Abstract:
Advisers: Terry Takahashi and Richard Marrocco. xiv, 152 p.
Neurons in the barn owl's inferior colliculus (IC) derive their spatial receptive fields (RF) from two auditory cues: interaural time difference (ITD) and interaural level difference (ILD). ITD serves to restrict an RF in azimuth but the precise role of ILD was, up to this point, unclear. Filtering by the ears and head ensures that each spatial location is associated with a unique combination of frequency-specific ILD values (i.e., an ILD spectrum). We isolated the effect of ILD spectra using virtual sound sources in which ITD was held fixed for all spatial locations while ILD spectra were allowed to vary normally. A cell's response to these stimuli reflects the contribution of ILD to spatial tuning, referred to as an “ILD-alone RF”. In a sample of 34 cells, individual ILD-alone RFs were distributed and amorphous, but consistently showed that the ILD spectrum is facilitatory at the cell's best location and inhibitory above and/or below. Prior results have suggested that an IC cell's spatial specificity is generated by summing inputs which are narrowly tuned to frequency and selective for both ILD and ITD. Based on this premise, we present a developmental model which, when trained solely on a cell's true spatial RF, reproduces both the cell's true RF and its ILD-alone RF. According to the model, the connectivity between a space-tuned IC cell and its frequency-specific inputs develops subject to two constraints: the cell must be excited by ILD spectra from the cell's best location and inhibited by spectra from locations above and below but along the vertical strip defined by the best ITD. To assess how frequency-specific inputs are integrated to form restricted spatial RFs, we measured the responses of 47 space-tuned IC cells to pure tones at varying ILDs and frequencies. ILD tuning varied with frequency. Further, pure-tone responses, summed according to the head-related filters, accounted for 56 percent of the variance in broadband ILD-alone RFs.
Modelling suggests that, with broadband sounds, cells behave as though they are linearly summing their inputs, but that non-linearities arise when cells are tested with pure tones. This dissertation includes unpublished co-authored materials.
10

Cogné, Mélanie. "Influence de modulations sensorielles sur la navigation et la mémoire spatiale en réalité virtuelle : Processus cognitifs impliqués". Thesis, Bordeaux, 2017. http://www.theses.fr/2017BORD0704.

Abstract:
Moving toward a goal is a routine activity of daily life. It draws on a range of cognitive abilities, including navigation, memory, and spatial orientation. Many patients with brain injury or a neurodegenerative disease have topographical difficulties that reduce their autonomy in daily life. Virtual reality tools make it possible to assess large-scale navigation and spatial memory, and such assessments correlate well with those performed in real environments. Virtual reality also allows stimuli to be added to the task: these additional stimuli can be contextual, i.e. related to the task to be performed in the virtual environment, or non-contextual, i.e. unrelated to it. This thesis evaluated the impact of auditory and visual stimuli on the spatial navigation and memory of patients with brain injury or a neurodegenerative disease in virtual reality experiments. The first two studies examined the effect of contextual and non-contextual auditory stimuli during a shopping task in the virtual supermarket VAP-S. The first showed that contextual auditory stimuli, a sonar-like effect and the spoken name of the product, facilitated the spatial navigation of brain-injured patients performing the shopping task. The second showed that non-contextual sounds with high cognitive or perceptual salience worsened the navigation performance of patients who had had a stroke. The next two studies examined the effect of visual or auditory cueing in a spatial navigation task in a virtual district. The third study demonstrated that visual cues such as directional arrows or highlighted landmarks facilitated spatial navigation, and some aspects of spatial memory, in patients with mild cognitive impairment (MCI) or Alzheimer's disease. Finally, the fourth study showed that auditory cueing, beeps indicating the direction to take at each intersection, improved the spatial navigation of right-brain-damaged patients with contralesional visual and auditory neglect. These results suggest that auditory and visual stimuli could be used in the rehabilitation of patients with topographical difficulties, and in daily life through augmented reality, to support their mobility. The impact of such stimuli differs between healthy subjects and brain-injured patients, warranting specific analysis of the probably distinct processes involved in cognitive deficits. The work also offers new research avenues for neurorehabilitation, such as the use of augmented reality in real-life settings to support the navigational capabilities of these patients.

Books on the topic "Auditory spatial perception":

1

Blauert, Jens. Spatial hearing: The psychophysics of human sound localization. Cambridge, Mass: MIT Press, 1997.

2

Gilkey, Robert H., and Timothy R. Anderson, eds. Binaural and spatial hearing in real and virtual environments. Mahwah, N.J.: Lawrence Erlbaum Associates, 1997.

3

Principles and Applications of Spatial Hearing. World Scientific Publishing Company, 2011.

4

Suzuki, Yoiti, Douglas Brungart, and Kazuhiro Iida. Principles and Applications of Spatial Hearing. World Scientific Publishing Co Pte Ltd, 2011.

5

Anderson, Timothy R., and Robert Gilkey. Binaural and Spatial Hearing in Real and Virtual Environments. Taylor & Francis Group, 2014.

6

Anderson, Timothy R., and Robert H. Gilkey. Binaural and Spatial Hearing in Real and Virtual Environments. Taylor & Francis Group, 2015.

7

Parnas, Josef, and Annick Urfer-Parnas. The ontology and epistemology of symptoms: The case of auditory verbal hallucinations in schizophrenia. Edited by Kenneth S. Kendler and Josef Parnas. Oxford University Press, 2017. http://dx.doi.org/10.1093/med/9780198796022.003.0026.

Abstract:
We present a phenomenological account of auditory verbal hallucinations (AVH) in schizophrenia. We examine the mode of articulation of AVH, their spatial and temporal characteristics, and their relation to self-alienation, reflecting an emergence of otherness (alterity) in the midst of the patient’s self. This process of self-alienation is associated with the emergence of a different reality, a new ontological framework, which obeys other rules of causality and time. Patients become psychotic not because they cannot distinguish AVH from mundane perception, but because they are in touch with an alternative form of reality. A characteristic feature of schizophrenia is the coexistence of these incompatible realities. AVH are radically different from perception, and associated delusions stem from a breakthrough to another ontological framework. Thus, the current definition of AVH seems incorrect: The symptom is ontologically complex, involving first- and second-person dimensions, relations to the structure of consciousness, and other psychopathological phenomena.

Book chapters on the topic "Auditory spatial perception":

1

Hehrmann, P., J. K. Maier, N. S. Harper, D. McAlpine, and Maneesh Sahani. "Adaptive Coding for Auditory Spatial Cues". In The Neurophysiological Bases of Auditory Perception, 357–66. New York, NY: Springer New York, 2010. http://dx.doi.org/10.1007/978-1-4419-5686-6_34.

2

Elhilali, Mounya, Ling Ma, Christophe Micheyl, Andrew Oxenham, and Shihab Shamma. "Rate Versus Temporal Code? A Spatio-Temporal Coherence Model of the Cortical Basis of Streaming". In The Neurophysiological Bases of Auditory Perception, 497–506. New York, NY: Springer New York, 2010. http://dx.doi.org/10.1007/978-1-4419-5686-6_46.

3

Cedolin, Leonardo, and Bertrand Delgutte. "Spatio-Temporal Representation of the Pitch of Complex Tones in the Auditory Nerve". In Hearing – From Sensory Processing to Perception, 61–70. Berlin, Heidelberg: Springer Berlin Heidelberg, 2007. http://dx.doi.org/10.1007/978-3-540-73009-5_8.

4

Barrie, John M., and Walter J. Freeman. "Perceptual Topography: Spatio-Temporal Analysis of Prepyriform, Visual, Auditory, and Somesthetic EEGs in Perception by Trained Rabbits". In The Neurobiology of Computation, 173–78. Boston, MA: Springer US, 1995. http://dx.doi.org/10.1007/978-1-4615-2235-5_28.

5

Wightman, Frederic L., and Rick Jenison. "Auditory Spatial Layout". In Perception of Space and Motion, 365–400. Elsevier, 1995. http://dx.doi.org/10.1016/b978-012240530-3/50012-2.

6

ZAHORIK, P., E. BRANDEWIE, and V. P. SIVONEN. "AUDITORY PERCEPTION IN REVERBERANT SOUND FIELDS AND EFFECTS OF PRIOR LISTENING EXPOSURE". In Principles and Applications of Spatial Hearing, 24–34. WORLD SCIENTIFIC, 2011. http://dx.doi.org/10.1142/9789814299312_0003.

7

Milner, A. David, H. Chris Dijkerman, and David P. Carey. "Visuospatial processing in a pure case of visual-form agnosia". In The Hippocampal and Parietal Foundations of Spatial Cognition, 443–66. Oxford: Oxford University Press, 1998. http://dx.doi.org/10.1093/oso/9780198524533.003.0023.

Abstract:
Visual form agnosia refers to a variety of apperceptive agnosia in which the patient’s ability to perceive, discriminate and recognize the shape of objects is grossly impaired (Benson & Greenberg 1969). Detailed case descriptions of such patients have been given by Goldstein & Gelb (1918), Adler (1944), Benson & Greenberg (1969), Landis et al. (1982) and Milner et al. (1991), and extensive experimental studies of their perceptual abilities have been presented by the same authors, as well as by Gelb & Goldstein (1938), Efron (1969), and Campion & Latto (1985). There is a remarkable similarity among these patients, both in their pathology and in their visual symptoms. In all cases the brain damage has been bilateral and in most cases caused by carbon monoxide poisoning, the principal exceptions being the patients of Goldstein & Gelb (missile injury) and Landis et al. (mercury poisoning). Also in all cases the profound deficit in form perception is seen without any appreciable deficit in auditory or tactile perception, and in most cases without any appreciable deficit in other domains of visual perception, including colour, motion, stereopsis and even visual acuity.
8

von Kriegstein, Katharina, and Christa Müller-Axt. "The Role of the Thalamus for Human Auditory and Visual Speech Perception". In The Cerebral Cortex and Thalamus, edited by Andrew J. King and Judith A. Hirsch, 239–47. New York: Oxford University Press, 2023. http://dx.doi.org/10.1093/med/9780197676158.003.0023.

Abstract:
Most speech and language neuroscience research is focused on the cerebral cortex. There are two strong contributing factors to this cortico-centric view. First, historically our understanding of human brain function has relied on neuropsychological cases with cerebral cortex lesions. The fascinating case reports of, for example, Broca and Wernicke have been particularly prominent in the development of speech and language neuroscience. Second, for a long time, assessment of brain function in tiny thalamic nuclei in humans in vivo was technically impossible. Fortunately, recent technological advances such as high-spatial-resolution neuroimaging as well as deep brain stimulation now make it possible to investigate the thalamus, its subdivisions, and connections to the cerebral cortex in humans in vivo. This chapter summarizes how these novel developments contribute to our understanding of the role of thalamic nuclei and their interaction with the cerebral cortex in auditory and visual speech perception. The results reveal that sensory thalamic nuclei are part of a hierarchical predictive coding mechanism that supports speech recognition in the auditory and the visual modality. Alterations in this mechanism might lead to difficulties with speech perception, such as those observed in developmental dyslexia.
9

Ladavas, Elisabetta, and Alessandro Farne. "Multisensory Representation of Peripersonal Space". In Human Body Perception From The Inside Out, 89–104. New York, NY: Oxford University Press, 2005. http://dx.doi.org/10.1093/oso/9780195178371.003.0005.

Abstract:
Humans can represent visual objects in nearby (peripersonal) space through multisensory (visual-auditory-tactile) integrative processes, as indicated by the large body of evidence that has been recently accumulated (Spence & Driver, 2004). In nonhuman primates, multisensory integration at the single-neuron level is a frequent feature of spatial representation, especially in the coding of near peripersonal space, that is, the sector of space that closely surrounds the animal’s body parts (Duhamel et al., 1991, 1998; Graziano & Gross, 1995, 1998; Hyvarinen & Poranen, 1974; Rizzolatti et al., 1981, 1998). Since the 1990s, neuropsychological and neurophysiological research has provided comparative support to the notion that multisensory coding of near peripersonal space in both species shares several similarities at a functional and, to some extent, at a neuroanatomical level (Bremmer, Schlack, Duhamel, et al., 2001; Bremmer, Schlack, Shah, et al., 2001; Calvert, Spence, & Stein, 2004; Ladavas, 2002; Macaluso, Driver, & Frith, 2003; Weiss et al., 2003).
10

Pesic, Peter. "Helmholtz and the Sirens". In Music and the Making of Modern Science. The MIT Press, 2014. http://dx.doi.org/10.7551/mitpress/9780262027274.003.0015.

Abstract:
Hermann von Helmholtz’s investigations of physiological optics and acoustics reflected his profound interest in music. After devising instruments to measure the space and time parameters of visual and auditory response, Helmholtz produced “color curves” characterizing the complex response of the eye to the appropriate “dimensions” of hue, saturation, intensity. In so doing, he critiqued Newton’s attempt to impose the musical scale on vision. Through experiments on sirens, Helmholtz generalized auditory perception from vibrating bodies to air puffs. He gradually formed the view that recognition of musical intervals was closely analogous to spatial resemblance or recurrence. His unfolding conception of the “manifolds” or “spaces” of sensory experience radically reconfigured and extended Newton’s connection between the musical scale and visual perception via Thomas Young’s theory of color vision. In the process, Helmholtz’s studies of hearing and seeing led him to compare them as differently structured geometric manifolds. Throughout the book where various sound examples are referenced, please see http://mitpress.mit.edu/musicandmodernscience (please note that the sound examples should be viewed in Chrome or Safari Web browsers).

Conference proceedings on the topic "Auditory spatial perception":

1

Chao, Yujing, Zhijun Zhao, Chang Liu, and Lingyun Xie. "Auditory Space Perception Influenced by Visual Spatial Information". In 2018 IEEE 3rd Advanced Information Technology, Electronic and Automation Control Conference (IAEAC). IEEE, 2018. http://dx.doi.org/10.1109/iaeac.2018.8577596.

2

Bukvic, Ivica Ico, Gregory Earle, Disha Sardana, and Woohun Joo. "Studies in Spatial Aural Perception: Establishing Foundations for Immersive Sonification". In ICAD 2019: The 25th International Conference on Auditory Display. Newcastle upon Tyne, United Kingdom: Department of Computer and Information Sciences, Northumbria University, 2019. http://dx.doi.org/10.21785/icad2019.017.

Abstract:
The Spatial Audio Data Immersive Experience (SADIE) project aims to identify new foundational relationships pertaining to human spatial aural perception, and to validate existing relationships. Our infrastructure consists of an intuitive interaction interface, an immersive exocentric sonification environment, and a layer-based amplitude-panning algorithm. Here we highlight the system’s unique capabilities and provide findings from an initial externally funded study that focuses on the assessment of human aural spatial perception capacity. When compared to the existing body of literature focusing on egocentric spatial perception, our data show that an immersive exocentric environment enhances spatial perception, and that the physical implementation using high density loudspeaker arrays enables significantly improved spatial perception accuracy relative to the egocentric and virtual binaural approaches. The preliminary observations suggest that human spatial aural perception capacity in real-world-like immersive exocentric environments that allow for head and body movement is significantly greater than in egocentric scenarios where head and body movement is restricted. Therefore, in the design of immersive auditory displays, the use of immersive exocentric environments is advised. Further, our data identify a significant gap between physical and virtual human spatial aural perception accuracy, which suggests that further development of virtual aural immersion may be necessary before such an approach may be seen as a viable alternative.
3

Băcilă, Bogdan Ioan, and Hyunkook Lee. "Subjective Elicitation of Listener-Perspective-Dependent Spatial Attributes in a Reverberant Room, Using the Repertory Grid Technique". In ICAD 2019: The 25th International Conference on Auditory Display. Newcastle upon Tyne, United Kingdom: Department of Computer and Information Sciences, Northumbria University, 2019. http://dx.doi.org/10.21785/icad2019.073.

Abstract:
Spatial impression is a widely researched topic in concert hall acoustics and spatial audio display. In order to provide the listener with plausible spatial impression in virtual and augmented reality applications, especially in the 6 Degrees of Freedom (6DOF) context, it is first important to understand how humans perceive various acoustical cues from different listening perspectives in a real space. This paper presents a fundamental subjective study conducted on the perception of spatial impression for multiple listener positions and orientations. An in-situ elicitation test was carried out using the repertory grid technique in a reverberant concert hall. Cluster analysis revealed a number of conventional spatial attributes such as source width, environmental width and envelopment. However, reverb directionality and echo perception were also found to be salient spatial properties associated with changes in the listener’s position and head orientation.
4

Tomoriova, Beata, and Norbert Kopco. "Auditory Spatial Cuing for Speech Perception in a Dynamic Multi-talker Environment". In 2008 6th International Symposium on Applied Machine Intelligence and Informatics (SAMI '08). IEEE, 2008. http://dx.doi.org/10.1109/sami.2008.4469177.

5

Sardana, Disha, Woohun Joo, Ivica Ico Bukvic, and Gregory Earle. "Perception of spatial data properties in an immersive multi-layered auditory environment". In AM'20: Audio Mostly 2020. New York, NY, USA: ACM, 2020. http://dx.doi.org/10.1145/3411109.3411134.

6

Beadling, Andrew, and Paul Vickers. "Listener Perception of Spatialised Audio for Embodied Interaction in Sonification". In ICAD 2023: The 28th International Conference on Auditory Display. icad.org: International Community for Auditory Display, 2023. http://dx.doi.org/10.21785/icad2023.3814.

Abstract:
One underexplored method for examining sonification perception is through spatialised audio parameter mapping. To lower the entry threshold for listeners unfamiliar with sonification, binaural rendering of Ambisonic audio is employed as a spatialisation technique. A Pure Data patch was developed and a listener study was conducted to investigate participants’ experience of listening to the spatialised sonification of city air quality data. The results indicate a link between listener experience and quality of perception, suggesting that future research should focus on developing listener skill. Spatial audio appears to have some influence on perception for musically-trained listeners, but this appears to be most effective as an extra factor supplementing the manipulation of other sonic parameters such as frequency. Binaural rendering was useful for introducing uninitiated listeners to sonification, but there was difficulty in replicating the experience of physical space.
7

Voong, Tray Minh, and Michael Oehler. "Auditory Spatial Perception Using Bone Conduction Headphones along with Fitted Head Related Transfer Functions". In 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR). IEEE, 2019. http://dx.doi.org/10.1109/vr.2019.8798218.

8

Hirahara, Tatsuya, Daisuke Yoshisaki, and Daisuke Morikawa. "Impact of dynamic binaural signal associated with listener's voluntary movement in auditory spatial perception". In ICA 2013 Montreal. ASA, 2013. http://dx.doi.org/10.1121/1.4799864.

9

Larsson, Pontus, Johanna Bergfelt Ramos de Souza, and Joel Begnert. "An Auditory Display for Remote Road Vehicle Operation That Increases Awareness and Presence". In ICAD 2023: The 28th International Conference on Auditory Display. icad.org: International Community for Auditory Display, 2023. http://dx.doi.org/10.21785/icad2023.8296.

Abstract:
Remote operation, which allows a human operator to support a connected and automated vehicle from a distance, has been proposed as a solution to overcome vehicle automation challenges. In the current paper, the use of sound is being explored as a potential improvement to the current predominantly visual remote operator interfaces. An experiment was conducted aimed to test whether an auditory display providing propulsion sound from the ego-vehicle and spatial, augmented sound of surrounding road users would improve the perception of speed and ego motion, as well as situational awareness and presence. The experiment used a within-group 2x2 full factorial design with propulsion sound and spatial augmented sounds on/off as independent variables. 28 participants took part and drove in a simulated environment with the different types of auditory feedback. It was found that the auditory displays’ propulsion sound improved participants' speed regulation performance. Moreover, both the propulsion sound and the augmented sound contributed to the sensation of ego-motion, presence and situational awareness.
10

Godfroy-Cooper, Martine, Elizabeth Wenzel, Joel Miller, and Edward Bachelder. "Cheeseman Award Paper: Isomorphic Spatial Visual Auditory Displays for Operations in DVE for Obstacle Avoidance". In Vertical Flight Society 75th Annual Forum & Technology Display. The Vertical Flight Society, 2019. http://dx.doi.org/10.4050/f-0075-2019-14563.

Abstract:
Helicopter military missions such as combat search and rescue, medical evacuation, and landing on unprepared sites can involve operating in hostile, low-altitude, and degraded visual environments (DVE). These conditions may significantly reduce the pilot's ability to use natural out-of-the-window (OTW) perceptual cues, increase workload, and increase the risk of collision with terrain and natural or man-made obstacles. In modern helicopter cockpits, synthetic vision systems (SVSs) can employ conventional non-conformal two-dimensional (2D) symbology, egocentric three-dimensional (3D) conformal symbology (CS), and laser detection and ranging (LADAR), radio detection and ranging (RADAR), or forward-looking infrared (FLIR) imagery to support guidance and control, especially during operations in DVE. Although 3D CS can decrease pilot workload, it can also produce attentional tunneling (cognitive capture) and may not provide a maximally effective depiction of the environment around the helicopter. In this context, it is crucial to develop integrated multimodal interfaces that extend the current operational envelope while enhancing flight safety. Several flight simulator studies have investigated the use of spatial auditory displays (SADs) in combination with spatially and temporally congruent visual displays in tasks as diverse as collision avoidance, intruding aircraft detection, and system malfunction warning. In this paper we propose a novel approach to spatial sonification design based on the premise that perception-based synthetic cueing can increase situation awareness (SA), improve overall performance, and allow mental workload to be kept at operationally effective levels. This paper discusses the development, implementation, and evaluation of a sensor-based augmented-reality spatial auditory display (ARSAD) and its visual analog, an integrated collision avoidance display (ICAD), for all phases of flight.
Five UH60M Army pilots participated in a low-level flight simulation evaluating the visual and the auditory displays, alone or in combination in low-visibility and zero visibility environments. The results are discussed in the context of pilot cueing synergies for DVE.

Reports of organizations on the topic "Auditory spatial perception":

1

Letowski, Tomasz R., and Szymon T. Letowski. Auditory Spatial Perception: Auditory Localization. Fort Belvoir, VA: Defense Technical Information Center, May 2012. http://dx.doi.org/10.21236/ada562292.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Mahat, Marian, Vivienne Awad, Christopher Bradbeer, Chengxin Guo, Wesley Imms, and Julia Morris. Furniture for Engagement. University of Melbourne, February 2023. http://dx.doi.org/10.46580/124374.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract (summary):
The aim of the study was to explore the impact of furniture and spatial settings on teachers and students. Drawing on a case-study action-research approach involving surveys, the study engaged two primary schools (Frangipani and Jasmine Primary School) within the Sydney Catholic Schools as case-study sites. This report summarizes the findings on the impact of furniture and spatial settings on teacher efficacy, teacher mind frames, student learning, and student engagement, as well as students' perceptions of the furniture and spatial settings. In summary, teachers' perceptions of their mind frames, student learning, and engagement increased after the introduction of furniture in the prototype learning environment. For one teacher, the perception of their efficacy did not improve after the implementation of the prototype space and furniture. In terms of students' perceptions of the furniture, a large proportion of students agreed that they enjoyed learning and were more motivated to learn because of the new furniture. At Jasmine Primary School, a fifth of students felt that they were not motivated to learn because of the new furniture; further in-depth study is required to identify the underlying reasons for this. Key themes that emerged from the qualitative data on the furniture and spatial settings centre on characteristics of furniture that afforded comfort, improved concentration and auditory qualities, supported collaboration, and offered capacity for choice. These are important considerations to drive decisions in school design and furniture purchases. The importance of good furniture in a learning space cannot be overstated. New learning environments and furniture demand and create new possibilities for teacher practices and student learning. The findings of the study, whilst limited in scale, provide three crucial considerations relating to the importance of prototyping, professional learning, and longitudinal data.
These carry ramifications for wider understanding and future research. Future inquiry in these three key areas can provide the much-needed evidence to support schools' transition into new learning environments.

Go to bibliography