Theses on the topic "Auditory spatial perception"
Create a correct reference in APA, MLA, Chicago, Harvard, and several other styles
Consult the 22 best theses for your research on the topic "Auditory spatial perception."
Next to each source in the list of references there is an "Add to bibliography" button. Click on this button, and we will automatically generate the bibliographic reference for the chosen source in your preferred citation style: APA, MLA, Harvard, Vancouver, Chicago, etc.
You can also download the full text of the scholarly publication in PDF format and consult its abstract online when this information is included in the metadata.
Browse theses on a wide variety of disciplines and organize your bibliography correctly.
Keating, Peter. "Plasticity and integration of auditory spatial cues". Thesis, University of Oxford, 2011. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.561113.
Geeseman, Joseph W. "The influence of auditory cues on visual spatial perception". OpenSIUC, 2010. https://opensiuc.lib.siu.edu/theses/286.
Griffiths, Shaaron S. "Spatial and temporal disparities in aurally aided visual search". Deakin University, School of Psychology, 2001. http://tux.lib.deakin.edu.au./adt-VDU/public/adt-VDU20061207.134032.
Elias, Bartholomew. "Cross-modal facilitation of spatial frequency discriminations through auditory frequency cue presentations". Thesis, Georgia Institute of Technology, 1991. http://hdl.handle.net/1853/28611.
Best, Virginia Ann. "Spatial Hearing with Simultaneous Sound Sources: A Psychophysical Investigation". University of Sydney, Medicine, 2004. http://hdl.handle.net/2123/576.
Jin, Craig T. "Spectral analysis and resolving spatial ambiguities in human sound localization". Connect to full text, 2001. http://hdl.handle.net/2123/1342.
Texte intégralTitle from title screen (viewed 13 January 2009). Submitted in fulfilment of the requirements for the degree of Doctor of Philosophy to the School of Electrical and Information Engineering, Faculty of Engineering. Includes bibliographical references. Also available in print form.
Nuckols, Richard. "Localization of Auditory Spatial Targets in Sighted and Blind Subjects". VCU Scholars Compass, 2013. http://scholarscompass.vcu.edu/etd/3286.
Euston, David Raymond. "From spectrum to space: the integration of frequency-specific intensity cues to produce auditory spatial receptive fields in the barn owl inferior colliculus". [Eugene, Or.: University of Oregon Library System], 2000. http://libweb.uoregon.edu/UOTheses/2000/eustond00.pdf.
Euston, David Raymond. "From spectrum to space: the integration of frequency-specific intensity cues to produce auditory spatial receptive fields in the barn owl inferior colliculus". Thesis, University of Oregon, 2000. http://hdl.handle.net/1794/143.
Neurons in the barn owl's inferior colliculus (IC) derive their spatial receptive fields (RFs) from two auditory cues: interaural time difference (ITD) and interaural level difference (ILD). ITD serves to restrict an RF in azimuth, but the precise role of ILD was, up to this point, unclear. Filtering by the ears and head ensures that each spatial location is associated with a unique combination of frequency-specific ILD values (i.e., an ILD spectrum). We isolated the effect of ILD spectra using virtual sound sources in which ITD was held fixed for all spatial locations while ILD spectra were allowed to vary normally. A cell's response to these stimuli reflects the contribution of ILD to spatial tuning, referred to as an "ILD-alone RF". In a sample of 34 cells, individual ILD-alone RFs were distributed and amorphous, but consistently showed that the ILD spectrum is facilitatory at the cell's best location and inhibitory above and/or below it. Prior results have suggested that an IC cell's spatial specificity is generated by summing inputs which are narrowly tuned to frequency and selective for both ILD and ITD. Based on this premise, we present a developmental model which, when trained solely on a cell's true spatial RF, reproduces both the cell's true RF and its ILD-alone RF. According to the model, the connectivity between a space-tuned IC cell and its frequency-specific inputs develops subject to two constraints: the cell must be excited by ILD spectra from the cell's best location and inhibited by spectra from locations above and below it, along the vertical strip defined by the best ITD. To assess how frequency-specific inputs are integrated to form restricted spatial RFs, we measured the responses of 47 space-tuned IC cells to pure tones at varying ILDs and frequencies. ILD tuning varied with frequency. Further, pure-tone responses, summed according to the head-related filters, accounted for 56 percent of the variance in broadband ILD-alone RFs.
Modelling suggests that, with broadband sounds, cells behave as though they are linearly summing their inputs, but when tested with pure tones, non-linearities arise. This dissertation includes unpublished co-authored materials.
Cogné, Mélanie. "Influence de modulations sensorielles sur la navigation et la mémoire spatiale en réalité virtuelle : Processus cognitifs impliqués". Thesis, Bordeaux, 2017. http://www.theses.fr/2017BORD0704.
Navigating in a familiar or unfamiliar environment is a frequent challenge for human beings. Many patients with brain injury suffer from topographical difficulties, which affects their autonomy in daily life. Virtual reality tools enable the evaluation of large-scale spatial navigation and spatial memory in settings resembling a real environment. Virtual reality also makes it possible to add stimuli to the software. These stimuli can be contextual, that is to say linked to the task that participants have to accomplish in the virtual environment, or non-contextual, i.e. with no link to the required task. This thesis investigates whether visual or auditory stimuli influence spatial navigation and memory in virtual environments for patients with brain injury or a neurodegenerative disease. The first part of the thesis showed that contextual auditory stimuli, such as a sonar effect and the names of products on the shopping list, improved the spatial navigation of brain-injured patients during a shopping task in the virtual supermarket VAP-S. The second part of this thesis highlighted that non-contextual auditory stimuli with a high perceptual or cognitive salience decreased the spatial navigation performance of brain-injured patients during a shopping task in the VAP-S. The third part of this thesis showed that visual cues such as directional arrows and salient landmarks improved spatial navigation and some aspects of spatial memory in patients with Alzheimer's disease or mild cognitive impairment during a navigation task in a virtual district. The last part of this thesis demonstrated that auditory cues, i.e. beeping sounds indicating the directions, improved the spatial navigation in a virtual district of patients who had had a stroke with contra-lesional visual and auditory neglect. These results suggest that some visual and auditory stimuli could be helpful for spatial navigation and memory tasks in patients with brain injury or neurodegenerative disease.
It further offers new research avenues for neuro-rehabilitation, such as the use of augmented reality in real-life settings to support the navigational capabilities of these patients.
Fernández Prieto, Irene. "Development and neural bases of the spatial recoding of acoustic pitch". Doctoral thesis, Universitat de Barcelona, 2016. http://hdl.handle.net/10803/401340.
Previous studies suggest the existence of crossmodal correspondences between pitch and visuospatial dimensions (for example, spatial elevation or visual size) (Gallace & Spence, 2006; Rusconi et al., 2006). Neuroimaging research has revealed activation of parietal areas (for example, the intraparietal sulcus, IPS) traditionally associated with spatial functions during pitch-related musical tasks (Foster & Zatorre, 2010a, 2010b). The aim of this doctoral thesis was to investigate the cognitive, developmental, and neural mechanisms underlying these crossmodal correspondences. In the first part of the thesis, we studied the role of crossmodal correspondences in the modulation of attention. We explored how a specific dynamic sound (for example, an ascending sound) exogenously modulates the spatial localization of a visual stimulus in adults. Participants performed a task consisting of a modified version of the classic Posner paradigm, in which ascending and descending sounds were used as auditory cues. The second part of the thesis analyzed the development of the mechanisms involved in processing crossmodal correspondences. More specifically, we investigated whether these correspondences are present very early in development or appear as experience with the environment increases. To this end, the association between pitch and visual size was examined in 4- and 6-month-old infants using an audiovisual preferential-looking paradigm.
Finally, in an attempt to establish the functional role of certain areas of the right parietal lobe in the association between pitch and spatial elevation, we analyzed whether neurological syndromes associated with functional or structural alteration of the right parietal lobe (for example, nonverbal learning disorder, NVLD) can affect performance on pitch-related tasks. To achieve our aims, we compared the performance on visuospatial and auditory tasks of patients with NVLD and a control group without neurological or psychiatric disorders. The results of the three studies that make up this thesis suggest: (1) that ascending sounds exert a stronger modulation of the perceptual system's reaction time to visual objects than descending sounds; (2) the existence of crossmodal associations between pitch and visual size in 6-month-old infants, but not in younger infants, suggesting that experience and/or maturation is necessary to fully develop this association; and (3) that functional or structural alteration of parietal areas has a negative impact on auditory processing tasks, for example discriminating pitch. In conclusion, we confirmed that pitch can modulate spatial attention. Second, crossmodal correspondences are governed by basic mechanisms that are present from an early age. Finally, we also found evidence that the perception of complex auditory information may require brain areas implicated in neurological disorders that affect the processing of space.
Scheperle, Rachel Anna. "Relationships among peripheral and central electrophysiological measures of spatial/spectral resolution and speech perception in cochlear implant users". Diss., University of Iowa, 2013. https://ir.uiowa.edu/etd/5055.
Stanley, Raymond M. "Toward adapting spatial audio displays for use with bone conduction: the cancellation of bone-conducted and air-conducted sound waves". Thesis, available online, Georgia Institute of Technology, 2006. http://etd.gatech.edu/theses/available/etd-11022006-103809/.
Texte intégralBordeau, Camille. « Développement d’un dispositif de substitution sensorielle vision-vers-audition : étude des performances de localisation et comparaison de schémas d’encodage ». Electronic Thesis or Diss., Bourgogne Franche-Comté, 2023. https://nuxeo.u-bourgogne.fr/nuxeo/site/esupversions/32b91892-b42f-4d42-bf10-0ad744828698.
Visual-to-auditory sensory substitution devices convert visual information into soundscapes so that the environment can be perceived through the auditory modality when the visual modality is impaired. They constitute a promising solution for improving the autonomy of visually impaired people when traveling on foot. The main objective of this thesis work was to determine an encoding scheme for sensory substitution allowing three-dimensional spatial perception, by proposing familiarization and evaluation protocols in virtual environments of different complexities. The first aim was to determine whether reproducing the acoustic cues used in auditory spatial perception was more effective than using acoustic cues involved in audio-visual interactions. The first study demonstrated that modulating pitch in the encoding scheme could partly compensate for the perceptual limits of spatialization in the elevation dimension. The second study showed that modifying the sound envelope could partly compensate for the compressed perception of distance. The second objective was to determine to what extent the chosen encoding scheme preserved spatial perception abilities in a complex environment containing several objects. The third study demonstrated that the ability to segregate a complex visual scene through the soundscape depends on the specific spectral signature of the objects composing it when pitch modulation is used as an acoustic cue in the encoding scheme. The work of this thesis has practical implications for the improvement of substitution devices concerning, on the one hand, the possibility of compensating for spatial perceptual limits with non-spatial acoustic cues in the encoding scheme, and on the other hand, the need to reduce the amount of auditory information to preserve the ability to segregate the soundscape.
Since the familiarization and evaluation protocols in virtual environments were developed to be suitable for the visually impaired population, the work of this thesis also highlights the potential of virtual environments for precisely evaluating the ability to use sensory substitution devices in a safe context.
Bergqvist, Emil. "Auditory displays: A study in effectiveness between binaural and stereo audio to support interface navigation". Thesis, Högskolan i Skövde, Institutionen för informationsteknologi, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-10072.
Texte intégralHobeika, Lise. « Interplay between multisensory integration and social interaction in auditory space : towards an integrative neuroscience approach of proxemics ». Thesis, Sorbonne Paris Cité, 2017. http://www.theses.fr/2017USPCB116.
The space near the body, called peripersonal space (PPS), was originally studied in social psychology and anthropology as an important factor in interpersonal communication. It was later described by neurophysiological studies in monkeys as a space mapped by multisensory neurons. Those neurons discharge only when events occur near the body (be the information tactile, visual or auditory), delineating the space that individuals consider as belonging to them. The human brain also codes events that are near the body differently from those that are farther away. This dedicated brain function is critical for interacting satisfactorily with the external world, be it for defending oneself or for reaching objects of interest. However, little is known about how this function is impacted by real social interactions. In this work, we conducted several studies aiming at understanding the factors that contribute to the permeability and adaptive aspects of PPS. A first study examined lateral PPS in isolated individuals, by measuring reaction times to tactile stimuli while a task-irrelevant sound loomed towards the individual's body. It revealed an anisotropy of reaction times across hemispaces that we could link to handedness. A second study explored the modulations of PPS in social contexts. It was found that minimal social instructions could influence the shape of peripersonal space, with a complex modification of behaviors in collaborative tasks that goes beyond the handedness effect. The third study is a methodological investigation attempting to go beyond the limitations of the behavioral methods used to measure PPS, and proposing a new direction for assessing how stimuli coming towards the body are integrated according to their distance and the multisensory context in which they are processed.
Taken together, our work emphasizes the importance of investigating multisensory integration in the 3D space around the body to fully capture PPS mechanisms, as well as the potential impact of social factors on low-level multisensory processes. Moreover, this research provides evidence that neurocognitive social investigations, in particular on space perception, benefit from going beyond traditional isolated-individual protocols towards actual live social interactive paradigms.
Lidji, Pascale. "Musique et langage : spécificités, interactions et associations spatiales". Thesis, Université Libre de Bruxelles, 2008. http://hdl.handle.net/1866/6347.
Texte intégralLee, Pei-Jung, et 李佩蓉. « The auditory spatial perception of hearing impaired under noise exposure ». Thesis, 2012. http://ndltd.ncl.edu.tw/handle/63275674675376395841.
Texte intégral銘傳大學
商品設計學系碩士班
100
According to a saying attributed to the philosopher Lorenzi Okun, "the eyes bring people into the world, while the ears bring the world to people." Hearing is clearly an important channel through which humans receive external information, and it strongly affects the ability to communicate. Hearing impairment inevitably affects learning and communication efficiency in daily life, and it also leads to a decline in quality of life and interpersonal relationships. Because hearing impairment causes inconvenience and safety concerns in daily life, how to design products that improve usability for the hearing impaired and thereby enhance their well-being is an important issue. The objective of this study was to investigate the effects of sound-source orientation, distance, frequency, and source cues on the spatial perception of hearing-impaired listeners. Experiment 1 indicated that sound frequency, sound-source distance, the advantage of the ear nearer the sound source, and the individual participant all had significant effects on the tolerance in judging sound-source position; two-way interactions of sound frequency × sound source and sound frequency × sound distance were also found. Experiment 2 indicated that stimulus duration and the individual participant significantly affected the hearing-threshold shift measured before and after stimulation at different sound frequencies; in addition, a two-way interaction of stimulus type × audio presentation time was found. The results can guide the design of products for the hearing impaired so as to enhance their well-being, for example the design of communication equipment, medical audiometry and calibration, broadcast systems, alarms, doorbells, GPS navigation, mobile phones, and parking sensors.
Morgenstern, Yaniv. "Broad spatial pooling with local detectors for grating detection revealed with classification image analysis". 2006. http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&res_dat=xri:pqdiss&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft_dat=xri:pqdiss:MR19699.
Typescript. Includes bibliographical references (leaves 89-93). Also available on the Internet via the ProQuest URL above.
Hirschausen, David. "Body space interaction: sound as a spatial mnemonic". Master's thesis, 2007. http://hdl.handle.net/1885/149581.
Heron, James, N. W. Roach, James Vincent Michael Hanson, Paul V. McGraw, and David J. Whitaker. "Audiovisual time perception is spatially specific". 2012. http://hdl.handle.net/10454/6016.
Our sensory systems face a daily barrage of auditory and visual signals whose arrival times form a wide range of audiovisual asynchronies. These temporal relationships constitute an important metric for the nervous system when surmising which signals originate from common external events. Internal consistency is known to be aided by sensory adaptation: repeated exposure to consistent asynchrony brings perceived arrival times closer to simultaneity. However, given the diverse nature of our audiovisual environment, functionally useful adaptation would need to be constrained to signals that were generated together. In the current study, we investigate the role of two potential constraining factors: spatial and contextual correspondence. By employing an experimental design that allows independent control of both factors, we show that observers are able to simultaneously adapt to two opposing temporal relationships, provided they are segregated in space. No such recalibration was observed when spatial segregation was replaced by contextual stimulus features (in this case, pitch and spatial frequency). These effects provide support for dedicated asynchrony mechanisms that interact with spatially selective mechanisms early in visual and auditory sensory pathways.