Academic literature on the topic 'Virtual auditory space'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Virtual auditory space.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
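The reference-generation step described above can be sketched in code. This is a hypothetical, heavily simplified formatter of our own — the function, field names, and style rules are illustrative, not this site's actual implementation, and real APA/MLA rules are far more involved:

```python
def format_reference(ref: dict, style: str) -> str:
    """Render a journal-article record in a (simplified) citation style."""
    authors = ref["authors"]
    if style == "apa":
        return (f'{authors} ({ref["year"]}). {ref["title"]}. '
                f'{ref["journal"]}, {ref["volume"]}({ref["issue"]}), {ref["pages"]}.')
    if style == "mla":
        return (f'{authors}. "{ref["title"]}." {ref["journal"]}, '
                f'vol. {ref["volume"]}, no. {ref["issue"]}, {ref["year"]}, pp. {ref["pages"]}.')
    raise ValueError(f"unsupported style: {style}")

# One record from the list below, as a structured dict:
record = {
    "authors": "Adams, N. H., and G. H. Wakefield",
    "year": 2008,
    "title": "State-Space Synthesis of Virtual Auditory Space",
    "journal": "IEEE Transactions on Audio, Speech, and Language Processing",
    "volume": 16, "issue": 5, "pages": "881-90",
}
print(format_reference(record, "apa"))
print(format_reference(record, "mla"))
```

The point of the sketch is that one structured record can be rendered into any number of styles, which is exactly what the 'Add to bibliography' button does.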
Journal articles on the topic "Virtual auditory space"
Adams, N. H., and G. H. Wakefield. "State-Space Synthesis of Virtual Auditory Space." IEEE Transactions on Audio, Speech, and Language Processing 16, no. 5 (July 2008): 881–90. http://dx.doi.org/10.1109/tasl.2008.924151.
Findlay-Walsh, Iain. "Virtual auditory reality." SoundEffects - An Interdisciplinary Journal of Sound and Sound Experience 10, no. 1 (January 15, 2021): 71–90. http://dx.doi.org/10.7146/se.v10i1.124199.
Kapralos, B., M. R. Jenkin, and E. Milios. "Virtual Audio Systems." Presence: Teleoperators and Virtual Environments 17, no. 6 (December 1, 2008): 527–49. http://dx.doi.org/10.1162/pres.17.6.527.
Poon, P. W., and J. F. Brugge. "Virtual-space receptive fields of single auditory nerve fibers." Journal of Neurophysiology 70, no. 2 (August 1, 1993): 667–76. http://dx.doi.org/10.1152/jn.1993.70.2.667.
Zotkin, D. N., R. Duraiswami, and L. S. Davis. "Rendering Localized Spatial Audio in a Virtual Auditory Space." IEEE Transactions on Multimedia 6, no. 4 (August 2004): 553–64. http://dx.doi.org/10.1109/tmm.2004.827516.
Hartung, Klaus, Susanne J. Sterbing, Clifford H. Keller, and Terry T. Takahashi. "Applications of virtual auditory space in psychoacoustics and neurophysiology." Journal of the Acoustical Society of America 105, no. 2 (February 1999): 1164. http://dx.doi.org/10.1121/1.425528.
Takahashi, Terry T., Clifford H. Keller, David R. Euston, and Michael L. Spezio. "Analysis of auditory spatial receptive fields: An application of virtual auditory space technology." Journal of the Acoustical Society of America 111, no. 5 (2002): 2391. http://dx.doi.org/10.1121/1.4809152.
Carlile, Simon, and Daniel Wardman. "Masking produced by broadband noise presented in virtual auditory space." Journal of the Acoustical Society of America 100, no. 6 (December 1996): 3761–68. http://dx.doi.org/10.1121/1.417236.
Ishii, Masahiro, Masanori Nakata, and Makoto Sato. "Networked SPIDAR: A Networked Virtual Environment with Visual, Auditory, and Haptic Interactions." Presence: Teleoperators and Virtual Environments 3, no. 4 (January 1994): 351–59. http://dx.doi.org/10.1162/pres.1994.3.4.351.
Venkateswaran Nisha, Kavassery, and Ajith Uppunda Kumar. "Virtual Auditory Space Training-Induced Changes of Auditory Spatial Processing in Listeners with Normal Hearing." Journal of International Advanced Otology 13, no. 1 (May 29, 2017): 118–27. http://dx.doi.org/10.5152/iao.2017.3477.
Dissertations / Theses on the topic "Virtual auditory space"
Kelly, Michael C. "Efficient representation of adaptable virtual auditory space." Thesis, University of York, 2002. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.274510.
Spezio, Michael L. "Using virtual reality to understand the brain: Applications in virtual auditory space." Thesis, University of Oregon, 2002. http://wwwlib.umi.com/cr/uoregon/fullcit?p3045096.
Typescript. Includes vita and abstract. Includes bibliographical references (leaves 127–139). Also available for download via the World Wide Web; free to University of Oregon users.
Jin, Craig. "Spectral analysis and resolving spatial ambiguities in human sound localization." Thesis, The University of Sydney, 2001. http://hdl.handle.net/2123/1342.
This dissertation provides an overview of my research over the last five years into the spectral analysis involved in human sound localization. The work involved conducting psychophysical tests of human auditory localization performance and then applying analytical techniques to analyze and explain the data. It is a fundamental thesis of this work that human auditory localization response directions are primarily driven by the auditory localization cues associated with the acoustic filtering properties of the external auditory periphery, i.e., the head, torso, shoulder, neck, and external ears. This work can be considered as composed of three parts. In the first part of this work, I compared the auditory localization performance of a human subject and a time-delay neural network model under three sound conditions: broadband, high-pass, and low-pass. A “black-box” modeling paradigm was applied. The modeling results indicated that training the network to localize sounds of varying center-frequency and bandwidth could degrade localization performance in a manner demonstrating some similarity to human auditory localization performance. As the data collected during the network modeling showed that humans demonstrate striking localization errors when tested using bandlimited sound stimuli, the second part of this work focused on human sound localization of bandpass filtered noise stimuli. Localization data was collected from 5 subjects for 7 sound conditions: 300 Hz to 5 kHz, 300 Hz to 7 kHz, 300 Hz to 10 kHz, 300 Hz to 14 kHz, 3 to 8 kHz, 4 to 9 kHz, and 7 to 14 kHz. The localization results were analyzed using the method of cue similarity indices developed by Middlebrooks (1992). The data indicated that the energy level in relatively wide frequency bands could be driving the localization response directions, just as in Butler’s covert peak area model (see Butler and Musicant, 1993).
The question was then raised as to whether the energy levels in the various frequency bands, as described above, are most likely analyzed by the human auditory localization system on a monaural or an interaural basis. In the third part of this work, an experiment was conducted using virtual auditory space sound stimuli in which the monaural spectral cues for auditory localization were disrupted, but the interaural spectral difference cue was preserved. The results from this work showed that the human auditory localization system relies primarily on a monaural analysis of spectral shape information for its discrimination of directions on the cone of confusion. The work described in the three parts leads to the suggestion that a spectral contrast model based on overlapping frequency bands of varying bandwidth and perhaps multiple frequency scales can provide a reasonable algorithm for explaining much of the current psychophysical and neurophysiological data related to human auditory localization.
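The band-energy idea in the abstract above can be illustrated with a minimal sketch. This is entirely illustrative code of our own, not the analysis Jin or Middlebrooks actually used: it measures a signal's energy in a few wide frequency bands with a naive DFT, the kind of per-band energy measure the covert-peak-area account says could drive localization responses.

```python
import math

def band_energies(signal, sample_rate, bands):
    """Energy of `signal` in each (lo_hz, hi_hz) band, via a naive DFT.

    Pure Python and O(N^2), so suitable only as an illustration of a
    band-energy cue analysis, not for real spectral-cue processing.
    """
    n = len(signal)
    energies = []
    for lo, hi in bands:
        total = 0.0
        for k in range(n // 2):                      # positive-frequency bins only
            if lo <= k * sample_rate / n < hi:
                re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
                im = sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
                total += re * re + im * im           # |X[k]|^2
        energies.append(total)
    return energies

# A 1 kHz tone sampled at 8 kHz: its energy should land in the 0.5-1.5 kHz band.
tone = [math.sin(2 * math.pi * 1000 * t / 8000) for t in range(256)]
e = band_energies(tone, 8000, [(0, 500), (500, 1500), (1500, 4000)])
```

Comparing such per-band energies (monaurally or between the two ears) is the kind of computation the dissertation's monaural-versus-interaural question is about.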
Schönstein, David. "Individualization of spectral cues for applications in virtual auditory space: Study of inter-subject differences in Head-Related Transfer Functions using perceptual judgements from listening tests." Paris 6, 2012. http://www.theses.fr/2012PA066488.
Books on the topic "Virtual auditory space"
Carlile, Simon. Virtual Auditory Space: Generation and Applications. Berlin, Heidelberg: Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/978-3-662-22594-3.
Carlile, Simon, ed. Virtual auditory space: Generation and applications. Austin, TX: RG Landes, 1996.
Carlile, Simon. Virtual Auditory Space: Generation and Applications. Springer, 2013.
Carlile, Simon. Virtual Auditory Space: Generation and Applications. Springer London, Limited, 2013.
Carlile, Simon. Virtual Auditory Space: Generation and Applications (Neuroscience Intelligence Unit). Landes Bioscience, 1996.
Carlile, Simon. Virtual Auditory Space: Generation and Applications (Neuroscience Intelligence Unit). R G Landes Co, 1996.
Virtual Auditory Space: Generation and Applications (Neuroscience Intelligence Unit). Springer, 1996.
Book chapters on the topic "Virtual auditory space"
Shinn-Cunningham, Barbara, and Abhijit Kulkarni. "Recent Developments in Virtual Auditory Space." In Neuroscience Intelligence Unit, 185–243. Berlin, Heidelberg: Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/978-3-662-22594-3_6.
Pralong, Danièle, and Simon Carlile. "Generation and Validation of Virtual Auditory Space." In Neuroscience Intelligence Unit, 109–51. Berlin, Heidelberg: Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/978-3-662-22594-3_4.
Kobayashi, Yosuke, Kazuhiro Kondo, and Kiyoshi Nakagawa. "Intelligibility of HE-AAC Coded Japanese Words with Various Stereo Coding Modes in Virtual 3D Audio Space." In Auditory Display, 219–38. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-12439-6_12.
Calleri, Cristina, Louena Shtrepi, Alessandro Armando, and Arianna Astolfi. "Investigations on the Influence of Auditory Perception on Urban Space Design Through Virtual Acoustics." In Advances in Civil and Industrial Engineering, 344–67. IGI Global, 2018. http://dx.doi.org/10.4018/978-1-5225-3637-6.ch015.
Spöhrer, Markus. "Playing With Auditory Environments in Audio Games." In Research Anthology on Game Design, Development, Usage, and Social Impact, 644–61. IGI Global, 2022. http://dx.doi.org/10.4018/978-1-6684-7589-8.ch032.
Spöhrer, Markus. "Playing With Auditory Environments in Audio Games." In Advances in Human and Social Aspects of Technology, 87–111. IGI Global, 2019. http://dx.doi.org/10.4018/978-1-5225-7027-1.ch004.
Engel, Isaac, and Lorenzo Picinali. "Reverberation and its Binaural Reproduction: The Trade-off between Computational Efficiency and Perceived Quality." In Advances in Fundamental and Applied Research on Spatial Audio [Working Title]. IntechOpen, 2022. http://dx.doi.org/10.5772/intechopen.101940.
Kan, A., C. T. Jin, and A. van Schaik. "Psychoacoustic Evaluation of Different Methods for Creating Individualized, Headphone-Presented Virtual Auditory Space from B-Format Room Impulse Responses." In Principles and Applications of Spatial Hearing, 303–13. World Scientific, 2011. http://dx.doi.org/10.1142/9789814299312_0024.
Iwaya, Y., M. Otani, and Y. Suzuki. "Development of Virtual Auditory Display Software Responsive to Head Movement and a Consideration on Deration of Spatialized Ambient Sound to Improve Realism of Perceived Sound Space." In Principles and Applications of Spatial Hearing, 121–35. World Scientific, 2011. http://dx.doi.org/10.1142/9789814299312_0010.
Conference papers on the topic "Virtual auditory space"
Ai, Di, and HaiLong Wu. "Electronic compass for virtual auditory space." In 2010 International Conference on Progress in Informatics and Computing (PIC). IEEE, 2010. http://dx.doi.org/10.1109/pic.2010.5687906.
Hugeng, Hugeng, Jovan Anggara, and Dadang Gunawan. "Enhanced three-dimensional HRIRs interpolation for virtual auditory space." In 2017 International Conference on Signals and Systems (ICSigSys). IEEE, 2017. http://dx.doi.org/10.1109/icsigsys.2017.7967065.
Chabot, Samuel, Wendy Lee, Rebecca Elder, and Jonas Braasch. "Using a Multimodal Immersive Environment to Investigate Perceptions in Augmented Virtual Reality Systems." In The 24th International Conference on Auditory Display. Arlington, Virginia: The International Community for Auditory Display, 2018. http://dx.doi.org/10.21785/icad2018.014.
Chabot, Samuel, and Jonas Braasch. "An Immersive Virtual Environment for Congruent Audio-Visual Spatialized Data Sonifications." In The 23rd International Conference on Auditory Display. Arlington, Virginia: The International Community for Auditory Display, 2017. http://dx.doi.org/10.21785/icad2017.072.
May, Keenan R., Briana Sobel, Jeff Wilson, and Bruce N. Walker. "Auditory Displays to Facilitate Object Targeting in 3D Space." In ICAD 2019: The 25th International Conference on Auditory Display. Newcastle upon Tyne, United Kingdom: Department of Computer and Information Sciences, Northumbria University, 2019. http://dx.doi.org/10.21785/icad2019.008.
Iwaya, Yukio, Masahi Toyoda, Makoto Otani, and Yoiti Suzuki. "Evaluation of Realism of Dynamic Sound Space Using a Virtual Auditory Display." In 2012 13th ACIS International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel & Distributed Computing (SNPD). IEEE, 2012. http://dx.doi.org/10.1109/snpd.2012.99.
Geronazzo, Michele, and Paola Cesari. "A motion based setup for peri-personal space estimation with virtual auditory displays." In VRST '16: 22th ACM Symposium on Virtual Reality Software and Technology. New York, NY, USA: ACM, 2016. http://dx.doi.org/10.1145/2993369.2996303.
Iwaya, Yukio, Makoto Otani, Takao Tsuchiya, and Junfeng Li. "Virtual Auditory Display on a Smartphone for High-Resolution Acoustic Space by Remote Rendering." In 2015 International Conference on Intelligent Information Hiding and Multimedia Signal Processing (IIH-MSP). IEEE, 2015. http://dx.doi.org/10.1109/iih-msp.2015.69.
Higuera-Trujillo, Juan Luis, Carmen Llinares Millán, Susana Iñarra Abad, and Juan Serra Lluch. "A virtual reality study in university classrooms: The influence of classroom colour on memory and attention." In INNODOCT 2020. Valencia: Editorial Universitat Politècnica de València, 2020. http://dx.doi.org/10.4995/inn2020.2020.11858.
Ishikawa, Ayumi. "Visual and Auditory Impression for Virtual Reality Space Expressed by Panoramic Image and Impulse Response Signal." In 2019 IEEE 8th Global Conference on Consumer Electronics (GCCE). IEEE, 2019. http://dx.doi.org/10.1109/gcce46687.2019.9015489.