Dissertations / Theses on the topic 'Sonification'
The top 50 dissertations and theses on the topic 'Sonification', with abstracts and links to full texts where available.
Berman, Lewis Irwin. "Program comprehension through sonification." Thesis, Durham University, 2011. http://etheses.dur.ac.uk/1396/.
Ejdbo, Malin, and Elias Elmquist. "Interactive Sonification in OpenSpace." Thesis, Linköpings universitet, Medie- och Informationsteknik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-170250.
Pietrucha, Matthew. "Sonification of Spectroscopy Data." Digital WPI, 2019. https://digitalcommons.wpi.edu/etd-theses/1277.
Ibrahim, Ag Asri Ag. "Usability inspection for sonification applications." Thesis, University of York, 2008. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.479510.
Steffert, Tony. "Real-time electroencephalogram sonification for neurofeedback." Thesis, Open University, 2018. http://oro.open.ac.uk/57965/.
Perkins, Rhys John. "Interactive sonification of a physics engine." Thesis, Anglia Ruskin University, 2013. http://arro.anglia.ac.uk/323077/.
Parseihian, Gaëtan. "Sonification binaurale pour l'aide à la navigation." PhD thesis, Université Pierre et Marie Curie - Paris VI, 2012. http://tel.archives-ouvertes.fr/tel-00771316.
Worrall, David. "Sonification and Information: Concepts, Instruments and Techniques." University of Canberra, Communication, 2009. http://erl.canberra.edu.au./public/adt-AUC20090818.142345.
Dyer, John. "Human movement sonification for motor skill learning." Thesis, Queen's University Belfast, 2017. https://pure.qub.ac.uk/portal/en/theses/human-movement-sonification-for-motor-skill-learning(4bda096c-e8ab-4af4-8f35-7445c6b0cb7e).html.
Leplâtre, Grégory. "The design and evaluation of non speech sounds to support navigation in restricted display devices." Thesis, University of Glasgow, 2002. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.270963.
Breder, Elijah. "Towards the sonification of the World Wide Web : SprocketPlug." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1997. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp03/MQ29484.pdf.
Sun, Lichi. "Real-time sonification of muscle tension for piano players." Thesis, University of York, 2017. http://etheses.whiterose.ac.uk/18695/.
Joliat, Nicholas D. (Nicholas David). "DoppelLab : spatialized data sonification in a 3D virtual environment." Thesis, Massachusetts Institute of Technology, 2013. http://hdl.handle.net/1721.1/85427.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 67-69).
This thesis explores new ways to communicate sensor data by combining spatialized sonification with data visualization in a 3D virtual environment. A system for sonifying a space using spatialized recorded audio streams is designed, implemented, and integrated into an existing 3D graphical interface, enabling exploration of both real-time and archived data. In particular, algorithms are implemented for obfuscating audio to protect privacy and for time-compressing audio to allow exploration on diverse time scales. Synthesized data sonification in this context is also explored.
by Nicholas D. Joliat.
M. Eng.
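The abstract above mentions two audio-processing steps: obfuscation for privacy and time compression for browsing long archives. As a rough illustration of the idea (not Joliat's actual algorithms, which are detailed in the thesis itself), both can be sketched as simple grain-level operations on a sample buffer:

```python
import math
import random

def obfuscate(samples, grain=512, seed=0):
    """Shuffle short grains of an audio buffer so speech content becomes
    unintelligible while the overall spectral character survives.
    Illustrative only; not the thesis' actual obfuscation algorithm."""
    grains = [samples[i:i + grain] for i in range(0, len(samples), grain)]
    random.Random(seed).shuffle(grains)
    return [s for g in grains for s in g]

def time_compress(samples, factor, grain=512):
    """Naive time compression: keep every `factor`-th grain, so a long
    recording can be skimmed in a fraction of the time."""
    grains = [samples[i:i + grain] for i in range(0, len(samples), grain)]
    return [s for g in grains[::factor] for s in g]

# Toy signal: one second of a 440 Hz tone at an 8 kHz sample rate.
sr = 8000
tone = [math.sin(2 * math.pi * 440 * n / sr) for n in range(sr)]
scrambled = obfuscate(tone)       # same duration, scrambled grain order
skimmed = time_compress(tone, 4)  # roughly a quarter of the duration
```

Real implementations would additionally window and cross-fade the grains to avoid clicks at grain boundaries.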
Anderson, Janet E. "Sonification design for complex work domains : streams, mappings and attention." [St. Lucia, Qld.], 2004. http://www.library.uq.edu.au/pdfserve.php?image=thesisabs/absthe18173.pdf.
Grond, Florian [Verfasser]. "Listening-Mode-Centered Sonification Design for Data Exploration / Florian Grond." Bielefeld : Universitätsbibliothek Bielefeld, 2003. http://d-nb.info/1052123473/34.
Forsberg, Joel. "A Mobile Application for Improving Running Performance Using Interactive Sonification." Thesis, KTH, Tal, musik och hörsel, TMH, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-159577.
Full textDet har blivit populärt med appar som riktar sig till långdistanslöpare, men de flesta av dessa fokuserar på resultat som kommer från uträkningar av distans och tid. För att bli en bättre löpare krävs att man förbättrar både sin kroppshållning och sin löpstil. Det har blivit ett etablerat forskningsämne under de senaste årtiondena att använda sig av ljudåterkoppling för att förbättra sin prestation inom olika sporter. Detta lämpar sig väl för aktiviteter där användaren behöver fokusera sin blick på något, till exempel under löpning. Målet med det här projektet var att implementera en mobil applikation som riktar sig till att förbättra långdistanslöpares kroppshållning och löpstil. Genom att minska på energin som krävs för att springa med en viss hastighet kan löparens prestationsförmåga öka. Applikationen använder sig av sensorerna i en mobiltelefon för att analysera användarens vertikala kraft, stegfrekvens, hastighet och kroppslutning genom att sonifiera dessa parametrar på ett interaktivt sätt där musiken som användaren lyssnar på ändras på olika sätt. Implementeringen gjordes i det visuella programmeringsspråket Pure Data tillsammans med MobMuPlat, som gör att implementeringen kan användas i en mobiltelefon. Tester genomfördes med löpare med olika grader av erfarenhet, resultaten visade att löparna kunde interagera med musiken för tre av de fyra parametrarna men mer övning krävs för att kunna förändra löpstilen i realtid.
Winters, Raymond. "Exploring music through sound: sonification of emotion, gesture, and corpora." Thesis, McGill University, 2014. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=121524.
Current research in music involves the collection, analysis and display of large volumes of data, approached in a variety of ways. Although the idea of using sound to perceive scientific information has been explored before, the use of sound as a tool for studying music is a special case that remains unfortunately underdeveloped. To explore this perspective, three types of data endemic to music research are examined in this thesis: emotion, gesture and corpora. Emotion as a data type is most frequently found in the emerging field of affective computing, although the notion was addressed in music much earlier. Gesture is studied quantitatively using motion-capture systems designed to record precisely the movements of musicians or dancers during performances. Corpora here refers to large databases of music itself, such as the collected string quartets of Beethoven or a personal music collection. Although the motivations for sonification differ across these three cases, as clearly illustrated in this thesis, their shared relationship to the sound medium can yield additional benefits. In the case of emotion, sonification can draw on established knowledge of the acoustic and structural determinants of musical emotion, as well as on new computational tools designed to identify them. Sonification then finds its usefulness in the systematic and theoretically justified configurations it can propose to precisely instantiate a computational model and to extract the emotional vectors of sound from a specific musical context.
For gesture, sound can serve to represent a performer's expressive movements in the same medium as the music performed, offering simultaneous auditory access corresponding to the relevant visual cues. The ability of a purpose-built software tool to meet the goals of sonification and of expressive-movement analysis more broadly is evaluated. Finally, sonification is applied to corpus analysis. Playback at very high speed (on the order of 10⁴ notes per second) of Bach's chorales, Beethoven's string quartets or Monteverdi's madrigals produces distinct and characteristic sounds. This technique can be employed to analyze pitch-transcription algorithms.
Bodle, Carrie. "Sonification of the invisible : large scale sound installments on building facades." Thesis, Massachusetts Institute of Technology, 2005. http://hdl.handle.net/1721.1/33025.
Includes bibliographical references (p. 58).
The intention of this project is to utilize sound as a representation of MIT research, extending out to the public what may be invisible, or less known, to the broader community interested in MIT's spectrum of work. I am utilizing Building 54, also known as the Green Building, on the MIT campus to address the public and the MIT community, using sound as a vehicle for transmitting research conducted at MIT. Collaborating with scientists from MIT's Haystack Observatory, I propose the sonic display of research data at an architectural scale: a speaker setup on the south facade of the Green Building. This project will be a multi-speaker sound installment with a total of 35 public-address speakers temporarily attached to the vertical concrete columns on the building's facade. The speakers will broadcast audio representations of sound waves embedded in Earth's charged upper atmosphere, or ionosphere. These sounds make tangible the state of the ionospheric portion of the terrestrial upper atmosphere, a region under active radar study by the Atmospheric Sciences Group at MIT's Haystack Observatory. The speaker arrangement on the Green Building's facade visually suggests an upwards-sloping graph, representing the spectral frequency distribution of the sounds, which varies both in time and in altitude.
This large-scale sound installment will make tangible the converging perspectives of contemporary art and upper-atmospheric science, representative of the advanced research focus of this institution and exemplary of MIT's interest in creating an environment in which the arts merge with technology to inspire artists and scientists alike. The scale of this project is considerable, but so is the size of the Haystack Observatory installation, the distance to the ionosphere, and the iconic silhouette of the Green Building overlooking the MIT campus when viewed from the Boston bank of the Charles River.
by Carrie Bodle.
S.M.
Zhao, Haixia. "Interactive sonification of abstract data - framework, design space, evaluation, and user tool." College Park, Md. : University of Maryland, 2006. http://hdl.handle.net/1903/3394.
Thesis research directed by: Computer Science. Title from t.p. of PDF. Includes bibliographical references. Published by UMI Dissertation Services, Ann Arbor, Mich. Also available in paper.
Kallel, Monem. "Traitement biologique des rejets organiques concentres apres sonification et saponification : graisses-margines." Paris 6, 1992. http://www.theses.fr/1992PA066194.
Yang, Jiajun. "Enhancing the quality and motivation of physical exercise using real-time sonification." Thesis, University of York, 2015. http://etheses.whiterose.ac.uk/10396/.
Duarte García, Mario. "Portfolio of original compositions." Thesis, University of Manchester, 2018. https://www.research.manchester.ac.uk/portal/en/theses/portfolio-of-original-compositions(e0a5e175-d4d0-417c-a5d1-1b9587f019f4).html.
Poirier-Quinot, David. "Design of a radio direction finder for search and rescue operations : estimation, sonification, and virtual prototyping." Thesis, Paris 6, 2015. http://www.theses.fr/2015PA066138/document.
This research investigates the design of a radio Direction Finder (DF) for rescue operations using victims' cellphones as localization beacons. The design centres on an audio interface, using sound to progressively guide rescuers towards their target. The thesis' ambition is to exploit the natural mechanisms of human hearing to improve the overall performance of the search process, rather than to develop new Direction-Of-Arrival (DOA) estimation techniques. Classical DOA estimation techniques are introduced along with a range of tools to assess their efficiency. Based on these tools, a case study is proposed regarding the performance that might be expected from a lightweight DF design tailored to portable operation. It is shown that the performance of the high-resolution techniques usually implemented for DOA estimation is seriously impacted by any size constraint applied to the DF, particularly in multi-path propagation conditions. Subsequently, a review of interactive parameter-mapping sonification is proposed. Various sonification paradigms are designed and assessed regarding their capacity to convey information related to different levels of DF outputs. Listening tests are conducted, suggesting that trained subjects are capable of monitoring multiple audio streams and gathering information from complex sounds. Said tests also indicate the need for a DF sonification that perceptually orders the presented information, so that beginners can effortlessly focus on the most important data only. Careful attention is given to sound aesthetics and how they affect operators' acceptance of, and trust in, the DF, particularly regarding the perception of measurement noise during navigation. Finally, a virtual prototype is implemented that recreates DF-based navigation in a virtual environment to evaluate the proposed sonification mappings.
In parallel, a physical prototype is developed to assess the ecological validity of the virtual evaluations. Said prototype exploits a software-defined radio architecture for rapid iteration through design implementations. The overall performance evaluation study is conducted in consultation with rescue service representatives and compared with their current search solutions. It is shown that, in this context, simple DF designs based on the parallel sonification of the output signals of several antennas may produce navigation performance comparable to that of more complex designs based on high-resolution methods. As the task objective is to progressively localize a target, the system's cornerstone appears to be the robustness and consistency of its estimations rather than their instantaneous accuracy. Involving operators in the estimation avoids critical situations in which one feels helpless when faced with an autonomous system producing nonsensical estimations. Virtual prototyping proved to be a sensible and efficient method to support this study, allowing fast iteration through sonification and DF design implementations.
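One common way to sonify a direction-finding task like the one above is to map the bearing error to a continuous tone parameter. The sketch below is a generic illustration of such a mapping (pitch rising as the operator homes in on the target), not one of the specific strategies evaluated in the thesis:

```python
def bearing_error_to_pitch(error_deg, f_min=220.0, f_max=880.0):
    """Map absolute bearing error (0..180 degrees) to a tone frequency:
    the closer the operator points at the target, the higher the pitch.
    An exponential glide keeps equal error ratios perceptually similar.
    Illustrative mapping; frequency bounds are arbitrary choices."""
    e = min(abs(error_deg), 180.0) / 180.0   # normalize to [0, 1]
    return f_max * (f_min / f_max) ** e      # f_max at 0 deg, f_min at 180 deg

# Pointing straight at the target gives the highest pitch:
on_target = bearing_error_to_pitch(0)     # 880 Hz
way_off = bearing_error_to_pitch(180)     # 220 Hz
```

Driving an oscillator with this value at the DF's update rate yields a continuous homing cue that leaves the rescuer's eyes free.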
Nikolaidis, Ryan John. "A generative model of tonal tension and its application in dynamic realtime sonification." Thesis, Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/42697.
Full textEdström, Viking, and Fredrik Hallberg. "Human Interaction in 3D Manipulations : Can sonification improve the performance of the interaction?" Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-146344.
Wu, Hsin-Fu. "Spectral analysis and sonification of simulation data generated in a frequency domain experiment." Monterey, California: Naval Postgraduate School, 2002. http://hdl.handle.net/10945/9805.
Reynal, Maxime. "Non-visual interaction concepts : considering hearing, haptics and kinesthetics for an augmented remote tower environment." Thesis, Toulouse, ISAE, 2019. http://www.theses.fr/2019ESAE0034.
In an effort to simplify human resource management and reduce operational costs, control towers are now increasingly designed to be located not at the airport itself but remotely. This concept, known as the remote tower, offers a "digital" working context: the view of the runways is broadcast remotely using cameras located on site. Furthermore, the concept could be extended to the control of several airports simultaneously from one remote tower facility by a single air traffic controller (multiple remote tower). These concepts offer designers the possibility of developing novel forms of interaction. However, most current augmentations rely on sight, which is already heavily used and therefore sometimes becomes overloaded. This Ph.D. work considers the design and evaluation of new interaction techniques that rely on non-visual human senses (hearing, touch and proprioception). Two experimental campaigns were led to address specific use cases. These use cases were identified during the design process with experts from the field, and appear relevant to controllers because of the criticality of the situations they define: a) poor visibility (heavy fog, loss of video signal in the remote context), b) unauthorized movements on the ground (pilots moving their aircraft without prior clearance), c) runway incursion (an aircraft crossing the holding point to enter the runway while another is about to land), and d) handling multiple calls on distinct radio frequencies coming from multiple airports. The first experimental campaign aimed at quantifying the contribution of a multimodal interaction technique based on spatial sound, kinaesthetic interaction and vibrotactile feedback to the first use case, poor visibility.
The purpose was to enhance controllers' perception and increase the overall level of safety by providing a novel way to locate aircraft when deprived of sight. 22 controllers took part in a laboratory task within a simulated environment. Objective and subjective results showed significantly higher performance in poor visibility using interactive spatial sound coupled with vibrotactile feedback, which gave the participants notably higher accuracy in degraded visibility. Meanwhile, response times were significantly longer, while remaining acceptably short given the temporal constraints of the task. The goal of the second experimental campaign was to evaluate three other interaction modalities and feedback addressing the three other critical situations: unauthorized movements on the ground, runway incursion, and calls from a secondary airport. Interactive spatial sound, tactile stimulation and body movements were considered in designing three different interaction techniques and feedback. 16 controllers participated in an ecological experiment in which they were asked to control one or two airports (single vs. multiple operations), with the augmentations activated or not. While no clear effect of the interaction modalities emerged in multiple remote tower operations, behavioural results showed a significant increase in overall participant performance when the augmentation modalities were activated in single remote control tower operations. The first campaign was the initial step in the development of a novel interaction technique that uses sound as a precise means of localization; together, the two campaigns constitute first steps towards bringing non-visual multimodal augmentations into remote tower operations.
Peyre, Iseline. "Sonification du mouvement pour la rééducation après une lésion cérébrale acquise : conception et évaluations de dispositifs." Electronic Thesis or Diss., Sorbonne université, 2022. https://accesdistant.sorbonne-universite.fr/login?url=https://theses-intra.sorbonne-universite.fr/2022SORUS414.pdf.
As the leading cause of acquired disability in adults, non-degenerative acquired brain injuries lead to multiple disorders affecting sensory-motor, cognitive and psycho-social functions. When residual deficits become chronic, they lead to a loss of autonomy in activities of daily living. Rehabilitation promotes functional recovery through complementary methods, techniques and tools, and continuing rehabilitation in supervised autonomy during the chronic phase is now encouraged. With the development of health technologies, new forms of support are being investigated. The emergence of interactive movement-sonification tools, which provide continuous sound information in real time in relation to the movements performed, is a promising approach to rehabilitation. However, design choices, particularly the characteristics of the sound feedback and the modalities of gesture-sound interaction, are currently at the centre of debate. The main objective of this interdisciplinary health-arts-sciences thesis was to develop a movement-sonification device for the supervised autonomous rehabilitation of patients with motor impairment after an acquired brain injury. The first objective was to evaluate the effect of different types of sound feedback (sound characteristics and modalities of gesture-sound interaction) on two gestural tasks, an elbow-extension movement and postural maintenance, with participants of different profiles. The second objective was to define design criteria and select appropriate solutions for a movement-sonification device meeting the characteristics and needs of patients with upper-limb motor impairment following an acquired brain injury, with a view to use in supervised autonomy. The third objective was to initiate an evaluation of the designed device in preparation for a clinical study.
The studies carried out confirmed the effect of the presence of interactive sound feedback during the execution of gestures and the importance of taking the modalities of gesture-sound interaction into consideration. The user-centred co-design process implemented with experts from several disciplines led to the creation of an innovative, functional, flexible (customisable) mobile movement-sonification device adapted to supervised autonomous rehabilitation. The device is inexpensive and has been replicated in 10 copies. The first results of the evaluations carried out with therapists are very encouraging, opening perspectives for large-scale clinical evaluation.
Norlin Andersson, Jacob. "App for Improving Heart Rate Monitor-Based Endurance Training in Running Athletes Through Heart Beat Sonification." Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-134943.
Fitness is a big part of our modern world, especially running. Since the arrival of the smartphone, a large number of apps focused on helping runners track and improve their running have appeared, some of which include some form of audio feedback. Building on this, this thesis explores the possibility of using sonification of a runner's heart rate to improve their running by keeping them within certain heart-rate zones. The sonification was implemented as an app on the Android platform. The thesis found that while the sonification helped runners notice when they drifted outside a heart-rate zone, it came at the cost of a steady pulse. Ways to develop the concept further to better help runners improve are therefore proposed at the end.
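The zone-based feedback described above rests on classifying the current heart rate into a training zone and cueing deviations from the target zone. A minimal sketch, using the common percent-of-maximum banding and a hypothetical maximum heart rate (the thesis' actual thresholds are not reproduced here):

```python
def hr_zone(bpm, hr_max=190):
    """Return the training-zone index (1..5) for a heart rate, using
    percent-of-maximum bands. hr_max is a hypothetical default."""
    pct = bpm / hr_max
    bounds = [0.6, 0.7, 0.8, 0.9]   # lower edges of zones 2..5
    return 1 + sum(pct >= b for b in bounds)

def zone_cue(bpm, target_zone, hr_max=190):
    """Cue the app could render as three distinct earcons:
    'low' / 'ok' / 'high' relative to the target zone."""
    z = hr_zone(bpm, hr_max)
    if z == target_zone:
        return "ok"
    return "low" if z < target_zone else "high"
```

On Android, each cue would trigger a different sound (or silence for "ok") each time the runner crosses a zone boundary.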
Smith, Daniel R. "Effects of training and context on human performance in a point estimation sonification task." Thesis, Georgia Institute of Technology, 2003. http://hdl.handle.net/1853/32845.
Savard, Alexandre. "When gestures are perceived through sounds : a framework for sonification of musicians' ancillary gestures." Thesis, McGill University, 2008. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=116051.
Al Bsoul, Abeer. "Efficacité des ultrasons pour la destruction des trois biomasses." Grenoble INPG, 2009. http://www.theses.fr/2009INPG0020.
The goals of this research were to evaluate the effectiveness of sonication for the destruction of Mycobacterium sp. 6PY1 and Rhodobacter capsulatus B10 in the presence and absence of solid particles (TiO2 or SiO2), and of sonication (low and high frequency) combined with UV irradiation for the disinfection of Mycobacterium sp. 6PY1 in the presence and absence of solid particles (TiO2 or SiO2). The variables tested for both sonication systems (low- or high-frequency ultrasonic reactor) included: the sonication frequency (20 kHz or 612 kHz), the sonication time, the sonication volume, the sonication power-to-volume ratio, the initial bacterial concentration, the effect of radical scavengers (Na2CO3), the sequence of application of the two sonication systems, the effects of ultrasonic treatment on growth kinetics, and the presence of solid particles (TiO2 or SiO2).
Lindstedt, Simon, and Hannes Derler. "Ljud som sammankopplar oss : Ett utforskande av Augmented Audio Reality för att hitta interaktioner som kopplar oss samman." Thesis, Blekinge Tekniska Högskola, Institutionen för teknik och estetik, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-16567.
The purpose of this work is to investigate interactive sound in relation to Augmented Reality. We want to explore the concept by focusing on sound, thereby broadening the perception of what Augmented Reality could potentially mean. We have investigated Sonification, Audio Spatial Awareness and Augmented Reality to produce an artefact based on a combination of these theories. Our focus is to investigate how Sonification can make certain aspects of reality clearer to people, and to use this information to try to influence people's perception of each other. We aim to investigate how human interaction can be influenced by sound based on spatial data; the work is directly informed by the research that laid the theoretical foundation, combined with the primary method we chose. Performative Experience Design aims to investigate and generate interaction from a performance perspective, as it calls for an open and curious approach to human interaction. The result is a system with great potential for further development, and the beginning of a discussion of what constitutes Augmented Reality.
Portron, Arthur. "L'étude de l'influence du contexte sur la poursuite oculaire." Thesis, Paris 6, 2017. http://www.theses.fr/2017PA066123/document.
Pursuit eye movements allow us to track a target that moves continuously and slowly through our visual environment. Studies have shown that this movement is based on the simultaneous contribution of retinal signals, linked to the retinal image of the visual target and to the context; extra-retinal signals underlying cognitive processes and the efference copy; and inhibition and suppression processes related to the visual context. This dynamic combination allows the pursuit system to adapt to a wide range of contexts. While the presence of a motion signal in the visual environment is thought to be a prerequisite for initiating and then maintaining pursuit, results since the 1970s moderate this view. To investigate the mechanisms underlying the maintenance of pursuit eye movements after target disappearance, and the nature of the signals that generate pursuit, we investigated the effects of two different contexts. These contexts, one visual and one auditory, share the property of being dependent on eye movements. As a result of this dependence, each context yields a new signal, visual or auditory, carrying information about the ongoing eye movement. We studied the effects of the information induced by these contexts in procedures involving the generation and maintenance of smooth pursuit eye movements, and the generation of smooth, continuous eye movements without a moving target.
Fagergren, Emma. "Wa-UM-eii : How a Choreographer Can Use Sonification to Communicate With Dancers During Rehearsals." Thesis, Linköpings universitet, Filosofiska fakulteten, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-79766.
Full textDubus, Gaël. "Interactive sonification of motion : Design, implementation and control of expressive auditory feedback with mobile devices." Doctoral thesis, KTH, Musikakustik, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-127944.
PAPETTI, Stefano. "Sound modeling issues in interactive sonification - From basic contact events to synthesis and manipulation tools." Doctoral thesis, Università degli Studi di Verona, 2010. http://hdl.handle.net/11562/340961.
The work presented in this thesis ranges over a variety of research topics, spanning from human-computer interaction to physical modeling. What unites such broad areas of interest is the idea of using physically based computer simulations of acoustic phenomena to provide human-computer interfaces with sound feedback that is consistent with the user's interaction. In this regard, recent years have seen the emergence of several new disciplines that go under the names of, to cite a few, auditory display, sonification and sonic interaction design. This thesis deals with the design and implementation of efficient sound algorithms for interactive sonification. To this end, the physical modeling of everyday sounds is considered, that is, sounds not belonging to the families of speech and musical sounds.
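Physically based contact sounds of the kind discussed above are commonly rendered with modal synthesis, summing exponentially decaying sinusoids excited by an impact. A generic sketch of that standard model (the mode values are illustrative, not taken from the thesis):

```python
import math

def impact_sound(modes, dur=0.5, sr=8000):
    """Render a contact event as a sum of exponentially decaying
    sinusoids, the standard modal-synthesis model for impact sounds.
    `modes` is a list of (frequency_hz, decay_per_s, amplitude) triples;
    the values used below are made-up examples."""
    n = int(dur * sr)
    out = [0.0] * n
    for f, d, a in modes:
        for i in range(n):
            t = i / sr
            out[i] += a * math.exp(-d * t) * math.sin(2 * math.pi * f * t)
    return out

# A small struck-object timbre: higher modes decay faster, as in real objects.
samples = impact_sound([(440, 8, 1.0), (990, 12, 0.5), (1760, 20, 0.25)])
```

Varying the mode amplitudes with the simulated impact velocity is what makes such feedback feel consistent with the user's action.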
Deschamps, Marie-Lys, Penelope Sanderson, Kelly Hinckfuss, Caitlin Browning, Robert G. Loeb, Helen Liley, and David Liu. "Improving the detectability of oxygen saturation level targets for preterm neonates: A laboratory test of tremolo and beacon sonifications." Elsevier Science Ltd, 2016. http://hdl.handle.net/10150/617179.
Wolczynski, Leon. "Ljud- eller oljud : Hur upplevs ljudsättning av gränssnitt." Thesis, Mittuniversitetet, Avdelningen för data- och systemvetenskap, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:miun:diva-28993.
Brittell, Megen. "Neuro-imaging Support for the Use of Audio to Represent Geospatial Location in Cartographic Design." Thesis, University of Oregon, 2019. http://hdl.handle.net/1794/24538.
Banf, Michael [Verfasser]. "Auditory image understanding for the visually impaired based on a modular computer vision sonification model / Michael Banf." Siegen : Universitätsbibliothek der Universität Siegen, 2013. http://d-nb.info/1045776394/34.
Scholz, Daniel S. [Verfasser]. "Sonification of arm movements in stroke rehabilitation: a novel approach in neurologic music therapy / Daniel S. Scholz." Hannover : Bibliothek der Tierärztlichen Hochschule Hannover, 2015. http://d-nb.info/1080868100/34.
Franchin, Wagner José. "Adição e avaliação de estímulos sonoros como ferramenta de apoio à exploração visual de dados." Universidade de São Paulo, 2007. http://www.teses.usp.br/teses/disponiveis/55/55134/tde-05122007-115432/.
Visualization is a generic process that uses visual and interactive representations to ease the analysis and understanding of complex datasets. To date, most visualization toolkits rely almost exclusively on visual display to represent information, which has limited their capacity for data presentation and exploration. Many studies have shown that sound as an alternative data display (sonification) can support information interpretation and add dimensions to a visual display. Sonification is the object of study of this work, which implements a new sonification module for a recently developed visual exploration system, Super Spider (Watanabe, 2007), extending it with functionality to support data exploration through sound. A new system, called Sonar 2D, was also developed and integrated with Super Spider, introducing a new data sonification technique. In addition, this work presents results of user evaluations validating some of the visual and sound mappings employed in both systems.
Lansley, Alastair. "Adventures in software engineering : plugging HCI & accessibility gaps with open source solutions." Thesis, Federation University Australia, 2020. http://researchonline.federation.edu.au/vital/access/HandleResolver/1959.17/174516.
Doctor of Philosophy
Bressolette, Benjamin. "Manipulations gestuelles d'un objet virtuel sonifié pour le contrôle d'une interface en situation de conduite." Thesis, Ecole centrale de Marseille, 2018. http://www.theses.fr/2018ECDM0009/document.
Car manufacturers offer a wide range of secondary driving controls, such as GPS, music, or ventilation, often grouped on a central touchscreen. However, operating these controls while driving is unsafe: engaging the sense of sight to interact with the interface can reduce vigilance towards the driving task and lead to high-risk situations. In this PhD thesis, part of a collaborative research project involving the PSA Group and the PRISM laboratory, we propose an association of gesture and sound as an alternative to visual solicitation. The goal is to enable eyes-free interface interactions, allowing the driver to keep their eyes on the road. When interface manipulations are performed jointly with the driving task, a multisensory solicitation can lower the driver's cognitive load compared with a unimodal visual situation. To make the gesture-sound association feel more natural, a virtual object that can be handled with gestures is introduced. This object is the support for sonification strategies, constructed by analogy with sounds from our environment that are the consequence of an action on an object. The virtual object also makes it possible to structure different gestures around the same metaphor, or to redefine the interface's menu. The first part of this thesis deals with the development of sonification strategies intended to inform users about the dynamics of the virtual object. Two perceptual experiments were set up, leading to the selection of two valuable sonification strategies. In a second part, the automotive application was addressed by designing new sound stimuli and the interface, and by studying multisensory integration. Sounds were proposed for each of the two sonification strategies in order to progress towards in-vehicle integration. The evocations conveyed by the association of gestures and sounds were the subject of a third, blind perceptual experiment.
The concepts around the virtual object were initially unknown and were gradually discovered by the subjects. The mental images conveyed by the sonification strategies can help users familiarize themselves with the interface. A fourth perceptual experiment focused on first-time handling of the virtual object, studying the integration of audio-visual stimuli in the context of an interface manipulation. The experimental conditions were similar to a driver first discovering the interface in a parked vehicle through audio-visual stimuli, and then operating it through the sonification strategies alone. The results of this experiment led to the design of a gestural interface, which was compared with a touchscreen interface in a final perceptual experiment carried out in a driving simulator. Although the results show better performance for the tactile interface, the combination of gestures and sounds proved effective in terms of cognitive load. The gestural interface can therefore offer a promising alternative or complement to tactile interfaces for safe simultaneous use while driving.
Landelle, Caroline. "Impact du vieillissement sur la perception multisensorielle et les processus cérébraux sous-jacents : étude de la kinesthésie et de la perception de textures." Thesis, Aix-Marseille, 2019. http://www.theses.fr/2019AIXM0146.
We perceive our body and our environment better when we take several sensory sources into account at the same time. However, all sensory systems gradually decline with aging. This thesis contributes to a better understanding of how multisensory perception and the underlying brain networks are modified in the elderly. This work highlights both a reweighting of sensory information and a general facilitation of interaction processes between the senses to optimize the perception of body movements or of textures, starting as early as 65 years of age. At the brain level, the breakdown of inhibitory processes with age may lead to a poorer selection of networks and may explain perceptual disorders. Nevertheless, older people could benefit from less specific brain recruitment to compensate, at least partially, for these sensory declines.
Denjean, Sebastien. "Sonification des véhicules électriques par illusions auditives : étude de l'intégration audiovisuelle de la perception du mouvement automobile en simulateur de conduite." Thesis, Aix-Marseille, 2015. http://www.theses.fr/2015AIXM4710.
This thesis aims to build an auditory display to sonify electric vehicles. Our goal is to bring back to the driver the motion information usually provided by combustion engine noise. The first stage of this work analyzed how automotive noises influence drivers' perception of motion. We conducted two driving-simulator experiments to study drivers' speed perception in the presence of different automotive noises. The results provided a link between the acoustic feedback and the speed perceived by the driver, on which we based our sonification strategy. Like combustion engine noise, the acoustic feedback proposed in this work informs the driver through its pitch variation. We used the Shepard-Risset glissando illusion to sonify the whole speed range of the vehicle. The pitch circularity in the construction of these sounds provides precise information on small speed variations through fast pitch variations, while remaining within a narrow bandwidth. We then tested the contribution of this strategy in two experiments: the first dealt with the influence of the proposed sounds on drivers' speed perception; the second with their behavior in a common braking task. These studies showed that drivers easily integrate the information carried by this sound, and that it influences their perception of motion and modifies their driving behavior. These findings make the proposed sound a good candidate to become the new "engine noise" of future electric cars.
Henriks, Olof. "Mapping physical movement parameters to auditory parameters by using human body movement." Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-200831.
Henthorne, Cody M. "Sonifying Performance Data to Facilitate Tuning of Complex Systems." Thesis, Virginia Tech, 2010. http://hdl.handle.net/10919/78162.
Master of Science
Banf, Michael [Verfasser]. "Making the Visual World Audible : Auditory Image Understanding for the Visually Impaired Based on a Modular Computer Vision Sonification Model / Michael Banf." Aachen : Shaker, 2013. http://nbn-resolving.de/urn:nbn:de:101:1-20140428791.