Academic literature on the topic 'Interactive sonification'


Below are lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Interactive sonification.'


Journal articles on the topic "Interactive sonification"

1

Bresin, Roberto, Thomas Hermann, and Andy Hunt. "Interactive sonification." Journal on Multimodal User Interfaces 5, no. 3-4 (April 20, 2012): 85–86. http://dx.doi.org/10.1007/s12193-012-0095-7.

2

Grond, Florian, and Thomas Hermann. "Interactive Sonification for Data Exploration: How listening modes and display purposes define design guidelines." Organised Sound 19, no. 1 (February 26, 2014): 41–51. http://dx.doi.org/10.1017/s1355771813000393.

Abstract:
The desire to make data accessible through the sense of listening has led to ongoing research in the fields of sonification and auditory display since the early 1990s. Coming from the disciplines of computer science and human-computer interaction (HCI), the conceptualisation of sonification has been mostly driven by application areas and methods. On the other hand, the sonic arts, which have always participated in the auditory display community, have a genuine focus on sound. Despite these close interdisciplinary relationships between communities of sound practitioners, a rich, sound- or listening-centred concept of sonification from which design guidelines could be derived is still missing. Complementary to the useful organisation by fields of application, a proper conceptual framework for sound needs to be abstracted from applications and, to some degree, from tasks, as neither is directly related to sound. As an initial approach to recasting the thinking about sonification, we propose a conceptualisation of sonifications along two poles, in which sound serves either a normative or a descriptive purpose. According to these two poles, design guidelines can be developed proper to display purposes and listening modes.
3

Kirke, Alexis, Samuel Freeman, and Eduardo Reck Miranda. "Wireless Interactive Sonification of Large Water Waves to Demonstrate the Facilities of a Large-Scale Research Wave Tank." Computer Music Journal 39, no. 3 (September 2015): 59–70. http://dx.doi.org/10.1162/comj_a_00315.

Abstract:
Interactive sonification can provide a platform for demonstration and education as well as for monitoring and investigation. We present a system designed to demonstrate the facilities of the UK's most advanced large-scale research wave tank. The interactive sonification of water waves in the “ocean basin” wave tank at Plymouth University consisted of a number of elements: generation of ocean waves, acquisition and sonification of ocean-wave measurement data, and gesture-controlled pitch and amplitude of sonifications. The generated water waves were linked in real time to sonic features via depth monitors and motion tracking of a floating buoy. Types of water-wave patterns, varying in shape and size, were selected and triggered using wireless motion detectors attached to the demonstrator's arms. The system was implemented on a network of five computers utilizing Max/MSP alongside specialist marine research software, and was demonstrated live in a public performance for the formal opening of the Marine Institute building.
4

Lindborg, PerMagnus. "Interactive Sonification of Weather Data for The Locust Wrath, a Multimedia Dance Performance." Leonardo 51, no. 5 (October 2018): 466–74. http://dx.doi.org/10.1162/leon_a_01339.

Abstract:
To work flexibly with the sound design for The Locust Wrath, a multimedia dance performance on the topic of climate change, the author developed software for interactive sonification of climate data. An open-ended approach to parameter mapping allowed tweaking and improvisation during rehearsals, resulting in a large range of musical expression. The sonifications represented weather systems pushing through Southeast Asia in complex patterns. The climate was rendered as a piece of electroacoustic music, whose compositional form—gesture, timbre, intensity, harmony, spatiality—was determined by the data. The article discusses aspects of aesthetic sonification, reports the process of developing the present work and contextualizes the design decisions within theories of cross-modal perception and listening modes.
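Parameter-mapping sonification, the technique underlying several of the works above, can be sketched in a few lines. The mapping below is purely illustrative and not taken from any listed work: the variable names and ranges are invented. A data value is rescaled into a MIDI pitch range and converted to a frequency.

```python
# Illustrative parameter-mapping sonification sketch (hypothetical names and
# ranges; not taken from any of the works listed above).

def linmap(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly rescale value from [in_lo, in_hi] to [out_lo, out_hi]."""
    t = (value - in_lo) / (in_hi - in_lo)
    t = max(0.0, min(1.0, t))          # clamp out-of-range data
    return out_lo + t * (out_hi - out_lo)

def map_to_pitch(temperature_c):
    """Map a temperature reading to a MIDI note number (warmer = higher)."""
    return round(linmap(temperature_c, 20.0, 40.0, 48, 84))

def midi_to_hz(note):
    """Convert a MIDI note number to frequency in Hz (A4 = 440 Hz)."""
    return 440.0 * 2.0 ** ((note - 69) / 12.0)

if __name__ == "__main__":
    for t in (20.0, 30.0, 40.0):
        note = map_to_pitch(t)
        print(t, note, round(midi_to_hz(note), 1))
```

In a real system the resulting frequency would drive a synthesis engine; the "open-ended" mappings described above amount to making functions like `map_to_pitch` adjustable at run time.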
5

Zhao, Haixia, B. K. Smith, K. Norman, C. Plaisant, and B. Shneiderman. "Interactive Sonification of Choropleth Maps." IEEE Multimedia 12, no. 2 (April 2005): 26–35. http://dx.doi.org/10.1109/mmul.2005.28.

6

Degara, Norberto, Andy Hunt, and Thomas Hermann. "Interactive Sonification [Guest editors' introduction]." IEEE MultiMedia 22, no. 1 (January 2015): 20–23. http://dx.doi.org/10.1109/mmul.2015.8.

7

Pauletto, Sandra, and Andy Hunt. "Interactive sonification of complex data." International Journal of Human-Computer Studies 67, no. 11 (November 2009): 923–33. http://dx.doi.org/10.1016/j.ijhcs.2009.05.006.

8

Weinberg, Gil, and Travis Thatcher. "Interactive Sonification: Aesthetics, Functionality and Performance." Leonardo Music Journal 16 (December 2006): 9–12. http://dx.doi.org/10.1162/lmj.2006.16.9.

Abstract:
The authors present a sonification installation that allows a group of players to interact with an auditory display of neural activity. The system is designed to represent electrical spike propagation in a neuron culture through sound propagation in space. Participants can simulate neural spikes by using a set of specially designed controllers, experimenting and sonically investigating the electrical activity of the brain. The article discusses some aesthetic and functional aspects of sonification and describes the authors' approach for group interaction with auditory displays. It concludes with the description of a performance piece for the system and ideas for improvements and future work.
9

Zhao, Haixia. "Interactive sonification of geo-referenced data." ACM SIGACCESS Accessibility and Computing, no. 82 (June 2005): 33–36. http://dx.doi.org/10.1145/1077238.1077244.

10

Han, Yoon Chung, and Byeong-jun Han. "Skin Pattern Sonification as a New Timbral Expression." Leonardo Music Journal 24 (December 2014): 41–43. http://dx.doi.org/10.1162/lmj_a_00199.

Abstract:
The authors discuss two sonification projects that transform fingerprint and skin patterns into audio: (1) Digiti Sonus, an interactive installation performing fingerprint sonification and visualization and (2) skin pattern sonification, which converts pore networks into sound. The projects include novel techniques for representing user-intended fingerprint expression and skin pattern selection as audio parameters.

Dissertations / Theses on the topic "Interactive sonification"

1

Ejdbo, Malin, and Elias Elmquist. "Interactive Sonification in OpenSpace." Thesis, Linköpings universitet, Medie- och Informationsteknik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-170250.

Abstract:
This report presents the work of a master's thesis whose aim was to investigate how sonification can be used in the space visualization software OpenSpace to further convey information about the Solar System. A sonification was implemented in SuperCollider and integrated into OpenSpace using Open Sound Control to send positional data that controls the panning and sound level of the sonification. The graphical user interface of OpenSpace was also extended to make the sonification interactive. Evaluations were conducted both online and in the Dome theater to assess how well the sonification conveyed information. The outcome of the evaluations is promising, suggesting that sonification has a future in conveying information about the Solar System.
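The thesis above drives panning and sound level from positional data sent over Open Sound Control to SuperCollider. As a minimal, stdlib-only sketch of that idea (the function names and formulas here are ours, not the thesis's), a source's azimuth can be turned into equal-power stereo gains and its distance into an attenuation factor:

```python
import math

# Hypothetical sketch of position-driven panning and level control.
# A real system would send these values to a synthesis engine over OSC.

def pan_gains(azimuth_rad):
    """Equal-power stereo gains for a source at the given azimuth.

    azimuth_rad: -pi/2 (hard left) .. +pi/2 (hard right), 0 = centre.
    Returns (left_gain, right_gain).
    """
    # Map azimuth onto a 0..pi/2 panning angle for the constant-power law.
    theta = (azimuth_rad + math.pi / 2) / 2
    return math.cos(theta), math.sin(theta)

def distance_level(distance, ref=1.0):
    """Inverse-distance amplitude attenuation, full level within ref."""
    return min(1.0, ref / max(distance, 1e-9))

if __name__ == "__main__":
    left, right = pan_gains(0.0)            # centred source
    print(round(left, 3), round(right, 3))  # both ~0.707
    print(distance_level(4.0))              # 0.25
```

The constant-power law keeps `left**2 + right**2 == 1`, so perceived loudness stays roughly constant as a source pans across the stereo field.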
2

Perkins, Rhys John. "Interactive sonification of a physics engine." Thesis, Anglia Ruskin University, 2013. http://arro.anglia.ac.uk/323077/.

Abstract:
Physics engines have become increasingly prevalent in everyday technology. In the context of this thesis they are regarded as a readily available data set that has the potential to intuitively present the process of sonification to a wide audience. Unfortunately, this process is not the focus of attention when formative decisions are made concerning the continued development of these engines. This may reveal a missed opportunity when considering that the field of interactive sonification upholds the importance of physical causalities for the analysis of data through sound. The following investigation deliberates the contextual framework of this field to argue that the physics engine, as part of typical game engine architecture, is an appropriate foundation on which to design and implement a dynamic toolset for interactive sonification. The basis for this design is supported by a number of significant theories which suggest that the underlying data of a rigid body dynamics physics system can sustain an inherent audiovisual metaphor for interaction, interpretation and analysis. Furthermore, it is determined that this metaphor can be enhanced by the extraordinary potential of the computer in order to construct unique abstractions which build upon the many pertinent ideas and practices within the surrounding literature. These abstractions result in a mental model for the transformation of data to sound that has a number of advantages in contrast to a physical modelling approach while maintaining its same creative potential for instrument building, composition and live performance. Ambitions for both sonification and its creative potential are realised by several components which present the user with a range of options for interacting with this model. The implementation of these components effectuates a design that can be demonstrated to offer a unique interpretation of existing strategies as well as overcoming certain limitations of comparable work.
3

Forsberg, Joel. "A Mobile Application for Improving Running Performance Using Interactive Sonification." Thesis, KTH, Tal, musik och hörsel, TMH, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-159577.

Abstract:
Apps that assist long-distance runners have become popular; however, most of them focus on results computed from distance and time. To become a better runner, an improvement of both body posture and running gait is required. Using sonic feedback to improve performance in different sports has become an established research area during the last two decades. Sonic feedback is particularly well suited to activities where the user has to maintain visual focus on something, for example when running. The goal of this project was to implement a mobile application that addresses long-distance runners' body posture and running gait. By decreasing the energy demand for a specific velocity, the runner's performance can be improved. The application uses the sensors in a mobile phone to analyze the runner's vertical force, step frequency, velocity and body tilt, together with an interactive sonification of those parameters that alters the music the user is listening to. The implementation was made in the visual programming language Pure Data together with MobMuPlat, which enables the use of Pure Data on a mobile phone. Tests were carried out with runners of different levels of experience; the results showed that the runners could interact with the music for three of the four parameters, but more training is required to be able to change the running gait in real time.
4

Zhao, Haixia. "Interactive sonification of abstract data - framework, design space, evaluation, and user tool." College Park, Md. : University of Maryland, 2006. http://hdl.handle.net/1903/3394.

Note:
Thesis (Ph.D.) -- University of Maryland, College Park, 2006. Thesis research directed by: Computer Science.
5

Dubus, Gaël. "Interactive sonification of motion : Design, implementation and control of expressive auditory feedback with mobile devices." Doctoral thesis, KTH, Musikakustik, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-127944.

Abstract:
Sound and motion are intrinsically related, by their physical nature and through the link between auditory perception and motor control. If sound provides information about the characteristics of a movement, a movement can also be influenced or triggered by a sound pattern. This thesis investigates how this link can be reinforced by means of interactive sonification. Sonification, the use of sound to communicate, perceptualize and interpret data, can be used in many different contexts. It is particularly well suited for time-related tasks such as monitoring and synchronization, and is therefore an ideal candidate to support the design of applications related to physical training. Our objectives are to develop and investigate computational models for the sonification of motion data with a particular focus on expressive movement and gesture, and for the sonification of elite athletes' movements. We chose to develop our applications on a mobile platform in order to make use of advanced interaction modes using an easily accessible technology. In addition, networking capabilities of modern smartphones potentially allow for adding a social dimension to our sonification applications by extending them to several collaborating users. The sport of rowing was chosen to illustrate the assistance that an interactive sonification system can provide to elite athletes. Bringing into play complex interactions between various kinematic and kinetic quantities, studies on rowing kinematics provide guidelines to optimize rowing efficiency, e.g. by minimizing velocity fluctuations around average velocity. However, rowers can only rely on sparse cues to get information relative to boat velocity, such as the sound made by the water splashing on the hull. We believe that an interactive augmented feedback communicating the dynamic evolution of some kinematic quantities could represent a promising way of enhancing the training of elite rowers.
Since only limited space is available on a rowing boat, the use of mobile phones appears appropriate for handling streams of incoming data from various sensors and generating an auditory feedback simultaneously. The development of sonification models for rowing and their design evaluation in offline conditions are presented in Paper I. In Paper II, three different models for sonifying the synchronization of the movements of two users holding a mobile phone are explored. Sonification of expressive gestures by means of expressive music performance is tackled in Paper III. In Paper IV, we introduce a database of mobile applications related to sound and music computing. An overview of the field of sonification is presented in Paper V, along with a systematic review of mapping strategies for sonifying physical quantities. Physical and auditory dimensions were both classified into generic conceptual dimensions, and proportion of use was analyzed in order to identify the most popular mappings. Finally, Paper VI summarizes experiments conducted with the Swedish national rowing team in order to assess sonification models in an interactive context.


6

Edström, Viking, and Fredrik Hallberg. "Human Interaction in 3D Manipulations : Can sonification improve the performance of the interaction?" Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-146344.

Abstract:
In this report the effects of using sonification when performing movements in 3D space are explored. User studies were performed where participants had to repeatedly move their hand toward a target. Three different sonification modes were tested where the fundamental frequency, sound level and sound rate were varied respectively depending on the distance to the target. The results show that there is no statistically significant performance increase for any sonification mode. There is however an indication that sonification increases the interaction speed for some users. The mode which provided the greatest average performance increase was when the sound level was varied. This mode gave a 7% average speed increase over the silent control mode. However, the sound level mode has some significant drawbacks, especially the very high base volume requirement, which might not make it the best suited sonification mode for all applications. In the general case we instead recommend using the sonification mode that varies the sound rate, which gave a slightly lower performance gain but can be played at a lower volume due to its binary nature.
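The "sound rate" mode recommended above, a pulse train that clicks faster as the hand nears the target, is easy to sketch. The rate range below is invented for illustration and is not taken from the thesis:

```python
# Illustrative distance-to-pulse-rate mapping (hypothetical ranges;
# not the parameters used in the study above).

def pulse_rate_hz(distance_m, max_dist=1.0, min_rate=2.0, max_rate=20.0):
    """Map distance-to-target to a click rate: closer -> faster pulses."""
    d = max(0.0, min(distance_m, max_dist))      # clamp to the working range
    return max_rate - (max_rate - min_rate) * (d / max_dist)

def pulse_period_s(distance_m):
    """Interval between successive clicks at the current distance."""
    return 1.0 / pulse_rate_hz(distance_m)

if __name__ == "__main__":
    print(pulse_rate_hz(1.0))   # 2.0  (far: slow clicks)
    print(pulse_rate_hz(0.0))   # 20.0 (at target: fast clicks)
```

Because each click is a short on/off event ("binary", as the abstract puts it), the rate cue survives at low playback volume, which is the property the authors cite in its favour.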
7

Reynal, Maxime. "Non-visual interaction concepts : considering hearing, haptics and kinesthetics for an augmented remote tower environment." Thesis, Toulouse, ISAE, 2019. http://www.theses.fr/2019ESAE0034.

Abstract:
In an effort to simplify human resource management and reduce operational costs, control towers are now increasingly designed to be located away from the airport rather than on it. This concept, known as the remote tower, offers a "digital" working context: the view of the runways is broadcast remotely using cameras located on site. Furthermore, the concept can be extended to the control of several airports simultaneously from one remote tower facility, by a single air traffic controller (multiple remote tower). These concepts offer designers the possibility to develop novel forms of interaction. However, most current augmentations rely on sight, which is already heavily used and therefore sometimes overloaded. In this Ph.D. work, the design and evaluation of new interaction techniques that rely on non-visual human senses (hearing, touch and proprioception) have been considered. Two experimental campaigns were conducted to address specific use cases, identified during the design process with experts from the field as relevant to controllers because of their criticality: a) poor visibility (heavy fog conditions, loss of video signal in the remote context), b) unauthorized movements on the ground (pilots moving their aircraft without prior clearance), c) runway incursion (an aircraft crossing the holding point to enter the runway while another is about to land), and d) handling multiple calls on distinct radio frequencies coming from multiple airports. The first experimental campaign aimed to quantify the contribution of a multimodal interaction technique based on spatial sound, kinaesthetic interaction and vibrotactile feedback to the first use case, poor visibility.
The purpose was to enhance controllers' perception and increase the overall level of safety by providing them a novel way to locate aircraft when deprived of sight. 22 controllers were involved in a laboratory task within a simulated environment. Objective and subjective results showed significantly higher performance in poor visibility using interactive spatial sound coupled with vibrotactile feedback, which gave participants notably higher accuracy in degraded visibility. Meanwhile, response times were significantly longer while remaining acceptably short considering the temporal demands of the task. The goal of the second experimental campaign was to evaluate three other interaction modalities and feedback addressing three other critical situations, namely unauthorized movements on the ground, runway incursion, and calls from a secondary airport. We considered interactive spatial sound, tactile stimulation and body movements to design three different interaction techniques and feedback. 16 controllers participated in an ecological experiment in which they were asked to control 1 or 2 airports (single vs. multiple operations), with augmentations activated or not. While no clear effects of the interaction modalities emerged in multiple remote tower operations, behavioural results showed a significant increase in overall participant performance when augmentation modalities were activated in single remote tower operations. The first campaign was the initial step in the development of a novel interaction technique that uses sound as a precise means of localisation. Together, the two campaigns constitute the first steps toward considering non-visual multimodal augmentations in single and multiple remote tower operations.
8

Parseihian, Gaëtan. "Sonification binaurale pour l'aide à la navigation." Phd thesis, Université Pierre et Marie Curie - Paris VI, 2012. http://tel.archives-ouvertes.fr/tel-00771316.

Abstract:
In this thesis, we propose an augmented-reality system based on 3D sound and sonification, whose objective is to provide visually impaired people with the information needed for reliable and safe travel. The design of this system was approached along three axes. The use of binaural synthesis to generate 3D sound is limited by the problem of HRTF individualisation. A method was developed to adapt individuals to non-individual HRTFs by exploiting the plasticity of the brain. Evaluated with a localisation experiment, this method showed that a virtual audio-spatial map can be acquired quickly without using vision. The sonification of spatial data was studied in the context of a system for grasping objects in peripersonal space. Localisation abilities for real and virtual sound sources were studied with a localisation test. A technique for sonifying distance was developed: by linking the parameter to be sonified to the parameters of an audio effect, it can be applied to any type of sound without requiring additional learning. A sonification strategy that takes user preferences into account was also devised. "Morphocons" are auditory icons defined by patterns of acoustic parameters; this method allows the construction of a sound vocabulary independent of the sound used. A categorisation test showed that subjects are able to recognise auditory icons on the basis of a morphological description regardless of the type of sound used.
9

Savard, Alexandre. "When gestures are perceived through sounds : a framework for sonification of musicians' ancillary gestures." Thesis, McGill University, 2008. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=116051.

Abstract:
This thesis presents a multimodal sonification system that combines video with sound synthesis generated from motion capture data. Such a system allows for a fast and efficient exploration of musicians' ancillary gestural data, for which sonification complements conventional videos by stressing certain details which could escape one's attention if not displayed using an appropriate representation. The main objective of this project is to provide a research tool designed for people that are not necessarily familiar with signal processing or computer sciences. This tool is capable of easily generating meaningful sonifications thanks to dedicated mapping strategies. On the one hand, the dimensionality reduction of data obtained from motion capture systems such as the Vicon is fundamental as it may exceed 350 signals describing gestures. For that reason, a Principal Component Analysis is used to objectively reduce the number of signals to a subset that conveys the most significant gesture information in terms of signal variance. On the other hand, movement data presents high variability depending on the subjects: additional control parameters for sound synthesis are offered to restrain the sonification to the significant gestures, easily perceivable visually in terms of speed and path distance. Then, signal conditioning techniques are proposed to adapt the control signals to sound synthesis parameter requirements or to allow for emphasizing certain gesture characteristics that one finds important. All those data treatments are performed in realtime within one unique environment, minimizing data manipulation and facilitating efficient sonification designs. Realtime process also allows for an instantaneous system reset to parameter changes and process selection so that the user can easily and interactively manipulate data, design and adjust sonifications strategies.
10

Smith, Daniel R. "Effects of training and context on human performance in a point estimation sonification task." Thesis, Georgia Institute of Technology, 2003. http://hdl.handle.net/1853/32845.


Book chapters on the topic "Interactive sonification"

1

Tünnermann, René, Lukas Kolbe, Till Bovermann, and Thomas Hermann. "Surface Interactions for Interactive Sonification." In Auditory Display, 166–83. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-12439-6_9.

2

Ritterbusch, Sebastian, and Gerhard Jaworek. "Camassia: Monocular Interactive Mobile Way Sonification." In Lecture Notes in Computer Science, 12–18. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-94274-2_2.

3

Goudarzi, Visda. "Exploring a Taxonomy of Interaction in Interactive Sonification Systems." In Human Interaction, Emerging Technologies and Future Applications III, 140–45. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-55307-4_22.

4

Wörtwein, Torsten, Boris Schauerte, Karin Müller, and Rainer Stiefelhagen. "Mobile Interactive Image Sonification for the Blind." In Lecture Notes in Computer Science, 212–19. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-41264-1_28.

5

Jeon, Myounghoon, Riley J. Winton, Ashley G. Henry, Sanghun Oh, Carrie M. Bruce, and Bruce N. Walker. "Designing Interactive Sonification for Live Aquarium Exhibits." In Communications in Computer and Information Science, 332–36. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-39473-7_67.

6

Jeon, Myounghoon, Michael T. Smith, James W. Walker, and Scott A. Kuhl. "Constructing the Immersive Interactive Sonification Platform (iISoP)." In Distributed, Ambient, and Pervasive Interactions, 337–48. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-07788-8_32.

7

Bardelli, Sandro, Claudia Ferretti, Luca Andrea Ludovico, Giorgio Presti, and Maurizio Rinaldi. "A Sonification of the zCOSMOS Galaxy Dataset." In Culture and Computing. Interactive Cultural Heritage and Arts, 171–88. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-77411-0_12.

8

Shelley, Simon, Miguel Alonso, Jacqueline Hollowood, Michael Pettitt, Sarah Sharples, Dik Hermes, and Armin Kohlrausch. "Interactive Sonification of Curve Shape and Curvature Data." In Haptic and Audio Interaction Design, 51–60. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-04076-4_6.

9

Walker, James, Michael T. Smith, and Myounghoon Jeon. "Interactive Sonification Markup Language (ISML) for Efficient Motion-Sound Mappings." In Human-Computer Interaction: Interaction Technologies, 385–94. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-20916-6_36.

10

Hermann, Thomas, Oliver Höner, and Helge Ritter. "AcouMotion – An Interactive Sonification System for Acoustic Motion Control." In Lecture Notes in Computer Science, 312–23. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11678816_35.


Conference papers on the topic "Interactive sonification"

1

Turchet, Luca. "Interactive sonification and the IoT." In AM'19: Audio Mostly. New York, NY, USA: ACM, 2019. http://dx.doi.org/10.1145/3356590.3356631.

2

Zhao, Haixia. "Interactive sonification of geo-referenced data." In CHI '05 extended abstracts. New York, New York, USA: ACM Press, 2005. http://dx.doi.org/10.1145/1056808.1056848.

3

Wang, Hanqin, and Alexei Sourin. "Feasibility Study on Interactive Geometry Sonification." In 2022 International Conference on Cyberworlds (CW). IEEE, 2022. http://dx.doi.org/10.1109/cw55638.2022.00036.

4

O’Neill, Charles A., and Kia Ng. "Hearing Images: Interactive Sonification Interface for Images." In Electronic Visualisation and the Arts (EVA 2008). BCS Learning & Development, 2008. http://dx.doi.org/10.14236/ewic/eva2008.22.

5

O'Neill, Charles, and Kia Ng. "Hearing Images: Interactive Sonification Interface for Images." In 2008 International Conference on Automated Solutions for Cross Media Content and Multi-Channel Distribution (AXMEDIS). IEEE, 2008. http://dx.doi.org/10.1109/axmedis.2008.42.

6

Wörtwein, Torsten, Boris Schauerte, Karin E. Müller, and Rainer Stiefelhagen. "Interactive Web-based Image Sonification for the Blind." In ICMI '15: INTERNATIONAL CONFERENCE ON MULTIMODAL INTERACTION. New York, NY, USA: ACM, 2015. http://dx.doi.org/10.1145/2818346.2823298.

7

Dal Rì, Francesco, and Raul Masu. "Zugzwang: Chess Representation Combining Sonification and Interactive Performance." In AM '21: Audio Mostly 2021. New York, NY, USA: ACM, 2021. http://dx.doi.org/10.1145/3478384.3478394.

8

Ghisio, Simone, Paolo Coletta, Stefano Piana, Paolo Alborno, Gualtiero Volpe, Antonio Camurri, Ludovica Primavera, et al. "An Open Platform for Full Body Interactive Sonification Exergames." In 7th International Conference on Intelligent Technologies for Interactive Entertainment. IEEE, 2015. http://dx.doi.org/10.4108/icst.intetain.2015.259584.

9

Alonso, Miguel, Simon Shelley, Dik Hermes, and Armin Kohlrausch. "Evaluating geometrical properties of virtual shapes using interactive sonification." In 2008 IEEE International Workshop on Haptic Audio visual Environments and Games (HAVE 2008). IEEE, 2008. http://dx.doi.org/10.1109/have.2008.4685316.

10

Liu, Wanyu, Artem Dementyev, Diemo Schwarz, Emmanuel Fléty, Wendy E. Mackay, Michel Beaudouin-Lafon, and Frederic Bevilacqua. "SonicHoop: Using Interactive Sonification to Support Aerial Hoop Practices." In CHI '21: CHI Conference on Human Factors in Computing Systems. New York, NY, USA: ACM, 2021. http://dx.doi.org/10.1145/3411764.3445539.

