Dissertations / Theses on the topic 'Time in music Data processing'




Consult the top 50 dissertations / theses for your research on the topic 'Time in music Data processing.'




1

Chang, Kuo-Lung. "A Real-Time Merging-Buffering Technique for MIDI Messages." Thesis, University of North Texas, 1991. https://digital.library.unt.edu/ark:/67531/metadc500471/.

Full text
Abstract:
A powerful and efficient algorithm has been designed to deal with the critical timing problem of MIDI messages. The algorithm converts note events stored in a natural way into MIDI messages dynamically. Only a limited memory space (the buffer) is required to carry out the conversion, and the size of the buffer is independent of the size of the original note sequence. The algorithm's real-time properties allow not only flexible real-time control of musical parameters but also expansion to interactive multimedia applications. A compositional environment called MusicSculptor has been implemented on top of this algorithm.
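As a rough illustration of the merging-buffering idea described above (a hypothetical sketch with invented names, not the thesis's actual algorithm), note events can be expanded into note-on/note-off messages held in a small time-ordered heap, so memory is bounded by the number of pending messages rather than the length of the whole sequence:

```python
import heapq

class MidiMergeBuffer:
    """Fixed-size buffer that merges note events into time-ordered MIDI messages."""

    def __init__(self):
        self._heap = []      # (time, tie-breaker, message) triples, ordered by time
        self._counter = 0    # tie-breaker so simultaneous events stay stable

    def add_note(self, onset, duration, pitch, velocity=64, channel=0):
        # One note event becomes two MIDI messages: note-on and note-off.
        on = (0x90 | channel, pitch, velocity)
        off = (0x80 | channel, pitch, 0)
        heapq.heappush(self._heap, (onset, self._counter, on)); self._counter += 1
        heapq.heappush(self._heap, (onset + duration, self._counter, off)); self._counter += 1

    def pop_due(self, now):
        """Return all messages whose scheduled time has arrived."""
        due = []
        while self._heap and self._heap[0][0] <= now:
            t, _, msg = heapq.heappop(self._heap)
            due.append((t, msg))
        return due

buf = MidiMergeBuffer()
buf.add_note(onset=0.0, duration=0.5, pitch=60)   # middle C
buf.add_note(onset=0.25, duration=0.5, pitch=64)
print(buf.pop_due(0.3))   # the two note-ons; both note-offs are still pending
```

Because only pending messages occupy the heap, the buffer stays small regardless of how long the original sequence is.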
APA, Harvard, Vancouver, ISO, and other styles
2

Shafer, Seth. "Recent Approaches to Real-Time Notation." Thesis, University of North Texas, 2017. https://digital.library.unt.edu/ark:/67531/metadc984210/.

Full text
Abstract:
This paper discusses several compositions that use the computer screen to present music notation to performers. Three of these compositions, Law of Fives (2015), Polytera II (2016), and Terraformation (2016–17), employ strategies that allow the notation to change during the performance of the work as the product of composer-regulated algorithmic generation and performer interaction. New methodologies, implemented using Cycling74's Max software, facilitate performance of these works by allowing effective control of generation and on-screen display of notation; these include an application called VizScore, which delivers notation and conducts through it in real-time, and a development environment for real-time notation using the Bach extensions and graphical overlays around them. These tools support a concept of cartographic composition, in which a composer maps a range of potential behaviors that are mediated by human or algorithmic systems or some combination of the two. Notational variation in performance relies on computer algorithms that can both generate novel ideas and be subject to formal plans designed by the composer. This requires a broader discussion of the underlying algorithms and control mechanisms in the context of algorithmic art in general. Terraformation, for viola and computer, uses a model of the performer's physical actions to constrain the algorithmic generation of musical material displayed in on-screen notation. The resulting action-based on-screen notation system combines common practice notation with fingerboard tablature, color gradients, and abstract graphics. This hybrid model of dynamic notation puts unconventional demands on the performer; implications of this new performance practice are addressed, including behaviors, challenges, and freedoms of real-time notation.
3

Serrà, Julià Joan. "Identification of versions of the same musical composition by processing audio descriptions." Doctoral thesis, Universitat Pompeu Fabra, 2011. http://hdl.handle.net/10803/22674.

Full text
Abstract:
This work focuses on the automatic identification of musical piece versions (alternate renditions of the same musical composition like cover songs, live recordings, remixes, etc.). In particular, we propose two core approaches for version identification: model-free and model-based ones. Furthermore, we introduce the use of post-processing strategies to improve the identification of versions. For all that we employ nonlinear signal analysis tools and concepts, complex networks, and time series models. Overall, our work brings automatic version identification to an unprecedented stage where high accuracies are achieved and, at the same time, explores promising directions for future research. Although our steps are guided by the nature of the considered signals (music recordings) and the characteristics of the task at hand (version identification), we believe our methodology can be easily transferred to other contexts and domains.
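As a toy illustration of the model-free flavor of version identification (the thesis operates on audio descriptors, not the symbolic pitch classes used here), one can score two melodies under all twelve transpositions with a simple local alignment:

```python
def version_similarity(a, b):
    """Transposition-invariant similarity between two pitch-class sequences."""
    def local_align(x, y, match=1.0, gap=-0.5):
        # Smith-Waterman-style DP: score of the best locally aligned segment.
        best = 0.0
        prev = [0.0] * (len(y) + 1)
        for xi in x:
            cur = [0.0]
            for j, yj in enumerate(y, 1):
                s = match if xi == yj else gap
                cur.append(max(0.0, prev[j - 1] + s, prev[j] + gap, cur[-1] + gap))
                best = max(best, cur[-1])
            prev = cur
        return best

    # Try every circular transposition of b and keep the best score.
    return max(local_align(a, [(p + t) % 12 for p in b]) for t in range(12))

original = [0, 4, 7, 0, 4, 7, 5, 9]   # a C-major figure
cover = [2, 6, 9, 2, 6, 9, 7, 11]     # the same figure transposed up a tone
print(version_similarity(original, cover))  # 8.0: perfect match under transposition
```

Real systems replace the symbolic match/mismatch score with frame-wise audio similarity, but the transposition loop and local alignment capture the core invariances versions must tolerate.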
4

Ostroumov, Ivan Victorovich. "Real time sensors data processing." Thesis, Polit. Challenges of Science Today: XIV International Scientific and Practical Conference of Young Researchers and Students, April 2–3, 2014, Kyiv, 2014, 35 p. http://er.nau.edu.ua/handle/NAU/26582.

Full text
Abstract:
Sensors are among the most important parts of any system, and the aviation industry is a field where millions of sensors are used for different purposes. Another very important task of avionics equipment is transferring data from sensors to processing equipment. Why is it important to transmit data online into MATLAB? Unmanned aerial vehicles are developing rapidly; if data from UAV sensors can be transmitted into MATLAB, it can be processed to obtain the desired information about the UAV, ideally using the cheapest available transfer method. Today almost everyone has a mobile phone, and many phones carry a variety of sensors: a pressure sensor, temperature sensor, gravity sensor, gyroscope, rotation vector sensor, proximity sensor, light sensor, orientation sensor, magnetic field sensor, accelerometer, GPS receiver, and so on. Real-time data from cell phone sensors could therefore serve navigation tasks. In this work a Samsung Galaxy SIII is used, with all of the sensors listed above except the temperature sensor. Many programs exist for reading and displaying sensor data, such as "Sensor Kinetics", "Sensors", "Data Recording", and "Android Sensors Viewer"; "Data Recording" was used here. The following methods are available for transmitting data from a cell phone: GPRS (mobile internet), Bluetooth, USB cable, and Wi-Fi. Comparing these methods, GPRS is inconvenient because it must be paid for, Bluetooth has small coverage, and a USB cable lacks the portability of the other options, so Wi-Fi was chosen as the optimal method for transmitting the data.
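A minimal sketch of the chosen setup, assuming a hypothetical CSV wire format of `timestamp,sensor,x,y,z` (sensor-streaming apps each define their own payload layout), receiving phone sensor datagrams over Wi-Fi via UDP might look like:

```python
import socket

def parse_datagram(payload):
    """Parse one CSV datagram of the assumed form b'timestamp,sensor,x,y,z'."""
    ts, name, *values = payload.decode("ascii").strip().split(",")
    return float(ts), name, [float(v) for v in values]

def receive_sensor_stream(port=5555, max_packets=10):
    """Collect readings sent to this host over Wi-Fi as UDP datagrams."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))              # listen on all interfaces
    readings = []
    for _ in range(max_packets):
        payload, _addr = sock.recvfrom(1024)
        readings.append(parse_datagram(payload))
    sock.close()
    return readings

print(parse_datagram(b"12.50,accelerometer,0.02,-0.11,9.81"))
```

The same parsed tuples could then be forwarded into MATLAB (for instance over a TCP/IP connection) for the navigation processing the abstract describes.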
5

Macias, Filiberto. "Real Time Telemetry Data Processing and Data Display." International Foundation for Telemetering, 1996. http://hdl.handle.net/10150/611405.

Full text
Abstract:
International Telemetering Conference Proceedings / October 28-31, 1996 / Town and Country Hotel and Convention Center, San Diego, California
The Telemetry Data Center (TDC) at White Sands Missile Range (WSMR) is now beginning to modernize its existing telemetry data processing system. Modern networking and interactive graphical displays are now being introduced. This infusion of modern technology will allow the TDC to provide our customers with enhanced data processing and display capability. The intent of this project is to outline this undertaking.
6

White, Allan P., and Richard K. Dean. "Real-Time Test Data Processing System." International Foundation for Telemetering, 1989. http://hdl.handle.net/10150/614650.

Full text
Abstract:
International Telemetering Conference Proceedings / October 30-November 02, 1989 / Town & Country Hotel & Convention Center, San Diego, California
The U.S. Army Aviation Development Test Activity at Fort Rucker, Alabama needed a real-time test data collection and processing capability for helicopter flight testing. The system had to be capable of collecting and processing both FM and PCM data streams from analog tape and/or a telemetry receiver. The hardware and software were to be off the shelf whenever possible. The integration was to result in a stand-alone telemetry collection and processing system.
7

Şentürk, Sertan. "Computational analysis of audio recordings and music scores for the description and discovery of Ottoman-Turkish Makam music." Doctoral thesis, Universitat Pompeu Fabra, 2017. http://hdl.handle.net/10803/402102.

Full text
Abstract:
This thesis addresses several shortcomings of the current state-of-the-art methodologies in music information retrieval (MIR). In particular, it proposes several computational approaches to automatically analyze and describe music scores and audio recordings of Ottoman-Turkish makam music (OTMM). The main contributions of the thesis are the music corpus that has been created to carry out the research and the audio-score alignment methodology developed for the analysis of the corpus. In addition, several novel computational analysis methodologies are presented in the context of common MIR tasks of relevance for OTMM. Some example tasks are predominant melody extraction, tonic identification, tempo estimation, makam recognition, tuning analysis, structural analysis and melodic progression analysis. These methodologies become a part of a complete system called Dunya-makam for the exploration of large corpora of OTMM. The thesis starts by presenting the created CompMusic Ottoman-Turkish makam music corpus. The corpus includes 2200 music scores, more than 6500 audio recordings, and accompanying metadata. The data has been collected, annotated and curated with the help of music experts. Using criteria such as completeness, coverage and quality, we validate the corpus and show its research potential. In fact, our corpus is the largest and most representative resource of OTMM that can be used for computational research. Several test datasets have also been created from the corpus to develop and evaluate the specific methodologies proposed for different computational tasks addressed in the thesis. The part focusing on the analysis of music scores is centered on phrase and section level structural analysis. Phrase boundaries are automatically identified using an existing state-of-the-art segmentation methodology. Section boundaries are extracted using heuristics specific to the formatting of the music scores.
Subsequently, a novel method based on graph analysis is used to establish similarities across these structural elements in terms of melody and lyrics, and to label the relations semiotically. The audio analysis section of the thesis reviews the state-of-the-art for analysing the melodic aspects of performances of OTMM. It proposes adaptations of existing predominant melody extraction methods tailored to OTMM. It also presents improvements over pitch-distribution-based tonic identification and makam recognition methodologies. The audio-score alignment methodology is the core of the thesis. It addresses the culture-specific challenges posed by the musical characteristics, music theory related representations and oral praxis of OTMM. Based on several techniques such as subsequence dynamic time warping, Hough transform and variable-length Markov models, the audio-score alignment methodology is designed to handle the structural differences between music scores and audio recordings. The method is robust to the presence of non-notated melodic expressions, tempo deviations within the music performances, and differences in tonic and tuning. The methodology utilizes the outputs of the score and audio analysis, and links the audio and the symbolic data. In addition, the alignment methodology is used to obtain score-informed description of audio recordings. The score-informed audio analysis not only simplifies the audio feature extraction steps that would require sophisticated audio processing approaches, but also substantially improves the performance compared with results obtained from the state-of-the-art methods solely relying on audio data. The analysis methodologies presented in the thesis are applied to the CompMusic Ottoman-Turkish makam music corpus and integrated into a web application aimed at culture-aware music discovery. Some of the methodologies have already been applied to other music traditions such as Hindustani, Carnatic and Greek music.
Following open research best practices, all the created data, software tools and analysis results are openly available. The methodologies, the tools and the corpus itself provide vast opportunities for future research in many fields such as music information retrieval, computational musicology and music education.
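One of the techniques the abstract names, subsequence dynamic time warping, finds the best match of a short query (such as a score section) inside a longer target (such as a performance). A minimal symbolic sketch, assuming simple absolute-difference costs rather than the audio features used in the thesis:

```python
def subsequence_dtw(query, target):
    """Locate the best match of `query` inside `target` with subsequence DTW."""
    INF = float("inf")
    n, m = len(query), len(target)
    # D[i][j]: cost of aligning query[:i] ending at target position j.
    # Row 0 is all zeros, so the match may start anywhere in the target.
    D = [[0.0] * (m + 1)] + [[INF] * (m + 1) for _ in range(n)]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(query[i - 1] - target[j - 1])
            D[i][j] = cost + min(D[i - 1][j - 1], D[i - 1][j], D[i][j - 1])
    # The match may also end anywhere: pick the cheapest end column.
    end = min(range(1, m + 1), key=lambda j: D[n][j])
    return D[n][end], end   # (alignment cost, 1-based end position in target)

score = [60, 62, 64, 65]                     # query: a short score phrase
performance = [55, 55, 60, 62, 64, 65, 70]   # target: a longer performance
print(subsequence_dtw(score, performance))   # (0.0, 6): exact match ending at position 6
```

The free start and end (zero-cost first row, minimum over the last row) are what distinguish subsequence DTW from full DTW and let a score fragment be located inside a structurally different performance.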
8

Dowling, Jason, John Welling, Loral Aerosys, Kathy Nanzetta, Toby Bennett, and Jeff Shi. "ACCELERATING REAL-TIME SPACE DATA PACKET PROCESSING." International Foundation for Telemetering, 1995. http://hdl.handle.net/10150/608429.

Full text
Abstract:
International Telemetering Conference Proceedings / October 30-November 02, 1995 / Riviera Hotel, Las Vegas, Nevada
NASA’s use of high bandwidth packetized Consultative Committee for Space Data Systems (CCSDS) telemetry in future missions presents a great challenge to ground data system developers. These missions, including the Earth Observing System (EOS), call for high data rate interfaces and small packet sizes. Because each packet requires a similar amount of protocol processing, high data rates and small packet sizes dramatically increase the real-time workload on ground packet processing systems. NASA’s Goddard Space Flight Center has been developing packet processing subsystems for more than twelve years. Implementations of these subsystems have ranged from mini-computers to single-card VLSI multiprocessor subsystems. The latter subsystem, known as the VLSI Packet Processor, was first deployed in 1991 for use in support of the Solar Anomalous & Magnetospheric Particle Explorer (SAMPEX) mission. An upgraded version of this VMEBus card, first deployed for Space Station flight hardware verification, has demonstrated sustained throughput of up to 50 Megabits per second and 15,000 packets per second. Future space missions including EOS will require significantly higher data and packet rate performance. A new approach to packet processing is under development that will not only increase performance levels by at least a factor of six but also reduce subsystem replication costs by a factor of five. This paper will discuss the development of a next generation packet processing subsystem and the architectural changes necessary to achieve a thirty-fold improvement in the performance/price of real-time packet processing.
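For context on the fixed per-packet work the abstract describes, each CCSDS space packet carries a 6-byte primary header that must be decoded regardless of packet size; a sketch of that decoding (field layout per the CCSDS space packet format) is:

```python
import struct

def parse_ccsds_primary_header(packet):
    """Decode the 6-byte CCSDS space packet primary header."""
    w0, w1, w2 = struct.unpack(">HHH", packet[:6])   # three big-endian 16-bit words
    return {
        "version": w0 >> 13,              # 3 bits
        "type": (w0 >> 12) & 0x1,         # 1 bit: telemetry/telecommand
        "sec_hdr_flag": (w0 >> 11) & 0x1, # 1 bit: secondary header present
        "apid": w0 & 0x7FF,               # 11-bit application process ID
        "seq_flags": w1 >> 14,            # 2 bits
        "seq_count": w1 & 0x3FFF,         # 14-bit sequence counter
        "data_length": w2 + 1,            # field stores (data length - 1)
    }

hdr = parse_ccsds_primary_header(bytes([0x08, 0x64, 0xC0, 0x05, 0x00, 0x0F]))
print(hdr["apid"], hdr["seq_count"], hdr["data_length"])  # 100 5 16
```

Because this header work is constant per packet, halving the packet size roughly doubles the protocol-processing load at a given data rate, which is exactly the pressure on ground systems the paper addresses.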
9

Fujinaga, Ichiro. "Optical music recognition using projections." Thesis, McGill University, 1988. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=61870.

Full text
10

Dahan, Michael. "RTDAP: REAL-TIME DATA ACQUISITION, PROCESSING AND DISPLAY SYSTEM." International Foundation for Telemetering, 1989. http://hdl.handle.net/10150/614848.

Full text
Abstract:
International Telemetering Conference Proceedings / October 30-November 02, 1989 / Town & Country Hotel & Convention Center, San Diego, California
This paper describes a data acquisition, processing and display system which is suitable for various telemetry applications. The system can be connected either to a PCM encoder or to a telemetry decommutator through a built-in interface and can directly address any channel from the PCM stream for processing. Its compact size and simplicity allow it to be used on the flight line as a test console, in mobile stations as the main data processing system, or on board civil test aircraft for in-flight monitoring and data processing.
11

Swientek, Martin. "High-performance near-time processing of bulk data." Thesis, University of Plymouth, 2015. http://hdl.handle.net/10026.1/3461.

Full text
Abstract:
Enterprise systems like customer-billing systems or financial transaction systems are required to process large volumes of data in a fixed period of time. Such systems are increasingly required to also provide near-time processing of data to support new service offerings. Common systems for data processing are optimized either for high maximum throughput or for low latency. This thesis proposes the concept of an adaptive middleware, a new approach for designing systems for bulk data processing. The adaptive middleware is able to adapt its processing type fluently between batch processing and single-event processing. By using message aggregation, message routing and a closed feedback loop to adjust the data granularity at runtime, the system is able to minimize the end-to-end latency for different load scenarios. The relationship between end-to-end latency and throughput of batch and message-based systems is formally analyzed, and a performance evaluation of both processing types has been conducted. Additionally, the impact of message aggregation on throughput and latency is investigated. The proposed middleware concept has been implemented in a research prototype and evaluated. The results of the evaluation show that the concept is viable and is able to optimize the end-to-end latency of a system. The design, implementation and operation of an adaptive system for bulk data processing differ from common approaches to implementing enterprise systems. A conceptual framework has been developed to guide the process of building adaptive software for bulk data processing. It defines the needed roles and their skills, the necessary tasks and their relationships, the artifacts that are created and required by different tasks, the tools needed to perform the tasks, and the processes that describe the order of tasks.
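A minimal sketch of the closed feedback loop described above, assuming a simple halve/double adjustment policy (the thesis's middleware is considerably more elaborate): batch size grows toward batch processing while latency stays under target, and shrinks toward single-event processing when latency exceeds it.

```python
class AdaptiveAggregator:
    """Adjust message batch size at runtime from a latency feedback signal."""

    def __init__(self, target_latency, min_size=1, max_size=1024):
        self.target = target_latency
        self.min_size, self.max_size = min_size, max_size
        self.batch_size = min_size        # start at single-event processing

    def update(self, observed_latency):
        """Feed back one end-to-end latency measurement; return the new batch size."""
        if observed_latency > self.target:
            # Too slow: move toward single-event processing.
            self.batch_size = max(self.min_size, self.batch_size // 2)
        else:
            # Latency headroom: aggregate more for throughput.
            self.batch_size = min(self.max_size, self.batch_size * 2)
        return self.batch_size

agg = AdaptiveAggregator(target_latency=0.1)
print([agg.update(lat) for lat in (0.02, 0.02, 0.02, 0.5)])  # [2, 4, 8, 4]
```

The multiplicative-increase/multiplicative-decrease policy here is only illustrative; the key point is that data granularity becomes a runtime control variable rather than a design-time choice.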
12

Silva, Jesús, Palma Hugo Hernández, Núñez William Niebles, David Ovallos-Gazabon, and Noel Varela. "Parallel Algorithm for Reduction of Data Processing Time in Big Data." Institute of Physics Publishing, 2020. http://hdl.handle.net/10757/652134.

Full text
Abstract:
Technological advances have made it possible to collect and store large volumes of data over the years. It is also important that today's applications perform well and can analyze these large datasets effectively. It remains a challenge for data mining to keep its algorithms and applications efficient as data size and dimensionality increase [1]. To achieve this goal, many applications rely on parallelism, which reduces the cost associated with execution time by taking advantage of current computer architectures to run several processes concurrently [2]. This paper proposes a parallel version of the FuzzyPred algorithm based on the amount of data that can be processed within each of the processing threads, synchronously and independently.
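A generic sketch of the chunk-per-thread strategy the abstract describes, using a trivial counting predicate as a stand-in for FuzzyPred's fuzzy predicate evaluation: the dataset is partitioned by the number of processing threads, and each thread evaluates its share independently.

```python
from concurrent.futures import ThreadPoolExecutor

def evaluate_chunk(chunk, predicate):
    # Each worker scores its slice of records independently.
    return sum(1 for record in chunk if predicate(record))

def parallel_count(data, predicate, workers=4):
    """Split `data` into per-thread chunks and evaluate them concurrently."""
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Per-chunk results combine with a simple sum, so no locking is needed.
        return sum(pool.map(lambda c: evaluate_chunk(c, predicate), chunks))

print(parallel_count(list(range(100)), lambda x: x % 2 == 0))  # 50
```

Because the chunks share no state, the same partitioning scheme scales from threads to processes or distributed workers without changing the combining step.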
13

Nordström, Jesper. "Real time digital signal processing using Matlab." Thesis, Uppsala universitet, Signaler och System, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-332075.

Full text
Abstract:
Increased usage of electronic devices and the fast development of microprocessors have increased the use of digital filters over analog filters. Digital filters offer great benefits: they are inexpensive, they can be reprogrammed easily, and they open up a whole new range of possibilities for the Internet of Things. This thesis describes the development of a program that samples music from the computer's microphone input, filters it with user-built filters, and reconstructs the music at the computer's headphone output so that it can be played through the speakers, all in real time. The program is developed for students at the department of "Signals and Systems" and is intended as an educational tool for making sense of signals and filtering. The program works well and filters the sound with satisfying results, and it is easy to create filters and filter the signal. Since the filtered material is music, constructing perfect filters with minimum ripple and minimum or linear phase is quite difficult. The program could be improved with a better user interface, a more interactive environment, and easier construction of good filters. Some improvements could also be made to the implementation; as of now the program may run a bit slowly on startup on slower computers.
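As a minimal stand-in for the filtering step such a program performs (sketched in Python rather than the thesis's MATLAB), a direct-form FIR filter can be applied sample by sample as audio arrives:

```python
from collections import deque

def fir_filter(samples, coeffs):
    """Apply an FIR filter y[n] = sum_k coeffs[k] * x[n-k] to a sample stream."""
    # Delay line holding x[n], x[n-1], ... — bounded, so it suits streaming input.
    state = deque([0.0] * len(coeffs), maxlen=len(coeffs))
    out = []
    for x in samples:
        state.appendleft(x)
        out.append(sum(c * s for c, s in zip(coeffs, state)))
    return out

# A two-tap averager smooths a spike in the sample stream:
print(fir_filter([1.0, 1.0, 4.0, 1.0], [0.5, 0.5]))  # [0.5, 1.0, 2.5, 2.5]
```

In a real-time setting the same loop runs per audio block from the microphone, with the delay line carried over between blocks; swapping `coeffs` is all it takes to "reprogram" the filter, which is the flexibility the abstract highlights.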
APA, Harvard, Vancouver, ISO, and other styles
14

Liu, Guangtian. "An event service architecture in distributed real-time systems." 1999. Digital version accessible at http://wwwlib.umi.com/cr/utexas/main.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Guttman, Michael. "Sampled-data IIR filtering via time-mode signal processing." Thesis, McGill University, 2010. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=86770.

Full text
Abstract:
In this work, the design of sampled-data infinite impulse response filters based on time-mode signal processing circuits is presented. Time-mode signal processing (TMSP), defined as the processing of sampled analog information using time-difference variables, has become one of the more popular emerging technologies in circuit design. As TMSP is still relatively new, much development is needed to extend the technology into a general signal-processing tool. In this work, a set of general building blocks is introduced that performs the most basic mathematical operations in the time mode. By arranging these basic structures, higher-order time-mode systems, specifically time-mode filters, are realized. Three second-order time-mode filters (low-pass, band-reject, high-pass) are modeled using MATLAB and simulated in Spectre to verify the design methodology. Finally, a damped integrator and a second-order low-pass time-mode IIR filter are both implemented using discrete components.
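For comparison with the time-mode circuits, the same filter class can be sketched as a conventional discrete-time second-order (biquad) low-pass IIR. The coefficient formulas below are the widely used audio-EQ cookbook ones, not anything taken from the thesis:

```python
import math

def lowpass_biquad(fc, fs, q=0.7071):
    """Second-order low-pass IIR coefficients (audio-EQ cookbook),
    normalized so a[0] == 1."""
    w = 2 * math.pi * fc / fs
    alpha = math.sin(w) / (2 * q)
    cw = math.cos(w)
    a0 = 1 + alpha
    b = [(1 - cw) / 2 / a0, (1 - cw) / a0, (1 - cw) / 2 / a0]
    a = [1.0, -2 * cw / a0, (1 - alpha) / a0]
    return b, a

def iir_filter(x, b, a):
    """Direct-form-I difference equation:
    y[n] = b0*x[n] + b1*x[n-1] + b2*x[n-2] - a1*y[n-1] - a2*y[n-2]."""
    y = []
    for n, xn in enumerate(x):
        x1 = x[n - 1] if n >= 1 else 0.0
        x2 = x[n - 2] if n >= 2 else 0.0
        y1 = y[n - 1] if n >= 1 else 0.0
        y2 = y[n - 2] if n >= 2 else 0.0
        y.append(b[0] * xn + b[1] * x1 + b[2] * x2 - a[1] * y1 - a[2] * y2)
    return y

b, a = lowpass_biquad(1000.0, 32000.0)
step = iir_filter([1.0] * 500, b, a)   # step response settles to unity (DC gain 1)
```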
APA, Harvard, Vancouver, ISO, and other styles
16

Dreibelbis, Harold N., Dennis Kelsch, and Larry James. "REAL-TIME TELEMETRY DATA PROCESSING and LARGE SCALE PROCESSORS." International Foundation for Telemetering, 1991. http://hdl.handle.net/10150/612912.

Full text
Abstract:
International Telemetering Conference Proceedings / November 04-07, 1991 / Riviera Hotel and Convention Center, Las Vegas, Nevada
Real-time data processing of telemetry data has evolved from a highly centralized single large scale computer system to multiple mini-computers or super mini-computers tied together in a loosely coupled distributed network. Each mini-computer or super mini-computer essentially performing a single function in the real-time processing sequence of events. The reasons in the past for this evolution are many and varied. This paper will review some of the more significant factors in that evolution and will present some alternatives to a fully distributed mini-computer network that appear to offer significant real-time data processing advantages.
APA, Harvard, Vancouver, ISO, and other styles
17

Feather, Bob, and Michael O’Brien. "OPEN ARCHITECTURE SYSTEM FOR REAL TIME TELEMETRY DATA PROCESSING." International Foundation for Telemetering, 1991. http://hdl.handle.net/10150/612934.

Full text
Abstract:
International Telemetering Conference Proceedings / November 04-07, 1991 / Riviera Hotel and Convention Center, Las Vegas, Nevada
There have been many recent technological advances in small computers, graphics stations, and system networks. This has made it possible to build highly advanced distributed processing systems for telemetry data acquisition and processing. Presently there is a plethora of vendors marketing powerful new network workstation hardware and software products. Computer vendors are rapidly developing new products as new technology continues to emerge. It is becoming difficult to procure and install a new computer system before it has been made obsolete by a competitor or even the same vendor. If one purchases the best hardware and software products individually, the system can end up being composed of incompatible components from different vendors that do not operate as one integrated homogeneous system. If one uses only hardware and software from one vendor in order to simplify system integration, the system will be limited to only those products that the vendor chooses to develop. To truly take advantage of the rapidly advancing computer technology, today's telemetry systems should be designed for an open systems environment. This paper defines an optimum open architecture system designed around industry-wide standards for both hardware and software. This will allow different vendors' computers to operate in the same distributed networked system, and will allow software to be portable to the various computers and workstations in the system while maintaining the same user interface. The open architecture system allows for new products to be added as they become available to increase system performance and capability in a truly heterogeneous system environment.
APA, Harvard, Vancouver, ISO, and other styles
18

Dahan, Michael. "RTDAP: Real-Time Data Acquisition, Processing and Display System." International Foundation for Telemetering, 1989. http://hdl.handle.net/10150/614629.

Full text
Abstract:
International Telemetering Conference Proceedings / October 30-November 02, 1989 / Town & Country Hotel & Convention Center, San Diego, California
This paper describes a data acquisition, processing and display system which is suitable for various telemetry applications. The system can be connected either to a PCM encoder or to a telemetry decommutator through a built-in interface and can directly address any channel from the PCM stream for processing. Its compact size and simplicity allow it to be used on the flight line as a test console, in mobile stations as the main data processing system, or on board civil test aircraft for in-flight monitoring and data processing.
APA, Harvard, Vancouver, ISO, and other styles
19

Crolene, Robert. "SYSTEMS AND METHODS TO REDUCE DATA PROCESSING TURNAROUND TIME." International Foundation for Telemetering, 1986. http://hdl.handle.net/10150/615397.

Full text
Abstract:
International Telemetering Conference Proceedings / October 13-16, 1986 / Riviera Hotel, Las Vegas, Nevada
Weapon system complexity and its data expression have become a central issue for the Range Directorate at the Pacific Missile Test Center (PMTC). Increasing data complexity and data product turnaround requirements have created a technological push-pull on traditional data processing methods. Several possible responses are discussed which include distributed front and back end processing relative to the large mainframes, and increasing use of artificial intelligence techniques in the data reduction area. These methods are going through progressive steps of implementation at PMTC with some notable success.
APA, Harvard, Vancouver, ISO, and other styles
20

Mousavi, Bamdad. "Scalable Stream Processing and Management for Time Series Data." Thesis, Université d'Ottawa / University of Ottawa, 2021. http://hdl.handle.net/10393/42295.

Full text
Abstract:
There has been an enormous growth in the generation of time series data in the past decade. This trend is caused by widespread adoption of IoT technologies, the data generated by monitoring of cloud computing resources, and cyber physical systems. Although time series data have been a topic of discussion in the domain of data management for several decades, this recent growth has brought the topic to the forefront. Many of the time series management systems available today lack the necessary features to successfully manage and process the sheer amount of time series being generated. In this thesis we strive to examine the field and study the prior work in time series management. We then propose a large system capable of handling time series management end to end, from generation to consumption by the end user. Our system is composed of open-source data processing frameworks. Our system has the capability to collect time series data, perform stream processing over it, store it for immediate and future processing and create necessary visualizations. We present the implementation of the system and perform experiments to show its scalability to handle growing pipelines of incoming data from various sources.
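The core stream-processing step such a pipeline performs over time series, windowed aggregation, can be sketched independently of the open-source frameworks the thesis assembles:

```python
def windowed_mean(stream, window_s):
    """Tumbling-window mean over (timestamp, value) pairs: a basic
    stream-processing primitive for time series. Emits one
    (window_start, mean) pair per window."""
    out = []
    bucket, start = [], None
    for ts, val in stream:
        if start is None:
            start = ts
        if ts - start >= window_s:
            out.append((start, sum(bucket) / len(bucket)))
            bucket, start = [], ts
        bucket.append(val)
    if bucket:
        out.append((start, sum(bucket) / len(bucket)))
    return out
```

A real stream processor would additionally handle out-of-order arrivals and unbounded input; this sketch assumes an in-order, finite stream.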
APA, Harvard, Vancouver, ISO, and other styles
21

See, Chong Meng Samson. "Space-time processing for wireless mobile communications." Thesis, Loughborough University, 1999. https://dspace.lboro.ac.uk/2134/25284.

Full text
Abstract:
Intersymbol interference (ISI) and co-channel interference (CCI) are two major obstacles to high speed data transmission in wireless cellular communications systems. Unlike thermal noise, their effects cannot be removed by increasing the signal power, and they are time-varying due to the relative motion between the transmitters and receivers. Space-time processing offers a signal processing framework to optimally integrate the spatial and temporal properties of the signal for maximal signal reception and, at the same time, mitigate the ISI and CCI impairments. In this thesis, we focus on the development of this emerging technology to combat the undesirable effects of ISI and CCI. We first develop a convenient mathematical model to parameterize the space-time multipath channel based on signal path power, directions and times of arrival. Starting from the continuous time domain, we derive compact expressions of the vector space-time channel model that lead to the notion of the block space-time manifold. Under certain identifiability conditions, the noiseless vector-channel outputs will lie on a subspace constructed from a set of bases belonging to the block space-time manifold. This is an important observation, as many high resolution array processing algorithms can be applied directly to estimate the multipath channel parameters. Next we focus on the development of semi-blind channel identification and equalization algorithms for fast time-varying multipath channels. Specifically, we develop space-time processing algorithms for wireless TDMA networks that use short burst data formats with extremely short training data sequences. Due to the latter, the estimated channel parameters are extremely unreliable for equalization with conventional adaptive methods.
We approach the channel acquisition, tracking and equalization problems jointly, and exploit the richness of the inherent structural relationship between the channel parameters and the data sequence by repeated use of available data through a forward-backward optimization procedure. This enables fuller exploitation of the available data. Our simulation studies show that significant performance gains are achieved over conventional methods. In the final part of this thesis, we address the problem of identifying and equalizing multipath communication channels in the presence of strong CCI. By considering CCI as stochastic processes, we find that temporal diversity can be gained by observing the channel outputs from a tapped delay line. Together with the assertion that the finite alphabet property of the information sequences can offer additional information about the channel parameters and the noise-plus-interference covariance matrix, we develop a spatial-temporal algorithm, iterative reweighting alternating minimization, to estimate the channel parameters and information sequence in a weighted least squares framework. The proposed algorithm is robust, as it does not require knowledge of the number of CCI sources or their structural information. Simulation studies demonstrate its efficacy over many reported methods.
APA, Harvard, Vancouver, ISO, and other styles
22

Nyström, Simon, and Joakim Lönnegren. "Processing data sources with big data frameworks." Thesis, KTH, Data- och elektroteknik, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-188204.

Full text
Abstract:
Big data is a concept that is expanding rapidly. As more and more data is generated and garnered, there is an increasing need for efficient solutions that can process all this data in attempts to gain value from it. The purpose of this thesis is to find an efficient way to quickly process a large number of relatively small files. More specifically, the purpose is to test two frameworks that can be used for processing big data. The frameworks that are tested against each other are Apache NiFi and Apache Storm. A method is devised in order to, firstly, construct a data flow and, secondly, construct a method for testing the performance and scalability of the frameworks running this data flow. The results reveal that Apache Storm is faster than Apache NiFi at the sort of task that was tested. As the number of nodes included in the tests went up, performance did not always follow. This indicates that adding more nodes to a big data processing pipeline does not always result in a better performing setup and that, sometimes, other measures must be taken to improve performance.
APA, Harvard, Vancouver, ISO, and other styles
23

Smit, Konrad van Zyl. "Applying the phi ratio in designing a musical scale." Thesis, Stellenbosch : University of Stellenbosch, 2005. http://hdl.handle.net/10019.1/2989.

Full text
Abstract:
Thesis (MMus (Music))--University of Stellenbosch, 2005.
In this thesis, an attempt is made to create an aesthetically pleasing musical scale based on the ratio of phi. Precedents for the application of phi in aesthetic fields exist; noteworthy is Le Corbusier's architectural works, the measurements of which are based on phi. A brief discussion of the unique mathematical properties of phi is given, followed by a discussion of the manifestations of phi in the physical ratios as they appear in animal and plant life. Specific scales which have found an application in art music are discussed, and the properties to which their success is attributable are identified. Consequently, during the design of the phi scale, these characteristics are incorporated. The design of the phi scale is facilitated by the use of the most sophisticated modern computer software in the field of psychoacoustics. During the scale's design process, particular emphasis is placed on the requirement of obtaining maximal sensory consonance. For this reason, an in-depth discussion of the theories regarding consonance perception is undertaken. During this discussion, the reader's attention is drawn to the difference between musical and perceptual consonance, and a discussion of the developmental history of musical consonance is given. Lastly, the scale is tested to see whether it complies with the requirements for successful scales.
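One simple construction consistent with the idea above, though not necessarily the scale the thesis arrives at, divides the interval with frequency ratio phi into equal geometric steps, by analogy with 12-tone equal temperament's equal division of the 2:1 octave. Both the base frequency and the step count here are illustrative:

```python
PHI = (1 + 5 ** 0.5) / 2   # the golden ratio, ~1.6180

def phi_scale(base_freq, steps):
    """Frequencies spanning one phi interval: `steps` equal
    (geometric) divisions of the ratio phi, analogous to 12-TET's
    equal division of the octave."""
    ratio = PHI ** (1.0 / steps)
    return [base_freq * ratio ** k for k in range(steps + 1)]

scale = phi_scale(440.0, 9)   # 9 steps from 440 Hz up to 440 * phi Hz
```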
APA, Harvard, Vancouver, ISO, and other styles
24

Rydman, Oskar. "Data processing of Controlled Source Audio Magnetotelluric (CSAMT) Data." Thesis, Uppsala universitet, Geofysik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-387246.

Full text
Abstract:
During this project three distinct methods to improve the data processing of Controlled Source Audio Magnetotellurics (CSAMT) data are implemented and their advantages and disadvantages are discussed. The methods in question are: detrending the time series in the time domain instead of in the frequency domain; implementation of a coherency test to pinpoint data segments of low quality and remove them from the calculations; and implementation of a method to detect and remove transients from the time series to reduce background noise in the frequency spectra. Both the detrending in the time domain and the transient removal show potential to improve data quality, even if the improvements are small (both in the 1-10% range). Due to technical limitations no coherency test was implemented. Overall, the processes discussed in the report did improve the data quality and may serve as groundwork for further improvements to come.
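The first method, time-domain detrending, amounts to subtracting a least-squares line from the series. A minimal sketch (the thesis's actual processing chain is not reproduced here):

```python
def detrend(samples):
    """Remove the best-fit (least-squares) straight line from a
    time series, working directly in the time domain."""
    n = len(samples)
    xs = range(n)
    mean_x = (n - 1) / 2.0
    mean_y = sum(samples) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, samples))
    slope = sxy / sxx
    # subtract the fitted line, leaving the residual fluctuations
    return [y - (mean_y + slope * (x - mean_x)) for x, y in zip(xs, samples)]
```

A pure linear trend detrends to (numerically) zero; real data keeps its fluctuations with the drift removed.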
APA, Harvard, Vancouver, ISO, and other styles
25

田慶豐 and Hing-fung Ting. "Examples of time-message tradeoffs in distributed algorithms." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1988. http://hub.hku.hk/bib/B31208393.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Narayanan, Shruthi (Shruthi P. ). "Real-time processing and visualization of intensive care unit data." Thesis, Massachusetts Institute of Technology, 2017. http://hdl.handle.net/1721.1/119537.

Full text
Abstract:
Thesis: M. Eng., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2017.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (page 83).
Intensive care unit (ICU) patients undergo detailed monitoring so that copious information regarding their condition is available to support clinical decision-making. Full utilization of the data depends heavily on its quantity, quality and manner of presentation to the physician at the bedside of a patient. In this thesis, we implemented a visualization system to aid ICU clinicians in collecting, processing, and displaying available ICU data. Our goals for the system are: to be able to receive large quantities of patient data from various sources, to compute complex functions over the data that are able to quantify an ICU patient's condition, to plot the data using a clean and interactive interface, and to be capable of live plot updates upon receiving new data. We made significant headway toward our goals, and we succeeded in creating a highly adaptable visualization system that future developers and users will be able to customize.
by Shruthi Narayanan.
M. Eng.
APA, Harvard, Vancouver, ISO, and other styles
27

Chun, Yang, Yang Hongling, and Zhou Jie. "STUDY ON HIGH-RATE TELEMETRY DATA REAL-TIME PROCESSING TECHNIQUES." International Foundation for Telemetering, 2000. http://hdl.handle.net/10150/608251.

Full text
Abstract:
International Telemetering Conference Proceedings / October 23-26, 2000 / Town & Country Hotel and Conference Center, San Diego, California
Owing to the rapid development of the PC industry, the personal computer has improved remarkably in reliability and speed and has been applied to many fields, such as aerospace, satellite and telemetry applications. As is well known, two factors decide how fast PC-based data acquisition can be. One factor is CPU processing and the other is I/O bandwidth. The first has become increasingly insignificant, because CPU frequencies have exceeded 700 MHz, which fully satisfies the needs of high-rate data processing. I/O bandwidth is therefore the key factor in high-rate PC-based data acquisition, and efficient data buffering techniques must be adopted to satisfy the demands of telemetry data entry. This paper presents a buffered data channel which uses memory mapping, EPLD and dual-port SRAM techniques. The operating platform of this design is Windows 95/98 and the software includes a device driver and real-time processing routines.
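The role of the dual-port SRAM buffer, decoupling a fast telemetry producer from the processing consumer, can be sketched in software as a fixed-size ring buffer. This is an illustration of the buffering idea only; the paper's channel is hardware:

```python
class RingBuffer:
    """Fixed-size FIFO decoupling a fast producer (telemetry input)
    from a slower consumer (processing), as a dual-port memory
    buffer does in hardware."""
    def __init__(self, capacity):
        self.buf = [None] * capacity
        self.head = 0    # next write position
        self.tail = 0    # next read position
        self.count = 0

    def put(self, item):
        if self.count == len(self.buf):
            raise OverflowError("buffer full: consumer too slow")
        self.buf[self.head] = item
        self.head = (self.head + 1) % len(self.buf)
        self.count += 1

    def get(self):
        if self.count == 0:
            raise IndexError("buffer empty")
        item = self.buf[self.tail]
        self.tail = (self.tail + 1) % len(self.buf)
        self.count -= 1
        return item
```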
APA, Harvard, Vancouver, ISO, and other styles
28

Eberius, Julian. "Query-Time Data Integration." Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2015. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-191560.

Full text
Abstract:
Today, data is collected in ever increasing scale and variety, opening up enormous potential for new insights and data-centric products. However, in many cases the volume and heterogeneity of new data sources precludes up-front integration using traditional ETL processes and data warehouses. In some cases, it is even unclear if and in what context the collected data will be utilized. Therefore, there is a need for agile methods that defer the effort of integration until the usage context is established. This thesis introduces Query-Time Data Integration as an alternative concept to traditional up-front integration. It aims at enabling users to issue ad-hoc queries on their own data as if all potential other data sources were already integrated, without declaring specific sources and mappings to use. Automated data search and integration methods are then coupled directly with query processing on the available data. The ambiguity and uncertainty introduced through fully automated retrieval and mapping methods is compensated by answering those queries with ranked lists of alternative results. Each result is then based on different data sources or query interpretations, allowing users to pick the result most suitable to their information need. To this end, this thesis makes three main contributions. Firstly, we introduce a novel method for Top-k Entity Augmentation, which is able to construct a top-k list of consistent integration results from a large corpus of heterogeneous data sources. It improves on the state-of-the-art by producing a set of individually consistent but mutually diverse alternative solutions, while minimizing the number of data sources used. Secondly, based on this novel augmentation method, we introduce the DrillBeyond system, which is able to process Open World SQL queries, i.e., queries referencing arbitrary attributes not defined in the queried database.
The original database is then augmented at query time with Web data sources providing those attributes. Its hybrid augmentation/relational query processing enables the use of ad-hoc data search and integration in data analysis queries, and improves both performance and quality when compared to using separate systems for the two tasks. Finally, we studied the management of large-scale dataset corpora such as data lakes or Open Data platforms, which are used as data sources for our augmentation methods. We introduce Publish-time Data Integration as a new technique for data curation systems managing such corpora, which aims at improving the individual reusability of datasets without requiring up-front global integration. This is achieved by automatically generating metadata and format recommendations, allowing publishers to enhance their datasets with minimal effort. Collectively, these three contributions are the foundation of a Query-time Data Integration architecture, that enables ad-hoc data search and integration queries over large heterogeneous dataset collections.
APA, Harvard, Vancouver, ISO, and other styles
29

Bañuelos, Saucedo Miguel Angel. "Signal and data processing for THz imaging." Thesis, University of Manchester, 2014. https://www.research.manchester.ac.uk/portal/en/theses/signal-and-data-processing-for-thz-imaging(58a646f3-033b-4771-b1dc-d1f9fc6dfbf0).html.

Full text
Abstract:
This thesis presents the research made on signal and data processing for THz imaging, with emphasis on noise analysis and tomography in amplitude contrast using a THz time-domain spectrometry system. A THz computerized tomography system was built, tested and characterized. The system is controlled from a personal computer using a program developed ad hoc. Detail is given on the operating principles of the system's numerous optical and THz components, the design of a computer-based fast lock-in amplifier, the proposal of a local apodization method for reducing spurious oscillations in a THz spectrum, and the use of a parabolic interpolation of integrated signals as a method for estimating THz pulse delay. It is shown that our system can achieve a signal-to-noise ratio of 60 dB in spectrometry tests and 47 dB in tomography tests. Styrofoam phantoms of different shapes and up to 50x60 mm in size are used for analysis. Tomographic images are reconstructed at different frequencies from 0.2 THz to 2.5 THz, showing that volume scattering and edge contrast increase with wavelength. Evidence is given that refractive losses and surface scattering are responsible for high edge contrast in THz tomography images reconstructed in amplitude contrast. A modified Rayleigh roughness factor is proposed to model surface transmission scattering. It is also shown that volume scattering can be modelled by the material's attenuation coefficient. The use of 4 mm apertures as spatial filters is compared against full beam imaging, and the limitations of the Rayleigh range are also addressed. It was estimated that for some frequencies between 0.5 THz and 1 THz the Rayleigh range is enough for the tested phantoms. Results on the influence of attenuation and scattering at different THz frequencies can be applied to the development of THz CW imaging systems and as a point of departure for the development of more complex scattering models.
APA, Harvard, Vancouver, ISO, and other styles
30

尹翰卿 and Hon-hing Wan. "Efficient real-time scheduling for multimedia data transmission." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2002. http://hub.hku.hk/bib/B31227910.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Chung, Vera Yuk Ying. "Real-time image processing techniques using custom computing." Thesis, Queensland University of Technology, 2000.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
32

Fujinaga, Ichiro. "Adaptive optical music recognition." Thesis, McGill University, 1996. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=42033.

Full text
Abstract:
The basic goal of the Adaptive Optical Music Recognition system presented herein is to create adaptive software for the recognition of musical notation. The focus of this research has been to create a robust framework upon which a practical optical music recognizer can be built.
The strength of this system is its ability to learn new music symbols and handwritten notations. It also continually improves its accuracy in recognizing these objects by adjusting internal parameters. Given the wide range of music notation styles, these are essential characteristics of a music recognizer.
The implementation of the adaptive system is based on exemplar-based incremental learning, analogous to the idea of "learning by example," that identifies unknown objects by their similarity to one or more of the known stored examples. The entire process is based on two simple, yet powerful algorithms: k-nearest neighbour classifier and genetic algorithm. Using these algorithms, the system is designed to increase its accuracy over time as more data are processed.
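The exemplar-based classification step described above rests on the k-nearest-neighbour rule. A minimal sketch with Euclidean distance follows; feature extraction from symbol images and the genetic-algorithm feature weighting are omitted, and the training data here is purely illustrative:

```python
from collections import Counter
import math

def knn_classify(train, query, k=3):
    """Classify a feature vector by majority vote among the k
    nearest stored exemplars (Euclidean distance).
    `train` is a list of (feature_vector, label) pairs."""
    dists = sorted(
        (math.dist(feats, query), label) for feats, label in train
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]
```

Because new exemplars are just appended to `train`, the classifier "learns by example" without retraining, which is the property the abstract highlights.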
APA, Harvard, Vancouver, ISO, and other styles
33

Sandys, Sean David. "Requirement specifications for communication in distributed real-time systems /." Thesis, Connect to this title online; UW restricted, 2002. http://hdl.handle.net/1773/7002.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Craig, David W. (David William). "Light traffic loss of random hard real-time tasks in a network." Dissertation, Carleton University, Electrical Engineering, Ottawa, 1988.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
35

Herschel, Melanie, and Felix Naumann. "Space and time scalability of duplicate detection in graph data." Universität Potsdam, 2008. http://opus.kobv.de/ubp/volltexte/2009/3285/.

Full text
Abstract:
Duplicate detection consists in determining different representations of real-world objects in a database. Recent research has considered the use of relationships among object representations to improve duplicate detection. In the general case where relationships form a graph, research has mainly focused on duplicate detection quality/effectiveness. Scalability has been neglected so far, even though it is crucial for large real-world duplicate detection tasks. In this paper we scale up duplicate detection in graph data (DDG) to large amounts of data and pairwise comparisons, using the support of a relational database system. To this end, we first generalize the process of DDG. We then present how to scale algorithms for DDG in space (amount of data processed with limited main memory) and in time. Finally, we explore how complex similarity computation can be performed efficiently. Experiments on data an order of magnitude larger than data considered so far in DDG clearly show that our methods scale to large amounts of data not residing in main memory.
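A standard way to scale pairwise comparison, which is the cost the paper is concerned with, is blocking: only records sharing a blocking key are paired. This generic sketch illustrates the idea only; it is not the paper's relational-database-backed DDG method:

```python
from itertools import combinations

def blocked_pairs(records, key):
    """Candidate pair generation with blocking: records are grouped
    by a blocking key, and pairs are formed only within groups,
    cutting the number of pairwise comparisons."""
    blocks = {}
    for r in records:
        blocks.setdefault(key(r), []).append(r)
    for group in blocks.values():
        yield from combinations(group, 2)
```

For five records in three blocks, this yields 2 candidate pairs instead of the 10 a full cross product would produce.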
APA, Harvard, Vancouver, ISO, and other styles
36

Ghosh, Kaushik. "Speculative execution in real-time systems." Diss., Georgia Institute of Technology, 1995. http://hdl.handle.net/1853/8174.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Thornlow, Robert Timothy. "Spectrum estimation using extrapolated time series." Thesis, Monterey, California : Naval Postgraduate School, 1990. http://handle.dtic.mil/100.2/ADA246554.

Full text
Abstract:
Thesis (M.S. in Electrical Engineering)--Naval Postgraduate School, December 1990.
Thesis Advisor(s): Hippenstiel, Ralph. Second Reader: Tummala, Murali. Author(s) subject terms: Data Extrapolation, Periodogram, AR spectral estimates. Includes bibliographical references (p. 94). Also available in print.
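The periodogram named in the subject terms is the squared-magnitude DFT of the record divided by its length. A direct (non-FFT) sketch for illustration; the thesis's extrapolation-based estimates are not reproduced here:

```python
import math

def periodogram(x):
    """Power spectrum estimate |X[k]|^2 / N via a direct DFT.
    O(N^2) for clarity; a real implementation would use an FFT."""
    n = len(x)
    power = []
    for k in range(n // 2 + 1):
        re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        power.append((re * re + im * im) / n)
    return power

# a pure tone exactly at bin 5 of a 64-sample record
spectrum = periodogram([math.sin(2 * math.pi * 5 * t / 64) for t in range(64)])
```

The spectrum peaks at bin 5, with power N/4 = 16 for a unit-amplitude sinusoid.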
APA, Harvard, Vancouver, ISO, and other styles
38

Itagaki, Takebumi. "Real-time sound synthesis on a multi-processor platform." Thesis, Durham University, 1998. http://etheses.dur.ac.uk/4890/.

Full text
Abstract:
Real-time sound synthesis means that the calculation and output of each sound sample for a channel of audio information must be completed within a sample period. At the broadcasting-standard sampling rate of 32,000 Hz, the maximum period available is 31.25 μsec. Such requirements demand a large amount of data-processing power. An effective solution to this problem is a multi-processor platform: a parallel and distributed processing system. The suitability of the MIDI (Musical Instrument Digital Interface) standard, published in 1983, as a controller for real-time applications is examined. Many musicians have expressed doubts about the decade-old standard's fitness for real-time performance. These have been investigated by measuring timing in various musical gestures and comparing the results with the subjective characteristics of human perception. The implementation and optimisation of real-time additive synthesis programs on a multi-transputer network are described. A prototype 81-polyphonic-note organ configuration was implemented. By devising and deploying monitoring processes, the network's performance was measured and enhanced, leading to a more efficient usage: the 88-note configuration. Since 88 simultaneous notes are rarely necessary in most performances, a scheduling program for dynamic note allocation was then introduced to achieve further efficiency gains. To reduce calculation redundancy still further, a multi-sampling-rate approach was applied as a step towards optimal performance. The theories underlying sound granulation, as a means of constructing complex sounds from grains, and the real-time implementation of this technique are outlined. The idea of sound granulation is closely related to the quantum-wave notion of "acoustic quanta". Despite its conceptual simplicity, the signal-processing requirements set tough demands, providing a challenge for this audio synthesis engine.
Three issues arising from the results of the implementations above are discussed: the efficiency of the applications implemented, provisions for new processors, and an optimal network architecture for sound synthesis.
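The per-sample time budget cited in the abstract (31.25 μsec at 32 kHz) and the basic shape of additive synthesis can be sketched in a few lines. This is an illustrative Python fragment, not the thesis's transputer implementation; the three-partial tone is invented.

```python
import math

SAMPLE_RATE = 32_000                        # broadcasting-standard rate from the abstract
SAMPLE_PERIOD_US = 1_000_000 / SAMPLE_RATE  # per-sample time budget, in microseconds

def additive_sample(t, partials):
    """One output sample: a sum of sinusoidal partials (freq_hz, amplitude) at time t."""
    return sum(amp * math.sin(2 * math.pi * freq * t)
               for freq, amp in partials)

# An invented three-partial tone; in a real-time system every such sum
# must be computed within the 31.25 microsecond budget.
partials = [(440.0, 1.0), (880.0, 0.5), (1320.0, 0.25)]
sample = additive_sample(0.25, partials)
```

The cost of the sum grows linearly with the number of partials and notes, which is why the thesis turns to a parallel multi-transputer network.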
APA, Harvard, Vancouver, ISO, and other styles
39

Hooman, Jozef. "Specification and compositional verification of real time systems /." Berlin [u.a.] : Springer, 1991. http://www.loc.gov/catdir/enhancements/fy0815/91041783-d.html.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Burger, Joseph. "Real-time engagement area development program (READ-Pro)." Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2002. http://library.nps.navy.mil/uhtbin/hyperion-image/02Jun%5FBurger.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Ting, Hing-fung. "Examples of time-message tradeoffs in distributed algorithms /." [Hong Kong : University of Hong Kong], 1988. http://sunzi.lib.hku.hk/hkuto/record.jsp?B1234980X.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Arshad, Norhashim Mohd. "Real-time data compression for machine vision measurement systems." Thesis, Liverpool John Moores University, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.285284.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Brouse, Andrew. "The interharmonium : an investigation into networked musical applications and brainwaves." Thesis, McGill University, 2001. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=33878.

Full text
Abstract:
This work surveys currently available methods for measuring human brainwaves in order to generate music and technologies for real-time transmission of audio and music over the Internet. The end goal is to produce a performable music system which sends live human brainwaves over the Internet to produce sounding music at another, physically separated location.
APA, Harvard, Vancouver, ISO, and other styles
44

Sidyakin, Ivan Mikhailovich. "Techniques for the development of time-constraint telemetric data processing system." Thesis, De Montfort University, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.438905.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Jun, Zhang, Feng MeiPing, Zhu Yanbo, He Bin, and Zhang Qishan. "A Real-Time Telemetry Data Processing System with Open System Architecture." International Foundation for Telemetering, 1994. http://hdl.handle.net/10150/611667.

Full text
Abstract:
International Telemetering Conference Proceedings / October 17-20, 1994 / Town & Country Hotel and Conference Center, San Diego, California
In the face of multiple data streams, high bit rates, variable data formats, complicated frame structures and changeable application environments, the programmable PCM telemetry system needs a new data processing system with an advanced telemetry system architecture. This paper considers the characteristics of real-time telemetry data processing, analyzes the design of an open system architecture for a real-time telemetry data processing system (TDPS), presents an open-system-architecture scheme and design for a real-time TDPS, gives the structure model of a distributed network system, and develops the interface between the network database and the telemetry database, as well as telemetry processing software with a man-machine interface. Finally, a practical, multi-functional real-time TDPS with an open system architecture has been built, based on the UNIX operating system, supporting the TCP/IP protocol and using the Oracle relational database management system. This scheme and design have already proved efficient for real-time processing, high speed, mass storage and multi-user operation.
APA, Harvard, Vancouver, ISO, and other styles
46

Lloyd, Joseph W. Jr. "POST-FLIGHT DATA DISTRIBUTION SYSTEM." International Foundation for Telemetering, 1993. http://hdl.handle.net/10150/608898.

Full text
Abstract:
International Telemetering Conference Proceedings / October 25-28, 1993 / Riviera Hotel and Convention Center, Las Vegas, Nevada
Desktop processors (IBM PC, PC-compatible, and Macintosh) have made a major impact on how the Naval Air Warfare Center Aircraft Division (NAWCAD), Patuxent River engineering community performs its work in aircraft weapons tests. The personal processors are utilized by the flight-test engineers not only for report preparation, but also for post-flight Engineering Unit (EU) data reduction and analysis. Present-day requirements call for improved post-flight data handling compared with the past, driven by the need to analyze all the vehicle's parameters prior to the succeeding test flight and to generate test reports in a more cost-effective and timely manner. This paper defines the post-flight data distribution system at NAWCAD, Patuxent River, explains how these tasks were handled in the past, and describes the development of a real-time data storage design approach for post-flight data handling. This engineering design is then described, explaining how it sets the precedent for NAWCAD, Patuxent River's future plans and how it provides the flight-test engineer with the test vehicle's EU data immediately after the flight at his desktop processor.
APA, Harvard, Vancouver, ISO, and other styles
47

Zhu, Wenyao. "Time-Series Feature Extraction in Embedded Sensor Processing System." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-281820.

Full text
Abstract:
Embedded sensor-based systems mounted with tens or hundreds of sensors can collect enormous time-series data, while the analysis of those time series is commonly conducted on the remote server side. With the development of microprocessors, there have been increasing demands to move the analysis process to the local embedded systems. In this thesis, the objective is to investigate which time-series feature extraction methods are suitable for embedded sensor processing systems. Starting from this objective, we have explored traditional statistical methods and machine learning approaches to time-series data mining. To narrow down the research scope, the thesis focuses on similarity search methods together with clustering algorithms, viewed from the time-series feature extraction perspective. In the project, we have chosen and implemented two clustering algorithms, K-means and the Self-Organizing Map (SOM), combined with two similarity search methods, the Euclidean distance and Dynamic Time Warping (DTW). The evaluation setup uses four labelled public datasets and the Rand index (RI) to score accuracy. We have tested the accuracy and time consumption of the four combinations of the chosen algorithms on the embedded platform. The results show that SOM with DTW can generally achieve better accuracy, at the cost of a relatively longer inferring time, than the other evaluated methods. Quantitatively, SOM with DTW can cluster one time-series sample of 300 data points into twelve classes in 40 ms on the ESP32 embedded microprocessor, with a 4-percentage-point accuracy advantage in RI score over the fastest combination, K-means with Euclidean distance. We conclude that the SOM-with-DTW algorithm can handle time-series clustering tasks on embedded sensor processing systems if the timing requirements are not too stringent.
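Dynamic Time Warping, one of the two similarity measures evaluated in this thesis, can be written in a few lines. This is the textbook O(n·m) dynamic-programming formulation with an absolute-difference local cost, not the thesis's ESP32 implementation.

```python
def dtw_distance(a, b):
    """Textbook Dynamic Time Warping distance between two numeric
    sequences, using an absolute-difference local cost."""
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the warping path by a match, insertion or deletion
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]
```

For example, `dtw_distance([1, 2, 3], [1, 2, 2, 3])` is 0.0 because the repeated 2 can be warped onto a single element, whereas a plain Euclidean distance is not even defined for sequences of unequal length. The O(n·m) table is also why DTW costs more inference time than Euclidean distance on a microcontroller.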
APA, Harvard, Vancouver, ISO, and other styles
48

Ivan-Roşu, Daniela. "Dynamic resource allocation for adaptive real-time applications." Diss., Georgia Institute of Technology, 1999. http://hdl.handle.net/1853/9200.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Wan, Xiaogeng. "Time series causality analysis and EEG data analysis on music improvisation." Thesis, Imperial College London, 2014. http://hdl.handle.net/10044/1/23956.

Full text
Abstract:
This thesis describes a PhD project on time series causality analysis and its applications. The project is motivated by two EEG measurements from music improvisation experiments, where we aim to use causality measures to construct neural networks that identify the neural differences between improvisation and non-improvisation. The research is based on the mathematical background of time series analysis, information theory and network theory. We first studied a series of popular causality measures, namely Granger causality, partial directed coherence (PDC), the directed transfer function (DTF), transfer entropy (TE), mutual information from mixed embedding (MIME) and partial MIME (PMIME), from which we proposed our new measures: the direct transfer entropy (DTE) and wavelet-based extensions of MIME and PMIME. The new measures improve on the properties and applications of their parent measures, as verified by simulations and examples. Comparing the measures we studied, MIME was found to be the most useful causality measure for our EEG analysis. We therefore used MIME to construct both intra-brain and cross-brain neural networks for musicians and listeners during the music performances. Neural differences were identified in terms of the direction and distribution of neural information flows and the activity of large brain regions. Furthermore, we applied MIME to other EEG and financial data, where reasonable causality results were obtained.
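Of the measures surveyed in this abstract, Granger causality is the simplest to sketch: x "Granger-causes" y if adding lagged x to an autoregression of y reduces the one-step prediction error. The fragment below is a minimal illustrative score (a raw residual-variance reduction with no significance test), assuming NumPy is available; it is unrelated to the thesis's MIME implementation, and the driving coefficient 0.8 is invented.

```python
import numpy as np

def granger_gain(x, y, lag=1):
    """Reduction in one-step prediction-error variance of y when
    lagged x is added to lagged y (a minimal Granger-style score)."""
    y_t, y_l, x_l = y[lag:], y[:-lag], x[:-lag]
    # restricted model: y_t ~ y_l + const
    A = np.column_stack([y_l, np.ones_like(y_l)])
    r_res = y_t - A @ np.linalg.lstsq(A, y_t, rcond=None)[0]
    # full model: y_t ~ y_l + x_l + const
    B = np.column_stack([y_l, x_l, np.ones_like(y_l)])
    r_full = y_t - B @ np.linalg.lstsq(B, y_t, rcond=None)[0]
    return np.var(r_res) - np.var(r_full)

# Synthetic example: x drives y at lag 1, but not the reverse.
rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.normal()
```

On this synthetic pair, `granger_gain(x, y)` is large while `granger_gain(y, x)` is near zero, recovering the one-directional coupling; measures like TE and MIME generalise this idea beyond linear models.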
APA, Harvard, Vancouver, ISO, and other styles
50

Chen, Deji. "Real-time data management in the distributed environment /." Digital version accessible at:, 1999. http://wwwlib.umi.com/cr/utexas/main.

Full text
APA, Harvard, Vancouver, ISO, and other styles
