To see the other types of publications on this topic, follow the link: Statistical processing of real data.

Dissertations / Theses on the topic 'Statistical processing of real data'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Statistical processing of real data.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses from a wide variety of disciplines and organise your bibliography correctly.

1

Ha, Jin-cheol. "Real-time visual tracking using image processing and filtering methods." Diss., Atlanta, Ga. : Georgia Institute of Technology, 2008. http://hdl.handle.net/1853/28177.

Full text
Abstract:
Thesis (M. S.)--Aerospace Engineering, Georgia Institute of Technology, 2008.
Committee Chair: Eric N. Johnson; Committee Co-Chair: Allen R. Tannenbaum; Committee Member: Anthony J. Calise; Committee Member: Eric Feron; Committee Member: Patricio A. Vela.
2

Park, Chang Yun. "Predicting deterministic execution times of real-time programs." Thesis (UW restricted), 1992. http://hdl.handle.net/1773/6978.

Full text
3

Zamazal, Petr. "Statistická analýza rozsáhlých dat z průmyslu." Master's thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2021. http://www.nusl.cz/ntk/nusl-445466.

Full text
Abstract:
This thesis deals with the processing of real data regarding waste collection. It describes selected parts of the fields of statistical tests, identification of outliers, correlation analysis and linear regression. This theoretical basis is applied through the programming language Python to process the data into a form suitable for creating linear models. The final models explain between 70% and 85% of the variability. Finally, the information obtained through this analysis is used to formulate recommendations for the waste management company.
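
Since the abstract names a concrete Python workflow (outlier screening, correlation, linear regression), a minimal sketch of that kind of pipeline may help; the file name and column names below are hypothetical, not taken from the thesis.

```python
# Sketch only: input file and column names are hypothetical.
import numpy as np
import pandas as pd

df = pd.read_csv("waste_collection.csv")            # hypothetical data file
x, y = df["route_length_km"], df["collected_kg"]    # hypothetical columns

# Screen outliers in the response with the 1.5 * IQR rule.
q1, q3 = y.quantile([0.25, 0.75])
keep = y.between(q1 - 1.5 * (q3 - q1), q3 + 1.5 * (q3 - q1))

# Ordinary least-squares fit and coefficient of determination.
slope, intercept = np.polyfit(x[keep], y[keep], deg=1)
resid = y[keep] - (slope * x[keep] + intercept)
r2 = 1 - (resid ** 2).sum() / ((y[keep] - y[keep].mean()) ** 2).sum()
print(f"R^2 = {r2:.2f}")   # the thesis reports 70-85% explained variability
```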
4

Fernandez, Noemi. "Statistical information processing for data classification." FIU Digital Commons, 1996. http://digitalcommons.fiu.edu/etd/3297.

Full text
Abstract:
This thesis introduces new algorithms for analysis and classification of multivariate data. Statistical approaches are devised for the objectives of data clustering, data classification and object recognition. An initial investigation begins with the application of fundamental pattern recognition principles. Where such fundamental principles meet their limitations, statistical and neural algorithms are integrated to augment the overall approach for an enhanced solution. This thesis provides a new dimension to the problem of classification of data as a result of the following developments: (1) application of algorithms for object classification and recognition; (2) integration of a neural network algorithm which determines the decision functions associated with the task of classification; (3) determination and use of the eigensystem using newly developed methods with the objectives of achieving optimized data clustering and data classification, and dynamic monitoring of time-varying data; and (4) use of the principal component transform to exploit the eigensystem in order to perform the important tasks of orientation-independent object recognition, and dimensionality reduction of the data so as to optimize the processing time without compromising accuracy in the analysis of this data.
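
The eigensystem and principal component transform mentioned above reduce, in the simplest case, to an eigendecomposition of the data covariance. A small illustrative sketch (not the author's implementation) follows.

```python
import numpy as np

def principal_components(X, k):
    """Eigensystem of the covariance matrix, the basis for
    dimensionality reduction and orientation-independent features."""
    Xc = X - X.mean(axis=0)                    # centre the data
    cov = np.cov(Xc, rowvar=False)             # sample covariance
    vals, vecs = np.linalg.eigh(cov)           # eigh: symmetric matrix
    order = np.argsort(vals)[::-1]             # sort by decreasing variance
    return Xc @ vecs[:, order[:k]]             # project onto top-k components

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))                 # placeholder data
Z = principal_components(X, k=2)
print(Z.shape)                                 # (100, 2)
```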
5

Macias, Filiberto. "Real Time Telemetry Data Processing and Data Display." International Foundation for Telemetering, 1996. http://hdl.handle.net/10150/611405.

Full text
Abstract:
International Telemetering Conference Proceedings / October 28-31, 1996 / Town and Country Hotel and Convention Center, San Diego, California
The Telemetry Data Center (TDC) at White Sands Missile Range (WSMR) is now beginning to modernize its existing telemetry data processing system. Modern networking and interactive graphical displays are now being introduced. This infusion of modern technology will allow the TDC to provide our customers with enhanced data processing and display capability. The intent of this project is to outline this undertaking.
6

White, Allan P., and Richard K. Dean. "Real-Time Test Data Processing System." International Foundation for Telemetering, 1989. http://hdl.handle.net/10150/614650.

Full text
Abstract:
International Telemetering Conference Proceedings / October 30-November 02, 1989 / Town & Country Hotel & Convention Center, San Diego, California
The U.S. Army Aviation Development Test Activity at Fort Rucker, Alabama needed a real-time test data collection and processing capability for helicopter flight testing. The system had to be capable of collecting and processing both FM and PCM data streams from analog tape and/or a telemetry receiver. The hardware and software were to be off the shelf whenever possible. The integration was to result in a stand-alone telemetry collection and processing system.
7

Clapp, T. C. "Statistical methods for the processing of communications data." Thesis, University of Cambridge, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.597697.

Full text
Abstract:
This thesis describes the use of methods derived from Bayesian statistics on the problem of blind equalisation of communications channels, although much of this work is applicable to the more general problem of blind deconvolution. In order to allow general models to be incorporated, numerical methods are used; the focus is on Markov chain Monte Carlo (MCMC) methods for processing of blocks of data and the use of particle filters for sequential processing. In order to obtain the best performance using MCMC, the choice of the Markov chain needs tailoring to the application in hand. The use of joint sampling of all the states (transmitted data sequence) and reversible jump moves to combat delay ambiguity are proposed. The use of particle filters is still in its infancy, and much of the focus is on the development of strategies to improve its applicability to real problems. It is well known that fixed-lag methods may be used to great effect on Markovian models where later observations can provide information about states in the recent past. Methods of performing fixed-lag simulation for incorporation into particle filters are described. The use of data windowing on fixed parameter systems allows regeneration of the parameters at each time-step without having excessive demands on storage requirements. In certain cases it is difficult to perform the updating when a new data point is received in a single step. The novel concept of introducing intermediate densities in a manner akin to simulated annealing between time steps is described. This improves robustness and provides a natural method for initialisation. All of these techniques are demonstrated in simulations based upon standard models of communications systems, along with favourable comparisons to more conventional techniques.
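
For orientation, a generic bootstrap particle filter on a toy linear-Gaussian model is sketched below; the blind-equalisation samplers, fixed-lag moves and annealed intermediate densities developed in the thesis go well beyond this minimal scheme.

```python
# Generic bootstrap particle filter on a toy AR(1) model; illustrative only.
import numpy as np

rng = np.random.default_rng(1)
N, T = 500, 50                              # particles, time steps
x_true = 0.0
particles = rng.normal(0.0, 1.0, N)
weights = np.full(N, 1.0 / N)

for t in range(T):
    x_true = 0.9 * x_true + rng.normal(0, 0.5)              # hidden state
    y = x_true + rng.normal(0, 0.3)                         # noisy observation
    particles = 0.9 * particles + rng.normal(0, 0.5, N)     # propagate
    weights *= np.exp(-0.5 * ((y - particles) / 0.3) ** 2)  # likelihood weighting
    weights /= weights.sum()
    if 1.0 / np.sum(weights ** 2) < N / 2:  # resample on low effective sample size
        idx = rng.choice(N, size=N, p=weights)
        particles, weights = particles[idx], np.full(N, 1.0 / N)

print("posterior mean:", np.sum(weights * particles), "true state:", x_true)
```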
8

Dowling, Jason, John Welling, Loral Aerosys, Kathy Nanzetta, Toby Bennett, and Jeff Shi. "ACCELERATING REAL-TIME SPACE DATA PACKET PROCESSING." International Foundation for Telemetering, 1995. http://hdl.handle.net/10150/608429.

Full text
Abstract:
International Telemetering Conference Proceedings / October 30-November 02, 1995 / Riviera Hotel, Las Vegas, Nevada
NASA’s use of high bandwidth packetized Consultative Committee for Space Data Systems (CCSDS) telemetry in future missions presents a great challenge to ground data system developers. These missions, including the Earth Observing System (EOS), call for high data rate interfaces and small packet sizes. Because each packet requires a similar amount of protocol processing, high data rates and small packet sizes dramatically increase the real-time workload on ground packet processing systems. NASA’s Goddard Space Flight Center has been developing packet processing subsystems for more than twelve years. Implementations of these subsystems have ranged from mini-computers to single-card VLSI multiprocessor subsystems. The latter subsystem, known as the VLSI Packet Processor, was first deployed in 1991 for use in support of the Solar Anomalous & Magnetospheric Particle Explorer (SAMPEX) mission. An upgraded version of this VMEBus card, first deployed for Space Station flight hardware verification, has demonstrated sustained throughput of up to 50 Megabits per second and 15,000 packets per second. Future space missions including EOS will require significantly higher data and packet rate performance. A new approach to packet processing is under development that will not only increase performance levels by at least a factor of six but also reduce subsystem replication costs by a factor of five. This paper will discuss the development of a next generation packet processing subsystem and the architectural changes necessary to achieve a thirty-fold improvement in the performance/price of real-time packet processing.
9

Khondoker, Md Mizanur Rahman. "Statistical methods for pre-processing microarray gene expression data." Thesis, University of Edinburgh, 2006. http://hdl.handle.net/1842/12367.

Full text
Abstract:
A novel method is developed for combining multiple laser scans of microarrays to correct for “signal saturation” and “signal deterioration” effects in the gene expression measurement. A multivariate nonlinear functional regression model with Cauchy distributed errors having additive plus multiplicative scale is proposed as a model for combining multiple scan data. The model has been found to flexibly describe the nonlinear relationship in multiple scan data. The heavy-tailed Cauchy distribution with additive plus multiplicative scale provides a basis for objective and robust estimation of gene expression from multiple scan data, adjusting for censoring and deterioration bias in the observed intensity. Through combining multiple scans, the model reduces sampling variability in the gene expression estimates. A unified approach for nonparametric location and scale normalisation of log-ratio data is considered. A Generalised Additive Model for Location, Scale and Shape (GAMLSS) is proposed. GAMLSS uses a nonparametric approach for modelling both location and scale of log-ratio data, in contrast to the general tendency of using a parametric transformation, such as arcsinh, for variance stabilisation. Simulation studies demonstrate GAMLSS to be more powerful than the parametric method when a GAMLSS location and scale model, fitted to real data, is assumed correct. GAMLSS has been found to be as powerful as the parametric approach even when the parametric model is appropriate. Finally, we investigate the optimality of different estimation methods for analysing functional regression models. Alternative estimators are available in the literature to deal with the problems of identifiability and consistency. We investigated these estimators in terms of unbiasedness and efficiency for a specific case involving multiple laser scans of microarrays, and found that, in addition to being consistent, the named methods are highly efficient and unbiased.
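
As a rough illustration of the robust-estimation idea (a deliberately simplified stand-in, not the thesis model, which is a nonlinear functional regression with additive plus multiplicative scale), one can fit a scan-to-scan gain under Cauchy errors while masking saturated pixels:

```python
# Illustrative only: linear gain between two scans with Cauchy errors,
# masking 16-bit-saturated pixels; the thesis model is more general.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
low = rng.uniform(100, 5000, size=500)                      # low-sensitivity scan
high = np.minimum(3.0 * low + 20 * rng.standard_cauchy(500), 65535.0)
ok = high < 65535.0                                         # drop censored pixels

def neg_log_lik(theta):
    gain, log_scale = theta
    scale = np.exp(log_scale)                               # keep scale positive
    r = (high[ok] - gain * low[ok]) / scale
    return np.sum(np.log(np.pi * scale * (1.0 + r ** 2)))   # Cauchy likelihood

fit = minimize(neg_log_lik, x0=[1.0, np.log(50.0)], method="Nelder-Mead")
print("estimated gain:", fit.x[0])                          # close to the true 3.0
```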
10

Wang, Yun, and Wenxuan Jiang. "Statistical Processing of IEEE 802.15.4 Data Collected in Industrial Environment." Thesis, Mittuniversitetet, Institutionen för informationsteknologi och medier, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:miun:diva-19619.

Full text
Abstract:
A wireless sensor network, which consists of autonomous sensors, is used for monitoring physical or environmental conditions like temperature, sound, pressure, and so on. The dispersed sensors or nodes respectively pass their data through the network to the main location. Currently, several standards are ratified or in development for wireless sensor networks, such as WirelessHART, ISA100.11a, WIA-PA and IEEE 802.15.4. Among these standards, Zigbee is often used in industrial applications that require short-range and low-rate wireless transfer. In this research, all the data were collected in an industrial environment using an IEEE 802.15.4 compliant physical layer; some packets are affected only by multi-path fading while others also suffer Wi-Fi interference. The goal of the thesis is to find the dependence between the received signal strength (RSS), correlation value (CORR) and bit error rate (BER) of the received message, and their distributions both when the packet is lost and when it is not. In addition, the behaviour of the bit error rate, such as its distribution and the features of burst error lengths with and without Wi-Fi interference, is also examined. All of this rests on precise statistical processing.
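
A sketch of the kind of statistical processing described, computing pairwise dependence and per-outcome distributions over a per-packet log, might look as follows; the data here are synthetic and the loss criterion is a toy assumption.

```python
# Synthetic per-packet log; the real data come from IEEE 802.15.4 captures.
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
df = pd.DataFrame({
    "rss":  rng.normal(-75, 6, 1000),       # received signal strength, dBm
    "corr": rng.uniform(50, 110, 1000),     # chip correlation value
    "ber":  rng.exponential(0.01, 1000),    # bit error rate
})
df["lost"] = df["ber"] > 0.05               # toy packet-loss criterion

print(df[["rss", "corr", "ber"]].corr())                      # pairwise dependence
print(df.groupby("lost")[["rss", "corr", "ber"]].describe())  # per-outcome stats
```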
11

Patel, Ankur. "3D morphable models : data pre-processing, statistical analysis and fitting." Thesis, University of York, 2011. http://etheses.whiterose.ac.uk/1576/.

Full text
Abstract:
This thesis presents research aimed at using a 3D linear statistical model (known as a 3D morphable model) of an object class (which could be faces, bodies, cars, etc.) for robust shape recovery. Our aim is to use this recovered information for the purposes of potentially useful applications like recognition and synthesis. With a 3D morphable model as its central theme, this thesis includes: a framework for the groupwise processing of a set of meshes in dense correspondence; a new method for model construction; a new interpretation of the statistical constraints afforded by the model and addressing of some key limitations associated with using such models in real world applications. In Chapter 1 we introduce 3D morphable models, touch on the current state-of-the-art and emphasise why these models are an interesting and important research tool in the computer vision and graphics community. We then talk about the limitations of using such models and use these limitations as a motivation for some of the contributions made in this thesis. Chapter 2 presents an end-to-end system for obtaining a single (possibly symmetric) low resolution mesh topology and texture parameterisation which are optimal with respect to a set of high resolution input meshes in dense correspondence. These methods result in data which can be used to build 3D morphable models (at any resolution). In Chapter 3 we show how the tools of thin-plate spline warping and Procrustes analysis can be used to construct a morphable model as a shape space. We observe that the distribution of parameter vector lengths follows a chi-square distribution and discuss how the parameters of this distribution can be used as a regularisation constraint on the length of parameter vectors. In Chapter 4 we take the idea introduced in Chapter 3 further by enforcing a hard constraint which restricts faces to points on a hyperspherical manifold within the parameter space of a linear statistical model. We introduce tools from differential geometry (log and exponential maps for a hyperspherical manifold) which are necessary for developing our methodology and provide empirical validation to justify our choice of manifold. Finally, we show how to use these tools to perform model fitting, warping and averaging operations on the surface of this manifold. Chapter 5 presents a method to simplify a 3D morphable model without requiring knowledge of the training meshes used to build the model. This extends the simplification ideas in Chapter 2 into a statistical setting. The proposed method is based on iterative edge collapse and we show that the expected value of the Quadric Error Metric can be computed in closed form for a linear deformable model. The simplified models can be used to achieve efficient multiscale fitting and super-resolution. In Chapter 6 we consider the problem of model dominance and show how shading constraints can be used to refine morphable model shape estimates, offering the possibility of exceeding the maximum possible accuracy of the model. We present an optimisation scheme based on surface normal error as opposed to image error. This ensures the fullest possible use of the information conveyed by the shading in an image. In addition, our framework allows non-model based estimation of per-vertex bump and albedo maps. This means the recovered model is capable of describing shape and reflectance phenomena not present in the training set. We explore the use of the recovered shape and reflectance information for face recognition and synthesis.
Finally, in Chapter 7 we provide concluding remarks and discuss directions for future research.
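
The log and exponential maps for a hyperspherical manifold used in Chapter 4 have compact closed forms; a minimal numpy sketch (dimension-agnostic, not the thesis code) is given below.

```python
import numpy as np

def sphere_log(mu, x):
    """Log map: send x on the unit sphere to the tangent space at mu."""
    theta = np.arccos(np.clip(mu @ x, -1.0, 1.0))
    if np.isclose(theta, 0.0):
        return np.zeros_like(x)
    return theta * (x - np.cos(theta) * mu) / np.sin(theta)

def sphere_exp(mu, v):
    """Exp map: shoot a tangent vector v at mu back onto the sphere."""
    norm = np.linalg.norm(v)
    if np.isclose(norm, 0.0):
        return mu.copy()
    return np.cos(norm) * mu + np.sin(norm) * v / norm

mu = np.array([1.0, 0.0, 0.0])
x = np.array([0.0, 1.0, 0.0])
v = sphere_log(mu, x)
print(np.allclose(sphere_exp(mu, v), x))       # True: exp inverts log
```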
12

Rao, Ashwani Pratap. "Statistical information retrieval models: Experiments, evaluation on real time data." Thesis, University of Delaware, 2014. http://pqdtopen.proquest.com/#viewpdf?dispub=1567821.

Full text
Abstract:

We are all aware of the rise of the information age: heterogeneous sources of information and the ability to publish rapidly and indiscriminately are responsible for information chaos. In this work, we are interested in a system which can separate the "wheat" of vital information from the chaff within this information chaos. An efficient filtering system can accelerate meaningful utilization of knowledge. Consider Wikipedia, an example of community-driven knowledge synthesis. Facts about topics on Wikipedia are continuously being updated by users interested in a particular topic. Consider an automatic system (or an invisible robot) to which a topic such as "President of the United States" can be fed. This system will work ceaselessly, filtering new information created on the web in order to provide the small set of documents about the "President of the United States" that are vital to keeping the Wikipedia page relevant and up-to-date. In this work, we present an automatic information filtering system for this task. While building such a system, we have encountered issues related to scalability, retrieval algorithms, and system evaluation; we describe our efforts to understand and overcome these issues.

13

Puri, Anuj. "Statistical profile generation of real-time UAV-based traffic data." [Tampa, Fla] : University of South Florida, 2008. http://purl.fcla.edu/usf/dc/et/SFE0002674.

Full text
14

Deel, Troy A. "A statistical study of graph algorithms." Virtual Press, 1985. http://liblink.bsu.edu/uhtbin/catkey/424871.

Full text
Abstract:
The object of this paper is to investigate the behavior of some important graph properties and to statistically analyze the execution times of certain graph algorithms. Among the properties studied are the average degree of a vertex, the connectivity of a graph, and the existence of Hamilton cycles, Euler tours, and bipartitions in graphs. This study is unique in that it is based on statistical rather than deterministic methods.
15

Naulleau, Patrick. "Optical signal processing and real world applications /." Online version of thesis, 1993. http://hdl.handle.net/1850/12136.

Full text
16

Liu, Guangtian. "An event service architecture in distributed real-time systems." 1999. Digital version accessible at http://wwwlib.umi.com/cr/utexas/main.

Full text
17

He, Zhenyu. "Writer identification using wavelet, contourlet and statistical models." HKBU Institutional Repository, 2006. http://repository.hkbu.edu.hk/etd_ra/767.

Full text
18

Dreibelbis, Harold N., Dennis Kelsch, and Larry James. "REAL-TIME TELEMETRY DATA PROCESSING and LARGE SCALE PROCESSORS." International Foundation for Telemetering, 1991. http://hdl.handle.net/10150/612912.

Full text
Abstract:
International Telemetering Conference Proceedings / November 04-07, 1991 / Riviera Hotel and Convention Center, Las Vegas, Nevada
Real-time data processing of telemetry data has evolved from a highly centralized single large-scale computer system to multiple mini-computers or super mini-computers tied together in a loosely coupled distributed network, with each mini-computer or super mini-computer essentially performing a single function in the real-time processing sequence of events. The reasons in the past for this evolution are many and varied. This paper will review some of the more significant factors in that evolution and will present some alternatives to a fully distributed mini-computer network that appear to offer significant real-time data processing advantages.
19

Feather, Bob, and Michael O’Brien. "OPEN ARCHITECTURE SYSTEM FOR REAL TIME TELEMETRY DATA PROCESSING." International Foundation for Telemetering, 1991. http://hdl.handle.net/10150/612934.

Full text
Abstract:
International Telemetering Conference Proceedings / November 04-07, 1991 / Riviera Hotel and Convention Center, Las Vegas, Nevada
There have been many recent technological advances in small computers, graphics stations, and system networks. This has made it possible to build highly advanced distributed processing systems for telemetry data acquisition and processing. Presently there is a plethora of vendors marketing powerful new network workstation hardware and software products. Computer vendors are rapidly developing new products as new technology continues to emerge. It is becoming difficult to procure and install a new computer system before it has been made obsolete by a competitor or even the same vendor. If one purchases the best hardware and software products individually, the system can end up being composed of incompatible components from different vendors that do not operate as one integrated homogeneous system. If one uses only hardware and software from one vendor in order to simplify system integration, the system will be limited to only those products that the vendor chooses to develop. To truly take advantage of the rapidly advancing computer technology, today’s telemetry systems should be designed for an open systems environment. This paper defines an optimum open architecture system designed around industry wide standards for both hardware and software. This will allow for different vendor’s computers to operate in the same distributed networked system, and will allow software to be portable to the various computers and workstations in the system while maintaining the same user interface. The open architecture system allows for new products to be added as they become available to increase system performance and capability in a truly heterogeneous system environment.
20

Dahan, Michael. "RTDAP: Real-Time Data Acquisition, Processing and Display System." International Foundation for Telemetering, 1989. http://hdl.handle.net/10150/614629.

Full text
Abstract:
International Telemetering Conference Proceedings / October 30-November 02, 1989 / Town & Country Hotel & Convention Center, San Diego, California
This paper describes a data acquisition, processing and display system which is suitable for various telemetry applications. The system can be connected either to a PCM encoder or to a telemetry decommutator through a built-in interface and can directly address any channel from the PCM stream for processing. Its compact size and simplicity allow it to be used on the flight line as a test console, in mobile stations as the main data processing system, or on board civil test aircraft for in-flight monitoring and data processing.
21

Chong, Wai Yu Ryan. "Statistical and analytic data processing based on SQL7 with web interface." Leeds, 2001. http://www.leeds.ac.uk/library/counter2/compstmsc/20002001/chong.pdf.

Full text
22

Almeida, Marcos Antonio Martins de. "Statistical analysis applied to data classification and image filtering." Universidade Federal de Pernambuco, 2016. https://repositorio.ufpe.br/handle/123456789/25506.

Full text
Abstract:
Statistical analysis is a tool of wide applicability in several areas of scientific knowledge. This thesis makes use of statistical analysis in two different applications: data classification and image processing targeted at document image binarization. In the first case, this thesis presents an analysis of several aspects of the consistency of the classification of the senior researchers in computer science of the Brazilian research council, CNPq - Conselho Nacional de Desenvolvimento Científico e Tecnológico. The second application of statistical analysis developed in this thesis addresses filtering out the back-to-front interference which appears whenever a document is written or typed on both sides of translucent paper. In this topic, an assessment of the most important algorithms found in the literature is made, taking into account a large number of parameters such as the strength of the back-to-front interference, the diffusion of the ink in the paper, and the texture and hue of the paper due to aging. A new binarization algorithm is proposed, which is capable of removing the back-to-front noise in a wide range of documents. Additionally, this thesis proposes a new concept of “intelligent” binarization for complex documents, which besides text encompass several graphical elements such as figures, photos, diagrams, etc.
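
For context, the textbook baseline that such binarization work is usually compared against is a global Otsu threshold; a short sketch follows (a baseline only, not the back-to-front filtering algorithm proposed in the thesis).

```python
# Otsu's global threshold: a standard baseline binarization,
# not the back-to-front filtering algorithm proposed in the thesis.
import numpy as np

def otsu_threshold(gray):
    """gray: 2-D array of uint8 pixel intensities."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * prob[:t]).sum() / w0
        mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1
        between = w0 * w1 * (mu0 - mu1) ** 2    # between-class variance
        if between > best_var:
            best_t, best_var = t, between
    return best_t

img = np.random.default_rng(1).integers(0, 256, (64, 64), dtype=np.uint8)
binary = img > otsu_threshold(img)             # True = background, False = ink
```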
23

Nyström, Simon, and Joakim Lönnegren. "Processing data sources with big data frameworks." Thesis, KTH, Data- och elektroteknik, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-188204.

Full text
Abstract:
Big data is a concept that is expanding rapidly. As more and more data is generated and garnered, there is an increasing need for efficient solutions that can be utilized to process all this data in attempts to gain value from it. The purpose of this thesis is to find an efficient way to quickly process a large number of relatively small files. More specifically, the purpose is to test two frameworks that can be used for processing big data. The frameworks that are tested against each other are Apache NiFi and Apache Storm. A method is devised in order to, firstly, construct a data flow and, secondly, construct a method for testing the performance and scalability of the frameworks running this data flow. The results reveal that Apache Storm is faster than Apache NiFi at the sort of task that was tested. As the number of nodes included in the tests went up, performance did not always follow. This indicates that adding more nodes to a big data processing pipeline does not always result in a better performing setup and that, sometimes, other measures must be taken to heighten the performance.
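
The comparison ultimately rests on measuring how many small files per second each pipeline sustains. A crude, framework-agnostic timing harness of the sort one might use is sketched below; the file sizes, counts and the process_file stub are hypothetical, and NiFi/Storm themselves are not invoked here.

```python
# Hypothetical harness: times a stub over many small "files" to
# estimate throughput; stand-in for a real NiFi/Storm data flow.
import time

def process_file(payload: bytes) -> int:
    return len(payload)                         # stand-in for real work

files = [bytes(256) for _ in range(100_000)]    # many relatively small files
start = time.perf_counter()
for payload in files:
    process_file(payload)
elapsed = time.perf_counter() - start
print(f"{len(files) / elapsed:,.0f} files/second")
```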
24

Narayanan, Shruthi (Shruthi P. ). "Real-time processing and visualization of intensive care unit data." Thesis, Massachusetts Institute of Technology, 2017. http://hdl.handle.net/1721.1/119537.

Full text
Abstract:
Thesis: M. Eng., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2017.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (page 83).
Intensive care unit (ICU) patients undergo detailed monitoring so that copious information regarding their condition is available to support clinical decision-making. Full utilization of the data depends heavily on its quantity, quality and manner of presentation to the physician at the bedside of a patient. In this thesis, we implemented a visualization system to aid ICU clinicians in collecting, processing, and displaying available ICU data. Our goals for the system are: to be able to receive large quantities of patient data from various sources, to compute complex functions over the data that are able to quantify an ICU patient's condition, to plot the data using a clean and interactive interface, and to be capable of live plot updates upon receiving new data. We made significant headway toward our goals, and we succeeded in creating a highly adaptable visualization system that future developers and users will be able to customize.
by Shruthi Narayanan.
M. Eng.
25

Chun, Yang, Yang Hongling, and Zhou Jie. "STUDY ON HIGH-RATE TELEMETRY DATA REAL-TIME PROCESSING TECHNIQUES." International Foundation for Telemetering, 2000. http://hdl.handle.net/10150/608251.

Full text
Abstract:
International Telemetering Conference Proceedings / October 23-26, 2000 / Town & Country Hotel and Conference Center, San Diego, California
Owing to the rapid development of the PC industry, the personal computer has improved remarkably in reliability and speed, and it has been applied in many fields, such as aerospace, satellite and telemetry applications. As is well known, two aspects decide how fast PC-based data acquisition can be: one is CPU processing and the other is I/O bandwidth. The first aspect has become increasingly insignificant because CPU frequencies have exceeded 700 MHz, which fully satisfies the need for high-rate data processing. So I/O bandwidth is the key factor in high-rate PC-based data acquisition, and efficient data buffering techniques must be adopted to satisfy the demands of telemetry data entry. This paper presents a buffered data channel which uses memory mapping, EPLD and dual-port SRAM techniques. The operating platform of this design is Windows 95/98, and the software includes a device driver and real-time processing routines.
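
The dual-port SRAM buffer decouples the telemetry input rate from host processing, the same idea as a software ring buffer between a producer and a consumer. A toy sketch of that decoupling follows (not the EPLD/SRAM design itself).

```python
# Toy ring buffer illustrating producer/consumer decoupling; the paper's
# design does this in hardware with dual-port SRAM and an EPLD.
from collections import deque

class RingBuffer:
    def __init__(self, capacity: int):
        self.buf = deque(maxlen=capacity)   # oldest frames dropped when full

    def write(self, frame: bytes) -> None:  # producer side (telemetry input)
        self.buf.append(frame)

    def read(self):                         # consumer side (host processing)
        return self.buf.popleft() if self.buf else None

rb = RingBuffer(capacity=1024)
rb.write(b"\x55\xaa frame 0")
print(rb.read())
```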
26

Wan, Hon-hing 尹翰卿. "Efficient real-time scheduling for multimedia data transmission." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2002. http://hub.hku.hk/bib/B31227910.

Full text
27

Chen, Li. "Statistical Machine Learning for Multi-platform Biomedical Data Analysis." Diss., Virginia Tech, 2011. http://hdl.handle.net/10919/77188.

Full text
Abstract:
Recent advances in biotechnologies have enabled multiplatform and large-scale quantitative measurements of biomedical events. The need to analyze the vast amount of imaging and genomic data produced stimulates various novel applications of statistical machine learning methods in many areas of biomedical research. The main objective is to assist biomedical investigators to better interpret, analyze, and understand the biomedical questions based on the acquired data. Given the computational challenges imposed by these high-dimensional and complex data, machine learning research finds its new opportunities and roles. In this dissertation, we propose to develop, test and apply novel statistical machine learning methods to analyze the data mainly acquired by dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) and single nucleotide polymorphism (SNP) microarrays. The research work focuses on: (1) tissue-specific compartmental analysis for dynamic contrast-enhanced MR imaging of complex tumors; (2) computational analysis for detecting DNA SNP interactions in genome-wide association studies. DCE-MRI provides a noninvasive method for evaluating tumor vasculature patterns based on contrast accumulation and washout. Compartmental analysis is a widely used mathematical tool to model dynamic imaging data and can provide accurate pharmacokinetics parameter estimates. However, the partial volume effect (PVE) present in imaging data can have a profound effect on the accuracy of pharmacokinetics studies. We therefore propose a convex analysis of mixtures (CAM) algorithm to explicitly eliminate PVE by expressing the kinetics in each pixel as a nonnegative combination of underlying compartments and subsequently identifying pure volume pixels at the corners of the clustered pixel time series scatter plot. The algorithm is supported by a series of newly proved theorems and additional noise filtering and normalization preprocessing. We demonstrate the principle and feasibility of the CAM approach together with compartmental modeling on realistic synthetic data, and compare the accuracy of parameter estimates obtained using CAM or other relevant techniques. Experimental results show a significant improvement in the accuracy of kinetic parameter estimation. We then apply the algorithm to real DCE-MRI data of breast cancer and observe improved pharmacokinetics parameter estimation that separates tumor tissue into sub-regions with differential tracer kinetics on a pixel-by-pixel basis and reveals biologically plausible tumor tissue heterogeneity patterns. This method combines the advantages of multivariate clustering, convex optimization and compartmental modeling approaches. Interactions among genetic loci are believed to play an important role in disease risk. Due to the huge dimension of SNP data (normally several million in genome-wide association studies), the combinatorial search and statistical evaluation required to detect multi-locus interactions constitute a significant computational challenge. While many approaches have been proposed for detecting such interactions, their relative performance remains largely unclear, due to the fact that performance was evaluated on different data sources, using different performance measures, and under different experimental protocols.
Given the importance of detecting gene-gene interactions, a thorough evaluation of the performance and limitations of available methods, a theoretical analysis of the interaction effect and the genetic factors it depends on, and the development of more efficient methods are warranted. We therefore perform a computational analysis for detecting interactions among SNPs. The contributions are four-fold: (1) we developed simulation tools for evaluating the performance of any technique designed to detect interactions among genetic variants in case-control studies; (2) we used these tools to compare the performance of five popular SNP detection methods; (3) we derived analytic relationships between power and the genetic factors, which not only support the experimental results but also give a quantitative linkage between interaction effect and these factors; and (4) based on the novel insights gained by comparative and theoretical analysis, we developed an efficient statistically-principled method, namely the hybrid correlation-based association (HCA), to detect interacting SNPs. The HCA algorithm is based on three correlation-based statistics, which are designed to measure the strength of multi-locus interaction for three different interaction types, covering a large portion of possible interactions. Moreover, to maximize the detection power (sensitivity) while suppressing the false positive rate (or retaining moderate specificity), we also devised a strategy to hybridize these three statistics in a case-by-case way. A heuristic search strategy is also proposed to largely decrease the computational complexity, especially for high-order interaction detection. We have tested HCA in both a simulation study and a real disease study. HCA and the selected peer methods were compared on a large number of simulated datasets, each including multiple sets of interaction models. The assessment criteria included several power measures, family-wise type I error rate, and computational complexity. The experimental results of HCA on the simulation data indicate its promising performance in terms of a good balance between detection accuracy and computational complexity. By running on multiple real datasets, HCA also replicates plausible biomarkers reported in previous literature.
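
To make the combinatorial cost of a two-locus scan concrete, an exhaustive pairwise search with a simple correlation-type statistic is sketched below; this is a generic illustration, not HCA's three hybridized statistics or its heuristic search.

```python
# Generic exhaustive two-locus scan on synthetic genotypes;
# not the HCA statistics themselves.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(7)
n_subjects, n_snps = 200, 50
geno = rng.integers(0, 3, size=(n_subjects, n_snps))   # 0/1/2 minor-allele counts
pheno = rng.integers(0, 2, size=n_subjects)            # case/control labels

def interaction_score(a, b, y):
    """Correlation between a multiplicative genotype term and the phenotype."""
    term = (a * b).astype(float)
    if term.std() == 0:
        return 0.0
    return abs(np.corrcoef(term, y)[0, 1])

scores = {(i, j): interaction_score(geno[:, i], geno[:, j], pheno)
          for i, j in combinations(range(n_snps), 2)}  # O(n_snps^2) pairs
top = max(scores, key=scores.get)
print("top-scoring pair:", top, "score:", round(scores[top], 3))
```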
Ph. D.
28

Ghosh, Kaushik. "Speculative execution in real-time systems." Diss., Georgia Institute of Technology, 1995. http://hdl.handle.net/1853/8174.

Full text
29

Sandys, Sean David. "Requirement specifications for communication in distributed real-time systems." Thesis (UW restricted), 2002. http://hdl.handle.net/1773/7002.

Full text
30

Hooman, Jozef. "Specification and compositional verification of real time systems." Berlin [etc.]: Springer, 1991. http://www.loc.gov/catdir/enhancements/fy0815/91041783-d.html.

Full text
31

Burger, Joseph. "Real-time engagement area development program (READ-Pro)." Monterey, Calif.: Naval Postgraduate School; Springfield, Va.: Available from National Technical Information Service, 2002. http://library.nps.navy.mil/uhtbin/hyperion-image/02Jun%5FBurger.pdf.

Full text
32

Lauretig, Adam M. "Natural Language Processing, Statistical Inference, and American Foreign Policy." The Ohio State University, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=osu1562147711514566.

Full text
33

Craig, David W. (David William). "Light traffic loss of random hard real-time tasks in a network." Dissertation, Department of Electrical Engineering, Carleton University, Ottawa, 1988.

Find full text
34

Arshad, Norhashim Mohd. "Real-time data compression for machine vision measurement systems." Thesis, Liverpool John Moores University, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.285284.

Full text
35

Jun, Zhang, Feng MeiPing, Zhu Yanbo, He Bin, and Zhang Qishan. "A Real-Time Telemetry Data Processing System with Open System Architecture." International Foundation for Telemetering, 1994. http://hdl.handle.net/10150/611667.

Full text
Abstract:
International Telemetering Conference Proceedings / October 17-20, 1994 / Town & Country Hotel and Conference Center, San Diego, California
In the face of multiple data streams, high bit rates, variable data formats, complicated frame structures and changeable application environments, the programmable PCM telemetry system needs a new data processing system with an advanced telemetry system architecture. This paper fully considers the characteristics of real-time telemetry data processing, analyzes the design of an open system architecture for a real-time telemetry data processing system (TDPS), presents an open system architecture scheme and design for a real-time TDPS, gives the structure model of a distributed network system, and develops the interface between the network database and the telemetry database, as well as telemetry processing software with a man-machine interface. Finally, a practical and multi-functional real-time TDPS with open system architecture has been built, based on the UNIX operating system, supporting the TCP/IP protocol and using the Oracle relational database management system. This scheme and design have already proved efficient for real-time processing, high speed, mass storage and multi-user operation.
36

Ivan-Roşu, Daniela. "Dynamic resource allocation for adaptive real-time applications." Diss., Georgia Institute of Technology, 1999. http://hdl.handle.net/1853/9200.

Full text
37

Wong, Pak-kwong 黃伯光. "Statistical language models for Chinese recognition: speech and character." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1998. http://hub.hku.hk/bib/B31239456.

Full text
38

Cannon, Jordan. "Statistical analysis and algorithms for online change detection in real-time psychophysiological data." [Iowa City, Iowa]: University of Iowa, 2009. http://ir.uiowa.edu/etd/342.

Full text
39

Cannon, Jordan. "Statistical analysis and algorithms for online change detection in real-time psychophysiological data." Thesis, University of Iowa, 2009. https://ir.uiowa.edu/etd/342.

Full text
Abstract:
Modern systems produce a great amount of information and cues on which human operators must take action. On one hand, these complex systems can place a high demand on an operator's cognitive load, potentially overwhelming them and causing poor performance. On the other hand, some systems utilize extensive automation to accommodate their complexity; this can cause an operator to become complacent and inattentive, which again leads to deteriorated performance (Wilson, Russell, 2003a; Wilson, Russell, 2003b). An ideal human-machine interface would be one that optimizes the functional state of the operator, preventing overload while not permitting complacency, thus resulting in improved system performance. An operator's functional state (OFS) is the momentary ability of an operator to meet task demands with their cognitive resources. A high OFS indicates that an operator is vigilant and aware, with ample cognitive resources to achieve satisfactory performance. A low OFS, however, indicates a non-optimal cognitive load, either too much or too little, resulting in sub-par system performance (Wilson, Russell, 1999). With the ability to measure and detect changes in OFS in real-time, a closed-loop system between the operator and machine could optimize OFS through the dynamic allocation of tasks. For instance, if the system detects the operator is in cognitive overload, it can automate certain tasks, allowing them to better focus on salient information. Conversely, if the system detects under-vigilance, it can allocate tasks back to the manual control of the operator. In essence, this system operates to "dynamically match task demands to [an] operator's momentary cognitive state", thereby achieving optimal OFS (Wilson, Russell, 2007). This concept is termed adaptive aiding and has been the subject of much research, with recent emphasis on accurately assessing OFS in real-time. OFS is commonly measured indirectly, for example by using overt performance metrics on tasks; if performance is declining, a low OFS is assumed. Another indirect measure is the subjective estimate of mental workload, where an operator narrates his/her perceived functional state while performing tasks (Wilson, Russell, 2007). Unfortunately, indirect measures of OFS are often infeasible in operational settings; performance metrics are difficult to construct for highly-automated complex systems, and subjective workload estimates are often inaccurate and intrusive (Wilson, Russell, 2007; Prinzel et al., 2000; Smith et al., 2001). OFS can be more directly measured via psychophysiological signals such as the electroencephalogram (EEG) and electrooculography (EOG). Current research has demonstrated these signals' ability to respond to changing cognitive load and to measure OFS (Wilson, Fisher, 1991; Wilson, Fisher, 1995; Gevins et al., 1997; Gevins et al., 1998; Byrne, Parasuraman, 1996). Moreover, psychophysiological signals are continuously available and can be obtained in a non-intrusive manner, a prerequisite for their use in operational environments. The objective of this study is to advance schemes which detect change in OFS by monitoring psychophysiological signals in real-time. Reviews of similar methods can be found in, e.g., Wilson and Russell (2003a) and Wilson and Russell (2007). Many of these methods employ pattern recognition to classify mental workload into one of several discrete categories.
For instance, given an experiment with easy, medium and hard tasks, and assuming the tasks induce varying degrees of mental workload on a subject, these methods classify which task is being performed for each epoch of psychophysiological data. The most common classifiers are artificial neural networks (ANN) and multivariate statistical techniques such as stepwise discriminant analysis (SWDA). ANNs have proved especially effective at classifying OFS as they account for the non-linear and higher order relationships often present in EEG/EOG data; they routinely achieve classification accuracy greater than 80%. However, the discrete output of these classification schemes is not conducive to real-time change detection. They accurately classify OFS, but they do not indicate when OFS has changed; the change points remain ambiguous and left to subjective interpretation. Thus, the present study introduces several online algorithms which objectively determine change in OFS via real-time psychophysiological signals. The following chapters describe the dataset evaluated, discuss the statistical properties of psychophysiological signals, and detail various algorithms which utilize these signals to detect real-time changes in OFS. The results of the algorithms are presented along with a discussion. Finally, the study is concluded with a comparison of each method and recommendations for future application.
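
One classical online change detector of the kind such algorithms build on is the CUSUM statistic. A minimal one-sided version is sketched below (illustrative only; the thesis develops and compares several schemes, not necessarily this one).

```python
# One-sided CUSUM for an upward mean shift; illustrative only.
import numpy as np

def cusum(stream, mu0, drift, threshold):
    """Return the sample index at which an upward shift from mu0 is declared."""
    s = 0.0
    for t, x in enumerate(stream):
        s = max(0.0, s + (x - mu0 - drift))   # accumulate positive deviations
        if s > threshold:
            return t
    return None

rng = np.random.default_rng(0)
signal = np.concatenate([rng.normal(0.0, 1, 300),   # baseline band power
                         rng.normal(1.5, 1, 300)])  # workload-induced shift
print("change declared at sample",
      cusum(signal, mu0=0.0, drift=0.5, threshold=8.0))
```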
40

Chen, Deji. "Real-time data management in the distributed environment." 1999. Digital version accessible at http://wwwlib.umi.com/cr/utexas/main.

Full text
41

Lloyd, Joseph W. Jr. "POST-FLIGHT DATA DISTRIBUTION SYSTEM." International Foundation for Telemetering, 1993. http://hdl.handle.net/10150/608898.

Full text
Abstract:
International Telemetering Conference Proceedings / October 25-28, 1993 / Riviera Hotel and Convention Center, Las Vegas, Nevada
Desktop processors (IBM PC, PC-compatible, and Macintosh) have made a major impact on how the Naval Air Warfare Center Aircraft Division (NAWCAD), Patuxent River engineering community performs its work in aircraft weapons tests. The personal processors are utilized by the flight-test engineers not only for report preparation, but also for post-flight Engineering Unit (EU) data reduction and analysis. Present-day requirements demand better post-flight data handling than in the past. These requirements are driven by the need to analyze all the vehicle's parameters prior to the succeeding test flight, and to generate test reports in a more cost-effective and timely manner. This paper defines the post-flight data distribution system at NAWCAD, Patuxent River, explains how these tasks were handled in the past, and describes the development of a real-time data storage design approach for post-flight data handling. This engineering design is then described, explaining how it sets the precedent for NAWCAD, Patuxent River's future plans, and how it provides the flight-test engineer with the test vehicle's EU data immediately available post-flight at the desktop processor.
42

Yaman, Sibel. "A multi-objective programming perspective to statistical learning problems." Diss., Atlanta, Ga. : Georgia Institute of Technology, 2008. http://hdl.handle.net/1853/26470.

Full text
Abstract:
Thesis (Ph.D)--Electrical and Computer Engineering, Georgia Institute of Technology, 2009.
Committee Chair: Chin-Hui Lee; Committee Member: Anthony Yezzi; Committee Member: Evans Harrell; Committee Member: Fred Juang; Committee Member: James H. McClellan. Part of the SMARTech Electronic Thesis and Dissertation Collection.
43

Spina, Robert. "Real time maze traversal." Online version of thesis, 1989. http://hdl.handle.net/1850/10566.

Full text
44

Chang, Kuo-Lung. "A Real-Time Merging-Buffering Technique for MIDI Messages." Thesis, University of North Texas, 1991. https://digital.library.unt.edu/ark:/67531/metadc500471/.

Full text
Abstract:
A powerful and efficient algorithm has been designed to deal with the critical timing problem of MIDI messages. This algorithm can dynamically convert note events stored in a natural way into MIDI messages. Only limited memory space (the buffer) is required to finish the conversion work, and the size of the buffer is independent of the size of the original sequence (notes). The algorithm's real-time, variable properties suggest not only flexible real-time control of musical aspects, but also expandability to interactive multimedia applications. A compositional environment called MusicSculptor has been implemented in terms of this algorithm.
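
The merging-buffering idea, holding note events in a small time-ordered buffer and emitting whatever has come due, maps naturally onto a priority queue. The sketch below is schematic; the event fields and timing loop are hypothetical, not the MusicSculptor implementation.

```python
# Schematic merge buffer: note events from several sources are kept sorted
# by timestamp and drained as their time comes due. Not the MusicSculptor code.
import heapq

buffer = []   # min-heap ordered by event time

def schedule(time_ms: int, message: tuple) -> None:
    heapq.heappush(buffer, (time_ms, message))

def drain(now_ms: int):
    """Emit every buffered MIDI message whose timestamp has passed."""
    while buffer and buffer[0][0] <= now_ms:
        yield heapq.heappop(buffer)[1]

schedule(120, (0x90, 60, 100))   # note-on, middle C
schedule(100, (0x90, 64, 100))   # earlier event inserted later
for msg in drain(now_ms=150):
    print(msg)                   # emitted in time order
```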
45

O'Brien, R. Michael. "REAL-TIME TELEMETRY DATA FORMATTING FOR FLIGHT TEST ANALYSIS." International Foundation for Telemetering, 1994. http://hdl.handle.net/10150/608577.

Full text
Abstract:
International Telemetering Conference Proceedings / October 17-20, 1994 / Town & Country Hotel and Conference Center, San Diego, California
With today's telemetry systems, an hour-long analog test tape can be digitized in one hour or less. However, the digitized data produced by today's telemetry systems is usually not in a format that can be directly analyzed by the test engineer's analysis tools. The digitized data must be formatted before analysis can begin. The data formatting process can take from one to eight hours depending on the amount of data, the power of the system's host computer, and the complexity of the analysis software's data format. If more than one analysis package is used by the test engineer, the data has to be formatted separately for each package. Using today's high-speed RISC processors and large memory technology, a real-time Flexible Data Formatter can be added to the Telemetry Front End to perform this formatting function. The Flexible Data Formatter (FDF) allows the telemetry user to program the front-end hardware to output the telemetry test data in a format compatible with the user's analysis software. The FDF can also output multiple data files, each in a different format for supporting multiple analysis packages. This eliminates the file formatting step, thus reducing the time to process the data from each test by a factor of two to nine.
46

Fidalgo, André Filipe dos Santos Pinto. "IPTV data reduction strategy to measure real users’ behaviours." Master's thesis, Faculdade de Ciências e Tecnologia, 2012. http://hdl.handle.net/10362/8448.

Full text
Abstract:
Dissertation submitted for the degree of Master in Computer Engineering
The digital IPTV service has evolved in terms of features, technology and accessibility of its contents. However, the rapid evolution of features and services has brought a more complex offering to customers, which often is not enjoyed or even perceived. Therefore, it is important to measure the real advantage of those features and understand how they are used by customers. In this work, we present a strategy that deals directly with real IPTV data, which result from customers' interactions with their set-top boxes. This data has a very low level of granularity, which is complex and difficult to interpret. The approach is to transform the clicking actions to a more conceptual and representative level of the running activities. Furthermore, there is a significant reduction in the data cardinality, enhanced in terms of information quality. More than a transformation, this approach aims to be iterative, where at each level we achieve more accurate information in order to characterize a particular behaviour. As experimental results, we present some application areas regarding the main features offered in this digital service. In particular, a study is made of zapping behaviour, and the usage of the DVR service is evaluated. We also discuss the possibility of integrating the devised strategy at a particular carrier, aiming to analyse the consumption rate of its services in order to adjust them to customers' real usage profiles, and to study the feasibility of introducing new services.
47

Chadha, Sanjay. "A real-time system for multi-transputer systems." Thesis, University of British Columbia, 1990. http://hdl.handle.net/2429/29465.

Full text
Abstract:
Two important problems, namely a versatile, efficient communication system and the allocation of processors to processes, are analysed. An efficient communication system has been developed, in which a central controller, the bus-master, dynamically configures the point-to-point network formed by the links of the transputers. The links are used to form a point-to-point network. An identical kernel resides on each of the nodes. This kernel is responsible for all communications on behalf of the user processes. It makes ConnectLink and ReleaseLink requests to the central controller and, when the connections are made, it sends the messages through the connected link to the destination node. If a direct connection to the destination node cannot be made, the message is sent to an intermediate node; the message hops through intermediate nodes until it reaches the destination node. The communication system developed provides a low-latency communication facility, and the system can easily be expanded to include a large number of transputers without greatly increasing interprocess communication overhead. Another problem, namely the Module Assignment Problem (MAP), is an important issue at the time of development of distributed systems. MAPs are computationally intractable, i.e. the computational requirement grows as a power of the number of tasks to be assigned. The load of a distributed system depends on both module execution times and intermodule communication cost (IMC). If assignment is not done with due consideration, a module assignment can cause computer saturation. Therefore a good assignment should balance the processing load among the processors and generate minimum inter-processor communication (IPC) (communication between modules not residing on the same processor). Since meeting the deadline constraint is the most important performance measure for RTDPS, meeting the response time is the most important criterion for module assignment. Understanding this, we have devised a scheme which assigns processes to processors such that both response time constraints and periodicity constraints are met. If such an assignment is not possible, the assignment fails and an error is generated. Our assignment algorithm does not take into consideration factors such as load balancing. We believe that the most important factor for RTDPS is meeting the deadline constraints, and that is what our algorithm accomplishes.
48

Santhosh Kumar, T. N., A. K. Abdul Samad, and K. M. Sarojini. "DSP BASED SIGNAL PROCESSING UNIT FOR REAL TIME PROCESSING OF VIBRATION AND ACOUSTIC SIGNALS OF SATELLITE LAUNCH VEHICLES." International Foundation for Telemetering, 1995. http://hdl.handle.net/10150/608530.

Full text
Abstract:
International Telemetering Conference Proceedings / October 30-November 02, 1995 / Riviera Hotel, Las Vegas, Nevada
Measurement of vibration and acoustic signals at various locations in the launch vehicle is important to establish the vibration and acoustic environment encountered by the launch vehicle during flight. The vibration and acoustic signals are wideband and require very large telemetry bandwidth if directly transmitted to ground. The DSP based Signal Processing Unit is designed to measure and analyse acoustic and vibration signals onboard the launch vehicle and transmit the computed spectrum to ground through centralised baseband telemetry system. The analysis techniques employed are power spectral density (PSD) computations using Fast Fourier Transform (FFT) and 1/3rd octave analysis using digital Infinite Impulse Response (IIR) filters. The programmability of all analysis parameters is achieved using EEPROM. This paper discusses the details of measurement and analysis techniques, design philosophy, tools used and implementation schemes. The paper also presents the performance results of flight models.
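
The two analyses named, PSD via FFT and 1/3-octave filtering, are standard signal-processing operations. A ground-side equivalent of the PSD computation using scipy is sketched below with a synthetic vibration signal; the sample rate is an assumption.

```python
# Welch PSD estimate of a synthetic vibration signal; the onboard unit
# performs the equivalent FFT-based computation in DSP hardware.
import numpy as np
from scipy.signal import welch

fs = 8192                                   # sample rate in Hz (assumed)
t = np.arange(0, 4.0, 1.0 / fs)
vib = (np.sin(2 * np.pi * 440 * t)          # a dominant vibration tone
       + 0.5 * np.random.default_rng(0).normal(size=t.size))

freqs, psd = welch(vib, fs=fs, nperseg=1024)
peak = freqs[np.argmax(psd)]
print(f"dominant tone near {peak:.0f} Hz")  # ~440 Hz
```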
49

Lefloch, Damien. "Real-time processing of range data focusing on environment reconstruction." Siegen: Universitätsbibliothek der Universität Siegen, 2019. http://d-nb.info/1177366398/34.

Full text
50

Ho, Eric TszLeung 1979. "A real-time system for processing, sharing, and display of physiology data." Thesis, Massachusetts Institute of Technology, 2003. http://hdl.handle.net/1721.1/87405.

Full text
Abstract:
Thesis (M.Eng. and S.B.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2003.
Includes bibliographical references (p. 61).
by Eric TszLeung Ho.
M.Eng.and S.B.