Dissertations on the topic "Statistical processing of real data"
Cite a source in APA, MLA, Chicago, Harvard, and other citation styles
Consult the top 50 dissertations for research on the topic "Statistical processing of real data".
Next to every work in the list, an "Add to bibliography" option is available. Use it, and your bibliographic reference for the chosen work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).
You can also download the full text of the scholarly publication as a PDF and read an online abstract of the work, if the relevant parameters are provided in its metadata.
Browse dissertations from a wide variety of disciplines and assemble your bibliography correctly.
Ha, Jin-cheol. „Real-time visual tracking using image processing and filtering methods“. Diss., Atlanta, Ga. : Georgia Institute of Technology, 2008. http://hdl.handle.net/1853/28177.
Committee Chair: Eric N. Johnson; Committee Co-Chair: Allen R. Tannenbaum; Committee Member: Anthony J. Calise; Committee Member: Eric Feron; Committee Member: Patricio A. Vela.
Park, Chang Yun. „Predicting deterministic execution times of real-time programs /“. Thesis, Connect to this title online; UW restricted, 1992. http://hdl.handle.net/1773/6978.
Zamazal, Petr. „Statistická analýza rozsáhlých dat z průmyslu“. Master's thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2021. http://www.nusl.cz/ntk/nusl-445466.
Fernandez, Noemi. „Statistical information processing for data classification“. FIU Digital Commons, 1996. http://digitalcommons.fiu.edu/etd/3297.
Macias, Filiberto. „Real Time Telemetry Data Processing and Data Display“. International Foundation for Telemetering, 1996. http://hdl.handle.net/10150/611405.
The Telemetry Data Center (TDC) at White Sands Missile Range (WSMR) is now beginning to modernize its existing telemetry data processing system. Modern networking and interactive graphical displays are now being introduced. This infusion of modern technology will allow the TDC to provide our customers with enhanced data processing and display capability. The intent of this project is to outline this undertaking.
White, Allan P., and Richard K. Dean. „Real-Time Test Data Processing System“. International Foundation for Telemetering, 1989. http://hdl.handle.net/10150/614650.
The U.S. Army Aviation Development Test Activity at Fort Rucker, Alabama needed a real-time test data collection and processing capability for helicopter flight testing. The system had to be capable of collecting and processing both FM and PCM data streams from analog tape and/or a telemetry receiver. The hardware and software were to be off the shelf whenever possible. The integration was to result in a stand-alone telemetry collection and processing system.
Clapp, T. C. „Statistical methods for the processing of communications data“. Thesis, University of Cambridge, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.597697.
Dowling, Jason, John Welling, Loral Aerosys, Kathy Nanzetta, Toby Bennett and Jeff Shi. „ACCELERATING REAL-TIME SPACE DATA PACKET PROCESSING“. International Foundation for Telemetering, 1995. http://hdl.handle.net/10150/608429.
NASA’s use of high-bandwidth packetized Consultative Committee for Space Data Systems (CCSDS) telemetry in future missions presents a great challenge to ground data system developers. These missions, including the Earth Observing System (EOS), call for high data rate interfaces and small packet sizes. Because each packet requires a similar amount of protocol processing, high data rates and small packet sizes dramatically increase the real-time workload on ground packet processing systems. NASA’s Goddard Space Flight Center has been developing packet processing subsystems for more than twelve years. Implementations of these subsystems have ranged from mini-computers to single-card VLSI multiprocessor subsystems. The latter subsystem, known as the VLSI Packet Processor, was first deployed in 1991 to support the Solar Anomalous & Magnetospheric Particle Explorer (SAMPEX) mission. An upgraded version of this VMEbus card, first deployed for Space Station flight hardware verification, has demonstrated sustained throughput of up to 50 Megabits per second and 15,000 packets per second. Future space missions including EOS will require significantly higher data and packet rate performance. A new approach to packet processing is under development that will not only increase performance levels by at least a factor of six but also reduce subsystem replication costs by a factor of five. This paper will discuss the development of a next-generation packet processing subsystem and the architectural changes necessary to achieve a thirty-fold improvement in the performance/price of real-time packet processing.
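A CCSDS space packet of the kind these subsystems handle begins with a fixed six-byte primary header. As a rough, generic illustration of the per-packet protocol work involved, a parser for that header might look like the sketch below (based on the standard CCSDS Space Packet primary header layout, not on code from the dissertation):

```python
import struct

def parse_ccsds_primary_header(buf: bytes) -> dict:
    """Unpack the 6-byte CCSDS Space Packet primary header.

    Layout (big-endian): 3-bit version, 1-bit type, 1-bit secondary-header
    flag, 11-bit APID, 2-bit sequence flags, 14-bit sequence count, and a
    16-bit length field that stores (data-field length - 1).
    """
    if len(buf) < 6:
        raise ValueError("need at least 6 bytes for the primary header")
    w0, w1, w2 = struct.unpack(">HHH", buf[:6])
    return {
        "version": (w0 >> 13) & 0x7,
        "type": (w0 >> 12) & 0x1,
        "sec_hdr_flag": (w0 >> 11) & 0x1,
        "apid": w0 & 0x7FF,
        "seq_flags": (w1 >> 14) & 0x3,
        "seq_count": w1 & 0x3FFF,
        "data_length": w2 + 1,  # the field stores length minus one
    }
```

Real systems at the packet rates quoted above do this work in hardware; the sketch only shows the header arithmetic that must be repeated for every packet.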
Khondoker, Md Mizanur Rahman. „Statistical methods for pre-processing microarray gene expression data“. Thesis, University of Edinburgh, 2006. http://hdl.handle.net/1842/12367.
Wang, Yun, and Wenxuan Jiang. „Statistical Processing of IEEE 802.15.4 Data Collected in Industrial Environment“. Thesis, Mittuniversitetet, Institutionen för informationsteknologi och medier, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:miun:diva-19619.
Patel, Ankur. „3D morphable models : data pre-processing, statistical analysis and fitting“. Thesis, University of York, 2011. http://etheses.whiterose.ac.uk/1576/.
Rao, Ashwani Pratap. „Statistical information retrieval models: Experiments, evaluation on real time data“. Thesis, University of Delaware, 2014. http://pqdtopen.proquest.com/#viewpdf?dispub=1567821.
We are all aware of the rise of the information age: heterogeneous sources of information and the ability to publish rapidly and indiscriminately are responsible for information chaos. In this work, we are interested in a system which can separate the "wheat" of vital information from the chaff within this information chaos. An efficient filtering system can accelerate meaningful utilization of knowledge. Consider Wikipedia, an example of community-driven knowledge synthesis. Facts about topics on Wikipedia are continuously being updated by users interested in a particular topic. Consider an automatic system (or an invisible robot) to which a topic such as "President of the United States" can be fed. This system will work ceaselessly, filtering new information created on the web in order to provide the small set of documents about the "President of the United States" that are vital to keeping the Wikipedia page relevant and up-to-date. In this work, we present an automatic information filtering system for this task. While building such a system, we have encountered issues related to scalability, retrieval algorithms, and system evaluation; we describe our efforts to understand and overcome these issues.
Puri, Anuj. „Statistical profile generation of real-time UAV-based traffic data“. [Tampa, Fla] : University of South Florida, 2008. http://purl.fcla.edu/usf/dc/et/SFE0002674.
Deel, Troy A. „A statistical study of graph algorithms“. Virtual Press, 1985. http://liblink.bsu.edu/uhtbin/catkey/424871.
Naulleau, Patrick. „Optical signal processing and real world applications /“. Online version of thesis, 1993. http://hdl.handle.net/1850/12136.
Liu, Guangtian. „An event service architecture in distributed real-time systems /“. Digital version accessible at:, 1999. http://wwwlib.umi.com/cr/utexas/main.
He, Zhenyu. „Writer identification using wavelet, contourlet and statistical models“. HKBU Institutional Repository, 2006. http://repository.hkbu.edu.hk/etd_ra/767.
Dreibelbis, Harold N., Dennis Kelsch and Larry James. „REAL-TIME TELEMETRY DATA PROCESSING and LARGE SCALE PROCESSORS“. International Foundation for Telemetering, 1991. http://hdl.handle.net/10150/612912.
Real-time processing of telemetry data has evolved from a highly centralized single large-scale computer system to multiple mini-computers or super mini-computers tied together in a loosely coupled distributed network, with each mini-computer or super mini-computer essentially performing a single function in the real-time processing sequence of events. The reasons for this evolution are many and varied. This paper will review some of the more significant factors in that evolution and will present some alternatives to a fully distributed mini-computer network that appear to offer significant real-time data processing advantages.
Feather, Bob, and Michael O’Brien. „OPEN ARCHITECTURE SYSTEM FOR REAL TIME TELEMETRY DATA PROCESSING“. International Foundation for Telemetering, 1991. http://hdl.handle.net/10150/612934.
There have been many recent technological advances in small computers, graphics stations, and system networks, which have made it possible to build highly advanced distributed processing systems for telemetry data acquisition and processing. Presently there is a plethora of vendors marketing powerful new network workstation hardware and software products. Computer vendors are rapidly developing new products as new technology continues to emerge. It is becoming difficult to procure and install a new computer system before it has been made obsolete by a competitor or even the same vendor. If one purchases the best hardware and software products individually, the system can end up being composed of incompatible components from different vendors that do not operate as one integrated homogeneous system. If one uses only hardware and software from a single vendor in order to simplify system integration, the system will be limited to only those products that the vendor chooses to develop. To truly take advantage of rapidly advancing computer technology, today’s telemetry systems should be designed for an open systems environment. This paper defines an optimum open architecture system designed around industry-wide standards for both hardware and software. This will allow different vendors’ computers to operate in the same distributed networked system, and will allow software to be portable to the various computers and workstations in the system while maintaining the same user interface. The open architecture system allows new products to be added as they become available, increasing system performance and capability in a truly heterogeneous system environment.
Dahan, Michael. „RTDAP: Real-Time Data Acquisition, Processing and Display System“. International Foundation for Telemetering, 1989. http://hdl.handle.net/10150/614629.
This paper describes a data acquisition, processing and display system which is suitable for various telemetry applications. The system can be connected either to a PCM encoder or to a telemetry decommutator through a built-in interface and can directly address any channel from the PCM stream for processing. Its compact size and simplicity allow it to be used on the flight line as a test console, in mobile stations as the main data processing system, or on board civil test aircraft for in-flight monitoring and data processing.
Chong, Wai Yu Ryan. „Statistical and analytic data processing based on SQL7 with web interface“. Leeds, 2001. http://www.leeds.ac.uk/library/counter2/compstmsc/20002001/chong.pdf.
ALMEIDA, Marcos Antonio Martins de. „Statistical analysis applied to data classification and image filtering“. Universidade Federal de Pernambuco, 2016. https://repositorio.ufpe.br/handle/123456789/25506.
Statistical analysis is a tool of wide applicability in several areas of scientific knowledge. This thesis makes use of statistical analysis in two different applications: data classification and image processing targeted at document image binarization. In the first case, this thesis presents an analysis of several aspects of the consistency of the classification of the senior researchers in computer science of the Brazilian research council, CNPq - Conselho Nacional de Desenvolvimento Científico e Tecnológico. The second application of statistical analysis developed in this thesis addresses filtering out the back-to-front interference which appears whenever a document is written or typed on both sides of translucent paper. In this topic, an assessment of the most important algorithms found in the literature is made, taking into account a large number of parameters such as the strength of the back-to-front interference, the diffusion of the ink in the paper, and the texture and hue of the paper due to aging. A new binarization algorithm is proposed, which is capable of removing the back-to-front noise in a wide range of documents. Additionally, this thesis proposes a new concept of “intelligent” binarization for complex documents, which, besides text, encompass several graphical elements such as figures, photos and diagrams.
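Document binarization of the kind assessed above reduces a grayscale page image to black and white by thresholding. As a minimal, generic illustration of the idea, here is the classic Otsu global threshold, which picks the cut that maximizes between-class variance of the histogram; this is a textbook baseline, not the back-to-front filtering algorithm the thesis proposes:

```python
import numpy as np

def otsu_threshold(gray: np.ndarray) -> int:
    """Return the global threshold maximizing between-class variance (Otsu)."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    omega = np.cumsum(prob)                 # cumulative class-0 probability
    mu = np.cumsum(prob * np.arange(256))   # cumulative class-0 intensity mass
    mu_total = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_total * omega - mu) ** 2 / (omega * (1.0 - omega))
    # NaNs occur where a class is empty; treat them as zero variance.
    return int(np.argmax(np.nan_to_num(sigma_b)))

def binarize(gray: np.ndarray) -> np.ndarray:
    """Map pixels above the Otsu threshold to white (255), the rest to black (0)."""
    return np.where(gray > otsu_threshold(gray), 255, 0).astype(np.uint8)
```

A single global threshold fails precisely in the back-to-front situation described above, where the interference and the genuine ink overlap in intensity; that is what motivates the more elaborate algorithms the thesis evaluates.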
Nyström, Simon, and Joakim Lönnegren. „Processing data sources with big data frameworks“. Thesis, KTH, Data- och elektroteknik, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-188204.
Big data is a concept that is growing quickly. As more and more data is generated and collected, there is a growing need for efficient solutions that can process all of it in an attempt to extract value from it. The purpose of this thesis is to find an efficient way to process a large number of relatively small files quickly. More specifically, it tests two frameworks that can be used for big data processing. The two frameworks tested against each other are Apache NiFi and Apache Storm. A method is described for, first, constructing a data flow and, second, constructing a way to test the performance and scalability of the frameworks running that data flow. The results reveal that Apache Storm is faster than Apache NiFi for the kind of test that was performed. When the number of nodes in the tests was increased, performance did not always improve. This shows that increasing the number of nodes in a big data processing chain does not always lead to better performance and that other measures are sometimes required to increase it.
Narayanan, Shruthi (Shruthi P.). „Real-time processing and visualization of intensive care unit data“. Thesis, Massachusetts Institute of Technology, 2017. http://hdl.handle.net/1721.1/119537.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (page 83).
Intensive care unit (ICU) patients undergo detailed monitoring so that copious information regarding their condition is available to support clinical decision-making. Full utilization of the data depends heavily on its quantity, quality and manner of presentation to the physician at the bedside of a patient. In this thesis, we implemented a visualization system to aid ICU clinicians in collecting, processing, and displaying available ICU data. Our goals for the system are: to be able to receive large quantities of patient data from various sources, to compute complex functions over the data that are able to quantify an ICU patient's condition, to plot the data using a clean and interactive interface, and to be capable of live plot updates upon receiving new data. We made significant headway toward our goals, and we succeeded in creating a highly adaptable visualization system that future developers and users will be able to customize.
by Shruthi Narayanan.
M. Eng.
Chun, Yang, Yang Hongling and Zhou Jie. „STUDY ON HIGH-RATE TELEMETRY DATA REAL-TIME PROCESSING TECHNIQUES“. International Foundation for Telemetering, 2000. http://hdl.handle.net/10150/608251.
Owing to the rapid development of the PC industry, the personal computer has improved surprisingly in reliability and speed and has been applied in many fields, such as aerospace, satellite and telemetry applications. As is well known, two aspects decide how fast PC-based data acquisition can be: CPU processing and I/O bandwidth. The first aspect has become increasingly insignificant because CPU frequencies have exceeded 700 MHz, which fully satisfies the needs of high-rate data processing. I/O bandwidth is therefore the key factor in high-rate PC-based data acquisition, and efficient data-buffering techniques must be adopted to satisfy the demands of telemetry data entry. This paper presents a buffered data channel which uses memory mapping, EPLD and dual-port SRAM techniques. The operating platform of this design is Windows 95/98, and the software includes a device driver and real-time processing routines.
尹翰卿 and Hon-hing Wan. „Efficient real-time scheduling for multimedia data transmission“. Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2002. http://hub.hku.hk/bib/B31227910.
Chen, Li. „Statistical Machine Learning for Multi-platform Biomedical Data Analysis“. Diss., Virginia Tech, 2011. http://hdl.handle.net/10919/77188.
Ph. D.
Ghosh, Kaushik. „Speculative execution in real-time systems“. Diss., Georgia Institute of Technology, 1995. http://hdl.handle.net/1853/8174.
Sandys, Sean David. „Requirement specifications for communication in distributed real-time systems /“. Thesis, Connect to this title online; UW restricted, 2002. http://hdl.handle.net/1773/7002.
Hooman, Jozef. „Specification and compositional verification of real time systems /“. Berlin [u.a.] : Springer, 1991. http://www.loc.gov/catdir/enhancements/fy0815/91041783-d.html.
Burger, Joseph. „Real-time engagement area development program (READ-Pro)“. Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2002. http://library.nps.navy.mil/uhtbin/hyperion-image/02Jun%5FBurger.pdf.
Lauretig, Adam M. „Natural Language Processing, Statistical Inference, and American Foreign Policy“. The Ohio State University, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=osu1562147711514566.
Craig, David W. (David William). „Light traffic loss of random hard real-time tasks in a network“. Dissertation, Carleton University, Ottawa, 1988.
Arshad, Norhashim Mohd. „Real-time data compression for machine vision measurement systems“. Thesis, Liverpool John Moores University, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.285284.
Jun, Zhang, Feng MeiPing, Zhu Yanbo, He Bin and Zhang Qishan. „A Real-Time Telemetry Data Processing System with Open System Architecture“. International Foundation for Telemetering, 1994. http://hdl.handle.net/10150/611667.
Faced with multiple data streams, high bit rates, variable data formats, complicated frame structures and changeable application environments, the programmable PCM telemetry system needs a data processing system with an advanced, open architecture. This paper considers the characteristics of real-time telemetry data processing, analyzes the design of an open system architecture for a real-time telemetry data processing system (TDPS), presents an open architecture scheme and design for a real-time TDPS, gives the structural model of a distributed network system, and develops the interface between the network database and the telemetry database, as well as telemetry processing software with a man-machine interface. Finally, a practical, multi-functional real-time TDPS with an open system architecture has been built, based on the UNIX operating system, supporting TCP/IP and using the Oracle relational database management system. This scheme and design have already proved efficient for real-time processing, high speed, mass storage and multi-user operation.
Ivan-Roşu, Daniela. „Dynamic resource allocation for adaptive real-time applications“. Diss., Georgia Institute of Technology, 1999. http://hdl.handle.net/1853/9200.
黃伯光 and Pak-kwong Wong. „Statistical language models for Chinese recognition: speech and character“. Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1998. http://hub.hku.hk/bib/B31239456.
Cannon, Jordan. „Statistical analysis and algorithms for online change detection in real-time psychophysiological data“. [Iowa City, Iowa] : University of Iowa, 2009. http://ir.uiowa.edu/etd/342.
Cannon, Jordan. „Statistical analysis and algorithms for online change detection in real-time psychophysiological data“. Thesis, University of Iowa, 2009. https://ir.uiowa.edu/etd/342.
Chen, Deji. „Real-time data management in the distributed environment /“. Digital version accessible at:, 1999. http://wwwlib.umi.com/cr/utexas/main.
Lloyd, Joseph W. Jr. „POST-FLIGHT DATA DISTRIBUTION SYSTEM“. International Foundation for Telemetering, 1993. http://hdl.handle.net/10150/608898.
Desktop processors (IBM PC, PC-compatible, and Macintosh) have made a major impact on how the Naval Air Warfare Center Aircraft Division (NAWCAD), Patuxent River engineering community performs its work in aircraft weapons tests. The personal processors are utilized by the flight-test engineers not only for report preparation, but also for post-flight Engineering Unit (EU) data reduction and analysis. Present-day requirements call for better post-flight data handling than in the past. These requirements are driven by the need to analyze all the vehicle's parameters prior to the succeeding test flight, and to generate test reports in a more cost-effective and timely manner. This paper defines the post-flight data distribution system at NAWCAD, Patuxent River, explains how these tasks were handled in the past, and describes the development of a real-time data storage design approach for post-flight data handling. This engineering design is then described, explaining how it sets the precedent for NAWCAD, Patuxent River's future plans, and how it provides the flight-test engineer with the test vehicle's EU data immediately available post-flight at his desktop processor.
Yaman, Sibel. „A multi-objective programming perspective to statistical learning problems“. Diss., Atlanta, Ga. : Georgia Institute of Technology, 2008. http://hdl.handle.net/1853/26470.
Committee Chair: Chin-Hui Lee; Committee Member: Anthony Yezzi; Committee Member: Evans Harrell; Committee Member: Fred Juang; Committee Member: James H. McClellan. Part of the SMARTech Electronic Thesis and Dissertation Collection.
Spina, Robert. „Real time maze traversal /“. Online version of thesis, 1989. http://hdl.handle.net/1850/10566.
Chang, Kuo-Lung. „A Real-Time Merging-Buffering Technique for MIDI Messages“. Thesis, University of North Texas, 1991. https://digital.library.unt.edu/ark:/67531/metadc500471/.
O'Brien, R. Michael. „REAL-TIME TELEMETRY DATA FORMATTING FOR FLIGHT TEST ANALYSIS“. International Foundation for Telemetering, 1994. http://hdl.handle.net/10150/608577.
With today's telemetry systems, an hour-long analog test tape can be digitized in one hour or less. However, the digitized data produced by today's telemetry systems is usually not in a format that can be directly analyzed by the test engineer's analysis tools. The digitized data must be formatted before analysis can begin. The data formatting process can take from one to eight hours depending on the amount of data, the power of the system's host computer, and the complexity of the analysis software's data format. If more than one analysis package is used by the test engineer, the data has to be formatted separately for each package. Using today's high-speed RISC processors and large-memory technology, a real-time Flexible Data Formatter can be added to the telemetry front end to perform this formatting function. The Flexible Data Formatter (FDF) allows the telemetry user to program the front-end hardware to output the telemetry test data in a format compatible with the user's analysis software. The FDF can also output multiple data files, each in a different format, to support multiple analysis packages. This eliminates the file formatting step, thus reducing the time to process the data from each test by a factor of two to nine.
Fidalgo, André Filipe dos Santos Pinto. „IPTV data reduction strategy to measure real users’ behaviours“. Master's thesis, Faculdade de Ciências e Tecnologia, 2012. http://hdl.handle.net/10362/8448.
The digital IPTV service has evolved in terms of features, technology and accessibility of its contents. However, the rapid evolution of features and services has brought customers a more complex offering, which often goes unused or even unnoticed. It is therefore important to measure the real advantage of those features and understand how customers use them. In this work, we present a strategy that deals directly with real IPTV data resulting from customers' interactions with their set-top boxes. This data has a very low level of granularity, which makes it complex and difficult to interpret. The approach is to transform the clicking actions into a more conceptual and representative description of the running activities. This brings a significant reduction in data cardinality together with a gain in information quality. More than a single transformation, the approach is iterative: at each level we obtain more accurate information with which to characterize a particular behaviour. As experimental results, we present some application areas covering the main features offered by this digital service. In particular, we study zapping behaviour and evaluate DVR service usage. We also discuss integrating the devised strategy at a particular carrier to analyse the consumption rates of its services, in order to adjust them to customers' real usage profiles, and to study the feasibility of introducing new services.
Chadha, Sanjay. „A real-time system for multi-transputer systems“. Thesis, University of British Columbia, 1990. http://hdl.handle.net/2429/29465.
Faculty of Applied Science, Department of Electrical and Computer Engineering, Graduate.
T, N. Santhosh Kumar, K. Abdul Samad A and M. Sarojini K. „DSP BASED SIGNAL PROCESSING UNIT FOR REAL TIME PROCESSING OF VIBRATION AND ACOUSTIC SIGNALS OF SATELLITE LAUNCH VEHICLES“. International Foundation for Telemetering, 1995. http://hdl.handle.net/10150/608530.
Measurement of vibration and acoustic signals at various locations in the launch vehicle is important to establish the vibration and acoustic environment encountered by the launch vehicle during flight. The vibration and acoustic signals are wideband and would require very large telemetry bandwidth if transmitted directly to the ground. The DSP-based Signal Processing Unit is designed to measure and analyse acoustic and vibration signals on board the launch vehicle and transmit the computed spectrum to the ground through the centralised baseband telemetry system. The analysis techniques employed are power spectral density (PSD) computation using the Fast Fourier Transform (FFT) and 1/3rd-octave analysis using digital Infinite Impulse Response (IIR) filters. The programmability of all analysis parameters is achieved using EEPROM. This paper discusses the details of the measurement and analysis techniques, design philosophy, tools used and implementation schemes. The paper also presents the performance results of flight models.
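The FFT-based PSD computation mentioned above can be sketched in a few lines. The following is a generic Welch-style estimate for illustration, assuming a Hann window and 50% segment overlap; it is not the flight unit's actual implementation:

```python
import numpy as np

def psd_welch(x, fs, nperseg=256):
    """Welch PSD estimate: average windowed-FFT periodograms over
    50%-overlapping segments. Returns (frequencies in Hz, one-sided PSD)."""
    win = np.hanning(nperseg)
    scale = fs * (win ** 2).sum()  # window power normalization
    starts = range(0, len(x) - nperseg + 1, nperseg // 2)
    psd = np.zeros(nperseg // 2 + 1)
    for i in starts:
        seg = x[i:i + nperseg]
        spec = np.fft.rfft((seg - seg.mean()) * win)
        psd += np.abs(spec) ** 2 / scale
    psd /= len(starts)
    psd[1:-1] *= 2.0  # fold negative frequencies into the one-sided estimate
    return np.fft.rfftfreq(nperseg, 1.0 / fs), psd
```

The 1/3rd-octave analysis mentioned in the abstract would instead pass the signal through a bank of IIR band-pass filters and report the power per band, trading frequency resolution for much lower telemetry bandwidth.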
Lefloch, Damien [Verfasser]. „Real-time processing of range data focusing on environment reconstruction / Damien Lefloch“. Siegen : Universitätsbibliothek der Universität Siegen, 2019. http://d-nb.info/1177366398/34.
Ho, Eric TszLeung, 1979-. „A real-time system for processing, sharing, and display of physiology data“. Thesis, Massachusetts Institute of Technology, 2003. http://hdl.handle.net/1721.1/87405.
Includes bibliographical references (p. 61).
by Eric TszLeung Ho.
M.Eng. and S.B.