Doctoral dissertations on the topic "Data Format"
Create an accurate reference in APA, MLA, Chicago, Harvard, and many other citation styles
Browse the top 50 doctoral dissertations for your research on the topic "Data Format".
An "Add to bibliography" button is available next to each work in the list. Use it, and we will automatically create a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication in .pdf format and read its abstract online, whenever these details are available in the metadata.
Browse doctoral dissertations from many different disciplines and compile an appropriate bibliography.
Mills, H. L., and K. D. Turver. "24-BIT FLIGHT TEST DATA RECORDING FORMAT". International Foundation for Telemetering, 1991. http://hdl.handle.net/10150/612937.
Boeing Commercial Airplane Group’s Flight Test Engineering organization is developing a new test data recording format to be used on the new model 777 airplane. ARINC 429, ARINC 629 and IRIG PCM data will be formatted for recording test data. The need to support a variety of data recorders, and three types of data, mandates the development of a new recording format. The format Flight Test chose is a variation of IRIG Standard 106-86, Chapter 8. The data from each channel is treated as a data packet, including time and channel ID, and then multiplexed into 24 bits. This allows a time accuracy of 10 microseconds and minimal latency caused by multiplexing.
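As an illustration of the packetized 24-bit idea described in this abstract, here is a minimal Python sketch; the field widths, the 10-microsecond tick, and the word layout are invented for illustration and are not the actual Boeing/IRIG 106 Chapter 8 variant.

```python
def pack_24bit_words(channel_id: int, time_us: int, sample: int) -> bytes:
    """Pack one channel sample into three hypothetical 24-bit words:
    word 0: 8-bit channel ID + upper 16 bits of a 40-bit time tag
    word 1: lower 24 bits of the time tag (assumed 10-microsecond ticks)
    word 2: 16-bit data sample, left-justified in the 24-bit word
    """
    ticks = time_us // 10
    words = [
        (channel_id & 0xFF) << 16 | (ticks >> 24) & 0xFFFF,
        ticks & 0xFFFFFF,
        (sample & 0xFFFF) << 8,
    ]
    # each 24-bit word is emitted as three big-endian bytes
    return b"".join(w.to_bytes(3, "big") for w in words)

frame = pack_24bit_words(channel_id=0x2A, time_us=123_456_780, sample=0x1F3C)
print(frame.hex())  # 2a0000bc614e1f3c00
```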
Meyer, David, Friedrich Leisch, Torsten Hothorn and Kurt Hornik. "StatDataML. An XML format for statistical data". SFB Adaptive Information Systems and Modelling in Economics and Management Science, WU Vienna University of Economics and Business, 2002. http://epub.wu.ac.at/540/1/document.pdf.
Series: Report Series SFB "Adaptive Information Systems and Modelling in Economics and Management Science"
Ilg, Markus. "Digital processing of map data in raster format /". Zürich : Geographisches Institut Eidgenössische Technische Hochschule, 1986. http://e-collection.ethbib.ethz.ch/show?type=diss&nr=7973.
Kupferschmidt, Benjamin, and Eric Pesciotta. "Automatic Format Generation Techniques for Network Data Acquisition Systems". International Foundation for Telemetering, 2009. http://hdl.handle.net/10150/606089.
Configuring a modern, high-performance data acquisition system is typically a very time-consuming and complex process. Any enhancement to the data acquisition setup software that can reduce the amount of time needed to configure the system is extremely useful. Automatic format generation is one of the most useful enhancements to a data acquisition setup application. By using Automatic Format Generation, an instrumentation engineer can significantly reduce the amount of time that is spent configuring the system while simultaneously gaining much greater flexibility in creating sampling formats. This paper discusses several techniques that can be used to generate sampling formats automatically while making highly efficient use of the system's bandwidth. This allows the user to obtain most of the benefits of a hand-tuned, manually created format without spending excessive time creating it. One of the primary techniques that this paper discusses is an enhancement to the commonly used power-of-two rule for selecting sampling rates. This allows the system to create formats that use a wider variety of rates. The system is also able to handle groups of related measurements that must follow each other sequentially in the sampling format. This paper will also cover a packet-based formatting scheme that organizes measurements based on common sampling rates. Each packet contains a set of measurements that are sampled at a particular rate. A key benefit of using an automatic format generation system with this format is the optimization of sampling rates that are used to achieve the best possible match for each measurement's desired sampling rate.
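The power-of-two rule mentioned in this abstract can be illustrated with a short sketch; this is a simplified, hypothetical version, and the paper's enhanced algorithm and its handling of sequential measurement groups are not reproduced here.

```python
def power_of_two_rate(desired_hz: float, base_rate_hz: float = 1.0,
                      max_rate_hz: float = 8192.0) -> float:
    """Round a desired sampling rate up to the next power-of-two multiple
    of the base (major frame) rate, so every measurement recurs at fixed
    slots in the PCM format."""
    rate = base_rate_hz
    while rate < desired_hz and rate < max_rate_hz:
        rate *= 2
    return rate

for wanted in (3.0, 25.0, 100.0, 700.0):
    print(wanted, "->", power_of_two_rate(wanted))
# 3.0 -> 4.0, 25.0 -> 32.0, 100.0 -> 128.0, 700.0 -> 1024.0
```

The over-sampling this rounding introduces is exactly the bandwidth cost that the wider set of rates discussed in the paper is meant to reduce.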
Peart, David E., and Jim Talbert. "CONVERTING ASYNCHRONOUS DATA INTO A STANDARD IRIG TELEMETRY FORMAT". International Foundation for Telemetering, 1997. http://hdl.handle.net/10150/609679.
In recent years we have seen an increase in the use of MIL-STD-1553 buses and other asynchronous data sources used in new missile and launcher designs. The application of multiplexed asynchronous buses in missiles and launchers is very common today. With increasing application of asynchronous data sources into very complex systems the need to acquire, analyze, and present one hundred percent of the bus traffic in real time or near real time has become especially important during testing and diagnostic operations. This paper discusses ways of converting asynchronous data, including MIL-STD-1553, into a telemetry format that is suitable for encryption, telemetering, recording, and presenting with Inter Range Instrumentation Group (IRIG) compatible off-the-shelf hardware. The importance of these designs is to provide the capability to conserve data bandwidth and to maximize the use of existing hardware. In addition, this paper will discuss a unique decode and time tagging design that conserves data storage when compared to the methods in IRIG Standard 106-96 and still maintains a very accurate time tag.
Graul, Michael, Ronald Fernandes, John L. Hamilton, Charles H. Jones and Jon Morgan. "ENHANCEMENTS TO THE DATA DISPLAY MARKUP LANGUAGE". International Foundation for Telemetering, 2006. http://hdl.handle.net/10150/604103.
This paper presents the description of the updated Data Display Markup Language (DDML), a neutral format for data display configurations. The development of DDML is motivated by the fact that in joint service program systems, there is a critical need for common data displays to support distributed T&E missions, irrespective of the test location, data acquisition system, and display system. DDML enables standard data displays to be specified for any given system under test, irrespective of the display vendor or system in which they will be implemented. Version 3.0 of DDML represents a more mature language than version 1.0, which was presented at the 2003 ITC. The updated version has been validated for completeness and robustness by developing translators between DDML and numerous vendor formats. The DDML schema has been presented to the Range Commander’s Council (RCC) Data Multiplex Committee for consideration for inclusion in the IRIG 106 standard. The DDML model will be described in terms of both the XML schema and the UML model, and various examples of DDML models will be presented. The intent of this paper is to solicit specific input from the community on this potential RCC standard.
Wegener, John A., and Rodney L. Davis. "EXTENSION OF A COMMON DATA FORMAT FOR REAL-TIME APPLICATIONS". International Foundation for Telemetering, 2004. http://hdl.handle.net/10150/604961.
The HDF5 (Hierarchical Data Format) data storage family is an industry standard format that allows data to be stored in a common format and retrieved by a wide range of common tools. HDF5 is a widely accepted industry standard container for data storage developed by the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign. The HDF5 data storage family includes HDF-Time History, intended for data processing, and HDF-Packet, intended for real-time data collection; each of these is an extension to the basic HDF5 format, which defines data structures and associated interrelationships, optimized for that particular purpose. HDF-Time History, developed jointly by Boeing and NCSA, is in the process of being adopted throughout the Boeing test community and by its external partners. The Boeing/NCSA team is currently developing HDF-Packet to support real-time streaming applications, such as airborne data collection and recording of received telemetry. The advantages are significant cost reduction resulting from storing the data in its final format, thus avoiding conversion between a myriad of recording and intermediate formats. In addition, by eliminating intermediate file translations and conversions, data integrity is maintained from recording through processing and archival storage. Moreover, HDF5 is a general-purpose wrapper into which processed data and other data documentation (such as calibrations) can be stored, making the final data file self-documenting. This paper describes the basics of HDF-Time History, the extensions required to support real-time acquisition with HDF-Packet, and implementation issues unique to real-time acquisition. It also describes potential future implementations for data acquisition systems in different segments of the test data industry.
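HDF-Time History and HDF-Packet are Boeing/NCSA extensions whose internal layout is not described here; the sketch below only shows, using the generic h5py library, the kind of self-documenting HDF5 time-history storage the abstract refers to. The group names, units and calibration string are invented.

```python
import numpy as np
import h5py

t = np.arange(0.0, 1.0, 0.001)                      # hypothetical 1 kHz time base
airspeed = 150.0 + 5.0 * np.sin(2 * np.pi * 3 * t)  # synthetic measurement

with h5py.File("flight_test.h5", "w") as f:
    grp = f.create_group("time_history/airspeed")
    grp.create_dataset("time", data=t)
    ds = grp.create_dataset("values", data=airspeed, compression="gzip")
    # storing units and calibration next to the data is what makes the file self-documenting
    ds.attrs["units"] = "knots"
    ds.attrs["calibration"] = "EU = 0.05 * counts (hypothetical)"
```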
Alfredsson, Anders. "XML as a Format for Representation and Manipulation of Data from Radar Communications". Thesis, University of Skövde, Department of Computer Science, 2001. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-591.
XML was designed to be a new standard for marking up data on the web. However, as a result of its extensible and flexible properties, XML is now being used more and more for other purposes than was originally intended. Today XML is prompting an approach more focused on data exchange, between different applications inside companies or even between cooperating businesses.
Businesses are showing interest in using XML as an integral part of their work. Ericsson Microwave Systems (EMW) is a company that sees XML as a conceivable solution to problems in its work with radar communications. An approach based on a relational database system had previously been analysed.
In this project we present an investigation of the work at EMW, and identification and documentation of the problems in the radar communication work. Also, the requirements and expectations that EMW has on XML are presented. Moreover, an analysis has been made to decide to what extent XML could be used to solve the problems of EMW. The analysis was conducted by elucidating the problems and possibilities of XML compared to the previous approach for solving the problems at EMW, which was based on using a relational database management system.
The analysis shows that XML has good features for representing hierarchically structured data, as in the EMW case. It is also shown that XML is good for data integration purposes. Furthermore, the analysis shows that XML, due to its self-describing and weak typing nature, is inappropriate to use in the data semantics and integrity problem context of EMW. However, it also shows that the new XML Schema standard could be used as a complement to the core XML standard, to partially solve the semantics problems.
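As a toy illustration of the hierarchical representation discussed in this abstract (the element names are invented and are not EMW's actual schema), nesting related records in XML is straightforward where a relational design would need several joined tables:

```python
import xml.etree.ElementTree as ET

link = ET.Element("radioLink", id="L1")
ET.SubElement(link, "frequency", unit="MHz").text = "243.0"
for name, role in (("Alpha", "sender"), ("Bravo", "receiver")):
    station = ET.SubElement(link, "station", role=role)
    ET.SubElement(station, "name").text = name

print(ET.tostring(link, encoding="unicode"))
```

As the abstract notes, this self-describing form says nothing about types or integrity constraints; that is what an additional XML Schema layer would have to supply.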
Barnum, Jil. "THE USE OF HDF IN F-22 AVIONICS TEST AND EVALUATION". International Foundation for Telemetering, 1996. http://hdl.handle.net/10150/608388.
Hierarchical Data Format (HDF) is a public domain standard for file formats which is documented and maintained by the National Center for Supercomputing Applications. HDF is the standard adopted by the F-22 program to increase the efficiency of avionics data processing and the utility of the data. This paper will discuss how the data processing Integrated Product Team (IPT) on the F-22 program plans to use HDF for file format standardization. The history of the IPT choosing HDF, the efficiencies gained by choosing HDF, and the ease of data transfer will be explained.
Wan, Wade K. (Wade Keith) 1973. "Adaptive format conversion information as enhancement data for scalable video coding". Thesis, Massachusetts Institute of Technology, 2002. http://hdl.handle.net/1721.1/29903.
Includes bibliographical references (p. 143-145).
Scalable coding techniques can be used to efficiently provide multicast video service and involve transmitting a single independently coded base layer and one or more dependently coded enhancement layers. Clients can decode the base layer bitstream and none, some or all of the enhancement layer bitstreams to obtain video quality commensurate with their available resources. In many scalable coding algorithms, residual coding information is the only type of data that is coded in the enhancement layers. However, since the transmitter has access to the original sequence, it can adaptively select different format conversion methods for different regions in an intelligent manner. This adaptive format conversion information can then be transmitted as enhancement data to assist processing at the decoder. The use of adaptive format conversion has not been studied in detail and this thesis examines when and how it can be used for scalable video compression. A new scalable codec is developed in this thesis that can utilize adaptive format conversion information and/or residual coding information as enhancement data. This codec was used in various simulations to investigate different aspects of adaptive format conversion such as the effect of the base layer, a comparison of adaptive format conversion and residual coding, and the use of both adaptive format conversion and residual coding.
The experimental results show adaptive format conversion can provide video scalability at low enhancement bitrates not possible with residual coding and also assist residual coding at higher enhancement layer bitrates. This thesis also discusses the application of adaptive format conversion to the migration path for digital television. Adaptive format conversion is well-suited to the unique problems of the migration path and can provide initial video scalability as well as assist a future migration path.
by Wade K. Wan.
Ph.D.
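The core idea in the abstract above, choosing a format-conversion filter per block by comparing candidates against the original sequence available at the transmitter, can be sketched in a few lines of numpy. Two simple deinterlacing filters and a sum-of-squared-error criterion stand in for the codec actually built in the thesis.

```python
import numpy as np

def line_repeat(field):
    """Deinterlace by repeating each field line."""
    return np.repeat(field, 2, axis=0).astype(float)

def line_average(field):
    """Deinterlace by averaging neighbouring field lines."""
    up = np.repeat(field, 2, axis=0).astype(float)
    up[1:-1:2] = 0.5 * (field[:-1] + field[1:])
    return up

def choose_filters(original, field, block=16):
    """Per block, return the index of the filter whose output is closest
    (in squared error) to the original progressive frame; only these
    indices would be sent as enhancement data."""
    candidates = [line_repeat(field), line_average(field)]
    h, w = original.shape
    choices = np.zeros((h // block, w // block), dtype=int)
    for by in range(h // block):
        for bx in range(w // block):
            sl = np.s_[by * block:(by + 1) * block, bx * block:(bx + 1) * block]
            errors = [np.sum((original[sl] - c[sl]) ** 2) for c in candidates]
            choices[by, bx] = int(np.argmin(errors))
    return choices

frame = np.random.default_rng(1).integers(0, 256, (64, 64)).astype(float)
top_field = frame[0::2]                       # simulated interlaced field
print(choose_filters(frame, top_field))
```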
Kupferschmidt, Benjamin, and Albert Berdugo. "DESIGNING AN AUTOMATIC FORMAT GENERATOR FOR A NETWORK DATA ACQUISITION SYSTEM". International Foundation for Telemetering, 2006. http://hdl.handle.net/10150/604157.
In most current PCM based telemetry systems, an instrumentation engineer manually creates the sampling format. This time consuming and tedious process typically involves manually placing each measurement into the format at the proper sampling rate. The telemetry industry is now moving towards Ethernet-based systems comprised of multiple autonomous data acquisition units, which share a single global time source. The architecture of these network systems greatly simplifies the task of implementing an automatic format generator. Automatic format generation eliminates much of the effort required to create a sampling format because the instrumentation engineer only has to specify the desired sampling rate for each measurement. The system handles the task of organizing the format to comply with the specified sampling rates. This paper examines the issues involved in designing an automatic format generator for a network data acquisition system.
Rajyalakshmi, P. S., and R. K. Rajangam. "Data Handling System for IRS". International Foundation for Telemetering, 1987. http://hdl.handle.net/10150/615329.
The three-axis stabilized Indian Remote Sensing Satellite will image the earth from a 904 km polar sun-synchronous orbit. The payload is a set of CCD cameras which collect data in four bands in the visible and near-infrared region. The payload data from two cameras, each at 10.4 megabits per second, is transmitted as balanced QPSK in X band. Before transmission, the payload data is formatted by adopting major and minor frame synchronizing codes. The two formatted data streams are differentially encoded to take care of the 4-phase ambiguity due to QPSK transmission. This paper describes the design and development aspects related to such a Data Handling System. It also highlights the environmental qualification tests that were carried out to meet the requirement of a three-year operational life of the satellite.
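The differential encoding mentioned above can be illustrated with a small sketch (generic modulo-4 differential coding of 2-bit symbols; the satellite's actual encoder is not specified in the abstract): because only symbol-to-symbol transitions carry information, a fixed phase rotation introduced by the QPSK demodulator corrupts at most the reference symbol.

```python
def diff_encode(symbols):
    """Differentially encode 2-bit symbols (values 0..3): transmit the
    running sum modulo 4 so data rides on phase differences."""
    out, prev = [], 0
    for s in symbols:
        prev = (prev + s) % 4
        out.append(prev)
    return out

def diff_decode(received, phase_offset=0):
    """Decode after an unknown but constant phase rotation."""
    rotated = [(r + phase_offset) % 4 for r in received]
    prev, data = 0, []
    for r in rotated:
        data.append((r - prev) % 4)
        prev = r
    return data

msg = [1, 3, 0, 2, 2, 1]
enc = diff_encode(msg)
# a fixed 90-degree rotation (offset 1) only affects the reference symbol
assert diff_decode(enc, phase_offset=1)[1:] == msg[1:]
```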
Abboud, Fayez. "Utilizing Image-based Formats to Optimize Pattern Data Format and Processing In Mask and Maskless Pattern Generation Lithography". NSUWorks, 2012. http://nsuworks.nova.edu/gscis_etd/73.
Seegmiller, Ray D., Greg C. Willden, Maria S. Araujo, Todd A. Newton, Ben A. Abbott and William A. Malatesta. "Automation of Generalized Measurement Extraction from Telemetric Network Systems". International Foundation for Telemetering, 2012. http://hdl.handle.net/10150/581647.
In telemetric network systems, data extraction is often an afterthought. The data description frequently changes throughout the program, so that last-minute modifications of the data extraction approach are often required. This paper presents an alternative approach in which automation of measurement extraction is supported. The central key is a formal declarative language that can be used to configure instrumentation devices as well as measurement extraction devices. The Metadata Description Language (MDL) defined by the integrated Network Enhanced Telemetry (iNET) program, augmented with a generalized measurement extraction approach, addresses this issue. This paper describes the TmNS Data Extractor Tool, as well as lessons learned from commercial systems, the iNET program and TMATS.
Manning, Dennis, Rick Williams and Paul Ferrill. "Data Filtering Unit (DFU): Dealing With Cryptovariable Keys in Data Recorded Using the IRIG 106 Chapter 10 Format". International Foundation for Telemetering, 2006. http://hdl.handle.net/10150/604140.
Recent advancements in IRIG 106 Chapter 10 recording systems allow the recording of all on-board 1553 bus and PCM traffic to a single medium. These advancements have also brought about the issue of extracting data with different levels of classification that was written to a single location. Carrying GPS “smart” weapons further complicates this issue since the recording of GPS keys adds another level of classification to the mix. The ability to separate and/or remove higher-level data from a data product is now required. This paper describes the design of a hardware device that will filter specified data from IRIG 106 Chapter 10 recorder memory modules (RMMs) to prevent the storage device or computer from becoming classified at the level of the specified data.
Munir, Rana Faisal. "Storage format selection and optimization for materialized intermediate results in data-intensive flows". Doctoral thesis, Universitat Politècnica de Catalunya, 2019. http://hdl.handle.net/10803/668476.
Organizations produce and collect large volumes of data, which must be processed repeatedly and quickly in order to obtain business insights. Such processing is typically carried out using data-intensive flows (DIFs) deployed in distributed processing environments (e.g., MapReduce-style systems). The DIFs of different users share common parts; that is, portions of the processing are duplicated, which wastes computational resources and increases overall cost. The intermediate results of several DIFs may therefore coincide and can be materialized to facilitate their reuse, which, done correctly, helps reduce cost and save resources. In addition, the way in which such results are materialized must be considered: for example, different logical data layouts (horizontal, vertical, or hybrid) can be used to reduce I/O cost. In this doctoral thesis, we first propose a novel approach for automatically materializing the intermediate results of DIFs through a multi-objective optimization method that can take multiple, conflicting quality metrics into account. We then study the behaviour of different DIF operators that access the materialized results directly. Based on this study, we devise a rule-based approach that decides the storage layout for materialized results according to the types of operations that consume them directly. Although the overall cost improves, the heuristic rules do not consider statistics about the amount of data read when making the choice, which can lead to wrong decisions. Consequently, we design a cost model that is able to find the appropriate storage layout for each scenario, depending on the characteristics of the stored data. The cost model uses statistics and access characteristics to estimate the I/O cost of a materialized intermediate result under different storage layouts and chooses the one with the lowest cost. The results show that the chosen storage layouts reduce the loading time of materialized results and, in general, improve the performance of DIFs. The thesis also addresses the tuning of the configurable parameters of hybrid layouts. We propose ATUN-HL (Auto TUNing Hybrid Layouts), which, based on the same cost model, the data characteristics, and the type of access being performed, finds the optimal values for the configurable parameters of Parquet (an implementation of hybrid layouts for the Hadoop Distributed File System). Finally, the thesis studies the impact of parallelism on DIFs and hybrid layouts; the proposed cost model helps devise an approach for tuning parallelism by deciding the number of tasks and machines used to process the data. In summary, the proposed cost model makes it possible to choose the best possible storage layout for materialized intermediate results, to tune the configurable parameters of hybrid layouts, and to estimate the number of tasks and machines for DIF execution.
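The kind of I/O cost comparison the abstract describes can be sketched very roughly in Python; the byte counts and the simple full-scan cost functions are invented for illustration, and the thesis' actual cost model also accounts for selectivity, compression, block sizes and hybrid layouts.

```python
def estimate_io(n_rows, col_widths, projected_cols):
    """Rough I/O estimate: a horizontal (row) layout reads every column of
    every row, while a vertical (column) layout reads only the projected
    columns."""
    horizontal = n_rows * sum(col_widths.values())
    vertical = n_rows * sum(col_widths[c] for c in projected_cols)
    return {"horizontal": horizontal, "vertical": vertical}

cols = {"id": 8, "ts": 8, "payload": 200, "status": 1}   # hypothetical schema
costs = estimate_io(1_000_000, cols, projected_cols=["id", "status"])
print(min(costs, key=costs.get), costs)   # narrow projections favour the vertical layout
```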
Meric, Burak, Michael Graul, Ronald Fernandes and Charles H. Jones. "DESIGN OF AN INTERLINGUA FOR DATA DISPLAY SYSTEMS". International Foundation for Telemetering, 2003. http://hdl.handle.net/10150/605580.
This paper presents the description of a new XML-based data display language called Data Display Markup Language (DDML) that can be used as an interlingua for different data display configuration formats. Translation of data display configurations between various vendor formats can be accomplished by translating in and out of DDML. DDML can also be used as a vendor-neutral format for archiving and retrieving display configurations in a test and evaluation (T&E) configuration repository.
Leopold, Henrik, Han van der Aa, Fabian Pittke, Manuel Raffel, Jan Mendling and Hajo A. Reijers. "Searching textual and model-based process descriptions based on a unified data format". Springer Berlin Heidelberg, 2019. http://dx.doi.org/10.1007/s10270-017-0649-y.
Pełny tekst źródłaJeong, Ki Tai. "A Common Representation Format for Multimedia Documents". Thesis, University of North Texas, 2002. https://digital.library.unt.edu/ark:/67531/metadc3336/.
Boman, Maria. "XML/EDI - EDI med XML-format?" Thesis, University of Skövde, Department of Computer Science, 2000. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-422.
Electronic Data Interchange (EDI) is a technique used to exchange electronic documents between buyers and sellers. These documents are transferred in a strictly standardized electronic way and can be invoices, purchase orders, delivery schedules, etc. EDI has existed for over 25 years and is widespread across the world. Using EDI requires substantial resources in the form of capital and expertise, which small and medium-sized enterprises often lack.
New techniques are constantly being developed. Extensible Markup Language (XML) is a new technique intended primarily for use on the Internet. XML divides documents up using tags, which help the reader identify the content of the documents. XML has attracted the interest of the EDI community, since XML is considered capable of serving as an EDI format while also being adapted to the Internet, which means that EDI could be used on the Internet more easily.
In my thesis I have investigated whether companies that use EDI intend to adopt XML/EDI. I conducted interviews to answer the research question. The results show that the companies intend to use XML/EDI if a proper standard is developed. The companies are also positive towards XML/EDI and Internet-based EDI, mainly because XML/EDI and Internet-based EDI would mean lower costs, which in turn would allow small and medium-sized enterprises to use the technology as well.
Harward, Gregory Brent. "Suitability of the NIST Shop Data Model as a Neutral File Format for Simulation". Diss., 2005. http://contentdm.lib.byu.edu/ETD/image/etd899.pdf.
Pełny tekst źródłaThornbrue, James R. (James Raymond) 1976. "Adaptive format conversion information as enhancement data for the high-definition television migration path". Thesis, Massachusetts Institute of Technology, 2003. http://hdl.handle.net/1721.1/29618.
Includes bibliographical references (p. 115-116).
Prior research indicates that a scalable video codec based on adaptive format conversion (AFC) information may be ideally suited to meet the demands of the migration path for high-definition television. Most scalable coding schemes use a single format conversion technique and encode residual information in the enhancement layer. Adaptive format conversion is different in that it employs more than one conversion technique. AFC partitions a video sequence into small blocks and selects the format conversion filter with the best performance in each block. Research shows that the bandwidth required for this type of enhancement information is small, yet the improvement in video quality is significant. This thesis focuses on the migration from 1080i to 1080p using adaptive deinterlacing. Two main questions are answered. First, how does adaptive format conversion perform when the base layer video is compressed in a manner typical to high-definition television? It was found that when the interlaced base layer was compressed to 0.3 bpp, the mean base layer PSNR was 32 dB and the PSNR improvement due to the enhancement layer was as high as 4 dB. Second, what is the optimal tradeoff between base layer and enhancement layer bandwidth? With the total bandwidth fixed at 0.32 bpp, it was found that the optimal bandwidth allocation was about 96% base layer, 4% enhancement layer using fixed, 16x16 pixel partitions. The base and enhancement layer video at this point were compared to 100% base layer allocation and the best nonadaptive format conversion. While there was usually no visible difference in base layer quality, the adaptively deinterlaced enhancement layer was generally sharper, with cleaner edges, less flickering, and fewer aliasing artifacts than the best nonadaptive method. Although further research is needed, the results of these experiments support the idea of using adaptive deinterlacing in the HDTV migration path.
by James R. Thornbrue.
S.M.
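The quality figures quoted in the abstract above are PSNR values; for reference, the standard definition in a few lines of numpy (generic formula, not code from the thesis):

```python
import numpy as np

def psnr(original: np.ndarray, reconstructed: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB for two images of equal shape."""
    mse = np.mean((original.astype(float) - reconstructed.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

a = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
noisy = np.clip(a + np.random.normal(0, 4, a.shape), 0, 255).astype(np.uint8)
print(round(psnr(a, noisy), 1))   # roughly 36 dB for sigma = 4 noise
```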
Miller, Helen Buchanan. "The effect of graphic format, age, and gender on the interpretation of quantitative data". Diss., Virginia Polytechnic Institute and State University, 1989. http://hdl.handle.net/10919/54245.
Ed. D.
Fast, Tobias. "Alpha Tested Geometry in DXR : Performance Analysis of Asset Data Variations". Thesis, Blekinge Tekniska Högskola, Institutionen för datavetenskap, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-19730.
Background. Ray tracing can be used to achieve hyper-realistic 3D rendering, but it is a very heavy computational task. Since hardware support for real-time ray tracing was launched, the games industry has introduced the feature in games. Despite modern hardware, performance problems are still experienced when common rendering techniques are combined with ray tracing; one of these problematic techniques is alpha testing. Objectives. This thesis investigates the following: 1) how the texture format of the alpha map and the number of alpha maps affect rendering times; 2) in what way tessellation of the alpha-tested geometry affects performance, and whether tessellation has the potential to replace the alpha test entirely from a performance perspective. Method. A DXR 3D renderer was implemented that can render alpha-tested geometry using an any-hit shader. The renderer was used to measure and compare rendering times given varying texture and geometry data. Two alpha-tested tree models were tessellated to different levels, and their associated textures were converted to the four formats used in the test scenes. Results & Conclusions. When the texture formats BC7, R (1x float32) and BC4 were used for the alpha map, all showed a reduced rendering time relative to RGBA (4x float32). BC4 gave the best performance increase, reducing rendering time by up to 17% with one alpha map per model and up to 43% with eight alpha maps. As the number of alpha maps per model grew, rendering times increased by up to 52% when going from one to two alpha maps, and a large increase in rendering time was observed when going from three to four alpha maps in all test cases. When alpha testing was applied to the tessellated model versions, rendering times increased in most cases, by as much as 135%; however, a reduction of up to 8% was observed when the models were tessellated to a certain degree. Turning off alpha testing gave a significant performance increase, which allowed more highly tessellated versions of all models to be rendered. Even though the number of triangles increased by a factor of 78 in one of the cases, the rendering time decreased by 30%. This suggests that pre-tessellated models could potentially be used to replace alpha-tested geometry when performance is a high priority.
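The storage cost of the four alpha-map formats compared above follows directly from their bits per texel (16 bytes for four-channel float32, 4 bytes for single-channel float32, 8 bits per texel for BC7 and 4 bits per texel for BC4); a quick back-of-the-envelope comparison, with an arbitrarily chosen 2048x2048 mip-less alpha map:

```python
BYTES_PER_TEXEL = {
    "RGBA (4x float32)": 16.0,
    "R (1x float32)": 4.0,
    "BC7": 1.0,
    "BC4": 0.5,
}

width = height = 2048            # hypothetical alpha-map resolution, no mip chain
for fmt, bpt in BYTES_PER_TEXEL.items():
    print(f"{fmt:18s} {width * height * bpt / 2**20:6.1f} MiB")
```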
Hodges, Glenn A. "Designing a common interchange format for unit data using the Command and Control information exchange data model (C2IEDM) and XSLT". Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2004. http://library.nps.navy.mil/uhtbin/hyperion/04Sep%5FHodges.pdf.
Thesis advisor(s): Curtis Blais, Don Brutzman. Includes bibliographical references (p. 95-98). Also available online.
Toby, Inimary T., Mikhail K. Levin, Edward A. Salinas, Scott Christley, Sanchita Bhattacharya, Felix Breden, Adam Buntzman, et al. "VDJML: a file format with tools for capturing the results of inferring immune receptor rearrangements". BioMed Central Ltd, 2016. http://hdl.handle.net/10150/624652.
Pełny tekst źródłaBerdugo, Albert, i Martin Small. "HIGH SPEED ASYNCHRONOUS DATA MULTIPLEXER/ DEMULTIPLEXER FOR HIGH DENSITY DIGITAL RECORDERS". International Foundation for Telemetering, 1996. http://hdl.handle.net/10150/608366.
Modern High Density Digital Recorders (HDDR) are ideal devices for the storage of large amounts of digital and/or wideband analog data. Ruggedized versions of these recorders are currently available and are supporting many military and commercial flight test applications. However, in certain cases, the storage format becomes very critical, e.g., when a large number of data types are involved, or when channel-to-channel correlation is critical, or when the original data source must be accurately recreated during post mission analysis. A properly designed storage format will not only preserve data quality, but will yield the maximum storage capacity and record time for any given recorder family or data type. This paper describes a multiplex/demultiplex technique that formats multiple high speed data sources into a single, common format for recording. The method is compatible with many popular commercial recorder standards such as DCRsi, VLDS, and DLT. Types of input data typically include PCM, wideband analog data, video, aircraft data buses, avionics, voice, time code, and many others. The described method preserves tight data correlation with minimal data overhead. The described technique supports full reconstruction of the original input signals during data playback. Output data correlation across channels is preserved for all types of data inputs. Simultaneous real-time data recording and reconstruction are also supported.
Gregory, Richard Cedric Thomas. "A graphic investigation of the atlas as a narrative format for the visual communication of cultural and social data". College of Fine Arts, University of New South Wales, 2009. http://handle.unsw.edu.au/1959.4/43798.
Pełny tekst źródłaRoguski, Łukasz 1987. "High-throughput sequencing data compression". Doctoral thesis, Universitat Pompeu Fabra, 2017. http://hdl.handle.net/10803/565775.
Thanks to advances in sequencing technologies, biomedical research has undergone a revolution in recent years, one of whose results has been an explosion in the volume of genomic data generated around the world. The typical size of the sequencing data generated in medium-scale experiments usually lies in a range between ten and one hundred gigabytes, stored across several files in the different formats produced by each experiment. The current de facto standard formats for representing genomic data are text-based. For practical reasons, the data need to be stored in compressed form. In most cases, these compression methods are based on general-purpose text compressors such as gzip; however, they cannot exploit the information models specific to sequencing data, and therefore provide limited functionality and insufficient savings in storage space. This explains why relatively basic operations, such as the processing, storage and transfer of genomic data, have become one of the main bottlenecks of current analysis pipelines. This thesis therefore focuses on efficient storage and compression methods for the data generated in sequencing experiments. First, we propose a novel general-purpose FASTQ file compressor. Unlike gzip, it significantly reduces the size of the resulting compressed file while processing the data at high speed. Next, we present compression methods that exploit the high sequence redundancy present in sequencing data. These methods achieve the best compression ratio among state-of-the-art FASTQ compressors, without using any external reference. We also present lossy compression approaches for storing auxiliary sequencing data, which allow the data size to be reduced even further. Finally, we provide a flexible compression framework and data format that make it possible to semi-automatically generate compression solutions that are not tied to any specific genomic file format. To ease complex data management, several datasets with heterogeneous formats can be stored in configurable containers, with the option of running custom queries over the stored data. Moreover, we show that simple solutions based on our framework can achieve results comparable to state-of-the-art format-specific compressors. In summary, the solutions developed and described in this thesis can easily be incorporated into genomic data analysis pipelines and, taken together, provide a solid basis for the development of complete approaches to the efficient storage and management of genomic data.
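A minimal illustration of why sequencing-aware compressors outperform gzip, as argued in this abstract (this is not the thesis' compressor): a FASTQ record interleaves identifiers, bases and quality scores, and splitting them into homogeneous streams before compression exposes far more redundancy to the backend coder. The input file name is hypothetical.

```python
import gzip

def split_fastq_streams(path):
    """Split a FASTQ file into three homogeneous streams
    (identifiers, sequences, qualities) that compress better separately."""
    ids, seqs, quals = [], [], []
    with open(path) as fh:
        while True:
            header = fh.readline()
            if not header:
                break
            ids.append(header.rstrip())
            seqs.append(fh.readline().rstrip())
            fh.readline()                      # '+' separator line
            quals.append(fh.readline().rstrip())
    return ids, seqs, quals

def gzipped_size(lines):
    return len(gzip.compress("\n".join(lines).encode()))

# ids, seqs, quals = split_fastq_streams("reads.fastq")   # hypothetical input file
# print(gzipped_size(ids), gzipped_size(seqs), gzipped_size(quals))
```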
Samaan, Mouna M., and Stephen C. Cook. "Configuration of Flight Test Telemetry Frame Formats". International Foundation for Telemetering, 1995. http://hdl.handle.net/10150/611587.
The production of flight test plans has received attention from many research workers due to the increasing complexity of testing facilities, the complex demands made by customers and the large volume of data required from test flights. The paper opens with a review of research work conducted by other authors who have contributed to ameliorating the preparation of flight test plans and processing the resulting data. This is followed by a description of a specific problem area: efficiently configuring the flight test data telemetry format (defined by the relevant standards while meeting user requirements of sampling rate and PCM word length). Following a description of a current semi-automated system, the authors propose an enhanced approach and demonstrate its efficiency through two case studies.
Brewer, Peter W., Daniel Murphy and Esther Jansma. "Tricycle: A Universal Conversion Tool For Digital Tree-Ring Data". Tree-Ring Society, 2011. http://hdl.handle.net/10150/622638.
Revels, Kenneth W. "Constraints of Migrating Transplant Information System's Legacy Data to an XML Format For Medical Applications Use". NSUWorks, 2001. http://nsuworks.nova.edu/gscis_etd/799.
Mehrez, Ichrak. "Auto-tuning pour la détection automatique du meilleur format de compression pour matrice creuse". Thesis, Université Paris-Saclay (ComUE), 2018. http://www.theses.fr/2018SACLV054/document.
Several applications in scientific computing deal with large sparse matrices having regular or irregular structures. In order to reduce required memory space and computing time, these matrices require the use of a particular data storage structure as well as the use of parallel/distributed target architectures. The choice of the most appropriate compression format generally depends on several factors, such as matrix structure, numerical method and target architecture. Given the diversity of these factors, an optimized choice for one input data set will likely have poor performance on another. Hence the interest in using a system allowing the automatic selection of the Optimal Compression Format (OCF) by taking into account these different factors. This thesis is written in this context. We detail our approach by presenting the design of an auto-tuner system for OCF selection. Given a sparse matrix, a numerical method, a parallel programming model and an architecture, our system can automatically select the OCF. In a first step, we validate our modeling by a case study that concerns (i) the Horner scheme and the sparse matrix-vector product (SMVP) as numerical methods; (ii) CSC, CSR, ELL, and COO as compression formats; (iii) data parallelism as a programming model; and (iv) a multicore platform as target architecture. This study allows us to extract a set of metrics and parameters that affect the OCF selection. We note that data parallel metrics are not sufficient to accurately choose the most suitable format. Therefore, we define new metrics involving the number of operations and the number of indirect data accesses. Thus, we propose a new decision process taking into account data parallel model analysis and algorithm analysis. In the second step, we propose to use machine learning algorithms to predict the OCF for a given sparse matrix. An experimental study using a multicore parallel platform and dealing with random and/or real-world matrices validates our approach and evaluates its performance. As future work, we aim to validate our approach using other parallel platforms such as GPUs.
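A small scipy sketch of the kind of structural features such a selector can feed to a rule-based or learned model (illustrative only, not the thesis' auto-tuner; the CSR-versus-ELL heuristic and its threshold are invented):

```python
import numpy as np
from scipy.sparse import random as sparse_random

A = sparse_random(1000, 1000, density=0.01, format="csr", random_state=0)

row_nnz = np.diff(A.indptr)                       # nonzeros per row
features = {
    "nnz": int(A.nnz),
    "mean_row_nnz": float(row_nnz.mean()),
    "max_row_nnz": int(row_nnz.max()),
    "row_nnz_variance": float(row_nnz.var()),
}
# ELL pads every row to the longest one, so irregular rows penalize it
ell_storage = A.shape[0] * int(row_nnz.max())
features["suggested_format"] = "CSR" if ell_storage > 2 * A.nnz else "ELL"
print(features)
```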
Fernandes, Ronald, Michael Graul, Burak Meric and Charles H. Jones. "ONTOLOGY-DRIVEN TRANSLATOR GENERATOR FOR DATA DISPLAY CONFIGURATIONS". International Foundation for Telemetering, 2004. http://hdl.handle.net/10150/605328.
This paper presents a new approach for the effective generation of translator scripts that can be used to automate the translation of data display configurations from one vendor format to another. Our approach uses the IDEF5 ontology description method to capture the ontology of each vendor format and provides simple rules for performing mappings. In addition, the method includes the specification of mappings between a language-specific ontology and its corresponding syntax specification, that is, either an eXtensible Markup Language (XML) Schema or Document Type Definition (DTD). Finally, we provide an algorithm for automatically generating eXtensible Stylesheet Language Transformation (XSLT) scripts that transform XML documents from one language to another. The method is implemented in a graphical tool called the Data Display Translator Generator (DDTG) that supports both inter-language (ontology-to-ontology) and intra-language (syntax-to-ontology) mappings and generates the XSLT scripts. The tool renders the XML Schema or DTD as trees, provides intuitive, user-friendly interfaces for performing the mappings, and provides a report of completed mappings. It also generates data type conversion code when both the source and target syntaxes are XML Schema-based. Our approach has the advantage of performing language mappings at an abstract, ontology level, and facilitates the mapping of tool ontologies to a common domain ontology (in our case, Data Display Markup Language or DDML), thereby eliminating the O(n^2) mapping problem that involves a number of data formats in the same domain.
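The artefacts this generator emits are XSLT scripts; as a minimal illustration of how such a generated script is applied to move a display description from one XML dialect to another (the element names are invented and are not DDML or any vendor format), lxml can run the transform directly:

```python
from lxml import etree

xslt = etree.XML(b"""
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/vendorDisplay">
    <display>
      <xsl:for-each select="widget">
        <element type="{@kind}"><xsl:value-of select="@label"/></element>
      </xsl:for-each>
    </display>
  </xsl:template>
</xsl:stylesheet>
""")
source = etree.XML(b'<vendorDisplay><widget kind="dial" label="RPM"/></vendorDisplay>')

transform = etree.XSLT(xslt)
print(str(transform(source)))   # <display><element type="dial">RPM</element></display>
```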
Nicolai, Andreas. "DELPHIN 6 Climate Data File Specification, Version 1.0". Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2017. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-221222.
Wu, Yuanyuan. "HADOOP-EDF: LARGE-SCALE DISTRIBUTED PROCESSING OF ELECTROPHYSIOLOGICAL SIGNAL DATA IN HADOOP MAPREDUCE". UKnowledge, 2019. https://uknowledge.uky.edu/cs_etds/88.
Danielsson, Robin. "Jämförelser av MySQL och Apache Spark : För aggregering av smartmätardata i Big Data format för en webbapplikation". Thesis, Högskolan i Skövde, Institutionen för informationsteknologi, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-18667.
Scanlan, JD. "A context aware attack detection system across multiple gateways in real-time". Thesis, Honours thesis, University of Tasmania, 2004. https://eprints.utas.edu.au/117/1/Thesis_Final.pdf.
Campbell, Daniel A., and Lee Reinsmith. "Telemetry Definition and Processing (TDAP): Standardizing Instrumentation and EU Conversion Descriptions". International Foundation for Telemetering, 1997. http://hdl.handle.net/10150/607584.
Telemetry format descriptions and engineering unit conversion calibrations are generated in an assortment of formats and numbering systems on various media. Usually this information comes to the central telemetry receiving/processing system from multiple sources, fragmented and disjointed. As present day flight tests require more and more telemetry parameters to be instrumented and processed, standardization and automation for handling this ever increasing amount of information becomes more and more critical. In response to this need, the Telemetry Definition and Processing (TDAP) system has been developed by the Air Force Development Test Center (AFDTC) Eglin AFB, Florida. TDAP standardizes the format of information required to convert PCM data and MIL-STD-1553 Bus data into engineering units. This includes both the format of the data files and the software necessary to display, output, and extract subsets of data. These standardized files are electronically available for TDAP users to review/update and are then used to automatically set up telemetry acquisition systems. This paper describes how TDAP is used to standardize the development and operational test community’s telemetry data reduction process, both real-time and post-test.
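Engineering-unit (EU) conversion of raw PCM counts is typically expressed as a per-measurement calibration polynomial; a minimal sketch of the idea (the measurement names and coefficients are made up, not TDAP data):

```python
def eu_convert(counts: float, coefficients: tuple) -> float:
    """Apply a calibration polynomial c0 + c1*x + c2*x**2 + ... to raw counts."""
    return sum(c * counts ** i for i, c in enumerate(coefficients))

calibrations = {
    "ALT_FT": (0.0, 1.25),              # hypothetical linear altitude calibration
    "TEMP_DEGC": (-40.0, 0.031, 1e-6),  # hypothetical quadratic thermocouple fit
}

raw = {"ALT_FT": 20480, "TEMP_DEGC": 1893}
print({name: round(eu_convert(raw[name], calibrations[name]), 2) for name in raw})
```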
Fernandes, Ronald, Michael Graul, John Hamilton, Burak Meric and Charles H. Jones. "DEVELOPING INTERNAL AND EXTERNAL TRANSLATORS FOR DATA DISPLAY SYSTEMS". International Foundation for Telemetering, 2005. http://hdl.handle.net/10150/604905.
The focus of this paper is to describe a unified methodology for developing both internal and external data display translators between an Instrumentation Support System (ISS) format and Data Display Markup Language (DDML), a neutral language for describing data displays. The methodology includes aspects common to both ISSs that have a well documented text-based save format and those that do not, as well as aspects that are unique to each type. We will also describe the means by which an external translator can be integrated into a translator framework. Finally, we will describe how an internal translator can be integrated directly into the ISS.
Bierza, Daniel. "Editor pasportizace VUT". Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2009. http://www.nusl.cz/ntk/nusl-237477.
Meuel, Peter. "Insertion de données cachées dans des vidéos au format H. 264". Montpellier 2, 2009. http://www.theses.fr/2009MON20218.
This thesis targets two major issues caused by the massive adoption of the H.264 video format: the privacy issue in closed-circuit television and the need for secure and robust watermarking methods for video content. A first contribution addresses the privacy issue through the creation of a single video stream that restricts the visual information of the filmed faces to persons holding the appropriate key. The measured performance shows that the method is usable inside a video camera. The second contribution, on robust watermarking, applies the state of the art in secure watermarking to video. In contrast to encryption, the security of the method relies on the secrecy of the insertion subspace. The work explains the entire process of adapting this approach to the H.264 video format.
Edman, Fredrik. "Läsa och lagra data i JSON format för smart sensor : En jämförelse i svarstid mellan hybriddatabassystemet PostgreSQL och MongoDB". Thesis, Högskolan i Skövde, Institutionen för informationsteknologi, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-15482.
Stensmar, Isak. "Steganografi i bilder : En studie om bildformat och visuella bildrepresentationens påverkan vid lagring av data med hjälp av en steganografisk metod". Thesis, Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-12887.
Pełny tekst źródłaContext. By using image steganography it is possible to hide a large amount of data without making big differences to the initial picture. One commonly used method is Least Significant Bit (LSB), which often is considered one of the first method implemented and used in Image Steganography. Apart from the method, the user also have a choice when deciding what picture he or she should use as the carrier of information. What people often try to accomplish is to have a very complex method that hides the data in an efficient way, but forgets about the picture used as a carrier. In the end, all measurements will be done on the picture. Objectives. This study will investigate if different image formats, BMP, PNG, JPEG and TIFF, have an impact on the differences when comparing the original picture with the modified, given that data is stored with a steganographic method and is gradually increased. The study will also investigate if what the picture visually represent will have an effect on the measurements. Methods. An extended method of the Least Significant Bit method will be implemented and used to create different pictures with different kinds of image formats. An experiment will investigate these formats by taking measurements with MSE (Mean Squared Error), PSNR (Peek Signal-to-Noise Ratio) and SSIM (Structural Similarity). Results. When comparing different formats one could say that JPEG showed better performance by having a lower differential value between each test, by looking at the graphs and tables. BMP, PNG and TIFF had minimal changes between each other for each test. As for the visual representation of the pictures, two pictures showed a higher differential value after each test than the remaining three. Conclusions. The results from the experiment showed that which compression method a format uses will have an impact on the measurement. The results also showed that the pictures’ visual representation could have some impact on the measurement of a picture but more data is needed to conclude this theory.
Rådeström, Johan, and Gustav Skoog. "Realtidssammanställning av stora mängder data från tidsseriedatabaser". Thesis, KTH, Data- och elektroteknik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-208932.
Large amounts of time series data are generated and managed within management systems and industries with the purpose of enabling monitoring of the systems. When the time series are to be acquired and compiled for data analysis, the expenditure of time is a problem. The purpose of this thesis was to determine how the extraction of time series data should be performed to give the systems the best response time possible. To make the extraction and compilation as effective as possible, different techniques and methods were tested and evaluated. The areas for which techniques and methods were compared were compilation of data inside and outside the database, caching, usage of in-memory databases compared to other databases, data formats, data transfer, and precalculation of data. The results showed that the best solution was to compile data in parallel outside the database, to use a custom built-in in-memory database, to use Google Protobuf as the data format, and finally to use precalculated data.
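A schematic sketch of the winning arrangement described above, fetching series in parallel outside the database and compiling them in the application (generic Python with a placeholder query function; not code from the systems examined in the thesis):

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_series(tag: str) -> list:
    """Placeholder for a per-tag database query returning (timestamp, value) pairs."""
    return [(t, hash((tag, t)) % 100 / 10.0) for t in range(0, 3600, 60)]

def compile_outside_db(tags: list) -> dict:
    """Fetch all series concurrently and pivot them into one row per timestamp."""
    table = {}
    with ThreadPoolExecutor(max_workers=8) as pool:
        for tag, series in zip(tags, pool.map(fetch_series, tags)):
            for ts, value in series:
                table.setdefault(ts, {})[tag] = value
    return table

rows = compile_outside_db(["pressure", "flow", "temperature"])
print(len(rows), "timestamps compiled")   # 60 timestamps, one dict of tag values each
```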
Xu, Guorong. "Computational Pipeline for Human Transcriptome Quantification Using RNA-seq Data". ScholarWorks@UNO, 2011. http://scholarworks.uno.edu/td/343.
Munir, Rana Faisal [Verfasser], Wolfgang [Akademischer Betreuer] Lehner, Alberto [Akademischer Betreuer] Abelló and Oscar [Akademischer Betreuer] Romero. "Storage Format Selection and Optimization for Materialized Intermediate Results in Data-Intensive Flows / Rana Faisal Munir ; Wolfgang Lehner, Alberto Abelló, Oscar Romero". Dresden : Technische Universität Dresden, 2021. http://d-nb.info/1231847670/34.
Pełny tekst źródłaKalibjian, Jeff. "Data Security Architecture Considerations for Telemetry Post Processing Environments". International Foundation for Telemetering, 2017. http://hdl.handle.net/10150/626950.
Siraskar, Nandkumar S. "Adaptive Slicing in Additive Manufacturing Process using a Modified Boundary Octree Data Structure". University of Cincinnati / OhioLINK, 2012. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1353155811.
Pełny tekst źródłaVogelsang, Stefan, Heiko Fechner i Andreas Nicolai. "Delphin 6 Material File Specification". Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2013. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-126274.