Dissertations / Theses on the topic 'Data reduction and analysis'

To see the other types of publications on this topic, follow the link: Data reduction and analysis.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Data reduction and analysis.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Wehrhahn, Ansgar. "Data Reduction and Analysis for Exoplanet Characterization." Licentiate thesis, Uppsala universitet, Institutionen för fysik och astronomi, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-437613.

2

Vamulapalli, Harika Rao. "On Dimensionality Reduction of Data." ScholarWorks@UNO, 2010. http://scholarworks.uno.edu/td/1211.

Abstract:
Random projection is an important tool for the dimensionality reduction of data and can be made efficient with strong error guarantees. In this thesis, we focus on linear transforms of high-dimensional data to a low-dimensional space satisfying the Johnson-Lindenstrauss lemma. In addition, we prove some theoretical results about the projections that are of interest in practical applications. We show how the technique can be applied to synthetic data with a probabilistic guarantee on the pairwise distances. The connection between dimensionality reduction and compressed sensing is also discussed.
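
For illustration only (not code from the thesis), the following NumPy sketch applies a Gaussian random projection, the standard construction behind Johnson-Lindenstrauss-type guarantees, and checks the distortion of one pairwise distance; all sizes and names are arbitrary:

    import numpy as np

    rng = np.random.default_rng(0)
    n, d, k = 100, 10_000, 500        # n points in d dimensions, projected to k
    X = rng.normal(size=(n, d))       # synthetic high-dimensional data

    # Gaussian entries with variance 1/k keep squared lengths correct in expectation
    R = rng.normal(scale=1.0 / np.sqrt(k), size=(k, d))
    Y = X @ R.T                       # projected data, shape (n, k)

    orig = np.linalg.norm(X[0] - X[1])
    proj = np.linalg.norm(Y[0] - Y[1])
    print(f"relative distortion: {abs(proj - orig) / orig:.3f}")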
3

McClelland, Robyn L. "Regression based variable clustering for data reduction." Thesis, Connect to this title online; UW restricted, 2000. http://hdl.handle.net/1773/9611.

4

Dhungana, Prakash. "Application for quick reduction of GPS data for urban mobility analysis." Master's thesis, Escola Superior Tecnologia e Gestão de Oliveira do Hospital, 2014. http://hdl.handle.net/10400.26/17521.

Abstract:
A data reduction application intended as an assistive tool for students and researchers has been designed, developed, and deployed. Extensive research was carried out for this project, and several modifications were made to the architectural design, development, and implementation during the process. Ideas for further advancement of the application were also explored. Primarily, open-source software for data reduction has been developed to suit the needs of data analysis enthusiasts. Requirements for the project were drawn from a thematic analysis of data collected from the state of the art and from supervisors. Six evaluation sessions covering various data reduction techniques were carried out to establish the acceptance, suitability, usefulness, and efficiency of the product. The results were outstanding and indicate strong prospects for the product in the targeted arena.
5

Ross, Ian. "Nonlinear dimensionality reduction methods in climate data analysis." Thesis, University of Bristol, 2008. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.492479.

Abstract:
Linear dimensionality reduction techniques, notably principal component analysis, are widely used in climate data analysis as a means to aid in the interpretation of datasets of high dimensionality. These linear methods may not be appropriate for the analysis of data arising from nonlinear processes occurring in the climate system. Numerous techniques for nonlinear dimensionality reduction have been developed recently that may provide useful tools for the identification of low-dimensional manifolds in climate data sets arising from nonlinear dynamics. In this thesis I apply three such techniques to the study of El Niño/Southern Oscillation variability in tropical Pacific sea surface temperatures and thermocline depth, comparing observational data with simulations from coupled atmosphere-ocean general circulation models from the CMIP3 multi-model ensemble.
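
The abstract does not name the three nonlinear techniques, so the sketch below only illustrates how one common method, Isomap, can be applied to gridded climate fields reshaped into a samples-by-gridpoints matrix; scikit-learn is assumed and the data are synthetic stand-ins:

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.manifold import Isomap

    rng = np.random.default_rng(1)
    sst = rng.normal(size=(480, 2000))   # stand-in: 480 monthly maps, 2000 grid points

    pca = PCA(n_components=2).fit_transform(sst)                     # linear (EOF-like) baseline
    iso = Isomap(n_neighbors=20, n_components=2).fit_transform(sst)  # nonlinear embedding
    print(pca.shape, iso.shape)          # two (480, 2) low-dimensional trajectories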
6

Li, Yingxing. "On sliced methods in dimension reduction." Click to view the E-thesis via HKUTO, 2005. http://sunzi.lib.hku.hk/hkuto/record/B31559256.

7

Li, Yingxing, and 李迎星. "On sliced methods in dimension reduction." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2005. http://hub.hku.hk/bib/B31559256.

8

Claudet, Andre Aman. "Data reduction for high speed computational analysis of three dimensional coordinate measurement data." Thesis, Georgia Institute of Technology, 1998. http://hdl.handle.net/1853/17617.

9

Tobeck, Daniel. "Data Structures and Reduction Techniques for Fire Tests." Thesis, University of Canterbury. Civil Engineering, 2007. http://hdl.handle.net/10092/1578.

Abstract:
To perform fire engineering analysis, data on how an object or group of objects burns is almost always needed. This data should be collected and stored in a logical and complete fashion to allow for meaningful analysis later. This thesis details the design of a new fire test Data Base Management System (DBMS), termed UCFIRE, which was built to overcome the limitations of existing fire test DBMSs and was based primarily on the FDMS 2.0 and FIREBASEXML specifications. The UCFIRE DBMS is currently the most comprehensive and extensible DBMS available in the fire engineering community and can store the following test types: Cone Calorimeter, Furniture Calorimeter, Room/Corner Test, LIFT and Ignitability Apparatus tests. Any data reduction performed on this fire test data should be done in an entirely mechanistic fashion rather than relying on human intuition, which is subjective. Currently no other DBMS allows for the semi-automation of the data reduction process. A number of pertinent data reduction algorithms were investigated and incorporated into the UCFIRE DBMS. An ASP.NET Web Service (WEBFIRE) was built to reduce the bandwidth required to exchange fire test information between the UCFIRE DBMS and a UCFIRE document stored on a web server. A number of Mass Loss Rate (MLR) algorithms were investigated and it was found that the Savitzky-Golay filtering algorithm offered the best performance. This algorithm had to be further modified to autonomously filter other noisy events that occurred during the fire tests, and it was then evaluated on data from exemplar Furniture Calorimeter and Cone Calorimeter tests. The LIFT test standard (ASTM E 1321-97a) requires its ignition and flame spread data to be scrutinised but does not state how to do this. To meet these requirements, the fundamentals of linear regression were reviewed and an algorithm to mechanistically scrutinise ignition and flame spread data was developed. This algorithm produced reasonable results when used on exemplar ignition and flame spread test data.
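
A Savitzky-Golay filter fits a low-order polynomial in a sliding window and can return the smoothed derivative directly, which is one way to obtain a mass loss rate from noisy mass data. A minimal SciPy sketch with illustrative parameters, not those used in UCFIRE:

    import numpy as np
    from scipy.signal import savgol_filter

    t = np.arange(0.0, 600.0, 1.0)      # 1 Hz load-cell samples over a 10 min test
    mass = 10.0 * np.exp(-t / 300.0) + np.random.default_rng(2).normal(0, 0.01, t.size)

    # deriv=1 returns the derivative of the fitted polynomial; delta is the sample
    # spacing, so the result is in kg/s. The MLR is the negative of that derivative.
    mlr = -savgol_filter(mass, window_length=31, polyorder=3, deriv=1, delta=1.0)
    print(mlr.max())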
10

Ferrer, Samuel. "STAIRS : Data reduction strategy on genomics." Thesis, Uppsala universitet, Institutionen för biologisk grundutbildning, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-383465.

Abstract:
Background. An enormous accumulation of genomic data has taken place over the last ten years. This complicates visualization and manual inspection, key steps in trying to understand large datasets containing DNA sequences with millions of letters. The situation has created a gap between data complexity and qualified personnel, due to the need to trade off visualization, reduction capacity, and exploratory functions, features rarely achieved together by existing tools such as the SRA toolkit (https://www.ncbi.nlm.nih.gov/sra/docs/toolkitsoft/). A novel approach to the problem of genomic analysis and visualization was pursued in this project, by means of STrAtified Interspersed Reduction Structures (STAIRS). Results. Ten weeks of intense work resulted in novel algorithms to compress data, transform it into stairs vectors and align them. The Smith–Waterman and Needleman–Wunsch algorithms were specially modified for this purpose, and the application produced statistical performance and behavioural charts.
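
The STAIRS-specific modifications are not reproduced here; as the textbook baseline, the Needleman–Wunsch global alignment score can be computed with a plain-Python sketch like this:

    def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-2):
        """Return the optimal global alignment score of sequences a and b."""
        n, m = len(a), len(b)
        # score[i][j] = best score aligning a[:i] with b[:j]
        score = [[0] * (m + 1) for _ in range(n + 1)]
        for i in range(1, n + 1):
            score[i][0] = i * gap
        for j in range(1, m + 1):
            score[0][j] = j * gap
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
        return score[n][m]

    print(needleman_wunsch("GATTACA", "GCATGCU"))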
11

Ray, Sujan. "Dimensionality Reduction in Healthcare Data Analysis on Cloud Platform." University of Cincinnati / OhioLINK, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=ucin161375080072697.

12

Zhou, Ping. "ERROR ANALYSIS AND DATA REDUCTION FOR INTERFEROMETRIC SURFACE MEASUREMENTS." Diss., The University of Arizona, 2009. http://hdl.handle.net/10150/195309.

Abstract:
High-precision optical systems are generally tested using interferometry, since it is often the only way to achieve the desired measurement precision and accuracy. Interferometers can generally measure a surface to an accuracy of one hundredth of a wave. In order to achieve an accuracy of the next order of magnitude, one thousandth of a wave, each error source in the measurement must be characterized and calibrated. Errors in interferometric measurements are classified into random errors and systematic errors. An approach to estimate random errors in the measurement, based on the variation in the data, is provided. Systematic errors, such as retrace error, imaging distortion, and error due to diffraction effects, are also studied in this dissertation. Methods to estimate the first-order geometric error and errors due to diffraction effects are presented. The interferometer phase modulation transfer function (MTF) is another intrinsic error. The phase MTF of an infrared interferometer is measured with a phase Siemens star, and a Wiener filter is designed to recover the middle spatial frequency information. Map registration is required when two maps are tested in different systems and one needs to be subtracted from the other; incorrect mapping causes wavefront errors. A smoothing filter method is presented which can reduce the sensitivity to registration error and improve the overall measurement accuracy. Interferometric optical testing with computer-generated holograms (CGH) is widely used for measuring aspheric surfaces. The accuracy of the drawn pattern on a hologram decides the accuracy of the measurement. Uncertainties in the CGH manufacturing process introduce errors in the holograms and hence in the generated wavefront. An optimal design of the CGH is provided which can reduce the sensitivity to fabrication errors and give good diffraction efficiency for both chrome-on-glass and phase-etched CGHs.
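
A generic frequency-domain Wiener restoration of the kind mentioned, boosting the mid spatial frequencies attenuated by a known MTF, can be sketched as follows; the MTF, SNR, and data are illustrative, not the dissertation's calibration:

    import numpy as np

    def wiener_restore(measured, mtf, snr=100.0):
        """Wiener restoration for a known, real-valued MTF.

        W = MTF / (MTF**2 + 1/SNR) boosts frequencies the instrument attenuated
        without amplifying noise where the MTF (and hence the signal) is small.
        """
        w = mtf / (mtf**2 + 1.0 / snr)
        return np.real(np.fft.ifft2(np.fft.fft2(measured) * w))

    f = np.fft.fftfreq(64)
    mtf = 1.0 / (1.0 + 25.0 * (f[:, None]**2 + f[None, :]**2))   # smooth radial falloff
    surface = np.random.default_rng(7).normal(size=(64, 64))
    blurred = np.real(np.fft.ifft2(np.fft.fft2(surface) * mtf))
    print(np.abs(wiener_restore(blurred, mtf, snr=1000.0) - surface).mean())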
13

Di Ciaccio, Lucio. "Feature selection and dimensionality reduction for supervised data analysis." Thesis, Massachusetts Institute of Technology, 2016. https://hdl.handle.net/1721.1/122827.

Thesis: S.M., Massachusetts Institute of Technology, Department of Aeronautics and Astronautics, 2016. Cataloged from the PDF version of the thesis. Includes bibliographical references (pages 103-106).
14

Fidalgo, André Filipe dos Santos Pinto. "IPTV data reduction strategy to measure real users’ behaviours." Master's thesis, Faculdade de Ciências e Tecnologia, 2012. http://hdl.handle.net/10362/8448.

Abstract:
Dissertation submitted to obtain the degree of Master in Computer Engineering
The digital IPTV service has evolved in terms of features, technology and accessibility of its contents. However, the rapid evolution of features and services has brought a more complex offering to customers, which often is not enjoyed or even perceived. Therefore, it is important to measure the real advantage of those features and understand how they are used by customers. In this work, we present a strategy that deals directly with real IPTV data, which result from customers' interactions with their set-top boxes. This data has a very low granularity level, which makes it complex and difficult to interpret. The approach is to transform the clicking actions into a more conceptual and representative description of the running activities. Furthermore, the data cardinality is significantly reduced while the information quality is enhanced. More than a transformation, this approach aims to be iterative: at each level we obtain more accurate information with which to characterize a particular behaviour. As experimental results, we present some application areas regarding the main features offered in this digital service. In particular, we study zapping behaviour and evaluate DVR service usage. We also discuss the possibility of integrating the devised strategy at a particular carrier, with the aim of analysing the consumption rate of its services, adjusting them to customers' real usage profiles, and studying the feasibility of introducing new services.
15

Voss, T. J. "Automated Analysis Tools for Reducing Spacecraft Telemetry Data." International Foundation for Telemetering, 1993. http://hdl.handle.net/10150/611898.

Abstract:
International Telemetering Conference Proceedings / October 25-28, 1993 / Riviera Hotel and Convention Center, Las Vegas, Nevada
A practical description is presented of the methods used to reduce spacecraft telemetry data using a hierarchical toolkit of software programs developed for a UNIX environment.
16

Hart, Dennis L., and Marvin A. Smith. "AIM-120A DOPPLER RADAR TELEMETRY DATA REDUCTION AND ANALYSIS SOFTWARE." International Foundation for Telemetering, 1994. http://hdl.handle.net/10150/608575.

Abstract:
International Telemetering Conference Proceedings / October 17-20, 1994 / Town & Country Hotel and Conference Center, San Diego, California
This paper describes the application software used to convert AIM-120A, Advanced Medium Range Air-to-Air Missile (AMRAAM), telemetry data to a series of color images and time-correlated engineering unit results. X Window System-based graphics facilitate visualization of the Doppler radar data. These software programs were developed for the VAX/VMS and DEC Alpha environments.
17

Dionisi, Steven M. "Real Time Data Reduction and Analysis Using Artificial Neural Networks." International Foundation for Telemetering, 1993. http://hdl.handle.net/10150/611856.

Abstract:
International Telemetering Conference Proceedings / October 25-28, 1993 / Riviera Hotel and Convention Center, Las Vegas, Nevada
An artificial neural network (ANN) for use in real-time data reduction and analysis will be presented. The use and advantages of hardware and software implementations of neural networks will be considered. The ability of neural networks to learn and store associations between different sets of data can be used to create custom algorithms for some of the data analysis done during missions. Once trained, the ANN can distill the signals from several sensors into a single output, such as safe/unsafe. Used on a neural chip, the trained ANN can eliminate the need for A/D conversions and multiplexing for processing of combined parameters, and the massively parallel nature of the network allows the processing time to remain independent of the number of parameters. As a software routine, the advantages of using an ANN over conventional algorithms include ease of use for engineers and the ability to handle nonlinear, noisy and imperfect data. This paper applies the ANN to performance data from a T-38 aircraft.
18

Lee, Ho-Jin. "Functional data analysis: classification and regression." Texas A&M University, 2004. http://hdl.handle.net/1969.1/2805.

Abstract:
Functional data refer to data which consist of observed functions or curves evaluated at a finite subset of some interval. In this dissertation, we discuss statistical analysis, especially classification and regression, when data are available in function form. Due to the nature of functional data, one considers function spaces in presenting such data, and each functional observation is viewed as a realization generated by a random mechanism in those spaces. The classification procedure in this dissertation is based on dimension reduction techniques for these spaces. One commonly used method is functional principal component analysis (functional PCA), in which an eigendecomposition of the covariance function is employed to find the directions of highest variability of the data in the function space. The reduced space of functions spanned by a few eigenfunctions is thought of as a space containing most of the features of the functional data. We also propose a functional regression model for scalar responses. The infinite dimensionality of the predictor space causes many problems, one of which is that there are infinitely many solutions. The space of the parameter function is restricted to Sobolev-Hilbert spaces and the so-called ε-insensitive loss function is utilized. As a robust technique of function estimation, we present a way to find a function that deviates by at most ε from the observed values and at the same time is as smooth as possible.
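
On an equally spaced grid, functional PCA reduces to an eigendecomposition of the sample covariance matrix of the discretized curves; a hedged NumPy sketch on synthetic curves, with quadrature weights omitted:

    import numpy as np

    rng = np.random.default_rng(3)
    grid = np.linspace(0, 1, 101)
    curves = (rng.normal(size=(200, 1)) * np.sin(2 * np.pi * grid)
              + rng.normal(size=(200, 1)) * np.cos(2 * np.pi * grid)
              + rng.normal(0, 0.1, size=(200, grid.size)))   # 200 noisy curves

    centered = curves - curves.mean(axis=0)
    cov = centered.T @ centered / len(curves)   # discretized covariance function
    eigval, eigvec = np.linalg.eigh(cov)        # ascending eigenvalues
    eigfun = eigvec[:, ::-1]                    # leading eigenfunctions first
    scores = centered @ eigfun[:, :2]           # FPC scores: the reduced representation
    print(scores.shape)                         # (200, 2)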
19

Buckley, Dave. "Moving Data Analysis into the Acquisition Hardware." International Foundation for Telemetering, 2014. http://hdl.handle.net/10150/577519.

Abstract:
ITC/USA 2014 Conference Proceedings / The Fiftieth Annual International Telemetering Conference and Technical Exhibition / October 20-23, 2014 / Town and Country Resort & Convention Center, San Diego, CA
Data acquisition for flight test is typically handled by dedicated hardware which performs specific functions and targets specific interfaces and buses. Through the use of an FPGA state-machine-based design approach, performance and robustness can be guaranteed. Up to now, sufficient flexibility has been provided by allowing the user to configure the hardware for the particular application. However, by allowing custom algorithms to run on the data acquisition hardware, far greater control and flexibility can be offered to the flight test engineer. As the volume of acquired data increases, this extra control can be used to vastly reduce the amount of data to be recorded or telemetered. Real-time analysis of test points can also now be done where post-processing would previously have been required. This paper examines examples of data acquisition, recording, and processing, and investigates where data reduction and time savings can be achieved by enabling the flight test engineer to run his own algorithms on the hardware.
20

Concha, Ramírez Francisca Andrea. "FADRA: A CPU-GPU framework for astronomical data reduction and Analysis." Tesis, Universidad de Chile, 2016. http://repositorio.uchile.cl/handle/2250/140769.

Abstract:
Master of Science in Computer Science (Magíster en Ciencias, Mención Computación)
This thesis lays the foundations of FADRA: Framework for Astronomical Data Reduction and Analysis. The FADRA framework was designed to be efficient, simple to use, modular, expandable, and open source. Nowadays astronomy is inseparable from computing, but some of the most widely used software packages were developed three decades ago and were not designed to face today's big data paradigms. The world of astronomical software must evolve not only towards practices that understand and embrace the big data era, but also towards a focus on the collaborative work of the community. The work carried out consisted of the design and implementation of the basic algorithms for astronomical data analysis, initiating the development of the framework. This comprised the implementation of data structures that are efficient when working with large numbers of images, the implementation of algorithms for the calibration (reduction) of astronomical images, and the design and development of algorithms for computing photometry and obtaining light curves. Both the reduction and the light-curve algorithms were implemented in CPU and GPU versions. For the GPU implementations, the algorithms were designed to minimize the amount of data to be processed, so as to reduce the data transfer between CPU and GPU, a slow process that often eclipses the execution-time gains obtainable through parallelization. Although FADRA was designed with the idea of using its algorithms within scripts, a wrapper module for interacting through graphical interfaces was also implemented. One of the main goals of this thesis was the validation of the results obtained with FADRA. To this end, reduction results and light curves were compared with results from AstroPy, a Python package with various utilities for astronomers. The experiments were carried out on six datasets of real astronomical images. For image reduction, the Normalized Root Mean Squared Error (NRMSE) was used as the similarity metric between images. For the light curves, the shapes of the curves were shown to be equal by determining constant offsets between the numerical values of the points belonging to the different curves. In terms of validity, both the reduction and the light-curve computation, in their CPU and GPU implementations, produced correct results when compared with AstroPy, which means that the developments and approximations designed for FADRA give results that can safely be used for the scientific analysis of astronomical images. In terms of execution times, the data-intensive nature of the reduction process makes the GPU version even slower than the CPU version; for light-curve computation, however, the GPU algorithm shows a significant decrease in execution time compared with its CPU counterpart.
This work was partially funded by Fondecyt Project 1120299.
21

Cheriyadat, Anil Meerasa. "Limitations of principal component analysis for dimensionality-reduction for classification of hyperspectral data." Master's thesis, Mississippi State : Mississippi State University, 2003. http://library.msstate.edu/etd/show.asp?etd=etd-11072003-133109.

22

Sallum, S., and J. Eisner. "Data Reduction and Image Reconstruction Techniques for Non-redundant Masking." IOP PUBLISHING LTD, 2017. http://hdl.handle.net/10150/626263.

Abstract:
The technique of non-redundant masking (NRM) transforms a conventional telescope into an interferometric array. In practice, this provides a much better constrained point-spread function than a filled aperture and thus higher resolution than traditional imaging methods. Here, we describe an NRM data reduction pipeline. We discuss strategies for NRM observations regarding dithering patterns and calibrator selection. We describe relevant image calibrations and use example Large Binocular Telescope data sets to show their effects on the scatter in the Fourier measurements. We also describe the various ways to calculate Fourier quantities, and discuss different calibration strategies. We present the results of image reconstructions from simulated observations where we adjust prior images, weighting schemes, and error bar estimation. We compare two imaging algorithms and discuss implications for reconstructing images from real observations. Finally, we explore how the current state of the art compares to next-generation Extremely Large Telescopes.
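
One standard Fourier quantity in NRM is the closure phase, the argument of the bispectrum around a closing triangle of baselines, in which aperture-dependent phase errors cancel. A minimal sketch, assuming complex visibilities have already been extracted:

    import numpy as np

    def closure_phase(v12, v23, v31):
        """Closure phase in degrees from visibilities around a baseline triangle.

        The atmospheric/instrumental phase at each aperture appears once with each
        sign in the product, so it cancels; only source structure remains.
        """
        return np.degrees(np.angle(v12 * v23 * v31))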
23

Battey, Heather Suzanne. "Dimension reduction and automatic smoothing in high dimensional and functional data analysis." Thesis, University of Cambridge, 2011. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.609849.

24

Sease, Bradley Jason. "Data Reduction for Diverse Optical Observers through Fundamental Dynamic and Geometric Analysis." Diss., Virginia Tech, 2016. http://hdl.handle.net/10919/70923.

Abstract:
Typical algorithms for processing unresolved space imagery from optical systems make broad assumptions about the expected behavior of the sensors during collection. While these techniques are often successful at data reduction for a particular mission, they rarely extend to sensors in different operating modes. Such specialized techniques therefore reduce the number of sensors able to contribute imagery. By approaching this problem with analysis of the fundamental dynamic equations and geometry at play, we can gain a deeper understanding into the behavior of both stars and space objects viewed through optical sensors. This type of analysis has the potential to enable data collection from a wider variety of sensors, increasing both the quantity and quality of data available for space object catalog maintenance. This dissertation will explore the implications of this approach to unresolved data processing. Sensor-level motion descriptions will be derived and applied to the problem of space object discrimination and tracking. Results of this processing pipeline as applied to both simulated and real optical data will be presented.
25

Kliegr, Tomáš. "Clickstream Analysis." Master's thesis, Vysoká škola ekonomická v Praze, 2007. http://www.nusl.cz/ntk/nusl-2065.

Abstract:
This thesis introduces current research trends in clickstream analysis and proposes a new heuristic that could be used for dimensionality reduction of semantically enriched data in Web Usage Mining (WUM). Click fraud and conversion fraud are identified as key prospective application areas for WUM. The thesis documents a conversion-fraud vulnerability of Google Analytics and proposes a defense: new clickstream acquisition software, which collects data in sufficient granularity and structure to allow data mining approaches to fraud detection. Three variants of the K-means clustering algorithm and three association rule data mining systems are evaluated and compared on real-world web usage data.
26

Landgraf, Andrew J. "Generalized Principal Component Analysis: Dimensionality Reduction through the Projection of Natural Parameters." The Ohio State University, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=osu1437610558.

27

Chao, Roger. "Data analysis for Systematic Literature Reviews." Thesis, Linnéuniversitetet, Institutionen för informatik (IK), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-105122.

Abstract:
Systematic Literature Reviews (SLR) are a powerful research tool for identifying and selecting literature to answer a certain question. However, an approach for extracting the analytical data inherent in a Systematic Literature Review's multi-dimensional dataset has been lacking, and previous Systematic Literature Review tools do not provide such analytical insight. Therefore, this thesis aims to provide a useful approach, comprising various algorithms and data treatment techniques, that gives the user analytical insight into their data beyond what is evident in the bare execution of a Systematic Literature Review. To this end, a literature review was conducted to find the most relevant techniques for extracting data from multi-dimensional data sets, and the approach was tested on a survey regarding Self-Adaptive Systems (SAS) using a web application. As a result, we identify the most adequate techniques to incorporate into the proposed approach.
28

Rooney, E. M. "The measurement and reduction of quality related costs in the process plant industry." Thesis, Cranfield University, 1987. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.380671.

29

Zhou, Dunke. "High-dimensional Data Clustering and Statistical Analysis of Clustering-based Data Summarization Products." The Ohio State University, 2012. http://rave.ohiolink.edu/etdc/view?acc_num=osu1338303646.

30

Marschall, Nicolas. "Methodological Problems with Transformation and Size Reduction of Data Sets in Network Analysis." [S.l. : s.n.], 2006. http://nbn-resolving.de/urn:nbn:de:bsz:352-opus-19420.

31

Dai, Congxia. "An advanced data acquisition system & noise analysis on the aluminum reduction process." Morgantown, W. Va. : [West Virginia University Libraries], 2003. http://etd.wvu.edu/templates/showETD.cfm?recnum=2850.

Thesis (M.S.)--West Virginia University, 2003.
Title from document title page. Document formatted into pages; contains ix, 82 p. : ill. (some col.). Includes abstract. Includes bibliographical references (p. 75-78).
32

Hildreth, John C. "The Use of Short-Interval GPS Data for Construction Operations Analysis." Diss., Virginia Tech, 2003. http://hdl.handle.net/10919/26120.

Abstract:
The global positioning system (GPS) makes use of extremely accurate measures of time to determine position. The times required for electronic signals to travel at the speed of light from at least four orbiting satellites to a receiver on earth are measured precisely and used to calculate the distances from the satellites to the receiver. The calculated distances are used to determine the position of the receiver through triangulation. This research takes an approach opposite to that of the original GPS research, focusing on the use of position to determine the time at which events occur. Specifically, this work addresses the question: can the information pertaining to position and speed contained in a GPS record be used to autonomously identify the times at which critical events occur within a production cycle? The research question was answered by determining the hardware needs for collecting the desired data in a useable format and developing a unique data collection tool to meet those needs. The tool was field-evaluated and the data collected were used to determine the software needs for automated reduction of the data to the times at which key events occurred. The software tools were developed in the form of Time Identification Modules (TIMs). The TIMs were used to reduce data collected from a load-and-haul earthmoving operation to duration measures for the load, haul, dump, and return activities. The value of the developed system was demonstrated by investigating correlations between performance times in construction operations and by using field data to verify the results obtained from productivity estimating tools. Use of the system was shown to improve knowledge and provide additional insight into operations analysis studies.
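
The core idea, recovering event times from position and speed, can be illustrated with a toy dwell detector that records when speed drops below a threshold (as during loading or dumping) and when motion resumes; this is a sketch, not the TIM logic itself:

    import numpy as np

    def dwell_events(times, speeds, stop_below=0.5):
        """Return (start, end) times of intervals where speed < stop_below (m/s)."""
        stopped = speeds < stop_below
        edges = np.flatnonzero(np.diff(stopped.astype(int)))       # state-change indices
        starts = [times[i + 1] for i in edges if not stopped[i]]   # moving -> stopped
        ends = [times[i + 1] for i in edges if stopped[i]]         # stopped -> moving
        return list(zip(starts, ends))

    t = np.arange(0, 60, 1.0)
    v = np.where((t > 10) & (t < 25), 0.1, 5.0)   # one 15 s stop in a haul cycle
    print(dwell_events(t, v))                     # [(11.0, 25.0)]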
33

Grimm, Alexander Rudolf. "Parametric Dynamical Systems: Transient Analysis and Data Driven Modeling." Diss., Virginia Tech, 2018. http://hdl.handle.net/10919/83840.

Abstract:
Dynamical systems are a commonly used and studied tool for simulation, optimization, and design. In many applications, such as inverse problems, optimal control, shape optimization, and uncertainty quantification, these systems typically depend on a parameter. The need for high fidelity in the modeling stage leads to large-scale parametric dynamical systems. Since these models need to be simulated for a variety of parameter values, the computational burden they incur becomes increasingly prohibitive. To address these issues, parametric reduced models have gained popularity in recent years. We are interested in constructing parametric reduced models that represent the full-order system accurately over a range of parameters. First, we define a global joint error measure in the frequency and parameter domain to assess the accuracy of the reduced model. Then, by assuming a rational form for the reduced model, with poles both in the frequency and parameter domain, we derive necessary conditions for an optimal parametric reduced model in this joint error measure. Similar to the nonparametric case, Hermite interpolation conditions at the reflected images of the poles characterize the optimal parametric approximant. This result extends the well-known interpolatory H2 optimality conditions of Meier and Luenberger to the parametric case. We also develop a numerical algorithm to construct locally optimal reduced models. The theory and algorithm are data-driven, in the sense that only function evaluations of the parametric transfer function are required, not access to the internal dynamics of the full model. While this first framework operates at the level of continuous functions, assuming repeated transfer function evaluations are available, in some cases merely frequency samples might be given without an option to re-evaluate the transfer function at desired points; in other words, the function samples in parameter and frequency are fixed. In this case, we construct a parametric reduced model that minimizes a discretized least-squares error over the finite set of measurements. Towards this goal, we extend Vector Fitting (VF) to the parametric case, solving a global least-squares problem in both frequency and parameter. The output of this approach may be a reduced model of moderate size; in that case, we perform a post-processing step to reduce it further using H2-optimal model reduction for a special parametrization. The final model inherits the parametric dependence of the intermediate model but is of smaller order. A special case of a parameter in a dynamical system is a delay in the model equation, arising, e.g., from a feedback loop, reaction time, delayed response, and various other physical phenomena. Modeling such a delay comes with several challenges for the mathematical formulation, analysis, and solution. We address the issue of transient behavior for scalar delay equations. Besides the choice of an appropriate measure, we analyze the impact of the coefficients of the delay equation on the finite-time growth, which can be arbitrarily large purely through the influence of the delay.
34

潤, 土田, and Jun Tsuchida. "Dimensional reduction method for three-mode three-way data based on canonical covariance analysis." Thesis, https://doors.doshisha.ac.jp/opac/opac_link/bibid/BB13056385/?lang=0, 2018. https://doors.doshisha.ac.jp/opac/opac_link/bibid/BB13056385/?lang=0.

Abstract:
Three-mode three-way data are data indexed by three finite sets (objects, variables, and conditions) and are observed in various fields such as psychology, marketing research, and economics. In this study, we extend canonical covariance analysis, a multivariate analysis method, to three-mode three-way data, proposing new dimensional reduction methods. We also include a simulation study, apply the proposed methods to real data, and compare them with existing methods.
35

Bécavin, Christophe. "Dimensionaly reduction and pathway network analysis of transcriptome data : application to T-cell characterization." Paris, Ecole normale supérieure, 2010. http://www.theses.fr/2010ENSUBS02.

36

Gaines, June Bullard. "AEGIS Data Analysis and Reduction (ADAR) in Support of the AEGIS Weapon System (AWS)." VCU Scholars Compass, 1998. http://scholarscompass.vcu.edu/etd/4622.

Abstract:
The AEGIS Weapons System (AWS), part of the AEGIS Combat System (ACS), is an integral part of the defense system on U.S. Navy AEGIS-class ships. AEGIS Data Analysis and Reduction (ADAR) has been developed to assist in the evaluation of AWS data. ADAR, along with the AWS and ACS, has evolved through the years to accommodate advances in technology and computer programming languages. Additionally, ADAR has evolved so that users located at sites other than the Naval Surface Warfare Center Dahlgren Division (NSWCDD), Dahlgren, Virginia, can reduce tactical system data and perform data analysis using the reduced data. This paper is a study of the growing pains experienced by the ADAR system as it has evolved. Some of the changes affecting ADAR have included the addition of new elements to both the AWS and the ACS, the multiplication of AEGIS baselines, and the provision of portability both to a multitude of platforms and to several sites supporting the AEGIS program.
37

Kim, Jeong Min. "REDUCTION AND ANALYSIS PROGRAM FOR TELEMETRY RECORDINGS (RAPTR): ANALYSIS AND DECOMMUTATION SOFTWARE FOR IRIG 106 CHAPTER 10 DATA." International Foundation for Telemetering, 2005. http://hdl.handle.net/10150/604912.

Abstract:
ITC/USA 2005 Conference Proceedings / The Forty-First Annual International Telemetering Conference and Technical Exhibition / October 24-27, 2005 / Riviera Hotel & Convention Center, Las Vegas, Nevada
Solid state on-board recording is becoming a revolutionary way of recording airborne telemetry data, and IRIG 106 Chapter 10, "Solid State On-Board Recorder Standard", provides interface documentation for solid state digital data acquisition. The Reduction and Analysis Program for Telemetry Recordings (RAPTR) is a standardized and extensible software application developed by the 96th Communications Group, Test and Analysis Division, at Eglin AFB, and provides a data reduction capability for disk files in Chapter 10 format. This paper provides the system description and software architecture of RAPTR and presents the 96th Communications Group's total solution for Chapter 10 telemetry data reduction.
38

Pelzel, Frank [Verfasser]. "New Strategies for Data Analysis by Dimension Reduction, of Resource Combinations and Benchmarking / Frank Pelzel." Berlin : epubli, 2020. http://d-nb.info/1219792594/34.

39

Ergin, Leanna N. "ENHANCED DATA REDUCTION, SEGMENTATION, AND SPATIAL MULTIPLEXING METHODS FOR HYPERSPECTRAL IMAGING." Cleveland State University / OhioLINK, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=csu1501871494997272.

40

Weng, Jiaying. "TRANSFORMS IN SUFFICIENT DIMENSION REDUCTION AND THEIR APPLICATIONS IN HIGH DIMENSIONAL DATA." UKnowledge, 2019. https://uknowledge.uky.edu/statistics_etds/40.

Abstract:
The big data era poses great challenges as well as opportunities for researchers to develop efficient statistical approaches to analyze massive data. Sufficient dimension reduction (SDR) is an important tool in modern data analysis and has received extensive attention in both academia and industry. In this dissertation, we introduce inverse regression estimators using Fourier transforms, which are superior to existing SDR methods in two respects: (1) they avoid slicing the response variable, and (2) they can be readily extended to the high-dimensional data problem. For the ultra-high dimensional problem, we investigate both eigenvalue decomposition and minimum discrepancy approaches to achieve optimal solutions, and we also develop a novel and efficient optimization algorithm to obtain the sparse estimates. We derive asymptotic properties of the proposed estimators and demonstrate their efficiency gains compared to the traditional estimators. The oracle properties of the sparse estimates are derived. Simulation studies and real data examples are used to illustrate the effectiveness of the proposed methods. The wavelet transform is another tool that effectively detects information from time-localization of high frequency. Parallel to our proposed Fourier transform methods, we also develop a wavelet transform version of the approach and derive the asymptotic properties of the resulting estimators.
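
For contrast with the proposed Fourier estimators, the slicing they avoid looks like this in classic sliced inverse regression (SIR); this NumPy sketch is the textbook baseline, not the thesis's method:

    import numpy as np

    def sir_directions(X, y, n_slices=10, n_dirs=1):
        """Classic SIR: eigenvectors of the covariance of slice means of whitened X."""
        n, p = X.shape
        Xc = X - X.mean(axis=0)
        L = np.linalg.cholesky(np.linalg.inv(np.cov(Xc, rowvar=False)))
        Z = Xc @ L                                          # whitened predictors
        slices = np.array_split(np.argsort(y), n_slices)    # slice the sorted response
        M = sum(len(s) / n * np.outer(Z[s].mean(0), Z[s].mean(0)) for s in slices)
        eigval, eigvec = np.linalg.eigh(M)
        return L @ eigvec[:, -n_dirs:]                      # back to the original scale

    rng = np.random.default_rng(8)
    X = rng.normal(size=(500, 5))
    y = X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=500)
    print(sir_directions(X, y).ravel())   # roughly proportional to (1, 0.5, 0, 0, 0)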
41

Self, Jessica Zeitz. "Designing and Evaluating Object-Level Interaction to Support Human-Model Communication in Data Analysis." Diss., Virginia Tech, 2016. http://hdl.handle.net/10919/70950.

Abstract:
High-dimensional data appear in all domains and are challenging to explore. As the number of dimensions in datasets increases, the harder it becomes to discover patterns and develop insights. Data analysis and exploration is an important skill given the amount of data collection in every field of work. However, learning this skill without an understanding of high-dimensional data is challenging. Users naturally tend to characterize data in simplistic one-dimensional terms using metrics such as mean, median, and mode. Real-world data is more complex. To gain the most insight from data, users need to recognize and create high-dimensional arguments. Data exploration methods can encourage thinking beyond traditional one-dimensional insights. Dimension reduction algorithms, such as multidimensional scaling, support data exploration by reducing datasets to two dimensions for visualization. Because these algorithms rely on underlying parameterizations, they may be manipulated to assess the data from multiple perspectives. Such manipulation can be difficult for users without a strong knowledge of the underlying algorithms. Visual analytics tools that afford object-level interaction (OLI) allow for the generation of more complex insights, despite inexperience with multivariate data or the underlying algorithm. The goal of this research is to develop and test variations on types of interactions for interactive visual analytic systems that enable users to tweak model parameters directly or indirectly so that they may explore high-dimensional data. To study interactive data analysis, we present an interface, Andromeda, that enables non-experts in statistical models to explore domain-specific, high-dimensional data. This application implements interactive weighted multidimensional scaling (WMDS) and allows for both parametric and observation-level interaction to provide in-depth data exploration. We performed multiple user studies to answer how parametric and object-level interaction aid in data analysis. With each study, we found usability issues and then designed solutions for the next study, and with each critique we uncovered design principles of effective, interactive visual analytic tools. The final part of this research presents these principles, supported by the results of our multiple informal and formal usability studies. The established design principles focus on human-centered usability for developing interactive visual analytic systems that enable users to analyze high-dimensional data through object-level interaction.
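
The WMDS mechanism, letting dimension weights reshape the two-dimensional layout, can be imitated by computing weighted Euclidean distances and passing them to a stock MDS solver; a simplified, non-interactive stand-in for Andromeda's loop, with scikit-learn and SciPy assumed:

    import numpy as np
    from scipy.spatial.distance import pdist, squareform
    from sklearn.manifold import MDS

    rng = np.random.default_rng(4)
    X = rng.normal(size=(50, 8))       # 50 observations, 8 dimensions
    w = np.ones(8)
    w[2] = 5.0                         # the user up-weights dimension 2

    # Weighted Euclidean distance equals ordinary distance after scaling by sqrt(w)
    D = squareform(pdist(X * np.sqrt(w)))
    layout = MDS(n_components=2, dissimilarity="precomputed",
                 random_state=0).fit_transform(D)
    print(layout.shape)                # (50, 2) layout to re-plot after each tweak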
42

Ridgeway, Gregory Kirk. "Generalization of boosting algorithms and applications of Bayesian inference for massive datasets /." Thesis, Connect to this title online; UW restricted, 1999. http://hdl.handle.net/1773/8986.

43

Upadrasta, Bharat. "Boolean factor analysis: a review of a novel method of matrix decomposition and neural network Boolean factor analysis." Diss., Online access via UMI, 2009.

Thesis (M.S.)--State University of New York at Binghamton, Thomas J. Watson School of Engineering and Applied Science, Department of Systems Science and Industrial Engineering, 2009.
Includes bibliographical references.
44

Caples, Jerry Joseph. "Variance reduction and variable selection methods for Alho's logistic capture recapture model with applications to census data." Full text (PDF) from UMI/Dissertation Abstracts International, 2000. http://wwwlib.umi.com/cr/utexas/fullcit?p9992762.

45

Safari, Katesari Hadi. "BAYESIAN DYNAMIC FACTOR ANALYSIS AND COPULA-BASED MODELS FOR MIXED DATA." OpenSIUC, 2021. https://opensiuc.lib.siu.edu/dissertations/1948.

Abstract:
Available statistical methodologies focus mostly on accommodating continuous variables; recently, however, dealing with count data has received considerable interest in the statistical literature. In this dissertation, we propose statistical approaches to investigate linear and nonlinear dependencies between two discrete random variables, or between a discrete and a continuous random variable. Copula functions are powerful tools for modeling dependencies between random variables. We derive a copula-based population version of Spearman's rho for the case where at least one of the marginal distributions is discrete. In each case, the functional relationship between Kendall's tau and Spearman's rho is obtained. The asymptotic distributions of the proposed estimators of these association measures are derived, their corresponding confidence intervals are constructed, and tests of independence are developed. Then, we propose a Bayesian copula factor autoregressive model for mixed-type time series data. This model assumes conditional independence and shares latent factors in both the mixed-type response and the multivariate predictor variables of the time series through a quadratic time-series regression model. It is able to reduce dimensionality by accommodating latent factors in both the response and predictor variables of the high-dimensional time series data. A semiparametric time series extended rank likelihood technique is applied to the marginal distributions to handle mixed-type predictors of the high-dimensional time series, which decreases the number of estimated parameters and provides an efficient computational algorithm. In order to update and compute the posterior distributions of the latent factors and other parameters of the models, we propose a naive Bayesian algorithm with Metropolis-Hastings and Forward Filtering Backward Sampling methods. We evaluate the performance of the proposed models and methods through simulation studies. Finally, each proposed model is applied to a real dataset.
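
The sample versions of these association measures are available in SciPy; a small illustration with one discrete and one continuous margin (the dissertation's copula-based population versions and their asymptotics are the novel part and are not reproduced here):

    import numpy as np
    from scipy.stats import kendalltau, norm, poisson, spearmanr

    rng = np.random.default_rng(5)
    z = rng.normal(size=500)
    x = poisson.ppf(norm.cdf(z), mu=3)        # discrete margin driven by z
    y = z + rng.normal(0, 0.5, size=500)      # continuous margin

    rho, rho_p = spearmanr(x, y)
    tau, tau_p = kendalltau(x, y)
    print(f"Spearman rho = {rho:.3f} (p = {rho_p:.2g}), Kendall tau = {tau:.3f}")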
46

Kidzinski, Lukasz. "Inference for stationary functional time series: dimension reduction and regression." Doctoral thesis, Universite Libre de Bruxelles, 2014. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/209226.

Abstract:
Continuous progress in data storage and collection techniques makes it possible to observe and record processes in an almost continuous fashion. Examples include climate data, financial transaction values, models of pollution levels, etc. To analyse these processes, appropriate statistical tools are needed. One well-known technique is functional data analysis (FDA).

The main objective of this doctoral project is to analyse temporal dependence in FDA. Such dependence occurs, for example, if the data are constructed from a continuous-time process that has been cut into segments, days for example. We are then in the setting of functional time series.

The first part of the thesis concerns functional linear regression, an extension of multivariate regression. We derive a data-driven method for choosing the dimension of the estimator. In contrast to existing results, this method does not require unverifiable assumptions.

In the second part, we analyse dynamic functional linear models (DFLM) in order to extend the well-established linear models to a setting of temporal dependence. We obtain estimators and statistical tests by methods of harmonic analysis. We draw on ideas of Brillinger, who studied these models in a vector-space context.
47

Löhr, Andrea. "A noise reduction method based upon statistical analysis for the detection of weak signals in discrete data." [S.l.] : [s.n.], 2003. http://deposit.ddb.de/cgi-bin/dokserv?idn=968817505.

48

Gapper, Justin J. "Bias Reduction in Machine Learning Classifiers for Spatiotemporal Analysis of Coral Reefs using Remote Sensing Images." Chapman University Digital Commons, 2019. https://digitalcommons.chapman.edu/cads_dissertations/2.

Abstract:
This dissertation is an evaluation of the generalization characteristics of machine learning classifiers as applied to the detection of coral reefs using remote sensing images. Three scientific studies were conducted as part of this research: 1) evaluation of the spatial generalization characteristics of a robust classifier as applied to coral reef habitats in remote islands of the Pacific Ocean; 2) coral reef change detection in remote Pacific islands using Support Vector Machine classifiers; and 3) a generalized machine learning classifier for spatiotemporal analysis of coral reefs in the Red Sea. The aim of this dissertation is to propose and evaluate a methodology for developing a robust machine learning classifier that can effectively be deployed to accurately detect coral reefs at scale. The hypothesis is that Landsat data can be used to train a classifier to detect coral reefs in remote sensing imagery and that this classifier can be trained to generalize across multiple sites. Another objective is to identify how well different classifiers perform under the generalized conditions and how unique the spectral signature of coral is as environmental conditions vary across observation sites. A methodology for validating the generalization performance of a classifier to unseen locations is proposed and implemented (Controlled Parameter Cross-Validation). Analysis is performed using satellite imagery from nine different locations with known coral reefs (six Pacific Ocean sites and three Red Sea sites). Ground truth observations for four of the Pacific Ocean sites and two of the Red Sea sites were used to validate the proposed methodology. Within the Pacific Ocean sites, the consolidated classifier (trained on data from all sites) yielded an accuracy of 75.5% (0.778 AUC). Within the Red Sea sites, the consolidated classifier yielded an accuracy of 71.0% (0.7754 AUC). Finally, long-term change detection analysis is conducted for each of the sites evaluated. In total, over 16,700 km2 was analyzed for benthic cover type and cover change detection. Within the Pacific Ocean sites, decreases in coral cover ranged from 25.3% (Kingman Reef) to 42.7% (Kiritimati Island). Within the Red Sea sites, decreases in coral cover ranged from 3.4% (Umluj) to 13.6% (Al Wajh).
49

Sakarya, Hatice. "A Contribution To Modern Data Reduction Techniques And Their Applications By Applied Mathematics And Statistical Learning." Master's thesis, METU, 2010. http://etd.lib.metu.edu.tr/upload/12612819/index.pdf.

Abstract:
High-dimensional data arise in areas ranging from digital image processing, gene expression microarrays and neuronal population activities to financial time series. Dimensionality reduction, extracting low-dimensional structure from high-dimensional data, is a key problem in many areas such as information processing, machine learning, data mining, information retrieval and pattern recognition, where a number of data reduction techniques have been developed. In this thesis we give a survey of modern data reduction techniques, representing the state of the art in theory, methods and applications, and introduce the mathematical language behind them. This requires special care concerning questions such as how to understand discrete structures as manifolds, how to identify their structure in preparation for dimension reduction, and how to handle the complexity of the algorithmic methods. Special emphasis is placed on Principal Component Analysis, Locally Linear Embedding and the Isomap algorithm. These algorithms have been studied by a research group from Vilnius, Lithuania, by Zeev Volkovich from the Software Engineering Department, ORT Braude College of Engineering, Karmiel, and by others. The main purpose of this study is to compare the results of these three algorithms, focusing on both the results and the running time.
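
The three emphasized algorithms can be compared directly on a benchmark manifold such as the Swiss roll; a minimal scikit-learn sketch of such a comparison of results and running time (the thesis's datasets and timings are not reproduced):

    import time
    import numpy as np
    from sklearn.datasets import make_swiss_roll
    from sklearn.decomposition import PCA
    from sklearn.manifold import Isomap, LocallyLinearEmbedding

    X, color = make_swiss_roll(n_samples=1000, random_state=0)
    models = [("PCA", PCA(n_components=2)),
              ("LLE", LocallyLinearEmbedding(n_neighbors=12, n_components=2)),
              ("Isomap", Isomap(n_neighbors=12, n_components=2))]

    for name, model in models:
        t0 = time.perf_counter()
        Y = model.fit_transform(X)
        print(f"{name}: embedding {Y.shape}, {time.perf_counter() - t0:.2f} s")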
50

Stenberg, Erik. "SEQUENTIAL A/B TESTING USING PRE-EXPERIMENT DATA." Thesis, Uppsala universitet, Statistiska institutionen, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-385253.

Abstract:
This thesis bridges the gap between two popular methods for achieving more efficient online experiments: sequential tests and variance reduction with pre-experiment data. Through simulations, it is shown that there is efficiency to be gained by using control variates sequentially along with the popular mixture Sequential Probability Ratio Test. More efficient tests lead to faster decisions and smaller required sample sizes. The proposed technique is also tested using empirical data on users of the music streaming service Spotify. An R package which includes the main tests applied in this thesis is also presented.
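
The combination can be sketched as follows: each observation is first adjusted with a pre-experiment covariate (a CUPED-style control variate), and the adjusted treated-minus-control differences feed a mixture Sequential Probability Ratio Test under normality. The mixture variance and data below are illustrative; this is a NumPy sketch, not the thesis's R package:

    import numpy as np

    def msprt(d, sigma2, tau2=1.0):
        """Mixture SPRT statistic for H0: mean(d) = 0 with d_i ~ N(theta, sigma2)
        and a N(0, tau2) mixing distribution over theta (Johari et al. style)."""
        n, s = len(d), np.sum(d)
        return (np.sqrt(sigma2 / (sigma2 + n * tau2))
                * np.exp(tau2 * s**2 / (2 * sigma2 * (sigma2 + n * tau2))))

    rng = np.random.default_rng(6)
    n = 5000
    x = rng.normal(10, 2, size=(n, 2))                     # pre-experiment metric
    y = 0.8 * x + rng.normal([0.0, 0.1], 1, size=(n, 2))   # experiment metric, +0.1 lift

    theta = np.cov(y.ravel(), x.ravel())[0, 1] / np.var(x)
    y_cv = y - theta * (x - x.mean())                      # control-variate adjustment

    d = y_cv[:, 1] - y_cv[:, 0]                            # treated minus control
    alpha = 0.05
    for m in range(100, n + 1, 100):                       # monitor as data arrive
        if msprt(d[:m], sigma2=np.var(d[:m])) >= 1 / alpha:
            print(f"rejected H0 after {m} pairs")
            break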