Dissertations / Theses on the topic 'Principal Component Analysis (PCA)'

Consult the top 50 dissertations / theses for your research on the topic 'Principal Component Analysis (PCA).'


1

Solat, Karo. "Generalized Principal Component Analysis." Diss., Virginia Tech, 2018. http://hdl.handle.net/10919/83469.

Full text
Abstract:
The primary objective of this dissertation is to extend classical Principal Component Analysis (PCA), which aims to reduce the dimensionality of a large number of normally distributed, interrelated variables, in two directions. The first is to go beyond the static (contemporaneous or synchronous) covariance matrix among these interrelated variables to include certain forms of temporal (over-time) dependence. The second is to extend the PCA model beyond the Normal multivariate distribution to the Elliptically Symmetric family of distributions, which includes the Normal, the Student's t, the Laplace and the Pearson type II distributions as special cases. The result of these extensions is called Generalized Principal Component Analysis (GPCA). The GPCA is illustrated using both Monte Carlo simulations and an empirical study, in an attempt to demonstrate the enhanced reliability of these more general factor models in the context of out-of-sample forecasting. The empirical study examines the predictive capacity of the GPCA method in the context of exchange rate forecasting, showing how the GPCA method dominates forecasts based on existing standard methods, including random walk models, with or without macroeconomic fundamentals.
Ph.D.
APA, Harvard, Vancouver, ISO, and other styles
2

Li, Liubo. "Trend-Filtered Projection for Principal Component Analysis." The Ohio State University, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=osu1503277234178696.

Full text
3

Renkjumnong, Wasuta. "SVD and PCA in Image Processing." Digital Archive @ GSU, 2007. http://digitalarchive.gsu.edu/math_theses/31.

Full text
Abstract:
The Singular Value Decomposition is one of the most useful matrix factorizations in applied linear algebra, and Principal Component Analysis has been called one of the most valuable results of applied linear algebra. How and why principal component analysis is intimately related to the technique of singular value decomposition is shown, and their properties and applications are described. The assumptions behind these techniques, as well as possible extensions to overcome their limitations, are considered. This understanding leads to real-world applications, in particular image processing of neurons: noise reduction and edge detection of neuron images are investigated.
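The PCA-SVD relationship the abstract describes can be sketched in a few lines; this is a generic illustration with random data, not code from the thesis:

```python
import numpy as np

# Hypothetical data matrix: 6 samples, 3 variables.
rng = np.random.default_rng(0)
X = rng.normal(size=(6, 3))

# Center the columns; PCA assumes mean-zero variables.
Xc = X - X.mean(axis=0)

# SVD of the centered data: Xc = U @ diag(s) @ Vt.
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# The rows of Vt are the principal directions (eigenvectors of the
# covariance matrix), and the squared singular values, scaled by
# (n - 1), are the covariance eigenvalues.
eigvals_from_svd = s**2 / (Xc.shape[0] - 1)

# Cross-check against the eigendecomposition of the covariance matrix.
cov = np.cov(Xc, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]
```

Working through the SVD route is usually preferred in practice because it avoids forming the covariance matrix explicitly.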
4

Allemang, Matthew R. "Comparison of Automotive Structures Using Transmissibility Functions and Principal Component Analysis." University of Cincinnati / OhioLINK, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1367944783.

Full text
5

Bianchi, Marcelo Franceschi de. "Extração de características de imagens de faces humanas através de wavelets, PCA e IMPCA." Universidade de São Paulo, 2006. http://www.teses.usp.br/teses/disponiveis/18/18133/tde-10072006-002119/.

Full text
Abstract:
Pattern recognition in images is an area of great interest in the scientific world. So-called feature extraction methods are able to extract features from images and also to reduce the dimensionality of the data, generating the so-called feature vector. Given a query image, the focus of a human face image recognition system is to search an image database for the image most similar to the query image, according to a given criterion. This research work was directed at generating feature vectors for an image recognition system over databases of human face images, to support this kind of query. A feature vector is a numerical representation of an image, or part of it, describing its most representative details; it is an n-dimensional vector containing these values. This new representation of the image benefits the image recognition process by reducing the dimensionality of the data. An alternative approach to characterizing images for a human face recognition system is a domain transform. The main advantage of a transform is its effective characterization of local image properties. Wavelets differ from traditional Fourier techniques in the way they localize information in the time-frequency plane; essentially, they are able to change from one resolution to another, which makes them especially suitable for analysis, representing the signal in different frequency bands, each with a distinct resolution corresponding to its scale. Wavelets have been successfully applied to image compression, enhancement, analysis, classification, characterization and retrieval.
One of the areas where these properties have proved highly relevant is computer vision, through the representation and description of images. This work describes an approach to the recognition of human face images with feature extraction based on multiresolution wavelet decomposition, using the Haar, Daubechies, Biorthogonal, Reverse Biorthogonal, Symlet and Coiflet filters. The PCA (Principal Component Analysis) and IMPCA (Image Principal Component Analysis) techniques were tested together, and the best results were obtained using the Biorthogonal wavelet with the IMPCA technique.
Image pattern recognition is an area of great interest in the scientific world. Feature extraction methods extract features from images, reduce the dimensionality of the data and generate the feature vector. Given a query image, the goal of a feature extraction system is to search the database and return the images most similar to the query image according to a given criterion. Our research addresses the generation of feature vectors for an image recognition system over human face databases. A feature vector is a numeric representation of an image, or part of it, covering its most representative aspects; it is an n-dimensional vector organizing such values. This new image representation can be stored in a database and allows fast image retrieval. An alternative for image characterization in a human face recognition system is the domain transform. The principal advantage of a transform is its effective characterization of local image properties. In the past few years, research in applied mathematics and signal processing has developed practical wavelet methods for the multiscale representation and analysis of signals. These new tools differ from traditional Fourier techniques in the way they localize information in the time-frequency plane; in particular, they are capable of trading one type of resolution for the other, which makes them especially suitable for the analysis of non-stationary signals. The wavelet transform is a set of basis functions that represents signals in different frequency bands, each with a resolution matching its scale. Wavelets have been successfully applied to image compression, enhancement, analysis, classification, characterization and retrieval. One privileged area of application where these properties have been found relevant is computer vision, especially human face imaging.
In this work we describe an approach to image recognition for human face databases focused on feature extraction based on multiresolution wavelet decomposition, using the Biorthogonal, Reverse Biorthogonal, Symlet, Coiflet, Daubechies and Haar filters. The PCA (Principal Component Analysis) and IMPCA (Image Principal Component Analysis) techniques were tested together.
6

Anjasmara, Ira Mutiara. "Spatio-temporal analysis of GRACE gravity field variations using the principal component analysis." Thesis, Curtin University, 2008. http://hdl.handle.net/20.500.11937/957.

Full text
Abstract:
The Gravity Recovery and Climate Experiment (GRACE) mission has amplified our knowledge of both the static and the time-variable parts of the Earth's gravity field. Currently, GRACE maps the Earth's gravity field with near-global coverage and over a five-year period, which makes it possible to apply statistical analysis techniques to the data. The objective of this study is to analyse the most dominant spatial and temporal variability of the Earth's gravity field observed by GRACE using a combination of analytical and statistical methods, namely Harmonic Analysis (HA) and Principal Component Analysis (PCA). The HA is used to gain general information about the variability, whereas the PCA is used to find the most dominant spatial and temporal variability components without having to introduce any presetting. The latter is an important property that allows for the detection of anomalous or a-periodic behaviour, which is useful for the study of various geophysical processes such as the effects of earthquakes. The analyses are performed for the whole globe as well as for the regional areas of Sumatra-Andaman, Australia, Africa, Antarctica, South America, the Arctic, Greenland, South Asia, North America and Central Europe. On a global scale the most dominant temporal variation is an annual signal followed by a linear trend. Similar results, mostly associated with changing land hydrology and/or snow cover, are obtained for most regional areas except over the Arctic and Antarctic, where the secular trend is the prevailing temporal variability. Apart from these well-known signals, this contribution also demonstrates that the PCA is able to reveal longer-periodic and a-periodic signals. A prominent example of the latter is the gravity signal of the Sumatra-Andaman earthquake in late 2004. In an attempt to isolate these signals, the linear trend and annual signal are removed from the original data and the PCA is once again applied to the reduced data.
For a complete overview of these results, the most dominant PCA modes for the global and regional gravity field solutions are presented and discussed.
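The reduction step described above (removing the linear trend and annual signal before re-applying PCA) can be sketched as follows; the data are a synthetic stand-in for the GRACE fields, and all sizes and noise levels are invented for illustration:

```python
import numpy as np

# Synthetic stand-in for a GRACE-like data set: 60 monthly epochs
# at 50 grid points (the real fields are spherical-harmonic grids).
rng = np.random.default_rng(1)
n_t, n_p = 60, 50
t = np.arange(n_t) / 12.0                      # time in years
trend = np.outer(t, rng.normal(size=n_p))      # per-point linear trend
annual = np.outer(np.sin(2 * np.pi * t), rng.normal(size=n_p))
X = trend + annual + 0.1 * rng.normal(size=(n_t, n_p))

# Least-squares design matrix: intercept, trend, annual sine/cosine.
A = np.column_stack([np.ones(n_t), t,
                     np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
coef, *_ = np.linalg.lstsq(A, X, rcond=None)
residual = X - A @ coef                        # the "reduced" data

# PCA on the reduced field: the leading modes now reflect the
# remaining (a-periodic or longer-period) variability.
U, s, Vt = np.linalg.svd(residual - residual.mean(axis=0),
                         full_matrices=False)
explained = s**2 / np.sum(s**2)
```

The per-point least-squares fit plays the role of the HA step; the SVD of the residual field gives the PCA modes.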
7

Ragozzine, Brett A. "Modeling the Point Spread Function Using Principal Component Analysis." Ohio University / OhioLINK, 2008. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1224684806.

Full text
8

Jot, Sapan. "pcaL1: An R Package of Principal Component Analysis using the L1 Norm." VCU Scholars Compass, 2011. http://scholarscompass.vcu.edu/etd/2488.

Full text
Abstract:
Principal component analysis (PCA) is a dimensionality reduction tool which captures the features of a data set in a low-dimensional subspace. Traditional PCA uses the L2 norm and has desirable orthogonality properties, but is sensitive to outliers. PCA using the L1 norm has been proposed as an alternative to counter the effect of outliers. The R environment for statistical computing already provides the L2-PCA function prcomp(), but there are not many options for L1-norm PCA methods. The goal of this research was to create one R package offering different PCA methods using the L1 norm. We chose three different L1-PCA algorithms: PCA-L1 proposed by Kwak [10], L1-PCA* by Brooks et al. [1], and L1-PCA by Ke and Kanade [9]. These were combined into the package pcaL1 in R, interfacing with C implementations of the algorithms. CLP, an open-source solver for linear programming problems, is used to solve the optimization problems for L1-PCA* and L1-PCA. We use this package on human microbiome data to investigate the relationship between people based on colonizing bacteria.
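A minimal sketch of Kwak's PCA-L1 sign-flipping iteration, one of the three algorithms wrapped by the package; this is a from-scratch illustration in Python, not the package's R/C code:

```python
import numpy as np

def pca_l1_component(X, n_iter=100, seed=0):
    """First L1-norm principal direction via Kwak's sign-flipping
    iteration: maximize sum_i |w . x_i| subject to ||w|| = 1."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        signs = np.sign(X @ w)
        signs[signs == 0] = 1.0          # tie-break on the boundary
        w_new = X.T @ signs
        w_new /= np.linalg.norm(w_new)
        if np.allclose(w_new, w):        # converged
            break
        w = w_new
    return w

# Centered data whose variance is concentrated along the first axis.
rng = np.random.default_rng(2)
X = rng.normal(size=(50, 2)) * np.array([3.0, 0.5])
X -= X.mean(axis=0)
w1 = pca_l1_component(X)
```

Because each update only flips signs and re-normalizes, the iteration is robust to outliers, which is the point of L1-PCA.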
9

Yang, Libin. "An Application of Principal Component Analysis to Stock Portfolio Management." Thesis, University of Canterbury. Department of economics and finance, 2015. http://hdl.handle.net/10092/10293.

Full text
Abstract:
This thesis investigates the application of principal component analysis to the Australian stock market, using the ASX200 index and its constituents from April 2000 to February 2014. The first ten principal components were retained to represent the major risk sources in the stock market. We constructed a portfolio based on each of the ten principal components and named these "principal portfolios".
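Constructing "principal portfolios" from a return matrix can be sketched as follows; the return data below are simulated, not the ASX200 series used in the thesis:

```python
import numpy as np

# Hypothetical return matrix: 250 days x 8 stocks (a stand-in for
# the ASX200 constituent returns).
rng = np.random.default_rng(3)
R = rng.normal(scale=0.01, size=(250, 8))

Rc = R - R.mean(axis=0)
cov = np.cov(Rc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]          # sort descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Each eigenvector defines the weights of one "principal portfolio";
# its return series is the projection of the stock returns onto it.
principal_returns = Rc @ eigvecs

# Principal portfolios are mutually uncorrelated by construction:
# their covariance is the diagonal matrix of eigenvalues.
pp_cov = np.cov(principal_returns, rowvar=False)
```

The mutual uncorrelatedness is what makes each principal portfolio interpretable as a distinct risk source.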
10

Anjasmara, Ira Mutiara. "Spatio-temporal analysis of GRACE gravity field variations using the principal component analysis." Curtin University of Technology, Department of Spatial Sciences, 2008. http://espace.library.curtin.edu.au:80/R/?func=dbin-jump-full&object_id=18720.

Full text
Abstract:
The Gravity Recovery and Climate Experiment (GRACE) mission has amplified our knowledge of both the static and the time-variable parts of the Earth's gravity field. Currently, GRACE maps the Earth's gravity field with near-global coverage and over a five-year period, which makes it possible to apply statistical analysis techniques to the data. The objective of this study is to analyse the most dominant spatial and temporal variability of the Earth's gravity field observed by GRACE using a combination of analytical and statistical methods, namely Harmonic Analysis (HA) and Principal Component Analysis (PCA). The HA is used to gain general information about the variability, whereas the PCA is used to find the most dominant spatial and temporal variability components without having to introduce any presetting. The latter is an important property that allows for the detection of anomalous or a-periodic behaviour, which is useful for the study of various geophysical processes such as the effects of earthquakes. The analyses are performed for the whole globe as well as for the regional areas of Sumatra-Andaman, Australia, Africa, Antarctica, South America, the Arctic, Greenland, South Asia, North America and Central Europe. On a global scale the most dominant temporal variation is an annual signal followed by a linear trend. Similar results, mostly associated with changing land hydrology and/or snow cover, are obtained for most regional areas except over the Arctic and Antarctic, where the secular trend is the prevailing temporal variability.
Apart from these well-known signals, this contribution also demonstrates that the PCA is able to reveal longer-periodic and a-periodic signals. A prominent example of the latter is the gravity signal of the Sumatra-Andaman earthquake in late 2004. In an attempt to isolate these signals, the linear trend and annual signal are removed from the original data and the PCA is once again applied to the reduced data. For a complete overview of these results, the most dominant PCA modes for the global and regional gravity field solutions are presented and discussed.
11

Balasubramanian, Vijay. "Variance reduction and outlier identification for IDDQ testing of integrated chips using principal component analysis." Texas A&M University, 2006. http://hdl.handle.net/1969.1/4766.

Full text
Abstract:
Integrated circuits manufactured in current technology consist of millions of transistors with dimensions shrinking into the nanometer range. These small transistors have quiescent (leakage) currents that are increasingly sensitive to process variations, which have increased the variation in good-chip quiescent current and consequently reduced the effectiveness of IDDQ testing. This research proposes the use of a multivariate statistical technique known as principal component analysis for the purpose of variance reduction. Outlier analysis is applied to the reduced leakage current values as well as the good chip leakage current estimate, to identify defective chips. The proposed idea is evaluated using IDDQ values from multiple wafers of an industrial chip fabricated in 130 nm technology. It is shown that the proposed method achieves significant variance reduction and identifies many outliers that escape identification by other established techniques. For example, it identifies many of the absolute outliers in bad neighborhoods, which are not detected by Nearest Neighbor Residual and Nearest Current Ratio. It also identifies many of the spatial outliers that pass when using Current Ratio. The proposed method also identifies both active and passive defects.
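The variance-reduction idea (remove the shared process component with PCA, then apply outlier analysis to the residual leakage) might look like this on synthetic IDDQ data; the sizes, noise levels, and the median-based threshold are illustrative assumptions, not the thesis's procedure:

```python
import numpy as np

# Hypothetical IDDQ data: 200 chips x 10 test vectors. Good chips share
# a common process-driven leakage level; five defective chips draw
# extra current on three of the test vectors.
rng = np.random.default_rng(4)
process = rng.lognormal(mean=0.0, sigma=0.3, size=(200, 1))
sensitivity = rng.uniform(0.8, 1.2, size=(1, 10))
iddq = process * sensitivity + 0.02 * rng.normal(size=(200, 10))
iddq[:5, :3] += 1.0                  # defect current on vectors 0-2

# PCA: the leading component captures the shared process variation.
Xc = iddq - iddq.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# Variance reduction: subtract the first component, then flag chips
# whose residual leakage energy is an outlier (median + 5 MAD gate).
k = 1
residual = Xc - (U[:, :k] * s[:k]) @ Vt[:k]
score = np.sqrt((residual ** 2).sum(axis=1))
med = np.median(score)
threshold = med + 5 * np.median(np.abs(score - med))
outliers = np.flatnonzero(score > threshold)
```

Removing the process-driven component first is what lets a modest defect current stand out against the large good-chip variation.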
12

Le, Hanh T. Banking & Finance, Australian School of Business, UNSW. "Discrete PCA: an application to corporate governance research." Awarded by: University of New South Wales. Banking & Finance, 2007. http://handle.unsw.edu.au/1959.4/40753.

Full text
Abstract:
This thesis introduces the application of discrete Principal Component Analysis (PCA) to corporate governance research. Given the presence of many discrete variables in typical governance studies, I argue that this method is superior to the standard PCA that has been employed by others working in the area. Using a dataset of 244 companies listed on the London Stock Exchange in the year 2002-2003, I find that Pearson's correlations underestimate the strength of association between two variables when at least one of them is discrete. Accordingly, standard PCA performed on the Pearson correlation matrix results in biased estimates. Applying discrete PCA to the polychoric correlation matrix, I extract 10 significant factors from 28 corporate governance variables. These factors represent eight main aspects of the governance system, namely auditor reputation, large shareholder influence, size of board committees, social responsibility, risk optimisation, director independence level, female representation and institutional ownership. Finally, I investigate the relationship between corporate governance, measured by the extracted factors, and a firm's long-run share market performance. Consistent with Demsetz's (1983) argument, I document limited explanatory power for these governance factors.
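The factor-extraction step (PCA on a correlation matrix) can be sketched as below. Estimating the polychoric correlations themselves is a separate, more involved step; here a small correlation matrix is supplied directly, and the eigenvalue-greater-than-one retention rule is an illustrative assumption, not necessarily the thesis's criterion:

```python
import numpy as np

# A small example correlation matrix standing in for the 28x28
# polychoric matrix used in the thesis.
corr = np.array([
    [1.0, 0.6, 0.3],
    [0.6, 1.0, 0.2],
    [0.3, 0.2, 1.0],
])

# PCA on a correlation matrix is its eigendecomposition.
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]          # sort descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Kaiser-style rule: retain components with eigenvalue > 1
# (for a correlation matrix the eigenvalues sum to its dimension).
retained = eigvecs[:, eigvals > 1.0]
```

Swapping the Pearson matrix for the polychoric one changes only the input `corr`; the extraction machinery is identical.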
13

Landgraf, Andrew J. "Generalized Principal Component Analysis: Dimensionality Reduction through the Projection of Natural Parameters." The Ohio State University, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=osu1437610558.

Full text
14

Aguirre, Jurado Ricardo. "Resilient Average and Distortion Detection in Sensor Networks." ScholarWorks@UNO, 2009. http://scholarworks.uno.edu/td/962.

Full text
Abstract:
In this paper a resilient sensor network is built in order to lessen the effects of a small portion of corrupted sensors when an aggregated result, such as the average, needs to be obtained. By examining the variance in sensor readings, a change in the pattern can be spotted and minimized in order to maintain a stable aggregated reading. Offsets in sensor readings are also analyzed and compensated for, to help reduce a bias change in the average. These two analytical techniques are later combined in a Kalman filter to produce a smooth and resilient average from the readings of the individual sensors. In addition, principal component analysis is used to detect variations in the sensor network. Experiments are conducted using real sensors (MICAz motes), which are used to gather light measurements in a small area and report the average light level generated in that area.
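The combination of offset/variance screening with a scalar Kalman filter might be sketched as follows; the sensor model, gate width, and noise parameters are invented for illustration, not taken from the paper:

```python
import numpy as np

# Hypothetical network: 10 light sensors observing a constant level of
# 100 lux; sensor 0 is corrupted and drifts upward over time.
rng = np.random.default_rng(5)
true_level, n_steps, n_sensors = 100.0, 50, 10
sensor_std = 2.0
readings = true_level + rng.normal(scale=sensor_std,
                                   size=(n_steps, n_sensors))
readings[:, 0] += np.linspace(0.0, 40.0, n_steps)   # corrupted sensor

x, p = readings[0].mean(), 10.0   # state estimate and its variance
q = 0.01                          # process noise (slowly varying level)
r = sensor_std**2 / n_sensors     # variance of the averaged measurement
estimates = []
for z_row in readings:
    # Screening: drop readings whose deviation from the current
    # estimate breaks the expected pattern, then average survivors.
    keep = np.abs(z_row - x) < 3 * sensor_std
    z = z_row[keep].mean()
    # Scalar Kalman update on a constant-level model.
    p += q
    gain = p / (p + r)
    x += gain * (z - x)
    p *= 1.0 - gain
    estimates.append(x)
```

The gate keeps the drifting sensor out of the average once its offset grows, so the filtered estimate stays near the true level.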
15

Marques, Miguel Alexandre Castanheira. "On-line system for faults detection in induction motors based on PCA." Master's thesis, Faculdade de Ciências e Tecnologia, 2012. http://hdl.handle.net/10362/8578.

Full text
Abstract:
Dissertation to obtain the degree of Master in Electrical and Computer Engineering
Nowadays in industry there are many processes where human intervention is replaced by electrical machines, especially induction machines, owing to their robustness, performance and low cost. Although induction machines are highly reliable devices, they are also susceptible to faults; therefore, the study of the induction machine's state is essential to reduce human and financial costs. Faults in induction machines can be divided mainly into two types: electrical faults and mechanical faults. Electrical faults represent between 40% and 50% of reported faults and essentially comprise two kinds: stator unbalances and broken rotor bars. Taking into account the high dependency on induction machines and the massive use of automated processes at the industrial level, it is necessary to have diagnostic and monitoring systems for these machines. This work presents an on-line system for the detection and diagnosis of electrical faults in induction motors based on computer-aided monitoring of the supply currents. The main objective is to detect and identify the presence of broken rotor bars and stator short-circuits in the induction motor. The presence of faults in the machine causes characteristic disturbances in the supply currents. Through a stationary reference frame, such as the αβ transform, it is possible to extract and manipulate the results obtained from the supply currents using eigendecomposition.
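The stationary-frame step can be illustrated with the power-invariant Clarke (αβ) transform: for a balanced, healthy machine the αβ current trajectory is a circle, and faults distort that pattern. This is a generic sketch, not the thesis's implementation:

```python
import numpy as np

def clarke_transform(ia, ib, ic):
    """Power-invariant alpha-beta (Clarke) transform of the three
    supply currents."""
    alpha = np.sqrt(2.0 / 3.0) * (ia - 0.5 * ib - 0.5 * ic)
    beta = np.sqrt(2.0 / 3.0) * (np.sqrt(3.0) / 2.0) * (ib - ic)
    return alpha, beta

# Two cycles of a balanced 50 Hz three-phase supply (unit amplitude).
t = np.linspace(0.0, 0.04, 400)
w = 2 * np.pi * 50
ia = np.cos(w * t)
ib = np.cos(w * t - 2 * np.pi / 3)
ic = np.cos(w * t + 2 * np.pi / 3)
alpha, beta = clarke_transform(ia, ib, ic)

# For a balanced machine the alpha-beta trajectory has constant
# radius sqrt(3/2); stator or rotor faults make it elliptical.
radius = np.hypot(alpha, beta)
```

Deviations of this trajectory from a circle are what the eigendecomposition-based analysis then quantifies.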
16

You, Xiaozhen. "Principal Component Analysis and Assessment of Language Network Activation Patterns in Pediatric Epilepsy." FIU Digital Commons, 2010. http://digitalcommons.fiu.edu/etd/176.

Full text
Abstract:
This dissertation establishes a novel data-driven method to identify language network activation patterns in pediatric epilepsy through the use of Principal Component Analysis (PCA) on functional magnetic resonance imaging (fMRI). A total of 122 subjects' data sets from five different hospitals were included in the study through a web-based repository site designed here at FIU. Research was conducted to evaluate different classification and clustering techniques in identifying hidden activation patterns and their associations with meaningful clinical variables. The results were assessed through agreement analysis with the conventional methods of lateralization index (LI) and visual rating. What is unique in this approach is the new mechanism designed for projecting language network patterns in the PCA-based decisional space. Synthetic activation maps were randomly generated from real data sets to uniquely establish nonlinear decision functions (NDF), which are then used to classify any new fMRI activation map as typical or atypical. The best nonlinear classifier was obtained on a 4D space with a complexity (nonlinearity) degree of 7. Based on the significant association of language dominance and intensities with the top eigenvectors of the PCA decisional space, a new algorithm was deployed to delineate primary cluster members without intensity normalization. In this case, three distinct activation patterns (groups) were identified (average kappa with rating 0.65, with LI 0.76) and were characterized by the regions of: 1) the left inferior frontal gyrus (IFG) and left superior temporal gyrus (STG), considered typical for the language task; 2) the IFG, left mesial frontal lobe and right cerebellum regions, representing a variant left-dominant pattern with higher activation; and 3) the right homologues of the first pattern in Broca's and Wernicke's language areas.
Interestingly, group 2 was found to reflect a different language compensation mechanism than reorganization. Its high intensity activation suggests a possible remote effect on the right hemisphere focus on traditionally left-lateralized functions. In retrospect, this data-driven method provides new insights into mechanisms for brain compensation/reorganization and neural plasticity in pediatric epilepsy.
17

Alberi, Matteo. "La PCA per la riduzione dei dati di SPHERE IFS." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2014. http://amslaurea.unibo.it/6563/.

Full text
Abstract:
The first chapter of this thesis presents an overview of the main exoplanet detection methods: the Radial Velocity method, the Astrometric method, the Pulsar Timing method, the Transit method, the Microlensing method and, finally, the Direct Imaging method, which is examined in depth in the subsequent chapters. The second chapter presents the principles of diffraction, shows how starlight can be attenuated with a coronagraph, and describes the light-aberration phenomena caused by the instrumentation and by the distorting effects of the atmosphere that give rise to the so-called speckles; it then presents modern technical solutions, such as active and adaptive optics, which have allowed a considerable improvement in the quality of the observations. The third chapter illustrates the Differential Imaging techniques that make it possible to remove the speckles effectively and improve image contrast. The fourth presents a mathematical description of Principal Component Analysis, the statistical method used for the reduction of the astronomical data. The fifth chapter is devoted to SPHERE, the instrument designed for the Very Large Telescope (VLT); in particular, it describes its IFS spectrograph, with which the data analysed in this thesis were obtained during the test phase. The sixth chapter presents the data-reduction procedures and the application of the IDL LA_SVD algorithm, which applies Principal Component Analysis and made it possible, analogously to the Differential Imaging methods seen previously, to remove the speckles and improve the contrast of the images. The concluding part discusses the results.
18

Strindlund, Olle. "Evaluation of Homogeneity in Drug Seizures Using Near-Infrared (NIR) Hyperspectral Imaging and Principal Component Analysis (PCA)." Thesis, Linköpings universitet, Kemi, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-166747.

Full text
Abstract:
The selection of a representative sample is a delicate problem when drug seizures comprising a large number of units arrive at the Swedish National Forensic Centre (NFC). If deviating objects are found in the selected sample, additional analyses are required to investigate how representative the results are for the entire population, which puts further pressure on the operational analysis flow. With the goal of providing a tool on which forensic scientists at NFC can base their assessment of how representative a selected sample of a large drug seizure is, this project investigated the possibility of evaluating the level of homogeneity in drug seizures using near-infrared (NIR) hyperspectral imaging along with principal component analysis (PCA). A total of 27 sample groups (homogeneous, heterogeneous and seized sample groups) were analyzed and different predictive models were developed. The models were based on quantifying the variation either in the NIR spectra or in the PCA scores plots. It was shown that in the spectral range of 1300-2000 nm, using a pre-processing combination of area normalization, quadratic (second-order polynomial) detrending and mean centering, the models achieved promising predictive ability in evaluating the level of homogeneity in drug seizures. When quantifying the variation in the NIR spectra, the most promising model related the approximated signal-dependent variation to the quotient of the significant and noise explained variance given by PCA. Similarly, when quantifying the variation in the PCA scores plots, the most promising model related a rectangular area, defined by the maximum distances along PC1 and PC2, to the cumulative explained variance of the two PCs. For both models, zones could be established within which sample groups are expected to appear based on their degree of homogeneity. The two models differed in sensitivity.
However, more comprehensive studies are required to evaluate the models' applicability from an operational point of view.
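The pre-processing chain reported as most effective (area normalization, quadratic detrending, mean centering, then PCA) can be sketched on synthetic spectra; the spectral shapes, sizes and noise levels here are invented for illustration:

```python
import numpy as np

# Hypothetical NIR spectra: 20 pixels x 120 wavelength channels, with
# intensity scatter and a curved baseline on top of a common peak.
rng = np.random.default_rng(6)
n_spec, n_chan = 20, 120
x = np.linspace(0.0, 1.0, n_chan)
spectra = (np.exp(-((x - 0.5) / 0.1) ** 2)[None, :]
           * rng.uniform(0.8, 1.2, size=(n_spec, 1))
           + 0.3 * np.outer(rng.normal(size=n_spec), x ** 2)
           + 0.01 * rng.normal(size=(n_spec, n_chan)))

# 1) Area normalization: scale each spectrum to unit absolute area.
spectra = spectra / np.abs(spectra).sum(axis=1, keepdims=True)

# 2) Quadratic detrending: subtract a second-order polynomial fit
#    from each spectrum (columns of the Vandermonde matrix: x^2, x, 1).
A = np.vander(x, 3)
coef, *_ = np.linalg.lstsq(A, spectra.T, rcond=None)
spectra = spectra - (A @ coef).T

# 3) Mean centering, then PCA by SVD.
spectra = spectra - spectra.mean(axis=0)
U, s, Vt = np.linalg.svd(spectra, full_matrices=False)
explained = s**2 / np.sum(s**2)
```

The spread of the resulting scores (rows of `U * s`) is the kind of quantity the thesis's homogeneity measures are built on.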
19

Fanegan, Julius Bolude. "A Fuzzy Model for Estimating Remaining Lifetime of a Diesel Engine." University of Cincinnati / OhioLINK, 2007. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1188951646.

Full text
20

Yu, Xiaoqian. "The Impact of Latency Jitter on the Interpretation of P300 in the Assessment of Cognitive Function." Scholar Commons, 2016. http://scholarcommons.usf.edu/etd/6443.

Full text
Abstract:
When stimulus processing time varies in an oddball paradigm, the latency of the P300 will vary across trials. In an oddball task requiring difficult response selections, as the variation in stimulus processing time increases, so does the variation of the P300 latency, causing latency jitter in the measurement. Averaging the P300 across trials without adjusting for this latency jitter will lead to diminished P300 amplitude, resulting in inaccurate conclusions from the data. Verleger et al. (2014) reported a diminished P300 amplitude in a difficult oddball task that required subjects to make response selections among stimuli that are difficult to distinguish, but their work did not correct for any latency jitter observed within the sample. The current study replicated the easy and hard oddball tasks conducted in Verleger et al. (2014). Raw ERPs obtained from 16 subjects indicated a successful replication of the study. An examination of the behavioral data showed that there was substantial variation in the P300 during the hard oddball tasks, and a latency jitter correction was applied in the analysis. Results indicated that there was a significant increase in the amplitude of the P300 after latency jitter correction, and that this P300 amplitude did not differ significantly between the easy and hard oddball tasks. These results suggest that a difficult decision requirement does not reduce the amplitude of the P300, and that latency jitter should be accounted for when analyzing data from tasks involving a difficult decision requirement.
21

Massaro, James. "A PCA based method for image and video pose sequencing /." Online version of thesis, 2010. http://hdl.handle.net/1850/11991.

Full text
22

Radjabi, Ryan F. "Wildfire Detection System Based on Principal Component Analysis and Image Processing of Remote-Sensed Video." DigitalCommons@CalPoly, 2016. https://digitalcommons.calpoly.edu/theses/1621.

Full text
Abstract:
Early detection and mitigation of wildfires can reduce devastating property damage, firefighting costs, pollution, and loss of life. This thesis proposes the method of Principal Component Analysis (PCA) of images in the temporal domain to identify a smoke plume in wildfires. Temporal PCA is an effective motion detector, and spatial filtering of the output principal component images can segment the smoke plume region. The effective use of other image processing techniques to identify smoke plumes and heat plumes is compared, and the best attributes of smoke plume detectors and heat plume detectors are evaluated for combination in an improved wildfire detection system. PCA of visible blue images at a sampling rate of one image every 2 seconds effectively exploits a smoke plume signal, while PCA of infrared images is the fundamental technique for exploiting a heat plume signal. A system architecture is proposed for the implementation of the image processing techniques, and the real-world deployment and usability of the system are described.
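Temporal PCA as a motion detector can be sketched as follows: each pixel is treated as a variable and each frame as an observation, so the leading principal-component image highlights pixels that change over time. The image stack below is synthetic, not remote-sensed video:

```python
import numpy as np

# Hypothetical image stack: 12 frames of 16x16 pixels sampled every
# 2 seconds; a growing bright "plume" region over a static scene.
rng = np.random.default_rng(7)
n_frames, h, w = 12, 16, 16
static = rng.uniform(size=(h, w))
frames = np.repeat(static[None], n_frames, axis=0)
for i in range(n_frames):
    frames[i, 2:5, 2:2 + i] += 0.8        # plume expands frame by frame
frames += 0.01 * rng.normal(size=frames.shape)

# Temporal PCA: flatten each frame into a row, center over time.
X = frames.reshape(n_frames, -1)
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# The first principal-component image concentrates its weight on
# pixels whose intensity changes over time (the plume); static
# pixels contribute little.
pc1_image = np.abs(Vt[0]).reshape(h, w)
moving_energy = pc1_image[2:5, 2:2 + n_frames].mean()
static_energy = pc1_image[8:, :].mean()
```

Thresholding `pc1_image` is the kind of spatial filtering step that would segment the plume region.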
APA, Harvard, Vancouver, ISO, and other styles
23

Rodeia, José Pedro dos Santos. "Analysis and recognition of similar environmental sounds." Master's thesis, FCT - UNL, 2009. http://hdl.handle.net/10362/2305.

Full text
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa for the degree of Master in Computer Science (Engenharia Informática)
Humans can identify sound sources just by hearing a sound. The same task performed by computers is called (automatic) sound recognition. Several sound recognizers have been developed over the years; their accuracy depends on the features they use and the classification method implemented. While there are many approaches to sound feature extraction and classification, most have been used to classify sounds with very different characteristics. Here, we implemented a recognizer for similar sounds: it works on sounds with very similar properties, which makes the recognition process harder. We therefore use both temporal and spectral properties of the sound, extracted using the Intrinsic Structures Analysis (ISA) method, which combines Independent Component Analysis and Principal Component Analysis. The classification method is based on the k-Nearest Neighbor algorithm. We show that features extracted in this way are powerful for sound recognition: we tested our recognizer with several sets of features retrieved by the ISA method and achieved strong results. Finally, we conducted a user study comparing human performance in distinguishing similar sounds against our recognizer. The study allowed us to conclude that the sounds are in fact very similar and difficult to distinguish, and that our recognizer is considerably better than humans at identifying them.
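The classification stage described above is a standard k-Nearest Neighbor majority vote. A minimal sketch, using synthetic 2-D feature clouds in place of the ISA-derived features (all data below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(6)

# Two classes of "similar sounds" as overlapping 2-D feature clouds,
# standing in for the ISA-derived temporal/spectral features.
n = 60
a = rng.normal([0.0, 0.0], 0.6, (n, 2))
b = rng.normal([1.5, 1.0], 0.6, (n, 2))
X = np.vstack([a, b])
y = np.array([0] * n + [1] * n)

def knn_predict(X_train, y_train, x, k=5):
    """Classify x by majority vote among its k nearest neighbours."""
    d = np.linalg.norm(X_train - x, axis=1)
    votes = y_train[np.argsort(d)[:k]]
    return np.bincount(votes).argmax()

# Leave-one-out accuracy of the k-NN classifier.
hits = sum(
    knn_predict(np.delete(X, i, 0), np.delete(y, i), X[i]) == y[i]
    for i in range(len(y))
)
acc = hits / len(y)
print(f"leave-one-out accuracy: {acc:.2f}")
```

The closer the two clouds, the harder the vote becomes, which is the "similar sounds" difficulty the thesis addresses with better features rather than a different classifier.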
APA, Harvard, Vancouver, ISO, and other styles
24

Nelson, Philip R. C., MacGregor, John F., and Taylor, Paul A. "The treatment of missing measurements in PCA and PLS models /." *McMaster only, 2002.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
25

Hassling, Andreas, and Simon Flink. "SYSTEM IDENTIFICATION OF A WASTE-FIRED CFB BOILER : Using Principal Component Analysis (PCA) and Partial Least Squares Regression modeling (PLS-R)." Thesis, Mälardalens högskola, Akademin för ekonomi, samhälle och teknik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-34979.

Full text
Abstract:
Heat and electricity production and waste management are two modern-day challenges for society. One possible solution to both is the incineration of household waste to produce heat and electricity. Incineration is a waste-to-energy treatment process which can reduce the need for landfills and save more valuable fuels, thereby conserving natural resources. This paper investigates the performance and emissions of a municipal solid waste (MSW) fueled industrial boiler by performing a system identification analysis using Principal Component Analysis (PCA) and Partial Least Squares Regression (PLS-R) modeling. The boiler, located in Västerås, Sweden, has a maximum capacity of 167 MW; it produces heat and electricity for the city of Västerås and is operated by Mälarenergi AB. A dataset containing 148 different boiler variables, measured at one-hour intervals over 2 years, was used for the system identification analysis. The dataset was visually inspected to remove obvious outliers before the analysis, which was carried out in the multivariate data analysis software The Unscrambler X (version 10.3, CAMO Software, Norway). Correlations found using PCA were taken into account during the PLS-R modeling, where a separate model was created for each response. Some variables had an unexpected impact on the models, while others were fully consistent with combustion theory. The results of the system analysis are regarded as reliable; any errors may be due to outlier data points and model inadequacies.
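A one-component PLS regression of the kind used above can be sketched with the NIPALS weight vector, which maximizes covariance between the projected predictors and the response. The data below are synthetic stand-ins for boiler logs (the variable names and model are invented, not taken from the thesis):

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic boiler log: 5 correlated process variables and one response
# (e.g. a steam-flow-like quantity) driven by a shared latent state.
n = 200
state = rng.standard_normal(n)
X = np.outer(state, rng.uniform(0.5, 1.5, 5)) + 0.3 * rng.standard_normal((n, 5))
y = 2.0 * state + 0.2 * rng.standard_normal(n)

Xc = X - X.mean(axis=0)
yc = y - y.mean()

# One-component PLS (NIPALS): the weight vector w is proportional to the
# covariance of each predictor with the response.
w = Xc.T @ yc
w /= np.linalg.norm(w)
t_score = Xc @ w                            # latent scores
b = (t_score @ yc) / (t_score @ t_score)    # inner regression coefficient

y_hat = y.mean() + b * (Xc @ w)
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"one-component PLS R^2: {r2:.3f}")
```

Unlike PCA, which chooses directions by predictor variance alone, the PLS weights are steered by the response, which is why PLS-R is the modeling step and PCA the exploratory step in the analysis above.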
APA, Harvard, Vancouver, ISO, and other styles
26

Veverka, Vojtěch. "Automatické rozměření vícesvodových EKG signálů." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2017. http://www.nusl.cz/ntk/nusl-316834.

Full text
Abstract:
This thesis deals with the automated delineation of the ECG signal. The theoretical part describes the origin and properties of the ECG signal. It then introduces principal component analysis, whose output is used as the input signal for the delineation, and describes the basic methods used for delineating the ECG signal. In the practical part, a delineation algorithm for the ECG signal was designed and tested on the standard CSE database. The results are discussed in the conclusion.
APA, Harvard, Vancouver, ISO, and other styles
27

Xian, Qing. "Statistical Assessment of Hydrochemical Characteristics of Streams and Rivers in Eastern New England." Thesis, Boston College, 2009. http://hdl.handle.net/2345/1364.

Full text
Abstract:
Thesis advisor: Rudolph Hon
This study characterizes the current state of water quality of surface streams and rivers in the eastern New England region. A set of water quality data for nine rivers, part of the USGS National Water-Quality Assessment (NAWQA) Program, was statistically evaluated to identify natural and anthropogenic persistent influential factors on water quality in surface waters. Binary analysis and multivariate analysis, mainly Principal Component Analysis (PCA) and Factor Analysis (FA), were applied to determine the least number of independent relationships among the multiple chemical components in the data set. Statistical results show that in eight of the nine rivers included in this study, four principal components can explain about 80% of the total variance of the original data. The most significant contributing factors can be identified with: (1) chemical weathering; (2) road salt applications; (3) nutrient cycling; and (4) agricultural/waste water
Thesis (MS) — Boston College, 2009
Submitted to: Boston College. Graduate School of Arts and Sciences
Discipline: Geology and Geophysics
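The "four components explain about 80% of the variance" style of result above comes from the cumulative explained-variance ratio of PCA. A minimal sketch on synthetic, factor-driven water-chemistry-like data (the dimensions and factor count are chosen for illustration only):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic water-quality table: 100 samples x 10 solutes driven by a few
# latent factors (e.g. weathering, road salt), plus measurement noise.
n, p, k = 100, 10, 4
factors = rng.standard_normal((n, k))
loadings = rng.standard_normal((k, p))
X = factors @ loadings + 0.3 * rng.standard_normal((n, p))

# PCA via SVD of the standardized data; squared singular values are
# proportional to the variance captured by each component.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
s = np.linalg.svd(Z, compute_uv=False)
explained = s**2 / np.sum(s**2)

cum4 = explained[:4].sum()
print(f"first four PCs explain {cum4:.1%} of the variance")
```

When the data really are driven by four latent influences, the cumulative ratio rises steeply for four components and then flattens, which is the pattern the study reports for eight of the nine rivers.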
APA, Harvard, Vancouver, ISO, and other styles
28

Edberg, Alexandra. "Monitoring Kraft Recovery Boiler Fouling by Multivariate Data Analysis." Thesis, KTH, Skolan för kemi, bioteknologi och hälsa (CBH), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-230906.

Full text
Abstract:
This work deals with fouling in the recovery boiler at Montes del Plata, Uruguay. Multivariate data analysis has been used to analyze the large amount of available data in order to investigate how different parameters affect the fouling problems. Principal Component Analysis (PCA) and Partial Least Square Projection (PLS) have been used in this work: PCA to compare average values between time periods with high and low fouling problems, and PLS to study the correlation structures between the variables and consequently give an indication of which parameters might be changed to improve the availability of the boiler. The results show that this recovery boiler tends to have fouling problems that may depend on the distribution of air, the black liquor pressure or the dry solids content of the black liquor. The results also show that multivariate data analysis is a powerful tool for analyzing these types of fouling problems.
APA, Harvard, Vancouver, ISO, and other styles
29

Gomes, Róbson Koszeniewski. "Uma Abordagem para Detecção Automática de Planos em Modelos Digitais de Afloramentos Baseada em PCA." Universidade do Vale do Rio dos Sinos, 2014. http://www.repositorio.jesuita.org.br/handle/UNISINOS/4465.

Full text
Abstract:
PROCERGS - Companhia de Processamento Dados do Estado Rio Grande Sul
The use of LIDAR (Light Detection and Ranging) systems for gathering spatial data has been extensively adopted in geological studies. This type of digital remote sensing delivers high resolution and accuracy, resulting in 3D digital models that allow a more detailed and quantitative analysis of heterogeneous structures such as outcrops. One such study analyzes the geometry of rock formations, where the orientation of an inclined plane is an indication for the overall understanding of the structure. This work proposes a method, based on the Principal Component Analysis (PCA) technique, to automatically compute and detect all possible planes in a point cloud. A software tool was built to visualize the digital model and select the best planes, and a study was conducted to compare and validate the results of the method against data measured in the field.
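Plane detection with PCA rests on the fact that, for points sampled from a planar patch, the covariance eigenvector with the smallest eigenvalue is the plane normal. A minimal sketch on a synthetic noisy patch (not the dissertation's code; the plane and noise level are invented):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic outcrop patch: points on the plane with unit normal (1,2,2)/3,
# jittered by a small amount of measurement noise.
normal_true = np.array([1.0, 2.0, 2.0]) / 3.0
e1 = np.array([2.0, -1.0, 0.0]) / np.sqrt(5)   # in-plane direction
e2 = np.cross(normal_true, e1)                  # second in-plane direction
u = rng.uniform(-1, 1, (200, 1))
v = rng.uniform(-1, 1, (200, 1))
pts = 2.0 * normal_true + u * e1 + v * e2 + 0.01 * rng.standard_normal((200, 3))

# PCA of the point cloud: the eigenvector of the covariance matrix with the
# SMALLEST eigenvalue is the estimated plane normal.
centered = pts - pts.mean(axis=0)
cov = centered.T @ centered / len(pts)
eigvals, eigvecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
normal_est = eigvecs[:, 0]

cos_sim = abs(normal_est @ normal_true)
print(f"alignment with true normal: {cos_sim:.4f}")
```

The smallest eigenvalue itself measures how flat the patch is, so it doubles as a quality score for ranking candidate planes, in the spirit of the tool's "best planes" selection.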
APA, Harvard, Vancouver, ISO, and other styles
30

Sanderson, Conrad, and conradsand@ieee org. "Automatic Person Verification Using Speech and Face Information." Griffith University. School of Microelectronic Engineering, 2003. http://www4.gu.edu.au:8080/adt-root/public/adt-QGU20030422.105519.

Full text
Abstract:
Identity verification systems are an important part of our everyday life. A typical example is the Automatic Teller Machine (ATM), which employs a simple identity verification scheme: the user is asked to enter their secret password after inserting their ATM card; if the password matches the one assigned to the card, the user is allowed access to their bank account. This scheme suffers from a major drawback: only the validity of the combination of a certain possession (the ATM card) and certain knowledge (the password) is verified. The ATM card can be lost or stolen, and the password can be compromised. Thus new verification methods have emerged, where the password has either been replaced by, or used in addition to, biometrics such as the person's speech, face image or fingerprints. Apart from the ATM example described above, biometrics can be applied to other areas, such as telephone & internet based banking, airline reservations & check-in, as well as forensic work and law enforcement applications. Biometric systems based on face images and/or speech signals have been shown to be quite effective. However, their performance easily degrades in the presence of a mismatch between training and testing conditions. For speech based systems this is usually in the form of channel distortion and/or ambient noise; for face based systems it can be in the form of a change in the illumination direction. A system which uses more than one biometric at the same time is known as a multi-modal verification system; it is often comprised of several modality experts and a decision stage. Since a multi-modal system uses complementary discriminative information, lower error rates can be achieved; moreover, such a system can also be more robust, since the contribution of the modality affected by environmental conditions can be decreased. This thesis makes several contributions aimed at increasing the robustness of single- and multi-modal verification systems.
Some of the major contributions are listed below. The robustness of a speech based system to ambient noise is increased by using Maximum Auto-Correlation Value (MACV) features, which utilize information from the source part of the speech signal. A new facial feature extraction technique is proposed (termed DCT-mod2), which utilizes polynomial coefficients derived from 2D Discrete Cosine Transform (DCT) coefficients of spatially neighbouring blocks. The DCT-mod2 features are shown to be robust to an illumination direction change as well as being over 80 times quicker to compute than 2D Gabor wavelet derived features. The fragility of Principal Component Analysis (PCA) derived features to an illumination direction change is solved by introducing a pre-processing step utilizing the DCT-mod2 feature extraction. We show that the enhanced PCA technique retains all the positive aspects of traditional PCA (that is, robustness to compression artefacts and white Gaussian noise) while also being robust to the illumination direction change. Several new methods for use in fusion of speech and face information under noisy conditions are proposed; these include a weight adjustment procedure, which explicitly measures the quality of the speech signal, and a decision stage comprised of a structurally noise resistant piece-wise linear classifier, which attempts to minimize the effects of noisy conditions via structural constraints on the decision boundary.
APA, Harvard, Vancouver, ISO, and other styles
31

Iriarte, Martel Jorge Hugo [UNESP]. "Caracterização de germoplasma de pupunha (Bactris gasipaes Kunth) por descritores morfológicos." Universidade Estadual Paulista (UNESP), 2002. http://hdl.handle.net/11449/102849.

Full text
Abstract:
The peach palm has great economic and social potential: it was the most important palm of pre-Columbian America, forming, together with maize and cassava, the dietary staple of native peoples. Its main products are heart of palm and fruits for direct human consumption, animal feed, flours for human consumption, and vegetable oil. The objectives of this work were to use a recommended list of morphological descriptors, first to discriminate the Pará and Putumayo landraces and, after statistical validation of the descriptors, to verify the existence of the Solimões landrace, which until now has been disputed. Univariate and multivariate statistical techniques were applied in an attempt to discriminate the landraces. Of the 42 initial descriptors, 25 showed significant differences between the landraces and 15 were approximately normally distributed. Discriminant analysis showed that 15% of the Pará plants and 14% of the Putumayo plants were misclassified; with descriptor selection for principal components, the percentages were 9% and 19%, respectively. The Manacapuru population did not form a group in the first two cluster analyses, nor with principal components. The three analyses together discriminated the Pará, Putumayo and Solimões landraces, and the discriminant analysis with three landraces classified Manacapuru within the Putumayo landrace. The most important descriptors for discriminating and classifying the landraces were: number of spikes per bunch, rachis length, fruit weight, skin thickness, ease of peeling, skin weight, fruit flavour, pulp thickness, morphological distance between fruits, and seed weight.
APA, Harvard, Vancouver, ISO, and other styles
32

Ivan, Jean-Paul. "Principal Component Modelling of Fuel Consumption ofSeagoing Vessels and Optimising Fuel Consumption as a Mixed-Integer Problem." Thesis, Mälardalens högskola, Akademin för utbildning, kultur och kommunikation, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-51847.

Full text
Abstract:
The fuel consumption of a seagoing vessel is, through a combination of Box-Cox transforms and principal component analysis, reduced to a univariate function of the primary principal component, with mean model error −3.2% and error standard deviation 10.3%. In the process, a Latin-hypercube-inspired space partitioning sampling technique is developed and successfully used to produce a representative sample used in determining the regression coefficients. Finally, a formal optimisation problem for minimising the fuel use is described. The problem is derived from a parametrised expression for the fuel consumption, and has only 3 (or 2 if simplified) free variables at each timestep. Some information has been redacted in order to comply with NDA restrictions. Most redactions are names (of vessels or otherwise), units, and in some cases (especially on figures) quantities.

Presentation was performed remotely using Zoom.
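The modelling chain described in this entry (Box-Cox transforms of the predictors, then regression on the first principal component) can be sketched as follows. The voyage variables, transform parameters, and fuel law below are invented for illustration, since the thesis's actual variables are redacted:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic voyage log driven by a latent "operating intensity"; the
# variable names and the cubic fuel law are illustrative only.
n = 300
intensity = rng.uniform(0.0, 1.0, n)
speed = 8 + 10 * intensity + rng.normal(0, 0.3, n)     # knots
rpm = 60 + 40 * intensity + rng.normal(0, 2.0, n)
load = 30 + 50 * intensity + rng.normal(0, 3.0, n)     # engine load, %
fuel = 0.02 * speed**3 + rng.normal(0, 1.0, n)         # tonnes/day

def boxcox(x, lam):
    """Box-Cox transform; lam = 0 reduces to the log transform."""
    return np.log(x) if lam == 0 else (x**lam - 1.0) / lam

# Box-Cox the predictors, standardize, and project on the first PC.
X = np.column_stack([boxcox(speed, 0.5), boxcox(rpm, 0), load])
Z = (X - X.mean(axis=0)) / X.std(axis=0)
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
pc1 = Z @ Vt[0]

# Univariate regression of fuel on the first principal component.
A = np.column_stack([np.ones(n), pc1])
coef, *_ = np.linalg.lstsq(A, fuel, rcond=None)
resid = fuel - A @ coef
r2 = 1 - resid @ resid / np.sum((fuel - fuel.mean()) ** 2)
print(f"R^2 of the PC1-only model: {r2:.3f}")
```

Because the correlated predictors share one latent driver, the first component absorbs most of their joint variation, which is what makes a univariate model of fuel consumption viable.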

APA, Harvard, Vancouver, ISO, and other styles
33

Sousa, Patrícia Ferreira Cunha [UNESP]. "Avaliação de laranjeiras doces quanto à qualidade de frutos, períodos de maturação e resistência a Guignardia citricarpa." Universidade Estadual Paulista (UNESP), 2009. http://hdl.handle.net/11449/102823.

Full text
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Despite their commercial importance, the number of orange varieties grown in Brazil is very limited. Citrus germplasm banks hold a large number of sweet orange genotypes to be explored and evaluated for botanical, genetic and agronomic traits, with the aim of increasing the genetic variability and agronomic quality of the cultivars. As part of this work, 58 sweet orange genotypes were evaluated using 9 physical characters relevant to the fresh fruit market (fruit diameter, perimeter, height and weight; thickness of the peel, albedo and pulp; and number of seeds) and 7 characters relevant to industrial quality (total titratable acidity, total soluble solids, ratio, fruit weight, juice yield, ascorbic acid, and technological index = kg soluble solids/40.8 kg). Multivariate analysis indicated the existence of variability among the genotypes both for the fresh-market physical characters and for industrial quality. Two principal components, with eigenvalues > 1, represented 66.03% of the total variance for the physical characters. The most discriminating variables in the first principal component were fruit diameter, perimeter, weight and height; the scores of this component were designated MI-CP1 (fresh market), and the genotypes with the highest values were the most suitable for the fresh fruit market. In the second principal component the most discriminating variables were endocarp thickness and juice yield, with scores designated S-CP2, physical characters well suited to industrial quality. On the scores of the two principal components (MI-CP1 and S-CP2), genotype 22-'Lanelate' stood out, followed by 43-Telde, 39-Rotuna, 44-Torregrossa, 46-Tua Mamede and 17-Grada. Regarding the evaluations of industrial quality (INDUST-CP1), the following stood out: ...(Complete abstract: click electronic access below)
APA, Harvard, Vancouver, ISO, and other styles
34

Fontes, Nayanne Maria Garcia Rego. "Monitoramento e avaliação de desempenho de sistemas MPC utilizando métodos estatísticos multivariados." Universidade Federal de Sergipe, 2017. http://ri.ufs.br:8080/xmlui/handle/123456789/5037.

Full text
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - CAPES
Monitoring of process control systems is extremely important for industry, to ensure product quality and process safety. Predictive controllers, also known as MPC (Model Predictive Control) systems, usually perform well initially; after a period, however, many factors contribute to the deterioration of their performance, which highlights the importance of monitoring MPC control systems. In this work, tools based on multivariate statistical methods are discussed and applied to the problem of monitoring and performance assessment of predictive controllers. The methods presented here are PCA (Principal Component Analysis) and ICA (Independent Component Analysis), both of which use data collected directly from the process. The first is widely used in performance assessment of predictive controllers; the second is a more recent technique that arose mainly for use in fault detection systems. The analyses are carried out on simulated processes characteristic of the petrochemical industry operating under MPC control.
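PCA-based monitoring of the kind discussed here typically tracks two statistics per observation: Hotelling's T² inside the retained principal subspace and the squared prediction error (SPE, or Q statistic) outside it. A minimal sketch on synthetic process data (not the thesis's setup; dimensions, fault, and thresholds are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)

# Training data: normal operation, 6 sensors driven by 2 latent directions.
n, p, k = 500, 6, 2
T_lat = rng.standard_normal((n, k))
P_load = rng.standard_normal((k, p))
X = T_lat @ P_load + 0.1 * rng.standard_normal((n, p))

mu, sd = X.mean(axis=0), X.std(axis=0)
Z = (X - mu) / sd
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
P = Vt[:k].T                              # retained loadings (p x k)
lam = (s[:k] ** 2) / (n - 1)              # retained eigenvalues

def t2_spe(x):
    """Hotelling T^2 and squared prediction error for one observation."""
    z = (x - mu) / sd
    t = z @ P                             # scores in the PCA subspace
    resid = z - t @ P.T                   # part not explained by the model
    return np.sum(t**2 / lam), resid @ resid

t2_ok, spe_ok = t2_spe(X[0])
# A fault that breaks the sensor correlation structure inflates the SPE.
x_fault = X[0].copy()
x_fault[3] += 5 * sd[3]
t2_f, spe_f = t2_spe(x_fault)
print(f"SPE normal={spe_ok:.2f}  fault={spe_f:.2f}")
```

In practice both statistics are compared against control limits fitted on normal-operation data; a rising SPE flags behaviour the PCA model of healthy operation cannot explain.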
APA, Harvard, Vancouver, ISO, and other styles
35

Britto, Rodrigo da Silva. "Detecção de falhas com PCA e PLS aplicados a uma planta didática." Pós-Graduação em Engenharia Elétrica, 2014. https://ri.ufs.br/handle/riufs/5012.

Full text
Abstract:
A fault monitoring system in general, which besides the detection step includes isolation, diagnosis and fault recovery, is a research area of great interest, since the occurrence of faults may have negative consequences on several levels, with social, economic and environmental impacts. With the increasing complexity of industrial processes, quick detection is often necessary, calling for an optimized fault management system so as to avoid the loss of material and human resources. This work develops a study of statistical fault detection techniques applied to a didactic plant, which comprises a simple controlled industrial process. For fault detection in this process, the main statistical methods were applied: Principal Component Analysis (PCA) and Partial Least Squares (PLS). These methods were implemented and applied to the process with the aim of a comparative analysis between them. As a result, both methods were able to detect every type of emulated fault, with little or no detection delay and with similar performance.
APA, Harvard, Vancouver, ISO, and other styles
36

Naik, Ganesh Ramachandra, and ganesh naik@rmit edu au. "Iterative issues of ICA, quality of separation and number of sources: a study for biosignal applications." RMIT University. Electrical and Computer Engineering, 2009. http://adt.lib.rmit.edu.au/adt/public/adt-VIT20090320.115103.

Full text
Abstract:
This thesis has evaluated the use of Independent Component Analysis (ICA) on surface electromyography (sEMG), focusing on biosignal applications. The research has identified and addressed four issues related to the use of ICA for biosignals: the iterative nature of ICA; the order and magnitude ambiguity problems of ICA; estimation of the number of sources based on the dependency or independency of the signals; and source separation for non-quadratic ICA (undercomplete and overcomplete). The research first establishes the applicability of ICA for sEMG and identifies the shortcomings related to order and magnitude ambiguity. It then develops a mitigation strategy for these issues by using a single unmixing matrix and a neural network weight matrix corresponding to the specific user. The research reports experimental verification of the technique and investigates the impact of inter-subject and inter-experimental variations. The results demonstrate that while using sEMG without separation gives only 60% accuracy, and sEMG separated using traditional ICA gives an accuracy of 65%, this approach gives an accuracy of 99% for the same experimental data. Besides the marked improvement in accuracy, the other advantages of such a system are that it is suitable for real-time operation and is easy for a lay user to train. The second part of this thesis reports research conducted to evaluate the use of ICA for the separation of bioelectric signals when the number of active sources may not be known. The work proposes using the value of the determinant of the global matrix, generated using sparse sub-band ICA, to identify the number of active sources. The results indicate that the technique successfully identifies the number of active muscles for complex hand gestures, supporting applications such as human-computer interfaces.
This thesis has also developed a method of determining the number of independent sources in a given mixture, and has demonstrated that using this information it is possible to separate the signals in an undercomplete situation and reduce the redundancy in the data using standard ICA methods. Experimental verification has demonstrated that the quality of separation using this method is better than that of other techniques such as Principal Component Analysis (PCA) and selective PCA. This has a number of applications, such as audio separation and sensor networks.
APA, Harvard, Vancouver, ISO, and other styles
37

Owen, Jade Denise. "Investigation of the elemental profiles of Hypericum perforatum as used in herbal remedies." Thesis, University of Hertfordshire, 2014. http://hdl.handle.net/2299/13233.

Full text
Abstract:
The work presented in this thesis has demonstrated that the use of elemental profiles for the quality control of herbal medicines can be applied at multiple stages of processing. A single method was developed for the elemental analysis of a variety of St John's Wort (Hypericum perforatum) preparations using Inductively Coupled Plasma Optical Emission Spectroscopy (ICP-OES). The optimised method used 5 ml of nitric acid and microwave digestion reaching temperatures of 185 °C. Using NIST Polish tea (NIST INCT-TL-1), the method was found to be accurate, and the matrix effect from selected St John's Wort (SJW) preparations was found to be ≤22%. The optimised method was then used to determine the elemental profiles of a larger number of SJW preparations (raw herbs = 22, tablets = 20, capsules = 12). Specifically, the method was used to determine the typical concentrations of 25 elements (Al, As, B, Ba, Be, Ca, Cd, Co, Cr, Cu, Fe, Hg, In, Mg, Mn, Mo, Ni, Pb, Pt, Sb, Se, Sr, V, Y and Zn) for each form of SJW, which ranged from not detected to 200 mg/g. To further interpret the elemental profiles, Principal Component Analysis (PCA) was carried out. This showed that different forms of SJW could be differentiated based on their elemental profile and that the SJW ingredient used (i.e. extract or raw herb) could be identified. The differences in the profiles were likely due to two factors: (1) the addition of bulking agents and (2) solvent extraction. In order to further understand how the elemental profile changes when producing an extract from the raw plant, eight SJW herb samples were extracted with four solvents (100% water, 60% ethanol, 80% ethanol and 100% ethanol) and analysed for their element content. The results showed that the transfer of elements from the raw herb to an extract was solvent- and metal-dependent. Generally, the highest concentrations of an element were extracted with 100% water and decreased as the concentration of ethanol increased.
However, the transfer efficiency for Cu was highest with 60% ethanol. The solvents utilised in industry (60% and 80% ethanol) were found to preconcentrate some elements: Cu (+119%), Mg (+93%), Ni (+183%) and Zn (+12%) were found to preconcentrate in 60% v/v ethanol extracts, and Cu (+5%) and Ni (+30%) in 80% v/v ethanol extracts. PCA of the elemental profiles of the four types of extract showed differentiation between the different solvents, and as the ethanol concentration increased, the extracts became more standardised. Analysis of the bioactive compounds rutin, hyperoside, quercetin, hyperforin and adhyperforin, followed by subsequent Correlation Analysis (CA), displayed relationships between the elemental profiles and the molecular profiles. For example, strong correlations were seen between hyperoside and Cr, as well as between quercetin and Fe. This shows potential for tuning elemental extractions towards metal-bioactive compound combinations for increased bioactivity and bioavailability; however, further work is needed in this area.
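The correlation-analysis step can be illustrated with a minimal Pearson-correlation sketch; the element and compound concentrations below are synthetic illustrations, not data from the thesis:

```python
# Sketch: Pearson correlation between an element profile and a bioactive
# compound profile across a set of extracts. All values are synthetic.
import numpy as np

rng = np.random.default_rng(1)
n_extracts = 8

cr = rng.uniform(0.5, 2.0, n_extracts)                   # Cr (synthetic)
hyperoside = 3.0 * cr + rng.normal(0, 0.1, n_extracts)   # tracks Cr here
fe = rng.uniform(10, 40, n_extracts)                     # Fe, unrelated here

r_cr = np.corrcoef(cr, hyperoside)[0, 1]   # strong element-compound link
r_fe = np.corrcoef(fe, hyperoside)[0, 1]   # weak link in this toy data
```

A correlation matrix over all 25 elements and 5 compounds would be built the same way, one `np.corrcoef` pair at a time or as one stacked call.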
APA, Harvard, Vancouver, ISO, and other styles
38

Geschwinder, Lukáš. "Možnosti využití metod vícerozměrné statistické analýzy dat při hodnocení spolehlivosti distribučních sítí." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2009. http://www.nusl.cz/ntk/nusl-217824.

Full text
Abstract:
The aim of this study is to evaluate the use of multi-dimensional statistical analysis methods as a tool for simulating the reliability of a distribution network. The preferred methods are cluster analysis (CLU) and principal component analysis (PCA). CLU divides objects into groups on the basis of their signs (variables) and the calculated distances between objects, so that the characteristics within each group are similar; the result can reveal a hidden structure in the data. PCA is used to find structure in the signs of a multi-dimensional data matrix, where the signs are separate quantities describing the given object. PCA decomposes the primary data matrix into a structural matrix and a noise matrix: it transforms the primary data into the new coordinate system of the principal components, and the transformed data are called scores. The principal components form an orthogonal system of new axes. From the aspect of reliability, a distribution network can be characterized by a number of statistical quantities. Reliability indicators include the number of interruptions and the interruption time; integral reliability indicators include the system average interruption frequency index (SAIFI) and the system average interruption duration index (SAIDI). In conclusion, a SAIFI simulation based on the negative binomial distribution is compared with values provided by a distribution company, and a test describing the dependences of the signs and the outage distributions is performed.
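The PCA decomposition described here (primary data matrix split into structural and noise matrices, with scores as the coordinates in the new principal-component system) can be sketched in a few lines; the data are synthetic:

```python
# Minimal PCA via SVD: centered data split into a rank-k "structural"
# part spanned by the leading principal components plus a residual
# "noise" part; the coordinates in the new axes are the scores.
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 5))
X[:, 1] = X[:, 0] + 0.05 * rng.normal(size=100)   # correlated signs

Xc = X - X.mean(axis=0)                # center each variable
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 2                                   # number of retained components
scores = Xc @ Vt[:k].T                  # score matrix (new coordinates)
structure = scores @ Vt[:k]             # rank-k structural matrix
noise = Xc - structure                  # residual noise matrix

# The principal components form an orthogonal system of axes:
gram = Vt[:k] @ Vt[:k].T
```

The rows of `Vt` are the principal-component directions, so `gram` is the identity and `structure` has rank `k` by construction.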
APA, Harvard, Vancouver, ISO, and other styles
39

Stark, Love. "Outlier detection with ensembled LSTM auto-encoders on PCA transformed financial data." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-296161.

Full text
Abstract:
Financial institutions today generate large amounts of data, data that can contain interesting information to investigate to further the economic growth of said institution. There is an interest in analyzing these points of information, especially if they are anomalous from the normal day-to-day work. However, finding these outliers is not an easy task and is not possible to do manually, due to the massive amounts of data being generated daily. Previous work to solve this has explored the usage of machine learning to find outliers in financial datasets. Previous studies have shown that the pre-processing of data usually accounts for a large part of the information loss. This work aims to study whether there is a proper balance in how the pre-processing is carried out that retains the highest amount of information while not leaving the data too complex for the machine learning models. The dataset used consisted of foreign exchange transactions supplied by the host company and was pre-processed through the use of Principal Component Analysis (PCA). The main purpose of this work is to test whether an ensemble of Long Short-Term Memory Recurrent Neural Networks (LSTM), configured as autoencoders, can be used to detect outliers in the data and whether the ensemble is more accurate than a single LSTM autoencoder. Previous studies have shown that ensembled autoencoders can prove more accurate than a single autoencoder, especially when SkipCells have been implemented (a configuration that skips over LSTM cells to make the model perform with more variation). A datapoint is considered an outlier if the LSTM model has trouble properly recreating it, i.e. a pattern that is hard to classify, making it available for further manual investigation. The results show that the ensembled LSTM model proved more accurate than a single LSTM model with regard to reconstructing the dataset and, by our definition of an outlier, more accurate in outlier detection.
The results from the pre-processing experiments reveal different methods of obtaining an optimal number of components for the data. One of these is to study the retained variance and accuracy of the PCA transformation compared to model performance for a certain number of components. One of the conclusions from the work is that ensembled LSTM networks can prove very powerful, but that alternatives to pre-processing, such as categorical embedding instead of PCA, should be explored.
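The pre-processing idea (choosing the number of PCA components by retained variance) and the outlier definition (a point the model has trouble recreating) can be sketched together; here PCA reconstruction error stands in for the LSTM autoencoder ensemble, and the transaction data are simulated, not the host company's:

```python
# Sketch: pick the number of PCA components by cumulative retained
# variance, then flag as the outlier the point the low-rank model
# reconstructs worst. Data are simulated low-rank "transactions".
import numpy as np

rng = np.random.default_rng(3)
latent = rng.normal(size=(500, 3))            # three hidden factors
comps = rng.normal(size=(3, 10))
X = latent @ comps + 0.01 * rng.normal(size=(500, 10))
X[0, 4] += 25.0                               # one injected anomaly

Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
var_ratio = s**2 / np.sum(s**2)

# Smallest k whose cumulative retained variance reaches 95 %.
k = int(np.searchsorted(np.cumsum(var_ratio), 0.95)) + 1

recon = (Xc @ Vt[:k].T) @ Vt[:k]              # low-rank reconstruction
errors = np.linalg.norm(Xc - recon, axis=1)
suspect = int(np.argmax(errors))              # hardest point to recreate
```

The injected point lies largely outside the retained subspace, so its reconstruction error dominates and it is flagged for manual review.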
APA, Harvard, Vancouver, ISO, and other styles
40

Ergin, Emre. "Investigation Of Music Algorithm Based And Wd-pca Method Based Electromagnetic Target Classification Techniques For Their Noise Performances." Master's thesis, METU, 2009. http://etd.lib.metu.edu.tr/upload/12611218/index.pdf.

Full text
Abstract:
Multiple Signal Classification (MUSIC) Algorithm based and Wigner Distribution-Principal Component Analysis (WD-PCA) based classification techniques are recently suggested resonance-region approaches for electromagnetic target classification. In this thesis, the performances of these two techniques are compared with respect to their robustness to noise and their capacity to handle large numbers of candidate targets. In this context, classifier design simulations are demonstrated for target libraries containing conducting and dielectric spheres and dielectric-coated conducting spheres. Small-scale aircraft targets modeled by thin conducting wires are also used in the classifier design demonstrations.
APA, Harvard, Vancouver, ISO, and other styles
41

Nordahl, Åke. "Metoder för informationsoptimering vid organisk syntes." Doctoral thesis, Umeå universitet, Kemiska institutionen, 1990. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-102557.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Šrenk, David. "Vizualizace spektroskopických dat pomocí metody analýzy hlavních komponent." Master's thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2019. http://www.nusl.cz/ntk/nusl-401532.

Full text
Abstract:
This diploma thesis deals with the use of laser-induced breakdown spectroscopy for determining the elemental structure of unknown samples. It was necessary to design an appropriate method to classify a material from its laser-induced emission spectrum. Pre-treatment of the data and a variety of chemometric methods had to be applied in order to determine the elemental structure. We achieved the required solution by projecting the data into a new PCA space, creating clusters and computing the Euclidean distance between the clusters. The experiment in the practical part was set up to detect an interface between two elements, and we created a data file simulating ablation at the interface. This data set was gradually processed from a mathematical, chemical and physical point of view. Several data procedures were compiled: approximation by Lorentz, Gauss and Voigt functions, as well as pre-treatment methods such as outlier detection, standardization by several procedures and the subsequent use of principal component analysis. The full processing of the input data is described in the thesis.
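The classification step described in this abstract can be sketched as follows: synthetic spectra from two materials are projected into a 2-D PCA space, and the Euclidean distance between the cluster centroids is compared with the spread inside a cluster. The spectra and peak positions are invented for illustration:

```python
# Sketch: project synthetic emission spectra of two materials into a 2-D
# PCA space and compare between-cluster distance with in-cluster spread.
import numpy as np

rng = np.random.default_rng(4)
wavelengths = np.linspace(200, 900, 300)

def spectrum(peak):
    # One Gaussian emission line plus noise (a crude stand-in for LIBS data).
    return (np.exp(-((wavelengths - peak) / 5.0) ** 2)
            + 0.02 * rng.normal(size=wavelengths.size))

A = np.vstack([spectrum(400.0) for _ in range(20)])   # material A
B = np.vstack([spectrum(600.0) for _ in range(20)])   # material B
X = np.vstack([A, B])

Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                                # 2-D PCA space

cent_a, cent_b = scores[:20].mean(axis=0), scores[20:].mean(axis=0)
between = np.linalg.norm(cent_a - cent_b)             # cluster separation
within = np.linalg.norm(scores[:20] - cent_a, axis=1).max()
```

A large `between`-to-`within` ratio is what lets the method decide which side of the interface a given ablation spectrum came from.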
APA, Harvard, Vancouver, ISO, and other styles
43

Uliyar, Hithesh Sanjiva. "FAULT DIAGNOSIS OF VEHICULAR ELECTRIC POWER GENERATION AND STORAGE." Wright State University / OhioLINK, 2010. http://rave.ohiolink.edu/etdc/view?acc_num=wright1284602099.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Razifar, Pasha. "Novel Approaches for Application of Principal Component Analysis on Dynamic PET Images for Improvement of Image Quality and Clinical Diagnosis." Doctoral thesis, Uppsala : Acta Universitatis Upsaliensis : Univ.-bibl. [distributör], 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-6053.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Raiford, Douglas Whitmore III. "Multivariate Analysis of Prokaryotic Amino Acid Usage Bias: A Computational Method for Understanding Protein Building Block Selection in Primitive Organisms." Wright State University / OhioLINK, 2005. http://rave.ohiolink.edu/etdc/view?acc_num=wright1133886196.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Gonzalez, Nicolas Alejandro. "Principal Components Analysis, Factor Analysis and Trend Correlations of Twenty-Eight Years of Water Quality Data of Deer Creek Reservoir, Utah." BYU ScholarsArchive, 2012. https://scholarsarchive.byu.edu/etd/3309.

Full text
Abstract:
I evaluated twenty-eight years (1980-2007) of spatial-temporal water quality data from Deer Creek Reservoir in Utah. The data came from three sampling points representing the lotic, transitional and lentic zones and included measurements of climatological, hydrological and water quality conditions at four depths: surface, above thermocline, below thermocline and bottom. The time frame spanned dates before and after the completion of the Jordanelle Reservoir (1987-1992), approximately fourteen miles upstream of Deer Creek. I compared temporal groupings and found that a traditional month distribution following standard seasons was not effective in characterizing the measured conditions; I developed a more representative seasonal grouping by performing a Tukey-Kramer multiple-comparisons adjustment and a Bonferroni correction of the Student's t comparison. Based on these analyses, I determined the best groupings to be Cold (December - April), Semi-Cold (May and November), Semi-Warm (June and October), Warm (July and September) and Transition (August). I performed principal component analysis (PCA) and factor analysis (FA) to determine the principal parameters associated with the variability of the water quality of the reservoir. These parameters confirmed the seasonal groups, showing the Cold, Transition and Warm seasons as distinct groups. The PCA and FA showed that the variables that drive most of the variability in the reservoir are specific conductivity and variables related to temperature, and that the reservoir is highly variable: the first three principal components and rotated factors explained a cumulative 59% and 47%, respectively, of the variability in Deer Creek. Both parametric and nonparametric approaches provided similar correlations, but the evaluations that included censored data (nutrients) were considerably different, with the nonparametric approach being preferred.
APA, Harvard, Vancouver, ISO, and other styles
47

Krogh, Robert. "Building and generating facial textures using Eigen faces." Thesis, Linköpings universitet, Institutionen för datavetenskap, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-129935.

Full text
Abstract:
With the evolution of the game industry and other virtual environments, demands on what comes with an application are higher than ever before. This leads many companies to try to procedurally generate content in order to save storage space and offer a wider variety of content. It has become essential to infuse immersion into such applications, and some companies have even gone so far as to let players recreate themselves to be the hero or heroine of the game. Even so, many AAA companies refrain from using face segmentation software, as it gives end users the power to add game content, which may increase the risk of offensive content that goes against company standards and policy entering their application. By taking the concept of procedural generation and applying it together with face segmentation, using a Principal Component Analysis (PCA) based texturization model, we allow for a controlled yet functioning face texturization in a run-time virtual environment. In this project we use MATLAB to create a controlled Eigen space, infuse this into an application built in Unity 3D using UMA, and let small recreation vectors, spanning a few kilobytes at most, create textures at run-time. In doing so, we can project faces onto the Eigen space and get fully functioning, texturized characters able to use ready animations and controllers of the developer's choice. These Eigen spaces may cost more storage space and longer loading times up to a limit, but can in turn generate a seemingly endless variation of textural content dynamically. In order to see what potential users prioritize when it comes to applications like these, we conducted a survey in which the respondents saw variations of this technique and were able to express their views on the attributes expected from a "good" (from their point of view) application.
In the end we have a UMA-ready set of scripts and a one-time-use system to create the Eigen spaces for applications to use. We worked closely with Högström's Selfie to Avatar face segmentation software and proved the concept in Unity 3D applications.
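The Eigen-space texturization idea (a short recreation vector per face, decoded against a fixed set of eigen textures) can be sketched as below; the images are random stand-ins and no UMA or Unity code is reproduced:

```python
# Eigenfaces sketch: a training set of flattened "face textures" builds an
# eigen space; each new texture is stored as a short coefficient vector
# (the recreation vector) from which it can be regenerated.
import numpy as np

rng = np.random.default_rng(5)
h = w = 16
train = rng.normal(size=(40, h * w))      # 40 flattened training textures

mean_face = train.mean(axis=0)
_, _, Vt = np.linalg.svd(train - mean_face, full_matrices=False)
eigenfaces = Vt[:10]                       # keep 10 eigen textures

def encode(img):
    return (img - mean_face) @ eigenfaces.T   # 10-number recreation vector

def decode(coeffs):
    return mean_face + coeffs @ eigenfaces    # regenerate the texture

new_face = train[0]                        # reuse a training texture
coeffs = encode(new_face)
recon = decode(coeffs)
compression = coeffs.size / new_face.size  # 10 floats instead of 256
```

Only `mean_face` and `eigenfaces` need shipping with the application; each character then costs one small coefficient vector.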
APA, Harvard, Vancouver, ISO, and other styles
48

Santos, Anderson Rodrigo dos. "Identificação de faces humanas através de PCA-LDA e redes neurais SOM." Universidade de São Paulo, 2005. http://www.teses.usp.br/teses/disponiveis/18/18133/tde-21042006-222231/.

Full text
Abstract:
The use of biometric face data for automatic identity verification is one of the biggest challenges in secure access control systems. The process is extremely complex and influenced by many factors related to the form, position, illumination, rotation, translation, disguise and occlusion of facial characteristics. Nowadays, there are many face recognition techniques. This work presents an investigation into identifying a face in the ORL database with different training sets. An algorithm for face recognition is proposed based on the subspace LDA (PCA + LDA) technique, using a SOM neural network to represent each class (face) in the classification/identification stage. By applying the subspace LDA method, we extract the characteristics that are most important for identifying the previously known faces present in the database, creating a dimensional space that is smaller and more discriminant than the original. The SOM networks are responsible for memorizing the characteristics of each class. The algorithm offers better performance (recognition rates between 97% and 98%) with respect to the adversities and sources of error that hinder traditional face recognition methods.
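The subspace LDA (PCA + LDA) stage can be sketched for two classes; the data are synthetic, the SOM classification stage is replaced here by a simple threshold along the Fisher direction, and the ORL images are not used:

```python
# Subspace LDA sketch: PCA first reduces dimension, then a Fisher
# discriminant direction is computed from within- and between-class
# statistics. Two synthetic classes stand in for face classes.
import numpy as np

rng = np.random.default_rng(6)
class_a = rng.normal(loc=0.0, size=(30, 50))
class_b = rng.normal(loc=1.0, size=(30, 50))
X = np.vstack([class_a, class_b])

# PCA stage: project onto the leading principal components.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
P = Xc @ Vt[:10].T                        # 10-D PCA subspace

pa, pb = P[:30], P[30:]
ma, mb = pa.mean(axis=0), pb.mean(axis=0)

# LDA stage: Fisher direction w = Sw^{-1} (mb - ma).
Sw = (pa - ma).T @ (pa - ma) + (pb - mb).T @ (pb - mb)
w = np.linalg.solve(Sw, mb - ma)

# Nearest-class-mean decision along w (in place of the SOM stage).
proj = P @ w
thresh = (ma @ w + mb @ w) / 2.0
pred = (proj > thresh).astype(int)        # 0 = class A, 1 = class B
accuracy = np.mean(pred == np.array([0] * 30 + [1] * 30))
```

In the thesis each class is instead represented by its own SOM after the PCA + LDA projection; the projection code is the shared part.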
APA, Harvard, Vancouver, ISO, and other styles
49

Bayik, Tuba Makbule. "Automatic Target Recognition In Infrared Imagery." Master's thesis, METU, 2004. http://etd.lib.metu.edu.tr/upload/2/12605388/index.pdf.

Full text
Abstract:
The task of automatically recognizing targets in IR imagery has a history of approximately 25 years of research and development. ATR is an application of pattern recognition and scene analysis in the defense industry and is still a challenging problem. This thesis may be viewed as an exploratory study of the ATR problem, with promising recognition algorithms implemented in the area. The examined algorithms are among the solutions to the ATR problem that are reported to have good performance in the literature. Throughout the study, PCA, subspace LDA, ICA, the nearest mean classifier, the K nearest neighbors classifier, the nearest neighbor classifier and the LVQ classifier are implemented, and their performances are compared in terms of recognition rate. According to the simulation results, the system that uses ICA as the feature extractor and LVQ as the classifier performs best. The good performance of this system is due to the higher-order statistics of the data and the success of LVQ in modifying the decision boundaries.
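The LVQ classification stage can be sketched with the basic LVQ1 update rule (prototypes attracted to correctly matched samples, repelled otherwise); the features below are simulated and the ICA feature-extraction step is omitted:

```python
# LVQ1 sketch: one prototype per class; the winning prototype moves
# toward a sample of its own class and away from a sample of another
# class. Features are simulated stand-ins for ICA-extracted features.
import numpy as np

rng = np.random.default_rng(7)
X = np.vstack([rng.normal(0.0, 0.3, size=(50, 4)),
               rng.normal(2.0, 0.3, size=(50, 4))])
y = np.array([0] * 50 + [1] * 50)

protos = np.vstack([X[y == 0][0], X[y == 1][0]])  # one prototype per class
labels = np.array([0, 1])

lr = 0.1
for epoch in range(20):
    for i in rng.permutation(len(X)):
        j = np.argmin(np.linalg.norm(protos - X[i], axis=1))  # winner
        sign = 1.0 if labels[j] == y[i] else -1.0   # attract or repel
        protos[j] += sign * lr * (X[i] - protos[j])

pred = np.array([labels[np.argmin(np.linalg.norm(protos - x, axis=1))]
                 for x in X])
accuracy = float(np.mean(pred == y))
```

Moving prototypes rather than storing all samples is what lets LVQ reshape decision boundaries cheaply, the property the abstract credits for its performance.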
APA, Harvard, Vancouver, ISO, and other styles
50

Bengtsson, Sebastian. "MACHINE LEARNING FOR MECHANICAL ANALYSIS." Thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-44325.

Full text
Abstract:
It is not reliable to depend on a person's inference on dense, high-dimensional data on a daily basis: a person will grow tired or become distracted and make mistakes over time. It is therefore desirable to study the feasibility of replacing a person's inference with Machine Learning in order to improve reliability. One-class Support Vector Machines (SVMs) with three different kernels (linear, Gaussian and polynomial) are implemented and tested for anomaly detection. Principal Component Analysis is used for dimensionality reduction, and autoencoders are used with the intention of increasing performance. Standard soft-margin SVMs were used for multi-class classification, utilizing the 1vsAll and 1vs1 approaches with the same kernels as the one-class SVMs. The results for the one-class SVMs and the multi-class SVM methods are compared against each other within their respective applications, but also against the performance of back-propagation neural networks of varying sizes. One-class SVMs proved very effective in detecting anomalous samples once both Principal Component Analysis and autoencoders had been applied. Standard SVMs with Principal Component Analysis produced promising classification results. Twin SVMs were researched as an alternative to standard SVMs.
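The one-class setup can be illustrated with a deliberately simple, library-free stand-in: fit a center and a radius quantile on normal data and flag points outside it. This shows only the anomaly-detection framing, not the soft-margin SVM optimization the thesis actually uses; all data below are synthetic:

```python
# One-class framing sketch: learn what "normal" looks like from normal
# data only, then flag points that fall outside a radius covering 99 %
# of the training samples. A crude stand-in for a one-class SVM.
import numpy as np

rng = np.random.default_rng(8)
train = rng.normal(size=(200, 6))              # "normal" operating data
test_ok = rng.normal(size=(20, 6))             # more normal data
test_bad = rng.normal(loc=6.0, size=(5, 6))    # clearly anomalous points

center = train.mean(axis=0)
radii = np.linalg.norm(train - center, axis=1)
threshold = np.quantile(radii, 0.99)           # cover 99 % of normal data

def is_anomaly(points):
    return np.linalg.norm(points - center, axis=1) > threshold

flags_ok = is_anomaly(test_ok)
flags_bad = is_anomaly(test_bad)
```

A one-class SVM replaces the fixed center-and-radius with a learned kernel boundary, which is what makes it effective after PCA and autoencoder pre-processing.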
APA, Harvard, Vancouver, ISO, and other styles
