Dissertations / Theses on the topic 'PCA'

To see the other types of publications on this topic, follow the link: PCA.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'PCA.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses from a wide variety of disciplines and organise your bibliography correctly.

1

Renkjumnong, Wasuta. "SVD and PCA in Image Processing." Digital Archive @ GSU, 2007. http://digitalarchive.gsu.edu/math_theses/31.

Full text
Abstract:
The Singular Value Decomposition (SVD) is one of the most useful matrix factorizations in applied linear algebra, and Principal Component Analysis (PCA) has been called one of its most valuable results. This thesis shows how and why PCA is intimately related to the SVD, describes their properties and applications, and considers the assumptions behind these techniques as well as possible extensions to overcome their limitations. This understanding leads to real-world applications, in particular the image processing of neurons: noise reduction and edge detection of neuron images are investigated.
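The SVD-PCA connection described in this abstract can be sketched in a few lines of NumPy (synthetic data; the thesis itself applies the idea to neuron images):

```python
import numpy as np

# Toy data: 100 samples, 5 features (purely illustrative)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Xc = X - X.mean(axis=0)                  # PCA requires centered columns

# Route 1: eigendecomposition of the sample covariance matrix
cov = Xc.T @ Xc / (len(Xc) - 1)
eigvals = np.linalg.eigvalsh(cov)[::-1]  # eigenvalues, descending

# Route 2: SVD of the centered data matrix itself
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
pc_variances = s ** 2 / (len(Xc) - 1)    # squared singular values / (n - 1)

# The two routes agree: the right singular vectors are the principal
# axes, and the squared singular values give the PCA eigenvalues.
assert np.allclose(pc_variances, eigvals)
```

Working from the SVD avoids ever forming the covariance matrix, which is one practical reason the two techniques are usually discussed together.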
APA, Harvard, Vancouver, ISO, and other styles
2

Bennett, Marissa A. "Improving Model Performance with Robust PCA." Digital WPI, 2020. https://digitalcommons.wpi.edu/etd-theses/1366.

Full text
Abstract:
As machine learning becomes an increasingly relevant field incorporated into everyday life, so does the need for consistently high-performing models. With these high expectations, along with potentially restrictive data sets, it is crucial to use machine learning techniques that increase the likelihood of success. Robust Principal Component Analysis (RPCA) not only extracts anomalous data, but also finds correlations among the given features in a data set, and these correlations can themselves be used as features. By taking a novel approach to utilizing the output from RPCA, we address how our method affects the performance of such models. We take into account the efficiency of our approach, and use projectors to give our method a 99.79% faster run time. We apply our method primarily to cyber security data sets, though we also investigate the effects on data sets from other fields (e.g. medical).
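A minimal sketch of the low-rank-plus-sparse decomposition behind RPCA, using an inexact augmented-Lagrange-multiplier loop on synthetic data (the parameter choices here are common defaults, not the thesis's own settings):

```python
import numpy as np

def rpca(M, lam=None, tol=1e-7, max_iter=500):
    """Decompose M into low-rank L plus sparse S (principal component
    pursuit) with a minimal inexact augmented Lagrange multiplier loop."""
    m, n = M.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))
    norm_M = np.linalg.norm(M)
    mu = 1.25 / np.linalg.norm(M, 2)     # a common initialisation
    Y = np.zeros_like(M)                 # Lagrange multipliers
    S = np.zeros_like(M)
    for _ in range(max_iter):
        # singular value thresholding -> low-rank update
        U, s, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = U @ (np.maximum(s - 1.0 / mu, 0.0)[:, None] * Vt)
        # elementwise soft thresholding -> sparse update
        T = M - L + Y / mu
        S = np.sign(T) * np.maximum(np.abs(T) - lam / mu, 0.0)
        R = M - L - S
        Y += mu * R
        mu *= 1.5                        # grow the penalty
        if np.linalg.norm(R) / norm_M < tol:
            break
    return L, S

# Synthetic check: a rank-2 matrix plus 5% sparse corruption
rng = np.random.default_rng(2)
L0 = rng.normal(size=(50, 2)) @ rng.normal(size=(2, 50))
S0 = np.zeros((50, 50))
idx = rng.choice(2500, size=125, replace=False)
S0.flat[idx] = rng.choice([-5.0, 5.0], size=125)
L, S = rpca(L0 + S0)
```

On well-conditioned synthetic inputs like this, L and S typically recover the planted low-rank and sparse parts; the extracted S is the "anomalous data" the abstract refers to.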
APA, Harvard, Vancouver, ISO, and other styles
3

Liu, Peng. "Adaptive Mixture Estimation and Subsampling PCA." Case Western Reserve University School of Graduate Studies / OhioLINK, 2009. http://rave.ohiolink.edu/etdc/view?acc_num=case1220644686.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Moravčíková, Simona. "Stanovenie hodnoty firmy PCA Slovakia, s.r.o." Master's thesis, Vysoká škola ekonomická v Praze, 2015. http://www.nusl.cz/ntk/nusl-206699.

Full text
Abstract:
The aim of this thesis is to estimate the value of the company PCA Slovakia, s.r.o. as of 31 December 2015. The thesis is divided into two parts, namely a theoretical and a practical part. The theoretical part describes the basic concepts necessary for the valuation of a company and serves as the starting point for the practical part. The practical part focuses on the introduction of the company and the application of strategic and financial analysis, the forecast of revenues and other value drivers, the financial plan and, finally, the actual valuation of the company. The DCF method in its FCFF form and the EVA method were used for the valuation.
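The FCFF-based DCF valuation mentioned in the abstract reduces to discounting the forecast cash flows and adding a discounted Gordon-growth terminal value; a sketch with hypothetical numbers (not figures from the thesis):

```python
# Hypothetical inputs -- not figures from the thesis
fcff = [120.0, 130.0, 138.0, 145.0]   # forecast free cash flow to the firm
wacc = 0.09                           # weighted average cost of capital
g = 0.02                              # perpetual growth rate

# Present value of the explicit forecast period
pv_explicit = sum(cf / (1 + wacc) ** t for t, cf in enumerate(fcff, start=1))

# Gordon-growth terminal (continuing) value, discounted to today
terminal = fcff[-1] * (1 + g) / (wacc - g)
pv_terminal = terminal / (1 + wacc) ** len(fcff)

enterprise_value = pv_explicit + pv_terminal
```

As is typical for going concerns, most of the value sits in the terminal component, which is why the revenue and value-driver forecasts the abstract describes matter so much.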
APA, Harvard, Vancouver, ISO, and other styles
5

Bernardina, Philipe Dalla. "PCA-tree: uma proposta para indexação multidimensional." Universidade de São Paulo, 2007. http://www.teses.usp.br/teses/disponiveis/45/45134/tde-29082007-114522/.

Full text
Abstract:
The advent of applications demanding the representation of objects in multi-dimensional spaces fostered the development of efficient multi-dimensional access methods. Among the early applications that required such methods, we can cite geo-processing systems, 3D applications and simulators. Later on, multi-dimensional access methods also became important tools in the design of classifiers, mainly those based on the nearest-neighbors technique. Consequently, the dimensionality of the spaces has increased, from at most four in earlier applications to more than a thousand. Among the several multi-dimensional access methods, the class of approaches based on balanced tree structures with data represented in R^d has received a lot of attention. These methods evolved from the B-tree for unidimensional access and inherit several of its characteristics. In this work, we present some of the access methods based on balanced trees in order to illustrate the central idea of these algorithms, and we propose and implement a new multi-dimensional access method, which we call the PCA-tree. It uses a heuristic to split nodes based on the principal component of the samples to be divided. A hyperplane whose normal is that principal component is defined as the element that splits the space associated with the node. From this basic idea we define the data structure and the algorithms for the PCA-tree, employing secondary memory management as in B-trees. Finally, we compare the performance of the PCA-tree with that of other methods in the cited class, and present the advantages and disadvantages of the proposed access method through analysis of experimental results.
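The node-splitting heuristic at the heart of the PCA-tree can be sketched as follows (synthetic points; the real structure also handles secondary-memory paging and balancing, which are omitted here):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 8))            # samples stored in one node

def pca_split(points):
    """Split a node's samples by the hyperplane whose normal is the
    first principal component, passing through the centroid."""
    centered = points - points.mean(axis=0)
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    normal = Vt[0]                       # first principal component
    side = centered @ normal >= 0        # sign of distance to hyperplane
    return points[side], points[~side], normal

left, right, normal = pca_split(X)
```

Splitting along the direction of maximum variance tends to produce compact children, which is the motivation the abstract gives for the heuristic.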
APA, Harvard, Vancouver, ISO, and other styles
6

Storm, Christine. "Permutation procedures for ANOVA, regression and PCA." Diss., University of Pretoria, 2012. http://hdl.handle.net/2263/24960.

Full text
Abstract:
Parametric methods are effective and appropriate when data sets are obtained by well-defined random sampling procedures, the population distribution for responses is well-defined, the null sampling distributions of suitable test statistics do not depend on any unknown entity, and well-defined likelihood models are provided for nuisance parameters. Permutation testing methods, on the other hand, are appropriate and unavoidable when distribution models for responses are not well specified, are nonparametric or depend on too many nuisance parameters; when ancillary statistics in well-specified distributional models have a strong influence on inferential results or are confounded with other nuisance entities; when the sample sizes are less than the number of parameters; and when data sets are obtained by ill-specified selection-bias procedures. In addition, permutation tests are useful not only when parametric tests are not possible, but also when more importance needs to be given to the observed data set than to the population model, as is typical, for example, in biostatistics. The different types of permutation methods for analysis of variance, multiple linear regression and principal component analysis are explored. More specifically, one-way, two-way and three-way ANOVA permutation strategies are discussed. Approximate and exact permutation tests for the significance of one or more regression coefficients in a multiple linear regression model are explained next and, lastly, the use of permutation tests as a means to validate and confirm the results obtained from exploratory PCA is described.
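A permutation test for a regression slope, of the kind the thesis discusses, can be sketched as follows (synthetic data; shuffling the response is the simplest of several permutation strategies):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 60
x = rng.normal(size=n)
y = 2.0 * x + rng.normal(size=n)          # strong true effect

def slope(x, y):
    """Ordinary least-squares slope of y on x."""
    xc, yc = x - x.mean(), y - y.mean()
    return (xc @ yc) / (xc @ xc)

observed = slope(x, y)

# Permutation null: shuffle y to break the x-y link, re-fit each time
perms = np.array([slope(x, rng.permutation(y)) for _ in range(2000)])
p_value = (1 + np.sum(np.abs(perms) >= abs(observed))) / (1 + len(perms))
```

The "+1" in numerator and denominator gives the standard exact-test correction, so the p-value can never be reported as zero from a finite number of permutations.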
Dissertation (MSc)--University of Pretoria, 2012.
Statistics
APA, Harvard, Vancouver, ISO, and other styles
7

Le, Hanh T. Banking & Finance, Australian School of Business, UNSW. "Discrete PCA: an application to corporate governance research." Awarded by: University of New South Wales. Banking & Finance, 2007. http://handle.unsw.edu.au/1959.4/40753.

Full text
Abstract:
This thesis introduces the application of discrete Principal Component Analysis (PCA) to corporate governance research. Given the presence of many discrete variables in typical governance studies, I argue that this method is superior to the standard PCA that has been employed by others working in the area. Using a dataset of 244 companies listed on the London Stock Exchange in the year 2002-2003, I find that Pearson's correlations underestimate the strength of association between two variables when at least one of them is discrete. Accordingly, standard PCA performed on the Pearson correlation matrix results in biased estimates. Applying discrete PCA to the polychoric correlation matrix, I extract 10 significant factors from 28 corporate governance variables. These factors represent 8 main aspects of the governance system, namely auditor reputation, large shareholder influence, size of board committees, social responsibility, risk optimisation, director independence level, female representation and institutional ownership. Finally, I investigate the relationship between corporate governance, represented by the extracted factors, and a firm's long-run share market performance. Consistent with Demsetz's (1983) argument, I document limited explanatory power for these governance factors.
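The factor-extraction step can be sketched as PCA on a correlation matrix; the 4x4 matrix below is a hypothetical stand-in, since estimating a true polychoric correlation matrix requires a specialised maximum-likelihood routine that is not shown:

```python
import numpy as np

# Hypothetical stand-in for a polychoric correlation matrix:
# three correlated "governance" variables plus one unrelated one.
R = np.array([[1.0, 0.7, 0.7, 0.0],
              [0.7, 1.0, 0.7, 0.0],
              [0.7, 0.7, 1.0, 0.0],
              [0.0, 0.0, 0.0, 1.0]])

eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]         # sort components by variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

loadings = eigvecs * np.sqrt(eigvals)     # component loadings
evr = eigvals / eigvals.sum()             # explained variance ratio
```

The thesis's point is that when the input correlations are biased downward (as Pearson's are for discrete variables), everything downstream of this step, loadings included, inherits the bias.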
APA, Harvard, Vancouver, ISO, and other styles
8

Nziga, Jean-Pierre. "Incremental Sparse-PCA Feature Extraction For Data Streams." NSUWorks, 2015. http://nsuworks.nova.edu/gscis_etd/365.

Full text
Abstract:
Intruders attempt to penetrate commercial systems daily and cause considerable financial losses for individuals and organizations. Intrusion detection systems monitor network events to detect computer security threats, and an extensive amount of network data is devoted to detecting malicious activities. Storing, processing, and analyzing this massive volume of data is costly, which indicates the need for efficient methods of network data reduction that do not require the data to be captured and stored first. A better approach extracts useful variables from data streams in real time and in a single pass. The removal of irrelevant attributes reduces the data fed to the intrusion detection system (IDS) and shortens the analysis time while improving classification accuracy. This dissertation introduces an online, real-time data processing method for knowledge extraction. This incremental feature extraction is based on two approaches. First, Chunk Incremental Principal Component Analysis (CIPCA) detects intrusions in data streams. Then, two novel incremental feature extraction methods, Incremental Structured Sparse PCA (ISSPCA) and Incremental Generalized Power Method Sparse PCA (IGSPCA), find malicious elements. Metrics helped compare the performance of all methods. IGSPCA was found to perform as well as or better than CIPCA overall in terms of dimensionality reduction, classification accuracy, and learning time. ISSPCA yielded better results for higher chunk values and greater accumulation-ratio thresholds. CIPCA and IGSPCA reduced the IDS dataset to 10 principal components, as opposed to 14 eigenvectors for ISSPCA, and ISSPCA is more expensive in terms of learning time than the other techniques. This dissertation presents new methods that perform feature extraction from continuous data streams to find the small number of features necessary to express the most data variance.
Data subsets derived from a few important variables render their interpretation easier. Another goal of this dissertation was to propose incremental sparse PCA algorithms capable of processing data with concept drift and concept shift. Experiments using the WaveForm and WaveFormNoise datasets confirmed this ability. Similar to CIPCA, ISSPCA and IGSPCA updated the eigen-axes as a function of the accumulation ratio value, forming an informative eigenspace with few eigenvectors.
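The single-pass, chunk-incremental idea behind CIPCA-style methods, merging each chunk's statistics into running totals so the stream is never revisited, can be sketched as follows (a toy version without the sparsity or accumulation-ratio machinery of the dissertation):

```python
import numpy as np

class ChunkPCA:
    """Toy chunk-incremental PCA: merge each chunk's mean and scatter
    into running totals (single pass), then eigendecompose on demand."""
    def __init__(self, n_features):
        self.n = 0
        self.mean = np.zeros(n_features)
        self.scatter = np.zeros((n_features, n_features))

    def partial_fit(self, chunk):
        m = len(chunk)
        cmean = chunk.mean(axis=0)
        cc = chunk - cmean
        delta = cmean - self.mean
        # standard pairwise pooling of scatter matrices
        self.scatter += cc.T @ cc + (self.n * m / (self.n + m)) * np.outer(delta, delta)
        self.mean += (m / (self.n + m)) * delta
        self.n += m
        return self

    def components(self, k):
        eigvals, eigvecs = np.linalg.eigh(self.scatter / (self.n - 1))
        return eigvecs[:, ::-1][:, :k], eigvals[::-1][:k]

rng = np.random.default_rng(3)
X = rng.normal(size=(1000, 6)) @ np.diag([5.0, 3.0, 1.0, 1.0, 1.0, 1.0])

ipca = ChunkPCA(6)
for chunk in np.array_split(X, 10):      # the "stream": 10 chunks of 100
    ipca.partial_fit(chunk)
V, lam = ipca.components(2)
```

Because the pooled scatter is algebraically identical to the batch scatter, this toy version matches batch PCA exactly; the methods in the dissertation instead update a truncated eigenspace, trading exactness for memory.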
APA, Harvard, Vancouver, ISO, and other styles
9

Rimal, Suraj. "POPULATION STRUCTURE INFERENCE USING PCA AND CLUSTERING ALGORITHMS." OpenSIUC, 2021. https://opensiuc.lib.siu.edu/theses/2860.

Full text
Abstract:
Genotype data, consisting of large numbers of markers, are used in demographic and association studies to identify genes related to specific traits or diseases. Handling these datasets usually takes a significant amount of time when inferring population structure. We therefore suggest applying PCA to the genotype data and then clustering algorithms to assign individuals to their particular subpopulations. We collected both real and simulated datasets for this study, applied PCA and selected significant features, and then applied five different clustering techniques to obtain better results. Furthermore, we studied three different methods for predicting the optimal number of subpopulations in a collected dataset. The results on four simulated datasets and two real human genotype datasets show that our approach performs well in the inference of population structure, and that NbClust is the more effective method for inferring subpopulations. In this study, we showed that centroid-based clustering, such as k-means and PAM, performs better than model-based, spectral, and hierarchical clustering algorithms. This approach also has the benefit of being fast and flexible in the inference of population structure.
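The PCA-then-clustering pipeline can be sketched on synthetic "genotype-like" data (illustrative only; real genotype matrices and the NbClust-style selection of the number of subpopulations are not shown):

```python
import numpy as np

rng = np.random.default_rng(7)
# Two synthetic "subpopulations" in a 50-marker feature space
pop_a = rng.normal(0.0, 1.0, size=(60, 50))
pop_b = rng.normal(3.0, 1.0, size=(60, 50))
X = np.vstack([pop_a, pop_b])

# Step 1: PCA -- keep the top-two principal component scores
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T

# Step 2: plain Lloyd k-means on the scores
def kmeans(data, k, iters=50):
    centers = data[:: len(data) // k][:k].copy()   # simple deterministic init
    for _ in range(iters):
        d2 = ((data[:, None, :] - centers[None]) ** 2).sum(axis=-1)
        labels = d2.argmin(axis=1)
        centers = np.array([data[labels == j].mean(axis=0) for j in range(k)])
    return labels

labels = kmeans(scores, 2)
```

Clustering in the low-dimensional score space rather than on the raw markers is what makes the approach fast, which is the speed benefit the abstract claims.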
APA, Harvard, Vancouver, ISO, and other styles
10

Silva, Luis Felipe da. "Análise quimiométrica da distribuição de quimioterápicos antimicrobianos (Fluoroquinolonas e Sulfonamidas) na Baía de Ubatuba." Universidade de São Paulo, 2016. http://www.teses.usp.br/teses/disponiveis/75/75135/tde-17112016-141429/.

Full text
Abstract:
Antimicrobial chemotherapeutic agents are considered emerging contaminants capable of creating bacterial resistance. They have been the focus of numerous studies related to environmental impacts, but in Brazil there is little research on their occurrence in continental and coastal aquatic environments. The city of Ubatuba has a population density five times higher than the national average, which may increase up to twelve-fold during the summer, putting further pressure on the region's ecosystems. The rivers Acaraú, Lagoa-Tavares, Grande and Indaiá flow into Ubatuba Bay, compromising the quality of its waters. This work investigated the contribution of these rivers' discharge to the occurrence and distribution of antimicrobial chemotherapeutic agents (fluoroquinolones and sulfonamides) in Ubatuba Bay. Thirty-six samples of surface water were analysed, withdrawn during the dry and rainy seasons of 2014 and 2015, respectively. SPE was used as the sample preparation method, and HPLC-MS/MS for the detection and quantification of the studied antimicrobials. Physicochemical characteristics such as pH, temperature, salinity, dissolved oxygen and redox potential (ORP), as well as dissolved organic carbon (DOC), were also determined to characterise the waters of the region. The spatio-temporal distribution of the agents and their possible association with the other investigated data were assessed with the chemometric tool CA, aiming to extract the greatest possible amount of information. The rivers, and consequently the bay, were contaminated by domestic sewage discharges, and the following antimicrobial chemotherapeutic agents were detected: sulfamethoxazole (SMX), sulfathiazole (STZ), sulfachloropyridiazine (SCP), sulfaquinoxaline (SQX) and norfloxacin (NOR). The contamination profiles revealed by the CA showed both spatial and temporal variation.
APA, Harvard, Vancouver, ISO, and other styles
11

Tharindu Priyan De Alwis, Mahappu Kankanamge. "PCA and FA model for Matrix Valued Time Series Data." OpenSIUC, 2017. https://opensiuc.lib.siu.edu/theses/2197.

Full text
Abstract:
Owing to advances in computer storage technology, matrix-valued time series observations are being collected in many fields, such as finance, economics, and weather forecasting. Matrix-valued data carry information along both their rows and columns, and when they are collected over time they have to be treated as a time series. If classical dimension reduction is used, converting each matrix into a long vector, this structural information is lost. Here we discuss two ways of reducing dimension that keep the matrix structure while reducing the dimension of the observed data: a PCA model and an FA model for matrix-valued time series data. The estimation procedures, theoretical properties, and a simulation study are presented for both models.
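One way to reduce dimension while keeping the matrix structure, in the spirit of the PCA model described, is to build separate row and column projections (a 2DPCA-like sketch on synthetic data, not the thesis's own estimator):

```python
import numpy as np

rng = np.random.default_rng(5)
T, p, q = 200, 4, 6                      # 200 observations of a 4x6 matrix
A = rng.normal(size=(T, p, q))
Ac = A - A.mean(axis=0)                  # subtract the mean matrix

# Row-wise (p x p) and column-wise (q x q) covariances, pooled over time
row_cov = np.einsum('tij,tkj->ik', Ac, Ac) / T
col_cov = np.einsum('tij,tik->jk', Ac, Ac) / T

# Top row and column directions
U = np.linalg.eigh(row_cov)[1][:, ::-1][:, :2]
V = np.linalg.eigh(col_cov)[1][:, ::-1][:, :3]

# Each 4x6 observation is compressed to a 2x3 score matrix: U' A_t V
scores = np.einsum('ip,tij,jq->tpq', U, Ac, V)
```

Vectorizing each 4x6 matrix would instead require a single 24x24 covariance; the bilinear form above keeps the row/column interpretation and needs far fewer parameters.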
APA, Harvard, Vancouver, ISO, and other styles
12

Königsson, Sofia. "PCA för detektering av avvikande händelser i en kraftvärmeprocess." Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-347516.

Full text
Abstract:
Boiler 6 at the Högdalen facility in southern Stockholm (P6), combined with a steam turbine, produces combined heat and power (CHP) through combustion of sorted waste fuel from industry and society. In order to minimise maintenance costs and increase plant availability, it is important to detect process faults and deviations at an early stage. In this study a method for detecting anomalous events using Principal Component Analysis (PCA) is applied to the CHP production process. A PCA model with reduced dimension is created from process data recorded during a problem-free period and is used as a template against which new operating data are compared in a control chart. Deviations from the model should indicate abnormal conditions, and the reasons for the deviations are analysed. Two cases of tube failure, which occurred in 2014 and 2015 in one of the tube bundles cooling the flue gases, are used to study the deviations. The results show that process deviations from the models can be detected in the control chart in both cases of tube failure, and that the variables known to be associated with tube failure contribute most to the deviating behaviour. There is potential for applying this method to process monitoring; a difficulty lies in creating a model that represents the stable process, because there is large variation within what is considered a stable process state, and this requires further work. The method can already be used as an analysis tool, for example when a tube failure is suspected.
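The control-chart idea can be sketched with a Hotelling T² statistic computed in the reduced PCA space (synthetic "process" data; real applications also monitor the residual SPE statistic, which is omitted here):

```python
import numpy as np

rng = np.random.default_rng(11)
# Synthetic "normal operation" data: 5 correlated process variables
# driven by 2 latent factors (illustrative, not plant data)
latent = rng.normal(size=(300, 2))
train = latent @ rng.normal(size=(2, 5)) + 0.1 * rng.normal(size=(300, 5))

mean, std = train.mean(axis=0), train.std(axis=0)
Z = (train - mean) / std                  # autoscale with training statistics
_, s, Vt = np.linalg.svd(Z, full_matrices=False)
k = 2                                     # retained principal components
P = Vt[:k].T                              # loadings
lam = s[:k] ** 2 / (len(Z) - 1)           # score variances

def hotelling_t2(x):
    """T^2 of a new sample against the normal-operation PCA model."""
    t = ((x - mean) / std) @ P
    return float(np.sum(t ** 2 / lam))

limit = 9.21        # chi-square(2) 99% quantile, a common control limit
x_fault = mean + std * (10.0 * P[:, 0])   # large deviation along PC1
```

New samples whose T² exceeds the control limit are flagged, and the per-variable contributions to the score deviation are then inspected, which is how the thesis traces the alarms back to tube-leakage-related variables.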
APA, Harvard, Vancouver, ISO, and other styles
13

SILVA, Thiago César de Oliveira. "CLASSIFICAÇÃO DE CERVEJAS POR ANÁLISE DE IMAGENS E PCA." Universidade Federal de Goiás, 2008. http://repositorio.bc.ufg.br/tede/handle/tde/1035.

Full text
Abstract:
In this work, a simple new approach to lager beer pattern recognition based on image analysis was developed using Principal Component Analysis (PCA). Digital color images obtained from a scanner were decomposed into binary form with the help of the Scilab and SIP software packages. The PCA data matrix consisted of RGB color histograms for each sample. 350 mL beer cans were chosen at random from the shelves of local supermarkets, ninety cans in total from ten different brands. 200 mL samples were immersed in an ultrasound bath for twenty to thirty minutes before the digital image processing. The resulting methodology can be a simple alternative for the analytical quality control of lager beers.
In this work, a new methodology for classifying beers by image analysis was developed, with colour as the quality attribute, employing Principal Component Analysis (PCA). The work used the mathematical software Scilab and the SIP computational package, which allowed the decomposition and analysis of digitised images obtained with a commercial scanner. Routines were applied in the software to decompose the images, generated in JPEG and BMP formats, into the R, G and B channels of the RGB colour standard. Through the algorithms developed, the data matrix resulting from the image of each beer sample was transformed into a new matrix, which generated the corresponding mean frequency histogram over 768 variables; each colour channel contains 256 colour indices (0 to 255). The different beer brands could then be analysed through the PCA plots, based on the loadings and scores plots and on the mean frequency histograms of each brand. JPEG images were analysed for ten brands of pilsener-type beer in 350 mL cans, chosen at random among the most consumed brands in the metropolitan region of Goiânia. About 200 mL of beer were taken from each can and degassed in an ultrasound bath before image capture on the scanner. During the image analyses, noticeable colour changes were observed in the beer samples when exposed to light and air, and this phenomenon came to be evaluated as a secondary objective of the work. In a second stage, in order to extend the image-analysis study, new analyses were carried out with images in bitmap (BMP) format, for which a new sampling was performed under the same criteria as before.
In this new sampling, eight brands were chosen, seven of which coincided with the brands used in the JPEG image analysis. The results obtained in this work indicate that the proposed methodology can be a simple analytical alternative for beer quality control, with colour as the attribute.
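The 768-variable RGB histogram features described in this entry can be sketched as follows (synthetic colour patches standing in for the scanned beer images):

```python
import numpy as np

rng = np.random.default_rng(9)

def rgb_histogram(image):
    """Concatenate the 256-bin histograms of the R, G and B channels
    into one 768-variable feature vector."""
    return np.concatenate([
        np.bincount(image[..., c].ravel(), minlength=256)
        for c in range(3)
    ]).astype(float)

# Two synthetic groups of 32x32 colour patches with different dominant colour
pale = rng.integers(150, 220, size=(5, 32, 32, 3), dtype=np.uint8)
dark = rng.integers(30, 100, size=(5, 32, 32, 3), dtype=np.uint8)
H = np.array([rgb_histogram(im) for im in np.concatenate([pale, dark])])

# PCA on the histogram matrix: the PC1-PC2 score plot separates the groups
Hc = H - H.mean(axis=0)
_, _, Vt = np.linalg.svd(Hc, full_matrices=False)
scores = Hc @ Vt[:2].T
```

Because each group's colour mass sits in different histogram bins, the first principal component picks up the between-group difference, which is essentially how brands separate in the score plots the thesis describes.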
APA, Harvard, Vancouver, ISO, and other styles
14

Bhosle, Naren. "AMELIORATING AERIAL NEAR MISSES WITH PCA DEVICES AND MOTL." Available to subscribers only, 2009. http://proquest.umi.com/pqdweb?did=1885431391&sid=5&Fmt=2&clientId=1509&RQT=309&VName=PQD.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Lacerda, Eduardo Alberto Duarte. "Tomografia PCA aplicada às galáxias do Projeto CALIFA Survey." reponame:Repositório Institucional da UFSC, 2014. https://repositorio.ufsc.br/xmlui/handle/123456789/123229.

Full text
Abstract:
Dissertation (Master's) - Universidade Federal de Santa Catarina, Centro de Ciências Físicas e Matemáticas, Graduate Program in Physics, Florianópolis, 2014.
Abstract : Spectroscopic surveys of galaxies, like the SDSS, are currently being taken to a next level with integral field units, turning the focus from the properties of galaxies as a whole to the internal physics of galaxies. The CALIFA survey is a pioneer in this new generation, providing spatially resolved spectroscopy for hundreds of galaxies of all shapes and masses. Several studies have been conducted with datacubes from this survey, examining the spatial distributions of emission lines, stellar populations, gaseous and stellar kinematics, etc. In this explotatory work we approach the analysis of CALIFA datacubes from the mathematical perspective of principal component analysis (PCA) tomography techniques developed by Steiner and collaborators. We apply the PCA tomography to 8 CALIFA galaxies. (4 late-type, 2 early-type and 2 mergers). Emission lines are masked in our analysis to concentrate on the stellar population properties. The physical and angular scales spaned by our datacubes (which cover whole galaxies) are much larger than those spanned by previous applications of this technique. This diference prompted us to introduce a simple variation on the pre-PCA stage, normalizing all spectra in a datacube to a common flux scale, which highlight qualitative (as opposed to amplitude-related quantitative) spectral variance in the datacube. This eliminates noise and flux-calibration effects on the data. A well known caveat with PCA is that it does not assign physical meaning to its results. To tackle this problem we use a starlight-based stellar population analysis of the same data to reverse-engineer the astrophysical meaning of the principal components. In every galaxy we are able to find some correlation between principal components and physical parameters. We find that the first principal component in spiral galaxies usually correlates with the mean stellar age (hlog tiL). 
In mergers the first component essentially reflects variations of the global extinction parameter (AV). In some cases, PCA tomography helps us find errors in the original datacube, because the huge variance imposed by damaged pixels usually appears in isolated principal components.
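The pre-PCA normalization and tomogram construction described in the abstract can be sketched in a few lines of numpy; everything here (cube size, values) is a synthetic stand-in rather than CALIFA data, and the thesis's actual pipeline is considerably richer.

```python
import numpy as np

# Synthetic stand-in datacube (ny, nx, nlambda); real CALIFA cubes are far larger.
rng = np.random.default_rng(0)
ny, nx, nl = 4, 4, 50
cube = rng.random((ny, nx, nl)) + 1.0

# Pre-PCA step from the abstract: normalize every spectrum to a common flux
# scale, so PCA captures qualitative rather than amplitude-related variance.
spectra = cube.reshape(ny * nx, nl)
spectra = spectra / spectra.mean(axis=1, keepdims=True)

# PCA tomography: eigen-decompose the spectral covariance and project each
# spaxel onto the principal components, giving one "tomogram" per component.
X = spectra - spectra.mean(axis=0)
cov = X.T @ X / (X.shape[0] - 1)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]          # sort components by variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
tomograms = (X @ eigvecs).reshape(ny, nx, nl)
```

Each slice `tomograms[:, :, k]` is then a spatial map of the k-th principal component, which is what gets compared against physical parameters.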
APA, Harvard, Vancouver, ISO, and other styles
16

Wang, Jin. "An Incremental Multilinear System for Human Face Learning and Recognition." FIU Digital Commons, 2010. http://digitalcommons.fiu.edu/etd/312.

Full text
Abstract:
This dissertation establishes a novel system for human face learning and recognition based on incremental multilinear Principal Component Analysis (PCA). Most existing face recognition systems need training data during the learning process. The system proposed in this dissertation utilizes an unsupervised or weakly supervised learning approach, in which the learning phase requires a minimal amount of training data. It also overcomes the inability of traditional systems to adapt during the testing phase, where the decision process for newly acquired images continues to rely on the same old training data set. Consequently, when a new training set is to be used, the traditional approach requires that the entire eigensystem be generated again. As a means to speed up this computational process, the proposed method uses the eigensystem generated from the old training set together with the new images to generate the new eigensystem more effectively, in a so-called incremental learning process. In the empirical evaluation phase, two key factors are essential in evaluating the performance of the proposed method: (1) recognition accuracy and (2) computational complexity. In order to establish the most suitable algorithm for this research, a comparative analysis of the best performing methods was carried out first. The results of the comparative analysis advocated the initial utilization of multilinear PCA in our research. To address the issue of computational complexity in the subspace update procedure, a novel incremental algorithm was established, which combines the traditional sequential Karhunen-Loeve (SKL) algorithm with the newly developed incremental modified fast PCA algorithm. In order to utilize multilinear PCA in the incremental process, a new unfolding method was developed to affix the newly added data at the end of the previous data.
The results of the incremental process based on these two methods were obtained to bear out these new theoretical improvements. Some object tracking results using video images are also provided as another challenging task to prove the soundness of this incremental multilinear learning method.
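An incremental subspace update in the spirit of the sequential Karhunen-Loeve (SKL) step mentioned above might look like the following simplified numpy sketch; the thesis's actual algorithm (and its combination with the modified fast PCA and multilinear unfolding) is more elaborate, so treat this purely as an illustration.

```python
import numpy as np

def skl_update(U, S, new_data, k):
    """Fold new observations into an existing rank-k eigenbasis without
    revisiting the old observations (simplified SKL-style update)."""
    proj = U.T @ new_data                 # components along the current basis
    resid = new_data - U @ proj           # what the basis cannot explain
    Q, _ = np.linalg.qr(resid)            # orthonormal residual directions
    # Solve a small augmented problem instead of redecomposing all data.
    top = np.hstack([np.diag(S), proj])
    bottom = np.hstack([np.zeros((Q.shape[1], S.size)), Q.T @ new_data])
    Uk, Sk, _ = np.linalg.svd(np.vstack([top, bottom]), full_matrices=False)
    U_new = np.hstack([U, Q]) @ Uk
    return U_new[:, :k], Sk[:k]

rng = np.random.default_rng(1)
d, k = 64, 5
old = rng.random((d, 40))                 # "old training set", one column per image
U, S, _ = np.linalg.svd(old, full_matrices=False)
U, S = U[:, :k], S[:k]                    # initial eigensystem
new = rng.random((d, 8))                  # newly acquired images
U, S = skl_update(U, S, new, k)           # updated without touching `old`
```

The point of the augmented small matrix is that its size depends on `k` and the batch size, not on how many images have been seen so far, which is what makes the update cheap.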
APA, Harvard, Vancouver, ISO, and other styles
17

David, Höglin. "Regional and Local Factors Influencing the Mass Balance of the Scandinavian Glaciers." Thesis, Uppsala universitet, Institutionen för geovetenskaper, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-311152.

Full text
Abstract:
According to climate models there will be an increase in the amount of greenhouse gases, which results in a warming of the earth, where the change will be most prominent at high latitudes. Glacier mass balance is a good climate-change indicator, as the response is fast when the climate changes. Glacier mass balance, glacier area, and equilibrium line altitude data for 13 glaciers in Scandinavia, as well as North Atlantic Oscillation (NAO), Arctic Oscillation (AO), and sunspot data, were gathered, and a principal component analysis (PCA) was carried out. PCA is a multivariate statistical technique whose goal is to extract important information and reduce the dimension of the data. Three distinct groupings were found within the data set; they were identified as extreme years of the North Atlantic Oscillation and Arctic Oscillation, and as one glacier which had the largest area of the 13 glaciers. The PCA showed that all the variables in the data set correlate with the North Atlantic and Arctic Oscillations at about 40%, and we can conclude that there is a regional and a local forcing within our data, where the regional one (NAO and AO) is more important for the variance and for the mass balance.
According to climate models, an increase of greenhouse gases in the atmosphere will lead to an increase in the earth's temperature, and that increase will occur mainly at high latitudes. Glaciers are good indicators of a changing climate because of their short response time when the climate changes. At present there are about 1900 glaciers spread across the Scandinavian mountains. Because Scandinavia is so elongated there is a difference in meteorological and climatic conditions, both in a north-south direction and in an east-west direction, with continental glaciers in the east and more maritime ones in the west. Climate and glacier data for 13 glaciers in Scandinavia, 5 from Sweden and 8 from Norway, were collected, and a statistical analysis, principal component analysis (PCA), was carried out to see what influences the mass balance of the glaciers. The climate parameters investigated were the North Atlantic Oscillation (NAO), the Arctic Oscillation (AO), and sunspots, together with the mass balance, equilibrium line altitude (ELA), and area of the glaciers. Three groupings were found that can be linked to different climate variables; the PCA shows extreme years for the NAO and AO as well as one glacier with the largest area. The PCA showed that all variables correlate with the NAO and AO at around 40%, and we can conclude that there are driving regional and local forces within our data set, where the NAO and AO are most important for the mass balance.
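As a minimal illustration of how PCA attributes shared variance to a leading component, the sketch below builds 13 synthetic "glacier" series driven by one common "NAO-like" index (all values synthetic, not the thesis data) and reads off the explained-variance fractions.

```python
import numpy as np

rng = np.random.default_rng(2)
n_years = 200
nao = rng.standard_normal(n_years)                  # shared "NAO-like" driver
data = np.column_stack([0.6 * nao + 0.8 * rng.standard_normal(n_years)
                        for _ in range(13)])        # 13 synthetic glacier series

# Standardize, then read off how much variance each component explains.
X = (data - data.mean(axis=0)) / data.std(axis=0)
eigvals = np.sort(np.linalg.eigvalsh(np.cov(X, rowvar=False)))[::-1]
explained = eigvals / eigvals.sum()                 # PC1 carries the shared forcing
```

Here the first component absorbs the variance shared through the common driver, mirroring the ~40% regional (NAO/AO) signal reported in the abstract.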
APA, Harvard, Vancouver, ISO, and other styles
18

Povia, Giovana Soato. "Determinação dos parametros de qualidade de detergentes em po utilizando espectroscopia no infravermelho proximo." [s.n.], 2007. http://repositorio.unicamp.br/jspui/handle/REPOSIP/249948.

Full text
Abstract:
Advisor: Celio Pasquini
Master's dissertation - Universidade Estadual de Campinas, Instituto de Quimica
Made available in DSpace on 2018-08-09T09:59:17Z (GMT). No. of bitstreams: 1 Povia_GiovanaSoato_M.pdf: 1681817 bytes, checksum: d59818d169390e4a42e0555251507c27 (MD5) Previous issue date: 2007
Resumo: This work aims to develop an analytical method for the determination of quality parameters in powder detergents using near-infrared (NIR) spectroscopy and multivariate calibration techniques. Two sample sets were used: the first for the quantitative analyses and the second for the qualitative analyses. The samples of the first set had their quality parameters determined by the respective reference methods. The statistical technique used for the calibrations was PLS. Calibration models were developed for predicting moisture content, active matter, and density. The performance of the calibration models was evaluated by external validation. The determination of moisture content gave RMSEP = 0.29% (w/w). The RMSEP for the determination of active matter was 0.37% (w/w), and for the determination of density RMSEP = 14 g L⁻¹. The models gave satisfactory results, and the errors found are acceptable for the control range used in industry. The second set is composed of four groups with distinct characteristics. Two classification methods were evaluated: SIMCA and PLS-DA. Discrimination of the samples with higher active matter content was observed; however, the other groups could not be discriminated. The two classification methods gave similar results, with 100% correct classification of external samples only into their respective groups.
Abstract: This work aims at the development of an analytical method for the determination of quality parameters of powder detergents using near-infrared spectroscopy (NIR) and multivariate calibration techniques. Two sets of samples were used: the first one for the quantitative analysis and the second one for qualitative analysis. The samples of the first set had their quality parameters determined by their respective reference methods. The chemometric technique used for calibration was PLS1. Calibrations for the prediction of the moisture content, active matter, and density were developed. The performance of the calibration models was evaluated through external validation. The determination of the moisture content presented an RMSEP = 0.29% (w/w). The RMSEP for the determination of the active matter was 0.37% (w/w), and for the determination of the density the RMSEP was 14 g L⁻¹. The constructed models presented satisfactory results, and the errors that were found are acceptable for the control range used in industry. The second set is composed of four groups of powder detergents which present different characteristics. Two methods of classification were evaluated: SIMCA and PLS-DA. It was possible to observe the discrimination of the samples which present a higher active matter content; however, the other groups could not be discriminated. Both classification methods evaluated presented similar results, with 100% correct classification of external samples only into their respective groups.
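A minimal PLS1 (NIPALS) regression of the kind used for such calibrations can be sketched as below; the "spectra" and reference values are synthetic stand-ins, not the detergent data, and the helper name `pls1` is purely illustrative.

```python
import numpy as np

def pls1(X, y, n_comp):
    """Minimal NIPALS PLS1: returns the regression vector for centered data."""
    Xk = X - X.mean(axis=0)
    yk = y - y.mean()
    W, P, q = [], [], []
    for _ in range(n_comp):
        w = Xk.T @ yk
        w /= np.linalg.norm(w)            # weight vector
        t = Xk @ w                        # scores
        p = Xk.T @ t / (t @ t)            # X loadings
        qk = yk @ t / (t @ t)             # y loading
        Xk = Xk - np.outer(t, p)          # deflate X
        yk = yk - qk * t                  # deflate y
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    return W @ np.linalg.inv(P.T @ W) @ q

# Synthetic "NIR" data: rank-3 spectra plus noise, property linear in the scores.
rng = np.random.default_rng(3)
scores = rng.standard_normal((40, 3))
X = scores @ rng.standard_normal((3, 60)) + 0.01 * rng.standard_normal((40, 60))
y = scores @ np.array([1.0, 0.5, -0.3])

B = pls1(X, y, 3)
pred = (X - X.mean(axis=0)) @ B + y.mean()
rmse = float(np.sqrt(np.mean((pred - y) ** 2)))   # plays the role of RMSEP
```

In practice the prediction error would be reported on an external validation set, as the abstract describes, rather than on the calibration samples themselves.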
Master's degree
Analytical Chemistry
Master in Chemistry
APA, Harvard, Vancouver, ISO, and other styles
19

Pedro, Andre Messias Krell. "Desenvolvimento do metodo multivariado acelerado para determinação do prazo de validade de produtos unindo quimiometria e cinetica quimica." [s.n.], 2009. http://repositorio.unicamp.br/jspui/handle/REPOSIP/248542.

Full text
Abstract:
Advisor: Marcia Miguel Castro Ferreira
Doctoral thesis - Universidade Estadual de Campinas, Instituto de Quimica
Made available in DSpace on 2018-08-14T13:05:23Z (GMT). No. of bitstreams: 1 Pedro_AndreMessiasKrell_D.pdf: 1179705 bytes, checksum: 302be0a6ab2e796d557a625c76f92bc4 (MD5) Previous issue date: 2009
Resumo: DEVELOPMENT OF THE ACCELERATED MULTIVARIATE METHOD FOR DETERMINING THE SHELF-LIFE OF PRODUCTS BY UNITING CHEMOMETRICS AND CHEMICAL KINETICS describes a new concept of data analysis that makes it possible to evaluate the mechanisms that govern the degradation of consumer goods and to determine the period of time within which consumer goods keep their characteristics within acceptable levels. The algorithm unites chemometric techniques with formal chemical kinetic theory, constituting an advance over existing protocols for determining product shelf-life. Besides being easy to interpret, its main advantages include the capacity to unite information from different disciplines (analytical, physical-chemical, sensory) and the possibility of directly using data from analytical instrumentation such as spectrometers, chromatographs, calorimeters, etc. The work is divided into six chapters. Chapter I reviews chemical kinetics and the conventional methods for determining product shelf-life. The algorithm of the Accelerated Multivariate Method, as well as its premises, advantages, and disadvantages, is described in Chapter II. Chapters III and IV present applications of the Accelerated Multivariate Method. In Chapter III physical-chemical and sensory data were used to determine the shelf-life of a food product. In Chapter IV instrumental data from near-infrared (NIR) spectroscopy were used as a source of quantitative information which, combined with sensory evaluations, allowed the determination of the shelf-life of a cosmetic product and provided relevant information about its degradation modes. A comparison with the tri-linear PARAFAC method is presented in Chapter V. Although it constitutes an important tool for evaluating the degradation modes of products, the PARAFAC method lacks strategies for effectively determining the shelf-life of consumer goods.
Finally, Chapter VI shows the development of a protocol for determining several properties of concentrated tomato products using PLS2 regression. It was demonstrated that, despite researchers' preference for the PLS1 calibration method, PLS2 regression is advantageous when there is correlation between properties whose reference methods are precise and others whose quantification is imprecise.
Abstract: DEVELOPMENT OF THE MULTIVARIATE ACCELERATED SHELF-LIFE TEST (MASLT) FOR DETERMINING THE SHELF-LIFE OF PRODUCTS BY MERGING CHEMOMETRICS AND CHEMICAL KINETICS describes a new concept of data analysis for evaluating the mechanisms that govern the degradation of consumer goods. The algorithm herein developed took advantage of the soft modeling concepts from chemometrics and of chemical kinetics for determining the period of time within which products would be able to keep their quality characteristics within acceptable levels. Besides being of easy interpretation, the main advantages of the MASLT are the capacity to merge data from different sources (analytical, physical-chemical, sensorial) and the ability to handle instrumental data (spectroscopic, chromatographic, calorimetric, etc.) directly for determining the shelf-life of products. This work and some applications are presented in six chapters. In Chapter I, a review of chemical kinetics theory, as well as the concepts of conventional shelf-life methods, is presented. The MASLT algorithm, together with its assumptions, characteristics, and advantages, is presented in Chapter II. Chapters III and IV describe applications where the shelf-life was determined successfully by using the MASLT. In Chapter III, physical-chemical and sensorial data were merged not only for determining the shelf-life of an industrialized tomato product, but also to study the correlation between the various properties. In Chapter IV, instrumental data from NIR spectroscopy were used as a source of quantitative information and, together with sensory analyses, allowed the determination of the shelf-life of a cosmetic product as well as provided valuable information with regard to its degradation mechanisms. A comparison with the tri-linear PARAFAC method is presented in Chapter V.
Despite being an important tool for evaluating the degradation mechanisms of consumer goods, the PARAFAC method lacks strategies for effectively determining the shelf-life of products. Finally, Chapter VI describes the development of a new protocol for determining several properties of tomato concentrate products using PLS2 regression. It was demonstrated that, despite most researchers having a preference for using PLS1 regression, PLS2 is advantageous when there are strong correlations between properties that can be determined quite accurately by their reference methods and those presenting lower accuracy.
Doctorate
Physical Chemistry
Doctor of Sciences
APA, Harvard, Vancouver, ISO, and other styles
20

Massaro, James. "A PCA based method for image and video pose sequencing /." Online version of thesis, 2010. http://hdl.handle.net/1850/11991.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Louis, Pierre-Yves. "Ergodicity of PCA : equivalence between spatial and temporal mixing conditions." Universität Potsdam, 2004. http://opus.kobv.de/ubp/volltexte/2006/658/.

Full text
Abstract:
For a general attractive probabilistic cellular automaton on S^{Z^d}, we prove that the (time-) convergence towards equilibrium of this Markovian parallel dynamics, exponentially fast in the uniform norm, is equivalent to a condition (A). This condition means the exponential decay of the influence from the boundary for the invariant measures of the system restricted to finite boxes. For a class of reversible PCA dynamics on {-1,+1}^{Z^d}, with a naturally associated Gibbsian potential ρ, we prove that a (spatial-) weak mixing condition (WM) for ρ implies the validity of the assumption (A); thus exponential (time-) ergodicity of these dynamics towards the unique Gibbs measure associated to ρ holds. On some particular examples we show that exponential ergodicity holds as soon as there is no phase transition.
APA, Harvard, Vancouver, ISO, and other styles
22

Katadound, Sachin. "Face Recognition: Study and Comparison of PCA and EBGM Algorithms." TopSCHOLAR®, 2004. http://digitalcommons.wku.edu/theses/241.

Full text
Abstract:
Face recognition is a complex and difficult process due to various factors such as variability of illumination, occlusion, face-specific characteristics like hair, glasses, beard, etc., and other similar problems affecting computer vision. Using a system that offers robust and consistent results for face recognition, various applications such as identification for law enforcement, secure system access, human-computer interaction, etc., can be automated successfully. Different methods exist to solve the face recognition problem. Principal component analysis, independent component analysis, and linear discriminant analysis are a few of the statistical techniques commonly used in solving the face recognition problem. Genetic algorithms, elastic bunch graph matching, artificial neural networks, etc. are a few of the techniques that have been proposed and implemented. The objective of this thesis is to provide insight into the different methods available for face recognition, and to explore methods that provide an efficient and feasible solution. Factors affecting the result of face recognition and the preprocessing steps that eliminate such abnormalities are also discussed briefly. Principal Component Analysis (PCA) has been the most efficient and reliable method known for at least the past eight years. Elastic bunch graph matching (EBGM) is one of the promising techniques that we studied in this thesis work. We also found better results with the EBGM method than with PCA in the current thesis. We recommend the use of a hybrid technique involving the EBGM algorithm to obtain better results. Although the EBGM method took a long time to train and to generate distance measures for the given gallery images compared to PCA, we obtained better cumulative match score (CMS) results for EBGM than for the PCA method.
Other promising techniques that can be explored separately in other papers include genetic algorithm based methods, mixtures of principal components, and Gabor wavelet techniques.
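The PCA (eigenface) side of such a comparison can be sketched with the classic "snapshot" trick; the gallery images below are random stand-ins rather than a real face database, so this only illustrates the mechanics, not the reported accuracy.

```python
import numpy as np

rng = np.random.default_rng(4)
gallery = rng.random((20, 24 * 24))          # 20 flattened stand-in "faces"

mean_face = gallery.mean(axis=0)
A = gallery - mean_face
# Snapshot trick: eigenvectors of the small Gram matrix A A^T map back
# through A to give the eigenfaces, avoiding a 576x576 eigenproblem.
vals, vecs = np.linalg.eigh(A @ A.T)
order = np.argsort(vals)[::-1][:10]
eigenfaces = A.T @ vecs[:, order]
eigenfaces /= np.linalg.norm(eigenfaces, axis=0)

def project(img):
    """Coordinates of an image in eigenface space."""
    return eigenfaces.T @ (img - mean_face)

codes = np.array([project(g) for g in gallery])
probe = gallery[7] + 0.01 * rng.standard_normal(24 * 24)   # noisy copy of face 7
match = int(np.argmin(np.linalg.norm(codes - project(probe), axis=1)))
```

Recognition is then nearest-neighbor matching in the low-dimensional eigenface space, which is what a cumulative match score curve summarizes over many probes.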
APA, Harvard, Vancouver, ISO, and other styles
23

Nelson, Philip R. C. MacGregor John F. Taylor Paul A. "The treatment of missing measurements in PCA and PLS models /." *McMaster only, 2002.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
24

Rueda, Juan. "Evaluation of The Biodegradability and Toxicity of PCA and mPCA." Master's thesis, University of Central Florida, 2013. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/5696.

Full text
Abstract:
The main types of hypergolic propellants used at Kennedy Space Center (KSC) are hydrazine (HZ) and monomethylhydrazine (MMH). HZ and MMH are classified as hazardous materials and they are also known to be potentially carcinogenic to humans; therefore, handling these substances and their waste is strictly regulated. The waste streams from HZ and MMH have been estimated to be the main hazardous waste streams at KSC. Currently at KSC these wastes are first neutralized using citric acid and then they are transported on public roads for incineration as hazardous materials. A new method using alpha-ketoglutaric acid (AKGA) was proposed to treat HZ and MMH wastes. From the reaction of AKGA with HZ and MMH two stable products are formed, 1,4,5,6-tetrahydro-6-oxo-3-pyridazinecarboxylic acid (PCA) and 1-methyl-1,4,5,6-tetrahydro-6-oxo-3-pyridazinecarboxylic acid (mPCA), respectively. The cost of purchasing AKGA is greater than the cost of purchasing citric acid; thus, AKGA can only become a cost-effective alternative for the treatment of HZ and MMH wastes if the products of the reactions (PCA and mPCA) can be safely disposed of into the sewage system without affecting the treatment efficiency and effluent quality of the wastewater treatment plant (WWTP). In this research mPCA and PCA were analyzed for acute toxicity using fish and crustaceans, for their effect on the wastewater treatment efficiency and viability using activated sludge (AS) microbes, and for their biodegradability by AS organisms. Acute toxicity in fish and crustaceans was investigated according to the USEPA methods for acute toxicity (USEPA Method EPA-821-R-02-012) using Ceriodaphnia dubia (96 hours) and Pimephales promelas (96 hours) as the test organisms. The effect of mPCA and PCA on the treatment efficiency and viability was estimated from respiration inhibition tests (USEPA Method OCSPP 850.3300) and heterotrophic plate counts (HPCs).
Lastly, the biodegradability of mPCA and PCA was assessed using the Closed Bottle Test (USEPA Method OPPTS 835.3110). For mPCA, the 96-hour LC50 for C. dubia was estimated at 0.77 ± 0.06 g/L (with a 95% confidence level) and the NOEC was estimated at 0.5 g/L. For P. promelas, the LC50 was above 1.5 g/L, but it was noticed that mPCA had an effect on their behavior. Abnormal behavior observed included loss of equilibrium and curved spine. The NOEC for the fish was estimated at 0.75 g/L. PCA did not exhibit significant mortality in fish or crustaceans. The LC50 of PCA in P. promelas and C. dubia was > 1.5 g/L and the NOEC was 1.5 g/L for both organisms. No inhibitory effect on the heterotrophic respiration of activated sludge organisms was observed after exposing them for 180 min to PCA and mPCA at concentrations of up to 1.5 g/L, compared to the blank controls. Overall, the impact of PCA and mPCA on total respiration rates was small, and only observed at 1,500 mg/L if at all. The difference was apparently caused by inhibition of nitrification rather than heterotrophic inhibition. However, due to the variability observed in the measurements of the replicates, it is not possible to firmly conclude that PCA or mPCA at 1,500 mg/L was inhibitory to nitrification. Based on the results from the HPCs, mPCA and PCA did not affect the viability of heterotrophic organisms at 750 mg/L. In the BOD-like closed bottle test using a diluted activated sludge mixed liquor sample, the AS microorganisms were capable of biodegrading up to 67% of a 2 mg/L concentration of PCA (with respect to its theoretical oxygen demand, or ThOD) in 28 days. No biodegradation was observed in the samples containing 2 and 5 mg/L of mPCA after 28 days of incubation using a diluted activated sludge mixed liquor sample as inoculum. The results of this study show that mPCA is more toxic than PCA to Ceriodaphnia dubia and Pimephales promelas.
However, neither mPCA nor PCA had an effect on the heterotrophic respiration of an AS mixed liquor sample at 1.5 g/L, and there was probably no significant inhibition of the nitrification respiration. Samples of PCA and mPCA at 2 and 5 mg/L could not be completely degraded (with respect to their total theoretical oxygen demand) by dilute AS biomass during a 28-day incubation period. mPCA did not show significant degradation in the two different biodegradation tests performed.
M.S.Env.E.
Masters
Civil, Environmental, and Construction Engineering
Engineering and Computer Science
Environmental Engineering
APA, Harvard, Vancouver, ISO, and other styles
25

Barros, Guilherme Antonio Moreira de [UNESP]. "Considerações sobre analgesia controlada pelo paciente (PCA) em hospital universitário." Universidade Estadual Paulista (UNESP), 2001. http://hdl.handle.net/11449/86638.

Full text
Abstract:
Made available in DSpace on 2014-06-11T19:22:21Z (GMT). No. of bitstreams: 0. Previous issue date: 2001. Bitstream added on 2014-06-13T20:28:37Z: No. of bitstreams: 1; barros_gam_me_botfm.pdf: 228323 bytes, checksum: 6466d10f0d4eb9a3ff79e8f785fea941 (MD5)
With the rapid advances observed in recent years in surgical and anesthetic techniques, procedures have become increasingly invasive. As the population has progressively aged, the most delicate period of recovery, namely the postoperative period, has come to receive greater attention. The emergence of new analgesia techniques, such as Patient-Controlled Analgesia (PCA), fulfills the needs of a medical community increasingly attentive to the quality of the services provided. The Hospital das Clínicas of the UNESP Medical School, Botucatu, aware of this new reality, created the Acute Pain Management Service (SEDA) so that this gap would also be filled in our setting. In order to assess the performance of the SEDA, a survey was carried out from February 1995 to December 1997, reviewing the records of 679 patients followed by the SEDA who used the PCA method of analgesia. The results obtained by the Service were above the average reported in the international literature, with excellent levels of analgesia achieved, a low occurrence of side effects, and no fatal complications during the study period.
In recent years rapid development has been observed in surgical and anesthetic techniques, with more invasive procedures being performed. As the general population has grown older, the critical recovery period, that is, the post-surgery period, has become a focus of attention. The development of new analgesia techniques, such as Patient Controlled Analgesia (PCA), is intended to fulfill the needs of a medical community increasingly aware of the quality of its services. The Hospital of the Sao Paulo State Medical School, Botucatu, aware of this new reality, decided to form the Acute Pain Management Service (SEDA). With the goal of identifying how the SEDA operates, this research was carried out over the period from February 1995 to December 1997. Data from 679 patients who used the PCA device were evaluated. The results in this study were as good as those reported in the international literature, with high-quality analgesia, few side effects, and no fatal complications during the period observed.
APA, Harvard, Vancouver, ISO, and other styles
26

Moreira, João Paulo Francisco. "Análise de desempenho de anéis de controlo baseado em PCA." Master's thesis, Faculdade de Ciências e Tecnologia, 2013. http://hdl.handle.net/10362/11337.

Full text
Abstract:
Dissertation for the degree of Master in Electrical and Computer Engineering
Several works in the literature highlight the importance of control-loop performance assessment in industrial systems. Industry is interested in this area of research and development because of its economic and quality benefits. Two performance-assessment methodologies for real-time SISO systems, covering both fault-free and faulty situations, are proposed in this dissertation. The first contribution of this work is a hybrid approach fundamentally based on the normalized Harris index, on principal component analysis, and on a predictive neural network. This approach makes it possible to analyze performance as well as to detect and isolate some disturbances and faults typical of industry. The second contribution is an approach based on the dynamic characteristics of the systems and on a multi-model PCA of the process. It relies on the estimated static gain, the estimated bandwidth, a PCA model for each operating point, and a neural pattern classifier. This approach also makes it possible to evaluate performance and to detect and isolate different types of faults and disturbances. In order to evaluate the overall performance of the methodologies, experimental tests were carried out on a benchmark nonlinear three-tank system (AMIRA DTS 200).
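The Harris index at the core of the first approach compares a loop's output variance with a minimum-variance benchmark estimated from the loop's own data. The sketch below is a generic, non-normalized illustration on a synthetic loop output, not the thesis's hybrid method; the AR-style prediction used to estimate the benchmark is one common simplification.

```python
import numpy as np

# Synthetic loop output: a minimum-variance part (delay = 3 samples) plus
# extra variance that a better controller could remove.
rng = np.random.default_rng(5)
n, delay, p = 2000, 3, 10
e = rng.standard_normal(n)
y = e.copy()
y[delay:] += 0.8 * e[:-delay]

# Estimate the minimum-variance benchmark as the residual variance of a
# linear prediction of y[t] from outputs older than the loop delay.
rows = np.array([y[t - delay - p + 1 : t - delay + 1][::-1]
                 for t in range(delay + p - 1, n)])
target = y[delay + p - 1 : n]
coef, *_ = np.linalg.lstsq(rows, target, rcond=None)
sigma2_mv = np.mean((target - rows @ coef) ** 2)

harris = sigma2_mv / np.var(y)   # 1 = minimum-variance control; smaller = worse
```

For this synthetic loop the true index is about 0.6, i.e., roughly 40% of the output variance is in principle removable by better control.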
APA, Harvard, Vancouver, ISO, and other styles
27

Barros, Guilherme Antonio Moreira de. "Considerações sobre analgesia controlada pelo paciente (PCA) em hospital universitário /." Botucatu : [s.n.], 2001. http://hdl.handle.net/11449/86638.

Full text
Abstract:
Advisor: Lino Lemonica
Resumo: With the rapid advances observed in recent years in surgical and anesthetic techniques, procedures have become increasingly invasive. As the population has progressively aged, the most delicate period of recovery, namely the postoperative period, has come to receive greater attention. The emergence of new analgesia techniques, such as Patient-Controlled Analgesia (PCA), fulfills the needs of a medical community increasingly attentive to the quality of the services provided. The Hospital das Clínicas of the UNESP Medical School, Botucatu, aware of this new reality, created the Acute Pain Management Service (SEDA) so that this gap would also be filled in our setting. In order to assess the performance of the SEDA, a survey was carried out from February 1995 to December 1997, reviewing the records of 679 patients followed by the SEDA who used the PCA method of analgesia. The results obtained by the Service were above the average reported in the international literature, with excellent levels of analgesia achieved, a low occurrence of side effects, and no fatal complications during the study period.
Abstract: In recent years rapid development has been observed in surgical and anesthetic techniques, with more invasive procedures being performed. As the general population has grown older, the critical recovery period, that is, the post-surgery period, has become a focus of attention. The development of new analgesia techniques, such as Patient Controlled Analgesia (PCA), is intended to fulfill the needs of a medical community increasingly aware of the quality of its services. The Hospital of the Sao Paulo State Medical School, Botucatu, aware of this new reality, decided to form the Acute Pain Management Service (SEDA). With the goal of identifying how the SEDA operates, this research was carried out over the period from February 1995 to December 1997. Data from 679 patients who used the PCA device were evaluated. The results in this study were as good as those reported in the international literature, with high-quality analgesia, few side effects, and no fatal complications during the period observed.
Master
APA, Harvard, Vancouver, ISO, and other styles
28

Marchbank, Gavin Clyde. "Posterior cerebral artery (PCA) infarcts and dreaming : a neuropsychological study." Master's thesis, University of Cape Town, 2013. http://hdl.handle.net/11427/14091.

Full text
Abstract:
Recent case reports have shown that global loss of dreaming can result from medial occipitotemporal lesions. These findings have cast doubt on Solms's reformulation of Charcot-Wilbrand Syndrome (CWS) into two distinct disorders of dreaming, and caused substantial confusion in dream research as far as the neurological correlates of dreaming are concerned. This study attempted to confirm these case reports and determine whether there were any characteristics unique to the lesions among patients who had lost the ability to dream following damage to medial occipito-temporal cortex. Nine participants (three non-dreamers and six dreamers) who had suffered non-hemorrhagic infarction in the territory of the posterior cerebral artery were recruited in this study. Case histories and neuroradiological data were used to compare the lesion sites of non-dreamers with dreamers. It was confirmed that complete loss of dreaming could result from lesions in medial occipito-temporal cortex. It was found that non-dreamers always suffered bilateral cortical damage as opposed to dreamers who all suffered unilateral damage. The lesions in the non-dreamers tended to be more posterior than the dreamers. It was further speculated that concomitant damage to the thalamus or parietal areas played a role in the causation of heteromodal loss of dreaming. The implications of these findings were discussed in relation to CWS, Solms's dream system, and dream-function research. Finally, future directions were considered.
APA, Harvard, Vancouver, ISO, and other styles
29

Zhang, Junchao, Haibo Luo, Rongguang Liang, Wei Zhou, Bin Hui, and Zheng Chang. "PCA-based denoising method for division of focal plane polarimeters." OPTICAL SOC AMER, 2017. http://hdl.handle.net/10150/623248.

Full text
Abstract:
Division of focal plane (DoFP) polarimeters are composed of interlaced linear polarizers overlaid upon a focal plane array sensor. Interpolation is essential to reconstruct the polarization information. However, current interpolation methods are based on the unrealistic assumption of noise-free images; it is therefore advantageous to carry out denoising before interpolation. In this paper, we propose a principal component analysis (PCA)-based denoising method, which works directly on DoFP images. Both simulated and real DoFP images are used to evaluate the denoising performance. Experimental results show that the proposed method can effectively suppress noise while preserving edges. (C) 2017 Optical Society of America
APA, Harvard, Vancouver, ISO, and other styles
30

Cheng, Chi Wa. "Probabilistic topic modeling and classification probabilistic PCA for text corpora." HKBU Institutional Repository, 2011. http://repository.hkbu.edu.hk/etd_ra/1263.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Alberi, Matteo. "La PCA per la riduzione dei dati di SPHERE IFS." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2014. http://amslaurea.unibo.it/6563/.

Full text
Abstract:
The first chapter of this thesis presents an overview of the main exoplanet detection methods: the radial velocity method, the astrometric method, the pulsar timing method, the transit method, the microlensing method, and finally the direct imaging method, which is examined in depth in the following chapters. The second chapter introduces the principles of diffraction and shows how starlight can be attenuated with a coronagraph; it describes the aberration phenomena caused by the instrumentation and by the distorting effects of the atmosphere, which give rise to the so-called speckles, and then presents modern technical solutions such as active and adaptive optics, which have considerably improved the quality of observations. The third chapter illustrates differential imaging techniques, which effectively remove speckles and improve image contrast. The fourth chapter gives a mathematical description of Principal Component Analysis, the statistical method used for the reduction of the astronomical data. The fifth chapter is dedicated to SPHERE, the instrument designed for the Very Large Telescope (VLT); in particular, it describes its IFS spectrograph, with which the data analyzed in this thesis were obtained during the testing phase. The sixth chapter presents the data-reduction procedures and the application of the IDL algorithm LA_SVD, which applies Principal Component Analysis and, analogously to the differential imaging methods discussed earlier, made it possible to remove the speckles and improve image contrast. The concluding section discusses the results.
APA, Harvard, Vancouver, ISO, and other styles
32

Svensson, Per-Edvin. "Characterization and Reduction of Noise in PET Data Using MVW-PCA." Thesis, Uppsala University, Department of Information Technology, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-113736.

Full text
Abstract:

Masked Volume-Wise Principal Component Analysis (MVW-PCA) is used in Positron Emission Tomography (PET) to distinguish structures with different kinetic behaviours of an administered tracer. In the article where MVW-PCA was introduced, a noise pre-normalization was suggested because of temporal and spatial variations of the noise between slices. However, the noise pre-normalization proposed in that article was only applicable to datasets reconstructed using the analytical method Filtered Back-Projection (FBP). This study aimed at developing a new noise pre-normalization that is applicable to datasets regardless of whether they were reconstructed with FBP or with an iterative reconstruction algorithm, such as Ordered Subset Expectation Maximization (OSEM).

A phantom study was performed to investigate the differences of expectation values and standard deviations of datasets reconstructed with FBP and OSEM. A novel noise pre-normalization method named "higher-order principal component noise pre-normalization" (HOPC noise pre-normalization) was suggested and evaluated against other pre-normalization methods on both synthetic and clinical datasets.

Results showed that MVW-PCA of data reconstructed with FBP was much more dependent on an appropriate pre-normalization than analysis of data reconstructed with OSEM. HOPC noise pre-normalization showed an overall good performance with both FBP and OSEM reconstructions, whereas the other pre-normalization methods only performed well with one of the two methods.

The HOPC noise pre-normalization has the potential to improve the results of MVW-PCA on dynamic PET datasets, independent of the reconstruction algorithm used.

APA, Harvard, Vancouver, ISO, and other styles
33

Zhang, Nuannuan. "A unified PCA and DSP based POD module for hybrid cryptosystems." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2000. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape3/PQDD_0016/MQ54763.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Zhang, Haitao. "Statistical process moni[t]oring and modeling using PCA and PLS." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2000. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape4/PQDD_0008/MQ60199.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Jeng, Alagie Malicjk. "Using k-means and PCA in construction of a stock portfolio." Thesis, Mälardalens högskola, Akademin för utbildning, kultur och kommunikation, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-35002.

Full text
Abstract:
Various analyses have been presented on the issues pertaining to risk when constructing stock portfolios in the financial market. The information given by most researchers is insufficient to warrant conclusions on the issues of conditional and unconditional volatility, hence the need for this study. Operations in the financial market often rely on different instruments such as stocks, bonds, funds, and commodities. In this paper, we use k-means clustering and PCA (principal component analysis) to try to minimize risk when constructing a stock portfolio. The results of both methods are evaluated and compared experimentally, by implementing the methods and examining the resulting portfolios.
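As a rough illustration of the kind of pipeline this abstract describes, the following numpy-only sketch clusters stocks by their loadings on the top principal components of a returns matrix and picks one low-variance stock per cluster. The function names, the feature construction, and the one-pick-per-cluster rule are assumptions made for illustration, not the thesis's actual method:

```python
import numpy as np

def kmeans(X, k, n_iter=50, seed=0):
    # Plain Lloyd's algorithm: assign points to the nearest center,
    # then recompute each center as its cluster mean.
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels

def diversified_picks(returns, k, seed=0):
    # returns: (days, stocks) matrix of daily returns.
    # Describe each stock by its loadings on the top-k principal components,
    # cluster the stocks, and pick the lowest-variance stock in each cluster.
    R = returns - returns.mean(axis=0)
    _, _, Vt = np.linalg.svd(R, full_matrices=False)
    features = Vt[:k].T                      # (stocks, k) loading matrix
    labels = kmeans(features, k, seed=seed)
    var = returns.var(axis=0)
    return [int(np.where(labels == j)[0][np.argmin(var[labels == j])])
            for j in range(k) if (labels == j).any()]
```

Picking one representative per cluster spreads the portfolio across groups of stocks that move differently, which is one simple way to reduce concentration risk.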
APA, Harvard, Vancouver, ISO, and other styles
36

Swenson, William J. "Pastoral care in the large PCA church a model for ministry /." Theological Research Exchange Network (TREN), 1992. http://www.tren.com.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

RAWAT, URVASHI. "INFRARED AND VISIBLE IMAGE FUSION USING HYBRID LWT AND PCA METHOD." Thesis, DELHI TECHNOLOGICAL UNIVERSITY, 2021. http://dspace.dtu.ac.in:8080/jspui/handle/repository/18907.

Full text
Abstract:
Image fusion is a method in which all the relevant information is collected from the input source images and combined into a single output image. Image fusion techniques fall into two broad categories: spatial domain and transform domain. Principal component analysis (PCA) is a spatial domain technique that is computationally simple and reduces redundant information, but has the drawback of spectral degradation. The lifting wavelet transform (LWT) is a transform domain technique that has an adaptive design and demands less memory. In this project, a novel hybrid fusion algorithm is introduced that combines LWT and PCA in a parallel manner. The two fusion methods are applied to an infrared and visible image data set; infrared and visible images contain complementary information, and their fusion gives an output image that is more informative than the individual source images. The hybrid method is also compared with conventional fusion techniques such as PCA, LWT, and DWT, and is shown to outperform them. The results are analyzed using the performance parameters standard deviation, average value, average difference, and normalized cross-correlation.
APA, Harvard, Vancouver, ISO, and other styles
38

Wu, Kun-da, and 吳坤達. "PCA-ANN for Image Coding." Thesis, 2009. http://ndltd.ncl.edu.tw/handle/37089023513063970533.

Full text
Abstract:
Master's
I-Shou University
Master's Program, Department of Information Engineering
97
Principal component analysis (PCA) is a linear transformation based on linear algebra that uses less data to explain the original data with the least error. It is usually used in signal processing to reduce the dimensionality of information. Because the PCA algorithm requires the computation of eigenvalues and eigenvectors, it is time-consuming. In this paper, we use artificial neural networks (ANN) to estimate the principal eigenvectors by learning characteristics. Traditional vector quantization (VQ) uses a codebook to represent all possible image blocks; its advantage is a high compression ratio, and its shortcoming is the time-consuming codebook design. In PCA, the eigenvectors, also called eigenimages, form a codebook in another sense, similar to the VQ scheme. This paper uses the eigenimages as a codebook and performs a VQ-like coding method to take advantage of both PCA and VQ.
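The eigenimage-codebook idea summarized above can be sketched in a few lines of numpy. Here the eigenimages are computed directly by SVD rather than estimated by a neural network as in the thesis, and the function names and block dimensions are illustrative assumptions:

```python
import numpy as np

def eigenimage_codebook(blocks, k):
    # blocks: (n, d) matrix of flattened image blocks.
    # The top-k right singular vectors of the centered blocks are the
    # "eigenimages" that serve as the codebook.
    mean = blocks.mean(axis=0)
    _, _, Vt = np.linalg.svd(blocks - mean, full_matrices=False)
    return mean, Vt[:k]

def encode(blocks, mean, codebook):
    # Each block is coded by its k projection coefficients, rather than
    # by a nearest-codeword index as in classical VQ.
    return (blocks - mean) @ codebook.T

def decode(coeffs, mean, codebook):
    # Reconstruct each block as a linear combination of eigenimages.
    return mean + coeffs @ codebook
```

Unlike classical VQ, which stores one index per block, this scheme stores k coefficients per block, trading a little rate for much lower distortion when the blocks are well described by a few eigenimages.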
APA, Harvard, Vancouver, ISO, and other styles
39

Hung, Cheng-Yu, and 洪承郁. "Robust PCA and its Extension." Thesis, 2018. http://ndltd.ncl.edu.tw/handle/eb95rz.

Full text
Abstract:
Master's
National Taiwan University
Institute of Applied Mathematical Sciences
106
Principal Component Analysis (PCA) has been used extensively for data analysis. However, PCA performs poorly when the data deviate from the model, as with sensor failures or corrupted samples. Candès et al. (2011) proposed Robust Principal Component Analysis (RPCA) to recover the data and proved that it performs very well when the data have a sparse signal component against a low-rank background. Unfortunately, the FRET data set does not satisfy these working conditions. Here, we employ a sampling scheme to enable the application of RPCA to the FRET data. For images with an extremely large number of pixels, RPCA may suffer from a heavy computational load. Thus, we also extend RPCA to a higher-order SVD version.
APA, Harvard, Vancouver, ISO, and other styles
40

Yu-Chieh, Chien. "RANSAC-Like Algorithms for Robust PCA." 2006. http://www.cetd.com.tw/ec/thesisdetail.aspx?etdun=U0016-1303200709301738.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

HUANG, JIUN-HAO, and 黃俊豪. "PCA and SVD in image compression." Thesis, 2017. http://ndltd.ncl.edu.tw/handle/ae6u6n.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Chien, Yu-Chieh, and 簡郁潔. "RANSAC-Like Algorithms for Robust PCA." Thesis, 2006. http://ndltd.ncl.edu.tw/handle/67080163111112903333.

Full text
Abstract:
Master's
National Tsing Hua University
Department of Computer Science
94
In this thesis, we propose RANSAC [6] (RANdom SAmple Consensus) based approaches to achieve robust PCA [1] (Principal Component Analysis) for data containing outliers. This problem arises in a variety of vision applications that require data analysis or subspace learning, especially when outliers are unavoidable. Overall, our algorithms consist of the following steps: we randomly select a subset of the data to compute a PCA model, evaluate the model on the unselected data, and repeat the hypothesize-and-test procedure until a good model is found. To deal with different types of outliers, we apply different sampling strategies: one-dimensional sampling for sample outliers and two-dimensional sampling for intra-sample outliers. In addition, problems inherited from traditional RANSAC, i.e., the need for a prior error scale and for stopping criteria, are solved by the developed extended MSSE [2] (Modified Selective Statistical Estimator) framework. Experiments on simulated and real data demonstrate the superior performance of the proposed RANSAC-like algorithms compared to standard PCA and a robust PCA technique [5] based on M-estimation. The results also verify that the proposed algorithms are robust against up to 80% sample outliers or 30% randomly distributed intra-sample outliers.
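The hypothesize-and-test loop outlined in this abstract can be sketched as follows. This is a simplified illustration that handles only sample outliers and uses a fixed residual threshold in place of the extended MSSE scale estimation described in the thesis:

```python
import numpy as np

def fit_pca(X, k):
    # Center the data; the top-k right singular vectors span the PCA subspace.
    mean = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:k]

def residuals(X, mean, basis):
    # Distance of every sample from the k-dimensional PCA subspace.
    Z = X - mean
    return np.linalg.norm(Z - Z @ basis.T @ basis, axis=1)

def ransac_pca(X, k=1, n_trials=200, sample_size=5, thresh=0.5, seed=0):
    # Hypothesize-and-test: fit a PCA model on random subsets of samples
    # and keep the model that explains the most samples within `thresh`.
    rng = np.random.default_rng(seed)
    best_mean, best_basis, best_count = None, None, -1
    for _ in range(n_trials):
        idx = rng.choice(len(X), size=sample_size, replace=False)
        mean, basis = fit_pca(X[idx], k)
        count = int((residuals(X, mean, basis) < thresh).sum())
        if count > best_count:
            best_mean, best_basis, best_count = mean, basis, count
    return best_mean, best_basis, best_count
```

Because each hypothesis is fit on a small subset, a trial drawn entirely from inliers yields a clean model even when a large fraction of the data is contaminated.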
APA, Harvard, Vancouver, ISO, and other styles
43

Shu, Tzu-Yen, and 許子彥. "Face Recognition Using SIFT and PCA." Thesis, 2008. http://ndltd.ncl.edu.tw/handle/36246708718401130979.

Full text
Abstract:
Master's
Huafan University
Master's Program, Department of Information Management
96
Many methods have been proposed for face recognition in the literature. Images capturing faces from different viewpoints (frontal and side views) can sharply decrease the accuracy of face recognition, which is a well-known difficult problem. In this paper, we propose methods to increase the recognition rate. Principal Component Analysis (PCA) is used to obtain face candidates from a face database. The Scale Invariant Feature Transform (SIFT) is well known to be invariant to scale and rotation; here, SIFT is used to recognize faces among the candidates extracted by PCA. Since some matching errors remain after applying SIFT in our experiments, two matching methods using the y-axis distance and the cross product are proposed to improve the recognition rate. Our face database contains 12 persons, each with 9 images from different viewpoints. Using SIFT and PCA, the recognition rate is 95.8%; adding the proposed y-axis distance and cross-product matching methods raises it to 98.6%. In the future, we will test the proposed methods on face databases with more persons.
APA, Harvard, Vancouver, ISO, and other styles
44

Váchová, Ilona. "Sebezkušenostní výcvik v PCA z pohledu frekventantů." Master's thesis, 2003. http://www.nusl.cz/ntk/nusl-366817.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Lin, Shih-wei, and 林世偉. "Prediction of Color Channel Characteristic in PCA." Thesis, 2005. http://ndltd.ncl.edu.tw/handle/2y998n.

Full text
Abstract:
Master's
National Taiwan University of Science and Technology
Department of Electronic Engineering
94
Measurements and mathematical methods (principal component analysis, polynomial regression, etc.) are combined to predict the color characteristics of an LCD's R, G, and B channels at different digital counts: the gamma curves, the tristimulus-value curves, and the spectral information.
APA, Harvard, Vancouver, ISO, and other styles
46

Hsiang-WeiCheng and 鄭翔蔚. "PCA-based Facial and Speaker Recognition System." Thesis, 2015. http://ndltd.ncl.edu.tw/handle/5asfgs.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Yang, En-Tien, and 楊恩典. "Application of PCA on Dynamic Process Monitoring." Thesis, 2000. http://ndltd.ncl.edu.tw/handle/40461919887740463979.

Full text
Abstract:
Master's
National Taiwan University
Graduate Institute of Chemical Engineering
88
In the field of process monitoring, statistical process control tools are often combined with multivariate analysis techniques to achieve better monitoring performance, and many studies have established a typical pattern for process monitoring. Useful multivariate analysis techniques include principal component analysis (PCA), principal component regression (PCR), partial least squares, or projection to latent structures (PLS), and canonical variate analysis (CVA). These techniques aim to reduce the complexity of the multivariate process monitoring problem while establishing the process model. In this article, we propose a new monitoring method named transformed principal component analysis (TPCA), designed to compensate for the effect of process delays. We then apply it to an illustrative example to test its fault detection performance. Finally, we summarize the advantages and trade-offs of this method in the conclusion.
APA, Harvard, Vancouver, ISO, and other styles
48

Sun, Jung-Nan, and 孫榮男. "PCA and Neural Networks-based Face Recognition." Thesis, 2003. http://ndltd.ncl.edu.tw/handle/26388041785741771834.

Full text
Abstract:
Master's
National Kaohsiung First University of Science and Technology
Institute of Computer and Communication Engineering
91
This thesis first proposes a feature-based method to detect the human face in an image. The method exploits the facts that a human face approximates an ellipse and that there is a clear edge between face and background in order to locate the face. We analyze the features of a face with principal component analysis and represent each face with forty-five eigenvalues. These eigenvalues are then input to a backpropagation neural network, which acts as a classifier to recognize faces. Face pictures of thirty persons were taken with a digital camera, five pictures per person; in total, 150 pictures make up a small database. The eigenvalues extracted from each person's five face pictures are used to train a small neural network, so thirty small neural networks are trained and stored in the database. These trained networks are then used for recognition. Each subnet has only one output neuron, representing one person. Each subnet is trained individually, and all input images are processed through all subnets independently. When some of the face images have to be modified, only the corresponding trained subnets need to be changed, and the face recognition system still meets its requirements. Experimental results show a recognition rate of 100% on the inside test and 81% on the outside test, which is a satisfactory result.
APA, Harvard, Vancouver, ISO, and other styles
49

Yang, Tsun Yu, and 楊尊宇. "Data Visualization by PCA, LDA and ICA." Thesis, 2015. http://ndltd.ncl.edu.tw/handle/95190383035489406102.

Full text
Abstract:
Master's
National Tsing Hua University
Institute of Information Systems and Applications
103
As internet applications have become widely used, the amount of data has grown rapidly, and big data analysis has become a popular research topic. Data visualization can help us deal with such big data in an intuitive way. We apply linear dimensionality reduction methods to project the observed data into a lower-dimensional subspace; these methods preserve most of the data's characteristics while omitting an acceptably small amount of information. Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA) are two common linear mapping methods that have proven effective for classification. Independent Component Analysis (ICA) was originally proposed to solve the blind source separation problem. As a linear mapping method, ICA also computes a mapping matrix for data projection, but unlike PCA and LDA, ICA assumes non-Gaussian data distributions, which allow the original sources to be separated from a mixture. In our data visualization implementation, we reduce the dimensionality to 2 in order to present the projected data in the Euclidean plane. The experimental results show that LDA classifies the data better than PCA. ICA algorithms involve random factors, which may lead to varying results. By means of low-dimensional data visualization, one can reveal the structure of high-dimensional data, for example its clustering characteristics.
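As an illustration of the projection step described in this abstract, a minimal numpy sketch of the PCA case (LDA and ICA compute different mapping matrices but are applied the same way) might look like:

```python
import numpy as np

def pca_2d(X):
    # Project data onto its first two principal components for plotting.
    Xc = X - X.mean(axis=0)
    # Right singular vectors of the centered data are the principal axes,
    # ordered by decreasing explained variance.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:2].T
```

The two columns of the result can be used directly as x and y coordinates in a scatter plot of the high-dimensional data.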
APA, Harvard, Vancouver, ISO, and other styles
50

Wang, Wei-Yann, and 王韋硯. "PCA Based Performance Assessment for Multivariable Processes." Thesis, 2008. http://ndltd.ncl.edu.tw/handle/32346237271782044889.

Full text
Abstract:
Master's
Chung Yuan Christian University
Graduate Institute of Chemical Engineering
96
In this research, a PCA-based performance assessment for multivariable processes is developed. In a SISO system, the performance bound can easily be estimated from the minimum variance of the single variable. In a MIMO system, however, traditional multivariable performance assessment benchmarks consider only the trace of the covariance matrix; this may cause false detections, since the correlation among variables is not considered. Furthermore, multivariate PCA (principal component analysis) is good at reducing the dimensionality of multivariate data when monitoring the process using collected normal operation data: it can tell whether normal performance is achieved, but not whether the operating process has the best performance. In this research, PCA-based minimum variance performance bounds for the MIMO feedback control system are proposed. PCA is used to eliminate the cross-correlation of variables, and an autoregressive moving average (ARMA) filter is used to remove the auto-correlation of variables in the PCA space. The control performance of the minimum variance and the achievable minimum variance are designed, respectively, from the minimum of the score variance using the resulting decomposition structure of the PCA model, without the risk of overdetermined or underdetermined problems in the MIMO system. A statistical control chart can then be constructed to assess the current performance of the multivariable system. Unlike the continuous process, performance assessment of a batch process requires attention to both disturbance changes and set-point changes. Because of the intrinsically dynamic operation and nonlinear behavior, the conventional approach to controller assessment cannot be applied directly; in this research, a linear time-variant system is used to derive the performance bounds from the batch operating data.
The MPCA-based performance assessment for batch processes, an extension of the PCA-based assessment for continuous processes, is then developed. The performance bounds at each time point, computed from the deterministic set-point and the stochastic disturbance for the controlled error variance, help create simple MPCA monitoring charts that easily track the progress of each batch operation. Finally, the applications are illustrated through simulation problems and a quadruple-tank experiment to demonstrate the effectiveness of the proposed performance assessment techniques in continuous and batch processes.
APA, Harvard, Vancouver, ISO, and other styles