Dissertations / Theses on the topic 'PCA'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the top 50 dissertations / theses for your research on the topic 'PCA.'
Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.
Renkjumnong, Wasuta. "SVD and PCA in Image Processing." Digital Archive @ GSU, 2007. http://digitalarchive.gsu.edu/math_theses/31.
Bennett, Marissa A. "Improving Model Performance with Robust PCA." Digital WPI, 2020. https://digitalcommons.wpi.edu/etd-theses/1366.
Liu, Peng. "Adaptive Mixture Estimation and Subsampling PCA." Case Western Reserve University School of Graduate Studies / OhioLINK, 2009. http://rave.ohiolink.edu/etdc/view?acc_num=case1220644686.
Moravčíková, Simona. "Stanovenie hodnoty firmy PCA Slovakia, s.r.o." Master's thesis, Vysoká škola ekonomická v Praze, 2015. http://www.nusl.cz/ntk/nusl-206699.
Bernardina, Philipe Dalla. "PCA-tree: uma proposta para indexação multidimensional." Universidade de São Paulo, 2007. http://www.teses.usp.br/teses/disponiveis/45/45134/tde-29082007-114522/.
The advent of applications demanding the representation of objects in multi-dimensional spaces fostered the development of efficient multi-dimensional access methods. Early applications that required such methods include geo-processing systems, 3D applications and simulators. Later, multi-dimensional access methods also became important tools in the design of classifiers, mainly those based on the nearest-neighbors technique. Consequently, the dimensionality of the spaces has increased, from at most four in early applications to more than a thousand. Among the many multi-dimensional access methods, the class of approaches based on balanced tree structures with data represented in R^d has received a lot of attention. These methods are evolutions of the B-tree for one-dimensional access and inherit several of its characteristics. In this work, we present some of the access methods based on balanced trees in order to illustrate the central idea of these algorithms, and we propose and implement a new multi-dimensional access method, which we call the PCA-tree. It uses a heuristic to split nodes based on the principal component of the sample to be divided: a hyperplane whose normal is the principal component splits the space represented by the node. From this basic idea we define the data structure and the algorithms for the PCA-tree employing secondary-memory management, as in B-trees. Finally, we compare the performance of the PCA-tree with that of other methods in the cited class, and present advantages and disadvantages of the proposed access method through analysis of experimental results.
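The node-splitting heuristic described in this abstract can be sketched in a few lines. The following is an illustrative reconstruction, not the thesis's implementation; the function name and the median-split rule used to keep the children balanced are our assumptions.

```python
import numpy as np

def pca_split(points):
    """Split a node's points with a hyperplane whose normal is their first
    principal component (sketch of the PCA-tree heuristic; the median
    threshold that balances the two children is our assumption)."""
    X = np.asarray(points, dtype=float)
    mu = X.mean(axis=0)
    # First principal component = top right-singular vector of the
    # mean-centered data matrix.
    _, _, vt = np.linalg.svd(X - mu, full_matrices=False)
    normal = vt[0]
    # Signed distance of each point to the splitting hyperplane.
    proj = (X - mu) @ normal
    threshold = np.median(proj)
    left = X[proj <= threshold]
    right = X[proj > threshold]
    return normal, threshold, left, right
```

Splitting at the median keeps the tree balanced in the B-tree spirit the abstract mentions; a real secondary-memory implementation would store the (normal, threshold) pair in the internal node.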
Storm, Christine. "Permutation procedures for ANOVA, regression and PCA." Diss., University of Pretoria, 2012. http://hdl.handle.net/2263/24960.
Dissertation (MSc), Statistics, University of Pretoria, 2012.
Le, Hanh T. Banking & Finance Australian School of Business UNSW. "Discrete PCA: an application to corporate governance research." Awarded by: University of New South Wales, Banking & Finance, 2007. http://handle.unsw.edu.au/1959.4/40753.
Nziga, Jean-Pierre. "Incremental Sparse-PCA Feature Extraction For Data Streams." NSUWorks, 2015. http://nsuworks.nova.edu/gscis_etd/365.
Rimal, Suraj. "POPULATION STRUCTURE INFERENCE USING PCA AND CLUSTERING ALGORITHMS." OpenSIUC, 2021. https://opensiuc.lib.siu.edu/theses/2860.
Full textSilva, Luis Felipe da. "Análise quimiométrica da distribuição de quimioterápicos antimicrobianos (Fluoroquinolonas e Sulfonamidas) na Baía de Ubatuba." Universidade de São Paulo, 2016. http://www.teses.usp.br/teses/disponiveis/75/75135/tde-17112016-141429/.
Antimicrobial chemotherapeutic agents are considered emerging contaminants capable of creating bacterial resistance. They have been the focus of numerous studies on environmental impacts, but in Brazil there is little research on their occurrence in continental and coastal aquatic environments. The city of Ubatuba has a population density five times higher than the national average, which may increase up to twelve-fold during the summer, pushing the region's ecosystems further. The rivers Acaraú, Lagoa-Tavares, Grande and Indaiá flow into Ubatuba Bay, compromising the quality of its waters. This work investigated the contribution of those rivers' discharge to the occurrence and distribution of antimicrobial chemotherapeutic agents (fluoroquinolones and sulfonamides) in Ubatuba Bay. Thirty-six samples of surface water, withdrawn during the dry and rainy seasons of 2014 and 2015, respectively, were analyzed. SPE was used for sample preparation and HPLC-MS/MS for the detection and quantification of the agents. Physicochemical characteristics such as pH, temperature, salinity, dissolved oxygen and redox potential (ORP), as well as dissolved organic carbon (DOC), were also determined to characterize the waters of the region. The spatial-temporal distribution of the agents and their possible association with the other investigated data were assessed with the chemometric tool CA, aiming to extract as much information as possible. The rivers and, consequently, the bay were contaminated by domestic sewage discharges, and the following antimicrobial chemotherapeutic agents were detected: sulfamethoxazole (SMX), sulfathiazole (STZ), sulfachloropyridazine (SCP), sulfaquinoxaline (SQX) and norfloxacin (NOR). The contamination profiles revealed by the CA showed both spatial and temporal variation.
TharinduPriyanDeAlwis, MahappuKankanamge. "PCA and FA model for Matrix Valued Time Series Data." OpenSIUC, 2017. https://opensiuc.lib.siu.edu/theses/2197.
Full textKönigsson, Sofia. "PCA för detektering av avvikande händelser i en kraftvärmeprocess." Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-347516.
Boiler 6 (P6) at the Högdalen facility in southern Stockholm, combined with a steam turbine, produces combined heat and power (CHP) through combustion of treated industrial waste. In order to minimize maintenance costs and increase plant availability, it is important to detect process faults and deviations at an early stage. In this study a method for outlier detection using Principal Component Analysis (PCA) is applied to the CHP production process. A PCA model with reduced dimension is created using process data from a problem-free period and is used as a template against which new operating data are compared in a control chart. Deviations from the model should indicate the presence of abnormal conditions, and the reasons for the deviations are analysed. Two cases of tube failure, in 2014 and 2015, are used to study the deviations. The results show that process deviations from the model can be detected in the control chart in both cases of tube failure, and the variables known to be associated with tube failure contribute strongly to the deviating behaviour. There is potential for applying this method to process control; a difficulty lies in creating a model that represents the stable process when there are large variances within what is considered a stable process state. The method can be used for data analysis when a tube failure is suspected.
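The monitoring scheme this abstract describes (a reduced PCA model fitted on fault-free data, new observations checked in a control chart) can be sketched roughly as follows. The Hotelling T² statistic is one common chart statistic for this purpose; all names, the scaling choices and the threshold handling are illustrative assumptions, not the thesis's code.

```python
import numpy as np

def fit_pca_monitor(X_ref, k=2):
    """Fit a reduced PCA model on fault-free reference data; returns the
    pieces needed for a Hotelling T^2 control chart (a minimal sketch of
    the monitoring scheme described above)."""
    X_ref = np.asarray(X_ref, dtype=float)
    mu = X_ref.mean(axis=0)
    sigma = X_ref.std(axis=0) + 1e-12    # guard against constant sensors
    Z = (X_ref - mu) / sigma
    _, s, vt = np.linalg.svd(Z, full_matrices=False)
    P = vt[:k].T                          # loadings of the k retained PCs
    lam = s[:k] ** 2 / (len(X_ref) - 1)   # variances of the retained scores
    return mu, sigma, P, lam

def t2_statistic(x, model):
    """Hotelling T^2 of a new observation; values far above the chart
    limit flag a deviation from normal operation."""
    mu, sigma, P, lam = model
    t = ((np.asarray(x, dtype=float) - mu) / sigma) @ P
    return float(np.sum(t * t / lam))
```

In practice a residual (SPE/Q) statistic is monitored alongside T², and variable contributions to the statistic are inspected to isolate the cause, as the study does for the tube-failure cases.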
SILVA, Thiago César de Oliveira. "CLASSIFICAÇÃO DE CERVEJAS POR ANÁLISE DE IMAGENS E PCA." Universidade Federal de Goiás, 2008. http://repositorio.bc.ufg.br/tede/handle/tde/1035.
In this work a simple new approach to lager beer pattern recognition based on image analysis was accomplished using Principal Component Analysis (PCA). Digital color images obtained from a scanner were decomposed into binary form with the help of the Scilab and SIP software packages. The PCA data matrix consisted of the RGB color histograms of each sample. 350 mL beer cans were chosen at random from the shelves of local supermarkets, ninety cans in all from ten different brands. 200 mL samples were immersed in an ultrasound bath for twenty to thirty minutes before the digital image processing. The resulting methodology can be a simple alternative for the analytical quality control of lager beers.
In this work a new methodology for beer classification was developed using image analysis, with color as the quality attribute, employing Principal Component Analysis (PCA). The mathematical software Scilab and the SIP computational package were used, enabling the decomposition and analysis of digitized images obtained with a commercial scanner. Routines were applied to decompose the images, generated in JPEG and BMP formats, into the R, G and B channels of the RGB color standard. Through the developed algorithms, each data matrix resulting from the image of a beer sample was transformed into a new matrix, which generated the corresponding mean frequency histogram of 768 variables, each color channel containing 256 color indices (0 to 255). In this way, the different beer brands could be analyzed through the PCA plots, based on the loadings and scores plots and on the mean frequency histograms of each brand. JPEG images of ten brands of pilsen-type beer in 350 mL cans were analyzed, chosen at random from among the most consumed brands in the metropolitan region of Goiânia. About 200 mL of beer were withdrawn from each can and submitted to ultrasound for degassing before image capture on the scanner. During the image analyses, noticeable color changes were observed in the beer samples when exposed to light and air, and this phenomenon came to be evaluated as a secondary objective of the work. In a second stage, to extend the image-analysis study, new analyses were carried out with images in bitmap (BMP) format, for which a new sampling was performed under the same criteria as before. In this new sampling eight brands were chosen, seven of which coincided with the brands used in the JPEG image analysis. The results obtained indicate that the proposed methodology can be a simple analytical alternative for beer quality control, with color as the attribute.
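The 768-variable histogram matrix described in this entry is straightforward to reproduce. The sketch below is our own illustrative reconstruction in NumPy (the study used Scilab/SIP): it builds the concatenated RGB histograms and projects them onto principal components to obtain the scores used to compare brands.

```python
import numpy as np

def rgb_histogram(image):
    """Concatenate the 256-bin histograms of the R, G and B channels of an
    HxWx3 uint8 image into one 768-variable row (illustrative NumPy
    reconstruction, not the original Scilab code)."""
    return np.concatenate([
        np.bincount(image[..., c].ravel(), minlength=256) for c in range(3)
    ]).astype(float)

def pca_scores(histograms, k=2):
    """Mean-center the histogram matrix and return the first k principal
    component scores, the basis of the scores plots used to compare brands."""
    X = np.asarray(histograms, dtype=float)
    Xc = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ vt[:k].T
```

One row per scanned sample, samples of the same brand clustering together in the scores plane, is the pattern-recognition setup the abstract describes.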
Bhosle, Naren. "AMELIORATING AERIAL NEAR MISSES WITH PCA DEVICES AND MOTL." Available to subscribers only, 2009. http://proquest.umi.com/pqdweb?did=1885431391&sid=5&Fmt=2&clientId=1509&RQT=309&VName=PQD.
Lacerda, Eduardo Alberto Duarte. "Tomografia PCA aplicada às galáxias do Projeto CALIFA Survey." reponame:Repositório Institucional da UFSC, 2014. https://repositorio.ufsc.br/xmlui/handle/123456789/123229.
Abstract: Spectroscopic surveys of galaxies, like the SDSS, are currently being taken to the next level with integral field units, turning the focus from the properties of galaxies as a whole to the internal physics of galaxies. The CALIFA survey is a pioneer of this new generation, providing spatially resolved spectroscopy for hundreds of galaxies of all shapes and masses. Several studies have been conducted with datacubes from this survey, examining the spatial distributions of emission lines, stellar populations, gaseous and stellar kinematics, etc. In this exploratory work we approach the analysis of CALIFA datacubes from the mathematical perspective of the principal component analysis (PCA) tomography technique developed by Steiner and collaborators. We apply PCA tomography to 8 CALIFA galaxies (4 late-type, 2 early-type and 2 mergers). Emission lines are masked in our analysis to concentrate on the stellar population properties. The physical and angular scales spanned by our datacubes (which cover whole galaxies) are much larger than those spanned by previous applications of this technique. This difference prompted us to introduce a simple variation at the pre-PCA stage, normalizing all spectra in a datacube to a common flux scale, which highlights qualitative (as opposed to amplitude-related quantitative) spectral variance in the datacube. We also examine synthetic datacubes obtained from spectral fits of the original data, which eliminates noise and flux-calibration effects. A well-known caveat with PCA is that it does not assign physical meaning to its results. To tackle this problem we use a STARLIGHT-based stellar population analysis of the same data to reverse-engineer the astrophysical meaning of the principal components. In every galaxy we are able to find some correlation between principal components and physical parameters. We find that the first principal component in spiral galaxies usually correlates with the mean stellar age, ⟨log t⟩. In mergers the first component essentially reflects variations of the global extinction parameter (AV). In some cases PCA tomography helps us find errors in the original datacube, because the huge variance imposed by damaged pixels usually appears in isolated principal components.
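The pre-PCA normalization step this abstract introduces (scaling every spectrum to a common flux level before the decomposition) can be sketched as follows. Array shapes, names and the choice of mean-flux normalization are our assumptions for illustration, not the thesis's code.

```python
import numpy as np

def pca_tomography(cube, k=3):
    """Sketch of the PCA-tomography idea: flatten a (ny, nx, nwave)
    datacube into per-spaxel spectra, normalize each spectrum to a common
    flux scale (the pre-PCA variation described above), then apply PCA.
    Returns score maps ("tomograms") and eigenspectra."""
    ny, nx, nw = cube.shape
    spectra = cube.reshape(ny * nx, nw).astype(float)
    # Normalizing by the mean flux makes the PCA pick up qualitative
    # spectral shape rather than overall amplitude.
    spectra = spectra / (spectra.mean(axis=1, keepdims=True) + 1e-12)
    mu = spectra.mean(axis=0)
    _, _, vt = np.linalg.svd(spectra - mu, full_matrices=False)
    eigenspectra = vt[:k]
    tomograms = ((spectra - mu) @ eigenspectra.T).reshape(ny, nx, k)
    return tomograms, eigenspectra
```

Each tomogram is a 2-D map of one principal component's score, which is what gets compared against physical parameters such as mean stellar age.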
Wang, Jin. "An Incremental Multilinear System for Human Face Learning and Recognition." FIU Digital Commons, 2010. http://digitalcommons.fiu.edu/etd/312.
David, Höglin. "Regional and Local Factors Influencing the Mass Balance of the Scandinavian Glaciers." Thesis, Uppsala universitet, Institutionen för geovetenskaper, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-311152.
According to climate models, an increase of greenhouse gases in the atmosphere will lead to a rise in global temperature, occurring mainly at high latitudes. Glaciers are good indicators of a changing climate because of their short response time to climatic change. At present there are about 1900 glaciers spread across the Scandinavian mountains. Because Scandinavia is so elongated, meteorological and climatic conditions differ both in a north-south direction and in an east-west direction, with continental glaciers in the east and more maritime ones in the west. Climate and glacier data for 13 glaciers in Scandinavia, 5 from Sweden and 8 from Norway, were collected, and a statistical analysis, principal component analysis (PCA), was performed to see what influences the glaciers' mass balance. The climate parameters examined were the North Atlantic Oscillation (NAO), the Arctic Oscillation (AO) and sunspots, together with the mass balance, equilibrium line altitude (ELA) and area of the glaciers. Three groupings were found that can be linked to different climate variables, and the PCA reveals extreme years for NAO and AO as well as the glacier with the largest area. The PCA showed that all variables correlated with NAO and AO at around 40%, and we conclude that there is a driving regional and local force within our dataset, with NAO and AO being most important for the mass balance.
Povia, Giovana Soato. "Determinação dos parametros de qualidade de detergentes em po utilizando espectroscopia no infravermelho proximo." [s.n.], 2007. http://repositorio.unicamp.br/jspui/handle/REPOSIP/249948.
Dissertation (Master's) - Universidade Estadual de Campinas, Instituto de Química
Abstract: This work aimed at developing an analytical method for the determination of quality parameters in powder detergents using near-infrared spectroscopy (NIR) and multivariate calibration techniques. Two sets of samples were used: the first for quantitative analysis and the second for qualitative analysis. The samples of the first set had their quality parameters determined by the respective reference methods. The chemometric technique used for calibration was PLS1. Calibration models were developed for the prediction of moisture content, active matter and density, and their performance was evaluated through external validation. The determination of moisture content presented an RMSEP of 0.29% (w/w), the RMSEP for active matter was 0.37% (w/w), and for density the RMSEP was 14 g L⁻¹. The constructed models presented satisfactory results, and the errors found are acceptable for the control range used in industry. The second set is composed of four groups of powder detergents with distinct characteristics. Two classification methods were evaluated: SIMCA and PLS-DA. Samples with higher active matter content were discriminated; however, the other groups could not be discriminated. The two classification methods presented similar results, with 100% correct classification of external samples only into their respective groups.
Master's degree, Analytical Chemistry (Mestre em Química).
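The PLS1 calibration used in this dissertation can be sketched with the classical NIPALS algorithm. The code below is a generic textbook sketch, not the dissertation's implementation; the data shapes and names are illustrative.

```python
import numpy as np

def pls1_fit(X, y, n_comp=2):
    """Minimal PLS1 (NIPALS) sketch: relate rows of spectra X to a scalar
    reference property y and return a regression vector."""
    X = np.asarray(X, dtype=float).copy()
    y = np.asarray(y, dtype=float).copy()
    x_mean, y_mean = X.mean(axis=0), y.mean()
    E, f = X - x_mean, y - y_mean
    W, P, Q = [], [], []
    for _ in range(n_comp):
        w = E.T @ f
        w = w / np.linalg.norm(w)         # weight vector
        t = E @ w                          # scores
        tt = t @ t
        p = E.T @ t / tt                   # X loadings
        q = (f @ t) / tt                   # y loading
        E = E - np.outer(t, p)             # deflate X
        f = f - q * t                      # deflate y
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    B = W @ np.linalg.inv(P.T @ W) @ Q     # regression vector
    return x_mean, y_mean, B

def pls1_predict(X, model):
    """Predict the property for new spectra with the fitted model."""
    x_mean, y_mean, B = model
    return (np.asarray(X, dtype=float) - x_mean) @ B + y_mean
```

In a calibration study like the one above, the model would be fitted on a training set and judged by the RMSEP of an external validation set.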
Pedro, Andre Messias Krell. "Desenvolvimento do metodo multivariado acelerado para determinação do prazo de validade de produtos unindo quimiometria e cinetica quimica." [s.n.], 2009. http://repositorio.unicamp.br/jspui/handle/REPOSIP/248542.
Thesis (doctorate) - Universidade Estadual de Campinas, Instituto de Química
Abstract: DEVELOPMENT OF THE MULTIVARIATE ACCELERATED SHELF-LIFE TEST (MASLT) FOR DETERMINING THE SHELF-LIFE OF PRODUCTS BY MERGING CHEMOMETRICS AND CHEMICAL KINETICS describes a new concept of data analysis for evaluating the mechanisms that govern the degradation of consumer goods. The algorithm developed here takes advantage of soft-modeling concepts from chemometrics and of chemical kinetics to determine the period of time within which products keep their quality characteristics within acceptable levels. Besides being easy to interpret, the main advantages of the MASLT are the capacity to merge data from different sources (analytical, physical-chemical, sensorial) and the ability to handle instrumental data (spectroscopic, chromatographic, calorimetric, etc.) directly for determining the shelf-life of products. The work and some applications are presented in six chapters. Chapter I reviews chemical kinetics theory and the concepts of conventional shelf-life methods. The MASLT algorithm, together with its assumptions, characteristics and advantages, is presented in Chapter II. Chapters III and IV describe applications where the shelf-life was determined successfully using the MASLT. In Chapter III, physical-chemical and sensorial data were merged not only to determine the shelf-life of an industrialized tomato product, but also to study the correlation between the various properties. In Chapter IV, instrumental data from NIR spectroscopy were used as a source of quantitative information and, together with sensory analyses, allowed the determination of the shelf-life of a cosmetic product as well as provided valuable information on its degradation mechanisms. A comparison with the tri-linear PARAFAC method is presented in Chapter V. Despite being an important tool for evaluating the degradation mechanisms of consumer goods, the PARAFAC method lacks strategies for effectively determining the shelf-life of products. Finally, Chapter VI describes the development of a new protocol for determining several properties of tomato concentrate products using PLS2 regression. It was demonstrated that, although most researchers prefer PLS1 regression, PLS2 is advantageous when there are strong correlations between properties that can be determined accurately by their reference methods and those with lower accuracy.
Doctorate, Physical Chemistry (Doutor em Ciências).
Massaro, James. "A PCA based method for image and video pose sequencing /." Online version of thesis, 2010. http://hdl.handle.net/1850/11991.
Louis, Pierre-Yves. "Ergodicity of PCA : equivalence between spatial and temporal mixing conditions." Universität Potsdam, 2004. http://opus.kobv.de/ubp/volltexte/2006/658/.
Katadound, Sachin. "Face Recognition: Study and Comparison of PCA and EBGM Algorithms." TopSCHOLAR®, 2004. http://digitalcommons.wku.edu/theses/241.
Nelson, Philip R. C. MacGregor John F. Taylor Paul A. "The treatment of missing measurements in PCA and PLS models /." *McMaster only, 2002.
Find full textRueda, Juan. "Evaluation of The Biodegradability and Toxicity of PCA and mPCA." Master's thesis, University of Central Florida, 2013. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/5696.
M.S.Env.E., Civil, Environmental, and Construction Engineering, Engineering and Computer Science, Environmental Engineering.
Barros, Guilherme Antonio Moreira de [UNESP]. "Considerações sobre analgesia controlada pelo paciente (PCA) em hospital universitário." Universidade Estadual Paulista (UNESP), 2001. http://hdl.handle.net/11449/86638.
In recent years rapid advances have been observed in surgical and anesthetic techniques, with more invasive procedures being performed. As the general population ages, the critical recovery period, that is, the postoperative period, has become a focus of attention. New analgesia techniques, such as Patient Controlled Analgesia (PCA), aim to fulfill the needs of a medical community increasingly aware of the quality of its services. The Hospital of the São Paulo State Medical School, Botucatu, aware of this new reality, decided to form the Acute Pain Management Service (SEDA). To characterize how the SEDA operates, a survey was carried out covering the period from February 1995 to December 1997, and data from 679 patients followed by the SEDA who used the PCA method were evaluated. The results obtained by the Service compared favorably with those reported in the international literature, with excellent analgesia levels, a low incidence of side effects, and no fatal complications during the study period.
Moreira, João Paulo Francisco. "Análise de desempenho de anéis de controlo baseado em PCA." Master's thesis, Faculdade de Ciências e Tecnologia, 2013. http://hdl.handle.net/10362/11337.
Several works in the literature highlight the importance of control loop performance assessment in industrial systems, an area of research and development that interests industry because of its economic and quality benefits. Two real-time performance-assessment methodologies for SISO systems, covering fault-free and faulty situations, are proposed in this dissertation. The first contribution is a hybrid approach fundamentally based on the normalized Harris index, principal component analysis and a predictive neural network. This approach makes it possible to assess performance and to detect and isolate some disturbances and faults typical of industry. The second contribution is an approach based on the dynamic characteristics of the systems and on a multi-model PCA representation of the process. It relies on the estimated static gain, the estimated bandwidth, a PCA model for each operating point and a neural pattern classifier. This approach also allows performance assessment as well as the detection and isolation of different types of faults and disturbances. To evaluate the overall performance of the methodologies, experimental tests were carried out on a benchmark nonlinear three-tank system (AMIRA DTS 200).
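The normalized Harris index on which the first approach builds compares the actual variance of a controlled variable with a minimum-variance benchmark estimated from routine operating data. The sketch below is our own rough illustration, using an autoregressive regression to estimate the benchmark; it is not the dissertation's code.

```python
import numpy as np

def harris_index(y, delay=1, order=10):
    """Normalized Harris index sketch: 0 means minimum-variance
    performance, values approaching 1 mean large excess variance.
    The benchmark is estimated by regressing y[t] on outputs already
    available at time t - delay (illustrative choice of estimator)."""
    y = np.asarray(y, dtype=float)
    y = y - y.mean()
    rows, targets = [], []
    for t in range(delay + order - 1, len(y)):
        # Regressors: y[t-delay], y[t-delay-1], ..., y[t-delay-order+1]
        rows.append(y[t - delay - order + 1 : t - delay + 1][::-1])
        targets.append(y[t])
    A, b = np.array(rows), np.array(targets)
    coef, *_ = np.linalg.lstsq(A, b, rcond=None)
    resid = b - A @ coef
    # The residual variance approximates the minimum achievable
    # (delay-step prediction error) variance.
    return 1.0 - resid.var() / y.var()
```

A loop whose output is already close to white noise scores near 0, while a sluggish loop with strongly autocorrelated output scores close to 1, which is the discrimination the hybrid approach starts from.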
Barros, Guilherme Antonio Moreira de. "Considerações sobre analgesia controlada pelo paciente (PCA) em hospital universitário /." Botucatu : [s.n.], 2001. http://hdl.handle.net/11449/86638.
Full textResumo: With the rapid advances observed in recent years in surgical and anesthetic techniques, procedures have become increasingly invasive. As the population has progressively aged, the most delicate period of recovery, that is, the postoperative period, has received greater attention. New analgesia techniques, such as Patient-Controlled Analgesia (PCA), have emerged to meet the needs of a medical community increasingly attentive to the quality of its services. Aware of this new reality, the Hospital de Clínicas of the UNESP Medical School, Botucatu, created the Acute Pain Service (SEDA) so that this gap would also be filled in our setting. To characterize the work of SEDA, a survey was carried out from February 1995 to December 1997, reviewing the records of 679 patients followed by SEDA who used the PCA method of analgesia. The results obtained by the Service were above the average reported in the international literature, with excellent levels of analgesia, a low incidence of side effects, and no fatal complications during the study period.
Abstract: In recent years, rapid advances have been observed in surgical and anesthetic techniques, with increasingly invasive procedures being performed. As the general population has aged, the critical recovery period, that is, the postoperative period, has become a focus of attention. The development of new analgesia techniques, such as Patient-Controlled Analgesia (PCA), is intended to fulfill the needs of a medical community increasingly aware of the quality of its services. Aware of this new reality, the Hospital of the Sao Paulo State Medical School, Botucatu, decided to form the Acute Pain Management Service (SEDA). To characterize how SEDA operates, this study was carried out between February 1995 and December 1997, evaluating data from 679 patients who used the PCA device. The results were on par with those reported in the international literature: high-quality analgesia, few side effects, and no fatal complications during the period observed.
Master's
Marchbank, Gavin Clyde. "Posterior cerebral artery (PCA) infarcts and dreaming : a neuropsychological study." Master's thesis, University of Cape Town, 2013. http://hdl.handle.net/11427/14091.
Full textZhang, Junchao, Haibo Luo, Rongguang Liang, Wei Zhou, Bin Hui, and Zheng Chang. "PCA-based denoising method for division of focal plane polarimeters." OPTICAL SOC AMER, 2017. http://hdl.handle.net/10150/623248.
Full textCheng, Chi Wa. "Probabilistic topic modeling and classification probabilistic PCA for text corpora." HKBU Institutional Repository, 2011. http://repository.hkbu.edu.hk/etd_ra/1263.
Full textAlberi, Matteo. "La PCA per la riduzione dei dati di SPHERE IFS." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2014. http://amslaurea.unibo.it/6563/.
Full textSvensson, Per-Edvin. "Characterization and Reduction of Noise in PET Data Using MVW-PCA." Thesis, Uppsala University, Department of Information Technology, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-113736.
Full textMasked Volume-Wise Principal Component Analysis (MVW-PCA) is used in Positron Emission Tomography (PET) to distinguish structures with different kinetic behaviours of an administered tracer. In the article where MVW-PCA was introduced, a noise pre-normalization was suggested due to temporal and spatial variations of the noise between slices. However, the noise pre-normalization proposed in that article was only applicable to datasets reconstructed using the analytical method Filtered Back-Projection (FBP). This study aimed at developing a new noise pre-normalization that is applicable to datasets regardless of whether the dataset was reconstructed with FBP or an iterative reconstruction algorithm, such as Ordered Subset Expectation Maximization (OSEM).
A phantom study was performed to investigate the differences of expectation values and standard deviations of datasets reconstructed with FBP and OSEM. A novel noise pre-normalization method named "higher-order principal component noise pre-normalization" (HOPC noise pre-normalization) was suggested and evaluated against other pre-normalization methods on both synthetic and clinical datasets.
Results showed that MVW-PCA of data reconstructed with FBP was much more dependent on an appropriate pre-normalization than analysis of data reconstructed with OSEM. HOPC noise pre-normalization showed an overall good performance with both FBP and OSEM reconstructions, whereas the other pre-normalization methods only performed well with one of the two methods.
The HOPC noise pre-normalization has potential for improving the results from MVW-PCA on dynamic PET datasets, independently of the reconstruction algorithm used.
Zhang, Nuannuan. "A unified PCA and DSP based POD module for hybrid cryptosystems." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2000. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape3/PQDD_0016/MQ54763.pdf.
Full textZhang, Haitao. "Statistical process moni[t]oring and modeling using PCA and PLS." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2000. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape4/PQDD_0008/MQ60199.pdf.
Full textJeng, Alagie Malicjk. "Using k-means and PCA in construction of a stock portfolio." Thesis, Mälardalens högskola, Akademin för utbildning, kultur och kommunikation, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-35002.
Full textSwenson, William J. "Pastoral care in the large PCA church a model for ministry /." Theological Research Exchange Network (TREN), 1992. http://www.tren.com.
Full textRAWAT, URVASHI. "INFRARED AND VISIBLE IMAGE FUSION USING HYBRID LWT AND PCA METHOD." Thesis, DELHI TECHNOLOGICAL UNIVERSITY, 2021. http://dspace.dtu.ac.in:8080/jspui/handle/repository/18907.
Full textWu, Kun-da, and 吳坤達. "PCA-ANN for Image Coding." Thesis, 2009. http://ndltd.ncl.edu.tw/handle/37089023513063970533.
Full text義守大學
資訊工程學系碩士班
97
Principal component analysis (PCA) is a linear transformation, grounded in linear algebra, that explains the original data with less data and the least error. It is usually used in signal processing to reduce the dimensionality of information. Because the PCA algorithm requires the computation of eigenvalues and eigenvectors, it is time-consuming. In this paper, we use artificial neural networks (ANN) to estimate the principal eigenvectors by learning the data characteristics. Traditional vector quantization (VQ) uses a codebook to represent all possible image blocks. The advantage of VQ is its high compression ratio; the shortcoming is the codebook design, which is time-consuming. In PCA, the eigenvectors, also called eigenimages, also form a codebook in another sense, similar to the VQ scheme. This paper uses the eigenimages as a codebook and performs a VQ-like coding method to take advantage of both the PCA and VQ methods.
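The eigenimage-codebook idea in this abstract can be sketched with plain NumPy: compute the principal eigenvectors of the image blocks and use them as a codebook, so each block is encoded by a handful of projection coefficients. This is an illustrative reconstruction only (a toy random "image", with block size and k chosen arbitrarily), using a direct SVD rather than the ANN estimation the thesis describes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 64x64 "image", cut into 8x8 blocks flattened to 64-dim vectors.
img = rng.random((64, 64))
blocks = img.reshape(8, 8, 8, 8).swapaxes(1, 2).reshape(-1, 64)

# The top-k eigenvectors of the block covariance are the "eigenimages";
# they play the role of a VQ codebook.
mean = blocks.mean(axis=0)
centered = blocks - mean
_, _, Vt = np.linalg.svd(centered, full_matrices=False)
k = 8
codebook = Vt[:k]                      # k eigenimages, 64-dim each

# Encode: k coefficients per block instead of 64 pixel values.
codes = centered @ codebook.T          # shape (n_blocks, k)

# Decode: reconstruct each block from its coefficients.
recon = codes @ codebook + mean

err_pca = np.mean((recon - blocks) ** 2)
err_mean = np.mean((blocks - mean) ** 2)   # baseline: mean block only
print(err_pca, err_mean)
```

The reconstruction error with k coefficients is necessarily below the mean-only baseline, which is the sense in which PCA "explains the original data with less data".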
Hung, Cheng-Yu, and 洪承郁. "Robust PCA and its Extension." Thesis, 2018. http://ndltd.ncl.edu.tw/handle/eb95rz.
Full text國立臺灣大學
應用數學科學研究所
106
Principal Component Analysis (PCA) has been used in an overwhelming manner for data analysis. However, PCA does not perform well when the data depart from the model, as with sensor failures or corrupted samples. Candès et al. (2011) proposed Robust Principal Component Analysis (RPCA) to recover the data and proved that it performs very well when the signal has a sparse component on top of a low-rank background. Unfortunately, the FRET data set does not satisfy this working condition. Here, we employ a sampling scheme to enable the application of RPCA to the FRET data. For applications with an extremely large number of pixels, RPCA may suffer from heavy computational loading. Thus, we also extend RPCA to a higher-order SVD version.
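The low-rank-plus-sparse decomposition that Candès et al.'s RPCA solves can be sketched with the standard principal component pursuit iteration: singular-value thresholding for the low-rank part and soft thresholding for the sparse part. This is a generic textbook sketch with the usual default parameters, not the thesis's implementation:

```python
import numpy as np

def shrink(X, tau):
    """Soft thresholding: proximal operator of the l1 norm."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def svt(X, tau):
    """Singular value thresholding: prox of the nuclear norm."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(shrink(s, tau)) @ Vt

def rpca(M, n_iter=200):
    """Decompose M ~ L + S, L low rank and S sparse (ADMM sketch)."""
    lam = 1.0 / np.sqrt(max(M.shape))          # Candès et al.'s choice
    mu = M.size / (4.0 * np.abs(M).sum())
    L = np.zeros_like(M); S = np.zeros_like(M); Y = np.zeros_like(M)
    for _ in range(n_iter):
        L = svt(M - S + Y / mu, 1.0 / mu)      # low-rank update
        S = shrink(M - L + Y / mu, lam / mu)   # sparse update
        Y = Y + mu * (M - L - S)               # dual ascent
    return L, S

# Toy problem: rank-2 background plus sparse "sensor failures".
rng = np.random.default_rng(1)
L0 = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 50))
S0 = np.zeros((50, 50))
idx = rng.choice(2500, 100, replace=False)
S0.flat[idx] = 10.0
L, S = rpca(L0 + S0)
print(np.linalg.norm(L - L0) / np.linalg.norm(L0))
```

With 4% gross corruptions on a rank-2 background, the sketch recovers the low-rank part to small relative error, the regime in which the RPCA recovery guarantee applies.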
Yu-Chieh, Chien. "RANSAC-Like Algorithms for Robust PCA." 2006. http://www.cetd.com.tw/ec/thesisdetail.aspx?etdun=U0016-1303200709301738.
Full textHUANG, JIUN-HAO, and 黃俊豪. "PCA and SVD in image compression." Thesis, 2017. http://ndltd.ncl.edu.tw/handle/ae6u6n.
Full textChien, Yu-Chieh, and 簡郁潔. "RANSAC-Like Algorithms for Robust PCA." Thesis, 2006. http://ndltd.ncl.edu.tw/handle/67080163111112903333.
Full text國立清華大學
資訊工程學系
94
In this thesis, we propose RANSAC [6] (RANdom SAmple Consensus) based approaches to achieve robust PCA [1] (Principal Component Analysis) for data containing outliers. This problem is relevant to a variety of vision applications that require data analysis or subspace learning, especially when outliers are unavoidable. Overall, our algorithms consist of the following steps. We randomly select a subset of data to compute a PCA model, evaluate the model with the remaining data, and repeat this hypothesize-and-test procedure until a good model is found. To deal with different types of outliers, we apply different sampling strategies: one-dimensional sampling for sample outliers and two-dimensional sampling for intra-sample outliers. In addition, the problems arising in traditional RANSAC, i.e., the need for a prior error scale and for stopping criteria, are solved by the developed extended MSSE [2] (Modified Selective Statistical Estimator) framework. Experiments on simulated and real data demonstrate the superior performance of the proposed RANSAC-like algorithms compared to standard PCA and the robust PCA technique [5] based on M-estimation. The results also verify that the proposed algorithms are robust against up to 80% sample outliers or 30% randomly distributed intra-sample outliers.
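The hypothesize-and-test loop described above (random subset, PCA model, consensus count) can be sketched for the sample-outlier case as follows. The subset size, threshold, and refit step are illustrative choices; in particular, a fixed inlier threshold stands in here for the extended-MSSE scale estimation the thesis develops:

```python
import numpy as np

def ransac_pca(X, k, n_trials=100, subset=10, thresh=0.5, rng=None):
    """RANSAC-style PCA sketch: hypothesize a k-dim subspace from a
    random subset, score it by how many samples it reconstructs within
    `thresh`, keep the consensus-maximizing model (one-dimensional
    sampling: whole samples are treated as in- or outliers)."""
    if rng is None:
        rng = np.random.default_rng()
    best_inliers = None
    for _ in range(n_trials):
        idx = rng.choice(len(X), subset, replace=False)
        mu = X[idx].mean(axis=0)
        _, _, Vt = np.linalg.svd(X[idx] - mu, full_matrices=False)
        B = Vt[:k]                                   # hypothesized basis
        resid = np.linalg.norm((X - mu) - (X - mu) @ B.T @ B, axis=1)
        inliers = resid < thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Refit on the consensus set, as in standard RANSAC.
    mu = X[best_inliers].mean(axis=0)
    _, _, Vt = np.linalg.svd(X[best_inliers] - mu, full_matrices=False)
    return mu, Vt[:k], best_inliers

# 200 samples near a 2-D subspace of R^5, plus 50 gross sample outliers.
rng = np.random.default_rng(0)
inl = rng.standard_normal((200, 2)) @ rng.standard_normal((2, 5))
inl += 0.05 * rng.standard_normal(inl.shape)
out = 5.0 * rng.standard_normal((50, 5))
X = np.vstack([inl, out])
mu, B, mask = ransac_pca(X, k=2, rng=rng)
print(mask[:200].mean(), mask[200:].mean())   # inlier / outlier hit rates
```

A standard least-squares PCA fit on the same data would be pulled toward the gross outliers, while the consensus model recovers the 2-D subspace from the inliers alone.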
Shu, Tzu-Yen, and 許子彥. "Face Recognition Using SIFT and PCA." Thesis, 2008. http://ndltd.ncl.edu.tw/handle/36246708718401130979.
Full text華梵大學
資訊管理學系碩士班
96
Many methods for face recognition have been proposed in the literature. Images capturing faces from different viewpoints (frontal and side views) may rapidly decrease the accuracy of face recognition; this is a well-known difficult problem. In this paper, we propose methods to increase the recognition rate. Principal Component Analysis (PCA) is used to obtain face candidates from a face database. The Scale Invariant Feature Transform (SIFT) is well known to produce matches that are invariant to scale and rotation; here, SIFT is used to recognize faces among the candidates extracted by PCA. Some matching errors remain after applying SIFT in our experiments. Therefore, two matching methods, using the y-axis distance and the cross product, are proposed in this paper to improve the recognition rate. Our face database contains 12 persons, each with 9 images from different viewpoints. On this database, using SIFT and PCA, the recognition rate is 95.8%; adding the proposed y-axis-distance and cross-product matching methods raises it to 98.6%. In the future, we will test the proposed methods on face databases with more persons.
Váchová, Ilona. "Sebezkušenostní výcvik v PCA z pohledu frekventantů." Master's thesis, 2003. http://www.nusl.cz/ntk/nusl-366817.
Full textLin, Shih-wei, and 林世偉. "Prediction of Color Channel Characteristic in PCA." Thesis, 2005. http://ndltd.ncl.edu.tw/handle/2y998n.
Full text國立臺灣科技大學
電子工程系
94
Measurements and mathematical methods (Principal Component Analysis, Polynomial Regression, etc.) are used to predict the color characteristics of an LCD's R, G, and B channels: the gamma curve, the tristimulus-value curves, and the spectral information at different digital counts.
Hsiang-WeiCheng and 鄭翔蔚. "PCA-based Facial and Speaker Recognition System." Thesis, 2015. http://ndltd.ncl.edu.tw/handle/5asfgs.
Full textYang, En-Tien, and 楊恩典. "Application of PCA on Dynamic Process Monitoring." Thesis, 2000. http://ndltd.ncl.edu.tw/handle/40461919887740463979.
Full text國立臺灣大學
化學工程學研究所
88
In the field of process monitoring, statistical process control tools are often combined with multivariate analysis techniques to obtain better monitoring performance, and many studies have established a typical pattern of process monitoring. Several useful multivariate analysis techniques exist, such as principal component analysis (PCA), principal component regression (PCR), partial least squares or projection to latent structures (PLS), and canonical variable analysis (CVA). The aim of these techniques is to reduce the complexity of the multivariate process-monitoring problem while establishing the process model. In this article, we propose a new monitoring method named transformed principal component analysis (TPCA), designed to compensate for the effect of process delays. We then apply it to an illustrative example to test its fault-detection performance. Finally, we summarize the advantages and trade-offs of this method in the conclusion.
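Although the thesis's TPCA is not specified in this abstract, the typical PCA monitoring pattern it builds on can be sketched: fit PCA on normal operating data, then flag new samples by Hotelling's T² (variation inside the model) and the squared prediction error (variation outside it). The data, dimensions, and fault below are illustrative:

```python
import numpy as np

def fit_pca_monitor(X_normal, k):
    """Fit a PCA monitoring model on normal operating data; return a
    scorer giving Hotelling's T^2 and the squared prediction error."""
    mu, sd = X_normal.mean(0), X_normal.std(0)
    Z = (X_normal - mu) / sd                 # autoscale the variables
    _, s, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:k].T                             # loading matrix (d x k)
    var = (s[:k] ** 2) / (len(Z) - 1)        # score variances
    def stats(x):
        z = (x - mu) / sd
        t = z @ P                            # scores
        T2 = np.sum(t ** 2 / var)            # variation inside the model
        SPE = np.sum((z - P @ t) ** 2)       # variation outside the model
        return T2, SPE
    return stats

# Normal data: 3 latent factors driving 6 correlated measurements.
rng = np.random.default_rng(0)
W = rng.standard_normal((3, 6))
X = rng.standard_normal((500, 3)) @ W + 0.1 * rng.standard_normal((500, 6))
stats = fit_pca_monitor(X, k=3)

x_ok = rng.standard_normal(3) @ W              # consistent with the model
x_fault = x_ok + np.array([0, 0, 4, 0, 0, 0])  # bias fault on one sensor
ok, fault = stats(x_ok), stats(x_fault)
print(ok[1], fault[1])
```

The sensor bias lies largely outside the 3-dimensional model subspace, so the SPE jumps under the fault while the normal sample stays near zero; control limits on both statistics complete the usual monitoring chart.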
Sun, Jung-Nan, and 孫榮男. "PCA and Neural Networks-based Face Recognition." Thesis, 2003. http://ndltd.ncl.edu.tw/handle/26388041785741771834.
Full text國立高雄第一科技大學
電腦與通訊工程所
91
This thesis first proposes a feature-based method to detect human faces in images. The method exploits the facts that a human face approximates an ellipse and that there is a clear edge between the face and the background in order to locate the face. We analyze the features of a face with principal component analysis and represent each face with forty-five eigenvalues. These eigenvalues are then input into a backpropagation neural network, which acts as a classifier to recognize faces. Using a digital camera, five face pictures were taken of each of thirty persons; these one hundred and fifty pictures make up a small database. The eigenvalues extracted from each person's five face pictures are used to train a small neural network, so thirty small networks are trained, stored in the database, and then used for recognition. Each subnet has only one output neuron, representing one person. Each subnet is trained individually, and all input images can be processed through all subnets independently. When some of the face images have to be modified, only the corresponding trained subnets need to be changed. In the experiments, the closed (inside) test achieves a recognition rate of 100% and the open (outside) test achieves 81%, which is a satisfactory result.
Yang, Tsun Yu, and 楊尊宇. "Data Visualization by PCA, LDA and ICA." Thesis, 2015. http://ndltd.ncl.edu.tw/handle/95190383035489406102.
Full text國立清華大學
資訊系統與應用研究所
103
As internet applications have become widely used, the amount of data has grown rapidly, and big data analysis has become a popular research topic. Data visualization can help us deal with such big data in an intuitive way. We apply linear dimensionality-reduction methods to project the observed data into a lower-dimensional subspace; such methods preserve most of the data's characteristics while omitting an acceptably small amount of information. Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA) are two common linear mapping methods proven to give effective classification results. Independent Component Analysis (ICA) was originally proposed to solve the blind source separation problem. As a linear mapping method, ICA also computes a mapping matrix for data projection, but unlike PCA and LDA, ICA assumes non-Gaussian data distributions so that the original sources can be separated from a mixture. For data visualization, we reduce the dimensionality to 2 in order to present the projected data in Euclidean geometry. The experimental results show that LDA classifies the data better than PCA. An ICA algorithm contains random factors, which may lead to varying results. By means of low-dimensional data visualization, one can imagine or reveal the structure of high-dimensional data, for example, its clustering characteristics.
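The two deterministic projections compared above can be sketched directly in NumPy (ICA is omitted here precisely because of the random factors the abstract mentions). The toy three-class data set is illustrative:

```python
import numpy as np

def pca_2d(X):
    """Unsupervised: project onto the two directions of maximal variance."""
    Z = X - X.mean(0)
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    return Z @ Vt[:2].T

def lda_2d(X, y):
    """Supervised: directions maximizing between- vs within-class scatter."""
    mu = X.mean(0)
    classes = np.unique(y)
    Sw = sum(np.cov(X[y == c].T) * (np.sum(y == c) - 1) for c in classes)
    Sb = sum(np.sum(y == c) * np.outer(X[y == c].mean(0) - mu,
                                       X[y == c].mean(0) - mu)
             for c in classes)
    vals, vecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
    order = np.argsort(vals.real)[::-1]
    W = vecs.real[:, order[:2]]
    return (X - mu) @ W

# Three 5-D Gaussian classes separated along one direction.
rng = np.random.default_rng(0)
X = np.vstack([rng.standard_normal((50, 5)) + 3 * c * np.eye(5)[0]
               for c in range(3)])
y = np.repeat([0, 1, 2], 50)
P, L = pca_2d(X), lda_2d(X, y)
print(P.shape, L.shape)   # both (150, 2), ready to scatter-plot
```

Scatter-plotting the two columns of each projection gives the kind of 2-D visualization the abstract describes; LDA's use of the labels is what lets it separate classes that PCA's variance criterion may mix.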
Wang, Wei-Yann, and 王韋硯. "PCA Based Performance Assessment for Multivariable Processes." Thesis, 2008. http://ndltd.ncl.edu.tw/handle/32346237271782044889.
Full text中原大學
化學工程研究所
96
Abstract: In this research, PCA-based performance assessment for multivariable processes is developed. In a SISO system, the performance bound can easily be estimated from the minimum variance of the single variable. In a MIMO system, however, traditional multivariable performance-assessment benchmarks consider only the trace of the covariance matrix; this may cause false detections, since the correlation among variables is not considered. Furthermore, multivariate PCA (principal component analysis) is good at reducing the dimensionality of multivariate data when monitoring the process with collected normal-operation data: it can tell whether normal performance is achieved, but not whether the process is operating at its best performance. In this research, PCA-based minimum-variance performance bounds for MIMO feedback control systems are proposed. PCA is used to eliminate the cross-correlation among variables, and an autoregressive moving-average (ARMA) filter removes the auto-correlation of the variables in the PCA space. The minimum-variance and achievable-minimum-variance control performance are designed from the minimum of the score variance using the decomposition structure of the PCA model, without the risk of overdetermined or underdetermined problems in the MIMO system. A statistical control chart can then be constructed to assess the current performance of the multivariable system. Unlike in the continuous process, performance assessment of a batch process requires attention to both disturbance changes and set-point changes. Because of the intrinsically dynamic operation and nonlinear behavior, the conventional approach to controller assessment cannot be directly applied; a linear time-variant system is therefore used to derive performance bounds from batch operating data.
The MPCA-based performance assessment for batch processes, an extension of the PCA-based performance assessment for continuous processes, is then developed. The performance bounds at each time point, computed from the deterministic set point and the stochastic disturbance for the controlled-error variance, help create simple MPCA monitoring charts, which easily track the progress of each batch operation. Finally, the applications are illustrated through simulation problems and a quadruple-tank experiment to demonstrate the effectiveness of the proposed performance-assessment techniques in continuous and batch processes.
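The continuous-process part of the scheme (decorrelate the controlled errors with PCA, then compare each score's achieved variance with the innovation variance of a fitted time-series model) can be sketched as follows. An AR least-squares fit stands in here for the thesis's ARMA filtering and minimum-variance benchmark, and the two simulated loops are illustrative:

```python
import numpy as np

def performance_index(E, ar_order=5):
    """Sketch of a PCA-based control-performance index: decorrelate the
    controlled errors with PCA, fit an AR model to each score series,
    and take the ratio of its innovation (one-step prediction) variance
    to the achieved variance. An index near 1 means near-minimum-variance
    performance; a low index means recoverable variance remains."""
    Z = E - E.mean(0)
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    T = Z @ Vt.T                        # uncorrelated score series
    eta = []
    for t in T.T:
        # Fit AR(ar_order) by least squares: t[n] ~ sum_i a_i * t[n-i].
        Y = t[ar_order:]
        X = np.column_stack([t[ar_order - i: -i]
                             for i in range(1, ar_order + 1)])
        a, *_ = np.linalg.lstsq(X, Y, rcond=None)
        innov = Y - X @ a               # one-step-ahead prediction errors
        eta.append(innov.var() / t.var())
    return np.array(eta)

# Well-controlled error ~ white noise (index near 1);
# sluggish loop ~ AR(1) with a slow pole (index well below 1).
rng = np.random.default_rng(0)
white = rng.standard_normal(2000)
slug = np.zeros(2000)
for n in range(1, 2000):
    slug[n] = 0.9 * slug[n - 1] + rng.standard_normal()
E = np.column_stack([white, slug])
eta = performance_index(E)
print(eta)   # one index near 1, the other well below 1
```

Plotting such indices batch by batch, or window by window, gives the kind of simple monitoring chart the abstract describes for tracking how far each loop is from its minimum-variance bound.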