Journal articles on the topic 'Linear principal components analysis'

Consult the top 50 journal articles for your research on the topic 'Linear principal components analysis.'

1

Török, Evelin, István Komlósi, Béla Béri, Imre Füller, Barnabás Vágó, and János Posta. "Principal component analysis of conformation traits in Hungarian Simmental cows." Czech Journal of Animal Science 66, no. 2 (February 15, 2021): 39–45. http://dx.doi.org/10.17221/155/2020-cjas.

Abstract:
The aim of the current research was to analyze the linear type traits of Hungarian Simmental dual-purpose cows scored in the first lactation using principal component analysis and cluster analysis. Data collected by the Association of Hungarian Simmental Breeders were studied. The filtered database contained the results of 8 868 cows born after 1997. Among the main conformation traits, the highest correlation (r = 0.35, P < 0.05) was found between mammary system and feet and legs. Among the linear type traits, the highest correlation was observed between rump length and rump width (r = 0.81, P < 0.05). Using principal component analysis, the main conformation traits were combined into groups; after varimax rotation, three factors together explained 84.5% of the total variance. Cluster analysis verified the results of the principal component analysis, as most of the trait groups were similar. The strongest relationships were observed between feet and legs and mammary system (main conformation traits) and between rump length and rump width (linear type traits).
2

Hiden, H. G., M. J. Willis, M. T. Tham, and G. A. Montague. "Non-linear principal components analysis using genetic programming." Computers & Chemical Engineering 23, no. 3 (February 1999): 413–25. http://dx.doi.org/10.1016/s0098-1354(98)00284-1.

3

Zhang, J., A. J. Morris, and E. B. Martin. "Process Monitoring Using Non-Linear Principal Component Analysis." IFAC Proceedings Volumes 29, no. 1 (June 1996): 6584–89. http://dx.doi.org/10.1016/s1474-6670(17)58739-x.

4

Ruessink, B. G., I. M. J. van Enckevort, and Y. Kuriyama. "Non-linear principal component analysis of nearshore bathymetry." Marine Geology 203, no. 1-2 (January 2004): 185–97. http://dx.doi.org/10.1016/s0025-3227(03)00334-7.

5

Jia, F., E. B. Martin, and A. J. Morris. "Non-linear principal components analysis for process fault detection." Computers & Chemical Engineering 22 (March 1998): S851–S854. http://dx.doi.org/10.1016/s0098-1354(98)00164-1.

6

Rattan, S. S. P., B. G. Ruessink, and W. W. Hsieh. "Non-linear complex principal component analysis of nearshore bathymetry." Nonlinear Processes in Geophysics 12, no. 5 (June 28, 2005): 661–70. http://dx.doi.org/10.5194/npg-12-661-2005.

Abstract:
Complex principal component analysis (CPCA) is a useful linear method for dimensionality reduction of data sets characterized by propagating patterns, where the CPCA modes are linear functions of the complex principal component (CPC), consisting of an amplitude and a phase. The use of non-linear methods, such as the neural-network based circular non-linear principal component analysis (NLPCA.cir) and the recently developed non-linear complex principal component analysis (NLCPCA), may provide a more accurate description of data in cases where the lower-dimensional structure is non-linear. NLPCA.cir extracts non-linear phase information without amplitude variability, while NLCPCA is capable of extracting both. NLCPCA can thus be viewed as a non-linear generalization of CPCA. In this article, NLCPCA is applied to bathymetry data from the sandy barred beaches at Egmond aan Zee (Netherlands), the Hasaki coast (Japan) and Duck (North Carolina, USA) to examine how effective this new method is in comparison to CPCA and NLPCA.cir in representing propagating phenomena. At Duck, the underlying low-dimensional data structure is found to have linear phase and amplitude variability only and, accordingly, CPCA performs as well as NLCPCA. At Egmond, the reduced data structure contains non-linear spatial patterns (asymmetric bar/trough shapes) without much temporal amplitude variability and, consequently, is about equally well modelled by NLCPCA and NLPCA.cir. Finally, at Hasaki, the data structure displays not only non-linear spatial variability but also considerable temporal amplitude variability, and NLCPCA outperforms both CPCA and NLPCA.cir. Because it is difficult to know in advance which of the three models suits the structure of the data, the generalized NLCPCA model can be used in each situation.
7

Kambhatla, Nandakishore, and Todd K. Leen. "Dimension Reduction by Local Principal Component Analysis." Neural Computation 9, no. 7 (October 1, 1997): 1493–516. http://dx.doi.org/10.1162/neco.1997.9.7.1493.

Abstract:
Reducing or eliminating statistical redundancy between the components of high-dimensional vector data enables a lower-dimensional representation without significant loss of information. Recognizing the limitations of principal component analysis (PCA), researchers in the statistics and neural network communities have developed nonlinear extensions of PCA. This article develops a local linear approach to dimension reduction that provides accurate representations and is fast to compute. We exercise the algorithms on speech and image data, and compare performance with PCA and with neural network implementations of nonlinear PCA. We find that both nonlinear techniques can provide more accurate representations than PCA and show that the local linear techniques outperform neural network implementations.
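The local linear idea above — partition the data, then fit a separate PCA in each region — can be sketched in a few lines of numpy. This is a generic illustration, not the authors' algorithm: the data are synthetic, and the partition is a fixed nearest-centre split standing in for their clustering stage.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two synthetic clouds whose dominant directions differ, so a single
# global 1-D PCA cannot represent both well.
a = rng.normal(size=(200, 2)) * [3.0, 0.3] + [-5.0, 0.0]  # spread along x
b = rng.normal(size=(200, 2)) * [0.3, 3.0] + [5.0, 0.0]   # spread along y
X = np.vstack([a, b])

def pca_reconstruct(X, k=1):
    """Project X onto its top-k principal components and map back."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    W = Vt[:k].T
    return mu + (X - mu) @ W @ W.T

global_err = np.mean((X - pca_reconstruct(X)) ** 2)

# Local PCA: split by nearest of two centres (a stand-in for a proper
# clustering step), then fit a separate 1-D PCA per region.
centres = np.array([[-5.0, 0.0], [5.0, 0.0]])
labels = np.argmin(((X[:, None, :] - centres) ** 2).sum(axis=2), axis=1)
parts = [X[labels == j] for j in (0, 1)]
local_err = sum(((P - pca_reconstruct(P)) ** 2).sum() for P in parts) / X.size
```

On data like this, the local reconstruction error falls far below the global one, mirroring the paper's finding that local linear fits can outperform a single global PCA.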
8

Purviance, J. E., M. C. Petzold, and C. Potratz. "A linear statistical FET model using principal component analysis." IEEE Transactions on Microwave Theory and Techniques 37, no. 9 (1989): 1389–94. http://dx.doi.org/10.1109/22.32222.

9

Jiang, Jian-Hui, Ji-Hong Wang, Xia Chu, and Ru-Qin Yu. "Neural network learning to non-linear principal component analysis." Analytica Chimica Acta 336, no. 1-3 (December 1996): 209–22. http://dx.doi.org/10.1016/s0003-2670(96)00359-5.

10

Chan. "Face Biometrics Based on Principal Component Analysis and Linear Discriminant Analysis." Journal of Computer Science 6, no. 7 (July 1, 2010): 693–99. http://dx.doi.org/10.3844/jcssp.2010.693.699.

11

nka, Riya. "Face Recognition Based on Principal Component Analysis and Linear Discriminant Analysis." International Journal of Advanced Research in Electrical, Electronics and Instrumentation Engineering 4, no. 8 (August 20, 2015): 7266–74. http://dx.doi.org/10.15662/ijareeie.2015.0408046.

12

Harris, David. "Principal Components Analysis of Cointegrated Time Series." Econometric Theory 13, no. 4 (February 1997): 529–57. http://dx.doi.org/10.1017/s0266466600005995.

Abstract:
This paper considers the analysis of cointegrated time series using principal components methods. These methods have the advantage of requiring neither the normalization imposed by the triangular error correction model nor the specification of a finite-order vector autoregression. An asymptotically efficient estimator of the cointegrating vectors is given, along with tests for cointegration and tests of certain linear restrictions on the cointegrating vectors. An illustrative application is provided.
13

Grbovic, Mihajlo, Christopher Dance, and Slobodan Vucetic. "Sparse Principal Component Analysis with Constraints." Proceedings of the AAAI Conference on Artificial Intelligence 26, no. 1 (September 20, 2021): 935–41. http://dx.doi.org/10.1609/aaai.v26i1.8316.

Abstract:
Sparse principal component analysis is a variant of classical principal component analysis that finds linear combinations of a small number of features that maximize variance across data. In this paper we propose a methodology for adding two general types of feature grouping constraints into the original sparse PCA optimization procedure. We derive convex relaxations of the considered constraints, ensuring the convexity of the resulting optimization problem. Empirical evaluation on three real-world problems, one in process monitoring sensor networks and two in social networks, serves to illustrate the usefulness of the proposed methodology.
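The paper's constrained convex relaxation cannot be reconstructed from the abstract alone, but the underlying object — a sparse principal component, i.e. a variance-maximizing direction supported on only a few features — can be illustrated with a generic truncated power method (a different, simpler algorithm than the one proposed; the data here are synthetic):

```python
import numpy as np

rng = np.random.default_rng(4)
n, d, k = 400, 10, 3
# Variance driven by a latent factor that touches only features 0..2.
f = rng.normal(size=(n, 1))
X = 0.3 * rng.normal(size=(n, d))
X[:, :k] += f * np.array([2.0, -1.5, 1.0])

def sparse_pc(X, k, iters=100):
    """First sparse PC via truncated power iteration (keep k largest entries)."""
    S = np.cov(X, rowvar=False)
    v = np.ones(S.shape[0]) / np.sqrt(S.shape[0])
    for _ in range(iters):
        v = S @ v
        keep = np.argsort(np.abs(v))[-k:]   # hard-threshold to support size k
        mask = np.zeros_like(v)
        mask[keep] = 1.0
        v = v * mask
        v /= np.linalg.norm(v)
    return v

v = sparse_pc(X, k)   # nonzero only on the three informative features
```

Unlike a dense first PC, the recovered direction is exactly zero outside the informative features, which is the interpretability gain sparse PCA buys.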
14

Shao, R., F. Jia, E. B. Martin, and A. J. Morris. "Wavelets and non-linear principal components analysis for process monitoring." Control Engineering Practice 7, no. 7 (July 1999): 865–79. http://dx.doi.org/10.1016/s0967-0661(99)00039-8.

15

Vinjamuri, Ramana, Vrajeshri Patel, Michael Powell, Zhi-Hong Mao, and Nathan Crone. "Candidates for Synergies: Linear Discriminants versus Principal Components." Computational Intelligence and Neuroscience 2014 (2014): 1–10. http://dx.doi.org/10.1155/2014/373957.

Abstract:
Movement primitives or synergies have been extracted from human hand movements using several matrix factorization, dimensionality reduction, and classification methods. Principal component analysis (PCA) is widely used to obtain the first few significant eigenvectors of covariance that explain most of the variance of the data. Linear discriminant analysis (LDA) is also used as a supervised learning method to classify the hand postures corresponding to the objects grasped. Synergies obtained using PCA are principal component vectors aligned with dominant variances. On the other hand, synergies obtained using LDA are linear discriminant vectors that separate the groups of variances. In this paper, time varying kinematic synergies in the human hand grasping movements were extracted using these two diametrically opposite methods and were evaluated in reconstructing natural and American sign language (ASL) postural movements. We used an unsupervised LDA (ULDA) to extract linear discriminants. The results suggest that PCA outperformed LDA. The uniqueness, advantages, and disadvantages of each of these methods in representing high-dimensional hand movements in reduced dimensions were discussed.
16

Papaioannou, Athanasios, and Stefanos Zafeiriou. "Principal Component Analysis With Complex Kernel: The Widely Linear Model." IEEE Transactions on Neural Networks and Learning Systems 25, no. 9 (September 2014): 1719–26. http://dx.doi.org/10.1109/tnnls.2013.2285783.

17

Ricciardi, Carlo, Antonio Saverio Valente, Kyle Edmund, Valeria Cantoni, Roberta Green, Antonella Fiorillo, Ilaria Picone, Stefania Santini, and Mario Cesarelli. "Linear discriminant analysis and principal component analysis to predict coronary artery disease." Health Informatics Journal 26, no. 3 (January 23, 2020): 2181–92. http://dx.doi.org/10.1177/1460458219899210.

Abstract:
Coronary artery disease is one of the most prevalent chronic pathologies in the modern world, leading to the deaths of thousands of people, both in the United States and in Europe. This article reports the use of data mining techniques to analyse a population of 10,265 people who were evaluated by the Department of Advanced Biomedical Sciences for myocardial ischaemia. Overall, 22 features are extracted, and linear discriminant analysis is implemented twice through both the Knime analytics platform and R statistical programming language to classify patients as either normal or pathological. The former of these analyses includes only classification, while the latter method includes principal component analysis before classification to create new features. The classification accuracies obtained for these methods were 84.5 and 86.0 per cent, respectively, with a specificity over 97 per cent and a sensitivity between 62 and 66 per cent. This article presents a practical implementation of traditional data mining techniques that can be used to help clinicians in decision-making; moreover, principal component analysis is used as an algorithm for feature reduction.
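The pipeline this abstract describes — principal component analysis for feature construction, followed by linear discriminant analysis for the normal/pathological split — can be sketched in numpy on synthetic data. This is a generic two-class Fisher-discriminant sketch, not the paper's Knime/R workflow, and the six toy features stand in for its 22 clinical ones:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 300
# Two 6-D classes separated along the first coordinate (toy stand-in
# for the paper's clinical feature set).
X0 = rng.normal(size=(n, 6)) + np.array([2.0, 0, 0, 0, 0, 0])
X1 = rng.normal(size=(n, 6)) - np.array([2.0, 0, 0, 0, 0, 0])
X = np.vstack([X0, X1])
y = np.array([0] * n + [1] * n)

# Step 1: PCA down to two components.
mu = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
Z = (X - mu) @ Vt[:2].T

# Step 2: two-class Fisher linear discriminant on the PCA scores.
m0, m1 = Z[y == 0].mean(axis=0), Z[y == 1].mean(axis=0)
Sw = np.cov(Z[y == 0], rowvar=False) + np.cov(Z[y == 1], rowvar=False)
w = np.linalg.solve(Sw, m0 - m1)        # discriminant direction
threshold = w @ (m0 + m1) / 2           # midpoint between projected means
pred = (Z @ w < threshold).astype(int)  # class 1 lies on the low side
accuracy = (pred == y).mean()
```

Because the PCA step keeps the class-separating direction in this toy setup, the discriminant recovers nearly all labels; on real clinical data the PCA step mainly serves, as in the paper, to build a compact feature set first.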
18

Witjes, Han, Mark Rijpkema, Marinette van der Graaf, Willem Melssen, Arend Heerschap, and Lutgarde Buydens. "Multispectral magnetic resonance image analysis using principal component and linear discriminant analysis." Journal of Magnetic Resonance Imaging 17, no. 2 (January 22, 2003): 261–69. http://dx.doi.org/10.1002/jmri.10237.

19

Gwelo, Abubakari S. "PRINCIPAL COMPONENTS TO OVERCOME MULTICOLLINEARITY PROBLEM." Oradea Journal of Business and Economics 4, no. 1 (March 2019): 79–91. http://dx.doi.org/10.47535/1991ojbe062.

Abstract:
The impact of ignoring collinearity among predictors is well documented in the statistical literature. This study documents the application of principal components as a remedial solution to this problem. Using a sample of six hundred participants, a linear regression model was fitted and collinearity between predictors was detected using the variance inflation factor (VIF). After confirming the strong relationships between the independent variables, principal components were used to find linear combinations of the variables that capture most of the variance without much loss of information. The set of correlated variables was thus reduced to a smaller number of new variables that are independent of each other but are linear combinations of the original, related variables. To check for remaining relationships between predictors, the dependent variable was regressed on these five principal components; the resulting VIF values for each predictor ranged from 1 to 3, indicating that the multicollinearity problem was eliminated. Finally, another linear regression model was fitted using the principal components as predictors, and no symptoms of multicollinearity were observed. The study concludes that principal component analysis is an appropriate method for resolving collinearity among variables, and that it produces better estimation and prediction than ordinary least squares when predictors are related.
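The remedy the abstract describes — diagnose multicollinearity with VIF, then switch to uncorrelated principal component scores — can be sketched as follows. The data are synthetic, and VIF is computed via the standard identity as the diagonal of the inverse correlation matrix:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 600
z = rng.normal(size=n)
# Three predictors that are all noisy copies of one factor -> collinear.
X = np.column_stack([z + 0.1 * rng.normal(size=n),
                     z + 0.1 * rng.normal(size=n),
                     z + 0.1 * rng.normal(size=n)])

def vif(X):
    """Variance inflation factors: diagonal of the inverse correlation matrix."""
    corr = np.corrcoef(X, rowvar=False)
    return np.diag(np.linalg.inv(corr))

raw_vif = vif(X)        # far above the usual 5-10 warning level

# Replace the predictors by their (mutually uncorrelated) PC scores.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T
pc_vif = vif(scores)    # all ~1: multicollinearity removed
```

Regressing the response on `scores` rather than on `X` then yields stable coefficient estimates, which is the remedial effect the study reports.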
20

Mokeev, A. V., and V. V. Mokeev. "Pattern recognition by means of linear discriminant analysis and the principal components analysis." Pattern Recognition and Image Analysis 25, no. 4 (October 2015): 685–91. http://dx.doi.org/10.1134/s1054661815040185.

21

Shao, R., F. Jia, E. B. Martin, and A. J. Morris. "Fault Detection Using Wavelet Filtering and Non-Linear Principal Components Analysis." IFAC Proceedings Volumes 31, no. 10 (June 1998): 23–28. http://dx.doi.org/10.1016/s1474-6670(17)37530-4.

22

Jia, F., E. B. Martin, and A. J. Morris. "Non-linear principal components analysis with application to process fault detection." International Journal of Systems Science 31, no. 11 (January 2000): 1473–87. http://dx.doi.org/10.1080/00207720050197848.

23

Danklmayer, A., M. Chandra, and E. Lüneburg. "Principal Component Analysis In Radar Polarimetry." Advances in Radio Science 3 (May 13, 2005): 399–400. http://dx.doi.org/10.5194/ars-3-399-2005.

Abstract:
Second order moments of multivariate (often Gaussian) joint probability density functions can be described by the covariance or normalised correlation matrices or by the Kennaugh matrix (Kronecker matrix). In radar polarimetry, the application of the covariance matrix is known as target decomposition theory, which is a special application of the extremely versatile Principal Component Analysis (PCA). The basic idea of PCA is to convert a data set consisting of correlated random variables into a new set of uncorrelated variables and to order the new variables according to the value of their variances. It is important to stress that uncorrelatedness does not necessarily mean independence, which is used in the much stronger concept of Independent Component Analysis (ICA). Both concepts agree for multivariate Gaussian distribution functions, representing the most random and least structured distribution. In this contribution, we propose a new approach to applying the concept of PCA to radar polarimetry. New uncorrelated random variables are introduced by means of linear transformations with well-determined loading coefficients. This, in turn, allows the decomposition of the original random backscattering target variables into three point targets with new random uncorrelated variables whose variances agree with the eigenvalues of the covariance matrix. This allows a new interpretation of existing decomposition theorems.
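Independent of the radar-polarimetry application, the PCA step the abstract describes — converting correlated random variables into uncorrelated ones ordered by variance, with the variances equal to the eigenvalues of the covariance matrix — is a small eigendecomposition exercise, sketched here on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Three correlated variables driven by a common factor x1.
x1 = rng.normal(size=500)
X = np.column_stack([x1,
                     0.8 * x1 + 0.2 * rng.normal(size=500),
                     -0.5 * x1 + 0.3 * rng.normal(size=500)])

def pca(X):
    """Return PC scores and their variances, sorted in descending order."""
    Xc = X - X.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    order = np.argsort(eigvals)[::-1]   # eigh returns ascending order
    return Xc @ eigvecs[:, order], eigvals[order]

scores, variances = pca(X)   # scores are uncorrelated; variances descend
```

The sample covariance of `scores` comes out diagonal, with the diagonal equal to the (descending) eigenvalues — exactly the property the abstract builds on.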
24

San Ye, Guo Ke, and Zhu Yi. "Separability Promotion Algorithm Based on Kernel Principal Component Analysis plus Linear Discriminant Analysis." International Journal of Advancements in Computing Technology 5, no. 6 (March 31, 2013): 1048–57. http://dx.doi.org/10.4156/ijact.vol5.issue6.123.

25

Zakaria, Nur Khalidah. "ASD Children Gait Classification Based On Principal Component Analysis and Linear Discriminant Analysis." International Journal of Emerging Trends in Engineering Research 8, no. 6 (June 25, 2020): 2438–45. http://dx.doi.org/10.30534/ijeter/2020/38862020.

26

Zhu, Zhibo, Wei Yan, Yongan Wang, Yang Zhao, Tao Zhang, and Junshuo Huang. "Noise Analysis Method of Radiated EMI based on Non-linear Principal Component Analysis." Applied Computational Electromagnetics Society 35, no. 10 (December 8, 2020): 1144–52. http://dx.doi.org/10.47037/2020.aces.j.351006.

Abstract:
Aiming at the radiated electromagnetic interference (EMI) noise of electronic equipment, a novel method of radiated EMI noise analysis based on a non-linear principal component analysis (NLPCA) algorithm is proposed in this paper. In order to obtain multiple independent common-mode/differential-mode radiated sources, and to find the sources that cause radiated noise exceeding the standard limit, the NLPCA algorithm is used to process near-field radiated signals superimposed from multiple radiated sources. The simulation results show that NLPCA can successfully screen out the radiated EMI noises which exceed the standard limit. Moreover, experiments are carried out with three models: double-common-mode hybrid sources, double-differential-mode hybrid sources and common-differential-mode hybrid sources. Compared with the traditional independent component analysis (ICA) algorithm, the proposed method separates radiated EMI noise sources more accurately and quickly; the accuracy of the NLPCA algorithm is 10% higher than that of the ICA algorithm. This work will help trace radiated EMI noise sources and provide a theoretical basis for their future suppression.
27

Li Hai-sen, Zhang Yan-ning, Yao Rui, and Sun Jin-qiu. "Parameter estimation of linear motion blur based on principal component analysis." Optics and Precision Engineering 21, no. 10 (2013): 2656–63. http://dx.doi.org/10.3788/ope.20132110.2656.

28

Ma, Steven. "Principal Component Analysis in Linear Regression Survival Model with Microarray Data." Journal of Data Science 5, no. 2 (July 12, 2021): 183–98. http://dx.doi.org/10.6339/jds.2007.05(2).326.

29

Mohtasham, Jalil, Ali Shams Nateri, and Hale Khalili. "Textile colour matching using linear and exponential weighted principal component analysis." Coloration Technology 128, no. 3 (April 26, 2012): 199–203. http://dx.doi.org/10.1111/j.1478-4408.2012.00362.x.

30

Zhu, Yani, Chaoyang Zhu, and Xiaoxin Li. "Improved principal component analysis and linear regression classification for face recognition." Signal Processing 145 (April 2018): 175–82. http://dx.doi.org/10.1016/j.sigpro.2017.11.018.

31

Roopa, H., and T. Asha. "A Linear Model Based on Principal Component Analysis for Disease Prediction." IEEE Access 7 (2019): 105314–18. http://dx.doi.org/10.1109/access.2019.2931956.

32

Pankavich, Stephen, and Rebecca Swanson. "Principal Component Analysis: Resources for an Essential Application of Linear Algebra." PRIMUS 25, no. 5 (December 20, 2014): 400–420. http://dx.doi.org/10.1080/10511970.2014.993446.

33

Peng, Chong, Yongyong Chen, Zhao Kang, Chenglizhao Chen, and Qiang Cheng. "Robust principal component analysis: A factorization-based approach with linear complexity." Information Sciences 513 (March 2020): 581–99. http://dx.doi.org/10.1016/j.ins.2019.09.074.

34

Van Pelt, Wilfrid, and Jan Van Rijckevorsel. "Non-linear principal component analysis of maximum expiratory flow-volume curves." Applied Stochastic Models and Data Analysis 2, no. 1-2 (1986): 1–12. http://dx.doi.org/10.1002/asm.3150020102.

35

Li, Heping, Yu Ren, Fan Yu, Dongliang Song, Lizhe Zhu, Shibo Yu, Siyuan Jiang, and Shuang Wang. "Raman Microspectral Study and Classification of the Pathological Evolution of Breast Cancer Using Both Principal Component Analysis-Linear Discriminant Analysis and Principal Component Analysis-Support Vector Machine." Journal of Spectroscopy 2021 (April 21, 2021): 1–11. http://dx.doi.org/10.1155/2021/5572782.

Abstract:
To facilitate the enhanced reliability of Raman-based tumor detection and analytical methodologies, an ex vivo Raman spectral investigation was conducted to identify distinct compositional information of healthy (H), ductal carcinoma in situ (DCIS), and invasive ductal carcinoma (IDC) tissue. Then, principal component analysis-linear discriminant analysis (PCA-LDA) and principal component analysis-support vector machine (PCA-SVM) models were constructed for distinguishing spectral features among the different tissue groups. Spectral analysis highlighted differences in levels of unsaturated and saturated lipids, carotenoids, protein, and nucleic acid between healthy and cancerous tissue and variations in the levels of nucleic acid, protein, and phenylalanine between DCIS and IDC. Both classification models proved extremely efficient at discriminating tissue pathological types, with 99% accuracy for PCA-LDA and 100%, 100%, and 96.7% for PCA-SVM analysis based on the linear kernel, polynomial kernel, and radial basis function (RBF), respectively, while the PCA-SVM algorithm greatly simplified the complexity of calculation without sacrificing performance. The present study demonstrates that Raman spectroscopy combined with multivariate analysis technology has considerable potential for improving the efficiency and performance of breast cancer diagnosis.
36

Zhao, Wei Dong, Chang Liu, and Tao Yan. "Incremental Tensor Principal Component Analysis for Image Recognition." Advanced Materials Research 710 (June 2013): 584–88. http://dx.doi.org/10.4028/www.scientific.net/amr.710.584.

Abstract:
To address the disadvantages of the traditional off-line vector-based learning algorithm, this paper proposes an Incremental Tensor Principal Component Analysis (ITPCA) algorithm. It represents an image as tensor data and performs incremental principal component analysis learning based on an update-SVD technique. On the one hand, the proposed algorithm helps preserve the structural information of the image; on the other hand, it solves the training problem for new samples. Experiments on handwritten numeral recognition have demonstrated that the algorithm achieves better performance than the traditional vector-based Incremental Principal Component Analysis (IPCA) and Multi-linear Principal Component Analysis (MPCA) algorithms.
37

Niaki, Seyed Taghi Akhavan, Majid Khedmati, and Mir Emad Soleymanian. "Statistical Monitoring of Autocorrelated Simple Linear Profiles Based on Principal Components Analysis." Communications in Statistics - Theory and Methods 44, no. 21 (November 2, 2015): 4454–75. http://dx.doi.org/10.1080/03610926.2013.835417.

38

Ahn, Jong-Hoon, and Jong-Hoon Oh. "A Constrained EM Algorithm for Principal Component Analysis." Neural Computation 15, no. 1 (January 1, 2003): 57–65. http://dx.doi.org/10.1162/089976603321043694.

Abstract:
We propose a constrained EM algorithm for principal component analysis (PCA) using a coupled probability model derived from single-standard factor analysis models with isotropic noise structure. The single probabilistic PCA, especially for the case where there is no noise, can find only a vector set that is a linear superposition of principal components and requires postprocessing, such as diagonalization of symmetric matrices. By contrast, the proposed algorithm finds the actual principal components, which are sorted in descending order of eigenvalue size and require no additional calculation or postprocessing. The method is easily applied to kernel PCA. It is also shown that the new EM algorithm is derived from a generalized least-squares formulation.
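A minimal numpy sketch of the plain (zero-noise) EM iteration for PCA illustrates the point the abstract makes: unconstrained EM recovers only an arbitrary basis of the principal subspace, so postprocessing is needed to get the ordered components themselves. This is the standard unconstrained iteration, not the authors' constrained algorithm, and the data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(5)
n, d, k = 500, 5, 2
# Synthetic data with two dominant directions (variances ~9 and ~4).
Y = rng.normal(size=(n, d)) * np.array([3.0, 2.0, 0.3, 0.3, 0.3])
Yc = (Y - Y.mean(axis=0)).T           # d x n, centred

W = rng.normal(size=(d, k))           # random initial loadings
for _ in range(50):
    # E-step: latent coordinates under the current loadings (zero-noise limit).
    Xl = np.linalg.solve(W.T @ W, W.T @ Yc)
    # M-step: refit the loadings to the data.
    W = Yc @ Xl.T @ np.linalg.inv(Xl @ Xl.T)

# W spans the principal subspace, but its columns are an arbitrary basis;
# as the abstract notes, plain EM needs postprocessing (e.g. an SVD of W)
# to obtain the actual, variance-ordered principal components.
Q, _ = np.linalg.qr(W)                # orthonormal basis of the learned subspace
```

Checking the learned subspace against the top two sample eigenvectors confirms it spans the right plane even though its columns are not the principal components themselves, which is precisely the gap the paper's constrained EM closes.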
39

Astuti, Widi, and Adiwijaya Adiwijaya. "Principal Component Analysis Sebagai Ekstraksi Fitur Data Microarray Untuk Deteksi Kanker Berbasis Linear Discriminant Analysis." JURNAL MEDIA INFORMATIKA BUDIDARMA 3, no. 2 (April 14, 2019): 72. http://dx.doi.org/10.30865/mib.v3i2.1161.

Abstract:
Cancer is one of the leading causes of death globally, and early detection allows better treatment for patients. One method to detect cancer is microarray data classification. However, microarray data have high dimensionality, which complicates the classification process. Linear discriminant analysis is a classification technique that is easy to implement and has good accuracy, but it has difficulty handling high-dimensional data. Therefore, principal component analysis, a feature extraction technique, is used to optimize linear discriminant analysis performance. Based on the results of the study, the use of principal component analysis increased accuracy by up to 29.04% and F1 score by 64.28% for colon cancer data.
40

Machidon, Alina L., Fabio Del Frate, Matteo Picchiani, Octavian M. Machidon, and Petre L. Ogrutan. "Geometrical Approximated Principal Component Analysis for Hyperspectral Image Analysis." Remote Sensing 12, no. 11 (May 26, 2020): 1698. http://dx.doi.org/10.3390/rs12111698.

Abstract:
Principal Component Analysis (PCA) is a method based on statistics and linear algebra techniques, used in hyperspectral satellite imagery for the data dimensionality reduction required to speed up and increase the performance of subsequent hyperspectral image processing algorithms. This paper introduces gaPCA, a PCA approximation method based on a geometrical construction approach, as an alternative algorithm for computing the principal components, and presents its application to remote sensing hyperspectral images. gaPCA has the potential to yield better land classification results by preserving a higher degree of information related to the smaller objects of the scene (or to rare spectral objects) than the standard PCA, being focused not on maximizing the variance of the data, but the range. The paper validates gaPCA on four distinct datasets and performs comparative evaluations and metrics with the standard PCA method. A comparative land classification benchmark of gaPCA and the standard PCA using statistical-based tools is also described. The results show gaPCA is an effective dimensionality-reduction tool, with performance similar to, and in several cases even higher than, standard PCA on specific image classification tasks. gaPCA was shown to be more suitable for hyperspectral images with small structures or objects that need to be detected, or where predominantly spectral classes or spectrally similar classes are present.
41

Niu, Dong Xiao, Qiong Wang, Peng Wang, Shu Yi Zhou, Wei Dong Liu, and Xiao Yan Yu. "Electricity Competitiveness Evaluation Research Based on Principal Component Analysis." Advanced Materials Research 960-961 (June 2014): 1467–72. http://dx.doi.org/10.4028/www.scientific.net/amr.960-961.1467.

Abstract:
This paper constructs the evaluation index system of electricity competitiveness in terminal energy consumption, evaluates the electricity competitiveness in Ningxia region from 2005 to 2011 using principal component analysis (PCA), and compares the evaluation results of PCA, the linear weighted method, the comprehensive index method and TOPSIS-grey correlation method. The compatibility degree and difference degree of each method are analyzed and calculated to verify the applicability of the PCA. The results show that PCA is the most scientific and appropriate evaluation method.
42

Castaño, A., F. Fernández-Navarro, Annalisa Riccardi, and C. Hervás-Martínez. "Enforcement of the principal component analysis–extreme learning machine algorithm by linear discriminant analysis." Neural Computing and Applications 27, no. 6 (June 25, 2015): 1749–60. http://dx.doi.org/10.1007/s00521-015-1974-0.

43

Mallary, C., C. J. Berg, John R. Buck, Amit Tandon, and Alan Andonian. "Acoustic rainfall detection with linear discriminant functions of principal components." Journal of the Acoustical Society of America 151, no. 4 (April 2022): A149. http://dx.doi.org/10.1121/10.0010934.

Full text
Abstract:
Ma and Nystuen (2005) pioneered passive acoustic measurement of rainfall rates. This project extends their work with signal-processing algorithms that exploit the full frequency band of the acoustic signals. We also extend Schwock and Abadi’s order-statistic power spectral density (PSD) estimation for outlier rejection, rejecting recreational anthropogenic noise sources and diurnal biological sources using two hydrophones spaced 1 m apart. Ma and Nystuen reduced the data dimensionality by extracting a few "discriminant frequencies." Our proposed detection algorithm instead applies principal component analysis (PCA) to reduce the estimated PSD to two principal components, and linear discriminant analysis (LDA) provides a simple detection statistic from the two-dimensional principal components. We evaluated our algorithm on four months of acoustic and meteorological data collected from a dock in New Bedford, MA in shallow water (3 m deep). At a 1% false-alarm rate, the proposed PCA/LDA algorithm correctly detected 36% (±7%) of rain events exceeding 1 mm/hr, including 64% (±7%) of the rain by volume. Applying Ma and Nystuen’s algorithm to the same data set at the same false-alarm rate detected 23% (±11%) of events containing 52% (±26%) of the rainfall volume. [Work supported by ONR.]
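The PCA-then-LDA pipeline the abstract describes can be sketched generically: project high-dimensional feature vectors (here standing in for PSD estimates) onto two principal components, then form a Fisher linear discriminant as a one-dimensional detection statistic. This is a hedged illustration with synthetic data, not the authors' implementation; all names and the toy feature dimensions are assumptions.

```python
import numpy as np

def pca_2d(X):
    """Project feature vectors onto their first two principal components."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:2].T

def fisher_lda_statistic(Z, y):
    """Fisher LDA: collapse 2-D PCA scores into a scalar detection statistic."""
    z0, z1 = Z[y == 0], Z[y == 1]
    Sw = np.cov(z0, rowvar=False) + np.cov(z1, rowvar=False)  # within-class scatter
    w = np.linalg.solve(Sw, z1.mean(axis=0) - z0.mean(axis=0))
    return Z @ w  # threshold this scalar to trade detections vs. false alarms

# Synthetic stand-in: class 1 ("rain") PSD features shifted relative to class 0
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 8)), rng.normal(2, 1, (50, 8))])
y = np.repeat([0, 1], 50)
stat = fisher_lda_statistic(pca_2d(X), y)
print(stat[y == 1].mean() > stat[y == 0].mean())  # True
```

Sweeping the threshold on `stat` traces out the detection-rate/false-alarm trade-off quoted in the abstract.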
APA, Harvard, Vancouver, ISO, and other styles
44

B S, Lokasree. "Data Analysis and Data Classification in Machine Learning using Linear Regression and Principal Component Analysis." Turkish Journal of Computer and Mathematics Education (TURCOMAT) 12, no. 2 (April 11, 2021): 835–44. http://dx.doi.org/10.17762/turcomat.v12i2.1092.

Full text
Abstract:
In this paper, a step-by-step procedure for implementing linear regression and principal component analysis is explained, with two examples for each model, to predict continuous values of target variables. Linear regression methods are widely used in prediction, forecasting, and error reduction, while principal component analysis is applied in facial recognition, computer vision, etc. For principal component analysis, it is explained how to select a projection direction with respect to variance, and a Lagrange multiplier is used to maximize the principal-component objective so that an optimized solution is obtained.
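For the linear-regression half of the paper's subject matter, an ordinary-least-squares fit for continuous target prediction can be sketched as follows (a generic illustration on synthetic data, not the paper's own examples):

```python
import numpy as np

def linear_regression(X, y):
    """Ordinary least squares: returns (intercept, coefficient vector)."""
    A = np.column_stack([np.ones(len(X)), X])     # prepend a bias column
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)  # minimize ||A @ beta - y||
    return beta[0], beta[1:]

# Toy data generated from y = 3 + 2x plus small noise
rng = np.random.default_rng(2)
x = np.linspace(0, 10, 50).reshape(-1, 1)
y = 3 + 2 * x.ravel() + rng.normal(0, 0.1, 50)
intercept, coefs = linear_regression(x, y)
print(intercept, coefs[0])  # intercept ≈ 3, slope ≈ 2
```

The fitted model then predicts a continuous target for any new input via `intercept + coefs @ x_new`.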
APA, Harvard, Vancouver, ISO, and other styles
45

UESUGI, Ryo, Katsuhiro HONDA, Hidetomo ICHIHASHI, and Akira NOTSU. "Local Principal Component Analysis for Mixed Databases Based on Linear Fuzzy Clustering." Journal of Japan Society for Fuzzy Theory and Intelligent Informatics 19, no. 3 (2007): 287–98. http://dx.doi.org/10.3156/jsoft.19.287.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Yang, Stephen J. H., Owen H. T. Lu, Anna Y. Q. Huang, Jeff C. H. Huang, Hiroaki Ogata, and Albert J. Q. Lin. "Predicting Students' Academic Performance Using Multiple Linear Regression and Principal Component Analysis." Journal of Information Processing 26 (2018): 170–76. http://dx.doi.org/10.2197/ipsjjip.26.170.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Selvi. "An Efficient Age Estimation System based on Multi Linear Principal Component Analysis." Journal of Computer Science 7, no. 10 (October 1, 2011): 1497–504. http://dx.doi.org/10.3844/jcssp.2011.1497.1504.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Tan, M. H., and J. K. Hammond. "A non-parametric approach for linear system identification using principal component analysis." Mechanical Systems and Signal Processing 21, no. 4 (May 2007): 1576–600. http://dx.doi.org/10.1016/j.ymssp.2006.07.005.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Hot, A., G. Kerschen, E. Foltête, and S. Cogan. "Detection and quantification of non-linear structural behavior using principal component analysis." Mechanical Systems and Signal Processing 26 (January 2012): 104–16. http://dx.doi.org/10.1016/j.ymssp.2011.06.006.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Lei, Min, and Guang Meng. "Symplectic Principal Component Analysis: A New Method for Time Series Analysis." Mathematical Problems in Engineering 2011 (2011): 1–14. http://dx.doi.org/10.1155/2011/793429.

Full text
Abstract:
Experimental data are often very complex, since the underlying dynamical system may be unknown and the data may be heavily corrupted by noise. Properly analyzing the data to extract maximal information about the underlying dynamical system is a crucial task. This paper presents a novel principal component analysis (PCA) method based on symplectic geometry, called symplectic PCA (SPCA), to study nonlinear time series. Being nonlinear, it differs from the traditional PCA method based on linear singular value decomposition (SVD) and is thus expected to represent nonlinear, and especially chaotic, data better than PCA. Using chaotic Lorenz time-series data, we show that this is indeed the case. Furthermore, we show that SPCA can conveniently reduce measurement noise.
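The noise-reduction idea the abstract mentions can be illustrated with the linear SVD-based PCA that SPCA generalizes: delay-embed a noisy time series into a trajectory matrix, then reconstruct it from only the leading principal directions. This is a sketch of the standard linear baseline, not of the symplectic decomposition itself; the window length and component count are illustrative assumptions.

```python
import numpy as np

def pca_denoise(trajectory_matrix, k):
    """Reconstruct a delay-embedded series from its k leading principal
    directions (linear SVD-based PCA; SPCA swaps the SVD for a
    symplectic-geometry decomposition)."""
    mean = trajectory_matrix.mean(axis=0)
    Xc = trajectory_matrix - mean
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T @ Vt[:k] + mean  # rank-k reconstruction

# Delay-embed a noisy sine wave and keep its 2 dominant components
t = np.linspace(0, 8 * np.pi, 500)
noisy = np.sin(t) + np.random.default_rng(3).normal(0, 0.3, t.size)
embed = np.lib.stride_tricks.sliding_window_view(noisy, 20)   # (481, 20)
clean = pca_denoise(embed, 2)

true_embed = np.lib.stride_tricks.sliding_window_view(np.sin(t), 20)
print(np.sqrt(np.mean((embed - true_embed) ** 2)),
      np.sqrt(np.mean((clean - true_embed) ** 2)))  # RMSE drops after denoising
```

A sinusoid occupies an essentially two-dimensional subspace in delay coordinates, so the rank-2 reconstruction discards most of the noise while keeping the signal.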
APA, Harvard, Vancouver, ISO, and other styles
