Journal articles on the topic "Dimensionality reduction analysis"

To see the other types of publications on this topic, follow the link: Dimensionality reduction analysis.

Format your source in APA, MLA, Chicago, Harvard, and other citation styles

Consult the top 50 journal articles for your research on the topic "Dimensionality reduction analysis".

An "Add to bibliography" button is available next to every work in the list. Use it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication as a .pdf file and read its abstract online, whenever these are available in the metadata.

Browse journal articles across a wide range of disciplines and compile your bibliography correctly.

1

Vats, Deepak, and Avinash Sharma. "Dimensionality Reduction Techniques: Comparative Analysis." Journal of Computational and Theoretical Nanoscience 17, no. 6 (June 1, 2020): 2684–88. http://dx.doi.org/10.1166/jctn.2020.8967.

Abstract:
Real-world data has seen exponential growth in dimensionality. Examples of high-dimensional data include speech signals, sensor data, medical data, criminal data, and data used in recommendation systems for domains such as news, movies (Netflix), and e-commerce. To improve learning accuracy in machine learning and to enhance mining performance, redundant features and features irrelevant to the mining or learning task must be removed from such high-dimensional datasets. Many supervised and unsupervised dimensionality reduction methodologies exist in the literature. The objective of this paper is to present the most prominent methodologies in the field of dimensionality reduction and to highlight the advantages and disadvantages of these algorithms, which can serve as a starting point for beginners in the field.
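As a quick illustration of the kind of unsupervised technique this survey covers, here is a minimal sketch (assuming scikit-learn and NumPy; the synthetic data and parameter choices are illustrative, not from the paper) of PCA stripping redundant, correlated dimensions:

```python
# A minimal sketch: 100 correlated features generated from 5 true degrees of
# freedom, reduced back to 5 dimensions with PCA.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
latent = rng.normal(size=(500, 5))                         # 5 true degrees of freedom
mixing = rng.normal(size=(5, 100))
X = latent @ mixing + 0.01 * rng.normal(size=(500, 100))   # 100 redundant features

pca = PCA(n_components=5)
X_low = pca.fit_transform(X)
print(X_low.shape)                                         # (500, 5)
print(pca.explained_variance_ratio_.sum())                 # close to 1.0
```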
2

Xie, Fuding, Yutao Fan, and Ming Zhou. "Dimensionality Reduction by Weighted Connections between Neighborhoods." Abstract and Applied Analysis 2014 (2014): 1–5. http://dx.doi.org/10.1155/2014/928136.

Abstract:
Dimensionality reduction is the transformation of high-dimensional data into a meaningful representation of reduced dimensionality. This paper introduces a dimensionality reduction technique based on weighted connections between neighborhoods that improves the K-Isomap method, attempting to preserve the relationships between neighborhoods perfectly during dimensionality reduction. The validity of the proposal is tested on three typical examples widely employed in manifold-based algorithms. The experimental results show that the proposed method preserves the local topology of the dataset well while transforming it from high-dimensional space into a new low-dimensional dataset.
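For orientation, a minimal sketch of the stock Isomap embedding that the paper's weighted-neighborhood variant builds on (assuming scikit-learn; the S-curve benchmark and neighborhood size are illustrative choices, not the paper's setup):

```python
# Embed a classic 3D manifold benchmark into 2D with plain Isomap.
from sklearn.datasets import make_s_curve
from sklearn.manifold import Isomap

X, color = make_s_curve(n_samples=1000, random_state=0)
embedding = Isomap(n_neighbors=10, n_components=2).fit_transform(X)
print(embedding.shape)  # (1000, 2): the unrolled 2D coordinates
```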
3

Fujiwara, Takanori, Xinhai Wei, Jian Zhao, and Kwan-Liu Ma. "Interactive Dimensionality Reduction for Comparative Analysis." IEEE Transactions on Visualization and Computer Graphics 28, no. 1 (January 2022): 758–68. http://dx.doi.org/10.1109/tvcg.2021.3114807.

4

Masram, M. S., and T. Diwan. "Microblog Dimensionality Reduction With Semantic Analysis." International Journal of Computer Sciences and Engineering 6, no. 1 (January 31, 2018): 342–46. http://dx.doi.org/10.26438/ijcse/v6i1.342346.

5

Liang, Zhizheng, Shixiong Xia, and Yong Zhou. "Normalized discriminant analysis for dimensionality reduction." Neurocomputing 110 (June 2013): 153–59. http://dx.doi.org/10.1016/j.neucom.2012.12.007.

6

Schott, James R. "Dimensionality reduction in quadratic discriminant analysis." Computational Statistics & Data Analysis 16, no. 2 (August 1993): 161–74. http://dx.doi.org/10.1016/0167-9473(93)90111-6.

7

Kumar, Aswani. "Analysis of unsupervised dimensionality reduction techniques." Computer Science and Information Systems 6, no. 2 (2009): 217–27. http://dx.doi.org/10.2298/csis0902217k.

Abstract:
Domains such as text and images contain large amounts of redundancy and ambiguity among their attributes, which results in considerable noise (i.e., the data are high-dimensional). Retrieving data from high-dimensional datasets is a big challenge. Dimensionality reduction techniques have been a successful avenue for automatically extracting latent concepts by removing noise and reducing the complexity of processing high-dimensional data. In this paper we conduct a systematic study comparing unsupervised dimensionality reduction techniques for the text retrieval task. We analyze these techniques in terms of complexity, approximation error, and retrieval quality, with experiments on four test document collections.
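One representative unsupervised reduction for text retrieval of the kind compared in this study is latent semantic analysis, i.e., truncated SVD applied to TF-IDF vectors. A minimal sketch, assuming scikit-learn (the toy documents are illustrative):

```python
# Map sparse TF-IDF document vectors into a small "concept" space with truncated SVD.
from sklearn.decomposition import TruncatedSVD
from sklearn.feature_extraction.text import TfidfVectorizer

docs = ["dimensionality reduction for text", "text retrieval with latent concepts",
        "noise removal in high dimensional data"]
tfidf = TfidfVectorizer().fit_transform(docs)   # sparse term-document matrix
lsa = TruncatedSVD(n_components=2, random_state=0)
doc_vectors = lsa.fit_transform(tfidf)          # documents in latent-concept space
print(doc_vectors.shape)                        # (3, 2)
```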
8

Ngo, T. T., M. Bellalij, and Y. Saad. "The Trace Ratio Optimization Problem for Dimensionality Reduction." SIAM Journal on Matrix Analysis and Applications 31, no. 5 (January 2010): 2950–71. http://dx.doi.org/10.1137/090776603.

9

Wang, Shanshan, Lan Bai, Xu Chen, Zhen Wang, and Yuan-Hai Shao. "Divergent Projection Analysis for Unsupervised Dimensionality Reduction." Procedia Computer Science 199 (2022): 384–91. http://dx.doi.org/10.1016/j.procs.2022.01.047.

10

Yuan, Sen, Xia Mao, and Lijiang Chen. "Multilinear Spatial Discriminant Analysis for Dimensionality Reduction." IEEE Transactions on Image Processing 26, no. 6 (June 2017): 2669–81. http://dx.doi.org/10.1109/tip.2017.2685343.

11

Brown, W. Michael, Shawn Martin, Sara N. Pollock, Evangelos A. Coutsias, and Jean-Paul Watson. "Algorithmic dimensionality reduction for molecular structure analysis." Journal of Chemical Physics 129, no. 6 (August 14, 2008): 064118. http://dx.doi.org/10.1063/1.2968610.

12

Avron, Haim, Christos Boutsidis, Sivan Toledo, and Anastasios Zouzias. "Efficient Dimensionality Reduction for Canonical Correlation Analysis." SIAM Journal on Scientific Computing 36, no. 5 (January 2014): S111–S131. http://dx.doi.org/10.1137/130919222.

13

Huang, Ke-Kun, Dao-Qing Dai, and Chuan-Xian Ren. "Regularized coplanar discriminant analysis for dimensionality reduction." Pattern Recognition 62 (February 2017): 87–98. http://dx.doi.org/10.1016/j.patcog.2016.08.024.

14

Murthy, K. Ramachandra, and Ashish Ghosh. "Moments discriminant analysis for supervised dimensionality reduction." Neurocomputing 237 (May 2017): 114–32. http://dx.doi.org/10.1016/j.neucom.2016.09.048.

15

Kenny, Sean, Luis Crespo, and Dan Giesy. "Dimensionality reduction for uncertain dynamic systems." International Journal for Numerical Methods in Engineering 80, no. 6–7 (November 5, 2009): 767–88. http://dx.doi.org/10.1002/nme.2591.

16

Sun, Yu-Yin, Michael Ng, and Zhi-Hua Zhou. "Multi-Instance Dimensionality Reduction." Proceedings of the AAAI Conference on Artificial Intelligence 24, no. 1 (July 3, 2010): 587–92. http://dx.doi.org/10.1609/aaai.v24i1.7700.

Abstract:
Multi-instance learning deals with problems that treat bags of instances as training examples. In single-instance learning problems, dimensionality reduction is an essential step for high-dimensional data analysis and has been studied for years. The curse of dimensionality also exists in multi-instance learning tasks, yet this difficult problem has not been studied before. Direct application of existing single-instance dimensionality reduction objectives to multi-instance learning tasks may not work well, since it ignores the characteristic of multi-instance learning that the labels of bags are known while the labels of instances are unknown. In this paper, we propose an effective model and develop an efficient algorithm to solve the multi-instance dimensionality reduction problem. We formulate the objective as an optimization problem by imposing orthonormality and sparsity constraints on the projection matrix, and then solve it by gradient descent along the tangent space of the orthonormal matrices. We also propose an approximation for improving efficiency. Experimental results validate the effectiveness of the proposed method.
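The optimization strategy described here, gradient descent that keeps the projection matrix orthonormal, can be sketched generically. The toy objective below is not the paper's multi-instance formulation; it only shows the step-then-retract pattern, using plain NumPy with a QR factorization as the retraction:

```python
# A generic sketch (not the paper's objective): minimize loss(P) = -trace(P^T C P)
# over orthonormal 20x3 matrices P by gradient steps plus QR re-orthonormalization.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 20))
C = X.T @ X / len(X)                              # sample covariance (toy loss)

P = np.linalg.qr(rng.normal(size=(20, 3)))[0]     # random orthonormal start
lr = 0.1
for _ in range(100):
    grad = -2 * C @ P                             # gradient of the toy loss
    P, _ = np.linalg.qr(P - lr * grad)            # step, then retract to orthonormality

print(np.allclose(P.T @ P, np.eye(3)))            # True: columns stay orthonormal
```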
17

Wang, Yong Mao. "An Image Retrieval Model Based on Local Fisher Discriminant Analysis." Advanced Materials Research 255-260 (May 2011): 2057–61. http://dx.doi.org/10.4028/www.scientific.net/amr.255-260.2057.

Abstract:
This paper introduces an image retrieval model based on dimensionality reduction. The proposed model comprises two main techniques: the first extracts features from the image database, and the second performs dimensionality reduction. In the first technique, the color histogram and Color Texture Moment are used to extract color and texture features, respectively. In the second, Local Fisher Discriminant Analysis (LFDA), a supervised linear dimensionality reduction algorithm, performs the reduction. LFDA combines the ideas of Fisher Discriminant Analysis (FDA) and Locality Preserving Projection (LPP), and can preserve both the manifold structure of the data and its discriminant information. Experiments demonstrate that the proposed image retrieval scheme based on dimensionality reduction achieves satisfactory results.
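A hedged sketch of the pipeline shape only: toy color-histogram features followed by a supervised linear reduction. Plain LDA from scikit-learn stands in for LFDA here (an LFDA implementation is available in, e.g., the metric-learn package); the images, labels, and bin count are illustrative:

```python
# Extract simple per-channel color histograms, then reduce them with a
# supervised linear projection (LDA as a stand-in for LFDA).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
images = rng.integers(0, 256, size=(60, 32, 32, 3))   # toy RGB "database"
labels = rng.integers(0, 3, size=60)                  # 3 semantic classes

def color_histogram(img, bins=8):
    # concatenated per-channel histograms as a simple color feature
    return np.concatenate([np.histogram(img[..., c], bins=bins, range=(0, 256))[0]
                           for c in range(3)])

features = np.array([color_histogram(im) for im in images])
reduced = LinearDiscriminantAnalysis(n_components=2).fit_transform(features, labels)
print(reduced.shape)   # (60, 2)
```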
18

Gao, Chaobang, and Jiajin Wen. "A dimensionality reduction principle on the optimization of function." Journal of Mathematical Inequalities, no. 3 (2013): 357–75. http://dx.doi.org/10.7153/jmi-07-32.

19

K, Bhargavi. "Data Dimensionality Reduction Techniques: Review." International Journal of Engineering Technology and Management Sciences 4, no. 4 (July 28, 2020): 62–65. http://dx.doi.org/10.46647/ijetms.2020.v04i04.010.

Abstract:
Data science is the study of data. It involves developing methods of recording, storing, and analyzing data to effectively extract useful information. The goal of data science is to gain insights and knowledge from any type of data — both structured and unstructured. Data science is related to computer science, but is a separate field. Computer science involves creating programs and algorithms to record and process data, while data science covers any type of data analysis, which may or may not use computers. Data science is more closely related to the mathematics field of Statistics, which includes the collection, organization, analysis, and presentation of data. Because of the large amounts of data modern companies and organizations maintain, data science has become an integral part of IT. For example, a company that has petabytes of user data may use data science to develop effective ways to store, manage, and analyze the data. The company may use the scientific method to run tests and extract results that can provide meaningful insights about their users.
20

López-Rubio, Ezequiel, Juan Miguel Ortiz-de-Lazcano-Lobato, José Muñoz-Pérez, and José Antonio Gómez-Ruiz. "Principal Components Analysis Competitive Learning." Neural Computation 16, no. 11 (November 1, 2004): 2459–81. http://dx.doi.org/10.1162/0899766041941880.

Abstract:
We present a new neural model that extends classical competitive learning by performing a principal components analysis (PCA) at each neuron. This model improves on known local PCA methods because the entire data set does not need to be presented to the network at each computing step, allowing fast execution while retaining the dimensionality-reduction properties of PCA. Furthermore, every neuron is able to modify its behavior to adapt to the local dimensionality of the input distribution; hence, our model has a dimensionality estimation capability. The experimental results we present show the dimensionality-reduction capabilities of the model on multisensor images.
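The core idea, updating a PCA from one sample at a time rather than from whole-dataset passes, is easiest to see in Oja's classical rule for the first principal component. A minimal NumPy sketch (the data scaling and learning rate are illustrative; the paper's model goes further, with full local PCA at each competitive unit):

```python
# Online estimation of the first principal component with Oja's rule.
import numpy as np

rng = np.random.default_rng(0)
scales = np.array([5.0, 3.0] + [1.0] * 8)
X = rng.normal(size=(2000, 10)) * scales      # first axis dominates the variance

w = rng.normal(size=10)
w /= np.linalg.norm(w)
for _ in range(5):                            # a few passes, one sample at a time
    for x in X:
        y = w @ x
        w += 0.001 * y * (x - y * w)          # Oja's update keeps ||w|| near 1

print(abs(w[0]) / np.linalg.norm(w))          # close to 1: w aligns with the top axis
```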
21

Xu, Hui, Yongguo Yang, Xin Wang, Mingming Liu, Hongxia Xie, and Chujiao Wang. "Multiple Kernel Dimensionality Reduction via Ratio-Trace and Marginal Fisher Analysis." Mathematical Problems in Engineering 2019 (January 14, 2019): 1–8. http://dx.doi.org/10.1155/2019/6941475.

Abstract:
Traditional supervised multiple kernel learning (MKL) for dimensionality reduction is generally an extension of kernel discriminant analysis (KDA), which carries some restrictive assumptions; such methods are also generally based on the graph embedding framework. A more general multiple kernel-based dimensionality reduction algorithm, called multiple kernel marginal Fisher analysis (MKL-MFA), is presented for supervised nonlinear dimensionality reduction, combined with the ratio-trace optimization problem. MKL-MFA aims to relax the restrictive assumption that the data of each class follow a Gaussian distribution and to find an appropriate convex combination of several base kernels. To improve the efficiency of multiple kernel dimensionality reduction, the spectral regression framework is incorporated into the optimization model. Furthermore, the optimal weights of the predefined base kernels can be obtained by solving a separate convex optimization problem. Experimental results on benchmark datasets demonstrate that MKL-MFA outperforms state-of-the-art supervised multiple kernel dimensionality reduction methods.
22

Tang, Yunbo, Dan Chen, and Xiaoli Li. "Dimensionality Reduction Methods for Brain Imaging Data Analysis." ACM Computing Surveys 54, no. 4 (July 2021): 1–36. http://dx.doi.org/10.1145/3448302.

Abstract:
The past century has witnessed the grand success of brain imaging technologies, such as electroencephalography and magnetic resonance imaging, in probing cognitive states and pathological brain dynamics for neuroscience research and neurology practice. The human brain is “the most complex object in the universe,” and brain imaging data (BID) routinely have multiple or many attributes and are highly non-stationary. This follows from the nature of BID as recordings of the evolving processes of the brain(s) under examination from various views. Driven by increasingly high demands for precision, efficiency, and reliability in neuroscience and engineering tasks, dimensionality reduction has become a priority issue in BID analysis to handle the notoriously high dimensionality and large scale of big BID sets, as well as the enormously complicated interdependencies among data elements. This has become particularly urgent and challenging in the big data era. Dimensionality reduction theories and methods manifest unrivaled potential in revealing key insights into BID by offering low-dimensional representations and features that preserve critical characterizations of massive neuronal activities and of brain functional and/or malfunctional states of interest. This study surveys the most salient work along this direction according to a 3-dimensional taxonomy with respect to (1) the scale of BID, a design consideration important for potential applications; (2) the order of BID, in which a higher order denotes more BID attributes manipulable by the method; and (3) linearity, in which the method's degree of linearity largely determines the “fidelity” of BID exploration. The study defines criteria for qualitative evaluation of these works in terms of effectiveness, interpretability, efficiency, and scalability. The classifications and evaluations based on the taxonomy provide comprehensive guides to (1) how existing research and development efforts are distributed and (2) their performance, features, and potential in influential applications, especially those involving big data. Finally, the study crystallizes the open technical issues and proposes research challenges that must be solved to enable further research in this area of great potential.
23

Hsu, Chung-Chian, and Jhen-Wei Wu. "Visualized mixed-type data analysis via dimensionality reduction." Intelligent Data Analysis 22, no. 5 (September 26, 2018): 981–1007. http://dx.doi.org/10.3233/ida-173480.

24

Sharma, A., and K. K. Paliwal. "Rotational Linear Discriminant Analysis Technique for Dimensionality Reduction." IEEE Transactions on Knowledge and Data Engineering 20, no. 10 (October 2008): 1336–47. http://dx.doi.org/10.1109/tkde.2008.101.

25

Su, Bing, Xiaoqing Ding, Changsong Liu, and Ying Wu. "Heteroscedastic Max–Min Distance Analysis for Dimensionality Reduction." IEEE Transactions on Image Processing 27, no. 8 (August 2018): 4052–65. http://dx.doi.org/10.1109/tip.2018.2836312.

26

Cong, Iris, and Luming Duan. "Quantum discriminant analysis for dimensionality reduction and classification." New Journal of Physics 18, no. 7 (July 6, 2016): 073011. http://dx.doi.org/10.1088/1367-2630/18/7/073011.

27

Zhu, Xiaofeng, Zi Huang, Heng Tao Shen, Jian Cheng, and Changsheng Xu. "Dimensionality reduction by Mixed Kernel Canonical Correlation Analysis." Pattern Recognition 45, no. 8 (August 2012): 3003–16. http://dx.doi.org/10.1016/j.patcog.2012.02.007.

28

Haufe, Stefan, Sven Dähne, and Vadim V. Nikulin. "Dimensionality reduction for the analysis of brain oscillations." NeuroImage 101 (November 2014): 583–97. http://dx.doi.org/10.1016/j.neuroimage.2014.06.073.

29

Zhu, Tao, Ye Xu, Furao Shen, and Jinxi Zhao. "Orthogonal component analysis: A fast dimensionality reduction algorithm." Neurocomputing 177 (February 2016): 136–46. http://dx.doi.org/10.1016/j.neucom.2015.11.012.

30

Yuan, Ming-Dong, Da-Zheng Feng, Ya Shi, and Wen-Juan Liu. "Dimensionality reduction by collaborative preserving Fisher discriminant analysis." Neurocomputing 356 (September 2019): 228–43. http://dx.doi.org/10.1016/j.neucom.2019.05.014.

31

Ivosev, Gordana, Lyle Burton, and Ron Bonner. "Dimensionality Reduction and Visualization in Principal Component Analysis." Analytical Chemistry 80, no. 13 (July 2008): 4933–44. http://dx.doi.org/10.1021/ac800110w.

32

Reddy, G. Thippa, M. Praveen Kumar Reddy, Kuruva Lakshmanna, Rajesh Kaluri, Dharmendra Singh Rajput, Gautam Srivastava, and Thar Baker. "Analysis of Dimensionality Reduction Techniques on Big Data." IEEE Access 8 (2020): 54776–88. http://dx.doi.org/10.1109/access.2020.2980942.

33

Lee, Jea-Young, and Ho-Guen Lee. "Multifactor Dimensionality Reduction (MDR) Analysis by Dummy Variables." Korean Journal of Applied Statistics 22, no. 2 (April 30, 2009): 435–42. http://dx.doi.org/10.5351/kjas.2009.22.2.435.

34

Sun, H., C. Du, H. X. Zou, and K. F. Ji. "Nonparametric sparse discriminant analysis for dimensionality reduction – RETRACTED." Electronics Letters 49, no. 3 (January 2013): 187–89. http://dx.doi.org/10.1049/el.2012.2876.

35

Alaybeyoglu, Begum, and Elif Ozkirimli. "Analysis of Membrane Translocation Simulations using Dimensionality Reduction." Biophysical Journal 106, no. 2 (January 2014): 805a. http://dx.doi.org/10.1016/j.bpj.2013.11.4415.

36

Xu, Jie, Zhenghong Gu, and Kan Xie. "Fuzzy Local Mean Discriminant Analysis for Dimensionality Reduction." Neural Processing Letters 44, no. 3 (December 8, 2015): 701–18. http://dx.doi.org/10.1007/s11063-015-9489-3.

37

Zhong, Guoqiang, and Mohamed Cheriet. "Large Margin Low Rank Tensor Analysis." Neural Computation 26, no. 4 (April 2014): 761–80. http://dx.doi.org/10.1162/neco_a_00570.

Abstract:
We present a supervised model for tensor dimensionality reduction called large margin low rank tensor analysis (LMLRTA). In contrast to traditional vector-representation-based dimensionality reduction methods, LMLRTA can take tensors of any order as input. Unlike previous tensor dimensionality reduction methods, which can learn only low-dimensional embeddings with an a priori specified dimensionality, LMLRTA can automatically and jointly learn the dimensionality and the low-dimensional representations from data. Moreover, LMLRTA delivers low rank projection matrices, while encouraging data of the same class to be close and data of different classes to be separated by a large margin of distance in the low-dimensional tensor space. LMLRTA can be optimized using an iterative fixed-point continuation algorithm, which is guaranteed to converge to a locally optimal solution of the optimization problem. We evaluate LMLRTA on an object recognition application, where the data are represented as 2D tensors, and on a face recognition application, where the data are represented as 3D tensors. Experimental results show the superiority of LMLRTA over state-of-the-art approaches.
38

Wang, Ziqiang, Xia Sun, Lijun Sun, and Yuchun Huang. "Semisupervised Kernel Marginal Fisher Analysis for Face Recognition." Scientific World Journal 2013 (2013): 1–13. http://dx.doi.org/10.1155/2013/981840.

Abstract:
Dimensionality reduction is a key problem in face recognition due to the high dimensionality of face images. To cope with this problem effectively, a novel dimensionality reduction algorithm called semisupervised kernel marginal Fisher analysis (SKMFA) for face recognition is proposed in this paper. SKMFA can make use of both labeled and unlabeled samples to learn the projection matrix for nonlinear dimensionality reduction. Meanwhile, it successfully avoids the singularity problem by not calculating the matrix inverse. In addition, to make the nonlinear structure captured by the data-dependent kernel consistent with the intrinsic manifold structure, a manifold-adaptive nonparametric kernel is incorporated into the learning process of SKMFA. Experimental results on three face image databases demonstrate the effectiveness of our proposed algorithm.
39

Yan, Ronghua, Jinye Peng, and Dongmei Ma. "Dimensionality Reduction Based on PARAFAC Model." Journal of Imaging Science and Technology 63, no. 6 (November 1, 2019): 60501–1. http://dx.doi.org/10.2352/j.imagingsci.technol.2019.63.6.060501.

Abstract:
In hyperspectral image analysis, dimensionality reduction is a preprocessing step for hyperspectral image (HSI) classification. Principal component analysis (PCA) reduces the spectral dimension but does not utilize the spatial information of an HSI. Tensor decompositions such as parallel factor analysis (PARAFAC) have been successfully applied to joint noise reduction in the spatial and spectral dimensions of hyperspectral images; however, the PARAFAC method does not reduce the spectral dimension. To improve on this, two new methods are proposed in this article that combine PCA and PARAFAC to both reduce the spectral dimension and suppress noise in the spatial and spectral dimensions. The experimental results indicate that the new methods improve classification compared with the PARAFAC method.
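A rough sketch of the two ingredients, assuming NumPy, scikit-learn, and a recent tensorly (the cube size, ranks, and the way they are combined below are illustrative, not the authors' exact methods): PARAFAC as a spatial-spectral denoiser, followed by PCA over the band axis:

```python
# Fit a low-rank CP (PARAFAC) model to a toy hyperspectral cube, reconstruct a
# denoised cube, then reduce the spectral (band) dimension of the pixels with PCA.
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
cube = rng.random((30, 30, 50))             # toy HSI: rows x cols x bands
cp = parafac(tl.tensor(cube), rank=5)       # low-rank CP model of the cube
denoised = tl.cp_to_tensor(cp)              # reconstruction with reduced noise
print(denoised.shape)                       # (30, 30, 50): dimensions unchanged

pixels = denoised.reshape(-1, 50)           # pixels x bands
reduced = PCA(n_components=10).fit_transform(pixels)
print(reduced.shape)                        # (900, 10): spectral dimension reduced
```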
40

James, Lachlan P., Haresh Suppiah, Michael R. McGuigan, and David L. Carey. "Dimensionality Reduction for Countermovement Jump Metrics." International Journal of Sports Physiology and Performance 16, no. 7 (July 1, 2021): 1052–55. http://dx.doi.org/10.1123/ijspp.2020-0606.

Abstract:
Purpose: Dozens of variables can be derived from the countermovement jump (CMJ). However, this does not guarantee an increase in useful information, because many of the variables are highly correlated. Furthermore, practitioners should seek the simplest solution to performance testing and reporting challenges. The purpose of this investigation was to show how to apply dimensionality reduction to CMJ data, with a view to offering practitioners solutions to aid applications in high-performance settings. Methods: The data were collected from 3 cohorts using 3 different devices. Dimensionality reduction was undertaken on the extracted variables by way of principal component analysis and maximum likelihood factor analysis. Results: Over 90% of the variance in each CMJ data set could be explained by 3 or 4 principal components. Similarly, 2 to 3 factors could successfully explain the CMJ. Conclusions: The application of dimensionality reduction through principal component analysis and factor analysis allowed for the identification of key variables that strongly contributed to distinct aspects of jump performance. Practitioners and scientists can consider the information derived from these procedures in several ways to streamline the transfer of CMJ test information.
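Both reduction procedures the study applies are standard tools. A minimal sketch, assuming scikit-learn (the jump-metric table is synthetic and the component counts are illustrative):

```python
# Standardize a toy table of correlated jump metrics, then apply PCA and a factor model.
import numpy as np
from sklearn.decomposition import PCA, FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
metrics = rng.normal(size=(80, 20))                        # 80 athletes x 20 metrics
metrics[:, 10:] = metrics[:, :10] + 0.1 * rng.normal(size=(80, 10))  # correlated halves

Z = StandardScaler().fit_transform(metrics)
pcs = PCA(n_components=4).fit(Z)
print(pcs.explained_variance_ratio_.cumsum()[-1])          # variance kept by 4 components

factors = FactorAnalysis(n_components=3, random_state=0).fit_transform(Z)
print(factors.shape)                                       # (80, 3)
```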
41

Remesh, Reshma, and Pattabiraman V. "A SURVEY ON THE CURES FOR THE CURSE OF DIMENSIONALITY IN BIG DATA." Asian Journal of Pharmaceutical and Clinical Research 10, no. 13 (April 1, 2017): 355. http://dx.doi.org/10.22159/ajpcr.2017.v10s1.19755.

Abstract:
Dimensionality reduction techniques are used to reduce the complexity of analyzing high-dimensional data sets. The raw input data set may have many dimensions, and analysis can be time-consuming and lead to wrong predictions if unnecessary attributes are considered. Using dimensionality reduction techniques, one can reduce the dimensions of the input data for accurate prediction at lower cost. In this paper, the different machine learning approaches used for dimensionality reduction, such as PCA, SVD, LDA, kernel principal component analysis, and artificial neural networks, are studied.
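Among the surveyed approaches, kernel PCA is the one whose effect is easiest to demonstrate in a few lines. A minimal sketch, assuming scikit-learn (the concentric-circles data and the RBF bandwidth are illustrative):

```python
# Kernel PCA with an RBF kernel unfolds data that linear PCA cannot separate.
from sklearn.datasets import make_circles
from sklearn.decomposition import KernelPCA

X, y = make_circles(n_samples=400, factor=0.3, noise=0.05, random_state=0)
X_kpca = KernelPCA(n_components=2, kernel="rbf", gamma=10).fit_transform(X)
print(X_kpca.shape)   # (400, 2); the two rings become linearly separable here
```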
42

Oseledets, I. V., D. V. Savostianov, and E. E. Tyrtyshnikov. "Tucker Dimensionality Reduction of Three-Dimensional Arrays in Linear Time." SIAM Journal on Matrix Analysis and Applications 30, no. 3 (January 2008): 939–56. http://dx.doi.org/10.1137/060655894.

43

Bluhm, Andreas, and Daniel Stilck França. "Dimensionality reduction of SDPs through sketching." Linear Algebra and its Applications 563 (February 2019): 461–75. http://dx.doi.org/10.1016/j.laa.2018.11.012.

44

Olaya, Jbari, and Chakkor Otman. "Non-negative Matrix Factorization for Dimensionality Reduction." ITM Web of Conferences 48 (2022): 03006. http://dx.doi.org/10.1051/itmconf/20224803006.

Abstract:
Matrix factorization methods reduce the dimensionality of data without losing important information. In this work, we present the Non-negative Matrix Factorization (NMF) method, focusing on its advantages relative to other matrix factorization methods. We discuss the main optimization algorithms used to solve the NMF problem and their convergence. The paper also contains a comparative study between principal component analysis (PCA), independent component analysis (ICA), and NMF for dimensionality reduction using a face image database.
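A minimal sketch of the comparison's core contrast, assuming scikit-learn (random nonnegative data stands in for the face database; component counts are illustrative): NMF yields nonnegative, parts-like factors, whereas PCA scores can be negative:

```python
# Factorize nonnegative data with NMF and with PCA, and compare the sign of the results.
import numpy as np
from sklearn.decomposition import NMF, PCA

rng = np.random.default_rng(0)
X = rng.random((100, 64))                          # nonnegative "pixel" data

W = NMF(n_components=8, init="nndsvd", random_state=0).fit_transform(X)
print(W.min() >= 0)                                # True: nonnegative factors

X_pca = PCA(n_components=8).fit_transform(X)
print(X_pca.min() >= 0)                            # usually False: PCA scores go negative
```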
45

Migenda, Nico, Ralf Möller, and Wolfram Schenck. "Adaptive dimensionality reduction for neural network-based online principal component analysis." PLOS ONE 16, no. 3 (March 30, 2021): e0248896. http://dx.doi.org/10.1371/journal.pone.0248896.

Abstract:
“Principal Component Analysis” (PCA) is an established linear technique for dimensionality reduction. It performs an orthonormal transformation to replace possibly correlated variables with a smaller set of linearly independent variables, the so-called principal components, which capture a large portion of the data variance. The problem of finding the optimal number of principal components has been widely studied for offline PCA. However, when working with streaming data, the optimal number changes continuously, which requires updating both the principal components and the dimensionality at every timestep. While the continuous update of the principal components is widely studied, the available algorithms for dimensionality adjustment in neural network-based and incremental PCA are limited to an increment of one. Therefore, existing approaches cannot account for abrupt changes in the presented data. The contribution of this work is to enable continuous dimensionality adjustment in neural network-based PCA by an arbitrary number, without the need to learn all principal components. A novel algorithm is presented that utilizes several PCA characteristics to adaptively update the optimal number of principal components for neural network-based PCA. A precise estimation of the required dimensionality reduces the computational effort while ensuring that the desired amount of variance is kept. The computational complexity of the proposed algorithm is investigated, and it is benchmarked in an experimental study against other neural network-based and incremental PCA approaches, where it produces highly competitive results.
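An offline analogue of the quantity the algorithm tracks, the number of components needed for a target variance ratio, can be computed directly with scikit-learn. A minimal sketch (the data and the 95% threshold are illustrative; the paper's point is doing this continuously on a stream):

```python
# Let PCA pick the number of components from a target explained-variance ratio.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 30)) @ rng.normal(size=(30, 30))  # correlated data

pca = PCA(n_components=0.95)   # keep as many components as 95% variance requires
pca.fit(X)
print(pca.n_components_)       # dimensionality chosen from the data itself
```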
46

Gewers, Felipe L., Gustavo R. Ferreira, Henrique F. De Arruda, Filipi N. Silva, Cesar H. Comin, Diego R. Amancio, and Luciano Da F. Costa. "Principal Component Analysis." ACM Computing Surveys 54, no. 4 (May 2021): 1–34. http://dx.doi.org/10.1145/3447755.

Abstract:
Principal component analysis (PCA) is often applied for analyzing data in the most diverse areas. This work reports, in an accessible and integrated manner, several theoretical and practical aspects of PCA. The basic principles underlying PCA, data standardization, possible visualizations of the PCA results, and outlier detection are subsequently addressed. Next, the potential of using PCA for dimensionality reduction is illustrated on several real-world datasets. Finally, we summarize PCA-related approaches and other dimensionality reduction techniques. All in all, the objective of this work is to assist researchers from the most diverse areas in using and interpreting PCA.
47

Bao Bo-Cheng, Wang Chun-Li, Wu Hua-Gan, and Qiao Xiao-Hua. "Dimensionality reduction modeling and characteristic analysis of memristive circuit." Acta Physica Sinica 63, no. 2 (2014): 020504. http://dx.doi.org/10.7498/aps.63.020504.

48

Sacha, Dominik, Leishi Zhang, Michael Sedlmair, John A. Lee, Jaakko Peltonen, Daniel Weiskopf, Stephen C. North, and Daniel A. Keim. "Visual Interaction with Dimensionality Reduction: A Structured Literature Analysis." IEEE Transactions on Visualization and Computer Graphics 23, no. 1 (January 2017): 241–50. http://dx.doi.org/10.1109/tvcg.2016.2598495.

49

Fujiwara, Takanori, Oh-Hyun Kwon, and Kwan-Liu Ma. "Supporting Analysis of Dimensionality Reduction Results with Contrastive Learning." IEEE Transactions on Visualization and Computer Graphics 26, no. 1 (January 2020): 45–55. http://dx.doi.org/10.1109/tvcg.2019.2934251.

50

Paliwal, Kuldip K., and Alok Sharma. "Improved Pseudoinverse Linear Discriminant Analysis Method for Dimensionality Reduction." International Journal of Pattern Recognition and Artificial Intelligence 26, no. 1 (February 2012): 1250002. http://dx.doi.org/10.1142/s0218001412500024.

Abstract:
Pseudoinverse linear discriminant analysis (PLDA) is a classical method for solving the small sample size problem. However, its performance is limited. In this paper, we propose an improved PLDA method that is faster and produces better classification accuracy when tested on several datasets.
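Classical pseudoinverse LDA, the baseline this paper improves on, is short enough to sketch in plain NumPy: when the within-class scatter matrix is singular (more dimensions than samples), its pseudoinverse replaces the inverse. All sizes below are illustrative:

```python
# Pseudoinverse LDA: eigenvectors of pinv(Sw) @ Sb give the discriminant directions.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 100))                  # small sample size: 30 samples, 100 dims
y = np.repeat([0, 1, 2], 10)

mean = X.mean(axis=0)
Sw = np.zeros((100, 100))
Sb = np.zeros((100, 100))
for c in np.unique(y):
    Xc = X[y == c]
    mc = Xc.mean(axis=0)
    Sw += (Xc - mc).T @ (Xc - mc)               # within-class scatter (singular here)
    Sb += len(Xc) * np.outer(mc - mean, mc - mean)

eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
order = np.argsort(-eigvals.real)
W = eigvecs[:, order[:2]].real                  # top 2 discriminant directions
print((X @ W).shape)                            # (30, 2)
```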