Journal articles on the topic 'Dimensionality reduction'

Below are the top 50 journal articles for research on the topic 'Dimensionality reduction', with abstracts where available in the metadata.

1

Cheng, Long, Chenyu You, and Yani Guan. "Random Projections for Non-linear Dimensionality Reduction." International Journal of Machine Learning and Computing 6, no. 4 (August 2016): 220–25. http://dx.doi.org/10.18178/ijmlc.2016.6.4.601.

2

Marchette, David J., and Wendy L. Poston. "Local dimensionality reduction." Computational Statistics 14, no. 4 (September 12, 1999): 469–89. http://dx.doi.org/10.1007/s001800050026.

3

Sun, Yu-Yin, Michael Ng, and Zhi-Hua Zhou. "Multi-Instance Dimensionality Reduction." Proceedings of the AAAI Conference on Artificial Intelligence 24, no. 1 (July 3, 2010): 587–92. http://dx.doi.org/10.1609/aaai.v24i1.7700.

Abstract:
Multi-instance learning deals with problems that treat bags of instances as training examples. In single-instance learning problems, dimensionality reduction is an essential step for high-dimensional data analysis and has been studied for years. The curse of dimensionality also exists in multi-instance learning tasks, yet this difficult problem has not been studied before. Direct application of existing single-instance dimensionality reduction objectives to multi-instance learning tasks may not work well, since it ignores the characteristic of multi-instance learning that the labels of bags are known while the labels of instances are unknown. In this paper, we propose an effective model and develop an efficient algorithm to solve the multi-instance dimensionality reduction problem. We formulate the objective as an optimization problem by considering orthonormality and sparsity constraints in the projection matrix for dimensionality reduction, and then solve it by gradient descent along the tangent space of the orthonormal matrices. We also propose an approximation for improving the efficiency. Experimental results validate the effectiveness of the proposed method.
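The orthonormality-constrained optimization described above can be illustrated with a generic sketch: take a gradient step on a projection matrix, then retract back onto the set of orthonormal matrices. Note the assumptions: the objective below is a PCA-like variance criterion standing in for the authors' multi-instance objective, the data are synthetic, and the QR retraction is one common choice, not necessarily the paper's tangent-space update.

```python
import numpy as np

# Toy data standing in for (bag-level) features; the paper's actual
# multi-instance objective is replaced here by a PCA-like variance criterion.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))                  # 100 instances, 10 features
Xc = X - X.mean(axis=0)
C = Xc.T @ Xc                                   # scatter matrix
W, _ = np.linalg.qr(rng.normal(size=(10, 2)))   # random orthonormal 10x2 start

def captured_variance(W):
    return np.trace(W.T @ C @ W)

before = captured_variance(W)
for _ in range(200):
    G = 2 * C @ W                        # gradient of trace(W' C W)
    W, _ = np.linalg.qr(W + 1e-4 * G)    # QR retraction: keeps W' W = I
after = captured_variance(W)
```

The retraction is what keeps the iterate feasible: after every gradient step, the columns of W are re-orthonormalized, so the constraint never has to be enforced by a penalty term.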
4

Koren, Y., and L. Carmel. "Robust linear dimensionality reduction." IEEE Transactions on Visualization and Computer Graphics 10, no. 4 (July 2004): 459–70. http://dx.doi.org/10.1109/tvcg.2004.17.

5

Lotlikar, R., and R. Kothari. "Fractional-step dimensionality reduction." IEEE Transactions on Pattern Analysis and Machine Intelligence 22, no. 6 (June 2000): 623–27. http://dx.doi.org/10.1109/34.862200.

6

Gottlieb, Lee-Ad, Aryeh Kontorovich, and Robert Krauthgamer. "Adaptive metric dimensionality reduction." Theoretical Computer Science 620 (March 2016): 105–18. http://dx.doi.org/10.1016/j.tcs.2015.10.040.

7

Pang, Rich, Benjamin J. Lansdell, and Adrienne L. Fairhall. "Dimensionality reduction in neuroscience." Current Biology 26, no. 14 (July 2016): R656–R660. http://dx.doi.org/10.1016/j.cub.2016.05.029.

8

Lovaglio, Pietro Giorgio, and Giorgio Vittadini. "Multilevel dimensionality-reduction methods." Statistical Methods & Applications 22, no. 2 (September 27, 2012): 183–207. http://dx.doi.org/10.1007/s10260-012-0215-2.

9

Carter, Kevin, Raviv Raich, William Finn, and Alfred Hero III. "Information-Geometric Dimensionality Reduction." IEEE Signal Processing Magazine 28, no. 2 (March 2011): 89–99. http://dx.doi.org/10.1109/msp.2010.939536.

10

Gonen, Mehmet. "Bayesian Supervised Dimensionality Reduction." IEEE Transactions on Cybernetics 43, no. 6 (December 2013): 2179–89. http://dx.doi.org/10.1109/tcyb.2013.2245321.

11

Zhang, Zhao, Tommy W. S. Chow, and Ning Ye. "Semisupervised Multimodal Dimensionality Reduction." Computational Intelligence 29, no. 1 (May 15, 2012): 70–110. http://dx.doi.org/10.1111/j.1467-8640.2012.00429.x.

12

Liu, Huan, and Rudy Setiono. "Dimensionality reduction via discretization." Knowledge-Based Systems 9, no. 1 (February 1996): 67–72. http://dx.doi.org/10.1016/0950-7051(95)01030-0.

13

Yang, Li. "Distance-preserving dimensionality reduction." Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery 1, no. 5 (June 14, 2011): 369–80. http://dx.doi.org/10.1002/widm.39.

14

Li, Hongda, Jian Cui, Xinle Zhang, Yongqi Han, and Liying Cao. "Dimensionality Reduction and Classification of Hyperspectral Remote Sensing Image Feature Extraction." Remote Sensing 14, no. 18 (September 13, 2022): 4579. http://dx.doi.org/10.3390/rs14184579.

Abstract:
Terrain classification is an important research direction in the field of remote sensing. Hyperspectral remote sensing image data contain a large amount of rich ground-object information. However, such data have high feature dimensionality, strong data correlation, high data redundancy, and long processing times, which make image classification difficult. A dimensionality reduction algorithm can transform the data into low-dimensional data with strong features, after which the reduced data can be classified. However, most classification methods cannot effectively extract features from dimensionality-reduced data. In this paper, different dimensionality reduction and supervised machine learning classification algorithms are explored to determine a suitable combination of dimensionality reduction and classification for hyperspectral images. Soft and hard classification methods are adopted to classify pixels according to diversity. The results show that the data after dimensionality reduction retain the features with high overall correlation while the data dimension is drastically reduced. The dimensionality reduction method of uniform manifold approximation and projection (UMAP) combined with a support vector machine classifier achieves the best terrain classification, with 99.57% accuracy. High-precision fitting of neural networks for soft classification of hyperspectral images, with a model fitting correlation coefficient (R2) of up to 0.979, addresses the problem of mixed-pixel decomposition.
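The reduce-then-classify pipeline evaluated above can be sketched on synthetic data. To stay dependency-free, this sketch substitutes a plain PCA projection for UMAP and a nearest-centroid rule for the SVM; the class count, band count, and noise model are invented for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
# synthetic stand-in for hyperspectral pixels: 3 terrain classes, 50 bands
n, bands, classes = 300, 50, 3
labels = rng.integers(0, classes, size=n)
class_means = rng.normal(scale=5.0, size=(classes, bands))
X = class_means[labels] + rng.normal(size=(n, bands))

# step 1: dimensionality reduction (PCA here; the paper's best result used UMAP)
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:5].T                               # keep 5 components

# step 2: classify reduced pixels (nearest centroid as a stand-in for the SVM)
train, test = np.arange(0, n, 2), np.arange(1, n, 2)
centroids = np.array([Z[train][labels[train] == c].mean(axis=0)
                      for c in range(classes)])
pred = np.argmin(np.linalg.norm(Z[test][:, None] - centroids[None], axis=-1),
                 axis=1)
accuracy = (pred == labels[test]).mean()
```

The structure (fit the reducer, project both splits, score a downstream classifier on the reduced features) is the same regardless of which reducer and classifier are plugged in.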
15

Ahmad, Noor, and Ali Bou Nassif. "Dimensionality Reduction: Challenges and Solutions." ITM Web of Conferences 43 (2022): 01017. http://dx.doi.org/10.1051/itmconf/20224301017.

Abstract:
The use of dimensionality reduction techniques is a keystone for analyzing and interpreting high-dimensional data. These techniques capture several data features of interest, such as dynamical structure, input-output relationships, correlation between data sets, covariance, etc. Dimensionality reduction entails mapping a set of high-dimensional data features onto low-dimensional data. Motivated by the poor performance of learning models on high-dimensional data, this study examines five distinct dimensionality reduction methods. In addition, a thorough comparison between the reduced-dimensionality data and the original data is conducted using statistical and machine learning models.
16

Bhargavi, K. "Data Dimensionality Reduction Techniques: Review." International Journal of Engineering Technology and Management Sciences 4, no. 4 (July 28, 2020): 62–65. http://dx.doi.org/10.46647/ijetms.2020.v04i04.010.

Abstract:
Data science is the study of data. It involves developing methods of recording, storing, and analyzing data to effectively extract useful information. The goal of data science is to gain insights and knowledge from any type of data — both structured and unstructured. Data science is related to computer science, but is a separate field. Computer science involves creating programs and algorithms to record and process data, while data science covers any type of data analysis, which may or may not use computers. Data science is more closely related to the mathematics field of Statistics, which includes the collection, organization, analysis, and presentation of data. Because of the large amounts of data modern companies and organizations maintain, data science has become an integral part of IT. For example, a company that has petabytes of user data may use data science to develop effective ways to store, manage, and analyze the data. The company may use the scientific method to run tests and extract results that can provide meaningful insights about their users.
17

Kay, Steven. "Dimensionality Reduction for Signal Detection." IEEE Signal Processing Letters 29 (2022): 145–48. http://dx.doi.org/10.1109/lsp.2021.3129453.

18

Nelson, Jelani. "Dimensionality Reduction in Euclidean Space." Notices of the American Mathematical Society 67, no. 10 (November 1, 2020): 1. http://dx.doi.org/10.1090/noti2166.

19

Zhang, Tianhao, Dacheng Tao, Xuelong Li, and Jie Yang. "Patch Alignment for Dimensionality Reduction." IEEE Transactions on Knowledge and Data Engineering 21, no. 9 (September 2009): 1299–313. http://dx.doi.org/10.1109/tkde.2008.212.

20

Wang, Shujian, Deyan Xie, Fang Chen, and Quanxue Gao. "Dimensionality reduction by LPP‐L21." IET Computer Vision 12, no. 5 (March 23, 2018): 659–65. http://dx.doi.org/10.1049/iet-cvi.2017.0302.

21

Raymer, M. L., W. F. Punch, E. D. Goodman, L. A. Kuhn, and A. K. Jain. "Dimensionality reduction using genetic algorithms." IEEE Transactions on Evolutionary Computation 4, no. 2 (July 2000): 164–71. http://dx.doi.org/10.1109/4235.850656.

22

Saund, E. "Dimensionality-reduction using connectionist networks." IEEE Transactions on Pattern Analysis and Machine Intelligence 11, no. 3 (March 1989): 304–14. http://dx.doi.org/10.1109/34.21799.

23

Vats, Deepak, and Avinash Sharma. "Dimensionality Reduction Techniques: Comparative Analysis." Journal of Computational and Theoretical Nanoscience 17, no. 6 (June 1, 2020): 2684–88. http://dx.doi.org/10.1166/jctn.2020.8967.

Abstract:
Real-world data have seen exponential growth in dimensionality. Examples of high-dimensional data include speech signals, sensor data, medical data, criminal data, and data used in recommendation systems for domains such as news, movies (Netflix), and e-commerce. To improve learning accuracy in machine learning and enhance mining performance, redundant features and features not relevant to the mining or learning task must be removed from such high-dimensional datasets. Many supervised and unsupervised methodologies exist in the literature for performing dimension reduction. The objective of this paper is to present the most prominent methodologies in the field of dimension reduction and to highlight the advantages and disadvantages of these algorithms, which can serve as a starting point for beginners in this field.
24

Harrow, Aram W., Ashley Montanaro, and Anthony J. Short. "Limitations on quantum dimensionality reduction." International Journal of Quantum Information 13, no. 04 (June 2015): 1440001. http://dx.doi.org/10.1142/s0219749914400012.

Abstract:
The Johnson–Lindenstrauss Lemma is a classic result which implies that any set of n real vectors can be compressed to O(log n) dimensions while only distorting pairwise Euclidean distances by a constant factor. Here we consider potential extensions of this result to the compression of quantum states. We show that, by contrast with the classical case, there does not exist any distribution over quantum channels that significantly reduces the dimension of quantum states while preserving the 2-norm distance with high probability. We discuss two tasks for which the 2-norm distance is indeed the correct figure of merit. In the case of the trace norm, we show that the dimension of low-rank mixed states can be reduced by up to a square root, but that essentially no dimensionality reduction is possible for highly mixed states.
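The classical statement quoted above is easy to demonstrate numerically: a scaled Gaussian random matrix compresses vectors while keeping pairwise Euclidean distances within a small factor. The point count, ambient dimension, and target dimension k below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d, k = 50, 1000, 300                 # 50 points, compress 1000 -> 300 dims
X = rng.normal(size=(n, d))

# Gaussian random projection, scaled so squared lengths are preserved on average
R = rng.normal(size=(d, k)) / np.sqrt(k)
Y = X @ R

def pairwise(A):
    # upper-triangle vector of all pairwise Euclidean distances
    iu = np.triu_indices(len(A), 1)
    return np.linalg.norm(A[:, None] - A[None, :], axis=-1)[iu]

ratios = pairwise(Y) / pairwise(X)      # distortion of each pairwise distance
```

Every ratio concentrates near 1 (with spread shrinking like 1/sqrt(k)), which is exactly the distance-preservation guarantee the lemma formalizes.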
25

Tasoulis, Sotiris, Nicos G. Pavlidis, and Teemu Roos. "Nonlinear dimensionality reduction for clustering." Pattern Recognition 107 (November 2020): 107508. http://dx.doi.org/10.1016/j.patcog.2020.107508.

26

Gao, Junbin, Qinfeng Shi, and Tibério S. Caetano. "Dimensionality reduction via compressive sensing." Pattern Recognition Letters 33, no. 9 (July 2012): 1163–70. http://dx.doi.org/10.1016/j.patrec.2012.02.007.

27

Wang, Yasi, Hongxun Yao, and Sicheng Zhao. "Auto-encoder based dimensionality reduction." Neurocomputing 184 (April 2016): 232–42. http://dx.doi.org/10.1016/j.neucom.2015.08.104.

28

Hou, Chenping, Changshui Zhang, Yi Wu, and Yuanyuan Jiao. "Stable local dimensionality reduction approaches." Pattern Recognition 42, no. 9 (September 2009): 2054–66. http://dx.doi.org/10.1016/j.patcog.2008.12.009.

29

Qiao, Lishan, Limei Zhang, and Songcan Chen. "Dimensionality reduction with adaptive graph." Frontiers of Computer Science 7, no. 5 (August 10, 2013): 745–53. http://dx.doi.org/10.1007/s11704-013-2234-z.

30

Lai, Zhihui, Yong Xu, Jian Yang, Linlin Shen, and David Zhang. "Rotational Invariant Dimensionality Reduction Algorithms." IEEE Transactions on Cybernetics 47, no. 11 (November 2017): 3733–46. http://dx.doi.org/10.1109/tcyb.2016.2578642.

31

Nagabhushan, P., K. Chidananda Gowda, and Edwin Diday. "Dimensionality reduction of symbolic data." Pattern Recognition Letters 16, no. 2 (February 1995): 219–23. http://dx.doi.org/10.1016/0167-8655(94)00085-h.

32

Havelka, Jan, Anna Kučerová, and Jan Sýkora. "Dimensionality reduction in thermal tomography." Computers & Mathematics with Applications 78, no. 9 (November 2019): 3077–89. http://dx.doi.org/10.1016/j.camwa.2019.04.019.

33

Santos, João Filipe, Maria Manuela Portela, and Inmaculada Pulido-Calvo. "Dimensionality reduction in drought modelling." Hydrological Processes 27, no. 10 (April 17, 2012): 1399–410. http://dx.doi.org/10.1002/hyp.9300.

34

Shen, Zilin. "Comparison and Evaluation of Classical Dimensionality Reduction Methods." Highlights in Science, Engineering and Technology 70 (November 15, 2023): 411–18. http://dx.doi.org/10.54097/hset.v70i.13890.

Abstract:
As one of the tasks of unsupervised learning, data dimensionality reduction faces a lack of evaluation methods. Based on this, three classical dimensionality reduction methods, PCA, t-SNE, and UMAP, were selected as the research objects of this paper. Five three-class datasets were selected, and the three methods mentioned above were used to perform dimensionality reduction. 3D scatter graphs were plotted after dimensionality reduction to analyze how well the reduced data differentiate the categories of the target variable. The reduced data were then classified using a random forest model, and the classification accuracy was obtained. According to the 3D scatter plots and the random forest accuracy, PCA has a good dimensionality reduction effect on most of the selected datasets, and t-SNE has a relatively stable dimensionality reduction effect. In contrast, UMAP performs well on some individual datasets but lacks stability. Overall, this paper proposes a dimensionality reduction evaluation method that combines scatter-plot visualization with classification models, which can effectively predict the performance of dimensionality reduction methods on a variety of datasets, thereby aiding the comparison and selection of dimensionality reduction methods in unsupervised learning.
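The evaluation recipe described above (reduce, then score each reducer with a downstream classifier) can be sketched generically. This dependency-free sketch uses PCA and a random projection as the two candidate reducers and a nearest-centroid rule in place of the paper's random forest; the synthetic three-class data are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
# synthetic three-class data with signal confined to the first two features
n, d, classes = 300, 20, 3
y = rng.integers(0, classes, size=n)
X = rng.normal(size=(n, d))
X[:, :2] += 4.0 * np.eye(classes)[y][:, :2]     # class means (4,0), (0,4), (0,0)

def pca(X, k):
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def centroid_accuracy(Z, y):
    # even/odd split, nearest-centroid classification on the reduced data
    tr, te = np.arange(0, len(Z), 2), np.arange(1, len(Z), 2)
    cents = np.array([Z[tr][y[tr] == c].mean(axis=0) for c in range(classes)])
    pred = np.argmin(np.linalg.norm(Z[te][:, None] - cents[None], axis=-1), axis=1)
    return (pred == y[te]).mean()

reducers = {
    "pca": lambda X: pca(X, 2),
    "random projection": lambda X: X @ rng.normal(size=(d, 2)) / np.sqrt(2),
}
scores = {name: centroid_accuracy(f(X), y) for name, f in reducers.items()}
```

The downstream accuracy becomes a single comparable score per reducer, which is the core of the evaluation strategy the paper proposes.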
35

Xie, Fuding, Yutao Fan, and Ming Zhou. "Dimensionality Reduction by Weighted Connections between Neighborhoods." Abstract and Applied Analysis 2014 (2014): 1–5. http://dx.doi.org/10.1155/2014/928136.

Abstract:
Dimensionality reduction is the transformation of high-dimensional data into a meaningful representation of reduced dimensionality. This paper introduces a dimensionality reduction technique based on weighted connections between neighborhoods to improve the K-Isomap method, attempting to preserve perfectly the relationships between neighborhoods in the process of dimensionality reduction. The validity of the proposal is tested on three typical examples which are widely employed in manifold-based algorithms. The experimental results show that the local topology of the dataset is preserved well while the proposed method transforms a dataset in high-dimensional space into a new dataset of low dimensionality.
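The Isomap pipeline that this paper refines can be sketched end to end: build a k-nearest-neighbour graph, compute geodesic (shortest-path) distances over it, then embed with classical MDS. This is the generic pipeline only, without the paper's neighborhood-weighting refinement; the circle dataset and k below are illustrative choices.

```python
import numpy as np

# points on a circle: a 1-D manifold embedded in the plane
n = 40
t = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
X = np.c_[np.cos(t), np.sin(t)]

# k-nearest-neighbour graph with Euclidean edge weights
E = np.linalg.norm(X[:, None] - X[None], axis=-1)
k = 4
G = np.full((n, n), np.inf)
nbrs = np.argsort(E, axis=1)[:, 1:k + 1]
rows = np.repeat(np.arange(n), k)
G[rows, nbrs.ravel()] = E[rows, nbrs.ravel()]
G = np.minimum(G, G.T)                      # symmetrise the graph
np.fill_diagonal(G, 0.0)

# geodesic (shortest-path) distances via Floyd-Warshall
for m in range(n):
    G = np.minimum(G, G[:, m, None] + G[None, m, :])

# classical MDS on the squared geodesic distances
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (G ** 2) @ J                 # double-centred Gram matrix
w, V = np.linalg.eigh(B)
Y = V[:, -2:] * np.sqrt(np.maximum(w[-2:], 0.0))   # 2-D embedding
```

Graph geodesics between points on opposite sides of the circle follow the manifold (length near pi) rather than the straight chord (length 2), which is exactly the local-topology preservation the abstract refers to.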
36

Yan, Chun-Man, and Yu-Yao Zhang. "Face Recognition Based on SRC Combined with Sparse Embedding Dimensionality Reduction." 電腦學刊 33, no. 2 (April 2022): 083–93. http://dx.doi.org/10.53106/199115992022043302007.

Abstract:
Sparse representation-based classification (SRC) has achieved good recognition results and shown strong robustness for face recognition, especially when the face image is affected by illumination variations, expression changes, and occlusion. The SRC method simply uses the training set as a dictionary to encode test samples. However, high-dimensional training face data usually contain a large amount of redundant information, which increases the complexity of the method. Therefore, most existing methods perform image dimensionality reduction separately before SRC is launched, but this may not make full use of the discriminative information in the training samples. In this paper, a sparse embedding dimensionality reduction strategy is combined with the efficient SRC method to achieve face recognition. In the proposed method, a projection matrix is used to project high-dimensional data into a low-dimensional space. At the same time, a discriminative coefficient constraint term is introduced into the objective function to reduce the classification residual of a sample through the distance relationships between all coefficients. The label information of the samples is then used to iteratively update the projection matrix and the coefficient representation. Finally, the test samples are projected into the low-dimensional space for classification. Extensive experimental results on three widely used face datasets show that the proposed method improves the discrimination of face images in low-dimensional space and achieves better face recognition results.
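The SRC decision rule at the heart of the method above (classify a probe by which class's training samples reconstruct it with the smallest residual) can be sketched on toy data. Caveats: real SRC obtains the coding coefficients by l1-minimisation over the whole dictionary, whereas this sketch uses plain class-wise least squares, and the subspace data model below is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
# toy "face" data: each class lies in its own low-dimensional subspace
d, per_class, classes = 60, 8, 3
bases = [rng.normal(size=(d, 3)) for _ in range(classes)]
train = np.hstack([B @ rng.normal(size=(3, per_class)) for B in bases])
labels = np.repeat(np.arange(classes), per_class)

def classify_by_residual(y):
    # classify by the smallest class-wise reconstruction residual;
    # true SRC instead codes y by l1-minimisation over the whole dictionary
    residuals = []
    for c in range(classes):
        A = train[:, labels == c]
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        residuals.append(np.linalg.norm(y - A @ coef))
    return int(np.argmin(residuals))

probe = bases[1] @ rng.normal(size=3) + 0.01 * rng.normal(size=d)
```

A probe drawn from class 1's subspace is reconstructed almost perfectly by class 1's training samples and poorly by the others, so the minimum-residual rule recovers its label.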
38

Zhao, Xiaowei, Feiping Nie, Sen Wang, Jun Guo, Pengfei Xu, and Xiaojiang Chen. "Unsupervised 2D Dimensionality Reduction with Adaptive Structure Learning." Neural Computation 29, no. 5 (May 2017): 1352–74. http://dx.doi.org/10.1162/neco_a_00950.

Abstract:
In recent years, unsupervised two-dimensional (2D) dimensionality reduction methods for unlabeled large-scale data have made progress. However, the performance of these methods degrades when the similarity matrix is learned at the beginning of the dimensionality reduction process. A similarity matrix is used to reveal the underlying geometric structure of the data in unsupervised dimensionality reduction methods, and because of noisy data it is difficult to learn the optimal similarity matrix. In this letter, we propose a new dimensionality reduction model for 2D image matrices: unsupervised 2D dimensionality reduction with adaptive structure learning (DRASL). Instead of using a predetermined similarity matrix to characterize the underlying geometric structure of the original 2D image space, our approach learns the similarity matrix within the dimensionality reduction procedure. To realize a desirable neighbor assignment after dimensionality reduction, we add a constraint to our model such that there are exactly [Formula: see text] connected components in the final subspace. To accomplish these goals, we propose a unified objective function that integrates dimensionality reduction, the learning of the similarity matrix, and the adaptive learning of the neighbor assignment. An iterative optimization algorithm is proposed to solve the objective function. We compare the proposed method with several 2D unsupervised dimensionality reduction methods, using k-means to evaluate clustering performance. We conduct extensive experiments on the Coil20, AT&T, FERET, USPS, and Yale datasets to verify the effectiveness of our proposed method.
39

Vijayarani, S., C. Sivamathi, and S. Maria Sylviaa. "Bio Inspired Algorithms for Dimensionality Reduction and Outlier Detection in Medical Datasets." International Journal of Advanced Networking and Applications 14, no. 01 (2022): 5277–86. http://dx.doi.org/10.35444/ijana.2022.14107.

Abstract:
Dimensionality reduction is a useful technique in a number of applications, reducing the number of features to improve the productivity and efficiency of a task. Clustering is one of the influential tasks in data mining. Dimensionality reduction is used in data mining, image processing, networking, mobile computing, etc. The elementary intention of this work is to apply dimensionality reduction algorithms and then cluster the datasets to detect outliers. A bio-inspired ACO (ant colony optimization) algorithm is proposed to reduce dimensionality, and another bio-inspired algorithm, FA (firefly algorithm), is proposed to detect outliers. Three distinct medical datasets are used for the experimental results: a thyroid dataset, an oesophageal dataset, and a heart disease dataset.
40

Gao, Yunlong, Sizhe Luo, Jinyan Pan, Zhihao Wang, and Peng Gao. "Kernel alignment unsupervised discriminative dimensionality reduction." Neurocomputing 453 (September 2021): 181–94. http://dx.doi.org/10.1016/j.neucom.2021.03.127.

41

Crespo, Luis G., Brendon K. Colbert, Sean P. Kenny, and Daniel P. Giesy. "Dimensionality Reduction of Sliced-Normal Distributions." IFAC-PapersOnLine 53, no. 2 (2020): 7412–17. http://dx.doi.org/10.1016/j.ifacol.2020.12.1275.

42

Petty, G. W. "Dimensionality reduction in Bayesian estimation algorithms." Atmospheric Measurement Techniques 6, no. 9 (September 4, 2013): 2267–76. http://dx.doi.org/10.5194/amt-6-2267-2013.

Abstract:
An idealized synthetic database loosely resembling 3-channel passive microwave observations of precipitation against a variable background is employed to examine the performance of a conventional Bayesian retrieval algorithm. For this dataset, algorithm performance is found to be poor owing to an irreconcilable conflict between the need to find matches in the dependent database versus the need to exclude inappropriate matches. It is argued that the likelihood of such conflicts increases sharply with the dimensionality of the observation space of real satellite sensors, which may utilize 9 to 13 channels to retrieve precipitation, for example. An objective method is described for distilling the relevant information content from N real channels into a much smaller number (M) of pseudochannels while also regularizing the background (geophysical plus instrument) noise component. The pseudochannels are linear combinations of the original N channels obtained via a two-stage principal component analysis of the dependent dataset. Bayesian retrievals based on a single pseudochannel applied to the independent dataset yield striking improvements in overall performance. The differences between the conventional Bayesian retrieval and reduced-dimensional Bayesian retrieval suggest that a major potential problem with conventional multichannel retrievals – whether Bayesian or not – lies in the common but often inappropriate assumption of diagonal error covariance. The dimensional reduction technique described herein avoids this problem by, in effect, recasting the retrieval problem in a coordinate system in which the desired covariance is lower-dimensional, diagonal, and unit magnitude.
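The two-stage construction can be sketched in miniature: whiten the channels with the background-noise covariance, then keep the principal components that rise above the (now unit) noise floor as pseudochannels. The nine-channel toy model, the known noise covariance, and the retention threshold below are illustrative assumptions, not the paper's actual data or tuning.

```python
import numpy as np

rng = np.random.default_rng(6)
# toy observations: 9 "channels" driven by one underlying signal + correlated noise
N, n = 9, 2000
signal = rng.normal(size=n)
mix = 3.0 * rng.normal(size=N)                  # how the signal maps to channels
Sigma_n = np.eye(N) + 0.5                       # correlated background covariance
noise = rng.normal(size=(n, N)) @ np.linalg.cholesky(Sigma_n).T
X = np.outer(signal, mix) + noise

# stage 1: whiten with the noise covariance (known here; estimated in practice)
L = np.linalg.cholesky(np.linalg.inv(Sigma_n))
Xw = X @ L                                      # background noise becomes ~unit white
Xw -= Xw.mean(axis=0)

# stage 2: PCA; components above the unit noise floor become pseudochannels
_, s, Vt = np.linalg.svd(Xw, full_matrices=False)
variances = s ** 2 / n
M = int((variances > 2.0).sum())                # generous margin over the noise floor
pseudochannels = Xw @ Vt[:M].T
```

Because the whitening step makes the noise covariance the identity, the retained pseudochannels carry a diagonal, unit-magnitude noise term, which is the property the paper exploits in the Bayesian retrieval.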
43

Petty, G. W. "Dimensionality reduction in Bayesian estimation algorithms." Atmospheric Measurement Techniques Discussions 6, no. 2 (March 4, 2013): 2327–52. http://dx.doi.org/10.5194/amtd-6-2327-2013.

44

Mahadev, Preeti, and P. Nagabhushan. "Incremental Dimensionality Reduction in Hyperspectral Data." International Journal of Computer Applications 163, no. 7 (April 17, 2017): 21–34. http://dx.doi.org/10.5120/ijca2017913575.

45

Jindal, Priyanka, and Dharmender Kumar. "A Review on Dimensionality Reduction Techniques." International Journal of Computer Applications 173, no. 2 (September 15, 2017): 42–46. http://dx.doi.org/10.5120/ijca2017915260.

46

Merola, Giovanni M., and Bovas Abraham. "Dimensionality reduction approach to multivariate prediction." Canadian Journal of Statistics 29, no. 2 (June 2001): 191–200. http://dx.doi.org/10.2307/3316072.

47

Zhao, Zhikai, Jiansheng Qian, and Jian Cheng. "Marginal Discriminant Projection for Dimensionality Reduction." International Journal of Digital Content Technology and its Applications 6, no. 15 (August 31, 2012): 1–11. http://dx.doi.org/10.4156/jdcta.vol6.issue15.1.

48

Gnana Singh, D. Asir Antony, and E. Jebamalar Leavline. "Dimensionality Reduction for Classification and Clustering." International Journal of Intelligent Systems and Applications 11, no. 4 (April 8, 2019): 61–68. http://dx.doi.org/10.5815/ijisa.2019.04.06.

49

VanHorn, Kevin Christopher, and Murat Can Çobanoğlu. "Haisu: Hierarchically supervised nonlinear dimensionality reduction." PLOS Computational Biology 18, no. 7 (July 21, 2022): e1010351. http://dx.doi.org/10.1371/journal.pcbi.1010351.

Abstract:
We propose a novel strategy for incorporating hierarchical supervised label information into nonlinear dimensionality reduction techniques. Specifically, we extend t-SNE, UMAP, and PHATE to include known or predicted class labels and demonstrate the efficacy of our approach on multiple single-cell RNA sequencing datasets. Our approach, “Haisu,” is applicable across domains and methods of nonlinear dimensionality reduction. In general, the mathematical effect of Haisu can be summarized as a variable perturbation of the high dimensional space in which the original data is observed. We thereby preserve the core characteristics of the visualization method and only change the manifold to respect known or assumed class labels when provided. Our strategy is designed to aid in the discovery and understanding of underlying patterns in a dataset that is heavily influenced by parent-child relationships. We show that using our approach can also help in semi-supervised settings where labels are known for only some datapoints (for instance when only a fraction of the cells are labeled). In summary, Haisu extends existing popular visualization methods to enable a user to incorporate labels known a priori into a visualization, including their hierarchical relationships as defined by a user input graph.
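The core idea sketched in the abstract (perturb high-dimensional pairwise distances so that points with hierarchically related labels are drawn together before running a nonlinear embedding) can be illustrated crudely. The shrink factor, the small label graph, and the function below are invented for illustration and are not Haisu's actual formulation.

```python
import numpy as np

def label_aware_distances(X, labels, label_graph_dist, strength=0.5):
    # shrink pairwise Euclidean distances according to how close the two
    # points' labels are in a user-supplied label hierarchy (0 = same label)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    H = label_graph_dist[labels[:, None], labels[None, :]]
    factor = 1.0 - strength / (1.0 + H)         # same label -> multiply by 0.5
    return D * factor

# toy hierarchy over three labels: 0 and 1 are siblings, 2 is distant
hierarchy = np.array([[0, 1, 3],
                      [1, 0, 3],
                      [3, 3, 0]])

rng = np.random.default_rng(7)
X = rng.normal(size=(6, 4))
labels = np.array([0, 0, 1, 1, 2, 2])
D = label_aware_distances(X, labels, hierarchy)
```

The perturbed matrix D could then be fed to any distance-based embedding; because only the input distances change, the visualization method itself (t-SNE, UMAP, PHATE) is untouched, mirroring the strategy's method-agnostic design.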
50

Nath, B., D. K. Bhattacharyya, and A. Ghosh. "Dimensionality Reduction for Association Rule Mining." International Journal of Intelligent Information Processing 2, no. 1 (March 31, 2011): 9–21. http://dx.doi.org/10.4156/ijiip.vol2.issue1.2.
