Journal articles on the topic "Dimensionality reduction"


Consult the top 50 journal articles for your research on the topic "Dimensionality reduction".


You can also download the full text of each academic publication in PDF format and read its abstract online whenever it is available in the metadata.

Explore journal articles on a wide variety of disciplines and organize your bibliography correctly.

1

Cheng, Long, Chenyu You, and Yani Guan. "Random Projections for Non-linear Dimensionality Reduction". International Journal of Machine Learning and Computing 6, no. 4 (August 2016): 220–25. http://dx.doi.org/10.18178/ijmlc.2016.6.4.601.

Full text
APA, Harvard, Vancouver, ISO, and other styles
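The random-projection idea this line of work builds on can be sketched in a few lines of plain Python. This is a generic Johnson–Lindenstrauss-style linear projection, not the non-linear variant the paper develops; the dimensions and seeds are illustrative.

```python
import math
import random

def random_projection_matrix(d, k, seed=0):
    """Gaussian random matrix scaled by 1/sqrt(k), as in classic
    Johnson-Lindenstrauss-style projections."""
    rng = random.Random(seed)
    return [[rng.gauss(0.0, 1.0) / math.sqrt(k) for _ in range(d)]
            for _ in range(k)]

def project(R, x):
    """Map a d-dimensional point x to k dimensions via y = R x."""
    return [sum(r_i * x_i for r_i, x_i in zip(row, x)) for row in R]

def dist(a, b):
    return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))

# Two random 1000-dimensional points, projected down to 50 dimensions.
rng = random.Random(1)
d, k = 1000, 50
x = [rng.gauss(0, 1) for _ in range(d)]
y = [rng.gauss(0, 1) for _ in range(d)]
R = random_projection_matrix(d, k)
ratio = dist(project(R, x), project(R, y)) / dist(x, y)
# With high probability the pairwise distance is distorted only by a
# small factor, so the ratio stays close to 1.
```

The guarantee behind this is that k on the order of log(n) suffices to approximately preserve all pairwise distances among n points, independently of the original dimension d.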
2

Marchette, David J., and Wendy L. Poston. "Local dimensionality reduction". Computational Statistics 14, no. 4 (12 September 1999): 469–89. http://dx.doi.org/10.1007/s001800050026.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Sun, Yu-Yin, Michael Ng, and Zhi-Hua Zhou. "Multi-Instance Dimensionality Reduction". Proceedings of the AAAI Conference on Artificial Intelligence 24, no. 1 (3 July 2010): 587–92. http://dx.doi.org/10.1609/aaai.v24i1.7700.

Full text
Abstract
Multi-instance learning deals with problems that treat bags of instances as training examples. In single-instance learning problems, dimensionality reduction is an essential step for high-dimensional data analysis and has been studied for years. The curse of dimensionality also exists in multi-instance learning tasks, yet this difficult task has not been studied before. Direct application of existing single-instance dimensionality reduction objectives to multi-instance learning tasks may not work well since it ignores the characteristic of multi-instance learning that the labels of bags are known while the labels of instances are unknown. In this paper, we propose an effective model and develop an efficient algorithm to solve the multi-instance dimensionality reduction problem. We formulate the objective as an optimization problem by considering orthonormality and sparsity constraints in the projection matrix for dimensionality reduction, and then solve it by gradient descent along the tangent space of the orthonormal matrices. We also propose an approximation for improving the efficiency. Experimental results validate the effectiveness of the proposed method.
APA, Harvard, Vancouver, ISO, and other styles
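Sun et al.'s optimization keeps the projection matrix orthonormal while descending the objective. One common, simpler device for maintaining such a constraint (not necessarily the paper's tangent-space update) is to re-orthonormalize the matrix after each gradient step, e.g. with Gram–Schmidt:

```python
import math
import random

def gram_schmidt(rows):
    """Orthonormalize the rows of a matrix (classic Gram-Schmidt):
    the result has mutually orthogonal rows of unit norm."""
    ortho = []
    for v in rows:
        w = list(v)
        for u in ortho:
            c = sum(wi * ui for wi, ui in zip(w, u))  # component along u
            w = [wi - c * ui for wi, ui in zip(w, u)]
        norm = math.sqrt(sum(wi * wi for wi in w))
        ortho.append([wi / norm for wi in w])
    return ortho

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# A hypothetical projection matrix (3 directions in 5-D) that a gradient
# step has pushed off the orthonormal manifold; pull it back:
rng = random.Random(0)
P = [[rng.gauss(0, 1) for _ in range(5)] for _ in range(3)]
Q = gram_schmidt(P)
```

In a full projected-gradient loop one would alternate a gradient step on the objective with this re-orthonormalization until convergence.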
4

Koren, Y., and L. Carmel. "Robust linear dimensionality reduction". IEEE Transactions on Visualization and Computer Graphics 10, no. 4 (July 2004): 459–70. http://dx.doi.org/10.1109/tvcg.2004.17.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Lotlikar, R., and R. Kothari. "Fractional-step dimensionality reduction". IEEE Transactions on Pattern Analysis and Machine Intelligence 22, no. 6 (June 2000): 623–27. http://dx.doi.org/10.1109/34.862200.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Gottlieb, Lee-Ad, Aryeh Kontorovich, and Robert Krauthgamer. "Adaptive metric dimensionality reduction". Theoretical Computer Science 620 (March 2016): 105–18. http://dx.doi.org/10.1016/j.tcs.2015.10.040.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Pang, Rich, Benjamin J. Lansdell, and Adrienne L. Fairhall. "Dimensionality reduction in neuroscience". Current Biology 26, no. 14 (July 2016): R656–R660. http://dx.doi.org/10.1016/j.cub.2016.05.029.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Lovaglio, Pietro Giorgio, and Giorgio Vittadini. "Multilevel dimensionality-reduction methods". Statistical Methods & Applications 22, no. 2 (27 September 2012): 183–207. http://dx.doi.org/10.1007/s10260-012-0215-2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Carter, Kevin, Raviv Raich, William Finn, and Alfred Hero III. "Information-Geometric Dimensionality Reduction". IEEE Signal Processing Magazine 28, no. 2 (March 2011): 89–99. http://dx.doi.org/10.1109/msp.2010.939536.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Gonen, Mehmet. "Bayesian Supervised Dimensionality Reduction". IEEE Transactions on Cybernetics 43, no. 6 (December 2013): 2179–89. http://dx.doi.org/10.1109/tcyb.2013.2245321.

Full text
APA, Harvard, Vancouver, ISO, and other styles
11

Zhang, Zhao, Tommy W. S. Chow, and Ning Ye. "Semisupervised Multimodal Dimensionality Reduction". Computational Intelligence 29, no. 1 (15 May 2012): 70–110. http://dx.doi.org/10.1111/j.1467-8640.2012.00429.x.

Full text
APA, Harvard, Vancouver, ISO, and other styles
12

Liu, Huan, and Rudy Setiono. "Dimensionality reduction via discretization". Knowledge-Based Systems 9, no. 1 (February 1996): 67–72. http://dx.doi.org/10.1016/0950-7051(95)01030-0.

Full text
APA, Harvard, Vancouver, ISO, and other styles
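The idea of reducing dimensionality through discretization, as in Liu and Setiono's paper, can be illustrated with a deliberately simplified sketch: discretize each feature into intervals and drop the features whose values collapse into a single interval. The paper's Chi2-style interval merging is more sophisticated; the equal-width binning and bin count here are illustrative only.

```python
def equal_width_bins(values, n_bins):
    """Discretize a list of continuous values into n_bins equal-width
    bins, returning one bin index per value."""
    lo, hi = min(values), max(values)
    if hi == lo:                       # constant feature: one bin only
        return [0] * len(values)
    width = (hi - lo) / n_bins
    return [min(int((v - lo) / width), n_bins - 1) for v in values]

def reduce_by_discretization(rows, n_bins=3):
    """Discretize each feature (column); drop features whose values all
    fall into one bin, since they carry no distinguishing information."""
    n_features = len(rows[0])
    cols = [[r[j] for r in rows] for j in range(n_features)]
    binned = [equal_width_bins(c, n_bins) for c in cols]
    kept = [j for j, b in enumerate(binned) if len(set(b)) > 1]
    return [[row[j] for j in kept] for row in rows], kept

# Feature 1 is constant, so it collapses to one bin and is discarded.
data = [[1.0, 5.0, 0.2],
        [2.0, 5.0, 0.9],
        [9.0, 5.0, 0.5]]
reduced, kept = reduce_by_discretization(data)
```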
13

Yang, Li. "Distance-preserving dimensionality reduction". Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery 1, no. 5 (14 June 2011): 369–80. http://dx.doi.org/10.1002/widm.39.

Full text
APA, Harvard, Vancouver, ISO, and other styles
14

Li, Hongda, Jian Cui, Xinle Zhang, Yongqi Han, and Liying Cao. "Dimensionality Reduction and Classification of Hyperspectral Remote Sensing Image Feature Extraction". Remote Sensing 14, no. 18 (13 September 2022): 4579. http://dx.doi.org/10.3390/rs14184579.

Full text
Abstract
Terrain classification is an important research direction in the field of remote sensing. Hyperspectral remote sensing image data contain a large amount of rich ground object information. However, such data have the characteristics of high spatial dimensions of features, strong data correlation, high data redundancy, and long operation time, which lead to difficulty in image data classification. A data dimensionality reduction algorithm can transform the data into low-dimensional data with strong features and then classify the dimensionally reduced data. However, most classification methods cannot effectively extract dimensionality-reduced data features. In this paper, different dimensionality reduction and machine learning supervised classification algorithms are explored to determine a suitable combination method of dimensionality reduction and classification for hyperspectral images. Soft and hard classification methods are adopted to achieve the classification of pixels according to diversity. The results show that the data after dimensionality reduction retain the data features with high overall feature correlation, and the data dimension is drastically reduced. The dimensionality reduction method of uniform manifold approximation and projection (UMAP) and the classification method of support vector machine achieve the best terrain classification with 99.57% classification accuracy. High-precision fitting of neural networks for soft classification of hyperspectral images with a model fitting correlation coefficient (R2) of up to 0.979 solves the problem of mixed pixel decomposition.
APA, Harvard, Vancouver, ISO, and other styles
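The reduce-then-classify pipeline this paper evaluates (UMAP followed by an SVM performed best) follows a generic pattern that can be sketched with simple stand-ins: variance-based feature selection as the reduction step and a nearest-centroid rule as the classifier. Both stand-ins are illustrative only, not the methods the paper benchmarks.

```python
def top_variance_features(rows, k):
    """Rank features by variance and keep the top k - a simple stand-in
    for the reduction step (the paper itself uses UMAP and others)."""
    n, m = len(rows), len(rows[0])
    ranked = []
    for j in range(m):
        col = [r[j] for r in rows]
        mean = sum(col) / n
        ranked.append((sum((v - mean) ** 2 for v in col) / n, j))
    ranked.sort(reverse=True)
    return sorted(j for _, j in ranked[:k])

def nearest_centroid_fit(rows, labels):
    """Per-class mean vectors; a minimal classifier standing in for SVM."""
    sums, counts = {}, {}
    for row, lab in zip(rows, labels):
        counts[lab] = counts.get(lab, 0) + 1
        s = sums.setdefault(lab, [0.0] * len(row))
        for j, v in enumerate(row):
            s[j] += v
    return {lab: [v / counts[lab] for v in s] for lab, s in sums.items()}

def nearest_centroid_predict(centroids, row):
    def d2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda lab: d2(centroids[lab], row))

# Toy data: feature 0 separates the classes, feature 1 is constant noise.
X = [[0.0, 1.0, 5.0], [0.2, 1.0, 5.1], [3.0, 1.0, 4.9], [3.2, 1.0, 5.0]]
y = ["a", "a", "b", "b"]
cols = top_variance_features(X, 2)           # drops the constant feature
Xr = [[row[j] for j in cols] for row in X]
model = nearest_centroid_fit(Xr, y)
pred = nearest_centroid_predict(model, [3.1, 5.0])
```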
15

Ahmad, Noor, and Ali Bou Nassif. "Dimensionality Reduction: Challenges and Solutions". ITM Web of Conferences 43 (2022): 01017. http://dx.doi.org/10.1051/itmconf/20224301017.

Full text
Abstract
The use of dimensionality reduction techniques is a keystone for analyzing and interpreting high-dimensional data. These techniques capture several data features of interest, such as dynamical structure, input-output relationships, the correlation between data sets, covariance, etc. Dimensionality reduction entails mapping a set of high-dimensional data features onto low-dimensional data. Motivated by the poor performance of learning models on high-dimensional data, this study examines five distinct dimensionality reduction methods. In addition, a thorough comparison between the reduced-dimensionality data and the original data is conducted using statistical and machine learning models.
APA, Harvard, Vancouver, ISO, and other styles
16

K, Bhargavi. "Data Dimensionality Reduction Techniques: Review". International Journal of Engineering Technology and Management Sciences 4, no. 4 (28 July 2020): 62–65. http://dx.doi.org/10.46647/ijetms.2020.v04i04.010.

Full text
Abstract
Data science is the study of data. It involves developing methods of recording, storing, and analyzing data to effectively extract useful information. The goal of data science is to gain insights and knowledge from any type of data — both structured and unstructured. Data science is related to computer science, but is a separate field. Computer science involves creating programs and algorithms to record and process data, while data science covers any type of data analysis, which may or may not use computers. Data science is more closely related to the mathematics field of Statistics, which includes the collection, organization, analysis, and presentation of data. Because of the large amounts of data modern companies and organizations maintain, data science has become an integral part of IT. For example, a company that has petabytes of user data may use data science to develop effective ways to store, manage, and analyze the data. The company may use the scientific method to run tests and extract results that can provide meaningful insights about their users.
APA, Harvard, Vancouver, ISO, and other styles
17

Kay, Steven. "Dimensionality Reduction for Signal Detection". IEEE Signal Processing Letters 29 (2022): 145–48. http://dx.doi.org/10.1109/lsp.2021.3129453.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Nelson, Jelani. "Dimensionality Reduction in Euclidean Space". Notices of the American Mathematical Society 67, no. 10 (1 November 2020): 1. http://dx.doi.org/10.1090/noti2166.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Zhang, Tianhao, Dacheng Tao, Xuelong Li, and Jie Yang. "Patch Alignment for Dimensionality Reduction". IEEE Transactions on Knowledge and Data Engineering 21, no. 9 (September 2009): 1299–313. http://dx.doi.org/10.1109/tkde.2008.212.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Wang, Shujian, Deyan Xie, Fang Chen, and Quanxue Gao. "Dimensionality reduction by LPP-L21". IET Computer Vision 12, no. 5 (23 March 2018): 659–65. http://dx.doi.org/10.1049/iet-cvi.2017.0302.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Raymer, M. L., W. F. Punch, E. D. Goodman, L. A. Kuhn, and A. K. Jain. "Dimensionality reduction using genetic algorithms". IEEE Transactions on Evolutionary Computation 4, no. 2 (July 2000): 164–71. http://dx.doi.org/10.1109/4235.850656.

Full text
APA, Harvard, Vancouver, ISO, and other styles
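A genetic algorithm for dimensionality reduction typically evolves bit-mask chromosomes over the feature set, as in the setting of Raymer et al. The sketch below is a minimal illustration with a made-up separation-based fitness, a two-class assumption, and toy parameters; it is not the paper's hybrid GA/k-NN formulation.

```python
import random

def fitness(mask, rows, labels):
    """Score a feature subset: squared distance between the two class
    means on the selected features, lightly penalized by subset size."""
    sel = [j for j, bit in enumerate(mask) if bit]
    if not sel:
        return -1.0
    means = []
    for lab in sorted(set(labels)):          # assumes exactly two classes
        pts = [r for r, l in zip(rows, labels) if l == lab]
        means.append([sum(p[j] for p in pts) / len(pts) for j in sel])
    a, b = means
    sep = sum((x - y) ** 2 for x, y in zip(a, b))
    return sep - 0.1 * len(sel)

def ga_select(rows, labels, n_gen=30, pop_size=12, seed=0):
    """Evolve feature masks: keep the fitter half, refill the population
    with one-point crossover plus occasional bit-flip mutation."""
    rng = random.Random(seed)
    m = len(rows[0])
    pop = [[rng.randint(0, 1) for _ in range(m)] for _ in range(pop_size)]
    for _ in range(n_gen):
        pop.sort(key=lambda c: fitness(c, rows, labels), reverse=True)
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = rng.sample(survivors, 2)
            cut = rng.randrange(1, m)
            child = p1[:cut] + p2[cut:]
            if rng.random() < 0.2:           # mutation
                k = rng.randrange(m)
                child[k] = 1 - child[k]
            children.append(child)
        pop = survivors + children
    return max(pop, key=lambda c: fitness(c, rows, labels))

# Feature 0 separates the classes; features 1 and 2 are uninformative.
X = [[0.0, 5.0, 1.0], [0.1, 5.0, 1.0], [4.0, 5.0, 1.0], [4.2, 5.0, 1.0]]
y = [0, 0, 1, 1]
best = ga_select(X, y)
```

The size penalty in the fitness is what pushes the search toward small, discriminative subsets rather than simply keeping every feature.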
22

Saund, E. "Dimensionality-reduction using connectionist networks". IEEE Transactions on Pattern Analysis and Machine Intelligence 11, no. 3 (March 1989): 304–14. http://dx.doi.org/10.1109/34.21799.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Vats, Deepak, and Avinash Sharma. "Dimensionality Reduction Techniques: Comparative Analysis". Journal of Computational and Theoretical Nanoscience 17, no. 6 (1 June 2020): 2684–88. http://dx.doi.org/10.1166/jctn.2020.8967.

Full text
Abstract
An exponential growth in the dimensionality of real-world data has been observed. Examples of high-dimensional data include speech signals, sensor data, medical data, criminal data, and data related to recommendation processes in fields such as news, movies (Netflix), and e-commerce. To improve learning accuracy in machine learning and to enhance mining performance, redundant features and features not relevant to the mining or learning task must be removed from these high-dimensional datasets. Many supervised and unsupervised methodologies exist in the literature for performing dimensionality reduction. The objective of this paper is to present the most prominent methodologies in the field of dimensionality reduction and to highlight the advantages and disadvantages of these algorithms, which can serve as a starting point for beginners in this field.
APA, Harvard, Vancouver, ISO, and other styles
24

Harrow, Aram W., Ashley Montanaro, and Anthony J. Short. "Limitations on quantum dimensionality reduction". International Journal of Quantum Information 13, no. 04 (June 2015): 1440001. http://dx.doi.org/10.1142/s0219749914400012.

Full text
Abstract
The Johnson–Lindenstrauss Lemma is a classic result which implies that any set of n real vectors can be compressed to O(log n) dimensions while only distorting pairwise Euclidean distances by a constant factor. Here we consider potential extensions of this result to the compression of quantum states. We show that, by contrast with the classical case, there does not exist any distribution over quantum channels that significantly reduces the dimension of quantum states while preserving the 2-norm distance with high probability. We discuss two tasks for which the 2-norm distance is indeed the correct figure of merit. In the case of the trace norm, we show that the dimension of low-rank mixed states can be reduced by up to a square root, but that essentially no dimensionality reduction is possible for highly mixed states.
APA, Harvard, Vancouver, ISO, and other styles
25

Tasoulis, Sotiris, Nicos G. Pavlidis, and Teemu Roos. "Nonlinear dimensionality reduction for clustering". Pattern Recognition 107 (November 2020): 107508. http://dx.doi.org/10.1016/j.patcog.2020.107508.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Gao, Junbin, Qinfeng Shi, and Tibério S. Caetano. "Dimensionality reduction via compressive sensing". Pattern Recognition Letters 33, no. 9 (July 2012): 1163–70. http://dx.doi.org/10.1016/j.patrec.2012.02.007.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Wang, Yasi, Hongxun Yao, and Sicheng Zhao. "Auto-encoder based dimensionality reduction". Neurocomputing 184 (April 2016): 232–42. http://dx.doi.org/10.1016/j.neucom.2015.08.104.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Hou, Chenping, Changshui Zhang, Yi Wu, and Yuanyuan Jiao. "Stable local dimensionality reduction approaches". Pattern Recognition 42, no. 9 (September 2009): 2054–66. http://dx.doi.org/10.1016/j.patcog.2008.12.009.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Qiao, Lishan, Limei Zhang, and Songcan Chen. "Dimensionality reduction with adaptive graph". Frontiers of Computer Science 7, no. 5 (10 August 2013): 745–53. http://dx.doi.org/10.1007/s11704-013-2234-z.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Lai, Zhihui, Yong Xu, Jian Yang, Linlin Shen, and David Zhang. "Rotational Invariant Dimensionality Reduction Algorithms". IEEE Transactions on Cybernetics 47, no. 11 (November 2017): 3733–46. http://dx.doi.org/10.1109/tcyb.2016.2578642.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Nagabhushan, P., K. Chidananda Gowda, and Edwin Diday. "Dimensionality reduction of symbolic data". Pattern Recognition Letters 16, no. 2 (February 1995): 219–23. http://dx.doi.org/10.1016/0167-8655(94)00085-h.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Havelka, Jan, Anna Kučerová, and Jan Sýkora. "Dimensionality reduction in thermal tomography". Computers & Mathematics with Applications 78, no. 9 (November 2019): 3077–89. http://dx.doi.org/10.1016/j.camwa.2019.04.019.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Santos, João Filipe, Maria Manuela Portela, and Inmaculada Pulido-Calvo. "Dimensionality reduction in drought modelling". Hydrological Processes 27, no. 10 (17 April 2012): 1399–410. http://dx.doi.org/10.1002/hyp.9300.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Shen, Zilin. "Comparison and Evaluation of Classical Dimensionality Reduction Methods". Highlights in Science, Engineering and Technology 70 (15 November 2023): 411–18. http://dx.doi.org/10.54097/hset.v70i.13890.

Full text
Abstract
As one of the tasks of unsupervised learning, data dimensionality reduction faces the problem of a lack of evaluation methods. On this basis, three classical dimensionality reduction methods, PCA, t-SNE, and UMAP, were selected as the subjects of this paper. Five three-class datasets were selected, and the three methods mentioned above were used to perform dimensionality reduction. This paper plotted 3D scatter graphs after dimensionality reduction to analyze how well the data differentiate the categories of the target variable. The data after dimensionality reduction were then classified using a random forest model, and the classification accuracy was obtained. According to the 3D scatter plots and the accuracy of the random forest, it is found that PCA has a good dimensionality reduction effect on most of the selected datasets, and t-SNE has a relatively stable dimensionality reduction effect. In contrast, UMAP has good dimensionality reduction performance on some individual datasets but lacks stability. Overall, this paper proposes a dimensionality reduction evaluation method that combines scatter-plot visualization results and classification models, which can effectively predict the performance of dimensionality reduction methods on a variety of datasets, thereby aiding the comparison and selection of dimensionality reduction methods in the field of unsupervised learning.
APA, Harvard, Vancouver, ISO, and other styles
35

Xie, Fuding, Yutao Fan, and Ming Zhou. "Dimensionality Reduction by Weighted Connections between Neighborhoods". Abstract and Applied Analysis 2014 (2014): 1–5. http://dx.doi.org/10.1155/2014/928136.

Full text
Abstract
Dimensionality reduction is the transformation of high-dimensional data into a meaningful representation of reduced dimensionality. This paper introduces a dimensionality reduction technique based on weighted connections between neighborhoods to improve the K-Isomap method, aiming to preserve the relationships between neighborhoods as faithfully as possible during dimensionality reduction. The validity of the proposal is tested on three typical examples which are widely employed in manifold-based algorithms. The experimental results show that the local topology of the dataset is preserved well while the proposed method transforms the dataset from a high-dimensional space into a new low-dimensional dataset.
APA, Harvard, Vancouver, ISO, and other styles
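The neighborhood-graph machinery that K-Isomap (and this paper's weighted variant) builds on can be sketched in two steps: construct a k-nearest-neighbour graph, then compute shortest-path (geodesic) distances over it. The paper's specific weighting scheme is not reproduced here; this is the generic backbone.

```python
import heapq
import math

def knn_graph(points, k):
    """Adjacency list of the k-nearest-neighbour graph, with Euclidean
    edge weights; edges are symmetrized."""
    n = len(points)
    def d(i, j):
        return math.sqrt(sum((a - b) ** 2
                             for a, b in zip(points[i], points[j])))
    adj = {i: [] for i in range(n)}
    for i in range(n):
        nearest = sorted((j for j in range(n) if j != i),
                         key=lambda j: d(i, j))[:k]
        for j in nearest:
            adj[i].append((j, d(i, j)))
            adj[j].append((i, d(i, j)))
    return adj

def geodesics(adj, src):
    """Dijkstra shortest paths from src: the graph approximation of
    distances measured along the manifold."""
    dist = {v: math.inf for v in adj}
    dist[src] = 0.0
    heap = [(0.0, src)]
    while heap:
        du, u = heapq.heappop(heap)
        if du > dist[u]:
            continue
        for v, w in adj[u]:
            if du + w < dist[v]:
                dist[v] = du + w
                heapq.heappush(heap, (du + w, v))
    return dist

# Points along a bent curve: the geodesic from one end to the other
# follows the chain of neighbours instead of cutting straight across.
pts = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2)]
g = geodesics(knn_graph(pts, 2), 0)
```

In full Isomap these geodesic distances would then be fed to classical multidimensional scaling to produce the low-dimensional embedding.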
36

Yan, Chun-Man, and Yu-Yao Zhang. "Face Recognition Based on SRC Combined with Sparse Embedding Dimensionality Reduction". 電腦學刊 33, no. 2 (April 2022): 083–93. http://dx.doi.org/10.53106/199115992022043302007.

Full text
Abstract
The sparse representation-based classification (SRC) method has achieved good recognition results and shown strong robustness for face recognition, especially when the face image is affected by illumination variations, expression changes, and occlusion. The SRC method simply uses the training set as a dictionary to encode test samples. However, high-dimensional training face data usually contain a large amount of redundant information, which increases the complexity of this method. Therefore, most existing methods perform an image dimensionality reduction step separately before SRC is launched, but this may not make full use of the discriminative information of the training samples. In this paper, a sparse embedding dimensionality reduction strategy is combined with the efficient SRC method to achieve face recognition. In the proposed method, a projection matrix is used to project high-dimensional data into a low-dimensional space. At the same time, a discriminative coefficient constraint term is introduced into the objective function to reduce the classification residual of each sample through the distance relationships between all coefficients. The label information of the samples is then used to iteratively update the projection matrix and the coefficient representation. Finally, the test samples are projected into the low-dimensional space for classification. Extensive experimental results on three widely used face datasets show that the proposed method improves the discrimination of face images in the low-dimensional space and can achieve better face recognition results.
APA, Harvard, Vancouver, ISO, and other styles
38

Zhao, Xiaowei, Feiping Nie, Sen Wang, Jun Guo, Pengfei Xu, and Xiaojiang Chen. "Unsupervised 2D Dimensionality Reduction with Adaptive Structure Learning". Neural Computation 29, no. 5 (May 2017): 1352–74. http://dx.doi.org/10.1162/neco_a_00950.

Full text
Abstract
In recent years, unsupervised two-dimensional (2D) dimensionality reduction methods for unlabeled large-scale data have made progress. However, their performance degrades when the similarity matrix is learned at the beginning of the dimensionality reduction process. A similarity matrix is used to reveal the underlying geometric structure of data in unsupervised dimensionality reduction methods. Because of noisy data, it is difficult to learn the optimal similarity matrix. In this letter, we propose a new dimensionality reduction model for 2D image matrices: unsupervised 2D dimensionality reduction with adaptive structure learning (DRASL). Instead of using a predetermined similarity matrix to characterize the underlying geometric structure of the original 2D image space, our proposed approach involves learning the similarity matrix during the dimensionality reduction procedure. To realize a desirable neighbors assignment after dimensionality reduction, we add a constraint to our model such that there are exactly [Formula: see text] connected components in the final subspace. To accomplish these goals, we propose a unified objective function that integrates dimensionality reduction, the learning of the similarity matrix, and the adaptive learning of the neighbors assignment. An iterative optimization algorithm is proposed to solve the objective function. We compare the proposed method with several 2D unsupervised dimensionality reduction methods. K-means is used to evaluate the clustering performance. We conduct extensive experiments on the Coil20, AT&T, FERET, USPS, and Yale data sets to verify the effectiveness of our proposed method.
APA, Harvard, Vancouver, ISO, and other styles
39

Vijayarani, S., C. Sivamathi, and S. Maria Sylviaa. "Bio Inspired Algorithms for Dimensionality Reduction and Outlier Detection in Medical Datasets". International Journal of Advanced Networking and Applications 14, no. 01 (2022): 5277–86. http://dx.doi.org/10.35444/ijana.2022.14107.

Full text
Abstract
Dimensionality reduction is a useful technique applied in a number of applications to reduce the number of features and improve the productivity and efficiency of a task. Clustering is one of the most influential tasks in data mining. Dimensionality reduction is used in data mining, image processing, networking, mobile computing, etc. The primary intention of this work is to apply dimensionality reduction algorithms and then cluster the datasets to detect outliers. A bio-inspired ACO (ant colony optimization) algorithm is proposed to reduce dimensionality, and another bio-inspired algorithm, the FA (firefly algorithm), is proposed to detect outliers. Three distinct medical datasets, a thyroid dataset, an oesophageal dataset, and a heart disease dataset, are used for the experiments.
APA, Harvard, Vancouver, ISO, and other styles
40

Gao, Yunlong, Sizhe Luo, Jinyan Pan, Zhihao Wang, and Peng Gao. "Kernel alignment unsupervised discriminative dimensionality reduction". Neurocomputing 453 (September 2021): 181–94. http://dx.doi.org/10.1016/j.neucom.2021.03.127.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Crespo, Luis G., Brendon K. Colbert, Sean P. Kenny, and Daniel P. Giesy. "Dimensionality Reduction of Sliced-Normal Distributions". IFAC-PapersOnLine 53, no. 2 (2020): 7412–17. http://dx.doi.org/10.1016/j.ifacol.2020.12.1275.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Petty, G. W. "Dimensionality reduction in Bayesian estimation algorithms". Atmospheric Measurement Techniques 6, no. 9 (4 September 2013): 2267–76. http://dx.doi.org/10.5194/amt-6-2267-2013.

Full text
Abstract
Abstract. An idealized synthetic database loosely resembling 3-channel passive microwave observations of precipitation against a variable background is employed to examine the performance of a conventional Bayesian retrieval algorithm. For this dataset, algorithm performance is found to be poor owing to an irreconcilable conflict between the need to find matches in the dependent database versus the need to exclude inappropriate matches. It is argued that the likelihood of such conflicts increases sharply with the dimensionality of the observation space of real satellite sensors, which may utilize 9 to 13 channels to retrieve precipitation, for example. An objective method is described for distilling the relevant information content from N real channels into a much smaller number (M) of pseudochannels while also regularizing the background (geophysical plus instrument) noise component. The pseudochannels are linear combinations of the original N channels obtained via a two-stage principal component analysis of the dependent dataset. Bayesian retrievals based on a single pseudochannel applied to the independent dataset yield striking improvements in overall performance. The differences between the conventional Bayesian retrieval and reduced-dimensional Bayesian retrieval suggest that a major potential problem with conventional multichannel retrievals – whether Bayesian or not – lies in the common but often inappropriate assumption of diagonal error covariance. The dimensional reduction technique described herein avoids this problem by, in effect, recasting the retrieval problem in a coordinate system in which the desired covariance is lower-dimensional, diagonal, and unit magnitude.
APA, Harvard, Vancouver, ISO, and other styles
43

Petty, G. W. "Dimensionality reduction in Bayesian estimation algorithms". Atmospheric Measurement Techniques Discussions 6, no. 2 (4 March 2013): 2327–52. http://dx.doi.org/10.5194/amtd-6-2327-2013.

Full text
Abstract
Abstract. An idealized synthetic database loosely resembling 3-channel passive microwave observations of precipitation against a variable background is employed to examine the performance of a conventional Bayesian retrieval algorithm. For this dataset, algorithm performance is found to be poor owing to an irreconcilable conflict between the need to find matches in the dependent database versus the need to exclude inappropriate matches. It is argued that the likelihood of such conflicts increases sharply with the dimensionality of the observation space of real satellite sensors, which may utilize 9 to 13 channels to retrieve precipitation, for example. An objective method is described for distilling the relevant information content from N real channels into a much smaller number (M) of pseudochannels while also regularizing the background (geophysical plus instrument) noise component. The pseudochannels are linear combinations of the original N channels obtained via a two-stage principal component analysis of the dependent dataset. Bayesian retrievals based on a single pseudochannel applied to the independent dataset yield striking improvements in overall performance. The differences between the conventional Bayesian retrieval and reduced-dimensional Bayesian retrieval suggest that a major potential problem with conventional multichannel retrievals – whether Bayesian or not – lies in the common but often inappropriate assumption of diagonal error covariance. The dimensional reduction technique described herein avoids this problem by, in effect, recasting the retrieval problem in a coordinate system in which the desired covariance is lower-dimensional, diagonal, and unit magnitude.
APA, Harvard, Vancouver, ISO, and other styles
44

Mahadev, Preeti, and P. Nagabhushan. "Incremental Dimensionality Reduction in Hyperspectral Data". International Journal of Computer Applications 163, no. 7 (17 April 2017): 21–34. http://dx.doi.org/10.5120/ijca2017913575.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Jindal, Priyanka, and Dharmender Kumar. "A Review on Dimensionality Reduction Techniques". International Journal of Computer Applications 173, no. 2 (15 September 2017): 42–46. http://dx.doi.org/10.5120/ijca2017915260.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Merola, Giovanni M., and Bovas Abraham. "Dimensionality reduction approach to multivariate prediction". Canadian Journal of Statistics 29, no. 2 (June 2001): 191–200. http://dx.doi.org/10.2307/3316072.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Zhao, Zhikai, Jiansheng Qian, and Jian Cheng. "Marginal Discriminant Projection for Dimensionality Reduction". International Journal of Digital Content Technology and its Applications 6, no. 15 (31 August 2012): 1–11. http://dx.doi.org/10.4156/jdcta.vol6.issue15.1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Gnana Singh, D. Asir Antony, and E. Jebamalar Leavline. "Dimensionality Reduction for Classification and Clustering". International Journal of Intelligent Systems and Applications 11, no. 4 (8 April 2019): 61–68. http://dx.doi.org/10.5815/ijisa.2019.04.06.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

VanHorn, Kevin Christopher, and Murat Can Çobanoğlu. "Haisu: Hierarchically supervised nonlinear dimensionality reduction". PLOS Computational Biology 18, no. 7 (21 July 2022): e1010351. http://dx.doi.org/10.1371/journal.pcbi.1010351.

Full text
Abstract
We propose a novel strategy for incorporating hierarchical supervised label information into nonlinear dimensionality reduction techniques. Specifically, we extend t-SNE, UMAP, and PHATE to include known or predicted class labels and demonstrate the efficacy of our approach on multiple single-cell RNA sequencing datasets. Our approach, “Haisu,” is applicable across domains and methods of nonlinear dimensionality reduction. In general, the mathematical effect of Haisu can be summarized as a variable perturbation of the high dimensional space in which the original data is observed. We thereby preserve the core characteristics of the visualization method and only change the manifold to respect known or assumed class labels when provided. Our strategy is designed to aid in the discovery and understanding of underlying patterns in a dataset that is heavily influenced by parent-child relationships. We show that using our approach can also help in semi-supervised settings where labels are known for only some datapoints (for instance when only a fraction of the cells are labeled). In summary, Haisu extends existing popular visualization methods to enable a user to incorporate labels known a priori into a visualization, including their hierarchical relationships as defined by a user input graph.
APA, Harvard, Vancouver, ISO, and other styles
50

Nath, B., D. K. Bhattacharyya, and A. Ghosh. "Dimensionality Reduction for Association Rule Mining". International Journal of Intelligent Information Processing 2, no. 1 (31 March 2011): 9–21. http://dx.doi.org/10.4156/ijiip.vol2.issue1.2.

Full text
APA, Harvard, Vancouver, ISO, and other styles