Academic literature on the topic 'Dimensionality reduction analysis'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Dimensionality reduction analysis.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Journal articles on the topic "Dimensionality reduction analysis"

1

Vats, Deepak, and Avinash Sharma. "Dimensionality Reduction Techniques: Comparative Analysis." Journal of Computational and Theoretical Nanoscience 17, no. 6 (June 1, 2020): 2684–88. http://dx.doi.org/10.1166/jctn.2020.8967.

Abstract:
Real-world data have grown exponentially in dimensionality. Examples of high-dimensional data include speech signals, sensor data, medical data, criminal records, and data from recommendation systems in fields such as news, movies (Netflix), and e-commerce. To improve learning accuracy in machine learning and enhance mining performance, redundant and irrelevant features must be removed from these high-dimensional datasets. Many supervised and unsupervised dimension-reduction methodologies exist in the literature. The objective of this paper is to present the most prominent of these methodologies and to highlight the advantages and disadvantages of each algorithm, providing a starting point for newcomers to the field.
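
For readers new to the area, the flavor of these techniques is easy to demonstrate. The following minimal sketch (our illustration, not code from the paper; NumPy and scikit-learn assumed) reduces a synthetic 100-dimensional dataset with PCA, the most common baseline among the methods such surveys compare:

```python
# Minimal PCA sketch (illustrative only; assumes NumPy and scikit-learn).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 100))              # 500 samples, 100 features

pca = PCA(n_components=10)                   # keep the 10 strongest directions
X_reduced = pca.fit_transform(X)

print(X_reduced.shape)                       # (500, 10)
print(pca.explained_variance_ratio_.sum())   # fraction of variance retained
```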
2

Xie, Fuding, Yutao Fan, and Ming Zhou. "Dimensionality Reduction by Weighted Connections between Neighborhoods." Abstract and Applied Analysis 2014 (2014): 1–5. http://dx.doi.org/10.1155/2014/928136.

Abstract:
Dimensionality reduction is the transformation of high-dimensional data into a meaningful representation of reduced dimensionality. This paper introduces a dimensionality reduction technique based on weighted connections between neighborhoods that improves the K-Isomap method, attempting to perfectly preserve the relationships between neighborhoods during reduction. The validity of the proposal is tested on three typical examples widely employed in manifold-based algorithms. The experimental results show that the proposed method preserves the local topology of the dataset well while transforming it from high-dimensional space into a low-dimensional one.
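
The paper's weighted-neighborhood variant is not available in standard libraries, but the plain Isomap baseline it improves on can be sketched as follows (illustrative only, assuming scikit-learn; the Swiss roll is one of the typical manifold examples such experiments use):

```python
# Baseline Isomap on the Swiss-roll manifold (not the paper's weighted variant).
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

X, _ = make_swiss_roll(n_samples=1000, random_state=0)   # 3-D manifold data
embedding = Isomap(n_neighbors=10, n_components=2).fit_transform(X)
print(embedding.shape)                                   # (1000, 2)
```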
3

Fujiwara, Takanori, Xinhai Wei, Jian Zhao, and Kwan-Liu Ma. "Interactive Dimensionality Reduction for Comparative Analysis." IEEE Transactions on Visualization and Computer Graphics 28, no. 1 (January 2022): 758–68. http://dx.doi.org/10.1109/tvcg.2021.3114807.

4

Masram, M. S., and T. Diwan. "Microblog Dimensionality Reduction With Semantic Analysis." International Journal of Computer Sciences and Engineering 6, no. 1 (January 31, 2018): 342–46. http://dx.doi.org/10.26438/ijcse/v6i1.342346.

5

Liang, Zhizheng, Shixiong Xia, and Yong Zhou. "Normalized discriminant analysis for dimensionality reduction." Neurocomputing 110 (June 2013): 153–59. http://dx.doi.org/10.1016/j.neucom.2012.12.007.

6

Schott, James R. "Dimensionality reduction in quadratic discriminant analysis." Computational Statistics & Data Analysis 16, no. 2 (August 1993): 161–74. http://dx.doi.org/10.1016/0167-9473(93)90111-6.

7

Kumar, Aswani. "Analysis of unsupervised dimensionality reduction techniques." Computer Science and Information Systems 6, no. 2 (2009): 217–27. http://dx.doi.org/10.2298/csis0902217k.

Abstract:
Domains such as text and images contain large amounts of redundancy and ambiguity among attributes, resulting in considerable noise effects (i.e., the data are high-dimensional). Retrieving data from high-dimensional datasets is a major challenge. Dimensionality reduction techniques have been a successful avenue for automatically extracting latent concepts by removing noise and reducing the complexity of processing high-dimensional data. In this paper we conduct a systematic study comparing unsupervised dimensionality reduction techniques for the text retrieval task. We analyze these techniques in terms of complexity, approximation error, and retrieval quality, with experiments on four test document collections.
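
A typical representative of the unsupervised techniques compared in such studies is latent semantic analysis, i.e., a truncated SVD of a TF-IDF document-term matrix. A minimal sketch (our own, assuming scikit-learn; the toy corpus is hypothetical):

```python
# Latent semantic analysis sketch: TF-IDF followed by truncated SVD.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

docs = ["dimensionality reduction for text retrieval",
        "retrieving documents from high dimensional collections",
        "noise and redundancy in text data"]

tfidf = TfidfVectorizer().fit_transform(docs)       # sparse document-term matrix
lsa = TruncatedSVD(n_components=2, random_state=0)  # extract 2 latent concepts
concepts = lsa.fit_transform(tfidf)
print(concepts.shape)                               # (3, 2)
```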
8

Ngo, T. T., M. Bellalij, and Y. Saad. "The Trace Ratio Optimization Problem for Dimensionality Reduction." SIAM Journal on Matrix Analysis and Applications 31, no. 5 (January 2010): 2950–71. http://dx.doi.org/10.1137/090776603.

9

Wang, Shanshan, Lan Bai, Xu Chen, Zhen Wang, and Yuan-Hai Shao. "Divergent Projection Analysis for Unsupervised Dimensionality Reduction." Procedia Computer Science 199 (2022): 384–91. http://dx.doi.org/10.1016/j.procs.2022.01.047.

10

Yuan, Sen, Xia Mao, and Lijiang Chen. "Multilinear Spatial Discriminant Analysis for Dimensionality Reduction." IEEE Transactions on Image Processing 26, no. 6 (June 2017): 2669–81. http://dx.doi.org/10.1109/tip.2017.2685343.


Dissertations / Theses on the topic "Dimensionality reduction analysis"

1

Khosla, Nitin. "Dimensionality Reduction Using Factor Analysis." Griffith University, School of Engineering, 2006. http://www4.gu.edu.au:8080/adt-root/public/adt-QGU20061010.151217.

Abstract:
In many pattern recognition applications, a large number of features are extracted in order to ensure accurate classification of unknown classes. One way to address the problems of high dimensionality is to first reduce the dimensionality of the data to a manageable size, keeping as much of the original information as possible, and then feed the reduced-dimensional data into a pattern recognition system; dimensionality reduction thus becomes the pre-processing stage of the pattern recognition system. In addition, probability density estimation is simpler with fewer variables. Dimensionality reduction is useful in speech recognition, data compression, visualization, and exploratory data analysis. Techniques that can be used for dimensionality reduction include Factor Analysis (FA), Principal Component Analysis (PCA), and Linear Discriminant Analysis (LDA). Factor Analysis can be considered an extension of Principal Component Analysis. The EM (expectation-maximization) algorithm is ideally suited to problems of this sort, in that it produces maximum-likelihood (ML) estimates of the parameters when there is a many-to-one mapping from an underlying distribution to the distribution governing the observations, conditioned upon the observations; the maximization step then provides a new estimate of the parameters. This research compares Factor Analysis (based on the EM algorithm), Principal Component Analysis, and Linear Discriminant Analysis for dimensionality reduction, and investigates Local Factor Analysis (EM-based) and Local Principal Component Analysis using Vector Quantization.
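
The two central techniques compared in this thesis can be run side by side with off-the-shelf tools. The sketch below is illustrative only; it assumes scikit-learn, whose FactorAnalysis is fitted by maximum likelihood and is not the thesis's specific EM implementation:

```python
# Factor Analysis vs. PCA on the same synthetic data (illustrative only).
import numpy as np
from sklearn.decomposition import FactorAnalysis, PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 20))        # 300 samples, 20 features

Z_fa = FactorAnalysis(n_components=5, random_state=0).fit_transform(X)
Z_pca = PCA(n_components=5).fit_transform(X)
print(Z_fa.shape, Z_pca.shape)        # (300, 5) (300, 5)
```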
2

Vamulapalli, Harika Rao. "On Dimensionality Reduction of Data." ScholarWorks@UNO, 2010. http://scholarworks.uno.edu/td/1211.

Abstract:
The random projection method is an important tool for the dimensionality reduction of data, and it can be made efficient with strong error guarantees. In this thesis, we focus on linear transforms of high-dimensional data to a low-dimensional space satisfying the Johnson-Lindenstrauss lemma. In addition, we prove some theoretical results relating to the projections that are of interest in practical applications. We show how the technique can be applied to synthetic data with a probabilistic guarantee on the pairwise distances. The connection between dimensionality reduction and compressed sensing is also discussed.
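
The core idea can be sketched with standard tools: pick a target dimension from the Johnson-Lindenstrauss bound, then apply a Gaussian random projection. This is our own illustration under stated assumptions (scikit-learn), not code from the thesis:

```python
# Random projection with a JL-derived target dimension (illustrative only).
import numpy as np
from sklearn.random_projection import (GaussianRandomProjection,
                                       johnson_lindenstrauss_min_dim)

n_samples, eps = 1000, 0.25
k = johnson_lindenstrauss_min_dim(n_samples=n_samples, eps=eps)
print(k)   # minimum target dimension preserving pairwise distances within eps

X = np.random.default_rng(0).normal(size=(n_samples, 5000))
X_proj = GaussianRandomProjection(n_components=k, random_state=0).fit_transform(X)
print(X_proj.shape)   # (1000, k), with k far below 5000
```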
3

Vasiloglou, Nikolaos. "Isometry and convexity in dimensionality reduction." Diss., Atlanta, Ga. : Georgia Institute of Technology, 2009. http://hdl.handle.net/1853/28120.

Abstract:
Thesis (M. S.)--Electrical and Computer Engineering, Georgia Institute of Technology, 2009.
Committee Chair: David Anderson; Committee Co-Chair: Alexander Gray; Committee Member: Anthony Yezzi; Committee Member: Hongyuan Zha; Committee Member: Justin Romberg; Committee Member: Ronald Schafer.
4

Ross, Ian. "Nonlinear dimensionality reduction methods in climate data analysis." Thesis, University of Bristol, 2008. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.492479.

Abstract:
Linear dimensionality reduction techniques, notably principal component analysis, are widely used in climate data analysis as a means to aid in the interpretation of datasets of high dimensionality. These linear methods may not be appropriate for the analysis of data arising from nonlinear processes occurring in the climate system. Numerous techniques for nonlinear dimensionality reduction have been developed recently that may provide a potentially useful tool for the identification of low-dimensional manifolds in climate datasets arising from nonlinear dynamics. In this thesis I apply three such techniques to the study of El Niño/Southern Oscillation variability in tropical Pacific sea surface temperatures and thermocline depth, comparing observational data with simulations from coupled atmosphere-ocean general circulation models from the CMIP3 multi-model ensemble.
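
In climate data analysis the linear baseline is usually EOF analysis, which is PCA applied to a time-by-grid-point data matrix; a nonlinear method can be swapped in on the same matrix. A hedged sketch on synthetic data (not the thesis's code; scikit-learn assumed):

```python
# EOF analysis sketch: PCA on a (time x grid-point) field (synthetic data).
import numpy as np
from sklearn.decomposition import PCA

field = np.random.default_rng(0).normal(size=(240, 1500))  # 240 months, 1500 cells
pca = PCA(n_components=3).fit(field)
pcs = pca.transform(field)        # principal-component time series
print(pcs.shape)                  # (240, 3)
print(pca.components_.shape)      # (3, 1500): the spatial EOF patterns
```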
5

Ray, Sujan. "Dimensionality Reduction in Healthcare Data Analysis on Cloud Platform." University of Cincinnati / OhioLINK, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=ucin161375080072697.

6

Di, Ciaccio Lucio. "Feature selection and dimensionality reduction for supervised data analysis." Thesis, Massachusetts Institute of Technology, 2016. https://hdl.handle.net/1721.1/122827.

Thesis: S.M., Massachusetts Institute of Technology, Department of Aeronautics and Astronautics, 2016. Cataloged from the PDF version of the thesis. Includes bibliographical references (pages 103-106).
7

Coleman, Ashley B. "Feature Extraction using Dimensionality Reduction Techniques: Capturing the Human Perspective." Wright State University / OhioLINK, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=wright1452775165.

8

Hui, Shirley. "FlexSADRA: Flexible Structural Alignment using a Dimensionality Reduction Approach." Thesis, University of Waterloo, 2005. http://hdl.handle.net/10012/1173.

Abstract:
A topic of research that is frequently studied in Structural Biology is the problem of determining the degree of similarity between two protein structures. The most common solution is to perform a three-dimensional structural alignment of the two structures. Rigid structural alignment algorithms have been developed in the past to accomplish this, but they treat the protein molecules as immutable structures. Since protein structures can bend and flex, rigid algorithms do not yield accurate results, and as a consequence flexible structural alignment algorithms have been developed. The problem with these algorithms is that the protein structures are represented using thousands of atomic coordinate variables, which results in a great computational burden due to the large number of degrees of freedom required to account for the flexibility. Past research in dimensionality reduction has shown that a linear technique called Principal Component Analysis (PCA) is well suited to high-dimensionality reduction. This thesis introduces a new flexible structural alignment algorithm called FlexSADRA, which uses PCA to perform flexible structural alignments. Test results show that FlexSADRA determines better alignments than rigid structural alignment algorithms. Unlike existing rigid and flexible algorithms, FlexSADRA addresses the problem in a significantly lower-dimensional problem space and assesses not only the structural fit but the structural feasibility of the final alignment.
9

Zhang, Yuyao. "Non-linear dimensionality reduction and sparse representation models for facial analysis." Thesis, Lyon, INSA, 2014. http://www.theses.fr/2014ISAL0019/document.

Abstract:
Face analysis techniques commonly require a proper representation of images by means of dimensionality reduction, leading to embedded manifolds that aim to capture the relevant characteristics of the signals. In this thesis, we first provide a comprehensive survey of the state of the art in embedded manifold models. Then, we introduce a novel non-linear embedding method, Kernel Similarity Principal Component Analysis (KS-PCA), into Active Appearance Models, in order to model face appearance under variable illumination. The proposed algorithm successfully outperforms the traditional linear PCA transform in capturing the salient features generated by different illuminations, and reconstructs the illuminated faces with high accuracy. We also consider the problem of automatically classifying human face poses from face views with varying illumination, as well as occlusion and noise. Based on sparse representation methods, we propose two dictionary-learning frameworks for this pose classification problem. The first framework is Adaptive Sparse Representation pose Classification (ASRC). It trains the dictionary via a linear model called Incremental Principal Component Analysis (Incremental PCA), tending to decrease the intra-class redundancy that may affect classification performance while keeping the inter-class redundancy that is critical for sparse representation. The other proposed framework is the Dictionary-Learning Sparse Representation model (DLSR), which learns the dictionary so as to coincide with the classification criterion; this training goal is achieved by the K-SVD algorithm. In a series of experiments, we show the performance of the two dictionary-learning methods, based respectively on a linear transform and a sparse representation model. In addition, we propose a novel Dictionary Learning framework for Illumination Normalization (DL-IN), based on sparse representation in terms of coupled dictionaries. The dictionary pairs are jointly optimized from normally illuminated and irregularly illuminated face image pairs. We further utilize a Gaussian Mixture Model (GMM) to enhance the framework's capability of modeling data under complex distributions; the GMM adapts each component to a part of the samples and then fuses them together. Experimental results demonstrate the effectiveness of sparsity as a prior for patch-based illumination normalization of face images.
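
KS-PCA is the thesis's own method; the nearest off-the-shelf nonlinear analogue is standard kernel PCA, sketched here for illustration only (scikit-learn assumed):

```python
# Kernel PCA sketch: an RBF kernel unfolds data that linear PCA cannot.
from sklearn.datasets import make_circles
from sklearn.decomposition import KernelPCA

X, _ = make_circles(n_samples=400, factor=0.3, noise=0.05, random_state=0)
X_kpca = KernelPCA(n_components=2, kernel='rbf', gamma=10).fit_transform(X)
print(X_kpca.shape)   # (400, 2); the two circles become linearly separable
```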
10

Moraes, Lailson Bandeira de. "Two-dimensional extensions of semi-supervised dimensionality reduction methods." Universidade Federal de Pernambuco, 2013. https://repositorio.ufpe.br/handle/123456789/12388.

Abstract:
An important pre-processing step in machine learning systems is dimensionality reduction, which aims to produce compact representations of high-dimensional patterns. In computer vision applications, these patterns are typically images, which are represented by two-dimensional matrices. However, traditional dimensionality reduction techniques were designed to work only with vectors, which makes them a suboptimal choice for processing two-dimensional data. Another problem with traditional approaches is that they operate in either a fully unsupervised or a fully supervised way, which limits their efficiency in scenarios where supervised information is available for only a subset of the data. These situations are increasingly common, because in many modern applications it is easy to produce raw data but usually difficult to label it. In this study, we propose three dimensionality reduction methods that can overcome these limitations: Two-dimensional Semi-supervised Dimensionality Reduction (2D-SSDR), Two-dimensional Discriminant Principal Component Analysis (2D-DPCA), and Two-dimensional Semi-supervised Local Fisher Discriminant Analysis (2D-SELF). They work directly with two-dimensional data and can also take advantage of supervised information even if it is available for only a small part of the dataset. In addition, a fully supervised method, Two-dimensional Local Fisher Discriminant Analysis (2D-LFDA), is proposed. The methods are defined in terms of a two-dimensional framework, also created in this study, which can generally describe scatter-based methods for dimensionality reduction and can be used to derive other two-dimensional methods in the future. Experimental results showed that, as expected, the novel methods are faster and more stable than existing ones. Furthermore, 2D-SSDR, 2D-SELF, and 2D-LFDA achieved competitive classification accuracies most of the time when compared to traditional methods. These three techniques can therefore be seen as viable alternatives to existing dimensionality reduction methods.
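
The proposed methods extend classical 2DPCA, which operates on image matrices directly rather than on flattened vectors. Below is a minimal NumPy sketch of plain unsupervised 2DPCA (our illustration; the thesis's semi-supervised variants add label information on top of this idea):

```python
# Classical 2DPCA sketch: project image matrices with a shared column basis.
import numpy as np

def two_d_pca(images, d):
    """images: array of shape (N, m, n); returns an (n, d) projection matrix."""
    centered = images - images.mean(axis=0)
    # Image covariance: average of (A - mean)^T (A - mean) over all images.
    G = np.einsum('kij,kil->jl', centered, centered) / len(images)
    _, eigvecs = np.linalg.eigh(G)   # eigenvalues in ascending order
    return eigvecs[:, -d:]           # top-d eigenvectors

rng = np.random.default_rng(0)
imgs = rng.normal(size=(100, 32, 32))   # 100 synthetic 32x32 "images"
W = two_d_pca(imgs, d=4)
features = imgs @ W                     # each image becomes a 32x4 matrix
print(features.shape)                   # (100, 32, 4)
```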

Books on the topic "Dimensionality reduction analysis"

1

Kramer, Oliver. Dimensionality Reduction with Unsupervised Nearest Neighbors. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013.

2

Kirby, Michael. Geometric Data Analysis: An Empirical Approach to Dimensionality Reduction and the Study of Patterns. New York: Wiley, 2001.

3

Kramer, Oliver. Dimensionality Reduction with Unsupervised Nearest Neighbors. Springer Berlin / Heidelberg, 2013.

4

Carreira-Perpinan, Miguel A. Dimensionality Reduction (Chapman & Hall/CRC Computer Science & Data Analysis). Chapman & Hall/CRC, 2009.

5

Kramer, Oliver. Dimensionality Reduction with Unsupervised Nearest Neighbors. Springer, 2016.

6

Popov, Valentin L., and Markus Heß. Method of Dimensionality Reduction in Contact Mechanics and Friction. Springer Berlin / Heidelberg, 2016.

7

Popov, Valentin L., and Markus Heß. Method of Dimensionality Reduction in Contact Mechanics and Friction. Springer, 2014.

8

Kirby, Michael. Geometric Data Analysis: An Empirical Approach to Dimensionality Reduction and the Study of Patterns. Wiley & Sons, Incorporated, John, 2008.

9

Kirby, Michael. Geometric Data Analysis: An Empirical Approach to Dimensionality Reduction and the Study of Patterns. Wiley-Interscience, 2000.


Book chapters on the topic "Dimensionality reduction analysis"

1

Durstewitz, Daniel. "Dimensionality Reduction." In Advanced Data Analysis in Neuroscience, 105–19. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-59976-2_6.

2

Phillips, Jeff M. "Dimensionality Reduction." In Mathematical Foundations for Data Analysis, 143–76. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-62341-8_7.

3

Gisbrecht, Andrej, Daniela Hofmann, and Barbara Hammer. "Discriminative Dimensionality Reduction Mappings." In Advances in Intelligent Data Analysis XI, 126–38. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-34156-4_13.

4

Hung, Chih-Cheng, Enmin Song, and Yihua Lan. "Dimensionality Reduction and Sparse Representation." In Image Texture Analysis, 103–27. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-13773-1_4.

5

Andersson, Fredrik, and Jens Nilsson. "Nonlinear Dimensionality Reduction Using Circuit Models." In Image Analysis, 950–59. Berlin, Heidelberg: Springer Berlin Heidelberg, 2005. http://dx.doi.org/10.1007/11499145_96.

6

Campadelli, Paola, Elena Casiraghi, and Claudio Ceruti. "Neighborhood Selection for Dimensionality Reduction." In Image Analysis and Processing — ICIAP 2015, 183–91. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-23231-7_17.

7

Williamson, Len. "Persistent Homology for Dimensionality Reduction." In Reinforcement Learning Algorithms: Analysis and Applications, 97–105. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-41188-6_9.

8

Li, Zuoling, and Guirong Weng. "GCM Data Analysis Using Dimensionality Reduction." In Advances in Computer Science and Education, 217–22. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-27945-4_34.

9

Moore, Jason H., and Peter C. Andrews. "Epistasis Analysis Using Multifactor Dimensionality Reduction." In Methods in Molecular Biology, 301–14. New York, NY: Springer New York, 2014. http://dx.doi.org/10.1007/978-1-4939-2155-3_16.

10

Tripathy, B. K., S. Anveshrithaa, and Shrusti Ghela. "Comparative Analysis of Dimensionality Reduction Techniques." In Unsupervised Learning Approaches for Dimensionality Reduction and Data Visualization, 137–49. Boca Raton: CRC Press, 2021. http://dx.doi.org/10.1201/9781003190554-14.


Conference papers on the topic "Dimensionality reduction analysis"

1

Underhill, David G., Luke K. McDowell, David J. Marchette, and Jeffrey L. Solka. "Enhancing Text Analysis via Dimensionality Reduction." In 2007 IEEE International Conference on Information Reuse and Integration. IEEE, 2007. http://dx.doi.org/10.1109/iri.2007.4296645.

2

Zhang, Lei, Peipei Peng, Xuezhi Xiang, and Xiantong Zhen. "Dimensionality reduction by supervised locality analysis." In 2015 IEEE International Conference on Image Processing (ICIP). IEEE, 2015. http://dx.doi.org/10.1109/icip.2015.7351048.

3

Zheng, Feng, Na Chen, and Luoqing Li. "Semi-supervised Laplacian eigenmaps for dimensionality reduction." In 2008 International Conference on Wavelet Analysis and Pattern Recognition (ICWAPR). IEEE, 2008. http://dx.doi.org/10.1109/icwapr.2008.4635894.

4

Sugiyama, Masashi. "Local Fisher discriminant analysis for supervised dimensionality reduction." In Proceedings of the 23rd International Conference on Machine Learning (ICML '06). New York, New York, USA: ACM Press, 2006. http://dx.doi.org/10.1145/1143844.1143958.

5

Vo, Nhat, Duc Vo, Subhash Challa, and Bill Moran. "Parametric subspace analysis for dimensionality reduction and classification." In 2009 IEEE Symposium on Computational Intelligence and Data Mining (CIDM). IEEE, 2009. http://dx.doi.org/10.1109/cidm.2009.4938672.

6

Narwane, Swati V., and Sudhir D. Sawarkar. "Dimensionality Reduction of Unbalanced Datasets: Principal Component Analysis." In 2021 Asian Conference on Innovation in Technology (ASIANCON). IEEE, 2021. http://dx.doi.org/10.1109/asiancon51346.2021.9544971.

7

Migenda, Nico, and Wolfram Schenck. "Adaptive Dimensionality Reduction for Local Principal Component Analysis." In 2020 25th IEEE International Conference on Emerging Technologies and Factory Automation (ETFA). IEEE, 2020. http://dx.doi.org/10.1109/etfa46521.2020.9212129.

8

Kazemipour, Abbas, and Shaul Druckmann. "Nonlinear Dimensionality Reduction Via Polynomial Principal Component Analysis." In 2018 IEEE Global Conference on Signal and Information Processing (GlobalSIP). IEEE, 2018. http://dx.doi.org/10.1109/globalsip.2018.8646515.

9

Peng, Jing, Stefan Robila, Wei Fan, and Guna Seetharaman. "Analysis of Chernoff criterion for linear dimensionality reduction." In 2010 IEEE International Conference on Systems, Man and Cybernetics - SMC. IEEE, 2010. http://dx.doi.org/10.1109/icsmc.2010.5641971.


Reports on the topic "Dimensionality reduction analysis"

1

Bucholtz, Frank, Jonathan M. Nichols, Michael D. Duncan, and Leslie N. Smith. The Feasibility of Nonlinear Dimensionality Reduction for the Rapid Analysis of Persistent Surveillance Data, including the Detection of IED Placement Activity. Fort Belvoir, VA: Defense Technical Information Center, October 2008. http://dx.doi.org/10.21236/ada488142.
