Academic literature on the topic "Dimensionality reduction"

Create an accurate citation in APA, MLA, Chicago, Harvard, and other styles

Choose a source type:

Consult the topical lists of articles, books, theses, conference proceedings, and other academic sources on the topic "Dimensionality reduction".

Next to each source in the reference list there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Dimensionality reduction"

1. Cheng, Long, Chenyu You, and Yani Guan. "Random Projections for Non-linear Dimensionality Reduction". International Journal of Machine Learning and Computing 6, no. 4 (August 2016): 220–25. http://dx.doi.org/10.18178/ijmlc.2016.6.4.601.
2. Marchette, David J., and Wendy L. Poston. "Local dimensionality reduction". Computational Statistics 14, no. 4 (September 12, 1999): 469–89. http://dx.doi.org/10.1007/s001800050026.
3. Sun, Yu-Yin, Michael Ng, and Zhi-Hua Zhou. "Multi-Instance Dimensionality Reduction". Proceedings of the AAAI Conference on Artificial Intelligence 24, no. 1 (July 3, 2010): 587–92. http://dx.doi.org/10.1609/aaai.v24i1.7700.

Abstract
Multi-instance learning deals with problems that treat bags of instances as training examples. In single-instance learning problems, dimensionality reduction is an essential step for high-dimensional data analysis and has been studied for years. The curse of dimensionality also exists in multi-instance learning tasks, yet this difficult problem has not been studied before. Direct application of existing single-instance dimensionality reduction objectives to multi-instance learning tasks may not work well, since it ignores the characteristic of multi-instance learning that the labels of bags are known while the labels of instances are unknown. In this paper, we propose an effective model and develop an efficient algorithm to solve the multi-instance dimensionality reduction problem. We formulate the objective as an optimization problem by considering orthonormality and sparsity constraints in the projection matrix for dimensionality reduction, and then solve it by gradient descent along the tangent space of the orthonormal matrices. We also propose an approximation for improving the efficiency. Experimental results validate the effectiveness of the proposed method.
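The optimization this abstract describes (learning a projection matrix under an orthonormality constraint by gradient steps that respect the manifold of orthonormal matrices) can be sketched in a simplified form. This is illustrative only, not the paper's method: it maximizes a generic variance-style objective trace(WᵀCW) rather than the multi-instance loss, and uses a QR retraction in place of explicit tangent-space descent.

```python
import numpy as np

def orthonormal_projection(C, k, steps=200, lr=0.1, seed=0):
    """Maximize trace(W.T @ C @ W) over d x k orthonormal W by
    gradient ascent followed by a QR retraction onto the constraint set."""
    rng = np.random.default_rng(seed)
    d = C.shape[0]
    W, _ = np.linalg.qr(rng.standard_normal((d, k)))  # random orthonormal start
    for _ in range(steps):
        G = 2.0 * C @ W                    # Euclidean gradient of trace(W.T C W)
        W, _ = np.linalg.qr(W + lr * G)    # step, then retract to orthonormality
    return W

# hypothetical toy data: 200 points in 5 dimensions with decaying variances
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5)) * np.array([3.0, 2.0, 1.0, 0.5, 0.1])
C = np.cov(X, rowvar=False)
W = orthonormal_projection(C, k=2)
print(np.allclose(W.T @ W, np.eye(2), atol=1e-8))  # columns stay orthonormal
```

Each iteration is subspace iteration on (I + 2·lr·C), so the columns of W converge to the dominant invariant subspace while the QR step keeps them exactly orthonormal.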
4. Koren, Y., and L. Carmel. "Robust linear dimensionality reduction". IEEE Transactions on Visualization and Computer Graphics 10, no. 4 (July 2004): 459–70. http://dx.doi.org/10.1109/tvcg.2004.17.
5. Lotlikar, R., and R. Kothari. "Fractional-step dimensionality reduction". IEEE Transactions on Pattern Analysis and Machine Intelligence 22, no. 6 (June 2000): 623–27. http://dx.doi.org/10.1109/34.862200.
6. Gottlieb, Lee-Ad, Aryeh Kontorovich, and Robert Krauthgamer. "Adaptive metric dimensionality reduction". Theoretical Computer Science 620 (March 2016): 105–18. http://dx.doi.org/10.1016/j.tcs.2015.10.040.
7. Pang, Rich, Benjamin J. Lansdell, and Adrienne L. Fairhall. "Dimensionality reduction in neuroscience". Current Biology 26, no. 14 (July 2016): R656–R660. http://dx.doi.org/10.1016/j.cub.2016.05.029.
8. Lovaglio, Pietro Giorgio, and Giorgio Vittadini. "Multilevel dimensionality-reduction methods". Statistical Methods & Applications 22, no. 2 (September 27, 2012): 183–207. http://dx.doi.org/10.1007/s10260-012-0215-2.
9. Carter, Kevin, Raviv Raich, William Finn, and Alfred Hero III. "Information-Geometric Dimensionality Reduction". IEEE Signal Processing Magazine 28, no. 2 (March 2011): 89–99. http://dx.doi.org/10.1109/msp.2010.939536.
10. Gonen, Mehmet. "Bayesian Supervised Dimensionality Reduction". IEEE Transactions on Cybernetics 43, no. 6 (December 2013): 2179–89. http://dx.doi.org/10.1109/tcyb.2013.2245321.

Theses on the topic "Dimensionality reduction"

1. Ariu, Kaito. "Online Dimensionality Reduction". Licentiate thesis, KTH, Reglerteknik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-290791.

Abstract
In this thesis, we investigate online dimensionality reduction methods, where the algorithms learn by sequentially acquiring data. We focus on two specific algorithm design problems in (i) recommender systems and (ii) heterogeneous clustering from binary user feedback. (i) For recommender systems, we consider a system consisting of m users and n items. In each round, a user, selected uniformly at random, arrives to the system and requests a recommendation. The algorithm observes the user id and recommends an item from the item set. A notable restriction here is that the same item cannot be recommended to the same user more than once, a constraint referred to as a no-repetition constraint. We study this problem as a variant of the multi-armed bandit problem and analyze regret under the various structures pertaining to items and users. We devise fundamental limits of regret and algorithms that can achieve the limits order-wise. The analysis explicitly highlights the importance of each component of regret: for example, we can distinguish the regret due to the no-repetition constraint, the regret generated to learn the statistics of a user's preference for an item, and the regret generated to learn the low-dimensional space of the users and items. (ii) In the clustering with binary feedback problem, the objective is to classify items solely based on limited user feedback. More precisely, users are just asked simple questions with binary answers. A notable difficulty stems from the heterogeneity in the difficulty of classifying the various items (some items require more feedback to be classified than others). For this problem, we derive fundamental limits of the cluster recovery rates for both offline and online algorithms. For the offline setting, we devise a simple algorithm that achieves the limit order-wise. For the online setting, we propose an algorithm inspired by the lower bound. For both problems, we evaluate the proposed algorithms by inspecting their theoretical guarantees and using numerical experiments performed on synthetic and non-synthetic datasets.
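The no-repetition recommendation setting in the abstract above can be illustrated with a toy simulation. This is not the thesis's algorithm: it is a minimal epsilon-greedy sketch whose only purpose is to show the no-repetition constraint in code, and the reward matrix `means` is a hypothetical example.

```python
import numpy as np

def recommend_no_repeat(means, n_users, horizon, eps=0.1, seed=0):
    """Epsilon-greedy recommender under a no-repetition constraint:
    the same item is never recommended twice to the same user.
    means[u, i] is the (unknown to the learner) reward probability."""
    rng = np.random.default_rng(seed)
    n_items = means.shape[1]
    counts = np.zeros(n_items)             # pulls per item (pooled over users)
    rewards = np.zeros(n_items)            # accumulated reward per item
    seen = [set() for _ in range(n_users)]
    history = []
    for _ in range(horizon):
        u = int(rng.integers(n_users))     # uniformly random arriving user
        allowed = [i for i in range(n_items) if i not in seen[u]]
        if not allowed:
            continue                       # this user exhausted the catalogue
        if rng.random() < eps:
            item = int(rng.choice(allowed))            # explore
        else:                              # exploit: best empirical mean,
            est = np.where(counts > 0,     # optimistic for unseen items
                           rewards / np.maximum(counts, 1), np.inf)
            item = max(allowed, key=lambda i: est[i])
        r = float(rng.random() < means[u, item])
        counts[item] += 1
        rewards[item] += r
        seen[u].add(item)
        history.append((u, item))
    return history

means = np.array([[0.9, 0.2, 0.5], [0.1, 0.8, 0.4]])  # hypothetical rewards
hist = recommend_no_repeat(means, n_users=2, horizon=6)
print(len(hist) == len(set(hist)))  # no (user, item) pair ever repeats
```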
2. Legramanti, Sirio. "Bayesian dimensionality reduction". Doctoral thesis, Università Bocconi, 2021. http://hdl.handle.net/11565/4035711.

Abstract
We are currently witnessing an explosion in the amount of available data. Such growth involves not only the number of data points but also their dimensionality. This poses new challenges to statistical modeling and computations, thus making dimensionality reduction more central than ever. In the present thesis, we provide methodological, computational and theoretical advancements in Bayesian dimensionality reduction via novel structured priors. Namely, we develop a new increasing shrinkage prior and illustrate how it can be employed to discard redundant dimensions in Gaussian factor models. In order to make it usable for larger datasets, we also investigate variational methods for posterior inference under this proposed prior. Beyond traditional models and parameter spaces, we also provide a different take on dimensionality reduction, focusing on community detection in networks. For this purpose, we define a general class of Bayesian nonparametric priors that encompasses existing stochastic block models as special cases and includes promising unexplored options. Our Bayesian approach allows for a natural incorporation of node attributes and facilitates uncertainty quantification as well as model selection.
3. Baldiwala, Aliakbar. "Dimensionality Reduction for Commercial Vehicle Fleet Monitoring". Thesis, Université d'Ottawa / University of Ottawa, 2018. http://hdl.handle.net/10393/38330.

Abstract
A variety of new features have been added to present-day vehicles, such as pre-crash warning, vehicle-to-vehicle communication, semi-autonomous driving systems, telematics, and drive-by-wire. They demand very high bandwidth from in-vehicle networks. The various electronic control units inside the vehicle transmit useful information via automotive multiplexing, which allows sharing information among the intelligent modules of an automotive electronic system. Optimum functionality is achieved by transmitting this data in real time. The high-bandwidth and high-speed requirements can be met either by using multiple buses or by implementing higher bandwidth, but doing so increases the cost of the network and the complexity of the wiring in the vehicle. Another option is to implement a higher-layer protocol that reduces the amount of data transferred by using data reduction (DR) techniques, thus reducing the bandwidth usage. The implementation cost is minimal, as changes are required only in software, not in hardware. In our work, we present a new data reduction algorithm termed the "Comprehensive Data Reduction (CDR)" algorithm. The proposed algorithm is used to minimize the bus utilization of the CAN bus for a future vehicle. The reduction in bus load was achieved by compressing the parameters; thus, more messages and lower-priority messages can be sent efficiently on the CAN bus. The proposed work also presents a performance analysis of the proposed algorithm against the boundary of fifteen compression algorithm and compression area selection algorithms (existing data reduction algorithms). The results of the analysis show that the proposed CDR algorithm provides better data reduction than the earlier algorithms, with promising results in terms of reduction in bus utilization, compression efficiency, and percent peak load of the CAN bus. This reduction in bus utilization permits a larger number of network nodes (ECUs) in the existing system without increasing the overall cost of the system. The proposed algorithm has been developed for the automotive environment, but it can also be used in any application where extensive information is transmitted among various control units via a multiplexing bus.
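The general idea of reducing bus load by sending fewer bits per signal can be illustrated with a deliberately simple delta-encoding scheme. This is not the CDR algorithm from the thesis, only a sketch of the kind of lossless data reduction such protocols build on; the signal values below are hypothetical sensor readings.

```python
def delta_compress(samples, bits=4):
    """Send the first sample in full, then only small signed deltas when
    they fit in `bits` bits, escaping back to full values otherwise."""
    lo, hi = -(1 << (bits - 1)), (1 << (bits - 1)) - 1
    out = [("full", samples[0])]
    prev = samples[0]
    for s in samples[1:]:
        d = s - prev
        out.append(("delta", d) if lo <= d <= hi else ("full", s))
        prev = s
    return out

def delta_decompress(stream):
    """Invert delta_compress: rebuild absolute values from the stream."""
    values, prev = [], 0
    for kind, v in stream:
        prev = prev + v if kind == "delta" else v
        values.append(prev)
    return values

signal = [1000, 1002, 1001, 1005, 1200, 1203]  # hypothetical readings
stream = delta_compress(signal)
print(delta_decompress(stream) == signal)  # lossless round trip
```

Most consecutive readings differ by only a few units, so most entries need just a 4-bit delta instead of a full-width value; the jump to 1200 escapes to a full sample.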
4. Bolelli, Maria Virginia. "Diffusion Maps for Dimensionality Reduction". Master's thesis, Alma Mater Studiorum - Università di Bologna, 2019. http://amslaurea.unibo.it/18246/.

Abstract
In this thesis we present diffusion maps, a framework based on diffusion processes for finding meaningful geometric descriptions of data sets. A diffusion process can be described via an iterative application of the heat kernel, which has two main characteristics: it satisfies a Markov semigroup property, and its level sets encode all geometric features of the space. This process, well known on regular manifolds, has been extended to general data sets by Coifman and Lafon. They define a diffusion kernel starting from the geometric properties of the data and their density properties. This kernel is a compact operator, and the projection on its eigenvectors at different instants of time provides a family of embeddings of a data set into a suitable Euclidean space. The projection on the first eigenvectors naturally leads to a dimensionality reduction algorithm. A numerical implementation is provided on different data sets.
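The pipeline this abstract describes (heat kernel, Markov normalization, spectral embedding) can be sketched as a minimal diffusion-map implementation. The parameter choices (`eps`, `t`) are illustrative assumptions, not values from the thesis.

```python
import numpy as np

def diffusion_map(X, n_components=2, eps=1.0, t=1):
    """Minimal diffusion-map sketch following Coifman and Lafon:
    Gaussian (heat) kernel -> row-normalized Markov matrix -> embedding
    from the leading non-trivial eigenvectors, scaled by eigenvalues^t."""
    # pairwise squared distances and Gaussian kernel
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-D2 / eps)
    P = K / K.sum(axis=1, keepdims=True)   # row-stochastic Markov matrix
    evals, evecs = np.linalg.eig(P)
    order = np.argsort(-evals.real)
    evals, evecs = evals.real[order], evecs.real[:, order]
    # drop the trivial eigenvector (eigenvalue 1, constant direction)
    return (evals[1:n_components + 1] ** t) * evecs[:, 1:n_components + 1]

rng = np.random.default_rng(0)
X = rng.standard_normal((40, 3))  # hypothetical data set
Y = diffusion_map(X, n_components=2)
print(Y.shape)  # (40, 2)
```

Because P is conjugate to a symmetric matrix, its spectrum is real; larger `t` runs the diffusion longer and compresses the embedding toward the slowest-mixing directions.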
5. Khosla, Nitin. "Dimensionality Reduction Using Factor Analysis". Griffith University. School of Engineering, 2006. http://www4.gu.edu.au:8080/adt-root/public/adt-QGU20061010.151217.

Abstract
In many pattern recognition applications, a large number of features are extracted in order to ensure an accurate classification of unknown classes. One way to solve the problems of high dimensions is to first reduce the dimensionality of the data to a manageable size, keeping as much of the original information as possible, and then feed the reduced-dimensional data into a pattern recognition system. In this situation, dimensionality reduction becomes the pre-processing stage of the pattern recognition system. In addition, probability density estimation with fewer variables is a simpler approach to dimensionality reduction. Dimensionality reduction is useful in speech recognition, data compression, visualization and exploratory data analysis. Some of the techniques which can be used for dimensionality reduction are Factor Analysis (FA), Principal Component Analysis (PCA), and Linear Discriminant Analysis (LDA). Factor Analysis can be considered an extension of Principal Component Analysis. The EM (expectation maximization) algorithm is ideally suited to problems of this sort, in that it produces maximum-likelihood (ML) estimates of parameters when there is a many-to-one mapping from an underlying distribution to the distribution governing the observation, conditioned upon the observations. The maximization step then provides a new estimate of the parameters. This research work compares the techniques Factor Analysis (expectation-maximization based), Principal Component Analysis and Linear Discriminant Analysis for dimensionality reduction, and investigates Local Factor Analysis (EM based) and Local Principal Component Analysis using Vector Quantization.
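The EM procedure for maximum-likelihood factor analysis that this abstract compares against PCA can be sketched as follows. This is a minimal EM loop under the standard FA model assumptions (x = Λz + noise, z ~ N(0, I), diagonal noise covariance Ψ), not the thesis's implementation; the synthetic data are hypothetical.

```python
import numpy as np

def factor_analysis_em(X, k, n_iter=100, seed=0):
    """Fit a k-factor model by EM: E-step computes the posterior of the
    latent factors, M-step updates loadings Lam and noise variances Psi."""
    rng = np.random.default_rng(seed)
    N, d = X.shape
    X = X - X.mean(axis=0)                 # work with centred data
    S = X.T @ X / N                        # sample covariance
    Lam = rng.standard_normal((d, k))      # initial loadings
    Psi = np.ones(d)                       # initial diagonal noise variances
    for _ in range(n_iter):
        # E-step: posterior covariance G and means Ez of the factors
        G = np.linalg.inv(np.eye(k) + (Lam.T / Psi) @ Lam)
        beta = (G @ Lam.T) / Psi           # maps a data point to E[z|x]
        Ez = X @ beta.T                    # N x k posterior means
        Ezz = N * G + Ez.T @ Ez            # summed posterior second moments
        # M-step: closed-form updates of loadings and noise variances
        Lam = (X.T @ Ez) @ np.linalg.inv(Ezz)
        Psi = np.maximum(np.diag(S - Lam @ (Ez.T @ X) / N), 1e-6)
    return Lam, Psi

# hypothetical data generated from a 2-factor model in 6 dimensions
rng = np.random.default_rng(1)
Z = rng.standard_normal((500, 2))
W = rng.standard_normal((2, 6))
X = Z @ W + 0.1 * rng.standard_normal((500, 6))
Lam, Psi = factor_analysis_em(X, k=2)
print(Lam.shape, Psi.shape)  # (6, 2) (6,)
```

Unlike PCA, the fitted Ψ lets each observed variable carry its own noise level, which is exactly the extension of PCA the abstract refers to.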
6. Vamulapalli, Harika Rao. "On Dimensionality Reduction of Data". ScholarWorks@UNO, 2010. http://scholarworks.uno.edu/td/1211.

Abstract
The random projection method is one of the important tools for the dimensionality reduction of data, and it can be made efficient with strong error guarantees. In this thesis, we focus on linear transforms of high-dimensional data to a low-dimensional space satisfying the Johnson-Lindenstrauss lemma. In addition, we prove some theoretical results relating to the projections that are of interest in practical applications. We show how the technique can be applied to synthetic data with a probabilistic guarantee on the pairwise distances. The connection between dimensionality reduction and compressed sensing is also discussed.
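The Johnson-Lindenstrauss construction this abstract builds on can be sketched directly: project with a scaled Gaussian random matrix and check the pairwise-distance distortion. The dimensions and the 50% tolerance below are illustrative choices, not values from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 50, 1000, 300
X = rng.standard_normal((n, d))  # hypothetical high-dimensional data

# Gaussian random projection: entries scaled by 1/sqrt(k) so that squared
# norms are preserved in expectation (the Johnson-Lindenstrauss construction)
R = rng.standard_normal((d, k)) / np.sqrt(k)
Y = X @ R

def pairwise_dists(A):
    """All pairwise Euclidean distances between the rows of A."""
    return np.linalg.norm(A[:, None, :] - A[None, :, :], axis=-1)

orig, proj = pairwise_dists(X), pairwise_dists(Y)
mask = ~np.eye(n, dtype=bool)                      # ignore zero diagonal
distortion = np.abs(proj[mask] / orig[mask] - 1.0)
print(distortion.max() < 0.5)  # every pairwise distance kept within 50%
```

With k = 300 the JL bound predicts distortion on the order of sqrt(8 ln n / k) ≈ 0.32 for n = 50 points, so the 50% check passes with large margin; the guarantee is probabilistic over the draw of R.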
7. Widemann, David P. "Dimensionality reduction for hyperspectral data". College Park, Md.: University of Maryland, 2008. http://hdl.handle.net/1903/8448.

Abstract
Thesis (Ph.D.), University of Maryland, College Park, 2008. Thesis research directed by: Dept. of Mathematics. Title from t.p. of PDF. Includes bibliographical references. Published by UMI Dissertation Services, Ann Arbor, Mich. Also available in paper.
8. Khosla, Nitin. "Dimensionality Reduction Using Factor Analysis". Thesis, Griffith University, 2006. http://hdl.handle.net/10072/366058.

Abstract
In many pattern recognition applications, a large number of features are extracted in order to ensure an accurate classification of unknown classes. One way to solve the problems of high dimensions is to first reduce the dimensionality of the data to a manageable size, keeping as much of the original information as possible, and then feed the reduced-dimensional data into a pattern recognition system. In this situation, dimensionality reduction becomes the pre-processing stage of the pattern recognition system. In addition, probability density estimation with fewer variables is a simpler approach to dimensionality reduction. Dimensionality reduction is useful in speech recognition, data compression, visualization and exploratory data analysis. Some of the techniques which can be used for dimensionality reduction are Factor Analysis (FA), Principal Component Analysis (PCA), and Linear Discriminant Analysis (LDA). Factor Analysis can be considered an extension of Principal Component Analysis. The EM (expectation maximization) algorithm is ideally suited to problems of this sort, in that it produces maximum-likelihood (ML) estimates of parameters when there is a many-to-one mapping from an underlying distribution to the distribution governing the observation, conditioned upon the observations. The maximization step then provides a new estimate of the parameters. This research work compares the techniques Factor Analysis (expectation-maximization based), Principal Component Analysis and Linear Discriminant Analysis for dimensionality reduction, and investigates Local Factor Analysis (EM based) and Local Principal Component Analysis using Vector Quantization.
Thesis (Masters), Master of Philosophy (MPhil), School of Engineering.
9. Sætrom, Jon. "Reduction of Dimensionality in Spatiotemporal Models". Doctoral thesis, Norges teknisk-naturvitenskapelige universitet, Institutt for matematiske fag, 2010. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-11247.
10. Ghodsi, Boushehri Ali. "Nonlinear Dimensionality Reduction with Side Information". Thesis, University of Waterloo, 2006. http://hdl.handle.net/10012/1020.

Abstract
In this thesis, I look at three problems with important applications in data processing. Incorporating side information, provided by the user or derived from data, is a main theme of each of these problems.

This thesis makes a number of contributions. The first is a technique for combining different embedding objectives, which is then exploited to incorporate side information expressed in terms of transformation invariants known to hold in the data. It also introduces two different ways of incorporating transformation invariants in order to make new similarity measures. Two algorithms are proposed which learn metrics based on different types of side information. These learned metrics can then be used in subsequent embedding methods. Finally, it introduces a manifold learning algorithm that is useful when applied to sequential decision problems. In this case we are given action labels in addition to data points. Actions in the manifold learned by this algorithm have meaningful representations, in that they are represented as simple transformations.

Books on the topic "Dimensionality reduction"

1. Lee, John A., and Michel Verleysen, eds. Nonlinear Dimensionality Reduction. New York, NY: Springer New York, 2007. http://dx.doi.org/10.1007/978-0-387-39351-3.
2. Lespinats, Sylvain, Benoit Colange, and Denys Dutykh. Nonlinear Dimensionality Reduction Techniques. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-81026-9.
3. Garzon, Max, Ching-Chi Yang, Deepak Venugopal, Nirman Kumar, Kalidas Jana, and Lih-Yuan Deng, eds. Dimensionality Reduction in Data Science. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-05371-9.
4. Paul, Arati, and Nabendu Chaki. Dimensionality Reduction of Hyperspectral Imagery. Cham: Springer International Publishing, 2024. http://dx.doi.org/10.1007/978-3-031-42667-4.
5. Strange, Harry, and Reyer Zwiggelaar. Open Problems in Spectral Dimensionality Reduction. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-03943-5.
6. Kramer, Oliver. Dimensionality Reduction with Unsupervised Nearest Neighbors. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-38652-7.
7. Kramer, Oliver. Dimensionality Reduction with Unsupervised Nearest Neighbors. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013.
8. Shaw, Blake. Graph Embedding and Nonlinear Dimensionality Reduction. [New York, N.Y.?]: [publisher not identified], 2011.
9. Ghojogh, Benyamin, Mark Crowley, Fakhri Karray, and Ali Ghodsi. Elements of Dimensionality Reduction and Manifold Learning. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-10602-6.
10. Wang, Jianzhong. Geometric Structure of High-Dimensional Data and Dimensionality Reduction. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-27497-8.

Book chapters on the topic "Dimensionality reduction"

1. Herrera, Francisco, Francisco Charte, Antonio J. Rivera, and María J. del Jesus. "Dimensionality Reduction". In Multilabel Classification, 115–31. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-41111-8_7.
2. Kramer, Oliver. "Dimensionality Reduction". In Dimensionality Reduction with Unsupervised Nearest Neighbors, 33–52. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-38652-7_4.
3. Hull, Isaiah. "Dimensionality Reduction". In Machine Learning for Economics and Finance in TensorFlow 2, 281–306. Berkeley, CA: Apress, 2020. http://dx.doi.org/10.1007/978-1-4842-6373-0_8.
4. Shen, Heng Tao. "Dimensionality Reduction". In Encyclopedia of Database Systems, 1–2. New York, NY: Springer New York, 2017. http://dx.doi.org/10.1007/978-1-4899-7993-3_551-2.
5. Webb, Geoffrey I., Johannes Fürnkranz, Geoffrey Hinton, Claude Sammut, Joerg Sander, et al. "Dimensionality Reduction". In Encyclopedia of Machine Learning, 274–79. Boston, MA: Springer US, 2011. http://dx.doi.org/10.1007/978-0-387-30164-8_216.
6. Dinov, Ivo D. "Dimensionality Reduction". In Data Science and Predictive Analytics, 233–66. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-72347-1_6.
7. Shen, Heng Tao. "Dimensionality Reduction". In Encyclopedia of Database Systems, 843–46. Boston, MA: Springer US, 2009. http://dx.doi.org/10.1007/978-0-387-39940-9_551.
8. Mathar, Rudolf, Gholamreza Alirezaei, Emilio Balda, and Arash Behboodi. "Dimensionality Reduction". In Fundamentals of Data Analytics, 45–67. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-56831-3_4.
9. Durstewitz, Daniel. "Dimensionality Reduction". In Advanced Data Analysis in Neuroscience, 105–19. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-59976-2_6.
10. Braga-Neto, Ulisses. "Dimensionality Reduction". In Fundamentals of Pattern Recognition and Machine Learning, 205–29. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-27656-0_9.

Conference proceedings on the topic "Dimensionality reduction"

1. Bunte, Kerstin, Michael Biehl, and Barbara Hammer. "Dimensionality reduction mappings". In 2011 IEEE Symposium on Computational Intelligence and Data Mining (CIDM), part of 2011 IEEE SSCI. IEEE, 2011. http://dx.doi.org/10.1109/cidm.2011.5949443.
2. Schclar, Alon, and Amir Averbuch. "Diffusion Bases Dimensionality Reduction". In 7th International Conference on Neural Computation Theory and Applications. SCITEPRESS - Science and Technology Publications, 2015. http://dx.doi.org/10.5220/0005625301510156.
3. Bingham, Ella, Aristides Gionis, Niina Haiminen, Heli Hiisilä, Heikki Mannila, and Evimaria Terzi. "Segmentation and dimensionality reduction". In Proceedings of the 2006 SIAM International Conference on Data Mining. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2006. http://dx.doi.org/10.1137/1.9781611972764.33.
4. Zhang, Daoqiang, Zhi-Hua Zhou, and Songcan Chen. "Semi-Supervised Dimensionality Reduction". In Proceedings of the 2007 SIAM International Conference on Data Mining. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2007. http://dx.doi.org/10.1137/1.9781611972771.73.
5. Guo, Ce, and Wayne Luk. "Quantisation-aware Dimensionality Reduction". In 2020 International Conference on Field-Programmable Technology (ICFPT). IEEE, 2020. http://dx.doi.org/10.1109/icfpt51103.2020.00041.
6. Zhu, Xiaofeng, Cong Lei, Hao Yu, Yonggang Li, Jiangzhang Gan, and Shichao Zhang. "Robust Graph Dimensionality Reduction". In Twenty-Seventh International Joint Conference on Artificial Intelligence (IJCAI-18). California: International Joint Conferences on Artificial Intelligence Organization, 2018. http://dx.doi.org/10.24963/ijcai.2018/452.

Abstract
In this paper, we propose conducting Robust Graph Dimensionality Reduction (RGDR) by learning a transformation matrix to map original high-dimensional data into their low-dimensional intrinsic space without the influence of outliers. To do this, we propose simultaneously 1) adaptively learning three variables, i.e., a reverse graph embedding of the original data, a transformation matrix, and a graph matrix preserving the local similarity of the original data in their low-dimensional intrinsic space; and 2) employing robust estimators to avoid outliers in the processes of optimizing these three matrices. As a result, the original data are cleaned by two strategies, i.e., a prediction of the original data based on the three resulting variables and robust estimators, so that the transformation matrix can be learnt from an accurately estimated intrinsic space with the help of the reverse graph embedding and the graph matrix. Moreover, we propose a new optimization algorithm for the resulting objective function and theoretically prove the convergence of our optimization algorithm. Experimental results indicated that our proposed method outperformed all the comparison methods on different classification tasks.
7. Gashler, Mike, and Tony Martinez. "Temporal nonlinear dimensionality reduction". In 2011 International Joint Conference on Neural Networks (IJCNN 2011 - San Jose). IEEE, 2011. http://dx.doi.org/10.1109/ijcnn.2011.6033465.
8. Heylen, Rob, and Paul Scheunders. "Nonlinear barycentric dimensionality reduction". In 2010 17th IEEE International Conference on Image Processing (ICIP 2010). IEEE, 2010. http://dx.doi.org/10.1109/icip.2010.5653675.
9. Mosci, Sofia, Lorenzo Rosasco, and Alessandro Verri. "Dimensionality reduction and generalization". In the 24th international conference. New York, New York, USA: ACM Press, 2007. http://dx.doi.org/10.1145/1273496.1273579.
10. Luo, Xianghui, and Robert J. Durrant. "Maximum Gradient Dimensionality Reduction". In 2018 24th International Conference on Pattern Recognition (ICPR). IEEE, 2018. http://dx.doi.org/10.1109/icpr.2018.8546198.

Reports on the topic "Dimensionality reduction"

1. Jain, Anil K. Classification, Clustering and Dimensionality Reduction. Fort Belvoir, VA: Defense Technical Information Center, July 2008. http://dx.doi.org/10.21236/ada483446.
2. Wolf, Lior, and Stanley Bileschi. Combining Variable Selection with Dimensionality Reduction. Fort Belvoir, VA: Defense Technical Information Center, March 2005. http://dx.doi.org/10.21236/ada454990.
3. Jones, Michael J. Using Recurrent Networks for Dimensionality Reduction. Fort Belvoir, VA: Defense Technical Information Center, September 1992. http://dx.doi.org/10.21236/ada259497.
4. León, Carlos. Detecting anomalous payments networks: A dimensionality reduction approach. Banco de la República de Colombia, December 2019. http://dx.doi.org/10.32468/be.1098.
5. Sarwar, Badrul, George Karypis, Joseph Konstan, and John Riedl. Application of Dimensionality Reduction in Recommender System - A Case Study. Fort Belvoir, VA: Defense Technical Information Center, July 2000. http://dx.doi.org/10.21236/ada439541.
6. Fukumizu, Kenji, Francis R. Bach, and Michael I. Jordan. Dimensionality Reduction for Supervised Learning With Reproducing Kernel Hilbert Spaces. Fort Belvoir, VA: Defense Technical Information Center, May 2003. http://dx.doi.org/10.21236/ada446572.
7. Nichols, Jonathan M., Frank Bucholtz, and Joseph V. Michalowicz. Intelligent Data Fusion Using Sparse Representations and Nonlinear Dimensionality Reduction. Fort Belvoir, VA: Defense Technical Information Center, September 2009. http://dx.doi.org/10.21236/ada507109.
8. Vales, C., Y. Choi, D. Copeland, and S. Cheung. Energy conserving quadrature based dimensionality reduction for nonlinear hydrodynamics problems. Office of Scientific and Technical Information (OSTI), August 2023. http://dx.doi.org/10.2172/1995059.
9. Ho, Tu Bao. Methods of Sparse Modeling and Dimensionality Reduction to Deal with Big Data. Fort Belvoir, VA: Defense Technical Information Center, April 2015. http://dx.doi.org/10.21236/ada623178.
10. Mohan, Anish, Guillermo Sapiro, and Edward Bosch. Spatially-Coherent Non-Linear Dimensionality Reduction and Segmentation of Hyper-Spectral Images (PREPRINT). Fort Belvoir, VA: Defense Technical Information Center, June 2006. http://dx.doi.org/10.21236/ada478496.
