Academic literature on the topic 'Dimensionality reduction'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Dimensionality reduction.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Dimensionality reduction"

1. Cheng, Long, Chenyu You, and Yani Guan. "Random Projections for Non-linear Dimensionality Reduction." International Journal of Machine Learning and Computing 6, no. 4 (August 2016): 220–25. http://dx.doi.org/10.18178/ijmlc.2016.6.4.601.
2. Marchette, David J., and Wendy L. Poston. "Local dimensionality reduction." Computational Statistics 14, no. 4 (September 12, 1999): 469–89. http://dx.doi.org/10.1007/s001800050026.
3. Sun, Yu-Yin, Michael Ng, and Zhi-Hua Zhou. "Multi-Instance Dimensionality Reduction." Proceedings of the AAAI Conference on Artificial Intelligence 24, no. 1 (July 3, 2010): 587–92. http://dx.doi.org/10.1609/aaai.v24i1.7700.

Abstract:
Multi-instance learning deals with problems that treat bags of instances as training examples. In single-instance learning problems, dimensionality reduction is an essential step for high-dimensional data analysis and has been studied for years. The curse of dimensionality also exists in multi-instance learning tasks, yet this difficulty has not been studied before. Direct application of existing single-instance dimensionality reduction objectives to multi-instance learning tasks may not work well, since it ignores the characteristic of multi-instance learning that the labels of bags are known while the labels of instances are unknown. In this paper, we propose an effective model and develop an efficient algorithm to solve the multi-instance dimensionality reduction problem. We formulate the objective as an optimization problem by considering orthonormality and sparsity constraints in the projection matrix for dimensionality reduction, and then solve it by gradient descent along the tangent space of the orthonormal matrices. We also propose an approximation for improving the efficiency. Experimental results validate the effectiveness of the proposed method.
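The optimization step described above, gradient descent along the tangent space of the orthonormal matrices, can be sketched generically. Below is a minimal NumPy illustration of one such step on the Stiefel manifold; the placeholder gradient, step size, and QR retraction are assumptions for illustration, not the paper's exact MIDR procedure.

```python
import numpy as np

def stiefel_step(W, grad, lr=0.1):
    """One gradient step that keeps W orthonormal (W.T @ W = I).

    Projects the Euclidean gradient onto the tangent space of the Stiefel
    manifold, takes a step, then retracts back to the manifold via QR.
    """
    sym = (W.T @ grad + grad.T @ W) / 2.0
    tangent = grad - W @ sym                 # tangent-space projection
    Q, _ = np.linalg.qr(W - lr * tangent)    # QR retraction: re-orthonormalize
    return Q

rng = np.random.default_rng(0)
W = np.linalg.qr(rng.standard_normal((10, 3)))[0]  # random orthonormal start
grad = rng.standard_normal((10, 3))                # stand-in for a real gradient
W = stiefel_step(W, grad)
print(np.allclose(W.T @ W, np.eye(3)))             # True: constraint preserved
```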
4. Koren, Y., and L. Carmel. "Robust linear dimensionality reduction." IEEE Transactions on Visualization and Computer Graphics 10, no. 4 (July 2004): 459–70. http://dx.doi.org/10.1109/tvcg.2004.17.
5. Lotlikar, R., and R. Kothari. "Fractional-step dimensionality reduction." IEEE Transactions on Pattern Analysis and Machine Intelligence 22, no. 6 (June 2000): 623–27. http://dx.doi.org/10.1109/34.862200.
6. Gottlieb, Lee-Ad, Aryeh Kontorovich, and Robert Krauthgamer. "Adaptive metric dimensionality reduction." Theoretical Computer Science 620 (March 2016): 105–18. http://dx.doi.org/10.1016/j.tcs.2015.10.040.
7. Pang, Rich, Benjamin J. Lansdell, and Adrienne L. Fairhall. "Dimensionality reduction in neuroscience." Current Biology 26, no. 14 (July 2016): R656–R660. http://dx.doi.org/10.1016/j.cub.2016.05.029.
8. Lovaglio, Pietro Giorgio, and Giorgio Vittadini. "Multilevel dimensionality-reduction methods." Statistical Methods & Applications 22, no. 2 (September 27, 2012): 183–207. http://dx.doi.org/10.1007/s10260-012-0215-2.
9. Carter, Kevin, Raviv Raich, William Finn, and Alfred Hero III. "Information-Geometric Dimensionality Reduction." IEEE Signal Processing Magazine 28, no. 2 (March 2011): 89–99. http://dx.doi.org/10.1109/msp.2010.939536.
10. Gonen, Mehmet. "Bayesian Supervised Dimensionality Reduction." IEEE Transactions on Cybernetics 43, no. 6 (December 2013): 2179–89. http://dx.doi.org/10.1109/tcyb.2013.2245321.

Dissertations / Theses on the topic "Dimensionality reduction"

1. Ariu, Kaito. "Online Dimensionality Reduction." Licentiate thesis, KTH, Reglerteknik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-290791.

Abstract:
In this thesis, we investigate online dimensionality reduction methods, where the algorithms learn by sequentially acquiring data. We focus on two specific algorithm design problems in (i) recommender systems and (ii) heterogeneous clustering from binary user feedback. (i) For recommender systems, we consider a system consisting of m users and n items. In each round, a user, selected uniformly at random, arrives at the system and requests a recommendation. The algorithm observes the user id and recommends an item from the item set. A notable restriction here is that the same item cannot be recommended to the same user more than once, a constraint referred to as a no-repetition constraint. We study this problem as a variant of the multi-armed bandit problem and analyze regret under the various structures pertaining to items and users. We devise fundamental limits of regret and algorithms that can achieve the limits order-wise. The analysis explicitly highlights the importance of each component of regret. For example, we can distinguish the regret due to the no-repetition constraint, the regret generated while learning the statistics of the users' preferences for items, and the regret generated while learning the low-dimensional space of the users and items. (ii) In the clustering with binary feedback problem, the objective is to classify items solely based on limited user feedback. More precisely, users are just asked simple questions with binary answers. A notable difficulty stems from the heterogeneity in the difficulty of classifying the various items (some items require more feedback to be classified than others). For this problem, we derive fundamental limits of the cluster recovery rates for both offline and online algorithms. For the offline setting, we devise a simple algorithm that achieves the limit order-wise. For the online setting, we propose an algorithm inspired by the lower bound. For both problems, we evaluate the proposed algorithms by inspecting their theoretical guarantees and using numerical experiments performed on synthetic and non-synthetic datasets.
This thesis studies dimensionality reduction algorithms that learn from sequentially acquired data. We focus in particular on questions that arise in the development of recommender systems and in the identification of heterogeneous groups of users from data. For recommender systems, we consider a system with m users and n items. In each round, the algorithm observes a randomly chosen user and recommends an item. An important restriction in our problem formulation is that recommendations may not be repeated: the same item cannot be recommended to the same user more than once. We view the problem as a variant of the multi-armed bandit problem and analyze system performance in terms of "regret" under various assumptions. We derive fundamental limits on regret and propose algorithms that are (order-wise) optimal. An interesting component of our analysis is that we manage to characterize how each of our assumptions affects system performance. For example, we can quantify the performance loss in regret due to recommendations not being repeatable, due to having to learn the statistics of which items a user is interested in, and due to the cost of learning the low-dimensional space of users and items. For the problem of how best to identify groups of users, we derive fundamental limits on how quickly clusters can be identified. We do this both for algorithms that have simultaneous access to all data and for algorithms that must learn through sequential acquisition of data. With access to all data, our algorithm achieves the optimal performance order-wise. When data must be acquired sequentially, we propose an algorithm inspired by the lower bound on what can be achieved. For both problems, we evaluate the proposed algorithms numerically and compare their practical performance with the theoretical guarantees.
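As a rough, self-contained illustration of the recommendation setting above, the toy loop below runs a UCB-style bandit under the no-repetition constraint. The Bernoulli reward model, the exploration bonus, and all sizes are assumptions made for illustration; they are not the thesis's algorithms or guarantees.

```python
import numpy as np

rng = np.random.default_rng(1)
m_users, n_items, horizon = 5, 8, 30
true_pref = rng.random((m_users, n_items))   # unknown to the algorithm
counts = np.ones((m_users, n_items))         # one pseudo-observation per pair
means = np.zeros((m_users, n_items))         # empirical success rates
seen = [set() for _ in range(m_users)]       # enforces no repetition

for t in range(1, horizon + 1):
    u = int(rng.integers(m_users))           # a uniformly random user arrives
    if len(seen[u]) == n_items:
        continue                             # nothing left to recommend
    ucb = means[u] + np.sqrt(2 * np.log(t + 1) / counts[u])
    ucb[list(seen[u])] = -np.inf             # no-repetition constraint
    item = int(np.argmax(ucb))
    reward = float(rng.random() < true_pref[u, item])  # binary feedback
    means[u, item] += (reward - means[u, item]) / counts[u, item]
    counts[u, item] += 1
    seen[u].add(item)
```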

2. Legramanti, Sirio. "Bayesian Dimensionality Reduction." Doctoral thesis, Università Bocconi, 2021. http://hdl.handle.net/11565/4035711.

Abstract:
We are currently witnessing an explosion in the amount of available data. Such growth involves not only the number of data points but also their dimensionality. This poses new challenges to statistical modeling and computations, thus making dimensionality reduction more central than ever. In the present thesis, we provide methodological, computational and theoretical advancements in Bayesian dimensionality reduction via novel structured priors. Namely, we develop a new increasing shrinkage prior and illustrate how it can be employed to discard redundant dimensions in Gaussian factor models. In order to make it usable for larger datasets, we also investigate variational methods for posterior inference under this proposed prior. Beyond traditional models and parameter spaces, we also provide a different take on dimensionality reduction, focusing on community detection in networks. For this purpose, we define a general class of Bayesian nonparametric priors that encompasses existing stochastic block models as special cases and includes promising unexplored options. Our Bayesian approach allows for a natural incorporation of node attributes and facilitates uncertainty quantification as well as model selection.
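To give increasing shrinkage a concrete form, here is a toy NumPy sketch in which the prior variance of each successive column of factor loadings decays multiplicatively, pushing redundant trailing factors toward zero. This is a generic construction for illustration only, not the specific structured prior developed in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)
p, H = 10, 6          # observed dimension, truncated number of factors

# Toy increasing shrinkage: the prior variance of the h-th loadings column
# decays (in expectation) with h, so redundant trailing factors are pushed
# toward zero. (Illustrative only; the thesis develops its own prior.)
col_var = 1.0 / np.cumprod(rng.gamma(3.0, 1.0, size=H))  # multiplicative decay
Lambda = rng.standard_normal((p, H)) * np.sqrt(col_var)  # factor loadings

eta = rng.standard_normal(H)                             # latent factors
x = Lambda @ eta + 0.1 * rng.standard_normal(p)          # one synthetic draw
print(np.round(col_var, 3))   # column variances typically shrink with index
```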
3. Baldiwala, Aliakbar. "Dimensionality Reduction for Commercial Vehicle Fleet Monitoring." Thesis, Université d'Ottawa / University of Ottawa, 2018. http://hdl.handle.net/10393/38330.

Abstract:
A variety of new features have been added to present-day vehicles, such as pre-crash warning, vehicle-to-vehicle communication, semi-autonomous driving systems, telematics, and drive-by-wire. They demand very high bandwidth from in-vehicle networks. The various electronic control units inside a vehicle transmit useful information via automotive multiplexing, which allows information to be shared among the intelligent modules of an automotive electronic system. Optimum functionality is achieved by transmitting this data in real time. The high-bandwidth, high-speed requirement can be met either by using multiple buses or by implementing a higher-bandwidth bus, but doing so increases the cost of the network and the complexity of the vehicle's wiring. Another option is to implement a higher-layer protocol that reduces the amount of data transferred through data reduction (DR) techniques, thus reducing bandwidth usage. The implementation cost is minimal, as changes are required only in software, not in hardware. In our work, we present a new data reduction algorithm termed the Comprehensive Data Reduction (CDR) algorithm. The proposed algorithm is used to minimize the bus utilization of the CAN bus for a future vehicle. The reduction in bus load was achieved by compressing the parameters, so that more messages, including lower-priority messages, can be sent efficiently on the CAN bus. The work also presents a performance analysis of the proposed algorithm against the boundary-of-fifteen compression algorithm and compression-area-selection algorithms (existing data reduction algorithms). The results show that the proposed CDR algorithm provides better data reduction than the earlier algorithms, with promising results in terms of reduced bus utilization, compression efficiency, and percent peak load of the CAN bus. This reduction in bus utilization permits a larger number of network nodes (ECUs) in the existing system without increasing its overall cost. The proposed algorithm was developed for the automotive environment, but it can also be used in any application where extensive information is transmitted among control units via a multiplexing bus.
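As a minimal illustration of the kind of data reduction at stake, the sketch below delta-codes consecutive frames so that only changed bytes are transmitted. It is a hypothetical example, not the proposed CDR algorithm.

```python
def reduce_frame(prev, curr):
    """Return (index, value) pairs for bytes that changed since the last frame.

    An empty list means the frame need not be sent at all. This is a generic
    delta-coding sketch, not the thesis's CDR algorithm.
    """
    return [(i, b) for i, (a, b) in enumerate(zip(prev, curr)) if a != b]

prev = bytes([0x10, 0x20, 0x30, 0x40])   # previously transmitted frame
curr = bytes([0x10, 0x21, 0x30, 0x40])   # new frame: one byte changed
print(reduce_frame(prev, curr))          # [(1, 33)] -> 1 byte instead of 4
```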
4. Bolelli, Maria Virginia. "Diffusion Maps for Dimensionality Reduction." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2019. http://amslaurea.unibo.it/18246/.

Abstract:
In this thesis we present diffusion maps, a framework based on diffusion processes for finding meaningful geometric descriptions of data sets. A diffusion process can be described via an iterative application of the heat kernel, which has two main characteristics: it satisfies a Markov semigroup property, and its level sets encode all geometric features of the space. This process, well known on regular manifolds, has been extended to general data sets by Coifman and Lafon. They define a diffusion kernel starting from the geometric and density properties of the data. This kernel is a compact operator, and the projection on its eigenvectors at different instants of time provides a family of embeddings of a data set into a suitable Euclidean space. The projection on the first eigenvectors naturally leads to a dimensionality reduction algorithm. A numerical implementation is provided on different data sets.
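A minimal implementation of the embedding sketched in this abstract might look as follows. The Gaussian kernel bandwidth eps, the diffusion time t, and the plain row-normalization (which omits Coifman and Lafon's density-normalization variants) are simplifying assumptions.

```python
import numpy as np

def diffusion_map(X, eps=1.0, dim=2, t=1):
    """Minimal diffusion-map embedding in the spirit of Coifman and Lafon.

    Builds a Gaussian affinity, row-normalizes it into a Markov transition
    matrix, and embeds with the leading non-trivial eigenvectors scaled by
    eigenvalues**t.
    """
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise sq. dists
    K = np.exp(-sq / eps)                                # heat kernel
    P = K / K.sum(axis=1, keepdims=True)                 # Markov matrix
    vals, vecs = np.linalg.eig(P)                        # P is not symmetric
    order = np.argsort(-vals.real)                       # sort by eigenvalue
    vals, vecs = vals.real[order], vecs.real[:, order]
    # Skip the trivial eigenvector (eigenvalue 1, constant direction).
    return vecs[:, 1:dim + 1] * (vals[1:dim + 1] ** t)

rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 100)
X = np.c_[np.cos(theta), np.sin(theta)] + 0.05 * rng.standard_normal((100, 2))
Y = diffusion_map(X, eps=0.5)   # 2-D diffusion coordinates of a noisy circle
```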
5. Khosla, Nitin. "Dimensionality Reduction Using Factor Analysis." Griffith University, School of Engineering, 2006. http://www4.gu.edu.au:8080/adt-root/public/adt-QGU20061010.151217.

Abstract:
In many pattern recognition applications, a large number of features are extracted in order to ensure an accurate classification of unknown classes. One way to solve the problems of high dimensions is to first reduce the dimensionality of the data to a manageable size, keeping as much of the original information as possible, and then feed the reduced-dimensional data into a pattern recognition system. In this situation, the dimensionality reduction process becomes the pre-processing stage of the pattern recognition system. In addition, probability density estimation with fewer variables is a simpler approach to dimensionality reduction. Dimensionality reduction is useful in speech recognition, data compression, visualization and exploratory data analysis. Some of the techniques which can be used for dimensionality reduction are Factor Analysis (FA), Principal Component Analysis (PCA), and Linear Discriminant Analysis (LDA). Factor Analysis can be considered an extension of Principal Component Analysis. The EM (expectation maximization) algorithm is ideally suited to problems of this sort, in that it produces maximum-likelihood (ML) estimates of parameters when there is a many-to-one mapping from an underlying distribution to the distribution governing the observation, conditioned upon the observations. The maximization step then provides a new estimate of the parameters. This research work compares the techniques Factor Analysis (expectation-maximization based), Principal Component Analysis and Linear Discriminant Analysis for dimensionality reduction, and investigates Local Factor Analysis (EM based) and Local Principal Component Analysis using Vector Quantization.
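For readers who want to try the comparison described above, the short scikit-learn sketch below runs FA (fitted by an EM-style maximum-likelihood procedure), PCA, and LDA on a standard dataset. The dataset and component counts are arbitrary illustrative choices, and the local, vector-quantized variants are not shown.

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA, FactorAnalysis
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_digits(return_X_y=True)   # 64-dimensional digit images

# Three of the reduction techniques the thesis compares.
for model in (PCA(n_components=10),
              FactorAnalysis(n_components=10),
              LinearDiscriminantAnalysis(n_components=9)):  # LDA: <= classes-1
    Z = model.fit(X, y).transform(X)  # PCA/FA ignore y; LDA uses the labels
    print(type(model).__name__, Z.shape)
```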
6. Vamulapalli, Harika Rao. "On Dimensionality Reduction of Data." ScholarWorks@UNO, 2010. http://scholarworks.uno.edu/td/1211.

Abstract:
The random projection method is an important tool for the dimensionality reduction of data and can be made efficient with strong error guarantees. In this thesis, we focus on linear transforms of high-dimensional data to a low-dimensional space satisfying the Johnson-Lindenstrauss lemma. In addition, we prove some theoretical results relating to the projections that are of interest in practical applications. We show how the technique can be applied to synthetic data with a probabilistic guarantee on the pairwise distances. The connection between dimensionality reduction and compressed sensing is also discussed.
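The Johnson-Lindenstrauss setting described above is easy to demonstrate. The sketch below maps high-dimensional points through a scaled Gaussian matrix and checks how well one pairwise distance survives; all sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 200, 10_000, 300   # points, original dim, reduced dim

X = rng.standard_normal((n, d))
R = rng.standard_normal((d, k)) / np.sqrt(k)   # scaled Gaussian projection
Y = X @ R                                      # linear map to k dimensions

# Johnson-Lindenstrauss in action: pairwise distances are nearly preserved.
i, j = 3, 17
orig = np.linalg.norm(X[i] - X[j])
proj = np.linalg.norm(Y[i] - Y[j])
print(f"distortion: {proj / orig:.3f}")        # close to 1 with high probability
```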
7. Widemann, David P. "Dimensionality Reduction for Hyperspectral Data." Ph.D. thesis, University of Maryland, College Park, Md., 2008. http://hdl.handle.net/1903/8448.

Notes: Thesis research directed by the Dept. of Mathematics. Includes bibliographical references. Published by UMI Dissertation Services, Ann Arbor, Mich.; also available in paper.
8. Khosla, Nitin. "Dimensionality Reduction Using Factor Analysis." Thesis, Griffith University, 2006. http://hdl.handle.net/10072/366058.

Abstract:
In many pattern recognition applications, a large number of features are extracted in order to ensure an accurate classification of unknown classes. One way to solve the problems of high dimensions is to first reduce the dimensionality of the data to a manageable size, keeping as much of the original information as possible, and then feed the reduced-dimensional data into a pattern recognition system. In this situation, the dimensionality reduction process becomes the pre-processing stage of the pattern recognition system. In addition, probability density estimation with fewer variables is a simpler approach to dimensionality reduction. Dimensionality reduction is useful in speech recognition, data compression, visualization and exploratory data analysis. Some of the techniques which can be used for dimensionality reduction are Factor Analysis (FA), Principal Component Analysis (PCA), and Linear Discriminant Analysis (LDA). Factor Analysis can be considered an extension of Principal Component Analysis. The EM (expectation maximization) algorithm is ideally suited to problems of this sort, in that it produces maximum-likelihood (ML) estimates of parameters when there is a many-to-one mapping from an underlying distribution to the distribution governing the observation, conditioned upon the observations. The maximization step then provides a new estimate of the parameters. This research work compares the techniques Factor Analysis (expectation-maximization based), Principal Component Analysis and Linear Discriminant Analysis for dimensionality reduction, and investigates Local Factor Analysis (EM based) and Local Principal Component Analysis using Vector Quantization.
Thesis (Masters), Master of Philosophy (MPhil), School of Engineering.
9. Sætrom, Jon. "Reduction of Dimensionality in Spatiotemporal Models." Doctoral thesis, Norges teknisk-naturvitenskapelige universitet, Institutt for matematiske fag, 2010. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-11247.
10. Ghodsi Boushehri, Ali. "Nonlinear Dimensionality Reduction with Side Information." Thesis, University of Waterloo, 2006. http://hdl.handle.net/10012/1020.

Abstract:
In this thesis, I look at three problems with important applications in data processing. Incorporating side information, provided by the user or derived from data, is a main theme of each of these problems.

This thesis makes a number of contributions. The first is a technique for combining different embedding objectives, which is then exploited to incorporate side information expressed in terms of transformation invariants known to hold in the data. It also introduces two different ways of incorporating transformation invariants in order to make new similarity measures. Two algorithms are proposed which learn metrics based on different types of side information. These learned metrics can then be used in subsequent embedding methods. Finally, it introduces a manifold learning algorithm that is useful when applied to sequential decision problems. In this case we are given action labels in addition to data points. Actions in the manifold learned by this algorithm have meaningful representations in that they are represented as simple transformations.

Books on the topic "Dimensionality reduction"

1. Lee, John A., and Michel Verleysen, eds. Nonlinear Dimensionality Reduction. New York, NY: Springer New York, 2007. http://dx.doi.org/10.1007/978-0-387-39351-3.
2. Lespinats, Sylvain, Benoit Colange, and Denys Dutykh. Nonlinear Dimensionality Reduction Techniques. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-81026-9.
3. Garzon, Max, Ching-Chi Yang, Deepak Venugopal, Nirman Kumar, Kalidas Jana, and Lih-Yuan Deng, eds. Dimensionality Reduction in Data Science. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-05371-9.
4. Paul, Arati, and Nabendu Chaki. Dimensionality Reduction of Hyperspectral Imagery. Cham: Springer International Publishing, 2024. http://dx.doi.org/10.1007/978-3-031-42667-4.
5. Strange, Harry, and Reyer Zwiggelaar. Open Problems in Spectral Dimensionality Reduction. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-03943-5.
6. Kramer, Oliver. Dimensionality Reduction with Unsupervised Nearest Neighbors. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-38652-7.
7. Kramer, Oliver. Dimensionality Reduction with Unsupervised Nearest Neighbors. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013.
8. Shaw, Blake. Graph Embedding and Nonlinear Dimensionality Reduction. [New York, N.Y.?]: [publisher not identified], 2011.
9. Ghojogh, Benyamin, Mark Crowley, Fakhri Karray, and Ali Ghodsi. Elements of Dimensionality Reduction and Manifold Learning. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-10602-6.
10. Wang, Jianzhong. Geometric Structure of High-Dimensional Data and Dimensionality Reduction. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-27497-8.

Book chapters on the topic "Dimensionality reduction"

1. Herrera, Francisco, Francisco Charte, Antonio J. Rivera, and María J. del Jesus. "Dimensionality Reduction." In Multilabel Classification, 115–31. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-41111-8_7.
2. Kramer, Oliver. "Dimensionality Reduction." In Dimensionality Reduction with Unsupervised Nearest Neighbors, 33–52. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-38652-7_4.
3. Hull, Isaiah. "Dimensionality Reduction." In Machine Learning for Economics and Finance in TensorFlow 2, 281–306. Berkeley, CA: Apress, 2020. http://dx.doi.org/10.1007/978-1-4842-6373-0_8.
4. Shen, Heng Tao. "Dimensionality Reduction." In Encyclopedia of Database Systems, 1–2. New York, NY: Springer New York, 2017. http://dx.doi.org/10.1007/978-1-4899-7993-3_551-2.
5. Webb, Geoffrey I., Johannes Fürnkranz, Geoffrey Hinton, Claude Sammut, Joerg Sander, et al. "Dimensionality Reduction." In Encyclopedia of Machine Learning, 274–79. Boston, MA: Springer US, 2011. http://dx.doi.org/10.1007/978-0-387-30164-8_216.
6. Dinov, Ivo D. "Dimensionality Reduction." In Data Science and Predictive Analytics, 233–66. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-72347-1_6.
7. Shen, Heng Tao. "Dimensionality Reduction." In Encyclopedia of Database Systems, 843–46. Boston, MA: Springer US, 2009. http://dx.doi.org/10.1007/978-0-387-39940-9_551.
8. Mathar, Rudolf, Gholamreza Alirezaei, Emilio Balda, and Arash Behboodi. "Dimensionality Reduction." In Fundamentals of Data Analytics, 45–67. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-56831-3_4.
9. Durstewitz, Daniel. "Dimensionality Reduction." In Advanced Data Analysis in Neuroscience, 105–19. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-59976-2_6.
10. Braga-Neto, Ulisses. "Dimensionality Reduction." In Fundamentals of Pattern Recognition and Machine Learning, 205–29. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-27656-0_9.

Conference papers on the topic "Dimensionality reduction"

1. Bunte, Kerstin, Michael Biehl, and Barbara Hammer. "Dimensionality reduction mappings." In 2011 IEEE Symposium on Computational Intelligence and Data Mining (CIDM), part of the 2011 IEEE Symposium Series on Computational Intelligence (SSCI). IEEE, 2011. http://dx.doi.org/10.1109/cidm.2011.5949443.
2. Schclar, Alon, and Amir Averbuch. "Diffusion Bases Dimensionality Reduction." In 7th International Conference on Neural Computation Theory and Applications. SCITEPRESS - Science and Technology Publications, 2015. http://dx.doi.org/10.5220/0005625301510156.
3. Bingham, Ella, Aristides Gionis, Niina Haiminen, Heli Hiisilä, Heikki Mannila, and Evimaria Terzi. "Segmentation and dimensionality reduction." In Proceedings of the 2006 SIAM International Conference on Data Mining. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2006. http://dx.doi.org/10.1137/1.9781611972764.33.
4. Zhang, Daoqiang, Zhi-Hua Zhou, and Songcan Chen. "Semi-Supervised Dimensionality Reduction." In Proceedings of the 2007 SIAM International Conference on Data Mining. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2007. http://dx.doi.org/10.1137/1.9781611972771.73.
5. Guo, Ce, and Wayne Luk. "Quantisation-aware Dimensionality Reduction." In 2020 International Conference on Field-Programmable Technology (ICFPT). IEEE, 2020. http://dx.doi.org/10.1109/icfpt51103.2020.00041.
6. Zhu, Xiaofeng, Cong Lei, Hao Yu, Yonggang Li, Jiangzhang Gan, and Shichao Zhang. "Robust Graph Dimensionality Reduction." In Twenty-Seventh International Joint Conference on Artificial Intelligence (IJCAI-18). California: International Joint Conferences on Artificial Intelligence Organization, 2018. http://dx.doi.org/10.24963/ijcai.2018/452.

Abstract:
In this paper, we propose conducting Robust Graph Dimensionality Reduction (RGDR) by learning a transformation matrix that maps original high-dimensional data into their low-dimensional intrinsic space without the influence of outliers. To do this, we propose simultaneously 1) adaptively learning three variables, i.e., a reverse graph embedding of the original data, a transformation matrix, and a graph matrix preserving the local similarity of the original data in their low-dimensional intrinsic space; and 2) employing robust estimators to avoid outliers affecting the processes of optimizing these three matrices. As a result, the original data are cleaned by two strategies, i.e., a prediction of the original data based on the three resulting variables and robust estimators, so that the transformation matrix can be learnt from an accurately estimated intrinsic space with the help of the reverse graph embedding and the graph matrix. Moreover, we propose a new optimization algorithm for the resulting objective function and theoretically prove its convergence. Experimental results indicate that our proposed method outperforms all the comparison methods across different classification tasks.
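To make the graph-preservation ingredient concrete, here is a sketch of a plain locality-preserving linear projection, one building block of methods in this family. It omits the paper's reverse graph embedding and robust estimators, and every parameter is an illustrative assumption.

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.linalg import eigh

def lpp(X, dim=2, k=5, sigma=1.0):
    """Locality-preserving linear projection (generic sketch, not RGDR).

    Learns a transformation matrix that keeps nearby points nearby in the
    low-dimensional space.
    """
    D2 = cdist(X, X, "sqeuclidean")
    W = np.exp(-D2 / (2 * sigma ** 2))             # Gaussian affinities
    idx = np.argsort(D2, axis=1)[:, k + 1:]        # all but self + k nearest
    for r, cols in enumerate(idx):
        W[r, cols] = 0.0                           # sparsify the graph
    W = np.maximum(W, W.T)                         # symmetrize
    Dg = np.diag(W.sum(1))
    L = Dg - W                                     # graph Laplacian
    # Smallest generalized eigenvectors of X^T L X v = lam X^T Dg X v.
    A, B = X.T @ L @ X, X.T @ Dg @ X
    vals, vecs = eigh(A, B + 1e-9 * np.eye(B.shape[0]))
    return vecs[:, :dim]                           # columns = projection matrix

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 6))
V = lpp(X)              # project new or existing data with X @ V
```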
7. Gashler, Mike, and Tony Martinez. "Temporal nonlinear dimensionality reduction." In 2011 International Joint Conference on Neural Networks (IJCNN 2011 - San Jose). IEEE, 2011. http://dx.doi.org/10.1109/ijcnn.2011.6033465.
8. Heylen, Rob, and Paul Scheunders. "Nonlinear barycentric dimensionality reduction." In 2010 17th IEEE International Conference on Image Processing (ICIP 2010). IEEE, 2010. http://dx.doi.org/10.1109/icip.2010.5653675.
9. Mosci, Sofia, Lorenzo Rosasco, and Alessandro Verri. "Dimensionality reduction and generalization." In Proceedings of the 24th International Conference on Machine Learning. New York, New York, USA: ACM Press, 2007. http://dx.doi.org/10.1145/1273496.1273579.
10. Luo, Xianghui, and Robert J. Durrant. "Maximum Gradient Dimensionality Reduction." In 2018 24th International Conference on Pattern Recognition (ICPR). IEEE, 2018. http://dx.doi.org/10.1109/icpr.2018.8546198.

Reports on the topic "Dimensionality reduction"

1. Jain, Anil K. Classification, Clustering and Dimensionality Reduction. Fort Belvoir, VA: Defense Technical Information Center, July 2008. http://dx.doi.org/10.21236/ada483446.
2. Wolf, Lior, and Stanley Bileschi. Combining Variable Selection with Dimensionality Reduction. Fort Belvoir, VA: Defense Technical Information Center, March 2005. http://dx.doi.org/10.21236/ada454990.
3. Jones, Michael J. Using Recurrent Networks for Dimensionality Reduction. Fort Belvoir, VA: Defense Technical Information Center, September 1992. http://dx.doi.org/10.21236/ada259497.
4. León, Carlos. Detecting anomalous payments networks: A dimensionality reduction approach. Banco de la República de Colombia, December 2019. http://dx.doi.org/10.32468/be.1098.
5. Sarwar, Badrul, George Karypis, Joseph Konstan, and John Riedl. Application of Dimensionality Reduction in Recommender System - A Case Study. Fort Belvoir, VA: Defense Technical Information Center, July 2000. http://dx.doi.org/10.21236/ada439541.
6. Fukumizu, Kenji, Francis R. Bach, and Michael I. Jordan. Dimensionality Reduction for Supervised Learning With Reproducing Kernel Hilbert Spaces. Fort Belvoir, VA: Defense Technical Information Center, May 2003. http://dx.doi.org/10.21236/ada446572.
7. Nichols, Jonathan M., Frank Bucholtz, and Joseph V. Michalowicz. Intelligent Data Fusion Using Sparse Representations and Nonlinear Dimensionality Reduction. Fort Belvoir, VA: Defense Technical Information Center, September 2009. http://dx.doi.org/10.21236/ada507109.
8. Vales, C., Y. Choi, D. Copeland, and S. Cheung. Energy conserving quadrature based dimensionality reduction for nonlinear hydrodynamics problems. Office of Scientific and Technical Information (OSTI), August 2023. http://dx.doi.org/10.2172/1995059.
9. Ho, Tu Bao. Methods of Sparse Modeling and Dimensionality Reduction to Deal with Big Data. Fort Belvoir, VA: Defense Technical Information Center, April 2015. http://dx.doi.org/10.21236/ada623178.
10. Mohan, Anish, Guillermo Sapiro, and Edward Bosch. Spatially-Coherent Non-Linear Dimensionality Reduction and Segmentation of Hyper-Spectral Images (PREPRINT). Fort Belvoir, VA: Defense Technical Information Center, June 2006. http://dx.doi.org/10.21236/ada478496.