Ready-made bibliography on the topic "Dimensionality reduction"

Create accurate references in APA, MLA, Chicago, Harvard, and many other citation styles.

Browse lists of current articles, books, dissertations, abstracts, and other scholarly sources on the topic "Dimensionality reduction".

An "Add to bibliography" button is available next to every work in the list. Use it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication as a .pdf file and read its abstract online, whenever such details are available in the source's metadata.

Journal articles on the topic "Dimensionality reduction"

1. Cheng, Long, Chenyu You, and Yani Guan. "Random Projections for Non-linear Dimensionality Reduction". International Journal of Machine Learning and Computing 6, no. 4 (August 2016): 220–25. http://dx.doi.org/10.18178/ijmlc.2016.6.4.601.

2. Marchette, David J., and Wendy L. Poston. "Local dimensionality reduction". Computational Statistics 14, no. 4 (September 12, 1999): 469–89. http://dx.doi.org/10.1007/s001800050026.

3. Sun, Yu-Yin, Michael Ng, and Zhi-Hua Zhou. "Multi-Instance Dimensionality Reduction". Proceedings of the AAAI Conference on Artificial Intelligence 24, no. 1 (July 3, 2010): 587–92. http://dx.doi.org/10.1609/aaai.v24i1.7700.

Abstract:
Multi-instance learning deals with problems that treat bags of instances as training examples. In single-instance learning problems, dimensionality reduction is an essential step for high-dimensional data analysis and has been studied for years. The curse of dimensionality also exists in multi-instance learning tasks, yet this difficult task has not been studied before. Direct application of existing single-instance dimensionality reduction objectives to multi-instance learning tasks may not work well, since it ignores the characteristic of multi-instance learning that the labels of bags are known while the labels of instances are unknown. In this paper, we propose an effective model and develop an efficient algorithm to solve the multi-instance dimensionality reduction problem. We formulate the objective as an optimization problem by considering orthonormality and sparsity constraints in the projection matrix for dimensionality reduction, and then solve it by gradient descent along the tangent space of the orthonormal matrices. We also propose an approximation for improving efficiency. Experimental results validate the effectiveness of the proposed method.
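The bag setting and the orthonormality constraint described in this abstract can be sketched in a few lines of numpy. This is only an illustration under invented data sizes; the paper's actual method learns the projection by gradient descent on the manifold of orthonormal matrices with a sparsity penalty, which is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical multi-instance data: 5 labeled bags, each a set of 10-dim instances.
bags = [rng.normal(size=(rng.integers(3, 8), 10)) for _ in range(5)]

# A projection matrix W (10 -> 2) with orthonormal columns, here simply obtained
# by QR from a random start; the paper instead optimizes W along the tangent
# space of the orthonormal matrices.
W, _ = np.linalg.qr(rng.normal(size=(10, 2)))
assert np.allclose(W.T @ W, np.eye(2))  # the orthonormality constraint W^T W = I

# Reduce each bag: project its instances and pool them into one vector per bag,
# so that bag-level labels can be used even though instance labels are unknown.
reduced = np.stack([(x @ W).mean(axis=0) for x in bags])
```

The mean-pooling step is one simple way to turn per-instance projections into a per-bag representation; it is an assumption of this sketch, not taken from the paper.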
4. Koren, Y., and L. Carmel. "Robust linear dimensionality reduction". IEEE Transactions on Visualization and Computer Graphics 10, no. 4 (July 2004): 459–70. http://dx.doi.org/10.1109/tvcg.2004.17.

5. Lotlikar, R., and R. Kothari. "Fractional-step dimensionality reduction". IEEE Transactions on Pattern Analysis and Machine Intelligence 22, no. 6 (June 2000): 623–27. http://dx.doi.org/10.1109/34.862200.

6. Gottlieb, Lee-Ad, Aryeh Kontorovich, and Robert Krauthgamer. "Adaptive metric dimensionality reduction". Theoretical Computer Science 620 (March 2016): 105–18. http://dx.doi.org/10.1016/j.tcs.2015.10.040.

7. Pang, Rich, Benjamin J. Lansdell, and Adrienne L. Fairhall. "Dimensionality reduction in neuroscience". Current Biology 26, no. 14 (July 2016): R656–R660. http://dx.doi.org/10.1016/j.cub.2016.05.029.

8. Lovaglio, Pietro Giorgio, and Giorgio Vittadini. "Multilevel dimensionality-reduction methods". Statistical Methods & Applications 22, no. 2 (September 27, 2012): 183–207. http://dx.doi.org/10.1007/s10260-012-0215-2.

9. Carter, Kevin, Raviv Raich, William Finn, and Alfred Hero III. "Information-Geometric Dimensionality Reduction". IEEE Signal Processing Magazine 28, no. 2 (March 2011): 89–99. http://dx.doi.org/10.1109/msp.2010.939536.

10. Gonen, Mehmet. "Bayesian Supervised Dimensionality Reduction". IEEE Transactions on Cybernetics 43, no. 6 (December 2013): 2179–89. http://dx.doi.org/10.1109/tcyb.2013.2245321.


Doctoral dissertations on the topic "Dimensionality reduction"

1. Ariu, Kaito. "Online Dimensionality Reduction". Licentiate thesis, KTH, Reglerteknik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-290791.

Abstract:
In this thesis, we investigate online dimensionality reduction methods, where the algorithms learn by sequentially acquiring data. We focus on two specific algorithm design problems in (i) recommender systems and (ii) heterogeneous clustering from binary user feedback. (i) For recommender systems, we consider a system consisting of m users and n items. In each round, a user, selected uniformly at random, arrives at the system and requests a recommendation. The algorithm observes the user id and recommends an item from the item set. A notable restriction here is that the same item cannot be recommended to the same user more than once, a constraint referred to as a no-repetition constraint. We study this problem as a variant of the multi-armed bandit problem and analyze regret under various structures pertaining to items and users. We devise fundamental limits of regret and algorithms that achieve the limits order-wise. The analysis explicitly highlights the importance of each component of regret: for example, we can distinguish the regret due to the no-repetition constraint, the regret incurred to learn the statistics of a user's preference for an item, and the regret incurred to learn the low-dimensional space of the users and items. (ii) In the clustering with binary feedback problem, the objective is to classify items solely based on limited user feedback. More precisely, users are just asked simple questions with binary answers. A notable difficulty stems from the heterogeneity in the difficulty of classifying the various items (some items require more feedback to be classified than others). For this problem, we derive fundamental limits of the cluster recovery rates for both offline and online algorithms. For the offline setting, we devise a simple algorithm that achieves the limit order-wise. For the online setting, we propose an algorithm inspired by the lower bound. For both problems, we evaluate the proposed algorithms by inspecting their theoretical guarantees and using numerical experiments performed on synthetic and non-synthetic datasets.
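The no-repetition constraint in the recommender setting above can be made concrete with a toy epsilon-greedy bandit sketch. This is standard bandit machinery, not the thesis's algorithms, and all quantities (users, items, preference probabilities) are invented:

```python
import random

random.seed(0)
n_users, n_items = 3, 5
# Hypothetical true click probabilities, unknown to the algorithm.
pref = [[random.random() for _ in range(n_items)] for _ in range(n_users)]

recommended = [set() for _ in range(n_users)]      # enforces no repetition
counts = [[0] * n_items for _ in range(n_users)]   # pulls per (user, item)
means = [[0.0] * n_items for _ in range(n_users)]  # empirical reward means

def recommend(u, eps=0.1):
    # Only items never shown to user u are available (no-repetition constraint).
    avail = [i for i in range(n_items) if i not in recommended[u]]
    if random.random() < eps:
        item = random.choice(avail)
    else:  # greedy on empirical means, trying never-pulled items first
        item = max(avail, key=lambda i: means[u][i] if counts[u][i] else float("inf"))
    recommended[u].add(item)
    reward = 1.0 if random.random() < pref[u][item] else 0.0
    counts[u][item] += 1
    means[u][item] += (reward - means[u][item]) / counts[u][item]
    return item

# In each round a user arrives and receives one recommendation; after n_items
# rounds every user has exhausted the catalogue, as the constraint dictates.
for _ in range(n_items):
    for u in range(n_users):
        recommend(u)
```

Note how the constraint alone forces exploration: once an item is shown, it is gone for that user, which is one source of the extra regret the thesis quantifies.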

2. Legramanti, Sirio. "Bayesian dimensionality reduction". Doctoral thesis, Università Bocconi, 2021. http://hdl.handle.net/11565/4035711.

Abstract:
We are currently witnessing an explosion in the amount of available data. Such growth involves not only the number of data points but also their dimensionality. This poses new challenges to statistical modeling and computations, thus making dimensionality reduction more central than ever. In the present thesis, we provide methodological, computational and theoretical advancements in Bayesian dimensionality reduction via novel structured priors. Namely, we develop a new increasing shrinkage prior and illustrate how it can be employed to discard redundant dimensions in Gaussian factor models. In order to make it usable for larger datasets, we also investigate variational methods for posterior inference under this proposed prior. Beyond traditional models and parameter spaces, we also provide a different take on dimensionality reduction, focusing on community detection in networks. For this purpose, we define a general class of Bayesian nonparametric priors that encompasses existing stochastic block models as special cases and includes promising unexplored options. Our Bayesian approach allows for a natural incorporation of node attributes and facilitates uncertainty quantification as well as model selection.
3. Baldiwala, Aliakbar. "Dimensionality Reduction for Commercial Vehicle Fleet Monitoring". Thesis, Université d'Ottawa / University of Ottawa, 2018. http://hdl.handle.net/10393/38330.

Abstract:
A variety of new features have been added to present-day vehicles, such as pre-crash warning, vehicle-to-vehicle communication, semi-autonomous driving systems, telematics, and drive-by-wire. They demand very high bandwidth from in-vehicle networks. The various electronic control units inside a vehicle transmit useful information via automotive multiplexing, which allows sharing information among the intelligent modules of an automotive electronic system. Optimum functionality is achieved by transmitting this data in real time. The high-bandwidth and high-speed requirements can be met either by using multiple buses or by implementing a higher-bandwidth bus, but doing so increases the cost of the network and the complexity of the vehicle's wiring. Another option is to implement a higher-layer protocol that reduces the amount of data transferred using data reduction (DR) techniques, thus reducing bandwidth usage; the implementation cost is minimal, as changes are required only in software, not in hardware. In our work, we present a new data reduction algorithm termed the "Comprehensive Data Reduction (CDR)" algorithm. The proposed algorithm is used to minimize the bus utilization of the CAN bus for a future vehicle. The reduction in bus load was achieved by compressing the parameters; thus, more messages, including lower-priority messages, can be sent efficiently on the CAN bus. The proposed work also presents a performance analysis of the proposed algorithm against the boundary of fifteen compression algorithm and compression area selection algorithms (existing data reduction algorithms). The results of the analysis show that the proposed CDR algorithm provides better data reduction than the earlier algorithms, with promising results in terms of reduction in bus utilization, compression efficiency, and percent peak load of the CAN bus. This reduction in bus utilization permits a larger number of network nodes (ECUs) in the existing system without increasing its overall cost. The proposed algorithm was developed for the automotive environment, but it can also be used in any application where extensive information is transmitted among control units via a multiplexing bus.
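The general idea of lowering bus load by not retransmitting near-unchanged signal values can be illustrated with a toy filter. This is not the thesis's CDR algorithm; the threshold, frame format, and data are invented for the sketch:

```python
def reduce_frames(frames, threshold=2):
    """Transmit a frame only when some signal differs from the last transmitted
    frame by more than `threshold`; receivers reuse the previous value otherwise.
    A minimal stand-in for bus-load-reducing data reduction, not CDR itself."""
    sent = []
    last = None
    for frame in frames:
        if last is None or any(abs(a - b) > threshold for a, b in zip(frame, last)):
            sent.append(frame)
            last = frame
    return sent

# Five hypothetical 2-signal frames; only frames with a large enough change go out.
frames = [(10, 20), (10, 21), (11, 20), (30, 20), (30, 21)]
sent = reduce_frames(frames)  # bus utilization drops from 5 frames to 2
```

Dropping redundant frames in software is what keeps the implementation cost low: no bus hardware changes are required, which matches the abstract's argument.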
4. Bolelli, Maria Virginia. "Diffusion Maps for Dimensionality Reduction". Master's thesis, Alma Mater Studiorum - Università di Bologna, 2019. http://amslaurea.unibo.it/18246/.

Abstract:
In this thesis we present diffusion maps, a framework based on diffusion processes for finding meaningful geometric descriptions of data sets. A diffusion process can be described via an iterative application of the heat kernel, which has two main characteristics: it satisfies a Markov semigroup property, and its level sets encode all geometric features of the space. This process, well known on regular manifolds, has been extended to general data sets by Coifman and Lafon. They define a diffusion kernel starting from the geometric properties of the data and their density properties. This kernel is a compact operator, and the projection on its eigenvectors at different instants of time provides a family of embeddings of a data set into a suitable Euclidean space. The projection on the first eigenvectors naturally leads to a dimensionality reduction algorithm. A numerical implementation is provided on different data sets.
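The construction described in this abstract (kernel, Markov normalization, eigen-projection) fits in a short numpy sketch. The toy data, bandwidth choice, and diffusion time are assumptions of the sketch, not taken from the thesis:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))  # toy data set

# Gaussian heat kernel on pairwise squared distances (median-based bandwidth).
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
K = np.exp(-d2 / np.median(d2))

# Row-normalizing the kernel gives a Markov transition matrix (the diffusion
# operator); iterating it realizes the diffusion process on the data.
P = K / K.sum(axis=1, keepdims=True)

# Spectral decomposition: the top eigenvalue is 1 with a constant eigenvector;
# the next eigenvectors, scaled by eigenvalue^t, give the diffusion map at time t.
vals, vecs = np.linalg.eig(P)
order = np.argsort(-vals.real)
vals, vecs = vals.real[order], vecs.real[:, order]
t = 2
embedding = vecs[:, 1:3] * vals[1:3] ** t  # 2-D diffusion-map embedding
```

Raising the eigenvalues to the power t is what "projection at different instants of time" means operationally: larger t damps small-eigenvalue directions and keeps only the slow, coarse geometry.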
5. Khosla, Nitin. "Dimensionality Reduction Using Factor Analysis". Griffith University, School of Engineering, 2006. http://www4.gu.edu.au:8080/adt-root/public/adt-QGU20061010.151217.

Abstract:
In many pattern recognition applications, a large number of features are extracted in order to ensure an accurate classification of unknown classes. One way to solve the problems of high dimensionality is to first reduce the dimensionality of the data to a manageable size, keeping as much of the original information as possible, and then feed the reduced-dimensional data into a pattern recognition system. In this situation, the dimensionality reduction process becomes the pre-processing stage of the pattern recognition system. In addition, probability density estimation with fewer variables is a simpler approach to dimensionality reduction. Dimensionality reduction is useful in speech recognition, data compression, visualization, and exploratory data analysis. Some of the techniques which can be used for dimensionality reduction are Factor Analysis (FA), Principal Component Analysis (PCA), and Linear Discriminant Analysis (LDA). Factor Analysis can be considered an extension of Principal Component Analysis. The EM (expectation-maximization) algorithm is ideally suited to problems of this sort, in that it produces maximum-likelihood (ML) estimates of parameters when there is a many-to-one mapping from an underlying distribution to the distribution governing the observation, conditioned upon the observations. The maximization step then provides a new estimate of the parameters. This research work compares the techniques Factor Analysis (expectation-maximization based), Principal Component Analysis, and Linear Discriminant Analysis for dimensionality reduction, and investigates Local Factor Analysis (EM based) and Local Principal Component Analysis using Vector Quantization.
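The EM iteration for factor analysis mentioned in this abstract can be sketched compactly in numpy: the E-step computes posterior moments of the latent factors, the M-step re-estimates the loadings and the diagonal noise variances. Data sizes, the synthetic generative model, and the stability clamp are assumptions of this sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 200, 5, 2  # samples, observed dimension, number of factors

# Synthetic data from a true factor model x = L z + noise.
L_true = rng.normal(size=(d, k))
X = rng.normal(size=(n, k)) @ L_true.T + 0.1 * rng.normal(size=(n, d))
X -= X.mean(axis=0)
S = X.T @ X / n  # sample covariance

L = rng.normal(size=(d, k))  # loadings
psi = np.ones(d)             # diagonal noise variances

for _ in range(50):
    # E-step: posterior moments of the latent factors given each observation.
    G = np.linalg.inv(np.eye(k) + L.T @ (L / psi[:, None]))  # posterior covariance
    Ez = X @ (L / psi[:, None]) @ G                          # posterior means, (n, k)
    Ezz = n * G + Ez.T @ Ez                                  # summed second moments
    # M-step: maximum-likelihood re-estimates (psi clamped for numerical safety).
    L = (X.T @ Ez) @ np.linalg.inv(Ezz)
    psi = np.maximum(np.diag(S - L @ (Ez.T @ X) / n), 1e-6)

reduced = Ez  # factor scores: the reduced-dimensional representation of X
```

This is the "many-to-one mapping" structure the abstract refers to: each observed x is explained by a lower-dimensional latent z, and EM alternates between inferring z and refitting the mapping.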
6. Vamulapalli, Harika Rao. "On Dimensionality Reduction of Data". ScholarWorks@UNO, 2010. http://scholarworks.uno.edu/td/1211.

Abstract:
The random projection method is an important tool for the dimensionality reduction of data that can be made efficient with strong error guarantees. In this thesis, we focus on linear transforms of high-dimensional data to a low-dimensional space satisfying the Johnson-Lindenstrauss lemma. In addition, we prove some theoretical results relating to the projections that are of interest when applying them in practical applications. We show how the technique can be applied to synthetic data with a probabilistic guarantee on the pairwise distances. The connection between dimensionality reduction and compressed sensing is also discussed.
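A minimal sketch of the Johnson-Lindenstrauss-style linear transform described here, using a Gaussian random projection (one standard construction; the dimensions and data are assumptions of the sketch, not from the thesis):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 20, 1000, 300  # points, original dimension, reduced dimension

X = rng.normal(size=(n, d))

# Gaussian random projection with i.i.d. N(0, 1/k) entries -- a standard linear
# transform satisfying the Johnson-Lindenstrauss lemma with high probability.
R = rng.normal(scale=1.0 / np.sqrt(k), size=(d, k))
Y = X @ R

# The probabilistic guarantee on pairwise distances: after projection, every
# distance is preserved up to a small multiplicative distortion.
def pairwise(A):
    return np.sqrt(((A[:, None, :] - A[None, :, :]) ** 2).sum(axis=-1))

mask = ~np.eye(n, dtype=bool)
ratios = pairwise(Y)[mask] / pairwise(X)[mask]  # should concentrate near 1
```

Note that the projection is data-oblivious: R is drawn once, independently of X, which is exactly what makes the method cheap and what links it to compressed sensing.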
7. Widemann, David P. "Dimensionality reduction for hyperspectral data". College Park, Md.: University of Maryland, 2008. http://hdl.handle.net/1903/8448.

Abstract:
Thesis (Ph. D.) -- University of Maryland, College Park, 2008.
Thesis research directed by: Dept. of Mathematics. Title from t.p. of PDF. Includes bibliographical references. Published by UMI Dissertation Services, Ann Arbor, Mich. Also available in paper.
8. Khosla, Nitin. "Dimensionality Reduction Using Factor Analysis". Thesis, Griffith University, 2006. http://hdl.handle.net/10072/366058.

Abstract:
In many pattern recognition applications, a large number of features are extracted in order to ensure an accurate classification of unknown classes. One way to solve the problems of high dimensionality is to first reduce the dimensionality of the data to a manageable size, keeping as much of the original information as possible, and then feed the reduced-dimensional data into a pattern recognition system. In this situation, the dimensionality reduction process becomes the pre-processing stage of the pattern recognition system. In addition, probability density estimation with fewer variables is a simpler approach to dimensionality reduction. Dimensionality reduction is useful in speech recognition, data compression, visualization, and exploratory data analysis. Some of the techniques which can be used for dimensionality reduction are Factor Analysis (FA), Principal Component Analysis (PCA), and Linear Discriminant Analysis (LDA). Factor Analysis can be considered an extension of Principal Component Analysis. The EM (expectation-maximization) algorithm is ideally suited to problems of this sort, in that it produces maximum-likelihood (ML) estimates of parameters when there is a many-to-one mapping from an underlying distribution to the distribution governing the observation, conditioned upon the observations. The maximization step then provides a new estimate of the parameters. This research work compares the techniques Factor Analysis (expectation-maximization based), Principal Component Analysis, and Linear Discriminant Analysis for dimensionality reduction, and investigates Local Factor Analysis (EM based) and Local Principal Component Analysis using Vector Quantization.
Thesis (Masters), Master of Philosophy (MPhil), School of Engineering.
9. Sætrom, Jon. "Reduction of Dimensionality in Spatiotemporal Models". Doctoral thesis, Norges teknisk-naturvitenskapelige universitet, Institutt for matematiske fag, 2010. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-11247.

10. Ghodsi Boushehri, Ali. "Nonlinear Dimensionality Reduction with Side Information". Thesis, University of Waterloo, 2006. http://hdl.handle.net/10012/1020.

Abstract:
In this thesis, I look at three problems with important applications in data processing. Incorporating side information, provided by the user or derived from data, is a main theme of each of these problems.

This thesis makes a number of contributions. The first is a technique for combining different embedding objectives, which is then exploited to incorporate side information expressed in terms of transformation invariants known to hold in the data. It also introduces two different ways of incorporating transformation invariants in order to make new similarity measures. Two algorithms are proposed which learn metrics based on different types of side information. These learned metrics can then be used in subsequent embedding methods. Finally, it introduces a manifold learning algorithm that is useful when applied to sequential decision problems. In this case we are given action labels in addition to data points. Actions in the manifold learned by this algorithm have meaningful representations in that they are represented as simple transformations.

Books on the topic "Dimensionality reduction"

1. Lee, John A., and Michel Verleysen, eds. Nonlinear Dimensionality Reduction. New York, NY: Springer New York, 2007. http://dx.doi.org/10.1007/978-0-387-39351-3.

2. Lespinats, Sylvain, Benoit Colange, and Denys Dutykh. Nonlinear Dimensionality Reduction Techniques. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-81026-9.

3. Garzon, Max, Ching-Chi Yang, Deepak Venugopal, Nirman Kumar, Kalidas Jana, and Lih-Yuan Deng, eds. Dimensionality Reduction in Data Science. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-05371-9.

4. Paul, Arati, and Nabendu Chaki. Dimensionality Reduction of Hyperspectral Imagery. Cham: Springer International Publishing, 2024. http://dx.doi.org/10.1007/978-3-031-42667-4.

5. Strange, Harry, and Reyer Zwiggelaar. Open Problems in Spectral Dimensionality Reduction. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-03943-5.

6. Kramer, Oliver. Dimensionality Reduction with Unsupervised Nearest Neighbors. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-38652-7.

7. Kramer, Oliver. Dimensionality Reduction with Unsupervised Nearest Neighbors. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013.

8. Shaw, Blake. Graph Embedding and Nonlinear Dimensionality Reduction. [New York, N.Y.?]: [publisher not identified], 2011.

9. Ghojogh, Benyamin, Mark Crowley, Fakhri Karray, and Ali Ghodsi. Elements of Dimensionality Reduction and Manifold Learning. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-10602-6.

10. Wang, Jianzhong. Geometric Structure of High-Dimensional Data and Dimensionality Reduction. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-27497-8.

Book chapters on the topic "Dimensionality reduction"

1. Herrera, Francisco, Francisco Charte, Antonio J. Rivera, and María J. del Jesus. "Dimensionality Reduction". In Multilabel Classification, 115–31. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-41111-8_7.

2. Kramer, Oliver. "Dimensionality Reduction". In Dimensionality Reduction with Unsupervised Nearest Neighbors, 33–52. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-38652-7_4.

3. Hull, Isaiah. "Dimensionality Reduction". In Machine Learning for Economics and Finance in TensorFlow 2, 281–306. Berkeley, CA: Apress, 2020. http://dx.doi.org/10.1007/978-1-4842-6373-0_8.

4. Shen, Heng Tao. "Dimensionality Reduction". In Encyclopedia of Database Systems, 1–2. New York, NY: Springer New York, 2017. http://dx.doi.org/10.1007/978-1-4899-7993-3_551-2.

5. Webb, Geoffrey I., Johannes Fürnkranz, Geoffrey Hinton, Claude Sammut, Joerg Sander, et al. "Dimensionality Reduction". In Encyclopedia of Machine Learning, 274–79. Boston, MA: Springer US, 2011. http://dx.doi.org/10.1007/978-0-387-30164-8_216.

6. Dinov, Ivo D. "Dimensionality Reduction". In Data Science and Predictive Analytics, 233–66. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-72347-1_6.

7. Shen, Heng Tao. "Dimensionality Reduction". In Encyclopedia of Database Systems, 843–46. Boston, MA: Springer US, 2009. http://dx.doi.org/10.1007/978-0-387-39940-9_551.

8. Mathar, Rudolf, Gholamreza Alirezaei, Emilio Balda, and Arash Behboodi. "Dimensionality Reduction". In Fundamentals of Data Analytics, 45–67. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-56831-3_4.

9. Durstewitz, Daniel. "Dimensionality Reduction". In Advanced Data Analysis in Neuroscience, 105–19. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-59976-2_6.

10. Braga-Neto, Ulisses. "Dimensionality Reduction". In Fundamentals of Pattern Recognition and Machine Learning, 205–29. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-27656-0_9.

Conference abstracts on the topic "Dimensionality reduction"

1. Bunte, Kerstin, Michael Biehl, and Barbara Hammer. "Dimensionality reduction mappings". In 2011 IEEE Symposium on Computational Intelligence and Data Mining (CIDM), part of the 2011 IEEE SSCI. IEEE, 2011. http://dx.doi.org/10.1109/cidm.2011.5949443.

2. Schclar, Alon, and Amir Averbuch. "Diffusion Bases Dimensionality Reduction". In 7th International Conference on Neural Computation Theory and Applications. SCITEPRESS - Science and Technology Publications, 2015. http://dx.doi.org/10.5220/0005625301510156.

3. Bingham, Ella, Aristides Gionis, Niina Haiminen, Heli Hiisilä, Heikki Mannila, and Evimaria Terzi. "Segmentation and dimensionality reduction". In Proceedings of the 2006 SIAM International Conference on Data Mining. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2006. http://dx.doi.org/10.1137/1.9781611972764.33.

4. Zhang, Daoqiang, Zhi-Hua Zhou, and Songcan Chen. "Semi-Supervised Dimensionality Reduction". In Proceedings of the 2007 SIAM International Conference on Data Mining. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2007. http://dx.doi.org/10.1137/1.9781611972771.73.

5. Guo, Ce, and Wayne Luk. "Quantisation-aware Dimensionality Reduction". In 2020 International Conference on Field-Programmable Technology (ICFPT). IEEE, 2020. http://dx.doi.org/10.1109/icfpt51103.2020.00041.

6. Zhu, Xiaofeng, Cong Lei, Hao Yu, Yonggang Li, Jiangzhang Gan, and Shichao Zhang. "Robust Graph Dimensionality Reduction". In Twenty-Seventh International Joint Conference on Artificial Intelligence (IJCAI-18). California: International Joint Conferences on Artificial Intelligence Organization, 2018. http://dx.doi.org/10.24963/ijcai.2018/452.

Abstract:
In this paper, we propose conducting Robust Graph Dimensionality Reduction (RGDR) by learning a transformation matrix to map original high-dimensional data into their low-dimensional intrinsic space without the influence of outliers. To do this, we propose simultaneously 1) adaptively learning three variables, i.e., a reverse graph embedding of the original data, a transformation matrix, and a graph matrix preserving the local similarity of the original data in their low-dimensional intrinsic space; and 2) employing robust estimators to avoid outliers in the processes of optimizing these three matrices. As a result, the original data are cleaned by two strategies, i.e., a prediction of the original data based on the three resulting variables and robust estimators, so that the transformation matrix can be learnt from an accurately estimated intrinsic space with the help of the reverse graph embedding and the graph matrix. Moreover, we propose a new optimization algorithm for the resulting objective function and theoretically prove its convergence. Experimental results indicated that our proposed method outperformed all the comparison methods in terms of different classification tasks.
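The flavor of learning a transformation matrix that preserves a similarity graph can be sketched with a standard locality-preserving projection. This is not RGDR itself (no robust estimators, no reverse graph embedding), and all sizes are invented for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 30, 5, 2
X = rng.normal(size=(d, n))  # columns are data points

# Similarity graph over the data (Gaussian weights) and its Laplacian.
d2 = ((X.T[:, None, :] - X.T[None, :, :]) ** 2).sum(axis=-1)
Sim = np.exp(-d2 / np.median(d2))
np.fill_diagonal(Sim, 0.0)
Deg = np.diag(Sim.sum(axis=1))
Lap = Deg - Sim

# Learn a transformation matrix W whose projection preserves local similarity:
# minimize tr(W^T X Lap X^T W) under a scale constraint, solved as a
# generalized eigenproblem (directions with smallest locality cost).
A = X @ Lap @ X.T
B = X @ Deg @ X.T + 1e-8 * np.eye(d)  # small ridge for numerical stability
vals, vecs = np.linalg.eig(np.linalg.solve(B, A))
order = np.argsort(vals.real)
W = vecs.real[:, order[:k]]

Y = W.T @ X  # low-dimensional embedding of the data
```

RGDR's contribution, per the abstract, is to make this kind of graph-preserving projection robust by jointly learning the graph itself and down-weighting outliers, neither of which this sketch attempts.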
7. Gashler, Mike, and Tony Martinez. "Temporal nonlinear dimensionality reduction". In 2011 International Joint Conference on Neural Networks (IJCNN 2011 - San Jose). IEEE, 2011. http://dx.doi.org/10.1109/ijcnn.2011.6033465.

8. Heylen, Rob, and Paul Scheunders. "Nonlinear barycentric dimensionality reduction". In 2010 17th IEEE International Conference on Image Processing (ICIP 2010). IEEE, 2010. http://dx.doi.org/10.1109/icip.2010.5653675.

9. Mosci, Sofia, Lorenzo Rosasco, and Alessandro Verri. "Dimensionality reduction and generalization". In Proceedings of the 24th International Conference on Machine Learning. New York, New York, USA: ACM Press, 2007. http://dx.doi.org/10.1145/1273496.1273579.

10. Luo, Xianghui, and Robert J. Durrant. "Maximum Gradient Dimensionality Reduction". In 2018 24th International Conference on Pattern Recognition (ICPR). IEEE, 2018. http://dx.doi.org/10.1109/icpr.2018.8546198.

Reports on the topic "Dimensionality reduction"

1. Jain, Anil K. Classification, Clustering and Dimensionality Reduction. Fort Belvoir, VA: Defense Technical Information Center, July 2008. http://dx.doi.org/10.21236/ada483446.

2. Wolf, Lior, and Stanley Bileschi. Combining Variable Selection with Dimensionality Reduction. Fort Belvoir, VA: Defense Technical Information Center, March 2005. http://dx.doi.org/10.21236/ada454990.

3. Jones, Michael J. Using Recurrent Networks for Dimensionality Reduction. Fort Belvoir, VA: Defense Technical Information Center, September 1992. http://dx.doi.org/10.21236/ada259497.

4. León, Carlos. Detecting anomalous payments networks: A dimensionality reduction approach. Banco de la República de Colombia, December 2019. http://dx.doi.org/10.32468/be.1098.

5. Sarwar, Badrul, George Karypis, Joseph Konstan, and John Riedl. Application of Dimensionality Reduction in Recommender System - A Case Study. Fort Belvoir, VA: Defense Technical Information Center, July 2000. http://dx.doi.org/10.21236/ada439541.

6. Fukumizu, Kenji, Francis R. Bach, and Michael I. Jordan. Dimensionality Reduction for Supervised Learning With Reproducing Kernel Hilbert Spaces. Fort Belvoir, VA: Defense Technical Information Center, May 2003. http://dx.doi.org/10.21236/ada446572.

7. Nichols, Jonathan M., Frank Bucholtz, and Joseph V. Michalowicz. Intelligent Data Fusion Using Sparse Representations and Nonlinear Dimensionality Reduction. Fort Belvoir, VA: Defense Technical Information Center, September 2009. http://dx.doi.org/10.21236/ada507109.

8. Vales, C., Y. Choi, D. Copeland, and S. Cheung. Energy conserving quadrature based dimensionality reduction for nonlinear hydrodynamics problems. Office of Scientific and Technical Information (OSTI), August 2023. http://dx.doi.org/10.2172/1995059.

9. Ho, Tu Bao. Methods of Sparse Modeling and Dimensionality Reduction to Deal with Big Data. Fort Belvoir, VA: Defense Technical Information Center, April 2015. http://dx.doi.org/10.21236/ada623178.

10. Mohan, Anish, Guillermo Sapiro, and Edward Bosch. Spatially-Coherent Non-Linear Dimensionality Reduction and Segmentation of Hyper-Spectral Images (PREPRINT). Fort Belvoir, VA: Defense Technical Information Center, June 2006. http://dx.doi.org/10.21236/ada478496.
