Academic literature on the topic 'Intrinsic dimensionality estimation'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Intrinsic dimensionality estimation.'

Next to every source in the list of references there is an 'Add to bibliography' button. Press it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of an academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Intrinsic dimensionality estimation"

1

Li, Chun-Guang, Jun Guo, and Bo Xiao. "Intrinsic Dimensionality Estimation within Neighborhood Convex Hull." International Journal of Pattern Recognition and Artificial Intelligence 23, no. 01 (February 2009): 31–44. http://dx.doi.org/10.1142/s0218001409007016.

Abstract:
In this paper, a novel method to estimate the intrinsic dimensionality of a high-dimensional data set is proposed. Based on neighborhood information, our method calculates the non-negative locally linear reconstruction coefficients of each data point from its neighbors, and the number of dominant positive reconstruction coefficients is regarded as a faithful guide to the intrinsic dimensionality of the data set. The proposed method requires no parametric assumption on the data distribution and is easy to implement in the general framework of manifold learning. Experimental results on several synthesized data sets and real data sets have shown the benefits of the proposed method.
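The idea in this abstract can be sketched in a few lines: reconstruct each point as a soft convex combination of its nearest neighbors via non-negative least squares and count the dominant coefficients. The threshold `tau`, the soft sum-to-one weight `c`, and the final median aggregation below are illustrative assumptions, not the authors' exact criteria.

```python
import numpy as np
from scipy.optimize import nnls

def nnls_id_estimate(X, k=12, tau=0.05, c=1.0):
    """Rough intrinsic-dimension guide via non-negative local reconstruction.

    For each point x, solve min_w ||x - N^T w||^2 + c^2 (1 - sum(w))^2 with
    w >= 0 over its k nearest neighbors N (the extra term softly enforces a
    convex-hull reconstruction), then count coefficients above tau * max(w).
    tau, c, and the median-minus-one aggregation are assumptions.
    """
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    counts = []
    for i in range(len(X)):
        idx = np.argsort(D[i])[1:k + 1]               # k nearest neighbors (skip self)
        N = X[idx]
        A = np.vstack([N.T, c * np.ones((1, k))])     # augmented soft sum-to-one system
        b = np.concatenate([X[i], [c]])
        w, _ = nnls(A, b)                             # non-negative least squares
        if w.max() > 0:
            counts.append(np.sum(w > tau * w.max()))
    # a point inside a d-dimensional simplex needs about d + 1 dominant vertices
    return int(np.median(counts)) - 1
```

For data lying on a d-dimensional patch, the dominant-coefficient count hovers around d + 1, which is the intuition the abstract describes.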
2

He, Jinrong, Lixin Ding, Lei Jiang, Zhaokui Li, and Qinghui Hu. "Intrinsic dimensionality estimation based on manifold assumption." Journal of Visual Communication and Image Representation 25, no. 5 (July 2014): 740–47. http://dx.doi.org/10.1016/j.jvcir.2014.01.006.

3

Bruske, J., and G. Sommer. "Intrinsic dimensionality estimation with optimally topology preserving maps." IEEE Transactions on Pattern Analysis and Machine Intelligence 20, no. 5 (May 1998): 572–75. http://dx.doi.org/10.1109/34.682189.

4

Cordes, Dietmar, and Rajesh R. Nandy. "Estimation of the intrinsic dimensionality of fMRI data." NeuroImage 29, no. 1 (January 2006): 145–54. http://dx.doi.org/10.1016/j.neuroimage.2005.07.054.

5

Amsaleg, Laurent, Oussama Chelly, Teddy Furon, Stéphane Girard, Michael E. Houle, Ken-ichi Kawarabayashi, and Michael Nett. "Extreme-value-theoretic estimation of local intrinsic dimensionality." Data Mining and Knowledge Discovery 32, no. 6 (July 27, 2018): 1768–805. http://dx.doi.org/10.1007/s10618-018-0578-6.

6

Altan, Ege, Sara A. Solla, Lee E. Miller, and Eric J. Perreault. "Estimating the dimensionality of the manifold underlying multi-electrode neural recordings." PLOS Computational Biology 17, no. 11 (November 29, 2021): e1008591. http://dx.doi.org/10.1371/journal.pcbi.1008591.

Abstract:
It is generally accepted that the number of neurons in a given brain area far exceeds the number of neurons needed to carry any specific function controlled by that area. For example, motor areas of the human brain contain tens of millions of neurons that control the activation of tens or at most hundreds of muscles. This massive redundancy implies the covariation of many neurons, which constrains the population activity to a low-dimensional manifold within the space of all possible patterns of neural activity. To gain a conceptual understanding of the complexity of the neural activity within a manifold, it is useful to estimate its dimensionality, which quantifies the number of degrees of freedom required to describe the observed population activity without significant information loss. While there are many algorithms for dimensionality estimation, we do not know which are well suited for analyzing neural activity. The objective of this study was to evaluate the efficacy of several representative algorithms for estimating the dimensionality of linearly and nonlinearly embedded data. We generated synthetic neural recordings with known intrinsic dimensionality and used them to test the algorithms’ accuracy and robustness. We emulated some of the important challenges associated with experimental data by adding noise, altering the nature of the embedding of the low-dimensional manifold within the high-dimensional recordings, varying the dimensionality of the manifold, and limiting the amount of available data. We demonstrated that linear algorithms overestimate the dimensionality of nonlinear, noise-free data. In cases of high noise, most algorithms overestimated the dimensionality. We thus developed a denoising algorithm based on deep learning, the “Joint Autoencoder”, which significantly improved subsequent dimensionality estimation. 
Critically, we found that all algorithms failed when the intrinsic dimensionality was high (above 20) or when the amount of data used for estimation was low. Based on the challenges we observed, we formulated a pipeline for estimating the dimensionality of experimental neural data.
7

Heylen, Rob, and Paul Scheunders. "Hyperspectral Intrinsic Dimensionality Estimation With Nearest-Neighbor Distance Ratios." IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing 6, no. 2 (April 2013): 570–79. http://dx.doi.org/10.1109/jstars.2013.2256338.

8

Wang, Xiaorong, and Aiqing Xu. "Intrinsic Dimensionality Estimation for Data Points in Local Region." Sankhya B 81, no. 1 (March 15, 2018): 123–32. http://dx.doi.org/10.1007/s13571-018-0156-3.

9

Karbauskaitė, Rasa, and Gintautas Dzemyda. "Geodesic distances in the intrinsic dimensionality estimation using packing numbers." Nonlinear Analysis: Modelling and Control 19, no. 4 (December 10, 2014): 578–91. http://dx.doi.org/10.15388/na.2014.4.4.

10

Benkő, Zsigmond, Marcell Stippinger, Roberta Rehus, Attila Bencze, Dániel Fabó, Boglárka Hajnal, Loránd G. Eröss, András Telcs, and Zoltán Somogyvári. "Manifold-adaptive dimension estimation revisited." PeerJ Computer Science 8 (January 6, 2022): e790. http://dx.doi.org/10.7717/peerj-cs.790.

Abstract:
Data dimensionality informs us about data complexity and sets limits on the structure of successful signal processing pipelines. In this work we revisit and improve the manifold-adaptive Farahmand-Szepesvári-Audibert (FSA) dimension estimator, making it one of the best nearest-neighbor-based dimension estimators available. We compute the probability density function of local FSA estimates when the local manifold density is uniform. Based on this probability density function, we propose to use the median of local estimates as a basic global measure of intrinsic dimensionality, and we demonstrate the advantages of this asymptotically unbiased estimator over the previously proposed statistics: the mode and the mean. Additionally, from the probability density function we derive the maximum likelihood formula for global intrinsic dimensionality under the i.i.d. assumption. We tackle edge and finite-sample effects with an exponential correction formula, calibrated on hypercube datasets. We compare the performance of the corrected median-FSA estimator with kNN estimators: maximum likelihood (Levina-Bickel), 2NN, and two implementations of DANCo (R and MATLAB). We show that the corrected median-FSA estimator beats the maximum likelihood estimator and is on an equal footing with DANCo on standard synthetic benchmarks according to mean percentage error and error rate metrics. With the median-FSA algorithm, we reveal diverse changes in the neural dynamics during resting state and epileptic seizures. We identify brain areas with lower-dimensional dynamics that are possible causal sources and candidates for being seizure onset zones.
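The local FSA estimate mentioned in this abstract has a simple closed form, d(x) = ln 2 / ln(R_2k(x) / R_k(x)), where R_k(x) is the distance from x to its k-th nearest neighbor; the abstract's proposal is to aggregate the local estimates by their median. A minimal uncorrected sketch (the paper's exponential finite-sample correction is omitted):

```python
import numpy as np

def fsa_median_id(X, k=5):
    """Median of local FSA estimates ln 2 / ln(R_2k / R_k) (uncorrected sketch)."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    D.sort(axis=1)                       # row i: sorted distances from point i (self first)
    Rk, R2k = D[:, k], D[:, 2 * k]       # distances to k-th and 2k-th neighbors
    mask = R2k > Rk                      # guard against zero denominators
    local = np.log(2.0) / np.log(R2k[mask] / Rk[mask])
    return float(np.median(local))
```

On uniformly sampled low-dimensional data the median of these local estimates sits close to the true dimension, with the negative bias at small sample sizes that the paper's correction formula targets.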

Dissertations / Theses on the topic "Intrinsic dimensionality estimation"

1

Kalantan, Zakiah Ibrahim. "Methods for estimation of intrinsic dimensionality." Thesis, Durham University, 2014. http://etheses.dur.ac.uk/9500/.

Abstract:
Dimension reduction is an important tool used to describe the structure of complex data (explicitly or implicitly) through a small but sufficient number of variables, and thereby make data analysis more efficient. It is also useful for visualization purposes. Dimension reduction helps statisticians to overcome the ‘curse of dimensionality’. However, most dimension reduction techniques require the intrinsic dimension of the low-dimensional subspace to be fixed in advance. The availability of reliable intrinsic dimension (ID) estimation techniques is of major importance. The main goal of this thesis is to develop algorithms for determining the intrinsic dimensions of recorded data sets in a nonlinear context. Whilst this is a well-researched topic for linear planes, based mainly on principal components analysis, relatively little attention has been paid to ways of estimating this number for non-linear variable interrelationships. The proposed algorithms here are based on existing concepts that can be categorized into local methods, relying on randomly selected subsets of a recorded variable set, and global methods, utilizing the entire data set. This thesis provides an overview of ID estimation techniques, with special consideration given to recent developments in non-linear techniques, such as manifold charting and fractal-based methods. Despite their nominal existence, the practical implementation of these techniques is far from straightforward. The intrinsic dimension is estimated via Brand's algorithm by examining the growth point process, which counts the number of points in hyper-spheres. The estimation needs to determine the starting point for each hyper-sphere. In this thesis we provide settings for selecting starting points which work well for most data sets. Additionally we propose approaches for estimating dimensionality via Brand's algorithm, the Dip method and the Regression method.
Other approaches are proposed for estimating the intrinsic dimension by fractal dimension estimation methods, which exploit the intrinsic geometry of a data set. The most popular concept from this family of methods is the correlation dimension, which requires the estimation of the correlation integral for a ball of radius tending to 0. In this thesis we propose new approaches to approximate the correlation integral in this limit. The new approaches are the Intercept method, the Slope method and the Polynomial method. In addition we propose a new approach, a localized global method, which could be defined as a local version of global ID methods. The objective of the localized global approach is to improve the algorithm based on a local ID method, which could significantly reduce the negative bias. Experimental results on real world and simulated data are used to demonstrate the algorithms and compare them to other methodology. A simulation study which verifies the effectiveness of the proposed methods is also provided. Finally, these algorithms are contrasted using a recorded data set from an industrial melter process.
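The correlation dimension discussed in this abstract is classically estimated from the slope of log C(r) versus log r, where the correlation integral C(r) is the fraction of point pairs closer than r. A plain least-squares sketch of that classical quantity (the thesis' Intercept, Slope and Polynomial refinements are not reproduced here, and the radius grid is an assumption):

```python
import numpy as np

def correlation_dimension(X, r_grid=None):
    """Classic Grassberger-Procaccia slope estimate of the correlation dimension."""
    n = len(X)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    d = D[np.triu_indices(n, k=1)]                           # all pairwise distances
    if r_grid is None:
        # small radii: 1%-20% quantiles of the pairwise distances (an assumption)
        r_grid = np.quantile(d, np.linspace(0.01, 0.2, 10))
    C = np.array([(d < r).mean() for r in r_grid])           # correlation integral C(r)
    slope, _ = np.polyfit(np.log(r_grid), np.log(C), 1)      # slope of log C vs log r
    return float(slope)
```

The fixed quantile grid sidesteps, rather than solves, the r → 0 limit that the thesis' new approaches address: too-large radii bias the slope downward through boundary effects, too-small radii leave too few pairs.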
2

Gashler, Michael S. "Advancing the Effectiveness of Non-Linear Dimensionality Reduction Techniques." BYU ScholarsArchive, 2012. https://scholarsarchive.byu.edu/etd/3216.

Abstract:
Data that is represented with high dimensionality presents a computational complexity challenge for many existing algorithms. Limiting dimensionality by discarding attributes is sometimes a poor solution to this problem because significant high-level concepts may be encoded in the data across many or all of the attributes. Non-linear dimensionality reduction (NLDR) techniques have been successful with many problems at minimizing dimensionality while preserving intrinsic high-level concepts that are encoded with varying combinations of attributes. Unfortunately, many challenges remain with existing NLDR techniques, including excessive computational requirements, an inability to benefit from prior knowledge, and an inability to handle certain difficult conditions that occur in data with many real-world problems. Further, certain practical factors have limited advancement in NLDR, such as a lack of clarity regarding suitable applications for NLDR, and a general unavailability of efficient implementations of complex algorithms. This dissertation presents a collection of papers that advance the state of NLDR in each of these areas. Contributions of this dissertation include:
• An NLDR algorithm, called Manifold Sculpting, that optimizes its solution using graduated optimization. This approach enables it to obtain better results than methods that only optimize an approximate problem. Additionally, Manifold Sculpting can benefit from prior knowledge about the problem.
• An intelligent neighbor-finding technique called SAFFRON that improves the breadth of problems that existing NLDR techniques can handle.
• A neighborhood refinement technique called CycleCut that further increases the robustness of existing NLDR techniques, and that can work in conjunction with SAFFRON to solve difficult problems.
• Demonstrations of specific applications for NLDR techniques, including the estimation of state within dynamical systems, training of recurrent neural networks, and imputing missing values in data.
• An open source toolkit containing each of the techniques described in this dissertation, as well as several existing NLDR algorithms, and other useful machine learning methods.
3

Winiger, Joakim. "Estimating the intrinsic dimensionality of high dimensional data." Thesis, KTH, Matematisk statistik, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-163170.

Abstract:
This report presents a review of some methods for estimating what is known as intrinsic dimensionality (ID). The principle behind intrinsic dimensionality estimation is that it is frequently possible to find some structure in data which makes it possible to re-express it using fewer coordinates (dimensions). The main objective of the report is to solve a common problem: given a (typically high-dimensional) dataset, determine whether some of the dimensions are redundant, and if so, find a lower-dimensional representation of it. We introduce different approaches for ID estimation, motivate them theoretically, and compare them using both synthetic and real datasets. The first three methods estimate the ID of a dataset while the fourth finds a low-dimensional version of the data. This is a useful order in which to organize the task: given an estimate of the ID of a dataset, construct a simpler version of the dataset using this number of dimensions. The results show that it is possible to obtain a remarkable reduction in dimensionality for high-dimensional data. The different methods give similar results despite their different theoretical backgrounds, and behave as expected when used on synthetic datasets with known ID.
4

Randazzo, Vincenzo. "Novel neural approaches to data topology analysis and telemedicine." Doctoral thesis, Politecnico di Torino, 2020. http://hdl.handle.net/11583/2850610.

5

Ahram, Tareq. "Information Retrieval Performance Enhancement Using the Average Standard Estimator and the Multi-Criteria Decision Weighted Set." Doctoral diss., University of Central Florida, 2008. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/3280.

Abstract:
Information retrieval is much more challenging than traditional small document collection retrieval. The main difference is the importance of correlations between related concepts in complex data structures. These structures have been studied by several information retrieval systems. This research began by performing a comprehensive review and comparison of several techniques of matrix dimensionality estimation and their respective effects on enhancing retrieval performance using singular value decomposition and latent semantic analysis. Two novel techniques have been introduced in this research to enhance intrinsic dimensionality estimation, the Multi-criteria Decision Weighted model to estimate matrix intrinsic dimensionality for large document collections and the Average Standard Estimator (ASE) for estimating data intrinsic dimensionality based on the singular value decomposition (SVD). ASE estimates the level of significance for singular values resulting from the singular value decomposition. ASE assumes that those variables with deep relations have sufficient correlation and that only those relationships with high singular values are significant and should be maintained. Experimental results over all possible dimensions indicated that ASE improved matrix intrinsic dimensionality estimation by including the effect of both singular values magnitude of decrease and random noise distracters. Analysis based on selected performance measures indicates that for each document collection there is a region of lower dimensionalities associated with improved retrieval performance. However, there was clear disagreement between the various performance measures on the model associated with best performance. 
The introduction of the multi-weighted model and Analytical Hierarchy Processing (AHP) analysis helped rank dimensionality estimation techniques and facilitated satisfying overall model goals by leveraging contradicting constraints and satisfying information retrieval priorities. ASE provided the best estimate for MEDLINE intrinsic dimensionality among all other dimensionality estimation techniques, and further, ASE improved precision and relative relevance by 10.2% and 7.4% respectively. AHP analysis indicates that ASE and the weighted model ranked best among other methods, with 30.3% and 20.3% in satisfying overall model goals in MEDLINE and 22.6% and 25.1% for CRANFIELD. The weighted model improved MEDLINE relative relevance by 4.4%, while the scree plot, weighted model, and ASE provided better estimation of data intrinsic dimensionality for the CRANFIELD collection than Kaiser-Guttman and percentage of variance. The ASE dimensionality estimation technique provided a better estimation of CISI intrinsic dimensionality than all other tested methods, since all methods except ASE tend to underestimate the CISI document collection's intrinsic dimensionality. ASE improved CISI average relative relevance and average search length by 28.4% and 22.0% respectively. This research provided evidence supporting a system using a weighted multi-criteria performance evaluation technique, resulting in better overall performance than a single-criteria ranking model. Thus, the weighted multi-criteria model with dimensionality reduction provides a more efficient implementation for information retrieval than using a full-rank model.
Ph.D., Department of Industrial Engineering and Management Systems, Engineering and Computer Science.
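Among the baselines this abstract compares against, the percentage-of-variance criterion is easy to state: keep the smallest number of singular values whose squared sum captures a chosen fraction of the total variance. A minimal sketch of that baseline (not of ASE itself, whose exact significance criterion is not given here; the 90% default is a common convention, not a value from the dissertation):

```python
import numpy as np

def variance_rank(A, threshold=0.90):
    """Percentage-of-variance dimensionality baseline via SVD."""
    s = np.linalg.svd(A - A.mean(axis=0), compute_uv=False)  # singular values, descending
    frac = np.cumsum(s**2) / np.sum(s**2)                    # cumulative variance fraction
    return int(np.searchsorted(frac, threshold) + 1)         # smallest k reaching threshold
```

ASE instead judges each singular value's significance against the magnitude-of-decrease pattern and random-noise distracters, which is what lets it avoid the under- and over-estimation the abstract reports for the simpler cutoffs.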
6

Lanteri, Alessandro. "Novel methods for intrinsic dimension estimation and manifold learning." Doctoral thesis, 2016. http://hdl.handle.net/11573/905425.

Abstract:
One of the most challenging problems in modern science is how to deal with the huge amount of data that today's technologies provide. Several difficulties may arise. For instance, the number of samples may be too big and the stream of incoming data may be faster than the algorithm needed to process them. Another common problem is that when the data dimension grows, so does the volume of the space, leading to a sparsification of the available data. This may cause problems in the statistical analysis since the data needed to support our conclusions often grow exponentially with the dimension. This problem is commonly referred to as the Curse of Dimensionality, and it is one of the reasons why high dimensional data can not be analyzed efficiently with traditional methods. Classical methods for dimensionality reduction, like principal component analysis and factor analysis, may fail due to a nonlinear structure of the data. In recent years several methods for nonlinear dimensionality reduction have been proposed. A general way to model a high dimensional data set is to represent the observations as noisy samples drawn from a probability distribution mu in the real coordinate space of D dimensions. It has been observed that the essential support of mu can often be well approximated by low dimensional sets. These sets can be assumed to be low dimensional manifolds embedded in the ambient dimension D. A manifold is a topological space which globally may not be Euclidean but in a small neighborhood of each point behaves like a Euclidean space. In this setting we call the dimension of the manifold the intrinsic dimension, which is usually much lower than the ambient dimension D. Roughly speaking, the intrinsic dimension of a data set can be described as the minimum number of variables needed to represent the data without significant loss of information. In this work we propose different methods aimed at estimating the intrinsic dimension.
The first method we present models the neighbors of each point as stochastic processes, in such a way that a closed form likelihood function can be written. This leads to a closed form maximum likelihood estimator (MLE) for the intrinsic dimension, which has all the good features that an MLE can have. The second method is based on a multiscale singular value decomposition (MSVD) of the data. This method performs singular value decomposition (SVD) on neighborhoods of increasing size and finds an estimate for the intrinsic dimension by studying the behavior of the singular values as the radius of the neighborhood increases. We also introduce an algorithm to estimate the model parameters when the data are assumed to be sampled around an unknown number of planes with different intrinsic dimensions, embedded in a high dimensional space. This kind of model has many applications in computer vision and pattern recognition, where the data can be described by multiple linear structures or need to be clustered into groups that can be represented by low dimensional hyperplanes. The algorithm relies on both MSVD and spectral clustering, and it is able to estimate the number of planes, their dimensions, as well as their arrangement in the ambient space. Finally, we propose a novel method for manifold reconstruction based on a multiscale approach, which approximates the manifold from coarse to fine scales with increasing precision. The basic idea is to produce, at a generic scale j, a piecewise linear approximation of the manifold using a collection of low dimensional planes and use those planes to create clusters for the data. At scale j + 1, each cluster is independently approximated by another collection of low dimensional planes. The process is iterated until the desired precision is achieved. This algorithm is fast because it is highly parallelizable and its computational time is independent of the sample size. Moreover this method automatically constructs a tree structure for the data.
This feature can be particularly useful in applications which require an a priori tree data structure. The aim of the collection of methods proposed in this work is to provide algorithms to learn and estimate the underlying structure of high dimensional datasets.

Book chapters on the topic "Intrinsic dimensionality estimation"

1

Bruske, J., and G. Sommer. "An algorithm for intrinsic dimensionality estimation." In Computer Analysis of Images and Patterns, 9–16. Berlin, Heidelberg: Springer Berlin Heidelberg, 1997. http://dx.doi.org/10.1007/3-540-63460-6_94.

2

Amsaleg, Laurent, Oussama Chelly, Michael E. Houle, Ken-ichi Kawarabayashi, Miloš Radovanović, and Weeris Treeratanajaru. "Intrinsic Dimensionality Estimation within Tight Localities." In Proceedings of the 2019 SIAM International Conference on Data Mining, 181–89. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2019. http://dx.doi.org/10.1137/1.9781611975673.21.

3

Bruske, J., and G. Sommer. "Topology representing networks for intrinsic dimensionality estimation." In Lecture Notes in Computer Science, 595–600. Berlin, Heidelberg: Springer Berlin Heidelberg, 1997. http://dx.doi.org/10.1007/bfb0020219.

4

Shukur, Mohammed Hussein, T. Sobha Rani, S. Durga Bhavani, G. Narahari Sastry, and Surampudi Bapi Raju. "Local and Global Intrinsic Dimensionality Estimation for Better Chemical Space Representation." In Lecture Notes in Computer Science, 329–38. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-25725-4_29.

5

Heylen, Rob, Mario Parente, and Paul Scheunders. "Estimation of the Intrinsic Dimensionality in Hyperspectral Imagery via the Hubness Phenomenon." In Latent Variable Analysis and Signal Separation, 357–66. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-53547-0_34.

6

Cohen, Albert, Wolfgang Dahmen, and Ron DeVore. "State Estimation—The Role of Reduced Models." In SEMA SIMAI Springer Series, 57–77. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-86236-7_4.

Abstract:
The exploration of complex physical or technological processes usually requires exploiting available information from different sources: (i) physical laws often represented as a family of parameter dependent partial differential equations and (ii) data provided by measurement devices or sensors. The number of sensors is typically limited and data acquisition may be expensive and in some cases even harmful. This article reviews some recent developments for this “small-data” scenario where inversion is strongly aggravated by the typically large parametric dimensionality. The proposed concepts may be viewed as exploring alternatives to Bayesian inversion in favor of more deterministic accuracy quantification related to the required computational complexity. We discuss optimality criteria which delineate intrinsic information limits, and highlight the role of reduced models for developing efficient computational strategies. In particular, the need to adapt the reduced models not to a specific (possibly noisy) data set but rather to the sensor system is a central theme. This, in turn, is facilitated by exploiting geometric perspectives based on proper stable variational formulations of the continuous model.
7

Goel, A., S. S. Rao, and A. Passamante. "Estimating Local Intrinsic Dimensionality Using Thresholding Techniques." In NATO ASI Series, 125–28. Boston, MA: Springer New York, 1989. http://dx.doi.org/10.1007/978-1-4757-0623-9_13.

8

Bassis, S., A. Rozza, C. Ceruti, G. Lombardi, E. Casiraghi, and P. Campadelli. "A Novel Intrinsic Dimensionality Estimator Based on Rank-Order Statistics." In Clustering High--Dimensional Data, 102–17. Berlin, Heidelberg: Springer Berlin Heidelberg, 2015. http://dx.doi.org/10.1007/978-3-662-48577-4_7.


Conference papers on the topic "Intrinsic dimensionality estimation"

1

Karantzalos, Konstantinos. "Intrinsic dimensionality estimation and dimensionality reduction through scale space filtering." In 2009 16th International Conference on Digital Signal Processing (DSP). IEEE, 2009. http://dx.doi.org/10.1109/icdsp.2009.5201196.

2

Li, Chun-guang, Jun Guo, and Xiangfei Nie. "Intrinsic Dimensionality Estimation with Neighborhood Convex Hull." In 2007 International Conference on Computational Intelligence and Security (CIS 2007). IEEE, 2007. http://dx.doi.org/10.1109/cis.2007.104.

3

Hein, Matthias, and Jean-Yves Audibert. "Intrinsic dimensionality estimation of submanifolds in R^d." In Proceedings of the 22nd International Conference on Machine Learning (ICML '05). New York, New York, USA: ACM Press, 2005. http://dx.doi.org/10.1145/1102351.1102388.

4

Little, Anna V., Jason Lee, Yoon-Mo Jung, and Mauro Maggioni. "Estimation of intrinsic dimensionality of samples from noisy low-dimensional manifolds in high dimensions with multiscale SVD." In 2009 IEEE/SP 15th Workshop on Statistical Signal Processing (SSP). IEEE, 2009. http://dx.doi.org/10.1109/ssp.2009.5278634.

5

Amsaleg, Laurent, Oussama Chelly, Teddy Furon, Stéphane Girard, Michael E. Houle, Ken-ichi Kawarabayashi, and Michael Nett. "Estimating Local Intrinsic Dimensionality." In KDD '15: The 21th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York, NY, USA: ACM, 2015. http://dx.doi.org/10.1145/2783258.2783405.

6

Luo, Xin, Jia Wang, Huijie Zhang, and Xiao Wang. "Estimating the Intrinsic Dimensionality of Hyperspectral Remote Sensing Imagery with Rare Features." In 2018 1st IEEE International Conference on Knowledge Innovation and Invention (ICKII). IEEE, 2018. http://dx.doi.org/10.1109/ickii.2018.8569176.

7

Bretton-Granatoor, Zachary, Hannah Stealey, Samantha R. Santacruz, and Jarrod A. Lewis-Peacock. "Estimating Intrinsic Manifold Dimensionality to Classify Task-Related Information in Human and Non-Human Primate Data." In 2022 IEEE Biomedical Circuits and Systems Conference (BioCAS). IEEE, 2022. http://dx.doi.org/10.1109/biocas54905.2022.9948604.
