Ready-made bibliography on the topic "Dimensionality"

Create a correct reference in APA, MLA, Chicago, Harvard, and many other styles

Choose a source type:

See the lists of current articles, books, dissertations, abstracts and other scholarly sources on the topic "Dimensionality".

An "Add to bibliography" button is available next to each work in the bibliography. Use it, and we will automatically create a bibliographic reference to the selected work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication in .pdf format and read the online abstract of the work if the relevant parameters are provided in the metadata.

Journal articles on the topic "Dimensionality"

1

Nandakumar, Ratna. "Traditional Dimensionality Versus Essential Dimensionality". Journal of Educational Measurement 28, no. 2 (June 1991): 99–117. http://dx.doi.org/10.1111/j.1745-3984.1991.tb00347.x.

2

Warfield, J. N., and A. N. Christakis. "Dimensionality". Systems Research 4, no. 2 (June 1987): 127–37. http://dx.doi.org/10.1002/sres.3850040207.

3

Li, Hongda, Jian Cui, Xinle Zhang, Yongqi Han, and Liying Cao. "Dimensionality Reduction and Classification of Hyperspectral Remote Sensing Image Feature Extraction". Remote Sensing 14, no. 18 (13.09.2022): 4579. http://dx.doi.org/10.3390/rs14184579.

Abstract:
Terrain classification is an important research direction in the field of remote sensing. Hyperspectral remote sensing image data contain a large amount of rich ground object information. However, such data have the characteristics of high spatial dimensions of features, strong data correlation, high data redundancy, and long operation time, which lead to difficulty in image data classification. A data dimensionality reduction algorithm can transform the data into low-dimensional data with strong features and then classify the dimensionally reduced data. However, most classification methods cannot effectively extract dimensionality-reduced data features. In this paper, different dimensionality reduction and machine learning supervised classification algorithms are explored to determine a suitable combination method of dimensionality reduction and classification for hyperspectral images. Soft and hard classification methods are adopted to achieve the classification of pixels according to diversity. The results show that the data after dimensionality reduction retain the data features with high overall feature correlation, and the data dimension is drastically reduced. The dimensionality reduction method of unified manifold approximation and projection and the classification method of support vector machine achieve the best terrain classification with 99.57% classification accuracy. High-precision fitting of neural networks for soft classification of hyperspectral images with a model fitting correlation coefficient (R2) of up to 0.979 solves the problem of mixed pixel decomposition.
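The reduce-then-classify pipeline described in the abstract above can be illustrated with a short sketch. This is a minimal illustration rather than the authors' code: it assumes the third-party umap-learn and scikit-learn packages and substitutes synthetic data for the hyperspectral image cube.

```python
# Minimal "reduce, then classify" sketch, assuming umap-learn and scikit-learn;
# synthetic data stands in for the hyperspectral pixels in the abstract.
import umap  # pip install umap-learn
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic "pixels": 2000 samples with 200 correlated spectral bands.
X, y = make_classification(n_samples=2000, n_features=200, n_informative=20,
                           n_classes=5, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Dimensionality reduction: learn a 10-dimensional embedding on the training split.
reducer = umap.UMAP(n_components=10, random_state=0).fit(X_tr)
Z_tr, Z_te = reducer.transform(X_tr), reducer.transform(X_te)

# Hard classification of the reduced pixels with a support vector machine.
clf = SVC(kernel="rbf").fit(Z_tr, y_tr)
print("Accuracy on reduced data:", accuracy_score(y_te, clf.predict(Z_te)))
```
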
4

Schwartz, M. A., and C. S. Chen. "Deconstructing Dimensionality". Science 339, no. 6118 (24.01.2013): 402–4. http://dx.doi.org/10.1126/science.1233814.

5

Verberck, Bart. "Dimensionality matters". Nature Physics 12, no. 4 (April 2016): 287. http://dx.doi.org/10.1038/nphys3726.

6

Wojcieszak, Magdalena E. "Three Dimensionality". Television & New Media 10, no. 6 (September 2009): 459–81. http://dx.doi.org/10.1177/1527476409343798.

7

Sharma, Sunil. "The Dimensionality". Volume 5 - 2020, Issue 9 - September 5, no. 9 (17.09.2020): 166–69. http://dx.doi.org/10.38124/ijisrt20sep168.

Abstract:
This thesis is about dimensions and the law of Visual, Physical and Time (V, P, T) dimensions, which identifies more than 9 dimensions (with examples), including the current three spatial dimensions. The general idea of the law is to show that dimensions are categorized by phases rather than by sequences, and that Visual, Physical, and Time are the phases that contain their own specific dimensions, e.g. initial dimensions such as Length, Mass and Infinite Time, respectively. The paper covers several major topics: (i) the non-existence of a 4th dimension, (ii) understanding the law of VPT Dimension, (iii) types, (iv) phases, (v) rules of representation, (vi) the general term and the numerical term, (vii) the human dimension, and some applications. This research can be applied in fields such as physics, cosmology, mathematics and architecture. Topics not covered in this paper include the dimensional formula, theories claiming higher dimensions, and higher dimensions such as the 5th, 6th and so on.
8

Pestov, Vladimir. "Intrinsic dimensionality". SIGSPATIAL Special 2, no. 2 (July 2010): 8–11. http://dx.doi.org/10.1145/1862413.1862416.

9

Goren, Paul. "Dimensionality Redux". Political Research Quarterly 61, no. 1 (March 2008): 162–64. http://dx.doi.org/10.1177/1065912907308653.

10

Shelah, Saharon. "Multi-dimensionality". Israel Journal of Mathematics 74, no. 2-3 (October 1991): 281–88. http://dx.doi.org/10.1007/bf02775792.

Doctoral dissertations on the topic "Dimensionality"

1

Ariu, Kaito. "Online Dimensionality Reduction". Licentiate thesis, KTH, Reglerteknik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-290791.

Abstract:
In this thesis, we investigate online dimensionality reduction methods, where the algorithms learn by sequentially acquiring data. We focus on two specific algorithm design problems in (i) recommender systems and (ii) heterogeneous clustering from binary user feedback. (i) For recommender systems, we consider a system consisting of m users and n items. In each round, a user, selected uniformly at random, arrives at the system and requests a recommendation. The algorithm observes the user id and recommends an item from the item set. A notable restriction here is that the same item cannot be recommended to the same user more than once, a constraint referred to as a no-repetition constraint. We study this problem as a variant of the multi-armed bandit problem and analyze regret under the various structures pertaining to items and users. We derive fundamental limits of regret and algorithms that achieve these limits order-wise. The analysis explicitly highlights the importance of each component of regret. For example, we can distinguish the regret due to the no-repetition constraint, the regret generated to learn the statistics of a user's preference for an item, and the regret generated to learn the low-dimensional space of the users and items. (ii) In the clustering with binary feedback problem, the objective is to classify items solely based on limited user feedback. More precisely, users are asked simple questions with binary answers. A notable difficulty stems from the heterogeneity in the difficulty of classifying the various items (some items require more feedback to be classified than others). For this problem, we derive fundamental limits on the cluster recovery rates for both offline and online algorithms. For the offline setting, we devise a simple algorithm that achieves the limit order-wise. For the online setting, we propose an algorithm inspired by the lower bound. For both problems, we evaluate the proposed algorithms by inspecting their theoretical guarantees and using numerical experiments performed on synthetic and non-synthetic datasets.
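To make the recommendation setting above concrete, here is a toy sketch of a bandit-style recommender that enforces a no-repetition constraint (each user receives each item at most once). It is an illustrative UCB-style heuristic with assumed Bernoulli rewards shared across users, not the algorithm analysed in the thesis.

```python
# Toy sketch: UCB-style recommendations under a no-repetition constraint.
# Illustrative only; rewards are assumed Bernoulli and item statistics are
# shared across users, which is a simplification of the thesis's model.
import numpy as np

rng = np.random.default_rng(0)
m_users, n_items, rounds = 20, 10, 1500
true_p = rng.uniform(0.1, 0.9, size=n_items)   # hidden item qualities

pulls = np.ones(n_items)                        # optimistic initialisation
rewards = np.ones(n_items)
recommended = [set() for _ in range(m_users)]   # items already shown per user
total_reward = 0.0

for t in range(1, rounds + 1):
    user = rng.integers(m_users)                # user arrives uniformly at random
    allowed = [i for i in range(n_items) if i not in recommended[user]]
    if not allowed:                             # this user has seen every item
        continue
    ucb = rewards / pulls + np.sqrt(2 * np.log(t) / pulls)
    item = max(allowed, key=lambda i: ucb[i])   # best item among allowed ones
    reward = rng.random() < true_p[item]
    pulls[item] += 1
    rewards[item] += reward
    recommended[user].add(item)
    total_reward += reward

print(f"average reward: {total_reward / rounds:.3f}")
```
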
2

Legramanti, Sirio. "Bayesian dimensionality reduction". Doctoral thesis, Università Bocconi, 2021. http://hdl.handle.net/11565/4035711.

Abstract:
We are currently witnessing an explosion in the amount of available data. Such growth involves not only the number of data points but also their dimensionality. This poses new challenges to statistical modeling and computations, thus making dimensionality reduction more central than ever. In the present thesis, we provide methodological, computational and theoretical advancements in Bayesian dimensionality reduction via novel structured priors. Namely, we develop a new increasing shrinkage prior and illustrate how it can be employed to discard redundant dimensions in Gaussian factor models. In order to make it usable for larger datasets, we also investigate variational methods for posterior inference under this proposed prior. Beyond traditional models and parameter spaces, we also provide a different take on dimensionality reduction, focusing on community detection in networks. For this purpose, we define a general class of Bayesian nonparametric priors that encompasses existing stochastic block models as special cases and includes promising unexplored options. Our Bayesian approach allows for a natural incorporation of node attributes and facilitates uncertainty quantification as well as model selection.
3

Kelly, Wallace Eugene. "Dimensionality in fuzzy systems". [S.l.]: Universität Stuttgart, Fakultätsübergreifend / Sonstige Einrichtung, 1997. http://www.bsz-bw.de/cgi-bin/xvms.cgi?SWB6783685.

4

Bolelli, Maria Virginia. "Diffusion Maps for Dimensionality Reduction". Master's thesis, Alma Mater Studiorum - Università di Bologna, 2019. http://amslaurea.unibo.it/18246/.

Abstract:
In this thesis we present diffusion maps, a framework based on diffusion processes for finding meaningful geometric descriptions of data sets. A diffusion process can be described via an iterative application of the heat kernel, which has two main characteristics: it satisfies a Markov semigroup property, and its level sets encode all geometric features of the space. This process, well known on regular manifolds, has been extended to general data sets by Coifman and Lafon. They define a diffusion kernel starting from the geometric and density properties of the data. This kernel is a compact operator, and the projection on its eigenvectors at different instants of time provides a family of embeddings of a data set into a suitable Euclidean space. The projection on the first eigenvectors naturally leads to a dimensionality reduction algorithm. A numerical implementation is provided on different data sets.
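The recipe summarised above (heat kernel, Markov normalisation, eigendecomposition, embedding by the leading non-trivial eigenvectors) fits in a few lines of NumPy. The following is a minimal sketch with an assumed bandwidth parameter eps, not the thesis's implementation.

```python
# Minimal diffusion-maps sketch: kernel -> Markov matrix -> spectral embedding.
import numpy as np

def diffusion_map(X, n_components=2, eps=1.0, t=1):
    # Pairwise squared distances and Gaussian (heat) kernel.
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq / eps)

    # Row-normalising K gives the Markov transition matrix P = D^{-1} K.
    d = K.sum(axis=1)

    # Eigendecompose the symmetric conjugate S = D^{-1/2} K D^{-1/2};
    # its eigenvectors yield the right eigenvectors of P.
    S = K / np.sqrt(np.outer(d, d))
    vals, vecs = np.linalg.eigh(S)
    order = np.argsort(vals)[::-1]
    vals, vecs = vals[order], vecs[:, order]
    psi = vecs / np.sqrt(d)[:, None]

    # Skip the trivial constant eigenvector; scale by eigenvalues^t.
    return psi[:, 1:n_components + 1] * (vals[1:n_components + 1] ** t)

# Example: embed noisy points on a circle into 2 diffusion coordinates.
theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
X = np.c_[np.cos(theta), np.sin(theta)]
X += 0.05 * np.random.default_rng(0).normal(size=X.shape)
Y = diffusion_map(X, n_components=2, eps=0.5)
print(Y.shape)  # (200, 2)
```
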
5

Khosla, Nitin, and n/a. "Dimensionality Reduction Using Factor Analysis". Griffith University. School of Engineering, 2006. http://www4.gu.edu.au:8080/adt-root/public/adt-QGU20061010.151217.

Abstract:
In many pattern recognition applications, a large number of features are extracted in order to ensure an accurate classification of unknown classes. One way to cope with high dimensionality is to first reduce the dimensionality of the data to a manageable size, keeping as much of the original information as possible, and then feed the reduced-dimensional data into a pattern recognition system. In this situation, the dimensionality reduction process becomes the pre-processing stage of the pattern recognition system. In addition, probability density estimation with fewer variables is a simpler approach for dimensionality reduction. Dimensionality reduction is useful in speech recognition, data compression, visualization and exploratory data analysis. Techniques that can be used for dimensionality reduction include Factor Analysis (FA), Principal Component Analysis (PCA), and Linear Discriminant Analysis (LDA). Factor Analysis can be considered an extension of Principal Component Analysis. The EM (expectation-maximization) algorithm is ideally suited to problems of this sort, in that it produces maximum-likelihood (ML) estimates of parameters when there is a many-to-one mapping from an underlying distribution to the distribution governing the observations; the maximization step then provides a new estimate of the parameters. This research work compares Factor Analysis (based on the expectation-maximization algorithm), Principal Component Analysis and Linear Discriminant Analysis for dimensionality reduction, and investigates Local Factor Analysis (EM-based) and Local Principal Component Analysis using Vector Quantization.
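The comparison of factor analysis and PCA as dimensionality reducers described above can be sketched with scikit-learn; its FactorAnalysis estimator fits a maximum-likelihood factor model and is used here only as a stand-in for the EM-based factor analysis in the thesis. The sketch is illustrative, run on a small built-in dataset rather than the thesis's data.

```python
# Sketch comparing PCA and maximum-likelihood Factor Analysis as
# dimensionality reducers before a simple classifier. Illustrative only.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA, FactorAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for name, reducer in [("PCA", PCA(n_components=15)),
                      ("Factor Analysis", FactorAnalysis(n_components=15))]:
    # Reduce to 15 components, then classify the reduced data.
    model = make_pipeline(reducer, LogisticRegression(max_iter=2000))
    model.fit(X_tr, y_tr)
    acc = accuracy_score(y_te, model.predict(X_te))
    print(f"{name:15s} -> 15 components, accuracy {acc:.3f}")
```
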
6

Barlow, Thomas W. "Reduced dimensionality in molecular representation". Thesis, University of Oxford, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.320639.

7

Ahrens, Johan. "Non-contextual inequalities and dimensionality". Doctoral thesis, Stockholms universitet, Fysikum, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-116832.

Abstract:
This PhD thesis is based on the five experiments I have performed during my time as a PhD student. Three experiments are implementations of non-contextual inequalities and two are implementations of witness functions for classical and quantum dimensions of sets of states. A dimension witness is an operator function that produces a value when applied to a set of states. This value has different upper bounds depending on the dimension of the set of states and also on whether the states are classical or quantum. Therefore a dimension witness can only give a lower bound on the dimension of the set of states. The first dimension witness is based on the CHSH inequality and has the ability to discriminate between classical and quantum sets of states of two and three dimensions; it can also indicate if a set of states must be of dimension four or higher. The second dimension witness is based on a set-theoretical representation of the possible combinations of states and measurements and grows with the dimension of the set of states one wants to identify; on the other hand, there is a formula for expanding it to arbitrary dimension. Non-contextual hidden variable models are a family of hidden variable models which include local hidden variable models, so in a sense non-contextual inequalities are a generalisation of Bell inequalities. The experiments presented in this thesis all use single-particle quantum systems. The first experiment is a violation of the KCBS inequality, the simplest correlation inequality violated by quantum mechanics. The second experiment is a violation of the Wright inequality, the simplest inequality violated by quantum mechanics; it contains only projectors and no correlations. The final experiment of the thesis is an implementation of a Hardy-like equality for non-contextuality: the operators in the KCBS inequality have been rotated so that one term in the sum is zero for all non-contextual hidden variable models, and we get a contradiction since quantum mechanics gives a non-zero value for all terms.
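To make the CHSH quantity behind the first dimension witness concrete, the sketch below evaluates the CHSH value for a maximally entangled two-qubit state with standard measurement angles, recovering the quantum value 2*sqrt(2) against the classical bound of 2. This is a generic textbook computation, not the thesis's witness or experimental data.

```python
# CHSH value for the maximally entangled state |Phi+> = (|00> + |11>)/sqrt(2)
# with observables A(theta) = cos(theta) Z + sin(theta) X on each qubit.
# Quantum mechanics reaches 2*sqrt(2); any local (classical) model is bounded by 2.
import numpy as np

Z = np.array([[1, 0], [0, -1]], dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
phi_plus = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

def A(theta):
    return np.cos(theta) * Z + np.sin(theta) * X

def E(theta_a, theta_b):
    # Correlation <Phi+| A(a) x A(b) |Phi+>; equals cos(a - b) for this state.
    obs = np.kron(A(theta_a), A(theta_b))
    return np.real(phi_plus.conj() @ obs @ phi_plus)

a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, -np.pi / 4
S = E(a, b) + E(a, b2) + E(a2, b) - E(a2, b2)
print(f"CHSH value S = {S:.4f} (classical bound 2, quantum bound {2*np.sqrt(2):.4f})")
```
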
8

Hyde, Susan Margaret. "Understanding dimensionality in health care". Thesis, Manchester Metropolitan University, 2014. http://e-space.mmu.ac.uk/326230/.

Abstract:
In recent years, the quality of non-clinical elements of health care has been challenged in the UK. While dimensions such as the environment, communications, reliability, access, etc., all contribute to making patients feel more at ease during a time when they are at their most vulnerable, they often fall short of what they should be. This paper supports the shift towards greater emphasis on understanding the functional elements of health services in an effort to improve patient experience and outcomes. While there is an abundance of literature discussing the evaluation of service quality, much of this focuses on the SERVQUAL model and, although there is increasing debate about its relevance across sectors, no alternative has been offered. This paper argues that the model lacks substance as a tool to evaluate quality in the complex environment of health care. The study embraced multiple methods to acquire a greater understanding of service quality constructs within the health care sector. It was carried out in three phases. The first comprised critical incident interviews with service users, which highlighted both successes and failings in their care. This was followed by staff interviews and focus groups representing a cross section of the public, providing an insight into how different groups perceive quality. The data was used in the design of a detailed questionnaire which attracted in excess of 1,000 responses. Factor analysis was then used to develop a framework of key elements relevant both to hospital settings and to those services provided in the community such as general practice. The findings provide a four-factor model comprising: trust, access, a caring approach and professionalism, three of which are comprised primarily of human interactions. These findings suggest that although the original SERVQUAL ten-item model does have some relevance, with the adapted five-item model being far too simplistic, neither fully addresses the needs of a sector as unique and high contact as health care. The results point the way for further research to develop a detailed model to evaluate service quality in health care settings.
9

Vamulapalli, Harika Rao. "On Dimensionality Reduction of Data". ScholarWorks@UNO, 2010. http://scholarworks.uno.edu/td/1211.

Abstract:
The random projection method is one of the important tools for the dimensionality reduction of data and can be made efficient with strong error guarantees. In this thesis, we focus on linear transforms of high-dimensional data to a low-dimensional space satisfying the Johnson-Lindenstrauss lemma. In addition, we prove some theoretical results relating to the projections that are of interest in practical applications. We show how the technique can be applied to synthetic data with a probabilistic guarantee on the pairwise distances. The connection between dimensionality reduction and compressed sensing is also discussed.
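The Johnson-Lindenstrauss-style guarantee mentioned above can be checked empirically with a Gaussian random projection. The sketch below uses scikit-learn's random_projection utilities on synthetic data and is illustrative rather than a reproduction of the thesis's results.

```python
# Sketch: Gaussian random projection with a Johnson-Lindenstrauss target
# dimension, checking how well pairwise distances are preserved.
import numpy as np
from sklearn.metrics import pairwise_distances
from sklearn.random_projection import (GaussianRandomProjection,
                                       johnson_lindenstrauss_min_dim)

rng = np.random.default_rng(0)
n, d, eps = 300, 10_000, 0.25
X = rng.normal(size=(n, d))                      # synthetic high-dimensional data

# Target dimension guaranteed by the JL lemma for distortion eps.
k = johnson_lindenstrauss_min_dim(n_samples=n, eps=eps)
X_low = GaussianRandomProjection(n_components=k, random_state=0).fit_transform(X)

orig = pairwise_distances(X)
proj = pairwise_distances(X_low)
mask = ~np.eye(n, dtype=bool)                    # ignore zero self-distances
ratios = proj[mask] / orig[mask]

print(f"JL target dimension k = {k} for eps = {eps}")
print(f"distance ratios: min {ratios.min():.3f}, max {ratios.max():.3f}")
```
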
10

Widemann, David P. "Dimensionality reduction for hyperspectral data". College Park, Md.: University of Maryland, 2008. http://hdl.handle.net/1903/8448.

Abstract:
Thesis (Ph. D.) -- University of Maryland, College Park, 2008.
Thesis research directed by: Dept. of Mathematics. Title from t.p. of PDF. Includes bibliographical references. Published by UMI Dissertation Services, Ann Arbor, Mich. Also available in paper.

Books on the topic "Dimensionality"

1

Lee, John A., and Michel Verleysen, eds. Nonlinear Dimensionality Reduction. New York, NY: Springer New York, 2007. http://dx.doi.org/10.1007/978-0-387-39351-3.

2

Lespinats, Sylvain, Benoit Colange, and Denys Dutykh. Nonlinear Dimensionality Reduction Techniques. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-81026-9.

3

Garzon, Max, Ching-Chi Yang, Deepak Venugopal, Nirman Kumar, Kalidas Jana, and Lih-Yuan Deng, eds. Dimensionality Reduction in Data Science. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-05371-9.

4

Beeby, J. L., P. K. Bhattacharya, P. Ch Gravelle, F. Koch, and D. J. Lockwood, eds. Condensed Systems of Low Dimensionality. Boston, MA: Springer US, 1991. http://dx.doi.org/10.1007/978-1-4684-1348-9.

5

NATO, Advanced Research Workshop on Condensed Systems of Low Dimensionality (1990 Marmaris Turkey). Condensed systems of low dimensionality. New York: Plenum Press, 1991.

6

Beeby, J. L. Condensed Systems of Low Dimensionality. Boston, MA: Springer US, 1991.

7

Paul, Arati, and Nabendu Chaki. Dimensionality Reduction of Hyperspectral Imagery. Cham: Springer International Publishing, 2024. http://dx.doi.org/10.1007/978-3-031-42667-4.

8

Strange, Harry, and Reyer Zwiggelaar. Open Problems in Spectral Dimensionality Reduction. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-03943-5.

9

He, Tian-Xiao. Dimensionality Reducing Expansion of Multivariate Integration. Boston, MA: Birkhäuser Boston, 2001. http://dx.doi.org/10.1007/978-1-4612-2100-5.

10

Kramer, Oliver. Dimensionality Reduction with Unsupervised Nearest Neighbors. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-38652-7.


Book chapters on the topic "Dimensionality"

1

Horton, Mike, Ida Marais, and Karl Bang Christensen. "Dimensionality". In Rasch Models in Health, 137–58. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2013. http://dx.doi.org/10.1002/9781118574454.ch9.

2

Bhatia, Rajendra. "Dimensionality". In Texts and Readings in Mathematics, 11–18. Gurgaon: Hindustan Book Agency, 2009. http://dx.doi.org/10.1007/978-93-86279-45-3_2.

3

Hodder, Rupert. "Dimensionality". In High-level Political Appointments in the Philippines, 29–47. Singapore: Springer Singapore, 2013. http://dx.doi.org/10.1007/978-981-4560-05-4_4.

4

Plowright, Philip D. "Dimensionality". In Making Architecture Through Being Human, 76–81. Abingdon, Oxon; New York, NY: Routledge, 2019. http://dx.doi.org/10.4324/9780429261718-23.

5

Sharp, John. "Dimensionality". In The Routledge Companion to Video Game Studies, 162–69. 2nd ed. New York: Routledge, 2023. http://dx.doi.org/10.4324/9781003214977-24.

6

Jørgensen, Sveinung, and Lars Jacob Tynes Pedersen. "Three-Dimensionality Rather than One-Dimensionality". In RESTART Sustainable Business Model Innovation, 153–68. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-91971-3_11.

7

Herrera, Francisco, Francisco Charte, Antonio J. Rivera, and María J. del Jesus. "Dimensionality Reduction". In Multilabel Classification, 115–31. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-41111-8_7.

8

Kramer, Oliver. "Dimensionality Reduction". In Dimensionality Reduction with Unsupervised Nearest Neighbors, 33–52. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-38652-7_4.

9

Hull, Isaiah. "Dimensionality Reduction". In Machine Learning for Economics and Finance in TensorFlow 2, 281–306. Berkeley, CA: Apress, 2020. http://dx.doi.org/10.1007/978-1-4842-6373-0_8.

10

Shen, Heng Tao. "Dimensionality Reduction". In Encyclopedia of Database Systems, 1–2. New York, NY: Springer New York, 2017. http://dx.doi.org/10.1007/978-1-4899-7993-3_551-2.

Conference papers on the topic "Dimensionality"

1

Kapralos, Bill, Nathan Mekuz, Agnieszka Kopinska, and Saad Khattak. "Dimensionality reduced HRTFs". In the 2008 International Conference in Advances. New York, New York, USA: ACM Press, 2008. http://dx.doi.org/10.1145/1501750.1501763.

2

Bunte, Kerstin, Michael Biehl, and Barbara Hammer. "Dimensionality reduction mappings". In 2011 IEEE Symposium on Computational Intelligence and Data Mining - Part of 17273 - 2011 SSCI. IEEE, 2011. http://dx.doi.org/10.1109/cidm.2011.5949443.

3

Hu, Min, and Qunsheng Peng. "Volume fractal dimensionality". In the 2005 ACM symposium. New York, New York, USA: ACM Press, 2005. http://dx.doi.org/10.1145/1066677.1066718.

4

Karantzalos, Konstantinos. "Intrinsic dimensionality estimation and dimensionality reduction through scale space filtering". In 2009 16th International Conference on Digital Signal Processing (DSP). IEEE, 2009. http://dx.doi.org/10.1109/icdsp.2009.5201196.

5

Hendrikse, Anne, Raymond Veldhuis, and Luuk Spreeuwers. "Verification Under Increasing Dimensionality". In 2010 20th International Conference on Pattern Recognition (ICPR). IEEE, 2010. http://dx.doi.org/10.1109/icpr.2010.149.

6

Schclar, Alon, and Amir Averbuch. "Diffusion Bases Dimensionality Reduction". In 7th International Conference on Neural Computation Theory and Applications. SCITEPRESS - Science and Technology Publications, 2015. http://dx.doi.org/10.5220/0005625301510156.

7

Wu, Zhong Chao. "Dimensionality in Quantum Cosmology". In Proceedings of the MG10 Meeting held at Brazilian Center for Research in Physics (CBPF). World Scientific Publishing Company, 2006. http://dx.doi.org/10.1142/9789812704030_0135.

8

Bingham, Ella, Aristides Gionis, Niina Haiminen, Heli Hiisilä, Heikki Mannila, and Evimaria Terzi. "Segmentation and dimensionality reduction". In Proceedings of the 2006 SIAM International Conference on Data Mining. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2006. http://dx.doi.org/10.1137/1.9781611972764.33.

9

Zhang, Daoqiang, Zhi-Hua Zhou, and Songcan Chen. "Semi-Supervised Dimensionality Reduction". In Proceedings of the 2007 SIAM International Conference on Data Mining. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2007. http://dx.doi.org/10.1137/1.9781611972771.73.

10

Amsaleg, Laurent, Oussama Chelly, Teddy Furon, Stéphane Girard, Michael E. Houle, Ken-ichi Kawarabayashi, and Michael Nett. "Estimating Local Intrinsic Dimensionality". In KDD '15: The 21th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York, NY, USA: ACM, 2015. http://dx.doi.org/10.1145/2783258.2783405.

Reports on the topic "Dimensionality"

1

Jain, Anil K. Classification, Clustering and Dimensionality Reduction. Fort Belvoir, VA: Defense Technical Information Center, July 2008. http://dx.doi.org/10.21236/ada483446.

2

Eichenfield, Matt. Reduced Dimensionality Lithium Niobate Microsystems. Office of Scientific and Technical Information (OSTI), January 2017. http://dx.doi.org/10.2172/1338889.

3

Topping, Craig V. Molecular Magnets and Reduced Dimensionality. Office of Scientific and Technical Information (OSTI), December 2012. http://dx.doi.org/10.2172/1058056.

4

Wolf, Lior, and Stanley Bileschi. Combining Variable Selection with Dimensionality Reduction. Fort Belvoir, VA: Defense Technical Information Center, March 2005. http://dx.doi.org/10.21236/ada454990.

5

Nandakumar, Ratna. Assessing Essential Dimensionality of Real Data. Fort Belvoir, VA: Defense Technical Information Center, August 1992. http://dx.doi.org/10.21236/ada255774.

6

Jones, Michael J. Using Recurrent Networks for Dimensionality Reduction. Fort Belvoir, VA: Defense Technical Information Center, September 1992. http://dx.doi.org/10.21236/ada259497.

7

Bolton, S. R. Dimensionality of InGaAs nonlinear optical response. Office of Scientific and Technical Information (OSTI), July 1995. http://dx.doi.org/10.2172/270789.

8

Ulloa, S. E. Electronic states in systems of reduced dimensionality. Office of Scientific and Technical Information (OSTI), April 1992. http://dx.doi.org/10.2172/5296020.

9

Webb, Geoffrey, and Mark Carman. Dynamic Dimensionality Selection for Bayesian Classifier Ensembles. Fort Belvoir, VA: Defense Technical Information Center, March 2015. http://dx.doi.org/10.21236/ada614917.

10

Powell, Thomas E., J. W. Cunningham, William E. Wimpee, Mark A. Wilson, and Rodger D. Ballentine. Dimensionality of Ability-Requirements for Generic Job Activities. Fort Belvoir, VA: Defense Technical Information Center, July 1999. http://dx.doi.org/10.21236/ada368192.