Scientific literature on the topic "Unsupervised learning"
Create a correct reference in APA, MLA, Chicago, Harvard, and several other styles
Contents
Consult the thematic lists of journal articles, books, theses, conference proceedings, and other academic sources on the topic "Unsupervised learning".
Next to each source in the list of references there is an "Add to bibliography" button. Click this button, and we will automatically generate the bibliographic reference for the chosen source in your preferred citation style: APA, MLA, Harvard, Vancouver, Chicago, etc.
You can also download the full text of the scholarly publication as a PDF and read its abstract online when this information is included in the metadata.
Journal articles on the topic "Unsupervised learning"
Fong, A. C. M., and G. Hong. "Boosted Supervised Intensional Learning Supported by Unsupervised Learning". International Journal of Machine Learning and Computing 11, no. 2 (March 2021): 98–102. http://dx.doi.org/10.18178/ijmlc.2021.11.2.1020.
Xu, Mingle, Sook Yoon, Jaesu Lee, and Dong Sun Park. "Unsupervised Transfer Learning for Plant Anomaly Recognition". Korean Institute of Smart Media 11, no. 4 (May 31, 2022): 30–37. http://dx.doi.org/10.30693/smj.2022.11.4.30.
Kruglov, Artem V. "The Unsupervised Learning Algorithm for Detecting Ellipsoid Objects". International Journal of Machine Learning and Computing 9, no. 3 (June 2019): 255–60. http://dx.doi.org/10.18178/ijmlc.2019.9.3.795.
Shi, Chengming, Bo Luo, Hongqi Li, Bin Li, Xinyong Mao, and Fangyu Peng. "Anomaly Detection via Unsupervised Learning for Tool Breakage Monitoring". International Journal of Machine Learning and Computing 6, no. 5 (October 2016): 256–59. http://dx.doi.org/10.18178/ijmlc.2016.6.5.607.
Banzi, Jamal, Isack Bulugu, and Zhongfu Ye. "Deep Predictive Neural Network: Unsupervised Learning for Hand Pose Estimation". International Journal of Machine Learning and Computing 9, no. 4 (August 2019): 432–39. http://dx.doi.org/10.18178/ijmlc.2019.9.4.822.
Barlow, H. B. "Unsupervised Learning". Neural Computation 1, no. 3 (September 1989): 295–311. http://dx.doi.org/10.1162/neco.1989.1.3.295.
Wang, Zhuo, Min Huang, Xiao-Long Huang, Fei Man, Jia-Ming Dou, and Jian-li Lyu. "Unsupervised Learning of Depth and Ego-Motion from Continuous Monocular Images". 電腦學刊 32, no. 6 (December 2021): 38–51. http://dx.doi.org/10.53106/199115992021123206004.
Watkin, T. L. H., and J. P. Nadal. "Optimal unsupervised learning". Journal of Physics A: Mathematical and General 27, no. 6 (March 21, 1994): 1899–915. http://dx.doi.org/10.1088/0305-4470/27/6/016.
Sanger, T. "Optimal unsupervised learning". Neural Networks 1 (January 1988): 127. http://dx.doi.org/10.1016/0893-6080(88)90166-9.
Chen, Guoyang (陈国洋), Xiaojun Wu (吴小俊), and Tianyang Xu (徐天阳). "基于深度学习的无监督红外图像与可见光图像融合算法" [Deep-learning-based unsupervised infrared and visible image fusion algorithm]. Laser & Optoelectronics Progress 59, no. 4 (2022): 0410010. http://dx.doi.org/10.3788/lop202259.0410010.
Theses on the topic "Unsupervised learning"
Giobergia, Flavio. "Machine learning with limited label availability: algorithms and applications". Doctoral thesis, Politecnico di Torino, 2023. https://hdl.handle.net/11583/2976594.
Snyder, Benjamin. "Unsupervised multilingual learning". Ph.D. thesis, Massachusetts Institute of Technology, 2010. http://hdl.handle.net/1721.1/62455.
Texte intégralCataloged from PDF version of thesis.
Includes bibliographical references (p. 241-254).
For centuries, scholars have explored the deep links among human languages. In this thesis, we present a class of probabilistic models that exploit these links as a form of naturally occurring supervision. These models allow us to substantially improve performance for core text processing tasks, such as morphological segmentation, part-of-speech tagging, and syntactic parsing. Besides these traditional NLP tasks, we also present a multilingual model for lost-language decipherment. We test this model on the ancient Ugaritic language. Our results show that we can automatically uncover much of the historical relationship between Ugaritic and Biblical Hebrew, a known related language.
Geigel, Arturo. "Unsupervised Learning Trojan". NSUWorks, 2014. http://nsuworks.nova.edu/gscis_etd/17.
Mathieu, Michael. "Unsupervised Learning under Uncertainty". Thesis, New York University, 2017. http://pqdtopen.proquest.com/#viewpdf?dispub=10261120.
Texte intégralDeep learning, in particular neural networks, achieved remarkable success in the recent years. However, most of it is based on supervised learning, and relies on ever larger datasets, and immense computing power. One step towards general artificial intelligence is to build a model of the world, with enough knowledge to acquire a kind of ``common sense''. Representations learned by such a model could be reused in a number of other tasks. It would reduce the requirement for labelled samples and possibly acquire a deeper understanding of the problem. The vast quantities of knowledge required to build common sense precludes the use of supervised learning, and suggests to rely on unsupervised learning instead.
The concept of uncertainty is central to unsupervised learning. The task is usually to learn a complex, multimodal distribution. Density estimation and generative models aim at representing the whole distribution of the data, while predictive learning consists of predicting the state of the world given the context and, more often than not, the prediction is not unique. That may be because the model lacks the capacity or the computing power to make a certain prediction, or because the future depends on parameters that are not part of the observation. Finally, the world can be chaotic of truly stochastic. Representing complex, multimodal continuous distributions with deep neural networks is still an open problem.
In this thesis, we first assess the difficulties of representing probabilities in high dimensional spaces, and review the related work in this domain. We then introduce two methods to address the problem of video prediction, first using a novel form of linearizing auto-encoders and latent variables, and secondly using Generative Adversarial Networks (GANs). We show how GANs can be seen as trainable loss functions to represent uncertainty, then how they can be used to disentangle factors of variation. Finally, we explore a new non-probabilistic framework for GANs.
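The abstract's view of GANs as trainable loss functions refers to the standard adversarial objective (Goodfellow et al.'s original formulation, quoted here for context; it is not specific to this thesis):

```latex
\min_G \max_D \; V(D, G) =
  \mathbb{E}_{x \sim p_{\text{data}}}\bigl[\log D(x)\bigr]
  + \mathbb{E}_{z \sim p_z}\bigl[\log\bigl(1 - D(G(z))\bigr)\bigr]
```

The discriminator D learns to separate data from generated samples, while the generator G learns to fool it; the learned D acts as an adaptive loss for G, which is what lets the model represent uncertainty over multimodal targets instead of averaging over them.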
Boschini, Matteo. "Unsupervised Learning of Scene Flow". Master's thesis, Alma Mater Studiorum - Università di Bologna, 2018. http://amslaurea.unibo.it/16226/.
Jelacic, Mersad. "Unsupervised Learning for Plant Recognition". Thesis, Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), 2006. http://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-247.
Texte intégralSix methods are used for clustering data containing two different objects: sugar-beet plants
and weed. These objects are described by 19 different features, i.e. shape and color features.
There is also information about the distance between sugar-beet plants that is used for
labeling clusters. The methods that are evaluated: k-means, k-medoids, hierarchical clustering,
competitive learning, self-organizing maps and fuzzy c-means. After using the methods on
plant data, clusters are formed. The clusters are labeled with three different proposed
methods: expert, database and context method. Expert method is using a human for giving
initial cluster centers that are labeled. The database method is using a database as an expert
that provides initial cluster centers. The context method is using information about the
environment, which is the distance between sugar-beet plants, for labeling the clusters.
The algorithms that were tested, with the lowest achieved corresponding error, are: k-means
(3.3%), k-medoids (3.8%), hierarchical clustering (5.3%), competitive learning (6.8%), self-
organizing maps (4.9%) and fuzzy c-means (7.9%). Three different datasets were used and the
lowest error on dataset0 is 3.3%, compared to supervised learning methods where it is 3%.
For dataset1 the error is 18.7% and for dataset2 it is 5.8%. Compared to supervised methods,
the error on dataset1 is 11% and for dataset2 it is 5.1%. The high error rate on dataset1 is due
to the samples are not very well separated in different clusters. The features from dataset1 are
extracted from lower resolution on images than the other datasets, and another difference
between the datasets are the sugar-beet plants that are in different growth stages.
The performance of the three methods for labeling clusters is: expert method (6.8% as the
lowest error achieved), database method (3.7%) and context method (6.8%). These results
show the clustering results by competitive learning where the real error is 6.8%.
Unsupervised-learning methods for clustering can very well be used for plant identification.
Because the samples are not classified, an automatic labeling technique must be used if plants
are to be identified. The three proposed techniques can be used for automatic labeling of
plants.
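The cluster-then-label pipeline described in this abstract can be illustrated with a minimal k-means (Lloyd's algorithm) sketch. This is a toy example on 2-D points, not the thesis's 19-feature implementation; the data and function names here are hypothetical.

```python
import random

def kmeans(points, k, iters=100, seed=0):
    """Plain Lloyd's algorithm: alternate nearest-center assignment
    and center re-estimation until the centers stop moving."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        # Assignment step: attach each point to its nearest center.
        labels = [
            min(range(k),
                key=lambda j: (p[0] - centers[j][0]) ** 2
                              + (p[1] - centers[j][1]) ** 2)
            for p in points
        ]
        # Update step: move each center to the mean of its cluster.
        new_centers = []
        for j in range(k):
            cluster = [p for p, lab in zip(points, labels) if lab == j]
            if cluster:
                new_centers.append((sum(p[0] for p in cluster) / len(cluster),
                                    sum(p[1] for p in cluster) / len(cluster)))
            else:
                new_centers.append(centers[j])  # keep an emptied cluster's center
        if new_centers == centers:
            break
        centers = new_centers
    return centers, labels

# Toy stand-in for the thesis setting: two well-separated groups in a
# 2-D feature space (think "sugar-beet plant" vs. "weed").
data = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.25),
        (5.0, 5.1), (5.2, 4.9), (4.9, 5.0)]
centers, labels = kmeans(data, k=2)
```

As the abstract stresses, the clusters themselves carry no class names; a separate labeling step (an expert, a database, or context such as inter-plant distance) must still map cluster index 0/1 to "plant" or "weed".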
Amin, Khizer, and Mehmood ul haq Minhas. "Facebook Blocket with Unsupervised Learning". Thesis, Blekinge Tekniska Högskola, Institutionen för tillämpad signalbehandling, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-1969.
Korkontzelos, Ioannis. "Unsupervised learning of multiword expressions". Thesis, University of York, 2010. http://etheses.whiterose.ac.uk/2091/.
Liang, Yingyu. "Modern aspects of unsupervised learning". Diss., Georgia Institute of Technology, 2014. http://hdl.handle.net/1853/52282.
Xiao, Ying. "New tools for unsupervised learning". Diss., Georgia Institute of Technology, 2014. http://hdl.handle.net/1853/52995.
Texte intégralLivres sur le sujet "Unsupervied learning"
Kyan, Matthew, Paisarn Muneesawang, Kambiz Jarrah, and Ling Guan. Unsupervised Learning. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2014. http://dx.doi.org/10.1002/9781118875568.
Celebi, M. Emre, and Kemal Aydin, eds. Unsupervised Learning Algorithms. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-24211-8.
Li, Xiangtao, and Ka-Chun Wong, eds. Natural Computing for Unsupervised Learning. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-319-98566-4.
Unsupervised Representation Learning with Correlations. [New York, N.Y.?]: [publisher not identified], 2020.
Leordeanu, Marius. Unsupervised Learning in Space and Time. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-42128-1.
Baruque, Bruno, and Emilio Corchado. Fusion Methods for Unsupervised Learning Ensembles. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-16205-3.
Albalate, Amparo, and Wolfgang Minker. Semi-Supervised and Unsupervised Machine Learning. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2013. http://dx.doi.org/10.1002/9781118557693.
Bartlett, Marian Stewart. Face Image Analysis by Unsupervised Learning. Boston, MA: Springer US, 2001. http://dx.doi.org/10.1007/978-1-4615-1637-8.
Hinton, Geoffrey E., and Terrence J. Sejnowski, eds. Unsupervised Learning: Foundations of Neural Computation. Cambridge, Mass.: MIT Press, 1999.
Bartlett, Marian Stewart. Face Image Analysis by Unsupervised Learning. Boston, MA: Springer US, 2001.
Trouver le texte intégralChapitres de livres sur le sujet "Unsupervied learning"
Deepak, P. "Anomaly Detection for Data with Spatial Attributes". In Unsupervised Learning Algorithms, 1–32. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-24211-8_1.
Torra, Vicenç, Guillermo Navarro-Arribas, and Klara Stokes. "An Overview of the Use of Clustering for Data Privacy". In Unsupervised Learning Algorithms, 237–51. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-24211-8_10.
Wang, Chang-Dong, and Jian-Huang Lai. "Nonlinear Clustering: Methods and Applications". In Unsupervised Learning Algorithms, 253–302. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-24211-8_11.
İnkaya, Tülin, Sinan Kayalıgil, and Nur Evin Özdemirel. "Swarm Intelligence-Based Clustering Algorithms: A Survey". In Unsupervised Learning Algorithms, 303–41. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-24211-8_12.
Huang, Xiaohui, Yunming Ye, and Haijun Zhang. "Extending Kmeans-Type Algorithms by Integrating Intra-cluster Compactness and Inter-cluster Separation". In Unsupervised Learning Algorithms, 343–84. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-24211-8_13.
Tsolakis, Dimitrios M., and George E. Tsekouras. "A Fuzzy-Soft Competitive Learning Approach for Grayscale Image Compression". In Unsupervised Learning Algorithms, 385–404. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-24211-8_14.
Wong, Ka-Chun, Yue Li, and Zhaolei Zhang. "Unsupervised Learning in Genome Informatics". In Unsupervised Learning Algorithms, 405–48. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-24211-8_15.
Martin, Dian I., John C. Martin, and Michael W. Berry. "The Application of LSA to the Evaluation of Questionnaire Responses". In Unsupervised Learning Algorithms, 449–84. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-24211-8_16.
Ahmed, Rezwan, and George Karypis. "Mining Evolving Patterns in Dynamic Relational Networks". In Unsupervised Learning Algorithms, 485–532. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-24211-8_17.
Trentin, Edmondo, and Marco Bongini. "Probabilistically Grounded Unsupervised Training of Neural Networks". In Unsupervised Learning Algorithms, 533–58. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-24211-8_18.
Texte intégralActes de conférences sur le sujet "Unsupervied learning"
Yu, Francis T. S., Taiwei Lu, and Don A. Gregory. "Self-Learning Optical Neural Network". In Spatial Light Modulators and Applications. Washington, D.C.: Optica Publishing Group, 1990. http://dx.doi.org/10.1364/slma.1990.mb4.
Shui, Xinghua, and Huadong Zheng. "Multi-depth Hologram Generation with Unsupervised-learning Based Computer-generated Holography". In Digital Holography and Three-Dimensional Imaging. Washington, D.C.: Optica Publishing Group, 2022. http://dx.doi.org/10.1364/dh.2022.w5a.12.
Ver Steeg, Greg. "Unsupervised Learning via Total Correlation Explanation". In Twenty-Sixth International Joint Conference on Artificial Intelligence. California: International Joint Conferences on Artificial Intelligence Organization, 2017. http://dx.doi.org/10.24963/ijcai.2017/740.
Gonzalez, Andres, Zoya Heidari, and Olivier Lopez. "Data-Driven Algorithms for Image-Based Rock Classification and Formation Evaluation in Formations With Rapid Spatial Variation in Rock Fabric". In 2022 SPWLA 63rd Annual Symposium. Society of Petrophysicists and Well Log Analysts, 2022. http://dx.doi.org/10.30632/spwla-2022-0018.
Figueirêdo, Ilan Sousa, Tássio Farias Carvalho, Wenisten José Dantas Silva, Lílian Lefol Nani Guarieiro, and Erick Giovani Sperandio Nascimento. "Detecting Interesting and Anomalous Patterns in Multivariate Time-Series Data in an Offshore Platform Using Unsupervised Learning". In Offshore Technology Conference. OTC, 2021. http://dx.doi.org/10.4043/31297-ms.
Huang, Ling. "Unsupervised Multi-view Learning". In Twenty-Eighth International Joint Conference on Artificial Intelligence (IJCAI-19). California: International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/910.
Reite, Aaron A., Scott Kangas, Zackery Steck, George S. Goley, Jonathan Von Stroh, and Steven Forsyth. "Unsupervised feature learning in remote sensing". In Applications of Machine Learning, edited by Michael E. Zelinski, Tarek M. Taha, Jonathan Howe, Abdul A. Awwal, and Khan M. Iftekharuddin. SPIE, 2019. http://dx.doi.org/10.1117/12.2529791.
Chen, Junjie, William K. Cheung, and Anran Wang. "Learning Deep Unsupervised Binary Codes for Image Retrieval". In Twenty-Seventh International Joint Conference on Artificial Intelligence (IJCAI-18). California: International Joint Conferences on Artificial Intelligence Organization, 2018. http://dx.doi.org/10.24963/ijcai.2018/85.
Li, Jundong, Jiliang Tang, and Huan Liu. "Reconstruction-based Unsupervised Feature Selection: An Embedded Approach". In Twenty-Sixth International Joint Conference on Artificial Intelligence. California: International Joint Conferences on Artificial Intelligence Organization, 2017. http://dx.doi.org/10.24963/ijcai.2017/300.
Xiang, Mingjun, Lingxiao Wang, Yu Sha, Hui Yuan, Kai Zhou, and Hartmut G. Roskos. "Phase Retrieval for Terahertz Holography with Physics-Informed Deep Learning". In Digital Holography and Three-Dimensional Imaging. Washington, D.C.: Optica Publishing Group, 2022. http://dx.doi.org/10.1364/dh.2022.tu4a.4.
Texte intégralRapports d'organisations sur le sujet "Unsupervied learning"
Vesselinov, Velimir Valentinov. TensorDecompostions: Unsupervised machine learning methods. Office of Scientific and Technical Information (OSTI), February 2019. http://dx.doi.org/10.2172/1493534.
Sprechmann, Pablo, and Guillermo Sapiro. Dictionary Learning and Sparse Coding for Unsupervised Clustering. Fort Belvoir, VA: Defense Technical Information Center, September 2009. http://dx.doi.org/10.21236/ada513140.
Vesselinov, Velimir, Bulbul Ahmmed, Maruti Mudunuru, Jeff Pepin, Erick Burns, D. Siler, Satish Karra, and Richard Middleton. Discovering Hidden Geothermal Signatures using Unsupervised Machine Learning. Office of Scientific and Technical Information (OSTI), April 2021. http://dx.doi.org/10.2172/1781347.
Safta, Cosmin, Habib Najm, Michael Grant, and Michael Sparapany. Trajectory Optimization via Unsupervised Probabilistic Learning On Manifolds. Office of Scientific and Technical Information (OSTI), September 2021. http://dx.doi.org/10.2172/1821958.
Shekhar, Shubhranshu, Jetson Leder-Luis, and Leman Akoglu. Unsupervised Machine Learning for Explainable Health Care Fraud Detection. Cambridge, MA: National Bureau of Economic Research, February 2023. http://dx.doi.org/10.3386/w30946.
Ahmmed, Bulbul. Supervised and Unsupervised Machine Learning to Understanding Reactive-transport Data. Office of Scientific and Technical Information (OSTI), May 2020. http://dx.doi.org/10.2172/1630844.
Obert, James, and Timothy James Loffredo. Efficient Binary Static Code Data Flow Analysis Using Unsupervised Learning. Office of Scientific and Technical Information (OSTI), November 2019. http://dx.doi.org/10.2172/1592974.
Yeamans, Katelyn Angela. Unsupervised Machine Learning for Evaluation of Aging in Explosive Pressed Pellets. Office of Scientific and Technical Information (OSTI), December 2018. http://dx.doi.org/10.2172/1484618.
Wehner, Michael, Mark Risser, Paul Ullrich, and Shiheng Duan. Exploring variability in seasonal average and extreme precipitation using unsupervised machine learning. Office of Scientific and Technical Information (OSTI), April 2021. http://dx.doi.org/10.2172/1769708.
Pinar, Ali, Tamara G. Kolda, Kevin Thomas Carlberg, Grey Ballard, and Michael Mahoney. Unsupervised Learning Through Randomized Algorithms for High-Volume High-Velocity Data (ULTRA-HV). Office of Scientific and Technical Information (OSTI), January 2018. http://dx.doi.org/10.2172/1417788.