Academic literature on the topic "Unsupervised learning"
Create an accurate citation in APA, MLA, Chicago, Harvard, and other styles
Consult the topical lists of articles, books, theses, conference proceedings, and other academic sources on the topic "Unsupervised learning".
Next to each source in the list of references there is an "Add to bibliography" button. Press this button, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.
You can also download the full text of the academic publication as a pdf and read its abstract online whenever it is available in the metadata.
Journal articles on the topic "Unsupervised learning"
Fong, A. C. M. and G. Hong. "Boosted Supervised Intensional Learning Supported by Unsupervised Learning". International Journal of Machine Learning and Computing 11, no. 2 (March 2021): 98–102. http://dx.doi.org/10.18178/ijmlc.2021.11.2.1020.
Xu, Mingle, Sook Yoon, Jaesu Lee and Dong Sun Park. "Unsupervised Transfer Learning for Plant Anomaly Recognition". Korean Institute of Smart Media 11, no. 4 (May 31, 2022): 30–37. http://dx.doi.org/10.30693/smj.2022.11.4.30.
Kruglov, Artem V. "The Unsupervised Learning Algorithm for Detecting Ellipsoid Objects". International Journal of Machine Learning and Computing 9, no. 3 (June 2019): 255–60. http://dx.doi.org/10.18178/ijmlc.2019.9.3.795.
Shi, Chengming, Bo Luo, Hongqi Li, Bin Li, Xinyong Mao and Fangyu Peng. "Anomaly Detection via Unsupervised Learning for Tool Breakage Monitoring". International Journal of Machine Learning and Computing 6, no. 5 (October 2016): 256–59. http://dx.doi.org/10.18178/ijmlc.2016.6.5.607.
Banzi, Jamal, Isack Bulugu and Zhongfu Ye. "Deep Predictive Neural Network: Unsupervised Learning for Hand Pose Estimation". International Journal of Machine Learning and Computing 9, no. 4 (August 2019): 432–39. http://dx.doi.org/10.18178/ijmlc.2019.9.4.822.
Barlow, H. B. "Unsupervised Learning". Neural Computation 1, no. 3 (September 1989): 295–311. http://dx.doi.org/10.1162/neco.1989.1.3.295.
Wang, Zhuo, Min Huang, Xiao-Long Huang, Fei Man, Jia-Ming Dou and Jian-li Lyu. "Unsupervised Learning of Depth and Ego-Motion from Continuous Monocular Images". 電腦學刊 32, no. 6 (December 2021): 38–51. http://dx.doi.org/10.53106/199115992021123206004.
Watkin, T. L. H. and J. P. Nadal. "Optimal unsupervised learning". Journal of Physics A: Mathematical and General 27, no. 6 (March 21, 1994): 1899–915. http://dx.doi.org/10.1088/0305-4470/27/6/016.
Sanger, T. "Optimal unsupervised learning". Neural Networks 1 (January 1988): 127. http://dx.doi.org/10.1016/0893-6080(88)90166-9.
Chen, Guoyang, Xiaojun Wu and Tianyang Xu. "基于深度学习的无监督红外图像与可见光图像融合算法". Laser & Optoelectronics Progress 59, no. 4 (2022): 0410010. http://dx.doi.org/10.3788/lop202259.0410010.
Theses on the topic "Unsupervised learning"
Giobergia, Flavio. "Machine learning with limited label availability: algorithms and applications". Doctoral thesis, Politecnico di Torino, 2023. https://hdl.handle.net/11583/2976594.
Snyder, Benjamin. "Unsupervised multilingual learning". Thesis, Massachusetts Institute of Technology, 2010. http://hdl.handle.net/1721.1/62455.
Texto completoCataloged from PDF version of thesis.
Includes bibliographical references (p. 241-254).
For centuries, scholars have explored the deep links among human languages. In this thesis, we present a class of probabilistic models that exploit these links as a form of naturally occurring supervision. These models allow us to substantially improve performance for core text processing tasks, such as morphological segmentation, part-of-speech tagging, and syntactic parsing. Besides these traditional NLP tasks, we also present a multilingual model for lost-language decipherment. We test this model on the ancient Ugaritic language. Our results show that we can automatically uncover much of the historical relationship between Ugaritic and Biblical Hebrew, a known related language.
Geigel, Arturo. "Unsupervised Learning Trojan". NSUWorks, 2014. http://nsuworks.nova.edu/gscis_etd/17.
Mathieu, Michael. "Unsupervised Learning under Uncertainty". Thesis, New York University, 2017. http://pqdtopen.proquest.com/#viewpdf?dispub=10261120.
Texto completoDeep learning, in particular neural networks, achieved remarkable success in the recent years. However, most of it is based on supervised learning, and relies on ever larger datasets, and immense computing power. One step towards general artificial intelligence is to build a model of the world, with enough knowledge to acquire a kind of ``common sense''. Representations learned by such a model could be reused in a number of other tasks. It would reduce the requirement for labelled samples and possibly acquire a deeper understanding of the problem. The vast quantities of knowledge required to build common sense precludes the use of supervised learning, and suggests to rely on unsupervised learning instead.
The concept of uncertainty is central to unsupervised learning. The task is usually to learn a complex, multimodal distribution. Density estimation and generative models aim at representing the whole distribution of the data, while predictive learning consists of predicting the state of the world given the context; more often than not, the prediction is not unique. That may be because the model lacks the capacity or the computing power to make a certain prediction, or because the future depends on parameters that are not part of the observation. Finally, the world can be chaotic or truly stochastic. Representing complex, multimodal continuous distributions with deep neural networks is still an open problem.
In this thesis, we first assess the difficulties of representing probabilities in high dimensional spaces, and review the related work in this domain. We then introduce two methods to address the problem of video prediction, first using a novel form of linearizing auto-encoders and latent variables, and secondly using Generative Adversarial Networks (GANs). We show how GANs can be seen as trainable loss functions to represent uncertainty, then how they can be used to disentangle factors of variation. Finally, we explore a new non-probabilistic framework for GANs.
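The non-uniqueness of predictions discussed in the abstract above can be seen in a toy computation (an illustrative NumPy sketch, not taken from the thesis): when the target is bimodal, a model trained with squared error converges to the mean of the modes, a value that never actually occurs.

```python
import numpy as np

# Toy bimodal "future": the next observation is -1 or +1 with equal probability.
rng = np.random.default_rng(0)
targets = rng.choice([-1.0, 1.0], size=10_000)

# The constant prediction minimizing mean squared error is the sample mean,
# which sits near 0 -- between the two modes, matching neither of them.
best_mse_prediction = targets.mean()
print(best_mse_prediction)
```

This mode-averaging effect is one motivation for adversarial losses such as GANs, which can prefer a sharp sample from one mode over a blurry average of all modes.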
Boschini, Matteo. "Unsupervised Learning of Scene Flow". Master's thesis, Alma Mater Studiorum - Università di Bologna, 2018. http://amslaurea.unibo.it/16226/.
Jelacic, Mersad. "Unsupervised Learning for Plant Recognition". Thesis, Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), 2006. http://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-247.
Texto completoSix methods are used for clustering data containing two different objects: sugar-beet plants
and weed. These objects are described by 19 different features, i.e. shape and color features.
There is also information about the distance between sugar-beet plants that is used for
labeling clusters. The methods that are evaluated: k-means, k-medoids, hierarchical clustering,
competitive learning, self-organizing maps and fuzzy c-means. After using the methods on
plant data, clusters are formed. The clusters are labeled with three different proposed
methods: expert, database and context method. Expert method is using a human for giving
initial cluster centers that are labeled. The database method is using a database as an expert
that provides initial cluster centers. The context method is using information about the
environment, which is the distance between sugar-beet plants, for labeling the clusters.
The algorithms that were tested, with the lowest achieved corresponding error, are: k-means
(3.3%), k-medoids (3.8%), hierarchical clustering (5.3%), competitive learning (6.8%), self-
organizing maps (4.9%) and fuzzy c-means (7.9%). Three different datasets were used and the
lowest error on dataset0 is 3.3%, compared to supervised learning methods where it is 3%.
For dataset1 the error is 18.7% and for dataset2 it is 5.8%. Compared to supervised methods,
the error on dataset1 is 11% and for dataset2 it is 5.1%. The high error rate on dataset1 is due
to the samples are not very well separated in different clusters. The features from dataset1 are
extracted from lower resolution on images than the other datasets, and another difference
between the datasets are the sugar-beet plants that are in different growth stages.
The performance of the three methods for labeling clusters is: expert method (6.8% as the
lowest error achieved), database method (3.7%) and context method (6.8%). These results
show the clustering results by competitive learning where the real error is 6.8%.
Unsupervised-learning methods for clustering can very well be used for plant identification.
Because the samples are not classified, an automatic labeling technique must be used if plants
are to be identified. The three proposed techniques can be used for automatic labeling of
plants.
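As a minimal sketch of the simplest algorithm in the list above, plain k-means can be written in a few lines of NumPy (an illustrative example on synthetic two-cluster data, not the thesis's 19-feature plant dataset):

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """Plain k-means: alternate nearest-centroid assignment and mean update."""
    rng = np.random.default_rng(seed)
    # Initialize centroids as k distinct random samples.
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Assign each point to its nearest centroid.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute each centroid as the mean of its cluster (keep old if empty).
        new_centroids = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)
        ])
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return labels, centroids

# Two well-separated synthetic blobs standing in for "plant" and "weed" features.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.5, (50, 2)), rng.normal(5.0, 0.5, (50, 2))])
labels, centroids = kmeans(X, k=2)
```

Note that the cluster indices returned here are arbitrary; mapping them to class names ("plant" or "weed") is exactly the labeling step that the expert, database, and context methods address.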
Amin, Khizer and Mehmood ul haq Minhas. "Facebook Blocket with Unsupervised Learning". Thesis, Blekinge Tekniska Högskola, Institutionen för tillämpad signalbehandling, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-1969.
Korkontzelos, Ioannis. "Unsupervised learning of multiword expressions". Thesis, University of York, 2010. http://etheses.whiterose.ac.uk/2091/.
Liang, Yingyu. "Modern aspects of unsupervised learning". Diss., Georgia Institute of Technology, 2014. http://hdl.handle.net/1853/52282.
Xiao, Ying. "New tools for unsupervised learning". Diss., Georgia Institute of Technology, 2014. http://hdl.handle.net/1853/52995.
Books on the topic "Unsupervised learning"
Kyan, Matthew, Paisarn Muneesawang, Kambiz Jarrah and Ling Guan. Unsupervised Learning. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2014. http://dx.doi.org/10.1002/9781118875568.
Celebi, M. Emre and Kemal Aydin, eds. Unsupervised Learning Algorithms. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-24211-8.
Li, Xiangtao and Ka-Chun Wong, eds. Natural Computing for Unsupervised Learning. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-319-98566-4.
Unsupervised Representation Learning with Correlations. [New York, N.Y.?]: [publisher not identified], 2020.
Leordeanu, Marius. Unsupervised Learning in Space and Time. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-42128-1.
Baruque, Bruno and Emilio Corchado. Fusion Methods for Unsupervised Learning Ensembles. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-16205-3.
Albalate, Amparo and Wolfgang Minker. Semi-Supervised and Unsupervised Machine Learning. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2013. http://dx.doi.org/10.1002/9781118557693.
Bartlett, Marian Stewart. Face Image Analysis by Unsupervised Learning. Boston, MA: Springer US, 2001. http://dx.doi.org/10.1007/978-1-4615-1637-8.
Hinton, Geoffrey E. and Terrence J. Sejnowski, eds. Unsupervised Learning: Foundations of Neural Computation. Cambridge, Mass.: MIT Press, 1999.
Book chapters on the topic "Unsupervised learning"
Deepak, P. "Anomaly Detection for Data with Spatial Attributes". In Unsupervised Learning Algorithms, 1–32. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-24211-8_1.
Torra, Vicenç, Guillermo Navarro-Arribas and Klara Stokes. "An Overview of the Use of Clustering for Data Privacy". In Unsupervised Learning Algorithms, 237–51. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-24211-8_10.
Wang, Chang-Dong and Jian-Huang Lai. "Nonlinear Clustering: Methods and Applications". In Unsupervised Learning Algorithms, 253–302. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-24211-8_11.
İnkaya, Tülin, Sinan Kayalıgil and Nur Evin Özdemirel. "Swarm Intelligence-Based Clustering Algorithms: A Survey". In Unsupervised Learning Algorithms, 303–41. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-24211-8_12.
Huang, Xiaohui, Yunming Ye and Haijun Zhang. "Extending Kmeans-Type Algorithms by Integrating Intra-cluster Compactness and Inter-cluster Separation". In Unsupervised Learning Algorithms, 343–84. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-24211-8_13.
Tsolakis, Dimitrios M. and George E. Tsekouras. "A Fuzzy-Soft Competitive Learning Approach for Grayscale Image Compression". In Unsupervised Learning Algorithms, 385–404. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-24211-8_14.
Wong, Ka-Chun, Yue Li and Zhaolei Zhang. "Unsupervised Learning in Genome Informatics". In Unsupervised Learning Algorithms, 405–48. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-24211-8_15.
Martin, Dian I., John C. Martin and Michael W. Berry. "The Application of LSA to the Evaluation of Questionnaire Responses". In Unsupervised Learning Algorithms, 449–84. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-24211-8_16.
Ahmed, Rezwan and George Karypis. "Mining Evolving Patterns in Dynamic Relational Networks". In Unsupervised Learning Algorithms, 485–532. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-24211-8_17.
Trentin, Edmondo and Marco Bongini. "Probabilistically Grounded Unsupervised Training of Neural Networks". In Unsupervised Learning Algorithms, 533–58. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-24211-8_18.
Conference papers on the topic "Unsupervised learning"
Yu, Francis T. S., Taiwei Lu and Don A. Gregory. "Self-Learning Optical Neural Network". In Spatial Light Modulators and Applications. Washington, D.C.: Optica Publishing Group, 1990. http://dx.doi.org/10.1364/slma.1990.mb4.
Shui, Xinghua and Huadong Zheng. "Multi-depth Hologram Generation with Unsupervised-learning Based Computer-generated Holography". In Digital Holography and Three-Dimensional Imaging. Washington, D.C.: Optica Publishing Group, 2022. http://dx.doi.org/10.1364/dh.2022.w5a.12.
Ver Steeg, Greg. "Unsupervised Learning via Total Correlation Explanation". In Twenty-Sixth International Joint Conference on Artificial Intelligence. California: International Joint Conferences on Artificial Intelligence Organization, 2017. http://dx.doi.org/10.24963/ijcai.2017/740.
Gonzalez, Andres, Zoya Heidari and Olivier Lopez. "Data-Driven Algorithms for Image-Based Rock Classification and Formation Evaluation in Formations With Rapid Spatial Variation in Rock Fabric". In 2022 SPWLA 63rd Annual Symposium. Society of Petrophysicists and Well Log Analysts, 2022. http://dx.doi.org/10.30632/spwla-2022-0018.
Figueirêdo, Ilan Sousa, Tássio Farias Carvalho, Wenisten José Dantas Silva, Lílian Lefol Nani Guarieiro and Erick Giovani Sperandio Nascimento. "Detecting Interesting and Anomalous Patterns In Multivariate Time-Series Data in an Offshore Platform Using Unsupervised Learning". In Offshore Technology Conference. OTC, 2021. http://dx.doi.org/10.4043/31297-ms.
Huang, Ling. "Unsupervised Multi-view Learning". In Twenty-Eighth International Joint Conference on Artificial Intelligence {IJCAI-19}. California: International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/910.
Reite, Aaron A., Scott Kangas, Zackery Steck, George S. Goley, Jonathan Von Stroh and Steven Forsyth. "Unsupervised feature learning in remote sensing". In Applications of Machine Learning, edited by Michael E. Zelinski, Tarek M. Taha, Jonathan Howe, Abdul A. Awwal and Khan M. Iftekharuddin. SPIE, 2019. http://dx.doi.org/10.1117/12.2529791.
Chen, Junjie, William K. Cheung and Anran Wang. "Learning Deep Unsupervised Binary Codes for Image Retrieval". In Twenty-Seventh International Joint Conference on Artificial Intelligence {IJCAI-18}. California: International Joint Conferences on Artificial Intelligence Organization, 2018. http://dx.doi.org/10.24963/ijcai.2018/85.
Li, Jundong, Jiliang Tang and Huan Liu. "Reconstruction-based Unsupervised Feature Selection: An Embedded Approach". In Twenty-Sixth International Joint Conference on Artificial Intelligence. California: International Joint Conferences on Artificial Intelligence Organization, 2017. http://dx.doi.org/10.24963/ijcai.2017/300.
Xiang, Mingjun, Lingxiao Wang, Yu Sha, Hui Yuan, Kai Zhou and Hartmut G. Roskos. "Phase Retrieval for Terahertz Holography with Physics-Informed Deep Learning". In Digital Holography and Three-Dimensional Imaging. Washington, D.C.: Optica Publishing Group, 2022. http://dx.doi.org/10.1364/dh.2022.tu4a.4.
Reports on the topic "Unsupervised learning"
Vesselinov, Velimir Valentinov. TensorDecompostions : Unsupervised machine learning methods. Office of Scientific and Technical Information (OSTI), February 2019. http://dx.doi.org/10.2172/1493534.
Sprechmann, Pablo and Guillermo Sapiro. Dictionary Learning and Sparse Coding for Unsupervised Clustering. Fort Belvoir, VA: Defense Technical Information Center, September 2009. http://dx.doi.org/10.21236/ada513140.
Vesselinov, Velimir, Bulbul Ahmmed, Maruti Mudunuru, Jeff Pepin, Erick Burns, D. Siler, Satish Karra and Richard Middleton. Discovering Hidden Geothermal Signatures using Unsupervised Machine Learning. Office of Scientific and Technical Information (OSTI), April 2021. http://dx.doi.org/10.2172/1781347.
Safta, Cosmin, Habib Najm, Michael Grant and Michael Sparapany. Trajectory Optimization via Unsupervised Probabilistic Learning On Manifolds. Office of Scientific and Technical Information (OSTI), September 2021. http://dx.doi.org/10.2172/1821958.
Shekhar, Shubhranshu, Jetson Leder-Luis and Leman Akoglu. Unsupervised Machine Learning for Explainable Health Care Fraud Detection. Cambridge, MA: National Bureau of Economic Research, February 2023. http://dx.doi.org/10.3386/w30946.
Ahmmed, Bulbul. Supervised and Unsupervised Machine Learning to Understanding Reactive-transport Data. Office of Scientific and Technical Information (OSTI), May 2020. http://dx.doi.org/10.2172/1630844.
Obert, James and Timothy James Loffredo. Efficient Binary Static Code Data Flow Analysis Using Unsupervised Learning. Office of Scientific and Technical Information (OSTI), November 2019. http://dx.doi.org/10.2172/1592974.
Yeamans, Katelyn Angela. Unsupervised Machine Learning for Evaluation of Aging in Explosive Pressed Pellets. Office of Scientific and Technical Information (OSTI), December 2018. http://dx.doi.org/10.2172/1484618.
Wehner, Michael, Mark Risser, Paul Ullrich and Shiheng Duan. Exploring variability in seasonal average and extreme precipitation using unsupervised machine learning. Office of Scientific and Technical Information (OSTI), April 2021. http://dx.doi.org/10.2172/1769708.
Pinar, Ali, Tamara G. Kolda, Kevin Thomas Carlberg, Grey Ballard and Michael Mahoney. Unsupervised Learning Through Randomized Algorithms for High-Volume High-Velocity Data (ULTRA-HV). Office of Scientific and Technical Information (OSTI), January 2018. http://dx.doi.org/10.2172/1417788.