Academic literature on the topic "Probabilistic deep models"
Create an accurate citation in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, and conference proceedings, and other academic sources on the topic "Probabilistic deep models."
Next to every source in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Journal articles on the topic "Probabilistic deep models"
Masegosa, Andrés R., Rafael Cabañas, Helge Langseth, Thomas D. Nielsen, and Antonio Salmerón. "Probabilistic Models with Deep Neural Networks". Entropy 23, no. 1 (January 18, 2021): 117. http://dx.doi.org/10.3390/e23010117.
Villanueva Llerena, Julissa, and Denis Deratani Maua. "Efficient Predictive Uncertainty Estimators for Deep Probabilistic Models". Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 10 (April 3, 2020): 13740–41. http://dx.doi.org/10.1609/aaai.v34i10.7142.
Karami, Mahdi, and Dale Schuurmans. "Deep Probabilistic Canonical Correlation Analysis". Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 9 (May 18, 2021): 8055–63. http://dx.doi.org/10.1609/aaai.v35i9.16982.
Lu, Ming, Zhihao Duan, Fengqing Zhu, and Zhan Ma. "Deep Hierarchical Video Compression". Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 8 (March 24, 2024): 8859–67. http://dx.doi.org/10.1609/aaai.v38i8.28733.
Maroñas, Juan, Roberto Paredes, and Daniel Ramos. "Calibration of deep probabilistic models with decoupled bayesian neural networks". Neurocomputing 407 (September 2020): 194–205. http://dx.doi.org/10.1016/j.neucom.2020.04.103.
Li, Zhenjun, Xi Liu, Dawei Kou, Yi Hu, Qingrui Zhang, and Qingxi Yuan. "Probabilistic Models for the Shear Strength of RC Deep Beams". Applied Sciences 13, no. 8 (April 12, 2023): 4853. http://dx.doi.org/10.3390/app13084853.
Serpell, Cristián, Ignacio A. Araya, Carlos Valle, and Héctor Allende. "Addressing model uncertainty in probabilistic forecasting using Monte Carlo dropout". Intelligent Data Analysis 24 (December 4, 2020): 185–205. http://dx.doi.org/10.3233/ida-200015.
Boursin, Nicolas, Carl Remlinger, and Joseph Mikael. "Deep Generators on Commodity Markets Application to Deep Hedging". Risks 11, no. 1 (December 23, 2022): 7. http://dx.doi.org/10.3390/risks11010007.
Zuidberg Dos Martires, Pedro. "Probabilistic Neural Circuits". Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 15 (March 24, 2024): 17280–89. http://dx.doi.org/10.1609/aaai.v38i15.29675.
Ravuri, Suman, Karel Lenc, Matthew Willson, Dmitry Kangin, Remi Lam, Piotr Mirowski, Megan Fitzsimons, et al. "Skilful precipitation nowcasting using deep generative models of radar". Nature 597, no. 7878 (September 29, 2021): 672–77. http://dx.doi.org/10.1038/s41586-021-03854-z.
Theses on the topic "Probabilistic deep models"
Misino, Eleonora. "Deep Generative Models with Probabilistic Logic Priors". Master's thesis, Alma Mater Studiorum - Università di Bologna, 2021. http://amslaurea.unibo.it/24058/.
Zhai, Menghua. "Deep Probabilistic Models for Camera Geo-Calibration". UKnowledge, 2018. https://uknowledge.uky.edu/cs_etds/74.
Wu, Di. "Human action recognition using deep probabilistic graphical models". Thesis, University of Sheffield, 2014. http://etheses.whiterose.ac.uk/6603/.
Rossi, Simone. "Improving Scalability and Inference in Probabilistic Deep Models". Electronic Thesis or Diss., Sorbonne université, 2022. http://www.theses.fr/2022SORUS042.
Texto completoThroughout the last decade, deep learning has reached a sufficient level of maturity to become the preferred choice to solve machine learning-related problems or to aid decision making processes.At the same time, deep learning is generally not equipped with the ability to accurately quantify the uncertainty of its predictions, thus making these models less suitable for risk-critical applications.A possible solution to address this problem is to employ a Bayesian formulation; however, while this offers an elegant treatment, it is analytically intractable and it requires approximations.Despite the huge advancements in the last few years, there is still a long way to make these approaches widely applicable.In this thesis, we address some of the challenges for modern Bayesian deep learning, by proposing and studying solutions to improve scalability and inference of these models.The first part of the thesis is dedicated to deep models where inference is carried out using variational inference (VI).Specifically, we study the role of initialization of the variational parameters and we show how careful initialization strategies can make VI deliver good performance even in large scale models.In this part of the thesis we also study the over-regularization effect of the variational objective on over-parametrized models.To tackle this problem, we propose an novel parameterization based on the Walsh-Hadamard transform; not only this solves the over-regularization effect of VI but it also allows us to model non-factorized posteriors while keeping time and space complexity under control.The second part of the thesis is dedicated to a study on the role of priors.While being an essential building block of Bayes' rule, picking good priors for deep learning models is generally hard.For this reason, we propose two different strategies based (i) on the functional interpretation of neural networks and (ii) on a scalable procedure to perform model selection on the prior 
hyper-parameters, akin to maximization of the marginal likelihood.To conclude this part, we analyze a different kind of Bayesian model (Gaussian process) and we study the effect of placing a prior on all the hyper-parameters of these models, including the additional variables required by the inducing-point approximations.We also show how it is possible to infer free-form posteriors on these variables, which conventionally would have been otherwise point-estimated
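The over-regularization effect the abstract mentions comes from the KL term of the mean-field ELBO, which grows with the number of parameters. A minimal sketch (not from the thesis; the function name and values are illustrative) of that KL term for a diagonal-Gaussian posterior against a standard-normal prior:

```python
import numpy as np

def kl_diag_gaussian(mu, sigma, prior_sigma=1.0):
    """KL( N(mu, diag(sigma^2)) || N(0, prior_sigma^2 I) ), summed over dimensions.

    This is the regularization term of the mean-field ELBO; in heavily
    over-parametrized models it can dominate the data-fit term.
    """
    mu = np.asarray(mu, dtype=float)
    sigma = np.asarray(sigma, dtype=float)
    return float(np.sum(
        np.log(prior_sigma / sigma)
        + (sigma**2 + mu**2) / (2.0 * prior_sigma**2)
        - 0.5
    ))

# A posterior equal to the prior incurs zero KL ...
print(kl_diag_gaussian([0.0], [1.0]))  # 0.0
# ... while every extra weight adds its own penalty, even at modest
# per-weight divergence, so the total grows with model size.
print(kl_diag_gaussian(np.full(10, 0.1), np.full(10, 0.5)))
```

The per-dimension sum makes the scaling explicit: ten weights with the same posterior contribute ten times the KL of one.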
Hager, Paul Andrew. "Investigation of connection between deep learning and probabilistic graphical models". Thesis, Massachusetts Institute of Technology, 2018. http://hdl.handle.net/1721.1/119552.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (page 21).
The field of machine learning (ML) has benefited greatly from its relationship with the field of classical statistics. In support of that continued expansion, the following proposes an alternative perspective on the link between these fields. The link focuses on probabilistic graphical models in the context of reinforcement learning. Viewing certain algorithms through the lens of reinforcement learning makes it possible to map ML concepts to statistics problems. Training a multi-layer nonlinear perceptron is equivalent to a structure learning problem in probabilistic graphical models (PGMs). The technique of boosting weak rules into an ensemble is weighted sampling. Finally, regularizing neural networks with the dropout technique is conditioning on certain observations in PGMs.
by Paul Andrew Hager.
M. Eng.
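The dropout technique the abstract maps to conditioning in PGMs has a standard "inverted" formulation. A minimal numpy sketch (the function name is ours, not the thesis's):

```python
import numpy as np

def inverted_dropout(x, drop_prob, rng):
    """Inverted dropout: zero each activation with probability drop_prob and
    rescale the survivors so the expected activation is unchanged."""
    if drop_prob == 0.0:
        return x
    keep = rng.random(x.shape) >= drop_prob
    return x * keep / (1.0 - drop_prob)

rng = np.random.default_rng(0)
x = np.ones(8)
print(inverted_dropout(x, 0.0, rng))  # unchanged
print(inverted_dropout(x, 0.5, rng))  # roughly half zeros, survivors scaled to 2.0
```

Each sampled mask fixes which units are "observed as off", which is the conditioning view the abstract alludes to.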
Farouni, Tarek. "An Overview of Probabilistic Latent Variable Models with an Application to the Deep Unsupervised Learning of Chromatin States". The Ohio State University, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=osu1492189894812539.
Qian, Weizhu. "Discovering human mobility from mobile data : probabilistic models and learning algorithms". Thesis, Bourgogne Franche-Comté, 2020. http://www.theses.fr/2020UBFCA025.
Texto completoSmartphone usage data can be used to study human indoor and outdoor mobility. In our work, we investigate both aspects in proposing machine learning-based algorithms adapted to the different information sources that can be collected.In terms of outdoor mobility, we use the collected GPS coordinate data to discover the daily mobility patterns of the users. To this end, we propose an automatic clustering algorithm using the Dirichlet process Gaussian mixture model (DPGMM) so as to cluster the daily GPS trajectories. This clustering method is based on estimating probability densities of the trajectories, which alleviate the problems caused by the data noise.By contrast, we utilize the collected WiFi fingerprint data to study indoor human mobility. In order to predict the indoor user location at the next time points, we devise a hybrid deep learning model, called the convolutional mixture density recurrent neural network (CMDRNN), which combines the advantages of different multiple deep neural networks. Moreover, as for accurate indoor location recognition, we presume that there exists a latent distribution governing the input and output at the same time. Based on this assumption, we develop a variational auto-encoder (VAE)-based semi-supervised learning model. In the unsupervised learning procedure, we employ a VAE model to learn a latent distribution of the input, the WiFi fingerprint data. In the supervised learning procedure, we use a neural network to compute the target, the user coordinates. Furthermore, based on the same assumption used in the VAE-based semi-supervised learning model, we leverage the information bottleneck theory to devise a variational information bottleneck (VIB)-based model. 
This is an end-to-end deep learning model which is easier to train and has better performance.Finally, we validate thees proposed methods on several public real-world datasets providing thus results that verify the efficiencies of our methods as compared to other existing methods generally used
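The DPGMM used above draws its mixture weights from a Dirichlet process, commonly realized by stick-breaking. A short sketch under a finite truncation (the truncation level and parameter values are our illustration, not the thesis's implementation):

```python
import numpy as np

def stick_breaking_weights(alpha, n_atoms, rng):
    """Truncated stick-breaking construction of Dirichlet-process mixture
    weights: beta_k ~ Beta(1, alpha), w_k = beta_k * prod_{j<k} (1 - beta_j)."""
    betas = rng.beta(1.0, alpha, size=n_atoms)
    # Length of stick remaining before each break: 1, (1-b1), (1-b1)(1-b2), ...
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas)[:-1]])
    return betas * remaining

rng = np.random.default_rng(42)
w = stick_breaking_weights(alpha=1.0, n_atoms=20, rng=rng)
print(w.sum())  # close to, but below, 1.0: mass left in the untruncated tail
```

Smaller `alpha` concentrates mass on few components, which is what lets the DPGMM infer the number of trajectory clusters from the data rather than fixing it in advance.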
SYED, MUHAMMAD FARRUKH SHAHID. "Data-Driven Approach based on Deep Learning and Probabilistic Models for PHY-Layer Security in AI-enabled Cognitive Radio IoT". Doctoral thesis, Università degli studi di Genova, 2021. http://hdl.handle.net/11567/1048543.
El-Shaer, Mennat Allah. "An Experimental Evaluation of Probabilistic Deep Networks for Real-time Traffic Scene Representation using Graphical Processing Units". The Ohio State University, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=osu1546539166677894.
Hu, Xu. "Towards efficient learning of graphical models and neural networks with variational techniques". Thesis, Paris Est, 2019. http://www.theses.fr/2019PESC1037.
In this thesis, I mainly focus on variational inference and probabilistic models. In particular, I cover several projects I worked on during my PhD on improving the efficiency of AI/ML systems with variational techniques. The thesis consists of two parts. In the first part, the computational efficiency of probabilistic graphical models is studied. In the second part, several problems of learning deep neural networks are investigated, related to either energy efficiency or sample efficiency.
Books on the topic "Probabilistic deep models"
Oaksford, Mike, and Nick Chater. Causal Models and Conditional Reasoning. Edited by Michael R. Waldmann. Oxford University Press, 2017. http://dx.doi.org/10.1093/oxfordhb/9780199399550.013.5.
Tutino, Stefania. Uncertainty in Post-Reformation Catholicism. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780190694098.001.0001.
Trappenberg, Thomas P. Fundamentals of Machine Learning. Oxford University Press, 2019. http://dx.doi.org/10.1093/oso/9780198828044.001.0001.
Levinson, Stephen C. Speech Acts. Edited by Yan Huang. Oxford University Press, 2016. http://dx.doi.org/10.1093/oxfordhb/9780199697960.013.22.
Book chapters on the topic "Probabilistic deep models"
Sucar, Luis Enrique. "Deep Learning and Graphical Models". In Probabilistic Graphical Models, 327–46. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-61943-5_16.
Bahadir, Cagla Deniz, Benjamin Liechty, David J. Pisapia, and Mert R. Sabuncu. "Characterizing the Features of Mitotic Figures Using a Conditional Diffusion Probabilistic Model". In Deep Generative Models, 121–31. Cham: Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-53767-7_12.
Gustafsson, Fredrik K., Martin Danelljan, Goutam Bhat, and Thomas B. Schön. "Energy-Based Models for Deep Probabilistic Regression". In Computer Vision – ECCV 2020, 325–43. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-58565-5_20.
Hung, Alex Ling Yu, Zhiqing Sun, Wanwen Chen, and John Galeotti. "Hierarchical Probabilistic Ultrasound Image Inpainting via Variational Inference". In Deep Generative Models, and Data Augmentation, Labelling, and Imperfections, 83–92. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-88210-5_7.
Nkambule, Tshepo, and Ritesh Ajoodha. "Classification of Music by Genre Using Probabilistic Models and Deep Learning Models". In Proceedings of Sixth International Congress on Information and Communication Technology, 185–93. Singapore: Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-16-2102-4_17.
Völcker, Claas, Alejandro Molina, Johannes Neumann, Dirk Westermann, and Kristian Kersting. "DeepNotebooks: Deep Probabilistic Models Construct Python Notebooks for Reporting Datasets". In Machine Learning and Knowledge Discovery in Databases, 28–43. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-43823-4_3.
Dinh, Xuan Truong, and Hai Van Pham. "Social Network Analysis Based on Combining Probabilistic Models with Graph Deep Learning". In Communication and Intelligent Systems, 975–86. Singapore: Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-16-1089-9_76.
Linghu, Yuan, Xiangxue Li, and Zhenlong Zhang. "Deep Learning vs. Traditional Probabilistic Models: Case Study on Short Inputs for Password Guessing". In Algorithms and Architectures for Parallel Processing, 468–83. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-38991-8_31.
Liu, Zheng, and Hao Wang. "Research on Process Diagnosis of Severe Accidents Based on Deep Learning and Probabilistic Safety Analysis". In Springer Proceedings in Physics, 624–34. Singapore: Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-99-1023-6_54.
Stojanovski, David, Uxio Hermida, Pablo Lamata, Arian Beqiri, and Alberto Gomez. "Echo from Noise: Synthetic Ultrasound Image Generation Using Diffusion Models for Real Image Segmentation". In Simplifying Medical Ultrasound, 34–43. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-44521-7_4.
Conference papers on the topic "Probabilistic deep models"
Sidheekh, Sahil, Saurabh Mathur, Athresh Karanam, and Sriraam Natarajan. "Deep Tractable Probabilistic Models". In CODS-COMAD 2024: 7th Joint International Conference on Data Science & Management of Data (11th ACM IKDD CODS and 29th COMAD). New York, NY, USA: ACM, 2024. http://dx.doi.org/10.1145/3632410.3633295.
Liu, Xixi, Che-Tsung Lin, and Christopher Zach. "Energy-based Models for Deep Probabilistic Regression". In 2022 26th International Conference on Pattern Recognition (ICPR). IEEE, 2022. http://dx.doi.org/10.1109/icpr56361.2022.9955636.
Asgariandehkordi, Hojat, Sobhan Goudarzi, Adrian Basarab, and Hassan Rivaz. "Deep Ultrasound Denoising Using Diffusion Probabilistic Models". In 2023 IEEE International Ultrasonics Symposium (IUS). IEEE, 2023. http://dx.doi.org/10.1109/ius51837.2023.10306544.
Villanueva Llerena, Julissa. "Predictive Uncertainty Estimation for Tractable Deep Probabilistic Models". In Twenty-Ninth International Joint Conference on Artificial Intelligence and Seventeenth Pacific Rim International Conference on Artificial Intelligence {IJCAI-PRICAI-20}. California: International Joint Conferences on Artificial Intelligence Organization, 2020. http://dx.doi.org/10.24963/ijcai.2020/745.
Cotterell, Ryan, and Jason Eisner. "Probabilistic Typology: Deep Generative Models of Vowel Inventories". In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). Stroudsburg, PA, USA: Association for Computational Linguistics, 2017. http://dx.doi.org/10.18653/v1/p17-1109.
ZHANG, YANG, YOU-WU WANG, and YI-QING NI. "HYBRID PROBABILISTIC DEEP LEARNING FOR DAMAGE IDENTIFICATION". In Structural Health Monitoring 2023. Destech Publications, Inc., 2023. http://dx.doi.org/10.12783/shm2023/37014.
Li, Xiucheng, Gao Cong, and Yun Cheng. "Spatial Transition Learning on Road Networks with Deep Probabilistic Models". In 2020 IEEE 36th International Conference on Data Engineering (ICDE). IEEE, 2020. http://dx.doi.org/10.1109/icde48307.2020.00037.
Zhu, Jun. "Probabilistic Machine Learning: Models, Algorithms and a Programming Library". In Twenty-Seventh International Joint Conference on Artificial Intelligence {IJCAI-18}. California: International Joint Conferences on Artificial Intelligence Organization, 2018. http://dx.doi.org/10.24963/ijcai.2018/823.
Saleem, Rabia, Bo Yuan, Fatih Kurugollu, and Ashiq Anjum. "Explaining probabilistic Artificial Intelligence (AI) models by discretizing Deep Neural Networks". In 2020 IEEE/ACM 13th International Conference on Utility and Cloud Computing (UCC). IEEE, 2020. http://dx.doi.org/10.1109/ucc48980.2020.00070.
Bejarano, Gissella. "PhD Forum: Deep Learning and Probabilistic Models Applied to Sequential Data". In 2018 IEEE International Conference on Smart Computing (SMARTCOMP). IEEE, 2018. http://dx.doi.org/10.1109/smartcomp.2018.00066.