Academic literature on the topic 'Probabilistic deep models'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Probabilistic deep models.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Journal articles on the topic "Probabilistic deep models"
Masegosa, Andrés R., Rafael Cabañas, Helge Langseth, Thomas D. Nielsen, and Antonio Salmerón. "Probabilistic Models with Deep Neural Networks." Entropy 23, no. 1 (January 18, 2021): 117. http://dx.doi.org/10.3390/e23010117.
Villanueva Llerena, Julissa, and Denis Deratani Maua. "Efficient Predictive Uncertainty Estimators for Deep Probabilistic Models." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 10 (April 3, 2020): 13740–41. http://dx.doi.org/10.1609/aaai.v34i10.7142.
Karami, Mahdi, and Dale Schuurmans. "Deep Probabilistic Canonical Correlation Analysis." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 9 (May 18, 2021): 8055–63. http://dx.doi.org/10.1609/aaai.v35i9.16982.
Lu, Ming, Zhihao Duan, Fengqing Zhu, and Zhan Ma. "Deep Hierarchical Video Compression." Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 8 (March 24, 2024): 8859–67. http://dx.doi.org/10.1609/aaai.v38i8.28733.
Maroñas, Juan, Roberto Paredes, and Daniel Ramos. "Calibration of deep probabilistic models with decoupled bayesian neural networks." Neurocomputing 407 (September 2020): 194–205. http://dx.doi.org/10.1016/j.neucom.2020.04.103.
Li, Zhenjun, Xi Liu, Dawei Kou, Yi Hu, Qingrui Zhang, and Qingxi Yuan. "Probabilistic Models for the Shear Strength of RC Deep Beams." Applied Sciences 13, no. 8 (April 12, 2023): 4853. http://dx.doi.org/10.3390/app13084853.
Serpell, Cristián, Ignacio A. Araya, Carlos Valle, and Héctor Allende. "Addressing model uncertainty in probabilistic forecasting using Monte Carlo dropout." Intelligent Data Analysis 24 (December 4, 2020): 185–205. http://dx.doi.org/10.3233/ida-200015.
Boursin, Nicolas, Carl Remlinger, and Joseph Mikael. "Deep Generators on Commodity Markets Application to Deep Hedging." Risks 11, no. 1 (December 23, 2022): 7. http://dx.doi.org/10.3390/risks11010007.
Zuidberg Dos Martires, Pedro. "Probabilistic Neural Circuits." Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 15 (March 24, 2024): 17280–89. http://dx.doi.org/10.1609/aaai.v38i15.29675.
Ravuri, Suman, Karel Lenc, Matthew Willson, Dmitry Kangin, Remi Lam, Piotr Mirowski, Megan Fitzsimons, et al. "Skilful precipitation nowcasting using deep generative models of radar." Nature 597, no. 7878 (September 29, 2021): 672–77. http://dx.doi.org/10.1038/s41586-021-03854-z.
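A recurring technique among the articles above is Monte Carlo dropout, which Serpell et al. use to address model uncertainty in probabilistic forecasting. As a minimal illustrative sketch (a toy linear model, not any of the cited authors' implementations), the idea is to keep dropout active at prediction time and summarize repeated stochastic forward passes into a predictive mean and spread:

```python
import random
import statistics

def dropout_forward(x, weights, p_drop, rng):
    """One stochastic forward pass of a toy linear model with
    inverted dropout: each weight is kept with probability
    1 - p_drop and rescaled so the expectation is unchanged."""
    keep = 1.0 - p_drop
    total = 0.0
    for xi, wi in zip(x, weights):
        mask = 1.0 if rng.random() < keep else 0.0
        total += xi * wi * mask / keep
    return total

def mc_dropout_predict(x, weights, p_drop=0.1, n_samples=200, seed=0):
    """Keep dropout active at prediction time and use the spread of
    repeated stochastic passes as a crude uncertainty estimate."""
    rng = random.Random(seed)
    preds = [dropout_forward(x, weights, p_drop, rng) for _ in range(n_samples)]
    return statistics.mean(preds), statistics.pstdev(preds)

mean, spread = mc_dropout_predict([1.0, 2.0, 3.0], [0.5, -0.2, 0.1])
```

The spread shrinks as the dropout rate decreases and collapses to zero when `p_drop = 0`, which is one way to see that the stochastic masks are the sole source of this uncertainty estimate.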
Dissertations / Theses on the topic "Probabilistic deep models"
Misino, Eleonora. "Deep Generative Models with Probabilistic Logic Priors." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2021. http://amslaurea.unibo.it/24058/.
Zhai, Menghua. "Deep Probabilistic Models for Camera Geo-Calibration." UKnowledge, 2018. https://uknowledge.uky.edu/cs_etds/74.
Wu, Di. "Human action recognition using deep probabilistic graphical models." Thesis, University of Sheffield, 2014. http://etheses.whiterose.ac.uk/6603/.
Rossi, Simone. "Improving Scalability and Inference in Probabilistic Deep Models." Electronic Thesis or Diss., Sorbonne université, 2022. http://www.theses.fr/2022SORUS042.
Full textThroughout the last decade, deep learning has reached a sufficient level of maturity to become the preferred choice to solve machine learning-related problems or to aid decision making processes.At the same time, deep learning is generally not equipped with the ability to accurately quantify the uncertainty of its predictions, thus making these models less suitable for risk-critical applications.A possible solution to address this problem is to employ a Bayesian formulation; however, while this offers an elegant treatment, it is analytically intractable and it requires approximations.Despite the huge advancements in the last few years, there is still a long way to make these approaches widely applicable.In this thesis, we address some of the challenges for modern Bayesian deep learning, by proposing and studying solutions to improve scalability and inference of these models.The first part of the thesis is dedicated to deep models where inference is carried out using variational inference (VI).Specifically, we study the role of initialization of the variational parameters and we show how careful initialization strategies can make VI deliver good performance even in large scale models.In this part of the thesis we also study the over-regularization effect of the variational objective on over-parametrized models.To tackle this problem, we propose an novel parameterization based on the Walsh-Hadamard transform; not only this solves the over-regularization effect of VI but it also allows us to model non-factorized posteriors while keeping time and space complexity under control.The second part of the thesis is dedicated to a study on the role of priors.While being an essential building block of Bayes' rule, picking good priors for deep learning models is generally hard.For this reason, we propose two different strategies based (i) on the functional interpretation of neural networks and (ii) on a scalable procedure to perform model selection on the prior 
hyper-parameters, akin to maximization of the marginal likelihood.To conclude this part, we analyze a different kind of Bayesian model (Gaussian process) and we study the effect of placing a prior on all the hyper-parameters of these models, including the additional variables required by the inducing-point approximations.We also show how it is possible to infer free-form posteriors on these variables, which conventionally would have been otherwise point-estimated
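The over-regularization effect mentioned in the abstract above can be made concrete with the closed-form KL term of mean-field Gaussian VI. The following sketch (an illustrative toy, not code from the thesis) shows how the KL penalty against a fixed prior accumulates with parameter count:

```python
import math

def kl_gauss(mu_q, sigma_q, mu_p=0.0, sigma_p=1.0):
    """Closed-form KL(q || p) between univariate Gaussians,
    the per-weight regularizer in mean-field Gaussian VI."""
    return (math.log(sigma_p / sigma_q)
            + (sigma_q ** 2 + (mu_q - mu_p) ** 2) / (2.0 * sigma_p ** 2)
            - 0.5)

# With a fixed factorized prior, the KL term in the variational objective
# is a sum over all weights, so an over-parametrized model pays a
# regularization cost that grows with the parameter count. This is one
# intuition behind the over-regularization effect the thesis discusses.
per_weight = kl_gauss(0.1, 0.5)  # KL of a single variational factor
penalties = [n * per_weight for n in (10, 1_000, 100_000)]
```

Note that the KL vanishes when the variational factor matches the prior exactly, so the penalty effectively measures how far each weight's posterior is allowed to move.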
Hager, Paul Andrew. "Investigation of connection between deep learning and probabilistic graphical models." Thesis, Massachusetts Institute of Technology, 2018. http://hdl.handle.net/1721.1/119552.
The field of machine learning (ML) has benefitted greatly from its relationship with classical statistics. In support of that continued expansion, the following proposes an alternative perspective on the link between these fields, focused on probabilistic graphical models (PGMs) in the context of reinforcement learning. Viewing certain algorithms as reinforcement learning gives one the ability to map ML concepts onto statistics problems. Training a multi-layer nonlinear perceptron is equivalent to structure learning in PGMs; the technique of boosting weak rules into an ensemble is weighted sampling; and regularizing neural networks using the dropout technique is conditioning on certain observations in PGMs.
Farouni, Tarek. "An Overview of Probabilistic Latent Variable Models with anApplication to the Deep Unsupervised Learning of ChromatinStates." The Ohio State University, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=osu1492189894812539.
Qian, Weizhu. "Discovering human mobility from mobile data : probabilistic models and learning algorithms." Thesis, Bourgogne Franche-Comté, 2020. http://www.theses.fr/2020UBFCA025.
Smartphone usage data can be used to study human indoor and outdoor mobility. In our work, we investigate both aspects, proposing machine-learning-based algorithms adapted to the different information sources that can be collected. For outdoor mobility, we use collected GPS coordinate data to discover the daily mobility patterns of users. To this end, we propose an automatic clustering algorithm using the Dirichlet process Gaussian mixture model (DPGMM) to cluster daily GPS trajectories. This clustering method is based on estimating the probability densities of the trajectories, which alleviates the problems caused by data noise. By contrast, we utilize collected WiFi fingerprint data to study indoor human mobility. To predict the indoor user location at the next time points, we devise a hybrid deep learning model, called the convolutional mixture density recurrent neural network (CMDRNN), which combines the advantages of multiple deep neural networks. Moreover, for accurate indoor location recognition, we presume that there exists a latent distribution governing the input and the output at the same time. Based on this assumption, we develop a variational auto-encoder (VAE)-based semi-supervised learning model. In the unsupervised learning procedure, we employ a VAE model to learn a latent distribution of the input, the WiFi fingerprint data. In the supervised learning procedure, we use a neural network to compute the target, the user coordinates. Furthermore, based on the same assumption used in the VAE-based semi-supervised learning model, we leverage information bottleneck theory to devise a variational information bottleneck (VIB)-based model. This is an end-to-end deep learning model that is easier to train and has better performance. Finally, we validate these proposed methods on several public real-world datasets, with results that verify the efficiency of our methods compared to other existing methods.
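The DPGMM mentioned in the abstract above rests on a Dirichlet process prior over mixture components. Its stick-breaking construction can be sketched in a few lines (an illustrative truncation, not the thesis implementation):

```python
import random

def stick_breaking_weights(alpha, n_atoms, seed=0):
    """Truncated stick-breaking construction of Dirichlet process
    mixture weights: break a unit stick with Beta(1, alpha) fractions.
    Smaller alpha concentrates mass on fewer components, which is how
    a DP mixture adapts its effective number of clusters to the data."""
    rng = random.Random(seed)
    remaining, weights = 1.0, []
    for _ in range(n_atoms):
        # Beta(1, alpha) sample via inverse CDF: F(v) = 1 - (1 - v)**alpha
        v = 1.0 - rng.random() ** (1.0 / alpha)
        weights.append(remaining * v)
        remaining *= 1.0 - v
    return weights

weights = stick_breaking_weights(alpha=1.0, n_atoms=20)
```

Because the truncation keeps only finitely many atoms, the weights sum to slightly less than one; in practice the remainder is either renormalized or absorbed into a final component.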
Syed, Muhammad Farrukh Shahid. "Data-Driven Approach based on Deep Learning and Probabilistic Models for PHY-Layer Security in AI-enabled Cognitive Radio IoT." Doctoral thesis, Università degli studi di Genova, 2021. http://hdl.handle.net/11567/1048543.
El-Shaer, Mennat Allah. "An Experimental Evaluation of Probabilistic Deep Networks for Real-time Traffic Scene Representation using Graphical Processing Units." The Ohio State University, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=osu1546539166677894.
Hu, Xu. "Towards efficient learning of graphical models and neural networks with variational techniques." Thesis, Paris Est, 2019. http://www.theses.fr/2019PESC1037.
In this thesis, I focus mainly on variational inference and probabilistic models. In particular, I cover several projects I worked on during my PhD concerning improving the efficiency of AI/ML systems with variational techniques. The thesis consists of two parts. In the first part, the computational efficiency of probabilistic graphical models is studied. In the second part, several problems in learning deep neural networks are investigated, related to either energy efficiency or sample efficiency.
Books on the topic "Probabilistic deep models"
Oaksford, Mike, and Nick Chater. Causal Models and Conditional Reasoning. Edited by Michael R. Waldmann. Oxford University Press, 2017. http://dx.doi.org/10.1093/oxfordhb/9780199399550.013.5.
Tutino, Stefania. Uncertainty in Post-Reformation Catholicism. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780190694098.001.0001.
Trappenberg, Thomas P. Fundamentals of Machine Learning. Oxford University Press, 2019. http://dx.doi.org/10.1093/oso/9780198828044.001.0001.
Levinson, Stephen C. Speech Acts. Edited by Yan Huang. Oxford University Press, 2016. http://dx.doi.org/10.1093/oxfordhb/9780199697960.013.22.
Book chapters on the topic "Probabilistic deep models"
Sucar, Luis Enrique. "Deep Learning and Graphical Models." In Probabilistic Graphical Models, 327–46. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-61943-5_16.
Bahadir, Cagla Deniz, Benjamin Liechty, David J. Pisapia, and Mert R. Sabuncu. "Characterizing the Features of Mitotic Figures Using a Conditional Diffusion Probabilistic Model." In Deep Generative Models, 121–31. Cham: Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-53767-7_12.
Gustafsson, Fredrik K., Martin Danelljan, Goutam Bhat, and Thomas B. Schön. "Energy-Based Models for Deep Probabilistic Regression." In Computer Vision – ECCV 2020, 325–43. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-58565-5_20.
Hung, Alex Ling Yu, Zhiqing Sun, Wanwen Chen, and John Galeotti. "Hierarchical Probabilistic Ultrasound Image Inpainting via Variational Inference." In Deep Generative Models, and Data Augmentation, Labelling, and Imperfections, 83–92. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-88210-5_7.
Nkambule, Tshepo, and Ritesh Ajoodha. "Classification of Music by Genre Using Probabilistic Models and Deep Learning Models." In Proceedings of Sixth International Congress on Information and Communication Technology, 185–93. Singapore: Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-16-2102-4_17.
Völcker, Claas, Alejandro Molina, Johannes Neumann, Dirk Westermann, and Kristian Kersting. "DeepNotebooks: Deep Probabilistic Models Construct Python Notebooks for Reporting Datasets." In Machine Learning and Knowledge Discovery in Databases, 28–43. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-43823-4_3.
Dinh, Xuan Truong, and Hai Van Pham. "Social Network Analysis Based on Combining Probabilistic Models with Graph Deep Learning." In Communication and Intelligent Systems, 975–86. Singapore: Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-16-1089-9_76.
Linghu, Yuan, Xiangxue Li, and Zhenlong Zhang. "Deep Learning vs. Traditional Probabilistic Models: Case Study on Short Inputs for Password Guessing." In Algorithms and Architectures for Parallel Processing, 468–83. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-38991-8_31.
Liu, Zheng, and Hao Wang. "Research on Process Diagnosis of Severe Accidents Based on Deep Learning and Probabilistic Safety Analysis." In Springer Proceedings in Physics, 624–34. Singapore: Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-99-1023-6_54.
Stojanovski, David, Uxio Hermida, Pablo Lamata, Arian Beqiri, and Alberto Gomez. "Echo from Noise: Synthetic Ultrasound Image Generation Using Diffusion Models for Real Image Segmentation." In Simplifying Medical Ultrasound, 34–43. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-44521-7_4.
Conference papers on the topic "Probabilistic deep models"
Sidheekh, Sahil, Saurabh Mathur, Athresh Karanam, and Sriraam Natarajan. "Deep Tractable Probabilistic Models." In CODS-COMAD 2024: 7th Joint International Conference on Data Science & Management of Data (11th ACM IKDD CODS and 29th COMAD). New York, NY, USA: ACM, 2024. http://dx.doi.org/10.1145/3632410.3633295.
Liu, Xixi, Che-Tsung Lin, and Christopher Zach. "Energy-based Models for Deep Probabilistic Regression." In 2022 26th International Conference on Pattern Recognition (ICPR). IEEE, 2022. http://dx.doi.org/10.1109/icpr56361.2022.9955636.
Asgariandehkordi, Hojat, Sobhan Goudarzi, Adrian Basarab, and Hassan Rivaz. "Deep Ultrasound Denoising Using Diffusion Probabilistic Models." In 2023 IEEE International Ultrasonics Symposium (IUS). IEEE, 2023. http://dx.doi.org/10.1109/ius51837.2023.10306544.
Villanueva Llerena, Julissa. "Predictive Uncertainty Estimation for Tractable Deep Probabilistic Models." In Twenty-Ninth International Joint Conference on Artificial Intelligence and Seventeenth Pacific Rim International Conference on Artificial Intelligence {IJCAI-PRICAI-20}. California: International Joint Conferences on Artificial Intelligence Organization, 2020. http://dx.doi.org/10.24963/ijcai.2020/745.
Cotterell, Ryan, and Jason Eisner. "Probabilistic Typology: Deep Generative Models of Vowel Inventories." In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). Stroudsburg, PA, USA: Association for Computational Linguistics, 2017. http://dx.doi.org/10.18653/v1/p17-1109.
Zhang, Yang, You-Wu Wang, and Yi-Qing Ni. "Hybrid Probabilistic Deep Learning for Damage Identification." In Structural Health Monitoring 2023. Destech Publications, Inc., 2023. http://dx.doi.org/10.12783/shm2023/37014.
Li, Xiucheng, Gao Cong, and Yun Cheng. "Spatial Transition Learning on Road Networks with Deep Probabilistic Models." In 2020 IEEE 36th International Conference on Data Engineering (ICDE). IEEE, 2020. http://dx.doi.org/10.1109/icde48307.2020.00037.
Zhu, Jun. "Probabilistic Machine Learning: Models, Algorithms and a Programming Library." In Twenty-Seventh International Joint Conference on Artificial Intelligence {IJCAI-18}. California: International Joint Conferences on Artificial Intelligence Organization, 2018. http://dx.doi.org/10.24963/ijcai.2018/823.
Saleem, Rabia, Bo Yuan, Fatih Kurugollu, and Ashiq Anjum. "Explaining probabilistic Artificial Intelligence (AI) models by discretizing Deep Neural Networks." In 2020 IEEE/ACM 13th International Conference on Utility and Cloud Computing (UCC). IEEE, 2020. http://dx.doi.org/10.1109/ucc48980.2020.00070.
Bejarano, Gissella. "PhD Forum: Deep Learning and Probabilistic Models Applied to Sequential Data." In 2018 IEEE International Conference on Smart Computing (SMARTCOMP). IEEE, 2018. http://dx.doi.org/10.1109/smartcomp.2018.00066.