Selected scientific literature on the topic "Probabilistic deep models"
Consult the list of current articles, books, theses, conference proceedings, and other scientific sources relevant to the topic "Probabilistic deep models".
Journal articles on the topic "Probabilistic deep models"
Masegosa, Andrés R., Rafael Cabañas, Helge Langseth, Thomas D. Nielsen, and Antonio Salmerón. "Probabilistic Models with Deep Neural Networks". Entropy 23, no. 1 (January 18, 2021): 117. http://dx.doi.org/10.3390/e23010117.
Villanueva Llerena, Julissa, and Denis Deratani Maua. "Efficient Predictive Uncertainty Estimators for Deep Probabilistic Models". Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 10 (April 3, 2020): 13740–41. http://dx.doi.org/10.1609/aaai.v34i10.7142.
Karami, Mahdi, and Dale Schuurmans. "Deep Probabilistic Canonical Correlation Analysis". Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 9 (May 18, 2021): 8055–63. http://dx.doi.org/10.1609/aaai.v35i9.16982.
Lu, Ming, Zhihao Duan, Fengqing Zhu, and Zhan Ma. "Deep Hierarchical Video Compression". Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 8 (March 24, 2024): 8859–67. http://dx.doi.org/10.1609/aaai.v38i8.28733.
Maroñas, Juan, Roberto Paredes, and Daniel Ramos. "Calibration of deep probabilistic models with decoupled bayesian neural networks". Neurocomputing 407 (September 2020): 194–205. http://dx.doi.org/10.1016/j.neucom.2020.04.103.
Li, Zhenjun, Xi Liu, Dawei Kou, Yi Hu, Qingrui Zhang, and Qingxi Yuan. "Probabilistic Models for the Shear Strength of RC Deep Beams". Applied Sciences 13, no. 8 (April 12, 2023): 4853. http://dx.doi.org/10.3390/app13084853.
Serpell, Cristián, Ignacio A. Araya, Carlos Valle, and Héctor Allende. "Addressing model uncertainty in probabilistic forecasting using Monte Carlo dropout". Intelligent Data Analysis 24 (December 4, 2020): 185–205. http://dx.doi.org/10.3233/ida-200015.
Boursin, Nicolas, Carl Remlinger, and Joseph Mikael. "Deep Generators on Commodity Markets Application to Deep Hedging". Risks 11, no. 1 (December 23, 2022): 7. http://dx.doi.org/10.3390/risks11010007.
Zuidberg Dos Martires, Pedro. "Probabilistic Neural Circuits". Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 15 (March 24, 2024): 17280–89. http://dx.doi.org/10.1609/aaai.v38i15.29675.
Ravuri, Suman, Karel Lenc, Matthew Willson, Dmitry Kangin, Remi Lam, Piotr Mirowski, Megan Fitzsimons et al. "Skilful precipitation nowcasting using deep generative models of radar". Nature 597, no. 7878 (September 29, 2021): 672–77. http://dx.doi.org/10.1038/s41586-021-03854-z.
Theses / dissertations on the topic "Probabilistic deep models"
Misino, Eleonora. "Deep Generative Models with Probabilistic Logic Priors". Master's thesis, Alma Mater Studiorum - Università di Bologna, 2021. http://amslaurea.unibo.it/24058/.
Zhai, Menghua. "Deep Probabilistic Models for Camera Geo-Calibration". UKnowledge, 2018. https://uknowledge.uky.edu/cs_etds/74.
Wu, Di. "Human action recognition using deep probabilistic graphical models". Thesis, University of Sheffield, 2014. http://etheses.whiterose.ac.uk/6603/.
Rossi, Simone. "Improving Scalability and Inference in Probabilistic Deep Models". Electronic Thesis or Diss., Sorbonne université, 2022. http://www.theses.fr/2022SORUS042.
Texto completo da fonteThroughout the last decade, deep learning has reached a sufficient level of maturity to become the preferred choice to solve machine learning-related problems or to aid decision making processes.At the same time, deep learning is generally not equipped with the ability to accurately quantify the uncertainty of its predictions, thus making these models less suitable for risk-critical applications.A possible solution to address this problem is to employ a Bayesian formulation; however, while this offers an elegant treatment, it is analytically intractable and it requires approximations.Despite the huge advancements in the last few years, there is still a long way to make these approaches widely applicable.In this thesis, we address some of the challenges for modern Bayesian deep learning, by proposing and studying solutions to improve scalability and inference of these models.The first part of the thesis is dedicated to deep models where inference is carried out using variational inference (VI).Specifically, we study the role of initialization of the variational parameters and we show how careful initialization strategies can make VI deliver good performance even in large scale models.In this part of the thesis we also study the over-regularization effect of the variational objective on over-parametrized models.To tackle this problem, we propose an novel parameterization based on the Walsh-Hadamard transform; not only this solves the over-regularization effect of VI but it also allows us to model non-factorized posteriors while keeping time and space complexity under control.The second part of the thesis is dedicated to a study on the role of priors.While being an essential building block of Bayes' rule, picking good priors for deep learning models is generally hard.For this reason, we propose two different strategies based (i) on the functional interpretation of neural networks and (ii) on a scalable procedure to perform model selection on the 
prior hyper-parameters, akin to maximization of the marginal likelihood.To conclude this part, we analyze a different kind of Bayesian model (Gaussian process) and we study the effect of placing a prior on all the hyper-parameters of these models, including the additional variables required by the inducing-point approximations.We also show how it is possible to infer free-form posteriors on these variables, which conventionally would have been otherwise point-estimated
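The variational-inference setup discussed in this abstract, including the role of initialization, can be made concrete with a minimal mean-field sketch. This is a toy illustration in NumPy, not code from the thesis; all names, shapes, and the particular initialization are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_weights(mu, rho, rng):
    """Reparameterization trick: w = mu + softplus(rho) * eps, eps ~ N(0, I)."""
    sigma = np.log1p(np.exp(rho))  # softplus keeps the std. dev. positive
    eps = rng.standard_normal(mu.shape)
    return mu + sigma * eps, sigma

def kl_to_standard_normal(mu, sigma):
    """KL( N(mu, sigma^2) || N(0, 1) ) summed over all weights; this is the
    regularization term of the variational objective (the ELBO)."""
    return np.sum(np.log(1.0 / sigma) + (sigma**2 + mu**2) / 2.0 - 0.5)

# Variational parameters of a single 2 -> 1 linear layer.
mu = np.zeros((2, 1))
rho = np.full((2, 1), -3.0)  # small initial sigma: one simple "careful" init

w, sigma = sample_weights(mu, rho, rng)
kl = kl_to_standard_normal(mu, sigma)
```

Each training step would draw fresh weights with `sample_weights`, add `kl` to the data log-likelihood term, and backpropagate through `mu` and `rho`; the KL term is also the source of the over-regularization effect the abstract mentions for over-parametrized models.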
Hager, Paul Andrew. "Investigation of connection between deep learning and probabilistic graphical models". Thesis, Massachusetts Institute of Technology, 2018. http://hdl.handle.net/1721.1/119552.
Texto completo da fonteThis electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (page 21).
The field of machine learning (ML) has benefitted greatly from its relationship with the field of classical statistics. In support of that continued expansion, the following proposes an alternative perspective on the link between these fields, focusing on probabilistic graphical models in the context of reinforcement learning. Viewing certain algorithms through the lens of reinforcement learning makes it possible to map ML concepts to statistics problems. Training a multi-layer nonlinear perceptron is equivalent to a structure learning problem in probabilistic graphical models (PGMs); boosting weak rules into an ensemble is weighted sampling; and regularizing neural networks with the dropout technique is conditioning on certain observations in PGMs.
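The dropout correspondence in this abstract is easiest to see in code: keeping the random masks active at prediction time (Monte Carlo dropout) turns a deterministic network into a sampler over sub-models, each mask conditioning on a different subset of active units. A toy NumPy sketch, with all shapes and values illustrative rather than taken from the thesis:

```python
import numpy as np

rng = np.random.default_rng(42)

def mc_dropout_predict(x, W, p_drop, n_samples, rng):
    """Keep dropout active at prediction time and average over random masks,
    yielding a predictive mean and a simple uncertainty estimate."""
    preds = []
    for _ in range(n_samples):
        mask = rng.random(W.shape) >= p_drop           # keep each weight w.p. 1 - p_drop
        preds.append(x @ (W * mask) / (1.0 - p_drop))  # inverted-dropout rescaling
    preds = np.stack(preds)
    return preds.mean(axis=0), preds.std(axis=0)

x = np.array([1.0, 2.0])       # one input with two features
W = np.array([[0.5], [-0.3]])  # weights of a toy 2 -> 1 linear "network"
mean, std = mc_dropout_predict(x, W, p_drop=0.2, n_samples=1000, rng=rng)
```

The rescaling by `1 / (1 - p_drop)` keeps the Monte Carlo mean an unbiased estimate of the no-dropout prediction, while `std` reflects the spread over sampled sub-models.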
by Paul Andrew Hager.
M. Eng.
Farouni, Tarek. "An Overview of Probabilistic Latent Variable Models with an Application to the Deep Unsupervised Learning of Chromatin States". The Ohio State University, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=osu1492189894812539.
Qian, Weizhu. "Discovering human mobility from mobile data : probabilistic models and learning algorithms". Thesis, Bourgogne Franche-Comté, 2020. http://www.theses.fr/2020UBFCA025.
Texto completo da fonteSmartphone usage data can be used to study human indoor and outdoor mobility. In our work, we investigate both aspects in proposing machine learning-based algorithms adapted to the different information sources that can be collected.In terms of outdoor mobility, we use the collected GPS coordinate data to discover the daily mobility patterns of the users. To this end, we propose an automatic clustering algorithm using the Dirichlet process Gaussian mixture model (DPGMM) so as to cluster the daily GPS trajectories. This clustering method is based on estimating probability densities of the trajectories, which alleviate the problems caused by the data noise.By contrast, we utilize the collected WiFi fingerprint data to study indoor human mobility. In order to predict the indoor user location at the next time points, we devise a hybrid deep learning model, called the convolutional mixture density recurrent neural network (CMDRNN), which combines the advantages of different multiple deep neural networks. Moreover, as for accurate indoor location recognition, we presume that there exists a latent distribution governing the input and output at the same time. Based on this assumption, we develop a variational auto-encoder (VAE)-based semi-supervised learning model. In the unsupervised learning procedure, we employ a VAE model to learn a latent distribution of the input, the WiFi fingerprint data. In the supervised learning procedure, we use a neural network to compute the target, the user coordinates. Furthermore, based on the same assumption used in the VAE-based semi-supervised learning model, we leverage the information bottleneck theory to devise a variational information bottleneck (VIB)-based model. 
This is an end-to-end deep learning model which is easier to train and has better performance.Finally, we validate thees proposed methods on several public real-world datasets providing thus results that verify the efficiencies of our methods as compared to other existing methods generally used
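The DPGMM used above for trajectory clustering rests on the stick-breaking construction of the Dirichlet process, under which the number of clusters is not fixed in advance. A small self-contained NumPy sketch of (truncated) stick-breaking, offered as an illustration of the construction rather than the author's code:

```python
import numpy as np

rng = np.random.default_rng(7)

def stick_breaking_weights(alpha, n_components, rng):
    """Sample mixture weights from a truncated stick-breaking process:
    beta_k ~ Beta(1, alpha); w_k = beta_k * prod_{j<k} (1 - beta_j).
    Small alpha concentrates mass on a few components, so the effective
    number of clusters is inferred from the data rather than fixed."""
    betas = rng.beta(1.0, alpha, size=n_components)
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas)[:-1]))
    return betas * remaining

w = stick_breaking_weights(alpha=1.0, n_components=20, rng=rng)
```

In a full DPGMM, each weight `w_k` would be paired with Gaussian parameters drawn from a base measure, and posterior inference over cluster assignments would group the GPS trajectories.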
Syed, Muhammad Farrukh Shahid. "Data-Driven Approach based on Deep Learning and Probabilistic Models for PHY-Layer Security in AI-enabled Cognitive Radio IoT". Doctoral thesis, Università degli studi di Genova, 2021. http://hdl.handle.net/11567/1048543.
El-Shaer, Mennat Allah. "An Experimental Evaluation of Probabilistic Deep Networks for Real-time Traffic Scene Representation using Graphical Processing Units". The Ohio State University, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=osu1546539166677894.
Hu, Xu. "Towards efficient learning of graphical models and neural networks with variational techniques". Thesis, Paris Est, 2019. http://www.theses.fr/2019PESC1037.
Texto completo da fonteIn this thesis, I will mainly focus on variational inference and probabilistic models. In particular, I will cover several projects I have been working on during my PhD about improving the efficiency of AI/ML systems with variational techniques. The thesis consists of two parts. In the first part, the computational efficiency of probabilistic graphical models is studied. In the second part, several problems of learning deep neural networks are investigated, which are related to either energy efficiency or sample efficiency
Books on the topic "Probabilistic deep models"
Oaksford, Mike, and Nick Chater. Causal Models and Conditional Reasoning. Edited by Michael R. Waldmann. Oxford University Press, 2017. http://dx.doi.org/10.1093/oxfordhb/9780199399550.013.5.
Tutino, Stefania. Uncertainty in Post-Reformation Catholicism. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780190694098.001.0001.
Trappenberg, Thomas P. Fundamentals of Machine Learning. Oxford University Press, 2019. http://dx.doi.org/10.1093/oso/9780198828044.001.0001.
Levinson, Stephen C. Speech Acts. Edited by Yan Huang. Oxford University Press, 2016. http://dx.doi.org/10.1093/oxfordhb/9780199697960.013.22.
Book chapters on the topic "Probabilistic deep models"
Sucar, Luis Enrique. "Deep Learning and Graphical Models". In Probabilistic Graphical Models, 327–46. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-61943-5_16.
Bahadir, Cagla Deniz, Benjamin Liechty, David J. Pisapia, and Mert R. Sabuncu. "Characterizing the Features of Mitotic Figures Using a Conditional Diffusion Probabilistic Model". In Deep Generative Models, 121–31. Cham: Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-53767-7_12.
Gustafsson, Fredrik K., Martin Danelljan, Goutam Bhat, and Thomas B. Schön. "Energy-Based Models for Deep Probabilistic Regression". In Computer Vision – ECCV 2020, 325–43. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-58565-5_20.
Hung, Alex Ling Yu, Zhiqing Sun, Wanwen Chen, and John Galeotti. "Hierarchical Probabilistic Ultrasound Image Inpainting via Variational Inference". In Deep Generative Models, and Data Augmentation, Labelling, and Imperfections, 83–92. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-88210-5_7.
Nkambule, Tshepo, and Ritesh Ajoodha. "Classification of Music by Genre Using Probabilistic Models and Deep Learning Models". In Proceedings of Sixth International Congress on Information and Communication Technology, 185–93. Singapore: Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-16-2102-4_17.
Völcker, Claas, Alejandro Molina, Johannes Neumann, Dirk Westermann, and Kristian Kersting. "DeepNotebooks: Deep Probabilistic Models Construct Python Notebooks for Reporting Datasets". In Machine Learning and Knowledge Discovery in Databases, 28–43. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-43823-4_3.
Dinh, Xuan Truong, and Hai Van Pham. "Social Network Analysis Based on Combining Probabilistic Models with Graph Deep Learning". In Communication and Intelligent Systems, 975–86. Singapore: Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-16-1089-9_76.
Linghu, Yuan, Xiangxue Li, and Zhenlong Zhang. "Deep Learning vs. Traditional Probabilistic Models: Case Study on Short Inputs for Password Guessing". In Algorithms and Architectures for Parallel Processing, 468–83. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-38991-8_31.
Liu, Zheng, and Hao Wang. "Research on Process Diagnosis of Severe Accidents Based on Deep Learning and Probabilistic Safety Analysis". In Springer Proceedings in Physics, 624–34. Singapore: Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-99-1023-6_54.
Stojanovski, David, Uxio Hermida, Pablo Lamata, Arian Beqiri, and Alberto Gomez. "Echo from Noise: Synthetic Ultrasound Image Generation Using Diffusion Models for Real Image Segmentation". In Simplifying Medical Ultrasound, 34–43. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-44521-7_4.
Texto completo da fonteTrabalhos de conferências sobre o assunto "Probabilistic deep models"
Sidheekh, Sahil, Saurabh Mathur, Athresh Karanam, and Sriraam Natarajan. "Deep Tractable Probabilistic Models". In CODS-COMAD 2024: 7th Joint International Conference on Data Science & Management of Data (11th ACM IKDD CODS and 29th COMAD). New York, NY, USA: ACM, 2024. http://dx.doi.org/10.1145/3632410.3633295.
Liu, Xixi, Che-Tsung Lin, and Christopher Zach. "Energy-based Models for Deep Probabilistic Regression". In 2022 26th International Conference on Pattern Recognition (ICPR). IEEE, 2022. http://dx.doi.org/10.1109/icpr56361.2022.9955636.
Asgariandehkordi, Hojat, Sobhan Goudarzi, Adrian Basarab, and Hassan Rivaz. "Deep Ultrasound Denoising Using Diffusion Probabilistic Models". In 2023 IEEE International Ultrasonics Symposium (IUS). IEEE, 2023. http://dx.doi.org/10.1109/ius51837.2023.10306544.
Villanueva Llerena, Julissa. "Predictive Uncertainty Estimation for Tractable Deep Probabilistic Models". In Twenty-Ninth International Joint Conference on Artificial Intelligence and Seventeenth Pacific Rim International Conference on Artificial Intelligence {IJCAI-PRICAI-20}. California: International Joint Conferences on Artificial Intelligence Organization, 2020. http://dx.doi.org/10.24963/ijcai.2020/745.
Cotterell, Ryan, and Jason Eisner. "Probabilistic Typology: Deep Generative Models of Vowel Inventories". In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). Stroudsburg, PA, USA: Association for Computational Linguistics, 2017. http://dx.doi.org/10.18653/v1/p17-1109.
Zhang, Yang, You-Wu Wang, and Yi-Qing Ni. "Hybrid Probabilistic Deep Learning for Damage Identification". In Structural Health Monitoring 2023. Destech Publications, Inc., 2023. http://dx.doi.org/10.12783/shm2023/37014.
Li, Xiucheng, Gao Cong, and Yun Cheng. "Spatial Transition Learning on Road Networks with Deep Probabilistic Models". In 2020 IEEE 36th International Conference on Data Engineering (ICDE). IEEE, 2020. http://dx.doi.org/10.1109/icde48307.2020.00037.
Zhu, Jun. "Probabilistic Machine Learning: Models, Algorithms and a Programming Library". In Twenty-Seventh International Joint Conference on Artificial Intelligence {IJCAI-18}. California: International Joint Conferences on Artificial Intelligence Organization, 2018. http://dx.doi.org/10.24963/ijcai.2018/823.
Saleem, Rabia, Bo Yuan, Fatih Kurugollu, and Ashiq Anjum. "Explaining probabilistic Artificial Intelligence (AI) models by discretizing Deep Neural Networks". In 2020 IEEE/ACM 13th International Conference on Utility and Cloud Computing (UCC). IEEE, 2020. http://dx.doi.org/10.1109/ucc48980.2020.00070.
Bejarano, Gissella. "PhD Forum: Deep Learning and Probabilistic Models Applied to Sequential Data". In 2018 IEEE International Conference on Smart Computing (SMARTCOMP). IEEE, 2018. http://dx.doi.org/10.1109/smartcomp.2018.00066.