Dissertations / Theses on the topic 'Variational Inference'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the top 50 dissertations / theses for your research on the topic 'Variational Inference.'
Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.
Rouillard, Louis. "Bridging Simulation-based Inference and Hierarchical Modeling : Applications in Neuroscience." Electronic Thesis or Diss., université Paris-Saclay, 2024. http://www.theses.fr/2024UPASG024.
Neuroimaging investigates the brain's architecture and function using magnetic resonance imaging (MRI). To make sense of the complex observed signal, neuroscientists posit explanatory models governed by interpretable parameters. This thesis tackles statistical inference: estimating which parameters could have yielded the signal through the model. Inference in neuroimaging is complicated by at least three hurdles: high dimensionality, large uncertainty, and the hierarchical structure of the data. We investigate variational inference (VI) as an optimization-based method suited to this regime. Specifically, we combine structured stochastic VI and normalizing flows (NFs) to design expressive yet scalable variational families. We apply these techniques to diffusion and functional MRI, on tasks including individual parcellation, microstructure inference, and directional coupling estimation. Through these applications, we underline the interplay between the forward and reverse Kullback-Leibler (KL) divergences as complementary tools for inference. We also demonstrate the ability of automatic VI (AVI) to serve as a reliable and scalable inference method for the challenges of model-driven neuroscience.
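The complementarity of the forward and reverse KL divergences mentioned in the abstract above can be illustrated even for one-dimensional Gaussians, where both divergences have closed forms. A minimal sketch (not from the thesis; the example distributions are arbitrary choices):

```python
import numpy as np

def kl_gauss(mu_p, var_p, mu_q, var_q):
    """Closed-form KL(N(mu_p, var_p) || N(mu_q, var_q)) for 1-D Gaussians."""
    return 0.5 * (np.log(var_q / var_p) + (var_p + (mu_p - mu_q) ** 2) / var_q - 1.0)

# Target p and a deliberately mismatched approximation q.
mu_p, var_p = 0.0, 1.0
mu_q, var_q = 1.0, 4.0

reverse_kl = kl_gauss(mu_q, var_q, mu_p, var_p)  # KL(q || p): the usual VI ("mode-seeking") objective
forward_kl = kl_gauss(mu_p, var_p, mu_q, var_q)  # KL(p || q): the "mass-covering" objective

print(reverse_kl, forward_kl)
```

The two numbers differ, which is the asymmetry that makes the two divergences behave as complementary inference tools: minimizing one or the other selects qualitatively different approximations.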
Calabrese, Chris M. Eng Massachusetts Institute of Technology. "Distributed inference : combining variational inference with distributed computing." Thesis, Massachusetts Institute of Technology, 2013. http://hdl.handle.net/1721.1/85407.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 95-97).
The study of inference techniques and their use for solving complicated models has taken off in recent years, but as the models we attempt to solve become more complex, there is a worry that our inference techniques will be unable to produce results. Many problems are difficult to solve using current approaches because it takes too long for our implementations to converge on useful values. While coming up with more efficient inference algorithms may be the answer, we believe that an alternative approach to solving this complicated problem involves leveraging the computation power of multiple processors or machines with existing inference algorithms. This thesis describes the design and implementation of such a system by combining a variational inference implementation (Variational Message Passing) with a high-level distributed framework (Graphlab) and demonstrates that inference is performed faster on a few large graphical models when using this system.
by Chris Calabrese.
M. Eng.
Lawrence, Neil David. "Variational inference in probabilistic models." Thesis, University of Cambridge, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.621104.
Beal, Matthew James. "Variational algorithms for approximate Bayesian inference." Thesis, University College London (University of London), 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.404387.
Wang, Pengyu. "Collapsed variational inference for computational linguistics." Thesis, University of Oxford, 2016. https://ora.ox.ac.uk/objects/uuid:13c08f60-1441-4ea5-b52f-7ffd0d7a744f.
Mamikonyan, Arsen. "Variational inference for non-stationary distributions." Thesis, Massachusetts Institute of Technology, 2017. http://hdl.handle.net/1721.1/113125.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (page 49).
In this thesis, I examine multiple variational inference algorithms, transform Kalman Variational Bayes and Stochastic Variational Inference into streaming algorithms, and try to identify whether any of them work with non-stationary distributions. I conclude that Kalman Variational Bayes performs as well as any other algorithm for stationary distributions, and tracks non-stationary distributions better than any other algorithm in question.
by Arsen Mamikonyan.
M. Eng.
Thorpe, Matthew. "Variational methods for geometric statistical inference." Thesis, University of Warwick, 2015. http://wrap.warwick.ac.uk/74241/.
Challis, E. A. L. "Variational approximate inference in latent linear models." Thesis, University College London (University of London), 2013. http://discovery.ucl.ac.uk/1414228/.
Matthews, Alexander Graeme de Garis. "Scalable Gaussian process inference using variational methods." Thesis, University of Cambridge, 2017. https://www.repository.cam.ac.uk/handle/1810/278022.
Maestrini, Luca. "On variational approximations for frequentist and Bayesian inference." Doctoral thesis, Università degli studi di Padova, 2018. http://hdl.handle.net/11577/3424936.
Variational approximations are approximate inference techniques for complex statistical models, proposed as faster, deterministic alternatives to traditional methods which, although accurate, require longer fitting times. Here we develop and assess variational tools for likelihood-based and Bayesian inference, extending recent results in the literature on variational approximations. In particular, the first part of the thesis employs a strategy based on a Gaussian variational approximation of the likelihood function of generalized linear mixed models with generic random-effects design matrices, including, for example, spline basis functions. This method consists of approximating the distribution of the random-effects vector, conditionally on the responses, with a Gaussian density. The second strand concerns a particular class of variational approximations known as mean field variational Bayes, which imposes a product of densities as a nonparametric restriction on the approximating density. Algorithms are developed for inference and model fitting with elaborate response types, adopting the variational message passing perspective. The modularity of variational message passing allows extensions to models with more complex likelihood structures and scalability to large datasets with relative ease. Explicit algorithms are also derived for models with multilevel random effects and non-normal responses, introducing simplifications aimed at increasing computational efficiency. Numerical studies and illustrations are included, with Markov chain Monte Carlo methods as the benchmark for comparison.
Houghton, Adrian James. "Variational Bayesian inference for comparison of VAR(1) models." Thesis, University of Newcastle upon Tyne, 2009. http://hdl.handle.net/10443/790.
Zhang, Ye. "Community Detection: Fundamental Limits, Methodology, and Variational Inference." Thesis, Yale University, 2018. http://pqdtopen.proquest.com/#viewpdf?dispub=10957347.
Network analysis has become one of the most active research areas over the past few years. A core problem in network analysis is community detection. In this thesis, we investigate it under Stochastic Block Model and Degree-corrected Block Model from three different perspectives: 1) the minimax rates of community detection problem, 2) rate-optimal and computationally feasible algorithms, and 3) computational and theoretical guarantees of variational inference for community detection.
Andersson, Gabriel. "Decoding Neural Signals Associated to Cytokine Activity." Thesis, KTH, Matematik (Inst.), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-291559.
The vagus nerve has been shown to play an important role in inflammatory disease. This nerve regulates the production of inflammatory proteins, such as the pro-inflammatory cytokines TNF and IL-1β. This work uses electrical recordings of the vagus nerve in mice that are injected with the two cytokines TNF and IL-1β. The aim of the work is to investigate whether it is possible to extract information about the specific cytokines from the vagus nerve recordings. To achieve this, we design a semi-supervised learning method that models the observed waveforms with a conditional probability function. The conditioning is based on an estimate of how often each individual waveform occurs, and local maxima of the conditional probability function are interpreted as candidate waveforms that may carry cytokine information. The methodology yields varying but promising results. The occurrence of several candidate waveforms increases markedly after the time of cytokine injection. Finally, difficulties in obtaining consistent results across all recordings are discussed, along with possible directions for future work in the area.
Ocone, Andrea. "Variational inference for Gaussian-jump processes with application in gene regulation." Thesis, University of Edinburgh, 2013. http://hdl.handle.net/1842/8280.
Jaakkola, Tommi S. (Tommi Sakari). "Variational methods for inference and estimation in graphical models." Thesis, Massachusetts Institute of Technology, 1997. http://hdl.handle.net/1721.1/10307.
Sontag, David Alexander. "Cutting plane algorithms for variational inference in graphical models." Thesis, Massachusetts Institute of Technology, 2007. http://hdl.handle.net/1721.1/40327.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Includes bibliographical references (leaves 65-66).
In this thesis, we give a new class of outer bounds on the marginal polytope, and propose a cutting-plane algorithm for efficiently optimizing over these constraints. When combined with a concave upper bound on the entropy, this gives a new variational inference algorithm for probabilistic inference in discrete Markov Random Fields (MRFs). Valid constraints are derived for the marginal polytope through a series of projections onto the cut polytope. Projecting onto a larger model gives an efficient separation algorithm for a large class of valid inequalities arising from each of the original projections. As a result, we obtain tighter upper bounds on the logpartition function than possible with previous variational inference algorithms. We also show empirically that our approximations of the marginals are significantly more accurate. This algorithm can also be applied to the problem of finding the Maximum a Posteriori assignment in a MRF, which corresponds to a linear program over the marginal polytope. One of the main contributions of the thesis is to bring together two seemingly different fields, polyhedral combinatorics and probabilistic inference, showing how certain results in either field can carry over to the other.
by David Alexander Sontag.
S.M.
Knollmüller, Jakob [author], and Torsten [academic supervisor] Enßlin. "Metric Gaussian variational inference / Jakob Knollmüller ; supervisor: Torsten Enßlin." München : Universitätsbibliothek der Ludwig-Maximilians-Universität, 2020. http://d-nb.info/1227839901/34.
Teng, Jing. "Variational filtering for Bayesian inference in wireless sensor networks." Troyes, 2009. http://www.theses.fr/2009TROY0019.
In this thesis, we tackle intractable Bayesian inference problems in wireless sensor networks (WSNs) by variational approximation. A general framework for variational Bayesian inference is proposed for three basic and closely related applications: single target tracking, multiple target tracking (MTT), and simultaneous sensor localization and target tracking (SLAT). The trade-off between estimation precision and energy-awareness is the primary focus for the WSN applications, leading to decentralized execution of the variational filter (VF). The contributions of the thesis consist of the following points: - A VF algorithm simultaneously updates and approximates the filtering distribution, reducing the temporal dependence to one Gaussian statistic. - A general state evolution model describes the target state, allowing discrete jumps in the target trajectory. - A binary proximity observation model quantizes an observation to a single bit, minimizing energy and bandwidth consumption. - A non-myopic cluster activation rule based on the prediction of the VF is proposed for proactive cluster management, which dramatically decreases hand-off operations between successive clusters. - A Dijkstra-like clustering algorithm for reactive cluster management yields optimal clustering. - A hybrid probabilistic data association and VF scheme is employed for MTT. - A distributed VF solution for SLAT updates and refines estimates of sensor locations and the target trajectory online.
Abeywardana, Sachinthaka. "Variational Inference in Generalised Hyperbolic and von Mises-Fisher Distributions." Thesis, The University of Sydney, 2015. http://hdl.handle.net/2123/16504.
Nissilä, M. (Mauri). "Iterative receivers for digital communications via variational inference and estimation." Doctoral thesis, University of Oulu, 2008. http://urn.fi/urn:isbn:9789514286865.
Cherief-Abdellatif, Badr-Eddine. "Contributions to the theoretical study of variational inference and robustness." Electronic Thesis or Diss., Institut polytechnique de Paris, 2020. http://www.theses.fr/2020IPPAG001.
This PhD thesis deals with variational inference and robustness. More precisely, it focuses on the statistical properties of variational approximations and the design of efficient algorithms for computing them in an online fashion, and investigates Maximum Mean Discrepancy based estimators as learning rules that are robust to model misspecification. In recent years, variational inference has been extensively studied from the computational viewpoint, but until very recently little attention had been paid in the literature to the theoretical properties of variational approximations. In this thesis, we investigate the consistency of variational approximations in various statistical models and the conditions that ensure it. In particular, we tackle the special cases of mixture models and deep neural networks. We also justify in theory the use of the ELBO maximization strategy, a model selection criterion that is widely used in the Variational Bayes community and is known to work well in practice. Moreover, Bayesian inference provides an attractive online-learning framework for analyzing sequential data, and offers generalization guarantees which hold even under model mismatch and with adversaries. Unfortunately, exact Bayesian inference is rarely feasible in practice and approximation methods are usually employed, but do such methods preserve the generalization properties of Bayesian inference? In this thesis, we show that this is indeed the case for some variational inference algorithms. We propose new online, tempered variational algorithms and derive their generalization bounds. Our theoretical result relies on the convexity of the variational objective, but we argue that it should hold more generally and present empirical evidence in support of this. Our work presents theoretical justifications in favor of online algorithms that rely on approximate Bayesian methods.
Another point addressed in this thesis is the design of a universal estimation procedure. This question is of major interest, in particular because it leads to robust estimators, a very active topic in statistics and machine learning. We tackle the problem of universal estimation using a minimum distance estimator based on the Maximum Mean Discrepancy. We show that the estimator is robust both to dependence and to the presence of outliers in the dataset. We also highlight the connections that may exist with minimum distance estimators based on the L2 distance. Finally, we provide a theoretical study of the stochastic gradient descent algorithm used to compute the estimator, and we support our findings with numerical simulations. We also propose a Bayesian version of our estimator, which we study from both theoretical and computational points of view.
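The Maximum Mean Discrepancy at the heart of the estimator above compares two samples through kernel mean embeddings. A minimal sketch of the (biased, V-statistic) squared-MMD estimate with a Gaussian kernel, not taken from the thesis (the bandwidth and the toy data are arbitrary choices):

```python
import numpy as np

def mmd2(x, y, bandwidth=1.0):
    """Biased (V-statistic) estimate of squared MMD with a Gaussian kernel."""
    def gram(a, b):
        d2 = (a[:, None] - b[None, :]) ** 2
        return np.exp(-d2 / (2.0 * bandwidth ** 2))
    # ||mean embedding of x - mean embedding of y||^2 in the RKHS
    return gram(x, x).mean() + gram(y, y).mean() - 2.0 * gram(x, y).mean()

rng = np.random.default_rng(0)
clean = rng.normal(0.0, 1.0, size=500)
clean2 = rng.normal(0.0, 1.0, size=500)   # second draw from the same distribution
shifted = rng.normal(3.0, 1.0, size=500)  # grossly contaminated/shifted sample

print(mmd2(clean, clean2), mmd2(clean, shifted))
```

Samples from the same distribution give a near-zero value while a shifted sample gives a large one; because the kernel is bounded, a few extreme outliers can move the estimate only slightly, which is the intuition behind the robustness results.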
曾達誠 and Tat-shing Tsang. "Statistical inference on the coefficient of variation." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2000. http://hub.hku.hk/bib/B31223503.
Tsang, Tat-shing. "Statistical inference on the coefficient of variation /." Hong Kong : University of Hong Kong, 2000. http://sunzi.lib.hku.hk/hkuto/record.jsp?B21903980.
Wang, Jiabin. "Variational Bayes inference based segmentation algorithms for brain PET-CT images." Thesis, The University of Sydney, 2012. https://hdl.handle.net/2123/29251.
BOLZONI, MATTIA. "Variational inference and semi-parametric methods for time-series probabilistic forecasting." Doctoral thesis, Università degli Studi di Milano-Bicocca, 2021. http://hdl.handle.net/10281/313704.
Probabilistic forecasting is a common task. The usual approach assumes a fixed structure for the outcome distribution, often called a model, that depends on unseen quantities called parameters, and uses data to infer a reasonable distribution over these latent values. The inference step is not always straightforward, because single-value estimates can lead to poor performance and overfitting, while handling a proper distribution with MCMC can be challenging. Variational Inference (VI) is emerging as a viable optimisation-based alternative that models the target posterior with instrumental variables called variational parameters. However, VI usually imposes a parametric structure on the proposed posterior. The thesis's first contribution is Hierarchical Variational Inference (HVI), a methodology that uses Neural Networks to create semi-parametric posterior approximations with the same minimum requirements as Metropolis-Hastings or Hamiltonian MCMC. The second contribution is a Python package for conducting VI on time-series models for mean-covariance estimation, using HVI and standard VI techniques combined with Neural Networks. Results on econometric and financial data show a consistent improvement using VI compared to point estimates, with lower-variance forecasts.
Ban, Yutong. "Suivi multi-locuteurs avec information audio-visuel pour la perception du robot [Multi-speaker tracking with audio-visual information for robot perception]." Thesis, Université Grenoble Alpes (ComUE), 2019. http://www.theses.fr/2019GREAM017/document.
Robot perception plays a crucial role in human-robot interaction (HRI). The perception system provides the robot with information about its surroundings and enables it to give feedback. In a conversational scenario, a group of people may chat in front of the robot and move freely. In such situations, robots are expected to understand where the people are, who is speaking, and what they are talking about. This thesis concentrates on answering the first two questions, namely speaker tracking and diarization. We use different modalities of the robot's perception system to achieve this goal. Like sight and hearing for a human being, audio and visual information are the critical cues for a robot in a conversational scenario. The advancement of computer vision and audio processing over the last decade has revolutionized robot perception abilities. This thesis makes the following contributions: we first develop a variational Bayesian framework for tracking multiple objects. The variational Bayesian framework gives closed-form, tractable solutions, which makes the tracking process efficient. The framework is first applied to visual multiple-person tracking. Birth and death processes are built jointly with the framework to deal with the varying number of people in the scene. Furthermore, we exploit the complementarity of vision and robot motor information. On the one hand, the robot's active motion can be integrated into the visual tracking system to stabilize the tracking. On the other hand, visual information can be used to perform motor servoing. Moreover, audio and visual information are then combined in the variational framework to estimate the smooth trajectories of speaking people and to infer the acoustic status of a person: speaking or silent. In addition, we apply the model to acoustic-only speaker localization and tracking, where online dereverberation techniques are applied before the tracking system.
Finally, a variant of the acoustic speaker tracking model based on the von Mises distribution is proposed, which is specifically adapted to directional data. All the proposed methods are validated on datasets appropriate to each application.
Hsu, Wei-Ning Ph D. Massachusetts Institute of Technology. "Unsupervised learning of disentangled representations for speech with neural variational inference models." Thesis, Massachusetts Institute of Technology, 2018. http://hdl.handle.net/1721.1/118059.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 121-128).
Despite recent successes in machine learning, artificial intelligence is still far from matching human intelligence in many ways. Two important aspects are transferability and amount of supervision required. Take speech recognition for example: while humans can easily adapt to a new accent without explicit supervision (i.e., ground truth transcripts for speech of a new accent), current machine learning techniques still struggle with such a scenario. We argue that an essential component of human learning is unsupervised or weakly supervised representation learning, which transforms input signals to low dimensional representations that facilitate subsequent structured learning and knowledge acquisition. In this thesis, we develop unsupervised representation learning frameworks for speech data. We start with investigating an existing variational autoencoder (VAE) model for learning latent representations, and derive novel latent space operations for speech transformation. The transformation method is applied to unsupervised domain adaptation problems, which addresses the transferability issues of supervised machine learning framework. We then extend the VAE models, and propose a novel factorized hierarchical variational autoencoder (FHVAE), which better models a generative process of sequential data, and learns not only disentangled, but also interpretable latent representations without any supervision. By leveraging the interpretability, we demonstrate that such representations can be applied to a wide range of tasks, including but not limited to: voice conversion, denoising, speaker verification, speaker invariant phonetic feature extraction, and noise invariant phonetic feature extraction. In the last part of this thesis, we examine scalability issues regarding the original FHVAE training algorithm in terms of runtime, memory, and optimization stability. 
Based on our analysis, we propose a hierarchical sampling algorithm for training, which enables training of FHVAE models on arbitrarily large datasets.
by Wei-Ning Hsu.
S.M.
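The VAE models discussed in the abstract above rest on the reparameterisation trick: a latent sample z ~ N(mu, sigma^2) is rewritten as a deterministic transform of standard noise, so gradients can flow through mu and sigma during training. A toy numpy sketch (the parameter values are arbitrary, and no encoder/decoder network is shown):

```python
import numpy as np

rng = np.random.default_rng(0)

# Reparameterise z ~ N(mu, sigma^2) as z = mu + sigma * eps, eps ~ N(0, 1).
# In a VAE, mu and log_sigma would be outputs of the encoder network.
mu, log_sigma = 1.5, -0.5
eps = rng.normal(size=10000)
z = mu + np.exp(log_sigma) * eps

# The samples recover the intended distribution, but z is now a
# differentiable function of mu and log_sigma.
print(z.mean(), z.std())
```

This is what makes the latent space amenable to the transformation operations described in the thesis: latent codes are smooth, differentiable functions of interpretable parameters.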
Steinberg, John. "A Comparative Analysis of Bayesian Nonparametric Variational Inference Algorithms for Speech Recognition." Master's thesis, Temple University Libraries, 2013. http://cdm16002.contentdm.oclc.org/cdm/ref/collection/p245801coll10/id/216605.
M.S.E.E.
Nonparametric Bayesian models have become increasingly popular in speech recognition tasks such as language and acoustic modeling due to their ability to discover underlying structure in an iterative manner. These methods do not require a priori assumptions about the structure of the data, such as the number of mixture components, and can learn this structure directly. Dirichlet process mixtures (DPMs) are a widely used nonparametric Bayesian method which can be used as priors to determine an optimal number of mixture components and their respective weights in a Gaussian mixture model (GMM). Because DPMs potentially require an infinite number of parameters, inference algorithms are needed to make posterior calculations tractable. The focus of this work is an evaluation of three of these Bayesian variational inference algorithms which have only recently become computationally viable: Accelerated Variational Dirichlet Process Mixtures (AVDPM), Collapsed Variational Stick Breaking (CVSB), and Collapsed Dirichlet Priors (CDP). To eliminate other effects on performance such as language models, a phoneme classification task is chosen to more clearly assess the viability of these algorithms for acoustic modeling. Evaluations were conducted on the CALLHOME English and Mandarin corpora, consisting of two languages that, from a human perspective, are phonologically very different. It is shown in this work that these inference algorithms yield error rates comparable to a baseline Gaussian mixture model (GMM) but with a factor of up to 20 fewer mixture components. AVDPM is shown to be the most attractive choice because it delivers the most compact models and is computationally efficient, enabling its application to big data problems.
Temple University--Theses
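Truncation-based variational algorithms for Dirichlet process mixtures, such as the Collapsed Variational Stick Breaking method evaluated above, rely on the stick-breaking construction of the DP: an infinite sequence of mixture weights is generated by repeatedly breaking off Beta-distributed fractions of a unit stick, then cut off at a finite truncation level. A small numpy sketch (the concentration parameter and truncation level are arbitrary illustrative choices):

```python
import numpy as np

def stick_breaking(alpha, truncation, rng):
    """Truncated stick-breaking construction of Dirichlet-process mixture weights."""
    betas = rng.beta(1.0, alpha, size=truncation)           # stick-break fractions
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])  # stick left before each break
    return betas * remaining

rng = np.random.default_rng(0)
w = stick_breaking(alpha=2.0, truncation=50, rng=rng)
print(w[:5], w.sum())
```

The weights decay quickly on average, which is why a modest truncation level suffices in practice and why these methods end up with far fewer effective mixture components than a fixed-size GMM.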
Xu, Zhen. "Using Social Dynamics to Make Individual Predictions: Variational Inference with Stochastic Kinetic Model." Thesis, State University of New York at Buffalo, 2017. http://pqdtopen.proquest.com/#viewpdf?dispub=10253123.
Social dynamics is concerned with the interactions of individuals and the resulting group behaviors. It models the temporal evolution of social systems via the interactions of the individuals within these systems. The availability of large-scale data in social networks and sensor networks offers an unprecedented opportunity to predict state changing events at the individual level. Examples of such events are disease infection, rumor propagation and opinion transition in elections, etc. Unlike previous research focusing on the collective effects of social systems, we want to make efficient inferences on the individual level.
Two main challenges are addressed: temporal modeling and computational complexity. First, the interaction pattern for each individual keeps changing over time, i.e., an individual interacts with different individuals at different times. Second, as the number of tracked individuals increases, the computational complexity grows exponentially with traditional sequential data analysis.
The contributions are: (i) leverage social network and sensor network data to make tractable inferences on both individual behaviors and collective effects in social dynamics; (ii) use the stochastic kinetic model to summarize dynamic interactions among individuals and simplify the state transition probabilities; (iii) propose an efficient variational inference algorithm whose complexity grows linearly with the number of tracked individuals M. Given the state-space size K of a single individual and the total number of time steps T, the complexity of the naive brute-force approach is O(K^(MT)) and the complexity of the existing exact inference approach is O(K^(2M)T). In comparison, the complexity of the proposed algorithm is O(K^2 MT). In practice, it requires several iterations to converge.
In the empirical study concerning epidemic dynamics, given wireless sensor network data collected from more than ten thousand people (M = 13,888) over three years (T = 3465), we use the proposed algorithm to track disease transmission and predict the probability of infection for each individual (K = 2) over time until convergence (I = 5). It is more efficient than state-of-the-art sampling methods, i.e., MCMC and particle filters, while achieving high accuracy.
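The scalings in the abstract above can be compared numerically. The sketch below assumes the standard complexities for factorial-HMM-style models (brute force K^(MT), exact inference K^(2M)·T, variational K^2·M·T), which is my reading of the abstract rather than a quotation from it:

```python
# Rough operation counts: K infection states per individual, T time steps.
K, T = 2, 3465  # values quoted in the epidemics study above

for M in (10, 100, 13888):
    exact = K ** (2 * M) * T        # joint state space blows up exponentially in M
    variational = K ** 2 * M * T    # proposed algorithm: linear in M
    # print the number of digits of `exact` rather than the number itself
    print(M, len(str(exact)), variational)
```

At M = 13,888 the exact count has thousands of digits while the variational count stays below 2·10^8, which is why only the linear-in-M algorithm is feasible at that scale.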
OLOBATUYI, KEHINDE IBUKUN. "A Family of Variational Algorithms for Approximate Bayesian Inference of High-Dimensional Data." Doctoral thesis, Università degli Studi di Milano-Bicocca, 2021. http://hdl.handle.net/10281/325856.
The Bayesian framework for machine learning allows the incorporation of prior knowledge into the system in a coherent manner, avoiding overfitting by seeking to approximate the exact posterior, and provides a principled basis for selecting among alternative models. Unfortunately, the computation required in the Bayesian framework is usually intractable. This thesis provides a family of Variational Bayesian (VB) frameworks which approximate these intractable computations with latent variables by minimizing the Kullback-Leibler divergence between the exact posterior and the approximate distribution. Chapter 1 presents background material on Bayesian inference and propagation algorithms. Chapter 2 discusses the family of variational Bayesian theory: it generalizes the expectation maximization (EM) algorithm for learning maximum likelihood parameters, and finally discusses the factorized approximation of Expectation Propagation. Chapters 3-5 derive and apply variants of Variational Bayes to the family of cluster weighted models (CWMs). This part investigates the background of CWMs and proposes new members of the family. First, the dimensionality of CWMs is explored by introducing t-distributed stochastic neighbor embedding (tSNE) for dimensionality reduction, which leads to CWMs based on tSNE for high-dimensional data. Afterwards, we propose a Multinomial CWM for multiclass classification and a Zero-inflated Poisson CWM for zero-inflated data. This work derives and applies the Expectation Maximization algorithm with three different maximization-step algorithms: Ordinary Least Squares (OLS), Iteratively Reweighted Least Squares (IRLS), and Stochastic Gradient Descent (SGD) to estimate the models' parameters. Finally, it examines the classification performance of the family of CWMs using eight different information criteria and varieties of the Adjusted Rand Index (ARI).
Chapter 6 proposes a variant of Expectation Propagation: EP-MCMC, EP-ADMM algorithms to the inverse models. It demonstrates EP-MCMC and EP-ADMM on complex Bayesian models for image reconstruction and compares the performance to MCMC. Chapter 7 concludes with a discussion and possible future directions for optimization algorithms.
Burchett, Woodrow. "Improving the Computational Efficiency in Bayesian Fitting of Cormack-Jolly-Seber Models with Individual, Continuous, Time-Varying Covariates." UKnowledge, 2017. http://uknowledge.uky.edu/statistics_etds/27.
Lauretig, Adam M. "Natural Language Processing, Statistical Inference, and American Foreign Policy." The Ohio State University, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=osu1562147711514566.
Carbonetto, Peter. "New probabilistic inference algorithms that harness the strengths of variational and Monte Carlo methods." Thesis, University of British Columbia, 2009. http://hdl.handle.net/2429/11990.
Burleigh, John Gordon. "Variation in the process of molecular evolution and its impact on phylogenetic inference /." free to MU campus, to others for purchase, 2002. http://wwwlib.umi.com/cr/mo/fullcit?p3052155.
Tison, Jean-Luc. "Genetic variation and inference of demographic histories in non-model species." Doctoral thesis, Stockholms universitet, Institutionen för molekylär biovetenskap, Wenner-Grens institut, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-109896.
At the time of the doctoral defense, the following papers were unpublished and had a status as follows: Paper 2: Manuscript. Paper 3: Manuscript.
Liu, Qiang. "Inference of Spot Volatility in the presence of Infinite Variation Jumps." Thesis, University of Macau, 2018. http://umaclib3.umac.mo/record=b3952482.
Wedenberg, Kim, and Alexander Sjöberg. "Online inference of topics : Implementation of the topic model Latent Dirichlet Allocation using an online variational bayes inference algorithm to sort news articles." Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-222429.
Grullon, Dylan Emanuel Centeno. "Disentangling time constant and time dependent hidden state in time series with variational Bayesian inference." Thesis, Massachusetts Institute of Technology, 2019. https://hdl.handle.net/1721.1/124572.
Thesis: M. Eng., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2019
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 85-86).
In this thesis, we design and explore a new model architecture called a Variational Bayes Recurrent Neural Network (VBRNN) for modelling time series. The VBRNN contains explicit structure to disentangle time constant and time dependent dynamics for use with compatible time series, such as those that can be modelled by differential equations with time constant parameters and time dependent state. The model consists of a Variational Bayes (VB) layer to infer time constant state, as well as a conditioned-RNN to model time dependent dynamics. The VBRNN is explored through various synthetic datasets and problems, and compared to conventional methods on these datasets. This approach demonstrates effective disentanglement, motivating future work to explore the efficacy of this model on real-world datasets.
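The two-part structure described here, a VB layer for the time-constant latent plus an RNN conditioned on a sample of it, can be sketched as a forward pass. This is an architectural illustration only, not the thesis's actual implementation: the weight names, the mean-pooled summary used by the amortized posterior, and the omission of ELBO training are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def vbrnn_forward(x, params, rng):
    """Minimal VBRNN-style forward pass (architecture sketch).

    x: (T, d_in) time series. A VB layer infers a Gaussian posterior over a
    time-constant latent z from a summary of the series; a reparameterized
    sample of z then conditions every step of the RNN that models the
    time-dependent state.
    """
    T, _ = x.shape
    # VB layer: amortized posterior q(z | x) = N(mu, diag(sigma^2))
    summary = x.mean(axis=0)
    mu = params["W_mu"] @ summary
    log_sigma = params["W_sig"] @ summary
    z = mu + np.exp(log_sigma) * rng.normal(size=mu.shape)

    # Conditioned RNN: the time-constant z enters each state update
    h = np.zeros(params["W_h"].shape[0])
    outputs = []
    for t in range(T):
        h = np.tanh(params["W_h"] @ h + params["W_x"] @ x[t] + params["W_z"] @ z)
        outputs.append(params["W_o"] @ h)
    return np.stack(outputs), (mu, log_sigma)

d_in, d_z, d_h, d_out, T = 3, 2, 8, 1, 20
params = {
    "W_mu": 0.1 * rng.normal(size=(d_z, d_in)),
    "W_sig": 0.1 * rng.normal(size=(d_z, d_in)),
    "W_h": 0.1 * rng.normal(size=(d_h, d_h)),
    "W_x": 0.1 * rng.normal(size=(d_h, d_in)),
    "W_z": 0.1 * rng.normal(size=(d_h, d_z)),
    "W_o": 0.1 * rng.normal(size=(d_out, d_h)),
}
x = rng.normal(size=(T, d_in))
y, (mu, log_sigma) = vbrnn_forward(x, params, rng)
```

In training, the posterior parameters would feed a KL penalty against the prior on z while the RNN outputs feed the reconstruction term of the ELBO.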
by Dylan Emanuel Centeno Grullon.
Wenzel, Florian. "Scalable Inference in Latent Gaussian Process Models." Doctoral thesis, Humboldt-Universität zu Berlin, 2020. http://dx.doi.org/10.18452/20926.
Latent Gaussian process (GP) models help scientists to uncover hidden structure in data, express domain knowledge and form predictions about the future. These models have been successfully applied in many domains including robotics, geology, genetics and medicine. A GP defines a distribution over functions and can be used as a flexible building block to develop expressive probabilistic models. The main computational challenge of these models is to make inference about the unobserved latent random variables, that is, computing the posterior distribution given the data. Currently, most interesting Gaussian process models have limited applicability to big data. This thesis develops a new efficient inference approach for latent GP models. Our new inference framework, which we call augmented variational inference, is based on the idea of considering an augmented version of the intractable GP model that renders the model conditionally conjugate. We show that inference in the augmented model is more efficient and, unlike in previous approaches, all updates can be computed in closed form. The ideas around our inference framework facilitate novel latent GP models that lead to new results in language modeling, genetic association studies and uncertainty quantification in classification tasks.
Marklund, Emil. "Bayesian inference in aggregated hidden Markov models." Thesis, Uppsala universitet, Institutionen för biologisk grundutbildning, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-243090.
Li, Xin. "Haplotype Inference from Pedigree Data and Population Data." Cleveland, Ohio : Case Western Reserve University, 2010. http://rave.ohiolink.edu/etdc/view?acc_num=case1259867573.
Title from PDF (viewed on 2009-12-30). Department of Electrical Engineering and Computer Science. Includes abstract. Includes bibliographical references and appendices. Available online via the OhioLINK ETD Center.
Michelen, Strofer Carlos Alejandro. "Machine Learning and Field Inversion approaches to Data-Driven Turbulence Modeling." Diss., Virginia Tech, 2021. http://hdl.handle.net/10919/103155.
Doctor of Philosophy
The Reynolds-averaged Navier-Stokes (RANS) equations are widely used to simulate fluid flows in engineering applications despite their known inaccuracy in many flows of practical interest. The uncertainty in the RANS equations is known to stem from the Reynolds stress tensor for which no universally applicable turbulence model exists. The computational cost of more accurate methods for fluid flow simulation, however, means RANS simulations will likely continue to be a major tool in engineering applications and there is still a need for improved RANS turbulence modeling. This dissertation explores two different approaches to use available experimental data to improve RANS predictions by improving the uncertain Reynolds stress tensor field. The first approach is using machine learning to learn a data-driven turbulence model from a set of training data. This model can then be applied to predict new flows in place of traditional turbulence models. To this end, this dissertation presents a novel framework for training deep neural networks using experimental measurements of velocity and pressure. When using velocity and pressure data, gradient-based training of the neural network requires the sensitivity of the RANS equations to the learned Reynolds stress. Two different methods, the continuous adjoint and ensemble approximation, are used to obtain the required sensitivity. The second approach explored in this dissertation is field inversion, whereby available data for a flow of interest is used to infer a Reynolds stress field that leads to improved RANS solutions for that same flow. Here, the field inversion is done via the ensemble Kalman inversion (EKI), a Monte Carlo Bayesian procedure, and the focus is on improving the inference by enforcing known physical constraints on the inferred Reynolds stress field. To this end, a method for enforcing boundary conditions on the inferred field is presented. 
While further development is needed, the two data-driven approaches explored and improved upon here demonstrate the potential for improved practical RANS predictions.
Lienart, Thibaut. "Inference on Markov random fields : methods and applications." Thesis, University of Oxford, 2017. http://ora.ox.ac.uk/objects/uuid:3095b14c-98fb-4bda-affc-a1fa1708f628.
Saha, Abhijoy. "A Geometric Framework for Modeling and Inference using the Nonparametric Fisher–Rao metric." The Ohio State University, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=osu1562679374833421.
Ashton, Gregory. "Timing variations in neutron stars : models, inference and their implications for gravitational waves." Thesis, University of Southampton, 2016. https://eprints.soton.ac.uk/401822/.
Tong, Zhigang. "Statistical Inference for Heavy Tailed Time Series and Vectors." Thesis, Université d'Ottawa / University of Ottawa, 2017. http://hdl.handle.net/10393/35649.
Cheema, Prasad. "Machine Learning for Inverse Structural-Dynamical Problems: From Bayesian Non-Parametrics, to Variational Inference, and Chaos Surrogates." Thesis, University of Sydney, 2020. https://hdl.handle.net/2123/24139.
Miao, Yishu. "Deep generative models for natural language processing." Thesis, University of Oxford, 2017. http://ora.ox.ac.uk/objects/uuid:e4e1f1f9-e507-4754-a0ab-0246f1e1e258.
Shringarpure, Suyash. "Statistical Methods for studying Genetic Variation in Populations." Research Showcase @ CMU, 2012. http://repository.cmu.edu/dissertations/117.
Nguyen, Trong Nghia. "Deep Learning Based Statistical Models for Business and Financial Data." Thesis, The University of Sydney, 2021. https://hdl.handle.net/2123/26944.