Academic literature on the topic 'Markov chain Monte Carlo (MCMC)'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Markov chain Monte Carlo (MCMC).'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Markov chain Monte Carlo (MCMC)"

1

Borkar, Vivek S. "Markov Chain Monte Carlo (MCMC)." Resonance 27, no. 7 (July 2022): 1107–15. http://dx.doi.org/10.1007/s12045-022-1407-1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Roy, Vivekananda. "Convergence Diagnostics for Markov Chain Monte Carlo." Annual Review of Statistics and Its Application 7, no. 1 (March 9, 2020): 387–412. http://dx.doi.org/10.1146/annurev-statistics-031219-041300.

Full text
Abstract:
Markov chain Monte Carlo (MCMC) is one of the most useful approaches to scientific computing because of its flexible construction, ease of use, and generality. Indeed, MCMC is indispensable for performing Bayesian analysis. Two critical questions that MCMC practitioners need to address are where to start and when to stop the simulation. Although a great amount of research has gone into establishing convergence criteria and stopping rules with sound theoretical foundation, in practice, MCMC users often decide convergence by applying empirical diagnostic tools. This review article discusses the most widely used MCMC convergence diagnostic tools. Some recently proposed stopping rules with firm theoretical footing are also presented. The convergence diagnostics and stopping rules are illustrated using three detailed examples.
APA, Harvard, Vancouver, ISO, and other styles
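For readers who want something concrete to try alongside this review, the sketch below computes the widely used Gelman-Rubin potential scale reduction factor from several independent chains. It is an illustrative example only, not code from the article; the Metropolis chains and the standard normal target are placeholders.

```python
import numpy as np

def gelman_rubin(chains):
    """Potential scale reduction factor R-hat for a scalar quantity.

    chains: array of shape (m, n) -- m independent chains, n draws each.
    Values near 1 suggest the chains have mixed; values well above 1
    indicate that more simulation (or better mixing) is needed.
    """
    m, n = chains.shape
    chain_means = chains.mean(axis=1)
    chain_vars = chains.var(axis=1, ddof=1)
    W = chain_vars.mean()                      # within-chain variance
    B = n * chain_means.var(ddof=1)            # between-chain variance
    var_hat = (n - 1) / n * W + B / n          # pooled variance estimate
    return np.sqrt(var_hat / W)

# Toy example: four Metropolis chains targeting N(0, 1), started from dispersed points.
rng = np.random.default_rng(0)

def metropolis_chain(x0, n_steps, step=1.0):
    x, out = x0, []
    for _ in range(n_steps):
        prop = x + step * rng.normal()
        if np.log(rng.uniform()) < 0.5 * (x**2 - prop**2):  # log N(0,1) density ratio
            x = prop
        out.append(x)
    return np.array(out)

chains = np.stack([metropolis_chain(x0, 2000) for x0 in (-5.0, -1.0, 1.0, 5.0)])
print("R-hat:", gelman_rubin(chains[:, 1000:]))  # discard burn-in before diagnosing
```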
3

Jones, Galin L., and Qian Qin. "Markov Chain Monte Carlo in Practice." Annual Review of Statistics and Its Application 9, no. 1 (March 7, 2022): 557–78. http://dx.doi.org/10.1146/annurev-statistics-040220-090158.

Full text
Abstract:
Markov chain Monte Carlo (MCMC) is an essential set of tools for estimating features of probability distributions commonly encountered in modern applications. For MCMC simulation to produce reliable outcomes, it needs to generate observations representative of the target distribution, and it must be long enough so that the errors of Monte Carlo estimates are small. We review methods for assessing the reliability of the simulation effort, with an emphasis on those most useful in practically relevant settings. Both strengths and weaknesses of these methods are discussed. The methods are illustrated in several examples and in a detailed case study.
APA, Harvard, Vancouver, ISO, and other styles
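A recurring theme of this review is judging whether a simulation has run long enough. The following sketch, purely illustrative and using a toy AR(1) series as placeholder "chain" output, estimates the Monte Carlo standard error of an ergodic average by the batch-means method, one of the standard tools in this setting.

```python
import numpy as np

def batch_means_se(x, n_batches=30):
    """Monte Carlo standard error of the sample mean via batch means.

    Splits the chain into consecutive batches, treats the batch means as
    approximately independent, and estimates the variance of the overall mean.
    """
    n = len(x) // n_batches * n_batches        # trim to a multiple of n_batches
    batches = np.asarray(x[:n]).reshape(n_batches, -1).mean(axis=1)
    return batches.std(ddof=1) / np.sqrt(n_batches)

# Placeholder "chain": an AR(1) series with strong autocorrelation.
rng = np.random.default_rng(1)
chain = np.zeros(50_000)
for t in range(1, len(chain)):
    chain[t] = 0.9 * chain[t - 1] + rng.normal()

print("estimate:", chain.mean())
print("naive SE (ignores correlation):", chain.std(ddof=1) / np.sqrt(len(chain)))
print("batch-means SE:", batch_means_se(chain))
```

The gap between the naive and batch-means standard errors is exactly the kind of reliability issue the review addresses.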
4

Siems, Tobias. "Markov Chain Monte Carlo on finite state spaces." Mathematical Gazette 104, no. 560 (June 18, 2020): 281–87. http://dx.doi.org/10.1017/mag.2020.51.

Full text
Abstract:
We elaborate the idea behind Markov chain Monte Carlo (MCMC) methods in a mathematically coherent, yet simple and understandable way. To this end, we prove a pivotal convergence theorem for finite Markov chains and a minimal version of the Perron-Frobenius theorem. Subsequently, we briefly discuss two fundamental MCMC methods, the Gibbs and Metropolis-Hastings sampler. Only very basic knowledge about matrices, convergence of real sequences and probability theory is required.
APA, Harvard, Vancouver, ISO, and other styles
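The article above develops MCMC on finite state spaces. As an illustration only (the five-state target and the random-neighbour proposal are assumptions of this example, not taken from the article), a Metropolis sampler on a small finite state space can be written in a few lines:

```python
import numpy as np

# Unnormalised target probabilities on the states {0, 1, 2, 3, 4}.
weights = np.array([1.0, 2.0, 4.0, 2.0, 1.0])

rng = np.random.default_rng(0)

def metropolis_finite(n_steps, start=0):
    """Metropolis sampler: propose a uniformly chosen neighbour (wrapping
    around the state cycle), accept with probability min(1, w[prop] / w[current])."""
    x, samples = start, []
    for _ in range(n_steps):
        proposal = (x + rng.choice([-1, 1])) % len(weights)   # symmetric proposal
        if rng.uniform() < weights[proposal] / weights[x]:
            x = proposal
        samples.append(x)
    return np.array(samples)

draws = metropolis_finite(200_000)
empirical = np.bincount(draws, minlength=len(weights)) / len(draws)
print("empirical:", np.round(empirical, 3))
print("target:   ", np.round(weights / weights.sum(), 3))
```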
5

Chaudhary, A. K. "Bayesian Analysis of Two Parameter Complementary Exponential Power Distribution." NCC Journal 3, no. 1 (June 14, 2018): 1–23. http://dx.doi.org/10.3126/nccj.v3i1.20244.

Full text
Abstract:
In this paper, the Markov chain Monte Carlo (MCMC) method is used to estimate the parameters of the CEP distribution based on a complete sample. A procedure is developed to obtain Bayes estimates of the parameters of the CEP distribution using the MCMC simulation method in OpenBUGS, an established software package for Bayesian analysis based on MCMC methods. The MCMC methods have been shown to be easier to implement computationally, the estimates always exist and are statistically consistent, and their probability intervals are convenient to construct. R functions are developed to study the statistical properties, model validation and comparison tools of the distribution, and the output analysis of MCMC samples generated from OpenBUGS. A real data set is considered for illustration under uniform and gamma sets of priors.
APA, Harvard, Vancouver, ISO, and other styles
6

Chaudhary, Arun Kumar, and Vijay Kumar. "A Bayesian Estimation and Prediction of Gompertz Extension Distribution Using the MCMC Method." Nepal Journal of Science and Technology 19, no. 1 (July 1, 2020): 142–60. http://dx.doi.org/10.3126/njst.v19i1.29795.

Full text
Abstract:
In this paper, the Markov chain Monte Carlo (MCMC) method is used to estimate the parameters of the Gompertz extension distribution based on a complete sample. We have developed a procedure to obtain Bayes estimates of the parameters of the Gompertz extension distribution using Markov Chain Monte Carlo (MCMC) simulation method in OpenBUGS, established software for Bayesian analysis using Markov Chain Monte Carlo (MCMC) methods. We have obtained the Bayes estimates of the parameters, hazard and reliability functions, and their probability intervals are also presented. We have applied the predictive check method to discuss the issue of model compatibility. A real data set is considered for illustration under uniform and gamma priors.
APA, Harvard, Vancouver, ISO, and other styles
7

Chaudhary, A. K. "A Study of Perks-II Distribution via Bayesian Paradigm." Pravaha 24, no. 1 (June 12, 2018): 1–17. http://dx.doi.org/10.3126/pravaha.v24i1.20221.

Full text
Abstract:
In this paper, the Markov chain Monte Carlo (MCMC) method is used to estimate the parameters of the Perks-II distribution based on a complete sample. Procedures are developed to perform a full Bayesian analysis of the Perks-II distribution using the MCMC simulation method in OpenBUGS, an established software package for Bayesian analysis based on MCMC methods. We have obtained the Bayes estimates of the parameters, hazard and reliability functions, and their probability intervals are also presented. We have also discussed the issue of model compatibility for the given data set. A real data set is considered for illustration under gamma sets of priors.
APA, Harvard, Vancouver, ISO, and other styles
8

Müller, Christian, Fabian Weysser, Thomas Mrziglod, and Andreas Schuppert. "Markov-Chain Monte-Carlo methods and non-identifiabilities." Monte Carlo Methods and Applications 24, no. 3 (September 1, 2018): 203–14. http://dx.doi.org/10.1515/mcma-2018-0018.

Full text
Abstract:
We consider the problem of sampling from high-dimensional likelihood functions with large amounts of non-identifiabilities via Markov-Chain Monte-Carlo algorithms. Non-identifiabilities are problematic for commonly used proposal densities, leading to a low effective sample size. To address this problem, we introduce a regularization method using an artificial prior, which restricts non-identifiable parts of the likelihood function. This enables us to sample the posterior using common MCMC methods more efficiently. We demonstrate this with three MCMC methods on a likelihood based on a complex, high-dimensional blood coagulation model and a single series of measurements. By using the approximation of the artificial prior for the non-identifiable directions, we obtain a sample quality criterion. Unlike other sample quality criteria, it is valid even for short chain lengths. We use the criterion to compare the following three MCMC variants: the random walk Metropolis-Hastings, the adaptive Metropolis-Hastings and the Metropolis-adjusted Langevin algorithm.
APA, Harvard, Vancouver, ISO, and other styles
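The study above compares the random walk Metropolis-Hastings, adaptive Metropolis-Hastings and Metropolis-adjusted Langevin (MALA) samplers. For orientation, here is a minimal sketch of a single MALA update; the standard Gaussian target is a placeholder for illustration and has nothing to do with the blood-coagulation model of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_pi(x):            # placeholder target: standard Gaussian in d dimensions
    return -0.5 * np.dot(x, x)

def grad_log_pi(x):
    return -x

def mala_step(x, step):
    """One Metropolis-adjusted Langevin step with step size `step`."""
    def mean(z):                               # drift of the Langevin proposal
        return z + 0.5 * step * grad_log_pi(z)
    prop = mean(x) + np.sqrt(step) * rng.normal(size=x.shape)
    # log of the (asymmetric) Gaussian proposal densities q(prop | x) and q(x | prop)
    log_q_fwd = -np.sum((prop - mean(x)) ** 2) / (2 * step)
    log_q_rev = -np.sum((x - mean(prop)) ** 2) / (2 * step)
    log_alpha = log_pi(prop) - log_pi(x) + log_q_rev - log_q_fwd
    return prop if np.log(rng.uniform()) < log_alpha else x

x = np.full(10, 3.0)                           # start away from the mode
samples = []
for _ in range(5000):
    x = mala_step(x, step=0.2)
    samples.append(x.copy())
print("posterior mean estimate:", np.round(np.mean(samples[1000:], axis=0), 2))
```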
9

Shadare, A. E., M. N. O. Sadiku, and S. M. Musa. "Markov Chain Monte Carlo Solution of Poisson’s Equation in Axisymmetric Regions." Advanced Electromagnetics 8, no. 5 (December 17, 2019): 29–36. http://dx.doi.org/10.7716/aem.v8i5.1255.

Full text
Abstract:
The advent of Monte Carlo methods in the field of EM has seen the floating random walk, fixed random walk and Exodus methods deployed to solve Poisson's equation in rectangular-coordinate and axisymmetric solution regions. However, when considering large EM domains, classical Monte Carlo methods can be time-consuming because they calculate the potential one point at a time. Thus, Markov Chain Monte Carlo (MCMC) is generally preferred to other Monte Carlo methods when considering whole-field computation. In this paper, MCMC has been applied to solve Poisson's equation in homogeneous and inhomogeneous axisymmetric regions. The MCMC results are compared with the analytical and finite difference solutions.
APA, Harvard, Vancouver, ISO, and other styles

Dissertations / Theses on the topic "Markov chain Monte Carlo (MCMC)"

1

Guha, Subharup. "Benchmark estimation for Markov Chain Monte Carlo samplers." The Ohio State University, 2004. http://rave.ohiolink.edu/etdc/view?acc_num=osu1085594208.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Angelino, Elaine Lee. "Accelerating Markov chain Monte Carlo via parallel predictive prefetching." Thesis, Harvard University, 2014. http://nrs.harvard.edu/urn-3:HUL.InstRepos:13070022.

Full text
Abstract:
We present a general framework for accelerating a large class of widely used Markov chain Monte Carlo (MCMC) algorithms. This dissertation demonstrates that MCMC inference can be accelerated in a model of parallel computation that uses speculation to predict and complete computational work ahead of when it is known to be useful. By exploiting fast, iterative approximations to the target density, we can speculatively evaluate many potential future steps of the chain in parallel. In Bayesian inference problems, this approach can accelerate sampling from the target distribution, without compromising exactness, by exploiting subsets of data. It takes advantage of whatever parallel resources are available, but produces results exactly equivalent to standard serial execution. In the initial burn-in phase of chain evaluation, it achieves speedup over serial evaluation that is close to linear in the number of available cores.
Engineering and Applied Sciences
APA, Harvard, Vancouver, ISO, and other styles
3

Browne, William J. "Applying MCMC methods to multi-level models." Thesis, University of Bath, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.268210.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Durmus, Alain. "High dimensional Markov chain Monte Carlo methods : theory, methods and applications." Thesis, Université Paris-Saclay (ComUE), 2016. http://www.theses.fr/2016SACLT001/document.

Full text
Abstract:
The subject of this thesis is the analysis of Markov chain Monte Carlo (MCMC) methods and the development of new methodologies to sample from high-dimensional distributions. Our work is divided into three main topics. The first problem addressed in this manuscript is the convergence of Markov chains in Wasserstein distance. Geometric and sub-geometric convergence bounds with explicit constants are derived under appropriate conditions. These results are then applied to the study of MCMC algorithms. The first algorithm analyzed is an alternative scheme to the Metropolis Adjusted Langevin Algorithm, for which explicit geometric convergence bounds are established. The second method is the pre-conditioned Crank-Nicolson algorithm; it is shown that, under mild assumptions, the Markov chain associated with this algorithm is sub-geometrically ergodic in an appropriate Wasserstein distance. The second topic of this thesis is the study of the Unadjusted Langevin Algorithm (ULA). We are first interested in explicit convergence bounds in total variation under different kinds of assumptions on the potential associated with the target distribution. In particular, we pay attention to the dependence of the algorithm on the dimension of the state space. The case of fixed step sizes as well as the case of nonincreasing sequences of step sizes are dealt with. When the target density is strongly log-concave, explicit bounds in Wasserstein distance are established. These results are then used to derive new bounds in the total variation distance which improve on those previously derived under weaker conditions on the target density. The last part tackles new optimal scaling results for Metropolis-Hastings type algorithms. First, we extend the pioneering result on the optimal scaling of the random walk Metropolis algorithm to target densities that are differentiable in Lp mean for p ≥ 2. Then, we derive new Metropolis-Hastings type algorithms which have better optimal scaling than the MALA algorithm. Finally, the stability and convergence in total variation of these new algorithms are studied.
APA, Harvard, Vancouver, ISO, and other styles
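The second part of the thesis above concerns the Unadjusted Langevin Algorithm (ULA). A minimal sketch of the algorithm follows; the one-dimensional Gaussian target and the step size are assumptions made for illustration, not choices from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder target: N(mu, sigma^2) in one dimension, so grad log pi is explicit.
mu, sigma = 2.0, 0.5
def grad_log_pi(x):
    return -(x - mu) / sigma**2

def ula(n_steps, gamma=1e-2, x0=0.0):
    """Unadjusted Langevin Algorithm:
    x_{k+1} = x_k + gamma * grad log pi(x_k) + sqrt(2 * gamma) * N(0, 1).
    There is no accept/reject step, so the chain targets pi only up to a
    discretisation bias that shrinks with the step size gamma."""
    x, out = x0, np.empty(n_steps)
    for k in range(n_steps):
        x = x + gamma * grad_log_pi(x) + np.sqrt(2 * gamma) * rng.normal()
        out[k] = x
    return out

draws = ula(100_000)[10_000:]          # discard the transient phase
print("mean ~", draws.mean(), " sd ~", draws.std())
```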
5

Harkness, Miles Adam. "Parallel simulation, delayed rejection and reversible jump MCMC for object recognition." Thesis, University of Bristol, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.324266.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Smith, Corey James. "Exact Markov Chain Monte Carlo with Likelihood Approximations for Functional Linear Models." The Ohio State University, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=osu1531833318013379.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Walker, Neil Rawlinson. "A Bayesian approach to the job search model and its application to unemployment durations using MCMC methods." Thesis, University of Newcastle Upon Tyne, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.299053.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Jeon, Juncheol. "Deterioration model for ports in the Republic of Korea using Markov chain Monte Carlo with multiple imputation." Thesis, University of Dundee, 2019. https://discovery.dundee.ac.uk/en/studentTheses/1cc538ea-1468-4d51-bcf8-711f8b9912f9.

Full text
Abstract:
The condition of infrastructure deteriorates over time as it ages. A deterioration model predicts how and when facilities will deteriorate, and it is a crucial element of most infrastructure management systems. With a deterioration model, it becomes much easier to estimate when repairs will be carried out, how much will be needed to maintain the entire facility stock, and what maintenance costs will be incurred over the life cycle of a facility. However, the study of deterioration models for the civil infrastructure of ports is still in its infancy, and there is almost no related research in South Korea. Thus, this study aims to develop a deterioration model for the civil infrastructure of ports in South Korea. Various methods, including deterministic, stochastic, and artificial intelligence approaches, can be used to develop a deterioration model. In this research, a Markov model based on Markov chain theory, one of the stochastic methods, is used to develop the deterioration model for ports in South Korea. A Markov chain is a probabilistic process among states; that is, transitions among states follow probabilities known as transition probabilities. The key step in developing a Markov model is to find these transition probabilities, a process called calibration. In this study, the existing methods, the optimization method and Markov Chain Monte Carlo (MCMC), are reviewed, and improvements to them are presented. In addition, only a small amount of data is available, which can distort the model, so supplementary techniques are presented to overcome the small data size. To address the problems of the existing methods and the lack of data, this study finally proposes deterioration models developed by four calibration methods: optimization, optimization with bootstrap, MCMC (Markov Chain Monte Carlo), and MCMC with multiple imputation. In addition, a comparison of the four models is carried out and the best-performing model is identified. This research provides a deterioration model for ports in South Korea, suggests a more accurate calibration technique, and combines a method of supplementing insufficient data with existing calibration techniques.
APA, Harvard, Vancouver, ISO, and other styles
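The deterioration model described above rests on a transition probability matrix over condition states. The sketch below only illustrates that forward model with an invented four-state annual transition matrix; the states and probabilities are placeholders, not the calibrated values from the thesis.

```python
import numpy as np

# Assumed condition states 0 (good) .. 3 (poor) and a made-up annual
# transition matrix: each row gives P(next state | current state).
P = np.array([
    [0.85, 0.15, 0.00, 0.00],
    [0.00, 0.80, 0.20, 0.00],
    [0.00, 0.00, 0.75, 0.25],
    [0.00, 0.00, 0.00, 1.00],   # worst state is absorbing until repair
])

state_dist = np.array([1.0, 0.0, 0.0, 0.0])   # a new facility starts in state 0
for year in range(0, 31, 5):
    dist = state_dist @ np.linalg.matrix_power(P, year)
    print(f"year {year:2d}: {np.round(dist, 3)}")
```

Calibration, in this setting, means choosing the entries of P so that the predicted state distributions match inspection records; the thesis does this with optimization and MCMC variants.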
9

Fu, Jianlin. "A Markov chain Monte Carlo method for inverse stochastic modeling and uncertainty assessment." Doctoral thesis, Universitat Politècnica de València, 2008. http://hdl.handle.net/10251/1969.

Full text
Abstract:
Unlike the traditional two-stage methods, a conditional and inverse-conditional simulation approach may directly generate independent, identically distributed realizations to honor both static data and state data in one step. The Markov chain Monte Carlo (McMC) method was proved a powerful tool to perform such type of stochastic simulation. One of the main advantages of the McMC over the traditional sensitivity-based optimization methods to inverse problems is its power, flexibility and well-posedness in incorporating observation data from different sources. In this work, an improved version of the McMC method is presented to perform the stochastic simulation of reservoirs and aquifers in the framework of multi-Gaussian geostatistics. First, a blocking scheme is proposed to overcome the limitations of the classic single-component Metropolis-Hastings-type McMC. One of the main characteristics of the blocking McMC (BMcMC) scheme is that, depending on the inconsistence between the prior model and the reality, it can preserve the prior spatial structure and statistics as users specified. At the same time, it improves the mixing of the Markov chain and hence enhances the computational efficiency of the McMC. Furthermore, the exploration ability and the mixing speed of McMC are efficiently improved by coupling the multiscale proposals, i.e., the coupled multiscale McMC method. In order to make the BMcMC method capable of dealing with the high-dimensional cases, a multi-scale scheme is introduced to accelerate the computation of the likelihood which greatly improves the computational efficiency of the McMC due to the fact that most of the computational efforts are spent on the forward simulations. To this end, a flexible-grid full-tensor finite-difference simulator, which is widely compatible with the outputs from various upscaling subroutines, is developed to solve the flow equations and a constant-displacement random-walk particle-tracking method, which enhances the com
Fu, J. (2008). A Markov chain Monte Carlo method for inverse stochastic modeling and uncertainty assessment [Unpublished doctoral thesis]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/1969
APA, Harvard, Vancouver, ISO, and other styles
10

Lindahl, John, and Douglas Persson. "Data-driven test case design of automatic test cases using Markov chains and a Markov chain Monte Carlo method." Thesis, Malmö universitet, Fakulteten för teknik och samhälle (TS), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:mau:diva-43498.

Full text
Abstract:
Large and complex software that is frequently changed leads to testing challenges. It is well established that the later a fault is detected in software development, the more it costs to fix. This thesis aims to research and develop a method of generating relevant and non-redundant test cases for a regression test suite, to catch bugs as early in the development process as possible. The research was executed at Axis Communications AB with their products and systems in mind. The approach utilizes user data to dynamically generate a Markov chain model and with a Markov chain Monte Carlo method, strengthen that model. The model generates test case proposals, detects test gaps, and identifies redundant test cases based on the user data and data from a test suite. The sampling in the Markov chain Monte Carlo method can be modified to bias the model for test coverage or relevancy. The model is generated generically and can therefore be implemented in other API-driven systems. The model was designed with scalability in mind and further implementations can be made to increase the complexity and further specialize the model for individual needs.
APA, Harvard, Vancouver, ISO, and other styles
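The approach above builds a usage model from logged user events and samples candidate test cases from it. The following sketch shows the basic idea with a toy, invented event log; it is not the authors' implementation.

```python
import random
from collections import defaultdict

# Toy log of observed user sessions (invented for illustration).
sessions = [
    ["login", "search", "view", "logout"],
    ["login", "view", "add_to_cart", "checkout", "logout"],
    ["login", "search", "view", "add_to_cart", "checkout", "logout"],
]

# Estimate first-order transition counts from the log.
counts = defaultdict(lambda: defaultdict(int))
for s in sessions:
    for a, b in zip(s, s[1:]):
        counts[a][b] += 1

def sample_test_case(start="login", end="logout", max_len=20):
    """Random walk through the usage model; each walk is a candidate test case."""
    state, path = start, [start]
    while state != end and len(path) < max_len:
        nxt = counts[state]
        if not nxt:
            break
        events, weights = zip(*nxt.items())
        state = random.choices(events, weights=weights)[0]
        path.append(state)
    return path

random.seed(0)
print(sample_test_case())
```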

Books on the topic "Markov chain Monte Carlo (MCMC)"

1

Gianola, Daniel, ed. Likelihood, Bayesian and MCMC methods in quantitative genetics. New York: Springer-Verlag, 2002.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Robert, Christian P., ed. Discretization and MCMC convergence assessment. New York: Springer, 1998.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Handbook of Markov chain Monte Carlo. Boca Raton: Taylor & Francis, 2011.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

Liang, Faming, Chuanhai Liu, and Raymond J. Carroll. Advanced Markov Chain Monte Carlo Methods. Chichester, UK: John Wiley & Sons, Ltd, 2010. http://dx.doi.org/10.1002/9780470669723.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Gilks, W. R., S. Richardson, and D. J. Spiegelhalter, eds. Markov chain Monte Carlo in practice. Boca Raton, Fla: Chapman & Hall, 1998.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Gilks, W. R., S. Richardson, and D. J. Spiegelhalter, eds. Markov chain Monte Carlo in practice. London: Chapman & Hall, 1996.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

Cowles, Mary Kathryn. Possible biases induced by MCMC convergence diagnostics. Toronto: University of Toronto, Dept. of Statistics, 1997.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

Kendall, W. S., F. Liang, and J. S. Wang, eds. Markov chain Monte Carlo: Innovations and applications. Singapore: World Scientific, 2005.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Joseph, Anosh. Markov Chain Monte Carlo Methods in Quantum Field Theories. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-46044-0.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Gamerman, Dani. Markov chain Monte Carlo: Stochastic simulation for Bayesian inference. London: Chapman & Hall, 1997.

Find full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Markov chain Monte Carlo (MCMC)"

1

Robert, Christian P., and Sylvia Richardson. "Markov Chain Monte Carlo Methods." In Discretization and MCMC Convergence Assessment, 1–25. New York, NY: Springer New York, 1998. http://dx.doi.org/10.1007/978-1-4612-1716-9_1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Hanada, Masanori, and So Matsuura. "Applications of Markov Chain Monte Carlo." In MCMC from Scratch, 113–68. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-2715-7_6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Hanada, Masanori, and So Matsuura. "General Aspects of Markov Chain Monte Carlo." In MCMC from Scratch, 27–38. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-2715-7_3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Zhang, Yan. "Markov Chain Monte Carlo (MCMC) Simulations." In Encyclopedia of Systems Biology, 1176. New York, NY: Springer New York, 2013. http://dx.doi.org/10.1007/978-1-4419-9863-7_403.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Bhattacharya, Rabi, Lizhen Lin, and Victor Patrangenaru. "Markov Chain Monte Carlo (MCMC) Simulation and Bayes Theory." In Springer Texts in Statistics, 325–32. New York, NY: Springer New York, 2016. http://dx.doi.org/10.1007/978-1-4939-4032-5_14.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Walgama Wellalage, N. K., Tieling Zhang, Richard Dwight, and Khaled El-Akruti. "Bridge Deterioration Modeling by Markov Chain Monte Carlo (MCMC) Simulation Method." In Lecture Notes in Mechanical Engineering, 545–56. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-09507-3_47.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Lundén, Daniel, Gizem Çaylak, Fredrik Ronquist, and David Broman. "Automatic Alignment in Higher-Order Probabilistic Programming Languages." In Programming Languages and Systems, 535–63. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-30044-8_20.

Full text
Abstract:
Probabilistic Programming Languages (PPLs) allow users to encode statistical inference problems and automatically apply an inference algorithm to solve them. Popular inference algorithms for PPLs, such as sequential Monte Carlo (SMC) and Markov chain Monte Carlo (MCMC), are built around checkpoints—relevant events for the inference algorithm during the execution of a probabilistic program. Deciding the location of checkpoints is, in current PPLs, not done optimally. To solve this problem, we present a static analysis technique that automatically determines checkpoints in programs, relieving PPL users of this task. The analysis identifies a set of checkpoints that execute in the same order in every program run—they are aligned. We formalize alignment, prove the correctness of the analysis, and implement the analysis as part of the higher-order functional PPL Miking CorePPL. By utilizing the alignment analysis, we design two novel inference algorithm variants: aligned SMC and aligned lightweight MCMC. We show, through real-world experiments, that they significantly improve inference execution time and accuracy compared to standard PPL versions of SMC and MCMC.
APA, Harvard, Vancouver, ISO, and other styles
8

Wüthrich, Mario V., and Michael Merz. "Bayesian Methods, Regularization and Expectation-Maximization." In Springer Actuarial, 207–66. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-12409-9_6.

Full text
Abstract:
This chapter summarizes some techniques that use Bayes' theorem. These are classical Bayesian statistical models using, e.g., the Markov chain Monte Carlo (MCMC) method for model fitting. We discuss regularization of regression models such as ridge and LASSO regularization, which has a Bayesian interpretation, and we consider the Expectation-Maximization (EM) algorithm. The EM algorithm is a general purpose tool that can handle incomplete data settings. We illustrate this for different examples coming from mixture distributions, censored and truncated claims data.
APA, Harvard, Vancouver, ISO, and other styles
9

Lundén, Daniel, Johannes Borgström, and David Broman. "Correctness of Sequential Monte Carlo Inference for Probabilistic Programming Languages." In Programming Languages and Systems, 404–31. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-72019-3_15.

Full text
Abstract:
Probabilistic programming is an approach to reasoning under uncertainty by encoding inference problems as programs. In order to solve these inference problems, probabilistic programming languages (PPLs) employ different inference algorithms, such as sequential Monte Carlo (SMC), Markov chain Monte Carlo (MCMC), or variational methods. Existing research on such algorithms mainly concerns their implementation and efficiency, rather than the correctness of the algorithms themselves when applied in the context of expressive PPLs. To remedy this, we give a correctness proof for SMC methods in the context of an expressive PPL calculus, representative of popular PPLs such as WebPPL, Anglican, and Birch. Previous work has studied correctness of MCMC using an operational semantics, and correctness of SMC and MCMC in a denotational setting without term recursion. However, for SMC inference—one of the most commonly used algorithms in PPLs as of today—no formal correctness proof exists in an operational setting. In particular, an open question is whether the resample locations in a probabilistic program affect the correctness of SMC. We solve this fundamental problem, and make four novel contributions: (i) we extend an untyped PPL lambda calculus and operational semantics to include explicit resample terms, expressing synchronization points in SMC inference; (ii) we prove, for the first time, that subject to mild restrictions, any placement of the explicit resample terms is valid for a generic form of SMC inference; (iii) as a result of (ii), our calculus benefits from classic results from the SMC literature: a law of large numbers and an unbiased estimate of the model evidence; and (iv) we formalize the bootstrap particle filter for the calculus and discuss how our results can be further extended to other SMC algorithms.
APA, Harvard, Vancouver, ISO, and other styles
10

Amiri, Esmail. "Bayesian Automatic Parameter Estimation of Threshold Autoregressive (TAR) Models using Markov Chain Monte Carlo (MCMC)." In Compstat, 189–94. Heidelberg: Physica-Verlag HD, 2002. http://dx.doi.org/10.1007/978-3-642-57489-4_24.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Markov chain Monte Carlo (MCMC)"

1

Vaiciulyte, Ingrida. "Adaptive Monte-Carlo Markov chain for multivariate statistical estimation." In International Workshop of "Stochastic Programming for Implementation and Advanced Applications". The Association of Lithuanian Serials, 2012. http://dx.doi.org/10.5200/stoprog.2012.21.

Full text
Abstract:
The estimation of the multivariate skew t-distribution by the Monte-Carlo Markov Chain (MCMC) method is considered in the paper. The MCMC procedure is constructed for recurrent estimation of the skew t-distribution, following the maximum likelihood method, where the Monte-Carlo sample size is regulated to ensure convergence and to decrease the total number of Monte-Carlo trials required for estimation. Confidence intervals of the Monte-Carlo estimators are introduced because of their asymptotic normality. The termination rule is also implemented by testing statistical hypotheses on an insignificant change of estimates in two steps of the procedure. The algorithm developed has been tested by computer simulation with a test example. The test sample, following the skew t-distribution, was simulated by computer, and the parameters of the skew t-distribution were estimated with MathCAD. Next, the chi-squared criterion confirmed the hypothesis on the distribution of the statistics with respect to the underlying distribution function. Thus, computer simulation confirmed the applicability of the Monte-Carlo Markov chain approach with adaptively regulated sample size for estimation of the parameters of the skew t-distribution with acceptable accuracy.
APA, Harvard, Vancouver, ISO, and other styles
2

Zhang, Zhen, Xupeng He, Yiteng Li, Marwa AlSinan, Hyung Kwak, and Hussein Hoteit. "Parameter Inversion in Geothermal Reservoir Using Markov Chain Monte Carlo and Deep Learning." In SPE Reservoir Simulation Conference. SPE, 2023. http://dx.doi.org/10.2118/212185-ms.

Full text
Abstract:
The traditional history-matching process suffers from non-unique solutions, subsurface uncertainties, and high computational cost. This work proposes a robust history-matching workflow utilizing Bayesian Markov Chain Monte Carlo (MCMC) and a Bidirectional Long Short-Term Memory (BiLSTM) network to efficiently perform history matching under uncertainty for geothermal resource development. There are four main steps. Step 1: identify the uncertainty parameters. Step 2: build the BiLSTM to map the nonlinear relationship between the key uncertainty parameters (e.g., injection rates, reservoir temperature) and the time-series outputs (temperature of the producer); Bayesian optimization is used to automate the tuning of the hyper-parameters. Step 3: perform Bayesian MCMC to invert the uncertainty parameters, with the BiLSTM serving as the forward model to reduce the computational expense. Step 4: if the errors of the predicted response between the high-fidelity model and Bayesian MCMC are high, revisit the accuracy of the BiLSTM and the prior information on the uncertainty parameters. We demonstrate the proposed method using a 3D fractured geothermal reservoir, where cold water is injected into a geothermal reservoir and energy is extracted by producing hot water from a producer. Results show that the proposed Bayesian MCMC and BiLSTM method can successfully invert the uncertainty parameters with narrow uncertainties, as confirmed by comparing the inverted parameters with the ground truth. We then compare it with models such as PCE, Kriging, and SVR, and our method achieves the highest accuracy. We propose a Bayesian MCMC and BiLSTM-based history-matching method for uncertainty parameter inversion and demonstrate its accuracy and robustness compared with other models. This approach provides an efficient and practical history-matching method for geothermal extraction with significant uncertainties.
APA, Harvard, Vancouver, ISO, and other styles
3

Auvinen, Harri, Tuomo Raitio, Samuli Siltanen, and Paavo Alku. "Utilizing Markov chain Monte Carlo (MCMC) method for improved glottal inverse filtering." In Interspeech 2012. ISCA: ISCA, 2012. http://dx.doi.org/10.21437/interspeech.2012-450.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Emery, A. F., and E. Valenti. "Estimating Parameters of a Packed Bed by Least Squares and Markov Chain Monte Carlo." In ASME 2005 International Mechanical Engineering Congress and Exposition. ASMEDC, 2005. http://dx.doi.org/10.1115/imece2005-82086.

Full text
Abstract:
Most parameter estimation is based upon the assumption of normally distributed errors using least squares and the confidence intervals are computed from the sensitivities and the statistics of the residuals. For nonlinear problems, the assumption of a normal distribution of the parameters may not be valid. Determining the probability density distribution can be difficult, particularly when there is more than one parameter to be estimated or there is uncertainty about other parameters. An alternative approach is Bayesian inference, but the numerical computations can be expensive. Markov Chain Monte Carlo (MCMC) may alleviate some of the expense. The paper describes the application of MCMC to estimate the mass flow rate, the heat transfer coefficient, and the specific heat of a packed bed regenerator.
APA, Harvard, Vancouver, ISO, and other styles
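The paper above estimates physical parameters by MCMC in the presence of measurement uncertainty. The sketch below illustrates the generic pattern only: a random-walk Metropolis sampler for the parameters of an assumed exponential-decay model with Gaussian noise and flat priors, not the packed-bed regenerator model of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "measurements" from y = a * exp(-b * t) + noise (placeholder model).
t = np.linspace(0, 5, 40)
a_true, b_true, noise_sd = 3.0, 0.7, 0.1
y = a_true * np.exp(-b_true * t) + rng.normal(scale=noise_sd, size=t.size)

def log_posterior(theta):
    a, b = theta
    if a <= 0 or b <= 0:                       # flat prior on the positive quadrant
        return -np.inf
    resid = y - a * np.exp(-b * t)
    return -0.5 * np.sum(resid**2) / noise_sd**2

def metropolis(n_steps, theta0, step=0.05):
    theta, lp = np.array(theta0), log_posterior(theta0)
    samples = []
    for _ in range(n_steps):
        prop = theta + step * rng.normal(size=2)
        lp_prop = log_posterior(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        samples.append(theta.copy())
    return np.array(samples)

chain = metropolis(20_000, theta0=[1.0, 1.0])[5_000:]
print("posterior means (a, b):", np.round(chain.mean(axis=0), 2))
print("posterior sds   (a, b):", np.round(chain.std(axis=0), 2))
```

Unlike a least-squares fit, the posterior sample gives interval estimates directly, even when the parameter distribution is far from normal.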
5

Guzman, Rel. "Monte Carlo Methods on High Dimensional Data." In LatinX in AI at Neural Information Processing Systems Conference 2018. Journal of LatinX in AI Research, 2018. http://dx.doi.org/10.52591/lxai2018120314.

Full text
Abstract:
Markov Chain Monte Carlo (MCMC) simulation is a family of stochastic algorithms that are commonly used to approximate probability distributions by generating samples. The aim of this proposal is to deal with the problem of doing that job at large scale: because of the increasing computational demands of data that are tall or wide, a study that combines statistical and engineering expertise can be made in order to achieve hardware-accelerated MCMC inference. In this work, I attempt to advance the theory and practice of approximate MCMC methods by developing a toolbox of distributed MCMC algorithms; then a new method for dealing with large-scale problems will be proposed, or else a framework for choosing the most appropriate method will be established. Papers like [1] provide a comprehensive review of the existing literature on methods to tackle big-data problems. My idea is to tackle divide-and-conquer approaches, since they can work distributed across several machines or on Graphics Processing Units (GPUs), so I cover the theory behind these methods; then, exhaustive experimental tests will help me compare and categorize them according to their limitations on wide and tall data by considering the dataset size n, sample dimension d, and number of samples T to produce.
APA, Harvard, Vancouver, ISO, and other styles
6

ur Rehman, M. Javvad, Sarat Chandra Dass, and Vijanth Sagayan Asirvadam. "Markov chain Monte Carlo (MCMC) method for parameter estimation of nonlinear dynamical systems." In 2015 IEEE International Conference on Signal and Image Processing Applications (ICSIPA). IEEE, 2015. http://dx.doi.org/10.1109/icsipa.2015.7412154.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Hassan, Badreldin G. H., Isameldin A. Atiem, and Ping Feng. "Rainfall Frequency Analysis of Sudan by Using Bayesian Markov chain Monte Carlo (MCMC) methods." In 2013 International Conference on Information Science and Technology Applications. Paris, France: Atlantis Press, 2013. http://dx.doi.org/10.2991/icista.2013.21.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Niaki, Farbod Akhavan, Durul Ulutan, and Laine Mears. "Parameter Estimation Using Markov Chain Monte Carlo Method in Mechanistic Modeling of Tool Wear During Milling." In ASME 2015 International Manufacturing Science and Engineering Conference. American Society of Mechanical Engineers, 2015. http://dx.doi.org/10.1115/msec2015-9357.

Full text
Abstract:
Several models have been proposed to describe the relationship between cutting parameters and machining outputs such as cutting forces and tool wear. However, these models usually cannot be generalized, due to the inherent uncertainties that exist in the process. These uncertainties may originate from machining, workpiece material composition, and measurements. A stochastic approach can be utilized to compensate for the lack of certainty in machining, particularly for tool wear evolution. The Markov Chain Monte Carlo (MCMC) method is a powerful tool for addressing uncertainties in machining parameter estimation. The Hybrid Metropolis-Gibbs algorithm has been chosen in this work to estimate the unknown parameters in a mechanistic tool wear model for end milling of difficult-to-machine alloys. The results show a good potential of the Markov Chain Monte Carlo modeling in prediction of parameters in the presence of uncertainties.
APA, Harvard, Vancouver, ISO, and other styles
9

Agdas, Duzgun, Michael T. Davidson, and Ralph D. Ellis. "Efficiency Comparison of Markov Chain Monte Carlo Simulation with Subset Simulation (MCMC/ss) to Standard Monte Carlo Simulation (sMC) for Extreme Event Scenarios." In First International Symposium on Uncertainty Modeling and Analysis and Management (ICVRAM 2011); and Fifth International Symposium on Uncertainty Modeling and Anaylsis (ISUMA). Reston, VA: American Society of Civil Engineers, 2011. http://dx.doi.org/10.1061/41170(400)11.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Anggarwati, Febiana Putri, Azizah, and Trianingsih Eni Lestari. "Risk analysis of investment in stock market using mixture of mixture model and Bayesian Markov Chain Monte Carlo (MCMC)." In PROCEEDINGS OF THE II INTERNATIONAL SCIENTIFIC CONFERENCE ON ADVANCES IN SCIENCE, ENGINEERING AND DIGITAL EDUCATION: (ASEDU-II 2021). AIP Publishing, 2022. http://dx.doi.org/10.1063/5.0110465.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Markov chain Monte Carlo (MCMC)"

1

Gelfand, Alan E., and Sujit K. Sahu. On Markov Chain Monte Carlo Acceleration. Fort Belvoir, VA: Defense Technical Information Center, April 1994. http://dx.doi.org/10.21236/ada279393.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Safta, Cosmin, Mohammad Khalil, and Habib N. Najm. Transitional Markov Chain Monte Carlo Sampler in UQTk. Office of Scientific and Technical Information (OSTI), March 2020. http://dx.doi.org/10.2172/1606084.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Warnes, Gregory R. HYDRA: A Java Library for Markov Chain Monte Carlo. Fort Belvoir, VA: Defense Technical Information Center, March 2002. http://dx.doi.org/10.21236/ada459649.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Bates, Cameron Russell, and Edward Allen Mckigney. Metis: A Pure Metropolis Markov Chain Monte Carlo Bayesian Inference Library. Office of Scientific and Technical Information (OSTI), January 2018. http://dx.doi.org/10.2172/1417145.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Baltz, E. Markov Chain Monte Carlo Exploration of Minimal Supergravity with Implications for Dark Matter. Office of Scientific and Technical Information (OSTI), July 2004. http://dx.doi.org/10.2172/827306.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Sethuraman, Jayaram. Easily Verifiable Conditions for the Convergence of the Markov Chain Monte Carlo Method. Fort Belvoir, VA: Defense Technical Information Center, December 1995. http://dx.doi.org/10.21236/ada308874.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Doss, Hani. Studies in Reliability Theory and Survival Analysis and in Markov Chain Monte Carlo Methods. Fort Belvoir, VA: Defense Technical Information Center, September 1998. http://dx.doi.org/10.21236/ada367895.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Doss, Hani. Statistical Inference for Coherent Systems from Partial Information and Markov Chain Monte Carlo Methods. Fort Belvoir, VA: Defense Technical Information Center, January 1996. http://dx.doi.org/10.21236/ada305676.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Doss, Hani. Studies in Reliability Theory and Survival Analysis and in Markov Chain Monte Carlo Methods. Fort Belvoir, VA: Defense Technical Information Center, December 1998. http://dx.doi.org/10.21236/ada379998.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Knopp, Jeremy S., and Fumio Kojima. Inverse Problem for Electromagnetic Propagation in a Dielectric Medium using Markov Chain Monte Carlo Method (Preprint). Fort Belvoir, VA: Defense Technical Information Center, August 2012. http://dx.doi.org/10.21236/ada565876.

Full text
APA, Harvard, Vancouver, ISO, and other styles