Journal articles on the topic 'Markov chain Monte Carlo (MCMC)'

Consult the top 50 journal articles for your research on the topic 'Markov chain Monte Carlo (MCMC).'

1

Borkar, Vivek S. "Markov Chain Monte Carlo (MCMC)." Resonance 27, no. 7 (July 2022): 1107–15. http://dx.doi.org/10.1007/s12045-022-1407-1.

2

Roy, Vivekananda. "Convergence Diagnostics for Markov Chain Monte Carlo." Annual Review of Statistics and Its Application 7, no. 1 (March 9, 2020): 387–412. http://dx.doi.org/10.1146/annurev-statistics-031219-041300.

Abstract:
Markov chain Monte Carlo (MCMC) is one of the most useful approaches to scientific computing because of its flexible construction, ease of use, and generality. Indeed, MCMC is indispensable for performing Bayesian analysis. Two critical questions that MCMC practitioners need to address are where to start and when to stop the simulation. Although a great amount of research has gone into establishing convergence criteria and stopping rules with sound theoretical foundation, in practice, MCMC users often decide convergence by applying empirical diagnostic tools. This review article discusses the most widely used MCMC convergence diagnostic tools. Some recently proposed stopping rules with firm theoretical footing are also presented. The convergence diagnostics and stopping rules are illustrated using three detailed examples.
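Among the empirical diagnostics such reviews cover, the Gelman-Rubin potential scale reduction factor (R-hat) is probably the most widely used; a minimal stdlib sketch, where the multiple-chain setup and toy data are illustrative and not taken from the article:

```python
import random
from statistics import variance, fmean

def gelman_rubin(chains):
    """Potential scale reduction factor (R-hat) over m chains of n draws.

    Values near 1 suggest the chains have mixed; values well above 1
    indicate the simulation should be run longer.
    """
    m, n = len(chains), len(chains[0])
    means = [fmean(c) for c in chains]
    B = n * variance(means)                  # between-chain variance
    W = fmean(variance(c) for c in chains)   # within-chain variance
    var_hat = (n - 1) / n * W + B / n        # pooled variance estimate
    return (var_hat / W) ** 0.5

# Toy check: independent draws from the same target should give R-hat near 1.
rng = random.Random(0)
chains = [[rng.gauss(0, 1) for _ in range(2000)] for _ in range(4)]
print(round(gelman_rubin(chains), 3))
```

In practice R-hat is computed per parameter on the MCMC output, often after splitting each chain in half to detect within-chain drift.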
3

Jones, Galin L., and Qian Qin. "Markov Chain Monte Carlo in Practice." Annual Review of Statistics and Its Application 9, no. 1 (March 7, 2022): 557–78. http://dx.doi.org/10.1146/annurev-statistics-040220-090158.

Abstract:
Markov chain Monte Carlo (MCMC) is an essential set of tools for estimating features of probability distributions commonly encountered in modern applications. For MCMC simulation to produce reliable outcomes, it needs to generate observations representative of the target distribution, and it must be long enough so that the errors of Monte Carlo estimates are small. We review methods for assessing the reliability of the simulation effort, with an emphasis on those most useful in practically relevant settings. Both strengths and weaknesses of these methods are discussed. The methods are illustrated in several examples and in a detailed case study.
5

Siems, Tobias. "Markov Chain Monte Carlo on finite state spaces." Mathematical Gazette 104, no. 560 (June 18, 2020): 281–87. http://dx.doi.org/10.1017/mag.2020.51.

Abstract:
We elaborate the idea behind Markov chain Monte Carlo (MCMC) methods in a mathematically coherent, yet simple and understandable way. To this end, we prove a pivotal convergence theorem for finite Markov chains and a minimal version of the Perron-Frobenius theorem. Subsequently, we briefly discuss two fundamental MCMC methods, the Gibbs and Metropolis-Hastings sampler. Only very basic knowledge about matrices, convergence of real sequences and probability theory is required.
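The Metropolis sampler on a finite state space that the article discusses fits in a few lines; here is a toy sketch with a made-up four-state target distribution (states and probabilities are ours, for illustration only):

```python
import random

# Target distribution on the finite state space {0, 1, 2, 3}.
target = [0.1, 0.2, 0.3, 0.4]

def metropolis_step(x):
    """One Metropolis update with a symmetric +/-1 proposal."""
    proposal = x + random.choice([-1, 1])
    if proposal < 0 or proposal >= len(target):
        return x  # proposals off the state space are rejected
    accept_prob = min(1.0, target[proposal] / target[x])
    return proposal if random.random() < accept_prob else x

random.seed(1)
x, counts = 0, [0] * len(target)
n_steps = 200_000
for _ in range(n_steps):
    x = metropolis_step(x)
    counts[x] += 1
freqs = [c / n_steps for c in counts]
print([round(f, 2) for f in freqs])  # empirical frequencies approach `target`
```

The symmetric proposal makes the acceptance ratio reduce to target[proposal] / target[x], and the convergence theorem the article proves guarantees the empirical frequencies converge to the target.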
6

Chaudhary, A. K. "Bayesian Analysis of Two Parameter Complementary Exponential Power Distribution." NCC Journal 3, no. 1 (June 14, 2018): 1–23. http://dx.doi.org/10.3126/nccj.v3i1.20244.

Abstract:
In this paper, the Markov chain Monte Carlo (MCMC) method is used to estimate the parameters of the CEP distribution based on a complete sample. A procedure is developed to obtain Bayes estimates of the parameters of the CEP distribution using the Markov Chain Monte Carlo (MCMC) simulation method in OpenBUGS, an established software package for Bayesian analysis using MCMC methods. The MCMC methods have been shown to be easier to implement computationally, the estimates always exist and are statistically consistent, and their probability intervals are convenient to construct. R functions are developed to study the statistical properties, model validation and comparison tools of the distribution, and the output analysis of MCMC samples generated from OpenBUGS. A real data set is considered for illustration under uniform and gamma sets of priors.
7

Chaudhary, Arun Kumar, and Vijay Kumar. "A Bayesian Estimation and Prediction of Gompertz Extension Distribution Using the MCMC Method." Nepal Journal of Science and Technology 19, no. 1 (July 1, 2020): 142–60. http://dx.doi.org/10.3126/njst.v19i1.29795.

Abstract:
In this paper, the Markov chain Monte Carlo (MCMC) method is used to estimate the parameters of the Gompertz extension distribution based on a complete sample. We have developed a procedure to obtain Bayes estimates of the parameters of the Gompertz extension distribution using Markov Chain Monte Carlo (MCMC) simulation method in OpenBUGS, established software for Bayesian analysis using Markov Chain Monte Carlo (MCMC) methods. We have obtained the Bayes estimates of the parameters, hazard and reliability functions, and their probability intervals are also presented. We have applied the predictive check method to discuss the issue of model compatibility. A real data set is considered for illustration under uniform and gamma priors.
8

Chaudhary, A. K. "A Study of Perks-II Distribution via Bayesian Paradigm." Pravaha 24, no. 1 (June 12, 2018): 1–17. http://dx.doi.org/10.3126/pravaha.v24i1.20221.

Abstract:
In this paper, the Markov chain Monte Carlo (MCMC) method is used to estimate the parameters of the Perks-II distribution based on a complete sample. Procedures are developed to perform a full Bayesian analysis of the Perks-II distribution using the Markov Chain Monte Carlo (MCMC) simulation method in OpenBUGS, an established software package for Bayesian analysis using MCMC methods. We have obtained the Bayes estimates of the parameters, hazard and reliability functions, and their probability intervals are also presented. We have also discussed the issue of model compatibility for the given data set. A real data set is considered for illustration under gamma sets of priors.
9

Müller, Christian, Fabian Weysser, Thomas Mrziglod, and Andreas Schuppert. "Markov-Chain Monte-Carlo methods and non-identifiabilities." Monte Carlo Methods and Applications 24, no. 3 (September 1, 2018): 203–14. http://dx.doi.org/10.1515/mcma-2018-0018.

Abstract:
We consider the problem of sampling from high-dimensional likelihood functions with large amounts of non-identifiabilities via Markov-Chain Monte-Carlo algorithms. Non-identifiabilities are problematic for commonly used proposal densities, leading to a low effective sample size. To address this problem, we introduce a regularization method using an artificial prior, which restricts non-identifiable parts of the likelihood function. This enables us to sample the posterior using common MCMC methods more efficiently. We demonstrate this with three MCMC methods on a likelihood based on a complex, high-dimensional blood coagulation model and a single series of measurements. By using the approximation of the artificial prior for the non-identifiable directions, we obtain a sample quality criterion. Unlike other sample quality criteria, it is valid even for short chain lengths. We use the criterion to compare the following three MCMC variants: the Random Walk Metropolis Hastings, the Adaptive Metropolis Hastings and the Metropolis adjusted Langevin algorithm.
10

Shadare, A. E., M. N. O. Sadiku, and S. M. Musa. "Markov Chain Monte Carlo Solution of Poisson’s Equation in Axisymmetric Regions." Advanced Electromagnetics 8, no. 5 (December 17, 2019): 29–36. http://dx.doi.org/10.7716/aem.v8i5.1255.

Abstract:
The advent of Monte Carlo methods in the field of EM has seen the floating random walk, fixed random walk and Exodus methods deployed to solve Poisson's equation in rectangular-coordinate and axisymmetric solution regions. However, when considering large EM domains, classical Monte Carlo methods can be time-consuming because they calculate the potential one point at a time. Thus, Markov Chain Monte Carlo (MCMC) is generally preferred to other Monte Carlo methods when considering whole-field computation. In this paper, MCMC has been applied to solve Poisson's equation in homogeneous and inhomogeneous axisymmetric regions. The MCMC results are compared with the analytical and finite difference solutions.
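For context, the classical point-by-point approach the abstract contrasts with MCMC can be sketched as a fixed random walk for Laplace's equation (Poisson's equation with zero source): the potential at an interior node equals the expected boundary value hit by an unbiased nearest-neighbour walk started there. The grid size and boundary values below are hypothetical choices for illustration:

```python
import random

# Fixed-random-walk estimate of the potential on a square grid with a
# hypothetical boundary condition: V = 100 on the top edge, V = 0 elsewhere.
N = 20  # grid nodes run from 0 to N in each direction

def boundary_value(i, j):
    return 100.0 if j == N else 0.0

def walk_potential(i0, j0, walks=5000):
    """Average the boundary value reached by `walks` random walks."""
    total = 0.0
    for _ in range(walks):
        i, j = i0, j0
        while 0 < i < N and 0 < j < N:  # walk until a boundary node is hit
            di, dj = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            i, j = i + di, j + dj
        total += boundary_value(i, j)
    return total / walks

random.seed(7)
# By symmetry the exact potential at the centre is 25.0; the estimate
# converges at the usual O(1/sqrt(walks)) Monte Carlo rate.
v = walk_potential(N // 2, N // 2)
print(round(v, 1))
```

This computes the potential at one node per run, which is exactly the per-point cost that motivates whole-field MCMC formulations for large domains.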
11

Finke, Axel, Arnaud Doucet, and Adam M. Johansen. "Limit theorems for sequential MCMC methods." Advances in Applied Probability 52, no. 2 (June 2020): 377–403. http://dx.doi.org/10.1017/apr.2020.9.

Abstract:
Both sequential Monte Carlo (SMC) methods (a.k.a. ‘particle filters’) and sequential Markov chain Monte Carlo (sequential MCMC) methods constitute classes of algorithms which can be used to approximate expectations with respect to (a sequence of) probability distributions and their normalising constants. While SMC methods sample particles conditionally independently at each time step, sequential MCMC methods sample particles according to a Markov chain Monte Carlo (MCMC) kernel. Introduced over twenty years ago in [6], sequential MCMC methods have attracted renewed interest recently as they empirically outperform SMC methods in some applications. We establish an $\mathbb{L}_r$-inequality (which implies a strong law of large numbers) and a central limit theorem for sequential MCMC methods and provide conditions under which errors can be controlled uniformly in time. In the context of state-space models, we also provide conditions under which sequential MCMC methods can indeed outperform standard SMC methods in terms of asymptotic variance of the corresponding Monte Carlo estimators.
12

Karandikar, Rajeeva L. "On the Markov Chain Monte Carlo (MCMC) method." Sadhana 31, no. 2 (April 2006): 81–104. http://dx.doi.org/10.1007/bf02719775.

13

Masoumi, Samira, Thomas A. Duever, and Park M. Reilly. "Sequential Markov Chain Monte Carlo (MCMC) model discrimination." Canadian Journal of Chemical Engineering 91, no. 5 (July 13, 2012): 862–69. http://dx.doi.org/10.1002/cjce.21711.

14

Qin, Liang, Philipp Höllmer, and Werner Krauth. "Direction-sweep Markov chains." Journal of Physics A: Mathematical and Theoretical 55, no. 10 (February 16, 2022): 105003. http://dx.doi.org/10.1088/1751-8121/ac508a.

Abstract:
We discuss a non-reversible, lifted Markov-chain Monte Carlo (MCMC) algorithm for particle systems in which the direction of proposed displacements is changed deterministically. This algorithm sweeps through directions analogously to the popular MCMC sweep methods for particle or spin indices. Direction-sweep MCMC can be applied to a wide range of reversible or non-reversible Markov chains, such as the Metropolis algorithm or the event-chain Monte Carlo algorithm. For a single two-dimensional tethered hard-disk dipole, we consider direction-sweep MCMC in the limit where restricted equilibrium is reached among the accessible configurations for a fixed direction before incrementing it. We show rigorously that direction-sweep MCMC leaves the stationary probability distribution unchanged and that it profoundly modifies the Markov-chain trajectory. Long excursions, with persistent rotation in one direction, alternate with long sequences of rapid zigzags resulting in persistent rotation in the opposite direction in the limit of small direction increments. The mapping to a Langevin equation then yields the exact scaling of excursions while the zigzags are described through a non-linear differential equation that is solved exactly. We show that the direction-sweep algorithm can have shorter mixing times than the algorithms with random updates of directions. We point out possible applications of direction-sweep MCMC in polymer physics and in molecular simulation.
15

Koike, Takaaki, and Marius Hofert. "Markov Chain Monte Carlo Methods for Estimating Systemic Risk Allocations." Risks 8, no. 1 (January 15, 2020): 6. http://dx.doi.org/10.3390/risks8010006.

Abstract:
In this paper, we propose a novel framework for estimating systemic risk measures and risk allocations based on Markov Chain Monte Carlo (MCMC) methods. We consider a class of allocations whose jth component can be written as some risk measure of the jth conditional marginal loss distribution given the so-called crisis event. By considering a crisis event as an intersection of linear constraints, this class of allocations covers, for example, conditional Value-at-Risk (CoVaR), conditional expected shortfall (CoES), VaR contributions, and range VaR (RVaR) contributions as special cases. For this class of allocations, analytical calculations are rarely available, and numerical computations based on Monte Carlo (MC) methods often provide inefficient estimates due to the rare-event character of the crisis events. We propose an MCMC estimator constructed from a sample path of a Markov chain whose stationary distribution is the conditional distribution given the crisis event. Efficient constructions of Markov chains, such as the Hamiltonian Monte Carlo and Gibbs sampler, are suggested and studied depending on the crisis event and the underlying loss distribution. The efficiency of the MCMC estimators is demonstrated in a series of numerical experiments.
16

Azizah, Azizah. "PEMODELAN KLAIM ASURANSI MENGGUNAKAN PENDEKATAN BAYESIAN DAN MARKOV CHAIN MONTE CARLO." Jurnal Kajian Matematika dan Aplikasinya (JKMA) 2, no. 2 (June 11, 2021): 7. http://dx.doi.org/10.17977/um055v2i22021p7-13.

Abstract:
The determination of correct predictions of claims frequency and claims severity is very important in the insurance business for determining the outstanding claims reserve that an insurance company should prepare. One approach that may be used to predict a future value is the Bayesian approach. This approach combines the sample and the prior information; this information is used to construct the posterior distribution and to determine the estimates of the parameters. However, in this approach, integrations of functions with high dimensions are often encountered. In this thesis, a Markov Chain Monte Carlo (MCMC) simulation using the Gibbs sampling algorithm is applied to solve the problem. The MCMC simulation uses the ergodic chain property of Markov chains: in an ergodic Markov chain, a stationary distribution, which is the target distribution, is obtained. The MCMC simulation is applied to a hierarchical Poisson model, and the OpenBUGS software is used to carry out the tasks. The MCMC simulation in the hierarchical Poisson model can predict the claims frequency.
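A Gibbs sampler for a hierarchical Poisson-gamma claims model of this kind can be sketched in a few lines; the conjugate gamma conditionals below are standard, but the claim counts and hyperparameters are illustrative assumptions, not values from the paper:

```python
import random

# Hierarchical Poisson-gamma model (hypothetical data and hyperparameters):
#   claims x_i ~ Poisson(lam_i),  lam_i ~ Gamma(alpha, rate=beta),
#   beta ~ Gamma(a, b), with alpha, a, b fixed.
x = [3, 1, 4, 2, 0, 5, 2, 3]  # made-up claim counts per risk group
alpha, a, b = 2.0, 1.0, 1.0
rng = random.Random(0)

def gibbs(n_iter=5000):
    n, beta = len(x), 1.0
    beta_draws = []
    for _ in range(n_iter):
        # lam_i | beta, x_i ~ Gamma(alpha + x_i, rate = beta + 1)
        lam = [rng.gammavariate(alpha + xi, 1.0 / (beta + 1.0)) for xi in x]
        # beta | lam ~ Gamma(a + n*alpha, rate = b + sum(lam))
        beta = rng.gammavariate(a + n * alpha, 1.0 / (b + sum(lam)))
        beta_draws.append(beta)
    return beta_draws

draws = gibbs()
burned = draws[1000:]  # discard burn-in
print(round(sum(burned) / len(burned), 2))  # posterior mean of the rate beta
```

Each full-conditional distribution here is available in closed form because of conjugacy, which is what makes Gibbs sampling (and its OpenBUGS implementation) attractive for this model class.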
17

SETIAWANI, PUTU AMANDA, KOMANG DHARMAWAN, and I. WAYAN SUMARJAYA. "IMPLEMENTASI METODE MARKOV CHAIN MONTE CARLO DALAM PENENTUAN HARGA KONTRAK BERJANGKA KOMODITAS." E-Jurnal Matematika 4, no. 3 (August 30, 2015): 122. http://dx.doi.org/10.24843/mtk.2015.v04.i03.p099.

Abstract:
The aim of this research is to implement the Markov Chain Monte Carlo (MCMC) simulation method to price futures contracts on cocoa commodities. The results show that MCMC is more flexible than the Standard Monte Carlo (SMC) simulation method because the MCMC method uses a hit-and-run sampler algorithm to generate proposal moves that are subsequently accepted or rejected with a probability that depends on the target distribution to be achieved. This research shows that the MCMC method is suitable for simulating the model of cocoa commodity price movements. The result of this research is a simulation of futures contract prices for the next three months and the futures contract price that must be paid when the contract expires. Pricing a futures contract using the MCMC method produces a cheaper contract price than the Standard Monte Carlo simulation.
18

Levy, Roy. "The Rise of Markov Chain Monte Carlo Estimation for Psychometric Modeling." Journal of Probability and Statistics 2009 (2009): 1–18. http://dx.doi.org/10.1155/2009/537139.

Abstract:
Markov chain Monte Carlo (MCMC) estimation strategies represent a powerful approach to estimation in psychometric models. Popular MCMC samplers and their alignment with Bayesian approaches to modeling are discussed. Key historical and current developments of MCMC are surveyed, emphasizing how MCMC allows the researcher to overcome the limitations of other estimation paradigms, facilitates the estimation of models that might otherwise be intractable, and frees the researcher from certain possible misconceptions about the models.
19

Grana, Dario, Leandro de Figueiredo, and Klaus Mosegaard. "Markov chain Monte Carlo for petrophysical inversion." GEOPHYSICS 87, no. 1 (November 12, 2021): M13–M24. http://dx.doi.org/10.1190/geo2021-0177.1.

Abstract:
Stochastic petrophysical inversion is a method used to predict reservoir properties from seismic data. Recent advances in stochastic optimization allow generating multiple realizations of rock and fluid properties conditioned on seismic data. To match the measured data and represent the uncertainty of the model variables, many realizations are generally required. Stochastic sampling and optimization of spatially correlated models are computationally demanding. Monte Carlo methods allow quantifying the uncertainty of the model variables but are impractical for high-dimensional models with spatially correlated variables. We have developed a Bayesian approach based on an efficient implementation of the Markov chain Monte Carlo (MCMC) method for the inversion of seismic data for the prediction of reservoir properties. Our Bayesian approach includes an explicit vertical correlation model in the proposal distribution. It is applied trace by trace, and the lateral continuity model is imposed by using the previously simulated values at the adjacent traces as conditioning data for simulating the initial model at the current trace. The methodology is first presented for a 1D problem to test the vertical correlation, and it is extended to 2D problems by including the lateral correlation and comparing two novel implementations based on sequential sampling. Our method is applied to synthetic data to estimate the posterior distribution of the petrophysical properties conditioned on the measured seismic data. The results are compared with an MCMC implementation without lateral correlation and demonstrate the advantage of integrating a spatial correlation model.
20

Biswas, Abhik. "Bayesian MCMC Approach to Learning About the SIR Model." International Journal for Research in Applied Science and Engineering Technology 10, no. 6 (June 30, 2022): 540–53. http://dx.doi.org/10.22214/ijraset.2022.43818.

Abstract:
This project aims to study the parameters of the deterministic SIR (Susceptible → Infected → Recovered) model of COVID-19 in a Bayesian MCMC framework. Several deterministic mathematical models are being developed every day to forecast the spread of COVID-19 correctly. Here, I have tried to model and study the parameters of the SIR infectious disease model using the Bayesian framework and Markov-Chain Monte-Carlo (MCMC) techniques. I have used Bayesian inference to predict the basic reproductive rate R_t in real time and, following this, demonstrated how the parameters of the SIR model can be estimated using Bayesian statistics and Markov-Chain Monte-Carlo methods. Keywords: COVID-19, Bayesian Inference, Dynamical Systems, SIR Model, Basic Reproductive Rate, Markov-Chain Monte-Carlo (MCMC)
21

Song, Yihan, Ali Luo, and Yongheng Zhao. "Measuring Stellar Radial Velocity using Markov Chain Monte Carlo (MCMC) Method." Proceedings of the International Astronomical Union 9, S298 (May 2013): 441. http://dx.doi.org/10.1017/s1743921313007060.

Abstract:
Stellar radial velocity is estimated by using template fitting and Markov Chain Monte Carlo (MCMC) methods. This method works on the LAMOST stellar spectra. The MCMC simulation generates a probability distribution of the RV, and the RV error can also be computed from the distribution.
22

Vargas, Juan P., Jair C. Koppe, Sebastián Pérez, and Juan P. Hurtado. "Planning Tunnel Construction Using Markov Chain Monte Carlo (MCMC)." Mathematical Problems in Engineering 2015 (2015): 1–8. http://dx.doi.org/10.1155/2015/797953.

Abstract:
Tunnels, drifts, drives, and other types of underground excavation are very common in mining as well as in the construction of roads, railways, dams, and other civil engineering projects. Planning is essential to the success of tunnel excavation, and construction time is one of the most important factors to be taken into account. This paper proposes a simulation algorithm based on a stochastic numerical method, the Markov chain Monte Carlo method, that can provide the best estimate of the opening excavation times for the classic method of drilling and blasting. Taking account of technical considerations that affect the tunnel excavation cycle, the simulation is developed through a computational algorithm. Using the Markov chain Monte Carlo method, the unit operations involved in the underground excavation cycle are identified and assigned probability distributions that, with random number input, make it possible to simulate the total excavation time. The results obtained with this method are compared with a real case of tunneling excavation. By incorporating variability in the planning, it is possible to determine with greater certainty the ranges over which the execution times of the unit operations fluctuate. In addition, the financial risks associated with planning errors can be reduced and the exploitation of resources maximized.
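The core idea, drawing each unit operation of the drill-and-blast cycle from a probability distribution and summing to get a total cycle time, can be sketched as plain Monte Carlo simulation; the operations listed and the triangular parameters below are hypothetical placeholders, not values from the paper:

```python
import random

# Each unit operation gets a triangular distribution (min, mode, max) in hours.
# These operations and numbers are illustrative stand-ins.
CYCLE_OPERATIONS = {
    "drilling":    (2.0, 2.5, 4.0),
    "charging":    (1.0, 1.5, 2.5),
    "blasting":    (0.5, 0.75, 1.0),
    "ventilation": (0.5, 1.0, 1.5),
    "mucking":     (2.0, 3.0, 5.0),
    "support":     (1.5, 2.0, 3.5),
}

def simulate_cycle(rng):
    """Total time of one excavation cycle: sum of independent operation draws."""
    return sum(rng.triangular(lo, hi, mode)
               for lo, mode, hi in CYCLE_OPERATIONS.values())

rng = random.Random(0)
samples = sorted(simulate_cycle(rng) for _ in range(20_000))
mean = sum(samples) / len(samples)
p10, p90 = samples[2_000], samples[18_000]
print(f"mean cycle {mean:.1f} h, 80% interval [{p10:.1f}, {p90:.1f}] h")
```

Repeating the simulated cycle over the planned tunnel length yields a distribution of total excavation time rather than a single deterministic estimate, which is what lets the planner quantify schedule risk.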
23

Roberts, Gareth O., and Jeffrey S. Rosenthal. "Complexity bounds for Markov chain Monte Carlo algorithms via diffusion limits." Journal of Applied Probability 53, no. 2 (June 2016): 410–20. http://dx.doi.org/10.1017/jpr.2016.9.

Abstract:
We connect known results about diffusion limits of Markov chain Monte Carlo (MCMC) algorithms to the computer science notion of algorithm complexity. Our main result states that any weak limit of a Markov process implies a corresponding complexity bound (in an appropriate metric). We then combine this result with previously known MCMC diffusion limit results to prove that, under appropriate assumptions, the random-walk Metropolis algorithm in d dimensions takes O(d) iterations to converge to stationarity, while the Metropolis-adjusted Langevin algorithm takes O(d^{1/3}) iterations to converge to stationarity.
24

Stathopoulos, Vassilios, and Mark A. Girolami. "Markov chain Monte Carlo inference for Markov jump processes via the linear noise approximation." Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 371, no. 1984 (February 13, 2013): 20110541. http://dx.doi.org/10.1098/rsta.2011.0541.

Abstract:
Bayesian analysis for Markov jump processes (MJPs) is a non-trivial and challenging problem. Although exact inference is theoretically possible, it is computationally demanding, thus its applicability is limited to a small class of problems. In this paper, we describe the application of Riemann manifold Markov chain Monte Carlo (MCMC) methods using an approximation to the likelihood of the MJP that is valid when the system modelled is near its thermodynamic limit. The proposed approach is both statistically and computationally efficient whereas the convergence rate and mixing of the chains allow for fast MCMC inference. The methodology is evaluated using numerical simulations on two problems from chemical kinetics and one from systems biology.
25

Sinharay, Sandip. "Experiences With Markov Chain Monte Carlo Convergence Assessment in Two Psychometric Examples." Journal of Educational and Behavioral Statistics 29, no. 4 (December 2004): 461–88. http://dx.doi.org/10.3102/10769986029004461.

Abstract:
There is an increasing use of Markov chain Monte Carlo (MCMC) algorithms for fitting statistical models in psychometrics, especially in situations where the traditional estimation techniques are very difficult to apply. One of the disadvantages of using an MCMC algorithm is that it is not straightforward to determine the convergence of the algorithm. Using the output of an MCMC algorithm that has not converged may lead to incorrect inferences on the problem at hand. The convergence is not one to a point, but that of the distribution of a sequence of generated values to another distribution, and hence is not easy to assess; there is no guaranteed diagnostic tool to determine convergence of an MCMC algorithm in general. This article examines the convergence of MCMC algorithms using a number of convergence diagnostics for two real data examples from psychometrics. Findings from this research have the potential to be useful to researchers using the algorithms. For both the examples, the number of iterations required (suggested by the diagnostics) to be reasonably confident that the MCMC algorithm has converged may be larger than what many practitioners consider to be safe.
26

Pooley, C. M., S. C. Bishop, A. Doeschl-Wilson, and G. Marion. "Posterior-based proposals for speeding up Markov chain Monte Carlo." Royal Society Open Science 6, no. 11 (November 2019): 190619. http://dx.doi.org/10.1098/rsos.190619.

Abstract:
Markov chain Monte Carlo (MCMC) is widely used for Bayesian inference in models of complex systems. Performance, however, is often unsatisfactory in models with many latent variables due to so-called poor mixing, necessitating the development of application-specific implementations. This paper introduces ‘posterior-based proposals' (PBPs), a new type of MCMC update applicable to a huge class of statistical models (whose conditional dependence structures are represented by directed acyclic graphs). PBPs generate large joint updates in parameter and latent variable space, while retaining good acceptance rates (typically 33%). Evaluation against other approaches (from standard Gibbs/random walk updates to state-of-the-art Hamiltonian and particle MCMC methods) was carried out for widely varying model types: an individual-based model for disease diagnostic test data, a financial stochastic volatility model, a mixed model used in statistical genetics and a population model used in ecology. While different methods worked better or worse in different scenarios, PBPs were found to be either near to the fastest or significantly faster than the next best approach (by up to a factor of 10). PBPs, therefore, represent an additional general purpose technique that can be usefully applied in a wide variety of contexts.
27

South, Leah F., Marina Riabiz, Onur Teymur, and Chris J. Oates. "Postprocessing of MCMC." Annual Review of Statistics and Its Application 9, no. 1 (March 7, 2022): 529–55. http://dx.doi.org/10.1146/annurev-statistics-040220-091727.

Abstract:
Markov chain Monte Carlo is the engine of modern Bayesian statistics, being used to approximate the posterior and derived quantities of interest. Despite this, the issue of how the output from a Markov chain is postprocessed and reported is often overlooked. Convergence diagnostics can be used to control bias via burn-in removal, but these do not account for (common) situations where a limited computational budget engenders a bias-variance trade-off. The aim of this article is to review state-of-the-art techniques for postprocessing Markov chain output. Our review covers methods based on discrepancy minimization, which directly address the bias-variance trade-off, as well as general-purpose control variate methods for approximating expected quantities of interest.
28

Tie, Zhixin, Dingkai Zhu, Shunhe Hong, and Hui Xu. "A Hierarchical Random Graph Efficient Sampling Algorithm Based on Improved MCMC Algorithm." Electronics 11, no. 15 (July 31, 2022): 2396. http://dx.doi.org/10.3390/electronics11152396.

Abstract:
A hierarchical random graph (HRG) model combined with a maximum likelihood approach and a Markov Chain Monte Carlo algorithm can not only be used to quantitatively describe the hierarchical organization of many real networks, but also can predict missing connections in partly known networks with high accuracy. However, the computational cost is very large when hierarchical random graphs are sampled by the Markov Chain Monte Carlo algorithm (MCMC), so that the hierarchical random graphs, which can describe the characteristics of network structure, cannot be found in a reasonable time range. This seriously limits the practicability of the model. In order to overcome this defect, an improved MCMC algorithm called two-state transitions MCMC (TST-MCMC) for efficiently sampling hierarchical random graphs is proposed in this paper. On the Markov chain composed of all possible hierarchical random graphs, TST-MCMC can generate two candidate state variables during state transition and introduce a competition mechanism to filter out the worse of the two candidate state variables. In addition, the detailed balance of Markov chain can be ensured by using Metropolis–Hastings rule. By using this method, not only can the convergence speed of Markov chain be improved, but the convergence interval of Markov chain can be narrowed as well. Three example networks are employed to verify the performance of the proposed algorithm. Experimental results show that our algorithm is more feasible and more effective than the compared schemes.
29

Harizahayu, Harizahayu. "PEMODELAN RANTAI MARKOV MENGGUNAKAN ALGORITMA METROPOLIS-HASTINGS." MAp (Mathematics and Applications) Journal 2, no. 2 (December 31, 2020): 11–18. http://dx.doi.org/10.15548/map.v2i2.2259.

Full text
Abstract:
This paper describes the form of the posterior distribution P(claim probability) = Beta(β|α), a simplified simulation of the algorithm's implementation, and the application of the Markov chain Monte Carlo algorithm within a Bayesian analysis framework. A Markov chain Monte Carlo algorithm is a class of algorithms for sampling from a probability distribution by constructing a Markov chain whose stationary distribution is the target distribution. The Metropolis-Hastings algorithm generates a sequence of samples from a probability distribution that is difficult to sample directly, using an accept-reject mechanism. The R software is used as a platform for exploring common algorithms and diagnostics for MCMC implementations. Almost all of the programs can be run with basic R functionality, meaning that no setup overhead is required to run the code blocks beyond a working version of R, which is freely available online for all operating systems.
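The accept-reject mechanism this abstract describes can be sketched in a few lines. The following minimal Metropolis sampler is in Python rather than the paper's R, and the Beta(2, 5) target, proposal width, and chain length are illustrative assumptions rather than details taken from the paper:

```python
import math
import random

def log_beta_pdf(p, a, b):
    # Unnormalized log-density of Beta(a, b); the normalizing constant
    # cancels in the Metropolis acceptance ratio.
    if p <= 0.0 or p >= 1.0:
        return float("-inf")
    return (a - 1.0) * math.log(p) + (b - 1.0) * math.log(1.0 - p)

def metropolis_beta(a=2.0, b=5.0, n_steps=20000, step=0.1, seed=1):
    random.seed(seed)
    p = 0.5                                        # arbitrary starting state
    samples = []
    for _ in range(n_steps):
        proposal = p + random.uniform(-step, step)  # symmetric random walk
        # Accept with probability min(1, pi(proposal) / pi(current)).
        log_alpha = log_beta_pdf(proposal, a, b) - log_beta_pdf(p, a, b)
        if log_alpha >= 0 or random.random() < math.exp(log_alpha):
            p = proposal                           # accept
        samples.append(p)                          # on rejection, the state repeats
    return samples

samples = metropolis_beta()
mean = sum(samples[5000:]) / len(samples[5000:])   # Beta(2, 5) has mean 2/7
```

Because only density ratios appear in the acceptance step, the sampler never needs the posterior's normalizing constant, which is the point of the accept-reject construction.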
30

Yuan, Ke, Mark Girolami, and Mahesan Niranjan. "Markov Chain Monte Carlo Methods for State-Space Models with Point Process Observations." Neural Computation 24, no. 6 (June 2012): 1462–86. http://dx.doi.org/10.1162/neco_a_00281.

Full text
Abstract:
This letter considers how a number of modern Markov chain Monte Carlo (MCMC) methods can be applied for parameter estimation and inference in state-space models with point process observations. We quantified the efficiencies of these MCMC methods on synthetic data, and our results suggest that the Riemannian manifold Hamiltonian Monte Carlo method offers the best performance. We further compared such a method with a previously tested variational Bayes method on two experimental data sets. Results indicate similar performance on the large data sets and superior performance on small ones. The work offers an extensive suite of MCMC algorithms evaluated on an important class of models for physiological signal analysis.
31

Jiang, Yu Hang, Tong Liu, Zhiya Lou, Jeffrey S. Rosenthal, Shanshan Shangguan, Fei Wang, and Zixuan Wu. "Markov Chain Confidence Intervals and Biases." International Journal of Statistics and Probability 11, no. 1 (December 21, 2021): 29. http://dx.doi.org/10.5539/ijsp.v11n1p29.

Full text
Abstract:
We derive explicit asymptotic confidence intervals for any Markov chain Monte Carlo (MCMC) algorithm with finite asymptotic variance, started at any initial state, without requiring a central limit theorem, reversibility, geometric ergodicity, or any bias bound. We also derive explicit non-asymptotic confidence intervals assuming bounds on the bias or first moment, or alternatively that the chain starts in stationarity. We relate those non-asymptotic bounds to properties of MCMC bias, and show that polynomial ergodicity implies certain bias bounds. We also apply our results to several numerical examples. It is our hope that these results will provide simple and useful tools for estimating errors of MCMC algorithms when CLTs are not available.
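For contrast with the paper's CLT-free intervals, the sketch below shows the classical batch-means confidence interval that practitioners commonly use when an approximate CLT is assumed to hold; the AR(1) toy chain, batch count, and all constants are illustrative assumptions, not from the paper:

```python
import math
import random

def batch_means_ci(chain, n_batches=30, z=1.96):
    # Classical batch-means 95% interval for the mean of an MCMC chain;
    # it assumes an approximate CLT holds, which the paper's estimators
    # deliberately avoid.
    m = len(chain) // n_batches
    batch_avgs = [sum(chain[i * m:(i + 1) * m]) / m for i in range(n_batches)]
    mean = sum(batch_avgs) / n_batches
    var = sum((b - mean) ** 2 for b in batch_avgs) / (n_batches - 1)
    half = z * math.sqrt(var / n_batches)
    return mean - half, mean + half

# Toy "MCMC output": an AR(1) chain with stationary mean 0.
random.seed(0)
x, chain = 0.0, []
for _ in range(30000):
    x = 0.5 * x + random.gauss(0.0, 1.0)
    chain.append(x)
lo, hi = batch_means_ci(chain)
```

Batching absorbs the chain's autocorrelation into the between-batch variance, which is why the interval is wider than a naive i.i.d. formula would suggest.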
32

Shao, Liangshan, and Yingchao Gao. "A Gas Prominence Prediction Model Based on Entropy-Weighted Gray Correlation and MCMC-ISSA-SVM." Processes 11, no. 7 (July 13, 2023): 2098. http://dx.doi.org/10.3390/pr11072098.

Full text
Abstract:
To improve the accuracy of coal and gas prominence prediction, a prediction model was proposed that combines an improved sparrow search algorithm (ISSA) and an optimized support vector machine (SVM) with the Markov chain Monte Carlo (MCMC) filling algorithm. The mean of the coal and gas prominence data after filling in missing values with the MCMC filling algorithm was 2.282, with a standard deviation of 0.193. Compared with the mean filling method (Mean), the random forest filling method (RF), and the K-nearest neighbor filling method (KNN), the MCMC filling algorithm showed the best results. The parameter indicators of the prominence data were ranked by entropy-weighted gray correlation analysis, and the prediction experiments were divided into four groups with different numbers of parameter indicators according to the entropy-weighted gray correlation. The best results were obtained in the fourth group, with a maximum relative error (REmax) of 0.500, an average relative error (MRE) of 0.042, a root mean square error (RMSE) of 0.144, and a coefficient of determination (R2) of 0.993. The best predictive parameters were the initial velocity of gas dispersion (X2), gas content (X4), K1 gas desorption (X5), and drill chip volume (X6).
To improve the sparrow search algorithm (SSA), an adaptive t-distribution variation operator was introduced to obtain the ISSA. Support vector machines based on the MCMC filling algorithm were then optimized with the ISSA, the SSA, a genetic algorithm (GA), and particle swarm optimization (PSO), yielding four prediction models for coal and gas prominence: MCMC-ISSA-SVM, MCMC-SSA-SVM, MCMC-GA-SVM, and MCMC-PSO-SVM. Comparing the prediction results of the models, the prediction accuracy of MCMC-ISSA-SVM is 98.25% with an error of 0.018; it converges the fastest, requires the fewest iterations, and attains the highest best and average fitness among the four models. All prediction results of MCMC-ISSA-SVM are significantly better than those of the other three models, indicating that the algorithm improvement is effective. The ISSA outperformed the SSA, PSO, and GA, and the MCMC-ISSA-SVM model significantly improved prediction accuracy and effectively enhanced generalization ability.
33

Lukitasari, Dewi, Adi Setiawan, and Leopoldus Ricky Sasangko. "Bayesian Survival Analysis Untuk Mengestimasi Parameter Model Weibull-Regression Pada Kasus Ketahanan Hidup Pasien Penderita Jantung Koroner." d'CARTESIAN 4, no. 1 (February 10, 2015): 26. http://dx.doi.org/10.35799/dc.4.1.2015.7531.

Full text
Abstract:
This paper discusses parameter estimation of the Weibull-Regression model for censored data in the survival of coronary heart disease patients, using a Bayesian survival analysis approach. The data used are simulated patient survival times, patient status (alive/dead), and the treatment applied, ring or bypass. A Bayesian approach is used to obtain the posterior distribution of the parameters. The Markov Chain Monte Carlo (MCMC) method is used to generate Markov chains for estimating the parameters, namely the regression coefficients (b) and the parameter r of the Weibull survival model. The estimated b and r are then used to compute the survival function of each patient under each treatment, which gives the survival probability of coronary heart disease patients. Keywords: Bayesian, Markov Chain Monte Carlo (MCMC), Weibull-Regression Model, Survival Analysis
34

Ahmadian, Yashar, Jonathan W. Pillow, and Liam Paninski. "Efficient Markov Chain Monte Carlo Methods for Decoding Neural Spike Trains." Neural Computation 23, no. 1 (January 2011): 46–96. http://dx.doi.org/10.1162/neco_a_00059.

Full text
Abstract:
Stimulus reconstruction or decoding methods provide an important tool for understanding how sensory and motor information is represented in neural activity. We discuss Bayesian decoding methods based on an encoding generalized linear model (GLM) that accurately describes how stimuli are transformed into the spike trains of a group of neurons. The form of the GLM likelihood ensures that the posterior distribution over the stimuli that caused an observed set of spike trains is log concave so long as the prior is. This allows the maximum a posteriori (MAP) stimulus estimate to be obtained using efficient optimization algorithms. Unfortunately, the MAP estimate can have a relatively large average error when the posterior is highly nongaussian. Here we compare several Markov chain Monte Carlo (MCMC) algorithms that allow for the calculation of general Bayesian estimators involving posterior expectations (conditional on model parameters). An efficient version of the hybrid Monte Carlo (HMC) algorithm was significantly superior to other MCMC methods for gaussian priors. When the prior distribution has sharp edges and corners, on the other hand, the “hit-and-run” algorithm performed better than other MCMC methods. Using these algorithms, we show that for this latter class of priors, the posterior mean estimate can have a considerably lower average error than MAP, whereas for gaussian priors, the two estimators have roughly equal efficiency. We also address the application of MCMC methods for extracting nonmarginal properties of the posterior distribution. 
For example, by using MCMC to calculate the mutual information between the stimulus and response, we verify the validity of a computationally efficient Laplace approximation to this quantity for gaussian priors in a wide range of model parameters; this makes direct model-based computation of the mutual information tractable even in the case of large observed neural populations, where methods based on binning the spike train fail. Finally, we consider the effect of uncertainty in the GLM parameters on the posterior estimators.
35

Östling, Robert, and Jörg Tiedemann. "Efficient Word Alignment with Markov Chain Monte Carlo." Prague Bulletin of Mathematical Linguistics 106, no. 1 (October 1, 2016): 125–46. http://dx.doi.org/10.1515/pralin-2016-0013.

Full text
Abstract:
We present EFMARAL, a new system for efficient and accurate word alignment using a Bayesian model with Markov Chain Monte Carlo (MCMC) inference. Through careful selection of data structures and model architecture we are able to surpass the fast_align system, commonly used for performance-critical word alignment, both in computational efficiency and alignment accuracy. Our evaluation shows that a phrase-based statistical machine translation (SMT) system produces translations of higher quality when using word alignments from EFMARAL than from fast_align, and that translation quality is on par with what is obtained using GIZA++, a tool requiring orders of magnitude more processing time. More generally we hope to convince the reader that Monte Carlo sampling, rather than being viewed as a slow method of last resort, should actually be the method of choice for the SMT practitioner and others interested in word alignment.
36

Shadare, A. E., M. N. O. Sadiku, and S. M. Musa. "Solution of Axisymmetric Inhomogeneous Problems with the Markov Chain Monte Carlo." Advanced Electromagnetics 8, no. 4 (September 7, 2019): 50–58. http://dx.doi.org/10.7716/aem.v8i4.1162.

Full text
Abstract:
With increasing complexity of EM problems, 1D and 2D axisymmetric approximations in the (ρ, z) plane are sometimes necessary to quickly solve difficult symmetric problems using limited data storage and within the shortest possible time. Inhomogeneous EM problems frequently occur in cases where two or more dielectric media, separated by an interface, exist and could pose challenges in complex EM problems. Simple, fast and efficient numerical techniques are constantly desired. This paper presents the application of simple and efficient Markov Chain Monte Carlo (MCMC) to EM inhomogeneous axisymmetric Laplace's equations. Two cases are considered based on constant and mixed boundary potentials and MCMC solutions are found to be in close agreement with the finite difference solutions.
37

de Figueiredo, Leandro Passos, Dario Grana, Mauro Roisenberg, and Bruno B. Rodrigues. "Gaussian mixture Markov chain Monte Carlo method for linear seismic inversion." GEOPHYSICS 84, no. 3 (May 1, 2019): R463—R476. http://dx.doi.org/10.1190/geo2018-0529.1.

Full text
Abstract:
We have developed a Markov chain Monte Carlo (MCMC) method for joint inversion of seismic data for the prediction of facies and elastic properties. The solution of the inverse problem is defined by the Bayesian posterior distribution of the properties of interest. The prior distribution is a Gaussian mixture model, and each component is associated to a potential configuration of the facies sequence along the seismic trace. The low frequency is incorporated by using facies-dependent depositional trend models for the prior means of the elastic properties in each facies. The posterior distribution is also a Gaussian mixture, for which the Gaussian component can be analytically computed. However, due to the high number of components of the mixture, i.e., the large number of facies configurations, the computation of the full posterior distribution is impractical. Our Gaussian mixture MCMC method allows for the calculation of the full posterior distribution by sampling the facies configurations according to the acceptance/rejection probability. The novelty of the method is the use of an MCMC framework with multimodal distributions for the description of the model properties and the facies along the entire seismic trace. Our method is tested on synthetic seismic data, applied to real seismic data, and validated using a well test.
38

Stuart, Georgia K., Susan E. Minkoff, and Felipe Pereira. "A two-stage Markov chain Monte Carlo method for seismic inversion and uncertainty quantification." GEOPHYSICS 84, no. 6 (November 1, 2019): R1003—R1020. http://dx.doi.org/10.1190/geo2018-0893.1.

Full text
Abstract:
Bayesian methods for full-waveform inversion allow quantification of uncertainty in the solution, including determination of interval estimates and posterior distributions of the model unknowns. Markov chain Monte Carlo (MCMC) methods produce posterior distributions subject to fewer assumptions, such as normality, than deterministic Bayesian methods. However, MCMC is computationally a very expensive process that requires repeated solution of the wave equation for different velocity samples. Ultimately, a large proportion of these samples (often 40%–90%) is rejected. We have evaluated a two-stage MCMC algorithm that uses a coarse-grid filter to quickly reject unacceptable velocity proposals, thereby reducing the computational expense of solving the velocity inversion problem and quantifying uncertainty. Our filter stage uses operator upscaling, which provides near-perfect speedup in parallel with essentially no communication between processes and produces data that are highly correlated with those obtained from the full fine-grid solution. Four numerical experiments demonstrate the efficiency and accuracy of the method. The two-stage MCMC algorithm produces the same results (i.e., posterior distributions and uncertainty information, such as medians and highest posterior density intervals) as the Metropolis-Hastings MCMC. Thus, no information needed for uncertainty quantification is compromised when replacing the one-stage MCMC with the more computationally efficient two-stage MCMC. In four representative experiments, the two-stage method reduces the time spent on rejected models by one-third to one-half, which is important because most of the models tried during the course of the MCMC algorithm are rejected. Furthermore, the two-stage MCMC algorithm substantially reduces the overall time per trial by as much as 40%, while increasing the acceptance rate from 9% to 90%.
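The coarse-filter idea can be illustrated with a generic two-stage (delayed-acceptance) Metropolis sketch: a cheap surrogate density screens proposals, and a corrected second stage keeps the fine-grid target exactly invariant. The Gaussian target and mis-scaled surrogate below are toy stand-ins for the wave-equation solves, not the authors' operator-upscaling scheme:

```python
import math
import random

def log_target(x):
    # "Fine-grid" log-density: expensive in full-waveform inversion
    # (a wave-equation solve per sample); here just a standard normal.
    return -0.5 * x * x

def log_surrogate(x):
    # "Coarse-grid" filter: cheap and deliberately mis-scaled.
    return -0.5 * (x / 1.1) ** 2

def delayed_acceptance(n_steps=40000, step=1.5, seed=7):
    random.seed(seed)
    x = 0.0
    samples, fine_evals = [], 0
    for _ in range(n_steps):
        y = x + random.uniform(-step, step)
        # Stage 1: cheap screen; most poor proposals die here without
        # ever touching the expensive model.
        log_a1 = log_surrogate(y) - log_surrogate(x)
        if log_a1 >= 0 or random.random() < math.exp(log_a1):
            # Stage 2: expensive evaluation with a correction factor
            # that keeps the fine-grid target exactly invariant.
            fine_evals += 1
            log_a2 = (log_target(y) - log_target(x)) - log_a1
            if log_a2 >= 0 or random.random() < math.exp(log_a2):
                x = y
        samples.append(x)
    return samples, fine_evals

samples, fine_evals = delayed_acceptance()
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

The stage-2 ratio multiplies the true density ratio by the inverse of the surrogate ratio, which is what makes the two-stage chain sample the same posterior as a one-stage Metropolis-Hastings chain while spending far fewer fine-grid evaluations on doomed proposals.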
39

Che, X., and S. Xu. "Bayesian data analysis for agricultural experiments." Canadian Journal of Plant Science 90, no. 5 (September 1, 2010): 575–603. http://dx.doi.org/10.4141/cjps10004.

Full text
Abstract:
Data collected in agricultural experiments can be analyzed in many different ways using different models. The most commonly used models are the linear model and the generalized linear model. The maximum likelihood method is often used for data analysis. However, this method may not be able to handle complicated models, especially multiple level hierarchical models. The Bayesian method partitions complicated models into simple components, each of which may be formulated analytically. Therefore, the Bayesian method is capable of handling very complicated models. The Bayesian method itself may not be more complicated than the maximum likelihood method, but the analysis is time consuming, because numerical integration involved in Bayesian analysis is almost exclusively accomplished based on Monte Carlo simulations, the so-called Markov Chain Monte Carlo (MCMC) algorithm. Although the MCMC algorithm is intuitive and straightforward to statisticians, it may not be that simple to agricultural scientists, whose main purpose is to implement the method and interpret the results. In this review, we provide the general concept of Bayesian analysis and the MCMC algorithm in a way that can be understood by non-statisticians. We also demonstrate the implementation of the MCMC algorithm using professional software packages such as the MCMC procedure in SAS software. Three datasets from agricultural experiments were analyzed to demonstrate the MCMC algorithm. Key words: Bayesian method, Generalized linear model, Markov Chain Monte Carlo, SAS, WinBUGS
40

Acquah, Henry De-Graft. "Bayesian Logistic Regression Modelling via Markov Chain Monte Carlo Algorithm." Journal of Social and Development Sciences 4, no. 4 (April 30, 2013): 193–97. http://dx.doi.org/10.22610/jsds.v4i4.751.

Full text
Abstract:
This paper introduces Bayesian analysis and demonstrates its application to parameter estimation of the logistic regression via the Markov Chain Monte Carlo (MCMC) algorithm. The Bayesian logistic regression estimation is compared with the classical logistic regression. Both the classical logistic regression and the Bayesian logistic regression suggest that higher per capita income is associated with free trade of countries. The results also show a reduction of standard errors associated with the coefficients obtained from the Bayesian analysis, thus bringing greater stability to the coefficients. It is concluded that the Bayesian Markov Chain Monte Carlo algorithm offers an alternative framework for estimating the logistic regression model.
41

Atchadé, Yves, and Yizao Wang. "On the convergence rates of some adaptive Markov chain Monte Carlo algorithms." Journal of Applied Probability 52, no. 3 (September 2015): 811–25. http://dx.doi.org/10.1239/jap/1445543848.

Full text
Abstract:
In this paper we study the mixing time of certain adaptive Markov chain Monte Carlo (MCMC) algorithms. Under some regularity conditions, we show that the convergence rate of importance resampling MCMC algorithms, measured in terms of the total variation distance, is O(n-1). By means of an example, we establish that, in general, this algorithm does not converge at a faster rate. We also study the interacting tempering algorithm, a simplified version of the equi-energy sampler, and establish that its mixing time is of order O(n-1/2).
42

Atchadé, Yves, and Yizao Wang. "On the convergence rates of some adaptive Markov chain Monte Carlo algorithms." Journal of Applied Probability 52, no. 03 (September 2015): 811–25. http://dx.doi.org/10.1017/s0021900200113452.

Full text
Abstract:
In this paper we study the mixing time of certain adaptive Markov chain Monte Carlo (MCMC) algorithms. Under some regularity conditions, we show that the convergence rate of importance resampling MCMC algorithms, measured in terms of the total variation distance, is O(n-1). By means of an example, we establish that, in general, this algorithm does not converge at a faster rate. We also study the interacting tempering algorithm, a simplified version of the equi-energy sampler, and establish that its mixing time is of order O(n-1/2).
43

van den Berg, Stéphanie M., Leo Beem, and Dorret I. Boomsma. "Fitting Genetic Models Using Markov Chain Monte Carlo Algorithms With BUGS." Twin Research and Human Genetics 9, no. 3 (June 1, 2006): 334–42. http://dx.doi.org/10.1375/twin.9.3.334.

Full text
Abstract:
Maximum likelihood estimation techniques are widely used in twin and family studies, but soon reach computational boundaries when applied to highly complex models (e.g., models including gene-by-environment interaction and gene–environment correlation, item response theory measurement models, repeated measures, longitudinal structures, extended pedigrees). Markov Chain Monte Carlo (MCMC) algorithms are very well suited to fit complex models with hierarchically structured data. This article introduces the key concepts of Bayesian inference and MCMC parameter estimation and provides a number of scripts describing relatively simple models to be estimated by the freely obtainable BUGS software. In addition, inference using BUGS is illustrated using a data set on follicle-stimulating hormone and luteinizing hormone levels with repeated measures. The examples provided can serve as stepping stones for more complicated models, tailored to the specific needs of the individual researcher.
44

Liang, Faming, and Ick-Hoon Jin. "A Monte Carlo Metropolis-Hastings Algorithm for Sampling from Distributions with Intractable Normalizing Constants." Neural Computation 25, no. 8 (August 2013): 2199–234. http://dx.doi.org/10.1162/neco_a_00466.

Full text
Abstract:
Simulating from distributions with intractable normalizing constants has been a long-standing problem in machine learning. In this letter, we propose a new algorithm, the Monte Carlo Metropolis-Hastings (MCMH) algorithm, for tackling this problem. The MCMH algorithm is a Monte Carlo version of the Metropolis-Hastings algorithm. It replaces the unknown normalizing constant ratio by a Monte Carlo estimate in simulations, while still converging, as shown in the letter, to the desired target distribution under mild conditions. The MCMH algorithm is illustrated with spatial autologistic models and exponential random graph models. Unlike other auxiliary variable Markov chain Monte Carlo (MCMC) algorithms, such as the Møller and exchange algorithms, the MCMH algorithm avoids the requirement for perfect sampling, and thus can be applied to many statistical models for which perfect sampling is not available or very expensive. The MCMH algorithm can also be applied to Bayesian inference for random effect models and missing data problems that involve simulations from a distribution with intractable integrals.
45

Bumbaca, Federico (Rico), Sanjog Misra, and Peter E. Rossi. "Scalable Target Marketing: Distributed Markov Chain Monte Carlo for Bayesian Hierarchical Models." Journal of Marketing Research 57, no. 6 (October 1, 2020): 999–1018. http://dx.doi.org/10.1177/0022243720952410.

Full text
Abstract:
Many problems in marketing and economics require firms to make targeted consumer-specific decisions, but current estimation methods are not designed to scale to the size of modern data sets. In this article, the authors propose a new algorithm to close that gap. They develop a distributed Markov chain Monte Carlo (MCMC) algorithm for estimating Bayesian hierarchical models when the number of consumers is very large and the objects of interest are the consumer-level parameters. The two-stage and embarrassingly parallel algorithm is asymptotically unbiased in the number of consumers, retains the flexibility of a standard MCMC algorithm, and is easy to implement. The authors show that the distributed MCMC algorithm is faster and more efficient than a single-machine algorithm by at least an order of magnitude. They illustrate the approach with simulations with up to 100 million consumers, and with data on 1,088,310 donors to a charitable organization. The algorithm enables an increase of between $1.6 million and $4.6 million in additional donations when applied to a large modern-size data set compared with a typical-size data set.
46

Zhao, Di, and Haiwu He. "DSMC: Fast direct simulation Monte Carlo solver for the Boltzmann equation by Multi-Chain Markov Chain and multicore programming." International Journal of Modeling, Simulation, and Scientific Computing 07, no. 02 (June 2016): 1650009. http://dx.doi.org/10.1142/s1793962316500094.

Full text
Abstract:
Direct Simulation Monte Carlo (DSMC) solves the Boltzmann equation with large Knudsen number. The Boltzmann equation generally consists of three terms: the force term, the diffusion term and the collision term. While the first two terms of the Boltzmann equation can be discretized by numerical methods such as the finite volume method, the third term can be approximated by DSMC, and DSMC simulates the physical behaviors of gas molecules. However, because of the low sampling efficiency of Monte Carlo simulation in DSMC, this part usually occupies a large portion of the computational cost of solving the Boltzmann equation. In this paper, by Markov Chain Monte Carlo (MCMC) and multicore programming, we develop Direct Simulation Multi-Chain Markov Chain Monte Carlo (DSMC3): a fast solver to calculate the numerical solution of the Boltzmann equation. Computational results show that DSMC3 is significantly faster than the conventional DSMC method.
47

Chaudhary, Arun Kumar, and Vijay Kumar. "A Bayesian Analysis of Perks Distribution via Markov Chain Monte Carlo Simulation." Nepal Journal of Science and Technology 14, no. 1 (October 14, 2013): 153–66. http://dx.doi.org/10.3126/njst.v14i1.8936.

Full text
Abstract:
In this paper the Markov chain Monte Carlo (MCMC) method is used to estimate the parameters of Perks distribution based on a complete sample. The procedures are developed to perform full Bayesian analysis of the Perks distributions using MCMC simulation method in OpenBUGS. We obtained the Bayes estimates of the parameters, hazard and reliability functions, and their probability intervals are also presented. We also discussed the issue of model compatibility for the given data set. A real data set is considered for illustration under gamma sets of priors.
48

Izzatullah, Muhammad, Tristan van Leeuwen, and Daniel Peter. "Bayesian seismic inversion: a fast sampling Langevin dynamics Markov chain Monte Carlo method." Geophysical Journal International 227, no. 3 (July 22, 2021): 1523–53. http://dx.doi.org/10.1093/gji/ggab287.

Full text
Abstract:
SUMMARY In this study, we aim to solve the seismic inversion in the Bayesian framework by generating samples from the posterior distribution. This distribution incorporates the uncertainties in the seismic data, forward model, and prior information about the subsurface model parameters; thus, we obtain more information through sampling than through a point estimate (e.g. maximum a posteriori method). Based on the numerical cost of solving the forward problem and the dimensions of the subsurface model parameters and observed data, sampling with Markov chain Monte Carlo (MCMC) algorithms can be prohibitively expensive. Herein, we consider the promising Langevin dynamics MCMC algorithm. However, this algorithm has two central challenges: (1) the step size requires prior tuning to achieve optimal performance and (2) the Metropolis–Hastings acceptance step is computationally demanding. We approach these challenges by proposing an adaptive step-size rule and considering the suppression of the Metropolis–Hastings acceptance step. We highlight the proposed method’s potential through several numerical examples and rigorously validate it via qualitative and quantitative evaluation of the sample quality based on the kernelized Stein discrepancy (KSD) and other MCMC diagnostics such as trace and autocorrelation function plots. We conclude that, by suppressing the Metropolis–Hastings step, the proposed method provides fast sampling at efficient computational costs for large-scale seismic Bayesian inference; however, this inflates the second statistical moment (variance) due to asymptotic bias. Nevertheless, the proposed method reliably recovers important aspects of the posterior, including means, variances, skewness and 1-D and 2-D marginals. With larger computational budget, exact MCMC methods (i.e. with a Metropolis–Hastings step) should be favoured. 
The results thus obtained can be considered a feasibility study for promoting the approximate Langevin dynamics MCMC method for Bayesian seismic inversion on limited computational resources.
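Suppressing the Metropolis–Hastings step leaves the unadjusted Langevin iteration, which can be sketched on a toy 1-D Gaussian posterior. The fixed step size, target, and chain length below are illustrative assumptions (the paper proposes an adaptive step-size rule, and in seismic inversion the gradient would come from adjoint wave-equation solves):

```python
import math
import random

def grad_log_post(x):
    # Score of a standard normal toy "posterior"; in Bayesian seismic
    # inversion this gradient would come from an adjoint solve.
    return -x

def ula(n_steps=50000, eps=0.1, seed=3):
    # Unadjusted Langevin dynamics: an Euler discretization of the
    # Langevin SDE with the Metropolis-Hastings correction suppressed,
    # trading a small asymptotic bias for cheap iterations.
    random.seed(seed)
    x, samples = 0.0, []
    for _ in range(n_steps):
        x = x + 0.5 * eps * grad_log_post(x) + math.sqrt(eps) * random.gauss(0.0, 1.0)
        samples.append(x)
    return samples

samples = ula()
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

For this linear-Gaussian toy the discretized chain is exactly tractable: its stationary variance is 1/(1 − ε/4) ≈ 1.026 rather than 1, a small instance of the second-moment inflation the abstract reports for the uncorrected method.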
49

Kitchen, James L., Jonathan D. Moore, Sarah A. Palmer, and Robin G. Allaby. "MCMC-ODPR: Primer design optimization using Markov Chain Monte Carlo sampling." BMC Bioinformatics 13, no. 1 (2012): 287. http://dx.doi.org/10.1186/1471-2105-13-287.

Full text
50

Wei, Pengfei, Chenghu Tang, and Yuting Yang. "Structural reliability and reliability sensitivity analysis of extremely rare failure events by combining sampling and surrogate model methods." Proceedings of the Institution of Mechanical Engineers, Part O: Journal of Risk and Reliability 233, no. 6 (May 17, 2019): 943–57. http://dx.doi.org/10.1177/1748006x19844666.

Full text
Abstract:
The aim of this article is to study the reliability analysis, parametric reliability sensitivity analysis and global reliability sensitivity analysis of structures with extremely rare failure events. First, the global reliability sensitivity indices are restudied, and we show that the total effect index can also be interpreted as the effect of randomly copying each individual input variable on the failure surface. Second, a new method, denoted as Active learning Kriging Markov Chain Monte Carlo (AK-MCMC), is developed for adaptively approximating the failure surface with an active learning Kriging surrogate model as well as dynamically updated Monte Carlo or Markov chain Monte Carlo populations. Third, the AK-MCMC procedure combined with the quasi-optimal importance sampling procedure is extended for estimating the failure probability and the parametric reliability sensitivity and global reliability sensitivity indices. For estimating the global reliability sensitivity indices, two new importance sampling estimators are derived. The AK-MCMC procedure can be regarded as a combination of the classical adaptive Kriging Monte Carlo simulation (AK-MCS) and subset simulation procedures, but it is much more effective when applied to extremely rare failure events. Results of test examples show that the proposed method can accurately and robustly estimate the extremely small failure probability (e.g. 1e–9) as well as the related parametric reliability sensitivity and global reliability sensitivity indices with several dozens of function calls.