Journal articles on the topic 'Markov chain Monte Carlo (MCMC)'

Consult the top 50 journal articles for your research on the topic 'Markov chain Monte Carlo (MCMC).'

1

Borkar, Vivek S. "Markov Chain Monte Carlo (MCMC)." Resonance 27, no. 7 (2022): 1107–15. http://dx.doi.org/10.1007/s12045-022-1407-1.

2

Roy, Vivekananda. "Convergence Diagnostics for Markov Chain Monte Carlo." Annual Review of Statistics and Its Application 7, no. 1 (2020): 387–412. http://dx.doi.org/10.1146/annurev-statistics-031219-041300.

Abstract:
Markov chain Monte Carlo (MCMC) is one of the most useful approaches to scientific computing because of its flexible construction, ease of use, and generality. Indeed, MCMC is indispensable for performing Bayesian analysis. Two critical questions that MCMC practitioners need to address are where to start and when to stop the simulation. Although a great amount of research has gone into establishing convergence criteria and stopping rules with sound theoretical foundation, in practice, MCMC users often decide convergence by applying empirical diagnostic tools. This review article discusses the
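As an illustration of the kind of empirical diagnostic this review surveys, the sketch below computes the split-chain Gelman-Rubin statistic (split-R-hat) from several independent chains. The toy target, the random-walk sampler, and the rule of thumb that values near 1 suggest convergence are illustrative assumptions, not details taken from the article.

```python
import numpy as np

def split_rhat(chains):
    """Gelman-Rubin potential scale reduction factor, computed after
    splitting each chain in half. `chains` has shape (n_chains, n_iters)."""
    m, n = chains.shape
    half = n // 2
    splits = np.vstack([chains[:, :half], chains[:, half:2 * half]])
    W = splits.var(axis=1, ddof=1).mean()          # within-chain variance
    B = half * splits.mean(axis=1).var(ddof=1)     # between-chain variance
    var_plus = (half - 1) / half * W + B / half
    return np.sqrt(var_plus / W)

# Toy check: four independent random-walk Metropolis chains targeting N(0, 1).
rng = np.random.default_rng(0)

def rw_metropolis(n_iters, step=1.0):
    x, out = 0.0, np.empty(n_iters)
    for i in range(n_iters):
        prop = x + step * rng.normal()
        if np.log(rng.uniform()) < 0.5 * (x**2 - prop**2):  # log N(0,1) density ratio
            x = prop
        out[i] = x
    return out

chains = np.vstack([rw_metropolis(5000) for _ in range(4)])
print("split-Rhat:", split_rhat(chains))  # values close to 1 (e.g. below 1.01) suggest convergence
```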
3

Jones, Galin L., and Qian Qin. "Markov Chain Monte Carlo in Practice." Annual Review of Statistics and Its Application 9, no. 1 (2022): 557–78. http://dx.doi.org/10.1146/annurev-statistics-040220-090158.

Abstract:
Markov chain Monte Carlo (MCMC) is an essential set of tools for estimating features of probability distributions commonly encountered in modern applications. For MCMC simulation to produce reliable outcomes, it needs to generate observations representative of the target distribution, and it must be long enough so that the errors of Monte Carlo estimates are small. We review methods for assessing the reliability of the simulation effort, with an emphasis on those most useful in practically relevant settings. Both strengths and weaknesses of these methods are discussed. The methods are illustra
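In the spirit of the reliability checks reviewed here, the sketch below estimates the Monte Carlo standard error of an ergodic average by non-overlapping batch means and converts it into an effective sample size. The AR(1) test chain, batch count, and 1.96 multiplier are illustrative assumptions rather than anything prescribed by the paper.

```python
import numpy as np

def batch_means_mcse(x, n_batches=30):
    """Monte Carlo standard error of mean(x) and effective sample size,
    estimated from non-overlapping batch means."""
    x = np.asarray(x, dtype=float)
    n = (len(x) // n_batches) * n_batches               # drop the remainder
    batch_size = n // n_batches
    batch_means = x[:n].reshape(n_batches, batch_size).mean(axis=1)
    sigma2_hat = batch_size * batch_means.var(ddof=1)   # asymptotic variance estimate
    mcse = np.sqrt(sigma2_hat / n)
    ess = n * x[:n].var(ddof=1) / sigma2_hat            # effective sample size
    return mcse, ess

# Usage on a toy autocorrelated chain (AR(1) with mean 0), standing in for MCMC output.
rng = np.random.default_rng(1)
chain = np.zeros(20_000)
for t in range(1, chain.size):
    chain[t] = 0.9 * chain[t - 1] + rng.normal()
mcse, ess = batch_means_mcse(chain)
print(f"estimate = {chain.mean():.3f} +/- {1.96 * mcse:.3f}, effective sample size ~ {ess:.0f}")
```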
5

Siems, Tobias. "Markov Chain Monte Carlo on finite state spaces." Mathematical Gazette 104, no. 560 (2020): 281–87. http://dx.doi.org/10.1017/mag.2020.51.

Abstract:
We elaborate the idea behind Markov chain Monte Carlo (MCMC) methods in a mathematically coherent, yet simple and understandable way. To this end, we prove a pivotal convergence theorem for finite Markov chains and a minimal version of the Perron-Frobenius theorem. Subsequently, we briefly discuss two fundamental MCMC methods, the Gibbs and Metropolis-Hastings sampler. Only very basic knowledge about matrices, convergence of real sequences and probability theory is required.
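To make the finite-state setting concrete, here is a minimal Metropolis-Hastings sampler on the state space {0, 1, 2, 3} with a symmetric nearest-neighbour proposal; the empirical visit frequencies approach the target distribution, in line with the convergence result discussed in the article. The target weights are an arbitrary illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(42)
target = np.array([1.0, 2.0, 4.0, 3.0])
target /= target.sum()                    # target distribution pi on {0, 1, 2, 3}

def mh_finite(n_steps, x0=0):
    """Metropolis-Hastings with a symmetric +/-1 proposal (wrapping around)."""
    K, x = len(target), x0
    visits = np.zeros(K)
    for _ in range(n_steps):
        prop = (x + rng.choice([-1, 1])) % K                 # symmetric proposal
        if rng.uniform() < min(1.0, target[prop] / target[x]):
            x = prop                                         # accept, otherwise stay put
        visits[x] += 1
    return visits / n_steps

print("target:   ", np.round(target, 3))
print("empirical:", np.round(mh_finite(200_000), 3))         # close to the target for long runs
```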
6

Liu, Qiaomu. "Brief Introduction to Markov Chain Monte Carlo and Its Algorithms." Theoretical and Natural Science 92, no. 1 (2025): 108–15. https://doi.org/10.54254/2753-8818/2025.22031.

Abstract:
The Markov Chain Monte Carlo (MCMC) methods have become indispensable tools in modern statistical computation, enabling researchers to approximate complex probability distributions that are otherwise intractable. This paper focuses on MCMC in the area of statistics and probability, where it is used to draw samples from a probability distribution. In order to introduce this algorithm in a relatively light and straightforward way, this paper breaks the content into two parts, Markov chains and MCMC, and brings in stochastic processes, the Markov property, ordinary Monte Carlo, and Monte Carlo integration in success
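For comparison with MCMC, the ordinary Monte Carlo integration mentioned in this abstract approximates an expectation by an average over independent draws, with error shrinking like 1/sqrt(n). The integrand below (E[X^2] under a standard normal, exact value 1) is purely an illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(0)

def ordinary_monte_carlo(g, sampler, n):
    """Estimate E[g(X)] by averaging g over n independent draws from `sampler`."""
    values = g(sampler(n))
    est = values.mean()
    se = values.std(ddof=1) / np.sqrt(n)    # Monte Carlo error is O(1/sqrt(n))
    return est, se

est, se = ordinary_monte_carlo(lambda x: x**2, lambda n: rng.normal(size=n), 100_000)
print(f"E[X^2] estimate: {est:.4f} +/- {1.96 * se:.4f} (exact value 1)")
```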
7

Chaudhary, A. K. "Bayesian Analysis of Two Parameter Complementary Exponential Power Distribution." NCC Journal 3, no. 1 (2018): 1–23. http://dx.doi.org/10.3126/nccj.v3i1.20244.

Abstract:
In this paper, the Markov chain Monte Carlo (MCMC) method is used to estimate the parameters of the CEP distribution based on a complete sample. A procedure is developed to obtain Bayes estimates of the parameters of the CEP distribution using the MCMC simulation method in OpenBUGS, an established software package for Bayesian analysis using MCMC methods. The MCMC methods have been shown to be easier to implement computationally; the estimates always exist and are statistically consistent, and their probability intervals are convenient to construct. The R fun
8

Chaudhary, Arun Kumar, and Vijay Kumar. "A Bayesian Estimation and Prediction of Gompertz Extension Distribution Using the MCMC Method." Nepal Journal of Science and Technology 19, no. 1 (2020): 142–60. http://dx.doi.org/10.3126/njst.v19i1.29795.

Abstract:
In this paper, the Markov chain Monte Carlo (MCMC) method is used to estimate the parameters of the Gompertz extension distribution based on a complete sample. We have developed a procedure to obtain Bayes estimates of the parameters of the Gompertz extension distribution using the MCMC simulation method in OpenBUGS, an established software package for Bayesian analysis using MCMC methods. We have obtained the Bayes estimates of the parameters and the hazard and reliability functions, and their probability intervals are also presented. We have applied the predic
9

Chaudhary, A. K. "A Study of Perks-II Distribution via Bayesian Paradigm." Pravaha 24, no. 1 (2018): 1–17. http://dx.doi.org/10.3126/pravaha.v24i1.20221.

Abstract:
In this paper, the Markov chain Monte Carlo (MCMC) method is used to estimate the parameters of the Perks-II distribution based on a complete sample. Procedures are developed to perform a full Bayesian analysis of the Perks-II distribution using the MCMC simulation method in OpenBUGS, an established software package for Bayesian analysis using MCMC methods. We have obtained the Bayes estimates of the parameters and the hazard and reliability functions, and their probability intervals are also presented. We have also discussed the issue of model compatibility for
10

Shadare, A. E., M. N. O. Sadiku, and S. M. Musa. "Markov Chain Monte Carlo Solution of Poisson’s Equation in Axisymmetric Regions." Advanced Electromagnetics 8, no. 5 (2019): 29–36. http://dx.doi.org/10.7716/aem.v8i5.1255.

Abstract:
The advent of Monte Carlo methods in the field of EM has seen floating random walk, fixed random walk and Exodus methods deployed to solve Poisson's equation in rectangular coordinate and axisymmetric solution regions. However, when considering large EM domains, classical Monte Carlo methods could be time-consuming because they calculate the potential one point at a time. Thus, Markov Chain Monte Carlo (MCMC) is generally preferred to other Monte Carlo methods when considering whole-field computation. In this paper, MCMC has been applied to solve Poisson's equation in homogeneous and inhomoge
11

Chen, Bairun. "Seeking Application of Markov Chain Monte Carlo in Different Fields." Theoretical and Natural Science 92, no. 1 (2025): 172–77. https://doi.org/10.54254/2753-8818/2025.22222.

Abstract:
Markov Chain Monte Carlo (MCMC) methods build a Markov chain whose stationary distribution matches the desired distribution, which allows approximate sampling from, and estimation of, complex probability distributions by generating dependent samples iteratively. MCMC methods are thus algorithms used to generate samples from complex probability distributions. This article introduces the basic principles, theoretical background and applications of MCMC in various fields such as Bayesian inference, statistical physics, and machine learning. By discussing the particular challenge that MCMC is facing no
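The idea of "a Markov chain whose stationary distribution matches the target" can be written in a few lines: a random-walk Metropolis sampler needs the target density only up to a normalising constant. The two-component Gaussian-mixture target and step size below are illustrative assumptions, not taken from the article.

```python
import numpy as np

rng = np.random.default_rng(7)

def log_target(x):
    """Unnormalised log-density of an equal mixture of N(-2, 1) and N(2, 1)."""
    return np.logaddexp(-0.5 * (x + 2.0) ** 2, -0.5 * (x - 2.0) ** 2)

def random_walk_metropolis(n, step=2.5, x0=0.0):
    x, samples, accepted = x0, np.empty(n), 0
    for i in range(n):
        prop = x + step * rng.normal()
        # Accept with probability min(1, pi(prop)/pi(x)); each draw depends only on the previous state.
        if np.log(rng.uniform()) < log_target(prop) - log_target(x):
            x, accepted = prop, accepted + 1
        samples[i] = x
    return samples, accepted / n

samples, acc_rate = random_walk_metropolis(50_000)
print(f"acceptance rate {acc_rate:.2f}, sample mean {samples.mean():.2f} (target mean is 0)")
```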
12

Finke, Axel, Arnaud Doucet, and Adam M. Johansen. "Limit theorems for sequential MCMC methods." Advances in Applied Probability 52, no. 2 (2020): 377–403. http://dx.doi.org/10.1017/apr.2020.9.

Abstract:
Both sequential Monte Carlo (SMC) methods (a.k.a. 'particle filters') and sequential Markov chain Monte Carlo (sequential MCMC) methods constitute classes of algorithms which can be used to approximate expectations with respect to (a sequence of) probability distributions and their normalising constants. While SMC methods sample particles conditionally independently at each time step, sequential MCMC methods sample particles according to a Markov chain Monte Carlo (MCMC) kernel. Introduced over twenty years ago in [6], sequential MCMC methods have attracted renewed interest recently as
13

Müller, Christian, Fabian Weysser, Thomas Mrziglod, and Andreas Schuppert. "Markov-Chain Monte-Carlo methods and non-identifiabilities." Monte Carlo Methods and Applications 24, no. 3 (2018): 203–14. http://dx.doi.org/10.1515/mcma-2018-0018.

Abstract:
We consider the problem of sampling from high-dimensional likelihood functions with large amounts of non-identifiabilities via Markov-Chain Monte-Carlo algorithms. Non-identifiabilities are problematic for commonly used proposal densities, leading to a low effective sample size. To address this problem, we introduce a regularization method using an artificial prior, which restricts non-identifiable parts of the likelihood function. This enables us to sample the posterior using common MCMC methods more efficiently. We demonstrate this with three MCMC methods on a likelihood based on a
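A toy version of the problem described here: a likelihood that depends only on theta1 + theta2 leaves the direction theta1 - theta2 unconstrained, and a weak artificial Gaussian prior restricts that flat direction so that a plain Metropolis sampler can explore the posterior. The concrete likelihood, prior scale, and step size are illustrative assumptions, not the models used in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def log_likelihood(theta, y=1.5, sigma=0.2):
    # Only theta[0] + theta[1] enters, so the difference is non-identifiable.
    return -0.5 * ((theta[0] + theta[1] - y) / sigma) ** 2

def log_artificial_prior(theta, scale=5.0):
    # Weak zero-mean Gaussian prior that regularises the flat direction.
    return -0.5 * np.sum((theta / scale) ** 2)

def metropolis(n, step=0.5):
    theta = np.zeros(2)
    lp = log_likelihood(theta) + log_artificial_prior(theta)
    out = np.empty((n, 2))
    for i in range(n):
        prop = theta + step * rng.normal(size=2)
        lp_prop = log_likelihood(prop) + log_artificial_prior(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        out[i] = theta
    return out

draws = metropolis(100_000)
print("mean of theta1 + theta2 (identified, ~1.5):", (draws[:, 0] + draws[:, 1]).mean())
print("spread of theta1 - theta2 (set by the prior):", (draws[:, 0] - draws[:, 1]).std())
```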
14

Qin, Liang, Philipp Höllmer, and Werner Krauth. "Direction-sweep Markov chains." Journal of Physics A: Mathematical and Theoretical 55, no. 10 (2022): 105003. http://dx.doi.org/10.1088/1751-8121/ac508a.

Abstract:
We discuss a non-reversible, lifted Markov-chain Monte Carlo (MCMC) algorithm for particle systems in which the direction of proposed displacements is changed deterministically. This algorithm sweeps through directions analogously to the popular MCMC sweep methods for particle or spin indices. Direction-sweep MCMC can be applied to a wide range of reversible or non-reversible Markov chains, such as the Metropolis algorithm or the event-chain Monte Carlo algorithm. For a single two-dimensional tethered hard-disk dipole, we consider direction-sweep MCMC in the limit where restricted equ
15

Koike, Takaaki, and Marius Hofert. "Markov Chain Monte Carlo Methods for Estimating Systemic Risk Allocations." Risks 8, no. 1 (2020): 6. http://dx.doi.org/10.3390/risks8010006.

Abstract:
In this paper, we propose a novel framework for estimating systemic risk measures and risk allocations based on Markov Chain Monte Carlo (MCMC) methods. We consider a class of allocations whose jth component can be written as some risk measure of the jth conditional marginal loss distribution given the so-called crisis event. By considering a crisis event as an intersection of linear constraints, this class of allocations covers, for example, conditional Value-at-Risk (CoVaR), conditional expected shortfall (CoES), VaR contributions, and range VaR (RVaR) contributions as special cases. For thi
16

Azizah, Azizah. "PEMODELAN KLAIM ASURANSI MENGGUNAKAN PENDEKATAN BAYESIAN DAN MARKOV CHAIN MONTE CARLO." Jurnal Kajian Matematika dan Aplikasinya (JKMA) 2, no. 2 (2021): 7. http://dx.doi.org/10.17977/um055v2i22021p7-13.

Abstract:
The determination of the correct prediction of claims frequency and claims severity is very important in the insurance business to determine the outstanding claims reserve which should be prepared by an insurance company. One approach which may be used to predict a future value is the Bayesian approach. This approach combines the sample and the prior information. The information is used to construct the posterior distribution and to determine the estimate of the parameters. However, in this approach, integrations of functions with high dimensions are often encountered. In this thesis, a Markov
17

Karandikar, Rajeeva L. "On the Markov Chain Monte Carlo (MCMC) method." Sadhana 31, no. 2 (2006): 81–104. http://dx.doi.org/10.1007/bf02719775.

18

Masoumi, Samira, Thomas A. Duever, and Park M. Reilly. "Sequential Markov Chain Monte Carlo (MCMC) model discrimination." Canadian Journal of Chemical Engineering 91, no. 5 (2012): 862–69. http://dx.doi.org/10.1002/cjce.21711.

19

SETIAWANI, PUTU AMANDA, KOMANG DHARMAWAN, and I. WAYAN SUMARJAYA. "IMPLEMENTASI METODE MARKOV CHAIN MONTE CARLO DALAM PENENTUAN HARGA KONTRAK BERJANGKA KOMODITAS." E-Jurnal Matematika 4, no. 3 (2015): 122. http://dx.doi.org/10.24843/mtk.2015.v04.i03.p099.

Abstract:
The aim of the research is to implement the Markov Chain Monte Carlo (MCMC) simulation method to price futures contracts on cocoa commodities. The result shows that MCMC is more flexible than the standard Monte Carlo (SMC) simulation method because the MCMC method uses the hit-and-run sampler algorithm to generate proposal movements that are subsequently accepted or rejected with a probability that depends on the target distribution that we want to achieve. This research shows that the MCMC method is suitable for simulating the model of cocoa commodity price movements. The result of this rese
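To show what a hit-and-run move looks like, the sketch below samples uniformly from a disc: from the current point it draws a random direction, finds the chord of the region along that direction, and jumps to a uniform point on the chord. For a uniform target every such proposal is accepted; for the non-uniform price-model target described in the abstract, the move along the line would instead be accepted or rejected against the target density. The region and summary statistics are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def hit_and_run_uniform_disc(n, dim=2, radius=1.0):
    """Hit-and-run sampler for the uniform distribution on a ball of the given radius."""
    x = np.zeros(dim)
    out = np.empty((n, dim))
    for i in range(n):
        d = rng.normal(size=dim)
        d /= np.linalg.norm(d)                          # uniformly random direction
        b = x @ d                                       # solve |x + t d| = radius for the chord
        disc = np.sqrt(b * b - (x @ x - radius ** 2))
        t = rng.uniform(-b - disc, -b + disc)           # uniform point on the chord
        x = x + t * d
        out[i] = x
    return out

samples = hit_and_run_uniform_disc(50_000)
print("mean (should be near 0):", samples.mean(axis=0))
print("mean squared radius (uniform unit disc gives 0.5):", (samples ** 2).sum(axis=1).mean())
```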
20

Biswas, Abhik. "Bayesian MCMC Approach to Learning About the SIR Model." International Journal for Research in Applied Science and Engineering Technology 10, no. 6 (2022): 540–53. http://dx.doi.org/10.22214/ijraset.2022.43818.

Abstract:
This project aims to study the parameters of the deterministic SIR (Susceptible → Infected → Recovered) model of COVID-19 in a Bayesian MCMC framework. Several deterministic mathematical models are being developed every day to forecast the spread of COVID-19 correctly. Here, I have tried to model and study the parameters of the SIR infectious disease model using the Bayesian framework and Markov-Chain Monte-Carlo (MCMC) techniques. I have used Bayesian inference to predict the basic reproductive rate in real time and, following this, demonstrated the process of how the paramete
21

Levy, Roy. "The Rise of Markov Chain Monte Carlo Estimation for Psychometric Modeling." Journal of Probability and Statistics 2009 (2009): 1–18. http://dx.doi.org/10.1155/2009/537139.

Abstract:
Markov chain Monte Carlo (MCMC) estimation strategies represent a powerful approach to estimation in psychometric models. Popular MCMC samplers and their alignment with Bayesian approaches to modeling are discussed. Key historical and current developments of MCMC are surveyed, emphasizing how MCMC allows the researcher to overcome the limitations of other estimation paradigms, facilitates the estimation of models that might otherwise be intractable, and frees the researcher from certain possible misconceptions about the models.
22

Grana, Dario, Leandro de Figueiredo, and Klaus Mosegaard. "Markov chain Monte Carlo for petrophysical inversion." GEOPHYSICS 87, no. 1 (2021): M13–M24. http://dx.doi.org/10.1190/geo2021-0177.1.

Abstract:
Stochastic petrophysical inversion is a method used to predict reservoir properties from seismic data. Recent advances in stochastic optimization allow generating multiple realizations of rock and fluid properties conditioned on seismic data. To match the measured data and represent the uncertainty of the model variables, many realizations are generally required. Stochastic sampling and optimization of spatially correlated models are computationally demanding. Monte Carlo methods allow quantifying the uncertainty of the model variables but are impractical for high-dimensional models with spati
23

Song, Yihan, Ali Luo, and Yongheng Zhao. "Measuring Stellar Radial Velocity using Markov Chain Monte Carlo (MCMC) Method." Proceedings of the International Astronomical Union 9, S298 (2013): 441. http://dx.doi.org/10.1017/s1743921313007060.

Abstract:
Stellar radial velocity is estimated by using template fitting and Markov Chain Monte Carlo (MCMC) methods. This method works on the LAMOST stellar spectra. The MCMC simulation generates a probability distribution of the RV. The RV error can also be computed from the distribution.
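A stripped-down version of the approach sketched in this abstract: treat the radial velocity as the single unknown, Doppler-shift a template spectrum, compare it with the observed spectrum under a Gaussian noise model, and run random-walk Metropolis so that the posterior mean and standard deviation give the RV and its error. The synthetic single-line template, noise level, and tuning constants below are invented for illustration and are not taken from the LAMOST pipeline.

```python
import numpy as np

rng = np.random.default_rng(8)
c = 299_792.458  # speed of light, km/s

# Hypothetical template: one Gaussian absorption line on a flat continuum.
wave = np.linspace(6540.0, 6590.0, 500)                        # wavelength grid (Angstrom)
def template(w):
    return 1.0 - 0.6 * np.exp(-0.5 * ((w - 6563.0) / 1.5) ** 2)

# Simulated observation: template Doppler-shifted by 35 km/s plus noise.
v_true, noise_sd = 35.0, 0.02
obs = template(wave / (1.0 + v_true / c)) + noise_sd * rng.normal(size=wave.size)

def log_likelihood(v):
    model = template(wave / (1.0 + v / c))                     # shift the template, compare to data
    return -0.5 * np.sum(((obs - model) / noise_sd) ** 2)

def metropolis(n=20_000, step=2.0, v0=0.0):
    v, lp, out = v0, log_likelihood(v0), np.empty(n)
    for i in range(n):
        prop = v + step * rng.normal()
        lp_prop = log_likelihood(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            v, lp = prop, lp_prop
        out[i] = v
    return out[n // 4:]                                        # discard burn-in

draws = metropolis()
print(f"RV = {draws.mean():.2f} +/- {draws.std():.2f} km/s (true value {v_true} km/s)")
```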
24

Harizahayu, Harizahayu. "PEMODELAN RANTAI MARKOV MENGGUNAKAN ALGORITMA METROPOLIS-HASTINGS." MAp (Mathematics and Applications) Journal 2, no. 2 (2020): 11–18. http://dx.doi.org/10.15548/map.v2i2.2259.

Abstract:
This paper describes the form of the posterior distribution P(claim probability) = Beta(β|α) through a simulation of a simplified implementation of the algorithm and the application of the Markov Chain Monte Carlo algorithm, using Bayesian analysis with a Markov Chain Monte Carlo modelling approach. The Markov Chain Monte Carlo algorithm is a class of algorithms for sampling from a probability distribution by building a Markov chain that has the desired distribution as its stationary distribution. The Metropolis algorithm generates a sequence of samples using an acceptance mechanism
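As a small illustration of the Beta posterior and the Metropolis acceptance mechanism described above, the sketch below samples the posterior of a claim probability under a Bernoulli likelihood with a Beta prior and checks the result against the conjugate closed form. The claim counts and prior parameters are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical data: 7 claims out of 40 policies, Beta(2, 8) prior on the claim probability.
claims, n_policies, a, b = 7, 40, 2.0, 8.0

def log_posterior(p):
    if not 0.0 < p < 1.0:
        return -np.inf
    # Bernoulli likelihood times Beta(a, b) prior, up to a normalising constant.
    return (claims + a - 1) * np.log(p) + (n_policies - claims + b - 1) * np.log(1 - p)

def metropolis(n, step=0.1, p0=0.2):
    p, out = p0, np.empty(n)
    for i in range(n):
        prop = p + step * rng.normal()                     # symmetric random-walk proposal
        if np.log(rng.uniform()) < log_posterior(prop) - log_posterior(p):
            p = prop                                       # Metropolis acceptance step
        out[i] = p
    return out

draws = metropolis(50_000)[5_000:]                         # discard burn-in
print("MCMC posterior mean:       ", draws.mean())
print("Conjugate Beta(9, 41) mean:", (claims + a) / (n_policies + a + b))   # exact answer 0.18
```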
25

Tie, Zhixin, Dingkai Zhu, Shunhe Hong, and Hui Xu. "A Hierarchical Random Graph Efficient Sampling Algorithm Based on Improved MCMC Algorithm." Electronics 11, no. 15 (2022): 2396. http://dx.doi.org/10.3390/electronics11152396.

Abstract:
A hierarchical random graph (HRG) model combined with a maximum likelihood approach and a Markov Chain Monte Carlo algorithm can not only be used to quantitatively describe the hierarchical organization of many real networks, but also can predict missing connections in partly known networks with high accuracy. However, the computational cost is very large when hierarchical random graphs are sampled by the Markov Chain Monte Carlo algorithm (MCMC), so that the hierarchical random graphs, which can describe the characteristics of network structure, cannot be found in a reasonable time range. Thi
26

Vargas, Juan P., Jair C. Koppe, Sebastián Pérez, and Juan P. Hurtado. "Planning Tunnel Construction Using Markov Chain Monte Carlo (MCMC)." Mathematical Problems in Engineering 2015 (2015): 1–8. http://dx.doi.org/10.1155/2015/797953.

Abstract:
Tunnels, drifts, drives, and other types of underground excavation are very common in mining as well as in the construction of roads, railways, dams, and other civil engineering projects. Planning is essential to the success of tunnel excavation, and construction time is one of the most important factors to be taken into account. This paper proposes a simulation algorithm based on a stochastic numerical method, the Markov chain Monte Carlo method, that can provide the best estimate of the opening excavation times for the classic method of drilling and blasting. Taking account of technical cons
27

South, Leah F., Marina Riabiz, Onur Teymur, and Chris J. Oates. "Postprocessing of MCMC." Annual Review of Statistics and Its Application 9, no. 1 (2022): 529–55. http://dx.doi.org/10.1146/annurev-statistics-040220-091727.

Abstract:
Markov chain Monte Carlo is the engine of modern Bayesian statistics, being used to approximate the posterior and derived quantities of interest. Despite this, the issue of how the output from a Markov chain is postprocessed and reported is often overlooked. Convergence diagnostics can be used to control bias via burn-in removal, but these do not account for (common) situations where a limited computational budget engenders a bias-variance trade-off. The aim of this article is to review state-of-the-art techniques for postprocessing Markov chain output. Our review covers methods based on discr
28

Roberts, Gareth O., and Jeffrey S. Rosenthal. "Complexity bounds for Markov chain Monte Carlo algorithms via diffusion limits." Journal of Applied Probability 53, no. 2 (2016): 410–20. http://dx.doi.org/10.1017/jpr.2016.9.

Abstract:
We connect known results about diffusion limits of Markov chain Monte Carlo (MCMC) algorithms to the computer science notion of algorithm complexity. Our main result states that any weak limit of a Markov process implies a corresponding complexity bound (in an appropriate metric). We then combine this result with previously known MCMC diffusion limit results to prove that, under appropriate assumptions, the random-walk Metropolis algorithm in d dimensions takes O(d) iterations to converge to stationarity, while the Metropolis-adjusted Langevin algorithm takes O(d^{1/3}) iterations to conv
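For orientation, the step-size scalings that sit behind these complexity statements can be summarised as follows, for product-form targets under the usual regularity conditions of the optimal-scaling literature; the acceptance-rate constants are the classical values from that literature, not results stated in this abstract.

```latex
\begin{align*}
  \text{RWM:}  &\quad x' = x + \sigma_d Z, \quad Z \sim N(0, I_d), \quad
                 \sigma_d^2 = \frac{\ell^2}{d}, \quad
                 \text{acceptance rate} \approx 0.234, \quad \text{mixing in } O(d) \text{ iterations},\\[4pt]
  \text{MALA:} &\quad x' = x + \frac{\sigma_d^2}{2}\,\nabla \log \pi(x) + \sigma_d Z, \quad
                 \sigma_d^2 = \frac{\ell^2}{d^{1/3}}, \quad
                 \text{acceptance rate} \approx 0.574, \quad \text{mixing in } O(d^{1/3}) \text{ iterations}.
\end{align*}
```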
29

Stathopoulos, Vassilios, and Mark A. Girolami. "Markov chain Monte Carlo inference for Markov jump processes via the linear noise approximation." Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 371, no. 1984 (2013): 20110541. http://dx.doi.org/10.1098/rsta.2011.0541.

Abstract:
Bayesian analysis for Markov jump processes (MJPs) is a non-trivial and challenging problem. Although exact inference is theoretically possible, it is computationally demanding, thus its applicability is limited to a small class of problems. In this paper, we describe the application of Riemann manifold Markov chain Monte Carlo (MCMC) methods using an approximation to the likelihood of the MJP that is valid when the system modelled is near its thermodynamic limit. The proposed approach is both statistically and computationally efficient whereas the convergence rate and mixing of the chains all
30

Yuan, Ke, Mark Girolami, and Mahesan Niranjan. "Markov Chain Monte Carlo Methods for State-Space Models with Point Process Observations." Neural Computation 24, no. 6 (2012): 1462–86. http://dx.doi.org/10.1162/neco_a_00281.

Abstract:
This letter considers how a number of modern Markov chain Monte Carlo (MCMC) methods can be applied for parameter estimation and inference in state-space models with point process observations. We quantified the efficiencies of these MCMC methods on synthetic data, and our results suggest that the Riemannian manifold Hamiltonian Monte Carlo method offers the best performance. We further compared such a method with a previously tested variational Bayes method on two experimental data sets. Results indicate similar performance on the large data sets and superior performance on small ones. The wo
31

Shao, Liangshan, and Yingchao Gao. "A Gas Prominence Prediction Model Based on Entropy-Weighted Gray Correlation and MCMC-ISSA-SVM." Processes 11, no. 7 (2023): 2098. http://dx.doi.org/10.3390/pr11072098.

Abstract:
To improve the accuracy of coal and gas prominence prediction, an improved sparrow search algorithm (ISSA) and an optimized support vector machine (SVM) based on the Markov chain Monte Carlo (MCMC) filling algorithm prediction model were proposed. The mean value of the data after filling in the missing values in the coal and gas prominence data using the MCMC filling algorithm was 2.282, with a standard deviation of 0.193. Compared with the mean fill method (Mean), random forest filling method (random forest, RF), and K-nearest neighbor filling method (K-nearest neighbor, KNN), the MCMC fillin
32

Lukitasari, Dewi, Adi Setiawan, and Leopoldus Ricky Sasangko. "Bayesian Survival Analysis Untuk Mengestimasi Parameter Model Weibull-Regression Pada Kasus Ketahanan Hidup Pasien Penderita Jantung Koroner." d'CARTESIAN 4, no. 1 (2015): 26. http://dx.doi.org/10.35799/dc.4.1.2015.7531.

Abstract:
This paper discusses parameter estimation of a Weibull regression model for censored data in the case of the survival of coronary heart disease patients, using a Bayesian survival analysis approach. The data used are simulated data on patient survival times, patient status (alive/dead) and the treatment applied, namely ring and bypass. The Bayesian approach is used to find the posterior distribution of the parameters. The Markov Chain Monte Carlo (MCMC) method is used to generate a Markov chain in order to estimate the parameters, including the regression coefficients (b) and the parameter r of the
33

Sinharay, Sandip. "Experiences With Markov Chain Monte Carlo Convergence Assessment in Two Psychometric Examples." Journal of Educational and Behavioral Statistics 29, no. 4 (2004): 461–88. http://dx.doi.org/10.3102/10769986029004461.

Abstract:
There is an increasing use of Markov chain Monte Carlo (MCMC) algorithms for fitting statistical models in psychometrics, especially in situations where the traditional estimation techniques are very difficult to apply. One of the disadvantages of using an MCMC algorithm is that it is not straightforward to determine the convergence of the algorithm. Using the output of an MCMC algorithm that has not converged may lead to incorrect inferences on the problem at hand. The convergence is not one to a point, but that of the distribution of a sequence of generated values to another distribution, an
34

Pooley, C. M., S. C. Bishop, A. Doeschl-Wilson, and G. Marion. "Posterior-based proposals for speeding up Markov chain Monte Carlo." Royal Society Open Science 6, no. 11 (2019): 190619. http://dx.doi.org/10.1098/rsos.190619.

Abstract:
Markov chain Monte Carlo (MCMC) is widely used for Bayesian inference in models of complex systems. Performance, however, is often unsatisfactory in models with many latent variables due to so-called poor mixing, necessitating the development of application-specific implementations. This paper introduces ‘posterior-based proposals' (PBPs), a new type of MCMC update applicable to a huge class of statistical models (whose conditional dependence structures are represented by directed acyclic graphs). PBPs generate large joint updates in parameter and latent variable space, while retaining good ac
35

Jiang, Yu Hang, Tong Liu, Zhiya Lou, et al. "Markov Chain Confidence Intervals and Biases." International Journal of Statistics and Probability 11, no. 1 (2021): 29. http://dx.doi.org/10.5539/ijsp.v11n1p29.

Abstract:
We derive explicit asymptotic confidence intervals for any Markov chain Monte Carlo (MCMC) algorithm with finite asymptotic variance, started at any initial state, without requiring a Central Limit Theorem nor reversibility nor geometric ergodicity nor any bias bound. We also derive explicit non-asymptotic confidence intervals assuming bounds on the bias or first moment, or alternatively that the chain starts in stationarity. We relate those non-asymptotic bounds to properties of MCMC bias, and show that polynomial ergodicity implies certain bias bounds. We also apply our results to several
36

Vestring, Yann, and Javad Tavakoli. "Estimating and Calibrating Markov Chain Sample Error Variance." International Journal of Statistics and Probability 13, no. 1 (2024): 10. http://dx.doi.org/10.5539/ijsp.v13n1p10.

Abstract:
Markov chain Monte Carlo (MCMC) methods are a powerful and versatile tool with applications spanning a wide spectrum of fields, including Bayesian inference, computational biology, and physics. One of the key challenges in applying MCMC algorithms is to deal with estimation error. The main result in this article is a closed form, non-asymptotic solution for the sample error variance of a single MCMC estimate. Importantly, this result assumes that the state-space is finite and discrete. We demonstrate with examples how this result can help estimate and calibrate MCMC estimation error
37

Ahmadian, Yashar, Jonathan W. Pillow, and Liam Paninski. "Efficient Markov Chain Monte Carlo Methods for Decoding Neural Spike Trains." Neural Computation 23, no. 1 (2011): 46–96. http://dx.doi.org/10.1162/neco_a_00059.

Abstract:
Stimulus reconstruction or decoding methods provide an important tool for understanding how sensory and motor information is represented in neural activity. We discuss Bayesian decoding methods based on an encoding generalized linear model (GLM) that accurately describes how stimuli are transformed into the spike trains of a group of neurons. The form of the GLM likelihood ensures that the posterior distribution over the stimuli that caused an observed set of spike trains is log concave so long as the prior is. This allows the maximum a posteriori (MAP) stimulus estimate to be obtained using e
38

Barido-Sottani, Joëlle, Orlando Schwery, Rachel C. M. Warnock, Chi Zhang, and April Marie Wright. "Practical guidelines for Bayesian phylogenetic inference using Markov Chain Monte Carlo (MCMC)." Open Research Europe 3 (June 28, 2024): 204. http://dx.doi.org/10.12688/openreseurope.16679.2.

Abstract:
Phylogenetic estimation is, and has always been, a complex endeavor. Estimating a phylogenetic tree involves evaluating many possible solutions and possible evolutionary histories that could explain a set of observed data, typically by using a model of evolution. Modern statistical methods involve not just the estimation of a tree, but also solutions to more complex models involving fossil record information and other data sources. Markov Chain Monte Carlo (MCMC) is a leading method for approximating the posterior distribution of parameters in a mathematical model. It is deployed in all Bayesi
39

Barido-Sottani, Joëlle, Orlando Schwery, Rachel C. M. Warnock, Chi Zhang, and April Marie Wright. "Practical guidelines for Bayesian phylogenetic inference using Markov chain Monte Carlo (MCMC)." Open Research Europe 3 (August 5, 2024): 204. http://dx.doi.org/10.12688/openreseurope.16679.3.

Abstract:
Phylogenetic estimation is, and has always been, a complex endeavor. Estimating a phylogenetic tree involves evaluating many possible solutions and possible evolutionary histories that could explain a set of observed data, typically by using a model of evolution. Values for all model parameters need to be evaluated as well. Modern statistical methods involve not just the estimation of a tree, but also solutions to more complex models involving fossil record information and other data sources. Markov chain Monte Carlo (MCMC) is a leading method for approximating the posterior distribution of pa
40

Barido-Sottani, Joëlle, Orlando Schwery, Rachel C. M. Warnock, Chi Zhang, and April Marie Wright. "Practical guidelines for Bayesian phylogenetic inference using Markov Chain Monte Carlo (MCMC)." Open Research Europe 3 (November 20, 2023): 204. http://dx.doi.org/10.12688/openreseurope.16679.1.

Abstract:
Phylogenetic estimation is, and has always been, a complex endeavor. Estimating a phylogenetic tree involves evaluating many possible solutions and possible evolutionary histories that could explain a set of observed data, typically by using a model of evolution. Modern statistical methods involve not just the estimation of a tree, but also solutions to more complex models involving fossil record information and other data sources. Markov Chain Monte Carlo (MCMC) is a leading method for approximating the posterior distribution of parameters in a mathematical model. It is deployed in all Bayesi
41

Che, X., and S. Xu. "Bayesian data analysis for agricultural experiments." Canadian Journal of Plant Science 90, no. 5 (2010): 575–603. http://dx.doi.org/10.4141/cjps10004.

Abstract:
Data collected in agricultural experiments can be analyzed in many different ways using different models. The most commonly used models are the linear model and the generalized linear model. The maximum likelihood method is often used for data analysis. However, this method may not be able to handle complicated models, especially multiple level hierarchical models. The Bayesian method partitions complicated models into simple components, each of which may be formulated analytically. Therefore, the Bayesian method is capable of handling very complicated models. The Bayesian method itself may no
42

Acquah, Henry De-Graft. "Bayesian Logistic Regression Modelling via Markov Chain Monte Carlo Algorithm." Journal of Social and Development Sciences 4, no. 4 (2013): 193–97. http://dx.doi.org/10.22610/jsds.v4i4.751.

Abstract:
This paper introduces Bayesian analysis and demonstrates its application to parameter estimation of the logistic regression via Markov Chain Monte Carlo (MCMC) algorithm. The Bayesian logistic regression estimation is compared with the classical logistic regression. Both the classical logistic regression and the Bayesian logistic regression suggest that higher per capita income is associated with free trade of countries. The results also show a reduction of standard errors associated with the coefficients obtained from the Bayesian analysis, thus bringing greater stability to the coefficients.
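A minimal version of the estimation strategy described here, random-walk Metropolis over the coefficients of a logistic regression with a vague Gaussian prior, is sketched below on synthetic data; the data set, prior scale, and tuning constants are illustrative assumptions rather than details from the paper.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic data: one standardised predictor (e.g. per capita income) and a binary outcome.
n = 200
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])                      # intercept + predictor
true_beta = np.array([-0.5, 1.2])
y = rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-X @ true_beta))

def log_posterior(beta, prior_sd=10.0):
    eta = X @ beta
    loglik = np.sum(y * eta - np.log1p(np.exp(eta)))      # Bernoulli-logit log-likelihood
    logprior = -0.5 * np.sum((beta / prior_sd) ** 2)      # vague Gaussian prior
    return loglik + logprior

def metropolis(n_iter=40_000, step=0.15):
    beta = np.zeros(2)
    lp = log_posterior(beta)
    out = np.empty((n_iter, 2))
    for i in range(n_iter):
        prop = beta + step * rng.normal(size=2)
        lp_prop = log_posterior(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            beta, lp = prop, lp_prop
        out[i] = beta
    return out[n_iter // 5:]                              # drop burn-in

draws = metropolis()
print("posterior means:", draws.mean(axis=0))             # roughly recovers true_beta
print("posterior sds:  ", draws.std(axis=0))
```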
43

Liang, Faming, and Ick-Hoon Jin. "A Monte Carlo Metropolis-Hastings Algorithm for Sampling from Distributions with Intractable Normalizing Constants." Neural Computation 25, no. 8 (2013): 2199–234. http://dx.doi.org/10.1162/neco_a_00466.

Abstract:
Simulating from distributions with intractable normalizing constants has been a long-standing problem in machine learning. In this letter, we propose a new algorithm, the Monte Carlo Metropolis-Hastings (MCMH) algorithm, for tackling this problem. The MCMH algorithm is a Monte Carlo version of the Metropolis-Hastings algorithm. It replaces the unknown normalizing constant ratio by a Monte Carlo estimate in simulations, while still converging, as shown in the letter, to the desired target distribution under mild conditions. The MCMH algorithm is illustrated with spatial autologistic models and e
44

Östling, Robert, and Jörg Tiedemann. "Efficient Word Alignment with Markov Chain Monte Carlo." Prague Bulletin of Mathematical Linguistics 106, no. 1 (2016): 125–46. http://dx.doi.org/10.1515/pralin-2016-0013.

Abstract:
We present EFMARAL, a new system for efficient and accurate word alignment using a Bayesian model with Markov Chain Monte Carlo (MCMC) inference. Through careful selection of data structures and model architecture we are able to surpass the fast_align system, commonly used for performance-critical word alignment, both in computational efficiency and alignment accuracy. Our evaluation shows that a phrase-based statistical machine translation (SMT) system produces translations of higher quality when using word alignments from EFMARAL than from fast_align, and that translation quality is
45

Shadare, A. E., M. N. O. Sadiku, and S. M. Musa. "Solution of Axisymmetric Inhomogeneous Problems with the Markov Chain Monte Carlo." Advanced Electromagnetics 8, no. 4 (2019): 50–58. http://dx.doi.org/10.7716/aem.v8i4.1162.

Abstract:
With increasing complexity of EM problems, 1D and 2D axisymmetric approximations in p, z plane are sometimes necessary to quickly solve difficult symmetric problems using limited data storage and within shortest possible time. Inhomogeneous EM problems frequently occur in cases where two or more dielectric media, separated by an interface, exist and could pose challenges in complex EM problems. Simple, fast and efficient numerical techniques are constantly desired. This paper presents the application of simple and efficient Markov Chain Monte Carlo (MCMC) to EM inhomogeneous axisymmetric Lapla
46

de Figueiredo, Leandro Passos, Dario Grana, Mauro Roisenberg, and Bruno B. Rodrigues. "Gaussian mixture Markov chain Monte Carlo method for linear seismic inversion." GEOPHYSICS 84, no. 3 (2019): R463–R476. http://dx.doi.org/10.1190/geo2018-0529.1.

Abstract:
We have developed a Markov chain Monte Carlo (MCMC) method for joint inversion of seismic data for the prediction of facies and elastic properties. The solution of the inverse problem is defined by the Bayesian posterior distribution of the properties of interest. The prior distribution is a Gaussian mixture model, and each component is associated to a potential configuration of the facies sequence along the seismic trace. The low frequency is incorporated by using facies-dependent depositional trend models for the prior means of the elastic properties in each facies. The posterior distributio
47

Zhao, Di, and Haiwu He. "DSMC: Fast direct simulation Monte Carlo solver for the Boltzmann equation by Multi-Chain Markov Chain and multicore programming." International Journal of Modeling, Simulation, and Scientific Computing 07, no. 02 (2016): 1650009. http://dx.doi.org/10.1142/s1793962316500094.

Abstract:
Direct Simulation Monte Carlo (DSMC) solves the Boltzmann equation with large Knudsen number. The Boltzmann equation generally consists of three terms: the force term, the diffusion term and the collision term. While the first two terms of the Boltzmann equation can be discretized by numerical methods such as the finite volume method, the third term can be approximated by DSMC, and DSMC simulates the physical behaviors of gas molecules. However, because of the low sampling efficiency of Monte Carlo Simulation in DSMC, this part usually occupies a large portion of the computational cost to solve the
48

Stuart, Georgia K., Susan E. Minkoff, and Felipe Pereira. "A two-stage Markov chain Monte Carlo method for seismic inversion and uncertainty quantification." GEOPHYSICS 84, no. 6 (2019): R1003–R1020. http://dx.doi.org/10.1190/geo2018-0893.1.

Abstract:
Bayesian methods for full-waveform inversion allow quantification of uncertainty in the solution, including determination of interval estimates and posterior distributions of the model unknowns. Markov chain Monte Carlo (MCMC) methods produce posterior distributions subject to fewer assumptions, such as normality, than deterministic Bayesian methods. However, MCMC is computationally a very expensive process that requires repeated solution of the wave equation for different velocity samples. Ultimately, a large proportion of these samples (often 40%–90%) is rejected. We have evaluated a two-sta
49

Atchadé, Yves, and Yizao Wang. "On the convergence rates of some adaptive Markov chain Monte Carlo algorithms." Journal of Applied Probability 52, no. 3 (2015): 811–25. http://dx.doi.org/10.1239/jap/1445543848.

Abstract:
In this paper we study the mixing time of certain adaptive Markov chain Monte Carlo (MCMC) algorithms. Under some regularity conditions, we show that the convergence rate of importance resampling MCMC algorithms, measured in terms of the total variation distance, is O(n^{-1}). By means of an example, we establish that, in general, this algorithm does not converge at a faster rate. We also study the interacting tempering algorithm, a simplified version of the equi-energy sampler, and establish that its mixing time is of order O(n^{-1/2}).
50

Atchadé, Yves, and Yizao Wang. "On the convergence rates of some adaptive Markov chain Monte Carlo algorithms." Journal of Applied Probability 52, no. 03 (2015): 811–25. http://dx.doi.org/10.1017/s0021900200113452.

Abstract:
In this paper we study the mixing time of certain adaptive Markov chain Monte Carlo (MCMC) algorithms. Under some regularity conditions, we show that the convergence rate of importance resampling MCMC algorithms, measured in terms of the total variation distance, is O(n^{-1}). By means of an example, we establish that, in general, this algorithm does not converge at a faster rate. We also study the interacting tempering algorithm, a simplified version of the equi-energy sampler, and establish that its mixing time is of order O(n^{-1/2}).