Selected scholarly literature on the topic "Markov chain Monte Carlo methods"

Cite a source in APA, MLA, Chicago, Harvard, and many other citation styles


Browse the list of current journal articles, books, theses, conference proceedings, and other scholarly sources on the topic "Markov chain Monte Carlo methods".

Next to each source in the reference list there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic citation of the selected work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication as a .pdf and read its abstract online, when one is available in the metadata.

Journal articles on the topic "Markov chain Monte Carlo methods"

1

Athreya, K. B., Mohan Delampady, and T. Krishnan. "Markov Chain Monte Carlo methods". Resonance 8, no. 12 (December 2003): 18–32. http://dx.doi.org/10.1007/bf02839048.

2

Athreya, K. B., Mohan Delampady, and T. Krishnan. "Markov chain Monte Carlo methods". Resonance 8, no. 10 (October 2003): 8–19. http://dx.doi.org/10.1007/bf02840702.

3

Athreya, K. B., Mohan Delampady, and T. Krishnan. "Markov chain Monte Carlo methods". Resonance 8, no. 7 (July 2003): 63–75. http://dx.doi.org/10.1007/bf02834404.

4

Athreya, K. B., Mohan Delampady, and T. Krishnan. "Markov Chain Monte Carlo methods". Resonance 8, no. 4 (April 2003): 17–26. http://dx.doi.org/10.1007/bf02883528.

5

Andrieu, Christophe, Arnaud Doucet, and Roman Holenstein. "Particle Markov chain Monte Carlo methods". Journal of the Royal Statistical Society: Series B (Statistical Methodology) 72, no. 3 (June 2010): 269–342. http://dx.doi.org/10.1111/j.1467-9868.2009.00736.x.

6

Gelman, Andrew, and Donald B. Rubin. "Markov chain Monte Carlo methods in biostatistics". Statistical Methods in Medical Research 5, no. 4 (December 1996): 339–55. http://dx.doi.org/10.1177/096228029600500402.

7

Brockwell, Anthony, Pierre Del Moral, and Arnaud Doucet. "Sequentially interacting Markov chain Monte Carlo methods". Annals of Statistics 38, no. 6 (December 2010): 3387–411. http://dx.doi.org/10.1214/09-aos747.

8

Jones, Galin L., and Qian Qin. "Markov Chain Monte Carlo in Practice". Annual Review of Statistics and Its Application 9, no. 1 (March 7, 2022): 557–78. http://dx.doi.org/10.1146/annurev-statistics-040220-090158.

Abstract:
Markov chain Monte Carlo (MCMC) is an essential set of tools for estimating features of probability distributions commonly encountered in modern applications. For MCMC simulation to produce reliable outcomes, it needs to generate observations representative of the target distribution, and it must be long enough so that the errors of Monte Carlo estimates are small. We review methods for assessing the reliability of the simulation effort, with an emphasis on those most useful in practically relevant settings. Both strengths and weaknesses of these methods are discussed. The methods are illustrated in several examples and in a detailed case study.
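The reliability questions surveyed by Jones and Qin can be made concrete with a small, self-contained sketch: a random-walk Metropolis sampler for a toy one-dimensional target, followed by a crude effective-sample-size estimate computed from the chain's autocorrelations. This is an illustration only; the Gaussian target, the step size, and the effective_sample_size helper are assumptions of this sketch, not anything prescribed in the article.

```python
import numpy as np

def random_walk_metropolis(log_target, x0, n_samples, step=1.0, rng=None):
    """Random-walk Metropolis sampler for a one-dimensional log-density."""
    rng = np.random.default_rng() if rng is None else rng
    chain = np.empty(n_samples)
    x, logp = x0, log_target(x0)
    for i in range(n_samples):
        prop = x + step * rng.standard_normal()        # symmetric proposal
        logp_prop = log_target(prop)
        if np.log(rng.uniform()) < logp_prop - logp:   # Metropolis acceptance test
            x, logp = prop, logp_prop
        chain[i] = x
    return chain

def effective_sample_size(chain):
    """Crude ESS estimate: n / (1 + 2 * sum of leading positive autocorrelations)."""
    x = chain - chain.mean()
    acf = np.correlate(x, x, mode="full")[x.size - 1:] / (x.var() * x.size)
    negative = np.where(acf < 0)[0]
    cutoff = negative[0] if negative.size else acf.size  # stop at first negative lag
    return chain.size / (1.0 + 2.0 * acf[1:cutoff].sum())

if __name__ == "__main__":
    log_target = lambda x: -0.5 * x**2                  # standard normal, up to a constant
    chain = random_walk_metropolis(log_target, x0=0.0, n_samples=20_000, step=2.4)
    print("posterior mean ~", chain.mean(), " ESS ~", effective_sample_size(chain))
```

An effective sample size that is small relative to the chain length signals strong autocorrelation and suggests that the simulation should be run longer or retuned, which is exactly the kind of assessment the review discusses.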
9

Montanaro, Ashley. "Quantum speedup of Monte Carlo methods". Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences 471, no. 2181 (September 2015): 20150301. http://dx.doi.org/10.1098/rspa.2015.0301.

Abstract:
Monte Carlo methods use random sampling to estimate numerical quantities which are hard to compute deterministically. One important example is the use in statistical physics of rapidly mixing Markov chains to approximately compute partition functions. In this work, we describe a quantum algorithm which can accelerate Monte Carlo methods in a very general setting. The algorithm estimates the expected output value of an arbitrary randomized or quantum subroutine with bounded variance, achieving a near-quadratic speedup over the best possible classical algorithm. Combining the algorithm with the use of quantum walks gives a quantum speedup of the fastest known classical algorithms with rigorous performance bounds for computing partition functions, which use multiple-stage Markov chain Monte Carlo techniques. The quantum algorithm can also be used to estimate the total variation distance between probability distributions efficiently.

Theses on the topic "Markov chain Monte Carlo methods"

1

Fang, Youhan. "Efficient Markov Chain Monte Carlo Methods". Thesis, Purdue University, 2018. http://pqdtopen.proquest.com/#viewpdf?dispub=10809188.

Abstract:

Generating random samples from a prescribed distribution is one of the most important and challenging problems in machine learning, Bayesian statistics, and the simulation of materials. Markov Chain Monte Carlo (MCMC) methods are usually the required tool for this task, if the desired distribution is known only up to a multiplicative constant. Samples produced by an MCMC method are real values in N-dimensional space, called the configuration space. The distribution of such samples converges to the target distribution in the limit. However, existing MCMC methods still face many challenges that are not well resolved. Difficulties in sampling with MCMC methods include, but are not limited to, dealing with high dimensional and multimodal problems, high computation cost due to extremely large datasets in Bayesian machine learning models, and a lack of reliable indicators for detecting convergence and measuring the accuracy of sampling. This dissertation focuses on new theory and methodology for efficient MCMC methods that aim to overcome the aforementioned difficulties.

One contribution of this dissertation is generalizations of hybrid Monte Carlo (HMC). An HMC method combines a discretized dynamical system in an extended space, called the state space, and an acceptance test based on the Metropolis criterion. The discretized dynamical system used in HMC is volume preserving—meaning that in the state space, the absolute Jacobian of a map from one point on the trajectory to another is 1. Volume preservation is, however, not necessary for the general purpose of sampling. A general theory allowing the use of non-volume preserving dynamics for proposing MCMC moves is proposed. Examples including isokinetic dynamics and variable mass Hamiltonian dynamics with an explicit integrator, are all designed with fewer restrictions based on the general theory. Experiments show improvement in efficiency for sampling high dimensional multimodal problems. A second contribution is stochastic gradient samplers with reduced bias. An in-depth analysis of the noise introduced by the stochastic gradient is provided. Two methods to reduce the bias in the distribution of samples are proposed. One is to correct the dynamics by using an estimated noise based on subsampled data, and the other is to introduce additional variables and corresponding dynamics to adaptively reduce the bias. Extensive experiments show that both methods outperform existing methods. A third contribution is quasi-reliable estimates of effective sample size. Proposed is a more reliable indicator—the longest integrated autocorrelation time over all functions in the state space—for detecting the convergence and measuring the accuracy of MCMC methods. The superiority of the new indicator is supported by experiments on both synthetic and real problems.

Minor contributions include a general framework of changing variables, and a numerical integrator for the Hamiltonian dynamics with fourth order accuracy. The idea of changing variables is to transform the potential energy function as a function of the original variable to a function of the new variable, such that undesired properties can be removed. Two examples are provided and preliminary experimental results are obtained for supporting this idea. The fourth order integrator is constructed by combining the idea of the simplified Takahashi-Imada method and a two-stage Hessian-based integrator. The proposed method, called two-stage simplified Takahashi-Imada method, shows outstanding performance over existing methods in high-dimensional sampling problems.

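For readers who want to see the baseline that the generalizations discussed above start from, here is a minimal sketch of standard Hamiltonian (hybrid) Monte Carlo with a volume-preserving leapfrog integrator and a Metropolis acceptance test. The Gaussian target, step size, and trajectory length are arbitrary illustrative choices and are not taken from the dissertation.

```python
import numpy as np

def leapfrog(q, p, grad_log_target, eps, n_steps):
    """Volume-preserving leapfrog integration of Hamiltonian dynamics."""
    q, p = q.copy(), p.copy()
    p += 0.5 * eps * grad_log_target(q)          # half step for momentum
    for _ in range(n_steps - 1):
        q += eps * p                             # full step for position
        p += eps * grad_log_target(q)            # full step for momentum
    q += eps * p
    p += 0.5 * eps * grad_log_target(q)          # final half step for momentum
    return q, p

def hmc(log_target, grad_log_target, q0, n_samples, eps=0.1, n_steps=20, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    q = np.asarray(q0, dtype=float)
    samples = np.empty((n_samples, q.size))
    for i in range(n_samples):
        p = rng.standard_normal(q.size)                      # resample momentum
        q_new, p_new = leapfrog(q, p, grad_log_target, eps, n_steps)
        # Metropolis test on the Hamiltonian (negative log joint density)
        h_old = -log_target(q) + 0.5 * p @ p
        h_new = -log_target(q_new) + 0.5 * p_new @ p_new
        if np.log(rng.uniform()) < h_old - h_new:
            q = q_new
        samples[i] = q
    return samples

if __name__ == "__main__":
    log_target = lambda q: -0.5 * q @ q           # standard normal in 10 dimensions
    grad_log_target = lambda q: -q
    draws = hmc(log_target, grad_log_target, np.zeros(10), n_samples=5_000)
    print("sample mean:", draws.mean(axis=0).round(3))
```

Because the leapfrog map is volume preserving and reversible, the acceptance test only needs the change in the Hamiltonian; the thesis's non-volume-preserving variants require an additional Jacobian correction in that test.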
2

Murray, Iain Andrew. "Advances in Markov chain Monte Carlo methods". Thesis, University College London (University of London), 2007. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.487199.

Abstract:
Probability distributions over many variables occur frequently in Bayesian inference, statistical physics and simulation studies. Samples from distributions give insight into their typical behavior and can allow approximation of any quantity of interest, such as expectations or normalizing constants. Markov chain Monte Carlo (MCMC), introduced by Metropolis et al. (1953), allows sampling from distributions with intractable normalization, and remains one of the most important tools for approximate computation with probability distributions. While not needed by MCMC, normalizers are key quantities: in Bayesian statistics marginal likelihoods are needed for model comparison; in statistical physics many physical quantities relate to the partition function. In this thesis we propose and investigate several new Monte Carlo algorithms, both for evaluating normalizing constants and for improved sampling of distributions. Many MCMC correctness proofs rely on using reversible transition operators; this can lead to chains exploring by slow random walks. After reviewing existing MCMC algorithms, we develop a new framework for constructing non-reversible transition operators from existing reversible ones. Next we explore and extend MCMC-based algorithms for computing normalizing constants. In particular we develop a new MCMC operator and Nested Sampling approach for the Potts model. Our results demonstrate that these approaches can be superior to finding normalizing constants by annealing methods and can obtain better posterior samples. Finally we consider 'doubly-intractable' distributions with extra unknown normalizer terms that do not cancel in standard MCMC algorithms. We propose using several deterministic approximations for the unknown terms, and investigate their interaction with sampling algorithms. We then develop novel exact-sampling-based MCMC methods, the Exchange Algorithm and Latent Histories. For the first time these algorithms do not require separate approximation before sampling begins. Moreover, the Exchange Algorithm outperforms the only alternative sampling algorithm for doubly intractable distributions.
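The exchange algorithm mentioned in this abstract sidesteps the unknown normalizer by simulating an auxiliary data set exactly from the model at the proposed parameter, so that the intractable terms cancel in the acceptance ratio. The sketch below applies it to a deliberately trivial model, an exponential likelihood whose normalizing constant we pretend not to know, so that exact simulation is a one-line NumPy call; the prior, the proposal scale, and all variable names are assumptions made for illustration rather than the thesis's own setup.

```python
import numpy as np

def exchange_algorithm(x, n_iters=20_000, prop_sd=0.5, rng=None):
    """Exchange algorithm for theta in an unnormalized exponential model
    f(x | theta) = exp(-theta * sum(x)), pretending Z(theta) is unknown."""
    rng = np.random.default_rng() if rng is None else rng
    n = x.size
    log_f = lambda data, th: -th * data.sum()           # unnormalized log-likelihood
    log_prior = lambda th: -th                           # Exponential(1) prior, theta > 0
    theta = 1.0
    chain = np.empty(n_iters)
    for i in range(n_iters):
        # multiplicative random-walk proposal keeps theta positive
        theta_prop = theta * np.exp(prop_sd * rng.standard_normal())
        # exact simulation of an auxiliary data set at the proposed parameter
        w = rng.exponential(scale=1.0 / theta_prop, size=n)
        log_r = (log_f(x, theta_prop) + log_prior(theta_prop) + log_f(w, theta)
                 - log_f(x, theta) - log_prior(theta) - log_f(w, theta_prop)
                 + np.log(theta_prop) - np.log(theta))   # Jacobian of the log-scale proposal
        if np.log(rng.uniform()) < log_r:
            theta = theta_prop
        chain[i] = theta
    return chain

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.exponential(scale=1.0 / 2.5, size=50)     # synthetic data, true rate 2.5
    chain = exchange_algorithm(data, rng=rng)
    print("posterior mean of theta ~", chain[5_000:].mean())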
3

Graham, Matthew McKenzie. "Auxiliary variable Markov chain Monte Carlo methods". Thesis, University of Edinburgh, 2018. http://hdl.handle.net/1842/28962.

Abstract:
Markov chain Monte Carlo (MCMC) methods are a widely applicable class of algorithms for estimating integrals in statistical inference problems. A common approach in MCMC methods is to introduce additional auxiliary variables into the Markov chain state and perform transitions in the joint space of target and auxiliary variables. In this thesis we consider novel methods for using auxiliary variables within MCMC methods to allow approximate inference in otherwise intractable models and to improve sampling performance in models exhibiting challenging properties such as multimodality. We first consider the pseudo-marginal framework. This extends the Metropolis–Hastings algorithm to cases where we only have access to an unbiased estimator of the density of the target distribution. The resulting chains can sometimes show ‘sticking’ behaviour where long series of proposed updates are rejected. Further, the algorithms can be difficult to tune and it is not immediately clear how to generalise the approach to alternative transition operators. We show that if the auxiliary variables used in the density estimator are included in the chain state it is possible to use new transition operators such as those based on slice-sampling algorithms within a pseudo-marginal setting. This auxiliary pseudo-marginal approach leads to easier to tune methods and is often able to improve sampling efficiency over existing approaches. As a second contribution we consider inference in probabilistic models defined via a generative process with the probability density of the outputs of this process only implicitly defined. The approximate Bayesian computation (ABC) framework allows inference in such models when conditioning on the values of observed model variables by making the approximation that generated observed variables are ‘close’ rather than exactly equal to observed data. Although making the inference problem more tractable, the approximation error introduced in ABC methods can be difficult to quantify and standard algorithms tend to perform poorly when conditioning on high dimensional observations. This often requires further approximation by reducing the observations to lower dimensional summary statistics. We show how including all of the random variables used in generating model outputs as auxiliary variables in a Markov chain state can allow the use of more efficient and robust MCMC methods such as slice sampling and Hamiltonian Monte Carlo (HMC) within an ABC framework. In some cases this can allow inference when conditioning on the full set of observed values when standard ABC methods require reduction to lower dimensional summaries for tractability. Further we introduce a novel constrained HMC method for performing inference in a restricted class of differentiable generative models which allows conditioning the generated observed variables to be arbitrarily close to observed data while maintaining computational tractability. As a final topic we consider the use of an auxiliary temperature variable in MCMC methods to improve exploration of multimodal target densities and allow estimation of normalising constants. Existing approaches such as simulated tempering and annealed importance sampling use temperature variables which take on only a discrete set of values.
The performance of these methods can be sensitive to the number and spacing of the temperature values used, and the discrete nature of the temperature variable prevents the use of gradient-based methods such as HMC to update the temperature alongside the target variables. We introduce new MCMC methods which instead use a continuous temperature variable. This both removes the need to tune the choice of discrete temperature values and allows the temperature variable to be updated jointly with the target variables within a HMC method.
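As background for the pseudo-marginal framework described above, here is a minimal sketch of pseudo-marginal Metropolis-Hastings: the intractable marginal likelihood is replaced by an unbiased importance-sampling estimate, and the accepted estimate is recycled rather than recomputed, which is what keeps the chain's stationary distribution exact. The toy latent-variable model, the prior, and the particle count below are illustrative assumptions, not the thesis's experiments.

```python
import numpy as np

def pseudo_marginal_mh(y, n_iters=10_000, n_particles=30, prop_sd=0.5, rng=None):
    """Pseudo-marginal Metropolis-Hastings for a toy latent-variable model:
    z ~ N(theta, 1), y | z ~ N(z, 1).  The marginal likelihood p(y | theta)
    is replaced by an unbiased Monte Carlo estimate over the latent z."""
    rng = np.random.default_rng() if rng is None else rng

    def log_lik_estimate(theta):
        # draw latent variables from p(z | theta) and average p(y | z) over them
        z = theta + rng.standard_normal((n_particles, y.size))
        log_terms = -0.5 * (y - z) ** 2 - 0.5 * np.log(2 * np.pi)
        # unbiased estimate of each p(y_i | theta), combined across observations
        return np.sum(np.logaddexp.reduce(log_terms, axis=0) - np.log(n_particles))

    log_prior = lambda th: -0.5 * th**2 / 100.0          # N(0, 10^2) prior on theta

    theta, log_lik = 0.0, log_lik_estimate(0.0)
    chain = np.empty(n_iters)
    for i in range(n_iters):
        theta_prop = theta + prop_sd * rng.standard_normal()
        log_lik_prop = log_lik_estimate(theta_prop)       # fresh noisy estimate
        log_r = log_lik_prop + log_prior(theta_prop) - log_lik - log_prior(theta)
        if np.log(rng.uniform()) < log_r:
            # crucially, the accepted estimate is recycled, never recomputed
            theta, log_lik = theta_prop, log_lik_prop
        chain[i] = theta
    return chain

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    y = rng.normal(loc=2.0, scale=np.sqrt(2.0), size=40)  # synthetic data, theta = 2
    chain = pseudo_marginal_mh(y, rng=rng)
    print("posterior mean of theta ~", chain[2_000:].mean())
```

The ‘sticking’ behaviour mentioned in the abstract shows up here when an unusually lucky likelihood estimate is accepted and then recycled for many iterations; increasing n_particles reduces the estimator's variance and mitigates it.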
4

Xu, Jason Qian. "Markov Chain Monte Carlo and Non-Reversible Methods". Thesis, The University of Arizona, 2012. http://hdl.handle.net/10150/244823.

Abstract:
The bulk of Markov chain Monte Carlo applications make use of reversible chains, relying on the Metropolis-Hastings algorithm or similar methods. While reversible chains have the advantage of being relatively easy to analyze, it has been shown that non-reversible chains may outperform them in various scenarios. Neal proposes an algorithm that transforms a general reversible chain into a non-reversible chain with a construction that does not increase the asymptotic variance. These modified chains work to avoid diffusive backtracking behavior which causes Markov chains to be trapped in one position for too long. In this paper, we provide an introduction to MCMC, and discuss the Metropolis algorithm and Neal’s algorithm. We introduce a decaying memory algorithm inspired by Neal’s idea, and then analyze and compare the performance of these chains on several examples.
5

Zhang, Yichuan. "Scalable geometric Markov chain Monte Carlo". Thesis, University of Edinburgh, 2016. http://hdl.handle.net/1842/20978.

Abstract:
Markov chain Monte Carlo (MCMC) is one of the most popular statistical inference methods in machine learning. Recent work shows that a significant improvement of the statistical efficiency of MCMC on complex distributions can be achieved by exploiting geometric properties of the target distribution. This is known as geometric MCMC. However, many such methods, like Riemannian manifold Hamiltonian Monte Carlo (RMHMC), are computationally challenging to scale up to high dimensional distributions. The primary goal of this thesis is to develop novel geometric MCMC methods applicable to large-scale problems. To overcome the computational bottleneck of computing second order derivatives in geometric MCMC, I propose an adaptive MCMC algorithm using an efficient approximation based on Limited memory BFGS. I also propose a simplified variant of RMHMC that is able to work effectively on a larger scale than previous methods. Finally, I address an important limitation of geometric MCMC, namely that it is only available for continuous distributions. I investigate a relaxation of discrete variables to continuous variables that allows us to apply the geometric methods. This is a new direction of MCMC research which is of potential interest to many applications. The effectiveness of the proposed methods is demonstrated on a wide range of popular models, including generalised linear models, conditional random fields (CRFs), hierarchical models and Boltzmann machines.
6

Pereira, Fernanda Chaves. "Bayesian Markov chain Monte Carlo methods in general insurance". Thesis, City University London, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.342720.

7

Cheal, Ryan. "Markov Chain Monte Carlo methods for simulation in pedigrees". Thesis, University of Bath, 1996. https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.362254.

8

Durmus, Alain. "High dimensional Markov chain Monte Carlo methods : theory, methods and applications". Thesis, Université Paris-Saclay (ComUE), 2016. http://www.theses.fr/2016SACLT001/document.

Abstract:
The subject of this thesis is the analysis of Markov Chain Monte Carlo (MCMC) methods and the development of new methodologies to sample from a high dimensional distribution. Our work is divided into three main topics. The first problem addressed in this manuscript is the convergence of Markov chains in Wasserstein distance. Geometric and sub-geometric convergence bounds with explicit constants are derived under appropriate conditions. These results are then applied to the study of MCMC algorithms. The first analyzed algorithm is an alternative scheme to the Metropolis Adjusted Langevin algorithm, for which explicit geometric convergence bounds are established. The second method is the pre-Conditioned Crank-Nicolson algorithm. It is shown that, under mild assumptions, the Markov chain associated with this algorithm is sub-geometrically ergodic in an appropriate Wasserstein distance. The second topic of this thesis is the study of the Unadjusted Langevin Algorithm (ULA). We are first interested in explicit convergence bounds in total variation under different kinds of assumptions on the potential associated with the target distribution. In particular, we pay attention to the dependence of the algorithm on the dimension of the state space. The case of fixed step sizes as well as the case of nonincreasing sequences of step sizes are dealt with. When the target density is strongly log-concave, explicit bounds in Wasserstein distance are established. These results are then used to derive new bounds in total variation distance, which improve on those previously derived under weaker conditions on the target density. The last part tackles new optimal scaling results for Metropolis-Hastings type algorithms. First, we extend the pioneering result on the optimal scaling of the random walk Metropolis algorithm to target densities which are differentiable in Lp mean for p ≥ 2. Then, we derive new Metropolis-Hastings type algorithms which have a better optimal scaling compared to the MALA algorithm. Finally, the stability and the convergence in total variation of these new algorithms are studied.
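The unadjusted Langevin algorithm (ULA) analyzed in the second part of the thesis has a particularly short implementation: a discretized Langevin diffusion with no accept/reject correction, so its stationary distribution only approximates the target and the step size controls the bias. The strongly log-concave Gaussian target and the step size below are arbitrary illustrative choices.

```python
import numpy as np

def unadjusted_langevin(grad_log_target, x0, n_samples, step=0.05, rng=None):
    """Unadjusted Langevin algorithm (ULA):
    x_{k+1} = x_k + step * grad log pi(x_k) + sqrt(2 * step) * N(0, I)."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    samples = np.empty((n_samples, x.size))
    for k in range(n_samples):
        x = x + step * grad_log_target(x) + np.sqrt(2.0 * step) * rng.standard_normal(x.size)
        samples[k] = x
    return samples

if __name__ == "__main__":
    # strongly log-concave toy target: standard normal in 50 dimensions
    grad_log_target = lambda x: -x
    draws = unadjusted_langevin(grad_log_target, np.zeros(50), n_samples=10_000)
    print("average squared norm (dimension 50, plus a small discretization bias):",
          (draws[2_000:] ** 2).sum(axis=1).mean())
```

Shrinking the step size reduces the discretization bias but slows mixing, which is the trade-off the thesis quantifies, including its dependence on the dimension of the state space.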
9

Wu, Miaodan. "Markov chain Monte Carlo methods applied to Bayesian data analysis". Thesis, University of Cambridge, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.625087.

10

Paul, Rajib. "Theoretical And Algorithmic Developments In Markov Chain Monte Carlo". The Ohio State University, 2008. http://rave.ohiolink.edu/etdc/view?acc_num=osu1218184168.


Books on the topic "Markov chain Monte Carlo methods"

1

Liang, Faming, Chuanhai Liu, and Raymond J. Carroll. Advanced Markov Chain Monte Carlo Methods. Chichester, UK: John Wiley & Sons, Ltd, 2010. http://dx.doi.org/10.1002/9780470669723.

2

Handbook of Markov Chain Monte Carlo. Boca Raton: Taylor & Francis, 2011.

3

Gilks, W. R., S. Richardson, and D. J. Spiegelhalter, eds. Markov chain Monte Carlo in practice. Boca Raton, Fla: Chapman & Hall, 1998.

4

Gilks, W. R., S. Richardson, and D. J. Spiegelhalter, eds. Markov chain Monte Carlo in practice. London: Chapman & Hall, 1996.

5

Liang, F. Advanced Markov chain Monte Carlo methods: Learning from past samples. Hoboken, NJ: Wiley, 2010.

6

Joseph, Anosh. Markov Chain Monte Carlo Methods in Quantum Field Theories. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-46044-0.

7

Kendall, W. S., F. Liang, and J. S. Wang, eds. Markov chain Monte Carlo: Innovations and applications. Singapore: World Scientific, 2005.

8

Winkler, Gerhard, ed. Image analysis, random fields and Markov chain Monte Carlo methods: A mathematical introduction. 2nd ed. Berlin: Springer, 2003.

9

Winkler, Gerhard. Image analysis, random fields and Markov chain Monte Carlo methods: A mathematical introduction. 2nd ed. Berlin: Springer, 2003.

10

Winkler, Gerhard. Image Analysis, Random Fields and Markov Chain Monte Carlo Methods. Berlin, Heidelberg: Springer Berlin Heidelberg, 2003. http://dx.doi.org/10.1007/978-3-642-55760-6.


Book chapters on the topic "Markov chain Monte Carlo methods"

1

Barbu, Adrian, and Song-Chun Zhu. "Markov Chain Monte Carlo: The Basics". In Monte Carlo Methods, 49–70. Singapore: Springer Singapore, 2020. http://dx.doi.org/10.1007/978-981-13-2971-5_3.

2

Barbu, Adrian, and Song-Chun Zhu. "Data Driven Markov Chain Monte Carlo". In Monte Carlo Methods, 211–80. Singapore: Springer Singapore, 2020. http://dx.doi.org/10.1007/978-981-13-2971-5_8.

3

Li, Hang. "Markov Chain Monte Carlo Method". In Machine Learning Methods, 401–37. Singapore: Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-99-3917-6_19.

4

Ó Ruanaidh, Joseph J. K., and William J. Fitzgerald. "Markov Chain Monte Carlo Methods". In Numerical Bayesian Methods Applied to Signal Processing, 69–95. New York, NY: Springer New York, 1996. http://dx.doi.org/10.1007/978-1-4612-0717-7_4.

5

Robert, Christian P., and Sylvia Richardson. "Markov Chain Monte Carlo Methods". In Discretization and MCMC Convergence Assessment, 1–25. New York, NY: Springer New York, 1998. http://dx.doi.org/10.1007/978-1-4612-1716-9_1.

6

Lange, Kenneth. "Markov Chain Monte Carlo Methods". In Mathematical and Statistical Methods for Genetic Analysis, 142–63. New York, NY: Springer New York, 1997. http://dx.doi.org/10.1007/978-1-4757-2739-5_9.

7

Hörmann, Wolfgang, Josef Leydold, and Gerhard Derflinger. "Markov Chain Monte Carlo Methods". In Automatic Nonuniform Random Variate Generation, 363–86. Berlin, Heidelberg: Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-662-05946-3_14.

8

Albert, Jim. "Markov Chain Monte Carlo Methods". In Bayesian Computation with R, 117–52. New York, NY: Springer New York, 2009. http://dx.doi.org/10.1007/978-0-387-92298-0_6.

9

Neifer, Thomas. "Markov Chain Monte Carlo Methods". In Springer Texts in Business and Economics, 167–83. Cham: Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-47206-0_9.

10

Chib, Siddhartha. "Markov Chain Monte Carlo Methods". In The New Palgrave Dictionary of Economics, 1–11. London: Palgrave Macmillan UK, 2008. http://dx.doi.org/10.1057/978-1-349-95121-5_2042-1.


Conference papers on the topic "Markov chain Monte Carlo methods"

1

Runnalls, A. "Monte Carlo Markov chain methods for tracking". In IEE Colloquium on 'Algorithms for Target Tracking'. IEE, 1995. http://dx.doi.org/10.1049/ic:19950668.

2

Wadsley, Andrew W. "Markov Chain Monte Carlo Methods for Reserves Estimation". In International Petroleum Technology Conference. International Petroleum Technology Conference, 2005. http://dx.doi.org/10.2523/iptc-10065-ms.

3

Somersalo, Erkki, Jari P. Kaipio, Marko J. Vauhkonen, D. Baroudi, and S. Jaervenpaeae. "Impedance imaging and Markov chain Monte Carlo methods". In Optical Science, Engineering and Instrumentation '97, edited by Randall L. Barbour, Mark J. Carvlin, and Michael A. Fiddy. SPIE, 1997. http://dx.doi.org/10.1117/12.279723.

4

Wadsley, Andrew W. "Markov Chain Monte Carlo Methods for Reserves Estimation". In International Petroleum Technology Conference. International Petroleum Technology Conference, 2005. http://dx.doi.org/10.2523/10065-ms.

5

Gerencser, L., S. D. Hill, Z. Vago e Z. Vincze. "Discrete optimization, SPSA and Markov chain Monte Carlo methods". In Proceedings of the 2004 American Control Conference. IEEE, 2004. http://dx.doi.org/10.23919/acc.2004.1384507.

6

de Figueiredo, L. Passos, D. Grana, M. Roisenberg, and B. Rodrigues. "Markov Chain Monte Carlo Methods for High-dimensional Mixture Distributions". In Petroleum Geostatistics 2019. European Association of Geoscientists & Engineers, 2019. http://dx.doi.org/10.3997/2214-4609.201902273.

7

Nabarrete, Airton, José Antonio Hernandes, and Rafael Beal Macedo. "Bayesian Dynamic Model Updating Using Markov Chain Monte Carlo Methods". In 24th ABCM International Congress of Mechanical Engineering. ABCM, 2017. http://dx.doi.org/10.26678/abcm.cobem2017.cob17-1646.

8

Runnalls, A. "Low-observable maritime tracking using Monte Carlo Markov chain methods". In IEE Colloquium on Target Tracking and Data Fusion. IEE, 1996. http://dx.doi.org/10.1049/ic:19961354.

9

van Lieshout, M. N. M. "Markov chain Monte Carlo methods for clustering of image features". In Fifth International Conference on Image Processing and its Applications. IEE, 1995. http://dx.doi.org/10.1049/cp:19950657.

10

Akoum, S., R. Peng, R. R. Chen e B. Farhang-Boroujeny. "Markov Chain Monte Carlo Detection Methods for High SNR Regimes". In ICC 2009 - 2009 IEEE International Conference on Communications. IEEE, 2009. http://dx.doi.org/10.1109/icc.2009.5199166.


Reports by organizations on the topic "Markov chain Monte Carlo methods"

1

Reddy, S., and A. Crisp. Deep Neural Network Informed Markov Chain Monte Carlo Methods. Office of Scientific and Technical Information (OSTI), November 2023. http://dx.doi.org/10.2172/2283285.

2

Doss, Hani. Studies in Reliability Theory and Survival Analysis and in Markov Chain Monte Carlo Methods. Fort Belvoir, VA: Defense Technical Information Center, September 1998. http://dx.doi.org/10.21236/ada367895.

3

Doss, Hani. Statistical Inference for Coherent Systems from Partial Information and Markov Chain Monte Carlo Methods. Fort Belvoir, VA: Defense Technical Information Center, January 1996. http://dx.doi.org/10.21236/ada305676.

4

Doss, Hani. Studies in Reliability Theory and Survival Analysis and in Markov Chain Monte Carlo Methods. Fort Belvoir, VA: Defense Technical Information Center, December 1998. http://dx.doi.org/10.21236/ada379998.

5

Sethuraman, Jayaram. Easily Verifiable Conditions for the Convergence of the Markov Chain Monte Carlo Method. Fort Belvoir, VA: Defense Technical Information Center, December 1995. http://dx.doi.org/10.21236/ada308874.

6

Glaser, R., G. Johannesson, S. Sengupta, B. Kosovic, S. Carle, G. Franz, R. Aines et al. Stochastic Engine Final Report: Applying Markov Chain Monte Carlo Methods with Importance Sampling to Large-Scale Data-Driven Simulation. Office of Scientific and Technical Information (OSTI), March 2004. http://dx.doi.org/10.2172/15009813.

7

Knopp, Jeremy S., and Fumio Kojima. Inverse Problem for Electromagnetic Propagation in a Dielectric Medium using Markov Chain Monte Carlo Method (Preprint). Fort Belvoir, VA: Defense Technical Information Center, August 2012. http://dx.doi.org/10.21236/ada565876.

8

Warnes, Gregory R. The Normal Kernel Coupler: An Adaptive Markov Chain Monte Carlo Method for Efficiently Sampling From Multi-Modal Distributions. Fort Belvoir, VA: Defense Technical Information Center, March 2001. http://dx.doi.org/10.21236/ada459460.

9

Zang, Emma. Bayesian Statistics for Social and Health Scientists in R and Python. Instats Inc., 2023. http://dx.doi.org/10.61700/obtt1o65iw3ui469.

Abstract:
This seminar will introduce you to Bayesian statistics, which are increasingly popular and offer a powerful alternative to more traditional forms of statistical analysis. Targeted at a social and health science audience, the seminar will cover the fundamentals of Bayesian inference and illustrate a variety of techniques with applied examples of Bayesian regressions and hierarchical models. You will gain an understanding of Markov chain Monte Carlo (MCMC) methods and learn how to develop and validate Bayesian models so that you can apply them in your daily research, with the kinds of intuitive inferences that Bayesian methods allow. An official Instats certificate of completion is provided at the conclusion of the seminar. For European PhD students, the seminar offers 2 ECTS Equivalent points.
10

Zang, Emma. Bayesian Statistics for Social and Health Scientists in R and Python + 2 Free Seminars. Instats Inc., 2022. http://dx.doi.org/10.61700/bgfpomu3wdhe5469.

Abstract:
This seminar will introduce you to Bayesian statistics, which are increasingly popular and offer a powerful alternative to more traditional forms of statistical analysis. Targeted at a social and health science audience, the seminar will cover the fundamentals of Bayesian inference and illustrate a variety of techniques with applied examples of Bayesian regressions and hierarchical models. You will gain an understanding of Markov chain Monte Carlo (MCMC) methods and learn how to develop and validate Bayesian models so that you can apply them in your daily research, with the kinds of intuitive inferences that Bayesian methods allow. When purchasing the seminar you will be freely enrolled in two on-demand seminars for Path Analysis in R and CFA/SEM in R with Bayesian estimation by Professor Zyphur, helping you to extend your learning and offering a substantial value. An official Instats certificate of completion is provided at the conclusion of the seminar. For European PhD students, the seminar offers 2 ECTS Equivalent points.
