Scholarly literature on the topic "Markov chain Monte Carlo samplers"

Below are lists of current articles, books, dissertations, conference papers, and other scholarly sources on the topic "Markov chain Monte Carlo samplers". Where the metadata provide them, entries are accompanied by abstracts of the works.

Journal articles on the topic "Markov chain Monte Carlo samplers"

1

South, L. F., A. N. Pettitt, and C. C. Drovandi. "Sequential Monte Carlo Samplers with Independent Markov Chain Monte Carlo Proposals." Bayesian Analysis 14, no. 3 (September 2019): 753–76. http://dx.doi.org/10.1214/18-ba1129.

2

Everitt, Richard G., Richard Culliford, Felipe Medina-Aguayo, and Daniel J. Wilson. "Sequential Monte Carlo with transformations." Statistics and Computing 30, no. 3 (November 17, 2019): 663–76. http://dx.doi.org/10.1007/s11222-019-09903-y.

Abstract:
This paper examines methodology for performing Bayesian inference sequentially on a sequence of posteriors on spaces of different dimensions. For this, we use sequential Monte Carlo samplers, introducing the innovation of using deterministic transformations to move particles effectively between target distributions with different dimensions. This approach, combined with adaptive methods, yields an extremely flexible and general algorithm for Bayesian model comparison that is suitable for use in applications where the acceptance rate in reversible jump Markov chain Monte Carlo is low. We use this approach on model comparison for mixture models, and for inferring coalescent trees sequentially, as data arrives.
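The sequential Monte Carlo samplers this entry builds on admit a compact illustration. Below is a minimal, hypothetical Python sketch of a tempered SMC sampler (a simpler cousin of the transformation-based sampler in the paper, not the authors' code): particles are reweighted along a temperature ladder, resampled when the effective sample size drops, and rejuvenated with a random-walk Metropolis move.

```python
import numpy as np

def smc_sampler(log_target, n_particles=500, n_steps=10, step_size=0.5, seed=0):
    """Tempered SMC sampler: anneal from a N(0, 1) prior to the target.

    At each temperature the importance weights are updated with the
    incremental tempering ratio, particles are resampled when the effective
    sample size (ESS) drops below half the particle count, and a random-walk
    Metropolis move rejuvenates the particle set.
    """
    rng = np.random.default_rng(seed)
    log_prior = lambda z: -0.5 * z**2
    x = rng.normal(size=n_particles)              # draws from the prior
    logw = np.zeros(n_particles)
    betas = np.linspace(0.0, 1.0, n_steps + 1)
    for b_prev, b in zip(betas[:-1], betas[1:]):
        logw += (b - b_prev) * (log_target(x) - log_prior(x))
        w = np.exp(logw - logw.max())
        w /= w.sum()
        if 1.0 / np.sum(w**2) < n_particles / 2:  # ESS-triggered resampling
            x = rng.choice(x, size=n_particles, p=w)
            logw = np.zeros(n_particles)
        # one random-walk Metropolis move targeting the tempered density
        log_tempered = lambda z, b=b: (1 - b) * log_prior(z) + b * log_target(z)
        prop = x + step_size * rng.normal(size=n_particles)
        accept = np.log(rng.uniform(size=n_particles)) < log_tempered(prop) - log_tempered(x)
        x = np.where(accept, prop, x)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    return x, w

# toy target: N(3, 1), reached by annealing away from the N(0, 1) prior
samples, weights = smc_sampler(lambda z: -0.5 * (z - 3.0)**2)
mean_est = np.sum(weights * samples)
```

The weighted mean `mean_est` should land near the target mean of 3; the dimension-changing deterministic transformations of the paper would replace the simple tempering path used here.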
3

Xu, Xiaopeng, Chuancai Liu, Hongji Yang, and Xiaochun Zhang. "A Multi-Trajectory Monte Carlo Sampler." 網際網路技術學刊 (Journal of Internet Technology) 23, no. 5 (September 2022): 1117–28. http://dx.doi.org/10.53106/160792642022092305020.

Abstract:
Markov Chain Monte Carlo techniques based on Hamiltonian dynamics can sample the first or last principal components of multivariate probability models using simulated trajectories. However, when components' scales span orders of magnitude, these approaches may be unable to access all components adequately. While it is possible to reconcile the first and last components by alternating between two different types of trajectories, the sampling of intermediate components may be imprecise. In this paper, a function generalizing the kinetic energies of Hamiltonian Monte Carlo and Riemannian Manifold Hamiltonian Monte Carlo is proposed, and it is found that the methods based on a specific form of the function can more accurately sample normal distributions. Additionally, the multi-particle algorithm's reasoning is given after a review of some statistical ideas.
4

Dellaportas, Petros, and Ioannis Kontoyiannis. "Control variates for estimation based on reversible Markov chain Monte Carlo samplers." Journal of the Royal Statistical Society: Series B (Statistical Methodology) 74, no. 1 (November 3, 2011): 133–61. http://dx.doi.org/10.1111/j.1467-9868.2011.01000.x.

5

Jones, Galin L., Gareth O. Roberts, and Jeffrey S. Rosenthal. "Convergence of Conditional Metropolis-Hastings Samplers." Advances in Applied Probability 46, no. 2 (June 2014): 422–45. http://dx.doi.org/10.1239/aap/1401369701.

Abstract:
We consider Markov chain Monte Carlo algorithms which combine Gibbs updates with Metropolis-Hastings updates, resulting in a conditional Metropolis-Hastings sampler (CMH sampler). We develop conditions under which the CMH sampler will be geometrically or uniformly ergodic. We illustrate our results by analysing a CMH sampler used for drawing Bayesian inferences about the entire sample path of a diffusion process, based only upon discrete observations.
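As a concrete toy version of the conditional Metropolis-Hastings (CMH) construction described in this abstract, the sketch below mixes an exact Gibbs update for one coordinate with a random-walk MH update for the other. The density and tuning constants are illustrative choices of ours, not from the paper.

```python
import numpy as np

def cmh_sampler(n_iter=20000, step=1.0, seed=1):
    """Conditional Metropolis-Hastings sampler for the toy density
    p(x, y) proportional to exp(-(x - y)**2 / 2 - y**4 / 4).

    x | y is N(y, 1), so it gets an exact Gibbs draw; y | x is non-standard,
    so it gets a random-walk Metropolis-Hastings update on its conditional.
    """
    rng = np.random.default_rng(seed)
    log_cond_y = lambda y_, x_: -0.5 * (x_ - y_)**2 - 0.25 * y_**4
    x, y = 0.0, 0.0
    out = np.empty((n_iter, 2))
    for i in range(n_iter):
        x = rng.normal(y, 1.0)            # Gibbs update: x | y ~ N(y, 1)
        prop = y + step * rng.normal()    # MH update on the conditional of y | x
        if np.log(rng.uniform()) < log_cond_y(prop, x) - log_cond_y(y, x):
            y = prop
        out[i] = x, y
    return out

chain = cmh_sampler()
```

By symmetry both coordinates have mean zero, and the marginal of y is proportional to exp(-y^4/4), so the chain's empirical moments give a quick sanity check on the sampler.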
6

Jones, Galin L., Gareth O. Roberts, and Jeffrey S. Rosenthal. "Convergence of Conditional Metropolis-Hastings Samplers." Advances in Applied Probability 46, no. 02 (June 2014): 422–45. http://dx.doi.org/10.1017/s0001867800007151.

Abstract:
We consider Markov chain Monte Carlo algorithms which combine Gibbs updates with Metropolis-Hastings updates, resulting in a conditional Metropolis-Hastings sampler (CMH sampler). We develop conditions under which the CMH sampler will be geometrically or uniformly ergodic. We illustrate our results by analysing a CMH sampler used for drawing Bayesian inferences about the entire sample path of a diffusion process, based only upon discrete observations.
7

Levy, Roy. "The Rise of Markov Chain Monte Carlo Estimation for Psychometric Modeling." Journal of Probability and Statistics 2009 (2009): 1–18. http://dx.doi.org/10.1155/2009/537139.

Abstract:
Markov chain Monte Carlo (MCMC) estimation strategies represent a powerful approach to estimation in psychometric models. Popular MCMC samplers and their alignment with Bayesian approaches to modeling are discussed. Key historical and current developments of MCMC are surveyed, emphasizing how MCMC allows the researcher to overcome the limitations of other estimation paradigms, facilitates the estimation of models that might otherwise be intractable, and frees the researcher from certain possible misconceptions about the models.
8

Kilic, Zeliha, Max Schweiger, Camille Moyer, and Steve Pressé. "Monte Carlo samplers for efficient network inference." PLOS Computational Biology 19, no. 7 (July 18, 2023): e1011256. http://dx.doi.org/10.1371/journal.pcbi.1011256.

Abstract:
Accessing information on an underlying network driving a biological process often involves interrupting the process and collecting snapshot data. When snapshot data are stochastic, the data’s structure necessitates a probabilistic description to infer underlying reaction networks. As an example, we may imagine wanting to learn gene state networks from the type of data collected in single molecule RNA fluorescence in situ hybridization (RNA-FISH). In the networks we consider, nodes represent network states, and edges represent biochemical reaction rates linking states. Simultaneously estimating the number of nodes and constituent parameters from snapshot data remains a challenging task in part on account of data uncertainty and timescale separations between kinetic parameters mediating the network. While parametric Bayesian methods learn parameters given a network structure (with known node numbers) with rigorously propagated measurement uncertainty, learning the number of nodes and parameters with potentially large timescale separations remain open questions. Here, we propose a Bayesian nonparametric framework and describe a hybrid Bayesian Markov Chain Monte Carlo (MCMC) sampler directly addressing these challenges. In particular, in our hybrid method, Hamiltonian Monte Carlo (HMC) leverages local posterior geometries in inference to explore the parameter space; Adaptive Metropolis Hastings (AMH) learns correlations between plausible parameter sets to efficiently propose probable models; and Parallel Tempering takes into account multiple models simultaneously with tempered information content to augment sampling efficiency. We apply our method to synthetic data mimicking single molecule RNA-FISH, a popular snapshot method in probing transcriptional networks to illustrate the identified challenges inherent to learning dynamical models from these snapshots and how our method addresses them.
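The hybrid sampler in this abstract combines several ingredients; parallel tempering, one of them, is easy to show in isolation. Here is a self-contained sketch with a toy bimodal target and a temperature ladder of our own choosing (not the authors' implementation): tempered random-walk chains exchange states so that the cold chain can cross between well-separated modes.

```python
import numpy as np

def parallel_tempering(log_target, betas=(1.0, 0.5, 0.25, 0.1),
                       n_iter=30000, step=1.0, seed=2):
    """Parallel tempering: one random-walk Metropolis chain per inverse
    temperature beta, plus a proposed swap between a random adjacent pair
    each iteration. Only the beta = 1 chain targets the distribution itself.
    """
    rng = np.random.default_rng(seed)
    betas = np.asarray(betas)
    x = np.zeros(len(betas))
    cold = np.empty(n_iter)
    for i in range(n_iter):
        # within-chain moves, with larger steps for hotter chains
        prop = x + step / np.sqrt(betas) * rng.normal(size=len(betas))
        accept = np.log(rng.uniform(size=len(betas))) < betas * (log_target(prop) - log_target(x))
        x = np.where(accept, prop, x)
        # swap move between a random adjacent temperature pair
        j = rng.integers(len(betas) - 1)
        log_swap = (betas[j] - betas[j + 1]) * (log_target(x[j + 1]) - log_target(x[j]))
        if np.log(rng.uniform()) < log_swap:
            x[j], x[j + 1] = x[j + 1], x[j]
        cold[i] = x[0]
    return cold

# bimodal toy target: equal mixture of N(-3, 0.5**2) and N(3, 0.5**2)
log_target = lambda z: np.logaddexp(-0.5 * ((z + 3.0) / 0.5)**2,
                                    -0.5 * ((z - 3.0) / 0.5)**2)
samples = parallel_tempering(log_target)
```

A plain random-walk chain at beta = 1 would almost never cross between the modes at -3 and +3; the hot chains do, and the swap moves hand those crossings down the ladder.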
9

Guha, Subharup, Steven N. MacEachern, and Mario Peruggia. "Benchmark Estimation for Markov chain Monte Carlo Samples." Journal of Computational and Graphical Statistics 13, no. 3 (September 2004): 683–701. http://dx.doi.org/10.1198/106186004x2598.

10

Siems, Tobias. "Markov Chain Monte Carlo on finite state spaces." Mathematical Gazette 104, no. 560 (June 18, 2020): 281–87. http://dx.doi.org/10.1017/mag.2020.51.

Abstract:
We elaborate the idea behind Markov chain Monte Carlo (MCMC) methods in a mathematically coherent, yet simple and understandable way. To this end, we prove a pivotal convergence theorem for finite Markov chains and a minimal version of the Perron-Frobenius theorem. Subsequently, we briefly discuss two fundamental MCMC methods, the Gibbs and Metropolis-Hastings sampler. Only very basic knowledge about matrices, convergence of real sequences and probability theory is required.
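The finite-state setting of this article can be reproduced directly with a transition matrix. The sketch below is a standard construction, not code from the paper: it builds the Metropolis-Hastings matrix for a four-point target with a uniform proposal and checks both stationarity and the Perron-Frobenius-style convergence of its powers.

```python
import numpy as np

# target distribution on the finite state space {0, 1, 2, 3}
pi = np.array([0.1, 0.2, 0.3, 0.4])
n = len(pi)

# Metropolis-Hastings transition matrix with a uniform proposal over the
# other states: propose j != i with prob 1/(n-1), accept with min(1, pi_j/pi_i)
P = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        if i != j:
            P[i, j] = min(1.0, pi[j] / pi[i]) / (n - 1)
    P[i, i] = 1.0 - P[i].sum()          # rejected proposals stay put

# stationarity: pi P = pi (detailed balance holds by construction)
stationary_gap = np.abs(pi @ P - pi).max()

# convergence: every row of P^k approaches pi as k grows
convergence_gap = np.abs(np.linalg.matrix_power(P, 50) - pi).max()
```

Detailed balance holds because pi_i P_ij = min(pi_i, pi_j) / (n - 1) is symmetric in i and j, which is exactly the finite-state argument the article develops.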

Dissertations on the topic "Markov chain Monte Carlo samplers"

1

Guha, Subharup. "Benchmark estimation for Markov Chain Monte Carlo samplers." The Ohio State University, 2004. http://rave.ohiolink.edu/etdc/view?acc_num=osu1085594208.

2

Sisson, Scott Antony. "Markov chains for genetics and extremes." Thesis, University of Bristol, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.391095.

3

Pang, Wan-Kai. "Modelling ordinal categorical data : a Gibbs sampler approach." Thesis, University of Southampton, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.323876.

4

Verhelst, Norman D., Reinhold Hatzinger, and Patrick Mair. "The Rasch Sampler." Foundation for Open Access Statistics, 2007. http://dx.doi.org/10.18637/jss.v020.i04.

Abstract:
The Rasch sampler is an efficient algorithm to sample binary matrices with given marginal sums. It is a Markov chain Monte Carlo (MCMC) algorithm. The program can handle matrices of up to 1024 rows and 64 columns. A special option allows sampling square matrices with given marginals and a fixed main diagonal, a problem prominent in social network analysis. In all cases the stationary distribution is uniform. The user has control over the serial dependency. (authors' abstract)
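The margin-preserving MCMC idea behind the Rasch sampler can be illustrated with the classic 2x2 "checkerboard swap" move. This is a simplified stand-in for the package's algorithm, not its actual implementation: flipping a checkerboard submatrix changes the matrix while leaving all row and column sums fixed, and since the move is symmetric, the resulting chain has a uniform stationary distribution over matrices with the given margins.

```python
import numpy as np

def sample_fixed_margins(A, n_steps=5000, seed=3):
    """Margin-preserving MCMC on binary matrices: repeatedly pick two rows
    and two columns and, when the 2x2 submatrix is a checkerboard
    ([[1,0],[0,1]] or [[0,1],[1,0]]), flip it. Row and column sums are
    invariant under this move."""
    rng = np.random.default_rng(seed)
    A = A.copy()
    n, m = A.shape
    for _ in range(n_steps):
        r = rng.choice(n, 2, replace=False)
        c = rng.choice(m, 2, replace=False)
        sub = A[np.ix_(r, c)]
        if sub[0, 0] == sub[1, 1] and sub[0, 1] == sub[1, 0] and sub[0, 0] != sub[0, 1]:
            A[np.ix_(r, c)] = 1 - sub    # flip the checkerboard
    return A

A0 = (np.arange(6)[:, None] + np.arange(8)[None, :]) % 2  # 6x8 binary matrix
B = sample_fixed_margins(A0)
```

After any number of steps the sampled matrix has exactly the row and column sums of the starting matrix, which is the defining constraint of the Rasch sampler's state space.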
5

Zhu, Qingyun. "Product Deletion and Supply Chain Management." Digital WPI, 2019. https://digitalcommons.wpi.edu/etd-dissertations/527.

Abstract:
One of the most significant changes in the evolution of modern business management is that organizations no longer compete as individual entities in the market, but as interlocking supply chains. Markets are no longer simply trading desks but dynamic ecosystems where people, organizations and the environment interact. Products and associated materials and resources are links that bridge supply chains from upstream (sourcing and manufacturing) to downstream (delivering and consuming). The lifecycle of a product plays a critical role in supply chains. Supply chains may be composed by, designed around, and modified for products. Product-related issues greatly impact supply chains. Existing studies have advanced product management and product lifecycle management literature through dimensions of product innovation, product growth, product line extensions, product efficiencies, and product acquisition. Product deletion, rationalization, or reduction research is limited but is a critical issue for many reasons. Sustainability is an important reason for this managerial decision. This study, grounded from multiple literature streams in both marketing and supply chain fields, identified relations and propositions to form a firm-level analysis on the role of supply chains in organizational product deletion decisions. Interviews, observational and archival data from international companies (i.e.: Australia, China, India, and Iran) contributed to the empirical support as case studies through a grounded theory approach. Bayesian analysis, an underused empirical analysis tool, was utilized to provide insights into this underdeveloped research stream; and its relationship to qualitative research enhances broader methodological understanding. Gibbs sampler and reversible jump Markov chain Monte Carlo (MCMC) simulation were used for Bayesian analysis based on collected data. The integrative findings are exploratory but provide insights for a number of research propositions.
6

Al, Hakmani Rahab. "Bayesian Estimation of Mixture IRT Models using NUTS." OpenSIUC, 2018. https://opensiuc.lib.siu.edu/dissertations/1641.

Abstract:
The No-U-Turn Sampler (NUTS) is a relatively new Markov chain Monte Carlo (MCMC) algorithm that avoids the random-walk behavior that common MCMC algorithms such as Gibbs sampling or Metropolis-Hastings usually exhibit. Given that NUTS can efficiently explore the entire space of the target distribution, the sampler converges to high-dimensional target distributions more quickly than other MCMC algorithms and is hence less computationally expensive. The focus of this study is on applying NUTS to one of the complex IRT models, specifically the two-parameter mixture IRT (Mix2PL) model, and further to examine its performance in estimating model parameters when sample size, test length, and number of latent classes are manipulated. The results indicate that overall, NUTS performs well in recovering model parameters. However, the recovery of the class membership of individual persons is not satisfactory for the three-class conditions. Also, the results indicate that WAIC performs better than LOO in recovering the number of latent classes, in terms of the proportion of the time the correct model was selected as the best-fitting model. However, when the effective number of parameters was also considered in selecting the best-fitting model, both fully Bayesian fit indices performed equally well. In addition, the results suggest that when multiple latent classes exist, using either fully Bayesian fit index (WAIC or LOO) would not select the conventional IRT model. On the other hand, when all examinees came from a single unified population, fitting MixIRT models using NUTS causes problems in convergence.
7

Lu, Pingbo. "Calibrated Bayes factors for model selection and model averaging." The Ohio State University, 2012. http://rave.ohiolink.edu/etdc/view?acc_num=osu1343396705.

8

Deng, Wei. "Multiple imputation for marginal and mixed models in longitudinal data with informative missingness." Connect to resource, 2005. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1126890027.

Abstract:
Thesis (Ph. D.)--Ohio State University, 2005.
Title from first page of PDF file. Document formatted into pages; contains xiii, 108 p.; also includes graphics. Includes bibliographical references (p. 104-108). Available online via OhioLINK's ETD Center
9

Wu, Yi-Fang. "Accuracy and variability of item parameter estimates from marginal maximum a posteriori estimation and Bayesian inference via Gibbs samplers." Diss., University of Iowa, 2015. https://ir.uiowa.edu/etd/5879.

Abstract:
Item response theory (IRT) uses a family of statistical models for estimating stable characteristics of items and examinees and defining how these characteristics interact in describing item and test performance. With a focus on the three-parameter logistic IRT (Birnbaum, 1968; Lord, 1980) model, the current study examines the accuracy and variability of the item parameter estimates from the marginal maximum a posteriori estimation via an expectation-maximization algorithm (MMAP/EM) and the Markov chain Monte Carlo Gibbs sampling (MCMC/GS) approach. In the study, the various factors which have an impact on the accuracy and variability of the item parameter estimates are discussed, and then further evaluated through a large scale simulation. The factors of interest include the composition and length of tests, the distribution of underlying latent traits, the size of samples, and the prior distributions of discrimination, difficulty, and pseudo-guessing parameters. The results of the two estimation methods are compared to determine the lower limit--in terms of test length, sample size, test characteristics, and prior distributions of item parameters--at which the methods can satisfactorily recover item parameters and efficiently function in reality. For practitioners, the results help to define limits on the appropriate use of the BILOG-MG (which implements MMAP/EM) and also, to assist in deciding the utility of OpenBUGS (which carries out MCMC/GS) for item parameter estimation in practice.
10

Fu, Shuting. "Bayesian Logistic Regression Model with Integrated Multivariate Normal Approximation for Big Data." Digital WPI, 2016. https://digitalcommons.wpi.edu/etd-theses/451.

Abstract:
The analysis of big data is of great interest today, and this comes with challenges of improving precision and efficiency in estimation and prediction. We study binary data with covariates from numerous small areas, where direct estimation is not reliable, and there is a need to borrow strength from the ensemble. This is generally done using Bayesian logistic regression, but because there are numerous small areas, the exact computation for the logistic regression model becomes challenging. Therefore, we develop an integrated multivariate normal approximation (IMNA) method for binary data with covariates within the Bayesian paradigm, and this procedure is assisted by the empirical logistic transform. Our main goal is to provide the theory of IMNA and to show that it is many times faster than the exact logistic regression method with almost the same accuracy. We apply the IMNA method to the health status binary data (excellent health or otherwise) from the Nepal Living Standards Survey with more than 60,000 households (small areas). We estimate the proportion of Nepalese in excellent health condition for each household. For these data IMNA gives estimates of the household proportions as precise as those from the logistic regression model and it is more than fifty times faster (20 seconds versus 1,066 seconds), and clearly this gain is transferable to bigger data problems.

Books on the topic "Markov chain Monte Carlo samplers"

1

Liang, F. Advanced Markov chain Monte Carlo methods: Learning from past samples. Hoboken, NJ: Wiley, 2010.

2

Handbook of Markov Chain Monte Carlo. Boca Raton: Taylor & Francis, 2011.

3

Liang, Faming, Chuanhai Liu, and Raymond J. Carroll. Advanced Markov Chain Monte Carlo Methods. Chichester, UK: John Wiley & Sons, Ltd, 2010. http://dx.doi.org/10.1002/9780470669723.

4

Gilks, W. R., S. Richardson, and D. J. Spiegelhalter, eds. Markov chain Monte Carlo in practice. Boca Raton, Fla: Chapman & Hall, 1998.

5

Gilks, W. R., S. Richardson, and D. J. Spiegelhalter, eds. Markov chain Monte Carlo in practice. London: Chapman & Hall, 1996.

6

Kendall, W. S., F. Liang, and J. S. Wang, eds. Markov chain Monte Carlo: Innovations and applications. Singapore: World Scientific, 2005.

7

Joseph, Anosh. Markov Chain Monte Carlo Methods in Quantum Field Theories. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-46044-0.

8

Gamerman, Dani. Markov chain Monte Carlo: Stochastic simulation for Bayesian inference. London: Chapman & Hall, 1997.

9

Lopes, Hedibert Freitas, ed. Markov chain Monte Carlo: Stochastic simulation for Bayesian inference. 2nd ed. Boca Raton: Taylor & Francis, 2006.

10

Markov chain Monte Carlo: Stochastic simulation for Bayesian inference. London: Chapman & Hall, 1997.


Book chapters on the topic "Markov chain Monte Carlo samplers"

1

Keith, Jonathan M., and Christian M. Davey. "Bayesian Approaches to the Design of Markov Chain Monte Carlo Samplers." In Monte Carlo and Quasi-Monte Carlo Methods 2012, 455–66. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-41095-6_22.

2

Aoki, Satoshi, Hisayuki Hara, and Akimichi Takemura. "Markov Chain Monte Carlo Methods over Discrete Sample Space." In Springer Series in Statistics, 23–31. New York, NY: Springer New York, 2012. http://dx.doi.org/10.1007/978-1-4614-3719-2_2.

3

Tanner, Martin A. "Markov Chain Monte Carlo: The Gibbs Sampler and the Metropolis Algorithm." In Tools for Statistical Inference, 137–92. New York, NY: Springer New York, 1996. http://dx.doi.org/10.1007/978-1-4612-4024-2_6.

4

Kitchen, Nathan, and Andreas Kuehlmann. "A Markov Chain Monte Carlo Sampler for Mixed Boolean/Integer Constraints." In Computer Aided Verification, 446–61. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-02658-4_34.

5

Tanner, Martin A. "Markov Chain Monte Carlo: The Gibbs Sampler and the Metropolis Algorithm." In Tools for Statistical Inference, 102–46. New York, NY: Springer US, 1993. http://dx.doi.org/10.1007/978-1-4684-0192-9_6.

6

Yu, Thomas, Marco Pizzolato, Gabriel Girard, Jonathan Rafael-Patino, Erick Jorge Canales-Rodríguez, and Jean-Philippe Thiran. "Robust Biophysical Parameter Estimation with a Neural Network Enhanced Hamiltonian Markov Chain Monte Carlo Sampler." In Lecture Notes in Computer Science, 818–29. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-20351-1_64.

7

Sorensen, Daniel, and Daniel Gianola. "Markov Chain Monte Carlo." In Statistics for Biology and Health, 497–537. New York, NY: Springer New York, 2002. http://dx.doi.org/10.1007/0-387-22764-4_11.

8

Wedel, Michel, and Peter Lenk. "Markov Chain Monte Carlo." In Encyclopedia of Operations Research and Management Science, 925–30. Boston, MA: Springer US, 2013. http://dx.doi.org/10.1007/978-1-4419-1153-7_1164.

9

Joseph, Anosh. "Markov Chain Monte Carlo." In SpringerBriefs in Physics, 37–42. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-46044-0_4.

10

Fürnkranz, Johannes, Philip K. Chan, Susan Craw, Claude Sammut, William Uther, Adwait Ratnaparkhi, Xin Jin, et al. "Markov Chain Monte Carlo." In Encyclopedia of Machine Learning, 639–42. Boston, MA: Springer US, 2011. http://dx.doi.org/10.1007/978-0-387-30164-8_511.


Conference papers on the topic "Markov chain Monte Carlo samplers"

1

Vaiciulyte, Ingrida. "Adaptive Monte-Carlo Markov chain for multivariate statistical estimation." In International Workshop of "Stochastic Programming for Implementation and Advanced Applications". The Association of Lithuanian Serials, 2012. http://dx.doi.org/10.5200/stoprog.2012.21.

Abstract:
The estimation of the multivariate skew t-distribution by the Monte-Carlo Markov Chain (MCMC) method is considered in the paper. Thus, the MCMC procedure is constructed for recurrent estimation of the skew t-distribution, following the maximum likelihood method, where the Monte-Carlo sample size is regulated to ensure convergence and to decrease the total number of Monte-Carlo trials required for estimation. The confidence intervals of Monte-Carlo estimators are introduced because of their asymptotic normality. The termination rule is also implemented by testing statistical hypotheses on an insignificant change of estimates in two steps of the procedure. The algorithm developed has been tested by computer simulation with a test example. The test sample, following the skew t-distribution, was simulated by computer, and the parameters of the skew t-distribution were estimated with MathCAD. Next, the chi-squared criterion confirmed the hypothesis on the distribution of the statistics with respect to the underlying distribution function. Thus, computer simulation confirmed the applicability of the Monte-Carlo Markov chain approach with adaptively regulated sample size for estimating the parameters of the skew t-distribution with acceptable accuracy.
2

BURKETT, K. M., B. McNENEY, and J. GRAHAM. "A MARKOV CHAIN MONTE CARLO SAMPLER FOR GENE GENEALOGIES CONDITIONAL ON HAPLOTYPE DATA." In Proceedings of Statistics 2011 Canada/IMST 2011-FIM XX. WORLD SCIENTIFIC, 2013. http://dx.doi.org/10.1142/9789814417983_0003.

3

Guzman, Rel. "Monte Carlo Methods on High Dimensional Data." In LatinX in AI at Neural Information Processing Systems Conference 2018. Journal of LatinX in AI Research, 2018. http://dx.doi.org/10.52591/lxai2018120314.

Abstract:
Markov Chain Monte Carlo (MCMC) simulation is a family of stochastic algorithms that are commonly used to approximate probability distributions by generating samples. The aim of this proposal is to deal with the problem of doing that job at a large scale because, owing to the increasing computational demands of tall or wide data, a study that combines statistical and engineering expertise can be made in order to achieve hardware-accelerated MCMC inference. In this work, I attempt to advance the theory and practice of approximate MCMC methods by developing a toolbox of distributed MCMC algorithms; then a new method for dealing with large-scale problems will be proposed, or else a framework for choosing the most appropriate method will be established. Papers like [1] provide a comprehensive review of the existing literature regarding methods to tackle big data problems. My idea is to tackle divide-and-conquer approaches since they can run distributed across several machines or on graphics processing units (GPUs), so I cover the theory behind these methods; then, exhaustive experimental tests will help me compare and categorize them according to their limitations on wide and tall data by considering the dataset size n, sample dimension d, and number of samples T to produce.
4

Putze, A., L. Derome, F. Donato, and D. Maurin. "A Markov Chain Monte Carlo technique to sample transport and source parameters of Galactic cosmic rays." In Proceedings of the 12th ICATPP Conference. WORLD SCIENTIFIC, 2011. http://dx.doi.org/10.1142/9789814329033_0056.

5

Davis, Gary A. "Sample-Based Estimation of Vehicle Speeds from Yaw Marks: Bayesian Implementation Using Markov Chain Monte Carlo Simulation." In SAE 2014 World Congress & Exhibition. 400 Commonwealth Drive, Warrendale, PA, United States: SAE International, 2014. http://dx.doi.org/10.4271/2014-01-0467.

6

De Sa, Christopher, Kunle Olukotun, and Christopher Ré. "Ensuring Rapid Mixing and Low Bias for Asynchronous Gibbs Sampling." In Twenty-Sixth International Joint Conference on Artificial Intelligence. California: International Joint Conferences on Artificial Intelligence Organization, 2017. http://dx.doi.org/10.24963/ijcai.2017/672.

Abstract:
Gibbs sampling is a Markov chain Monte Carlo technique commonly used for estimating marginal distributions. To speed up Gibbs sampling, there has recently been interest in parallelizing it by executing asynchronously. While empirical results suggest that many models can be efficiently sampled asynchronously, traditional Markov chain analysis does not apply to the asynchronous case, and thus asynchronous Gibbs sampling is poorly understood. In this paper, we derive a better understanding of the two main challenges of asynchronous Gibbs: bias and mixing time. We show experimentally that our theoretical results match practical outcomes.
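The baseline that asynchronous Gibbs modifies is the ordinary sequential-scan Gibbs sampler; a minimal sketch for a bivariate normal (a toy example of ours, not the paper's experimental setup) is below. The asynchronous variants studied in the paper would run such coordinate updates in parallel without locking, which is what breaks the standard Markov chain analysis.

```python
import numpy as np

def gibbs_bivariate_normal(rho=0.9, n_iter=20000, seed=4):
    """Sequential-scan Gibbs sampler for a zero-mean bivariate normal with
    unit variances and correlation rho; each coordinate is drawn exactly
    from its full conditional given the current value of the other."""
    rng = np.random.default_rng(seed)
    sd = np.sqrt(1.0 - rho**2)            # conditional standard deviation
    x = y = 0.0
    out = np.empty((n_iter, 2))
    for i in range(n_iter):
        x = rng.normal(rho * y, sd)       # x | y ~ N(rho * y, 1 - rho^2)
        y = rng.normal(rho * x, sd)       # y | x ~ N(rho * x, 1 - rho^2)
        out[i] = x, y
    return out

draws = gibbs_bivariate_normal()
```

The empirical correlation of the draws recovers rho; an asynchronous execution would reuse stale coordinate values in the conditionals, introducing exactly the bias the paper analyzes.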
7

Lin, Pin-Yi, and Kuei-Yuan Chan. "Optimal Sample Augmentation and Resource Allocation for Design With Inadequate Uncertainty Data." In ASME 2012 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2012. http://dx.doi.org/10.1115/detc2012-70234.

Abstract:
Uncertainty modeling in reliability-based design optimization problems requires a large amount of measurement data that is generally too costly to obtain in engineering practice. Instead, engineers are constantly challenged to make timely design decisions with only limited information at hand. In the literature, Bayesian binomial inference techniques have been used to estimate the reliability values of functions of uncertainties with limited samples. However, existing methods assume one sample is the entire set of measurements, with one for each uncertain quantity, while in reality one sample is one measurement of a specific quantity. As a result, effective yet efficient allocation of resources in sample augmentation is needed to reflect the relative contributions of uncertainties to the final optimum. We propose a sample augmentation process that uses the concept of sample combinations. Uncertain quantities are sampled with respect to their relative 'importance', while the impacts of bad measurements, which affect the evaluation of reliability inference, are alleviated via a Markov-Chain Monte Carlo filter. The proposed method could minimize the efforts and resources without assuming distributions for uncertainties. Several examples are used to demonstrate the validity of the method in product development.
8

An, Dawn, and Joo-Ho Choi. "Improved MCMC Method for Parameter Estimation Based on Marginal Probability Density Function." In ASME 2011 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2011. http://dx.doi.org/10.1115/detc2011-48784.

Abstract:
In many engineering problems, sampling is often used to estimate and quantify the probability distribution of uncertain parameters during the course of Bayesian framework, which is to draw proper samples that follow the probabilistic feature of the parameters. Among numerous approaches, Markov Chain Monte Carlo (MCMC) has gained the most popularity due to its efficiency and wide applicability. The MCMC, however, does not work well in the case of increased parameters and/or high correlations due to the difficulty of finding proper proposal distribution. In this paper, a method employing marginal probability density function (PDF) as a proposal distribution is proposed to overcome these problems. Several engineering problems which are formulated by Bayesian approach are addressed to demonstrate the effectiveness of proposed method.
9

Wu, Y. T., A. P. Ku, and C. M. Serratella. "A Robust and Efficient Computational Method for Fatigue Reliability Update Using Inspected Data." In ASME 2009 28th International Conference on Ocean, Offshore and Arctic Engineering. ASMEDC, 2009. http://dx.doi.org/10.1115/omae2009-80034.

Abstract:
This paper presents a new methodology for reliability-based inspection planning, focusing on robust and accurate computational strategies for updating fatigue reliability using inspection results. The core of the proposed strategy is a conditioned sampling-based method, implemented in the Fast Probability Analyzer (FPA) software, where efficiency is achieved through the importance sampling principle. For a single component or limit state, FPA first generates Markov chain Monte Carlo (MCMC) samples in the failure domain and then applies an adaptive stratified importance sampling (ASIS) method to compute the probability of failure (PoF) with error control. Once the MCMC samples have been created, solving a reliability updating problem is straightforward and computationally robust relative to conventional system reliability methods that rely on linearization of the limit states. The new approach is demonstrated on examples including stiffened panels of a ship-shaped vessel, where reliability is updated using inspection results from 100 panel connections.
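The importance-sampling principle underlying the PoF computation can be sketched as follows. This is a generic one-dimensional illustration under assumed names (the FPA internals and the ASIS stratification are not reproduced here): samples are drawn from a biasing density centred near the failure region and reweighted by the ratio of the true to the biasing density.

```python
import numpy as np

def pof_importance_sampling(limit_state, sample_q, pdf_p, pdf_q, n, rng=None):
    """Estimate the probability of failure P(g(X) <= 0) by sampling from a
    biasing density q concentrated near the failure region and reweighting
    each sample by the likelihood ratio p/q."""
    rng = rng or np.random.default_rng(1)
    x = sample_q(rng, n)
    w = pdf_p(x) / pdf_q(x)
    return float(np.mean((limit_state(x) <= 0.0) * w))

# Toy example: g(x) = 4 - x with X ~ N(0, 1), so the exact PoF is
# P(X >= 4) ~= 3.17e-5; the biasing density is N(4, 1), centred on failure.
SQ2PI = np.sqrt(2.0 * np.pi)
pof = pof_importance_sampling(
    limit_state=lambda x: 4.0 - x,
    sample_q=lambda rng, n: rng.normal(4.0, 1.0, n),
    pdf_p=lambda x: np.exp(-0.5 * x ** 2) / SQ2PI,
    pdf_q=lambda x: np.exp(-0.5 * (x - 4.0) ** 2) / SQ2PI,
    n=200_000,
)
```

A crude Monte Carlo estimate of this rare event would need on the order of 10^7 samples for a single failure hit; the biased sampler places roughly half its samples in the failure domain and corrects with the weights.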
10

Bérešová, Simona. "Numerical realization of the Bayesian inversion accelerated using surrogate models." In Programs and Algorithms of Numerical Mathematics 21. Institute of Mathematics, Czech Academy of Sciences, 2023. http://dx.doi.org/10.21136/panm.2022.03.

Abstract:
Bayesian inversion is a natural approach to solving inverse problems based on uncertain observed data. The result of such an inverse problem is the posterior distribution of the unknown parameters. This paper deals with the numerical realization of Bayesian inversion, focusing on problems governed by computationally expensive forward models such as numerical solutions of partial differential equations. Samples from the posterior distribution are generated using Markov chain Monte Carlo (MCMC) methods accelerated with surrogate models. A surrogate model is understood as an approximation of the forward model that should be computationally much cheaper. The target distribution is not fully replaced by its approximation; therefore, samples from the exact posterior distribution are provided. In addition, non-intrusive surrogate models can be updated during the sampling process, resulting in an adaptive MCMC method. The use of surrogate models significantly reduces the number of forward-model evaluations needed for a reliable description of the posterior distribution. The described sampling procedures are implemented in the form of a Python package.
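One standard way to accelerate MCMC with a surrogate while still sampling the exact posterior, as this abstract describes, is delayed acceptance: the cheap surrogate screens proposals first, and the expensive posterior is evaluated only for survivors, with a second-stage ratio correcting the surrogate error. The sketch below is a generic toy version under assumed names, not the paper's package:

```python
import numpy as np

def delayed_acceptance_mh(log_post, log_surrogate, x0, step, n, rng=None):
    """Two-stage Metropolis-Hastings. Stage 1 accepts/rejects with the
    cheap surrogate only; stage 2 corrects with the ratio of the true
    posterior to the surrogate, so the chain targets the exact posterior."""
    rng = rng or np.random.default_rng(2)
    x, samples = float(x0), []
    for _ in range(n):
        y = x + step * rng.standard_normal()
        # Stage 1: screening with the surrogate (no expensive model call).
        if np.log(rng.uniform()) < log_surrogate(y) - log_surrogate(x):
            # Stage 2: expensive posterior evaluated only for survivors;
            # the ratio below cancels the surrogate's bias exactly.
            log_alpha2 = (log_post(y) - log_post(x)
                          - (log_surrogate(y) - log_surrogate(x)))
            if np.log(rng.uniform()) < log_alpha2:
                x = y
        samples.append(x)
    return np.array(samples)

# Toy example: exact posterior N(0, 1), slightly biased surrogate N(0.3, 1).
chain = delayed_acceptance_mh(lambda x: -0.5 * x ** 2,
                              lambda x: -0.5 * (x - 0.3) ** 2,
                              0.0, 1.0, 20000)
```

Even with the biased surrogate, the second stage restores exactness; the savings come from the many proposals rejected at stage 1 without ever touching the expensive forward model.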

Reports of organizations on the topic "Markov chain Monte Carlo samplers"

1

Safta, Cosmin, Mohammad Khalil, and Habib N. Najm. Transitional Markov Chain Monte Carlo Sampler in UQTk. Office of Scientific and Technical Information (OSTI), March 2020. http://dx.doi.org/10.2172/1606084.

2

Gelfand, Alan E., and Sujit K. Sahu. On Markov Chain Monte Carlo Acceleration. Fort Belvoir, VA: Defense Technical Information Center, April 1994. http://dx.doi.org/10.21236/ada279393.

3

Warnes, Gregory R. HYDRA: A Java Library for Markov Chain Monte Carlo. Fort Belvoir, VA: Defense Technical Information Center, March 2002. http://dx.doi.org/10.21236/ada459649.

4

Bates, Cameron Russell, and Edward Allen Mckigney. Metis: A Pure Metropolis Markov Chain Monte Carlo Bayesian Inference Library. Office of Scientific and Technical Information (OSTI), January 2018. http://dx.doi.org/10.2172/1417145.

5

Baltz, E. Markov Chain Monte Carlo Exploration of Minimal Supergravity with Implications for Dark Matter. Office of Scientific and Technical Information (OSTI), July 2004. http://dx.doi.org/10.2172/827306.

6

Sethuraman, Jayaram. Easily Verifiable Conditions for the Convergence of the Markov Chain Monte Carlo Method. Fort Belvoir, VA: Defense Technical Information Center, December 1995. http://dx.doi.org/10.21236/ada308874.

7

Doss, Hani. Studies in Reliability Theory and Survival Analysis and in Markov Chain Monte Carlo Methods. Fort Belvoir, VA: Defense Technical Information Center, September 1998. http://dx.doi.org/10.21236/ada367895.

8

Doss, Hani. Statistical Inference for Coherent Systems from Partial Information and Markov Chain Monte Carlo Methods. Fort Belvoir, VA: Defense Technical Information Center, January 1996. http://dx.doi.org/10.21236/ada305676.

9

Doss, Hani. Studies in Reliability Theory and Survival Analysis and in Markov Chain Monte Carlo Methods. Fort Belvoir, VA: Defense Technical Information Center, December 1998. http://dx.doi.org/10.21236/ada379998.

10

Knopp, Jeremy S., and Fumio Kojima. Inverse Problem for Electromagnetic Propagation in a Dielectric Medium using Markov Chain Monte Carlo Method (Preprint). Fort Belvoir, VA: Defense Technical Information Center, August 2012. http://dx.doi.org/10.21236/ada565876.
