Academic literature on the topic 'Markov chains'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Markov chains.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Journal articles on the topic "Markov chains"
Grewal, Jasleen K., Martin Krzywinski, and Naomi Altman. "Markov models—Markov chains." Nature Methods 16, no. 8 (July 30, 2019): 663–64. http://dx.doi.org/10.1038/s41592-019-0476-x.
Valenzuela, Mississippi. "Markov chains and applications." Selecciones Matemáticas 9, no. 1 (June 30, 2022): 53–78. http://dx.doi.org/10.17268/sel.mat.2022.01.05.
Lindley, D. V., and J. R. Norris. "Markov Chains." Mathematical Gazette 83, no. 496 (March 1999): 188. http://dx.doi.org/10.2307/3618756.
Lund, Robert B., and J. R. Norris. "Markov Chains." Journal of the American Statistical Association 94, no. 446 (June 1999): 654. http://dx.doi.org/10.2307/2670196.
Verbeken, Brecht, and Marie-Anne Guerry. "Attainability for Markov and Semi-Markov Chains." Mathematics 12, no. 8 (April 19, 2024): 1227. http://dx.doi.org/10.3390/math12081227.
Barker, Richard J., and Matthew R. Schofield. "Putting Markov Chains Back into Markov Chain Monte Carlo." Journal of Applied Mathematics and Decision Sciences 2007 (October 30, 2007): 1–13. http://dx.doi.org/10.1155/2007/98086.
Zhong, Pingping, Weiguo Yang, and Peipei Liang. "The Asymptotic Equipartition Property for Asymptotic Circular Markov Chains." Probability in the Engineering and Informational Sciences 24, no. 2 (March 18, 2010): 279–88. http://dx.doi.org/10.1017/s0269964809990271.
Lund, Robert, Ying Zhao, and Peter C. Kiessler. "A monotonicity in reversible Markov chains." Journal of Applied Probability 43, no. 2 (June 2006): 486–99. http://dx.doi.org/10.1239/jap/1152413736.
Janssen, A., and J. Segers. "Markov Tail Chains." Journal of Applied Probability 51, no. 4 (December 2014): 1133–53. http://dx.doi.org/10.1239/jap/1421763332.
Full textDissertations / Theses on the topic "Markov chains"
Skorniakov, Viktor. "Asymptotically homogeneous Markov chains." Doctoral thesis, Lithuanian Academic Libraries Network (LABT), 2010. http://vddb.laba.lt/obj/LT-eLABa-0001:E.02~2010~D_20101223_152954-43357.
The dissertation studies a class of Markov chains whose iterations are governed by random asymptotically homogeneous functions, and solves two problems: 1) general conditions guaranteeing the existence of a unique stationary distribution are established; 2) for one-dimensional chains, conditions are found under which the stationary distribution has heavy tails.
Cho, Eun Hea. "Computation for Markov Chains." NCSU, 2000. http://www.lib.ncsu.edu/theses/available/etd-20000303-164550.
A finite, homogeneous, irreducible Markov chain with a transition probability matrix possesses a unique stationary distribution vector. The questions one can pose in the area of computation of Markov chains include the following:
- How does one compute the stationary distribution?
- How accurate is the resulting answer?
In this thesis, we try to provide answers to these questions.
The thesis is divided into two parts. The first part deals with the perturbation theory of finite, homogeneous, irreducible Markov chains, which is related to the first question above. The purpose of this part is to analyze the sensitivity of the stationary distribution vector to perturbations in the transition probability matrix. The second part answers the question of computing the stationary distributions of nearly uncoupled Markov chains (NUMC).
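The stationary-distribution computation this abstract refers to can be sketched in a few lines. The 3-state matrix below is a made-up illustration, and the dense direct solve merely stands in for whatever algorithms the thesis itself analyzes:

```python
import numpy as np

# Toy 3-state transition probability matrix (rows sum to 1);
# an illustrative example, not taken from the thesis.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.1, 0.4, 0.5]])

def stationary_distribution(P):
    """Solve pi P = pi with sum(pi) = 1 by replacing one balance
    equation with the normalization constraint."""
    n = P.shape[0]
    A = np.vstack([(P.T - np.eye(n))[:-1], np.ones(n)])
    b = np.zeros(n)
    b[-1] = 1.0
    return np.linalg.solve(A, b)

pi = stationary_distribution(P)
print(pi)       # stationary vector
print(pi @ P)   # equals pi, confirming invariance
```

For a finite, homogeneous, irreducible chain this system has a unique solution, which is exactly the uniqueness claim the abstract states.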
Dessain, Thomas James. "Perturbations of Markov chains." Thesis, Durham University, 2014. http://etheses.dur.ac.uk/10619/.
Full textDi, Cecco Davide <1980>. "Markov exchangeable data and mixtures of Markov Chains." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2009. http://amsdottorato.unibo.it/1547/1/Di_Cecco_Davide_Tesi.pdf.
Full textDi, Cecco Davide <1980>. "Markov exchangeable data and mixtures of Markov Chains." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2009. http://amsdottorato.unibo.it/1547/.
Matthews, James. "Markov chains for sampling matchings." Thesis, University of Edinburgh, 2008. http://hdl.handle.net/1842/3072.
Wilson, David Bruce. "Exact sampling with Markov chains." Thesis, Massachusetts Institute of Technology, 1996. http://hdl.handle.net/1721.1/38402.
Mestern, Mark Andrew. "Distributed analysis of Markov chains." Master's thesis, University of Cape Town, 1998. http://hdl.handle.net/11427/9693.
Full textThis thesis examines how parallel and distributed algorithms can increase the power of techniques for correctness and performance analysis of concurrent systems. The systems in question are state transition systems from which Markov chains can be derived. Both phases of the analysis pipeline are considered: state space generation from a state transition model to form the Markov chain and finding performance information by solving the steady state equations of the Markov Chain. The state transition models are specified in a general interface language which can describe any Markovian process. The models are not tied to a specific modelling formalism, but common formal description techniques such as generalised stochastic Petri nets and queuing networks can generate these models. Tools for Markov chain analysis face the problem of state Spaces that are so large that they exceed the memory and processing power of a single workstation. This problem is attacked with methods to reduce memory usage, and by dividing the problem between several workstations. A distributed state space generation algorithm was designed and implemented for a local area network of workstations. The state space generation algorithm also includes a probabilistic dynamic hash compaction technique for storing state hash tables, which dramatically reduces memory consumption.- Numerical solution methods for Markov chains are surveyed and two iterative methods, BiCG and BiCGSTAB, were chosen for a parallel implementation to show that this stage of analysis also benefits from a distributed approach. The results from the distributed generation algorithm show a good speed up of the state space generation phase and that the method makes the generation of larger state spaces possible. The distributed methods for the steady state solution also allow larger models to be analysed, but the heavy communications load on the network prevents improved execution time.
Salzman, Julia. "Spectral analysis with Markov chains." May be available electronically, 2007. http://proquest.umi.com/login?COPT=REJTPTU1MTUmSU5UPTAmVkVSPTI=&clientId=12498.
Dorff, Rebecca. "Modelling Infertility with Markov Chains." BYU ScholarsArchive, 2013. https://scholarsarchive.byu.edu/etd/4070.
Full textBooks on the topic "Markov chains"
Gagniuc, Paul A. Markov Chains. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2017. http://dx.doi.org/10.1002/9781119387596.
Brémaud, Pierre. Markov Chains. New York, NY: Springer New York, 1999. http://dx.doi.org/10.1007/978-1-4757-3124-8.
Sericola, Bruno. Markov Chains. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2013. http://dx.doi.org/10.1002/9781118731543.
Douc, Randal, Eric Moulines, Pierre Priouret, and Philippe Soulier. Markov Chains. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-97704-1.
Graham, Carl. Markov Chains. Chichester, UK: John Wiley & Sons, Ltd, 2014. http://dx.doi.org/10.1002/9781118881866.
Brémaud, Pierre. Markov Chains. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-45982-6.
Ching, Wai-Ki, Ximin Huang, Michael K. Ng, and Tak-Kuen Siu. Markov Chains. Boston, MA: Springer US, 2013. http://dx.doi.org/10.1007/978-1-4614-6312-2.
Hermanns, Holger. Interactive Markov Chains. Berlin, Heidelberg: Springer Berlin Heidelberg, 2002. http://dx.doi.org/10.1007/3-540-45804-2.
Privault, Nicolas. Understanding Markov Chains. Singapore: Springer Singapore, 2018. http://dx.doi.org/10.1007/978-981-13-0659-4.
Hartfiel, Darald J. Markov Set-Chains. Berlin, Heidelberg: Springer Berlin Heidelberg, 1998. http://dx.doi.org/10.1007/bfb0094586.
Full textBook chapters on the topic "Markov chains"
Carlton, Matthew A., and Jay L. Devore. "Markov Chains." In Probability with Applications in Engineering, Science, and Technology, 423–87. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-52401-6_6.
Lindsey, James K. "Markov Chains." In The Analysis of Stochastic Processes using GLIM, 21–42. New York, NY: Springer New York, 1992. http://dx.doi.org/10.1007/978-1-4612-2888-2_2.
Gebali, Fayez. "Markov Chains." In Analysis of Computer and Communication Networks, 1–57. Boston, MA: Springer US, 2008. http://dx.doi.org/10.1007/978-0-387-74437-7_3.
Lakatos, László, László Szeidl, and Miklós Telek. "Markov Chains." In Introduction to Queueing Systems with Telecommunication Applications, 93–177. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-15142-3_3.
Gordon, Hugh. "Markov Chains." In Discrete Probability, 209–49. New York, NY: Springer New York, 1997. http://dx.doi.org/10.1007/978-1-4612-1966-8_9.
Hermanns, Holger. "Markov Chains." In Interactive Markov Chains, 35–55. Berlin, Heidelberg: Springer Berlin Heidelberg, 2002. http://dx.doi.org/10.1007/3-540-45804-2_3.
Serfozo, Richard. "Markov Chains." In Probability and Its Applications, 1–98. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-540-89332-5_1.
Ökten, Giray. "Markov Chains." In Probability and Simulation, 81–98. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-56070-6_4.
Robert, Christian P., and George Casella. "Markov Chains." In Springer Texts in Statistics, 139–91. New York, NY: Springer New York, 1999. http://dx.doi.org/10.1007/978-1-4757-3071-5_4.
Harris, Carl M. "Markov chains." In Encyclopedia of Operations Research and Management Science, 481–84. New York, NY: Springer US, 2001. http://dx.doi.org/10.1007/1-4020-0611-x_579.
Full textConference papers on the topic "Markov chains"
Hunter, Jeffrey J. "Perturbed Markov Chains." In Proceedings of the International Statistics Workshop. WORLD SCIENTIFIC, 2006. http://dx.doi.org/10.1142/9789812772466_0008.
Saglam, Cenk Oguz, and Katie Byl. "Metastable Markov chains." In 2014 IEEE 53rd Annual Conference on Decision and Control (CDC). IEEE, 2014. http://dx.doi.org/10.1109/cdc.2014.7039847.
Bini, D. A., B. Meini, S. Steffé, and B. Van Houdt. "Structured Markov chains solver." In Proceeding from the 2006 workshop. New York, New York, USA: ACM Press, 2006. http://dx.doi.org/10.1145/1190366.1190378.
Bini, D. A., B. Meini, S. Steffé, and B. Van Houdt. "Structured Markov chains solver." In Proceeding from the 2006 workshop. New York, New York, USA: ACM Press, 2006. http://dx.doi.org/10.1145/1190366.1190379.
Kiefer, Stefan, and A. Prasad Sistla. "Distinguishing Hidden Markov Chains." In LICS '16: 31st Annual ACM/IEEE Symposium on Logic in Computer Science. New York, NY, USA: ACM, 2016. http://dx.doi.org/10.1145/2933575.2933608.
García, Jesús E. "Combining multivariate Markov chains." In Proceedings of the International Conference on Numerical Analysis and Applied Mathematics 2014 (ICNAAM-2014). AIP Publishing LLC, 2015. http://dx.doi.org/10.1063/1.4912373.
Mitzenmacher, Michael. "Session details: Markov chains." In STOC '09: Symposium on Theory of Computing. New York, NY, USA: ACM, 2009. http://dx.doi.org/10.1145/3257428.
Zhang, Yu-Fen, Qun-Feng Zhang, and Rui-Hua Yu. "Markov property of Markov chains and its test." In 2010 International Conference on Machine Learning and Cybernetics (ICMLC). IEEE, 2010. http://dx.doi.org/10.1109/icmlc.2010.5580952.
Heidergott, Bernd. "Perturbation analysis of Markov chains." In 2008 9th International Workshop on Discrete Event Systems. IEEE, 2008. http://dx.doi.org/10.1109/wodes.2008.4605929.
Villacorta, Pablo, Jose Luis Verdegay, and David Pelta. "Towards fuzzy linguistic Markov chains." In The 8th Conference of the European Society for Fuzzy Logic and Technology. Paris, France: Atlantis Press, 2013. http://dx.doi.org/10.2991/eusflat.2013.106.
Full textReports on the topic "Markov chains"
Ramezani, Vahid R., and Steven I. Marcus. Risk-Sensitive Probability for Markov Chains. Fort Belvoir, VA: Defense Technical Information Center, September 2002. http://dx.doi.org/10.21236/ada438509.
Marie, Raymond, Andrew Reibman, and Kishor Trivedi. Transient Solution of Acyclic Markov Chains. Fort Belvoir, VA: Defense Technical Information Center, August 1985. http://dx.doi.org/10.21236/ada162314.
Doerschuk, Peter C., Robert R. Tenney, and Alan S. Willsky. Modeling Electrocardiograms Using Interacting Markov Chains. Fort Belvoir, VA: Defense Technical Information Center, July 1985. http://dx.doi.org/10.21236/ada162758.
Ma, D.-J., A. M. Makowski, and A. Shwartz. Stochastic Approximations for Finite-State Markov Chains. Fort Belvoir, VA: Defense Technical Information Center, January 1987. http://dx.doi.org/10.21236/ada452264.
Krakowski, Martin. Models of Coin-Tossing for Markov Chains. Revision. Fort Belvoir, VA: Defense Technical Information Center, December 1987. http://dx.doi.org/10.21236/ada196572.
Dupuis, Paul, and Hui Wang. Adaptive Importance Sampling for Uniformly Recurrent Markov Chains. Fort Belvoir, VA: Defense Technical Information Center, January 2003. http://dx.doi.org/10.21236/ada461913.
Tsitsiklis, John N. Markov Chains with Rare Transitions and Simulated Annealing. Fort Belvoir, VA: Defense Technical Information Center, September 1985. http://dx.doi.org/10.21236/ada161598.
Соловйов, Володимир Миколайович, V. Saptsin, and D. Chabanenko. Markov Chains Applications to the Financial-Economic Time Series Predictions. Transport and Telecommunication Institute, 2011. http://dx.doi.org/10.31812/0564/1189.
Harris, Carl M. Rootfinding for Markov Chains with Quasi-Triangular Transition Matrices. Fort Belvoir, VA: Defense Technical Information Center, October 1988. http://dx.doi.org/10.21236/ada202468.
Thompson, Theodore J., James P. Boyle, and Douglas J. Hentschel. Markov Chains for Random Urinalysis 1: Age-Test Model. Fort Belvoir, VA: Defense Technical Information Center, March 1993. http://dx.doi.org/10.21236/ada263274.