Academic literature on the topic 'Markov'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Markov.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Markov"

1

Sauvageot, Jean-Luc. "Markov quantum semigroups admit covariant Markov C*-dilations." Communications in Mathematical Physics 106, no. 1 (March 1986): 91–103. http://dx.doi.org/10.1007/bf01210927.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Guyo, X., and C. Hardouin. "Markow chain markov field dynamics: models and statistics." Statistics 35, no. 4 (January 2001): 593–627. http://dx.doi.org/10.1080/02331880108802756.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Kulkarni, V. G., and V. G. Adlakha. "Markov and Markov-Regenerative PERT Networks." Operations Research 34, no. 5 (October 1986): 769–81. http://dx.doi.org/10.1287/opre.34.5.769.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Grewal, Jasleen K., Martin Krzywinski, and Naomi Altman. "Markov models—Markov chains." Nature Methods 16, no. 8 (July 30, 2019): 663–64. http://dx.doi.org/10.1038/s41592-019-0476-x.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Grewal, Jasleen K., Martin Krzywinski, and Naomi Altman. "Markov models — hidden Markov models." Nature Methods 16, no. 9 (August 30, 2019): 795–96. http://dx.doi.org/10.1038/s41592-019-0532-6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Guédon, Yann. "Hidden hybrid Markov/semi-Markov chains." Computational Statistics & Data Analysis 49, no. 3 (June 2005): 663–88. http://dx.doi.org/10.1016/j.csda.2004.05.033.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Chandgotia, Nishant, Guangyue Han, Brian Marcus, Tom Meyerovitch, and Ronnie Pavlov. "One-dimensional Markov random fields, Markov chains and topological Markov fields." Proceedings of the American Mathematical Society 142, no. 1 (October 3, 2013): 227–42. http://dx.doi.org/10.1090/s0002-9939-2013-11741-7.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Rofiroh, Rofiroh. "APLIKASI RANTAI MARKOV PADA PENENTUAN HARI BERSALJU DI BEBERAPA KOTA AMERIKA SERIKAT." STATMAT : JURNAL STATISTIKA DAN MATEMATIKA 2, no. 2 (July 31, 2020): 28. http://dx.doi.org/10.32493/sm.v2i2.5435.

Full text
Abstract:
This research models a stochastic process. The method used is the Markov chain method, in which the forthcoming condition is influenced only by the immediately preceding condition. This method was applied to observational snow-day data at eight observation stations in the United States, namely New York, Sedro Wooley, Glendive, Willow City, Del Norte, Medford, Charleston, and Blue Hill. The purpose of this study is to determine the direction of convergence of the n-step transition probabilities and the probability distribution of a Markov chain with three conditions. According to the results of data processing using MATLAB software, diagonal matrices, and the spectral theorem, similar results were obtained for the convergence of the transition matrix of each observation station, which was influenced by the difference in the probability changes of two conditions. Keywords: Markov Chain, Snow Day, Transition Matrix
APA, Harvard, Vancouver, ISO, and other styles
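The abstract above concerns the convergence of n-step transition probabilities. As a hedged illustration only (a made-up two-state snowy/clear chain, not the paper's station data), repeated powers of a transition matrix drive every row toward the stationary distribution:

```python
def mat_mul(a, b):
    """Multiply two square matrices given as nested lists."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

def mat_pow(p, n):
    """Compute the n-th power of a square matrix (n-step transition probabilities)."""
    result = [[1.0 if i == j else 0.0 for j in range(len(p))] for i in range(len(p))]  # identity
    for _ in range(n):
        result = mat_mul(result, p)
    return result

# Hypothetical two-state chain: state 0 = snowy day, state 1 = clear day.
P = [[0.6, 0.4],
     [0.2, 0.8]]

P100 = mat_pow(P, 100)
# Rows of P^n converge to the stationary distribution pi satisfying pi = pi P;
# for this matrix pi = [1/3, 2/3], so both rows are approximately [1/3, 2/3].
print(P100[0])
print(P100[1])
```

The same power-iteration check applies to any irreducible, aperiodic finite chain; only the convergence speed changes with the gap between the eigenvalues.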
9

Demenkov, N. P., E. A. Mirkin, and I. A. Mochalov. "Markov and Semi-Markov Processes with Fuzzy States. Part 1. Markov Processes." Informacionnye tehnologii 26, no. 6 (June 23, 2020): 323–34. http://dx.doi.org/10.17587/it.26.323-334.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Alimov, D. "Markov Functionals of an Ergodic Markov Process." Theory of Probability & Its Applications 39, no. 3 (January 1995): 504–12. http://dx.doi.org/10.1137/1139035.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Dissertations / Theses on the topic "Markov"

1

Di Cecco, Davide <1980>. "Markov exchangeable data and mixtures of Markov Chains." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2009. http://amsdottorato.unibo.it/1547/1/Di_Cecco_Davide_Tesi.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Di Cecco, Davide <1980>. "Markov exchangeable data and mixtures of Markov Chains." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2009. http://amsdottorato.unibo.it/1547/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Yildirak, Sahap Kasirga. "The Identification of a Bivariate Markov Chain Market Model." PhD thesis, METU, 2004. http://etd.lib.metu.edu.tr/upload/1257898/index.pdf.

Full text
Abstract:
This work is an extension of the classical Cox-Ross-Rubinstein discrete-time market model, in which only one risky asset is considered. We introduce another risky asset into the model. Moreover, the random structure of the asset price sequence is generated by a bivariate finite-state Markov chain. The interest rate then varies over time as a function of the generating sequences. We discuss how the model can be adapted to real data. Finally, we illustrate sample implementations to give a better idea of the use of the model.
APA, Harvard, Vancouver, ISO, and other styles
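The thesis above extends the Cox-Ross-Rubinstein model with a finite-state Markov chain. A minimal single-asset sketch of the idea (regimes, moves, and probabilities are invented for illustration, not taken from the thesis) lets a chain pick the multiplicative price move at each step:

```python
import random

# Hypothetical two-regime chain driving CRR-style multiplicative price moves.
TRANS = {0: [(0, 0.7), (1, 0.3)],   # regime 0 tends to persist
         1: [(0, 0.4), (1, 0.6)]}
MOVES = {0: 1.02, 1: 0.99}          # up move in regime 0, down move in regime 1

def next_state(state, rng):
    """Sample the next regime from the transition distribution of the current one."""
    r = rng.random()
    acc = 0.0
    for s, p in TRANS[state]:
        acc += p
        if r < acc:
            return s
    return TRANS[state][-1][0]

def simulate_price(s0, steps, seed=0):
    """Simulate a terminal price: each step, move the chain, then apply its price factor."""
    rng = random.Random(seed)
    state, price = 0, s0
    for _ in range(steps):
        state = next_state(state, rng)
        price *= MOVES[state]
    return price

print(simulate_price(100.0, 250))  # terminal price after 250 trading steps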
4

Tillman, Måns. "On-Line Market Microstructure Prediction Using Hidden Markov Models." Thesis, KTH, Matematisk statistik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-208312.

Full text
Abstract:
Over the last decades, financial markets have undergone dramatic changes. With the advent of the arbitrage pricing theory, along with new technology, markets have become more efficient. In particular, the new high-frequency markets, with algorithmic trading operating on micro-second level, make it possible to translate ”information” into price almost instantaneously. Such phenomena are studied in the field of market microstructure theory, which aims to explain and predict them. In this thesis, we model the dynamics of high frequency markets using non-linear hidden Markov models (HMMs). Such models feature an intuitive separation between observations and dynamics, and are therefore highly convenient tools in financial settings, where they allow a precise application of domain knowledge. HMMs can be formulated based on only a few parameters, yet their inherently dynamic nature can be used to capture well-known intra-day seasonality effects that many other models fail to explain. Due to recent breakthroughs in Monte Carlo methods, HMMs can now be efficiently estimated in real-time. In this thesis, we develop a holistic framework for performing both real-time inference and learning of HMMs, by combining several particle-based methods. Within this framework, we also provide methods for making accurate predictions from the model, as well as methods for assessing the model itself. In this framework, a sequential Monte Carlo bootstrap filter is adopted to make on-line inference and predictions. Coupled with a backward smoothing filter, this provides a forward filtering/backward smoothing scheme. This is then used in the sequential Monte Carlo expectation-maximization algorithm for finding the optimal hyper-parameters for the model. To design an HMM specifically for capturing information translation, we adopt the observable volume imbalance into a dynamic setting. Volume imbalance has previously been used in market microstructure theory to study, for example, price impact. 
Through careful selection of key model assumptions, we define a slightly modified observable as a process that we call scaled volume imbalance. The outcomes of this process retain the key features of volume imbalance (that is, its relationship to price impact and information), and allows an efficient evaluation of the framework, while providing a promising platform for future studies. This is demonstrated through a test on actual financial trading data, where we obtain high-performance predictions. Our results demonstrate that the proposed framework can successfully be applied to the field of market microstructure.
APA, Harvard, Vancouver, ISO, and other styles
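The thesis above performs on-line inference in nonlinear HMMs with particle filters. As a simplified, hedged stand-in (a discrete-state forward filter with invented regimes and probabilities, not the thesis's model or data), one filtering update predicts through the transition matrix and reweights by the new observation's likelihood:

```python
# Minimal discrete-state HMM forward filter; all numbers below are hypothetical.

def forward_step(belief, obs, trans, emit):
    """One on-line update: predict one step through the transition matrix,
    then reweight by the likelihood of the new observation and normalize."""
    n = len(belief)
    predicted = [sum(belief[i] * trans[i][j] for i in range(n)) for j in range(n)]
    unnorm = [predicted[j] * emit[j][obs] for j in range(n)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

# Two hidden regimes (0 = calm, 1 = stressed); observation 0/1 = small/large volume imbalance.
trans = [[0.95, 0.05],
         [0.10, 0.90]]
emit = [[0.8, 0.2],   # calm regime mostly emits small imbalance
        [0.3, 0.7]]   # stressed regime mostly emits large imbalance

belief = [0.5, 0.5]
for obs in [0, 0, 1, 1, 1]:
    belief = forward_step(belief, obs, trans, emit)
print(belief)  # posterior probability of each regime after the observation run
```

After a run of large-imbalance observations, the posterior mass shifts to the stressed regime. A particle filter replaces the exact `predicted` sum with sampled particles, which is what makes the nonlinear, continuous-state case in the thesis tractable.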
5

Desharnais, Josée. "Labelled Markov processes." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape3/PQDD_0031/NQ64546.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Balan, Raluca M. "Set-Markov processes." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2001. http://www.collectionscanada.ca/obj/s4/f2/dsk3/ftp04/NQ66119.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Eltannir, Akram A. "Markov interactive processes." Diss., Georgia Institute of Technology, 1993. http://hdl.handle.net/1853/30745.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Werner, Ivan. "Contractive Markov systems." Thesis, University of St Andrews, 2004. http://hdl.handle.net/10023/15173.

Full text
Abstract:
We introduce a theory of contractive Markov systems (CMS) which provides a unifying framework in so-called "fractal" geometry. It extends the known theory of iterated function systems (IFS) with place dependent probabilities [1][8] in a way that it also covers graph directed constructions of "fractal" sets [18]. Such systems naturally extend finite Markov chains and inherit some of their properties. In Chapter 1, we consider iterations of a Markov system and show that they preserve the essential structure of it. In Chapter 2, we show that the Markov operator defined by such a system has a unique invariant probability measure in the irreducible case and an attractive probability measure in the aperiodic case if the restrictions of the probability functions on their vertex sets are Dini-continuous and bounded away from zero, and the system satisfies a condition of a contractiveness on average. This generalizes a result from [1]. Furthermore, we show that the rate of convergence to the stationary state is exponential in the aperiodic case with constant probabilities and a compact state space. In Chapter 3, we construct a coding map for a contractive Markov system. In Chapter 4, we calculate Kolmogorov-Sinai entropy of the generalized Markov shift. In Chapter 5, we prove an ergodic theorem for Markov chains associated with the contractive Markov systems. It generalizes the ergodic theorem of Elton [8].
APA, Harvard, Vancouver, ISO, and other styles
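The abstract above describes contractive Markov systems generalizing iterated function systems with place-dependent probabilities. A toy sketch in that spirit (the two contractions and the probability function are made up; this is not the thesis's construction) iterates randomly chosen affine maps on [0, 1]:

```python
import random

# Toy IFS with place-dependent probabilities: two affine contractions on [0, 1].

def f0(x):
    return x / 3.0              # contraction toward 0

def f1(x):
    return x / 3.0 + 2.0 / 3.0  # contraction toward 1

def step(x, rng):
    # Probability of choosing f0 depends on the current point and is
    # bounded away from zero, echoing the conditions in the abstract.
    p0 = 0.25 + 0.5 * x
    return f0(x) if rng.random() < p0 else f1(x)

def sample_orbit(n, seed=0):
    """Iterate the random system n times from x = 0.5 and record the orbit."""
    rng = random.Random(seed)
    x = 0.5
    orbit = []
    for _ in range(n):
        x = step(x, rng)
        orbit.append(x)
    return orbit

orbit = sample_orbit(10_000)
# Average contractiveness keeps every iterate inside [0, 1]; the empirical
# distribution of the orbit approximates the system's invariant measure.
print(min(orbit), max(orbit))
```

With constant probabilities this orbit samples a Cantor-like attractor; the place-dependent choice above is the kind of generalization the thesis analyzes.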
9

Durrell, Fernando. "Constrained portfolio selection with Markov and non-Markov processes and insiders." Doctoral thesis, University of Cape Town, 2007. http://hdl.handle.net/11427/4379.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Skorniakov, Viktor. "Asymptotically homogeneous Markov chains." Doctoral thesis, Lithuanian Academic Libraries Network (LABT), 2010. http://vddb.laba.lt/obj/LT-eLABa-0001:E.02~2010~D_20101223_152954-43357.

Full text
Abstract:
In the dissertation there is investigated a class of Markov chains defined by iterations of a function possessing a property of asymptotical homogeneity. Two problems are solved: 1) there are established rather general conditions under which the chain has unique stationary distribution; 2) for the chains evolving in a real line there are established conditions under which the stationary distribution of the chain is heavy-tailed.
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Markov"

1

Hou, Zhenting, Jerzy A. Filar, and Anyue Chen, eds. Markov Processes and Controlled Markov Chains. Boston, MA: Springer US, 2002. http://dx.doi.org/10.1007/978-1-4613-0265-0.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Hou, Zhenting, Jerzy A. Filar, and Anyue Chen, eds. Markov Processes and Controlled Markov Chains. Dordrecht: Kluwer Academic Publishers, 2002.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Domingos, Pedro, and Daniel Lowd. Markov Logic. Cham: Springer International Publishing, 2009. http://dx.doi.org/10.1007/978-3-031-01549-6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Gagniuc, Paul A. Markov Chains. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2017. http://dx.doi.org/10.1002/9781119387596.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Brémaud, Pierre. Markov Chains. New York, NY: Springer New York, 1999. http://dx.doi.org/10.1007/978-1-4757-3124-8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Sericola, Bruno. Markov Chains. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2013. http://dx.doi.org/10.1002/9781118731543.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Douc, Randal, Eric Moulines, Pierre Priouret, and Philippe Soulier. Markov Chains. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-97704-1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Graham, Carl. Markov Chains. Chichester, UK: John Wiley & Sons, Ltd, 2014. http://dx.doi.org/10.1002/9781118881866.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Brémaud, Pierre. Markov Chains. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-45982-6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Ching, Wai-Ki, Ximin Huang, Michael K. Ng, and Tak-Kuen Siu. Markov Chains. Boston, MA: Springer US, 2013. http://dx.doi.org/10.1007/978-1-4614-6312-2.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Markov"

1

Koralov, Leonid, and Yakov G. Sinai. "Markov Processes and Markov Families." In Theory of Probability and Random Processes, 273–87. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-540-68829-7_19.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Gheorghe, Adrian V. "Semi-Markov and Markov Chains." In Decision Processes in Dynamic Probabilistic System, 1–104. Dordrecht: Springer Netherlands, 1990. http://dx.doi.org/10.1007/978-94-009-0493-4_1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Girardin, Valérie, and Nikolaos Limnios. "Markov and Semi-Markov Processes." In Applied Probability, 215–52. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-97412-5_5.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Tapiero, Charles S. "Modelling: Markov Chains and Markov Processes." In Applied Stochastic Models and Control for Finance and Insurance, 41–88. Boston, MA: Springer US, 1998. http://dx.doi.org/10.1007/978-1-4615-5823-1_2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Nakagawa, Toshio. "Semi-Markov and Markov Renewal Processes." In Springer Series in Reliability Engineering, 123–48. London: Springer London, 2011. http://dx.doi.org/10.1007/978-0-85729-274-2_5.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Forsyth, David. "Markov Chains and Hidden Markov Models." In Probability and Statistics for Computer Science, 331–51. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-64410-3_14.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Aoki, Satoshi, Hisayuki Hara, and Akimichi Takemura. "Running Markov Chain Without Markov Bases." In Springer Series in Statistics, 275–86. New York, NY: Springer New York, 2012. http://dx.doi.org/10.1007/978-1-4614-3719-2_16.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Timischl, Werner. "Markov-Ketten und Hidden-Markov-Modelle." In Mathematische Methoden der Bioinformatik - Eine Einführung, 161–218. Berlin, Heidelberg: Springer Berlin Heidelberg, 2023. http://dx.doi.org/10.1007/978-3-662-67458-1_7.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Nagasawa, Masao. "Markov Processes." In Stochastic Processes in Quantum Physics, 1–26. Basel: Birkhäuser Basel, 2000. http://dx.doi.org/10.1007/978-3-0348-8383-2_1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Gardiner, Crispin W. "Markov Processes." In Springer Series in Synergetics, 42–79. Berlin, Heidelberg: Springer Berlin Heidelberg, 1985. http://dx.doi.org/10.1007/978-3-662-02452-2_3.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Markov"

1

Oguz, Mehmet Kayra, and Alexander Dockhorn. "Markov Senior - Learning Markov Junior Grammars to Generate User-specified Content." In 2024 IEEE Conference on Games (CoG), 1–8. IEEE, 2024. http://dx.doi.org/10.1109/cog60054.2024.10645650.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

HARA, Hisayuki, Satoshi AOKI, and Akimichi TAKEMURA. "Running Markov Chain without Markov Basis." In Harmony of Gröbner Bases and the Modern Industrial Society - The Second CREST-CSBM International Conference. Singapore: World Scientific Publishing Co. Pte. Ltd., 2012. http://dx.doi.org/10.1142/9789814383462_0005.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Farshchian, Maryam, and Majid Vafaei Jahan. "Stock market prediction with Hidden Markov Model." In 2015 International Congress on Technology, Communication and Knowledge (ICTCK). IEEE, 2015. http://dx.doi.org/10.1109/ictck.2015.7582714.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Gupta, Aditya, and Bhuwan Dhingra. "Stock market prediction using Hidden Markov Models." In 2012 Students Conference on Engineering and Systems (SCES). IEEE, 2012. http://dx.doi.org/10.1109/sces.2012.6199099.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Somani, Poonam, Shreyas Talele, and Suraj Sawant. "Stock market prediction using Hidden Markov Model." In 2014 IEEE 7th Joint International Information Technology and Artificial Intelligence Conference (ITAIC). IEEE, 2014. http://dx.doi.org/10.1109/itaic.2014.7065011.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Roark, Brian. "Markov parsing." In the 40th Annual Meeting. Morristown, NJ, USA: Association for Computational Linguistics, 2001. http://dx.doi.org/10.3115/1073083.1073131.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Domingos, Pedro. "Markov logic." In Proceedings of the 17th ACM conference. New York, New York, USA: ACM Press, 2008. http://dx.doi.org/10.1145/1458082.1458084.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Karlin, A. R., S. J. Phillips, and P. Raghavan. "Markov paging." In Proceedings., 33rd Annual Symposium on Foundations of Computer Science. IEEE, 1992. http://dx.doi.org/10.1109/sfcs.1992.267771.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Chierichetti, Flavio, Ravi Kumar, and Prabhakar Raghavan. "Markov Layout." In 2011 IEEE 52nd Annual Symposium on Foundations of Computer Science (FOCS). IEEE, 2011. http://dx.doi.org/10.1109/focs.2011.71.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Dimoulkas, Ilias, Mikael Amelin, and Mohammad Reza Hesamzadeh. "Forecasting balancing market prices using Hidden Markov Models." In 2016 13th International Conference on the European Energy Market (EEM). IEEE, 2016. http://dx.doi.org/10.1109/eem.2016.7521229.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Markov"

1

Adler, Robert J., Stamatis Gambanis, and Gennady Samorodnitsky. On Stable Markov Processes. Fort Belvoir, VA: Defense Technical Information Center, September 1987. http://dx.doi.org/10.21236/ada192892.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Ghahramani, Zoubin, and Michael I. Jordan. Factorial Hidden Markov Models. Fort Belvoir, VA: Defense Technical Information Center, January 1996. http://dx.doi.org/10.21236/ada307097.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Ainsleigh, Phillip L. Theory of Continuous-State Hidden Markov Models and Hidden Gauss-Markov Models. Fort Belvoir, VA: Defense Technical Information Center, March 2001. http://dx.doi.org/10.21236/ada415930.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Thrun, Sebastian, and John Langford. Monte Carlo Hidden Markov Models. Fort Belvoir, VA: Defense Technical Information Center, December 1998. http://dx.doi.org/10.21236/ada363714.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Dueker, Michael J. Markov Switching in GARCH Processes and Mean Reverting Stock Market Volatility. Federal Reserve Bank of St. Louis, 1994. http://dx.doi.org/10.20955/wp.1994.015.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Ramezani, Vahid R., and Steven I. Marcus. Risk-Sensitive Probability for Markov Chains. Fort Belvoir, VA: Defense Technical Information Center, September 2002. http://dx.doi.org/10.21236/ada438509.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Dean, Thomas L. Model Acquisition for Markov Decision Problems. Fort Belvoir, VA: Defense Technical Information Center, October 1998. http://dx.doi.org/10.21236/ada373795.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Gelfand, Alan E., and Sujit K. Sahu. On Markov Chain Monte Carlo Acceleration. Fort Belvoir, VA: Defense Technical Information Center, April 1994. http://dx.doi.org/10.21236/ada279393.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Yang, Jie, and Yangsheng Xu. Hidden Markov Model for Gesture Recognition. Fort Belvoir, VA: Defense Technical Information Center, May 1994. http://dx.doi.org/10.21236/ada282845.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Krebs, William B. Markov Chain Simulations of Binary Matrices. Fort Belvoir, VA: Defense Technical Information Center, January 1992. http://dx.doi.org/10.21236/ada249265.

Full text
APA, Harvard, Vancouver, ISO, and other styles