Academic literature on the topic 'Markov'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Markov.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Journal articles on the topic "Markov"
Sauvageot, Jean-Luc. "Markov quantum semigroups admit covariant Markov C*-dilations." Communications in Mathematical Physics 106, no. 1 (March 1986): 91–103. http://dx.doi.org/10.1007/bf01210927.
Guyo, X., and C. Hardouin. "Markov chain Markov field dynamics: Models and statistics." Statistics 35, no. 4 (January 2001): 593–627. http://dx.doi.org/10.1080/02331880108802756.
Kulkarni, V. G., and V. G. Adlakha. "Markov and Markov-Regenerative PERT Networks." Operations Research 34, no. 5 (October 1986): 769–81. http://dx.doi.org/10.1287/opre.34.5.769.
Grewal, Jasleen K., Martin Krzywinski, and Naomi Altman. "Markov models—Markov chains." Nature Methods 16, no. 8 (July 30, 2019): 663–64. http://dx.doi.org/10.1038/s41592-019-0476-x.
Grewal, Jasleen K., Martin Krzywinski, and Naomi Altman. "Markov models — hidden Markov models." Nature Methods 16, no. 9 (August 30, 2019): 795–96. http://dx.doi.org/10.1038/s41592-019-0532-6.
Guédon, Yann. "Hidden hybrid Markov/semi-Markov chains." Computational Statistics & Data Analysis 49, no. 3 (June 2005): 663–88. http://dx.doi.org/10.1016/j.csda.2004.05.033.
Chandgotia, Nishant, Guangyue Han, Brian Marcus, Tom Meyerovitch, and Ronnie Pavlov. "One-dimensional Markov random fields, Markov chains and topological Markov fields." Proceedings of the American Mathematical Society 142, no. 1 (October 3, 2013): 227–42. http://dx.doi.org/10.1090/s0002-9939-2013-11741-7.
Rofiroh, Rofiroh Rofiroh. "APLIKASI RANTAI MARKOV PADA PENENTUAN HARI BERSALJU DI BEBERAPA KOTA AMERIKA SERIKAT." STATMAT : JURNAL STATISTIKA DAN MATEMATIKA 2, no. 2 (July 31, 2020): 28. http://dx.doi.org/10.32493/sm.v2i2.5435.
Demenkov, N. P., E. A. Mirkin, and I. A. Mochalov. "Markov and Semi-Markov Processes with Fuzzy States. Part 1. Markov Processes." Informacionnye tehnologii 26, no. 6 (June 23, 2020): 323–34. http://dx.doi.org/10.17587/it.26.323-334.
Alimov, D. "Markov Functionals of an Ergodic Markov Process." Theory of Probability & Its Applications 39, no. 3 (January 1995): 504–12. http://dx.doi.org/10.1137/1139035.
Full textDissertations / Theses on the topic "Markov"
Di, Cecco Davide <1980>. "Markov exchangeable data and mixtures of Markov Chains." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2009. http://amsdottorato.unibo.it/1547/1/Di_Cecco_Davide_Tesi.pdf.
Yildirak, Sahap Kasirga. "The Identification of a Bivariate Markov Chain Market Model." PhD thesis, METU, 2004. http://etd.lib.metu.edu.tr/upload/1257898/index.pdf.
Full textTillman, Måns. "On-Line Market Microstructure Prediction Using Hidden Markov Models." Thesis, KTH, Matematisk statistik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-208312.
Full textUnder de senaste decennierna har det gjorts stora framsteg inom finansiell teori för kapitalmarknader. Formuleringen av arbitrageteori medförde möjligheten att konsekvent kunna prissätta finansiella instrument. Men i en tid då högfrekvenshandel numera är standard, har omsättningen av information i pris börjat ske i allt snabbare takt. För att studera dessa fenomen; prispåverkan och informationsomsättning, har mikrostrukturteorin vuxit fram. I den här uppsatsen studerar vi mikrostruktur med hjälp av en dynamisk modell. Historiskt sett har mikrostrukturteorin fokuserat på statiska modeller men med hjälp av icke-linjära dolda Markovmodeller (HMM:er) utökar vi detta till den dynamiska domänen. HMM:er kommer med en naturlig uppdelning mellan observation och dynamik, och är utformade på ett sådant sätt att vi kan dra nytta av domänspecifik kunskap. Genom att formulera lämpliga nyckelantaganden baserade på traditionell mikrostrukturteori specificerar vi en modell—med endast ett fåtal parametrar—som klarar av att beskriva de välkända säsongsbeteenden som statiska modeller inte klarar av. Tack vare nya genombrott inom Monte Carlo-metoder finns det nu kraftfulla verktyg att tillgå för att utföra optimal filtrering med HMM:er i realtid. Vi applicerar ett så kallat bootstrap filter för att sekventiellt filtrera fram tillståndet för modellen och prediktera framtida tillstånd. Tillsammans med tekniken backward smoothing estimerar vi den posteriora simultana fördelningen för varje handelsdag. Denna används sedan för statistisk inlärning av våra hyperparametrar via en sekventiell Monte Carlo Expectation Maximization-algoritm. För att formulera en modell som beskriver omsättningen av information, väljer vi att utgå ifrån volume imbalance, som ofta används för att studera prispåverkan. 
Vi definierar den relaterade observerbara storheten scaled volume imbalance som syftar till att bibehålla kopplingen till prispåverkan men även går att modellera med en dynamisk process som passar in i ramverket för HMM:er. Vi visar även hur man inom detta ramverk kan utvärdera HMM:er i allmänhet, samt genomför denna analys för vår modell i synnerhet. Modellen testas mot finansiell handelsdata för både terminskontrakt och aktier och visar i bägge fall god predikteringsförmåga.
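The filtering pipeline this abstract describes (a bootstrap particle filter sequentially estimating the latent state of an HMM) can be sketched on a toy model. The linear-Gaussian dynamics below are an assumed stand-in for illustration, not the thesis's actual microstructure model or its scaled-volume-imbalance observable:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy HMM: latent AR(1) state observed with Gaussian noise.
# PHI, Q, R are illustrative parameters, not values from the thesis.
PHI, Q, R = 0.95, 0.1, 0.5  # state persistence, state noise var, obs noise var

def simulate(T):
    """Generate a latent path x and noisy observations y."""
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = PHI * x[t - 1] + rng.normal(0.0, np.sqrt(Q))
    y = x + rng.normal(0.0, np.sqrt(R), size=T)
    return x, y

def bootstrap_filter(y, n_particles=1000):
    """Sequential importance resampling with the transition as proposal."""
    T = len(y)
    means = np.zeros(T)
    particles = rng.normal(0.0, 1.0, n_particles)  # diffuse prior
    for t in range(T):
        # Propagate through the state dynamics (the "bootstrap" proposal).
        particles = PHI * particles + rng.normal(0.0, np.sqrt(Q), n_particles)
        # Weight by the observation likelihood.
        w = np.exp(-0.5 * (y[t] - particles) ** 2 / R)
        w /= w.sum()
        means[t] = np.dot(w, particles)  # filtered mean E[x_t | y_1..y_t]
        # Multinomial resampling to avoid weight degeneracy.
        particles = rng.choice(particles, size=n_particles, p=w)
    return means

x, y = simulate(200)
est = bootstrap_filter(y)
# The filtered mean should track the latent state more closely than the raw
# observations do.
print(np.mean((est - x) ** 2), np.mean((y - x) ** 2))
```

With the observation noise dominating the state noise, the filter's mean squared error is well below that of the raw observations; the same resampling scheme extends to the non-linear observation models the thesis works with.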
Desharnais, Josée. "Labelled Markov processes." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape3/PQDD_0031/NQ64546.pdf.
Balan, Raluca M. "Set-Markov processes." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2001. http://www.collectionscanada.ca/obj/s4/f2/dsk3/ftp04/NQ66119.pdf.
Eltannir, Akram A. "Markov interactive processes." Diss., Georgia Institute of Technology, 1993. http://hdl.handle.net/1853/30745.
Werner, Ivan. "Contractive Markov systems." Thesis, University of St Andrews, 2004. http://hdl.handle.net/10023/15173.
Durrell, Fernando. "Constrained portfolio selection with Markov and non-Markov processes and insiders." Doctoral thesis, University of Cape Town, 2007. http://hdl.handle.net/11427/4379.
Skorniakov, Viktor. "Asymptotically homogeneous Markov chains." Doctoral thesis, Lithuanian Academic Libraries Network (LABT), 2010. http://vddb.laba.lt/obj/LT-eLABa-0001:E.02~2010~D_20101223_152954-43357.
Full textDisertacijoje tirta Markovo grandinių klasė, kurios iteracijos nusakomos atsitiktinėmis asimptotiškai homogeninėmis funkcijomis, ir išspręsti du uždaviniai: 1) surastos bendros sąlygos, kurios garantuoja vienintelio stacionaraus skirstinio egzistavimą; 2) vienmatėms grandinėms surastos sąlygos, kurioms esant stacionarus skirstinys turi "sunkias" uodegas.
Books on the topic "Markov"
Hou, Zhenting, Jerzy A. Filar, and Anyue Chen, eds. Markov Processes and Controlled Markov Chains. Boston, MA: Springer US, 2002. http://dx.doi.org/10.1007/978-1-4613-0265-0.
Hou, Zhenting, Jerzy A. Filar, and Anyue Chen, eds. Markov processes and controlled Markov chains. Dordrecht: Kluwer Academic Publishers, 2002.
Domingos, Pedro, and Daniel Lowd. Markov Logic. Cham: Springer International Publishing, 2009. http://dx.doi.org/10.1007/978-3-031-01549-6.
Gagniuc, Paul A. Markov Chains. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2017. http://dx.doi.org/10.1002/9781119387596.
Brémaud, Pierre. Markov Chains. New York, NY: Springer New York, 1999. http://dx.doi.org/10.1007/978-1-4757-3124-8.
Sericola, Bruno. Markov Chains. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2013. http://dx.doi.org/10.1002/9781118731543.
Douc, Randal, Eric Moulines, Pierre Priouret, and Philippe Soulier. Markov Chains. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-97704-1.
Graham, Carl. Markov Chains. Chichester, UK: John Wiley & Sons, Ltd, 2014. http://dx.doi.org/10.1002/9781118881866.
Brémaud, Pierre. Markov Chains. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-45982-6.
Ching, Wai-Ki, Ximin Huang, Michael K. Ng, and Tak-Kuen Siu. Markov Chains. Boston, MA: Springer US, 2013. http://dx.doi.org/10.1007/978-1-4614-6312-2.
Book chapters on the topic "Markov"
Koralov, Leonid, and Yakov G. Sinai. "Markov Processes and Markov Families." In Theory of Probability and Random Processes, 273–87. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-540-68829-7_19.
Gheorghe, Adrian V. "Semi-Markov and Markov Chains." In Decision Processes in Dynamic Probabilistic System, 1–104. Dordrecht: Springer Netherlands, 1990. http://dx.doi.org/10.1007/978-94-009-0493-4_1.
Girardin, Valérie, and Nikolaos Limnios. "Markov and Semi-Markov Processes." In Applied Probability, 215–52. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-97412-5_5.
Tapiero, Charles S. "Modelling: Markov Chains and Markov Processes." In Applied Stochastic Models and Control for Finance and Insurance, 41–88. Boston, MA: Springer US, 1998. http://dx.doi.org/10.1007/978-1-4615-5823-1_2.
Nakagawa, Toshio. "Semi-Markov and Markov Renewal Processes." In Springer Series in Reliability Engineering, 123–48. London: Springer London, 2011. http://dx.doi.org/10.1007/978-0-85729-274-2_5.
Forsyth, David. "Markov Chains and Hidden Markov Models." In Probability and Statistics for Computer Science, 331–51. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-64410-3_14.
Aoki, Satoshi, Hisayuki Hara, and Akimichi Takemura. "Running Markov Chain Without Markov Bases." In Springer Series in Statistics, 275–86. New York, NY: Springer New York, 2012. http://dx.doi.org/10.1007/978-1-4614-3719-2_16.
Timischl, Werner. "Markov-Ketten und Hidden-Markov-Modelle." In Mathematische Methoden der Bioinformatik - Eine Einführung, 161–218. Berlin, Heidelberg: Springer Berlin Heidelberg, 2023. http://dx.doi.org/10.1007/978-3-662-67458-1_7.
Nagasawa, Masao. "Markov Processes." In Stochastic Processes in Quantum Physics, 1–26. Basel: Birkhäuser Basel, 2000. http://dx.doi.org/10.1007/978-3-0348-8383-2_1.
Gardiner, Crispin W. "Markov Processes." In Springer Series in Synergetics, 42–79. Berlin, Heidelberg: Springer Berlin Heidelberg, 1985. http://dx.doi.org/10.1007/978-3-662-02452-2_3.
Conference papers on the topic "Markov"
Oguz, Mehmet Kayra, and Alexander Dockhorn. "Markov Senior - Learning Markov Junior Grammars to Generate User-specified Content." In 2024 IEEE Conference on Games (CoG), 1–8. IEEE, 2024. http://dx.doi.org/10.1109/cog60054.2024.10645650.
Hara, Hisayuki, Satoshi Aoki, and Akimichi Takemura. "Running Markov Chain without Markov Basis." In Harmony of Gröbner Bases and the Modern Industrial Society - The Second CREST-CSBM International Conference. Singapore: World Scientific Publishing Co. Pte. Ltd., 2012. http://dx.doi.org/10.1142/9789814383462_0005.
Farshchian, Maryam, and Majid Vafaei Jahan. "Stock market prediction with Hidden Markov Model." In 2015 International Congress on Technology, Communication and Knowledge (ICTCK). IEEE, 2015. http://dx.doi.org/10.1109/ictck.2015.7582714.
Gupta, Aditya, and Bhuwan Dhingra. "Stock market prediction using Hidden Markov Models." In 2012 Students Conference on Engineering and Systems (SCES). IEEE, 2012. http://dx.doi.org/10.1109/sces.2012.6199099.
Somani, Poonam, Shreyas Talele, and Suraj Sawant. "Stock market prediction using Hidden Markov Model." In 2014 IEEE 7th Joint International Information Technology and Artificial Intelligence Conference (ITAIC). IEEE, 2014. http://dx.doi.org/10.1109/itaic.2014.7065011.
Roark, Brian. "Markov parsing." In the 40th Annual Meeting. Morristown, NJ, USA: Association for Computational Linguistics, 2001. http://dx.doi.org/10.3115/1073083.1073131.
Domingos, Pedro. "Markov logic." In Proceeding of the 17th ACM conference. New York, New York, USA: ACM Press, 2008. http://dx.doi.org/10.1145/1458082.1458084.
Karlin, A. R., S. J. Phillips, and P. Raghavan. "Markov paging." In Proceedings., 33rd Annual Symposium on Foundations of Computer Science. IEEE, 1992. http://dx.doi.org/10.1109/sfcs.1992.267771.
Chierichetti, Flavio, Ravi Kumar, and Prabhakar Raghavan. "Markov Layout." In 2011 IEEE 52nd Annual Symposium on Foundations of Computer Science (FOCS). IEEE, 2011. http://dx.doi.org/10.1109/focs.2011.71.
Dimoulkas, Ilias, Mikael Amelin, and Mohammad Reza Hesamzadeh. "Forecasting balancing market prices using Hidden Markov Models." In 2016 13th International Conference on the European Energy Market (EEM). IEEE, 2016. http://dx.doi.org/10.1109/eem.2016.7521229.
Reports on the topic "Markov"
Adler, Robert J., Stamatis Gambanis, and Gennady Samorodnitsky. On Stable Markov Processes. Fort Belvoir, VA: Defense Technical Information Center, September 1987. http://dx.doi.org/10.21236/ada192892.
Ghahramani, Zoubin, and Michael I. Jordan. Factorial Hidden Markov Models. Fort Belvoir, VA: Defense Technical Information Center, January 1996. http://dx.doi.org/10.21236/ada307097.
Ainsleigh, Phillip L. Theory of Continuous-State Hidden Markov Models and Hidden Gauss-Markov Models. Fort Belvoir, VA: Defense Technical Information Center, March 2001. http://dx.doi.org/10.21236/ada415930.
Thrun, Sebastian, and John Langford. Monte Carlo Hidden Markov Models. Fort Belvoir, VA: Defense Technical Information Center, December 1998. http://dx.doi.org/10.21236/ada363714.
Dueker, Michael J. Markov Switching in GARCH Processes and Mean Reverting Stock Market Volatility. Federal Reserve Bank of St. Louis, 1994. http://dx.doi.org/10.20955/wp.1994.015.
Ramezani, Vahid R., and Steven I. Marcus. Risk-Sensitive Probability for Markov Chains. Fort Belvoir, VA: Defense Technical Information Center, September 2002. http://dx.doi.org/10.21236/ada438509.
Dean, Thomas L. Model Acquisition for Markov Decision Problems. Fort Belvoir, VA: Defense Technical Information Center, October 1998. http://dx.doi.org/10.21236/ada373795.
Gelfand, Alan E., and Sujit K. Sahu. On Markov Chain Monte Carlo Acceleration. Fort Belvoir, VA: Defense Technical Information Center, April 1994. http://dx.doi.org/10.21236/ada279393.
Yang, Jie, and Yangsheng Xu. Hidden Markov Model for Gesture Recognition. Fort Belvoir, VA: Defense Technical Information Center, May 1994. http://dx.doi.org/10.21236/ada282845.
Krebs, William B. Markov Chain Simulations of Binary Matrices. Fort Belvoir, VA: Defense Technical Information Center, January 1992. http://dx.doi.org/10.21236/ada249265.