Journal articles on the topic 'Markov decision theory'
Consult the leading journal articles on the topic 'Markov decision theory' for your research.
Weng, Paul, and Olivier Spanjaard. "Functional Reward Markov Decision Processes: Theory and Applications." International Journal on Artificial Intelligence Tools 26, no. 3 (June 2017): 1760014. http://dx.doi.org/10.1142/s0218213017600144.
Buchholz, Peter. "Bounding reward measures of Markov models using the Markov decision processes." Numerical Linear Algebra with Applications 18, no. 6 (October 18, 2011): 919–30. http://dx.doi.org/10.1002/nla.792.
Ortega-Gutiérrez, R. Israel, and H. Cruz-Suárez. "A Moreau-Yosida regularization for Markov decision processes." Proyecciones (Antofagasta) 40, no. 1 (February 1, 2020): 117–37. http://dx.doi.org/10.22199/issn.0717-6279-2021-01-0008.
Koole, Ger. "Monotonicity in Markov Reward and Decision Chains: Theory and Applications." Foundations and Trends® in Stochastic Systems 1, no. 1 (2006): 1–76. http://dx.doi.org/10.1561/0900000002.
Cai, Lin. "Research of Optimizing Computer Network Based on Dynamism Theory." Applied Mechanics and Materials 556-562 (May 2014): 5356–58. http://dx.doi.org/10.4028/www.scientific.net/amm.556-562.5356.
Brázdil, Tomáš, Václav Brožek, Vojtěch Forejt, and Antonín Kučera. "Reachability in recursive Markov decision processes." Information and Computation 206, no. 5 (May 2008): 520–37. http://dx.doi.org/10.1016/j.ic.2007.09.002.
Barker, Richard J., and Matthew R. Schofield. "Putting Markov Chains Back into Markov Chain Monte Carlo." Journal of Applied Mathematics and Decision Sciences 2007 (October 30, 2007): 1–13. http://dx.doi.org/10.1155/2007/98086.
Barkalov, S. A., A. V. Ananiev, K. S. Ivannikov, and S. I. Moiseev. "Algorithm and methods for management decision-making based on the theory of latent variables under time conditions." Bulletin of the South Ural State University. Ser. Computer Technologies, Automatic Control & Radioelectronics 22, no. 3 (2022): 106–16. http://dx.doi.org/10.14529/ctcr220310.
Kadota, Yoshinobu, Masami Kurano, and Masami Yasuda. "Discounted Markov decision processes with utility constraints." Computers & Mathematics with Applications 51, no. 2 (January 2006): 279–84. http://dx.doi.org/10.1016/j.camwa.2005.11.013.
Hasteer, Nitasha, Abhay Bansal, and B. K. Murthy. "Crowdsourced Software Development Process: Investigation and Modeling through Markov Decision Theory." International Journal of Software Engineering and Its Applications 9, no. 9 (September 30, 2015): 41–54. http://dx.doi.org/10.14257/ijseia.2015.9.9.05.
Usaha, W., and J. Barria. "Markov decision theory framework for resource allocation in LEO satellite constellations." IEE Proceedings - Communications 149, no. 5 (December 1, 2002): 270–76. http://dx.doi.org/10.1049/ip-com:20020510.
Rummukainen, H., and J. Virtamo. "Polynomial cost approximations in Markov decision theory based call admission control." IEEE/ACM Transactions on Networking 9, no. 6 (2001): 769–79. http://dx.doi.org/10.1109/90.974530.
Hinz, Juri. "An Algorithm for Making Regime-Changing Markov Decisions." Algorithms 14, no. 10 (October 4, 2021): 291. http://dx.doi.org/10.3390/a14100291.
Ying, Shenggang, and Mingsheng Ying. "Reachability analysis of quantum Markov decision processes." Information and Computation 263 (December 2018): 31–51. http://dx.doi.org/10.1016/j.ic.2018.09.001.
Doyen, Laurent, Thierry Massart, and Mahsa Shirmohammadi. "The complexity of synchronizing Markov decision processes." Journal of Computer and System Sciences 100 (March 2019): 96–129. http://dx.doi.org/10.1016/j.jcss.2018.09.004.
Guerreiro, Sérgio. "Using Markov Theory to Deliver Informed Decisions in Partially Observable Business Processes Operation." International Journal of Operations Research and Information Systems 9, no. 2 (April 2018): 53–72. http://dx.doi.org/10.4018/ijoris.2018040103.
Pazis, Jason, and Ronald Parr. "PAC Optimal Exploration in Continuous Space Markov Decision Processes." Proceedings of the AAAI Conference on Artificial Intelligence 27, no. 1 (June 30, 2013): 774–81. http://dx.doi.org/10.1609/aaai.v27i1.8678.
Van Dijk, Nico M., and Martin L. Puterman. "Perturbation theory for Markov reward processes with applications to queueing systems." Advances in Applied Probability 20, no. 1 (March 1988): 79–98. http://dx.doi.org/10.2307/1427271.
Jain, Rahul, and Pravin Varaiya. "Simulation-based optimization of Markov decision processes: An empirical process theory approach." Automatica 46, no. 8 (August 2010): 1297–304. http://dx.doi.org/10.1016/j.automatica.2010.05.021.
Pothos, Emmanuel M., and Jerome R. Busemeyer. "A quantum probability explanation for violations of 'rational' decision theory." Proceedings of the Royal Society B: Biological Sciences 276, no. 1665 (March 25, 2009): 2171–78. http://dx.doi.org/10.1098/rspb.2009.0121.
Sennott, Linn I. "Average Cost Semi-Markov Decision Processes and the Control of Queueing Systems." Probability in the Engineering and Informational Sciences 3, no. 2 (April 1989): 247–72. http://dx.doi.org/10.1017/s0269964800001121.
Leder, Nicole, Bernd Heidergott, and Arie Hordijk. "An Approximation Approach for the Deviation Matrix of Continuous-Time Markov Processes with Application to Markov Decision Theory." Operations Research 58, no. 4-part-1 (August 2010): 918–32. http://dx.doi.org/10.1287/opre.1090.0786.
Flesch, János, Arkadi Predtetchinski, and Eilon Solan. "Sporadic Overtaking Optimality in Markov Decision Problems." Dynamic Games and Applications 7, no. 2 (April 13, 2016): 212–28. http://dx.doi.org/10.1007/s13235-016-0186-2.
Chen, Cai Ping, Qiao Jing Liu, and Pan Zheng. "Application of Grey-Markov Model in Predicting Container Throughput of Fujian Province." Advanced Materials Research 779-780 (September 2013): 720–23. http://dx.doi.org/10.4028/www.scientific.net/amr.779-780.720.
Guerreiro, Sérgio. "(Re)Designing Business Processes Using Markov Theory and Constrained State, Transition and Actor Role Spaces." International Journal of Knowledge-Based Organizations 9, no. 2 (April 2019): 43–61. http://dx.doi.org/10.4018/ijkbo.2019040103.
Salari, Nooshin, and Viliam Makis. "Application of Markov renewal theory and semi-Markov decision processes in maintenance modeling and optimization of multi-unit systems." Naval Research Logistics (NRL) 67, no. 7 (August 3, 2020): 548–58. http://dx.doi.org/10.1002/nav.21932.
Morcous, George, and Zoubir Lounis. "Integration of stochastic deterioration models with multicriteria decision theory for optimizing maintenance of bridge decks." Canadian Journal of Civil Engineering 33, no. 6 (June 1, 2006): 756–65. http://dx.doi.org/10.1139/l06-011.
Mikhalov, Oleksandr Illich, Oleksandr Afrykanovych Stenin, Viktor Petrovych Pasko, Oleksandr Serhiiovych Stenin, and Yurii Opanasovych Tymoshyn. "Situational planning and operational adjustment of the route of the Autonomous robotic underwater vehicle." System technologies 3, no. 122 (October 10, 2019): 3–11. http://dx.doi.org/10.34185/1562-9945-3-122-2019-01.
Liang, Jinglian, Chao Xu, Zhiyong Feng, and Xirong Ma. "Hidden Markov Model Decision Forest for Dynamic Facial Expression Recognition." International Journal of Pattern Recognition and Artificial Intelligence 29, no. 7 (September 28, 2015): 1556010. http://dx.doi.org/10.1142/s0218001415560108.
Azam, Md Ali, Hans D. Mittelmann, and Shankarachary Ragi. "UAV Formation Shape Control via Decentralized Markov Decision Processes." Algorithms 14, no. 3 (March 17, 2021): 91. http://dx.doi.org/10.3390/a14030091.
Cruz-Suárez, H., G. Zacarías-Espinoza, and V. Vázquez-Guevara. "A Version of the Euler Equation in Discounted Markov Decision Processes." Journal of Applied Mathematics 2012 (2012): 1–16. http://dx.doi.org/10.1155/2012/103698.
Huang, Chien-Cheng, Kwo-Jean Farn, Feng-Yu Lin, and Frank Yeong-Sung Lin. "Software Vulnerability Patch Management with Semi-Markov Decision Process." Applied Mathematics & Information Sciences 7, no. 6 (November 1, 2013): 2467–76. http://dx.doi.org/10.12785/amis/070640.
Khalvati, Koosha, Seongmin A. Park, Saghar Mirbagheri, Remi Philippe, Mariateresa Sestito, Jean-Claude Dreher, and Rajesh P. N. Rao. "Modeling other minds: Bayesian inference explains human choices in group decision-making." Science Advances 5, no. 11 (November 2019): eaax8783. http://dx.doi.org/10.1126/sciadv.aax8783.
Junges, Sebastian, Joost-Pieter Katoen, Guillermo A. Pérez, and Tobias Winkler. "The complexity of reachability in parametric Markov decision processes." Journal of Computer and System Sciences 119 (August 2021): 183–210. http://dx.doi.org/10.1016/j.jcss.2021.02.006.
Sennott, Linn I. "The Computation of Average Optimal Policies in Denumerable State Markov Decision Chains." Advances in Applied Probability 29, no. 1 (March 1997): 114–37. http://dx.doi.org/10.2307/1427863.
Kolarov, Aleksandar, and Joseph Hui. "On computing Markov decision theory-based cost for routing in circuit-switched broadband networks." Journal of Network and Systems Management 3, no. 4 (December 1995): 405–26. http://dx.doi.org/10.1007/bf02139532.
Wei, Qingda. "Finite approximation for finite-horizon continuous-time Markov decision processes." 4OR 15, no. 1 (June 11, 2016): 67–84. http://dx.doi.org/10.1007/s10288-016-0321-3.
Melnik, Roderick V. Nicholas. "Dynamic system evolution and markov chain approximation." Discrete Dynamics in Nature and Society 2, no. 1 (1998): 7–39. http://dx.doi.org/10.1155/s1026022698000028.
Sagum, Ria Ambrocio. "Filipino Native Language Identification using Markov Chain Model and Maximum Likelihood Decision Rule." Turkish Journal of Computer and Mathematics Education (TURCOMAT) 12, no. 3 (April 11, 2021): 5475–78. http://dx.doi.org/10.17762/turcomat.v12i3.2206.
Vanneste, Stephan G. "A Generalized Age-Replacement Model." Probability in the Engineering and Informational Sciences 6, no. 4 (October 1992): 525–41. http://dx.doi.org/10.1017/s0269964800002710.
Lan, Jian-yi, and Ying Zhou. "Application of Gray Markov SCGM1,1c Model to Prediction of Accidents Deaths in Coal Mining." International Scholarly Research Notices 2014 (November 4, 2014): 1–7. http://dx.doi.org/10.1155/2014/632804.
Girtler, Jerzy. "Possibility of estimating the reliability of diesel engines by applying the theory of semi-Markov processes and making operational decisions by considering reliability of diagnosis on technical state of this sort of combustion engines." Combustion Engines 163, no. 4 (November 1, 2015): 57–66. http://dx.doi.org/10.19206/ce-116857.
Guerreiro, Sérgio Luís Proença Duarte. "Decision-making in partially known business process environments using Markov theory and policy graph visualisation." International Journal of Business Information Systems 36, no. 3 (2021): 355. http://dx.doi.org/10.1504/ijbis.2021.113283.
Alidrisi, Mustafa M. "Optimal control of the service rate of an exponential queuing network using Markov decision theory." International Journal of Systems Science 21, no. 12 (December 1990): 2553–63. http://dx.doi.org/10.1080/00207729008910569.
Guerreiro, Sérgio. "Decision-Making in Partially Known Business Process Environments using Markov Theory and Policy Graph Visualization." International Journal of Business Information Systems 1, no. 1 (2020): 1. http://dx.doi.org/10.1504/ijbis.2020.10024182.
Li, Jing. "A Replica Selection Decision in Cloud Computing Environment." Advanced Materials Research 121-122 (June 2010): 801–6. http://dx.doi.org/10.4028/www.scientific.net/amr.121-122.801.
Linzner, Dominik, and Heinz Koeppl. "A Variational Perturbative Approach to Planning in Graph-Based Markov Decision Processes." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 5 (April 3, 2020): 7203–10. http://dx.doi.org/10.1609/aaai.v34i05.6210.