Academic literature on the topic 'Markov processes'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Markov processes.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Markov processes"

1

Demenkov, N. P., E. A. Mirkin, and I. A. Mochalov. "Markov and Semi-Markov Processes with Fuzzy States. Part 1. Markov Processes." Informacionnye Tehnologii 26, no. 6 (June 23, 2020): 323–34. http://dx.doi.org/10.17587/it.26.323-334.

2

Franz, Uwe. "Classical Markov Processes from Quantum Lévy Processes." Infinite Dimensional Analysis, Quantum Probability and Related Topics 2, no. 1 (March 1999): 105–29. http://dx.doi.org/10.1142/s0219025799000060.

Abstract:
We show how classical Markov processes can be obtained from quantum Lévy processes. It is shown that quantum Lévy processes are quantum Markov processes, and sufficient conditions for restrictions to subalgebras to remain quantum Markov processes are given. A classical Markov process (which has the same time-ordered moments as the quantum process in the vacuum state) exists whenever we can restrict to a commutative subalgebra without losing the quantum Markov property. Several examples, including the Azéma martingale, are presented with explicit calculations. In particular, the action of the generator of the classical Markov processes on polynomials or their moments is calculated using Hopf algebra duality.
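In symbols, the moment condition described in this abstract can be written as follows (our notation, not the paper's): for time points t_1 ≤ … ≤ t_n, a fixed self-adjoint element x of the chosen commutative subalgebra, the vacuum vector Ω, and the classical process (X_t),

```latex
\mathbb{E}\bigl[X_{t_1} X_{t_2} \cdots X_{t_n}\bigr]
  \;=\;
  \bigl\langle \Omega,\; j_{t_1}(x)\, j_{t_2}(x) \cdots j_{t_n}(x)\,\Omega \bigr\rangle,
\qquad t_1 \le t_2 \le \cdots \le t_n,
```

where (j_t) denotes the quantum Lévy process whose time-ordered vacuum moments the classical Markov process reproduces.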
3

Demenkov, N. P., E. A. Mirkin, and I. A. Mochalov. "Markov and Semi-Markov Processes with Fuzzy States. Part 2. Semi-Markov Processes." Informacionnye Tehnologii 26, no. 7 (July 17, 2020): 387–93. http://dx.doi.org/10.17587/it.26.387-393.

4

Whittle, P., and M. L. Puterman. "Markov Decision Processes." Journal of the Royal Statistical Society. Series A (Statistics in Society) 158, no. 3 (1995): 636. http://dx.doi.org/10.2307/2983459.

5

Smith, J. Q., and D. J. White. "Markov Decision Processes." Journal of the Royal Statistical Society. Series A (Statistics in Society) 157, no. 1 (1994): 164. http://dx.doi.org/10.2307/2983520.

6

King, Aaron A., Qianying Lin, and Edward L. Ionides. "Markov genealogy processes." Theoretical Population Biology 143 (February 2022): 77–91. http://dx.doi.org/10.1016/j.tpb.2021.11.003.

7

Thomas, L. C., D. J. White, and Martin L. Puterman. "Markov Decision Processes." Journal of the Operational Research Society 46, no. 6 (June 1995): 792. http://dx.doi.org/10.2307/2584317.

8

Ephraim, Y., and N. Merhav. "Hidden Markov processes." IEEE Transactions on Information Theory 48, no. 6 (June 2002): 1518–69. http://dx.doi.org/10.1109/tit.2002.1003838.

9

Bäuerle, Nicole, and Ulrich Rieder. "Markov Decision Processes." Jahresbericht der Deutschen Mathematiker-Vereinigung 112, no. 4 (September 8, 2010): 217–43. http://dx.doi.org/10.1365/s13291-010-0007-2.

10

Wal, J., and J. Wessels. "Markov Decision Processes." Statistica Neerlandica 39, no. 2 (June 1985): 219–33. http://dx.doi.org/10.1111/j.1467-9574.1985.tb01140.x.


Dissertations / Theses on the topic "Markov processes"

1

Desharnais, Josée. "Labelled Markov processes." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape3/PQDD_0031/NQ64546.pdf.

2

Balan, Raluca M. "Set-Markov processes." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2001. http://www.collectionscanada.ca/obj/s4/f2/dsk3/ftp04/NQ66119.pdf.

3

Eltannir, Akram A. "Markov interactive processes." Diss., Georgia Institute of Technology, 1993. http://hdl.handle.net/1853/30745.

4

Haugomat, Tristan. "Localisation en espace de la propriété de Feller avec application aux processus de type Lévy." Thesis, Rennes 1, 2018. http://www.theses.fr/2018REN1S046/document.

Abstract:
In this PhD thesis, we give a space localisation for the theory of Feller processes. A first objective is to obtain simple and precise results on the convergence of Markov processes. A second objective is to study the link between the notions of Feller property, martingale problem and Skorokhod topology. First we give a localised version of the Skorokhod topology and study the notions of compactness and tightness for it. We make the connection between the localised and unlocalised Skorokhod topologies by using the notion of time change. In a second step, using the localised Skorokhod topology and the time change, we study martingale problems. We show the equivalence between, on the one hand, being the solution of a well-posed martingale problem, on the other hand, satisfying a localised version of the Feller property, and finally, being a Markov process weakly continuous with respect to its initial condition. We characterise weak convergence for solutions of martingale problems in terms of convergence of the associated operators and give a similar result for discrete-time approximations. Finally, we apply the theory of locally Feller processes to two examples. We first apply it to Lévy-type processes and obtain convergence results for discrete- and continuous-time processes, including simulation methods and Euler schemes. We then apply the same theory to one-dimensional diffusions in a potential and obtain convergence results of diffusions or random walks towards singular diffusions. As a consequence, we deduce the convergence of random walks in random environments towards diffusions in random potentials.
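For readers unfamiliar with the terminology, the martingale problem that runs through this abstract is the standard one: a textbook formulation (our notation, not the thesis's) says that a process X with generator L solves the martingale problem for L when, for every test function f in the domain of L,

```latex
M_t^{f} \;=\; f(X_t) - f(X_0) - \int_0^t (Lf)(X_s)\,\mathrm{d}s
\quad \text{is a martingale.}
```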
5

Chong, King-sing (莊競誠). "Explorations in Markov processes." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1997. http://hub.hku.hk/bib/B31235682.

6

James, Huw William. "Transient Markov decision processes." Thesis, University of Bristol, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.430192.

7

Ku, Ho Ming. "Interacting Markov branching processes." Thesis, University of Liverpool, 2014. http://livrepository.liverpool.ac.uk/2002759/.

Abstract:
In engineering, biology and physics, many systems consist of particles or members that give birth and die through time. Such systems can be modelled by continuous-time Markov chains and Markov processes, whose applications have been investigated by many scientists, for example Jagers [1975]. In ordinary Markov branching processes, the particles or members are assumed to be identical and independent. In some cases, however, two members of the species may interact or collide to produce new births, and modelling these cases requires more general processes: collision branching processes. An even more general model allows each particle to have both a branching effect and a collision effect, so that the branching component and the collision component interact; we call this an interacting branching collision process. In Chapter 1 of this thesis, we first review some background and basic concepts of continuous-time Markov chains and ordinary Markov branching processes, and then turn to the more complicated models: collision branching processes and interacting branching collision processes. In Chapter 2, for collision branching processes, we investigate the basic properties, criteria for uniqueness, and explicit expressions for the extinction probability, the expected extinction time and the expected explosion time. In Chapter 3, for interacting branching collision processes, we follow a similar structure and investigate the basic properties and criteria for uniqueness. Because the model settings are more complicated, considerably more detail is needed to treat the extinction probability; we divide this part into several sections and consider the extinction probability under different cases and assumptions. Since the explicit form of the extinction probability may be too complicated, in the last part of Chapter 3 we discuss its asymptotic behaviour. In Chapter 4, we study a related and important branching model: Markov branching processes with immigration, emigration and resurrection. We investigate its basic properties and criteria for uniqueness, and, most interestingly, its extinction probability using the techniques of Chapter 3, of which this serves as a good example. In Chapter 5, we study two interacting branching models: the interacting collision process with immigration, emigration and resurrection, and the interacting branching collision process with immigration, emigration and resurrection; we investigate their basic properties, criteria for uniqueness and extinction probabilities. My original material starts from Chapter 4. The model used in Chapter 4 was introduced by Li and Liu [2011], where some calculations in the evaluation of the extinction probability were not strictly defined; my contribution focuses on evaluating the extinction probability and discussing its asymptotic behaviour, and a paper on this model will be submitted this year. The two interacting branching models discussed in Chapter 5, and some of their important properties, are studied in detail.
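As background to the extinction probabilities studied here: for the ordinary (non-interacting) Markov branching process that the thesis reviews first, the extinction probability is the smallest root of f(s) = s, where f is the offspring probability generating function. A minimal sketch under that standard fact; the function name and the offspring distribution are our illustrative inventions, not the thesis's:

```python
def extinction_probability(offspring_pmf, tol=1e-12, max_iter=10_000):
    """Smallest fixed point of the offspring p.g.f. f(s) = sum_k p_k s^k.
    Iterating q <- f(q) from q = 0 converges monotonically to that root."""
    q = 0.0
    for _ in range(max_iter):
        q_next = sum(p * q ** k for k, p in enumerate(offspring_pmf))
        if abs(q_next - q) < tol:
            return q_next
        q = q_next
    return q

# Hypothetical supercritical law: 0 offspring w.p. 0.2, 1 w.p. 0.3, 2 w.p. 0.5
# (mean 1.3); the smallest root of f(s) = s is 0.4.
print(extinction_probability([0.2, 0.3, 0.5]))  # ~0.4
```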
8

Chong, King-sing. "Explorations in Markov processes." Hong Kong: University of Hong Kong, 1997. http://sunzi.lib.hku.hk/hkuto/record.jsp?B18736105.

9

Pötzelberger, Klaus. "On the Approximation of Finite Markov-Exchangeable Processes by Mixtures of Markov Processes." Department of Statistics and Mathematics, WU Vienna University of Economics and Business, 1991. http://epub.wu.ac.at/526/1/document.pdf.

Abstract:
We give an upper bound for the norm distance of (0,1)-valued Markov-exchangeable random variables to mixtures of distributions of Markov processes. A Markov-exchangeable random variable has a distribution that depends only on the starting value and the number of transitions 0-0, 0-1, 1-0 and 1-1. We show that if, for increasing length of variables, the norm distance to mixtures of Markov processes goes to 0, the rate of this convergence may be arbitrarily slow. (author's abstract)
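To make the definition in this abstract concrete: a (0,1)-valued Markov-exchangeable law gives equal probability to any two sequences that agree in their starting value and their four transition counts. A small illustrative helper (our construction, not the paper's):

```python
from collections import Counter

def markov_sufficient_statistic(xs):
    """Starting value plus the counts of transitions 0-0, 0-1, 1-0 and 1-1.
    A Markov-exchangeable distribution depends on a sequence only through
    this statistic."""
    counts = Counter(zip(xs, xs[1:]))
    return xs[0], {t: counts.get(t, 0) for t in [(0, 0), (0, 1), (1, 0), (1, 1)]}

# These two sequences share the statistic, so any Markov-exchangeable law
# assigns them the same probability:
print(markov_sufficient_statistic([0, 1, 1, 0, 1]))
print(markov_sufficient_statistic([0, 1, 0, 1, 1]))
```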
Series: Forschungsberichte / Institut für Statistik
10

Ferns, Norman Francis. "Metrics for Markov decision processes." Thesis, McGill University, 2003. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=80263.

Abstract:
We present a class of metrics, defined on the state space of a finite Markov decision process (MDP), each of which is sound with respect to stochastic bisimulation, a notion of MDP state equivalence derived from the theory of concurrent processes. Such metrics are based on similar metrics developed in the context of labelled Markov processes, and like those, are suitable for state space aggregation. Furthermore, we restrict our attention to a subset of this class that is appropriate for certain reinforcement learning (RL) tasks, specifically, infinite horizon tasks with an expected total discounted reward optimality criterion. Given such an RL metric, we provide bounds relating it to the optimal value function of the original MDP as well as to the value function of the aggregate MDP. Finally, we present an algorithm for calculating such a metric up to a prescribed degree of accuracy and some empirical results.
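Since the metrics in this thesis are related to the optimal value function of the MDP, a short value-iteration sketch may help fix ideas; the toy transition and reward data below are invented for illustration and are not from the thesis:

```python
import numpy as np

def value_iteration(P, R, gamma=0.9, tol=1e-8):
    """Optimal value function of a finite MDP with discount factor gamma.
    P[a][s, s'] is a transition probability, R[a][s] an expected reward."""
    V = np.zeros(P[0].shape[0])
    while True:
        Q = np.array([R[a] + gamma * P[a] @ V for a in range(len(P))])
        V_new = Q.max(axis=0)  # greedy backup over actions
        if np.max(np.abs(V_new - V)) < tol:
            return V_new
        V = V_new

# Toy MDP: two states, two actions.
P = [np.array([[0.9, 0.1], [0.2, 0.8]]),
     np.array([[0.1, 0.9], [0.7, 0.3]])]
R = [np.array([1.0, 0.0]), np.array([0.0, 2.0])]
print(value_iteration(P, R))
```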

Books on the topic "Markov processes"

1

White, D. J. Markov decision processes. New York: John Wiley & Sons, 1993.

2

Ethier, Stewart N., and Thomas G. Kurtz, eds. Markov Processes. Hoboken, NJ, USA: John Wiley & Sons, Inc., 1986. http://dx.doi.org/10.1002/9780470316658.

3

Puterman, Martin L., ed. Markov Decision Processes. Hoboken, NJ, USA: John Wiley & Sons, Inc., 1994. http://dx.doi.org/10.1002/9780470316887.

4

Hou, Zhenting, Jerzy A. Filar, and Anyue Chen, eds. Markov Processes and Controlled Markov Chains. Boston, MA: Springer US, 2002. http://dx.doi.org/10.1007/978-1-4613-0265-0.

5

Hou, Zhenting, Jerzy A. Filar, and Anyue Chen, eds. Markov processes and controlled Markov chains. Dordrecht: Kluwer Academic Publishers, 2002.

6

Pardoux, Étienne. Markov Processes and Applications. Chichester, UK: John Wiley & Sons, Ltd, 2008. http://dx.doi.org/10.1002/9780470721872.

7

Hou, Zhenting, and Qingfeng Guo. Homogeneous Denumerable Markov Processes. Berlin, Heidelberg: Springer Berlin Heidelberg, 1988. http://dx.doi.org/10.1007/978-3-642-68127-1.

8

Komorowski, Tomasz, Claudio Landim, and Stefano Olla. Fluctuations in Markov Processes. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-29880-6.

9

Blumenthal, Robert M. Excursions of Markov Processes. Boston, MA: Birkhäuser Boston, 1992. http://dx.doi.org/10.1007/978-1-4684-9412-9.

10

Hernández-Lerma, O. Adaptive Markov Control Processes. New York, NY: Springer New York, 1989. http://dx.doi.org/10.1007/978-1-4419-8714-3.


Book chapters on the topic "Markov processes"

1

Itô, Kiyosi. "Markov Processes." In Stochastic Processes, 93–178. Berlin, Heidelberg: Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-662-10065-3_3.

2

Nagasawa, Masao. "Markov Processes." In Stochastic Processes in Quantum Physics, 1–26. Basel: Birkhäuser Basel, 2000. http://dx.doi.org/10.1007/978-3-0348-8383-2_1.

3

Gardiner, Crispin W. "Markov Processes." In Springer Series in Synergetics, 42–79. Berlin, Heidelberg: Springer Berlin Heidelberg, 1985. http://dx.doi.org/10.1007/978-3-662-02452-2_3.

4

Gardiner, Crispin W. "Markov Processes." In Springer Series in Synergetics, 42–79. Berlin, Heidelberg: Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-662-05389-8_3.

5

Poler, Raúl, Josefa Mula, and Manuel Díaz-Madroñero. "Markov Processes." In Operations Research Problems, 375–419. London: Springer London, 2013. http://dx.doi.org/10.1007/978-1-4471-5577-5_10.

6

Čepin, Marko. "Markov Processes." In Assessment of Power System Reliability, 113–18. London: Springer London, 2011. http://dx.doi.org/10.1007/978-0-85729-688-7_8.

7

Jacobs, Konrad. "Markov Processes." In Discrete Stochastics, 119–54. Basel: Birkhäuser Basel, 1992. http://dx.doi.org/10.1007/978-3-0348-8645-1_6.

8

Rudnicki, Ryszard, and Marta Tyran-Kamińska. "Markov Processes." In Piecewise Deterministic Processes in Biological Models, 33–62. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-61295-9_2.

9

Nelson, Randolph. "Markov Processes." In Probability, Stochastic Processes, and Queueing Theory, 329–89. New York, NY: Springer New York, 1995. http://dx.doi.org/10.1007/978-1-4757-2426-4_8.

10

Alfa, Attahiru Sule. "Markov Processes." In Queueing Theory for Telecommunications, 11–78. Boston, MA: Springer US, 2010. http://dx.doi.org/10.1007/978-1-4419-7314-6_2.


Conference papers on the topic "Markov processes"

1

Budgett, Stephanie, Azam Asanjarani, and Heti Afimeimounga. "Visualizing Markov Processes." In Bridging the Gap: Empowering and Educating Today’s Learners in Statistics. International Association for Statistical Education, 2022. http://dx.doi.org/10.52041/iase.icots11.t10f3.

Abstract:
Researchers and educators have long been aware of the misconceptions prevalent in people’s probabilistic reasoning processes. Calls to reform the teaching of probability from a traditional and predominantly mathematical approach to include an emphasis on modelling using technology have been heeded by many. The purpose of this paper is to present our experiences of including an activity based on an interactive visualisation tool in the Markov processes module of a first-year probability course. Initial feedback suggests that the tool may support students’ understanding of the equilibrium distribution and points to certain aspects of the tool that may be beneficial. A targeted survey, to be administered in Semester 1, 2022, aims to provide more insight.
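For context on the equilibrium distribution that the visualisation tool illustrates: for a finite, irreducible and aperiodic chain with transition matrix P, it is the probability vector π satisfying πP = π. A minimal sketch with a made-up three-state chain (our example, not the paper's tool):

```python
import numpy as np

def equilibrium_distribution(P):
    """Probability vector pi with pi @ P = pi, computed from the
    eigenvector of P^T for the eigenvalue closest to 1."""
    eigvals, eigvecs = np.linalg.eig(P.T)
    pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
    return pi / pi.sum()

# Hypothetical 3-state chain; its equilibrium distribution is [0.25, 0.5, 0.25].
P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])
print(equilibrium_distribution(P))
```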
2

Mladenov, Martin, Craig Boutilier, Dale Schuurmans, Ofer Meshi, Gal Elidan, and Tyler Lu. "Logistic Markov Decision Processes." In Twenty-Sixth International Joint Conference on Artificial Intelligence. California: International Joint Conferences on Artificial Intelligence Organization, 2017. http://dx.doi.org/10.24963/ijcai.2017/346.

Abstract:
User modeling in advertising and recommendation has typically focused on myopic predictors of user responses. In this work, we consider the long-term decision problem associated with user interaction. We propose a concise specification of long-term interaction dynamics by combining factored dynamic Bayesian networks with logistic predictors of user responses, allowing state-of-the-art prediction models to be seamlessly extended. We show how to solve such models at scale by providing a constraint generation approach for approximate linear programming that overcomes the variable coupling and non-linearity induced by the logistic regression predictor. The efficacy of the approach is demonstrated on advertising domains with up to 2^54 states and 2^39 actions.
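A toy rendering (our construction, far simpler than the paper's factored model) of the basic ingredient: a logistic predictor of a binary user response whose output serves as a transition probability in the user-interaction dynamics. The feature update rule, weights and features below are hypothetical:

```python
import numpy as np

def response_probability(w, x):
    """Logistic predictor: P(response | features x) = sigmoid(w . x)."""
    return 1.0 / (1.0 + np.exp(-np.dot(w, x)))

def step(x, w, rng):
    """One toy transition: the logistic response probability decides the
    user's action, which in turn shifts a made-up engagement feature."""
    responded = rng.random() < response_probability(w, x)
    drift = 0.1 if responded else -0.05
    return np.clip(x + drift, 0.0, 1.0), responded

rng = np.random.default_rng(0)
x = np.array([0.2, 0.5])   # hypothetical user features
w = np.array([1.5, -0.5])  # hypothetical model weights
print(step(x, w, rng))
```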
3

Hutter, Marcus. "Feature Markov Decision Processes." In 2nd Conference on Artificial General Intelligence 2009. Paris, France: Atlantis Press, 2009. http://dx.doi.org/10.2991/agi.2009.30.

4

Hawkes, Alan G. "Markov processes in APL." In Conference proceedings. New York, New York, USA: ACM Press, 1990. http://dx.doi.org/10.1145/97808.97843.

5

Kumar, Ravi, Maithra Raghu, Tamás Sarlós, and Andrew Tomkins. "Linear Additive Markov Processes." In WWW '17: 26th International World Wide Web Conference. Republic and Canton of Geneva, Switzerland: International World Wide Web Conferences Steering Committee, 2017. http://dx.doi.org/10.1145/3038912.3052644.

6

Tzortzis, Ioannis, Charalambos D. Charalambous, Themistoklis Charalambous, Christoforos N. Hadjicostis, and Mikael Johansson. "Approximation of Markov processes by lower dimensional processes." In 2014 IEEE 53rd Annual Conference on Decision and Control (CDC). IEEE, 2014. http://dx.doi.org/10.1109/cdc.2014.7040082.

7

Yu, Jia Yuan, and Shie Mannor. "Arbitrarily modulated Markov decision processes." In 2009 Joint 48th IEEE Conference on Decision and Control (CDC) and 28th Chinese Control Conference (CCC). IEEE, 2009. http://dx.doi.org/10.1109/cdc.2009.5400662.

8

Kozen, Dexter, Kim G. Larsen, Radu Mardare, and Prakash Panangaden. "Stone Duality for Markov Processes." In 2013 Twenty-Eighth Annual IEEE/ACM Symposium on Logic in Computer Science (LICS 2013). IEEE, 2013. http://dx.doi.org/10.1109/lics.2013.38.

9

Brázdil, T., V. Brožek, K. Etessami, A. Kučera, and D. Wojtczak. "One-Counter Markov Decision Processes." In Proceedings of the Twenty-First Annual ACM-SIAM Symposium on Discrete Algorithms. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2010. http://dx.doi.org/10.1137/1.9781611973075.70.

10

Hölzl, Johannes. "Markov processes in Isabelle/HOL." In CPP '17: Certified Proofs and Programs. New York, NY, USA: ACM, 2017. http://dx.doi.org/10.1145/3018610.3018628.


Reports on the topic "Markov processes"

1

Adler, Robert J., Stamatis Cambanis, and Gennady Samorodnitsky. On Stable Markov Processes. Fort Belvoir, VA: Defense Technical Information Center, September 1987. http://dx.doi.org/10.21236/ada192892.

2

Abdel-Hameed, M. Markovian Shock Models, Deterioration Processes, Stratified Markov Processes and Replacement Policies. Fort Belvoir, VA: Defense Technical Information Center, December 1985. http://dx.doi.org/10.21236/ada174646.

3

Newell, Alan. Markovian Shock Models, Deterioration Processes, Stratified Markov Processes and Replacement Policies. Fort Belvoir, VA: Defense Technical Information Center, May 1986. http://dx.doi.org/10.21236/ada174995.

4

Dueker, Michael J. Markov Switching in GARCH Processes and Mean Reverting Stock Market Volatility. Federal Reserve Bank of St. Louis, 1994. http://dx.doi.org/10.20955/wp.1994.015.

5

Cinlar, E. Markov Processes Applied to Control, Reliability and Replacement. Fort Belvoir, VA: Defense Technical Information Center, April 1989. http://dx.doi.org/10.21236/ada208634.

6

Rohlicek, J. R., and A. S. Willsky. Structural Decomposition of Multiple Time Scale Markov Processes. Fort Belvoir, VA: Defense Technical Information Center, October 1987. http://dx.doi.org/10.21236/ada189739.

7

Serfozo, Richard F. Poisson Functionals of Markov Processes and Queueing Networks. Fort Belvoir, VA: Defense Technical Information Center, December 1987. http://dx.doi.org/10.21236/ada191217.

8

Serfozo, R. F. Poisson Functionals of Markov Processes and Queueing Networks. Fort Belvoir, VA: Defense Technical Information Center, December 1987. http://dx.doi.org/10.21236/ada194289.

9

Draper, Bruce A., and J. Ross Beveridge. Learning to Populate Geospatial Databases via Markov Processes. Fort Belvoir, VA: Defense Technical Information Center, December 1999. http://dx.doi.org/10.21236/ada374536.

10

Chang, Hyeong S., Michael C. Fu, and Steven I. Marcus. An Adaptive Sampling Algorithm for Solving Markov Decision Processes. Fort Belvoir, VA: Defense Technical Information Center, May 2002. http://dx.doi.org/10.21236/ada438505.
