Selected scientific literature on the topic "Markov processes"

Cite a source in APA, MLA, Chicago, Harvard, and many other citation styles

Consult the list of current articles, books, theses, conference papers, and other scholarly sources on the topic "Markov processes".

Next to every source in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic citation of the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scientific publication in .pdf format and read the abstract (summary) of the work online, if it is available in the metadata.

Journal articles on the topic "Markov processes"

1

Demenkov, N. P., E. A. Mirkin, and I. A. Mochalov. "Markov and Semi-Markov Processes with Fuzzy States. Part 1. Markov Processes". Informacionnye tehnologii 26, no. 6 (June 23, 2020): 323–34. http://dx.doi.org/10.17587/it.26.323-334.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

FRANZ, UWE. "CLASSICAL MARKOV PROCESSES FROM QUANTUM LÉVY PROCESSES". Infinite Dimensional Analysis, Quantum Probability and Related Topics 02, no. 01 (March 1999): 105–29. http://dx.doi.org/10.1142/s0219025799000060.

Full text
Abstract:
We show how classical Markov processes can be obtained from quantum Lévy processes. It is shown that quantum Lévy processes are quantum Markov processes, and sufficient conditions for restrictions to subalgebras to remain quantum Markov processes are given. A classical Markov process (which has the same time-ordered moments as the quantum process in the vacuum state) exists whenever we can restrict to a commutative subalgebra without losing the quantum Markov property. Several examples, including the Azéma martingale, with explicit calculations are presented. In particular, the action of the generator of the classical Markov processes on polynomials or their moments is calculated using Hopf algebra duality.
APA, Harvard, Vancouver, ISO, and other styles
3

Demenkov, N. P., E. A. Mirkin, and I. A. Mochalov. "Markov and Semi-Markov Processes with Fuzzy States. Part 2. Semi-Markov Processes". Informacionnye tehnologii 26, no. 7 (July 17, 2020): 387–93. http://dx.doi.org/10.17587/it.26.387-393.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Whittle, P., and M. L. Puterman. "Markov Decision Processes." Journal of the Royal Statistical Society. Series A (Statistics in Society) 158, no. 3 (1995): 636. http://dx.doi.org/10.2307/2983459.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Smith, J. Q., and D. J. White. "Markov Decision Processes." Journal of the Royal Statistical Society. Series A (Statistics in Society) 157, no. 1 (1994): 164. http://dx.doi.org/10.2307/2983520.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

King, Aaron A., Qianying Lin, and Edward L. Ionides. "Markov genealogy processes". Theoretical Population Biology 143 (February 2022): 77–91. http://dx.doi.org/10.1016/j.tpb.2021.11.003.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Thomas, L. C., D. J. White, and Martin L. Puterman. "Markov Decision Processes." Journal of the Operational Research Society 46, no. 6 (June 1995): 792. http://dx.doi.org/10.2307/2584317.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Ephraim, Y., and N. Merhav. "Hidden Markov processes". IEEE Transactions on Information Theory 48, no. 6 (June 2002): 1518–69. http://dx.doi.org/10.1109/tit.2002.1003838.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Bäuerle, Nicole, and Ulrich Rieder. "Markov Decision Processes". Jahresbericht der Deutschen Mathematiker-Vereinigung 112, no. 4 (September 8, 2010): 217–43. http://dx.doi.org/10.1365/s13291-010-0007-2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Wal, J., and J. Wessels. "MARKOV DECISION PROCESSES". Statistica Neerlandica 39, no. 2 (June 1985): 219–33. http://dx.doi.org/10.1111/j.1467-9574.1985.tb01140.x.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Theses on the topic "Markov processes"

1

Desharnais, Josée. "Labelled Markov processes". Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape3/PQDD_0031/NQ64546.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Balan, Raluca M. "Set-Markov processes". Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2001. http://www.collectionscanada.ca/obj/s4/f2/dsk3/ftp04/NQ66119.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Eltannir, Akram A. "Markov interactive processes". Diss., Georgia Institute of Technology, 1993. http://hdl.handle.net/1853/30745.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Haugomat, Tristan. "Localisation en espace de la propriété de Feller avec application aux processus de type Lévy". Thesis, Rennes 1, 2018. http://www.theses.fr/2018REN1S046/document.

Full text
Abstract:
In this PhD thesis, we give a space localisation for the theory of Feller processes. A first objective is to obtain simple and precise results on the convergence of Markov processes. A second objective is to study the link between the notions of Feller property, martingale problem, and Skorokhod topology. First we give a localised version of the Skorokhod topology and study the notions of compactness and tightness for it. We make the connection between the localised and unlocalised Skorokhod topologies by using the notion of time change. In a second step, using the localised Skorokhod topology and the time change, we study martingale problems. We show the equivalence between being a solution of a well-posed martingale problem, satisfying a localised version of the Feller property, and being a Markov process weakly continuous with respect to the initial condition. We characterise weak convergence for solutions of martingale problems in terms of convergence of the associated operators, and give a similar result for discrete-time approximations. Finally, we apply the theory of locally Feller processes to two examples. We first apply it to Lévy-type processes and obtain convergence results for discrete- and continuous-time processes, including simulation methods and Euler schemes. We then apply the same theory to one-dimensional diffusions in a potential and obtain convergence results for diffusions or random walks towards singular diffusions. As a consequence, we deduce the convergence of random walks in random environments towards diffusions in random potentials.
APA, Harvard, Vancouver, ISO, and other styles
5

莊競誠, and King-sing Chong. "Explorations in Markov processes". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1997. http://hub.hku.hk/bib/B31235682.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

James, Huw William. "Transient Markov decision processes". Thesis, University of Bristol, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.430192.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Ku, Ho Ming. "Interacting Markov branching processes". Thesis, University of Liverpool, 2014. http://livrepository.liverpool.ac.uk/2002759/.

Full text
Abstract:
In many systems in engineering, biology, and physics, particles or members give birth and die through time. Such systems can be modelled by continuous-time Markov chains and Markov processes, whose applications have been investigated by many scientists, for example Jagers [1975]. In ordinary Markov branching processes, the particles are assumed to be identical and independent. In some cases, however, two members of the species may interact or collide to produce new births. Covering these cases requires more general processes: collision branching processes can be used to model such systems. In an even more general model, each particle can have both a branching and a collision effect, so that the branching component and the collision component interact; we call this model the interacting branching collision process. In Chapter 1 of this thesis, we first review some background and basic concepts of continuous-time Markov chains and ordinary Markov branching processes, and then turn to the more complicated models: collision branching processes and interacting branching collision processes. In Chapter 2, for collision branching processes, we investigate the basic properties and uniqueness criteria, and obtain explicit expressions for the extinction probability and the expected extinction and explosion times. In Chapter 3, for interacting branching collision processes, we similarly investigate the basic properties and uniqueness criteria. Because of the more complicated model settings, considerably more detail is required for the extinction probability, so that part is divided into several cases treated under different assumptions.
Since the explicit form of the extinction probability may be too complicated, the last part of Chapter 3 discusses its asymptotic behaviour. In Chapter 4, we look at a related and important branching model, Markov branching processes with immigration, emigration, and resurrection, and investigate its basic properties and uniqueness criteria. Most interestingly, we investigate its extinction probability with the techniques introduced in Chapter 3, which also serves as a good example of those methods. In Chapter 5, we look at two interacting branching models: interacting collision processes with immigration, emigration, and resurrection, and interacting branching collision processes with immigration, emigration, and resurrection. For both, we investigate the basic properties, uniqueness criteria, and extinction probability. My original material starts from Chapter 4. The model used in Chapter 4 was introduced by Li and Liu [2011], where some calculations in the evaluation of the extinction probability were not strictly defined; my contribution focuses on evaluating the extinction probability and discussing its asymptotic behaviour. A paper on this model will be submitted this year. The two interacting branching models of Chapter 5, and some important properties of both, are studied in detail.
APA, Harvard, Vancouver, ISO, and other styles
8

Chong, King-sing. "Explorations in Markov processes /". Hong Kong : University of Hong Kong, 1997. http://sunzi.lib.hku.hk/hkuto/record.jsp?B18736105.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Pötzelberger, Klaus. "On the Approximation of finite Markov-exchangeable processes by mixtures of Markov Processes". Department of Statistics and Mathematics, WU Vienna University of Economics and Business, 1991. http://epub.wu.ac.at/526/1/document.pdf.

Full text
Abstract:
We give an upper bound for the norm distance of (0,1)-valued Markov-exchangeable random variables to mixtures of distributions of Markov processes. A Markov-exchangeable random variable has a distribution that depends only on the starting value and the number of transitions 0-0, 0-1, 1-0, and 1-1. We show that if, for increasing length of variables, the norm distance to mixtures of Markov processes goes to 0, the rate of this convergence may be arbitrarily slow. (author's abstract)
Series: Forschungsberichte / Institut für Statistik
APA, Harvard, Vancouver, ISO, and other styles
10

Ferns, Norman Francis. "Metrics for Markov decision processes". Thesis, McGill University, 2003. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=80263.

Full text
Abstract:
We present a class of metrics, defined on the state space of a finite Markov decision process (MDP), each of which is sound with respect to stochastic bisimulation, a notion of MDP state equivalence derived from the theory of concurrent processes. Such metrics are based on similar metrics developed in the context of labelled Markov processes, and like those, are suitable for state space aggregation. Furthermore, we restrict our attention to a subset of this class that is appropriate for certain reinforcement learning (RL) tasks, specifically, infinite horizon tasks with an expected total discounted reward optimality criterion. Given such an RL metric, we provide bounds relating it to the optimal value function of the original MDP as well as to the value function of the aggregate MDP. Finally, we present an algorithm for calculating such a metric up to a prescribed degree of accuracy and some empirical results.
APA, Harvard, Vancouver, ISO, and other styles
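The abstract above relates its metrics to the optimal value function of a finite MDP under the expected total discounted reward criterion. As background only, here is a minimal value-iteration sketch of that value function; the two-state MDP, its action names, and all numbers are illustrative assumptions, not taken from the thesis:

```python
# Hypothetical two-state, two-action MDP (illustrative, not from the thesis).
# P[s][a] = list of (next_state, probability); R[s][a] = immediate reward.
P = {0: {"stay": [(0, 1.0)], "go": [(1, 0.9), (0, 0.1)]},
     1: {"stay": [(1, 1.0)], "go": [(0, 0.9), (1, 0.1)]}}
R = {0: {"stay": 0.0, "go": 1.0},
     1: {"stay": 2.0, "go": 0.0}}

def value_iteration(P, R, gamma=0.9, tol=1e-10):
    """Iterate the Bellman optimality operator to its fixed point V*.

    gamma is the discount factor of the "expected total discounted reward"
    criterion; the operator is a gamma-contraction, so iteration converges.
    """
    V = {s: 0.0 for s in P}
    while True:
        newV = {s: max(R[s][a] + gamma * sum(p * V[t] for t, p in P[s][a])
                       for a in P[s])
                for s in P}
        if max(abs(newV[s] - V[s]) for s in P) < tol:
            return newV
        V = newV

V = value_iteration(P, R)  # V[1] = 2/(1 - 0.9) = 20: staying in state 1 is optimal
```

Metrics of the kind the thesis studies bound how much such optimal values can differ between states that the metric deems close.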

Books on the topic "Markov processes"

1

White, D. J. Markov decision processes. New York: John Wiley & Sons, 1993.

Search for full text
APA, Harvard, Vancouver, ISO, and other styles
2

Ethier, Stewart N., and Thomas G. Kurtz, eds. Markov Processes. Hoboken, NJ, USA: John Wiley & Sons, Inc., 1986. http://dx.doi.org/10.1002/9780470316658.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Puterman, Martin L., ed. Markov Decision Processes. Hoboken, NJ, USA: John Wiley & Sons, Inc., 1994. http://dx.doi.org/10.1002/9780470316887.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Hou, Zhenting, Jerzy A. Filar, and Anyue Chen, eds. Markov Processes and Controlled Markov Chains. Boston, MA: Springer US, 2002. http://dx.doi.org/10.1007/978-1-4613-0265-0.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Zhenting, Hou, Filar Jerzy A. 1949-, and Chen Anyue, eds. Markov processes and controlled Markov chains. Dordrecht: Kluwer Academic Publishers, 2002.

Search for full text
APA, Harvard, Vancouver, ISO, and other styles
6

Pardoux, Étienne. Markov Processes and Applications. Chichester, UK: John Wiley & Sons, Ltd, 2008. http://dx.doi.org/10.1002/9780470721872.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Zhenting, Hou, and Guo Qingfeng. Homogeneous Denumerable Markov Processes. Berlin, Heidelberg: Springer Berlin Heidelberg, 1988. http://dx.doi.org/10.1007/978-3-642-68127-1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Komorowski, Tomasz, Claudio Landim, and Stefano Olla. Fluctuations in Markov Processes. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-29880-6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Blumenthal, Robert M. Excursions of Markov Processes. Boston, MA: Birkhäuser Boston, 1992. http://dx.doi.org/10.1007/978-1-4684-9412-9.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Hernández-Lerma, O. Adaptive Markov Control Processes. New York, NY: Springer New York, 1989. http://dx.doi.org/10.1007/978-1-4419-8714-3.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Markov processes"

1

Itô, Kiyosi. "Markov Processes". In Stochastic Processes, 93–178. Berlin, Heidelberg: Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-662-10065-3_3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Nagasawa, Masao. "Markov Processes". In Stochastic Processes in Quantum Physics, 1–26. Basel: Birkhäuser Basel, 2000. http://dx.doi.org/10.1007/978-3-0348-8383-2_1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Gardiner, Crispin W. "Markov Processes". In Springer Series in Synergetics, 42–79. Berlin, Heidelberg: Springer Berlin Heidelberg, 1985. http://dx.doi.org/10.1007/978-3-662-02452-2_3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Gardiner, Crispin W. "Markov Processes". In Springer Series in Synergetics, 42–79. Berlin, Heidelberg: Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-662-05389-8_3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Poler, Raúl, Josefa Mula, and Manuel Díaz-Madroñero. "Markov Processes". In Operations Research Problems, 375–419. London: Springer London, 2013. http://dx.doi.org/10.1007/978-1-4471-5577-5_10.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Čepin, Marko. "Markov Processes". In Assessment of Power System Reliability, 113–18. London: Springer London, 2011. http://dx.doi.org/10.1007/978-0-85729-688-7_8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Jacobs, Konrad. "Markov Processes". In Discrete Stochastics, 119–54. Basel: Birkhäuser Basel, 1992. http://dx.doi.org/10.1007/978-3-0348-8645-1_6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Rudnicki, Ryszard, and Marta Tyran-Kamińska. "Markov Processes". In Piecewise Deterministic Processes in Biological Models, 33–62. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-61295-9_2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Nelson, Randolph. "Markov Processes". In Probability, Stochastic Processes, and Queueing Theory, 329–89. New York, NY: Springer New York, 1995. http://dx.doi.org/10.1007/978-1-4757-2426-4_8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Alfa, Attahiru Sule. "Markov Processes". In Queueing Theory for Telecommunications, 11–78. Boston, MA: Springer US, 2010. http://dx.doi.org/10.1007/978-1-4419-7314-6_2.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Markov processes"

1

Budgett, Stephanie, Azam Asanjarani, and Heti Afimeimounga. "Visualizing Markov Processes". In Bridging the Gap: Empowering and Educating Today’s Learners in Statistics. International Association for Statistical Education, 2022. http://dx.doi.org/10.52041/iase.icots11.t10f3.

Full text
Abstract:
Researchers and educators have long been aware of the misconceptions prevalent in people’s probabilistic reasoning processes. Calls to reform the teaching of probability from a traditional and predominantly mathematical approach to include an emphasis on modelling using technology have been heeded by many. The purpose of this paper is to present our experiences of including an activity based on an interactive visualisation tool in the Markov processes module of a first-year probability course. Initial feedback suggests that the tool may support students’ understanding of the equilibrium distribution and points to certain aspects of the tool that may be beneficial. A targeted survey, to be administered in Semester 1, 2022, aims to provide more insight.
APA, Harvard, Vancouver, ISO, and other styles
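The abstract above centres on students' understanding of the equilibrium distribution. As background only, here is a minimal pure-Python sketch of that concept; the two-state "weather" chain and its numbers are illustrative assumptions, not taken from the paper:

```python
# For an ergodic finite Markov chain, repeatedly applying the transition
# matrix P to any starting distribution converges to the equilibrium
# (stationary) distribution pi, which satisfies pi = pi P.

def step(dist, P):
    """One transition: new_dist[j] = sum_i dist[i] * P[i][j]."""
    n = len(dist)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

def equilibrium(P, tol=1e-12, max_iter=100_000):
    """Power-iterate a uniform start distribution until it stops moving."""
    n = len(P)
    dist = [1.0 / n] * n
    for _ in range(max_iter):
        new = step(dist, P)
        if max(abs(a - b) for a, b in zip(new, dist)) < tol:
            return new
        dist = new
    return dist

# Illustrative two-state chain: P(rain -> rain) = 0.6, P(dry -> rain) = 0.2.
P = [[0.6, 0.4],
     [0.2, 0.8]]
pi = equilibrium(P)  # converges to (1/3, 2/3), which solves pi = pi P
```

Visualisation tools of the kind the paper describes typically animate exactly this convergence of the state distribution towards pi.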
2

Mladenov, Martin, Craig Boutilier, Dale Schuurmans, Ofer Meshi, Gal Elidan, and Tyler Lu. "Logistic Markov Decision Processes". In Twenty-Sixth International Joint Conference on Artificial Intelligence. California: International Joint Conferences on Artificial Intelligence Organization, 2017. http://dx.doi.org/10.24963/ijcai.2017/346.

Full text
Abstract:
User modeling in advertising and recommendation has typically focused on myopic predictors of user responses. In this work, we consider the long-term decision problem associated with user interaction. We propose a concise specification of long-term interaction dynamics by combining factored dynamic Bayesian networks with logistic predictors of user responses, allowing state-of-the-art prediction models to be seamlessly extended. We show how to solve such models at scale by providing a constraint generation approach for approximate linear programming that overcomes the variable coupling and non-linearity induced by the logistic regression predictor. The efficacy of the approach is demonstrated on advertising domains with up to 2^54 states and 2^39 actions.
APA, Harvard, Vancouver, ISO, and other styles
3

Hutter, Marcus. "Feature Markov Decision Processes". In 2nd Conference on Artificial General Intelligence 2009. Paris, France: Atlantis Press, 2009. http://dx.doi.org/10.2991/agi.2009.30.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Hawkes, Alan G. "Markov processes in APL". In Conference proceedings. New York, New York, USA: ACM Press, 1990. http://dx.doi.org/10.1145/97808.97843.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Kumar, Ravi, Maithra Raghu, Tamás Sarlós, and Andrew Tomkins. "Linear Additive Markov Processes". In WWW '17: 26th International World Wide Web Conference. Republic and Canton of Geneva, Switzerland: International World Wide Web Conferences Steering Committee, 2017. http://dx.doi.org/10.1145/3038912.3052644.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Tzortzis, Ioannis, Charalambos D. Charalambous, Themistoklis Charalambous, Christoforos N. Hadjicostis, and Mikael Johansson. "Approximation of Markov processes by lower dimensional processes". In 2014 IEEE 53rd Annual Conference on Decision and Control (CDC). IEEE, 2014. http://dx.doi.org/10.1109/cdc.2014.7040082.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Yu, Jia Yuan, and Shie Mannor. "Arbitrarily modulated Markov decision processes". In 2009 Joint 48th IEEE Conference on Decision and Control (CDC) and 28th Chinese Control Conference (CCC). IEEE, 2009. http://dx.doi.org/10.1109/cdc.2009.5400662.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Kozen, Dexter, Kim G. Larsen, Radu Mardare, and Prakash Panangaden. "Stone Duality for Markov Processes". In 2013 Twenty-Eighth Annual IEEE/ACM Symposium on Logic in Computer Science (LICS 2013). IEEE, 2013. http://dx.doi.org/10.1109/lics.2013.38.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Brázdil, T., V. Brožek, K. Etessami, A. Kučera, and D. Wojtczak. "One-Counter Markov Decision Processes". In Proceedings of the Twenty-First Annual ACM-SIAM Symposium on Discrete Algorithms. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2010. http://dx.doi.org/10.1137/1.9781611973075.70.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Hölzl, Johannes. "Markov processes in Isabelle/HOL". In CPP '17: Certified Proofs and Programs. New York, NY, USA: ACM, 2017. http://dx.doi.org/10.1145/3018610.3018628.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports by organizations on the topic "Markov processes"

1

Adler, Robert J., Stamatis Gambanis, and Gennady Samorodnitsky. On Stable Markov Processes. Fort Belvoir, VA: Defense Technical Information Center, September 1987. http://dx.doi.org/10.21236/ada192892.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Abdel-Hameed, M. Markovian Shock Models, Deterioration Processes, Stratified Markov Processes Replacement Policies. Fort Belvoir, VA: Defense Technical Information Center, December 1985. http://dx.doi.org/10.21236/ada174646.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Newell, Alan. Markovian Shock Models, Deterioration Processes, Stratified Markov Processes and Replacement Policies. Fort Belvoir, VA: Defense Technical Information Center, May 1986. http://dx.doi.org/10.21236/ada174995.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Dueker, Michael J. Markov Switching in GARCH Processes and Mean Reverting Stock Market Volatility. Federal Reserve Bank of St. Louis, 1994. http://dx.doi.org/10.20955/wp.1994.015.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Cinlar, E. Markov Processes Applied to Control, Reliability and Replacement. Fort Belvoir, VA: Defense Technical Information Center, April 1989. http://dx.doi.org/10.21236/ada208634.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Rohlicek, J. R., and A. S. Willsky. Structural Decomposition of Multiple Time Scale Markov Processes. Fort Belvoir, VA: Defense Technical Information Center, October 1987. http://dx.doi.org/10.21236/ada189739.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Serfozo, Richard F. Poisson Functionals of Markov Processes and Queueing Networks. Fort Belvoir, VA: Defense Technical Information Center, December 1987. http://dx.doi.org/10.21236/ada191217.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Serfozo, R. F. Poisson Functionals of Markov Processes and Queueing Networks. Fort Belvoir, VA: Defense Technical Information Center, December 1987. http://dx.doi.org/10.21236/ada194289.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Draper, Bruce A., and J. Ross Beveridge. Learning to Populate Geospatial Databases via Markov Processes. Fort Belvoir, VA: Defense Technical Information Center, December 1999. http://dx.doi.org/10.21236/ada374536.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Chang, Hyeong S., Michael C. Fu, and Steven I. Marcus. An Adaptive Sampling Algorithm for Solving Markov Decision Processes. Fort Belvoir, VA: Defense Technical Information Center, May 2002. http://dx.doi.org/10.21236/ada438505.

Full text
APA, Harvard, Vancouver, ISO, and other styles