Ready-made bibliography on "Markov processes"

Create accurate references in APA, MLA, Chicago, Harvard, and many other citation styles

Browse lists of current articles, books, dissertations, abstracts, and other scholarly sources on "Markov processes".

An "Add to bibliography" button is available next to every work in the bibliography. Use it, and we will automatically generate a bibliographic reference to the selected work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of a publication as a ".pdf" file and read its abstract online, whenever the relevant details are available in the work's metadata.

Journal articles on "Markov processes"

1. Demenkov, N. P., E. A. Mirkin, and I. A. Mochalov. "Markov and Semi-Markov Processes with Fuzzy States. Part 1. Markov Processes". Informacionnye tehnologii 26, no. 6 (23.06.2020): 323–34. http://dx.doi.org/10.17587/it.26.323-334.

2. Franz, Uwe. "Classical Markov Processes from Quantum Lévy Processes". Infinite Dimensional Analysis, Quantum Probability and Related Topics 02, no. 01 (March 1999): 105–29. http://dx.doi.org/10.1142/s0219025799000060.

Abstract:
We show how classical Markov processes can be obtained from quantum Lévy processes. It is shown that quantum Lévy processes are quantum Markov processes, and sufficient conditions for restrictions to subalgebras to remain quantum Markov processes are given. A classical Markov process (which has the same time-ordered moments as the quantum process in the vacuum state) exists whenever we can restrict to a commutative subalgebra without losing the quantum Markov property. Several examples, including the Azéma martingale, are presented with explicit calculations. In particular, the action of the generator of the classical Markov processes on polynomials or their moments is calculated using Hopf algebra duality.
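To make the moment correspondence in this abstract concrete, one standard way to write it down is as follows (the notation is an assumption of this note, not taken from the paper: j_t denotes the quantum Lévy process, Ω the vacuum vector, and X_t the classical version, with the a_i taken from the commutative subalgebra to which the process is restricted):

```latex
\[
  \langle \Omega,\ j_{t_1}(a_1)\, j_{t_2}(a_2) \cdots j_{t_n}(a_n)\, \Omega \rangle
  \;=\;
  \mathbb{E}\bigl[ X_{t_1}(a_1)\, X_{t_2}(a_2) \cdots X_{t_n}(a_n) \bigr],
  \qquad t_1 \le t_2 \le \cdots \le t_n .
\]
```

That is, the vacuum expectations of time-ordered products of the quantum process agree with the corresponding moments of the classical process.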
3. Demenkov, N. P., E. A. Mirkin, and I. A. Mochalov. "Markov and Semi-Markov Processes with Fuzzy States. Part 2. Semi-Markov Processes". Informacionnye tehnologii 26, no. 7 (17.07.2020): 387–93. http://dx.doi.org/10.17587/it.26.387-393.

4. Whittle, P., and M. L. Puterman. "Markov Decision Processes". Journal of the Royal Statistical Society. Series A (Statistics in Society) 158, no. 3 (1995): 636. http://dx.doi.org/10.2307/2983459.

5. Smith, J. Q., and D. J. White. "Markov Decision Processes". Journal of the Royal Statistical Society. Series A (Statistics in Society) 157, no. 1 (1994): 164. http://dx.doi.org/10.2307/2983520.

6. King, Aaron A., Qianying Lin, and Edward L. Ionides. "Markov genealogy processes". Theoretical Population Biology 143 (February 2022): 77–91. http://dx.doi.org/10.1016/j.tpb.2021.11.003.

7. Thomas, L. C., D. J. White, and Martin L. Puterman. "Markov Decision Processes". Journal of the Operational Research Society 46, no. 6 (June 1995): 792. http://dx.doi.org/10.2307/2584317.

8. Ephraim, Y., and N. Merhav. "Hidden Markov processes". IEEE Transactions on Information Theory 48, no. 6 (June 2002): 1518–69. http://dx.doi.org/10.1109/tit.2002.1003838.

9. Bäuerle, Nicole, and Ulrich Rieder. "Markov Decision Processes". Jahresbericht der Deutschen Mathematiker-Vereinigung 112, no. 4 (8.09.2010): 217–43. http://dx.doi.org/10.1365/s13291-010-0007-2.

10. Wal, J., and J. Wessels. "Markov Decision Processes". Statistica Neerlandica 39, no. 2 (June 1985): 219–33. http://dx.doi.org/10.1111/j.1467-9574.1985.tb01140.x.


Doctoral dissertations on "Markov processes"

1. Desharnais, Josée. "Labelled Markov processes". Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape3/PQDD_0031/NQ64546.pdf.

2. Balan, Raluca M. "Set-Markov processes". Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2001. http://www.collectionscanada.ca/obj/s4/f2/dsk3/ftp04/NQ66119.pdf.

3. Eltannir, Akram A. "Markov interactive processes". Diss., Georgia Institute of Technology, 1993. http://hdl.handle.net/1853/30745.

4. Haugomat, Tristan. "Localisation en espace de la propriété de Feller avec application aux processus de type Lévy". Thesis, Rennes 1, 2018. http://www.theses.fr/2018REN1S046/document.

Abstract:
In this PhD thesis, we give a space localisation for the theory of Feller processes. A first objective is to obtain simple and precise results on the convergence of Markov processes. A second objective is to study the link between the notions of Feller property, martingale problem and Skorokhod topology. First we give a localised version of the Skorokhod topology and study the notions of compactness and tightness for it. We make the connection between the localised and unlocalised Skorokhod topologies by using the notion of time change. In a second step, using the localised Skorokhod topology and the time change, we study martingale problems. We show the equivalence between, on the one hand, being the solution of a well-posed martingale problem, on the other hand, satisfying a localised version of the Feller property, and finally, being a Markov process weakly continuous with respect to the initial condition. We characterise weak convergence for solutions of martingale problems in terms of convergence of the associated operators and give a similar result for discrete-time approximations. Finally, we apply the theory of locally Feller processes to two examples. We first apply it to Lévy-type processes and obtain convergence results for discrete- and continuous-time processes, including simulation methods and Euler schemes. We then apply the same theory to one-dimensional diffusions in a potential and obtain convergence results for diffusions or random walks towards singular diffusions. As a consequence, we deduce the convergence of random walks in random environments towards diffusions in random potentials.
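The abstract above mentions simulation methods and Euler schemes for Lévy-type processes. As a purely illustrative sketch of what such a scheme looks like, here is a standard Euler step for a jump-diffusion, a simple Lévy-type SDE; the drift, volatility, jump rate and jump law below are arbitrary example choices, not taken from the thesis:

```python
import numpy as np

def euler_jump_diffusion(x0, b, sigma, jump_rate, jump_dist, T, n_steps, rng):
    """Euler scheme for dX_t = b(X_t) dt + sigma(X_t) dW_t + dJ_t, where J is a
    compound Poisson process with intensity jump_rate and jump law jump_dist."""
    dt = T / n_steps
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt))      # Brownian increment over dt
        n_jumps = rng.poisson(jump_rate * dt)  # number of jumps in (t, t + dt]
        dj = sum(jump_dist(rng) for _ in range(n_jumps))
        x[k + 1] = x[k] + b(x[k]) * dt + sigma(x[k]) * dw + dj
    return x

rng = np.random.default_rng(0)
path = euler_jump_diffusion(
    x0=1.0,
    b=lambda x: -0.5 * x,                  # mean-reverting drift (example choice)
    sigma=lambda x: 0.2,                   # constant volatility (example choice)
    jump_rate=1.0,                         # on average one jump per unit time
    jump_dist=lambda r: r.normal(0, 0.3),  # Gaussian jump sizes (example choice)
    T=1.0, n_steps=1000, rng=rng,
)
print(path[-1])
```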
5. 莊競誠 (King-sing Chong). "Explorations in Markov processes". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1997. http://hub.hku.hk/bib/B31235682.

6. James, Huw William. "Transient Markov decision processes". Thesis, University of Bristol, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.430192.

7. Ku, Ho Ming. "Interacting Markov branching processes". Thesis, University of Liverpool, 2014. http://livrepository.liverpool.ac.uk/2002759/.

Abstract:
In engineering, biology and physics, many systems consist of particles or members that give birth and die over time. Such systems can be modelled by continuous-time Markov chains and Markov processes, whose applications have been investigated by many authors, for example Jagers [1975]. In ordinary Markov branching processes, the particles are assumed to be identical and independent. In some cases, however, two members of the species may interact or collide to give new births, and more general processes are needed to capture this; collision branching processes model such systems. A still more general model lets each particle have both a branching and a collision effect, so that the branching component and the collision component interact; we call this the interacting branching collision process. In Chapter 1 of this thesis, we first review some background and basic concepts of continuous-time Markov chains and ordinary Markov branching processes, and then turn to the more complicated models: collision branching processes and interacting branching collision processes. In Chapter 2, for collision branching processes, we investigate the basic properties, criteria for uniqueness, and explicit expressions for the extinction probability and the mean extinction and explosion times. In Chapter 3, for interacting branching collision processes, we follow a similar structure and investigate the basic properties and criteria for uniqueness. Because the model settings are more complicated, considerably more detail is required for the extinction probability, so we divide that section into several parts and consider the extinction probability under different cases and assumptions. Since the explicit form of the extinction probability can be very complicated, in the last part of Chapter 3 we discuss its asymptotic behaviour. In Chapter 4, we study a related and important branching model, the Markov branching process with immigration, emigration and resurrection, investigating its basic properties and criteria for uniqueness. The most interesting part is the extinction-probability analysis using our techniques, which also serves as a good example of the methods introduced in Chapter 3. In Chapter 5, we study two interacting branching models: the interacting collision process with immigration, emigration and resurrection, and the interacting branching collision process with immigration, emigration and resurrection; we investigate their basic properties, criteria for uniqueness, and extinction probabilities, and study some of their important properties in detail. My original material starts from Chapter 4. The model used there was introduced by Li and Liu [2011], where some of the calculations in the extinction-probability evaluation were not strictly defined; my contribution focuses on evaluating the extinction probability and discussing its asymptotic behaviour. A paper on this model will be submitted this year.
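For the ordinary Markov branching processes reviewed in Chapter 1 of this thesis, the extinction probability is classically the smallest root in [0, 1] of f(q) = q, where f is the offspring probability generating function. A minimal sketch of that baseline computation follows (the offspring distribution is a made-up example; the interacting models studied in the thesis require far more delicate analysis):

```python
def extinction_probability(offspring_pmf, tol=1e-12, max_iter=10_000):
    """Smallest fixed point in [0, 1] of the offspring PGF f(q) = sum_k p_k q^k,
    found by iterating q <- f(q) from q = 0 (monotone convergence)."""
    def f(q):
        return sum(p * q**k for k, p in enumerate(offspring_pmf))
    q = 0.0
    for _ in range(max_iter):
        q_next = f(q)
        if abs(q_next - q) < tol:
            break
        q = q_next
    return q

# Example: p0 = 0.3, p1 = 0.2, p2 = 0.5 (supercritical: mean offspring 1.2 > 1).
print(extinction_probability([0.3, 0.2, 0.5]))  # 0.6, the smaller root of f(q) = q
```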
8. Chong, King-sing. "Explorations in Markov processes". Hong Kong: University of Hong Kong, 1997. http://sunzi.lib.hku.hk/hkuto/record.jsp?B18736105.

9. Pötzelberger, Klaus. "On the Approximation of finite Markov-exchangeable processes by mixtures of Markov Processes". Department of Statistics and Mathematics, WU Vienna University of Economics and Business, 1991. http://epub.wu.ac.at/526/1/document.pdf.

Abstract:
We give an upper bound for the norm distance of (0,1)-valued Markov-exchangeable random variables to mixtures of distributions of Markov processes. A Markov-exchangeable random variable has a distribution that depends only on the starting value and the number of transitions 0-0, 0-1, 1-0 and 1-1. We show that even when, as the length of the sequence increases, the norm distance to mixtures of Markov processes goes to 0, the rate of this convergence may be arbitrarily slow. (author's abstract)
Series: Forschungsberichte / Institut für Statistik
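The sufficient statistic described in the abstract above is easy to compute: for a (0,1)-valued Markov-exchangeable sequence, the law depends only on the starting value and the four transition counts, so any two sequences sharing that statistic receive the same probability. A small sketch (the example sequences are hypothetical):

```python
from collections import Counter

def markov_exchangeable_statistic(seq):
    """Starting value plus counts of the transitions 0-0, 0-1, 1-0 and 1-1:
    the statistic that determines the law of a Markov-exchangeable sequence."""
    transitions = Counter(zip(seq, seq[1:]))
    return seq[0], {t: transitions[t] for t in [(0, 0), (0, 1), (1, 0), (1, 1)]}

# Two different sequences with the same statistic; a Markov-exchangeable law
# must assign them equal probability.
print(markov_exchangeable_statistic([0, 1, 1, 0, 1]))
print(markov_exchangeable_statistic([0, 1, 0, 1, 1]))
```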
10. Ferns, Norman Francis. "Metrics for Markov decision processes". Thesis, McGill University, 2003. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=80263.

Abstract:
We present a class of metrics, defined on the state space of a finite Markov decision process (MDP), each of which is sound with respect to stochastic bisimulation, a notion of MDP state equivalence derived from the theory of concurrent processes. Such metrics are based on similar metrics developed in the context of labelled Markov processes, and like those, are suitable for state space aggregation. Furthermore, we restrict our attention to a subset of this class that is appropriate for certain reinforcement learning (RL) tasks, specifically, infinite horizon tasks with an expected total discounted reward optimality criterion. Given such an RL metric, we provide bounds relating it to the optimal value function of the original MDP as well as to the value function of the aggregate MDP. Finally, we present an algorithm for calculating such a metric up to a prescribed degree of accuracy and some empirical results.
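Metrics of the kind described here are typically obtained as fixed points of an operator that combines reward differences with the Kantorovich (1-Wasserstein) distance between transition distributions. The sketch below is one plausible rendition of such an iteration for a tiny finite MDP, not the thesis's actual algorithm; the transition tensor, the rewards, and the weights 1 and gamma are assumptions of the example:

```python
import numpy as np
from scipy.optimize import linprog

def kantorovich(p, q, d):
    """1-Wasserstein distance between pmfs p and q under ground metric d,
    solved as the standard optimal-transport linear program."""
    n = len(p)
    A_eq, b_eq = [], []
    for i in range(n):  # row marginals of the coupling mu must equal p
        row = np.zeros((n, n)); row[i, :] = 1.0
        A_eq.append(row.ravel()); b_eq.append(p[i])
    for j in range(n):  # column marginals must equal q
        col = np.zeros((n, n)); col[:, j] = 1.0
        A_eq.append(col.ravel()); b_eq.append(q[j])
    res = linprog(d.ravel(), A_eq=np.array(A_eq), b_eq=np.array(b_eq),
                  bounds=(0, None), method="highs")
    return res.fun

def bisimulation_metric(P, R, gamma=0.9, iters=30):
    """Iterate d(s,t) <- max_a |R[a,s] - R[a,t]| + gamma * W1_d(P[a,s], P[a,t]).
    P has shape (A, S, S); R has shape (A, S)."""
    n_actions, n_states, _ = P.shape
    d = np.zeros((n_states, n_states))
    for _ in range(iters):
        d_new = np.zeros_like(d)
        for s in range(n_states):
            for t in range(s + 1, n_states):
                d_new[s, t] = d_new[t, s] = max(
                    abs(R[a, s] - R[a, t]) + gamma * kantorovich(P[a, s], P[a, t], d)
                    for a in range(n_actions))
        d = d_new
    return d

# Made-up 1-action, 3-state MDP: states 0 and 1 behave alike, state 2 does not.
P = np.array([[[0.9, 0.1, 0.0], [0.1, 0.9, 0.0], [0.0, 0.0, 1.0]]])
R = np.array([[1.0, 1.0, 0.0]])
print(bisimulation_metric(P, R))
```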

Books on "Markov processes"

1. White, D. J. Markov decision processes. New York: John Wiley & Sons, 1993.

2. Ethier, Stewart N., and Thomas G. Kurtz, eds. Markov Processes. Hoboken, NJ, USA: John Wiley & Sons, Inc., 1986. http://dx.doi.org/10.1002/9780470316658.

3. Puterman, Martin L., ed. Markov Decision Processes. Hoboken, NJ, USA: John Wiley & Sons, Inc., 1994. http://dx.doi.org/10.1002/9780470316887.

4. Hou, Zhenting, Jerzy A. Filar, and Anyue Chen, eds. Markov Processes and Controlled Markov Chains. Boston, MA: Springer US, 2002. http://dx.doi.org/10.1007/978-1-4613-0265-0.

5. Zhenting, Hou, Jerzy A. Filar, and Anyue Chen, eds. Markov processes and controlled Markov chains. Dordrecht: Kluwer Academic Publishers, 2002.

6. Pardoux, Étienne. Markov Processes and Applications. Chichester, UK: John Wiley & Sons, Ltd, 2008. http://dx.doi.org/10.1002/9780470721872.

7. Zhenting, Hou, and Guo Qingfeng. Homogeneous Denumerable Markov Processes. Berlin, Heidelberg: Springer Berlin Heidelberg, 1988. http://dx.doi.org/10.1007/978-3-642-68127-1.

8. Komorowski, Tomasz, Claudio Landim, and Stefano Olla. Fluctuations in Markov Processes. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-29880-6.

9. Blumenthal, Robert M. Excursions of Markov Processes. Boston, MA: Birkhäuser Boston, 1992. http://dx.doi.org/10.1007/978-1-4684-9412-9.

10. Hernández-Lerma, O. Adaptive Markov Control Processes. New York, NY: Springer New York, 1989. http://dx.doi.org/10.1007/978-1-4419-8714-3.


Book chapters on "Markov processes"

1. Itô, Kiyosi. "Markov Processes". In Stochastic Processes, 93–178. Berlin, Heidelberg: Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-662-10065-3_3.

2. Nagasawa, Masao. "Markov Processes". In Stochastic Processes in Quantum Physics, 1–26. Basel: Birkhäuser Basel, 2000. http://dx.doi.org/10.1007/978-3-0348-8383-2_1.

3. Gardiner, Crispin W. "Markov Processes". In Springer Series in Synergetics, 42–79. Berlin, Heidelberg: Springer Berlin Heidelberg, 1985. http://dx.doi.org/10.1007/978-3-662-02452-2_3.

4. Gardiner, Crispin W. "Markov Processes". In Springer Series in Synergetics, 42–79. Berlin, Heidelberg: Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-662-05389-8_3.

5. Poler, Raúl, Josefa Mula, and Manuel Díaz-Madroñero. "Markov Processes". In Operations Research Problems, 375–419. London: Springer London, 2013. http://dx.doi.org/10.1007/978-1-4471-5577-5_10.

6. Čepin, Marko. "Markov Processes". In Assessment of Power System Reliability, 113–18. London: Springer London, 2011. http://dx.doi.org/10.1007/978-0-85729-688-7_8.

7. Jacobs, Konrad. "Markov Processes". In Discrete Stochastics, 119–54. Basel: Birkhäuser Basel, 1992. http://dx.doi.org/10.1007/978-3-0348-8645-1_6.

8. Rudnicki, Ryszard, and Marta Tyran-Kamińska. "Markov Processes". In Piecewise Deterministic Processes in Biological Models, 33–62. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-61295-9_2.

9. Nelson, Randolph. "Markov Processes". In Probability, Stochastic Processes, and Queueing Theory, 329–89. New York, NY: Springer New York, 1995. http://dx.doi.org/10.1007/978-1-4757-2426-4_8.

10. Alfa, Attahiru Sule. "Markov Processes". In Queueing Theory for Telecommunications, 11–78. Boston, MA: Springer US, 2010. http://dx.doi.org/10.1007/978-1-4419-7314-6_2.


Conference papers on "Markov processes"

1. Budgett, Stephanie, Azam Asanjarani, and Heti Afimeimounga. "Visualizing Markov Processes". In Bridging the Gap: Empowering and Educating Today’s Learners in Statistics. International Association for Statistical Education, 2022. http://dx.doi.org/10.52041/iase.icots11.t10f3.

Abstract:
Researchers and educators have long been aware of the misconceptions prevalent in people’s probabilistic reasoning processes. Calls to reform the teaching of probability from a traditional and predominantly mathematical approach to include an emphasis on modelling using technology have been heeded by many. The purpose of this paper is to present our experiences of including an activity based on an interactive visualisation tool in the Markov processes module of a first-year probability course. Initial feedback suggests that the tool may support students’ understanding of the equilibrium distribution and points to certain aspects of the tool that may be beneficial. A targeted survey, to be administered in Semester 1, 2022, aims to provide more insight.
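As background on the equilibrium distribution the tool visualises: for a finite irreducible Markov chain with transition matrix P, the equilibrium (stationary) distribution pi solves pi P = pi together with sum(pi) = 1, which reduces to a small linear solve. A minimal sketch (the transition matrix is an arbitrary example, not taken from the paper):

```python
import numpy as np

def stationary_distribution(P):
    """Solve pi P = pi, sum(pi) = 1 for an irreducible transition matrix P by
    stacking the balance equations with the normalisation constraint."""
    n = P.shape[0]
    A = np.vstack([P.T - np.eye(n), np.ones(n)])  # (P^T - I) pi = 0 and 1^T pi = 1
    b = np.zeros(n + 1); b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.8, 0.1],
              [0.2, 0.2, 0.6]])
print(stationary_distribution(P))  # the long-run proportion of time in each state
```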
2. Mladenov, Martin, Craig Boutilier, Dale Schuurmans, Ofer Meshi, Gal Elidan, and Tyler Lu. "Logistic Markov Decision Processes". In Twenty-Sixth International Joint Conference on Artificial Intelligence. California: International Joint Conferences on Artificial Intelligence Organization, 2017. http://dx.doi.org/10.24963/ijcai.2017/346.

Abstract:
User modeling in advertising and recommendation has typically focused on myopic predictors of user responses. In this work, we consider the long-term decision problem associated with user interaction. We propose a concise specification of long-term interaction dynamics by combining factored dynamic Bayesian networks with logistic predictors of user responses, allowing state-of-the-art prediction models to be seamlessly extended. We show how to solve such models at scale by providing a constraint generation approach for approximate linear programming that overcomes the variable coupling and non-linearity induced by the logistic regression predictor. The efficacy of the approach is demonstrated on advertising domains with up to 2^54 states and 2^39 actions.
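The constraint-generation approach in this abstract builds on the classical linear-programming formulation of MDP planning. For orientation, here is a hedged sketch of the exact primal LP for a small tabular MDP; the paper's contribution is an approximate version that scales to enormous state and action spaces, and the toy MDP below is an assumption of the example:

```python
import numpy as np
from scipy.optimize import linprog

def lp_value_function(P, R, gamma=0.95):
    """Exact LP for the optimal value function of a finite MDP:
    minimise sum_s v(s) subject to v(s) >= R[a,s] + gamma * P[a,s] . v
    for every state s and action a. P has shape (A, S, S); R has shape (A, S)."""
    n_actions, n_states, _ = P.shape
    c = np.ones(n_states)                 # objective: minimise 1^T v
    A_ub, b_ub = [], []
    for a in range(n_actions):
        for s in range(n_states):
            row = gamma * P[a, s].copy()  # rewrite each constraint as
            row[s] -= 1.0                 # gamma * P v - v <= -R
            A_ub.append(row); b_ub.append(-R[a, s])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=(None, None), method="highs")
    return res.x

# Toy 2-action, 2-state MDP.
P = np.array([[[0.8, 0.2], [0.3, 0.7]],
              [[0.5, 0.5], [0.9, 0.1]]])
R = np.array([[1.0, 0.0],
              [0.5, 2.0]])
print(lp_value_function(P, R))  # optimal state values v*
```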
3. Hutter, Marcus. "Feature Markov Decision Processes". In 2nd Conference on Artificial General Intelligence 2009. Paris, France: Atlantis Press, 2009. http://dx.doi.org/10.2991/agi.2009.30.

4. Hawkes, Alan G. "Markov processes in APL". In Conference proceedings. New York, New York, USA: ACM Press, 1990. http://dx.doi.org/10.1145/97808.97843.

5. Kumar, Ravi, Maithra Raghu, Tamás Sarlós, and Andrew Tomkins. "Linear Additive Markov Processes". In WWW '17: 26th International World Wide Web Conference. Republic and Canton of Geneva, Switzerland: International World Wide Web Conferences Steering Committee, 2017. http://dx.doi.org/10.1145/3038912.3052644.

6. Tzortzis, Ioannis, Charalambos D. Charalambous, Themistoklis Charalambous, Christoforos N. Hadjicostis, and Mikael Johansson. "Approximation of Markov processes by lower dimensional processes". In 2014 IEEE 53rd Annual Conference on Decision and Control (CDC). IEEE, 2014. http://dx.doi.org/10.1109/cdc.2014.7040082.

7. Yu, Jia Yuan, and Shie Mannor. "Arbitrarily modulated Markov decision processes". In 2009 Joint 48th IEEE Conference on Decision and Control (CDC) and 28th Chinese Control Conference (CCC). IEEE, 2009. http://dx.doi.org/10.1109/cdc.2009.5400662.

8. Kozen, Dexter, Kim G. Larsen, Radu Mardare, and Prakash Panangaden. "Stone Duality for Markov Processes". In 2013 Twenty-Eighth Annual IEEE/ACM Symposium on Logic in Computer Science (LICS 2013). IEEE, 2013. http://dx.doi.org/10.1109/lics.2013.38.

9. Brázdil, T., V. Brožek, K. Etessami, A. Kučera, and D. Wojtczak. "One-Counter Markov Decision Processes". In Proceedings of the Twenty-First Annual ACM-SIAM Symposium on Discrete Algorithms. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2010. http://dx.doi.org/10.1137/1.9781611973075.70.

10. Hölzl, Johannes. "Markov processes in Isabelle/HOL". In CPP '17: Certified Proofs and Programs. New York, NY, USA: ACM, 2017. http://dx.doi.org/10.1145/3018610.3018628.


Reports on "Markov processes"

1. Adler, Robert J., Stamatis Gambanis, and Gennady Samorodnitsky. On Stable Markov Processes. Fort Belvoir, VA: Defense Technical Information Center, September 1987. http://dx.doi.org/10.21236/ada192892.

2. Abdel-Hameed, M. Markovian Shock Models, Deterioration Processes, Stratified Markov Processes Replacement Policies. Fort Belvoir, VA: Defense Technical Information Center, December 1985. http://dx.doi.org/10.21236/ada174646.

3. Newell, Alan. Markovian Shock Models, Deterioration Processes, Stratified Markov Processes and Replacement Policies. Fort Belvoir, VA: Defense Technical Information Center, May 1986. http://dx.doi.org/10.21236/ada174995.

4. Dueker, Michael J. Markov Switching in GARCH Processes and Mean Reverting Stock Market Volatility. Federal Reserve Bank of St. Louis, 1994. http://dx.doi.org/10.20955/wp.1994.015.

5. Cinlar, E. Markov Processes Applied to Control, Reliability and Replacement. Fort Belvoir, VA: Defense Technical Information Center, April 1989. http://dx.doi.org/10.21236/ada208634.

6. Rohlicek, J. R., and A. S. Willsky. Structural Decomposition of Multiple Time Scale Markov Processes. Fort Belvoir, VA: Defense Technical Information Center, October 1987. http://dx.doi.org/10.21236/ada189739.

7. Serfozo, Richard F. Poisson Functionals of Markov Processes and Queueing Networks. Fort Belvoir, VA: Defense Technical Information Center, December 1987. http://dx.doi.org/10.21236/ada191217.

8. Serfozo, R. F. Poisson Functionals of Markov Processes and Queueing Networks. Fort Belvoir, VA: Defense Technical Information Center, December 1987. http://dx.doi.org/10.21236/ada194289.

9. Draper, Bruce A., and J. Ross Beveridge. Learning to Populate Geospatial Databases via Markov Processes. Fort Belvoir, VA: Defense Technical Information Center, December 1999. http://dx.doi.org/10.21236/ada374536.

10. Chang, Hyeong S., Michael C. Fu, and Steven I. Marcus. An Adaptive Sampling Algorithm for Solving Markov Decision Processes. Fort Belvoir, VA: Defense Technical Information Center, May 2002. http://dx.doi.org/10.21236/ada438505.

We offer discounts on all premium plans for authors whose works are included in thematic literature collections. Contact us to get a unique promo code!
