Journal articles on the topic "Markov processes"

Consult the top 50 scholarly journal articles on the topic "Markov processes".

Next to every work in the bibliography there is an "Add to bibliography" button. Click it, and we will automatically generate a bibliographic reference to the selected work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the publication as a ".pdf" file and read its abstract online, whenever such details are available in the metadata.

Browse journal articles from many disciplines and compile accurate bibliographies.

1

Demenkov, N. P., E. A. Mirkin, and I. A. Mochalov. "Markov and Semi-Markov Processes with Fuzzy States. Part 1. Markov Processes". Informacionnye tehnologii 26, no. 6 (June 23, 2020): 323–34. http://dx.doi.org/10.17587/it.26.323-334.
2

Franz, Uwe. "Classical Markov Processes from Quantum Lévy Processes". Infinite Dimensional Analysis, Quantum Probability and Related Topics 02, no. 01 (March 1999): 105–29. http://dx.doi.org/10.1142/s0219025799000060.

Abstract:
We show how classical Markov processes can be obtained from quantum Lévy processes. It is shown that quantum Lévy processes are quantum Markov processes, and sufficient conditions for restrictions to subalgebras to remain quantum Markov processes are given. A classical Markov process (which has the same time-ordered moments as the quantum process in the vacuum state) exists whenever we can restrict to a commutative subalgebra without losing the quantum Markov property. Several examples, including the Azéma martingale, with explicit calculations are presented. In particular, the action of the generator of the classical Markov processes on polynomials or their moments is calculated using Hopf algebra duality.
3

Demenkov, N. P., E. A. Mirkin, and I. A. Mochalov. "Markov and Semi-Markov Processes with Fuzzy States. Part 2. Semi-Markov Processes". Informacionnye tehnologii 26, no. 7 (July 17, 2020): 387–93. http://dx.doi.org/10.17587/it.26.387-393.
4

Whittle, P., and M. L. Puterman. "Markov Decision Processes." Journal of the Royal Statistical Society. Series A (Statistics in Society) 158, no. 3 (1995): 636. http://dx.doi.org/10.2307/2983459.
5

Smith, J. Q., and D. J. White. "Markov Decision Processes." Journal of the Royal Statistical Society. Series A (Statistics in Society) 157, no. 1 (1994): 164. http://dx.doi.org/10.2307/2983520.
6

King, Aaron A., Qianying Lin, and Edward L. Ionides. "Markov genealogy processes". Theoretical Population Biology 143 (February 2022): 77–91. http://dx.doi.org/10.1016/j.tpb.2021.11.003.
7

Thomas, L. C., D. J. White, and Martin L. Puterman. "Markov Decision Processes." Journal of the Operational Research Society 46, no. 6 (June 1995): 792. http://dx.doi.org/10.2307/2584317.
8

Ephraim, Y., and N. Merhav. "Hidden Markov processes". IEEE Transactions on Information Theory 48, no. 6 (June 2002): 1518–69. http://dx.doi.org/10.1109/tit.2002.1003838.
9

Bäuerle, Nicole, and Ulrich Rieder. "Markov Decision Processes". Jahresbericht der Deutschen Mathematiker-Vereinigung 112, no. 4 (September 8, 2010): 217–43. http://dx.doi.org/10.1365/s13291-010-0007-2.
10

Wal, J., and J. Wessels. "Markov Decision Processes". Statistica Neerlandica 39, no. 2 (June 1985): 219–33. http://dx.doi.org/10.1111/j.1467-9574.1985.tb01140.x.
11

Thomas, L. C. "Markov Decision Processes". Journal of the Operational Research Society 46, no. 6 (June 1995): 792–93. http://dx.doi.org/10.1057/jors.1995.110.
12

Frank, T. D. "Nonlinear Markov processes". Physics Letters A 372, no. 25 (June 2008): 4553–55. http://dx.doi.org/10.1016/j.physleta.2008.04.027.
13

Kinateder, Kimberly K. J. "Corner Markov processes". Journal of Theoretical Probability 8, no. 3 (July 1995): 539–47. http://dx.doi.org/10.1007/bf02218043.
14

Brooks, Stephen, and D. J. White. "Markov Decision Processes." Statistician 44, no. 2 (1995): 292. http://dx.doi.org/10.2307/2348465.
15

Fagnola, Franco. "Algebraic Markov processes". Proyecciones (Antofagasta) 18, no. 3 (1999): 13–28. http://dx.doi.org/10.22199/s07160917.1999.0003.00003.
16

Craven, B. D. "Perturbed Markov Processes". Stochastic Models 19, no. 2 (January 5, 2003): 269–85. http://dx.doi.org/10.1081/stm-120020390.
17

White, Chelsea C., and Douglas J. White. "Markov decision processes". European Journal of Operational Research 39, no. 1 (March 1989): 1–16. http://dx.doi.org/10.1016/0377-2217(89)90348-2.
18

Zhenting, Hou, Liu Zaiming, and Zou Jiezhong. "Markov skeleton processes". Chinese Science Bulletin 43, no. 11 (June 1998): 881–89. http://dx.doi.org/10.1007/bf02884605.
19

Franz, Uwe, Volkmar Liebscher, and Stefan Zeiser. "Piecewise-Deterministic Markov Processes as Limits of Markov Jump Processes". Advances in Applied Probability 44, no. 3 (September 2012): 729–48. http://dx.doi.org/10.1239/aap/1346955262.

Abstract:
A classical result about Markov jump processes states that a certain class of dynamical systems given by ordinary differential equations are obtained as the limit of a sequence of scaled Markov jump processes. This approach fails if the scaling cannot be carried out equally across all entities. In the present paper we present a convergence theorem for such an unequal scaling. In contrast to an equal scaling the limit process is not purely deterministic but still possesses randomness. We show that these processes constitute a rich subclass of piecewise-deterministic processes. Such processes apply in molecular biology where entities often occur in different scales of numbers.
20

Franz, Uwe, Volkmar Liebscher, and Stefan Zeiser. "Piecewise-Deterministic Markov Processes as Limits of Markov Jump Processes". Advances in Applied Probability 44, no. 03 (September 2012): 729–48. http://dx.doi.org/10.1017/s0001867800005851.

Abstract:
A classical result about Markov jump processes states that a certain class of dynamical systems given by ordinary differential equations are obtained as the limit of a sequence of scaled Markov jump processes. This approach fails if the scaling cannot be carried out equally across all entities. In the present paper we present a convergence theorem for such an unequal scaling. In contrast to an equal scaling the limit process is not purely deterministic but still possesses randomness. We show that these processes constitute a rich subclass of piecewise-deterministic processes. Such processes apply in molecular biology where entities often occur in different scales of numbers.
21

Buchholz, Peter, and Miklós Telek. "Rational Processes Related to Communicating Markov Processes". Journal of Applied Probability 49, no. 1 (March 2012): 40–59. http://dx.doi.org/10.1239/jap/1331216833.

Abstract:
We define a class of stochastic processes, denoted as marked rational arrival processes (MRAPs), which is an extension of matrix exponential distributions and rational arrival processes. Continuous-time Markov processes with labeled transitions are a subclass of this more general model class. New equivalence relations between processes are defined, and it is shown that these equivalence relations are natural extensions of strong and weak lumpability and the corresponding bisimulation relations that have been defined for Markov processes. If a general rational process is equivalent to a Markov process, it can be used in numerical analysis techniques instead of the Markov process. This observation allows one to apply MRAPs like Markov processes and since the new equivalence relations are more general than lumpability and bisimulation, it is sometimes possible to find smaller representations of given processes. Finally, we show that the equivalence is preserved by the composition of MRAPs and can therefore be exploited in compositional modeling.
22

Buchholz, Peter, and Miklós Telek. "Rational Processes Related to Communicating Markov Processes". Journal of Applied Probability 49, no. 01 (March 2012): 40–59. http://dx.doi.org/10.1017/s0021900200008858.

Abstract:
We define a class of stochastic processes, denoted as marked rational arrival processes (MRAPs), which is an extension of matrix exponential distributions and rational arrival processes. Continuous-time Markov processes with labeled transitions are a subclass of this more general model class. New equivalence relations between processes are defined, and it is shown that these equivalence relations are natural extensions of strong and weak lumpability and the corresponding bisimulation relations that have been defined for Markov processes. If a general rational process is equivalent to a Markov process, it can be used in numerical analysis techniques instead of the Markov process. This observation allows one to apply MRAPs like Markov processes and since the new equivalence relations are more general than lumpability and bisimulation, it is sometimes possible to find smaller representations of given processes. Finally, we show that the equivalence is preserved by the composition of MRAPs and can therefore be exploited in compositional modeling.
23

Iwata, Yukiko. "Constrictive Markov operators induced by Markov processes". Positivity 20, no. 2 (September 3, 2015): 355–67. http://dx.doi.org/10.1007/s11117-015-0360-6.
24

Malinovskii, V. K. "Limit theorems for recurrent semi-Markov processes and Markov renewal processes". Journal of Soviet Mathematics 36, no. 4 (February 1987): 493–502. http://dx.doi.org/10.1007/bf01663460.
25

Fredkin, Donald R., and John A. Rice. "On aggregated Markov processes". Journal of Applied Probability 23, no. 1 (March 1986): 208–14. http://dx.doi.org/10.2307/3214130.
26

Kazak, Jolanta. "Piecewise-deterministic Markov processes". Annales Polonici Mathematici 109, no. 3 (2013): 279–96. http://dx.doi.org/10.4064/ap109-3-4.
27

Lee, P. M., and O. Hernandez-Lerma. "Adaptive Markov Control Processes". Mathematical Gazette 74, no. 470 (December 1990): 417. http://dx.doi.org/10.2307/3618186.
28

Kowalski, Zbigniew S. "Multiple Markov Gaussian processes". Applicationes Mathematicae 48, no. 1 (2021): 65–78. http://dx.doi.org/10.4064/am2411-1-2021.
29

Shieh, Narn-Rueih. "Collisions of Markov Processes". Tokyo Journal of Mathematics 18, no. 1 (June 1995): 111–21. http://dx.doi.org/10.3836/tjm/1270043612.
30

Hawkes, Alan G. "Markov processes in APL". ACM SIGAPL APL Quote Quad 20, no. 4 (May 1990): 173–85. http://dx.doi.org/10.1145/97811.97843.
31

Pollett, P. K. "Connecting reversible Markov processes". Advances in Applied Probability 18, no. 4 (December 1986): 880–900. http://dx.doi.org/10.2307/1427254.

Abstract:
We provide a framework for interconnecting a collection of reversible Markov processes in such a way that the resulting process has a product-form invariant measure with respect to which the process is reversible. A number of examples are discussed, including Kingman's reversible migration process, interconnected random walks and stratified clustering processes.
32

Gerontidis, Ioannis I. "Markov population replacement processes". Advances in Applied Probability 27, no. 3 (September 1995): 711–40. http://dx.doi.org/10.2307/1428131.

Abstract:
We consider a migration process whose singleton process is a time-dependent Markov replacement process. For the singleton process, which may be treated as either open or closed, we study the limiting distribution, the distribution of the time to replacement and related quantities. For a replacement process in equilibrium we obtain a version of Little's law and we provide conditions for reversibility. For the resulting linear population process we characterize exponential ergodicity for two types of environmental behaviour, i.e. either convergent or cyclic, and finally for large population sizes a diffusion approximation analysis is provided.
33

Alexopoulos, Christos, Akram A. El-Tannir, and Richard F. Serfozo. "Partition-Reversible Markov Processes". Operations Research 47, no. 1 (February 1999): 125–30. http://dx.doi.org/10.1287/opre.47.1.125.
34

Avrachenkov, Konstantin, Alexey Piunovskiy, and Yi Zhang. "Markov Processes with Restart". Journal of Applied Probability 50, no. 4 (December 2013): 960–68. http://dx.doi.org/10.1239/jap/1389370093.

Abstract:
We consider a general homogeneous continuous-time Markov process with restarts. The process is forced to restart from a given distribution at time moments generated by an independent Poisson process. The motivation to study such processes comes from modeling human and animal mobility patterns, restart processes in communication protocols, and from application of restarting random walks in information retrieval. We provide a connection between the transition probability functions of the original Markov process and the modified process with restarts. We give closed-form expressions for the invariant probability measure of the modified process. When the process evolves on the Euclidean space, there is also a closed-form expression for the moments of the modified process. We show that the modified process is always positive Harris recurrent and exponentially ergodic with the index equal to (or greater than) the rate of restarts. Finally, we illustrate the general results by the standard and geometric Brownian motions.
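The restart mechanism described in the abstract above lends itself to a quick numerical illustration. The following is a minimal discrete-time sketch of my own (not code from the paper, which works in continuous time with Poisson restart epochs): a simple random walk that, with a fixed per-step probability standing in for the Poisson restart clock, jumps back to a point-mass restart distribution at the origin. Consistent with the abstract, the restarts make the process positive recurrent, so the empirical occupation frequencies concentrate near the restart state.

```python
import random

def simulate_restarted_walk(steps, restart_prob, restart_state=0, seed=42):
    """Simulate a simple random walk on the integers that, at each step,
    restarts to `restart_state` with probability `restart_prob` (a
    discrete-time stand-in for restarts at Poisson epochs). Returns the
    empirical visit counts per state."""
    rng = random.Random(seed)
    x = restart_state
    visits = {}
    for _ in range(steps):
        if rng.random() < restart_prob:
            x = restart_state          # restart from a point-mass distribution
        else:
            x += rng.choice((-1, 1))   # ordinary random-walk move
        visits[x] = visits.get(x, 0) + 1
    return visits

counts = simulate_restarted_walk(steps=100_000, restart_prob=0.05)
# The plain symmetric walk is null recurrent, but the restarts pin the
# empirical distribution near the origin: state 0 is visited far more
# often than a distant state such as 20.
print(counts[0] > counts.get(20, 0))
```

The function name and parameters here are illustrative choices; the construction merely mimics the qualitative effect (exponential ergodicity induced by restarts) that the paper establishes rigorously.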
35

Novak, Stephanie, and Lyman J. Fretwell. "Non‐Markov noise processes". Journal of the Acoustical Society of America 80, S1 (December 1986): S64. http://dx.doi.org/10.1121/1.2023904.
36

Rodrigues, Josemar, N. Balakrishnan, and Patrick Borges. "Markov-Correlated Poisson Processes". Communications in Statistics - Theory and Methods 42, no. 20 (October 18, 2013): 3696–703. http://dx.doi.org/10.1080/03610926.2011.636168.
37

Larralde, H., F. Leyvraz, and D. P. Sanders. "Metastability in Markov processes". Journal of Statistical Mechanics: Theory and Experiment 2006, no. 08 (August 18, 2006): P08013. http://dx.doi.org/10.1088/1742-5468/2006/08/p08013.
38

Luo, Shou Jun. "Two-parameter Markov processes". Stochastics and Stochastic Reports 40, no. 3-4 (September 1992): 181–93. http://dx.doi.org/10.1080/17442509208833788.
39

Chin, Y. C., and A. J. Baddeley. "Markov interacting component processes". Advances in Applied Probability 32, no. 3 (September 2000): 597–619. http://dx.doi.org/10.1239/aap/1013540233.

Abstract:
A generalization of Markov point processes is introduced in which interactions occur between connected components of the point pattern. A version of the Hammersley-Clifford characterization theorem is proved which states that a point process is a Markov interacting component process if and only if its density function is a product of interaction terms associated with cliques of connected components. Integrability and superpositional properties of the processes are shown and a pairwise interaction example is used for detailed exploration.
40

Cai, Haiyan. "Piecewise deterministic Markov processes". Stochastic Analysis and Applications 11, no. 3 (January 1993): 255–74. http://dx.doi.org/10.1080/07362999308809317.
41

Accardi, Luigi, and Anilesh Mohari. "Time Reflected Markov Processes". Infinite Dimensional Analysis, Quantum Probability and Related Topics 02, no. 03 (September 1999): 397–425. http://dx.doi.org/10.1142/s0219025799000230.

Abstract:
A classical stochastic process which is Markovian for its past filtration is also Markovian for its future filtration. We show with a counterexample based on quantum liftings of a finite state classical Markov chain that this property cannot hold in the category of expected Markov processes. Using a duality theory for von Neumann algebras with weights, developed by Petz on the basis of previous results by Groh and Kümmerer, we show that a quantum version of this symmetry can be established in the category of weak Markov processes in the sense of Bhat and Parthasarathy. Here time reversal is implemented by an anti-unitary operator and a weak Markov process is time reversal invariant if and only if the associated semigroup coincides with its Petz dual. This construction allows one to extend to the quantum case, both for backward and forward processes, the Misra–Prigogine–Courbage internal time operator and to show that the two operators are intertwined by the time reversal anti-automorphism.
42

Fredkin, Donald R., and John A. Rice. "On aggregated Markov processes". Journal of Applied Probability 23, no. 01 (March 1986): 208–14. http://dx.doi.org/10.1017/s0021900200106412.
43

Halibard, Moishe, and Ido Kanter. "Markov processes and linguistics". Physica A: Statistical Mechanics and its Applications 249, no. 1-4 (January 1998): 525–35. http://dx.doi.org/10.1016/s0378-4371(97)00512-8.
44

Chin, Y. C., and A. J. Baddeley. "Markov interacting component processes". Advances in Applied Probability 32, no. 03 (September 2000): 597–619. http://dx.doi.org/10.1017/s0001867800010144.

Abstract:
A generalization of Markov point processes is introduced in which interactions occur between connected components of the point pattern. A version of the Hammersley-Clifford characterization theorem is proved which states that a point process is a Markov interacting component process if and only if its density function is a product of interaction terms associated with cliques of connected components. Integrability and superpositional properties of the processes are shown and a pairwise interaction example is used for detailed exploration.
45

Pollett, P. K. "Connecting reversible Markov processes". Advances in Applied Probability 18, no. 04 (December 1986): 880–900. http://dx.doi.org/10.1017/s0001867800016190.

Abstract:
We provide a framework for interconnecting a collection of reversible Markov processes in such a way that the resulting process has a product-form invariant measure with respect to which the process is reversible. A number of examples are discussed, including Kingman's reversible migration process, interconnected random walks and stratified clustering processes.
46

Gerontidis, Ioannis I. "Markov population replacement processes". Advances in Applied Probability 27, no. 03 (September 1995): 711–40. http://dx.doi.org/10.1017/s0001867800027129.

Abstract:
We consider a migration process whose singleton process is a time-dependent Markov replacement process. For the singleton process, which may be treated as either open or closed, we study the limiting distribution, the distribution of the time to replacement and related quantities. For a replacement process in equilibrium we obtain a version of Little's law and we provide conditions for reversibility. For the resulting linear population process we characterize exponential ergodicity for two types of environmental behaviour, i.e. either convergent or cyclic, and finally for large population sizes a diffusion approximation analysis is provided.
47

Avrachenkov, Konstantin, Alexey Piunovskiy, and Yi Zhang. "Markov Processes with Restart". Journal of Applied Probability 50, no. 04 (December 2013): 960–68. http://dx.doi.org/10.1017/s0021900200013735.

Abstract:
We consider a general homogeneous continuous-time Markov process with restarts. The process is forced to restart from a given distribution at time moments generated by an independent Poisson process. The motivation to study such processes comes from modeling human and animal mobility patterns, restart processes in communication protocols, and from application of restarting random walks in information retrieval. We provide a connection between the transition probability functions of the original Markov process and the modified process with restarts. We give closed-form expressions for the invariant probability measure of the modified process. When the process evolves on the Euclidean space, there is also a closed-form expression for the moments of the modified process. We show that the modified process is always positive Harris recurrent and exponentially ergodic with the index equal to (or greater than) the rate of restarts. Finally, we illustrate the general results by the standard and geometric Brownian motions.
48

Baykal-Gürsoy, M., and K. Gürsoy. "Semi-Markov Decision Processes". Probability in the Engineering and Informational Sciences 21, no. 4 (October 2007): 635–57. http://dx.doi.org/10.1017/s026996480700037x.

Abstract:
Considered are semi-Markov decision processes (SMDPs) with finite state and action spaces. We study two criteria: the expected average reward per unit time subject to a sample path constraint on the average cost per unit time and the expected time-average variability. Under a certain condition, for communicating SMDPs, we construct (randomized) stationary policies that are ε-optimal for each criterion; the policy is optimal for the first criterion under the unichain assumption and the policy is optimal and pure for a specific variability function in the second criterion. For general multichain SMDPs, by using a state space decomposition approach, similar results are obtained.
49

Desharnais, Josée, Vineet Gupta, Radha Jagadeesan, and Prakash Panangaden. "Approximating labelled Markov processes". Information and Computation 184, no. 1 (July 2003): 160–200. http://dx.doi.org/10.1016/s0890-5401(03)00051-8.
50

Sabbadin, Régis. "Possibilistic Markov decision processes". Engineering Applications of Artificial Intelligence 14, no. 3 (June 2001): 287–300. http://dx.doi.org/10.1016/s0952-1976(01)00007-0.