Journal articles on the topic 'Invariant distribution of Markov processes'

Consult the top 50 journal articles for your research on the topic 'Invariant distribution of Markov processes.'

1

Arnold, Barry C., and C. A. Robertson. "Autoregressive logistic processes." Journal of Applied Probability 26, no. 3 (September 1989): 524–31. http://dx.doi.org/10.2307/3214410.

Abstract:
A stochastic model is presented which yields a stationary Markov process whose invariant distribution is logistic. The model is autoregressive in character and is closely related to the autoregressive Pareto processes introduced earlier by Yeh et al. (1988). The model may be constructed to have absolutely continuous joint distributions. Analogous higher-order autoregressive and moving average processes may be constructed.
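The abstract above does not spell out the construction, but the closure property behind logistic autoregressions can be checked numerically: the standard logistic law is geometric-minimum stable. A minimal Python sketch of that stability property (the transform below is an illustration, not necessarily the model of Arnold and Robertson):

```python
import math
import random

def logistic_sample(rng):
    # Inverse-CDF sampling: F(x) = 1/(1 + e^(-x))  =>  x = log(u/(1-u)).
    u = rng.random()
    return math.log(u / (1.0 - u))

def geometric_min_transform(p, rng):
    # Draw N ~ Geometric(p) on {1, 2, ...}, take the minimum of N
    # independent standard logistic variables, and shift by -log p;
    # the result is again standard logistic (geometric-min stability).
    n = 1
    while rng.random() >= p:
        n += 1
    return min(logistic_sample(rng) for _ in range(n)) - math.log(p)

rng = random.Random(7)
p = 0.3
samples = [geometric_min_transform(p, rng) for _ in range(100_000)]

# Compare the empirical CDF with the standard logistic CDF.
for x in (-2.0, 0.0, 2.0):
    emp = sum(s <= x for s in samples) / len(samples)
    print(x, round(emp, 3), round(1.0 / (1.0 + math.exp(-x)), 3))
```

With p = 0.3 the empirical CDF of the transformed samples matches the standard logistic CDF at each test point to within Monte Carlo error, which is exactly the kind of invariance a min-based autoregression exploits.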
2

Arnold, Barry C., and C. A. Robertson. "Autoregressive logistic processes." Journal of Applied Probability 26, no. 3 (September 1989): 524–31. http://dx.doi.org/10.1017/s0021900200038122.

Abstract:
A stochastic model is presented which yields a stationary Markov process whose invariant distribution is logistic. The model is autoregressive in character and is closely related to the autoregressive Pareto processes introduced earlier by Yeh et al. (1988). The model may be constructed to have absolutely continuous joint distributions. Analogous higher-order autoregressive and moving average processes may be constructed.
3

McDonald, D. "An invariance principle for semi-Markov processes." Advances in Applied Probability 17, no. 1 (March 1985): 100–126. http://dx.doi.org/10.2307/1427055.

Abstract:
Let (I(t))t≥0 be a semi-Markov process with state space Π and recurrent probability transition kernel P. Subject to certain mixing conditions, a limit theorem holds in which Δ is an invariant probability measure for P and μb is the expected sojourn time in state b ∈ Π. We show that this limit is robust; that is, for each state b ∈ Π the sojourn-time distribution may change for each transition but, as long as the expected sojourn time in b is μb on the average, the above limit still holds. The kernel P may also vary for each transition as long as Δ is invariant.
4

McDonald, D. "An invariance principle for semi-Markov processes." Advances in Applied Probability 17, no. 1 (March 1985): 100–126. http://dx.doi.org/10.1017/s0001867800014683.

Abstract:
Let (I(t))t≥0 be a semi-Markov process with state space Π and recurrent probability transition kernel P. Subject to certain mixing conditions, a limit theorem holds in which Δ is an invariant probability measure for P and μb is the expected sojourn time in state b ∈ Π. We show that this limit is robust; that is, for each state b ∈ Π the sojourn-time distribution may change for each transition but, as long as the expected sojourn time in b is μb on the average, the above limit still holds. The kernel P may also vary for each transition as long as Δ is invariant.
5

Barnsley, Michael F., and John H. Elton. "A new class of Markov processes for image encoding." Advances in Applied Probability 20, no. 1 (March 1988): 14–32. http://dx.doi.org/10.2307/1427268.

Abstract:
A new class of iterated function systems is introduced, which allows for the computation of non-compactly supported invariant measures, which may represent, for example, greytone images of infinite extent. Conditions for the existence and attractiveness of invariant measures for this new class of randomly iterated maps, which are not necessarily contractions, in metric spaces such as ℝⁿ, are established. Estimates for moments of these measures are obtained. Special conditions are given for existence of the invariant measure in the interesting case of affine maps on ℝ. For non-singular affine maps on ℝ, the support of the measure is shown to be an infinite interval, but Fourier transform analysis shows that the measure can be purely singular even though its distribution function is strictly increasing.
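Invariant measures of randomly iterated affine maps, as in the abstract above, can be approximated by plain random iteration (the "chaos game"). A hedged sketch with made-up maps and weights, using the first-moment identity m = Σᵢ pᵢ(aᵢm + bᵢ) that holds under average contraction:

```python
import random

# Toy iterated function system on the real line: two affine maps
# x -> a_i*x + b_i, chosen independently with probabilities probs[i].
# The values are illustrative, not from the paper.
maps = [(0.5, 1.0), (-0.4, 0.0)]
probs = [0.7, 0.3]

# First moment of the invariant measure from m = sum_i p_i (a_i m + b_i):
pa = sum(p * a for p, (a, _) in zip(probs, maps))
pb = sum(p * b for p, (_, b) in zip(probs, maps))
m_exact = pb / (1.0 - pa)   # valid since |pa| < 1 (contraction on average)

rng = random.Random(0)
x, total, n = 0.0, 0.0, 0
for step in range(200_000):
    a, b = rng.choices(maps, probs)[0]   # pick a random map
    x = a * x + b
    if step >= 1_000:                    # discard burn-in
        total += x
        n += 1
m_est = total / n

print(round(m_exact, 3), round(m_est, 3))
```

The long-run average of the iterates agrees with the closed-form first moment, illustrating how moment estimates for such invariant measures can be obtained by simulation.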
6

Barnsley, Michael F., and John H. Elton. "A new class of Markov processes for image encoding." Advances in Applied Probability 20, no. 1 (March 1988): 14–32. http://dx.doi.org/10.1017/s0001867800017924.

Abstract:
A new class of iterated function systems is introduced, which allows for the computation of non-compactly supported invariant measures, which may represent, for example, greytone images of infinite extent. Conditions for the existence and attractiveness of invariant measures for this new class of randomly iterated maps, which are not necessarily contractions, in metric spaces such as ℝⁿ, are established. Estimates for moments of these measures are obtained. Special conditions are given for existence of the invariant measure in the interesting case of affine maps on ℝ. For non-singular affine maps on ℝ, the support of the measure is shown to be an infinite interval, but Fourier transform analysis shows that the measure can be purely singular even though its distribution function is strictly increasing.
7

Kalpazidou, S. "On Levy's theorem concerning positiveness of transition probabilities of Markov processes: the circuit processes case." Journal of Applied Probability 30, no. 1 (March 1993): 28–39. http://dx.doi.org/10.2307/3214619.

Abstract:
We prove Lévy's theorem concerning positiveness of transition probabilities of Markov processes when the state space is countable and an invariant probability distribution exists. Our approach relies on the representation of transition probabilities in terms of the directed circuits that occur along the sample paths.
8

Kalpazidou, S. "On Levy's theorem concerning positiveness of transition probabilities of Markov processes: the circuit processes case." Journal of Applied Probability 30, no. 1 (March 1993): 28–39. http://dx.doi.org/10.1017/s0021900200043977.

Abstract:
We prove Lévy's theorem concerning positiveness of transition probabilities of Markov processes when the state space is countable and an invariant probability distribution exists. Our approach relies on the representation of transition probabilities in terms of the directed circuits that occur along the sample paths.
9

Avrachenkov, Konstantin, Alexey Piunovskiy, and Yi Zhang. "Markov Processes with Restart." Journal of Applied Probability 50, no. 4 (December 2013): 960–68. http://dx.doi.org/10.1239/jap/1389370093.

Abstract:
We consider a general homogeneous continuous-time Markov process with restarts. The process is forced to restart from a given distribution at time moments generated by an independent Poisson process. The motivation to study such processes comes from modeling human and animal mobility patterns, restart processes in communication protocols, and from application of restarting random walks in information retrieval. We provide a connection between the transition probability functions of the original Markov process and the modified process with restarts. We give closed-form expressions for the invariant probability measure of the modified process. When the process evolves on the Euclidean space, there is also a closed-form expression for the moments of the modified process. We show that the modified process is always positive Harris recurrent and exponentially ergodic with the index equal to (or greater than) the rate of restarts. Finally, we illustrate the general results by the standard and geometric Brownian motions.
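One concrete instance of the closed-form results mentioned above: for standard Brownian motion restarted from 0 at Poisson rate r, the stationary position is a Brownian value observed at an independent Exp(r) age, so the stationary variance is 1/r. A small Euler-scheme simulation (toy parameters, my own illustration rather than the paper's derivation):

```python
import math
import random

rng = random.Random(42)

r = 1.0            # rate of the independent Poisson restart clock
dt = 0.01          # Euler time step
steps = 1_000_000  # total simulated time: steps * dt = 10,000

x = 0.0
acc2 = 0.0
n = 0
for step in range(steps):
    if rng.random() < r * dt:          # restart event in this time slice
        x = 0.0
    x += math.sqrt(dt) * rng.gauss(0.0, 1.0)
    if step >= 10_000:                 # discard burn-in
        acc2 += x * x
        n += 1

var_est = acc2 / n                     # stationary mean is 0 by symmetry
print(round(var_est, 2))               # should be near 1/r
```

The empirical stationary variance settles near 1/r, matching the mixture representation X ~ B(E) with E ~ Exp(r).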
10

Avrachenkov, Konstantin, Alexey Piunovskiy, and Yi Zhang. "Markov Processes with Restart." Journal of Applied Probability 50, no. 4 (December 2013): 960–68. http://dx.doi.org/10.1017/s0021900200013735.

Abstract:
We consider a general homogeneous continuous-time Markov process with restarts. The process is forced to restart from a given distribution at time moments generated by an independent Poisson process. The motivation to study such processes comes from modeling human and animal mobility patterns, restart processes in communication protocols, and from application of restarting random walks in information retrieval. We provide a connection between the transition probability functions of the original Markov process and the modified process with restarts. We give closed-form expressions for the invariant probability measure of the modified process. When the process evolves on the Euclidean space, there is also a closed-form expression for the moments of the modified process. We show that the modified process is always positive Harris recurrent and exponentially ergodic with the index equal to (or greater than) the rate of restarts. Finally, we illustrate the general results by the standard and geometric Brownian motions.
11

Pagès, Gilles, and Clément Rey. "Recursive computation of the invariant distributions of Feller processes: Revisited examples and new applications." Monte Carlo Methods and Applications 25, no. 1 (March 1, 2019): 1–36. http://dx.doi.org/10.1515/mcma-2018-2027.

Abstract:
In this paper, we show that the abstract framework developed in [G. Pagès and C. Rey, Recursive computation of the invariant distribution of Markov and Feller processes, preprint 2017, https://arxiv.org/abs/1703.04557] and inspired by [D. Lamberton and G. Pagès, Recursive computation of the invariant distribution of a diffusion, Bernoulli 8 (2002), 3, 367–405] can be used to build invariant distributions for Brownian diffusion processes using the Milstein scheme and for diffusion processes with censored jump using the Euler scheme. Both studies rely on a weakly mean-reverting setting. For the Milstein scheme we prove the convergence for test functions with polynomial (Wasserstein convergence) and exponential growth. For the Euler scheme of diffusion processes with censored jump we prove the convergence for test functions with polynomial growth.
12

Fredes, Luis, and Jean-François Marckert. "Invariant measures of interacting particle systems: Algebraic aspects." ESAIM: Probability and Statistics 24 (2020): 526–80. http://dx.doi.org/10.1051/ps/2020008.

Abstract:
Consider a continuous-time particle system ηt = (ηt(k), k ∈ 𝕃), indexed by a lattice 𝕃 which will be either ℤ, ℤ∕nℤ, a segment {1, ⋯, n}, or ℤd, and taking its values in the set Eκ𝕃, where Eκ = {0, ⋯, κ − 1} for some fixed κ ∈ {∞, 2, 3, ⋯}. Assume that the Markovian evolution of the particle system (PS) is driven by some translation invariant local dynamics with bounded range, encoded by a jump rate matrix 𝕋. These are standard settings, satisfied by the TASEP, the voter models, and the contact processes. The aim of this paper is to provide some sufficient and/or necessary conditions on the matrix 𝕋 so that this Markov process admits some simple invariant distribution, as a product measure (if 𝕃 is any of the spaces mentioned above), the law of a Markov process indexed by ℤ or [1, n] ∩ ℤ (if 𝕃 = ℤ or {1, …, n}), or a Gibbs measure if 𝕃 = ℤ/nℤ. Multiple applications follow: efficient ways to find invariant Markov laws for a given jump rate matrix or to prove that none exists. The voter models and the contact processes are shown not to possess any Markov laws as invariant distribution (for any memory m). (As usual, a random process X indexed by ℤ or ℕ is said to be a Markov chain with memory m ∈ {0, 1, 2, ⋯} if ℙ(Xk ∈ A | Xk−i, i ≥ 1) = ℙ(Xk ∈ A | Xk−i, 1 ≤ i ≤ m), for any k.) We also prove that some models close to these models do admit such invariant laws. We exhibit PS admitting hidden Markov chains as invariant distribution and design many PS on ℤ2, with jump rates indexed by 2 × 2 squares, admitting product invariant measures.
13

Pollett, P. K., and P. G. Taylor. "On the Problem of Establishing the Existence of Stationary Distributions for Continuous-Time Markov Chains." Probability in the Engineering and Informational Sciences 7, no. 4 (October 1993): 529–43. http://dx.doi.org/10.1017/s0269964800003119.

Abstract:
We consider the problem of establishing the existence of stationary distributions for continuous-time Markov chains directly from the transition rates Q. Given an invariant probability distribution m for Q, we show that a necessary and sufficient condition for m to be a stationary distribution for the minimal process is that Q be regular. We provide sufficient conditions for the regularity of Q that are simple to verify in practice, thus allowing one to easily identify stationary distributions for a variety of models. To illustrate our results, we shall consider three classes of multidimensional Markov chains, namely, networks of queues with batch movements, semireversible queues, and partially balanced Markov processes.
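Verifying an invariant distribution directly from the rates, as the abstract describes, amounts to checking πQ = 0. A self-contained sketch for a finite birth-death generator (an M/M/1/K queue with illustrative rates), where detailed balance yields π_j ∝ (λ/μ)^j:

```python
# Check pi Q = 0 for a finite birth-death generator (M/M/1/K queue).
# Arrival rate lam, service rate mu, and capacity K are toy values.
lam, mu, K = 2.0, 3.0, 10

# Generator matrix Q: births at rate lam, deaths at rate mu,
# diagonal chosen so each row sums to zero.
Q = [[0.0] * (K + 1) for _ in range(K + 1)]
for j in range(K + 1):
    if j < K:
        Q[j][j + 1] = lam
    if j > 0:
        Q[j][j - 1] = mu
    Q[j][j] = -sum(Q[j])

# Candidate invariant distribution from detailed balance:
# pi_{j+1} * mu = pi_j * lam  =>  pi_j proportional to (lam/mu)^j.
rho = lam / mu
w = [rho ** j for j in range(K + 1)]
pi = [x / sum(w) for x in w]

# pi Q = 0 confirms stationarity for this regular (finite) Q.
residual = max(abs(sum(pi[i] * Q[i][j] for i in range(K + 1)))
               for j in range(K + 1))
print(residual)   # numerically zero
```

For a finite state space Q is automatically regular, so this check settles the question; the paper's contribution is sufficient conditions that play the same role for infinite chains.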
14

Chen, Anyue, Kai Wang Ng, and Hanjun Zhang. "Uniqueness and Decay Properties of Markov Branching Processes with Disasters." Journal of Applied Probability 51, no. 3 (September 2014): 613–24. http://dx.doi.org/10.1239/jap/1409932662.

Abstract:
In this paper we discuss the decay properties of Markov branching processes with disasters, including the decay parameter, invariant measures, and quasistationary distributions. After showing that the corresponding q-matrix Q is always regular and, thus, that the Feller minimal Q-process is honest, we obtain the exact value of the decay parameter λC. We show that the decay parameter can be easily expressed explicitly. We further show that the Markov branching process with disaster is always λC-positive. The invariant vectors, the invariant measures, and the quasidistributions are given explicitly.
15

Chen, Anyue, Kai Wang Ng, and Hanjun Zhang. "Uniqueness and Decay Properties of Markov Branching Processes with Disasters." Journal of Applied Probability 51, no. 3 (September 2014): 613–24. http://dx.doi.org/10.1017/s0021900200011554.

Abstract:
In this paper we discuss the decay properties of Markov branching processes with disasters, including the decay parameter, invariant measures, and quasistationary distributions. After showing that the corresponding q-matrix Q is always regular and, thus, that the Feller minimal Q-process is honest, we obtain the exact value of the decay parameter λC. We show that the decay parameter can be easily expressed explicitly. We further show that the Markov branching process with disaster is always λC-positive. The invariant vectors, the invariant measures, and the quasidistributions are given explicitly.
16

Pollett, P. K. "Reversibility, invariance and μ-invariance." Advances in Applied Probability 20, no. 3 (September 1988): 600–621. http://dx.doi.org/10.2307/1427037.

Abstract:
In this paper we consider a number of questions relating to the problem of determining quasi-stationary distributions for transient Markov processes. First we find conditions under which a measure or vector that is μ-invariant for a matrix of transition rates is also μ-invariant for the family of transition matrices of the minimal process it generates. These provide a means for determining whether or not the so-called stationary conditional quasi-stationary distribution exists in the λ-transient case. The process is not assumed to be regular, nor is it assumed to be uniform or irreducible. In deriving the invariance conditions we reveal a relationship between μ-invariance and the invariance of measures for related processes called the μ-reverse and the μ-dual processes. They play a role analogous to the time-reverse process which arises in the discussion of stationary distributions. Secondly we bring the related notions of detailed balance and reversibility into the realm of quasi-stationary processes. For example, if a process can be identified as being μ-reversible, the problem of determining quasi-stationary distributions is made much simpler. Finally, we consider some practical problems that emerge when calculating quasi-stationary distributions directly from the transition rates of the process. Our results are illustrated with reference to a variety of processes, including examples of birth and death processes and the birth, death and catastrophe process.
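For a finite absorbing chain, a quasi-stationary distribution can be computed directly from the transition structure as the normalized left Perron eigenvector of the substochastic block. A toy discrete-time power-iteration sketch (the matrix below is invented for illustration; the paper itself works with continuous-time transition rates):

```python
# Quasi-stationary distribution of a toy discrete-time absorbing chain.
# P is the substochastic transition block among the transient states
# (rows sum to < 1; the leftover mass is absorption).  Values invented.
P = [[0.50, 0.30, 0.10],
     [0.20, 0.50, 0.20],
     [0.10, 0.30, 0.40]]

m = [1 / 3, 1 / 3, 1 / 3]        # initial distribution on transient states
for _ in range(500):
    # One step of m P, then renormalize (condition on non-absorption).
    m = [sum(m[i] * P[i][j] for i in range(3)) for j in range(3)]
    s = sum(m)
    m = [x / s for x in m]

# m is now (approximately) the QSD: m P = eig * m with eig the Perron root.
mp = [sum(m[i] * P[i][j] for i in range(3)) for j in range(3)]
eig = sum(mp)                    # one-step survival probability under the QSD
residual = max(abs(mp[j] - eig * m[j]) for j in range(3))
print([round(x, 4) for x in m], round(eig, 4), residual)
```

The eigenvalue `eig` plays the role of the decay parameter: started from the QSD, the chain survives each step with probability `eig`, and the conditional distribution does not change.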
17

Pollett, P. K. "Reversibility, invariance and μ-invariance." Advances in Applied Probability 20, no. 3 (September 1988): 600–621. http://dx.doi.org/10.1017/s0001867800018164.

Abstract:
In this paper we consider a number of questions relating to the problem of determining quasi-stationary distributions for transient Markov processes. First we find conditions under which a measure or vector that is μ-invariant for a matrix of transition rates is also μ-invariant for the family of transition matrices of the minimal process it generates. These provide a means for determining whether or not the so-called stationary conditional quasi-stationary distribution exists in the λ-transient case. The process is not assumed to be regular, nor is it assumed to be uniform or irreducible. In deriving the invariance conditions we reveal a relationship between μ-invariance and the invariance of measures for related processes called the μ-reverse and the μ-dual processes. They play a role analogous to the time-reverse process which arises in the discussion of stationary distributions. Secondly we bring the related notions of detailed balance and reversibility into the realm of quasi-stationary processes. For example, if a process can be identified as being μ-reversible, the problem of determining quasi-stationary distributions is made much simpler. Finally, we consider some practical problems that emerge when calculating quasi-stationary distributions directly from the transition rates of the process. Our results are illustrated with reference to a variety of processes, including examples of birth and death processes and the birth, death and catastrophe process.
18

Kazakevičius, Vytautas, and Remigijus Leipus. "A new theorem on the existence of invariant distributions with applications to ARCH processes." Journal of Applied Probability 40, no. 1 (March 2003): 147–62. http://dx.doi.org/10.1239/jap/1044476832.

Abstract:
A new theorem on the existence of an invariant initial distribution for a Markov chain evolving on a Polish space is proved. As an application of the theorem, sufficient conditions for the existence of integrated ARCH processes are established. In the case where these conditions are violated, the top Lyapunov exponent is shown to be zero.
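For the ARCH(1) recursion X_t = σ_t ε_t with σ_t² = ω + αX_{t−1}², an invariant distribution exists when the top Lyapunov exponent E log(αε²) is negative; for Gaussian ε this equals log α − γ − log 2 ≈ log α − 1.27, so even α = 1 is strictly stationary despite infinite variance. A Monte Carlo check of that closed form (an illustrative special case, not the integrated-ARCH analysis of the paper):

```python
import math
import random

rng = random.Random(1)

def lyapunov_estimate(alpha, n=200_000):
    # Monte Carlo estimate of E[log(alpha * eps^2)] for Gaussian eps.
    # A negative value implies strict stationarity of the ARCH(1) chain
    # X_t = sigma_t * eps_t with sigma_t^2 = omega + alpha * X_{t-1}^2.
    acc = 0.0
    for _ in range(n):
        e = rng.gauss(0.0, 1.0)
        acc += math.log(alpha * e * e)
    return acc / n

# Closed form for Gaussian eps: E[log eps^2] = -(Euler gamma) - log 2.
exact = math.log(1.0) - 0.5772156649 - math.log(2.0)
est = lyapunov_estimate(1.0)
print(round(exact, 3), round(est, 3))   # both near -1.27
```

The estimate agrees with the closed form; the boundary case of the theorem corresponds to this exponent being exactly zero.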
19

Kazakevičius, Vytautas, and Remigijus Leipus. "A new theorem on the existence of invariant distributions with applications to ARCH processes." Journal of Applied Probability 40, no. 1 (March 2003): 147–62. http://dx.doi.org/10.1017/s0021900200022312.

Abstract:
A new theorem on the existence of an invariant initial distribution for a Markov chain evolving on a Polish space is proved. As an application of the theorem, sufficient conditions for the existence of integrated ARCH processes are established. In the case where these conditions are violated, the top Lyapunov exponent is shown to be zero.
20

Li, Yangrong, Anthony G. Pakes, Jia Li, and Anhui Gu. "The Limit Behavior of Dual Markov Branching Processes." Journal of Applied Probability 45, no. 1 (March 2008): 176–89. http://dx.doi.org/10.1239/jap/1208358960.

Abstract:
A dual Markov branching process (DMBP) is by definition a Siegmund's predual of some Markov branching process (MBP). Such a process does exist and is uniquely determined by the so-called dual-branching property. Its q-matrix Q is derived and proved to be regular and monotone. Several equivalent definitions for a DMBP are given. The criteria for transience, positive recurrence, strong ergodicity, and the Feller property are established. The invariant distributions are given by a clear formulation with a geometric limit law.
21

Li, Yangrong, Anthony G. Pakes, Jia Li, and Anhui Gu. "The Limit Behavior of Dual Markov Branching Processes." Journal of Applied Probability 45, no. 1 (March 2008): 176–89. http://dx.doi.org/10.1017/s0021900200004046.

Abstract:
A dual Markov branching process (DMBP) is by definition a Siegmund's predual of some Markov branching process (MBP). Such a process does exist and is uniquely determined by the so-called dual-branching property. Its q-matrix Q is derived and proved to be regular and monotone. Several equivalent definitions for a DMBP are given. The criteria for transience, positive recurrence, strong ergodicity, and the Feller property are established. The invariant distributions are given by a clear formulation with a geometric limit law.
22

Nair, M. G., and P. K. Pollett. "On the relationship between µ-invariant measures and quasi-stationary distributions for continuous-time Markov chains." Advances in Applied Probability 25, no. 1 (March 1993): 82–102. http://dx.doi.org/10.2307/1427497.

Abstract:
In a recent paper, van Doorn (1991) explained how quasi-stationary distributions for an absorbing birth-death process could be determined from the transition rates of the process, thus generalizing earlier work of Cavender (1978). In this paper we shall show that many of van Doorn's results can be extended to deal with an arbitrary continuous-time Markov chain over a countable state space, consisting of an irreducible class, C, and an absorbing state, 0, which is accessible from C. Some of our results are extensions of theorems proved for honest chains in Pollett and Vere-Jones (1992). In Section 3 we prove that a probability distribution on C is a quasi-stationary distribution if and only if it is a μ-invariant measure for the transition function, P. We shall also show that if m is a quasi-stationary distribution for P, then a necessary and sufficient condition for m to be μ-invariant for Q is that P satisfies the Kolmogorov forward equations over C. When the remaining forward equations hold, the quasi-stationary distribution must satisfy a set of ‘residual equations’ involving the transition rates into the absorbing state. The residual equations allow us to determine the value of μ for which the quasi-stationary distribution is μ-invariant for P. We also prove some more general results giving bounds on the values of μ for which a convergent measure can be a μ-subinvariant and then μ-invariant measure for P. The remainder of the paper is devoted to the question of when a convergent μ-subinvariant measure, m, for Q is a quasi-stationary distribution. Section 4 establishes a necessary and sufficient condition for m to be a quasi-stationary distribution for the minimal chain. In Section 5 we consider ‘single-exit’ chains. We derive a necessary and sufficient condition for there to exist a process for which m is a quasi-stationary distribution. Under this condition all such processes can be specified explicitly through their resolvents.
The results proved here allow us to conclude that the bounds for μ obtained in Section 3 are, in fact, tight. Finally, in Section 6, we illustrate our results by way of two examples: regular birth-death processes and a pure-birth process with absorption.
23

Nair, M. G., and P. K. Pollett. "On the relationship between µ-invariant measures and quasi-stationary distributions for continuous-time Markov chains." Advances in Applied Probability 25, no. 1 (March 1993): 82–102. http://dx.doi.org/10.1017/s0001867800025180.

Abstract:
In a recent paper, van Doorn (1991) explained how quasi-stationary distributions for an absorbing birth-death process could be determined from the transition rates of the process, thus generalizing earlier work of Cavender (1978). In this paper we shall show that many of van Doorn's results can be extended to deal with an arbitrary continuous-time Markov chain over a countable state space, consisting of an irreducible class, C, and an absorbing state, 0, which is accessible from C. Some of our results are extensions of theorems proved for honest chains in Pollett and Vere-Jones (1992). In Section 3 we prove that a probability distribution on C is a quasi-stationary distribution if and only if it is a μ-invariant measure for the transition function, P. We shall also show that if m is a quasi-stationary distribution for P, then a necessary and sufficient condition for m to be μ-invariant for Q is that P satisfies the Kolmogorov forward equations over C. When the remaining forward equations hold, the quasi-stationary distribution must satisfy a set of ‘residual equations’ involving the transition rates into the absorbing state. The residual equations allow us to determine the value of μ for which the quasi-stationary distribution is μ-invariant for P. We also prove some more general results giving bounds on the values of μ for which a convergent measure can be a μ-subinvariant and then μ-invariant measure for P. The remainder of the paper is devoted to the question of when a convergent μ-subinvariant measure, m, for Q is a quasi-stationary distribution. Section 4 establishes a necessary and sufficient condition for m to be a quasi-stationary distribution for the minimal chain. In Section 5 we consider ‘single-exit’ chains. We derive a necessary and sufficient condition for there to exist a process for which m is a quasi-stationary distribution. Under this condition all such processes can be specified explicitly through their resolvents.
The results proved here allow us to conclude that the bounds for μ obtained in Section 3 are, in fact, tight. Finally, in Section 6, we illustrate our results by way of two examples: regular birth-death processes and a pure-birth process with absorption.
24

Ferrari, Pablo A., and Nancy Lopes Garcia. "One-dimensional loss networks and conditioned M/G/∞ queues." Journal of Applied Probability 35, no. 4 (December 1998): 963–75. http://dx.doi.org/10.1239/jap/1032438391.

Abstract:
We study one-dimensional continuous loss networks with length distribution G and cable capacity C. We prove that the unique stationary distribution ηL of the network for which the restriction on the number of calls to be less than C is imposed only in the segment [−L,L] is the same as the distribution of a stationary M/G/∞ queue conditioned to be less than C in the time interval [−L,L]. For distributions G which are of phase type (= absorbing times of finite state Markov processes) we show that the limit as L → ∞ of ηL exists and is unique. The limiting distribution turns out to be invariant for the infinite loss network. This was conjectured by Kelly (1991).
25

Asselah, Amine, Pablo A. Ferrari, and Pablo Groisman. "Quasistationary Distributions and Fleming-Viot Processes in Finite Spaces." Journal of Applied Probability 48, no. 2 (June 2011): 322–32. http://dx.doi.org/10.1017/s0021900200007907.

Abstract:
Consider a continuous-time Markov process with transition rates matrix Q in the state space Λ ⋃ {0}. In the associated Fleming-Viot process N particles evolve independently in Λ with transition rates matrix Q until one of them attempts to jump to state 0. At this moment the particle jumps to one of the positions of the other particles, chosen uniformly at random. When Λ is finite, we show that the empirical distribution of the particles at a fixed time converges as N → ∞ to the distribution of a single particle at the same time conditioned on not touching {0}. Furthermore, the empirical profile of the unique invariant measure for the Fleming-Viot process with N particles converges as N → ∞ to the unique quasistationary distribution of the one-particle motion. A key element of the approach is to show that the two-particle correlations are of order 1/N.
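The particle mechanism described above is easy to mimic in discrete time: each particle moves independently, and an attempted jump to the absorbing state is replaced by jumping onto a uniformly chosen other particle. A toy two-state sketch (transition probabilities invented for illustration), whose time-averaged empirical distribution should approach the quasistationary distribution (4/7, 3/7) of the conditioned one-particle chain:

```python
import random

rng = random.Random(3)

# Substochastic one-particle dynamics on two transient states {0, 1};
# from either state the leftover probability 0.2 is absorption.
P = [[0.5, 0.3],    # from state 0: stay 0.5, move 0.3, absorbed 0.2
     [0.4, 0.4]]    # from state 1: move 0.4, stay 0.4, absorbed 0.2

N = 300
particles = [rng.randrange(2) for _ in range(N)]
counts = [0, 0]
for step in range(4_000):
    for k in range(N):
        u = rng.random()
        s = particles[k]
        if u < P[s][0]:
            particles[k] = 0
        elif u < P[s][0] + P[s][1]:
            particles[k] = 1
        else:
            # Absorption attempt: jump onto a uniformly chosen other particle.
            j = rng.randrange(N - 1)
            particles[k] = particles[j if j < k else j + 1]
    if step >= 1_000:                  # discard burn-in
        c0 = particles.count(0)
        counts[0] += c0
        counts[1] += N - c0

emp = [c / sum(counts) for c in counts]
print([round(x, 3) for x in emp])      # close to the QSD (4/7, 3/7)
```

Here (4/7, 3/7) is the normalized left Perron eigenvector of P, computable by hand; the simulation illustrates the abstract's convergence statement with an O(1/N) bias.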
26

Asselah, Amine, Pablo A. Ferrari, and Pablo Groisman. "Quasistationary Distributions and Fleming-Viot Processes in Finite Spaces." Journal of Applied Probability 48, no. 2 (June 2011): 322–32. http://dx.doi.org/10.1239/jap/1308662630.

Abstract:
Consider a continuous-time Markov process with transition rates matrix Q in the state space Λ ⋃ {0}. In the associated Fleming-Viot process N particles evolve independently in Λ with transition rates matrix Q until one of them attempts to jump to state 0. At this moment the particle jumps to one of the positions of the other particles, chosen uniformly at random. When Λ is finite, we show that the empirical distribution of the particles at a fixed time converges as N → ∞ to the distribution of a single particle at the same time conditioned on not touching {0}. Furthermore, the empirical profile of the unique invariant measure for the Fleming-Viot process with N particles converges as N → ∞ to the unique quasistationary distribution of the one-particle motion. A key element of the approach is to show that the two-particle correlations are of order 1 / N.
27

Ferrari, Pablo A., and Nancy Lopes Garcia. "One-dimensional loss networks and conditioned M/G/∞ queues." Journal of Applied Probability 35, no. 4 (December 1998): 963–75. http://dx.doi.org/10.1017/s0021900200016661.

Abstract:
We study one-dimensional continuous loss networks with length distribution G and cable capacity C. We prove that the unique stationary distribution ηL of the network for which the restriction on the number of calls to be less than C is imposed only in the segment [−L,L] is the same as the distribution of a stationary M/G/∞ queue conditioned to be less than C in the time interval [−L,L]. For distributions G which are of phase type (= absorbing times of finite state Markov processes) we show that the limit as L → ∞ of ηL exists and is unique. The limiting distribution turns out to be invariant for the infinite loss network. This was conjectured by Kelly (1991).
28

Pollett, P. K., and A. J. Roberts. "A description of the long-term behaviour of absorbing continuous-time Markov chains using a centre manifold." Advances in Applied Probability 22, no. 1 (March 1990): 111–28. http://dx.doi.org/10.2307/1427600.

Abstract:
We use the notion of an invariant manifold to describe the long-term behaviour of absorbing continuous-time Markov processes with a denumerable infinity of states. We show that there exists an invariant manifold for the forward differential equations and we are able to describe the evolution of the state probabilities on this manifold. Our approach gives rise to a new method for calculating conditional limiting distributions, one which is also appropriate for dealing with processes whose transition probabilities satisfy a system of non-linear differential equations.
APA, Harvard, Vancouver, ISO, and other styles
29

Pollett, P. K., and A. J. Roberts. "A description of the long-term behaviour of absorbing continuous-time Markov chains using a centre manifold." Advances in Applied Probability 22, no. 01 (March 1990): 111–28. http://dx.doi.org/10.1017/s0001867800019364.

Full text
Abstract:
We use the notion of an invariant manifold to describe the long-term behaviour of absorbing continuous-time Markov processes with a denumerable infinity of states. We show that there exists an invariant manifold for the forward differential equations and we are able to describe the evolution of the state probabilities on this manifold. Our approach gives rise to a new method for calculating conditional limiting distributions, one which is also appropriate for dealing with processes whose transition probabilities satisfy a system of non-linear differential equations.
APA, Harvard, Vancouver, ISO, and other styles
30

Daduna, Hans, and Ryszard Szekli. "Correlation formulas for Markovian network processes in a random environment." Advances in Applied Probability 48, no. 1 (March 2016): 176–98. http://dx.doi.org/10.1017/apr.2015.12.

Full text
Abstract:
We consider Markov processes, which describe, e.g. queueing network processes, in a random environment which influences the network by determining random breakdown of nodes, and the necessity of repair thereafter. Starting from an explicit steady-state distribution of product form available in the literature, we note that this steady-state distribution does not provide information about the correlation structure in time and space (over nodes). We study this correlation structure via one-step correlations for the queueing-environment process. Although formulas for absolute values of these correlations are complicated, the differences of correlations of related networks are simple and have a nice structure. We therefore compare two networks in a random environment having the same invariant distribution, and focus on the time behaviour of the processes when in such a network the environment changes or the rules for travelling are perturbed. Evaluating the comparison formulas we compare spectral gaps and asymptotic variances of related processes.
APA, Harvard, Vancouver, ISO, and other styles
31

Klebaner, F. C., U. Rösler, and S. Sagitov. "Transformations of Galton-Watson processes and linear fractional reproduction." Advances in Applied Probability 39, no. 4 (December 2007): 1036–53. http://dx.doi.org/10.1239/aap/1198177238.

Full text
Abstract:
By establishing general relationships between branching transformations (Harris-Sevastyanov, Lamperti-Ney, time reversals, and Asmussen-Sigman) and Markov chain transforms (Doob's h-transform, time reversal, and the cone dual), we discover a deeper connection between these transformations with harmonic functions and invariant measures for the process itself and its space-time process. We give a classification of the duals into Doob's h-transform, pathwise time reversal, and cone reversal. Explicit results are obtained for the linear fractional offspring distribution. Remarkably, for this case, all reversals turn out to be a Galton-Watson process with a dual reproduction law and eternal particle or some kind of immigration. In particular, we generalize a result of Klebaner and Sagitov (2002) in which only a geometric offspring distribution was considered. A new graphical representation in terms of an associated simple random walk on ℕ² allows for illuminating picture proofs of our main results concerning transformations of the linear fractional Galton-Watson process.
APA, Harvard, Vancouver, ISO, and other styles
32

Klebaner, F. C., U. Rösler, and S. Sagitov. "Transformations of Galton-Watson processes and linear fractional reproduction." Advances in Applied Probability 39, no. 04 (December 2007): 1036–53. http://dx.doi.org/10.1017/s0001867800002226.

Full text
Abstract:
By establishing general relationships between branching transformations (Harris-Sevastyanov, Lamperti-Ney, time reversals, and Asmussen-Sigman) and Markov chain transforms (Doob's h-transform, time reversal, and the cone dual), we discover a deeper connection between these transformations with harmonic functions and invariant measures for the process itself and its space-time process. We give a classification of the duals into Doob's h-transform, pathwise time reversal, and cone reversal. Explicit results are obtained for the linear fractional offspring distribution. Remarkably, for this case, all reversals turn out to be a Galton-Watson process with a dual reproduction law and eternal particle or some kind of immigration. In particular, we generalize a result of Klebaner and Sagitov (2002) in which only a geometric offspring distribution was considered. A new graphical representation in terms of an associated simple random walk on ℕ² allows for illuminating picture proofs of our main results concerning transformations of the linear fractional Galton-Watson process.
APA, Harvard, Vancouver, ISO, and other styles
33

Thierrin, Ferenc Cole, Fady Alajaji, and Tamás Linder. "Rényi Cross-Entropy Measures for Common Distributions and Processes with Memory." Entropy 24, no. 10 (October 4, 2022): 1417. http://dx.doi.org/10.3390/e24101417.

Full text
Abstract:
Two Rényi-type generalizations of the Shannon cross-entropy, the Rényi cross-entropy and the Natural Rényi cross-entropy, were recently used as loss functions for the improved design of deep learning generative adversarial networks. In this work, we derive the Rényi and Natural Rényi differential cross-entropy measures in closed form for a wide class of common continuous distributions belonging to the exponential family, and we tabulate the results for ease of reference. We also summarise the Rényi-type cross-entropy rates between stationary Gaussian processes and between finite-alphabet time-invariant Markov sources.
APA, Harvard, Vancouver, ISO, and other styles
34

Fackeldey, Konstantin, Amir Niknejad, and Marcus Weber. "Finding metastabilities in reversible Markov chains based on incomplete sampling." Special Matrices 5, no. 1 (January 26, 2017): 73–81. http://dx.doi.org/10.1515/spma-2017-0006.

Full text
Abstract:
In order to fully characterize the state-transition behaviour of finite Markov chains one needs to provide the corresponding transition matrix P. In many applications such as molecular simulation and drug design, the entries of the transition matrix P are estimated by generating realizations of the Markov chain and determining the one-step conditional probability P_ij for a transition from state i to state j. This sampling can be computationally very demanding. Therefore, it is a good idea to reduce the sampling effort. The main purpose of this paper is to design a sampling strategy which provides a partial sampling of only a subset of the rows of such a matrix P. Our proposed approach fits very well to stochastic processes stemming from simulation of molecular systems or random walks on graphs, and it is different from matrix completion approaches, which try to approximate the transition matrix by using a low-rank assumption. It will be shown how Markov chains can be analyzed on the basis of a partial sampling. More precisely: first, we will estimate the stationary distribution from a partially given matrix P. Second, we will estimate the infinitesimal generator Q of P on the basis of this stationary distribution. Third, from the generator we will compute the leading invariant subspace, which should be identical to the leading invariant subspace of P. Fourth, we will apply Robust Perron Cluster Analysis (PCCA+) in order to identify metastabilities using this subspace.
APA, Harvard, Vancouver, ISO, and other styles
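As context for the first step mentioned in this abstract, here is the full-information baseline: computing the stationary distribution of a completely known row-stochastic matrix P by power iteration. The paper's contribution is estimating this distribution from only a partial sampling of the rows; the sketch below does not implement that scheme.

```python
def stationary_distribution(P, tol=1e-12, max_iter=100000):
    """Stationary distribution pi of a row-stochastic matrix P
    (solving pi P = pi) by power iteration. This is the standard
    full-information computation, not the paper's partial-sampling
    estimator.
    """
    n = len(P)
    pi = [1.0 / n] * n  # start from the uniform distribution
    for _ in range(max_iter):
        # one step of the iteration: new_j = sum_i pi_i * P_ij
        new = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
        if max(abs(a - b) for a, b in zip(new, pi)) < tol:
            return new
        pi = new
    return pi
```

For example, P = [[0.9, 0.1], [0.5, 0.5]] has stationary distribution (5/6, 1/6), which the iteration recovers to high accuracy.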
35

Chen, Francis K. C., and Richard Cowan. "Invariant distributions for shapes in sequences of randomly-divided rectangles." Advances in Applied Probability 31, no. 1 (March 1999): 1–14. http://dx.doi.org/10.1239/aap/1029954262.

Full text
Abstract:
Interest has been shown in Markovian sequences of geometric shapes. Mostly the equations for invariant probability measures over shape space are extremely complicated and multidimensional. This paper deals with rectangles, which have a simple one-dimensional shape descriptor. We explore the invariant distributions of shape under a variety of randomised rules for splitting the rectangle into two sub-rectangles, with numerous methods for selecting the next shape in sequence. Many explicit results emerge. These help to fill a vacant niche in shape theory, whilst at the same time contributing new distributions on [0,1] and interesting examples of Markov processes or, in the language of another discipline, of stochastic dynamical systems.
APA, Harvard, Vancouver, ISO, and other styles
36

Chen, Francis K. C., and Richard Cowan. "Invariant distributions for shapes in sequences of randomly-divided rectangles." Advances in Applied Probability 31, no. 01 (March 1999): 1–14. http://dx.doi.org/10.1017/s0001867800008910.

Full text
Abstract:
Interest has been shown in Markovian sequences of geometric shapes. Mostly the equations for invariant probability measures over shape space are extremely complicated and multidimensional. This paper deals with rectangles, which have a simple one-dimensional shape descriptor. We explore the invariant distributions of shape under a variety of randomised rules for splitting the rectangle into two sub-rectangles, with numerous methods for selecting the next shape in sequence. Many explicit results emerge. These help to fill a vacant niche in shape theory, whilst at the same time contributing new distributions on [0,1] and interesting examples of Markov processes or, in the language of another discipline, of stochastic dynamical systems.
APA, Harvard, Vancouver, ISO, and other styles
37

Ball, Frank, Robin K. Milne, Ian D. Tame, and Geoffrey F. Yeo. "Superposition of Interacting Aggregated Continuous-Time Markov Chains." Advances in Applied Probability 29, no. 1 (March 1997): 56–91. http://dx.doi.org/10.2307/1427861.

Full text
Abstract:
Consider a system of interacting finite Markov chains in continuous time, where each subsystem is aggregated by a common partitioning of the state space. The interaction is assumed to arise from dependence of some of the transition rates for a given subsystem at a specified time on the states of the other subsystems at that time. With two subsystem classes, labelled 0 and 1, the superposition process arising from a system counts the number of subsystems in the latter class. Key structure and results from the theory of aggregated Markov processes are summarized. These are then applied also to superposition processes. In particular, we consider invariant distributions for the level m entry process, marginal and joint distributions for sojourn-times of the superposition process at its various levels, and moments and correlation functions associated with these distributions. The distributions are obtained mainly by using matrix methods, though an approach based on point process methods and conditional probability arguments is outlined. Conditions under which an interacting aggregated Markov chain is reversible are established. The ideas are illustrated with simple examples for which numerical results are obtained using Matlab. Motivation for this study has come from stochastic modelling of the behaviour of ion channels; another application is in reliability modelling.
APA, Harvard, Vancouver, ISO, and other styles
38

Ball, Frank, Robin K. Milne, Ian D. Tame, and Geoffrey F. Yeo. "Superposition of Interacting Aggregated Continuous-Time Markov Chains." Advances in Applied Probability 29, no. 01 (March 1997): 56–91. http://dx.doi.org/10.1017/s0001867800027798.

Full text
Abstract:
Consider a system of interacting finite Markov chains in continuous time, where each subsystem is aggregated by a common partitioning of the state space. The interaction is assumed to arise from dependence of some of the transition rates for a given subsystem at a specified time on the states of the other subsystems at that time. With two subsystem classes, labelled 0 and 1, the superposition process arising from a system counts the number of subsystems in the latter class. Key structure and results from the theory of aggregated Markov processes are summarized. These are then applied also to superposition processes. In particular, we consider invariant distributions for the level m entry process, marginal and joint distributions for sojourn-times of the superposition process at its various levels, and moments and correlation functions associated with these distributions. The distributions are obtained mainly by using matrix methods, though an approach based on point process methods and conditional probability arguments is outlined. Conditions under which an interacting aggregated Markov chain is reversible are established. The ideas are illustrated with simple examples for which numerical results are obtained using Matlab. Motivation for this study has come from stochastic modelling of the behaviour of ion channels; another application is in reliability modelling.
APA, Harvard, Vancouver, ISO, and other styles
39

Dshalalow, Jewgeni. "Multichannel queueing systems with infinite waiting room and stochastic control." Journal of Applied Probability 26, no. 2 (June 1989): 345–62. http://dx.doi.org/10.2307/3214040.

Full text
Abstract:
A wide class of multichannel queueing models appears to be useful in practice where the input stream of customers can be controlled at the moments preceding the customers' departures from the source (e.g. airports, transportation systems, inventories, tandem queues). In addition, the servicing facility can govern the intensity of the servicing process, which further improves the flexibility of the system. In such a multichannel queue with infinite waiting room the queueing process {Z_t; t ≧ 0} is under investigation. The author obtains explicit formulas for the limiting distribution of (Z_t), partly using an approach developed in previous work and based on the theory of semi-regenerative processes. Among other results the limiting distributions of the actual and virtual waiting time are derived. The input stream (which is not recurrent) is investigated, and the distribution of the residual time from t to the next arrival is obtained. The author also treats a Markov chain embedded in (Z_t) and gives a necessary and sufficient condition for its existence. Under this condition the invariant probability measure is derived.
APA, Harvard, Vancouver, ISO, and other styles
40

Dshalalow, Jewgeni. "Multichannel queueing systems with infinite waiting room and stochastic control." Journal of Applied Probability 26, no. 02 (June 1989): 345–62. http://dx.doi.org/10.1017/s0021900200027339.

Full text
Abstract:
A wide class of multichannel queueing models appears to be useful in practice where the input stream of customers can be controlled at the moments preceding the customers' departures from the source (e.g. airports, transportation systems, inventories, tandem queues). In addition, the servicing facility can govern the intensity of the servicing process, which further improves the flexibility of the system. In such a multichannel queue with infinite waiting room the queueing process {Z_t; t ≧ 0} is under investigation. The author obtains explicit formulas for the limiting distribution of (Z_t), partly using an approach developed in previous work and based on the theory of semi-regenerative processes. Among other results the limiting distributions of the actual and virtual waiting time are derived. The input stream (which is not recurrent) is investigated, and the distribution of the residual time from t to the next arrival is obtained. The author also treats a Markov chain embedded in (Z_t) and gives a necessary and sufficient condition for its existence. Under this condition the invariant probability measure is derived.
APA, Harvard, Vancouver, ISO, and other styles
41

Robini, Marc C., Yoram Bresler, and Isabelle E. Magnin. "ON THE CONVERGENCE OF METROPOLIS-TYPE RELAXATION AND ANNEALING WITH CONSTRAINTS." Probability in the Engineering and Informational Sciences 16, no. 4 (October 2002): 427–52. http://dx.doi.org/10.1017/s0269964802164035.

Full text
Abstract:
We discuss the asymptotic behavior of time-inhomogeneous Metropolis chains for solving constrained sampling and optimization problems. In addition to the usual inverse temperature schedule (β_n)_{n∈ℕ*}, the type of Markov processes under consideration is controlled by a divergent sequence (θ_n)_{n∈ℕ*} of parameters acting as Lagrange multipliers. The associated transition probability matrices (P_{β_n,θ_n})_{n∈ℕ*} are defined by P_{β,θ}(x, y) = q(x, y)exp(−β(W_θ(y) − W_θ(x))⁺) for all pairs (x, y) of distinct elements of a finite set Ω, where q is an irreducible and reversible Markov kernel and the energy function W_θ is of the form W_θ = U + θV for some functions U, V : Ω → ℝ. Our approach, which is based on a comparison of the distribution of the chain at time n with the invariant measure of P_{β_n,θ_n}, requires the computation of an upper bound for the second largest eigenvalue in absolute value of P_{β_n,θ_n}. We extend the geometric bounds derived by Ingrassia and we give new sufficient conditions on the control sequences for the algorithm to simulate a Gibbs distribution with energy U on the constrained set Ω̃ = {x ∈ Ω : V(x) = min_{z∈Ω} V(z)} and to minimize U over Ω̃.
APA, Harvard, Vancouver, ISO, and other styles
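The transition mechanism described in this abstract, Metropolis acceptance with energy W_θ = U + θV under divergent schedules (β_n) and (θ_n), can be sketched as follows. The logarithmic schedules and the toy setup are illustrative assumptions, not the specific control sequences the paper analyzes.

```python
import math
import random

def constrained_annealing(states, U, V, neighbors, n_steps, seed=0):
    """Metropolis-type annealing with a Lagrange-multiplier schedule:
    at step n the energy is W_theta = U + theta_n * V and a proposed
    move is accepted with probability exp(-beta_n * max(W(y)-W(x), 0)).

    `neighbors(x)` plays the role of a symmetric proposal kernel q.
    The schedules below are illustrative choices only.
    """
    rng = random.Random(seed)
    x = rng.choice(states)
    for n in range(1, n_steps + 1):
        beta = math.log(n + 1)   # slowly increasing inverse temperature
        theta = math.log(n + 1)  # divergent multiplier penalizing V > min V
        y = rng.choice(neighbors(x))
        w = lambda z: U(z) + theta * V(z)
        if rng.random() < math.exp(-beta * max(w(y) - w(x), 0.0)):
            x = y
    return x
```

For instance, with states 0..9, U(x) = x, and V(x) = x mod 2 (so the constrained set is the even states), long runs should concentrate near the constrained minimizer x = 0, though any single run remains stochastic.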
42

Chong, Siang Yew, Peter Tiňo, Jun He, and Xin Yao. "A New Framework for Analysis of Coevolutionary Systems—Directed Graph Representation and Random Walks." Evolutionary Computation 27, no. 2 (June 2019): 195–228. http://dx.doi.org/10.1162/evco_a_00218.

Full text
Abstract:
Studying coevolutionary systems in the context of simplified models (i.e., games with pairwise interactions between coevolving solutions modeled as self plays) remains an open challenge, since the rich underlying structures associated with pairwise-comparison-based fitness measures are often not taken fully into account. Although cyclic dynamics have been demonstrated in several contexts (such as intransitivity in coevolutionary problems), there is no complete characterization of cycle structures and their effects on coevolutionary search. We develop a new framework to address this issue. At the core of our approach is the directed graph (digraph) representation of coevolutionary problems, which fully captures structures in the relations between candidate solutions. Coevolutionary processes are modeled as a specific type of Markov chain: random walks on digraphs. Using this framework, we show that coevolutionary problems admit a qualitative characterization: a coevolutionary problem is either solvable (there is a subset of solutions that dominates the remaining candidate solutions) or not. This has implications for coevolutionary search. We further develop our framework to provide the means to construct quantitative tools for the analysis of coevolutionary processes and demonstrate their applications through case studies. We show that coevolution of solvable problems corresponds to an absorbing Markov chain, for which we can compute the expected hitting time of the absorbing class. Otherwise, coevolution will cycle indefinitely and the quantity of interest will be the limiting invariant distribution of the Markov chain. We also provide an index for characterizing complexity in coevolutionary problems and show how they can be generated in a controlled manner.
APA, Harvard, Vancouver, ISO, and other styles
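The expected hitting time of the absorbing class mentioned in this abstract is a standard computation on absorbing Markov chains: with Q the transient-to-transient block of the transition matrix, the vector t of expected hitting times solves (I − Q)t = 1 (equivalently, t is the row sum of the fundamental matrix N = (I − Q)⁻¹). A dependency-free sketch, where the example matrix is an illustrative assumption:

```python
def expected_hitting_times(P, absorbing):
    """Expected number of steps to reach the absorbing class from each
    transient state of an absorbing Markov chain.

    P is a row-stochastic matrix (list of lists); `absorbing` is the
    set of indices of the absorbing class. Solves (I - Q) t = 1 by
    plain Gaussian elimination to keep the sketch dependency-free.
    """
    transient = [i for i in range(len(P)) if i not in absorbing]
    k = len(transient)
    # Augmented system [I - Q | 1].
    A = [[(1.0 if i == j else 0.0) - P[transient[i]][transient[j]]
          for j in range(k)] + [1.0] for i in range(k)]
    for col in range(k):  # forward elimination with partial pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k + 1):
                A[r][c] -= f * A[col][c]
    t = [0.0] * k
    for i in reversed(range(k)):  # back substitution
        t[i] = (A[i][k] - sum(A[i][j] * t[j] for j in range(i + 1, k))) / A[i][i]
    return dict(zip(transient, t))
```

For the three-state chain P = [[0, 1, 0], [0.5, 0, 0.5], [0, 0, 1]] with absorbing class {2}, the expected hitting times are 4 from state 0 and 3 from state 1.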
43

Ruessink, B. G. "Parameter stability and consistency in an alongshore-current model determined with Markov chain Monte Carlo." Journal of Hydroinformatics 10, no. 2 (March 1, 2008): 153–62. http://dx.doi.org/10.2166/hydro.2008.016.

Full text
Abstract:
When a numerical model is to be used as a practical tool, its parameters should preferably be stable and consistent, that is, possess a small uncertainty and be time-invariant. Using data and predictions of alongshore mean currents flowing on a beach as a case study, this paper illustrates how parameter stability and consistency can be assessed using Markov chain Monte Carlo. Within a single calibration run, Markov chain Monte Carlo estimates the parameter posterior probability density function, its mode being the best-fit parameter set. Parameter stability is investigated by stepwise adding new data to a calibration run, while consistency is examined by calibrating the model on different datasets of equal length. The results for the present case study indicate that various tidal cycles with strong (say, >0.5 m/s) currents are required to obtain stable parameter estimates, and that the best-fit model parameters and the underlying posterior distribution are strongly time-varying. This inconsistent parameter behavior may reflect unresolved variability of the processes represented by the parameters, or may represent compensational behavior for temporal violations in specific model assumptions.
APA, Harvard, Vancouver, ISO, and other styles
44

HAFOUTA, YEOR. "Limit theorems for some skew products with mixing base maps." Ergodic Theory and Dynamical Systems 41, no. 1 (August 5, 2019): 241–71. http://dx.doi.org/10.1017/etds.2019.48.

Full text
Abstract:
We obtain a central limit theorem, local limit theorems and renewal theorems for stationary processes generated by skew product maps T(ω, x) = (θω, T_ω x) together with a T-invariant measure whose base map θ satisfies certain topological and mixing conditions and the maps T_ω on the fibers are certain non-singular distance-expanding maps. Our results hold true when θ is either a sufficiently fast mixing Markov shift with positive transition densities or a (non-uniform) Young tower with at least one periodic point and polynomial tails. The proofs are based on the random complex Ruelle–Perron–Frobenius theorem from Hafouta and Kifer [Nonconventional Limit Theorems and Random Dynamics. World Scientific, Singapore, 2018] applied with appropriate random transfer operators generated by T_ω, together with certain regularity assumptions (as functions of ω) of these operators. Limit theorems for deterministic processes whose distributions on the fibers are generated by Markov chains with transition operators satisfying a random version of the Doeblin condition are also obtained. The main innovation in this paper is that the results hold true even though the spectral theory used in Aimino, Nicol and Vaienti [Annealed and quenched limit theorems for random expanding dynamical systems. Probab. Theory Related Fields 162 (2015), 233–274] does not seem to be applicable, and the dual of the Koopman operator of T (with respect to the invariant measure) does not seem to have a spectral gap.
APA, Harvard, Vancouver, ISO, and other styles
45

Steif, Jeffrey E. "d̄-Convergence to equilibrium and space–time bernoullicity for spin systems in the M < ε case." Ergodic Theory and Dynamical Systems 11, no. 3 (September 1991): 547–75. http://dx.doi.org/10.1017/s0143385700006337.

Full text
Abstract:
Liggett has proved that for spin systems, Markov processes with state space {0,1}^ℤⁿ, there is a unique stationary distribution in the M < ε regime and all initial configurations uniformly approach this unique stationary distribution exponentially in the weak topology. Here, M and ε are two parameters of the system. We extend this result to discrete time but strengthen it by proving exponential convergence in the stronger d̄-metric instead of the usual weak topology. This is then used to show that the unique stationary process with state space {0,1}^ℤⁿ and index set ℤ is isomorphic (in the sense of ergodic theory) to an independent process indexed by ℤ. In the translation invariant case, we prove the stronger fact that this stationary process viewed as a {0,1}-valued process with index set ℤⁿ × ℤ (space–time) is isomorphic to an independent process also indexed by ℤⁿ × ℤ. This shows that this process is in some sense the most random possible. An application of this last result to approximating by an infinite number of finite systems concatenated independently together is also presented. Finally, we extend all of these results to continuous time.
APA, Harvard, Vancouver, ISO, and other styles
46

Macek, Wiesław M., Dariusz Wójcik, and James L. Burch. "Magnetospheric Multiscale Observations of Markov Turbulence on Kinetic Scales." Astrophysical Journal 943, no. 2 (February 1, 2023): 152. http://dx.doi.org/10.3847/1538-4357/aca0a0.

Full text
Abstract:
In our previous studies we have examined solar wind and magnetospheric plasma turbulence, including Markovian character on large inertial magnetohydrodynamic scales. Here we present the results of the statistical analysis of magnetic field fluctuations in the Earth’s magnetosheath, based on the Magnetospheric Multiscale mission at much smaller kinetic scales. Following our results on spectral analysis with very large slopes of about −16/3, we apply a Markov-process approach to turbulence in this kinetic regime. It is shown that the Chapman–Kolmogorov equation is satisfied and that the lowest-order Kramers–Moyal coefficients describing drift and diffusion with a power-law dependence are consistent with a generalized Ornstein–Uhlenbeck process. The solutions of the Fokker–Planck equation agree with experimental probability density functions, which exhibit a universal global scale invariance through the kinetic domain. In particular, for moderate scales we have the kappa distribution described by various peaked shapes with heavy tails, which, with large values of the kappa parameter, are reduced to the Gaussian distribution for large inertial scales. This shows that the turbulence cascade can be described by the Markov processes also on very small scales. The obtained results on kinetic scales may be useful for a better understanding of the physical mechanisms governing turbulence.
APA, Harvard, Vancouver, ISO, and other styles
47

Li, Xiu-Juan, and Yu-Peng Yang. "Signatures of the Self-organized Criticality Phenomenon in Precursors of Gamma-Ray Bursts." Astrophysical Journal Letters 955, no. 2 (September 29, 2023): L34. http://dx.doi.org/10.3847/2041-8213/acf12c.

Full text
Abstract:
Precursors provide important clues to the nature of gamma-ray burst (GRB) central engines and can be used to constrain GRB physical processes. In this Letter, we study the self-organized criticality in precursors of long GRBs in the third Swift/Burst Alert Telescope catalog. We investigate the differential and cumulative size distributions of 100 precursors, including peak flux, duration, rise time, decay time, and quiescent time with the Markov Chain Monte Carlo technique. It is found that all of the distributions can be well described by power-law models and understood within the physical framework of a self-organized criticality system. In addition, we inspect the cumulative distribution functions of the size differences with a q-Gaussian function. The scale-invariance structures of precursors further strengthen our findings. Particularly, similar analyses are made in 127 main bursts. The results show that both precursors and main bursts can be attributed to a self-organized criticality system with the spatial dimension S = 3 and driven by a similar magnetically dominated process.
APA, Harvard, Vancouver, ISO, and other styles
48

Massoud, Elias C., A. Anthony Bloom, Marcos Longo, John T. Reager, Paul A. Levine, and John R. Worden. "Information content of soil hydrology in a west Amazon watershed as informed by GRACE." Hydrology and Earth System Sciences 26, no. 5 (March 15, 2022): 1407–23. http://dx.doi.org/10.5194/hess-26-1407-2022.

Full text
Abstract:
The seasonal-to-decadal terrestrial water balance on river basin scales depends on several well-characterized but uncertain soil physical processes, including soil moisture, plant available water, rooting depth, and recharge to lower soil layers. Reducing uncertainties in these quantities using observations is a key step toward improving the data fidelity and skill of land surface models. In this study, we quantitatively characterize the capability of Gravity Recovery and Climate Experiment (NASA-GRACE) measurements – a key constraint on total water storage (TWS) – to inform and constrain these processes. We use a reduced-complexity physically based model capable of simulating the hydrologic cycle, and we apply Bayesian inference on the model parameters using a Markov chain Monte Carlo algorithm, to minimize mismatches between model-simulated and GRACE-observed TWS anomalies. Based on the prior and posterior model parameter distributions, we further quantify information gain with regard to terrestrial water states, associated fluxes, and time-invariant process parameters. We show that the data-constrained terrestrial water storage model can capture basic physics of the hydrologic cycle for a watershed in the western Amazon during the period January 2003 through December 2012, with an r² of 0.98 and root mean square error of 30.99 mm between observed and simulated TWS. Furthermore, we show a reduction of uncertainty in many of the parameters and state variables, ranging from a 2 % reduction in uncertainty for the porosity parameter to an 85 % reduction for the rooting depth parameter. The annual and interannual variability of the system are also simulated accurately, with the model simulations capturing the impacts of the 2005–2006 and 2010–2011 South American droughts. The results shown here suggest the potential of using gravimetric observations of TWS to identify and constrain key parameters in soil hydrologic models.
APA, Harvard, Vancouver, ISO, and other styles
49

Osada, Hirofumi. "Stochastic geometry and dynamics of infinitely many particle systems—random matrices and interacting Brownian motions in infinite dimensions." Sugaku Expositions 34, no. 2 (October 12, 2021): 141–73. http://dx.doi.org/10.1090/suga/461.

Full text
Abstract:
We explain the general theories involved in solving an infinite-dimensional stochastic differential equation (ISDE) for interacting Brownian motions in infinite dimensions related to random matrices. Typical examples are the stochastic dynamics of infinite particle systems with logarithmic interaction potentials, such as the sine, Airy, and Bessel interacting Brownian motions, and also the Ginibre interacting Brownian motions. The first three are infinite-dimensional stochastic dynamics in one-dimensional space related to random matrices called Gaussian ensembles. They are the stationary distributions of interacting Brownian motions and given by the limit point processes of the distributions of eigenvalues of these random matrices. The sine, Airy, and Bessel point processes and interacting Brownian motions are thought to be geometrically and dynamically universal as the limits of bulk, soft edge, and hard edge scaling. The Ginibre point process is a rotation- and translation-invariant point process on ℝ², and an equilibrium state of the Ginibre interacting Brownian motions. It is the bulk limit of the distributions of eigenvalues of non-Hermitian Gaussian random matrices. When the interacting Brownian motions constitute a one-dimensional system interacting with each other through the logarithmic potential with inverse temperature β = 2, an algebraic construction is known in which the stochastic dynamics are defined by the space-time correlation function. The approach based on stochastic analysis (called the analytic approach) can be applied to an extremely wide class. If we apply the analytic approach to this system, we see that these two constructions give the same stochastic dynamics. From the algebraic construction, despite being an infinite interacting particle system, it is possible to represent and calculate various quantities such as moments by the correlation functions. We can thus obtain quantitative information.
From the analytic construction, it is possible to represent the dynamics as a solution of an ISDE. We can obtain qualitative information such as semi-martingale properties, continuity, and non-collision properties of each particle, and the strong Markov property of the infinite particle system as a whole. Ginibre interacting Brownian motions constitute a two-dimensional infinite particle system related to non-Hermitian Gaussian random matrices. It has a logarithmic interaction potential with β = 2, but no algebraic construction is known. The present result is the only construction.
APA, Harvard, Vancouver, ISO, and other styles
50

Pollett, P. K. "Connecting reversible Markov processes." Advances in Applied Probability 18, no. 4 (December 1986): 880–900. http://dx.doi.org/10.2307/1427254.

Full text
Abstract:
We provide a framework for interconnecting a collection of reversible Markov processes in such a way that the resulting process has a product-form invariant measure with respect to which the process is reversible. A number of examples are discussed including Kingman's reversible migration process, interconnected random walks and stratified clustering processes.
APA, Harvard, Vancouver, ISO, and other styles