Journal articles on the topic 'Markov reversibility'

Listed below are the top 50 journal articles on the topic 'Markov reversibility', with abstracts included where available in the metadata.

1

Beare, Brendan K., and Juwon Seo. "TIME IRREVERSIBLE COPULA-BASED MARKOV MODELS." Econometric Theory 30, no. 5 (April 16, 2014): 923–60. http://dx.doi.org/10.1017/s0266466614000115.

Abstract:
Economic and financial time series frequently exhibit time irreversible dynamics. For instance, there is considerable evidence of asymmetric fluctuations in many macroeconomic and financial variables, and certain game theoretic models of price determination predict asymmetric cycles in price series. In this paper, we make two primary contributions to the econometric literature on time reversibility. First, we propose a new test of time reversibility, applicable to stationary Markov chains. Compared to existing tests, our test has the advantage of being consistent against arbitrary violations of reversibility. Second, we explain how a circulation density function may be used to characterize the nature of time irreversibility when it is present. We propose a copula-based estimator of the circulation density and verify that it is well behaved asymptotically under suitable regularity conditions. We illustrate the use of our time reversibility test and circulation density estimator by applying them to five years of Canadian gasoline price markup data.
2

Ōsawa, Hideo. "Reversibility of Markov chains with applications to storage models." Journal of Applied Probability 22, no. 1 (March 1985): 123–37. http://dx.doi.org/10.2307/3213752.

Abstract:
This paper studies the reversibility conditions of stationary Markov chains (discrete-time Markov processes) with general state space. In particular, we investigate the Markov chains having atomic points in the state space. Such processes are often seen in storage models, for example waiting time in a queue, insurance risk reserve, dam content and so on. The necessary and sufficient conditions for reversibility of these processes are obtained. Further, we apply these conditions to some storage models and present some interesting results for single-server queues and a finite insurance risk model.
3

Ōsawa, Hideo. "Reversibility of Markov chains with applications to storage models." Journal of Applied Probability 22, no. 01 (March 1985): 123–37. http://dx.doi.org/10.1017/s0021900200029053.

Abstract:
This paper studies the reversibility conditions of stationary Markov chains (discrete-time Markov processes) with general state space. In particular, we investigate the Markov chains having atomic points in the state space. Such processes are often seen in storage models, for example waiting time in a queue, insurance risk reserve, dam content and so on. The necessary and sufficient conditions for reversibility of these processes are obtained. Further, we apply these conditions to some storage models and present some interesting results for single-server queues and a finite insurance risk model.
4

Steuber, Tara L., Peter C. Kiessler, and Robert Lund. "TESTING FOR REVERSIBILITY IN MARKOV CHAIN DATA." Probability in the Engineering and Informational Sciences 26, no. 4 (July 30, 2012): 593–611. http://dx.doi.org/10.1017/s0269964812000228.

Abstract:
This paper introduces two statistics that assess whether (or not) a sequence sampled from a stationary time-homogeneous Markov chain on a finite state space is reversible. The test statistics are based on observed deviations of transition sample counts between each pair of states in the chain. First, the joint asymptotic normality of these sample counts is established. This result is then used to construct two chi-squared-based tests for reversibility. Simulations assess the power and type one error of the proposed tests.
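For orientation, a minimal sketch (under assumed conventions) of a count-based check in this spirit follows; it is only a heuristic diagnostic, not the authors' statistics, whose exact form and asymptotic justification are developed in the paper.

```python
# Illustrative count-based reversibility diagnostic (not the authors' exact statistics):
# tally transition counts n_ij from one sample path, compare n_ij with n_ji, and refer
# the resulting statistic to a chi-squared distribution.
import numpy as np
from scipy.stats import chi2

def reversibility_diagnostic(path, k):
    """path: sequence of integer states in {0, ..., k-1} from a stationary chain."""
    n = np.zeros((k, k))
    for a, b in zip(path[:-1], path[1:]):
        n[a, b] += 1                      # observed one-step transition counts
    stat, dof = 0.0, 0
    for i in range(k):
        for j in range(i + 1, k):
            tot = n[i, j] + n[j, i]
            if tot > 0:                   # skip state pairs never visited
                stat += (n[i, j] - n[j, i]) ** 2 / tot
                dof += 1
    return stat, chi2.sf(stat, dof)       # statistic and reference p-value

# Example: a stationary birth-death chain is reversible, so p-values should be large.
rng = np.random.default_rng(0)
P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])
path = [0]
for _ in range(20000):
    path.append(rng.choice(3, p=P[path[-1]]))
print(reversibility_diagnostic(path, 3))
```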
5

Kämpke, T. "Reversibility and equivalence in directed markov fields." Mathematical and Computer Modelling 23, no. 3 (February 1996): 87–101. http://dx.doi.org/10.1016/0895-7177(95)00235-9.

6

Ge, Hao, Da-Quan Jiang, and Min Qian. "Reversibility and entropy production of inhomogeneous Markov chains." Journal of Applied Probability 43, no. 04 (December 2006): 1028–43. http://dx.doi.org/10.1017/s0021900200002400.

Abstract:
In this paper we introduce the concepts of instantaneous reversibility and instantaneous entropy production rate for inhomogeneous Markov chains with denumerable state spaces. The following statements are proved to be equivalent: the inhomogeneous Markov chain is instantaneously reversible; it is in detailed balance; its entropy production rate vanishes. In particular, for a time-periodic birth-death chain, which can be regarded as a simple version of a physical model (Brownian motors), we prove that its rotation number is 0 when it is instantaneously reversible or periodically reversible. Hence, in our model of Markov chains, the directed transport phenomenon of Brownian motors can occur only in nonequilibrium and irreversible systems.
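For reference, the two notions the abstract ties together have the following standard time-homogeneous forms; these are textbook definitions rather than the paper's time-dependent ("instantaneous") versions.

```latex
% Standard time-homogeneous definitions (textbook forms, not quoted from the paper);
% the paper's "instantaneous" notions let \pi and P depend on time.
\pi_i \, p_{ij} = \pi_j \, p_{ji} \quad \text{for all states } i, j
\qquad \text{(detailed balance / reversibility)},
\\[4pt]
e_p = \frac{1}{2} \sum_{i,j} \bigl( \pi_i p_{ij} - \pi_j p_{ji} \bigr)
      \log \frac{\pi_i \, p_{ij}}{\pi_j \, p_{ji}} \;\ge\; 0
\qquad \text{(entropy production rate; } e_p = 0 \text{ iff detailed balance holds)}.
```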
7

Ge, Hao, Da-Quan Jiang, and Min Qian. "Reversibility and entropy production of inhomogeneous Markov chains." Journal of Applied Probability 43, no. 4 (December 2006): 1028–43. http://dx.doi.org/10.1239/jap/1165505205.

Abstract:
In this paper we introduce the concepts of instantaneous reversibility and instantaneous entropy production rate for inhomogeneous Markov chains with denumerable state spaces. The following statements are proved to be equivalent: the inhomogeneous Markov chain is instantaneously reversible; it is in detailed balance; its entropy production rate vanishes. In particular, for a time-periodic birth-death chain, which can be regarded as a simple version of a physical model (Brownian motors), we prove that its rotation number is 0 when it is instantaneously reversible or periodically reversible. Hence, in our model of Markov chains, the directed transport phenomenon of Brownian motors can occur only in nonequilibrium and irreversible systems.
8

Tetali, Prasad. "An Extension of Foster's Network Theorem." Combinatorics, Probability and Computing 3, no. 3 (September 1994): 421–27. http://dx.doi.org/10.1017/s0963548300001309.

Abstract:
Consider an electrical network on n nodes with resistors r_ij between nodes i and j. Let R_ij denote the effective resistance between the nodes. Then Foster's Theorem [5] asserts that ∑_{i∼j} R_ij / r_ij = n − 1, where i ∼ j denotes that i and j are connected by a finite r_ij. In [10] this theorem is proved by making use of random walks. The classical connection between electrical networks and reversible random walks implies a corresponding statement for reversible Markov chains. In this paper we prove an elementary identity for ergodic Markov chains, and show that this yields Foster's theorem when the chain is time-reversible. We also prove a generalization of a resistive inverse identity. This identity was known for resistive networks, but we prove a more general identity for ergodic Markov chains. We show that time-reversibility, once again, yields the known identity. Among other results, this identity also yields an alternative characterization of reversibility of Markov chains (see Remarks 1 and 2 below). This characterization, when interpreted in terms of electrical currents, implies the reciprocity theorem in single-source resistive networks, thus allowing us to establish the equivalence of reversibility in Markov chains and reciprocity in electrical networks.
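As a sanity check on the identity quoted above (the graph-theoretic statement only, not the paper's Markov-chain identity), the following sketch verifies it numerically on an arbitrary small weighted graph.

```python
# Numerical check of Foster's identity, sum over edges of R_ij / r_ij = n - 1, on an
# arbitrary small weighted graph, with effective resistances obtained from the
# Moore-Penrose pseudoinverse of the weighted graph Laplacian.
import numpy as np

edges = {(0, 1): 1.0, (1, 2): 2.0, (2, 3): 0.5, (3, 0): 1.5, (0, 2): 3.0}  # edge: r_ij
n = 4
L = np.zeros((n, n))
for (i, j), r in edges.items():
    c = 1.0 / r                          # conductance of the edge
    L[i, i] += c
    L[j, j] += c
    L[i, j] -= c
    L[j, i] -= c
Lp = np.linalg.pinv(L)
R = lambda i, j: Lp[i, i] + Lp[j, j] - 2 * Lp[i, j]     # effective resistance between i and j
print(sum(R(i, j) / r for (i, j), r in edges.items()))  # prints ~3.0 = n - 1
```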
9

Serfozo, Richard F. "Reversible Markov processes on general spaces and spatial migration processes." Advances in Applied Probability 37, no. 03 (September 2005): 801–18. http://dx.doi.org/10.1017/s0001867800000483.

Abstract:
In this study, we characterize the equilibrium behavior of spatial migration processes that represent population migrations, or birth-death processes, in general spaces. These processes are reversible Markov jump processes on measure spaces. As a precursor, we present fundamental properties of reversible Markov jump processes on general spaces. A major result is a canonical formula for the stationary distribution of a reversible process. This involves the characterization of two-way communication in transitions, using certain Radon-Nikodým derivatives. Other results concern a Kolmogorov criterion for reversibility, time reversibility, and several methods of constructing or identifying reversible processes.
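For context, the Kolmogorov criterion mentioned here is most familiar in its countable-state form, reproduced below as a standard statement; the paper's contribution is a criterion of this type on general measure spaces, phrased via Radon-Nikodým derivatives.

```latex
% Kolmogorov's cycle criterion in its familiar countable-state form: an irreducible
% jump process with transition rates q and a stationary distribution is reversible
% iff, for every finite cycle of states x_1, x_2, ..., x_k,
q(x_1, x_2)\, q(x_2, x_3) \cdots q(x_{k-1}, x_k)\, q(x_k, x_1)
\;=\;
q(x_1, x_k)\, q(x_k, x_{k-1}) \cdots q(x_3, x_2)\, q(x_2, x_1).
```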
10

Serfozo, Richard F. "Reversible Markov processes on general spaces and spatial migration processes." Advances in Applied Probability 37, no. 3 (September 2005): 801–18. http://dx.doi.org/10.1239/aap/1127483748.

Abstract:
In this study, we characterize the equilibrium behavior of spatial migration processes that represent population migrations, or birth-death processes, in general spaces. These processes are reversible Markov jump processes on measure spaces. As a precursor, we present fundamental properties of reversible Markov jump processes on general spaces. A major result is a canonical formula for the stationary distribution of a reversible process. This involves the characterization of two-way communication in transitions, using certain Radon-Nikodým derivatives. Other results concern a Kolmogorov criterion for reversibility, time reversibility, and several methods of constructing or identifying reversible processes.
11

Gerontidis, Ioannis I. "Markov population replacement processes." Advances in Applied Probability 27, no. 03 (September 1995): 711–40. http://dx.doi.org/10.1017/s0001867800027129.

Abstract:
We consider a migration process whose singleton process is a time-dependent Markov replacement process. For the singleton process, which may be treated as either open or closed, we study the limiting distribution, the distribution of the time to replacement and related quantities. For a replacement process in equilibrium we obtain a version of Little's law and we provide conditions for reversibility. For the resulting linear population process we characterize exponential ergodicity for two types of environmental behaviour, i.e. either convergent or cyclic, and finally for large population sizes a diffusion approximation analysis is provided.
12

Gerontidis, Ioannis I. "Markov population replacement processes." Advances in Applied Probability 27, no. 3 (September 1995): 711–40. http://dx.doi.org/10.2307/1428131.

Abstract:
We consider a migration process whose singleton process is a time-dependent Markov replacement process. For the singleton process, which may be treated as either open or closed, we study the limiting distribution, the distribution of the time to replacement and related quantities. For a replacement process in equilibrium we obtain a version of Little's law and we provide conditions for reversibility. For the resulting linear population process we characterize exponential ergodicity for two types of environmental behaviour, i.e. either convergent or cyclic, and finally for large population sizes a diffusion approximation analysis is provided.
13

Wübker, Achim. "Spectral Theory for Weakly Reversible Markov Chains." Journal of Applied Probability 49, no. 1 (March 2012): 245–65. http://dx.doi.org/10.1239/jap/1331216845.

Abstract:
The theory of L2-spectral gaps for reversible Markov chains has been studied by many authors. In this paper we consider positive recurrent general state space Markov chains with stationary transition probabilities. Replacing the assumption of reversibility with a weaker assumption, we still obtain a simple necessary and sufficient condition for the spectral gap property of the associated Markov operator in terms of the isoperimetric constant. We show that this result can be applied to a large class of Markov chains, including those that are related to positive recurrent finite-range random walks on Z.
14

Kessler, Mathieu, and Michael Sørensen. "On Time-Reversibility and Estimating Functions for Markov Processes." Statistical Inference for Stochastic Processes 8, no. 1 (2005): 95–107. http://dx.doi.org/10.1023/b:sisp.0000049125.31288.fa.

15

Marin, A., and S. Rossi. "On the relations between Markov chain lumpability and reversibility." Acta Informatica 54, no. 5 (March 31, 2016): 447–85. http://dx.doi.org/10.1007/s00236-016-0266-1.

16

McCausland, William J. "Time reversibility of stationary regular finite-state Markov chains." Journal of Econometrics 136, no. 1 (January 2007): 303–18. http://dx.doi.org/10.1016/j.jeconom.2005.09.001.

17

Niemiro, Wojciech. "Tail events of simulated annealing Markov chains." Journal of Applied Probability 32, no. 4 (December 1995): 867–76. http://dx.doi.org/10.2307/3215200.

Abstract:
We consider non-homogeneous Markov chains generated by the simulated annealing algorithm. We classify states according to asymptotic properties of trajectories. We identify recurrent and transient states. The set of recurrent states is partitioned into disjoint classes of asymptotically communicating states. These classes correspond to atoms of the tail sigma-field. The results are valid under the weak reversibility assumption of Hajek.
18

Niemiro, Wojciech. "Tail events of simulated annealing Markov chains." Journal of Applied Probability 32, no. 04 (December 1995): 867–76. http://dx.doi.org/10.1017/s0021900200103341.

Abstract:
We consider non-homogeneous Markov chains generated by the simulated annealing algorithm. We classify states according to asymptotic properties of trajectories. We identify recurrent and transient states. The set of recurrent states is partitioned into disjoint classes of asymptotically communicating states. These classes correspond to atoms of the tail sigma-field. The results are valid under the weak reversibility assumption of Hajek.
19

Wübker, Achim. "Spectral Theory for Weakly Reversible Markov Chains." Journal of Applied Probability 49, no. 01 (March 2012): 245–65. http://dx.doi.org/10.1017/s0021900200008974.

Abstract:
The theory of L2-spectral gaps for reversible Markov chains has been studied by many authors. In this paper we consider positive recurrent general state space Markov chains with stationary transition probabilities. Replacing the assumption of reversibility with a weaker assumption, we still obtain a simple necessary and sufficient condition for the spectral gap property of the associated Markov operator in terms of the isoperimetric constant. We show that this result can be applied to a large class of Markov chains, including those that are related to positive recurrent finite-range random walks on Z.
20

Richman, David, and W. E. Sharp. "A method for determining the reversibility of a Markov sequence." Mathematical Geology 22, no. 7 (October 1990): 749–61. http://dx.doi.org/10.1007/bf00890660.

21

Al’pin, Yu A. "Harary’s Theorem on Signed Graphs and Reversibility of Markov Chains." Journal of Mathematical Sciences 199, no. 4 (June 2014): 375–80. http://dx.doi.org/10.1007/s10958-014-1864-5.

22

Jiang, Yu Hang, Tong Liu, Zhiya Lou, Jeffrey S. Rosenthal, Shanshan Shangguan, Fei Wang, and Zixuan Wu. "Markov Chain Confidence Intervals and Biases." International Journal of Statistics and Probability 11, no. 1 (December 21, 2021): 29. http://dx.doi.org/10.5539/ijsp.v11n1p29.

Abstract:
We derive explicit asymptotic confidence intervals for any Markov chain Monte Carlo (MCMC) algorithm with finite asymptotic variance, started at any initial state, without requiring a Central Limit Theorem, reversibility, geometric ergodicity, or any bias bound. We also derive explicit non-asymptotic confidence intervals assuming bounds on the bias or first moment, or alternatively that the chain starts in stationarity. We relate those non-asymptotic bounds to properties of MCMC bias, and show that polynomial ergodicity implies certain bias bounds. We also apply our results to several numerical examples. It is our hope that these results will provide simple and useful tools for estimating errors of MCMC algorithms when CLTs are not available.
23

Henderson, W., C. E. M. Pearce, P. K. Pollett, and P. G. Taylor. "Connecting internally balanced quasi-reversible Markov processes." Advances in Applied Probability 24, no. 04 (December 1992): 934–59. http://dx.doi.org/10.1017/s0001867800025027.

Abstract:
We provide a general framework for interconnecting a collection of quasi-reversible nodes in such a way that the resulting process exhibits a product-form invariant measure. The individual nodes can be quite general, although some degree of internal balance will be assumed. Any of the nodes may possess a feedback mechanism. Indeed, we pay particular attention to a class of feedback queues, characterized by the fact that their state description allows one to maintain a record of the order in which events occur. We also examine in some detail the problem of determining for which values of the arrival rates a node does exhibit quasi-reversibility.
24

Henderson, W., C. E. M. Pearce, P. K. Pollett, and P. G. Taylor. "Connecting internally balanced quasi-reversible Markov processes." Advances in Applied Probability 24, no. 4 (December 1992): 934–59. http://dx.doi.org/10.2307/1427720.

Abstract:
We provide a general framework for interconnecting a collection of quasi-reversible nodes in such a way that the resulting process exhibits a product-form invariant measure. The individual nodes can be quite general, although some degree of internal balance will be assumed. Any of the nodes may possess a feedback mechanism. Indeed, we pay particular attention to a class of feedback queues, characterized by the fact that their state description allows one to maintain a record of the order in which events occur. We also examine in some detail the problem of determining for which values of the arrival rates a node does exhibit quasi-reversibility.
25

Khan, Zahid A., and Ram Chandra Tewari. "Markov Reversibility, Quasi-Symmetry, and Marginal Homogeneity in Cyclothymiacs Geological Successions." International Journal of Geoinformatics and Geological Science 8, no. 2 (June 25, 2021): 9–25. http://dx.doi.org/10.14445/23939206/ijggs-v8i2p102.

26

Battaglini, Marco, Salvatore Nunnari, and Thomas R. Palfrey. "The Dynamic Free Rider Problem: A Laboratory Study." American Economic Journal: Microeconomics 8, no. 4 (November 1, 2016): 268–308. http://dx.doi.org/10.1257/mic.20150126.

Abstract:
We report the results of an experiment that investigates free riding in the accumulation of durable public goods. We consider economies with reversibility, where contributions can be positive or negative; and economies with irreversibility, where contributions are nonnegative. Aggregate outcomes support the qualitative predictions of the Markov Perfect Equilibria (MPE) characterized in Battaglini, Nunnari, and Palfrey (2014): steady state levels of public good are lower with reversibility than irreversibility; accumulation is inefficiently slow; and the public good is under-provided in both regimes. On the other hand, public good levels are higher than MPE, and some evidence of history dependence is detected. (JEL C91, H41)
27

Pollett, P. K. "Reversibility, invariance and μ-invariance." Advances in Applied Probability 20, no. 03 (September 1988): 600–621. http://dx.doi.org/10.1017/s0001867800018164.

Abstract:
In this paper we consider a number of questions relating to the problem of determining quasi-stationary distributions for transient Markov processes. First we find conditions under which a measure or vector that is µ-invariant for a matrix of transition rates is also μ-invariant for the family of transition matrices of the minimal process it generates. These provide a means for determining whether or not the so-called stationary conditional quasi-stationary distribution exists in the λ-transient case. The process is not assumed to be regular, nor is it assumed to be uniform or irreducible. In deriving the invariance conditions we reveal a relationship between μ-invariance and the invariance of measures for related processes called the μ-reverse and the μ-dual processes. They play a role analogous to the time-reverse process which arises in the discussion of stationary distributions. Secondly we bring the related notions of detail-balance and reversibility into the realm of quasi-stationary processes. For example, if a process can be identified as being μ-reversible, the problem of determining quasi-stationary distributions is made much simpler. Finally, we consider some practical problems that emerge when calculating quasi-stationary distributions directly from the transition rates of the process. Our results are illustrated with reference to a variety of processes including examples of birth and death processes and the birth, death and catastrophe process.
28

Pollett, P. K. "Reversibility, invariance and μ-invariance." Advances in Applied Probability 20, no. 3 (September 1988): 600–621. http://dx.doi.org/10.2307/1427037.

Abstract:
In this paper we consider a number of questions relating to the problem of determining quasi-stationary distributions for transient Markov processes. First we find conditions under which a measure or vector that is µ-invariant for a matrix of transition rates is also μ-invariant for the family of transition matrices of the minimal process it generates. These provide a means for determining whether or not the so-called stationary conditional quasi-stationary distribution exists in the λ-transient case. The process is not assumed to be regular, nor is it assumed to be uniform or irreducible. In deriving the invariance conditions we reveal a relationship between μ-invariance and the invariance of measures for related processes called the μ-reverse and the μ-dual processes. They play a role analogous to the time-reverse process which arises in the discussion of stationary distributions. Secondly we bring the related notions of detail-balance and reversibility into the realm of quasi-stationary processes. For example, if a process can be identified as being μ-reversible, the problem of determining quasi-stationary distributions is made much simpler. Finally, we consider some practical problems that emerge when calculating quasi-stationary distributions directly from the transition rates of the process. Our results are illustrated with reference to a variety of processes including examples of birth and death processes and the birth, death and catastrophe process.
29

Chernick, M. R., D. J. Daley, and R. P. Littlejohn. "A time-reversibility relationship between two Markov chains with exponential stationary distributions." Journal of Applied Probability 25, no. 2 (June 1988): 418–22. http://dx.doi.org/10.2307/3214450.

Abstract:
The stationary non-negative Markov chains {Y_n} and {X_n} specified by the relations for {η_n} a sequence of independent identically distributed (i.i.d.) random variables which are independent of {Y_n}, and for {ξ_n} a sequence of i.i.d. random variables which are independent of {X_n}, are mutually time-reversed if and only if their common marginal distribution is exponential, relating the exponential autoregressive process of Gaver and Lewis (1980) to the exponential minification process of Tavares (1980).
30

Chernick, M. R., D. J. Daley, and R. P. Littlejohn. "A time-reversibility relationship between two Markov chains with exponential stationary distributions." Journal of Applied Probability 25, no. 02 (June 1988): 418–22. http://dx.doi.org/10.1017/s0021900200041061.

Abstract:
The stationary non-negative Markov chains {Y_n} and {X_n} specified by the relations for {η_n} a sequence of independent identically distributed (i.i.d.) random variables which are independent of {Y_n}, and for {ξ_n} a sequence of i.i.d. random variables which are independent of {X_n}, are mutually time-reversed if and only if their common marginal distribution is exponential, relating the exponential autoregressive process of Gaver and Lewis (1980) to the exponential minification process of Tavares (1980).
31

Konstantopoulos, Panagiotis, and Jean Walrand. "A Quasi-Reversibility Approach to the Insensitivity of Generalized Semi-Markov Processes." Probability in the Engineering and Informational Sciences 3, no. 3 (July 1989): 405–15. http://dx.doi.org/10.1017/s0269964800001273.

Abstract:
This paper is concerned with a certain property of the stationary distribution of a generalized semi-Markov process (GSMP) known as insensitivity. It is well-known that the so-called Matthes' conditions form a necessary and sufficient algebraic criterion for insensitivity. Most proofs of these conditions are basically algebraic. By interpreting a GSMP as a simple queueing network, we are able to show that Matthes' conditions are equivalent to the quasi-reversibility of the network, thus obtaining another simple proof of the sufficiency of these conditions. Furthermore, we apply our method to find a simple criterion for the insensitivity of GSMP's with generalized routing (in a sense that is introduced in the paper).
32

Keilson, J., and O. A. Vasicek. "Monotone measures of ergodicity for Markov chains." Journal of Applied Mathematics and Stochastic Analysis 11, no. 3 (January 1, 1998): 283–88. http://dx.doi.org/10.1155/s1048953398000239.

Abstract:
The following paper, first written in 1974, was never published other than as part of an internal research series. Its lack of publication is unrelated to the merits of the paper and the paper is of current importance by virtue of its relation to the relaxation time. A systematic discussion is provided of the approach of a finite Markov chain to ergodicity by proving the monotonicity of an important set of norms, each a measure of ergodicity, whether or not time reversibility is present. The paper is of particular interest because the discussion of the relaxation time of a finite Markov chain [2] has only been clean for time reversible chains, a small subset of the chains of interest. This restriction is not present here. Indeed, a new relaxation time quoted quantifies the relaxation time for all finite ergodic chains (cf. the discussion of Q1(t) below Equation (1.7)). This relaxation time was developed by Keilson with A. Roy in his thesis [6], yet to be published.
33

Wolfer, Geoffrey, and Shun Watanabe. "Information Geometry of Reversible Markov Chains." Information Geometry 4, no. 2 (November 22, 2021): 393–433. http://dx.doi.org/10.1007/s41884-021-00061-7.

Abstract:
We analyze the information geometric structure of time reversibility for parametric families of irreducible transition kernels of Markov chains. We define and characterize reversible exponential families of Markov kernels, and show that irreducible and reversible Markov kernels form both a mixture family and, perhaps surprisingly, an exponential family in the set of all stochastic kernels. We propose a parametrization of the entire manifold of reversible kernels, and inspect reversible geodesics. We define information projections onto the reversible manifold, and derive closed-form expressions for the e-projection and m-projection, along with Pythagorean identities with respect to information divergence, leading to some new notion of reversiblization of Markov kernels. We show the family of edge measures pertaining to irreducible and reversible kernels also forms an exponential family among distributions over pairs. We further explore geometric properties of the reversible family, by comparing them with other remarkable families of stochastic matrices. Finally, we show that reversible kernels are, in a sense we define, the minimal exponential family generated by the m-family of symmetric kernels, and the smallest mixture family that comprises the e-family of memoryless kernels.
34

Sumita, Ushio, and Maria Rieders. "Lumpability and time reversibility in the aggregation-disaggregation method for large Markov chains." Communications in Statistics. Stochastic Models 5, no. 1 (January 1989): 63–81. http://dx.doi.org/10.1080/15326348908807099.

35

Derriennic, Yves, and Michael Lin. "VARIANCE BOUNDING MARKOV CHAINS, L2-UNIFORM MEAN ERGODICITY AND THE CLT." Stochastics and Dynamics 11, no. 01 (March 2011): 81–94. http://dx.doi.org/10.1142/s0219493711003176.

Abstract:
We prove that variance bounding Markov chains, as defined by Roberts and Rosenthal [31], are uniformly mean ergodic in L2 of the invariant probability. For such chains, without any additional mixing, reversibility, or Harris recurrence assumptions, the central limit theorem and the invariance principle hold for every centered additive functional with finite variance. We also show that L2-geometric ergodicity is equivalent to L2-uniform geometric ergodicity. We then specialize the results to random walks on compact Abelian groups, and construct a probability on the unit circle such that the random walk it generates is L2-uniformly geometrically ergodic, but is not Harris recurrent.
36

Zhou, Hua, and Kenneth Lange. "Composition Markov chains of multinomial type." Advances in Applied Probability 41, no. 01 (March 2009): 270–91. http://dx.doi.org/10.1017/s0001867800003220.

Abstract:
Suppose that n identical particles evolve according to the same marginal Markov chain. In this setting we study chains such as the Ehrenfest chain that move a prescribed number of randomly chosen particles at each epoch. The product chain constructed by this device inherits its eigenstructure from the marginal chain. There is a further chain derived from the product chain called the composition chain that ignores particle labels and tracks the numbers of particles in the various states. The composition chain in turn inherits its eigenstructure and various properties such as reversibility from the product chain. The equilibrium distribution of the composition chain is multinomial. The current paper proves these facts in the well-known framework of state lumping and identifies the column eigenvectors of the composition chain with the multivariate Krawtchouk polynomials of Griffiths. The advantages of knowing the full spectral decomposition of the composition chain include (a) detailed estimates of the rate of convergence to equilibrium, (b) construction of martingales that allow calculation of the moments of the particle counts, and (c) explicit expressions for mean coalescence times in multi-person random walks. These possibilities are illustrated by applications to Ehrenfest chains, the Hoare and Rahman chain, Kimura's continuous-time chain for DNA evolution, a light bulb chain, and random walks on some specific graphs.
37

Zhou, Hua, and Kenneth Lange. "Composition Markov chains of multinomial type." Advances in Applied Probability 41, no. 1 (March 2009): 270–91. http://dx.doi.org/10.1239/aap/1240319585.

Abstract:
Suppose that n identical particles evolve according to the same marginal Markov chain. In this setting we study chains such as the Ehrenfest chain that move a prescribed number of randomly chosen particles at each epoch. The product chain constructed by this device inherits its eigenstructure from the marginal chain. There is a further chain derived from the product chain called the composition chain that ignores particle labels and tracks the numbers of particles in the various states. The composition chain in turn inherits its eigenstructure and various properties such as reversibility from the product chain. The equilibrium distribution of the composition chain is multinomial. The current paper proves these facts in the well-known framework of state lumping and identifies the column eigenvectors of the composition chain with the multivariate Krawtchouk polynomials of Griffiths. The advantages of knowing the full spectral decomposition of the composition chain include (a) detailed estimates of the rate of convergence to equilibrium, (b) construction of martingales that allow calculation of the moments of the particle counts, and (c) explicit expressions for mean coalescence times in multi-person random walks. These possibilities are illustrated by applications to Ehrenfest chains, the Hoare and Rahman chain, Kimura's continuous-time chain for DNA evolution, a light bulb chain, and random walks on some specific graphs.
38

Wang, Hongyun, and Hong Qian. "On detailed balance and reversibility of semi-Markov processes and single-molecule enzyme kinetics." Journal of Mathematical Physics 48, no. 1 (January 2007): 013303. http://dx.doi.org/10.1063/1.2432065.

39

Pollett, P. K. "Preserving partial balance in continuous-time Markov chains." Advances in Applied Probability 19, no. 02 (June 1987): 431–53. http://dx.doi.org/10.1017/s000186780001661x.

Abstract:
Recently a number of authors have considered general procedures for coupling stochastic systems. If the individual components of a system, when considered in isolation, are found to possess the simplifying feature of either reversibility, quasireversibility or partial balance they can be coupled in such a way that the equilibrium analysis of the system is considerably simpler than one might expect in advance. In particular the system usually exhibits a product-form equilibrium distribution and this is often insensitive to the precise specification of the individual components. It is true, however, that certain kinds of components lose their simplifying feature if the specification of the coupling procedure changes. From a practical point of view it is important, therefore, to determine if, and then under what conditions, the relevant feature is preserved. In this paper we obtain conditions under which partial balance in a component is preserved and these often amount to the requirement that there exists a quantity which is unaffected by the internal workings of the component in question. We give particular attention to the components of a stratified clustering process as these most often suffer from loss of partial balance.
40

Pollett, P. K. "Preserving partial balance in continuous-time Markov chains." Advances in Applied Probability 19, no. 2 (June 1987): 431–53. http://dx.doi.org/10.2307/1427426.

Abstract:
Recently a number of authors have considered general procedures for coupling stochastic systems. If the individual components of a system, when considered in isolation, are found to possess the simplifying feature of either reversibility, quasireversibility or partial balance they can be coupled in such a way that the equilibrium analysis of the system is considerably simpler than one might expect in advance. In particular the system usually exhibits a product-form equilibrium distribution and this is often insensitive to the precise specification of the individual components. It is true, however, that certain kinds of components lose their simplifying feature if the specification of the coupling procedure changes. From a practical point of view it is important, therefore, to determine if, and then under what conditions, the relevant feature is preserved. In this paper we obtain conditions under which partial balance in a component is preserved and these often amount to the requirement that there exists a quantity which is unaffected by the internal workings of the component in question. We give particular attention to the components of a stratified clustering process as these most often suffer from loss of partial balance.
41

Zachary, Stan. "Dynamics of large uncontrolled loss networks." Journal of Applied Probability 37, no. 03 (September 2000): 685–95. http://dx.doi.org/10.1017/s0021900200015916.

Abstract:
This paper studies the connection between the dynamical and equilibrium behaviour of large uncontrolled loss networks. We consider the behaviour of the number of calls of each type in the network, and show that, under the limiting regime of Kelly (1986), all trajectories of the limiting dynamics converge to a single fixed point, which is necessarily that on which the limiting stationary distribution is concentrated. The approach uses Lyapunov techniques and involves the evolution of the transition rates of a stationary Markov process in such a way that it tends to reversibility.
42

Zachary, Stan. "Dynamics of large uncontrolled loss networks." Journal of Applied Probability 37, no. 3 (September 2000): 685–95. http://dx.doi.org/10.1239/jap/1014842828.

Abstract:
This paper studies the connection between the dynamical and equilibrium behaviour of large uncontrolled loss networks. We consider the behaviour of the number of calls of each type in the network, and show that, under the limiting regime of Kelly (1986), all trajectories of the limiting dynamics converge to a single fixed point, which is necessarily that on which the limiting stationary distribution is concentrated. The approach uses Lyapunov techniques and involves the evolution of the transition rates of a stationary Markov process in such a way that it tends to reversibility.
43

Chikina, Maria, Alan Frieze, and Wesley Pegden. "Assessing significance in a Markov chain without mixing." Proceedings of the National Academy of Sciences 114, no. 11 (February 28, 2017): 2860–64. http://dx.doi.org/10.1073/pnas.1617540114.

Abstract:
We present a statistical test to detect that a presented state of a reversible Markov chain was not chosen from a stationary distribution. In particular, given a value function for the states of the Markov chain, we would like to show rigorously that the presented state is an outlier with respect to the values, by establishing a p value under the null hypothesis that it was chosen from a stationary distribution of the chain. A simple heuristic used in practice is to sample ranks of states from long random trajectories on the Markov chain and compare these with the rank of the presented state; if the presented state is a 0.1% outlier compared with the sampled ranks (its rank is in the bottom 0.1% of sampled ranks), then this observation should correspond to a p value of 0.001. This significance is not rigorous, however, without good bounds on the mixing time of the Markov chain. Our test is the following: Given the presented state in the Markov chain, take a random walk from the presented state for any number of steps. We prove that observing that the presented state is an ε-outlier on the walk is significant at p = √(2ε) under the null hypothesis that the state was chosen from a stationary distribution. We assume nothing about the Markov chain beyond reversibility and show that significance at p ≈ √ε is best possible in general. We illustrate the use of our test with a potential application to the rigorous detection of gerrymandering in Congressional districting.
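A schematic of the test as described in this abstract, with user-supplied placeholders for the chain step and the value function; this is an illustrative sketch only, and the exact conventions and the √(2ε) bound are those established in the paper.

```python
# Schematic of the epsilon-outlier test described above; `step` (one move of the
# reversible chain) and `value` (the score of a state) are user-supplied placeholders,
# and rank/tie conventions should follow the paper.
import math

def outlier_p_value(presented, step, value, m, rng):
    vals = [value(presented)]
    x = presented
    for _ in range(m):                    # random walk started AT the presented state
        x = step(x, rng)
        vals.append(value(x))
    rank = sorted(vals).index(vals[0])    # 0-based rank of the presented state's value
    eps = (rank + 1) / len(vals)          # the presented state is an eps-outlier
    return min(1.0, math.sqrt(2 * eps))   # valid significance level under the null
```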
44

Palacios, José Luis, Eduardo Gómez, and Miguel Del Río. "Hitting Times of Walks on Graphs through Voltages." Journal of Probability 2014 (May 20, 2014): 1–6. http://dx.doi.org/10.1155/2014/852481.

Abstract:
We derive formulas for the expected hitting times of general random walks on graphs, in terms of voltages, with very elementary electric means. Under this new light we revise bounds and hitting times for birth-and-death Markov chains and for walks on graphs with cutpoints, and give some exact computations on the necklace graph. We also prove Tetali’s formula for hitting times without making use of the reciprocity principle. In fact this principle follows as a corollary of our argument that also yields as corollaries the triangular inequality for effective resistances and the reversibility of the sum of hitting times around a tour.
45

Jirasakuldech, Benjamas, Riza Emekter, and Peter Went. "Fundamental Value Hypothesis and Return Behavior: Evidence from Emerging Equity Markets." Review of Pacific Basin Financial Markets and Policies 09, no. 01 (March 2006): 97–127. http://dx.doi.org/10.1142/s0219091506000689.

Abstract:
This study examines the return behavior of 15 emerging equity markets for persistent deviations from the fundamental value hypothesis. The duration dependence test shows that rational expectations bubbles do not cause deviations from fundamental value in any of the markets. Markov chain test results imply that markets in China, Malaysia, the Philippines, and Singapore deviate from their fundamental values due to non-random price changes. A price decrease is more likely to follow two periods of price decrease in these four equity markets. Finally, the time reversibility test reveals that all equity markets, except for Jordan and Egypt, exhibit asymmetrical price patterns, suggesting departures from fundamental values.
46

Chen, Anyue, Hanjun Zhang, Kai Liu, and Keith Rennolls. "Birth-death processes with disaster and instantaneous resurrection." Advances in Applied Probability 36, no. 01 (March 2004): 267–92. http://dx.doi.org/10.1017/s0001867800012969.

Abstract:
A new structure with the special property that instantaneous resurrection and mass disaster are imposed on an ordinary birth-death process is considered. Under the condition that the underlying birth-death process is exit or bilateral, we are able to give easily checked existence criteria for such Markov processes. A very simple uniqueness criterion is also established. All honest processes are explicitly constructed. Ergodicity properties for these processes are investigated. Surprisingly, it can be proved that all the honest processes are not only recurrent but also ergodic without imposing any extra conditions. Equilibrium distributions are then established. Symmetry and reversibility of such processes are also investigated. Several examples are provided to illustrate our results.
47

Chen, Anyue, Hanjun Zhang, Kai Liu, and Keith Rennolls. "Birth-death processes with disaster and instantaneous resurrection." Advances in Applied Probability 36, no. 1 (March 2004): 267–92. http://dx.doi.org/10.1239/aap/1077134473.

Abstract:
A new structure with the special property that instantaneous resurrection and mass disaster are imposed on an ordinary birth-death process is considered. Under the condition that the underlying birth-death process is exit or bilateral, we are able to give easily checked existence criteria for such Markov processes. A very simple uniqueness criterion is also established. All honest processes are explicitly constructed. Ergodicity properties for these processes are investigated. Surprisingly, it can be proved that all the honest processes are not only recurrent but also ergodic without imposing any extra conditions. Equilibrium distributions are then established. Symmetry and reversibility of such processes are also investigated. Several examples are provided to illustrate our results.
48

McKenzie, ED. "The distributional structure of finite moving-average processes." Journal of Applied Probability 25, no. 2 (June 1988): 313–21. http://dx.doi.org/10.2307/3214439.

Abstract:
Analysis of time-series models has, in the past, concentrated mainly on second-order properties, i.e. the covariance structure. Recent interest in non-Gaussian and non-linear processes has necessitated exploration of more general properties, even for standard models. We demonstrate that the powerful Markov property which greatly simplifies the distributional structure of finite autoregressions has an analogue in the (non-Markovian) finite moving-average processes. In fact, all the joint distributions of samples of a qth-order moving average may be constructed from only the (q + 1)th-order distribution. The usefulness of this result is illustrated by references to three areas of application: time-reversibility; asymptotic behaviour; and sums and associated point and count processes. Generalizations of the result are also considered.
49

McKenzie, ED. "The distributional structure of finite moving-average processes." Journal of Applied Probability 25, no. 02 (June 1988): 313–21. http://dx.doi.org/10.1017/s002190020004095x.

Abstract:
Analysis of time-series models has, in the past, concentrated mainly on second-order properties, i.e. the covariance structure. Recent interest in non-Gaussian and non-linear processes has necessitated exploration of more general properties, even for standard models. We demonstrate that the powerful Markov property which greatly simplifies the distributional structure of finite autoregressions has an analogue in the (non-Markovian) finite moving-average processes. In fact, all the joint distributions of samples of a qth-order moving average may be constructed from only the (q + 1)th-order distribution. The usefulness of this result is illustrated by references to three areas of application: time-reversibility; asymptotic behaviour; and sums and associated point and count processes. Generalizations of the result are also considered.
50

Benfenati, Francesco, and Gian Paolo Beretta. "Ergodicity, Maximum Entropy Production, and Steepest Entropy Ascent in the Proofs of Onsager’s Reciprocal Relations." Journal of Non-Equilibrium Thermodynamics 43, no. 2 (April 25, 2018): 101–10. http://dx.doi.org/10.1515/jnet-2017-0054.

Abstract:
We show that to prove the Onsager relations using the microscopic time reversibility one necessarily has to make an ergodic hypothesis, or a hypothesis closely linked to that. This is true in all the proofs of the Onsager relations in the literature: from the original proof by Onsager, to more advanced proofs in the context of linear response theory and the theory of Markov processes, to the proof in the context of the kinetic theory of gases. The only three proofs that do not require any kind of ergodic hypothesis are based on additional hypotheses on the macroscopic evolution: Ziegler’s maximum entropy production principle (MEPP), the principle of time reversal invariance of the entropy production, or the steepest entropy ascent principle (SEAP).