
Journal articles on the topic 'Branching Markov chains'



Consult the top 50 journal articles for your research on the topic 'Branching Markov chains.'


You can also download the full text of each academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles in a wide variety of disciplines and organise your bibliography correctly.

1

Müller, Sebastian. "Recurrence for branching Markov chains." Electronic Communications in Probability 13 (2008): 576–605. http://dx.doi.org/10.1214/ecp.v13-1424.

2

Baier, Christel, Joost-Pieter Katoen, Holger Hermanns, and Verena Wolf. "Comparative branching-time semantics for Markov chains." Information and Computation 200, no. 2 (August 2005): 149–214. http://dx.doi.org/10.1016/j.ic.2005.03.001.

3

Schinazi, Rinaldo. "On multiple phase transitions for branching Markov chains." Journal of Statistical Physics 71, no. 3-4 (May 1993): 507–11. http://dx.doi.org/10.1007/bf01058434.

4

Athreya, Krishna B., and Hye-Jeong Kang. "Some limit theorems for positive recurrent branching Markov chains: I." Advances in Applied Probability 30, no. 3 (September 1998): 693–710. http://dx.doi.org/10.1239/aap/1035228124.

Abstract:
In this paper we consider a Galton-Watson process whose particles move according to a Markov chain with discrete state space. The Markov chain is assumed to be positive recurrent. We prove a law of large numbers for the empirical position distribution and also discuss the large deviation aspects of this convergence.
5

Athreya, Krishna B., and Hye-Jeong Kang. "Some limit theorems for positive recurrent branching Markov chains: I." Advances in Applied Probability 30, no. 3 (September 1998): 693–710. http://dx.doi.org/10.1017/s0001867800008557.

Abstract:
In this paper we consider a Galton-Watson process whose particles move according to a Markov chain with discrete state space. The Markov chain is assumed to be positive recurrent. We prove a law of large numbers for the empirical position distribution and also discuss the large deviation aspects of this convergence.
6

Liu, Yuanyuan, Hanjun Zhang, and Yiqiang Zhao. "Computable Strongly Ergodic Rates of Convergence for Continuous-Time Markov Chains." ANZIAM Journal 49, no. 4 (April 2008): 463–78. http://dx.doi.org/10.1017/s1446181108000114.

Abstract:
In this paper, we investigate computable lower bounds for the best strongly ergodic rate of convergence of the transient probability distribution to the stationary distribution for stochastically monotone continuous-time Markov chains and reversible continuous-time Markov chains, using a drift function and the expectation of the first hitting time on some state. We apply these results to birth–death processes, branching processes and population processes.
7

Bacci, Giorgio, Giovanni Bacci, Kim G. Larsen, and Radu Mardare. "Converging from branching to linear metrics on Markov chains." Mathematical Structures in Computer Science 29, no. 1 (July 25, 2017): 3–37. http://dx.doi.org/10.1017/s0960129517000160.

Abstract:
We study two well-known linear-time metrics on Markov chains (MCs), namely, the strong and stutter trace distances. Our interest in these metrics is motivated by their relation to the probabilistic linear temporal logic (LTL) model-checking problem: we prove that they correspond to the maximal differences in the probability of satisfying the same LTL and LTL−X (LTL without the next operator) formulas, respectively. The threshold problem for these distances (whether their value exceeds a given threshold) is NP-hard and not known to be decidable. Nevertheless, we provide an approximation schema where each lower and upper approximant is computable in polynomial time in the size of the MC. The upper approximants are bisimilarity-like pseudometrics (hence, branching-time distances) that converge point-wise to the linear-time metrics. This convergence is interesting in itself, because it reveals a non-trivial relation between branching and linear-time metric-based semantics that does not hold in equivalence-based semantics.
8

Huang, Ying, and Arthur F. Veinott. "Markov Branching Decision Chains with Interest-Rate-Dependent Rewards." Probability in the Engineering and Informational Sciences 9, no. 1 (January 1995): 99–121. http://dx.doi.org/10.1017/s0269964800003715.

Abstract:
Finite-state-and-action Markov branching decision chains are studied with bounded endogenous expected population sizes and interest-rate-dependent one-period rewards that are analytic in the interest rate at zero. The existence of a stationary strong-maximum-present-value policy is established. Miller and Veinott's [1969] strong policy-improvement method is generalized to find in finite time a stationary n-present-value optimal policy and, when the one-period rewards are rational in the interest rate, a stationary strong-maximum-present-value policy. This extends previous studies of Blackwell [1962], Miller and Veinott [1969], Veinott [1974], and Rothblum [1974, 1975], in which the one-period rewards are independent of the interest rate, and Denardo [1971] in which semi-Markov decision chains with small interest rates are studied. The problem of finding a stationary n-present-value optimal policy is also formulated as a staircase linear program in which the objective function and right-hand sides, but not the constraint matrix, depend on the interest rate, and solutions for all small enough positive interest rates are sought. The optimal solutions of the primal and dual are polynomials in the reciprocal of the interest rate. A constructive rule is given for finding a stationary n-present-value optimal policy from an optimal solution of the asymptotic linear program. This generalizes the linear programming approaches for finding maximum-reward-rate and maximum-present-value policies for Markov decision chains studied by Manne [1960], d'Epenoux [1960, 1963], Balinski [1961], Derman [1962], Denardo and Fox [1968], Denardo [1970], Derman and Veinott [1972], Veinott [1973], and Hordijk and Kallenberg [1979, 1984].
9

Hu, Dihe. "Infinitely dimensional control Markov branching chains in random environments." Science in China Series A 49, no. 1 (January 2006): 27–53. http://dx.doi.org/10.1007/s11425-005-0024-2.

10

Cox, J. T. "On the ergodic theory of critical branching Markov chains." Stochastic Processes and their Applications 50, no. 1 (March 1994): 1–20. http://dx.doi.org/10.1016/0304-4149(94)90144-9.

11

González, M., R. Martínez, and M. Mota. "Rates of Growth in a Class of Homogeneous Multidimensional Markov Chains." Journal of Applied Probability 43, no. 1 (March 2006): 159–74. http://dx.doi.org/10.1239/jap/1143936250.

Abstract:
We investigate the asymptotic behaviour of homogeneous multidimensional Markov chains whose states have nonnegative integer components. We obtain growth rates for these models in a situation similar to the near-critical case for branching processes, provided that they converge to infinity with positive probability. Finally, the general theoretical results are applied to a class of controlled multitype branching processes in which random control is allowed.
12

González, M., R. Martínez, and M. Mota. "Rates of Growth in a Class of Homogeneous Multidimensional Markov Chains." Journal of Applied Probability 43, no. 1 (March 2006): 159–74. http://dx.doi.org/10.1017/s0021900200001431.

Abstract:
We investigate the asymptotic behaviour of homogeneous multidimensional Markov chains whose states have nonnegative integer components. We obtain growth rates for these models in a situation similar to the near-critical case for branching processes, provided that they converge to infinity with positive probability. Finally, the general theoretical results are applied to a class of controlled multitype branching processes in which random control is allowed.
13

González, M., R. Martínez, and M. Mota. "On the geometric growth in a class of homogeneous multitype Markov chain." Journal of Applied Probability 42, no. 4 (December 2005): 1015–30. http://dx.doi.org/10.1239/jap/1134587813.

Abstract:
In this paper, we investigate the geometric growth of homogeneous multitype Markov chains whose states have nonnegative integer coordinates. Such models are considered in a situation similar to the supercritical case for branching processes. Finally, our general theoretical results are applied to a class of controlled multitype branching processes in which the control is random.
14

González, M., R. Martínez, and M. Mota. "On the geometric growth in a class of homogeneous multitype Markov chain." Journal of Applied Probability 42, no. 4 (December 2005): 1015–30. http://dx.doi.org/10.1017/s0021900200001078.

Abstract:
In this paper, we investigate the geometric growth of homogeneous multitype Markov chains whose states have nonnegative integer coordinates. Such models are considered in a situation similar to the supercritical case for branching processes. Finally, our general theoretical results are applied to a class of controlled multitype branching processes in which the control is random.
15

Chen, Anyue, Phil Pollett, Hanjun Zhang, and Ben Cairns. "Uniqueness criteria for continuous-time Markov chains with general transition structures." Advances in Applied Probability 37, no. 4 (December 2005): 1056–74. http://dx.doi.org/10.1239/aap/1134587753.

Abstract:
We derive necessary and sufficient conditions for the existence of bounded or summable solutions to systems of linear equations associated with Markov chains. This substantially extends a famous result of G. E. H. Reuter, which provides a convenient means of checking various uniqueness criteria for birth-death processes. Our result allows chains with much more general transition structures to be accommodated. One application is to give a new proof of an important result of M. F. Chen concerning upwardly skip-free processes. We then use our generalization of Reuter's lemma to prove new results for downwardly skip-free chains, such as the Markov branching process and several of its many generalizations. This permits us to establish uniqueness criteria for several models, including the general birth, death, and catastrophe process, extended branching processes, and asymptotic birth-death processes, the latter being neither upwardly skip-free nor downwardly skip-free.
16

Chen, Anyue, Phil Pollett, Hanjun Zhang, and Ben Cairns. "Uniqueness criteria for continuous-time Markov chains with general transition structures." Advances in Applied Probability 37, no. 4 (December 2005): 1056–74. http://dx.doi.org/10.1017/s0001867800000665.

Abstract:
We derive necessary and sufficient conditions for the existence of bounded or summable solutions to systems of linear equations associated with Markov chains. This substantially extends a famous result of G. E. H. Reuter, which provides a convenient means of checking various uniqueness criteria for birth-death processes. Our result allows chains with much more general transition structures to be accommodated. One application is to give a new proof of an important result of M. F. Chen concerning upwardly skip-free processes. We then use our generalization of Reuter's lemma to prove new results for downwardly skip-free chains, such as the Markov branching process and several of its many generalizations. This permits us to establish uniqueness criteria for several models, including the general birth, death, and catastrophe process, extended branching processes, and asymptotic birth-death processes, the latter being neither upwardly skip-free nor downwardly skip-free.
17

Athreya, Krishna B., and Hye-Jeong Kang. "Some limit theorems for positive recurrent branching Markov chains: II." Advances in Applied Probability 30, no. 3 (September 1998): 711–22. http://dx.doi.org/10.1239/aap/1035228125.

Abstract:
In this paper we consider a Galton-Watson process in which particles move according to a positive recurrent Markov chain on a general state space. We prove a law of large numbers for the empirical position distribution and also discuss the rate of this convergence.
18

Athreya, Krishna B., and Hye-Jeong Kang. "Some limit theorems for positive recurrent branching Markov chains: II." Advances in Applied Probability 30, no. 3 (September 1998): 711–22. http://dx.doi.org/10.1017/s0001867800008569.

Abstract:
In this paper we consider a Galton-Watson process in which particles move according to a positive recurrent Markov chain on a general state space. We prove a law of large numbers for the empirical position distribution and also discuss the rate of this convergence.
19

Kirkpatrick, Anna, Kalen Patton, Prasad Tetali, and Cassie Mitchell. "Markov Chain-Based Sampling for Exploring RNA Secondary Structure under the Nearest Neighbor Thermodynamic Model and Extended Applications." Mathematical and Computational Applications 25, no. 4 (October 10, 2020): 67. http://dx.doi.org/10.3390/mca25040067.

Abstract:
Ribonucleic acid (RNA) secondary structures and branching properties are important for determining functional ramifications in biology. While energy minimization of the Nearest Neighbor Thermodynamic Model (NNTM) is commonly used to identify such properties (number of hairpins, maximum ladder distance, etc.), it is difficult to know whether the resultant values fall within expected dispersion thresholds for a given energy function. The goal of this study was to construct a Markov chain capable of examining the dispersion of RNA secondary structures and branching properties obtained from NNTM energy function minimization independent of a specific nucleotide sequence. Plane trees are studied as a model for RNA secondary structure, with energy assigned to each tree based on the NNTM, and a corresponding Gibbs distribution is defined on the trees. Through a bijection between plane trees and 2-Motzkin paths, a Markov chain converging to the Gibbs distribution is constructed, and fast mixing time is established by estimating the spectral gap of the chain. The spectral gap estimate is obtained through a series of decompositions of the chain and also by building on known mixing time results for other chains on Dyck paths. The resulting algorithm can be used as a tool for exploring the branching structure of RNA, especially for long sequences, and to examine branching structure dependence on energy model parameters. Full exposition is provided for the mathematical techniques used with the expectation that these techniques will prove useful in bioinformatics, computational biology, and additional extended applications.
20

Hu, Dihe. "The Construction of Multitype Canonical Markov Branching Chains in Random Environments." Acta Mathematica Scientia 26, no. 3 (July 2006): 431–42. http://dx.doi.org/10.1016/s0252-9602(06)60067-2.

21

Vallander, S. S. "Occupation times for countable Markov chains III. Chains on a tree with one branching point." Journal of Soviet Mathematics 36, no. 4 (February 1987): 451–61. http://dx.doi.org/10.1007/bf01663453.

22

Bansaye, Vincent. "Ancestral Lineages and Limit Theorems for Branching Markov Chains in Varying Environment." Journal of Theoretical Probability 32, no. 1 (March 27, 2018): 249–81. http://dx.doi.org/10.1007/s10959-018-0825-1.

23

Di-he, Hu, and Zhang Shu-lin. "The Laplace functional and moments for Markov branching chains in random environments." Wuhan University Journal of Natural Sciences 10, no. 3 (May 2005): 485–92. http://dx.doi.org/10.1007/bf02831130.

24

Athreya, Krishna B. "Change of Measures for Markov Chains and the LlogL Theorem for Branching Processes." Bernoulli 6, no. 2 (April 2000): 323. http://dx.doi.org/10.2307/3318579.

25

Pollett, P. K. "On the identification of continuous-time Markov chains with a given invariant measure." Journal of Applied Probability 31, no. 4 (December 1994): 897–910. http://dx.doi.org/10.2307/3215315.

Abstract:
In [14] a necessary and sufficient condition was obtained for there to exist uniquely a Q-process with a specified invariant measure, under the assumption that Q is a stable, conservative, single-exit matrix. The purpose of this note is to demonstrate that, for an arbitrary stable and conservative q-matrix, the same condition suffices for the existence of a suitable Q-process, but that this process might not be unique. A range of examples is considered, including pure-birth processes, a birth process with catastrophes, birth-death processes and the Markov branching process with immigration.
26

Pollett, P. K. "On the identification of continuous-time Markov chains with a given invariant measure." Journal of Applied Probability 31, no. 4 (December 1994): 897–910. http://dx.doi.org/10.1017/s0021900200099435.

Abstract:
In [14] a necessary and sufficient condition was obtained for there to exist uniquely a Q-process with a specified invariant measure, under the assumption that Q is a stable, conservative, single-exit matrix. The purpose of this note is to demonstrate that, for an arbitrary stable and conservative q-matrix, the same condition suffices for the existence of a suitable Q-process, but that this process might not be unique. A range of examples is considered, including pure-birth processes, a birth process with catastrophes, birth-death processes and the Markov branching process with immigration.
27

Parsamanesh, Mahmood, and Marwan Abukhaled. "Stochastic modeling of spreading an infection with standard incidence rate." Journal of Statistics and Management Systems 27, no. 6 (2024): 1221–41. http://dx.doi.org/10.47974/jsms-1271.

Abstract:
This paper studies and models the random spread of an infection in a population. It extends the traditional deterministic modeling approach by incorporating discrete-time stochastic modeling using Markov chains. The probability of extinction and disease persistence is then investigated using the branching chain method, with a focus on a quantity in the non-random model, which is called the basic reproductive number. Moreover, the random model is transformed into a system of stochastic differential equations by approximating the probability distribution function using the forward Kolmogorov equation. An equivalent stochastic differential system is also introduced and examined for the stochastic model. Finally, numerical simulations of the stochastic models are conducted to evaluate the theoretical findings presented in the paper.
28

Louhichi, Sana, and Bernard Ycart. "Exponential Growth of Bifurcating Processes with Ancestral Dependence." Advances in Applied Probability 47, no. 2 (June 2015): 545–64. http://dx.doi.org/10.1239/aap/1435236987.

Abstract:
Branching processes are classical growth models in cell kinetics. In their construction, it is usually assumed that cell lifetimes are independent random variables, which has been proved false in experiments. Models of dependent lifetimes are considered here, in particular bifurcating Markov chains. Under the hypotheses of stationarity and multiplicative ergodicity, the corresponding branching process is proved to have the same type of asymptotics as its classic counterpart in the independent and identically distributed supercritical case: the cell population grows exponentially, the growth rate being related to the exponent of multiplicative ergodicity, in a similar way as to the Laplace transform of lifetimes in the i.i.d. case. An identifiable model for which the multiplicative ergodicity coefficients and the growth rate can be explicitly computed is proposed.
29

Louhichi, Sana, and Bernard Ycart. "Exponential Growth of Bifurcating Processes with Ancestral Dependence." Advances in Applied Probability 47, no. 2 (June 2015): 545–64. http://dx.doi.org/10.1017/s0001867800007977.

Abstract:
Branching processes are classical growth models in cell kinetics. In their construction, it is usually assumed that cell lifetimes are independent random variables, which has been proved false in experiments. Models of dependent lifetimes are considered here, in particular bifurcating Markov chains. Under the hypotheses of stationarity and multiplicative ergodicity, the corresponding branching process is proved to have the same type of asymptotics as its classic counterpart in the independent and identically distributed supercritical case: the cell population grows exponentially, the growth rate being related to the exponent of multiplicative ergodicity, in a similar way as to the Laplace transform of lifetimes in the i.i.d. case. An identifiable model for which the multiplicative ergodicity coefficients and the growth rate can be explicitly computed is proposed.
30

Das, Ankush, Di Wang, and Jan Hoffmann. "Probabilistic Resource-Aware Session Types." Proceedings of the ACM on Programming Languages 7, POPL (January 9, 2023): 1925–56. http://dx.doi.org/10.1145/3571259.

Abstract:
Session types guarantee that message-passing processes adhere to predefined communication protocols. Prior work on session types has focused on deterministic languages but many message-passing systems, such as Markov chains and randomized distributed algorithms, are probabilistic. To implement and analyze such systems, this article develops the meta theory of probabilistic session types with an application focus on automatic expected resource analysis. Probabilistic session types describe probability distributions over messages and are a conservative extension of intuitionistic (binary) session types. To send on a probabilistic channel, processes have to utilize internal randomness from a probabilistic branching or external randomness from receiving on a probabilistic channel. The analysis for expected resource bounds is smoothly integrated with the type system and is a variant of automatic amortized resource analysis. Type inference relies on linear constraint solving to automatically derive symbolic bounds for various cost metrics. The technical contributions include the meta theory that is based on a novel nested multiverse semantics and a type-reconstruction algorithm that allows flexible mixing of different sources of randomness without burdening the programmer with complex type annotations. The type system has been implemented in the language NomosPro with linear-time type checking. Experiments demonstrate that NomosPro is applicable in different domains such as cost analysis of randomized distributed algorithms, analysis of Markov chains, probabilistic analysis of amortized data structures and digital contracts. NomosPro is also shown to be scalable by (i) implementing two broadcast and a bounded retransmission protocol where messages are dropped with a fixed probability, and (ii) verifying the limiting distribution of a Markov chain with 64 states and 420 transitions.
31

Trčka, Nikola. "Strong, Weak and Branching Bisimulation for Transition Systems and Markov Reward Chains: A Unifying Matrix Approach." Electronic Proceedings in Theoretical Computer Science 13 (December 10, 2009): 55–65. http://dx.doi.org/10.4204/eptcs.13.5.

32

Klebaner, Fima C. "Linear growth in near-critical population-size-dependent multitype Galton–Watson processes." Journal of Applied Probability 26, no. 3 (September 1989): 431–45. http://dx.doi.org/10.2307/3214402.

Abstract:
We consider a multitype population-size-dependent branching process in discrete time. A process is considered to be near-critical if the mean matrices of offspring distributions approach the mean matrix of a critical process as the population size increases. We show that if the second moments of offspring distributions stabilize as the population size increases, and the limiting variances are not too large in comparison with the deviation of the means from criticality, then the extinction probability is less than 1 and the process grows arithmetically fast, in the sense that any linear combination which is not orthogonal to the left eigenvector of the limiting mean matrix grows linearly to a limit distribution. We identify cases when the limiting distribution is gamma. A result on transience of multidimensional Markov chains is also given.
33

Klebaner, Fima C. "Linear growth in near-critical population-size-dependent multitype Galton–Watson processes." Journal of Applied Probability 26, no. 3 (September 1989): 431–45. http://dx.doi.org/10.1017/s0021900200038043.

Abstract:
We consider a multitype population-size-dependent branching process in discrete time. A process is considered to be near-critical if the mean matrices of offspring distributions approach the mean matrix of a critical process as the population size increases. We show that if the second moments of offspring distributions stabilize as the population size increases, and the limiting variances are not too large in comparison with the deviation of the means from criticality, then the extinction probability is less than 1 and the process grows arithmetically fast, in the sense that any linear combination which is not orthogonal to the left eigenvector of the limiting mean matrix grows linearly to a limit distribution. We identify cases when the limiting distribution is gamma. A result on transience of multidimensional Markov chains is also given.
34

Breban, Romulus. "Emergence failure of early epidemics: A mathematical modeling approach." PLOS ONE 19, no. 5 (May 29, 2024): e0301415. http://dx.doi.org/10.1371/journal.pone.0301415.

Abstract:
Epidemic or pathogen emergence is the phenomenon by which a poorly transmissible pathogen finds its evolutionary pathway to become a mutant that can cause an epidemic. Many mathematical models of pathogen emergence rely on branching processes. Here, we discuss pathogen emergence using Markov chains, for a more tractable analysis, generalizing previous work by Kendall and Bartlett about disease invasion. We discuss the probability of emergence failure for early epidemics, when the number of infected individuals is small and the number of the susceptible individuals is virtually unlimited. Our formalism addresses both directly transmitted and vector-borne diseases, in the cases where the original pathogen is 1) one step-mutation away from the epidemic strain, and 2) undergoing a long chain of neutral mutations that do not change the epidemiology. We obtain analytic results for the probabilities of emergence failure and two features transcending the transmission mechanism. First, the reproduction number of the original pathogen is determinant for the probability of pathogen emergence, more important than the mutation rate or the transmissibility of the emerged pathogen. Second, the probability of mutation within infected individuals must be sufficiently high for the pathogen undergoing neutral mutations to start an epidemic, the mutation threshold depending again on the basic reproduction number of the original pathogen. Finally, we discuss the parameterization of models of pathogen emergence, using SARS-CoV1 as an example of zoonotic emergence and HIV as an example for the emergence of drug resistance. We also discuss assumptions of our models and implications for epidemiology.
35

Hautphenne, Sophie. "A Structured Markov Chain Approach to Branching Processes." Stochastic Models 31, no. 3 (June 24, 2015): 403–32. http://dx.doi.org/10.1080/15326349.2015.1022264.

36

Giroux, Gaston. "Asymptotic results for non-linear processes of the McKean tagged-molecule type." Journal of Applied Probability 23, no. 1 (March 1986): 42–51. http://dx.doi.org/10.2307/3214115.

Abstract:
McKean's tagged-molecule process is a non-linear homogeneous two-state Markov chain in continuous time, constructed with the aid of a binary branching process. For each of a large class of branching processes we construct a similar process. The construction is carefully done and the weak homogeneity is deduced. A simple probability argument permits us to show convergence to the equidistribution (½, ½) and to note that this limit is a strong equilibrium. A non-homogeneous Markov chain result is also used to establish the geometric rate of convergence. A proof of a Boltzmann H-theorem is also established.
37

Giroux, Gaston. "Asymptotic results for non-linear processes of the McKean tagged-molecule type." Journal of Applied Probability 23, no. 1 (March 1986): 42–51. http://dx.doi.org/10.1017/s0021900200106266.

Abstract:
McKean's tagged-molecule process is a non-linear homogeneous two-state Markov chain in continuous time, constructed with the aid of a binary branching process. For each of a large class of branching processes we construct a similar process. The construction is carefully done and the weak homogeneity is deduced. A simple probability argument permits us to show convergence to the equidistribution (½, ½) and to note that this limit is a strong equilibrium. A non-homogeneous Markov chain result is also used to establish the geometric rate of convergence. A proof of a Boltzmann H-theorem is also established.
38

Hong, Wenming, and Huaming Wang. "Intrinsic Branching Structure within (L-1) Random Walk in Random Environment and Its Applications." Infinite Dimensional Analysis, Quantum Probability and Related Topics 16, no. 1 (March 2013): 1350006. http://dx.doi.org/10.1142/s0219025713500069.

Abstract:
We figure out the intrinsic branching structure within the (L-1) random walk in random environment. As applications, the branching structure enables us to calculate the expectation of the first hitting time directly, and to specify the density of the invariant measure for the Markov chain of "the environment viewed from particles" explicitly.
APA, Harvard, Vancouver, ISO, and other styles
39

Melas, V. B. "Branching Technique for Markov Chain Simulation (Finite State Case)." Statistics 25, no. 2 (January 1994): 159–71. http://dx.doi.org/10.1080/02331889408802441.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Machado, F. P., and S. Yu Popov. "One-dimensional branching random walks in a Markovian random environment." Journal of Applied Probability 37, no. 4 (December 2000): 1157–63. http://dx.doi.org/10.1239/jap/1014843096.

Full text
Abstract:
We study a one-dimensional supercritical branching random walk in a non-i.i.d. random environment, in which the environment governs both the branching mechanism and the step transitions. This random environment is constructed using a recurrent Markov chain on a finite or countable state space. Criteria of (strong) recurrence and transience are presented for this model.
APA, Harvard, Vancouver, ISO, and other styles
41

HONG, WENMING, and LIN ZHANG. "BRANCHING STRUCTURE FOR THE TRANSIENT (1, R)-RANDOM WALK IN RANDOM ENVIRONMENT AND ITS APPLICATIONS." Infinite Dimensional Analysis, Quantum Probability and Related Topics 13, no. 04 (December 2010): 589–618. http://dx.doi.org/10.1142/s0219025710004188.

Full text
Abstract:
An intrinsic multi-type branching structure within the transient (1, R)-RWRE is revealed. The branching structure enables us to specify the density of the absolutely continuous invariant measure for the Markov chain of environments seen from the particle and reprove the LLN with a drift explicitly in terms of the environment.
APA, Harvard, Vancouver, ISO, and other styles
42

Machado, F. P., and S. Yu Popov. "One-dimensional branching random walks in a Markovian random environment." Journal of Applied Probability 37, no. 04 (December 2000): 1157–63. http://dx.doi.org/10.1017/s0021900200018350.

Full text
Abstract:
We study a one-dimensional supercritical branching random walk in a non-i.i.d. random environment, in which the environment governs both the branching mechanism and the step transitions. This random environment is constructed using a recurrent Markov chain on a finite or countable state space. Criteria of (strong) recurrence and transience are presented for this model.
APA, Harvard, Vancouver, ISO, and other styles
43

Pakes, Anthony G. "Extinction and explosion of nonlinear Markov branching processes." Journal of the Australian Mathematical Society 82, no. 3 (June 2007): 403–28. http://dx.doi.org/10.1017/s1446788700036193.

Full text
Abstract:
This paper concerns a generalization of the Markov branching process that preserves the random walk jump chain, but admits arbitrary positive jump rates. Necessary and sufficient conditions are found for regularity, including a generalization of the Harris-Dynkin integral condition when the jump rates are reciprocals of a Hausdorff moment sequence. Behaviour of the expected time to extinction is found, and some asymptotic properties of the explosion time are given for the case where extinction cannot occur. Existence of a unique invariant measure is shown, and conditions found for unique solution of the Forward equations. The ergodicity of a resurrected version is investigated.
APA, Harvard, Vancouver, ISO, and other styles
44

Grey, D. R. "Supercritical branching processes with density independent catastrophes." Mathematical Proceedings of the Cambridge Philosophical Society 104, no. 2 (September 1988): 413–16. http://dx.doi.org/10.1017/s0305004100065579.

Full text
Abstract:
A Markov branching process in either discrete time (the Galton–Watson process) or continuous time is modified by the introduction of a process of catastrophes which remove some individuals (and, by implication, their descendants) from the population. The catastrophe process is independent of the reproduction mechanism and takes the form of a sequence of independent identically distributed non-negative integer-valued random variables. In the continuous time case, these catastrophes occur at the points of an independent Poisson process with constant rate. If at any time the size of a catastrophe is at least the current population size, then the population becomes extinct. Thus in both discrete and continuous time we still have a Markov chain with stationary transition probabilities and an absorbing state at zero. Some authors use the term ‘emigration’ as an alternative to ‘catastrophe’.
APA, Harvard, Vancouver, ISO, and other styles
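The model in the abstract above (a Galton–Watson process thinned by i.i.d. catastrophes, with zero absorbing) is easy to simulate. The following sketch is purely illustrative; the offspring and catastrophe distributions chosen here are arbitrary placeholders, not taken from the paper:

```python
import random

def gw_with_catastrophes(n_generations, z0=1, seed=0,
                         offspring=lambda: random.choice([0, 1, 2, 3]),
                         catastrophe=lambda: random.choice([0, 0, 1])):
    """Simulate one path of a Galton-Watson process where, after each
    reproduction step, an i.i.d. catastrophe removes individuals.
    If the catastrophe is at least the population size, the population
    becomes extinct; zero is an absorbing state."""
    random.seed(seed)
    z = z0
    path = [z]
    for _ in range(n_generations):
        if z == 0:
            path.append(0)           # absorbing state at zero
            continue
        z = sum(offspring() for _ in range(z))  # reproduction
        z = max(z - catastrophe(), 0)           # catastrophe removal
        path.append(z)
    return path

path = gw_with_catastrophes(20)
```

Because the catastrophe sizes are independent of the reproduction mechanism, the path above is still a Markov chain with stationary transition probabilities, exactly as the abstract notes.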
45

Bandyopadhyay, Antar, Svante Janson, and Debleena Thacker. "Strong convergence of infinite color balanced urns under uniform ergodicity." Journal of Applied Probability 57, no. 3 (September 2020): 853–65. http://dx.doi.org/10.1017/jpr.2020.37.

Full text
Abstract:
AbstractWe consider the generalization of the Pólya urn scheme with possibly infinitely many colors, as introduced in [37], [4], [5], and [6]. For countably many colors, we prove almost sure convergence of the urn configuration under the uniform ergodicity assumption on the associated Markov chain. The proof uses a stochastic coupling of the sequence of chosen colors with a branching Markov chain on a weighted random recursive tree as described in [6], [31], and [26]. Using this coupling we estimate the covariance between any two selected colors. In particular, we re-prove the limit theorem for the classical urn models with finitely many colors.
APA, Harvard, Vancouver, ISO, and other styles
46

Costes, E., and Y. Guédon. "Modeling the Sylleptic Branching on One-year-old Trunks of Apple Cultivars." Journal of the American Society for Horticultural Science 122, no. 1 (January 1997): 53–62. http://dx.doi.org/10.21273/jashs.122.1.53.

Full text
Abstract:
The structure of 1-year-old trunks resulting from sylleptic branching are compared among apple (Malus domestica Borkh) cultivars with diverse branching and fruiting habits. The 1-year-old trunks developing from a graft are described as a succession of metamers whose structure refers to location, distribution, and length of sylleptic axillary shoots. We used a stochastic process called hidden semi-Markov chain to capture the embedded structure resulting from mixing of different types of axillary shoots developing along the trunks. The models, corresponding to the different cultivars, are composed of a first transient nonbranched state, a succession of transient states that cover the median sylleptic branching zone, and a final absorbing nonbranched state. They are interpreted with regard to complexity, extent, and branching distribution of the median sylleptic zone. Main results deal with the balance between long and short sylleptic shoots and the distribution of long sylleptic shoots along the trunks. Results suggest that sylleptic branching could be used as an early characteristic to evaluate the later branching behavior of cultivars.
APA, Harvard, Vancouver, ISO, and other styles
47

Lahodny, Glenn Jr., and Mona Zevika. "Effects of Fogging and Mosquito Repellent on the Probability of Disease Extinction for Dengue Fever." Communication in Biomathematical Sciences 4, no. 1 (May 7, 2021): 1–13. http://dx.doi.org/10.5614/cbms.2021.4.1.1.

Full text
Abstract:
A Continuous-Time Markov Chain model is constructed based on a deterministic model of dengue fever transmission that includes mosquito fogging and the use of repellent. The basic reproduction number (R0) for the corresponding deterministic model is obtained. This number indicates the possible occurrence of an endemic at the early stages of the infection period. A multitype branching process is used to approximate the Markov chain. The construction of offspring probability generating functions related to the infected states is used to calculate the probability of disease extinction and the probability of an outbreak (P0). Sensitivity analysis is shown for variation of control parameters and for indices of the basic reproduction number. These results allow for a better understanding of the relation of the basic reproduction number with other indicators of disease transmission.
APA, Harvard, Vancouver, ISO, and other styles
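The abstract above uses offspring probability generating functions of a branching process to compute the probability of disease extinction. The standard technique, shown here in the single-type case for brevity (the paper uses a multitype version, and the offspring distribution below is an arbitrary illustrative choice), finds the extinction probability as the smallest fixed point of the offspring p.g.f. on [0, 1]:

```python
def extinction_probability(pgf, tol=1e-12, max_iter=10_000):
    """Smallest fixed point of the offspring probability generating
    function f on [0, 1]: iterate q <- f(q) starting from q = 0.
    The iterates increase monotonically to the extinction probability."""
    q = 0.0
    for _ in range(max_iter):
        q_next = pgf(q)
        if abs(q_next - q) < tol:
            return q_next
        q = q_next
    return q

# Offspring distribution p0 = 1/4, p1 = 1/4, p2 = 1/2 (mean 5/4 > 1),
# so extinction is not certain: q solves q = 1/4 + q/4 + q^2/2, i.e. q = 1/2.
f = lambda s: 0.25 + 0.25 * s + 0.5 * s ** 2
q = extinction_probability(f)
```

With k independent initial infectives, the probability of extinction is q raised to the power k, and the probability of an outbreak is its complement.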
48

Piau, Didier. "Harmonic moments of inhomogeneous branching processes." Advances in Applied Probability 38, no. 2 (June 2006): 465–86. http://dx.doi.org/10.1239/aap/1151337080.

Full text
Abstract:
We study the harmonic moments of Galton-Watson processes that are possibly inhomogeneous and have positive values. Good estimates of these are needed to compute unbiased estimators for noncanonical branching Markov processes, which occur, for instance, in the modelling of the polymerase chain reaction. By convexity, the ratio of the harmonic mean to the mean is at most 1. We prove that, for every square-integrable branching mechanism, this ratio lies between 1 − A/k and 1 − A′/k for every initial population of size k > A. The positive constants A and A′ satisfy A ≥ A′, are explicit, and depend only on the generation-by-generation branching mechanisms. In particular, we do not use the distribution of the limit of the classical martingale associated with the Galton-Watson process. Thus, emphasis is put on nonasymptotic bounds and on the dependence of the harmonic mean upon the size of the initial population. In the Bernoulli case, which is relevant for the modelling of the polymerase chain reaction, we prove essentially optimal bounds that are valid for every initial population size k≥1. Finally, in the general case and for sufficiently large initial populations, similar techniques yield sharp estimates of the harmonic moments of higher degree.
APA, Harvard, Vancouver, ISO, and other styles
49

Piau, Didier. "Harmonic moments of inhomogeneous branching processes." Advances in Applied Probability 38, no. 02 (June 2006): 465–86. http://dx.doi.org/10.1017/s0001867800001051.

Full text
Abstract:
We study the harmonic moments of Galton-Watson processes that are possibly inhomogeneous and have positive values. Good estimates of these are needed to compute unbiased estimators for noncanonical branching Markov processes, which occur, for instance, in the modelling of the polymerase chain reaction. By convexity, the ratio of the harmonic mean to the mean is at most 1. We prove that, for every square-integrable branching mechanism, this ratio lies between 1 − A/k and 1 − A′/k for every initial population of size k > A. The positive constants A and A′ satisfy A ≥ A′, are explicit, and depend only on the generation-by-generation branching mechanisms. In particular, we do not use the distribution of the limit of the classical martingale associated with the Galton-Watson process. Thus, emphasis is put on nonasymptotic bounds and on the dependence of the harmonic mean upon the size of the initial population. In the Bernoulli case, which is relevant for the modelling of the polymerase chain reaction, we prove essentially optimal bounds that are valid for every initial population size k≥1. Finally, in the general case and for sufficiently large initial populations, similar techniques yield sharp estimates of the harmonic moments of higher degree.
APA, Harvard, Vancouver, ISO, and other styles
50

Cerf, Raphaël, and Joseba Dalmau. "Galton–Watson and branching process representations of the normalized Perron–Frobenius eigenvector." ESAIM: Probability and Statistics 23 (2019): 797–802. http://dx.doi.org/10.1051/ps/2019007.

Full text
Abstract:
Let A be a primitive matrix and let λ be its Perron–Frobenius eigenvalue. We give formulas expressing the associated normalized Perron–Frobenius eigenvector as a simple functional of a multitype Galton–Watson process whose mean matrix is A, as well as of a multitype branching process with mean matrix e^{(A−I)t}. These formulas are generalizations of the classical formula for the invariant probability measure of a Markov chain.
APA, Harvard, Vancouver, ISO, and other styles
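The "classical formula" mentioned in the abstract above is the fact that the invariant distribution of an ergodic Markov chain is the normalized left Perron–Frobenius eigenvector of its transition matrix. A minimal numerical sketch of that special case, via power iteration (the transition matrix below is an arbitrary example, not from the paper):

```python
import numpy as np

def perron_left_vector(A, n_iter=1000):
    """Normalized left Perron-Frobenius eigenvector of a primitive
    nonnegative matrix A, computed by power iteration on A^T and
    renormalized so that the entries sum to one."""
    v = np.ones(A.shape[0])
    for _ in range(n_iter):
        v = A.T @ v
        v /= v.sum()
    return v

# For a stochastic matrix P, this recovers the invariant distribution
# solving pi P = pi, the case the abstract's formulas generalize.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
pi = perron_left_vector(P)
```

For this P, balance gives 0.1·π₁ = 0.5·π₂, so π = (5/6, 1/6).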