To view the other types of publications on this topic, follow the link: Branching Markov chains.

Journal articles on the topic "Branching Markov chains"

Consult the top 50 journal articles for research on the topic "Branching Markov chains".

Next to each work in the bibliography, an "Add to bibliography" option is available. Use it, and your bibliographic reference for the chosen work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).

You can also download the full text of the scholarly publication in PDF format and read its abstract online, whenever the relevant parameters are available in the metadata.

Browse journal articles from a wide range of disciplines and compile your bibliography correctly.

1

Müller, Sebastian. "Recurrence for branching Markov chains". Electronic Communications in Probability 13 (2008): 576–605. http://dx.doi.org/10.1214/ecp.v13-1424.
2

Baier, Christel, Joost-Pieter Katoen, Holger Hermanns, and Verena Wolf. "Comparative branching-time semantics for Markov chains". Information and Computation 200, no. 2 (August 2005): 149–214. http://dx.doi.org/10.1016/j.ic.2005.03.001.
3

Schinazi, Rinaldo. "On multiple phase transitions for branching Markov chains". Journal of Statistical Physics 71, no. 3-4 (May 1993): 507–11. http://dx.doi.org/10.1007/bf01058434.
4

Athreya, Krishna B., and Hye-Jeong Kang. "Some limit theorems for positive recurrent branching Markov chains: I". Advances in Applied Probability 30, no. 3 (September 1998): 693–710. http://dx.doi.org/10.1239/aap/1035228124.

Abstract:
In this paper we consider a Galton-Watson process whose particles move according to a Markov chain with discrete state space. The Markov chain is assumed to be positive recurrent. We prove a law of large numbers for the empirical position distribution and also discuss the large deviation aspects of this convergence.
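The law of large numbers described in this abstract is easy to see in a simulation. The sketch below is an illustration only, not code from the paper: it runs a Galton-Watson process with Poisson offspring whose particles move on a three-state chain, then compares the empirical position distribution of the final generation with the stationary distribution. The matrix P, the offspring mean, and every other parameter are arbitrary choices for this example.

import numpy as np

rng = np.random.default_rng(0)

# Transition matrix of a small positive recurrent Markov chain (arbitrary example values).
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

def stationary(P):
    # Stationary distribution: left eigenvector of P for eigenvalue 1, rescaled to sum to 1.
    eigenvalues, eigenvectors = np.linalg.eig(P.T)
    v = np.real(eigenvectors[:, np.argmax(np.real(eigenvalues))])
    return v / v.sum()

def simulate_branching_markov_chain(P, generations=15, mean_offspring=1.5, initial=5):
    # Each particle produces a Poisson number of children; every child takes one
    # step of the chain started from its parent's position.
    positions = np.zeros(initial, dtype=int)
    for _ in range(generations):
        children = []
        for x in positions:
            for _ in range(rng.poisson(mean_offspring)):
                children.append(rng.choice(len(P), p=P[x]))
        if not children:  # the population died out
            return np.array([], dtype=int)
        positions = np.array(children)
    return positions

positions = simulate_branching_markov_chain(P)
if positions.size:
    empirical = np.bincount(positions, minlength=len(P)) / positions.size
    print("empirical position distribution:", np.round(empirical, 3))
    print("stationary distribution of P:   ", np.round(stationary(P), 3))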
5

Athreya, Krishna B., and Hye-Jeong Kang. "Some limit theorems for positive recurrent branching Markov chains: I". Advances in Applied Probability 30, no. 03 (September 1998): 693–710. http://dx.doi.org/10.1017/s0001867800008557.

Abstract:
In this paper we consider a Galton-Watson process whose particles move according to a Markov chain with discrete state space. The Markov chain is assumed to be positive recurrent. We prove a law of large numbers for the empirical position distribution and also discuss the large deviation aspects of this convergence.
6

LIU, YUANYUAN, HANJUN ZHANG, and YIQIANG ZHAO. "COMPUTABLE STRONGLY ERGODIC RATES OF CONVERGENCE FOR CONTINUOUS-TIME MARKOV CHAINS". ANZIAM Journal 49, no. 4 (April 2008): 463–78. http://dx.doi.org/10.1017/s1446181108000114.

Abstract:
In this paper, we investigate computable lower bounds for the best strongly ergodic rate of convergence of the transient probability distribution to the stationary distribution for stochastically monotone continuous-time Markov chains and reversible continuous-time Markov chains, using a drift function and the expectation of the first hitting time on some state. We apply these results to birth–death processes, branching processes and population processes.
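The paper derives analytic lower bounds for such rates; as a purely numerical companion, not the authors' method, the sketch below measures the convergence rate of a small truncated birth-death chain directly by computing transient distributions with the matrix exponential. The generator, its rates, and the truncation level are arbitrary choices for this example.

import numpy as np
from scipy.linalg import expm

# Generator of a truncated birth-death chain on {0, ..., N}: constant birth rate,
# linear death rate (arbitrary example values).
N, birth, death = 20, 1.0, 2.0
Q = np.zeros((N + 1, N + 1))
for i in range(N + 1):
    if i < N:
        Q[i, i + 1] = birth
    if i > 0:
        Q[i, i - 1] = death * i
    Q[i, i] = -Q[i].sum()

# Stationary distribution from detailed balance: pi_{i+1} * q_{i+1,i} = pi_i * q_{i,i+1}.
pi = np.ones(N + 1)
for i in range(N):
    pi[i + 1] = pi[i] * Q[i, i + 1] / Q[i + 1, i]
pi /= pi.sum()

# Total variation distance to stationarity, started from state 0; the slope of
# its logarithm estimates the exponential rate of convergence.
p0 = np.zeros(N + 1)
p0[0] = 1.0
times = np.linspace(0.5, 5.0, 10)
tv = [0.5 * np.abs(p0 @ expm(Q * t) - pi).sum() for t in times]
rate = -np.polyfit(times, np.log(tv), 1)[0]
print("estimated exponential rate of convergence:", round(rate, 3))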
7

BACCI, GIORGIO, GIOVANNI BACCI, KIM G. LARSEN, and RADU MARDARE. "Converging from branching to linear metrics on Markov chains". Mathematical Structures in Computer Science 29, no. 1 (25 July 2017): 3–37. http://dx.doi.org/10.1017/s0960129517000160.

Abstract:
We study two well-known linear-time metrics on Markov chains (MCs), namely, the strong and stutter trace distances. Our interest in these metrics is motivated by their relation to the probabilistic linear temporal logic (LTL)-model checking problem: we prove that they correspond to the maximal differences in the probability of satisfying the same LTL and LTL−X (LTL without next operator) formulas, respectively. The threshold problem for these distances (whether their value exceeds a given threshold) is NP-hard and not known to be decidable. Nevertheless, we provide an approximation schema where each lower and upper approximant is computable in polynomial time in the size of the MC. The upper approximants are bisimilarity-like pseudometrics (hence, branching-time distances) that converge point-wise to the linear-time metrics. This convergence is interesting in itself, because it reveals a non-trivial relation between branching and linear-time metric-based semantics that does not hold in equivalence-based semantics.
8

Huang, Ying, and Arthur F. Veinott. "Markov Branching Decision Chains with Interest-Rate-Dependent Rewards". Probability in the Engineering and Informational Sciences 9, no. 1 (January 1995): 99–121. http://dx.doi.org/10.1017/s0269964800003715.

Abstract:
Finite-state-and-action Markov branching decision chains are studied with bounded endogenous expected population sizes and interest-rate-dependent one-period rewards that are analytic in the interest rate at zero. The existence of a stationary strong-maximum-present-value policy is established. Miller and Veinott's [1969] strong policy-improvement method is generalized to find in finite time a stationary n-present-value optimal policy and, when the one-period rewards are rational in the interest rate, a stationary strong-maximum-present-value policy. This extends previous studies of Blackwell [1962], Miller and Veinott [1969], Veinott [1974], and Rothblum [1974, 1975], in which the one-period rewards are independent of the interest rate, and Denardo [1971] in which semi-Markov decision chains with small interest rates are studied. The problem of finding a stationary n-present-value optimal policy is also formulated as a staircase linear program in which the objective function and right-hand sides, but not the constraint matrix, depend on the interest rate, and solutions for all small enough positive interest rates are sought. The optimal solutions of the primal and dual are polynomials in the reciprocal of the interest rate. A constructive rule is given for finding a stationary n-present-value optimal policy from an optimal solution of the asymptotic linear program. This generalizes the linear programming approaches for finding maximum-reward-rate and maximum-present-value policies for Markov decision chains studied by Manne [1960], d'Epenoux [1960, 1963], Balinski [1961], Derman [1962], Denardo and Fox [1968], Denardo [1970], Derman and Veinott [1972], Veinott [1973], and Hordijk and Kallenberg [1979, 1984].
9

Hu, Dihe. "Infinitely dimensional control Markov branching chains in random environments". Science in China Series A 49, no. 1 (January 2006): 27–53. http://dx.doi.org/10.1007/s11425-005-0024-2.
10

Cox, J. T. "On the ergodic theory of critical branching Markov chains". Stochastic Processes and their Applications 50, no. 1 (March 1994): 1–20. http://dx.doi.org/10.1016/0304-4149(94)90144-9.
11

González, M., R. Martínez, and M. Mota. "Rates of Growth in a Class of Homogeneous Multidimensional Markov Chains". Journal of Applied Probability 43, no. 1 (March 2006): 159–74. http://dx.doi.org/10.1239/jap/1143936250.

Abstract:
We investigate the asymptotic behaviour of homogeneous multidimensional Markov chains whose states have nonnegative integer components. We obtain growth rates for these models in a situation similar to the near-critical case for branching processes, provided that they converge to infinity with positive probability. Finally, the general theoretical results are applied to a class of controlled multitype branching processes in which random control is allowed.
12

González, M., R. Martínez, and M. Mota. "Rates of Growth in a Class of Homogeneous Multidimensional Markov Chains". Journal of Applied Probability 43, no. 01 (March 2006): 159–74. http://dx.doi.org/10.1017/s0021900200001431.

Abstract:
We investigate the asymptotic behaviour of homogeneous multidimensional Markov chains whose states have nonnegative integer components. We obtain growth rates for these models in a situation similar to the near-critical case for branching processes, provided that they converge to infinity with positive probability. Finally, the general theoretical results are applied to a class of controlled multitype branching processes in which random control is allowed.
13

González, M., R. Martínez, and M. Mota. "On the geometric growth in a class of homogeneous multitype Markov chain". Journal of Applied Probability 42, no. 4 (December 2005): 1015–30. http://dx.doi.org/10.1239/jap/1134587813.

Abstract:
In this paper, we investigate the geometric growth of homogeneous multitype Markov chains whose states have nonnegative integer coordinates. Such models are considered in a situation similar to the supercritical case for branching processes. Finally, our general theoretical results are applied to a class of controlled multitype branching processes in which the control is random.
14

González, M., R. Martínez, and M. Mota. "On the geometric growth in a class of homogeneous multitype Markov chain". Journal of Applied Probability 42, no. 04 (December 2005): 1015–30. http://dx.doi.org/10.1017/s0021900200001078.

Abstract:
In this paper, we investigate the geometric growth of homogeneous multitype Markov chains whose states have nonnegative integer coordinates. Such models are considered in a situation similar to the supercritical case for branching processes. Finally, our general theoretical results are applied to a class of controlled multitype branching processes in which the control is random.
15

Chen, Anyue, Phil Pollett, Hanjun Zhang, and Ben Cairns. "Uniqueness criteria for continuous-time Markov chains with general transition structures". Advances in Applied Probability 37, no. 4 (December 2005): 1056–74. http://dx.doi.org/10.1239/aap/1134587753.

Abstract:
We derive necessary and sufficient conditions for the existence of bounded or summable solutions to systems of linear equations associated with Markov chains. This substantially extends a famous result of G. E. H. Reuter, which provides a convenient means of checking various uniqueness criteria for birth-death processes. Our result allows chains with much more general transition structures to be accommodated. One application is to give a new proof of an important result of M. F. Chen concerning upwardly skip-free processes. We then use our generalization of Reuter's lemma to prove new results for downwardly skip-free chains, such as the Markov branching process and several of its many generalizations. This permits us to establish uniqueness criteria for several models, including the general birth, death, and catastrophe process, extended branching processes, and asymptotic birth-death processes, the latter being neither upwardly skip-free nor downwardly skip-free.
16

Chen, Anyue, Phil Pollett, Hanjun Zhang, and Ben Cairns. "Uniqueness criteria for continuous-time Markov chains with general transition structures". Advances in Applied Probability 37, no. 04 (December 2005): 1056–74. http://dx.doi.org/10.1017/s0001867800000665.

Abstract:
We derive necessary and sufficient conditions for the existence of bounded or summable solutions to systems of linear equations associated with Markov chains. This substantially extends a famous result of G. E. H. Reuter, which provides a convenient means of checking various uniqueness criteria for birth-death processes. Our result allows chains with much more general transition structures to be accommodated. One application is to give a new proof of an important result of M. F. Chen concerning upwardly skip-free processes. We then use our generalization of Reuter's lemma to prove new results for downwardly skip-free chains, such as the Markov branching process and several of its many generalizations. This permits us to establish uniqueness criteria for several models, including the general birth, death, and catastrophe process, extended branching processes, and asymptotic birth-death processes, the latter being neither upwardly skip-free nor downwardly skip-free.
17

Athreya, Krishna B., and Hye-Jeong Kang. "Some limit theorems for positive recurrent branching Markov chains: II". Advances in Applied Probability 30, no. 3 (September 1998): 711–22. http://dx.doi.org/10.1239/aap/1035228125.

Abstract:
In this paper we consider a Galton-Watson process in which particles move according to a positive recurrent Markov chain on a general state space. We prove a law of large numbers for the empirical position distribution and also discuss the rate of this convergence.
18

Athreya, Krishna B., and Hye-Jeong Kang. "Some limit theorems for positive recurrent branching Markov chains: II". Advances in Applied Probability 30, no. 03 (September 1998): 711–22. http://dx.doi.org/10.1017/s0001867800008569.

Abstract:
In this paper we consider a Galton-Watson process in which particles move according to a positive recurrent Markov chain on a general state space. We prove a law of large numbers for the empirical position distribution and also discuss the rate of this convergence.
19

Kirkpatrick, Anna, Kalen Patton, Prasad Tetali, and Cassie Mitchell. "Markov Chain-Based Sampling for Exploring RNA Secondary Structure under the Nearest Neighbor Thermodynamic Model and Extended Applications". Mathematical and Computational Applications 25, no. 4 (10 October 2020): 67. http://dx.doi.org/10.3390/mca25040067.

Abstract:
Ribonucleic acid (RNA) secondary structures and branching properties are important for determining functional ramifications in biology. While energy minimization of the Nearest Neighbor Thermodynamic Model (NNTM) is commonly used to identify such properties (number of hairpins, maximum ladder distance, etc.), it is difficult to know whether the resultant values fall within expected dispersion thresholds for a given energy function. The goal of this study was to construct a Markov chain capable of examining the dispersion of RNA secondary structures and branching properties obtained from NNTM energy function minimization independent of a specific nucleotide sequence. Plane trees are studied as a model for RNA secondary structure, with energy assigned to each tree based on the NNTM, and a corresponding Gibbs distribution is defined on the trees. Through a bijection between plane trees and 2-Motzkin paths, a Markov chain converging to the Gibbs distribution is constructed, and fast mixing time is established by estimating the spectral gap of the chain. The spectral gap estimate is obtained through a series of decompositions of the chain and also by building on known mixing time results for other chains on Dyck paths. The resulting algorithm can be used as a tool for exploring the branching structure of RNA, especially for long sequences, and to examine branching structure dependence on energy model parameters. Full exposition is provided for the mathematical techniques used with the expectation that these techniques will prove useful in bioinformatics, computational biology, and additional extended applications.
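The chain built in that paper targets a Gibbs distribution over plane trees (equivalently, 2-Motzkin paths) with NNTM energies. The sketch below shows only the generic pattern of such a sampler: a Metropolis chain on Dyck paths with a toy energy that counts peaks as a crude stand-in for hairpin-like features. The move set, the energy, and the parameters are illustrative assumptions, not the construction analysed in the paper.

import math
import random

random.seed(1)

def is_dyck(path):
    # A Dyck path never goes below height zero and returns to zero at the end.
    height = 0
    for step in path:
        height += 1 if step == "U" else -1
        if height < 0:
            return False
    return height == 0

def energy(path):
    # Toy energy: minus the number of peaks ("UD" factors); NOT the NNTM energy.
    return -sum(1 for a, b in zip(path, path[1:]) if a == "U" and b == "D")

def metropolis_step(path, beta):
    # Propose swapping the steps at two uniformly chosen positions (a symmetric
    # proposal); reject proposals that leave the set of valid Dyck paths.
    i, j = random.sample(range(len(path)), 2)
    proposal = list(path)
    proposal[i], proposal[j] = proposal[j], proposal[i]
    if proposal == path or not is_dyck(proposal):
        return path
    if random.random() < math.exp(-beta * (energy(proposal) - energy(path))):
        return proposal
    return path

n, beta = 10, 1.0
path = ["U"] * n + ["D"] * n          # start from the "staircase" path
peak_counts = []
for step in range(20000):
    path = metropolis_step(path, beta)
    if step % 100 == 0:
        peak_counts.append(-energy(path))
print("average number of peaks under the toy Gibbs measure:",
      sum(peak_counts) / len(peak_counts))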
20

Hu, Dihe. "THE CONSTRUCTION OF MULTITYPE CANONICAL MARKOV BRANCHING CHAINS IN RANDOM ENVIRONMENTS". Acta Mathematica Scientia 26, no. 3 (July 2006): 431–42. http://dx.doi.org/10.1016/s0252-9602(06)60067-2.
21

Vallander, S. S. "Occupation times for countable Markov chains III. Chains on a tree with one branching point". Journal of Soviet Mathematics 36, no. 4 (February 1987): 451–61. http://dx.doi.org/10.1007/bf01663453.
22

Bansaye, Vincent. "Ancestral Lineages and Limit Theorems for Branching Markov Chains in Varying Environment". Journal of Theoretical Probability 32, no. 1 (27 March 2018): 249–81. http://dx.doi.org/10.1007/s10959-018-0825-1.
23

Di-he, Hu, and Zhang Shu-lin. "The Laplace functional and moments for Markov branching chains in random environments". Wuhan University Journal of Natural Sciences 10, no. 3 (May 2005): 485–92. http://dx.doi.org/10.1007/bf02831130.
24

Athreya, Krishna B. "Change of Measures for Markov Chains and the LlogL Theorem for Branching Processes". Bernoulli 6, no. 2 (April 2000): 323. http://dx.doi.org/10.2307/3318579.
25

Pollett, P. K. "On the identification of continuous-time Markov chains with a given invariant measure". Journal of Applied Probability 31, no. 4 (December 1994): 897–910. http://dx.doi.org/10.2307/3215315.

Abstract:
In [14] a necessary and sufficient condition was obtained for there to exist uniquely a Q-process with a specified invariant measure, under the assumption that Q is a stable, conservative, single-exit matrix. The purpose of this note is to demonstrate that, for an arbitrary stable and conservative q-matrix, the same condition suffices for the existence of a suitable Q-process, but that this process might not be unique. A range of examples is considered, including pure-birth processes, a birth process with catastrophes, birth-death processes and the Markov branching process with immigration.
26

Pollett, P. K. "On the identification of continuous-time Markov chains with a given invariant measure". Journal of Applied Probability 31, no. 04 (December 1994): 897–910. http://dx.doi.org/10.1017/s0021900200099435.

Abstract:
In [14] a necessary and sufficient condition was obtained for there to exist uniquely a Q-process with a specified invariant measure, under the assumption that Q is a stable, conservative, single-exit matrix. The purpose of this note is to demonstrate that, for an arbitrary stable and conservative q-matrix, the same condition suffices for the existence of a suitable Q-process, but that this process might not be unique. A range of examples is considered, including pure-birth processes, a birth process with catastrophes, birth-death processes and the Markov branching process with immigration.
27

Parsamanesh, Mahmood, and Marwan Abukhaled. "Stochastic modeling of spreading an infection with standard incidence rate". Journal of Statistics and Management Systems 27, no. 6 (2024): 1221–41. http://dx.doi.org/10.47974/jsms-1271.

Abstract:
This paper studies and models the random spread of an infection in a population. It extends the traditional deterministic modeling approach by incorporating discrete-time stochastic modeling using Markov chains. The probability of extinction and disease persistence is then investigated using the branching chain method, with a focus on a quantity in the non-random model, which is called the basic reproductive number. Moreover, the random model is transformed into a system of stochastic differential equations by approximating the probability distribution function using the forward Kolmogorov equation. An equivalent stochastic differential system is also introduced and examined for the stochastic model. Finally, numerical simulations of the stochastic models are conducted to evaluate the theoretical findings presented in the paper.
28

Louhichi, Sana, and Bernard Ycart. "Exponential Growth of Bifurcating Processes with Ancestral Dependence". Advances in Applied Probability 47, no. 2 (June 2015): 545–64. http://dx.doi.org/10.1239/aap/1435236987.

Abstract:
Branching processes are classical growth models in cell kinetics. In their construction, it is usually assumed that cell lifetimes are independent random variables, which has been proved false in experiments. Models of dependent lifetimes are considered here, in particular bifurcating Markov chains. Under the hypotheses of stationarity and multiplicative ergodicity, the corresponding branching process is proved to have the same type of asymptotics as its classic counterpart in the independent and identically distributed supercritical case: the cell population grows exponentially, the growth rate being related to the exponent of multiplicative ergodicity, in a similar way as to the Laplace transform of lifetimes in the i.i.d. case. An identifiable model for which the multiplicative ergodicity coefficients and the growth rate can be explicitly computed is proposed.
29

Louhichi, Sana, and Bernard Ycart. "Exponential Growth of Bifurcating Processes with Ancestral Dependence". Advances in Applied Probability 47, no. 02 (June 2015): 545–64. http://dx.doi.org/10.1017/s0001867800007977.

Abstract:
Branching processes are classical growth models in cell kinetics. In their construction, it is usually assumed that cell lifetimes are independent random variables, which has been proved false in experiments. Models of dependent lifetimes are considered here, in particular bifurcating Markov chains. Under the hypotheses of stationarity and multiplicative ergodicity, the corresponding branching process is proved to have the same type of asymptotics as its classic counterpart in the independent and identically distributed supercritical case: the cell population grows exponentially, the growth rate being related to the exponent of multiplicative ergodicity, in a similar way as to the Laplace transform of lifetimes in the i.i.d. case. An identifiable model for which the multiplicative ergodicity coefficients and the growth rate can be explicitly computed is proposed.
30

Das, Ankush, Di Wang, and Jan Hoffmann. "Probabilistic Resource-Aware Session Types". Proceedings of the ACM on Programming Languages 7, POPL (9 January 2023): 1925–56. http://dx.doi.org/10.1145/3571259.

Abstract:
Session types guarantee that message-passing processes adhere to predefined communication protocols. Prior work on session types has focused on deterministic languages but many message-passing systems, such as Markov chains and randomized distributed algorithms, are probabilistic. To implement and analyze such systems, this article develops the meta theory of probabilistic session types with an application focus on automatic expected resource analysis. Probabilistic session types describe probability distributions over messages and are a conservative extension of intuitionistic (binary) session types. To send on a probabilistic channel, processes have to utilize internal randomness from a probabilistic branching or external randomness from receiving on a probabilistic channel. The analysis for expected resource bounds is smoothly integrated with the type system and is a variant of automatic amortized resource analysis. Type inference relies on linear constraint solving to automatically derive symbolic bounds for various cost metrics. The technical contributions include the meta theory that is based on a novel nested multiverse semantics and a type-reconstruction algorithm that allows flexible mixing of different sources of randomness without burdening the programmer with complex type annotations. The type system has been implemented in the language NomosPro with linear-time type checking. Experiments demonstrate that NomosPro is applicable in different domains such as cost analysis of randomized distributed algorithms, analysis of Markov chains, probabilistic analysis of amortized data structures and digital contracts. NomosPro is also shown to be scalable by (i) implementing two broadcast and a bounded retransmission protocol where messages are dropped with a fixed probability, and (ii) verifying the limiting distribution of a Markov chain with 64 states and 420 transitions.
31

Trčka, Nikola. "Strong, Weak and Branching Bisimulation for Transition Systems and Markov Reward Chains: A Unifying Matrix Approach". Electronic Proceedings in Theoretical Computer Science 13 (10 December 2009): 55–65. http://dx.doi.org/10.4204/eptcs.13.5.
32

Klebaner, Fima C. "Linear growth in near-critical population-size-dependent multitype Galton–Watson processes". Journal of Applied Probability 26, no. 3 (September 1989): 431–45. http://dx.doi.org/10.2307/3214402.

Abstract:
We consider a multitype population-size-dependent branching process in discrete time. A process is considered to be near-critical if the mean matrices of offspring distributions approach the mean matrix of a critical process as the population size increases. We show that if the second moments of offspring distributions stabilize as the population size increases, and the limiting variances are not too large in comparison with the deviation of the means from criticality, then the extinction probability is less than 1 and the process grows arithmetically fast, in the sense that any linear combination which is not orthogonal to the left eigenvector of the limiting mean matrix grows linearly to a limit distribution. We identify cases when the limiting distribution is gamma. A result on transience of multidimensional Markov chains is also given.
33

Klebaner, Fima C. "Linear growth in near-critical population-size-dependent multitype Galton–Watson processes". Journal of Applied Probability 26, no. 03 (September 1989): 431–45. http://dx.doi.org/10.1017/s0021900200038043.

Abstract:
We consider a multitype population-size-dependent branching process in discrete time. A process is considered to be near-critical if the mean matrices of offspring distributions approach the mean matrix of a critical process as the population size increases. We show that if the second moments of offspring distributions stabilize as the population size increases, and the limiting variances are not too large in comparison with the deviation of the means from criticality, then the extinction probability is less than 1 and the process grows arithmetically fast, in the sense that any linear combination which is not orthogonal to the left eigenvector of the limiting mean matrix grows linearly to a limit distribution. We identify cases when the limiting distribution is gamma. A result on transience of multidimensional Markov chains is also given.
34

Breban, Romulus. "Emergence failure of early epidemics: A mathematical modeling approach". PLOS ONE 19, no. 5 (29 May 2024): e0301415. http://dx.doi.org/10.1371/journal.pone.0301415.

Abstract:
Epidemic or pathogen emergence is the phenomenon by which a poorly transmissible pathogen finds its evolutionary pathway to become a mutant that can cause an epidemic. Many mathematical models of pathogen emergence rely on branching processes. Here, we discuss pathogen emergence using Markov chains, for a more tractable analysis, generalizing previous work by Kendall and Bartlett about disease invasion. We discuss the probability of emergence failure for early epidemics, when the number of infected individuals is small and the number of the susceptible individuals is virtually unlimited. Our formalism addresses both directly transmitted and vector-borne diseases, in the cases where the original pathogen is 1) one step-mutation away from the epidemic strain, and 2) undergoing a long chain of neutral mutations that do not change the epidemiology. We obtain analytic results for the probabilities of emergence failure and two features transcending the transmission mechanism. First, the reproduction number of the original pathogen is determinant for the probability of pathogen emergence, more important than the mutation rate or the transmissibility of the emerged pathogen. Second, the probability of mutation within infected individuals must be sufficiently high for the pathogen undergoing neutral mutations to start an epidemic, the mutation threshold depending again on the basic reproduction number of the original pathogen. Finally, we discuss the parameterization of models of pathogen emergence, using SARS-CoV1 as an example of zoonotic emergence and HIV as an example for the emergence of drug resistance. We also discuss assumptions of our models and implications for epidemiology.
35

Hautphenne, Sophie. "A Structured Markov Chain Approach to Branching Processes". Stochastic Models 31, no. 3 (24 June 2015): 403–32. http://dx.doi.org/10.1080/15326349.2015.1022264.
36

Giroux, Gaston. "Asymptotic results for non-linear processes of the McKean tagged-molecule type". Journal of Applied Probability 23, no. 1 (March 1986): 42–51. http://dx.doi.org/10.2307/3214115.

Abstract:
McKean's tagged-molecule process is a non-linear homogeneous two-state Markov chain in continuous time, constructed with the aid of a binary branching process. For each of a large class of branching processes we construct a similar process. The construction is carefully done and the weak homogeneity is deduced. A simple probability argument permits us to show convergence to the equidistribution (½, ½) and to note that this limit is a strong equilibrium. A non-homogeneous Markov chain result is also used to establish the geometric rate of convergence. A proof of a Boltzmann H-theorem is also established.
37

Giroux, Gaston. "Asymptotic results for non-linear processes of the McKean tagged-molecule type". Journal of Applied Probability 23, no. 01 (March 1986): 42–51. http://dx.doi.org/10.1017/s0021900200106266.

Abstract:
McKean's tagged-molecule process is a non-linear homogeneous two-state Markov chain in continuous time, constructed with the aid of a binary branching process. For each of a large class of branching processes we construct a similar process. The construction is carefully done and the weak homogeneity is deduced. A simple probability argument permits us to show convergence to the equidistribution (½, ½) and to note that this limit is a strong equilibrium. A non-homogeneous Markov chain result is also used to establish the geometric rate of convergence. A proof of a Boltzmann H-theorem is also established.
38

HONG, WENMING, and HUAMING WANG. "INTRINSIC BRANCHING STRUCTURE WITHIN (L-1) RANDOM WALK IN RANDOM ENVIRONMENT AND ITS APPLICATIONS". Infinite Dimensional Analysis, Quantum Probability and Related Topics 16, no. 01 (March 2013): 1350006. http://dx.doi.org/10.1142/s0219025713500069.

Abstract:
We figure out the intrinsic branching structure within (L-1) random walk in random environment. As applications, the branching structure enables us to calculate the expectation of the first hitting time directly, and specify the density of the invariant measure for the Markov chain of "the environment viewed from particles" explicitly.
39

Melas, V. B. "Branching Technique for Markov Chain Simulation (Finite State Case)". Statistics 25, no. 2 (January 1994): 159–71. http://dx.doi.org/10.1080/02331889408802441.
40

Machado, F. P., and S. Yu Popov. "One-dimensional branching random walks in a Markovian random environment". Journal of Applied Probability 37, no. 4 (December 2000): 1157–63. http://dx.doi.org/10.1239/jap/1014843096.

Abstract:
We study a one-dimensional supercritical branching random walk in a non-i.i.d. random environment, which considers both the branching mechanism and the step transition. This random environment is constructed using a recurrent Markov chain on a finite or countable state space. Criteria of (strong) recurrence and transience are presented for this model.
41

HONG, WENMING, and LIN ZHANG. "BRANCHING STRUCTURE FOR THE TRANSIENT (1, R)-RANDOM WALK IN RANDOM ENVIRONMENT AND ITS APPLICATIONS". Infinite Dimensional Analysis, Quantum Probability and Related Topics 13, no. 04 (December 2010): 589–618. http://dx.doi.org/10.1142/s0219025710004188.

Abstract:
An intrinsic multi-type branching structure within the transient (1, R)-RWRE is revealed. The branching structure enables us to specify the density of the absolutely continuous invariant measure for the Markov chain of environments seen from the particle and reprove the LLN with a drift explicitly in terms of the environment.
42

Machado, F. P., and S. Yu Popov. "One-dimensional branching random walks in a Markovian random environment". Journal of Applied Probability 37, no. 04 (December 2000): 1157–63. http://dx.doi.org/10.1017/s0021900200018350.

Abstract:
We study a one-dimensional supercritical branching random walk in a non-i.i.d. random environment, which considers both the branching mechanism and the step transition. This random environment is constructed using a recurrent Markov chain on a finite or countable state space. Criteria of (strong) recurrence and transience are presented for this model.
43

Pakes, Anthony G. "Extinction and explosion of nonlinear Markov branching processes". Journal of the Australian Mathematical Society 82, no. 3 (June 2007): 403–28. http://dx.doi.org/10.1017/s1446788700036193.

Abstract:
This paper concerns a generalization of the Markov branching process that preserves the random walk jump chain, but admits arbitrary positive jump rates. Necessary and sufficient conditions are found for regularity, including a generalization of the Harris-Dynkin integral condition when the jump rates are reciprocals of a Hausdorff moment sequence. Behaviour of the expected time to extinction is found, and some asymptotic properties of the explosion time are given for the case where extinction cannot occur. Existence of a unique invariant measure is shown, and conditions found for unique solution of the Forward equations. The ergodicity of a resurrected version is investigated.
44

Grey, D. R. "Supercritical branching processes with density independent catastrophes". Mathematical Proceedings of the Cambridge Philosophical Society 104, no. 2 (September 1988): 413–16. http://dx.doi.org/10.1017/s0305004100065579.

Abstract:
A Markov branching process in either discrete time (the Galton–Watson process) or continuous time is modified by the introduction of a process of catastrophes which remove some individuals (and, by implication, their descendants) from the population. The catastrophe process is independent of the reproduction mechanism and takes the form of a sequence of independent identically distributed non-negative integer-valued random variables. In the continuous time case, these catastrophes occur at the points of an independent Poisson process with constant rate. If at any time the size of a catastrophe is at least the current population size, then the population becomes extinct. Thus in both discrete and continuous time we still have a Markov chain with stationary transition probabilities and an absorbing state at zero. Some authors use the term ‘emigration’ as an alternative to ‘catastrophe’.
45

Bandyopadhyay, Antar, Svante Janson, and Debleena Thacker. "Strong convergence of infinite color balanced urns under uniform ergodicity". Journal of Applied Probability 57, no. 3 (September 2020): 853–65. http://dx.doi.org/10.1017/jpr.2020.37.

Abstract:
We consider the generalization of the Pólya urn scheme with possibly infinitely many colors, as introduced in [37], [4], [5], and [6]. For countably many colors, we prove almost sure convergence of the urn configuration under the uniform ergodicity assumption on the associated Markov chain. The proof uses a stochastic coupling of the sequence of chosen colors with a branching Markov chain on a weighted random recursive tree as described in [6], [31], and [26]. Using this coupling we estimate the covariance between any two selected colors. In particular, we re-prove the limit theorem for the classical urn models with finitely many colors.
46

Costes, E., and Y. Guédon. "Modeling the Sylleptic Branching on One-year-old Trunks of Apple Cultivars". Journal of the American Society for Horticultural Science 122, no. 1 (January 1997): 53–62. http://dx.doi.org/10.21273/jashs.122.1.53.

Abstract:
The structure of 1-year-old trunks resulting from sylleptic branching is compared among apple (Malus domestica Borkh.) cultivars with diverse branching and fruiting habits. The 1-year-old trunks developing from a graft are described as a succession of metamers whose structure refers to location, distribution, and length of sylleptic axillary shoots. We used a stochastic process called hidden semi-Markov chain to capture the embedded structure resulting from mixing of different types of axillary shoots developing along the trunks. The models, corresponding to the different cultivars, are composed of a first transient nonbranched state, a succession of transient states that cover the median sylleptic branching zone, and a final absorbing nonbranched state. They are interpreted with regard to complexity, extent, and branching distribution of the median sylleptic zone. Main results deal with the balance between long and short sylleptic shoots and the distribution of long sylleptic shoots along the trunks. Results suggest that sylleptic branching could be used as an early characteristic to evaluate the later branching behavior of cultivars.
47

Lahodny, Glenn, Jr., and Mona Zevika. "Effects of Fogging and Mosquito Repellent on the Probability of Disease Extinction for Dengue Fever". Communication in Biomathematical Sciences 4, no. 1 (7 May 2021): 1–13. http://dx.doi.org/10.5614/cbms.2021.4.1.1.

Abstract:
A Continuous-Time Markov Chain model is constructed based on a deterministic model of dengue fever transmission including mosquito fogging and the use of repellent. The basic reproduction number (R0) for the corresponding deterministic model is obtained. This number indicates the possible occurrence of an endemic at the early stages of the infection period. A multitype branching process is used to approximate the Markov chain. The construction of offspring probability generating functions related to the infected states is used to calculate the probability of disease extinction and the probability of an outbreak (P0). Sensitivity analysis is shown for variation of control parameters and for indices of the basic reproduction number. These results allow for a better understanding of the relation of the basic reproduction number with other indicators of disease transmission.
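The extinction probability in such models is the componentwise-minimal fixed point of the offspring probability generating functions and can be computed by simple iteration. The sketch below shows that generic computation for a toy two-type branching process with independent Poisson offspring counts; the mean matrix and the Poisson assumption are illustrative only and are unrelated to the dengue model studied in the paper.

import numpy as np

# Toy two-type branching process: a type-i individual leaves a Poisson(M[i, j])
# number of type-j offspring, independently over j (arbitrary example values).
M = np.array([[0.6, 0.8],
              [0.5, 0.7]])

def offspring_pgf(s, M):
    # Joint offspring pgf of a type-i parent for independent Poisson counts:
    # G_i(s) = exp(sum_j M[i, j] * (s_j - 1)).
    return np.exp(M @ (s - 1.0))

# The vector of extinction probabilities is the smallest fixed point of G;
# iterating G from the zero vector converges to it.
q = np.zeros(2)
for _ in range(1000):
    q = offspring_pgf(q, M)

print("extinction probability starting from one individual of each type:", np.round(q, 4))
print("spectral radius of the mean matrix:",
      round(float(max(abs(np.linalg.eigvals(M)))), 4))
# A spectral radius above 1 (supercritical case) is exactly when each q_i is below 1.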
48

Piau, Didier. "Harmonic moments of inhomogeneous branching processes". Advances in Applied Probability 38, no. 2 (June 2006): 465–86. http://dx.doi.org/10.1239/aap/1151337080.

Abstract:
We study the harmonic moments of Galton-Watson processes that are possibly inhomogeneous and have positive values. Good estimates of these are needed to compute unbiased estimators for noncanonical branching Markov processes, which occur, for instance, in the modelling of the polymerase chain reaction. By convexity, the ratio of the harmonic mean to the mean is at most 1. We prove that, for every square-integrable branching mechanism, this ratio lies between 1-A/k and 1-A′/k for every initial population of size k>A. The positive constants A and A′ are such that A≥A′, are explicit, and depend only on the generation-by-generation branching mechanisms. In particular, we do not use the distribution of the limit of the classical martingale associated with the Galton-Watson process. Thus, emphasis is put on nonasymptotic bounds and on the dependence of the harmonic mean upon the size of the initial population. In the Bernoulli case, which is relevant for the modelling of the polymerase chain reaction, we prove essentially optimal bounds that are valid for every initial population size k≥1. Finally, in the general case and for sufficiently large initial populations, similar techniques yield sharp estimates of the harmonic moments of higher degree.
49

Piau, Didier. "Harmonic moments of inhomogeneous branching processes". Advances in Applied Probability 38, no. 02 (June 2006): 465–86. http://dx.doi.org/10.1017/s0001867800001051.

Abstract:
We study the harmonic moments of Galton-Watson processes that are possibly inhomogeneous and have positive values. Good estimates of these are needed to compute unbiased estimators for noncanonical branching Markov processes, which occur, for instance, in the modelling of the polymerase chain reaction. By convexity, the ratio of the harmonic mean to the mean is at most 1. We prove that, for every square-integrable branching mechanism, this ratio lies between 1-A/k and 1-A′/k for every initial population of size k>A. The positive constants A and A′ are such that A≥A′, are explicit, and depend only on the generation-by-generation branching mechanisms. In particular, we do not use the distribution of the limit of the classical martingale associated with the Galton-Watson process. Thus, emphasis is put on nonasymptotic bounds and on the dependence of the harmonic mean upon the size of the initial population. In the Bernoulli case, which is relevant for the modelling of the polymerase chain reaction, we prove essentially optimal bounds that are valid for every initial population size k≥1. Finally, in the general case and for sufficiently large initial populations, similar techniques yield sharp estimates of the harmonic moments of higher degree.
50

Cerf, Raphaël, and Joseba Dalmau. "Galton–Watson and branching process representations of the normalized Perron–Frobenius eigenvector". ESAIM: Probability and Statistics 23 (2019): 797–802. http://dx.doi.org/10.1051/ps/2019007.

Abstract:
Let A be a primitive matrix and let λ be its Perron–Frobenius eigenvalue. We give formulas expressing the associated normalized Perron–Frobenius eigenvector as a simple functional of a multitype Galton–Watson process whose mean matrix is A, as well as of a multitype branching process with mean matrix e^{(A−I)t}. These formulas are generalizations of the classical formula for the invariant probability measure of a Markov chain.
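The paper expresses this eigenvector through a Galton–Watson functional; as a simple numerical cross-check, not the paper's representation, the sketch below computes the normalized Perron–Frobenius eigenvector by plain power iteration and, in the special case of a stochastic matrix, recovers the invariant probability measure of the corresponding Markov chain. All matrices are arbitrary examples.

import numpy as np

def perron_frobenius_eigenvector(A, iterations=500):
    # Power iteration: for a primitive nonnegative matrix A, the normalized
    # iterates A^k x / sum(A^k x) converge to the Perron-Frobenius eigenvector.
    x = np.ones(A.shape[0])
    for _ in range(iterations):
        x = A @ x
        x /= x.sum()
    return x

A = np.array([[1.0, 2.0, 0.5],        # a primitive nonnegative matrix (example values)
              [0.3, 0.7, 1.2],
              [0.9, 0.1, 0.4]])
v = perron_frobenius_eigenvector(A)
print("normalized Perron-Frobenius eigenvector of A:", np.round(v, 4))
print("Perron-Frobenius eigenvalue of A:", round(float((A @ v)[0] / v[0]), 4))

# Special case mentioned in the abstract: for a stochastic matrix P, the
# Perron-Frobenius eigenvector of P^T is the invariant probability measure
# of the Markov chain with transition matrix P.
P = np.array([[0.5, 0.5, 0.0],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])
print("invariant measure of P:", np.round(perron_frobenius_eigenvector(P.T), 4))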