Journal articles on the topic 'Markov approximation'

To see the other types of publications on this topic, follow the link: Markov approximation.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the top 50 journal articles for your research on the topic 'Markov approximation.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Butko, Yana A. "Chernoff approximation of subordinate semigroups." Stochastics and Dynamics 18, no. 03 (May 18, 2018): 1850021. http://dx.doi.org/10.1142/s0219493718500211.

Full text
Abstract:
This note is devoted to the approximation of evolution semigroups generated by some Markov processes and hence to the approximation of the transition probabilities of these processes. The semigroups considered correspond to processes obtained by subordination (i.e. by a time change) of some original (parent) Markov processes with respect to subordinators, i.e. Lévy processes with a.s. increasing paths (which play the role of the new time). If the semigroup corresponding to a parent Markov process is not known explicitly, then neither the subordinate semigroup nor even the generator of the subordinate semigroup is known explicitly either. In this note, some (Chernoff) approximations are constructed for subordinate semigroups (in the case when the subordinators have either known transition probabilities or a known and bounded Lévy measure) under the condition that the parent semigroups are not known but have already been Chernoff-approximated. As has been shown in the recent literature, this condition is fulfilled for several important classes of Markov processes. This fact makes it possible, in particular, to use the constructed Chernoff approximations of subordinate semigroups to approximate semigroups corresponding to subordination of Feller processes and (Feller-type) diffusions in Euclidean spaces, on star graphs and on Riemannian manifolds. Such approximations can be used for direct calculations and for the simulation of stochastic processes. The method of Chernoff approximation is based on the Chernoff theorem and can also be interpreted as a construction of Markov chains approximating a given Markov process and as the numerical path-integration method for solving the corresponding PDE/SDE.
APA, Harvard, Vancouver, ISO, and other styles
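The Chernoff scheme behind this entry can be illustrated with the simplest Chernoff function, F(s) = I + sL, for which Chernoff's theorem gives (F(t/n))^n → e^{tL}. The sketch below is illustrative only (the 3-state generator is invented, and this is the Euler-type special case, not the paper's subordinate-semigroup construction):

```python
import numpy as np
from numpy.linalg import matrix_power

# Invented generator of a 3-state continuous-time Markov chain (rows sum to zero).
L = np.array([[-1.0, 0.7, 0.3],
              [0.4, -0.9, 0.5],
              [0.2, 0.8, -1.0]])

def chernoff_approx(L, t, n):
    """Approximate the semigroup e^{tL} by (F(t/n))^n with the simplest
    Chernoff function F(s) = I + s*L (so F(0) = I and F'(0) = L)."""
    F = np.eye(L.shape[0]) + (t / n) * L
    return matrix_power(F, n)

def expm_taylor(A, terms=60):
    """Matrix exponential via a truncated Taylor series (adequate for small ||A||)."""
    out = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ A / k
        out = out + term
    return out

t = 1.0
exact = expm_taylor(t * L)
for n in (10, 100, 1000):
    err = np.abs(chernoff_approx(L, t, n) - exact).max()
    print(n, err)  # the error shrinks roughly like 1/n
```

Iterating a one-step kernel n times in place of the unknown semigroup is exactly the "Markov chain approximating a given Markov process" reading mentioned in the abstract.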
2

Patseika, Pavel G., Yauheni A. Rouba, and Kanstantin A. Smatrytski. "On one rational integral operator of Fourier – Chebyshev type and approximation of Markov functions." Journal of the Belarusian State University. Mathematics and Informatics, no. 2 (July 30, 2020): 6–27. http://dx.doi.org/10.33581/2520-6508-2020-2-6-27.

Full text
Abstract:
The purpose of this paper is to construct an integral rational Fourier operator based on the system of Chebyshev – Markov rational functions and to study its approximation properties on classes of Markov functions. In the introduction, the main results of well-known works on approximations of Markov functions are presented. Rational approximation of such functions is a well-known classical problem; it was studied by A. A. Gonchar, T. Ganelius, J.-E. Andersson, A. A. Pekarskii, G. Stahl and other authors. In the main part, an integral operator of Fourier – Chebyshev type with respect to the rational Chebyshev – Markov functions, which is a rational function of order no higher than n, is introduced, and the approximation of Markov functions is studied. If the measure satisfies the conditions supp μ = [1, a], a > 1, dμ(t) = ϕ(t)dt and ϕ(t) ≍ (t − 1)α on [1, a], estimates of pointwise and uniform approximation and the asymptotic expression for the majorant of uniform approximation are established. In the case of a fixed number of geometrically distinct poles in the extended complex plane, the values of the optimal parameters that provide the highest rate of decrease of this majorant are found, as well as asymptotically accurate estimates of the best uniform approximation by this method in the case of an even number of geometrically distinct poles of the approximating function. In the final part, we present asymptotic estimates of the approximation of some elementary functions that can be represented by Markov functions.
APA, Harvard, Vancouver, ISO, and other styles
3

Peköz, Erol A. "Stein's method for geometric approximation." Journal of Applied Probability 33, no. 3 (September 1996): 707–13. http://dx.doi.org/10.2307/3215352.

Full text
Abstract:
The Stein–Chen method for Poisson approximation is adapted to the setting of the geometric distribution. This yields a convenient method for assessing the accuracy of the geometric approximation to the distribution of the number of failures preceding the first success in dependent trials. The results are applied to approximating waiting time distributions for patterns in coin tossing, and to approximating the distribution of the time when a stationary Markov chain first visits a rare set of states. The error bounds obtained are sharper than those obtainable using related Poisson approximations.
APA, Harvard, Vancouver, ISO, and other styles
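The flavour of the geometric approximation assessed in this paper can be sketched on one of its named applications: waiting times for patterns in coin tossing. Below, the exact tail of the waiting time for the pattern HH in fair-coin tossing (computed from the pattern's progress chain) is compared with a geometric tail matched to the classical mean waiting time of 6; the setup is illustrative, not taken from the paper:

```python
import numpy as np

# Substochastic matrix over the "HH not yet seen" states of a fair-coin
# pattern chain: state 0 = no progress, state 1 = last toss was heads.
p = 0.5
Q = np.array([[1 - p, p],
              [1 - p, 0.0]])

def tail_exact(n):
    """P(T > n): probability the pattern HH has not appeared in n tosses."""
    v = np.array([1.0, 0.0])  # start with no progress towards HH
    for _ in range(n):
        v = v @ Q
    return v.sum()

MEAN_T = 6.0  # classical mean waiting time for HH with a fair coin

def tail_geometric(n):
    """Geometric tail matched to the exact mean waiting time."""
    return (1 - 1 / MEAN_T) ** n

for n in (5, 10, 20):
    print(n, tail_exact(n), tail_geometric(n))
```

The two tails stay close, which is the qualitative content of the geometric approximation; the paper's contribution is the Stein-method error bound quantifying this closeness for dependent trials.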
4

Peköz, Erol A. "Stein's method for geometric approximation." Journal of Applied Probability 33, no. 03 (September 1996): 707–13. http://dx.doi.org/10.1017/s0021900200100142.

Full text
Abstract:
The Stein–Chen method for Poisson approximation is adapted to the setting of the geometric distribution. This yields a convenient method for assessing the accuracy of the geometric approximation to the distribution of the number of failures preceding the first success in dependent trials. The results are applied to approximating waiting time distributions for patterns in coin tossing, and to approximating the distribution of the time when a stationary Markov chain first visits a rare set of states. The error bounds obtained are sharper than those obtainable using related Poisson approximations.
APA, Harvard, Vancouver, ISO, and other styles
5

Heinzmann, Dominik. "Extinction Times in Multitype Markov Branching Processes." Journal of Applied Probability 46, no. 1 (March 2009): 296–307. http://dx.doi.org/10.1239/jap/1238592131.

Full text
Abstract:
In this paper, a distributional approximation to the time to extinction in a subcritical continuous-time Markov branching process is derived. A limit theorem for this distribution is established and the error in the approximation is quantified. The accuracy of the approximation is illustrated in an epidemiological example. Since Markov branching processes serve as approximations to nonlinear epidemic processes in the initial and final stages, our results can also be used to describe the time to extinction for such processes.
APA, Harvard, Vancouver, ISO, and other styles
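A Monte Carlo sketch of the quantity studied in this entry, the time to extinction of a subcritical Markov branching process, under illustrative assumptions (a single type and Poisson(0.5) offspring, in discrete generations for simplicity):

```python
import numpy as np

rng = np.random.default_rng(0)

def extinction_time(m=0.5, max_gen=1000):
    """One run of a subcritical Galton-Watson branching process with
    Poisson(m) offspring, started from a single ancestor; returns the
    generation at which the population dies out."""
    z, gen = 1, 0
    while z > 0 and gen < max_gen:  # the cap is essentially never hit for m < 1
        z = rng.poisson(m, size=z).sum()
        gen += 1
    return gen

times = [extinction_time() for _ in range(5000)]
print(np.mean(times))  # near the theoretical mean extinction time (about 1.74)
```

For these parameters the mean can be checked against the generating-function recursion q_{n+1} = exp(m(q_n − 1)), since E[T] = Σ_{n≥0} (1 − q_n) ≈ 1.74.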
6

Heinzmann, Dominik. "Extinction Times in Multitype Markov Branching Processes." Journal of Applied Probability 46, no. 01 (March 2009): 296–307. http://dx.doi.org/10.1017/s0021900200005374.

Full text
Abstract:
In this paper, a distributional approximation to the time to extinction in a subcritical continuous-time Markov branching process is derived. A limit theorem for this distribution is established and the error in the approximation is quantified. The accuracy of the approximation is illustrated in an epidemiological example. Since Markov branching processes serve as approximations to nonlinear epidemic processes in the initial and final stages, our results can also be used to describe the time to extinction for such processes.
APA, Harvard, Vancouver, ISO, and other styles
7

Guo, Yuanzhen, Hao Xiong, and Nicholas Ruozzi. "Marginal Inference in Continuous Markov Random Fields Using Mixtures." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 7834–41. http://dx.doi.org/10.1609/aaai.v33i01.33017834.

Full text
Abstract:
Exact marginal inference in continuous graphical models is computationally challenging outside of a few special cases. Existing work on approximate inference has focused on approximately computing the messages as part of the loopy belief propagation algorithm either via sampling methods or moment matching relaxations. In this work, we present an alternative family of approximations that, instead of approximating the messages, approximates the beliefs in the continuous Bethe free energy using mixture distributions. We show that these types of approximations can be combined with numerical quadrature to yield algorithms with both theoretical guarantees on the quality of the approximation and significantly better practical performance in a variety of applications that are challenging for current state-of-the-art methods.
APA, Harvard, Vancouver, ISO, and other styles
8

Anichkin, S. A., and V. V. Kalashnikov. "Approximation of Markov chains." Journal of Soviet Mathematics 32, no. 1 (January 1986): 1–8. http://dx.doi.org/10.1007/bf01084492.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Rolski, Tomasz. "Approximation of periodic queues." Advances in Applied Probability 19, no. 3 (September 1987): 691–707. http://dx.doi.org/10.2307/1427413.

Full text
Abstract:
In this paper we demonstrate how some characteristics of queues with periodic Poisson arrivals can be approximated by the respective characteristics of queues with Markov-modulated input. These Markov-modulated queues were recently studied by Regterschot and de Smit (1984). The approximation theorems are given in terms of the weak convergence of some characteristics and their uniform integrability. The approximations are applicable to the following characteristics: mean workload, mean workload at the time of day, mean delay, and mean queue size.
APA, Harvard, Vancouver, ISO, and other styles
10

Rolski, Tomasz. "Approximation of periodic queues." Advances in Applied Probability 19, no. 03 (September 1987): 691–707. http://dx.doi.org/10.1017/s0001867800016827.

Full text
Abstract:
In this paper we demonstrate how some characteristics of queues with periodic Poisson arrivals can be approximated by the respective characteristics of queues with Markov-modulated input. These Markov-modulated queues were recently studied by Regterschot and de Smit (1984). The approximation theorems are given in terms of the weak convergence of some characteristics and their uniform integrability. The approximations are applicable to the following characteristics: mean workload, mean workload at the time of day, mean delay, and mean queue size.
APA, Harvard, Vancouver, ISO, and other styles
11

Patseika, Pavel G., and Yauheni A. Rouba. "Fejer means of rational Fourier – Chebyshev series and approximation of function |x|s." Journal of the Belarusian State University. Mathematics and Informatics, no. 3 (November 29, 2019): 18–34. http://dx.doi.org/10.33581/2520-6508-2019-3-18-34.

Full text
Abstract:
The approximation properties of Fejer means of Fourier series in the Chebyshev – Markov system of algebraic fractions, and the approximation by Fejer means of the function |x|s, 0 &lt; s &lt; 2, on the interval [−1,1], are studied. One orthogonal system of Chebyshev – Markov algebraic fractions is considered, and the Fejer means of the corresponding rational Fourier – Chebyshev series are introduced. The order of approximation of continuous functions on a segment by the sequence of Fejer means is established in terms of the modulus of continuity, together with sufficient conditions on the parameter providing uniform convergence. Estimates of the pointwise and uniform approximation of the function |x|s, 0 &lt; s &lt; 2, on the interval [−1,1], the asymptotic expressions as n → ∞ for the majorant of uniform approximation, and the optimal value of the parameter providing the highest rate of approximation of the studied functions by the rational Fourier – Chebyshev sums are found.
APA, Harvard, Vancouver, ISO, and other styles
12

Koppula, Kavitha, Babushri Srinivas Kedukodi, and Syam Prasad Kuncham. "Markov frameworks and stock market decision making." Soft Computing 24, no. 21 (May 18, 2020): 16413–24. http://dx.doi.org/10.1007/s00500-020-04950-4.

Full text
Abstract:
In this paper, we present applications of the Markov rough approximation framework (MRAF). The concept of MRAF is defined based on rough sets and Markov chains. MRAF is used to obtain the probability distribution function of various reference points in a rough approximation framework. We consider a set to be approximated together with its dynamicity, and the effect of dynamicity on rough approximations is stated with the help of Markov chains. An extension to Pawlak’s decision algorithm is presented, and it is used for predictions in a stock market environment. In addition, the suitability of the algorithm is illustrated in a multi-criteria medical diagnosis problem. Finally, the definition of the fuzzy tolerance relation is extended to higher dimensions using reference points, and basic results are established.
APA, Harvard, Vancouver, ISO, and other styles
13

Xia, Aihua. "A probabilistic proof of Stein's factors." Journal of Applied Probability 36, no. 1 (March 1999): 287–90. http://dx.doi.org/10.1239/jap/1032374250.

Full text
Abstract:
We provide a probabilistic proof of Stein's factors based on properties of birth and death Markov chains, solving a tantalizing puzzle in using Markov chain knowledge to view the celebrated Stein–Chen method for Poisson approximation. This work complements the work of Barbour (1988) for the case of Poisson random variable approximation.
APA, Harvard, Vancouver, ISO, and other styles
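The Stein–Chen method referenced here yields, in its best-known form, the bound d_TV(W, Poisson(λ)) ≤ λ⁻¹(1 − e^{−λ}) Σ p_i² for a sum W of independent Bernoulli(p_i) variables. A quick numerical check with invented probabilities (this illustrates the classical bound, not the paper's birth-and-death argument):

```python
import math
import numpy as np

def poisson_binomial_pmf(ps):
    """Exact pmf of a sum of independent Bernoulli(p_i) variables, by convolution."""
    pmf = np.array([1.0])
    for p in ps:
        pmf = np.convolve(pmf, [1 - p, p])
    return pmf

def poisson_pmf(lam, n):
    return np.array([math.exp(-lam) * lam ** k / math.factorial(k) for k in range(n)])

ps = [0.1, 0.05, 0.2, 0.15, 0.08]   # illustrative success probabilities
lam = sum(ps)

support = len(ps) + 25              # pad so the truncated Poisson tail is negligible
pb = np.zeros(support)
pb[: len(ps) + 1] = poisson_binomial_pmf(ps)
po = poisson_pmf(lam, support)

tv = 0.5 * np.abs(pb - po).sum()    # total variation distance
bound = (1 - math.exp(-lam)) / lam * sum(p * p for p in ps)  # Stein-Chen bound
print(tv, bound)                    # tv stays below the bound
```

The factor λ⁻¹(1 − e^{−λ}) is one of the "Stein factors" whose probabilistic interpretation via birth-and-death chains is the subject of this entry.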
14

Xia, Aihua. "A probabilistic proof of Stein's factors." Journal of Applied Probability 36, no. 01 (March 1999): 287–90. http://dx.doi.org/10.1017/s0021900200017058.

Full text
Abstract:
We provide a probabilistic proof of Stein's factors based on properties of birth and death Markov chains, solving a tantalizing puzzle in using Markov chain knowledge to view the celebrated Stein–Chen method for Poisson approximation. This work complements the work of Barbour (1988) for the case of Poisson random variable approximation.
APA, Harvard, Vancouver, ISO, and other styles
15

Ding, J., and N. H. Rhee. "A modified piecewise linear Markov approximation of Markov operators." Applied Mathematics and Computation 174, no. 1 (March 2006): 236–51. http://dx.doi.org/10.1016/j.amc.2005.03.026.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Patseika, Pavel G., and Yauheni A. Rouba. "On rational Abel – Poisson means on a segment and approximations of Markov functions." Journal of the Belarusian State University. Mathematics and Informatics, no. 3 (November 19, 2021): 6–24. http://dx.doi.org/10.33581/2520-6508-2021-3-6-24.

Full text
Abstract:
Approximations on the segment [−1, 1] of Markov functions by Abel – Poisson sums of a rational integral operator of Fourier type associated with the Chebyshev – Markov system of algebraic fractions, in the case of a fixed number of geometrically distinct poles, are investigated. An integral representation of the approximations and an estimate of the uniform approximation are found. Approximations of Markov functions in the case when the measure µ satisfies the conditions supp µ = [1, a], a > 1, dµ(t) = φ(t)dt and φ(t) ≍ (t − 1)α on [1, a] are studied, and estimates of pointwise and uniform approximation and the asymptotic expression for the majorant of uniform approximation are obtained. The optimal values of the parameters at which the majorant has the highest rate of decrease are found. As a corollary, asymptotic estimates of the approximation on the segment [−1, 1] of some elementary Markov functions by the rational method under study are given.
APA, Harvard, Vancouver, ISO, and other styles
17

Xia, Aihua, and Mei Zhang. "On approximation of Markov binomial distributions." Bernoulli 15, no. 4 (November 2009): 1335–50. http://dx.doi.org/10.3150/09-bej194.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Van Schaftingen, Jean, and Justin Dekeyser. "Approximation of symmetrizations by Markov processes." Indiana University Mathematics Journal 66, no. 4 (2017): 1145–72. http://dx.doi.org/10.1512/iumj.2017.66.6118.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Olteanu, Octav. "Markov Moment Problem and Related Approximation." Open Journal of Mathematical Modeling 1, no. 4 (2013): 113. http://dx.doi.org/10.12966/ojmmo.08.05.2013.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Chen, Minghua, Soung Chang Liew, Ziyu Shao, and Caihong Kai. "Markov Approximation for Combinatorial Network Optimization." IEEE Transactions on Information Theory 59, no. 10 (October 2013): 6301–27. http://dx.doi.org/10.1109/tit.2013.2268923.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Borkar, Vivek S. "Stochastic approximation with ‘controlled Markov’ noise." Systems & Control Letters 55, no. 2 (February 2006): 139–45. http://dx.doi.org/10.1016/j.sysconle.2005.06.005.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Jaśkiewicz, A., and A. S. Nowak. "Approximation of Noncooperative Semi-Markov Games." Journal of Optimization Theory and Applications 131, no. 1 (November 29, 2006): 115–34. http://dx.doi.org/10.1007/s10957-006-9128-2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Barbour, A. D., and Torgny Lindvall. "Translated Poisson Approximation for Markov Chains." Journal of Theoretical Probability 19, no. 3 (November 23, 2006): 609–30. http://dx.doi.org/10.1007/s10959-006-0047-9.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Barbour, A. D., and Torgny Lindvall. "Translated Poisson Approximation for Markov Chains." Journal of Theoretical Probability 22, no. 1 (October 15, 2008): 279–80. http://dx.doi.org/10.1007/s10959-008-0190-6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Borovkov, K. A., and D. Pfeifer. "Pseudo-Poisson approximation for Markov chains." Stochastic Processes and their Applications 61, no. 1 (January 1996): 163–80. http://dx.doi.org/10.1016/0304-4149(95)00065-8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Andersson, J. E. "Best Rational Approximation to Markov Functions." Journal of Approximation Theory 76, no. 2 (February 1994): 219–32. http://dx.doi.org/10.1006/jath.1994.1015.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Fu, James C., Liqun Wang, and W. Y. Wendy Lou. "On exact and large deviation approximation for the distribution of the longest run in a sequence of two-state Markov dependent trials." Journal of Applied Probability 40, no. 2 (June 2003): 346–60. http://dx.doi.org/10.1239/jap/1053003548.

Full text
Abstract:
Consider a sequence of outcomes from Markov dependent two-state (success-failure) trials. In this paper, the exact distributions are derived for three longest-run statistics: the longest failure run, longest success run, and the maximum of the two. The method of finite Markov chain imbedding is used to obtain these exact distributions, and their bounds and large deviation approximation are also studied. Numerical comparisons among the exact distributions, bounds, and approximations are provided to illustrate the theoretical results. With some modifications, we show that the results can be easily extended to Markov dependent multistate trials.
APA, Harvard, Vancouver, ISO, and other styles
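The finite Markov chain imbedding technique used in this paper can be sketched for the longest success run: the current run length is tracked as a chain state, with one absorbing state entered once the target length is reached. The parameters below are illustrative, and the first-trial handling assumes k ≥ 2:

```python
import numpy as np

def p_longest_run_lt(k, n, a, b, p0):
    """P(longest success run < k) in n two-state Markov-dependent trials,
    via finite Markov chain imbedding (k >= 2 assumed).
    a = P(success | previous success), b = P(success | previous failure),
    p0 = P(first trial succeeds). State s in 0..k-1 is the current
    success-run length; state k is absorbing (a run of length k occurred)."""
    T = np.zeros((k + 1, k + 1))
    T[0, 1], T[0, 0] = b, 1 - b              # previous trial was a failure
    for s in range(1, k):
        T[s, s + 1], T[s, 0] = a, 1 - a      # extend the run or reset it
    T[k, k] = 1.0                            # absorbed
    v = np.zeros(k + 1)
    v[0], v[1] = 1 - p0, p0                  # state distribution after trial 1
    for _ in range(n - 1):
        v = v @ T
    return v[:k].sum()

# i.i.d. sanity check: fair coin, P(no run of 2 heads in 3 tosses) = 5/8
print(p_longest_run_lt(2, 3, a=0.5, b=0.5, p0=0.5))
```

Setting a = b = p0 recovers the i.i.d. case, while a > b ("sticky" successes) makes long runs more likely, as the imbedded chain shows directly.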
28

Fu, James C., Liqun Wang, and W. Y. Wendy Lou. "On exact and large deviation approximation for the distribution of the longest run in a sequence of two-state Markov dependent trials." Journal of Applied Probability 40, no. 02 (June 2003): 346–60. http://dx.doi.org/10.1017/s0021900200019343.

Full text
Abstract:
Consider a sequence of outcomes from Markov dependent two-state (success-failure) trials. In this paper, the exact distributions are derived for three longest-run statistics: the longest failure run, longest success run, and the maximum of the two. The method of finite Markov chain imbedding is used to obtain these exact distributions, and their bounds and large deviation approximation are also studied. Numerical comparisons among the exact distributions, bounds, and approximations are provided to illustrate the theoretical results. With some modifications, we show that the results can be easily extended to Markov dependent multistate trials.
APA, Harvard, Vancouver, ISO, and other styles
29

Durbin, J. "The first-passage density of a continuous gaussian process to a general boundary." Journal of Applied Probability 22, no. 1 (March 1985): 99–122. http://dx.doi.org/10.2307/3213751.

Full text
Abstract:
Under mild conditions an explicit expression is obtained for the first-passage density of sample paths of a continuous Gaussian process to a general boundary. Since this expression will usually be hard to compute, an approximation is given which is computationally simple and which is exact in the limit as the boundary becomes increasingly remote. The integral of this approximating density is itself approximated by a simple formula and this also is exact in the limit. A new integral equation is derived for the first-passage density of a continuous Gaussian Markov process. This is used to obtain further approximations.
APA, Harvard, Vancouver, ISO, and other styles
30

Durbin, J. "The first-passage density of a continuous gaussian process to a general boundary." Journal of Applied Probability 22, no. 01 (March 1985): 99–122. http://dx.doi.org/10.1017/s0021900200029041.

Full text
Abstract:
Under mild conditions an explicit expression is obtained for the first-passage density of sample paths of a continuous Gaussian process to a general boundary. Since this expression will usually be hard to compute, an approximation is given which is computationally simple and which is exact in the limit as the boundary becomes increasingly remote. The integral of this approximating density is itself approximated by a simple formula and this also is exact in the limit. A new integral equation is derived for the first-passage density of a continuous Gaussian Markov process. This is used to obtain further approximations.
APA, Harvard, Vancouver, ISO, and other styles
31

Limnios, Nikolaos, and Anatoliy Swishchuk. "Discrete-Time Semi-Markov Random Evolutions in Asymptotic Reduced Random Media with Applications." Mathematics 8, no. 6 (June 12, 2020): 963. http://dx.doi.org/10.3390/math8060963.

Full text
Abstract:
This paper deals with discrete-time semi-Markov random evolutions (DTSMRE) in reduced random media. The reduction can be done for ergodic and non-ergodic media. Asymptotic approximations of random evolutions living in reducible random media (random environments) are obtained. Namely, averaging, diffusion approximation, and normal deviation or diffusion approximation with equilibrium are obtained by the martingale weak convergence method. Applications of the above results to additive functionals and dynamical systems in discrete time yield the above three types of asymptotic results.
APA, Harvard, Vancouver, ISO, and other styles
32

Olteanu, Octav. "Convexity, Markov Operators, Approximation, and Related Optimization." Mathematics 10, no. 15 (August 4, 2022): 2775. http://dx.doi.org/10.3390/math10152775.

Full text
Abstract:
The present review paper provides recent results on convexity and its applications to the constrained extension of linear operators, motivated by the existence of subgradients of continuous convex operators, the Markov moment problem and related Markov operators, approximation using the Krein–Milman theorem, related optimization, and polynomial approximation on unbounded subsets. In many cases, the Mazur–Orlicz theorem also leads to Markov operators as solutions. The common point of all these results is the Hahn–Banach theorem and its consequences, supplied by specific results in polynomial approximation. All these theorems or their proofs essentially involve the notion of convexity.
APA, Harvard, Vancouver, ISO, and other styles
33

Fu, James C., and Brad C. Johnson. "Approximate probabilities for runs and patterns in i.i.d. and Markov-dependent multistate trials." Advances in Applied Probability 41, no. 1 (March 2009): 292–308. http://dx.doi.org/10.1239/aap/1240319586.

Full text
Abstract:
Let Xn(Λ) be the number of nonoverlapping occurrences of a simple pattern Λ in a sequence of independent and identically distributed (i.i.d.) multistate trials. For fixed k, the exact tail probability P{Xn (∧) < k} is difficult to compute and tends to 0 exponentially as n → ∞. In this paper we use the finite Markov chain imbedding technique and standard matrix theory results to obtain an approximation for this tail probability. The result is extended to compound patterns, Markov-dependent multistate trials, and overlapping occurrences of Λ. Numerical comparisons with Poisson and normal approximations are provided. Results indicate that the proposed approximations perform very well and do significantly better than the Poisson and normal approximations in many cases.
APA, Harvard, Vancouver, ISO, and other styles
34

Fu, James C., and Brad C. Johnson. "Approximate probabilities for runs and patterns in i.i.d. and Markov-dependent multistate trials." Advances in Applied Probability 41, no. 01 (March 2009): 292–308. http://dx.doi.org/10.1017/s0001867800003232.

Full text
Abstract:
Let Xn(Λ) be the number of nonoverlapping occurrences of a simple pattern Λ in a sequence of independent and identically distributed (i.i.d.) multistate trials. For fixed k, the exact tail probability P{Xn (∧) &lt; k} is difficult to compute and tends to 0 exponentially as n → ∞. In this paper we use the finite Markov chain imbedding technique and standard matrix theory results to obtain an approximation for this tail probability. The result is extended to compound patterns, Markov-dependent multistate trials, and overlapping occurrences of Λ. Numerical comparisons with Poisson and normal approximations are provided. Results indicate that the proposed approximations perform very well and do significantly better than the Poisson and normal approximations in many cases.
APA, Harvard, Vancouver, ISO, and other styles
35

Swishchuk, Anatoliy, and Nikolaos Limnios. "Controlled Discrete-Time Semi-Markov Random Evolutions and Their Applications." Mathematics 9, no. 2 (January 13, 2021): 158. http://dx.doi.org/10.3390/math9020158.

Full text
Abstract:
In this paper, we introduce controlled discrete-time semi-Markov random evolutions. These processes are random evolutions of discrete-time semi-Markov processes where a control is applied to the values of the random evolution. The main results concern time-rescaled weak convergence limit theorems in a Banach space for the above stochastic systems, in the form of averaging and diffusion approximation. The applications are given to controlled additive functionals, controlled geometric Markov renewal processes, and controlled dynamical systems. We provide dynamical principles for discrete-time dynamical systems such as controlled additive functionals and controlled geometric Markov renewal processes. We also produce dynamic programming equations (Hamilton–Jacobi–Bellman equations) for the limiting processes in the diffusion approximation, such as controlled additive functionals, controlled geometric Markov renewal processes and controlled dynamical systems. As an example, we consider the solution of Merton's portfolio optimization problem for the limiting controlled geometric Markov renewal processes in the diffusion approximation scheme. The rates of convergence in the limit theorems are also presented.
APA, Harvard, Vancouver, ISO, and other styles
36

Clark, Steven P., and Peter C. Kiessler. "A Diffusion Approximation for Markov Renewal Processes." Journal of Applied Probability 44, no. 2 (June 2007): 366–78. http://dx.doi.org/10.1239/jap/1183667407.

Full text
Abstract:
For a Markov renewal process where the time parameter is discrete, we present a novel method for calculating the asymptotic variance. Our approach is based on the key renewal theorem and is applicable even when the state space of the Markov chain is countably infinite.
APA, Harvard, Vancouver, ISO, and other styles
37

Clark, Steven P., and Peter C. Kiessler. "A Diffusion Approximation for Markov Renewal Processes." Journal of Applied Probability 44, no. 02 (June 2007): 366–78. http://dx.doi.org/10.1017/s0021900200117887.

Full text
Abstract:
For a Markov renewal process where the time parameter is discrete, we present a novel method for calculating the asymptotic variance. Our approach is based on the key renewal theorem and is applicable even when the state space of the Markov chain is countably infinite.
APA, Harvard, Vancouver, ISO, and other styles
38

Clark, Steven P., and Peter C. Kiessler. "A Diffusion Approximation for Markov Renewal Processes." Journal of Applied Probability 44, no. 02 (June 2007): 366–78. http://dx.doi.org/10.1017/s0021900200003028.

Full text
Abstract:
For a Markov renewal process where the time parameter is discrete, we present a novel method for calculating the asymptotic variance. Our approach is based on the key renewal theorem and is applicable even when the state space of the Markov chain is countably infinite.
APA, Harvard, Vancouver, ISO, and other styles
39

Capitanelli, Raffaela, and Mirko D’Ovidio. "Approximation of Space-Time Fractional Equations." Fractal and Fractional 5, no. 3 (July 17, 2021): 71. http://dx.doi.org/10.3390/fractalfract5030071.

Full text
Abstract:
The aim of this paper is to provide approximation results for space-time non-local equations with general non-local (and fractional) operators in space and time. We consider a general Markov process time changed with general subordinators or inverses to general subordinators. Our analysis is based on Bernstein symbols and Dirichlet forms, where the symbols characterize the time changes, and the Dirichlet forms characterize the Markov processes.
APA, Harvard, Vancouver, ISO, and other styles
40

Böttcher, Björn, and René L. Schilling. "Approximation of Feller Processes by Markov Chains with Lévy Increments." Stochastics and Dynamics 09, no. 01 (March 2009): 71–80. http://dx.doi.org/10.1142/s0219493709002555.

Full text
Abstract:
We consider Feller processes whose generators have the test functions as an operator core. In this case, the generator is a pseudo differential operator with negative definite symbol q(x, ξ). If |q(x, ξ)| < c(1 + |ξ|2), the corresponding Feller process can be approximated by Markov chains whose steps are increments of Lévy processes. This approximation can easily be used for a simulation of the sample path of a Feller process. Further, we provide conditions in terms of the symbol for the transition operators of the Markov chains to be Feller. This gives rise to a sequence of Feller processes approximating the given Feller process.
APA, Harvard, Vancouver, ISO, and other styles
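The simplest instance of the construction described here approximates a diffusion (a Feller process) by a Markov chain whose steps are scaled increments of a Brownian motion, i.e. a Lévy process; this is an Euler-type sketch with illustrative Ornstein–Uhlenbeck coefficients, not the paper's general pseudo-differential setting:

```python
import numpy as np

rng = np.random.default_rng(42)

def approximate_chain(x0, t, n, drift, sigma, paths=20000):
    """Markov chain approximating a diffusion: each step adds a Brownian
    (Levy) increment over a slice h = t/n, scaled by the state-dependent
    coefficients evaluated at the current position."""
    h = t / n
    x = np.full(paths, x0, dtype=float)
    for _ in range(n):
        x = x + drift(x) * h + sigma(x) * np.sqrt(h) * rng.standard_normal(paths)
    return x

# Ornstein-Uhlenbeck example: dX = -X dt + dB_t, so X_t -> N(0, 1/2) for large t
x = approximate_chain(0.0, t=5.0, n=500, drift=lambda y: -y,
                      sigma=lambda y: np.ones_like(y))
print(x.mean(), x.var())  # near 0 and 0.5
```

Each chain step is itself a simulable random variable, which is why, as the abstract notes, this approximation "can easily be used for a simulation of the sample path" of the process.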
41

Michaelides, Michalis, Jane Hillston, and Guido Sanguinetti. "Geometric fluid approximation for general continuous-time Markov chains." Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences 475, no. 2229 (September 2019): 20190100. http://dx.doi.org/10.1098/rspa.2019.0100.

Full text
Abstract:
Fluid approximations have seen great success in approximating the macro-scale behaviour of Markov systems with a large number of discrete states. However, these methods rely on the continuous-time Markov chain (CTMC) having a particular population structure which suggests a natural continuous state-space endowed with a dynamics for the approximating process. We construct here a general method based on spectral analysis of the transition matrix of the CTMC, without the need for a population structure. Specifically, we use the popular manifold learning method of diffusion maps to analyse the transition matrix as the operator of a hidden continuous process. An embedding of states in a continuous space is recovered, and the space is endowed with a drift vector field inferred via Gaussian process regression. In this manner, we construct an ordinary differential equation whose solution approximates the evolution of the CTMC mean, mapped onto the continuous space (known as the fluid limit).
42

Patseika, P. G., and Y. A. Rouba. "The Abel – Poisson means of conjugate Fourier – Chebyshev series and their approximation properties." Proceedings of the National Academy of Sciences of Belarus. Physics and Mathematics Series 57, no. 2 (July 16, 2021): 156–75. http://dx.doi.org/10.29235/1561-2430-2021-57-2-156-175.

Abstract:
Herein, the approximation properties of the Abel–Poisson means of rational conjugate Fourier series on the system of the Chebyshev–Markov algebraic fractions are studied, and the approximations of conjugate functions with density |x|^s, s ∈ (1, 2), on the segment [–1, 1] by this method are investigated. The introduction presents results on the polynomial and rational approximation of conjugate functions. The conjugate Fourier series on one system of the Chebyshev–Markov algebraic fractions is constructed. In the main part of the article, an integral representation of the approximations of conjugate functions on the segment [–1, 1] by the method under study is established; asymptotically exact upper bounds on the deviations of the conjugate Abel–Poisson means on classes of conjugate functions satisfying the Lipschitz condition on [–1, 1] are found; and the approximations by the conjugate Abel–Poisson means of conjugate functions with density |x|^s, s ∈ (1, 2), on [–1, 1] are studied. Estimates of the approximations are obtained, and an asymptotic expression for their majorant is found, together with the optimal value of the parameter at which the majorant decreases at the greatest rate. As a consequence, the problem of approximating the conjugate function with density |x|^s, s ∈ (1, 2), by the Abel–Poisson means of conjugate polynomial series on the system of Chebyshev polynomials of the first kind is studied in detail: estimates of the approximations and an asymptotic expression for their majorants are established. The work is of both theoretical and applied interest; it can be used in special courses at mathematical faculties and for solving specific problems of computational mathematics.
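For orientation, the classical trigonometric analogue of this summation method can be written compactly; the paper's rational Chebyshev–Markov version replaces the factors ρ^k by products depending on the poles of the fractions, so the following is only a sketch of the classical case:

```latex
% Abel--Poisson mean of a Fourier series (classical trigonometric case):
% for f with Fourier series a_0/2 + \sum_k (a_k \cos kx + b_k \sin kx),
f(\rho, x) = \frac{a_0}{2}
  + \sum_{k=1}^{\infty} \rho^{k}\bigl(a_k \cos kx + b_k \sin kx\bigr),
  \qquad 0 \le \rho < 1,
% and the conjugate Abel--Poisson mean sums the conjugate series:
\tilde{f}(\rho, x)
  = \sum_{k=1}^{\infty} \rho^{k}\bigl(a_k \sin kx - b_k \cos kx\bigr).
```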
43

Patseika, P. G., and Y. A. Rovba. "On approximations of the function |x|s by the Vallee Poussin means of the Fourier series by the system of the Chebyshev – Markov rational fractions." Proceedings of the National Academy of Sciences of Belarus. Physics and Mathematics Series 55, no. 3 (October 7, 2019): 263–82. http://dx.doi.org/10.29235/1561-2430-2019-55-3-263-282.

Abstract:
The approximation properties of the Vallee Poussin means of the Fourier series on the system of the Chebyshev–Markov rational fractions in approximating the function |x|^s, 0 < s < 2, are investigated. The introduction surveys the main results of previous work on the Vallee Poussin means in the polynomial and rational cases, as well as known results on the approximation of functions with a power singularity. The Vallee Poussin means on the interval [–1, 1] are introduced as a method of summing the Fourier series on one system of the Chebyshev–Markov rational fractions. In the main section of the article, an integral representation of the error of approximation of the function |x|^s, 0 < s < 2, on the segment [–1, 1] by the rational Vallee Poussin means is established; an estimate of the deviation of the Vallee Poussin means from |x|^s depending on the position of the point on the segment, a uniform estimate of the deviation on [–1, 1], and its asymptotic expression are found. The optimal value of the parameter at which the error of approximation of |x|^s, 0 < s < 2, on [–1, 1] decreases to zero at the fastest rate is obtained. As a consequence, the problem of approximating the function |x|^s, s > 0, by the Vallee Poussin means of the Fourier series on the system of the Chebyshev polynomials of the first kind is studied in detail: a pointwise estimate and an asymptotic estimate of the approximation are established. The work is of both theoretical and applied interest; its results can be used in special courses at mathematical faculties and for solving specific problems of computational mathematics.
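For comparison, the classical polynomial de la Vallée Poussin mean is the delayed arithmetic mean of the partial sums of the Fourier series; the rational analogue studied in the paper replaces these partial sums by those of the Fourier series in the Chebyshev–Markov fractions. The classical definition, given only as a sketch:

```latex
% de la Vallee Poussin mean (classical polynomial case): the delayed
% arithmetic mean of the partial sums S_k(f) of the Fourier series,
V_{n}(f, x) = \frac{1}{n+1} \sum_{k=n}^{2n} S_{k}(f, x).
```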
44

Baumann, Hendrik. "Finite-State-Space Truncations for Infinite Quasi-Birth-Death Processes." Journal of Applied Mathematics 2020 (August 7, 2020): 1–23. http://dx.doi.org/10.1155/2020/2678374.

Abstract:
For dealing numerically with infinite-state-space Markov chains, a truncation of the state space is inevitable; that is, an approximation by a finite-state-space Markov chain has to be performed. In this paper, we consider level-dependent quasi-birth-death processes, and we focus on the computation of stationary expectations. In previous literature, efficient methods for computing approximations to these characteristics have been suggested and established. These methods rely on truncating the process at some level N, and for N → ∞, convergence of the approximation to the desired characteristic is guaranteed. This paper’s main goal is to quantify the speed of convergence. Under the assumption of an f-modulated drift condition, we derive terms for a lower bound and an upper bound on stationary expectations which converge quickly to the same value and which can be efficiently computed.
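As a toy illustration of the truncation idea (not of the paper's bounds), consider an M/M/1 queue: a level-independent birth-death process whose truncated stationary distribution is available in closed form from detailed balance. The names below are our own:

```python
import numpy as np

def truncated_stationary_mean(lam, mu, N):
    """Stationary mean queue length of an M/M/1 queue truncated at level N.

    The infinite birth-death chain is replaced by the finite chain on
    {0, ..., N}; detailed balance gives pi_k proportional to rho**k, and
    for N -> infinity the mean converges to rho / (1 - rho).
    """
    rho = lam / mu
    pi = rho ** np.arange(N + 1)   # unnormalized geometric stationary weights
    pi /= pi.sum()
    return np.dot(np.arange(N + 1), pi)

exact = 0.5 / (1 - 0.5)            # rho/(1-rho) = 1.0 for rho = 0.5
approx = [truncated_stationary_mean(0.5, 1.0, N) for N in (5, 10, 20, 40)]
# the truncated means increase monotonically towards the exact value 1.0
```

The point of the paper is precisely to quantify how fast such truncated expectations approach their limit, here visible as geometric convergence in N.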
45

Anselmi, Jonatha, François Dufour, and Tomás Prieto-Rumeau. "Computable approximations for average Markov decision processes in continuous time." Journal of Applied Probability 55, no. 2 (June 2018): 571–92. http://dx.doi.org/10.1017/jpr.2018.36.

Abstract:
In this paper we study the numerical approximation of the optimal long-run average cost of a continuous-time Markov decision process, with Borel state and action spaces, and with bounded transition and reward rates. Our approach uses a suitable discretization of the state and action spaces to approximate the original control model. The approximation error for the optimal average reward is then bounded by a linear combination of coefficients related to the discretization of the state and action spaces, namely, the Wasserstein distance between an underlying probability measure μ and a measure with finite support, and the Hausdorff distance between the original and the discretized action sets. When approximating μ with its empirical probability measure we obtain convergence in probability at an exponential rate. An application to a queueing system is presented.
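The end product of such a discretization is a finite MDP, whose optimal long-run average reward can be computed by standard relative value iteration. The sketch below, with a tiny two-state example of our own, assumes a unichain model:

```python
import numpy as np

def relative_value_iteration(P, r, tol=1e-9, max_iter=10_000):
    """Optimal average reward of a finite MDP via relative value iteration.

    P[a] is the transition matrix under action a and r[a] the reward
    vector; finite models of this form arise from discretizing Borel
    state and action spaces.
    """
    n_states = P.shape[1]
    h = np.zeros(n_states)
    for _ in range(max_iter):
        Th = np.max(r + P @ h, axis=0)   # Bellman update, maximizing over actions
        Th_rel = Th - Th[0]              # pin a reference state to keep h bounded
        if np.max(np.abs(Th_rel - h)) < tol:
            break
        h = Th_rel
    gain = Th[0]                         # optimal long-run average reward
    return gain, h

# two states, two actions: P has shape (actions, states, states)
P = np.array([[[0.9, 0.1], [0.2, 0.8]],
              [[0.5, 0.5], [0.4, 0.6]]])
r = np.array([[1.0, 0.0],
              [2.0, 0.5]])
gain, h = relative_value_iteration(P, r)
```

Refining the discretization grid shrinks the Wasserstein and Hausdorff terms in the paper's error bound, so the gain of the finite model converges to the optimal average reward of the original process.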
46

Chuan, Pham, Minh N. H. Nguyen, and Choong Seon Hong. "A Markov Approximation-Based Approach for Network Service Chain Embedding." Journal of KIISE 44, no. 7 (July 31, 2017): 719–25. http://dx.doi.org/10.5626/jok.2017.44.7.719.

47

Christen, J. Andrés, and Colin Fox. "Markov chain Monte Carlo Using an Approximation." Journal of Computational and Graphical Statistics 14, no. 4 (December 2005): 795–810. http://dx.doi.org/10.1198/106186005x76983.

48

Blanchet, Jose, Guillermo Gallego, and Vineet Goyal. "A Markov Chain Approximation to Choice Modeling." Operations Research 64, no. 4 (August 2016): 886–905. http://dx.doi.org/10.1287/opre.2016.1505.

49

Solomos, G. P., and A. C. Lucia. "MARKOV APPROXIMATION TO FATIGUE CRACK SIZE DISTRIBUTION." Fatigue & Fracture of Engineering Materials and Structures 13, no. 5 (September 1990): 457–71. http://dx.doi.org/10.1111/j.1460-2695.1990.tb00617.x.

50

Jack, M. W., M. Naraschewski, M. J. Collett, and D. F. Walls. "Markov approximation for the atomic output coupler." Physical Review A 59, no. 4 (April 1, 1999): 2962–73. http://dx.doi.org/10.1103/physreva.59.2962.

