Journal articles on the topic 'Modelling, Markov chain'


Consult the top 50 journal articles for your research on the topic 'Modelling, Markov chain.'


1

Boucher, Thomas R., and Daren B. H. Cline. "Piggybacking Threshold Processes with a Finite State Markov Chain." Stochastics and Dynamics 9, no. 2 (June 2009): 187–204. http://dx.doi.org/10.1142/s0219493709002622.

Abstract:
The state-space representations of certain nonlinear autoregressive time series are general state Markov chains. The transitions of a general state Markov chain among regions in its state-space can be modeled with the transitions among states of a finite state Markov chain. Stability of the time series is then informed by the stationary distributions of the finite state Markov chain. This approach generalizes some previous results.
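To make the idea concrete, the following minimal Python sketch (not taken from the paper) simulates a hypothetical two-regime threshold AR(1) process, estimates the transition matrix of the induced two-state chain on the regions {x ≤ 0} and {x > 0}, and reads off its stationary distribution; all coefficients are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-regime threshold AR(1); coefficients chosen only for illustration.
def step(x):
    phi = 0.4 if x <= 0.0 else -0.7   # regime depends on which region the state is in
    return phi * x + rng.normal()

# Simulate the process and count transitions between the regions {x <= 0} and {x > 0}.
n_steps = 100_000
x = 0.0
counts = np.zeros((2, 2))
for _ in range(n_steps):
    i = int(x > 0.0)
    x = step(x)
    counts[i, int(x > 0.0)] += 1

P = counts / counts.sum(axis=1, keepdims=True)   # finite-state transition matrix

# Stationary distribution: normalised left eigenvector of P for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi /= pi.sum()
print("region-transition matrix:\n", P)
print("stationary distribution over regions:", pi)
```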
2

Gerontidis, Ioannis I. "Semi-Markov Replacement Chains." Advances in Applied Probability 26, no. 3 (September 1994): 728–55. http://dx.doi.org/10.1017/s0001867800026525.

Abstract:
We consider an absorbing semi-Markov chain for which each time absorption occurs there is a resetting of the chain according to some initial (replacement) distribution. The new process is a semi-Markov replacement chain and we study its properties in terms of those of the imbedded Markov replacement chain. A time-dependent version of the model is also defined and analysed asymptotically for two types of environmental behaviour, i.e. either convergent or cyclic. The results contribute to the control theory of semi-Markov chains and extend in a natural manner a wide variety of applied probability models. An application to the modelling of populations with semi-Markovian replacements is also presented.
3

Gerontidis, Ioannis I. "Semi-Markov Replacement Chains." Advances in Applied Probability 26, no. 3 (September 1994): 728–55. http://dx.doi.org/10.2307/1427818.

Abstract:
We consider an absorbing semi-Markov chain for which each time absorption occurs there is a resetting of the chain according to some initial (replacement) distribution. The new process is a semi-Markov replacement chain and we study its properties in terms of those of the imbedded Markov replacement chain. A time-dependent version of the model is also defined and analysed asymptotically for two types of environmental behaviour, i.e. either convergent or cyclic. The results contribute to the control theory of semi-Markov chains and extend in a natural manner a wide variety of applied probability models. An application to the modelling of populations with semi-Markovian replacements is also presented.
4

Faddy, M. J., and S. I. McClean. "Markov Chain Modelling for Geriatric Patient Care." Methods of Information in Medicine 44, no. 3 (2005): 369–73. http://dx.doi.org/10.1055/s-0038-1633979.

Abstract:
Objectives: To show that Markov chain modelling can be applied to data on geriatric patients and to use these models to assess the effects of covariates. Methods: Phase-type distributions were fitted by maximum likelihood to data on times spent by the patients in hospital and in community-based care. Data on the different events that ended the patients’ periods of care were used to estimate the dependence of the probabilities of these events on the phase from which the time in care ended. The age of the patients at admission to care and the year of admission were also included as covariates. Results: Differential effects of these covariates were shown on the various parameters of the fitted model, and interpretations of these effects are made. Conclusions: Models based on phase-type distributions were appropriate for describing times spent in care, as the ordered phases had an interpretable structure corresponding to increasing amounts of care being given.
5

Singhal, Ekta, and Kunal Mehta. "Marketing Channel Attribution Modelling: Markov Chain Analysis." International Journal of Indian Culture and Business Management 1, no. 1 (2020): 1. http://dx.doi.org/10.1504/ijicbm.2020.10027991.

6

Mehta, Kunal, and Ekta Singhal. "Marketing channel attribution modelling: Markov chain analysis." International Journal of Indian Culture and Business Management 21, no. 1 (2020): 63. http://dx.doi.org/10.1504/ijicbm.2020.109344.

7

Hadjinicola, George, and Larry Goldstein. "Markov chain modelling of bioassay toxicity procedures." Statistics in Medicine 12, no. 7 (April 15, 1993): 661–74. http://dx.doi.org/10.1002/sim.4780120705.

8

Catak, Muammer, Nurşin Baş, Kevin Cronin, Dario Tellez-Medina, Edmond P. Byrne, and John J. Fitzpatrick. "Markov chain modelling of fluidised bed granulation." Chemical Engineering Journal 164, no. 2-3 (November 1, 2010): 403–9. http://dx.doi.org/10.1016/j.cej.2010.02.022.

9

Huang, Vincent, and James Unwin. "Markov chain models of refugee migration data." IMA Journal of Applied Mathematics 85, no. 6 (September 29, 2020): 892–912. http://dx.doi.org/10.1093/imamat/hxaa032.

Abstract:
The application of Markov chains to modelling refugee crises is explored, focusing on local migration of individuals at the level of cities and days. As an explicit example, we apply the Markov chains migration model developed here to United Nations High Commissioner for Refugees data on the Burundi refugee crisis. We compare our method to a state-of-the-art ‘agent-based’ model of Burundi refugee movements, and highlight that Markov chain approaches presented here can improve the match to data while simultaneously being more algorithmically efficient.
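As a toy illustration of the kind of city-level, day-by-day Markov chain described here (with an invented three-location transition matrix rather than the paper's UNHCR-calibrated model):

```python
import numpy as np

# Hypothetical daily transition probabilities between three locations
# (the stay/move fractions are invented for illustration).
P = np.array([
    [0.90, 0.07, 0.03],   # from location A
    [0.05, 0.90, 0.05],   # from location B
    [0.02, 0.08, 0.90],   # from location C
])

pop = np.array([10_000.0, 0.0, 0.0])   # initial displaced population per location

# One Markov step per day: tomorrow's expected distribution is today's times P.
for day in range(30):
    pop = pop @ P

print("expected population per location after 30 days:", np.round(pop))
```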
10

Balzter, Heiko. "Markov chain models for vegetation dynamics." Ecological Modelling 126, no. 2-3 (February 2000): 139–54. http://dx.doi.org/10.1016/s0304-3800(00)00262-3.

11

Jin, Yongliang, and Amlan Mukherjee. "Markov chain applications in modelling facility condition deterioration." International Journal of Critical Infrastructures 10, no. 2 (2014): 93. http://dx.doi.org/10.1504/ijcis.2014.062965.

12

Liu, Peidong, and Yan Zheng. "Markov Chain Perturbations of a Class of Partially Expanding Attractors." Stochastics and Dynamics 6, no. 3 (September 2006): 341–54. http://dx.doi.org/10.1142/s0219493706001761.

Abstract:
In this paper Markov chain perturbations of a class of partially expanding attractors of a diffeomorphism are considered. We show that, under some regularity conditions on the transition probabilities, the zero-noise limits of stationary measures of the Markov chains are Sinai–Ruelle–Bowen measures of the diffeomorphism on the attractors.
13

Kuo, Lynn. "Markov Chain Monte Carlo." Technometrics 42, no. 2 (May 2000): 216. http://dx.doi.org/10.1080/00401706.2000.10486017.

14

Ge, Yuan, Yan Zhang, Wengen Gao, Fanyong Cheng, Nuo Yu, and Jincenzi Wu. "Modelling and Prediction of Random Delays in NCSs Using Double-Chain HMMs." Discrete Dynamics in Nature and Society 2020 (October 29, 2020): 1–16. http://dx.doi.org/10.1155/2020/6848420.

Abstract:
This paper is concerned with the modelling and prediction of random delays in networked control systems. The stochastic distribution of the random delay in the current sampling period is assumed to be affected by the network state in the current sampling period as well as the random delay in the previous sampling period. Based on this assumption, the double-chain hidden Markov model (DCHMM) is proposed in this paper to model the delays. There are two Markov chains in this model. One is the hidden Markov chain, which consists of the network states, and the other is the observable Markov chain, which consists of the delays. Moreover, the delays are also affected by the hidden network states, which yields the DCHMM-based delay model. The initialization and optimization problems of the model parameters are solved by using the segmental K-means clustering algorithm and the expectation maximization algorithm, respectively. Based on the model, the prediction of the controller-to-actuator (CA) delay in the current sampling period is obtained. The prediction can be used to design a controller that compensates for the CA delay in future research. Some comparative experiments are carried out to demonstrate the effectiveness and superiority of the proposed method.
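A much-simplified sketch of the underlying idea (a hidden network state governing observed, discretised delays) is given below using a standard single-chain HMM forward filter; the matrices and delay levels are invented, and this is not the paper's double-chain model.

```python
import numpy as np

# Toy HMM: 2 hidden network states (idle, congested), 3 discretised delay levels.
# Transition, emission and initial probabilities are invented for illustration.
A = np.array([[0.8, 0.2],
              [0.3, 0.7]])            # hidden-state transitions
B = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.3, 0.6]])       # P(delay level | hidden state)
pi = np.array([0.5, 0.5])

delay_levels = np.array([10.0, 25.0, 60.0])   # representative delays (ms) per level
observed = [0, 0, 1, 2, 2]                    # observed delay levels so far

# Forward filtering: alpha[j] is proportional to P(hidden state j | observations so far).
alpha = pi * B[:, observed[0]]
alpha /= alpha.sum()
for o in observed[1:]:
    alpha = (alpha @ A) * B[:, o]
    alpha /= alpha.sum()

# One-step-ahead prediction of the next delay.
next_state = alpha @ A                 # predicted hidden-state distribution
next_obs = next_state @ B              # predicted delay-level distribution
print("predicted delay level probabilities:", next_obs)
print("expected next delay (ms):", next_obs @ delay_levels)
```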
15

Melnik, Roderick V. Nicholas. "Dynamic system evolution and Markov chain approximation." Discrete Dynamics in Nature and Society 2, no. 1 (1998): 7–39. http://dx.doi.org/10.1155/s1026022698000028.

Abstract:
In this paper computational aspects of the mathematical modelling of dynamic system evolution have been considered as a problem in information theory. The construction of mathematical models is treated as a decision-making process with limited available information. The solution of the problem is associated with a computational model based on heuristics of a Markov chain in a discrete space–time of events. A stable approximation of the chain has been derived and the limiting cases are discussed. An intrinsic interconnection of constructive, sequential, and evolutionary approaches in related optimization problems provides new challenges for future work.
16

Brumnik, Robert, Iztok Podbregar, and Mojca Ferjancic-Podbregar. "Markov Chains Modelling for Biometric System Reliability Estimations in Supply Chain Management." Sensor Letters 11, no. 2 (February 1, 2013): 377–83. http://dx.doi.org/10.1166/sl.2013.2133.

17

Sahin, Ahmet D., and Zekai Sen. "First-order Markov chain approach to wind speed modelling." Journal of Wind Engineering and Industrial Aerodynamics 89, no. 3-4 (March 2001): 263–69. http://dx.doi.org/10.1016/s0167-6105(00)00081-7.

18

Ip, W. H., Bocheng Chen, Henry C. W. Lau, K. L. Choy, and S. L. Chan. "Modelling a CRM Markov chain process using system dynamics." International Journal of Value Chain Management 2, no. 4 (2008): 420. http://dx.doi.org/10.1504/ijvcm.2008.019849.

19

Mardia, K. V., V. B. Nyirongo, A. N. Walder, C. Xu, P. A. Dowd, R. J. Fowell, and J. T. Kent. "Markov Chain Monte Carlo Implementation of Rock Fracture Modelling." Mathematical Geology 39, no. 4 (August 9, 2007): 355–81. http://dx.doi.org/10.1007/s11004-007-9099-3.

20

Caleyo, F., J. C. Velázquez, A. Valor, and J. M. Hallen. "Markov chain modelling of pitting corrosion in underground pipelines." Corrosion Science 51, no. 9 (September 2009): 2197–207. http://dx.doi.org/10.1016/j.corsci.2009.06.014.

21

Pereira, A. G. C., F. A. S. Sousa, B. B. Andrade, and Viviane Simioli Medeiros Campos. "Higher order Markov Chain Model for Synthetic Generation of Daily Streamflows." TEMA (São Carlos) 19, no. 3 (December 17, 2018): 449. http://dx.doi.org/10.5540/tema.2018.019.03.449.

Abstract:
The aim of this study is to go further into the two-state Markov chain model for the synthetic generation of daily streamflows. The model proposed in Aksoy and Bayazit (2000) and Aksoy (2003) is based on two Markov chains for determining the state of the stream. The ascension curve of the hydrograph is modeled by a two-parameter Gamma probability distribution function, and the recession curve of the hydrograph is assumed to follow an exponential function. In this work, instead of assuming a pre-defined order for the Markov chains involved in the modelling of streamflows, a BIC test is performed to establish the Markov chain order that best fits the data. The methodology was applied to data from seven Brazilian sites. The model proposed here performed better than the one proposed by Aksoy, except for two sites, which have the lowest time series and are located in the driest regions.
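The order-selection step can be sketched as follows: fit Markov chains of increasing order to a binary (e.g. flow/no-flow) state sequence by counting transitions, and pick the order with the smallest BIC. The sequence below is simulated, so this is only a schematic of the idea, not the authors' streamflow model.

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(1)

# Simulated binary state sequence (0 = no flow, 1 = flow), for illustration only;
# the data here are actually generated by an order-1 chain.
seq = [0]
for _ in range(2000):
    p_wet = 0.7 if seq[-1] == 1 else 0.2
    seq.append(int(rng.random() < p_wet))

def bic(seq, order, n_states=2):
    # Count transitions from each length-`order` context to the next state.
    counts = defaultdict(lambda: np.zeros(n_states))
    for t in range(order, len(seq)):
        counts[tuple(seq[t - order:t])][seq[t]] += 1
    # Maximised log-likelihood of the order-k chain.
    loglik = 0.0
    for c in counts.values():
        p = c / c.sum()
        loglik += np.sum(c[c > 0] * np.log(p[c > 0]))
    n_params = (n_states ** order) * (n_states - 1)
    n_obs = len(seq) - order
    return -2.0 * loglik + n_params * np.log(n_obs)

for k in (1, 2, 3):
    print(f"order {k}: BIC = {bic(seq, k):.1f}")   # the smallest BIC picks the order
```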
22

Quine, M. P., and J. S. Law. "Modelling random linear nucleation and growth by a Markov chain." Journal of Applied Probability 36, no. 1 (March 1999): 273–78. http://dx.doi.org/10.1017/s0021900200017034.

Abstract:
In an attempt to investigate the adequacy of the normal approximation for the number of nuclei in certain growth/coverage models, we consider a Markov chain which has properties in common with related continuous-time Markov processes (as well as being of interest in its own right). We establish that the rate of convergence to normality for the number of ‘drops’ during times 1,2,…n is of the optimal ‘Berry–Esséen’ form, as n → ∞. We also establish a law of the iterated logarithm and a functional central limit theorem.
23

Quine, M. P., and J. S. Law. "Modelling random linear nucleation and growth by a Markov chain." Journal of Applied Probability 36, no. 1 (March 1999): 273–78. http://dx.doi.org/10.1239/jap/1032374248.

Abstract:
In an attempt to investigate the adequacy of the normal approximation for the number of nuclei in certain growth/coverage models, we consider a Markov chain which has properties in common with related continuous-time Markov processes (as well as being of interest in its own right). We establish that the rate of convergence to normality for the number of ‘drops’ during times 1,2,…n is of the optimal ‘Berry–Esséen’ form, as n → ∞. We also establish a law of the iterated logarithm and a functional central limit theorem.
24

Ball, Frank, and Geoffrey F. Yeo. "Lumpability and marginalisability for continuous-time Markov chains." Journal of Applied Probability 30, no. 3 (September 1993): 518–28. http://dx.doi.org/10.2307/3214762.

Abstract:
We consider lumpability for continuous-time Markov chains and provide a simple probabilistic proof of necessary and sufficient conditions for strong lumpability, valid in circumstances not covered by known theory. We also consider the following marginalisability problem. Let {X(t)} = {(X1(t), X2(t), …, Xm(t))} be a continuous-time Markov chain. Under what conditions are the marginal processes {X1(t)}, {X2(t)}, …, {Xm(t)} also continuous-time Markov chains? We show that this is related to lumpability and, if no two of the marginal processes can jump simultaneously, then they are continuous-time Markov chains if and only if they are mutually independent. Applications to ion channel modelling and birth–death processes are discussed briefly.
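A textbook version of the strong-lumpability condition is easy to check numerically: for every pair of blocks in the candidate partition, the total transition rate from a state into the target block must not depend on which state of the source block is chosen. The generator and partition below are invented for illustration (this is the generic check, not the probabilistic argument used in the paper).

```python
import numpy as np

def is_strongly_lumpable(Q, partition, tol=1e-9):
    """Standard strong-lumpability check for a CTMC generator Q and a state partition."""
    for src in partition:
        for dst in partition:
            rates = [Q[i, dst].sum() for i in src]   # total rate from state i into block dst
            if max(rates) - min(rates) > tol:
                return False
    return True

# Toy 4-state generator (rows sum to zero); values invented for illustration.
Q = np.array([
    [-3.0,  1.0,  1.0,  1.0],
    [ 2.0, -4.0,  1.0,  1.0],
    [ 0.5,  0.5, -2.0,  1.0],
    [ 0.5,  0.5,  1.0, -2.0],
])
print(is_strongly_lumpable(Q, [[0, 1], [2, 3]]))   # True for this example
```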
25

Acquah, Henry De-Graft. "Bayesian Logistic Regression Modelling via Markov Chain Monte Carlo Algorithm." Journal of Social and Development Sciences 4, no. 4 (April 30, 2013): 193–97. http://dx.doi.org/10.22610/jsds.v4i4.751.

Abstract:
This paper introduces Bayesian analysis and demonstrates its application to parameter estimation of the logistic regression via the Markov Chain Monte Carlo (MCMC) algorithm. The Bayesian logistic regression estimation is compared with the classical logistic regression. Both the classical logistic regression and the Bayesian logistic regression suggest that higher per capita income is associated with free trade of countries. The results also show a reduction of the standard errors associated with the coefficients obtained from the Bayesian analysis, thus bringing greater stability to the coefficients. It is concluded that the Bayesian Markov Chain Monte Carlo algorithm offers an alternative framework for estimating the logistic regression model.
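A minimal sketch of Bayesian logistic regression via a random-walk Metropolis sampler, in the spirit of the paper but on simulated data with an assumed N(0, 10²) prior; none of the numbers come from the article.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated data standing in for the trade/income example (illustration only).
n = 200
x = rng.normal(size=n)                        # standardised per-capita income
true_beta = np.array([-0.5, 1.2])
X = np.column_stack([np.ones(n), x])
p = 1.0 / (1.0 + np.exp(-X @ true_beta))
y = rng.binomial(1, p)

def log_post(beta):
    """Log-posterior: Bernoulli likelihood plus an assumed N(0, 10^2) prior per coefficient."""
    eta = X @ beta
    loglik = np.sum(y * eta - np.log1p(np.exp(eta)))
    logprior = -0.5 * np.sum(beta ** 2) / 100.0
    return loglik + logprior

# Random-walk Metropolis sampler.
beta = np.zeros(2)
lp = log_post(beta)
samples = []
for _ in range(20_000):
    prop = beta + 0.2 * rng.normal(size=2)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        beta, lp = prop, lp_prop
    samples.append(beta)

samples = np.array(samples[5_000:])           # discard burn-in
print("posterior means:", samples.mean(axis=0))
print("posterior standard deviations:", samples.std(axis=0))
```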
26

Pelkowitz, L. "The general Markov chain disorder problem." Stochastics 21, no. 2 (June 1987): 113–30. http://dx.doi.org/10.1080/17442508708833454.

27

Ball, Frank, and Geoffrey F. Yeo. "Lumpability and marginalisability for continuous-time Markov chains." Journal of Applied Probability 30, no. 3 (September 1993): 518–28. http://dx.doi.org/10.1017/s0021900200044272.

Abstract:
We consider lumpability for continuous-time Markov chains and provide a simple probabilistic proof of necessary and sufficient conditions for strong lumpability, valid in circumstances not covered by known theory. We also consider the following marginalisability problem. Let {X(t)} = {(X1(t), X2(t), …, Xm(t))} be a continuous-time Markov chain. Under what conditions are the marginal processes {X1(t)}, {X2(t)}, …, {Xm(t)} also continuous-time Markov chains? We show that this is related to lumpability and, if no two of the marginal processes can jump simultaneously, then they are continuous-time Markov chains if and only if they are mutually independent. Applications to ion channel modelling and birth–death processes are discussed briefly.
28

Čech, Martin, and Radim Lenort. "Modelling of Financial Resource Allocation for Increasing the Supply Chain Resilience Using Markov Chains." Acta logistica 8, no. 2 (June 30, 2021): 141–51. http://dx.doi.org/10.22306/al.v8i2.213.

Abstract:
The concept of supply chain resilience has arisen in response to changing conditions in the global market environment. Although supply chain resilience building is gaining increasing interest among the professional public and business practice, supporting decision-making in supply chain resilience building is still in its infancy. This article aims to present a mathematical model of the supply chain, based on Markov chains, to assess the impact of funds allocated to strengthening the supply chain’s resilience on its overall performance, and thus to support decision-making in the field. The model assumptions are presented; a mathematical model of a linear supply chain is then developed, generalized and tested; and methodological recommendations are given. To support the use of the model, a set of managerial implications is presented, benefits and limitations are discussed, and a direction for further research is defined.
29

Ball, Frank. "Central limit theorems for multivariate semi-Markov sequences and processes, with applications." Journal of Applied Probability 36, no. 2 (June 1999): 415–32. http://dx.doi.org/10.1017/s0021900200017228.

Abstract:
In this paper, central limit theorems for multivariate semi-Markov sequences and processes are obtained, both as the number of jumps of the associated Markov chain tends to infinity and, if appropriate, as the time for which the process has been running tends to infinity. The theorems are widely applicable since many functions defined on Markov or semi-Markov processes can be analysed by exploiting appropriate embedded multivariate semi-Markov sequences. An application to a problem in ion channel modelling is described in detail. Other applications, including to multivariate stationary reward processes, counting processes associated with Markov renewal processes, the interpretation of Markov chain Monte Carlo runs and statistical inference on semi-Markov models are briefly outlined.
30

Ball, Frank. "Central limit theorems for multivariate semi-Markov sequences and processes, with applications." Journal of Applied Probability 36, no. 2 (June 1999): 415–32. http://dx.doi.org/10.1239/jap/1032374462.

Abstract:
In this paper, central limit theorems for multivariate semi-Markov sequences and processes are obtained, both as the number of jumps of the associated Markov chain tends to infinity and, if appropriate, as the time for which the process has been running tends to infinity. The theorems are widely applicable since many functions defined on Markov or semi-Markov processes can be analysed by exploiting appropriate embedded multivariate semi-Markov sequences. An application to a problem in ion channel modelling is described in detail. Other applications, including to multivariate stationary reward processes, counting processes associated with Markov renewal processes, the interpretation of Markov chain Monte Carlo runs and statistical inference on semi-Markov models are briefly outlined.
31

Ball, Frank G., Robin K. Milne, and Geoffrey F. Yeo. "Marked Continuous-Time Markov Chain Modelling of Burst Behaviour for Single Ion Channels." Journal of Applied Mathematics and Decision Sciences 2007 (October 29, 2007): 1–14. http://dx.doi.org/10.1155/2007/48138.

Abstract:
Patch clamp recordings from ion channels often show bursting behaviour, that is, periods of repetitive activity, which are noticeably separated from each other by periods of inactivity. A number of authors have obtained results for important properties of theoretical and empirical bursts when channel gating is modelled by a continuous-time Markov chain with a finite-state space. We show how the use of marked continuous-time Markov chains can simplify the derivation of (i) the distributions of several burst properties, including the total open time, the total charge transfer, and the number of openings in a burst, and (ii) the form of these distributions when the underlying gating process is time reversible and in equilibrium.
32

Nkemnole, Edesiri Bridget, and Ekene Nwaokoro. "Modelling Customer Relationships as Hidden Markov Chains." Path of Science 6, no. 11 (November 30, 2020): 5011–19. http://dx.doi.org/10.22178/pos.64-9.

Abstract:
Models in behavioural relationship marketing suggest that relations between the customer and the company change over time as a result of continued encounters. Some theoretical models have been put forward concerning relationship marketing, both from the standpoint of consumer behaviour and from that of empirical modelling. In addition to these, this study proposes the hidden Markov model (HMM) as a potential tool for assessing customer relationships. Specifically, the HMM is applied, within the framework of a Markov chain model, to classify the customer relationship dynamics of a telecommunication service company using an experimental data set. We develop and estimate an HMM relating the unobservable relationship states to the observed buying behaviour of the customers, giving an appropriate classification of the customers into the relationship states. By accounting for the functional and unobserved heterogeneity with a two-state hidden Markov model and estimating it with an optimal estimation method, the empirical results not only demonstrate the value of the proposed model in assessing the dynamics of a customer relationship over time but also yield optimal marketing-mix strategies for the different customer states.
33

Kwasniok, Frank. "Data-based stochastic subgrid-scale parametrization: an approach using cluster-weighted modelling." Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 370, no. 1962 (March 13, 2012): 1061–86. http://dx.doi.org/10.1098/rsta.2011.0384.

Abstract:
A new approach for data-based stochastic parametrization of unresolved scales and processes in numerical weather and climate prediction models is introduced. The subgrid-scale model is conditional on the state of the resolved scales, consisting of a collection of local models. A clustering algorithm in the space of the resolved variables is combined with statistical modelling of the impact of the unresolved variables. The clusters and the parameters of the associated subgrid models are estimated simultaneously from data. The method is implemented and explored in the framework of the Lorenz '96 model using discrete Markov processes as local statistical models. Performance of the cluster-weighted Markov chain scheme is investigated for long-term simulations as well as ensemble prediction. It clearly outperforms simple parametrization schemes and compares favourably with another recently proposed subgrid modelling scheme also based on conditional Markov chains.
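A rough sketch of the cluster-conditioned Markov chain idea: partition the resolved variable (here simply into quantile bins, standing in for the paper's clustering algorithm), then fit and sample a separate transition matrix for the discretised subgrid state within each bin. The data and dimensions are invented; this is not the Lorenz '96 setup.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy data standing in for (resolved variable X, discretised subgrid state U).
n = 20_000
X = rng.normal(size=n)
U = np.where(X + 0.5 * rng.normal(size=n) > 0, 1, 0)

# "Clusters" of the resolved variable: quantile bins of X (a stand-in for real clustering).
n_clusters, n_states = 3, 2
cluster = np.digitize(X, np.quantile(X, [1 / 3, 2 / 3]))

# One transition matrix for the subgrid state per cluster.
P = np.zeros((n_clusters, n_states, n_states))
for t in range(n - 1):
    P[cluster[t], U[t], U[t + 1]] += 1
P /= P.sum(axis=2, keepdims=True)

# Stochastic parametrization step: given the current cluster and subgrid state,
# draw the next subgrid state from the corresponding conditional Markov chain.
def next_subgrid_state(c, u):
    return rng.choice(n_states, p=P[c, u])

print("per-cluster transition matrices:\n", P)
print("sampled next subgrid state (cluster 2, state 1):", next_subgrid_state(2, 1))
```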
34

Lewy, P., and A. Nielsen. "Modelling stochastic fish stock dynamics using Markov Chain Monte Carlo." ICES Journal of Marine Science 60, no. 4 (January 1, 2003): 743–52. http://dx.doi.org/10.1016/s1054-3139(03)00080-8.

Abstract:
A new age-structured stock dynamics approach including stochastic survival and recruitment processes is developed and implemented. The model is able to analyse detailed sources of information used in standard age-based fish stock assessment such as catch-at-age and effort data from commercial fleets and research surveys. The stock numbers are treated as unobserved variables subject to process errors while the catches are observed variables subject to both sampling and process errors. Results obtained for North Sea plaice using Markov Chain Monte Carlo methods indicate that the process error by far accounts for most of the variation compared to sampling error. Comparison with results from a simpler separable model indicates that the new model provides more precise estimates with fewer parameters.
35

Voskoglou, Michael G. "An application of Markov chain to the process of modelling." International Journal of Mathematical Education in Science and Technology 25, no. 4 (July 1994): 475–80. http://dx.doi.org/10.1080/0020739940250401.

36

Stowasser, Markus. "Modelling rain risk: a multi‐order Markov chain model approach." Journal of Risk Finance 13, no. 1 (December 30, 2011): 45–60. http://dx.doi.org/10.1108/15265941211191930.

37

Anderson, E. J. "Markov chain modelling of the solution surface in local search." Journal of the Operational Research Society 53, no. 6 (June 2002): 630–36. http://dx.doi.org/10.1057/palgrave/jors/2601342.

38

Jasra, Ajay, David A. Stephens, Kerry Gallagher, and Christopher C. Holmes. "Bayesian Mixture Modelling in Geochronology via Markov Chain Monte Carlo." Mathematical Geology 38, no. 3 (April 2006): 269–300. http://dx.doi.org/10.1007/s11004-005-9019-3.

39

Abundo, M., and L. Caramellino. "Some remarks on a Markov chain modelling cooperative biological systems." Open Systems & Information Dynamics 3, no. 3 (October 1995): 325–43. http://dx.doi.org/10.1007/bf02228996.

40

Khan, Samiullah, and Mohammad Abdul Qadir. "Deterministic Time Markov Chain Modelling of Simultaneous Multipath Transmission Schemes." IEEE Access 5 (2017): 8536–44. http://dx.doi.org/10.1109/access.2017.2701769.

41

Kennedy, Rodney A., and Shin-Ho Chung. "Modelling and identification of coupled Markov chain model with application." International Journal of Adaptive Control and Signal Processing 10, no. 6 (November 1996): 623–34. http://dx.doi.org/10.1002/(sici)1099-1115(199611)10:6<623::aid-acs402>3.0.co;2-#.

42

Kalashnikov, Vladimir V. "Regeneration and general Markov chains." Journal of Applied Mathematics and Stochastic Analysis 7, no. 3 (January 1, 1994): 357–71. http://dx.doi.org/10.1155/s1048953394000304.

Abstract:
Ergodicity, continuity, finite approximations and rare visits of general Markov chains are investigated. The obtained results permit further quantitative analysis of characteristics such as rates of convergence, continuity (measured as a distance between perturbed and non-perturbed characteristics), deviations between Markov chains, accuracy of approximations and bounds on the distribution function of the first visit time to a chosen subset, etc. The underlying techniques use the embedding of the general Markov chain into a wide-sense regenerative process with the help of a splitting construction.
43

Gasbarra, Dario, José Igor Morlanes, and Esko Valkeila. "Initial Enlargement in a Markov Chain Market Model." Stochastics and Dynamics 11, no. 2–3 (September 2011): 389–413. http://dx.doi.org/10.1142/s021949371100336x.

Abstract:
Enlargement of filtrations is a classical topic in the general theory of stochastic processes. This theory has been applied to stochastic finance in order to analyze models with insider information. In this paper we study initial enlargement in a Markov chain market model, introduced by Norberg. In the enlarged filtration, several things can happen: some of the jump times can be accessible or predictable, while in the original filtration all the jump times are totally inaccessible. But even if the jump times become accessible or predictable, the insider does not necessarily have arbitrage possibilities.
44

Fawcett, Lee, and David Walshaw. "Markov chain models for extreme wind speeds." Environmetrics 17, no. 8 (2006): 795–809. http://dx.doi.org/10.1002/env.794.

45

Gerencsér, Balázs. "Markov chain mixing time on cycles." Stochastic Processes and their Applications 121, no. 11 (November 2011): 2553–70. http://dx.doi.org/10.1016/j.spa.2011.07.007.

46

Imkeller, Peter, and Peter Kloeden. "On the Computation of Invariant Measures in Random Dynamical Systems." Stochastics and Dynamics 3, no. 2 (June 2003): 247–65. http://dx.doi.org/10.1142/s0219493703000711.

Abstract:
Invariant measures of dynamical systems generated e.g. by difference equations can be computed by discretizing the original continuum state space and replacing the action of the generator by the transition mechanism of a Markov chain. In fact they are approximated by stationary vectors of these Markov chains. Here we extend this well-known approximation result and the underlying algorithm to the setting of random dynamical systems, i.e. dynamical systems on the skew product of a probability space carrying the underlying stationary stochasticity and the state space, a particular non-autonomous framework. The systems are generated by difference equations driven by stationary random processes modelled on a metric dynamical system. The approximation algorithm involves spatial discretizations and the definition of appropriate random Markov chains with stationary vectors converging to the random invariant measure of the system.
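The deterministic core of this approximation is easy to sketch (Ulam-style): partition the state space into cells, estimate cell-to-cell transition probabilities for a noisy map, and take the stationary vector of the resulting Markov chain as an approximate invariant measure. The noisy logistic map below is an invented stand-in for the random dynamical systems treated in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def f(x):
    # Noisy logistic map on [0, 1]; parameters are illustrative only.
    return np.clip(4.0 * x * (1.0 - x) + 0.01 * rng.normal(size=np.shape(x)), 0.0, 1.0)

# Ulam-style discretisation: 100 cells, transition probabilities estimated by sampling.
n_cells, n_samples = 100, 500
edges = np.linspace(0.0, 1.0, n_cells + 1)
P = np.zeros((n_cells, n_cells))
for i in range(n_cells):
    xs = rng.uniform(edges[i], edges[i + 1], n_samples)   # sample points in cell i
    js = np.minimum((f(xs) * n_cells).astype(int), n_cells - 1)
    for j in js:
        P[i, j] += 1.0
P /= P.sum(axis=1, keepdims=True)

# The stationary vector of the cell-to-cell chain approximates the invariant measure.
pi = np.ones(n_cells) / n_cells
for _ in range(2000):
    pi = pi @ P
print("approximate invariant density near x = 0.5:", pi[n_cells // 2] * n_cells)
```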
47

Keilson, J., and O. A. Vasicek. "Monotone measures of ergodicity for Markov chains." Journal of Applied Mathematics and Stochastic Analysis 11, no. 3 (January 1, 1998): 283–88. http://dx.doi.org/10.1155/s1048953398000239.

Abstract:
The following paper, first written in 1974, was never published other than as part of an internal research series. Its lack of publication is unrelated to the merits of the paper, and the paper is of current importance by virtue of its relation to the relaxation time. A systematic discussion is provided of the approach of a finite Markov chain to ergodicity by proving the monotonicity of an important set of norms, each a measure of ergodicity, whether or not time reversibility is present. The paper is of particular interest because the discussion of the relaxation time of a finite Markov chain [2] has only been clear for time-reversible chains, a small subset of the chains of interest. This restriction is not present here. Indeed, a new relaxation time quoted here quantifies the relaxation time for all finite ergodic chains (cf. the discussion of Q1(t) below Equation (1.7)). This relaxation time was developed by Keilson with A. Roy in his thesis [6], yet to be published.
48

Abdelkader, Eslam Mohammed, Tarek Zayed, and Mohamed Marzouk. "Modelling the Deterioration of Bridge Decks Based on Semi-Markov Decision Process." International Journal of Strategic Decision Sciences 10, no. 1 (January 2019): 23–45. http://dx.doi.org/10.4018/ijsds.2019010103.

Abstract:
Deterioration models represent a very important pillar for the effective use of bridge management systems (BMSs). This article presents a probabilistic time-based model that predicts the condition ratings of concrete bridge decks along their service life. The deterioration process of the concrete bridge decks is modeled using a semi-Markov decision process. The sojourn time of each condition state is fitted to a certain probability distribution based on goodness-of-fit tests. The parameters of the probability density functions are obtained using maximum likelihood estimation. The cumulative density functions are defined based on Latin hypercube sampling. Finally, a comparison is conducted between the Markov chain, semi-Markov chain, Weibull and gamma distributions to select the most accurate prediction model. Results indicate that the semi-Markov model outperformed the other models in terms of three performance indicators: root-mean-square error (RMSE), mean absolute error (MAE) and the chi-squared statistic (χ²).
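A schematic of a semi-Markov deterioration simulation of the kind described: condition ratings drop one state at a time with Weibull-distributed sojourn times; the shape and scale values below are invented, not the paper's fitted parameters.

```python
import numpy as np

rng = np.random.default_rng(4)

# Condition states 5 (good) down to 1 (poor); Weibull sojourn parameters (shape, scale in
# years) are invented for illustration, not fitted values.
sojourn = {5: (2.0, 12.0), 4: (1.8, 10.0), 3: (1.5, 8.0), 2: (1.3, 6.0)}

def simulate_transition_times():
    """Times at which one simulated deck drops to the next lower condition state."""
    t, times = 0.0, {}
    for state in (5, 4, 3, 2):
        shape, scale = sojourn[state]
        t += scale * rng.weibull(shape)      # sojourn time spent in this state
        times[state - 1] = t                 # time of entry into the next (worse) state
    return times

def condition_at(year, times):
    state = 5
    for next_state, t in sorted(times.items(), reverse=True):
        if year >= t:
            state = next_state
    return state

# Monte Carlo estimate of the expected condition rating over a 60-year horizon.
n_decks, horizon = 5000, 60
profiles = np.zeros((n_decks, horizon))
for d in range(n_decks):
    times = simulate_transition_times()
    profiles[d] = [condition_at(y, times) for y in range(horizon)]
print("expected condition rating at years 10, 30, 50:",
      np.round(profiles[:, [10, 30, 50]].mean(axis=0), 2))
```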
49

Faghih-Roohi, Shahrzad, Min Xie, and Kien Ming Ng. "Accident risk assessment in marine transportation via Markov modelling and Markov Chain Monte Carlo simulation." Ocean Engineering 91 (November 2014): 363–70. http://dx.doi.org/10.1016/j.oceaneng.2014.09.029.

50

Todorovic, P. "Remarks on a monotone Markov chain." Journal of Applied Mathematics and Simulation 1, no. 2 (January 1, 1987): 137–54. http://dx.doi.org/10.1155/s1048953388000103.

Abstract:
In applications, considerations on stochastic models often involve a Markov chain {ζ_n}_0^∞ with state space in R_+ and a transition probability Q. For each x ∈ R_+ the support of Q(x, ·) is [0, x]. This implies that ζ_0 ≥ ζ_1 ≥ …. Under certain regularity assumptions on Q we show that Q^n(x, B_u) → 1 as n → ∞ for all u > 0 and that 1 − Q^n(x, B_u) ≤ [1 − Q(x, B_u)]^n, where B_u = [0, u). Set τ_0 = max{k; ζ_k = ζ_0}, τ_n = max{k; ζ_k = ζ_{τ_{n−1}+1}} and write X_n = ζ_{τ_{n−1}+1}, T_n = τ_n − τ_{n−1}. We investigate some properties of the imbedded Markov chain {X_n}_0^∞ and of {T_n}_0^∞. We determine all the marginal distributions of {T_n}_0^∞ and show that it is asymptotically stationary and that it possesses a monotonicity property. We also prove that, under some mild regularity assumptions on β(x) = 1 − Q(x, B_x), (∑_{i=1}^n (T_i − a))/b_n →_d Z ∼ N(0, 1).