Academic literature on the topic 'Markov chain model'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Markov chain model.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Journal articles on the topic "Markov chain model"

1

Rai, Prerna, and Arvind Lal. "Google PageRank Algorithm: Markov Chain Model and Hidden Markov Model." International Journal of Computer Applications 138, no. 9 (March 17, 2016): 9–13. http://dx.doi.org/10.5120/ijca2016908942.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Hlynka, Myron, and Tolulope Sajobi. "A Markov Chain Fibonacci Model." Missouri Journal of Mathematical Sciences 20, no. 3 (October 2008): 186–99. http://dx.doi.org/10.35834/mjms/1316032778.

3

Berchtold, Andre. "The double chain markov model." Communications in Statistics - Theory and Methods 28, no. 11 (January 1999): 2569–89. http://dx.doi.org/10.1080/03610929908832439.

4

Yang, Chuan-sheng, Yu-jia Zheng, and Chao Wang. "Incremental multivariate Markov chain model." Journal of Engineering 2018, no. 16 (November 1, 2018): 1433–35. http://dx.doi.org/10.1049/joe.2018.8278.

5

Boys, R. J., and D. A. Henderson. "On Determining the Order of Markov Dependence of an Observed Process Governed by a Hidden Markov Model." Scientific Programming 10, no. 3 (2002): 241–51. http://dx.doi.org/10.1155/2002/683164.

Abstract:
This paper describes a Bayesian approach to determining the order of a finite state Markov chain whose transition probabilities are themselves governed by a homogeneous finite state Markov chain. It extends previous work on homogeneous Markov chains to more general and applicable hidden Markov models. The method we describe uses a Markov chain Monte Carlo algorithm to obtain samples from the (posterior) distribution for both the order of Markov dependence in the observed sequence and the other governing model parameters. These samples allow coherent inferences to be made straightforwardly in contrast to those which use information criteria. The methods are illustrated by their application to both simulated and real data sets.
6

Valenzuela, Mississippi. "Markov chains and applications." Selecciones Matemáticas 9, no. 01 (June 30, 2022): 53–78. http://dx.doi.org/10.17268/sel.mat.2022.01.05.

Abstract:
This work has three purposes: first, to study Markov chains; second, to show that Markov chains have many different applications; and finally, to model a process that behaves as one. Throughout this work we describe what a Markov chain is, what these processes are used for, and how these chains are classified. We also analyze the primary elements that make up a Markov chain, among other topics.
7

Gerontidis, Ioannis I. "Semi-Markov Replacement Chains." Advances in Applied Probability 26, no. 3 (September 1994): 728–55. http://dx.doi.org/10.2307/1427818.

Abstract:
We consider an absorbing semi-Markov chain for which each time absorption occurs there is a resetting of the chain according to some initial (replacement) distribution. The new process is a semi-Markov replacement chain and we study its properties in terms of those of the imbedded Markov replacement chain. A time-dependent version of the model is also defined and analysed asymptotically for two types of environmental behaviour, i.e. either convergent or cyclic. The results contribute to the control theory of semi-Markov chains and extend in a natural manner a wide variety of applied probability models. An application to the modelling of populations with semi-Markovian replacements is also presented.
8

Gerontidis, Ioannis I. "Semi-Markov Replacement Chains." Advances in Applied Probability 26, no. 03 (September 1994): 728–55. http://dx.doi.org/10.1017/s0001867800026525.

Abstract:
We consider an absorbing semi-Markov chain for which each time absorption occurs there is a resetting of the chain according to some initial (replacement) distribution. The new process is a semi-Markov replacement chain and we study its properties in terms of those of the imbedded Markov replacement chain. A time-dependent version of the model is also defined and analysed asymptotically for two types of environmental behaviour, i.e. either convergent or cyclic. The results contribute to the control theory of semi-Markov chains and extend in a natural manner a wide variety of applied probability models. An application to the modelling of populations with semi-Markovian replacements is also presented.
9

Kwon, Hyun-Han, and Byung-Sik Kim. "Development of Statistical Downscaling Model Using Nonstationary Markov Chain." Journal of Korea Water Resources Association 42, no. 3 (March 31, 2009): 213–25. http://dx.doi.org/10.3741/jkwra.2009.42.3.213.

10

Umurzakov, Uktam. "PREDICTION OF PRICES FOR AGRICULTURAL PRODUCTS THROUGH MARKOV CHAIN MODEL." International Journal of Psychosocial Rehabilitation 24, no. 03 (February 18, 2020): 293–303. http://dx.doi.org/10.37200/ijpr/v24i3/pr200782.


Dissertations / Theses on the topic "Markov chain model"

1

Yildirak, Sahap Kasirga. "The Identification of a Bivariate Markov Chain Market Model." PhD thesis, METU, 2004. http://etd.lib.metu.edu.tr/upload/1257898/index.pdf.

Abstract:
This work is an extension of the classical Cox-Ross-Rubinstein discrete time market model, in which only one risky asset is considered. We introduce another risky asset into the model. Moreover, the random structure of the asset price sequence is generated by a bivariate finite state Markov chain, and the interest rate varies over time as a function of the generating sequences. We discuss how the model can be adapted to real data. Finally, we illustrate sample implementations to give a better idea of the use of the model.
2

Jindasawat, Jutaporn. "Testing the order of a Markov chain model." Thesis, University of Newcastle Upon Tyne, 2008. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.446197.

3

Mehl, Christopher. "Bayesian Hierarchical Modeling and Markov Chain Simulation for Chronic Wasting Disease." Diss., University of Colorado at Denver, 2004. http://hdl.handle.net/10919/71563.

Abstract:
In this thesis, a dynamic spatial model for the spread of Chronic Wasting Disease in Colorado mule deer is derived from a system of differential equations that captures the qualitative spatial and temporal behaviour of the disease. These differential equations are incorporated into an empirical Bayesian hierarchical model through the unusual step of deterministic autoregressive updates. Spatial effects in the model are described directly in the differential equations rather than through the use of correlations in the data. The use of deterministic updates is a simplification that reduces the number of parameters that must be estimated, yet still provides a flexible model that gives reasonable predictions for the disease. The posterior distribution generated by the data model hierarchy possesses characteristics that are atypical for many Markov chain Monte Carlo simulation techniques. To address these difficulties, a new MCMC technique is developed that has qualities similar to recently introduced tempered Langevin type algorithms. The methodology is used to fit the CWD model, and posterior parameter estimates are then used to obtain predictions about Chronic Wasting Disease.
4

Au, Chi Yan. "Numerical methods for solving Markov chain driven Black-Scholes model." HKBU Institutional Repository, 2010. http://repository.hkbu.edu.hk/etd_ra/1154.

5

Yapo, Patrice Ogou. "A Markov chain flow model with application to flood forecasting." Thesis, The University of Arizona, 1992. http://hdl.handle.net/10150/278135.

Abstract:
This thesis presents a new approach to streamflow forecasting. The approach is based on specifying the probabilities that the next flow of a stream will occur within different ranges of values. Hence, this method differs from time series models, where point estimates are given as forecasts. With this approach, flood forecasting is possible by focusing on a preselected range of streamflows. A two-criteria objective function is developed to assess the model's performance in flood prediction. Three case studies are examined based on data from the Salt River in Phoenix, Arizona, and Bird Creek near Sperry, Oklahoma. The models presented are: a first order Markov chain (FOMC), a second order Markov chain (SOMC), and a first order Markov chain with rainfall as an exogenous input (FOMCX). The three forecasting methodologies are compared with each other and against time series models. It is shown that the SOMC is better than the FOMC, while the FOMCX is better than the time series models.
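The first-order Markov chain (FOMC) forecast the abstract describes, reporting a probability for each flow range rather than a point estimate, can be sketched in a few lines. The flow values and range thresholds below are invented for illustration, not taken from the Salt River or Bird Creek data.

```python
import numpy as np

def fit_fomc(series, bins):
    """Estimate a first-order Markov chain over discretized flow ranges.

    series: 1-D array of flows; bins: increasing interior thresholds.
    Returns the row-stochastic transition matrix P.
    """
    states = np.digitize(series, bins)          # map each flow to a range index
    n = len(bins) + 1
    counts = np.zeros((n, n))
    for a, b in zip(states[:-1], states[1:]):   # count observed transitions
        counts[a, b] += 1
    rows = counts.sum(axis=1, keepdims=True)
    rows[rows == 0] = 1                         # avoid dividing by zero for unseen states
    return counts / rows

# Hypothetical daily flows and range thresholds (low / medium / high)
flows = np.array([12, 15, 40, 55, 18, 11, 60, 70, 22, 14, 13, 65])
P = fit_fomc(flows, bins=[20, 50])

current = np.digitize([flows[-1]], [20, 50])[0]
print(P[current])  # forecast: probability of each flow range at the next step
```

The forecast is simply the transition-matrix row indexed by the current flow range; a second-order variant (SOMC) would condition on the last two ranges instead.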
6

Neuhoff, Daniel. "Reversible Jump Markov Chain Monte Carlo." Doctoral thesis, Humboldt-Universität zu Berlin, Wirtschaftswissenschaftliche Fakultät, 2016. http://dx.doi.org/10.18452/17461.

Abstract:
The four studies of this thesis are concerned predominantly with the dynamics of macroeconomic time series. These dynamics are examined both in the context of a simple DSGE model and from a pure time series modeling perspective.
7

Kharbouch, Alaa Amin. "A bacterial algorithm for surface mapping using a Markov modulated Markov chain model of bacterial chemotaxis." Thesis, Massachusetts Institute of Technology, 2006. http://hdl.handle.net/1721.1/36186.

Abstract:
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2006.
Includes bibliographical references (p. 83-85).
Bacterial chemotaxis is the locomotory response of bacteria to chemical stimuli. E. coli movement can be described as a biased random walk, and it is known that the general biological or evolutionary function is to increase exposure to some substances and reduce exposure to others. In this thesis we introduce an algorithm for surface mapping, which tracks the motion of a bacteria-like software agent (based on a low-level model of the biochemical network responsible for chemotaxis) on a surface or objective function. Towards that end, a discrete Markov-modulated Markov chain model of the chemotaxis pathway is described and used. Results from simulations using one- and two-dimensional test surfaces show that the software agents, referred to as bacterial agents, and the surface mapping algorithm can produce an estimate which shares some broad characteristics with the surface and uncovers some of its features. We also demonstrate that the bacterial agent, when given the ability to reduce the value of the surface at locations it visits (analogous to consuming a substance on a concentration surface), is more effective in reducing the surface integral within a certain period of time than a bacterial agent lacking the ability to sense surface information or respond to it.
8

Webb, Jared Anthony. "A Topics Analysis Model for Health Insurance Claims." BYU ScholarsArchive, 2013. https://scholarsarchive.byu.edu/etd/3805.

Abstract:
Mathematical probability has a rich theory and powerful applications. Of particular note is the Markov chain Monte Carlo (MCMC) method for sampling from high dimensional distributions that may not admit a naive analysis. We develop the theory of the MCMC method from first principles and prove its relevance. We also define a Bayesian hierarchical model for generating data. By understanding how data are generated we may infer hidden structure about these models. We use a specific MCMC method called a Gibbs' sampler to discover topic distributions in a hierarchical Bayesian model called Topics Over Time. We propose an innovative use of this model to discover disease and treatment topics in a corpus of health insurance claims data. By representing individuals as mixtures of topics, we are able to consider their future costs on an individual level rather than as part of a large collective.
9

Nasrallah, Yamen. "Enhanced IEEE 802.11p-Based MAC Protocols for Vehicular Ad hoc Networks." Thesis, Université d'Ottawa / University of Ottawa, 2017. http://hdl.handle.net/10393/36168.

Abstract:
The Intelligent Transportation System (ITS) is a cooperative system that relies on reliable and robust communication schemes among vehicles and between vehicles and their surroundings. The main objective of the ITS is to ensure the safety of vehicle drivers and pedestrians. It provides an efficient and reliable transportation system that enhances traffic management, reduces congestion time, enables smooth traffic re-routing, and avoids economic losses. An essential part of the ITS is the Vehicular Ad hoc Network (VANET). VANET enables the setup of Vehicle-to-Vehicle (V2V) as well as Vehicle-to-Infrastructure (V2I) communication platforms: the two key components of the ITS. The de-facto standard used in wireless V2V and V2I communication applications is the Dedicated Short Range Communication (DSRC). The protocol that defines the specifications for the Medium Access Control (MAC) layer and the physical layer in the DSRC is the IEEE 802.11p protocol. The IEEE 802.11p protocol and its Enhanced Distributed Channel Access (EDCA) mechanism are the main focus of this thesis. Our main objective is to develop new IEEE 802.11p-based protocols for V2V and V2I communication systems, to improve the performance of safety-related applications. These applications are of paramount importance in the ITS, because their goal is to decrease the rate of vehicle collisions, and hence reduce the enormous costs associated with them. In fact, a large percentage of vehicle collisions can be easily avoided through the exchange of relevant information between vehicles and the Road Side Units (RSUs) installed along the roads. In this thesis, we propose various enhancements to the IEEE 802.11p protocol that improve its performance by lowering the average end-to-end delay and increasing the average network throughput. We introduce multiple adaptive algorithms to improve QoS support across all the Access Categories (ACs) in IEEE 802.11p.
We propose two adaptive backoff algorithms and two algorithms that adaptively change the values of the Arbitrary Inter-Frame Space (AIFS). Then we extend our model to be applied in a large-scale vehicular network. In this context, a multi-layer cluster-based architecture is adopted, and two new distributed time synchronization mechanisms are developed.
10

Mamudu, Lohuwa. "Modeling Student Enrollment at ETSU Using a Discrete-Time Markov Chain Model." Digital Commons @ East Tennessee State University, 2017. https://dc.etsu.edu/etd/3310.

Abstract:
Discrete-time Markov chain models can be used to make future predictions in many important fields including education. Government and educational institutions today are concerned about college enrollment and what impacts the number of students enrolling. One challenge is how to make an accurate prediction about student enrollment so institutions can plan appropriately. In this thesis, we model student enrollment at East Tennessee State University (ETSU) with a discrete-time Markov chain model developed using ETSU student data from Fall 2008 to Spring 2017. In this thesis, we focus on the progression from one level to another within the university system including graduation and dropout probabilities as indicated by the data. We further include the probability that a student will leave school for a limited period of time and then return to the institution. We conclude with a simulation of the model and a comparison to the trends seen in the data.
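A discrete-time enrollment chain of the kind this abstract describes can be sketched with an absorbing-state transition matrix. The states and probabilities below are hypothetical, not the ETSU estimates.

```python
import numpy as np

# States: 0=freshman, 1=sophomore, 2=junior, 3=senior, 4=stop-out,
# 5=graduated (absorbing), 6=dropped out (absorbing). All numbers invented.
P = np.array([
    [0.15, 0.60, 0.00, 0.00, 0.10, 0.00, 0.15],  # freshman
    [0.00, 0.15, 0.60, 0.00, 0.10, 0.00, 0.15],  # sophomore
    [0.00, 0.00, 0.15, 0.65, 0.08, 0.02, 0.10],  # junior
    [0.00, 0.00, 0.00, 0.20, 0.05, 0.65, 0.10],  # senior
    [0.10, 0.10, 0.10, 0.10, 0.30, 0.00, 0.30],  # stop-out may return
    [0.00, 0.00, 0.00, 0.00, 0.00, 1.00, 0.00],  # graduated (absorbing)
    [0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 1.00],  # dropped out (absorbing)
])

start = np.zeros(7)
start[0] = 1.0                                # a cohort of entering freshmen
dist = start @ np.linalg.matrix_power(P, 12)  # distribution after 12 semesters
print(dist[5], dist[6])  # fractions graduated and dropped out so far
```

Because 'graduated' and 'dropped out' are absorbing, repeated multiplication by P drives the cohort distribution toward those two states, while the 'stop-out' row lets students leave temporarily and return.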

Books on the topic "Markov chain model"

1

Banisch, Sven. Markov Chain Aggregation for Agent-Based Models. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-24877-6.

2

Ching, Wai-Ki. Markov Chains: Models, Algorithms and Applications. 2nd ed. Boston, MA: Springer US, 2013.

Find full text
3

Meyer, Carl D., and Robert J. Plemmons, eds. Linear Algebra, Markov Chains, and Queueing Models. New York, NY: Springer New York, 1993. http://dx.doi.org/10.1007/978-1-4613-8351-2.

4

Yücesan, Enver. Analysis of Markov chains using simulation graph models. Fontainebleau: INSEAD, 1990.

5

Bioinformatics: Sequence alignment and Markov models. New York: McGraw-Hill, 2009.

6

Saad, Y. Projection methods for the numerical solution of Markov chain models. [Moffett Field, Calif.]: Research Institute for Advanced Computer Science, NASA Ames Research Center, 1989.

7

Penny, D. Modeling the covarion model of molecular evolution by hidden Markov chains. Palmerston North, N.Z: Massey University College of Sciences, 1998.

8

Mo, Jeonghoon. Performance modeling of communication networks with Markov chains. San Rafael, Calif.: Morgan & Claypool, 2010.

9

Limnios, N., ed. Semi-Markov chains and hidden semi-Markov models toward applications: Their use in reliability and DNA analysis. New York: Springer, 2008.

10

Bagnoli, Carlo, Alessia Bravin, Maurizio Massaro, and Alessandra Vignotto. Business Model 4.0. Venice: Edizioni Ca' Foscari, 2018. http://dx.doi.org/10.30687/978-88-6969-286-4.

Abstract:
The manufacturing digital transformation is changing industry through the introduction of advanced solutions that allow companies to re-interpret their role along the value chain. The industrial revolution opens up great opportunities for Italian companies, in terms of process efficiency, cost reduction and improvement in productivity, but also in the rethinking of products, new services, and the ability to react to market needs. This report examines the possible impact of Industry 4.0 on business models, considering technological innovation also as a driver of strategic innovation.

Book chapters on the topic "Markov chain model"

1

Mehrdoust, Farshid. "Markov Chain Monte Carlo Model." In Encyclopedia of Social Network Analysis and Mining, 1–14. New York, NY: Springer New York, 2016. http://dx.doi.org/10.1007/978-1-4614-7163-9_150-1.

2

Mehrdoust, Farshid. "Markov Chain Monte Carlo Model." In Encyclopedia of Social Network Analysis and Mining, 857–70. New York, NY: Springer New York, 2014. http://dx.doi.org/10.1007/978-1-4614-6170-8_150.

3

Eshima, Nobuoki. "The Latent Markov Chain Model." In An Introduction to Latent Class Analysis, 121–47. Singapore: Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-0972-6_5.

4

Hermanns, Holger, Joost-Pieter Katoen, Joachim Meyer-Kayser, and Markus Siegle. "A Markov Chain Model Checker." In Tools and Algorithms for the Construction and Analysis of Systems, 347–62. Berlin, Heidelberg: Springer Berlin Heidelberg, 2000. http://dx.doi.org/10.1007/3-540-46419-0_24.

5

Mehrdoust, Farshid. "Markov Chain Monte Carlo Model." In Encyclopedia of Social Network Analysis and Mining, 1247–60. New York, NY: Springer New York, 2018. http://dx.doi.org/10.1007/978-1-4939-7131-2_150.

6

Shen, Dong. "Markov Chain Model for Linear Systems." In Iterative Learning Control with Passive Incomplete Information, 133–60. Singapore: Springer Singapore, 2018. http://dx.doi.org/10.1007/978-981-10-8267-2_7.

7

Eshima, Nobuoki. "The Mixed Latent Markov Chain Model." In An Introduction to Latent Class Analysis, 149–59. Singapore: Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-0972-6_6.

8

Sakumura, Yuichi, Norio Konno, and Kazuyuki Aihara. "Markov Chain Model Approximating the Hodgkin-Huxley Neuron." In Artificial Neural Networks — ICANN 2001, 1153–60. Berlin, Heidelberg: Springer Berlin Heidelberg, 2001. http://dx.doi.org/10.1007/3-540-44668-0_161.

9

Zavalishchin, Dmitry, and Galina Timofeeva. "Construction of Confidence Sets for Markov Chain Model." In Lecture Notes in Electrical Engineering, 253–63. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-43671-5_22.

10

Kaliakatsos-Papakostas, Maximos A., Michael G. Epitropakis, and Michael N. Vrahatis. "Weighted Markov Chain Model for Musical Composer Identification." In Applications of Evolutionary Computation, 334–43. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-20520-0_34.


Conference papers on the topic "Markov chain model"

1

García, Jesús E., S. L. M. Londoño, and Thainá Soares. "Optimal model for a Markov chain with Markov covariates." In INTERNATIONAL CONFERENCE OF NUMERICAL ANALYSIS AND APPLIED MATHEMATICS ICNAAM 2019. AIP Publishing, 2020. http://dx.doi.org/10.1063/5.0026429.

2

Saad, Syafawati Ab, Farah Adibah Adnan, Haslinda Ibrahim, and Rahela Rahim. "Manpower planning using Markov Chain model." In PROCEEDINGS OF THE 21ST NATIONAL SYMPOSIUM ON MATHEMATICAL SCIENCES (SKSM21): Germination of Mathematical Sciences Education and Research towards Global Sustainability. AIP Publishing LLC, 2014. http://dx.doi.org/10.1063/1.4887748.

3

Ma, Huiqun, Ling Liu, and Tao Chen. "Assessment Model Based on Markov Chain." In 2008 Fifth International Conference on Fuzzy Systems and Knowledge Discovery (FSKD). IEEE, 2008. http://dx.doi.org/10.1109/fskd.2008.212.

4

Zheludev, Michael, and Evgeny Nagradov. "Anomaly detection using Markov chain model." In 2017 Computer Science and Information Technologies (CSIT). IEEE, 2017. http://dx.doi.org/10.1109/csitechnol.2017.8312166.

5

Liu, Xiaofan, Liliang Ren, Fei Yuan, and Bang Yang. "Meteorological Drought Forecasting Using Markov Chain Model." In 2009 International Conference on Environmental Science and Information Application Technology, ESIAT. IEEE, 2009. http://dx.doi.org/10.1109/esiat.2009.19.

6

Li, Liang, Qi-sheng Guo, and Xiu-yue Yang. "Evaluation method based on Markov chain model." In 2008 Asia Simulation Conference - 7th International Conference on System Simulation and Scientific Computing (ICSC). IEEE, 2008. http://dx.doi.org/10.1109/asc-icsc.2008.4675615.

7

Zhou, Qingxin. "Markov Chain Combination Prediction Model and Its Application in Stock Market." In Proceedings of the 2018 5th International Conference on Education, Management, Arts, Economics and Social Science (ICEMAESS 2018). Paris, France: Atlantis Press, 2018. http://dx.doi.org/10.2991/icemaess-18.2018.43.

8

Wang, Jingmin, Junjie Kang, Yanfu Sun, and Duanmei Liu. "Load forecasting based on GM - Markov chain model." In 2010 Second Pacific-Asia Conference on Circuits, Communications and System (PACCS). IEEE, 2010. http://dx.doi.org/10.1109/paccs.2010.5627058.

9

Tian, Jin, Ren-Ping Liu, and Shang-Jing Lin. "A Markov Chain Analysis Model of IEEE 802.11p." In the 2017 International Conference. New York, New York, USA: ACM Press, 2017. http://dx.doi.org/10.1145/3180496.3180598.

10

Song, Na, Wai-Ki Ching, Tak-Kuen Siu, Eric S. Fung, and Michael K. Ng. "Option Valuation under a Multivariate Markov Chain Model." In 2010 Third International Joint Conference on Computational Science and Optimization. IEEE, 2010. http://dx.doi.org/10.1109/cso.2010.73.


Reports on the topic "Markov chain model"

1

Edmunds, T. A. A Markov Chain Model for evaluating the effectiveness of randomized surveillance procedures. Office of Scientific and Technical Information (OSTI), January 1994. http://dx.doi.org/10.2172/10142261.

2

Tollar, Eric S. On the Limit Behavior of a Multi-Compartment Storage Model with an Underlying Markov Chain. I. Without Normalization. Fort Belvoir, VA: Defense Technical Information Center, February 1985. http://dx.doi.org/10.21236/ada161293.

3

Tollar, Eric S. On the Limit Behavior of a Multi-Compartment Storage Model with an Underlying Markov Chain. II. With Normalization. Fort Belvoir, VA: Defense Technical Information Center, February 1985. http://dx.doi.org/10.21236/ada161661.

4

Skahill, Brian, and Jeffrey Baggett. A practical two-phase approach to improve the reliability and efficiency of Markov chain Monte Carlo directed hydrologic model calibration. Engineer Research and Development Center (U.S.), March 2020. http://dx.doi.org/10.21079/11681/35753.

5

Wereley, Norman M., and Bruce K. Walker. Approximate Evaluation of Semi-Markov Chain Reliability Models. Fort Belvoir, VA: Defense Technical Information Center, February 1988. http://dx.doi.org/10.21236/ada194669.

6

Thompson, Theodore J., James P. Boyle, and Douglas J. Hentschel. Markov Chains for Random Urinalysis 1: Age-Test Model. Fort Belvoir, VA: Defense Technical Information Center, March 1993. http://dx.doi.org/10.21236/ada263274.

7

Krakowski, Martin. Models of Coin-Tossing for Markov Chains. Revision. Fort Belvoir, VA: Defense Technical Information Center, December 1987. http://dx.doi.org/10.21236/ada196572.

8

Soloviev, V., V. Saptsin, and D. Chabanenko. Financial time series prediction with the technology of complex Markov chains. Брама-Україна, 2014. http://dx.doi.org/10.31812/0564/1305.

Abstract:
In this research the technology of complex Markov chains, i.e. Markov chains with memory, is applied to forecast financial time series. The main distinction between complex, or high-order, Markov chains and simple first-order ones is the existence of aftereffect, or memory. High-order Markov chains can be reduced to first-order ones by generalizing the states of the chain: considering a 'generalized state' to be a sequence of states makes it possible to model high-order Markov chains like first-order ones. An adaptive method of defining the states is proposed, based on the statistical properties of price returns.
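The reduction this abstract describes, treating a sequence of states as one "generalized state" so that a high-order chain becomes first-order, can be sketched as follows; the up/down return symbols are hypothetical.

```python
from collections import Counter, defaultdict

def fit_second_order(seq):
    """Fit a second-order Markov chain as a first-order chain whose
    states are the pairs (x_{t-1}, x_t) -- the 'generalized states'."""
    counts = defaultdict(Counter)
    for a, b, c in zip(seq, seq[1:], seq[2:]):
        counts[(a, b)][c] += 1          # transition (a,b) -> (b,c)
    return {
        pair: {nxt: n / sum(ctr.values()) for nxt, n in ctr.items()}
        for pair, ctr in counts.items()
    }

# Hypothetical up/down price-return symbols
seq = "UUDUDUUDDUUDU"
model = fit_second_order(seq)
print(model[("U", "U")])  # P(next symbol | last two symbols were U, U)
```

The same trick extends to any order k by keying on k-tuples of past states, at the cost of an exponentially larger generalized state space.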
9

Boyle, James P., Douglas J. Hentschel, and Theodore J. Thompson. Markov Chains for Random Urinalysis II: Age-Test Model with Absorbing State. Fort Belvoir, VA: Defense Technical Information Center, May 1993. http://dx.doi.org/10.21236/ada265557.

10

Smith, Wayne D., Karen D. Burns, and Jane N. Moorhead. Using Markov Chains to Model the Error Behavior of Data Communications Channels. Fort Belvoir, VA: Defense Technical Information Center, May 1993. http://dx.doi.org/10.21236/ada266468.
