Journal articles on the topic 'Markov'


Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic 'Markov.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Sauvageot, Jean-Luc. "Markov quantum semigroups admit covariant Markov C*-dilations." Communications in Mathematical Physics 106, no. 1 (March 1986): 91–103. http://dx.doi.org/10.1007/bf01210927.

2

Guyon, X., and C. Hardouin. "Markov chain Markov field dynamics: models and statistics." Statistics 35, no. 4 (January 2001): 593–627. http://dx.doi.org/10.1080/02331880108802756.

3

Kulkarni, V. G., and V. G. Adlakha. "Markov and Markov-Regenerative PERT Networks." Operations Research 34, no. 5 (October 1986): 769–81. http://dx.doi.org/10.1287/opre.34.5.769.

4

Grewal, Jasleen K., Martin Krzywinski, and Naomi Altman. "Markov models—Markov chains." Nature Methods 16, no. 8 (July 30, 2019): 663–64. http://dx.doi.org/10.1038/s41592-019-0476-x.

5

Grewal, Jasleen K., Martin Krzywinski, and Naomi Altman. "Markov models — hidden Markov models." Nature Methods 16, no. 9 (August 30, 2019): 795–96. http://dx.doi.org/10.1038/s41592-019-0532-6.

6

Guédon, Yann. "Hidden hybrid Markov/semi-Markov chains." Computational Statistics & Data Analysis 49, no. 3 (June 2005): 663–88. http://dx.doi.org/10.1016/j.csda.2004.05.033.

7

Chandgotia, Nishant, Guangyue Han, Brian Marcus, Tom Meyerovitch, and Ronnie Pavlov. "One-dimensional Markov random fields, Markov chains and topological Markov fields." Proceedings of the American Mathematical Society 142, no. 1 (October 3, 2013): 227–42. http://dx.doi.org/10.1090/s0002-9939-2013-11741-7.

8

Rofiroh, Rofiroh. "APLIKASI RANTAI MARKOV PADA PENENTUAN HARI BERSALJU DI BEBERAPA KOTA AMERIKA SERIKAT." STATMAT : JURNAL STATISTIKA DAN MATEMATIKA 2, no. 2 (July 31, 2020): 28. http://dx.doi.org/10.32493/sm.v2i2.5435.

Abstract:
This research models a stochastic process. The method used is the Markov chain method, in which the forthcoming state is influenced only by the immediately preceding state. The method was applied to observational snowy-day data at eight observation stations in the United States: New York, Sedro-Woolley, Glendive, Willow City, Del Norte, Medford, Charleston, and Blue Hill. The purpose of this study is to determine the direction of convergence of the n-step transition probabilities and the probability distribution of a three-state Markov chain. Based on data processing using Matlab software, diagonal matrices, and the spectral theorem, the same result was obtained for the convergence of the transition matrix of each observation station, which is influenced by the difference in the transition probabilities between two states. Keywords: Markov Chain, Snowy Days, Transition Matrix
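The convergence the abstract describes can be sketched in a few lines. The three-state transition matrix below is hypothetical, not data from the paper: raising a regular transition matrix to a high power drives every row toward the same stationary distribution, which is also recoverable from the spectral decomposition the authors mention.

```python
import numpy as np

# Hypothetical 3-state transition matrix (rows sum to 1); the states
# could stand for e.g. no snow / light snow / heavy snow.
P = np.array([
    [0.7, 0.2, 0.1],
    [0.3, 0.5, 0.2],
    [0.2, 0.3, 0.5],
])

# n-step transition probabilities: as n grows, every row of P^n
# approaches the same stationary distribution pi.
P_50 = np.linalg.matrix_power(P, 50)

# Spectral approach: pi is the left eigenvector of P for eigenvalue 1,
# normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
pi = pi / pi.sum()

# Every row of P^50 has converged to pi.
print(np.allclose(P_50, np.tile(pi, (3, 1))))
```

The speed of convergence is governed by the second-largest eigenvalue modulus of P, which is the "difference in transition probabilities" effect the abstract alludes to.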
9

Demenkov, N. P., E. A. Mirkin, and I. A. Mochalov. "Markov and Semi-Markov Processes with Fuzzy States. Part 1. Markov Processes." Informacionnye tehnologii 26, no. 6 (June 23, 2020): 323–34. http://dx.doi.org/10.17587/it.26.323-334.

10

Alimov, D. "Markov Functionals of an Ergodic Markov Process." Theory of Probability & Its Applications 39, no. 3 (January 1995): 504–12. http://dx.doi.org/10.1137/1139035.

11

Kilgore, Theodore, and R. A. Zalik. "Splicing of Markov and weak Markov systems." Journal of Approximation Theory 59, no. 1 (October 1989): 2–11. http://dx.doi.org/10.1016/0021-9045(89)90156-1.

12

Iwata, Yukiko. "Constrictive Markov operators induced by Markov processes." Positivity 20, no. 2 (September 3, 2015): 355–67. http://dx.doi.org/10.1007/s11117-015-0360-6.

13

Pachet, François, and Pierre Roy. "Markov constraints: steerable generation of Markov sequences." Constraints 16, no. 2 (September 18, 2010): 148–72. http://dx.doi.org/10.1007/s10601-010-9101-4.

14

Verbeken, Brecht, and Marie-Anne Guerry. "Attainability for Markov and Semi-Markov Chains." Mathematics 12, no. 8 (April 19, 2024): 1227. http://dx.doi.org/10.3390/math12081227.

Abstract:
When studying Markov chain models and semi-Markov chain models, it is useful to know which state vectors n, where each component n_i represents the number of entities in the state S_i, can be maintained or attained. This question leads to the definitions of maintainability and attainability for (time-homogeneous) Markov chain models. Recently, the definition of maintainability was extended to the concept of state reunion maintainability (SR-maintainability) for semi-Markov chains. Within the framework of semi-Markov chains, the states are subdivided further into seniority-based states. State reunion maintainability assesses the maintainability of the distribution across states. Following this idea, we introduce the concept of state reunion attainability, which encompasses the potential of a system to attain a specific distribution across the states after uniting the seniority-based states into the underlying states. In this paper, we start by extending the concept of attainability for constant-sized Markov chain models to systems that are subject to growth or contraction. Afterwards, we introduce the concepts of attainability and state reunion attainability for semi-Markov chain models, using SR-maintainability as a starting point. The attainable region, as well as the state reunion attainable region, are described as the convex hull of their respective vertices, and properties of these regions are investigated.
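As a toy illustration of the simplest case the paper generalizes (a hypothetical three-state matrix; the paper's seniority-based and growth-adjusted notions are much richer), a distribution s over the states of a constant-sized, closed Markov system is maintainable exactly when s P = s:

```python
import numpy as np

# Hypothetical 3-state stochastic matrix for a constant-sized system.
P = np.array([
    [0.8, 0.2, 0.0],
    [0.1, 0.7, 0.2],
    [0.1, 0.1, 0.8],
])

def is_maintainable(s, P):
    """In the simplest constant-size, closed-system setting, a
    distribution s over the states is maintainable iff s P = s."""
    s = np.asarray(s, dtype=float)
    return bool(np.allclose(s @ P, s))

# The stationary distribution is maintainable by construction...
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
pi = pi / pi.sum()
print(is_maintainable(pi, P))               # True

# ...while a point mass on state 1 is not: 20% of it flows to state 2.
print(is_maintainable([1.0, 0.0, 0.0], P))  # False
```

The attainable region discussed in the paper collects all distributions reachable under repeated application of such dynamics, described as the convex hull of its vertices.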
15

Demenkov, N. P., E. A. Mirkin, and I. A. Mochalov. "Markov and Semi-Markov Processes with Fuzzy States. Part 2. Semi-Markov Processes." INFORMACIONNYE TEHNOLOGII 26, no. 7 (July 17, 2020): 387–93. http://dx.doi.org/10.17587/it.26.387-393.

16

Lindley, D. V., and J. R. Norris. "Markov Chains." Mathematical Gazette 83, no. 496 (March 1999): 188. http://dx.doi.org/10.2307/3618756.

17

Lund, Robert B., and J. R. Norris. "Markov Chains." Journal of the American Statistical Association 94, no. 446 (June 1999): 654. http://dx.doi.org/10.2307/2670196.

18

Deshmukh, S. R. "Markov Sampling." Australian & New Zealand Journal of Statistics 42, no. 3 (September 2000): 337–45. http://dx.doi.org/10.1111/1467-842x.00130.

19

Frank, Ove, and David Strauss. "Markov Graphs." Journal of the American Statistical Association 81, no. 395 (September 1986): 832–42. http://dx.doi.org/10.1080/01621459.1986.10478342.

20

Nord, Erik. "Markov-modellering." Tidsskrift for Den norske legeforening 134, no. 21 (2014): 2062–65. http://dx.doi.org/10.4045/tidsskr.14.0116.

21

Karlin, Anna R., Steven J. Phillips, and Prabhakar Raghavan. "Markov Paging." SIAM Journal on Computing 30, no. 3 (January 2000): 906–22. http://dx.doi.org/10.1137/s0097539794268042.

22

Madigan, David, Michael D. Perlman, and Michael Levitz. "Markov Models." Annals of Statistics 29, no. 6 (December 2001): 1751–84. http://dx.doi.org/10.1214/aos/1015345961.

23

Arkoubi, Khadija. "MARKOV CHAIN." International Journal of Scientific and Engineering Research 7, no. 3 (March 25, 2016): 706–7. http://dx.doi.org/10.14299/ijser.2016.03.009.

24

Bavaud, François, and Aris Xanthos. "Markov Associativities." Journal of Quantitative Linguistics 12, no. 2-3 (August 2005): 123–37. http://dx.doi.org/10.1080/09296170500172437.

25

Jacquet, P., and W. Szpankowski. "Markov Types and Minimax Redundancy for Markov Sources." IEEE Transactions on Information Theory 50, no. 7 (July 2004): 1393–402. http://dx.doi.org/10.1109/tit.2004.830765.

26

Bhat, B. R., and Sunita K. Deshpande. "Testing for markov process vs semi-markov process." Communications in Statistics - Theory and Methods 15, no. 8 (January 1986): 2375–82. http://dx.doi.org/10.1080/03610928608829255.

27

Guyon, X., and C. Hardouin. "Markov Chain Markov Field dynamics: Models and statistics." Statistics 36, no. 4 (January 2002): 339–63. http://dx.doi.org/10.1080/02331880213192.

28

Girardin, Valerie. "Entropy Maximization for Markov and Semi-Markov Processes." Methodology and Computing in Applied Probability 6, no. 1 (March 2004): 109–27. http://dx.doi.org/10.1023/b:mcap.0000012418.88825.18.

29

Ephraim, Y., and W. J. J. Roberts. "An EM Algorithm for Markov Modulated Markov Processes." IEEE Transactions on Signal Processing 57, no. 2 (February 2009): 463–70. http://dx.doi.org/10.1109/tsp.2008.2007919.

30

Bezhaeva, Z. I., and V. I. Oseledets. "Quantum Markov States and Quantum Hidden Markov States." Journal of Mathematical Sciences 240, no. 5 (June 26, 2019): 507–14. http://dx.doi.org/10.1007/s10958-019-04368-w.

31

Magda, Ksenija, and Jasmin Zemunović. "Farizeji u Markovu evanđelju." Kairos 13, no. 2 (November 22, 2019): 227–46. http://dx.doi.org/10.32862/k1.13.2.4.

Abstract:
The main goal of this paper is to compare what can be learned about the Pharisees from extra-biblical sources with what can be learned from the narrative of Mark 1:21-3:35, in order to lay a historical foundation for Mark's account. The first part presents the historically available material, concluding that it is ambivalent, while the second part traces the development of the conflict between Jesus and the Pharisees, which ultimately illuminates the historical facts toward the conclusion that Mark's portrayal of the Pharisees is historically plausible, so that Mark's Gospel can contribute to the historical discussion of the Pharisees. The paper is therefore divided into two main parts: the first, historical part attempts to reconstruct the relationship between Jesus and the Pharisees on the basis of history, while the second, exegetical part uses narrative criticism to disentangle history and kerygma.
32

Korolkov, A. T., and D. K. Vasenkov. "Precursors and participants of the discovery of the Markovsk oil and gas field." Geology and Environment 3, no. 3 (2023): 87–94. http://dx.doi.org/10.26516/2541-9641.2023.3.87.

Abstract:
The history of the discovery of the first oil and gas fields on the Siberian Platform is traced from the first oil deposit on Chemikanskaya Square in the Tolba River basin (Sakha-Yakutia) to the discovery of the first Markov oil and gas condensate field. The role of Vasily Mikhailovich Senyukov, the father of Siberian oil, in obtaining the first oil on the Siberian Platform and in promoting the idea of reference drilling within its limits is shown. Despite the accidental discovery of the Markov field (an oil fountain) in 1962, exploration work within its limits pointed the further search northward, 90 km from Markovo (within the Nepsky Arch).
33

MATSUMOTO, Hiroyuki, and Yukio OGURA. "Markov or non-Markov property of $\mathcal{M}-X$ processes." Journal of the Mathematical Society of Japan 56, no. 2 (April 2004): 519–40. http://dx.doi.org/10.2969/jmsj/1191418643.

34

Ephraim, Y., and B. L. Mark. "Explicit Forward Recursive Estimators for Markov Modulated Markov Processes." Stochastic Models 28, no. 3 (July 2012): 359–87. http://dx.doi.org/10.1080/15326349.2012.699750.

35

Grewal, Jasleen K., Martin Krzywinski, and Naomi Altman. "Markov models — training and evaluation of hidden Markov models." Nature Methods 17, no. 2 (February 2020): 121–22. http://dx.doi.org/10.1038/s41592-019-0702-6.

36

Kella, Offer, and Wolfgang Stadje. "Markov-modulated linear fluid networks with Markov additive input." Journal of Applied Probability 39, no. 2 (June 2002): 413–20. http://dx.doi.org/10.1239/jap/1025131438.

Abstract:
We consider a network of dams to which the external input is a multivariate Markov additive process. For each state of the Markov chain modulating the Markov additive process, the release rates are linear (constant multiple of the content level). Each unit of material processed by a given station is then divided into fixed proportions each of which is routed to another station or leaves the system. For each state of the modulating process, this routeing is determined by some substochastic matrix. We identify simple conditions for stability and show how to compute transient and stationary characteristics of such networks.
37

Todorovic, P., and J. Gani. "A Markov renewal process imbedded in a Markov chain." Stochastic Analysis and Applications 7, no. 3 (January 1989): 339–53. http://dx.doi.org/10.1080/07362998908809186.

38

Dixit, Purushottam D., and Ken A. Dill. "Caliber Corrected Markov Modeling (C2M2): Correcting Equilibrium Markov Models." Journal of Chemical Theory and Computation 14, no. 2 (January 26, 2018): 1111–19. http://dx.doi.org/10.1021/acs.jctc.7b01126.

39

Hunter, Jeffrey J. "Kemeny's function for Markov chains and Markov renewal processes." Linear Algebra and its Applications 559 (December 2018): 54–72. http://dx.doi.org/10.1016/j.laa.2018.08.032.

40

Kella, Offer, and Wolfgang Stadje. "Markov-modulated linear fluid networks with Markov additive input." Journal of Applied Probability 39, no. 2 (June 2002): 413–20. http://dx.doi.org/10.1017/s0021900200022634.

Abstract:
We consider a network of dams to which the external input is a multivariate Markov additive process. For each state of the Markov chain modulating the Markov additive process, the release rates are linear (constant multiple of the content level). Each unit of material processed by a given station is then divided into fixed proportions each of which is routed to another station or leaves the system. For each state of the modulating process, this routeing is determined by some substochastic matrix. We identify simple conditions for stability and show how to compute transient and stationary characteristics of such networks.
41

Accardi, Luigi, and Francesco Fidaleo. "Non-homogeneous quantum Markov states and quantum Markov fields." Journal of Functional Analysis 200, no. 2 (June 2003): 324–47. http://dx.doi.org/10.1016/s0022-1236(03)00071-5.

42

Mo, Xiao-yun, Xu-yan Xiang, and Xiang-qun Yang. "Adjoining batch Markov arrival processes of a Markov chain." Acta Mathematicae Applicatae Sinica, English Series 34, no. 1 (January 2018): 1–10. http://dx.doi.org/10.1007/s10255-018-0724-3.

43

Ding, J., and N. H. Rhee. "A modified piecewise linear Markov approximation of Markov operators." Applied Mathematics and Computation 174, no. 1 (March 2006): 236–51. http://dx.doi.org/10.1016/j.amc.2005.03.026.

44

ÇELİK, SERDAR. "MICRO-MARKOV: MARKOV ANALİZİ İLE MİKROTONAL, ALGORİTMİK BESTELEME UYGULAMASI." Journal of International Social Research 9, no. 43 (April 20, 2016): 2565. http://dx.doi.org/10.17719/jisr.20164317816.

45

Fiedler, T. "Markov Moves Cannot be Replaced by Double Markov Moves." Journal of Knot Theory and Its Ramifications 12, no. 04 (June 2003): 575–77. http://dx.doi.org/10.1142/s0218216503002561.

46

Barker, Richard J., and Matthew R. Schofield. "Putting Markov Chains Back into Markov Chain Monte Carlo." Journal of Applied Mathematics and Decision Sciences 2007 (October 30, 2007): 1–13. http://dx.doi.org/10.1155/2007/98086.

Abstract:
Markov chain theory plays an important role in statistical inference, both in the formulation of models for data and in the construction of efficient algorithms for inference. The use of Markov chains in modeling data has a long history; however, the use of Markov chain theory in developing algorithms for statistical inference has only recently become popular. Using mark-recapture models as an illustration, we show how Markov chains can be used for developing demographic models and also in developing efficient algorithms for inference. We anticipate that a major area of future research involving mark-recapture data will be the development of hierarchical models that lead to better demographic models that account for all uncertainties in the analysis. A key issue is determining when the chains produced by Markov chain Monte Carlo sampling have converged.
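The convergence issue raised in the abstract can be made concrete with a toy sketch. The target density, step size, and moment checks below are arbitrary illustrative choices, not from the paper: a random-walk Metropolis chain targeting a standard normal, with a crude check that the post-burn-in sample moments settle near the target's.

```python
import math
import random

def metropolis_normal(n_steps, step=2.0, seed=0):
    """Random-walk Metropolis sampler whose chain has the standard
    normal as its stationary distribution."""
    rng = random.Random(seed)
    x = 0.0
    samples = []
    for _ in range(n_steps):
        proposal = x + rng.uniform(-step, step)
        # Log acceptance ratio for the unnormalized N(0, 1) target.
        log_alpha = 0.5 * (x * x - proposal * proposal)
        if rng.random() < math.exp(min(0.0, log_alpha)):
            x = proposal
        samples.append(x)
    return samples

chain = metropolis_normal(100_000)
burned = chain[10_000:]  # discard burn-in before trusting the moments
mean = sum(burned) / len(burned)
var = sum((s - mean) ** 2 for s in burned) / len(burned)
# After convergence the sample mean and variance should sit near 0 and 1.
print(abs(mean) < 0.1, abs(var - 1.0) < 0.15)
```

Crude moment checks like this are no substitute for proper diagnostics (multiple chains, effective sample size), which is precisely the "when has the chain converged" question the paper emphasizes.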
47

Gosavi, Abhijit. "Target-sensitive control of Markov and semi-Markov processes." International Journal of Control, Automation and Systems 9, no. 5 (October 2011): 941–51. http://dx.doi.org/10.1007/s12555-011-0515-6.

48

Hölzl, Johannes. "Markov Chains and Markov Decision Processes in Isabelle/HOL." Journal of Automated Reasoning 59, no. 3 (December 20, 2016): 345–87. http://dx.doi.org/10.1007/s10817-016-9401-5.

49

Janssens, Eva F., and Sean McCrary. "Finite-State Markov-Chain Approximations: A Hidden Markov Approach." Finance and Economics Discussion Series, no. 2023-040 (June 2023): 1–62. http://dx.doi.org/10.17016/feds.2023.040.

Abstract:
This paper proposes a novel finite-state Markov chain approximation method for Markov processes with continuous support, providing both an optimal grid and transition probability matrix. The method can be used for multivariate processes, as well as non-stationary processes such as those with a life-cycle component. The method is based on minimizing the information loss between a Hidden Markov Model and the true data-generating process. We provide sufficient conditions under which this information loss can be made arbitrarily small if enough grid points are used. We compare our method to existing methods through the lens of an asset-pricing model, and a life-cycle consumption-savings model. We find our method leads to more parsimonious discretizations and more accurate solutions, and the discretization matters for the welfare costs of risk, the marginal propensities to consume, and the amount of wealth inequality a life-cycle model can generate.
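For context, the classic baseline that such discretization methods are compared against can be sketched directly. The code below is the standard textbook Tauchen-style grid construction for an AR(1), not the paper's HMM-based method, and the parameters are arbitrary:

```python
import math

def norm_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def tauchen(n, rho, sigma, m=3.0):
    """Classic Tauchen-style finite-state approximation of the AR(1)
    process y' = rho*y + eps, eps ~ N(0, sigma^2): an evenly spaced
    grid spanning +/- m stationary standard deviations, with transition
    probabilities from the normal CDF over half-step intervals."""
    sd_y = sigma / math.sqrt(1.0 - rho**2)  # stationary std of y
    grid = [-m * sd_y + i * (2 * m * sd_y) / (n - 1) for i in range(n)]
    step = grid[1] - grid[0]
    P = []
    for y in grid:
        row = []
        for j, yj in enumerate(grid):
            lo = (yj - rho * y - step / 2) / sigma
            hi = (yj - rho * y + step / 2) / sigma
            if j == 0:
                p = norm_cdf(hi)           # left tail absorbed
            elif j == n - 1:
                p = 1.0 - norm_cdf(lo)     # right tail absorbed
            else:
                p = norm_cdf(hi) - norm_cdf(lo)
            row.append(p)
        P.append(row)
    return grid, P

grid, P = tauchen(n=7, rho=0.9, sigma=0.1)
print(all(abs(sum(row) - 1.0) < 1e-12 for row in P))  # rows are distributions
```

The paper's contribution is to replace this fixed even grid with one chosen by minimizing the information loss of a hidden Markov model relative to the true process.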
50

Ye, Fei, and Yifei Wang. "A Novel Method for Decoding Any High-Order Hidden Markov Model." Discrete Dynamics in Nature and Society 2014 (2014): 1–6. http://dx.doi.org/10.1155/2014/231704.

Abstract:
This paper proposes a novel method for decoding any high-order hidden Markov model. First, the high-order hidden Markov model is transformed into an equivalent first-order hidden Markov model by Hadar’s transformation. Next, the optimal state sequence of the equivalent first-order hidden Markov model is recognized by the existing Viterbi algorithm of the first-order hidden Markov model. Finally, the optimal state sequence of the high-order hidden Markov model is inferred from the optimal state sequence of the equivalent first-order hidden Markov model. This method provides a unified algorithm framework for decoding hidden Markov models including the first-order hidden Markov model and any high-order hidden Markov model.
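The decoding step this reduction relies on is the standard first-order Viterbi pass: once a high-order model's states are augmented into tuples (pairs for a second-order model, and so on), the same routine applies unchanged. A minimal sketch with a hypothetical two-state model:

```python
import math

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Standard first-order Viterbi decoder. A high-order HMM can be
    decoded with this same routine after its states are augmented into
    tuples, which is the idea behind such equivalence transformations."""
    # V[t][s] = best log-probability of any state path ending in s at time t.
    V = [{s: math.log(start_p[s]) + math.log(emit_p[s][obs[0]]) for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            best_prev, best_lp = max(
                ((p, V[t - 1][p] + math.log(trans_p[p][s])) for p in states),
                key=lambda pair: pair[1],
            )
            V[t][s] = best_lp + math.log(emit_p[s][obs[t]])
            back[t][s] = best_prev
    # Trace the best path back from the final column.
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return path[::-1]

# Hypothetical two-state weather model (illustrative numbers only).
states = ("Rainy", "Sunny")
start_p = {"Rainy": 0.6, "Sunny": 0.4}
trans_p = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
           "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
emit_p = {"Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
          "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1}}

print(viterbi(["walk", "shop", "clean"], states, start_p, trans_p, emit_p))
# ['Sunny', 'Rainy', 'Rainy']
```

The final inference step in the paper then maps the decoded composite-state path back to the original high-order states.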
