Academic literature on the topic 'Invariant distribution of Markov processes'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Invariant distribution of Markov processes.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Journal articles on the topic "Invariant distribution of Markov processes"
Arnold, Barry C., and C. A. Robertson. "Autoregressive logistic processes." Journal of Applied Probability 26, no. 3 (September 1989): 524–31. http://dx.doi.org/10.2307/3214410.
McDonald, D. "An invariance principle for semi-Markov processes." Advances in Applied Probability 17, no. 1 (March 1985): 100–126. http://dx.doi.org/10.2307/1427055.
Barnsley, Michael F., and John H. Elton. "A new class of Markov processes for image encoding." Advances in Applied Probability 20, no. 1 (March 1988): 14–32. http://dx.doi.org/10.2307/1427268.
Kalpazidou, S. "On Lévy's theorem concerning positiveness of transition probabilities of Markov processes: the circuit processes case." Journal of Applied Probability 30, no. 1 (March 1993): 28–39. http://dx.doi.org/10.2307/3214619.
Avrachenkov, Konstantin, Alexey Piunovskiy, and Yi Zhang. "Markov Processes with Restart." Journal of Applied Probability 50, no. 4 (December 2013): 960–68. http://dx.doi.org/10.1239/jap/1389370093.
Dissertations / Theses on the topic "Invariant distribution of Markov processes"
Hahn, Léo. "Interacting run-and-tumble particles as piecewise deterministic Markov processes : invariant distribution and convergence." Electronic Thesis or Diss., Université Clermont Auvergne (2021-...), 2024. http://www.theses.fr/2024UCFA0084.
1. Simulating active and metastable systems with piecewise deterministic Markov processes (PDMPs):
- Which dynamics to choose to efficiently simulate metastable states?
- How to directly exploit the non-equilibrium nature of PDMPs to study the modeled physical systems?
2. Modeling active systems with PDMPs:
- What conditions must a system meet to be modeled by a PDMP?
- In which cases does the system have a stationary distribution?
- How to calculate dynamic quantities (e.g., transition rates) in this framework?
3. Improving simulation techniques for equilibrium systems:
- Can results obtained in the context of non-equilibrium systems be used to accelerate the simulation of equilibrium systems?
- How to use topological information to adapt the dynamics in real time?
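To make the PDMP setting of this abstract concrete, here is a minimal, self-contained sketch (not code from the thesis) that simulates a single one-dimensional run-and-tumble particle with a confining drift as a piecewise deterministic Markov process and estimates its invariant position distribution from time-grid samples. The speed v, confinement strength mu, tumble rate lam, and the drift term -mu*x are illustrative assumptions, not quantities taken from Hahn's work.

```python
import numpy as np

rng = np.random.default_rng(0)

v, mu, lam = 1.0, 1.0, 2.0   # speed, confinement strength, tumble rate (illustrative)
dt_sample = 0.05             # sampling grid used to build the empirical distribution
T = 5_000.0                  # total simulated time

def flow(x0, sigma, s):
    """Exact deterministic flow of dx/dt = sigma*v - mu*x over a time s."""
    target = sigma * v / mu
    return target + (x0 - target) * np.exp(-mu * s)

x, sigma, t = 0.0, 1, 0.0    # position, velocity direction, current time
next_sample = 0.0
samples = []

while t < T:
    tau = rng.exponential(1.0 / lam)          # waiting time until the next tumble
    # record positions on the sampling grid that fall inside [t, t + tau)
    while next_sample < t + tau and next_sample < T:
        samples.append(flow(x, sigma, next_sample - t))
        next_sample += dt_sample
    x = flow(x, sigma, tau)                   # deterministic motion up to the tumble
    sigma = -sigma                            # tumble: flip the velocity direction
    t += tau

samples = np.array(samples)
hist, edges = np.histogram(samples, bins=20, range=(-v / mu, v / mu), density=True)
print("estimated invariant density of the position on [-v/mu, v/mu]:")
print(np.round(hist, 2))
```

Between tumbles the deterministic flow is integrated exactly, so the only randomness lies in the exponential tumble times; sampling on a fixed time grid turns the histogram into a time average, which converges to the invariant position distribution when one exists.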
Casse, Jérôme. "Automates cellulaires probabilistes et processus itérés ad libitum." Thesis, Bordeaux, 2015. http://www.theses.fr/2015BORD0248/document.
The first part of this thesis is about probabilistic cellular automata (PCA) on the line and with two neighbors. For a given PCA, we look for the set of its invariant distributions. For reasons explained in detail in the thesis, obtaining all of them is currently out of reach, so we concentrate on the invariant Markovian distributions. We first establish an algebraic theorem that gives a necessary and sufficient condition for a PCA to have one or more invariant Markovian distributions when the alphabet E is finite. We then generalize this result to the case of a Polish alphabet E, once the topological difficulties encountered have been clarified. Finally, we use the previous results to compute the correlation function of the 8-vertex model for some parameter values.
The second part of this thesis is about infinite iterations of stochastic processes. We establish the convergence of the finite-dimensional distributions of α-stable processes iterated n times, as n goes to infinity, depending on the stability parameter and the drift r. We then describe the limit distributions. In the case of iterated Brownian motion, we show that the limit distributions are linked to iterated function systems.
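Since the invariant (stationary) distribution is the object recurring throughout the sources in this list, a minimal illustration may help orient readers: for a finite-state Markov chain it is a left eigenvector of the transition matrix with eigenvalue 1, normalized to sum to one. The 3-state transition matrix below is an arbitrary illustrative example, not one drawn from the works cited.

```python
import numpy as np

# Arbitrary illustrative transition matrix of a 3-state Markov chain (rows sum to 1).
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.4, 0.4],
])

# The invariant distribution pi satisfies pi P = pi and sums to 1,
# i.e. it is a left eigenvector of P for the eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
k = np.argmin(np.abs(eigvals - 1.0))
pi = np.real(eigvecs[:, k])
pi = pi / pi.sum()

print("invariant distribution:", np.round(pi, 4))
print("check pi P == pi:", np.allclose(pi @ P, pi))
```

The same fixed-point characterization, pi P = pi, is what the continuous-state and continuous-time settings in the works above generalize through invariant measures of transition kernels or semigroups.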
陳冠全 and Koon-chuen Chen. "Invariant limiting shape distributions for some sequential rectangular models." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1998. http://hub.hku.hk/bib/B31238233.
Chen, Koon-chuen. "Invariant limiting shape distributions for some sequential rectangular models." Hong Kong: University of Hong Kong, 1998. http://sunzi.lib.hku.hk/hkuto/record.jsp?B20998934.
Hammer, Matthias. "Ergodicity and regularity of invariant measure for branching Markov processes with immigration." Mainz: Universitätsbibliothek Mainz, 2012. http://d-nb.info/1029390975/34.
Hurth, Tobias. "Invariant densities for dynamical systems with random switching." Diss., Georgia Institute of Technology, 2014. http://hdl.handle.net/1853/52274.
Kaijser, Thomas. "Convergence in distribution for filtering processes associated to Hidden Markov Models with densities." Linköpings universitet, Matematik och tillämpad matematik, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-92590.
Talwar, Gaurav. "HMM-based non-intrusive speech quality and implementation of Viterbi score distribution and hiddenness based measures to improve the performance of speech recognition." Laramie, Wyo.: University of Wyoming, 2006. http://proquest.umi.com/pqdweb?did=1288654981&sid=7&Fmt=2&clientId=18949&RQT=309&VName=PQD.
Green, David Anthony. "Departure processes from MAP/PH/1 queues." Title page, contents and abstract only, 1999. http://thesis.library.adelaide.edu.au/public/adt-SUA20020815.092144.
Drton, Mathias. "Maximum likelihood estimation in Gaussian AMP chain graph models and Gaussian ancestral graph models." Thesis, Connect to this title online; UW restricted, 2004. http://hdl.handle.net/1773/8952.
Books on the topic "Invariant distribution of Markov processes"
Hernández-Lerma, O. Markov Chains and Invariant Probabilities. Basel: Birkhäuser Basel, 2003.
Liao, Ming. Invariant Markov Processes Under Lie Group Actions. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-92324-6.
Carlsson, Niclas. Markov chains on metric spaces: Invariant measures and asymptotic behaviour. Åbo: Åbo Akademi University Press, 2005.
Banjevic, Dragan. Recurrent relations for distribution of waiting time in Markov chain. [Toronto]: University of Toronto, Department of Statistics, 1994.
SpringerLink (Online service), ed. Measure-Valued Branching Markov Processes. Berlin, Heidelberg: Springer-Verlag Berlin Heidelberg, 2011.
Costa, Oswaldo Luiz do Valle. Continuous Average Control of Piecewise Deterministic Markov Processes. New York, NY: Springer New York, 2013.
Feinberg, Eugene A. Handbook of Markov Decision Processes: Methods and Applications. Boston, MA: Springer US, 2002.
Rieder, Ulrich, and SpringerLink (Online service), eds. Markov Decision Processes with Applications to Finance. Berlin, Heidelberg: Springer-Verlag Berlin Heidelberg, 2011.
Taira, Kazuaki. Semigroups, Boundary Value Problems and Markov Processes. Berlin, Heidelberg: Springer Berlin Heidelberg, 2004.
Milch, Paul R. FORECASTER, a Markovian model to analyze the distribution of Naval Officers. Monterey, Calif.: Naval Postgraduate School, 1990.
Book chapters on the topic "Invariant distribution of Markov processes"
Liao, Ming. "Decomposition of Markov Processes." In Invariant Markov Processes Under Lie Group Actions, 305–29. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-92324-6_9.
Pollett, P. K. "Identifying Q-Processes with a Given Finite µ-Invariant Measure." In Markov Processes and Controlled Markov Chains, 41–55. Boston, MA: Springer US, 2002. http://dx.doi.org/10.1007/978-1-4613-0265-0_3.
Dubins, Lester E., Ashok P. Maitra, and William D. Sudderth. "Invariant Gambling Problems and Markov Decision Processes." In International Series in Operations Research & Management Science, 409–28. Boston, MA: Springer US, 2002. http://dx.doi.org/10.1007/978-1-4615-0805-2_13.
Dudley, R. M. "A note on Lorentz-invariant Markov processes." In Selected Works of R.M. Dudley, 109–15. New York, NY: Springer New York, 2010. http://dx.doi.org/10.1007/978-1-4419-5821-1_8.
Cocozza-Thivent, Christiane. "Hitting Time Distribution." In Markov Renewal and Piecewise Deterministic Processes, 63–77. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-70447-6_4.
Liao, Ming. "Lévy Processes in Lie Groups." In Invariant Markov Processes Under Lie Group Actions, 35–71. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-92324-6_2.
Liao, Ming. "Lévy Processes in Homogeneous Spaces." In Invariant Markov Processes Under Lie Group Actions, 73–101. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-92324-6_3.
Rong, Wu. "Some Properties of Invariant Functions of Markov Processes." In Seminar on Stochastic Processes, 1988, 239–44. Boston, MA: Birkhäuser Boston, 1989. http://dx.doi.org/10.1007/978-1-4612-3698-6_16.
Liao, Ming. "Lévy Processes in Compact Lie Groups." In Invariant Markov Processes Under Lie Group Actions, 103–33. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-92324-6_4.
Liao, Ming. "Inhomogeneous Lévy Processes in Lie Groups." In Invariant Markov Processes Under Lie Group Actions, 169–237. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-92324-6_6.
Conference papers on the topic "Invariant distribution of Markov processes"
Rajendiran, Shenbageshwaran, Francisco Galdos, Carissa Anne Lee, Sidra Xu, Justin Harvell, Shireen Singh, Sean M. Wu, Elizabeth A. Lipke, and Selen Cremaschi. "Modeling hiPSC-to-Early Cardiomyocyte Differentiation Process using Microsimulation and Markov Chain Models." In Foundations of Computer-Aided Process Design, 344–50. Hamilton, Canada: PSE Press, 2024. http://dx.doi.org/10.69997/sct.152564.
Akshay, S., Blaise Genest, and Nikhil Vyas. "Distribution-based objectives for Markov Decision Processes." In LICS '18: 33rd Annual ACM/IEEE Symposium on Logic in Computer Science. New York, NY, USA: ACM, 2018. http://dx.doi.org/10.1145/3209108.3209185.
Budgett, Stephanie, Azam Asanjarani, and Heti Afimeimounga. "Visualizing Markov Processes." In Bridging the Gap: Empowering and Educating Today’s Learners in Statistics. International Association for Statistical Education, 2022. http://dx.doi.org/10.52041/iase.icots11.t10f3.
Fracasso, Paulo Thiago, Frank Stephenson Barnes, and Anna Helena Reali Costa. "Energy cost optimization in water distribution systems using Markov Decision Processes." In 2013 International Green Computing Conference (IGCC). IEEE, 2013. http://dx.doi.org/10.1109/igcc.2013.6604516.
Ismail, Muhammad Ali. "Multi-core processor based parallel implementation for finding distribution vectors in Markov processes." In 2013 18th International Conference on Digital Signal Processing (DSP). IEEE, 2013. http://dx.doi.org/10.1109/siecpc.2013.6550997.
Tsukamoto, Hiroki, Song Bian, and Takashi Sato. "Statistical Device Modeling with Arbitrary Model-Parameter Distribution via Markov Chain Monte Carlo." In 2021 International Conference on Simulation of Semiconductor Processes and Devices (SISPAD). IEEE, 2021. http://dx.doi.org/10.1109/sispad54002.2021.9592558.
Lee, Seungchul, Lin Li, and Jun Ni. "Modeling of Degradation Processes to Obtain an Optimal Solution for Maintenance and Performance." In ASME 2009 International Manufacturing Science and Engineering Conference. ASMEDC, 2009. http://dx.doi.org/10.1115/msec2009-84166.
Sathe, Sumedh, Chinmay Samak, Tanmay Samak, Ajinkya Joglekar, Shyam Ranganathan, and Venkat N. Krovi. "Data Driven Vehicle Dynamics System Identification Using Gaussian Processes." In WCX SAE World Congress Experience. Warrendale, PA: SAE International, 2024. http://dx.doi.org/10.4271/2024-01-2022.
Velasquez, Alvaro. "Steady-State Policy Synthesis for Verifiable Control." In Twenty-Eighth International Joint Conference on Artificial Intelligence (IJCAI-19). California: International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/784.
Haschka, Markus, and Volker Krebs. "A Direct Approximation of Cole-Cole-Systems for Time-Domain Analysis." In ASME 2005 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2005. http://dx.doi.org/10.1115/detc2005-84579.
Reports on the topic "Invariant distribution of Markov processes"
Stettner, Lukasz. On the Existence and Uniqueness of Invariant Measure for Continuous Time Markov Processes. Fort Belvoir, VA: Defense Technical Information Center, April 1986. http://dx.doi.org/10.21236/ada174758.