Selected scientific literature on the topic "Invariant distribution of Markov processes"
Create an accurate reference in APA, MLA, Chicago, Harvard, and other styles
Consult the list of current articles, books, theses, conference proceedings, and other scientific sources relevant to the topic "Invariant distribution of Markov processes".
Next to each source in the reference list there is an "Add to bibliography" button. Click it and we will automatically generate the bibliographic citation of the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the scientific publication in .pdf format and read the online abstract of the work if it is available in the metadata.
Journal articles on the topic "Invariant distribution of Markov processes"
Arnold, Barry C., and C. A. Robertson. "Autoregressive logistic processes". Journal of Applied Probability 26, no. 3 (September 1989): 524–31. http://dx.doi.org/10.2307/3214410.
McDonald, D. "An invariance principle for semi-Markov processes". Advances in Applied Probability 17, no. 1 (March 1985): 100–126. http://dx.doi.org/10.2307/1427055.
Barnsley, Michael F., and John H. Elton. "A new class of Markov processes for image encoding". Advances in Applied Probability 20, no. 1 (March 1988): 14–32. http://dx.doi.org/10.2307/1427268.
Kalpazidou, S. "On Levy's theorem concerning positiveness of transition probabilities of Markov processes: the circuit processes case". Journal of Applied Probability 30, no. 1 (March 1993): 28–39. http://dx.doi.org/10.2307/3214619.
Avrachenkov, Konstantin, Alexey Piunovskiy, and Yi Zhang. "Markov Processes with Restart". Journal of Applied Probability 50, no. 4 (December 2013): 960–68. http://dx.doi.org/10.1239/jap/1389370093.
Theses / dissertations on the topic "Invariant distribution of Markov processes"
Hahn, Léo. "Interacting run-and-tumble particles as piecewise deterministic Markov processes : invariant distribution and convergence". Electronic Thesis or Diss., Université Clermont Auvergne (2021-...), 2024. http://www.theses.fr/2024UCFA0084.
1. Simulating active and metastable systems with piecewise deterministic Markov processes (PDMPs):
   - Which dynamics to choose to efficiently simulate metastable states?
   - How to directly exploit the non-equilibrium nature of PDMPs to study the modeled physical systems?
2. Modeling active systems with PDMPs:
   - What conditions must a system meet to be modeled by a PDMP?
   - In which cases does the system have a stationary distribution?
   - How to calculate dynamic quantities (e.g., transition rates) in this framework?
3. Improving simulation techniques for equilibrium systems:
   - Can results obtained in the context of non-equilibrium systems be used to accelerate the simulation of equilibrium systems?
   - How to use topological information to adapt the dynamics in real time?
Casse, Jérôme. "Automates cellulaires probabilistes et processus itérés ad libitum". Thesis, Bordeaux, 2015. http://www.theses.fr/2015BORD0248/document.
The first part of this thesis is about probabilistic cellular automata (PCA) on the line and with two neighbors. For a given PCA, we look for the set of its invariant distributions. For reasons explained in detail in this thesis, it is at present unrealistic to obtain all of them, so we concentrate on the invariant Markovian distributions. We first establish an algebraic theorem that gives a necessary and sufficient condition for a PCA to have one or more invariant Markovian distributions when the alphabet E is finite. We then generalize this result to the case of a Polish alphabet E, once the topological difficulties encountered have been clarified. Finally, we calculate the 8-vertex model's correlation function for some parameter values using the previous results. The second part of this thesis is about infinite iterations of stochastic processes. We establish the convergence of the finite-dimensional distributions of the α-stable processes iterated n times, as n goes to infinity, depending on the stability parameter and the drift r. We then describe the limit distributions. In the iterated Brownian motion case, we show that the limit distributions are linked with iterated function systems.
陳冠全 and Koon-chuen Chen. "Invariant limiting shape distributions for some sequential rectangular models". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1998. http://hub.hku.hk/bib/B31238233.
Chen, Koon-chuen. "Invariant limiting shape distributions for some sequential rectangular models". Hong Kong: University of Hong Kong, 1998. http://sunzi.lib.hku.hk/hkuto/record.jsp?B20998934.
Hammer, Matthias. "Ergodicity and regularity of invariant measure for branching Markov processes with immigration". Mainz: Universitätsbibliothek Mainz, 2012. http://d-nb.info/1029390975/34.
Hurth, Tobias. "Invariant densities for dynamical systems with random switching". Diss., Georgia Institute of Technology, 2014. http://hdl.handle.net/1853/52274.
Kaijser, Thomas. "Convergence in distribution for filtering processes associated to Hidden Markov Models with densities". Linköpings universitet, Matematik och tillämpad matematik, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-92590.
Talwar, Gaurav. "HMM-based non-intrusive speech quality and implementation of Viterbi score distribution and hiddenness based measures to improve the performance of speech recognition". Laramie, Wyo.: University of Wyoming, 2006. http://proquest.umi.com/pqdweb?did=1288654981&sid=7&Fmt=2&clientId=18949&RQT=309&VName=PQD.
Green, David Anthony. "Departure processes from MAP/PH/1 queues". Title page, contents and abstract only, 1999. http://thesis.library.adelaide.edu.au/public/adt-SUA20020815.092144.
Drton, Mathias. "Maximum likelihood estimation in Gaussian AMP chain graph models and Gaussian ancestral graph models". Thesis, Connect to this title online; UW restricted, 2004. http://hdl.handle.net/1773/8952.
Books on the topic "Invariant distribution of Markov processes"
Hernández-Lerma, O. Markov Chains and Invariant Probabilities. Basel: Birkhäuser Basel, 2003.
Liao, Ming. Invariant Markov Processes Under Lie Group Actions. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-92324-6.
Carlsson, Niclas. Markov chains on metric spaces: Invariant measures and asymptotic behaviour. Åbo: Åbo Akademi University Press, 2005.
Banjevic, Dragan. Recurrent relations for distribution of waiting time in Markov chain. [Toronto]: University of Toronto, Department of Statistics, 1994.
SpringerLink (Online service), ed. Measure-Valued Branching Markov Processes. Berlin, Heidelberg: Springer-Verlag Berlin Heidelberg, 2011.
Costa, Oswaldo Luiz do Valle. Continuous Average Control of Piecewise Deterministic Markov Processes. New York, NY: Springer New York, 2013.
Feinberg, Eugene A. Handbook of Markov Decision Processes: Methods and Applications. Boston, MA: Springer US, 2002.
Rieder, Ulrich, and SpringerLink (Online service), eds. Markov Decision Processes with Applications to Finance. Berlin, Heidelberg: Springer-Verlag Berlin Heidelberg, 2011.
Taira, Kazuaki. Semigroups, Boundary Value Problems and Markov Processes. Berlin, Heidelberg: Springer Berlin Heidelberg, 2004.
Milch, Paul R. FORECASTER, a Markovian model to analyze the distribution of Naval Officers. Monterey, Calif.: Naval Postgraduate School, 1990.
Book chapters on the topic "Invariant distribution of Markov processes"
Liao, Ming. "Decomposition of Markov Processes". In Invariant Markov Processes Under Lie Group Actions, 305–29. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-92324-6_9.
Pollett, P. K. "Identifying Q-Processes with a Given Finite µ-Invariant Measure". In Markov Processes and Controlled Markov Chains, 41–55. Boston, MA: Springer US, 2002. http://dx.doi.org/10.1007/978-1-4613-0265-0_3.
Dubins, Lester E., Ashok P. Maitra, and William D. Sudderth. "Invariant Gambling Problems and Markov Decision Processes". In International Series in Operations Research & Management Science, 409–28. Boston, MA: Springer US, 2002. http://dx.doi.org/10.1007/978-1-4615-0805-2_13.
Dudley, R. M. "A note on Lorentz-invariant Markov processes". In Selected Works of R.M. Dudley, 109–15. New York, NY: Springer New York, 2010. http://dx.doi.org/10.1007/978-1-4419-5821-1_8.
Cocozza-Thivent, Christiane. "Hitting Time Distribution". In Markov Renewal and Piecewise Deterministic Processes, 63–77. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-70447-6_4.
Liao, Ming. "Lévy Processes in Lie Groups". In Invariant Markov Processes Under Lie Group Actions, 35–71. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-92324-6_2.
Liao, Ming. "Lévy Processes in Homogeneous Spaces". In Invariant Markov Processes Under Lie Group Actions, 73–101. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-92324-6_3.
Rong, Wu. "Some Properties of Invariant Functions of Markov Processes". In Seminar on Stochastic Processes, 1988, 239–44. Boston, MA: Birkhäuser Boston, 1989. http://dx.doi.org/10.1007/978-1-4612-3698-6_16.
Liao, Ming. "Lévy Processes in Compact Lie Groups". In Invariant Markov Processes Under Lie Group Actions, 103–33. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-92324-6_4.
Liao, Ming. "Inhomogeneous Lévy Processes in Lie Groups". In Invariant Markov Processes Under Lie Group Actions, 169–237. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-92324-6_6.
Conference papers on the topic "Invariant distribution of Markov processes"
Rajendiran, Shenbageshwaran, Francisco Galdos, Carissa Anne Lee, Sidra Xu, Justin Harvell, Shireen Singh, Sean M. Wu, Elizabeth A. Lipke, and Selen Cremaschi. "Modeling hiPSC-to-Early Cardiomyocyte Differentiation Process using Microsimulation and Markov Chain Models". In Foundations of Computer-Aided Process Design, 344–50. Hamilton, Canada: PSE Press, 2024. http://dx.doi.org/10.69997/sct.152564.
Akshay, S., Blaise Genest, and Nikhil Vyas. "Distribution-based objectives for Markov Decision Processes". In LICS '18: 33rd Annual ACM/IEEE Symposium on Logic in Computer Science. New York, NY, USA: ACM, 2018. http://dx.doi.org/10.1145/3209108.3209185.
Budgett, Stephanie, Azam Asanjarani, and Heti Afimeimounga. "Visualizing Markov Processes". In Bridging the Gap: Empowering and Educating Today’s Learners in Statistics. International Association for Statistical Education, 2022. http://dx.doi.org/10.52041/iase.icots11.t10f3.
Fracasso, Paulo Thiago, Frank Stephenson Barnes, and Anna Helena Reali Costa. "Energy cost optimization in water distribution systems using Markov Decision Processes". In 2013 International Green Computing Conference (IGCC). IEEE, 2013. http://dx.doi.org/10.1109/igcc.2013.6604516.
Ismail, Muhammad Ali. "Multi-core processor based parallel implementation for finding distribution vectors in Markov processes". In 2013 18th International Conference on Digital Signal Processing (DSP). IEEE, 2013. http://dx.doi.org/10.1109/siecpc.2013.6550997.
Tsukamoto, Hiroki, Song Bian, and Takashi Sato. "Statistical Device Modeling with Arbitrary Model-Parameter Distribution via Markov Chain Monte Carlo". In 2021 International Conference on Simulation of Semiconductor Processes and Devices (SISPAD). IEEE, 2021. http://dx.doi.org/10.1109/sispad54002.2021.9592558.
Lee, Seungchul, Lin Li, and Jun Ni. "Modeling of Degradation Processes to Obtain an Optimal Solution for Maintenance and Performance". In ASME 2009 International Manufacturing Science and Engineering Conference. ASMEDC, 2009. http://dx.doi.org/10.1115/msec2009-84166.
Sathe, Sumedh, Chinmay Samak, Tanmay Samak, Ajinkya Joglekar, Shyam Ranganathan, and Venkat N. Krovi. "Data Driven Vehicle Dynamics System Identification Using Gaussian Processes". In WCX SAE World Congress Experience. Warrendale, PA: SAE International, 2024. http://dx.doi.org/10.4271/2024-01-2022.
Velasquez, Alvaro. "Steady-State Policy Synthesis for Verifiable Control". In Twenty-Eighth International Joint Conference on Artificial Intelligence (IJCAI-19). California: International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/784.
Haschka, Markus, and Volker Krebs. "A Direct Approximation of Cole-Cole-Systems for Time-Domain Analysis". In ASME 2005 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2005. http://dx.doi.org/10.1115/detc2005-84579.
Texto completo da fonteRelatórios de organizações sobre o assunto "Invariant distribution of Markov processes"
Stettner, Lukasz. On the Existence and Uniqueness of Invariant Measure for Continuous Time Markov Processes. Fort Belvoir, VA: Defense Technical Information Center, April 1986. http://dx.doi.org/10.21236/ada174758.