
Theses / dissertations on the topic "Stochastic processes with large dimension"


Consult the 20 best theses / dissertations for your research on the topic "Stochastic processes with large dimension".

Next to each source in the list of references there is an "Add to bibliography" button. Click it, and we will automatically generate a bibliographic citation of the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scientific publication in .pdf format and read the abstract of the work online, if it is included in the metadata.

Browse theses / dissertations from a wide variety of scientific fields and compile an accurate bibliography.

1

Bastide, Dorinel-Marian. "Handling derivatives risks with XVAs in a one-period network model". Electronic Thesis or Diss., université Paris-Saclay, 2024. http://www.theses.fr/2024UPASM027.

Abstract:
Finance regulators require banking institutions to be able to conduct regular scenario analyses (stress tests) of their exposures, in particular towards clearing houses (CCPs) to which they are largely exposed, by applying market shocks to capture market risk and economic shocks that drive some financial players into bankruptcy, known as the default state, to reflect both credit and counterparty risks. By interposing themselves between financial actors, CCPs aim mainly to limit the counterparty risk arising from contractual payment failures caused by the default of one or several of the engaged parties. They also facilitate the various financial flows of trading activities even in the event of default of one or more of their members, by re-arranging certain positions and allocating any loss that materializes following these defaults to the surviving members. To develop a relevant view of risks and effective capital-steering tools, it is essential for banks to comprehensively understand the losses and liquidity needs caused by these various shocks within such financial networks, as well as the underlying mechanisms. This thesis tackles modelling issues that answer these needs, which are at the heart of risk management practices for banks in centrally cleared trading environments. We begin by defining a one-period static model reflecting the heterogeneous market positions and possible joint defaults of multiple financial players, whether members of CCPs or other financial participants, in order to identify the different costs, known as XVAs, generated by both clearing and bilateral activities, with explicit formulas for these costs. Various use cases of this modelling framework are illustrated with stress-test exercises on financial networks from a member's point of view, and with the novation of the portfolios of defaulted CCP members to the surviving members. Fat-tailed distributions are favoured to generate portfolio losses and defaults, with the application of very high-dimensional Monte Carlo methods accompanied by quantification of the numerical uncertainty. We also expand on the novation of defaulted members' portfolios and the associated transfers of XVA costs. These novations can be carried out either on marketplaces (exchanges) or by the CCPs themselves, which designate the optimal buyers or auction off the defaulted positions, with dedicated economic equilibrium problems. Defaults of members on several common CCPs also lead to the formulation and resolution of multidimensional risk-transfer optimization problems addressed in this thesis.
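As a rough illustration of the fat-tailed, high-dimensional Monte Carlo with numerical uncertainty quantification mentioned above, here is a minimal Python sketch; the Student-t tails, the member count and all parameter values are arbitrary assumptions, not the thesis's model.

import numpy as np

# Illustrative only: aggregate fat-tailed losses over a hypothetical network of
# members and report a Monte Carlo estimate with its numerical uncertainty.
rng = np.random.default_rng(0)
n_members, n_paths = 50, 100_000      # hypothetical network size / sample size
dof = 3.0                             # Student-t degrees of freedom (heavy tails)

losses = rng.standard_t(dof, size=(n_paths, n_members))
network_loss = losses.clip(min=0.0).sum(axis=1)   # keep positive exposures only

estimate = network_loss.mean()
std_error = network_loss.std(ddof=1) / np.sqrt(n_paths)
print(f"expected aggregate loss approx. {estimate:.2f} +/- {1.96 * std_error:.2f} (95% CI)")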
2

Jones, Elinor Mair. "Large deviations of random walks and Lévy processes". Thesis, University of Manchester, 2008. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.491853.

3

Suzuki, Kohei. "Convergence of stochastic processes on varying metric spaces". 京都大学 (Kyoto University), 2016. http://hdl.handle.net/2433/215281.

4

Kuwada, Kazumasa. "On large deviations for current-valued processes induced from stochastic line integrals". 京都大学 (Kyoto University), 2004. http://hdl.handle.net/2433/147585.

5

Hoshaw-Woodard, Stacy. "Large sample methods for analyzing longitudinal data in rehabilitation research /". free to MU campus, to others for purchase, 1999. http://wwwlib.umi.com/cr/mo/fullcit?p9946263.

6

Löhr, Wolfgang. "Models of Discrete-Time Stochastic Processes and Associated Complexity Measures". Doctoral thesis, Universitätsbibliothek Leipzig, 2010. http://nbn-resolving.de/urn:nbn:de:bsz:15-qucosa-38267.

Abstract:
Many complexity measures are defined as the size of a minimal representation in a specific model class. One such complexity measure, which is important because it is widely applied, is statistical complexity. It is defined for discrete-time, stationary stochastic processes within a theory called computational mechanics. Here, a mathematically rigorous, more general version of this theory is presented, and abstract properties of statistical complexity as a function on the space of processes are investigated. In particular, weak-* lower semi-continuity and concavity are shown, and it is argued that these properties should be shared by all sensible complexity measures. Furthermore, a formula for the ergodic decomposition is obtained. The same results are also proven for two other complexity measures that are defined by different model classes, namely process dimension and generative complexity. These two quantities, and also the information theoretic complexity measure called excess entropy, are related to statistical complexity, and this relation is discussed here. It is also shown that computational mechanics can be reformulated in terms of Frank Knight's prediction process, which is of both conceptual and technical interest. In particular, it allows for a unified treatment of different processes and facilitates topological considerations. Continuity of the Markov transition kernel of a discrete version of the prediction process is obtained as a new result.
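For orientation, statistical complexity is commonly defined in the computational-mechanics literature as the entropy of the causal-state distribution; a standard formulation (recalled here as background, not quoted from the thesis) is

$$C_\mu \;=\; H\big[\varepsilon(X_{-\infty:0})\big] \;=\; -\sum_{s\in\mathcal{S}} \mathbb{P}(s)\,\log_2 \mathbb{P}(s), \qquad \mathbf{E} \;\le\; C_\mu,$$

where $\varepsilon$ maps past trajectories to the causal states $\mathcal{S}$ and $\mathbf{E}$ denotes the excess entropy mentioned in the abstract.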
7

Löhr, Wolfgang. "Models of Discrete-Time Stochastic Processes and Associated Complexity Measures". Doctoral thesis, Max Planck Institut für Mathematik in den Naturwissenschaften, 2009. https://ul.qucosa.de/id/qucosa%3A11017.

Abstract:
Many complexity measures are defined as the size of a minimal representation in a specific model class. One such complexity measure, which is important because it is widely applied, is statistical complexity. It is defined for discrete-time, stationary stochastic processes within a theory called computational mechanics. Here, a mathematically rigorous, more general version of this theory is presented, and abstract properties of statistical complexity as a function on the space of processes are investigated. In particular, weak-* lower semi-continuity and concavity are shown, and it is argued that these properties should be shared by all sensible complexity measures. Furthermore, a formula for the ergodic decomposition is obtained. The same results are also proven for two other complexity measures that are defined by different model classes, namely process dimension and generative complexity. These two quantities, and also the information theoretic complexity measure called excess entropy, are related to statistical complexity, and this relation is discussed here. It is also shown that computational mechanics can be reformulated in terms of Frank Knight's prediction process, which is of both conceptual and technical interest. In particular, it allows for a unified treatment of different processes and facilitates topological considerations. Continuity of the Markov transition kernel of a discrete version of the prediction process is obtained as a new result.
8

Kubasch, Madeleine. "Approximation of stochastic models for epidemics on large multi-level graphs". Electronic Thesis or Diss., Institut polytechnique de Paris, 2024. https://theses.hal.science/tel-04717689.

Abstract:
We study an SIR model with two levels of mixing, namely a uniformly mixing global level, and a local level with two layers of household and workplace contacts, respectively. More precisely, we aim at proposing reduced models which approximate well the epidemic dynamics at hand, while being more amenable to mathematical analysis and/or numerical exploration. First, we investigate the epidemic impact of the workplace size distribution. Our simulation study shows that if the average workplace size is kept fixed, the variance of the workplace size distribution is a good indicator of its influence on key epidemic outcomes. In addition, this allows us to design an efficient teleworking strategy. Next, we demonstrate that a deterministic, uniformly mixing SIR model calibrated using the epidemic growth rate yields a parsimonious approximation of the household-workplace model. However, the accuracy of this reduced model deteriorates over time and lacks theoretical guarantees. Hence, we study the large population limit of the stochastic household-workplace model, which we formalize as a measure-valued process with continuous state space. In a general setting, we establish convergence to the unique deterministic solution of a measure-valued equation. In the case of exponentially distributed infectious periods, a stronger reduction to a finite-dimensional dynamical system is obtained. Further, in order to gain finer insight into the impact of the model parameters on the performance of both reduced models, we perform a sensitivity study. We show that the large population limit of the household-workplace model can approximate the epidemic well even if some assumptions on the contact network are relaxed. Similarly, we quantify the impact of epidemic parameters on the capacity of the uniformly mixing reduced model to predict key epidemic outcomes. Finally, we consider density-dependent population processes in general. We establish a many-to-one formula which reduces the typical lineage of a sampled individual to a time-inhomogeneous spinal process. In addition, we use a coupling argument to quantify the large population convergence of a spinal process.
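A minimal sketch of the uniformly mixing deterministic SIR reduction described above, calibrated through the standard early-growth identity r = beta - gamma; all numerical values are hypothetical and this is not the thesis's code.

import numpy as np
from scipy.integrate import solve_ivp

gamma = 1 / 7            # hypothetical recovery rate (1 / infectious period)
r_target = 0.2           # hypothetical observed early epidemic growth rate
beta = r_target + gamma  # calibration via the early-phase relation r = beta - gamma

def sir(t, y):
    s, i, _ = y
    return [-beta * s * i, beta * s * i - gamma * i, gamma * i]

sol = solve_ivp(sir, (0, 180), [0.999, 0.001, 0.0])
print(f"beta = {beta:.3f}, peak prevalence = {sol.y[1].max():.3f}")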
9

De, Oliveira Gomes André. "Large Deviations Studies for Small Noise Limits of Dynamical Systems Perturbed by Lévy Processes". Doctoral thesis, Humboldt-Universität zu Berlin, 2018. http://dx.doi.org/10.18452/19118.

Abstract:
This thesis deals with applications of large deviations theory to different problems of stochastic dynamics and stochastic analysis concerning jump processes. The first problem we address is the first exit time from a fixed bounded domain for a certain class of exponentially light jump diffusions. Depending on the lightness of the jump measure of the driving process, we derive, as the source of the noise vanishes, the asymptotic behaviour of the law and of the expected value of the first exit time. In the super-exponential regime the law of the first exit time follows a large deviations scale, and in the sub-exponential regime it follows a moderate deviations one. In both regimes the first exit time is understood, in the small noise limit, in terms of a deterministic quantity that encodes the minimal energy the jump diffusion needs to spend in order to follow an optimal controlled path that leads to the exit. The second problem we analyse is the small noise limit of a certain class of coupled forward-backward systems of stochastic differential equations. Associated with these stochastic objects are nonlinear nonlocal partial differential equations that arise as nonlocal toy models of fluid dynamics. Using a probabilistic approach and the Markov nature of these systems, we study the convergence at the level of viscosity solutions and derive a large deviations principle for the laws of the stochastic processes involved.
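For reference, the small-noise large deviations principle invoked throughout the abstract has the standard textbook form: a family $(Z^{\varepsilon})_{\varepsilon>0}$ satisfies a large deviations principle with rate function $I$ if, for every Borel set $A$,

$$-\inf_{z\in A^{\circ}} I(z) \;\le\; \liminf_{\varepsilon\to 0}\,\varepsilon\log\mathbb{P}\big(Z^{\varepsilon}\in A\big) \;\le\; \limsup_{\varepsilon\to 0}\,\varepsilon\log\mathbb{P}\big(Z^{\varepsilon}\in A\big) \;\le\; -\inf_{z\in \overline{A}} I(z);$$

the thesis-specific rate functions and speeds (large versus moderate deviations) are not reproduced here.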
10

Chinnici, Marta. "Stochastic self-similar processes and large scale structures". Tesi di dottorato, 2008. http://www.fedoa.unina.it/1993/1/Chinnici_Scienze_Computazionali.pdf.

11

Oprisan, Adina. "Large deviation principle for functional limit theorems". 2009. http://hdl.handle.net/10106/1734.

12

"Hausdorff dimension of the Brownian frontier and stochastic Loewner evolution". 2012. http://library.cuhk.edu.hk/record=b5549118.

Abstract:
Let B_t be a Brownian motion on the complex plane. The frontier of B[0,1] is defined to be the boundary of the unbounded connected component of C\B[0,1]. In this thesis, we review the calculation of the Hausdorff dimension of the frontier of B[0,1].
We first discuss the earlier work of Lawler [7] in Chapter 2. He defined a constant ζ (the so-called disconnection exponent). Using the energy method, he proved that with positive probability the Hausdorff dimension of the frontier of B[0,1] is 2(1 - ζ); a zero-one law then shows that this probability is one. However, the exact value of ζ cannot be computed by this method.
In a series of papers, Lawler, Schramm and Werner [10], [11] and [13] studied SLE_κ and the excursion measure. Using the conformal invariance of SLE₆ and of the excursion measure, they computed the exact value of the Brownian intersection exponents ξ(j, λ). Consequently, ζ = ξ(2, 0)/2 = 1/3, and the Hausdorff dimension of the frontier of B[0,1] is 4/3 almost surely. This answers the well-known conjecture of Mandelbrot in the affirmative.
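The dimension claim reduces to a one-line computation once the disconnection exponent is identified:

$$\zeta \;=\; \tfrac{\xi(2,0)}{2} \;=\; \tfrac13, \qquad \dim_H\big(\text{frontier of } B[0,1]\big) \;=\; 2(1-\zeta) \;=\; 2\Big(1-\tfrac13\Big) \;=\; \tfrac43 \quad \text{almost surely.}$$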
Zhang, Pengfei. Thesis (M.Phil.)--Chinese University of Hong Kong, 2012. Includes bibliographical references (leaves 53-55). Abstracts also in Chinese.
Contents: 1. Introduction; 2. Hausdorff dimension of the frontier of Brownian motion (2.1 Preliminaries; 2.2 Hausdorff dimension of Brownian frontier); 3. Stochastic Loewner Evolution (3.1 Definitions; 3.2 Continuity and Transience; 3.3 Locality property of SLE₆; 3.4 Crossing exponent for SLE₆); 4. Brownian intersection exponents (4.1 Half-plane exponent; 4.2 Whole-plane exponent; 4.3 Proof of Theorem 4.6 and Theorem 4.7; 4.4 Proof of Theorem 1.2); A. Excursion measure (A.1 Metric space of curves; A.2 Measures on metric space; A.3 Excursion measure on K); Bibliography.
13

Chan, Grace W. S. "Some aspects of estimation of fractal dimension and stochastic simulation". PhD thesis, 1995. http://hdl.handle.net/1885/138474.

14

Sinclair, Jennifer Laurie. "Small and Large Scale Limits of Multifractal Stochastic Processes with Applications". 2009. http://trace.tennessee.edu/utk_graddiss/92.

Abstract:
Various classes of multifractal processes, that is, processes that display different properties at different scales, are studied. Most of the processes examined in this work exhibit stable trends at small scales and Gaussian trends at large scales, although the opposite can also occur. Many natural phenomena exhibit a fractal structure depending on some scaling factor, such as space or time, so these types of processes have many useful modeling applications, including in biology and economics. First, generalized tempered stable processes are defined and studied, following the original work on tempered stable processes by Jan Rosinski [16]. Generalized tempered stable processes encompass the modern variations on tempered stable distributions that have been introduced in the field, including "modified tempered stable distributions" [10], "layered stable distributions" [8], and "Lamperti stable processes" [2]. This work shows that generalized tempered stable processes exhibit multifractal properties at different scales in the space of cadlag functions equipped with the Skorokhod topology, and investigates other properties such as series representations and absolute continuity. Next, processes driven by generalized tempered stable processes involving a certain Volterra kernel are defined, and their short- and long-term behavior is established, following the work of Houdré and Kawai [7]. Finally, inspired by the work of Pipiras and Taqqu [13], the multifractal behavior of more general infinitely divisible processes is established, based on the Lévy-Itô representation of infinitely divisible processes. Numerous examples are given throughout the text to illustrate the strong presence of processes of this type in the current literature.
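For orientation, in Rosinski's framework cited above a tempered stable Lévy measure is, roughly, a stable one damped by a tempering function; one commonly quoted polar-coordinate form (recalled here as background, not taken from the dissertation) is

$$M(dr, du) \;=\; r^{-\alpha-1}\, q(r,u)\, dr\, \sigma(du), \qquad r>0,\ u\in S^{d-1},\ \alpha\in(0,2),$$

with $\sigma$ a finite measure on the unit sphere and $q(\cdot,u)$ a completely monotone tempering function vanishing as $r\to\infty$, which is what produces stable-like behaviour at small scales and Gaussian-like behaviour at large scales.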
15

Li, Yuxiao. "Spatio-Temporal Prediction and Stochastic Simulation for Large-Scale Nonstationary Processes". Thesis, 2020. http://hdl.handle.net/10754/665845.

Abstract:
There has been an increasing demand for describing, predicting, and drawing inferences for various environmental processes, such as air pollution and precipitation. Environmental statistics plays an important role in many related applications, such as weather-related risk assessment for urban design and crop growth. However, modeling the spatio-temporal dynamics of environmental data is challenging due to their inherent high variability and nonstationarity. This dissertation is composed of four significant contributions to the modeling, simulation, and prediction of spatio-temporal processes using statistical techniques and machine learning algorithms. This dissertation firstly focuses on the Gaussian process emulators of the numerical climate models over a large spatial region, where the spatial process exhibits nonstationarity. The proposed method allows for estimating a rich class of nonstationary Matérn covariance functions with spatially varying parameters. The efficient estimation is achieved by local-polynomial fitting of the covariance parameters. To extend the applicability of this method to large-scale computations, the proposed method is implemented by developing software with high-performance computing architectures for nonstationary Gaussian process estimation and simulation. The developed software outperforms existing ones in both computational time and accuracy by a large margin. The method and software are applied to the statistical emulation of high-resolution climate models. The second focus of this dissertation is the development of spatio-temporal stochastic weather generators for non-Gaussian and nonstationary processes. The proposed multi-site generator uses a left-censored non-Gaussian vector autoregression model, where the random error follows a skew-symmetric distribution. It not only drives the occurrence and intensity simultaneously but also possesses nice interpretations both physically and statistically. The generator is applied to 30-second precipitation data collected at the University of Lausanne. Finally, this dissertation investigates the spatial prediction with scalable deep learning algorithms to overcome the limitations of the classical Kriging predictor in geostatistics. A novel neural network structure is proposed for spatial prediction by adding an embedding layer of spatial coordinates with basis functions. The proposed method, called DeepKriging, has multiple advantages over Kriging and classical neural networks with spatial coordinates as features. The method is applied to the prediction of fine particulate matter (PM2.5) concentrations in the United States.
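A minimal sketch of the basis-function embedding idea behind DeepKriging as described above: spatial coordinates are expanded in radial basis functions before being fed to a small dense network. The architecture, knot grid and bandwidth below are hypothetical choices, not the dissertation's implementation.

import torch
import torch.nn as nn

class RBFEmbedding(nn.Module):
    """Map 2-D coordinates to Gaussian radial-basis features at fixed knots."""
    def __init__(self, knots, bandwidth):
        super().__init__()
        self.register_buffer("knots", knots)   # (K, 2) grid of basis centres
        self.bandwidth = bandwidth

    def forward(self, coords):
        d2 = torch.cdist(coords, self.knots) ** 2
        return torch.exp(-d2 / (2 * self.bandwidth ** 2))

knots = torch.cartesian_prod(torch.linspace(0, 1, 10), torch.linspace(0, 1, 10))
model = nn.Sequential(RBFEmbedding(knots, bandwidth=0.1),
                      nn.Linear(100, 64), nn.ReLU(),
                      nn.Linear(64, 1))

coords = torch.rand(256, 2)                                      # synthetic locations
y = torch.sin(6.0 * coords[:, :1]) + 0.1 * torch.randn(256, 1)   # synthetic field
optim = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(200):                                             # short illustrative fit
    optim.zero_grad()
    nn.functional.mse_loss(model(coords), y).backward()
    optim.step()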
16

Boucher, Christopher Lawrence. "Large deviations for doubly indexed stochastic processes with applications to statistical mechanics". 1998. https://scholarworks.umass.edu/dissertations/AAI9841842.

Abstract:
The theory of large deviations studies situations in which certain probabilities involving a given stochastic process decay to zero exponentially fast. One of the aims of this dissertation is to extend this theory to the setting in which the stochastic processes under consideration are indexed by two parameters, rather than the usual one parameter. The introduction of the second index often allows one to study more easily the large deviation asymptotics of processes with a spatial component. Such doubly indexed processes, interesting in their own right, are especially so because of their applications to a class of statistical mechanical models of fluid turbulence. Indeed, the powerful apparatus of large deviation theory can be applied to a general statistical mechanical model via the program outlined in Chapter 2. A second aim of this dissertation is to apply our two-parameter large deviation results to a particular model of two-dimensional fluid turbulence introduced in Chapter 6. The main probabilistic theorem in the dissertation is the large deviation principle for the doubly indexed sequence of random probability measures $$W_{r,q}(dx\times dy) \;\doteq\; \theta(dx)\otimes\sum_{k=1}^{2^r} 1_{D_{r,k}}(x)\,L_{q,k}(dy).$$ Here $\theta$ is a probability measure on a Polish space $\chi$, $\{D_{r,k},\ k = 1,\dots,2^r\}$ is a dyadic partition of $\chi$ (hence the use of $2^r$ summands) satisfying $\theta(D_{r,k}) = 1/2^r$, and $L_{q,1}, L_{q,2},\dots, L_{q,2^r}$ is an independent, identically distributed sequence of random probability measures on a Polish space $\mathcal{Y}$ such that $\{L_{q,k},\ q\in\mathbb{N}\}$ satisfies the large deviation principle with a convex rate function. A number of related asymptotic results are also derived. In the final two chapters of the dissertation we introduce a statistical mechanical model of two-dimensional turbulence constructed on a uniform lattice of points of the unit torus. We use a doubly indexed process closely related to $W_{r,q}$ to approximate the process which arises naturally in applying large deviation theory to this model. The two-parameter large deviation principle for the doubly indexed process then leads to the evaluation of the asymptotics of certain key statistical mechanical quantities related to the partition function and the Gibbs states.
17

Hu, Yujie. "An effective method of stochastic simulation of complex large-scale transport processes in naturally fractured reservoirs". 2002. http://wwwlib.umi.com/cr/utexas/fullcit?p3114761.

18

Lin, Yier. "Large deviations of the KPZ equation, Markov duality and SPDE limits of the vertex models". Thesis, 2021. https://doi.org/10.7916/d8-q300-qe66.

Abstract:
The Kardar-Parisi-Zhang (KPZ) equation is a stochastic PDE describing various objects in statistical mechanics such as random interface growth, directed polymers, and interacting particle systems. We study large deviations of the KPZ equation, in both the short-time and the long-time regime. We prove the first short-time large deviations for the KPZ equation and detect a crossover from Gaussian behaviour to a 5/2 power law in the lower tail rate function. In the long-time regime, we study the upper tail large deviations of the KPZ equation starting from a wide range of initial data and explore how the rate function depends on the initial data. The KPZ equation arises as the weak scaling limit of various models in the KPZ universality class. We show that the stochastic higher spin six vertex model, a class of models which sit on top of the KPZ integrable systems, converges weakly to the KPZ equation under a certain scaling. This extends the weak universality of the KPZ equation. On the other hand, we show that under a different scaling, the stochastic higher spin six vertex model converges to a hyperbolic stochastic PDE called the stochastic telegraph equation. One key tool behind the proof of these two stochastic PDE limits is a property called Markov duality.
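For reference, the KPZ equation discussed in the abstract is usually written (under one common normalisation of the constants) as

$$\partial_t h(t,x) \;=\; \tfrac12\,\partial_x^2 h(t,x) \;+\; \tfrac12\,\big(\partial_x h(t,x)\big)^2 \;+\; \xi(t,x),$$

where $\xi$ is space-time white noise; the vertex-model scalings under which it, or the stochastic telegraph equation, arises are specific to the thesis and not restated here.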
19

Mukeru, Safari. "Local times of Brownian motion". Thesis, 2010. http://hdl.handle.net/10500/3781.

Abstract:
After a review of the notions of Hausdorff and Fourier dimensions from fractal geometry and Fourier analysis, and of the properties of local times of Brownian motion, we study the Fourier structure of Brownian level sets. We show that if $\delta_a(X)$ is the Dirac measure of one-dimensional Brownian motion $X$ at the level $a$, that is, the measure defined by the Brownian local time $L_a$ at level $a$, and $\mu$ is its restriction to the random interval $[0, L_a^{-1}(1)]$, then the Fourier transform of $\mu$ is such that, with positive probability, for all $0 \le \beta < 1/2$, the function $u \mapsto |u|^{\beta}|\hat{\mu}(u)|^{2}$, $u \in \mathbb{R}$, is bounded. This growth rate is the best possible. Consequently, each Brownian level set, reduced to a compact interval, is with positive probability a Salem set of dimension 1/2. We also show that the zero set of $X$ reduced to the interval $[0, L_0^{-1}(1)]$ is, almost surely, a Salem set. Finally, we show that the restriction $\mu$ of $\delta_0(X)$ to the deterministic interval $[0, 1]$ is such that its Fourier transform satisfies $E(|\hat{\mu}(u)|^{2}) \le C|u|^{-1/2}$ for $u \neq 0$ and some $C > 0$. Key words: Hausdorff dimension, Fourier dimension, Salem sets, Brownian motion, local times, level sets, Fourier transform, inverse local times.
Decision Sciences
PhD. (Operations Research)
20

Langovoy, Mikhail Anatolievich. "Data-driven goodness-of-fit tests". Doctoral thesis, 2007. http://hdl.handle.net/11858/00-1735-0000-0006-B393-4.
