A selection of scientific literature on the topic "Processus gaussiens contraints"
Browse lists of relevant articles, books, theses, conference papers, and other scholarly sources on the topic "Processus gaussiens contraints".
Journal articles on the topic "Processus gaussiens contraints":
Schumann, U. "A contrail cirrus prediction model." Geoscientific Model Development 5, no. 3 (May 3, 2012): 543–80. http://dx.doi.org/10.5194/gmd-5-543-2012.
Schumann, U. "A contrail cirrus prediction model." Geoscientific Model Development Discussions 4, no. 4 (November 28, 2011): 3185–293. http://dx.doi.org/10.5194/gmdd-4-3185-2011.
Naiman, A. D., S. K. Lele, J. T. Wilkerson, and M. Z. Jacobson. "Parameterization of subgrid aircraft emission plumes for use in large-scale atmospheric simulations." Atmospheric Chemistry and Physics Discussions 9, no. 6 (November 18, 2009): 24755–81. http://dx.doi.org/10.5194/acpd-9-24755-2009.
Theses on the topic "Processus gaussiens contraints":
Tran, Tien-Tam. "Constrained and Low Rank Gaussian Process on some Manifolds." Electronic Thesis or Diss., Université Clermont Auvergne (2021-...), 2023. https://theses.hal.science/tel-04529284.
The thesis is divided into three main parts; its major contributions are summarized as follows.

Low-complexity Gaussian processes: Gaussian process regression typically scales as $O(n^3)$ in computation and $O(n^2)$ in memory, where $n$ is the number of observations, which becomes infeasible for many problems when $n$ is large. This thesis investigates the Karhunen-Loève expansion of Gaussian processes, which offers several advantages over low-rank compression techniques. Truncating the Karhunen-Loève expansion yields an explicit low-rank approximation of the covariance (Gram) matrix, greatly simplifying statistical inference when the truncation order is small relative to $n$. We then provide explicit solutions for low-complexity Gaussian processes: we obtain Karhunen-Loève expansions by solving for the eigenpairs of a differential operator for which the covariance function serves as the Green's function. Explicit solutions are given for the Matérn differential operator and for differential operators whose eigenfunctions are classical polynomials. In the experimental section, the proposed methods are compared with alternative approaches, demonstrating their enhanced capability to capture intricate patterns.

Constrained Gaussian processes: This thesis introduces a novel approach that uses constrained Gaussian processes to approximate a density function from observations. To address the constraints, the square root of the unknown density is modelled as a realization of a Gaussian process, approximated by a truncated Karhunen-Loève expansion. A notable advantage of this approach is that the coefficients are Gaussian and independent, with the constraints on the realized functions entirely dictated by the constraints on the random coefficients.
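As an illustration of the truncated Karhunen-Loève idea described above, here is a minimal sketch for a process whose eigenpairs are known in closed form. Brownian motion is a stand-in chosen for its textbook eigenpairs (the thesis itself treats Matérn-type operators and polynomial eigenfunctions), so the formulas below are standard results, not the thesis's own construction:

```python
import numpy as np

# Truncated Karhunen-Loeve expansion of Brownian motion on [0, 1],
# whose covariance is Cov(W_s, W_t) = min(s, t).
# Known eigenpairs: lambda_k = 1 / ((k - 1/2)^2 * pi^2),
#                   phi_k(t) = sqrt(2) * sin((k - 1/2) * pi * t).

def kl_covariance(t, m):
    """Rank-m approximation of the Gram matrix min(s, t)."""
    k = np.arange(1, m + 1)
    lam = 1.0 / ((k - 0.5) ** 2 * np.pi ** 2)                    # eigenvalues
    phi = np.sqrt(2.0) * np.sin(np.outer(t, (k - 0.5) * np.pi))  # (n, m) basis
    return phi @ np.diag(lam) @ phi.T                            # low-rank Gram

t = np.linspace(0.01, 1.0, 50)
exact = np.minimum.outer(t, t)          # exact covariance matrix
approx = kl_covariance(t, m=200)        # rank-200 truncation
err = np.max(np.abs(exact - approx))    # truncation error, O(1/m)
```

With the truncation order $m$ small relative to $n$, the $n \times n$ Gram matrix is never formed explicitly; inference works with the $n \times m$ factor instead.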
After conditioning on both the available data and the constraints, the posterior distribution of the coefficients is a normal distribution restricted to the unit sphere. This distribution is analytically intractable, necessitating numerical approximation; to this end, the thesis employs spherical Hamiltonian Monte Carlo (HMC). The efficacy of the proposed framework is validated through a series of experiments, with performance comparisons against alternative methods.

Transfer learning on the manifold of finite probability measures: Finally, we introduce transfer learning models in the space of finite probability measures, denoted $\mathcal{P}_+(I)$. We endow $\mathcal{P}_+(I)$ with the Fisher-Rao metric, turning it into a Riemannian manifold that holds a significant place in information geometry and has numerous applications. The thesis provides detailed formulas for geodesics, the exponential map, the log map, and parallel transport on $\mathcal{P}_+(I)$. Statistical models on $\mathcal{P}_+(I)$ are typically formulated in the tangent spaces of this manifold; with this comprehensive set of geometric tools, we introduce transfer learning models facilitating knowledge transfer between tangent spaces. Detailed algorithms for transfer learning with Principal Component Analysis (PCA) and linear regression models are presented, and a series of experiments offers empirical evidence of their efficacy.
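The Fisher-Rao geometry mentioned above admits a simple closed form on the simplex: the map $p \mapsto 2\sqrt{p}$ sends $\mathcal{P}_+(I)$ onto part of a sphere of radius 2, where geodesics are great-circle arcs. A minimal sketch of the resulting geodesic distance (a standard information-geometry formula, not code from the thesis):

```python
import numpy as np

# Fisher-Rao geodesic distance between finite probability measures.
# Under the embedding p -> 2*sqrt(p) into the sphere of radius 2,
# the distance is 2 * arccos of the Bhattacharyya coefficient.

def fisher_rao_distance(p, q):
    """Geodesic distance 2 * arccos(sum_i sqrt(p_i * q_i))."""
    bc = np.sum(np.sqrt(p * q))              # Bhattacharyya coefficient in [0, 1]
    return 2.0 * np.arccos(np.clip(bc, -1.0, 1.0))

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.2, 0.3, 0.5])
r = np.array([0.5, 0.25, 0.25])

d_same = fisher_rao_distance(p, q)           # 0 for identical measures
d_pr = fisher_rao_distance(p, r)             # strictly positive
```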
Maatouk, Hassan. "Correspondance entre régression par processus Gaussien et splines d'interpolation sous contraintes linéaires de type inégalité. Théorie et applications." Thesis, Saint-Etienne, EMSE, 2015. http://www.theses.fr/2015EMSE0791/document.
This thesis is dedicated to interpolation problems in which the numerical function is known to satisfy properties such as positivity, monotonicity or convexity. Two interpolation methods are studied. The first is deterministic and based on convex optimization in a Reproducing Kernel Hilbert Space (RKHS). The second is a Bayesian approach based on Gaussian Process Regression (GPR), or kriging. Using a finite linear functional decomposition, we propose to approximate the original Gaussian process by a finite-dimensional Gaussian process such that conditional simulations satisfy all the inequality constraints. As a consequence, GPR is equivalent to simulating a Gaussian vector truncated to a convex set. The mode, or Maximum A Posteriori, is defined as a Bayesian estimator, and prediction intervals are quantified by simulation. Convergence of the method is proved and the correspondence between the two methods is established. This can be seen as an extension of the correspondence established by [Kimeldorf and Wahba, 1971] between Bayesian estimation on stochastic processes and smoothing by splines. Finally, a real application in insurance and finance is given, estimating a term-structure curve and default probabilities.
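The key reduction in this abstract, GPR under inequality constraints becoming the simulation of a Gaussian vector truncated to a convex set, can be sketched with a naive rejection sampler. The hat basis, mean, and covariance below are illustrative placeholders; practical implementations use far more efficient truncated-normal samplers:

```python
import numpy as np

rng = np.random.default_rng(0)

# Finite-dimensional GP on [0, 1]: f(x) = sum_j xi_j * h_j(x), with h_j
# piecewise-linear (hat) basis functions at the knots. Then f >= 0
# everywhere iff all coefficients xi_j >= 0, so positivity of the sampled
# functions reduces to truncating the Gaussian vector xi to a convex set.

m = 5
knots = np.linspace(0.0, 1.0, m)
mu = np.full(m, 1.0)                                  # illustrative prior mean
Sigma = 0.2 * np.exp(-10.0 * (knots[:, None] - knots[None, :]) ** 2)

def sample_constrained(n_samples):
    """Naive rejection: keep only draws lying in the convex set {xi >= 0}."""
    out = []
    while len(out) < n_samples:
        xi = rng.multivariate_normal(mu, Sigma)
        if np.all(xi >= 0):
            out.append(xi)
    return np.array(out)

draws = sample_constrained(100)                       # (100, m) accepted draws
```

Rejection is only viable when the constrained region carries most of the probability mass, as with this prior; the literature on truncated multivariate normals provides scalable alternatives.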
Dubourg, Vincent. "Méta-modèles adaptatifs pour l'analyse de fiabilité et l'optimisation sous contrainte fiabiliste." Phd thesis, Université Blaise Pascal - Clermont-Ferrand II, 2011. http://tel.archives-ouvertes.fr/tel-00697026.
Lopez-Lopera, Andres Felipe. "Gaussian Process Modelling under Inequality Constraints." Thesis, Lyon, 2019. https://tel.archives-ouvertes.fr/tel-02863891.
Conditioning Gaussian processes (GPs) on inequality constraints gives more realistic models. This thesis focuses on the finite-dimensional approximation of GP models proposed by Maatouk (2015), which satisfies the constraints everywhere in the input space. Several contributions are provided. First, we study the use of Markov chain Monte Carlo methods for truncated multivariate normals; they result in efficient sampling under linear inequality constraints. Second, we explore the extension of the model, previously limited to three-dimensional spaces, to higher dimensions. The introduction of a noise effect allows us to go up to dimension five. We propose a sequential algorithm based on knot insertion, which concentrates the computational budget on the most active dimensions, and we explore the Delaunay triangulation as an alternative to tensorisation. Finally, we study additive models in this context, theoretically and on problems involving hundreds of input variables. Third, we give theoretical results on inference under inequality constraints: the asymptotic consistency and normality of maximum likelihood estimators are established. The main methods in this manuscript are implemented in the R programming language. They are applied to risk-assessment problems in nuclear safety and coastal flooding, accounting for positivity and monotonicity constraints. As a by-product, we also show that the proposed GP approach provides an original framework for modelling Poisson processes with stochastic intensities.
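A minimal sketch of the truncated-multinormal sampling step that such constrained GP models rely on, here a Gibbs sampler for a bivariate normal restricted to the positive orthant (an illustrative toy with made-up parameters, not the thesis's R implementation):

```python
import numpy as np

rng = np.random.default_rng(1)

# Gibbs sampling from N(mu, Sigma) truncated to {x >= 0}: each sweep
# draws one coordinate from its univariate conditional normal, itself
# truncated to satisfy the constraint.

mu = np.array([0.5, 0.5])
Sigma = np.array([[1.0, 0.6],
                  [0.6, 1.0]])
lower = np.zeros(2)                       # componentwise bound x_i >= 0

def trunc_norm(m, s, lo):
    """Draw from N(m, s^2) conditioned on >= lo (simple rejection)."""
    while True:
        x = rng.normal(m, s)
        if x >= lo:
            return x

def gibbs(n_iter, x0):
    x = x0.astype(float).copy()
    chain = np.empty((n_iter, 2))
    for t in range(n_iter):
        for i in range(2):
            j = 1 - i
            # conditional law of x_i given x_j for a bivariate normal
            m_i = mu[i] + Sigma[i, j] / Sigma[j, j] * (x[j] - mu[j])
            s_i = np.sqrt(Sigma[i, i] - Sigma[i, j] ** 2 / Sigma[j, j])
            x[i] = trunc_norm(m_i, s_i, lower[i])
        chain[t] = x
    return chain

chain = gibbs(500, np.ones(2))
```

Each conditional is a one-dimensional truncated normal, which is why coordinate-wise samplers remain tractable even when the joint truncated density has no closed-form normalizer.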
Houret, Thomas. "Méthode d’estimation des valeurs extrêmes des distributions de contraintes induites et de seuils de susceptibilité dans le cadre des études de durcissement et de vulnérabilité aux menaces électromagnétiques intentionnelles." Thesis, Rennes, INSA, 2019. http://www.theses.fr/2019ISAR0011.
Intentional ElectroMagnetic Interference (IEMI) can cause equipment failure. The study of the effects of an IEMI begins with an assessment of the risk of equipment failure, in order to implement appropriate protections if required. Unfortunately, a deterministic prediction of failure is impossible because the characteristics of both the equipment and the aggression are highly uncertain. The proposed strategy consists of modelling the stress generated by the aggression, as well as the susceptibility of the equipment, as random variables. Three steps are then necessary: the first estimates the probability distribution of the random susceptibility variable; the second performs the analogous estimation for the induced stress; and the third computes the probability of failure. For the first step, we use statistical inference methods on a small sample of measured susceptibility thresholds. We compare two types of parametric inference, Bayesian and maximum likelihood, and conclude that a relevant approach for a risk analysis is to use the confidence or credibility intervals of the parameter estimates to bracket the probability of failure, regardless of the inference method chosen. For the second step, we explore extreme-value estimation techniques that reduce the number of simulations required. In particular, we propose the technique of controlled stratification by kriging. We show that this technique drastically improves performance compared with the classic approach (Monte Carlo simulation), and we propose a particular implementation of it in order to control the computational effort. Finally, the third step is the simplest once the first two have been completed since, by definition, a failure occurs when the stress exceeds the susceptibility.
With the help of a final test case comprising the simulation of an electromagnetic aggression on a piece of equipment, we use the method developed in this work to bracket the probability of failure. More specifically, we show that the combined use of controlled stratification by kriging and inference of the susceptibility distribution allows the true value of the probability of failure to be bracketed.
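The third step above reduces to estimating $P(\text{stress} > \text{susceptibility})$ for two random variables. A crude Monte Carlo sketch with illustrative lognormal distributions (the thesis replaces this brute-force estimator by controlled stratification with kriging, precisely because extreme quantiles make plain Monte Carlo prohibitively expensive):

```python
import numpy as np

rng = np.random.default_rng(42)

# Failure occurs when the induced stress exceeds the susceptibility
# threshold. With both modelled as independent random variables,
# P(failure) = P(S > T); here S and T follow illustrative lognormal
# laws (not the distributions inferred in the thesis).

n = 200_000
stress = rng.lognormal(mean=0.0, sigma=0.5, size=n)    # induced stress S
suscept = rng.lognormal(mean=1.0, sigma=0.5, size=n)   # susceptibility T
p_fail = np.mean(stress > suscept)                      # crude MC estimate
```

For these parameters the exact value is $\Phi(-1/\sqrt{0.5}) \approx 0.079$, since $\log S - \log T \sim N(-1, 0.5)$; the Monte Carlo estimate fluctuates around it with standard error of order $10^{-3}$.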