Theses / dissertations on the topic "Inversions statistiques"
Create an accurate reference in APA, MLA, Chicago, Harvard, and other styles
See the 27 best theses / dissertations for research on the topic "Inversions statistiques".
Next to each source in the list of references there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic citation of the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the scientific publication in .pdf format and read the abstract of the work online, if it is available in the metadata.
Browse theses / dissertations from a wide variety of scientific fields and compile a correct bibliography.
Boucher, Eulalie. "Designing Deep-Learning models for surface and atmospheric retrievals from the IASI infrared sounder". Electronic Thesis or Diss., Sorbonne université, 2024. http://www.theses.fr/2024SORUS145.
Observing the Earth is vital to comprehend and monitor the complex behaviour of our planet. Satellites, equipped with a number of sophisticated sensors, serve as a key platform for this, offering an opportunity to observe the Earth globally and continuously. Machine Learning (ML) techniques have been used in the remote sensing community for several decades to deal with the vast amount of data generated daily by Earth observation systems. However, the revolution brought about by novel Deep Learning (DL) techniques has opened up new possibilities for the exploitation of satellite observations. This research aims to show that image-processing techniques such as Convolutional Neural Networks (CNNs), provided that they are well mastered, have the potential to improve the estimation of the Earth's atmospheric and surface parameters. By looking at the observations at the image scale rather than at the pixel scale, spatial dependencies can be taken into account. Such techniques will be used for the retrieval of surface and atmospheric temperatures, as well as for cloud detection and classification, from Infrared Atmospheric Sounding Interferometer (IASI) observations. IASI, onboard the polar-orbiting Metop satellites, is a hyperspectral sounder gathering data across a broad range of infrared wavelengths suitable for identifying atmospheric constituents at a range of atmospheric vertical levels, as well as surface parameters. In addition to improving the quality of the retrievals, such Artificial Intelligence (AI) methods are capable of dealing with images that contain missing data, better estimating extreme events (often overlooked by traditional ML techniques) and estimating retrieval uncertainties. This thesis shows why AI methods should be the preferred approach for the exploitation of observations coming from new satellite missions such as IASI-NG or MTG-S IRS.
Smith, Pascalle. "Modélisation des cultures européennes au sein de la biosphère : phénologie, productivité et flux de CO2". Paris 6, 2008. http://www.theses.fr/2008PA066250.
Camps, Pierre. "Comportement du champ magnétique de la terre au cours de ses renversements : étude d'un exemple, variations ultra rapides et caractéristiques statistiques globales". Montpellier 2, 1994. http://www.theses.fr/1994MON20203.
Nier, Vincent Philippe. "Estimation statistique des propriétés physiques de monocouches cellulaires". Thesis, Paris 6, 2016. http://www.theses.fr/2016PA066233/document.
Texto completo da fonteEpithelial cells are known to form cohesive monolayers, a form of tissue organization encountered in the lung, the kidney or the skin. From in vitro experiments, we have characterized the mechanical properties of cell monolayers. We have studied the closure of circular wounds over a nonadhesive substrate. Comparing different models, we have shown how closure is possible thanks to a contractile acto-myosin cable and to fluctuations of the tissue tension. Traction Force Microscopy (TFM) allows to measure the forces that cells exert on their substrate. Starting from this measurement and using the force balance equations, we have solved this underdetermined problem by Bayesian inversion and obtained the internal stress field of the tissue. Applying this method on single images (BISM: Bayesian Inversion Stress Microscopy), and adapting it with a Kalman filter for movies (KISM: Kalman Inversion Stress Microscopy) we have inferred the stress tensor of cell monolayers, without making any hypothesis on the tissue rheology. Finally, we have estimated the stresses directly from the substrate displacements, without computing the traction forces and thus reducing the number of matrix inversions (BISMu: Bayesian Inversion Stress Microscopy from substrate displacements)
Arnst, Maarten. "Inversion of probabilistic models of structures using measured transfer functions". Châtenay-Malabry, Ecole centrale de Paris, 2007. http://www.theses.fr/2007ECAP1037.
The aim of this thesis is to develop a methodology for the experimental identification of probabilistic models of the dynamical behaviour of structures. In particular, the inversion, from measured transfer functions, of the probabilistic structural models with minimal parameterization introduced by Soize is considered. It is first shown that the classical estimation methods of mathematical statistics, such as the method of maximum likelihood, are not well adapted to formulating and solving this inverse problem. In particular, numerical difficulties and conceptual problems due to model misspecification are shown to prohibit the application of the classical methods. The inversion of probabilistic structural models is then formulated alternatively as the minimization, with respect to the parameters to be identified, of an objective function measuring a distance between the experimental data and the probabilistic model. Two construction principles for the definition of this distance are proposed, based on either the loglikelihood function or the relative entropy. Limiting the distance to low-order marginal laws is shown to circumvent the aforementioned difficulties. The methodology is applied to examples featuring simulated data and to a civil and environmental engineering case history featuring real experimental data.
Ars, Sébastien. "Caractérisation des émissions de méthane à l'échelle locale à l'aide d'une méthode d'inversion statistique basée sur un modèle gaussien paramétré avec les données d'un gaz traceur". Thesis, Université Paris-Saclay (ComUE), 2017. http://www.theses.fr/2017SACLV030/document.
The increase of atmospheric methane concentrations since the beginning of the industrial era is directly linked to anthropogenic activities. This increase is partly responsible for the enhancement of the greenhouse effect, leading to a rise of Earth's surface temperatures and a degradation of air quality. There are still considerable uncertainties regarding the estimates of methane emissions from many sources at the local scale. A better characterization of these sources would help the implementation of effective adaptation and mitigation policies to reduce these emissions. To do so, we have developed a new method to quantify methane emissions from local sites based on the combination of mobile atmospheric measurements, a Gaussian model and a statistical inversion. These atmospheric measurements are carried out within the framework of the tracer method, which consists in emitting a gas co-located with the methane source at a known flow rate. An estimate of methane emissions can then be obtained by measuring the tracer and methane concentrations across the emission plume coming from the site. This method presents some limitations, especially when several sources and/or extended sources are present on the studied site. In these conditions, the collocation of the tracer and methane sources is difficult. The Gaussian model makes it possible to take this imperfect collocation into account. It also gives a separate estimate for each source of a site, whereas the classical tracer release method only gives an estimate of the total emissions. The statistical inversion makes it possible to take into account the uncertainties associated with the model and the measurements. The method is based on the use of the measured tracer gas concentrations to choose the stability class of the Gaussian model that best represents the atmospheric conditions during the measurements. These tracer data are also used to parameterize the error associated with the measurements and the model in the statistical inversion.
We first tested this new method with controlled emissions of tracer and methane. The tracer and methane sources were positioned in different configurations in order to better understand the contributions of this method compared to the traditional tracer method. These tests demonstrated that the statistical inversion parameterized by the tracer gas data gives better estimates of methane emissions when the tracer and methane sources are not perfectly collocated or when there are several sources of methane. Subsequently, I applied this method to two sites known for their methane emissions, namely a farm and a gas distribution facility. These measurements enabled us to test the applicability and robustness of the method under more complex methane source distribution conditions, and gave us better estimates of the total methane emissions of these sites that take into account the location of the tracer relative to the methane sources. Separate estimates of every source within a site are highly dependent on the meteorological conditions during the measurements. The analysis of the correlations on the posterior uncertainties between the different sources gives a diagnostic of the separability of the sources. Finally, I focused on methane emissions associated with the waste sector. To do so, I carried out several measurement campaigns in landfills and wastewater treatment plants, and I also used data collected at this type of site during other projects. I selected the most suitable method to estimate the methane emissions of each site, and the estimates obtained for each of these sites show the variability of methane emissions in the waste sector.
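The core of the classical tracer release method mentioned in this abstract can be sketched numerically: with tracer and methane sources co-located, the unknown methane rate follows from the ratio of cross-plume integrated concentrations. This is a deliberately idealized illustration (shared plume shape, synthetic noise), not the thesis's Gaussian-model statistical inversion:

```python
import numpy as np

rng = np.random.default_rng(1)

q_tracer = 2.0          # known tracer release rate (arbitrary units)
q_methane = 5.0         # "unknown" methane rate to recover

# Cross-plume transect: both gases share the same dispersion shape
# when the sources are perfectly collocated.
x = np.linspace(-50, 50, 201)
plume = np.exp(-x**2 / (2 * 15.0**2))
c_tracer = q_tracer * plume + 0.01 * rng.normal(size=x.size)
c_methane = q_methane * plume + 0.01 * rng.normal(size=x.size)

# Ratio of integrated (here: summed) concentrations recovers the rate.
q_est = q_tracer * c_methane.sum() / c_tracer.sum()
print(q_est)
```

When the sources are not collocated, the two plume shapes differ and this simple ratio becomes biased, which is precisely what the Gaussian model plus statistical inversion described above is designed to correct.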
Romary, Thomas. "Inversion des modèles stochastiques de milieux hétérogènes". Paris 6, 2008. https://tel.archives-ouvertes.fr/tel-00395528.
Han, Bin. "Gamma positivity in enumerative combinatorics". Thesis, Lyon, 2019. http://www.theses.fr/2019LYSE1115/document.
The gamma positivity of a combinatorial sequence implies both unimodality and symmetry. Finding new families of objects whose enumerative sequences are gamma-positive has been a challenging and important topic in recent years; it has received considerable attention because of Gal's conjecture, which asserts that the gamma-vector has nonnegative entries for any flag simplicial polytope. Oftentimes, the h-polynomial of simplicial polytopes of combinatorial significance can be given as a generating function over a related set of combinatorial objects with respect to some statistic, such as the descent number, whose enumerative polynomials on permutations are the Eulerian polynomials. This work deals with the gamma properties of several enumerative polynomials of permutations, such as the Eulerian polynomials and the Narayana polynomials. This thesis contains five chapters.
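The gamma expansion mentioned in this abstract can be computed directly for Eulerian polynomials: a palindromic polynomial of degree n is expanded in the basis t^i (1+t)^(n-2i), and gamma positivity means all resulting coefficients are nonnegative. A small self-contained sketch (exhaustive enumeration, so only practical for small n):

```python
from itertools import permutations
from math import comb

def eulerian_coeffs(n):
    # a[k] = number of permutations of {0..n-1} with exactly k descents
    a = [0] * n
    for p in permutations(range(n)):
        a[sum(p[i] > p[i + 1] for i in range(n - 1))] += 1
    return a

def gamma_vector(a):
    # Expand the palindromic polynomial sum_k a[k] t^k of degree len(a)-1
    # in the basis t^i (1+t)^(deg-2i); returns the gamma coefficients.
    deg = len(a) - 1
    a = a[:]
    gamma = []
    for i in range(deg // 2 + 1):
        g = a[i]
        gamma.append(g)
        for j in range(deg - 2 * i + 1):
            a[i + j] -= g * comb(deg - 2 * i, j)
    return gamma

# A_4(t) = 1 + 11t + 11t^2 + t^3 = (1+t)^3 + 8t(1+t): gamma-vector (1, 8)
print(eulerian_coeffs(4), gamma_vector(eulerian_coeffs(4)))
```

The nonnegative output confirms the classical fact (due to Foata and Schützenberger) that Eulerian polynomials are gamma-positive.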
Bărbos, Andrei-Cristian. "Efficient high-dimension gaussian sampling based on matrix splitting : application to bayesian Inversion". Thesis, Bordeaux, 2018. http://www.theses.fr/2018BORD0002/document.
The thesis deals with the problem of high-dimensional Gaussian sampling. Such a problem arises, for example, in Bayesian inverse problems in imaging, where the number of variables easily reaches an order of 10^6–10^9. The complexity of the sampling problem is inherently linked to the structure of the covariance matrix. Different solutions to tackle this problem have already been proposed, among which we emphasize the Hogwild algorithm, which runs local Gibbs sampling updates in parallel with periodic global synchronisation. Our algorithm makes use of the connection between a class of iterative samplers and iterative solvers for systems of linear equations. It does not target the required Gaussian distribution; instead, it targets an approximate distribution. However, we are able to control how far off the approximate distribution is with respect to the required one by means of a single tuning parameter. We first compare the proposed sampling algorithm with the Gibbs and Hogwild algorithms on moderately sized problems for different target distributions. Our algorithm manages to outperform the Gibbs and Hogwild algorithms in most cases. Let us note that the performance of our algorithm depends on the tuning parameter. We then compare the proposed algorithm with the Hogwild algorithm on a large-scale real application, namely image deconvolution-interpolation. The proposed algorithm enables us to obtain good results, whereas the Hogwild algorithm fails to converge. Let us note that for small values of the tuning parameter our algorithm fails to converge as well. Notwithstanding, a suitably chosen value of the tuning parameter enables our proposed sampler to converge and to deliver good results.
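The sampler/solver connection invoked in this abstract can be seen in the plain component-wise Gibbs sampler for a Gaussian: each sweep is a Gauss-Seidel iteration on the precision system plus injected noise. A generic illustration on a tiny precision matrix (not the thesis's approximate matrix-splitting algorithm, whose splitting and tuning parameter are its contribution):

```python
import numpy as np

rng = np.random.default_rng(2)

# Target: N(mu, Q^{-1}) with precision matrix Q.
Q = np.array([[2.0, 0.8, 0.0],
              [0.8, 2.0, 0.8],
              [0.0, 0.8, 2.0]])
mu = np.array([1.0, -1.0, 0.5])

x = np.zeros(3)
samples = []
for sweep in range(12000):
    for i in range(3):
        # Conditional mean = Gauss-Seidel update of Q x = Q mu;
        # conditional variance = 1 / Q[i, i].
        resid = Q[i] @ (x - mu) - Q[i, i] * (x[i] - mu[i])
        x[i] = mu[i] - resid / Q[i, i] + rng.normal() / np.sqrt(Q[i, i])
    if sweep >= 1000:          # discard burn-in
        samples.append(x.copy())

samples = np.array(samples)
print(samples.mean(axis=0), samples.var(axis=0)[0])
```

Removing the noise term turns the loop into the exact Gauss-Seidel solver; other matrix splittings (Jacobi, SOR) give other members of the sampler family the abstract alludes to.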
Harmouzi, Ouassima. "Reconnaissance détaillée de la partie nord-est du Bassin de Saïss (Maroc) : interprétation de sondages électriques verticaux par combinaison des méthodes statistique, géostatistique et d'inversion". Thesis, Bordeaux 1, 2010. http://www.theses.fr/2010BOR14030/document.
Geoelectrical prospecting is commonly used in Morocco for hydrogeological reconnaissance. The purpose of this work is to propose new techniques for interpreting vertical electric soundings in a reduced time, and also to fully exploit a database of stored electrical soundings by producing, among other things, horizontal and vertical 2D images estimating the distribution of apparent electrical resistivity (geostatistical modeling, inversion, etc.). In order to characterize the study area (north-east of the Saïss Basin) electrically, a statistical analysis of the apparent resistivities of the vertical electric soundings was performed. This simple descriptive analysis is followed by a multivariate statistical analysis (principal component analysis, PCA, and ascending hierarchical classification, HAC). (...) The results of the statistical and geostatistical analyses, supplemented by the inversion of the average electric sounding per class, highlighted the reliability of these techniques for the interpretation of a large number of electrical soundings, compared with the usual method of inverting the electrical soundings one by one and correlating them afterwards to build the global structure of the studied area. With the techniques used in this work, very satisfactory results are obtained in a much reduced time for interpreting vertical electric soundings. The profiles studied and inverted using the RES2Dinv software all show the three structures defined previously (resistant – conductive – resistant); on the other hand, there are variations within a given formation. In addition, the spatial organization of the formations makes it possible to confirm the existence of faults coherent with the horst-and-graben structure of the basin.
Ben Youssef, Atef. "Contrôle de têtes parlantes par inversion acoustico-articulatoire pour l'apprentissage et la réhabilitation du langage". Phd thesis, Université de Grenoble, 2011. http://tel.archives-ouvertes.fr/tel-00721957.
El Amri, Mohamed. "Analyse d'incertitudes et de robustesse pour les modèles à entrées et sorties fonctionnelles". Thesis, Université Grenoble Alpes (ComUE), 2019. http://www.theses.fr/2019GREAM015.
This thesis deals with the inversion problem under uncertainty of expensive-to-evaluate functions, in the context of tuning the control unit of a vehicle depollution system. The effect of these uncertainties is taken into account through the expectation of the quantity of interest. The problem lies in the fact that the uncertainty is partly due to a functional variable only known through a given sample. We propose two approaches to solve the inversion problem; both are based on Gaussian process modelling of the expensive-to-evaluate functions and on a dimension reduction of the functional variable by the Karhunen-Loève expansion. The first methodology consists in applying a Stepwise Uncertainty Reduction (SUR) method to the expectation of the quantity of interest. At each evaluation point in the control space, the expectation is estimated by a greedy functional quantification method that provides a discrete representation of the functional variable and an effective sequential estimate from the given sample. The second approach consists in applying the SUR method directly to the quantity of interest in the joint space. Devoted to inversion under functional uncertainties, a strategy for enriching the experimental design that exploits the properties of Gaussian processes is proposed. These two approaches are compared on toy analytical examples and applied to an industrial case, an exhaust gas post-treatment system of a vehicle. The objective is to identify the set of control parameters that leads to meeting the pollutant emission norms under uncertainties on the driving cycle.
Văcar, Cornelia Paula. "Inversion for textured images : unsupervised myopic deconvolution, model selection, deconvolution-segmentation". Thesis, Bordeaux, 2014. http://www.theses.fr/2014BORD0131/document.
This thesis addresses a series of inverse problems of major importance in the field of image processing (image segmentation, model choice, parameter estimation, deconvolution) in the context of textured images. In all of the aforementioned problems the observations are indirect, i.e., the textured images are affected by blur and by noise. The contributions of this work belong to three main classes: modeling, methodological and algorithmic. From the modeling standpoint, the contribution consists in the development of a new non-Gaussian model for textures. The Fourier coefficients of the textured images are modeled by a Scale Mixture of Gaussians Random Field. The Power Spectral Density of the texture has a parametric form, driven by a set of parameters that encode the texture characteristics. The methodological contribution is threefold and consists in solving three image processing problems that have not been tackled so far in the context of indirect observations of textured images. All the proposed methods are Bayesian and are based on exploiting the information encoded in the a posteriori law. The first proposed method is devoted to the myopic deconvolution of a textured image and the estimation of its parameters. The second method achieves joint model selection and model parameter estimation from an indirect observation of a textured image. Finally, the third method addresses the problem of joint deconvolution and segmentation of an image composed of several textured regions, while estimating at the same time the parameters of each constituent texture. Last, but not least, the algorithmic contribution is represented by the development of a new efficient version of the Metropolis-Hastings algorithm, with a directional component of the proposal function based on the "Newton direction" and the Fisher information matrix.
This particular directional component allows for an efficient exploration of the parameter space and, consequently, increases the convergence speed of the algorithm. To summarize, this work presents a series of methods to solve three image processing problems in the context of blurry and noisy textured images. Moreover, we present two connected contributions, one regarding the texture models and one meant to enhance the performance of the samplers employed for all three methods.
Croix, Jean-Charles. "A new decomposition of Gaussian random elements in Banach spaces with application to Bayesian inversion". Thesis, Lyon, 2018. http://www.theses.fr/2018LYSEM019.
Inference is a fundamental task in science and engineering: it gives the opportunity to compare theory to experimental data. Since measurements are finite by nature whereas parameters are often functional, it is necessary to offset this loss of information with external constraints, by regularization. The obtained solution then realizes a trade-off between proximity to data on one side and regularity on the other. For more than fifteen years, these methods have incorporated probabilistic thinking, taking uncertainty into account. Regularization now consists in choosing a prior probability measure on the parameter space and making explicit a link between data and parameters, in order to deduce an update of the initial belief. This posterior distribution indicates how likely a set of parameters is, even in infinite dimension. In this thesis, the approximation of such methods is studied. Indeed, infinite-dimensional probability measures, while being attractive theoretically, often require a discretization to actually extract information (estimation, sampling). When they are Gaussian, the Karhunen-Loève decomposition gives a practical method to do so, and the main result of this thesis is to provide a generalization to Banach spaces, which is much more natural and less restrictive. The other contributions are related to its practical use within a real-world example.
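The Karhunen-Loève discretization mentioned in this abstract can be illustrated in finite dimension: eigendecompose the covariance operator on a grid and keep the leading modes. A sketch for Brownian motion on [0, 1], whose covariance C(s, t) = min(s, t) is classical (this is the standard Hilbert-space construction, not the Banach-space generalization that is the thesis's contribution):

```python
import numpy as np

# Discretize the covariance kernel C(s, t) = min(s, t) on a grid.
n = 200
t = np.linspace(1.0 / n, 1.0, n)
C = np.minimum.outer(t, t)

# Karhunen-Loève modes = eigenpairs of the covariance, sorted by
# decreasing eigenvalue; truncation keeps most of the variance.
w, V = np.linalg.eigh(C)
w, V = w[::-1], V[:, ::-1]

k = 10
captured = w[:k].sum() / w.sum()   # variance fraction kept by k modes
print(captured)
```

For Brownian motion the eigenvalues decay like 1/j^2, so ten modes already capture roughly 98% of the variance, which is why truncated KL expansions make infinite-dimensional Bayesian inversion computationally tractable.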
Ben, Youssef Atef. "Contrôle de têtes parlantes par inversion acoustico-articulatoire pour l’apprentissage et la réhabilitation du langage". Thesis, Grenoble, 2011. http://www.theses.fr/2011GRENT088/document.
Speech sounds may be complemented by displaying speech articulator shapes on a computer screen, hence producing augmented speech, a signal that is potentially useful in all instances where the sound itself might be difficult to understand, for physical or perceptual reasons. In this thesis, we introduce a system called visual articulatory feedback, in which the visible and hidden articulators of a talking head are controlled from the speaker's speech sound. The motivation of this research was to develop such a system that could be applied to Computer Aided Pronunciation Training (CAPT) for learning foreign languages, or in the domain of speech therapy. We have based our approach to this mapping problem on statistical models built from acoustic and articulatory data. In this thesis we have developed and evaluated two statistical learning methods trained on parallel synchronous acoustic and articulatory data recorded from a French speaker by means of an electromagnetic articulograph. Our hidden Markov model (HMM) approach combines HMM-based acoustic recognition and HMM-based articulatory synthesis techniques to estimate the articulatory trajectories from the acoustic signal. Gaussian mixture models (GMMs) estimate articulatory features directly from the acoustic ones. We have based our evaluation of the improvements brought to these models on several criteria: the root mean square error between the original and recovered EMA coordinates, the Pearson product-moment correlation coefficient, displays of the articulatory spaces and articulatory trajectories, as well as some acoustic or articulatory recognition rates. Experiments indicate that the use of state tying and multiple Gaussians per state in the acoustic HMM improves the recognition stage, and that updating the articulatory HMM parameters with the minimum generation error (MGE) criterion results in a more accurate inversion than conventional maximum likelihood estimation (MLE) training.
In addition, the GMM mapping using the MLE criterion is more efficient than using the minimum mean square error (MMSE) criterion. In conclusion, we have found that the HMM inversion system has greater accuracy than the GMM one. Besides, experiments using the same statistical methods and data have shown that the face-to-tongue inversion problem, i.e. predicting tongue shapes from face and lip shapes, cannot be solved in a general way, and that it is impossible for some phonetic classes. In order to extend our single-speaker system to a multi-speaker speech inversion system, we have implemented a speaker adaptation method based on maximum likelihood linear regression (MLLR). In MLLR, a linear regression-based transform that adapts the original acoustic HMMs to those of the new speaker is calculated so as to maximise the likelihood of the adaptation data. This speaker adaptation stage has been evaluated using an articulatory phonetic recognition system, as no original articulatory data are available for the new speakers. Finally, using this adaptation procedure, we have developed a complete articulatory feedback demonstrator that can work for any speaker. This system should be assessed by perceptual tests in realistic conditions.
Bruned, Vianney. "Analyse statistique et interprétation automatique de données diagraphiques pétrolières différées à l’aide du calcul haute performance". Thesis, Montpellier, 2018. http://www.theses.fr/2018MONTS064.
In this thesis, we investigate the automation of the identification and characterization of geological strata using well logs. For a single well, geological strata are determined through the segmentation of the logs, which are comparable to multivariate time series. The identification of strata across different wells from the same field requires correlation methods for time series. We propose a new global method of well correlation using multiple sequence alignment algorithms from bioinformatics. The determination of the mineralogical composition and the percentage of fluids inside a geological stratum results in an ill-posed inverse problem. Current methods are based on experts' choices: the selection of a subset of minerals for a given stratum. Because the likelihood of the model cannot be computed, an approximate Bayesian computation (ABC) method, assisted by a density-based clustering algorithm, is used to characterize the mineral composition of the geological layer. The classification step is necessary to deal with the identifiability issue of the minerals. Finally, the workflow is tested on a case study.
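The ABC idea invoked in this abstract can be sketched with the simplest rejection variant: draw parameters from the prior, simulate summaries, and keep draws whose simulated summaries fall close to the observed one. A generic toy illustration (Gaussian mean estimation; the thesis applies ABC to an intractable mineralogical model, not to this example):

```python
import numpy as np

rng = np.random.default_rng(4)

# "Observed" data: summary statistic (mean of 50 points) at unknown theta.
theta_true = 3.0
observed = rng.normal(theta_true, 1.0, size=50).mean()

# Rejection ABC: prior draws -> simulated summaries -> keep close ones.
prior_draws = rng.uniform(0.0, 10.0, size=20000)
simulated = rng.normal(prior_draws, 1.0 / np.sqrt(50))  # mean of 50 obs
accepted = prior_draws[np.abs(simulated - observed) < 0.05]

print(accepted.mean(), accepted.size)
```

The accepted draws approximate the posterior without ever evaluating a likelihood; the tolerance (here 0.05) trades off accuracy against acceptance rate, which is why practical ABC pipelines add summaries and post-processing such as the clustering step described above.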
Schreiber, Floriane. "Estimation des conditions océanographiques par inversion de données issues d'un radar imageur non calibré". Electronic Thesis or Diss., Toulon, 2020. http://www.theses.fr/2020TOUL0016.
Many empirical models describing the statistical distribution of sea clutter exist, but they do not directly depend on the sea state and are therefore not suitable for performing inversion. To model the statistical distribution of the backscattered intensity, we use a two-scale model (TSM), which is linked to the sea state via the mss (mean square slope). This model makes it possible to retrieve the NRCS but does not perfectly describe the sea clutter distribution simultaneously in the two direct polarization channels. This is due to an overestimation of the Bragg polarization ratio (PR).
Robelin, David. "Détection de courts segments inversés dans les génomes - méthodes et applications". Phd thesis, Université Paris Sud - Paris XI, 2005. http://tel.archives-ouvertes.fr/tel-00010628.
Texto completo da fonteSanchez, Reyes Hugo Samuel. "Inversion cinématique progressive linéaire de la source sismique et ses perspectives dans la quantification des incertitudes associées". Thesis, Université Grenoble Alpes (ComUE), 2019. http://www.theses.fr/2019GREAU026/document.
Earthquake characterization is a fundamental research field in seismology, whose final goal is to provide accurate estimations of earthquake attributes. In this field, various questions may arise, such as the following: when and where did an earthquake happen? How large was it? What is its evolution in space and time? In addition, more challenging questions can be addressed, such as: why did it occur? What is the next one in a given area? In order to make progress on the first list of questions, a physical description, or model, of the event is necessary. The investigation of such a model (or image) is the scientific topic I address during my PhD, in the framework of kinematic source models. Understanding the seismic source as a dislocation propagating across a given geometry of an active fault, kinematic source models are the physical representations of the time and space history of such a rupture propagation. This representation is said to be kinematic because the inferred rupture histories are obtained without taking into account the forces that might cause the origin of the dislocation. In this PhD dissertation, I present a new hierarchical time kinematic source inversion method able to assimilate data traces through evolutive time windows. A linear time-domain formulation relates the slip-rate function and the seismograms, preserving the positivity of this function and the causality when spanning the model space: taking advantage of the time-space sparsity of the rupture model evolution is as essential as considering the causality between the rupture and each record, delayed by the known propagator operator, which is different for each station. This progressive approach, both on the data space and on the model space, requires only mild assumptions on prior slip-rate functions or preconditioning strategies on the local slip-rate gradient estimations.
These assumptions are based on simple, physically expected rupture models. Successful applications of this method to a well-known benchmark (Source Inversion Validation Exercise 1) and to the recorded data of the 2016 Kumamoto mainshock (Mw=7.0) illustrate the advantages of this alternative approach to linear kinematic source inversion. The underlying target of this new formulation is the future quantification of uncertainties in such model reconstructions. In order to achieve this goal, as well as to highlight key properties of this linear time-domain approach, I explore the Hamiltonian Monte Carlo (HMC) stochastic Bayesian framework, which appears to be one of the most promising strategies that can be applied to this stabilized, over-parametrized optimization of a linear forward problem in order to assess the uncertainties of kinematic source inversions. The HMC technique proves to be compatible with the linear time-domain strategy presented here. Thanks to an efficient estimation of the local gradient of the misfit function, this technique is able to rapidly explore the high-dimensional space of probable solutions, while the linearity between unknowns and observables is preserved. In this work, I investigate the performance of the HMC strategy on simple synthetic cases with almost perfect illumination, in order to provide a better understanding of the concepts involved and of the tuning required to achieve a correct exploration of the model space. The results of this preliminary investigation are promising and open a new way of tackling the kinematic source reconstruction problem and the assessment of the associated uncertainties
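The HMC machinery described in this abstract can be sketched for a toy linear forward problem (our illustration, not the thesis code: the operator, noise level, and tuning values below are assumptions). A leapfrog trajectory followed by a Metropolis test samples the posterior of a linear model d = G m + noise, using the cheap analytic gradient of the misfit:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear forward problem d = G m + noise. In the thesis, m would hold
# discretized slip-rate values and G the known propagator operator; here
# both are synthetic stand-ins.
n_m, n_d = 4, 20
G = rng.normal(size=(n_d, n_m))
m_true = np.array([1.0, 0.5, -0.3, 0.8])
sigma = 0.05
d_obs = G @ m_true + sigma * rng.normal(size=n_d)

def neg_log_post(m):          # Gaussian misfit (flat prior, for simplicity)
    r = G @ m - d_obs
    return 0.5 * (r @ r) / sigma**2

def grad(m):                  # its gradient, cheap for a linear model
    return G.T @ (G @ m - d_obs) / sigma**2

def hmc_step(m, eps=0.005, n_leap=20):
    """One HMC step: leapfrog trajectory followed by a Metropolis test."""
    p = rng.normal(size=m.size)
    m_new, p_new = m.copy(), p.copy()
    p_new -= 0.5 * eps * grad(m_new)
    for _ in range(n_leap - 1):
        m_new += eps * p_new
        p_new -= eps * grad(m_new)
    m_new += eps * p_new
    p_new -= 0.5 * eps * grad(m_new)
    h_old = neg_log_post(m) + 0.5 * (p @ p)
    h_new = neg_log_post(m_new) + 0.5 * (p_new @ p_new)
    return (m_new, True) if np.log(rng.uniform()) < h_old - h_new else (m, False)

m, samples = np.zeros(n_m), []
for it in range(2000):
    m, _ = hmc_step(m)
    if it >= 500:             # discard burn-in
        samples.append(m.copy())
post_mean = np.mean(samples, axis=0)
post_std = np.std(samples, axis=0)   # uncertainty estimate per parameter
```

The posterior spread (`post_std`) is the uncertainty quantification the abstract aims at; for a real kinematic inversion the step size and trajectory length would need careful tuning, as the abstract notes.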
Bossy, Thomas. "Impact-defined climate targets : estimating ensembles of pathways of compatible anthropogenic drivers through inversion of the cause-effect chain". Electronic Thesis or Diss., université Paris-Saclay, 2023. http://www.theses.fr/2023UPASJ022.
This dissertation presents a multidisciplinary approach to climate change research. It explores the limitations of the current scenario-building framework used by the Intergovernmental Panel on Climate Change (IPCC) and presents new strategies for better understanding climate futures. Using Pathfinder, a simple model focused on climate and the carbon cycle, this research fills a gap in the range of existing simple climate models by incorporating the latest data and providing a backward, temperature-driven examination of climate change scenarios. Prospects for improvement are then identified by discussing the representation of the ocean in Pathfinder, focusing on the Ocean Heat-Carbon Nexus and its critical role in the global carbon cycle and in the response of Earth's climate to cumulative CO2 emissions. A comparison is made between the representations of the Ocean Heat-Carbon Nexus in Pathfinder and in state-of-the-art Earth system models, highlighting significant discrepancies and their potential implications for future warming scenarios. After introducing Pathfinder, my research first examines the CO2 emission reductions physically required to meet the 1.5 °C global warming target, emphasizing the importance of CO2 emissions from land use and of non-CO2 forcing. We then reverse the causal chain to link environmental impacts to anthropogenic activities, which is a unique approach. The study maps the spaces of anthropogenic activities compatible with planetary boundaries and introduces a modeling framework that accounts for global warming, ocean acidification, sea level rise, and Arctic sea ice melt. Furthermore, this thesis examines the role of Integrated Assessment Models (IAMs) in understanding the costs associated with these climate scenarios.
It explores the impact of conceptual choices in these models on the identification of robust mitigation pathways and examines the effects of physical uncertainty and of intergenerational equity. This manuscript concludes with an appreciation of the key contributions of my doctoral research to climate change modeling, an exploration of new frontiers and opportunities in the field, and personal insights into the research journey. Overall, this research represents a unique, innovative approach to climate change modeling that will hopefully provide practical tools for assessing and developing mitigation strategies
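The backward, temperature-driven idea can be condensed into a deliberately minimal sketch (ours, not Pathfinder itself; the pathway shape and the TCRE value are illustrative assumptions): prescribe a warming pathway, then invert a linear warming-to-cumulative-emissions relationship to recover the compatible emission pathway.

```python
import numpy as np

# Prescribed warming pathway: rise from 1.2 °C in 2020 and stabilize near 1.5 °C
years = np.arange(2020, 2101)
T = 1.2 + 0.3 * (1.0 - np.exp(-(years - 2020) / 30.0))

# Invert the (highly simplified) cause-effect chain: warming since 2020 is
# taken proportional to cumulative CO2 emissions via an illustrative TCRE.
TCRE = 0.45e-3                                  # °C per GtCO2 (assumed value)
cum_emissions = (T - T[0]) / TCRE               # compatible cumulative GtCO2
emissions = np.gradient(cum_emissions, years)   # compatible annual GtCO2/yr
```

Running the chain backward like this yields an emission pathway consistent with the prescribed climate target by construction, which is the point of an impact-defined scenario.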
Kacimi, Sahra. "Contribution à la restitution des précipitations tropicales par radiométrie micro-ondes : préparation à la mission Megha-Tropiques". Versailles-St Quentin en Yvelines, 2012. http://www.theses.fr/2012VERS0064.
In the context of global warming, the analysis of the water and energy budget is of major importance. For the Megha-Tropiques (MT) mission, one of whose scientific objectives is to improve knowledge of the water and energy cycle in the intertropical region, the estimation of instantaneous surface rainfall is of great importance. My PhD work focuses on the optimization of a multi-platform Bayesian retrieval algorithm called BRAIN (Bayesian Retrieval Algorithm Including Neural Networks) (Viltard et al., 2006), used for MT. This algorithm uses passive microwave data from satellites such as TRMM, SSM/I and AQUA, and relies on a Bayesian Monte Carlo approach to retrieve several atmospheric parameters, such as the instantaneous rainfall rate. In order to obtain a more accurate rainfall retrieval, two research axes were investigated: the detection of a priori rainy areas, which takes place before the rainfall estimation itself, and the impact of the database and inversion parameters. First, the database on which the algorithm relies needs to be more representative, especially as far as high rain rates are concerned. To improve the representativeness of the inversion database, we first need to eliminate repetitive profiles, that is to say, extract prototypes from it. To do so, we use the Self-Organizing Maps (SOMs) developed by T. Kohonen (2001). Second, the classification of rainy and non-rainy pixels before the inversion was improved using neural networks
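The Bayesian Monte Carlo retrieval principle underlying BRAIN can be sketched in a few lines (a toy illustration with fully synthetic numbers, not the actual algorithm or database): each database profile is weighted by the likelihood of the observed brightness temperatures, and the rain rate is retrieved as the posterior-weighted mean.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical a priori database: each entry pairs simulated brightness
# temperatures (K, 3 channels) with a surface rain rate (mm/h). The linear
# Tb-rain relation and all values are synthetic stand-ins.
n_db, n_chan = 5000, 3
rain_db = rng.gamma(2.0, 2.0, size=n_db)
tb_db = 250.0 - 4.0 * rain_db[:, None] + rng.normal(0.0, 1.0, (n_db, n_chan))

def retrieve(tb_obs, sigma_obs=1.5):
    """Bayesian Monte Carlo estimate: likelihood-weighted mean over the database."""
    d2 = np.sum((tb_db - tb_obs) ** 2, axis=1) / sigma_obs**2
    w = np.exp(-0.5 * (d2 - d2.min()))   # stabilized Gaussian weights
    return np.sum(w * rain_db) / np.sum(w)

# Simulate an observation from a known rain rate, then retrieve it
true_rain = 5.0
tb_obs = 250.0 - 4.0 * true_rain + rng.normal(0.0, 1.0, n_chan)
est = retrieve(tb_obs)
```

The quality of `est` hinges on how well the database covers the observed situation, which is why the abstract stresses database representativeness, especially for high rain rates.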
Herig, Coimbra Pedro Henrique. "A whirlwind journey of wavelet turbulent flux mapping : estimation of spatialized turbulent flux on tall towers and its uncertainties". Electronic Thesis or Diss., université Paris-Saclay, 2024. http://www.theses.fr/2024UPASJ011.
Climate and human activity are closely linked. Greenhouse gas (GHG) emissions impact climate dynamics and air quality, affecting millions globally. Effective GHG monitoring is essential for informed policy decisions, yet it is complex due to the spatial and temporal variability of sources and sinks, and to atmospheric transport. Monitoring networks address this variability by deploying sensors across diverse geographic locations, sampling continuously over time. Urban areas are key emission points, driving climate change. However, monitoring direct GHG changes over areas larger than 5 km2 with varied sources and vegetated areas lacks a standard method. Eddy Covariance (EC) offers direct, continuous monitoring of net GHG fluxes. Wavelet-based EC operates on the same principles as the standard method but calculates the covariance using frequency-decomposed time series. This approach does not require stationarity, leaving more data available for analysis, which is particularly beneficial in complex urban environments where non-stationary fluxes are common. Disentangling the anthropogenic and biogenic components of a net CO2 flux is recognised as a key issue yet to be resolved in urbanised areas. Conventional ecosystem models used to partition gross primary productivity (GPP) and ecosystem respiration (Reco) are not appropriate for urbanised areas. Direct partitioning using high-frequency correlations between tracer gases may help overcome the limitations of standard partitioning methods. While Eddy Covariance remains standard for local studies, estimating larger-scale surface fluxes often involves assimilating background concentration measurements into prior estimations using transport models. Progress in satellite imagery and detailed inventories provides a new basis that helps improve these methods.
However, inversion methods using tower flux data are still sparse and would be interesting to test in urbanised areas. The objective of this PhD was to evaluate wavelet-based EC combined with Bayesian inversion methods for CO2 flux mapping. During the course of the PhD, I developed a new direct partitioning method that was used, in combination with CH4 and CO, to improve the overall inversion in the suburban area of the Saclay plateau. In the first paper of the PhD, we hypothesised that decomposing the concentration and wind signals by frequency can capture individual gusts within each frequency, typically mixed in the original signal. We leveraged this feature to propose a new parameter-free direct partitioning method based on quadrant analysis of the frequency-decomposed CO2 and water vapour fluxes. We showed that this method can indeed provide unbiased estimates of GPP and Reco at a crop site and a forest ecosystem site near Paris. We also found that wavelet eddy covariance saved up to 30% of otherwise non-stationary data at these sites. In the second paper, we proposed using tall towers equipped with high-precision but slow analysers for measuring fluxes. Despite slower acquisition frequencies, attenuation was limited to 20% thanks to the lower contribution of high frequencies at this height. The results encourage further collaboration between atmospheric and ecosystem networks for in-situ measurements. In the third paper, we combined the partitioning method proposed in the first paper with the fluxes from the second paper, now including more measured gases, to partition CO2 fluxes into biogenic and anthropogenic components and assimilate them into previous spatially explicit estimations of fluxes at the few-km2 scale.
The obtained flux maps offer the advantage of relying on direct flux measurements at the landscape scale and may be used to inform large-scale inversions. The results, focused on the Paris region, provide valuable insights for flux measurements at the landscape scale and beyond, and contribute to emission monitoring strategies. These advancements contribute to understanding and addressing environmental challenges at the temporal and spatial scales where decisions are made
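The core of wavelet-based eddy covariance can be illustrated with a minimal Haar-transform sketch (our illustration with synthetic signals, not the thesis implementation): because the transform is orthonormal, the covariance of two fluctuation series equals the sum over scales of the cross-products of their wavelet coefficients, giving a scale-resolved flux estimate.

```python
import numpy as np

def haar_dwt(x):
    """Orthonormal Haar transform: detail coefficients per scale, then the
    final approximation coefficient (x length must be a power of two)."""
    coeffs, a = [], x.copy()
    while a.size > 1:
        coeffs.append((a[0::2] - a[1::2]) / np.sqrt(2.0))
        a = (a[0::2] + a[1::2]) / np.sqrt(2.0)
    coeffs.append(a)
    return coeffs

rng = np.random.default_rng(4)
n = 2**12
w = rng.normal(size=n)                        # synthetic vertical wind fluctuations
c = 0.3 * w + rng.normal(scale=0.5, size=n)   # scalar concentration, correlated with w

wp, cp = w - w.mean(), c - c.mean()
dw, dc = haar_dwt(wp), haar_dwt(cp)

# Flux contribution of each scale; by orthonormality their sum recovers the covariance
flux_per_scale = np.array([np.sum(a * b) / n for a, b in zip(dw, dc)])
wavelet_flux = flux_per_scale.sum()
ec_flux = np.mean(wp * cp)                    # standard eddy-covariance estimate
```

`flux_per_scale` is what makes the approach attractive: individual scales (gust sizes) can be inspected or filtered, and the covariance can be tracked locally in time without a stationarity assumption.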
Fetel, Emmanuel. "Quantification des incertitudes liées aux simulations d'écoulement dans un réservoir pétrolier à l'aide de surfaces de réponse non linéaires". Thesis, Vandoeuvre-les-Nancy, INPL, 2007. http://www.theses.fr/2007INPL005N/document.
This work develops several methodologies to assess uncertainty in oil reservoir production. Considering that flow simulations are time-consuming, it uses response surfaces to approximate the relationship between uncertain reservoir parameters and production variables. First, deterministic methods are developed to construct and analyze such surfaces: the construction algorithms are discrete smooth interpolation and dual kriging, while the analysis methods are variance-based sensitivity analysis and Bayesian inversion of the production history. This framework is then extended to include fast flow simulation results and to handle stochastic parameters characterized by a random effect on reservoir production. These methodologies are developed based on the following considerations: the number of uncertain parameters is usually large, the data may be noisy if some parameters are neglected, and the relationship between uncertain parameters and reservoir production may be complex
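The response-surface idea can be sketched in a few lines (our illustration, not the thesis code: the analytic "simulator", the kernel, and the parameter ranges are assumptions). A kriging-style interpolating surface is fitted to a small design of expensive flow-simulation results, and uncertainty is then propagated by cheap Monte Carlo on the surface:

```python
import numpy as np

def expensive_simulator(X):
    # Stand-in for a costly reservoir flow simulation (analytic, for illustration)
    return np.sin(3.0 * X[:, 0]) + 0.5 * X[:, 1] ** 2

# "Run" the simulator on a small design of uncertain-parameter values
g = np.linspace(-1.0, 1.0, 7)
X = np.array([(a, b) for a in g for b in g])      # 49 design points
y = expensive_simulator(X)

def kernel(A, B, ell=0.4):
    """Gaussian covariance, the building block of the kriging surface."""
    d2 = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * d2 / ell**2)

# Dual-kriging-style weights: solve K lam = y (small jitter for conditioning)
lam = np.linalg.solve(kernel(X, X) + 1e-8 * np.eye(len(X)), y)

def surrogate(Xq):
    return kernel(Xq, X) @ lam

# Cheap Monte Carlo on the surrogate: production uncertainty percentiles
rng = np.random.default_rng(5)
Xmc = rng.uniform(-1.0, 1.0, size=(20000, 2))
p10, p50, p90 = np.percentile(surrogate(Xmc), [10, 50, 90])
```

Twenty thousand surrogate evaluations cost almost nothing, whereas twenty thousand flow simulations would be infeasible; the same surface can also feed variance-based sensitivity analysis or a Bayesian history match, as in the thesis.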
Lecaignard, Françoise. "Predictive coding in auditory processing : insights from advanced modeling of EEG and MEG mismatch responses". Thesis, Lyon, 2016. http://www.theses.fr/2016LYSE1160/document.
This thesis aims at testing the predictive coding account of auditory perception. This framework rests on precision-weighted prediction errors, elicited by unexpected sounds, that propagate along a hierarchical organization in order to keep the brain adapted to a varying acoustic environment. Using the mismatch negativity (MMN), a brain response to unexpected stimuli (deviants) that could reflect such errors, we could address the computational and neurophysiological underpinnings of predictive coding. Precisely, we manipulated the predictability of deviants and applied computational learning models and dynamic causal models (DCM) to electrophysiological responses (EEG, MEG) measured simultaneously. Deviant predictability was found to modulate deviance responses, a result supporting their interpretation as prediction errors. Such an effect might involve the (high-level) implicit learning of sound sequence regularities, which would in turn influence auditory processing at lower hierarchical levels. Computational modeling revealed the perceptual learning of sounds, resting on temporal integration and exhibiting differences induced by our predictability manipulation. In addition, DCM analysis indicated predictability-related changes in the synaptic connectivity established by deviance processing. These results conform to the predictions of predictive coding regarding both deviance processing and its modulation by deviant predictability, and strongly support a perceptual learning of auditory regularities achieved within an auditory hierarchy. Our findings also highlight the power of this mechanistic framework for elaborating and testing new hypotheses, improving our understanding of auditory processing
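The notion of precision-weighted prediction errors driving the learning of sequence regularities can be condensed into a toy delta-rule sketch (our illustration, not the thesis models): an agent tracks the deviant probability of a tone sequence, and after learning, deviant tones elicit larger prediction errors than standards, the pattern the MMN is taken to index.

```python
import numpy as np

rng = np.random.default_rng(6)
p_deviant = 0.15
tones = (rng.uniform(size=500) < p_deviant).astype(float)   # 1 = deviant tone

mu = 0.5                    # initial belief about the deviant probability
lr = 0.05                   # fixed learning rate standing in for precision weighting
errors = np.empty_like(tones)
for i, t in enumerate(tones):
    errors[i] = t - mu      # prediction error for this tone
    mu += lr * errors[i]    # precision-weighted belief update

# After learning, unexpected (deviant) tones carry larger absolute errors
mean_abs_dev = np.abs(errors[tones == 1]).mean()
mean_abs_std = np.abs(errors[tones == 0]).mean()
```

In richer models the learning rate itself is adapted (the precision weighting proper), which is the kind of mechanism the thesis probes with its computational learning models.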
Kasraoui, Anisse. "Études combinatoires sur les permutations et partitions d'ensemble". PhD thesis, Université Claude Bernard - Lyon I, 2009. http://tel.archives-ouvertes.fr/tel-00393631.