Dissertations / Theses on the topic 'Probabilités – Prévision'
Consult the top 35 dissertations / theses for your research on the topic 'Probabilités – Prévision.'
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.
Houdant, Benoît. "Contribution à l'amélioration de la prévision hydrométéorologique opérationnelle : pour l'usage des probabilités dans la communication entre acteurs." Phd thesis, ENGREF (AgroParisTech), 2004. http://pastel.archives-ouvertes.fr/pastel-00000925.
Akil, Nicolas. "Etude des incertitudes des modèles neuronaux sur la prévision hydrogéologique. Application à des bassins versants de typologies différentes." Electronic Thesis or Diss., IMT Mines Alès, 2021. http://www.theses.fr/2021EMAL0005.
Floods and droughts are the two main natural risks in France and require special attention. As climate change generates increasingly frequent extreme events, modeling these risks is essential for water resource management. Currently, discharges and water heights are mainly predicted with physically based or conceptual models. Although efficient and necessary, the calibration and implementation of these models require long and costly studies. Hydrogeological forecasting models often use data from incomplete or poorly dimensioned measurement networks, and the behavior of the studied basins is in most cases difficult to understand, which makes it hard to estimate the uncertainties associated with hydrogeological modeling. In this context, this thesis, supported by IMT Mines Alès and financed by the company aQuasys and the ANRT, aims to develop models based on the systemic paradigm. These models require only basic knowledge of the physical characteristics of the studied basin and can be calibrated from input and output information alone (rainfall and discharge/height). Neural networks, among the most widely used models in environmental science, are used in this project. The thesis addresses three main goals: (1) development of a model design method adapted to different variables (surface water flow/height, groundwater height) and to very different types of basins, from watersheds to hydrogeological basins; (2) evaluation of the uncertainties associated with these models in relation to the types of basins targeted; (3) reduction of these uncertainties. Several basins are used to address these issues: the Blavet basin in Brittany and the basin of the Southern and Central Champagne Chalk aquifer.
Carraro, Laurent. "Questions de prédiction pour le mouvement brownien et le processus de Wiener à plusieurs paramètres." Lyon 1, 1985. http://www.theses.fr/1985LYO11660.
Roulin, Emmanuel. "Medium-range probabilistic river streamflow predictions." Doctoral thesis, Universite Libre de Bruxelles, 2014. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/209270.
The research began by analyzing medium-range meteorological predictions (up to 10-15 days) and their use in hydrological forecasting. Precipitation forecasts from the ensemble prediction system of the European Centre for Medium-Range Weather Forecasts (ECMWF) were used. A semi-distributed hydrological model transformed these precipitation forecasts into ensemble streamflow predictions, whose performance was analyzed in probabilistic terms. A simple decision model also made it possible to compare the relative economic value of hydrological ensemble predictions and some deterministic alternatives.
Numerical weather prediction models are imperfect. The ensemble forecasts are therefore affected by errors implying the presence of biases and the unreliability of probabilities derived from the ensembles. By comparing the results of these predictions to the corresponding observed data, a statistical model for the correction of forecasts, known as post-processing, has been adapted and shown to improve the performance of probabilistic forecasts of precipitation. This approach is based on retrospective forecasts made by the ECMWF for the past twenty years, providing a sufficient statistical sample.
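The reforecast-based post-processing described above can be sketched, in a deliberately simplified form, as a regression correction fitted on past (forecast, observation) pairs; the linear form and the function names below are illustrative assumptions, not the statistical model actually used in the thesis.

```python
import numpy as np

def fit_bias_correction(reforecast_means, observations):
    """Least-squares fit of y = a + b * m on reforecast/observation pairs.

    A minimal stand-in for statistical post-processing: the archive of
    retrospective forecasts provides the training pairs.
    """
    m = np.asarray(reforecast_means, dtype=float)
    y = np.asarray(observations, dtype=float)
    A = np.column_stack([np.ones_like(m), m])   # design matrix [1, m]
    a, b = np.linalg.lstsq(A, y, rcond=None)[0]
    return a, b

def correct_ensemble(members, a, b):
    """Apply the fitted correction to every member of a new ensemble."""
    return a + b * np.asarray(members, dtype=float)

# Synthetic reforecast archive with a known bias: obs = 1 + 2 * forecast
m_past = np.linspace(0.0, 10.0, 50)
y_past = 1.0 + 2.0 * m_past
a, b = fit_bias_correction(m_past, y_past)
corrected = correct_ensemble([3.0, 4.0, 5.0], a, b)
```

Real post-processors are distributional rather than a simple member shift, but the workflow — fit on reforecasts, apply to current ensemble — is the same.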
Besides the errors related to meteorological forcing, hydrological forecasts also display errors related to initial conditions and to modeling errors (errors in the structure of the hydrological model and in the parameter values). The last stage of the research was therefore to investigate, using simple models, the impact of these different sources of error on the quality of hydrological predictions and to explore the possibility of using hydrological reforecasts for post-processing, themselves based on retrospective precipitation forecasts.
River streamflow forecasting is traditionally based on real-time measurements of precipitation over the catchments and of discharge at the outlet and upstream. These data are processed by mathematical models of varying complexity and yield accurate forecasts at short lead times. To extend the forecast horizon to a few days, so as to be able to issue early warnings, meteorological forecasts must be taken into account. However, their dynamics are by nature sensitive to errors in the initial conditions; consequently, for appropriate risk management, forecasts must be considered in probabilistic terms. Currently, ensemble forecasts are produced with a numerical weather prediction model run from perturbed initial conditions, which makes it possible to assess the uncertainty.
Doctorate in Sciences
Horrigue, Walid. "Prévision non paramétrique dans les modèles de censure via l'estimation du quantile conditionnel en dimension infinie." Thesis, Littoral, 2012. http://www.theses.fr/2012DUNK0511.
In this thesis, we study some asymptotic properties of conditional functional parameters in a nonparametric setting, when the explanatory variable takes its values in an infinite-dimensional space. In this setting, we consider estimators of the usual functional parameters, such as the conditional law, the conditional probability density and the conditional quantile. We are essentially interested in the problem of forecasting in nonparametric conditional models when the data are functional random variables. First, we propose an estimator of the conditional quantile and establish its uniform strong convergence, with rates, over a compact subset. Following the convention of biomedical studies, we consider an identically distributed sequence {Ti, i ≥ 1} with common density f, right-censored by a sequence {Ci, i ≥ 1} that is also assumed independent and identically distributed and independent of {Ti, i ≥ 1}. Our study focuses on dependent data, with the covariate X taking values in an infinite-dimensional space. In a second step, we establish the asymptotic normality of the kernel estimator of the conditional quantile, under an α-mixing assumption and concentration properties of the probability measure of the functional regressors on small balls. Several applications to particular cases are also given.
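As an illustration of the basic idea (not the thesis' estimator: the functional covariate is replaced by a finite-dimensional one and censoring is ignored), a kernel conditional quantile can be obtained by inverting a Nadaraya-Watson estimate of the conditional distribution function:

```python
import numpy as np

def conditional_quantile(x0, X, Y, alpha=0.5, h=1.0):
    """Kernel estimate of the alpha-quantile of Y given X = x0.

    X: (n, d) covariates (a finite-dimensional stand-in for functional
    regressors), Y: (n,) responses, h: bandwidth.
    """
    # Gaussian kernel weights from distances to x0
    d = np.linalg.norm(X - x0, axis=1)
    w = np.exp(-0.5 * (d / h) ** 2)
    w = w / w.sum()
    # Weighted empirical conditional CDF, inverted at level alpha
    order = np.argsort(Y)
    cdf = np.cumsum(w[order])
    idx = np.searchsorted(cdf, alpha)
    return Y[order][min(idx, len(Y) - 1)]

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
Y = X[:, 0] + 0.1 * rng.normal(size=500)
# Conditional median at a point where the true median is close to 1
q = conditional_quantile(np.array([1.0, 0.0, 0.0]), X, Y, alpha=0.5, h=0.5)
```

The censored case studied in the thesis replaces the indicator weights by Kaplan-Meier-type weights; that refinement is omitted here.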
Candille, Guillem. "Validation des systèmes de prévisions météorologiques probabilistes." Paris 6, 2003. http://www.theses.fr/2003PA066511.
Smadi, Charline. "Modèles probabilistes de populations : branchement avec catastrophes et signature génétique de la sélection." Thesis, Paris Est, 2015. http://www.theses.fr/2015PESC1035/document.
This thesis is devoted to the probabilistic study of the demographic and genetic responses of a population to isolated events. In a first part, we are interested in the effect of random catastrophes, which kill a fraction of the population and occur repeatedly, in populations modeled by branching processes. We first construct a new class of processes, continuous-state branching processes with catastrophes, as the unique strong solution of a stochastic differential equation. We then describe the conditions for population extinction. Finally, in the case of almost sure absorption, we state the asymptotic rate of absorption. This last result has a direct application to determining the number of infected cells in a model of cell infection by parasites: in this model, the parasite population size in a cell lineage follows a branching process, and catastrophes correspond to the sharing of parasites between the two daughter cells at division. In a second part, we focus on the genetic signature of selective sweeps. The genetic material of an individual (mostly) determines its phenotype, in particular quantitative traits such as birth and intrinsic death rates and interactions with other individuals. But the genotype is not sufficient to determine "adaptation" in a given environment: for example, the life expectancy of a human being depends heavily on the environment (access to drinking water, to medical infrastructure, ...). The eco-evolutionary approach aims at taking the environment into account by modeling interactions between individuals. When a mutation or an environmental modification occurs, some alleles can invade the population to the detriment of other alleles: this phenomenon is called a selective sweep, and it leaves signatures in the neutral diversity in the vicinity of the locus where the allele fixates. Indeed, the latter "hitchhikes" alleles situated on loci linked to the selected locus.
The only possibility for an allele to escape this hitchhiking is the occurrence of a genetic recombination, which associates it with another haplotype in the population. We quantify the signature left by such a selective sweep on the neutral diversity. We first focus on the variation of neutral proportions at loci partially linked with the selected locus, under different selective sweep scenarios. We prove that these scenarios leave distinct signatures on neutral diversity, which can make it possible to discriminate between them. We then focus on the linked genealogies of two neutral alleles situated in the vicinity of the selected locus. In particular, we quantify, under different selective sweep scenarios, some statistics that are currently used to detect recent selective events in population genetic data. In these works, the population evolves as a multitype birth-and-death process with competition. While such a model is more realistic than branching processes, the non-linearity caused by competition makes its study more complex.
Cifonelli, Antonio. "Probabilistic exponential smoothing for explainable AI in the supply chain domain." Electronic Thesis or Diss., Normandie, 2023. http://www.theses.fr/2023NORMIR41.
The key role that AI could play in improving business operations has long been known, but the penetration of this new technology into companies has encountered certain obstacles, in particular implementation costs: on average, it takes 2.8 years from supplier selection to full deployment of a new solution. There are three fundamental points to consider when developing a new model: misalignment of expectations, the need for understanding and explanation, and performance and reliability issues. For models dealing with supply chain data, there are five additional specific issues:
- Managing uncertainty. Precision is not everything. Decision-makers are looking for a way to minimise the risk associated with each decision they have to make in the presence of uncertainty. Obtaining an exact forecast is advantageous; obtaining a fairly accurate forecast together with its limits is realistic and appropriate.
- Handling integer and positive data. Most items sold in retail cannot be sold in subunits. This simple aspect of selling results in a constraint that the output of any method or model must satisfy: the result must be a positive integer.
- Observability. Customer demand cannot be measured directly; only sales can be recorded and used as a proxy.
- Scarcity and parsimony. Sales are a discontinuous quantity. Recording sales by day condenses an entire year into just 365 points, and a large proportion of them will be zero.
- Just-in-time optimisation. Forecasting is a key function, but it is only one element in a chain of processes supporting decision-making. Time is a precious resource that cannot be devoted entirely to a single function.
The decision-making process and associated adaptations must therefore be carried out within a limited time frame, and flexibly enough to be interrupted and restarted if necessary, in order to incorporate unexpected events or adjustments. This thesis fits into this context and is the result of work carried out at Lokad, a Paris-based software company aiming to bridge the gap between technology and the supply chain. The doctoral research was funded by Lokad in collaboration with the ANRT under a CIFRE contract. The proposed work aims to be a good compromise between new technologies and business expectations, addressing the various aspects presented above. We started forecasting with the exponential smoothing family, whose models are easy to implement and extremely fast to run. As they are widely used in industry, they have already won users' confidence; moreover, they are easy to understand and to explain to a non-specialist audience. By exploiting more advanced AI techniques, some of the limitations of these models can be overcome. Cross-learning proved to be a relevant approach for extrapolating useful information when very little data was available. Since the common Gaussian assumption is not suitable for discrete sales data, we proposed using a model associated with either a Poisson distribution or a Negative Binomial one, which better corresponds to the nature of the phenomena we seek to model and predict. We also proposed using Monte Carlo simulations to deal with uncertainty: a number of scenarios are generated, sampled and modelled using a distribution, from which confidence intervals of different, adapted sizes can be deduced. Using real company data, we compared our approach with state-of-the-art methods such as DeepAR, DeepSSM and N-BEATS, and derived a new model based on the Holt-Winters method. These models were implemented in Lokad's workflow.
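The combination described above can be sketched as follows: a single exponential-smoothing level, Negative Binomial noise (implemented as a Gamma-Poisson mixture), and Monte Carlo sampling to derive prediction intervals. The parameter values and the fixed dispersion are illustrative assumptions, not Lokad's production model.

```python
import numpy as np

def neg_binomial_sample(rng, mean, dispersion, size):
    # Negative Binomial as a Gamma-Poisson mixture:
    # lambda ~ Gamma(r, mean/r), X ~ Poisson(lambda), so E[X] = mean.
    r = dispersion
    lam = rng.gamma(shape=r, scale=mean / r, size=size)
    return rng.poisson(lam)

def forecast_intervals(sales, alpha=0.3, dispersion=2.0, horizon=7,
                       n_sims=10_000, levels=(0.05, 0.5, 0.95), seed=0):
    """Simple exponential smoothing of the level, then Monte Carlo
    sampling of integer-valued future demand to get quantile bands."""
    level = float(sales[0])
    for y in sales[1:]:                      # SES recursion
        level = alpha * y + (1 - alpha) * level
    rng = np.random.default_rng(seed)
    sims = neg_binomial_sample(rng, max(level, 1e-6), dispersion,
                               size=(n_sims, horizon))
    return {q: np.quantile(sims, q, axis=0) for q in levels}

# Sparse, integer daily sales, as discussed above
sales = [0, 2, 1, 0, 3, 2, 0, 1, 4, 2, 1, 0, 2, 3]
iv = forecast_intervals(sales)
```

Every simulated path is a non-negative integer by construction, which is exactly the positivity/integrality constraint the abstract insists on.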
Yang, Gen. "Modèles prudents en apprentissage statistique supervisé." Thesis, Compiègne, 2016. http://www.theses.fr/2016COMP2263/document.
In some areas of supervised machine learning (e.g. medical diagnosis, computer vision), predictive models are evaluated not only on their accuracy but also on their ability to obtain a more reliable representation of the data and of the induced knowledge, in order to allow cautious decision making. This is the problem we studied in this thesis. Specifically, we examined two existing approaches from the literature for making models and predictions more cautious and reliable: the framework of imprecise probabilities and that of cost-sensitive learning. Few existing studies have attempted to bridge these two frameworks, due to both theoretical and practical problems; our contributions are to clarify and resolve these problems. Theoretically, few existing studies have addressed how to quantify classification errors when set-valued predictions are produced and the costs of mistakes are unequal (in terms of consequences). Our first contribution has been to establish general properties and guidelines for quantifying the misclassification costs of set-valued predictions. These properties led us to derive a general formula, which we call the generalized discounted cost (GDC), that allows the comparison of classifiers whatever the form of their predictions (singleton or set-valued), in the light of a risk-aversion parameter. Practically, most classifiers based on imprecise probabilities fail to integrate generic misclassification costs efficiently, because the computational complexity increases by an order of magnitude (or more) when non-unitary costs are used. This problem led to our second contribution: the implementation of a classifier that can manage both the probability intervals produced by imprecise probabilities and generic error costs, with the same order of complexity as when standard probabilities and unitary costs are used.
This is achieved with a binary decomposition technique, nested dichotomies, whose properties and prerequisites we studied in detail. In particular, we show that nested dichotomies are applicable to all imprecise probabilistic models and that they reduce the imprecision level of imprecise models without loss of predictive power. Various experiments were conducted throughout the thesis to illustrate and support our contributions. We characterized the behavior of the GDC using ordinal data sets. These experiments highlighted the differences between a model based on the standard probability framework producing indeterminate predictions and a model based on imprecise probabilities. The latter is generally more capable because it distinguishes two sources of uncertainty (ambiguity and lack of information), even if the combined use of the two types of models is also of particular interest, as it can help the decision-maker improve the data quality or the classifiers. In addition, experiments conducted on a wide variety of data sets showed that nested dichotomies significantly improve the predictive power of an indeterminate model with generic costs.
Atger, Frédéric. "Validation et étude de quelques propriétés de systèmes de prévision météorologique ensemblistes." Toulouse 3, 2003. http://www.theses.fr/2003TOU30051.
Probabilistic meteorological forecasts based on ensemble prediction systems are evaluated. The reliability and resolution components of the decomposition of the Brier score are used to quantify performance. Probabilistic forecasts based on operational ensembles are compared, through a statistical scheme, to those obtained from a single model run. "Poor man's ensembles", consisting of a few deterministic forecasts run at different operational centres, are evaluated as well. The conditions for a realistic estimation of the performance of ensemble-based probabilistic forecasts are also investigated. The spatial and interannual variability of the reliability calls for a strong stratification of the data, which is not always possible with the limited samples available. Another essential issue is the categorization of forecast probabilities, required to carry out the decomposition of the Brier score.
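The decomposition referred to above is the standard Murphy decomposition, Brier score = reliability - resolution + uncertainty, and the categorization of forecast probabilities the abstract stresses is the binning step made explicit in this sketch:

```python
import numpy as np

def brier_decomposition(p, o, bins=10):
    """Murphy decomposition of the Brier score of probability forecasts p
    for binary outcomes o, after binning p into `bins` categories.
    Returns (brier_score, reliability, resolution, uncertainty)."""
    p, o = np.asarray(p, float), np.asarray(o, float)
    n = len(p)
    obar = o.mean()                              # base rate
    edges = np.linspace(0.0, 1.0, bins + 1)
    idx = np.clip(np.digitize(p, edges[1:-1]), 0, bins - 1)
    rel = res = 0.0
    for k in range(bins):
        mask = idx == k
        nk = mask.sum()
        if nk == 0:
            continue
        pk, ok = p[mask].mean(), o[mask].mean()  # bin forecast, bin frequency
        rel += nk * (pk - ok) ** 2               # reliability term
        res += nk * (ok - obar) ** 2             # resolution term
    rel, res = rel / n, res / n
    unc = obar * (1 - obar)
    bs = np.mean((p - o) ** 2)
    return bs, rel, res, unc

p = [0.1] * 5 + [0.9] * 5
o = [0, 0, 0, 0, 1, 1, 1, 1, 1, 0]
bs, rel, res, unc = brier_decomposition(p, o)
# Here bs = 0.17 = rel - res + unc = 0.01 - 0.09 + 0.25
```

The identity is exact when forecasts are constant within each bin, as here; with continuous forecast values the binning introduces a small residual, which is one reason the choice of categories matters.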
Turko, Anton. "Approche comportementale des marchés financiers : options et prévisions." Aix-Marseille 3, 2009. http://www.theses.fr/2009AIX32038.
This thesis aims at giving a more comprehensive understanding of how prices are formed on option markets, and prediction models are derived. To facilitate understanding, we propose to split the price formation process into two phases. First, we show how an equilibrium can be reached, taking into account the way operators with opposite positions see future market movements, and why an equilibrium may or may not be reached. Second, based upon the information revealed by active operators, we show that when enough information accumulates over time it is possible to give probabilities of market tendencies. This leads to a behavioral financial model built in the context of utility theory and stochastic dominance, giving a better understanding of capital and option markets. The purpose of the model is to permit high-probability forecasts in a few specific situations. For this, information coming from buyers as well as sellers is exploited, each operator bringing his own particular vision of future price movements.
Marty, Renaud. "Prévision hydrologique d'ensemble adaptée aux bassins à crue rapide : élaboration de prévisions probabilistes de précipitations à 12 et 24h : désagrégation horaire conditionnelle pour la modélisation hydrologique : application à des bassins de la région Cévennes Vivarais." Grenoble, 2010. http://www.theses.fr/2010GRENU005.
Catchments of Southern France are regularly subject to quick floods, usually in autumn, generated by intense rainfall events. Flood risk is thus a major concern, requiring a maximal lead time for issuing early flood warnings as well as an estimation of future discharges. Firstly, the elements required for hydrological forecasts and the related uncertainties are illustrated. Then, a simple and modular approach adapted to flash-flood catchments (with a time to peak of about a few hours) is proposed. Considering the targeted lead time (24-48 h), quantitative precipitation forecasts are a key element of this approach. Two prediction systems are described and evaluated: the EPS ensemble forecasts provided by ECMWF, and the ANALOG probabilistic forecasts issued from an analog sorting technique developed at LTHE. A statistical correction of the latter is suggested to improve its reliability. The different forecasts are then disaggregated by a generator from a 12- or 24-hour time step into hourly scenarios which respect the precipitation forecasts and are climatologically consistent. These rainfall scenarios are then used as input to a simple and robust hydrological model to provide hydrological ensemble forecasts. The forecasts are noticeably improved when sub-daily information about rainfall amounts is provided, either from EPS at a 6- or 12-hour time step, from ANALOG applied at 12 h, or from a combination of both approaches taking the daily rainfall amount from ANALOG and a sub-daily chronology from EPS at 6 h.
Zalachori, Ioanna. "Prévisions hydrologiques d’ensemble : développements pour améliorer la qualité des prévisions et estimer leur utilité." Thesis, Paris, AgroParisTech, 2013. http://www.theses.fr/2013AGPT0032/document.
The last decade has seen the emergence of probabilistic streamflow forecasting as the most suitable approach to anticipate risks and provide warnings for public safety and property protection. Beyond the gains in security, the added value of probabilistic information also translates into economic benefits and an optimal management of the water resources on which economic activities depend. In streamflow forecasting, the uncertainty associated with rainfall predictions from numerical weather prediction models plays an important role. To go beyond the limits of classical predictability, meteorological services have developed ensemble prediction systems, generated from perturbations of the initial conditions of the models and stochastic variations in their parameterization. Equally probable scenarios of the evolution of the atmosphere are proposed for forecasting horizons of up to 10-15 days. The integration of weather ensemble predictions into the hydrological forecasting chain is an interesting approach to produce probabilistic streamflow forecasts and quantify the total predictive uncertainty in hydrology.
Bellier, Joseph. "Prévisions hydrologiques probabilistes dans un cadre multivarié : quels outils pour assurer fiabilité et cohérence spatio-temporelle ?" Thesis, Université Grenoble Alpes (ComUE), 2018. http://www.theses.fr/2018GREAU029/document.
This dissertation addresses the production of short- to medium-range hydrological forecasts in a context involving a number of basins with correlated streamflows. Our case study, based on real data, includes several tributaries of the upper Rhone river in France. Work has been conducted on implementing a forecasting chain that combines ensemble approaches, namely meteorological ensemble forecasting and a hydrological multi-model, with statistical correction methods. Ensemble methods dynamically generate a case-specific uncertainty, while statistical corrections, applied to meteorological (pre-processing) and/or hydrological (post-processing) forecasts, are needed to ensure forecast calibration. Each statistical correction, performed in a univariate framework, destroys the spatial and temporal dependence structure of the forecasts. We were therefore interested in reconstructing this structure, first through a diagnostic study of existing methods, notably the Schaake shuffle and ECC, and then by proposing adaptations to address the identified shortcomings. For pre-processing, we aimed at better conditioning the dependence structure on the meteorological situation. For post-processing, our effort focused on ensuring that streamflow forecasts respect the autocorrelation characteristics and remain well calibrated, especially during the recession phases, which are problematic. Verification of the resulting meteorological and/or hydrological forecasts was conducted with univariate and multivariate tools, paying particular attention to forecast calibration thanks to the concept of stratification. Finally, we studied the interactions between the different modules of our forecasting chain by comparing scenarios in which only some of the modules were activated. This experiment allowed us to provide guidelines for implementing or upgrading an operational probabilistic streamflow forecasting chain.
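The Schaake shuffle mentioned above admits a compact sketch (after Clark et al., 2004): univariately corrected ensemble members are re-ranked, site by site, so that their rank structure across sites mimics that of a historical template. The array shapes and variable names are illustrative assumptions.

```python
import numpy as np

def schaake_shuffle(forecasts, template):
    """Reorder univariately post-processed ensemble members so that their
    rank structure across sites matches a historical template.

    forecasts, template: (n_members, n_sites) arrays; the template rows are
    n_members historical dates observed at the same n_sites.
    """
    F = np.sort(np.asarray(forecasts, float), axis=0)  # sorted values per site
    T = np.asarray(template, float)
    out = np.empty_like(F)
    for j in range(F.shape[1]):
        # rank of each template member at site j (0 = smallest)
        ranks = np.argsort(np.argsort(T[:, j]))
        # member with template rank r receives the r-th smallest forecast value
        out[:, j] = F[ranks, j]
    return out

rng = np.random.default_rng(1)
forecasts = rng.normal(size=(5, 3))   # 5 members, 3 sites
template = rng.normal(size=(5, 3))    # 5 historical dates, same 3 sites
shuffled = schaake_shuffle(forecasts, template)
```

The marginal distribution at each site is untouched (the same values are only reordered), while the inter-site ordering of members now reproduces that of the observed template, which is the property the dissertation's diagnostic study examines.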
Zalachori, Ioanna. "Prévisions hydrologiques d'ensemble : développements pour améliorer la qualité des prévisions et estimer leur utilité." Phd thesis, AgroParisTech, 2013. http://pastel.archives-ouvertes.fr/pastel-00927676.
Perrin, Frédéric. "Prise en compte des données expérimentales dans les modèles probabilistes pour la prévision de la durée de vie des structures." Clermont-Ferrand 2, 2008. http://www.theses.fr/2008CLF21823.
Moulin, Laetitia. "Prévision des crues rapides avec des modèles hydrologiques globaux. Applications aux bassins opérationnels de la Loire supérieure : évaluation des modélisations, prise en compte des incertitudes sur les précipitations moyennes spatiales et utilisation de prévisions météorologiques." Phd thesis, AgroParisTech, 2007. http://tel.archives-ouvertes.fr/tel-00368262.
After a description of the basin at Bas-en-Basset, a critical analysis of the available data sets highlights their richness but also their flaws. The great variety of hydrometeorological events affecting these basins is particularly interesting for comparing hydrological models.
Simple conceptual models proved more robust and often more efficient than statistical models or artificial neural networks. Criteria specific to flood forecasting highlight the information on the immediate evolution of discharges brought by the rainfall-runoff transformation, even though the modeling errors remain large and are ultimately similar from one model to another.
A particular effort was devoted to the kriging estimation of mean areal precipitation, for which an error model is proposed and validated on the data. These uncertainties, propagated through the rainfall-runoff models, contribute a share of the total modeling error that varies with basin size.
Finally, exploratory work showed the benefit of including probabilistic rainfall forecasts in a hydrometeorological chain, in order to increase anticipation lead times and take the associated uncertainties into account. However, these forecasts require preliminary processing before they can be used.
It emerges that simple tools offer prospects for improvement in the still very perfectible field of flood forecasting.
Courbariaux, Marie. "Contributions statistiques aux prévisions hydrométéorologiques par méthodes d’ensemble." Thesis, Université Paris-Saclay (ComUE), 2017. http://www.theses.fr/2017SACLA003/document.
In this thesis, we are interested in representing and taking into account the uncertainties in medium-term probabilistic hydrological prediction systems. These uncertainties mainly come from two sources: (1) the imperfection of the meteorological forecasts used as inputs to these systems, and (2) the imperfection of the representation of the hydrological process by the rainfall-runoff simulator (RRS) at the heart of these systems. The performance of a probabilistic forecasting system is assessed by the sharpness of its predictions conditional on their reliability. The statistical approach we follow provides a guarantee of reliability as long as its underlying assumptions hold; we also seek to incorporate auxiliary information to gain sharpness. We propose, for each source of uncertainty, a method enabling this incorporation: (1) a meteorological post-processor based on the statistical property of exchangeability, able to take several (ensemble or deterministic) forecasts into account; (2) a hydrological post-processor using the RRS state variables through a probit model that arbitrates between two interpretable hydrological regimes, thereby representing an uncertainty with heterogeneous variance. These two methods prove adaptable across the various application cases provided by EDF and Hydro-Québec, partners and funders of the project, while being simpler and more formal than the operational methods and demonstrating similar performance.
Descamps, Laurent. "Définition des conditions initiales des prévisions d'ensemble : liens avec l'assimilation de données." Paris 6, 2007. http://www.theses.fr/2007PA066596.
Marty, Renaud. "PREVISION HYDROLOGIQUE D'ENSEMBLE ADAPTEE AUX BASSINS A CRUE RAPIDE. Elaboration de prévisions probabilistes de précipitations à 12 et 24 h. Désagrégation horaire conditionnelle pour la modélisation hydrologique. Application à des bassins de la région Cévennes Vivarais." Phd thesis, Université de Grenoble, 2010. http://tel.archives-ouvertes.fr/tel-00480713.
Bermolen, Maria Paola. "Modèles probabilistes et statistiques pour la conception et l'analyse des systèmes de communication." Paris, Télécom ParisTech, 2010. https://pastel.hal.science/pastel-00005853.
Two different problems are addressed in this thesis: traffic prediction and classification, and access mechanisms in MANETs. In the first part of the thesis, we address the problem of traffic prediction and classification by means of advanced statistical tools. We analyze the problem of online prediction of the load on a link based only on past measurements and without assuming any particular model. Concerning traffic classification, and motivated by the widespread use of P2P systems, we focus on the identification of P2P applications, considering more precisely the case of P2P television (P2P-TV). For both cases, our framework makes use of Support Vector Machines (SVM). The algorithms we propose provide very accurate results; they are robust and their computational cost is extremely low. These properties make our solutions especially well suited to online application. Self-organized systems such as MANETs are of particular importance in today's world. In the second part of the thesis, we address two different problems related to MAC mechanisms in MANETs (in particular, we concentrate on CSMA). First, an analysis of the existing models for CSMA is presented, with special emphasis on their correlation with the real protocol. Some weaknesses are identified and possible solutions are proposed. The use of stochastic geometry tools allows us to obtain analytical results where other techniques cannot. Second, we address the lack of QoS guarantees in CSMA and propose two different mechanisms that guarantee a minimum rate for each accepted transmission. The main aim of our study is to identify which of the proposed mechanisms improves most on CSMA depending on the scenario.
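The SVM-based flow classification described in this abstract can be illustrated with a minimal linear SVM trained by stochastic sub-gradient descent (the Pegasos scheme). This is a sketch on synthetic "flow features", not the thesis code or data; the two features and their distributions are invented for the illustration.

```python
import numpy as np

def pegasos_svm(X, y, lam=0.01, epochs=200, seed=0):
    """Train a linear SVM by stochastic sub-gradient descent (Pegasos).
    X: (n, d) feature matrix, y: labels in {-1, +1}. Returns weights w."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)              # decreasing step size
            if y[i] * X[i].dot(w) < 1:         # hinge loss is active
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
            else:
                w = (1 - eta * lam) * w
    return w

# Toy "flows": two features per flow (say, mean packet size and
# inter-arrival variance -- purely illustrative), P2P-TV (+1) vs other (-1).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal([2.0, 2.0], 0.5, size=(100, 2)),
               rng.normal([-2.0, -2.0], 0.5, size=(100, 2))])
y = np.array([1] * 100 + [-1] * 100)
w = pegasos_svm(X, y)
accuracy = (np.sign(X.dot(w)) == y).mean()
```

On such well-separated synthetic data the classifier attains near-perfect training accuracy; the point is only to show why a linear SVM remains cheap enough for online use: prediction is a single dot product per flow.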
Singla, Stéphanie. "Prévisibilité des ressources en eau à l'échelle saisonnière en France." Phd thesis, Toulouse, INPT, 2012. http://oatao.univ-toulouse.fr/8928/1/singla.pdf.
Baïle, Rachel. "Analyse et modélisation multifractales de vitesses de vent. Application à la prévision de la ressource éolienne." Phd thesis, Université Pascal Paoli, 2010. http://tel.archives-ouvertes.fr/tel-00604139.
Nguyen, Danh Ngoc. "Contribution aux approches probabilistes pour le pronostic et la maintenance des systèmes contrôlés." Thesis, Troyes, 2015. http://www.theses.fr/2015TROY0010/document.
Automatic control systems play an important role in the development of civilization and modern technology. The loss of effectiveness of the actuator acting on the system is harmful in the sense that it modifies the behavior of the system compared to the desired one. This thesis is a contribution to the prognosis of the remaining useful life (RUL) and the maintenance of closed-loop systems with actuators subject to degradation. In the first contribution, a modeling framework based on piecewise deterministic Markov processes is considered in order to model the overall behavior of the system. In this context, the behavior of the system is represented by deterministic trajectories interrupted by jumps of random size occurring at random times, modeling the discrete degradation phenomenon of the actuator. The second contribution is a prognosis method for the system RUL which consists of two steps: the estimation of the probability distribution of the system state at the prognostic instant by particle filtering, and the computation of the RUL, which requires the estimation of the system reliability starting from the prognostic instant. The third contribution is the proposal of a parametric maintenance policy which dynamically takes into account the available information on the state and current environment of the system, under the constraint of opportunity dates.
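The two-step prognosis method summarized above (particle filtering up to the prognostic instant, then first-crossing-time simulation for the RUL) can be sketched on a toy jump-degradation process. The jump probability, failure threshold and noise level below are illustrative assumptions, not values from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)
FAIL = 10.0     # failure threshold on the degradation level (assumed)
SIGMA = 0.3     # observation noise standard deviation (assumed)
N = 2000        # number of particles

def propagate(x):
    """One time step of a toy jump-degradation process: with probability
    0.3, add a jump of exponential mean size 1 (a crude stand-in for the
    random jumps of the piecewise deterministic Markov process)."""
    jump = (rng.random(x.shape) < 0.3) * rng.exponential(1.0, x.shape)
    return x + jump

# Simulate a "true" degradation path and noisy observations up to the
# prognostic instant (t = 15 here, purely illustrative).
true_x = np.zeros(1)
obs = []
for _ in range(15):
    true_x = propagate(true_x)
    obs.append(float(true_x[0]) + rng.normal(0.0, SIGMA))

# Step 1: bootstrap particle filter -> state distribution at the
# prognostic instant.
particles = np.zeros(N)
for z in obs:
    particles = propagate(particles)
    w = np.exp(-0.5 * ((z - particles) / SIGMA) ** 2)  # Gaussian likelihood
    w /= w.sum()
    particles = rng.choice(particles, size=N, p=w)     # multinomial resampling

# Step 2: propagate each particle until it crosses the failure threshold;
# the first-crossing times sample the RUL distribution.
rul = np.zeros(N)
alive = particles.copy()
for t in range(1, 200):
    alive = propagate(alive)
    rul[(alive >= FAIL) & (rul == 0)] = t
rul[rul == 0] = 200                                    # censored at horizon
mean_rul = rul.mean()
```

The filtered particle cloud plays the role of the state distribution at the prognostic instant; propagating it forward to the threshold gives an empirical reliability curve, of which `mean_rul` is a one-number summary.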
Eressa, Muluken Regas. "Probabilistic Models for Demand Supply Prediction in The Energy Sector." Electronic Thesis or Diss., Université Gustave Eiffel, 2024. http://www.theses.fr/2024UEFL2005.
This thesis investigates probabilistic predictive models based on Gaussian processes and deep learning for electricity demand forecasting. As Gaussian processes are kernel-based predictive models, their performance is constrained by the type, number and dimension of the selected kernels. To address these limitations, it first proposes a new Gaussian approximation technique that addresses the Bayesian computational bottleneck. Second, it proposes a stochastic compositional kernel estimation algorithm using the proposed Gaussian approximation as the underlying model. Third, it follows an iterative procedure using cross-validation to select an optimal combination of kernels that best explains the data-generating model. Furthermore, it addresses a limitation of the maximum likelihood approach usually employed in probabilistic deep learning models, which fails to guarantee a minimized interval width and maximized coverage probability for the forecasted points. The thesis proposes a new training algorithm for neural networks: the proposed distribution-based lower-upper bound estimation algorithm uses interval width and coverage probability as quality metrics, with adaptive parameters that guarantee the needed performance compared to alternative techniques. The suggested approaches ease the deployment of Gaussian and deep learning models in the energy sector. The bound estimation model, with its minimized prediction interval and maximized coverage probability, can help potential energy suppliers size generators, resulting in a marginal profit gain. In addition, the kernel estimation algorithm can simplify the application of kernel-based learning for those who find kernel selection obscure; to the experienced, it can give a preliminary insight into the structure of the kernels that could potentially fit the data.
The randomized column sampling technique offers an alternative method for fast Gaussian model building and approximation that scales to large data. Furthermore, the bound estimation, in addition to providing a forecast distribution for point-estimate neural models, can also serve as a good starting point for alternative probabilistic model training in deep neural nets.
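The idea of selecting a kernel combination by cross-validation, central to this abstract, can be sketched with plain-numpy Gaussian process regression over a tiny dictionary of candidate kernels. The kernels, hyperparameters and synthetic data below are illustrative assumptions, not the thesis algorithm.

```python
import numpy as np

def rbf(a, b, ls=1.0):
    """Squared-exponential kernel on 1-D inputs."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

def lin(a, b):
    """Linear (dot-product) kernel on 1-D inputs."""
    return a[:, None] * b[None, :]

KERNELS = {
    "rbf": rbf,
    "linear": lin,
    "rbf+linear": lambda a, b: rbf(a, b) + lin(a, b),  # compositional kernel
}

def gp_predict(x_tr, y_tr, x_te, kern, noise=0.1):
    """Standard GP regression mean: K* (K + noise^2 I)^-1 y."""
    K = kern(x_tr, x_tr) + noise ** 2 * np.eye(len(x_tr))
    return kern(x_te, x_tr) @ np.linalg.solve(K, y_tr)

def cv_score(x, y, kern, folds=5):
    """k-fold cross-validated mean squared error for one kernel."""
    idx = np.arange(len(x))
    errs = []
    for f in range(folds):
        te = idx[f::folds]
        tr = np.setdiff1d(idx, te)
        mu = gp_predict(x[tr], y[tr], x[te], kern)
        errs.append(np.mean((y[te] - mu) ** 2))
    return float(np.mean(errs))

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(-3, 3, 80))
y = 0.5 * x + np.sin(2 * x) + rng.normal(0, 0.1, 80)   # trend + wiggle
scores = {name: cv_score(x, y, k) for name, k in KERNELS.items()}
best = min(scores, key=scores.get)
```

The purely linear kernel cannot capture the oscillating component, so cross-validation rejects it in favour of a kernel containing the RBF term; a real compositional search would iterate this scoring over sums and products of base kernels.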
Brochero, Darwin. "Hydroinformatics and diversity in hydrological ensemble prediction systems." Thesis, Université Laval, 2013. http://www.theses.ulaval.ca/2013/29908/29908.pdf.
In this thesis, we tackle the problem of probabilistic streamflow forecasting from two different perspectives based on the collaboration of multiple hydrological models (diversity). The first favours a hybrid approach for the evaluation of multiple global hydrological models and machine learning tools for predictor selection, while the second constructs Artificial Neural Network (ANN) ensembles, forcing diversity within them. This thesis builds on the concept of diversity to develop different methodologies around two complementary problems. The first focuses on simplifying, via member selection, a complex Hydrological Ensemble Prediction System (HEPS) that has 800 daily forecast scenarios originating from the combination of 50 meteorological precipitation members and 16 global hydrological models. We explore in depth four techniques: Linear Correlation Elimination, Mutual Information, Backward Greedy Selection, and the Nondominated Sorting Genetic Algorithm II (NSGA-II). We propose the optimal hydrological model participation concept, which identifies the number of representative meteorological members to propagate into each hydrological model in the simplified HEPS scheme. The second problem consists in the stratified selection of the data patterns used for training an ANN ensemble or stack. Using data from the second and third MOdel Parameter Estimation eXperiment (MOPEX) workshops, we built an ANN prediction stack in which each predictor is trained on input spaces defined by applying Input Variable Selection to different stratified sub-samples. In summary, we demonstrate that implicit diversity in the configuration of a HEPS is efficient in the search for a high-performance HEPS.
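Backward greedy selection, one of the four member-selection techniques named in this abstract, can be sketched as follows on synthetic ensemble forecasts. The data, the "keep 8 of 12" setting, and the use of ensemble-mean MSE as the score are illustrative assumptions (the thesis works with probabilistic scores on a real HEPS).

```python
import numpy as np

def backward_greedy(members, obs, keep):
    """Greedy backward selection: repeatedly drop the member whose removal
    most reduces the MSE of the ensemble mean against the observations.
    members: (m, t) array of m member forecasts over t time steps."""
    def mse(ids):
        return np.mean((members[ids].mean(axis=0) - obs) ** 2)
    selected = list(range(members.shape[0]))
    while len(selected) > keep:
        # the member whose removal yields the lowest remaining-ensemble MSE
        worst = min(selected, key=lambda i: mse([j for j in selected if j != i]))
        selected.remove(worst)
    return selected

rng = np.random.default_rng(0)
steps = 200
obs = np.sin(np.linspace(0, 6, steps))               # synthetic observations
good = obs + rng.normal(0, 0.1, (8, steps))          # 8 skilful members
bad = rng.normal(0, 1.0, (4, steps))                 # 4 uninformative members
members = np.vstack([good, bad])
chosen = backward_greedy(members, obs, keep=8)
```

By construction the four noise-only members degrade the ensemble mean, so the greedy pass should discard them first, leaving a smaller ensemble that scores at least as well as the full one.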
Ruiz, Virginie. "Estimation et prédiction d'un système évoluant de façon non linéaire : filtrage par densités approchées." Rouen, 1993. http://www.theses.fr/1993ROUE5018.
Pertsinidou, Christina Elisavet. "Stochastic models for the estimation of the seismic hazard." Thesis, Compiègne, 2017. http://www.theses.fr/2017COMP2342.
In the first chapter the definition of seismic hazard assessment is provided, the seismotectonic features of the study areas are briefly presented, and the existing mathematical models applied in the field of seismology are thoroughly reviewed. In chapter 2, different semi-Markov models are developed for studying the seismicity of the central Ionian Islands and the North Aegean Sea (Greece). Quantities such as the kernel and the destination probabilities are evaluated, considering geometric, discrete-Weibull and Pareto distributed sojourn times. Useful results are obtained for forecasting purposes. In the third chapter a new Viterbi algorithm for hidden semi-Markov models is developed, whose complexity is a linear function of the number of observations and a quadratic function of the number of hidden states, the lowest complexity in the literature. Furthermore, an extension of this new algorithm is introduced for the case where an observation depends on the corresponding hidden state but also on the previous observation (the SM1-M1 case). In chapter 4, different hidden semi-Markov models (HSMMs) are applied to the study of the North and South Aegean Sea. The earthquake magnitudes and locations form the observation sequence, and the new Viterbi algorithm is implemented in order to decode the hidden stress field associated with seismogenesis. Precursory phases (variations of the hidden stress field) warning of an anticipated earthquake occurrence were detected in 70 out of 88 cases (the optimal model's score). The sojourn times of the hidden process were assumed to follow Poisson, logarithmic or negative binomial distributions, whereas the hidden stress levels were classified into 2, 3 or 4 states. HMMs were also applied, but did not yield significant results regarding the precursory phases.
In chapter 5 a generalized Viterbi algorithm for HSMMs is constructed, in the sense that transitions to the same hidden state are now allowed and can also be decoded. Furthermore, an extension of this generalized algorithm to the SM1-M1 context is given. In chapter 6 we adequately modify the Cramér-Lundberg model, considering negative and positive claims, in order to describe the evolution in time of the Coulomb failure function changes (ΔCFF values) computed at the locations of seven strong (M ≥ 6) earthquakes of the North Aegean Sea. Ruin probability formulas are derived and proved in a general form. Corollaries are also formulated for the exponential and the Pareto distribution. The aim is to shed light on the following problem posed by seismologists: during a specific year, why did an earthquake occur at one location and not at another in seismotectonically homogeneous areas with positive ΔCFF values (stress-enhanced areas)? The results demonstrate that the new probability formulas can contribute to answering this question.
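A Monte Carlo check of Cramér-Lundberg ruin probabilities illustrates the kind of quantity derived in chapter 6. The sketch below uses the classical model with positive exponential claims only (the thesis treats a modified model with both positive and negative claims) and compares the simulation against the standard closed-form result for exponential claims; all parameter values are illustrative.

```python
import math
import numpy as np

def ruin_prob_mc(u, c, lam, mu, horizon=150.0, n_sim=5000, seed=0):
    """Monte Carlo ruin probability for the classical Cramér-Lundberg model:
    surplus U(t) = u + c*t - aggregate claims, with Exp(mean mu) claims
    arriving as a Poisson process of rate lam. Ruin only occurs at claim
    times, so it suffices to track the surplus just after each claim."""
    rng = np.random.default_rng(seed)
    ruined = 0
    for _ in range(n_sim):
        t, s = 0.0, u
        while t < horizon:
            dt = rng.exponential(1.0 / lam)      # next inter-arrival time
            t += dt
            s += c * dt - rng.exponential(mu)    # premiums in, one claim out
            if s < 0.0:
                ruined += 1
                break
    return ruined / n_sim

# Exponential claims admit a closed-form benchmark: with safety loading
# theta = c/(lam*mu) - 1,  psi(u) = exp(-theta*u/((1+theta)*mu)) / (1+theta).
u, c, lam, mu = 2.0, 1.5, 1.0, 1.0
theta = c / (lam * mu) - 1.0
psi_exact = math.exp(-theta * u / ((1.0 + theta) * mu)) / (1.0 + theta)
psi_mc = ruin_prob_mc(u, c, lam, mu)
```

Truncating at a finite horizon biases the estimate slightly downward, but with a positive safety loading the surplus drifts upward, so late ruin is rare and the Monte Carlo value lands close to the analytic one.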
Fouemkeu, Norbert. "Modélisation de l'incertitude sur les trajectoires d'avions." Phd thesis, Université Claude Bernard - Lyon I, 2010. http://tel.archives-ouvertes.fr/tel-00710595.
Huynh, Khac Tuan. "Quantification de l'apport de l'information de surveillance dans la prise de décision en maintenance." Phd thesis, Université de Technologie de Troyes, 2011. http://tel.archives-ouvertes.fr/tel-00788661.
Fouemkeu, Norbert. "Modélisation de l’incertitude sur les trajectoires d’avions." Thesis, Lyon 1, 2010. http://www.theses.fr/2010LYO10217/document.
In this thesis we propose probabilistic and statistical models based on multidimensional data for forecasting uncertainty on aircraft trajectories. Assuming that during the flight an aircraft follows the 3D trajectory contained in its initial flight plan, we use all the characteristics of the flight environment as predictors to explain the crossing time of the aircraft at given points on its planned trajectory. These characteristics are: weather and atmospheric conditions, current flight parameters, information contained in the flight plans, and air traffic complexity. In this study, the dependent variable is the difference between the actual time observed during flight and the planned time to cross the trajectory's planned points; this variable is called the temporal difference. We built four models using methods based on recursive partitioning of the sample. The first, called classical CART, is based on Breiman's CART method. Here, we use regression trees to build a typology of points on aircraft trajectories based on the previous characteristics, and to forecast the crossing times of aircraft at these points. The second model, called amended CART, improves on the first: it replaces the forecast given by the mean of the dependent variable inside the terminal nodes of classical CART with a new forecast given by multiple regression inside these nodes. This new model, developed using a stepwise algorithm, is parsimonious because for each terminal node it explains the flight time by the most relevant predictors inside the node. The third model is built on the MARS (Multivariate Adaptive Regression Splines) method. Besides the continuity of the dependent-variable estimator, this model allows assessing the direct and interaction effects of the explanatory variables on the crossing time at flight trajectory points. The fourth model uses the bootstrap sampling method.
It is a random forest: for each bootstrap sample drawn from the initial data, a regression tree is built as in the CART method, and the overall forecast is obtained by aggregating the forecasts over the set of trees. Despite the overfitting observed in this model, it is robust and offers a solution to the instability problem affecting regression trees obtained from the CART method. The models we built were assessed and validated on test data. Using them to forecast sector load, in terms of the number of aircraft entering the sector, showed that a forecast horizon of about 20 minutes, with a time interval larger than 20 minutes, yields forecasts with relative errors below 10%. Among these models, classical CART and random forests are the most powerful. Hence, for the regulating authority these models can be a very good aid for managing the sector load of the controlled airspace.
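The bootstrap-aggregation core of the fourth model can be sketched in a few lines: grow one regression tree per bootstrap resample and average the forecasts over the set of trees. This minimal illustration uses synthetic data and omits the per-split feature sampling of full random forests; every structural choice (depth, number of trees, quantile split candidates) is an assumption for the sketch.

```python
import numpy as np

def fit_tree(X, y, depth):
    """Minimal CART-style regression tree: greedy best split on squared
    error, candidate thresholds taken at feature quantiles."""
    if depth == 0 or len(y) < 5 or np.all(y == y[0]):
        return ("leaf", y.mean())
    best = None
    for j in range(X.shape[1]):
        for s in np.quantile(X[:, j], np.linspace(0.1, 0.9, 9)):
            left = X[:, j] < s
            nl = left.sum()
            if nl == 0 or nl == len(y):
                continue
            err = ((y[left] - y[left].mean()) ** 2).sum() \
                + ((y[~left] - y[~left].mean()) ** 2).sum()
            if best is None or err < best[0]:
                best = (err, j, s)
    if best is None:
        return ("leaf", y.mean())
    _, j, s = best
    left = X[:, j] < s
    return ("node", j, s, fit_tree(X[left], y[left], depth - 1),
            fit_tree(X[~left], y[~left], depth - 1))

def predict_tree(tree, x):
    while tree[0] == "node":
        _, j, s, lt, rt = tree
        tree = lt if x[j] < s else rt
    return tree[1]

def bagged_forecast(X, y, X_new, n_trees=20, depth=4, seed=0):
    """Bootstrap aggregation: one tree per resample, forecasts averaged."""
    rng = np.random.default_rng(seed)
    trees = []
    for _ in range(n_trees):
        idx = rng.integers(0, len(y), len(y))     # bootstrap resample
        trees.append(fit_tree(X[idx], y[idx], depth))
    return np.array([np.mean([predict_tree(t, x) for t in trees])
                     for x in X_new])

rng = np.random.default_rng(1)
X = rng.uniform(-2, 2, (300, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + rng.normal(0, 0.1, 300)
X_test = rng.uniform(-2, 2, (100, 2))
y_test = np.sin(X_test[:, 0]) + 0.5 * X_test[:, 1]
pred = bagged_forecast(X, y, X_test)
rmse = np.sqrt(np.mean((pred - y_test) ** 2))
```

Averaging over resampled trees is exactly what stabilizes the forecast relative to a single CART tree: any one tree is a noisy piecewise-constant fit, but the aggregate smooths out the sample-dependent splits.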
Dammak, Neila. "Notation financière et comportement des acteurs sur le marché financier." Thesis, Paris Est, 2013. http://www.theses.fr/2013PEST0048.
The main objective of this thesis is to analyze the role of rating agencies on the financial market. Our contribution is a better understanding of the impact of rating announcements on the agents of the French financial market (both investors and analysts). First, we focus on the information content of announcements by rating agencies and the impact of their decisions on the market. To answer this question, we conducted an event study of rating announcements, identifying them by nature, type and category. This research highlights the fact that rating announcements generally have an impact on the stock market. This impact depends on the nature of the announcement, the information provided in the reports, as well as rating changes between categories and within the speculative category. Moreover, the rating level depends on the firm's financial and accounting characteristics. Second, we seek to understand the beneficial role of rating agencies on financial markets. To answer this question, we analyzed the evolution of information asymmetry and stock market liquidity around rating announcements. Our results show that positive (respectively negative) announcements lead to a decrease (respectively increase) in information asymmetry. We also found that positive and neutral announcements, unlike negative ones, lead to a reduction of the bid-ask spread and an increase in transaction volumes. Both effects reflect higher (respectively lower) stock market liquidity when the announcements are positive or neutral (respectively negative). Finally, we focus on the impact of rating announcements on analysts' forecasts. For this purpose, we studied the evolution of the dispersion and errors of analysts' forecasts around rating announcements. Our results indicate an inverse relationship between the characteristics of financial analysts' forecasts and the nature of the rating announcement.
Indeed, positive and neutral announcements reduce the error and the dispersion of analysts' forecasts. This research shows the informative content of rating announcements for the stock market and their real contribution to improving financial communication.
Seltz, Andréa. "Application of deep learning to turbulent combustion modeling of real jet fuel for the numerical prediction of particulate emissions Direct mapping from LES resolved scales to filtered-flame generated manifolds using convolutional neural networks Solving the population balance equation for non-inertial particles dynamics using probability density function and neural networks: application to a sooting flame." Thesis, Normandie, 2020. http://www.theses.fr/2020NORMIR08.
With the climate change emergency, reducing pollutants and fuel consumption is now a priority for the aircraft industry. In combustion chambers, chemistry and soot modeling are critical to correctly quantify engine soot particle and greenhouse gas emissions. This thesis aimed to improve numerical tools for aircraft pollutant prediction, in terms of computational cost and prediction level, for high-fidelity engine simulations. This was achieved by enhancing chemistry reduction tools, making it possible to predict the CO emissions of an aircraft engine at a cost affordable for industry. Next, a novel closure model for the unresolved terms in the LES filtered transport equations is developed, based on neural networks (NN), to propose better flame modeling. Then, an original soot model for high-fidelity engine simulations, also based on NN, is presented. This new model is applied to a one-dimensional premixed sooting flame, and finally to an LES of an industrial combustion chamber, with comparison to soot measurements.
Tsafack-Teufack, Idriss. "Essays in functional econometrics and financial markets." Thesis, 2020. http://hdl.handle.net/1866/24837.
In this thesis, I exploit the functional data analysis framework and develop inference, prediction and forecasting analyses, with applications to topics in financial markets. This thesis is organized in three chapters. The first chapter is a paper co-authored with Marine Carrasco. In this chapter, we consider a functional linear regression model with a functional predictor variable and a scalar response. We develop a theoretical comparison of the Functional Principal Component Analysis (FPCA) and Functional Partial Least Squares (FPLS) techniques. We derive the convergence rate of the Mean Squared Error (MSE) for these methods and show that this rate is sharp. We also find that the regularization bias of the FPLS method is smaller than that of FPCA, while its estimation error tends to be larger. Additionally, we show that FPLS outperforms FPCA in terms of prediction accuracy with fewer components. The second chapter considers a fully functional autoregressive model (FAR) to forecast the next day's return curve of the S&P 500. In contrast to the standard AR(1) model, where each observation is a scalar, in this research each daily return curve is a collection of 390 points and is considered as one observation. I conduct a comparative analysis of four big data techniques: the Functional Tikhonov method (FT), the Functional Landweber-Fridman technique (FLF), Functional Spectral Cut-off (FSC), and Functional Partial Least Squares (FPLS). The convergence rate, the asymptotic distribution, and a test-based strategy to select the lag number are provided. Simulations and real data show that the FPLS method tends to outperform the others in terms of estimation accuracy, while all the considered methods display almost the same predictive performance. The third chapter proposes to estimate the risk-neutral density (RND) for option pricing with a functional linear model.
The benefit of this approach is that it directly exploits the fundamental arbitrage-free equation, and any additional density parametrization can be avoided. The estimation leads to an inverse problem, and the functional Landweber-Fridman (FLF) technique is used to overcome this issue.
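The FPCA-based functional linear regression compared in the first chapter of this thesis can be sketched on synthetic curves: extract principal component scores by SVD of the discretised curves, then regress the scalar response on the leading scores. All data, the basis functions and the truncation level below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 200, 50                        # number of curves, grid points on [0, 1]
t = np.linspace(0.0, 1.0, m)

# Synthetic smooth functional predictors: random Fourier mixtures.
scores_true = rng.normal(0, 1, (n, 3))
basis = np.vstack([np.sin(np.pi * t), np.sin(2 * np.pi * t),
                   np.cos(np.pi * t)])
X = scores_true @ basis               # (n, m) discretised curves

# Scalar response y_i = <beta, X_i> + noise, with a smooth coefficient beta(t);
# the integral is approximated by a Riemann sum over the grid.
beta = 2.0 * np.sin(np.pi * t)
y = X @ beta / m + rng.normal(0, 0.05, n)

# FPCA: the SVD of the centred data matrix yields the principal component
# curves (rows of Vt) and the component scores (projections below).
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3                                 # truncation level acts as regularisation
pc_scores = Xc @ Vt[:k].T

# Ordinary least squares of the response on the leading FPCA scores.
coef, *_ = np.linalg.lstsq(pc_scores, y - y.mean(), rcond=None)
y_hat = pc_scores @ coef + y.mean()
r2 = 1.0 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
```

Here the curves live in a three-dimensional function space, so three components recover the signal almost exactly; the FPCA-versus-FPLS question studied in the thesis is precisely how such a truncation behaves when the relevant directions are not the highest-variance ones.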