Dissertations on the topic "Optimisation du Risque"
Cite a source in APA, MLA, Chicago, Harvard, and other citation styles
Consult the top 50 dissertations for your research on the topic "Optimisation du Risque".
Next to every work in the list of references there is an "Add to bibliography" option. Use it, and a reference to the selected work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).
You can also download the full text of the scholarly publication as a PDF and read its online abstract, whenever the relevant details are available in the metadata.
Browse dissertations from a wide variety of disciplines and compile your bibliography correctly.
Seck, Babacar. „Optimisation stochastique sous contrainte de risque et fonctions d'utilité“. PhD thesis, Ecole des Ponts ParisTech, 2008. http://pastel.archives-ouvertes.fr/pastel-00004576.
Ngoupeyou, Armand Brice. „Optimisation des portefeuilles d'actifs soumis au risque de défaut“. Thesis, Evry-Val d'Essonne, 2010. http://www.theses.fr/2010EVRY0012/document.
This thesis addresses the optimization of portfolios of defaultable assets. The subprime crisis showed that default risk must be taken into account in order to obtain the real value of a portfolio. There are so many exchanges in the markets that the financial market has become a network with many connections, and we must identify that network to understand the real risk taken when dealing with financial assets. In this thesis, we define a financial system with a finite number of connections and model the dynamics of assets in such a system, taking the network connections into account. We work with jump intensity processes to capture the correlation within the network. Using Backward Stochastic Differential Equations (BSDEs), we price a contingent claim and consider model risk to obtain the optimal consumption and terminal wealth in such a financial network.
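The intensity-based network modelling sketched in this abstract can be illustrated, in a deliberately simplified discrete-time form, by a contagion simulation in which every default raises the default intensity of the surviving firms. All parameter values below are illustrative assumptions, not taken from the thesis:

```python
import random

def simulate_defaults(n_firms=3, base_intensity=0.05, contagion=0.10,
                      horizon=50, dt=1.0, seed=42):
    """Discrete-time contagion sketch: each surviving firm defaults in a
    step with probability intensity * dt, and every observed default
    raises the intensity of the survivors (the network effect)."""
    rng = random.Random(seed)
    intensity = [base_intensity] * n_firms
    default_time = [None] * n_firms
    for t in range(horizon):
        newly_defaulted = []
        for i in range(n_firms):
            if default_time[i] is None and rng.random() < intensity[i] * dt:
                newly_defaulted.append(i)
        for i in newly_defaulted:
            default_time[i] = t
        # contagion: survivors become riskier after each default
        for i in range(n_firms):
            if default_time[i] is None:
                intensity[i] += contagion * len(newly_defaulted)
    return default_time

times = simulate_defaults()
```

A full treatment, as in the thesis, would price claims on such a system via BSDEs rather than by direct simulation.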
Andrieu, Laetitia. „Optimisation sous contrainte en probabilité“. PhD thesis, Ecole des Ponts ParisTech, 2004. http://pastel.archives-ouvertes.fr/pastel-00001239.
Alais, Jean-Christophe. „Risque et optimisation pour le management d'énergies : application à l'hydraulique“. Thesis, Paris Est, 2013. http://www.theses.fr/2013PEST1071/document.
Hydropower is the main renewable energy produced in France. It provides both an energy reserve and a flexibility that are of great interest in a context of growing penetration of intermittent sources in electricity production. Its management raises difficulties stemming from the number of dams, from uncertainties in water inflows and prices, and from the multiple uses of water. This PhD thesis was carried out in partnership with Électricité de France and addresses two hydropower management issues, modeled as stochastic dynamic optimization problems. The manuscript is divided into two parts. In the first part, we consider the management of a hydroelectric dam subject to a so-called tourist constraint, which ensures that a given minimum dam stock level is respected during the summer months with a prescribed probability. We propose several original formulations together with the corresponding numerical algorithms, and we present numerical results that illuminate the problem from various angles useful to dam managers. In the second part, we focus on the management of a cascade of dams. We present the approximate decomposition-coordination algorithm called Dual Approximate Dynamic Programming (DADP) and show how to decompose an original large-scale problem into smaller subproblems by dualizing the spatial coupling constraints. On a three-dam instance, we are able to compare the results of DADP with the exact solution obtained by dynamic programming; we obtain approximate gains that are within a few percent of the optimum, with attractive running times. The conclusions we arrived at offer encouraging perspectives for the stochastic optimization of large-scale problems.
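The dynamic programming backbone of such dam-management problems can be sketched on a toy single-dam instance: backward recursion over a discretized stock, choosing how much water to turbine at each stage. The discretization, prices and inflow distribution below are invented for illustration and are far simpler than the models studied in the thesis:

```python
def solve_dam(max_stock, inflows, probs, price, horizon):
    """Backward dynamic programming for a toy dam: V[t][s] is the
    expected future revenue from stage t with stock s; the decision u
    is the (integer) amount of water turbined, sold at price[t]."""
    V = [[0.0] * (max_stock + 1) for _ in range(horizon + 1)]
    for t in range(horizon - 1, -1, -1):
        for s in range(max_stock + 1):
            best = float("-inf")
            for u in range(s + 1):  # cannot turbine more than the stock
                exp_val = 0.0
                for a, p in zip(inflows, probs):
                    s_next = min(max_stock, s - u + a)  # excess spills
                    exp_val += p * V[t + 1][s_next]
                best = max(best, price[t] * u + exp_val)
            V[t][s] = best
    return V

# two equiprobable inflow levels; rising prices reward storing water
V = solve_dam(max_stock=4, inflows=[0, 2], probs=[0.5, 0.5],
              price=[1.0, 2.0, 3.0], horizon=3)
```

DADP, by contrast, avoids enumerating the joint stock of a whole cascade, which is exactly what makes this exact recursion intractable at scale.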
Martin, Yvan. „Maintenance par la fiabilité : optimisation au sens de la maîtrise des risques“. Lyon, INSA, 1996. http://www.theses.fr/1996ISAL0015.
The aim of our work is the study of probabilistic models and the design of tools and statistical methods to optimize industrial maintenance processes, taking reliability, quality and availability into account. After a careful modelling of the most common life-cycle function (the "bathtub curve"), an optimal replacement frequency is determined using a self-adaptive simulation method. A complexity study led us to use a numerical method based on an evolutionary algorithm in order to optimize stock management in the case of hazardous material requisition. Through an analysis of production stoppages, a heuristic method allows us to optimize the maintenance policy in the context of opportunistic maintenance. With financial risk assessment in mind, we integrate imperfect knowledge of the life-cycle function and of the supply process into all of the methods developed above.
Therrien, Karine. „Validation et optimisation d'une méthode d'indice de risque de perte de phosphore“. Thesis, Université Laval, 2007. http://www.theses.ulaval.ca/2007/24468/24468.pdf.
Surface water quality impairment is primarily caused by phosphorus (P) lost from surrounding agricultural fields. This knowledge triggered the development of tools for predicting P losses that take transport and source factors into account. In the United States, a great number of P index (PI) approaches have been developed and adopted. Because these PIs are not necessarily applicable in the province of Québec, a P index specific to Québec has been developed: the Phosphorus Risk Index (PRI). The objective of this research was to validate and possibly improve the PRI method using P losses measured on nine experimental agricultural plots located on the IRDA farm in Saint-Lambert-de-Lauzon. For each plot, P losses and the amounts of water from runoff and tile drainage were measured continuously over a two-year period (2001-2002) using automated systems. Total P losses averaged 540 g ha-1, of which 95% was exported via the subsurface drainage system. For each plot, the selected P loss value of each PRI component was multiplied by the weight assigned to that component, and the products were summed to obtain the final PRI value, which is associated with one of five P loss ratings (very low, low, medium, high and very high). Results indicate a correlation coefficient of 0.63 between the measured total P losses and the PRI values. In order to improve this relationship, the trust-region algorithm of the SAS non-linear programming procedure (proc NLP) was used to optimize the weights and P loss potential values of the components. Optimization raised the correlation coefficient to 0.92. The modifications kept the additive structure of the PRI method. Results are only valid for the Saint-Lambert plots; generalization across Québec would require experiments over a range of soil, climate and agronomic conditions. Results from this research indicate a possible improvement in the predictive accuracy of the PRI method.
Keywords: Phosphorus index, phosphorus losses, optimization, risk assessment, source factors, transport factors.
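The additive structure of a phosphorus index of this kind, and the correlation check against measured losses, can be sketched as follows. The component ratings, weights and loss values are made-up numbers for illustration, not the PRI calibration:

```python
def pri_score(ratings, weights):
    """Additive index: weighted sum of component ratings."""
    return sum(r * w for r, w in zip(ratings, weights))

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# three hypothetical plots, two components (source, transport)
scores = [pri_score(r, [0.6, 0.4]) for r in [(1, 2), (3, 3), (5, 4)]]
measured = [120.0, 410.0, 800.0]  # illustrative P losses, g/ha
r = pearson(scores, measured)
```

Optimizing the weights against measured losses, as done in the thesis with SAS proc NLP, amounts to maximizing this correlation (or minimizing a least-squares criterion) over the weight vector.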
Hamdi, Faiza. „Optimisation et planification de l'approvisionnement en présence du risque de rupture des fournisseurs“. Thesis, Ecole nationale des Mines d'Albi-Carmaux, 2017. http://www.theses.fr/2017EMAC0002/document.
Trade liberalization, the development of means of transport and the economic growth of emerging countries have led to the globalization of supply chains, an irreversible phenomenon. While globalization reduces costs, it also multiplies the risk of disruption from the upstream to the downstream stages. In this thesis, we focus on the inbound stage of the supply chain, and more specifically on the case of a central purchasing body that must select suppliers and allocate orders among them. Any supplier may fail to deliver its orders, for internal reasons (poor quality) or external ones (natural disasters, transport problems). Depending on whether the selected suppliers deliver their orders or not, the transaction generates a profit or a loss. The objective of this thesis is to provide decision support tools for this problem that take the decision maker's attitude toward risk into account. We propose stochastic mixed-integer linear programs to model the problem. The first part focuses on the development of a visual decision support tool that allows a decision maker to find a compromise between maximizing the expected profit and minimizing the risk of loss. In the second part, we integrate the risk-estimation techniques VaR and CVaR into the problem; the objective is to help the decision maker simultaneously minimize the expected cost and the conditional value at risk, via the calculation of the VaR. Results show that the decision maker must take the different disruption scenarios into account regardless of their probability of occurrence.
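The VaR/CVaR quantities mentioned above can be illustrated on a small set of scenario losses. The empirical estimator below is a standard textbook version on invented numbers, not the thesis's stochastic programs:

```python
def var_cvar(losses, alpha=0.8):
    """Empirical Value-at-Risk and Conditional Value-at-Risk at level
    alpha for a list of scenario losses (positive = loss): VaR is the
    alpha-quantile, CVaR the mean of losses at or beyond the VaR."""
    srt = sorted(losses)
    k = int(alpha * len(srt))
    var = srt[k] if k < len(srt) else srt[-1]
    tail = [l for l in srt if l >= var]
    cvar = sum(tail) / len(tail)
    return var, cvar

# ten illustrative scenario losses (negative values are gains)
losses = [-5, -2, 0, 1, 3, 4, 8, 12, 20, 35]
var, cvar = var_cvar(losses, alpha=0.8)
```

In an optimization model, CVaR is attractive because, unlike VaR, it admits a linear-programming reformulation over the scenarios.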
Forsell, Nicklas. „Planification dans le risque et l'incertain : optimisation des stratégies de gestion spatiale des forêts“. Toulouse 3, 2009. http://www.theses.fr/2009TOU30260.
This thesis concentrates on the optimization of large-scale management policies under conditions of risk and uncertainty. In paper I, we address the problem of solving large-scale spatial and temporal natural resource management problems. To model these types of problems, the framework of graph-based Markov decision processes (GMDPs) can be used. Two algorithms for the computation of high-quality management policies are presented: the first is based on approximate linear programming (ALP) and the second on mean-field approximation and approximate policy iteration (MF-API). The applicability and efficiency of the algorithms were demonstrated by their ability to compute near-optimal management policies for two large-scale management problems. It was concluded that the two algorithms compute policies of similar quality. However, the MF-API algorithm should be used when both the policy and the expected value of the computed policy are required, while the ALP algorithm may be preferred when only the policy is required. In paper II, a number of reinforcement learning algorithms are presented that can be used to compute management policies for GMDPs when the transition function can only be simulated because its explicit formulation is unknown. Studies of the efficiency of the algorithms on three management problems led us to conclude that some of them are able to compute near-optimal management policies. In paper III, we used the GMDP framework to optimize long-term forestry management policies under stochastic wind-damage events. The model was demonstrated by a case study of an estate consisting of 1,200 ha of forest land, divided into 623 stands. We concluded that managing the estate according to the risk of wind damage increased the expected net present value (NPV) of the whole estate only slightly, by less than 2%, under different wind-risk assumptions.
Most of the stands were managed in the same manner as when the risk of wind damage was not considered. However, the analysis rests on properties of the model that need to be refined before definite conclusions can be drawn.
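For readers unfamiliar with the MDP machinery underlying GMDPs, plain value iteration on a tiny two-state stand-management caricature looks as follows. The states, actions, rewards and transition probabilities are invented; the thesis's ALP and MF-API algorithms are approximations designed precisely for problems far too large for this exact approach:

```python
def value_iteration(P, R, gamma=0.9, tol=1e-8):
    """Exact value iteration: P[s][a][s2] are transition probabilities,
    R[s][a] immediate rewards; returns the value function and a greedy
    policy for the discounted infinite-horizon MDP."""
    n, m = len(P), len(P[0])
    V = [0.0] * n
    while True:
        newV = [max(R[s][a] + gamma * sum(P[s][a][s2] * V[s2]
                                          for s2 in range(n))
                    for a in range(m)) for s in range(n)]
        if max(abs(a - b) for a, b in zip(newV, V)) < tol:
            V = newV
            break
        V = newV
    policy = [max(range(m),
                  key=lambda a: R[s][a] + gamma * sum(P[s][a][s2] * V[s2]
                                                      for s2 in range(n)))
              for s in range(n)]
    return V, policy

# states: 0 = "healthy stand", 1 = "wind-damaged"; actions: 0 = wait, 1 = harvest
P = [[[0.9, 0.1], [1.0, 0.0]], [[0.0, 1.0], [1.0, 0.0]]]
R = [[1.0, 5.0], [0.0, 2.0]]
V, policy = value_iteration(P, R)
```

A GMDP couples many such local MDPs through a graph of neighbourhood interactions, which is what makes the exact joint recursion explode combinatorially.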
Gérard, Henri. „Stochastic optimization problems : decomposition and coordination under risk“. Thesis, Paris Est, 2018. http://www.theses.fr/2018PESC1111/document.
We consider stochastic optimization and game theory problems with risk measures. In the first part, we focus on time consistency. We begin by proving an equivalence between time-consistent mappings and the existence of a nested formula. Motivated by well-known examples in risk measures, we investigate three classes of mappings: translation-invariant mappings, Fenchel-Moreau transforms and supremum mappings. We then extend the concept of time consistency to player consistency, by replacing the sequential time by an arbitrary unordered set and mappings by arbitrary relations. Finally, we show how player consistency relates to sequential and parallel forms of decomposition in optimization. In the second part, we study how risk measures affect the multiplicity of equilibria in dynamic game problems in complete and incomplete markets. We design an example in which the introduction of risk measures leads to the existence of three equilibria instead of one in the risk-neutral case, and we analyze the ability of two different algorithms to recover the different equilibria. We discuss links between player consistency and equilibrium problems in games. In the third part, we study distributionally robust optimization in machine learning. Using convex risk measures, we provide a unified framework and propose an adapted algorithm covering three ambiguity sets discussed in the literature.
Torossian, Léonard. „Méthodes d'apprentissage statistique pour la régression et l'optimisation globale de mesures de risque“. Thesis, Toulouse 3, 2019. http://www.theses.fr/2019TOU30192.
This thesis presents methods for the estimation and optimization of stochastic black-box functions. Motivated by the necessity of taking risk-averse decisions in medicine, agriculture or finance, we focus on indicators able to quantify certain characteristics of the output distribution, such as the variance or the size of the tails. These indicators, also known as measures of risk, have received a lot of attention during the last decades. Based on the existing literature on risk measures, we chose to focus this work on quantiles, CVaR and expectiles. First, we compare the following approaches for quantile regression on stochastic black-box functions: K-nearest neighbors, random forests, RKHS regression, neural network regression and Gaussian process regression. We then propose a new regression model based on chained Gaussian processes inferred by variational techniques; although the approach was initially designed for quantile regression, we show that it can easily be applied to expectile regression. The study then turns to the optimization of risk measures. We propose a generic approach, inspired by the X-armed bandit framework, that yields an optimizer and an upper bound on the simple regret adaptable to any risk measure; its relevance is illustrated by the optimization of quantiles and CVaR. Finally, optimization algorithms for the conditional quantile and expectile are developed, based on Gaussian processes combined with UCB and Thompson sampling strategies.
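Quantile regression rests on the pinball (quantile) loss, whose minimizer over a constant prediction is the empirical tau-quantile. The brute-force sketch below illustrates this property on invented data; it is not the chained-Gaussian-process model of the thesis:

```python
def pinball_loss(y_true, y_pred, tau):
    """Pinball loss: asymmetric absolute error penalizing
    under-prediction with weight tau and over-prediction with 1 - tau."""
    total = 0.0
    for y, q in zip(y_true, y_pred):
        diff = y - q
        total += tau * diff if diff >= 0 else (tau - 1) * diff
    return total / len(y_true)

def fit_constant_quantile(y, tau, grid):
    """Brute-force the constant c minimizing the pinball loss."""
    return min(grid, key=lambda c: pinball_loss(y, [c] * len(y), tau))

y = [1.0, 2.0, 3.0, 4.0, 100.0]  # heavy right tail
median = fit_constant_quantile(y, 0.5, [x / 10 for x in range(0, 1100)])
```

With tau = 0.5 the minimizer is the median (3.0 here), unaffected by the outlier at 100; choosing tau near 1 targets the right tail, which is what makes the loss useful for risk-averse decisions.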
Papa, Guillaume. „Méthode d'échantillonnage appliqué à la minimisation du risque empirique“. Electronic Thesis or Diss., Paris, ENST, 2018. http://www.theses.fr/2018ENST0005.
In this manuscript, we present and study sampling strategies applied to problems arising in statistical learning. The goal is to deal with the problems that usually occur in a large-data context, when the number of observations and their dimensionality constrain the learning process. We propose to address this using two sampling strategies: accelerating the learning process by sampling the most helpful observations, and simplifying the problem by discarding some observations to reduce its complexity and size. We first consider binary classification when the observations used to train a classifier come from a sampling/survey scheme and present a complex dependency structure, for which we establish generalization bounds. We then study the implementation of stochastic gradient descent when observations are drawn non-uniformly. We conclude this thesis by studying the problem of graph reconstruction, for which we establish new theoretical results.
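Non-uniform sampling in stochastic gradient descent is usually paired with importance weights so that the expected update matches the full gradient. A minimal sketch on a one-dimensional least-squares toy (data, probabilities and step size all invented) is:

```python
import random

def sgd_nonuniform(xs, ys, probs, steps=2000, lr=0.05, seed=0):
    """SGD for the 1-D least-squares model y ~ w * x, drawing example i
    with probability probs[i] and reweighting its gradient by
    1 / (n * probs[i]) so that the update stays unbiased."""
    rng = random.Random(seed)
    n = len(xs)
    cum, acc = [], 0.0
    for p in probs:
        acc += p
        cum.append(acc)
    w = 0.0
    for _ in range(steps):
        u = rng.random()
        i = next(k for k, c in enumerate(cum) if u <= c)
        grad = 2 * (w * xs[i] - ys[i]) * xs[i] / (n * probs[i])
        w -= lr * grad
    return w

xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]      # exact fit at w = 2
probs = [0.2, 0.3, 0.5]   # non-uniform sampling distribution
w = sgd_nonuniform(xs, ys, probs)
```

Without the 1/(n * probs[i]) factor, frequently sampled points would dominate and the iterate would converge to a biased solution.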
Blanchet-Scalliet, Christophette. „Processus à sauts et risque de défaut“. PhD thesis, Université d'Evry-Val d'Essonne, 2001. http://tel.archives-ouvertes.fr/tel-00192209.
The second part is devoted to modelling default risk. We emphasize the difference between the information associated with the default and that of the default-free market. We establish predictable representation theorems for martingales in the enlarged filtration.
Perchet, Romain. „Construction et gestion d'un portefeuille en budget de risque“. Paris, EHESS, 2015. http://www.theses.fr/2015EHES0179.
This article-based thesis proposes solutions for investors to build and manage portfolios while taking the frequency of crises into account. The first chapter explains the motivation for the articles. The second chapter proposes a solution for building robust medium-term asset allocations, while chapters three and four offer solutions for managing short-term risk. For a number of different formulations of robust portfolio optimization, quadratic and absolute, I show that (a) in the limit of low uncertainty in the estimated asset mean returns, the robust portfolio converges towards the mean-variance portfolio obtained with the same inputs, and (b) in the limit of high uncertainty, it converges towards a risk-based portfolio whose form depends on how the uncertainty in the estimated mean returns is defined. Inter-temporal risk parity is a strategy that rebalances between a risky asset and cash in order to target a constant level of risk over time. When applied to equities and compared with a buy-and-hold strategy, it is known to improve the Sharpe ratio and reduce drawdowns. I used Monte Carlo simulations based on a number of parametric time-series models from the GARCH family to analyze the relative importance of several effects in explaining those benefits. I also apply these strategies to factor investing, namely value and momentum investing in equities, government bonds and foreign exchange: value and momentum factors generate a premium that is traditionally captured by dollar-neutral long-short portfolios, rebalanced every month to take changes in factor exposures into account and keep leverage constant.
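The inter-temporal risk parity rule described above can be sketched as a volatility-targeting weight: the allocation to the risky asset is the ratio of target to realized volatility, capped at full investment, with the remainder in cash. The window length, target level and return series below are illustrative assumptions, not the thesis's calibration:

```python
import statistics

def risk_parity_weights(returns, target_vol=0.10, window=20, cap=1.0):
    """For each date, estimate annualized realized volatility over the
    trailing window and set the risky-asset weight to
    min(cap, target_vol / realized_vol); the rest sits in cash."""
    weights = []
    for t in range(window, len(returns)):
        vol = statistics.pstdev(returns[t - window:t]) * (252 ** 0.5)
        weights.append(min(cap, target_vol / vol) if vol > 0 else cap)
    return weights

calm = [0.001, -0.001] * 15      # low-volatility regime
stressed = [0.03, -0.03] * 15    # high-volatility regime
weights = risk_parity_weights(calm + stressed)
```

The strategy is fully invested in the calm regime and automatically de-leverages once realized volatility spikes, which is the mechanism behind the drawdown reduction discussed in the abstract.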
Favier, Philomène. „Une approche intégrée du risque avalanche : quantification de la vulnérabilité physique et humaine et optimisation des structures de protection“. Thesis, Grenoble, 2014. http://www.theses.fr/2014GRENU051/document.
Long-term avalanche risk quantification for mapping and for the design of defense structures is done in most countries on the basis of high-magnitude events. Such return period/level approaches, purely hazard-oriented, do not consider the elements at risk (buildings, people inside them, etc.) explicitly, and neglect possible budgetary constraints. To overcome these limitations, risk-based zoning methods and cost-benefit analyses have emerged recently. They combine the hazard distribution with vulnerability relations for the elements at risk. Hence, the systematic vulnerability assessment of buildings can lead to a better quantification of the risk in avalanche paths. In practice, however, available vulnerability relations remain mostly limited to scarce empirical estimates derived from the analysis of a few catastrophic events. Besides, existing risk-based methods remain computationally intensive and based on debatable assumptions regarding hazard modelling (choice of a few scenarios, little consideration of extreme values, etc.). In this thesis, we tackle these problems by building reliability-based fragility relations to snow avalanches for several building types and the people inside them, and by incorporating these relations into a framework for risk quantification and the optimal design of defense structures. We thereby enrich the avalanche vulnerability and risk toolboxes with approaches of various complexity, usable in practice under different conditions, depending on the case study and on the time available to conduct the study. The developments made are detailed in four papers/chapters. In paper one, we derive fragility curves associated with different limit states for various reinforced concrete (RC) buildings loaded by an avalanche-like uniform pressure. The numerical methods used to describe the RC behaviour consist of civil engineering abacuses and a yield-line-theory model, to make the computations as fast as possible.
Different uncertainty propagation techniques make it possible to quantify the fragility relations linking pressure to failure probability, and to study the weight of the different parameters and of the assumptions regarding the probabilistic modelling of the joint input distribution. In paper two, the approach is extended to more complex numerical building models, namely a mass-spring model and a finite element one. Much more realistic descriptions of RC walls are thus obtained, which are useful for complex case studies requiring detailed investigations. The idea, however, is still to derive fragility curves with the simpler, faster-to-run, but well-validated mass-spring model, in a "physically-based meta-modelling" spirit. In paper three, with various fragility relations for RC buildings at hand, we propose new relations linking the death probability of people inside them to the avalanche load. These two sets of fragility curves, for buildings and for humans, are then exploited in a comprehensive risk sensitivity analysis. In this way, we highlight the gap that can exist between return-period-based zoning methods and acceptable risk thresholds. We also show, on a typical dam design case, the higher robustness of optimal design approaches to the choice of vulnerability relations. In paper four, we propose simplified analytical risk formulas based on extreme value statistics to quantify risk and perform the optimal design of an avalanche dam in an efficient way. A sensitivity study is conducted to assess the influence of the chosen statistical distributions and of the flow-obstacle interaction law, highlighting the need, for precise risk evaluations, to characterise well the tail behaviour of extreme runouts and the predominant patterns in avalanche-structure interactions.
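An empirical fragility curve of the kind discussed here is simply the failure probability as a function of load. The Monte Carlo sketch below draws a random wall resistance (a uniform distribution chosen purely for illustration, not the reliability models of the thesis) and counts failures at each pressure level:

```python
import random

def fragility_curve(pressures, n_samples=5000, mean_res=30.0,
                    spread=10.0, seed=1):
    """Empirical fragility: for each applied pressure, the fraction of
    Monte Carlo draws of a random wall resistance (uniform around
    mean_res, an illustrative choice) that fail, i.e. pressure > resistance."""
    rng = random.Random(seed)
    resistances = [rng.uniform(mean_res - spread, mean_res + spread)
                   for _ in range(n_samples)]
    curve = []
    for p in pressures:
        fails = sum(1 for r in resistances if p > r)
        curve.append(fails / n_samples)
    return curve

curve = fragility_curve([0, 20, 30, 40, 60])  # illustrative pressures, kPa
```

Replacing the uniform resistance by the output of a structural model (abacus, yield-line, mass-spring or finite element) and propagating parameter uncertainty is what turns this toy into a reliability-based fragility relation.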
Grüss, Sandrine. „Optimisation de la prise en charge de l'hémorragie du postpartum : analyse des facteurs de risque et des pratiques professionnelles“. Electronic Thesis or Diss., université Paris-Saclay, 2024. http://www.theses.fr/2024UPASR026.
Postpartum hemorrhage is one of the most important causes of maternal morbidity and mortality in France. Despite medical advances, its optimal management remains a challenge, not least because of the complexity of coordinating different treatments in a multi-professional emergency setting. This thesis focused on optimizing the management of postpartum hemorrhage through an in-depth analysis of risk factors and professional practices, particularly in relation to the use of uterotonics. The research is structured in three parts. First, we analyzed the risk factors associated with postpartum hemorrhage, stratifying our analyses by parity in order to better understand its determinants and to identify high-risk populations, using data from a randomized clinical trial. Second, we assessed the professional practices of midwives, focusing in particular on the sometimes inappropriate use of oxytocin as a first-line treatment, identifying deviations from the recommendations by means of a clinical vignette survey distributed to French midwives. Finally, this work identified the factors leading to failure of treatment with sulprostone, a second-line uterotonic, in order to reduce the invasive procedures that generate significant morbidity, using Bayesian statistical analysis based on data from a bicentric cohort. In conclusion, this work provides food for thought for the optimized detection and management of postpartum hemorrhage, with the emphasis on early identification of modifiable risk factors such as the appropriate use of uterotonics.
Capitanul, Elena Mihaela. „Airport strategic planning under uncertainty : fuzzy dual dynamic programming approach“. Thesis, Toulouse 3, 2016. http://www.theses.fr/2016TOU30109/document.
Airports are critical connectors in the air transportation operational system. In order to meet their operational, economic and social obligations in a very volatile environment, airports need to embrace change rather than resist it. Like any other industry, airports face a wide array of risks, some specific to air transportation, others having only an indirect influence yet powerful enough to disrupt airport activities. Long-term airport planning has become a complex issue due to the constant growth in air traffic demand, and a new dimension of complexity emerged when uncertainty began to have an ever more disruptive and significantly costly impact on the development of airport infrastructure. Historically, traditional risk and uncertainty mitigation tools have proved inefficient: countless unforeseen events, such as terrorist attacks, economic recessions and natural disasters, have had a dramatic impact on traffic levels, some with a global reach. To these highly improbable types of events can be added technological advances, new airline and airport business models, policy and regulation changes, and increasing concern for environmental impact. In this context, the thesis puts forward an innovative approach for risk assessment and mitigation under uncertainty in long-term airport infrastructure development projects. The thesis builds on the newly developed formalism of fuzzy dual numbers as a key tool to address uncertainty. After a comprehensive review of the airport industry in the context of uncertain environments, fuzzy dual numbers and fuzzy dual calculus are introduced. Since airport infrastructure development is another case of a multi-stage decision-making problem, dynamic programming is considered in order to optimize the sequential decision-making process. The originality of the approach resides in the fact that the entire process is fuzzified and fuzzy dual dynamic programming components are introduced.
To validate the method, a case study is developed.
Apparigliato, Romain. „Règles de décision pour la gestion du risque : Application à la gestion hebdomadaire de la production électrique“. PhD thesis, Ecole Polytechnique X, 2008. http://pastel.archives-ouvertes.fr/pastel-00004166.
Tsimaratos, Alkis. „Optimisation du portefeuille de chiffre d'affaires d'une société d'assurance non-vie sur le marché de la réassurance et des grands risques“. Université Louis Pasteur (Strasbourg) (1971-2008), 2002. http://www.theses.fr/2002STR1EC02.
Insurance prices in the reinsurance and large industrial risks markets are highly volatile; their variations are conditioned by the occurrence (or not) of major losses. Companies operating in this market therefore review their underwriting portfolio every year, with the objective of maximizing value creation while limiting the risk taken with respect to the shareholders' funds they hold. The underlying principles of this problem can be compared to asset portfolio management. In this PhD thesis, we first present the industrial context of the reinsurance and large risks market from which the subject arose. We then formalize an original analytical framework in which we develop various models for optimizing the underwriting portfolio of a company in that market. These models are multiperiodic and rely on a financial projection model in a stochastic environment. The main source of uncertainty considered is that of insurance claims charges, with a separate treatment proposed for the major loss components. A company's attitude toward risk is analyzed through different objective functions and constraints and the associated optimal strategies. These developments led us to propose a new operational model for optimizing underwriting portfolios, derived from asset allocation models. Our model relies on an analytical expression of the balance sheet and the result of the company in a stochastic environment. Compared with traditional simulation approaches, our analytical one is characterized by its fast processing: it enables us to build operational efficient frontiers for our business instantaneously. We illustrate the benefits of this approach numerically by applying it to the AXA Réassurance underwriting portfolio, in the context of strategic planning.
Dubuc, Carole. „Vers une amélioration de l’analyse des données et une optimisation des plans d’expérience pour une analyse quantitative du risque en écotoxicologie“. Thesis, Lyon 1, 2013. http://www.theses.fr/2013LYO10039/document.
In ecotoxicology, the effects of toxic compounds on living organisms are usually measured at the individual level, in the laboratory and according to standards, which ensures the reproducibility of bioassays and the control of environmental factors. Bioassays, in acute or chronic toxicity, generally apply to the survival, reproduction and growth of organisms. The statistical analysis of standardized bioassays classically leads to the estimation of critical effect concentrations used in risk assessment. Nevertheless, several methods and models are used to determine a critical effect concentration, and they are more or less suited to the type of data at hand. The first aim of this work is therefore to select the methods and models best suited to the data, so as to improve the estimation of critical effect concentrations. Data sets are usually built from standard bioassays and thus follow recommendations on exposure duration, on the number and range of tested concentrations, and on the number of individuals per concentration. These recommendations are not necessarily the most appropriate for each critical effect concentration and each method or model. The second aim of this work is therefore to optimize the experimental design in order to improve the estimation of critical effect concentrations at a fixed cost, or at least to reduce the waste of time and organisms.
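A critical effect concentration such as the EC50 is typically estimated by fitting a dose-response model to bioassay data. The sketch below fits a two-parameter log-logistic curve by grid-search least squares on invented data; real analyses would use dedicated tools with proper inference on the estimates:

```python
def loglogistic(c, ec50, slope):
    """Two-parameter log-logistic dose-response: fraction affected at
    concentration c, equal to 0.5 when c == ec50."""
    return 1.0 / (1.0 + (ec50 / c) ** slope) if c > 0 else 0.0

def fit_ec50(concs, effects, ec50_grid, slope_grid):
    """Least-squares grid search over (EC50, slope) candidates."""
    def sse(ec50, slope):
        return sum((loglogistic(c, ec50, slope) - e) ** 2
                   for c, e in zip(concs, effects))
    return min(((e50, s) for e50 in ec50_grid for s in slope_grid),
               key=lambda pair: sse(*pair))

concs = [0.1, 1.0, 10.0, 100.0]      # invented test concentrations
effects = [0.02, 0.20, 0.80, 0.98]   # invented fractions of affected organisms
ec50, slope = fit_ec50(concs, effects,
                       [x / 10 for x in range(5, 51)],   # EC50 in 0.5 .. 5.0
                       [s / 10 for s in range(5, 31)])   # slope in 0.5 .. 3.0
```

Optimizing the experimental design, as in the thesis, amounts to choosing the concentrations and the number of organisms per concentration so that estimates like this EC50 come out with the smallest possible uncertainty for a given cost.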
Jaziri, Wassim. „Modélisation et gestion des contraintes pour un problème d'optimisation sur-contraint : application à l'aide à la décision pour la gestion du risque de ruissellement“. PhD thesis, Rouen, INSA, 2004. http://www.theses.fr/2004ISAM0007.
Jaziri, Wassim. „Modélisation et gestion des contraintes pour un problème d'optimisation sur-contraint : Application à l'aide à la décision pour la gestion du risque de ruissellement“. PhD thesis, INSA de Rouen, 2004. http://tel.archives-ouvertes.fr/tel-00008124.
Morhun, Nicolas. „Optimisation et sécurisation des investissements immobiliers russes en France“. Thesis, Paris Sciences et Lettres (ComUE), 2016. http://www.theses.fr/2016PSLED067.
Based on an economic analysis of investment, an approach that is increasingly tending to develop in contemporary law, this study of the optimisation and securing of Russian real estate investment in France puts the risk of money laundering into perspective. The thesis shows that although such a risk cannot be denied, it can still be evaluated by implementing a management approach that optimises the client's issues and interests. The analysis of investment risk requires consideration of questions relating to private international law, international taxation, the financing of transactions and the implementation of guarantees. As a rule, the money laundering risk leads the various professionals involved in a transaction to fear the worst; yet such risks can be assessed through the economic and legal analysis that serves the investor's interests. Understanding the issues and reasons for the investment, while trying to find solutions to secure the investment process, is the objective of this thesis.
Guillet, Julie. „Les papillomavirus Humains dans les cancers des Voies Aéro-Digestives Supérieures : optimisation de méthodes de détection et étude de populations à risque“. Thesis, Université de Lorraine, 2016. http://www.theses.fr/2016LORR0050/document.
Human papillomaviruses (HPV) are involved in almost 100% of cervical cancers. Recently, HPV has been recognized as a cause of tumors of the upper aerodigestive tract, especially squamous cell carcinoma of the oropharynx. In France, the proportion of HPV-related oropharyngeal tumors is unknown, partly because viral testing is not included in the guidelines. Moreover, assessing the proportion of HPV-positive tumors in tumor banks is difficult because the tumor samples are formalin-fixed and paraffin-embedded (FFPE), which complicates detection techniques. We tested a high-risk HPV detection method, indicated for liquid-based Pap smears, on FFPE samples. We compared this technique to the gold standard: PCR (polymerase chain reaction) followed by electrophoresis. Our results indicate that this technique is applicable to FFPE samples and even appears to be more sensitive. The majority of French patients (two thirds) with head and neck cancer present with an advanced stage of disease. This is explained in part by the lack of organized screening for these cancers, contrary to breast, prostate, cervical or colorectal cancers. Yet early treatment is essential to increase the survival rate. We therefore conducted a prospective study on patients with head and neck tumors to test oral brushing for cancer screening and HPV detection. We found tumor and/or dystrophic cells in 97.8% of patients by biopsy, and in 88.9% of patients by brushing. Compared with biopsy, our results suggest that the smear has a similar specificity for HPV detection in tumors (94.4%) but a lower sensitivity (66.7%). This study found an HPV-related tumor in 12.2% of cases. Among them, we detected by brushing (in a healthy area) an oral infection by high-risk HPV in 53.3% of cases. The WHO has classified HPV as a carcinogenic agent since 1995, and determined that patients who developed cervical cancer are six times more likely to develop another HPV-related tumor. 
In this context, we have planned a multicenter prospective study to detect oral HPV infection in patients with a pre-neoplastic or neoplastic lesion of the cervix. The co-infection rate of the two anatomical sites is unknown in women with a genital infection. Insofar as an oral infection could be the cause of a second tumor location, it seems important to know how many women are co-infected in order to subsequently propose specific monitoring. Preventive vaccination against HPV 16 and 18, which already exists for the prevention of cervical cancer, is a future perspective. Because HPV 16 is found in 90% of HPV-related squamous cell carcinomas of the oropharynx, extending the vaccine recommendations emerges as a new public health issue
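The screening comparison above rests on standard confusion-matrix statistics. A minimal sketch, with illustrative counts chosen to be of the same order as the reported rates (not the study's actual data):

```python
def sensitivity(true_pos, false_neg):
    """Fraction of diseased cases the test detects (true positive rate)."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    """Fraction of healthy cases the test clears (true negative rate)."""
    return true_neg / (true_neg + false_pos)
```

For instance, 10 detected out of 15 HPV-positive tumors gives a sensitivity of about 66.7%, and 17 correct negatives out of 18 HPV-negative tumors gives a specificity of about 94.4%, matching the orders of magnitude quoted above.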
Yacoub, Mohamed. „Conception des systèmes : calculs technico-économiques et aide à la décision dans la conduite d'un projet "recherche"“. Toulouse 3, 2004. http://www.theses.fr/2004TOU30064.
The increasing requirements in the management of Research & Development call for a specific project-control methodology. Indeed, R&D has obvious characteristics: it is very close to innovation and very far from the stages of industrialization and marketing. The control methodology that we explored bases decisions on long-term technico-economic modelling, taking technological development into account in the form of optimization stages or the selection of scenarios producing the target specified for the project. Long-term modelling is carried out by coupling two tools: Microsoft Project for planning, and WinNag for economic and financial evaluation. We propose that the establishment of the planning be based on the definition of the functions established by the upstream design, associating the non-functional data with it
Mouti, Saad. „Le management du risque pour les compagnies d'assurance : une approche marchés financiers“. Thesis, Paris 6, 2017. http://www.theses.fr/2017PA066744.
This thesis tackles several aspects of the financial risks encountered in the life insurance industry, particularly in a class of products that insurers offer, namely variable annuities and unit-linked products. It consists of three distinct topics and is split into six chapters that can be read independently. In variable annuities (VAs), policyholder behavior is a major risk for the insurer that affects the life insurance industry in almost every aspect. The first two chapters of the first part deal with optimal policyholder behavior for two VA products: we address rational lapse behavior in the guaranteed minimum account benefit (GMAB), and optimal withdrawals in the guaranteed minimum income benefit (GMIB). The third chapter is dedicated to a class of unit-linked products from a management and hedging point of view. The second topic consists of one chapter and addresses the optimal execution of a large book of options. Typically, life insurance products are partially hedged using vanilla options. We consider the case where prices are affected by the traded quantity, and seek an optimal strategy that minimizes the expected cost and a mean-variance criterion. Finally, in the last topic we study the volatility process using two different proxies. First, range-based estimators that rely on asset price range data allow us to double-check that volatility is a rough process, in the sense that it has a scaling parameter H less than 1/2. Then, short time-to-maturity implied volatility, and a refined version of it, allows us to confirm that the rough behavior of volatility is universal across different proxies
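The rough-volatility claim (scaling parameter H < 1/2) can be illustrated with a simple moment-scaling estimator: regress the log of the mean absolute increment against the log of the lag. The sketch below is a generic estimator, not the thesis's refined proxies; on standard Brownian motion it recovers H ≈ 1/2, while a rough volatility series would yield a smaller exponent.

```python
import math
import random

def scaling_exponent(path, lags=(1, 2, 4, 8, 16)):
    """Estimate H from E|X_{t+d} - X_t| ~ d^H via log-log least squares."""
    xs, ys = [], []
    for d in lags:
        incs = [abs(path[i + d] - path[i]) for i in range(len(path) - d)]
        xs.append(math.log(d))
        ys.append(math.log(sum(incs) / len(incs)))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # slope of the regression line = scaling exponent H
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Sanity check on a Brownian path (H should be about 0.5)
rng = random.Random(0)
bm = [0.0]
for _ in range(20000):
    bm.append(bm[-1] + rng.gauss(0.0, 1.0))
```

Applied to log-volatility proxies instead of a Brownian path, the same regression is what reveals an exponent well below 1/2.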
Shetabi, Mehrafarin. „Opacité, analystes financiers et évaluation optimisée du risque dans l'industrie bancaire et les FinTech“. Electronic Thesis or Diss., Limoges, 2025. http://www.theses.fr/2025LIMO0001.
This thesis provides a comprehensive investigation into the intricate interactions between bank opacity, financial analyst influence, and risk optimization, focusing on the dual contexts of banking and FinTech. Through three detailed studies, it explores how opacity and analyst behavior shape financial stability across varied regulatory environments and introduces an innovative, adaptive approach to optimizing credit risk assessment in FinTech lending. The first chapter rigorously examines the destabilizing effects of bank opacity, particularly during periods of market overvaluation and economic uncertainty, with a focus on U.S. and European banks. Using analyst forecast errors and dispersion as forward-looking measures of opacity, the study reveals that high opacity significantly heightens risk, especially in smaller, opaque U.S. institutions where analyst coverage paradoxically amplifies market sensitivity to negative earnings signals. By contrast, the effects in European banks are less pronounced, reflecting differences in regulatory frameworks and market structures. Additionally, high dividend payouts are shown to intensify opacity-driven risks, highlighting the intricate relationship between transparency, market discipline, and financial prudence in shaping bank stability. The second chapter investigates the role of financial analyst characteristics and career incentives in forecasting accuracy, boldness, and career trajectories across global banking markets, comparing trends in the U.S., Europe, and Asia. The findings demonstrate that experience, firm affiliation, and portfolio breadth significantly influence forecasting behavior, with distinct regional patterns. In the U.S., experienced analysts at leading firms provide bold and accurate forecasts, while younger analysts tend toward herding. In Europe, larger portfolios reduce forecasting accuracy, and younger analysts employ boldness to stand out, often at the expense of precision. 
These results highlight how regional labor market dynamics and career incentives shape analysts' forecasting contributions and their impact on market discipline. The third chapter addresses the challenges of credit risk assessment in FinTech lending, introducing EFSGA, an evolutionary-based ensemble learning framework that integrates genetic algorithms with machine learning. EFSGA is designed for dynamic, application-specific credit risk classification, balancing predictive accuracy with interpretability while adapting to evolving market conditions in real time. The model significantly outperforms traditional methods in handling unbalanced datasets and providing timely, actionable insights for credit risk management, establishing itself as a powerful tool for post-loan monitoring and risk mitigation in digital finance. Together, these studies offer a holistic analysis of opacity, financial analyst behavior, and advanced risk assessment techniques within banking and FinTech contexts. By clarifying the role of opacity and analyst pressures in shaping stability across financial systems and presenting cutting-edge tools for managing credit risk in emerging digital markets, this thesis provides crucial insights for financial institutions, regulators, and policymakers striving to foster systemic resilience and transparency in an increasingly interconnected financial landscape
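EFSGA itself is not specified in the abstract. As a hedged illustration of the underlying idea, evolutionary feature selection, here is a minimal genetic algorithm over binary feature masks with elitism, one-point crossover and mutation; every name and parameter below is an assumption for the sketch, not the thesis's framework.

```python
import random

def genetic_feature_selection(fitness, n_features, pop_size=30, generations=80,
                              mutation_rate=0.1, seed=0):
    """Evolve binary feature masks; higher fitness = better feature subset."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]            # elitism: keep the best half
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)          # parents drawn from the elite
            cut = rng.randrange(1, n_features)   # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < mutation_rate:     # occasionally flip one bit
                i = rng.randrange(n_features)
                child[i] ^= 1
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)
```

In a credit-scoring setting the fitness would wrap a classifier's validation score; here a toy fitness (reward a few "informative" features, penalize mask size) suffices to exercise the search.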
Guiras, Zouhour. „Contribution à l'optimisation des politiques de maintenance et l'analyse de risque dans la planification des opérations d’assemblage - désassemblage à deux niveaux“. Electronic Thesis or Diss., Université de Lorraine, 2019. http://www.theses.fr/2019LORR0016.
The reality of economic markets places constraints on manufacturing companies that are increasingly difficult to satisfy, such as product diversification, quality improvement, cost reduction and shorter delays. These constraints can be met through a better organization of manufacturing systems using existing technical resources. Our thesis focuses on two major contributions: the first is to model different industrial systems (simple production system, assembly system, disassembly system) while integrating maintenance policies; the second is a risk assessment of the profit loss following a decision taken after the optimization of an industrial system. Three different industrial problems are studied. The first concerns the development of methods for assessing the risk of profit loss resulting from the choice of an optimization algorithm to solve a joint production and maintenance planning problem. To achieve our goals, we start by computing production and maintenance plans using different optimization algorithms. In addition, we propose analytical models to quantify the risk of profit loss resulting from product returns and from repair times. This study provides information on the most effective optimization algorithms for the problems encountered, to help and guide decision-makers in the analysis and evaluation of their decisions. The second problem concerns the optimization of two-level assembly system planning. A mathematical model is developed to incorporate supply planning for two-level assembly systems with stochastic lead times and failures. The optimal maintenance plan obtained is used in the risk assessment to find the threshold repair period that reduces the profit loss. The third problem concerns the optimization of a disassembly system for returned products (used or end-of-life products), remanufacturing and the assembly of finished products, taking into account the degradation of the production system. 
An analytical model is developed to consider the disassembly and remanufacturing of returned products that contribute to the assembly of finished products. Indeed, the latter may consist of new or remanufactured components. A maintenance policy is then integrated to reduce the unavailability of the system. The goal of this study is to help decision makers, under certain conditions, choose the most cost-effective process that satisfies the customer and can also adapt to the potential risks that may disrupt the disassembly-remanufacturing-assembly system. The risk associated with system repair periods is discussed, as it has an impact on managerial decision-making
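The maintenance policies discussed above typically trade the cost of preventive replacement against the cost of failure. A classic building block (illustrative only, not the thesis's model) is age-based replacement under Weibull lifetimes: choose the replacement age T minimizing the expected cost per unit time.

```python
import math

def cost_rate(T, cp, cf, shape=2.0, scale=10.0, dt=0.01):
    """Expected cost per unit time for age-based replacement at age T.

    cp = preventive replacement cost, cf = failure cost (cf > cp),
    lifetimes are Weibull(shape, scale); all values are illustrative.
    """
    R = lambda t: math.exp(-((t / scale) ** shape))   # survival function
    steps = int(T / dt)
    mean_cycle = sum(R(i * dt) * dt for i in range(steps))  # E[min(lifetime, T)]
    return (cp * R(T) + cf * (1.0 - R(T))) / mean_cycle

def optimal_age(cp, cf):
    """Grid search over candidate replacement ages."""
    grid = [0.5 * k for k in range(1, 61)]
    return min(grid, key=lambda T: cost_rate(T, cp, cf))
```

With an increasing failure rate (shape > 1) and cf well above cp, the optimum is an interior age: replacing early wastes component life, replacing late pays the failure cost too often.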
Ghandour, Raymond. „Diagnostic et évaluation anticipée des risques de rupture d'itinéraires basés sur l'estimation de la dynamique du véhicule“. Compiègne, 2011. http://www.theses.fr/2011COMP1966.
The aim of this thesis is the development of an innovative methodology to address the issue of increasing road safety, through the diagnosis and monitoring of the evolution of the parameters of the dynamic interaction between the vehicle and its surrounding environment. For that purpose, the development and evaluation of risk indicators is necessary in order to warn the driver and avoid risk situations. The research work of this thesis is divided into two methodologies. The first consists of the development of an estimator of the maximum friction coefficient, based on the Dugoff tyre-road interaction model and the Levenberg-Marquardt iterative non-linear optimization method. This estimate is the basis of the lateral skid indicator (LSI), which compares the value of the used friction coefficient to the maximum one. An alert is generated when the value of the LSI exceeds a threshold, to warn the driver of risk situations. This methodology is validated in simulation using data from the CALLAS® vehicle dynamics simulator, and experimentally using data from the IFSTTARMA laboratory vehicle. The simulation data correspond to different road states (dry, wet, snowy and icy) and the experimental data correspond to a dry road. The second methodology consists of the development of an algorithm for the anticipation of risk situations through the evaluation of risk indicators at future instants. This method is based on assumptions about the trajectory and the longitudinal velocity and acceleration, in order to anticipate vehicle dynamics parameters such as the steering angle, the wheel rotational speed, the yaw rate, the side-slip angle, the normal forces, the lateral forces and the maximum friction coefficient. Knowing these parameters, we can calculate the risk indicators and evaluate them at future instants. 
The risk indicators evaluated in this method are the lateral load transfer (LTR), based on the normal forces, and the lateral skid indicator (LSI), based on the maximum friction coefficient. As with the estimation method, this method is validated using simulation data and experimental data. The results obtained with both methods demonstrate their applicability
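A minimal sketch of the LSI logic described above: the indicator compares the friction actually used (estimated here as the ratio of lateral to normal tire force) to the estimated maximum friction coefficient, and an alert is raised when it exceeds a threshold. The exact formula and the threshold value are assumptions for illustration, not the thesis's calibrated ones.

```python
def lateral_skid_indicator(lateral_force_n, normal_force_n, mu_max):
    """LSI: used friction (|Fy| / Fz) relative to the estimated maximum friction."""
    mu_used = abs(lateral_force_n) / normal_force_n
    return mu_used / mu_max

def skid_alert(lsi, threshold=0.8):
    """Warn the driver when the LSI crosses the (assumed) alert threshold."""
    return lsi >= threshold
```

On a low-friction surface (small mu_max) the same lateral force yields a higher LSI, which is exactly why the maximum friction coefficient has to be estimated online.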
Jiao, Ying. „Etudes sur les risques financiers: risques de crédit et d'information asymétrique“. Habilitation à diriger des recherches, Université Paris-Diderot - Paris VII, 2012. http://tel.archives-ouvertes.fr/tel-00759505.
Cazaubon, Yoann. „Évaluation par méthode in silico du risque d’émergence de la résistance bactérienne des antibiotiques : exemple des fluoroquinolones et des glycopeptides en gériatrie“. Thesis, Lyon, 2018. http://www.theses.fr/2018LYSE1221/document.
Bacterial resistance is a clearly established global threat. Researchers are coming together to find new solutions and alternatives to antibiotics. One of the main causes is the massive use of antibiotics, whose unreasonable use must be stopped. The population is aging, so it is the most vulnerable people who pay the price. We have used modeling tools to better understand the causes of this scourge. The goal of this work is to construct population pharmacokinetic models in the elderly for vancomycin and teicoplanin. From models validated upstream or taken from the literature, simulations were performed for glycopeptides and ciprofloxacin, for methicillin-resistant Staphylococcus aureus and Pseudomonas aeruginosa infections respectively. They showed that when the minimal inhibitory concentration (MIC) of the incriminated bacterium is close to the critical susceptibility value, the doses to be administered in order to be effective have to be increased compared to current recommendations. As for the prevention of the emergence of bacterial resistance, in the case of ciprofloxacin and teicoplanin, the doses that would have to be administered are so high that toxicity is unavoidable. Analyses using Monte Carlo simulations on the antibiotics studied provided a better understanding of the determinants for minimizing the emergence of bacterial resistance, and argue for obtaining the exact MIC of the bacterium as quickly as possible together with an estimation of individual patient parameters
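The Monte Carlo approach used above can be sketched as a probability-of-target-attainment (PTA) computation: draw pharmacokinetic parameters from a population distribution and count how often the exposure/MIC target is met. All numerical values below (the clearance distribution, and an AUC/MIC target of 400 of the kind often cited for vancomycin) are illustrative assumptions, not the thesis's models.

```python
import math
import random

def pta(daily_dose_mg, mic, n=20000, target=400.0,
        cl_median=3.5, cl_log_sd=0.3, seed=1):
    """Fraction of simulated patients whose AUC24/MIC reaches the target ratio."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        # lognormal clearance (L/h): inter-patient pharmacokinetic variability
        cl = math.exp(rng.gauss(math.log(cl_median), cl_log_sd))
        auc24 = daily_dose_mg / cl          # steady-state AUC over 24 h (mg*h/L)
        if auc24 / mic >= target:
            hits += 1
    return hits / n
```

Repeating the computation over a range of MIC values is what reveals the dose at which efficacy and resistance-prevention targets can no longer be reached without toxic exposures.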
Xu, Xia. „Four Essays on Capital Markets and Asset Allocation“. Thesis, Lyon, 2018. http://www.theses.fr/2018LYSE2059/document.
Extreme events have a material impact on return distributions and investment decisions. However, the role of event risks is understated in popular financial decision-making approaches. This thesis incorporates event risks into investment decisions to improve global investment optimality. We examine event risks in two different but coherent financial settings: portfolio selection and corporate finance. In the portfolio selection setting, we focus on the incorporation of higher-order information to capture the impact of event risks on portfolio construction. Higher-order extensions are implemented for two main portfolio optimization methods: the classic framework of mean-variance optimization and the CAPM, and the stochastic dominance approach. We find that the inclusion of higher-order information improves global portfolio optimality in the presence of event risks. As a special case, we combine the traditional applications of mean-variance optimization and stochastic dominance analysis to examine the index efficiency of the DJIA, and find that the DJIA is efficient as a performance benchmark. In the corporate finance setting, we identify corporate name changes resulting from M&As within the S&P 500 index and examine how these name change events affect the return patterns of acquirers and targets. Conducting this corporate event study, we show that name change events substantially affect return dynamics, and that the abnormal return difference between name change events and non-name-change events is economically and statistically significant. Overall, our studies illustrate that the inclusion of event risks in decision processes brings important benefits to asset allocation optimization
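The mean-variance framework extended in this thesis starts from a simple building block: the closed-form minimum-variance weights of a two-asset portfolio. The sketch below shows only this classical base case; the higher-order (skewness/kurtosis) extensions discussed above are not reproduced here.

```python
def min_variance_weights(sigma1, sigma2, rho):
    """Closed-form weights minimizing portfolio variance for two assets.

    sigma1, sigma2: return volatilities; rho: correlation between the assets.
    Returns (w1, w2) with w1 + w2 = 1.
    """
    cov = rho * sigma1 * sigma2
    w1 = (sigma2 ** 2 - cov) / (sigma1 ** 2 + sigma2 ** 2 - 2.0 * cov)
    return w1, 1.0 - w1
```

Event risks enter precisely where this formula stops: variance treats upside and downside symmetrically, which is why higher-order information changes the optimal allocation.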
Gassiat, Paul. „Modélisation du risque de liquidité et méthodes de quantification appliquées au contrôle stochastique séquentiel“. Phd thesis, Université Paris-Diderot - Paris VII, 2011. http://tel.archives-ouvertes.fr/tel-00651357.
Laarabi, Mohamed Haitam. „Optimisation multicritère des itinéraires pour transport des marchandises dangereuses en employant une évaluation en logique floue du risque et la simulation du trafic à base d'agents“. Thesis, Paris, ENMP, 2014. http://www.theses.fr/2014ENMP0074/document.
Every day, thousands of trucks transport hundreds of thousands of tons of dangerous goods by various modalities, both within and across nations. The term “dangerous” indicates an intrinsic adversity that characterizes these products, which can manifest itself in an accident leading to the release of a hazardous substance (e.g. radioactive, flammable, explosive, etc.). In this situation, the consequences can be lethal to human beings and other living organisms, and can damage the environment and public/private property. The importance of dangerous goods comes down to the significant economic benefits they generate. In fact, one cannot deny the contribution of the transport of fossil-fuel-derived products, which represent more than 60% of the dangerous goods transported in Europe. Eni, the leading Italian petrochemical company, operates a fleet of about 1,500 trucks every day, performing numerous trips from loading terminals to filling stations. The distribution of petroleum products is a risky activity, and an accident during transportation may lead to serious consequences. Aware of what is at stake, the division Eni R&M - Logistics Secondary, historically based at the Genoa headquarters, has been collaborating since 2002 with the DIBRIS department at the University of Genoa and the CRC at Mines ParisTech, with the purpose of studying possible safety improvements in the transport of dangerous goods, particularly petroleum products. Over the years, this collaboration has led to the development of different technologies and, mainly, of an information and decision support system. The major component of this system is a platform, called the Transport Integrated Platform (TIP), for monitoring the Eni fleet at the national level as it delivers products to the distribution points. These vehicles are equipped with a device capable of transmitting a data stream in real time using a GPRS modem. 
The transmitted data can be of different kinds and contain information about the state of the vehicle and events that occurred during the trip. These data are received by centralized servers, then processed and stored in order to support various applications within the TIP. With this in mind, the studies undertaken throughout the thesis are directed towards a proposal to further minimize the risk related to the transportation of dangerous goods: a trade-off-based model for route selection that takes economic and safety factors into consideration. The objective is prompted by the need to complement existing regulations and safety standards, which do not offer a full warranty against accidents involving dangerous goods. The goal is pursued by considering the existing system as the basis for developing an Intelligent Transportation System (ITS) aggregating multiple software platforms. These platforms should allow planners and decision makers to monitor their fleet in real time, to assess risk and evaluate all possible routes, to simulate and create different scenarios, and to assist in finding solutions to particular problems. Throughout this dissertation, I highlight the motivation for this research work, the related problem statements and the challenges in dangerous goods transport. I introduce the TIP as the core of the proposed ITS architecture. For simulation purposes, virtual vehicles are injected into the system. The management of data collection was the subject of technical improvements for more reliability, efficiency and scalability in the real-time monitoring of dangerous goods shipments. Finally, I present a systematic explanation of the methodology for route optimization considering both economic and risk criteria. The risk is assessed on the basis of various factors, mainly the frequency of accidents leading to a hazardous substance release and their consequences. 
Uncertainty quantification in the risk assessment is modelled using fuzzy set theory
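Fuzzy risk assessment of this kind can be sketched with triangular membership functions, max-min (Mamdani-style) inference and centroid defuzzification. The rule base, membership supports and input scaling below are illustrative assumptions, not the platform's actual configuration.

```python
def tri(x, a, b, c):
    """Triangular membership function with support (a, c) and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_risk(frequency, severity):
    """Max-min inference over two normalized inputs, centroid defuzzification."""
    freq = {"low": tri(frequency, 0.0, 0.25, 0.5),
            "high": tri(frequency, 0.4, 0.75, 1.01)}
    sev = {"low": tri(severity, 0.0, 0.25, 0.5),
           "high": tri(severity, 0.4, 0.75, 1.01)}
    # rule base (assumed): risk is high only if both frequency and severity are high
    rules = {
        "low": max(min(freq["low"], sev["low"]),
                   min(freq["low"], sev["high"]),
                   min(freq["high"], sev["low"])),
        "high": min(freq["high"], sev["high"]),
    }
    out = {"low": (0.0, 0.25, 0.5), "high": (0.5, 0.75, 1.0)}
    num = den = 0.0
    for i in range(101):                      # centroid over a discretized universe
        x = i / 100.0
        mu = max(min(rules[label], tri(x, *out[label])) for label in out)
        num += x * mu
        den += mu
    return num / den if den else 0.0
```

The appeal in a routing context is that sparse accident statistics enter as graded memberships ("frequency is somewhat high") rather than point estimates, and the defuzzified score can then weight route edges.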
Abune'meh, Mohammed. „Optimisation de l’aménagement d’un Chantier de Construction en Fonction des Risques Naturels et Technologiques, Utilisation du SIG“. Thesis, Paris Est, 2017. http://www.theses.fr/2017PESC1012/document.
Construction sites contain several supporting facilities that are required to complete construction activities. These facilities are susceptible to damage due to the occurrence of natural or technological hazards such as fire, explosion, blast wave, and so on. Such hazards may have adverse consequences for the whole construction process, which in turn lead to fatal accidents that have a major impact on worker and employee productivity, project completion time, project quality and project budget. Therefore, project planners must adopt and develop innovative approaches able to face the occurrence of potential hazards, minimize their consequences, and facilitate the evacuation of the site should they occur. One of these approaches is optimizing the construction site layout. In general, generating a construction site layout able to minimize the risk resulting from natural or technological hazards is still a scientific challenge. In the present research, two models (deterministic and probabilistic) are developed to minimize the risks within a construction site. The common methodology adopted to develop these two models consists of:
• Modeling the construction site components (for instance, electric generator, offices and material storage) in a 2D layout, each acting as a hazardous source, a potential target, or both at the same time.
• Modeling the hazard: this captures the hazard interaction among site components and the attenuation of the hazard.
• Modeling the vulnerability: this represents the potential weakness of all targets to the hazard generated by each source.
• Defining the utility function: this aims to produce an optimized site layout with minimum total risk in the construction site. The differential evolution algorithm is adopted to run the optimization process.
In particular, in the deterministic model we use the space syntax principle in order to capture the impact of space configurations when evaluating the risk in the construction site. 
Therefore, as the evacuation process is considered in estimating the risk, the actual risk is amplified by a penalty factor called mean depth. Furthermore, Dijkstra's algorithm is run on the deterministic model to find the safest (least-risk) paths for evacuating the site from any position towards safe places, in order to diminish losses and fatalities. On the other hand, the framework used to develop the probabilistic model assumes that the risk combines the individual failure of each facility within the construction site. Moreover, a numerical simulation is performed to find the probabilistic distribution of failure for the whole site. Geographic information system (GIS) capabilities were exploited in this research to present data in map format, generate the spatial risk map of the construction site, and implement Dijkstra's algorithm and least-cost path analysis. For illustration purposes, the proposed models are employed in a case study consisting of several facilities. In the deterministic model, all of these facilities act as hazardous sources and potential targets at the same time, while in the probabilistic model only three of them act as fire hazard sources, whereas all of them are potential targets. The results revealed that the proposed models are efficient thanks to their capability of generating site layouts with a safer work environment. In addition, the models are capable of highlighting the riskiest areas within a construction site, and of generating paths through least-risk zones, which minimizes serious injuries and casualties in cases of emergency
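The least-risk evacuation paths mentioned above rely on Dijkstra's algorithm with edge weights interpreted as risk costs. A minimal sketch on a toy site graph (the node names and weights are hypothetical):

```python
import heapq

def least_risk_path(graph, start, goal):
    """Dijkstra's algorithm; edge weights are risk costs. Returns (risk, path)."""
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]          # priority queue of (accumulated risk, node)
    done = set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in done:
            continue
        done.add(u)
        if u == goal:            # reconstruct the path back to the start
            path = [u]
            while path[-1] != start:
                path.append(prev[path[-1]])
            return d, path[::-1]
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    return float("inf"), []
```

In the thesis's setting the weights would come from the spatial risk map (amplified by the mean-depth penalty), so the same search yields evacuation routes through the least-risk zones.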
Bouvier, Louis. „Apprentissage structuré et optimisation combinatoire : contributions méthodologiques et routage d'inventaire chez Renault“. Electronic Thesis or Diss., Marne-la-vallée, ENPC, 2024. http://www.theses.fr/2024ENPC0046.
This thesis stems from operations research challenges faced by Renault's supply chain. To address them, we make methodological contributions to the architecture and training of neural networks with combinatorial optimization (CO) layers, and combine them with new matheuristics to solve Renault's industrial inventory routing problems. In Part I, we detail applications of neural networks with CO layers in operations research. We notably introduce a methodology to approximate constraints. We also solve some off-policy learning issues that arise when using such layers to encode policies for Markov decision processes with large state and action spaces. While most studies on CO layers rely on supervised learning, we introduce a primal-dual alternating minimization scheme for empirical risk minimization. Our algorithm is deep learning-compatible, scalable to large combinatorial spaces, and generic. In Part II, we consider Renault's European packaging return logistics. Our rolling-horizon policy for the operational-level decisions is based on a new large neighborhood search for the deterministic variant of the problem. We demonstrate its efficiency on large-scale industrial instances, which we release publicly together with our code and solutions. We combine historical data and experts' predictions to improve performance. A version of our policy has been used daily in production since March 2023. We also consider the tactical-level route contracting process. The sheer scale of this industrial problem prevents the use of classic stochastic optimization approaches, so we introduce a new algorithm based on the methodological contributions of Part I for empirical risk minimization
Sani, Amir. „Apprentissage automatique pour la prise de décisions“. Thesis, Lille 1, 2015. http://www.theses.fr/2015LIL10038/document.
Strategic decision-making over valuable resources should consider risk-averse objectives. Many practical areas of application consider risk as central to decision-making; machine learning, however, does not. As a result, research should provide insights and algorithms that endow machine learning with the ability to consider decision-theoretic risk: in particular, estimating decision-theoretic risk on short dependent sequences generated from the most general possible class of processes for statistical inference, and pursuing decision-theoretic risk objectives in sequential decision-making. This thesis studies these two problems to provide principled algorithmic methods for considering decision-theoretic risk in machine learning. An algorithm with state-of-the-art performance is introduced for accurate estimation of risk statistics on the most general class of stationary-ergodic processes, and risk-averse objectives are introduced in sequential decision-making (online learning) in both the stochastic multi-armed bandit setting and the adversarial full-information setting
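As a generic illustration of a risk-averse objective in the bandit setting (not the thesis's own algorithm), the classic mean-variance criterion trades expected reward against variability. A stdlib-Python sketch with invented reward histories:

```python
import statistics

def mean_variance(rewards, rho=1.0):
    """Empirical mean-variance criterion: variance minus rho times the mean.
    Lower is better: low variability, high reward."""
    m = statistics.fmean(rewards)
    v = statistics.pvariance(rewards) if len(rewards) > 1 else 0.0
    return v - rho * m

def pick_arm(history, rho=1.0):
    """Greedy risk-averse selection among arms with observed rewards."""
    return min(history, key=lambda arm: mean_variance(history[arm], rho))

# Hypothetical reward histories for two arms.
history = {
    "steady": [0.5, 0.52, 0.48, 0.5],    # lower mean, very low variance
    "volatile": [0.0, 1.2, -0.4, 1.4],   # higher mean, high variance
}
pick_arm(history, rho=1.0)  # → "steady" at this risk-aversion level
```

A risk-neutral learner (large `rho`) would eventually prefer the volatile arm; a risk-averse one sticks with the steady arm.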
Guillaume, Romain. „Gestion des risques dans les chaînes logistiques : planification sous incertitude par la théorie des possibilités“. Phd thesis, Université Toulouse le Mirail - Toulouse II, 2011. http://tel.archives-ouvertes.fr/tel-00700518.
Rios Mora, Juan Sebastian. „Optimisation de la gestion de l’impact des polluants gazeux du sol sur la qualité de l’air intérieur“. Thesis, La Rochelle, 2021. http://www.theses.fr/2021LAROS035.
Polluted sites, and more precisely vapor intrusion, represent a potential risk for human health and the environment. Various screening-level and analytical models have been proposed to evaluate vapor intrusion and provide assessment tools for exposure risk. However, some in situ investigations show significant differences between predicted and measured indoor concentrations, eventually leading to misleading conclusions and inappropriate solutions. These uncertainties are mainly associated with a poor characterization of the site, an incomplete modeling of transfer pathways and mechanisms, or the neglect of certain parameters influencing this transfer. For example, ignoring the lateral source/building separation may explain part of the uncertainty of conventional models based on a homogeneous source distribution assumption. Authors agree that lateral migration plays an important role in the attenuation of the indoor concentration. In homogeneous or continuous source scenarios, vapors may migrate mainly vertically towards the building; a lateral source, however, may promote lateral migration to the atmosphere rather than into the building, generating a greater attenuation of the indoor concentration. In this context, the main objective of this thesis is to contribute to the improvement of risk assessment and management tools, in order to improve the accuracy of their estimations and increase their range of application. To do so, new vapor intrusion models are developed considering the lateral source/building separation. These models are built on numerical experimentation and dimensionless analysis based on existing models (semi-empirical models considering a homogeneous source distribution).
The combination of these two approaches allows, on the one hand, maintaining the ability of the existing models to consider the physical properties of the soil (permeability, diffusion coefficient, …) and the characteristics of the building (type of construction, building depression, volume, …), and on the other hand, specifying the position of the source in the soil more precisely by taking into account the influence of the lateral source/building separation. Through a comparative analysis, the accuracy of these new expressions is verified against the proposed numerical model (CFD), experimental data, and existing models in the literature. Finally, the proposed expressions were coupled with a ventilation code (MATHIS-QAI), allowing to better specify indoor characteristics (ventilation system, air permeability of the envelope, volume of the building, …) and to estimate indoor air concentration levels as a function of environmental variations (wind speed, outside temperature, …) over time. A parametric study showed that, despite the significant impact of the characteristics of the building, the influence of the lateral source/building separation remains predominant in the attenuation of the indoor concentration (attenuation of several orders of magnitude when the source is laterally offset from the building, compared to a homogeneous source). Moreover, specifying the characteristics of the building (construction type, ventilation system, air permeability, …) and weather conditions may increase the accuracy of the estimation, avoiding the implementation of extreme solutions or insufficient actions
Jourdain, Gonzague. „Contribution à l' optimisation de la prévention de la transmission materno-fœtale du VIH dans les pays en développement“. Paris 6, 2005. http://www.theses.fr/2005PA066419.
Merlin, Aurélie. „Optimisation de l’usage des antiparasitaires chez la génisse d’élevage en vue de prévenir le risque d’émergence de populations de strongles digestifs résistants : développement d’une stratégie durable de traitement sélectif“. Thesis, Nantes, Ecole nationale vétérinaire, 2017. http://www.theses.fr/2017ONIR093F/document.
In first-grazing-season calves (FGSC), the anthelmintic (AH) treatments used to control the negative impact of gastrointestinal nematodes (GIN) on growth must be rationalized to preserve their long-term efficacy. The aim of this PhD thesis was to develop and assess targeted selective treatment (TST) strategies based on growth in FGSC, in order to preserve GIN populations in refugia, i.e. not exposed to AH, and thus delay the emergence of AH resistance. Firstly, the relation between growth and GIN infection at housing was demonstrated in different environments, which allowed identifying groups, and within groups, the most infected animals. Then, a treatment decision tree at housing was proposed, combining grazing management indicators to identify the groups at risk, and several average daily weight gain (ADWG) thresholds to identify, within groups, the animals suffering the most from infection. A TST strategy based on mid-season mean ADWG was assessed in a field survey in comparison with whole-group treatment (WGT). No significant difference, in terms of growth and GIN infection, was observed at housing between the TST and the WGT groups. Lastly, the veterinarians' behaviors and perceptions about the control of GIN in dairy cattle farming, including a more rational AH management, were assessed. The veterinarians recognize the need to consider the sustainability of AH treatment but identify several obstacles, such as the development of advisory services and the availability of simple, reliable and cheap tools. The results of this thesis show that it is possible to target the use of AH in FGSC based on individual and group indicators
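The decision tree itself is not reproduced in the abstract; as a toy illustration of the ADWG-threshold step of a targeted selective treatment, here is a stdlib-Python sketch. The risk flag, herd data, and 0.65 kg/day threshold are invented for illustration, not the thesis's fitted values.

```python
def select_for_treatment(group_at_risk, adwg, threshold):
    """Targeted selective treatment sketch: within a group flagged as at risk
    by grazing-management indicators, treat only the animals whose average
    daily weight gain (ADWG, kg/day) falls below the threshold. Animals in
    low-risk groups stay untreated, preserving parasites in refugia."""
    if not group_at_risk:
        return []
    return sorted(animal for animal, gain in adwg.items() if gain < threshold)

# Hypothetical herd: animal id → observed ADWG in kg/day.
herd = {"A1": 0.55, "A2": 0.78, "A3": 0.60, "A4": 0.91}
select_for_treatment(True, herd, threshold=0.65)   # → ["A1", "A3"]
select_for_treatment(False, herd, threshold=0.65)  # → [] (refugia preserved)
```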
Mkireb, Chouaïb. „Optimisation et gestion des risques pour la valorisation de la flexibilité énergétique : application aux systèmes d’eau potable“. Thesis, Compiègne, 2019. http://www.theses.fr/2019COMP2492/document.
In a context of demographic growth in which natural resources are more and more limited, optimized management of water and power networks is required. Changes in electricity market regulation in several countries have recently enabled effective integration of demand response mechanisms in power systems, making it possible to involve electricity consumers in the real-time balance of the power system. Through their flexible components (variable-speed pumps, tanks), drinking water systems, which are huge electricity consumers, are suitable candidates for energy-efficient demand response mechanisms. However, these systems are often managed independently of power system operation, for both economic and operational reasons. In this thesis, the objective is the evaluation of the economic and ecological value of integrating drinking water systems' flexibility into power system operation through French demand response mechanisms. An analysis of the architecture of the French electricity markets is first conducted, allowing to target the most suitable demand response mechanisms considering water systems' operating constraints. Mathematical models to optimize water systems' flexibility are then proposed and solved through original heuristics, integrating uncertainties about water demands, market prices and pumping station availability. Numerical results, which are discussed using three real water systems in France, cover the economic aspects including risks, as well as operational and ecological aspects. Significant reductions in water systems' operating costs are estimated through the optimization of demand response power bids on the French spot power market during peak times. In parallel, the consideration of uncertainties secures the operation of water systems in real time and makes it possible to manage economic risks related to power grid balancing.
In addition, significant savings in CO2 emissions, estimated at around 400 tons per day in France, can be achieved by reducing electricity production from fossil sources
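The thesis's optimization models are not reproduced in the abstract; as a toy illustration of the underlying idea (using tank storage to shift pumping out of peak-price hours), here is a brute-force stdlib-Python sketch over a tiny three-hour horizon. Prices, demands, and tank limits are invented for illustration.

```python
from itertools import product

def best_schedule(prices, demand, tank_max, pump_rate, tank0=0.0):
    """Exhaustive sketch for a tiny horizon: choose an hourly on/off pumping
    plan that meets water demand from the tank at minimum electricity cost,
    never letting the tank run dry or overflow."""
    best = (float("inf"), None)
    for plan in product([0, 1], repeat=len(prices)):
        tank, cost, ok = tank0, 0.0, True
        for on, price, d in zip(plan, prices, demand):
            tank += on * pump_rate          # water pumped this hour
            cost += on * price * pump_rate  # electricity bought this hour
            tank -= d                       # demand drawn from the tank
            if tank < 0 or tank > tank_max:
                ok = False
                break
        if ok and cost < best[0]:
            best = (cost, plan)
    return best

# Peak price at hour 1 → the optimal plan pumps only in off-peak hours.
cost, plan = best_schedule(prices=[10, 50, 10], demand=[1, 1, 1],
                           tank_max=3, pump_rate=2)
# → cost 40 with plan (1, 0, 1): the tank bridges the peak hour
```

Real instances replace this enumeration with the heuristics the thesis develops, but the feasibility bookkeeping is the same.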
Ben, Dbabis Makram. „Modèles et méthodes actuarielles pour l'évaluation quantitative des risques en environnement solvabilité II“. Phd thesis, Université Paris Dauphine - Paris IX, 2012. http://tel.archives-ouvertes.fr/tel-00833856.
Biais, Matthieu. „Stratégie d’optimisation hémodynamique des patients à risque : impacts de l’acidose respiratoire et métabolique, du clampage de l’aorte abdominale sous-rénale et du positionnement peropératoire“. Thesis, Bordeaux 2, 2013. http://www.theses.fr/2013BOR22095/document.
The aim of perioperative haemodynamic optimization is to maximize oxygen delivery and/or stroke volume during high-risk surgery. This concept has evolved during the last thirty years towards a simpler, more feasible and less invasive approach. The main treatments used in the different hemodynamic optimization protocols are fluid loading and the administration of inotropes and vasopressors. However, the pathophysiological consequences of surgical stress can greatly impact the mode of administration and the efficacy of these therapeutics. In the first study, we described the impact of respiratory and metabolic acidosis (frequently encountered during major surgery and/or laparoscopic surgery) on the effectiveness of α- and β-adrenergic agents in healthy rat myocardium. In a second work, we demonstrated that intravenous fluids cannot be guided by dynamic indices of preload dependency during surgical clamping of the infrarenal abdominal aorta in a porcine model. Finally, in the third study, we demonstrated in a clinical setting that prone positioning during spine surgery induces major changes in cardiorespiratory interactions, so dynamic indices should be interpreted with caution when guiding fluid therapy in this context. These translational studies highlight three common situations impacting the effectiveness and/or administration of the therapeutics necessary for intraoperative hemodynamic optimization
Serra, Romain. „Opérations de proximité en orbite : évaluation du risque de collision et calcul de manoeuvres optimales pour l'évitement et le rendez-vous“. Thesis, Toulouse, INSA, 2015. http://www.theses.fr/2015ISAT0035/document.
This thesis is about collision avoidance for a pair of spherical orbiting objects. The primary object - the operational satellite - is active in the sense that it can use its thrusters to change its trajectory, while the secondary object is a space debris that cannot be controlled in any way. On-ground radars or other means make it possible to foresee a conjunction involving an operational spacecraft, leading to the production of a collision alert. The latter contains statistical data on the position and velocity of the two objects, enabling the construction of a probabilistic collision model. The work is divided in two parts: the computation of collision probabilities and the design of maneuvers to lower the collision risk. In the first part, two kinds of probabilities - which can be written as integrals of a Gaussian distribution over a Euclidean ball in 2 and 3 dimensions - are expanded in convergent power series with positive terms, using the theories of the Laplace transform and definite functions. In the second part, the question of collision avoidance is formulated as a chance-constrained optimization problem. Depending on the collision model, namely short- or long-term encounters, it is respectively tackled via the scenario approach or relaxed using polyhedral collision sets. For the latter, two methods are proposed: the first directly tackles the joint chance constraints, while the second uses another relaxation called risk selection to obtain a mixed-integer program. Additionally, the solution to the problem of fixed-time, fuel-minimizing out-of-plane proximity maneuvers is derived. This optimal control problem is solved via primer vector theory
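The thesis evaluates these probabilities via convergent power series; as a rough numerical cross-check of the quantity being computed, a Monte Carlo sketch of the 2-D encounter probability fits in a few lines of stdlib Python. All parameters below are illustrative; for the centered isotropic case the closed form 1 - exp(-R²/(2σ²)) serves as a sanity check.

```python
import random

def collision_probability(mu, sigma, radius, n=200_000, seed=0):
    """Monte Carlo sketch: probability that a 2-D Gaussian relative position
    (mean mu, isotropic standard deviation sigma) falls inside the combined
    hard-body disk of the given radius."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        x = rng.gauss(mu[0], sigma)
        y = rng.gauss(mu[1], sigma)
        if x * x + y * y <= radius * radius:
            hits += 1
    return hits / n

p = collision_probability((0.0, 0.0), 1.0, 1.0)
# Centered isotropic case: closed form 1 - exp(-R**2 / (2 * sigma**2)) ≈ 0.3935,
# so p comes out ≈ 0.39 here.
```

Monte Carlo is far too slow and noisy for operational alert screening, which is precisely why the thesis develops fast, guaranteed series expansions instead.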
Neirac, Lucie. „Learning with a linear loss function : excess risk and estimation bounds for ERM and minimax MOM estimators, with applications“. Electronic Thesis or Diss., Institut polytechnique de Paris, 2023. http://www.theses.fr/2023IPPAG012.
Community detection, phase recovery, signed clustering, angular group synchronization, Maxcut, sparse PCA, the single index model, and the list goes on: these are all classical topics within the field of machine learning and statistics. At first glance, they are quite different problems, with different types of data and different goals. However, the literature of recent years shows that they do have one thing in common: they are all amenable to Semi-Definite Programming (SDP). And because they are amenable to SDP, we can go further and recast them in the classical machine learning framework of risk minimization, with the simplest possible loss function: the linear loss function. This, in turn, opens up the opportunity to leverage the vast literature on risk minimization to derive excess risk and estimation bounds, as well as algorithms to unravel these problems. The aim of this work is to propose a unified methodology for obtaining statistical properties of classical machine learning procedures based on the linear loss function, which corresponds, for example, to the case of SDP procedures that we view as ERM procedures. Embracing a machine learning viewpoint allows us to go into greater depth and introduce other estimators which are effective in handling two key challenges within statistical learning: sparsity, and robustness to adversarial contamination and heavy-tailed data. We attack the structural learning problem by proposing a regularized version of the ERM estimator. We then turn to the robustness problem and introduce an estimator based on the median-of-means (MOM) principle, which we call the minmax MOM estimator. This latter estimator addresses the problem of robustness and can be constructed for any loss function, including the linear loss function. We also present a regularized version of the minmax MOM estimator.
For each of these estimators we provide excess risk and estimation bounds, derived from two key tools: local complexity fixed points and curvature equations of the excess risk function. To illustrate the relevance of our approach, we apply our methodology to five classical problems within the framework of statistical learning, for which we improve on the state-of-the-art results
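The median-of-means principle at the heart of the minmax MOM estimator is easy to illustrate on the simplest case, mean estimation. Below is a plain MOM mean estimator in stdlib Python (not the thesis's minmax construction); the number of blocks and the data are illustrative.

```python
import random
import statistics

def median_of_means(sample, k, seed=0):
    """Median-of-means: shuffle, split the sample into k blocks, average
    each block, and return the median of the block means. A single outlier
    can corrupt at most one block, so the median stays stable."""
    rng = random.Random(seed)
    data = sample[:]
    rng.shuffle(data)
    blocks = [data[i::k] for i in range(k)]
    return statistics.median(statistics.fmean(b) for b in blocks)

data = [1.0] * 99 + [1000.0]   # one adversarial outlier in 100 points
statistics.fmean(data)         # → 10.99, dragged far off by the outlier
median_of_means(data, k=10)    # → 1.0, the outlier corrupts one block only
```

With k blocks, up to roughly k/2 corrupted points can be tolerated, which is why MOM-based estimators are a natural tool against adversarial contamination and heavy tails.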
Lemoussu, Sophie. „A model-based framework for innovative Small and Medium-sized Enterprises (SMEs) in Aeronautics“. Thesis, Toulouse, ISAE, 2020. http://www.theses.fr/2020ESAE0014.
The aviation market is nowadays facing fast growth of innovative airborne systems. Cargo drones, taxi drones, airships and stratospheric balloons, to cite a few, could be part of the next generation of air transportation. At the same time, Small and Medium-sized Enterprises (SMEs) are becoming more and more involved in designing and developing new forms of air transportation, transitioning from the traditional role of supplier to those of system designer and integrator. This situation drastically changes the scope of SMEs' responsibility. As integrators, they become responsible for the certification of the components and the manufacturing process, an area in which they have little experience. Certification demands very specific knowledge of the regulations, norms and standards. It is a mandatory process and a critical activity for enterprises in the aerospace industry, and it constitutes a major challenge for SMEs, who have to take on this certification responsibility with only limited resources. In this thesis, two major needs are identified: methodological support is not easily available for SMEs, and certification requirements are not easily comprehensible and adaptable to each situation. We examine alternate paths, reducing the complexity and bringing innovative SMEs one step closer to solving the problem. The objective is to provide support so that they can more efficiently comprehend and integrate rules, legislation and guidelines into their internal processes in a simpler way. This thesis then proposes a methodological approach to support such organisations. Developed in close cooperation with a French SME in this situation, the approach is composed of a set of models (metamodel, structural, and behavioural models) covered by a certification governance mechanism
Ben, Saida Abdallah. „Essais sur la Diversification des Portefeuilles Financiers et des Fonds Structurés de Crédit : Une Approche en termes de copules“. Thesis, Cergy-Pontoise, 2014. http://www.theses.fr/2014CERG0705/document.
In this thesis, we examine key topics related to the contributions of copula theory to financial portfolio management theory and to the study of structured credit products. The first part of this thesis deals with financial portfolio management. We first discuss the relationship between portfolio diversification and the choice of the copula that best describes the dependence structure. The goal is to identify a feature of portfolios that makes the selection of the appropriate copula straightforward. In the second chapter, we propose to study the impact of copula model misspecification on conventional risk measure estimates, such as Value-at-Risk and Expected Shortfall. The idea is to check the validity of developing these estimates under the true copula model. In a third chapter, we apply this approach to the optimal portfolio allocation problem. The main objective is to identify investors' sensitivity, depending on their risk (or loss) aversion, to a component of the copula model. Thus, we propose a possible linkage between behavioral finance theory and the copula functions framework. The second part of the thesis focuses on structured credit products. In a first chapter, we study the contribution of an actuarial model, using copula functions to model the dependence structure between times of default, to the process of risk measure estimation. Finally, in a last chapter we re-examine the “Diversity Score” concept, developed by the Moody's rating agency to assess the quality of a structured credit portfolio in terms of diversification. We discuss the analogy of this measure with the copula approach and demonstrate its adequacy with some copula function families
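The thesis's copula models are not reproduced in the abstract; as a generic illustration of how a copula choice feeds into a risk-measure estimate, here is a stdlib-Python sketch that simulates two exponential loss marginals joined by a bivariate Gaussian copula and reads off an empirical Value-at-Risk. All parameters are illustrative.

```python
import math
import random

def gaussian_copula_var(rho, lam1, lam2, alpha=0.95, n=50_000, seed=0):
    """Simulate two exponential losses (rates lam1, lam2) whose dependence
    is a bivariate Gaussian copula with correlation rho, and return the
    empirical alpha-quantile (VaR) of the aggregate loss."""
    rng = random.Random(seed)
    # Standard normal CDF via the error function.
    phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    losses = []
    for _ in range(n):
        g1, g2 = rng.gauss(0, 1), rng.gauss(0, 1)
        z1 = g1                                   # Cholesky factor of the
        z2 = rho * g1 + math.sqrt(1 - rho**2) * g2  # 2x2 correlation matrix
        u1, u2 = phi(z1), phi(z2)                 # copula-distributed pair
        x1 = -math.log(1 - u1) / lam1             # inverse exponential CDF
        x2 = -math.log(1 - u2) / lam2
        losses.append(x1 + x2)
    losses.sort()
    return losses[int(alpha * n)]

# Stronger dependence fattens the joint tail, so VaR rises with rho:
gaussian_copula_var(0.8, 1.0, 1.0) > gaussian_copula_var(0.0, 1.0, 1.0)  # → True
```

Swapping the Gaussian copula for, say, a Clayton or Student copula while keeping the same marginals is exactly the kind of misspecification experiment the second chapter studies.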
Costedoat-Chalumeau, Nathalie. „Etude du rapport bénéfice/ risque du traitement par hydroxychloroquine dans le lupus systématique : étude de la toxicité foetale, de la toxicité cardiaque et optimisation posologique basée sur la détermination de sa concentration sanguine“. Paris 6, 2007. http://www.theses.fr/2007PA066015.
Hydroxychloroquine (HCQ), a drug with a particularly long half-life, is an important treatment for systemic lupus erythematosus (SLE). In this PhD thesis, we report studies performed to improve knowledge of the pharmacology of HCQ: transplacental passage, foetal toxicity, cardiac toxicity, the pharmacokinetic-pharmacodynamic (PK-PD) relation, and the monitoring of adherence to treatment. We determined blood HCQ concentration in 143 patients with SLE. The PK-PD relation of HCQ was strong and followed an inhibitory model with maximum effect. The other interest of determining blood HCQ concentration is the identification of non-compliant patients. Indeed, very low blood HCQ concentrations are an objective marker of prolonged poor compliance (14 patients out of 203, 7%). The PK-PD results led us to organize a randomised, multicenter, national trial entitled the “PLUS study” (we detail the protocol)
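The inhibitory maximum-effect (Emax) model mentioned above has a standard closed form; a stdlib-Python sketch follows. The parameter values are invented for illustration and are not the thesis's fitted values.

```python
def inhibitory_emax(conc, e0, imax, ic50):
    """Inhibitory Emax model: the response (e.g. disease activity) falls from
    the baseline E0 toward E0 * (1 - Imax) as drug concentration rises;
    IC50 is the concentration producing half the maximal inhibition."""
    return e0 * (1.0 - imax * conc / (ic50 + conc))

# Hypothetical parameters: baseline 10, at most 80% inhibition, IC50 = 1.
inhibitory_emax(0.0, e0=10.0, imax=0.8, ic50=1.0)  # → 10.0 (no drug, baseline)
inhibitory_emax(1.0, e0=10.0, imax=0.8, ic50=1.0)  # → 6.0 (half inhibition)
```

Fitting such a curve to measured blood concentrations and clinical scores is what makes the observed concentration-effect relation actionable for dose optimization.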
Large, Aurore. „Une meilleure gestion patrimoniale des réseaux d'eau potable : le modèle de prévision du renouvellement à long terme OPTIMEAU“. Thesis, Bordeaux, 2016. http://www.theses.fr/2016BORD0086/document.
In developed countries, drinking water is distributed to households. This comfort requires a long network with a high value. To optimize resources and performance, it is important to renew pipe sections at the best possible time. This thesis presents the short-term (1-3 years), medium-term and long-term (> 30 years) models commonly employed. The short-term processes seem quite effective, but long-term methods remain rough. The OPTIMEAU model proposes a long-term approach to get a prospective overview of pipes and improve their management, while remaining connected to short-term technical practices. This approach, based on statistical survival models, is tested using actual data from three European water services: SEDIF (Syndicat des Eaux d’Ile de France), Grand Lyon and eauservice Lausanne in Switzerland. The originality of this model lies in the estimation of the past decommissioning age distribution, the keystone in the construction of eleven indicators related to the finances, achievements and future performance of water utilities
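The abstract does not reproduce OPTIMEAU's survival model; as a generic illustration of the survival-statistics machinery such long-term renewal models build on, here is a Weibull survival sketch in stdlib Python. The scale and shape parameters and the cohort ages are invented for illustration.

```python
import math

def weibull_survival(age, scale, shape):
    """Weibull survival function S(t) = exp(-(t/scale)**shape): probability
    that a pipe laid t years ago is still in service, a common parametric
    choice in infrastructure renewal models."""
    return math.exp(-((age / scale) ** shape))

def expected_renewals(ages, scale, shape):
    """Expected number of pipes in a cohort already decommissioned,
    i.e. the sum of per-pipe failure probabilities 1 - S(age)."""
    return sum(1.0 - weibull_survival(a, scale, shape) for a in ages)

# Hypothetical parameters: characteristic life 80 years, shape 2.5
# (shape > 1 models wear-out: the decommissioning rate rises with age).
cohort = [20, 40, 60, 80, 100]
expected_renewals(cohort, scale=80.0, shape=2.5)  # ≈ 2.0 of the 5 pipes
```

Calibrating such a survival curve against the observed decommissioning-age distribution, rather than assuming it, is the step the thesis identifies as the keystone of its indicators.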