Dissertations on the topic "Quantification des risques"
Format your reference in APA, MLA, Chicago, Harvard, and other citation styles
Consult the top 50 dissertations for your research on the topic "Quantification des risques".
Barnier, Thibaud de. "Evaluation quantitative des risques industriels avec incertitudes par la méthode du noeud papillon. Application sur un cas d'étude concret." Electronic Thesis or Diss., Université de Toulouse (2023-....), 2024. http://www.theses.fr/2024TLSEP037.
Quantitative risk analysis has become a decision-making tool for all industrial activities exposed to the risk of major accidents. In France, since the AZF accident, regulations have changed and now require quantitative safety studies to assess and reduce technological risks. There is still considerable room for improvement in these analyses, notably the incorporation of the uncertainties inherent in the probabilities of occurrence of the events and scenarios involved. The assessment of industrial risks using the bow-tie method is no exception to this need for development. Through the treatment of a concrete case study proposed by Technip Energies, this study provides a practical analysis of the resolution of the bow-tie that integrates the aleatory and epistemic uncertainties linked to the events and safety barriers present in the bow-tie.
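The abstract above describes propagating aleatory and epistemic uncertainties through a bow-tie. As a minimal illustrative sketch only (the gate structure, Beta priors, and all probability figures below are invented, not taken from the thesis), a Monte Carlo treatment of a two-cause, one-barrier bow-tie might look like:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000  # Monte Carlo draws over the epistemic distributions

# Epistemic uncertainty on each basic-event probability, encoded as Beta laws
# (all numbers are invented for illustration).
p_leak = rng.beta(2, 98, n)           # initiating cause 1
p_ignite = rng.beta(1, 99, n)         # initiating cause 2
p_barrier_fails = rng.beta(5, 95, n)  # safety barrier on the consequence side

# Left side of the bow-tie: top event occurs if either cause occurs (OR gate,
# independence assumed); right side: the consequence is reached only if the
# barrier fails (AND with barrier failure).
p_top = 1 - (1 - p_leak) * (1 - p_ignite)
p_consequence = p_top * p_barrier_fails

# The epistemic spread is summarised by quantiles instead of a single number.
lo, med, hi = np.quantile(p_consequence, [0.05, 0.5, 0.95])
print(f"P(consequence): median={med:.2e}, 90% interval=[{lo:.2e}, {hi:.2e}]")
```

The output is an interval rather than a point estimate, which is the practical payoff of carrying the uncertainty through the resolution instead of plugging in nominal probabilities.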
Laplante-Albert, Kathy-Andrée. "Quantification des risques de mortalité liés à l'habitat chez des espèces de poissons lacustres /." Thèse, Trois-Rivières : Université du Québec à Trois-Rivières, 2008. http://www.uqtr.ca/biblio/notice/resume/30077744R.pdf.
Laplante-Albert, Kathy-Andrée. "Quantification des risques de mortalité liés à l'habitat chez des espèces de poissons lacustres." Thèse, Université du Québec à Trois-Rivières, 2008. http://depot-e.uqtr.ca/1891/1/030077744.pdf.
Hassani, Bertrand Kian. "Quantification des risques opérationnels : méthodes efficientes de calcul de capital basées sur des données internes." Paris 1, 2011. http://www.theses.fr/2011PA010009.
Sauvage, Hélène. "Détection et quantification d'Aphanomyces euteiches dans les parcelles agricoles pour la prévention des risques phytosanitaires." Rouen, 2007. http://www.theses.fr/2007ROUES080.
Coutard, François. "Quantification de l'expression de gènes de virulence chez Vibrio parahaemolyticus dans le milieu marin." Nantes, 2007. http://www.theses.fr/2007NANT2005.
Winter, Anne. "Evaluation model of a supply chain's sustainability performance and risk assessment model towards a redesign process : case study at Kuehne + Nagel Luxembourg." Thesis, Strasbourg, 2016. http://www.theses.fr/2016STRAD044/document.
In the present work, the sustainability concept has been redefined so that a common understanding can be guaranteed. Subsequently, a model intended to evaluate an existing supply chain's overall degree of sustainability has been developed and empirically tested through a case study. Considering the approach of continuous improvement, this evaluation should be followed by a redesign of the considered supply chain. However, a risk assessment needs to be done ex ante. For this reason, a risk identification and quantification model has been developed. This model may consider both the risks leading to the redesign process and the risks resulting from the redesign phase. A case study, which considers the risks leading to a redesign phase, has been implemented so that the model's feasibility in a real business environment can be proved. The model's outcomes must not be mistaken for ultimate results but need to be considered as decision support for managers.
Roullier, Vincent. "Classification floue et modélisation IRM : application à la quantification de la graisse pour une évaluation optimale des risques pathologiques associés à l'obésité." PhD thesis, Université d'Angers, 2008. http://tel.archives-ouvertes.fr/tel-00348028.
Bräutigam, Marcel. "Pro-cyclicality of risk measurements. Empirical quantification and theoretical confirmation." Thesis, Sorbonne université, 2020. http://www.theses.fr/2020SORUS100.
This thesis examines, empirically and theoretically, the pro-cyclicality of risk measurements made on historical data, namely the effect that risk measurements overestimate future risk in times of crisis while underestimating it in quiet times. As a starting point, we lay down a methodology to empirically evaluate the amount of pro-cyclicality when using a sample quantile (Value-at-Risk) process to measure risk. Applying this procedure to 11 stock indices, we identify two factors explaining the pro-cyclical behavior: the clustering and mean reversion of volatility (as modeled by a GARCH(1,1)) and the very way of estimating risk on historical data (even when no volatility dynamics are present). To confirm these claims theoretically, we proceed in two steps. First, we derive bivariate (functional) central limit theorems for quantile estimators paired with different measure-of-dispersion estimators. We establish them for sequences of iid random variables as well as for the class of augmented GARCH(p,q) processes. Then, we use these asymptotics to theoretically prove the pro-cyclicality observed empirically. Extending the setting of the empirical study, we show that no matter the choice of risk-measure estimator, measure-of-dispersion estimator or underlying model considered, pro-cyclicality will always exist.
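The mechanism described above can be reproduced in miniature: simulate a GARCH(1,1) process and estimate Value-at-Risk as a rolling sample quantile of historical losses. The parameters and window length below are illustrative choices, not the thesis's fitted values:

```python
import numpy as np

rng = np.random.default_rng(1)
T = 3000
omega, alpha, beta = 0.02, 0.08, 0.90  # toy GARCH(1,1) parameters (illustrative)

# Simulate GARCH(1,1) returns: sigma2[t] = omega + alpha*r[t-1]**2 + beta*sigma2[t-1].
sigma2 = np.empty(T)
r = np.empty(T)
sigma2[0] = omega / (1 - alpha - beta)  # start at the stationary variance
r[0] = np.sqrt(sigma2[0]) * rng.standard_normal()
for t in range(1, T):
    sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    r[t] = np.sqrt(sigma2[t]) * rng.standard_normal()

# Historical VaR at the 95% level: sample quantile of past losses over a rolling window.
window = 100
var_hist = np.array([np.quantile(-r[t - window:t], 0.95) for t in range(window, T)])

# The estimate is backward-looking: it tracks past volatility, so it is high just
# after turbulent periods and low just after quiet ones (pro-cyclicality in miniature).
true_vol = np.sqrt(sigma2[window:])
contemp_corr = np.corrcoef(var_hist, true_vol)[0, 1]
print(f"corr(historical VaR, current volatility) = {contemp_corr:.2f}")
```

Because the sample quantile only sees the past window, the measured risk peaks after the crisis has already materialised, which is precisely the over/under-estimation pattern the abstract describes.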
Fares, Almabrouk. "Risque de salmonellose humaine liée à la consommation de fromage à pâte molle au lait cru : développement d'un modèle pour l'appréciation quantitative du risque." PhD thesis, AgroParisTech, 2007. http://pastel.archives-ouvertes.fr/pastel-00003463.
Letortu, P. "Le recul des falaises crayeuses haut-normandes et les inondations par la mer en Manche centrale et orientale : de la quantification de l'aléa à la caractérisation des risques induits." PhD thesis, Université de Caen, 2013. http://tel.archives-ouvertes.fr/tel-01018719.
Letortu, Pauline. "Le recul des falaises crayeuses haut-normandes et les inondations par la mer en Manche centrale et orientale : de la quantification de l’aléa à la caractérisation des risques induits." Caen, 2013. https://tel.archives-ouvertes.fr/tel-01018719.
This thesis focuses on the quantification of two hazards: coastal chalk cliff retreat in Upper Normandy and coastal flooding in the central and eastern English Channel. These interdependent, damaging phenomena have a poorly known functioning, characterized by events of varying intensity and frequency. The quantitative work is based on a systemic approach that joins spatial and temporal scales together. This work integrates the frequency/intensity ratio, and therefore the effectiveness of the factors and processes responsible for triggering these events. This work aims to: 1) calculate the retreat rates and rhythms of the cliffs; 2) identify the factors responsible for triggering falls; 3) quantify the production of debris and the ablation rate of the cliff face; 4) assess the changes in frequency and intensity of coastal flooding events; 5) determine the origin of these changes; 6) reflect on methods of mapping the "coastal flooding" and "cliff retreat" hazards.
Mathey, Marguerite. "Quantification haute résolution du champ de déformation 3D des Alpes occidentales : interprétations tectoniques et apports à l’aléa sismique Seismogenic potential of the High Durance Fault constrained by 20 yr of GNSS measurements in the Western European Alps." Thesis, Université Grenoble Alpes, 2020. http://www.theses.fr/2020GRALU033.
The Alpine range is one of the first mountain belts worldwide to be monitored both by seismic networks and by space geodesy (GNSS), in particular due to its moderate but steady seismicity. Permanent GNSS measurements demonstrated that uplift, reaching up to 2 mm/yr, is the main signal characterizing the current deformation in the European western Alps, while no shortening is observed across the belt. Based on the wealth of geodetic and seismic data available today, it now appears possible to constrain a new high-resolution crustal strain field in the western Alps. This 3D field, made of surface deformation measurements and of the characteristics of seismic deformation, is of primary importance in order to decipher the links between horizontal, vertical and seismic deformations. The present work relies on a multidisciplinary approach that integrates 25 years of seismic records, 20 years of GPS measurements, and 4 years of Sentinel-1 satellite acquisitions in order to establish the corresponding 3D strain rate field. For the first time in the western Alps, the spatial variability of the style of seismic deformation is robustly assessed, thanks to the analysis of the Sismalp (ISTerre Grenoble) database. New focal mechanism computations, along with principal stress inversions, have established new orientations for the extensional deformation component, which occurs mainly in the center of the belt. The highly resolved 3D field of deformation style, provided by Bayesian interpolations of focal mechanisms both at the surface and at depth, reveals a vast majority of dextral strike-slip deformation occurring at the periphery of the belt, associated, in one specific area, with compression.
These results bring new insights into the dynamics of the western Alpine belt. Four GPS surveys (conducted in 1996, 2006, 2011 and 2016), along with the data provided by the permanent RENAG network, allowed us to increase the spatial resolution of the surface velocity and strain rate fields at the scale of the western Alps. These high-resolution geodetic fields reveal that the amplitude of the extensional signal is highest in the Briançonnais area, while its kinematics appear consistent with interseismic deformation accommodated on at least one fault (the High Durance Fault). Longer-term seismic records show that, at least at the local scale, seismic and geodetic deformation patterns are consistent within their uncertainty bounds in terms of kinematics and amplitude. At the regional scale of the entire western Alps, though, geodetic strain rates appear one order of magnitude higher than the seismic ones, the latter comprising both instrumental and historical seismicity. Finally, four years of Sentinel-1 acquisitions appear to be the minimum time span required to derive long-term velocity maps in the satellite line of sight at the scale of the western Alps. The interferometric processing of the corresponding data allowed, for the first time, the effects of snow and vegetation to be removed in a consistent way. The results feature short-scale spatial variations in the uplift pattern, which are spatially correlated with the crystalline external Alpine massifs as well as with the uplift patterns predicted by several isostatic adjustment models. This multidisciplinary work enabled us to increase the spatial resolution both of horizontal and vertical surface deformations and of seismic crustal deformation. The related 3D strain rate field sheds new light on the various processes from which seismicity and deformation can originate, and brings new constraints on several primary inputs to seismic hazard assessment models.
Hammadi, Lamia. "Custom supply chain engineering : modeling and risk management : application to the customs." Thesis, Normandie, 2018. http://www.theses.fr/2018NORMIR23.
The security, safety and efficiency of the international supply chain are of central importance for governments, for their financial and economic interests, and for the security of their residents. In this regard, society faces multiple threats, such as illicit traffic of drugs, arms and other contraband, as well as counterfeiting and commercial fraud. For countering (detecting, preventing, investigating and mitigating) such threats, customs act as the gatekeepers of international trade and the main actor in securing the international supply chain. Customs intervene at all stages along the routing of cargo; all transactions leaving or entering the country must be processed by the customs agencies. In such an environment, customs become an integral thread within the supply chain. We adopt this point of view, with a particular focus on customs operations, and, to underline this focus, we refer to this analysis as the "customs supply chain". In this thesis, we first set up the concept of the customs supply chain, identify the actors and the structural links between them, then establish the process mapping, integration approach and performance model. Second, we develop a new approach for managing risks in the customs supply chain based on qualitative analysis. This approach identifies risk classes and recommends the best possible solutions to reduce the risk level. Our approach is applied to Moroccan customs, using criticality as a risk indicator. We first use Failure Modes, Effects and Criticality Analysis (FMECA) crossed with the Activity-Based Costing (ABC) method and priority weights; we then use the Analytic Hierarchy Process (AHP) and fuzzy AHP (i.e., risk assessment under uncertainty); finally, a benchmarking of the two indicators is conducted in order to examine the effectiveness of the obtained results.
Finally, we develop stochastic models for risk time series that address the most important challenge of risk modeling in the customs context: seasonality. To be more specific, we propose, on the one hand, models based on uncertainty quantification to describe monthly components; the different models are fitted, using the moment-matching method, to the time series of seized quantities of illicit traffic at five sites. On the other hand, hidden Markov models are fitted using the EM algorithm on the same observation sequences. We show that these models accurately capture and describe the seasonal components of risk time series in the customs context. It is also shown that the fitted models can be easily interpreted and provide a good description of important properties of the data, such as the second-order structure and the probability density functions (PDFs) per season and per site.
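A moment-matching fit of monthly components, in the spirit of the abstract above, can be sketched as follows. The data are synthetic and the per-month lognormal is an assumption chosen for illustration, not the thesis's model:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic monthly seizure quantities with a strong seasonal component
# (all figures invented; the real series in the thesis are customs data).
months = np.tile(np.arange(12), 10)                    # 10 years of monthly data
seasonal_mean = 100 + 50 * np.sin(2 * np.pi * months / 12)
data = rng.gamma(shape=4, scale=seasonal_mean / 4)     # positive, right-skewed

# Moment matching: for each month, fit a lognormal by matching the empirical
# mean m and variance v:  mu = log(m^2 / sqrt(v + m^2)),  sigma^2 = log(1 + v / m^2).
params = {}
for m in range(12):
    x = data[months == m]
    mean, var = x.mean(), x.var()
    mu = np.log(mean**2 / np.sqrt(var + mean**2))
    sig2 = np.log(1 + var / mean**2)
    params[m] = (mu, sig2)

# By construction, the fitted lognormal reproduces the empirical mean per month:
fitted_means = {m: np.exp(mu + sig2 / 2) for m, (mu, sig2) in params.items()}
```

Each month gets its own fitted distribution, which is one simple way to let the model "see" the seasonal component directly.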
Gatti, Filippo. "Analyse physics-based de scénarios sismiques «de la faille au site» : prédiction de mouvement sismique fort pour l’étude de vulnérabilité sismique de structures critiques." Thesis, Université Paris-Saclay (ComUE), 2017. http://www.theses.fr/2017SACLC051/document.
The ambition of this work is the prediction of a synthetic yet realistic broad-band incident wave-field induced by strong ground motion earthquakes at sites of strategic importance, such as nuclear power plants. To this end, a multi-tool platform is developed and exploited to simulate the different aspects of the complex, multi-scale phenomenon an earthquake embodies. This computational framework copes with the manifold nature of an earthquake through a holistic local-to-regional approach. A complex case study is chosen to this end: the MW 6.6 Niigata-ken Chūetsu-oki earthquake, which damaged the Kashiwazaki-Kariwa nuclear power plant. The observed non-linear site effects are first investigated and characterized. The 3D source-to-site model is then constructed and employed to provide reliable input ground motion for a frequency band of 0-7 Hz. The effect of the folded geological structure underneath the site is quantified by simulating two aftershocks of moderate intensity and by estimating the spatial variability of the response spectra at different locations within the nuclear site. The numerical outcome stresses the need for a more detailed description of the incident wave-field used as an input parameter in the anti-seismic structural design of nuclear reactors and facilities. Finally, the frequency band of the time histories obtained from the numerical simulations is enlarged by exploiting the stochastic prediction of short-period response ordinates provided by artificial neural networks.
Niang, Ibrahima. "Quantification et méthodes statistiques pour le risque de modèle." Thesis, Lyon, 2016. http://www.theses.fr/2016LYSE1015/document.
In finance, model risk is the risk of loss resulting from using models. It is a complex risk covering many different situations, especially estimation risk and the risk of model misspecification. This thesis focuses on the model risk inherent in yield and credit curve construction methods and on the analysis of the consistency of Sobol indices with respect to the stochastic ordering of model parameters. It is divided into three chapters. Chapter 1 focuses on the model risk embedded in yield and credit curve construction methods. We analyse in particular the uncertainty associated with the construction of yield curves or credit curves. In this context, we derive arbitrage-free bounds for the discount factor and the survival probability at the most liquid maturities. In Chapter 2, we quantify the impact of parameter risk through global sensitivity analysis and the theory of stochastic orders. We analyse in particular how Sobol indices are transformed following an increase of parameter uncertainty with respect to the dispersive or excess wealth orders. Chapter 3 focuses on the contrast quantile index. We link the latter to the risk measure CTE, and then analyse in which circumstances an increase of parameter uncertainty, in the sense of the dispersive or excess wealth orders, implies an increase of the contrast quantile index. We finally propose an estimation procedure for this index and prove, under some conditions, that our estimator is consistent and asymptotically normal.
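A first-order Sobol index, as used in the global sensitivity analysis of Chapter 2 above, can be estimated by the classical pick-freeze Monte Carlo scheme. The toy model below is invented for illustration and is not taken from the thesis:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Toy model Y = X1 + 2*X2 with independent standard normal inputs:
# Var(Y) = 5, so the exact first-order Sobol indices are S1 = 1/5 and S2 = 4/5.
def model(x1, x2):
    return x1 + 2.0 * x2

# Pick-freeze estimator of S1: correlate Y with a copy in which X1 is "frozen"
# (kept identical) while X2 is resampled independently.
x1, x2 = rng.standard_normal(n), rng.standard_normal(n)
x2_new = rng.standard_normal(n)
y = model(x1, x2)
y_frozen = model(x1, x2_new)

s1_hat = np.cov(y, y_frozen)[0, 1] / np.var(y, ddof=1)
print(f"estimated S1 = {s1_hat:.3f} (exact value 0.2)")
```

The covariance between the original and frozen outputs isolates the variance explained by `X1` alone, which is exactly what the first-order index measures.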
Pichené, Anne. "Quantification des facteurs de risque biomecaniques du syndrome du canal carpien." Nancy 1, 1994. http://www.theses.fr/1994NAN11192.
Dambra, Savino. "Data-driven risk quantification for proactive security." Electronic Thesis or Diss., Sorbonne université, 2021. http://www.theses.fr/2021SORUS356.
The feasibility and efficacy of proactive measures depend upon a cascade of challenges: how can one quantify the cyber risks of a given entity, what reliable indicators can be used to predict them, and from which data sources can they be extracted? In this thesis, we enumerate active challenges that practitioners and researchers face when attempting to quantify cyber risks, contextualise them in the emerging domain of cyber insurance, and propose several research directions. We then explore some of these areas, evaluate the incidence that different security measures and security postures have on malware-infection risks, and assess the quality of nine host-extracted indicators when investigating the systematic nature of those risks. We finally provide evidence of the importance that data-source selection, together with a holistic approach, has on risk measurements. We look at web tracking and demonstrate how underestimated privacy risks are when the users' perspective is excluded.
Caillot, Véronique. "Quantification statistique et étude expérimentale de mouvements sismiques : application à l'évaluation du risque." Grenoble 1, 1992. http://www.theses.fr/1992GRE10033.
Baud, Céline. "Le crédit sous Bâle II - un dispositif néolibéral de financiarisation en pratiques." Thesis, Jouy-en Josas, HEC, 2013. http://www.theses.fr/2013EHEC0007/document.
How does international regulation organize credit activities? In order to address this question, this thesis analyzes the regulatory standards for assessing and controlling credit risks defined by the 2004 reform of the Basel Agreements on Capital Adequacy, the so-called Basel II Agreements, and traces the genealogy of these standards from where and when they were constructed (in the Basel Committee from 1998 to 2004) down to their effects on the daily practices of a bank. The thesis describes the transnational community and the liberal project that launched the Basel Agreements reform process. It then analyzes the regulatory framework itself and suggests that Basel II marks a shift within the regulatory regime of credit risk. Indeed, it demonstrates that the norms are embedded in a financialized representation of credit and that they are implemented following a neoliberal mode of government. Finally, the thesis investigates how the new norms have been translated into a French cooperative bank specialized in SME lending and concludes that the new regulatory framework actively participates in the disembedding and financialization of credit relationships.
Charvat, Hadrien. "Le risque attribuable : de la quantification de l’impact populationnel des facteurs de risque à la mesure de l’importance relative des biomarqueurs." Thesis, Lyon 1, 2010. http://www.theses.fr/2010LYO10320.
The attributable risk is an epidemiologic tool that dates back to the fifties but is still relatively seldom used. It estimates the proportion of cases of a given disease that could be avoided if the exposure to a specific risk factor were removed or reduced. Its major interest is that it combines the magnitude of the effect of the risk factor with the distribution of this factor within the population. After a review of the attributable risk's main features and of the principles of its estimation from case-control data, we propose a conceptual framework for estimating the impact of a public health intervention in a new population whose exposure to certain risk factors differs from that observed in the study population. To reach this goal, we used a decomposition of the attributable risk that takes into account the combined action, or synergy, of the risk factors on the occurrence of the disease. Because the attributable risk estimates the effect of a variable at the population level, it is particularly interesting for quantifying the relative importance of the covariates of a regression model. In diagnostic models, estimating the relative importance of classic biomarkers and of biomarkers obtained from high-throughput technologies is currently crucial in establishing the contribution of each of these two levels of information. Using simulations, we have demonstrated how the role of high-throughput technologies, quantified in terms of attributable risk, may be wrongly assessed through the use of unsuitable methodology.
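The population attributable risk mentioned above is commonly computed with Levin's formula. A minimal sketch follows; the prevalence and relative-risk figures are invented for illustration:

```python
# Levin's formula for the population attributable risk (PAR):
#   PAR = p*(RR - 1) / (1 + p*(RR - 1)),
# where p is the exposure prevalence in the population and RR the relative risk.
def attributable_risk(prevalence: float, relative_risk: float) -> float:
    """Fraction of cases that would be avoided if the exposure were removed."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Example: 30% of the population exposed, relative risk of 2.0.
par = attributable_risk(0.30, 2.0)
print(f"attributable risk = {par:.3f}")  # 0.3 / 1.3 ≈ 0.231
```

Note how the same relative risk yields a very different PAR depending on prevalence, which is precisely the "effect magnitude combined with exposure distribution" property the abstract highlights.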
Gassiat, Paul. "Modélisation du risque de liquidité et méthodes de quantification appliquées au contrôle stochastique séquentiel." PhD thesis, Université Paris-Diderot - Paris VII, 2011. http://tel.archives-ouvertes.fr/tel-00651357.
Demotier, Sabrina. "Approches probabiliste et crédibiliste de quantification du risque de production d'une eau non-conforme." Compiègne, 2004. http://www.theses.fr/2004COMP1501.
In order to minimize the sanitary and financial risks due to the distribution of drinking water non-compliant with current legislation, it is fundamental to define an efficient treatment process line without increasing production cost. This thesis thus proposes a decision-aid approach dedicated to treatment plant design, based on the quantification of the risk of non-compliant water production. In a first step, usual dependability methods (FMECA, fault trees, etc.) are used to calculate the probability of non-compliant water production. This approach takes into account the quality of the raw water to be treated, the technical characteristics of the treatment line, as well as possible failure modes. In a second step, in order to mitigate the uncertainties in this model and the lack of data, the belief function theory is applied to define the degree of credibility of compliant water production.
Sanson, Francois. "Estimation du risque humain lié à la retombée d'objets spatiaux sur Terre." Thesis, Université Paris-Saclay (ComUE), 2019. http://www.theses.fr/2019SACLX035.
Recent regulations require that the re-entry of human-made end-of-life space objects be accompanied by a rigorous assessment of the risk for human assets. The risk evaluation requires sequences of complex numerical simulations accounting for the multi-physics phenomena occurring during the reentry of a space object, e.g., fluid-structure interactions and heat transfer. Furthermore, these simulations are inaccurate because they rely on overly simplified models and partial knowledge of the reentry conditions. In this thesis, we propose novel uncertainty quantification techniques to deal with some of the uncertainties characterizing the problem and apply them to predict the risk for human assets due to the reentry of a space object. First, we construct a system of solvers to predict both the controlled and uncontrolled reentry of space objects. Compared to existing reentry software, our system naturally accommodates the uncertainty in the object breakup predictions. Moreover, the constitutive solvers are interfaced and coupled within a framework that allows a single user to perform parallel runs of the full system. Second, we present two original methods to propagate the uncertainties in reentry predictions using the system of solvers. In the first, we construct a surrogate model approximating the directed system of solvers using a system of Gaussian processes (SoGP). We build this probabilistic surrogate by approximating each solver (or group of solvers) of the directed system by a Gaussian process (GP). We show that the predictive variance of the SoGP is composed of individual contributions from each GP. We use this decomposition of the variance to develop active learning strategies in which the training datasets are enriched parsimoniously, improving the prediction of the least reliable GP only. We assessed the performance of the SoGP on several analytical and industrial cases.
The SoGP coupled with active learning strategies systematically yielded significant improvements. The second method aims at predicting the survivability of space objects. During a reentry event, the object can break up and generate fragments. Some fragments disintegrate in the atmosphere while others survive to the ground. Assessing the survivability of a fragment implies determining whether it reaches the ground and, if it does, the impact location and the associated risk. We propose an original formulation of the survivability assessment problem to efficiently estimate the risk. The proposed method involves the composition of a classifier (demise prediction) with a Gaussian process (impact location prediction). Dedicated active learning strategies are designed to balance the prediction errors of the classifier and the GP and to allocate training samples adequately. Finally, we apply the methods developed in the thesis to the prediction of the controlled reentry of a rocket upper stage. The problem involves a large number of uncertainties (38), including the initial orbit properties, the deorbiting conditions, the upper-stage material characteristics, the atmosphere model parameters, and the fragment material uncertainties. Moreover, we use a probabilistic breakup model to account for model uncertainties in the object breakup. With our methods, we estimate at a reasonable computational cost the statistics of the conditions at breakup, the survival probability of the fragments, the casualty area, and the human risk. Global sensitivity analyses of the breakup conditions and casualty area provide a ranking of the most critical uncertainties. This study demonstrates the capability of our surrogate simulator to produce a robust measure of on-ground risk for a realistic reentry scenario.
Margheri, Luca. "Etude de l'impact des incertitudes dans l'évaluation du risque NRBC provoqué en zone urbaine." Thesis, Paris 6, 2015. http://www.theses.fr/2015PA066453.
The dispersion of highly pathogenic biological agents in an urbanized area following a terrorist act is one of the situations that national security agencies need to evaluate in terms of risk assessment and decision-making. The numerical simulation of turbulent flows in urban areas, including the dispersion of pollutants, has reached a sufficient level of maturity to make predictions on realistic urban zones of up to 4 square kilometers. However, the existing simulations are deterministic in the sense that all the parameters defining the case studied (wind intensity and direction, atmospheric stratification, emission source location, quantity of injected toxic agent, etc.) are assumed to be well known. Such precision cannot be achieved in practice, due to the lack of knowledge about the emission source and the intrinsic aleatory uncertainty of the meteorological conditions. To significantly increase the contribution of numerical simulation to risk assessment and decision-making, it is essential to quantitatively measure the impact of this lack of knowledge, especially in terms of the spatial and temporal resolution of the danger zones. The object of this thesis is to apply uncertainty quantification methods to quantify the impact of these uncertainties on the evaluation of the danger zones in medium-range toxic gas dispersion scenarios. A hybrid method merging c-ANOVA and POD/Kriging makes it possible to consider up to 5 uncertain parameters in a high-fidelity unsteady 3D CFD simulation of the dispersion of a toxic gas from a pond-like source in an urban area of 1 km².
Favier, Philomène. "Une approche intégrée du risque avalanche : quantification de la vulnérabilité physique et humaine et optimisation des structures de protection." Thesis, Grenoble, 2014. http://www.theses.fr/2014GRENU051/document.
Long term avalanche risk quantification for mapping and the design of defense structures is done in mostcountries on the basis of high magnitude events. Such return period/level approaches, purely hazardoriented,do not consider elements at risk (buildings, people inside, etc.) explicitly, and neglect possiblebudgetary constraints. To overcome these limitations, risk based zoning methods and cost-benefit analyseshave emerged recently. They combine the hazard distribution and vulnerability relations for the elementsat risk. Hence, the systematic vulnerability assessment of buildings can lead to better quantify the riskin avalanche paths. However, in practice, available vulnerability relations remain mostly limited to scarceempirical estimates derived from the analysis of a few catastrophic events. Besides, existing risk-basedmethods remain computationally intensive, and based on discussable assumptions regarding hazard modelling(choice of few scenarios, little consideration of extreme values, etc.). In this thesis, we tackle theseproblems by building reliability-based fragility relations to snow avalanches for several building types andpeople inside them, and incorporating these relations in a risk quantification and defense structure optimaldesign framework. So, we enrich the avalanche vulnerability and risk toolboxes with approaches of variouscomplexity, usable in practice in different conditions, depending on the case study and on the time availableto conduct the study. The developments made are detailed in four papers/chapters.In paper one, we derive fragility curves associated to different limit states for various reinforced concrete(RC) buildings loaded by an avalanche-like uniform pressure. Numerical methods to describe the RCbehaviour consist in civil engineering abacus and a yield line theory model, to make the computations asfast as possible. 
Different uncertainty propagation techniques make it possible to quantify the fragility relations linking pressure to failure probability, to study the weight of the different parameters, and to test the different assumptions regarding the probabilistic modelling of the joint input distribution. In paper two, the approach is extended to more complex numerical building models, namely a mass-spring model and a finite element one. Much more realistic descriptions of RC walls are thus obtained, which are useful for complex case studies where detailed investigations are required. However, the idea is still to derive fragility curves with the simpler, faster to run, but well-validated mass-spring model, in a “physically-based meta-modelling” spirit. In paper three, having various fragility relations for RC buildings at hand, we first propose new relations linking the death probability of people inside them to the avalanche load. Second, these two sets of fragility curves, for buildings and for humans, are exploited in a comprehensive risk sensitivity analysis. In this way, we highlight the gap that can exist between return-period-based zoning methods and acceptable risk thresholds. We also show the higher robustness of optimal design approaches to the choice of vulnerability relations on a typical dam design case. In paper four, we propose simplified analytical risk formulas based on extreme value statistics to quantify risk and perform the optimal design of an avalanche dam in an efficient way. A sensitivity study is conducted to assess the influence of the chosen statistical distributions and flow-obstacle interaction law, highlighting the need for precise risk evaluations to well characterise the tail behaviour of extreme runouts and the predominant patterns in avalanche-structure interactions.
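The reliability-based fragility curves summarized above can be illustrated with a minimal Monte Carlo sketch. This is not the thesis' civil-engineering model: the lognormal wall-resistance parameters below are hypothetical, chosen only to show how a failure probability is estimated as a function of avalanche pressure.

```python
import numpy as np

def fragility_curve(pressures, capacity_mean=30.0, capacity_cov=0.2,
                    n_samples=100_000, seed=0):
    """Monte Carlo estimate of P(failure | pressure) for a wall whose
    resistance capacity (kPa) is lognormally distributed.
    Parameters are illustrative, not taken from the thesis."""
    rng = np.random.default_rng(seed)
    # Convert mean and coefficient of variation to lognormal parameters
    sigma = np.sqrt(np.log(1.0 + capacity_cov**2))
    mu = np.log(capacity_mean) - 0.5 * sigma**2
    capacity = rng.lognormal(mu, sigma, n_samples)
    # Failure occurs when the applied pressure exceeds the sampled capacity
    return np.array([(p > capacity).mean() for p in pressures])

pressures = np.linspace(0.0, 60.0, 7)   # kPa
probs = fragility_curve(pressures)
```

By construction the curve rises monotonically from 0 toward 1 with the load; the thesis replaces this toy resistance model with design charts, yield line theory, and mass-spring or finite element models.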
De, Moya Jean-François. "Essais sur l'adoption des technologies de quantification de soi : une approche critique." Thesis, Strasbourg, 2019. http://www.theses.fr/2019STRAB001/document.
This article-based thesis explores the adoption of quantified-self technologies and practices with a critical stance. The aim is to gain a better understanding of users' experiences with self-quantification technologies, such as connected wristbands, and to question the real contribution of these technologies to the well-being and health of individuals. The first essay presents a systematic literature review on the quantified self and a research agenda for business researchers. The second essay is a qualitative study that reveals the power relationships users maintain with the technology. The third essay focuses on the underlying mechanisms that guide the user's decision, in order to identify the factors that lead to the adoption of a quantified-self technology.
Monpoeho, Dé Mondé Serge. "Quantification génomique de deux virus entériques (entérovirus et HAV) dans les boues de stations d'épuration. Estimation de l'impact sanitaire lié à leur valorisation agricole." Nantes, 2001. http://www.theses.fr/2001NANT08VS.
Dagnas, Stéphane. "Développement d'une méthode de quantification du risque d'altération de produits alimentaires par les moisissures. Applications aux produits de Boulangerie, Viennoiserie, Pâtisserie." Nantes, 2015. https://doc-agro.oniris-nantes.fr/GED_ONI/192147991032/Manuscrit_these_Dagnas_finale_version_protegee.pdf.
Bakery products represent a six-billion-euro market in France and a forty-billion-euro market in Europe. Given the large volumes manufactured, these products are not spared by the current food waste phenomenon, a significant part of which stems from mold spoilage. In order to quantify this spoilage and tackle it, this PhD work aimed at developing a method to assess the mold spoilage risk of bakery products.
Laachir, Ismail. "Quantification of the model risk in finance and related problems." Thesis, Lorient, 2015. http://www.theses.fr/2015LORIS375/document.
The main objective of this thesis is the study of model risk and its quantification through monetary measures; we also expect the proposed measures to accommodate a large set of complex (exotic) financial products. The first two chapters treat the model risk problem from the empirical and the theoretical points of view, while the third chapter concentrates on a theoretical study of another financial risk called basis risk. In the first chapter, we are interested in the model-independent pricing and hedging of complex financial products when a set of standard (vanilla) products is available in the market. We follow the optimal transport approach for the computation of option bounds and super- (sub-) hedging strategies. We characterize the optimal martingale probability measures under which the exotic option price attains the model-free bounds, devoting special interest to the case where the martingales are positive. We focus in particular on the symmetry relations that arise when studying the option bounds. In the second chapter, we approach the model risk problem from an empirical point of view. We study the optimal management of a natural gas storage facility and we quantify the impact of model risk on the storage value. As already mentioned, the last chapter concentrates on basis risk, which arises when one hedges a contingent claim written on a non-tradable but observable asset (e.g. the temperature) using a portfolio of correlated tradable assets. One hedging criterion is mean-variance minimization, which is closely related to the celebrated Föllmer-Schweizer decomposition. That decomposition can be deduced from the resolution of a special class of backward stochastic differential equations (BSDEs) driven by a càdlàg martingale. When this martingale is a standard Brownian motion, the related BSDEs are closely connected to semi-linear parabolic PDEs.
In that chapter, we formulate a deterministic problem generalizing those PDEs to the general context of martingales, and we apply this methodology to discuss some properties of the Föllmer-Schweizer decomposition. We also give an explicit expression of this decomposition of the option payoff when the underlying prices are exponentials of additive processes.
Gilbert, Adrien. "Modélisation du régime thermique des glaciers : applications à l'étude du risque glaciaire et à la quantification des changements climatiques à haute altitude." Phd thesis, Université de Grenoble, 2013. http://tel.archives-ouvertes.fr/tel-01061854.
Gilbert, Adrien. "Modélisation du régime thermique des glaciers : applications à l’étude du risque glaciaire et à la quantification des changements climatiques à haute altitude." Thesis, Grenoble, 2013. http://www.theses.fr/2013GRENU029/document.
This thesis focuses on the numerical modeling of the thermal response of glaciers to climate change. The change in surface conditions related to climate forcing is one of the key points of this study. We point out the crucial role of meltwater refreezing in cold accumulation zones and of the thickness of the snow and firn cover around the equilibrium line. The models are developed from the study of three specific glaciers in the Mont Blanc range (French Alps). These studies allow us to analyze the risk related to hanging glaciers and to reconstruct past temperatures at high altitude from observed englacial temperatures. The very extensive dataset obtained at the Col du Dôme site (4250 m) includes measurements of deep temperatures and density profiles as well as measurements of snow accumulation and surface velocities. These observations allow us to develop and validate a three-dimensional thermal regime model coupled with a flow model and forced by meteorological data. This model is used to perform a thorough study of the thermal response of a cold accumulation zone to climate change. It can also be used to predict future glacial hazards related to the warming of hanging glaciers. A preliminary analysis has been carried out on the Taconnaz hanging glacier: warming could have a major impact on the stability of glaciers of this kind, frozen to their beds, if the melting point is reached. The study performed on Tête Rousse glacier (about 3200 m) clearly shows the influence of the snowpack and firn cover on the thermal regime of glaciers around the equilibrium line. The results obtained from coupling a snow model (CROCUS) with a thermal regime model agree very well with the observed ice temperatures. This study provides insights into the thermal processes responsible for water storage inside a small, almost static glacier, which can lead to catastrophic outburst floods.
In the future, according to atmospheric temperature increase scenarios for the coming century, most of the glacier will become cold, which will reduce the risk of water-filled cavity formation. Finally, a new inverse method has been developed to reconstruct past temperatures from observed englacial temperatures. The method is based on the simultaneous inversion of several temperature profiles coming from different drilling sites subject to the same climate forcing, and it overcomes the impact of refreezing meltwater on the reconstruction of past air temperature variations. The results obtained at the Col du Dôme show that climate warming at this 4250 m site is similar to the regional trend observed at low altitude in the northwestern Alps, suggesting that air temperature trends are not altitude dependent.
Cuenca, Botey Luis-Emilio. "De la Réforme Sociale à l’Optimisation du Risque-Rendement : une compréhension du processus de gouvernementalisation au travers de l’histoire d’une Banque Ouvrière en Amérique Latine." Thesis, Jouy-en Josas, HEC, 2015. http://www.theses.fr/2015EHEC0010/document.
In the world of organizations, banks are characterized by three particular elements: the preeminence of calculation in defining the organization's course of action; the construction of control and evaluation devices for customers, who, in accounting terms, are their assets; and the key role they play in the dissemination of economic rationality in society. All banks are therefore based on calculation, but also on paradigms from political economy and their implicit moral imperatives. Yet the relations between these constitutive dimensions of banking practice have been little explored by accounting research. The aim of this work is to contribute to a better knowledge of banking practices and of their role in the construction of individuals governed by economic reason. To this end, the work was conducted in close reference to the social and technical history of a particular banking organization, Costa Rica's Banco Popular y de Desarrollo Comunal (BPDC). The case study has three levels. First, the level of “governmental discourse”, which we studied through a systematic analysis of the original text and amendments of the organic law of the BPDC; in this analysis we identified five political rationalities that have historically disputed hegemony in defining the organization's conception of control. Second, we studied the level of strategic decision-making, emphasizing the conflicts linked to the management of assets and liabilities; here we bring out the relationship between calculative technologies and the political rationalities identified at the previous level, and show how the grounds of justification were displaced in order to reach agreements and make decisions. Finally, the third level of the study concerns the interaction between borrowers and the organization; to address it, we reconstructed the history of the devices used to evaluate borrowers.
We thus show the relationship between the ways conflicts are resolved at the strategic level of the organization and the ways credit risk is assessed. This work relates to current research on the relationships between political economy and management tools, and aims to engage with governmentality studies in accounting. Its main contribution is the construction of a different approach to understanding how economic rationality is diffused into the functioning of organizations.
Haziza, Simon. "Quantification du transport intraneuronal par suivi de nanodiamants fluorescents. Application à l’étude de l’impact fonctionnel de facteurs de risque génétiques associés aux maladies neuropsychiatriques." Thesis, Université Paris-Saclay (ComUE), 2015. http://www.theses.fr/2015SACLN013/document.
The identification of molecular biomarkers of brain diseases as diverse as autism, schizophrenia and Alzheimer's disease is of crucial importance, not only for objective diagnosis but also to monitor response to treatment. The establishment and maintenance of sub-cellular neuronal functions, such as synaptic plasticity, are highly dependent on intracellular transport, which is essential to deliver important materials to specific locations. Abnormalities in such active transport are thought to be partly responsible for the impairments in synaptic plasticity and neuronal morphology found in many neuropsychiatric and neurodegenerative diseases. This thesis reports (i) the development of a technique for quantifying intraneuronal transport based on the tracking of fluorescent nanodiamonds (fNDs); (ii) the application of this simple and minimally invasive approach to the functional analysis of genetic variants related to neuropsychiatric diseases. The manuscript falls into four chapters. The first details the complex polygenic architecture of mental disorders and demonstrates the disease relevance of monitoring intraneuronal transport. The second and third chapters are dedicated to the nanodiamond-tracking assay and describe the fND internalisation strategies, the spatiotemporal quantitative readouts and the validation of the technique. Their high brightness, perfect photostability and absence of cytotoxicity make fNDs a tool of choice for high-throughput, long-term bioimaging at high spatiotemporal resolution. Finally, in the fourth chapter, we apply this new functional analysis method to study the effect of genetic variants associated with autism and schizophrenia. We established transgenic mouse lines in which the MARK1 and SLC25A12 genes were slightly overexpressed, and used AAV-shRNA to induce AUTS2 gene haploinsufficiency.
Our molecular diagnosis assay proves sensitive enough to detect fine changes in intraneuronal transport dynamics, paving the way for future developments in translational nanomedicine.
Venne, Philippe. "Quantification des ecdystéroïdes et acides rétinoïques chez la puce d’eau (D.magna) par chromatographie liquide couplée à la spectrométrie de masse en tandem." Mémoire, Université de Sherbrooke, 2015. http://hdl.handle.net/11143/8570.
Dayhum, Abdunaser. "Appréciation du risque de contamination de l'homme par des Salmonella spp à partir de produit d'origine bovine : viande hachée." Phd thesis, AgroParisTech, 2008. http://pastel.archives-ouvertes.fr/pastel-00003795.
Beck, François. "Représentativité des échantillons et représentation des usages : l'apport des enquêtes en population générale à la compréhension des usages de drogues." Phd thesis, Université René Descartes - Paris V, 2006. http://tel.archives-ouvertes.fr/tel-00338155.
Reutenauer, Victor. "Algorithmes stochastiques pour la gestion du risque et l'indexation de bases de données de média." Thesis, Université Côte d'Azur (ComUE), 2017. http://www.theses.fr/2017AZUR4018/document.
This thesis addresses several problems of stochastic control and optimization that can only be solved approximately. On the one hand, we develop methodologies aiming to reduce or suppress approximations, so as to obtain more accurate or even exact solutions; on the other hand, we develop new approximation methodologies in order to solve larger-scale problems more quickly. We study numerical methods for simulating stochastic differential equations and improvements in the computation of expectations. We develop quantization methods to build control variates, and stochastic gradient methods to solve stochastic control problems. We are also interested in clustering methods linked to quantization, and in principal component analysis and data compression using neural networks. We study problems motivated by mathematical finance, such as stochastic control for the hedging of derivatives in incomplete markets, but also the management of large media databases, commonly referred to as big data, in chapter 5. Theoretically, we propose upper bounds for the convergence of the numerical methods used. This is the case for optimal hedging in incomplete markets in chapter 3, and also for an extension of the Beskos-Roberts method for the exact simulation of stochastic differential equations in chapter 4. In chapter 2, we present an original application of the Karhunen-Loève decomposition as a control variate for the computation of expectations.
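The quantization- and Karhunen-Loève-based control variates of this thesis are beyond a short sketch, but the underlying control-variate principle can be illustrated on a toy problem of my own choosing (estimating E[exp(Z)] for a standard normal Z, using Z itself, whose mean is known to be 0, as the control):

```python
import numpy as np

rng = np.random.default_rng(42)
z = rng.standard_normal(200_000)

f = np.exp(z)   # target: E[exp(Z)] = exp(1/2) ≈ 1.6487
c = z           # control variate with known mean E[Z] = 0

# Optimal coefficient beta = Cov(f, c) / Var(c)
beta = np.cov(f, c)[0, 1] / np.var(c)

plain_est = f.mean()                    # crude Monte Carlo estimate
cv_est = (f - beta * (c - 0.0)).mean()  # control-variate estimate

# Empirical variance reduction factor achieved by the control variate
reduction = np.var(f) / np.var(f - beta * c)
```

The corrected estimator keeps the same mean but a smaller variance whenever f and c are correlated; the thesis' contribution is to build such correlated controls systematically via quantization rather than by hand, as done here.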
Houriez--Gombaud-Saintonge, Sophia. "Analyse automatisée des données 3D+t d’imagerie par résonance magnétique de vélocimétrie. Quantification de l’apport du 3D+t." Electronic Thesis or Diss., Sorbonne université, 2020. http://www.theses.fr/2020SORUS328.
Cardiovascular diseases remain the leading cause of death in OECD countries, in particular because of population aging, making them one of the major health issues on a global scale. Advances in imaging now make it possible to better understand and diagnose these diseases non-invasively. More recently, a non-invasive and non-radiative imaging technique named “4D Flow MRI” has made it possible for the first time to image blood flow velocity in three dimensions over a whole cardiac cycle, offering new perspectives for visualization, understanding and measurement. This image processing thesis, carried out in collaboration with cardiologists and radiologists, aims to develop new indicators and to quantify the contribution of 4D flow MRI, especially in 1) the assessment of aortic stiffness, leading to a comparison between several approaches for estimating pulse wave velocity, 2) the analysis of flow disorganization in aging and in pathological dilation, and 3) the evaluation of filling flow in the left ventricle.
Graff, Kevin. "Contribution à la cartographie multirisques de territoires côtiers : approche quantitative des conséquences potentielles et des concomitances hydrologiques (Normandie, France) Analysis and quantification of potential consequences in multirisk coastal context at different spatial scales (Normandy, France) Characterization of elements at risk in the multirisk coastal context and at different spatial scales: Multi-database integration (Normandy, France)." Thesis, Normandie, 2020. http://www.theses.fr/2020NORMC001.
The coastal environment of Normandy is conducive to a convergence of multiple hazards (erosion, marine submersion, flooding by overflowing rivers or groundwater upwelling, turbid flooding by runoff, coastal or continental slope movements). Because of their interface position, strong regressive dynamics operate between marine and continental processes. These interactions occur within the slopes and valleys where coastal populations and their activities have tended to densify since the 19th century. In this context, it is necessary to adopt a multi-hazard and multi-risk approach that considers the spatial and temporal confluence of several hazards and their possible cascading effects, and to assess the multi-sector impacts they generate. As part of this thesis, carried out within the ANR RICOCHET program, three study sites were selected at the outlets of coastal rivers (from Auberville to Pennedepie, from Quiberville to Dieppe, and from Criel-sur-Mer to Ault) because of their significant stakes and the strong interactions between hydrological and gravitational phenomena. Two main objectives were pursued: (1) a methodological development in the analysis of potential consequences, considering all the elements at risk within a study territory through a multiscale approach; (2) an analysis of hydrological concomitances through a combined statistical and spatial approach.
Ed-Daoui, Ilyas. "Towards systems-of-systems structural resilience assessment Resilience assessment as a foundation for systems-of-systems safety evaluation : application to an economic infrastructure An approach to systems-of-systems structural analysis through interoperability assessment : application on Moroccan Case A study of an adaptive approach for systems-of-systems integration A contribution to systems-of-systems concept standardization Unstructured peer-to-peer systems : towards swift Routing A deterministic approach for systems-of-systems resilience quantification Vers des systèmes de systèmes robustes Security enhancement architectural model for IMS based networks Towards reliable IMS-based networks." Thesis, Normandie, 2019. http://www.theses.fr/2019NORMIR07.
Nowadays, we expect SoS (systems-of-systems) not merely to be functional, but also to be reliable, to preserve their performance, to complete the required functions and, most importantly, to anticipate potential defects. Among the numerous perspectives on reliability in the context of SoS is its relationship with resilience, which concerns the consequences of disturbances and the associated uncertainties. Resilience is defined as the ability of systems to withstand a major disruption within acceptable degradation parameters and to recover within an acceptable time and composite costs and risks. In this thesis, two complementary approaches are proposed to analyze SoS structural resilience. The first relates to extensibility, a specific characteristic of SoS, which continuously evolve and change; here the focus is on evaluating SoS structural resilience with regard to its dynamic aspect, through interoperability assessment. The second approach considers the SoS structure and its inner workflow pathways, leading to structural resilience assessment through a set of indicators. Both proposed approaches are deterministic and can be used to evaluate the current state of an SoS structure or to anticipate its resilience under future scenarios. Furthermore, a prototype was designed to carry out the structural resilience assessment; taking spatial objects into account, it was used to conduct experiments on real industrial infrastructures approached as SoS.
Munayeno, Muvova. "Les infections sexuellement transmissibles (maladies vénériennes) et la santé publique au Congo: contribution à l'histoire socio-épidémiologique des IST en milieux urbains (1885-1960)." Doctoral thesis, Universite Libre de Bruxelles, 2010. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/210102.
The issue of sexually transmitted infections (STIs, formerly venereal diseases) in Africa has long been neglected by African social science researchers, notably because of the taboo surrounding sexuality on the continent. Over recent decades, however, the emergence of the current AIDS pandemic has given rise to a body of research, conducted mainly by European Africanists, on these diseases. Most of this work focuses on risk factors, diffusion mechanisms, popular beliefs and attitudes towards these diseases, control policies, and so on. Historical studies of STIs remain very rare, and those that exist have mainly emphasized the demographic dimension, centred on the problem of declining birth rates, leaving in the shadows the socio-historical context and the socio-epidemiological conditions in which these infections spread. At a time when AIDS is ravaging the world, and sub-Saharan Africa in particular, the value of a historical reflection on STIs in the Congo is self-evident.
Contrary to a claim commonly accepted in the literature, according to which the fight against STIs among the Congolese population was a clear success for the colonial authorities, especially after the Second World War, this thesis shows instead that STI prevalence increased over time. Previously unexploited archives and the analysis of the data reveal that this continuous increase was a consequence of accelerated urbanization and of the monetarization of society and sexuality, which produced ways of life specific to colonial urban society. The cities born of this process became not only spaces of acculturation and modernity but also sites of expansion for these diseases. The development of prostitution and the multiplicity of sexual partners, through freer and more transient unions, are the main factors explaining this observation.
Belgium's colonial health work in the Congo is generally presented in panegyric terms as a "model". Yet no study had previously examined, in quantitative terms, the differences in health between the Congolese and the white population. This dissertation fills that gap. It brings to light strong inequalities and persistent health imbalances between these two populations: the Congolese, far more numerous and socially disadvantaged, enjoyed only a barely favourable or unfavourable situation, while the whites, socially privileged, generally enjoyed better health. Several indicators developed in this work reveal this colonial reality, in terms of health facilities, access to and use of care, and differentiated health status.
Doctorat en Histoire, art et archéologie
info:eu-repo/semantics/nonPublished
Gosselin-Théberge, Maxime. "Campylobacter dans différents environnements aquatiques : quantification et génotypage afin de mieux évaluer les risques potentiels d’infection pour l’être humain." Thèse, 2015. http://hdl.handle.net/1866/13370.
Campylobacter is a zoonotic pathogen responsible for the majority of cases of bacterial gastroenteritis. Among its numerous transmission routes, including direct contact, food and water, poultry consumption has been recognized as the major one. A strong seasonal variation in campylobacteriosis cases exists for reasons that are not well understood; environmental water is suspected to be involved. This cross-sectional study was conducted in the southeastern region of Quebec (Canada), wherein Campylobacter from different waters (drinking water source, recreational and sewage) and from clinical sources was quantified and genotyped in order to evaluate the potential risks posed by environmental water. Several real-time PCR assays were compared for specific application to environmental water: two were selected for their specificity and sensitivity of quantification. Standard curves were calibrated using digital PCR to determine concentrations accurately. Campylobacter isolates from clinical and water sources were genetically compared using CGF (comparative genomic fingerprinting). Sewage waters showed the highest Campylobacter concentrations, while drinking water sources and recreational waters showed the lowest (averages of 3.9 log, 1.7 log and 1.0 log cells/L, respectively). CGF revealed that 6% of water isolates were genetically similar (100% homology) to clinical isolates. Isolates from summer cases of campylobacteriosis showed more genetic similarity with environmental water isolates than those from other seasons (p<0.01). The low Campylobacter concentrations, together with the genetic similarities between water and clinical isolates from the same region, suggest that these environmental waters pose a real but low risk of transmission.
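Real-time PCR quantification of the kind described here rests on a standard curve relating the quantification cycle (Cq) of serial dilutions to log10 concentration; the study calibrates such curves against digital PCR. A minimal sketch with hypothetical calibration values (not the study's data):

```python
import numpy as np

# Hypothetical calibration: serial dilutions (log10 cells/L) and measured Cq
log_conc = np.array([5.0, 4.0, 3.0, 2.0, 1.0])
cq       = np.array([18.1, 21.5, 24.9, 28.3, 31.7])

# Linear standard curve: Cq = slope * log10(concentration) + intercept
slope, intercept = np.polyfit(log_conc, cq, 1)

# Amplification efficiency: E = 10**(-1/slope) - 1
# (an ideal assay has slope ≈ -3.32, i.e. E ≈ 100 %)
efficiency = 10 ** (-1.0 / slope) - 1.0

def quantify(sample_cq):
    """Convert a measured Cq back to log10 cells/L via the standard curve."""
    return (sample_cq - intercept) / slope
```

In practice the curve is fitted per assay and per matrix, which is precisely why the study compared assays for their specificity and sensitivity on environmental water.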
Caillot, Veronique. "Quantification statistique et étude expérimentale de mouvements sismiques : application à l'évaluation du risque." Phd thesis, 1992. http://tel.archives-ouvertes.fr/tel-00709805.
Dion-Labrie, Marianne. "Perceptions de néphrologues transplanteurs et référents face à la quantification du risque immunologique global en transplantation rénale." Thèse, 2010. http://hdl.handle.net/1866/4463.
Background: The overwhelming scarcity of organs in renal transplantation forces researchers and transplantation teams to seek new ways to increase efficacy. The Groupe de recherche transdisciplinaire sur les prédicteurs du risque immunologique is attempting to put in place a scientifically precise method for determining the global immunological risk (GIR) of rejection for each patient awaiting a renal transplant. The quantification of the GIR is based on scientific factors: biological, immunological, clinical and psychosocial. The precise and global determination of the GIR could change the way patients are selected for renal transplantation, basing selection on scientific and quantifiable criteria. The advantages of using this method to select potential allograft recipients could include improved efficacy of the process and the individualization of immunosuppressive therapy. In spite of these numerous advantages, the approach raises several ethical questions to be explored with nephrologists working in kidney transplantation. Aims of the study: The aim of this study is to explore the views of transplant and referring nephrologists on the use of personalized medicine tools to develop a new method for selecting potential recipients of a renal allograft. The results of this research could contribute to determining the acceptable use of this method in renal transplantation and to studying the link between science and medicine. Methods: Twenty-two semi-directed interviews, using short clinical vignettes, were conducted with nephrologists in the province of Quebec between June 2007 and July 2008. The interviews were analyzed qualitatively using the content and thematic analysis method described by Miles and Huberman. Results: The results demonstrate a general acceptance of the approach among the participants. Knowledge of each patient's immunological risk could improve treatment and post-graft follow-up.
The new method could also be more effective than the one presently used, provided it is validated scientifically and leaves a role for clinical judgment. On the other hand, the possibility that patients might be excluded from transplantation poses a significant ethical issue. Conclusions: The use of personalized medicine within transplantation must be in the best interests of the patient. However, in spite of the use of such scientific data, a place must be retained for the clinical judgment that allows a physician to make decisions based on medical data, professional expertise and knowledge of the patient. An ethical reflection is necessary in order to address the possibility of patients being excluded, as well as the resolution of the equity/efficacy dilemma.
Sahnoune, Zakaria. "Vers une plateforme holistique de protection de la vie privée dans les services géodépendants." Thèse, 2018. http://hdl.handle.net/1866/21144.
Lafrance-Girard, Corinne. "Toxoplasma gondii dans la viande au détail au Canada : prévalence, quantification et facteurs de risque dans une perspective de santé publique." Thèse, 2017. http://hdl.handle.net/1866/19890.
Vermeys, Nicolas W. "Qualification et quantification de l'obligation de sécurité informationnelle dans la détermination de la faute civile." Thèse, 2009. http://hdl.handle.net/1866/3663.
In Quebec, as in most western jurisdictions, the duty to ensure information security, i.e. the obligation bestowed upon companies to protect the integrity, confidentiality and availability of information, stems from a series of legal dispositions which, rather than imposing a certain conduct or the use of given technologies or processes, simply demand that "reasonable", "adequate", or "sufficient" security measures be applied. However, in a field as nascent and complex as information security, where available solutions are numerous and case law is sparse, how can a company reliably predict the full extent of its duty? In other words, how can one establish what a reasonably prudent and diligent company would do in a field where laws, case law, and even customs fail to dictate precisely what level of diligence is sought by the legislator? The lack of legal certainty offered in such a case is obvious, and requires us to reconfigure the framework associated with the duty to ensure information security in order to identify its components and objectives. Such an endeavour begins with redefining the duty to ensure information security as a duty to reduce information-related risk to a socially acceptable level. Since security stems from risk management, it can therefore be said that risk is at the core of said duty. By analysing risk, i.e. by identifying the threats that aim to exploit a system's vulnerabilities, it becomes possible to specify which countermeasures could be useful and what costs they may entail. From that point, it is feasible, using the economic definition of negligence (which is based on the probability of a security breach and the damages incurred), to establish the optimal amount that should be invested in the purchase, upkeep and replacement of these countermeasures.
This type of analysis will allow companies to quantify, with a certain degree of precision, the extent to which they need to ensure information security by giving them a set of tools based on easily accessible data. Furthermore, said tools appear to be fully compatible with the current legal landscape.