Theses on the topic "Propagation risk"

Consult the 37 best theses for your research on the topic "Propagation risk".
Garg, Tushar. « Estimating change propagation risk using TRLs and system architecture ». Thesis, Massachusetts Institute of Technology, 2017. http://hdl.handle.net/1721.1/110134.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 27-28).
Risk estimation is a key activity for product development and technology integration programs. A number of decision support tools help project managers identify and mitigate risks in a project; however, few explicitly consider the effects of architecture on risk. We propose a novel risk estimation framework that takes the system architecture into account. Starting from traditional project management literature, we define risk as a combination of likelihood and impact. We use Technology Readiness Levels as our measure of likelihood, and, given that change propagates through interfaces, we use connectivity-related metrics to estimate impact. To analyze connectivity, we model systems as networks of nodes and edges and calculate centrality metrics. This framework is applied to an industry example, and we visualize the data in different formats to aid analysis. The insights gained from this analysis are discussed, and we conclude that the risk estimation framework provides estimates that are in line with the experience of engineers at the company.
by Tushar Garg.
S.M. in Engineering and Management
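The abstract above combines a TRL-based likelihood with a connectivity-based impact. As a rough illustration of that idea (not the thesis's actual metrics), a minimal sketch using degree centrality on a toy component network; the component names, TRL values, and edges are invented:

```python
# Sketch: risk = likelihood x impact, with likelihood derived from TRL
# (low TRL -> high likelihood) and impact from network connectivity.

def degree_centrality(edges, nodes):
    """Fraction of other nodes each node is directly connected to."""
    deg = {n: 0 for n in nodes}
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    n = len(deg)
    return {k: v / (n - 1) for k, v in deg.items()}

def risk_scores(trl, edges):
    """Combine TRL-based likelihood, scaled over the 9-level TRL scale,
    with centrality-based impact."""
    centrality = degree_centrality(edges, trl.keys())
    return {c: ((9 - trl[c]) / 8) * centrality[c] for c in trl}

trl = {"sensor": 4, "bus": 9, "payload": 6}
edges = [("sensor", "bus"), ("payload", "bus")]
print(risk_scores(trl, edges))
```

Here the mature bus (TRL 9) scores zero risk despite its high connectivity, while the immature sensor dominates, which is the qualitative behavior the framework aims for.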
Selda, Konukcu. « Application Of Risk Management Process On Wave Propagation In Aerospace Medium ». Master's thesis, METU, 2006. http://etd.lib.metu.edu.tr/upload/3/12607606/index.pdf.
Ghadge, Abhijeet. « A systems thinking approach for modelling supply chain risk propagation ». Thesis, Loughborough University, 2013. https://dspace.lboro.ac.uk/2134/13561.
Kumar, Vikas. « Soft computing approaches to uncertainty propagation in environmental risk management ». Doctoral thesis, Universitat Rovira i Virgili, 2008. http://hdl.handle.net/10803/8558.
In the first part of this thesis, different uncertainty propagation methods have been investigated. The first methodology is a generalized fuzzy α-cut based on the concept of the transformation method. A case study of uncertainty analysis of pollutant transport in the subsurface has been used to show the utility of this approach, which shows superiority over conventional methods of uncertainty modelling. A second method is proposed to manage uncertainty and variability together in risk models. The new hybrid approach combining probabilistic and fuzzy set theory is called Fuzzy Latin Hypercube Sampling (FLHS). An important property of this method is its ability to separate randomness and imprecision to increase the quality of information. A fuzzified statistical summary of the model results gives indices of sensitivity and uncertainty that relate the effects of variability and uncertainty of input variables to model predictions. The feasibility of the method is validated by analyzing the total variance in the calculation of incremental lifetime risks due to polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/F) for the residents living in the surroundings of a municipal solid waste incinerator (MSWI) in the Basque Country, Spain.
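The α-cut idea above can be illustrated with a minimal sketch: a plain interval α-cut on a triangular fuzzy number, far simpler than the generalized transformation method of the thesis. The contaminant-transport function and the numbers are invented placeholders:

```python
def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (a, m, b) at membership level alpha."""
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

def propagate(f, tri, alphas, grid=101):
    """Propagate fuzziness through f: at each alpha level, sample the
    alpha-cut interval and take the min/max of f over the samples
    (adequate for a continuous f on a bounded interval)."""
    out = {}
    for alpha in alphas:
        lo, hi = alpha_cut(tri, alpha)
        xs = [lo + i * (hi - lo) / (grid - 1) for i in range(grid)]
        ys = [f(x) for x in xs]
        out[alpha] = (min(ys), max(ys))
    return out

# Fuzzy retardation factor R ~ triangular(1, 2, 4); pore velocity v = 0.5 / R.
cuts = propagate(lambda r: 0.5 / r, (1.0, 2.0, 4.0), [0.0, 0.5, 1.0])
```

Each α level yields an output interval; stacking the intervals over all α levels reconstructs the fuzzy output, which is the core of α-cut propagation.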
The second part of this thesis deals with the use of artificial intelligence techniques for generating environmental indices. The first paper focused on the development of a Hazard Index (HI) using persistence, bioaccumulation and toxicity properties of a large number of organic and inorganic pollutants. For deriving this index, Self-Organizing Maps (SOM) have been used, which provided a hazard ranking for each compound. Subsequently, an Integral Risk Index was developed taking into account the HI and the concentrations of all pollutants in soil samples collected in the target area. Finally, a risk map was elaborated by representing the spatial distribution of the Integral Risk Index with a Geographic Information System (GIS). The second paper improves on the first work. A new approach called the Neuro-Probabilistic HI was developed by combining SOM and Monte Carlo analysis. It considers the uncertainty associated with contaminants' characteristic values. This new index seems to be an adequate tool to be taken into account in risk assessment processes. In both studies, the methods have been validated through their implementation in the industrial chemical/petrochemical area of Tarragona.
The third part of this thesis deals with a decision-making framework for environmental risk management. In this study, an integrated fuzzy relation analysis (IFRA) model is proposed for risk assessment involving multiple criteria. The fuzzy risk-analysis model is proposed to comprehensively evaluate all risks associated with contaminated systems resulting from more than one toxic chemical. The model is an integrated view on uncertainty techniques based on multi-valued mappings, fuzzy relations and a fuzzy analytical hierarchical process. Integration of system simulation and risk analysis using the fuzzy approach made it possible to incorporate system modelling uncertainty and subjective risk criteria. In this study, it has been shown that a broad integration of fuzzy system simulation and fuzzy risk analysis is possible.
In conclusion, this study has broadly demonstrated the usefulness of soft computing approaches in environmental risk analysis. The proposed methods could significantly advance the practice of risk analysis by effectively addressing critical issues in uncertainty propagation.
Real-world problems, especially those involving natural systems, are complex and composed of many indeterminate components, which in many cases exhibit nonlinear relationships. The conventional models based on analytical techniques currently used to understand and predict the behavior of such systems can be very complicated and inflexible when dealing with the imprecision and complexity of a real-world system. Treating such systems means facing a high level of uncertainty as well as accounting for imprecision. Classical models based on numerical analysis and exact or binary-valued logic are characterized by precision and categorization and are classified as hard computing approaches. In contrast, soft computing techniques, such as probabilistic reasoning and artificial neural networks, are characterized by approximation and flexibility. Although in hard computing imprecision and uncertainty are undesirable properties, in soft computing the tolerance for imprecision and uncertainty is exploited to achieve tractability, low computational cost, effective communication, and a high Machine Intelligence Quotient (MIQ). This thesis explores the use of different soft computing approaches to handle uncertainty in environmental risk management. The work is divided into three sections comprising five papers.
Fang, Chao. « Modeling and Analysing Propagation Behavior in Complex Risk Network : A Decision Support System for Project Risk Management ». PhD thesis, Ecole Centrale Paris, 2011. http://tel.archives-ouvertes.fr/tel-01018574.
Esperon, Miguez Manuel. « Financial and risk assessment and selection of health monitoring system design options for legacy aircraft ». Thesis, Cranfield University, 2013. http://dspace.lib.cranfield.ac.uk/handle/1826/8062.
Xuan, Yunqing. « Uncertainty propagation in complex coupled flood risk models using numerical weather prediction and weather radars ». Thesis, University of Bristol, 2007. http://hdl.handle.net/1983/c76c4eb0-9c9e-4ddc-866c-9bbdbfa4ec25.
LARI, SERENA. « Multi scale heuristic and quantitative multi-risk assessment in the Lombardy region, with uncertainty propagation ». Doctoral thesis, Università degli Studi di Milano-Bicocca, 2009. http://hdl.handle.net/10281/7550.
Akiode, Olukemi Adejoke. « Examination and management of human African Trypanosomiasis propagation using geospatial techniques ». Thesis, Abertay University, 2014. https://rke.abertay.ac.uk/en/studentTheses/9419b401-6604-4530-9938-57ab03234e67.
Aksen, Ernest, Jacek Cukrowski and Manfred M. Fischer. « Propagation of Crises Across Countries : Trade Roots of Contagion Effects ». WU Vienna University of Economics and Business, 2001. http://epub.wu.ac.at/4235/1/WGI_DP_7801.pdf.
Texte intégralSeries: Discussion Papers of the Institute for Economic Geography and GIScience
Okhulkova, Tatiana. « Integration of uncertainty and definition of critical thresholds for CO2 storage risk assessment ». Thesis, Université Paris-Saclay (ComUE), 2015. http://www.theses.fr/2015SACLC021/document.
The main goal of the thesis is to define how uncertainty can be accounted for in the process of risk assessment for CO2 storage, and to quantify, by means of numerical models, the scenarios of leakage by lateral migration and through the caprock. The chosen scenarios are quantified using the system modeling approach, for which ad-hoc predictive numerical models are developed. A probabilistic parametric uncertainty propagation study using polynomial chaos expansion is performed. Matters of spatial variability are also discussed, and a comparison between homogeneous and heterogeneous representations of permeability is provided.
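As a toy illustration of the polynomial chaos idea mentioned above (a 1-D expansion in the probabilists' Hermite basis for a standard normal input, fitted by least squares), the model y = x² + x below is an invented stand-in, not the thesis's CO2 storage model:

```python
import numpy as np
from numpy.polynomial import hermite_e as He

rng = np.random.default_rng(0)
x = rng.standard_normal(20000)   # samples of the standard normal input
y = x**2 + x                     # invented model response

# Least-squares fit of y ~ c0*He_0(x) + c1*He_1(x) + c2*He_2(x).
coef = He.hermefit(x, y, deg=2)

# For a standard normal input, E[He_j He_k] = k! if j == k else 0,
# so the expansion gives the output statistics directly:
mean = coef[0]                           # E[Y]
var = coef[1]**2 * 1 + coef[2]**2 * 2    # Var[Y] = sum_k c_k^2 * k!
```

Since x² + x = He₂(x) + He₁(x) + He₀(x), the fit recovers coefficients close to (1, 1, 1), giving E[Y] = 1 and Var[Y] = 3 without any further sampling of the model.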
Zimmerman, Lora Leigh. « Propagation of Juvenile Freshwater Mussels (Bivalvia : Unionidae) and Assessment of Habitat Suitability for Restoration of Mussels in the Clinch River, Virginia ». Thesis, Virginia Tech, 2003. http://hdl.handle.net/10919/10055.
Texte intégralMaster of Science
Perrin, Guillaume. « Random fields and associated statistical inverse problems for uncertainty quantification : application to railway track geometries for high-speed trains dynamical responses and risk assessment ». PhD thesis, Université Paris-Est, 2013. http://pastel.archives-ouvertes.fr/pastel-01001045.
Texte intégralJaber, Hadi. « Modeling and analysis of propagation risks in complex projects : application to the development of new vehicles ». Thesis, Université Paris-Saclay (ComUE), 2016. http://www.theses.fr/2016SACLC022/document.
The management of complex projects requires orchestrating the cooperation of hundreds of individuals from various companies, professions and backgrounds, working on thousands of activities, deliverables, and risks. Moreover, these numerous project elements are increasingly interconnected, and no decision or action is independent. This growing complexity is one of the greatest challenges of project management and one of the causes of project failure in terms of cost overruns and time delays. For instance, in the automotive industry, increasing market orientation and the growing complexity of automotive products have changed the management structure of vehicle development projects from a hierarchical to a networked structure, including the manufacturer but also numerous suppliers. Dependencies between project elements increase risks, since problems in one element may propagate to other directly or indirectly dependent elements. Complexity generates a number of phenomena, positive or negative, isolated or in chains, local or global, that interfere to varying degrees with the convergence of the project towards its goals. The aim of the thesis is thus to reduce the risks associated with the complexity of vehicle development projects by increasing the understanding of this complexity and the coordination of project actors. To do so, a first research question is how to prioritize actions to mitigate complexity-related risks. A second research question is how to organize and coordinate actors in order to cope efficiently with the previously identified complexity-related phenomena. The first question is addressed by modeling project complexity and by analyzing complexity-related phenomena within the project, at two levels. First, a high-level factor-based descriptive modeling is proposed. It makes it possible to measure and prioritize project areas where complexity may have the most impact.
Second, a low-level graph-based modeling is proposed, based on finer modeling of project elements and interdependencies. Contributions have been made to the complete modeling process, including the automation of some data-gathering steps, in order to increase performance and decrease effort and error risk. These two models can be used sequentially: a first high-level measure can focus attention on certain areas of the project, where the low-level modeling is then applied, with a gain in overall efficiency and impact. Based on these models, contributions are made to anticipate the potential behavior of the project. Topological and propagation analyses are proposed to detect and prioritize critical elements and critical interdependencies, while enlarging the sense of the polysemous word "critical". The second research question is addressed by introducing a clustering methodology that proposes groups of actors in new product development projects, especially for actors involved in many deliverable-related interdependencies in different phases of the project life cycle. This increases coordination between interdependent actors who are not always formally connected via the hierarchical structure of the project organization, and brings the project organization closer to what a networked structure should be. The automotive-based industrial application has shown promising results for the contributions to both research questions. Finally, the proposed methodology is discussed in terms of genericity and seems applicable to a wide range of complex projects for decision support.
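A minimal sketch of the grouping idea described above: actors who share deliverables are transitively merged into groups with a union-find structure. This is far simpler than the clustering methodology of the thesis, and the actor and deliverable names are invented:

```python
def actor_groups(deliverables):
    """deliverables: {deliverable: [actors]} -> groups of actors that are
    transitively linked through shared deliverables (union-find)."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    for actors in deliverables.values():
        for a in actors:
            find(a)                          # register every actor
        for a in actors[1:]:
            parent[find(a)] = find(actors[0])  # merge all co-workers

    groups = {}
    for a in list(parent):
        groups.setdefault(find(a), set()).add(a)
    return sorted(groups.values(), key=len, reverse=True)

groups = actor_groups({
    "door panel": ["body engineer", "supplier A"],
    "wiring loom": ["supplier A", "electrical engineer"],
    "paint spec": ["paint engineer"],
})
```

Here "supplier A" bridges two deliverables, so the body and electrical engineers land in the same coordination group even though they share no deliverable directly.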
Favier, Philomène. « Une approche intégrée du risque avalanche : quantification de la vulnérabilité physique et humaine et optimisation des structures de protection ». Thesis, Grenoble, 2014. http://www.theses.fr/2014GRENU051/document.
Long-term avalanche risk quantification for mapping and the design of defense structures is done in most countries on the basis of high-magnitude events. Such return period/level approaches, purely hazard-oriented, do not consider elements at risk (buildings, people inside, etc.) explicitly, and neglect possible budgetary constraints. To overcome these limitations, risk-based zoning methods and cost-benefit analyses have emerged recently. They combine the hazard distribution and vulnerability relations for the elements at risk. Hence, the systematic vulnerability assessment of buildings can lead to better quantification of the risk in avalanche paths. However, in practice, available vulnerability relations remain mostly limited to scarce empirical estimates derived from the analysis of a few catastrophic events. Besides, existing risk-based methods remain computationally intensive, and based on debatable assumptions regarding hazard modelling (choice of few scenarios, little consideration of extreme values, etc.). In this thesis, we tackle these problems by building reliability-based fragility relations to snow avalanches for several building types and the people inside them, and by incorporating these relations in a framework for risk quantification and optimal design of defense structures. In this way, we enrich the avalanche vulnerability and risk toolboxes with approaches of various complexity, usable in practice in different conditions, depending on the case study and on the time available to conduct the study. The developments made are detailed in four papers/chapters. In paper one, we derive fragility curves associated with different limit states for various reinforced concrete (RC) buildings loaded by an avalanche-like uniform pressure. Numerical methods to describe the RC behaviour consist of civil engineering abacuses and a yield line theory model, to make the computations as fast as possible.
Different uncertainty propagation techniques make it possible to quantify fragility relations linking pressure to failure probabilities, to study the weight of the different parameters, and to examine the different assumptions regarding the probabilistic modelling of the joint input distribution. In paper two, the approach is extended to more complex numerical building models, namely a mass-spring model and a finite element one. Hence, much more realistic descriptions of RC walls are obtained, which are useful for complex case studies for which detailed investigations are required. However, the idea is still to derive fragility curves with the simpler, faster-to-run, but well validated mass-spring model, in a "physically-based meta-modelling" spirit. In paper three, with various fragility relations for RC buildings at hand, we propose new relations linking the death probability of people inside them to the avalanche load. These two sets of fragility curves, for buildings and humans, are then exploited in a comprehensive risk sensitivity analysis. In this way, we highlight the gap that can exist between return period based zoning methods and acceptable risk thresholds. We also show the higher robustness of optimal design approaches to vulnerability relations on a typical dam design case. In paper four, we propose simplified analytical risk formulas based on extreme value statistics to quantify risk and perform the optimal design of an avalanche dam in an efficient way. A sensitivity study is conducted to assess the influence of the chosen statistical distributions and flow-obstacle interaction law, highlighting the need for precise risk evaluations to characterise well the tail behaviour of extreme runouts and the predominant patterns in avalanche-structure interactions.
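The notion of a fragility curve used throughout this abstract can be sketched by Monte Carlo: the probability of failure at each pressure level when the wall resistance is uncertain. The lognormal resistance (median 30 kPa, log-standard deviation 0.3) is an invented placeholder, not one of the thesis's RC models:

```python
import math
import random

random.seed(1)

def fragility(pressures, n=20000, median=30.0, log_std=0.3):
    """Monte Carlo fragility curve: P(failure) = P(resistance < pressure)
    for an uncertain lognormal wall resistance (units: kPa)."""
    res = [random.lognormvariate(math.log(median), log_std) for _ in range(n)]
    return [sum(r < p for r in res) / n for p in pressures]

curve = fragility([10.0, 30.0, 90.0])
```

The curve rises monotonically from near 0 to near 1 and passes through 0.5 at the median resistance, which is the defining shape of a fragility relation.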
Yang, Guanglin. « Geothermal Development Opportunity and Risk of Using Abandoned Oil-Gas Wells and Mines with MRI tests ». Master's thesis, Alma Mater Studiorum - Università di Bologna, 2022. http://amslaurea.unibo.it/25836/.
Texte intégralAnilkumar, A. K. « NEW PERSPECTIVES FOR ANALYZING THE BREAKUP, ENVIRONMENT, EVOLUTION, COLLISION RISK AND REENTRY OF SPACE DEBRIS OBJECTS ». Thesis, Indian Institute of Science, 2004. https://etd.iisc.ac.in/handle/2005/80.
Vikram Sarabhai Space Centre, Trivandrum
Anilkumar, A. K. « NEW PERSPECTIVES FOR ANALYZING THE BREAKUP, ENVIRONMENT, EVOLUTION, COLLISION RISK AND REENTRY OF SPACE DEBRIS OBJECTS ». Thesis, Indian Institute of Science, 2004. http://hdl.handle.net/2005/80.
In the space surrounding the earth there are two major regions where orbital debris causes concern: the Low Earth Orbits (LEO) up to about 2000 km, and the Geosynchronous Orbits (GEO) at an altitude of around 36000 km. The impact of debris accumulation is in principle the same in the two regions; nevertheless, the regions require different approaches and solutions, because the dominant perturbations differ: atmospheric drag governs orbital decay in LEO, while gravitational forces, including the earth's oblateness and luni-solar effects, dominate in GEO. In LEO it is generally known that the debris population dominates even the natural meteoroid population for object sizes of 1 mm and larger. This thesis focuses mainly on the LEO region. From the first satellite breakup in 1961 up to 01 January 2003, more than 180 spacecraft and rocket bodies are known to have fragmented in orbit. The resulting debris fragments constitute nearly 40% of the 9000 or more objects presently tracked and catalogued by USSPACECOM. The catalogued fragment count does not include the much more numerous fragments that are too small to be detected from the ground. Hence, in order to describe the trackable orbital debris environment, it is important to develop mathematical models to simulate the trackable fragments and later extend them to untrackable objects. Apart from the need to better characterize the orbital debris environment down to sub-millimeter particles, there is also a pressing need for simulation tools able to model in a realistic way the long-term evolution of space debris, to highlight areas that require further investigation, and to study the actual mitigation effects of space policy measures. The present thesis provides newer perspectives on five major issues in space debris modeling studies.
The issues are (i) breakup modeling, (ii) environment modeling, (iii) evolution of the debris environment, (iv) collision probability analysis, and (v) reentry prediction. Chapter 1 briefly describes the space debris environment and the issues associated with the growing space debris population. A literature survey of important earlier work on the five issues mentioned above is provided in Chapter 2. The new contributions of the thesis commence from Chapter 3, which proposes a new breakup model to simulate the creation of debris objects by explosion in LEO, named "A Semi Stochastic Environment Modeling for Breakup in LEO" (ASSEMBLE). This model is based on a study of the characteristics of the fragments from on-orbit breakups as provided in the TLE sets for the Indian PSLV-TES mission spent upper stage breakup. It turned out that, based on the physical mechanisms of the breakup process, the apogee and perigee heights (limited by the breakup altitude) closely fit suitable Laplace distributions, and the eccentricity follows a lognormal distribution. The location parameters of these distributions depend on the orbit of the parent body at the time of breakup, and their scale parameters on the intensity of the explosion. The distribution of the ballistic coefficient in the catalogue was also found to follow a lognormal distribution. These observations were used to arrive at the proper physical, aerodynamic, and orbital characteristics of the fragments. Subsequently the model was applied as an inverse problem and further validated against several more typical, well known historical on-orbit fragmentation events. All the simulated results compare quite well with the observations, both at the time of breakup and at a later epoch. The model is called semi-stochastic in nature since the size and mass characteristics have to be obtained from empirical relations; it is capable of simulating the complete breakup scenario.
A new stochastic environment model of the debris scenario in LEO, simple and impressionistic in nature and named SIMPLE, is proposed in Chapter 4. First, the distributions of the orbital elements of the debris, namely altitude, perigee height, eccentricity, and the ballistic coefficient values, for TLE sets of data in each of the years were analyzed to arrive at their characteristic probability distributions. It is observed that the altitude distribution of the number of fragments exhibits peaks, and it turned out that such a feature can best be modeled with a three-component mixture of Laplace distributions with eight parameters. No statistically significant variations could be observed in the parameters across the years. Hence it is concluded that the probability density function of the altitude distribution of the debris objects is in some kind of equilibrium and follows a three-component mixture of Laplace distributions. For the eccentricity 'e' and the ballistic parameter 'B' values, the present analysis showed that they can be fitted acceptably well by lognormal distributions with two parameters. In the case of eccentricity, too, the describing parameter values do not vary much across the years. The parameters of the B distribution, however, show some trend across the years, which may perhaps be attributed to causes such as decay effects, miniaturization of space systems, and even uncertainty in the measurement data of B. In the absence of a definitive cause for this variation, it turns out to be best to take the most recent value as the model value. Lastly, the same kind of analysis has also been carried out with respect to the various inclination bands. Here the orbital parameters are analyzed with respect to the inclination bands, as is done in ORDEM (Kessler et al 1997, Liou et al 2001) for near circular orbits in LEO.
The five inclination bands considered here are 0-36 deg (ORDEM considers 19-36 deg and does not consider 0-19 deg), 36-61 deg, 61-73 deg, 73-91 deg, and 91-180 deg, and for each band the altitude, eccentricity, and B values were modeled. It is found that for the third band, models with a single Laplace distribution for altitude and lognormal distributions for eccentricity and B fit quite well. The altitude of the other bands is modeled using a three-component mixture of Laplace distributions, with 'e' and 'B' once again following lognormal distributions. The number of parameter values in SIMPLE is, in general, just 8 for each description of the altitude or perigee distributions, whereas in ORDEM96 it is more. The SIMPLE model closely captures all the peak densities without losing accuracy at other altitudes. Chapter 5 treats the evolution of the debris objects generated by on-orbit breakup. A novel approach is described, based on the propagation of an equivalent fragment in a three-dimensional bin of semi-major axis, eccentricity, and ballistic coefficient (a, e, B), together with a constant-gain Kalman filter technique. This new approach propagates the number density in a bin of 'a' and 'e' rapidly and accurately, without propagating each and every space debris object in the bin. It is able to assimilate the information from other breakups as well with the passage of time. Further, this approach expands the scenario to provide suitable equivalent ballistic coefficient values for the conglomeration of the fragments in the various bins. The heart of the technique is a constant-gain Kalman filter, which is optimal for tracking the dynamically evolving fragment scenario and further expands the scenario to provide time-varying equivalent ballistic coefficients for the various bins.
Chapter 6 presents a new approach for collision probability assessment utilizing the closed-form solution of Wiesel (1989) by way of a three-dimensional lookup table, which accounts only for the air drag effect with an exponential model of the atmosphere. This approach can serve as a reference collision probability assessment tool for the LEO debris cloud environment. It takes into account the dynamical behavior of the propagating debris objects, and the model utilizes a simple propagation for quick assessment of collision probability. This chapter also compares presently available collision probability assessment algorithms based on their complexities, application areas, and the sample spaces on which they operate. Further, a quantitative assessment of the collision probability estimates between the different presently available methods is carried out, and the obtained collision probabilities match qualitatively. Chapter 7 utilizes once again the efficient and robust constant-gain Kalman filter approach, which is able to handle the many uncertain, variable, and complex features of the scenario, to predict the reentry time of risk objects. The constant gain, obtained by using only a simple orbit propagator considering drag alone, is capable of handling the other modeling errors in a real-life situation. A detailed validation of the approach was carried out based on a few recently reentered objects, and comparisons of the results with the predictions of other agencies during IADC reentry campaigns are also presented. The final Chapter 8 provides conclusions based on the present work, together with suggestions for future efforts needed in the study of space debris. The application of the techniques evolved in the present work to other areas, such as atmospheric data assimilation and forecasting, is also suggested.
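The constant-gain filtering idea recurring in this abstract can be shown with a one-dimensional toy: a fixed gain blends a deliberately crude decay model with observations, absorbing the model error. The decay rates, gain value, and altitude data are invented placeholders, not the thesis's orbit propagator:

```python
def constant_gain_track(measurements, h0, model_decay, gain):
    """One-dimensional constant-gain filter: predict altitude with a fixed
    decay-per-step model, then correct with each observation."""
    h = h0
    estimates = []
    for z in measurements:
        h_pred = h - model_decay              # crude drag-only prediction
        h = h_pred + gain * (z - h_pred)      # fixed-gain correction
        estimates.append(h)
    return estimates

# True decay is 2 km/step but the model assumes only 1 km/step; the
# constant gain compensates for the model error using the observations.
true_alt = [300.0 - 2.0 * k for k in range(1, 21)]
est = constant_gain_track(true_alt, 300.0, 1.0, 0.5)
```

With gain 0.5, the estimation error satisfies e_k = 0.5(e_{k-1} + 1), so it settles at a bounded steady-state bias of 1 km instead of diverging, which is the robustness property exploited for reentry prediction.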
Bartlett, Alastair Ian. « Auto-extinction of engineered timber ». Thesis, University of Edinburgh, 2018. http://hdl.handle.net/1842/31052.
Texte intégralChandra, Johanes. « Analyses expérimentales de la réponse sismique non-linéaire du système sol-structure ». Thesis, Grenoble, 2014. http://www.theses.fr/2014GRENU023/document.
Texte intégralThe concentration of population in urban areas of seismic-prone regions can generate ever greater damage and losses. Seismic response in urban areas depends on site effects (direct amplification and nonlinearity of the soil) and on the coupling between the soil and structures (soil-structure and site-city interaction). Therefore, understanding urban seismology, that is, ground motion incorporating the urban environment, is critical to reducing damage. This requires the prediction of ground motion in urban areas, a fundamental element in the evaluation of seismic hazard. Amplification caused by the presence of sediments has been widely studied. However, the nonlinearity of the soil and the coupling between the ground and the structure are seldom integrated into the prediction of ground motion. Because of their complexity, these problems have been addressed separately. In this context, this dissertation analyzes the nonlinear response of the soil-structure system by integrating the nonlinearity of the soil and the soil-structure interaction. Two experimental studies were performed with the aim of providing a proxy that reflects the nonlinearity of the soil. The first is a centrifuge test that reproduces the response of soil and structures at reduced scale; the state of stress and strain is conserved by applying an artificial acceleration model. This test was performed at IFSTTAR Nantes in the framework of the ANR ARVISE project. Different configurations were tested, with and without buildings and under different stress levels, to analyze the response of the soil and structures. The second uses the vertical accelerometric networks of two sites in California: Garner Valley Downhole (GVDA) and the Wildlife Liquefaction Array (WLA), both managed by the University of California, Santa Barbara (UCSB), USA. The in-situ response is important since it describes the actual behavior of the site.
Information describing the site conditions is widely available, and the recorded earthquakes were used to test several levels of shaking and reconstruct the overall response of each site. In addition, the GVDA site is equipped with a Soil-Foundation-Structure-Interaction (SFSI) structure, which is aimed at studying soil-structure interaction problems. In both experiments, thanks to the vertical accelerometer networks in the ground and in the structure, we are able to apply the 1D wave propagation method to extract the response of these systems. The waves are treated as SH waves propagating in a 1D horizontal layer. Seismic interferometry by deconvolution is applied to extract the Impulse Response Function (IRF) of the 1D system. The variation of the elastic properties of the soil and the structure is thus analyzed under several magnitudes of shaking, including the variation with depth and the components of the total response of the structure, soil-structure interaction included. Finally, a deformation proxy is proposed to evaluate and predict the nonlinear response of the soil, the structure and the soil-structure interaction.
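The deconvolution interferometry step can be sketched as a regularized spectral division of the surface record by the downhole record. The water-level regularization constant and the synthetic records below are illustrative assumptions, not the thesis's processing parameters.

```python
import numpy as np

# Sketch of seismic interferometry by deconvolution: the surface record is
# deconvolved by the downhole record in the frequency domain (water-level
# regularization) to estimate the 1D impulse response function (IRF).
def impulse_response(surface, downhole, eps=1e-3):
    Us = np.fft.rfft(surface)
    Ud = np.fft.rfft(downhole)
    water = eps * np.max(np.abs(Ud)) ** 2            # regularization level
    H = Us * np.conj(Ud) / (np.abs(Ud) ** 2 + water)
    return np.fft.irfft(H, n=len(surface))
```

For two delta-like records offset by a travel time, the IRF peaks at the surface-to-downhole lag, which is how the wave travel time (and hence an apparent shear-wave velocity) is read off between sensor depths.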
Darmet, Natacha. « Prise en compte du risque de propagation de l'emballement thermique dans le développement des modules de batteries. Approche expérimentale tenant compte de la génération de gaz ». Electronic Thesis or Diss., Université Grenoble Alpes, 2024. http://www.theses.fr/2024GRALI024.
Texte intégralLithium-ion batteries are playing a prominent role in the energy transition, particularly in storage and transportation. The analysis of failure modes reveals that, despite a low probability, battery self-heating has consequences that require mitigation. Studying the phenomena that lead to the propagation of thermal runaway from one cell to neighboring cells is therefore essential. The key parameters involved in thermal runaway propagation within a battery module are evaluated in detail in this thesis, and different approaches based on a plurality of characterization methods are proposed. Still, some mechanisms need further development through specific measurements at each stage of the phenomenon. When incidents occur, the cell's materials react by producing gas, which mechanically affects the cell casing. An experimental device was developed to measure the internal pressure of the cell, allowing operando monitoring of this gas generation. The correlation of this measurement with the casing deformations also confirms the significant impact of electrode swelling on the casing's mechanical strength. At a critical temperature, the lithium-ion cell initiates thermal runaway, which leads to the ejection of materials (gas, flame, particles) at high temperature. Pressure sensors and fast cameras operating in different spectral ranges were used to analyze the jet dynamics, from its genesis in the cell to its ejection. Heat transfer then occurs between the ejecta and the surrounding cells, which can result in thermal runaway propagation. Calorimetry was used to characterize the first cell in order to predict its thermal exchanges in different configurations. Then, based on the knowledge of this energy release and of the jet dynamics, a calculation of the emitted heat flow is proposed. X-ray observations of the active material during thermal runaway propagation revealed that the degradation kinetics and the gas generation are linked.
They also stress the importance of trigger-cell particles in the thermal propagation risk. Lastly, thermal runaway propagation between rebuilt, charged all-solid-state battery cells was evaluated experimentally for the first time. This research demonstrated that propagation is feasible with these new battery technologies; the study must be pursued further.
Salines, Morgane. « Modélisation de la propagation du virus de l'hépatite E dans la filière porcine et évaluation de stratégies de réduction du risque d'exposition humaine ». Thesis, Rennes 1, 2019. http://www.theses.fr/2019REN1B039/document.
Texte intégralHepatitis E virus (HEV) is a zoonotic pathogen whose main reservoir in industrialised countries is pigs. This research project combined epidemiological studies, mathematical modelling and social sciences to propose levers for reducing the risk of human exposure to HEV through the consumption of pork products. Two experimental trials and one study under natural conditions highlighted the major role of immunomodulating co-infections in the dynamics of HEV infection in pigs, as these intercurrent pathogens led to chronic HEV infection and an increased risk of the virus being present in the liver, blood and muscles of slaughtered animals. The development of a within-herd, stochastic, individual-based, multi-pathogen model made it possible to identify both zootechnical and sanitary control measures for reducing the prevalence of the virus on farms. In addition, the design of a between-herd model enabled analysis of the factors responsible for the spread of the virus in a network of French farms. All these HEV control measures were submitted for the opinion of public and private organisations and of individual players in the pig sector (farmers, farming advisors, veterinarians) through social science approaches. Finally, this transversal and multidisciplinary project made it possible to define tangible and achievable lines of action for the management of HEV in the pig sector, while making significant methodological contributions to epidemiology and modelling.
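The stochastic transmission step at the heart of a within-herd model can be illustrated with a minimal chain-binomial (Reed-Frost style) sketch. The herd size, transmission probability `beta` and recovery probability `gamma` are hypothetical, and this deliberately omits the individual-based, multi-pathogen structure of the thesis's actual model.

```python
import random

# Minimal chain-binomial (Reed-Frost style) within-herd sketch with
# hypothetical per-contact transmission (beta) and recovery (gamma) probs.
def simulate_herd(S, I, R, beta, gamma, steps, seed=42):
    rng = random.Random(seed)
    history = [(S, I, R)]
    for _ in range(steps):
        p_inf = 1.0 - (1.0 - beta) ** I                     # per-susceptible risk
        new_inf = sum(rng.random() < p_inf for _ in range(S))
        new_rec = sum(rng.random() < gamma for _ in range(I))
        S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec
        history.append((S, I, R))
    return history
```

Running many seeded replicates of such a model is what allows control measures (e.g. reduced contact rates, altered batch management) to be compared by their effect on the distribution of prevalence over time.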
Boulanger, Xavier. « Modélisation du canal de propagation Terre-Espace en bandes Ka et Q/V : synthèse de séries temporelles, variabilité statistique et estimation de risque ». Thesis, Toulouse, ISAE, 2013. http://www.theses.fr/2013ESAE0009/document.
Texte intégralNowadays, the C and Ku bands used for fixed SATCOM systems are totally congested. However, end-user demand for high-data-rate multimedia services is increasing. Consequently, the use of higher frequency bands (Ka: 20 GHz and Q/V: 40/50 GHz) is under investigation. For frequencies higher than 5 GHz, radiowave propagation is strongly affected by tropospheric attenuation, of which rain is the most significant contributor. To compensate for the deterioration of the propagation channel, Fade Mitigation Techniques (FMT) are used. The lack of experimental data needed to optimize the real-time control loops of FMT leads to the use of rain attenuation and total attenuation time series synthesizers. The manuscript is a compilation of five articles. The first contribution is dedicated to the temporal modelling of total impairments. The second article provides significant improvements to the rain attenuation time series synthesizer recommended by the ITU-R. The last three contributions are a critical analysis and modelling of the variability observed in the first-order statistics used to validate propagation channel models. The variance of the statistical estimator of the complementary cumulative distribution functions of rainfall rate and rain attenuation is highlighted. A worldwide model parameterized in compliance with propagation measurements is proposed; it allows confidence intervals to be estimated and the risk on a required availability associated with a given propagation margin prediction to be quantified. This approach is extended to the variability of joint statistics, allowing the impact of site diversity techniques on system performance to be evaluated at small scale (a few km) and large scale (a few hundred km).
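The variance of the empirical CCDF estimator mentioned above can be illustrated with a simple normal-approximation interval, Var(P̂) = P̂(1-P̂)/N for N independent samples. This independence assumption is a simplification for the sketch; the thesis's model accounts for the correlation structure of real propagation measurements.

```python
import numpy as np

# Empirical CCDF of an attenuation sample with a normal-approximation
# confidence interval; assumes independent samples (a simplification).
def ccdf_with_ci(samples, thresholds, z=1.96):
    samples = np.asarray(samples, dtype=float)
    n = len(samples)
    p = np.array([(samples > t).mean() for t in thresholds])
    half = z * np.sqrt(p * (1.0 - p) / n)     # Var(P_hat) = p(1-p)/n
    return p, np.clip(p - half, 0, 1), np.clip(p + half, 0, 1)
```

The interval width shrinks as 1/sqrt(N), which is why short measurement campaigns translate into large uncertainty on the deep-fade tail of the distribution, and hence into risk on the availability guaranteed by a given margin.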
Boukraa, Lotfi. « Simulation of wireless propagation in a high-rise building ». Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2004. http://library.nps.navy.mil/uhtbin/hyperion/04Dec%5FBoukraa.pdf.
Texte intégralKaya, Yildirim. « Simulation of wireless propagation and jamming in a high-rise building ». Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2005. http://library.nps.navy.mil/uhtbin/hyperion/05Sep%5FKaya.pdf.
Texte intégralNekkab, Narimane. « Spread of hospital-acquired infections and emerging multidrug resistant enterobacteriaceae in healthcare networks : assessment of the role of interfacility patient transfers on infection risks and control measures ». Thesis, Paris, CNAM, 2018. http://www.theses.fr/2018CNAM1180/document.
Texte intégralThe spread of healthcare-associated infections (HAIs), particularly those caused by multidrug-resistant bacteria, through hospital networks is a major public health issue. Assessing the role played by inter-facility patient transfers in this spread could enable the development of new control measures. Identifying new control measures is particularly important for antibiotic-resistant bacteria such as carbapenemase-producing Enterobacteriaceae (CPE), for which treatment options are very limited. The use of individual contact network and inter-facility transfer data in mathematical modelling has brought these models closer to reality; however, they remain limited to a few hospital settings and a few pathogens. The objectives of this thesis were 1) to better understand the structure of French hospital networks and their impact on the spread of HAIs, and 2) to assess the role of transfers in the spread of CPE. French hospital networks are characterised by patient flows towards hubs and by two levels of hospital communities. The structure of the transfer network for patients with an HAI does not differ from that of the general patient transfer network. In recent years, the number of CPE episodes has increased in France, and predictions suggest this increase will continue, with seasonal peaks in October. This work also showed that, since 2012, patient transfers have played an increasingly important role in the spread of CPE in France.
Multiple propagation events linked to transfers are also increasingly observed. Consequently, the structure of the hospital network could serve as a basis for proposing new control strategies for HAIs in general, and for CPE in particular. Highly connected hospitals in large metropolitan areas and patient flows between local and regional communities should be considered when developing control measures coordinated across healthcare facilities.
Dutozia, Jérôme. « Espaces à enjeux et effets de réseaux dans les systèmes de risques ». Phd thesis, Université Nice Sophia Antipolis, 2013. http://tel.archives-ouvertes.fr/tel-00785772.
Texte intégralHedin, Svante. « Data Propagation and Self-Configuring Directory Services in a Distributed Environment ». Thesis, Linköping University, Department of Science and Technology, 2001. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-1105.
Texte intégralThe Swedish field of digital X-ray imaging has since several years relied heavily on distributed information systems and digital storage containers.
To ensure accurate and safe radiological reporting, the Swedish software firm eCare AB delivers a system called Feedback, the first and only quality-assurance IT support product of its kind. This thesis covers several aspects of the design and implementation of future versions of this software platform.
The focus lies on distributed directory services and models for secure and robust data propagation in TCP/IP networks. For data propagation, a new application, InfoBroker, has been designed and implemented to facilitate integration between Feedback and other medical IT support systems. The directory services, introduced in this thesis as the Feedback Directory Services, have been designed on the architectural level. A combination of CORBA and Java Enterprise Edition is suggested as the implementation platform.
Amitrano, Davide. « Emission acoustique des roches et endommagement : approches experimentale et numerique, application a la sismicite miniere ». Grenoble 1, 1999. http://www.theses.fr/1999GRE10002.
Texte intégralPicot, Alexis. « 2P optogenetics : simulation and modeling for optimized thermal dissipation and current integration Temperature rise under two-photon optogenetics brain stimulation ». Thesis, Sorbonne Paris Cité, 2018. http://www.theses.fr/2018USPCB227.
Texte intégralOver the past fifteen years, optogenetics has revolutionized neuroscience research by enabling control of neuronal circuits. The recent development of several illumination approaches, combined with new photosensitive proteins (opsins), has paved the way to neuronal control with single-cell precision. The ambition to use these approaches to activate tens, hundreds or thousands of cells in vivo has raised many questions, in particular concerning possible photoinduced damage and the optimization of the choice of the illumination/opsin pair. During my PhD, I developed an experimentally verified simulation that calculates, for all the illumination protocols in current use, the temperature rise in brain tissue due to the absorption of light. In parallel, I modeled, from electrophysiology recordings, the intracellular currents observed during these photostimulations for three different opsins, allowing them to be simulated. These models will allow researchers to optimize their illumination protocols to keep heating in the sample as low as possible, while helping to generate optimized photocurrent dynamics according to experimental requirements.
Bzduch, Pavel. « Využití počítačové grafiky v silnoproudé elektrotechnice ». Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2008. http://www.nusl.cz/ntk/nusl-217593.
Texte intégralMay, Daniel. « Transiente Methoden der Infrarot-Thermografie zur zerstörungsfreien Fehleranalytik in der mikroelektronischen Aufbau- und Verbindungstechnik ». Doctoral thesis, Universitätsbibliothek Chemnitz, 2015. http://nbn-resolving.de/urn:nbn:de:bsz:ch1-qucosa-163082.
Texte intégralIn this work a new failure-analysis method for the industrial application of new technologies in electronic packaging has been developed. The method is based on the interaction of thermal waves with defects. Its special features are that it is non-destructive, fast, high-resolution and highly temperature-sensitive thanks to the latest IR detectors. Fundamental studies regarding resolution and parasitic effects in the application were carried out considering industrial conditions, following a systematic approach with respect to complexity. This now enables a prediction of the expected test period for detecting buried defects, limits for the excitation pulse width (for a given defect depth) and the quantitative determination of the influence of parasitic paints. Methodologically, simulations and comparative experiments were always used together, with simple samples employed to isolate and separate out parasitic effects. Finally, the measurement system was successfully demonstrated on industrial applications. The developed measurement system is characterized by high flexibility: different problem-adapted excitation sources (internal and external excitation by numerous physical effects) are used. The measurement system currently consists of four main modules: the difference-image method, pulse thermography, and two variants of lock-in thermography. Together, the system is capable of detecting voids, delaminations and cracks in various fields of electronic packaging. It reaches temperature resolutions down to 5 mK and lateral resolutions down to 17 µm. This work sets a foundation for the industrial application of thermal failure analysis by showing and characterizing the limits of IR imaging.
Deobarro, Mikaël. « Etude de l'immunité des circuits intégrés face aux agressions électromagnétiques : proposition d'une méthode de prédiction des couplages des perturbations en mode conduit ». Thesis, Toulouse, INSA, 2011. http://www.theses.fr/2011ISAT0002/document.
Texte intégralWith the technological advances of recent decades, the complexity and operating speeds of integrated circuits have greatly increased. While these developments have reduced the dimensions and supply voltages of circuits, the electromagnetic compatibility (EMC) of components has been severely degraded. Identified as a technological bottleneck, EMC is now one of the main causes of circuit redesigns, because the mechanisms of noise generation and coupling are not sufficiently studied during design. This manuscript introduces a methodology for studying the propagation of electromagnetic disturbances through integrated circuits by measurements and simulations. To improve our knowledge of the propagation of electromagnetic interference (EMI) and of the coupling mechanisms through integrated circuits, we designed a test vehicle developed in the SMOS8MV® 0.25 µm technology from Freescale Semiconductor. In this circuit, several basic functions such as I/O buses and digital blocks have been implemented. Asynchronous on-chip voltage sensors have also been integrated on different supplies of the chip to analyze the propagation of disturbances injected on the supply pins and wires of the component (DPI and BCI injection). In addition, we propose various tools to facilitate the modeling and simulation of integrated-circuit immunity (PCB model extraction, injection-system modeling approaches, an innovative method to predict and correlate the levels of voltage/power injected during conducted immunity measurements, and a modeling flow). Each proposed tool and modeling method is evaluated on different test cases. To assess our modeling approach, we finally apply it to a digital block of our test vehicle and compare simulation results to various internal and external measurements performed on the circuit.
Jhao, Wun-Jie, et 趙文傑. « Using Malware for Threat Risk Analysis based on a Dynamic Taint Propagation Approach ». Thesis, 2015. http://ndltd.ncl.edu.tw/handle/49513444934381853966.
Texte intégral崑山科技大學
資訊管理研究所
103
Due to the popularity of 3G networks and the rapid growth of mobile devices, people are increasingly dependent on the network. However, the number of malware attacks on smartphones has multiplied several-fold over the past three years. To guarantee the mobile security of cloud computing services, cloud service providers (CSPs) use taint checking (TC) approaches to examine how a cloud app under development uses sensitive information before deploying it. Accordingly, this paper proposes a taint propagation analysis model incorporating a weighted spanning tree analysis scheme to track data with taint marking using several taint checking tools. In performing malware threat analysis against unspecified malware attacks, CSPs can use a TC approach to track the spread and scope of information flows between attack sources (malware) and to detect vulnerabilities of targeted network applications. To verify the proposed model, we analysed Android programs using dynamic taint propagation to assess the spread of, and risks posed by, suspicious apps connected to a cloud computing server. For probabilistic analysis of taint propagation, a risk and a defence capability are computed for each taint path, assisting a defender in recognising the attack results of network threats caused by malware infection and in estimating the losses of the associated taint sources.
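Per-path risk scoring over a taint-propagation graph can be sketched as follows. The graph shape, edge probabilities and multiplicative risk rule are illustrative assumptions for the sketch, not the paper's exact model.

```python
# Hypothetical sketch: each edge carries the probability that tainted data
# actually crosses it, and a path's risk is the product of its edge weights.
def taint_path_risks(graph, source, sink):
    """Enumerate simple taint paths from source to sink and score each."""
    risks = []
    def dfs(node, prob, seen):
        if node == sink:
            risks.append(prob)
            return
        for nxt, p in graph.get(node, ()):
            if nxt not in seen:                 # avoid cycles
                dfs(nxt, prob * p, seen | {nxt})
    dfs(source, 1.0, {source})
    return risks
```

On a toy graph `{'src': [('a', 0.5), ('b', 0.8)], 'a': [('sink', 0.5)], 'b': [('sink', 0.25)]}`, the two paths score 0.25 and 0.2, pointing the defender at the riskier flow first.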
LIN, ZHI-TING, et 林智婷. « A novel portfolio diversification and risk reduction strategy based on affinity propagation clustering algorithm ». Thesis, 2016. http://ndltd.ncl.edu.tw/handle/61185905112765890814.
Texte intégral國立交通大學
資訊管理研究所
104
In this paper, an intelligent portfolio selection method based on the Affinity Propagation clustering algorithm is proposed to address the stable-investment problem. The goal of this work is to minimize the volatility of the selected portfolio drawn from the component stocks of the S&P 500 index. Each stock is viewed as a node in a graph, and the similarity measurements between companies are used as the edge weights. The Affinity Propagation clustering algorithm solves this graph problem by repeatedly updating the responsibility and availability message-passing matrices. This research seeks the most representative and discriminative features for modeling stock similarity. The candidate features fall into four major categories: time-series covariance, technical indicators, previous return information and paired return values. Historical price and trading volume data are used to simulate portfolio selection and volatility measurement. After grouping the investment targets into a small set of clusters, the selection process chooses a fixed number of stocks from different clusters to form the portfolio. The experimental results show that, with proper similarity features, the proposed system can generate more stable portfolios through Affinity Propagation clustering than average cases with similar settings.
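The responsibility/availability message-passing updates mentioned above can be sketched directly in NumPy. The damping factor, iteration count and the toy similarity matrix used below are illustrative choices, not the paper's settings.

```python
import numpy as np

# NumPy sketch of Affinity Propagation message passing: responsibility R and
# availability A are updated alternately with damping until exemplars emerge.
def affinity_propagation(S, damping=0.5, iters=200):
    n = S.shape[0]
    R = np.zeros((n, n))
    A = np.zeros((n, n))
    idx = np.arange(n)
    for _ in range(iters):
        # Responsibility: r(i,k) = s(i,k) - max_{k' != k} [a(i,k') + s(i,k')]
        AS = A + S
        top = AS.argmax(axis=1)
        first = AS[idx, top]
        AS[idx, top] = -np.inf
        second = AS.max(axis=1)
        Rnew = S - first[:, None]
        Rnew[idx, top] = S[idx, top] - second
        R = damping * R + (1 - damping) * Rnew
        # Availability: a(i,k) = min(0, r(k,k) + sum_{i' not in {i,k}} max(0, r(i',k)))
        Rp = np.maximum(R, 0)
        Rp[idx, idx] = R[idx, idx]
        Anew = Rp.sum(axis=0)[None, :] - Rp
        diag = Anew[idx, idx].copy()
        Anew = np.minimum(Anew, 0)
        Anew[idx, idx] = diag
        A = damping * A + (1 - damping) * Anew
    return (A + R).argmax(axis=1)   # each point's chosen exemplar
```

For the portfolio use case, S would hold pairwise stock similarities (e.g. negative squared distance between feature vectors) with the diagonal "preference" set to, say, the median similarity; picking one stock per resulting cluster then yields the diversified portfolio.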
Rist, Stefan [Verfasser]. « Light propagation in ultracold atomic gases / vorgelegt von Stefan Rist ». 2010. http://d-nb.info/1010685287/34.
Texte intégralGENTILI, FILIPPO. « Multi-physics modelling for the safety assessment of complex structural systems under fire. The case of high-rise buildings ». Doctoral thesis, 2013. http://hdl.handle.net/11573/918045.
Texte intégral