Dissertations / Theses on the topic 'Probabilistic risk modelling'

Consult the top 18 dissertations / theses for your research on the topic 'Probabilistic risk modelling.'

1

Macey, P. "Probabilistic risk assessment modelling for passenger aircraft fire safety." Thesis, Cranfield University, 1997. http://hdl.handle.net/1826/4260.

Full text
Abstract:
This thesis describes the development of a computer simulation model for the investigation of airliner fire accident safety. The aim of the work has been to create a computer-based analysis tool that generates representative aircraft accident scenarios and then simulates their outcome in terms of passenger injuries and fatalities. The details of the accident scenarios are formulated to closely match the type of events that are known to have occurred in aircraft accidents over the last 40 years. This information has been obtained by compiling a database and undertaking detailed analysis of approximately 200 airliner fire accidents. In addition to utilising historical data, the modelling work has incorporated many of the key findings obtained from experimental research undertaken by the world's air safety community. An unusual feature of the simulation process is that all critical aspects of the accident scenario have been analysed and catered for in the formative stages of the programme development. This has enabled complex effects, such as cabin crash disruption, impact trauma injuries, fire spread, smoke incapacitation and passenger evacuation to be simulated in a balanced and integrated manner. The study is intended to further the general appreciation and understanding of the complex events that lead to fatalities in aircraft fire accidents. This is achieved by analysing all contributory factors that are likely to arise in real fire accident scenarios and undertaking quantitative risk assessment through the use of novel simulation methods. Future development of the research could potentially enable the undertaking of a systematic exploration and appraisal of the effectiveness of both current and future aircraft fire safety policies.
2

Crossland, Ross. "Risk in the development design." Thesis, University of Bristol, 1997. http://hdl.handle.net/1983/aa5c6f5c-8e74-44ab-a6da-545ec6d39cfd.

Full text
3

Barr, Gordon. "A probabilistic risk modelling methodology for the formative stages of engineering projects." Thesis, University of Bristol, 2004. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.402343.

Full text
4

Lieswyn, John. "Probabilistic Risk Analysis in Transport Project Economic Evaluation." Thesis, University of Canterbury. Civil and Natural Resources Engineering, 2012. http://hdl.handle.net/10092/7652.

Full text
Abstract:
Transport infrastructure investment decision making is typically based on a range of inputs such as social, environmental and economic factors. The benefit cost ratio (BCR), a measure of economic efficiency (“value for money”) determined through cost benefit analysis (CBA), is dependent on accurate estimates of the various option costs and net social benefits such as reductions in travel time, accidents, and vehicle operating costs. However, most evaluations are deterministic procedures using point estimates for the inputs and producing point estimates for the outputs. Transport planners have primarily focused on the cost risks and treat risk through sensitivity testing. Probabilistic risk analysis techniques are available which could provide more information about the statistical confidence of the economic evaluation outputs. This research project report investigated how risk and uncertainty are dealt with in the literature and guidelines. The treatment of uncertainty in the Nelson Arterial Traffic Study (ATS) was reviewed and an opportunity to apply risk analysis to develop probabilities of sea level rise impacting on the coastal road options was identified. A simplified transport model and economic evaluation case study based on the ATS was developed in Excel to enable the application of @RISK Monte Carlo simulation software. The simplifications mean that the results are not comparable with the ATS. Seven input variables and their likely distributions were defined for simulation based on the literature review. The simulation of seven variables, five worksheets, and 10,000 iterations takes about 30 seconds of computation time. The input variables in rank order of influence on the BCR were capital cost, car mode share, unit vehicle operating cost, basic employment forecast growth rate, and unit value of time cost. The deterministically derived BCR of 0.75 is associated with a 50% chance that the BCR will be less than 0.6, although this probability is partly based on some statistical parameters without an empirical basis. In practice, probability distribution fitting to appropriate datasets should be undertaken to better support probabilistic risk analysis conclusions. Probabilities for different confidence levels can be reported to suit the risk tolerance of the decision makers. It was determined that the risk analysis approach is feasible and can produce useful outputs, given a clear understanding of the data inputs and their associated distributions.
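As a rough illustration of the kind of Monte Carlo economic evaluation described in this abstract, the sketch below propagates assumed input distributions through a toy benefit model to a distribution of benefit cost ratios and reports P(BCR < 0.6). The variable names, distributions, trip numbers and dollar values are illustrative assumptions only (they are not the ATS figures), and numpy stands in for the @RISK add-in used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000  # iterations, matching the number quoted in the abstract

# Illustrative (assumed) input distributions -- not the ATS case-study values.
capital_cost   = rng.triangular(80e6, 100e6, 140e6, N)   # $
car_mode_share = rng.normal(0.80, 0.05, N)               # fraction of trips by car
voc_per_km     = rng.normal(0.30, 0.03, N)               # vehicle operating cost, $/km
value_of_time  = rng.normal(20.0, 2.0, N)                # $/h
growth_rate    = rng.normal(0.02, 0.005, N)              # employment growth per year

# Toy benefit model: annual benefits from vehicle-operating-cost and travel-time
# savings, discounted over a 30-year horizon at 6 % (all assumed for illustration).
years = np.arange(1, 31)
discount_sum = (1.06 ** -years).sum()
annual_benefit = (car_mode_share * 1.0e7                  # car trips per year (assumed)
                  * (0.02 * voc_per_km * 10.0             # VOC saving per trip (assumed)
                     + value_of_time * 0.05)              # time saving per trip (assumed)
                  * (1.0 + growth_rate) ** 15)            # mid-horizon growth adjustment
bcr = annual_benefit * discount_sum / capital_cost

print(f"mean BCR             : {bcr.mean():.2f}")
print(f"P(BCR < 0.6)         : {(bcr < 0.6).mean():.2%}")
print(f"5th-95th percentiles : {np.percentile(bcr, [5, 95]).round(2)}")
```

Reporting percentiles of the BCR distribution, rather than a single deterministic ratio, is what lets the risk tolerance of decision makers be reflected in the reported confidence level.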
5

Arnold, Patrick. "Probabilistic modelling of unsaturated slope stability accounting for heterogeneity." Thesis, University of Manchester, 2017. https://www.research.manchester.ac.uk/portal/en/theses/probabilistic-modelling-of-unsaturated-slope-stability-accounting-for-heterogeneity(fb3d214c-8a42-4a2c-81c2-bda45e9ae7af).html.

Full text
Abstract:
The performance and safety assessment of geo-structures is strongly affected by uncertainty; that is, both by a subjective lack of knowledge and by objectively present, irreducible unknowns. Due to uncertainty in the non-linear variation of the matric-suction-induced effective stress as a function of the transient soil-atmosphere boundary conditions, the unsaturated state of the subsoil is generally not accounted for in a deterministic slope stability assessment. Probability theory, which accounts for uncertainties quantitatively rather than using "cautious estimates" of loads and resistances, may help to partly bridge the gap between unsaturated soil mechanics and engineering practice. This research investigates the effect of uncertainty in soil property values on the stability of unsaturated soil slopes. Two 2D Finite Element (FE) programs have been developed and implemented into a parallelised Reliability-Based Design (RBD) framework, which allows for the assessment of the failure probability, failure consequence and parameter sensitivity, rather than a deterministic factor of safety. Utilising the Random Finite Element Method (RFEM) within a Monte Carlo framework, multivariate cross-correlated random property fields have been mapped onto the FE mesh to assess the effect of isotropic and anisotropic moderate heterogeneity on the transient slope response, and thus performance. The framework has been applied to a generic slope subjected to different rainfall scenarios. The performance was found to be sensitive to the uncertainty in the effective shear strength parameters, as well as to the parameters governing the unsaturated soil behaviour. The failure probability was found to increase most during prolonged rainfall events with a low precipitation rate. Nevertheless, accounting for the unsaturated state resulted in a higher slope reliability than when suction effects were not considered. In a heterogeneous deposit, failure is attracted to local zones of low shear strength, which, for an unsaturated soil, are a function of both the spatial variability of soil property values and the soil-water dynamics, leading to a significant increase in the failure probability near the end of the main rainfall event.
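The full RFEM analysis in the thesis couples random fields with transient unsaturated FE seepage and stability calculations; the sketch below is only a stripped-down illustration of the underlying probabilistic idea. It samples a spatially correlated cohesion field along a slope and estimates a probability of failure with a simple infinite-slope factor of safety, ignoring suction and pore pressure. The slope geometry, soil statistics and correlation length are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Slope and soil parameters (illustrative assumptions, not the thesis values)
beta  = np.radians(30.0)   # slope angle
z     = 3.0                # depth of potential slip plane, m
gamma = 18.0               # unit weight, kN/m^3
phi   = np.radians(25.0)   # effective friction angle (kept deterministic here)

# Spatially correlated cohesion field along the slope (1D exponential covariance)
n_pts, spacing, theta = 50, 1.0, 10.0        # points, spacing (m), correlation length (m)
x = np.arange(n_pts) * spacing
cov = (4.0 ** 2) * np.exp(-np.abs(x[:, None] - x[None, :]) / theta)  # sd = 4 kPa
L = np.linalg.cholesky(cov + 1e-9 * np.eye(n_pts))

n_sim, failures = 5000, 0
for _ in range(n_sim):
    c = 10.0 + L @ rng.standard_normal(n_pts)   # mean cohesion 10 kPa (assumed)
    c = np.clip(c, 0.1, None)                   # keep values physically sensible
    # Infinite-slope factor of safety at each point (dry, no suction included)
    fs = c / (gamma * z * np.sin(beta) * np.cos(beta)) + np.tan(phi) / np.tan(beta)
    failures += fs.min() < 1.0                  # failure attracted to the weakest zone

print(f"estimated probability of failure: {failures / n_sim:.3f}")
```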
6

Khan, Saad Ullah. "Exploring the effect of political risks in large infrastructure projects in politically unstable countries using a probabilistic modelling approach." Thesis, Queensland University of Technology, 2014. https://eprints.qut.edu.au/79325/1/Saad_Khan_Thesis.pdf.

Full text
Abstract:
This research aims to explore and identify political risks on a large infrastructure project in an exaggerated environment, to ascertain whether project managers can gather sufficient objective information to utilise risk modelling techniques. During the study, the author proposes a new definition of political risk; performs a detailed project study of the Neelum Jhelum Hydroelectric Project in Pakistan; implements a probabilistic model using the principle of decomposition and Bayes' theorem; and answers the question: was it possible for project managers to obtain all the relevant objective data needed to implement a probabilistic model?
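A minimal sketch of the kind of Bayesian update that underlies the decomposition approach mentioned above: a prior probability for a decomposed political risk event is revised when an indicator is observed. The event, the prior and the likelihoods are illustrative assumptions, not data from the Neelum Jhelum project.

```python
# Hedged sketch: Bayes' theorem applied to one decomposed political risk event.
# All numbers are assumed for illustration.

def bayes_update(prior, p_evidence_given_event, p_evidence_given_no_event):
    """Posterior P(event | evidence) via Bayes' theorem."""
    numerator = p_evidence_given_event * prior
    evidence = numerator + p_evidence_given_no_event * (1.0 - prior)
    return numerator / evidence

prior = 0.10        # assumed prior for a politically driven work stoppage this year
posterior = bayes_update(prior,
                         p_evidence_given_event=0.70,     # P(unrest reports | stoppage)
                         p_evidence_given_no_event=0.20)  # P(unrest reports | no stoppage)

print(f"prior  P(stoppage)                       = {prior:.2f}")
print(f"posterior P(stoppage | unrest observed)  = {posterior:.2f}")
```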
7

Pretorius, Samantha. "The effect of observation errors on parameter estimates applied to seismic hazard and insurance risk modelling." Diss., University of Pretoria, 2014. http://hdl.handle.net/2263/79774.

Full text
Abstract:
The research attempts to resolve which method of estimation is the most consistent for the parameters of the earthquake model, and how these different methods of estimation, as well as other changes in the earthquake model parameters, affect the damage estimates for a specific area. The research also investigates different methods of parameter estimation in the context of the log-linear relationship characterised by the Gutenberg-Richter relation. Traditional methods are compared to methods that take uncertainty in the underlying data into account. Alternative methods based on Bayesian statistics are investigated briefly. The efficiency of the feasible methods is investigated by comparing the results for a large number of synthetic earthquake catalogues for which the parameters are known and errors have been incorporated into each observation. In the second part of the study, the effects of changes in key parameters of the earthquake model on damage estimates are investigated. This includes an investigation of the different methods of estimation and their effect on the damage estimates. It is found that parameter estimates are affected by observation errors. If errors are not included in the method of estimation, the estimate is subject to bias. The nature of the errors determines the level of bias. It is concluded that uncertainty in the data used in earthquake parameter estimates is largely a function of the quality of the data that is available. The inaccuracy of parameter estimates depends on the nature of the errors that are present in the data. In turn, the nature of the errors in an earthquake catalogue depends on the method of compilation of the catalogue and can vary from being negligible, for single-source catalogues for an area with a sophisticated seismograph network, to fairly impactful, for historical earthquake catalogues that predate seismograph networks. Probabilistic seismic risk assessment is used as a catastrophe modelling tool to circumvent the problem of scarce loss data in areas of low seismicity and is applied in this study to the greater Cape Town region in South Africa. The results of the risk assessment demonstrate that seemingly small changes in underlying earthquake parameters resulting from the incorporation of errors can lead to significant changes in loss estimates for buildings in an area of low seismicity.
Dissertation (MSc)--University of Pretoria, 2014.
Insurance and Actuarial Science
MSc
Unrestricted
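The abstract above compares parameter estimates from synthetic catalogues with and without observation errors. The sketch below reproduces the flavour of that experiment for the Gutenberg-Richter b-value: it generates a synthetic catalogue, estimates b with the standard Aki maximum-likelihood formula, then repeats the estimate after adding Gaussian magnitude errors and re-truncating at the completeness magnitude. The true b-value, completeness magnitude and error standard deviation are assumed values, not those used in the dissertation.

```python
import numpy as np

rng = np.random.default_rng(1)

b_true, m_min, m_c, sigma = 1.0, 3.0, 4.0, 0.3   # assumed values
beta = b_true * np.log(10.0)

# Synthetic catalogue: Gutenberg-Richter (exponential) magnitudes above m_min
m_true = m_min + rng.exponential(1.0 / beta, 100_000)

def aki_b(mags, m_c):
    """Aki (1965) maximum-likelihood b-value for magnitudes >= m_c."""
    m = mags[mags >= m_c]
    return np.log10(np.e) / (m.mean() - m_c)

# Estimate from error-free magnitudes vs magnitudes with Gaussian observation error
m_obs = m_true + rng.normal(0.0, sigma, m_true.size)
print(f"b from true magnitudes    : {aki_b(m_true, m_c):.3f}")
print(f"b from observed magnitudes: {aki_b(m_obs, m_c):.3f}  (error sd = {sigma})")
```

Comparing the two printed values over many synthetic catalogues is the kind of efficiency test the abstract describes; the shift between them shows how ignoring observation errors biases the estimate.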
8

Impoco, Stefano. "Probabilistic analysis of the performance of barriers controlling the ignition of combustible gas in gas turbine air intakes." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2020.

Find full text
Abstract:
The frequent occurrence of hydrocarbon leaks underscores the need to investigate every possible contribution to fire and explosion risk in the oil and gas industry. Controlling potential ignition sources is therefore paramount for ensuring tolerable risk levels. Among potential ignitors, gas turbines (GTs) are regarded as one of the main contributors when employed for mechanical drive in the process area of offshore facilities, even though their behaviour when acting as live ignition sources has never been analysed in detail. This study investigated both the role that GTs play in fire and explosion risk and the effect of additional ignition-controlling barrier elements by examining a real case study. The Kamaleon FireEx™ – Risk and Barrier Management tool was used because it couples a CFD-based description of transient release dispersion with ignition probability modelling. After an initial optimisation of the grid resolution, a selection procedure was developed to define a base-case risk level. This result was analysed to highlight the time window in which ignitions occur and was used as the basis of comparison for the subsequent sensitivity studies. First, the contribution made by the installation of a GT was quantified. The effect of an alternative ignition probability model, representing either a detailed description of the GT behaviour when acting as an ignition source or the activation of an inert-gas injection system, was investigated thereafter. Results show that a quick response is needed to obtain a risk reduction. The influence of a shield wall built around the GT air intakes was then analysed; a covering effect was observed, which reduced the risk. Finally, since the wall also introduced a delaying effect, it was investigated whether this could be combined with the alternative ignition model. Results showed a further risk reduction and the dominant role that physical barriers play in this context.
9

PANIZZI, SILVIA. "Sfide e prospettive nella valutazione del rischio ambientale dei prodotti fitosanitari." Doctoral thesis, Università Cattolica del Sacro Cuore, 2017. http://hdl.handle.net/10280/19081.

Full text
Abstract:
This PhD thesis is a multidisciplinary work on the risk assessment of plant protection products, covering both legislative and scientific aspects. The first part of the thesis introduces the origin of risk assessment procedures with a wide view of the whole process of risk analysis to protect humans and the environment. The accent is put on emerging issues and trends, such as the appraisal of uncertainties and the need to integrate human and environmental impacts without ignoring socio-economic and behavioural factors. The second chapter deals with the origin and development of global risk assessment policies on pesticides. It focuses in particular on European policies, from the original Directive 91/414 to the current Regulation 1107/2009 and the application of the precautionary principle. A brief comparison with US approaches to risk assessment is also presented. The third chapter gives an overview of the risk assessment procedures that nowadays provide the highest achievable protection for the environment, starting with the definition of clear and specific protection goals. The fourth chapter addresses the issue of combined risk assessment of pesticides: current approaches for the evaluation of effects on non-target organisms are analysed. The last chapter is dedicated to the estimation of the environmental contamination following the application of copper-based fungicides sprayed on orchards, using MERLIN-Expo, a multimedia model developed within the FP7 EU project 4FUN. The performance of the MERLIN-Expo software in estimating the contamination by the metal is also analysed through a comparison with the currently used FOCUS standard models for the calculation of pesticide concentrations in surface water and sediment. Both deterministic and probabilistic simulations have been run; the latter allowed uncertainty and sensitivity assessments of the parameters used in the simulations.
10

Vyčítal, Václav. "Pravděpodobnostní přístup pro hodnocení zemnících soustav." Doctoral thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2020. http://www.nusl.cz/ntk/nusl-414160.

Full text
Abstract:
This dissertation deals with the application of a probabilistic approach to the safety assessment of earthing systems in distribution networks, especially for cases with common earthing of the high- and low-voltage sides of HV/LV distribution transformers. In these cases, the increased potential during a fault might be transferred from the high-voltage to the low-voltage network, and individuals from the public can thus be exposed to increased risk. For these cases, the thesis defines the expected touch scenarios together with the resulting risk imposed on individuals from the public. Based on the results, it seems that adopting a probabilistic approach for these earthing systems might be more suitable than the conventional deterministic worst-case approach. In accordance with the aims of the thesis, a thorough analysis of currently adopted probabilistic approaches was also carried out, and some possible new simplifications of these methodologies were identified. For example, it seems that modelling human body resistance by the full lognormal distribution is not strictly necessary, and similar results can be obtained when only the resistance for 50 % of the population is used together with the c3 and c4 fibrillation curves. Much of the work was also devoted to determining the possible uncertainty in the calculated risk of an evaluated earthing system, especially that due to inappropriate modelling of the earthing system. The influence of different earthing-system modelling methods, together with other parameters, on the calculated risk was investigated through a sensitivity analysis. Based on the results, using a more or less simplified modelling method may understate the resulting risk by about 40 % (about half an order of magnitude). On the other hand, changing the parameters directly related to the calculation of fibrillation probability can change the calculated risk by up to several orders of magnitude.
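A heavily simplified sketch of the probabilistic pattern described above: sample a lognormal human body resistance and an uncertain touch-voltage fraction, compute the body current during a fault, and combine the conditional probability of a dangerous current with an exposure probability. The fault current, resistances, touch factor, exposure probability and the single fixed current threshold are all illustrative assumptions; they are not the thesis values and are not taken from IEC fibrillation curves.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Illustrative (assumed) inputs
earth_fault_current = 300.0                 # A, fault current into the earthing system
earthing_resistance = 2.0                   # ohm, earthing system resistance
touch_factor = rng.uniform(0.05, 0.30, n)   # fraction of EPR appearing as touch voltage

# Lognormal human body resistance (assumed median ~1000 ohm)
body_r = rng.lognormal(mean=np.log(1000.0), sigma=0.35, size=n)
shoe_and_soil_r = 1000.0                    # additional series resistance, ohm (assumed)

epr = earth_fault_current * earthing_resistance      # earth potential rise, V
touch_voltage = touch_factor * epr
body_current = touch_voltage / (body_r + shoe_and_soil_r)

i_crit = 0.05                               # A, assumed threshold for the clearing time
p_danger_given_touch = (body_current > i_crit).mean()

p_touch_during_fault = 1e-4                 # assumed probability that someone is touching
print(f"P(dangerous current | touch) = {p_danger_given_touch:.4f}")
print(f"risk per fault               = {p_touch_during_fault * p_danger_given_touch:.2e}")
```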
11

Beisler, Matthias Werner. "Modelling of input data uncertainty based on random set theory for evaluation of the financial feasibility for hydropower projects." Doctoral thesis, Technische Universitaet Bergakademie Freiberg Universitaetsbibliothek "Georgius Agricola", 2011. http://nbn-resolving.de/urn:nbn:de:bsz:105-qucosa-71564.

Full text
Abstract:
The design of hydropower projects requires a comprehensive planning process in order to achieve the objective to maximise exploitation of the existing hydropower potential as well as future revenues of the plant. For this purpose and to satisfy approval requirements for a complex hydropower development, it is imperative at planning stage, that the conceptual development contemplates a wide range of influencing design factors and ensures appropriate consideration of all related aspects. Since the majority of technical and economical parameters that are required for detailed and final design cannot be precisely determined at early planning stages, crucial design parameters such as design discharge and hydraulic head have to be examined through an extensive optimisation process. One disadvantage inherent to commonly used deterministic analysis is the lack of objectivity for the selection of input parameters. Moreover, it cannot be ensured that the entire existing parameter ranges and all possible parameter combinations are covered. Probabilistic methods utilise discrete probability distributions or parameter input ranges to cover the entire range of uncertainties resulting from an information deficit during the planning phase and integrate them into the optimisation by means of an alternative calculation method. The investigated method assists with the mathematical assessment and integration of uncertainties into the rational economic appraisal of complex infrastructure projects. The assessment includes an exemplary verification to what extent the Random Set Theory can be utilised for the determination of input parameters that are relevant for the optimisation of hydropower projects and evaluates possible improvements with respect to accuracy and suitability of the calculated results
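The sketch below illustrates, in a very reduced form, how a random-set (Dempster-Shafer style) description of input uncertainty can be propagated into bounds on the financial feasibility of a hydropower scheme: each input is a small set of focal intervals with masses, and belief/plausibility bounds are computed for "NPV > 0". The focal intervals, masses, NPV model and all parameter values are invented for illustration and are not those of the thesis; the corner-evaluation shortcut is valid here only because the toy NPV function is monotone in each input.

```python
import itertools
import numpy as np

# Focal intervals with basic probability masses per input (masses sum to 1)
focal = {
    "Q":     [((40.0, 55.0), 0.6), ((30.0, 60.0), 0.4)],      # design discharge, m3/s
    "H":     [((110.0, 125.0), 0.7), ((100.0, 130.0), 0.3)],  # hydraulic head, m
    "price": [((55.0, 70.0), 0.5), ((45.0, 80.0), 0.5)],      # energy price, $/MWh
    "capex": [((120e6, 150e6), 0.6), ((110e6, 170e6), 0.4)],  # capital cost, $
}

def npv(Q, H, price, capex, eta=0.88, cf=0.55, years=40, rate=0.07):
    p_mw = 9.81 * eta * Q * H / 1000.0            # plant capacity, MW
    energy_mwh = p_mw * 8760.0 * cf               # annual energy
    annuity = (1 - (1 + rate) ** -years) / rate
    return energy_mwh * price * annuity - capex

bel, pl = 0.0, 0.0   # belief / plausibility of "NPV > 0"
for combo in itertools.product(*focal.values()):
    intervals, masses = zip(*combo)
    mass = np.prod(masses)
    corners = [npv(*pt) for pt in itertools.product(*intervals)]
    lo, hi = min(corners), max(corners)
    if lo > 0: bel += mass     # NPV certainly positive on this joint focal element
    if hi > 0: pl += mass      # NPV possibly positive on this joint focal element

print(f"Belief(NPV > 0)       = {bel:.2f}")
print(f"Plausibility(NPV > 0) = {pl:.2f}")
```

The gap between belief and plausibility is the signature of the information deficit the abstract describes; a purely deterministic appraisal collapses it to a single, possibly unjustified, number.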
12

Grant, Matthew. "Systems and reliability modelling for probabilistic risk assessment of improvised explosive device attacks." Thesis, 2018. http://hdl.handle.net/1959.13/1355397.

Full text
Abstract:
Research Doctorate - Doctor of Philosophy (PhD)
Improvised Explosive Devices, commonly known as IEDs, inspire terror in communities across the globe and continue to be a terrorist weapon of choice. IED attacks target infrastructure and population centres, including airports, buildings and sporting venues, using a broad array of devices, from vehicle-borne IEDs to small postal devices. They provide the media with sensational headlines and governments with tough challenges, balancing the electorate’s emotive needs against investing in projects on the basis of rigorous cost-benefit analysis. With increasing pressures on the economies of nations, spending on counter-terrorism is subject to greater scrutiny. Homeland security agencies are no longer exempt from government fiscal due diligence, needing to justify how their spending achieves best value for money. Probabilistic Risk Assessment (PRA) is a valuable tool that can assist in this endeavour. A significant issue surrounding the justification of counter-terrorism spending lies in how risk and cost-benefit studies are conducted. Much work in the counter-terrorism domain is informed by expert opinion and traditional risk analysis techniques. These approaches have been criticised, particularly because they do not account for terrorists being adaptive and intelligent adversaries. This research develops a multi-disciplinary approach to PRA for IED attack, based on a fusion of systems and military engineering techniques. It shows that IED attacks can be characterised by a combination of human factors and reliability modelling techniques, allowing many of the aspects surrounding IED attacks to be quantified and resolving some of the key challenges surrounding the use of PRA to assess the risks associated with terrorism. The PRA model that has been developed can be used to assess the risk reduction associated with IED attack countermeasures and to complement other risk assessment modelling techniques.
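To make concrete what a PRA of countermeasure effectiveness looks like in outline, the sketch below decomposes an attack into stages, multiplies stage probabilities into an annualised risk, and compares the baseline against a case where one stage is mitigated. This is a generic PRA pattern, not the thesis model; every probability and consequence value is an illustrative assumption.

```python
# Minimal PRA sketch: risk = P(attempt) * P(success | attempt) * consequence.
# All numbers are illustrative assumptions.

def attack_risk(p_attempt, stage_probs, consequence):
    """Annualised risk for a staged attack scenario."""
    p_success = 1.0
    for p in stage_probs:
        p_success *= p
    return p_attempt * p_success * consequence

baseline = {
    "p_attempt": 0.02,                # assumed attempts per year at the facility
    "stage_probs": [0.8, 0.7, 0.6],   # acquire device, penetrate security, detonate
    "consequence": 200e6,             # $ equivalent consequence (assumed)
}
risk_before = attack_risk(**baseline)

# A countermeasure (e.g. vehicle screening) assumed to halve the penetration stage
mitigated = dict(baseline, stage_probs=[0.8, 0.35, 0.6])
risk_after = attack_risk(**mitigated)

print(f"baseline risk : ${risk_before:,.0f} per year")
print(f"mitigated risk: ${risk_after:,.0f} per year")
print(f"risk reduction: ${risk_before - risk_after:,.0f} per year "
      f"(to be weighed against countermeasure cost)")
```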
13

Netherton, Michael David. "Probabilistic modelling of structural and safety hazard risks for monolithic glazing subject to explosive blast loads." Thesis, 2013. http://hdl.handle.net/1959.13/938509.

Full text
Abstract:
Research Doctorate - Doctor of Philosophy (PhD)
Explosions within urban areas can inflict great damage to building envelopes and facades, particularly glazed areas, following which fragments of broken glass can then cause damage to building interiors and pose significant safety hazards to occupants. The aim of this thesis is to develop a new probabilistic framework that can be used to quantify the risks of glass damage and glazing safety hazards associated with explosive blast loads; indeed, the framework has utility for all manner of blast-load/structural-response scenarios. In any blast-load scenario there is considerable uncertainty and variability associated with many parameters, such as: explosive mass, stand-off, net equivalent quantity, explosive shape, confinement, the inherent variability of a blast wave, errors in predicting blast-load parameters, and so forth. The new framework uses well-accepted methods of structural reliability and probability theory to undertake assessments of two topical blast load scenarios: (i) an aerially delivered military weapon (where collateral damage risks are paramount), and (ii) a terrorist-style vehicle-borne improvised explosive device (where the risk of any safety hazard is of interest). A new probabilistic blast load model propagates the uncertainty and variability through the computations to reveal estimates of probable blast-load values, such as: pressure, impulse and the duration time of a blast wave’s first positive pulse. These probabilistically derived values are compared against classical (deterministically calculated) blast-load predictions with a view to assess their degree of conservatism (or otherwise). The two blast load scenarios are then used within a new probabilistic glazing-response model that considers a typical 20-story urban structure containing normal facade glazing. The glazing system (both pre- and post-fracture) also contains uncertainty and variability in items such as: glass dimensions, the strength of glass, Poisson’s ratio, Young’s Modulus, the drag coefficients of different (and randomly sized/shaped) glass fragments, and so forth. Values of uncertainty and variability are again propagated through the calculations to reveal estimates of the risk of glazing failure and the risk of glazing safety hazards to building occupants. Different glazing options are considered (in a number of case studies) where risk, reliability and cost-benefit analyses allow comparisons to be made between the relative effectiveness of security measures, weapon selection, delivery method or other mitigation measures. The intent for the framework presented in this thesis is that it represents a rational approach to predicting blast damage risks which can then be used: (a) As a decision support tool to mitigate damage (risk-cost-benefit analysis), (b) By emergency services to predict the extent and likelihood of damage and casualty levels in contingency planning and emergency response simulations, (c) For collateral damage estimation for military planners (i.e., minimise risk of collateral damage when selecting military ordnance), and (d) In forensics to back-calculate charge mass based on the extent of observed damage and a known stand-off distance.
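The abstract describes propagating uncertainty in charge mass and stand-off through the blast-load calculation. The sketch below shows only the front end of such a propagation: it samples those two inputs and computes the distribution of the standard Hopkinson-Cranz scaled distance Z = R / W^(1/3), then compares it against a placeholder damage threshold. The distributions, their parameters and the threshold z_crit are assumptions standing in for the full pressure-impulse and glazing-response models of the thesis.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50_000

# Uncertain inputs for a vehicle-borne IED scenario (illustrative assumptions)
net_equiv_mass = rng.lognormal(np.log(400.0), 0.25, n)   # kg TNT equivalent
stand_off      = rng.normal(40.0, 5.0, n)                # m, distance to the facade
stand_off      = np.clip(stand_off, 5.0, None)

# Hopkinson-Cranz scaled distance, Z = R / W^(1/3), in m/kg^(1/3)
z = stand_off / net_equiv_mass ** (1.0 / 3.0)

# Placeholder criterion: glazing assumed to fail when Z drops below z_crit
# (a stand-in for a proper pressure-impulse check against glazing resistance).
z_crit = 6.0
p_glass_failure = (z < z_crit).mean()

print(f"median scaled distance : {np.median(z):.2f} m/kg^(1/3)")
print(f"P(Z < {z_crit})             : {p_glass_failure:.2%}")
```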
14

Tan, Samson. "A Dynamic, Probabilistic Fire Risk Model incorporating Technical, Human and Organizational Risks for High-rise Residential Buildings." Thesis, 2021. https://vuir.vu.edu.au/42814/.

Full text
Abstract:
Fire events in high-rise residential buildings pose threats to both property and human life, and upon investigation it is frequently revealed that the cause of a fire event is not simply due to technical errors. Often these investigations uncover human and organizational errors (HOEs) that contribute to fire risk and fire events. Many human factors identified in fire risk environments can be minimized through employee training and development, while organizational factors, such as safety culture, can be changed over time through transformational interventions that shift existing mindsets. Probabilistic risk analysis (PRA) methods are modelling tools that allow fire risk professionals to estimate risk by computing several scenarios of what can go wrong, the likelihood of events occurring, and the consequences of the events. PRA often assumes a fixed likelihood of events over the building design period, whereas this likelihood may change as fire safety measures age. PRA is an explicit methodology for complying with performance requirements of building codes, but existing PRA methods may underestimate safety risk levels by ignoring HOEs while focusing solely on technical risks and errors, as well as by not taking into account reliability changes over time. In this work, a systematic review identifies HOEs that can potentially affect risk estimates in fire safety modelling of high-rise buildings. The importance and uniqueness of high-rise buildings stems mainly from the special nature of these buildings, where fire-fighting techniques require different safety measures than in other industries. In addition, the height of high-rise buildings and the increased number of occupants result in longer evacuation times than in other types of buildings or industrial plants. Evacuation times are increased further when the number of stairways in these buildings is limited. A wide range of HOEs have been identified as impacting risk in various industries such as offshore oil production and nuclear plants, but not all these identified HOEs will be appropriate for high-rise buildings. Important factors are those that emerge consistently from different published sources, supported by quantitative case studies of events such as the Grenfell Tower fire in London and the Lacrosse building fire in Melbourne. The linking of published HOEs with errors identified from high-rise building fire case studies uncovers HOEs likely to influence risk estimates. Quantifications of the impact of HOEs on risk estimates in other industries indeed justify additional research and the inclusion of HOEs in risk estimates for high-rise buildings. This work uniquely connects HOEs from various industries to likely HOEs associated with risks in high-rise buildings to address an important gap in the literature. The research provides empirical quantitative studies, a theoretical framework, and guidelines demonstrating how HOE risks can be distilled to improve PRAs of fires in high-rise buildings. To further address the gap, this work proposes a comprehensive Technical-Human-Organizational Risk (T-H-O-Risk) methodology to enhance existing PRA approaches by quantifying human and organizational risks. The methodology incorporates Bayesian Network (BN) analysis of HOEs and System Dynamics (SD) modelling for dynamic characterization of risk variations over time in high-rise residential buildings.
Most current approaches assume that HOEs are independent of one another and do not explain the interactions among these variables. The integrated T-H-O-Risk model overcomes this limitation by measuring causal relationships among variables and quantifying HOEs such as staff training, fire drill practices, safety culture and building maintenance. The model addresses the underestimation of risk resulting from not following proper practices and regulations. Issues of selecting the fire safety measures needed to reduce risk to an acceptable level are examined while evaluating the efficacy of active systems that are sensitive to HOEs. The methodology utilizes the “as low as reasonably practicable” (ALARP) principle in comparing risk acceptance for different case studies, demonstrating the model’s value for risk reduction relative to the initial designs of high-rise residential buildings. By incorporating both BN and SD techniques, the T-H-O-Risk model developed in this research evaluates HOEs dynamically in an innovative and integrated quantitative risk framework. This is possible by incorporating factors that vary with time, since event tree/fault tree (ET/FT) and BN approaches alone cannot deal with the dynamic characteristics of the process variables and HOEs. The model includes risk variation over time, which is significantly better than contemporary methods that only provide static values of risk. Initially, three case studies are conducted with a limited number of scenarios for the purpose of validation, to demonstrate the application of this comprehensive approach to the designs of various high-rise residential buildings ranging from 18 to 24 stories. Societal risks are represented in F-N curves. Results show that, in general, fire safety designs that do not consider HOEs underestimate the overall risks significantly, by as much as 40% in some extreme cases. Furthermore, risks over time due to HOEs vary by as much as 30% over 10 years. A sensitivity analysis indicates that deficient training, poor safety culture and ineffective emergency plans have a significant impact on overall risk. Subsequently, the application of the T-H-O-Risk methodology was expanded to seven designs of high-rise residential buildings (including the earlier three) with 16 different technical solutions to quantify the impact of HOEs on different fire safety systems. The active systems considered are sprinklers, building occupant warning systems, smoke detectors, and smoke control systems. The results indicate that HOEs impact risks in active systems by approximately 20%; however, HOEs have a limited impact on passive fire protection systems. Large variations are observed in the reliability of active systems due to HOEs over time. Finally, sensitivity and uncertainty analyses of HOEs were carried out on three buildings selected from the above seven. The sensitivity analysis again indicates that deficient training, poor safety culture and ineffective emergency plans have a significant impact on overall risk. The model also identifies multiple cases where tenable conditions are breached. A detailed uncertainty analysis is carried out using a Monte Carlo approach to isolate critical parameters affecting the risk levels. This research has developed a novel approach to enhance fire risk assessment methods using a holistic quantification of technical, human, and organizational risks for high-rise residential buildings, which ultimately benefits future risk assessments by providing more precise estimates.
A significant contribution of this research involves the systematic identification of HOEs and their associated risks for consideration in future PRAs. By studying various trial designs, the impact of HOEs on fire safety systems is analyzed while demonstrating the robustness of the T-H-O-Risk methodology for high-rise buildings. The research lays foundations for next-generation building codes and risk assessment methods.
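To illustrate the Bayesian-network side of the T-H-O-Risk idea in miniature, the sketch below lets two human/organizational nodes (safety culture, maintenance adequacy) condition the on-demand reliability of a sprinkler system, and compares the marginalised reliability with the purely technical value a conventional PRA might use. The network structure and every probability are illustrative assumptions, not the thesis quantifications.

```python
# Tiny Bayesian-network-style enumeration: two HOE parents feeding one active system.
# All probabilities are illustrative assumptions.

p_good_culture = 0.7                       # P(safety culture = good)
p_good_maint = {True: 0.9, False: 0.5}     # P(maintenance adequate | culture good?)
p_sprinkler = {True: 0.95, False: 0.75}    # P(sprinkler operates | maintenance adequate?)

# Marginalise over the two parent nodes by enumeration
p_operates = 0.0
for culture_good, p_c in [(True, p_good_culture), (False, 1 - p_good_culture)]:
    for maint_ok in (True, False):
        p_m = p_good_maint[culture_good] if maint_ok else 1 - p_good_maint[culture_good]
        p_operates += p_c * p_m * p_sprinkler[maint_ok]

technical_only = p_sprinkler[True]         # value a purely technical PRA might assume
print(f"P(sprinkler operates), technical only : {technical_only:.3f}")
print(f"P(sprinkler operates), with HOEs      : {p_operates:.3f}")
```

The drop between the two printed reliabilities is the kind of HOE-driven effect the abstract reports for active systems; the System Dynamics layer of the thesis then lets such probabilities drift over time.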
15

Chu, James Yick Gay. "Synthesis and experimental validation of a new probabilistic strategy to minimize heat transfers used in conditioning of dry air in buildings with fluctuating ambient and room occupancy." Thesis, 2017. http://hdl.handle.net/2440/114256.

Full text
Abstract:
Steady-state unit-operations are globally used in chemical engineering. Advantages include ease of control and a uniform product quality. Nonetheless there will be naturally occurring, random (stochastic) fluctuations about any steady-state ‘set’ value of a process parameter. Traditional chemical engineering does not explicitly take account of these. This is because, generally, fluctuation in one parameter appears to be off-set by change in another – with the process outcome remaining apparently steady. However Davey and co-workers (e.g. Davey et al., 2015; Davey, 2015 a; Zou and Davey, 2016; Abdul-Halim and Davey, 2016; Chandrakash and Davey, 2017 a) have shown these naturally occurring fluctuations can accumulate and combine unexpectedly to leverage significant impact and thereby make apparently well-running processes vulnerable to sudden and surprise failure. They developed a probabilistic and quantitative risk framework they titled Fr 13 (Friday 13th) to underscore the nature of these events. Significantly, the framework can be used in ‘second-tier’ studies for re-design to reduce vulnerability to failure. Here, this framework is applied for the first time to show how naturally occurring fluctuations in peak ambient temperature (T₀) and occupancy (room traffic flows) (Lᴛ) can impact heat transfers for conditioning of room air. The conditioning of air in large buildings, including hotels and hospitals, is globally important (Anon., 2012 a). The overarching aim is to quantitatively ‘use’ these fluctuations to develop a strategy for minimum energy. A justification is that methods that permit quantitative determination of reliable strategies for conditioning of air can lead to better energy use, with potential savings, together with reductions in greenhouse gases (GHG). Oddly many buildings do not appear to have a quantitative strategy to minimize conditioning heat transfers. Wide-spread default practice is to simply use an on-off strategy i.e. conditioning-on when the room is occupied and conditioning-off, when un-occupied. One alternative is an on-only strategy i.e. leave the conditioner run continuously. A logical and stepwise combined theoretical-and-experimental, approach was used as a research strategy. A search of the literature showed that work had generally focused on discrete, deterministic aspects and not on mathematically rigorous developments to minimise overall whole-of-building conditioning heat transfers. A preliminary steady-state convective model was therefore synthesized for conditioning air in a (hotel) room (4.5 x 5.0 x 2.5, m) in dry, S-E Australia during summer (20 ≤ T₀ ≤ 40, °C) to an auto-set room bulk temperature of 22 °C for the first time. This was solved using traditional, deterministic methods to show the alternative on-only strategy would use less electrical energy than that of the default on-off for Lᴛ > 36 % (Chu et al., 2016). Findings underscored the importance of the thermal capacitance of a building. The model was again solved using the probabilistic Fr 13 framework in which distributions to mimic fluctuations in T₀ and Lᴛ were (reasonably) assumed and a new energy risk factor (p) was synthesized such that all p > 0 characterized a failure in applied energy strategy (Chu and Davey, 2015). Predictions showed on-only would use less energy on 86.6 % of summer days. Practically, this meant that a continuous on-only strategy would be expected to fail in only 12 of the 90 days of summer, averaged over the long term. 
It was concluded the Fr 13 framework was an advance over the traditional, deterministic method because all conditioning scenarios that can practically exist are simulated. It was acknowledged however that: 1) a more realistic model was needed to account for radiative heat transfers, and; 2) to improve predictive accuracy, local distributions for T₀ and Lᴛ were needed. To address these: 1) the model was extended mathematically to account for radiative transfers from ambient to the room-interior, and; 2) distributions were carefully-defined based on extensive historical data for S-E Australia from, respectively, Bureau of Meteorology (BoM) (Essendon Airport) and Clarion Suites Gateway Hotel (CSGH) (Melbourne) – a large (85 x 2-room suites) commercial hotel (latitude -37.819708, longitude 144.959936) – for T₀ and Lᴛ for 541 summer days (Dec. 2009 to Feb. 2015) (Chu and Davey, 2017 a). Predictions showed that radiative heat transfers were significant and highlighted that for Lᴛ ≥ 70 %, that is, all commercially viable occupancies, the on-only conditioning strategy would be expected to use less energy. Because findings predicted meaningful savings with the on-only strategy, ‘proof-of-concept’ experiments were carried out for the first time in a controlled-trial in-situ in CSGH over 10 (2 x 5 contiguous) days of summer with 24.2 ≤ T₀ ≤ 40.5, °C and 13.3 ≤ Lᴛ ≤ 100, %. Independent invoices (Origin Energy Ltd, or Simply Energy Ltd, Australia) (at 30 min intervals from nationally registered ‘smart’ power meters) for geometrically identical control and treated suites showed a mean saving of 18.9 % (AUD $2.23 per suite per day) with the on-only strategy, with a concomitant 20.7 % reduction (12.2 kg CO₂-e) in GHG. It was concluded that because findings supported model predictions, and because robust experimental SOPs had been established and agreed by CSGH, a large-scale validation test of energy strategies should be undertaken in the hotel. Commercial-scale testing over 77 contiguous days of summer (Jan. to Mar., 2016) was carried out in two, dimensionally-identical 2-room suites, with the same fit-out and (S-E) aspect, together with identical air-conditioner (8.1 kW) and nationally registered meters to automatically transmit contiguous (24-7) electrical use (at 30 min intervals) (n = 3,696) for the first time. Each suite (10.164 x 9.675, m floor plan) was auto-set to a bulk air temperature of 22 °C (Chu and Davey, 2017 b). In the treated suite the air-conditioner was operated on-only, whilst in the control it was left to wide-spread industry practice of on-off. The suites had (standard) single-glazed pane windows with heat-attenuating (fabric) internal curtains. Peak ambient ranged from 17.8 ≤ T₀ ≤ 39.1, °C. There were 32 days with recorded rainfall. The overall occupancy Lᴛ of both suites was almost identical at 69.7 and 71.2, % respectively for the treated and control suite. Importantly, this coincided with a typical business period for the CSGH hotel. Based on independent electrical invoices, results showed the treated suite used less energy on 47 days (61 %) of the experimental period, and significantly, GHG was reduced by 12 %. An actual reduction in electrical energy costs of AUD $0.75 per day (9 %) averaged over the period was demonstrated for the treated suite. It was concluded therefore that experimental findings directly confirmed the strategy hypothesis that continuous on-only conditioning will use less energy. 
Although the hypothesis appeared generalizable, and adaptable to a range of room geometries, it was acknowledged that a drawback was that extrapolation of results could not be reliably done because actual energy used would be impacted by seasons. The in-situ commercial-scale experimental study was therefore extended to encompass four consecutive seasons. The research aim was to provide sufficient experimental evidence (n = 13,008) to reliably test the generalizability of the on-only hypothesis (Chu and Davey, 2017 c). Ambient peak ranged from 9.8 ≤ T₀ ≤ 40.5, °C, with rainfall on 169 days (62 %). Overall, Lᴛ was almost identical at 71.9 and 71.7, % respectively, for the treated and control suite. Results based on independent electrical energy invoices showed the on-only strategy used less energy on 147 days (54 %) than the on-off. An overall mean energy saving of 2.68 kWh per suite per day (9.2 %) (i.e. AUD $0.58 or 8.0 %) with a concomitant reduction in indirect GHG of 3.16 kg CO₂-e was demonstrated. Extrapolated for the 85 x 2-room suites of the hotel, this amounted to a real saving of AUD $18,006 per annum - plus credit certificates that could be used to increase savings. Overall, it was concluded therefore that the on-only conditioning hypothesis is generalizable to all seasons, and that there appears no barrier to adaptation to a range of room geometries. Highly significantly, the methodology could be readily applied to existing buildings without capital outlays or increases in maintenance. A total of five (5) summative research presentations of results and findings were made to the General Manager and support staff of CSGH over the period to July 2017 inclusive (see Appendix I) that maintained active industry-engagement for the study. To apply these new findings, the synthesis of a computational algorithm in the form of a novel App (Anon., 2012 b; Davey, 2015 b) was carried out for the first time (Chu and Davey, 2017 d). The aim was to demonstrate an App that could be used practically to minimize energy in conditioning of dry air in buildings that must maintain an auto-set temperature despite the impact of fluctuations in T₀ and Lᴛ. The App was synthesized from the extensive experimental commercial-scale data and was applied to compute energy for both strategies from independently forecast T₀ and Lᴛ. Practical performance of the App was shown to be dependent on the accuracy of locally forecast T₀ and Lᴛ. Overall results predicted a saving of 2.62 kWh per 2-room suite per day ($47,870 per annum for CSGH) where accuracy of forecast T₀ is 77 % and Lᴛ is 99 %, averaged over the long term. A concomitant benefit was a predicted reduction in greenhouse emissions of 3.1 kg CO₂-e per day. The App appears generalizable – and importantly it is not limited by any underlying heat-model. Its predictive accuracy can be refined with accumulation of experimental data for a range of geo-locations and building-types to make it globally applicable. It was concluded that the App is a useful new tool to minimize energy transfers in conditioning of room dry air in large buildings – and could be readily developed commercially. Importantly, it can be applied without capital outlays or additional maintenance cost and at both design and analysis stages. This research is original and not incremental work. Results of this research will be of immediate benefit to risk analysts, heat-design engineers, and owners and operators of large buildings.
Thesis (Ph.D.) -- University of Adelaide, School of Chemical Engineering, 2018
16

Collins, Samuel. "A Novel FR 13 Risk Assessment of Corrosion of Pipeline Steel in De-Aerated Water." Thesis, 2018. http://hdl.handle.net/2440/120220.

Full text
Abstract:
Steady-state operations are used globally in chemical engineering. Advantages include ease of control and a more uniform product quality (Ghasem and Henda, 2008; McCabe et al, 2001). Importantly however, there will be naturally occurring, random (stochastic) fluctuations in parameter values about the ‘set’ mean when process control is inadequate. These are not addressed explicitly in traditional chemical engineering because they are not sufficient on their own to be considered transient (unsteady) and because, generally, fluctuations in one parameter are off-set by changes in others with plant output behaviour seeming to remain steady (Amundson et al., 1980; Sinnott, 2005; Zou and Davey, 2016). Davey and co-workers, however, have demonstrated these fluctuations can unexpectedly accumulate in one direction and leverage significant (sudden and surprise) change in output behaviour with failure in product or plant (e.g. Abdul-Halim and Davey, 2016; Zou and Davey, 2016; Chandrakash and Davey, 2017). To underscore the unexpected element of the failure event they titled their risk framework Fr 13 2 (Friday 13th Syndrome). Case studies of their probabilistic risk framework to 1-step operations include loss of thermal efficiency in a coal-fired boiler (Davey, 2015 a) and failure to remove whey deposits in Clean-In-Place (CIP) milk processing (Davey et al., 2015). More recently, to advance their risk framework for progressively, multi-step and complex (in the sense of ‘integrated’, not ‘complicated’) processes they demonstrated its usefulness to 2-step membrane fouling with combined ultrafiltration-osmotic distillation (UF-OD) (Zou and Davey, 2016), and; a 3-step microbiological raw milk pasteurization (Chandrakash and Davey, 2017). Findings overall revealed no methodological complications in application - and it was concluded the risk framework was generalizable (Zou and Davey, 2016; Chandrakash and Davey, 2017). A significant advantage of the framework is it can be used in ‘second-tier studies’ to reduce risk through simulations of intervention strategies and re-design of physical plant or operating practice. It can be applied at both synthesis and analysis stages. 2 see Appendix A for a definition of some important terms used in this research. Although the risk framework has been successfully applied to corrosive pitting of AISI 316L metal widely used in off-shore oil and gas structures (Davey et al., 2016) 3 it was not known if it could provide new insight into corrosion of metal, more specifically microbiologically influenced corrosion (MIC), a major problem globally that accounts for ~ 20 % of overall corrosion (Flemming, 1996). It is estimated to cost AUD$7 billion to Australia annually (Javaherdashti and Raman-Singh, 2001). A review of the literature showed that a thorough understanding of MIC has been slow to emerge, both because of the role of micro-organisms in corrosion and because of a lack of methodology to determine any impact of natural fluctuations in the internal pipe environment. Importantly, the insidious nature of MIC was known to pose a practical risk of failure of pipes used to transport wet-fluids. However, because modelling of direct MIC would be uniquely complex it was planned that a general model for corrosion should be synthesized and understood that could be extended. 
A limited research program was therefore undertaken with the aim of advancing the Fr 13 framework and gaining unique insight into how naturally occurring fluctuations in fluid temperature (T) and pH of the internal pipe environment can be transmitted and impact corrosion. A logical, stepwise approach was implemented as the research strategy. The initial model of Smith et al. (2011) was modified to simulate MIC caused by micro-organisms such as sulphate-reducing bacteria (SRB) on widely used ASTM A105 carbon-steel pipe corroded under steady-state, abiotic and synthetic conditions. This was solved using traditional, deterministic simulations to give a predicted, underlying corrosion rate (CR) of 0.5 mm yr-1 as impacted by internal pipe-fluid T and pH. Importantly, findings underscored the controlling importance of low pH on CR. This initial model was then simulated, for the first time, using the probabilistic Fr 13 framework (Collins et al., 2016) [4], in which distributions to mimic fluctuations in T (K) and pH in the pipe were (reasonably) assumed as truncated Normal, and a new corrosion risk factor (p) was synthesized such that all p > 0 characterized a CR failure, i.e. a corrosion rate greater than 0.5 mm yr-1. Truncated Normal distributions were used because they permitted T and pH to fluctuate randomly during process operations while limiting them to values that could practically occur. Predictions showed that 28.1 % of all corrosion of ASTM A105 pipe, averaged over the long term for a range of fluctuations 290.15 ≤ T ≤ 298.15 K and 4.64 ≤ pH ≤ 5.67, would in fact be greater than the underlying value despite a design margin of safety (tolerance) of 50 % CR, and would therefore be process failures (p > 0). Findings highlighted that corrosion was a combination of 'successful' and 'failed' operations. This insight is not available from traditional risk approaches, with or without sensitivity analyses. It was concluded that the Fr 13 framework was an advance over traditional, deterministic methods because all corrosion scenarios that can practically exist are simulated. It was concluded also that, if each simulation was (reasonably) thought of as one operational day, there would be (28.1/100 days × 365.25 days/year) ~103 corrosion failures in ASTM A105 pipe per year. However, it was acknowledged that, to enhance corrosion simulation, the free corrosion potential (Ecorr, V vs SCE), a key parameter in this initial model formulation, should more realistically be considered a combined function of the internal pipe-fluid T and pH; that this assumption should be tested; and that this would necessitate a trial-and-error simulation for corrosion rate (CR). It was also determined that the truncations used for T and pH were too restrictive for off-shore oil processing (Arnold and Stewart, 1999; J. Y. G. Chu, Upstream Production Services Pty Ltd., Australia, pers. comm.). To address this, the initial model was extended mathematically for the first time, and Fr 13 risk simulations were carried out using spread-sheeting techniques utilizing the Solver function of Microsoft Excel™.

A significant advantage was that the distributions defining the naturally occurring fluctuations in T and pH could be entered, viewed, copied, pasted and manipulated as Excel formulae. Predictions showed (Collins and Davey, 2018) [5] an underlying corrosion rate CR = 0.45 mm yr-1 (a change of approximately 10 %) when the design margin of safety (tolerance) was reduced from 50 % to a more realistic 20 % for the improved model. This is significant because the tolerance of a model should be as low as can be accepted; higher tolerances can imply that the process is safer than it actually is. Fr 13 simulations showed that 43.6 % of all corrosion of internal ASTM A105 pipe, averaged over the long term for a range of realistic fluctuations 282.55 ≤ T ≤ 423.75 K and 4.12 ≤ pH ≤ 6.18, would be deemed process pipe-failures (p > 0). This translates to ~160 corrosion failures in ASTM A105 pipe per year, averaged over the long term; however, these would not be expected to be equally spaced in time. Findings were used in investigative 'second-tier' studies to explore possible intervention strategies to reduce vulnerability to corrosion and to improve plant design and safety. For example, repeat Fr 13 simulations revealed that, for a fixed mean value of T = 353.15 K, a decrease in pH from 5.15 to 4.5 resulted in an increase in carbon-steel pipe corrosion of ~1.55 mm yr-1, i.e. a ~347 % increase. This implied that pipe vulnerability to Fr 13 corrosion failure could be practically minimised by adding bases such as potassium hydroxide or sodium carbonate (Kemmer, 1988). However, if the pH is too high, anions in the pipe-fluid could precipitate and form insoluble mineral scales, leading to fouling (Pichtel, 2016).

It is acknowledged that the present research is limited to an abiotic system, i.e. one without micro-organism kinetics. A justification is that the models presented in this research should be seen as a 'starting point' which could be expanded in later iterations to include: biotic model components, such as the simple bacterial kinetics in the predictive MIC model of Maxwell and Campbell (2006); other species involved in MIC, such as sulphates, chlorides and hydrogen sulphide (H2S); different metals/alloys used in pipe equipment where MIC can be found, e.g. copper or zinc (Roberge, 2000); or a 'global' model, i.e. two or more connected unit-operations (Chandrakash and Davey, 2017). (A global model, however, might not be applicable because MIC can be initiated at localized sites (Roberge, 2000).) It is concluded that these thesis findings nevertheless significantly enhance understanding of the factors that lead to excessive corrosion rates in ASTM A105 pipes. It is concluded also that the Fr 13 risk framework appears generalizable to a range of micro-organism-metal systems and is an advance over existing risk and hazard assessments. If properly developed, it is thought that this risk technique could be adopted as a new design tool for steady-state unit-operations in both the design and synthesis stages and to increase understanding of MIC behaviour and outcomes. This research is original and not incremental work. Results and findings will be of immediate benefit and interest to a range of risk analysts, and to a broad range of practical operations involving carbon-steel pipe flows.

[3] This research was a Finalist, IChemE Global Awards 2016, Innovative Product, Manchester, UK, Nov.
[4] Collins, S.D., Davey, K.R., Chu, J.Y.G., O'Neill, B.K., 2016. A new quantitative risk assessment of Microbiologically Influenced Corrosion (MIC) of carbon steel pipes used in chemical engineering. In: CHEMECA 2016: Chemical Engineering – Regeneration, Recovery and Reinvention, Sept. 25-28, Adelaide, Australia, paper 3386601. ISBN: 9781922107831.
[5] Collins, S.D., Davey, K.R., 2018. A novel Fr 13 risk assessment of corrosion of carbon-steel pipe in de-aerated water. Chemical Engineering Science – submitted CES-D-18-00449, Feb.
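To make the Fr 13 simulation logic described in this abstract concrete, the following is a minimal Monte Carlo sketch in Python. It assumes truncated Normal fluctuations in T and pH and flags a failure whenever the simulated corrosion rate exceeds the underlying rate plus the design tolerance (one plausible reading of p > 0). The corrosion-rate correlation cr_model() and all distribution parameters are hypothetical stand-ins for illustration only; they are not the thesis model and are not calibrated to reproduce the reported 28.1 % or 43.6 % failure fractions.

```python
# Illustrative Fr 13-style Monte Carlo sketch (not the thesis code).
# Assumptions: truncated Normal fluctuations in T and pH; a hypothetical
# corrosion-rate correlation cr_model(); a failure is counted when the
# simulated CR exceeds the underlying rate plus the design tolerance.
import numpy as np

rng = np.random.default_rng(1)
N = 100_000  # each sample treated as one operational day

def sample_truncated_normal(mean, sd, lo, hi, size):
    """Draw Normal samples, rejecting values outside the practical limits."""
    out = np.empty(0)
    while out.size < size:
        draw = rng.normal(mean, sd, size)
        out = np.concatenate([out, draw[(draw >= lo) & (draw <= hi)]])
    return out[:size]

T = sample_truncated_normal(294.15, 2.0, 290.15, 298.15, N)   # fluid temperature, K
pH = sample_truncated_normal(5.15, 0.25, 4.64, 5.67, N)       # fluid pH

def cr_model(T, pH):
    """Hypothetical corrosion-rate correlation (mm/yr): CR rises as pH falls
    and T rises; scaled to give 0.5 mm/yr at the mean T and pH."""
    return 0.5 * (5.15 / pH) ** 6 * np.exp(0.1 * (T - 294.15))

CR_underlying = 0.5   # deterministic (underlying) corrosion rate, mm/yr
tolerance = 0.50      # design margin of safety on CR

p = cr_model(T, pH) - (1.0 + tolerance) * CR_underlying  # risk factor: p > 0 is a failure
failure_fraction = np.mean(p > 0)
print(f"failed operations: {100 * failure_fraction:.1f} %")
print(f"expected corrosion failures per year: {failure_fraction * 365.25:.0f}")
```

Swapping in the wider truncation limits and the 20 % tolerance reported for the extended model only requires changing the constants above; the sampling and failure-counting logic stays the same.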
Thesis (MPhil) -- University of Adelaide, School of Chemical Engineering & Advanced Materials, 2018
APA, Harvard, Vancouver, ISO, and other styles
17

Schmidt, Philip J. "Addressing the Uncertainty Due to Random Measurement Errors in Quantitative Analysis of Microorganism and Discrete Particle Enumeration Data." Thesis, 2010. http://hdl.handle.net/10012/5596.

Full text
Abstract:
Parameters associated with the detection and quantification of microorganisms (or discrete particles) in water such as the analytical recovery of an enumeration method, the concentration of the microorganisms or particles in the water, the log-reduction achieved using a treatment process, and the sensitivity of a detection method cannot be measured exactly. There are unavoidable random errors in the enumeration process that make estimates of these parameters imprecise and possibly also inaccurate. For example, the number of microorganisms observed divided by the volume of water analyzed is commonly used as an estimate of concentration, but there are random errors in sample collection and sample processing that make these estimates imprecise. Moreover, this estimate is inaccurate if poor analytical recovery results in observation of a different number of microorganisms than what was actually present in the sample. In this thesis, a statistical framework (using probabilistic modelling and Bayes’ theorem) is developed to enable appropriate analysis of microorganism concentration estimates given information about analytical recovery and knowledge of how various random errors in the enumeration process affect count data. Similar models are developed to enable analysis of recovery data given information about the seed dose. This statistical framework is used to address several problems: (1) estimation of parameters that describe random sample-to-sample variability in the analytical recovery of an enumeration method, (2) estimation of concentration, and quantification of the uncertainty therein, from single or replicate data (which may include non-detect samples), (3) estimation of the log-reduction of a treatment process (and the uncertainty therein) from pre- and post-treatment concentration estimates, (4) quantification of random concentration variability over time, and (5) estimation of the sensitivity of enumeration processes given knowledge about analytical recovery. The developed models are also used to investigate alternative strategies that may enable collection of more precise data. The concepts presented in this thesis are used to enhance analysis of pathogen concentration data in Quantitative Microbial Risk Assessment so that computed risk estimates are more predictive. Drinking water research and prudent management of treatment systems depend upon collection of reliable data and appropriate interpretation of the data that are available.
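The kind of analysis described here can be sketched with a small Bayesian calculation: a single observed count is modelled as Poisson with mean equal to recovery × concentration × volume, analytical recovery is given a distribution, and the posterior for concentration is obtained by marginalising over recovery. The Beta recovery distribution, the grid prior and all numbers below are illustrative assumptions and are not taken from the thesis.

```python
# Minimal sketch (not the thesis code) of Bayesian estimation of a
# microorganism concentration from one count, allowing for imperfect
# analytical recovery. All distributional choices here are assumptions.
import numpy as np

rng = np.random.default_rng(0)

count = 12          # organisms observed in the analysed sample
volume = 10.0       # volume analysed, L

# Sample-to-sample variability in analytical recovery (assumed Beta, mean ~0.4)
recovery = rng.beta(8.0, 12.0, size=5_000)

# Candidate concentrations (organisms per L) on a log-spaced grid
conc = np.logspace(-2, 2, 400)

# Poisson log-likelihood of the count for each (recovery, concentration) pair:
# count ~ Poisson(recovery * concentration * volume); constant terms dropped.
lam = recovery[:, None] * conc[None, :] * volume
log_like = count * np.log(lam) - lam
like = np.exp(log_like - log_like.max()).mean(axis=0)   # marginalise over recovery

# Posterior under a discrete uniform prior over the log-spaced grid (an assumption)
post = like / like.sum()
cdf = np.cumsum(post)
lo, med, hi = np.interp([0.025, 0.5, 0.975], cdf, conc)
print(f"posterior median ~ {med:.2f} /L, 95% credible interval ~ ({lo:.2f}, {hi:.2f}) /L")
```

Replicate counts, including zero-count (non-detect) samples, could be handled in the same way by summing their Poisson log-likelihoods before marginalising over recovery.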
APA, Harvard, Vancouver, ISO, and other styles
18

Beisler, Matthias Werner. "Modelling of input data uncertainty based on random set theory for evaluation of the financial feasibility for hydropower projects." Doctoral thesis, 2010. https://tubaf.qucosa.de/id/qucosa%3A22775.

Full text
Abstract:
The design of hydropower projects requires a comprehensive planning process in order to achieve the objective of maximising exploitation of the existing hydropower potential as well as future revenues of the plant. For this purpose, and to satisfy approval requirements for a complex hydropower development, it is imperative at the planning stage that the conceptual development contemplates a wide range of influencing design factors and ensures appropriate consideration of all related aspects. Since the majority of technical and economic parameters that are required for detailed and final design cannot be precisely determined at early planning stages, crucial design parameters such as design discharge and hydraulic head have to be examined through an extensive optimisation process. One disadvantage inherent to commonly used deterministic analysis is the lack of objectivity in the selection of input parameters. Moreover, it cannot be ensured that the entire existing parameter ranges and all possible parameter combinations are covered. Probabilistic methods utilise discrete probability distributions or parameter input ranges to cover the entire range of uncertainties resulting from an information deficit during the planning phase and integrate them into the optimisation by means of an alternative calculation method. The investigated method assists with the mathematical assessment and integration of uncertainties into the rational economic appraisal of complex infrastructure projects. The assessment includes an exemplary verification of the extent to which Random Set Theory can be utilised for the determination of input parameters relevant to the optimisation of hydropower projects, and evaluates possible improvements with respect to the accuracy and suitability of the calculated results.
The design of hydropower plants is a complex planning process with the objective of exploiting the existing hydropower potential as fully as possible and maximising the future economic returns of the plant. To achieve this, and at the same time to ensure that a complex hydropower project can obtain approval, it is imperative to capture a large number of influencing factors relevant to the conceptual design and to give them adequate consideration during the project planning phase. At early planning stages, most of the technical and economic parameters that are decisive for detailed design cannot usually be determined exactly, so that governing design parameters of the hydropower plant, such as discharge and hydraulic head, have to pass through an extensive optimisation process. A disadvantage of customary deterministic calculation approaches lies in the generally insufficient objectivity in determining the input parameters, and in the fact that coverage of the parameters over their full scatter and of all relevant parameter combinations cannot be ensured. Probabilistic methods use input parameters in the form of statistical distributions or ranges, with the aim of mathematically capturing the uncertainties that arise from the information deficit unavoidable in the planning phase and incorporating them into the calculation by applying an alternative calculation method. The investigated procedure helps to capture, objectively and mathematically, the vagueness resulting from an information deficit in the economic appraisal of complex infrastructure projects and to incorporate it into the planning process. An assessment and exemplary verification is carried out of the extent to which the Random Set method can be applied in determining the input quantities relevant to the optimisation process of hydropower plants, and of the extent to which this yields improvements in the accuracy and significance of the calculation results.
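As a rough illustration of how random-set (Dempster-Shafer) input uncertainty can be propagated into a financial-feasibility measure, the short Python sketch below assigns interval-valued focal elements with basic probability masses to two uncertain inputs and derives belief and plausibility bounds on a positive NPV. The intervals, masses, cash-flow figures and the npv() helper are invented for the example and are not taken from the thesis.

```python
# Illustrative random-set (Dempster-Shafer) propagation through a simple NPV
# model. Focal intervals, masses and all cash-flow figures are assumptions.
from itertools import product

# Focal elements: (interval, basic probability mass) per uncertain input
energy_GWh = [((180.0, 220.0), 0.6), ((150.0, 250.0), 0.4)]   # annual yield, GWh
price_per_MWh = [((55.0, 70.0), 0.7), ((45.0, 85.0), 0.3)]    # tariff, $/MWh

CAPEX = 100e6          # $, treated as known for simplicity
OPEX = 2e6             # $/yr
RATE, YEARS = 0.07, 40
annuity = (1 - (1 + RATE) ** -YEARS) / RATE   # present-value factor of an annuity

def npv(energy_gwh, price):
    """Net present value of the plant for a given yield and tariff."""
    revenue = energy_gwh * 1e3 * price        # $/yr (GWh -> MWh)
    return (revenue - OPEX) * annuity - CAPEX

bel_positive = 0.0   # belief: mass of focal sets whose NPV interval is entirely > 0
pl_positive = 0.0    # plausibility: mass of focal sets that can reach NPV > 0
for (e_int, m_e), (p_int, m_p) in product(energy_GWh, price_per_MWh):
    m = m_e * m_p                             # joint mass (inputs assumed independent)
    # NPV is monotone increasing in both inputs, so its bounds sit at the corners
    lo = npv(e_int[0], p_int[0])
    hi = npv(e_int[1], p_int[1])
    if lo > 0:
        bel_positive += m
    if hi > 0:
        pl_positive += m

print(f"Belief(NPV > 0) = {bel_positive:.2f}, Plausibility(NPV > 0) = {pl_positive:.2f}")
```

Because this NPV model is monotone increasing in both inputs, each focal interval maps to an NPV interval by evaluating only its corner points; a non-monotone model would require an optimisation over each focal set instead.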
APA, Harvard, Vancouver, ISO, and other styles
