Dissertations / Theses on the topic 'Uncertainty assessment in APT'

Consult the top 50 dissertations / theses for your research on the topic 'Uncertainty assessment in APT.'

1

Glass, Deborah Catherine. "Exposure estimation, uncertainty and variability in occupational hygiene retrospective assessment." Deakin University, School of Biological and Chemical Sciences, 1999. http://tux.lib.deakin.edu.au./adt-VDU/public/adt-VDU20051017.142634.

Abstract:
This thesis reports on a quantitative exposure assessment and on an analysis of the attributes of the data used in the estimations, in particular distinguishing between its uncertainty and variability. A retrospective assessment of exposure to benzene was carried out for a case-control study of leukaemia in the Australian petroleum industry. The study used the mean of personal task-based measurements (Base Estimates) in a deterministic algorithm and applied factors to model back to places and times for which no exposure measurements were available. Mean daily exposures were estimated, on an individual-subject basis, by summing the task-based exposures. These mean exposures were multiplied by the years spent on each job to provide exposure estimates in ppm-years, which were summed to provide a Cumulative Estimate for each subject. Validation was completed for the model and key inputs. Exposures were low; most jobs were below a TWA of 5 ppm benzene. Exposures in terminals were generally higher than at refineries. Cumulative Estimates ranged from 0.005 to 50.9 ppm-years, with 84 percent less than 10 ppm-years. Exposure probability distributions were developed for tanker drivers using Monte Carlo simulation of the exposure estimation algorithm. The outcome was a lognormal distribution of exposure for each driver. These provide the basis for alternative risk assessment metrics, e.g. the frequency of short but intense exposures, which provided only a minimal contribution to the long-term average exposure but may increase the risk of leukaemia. The effects of different inputs to the model were examined and their significance assessed using Monte Carlo simulation. The Base Estimates were the most important determinant of exposure in the model. The sources of variability in the measured data were examined, including the effect of having censored data and the between- and within-worker variability.
The sources of uncertainty in the exposure estimates were analysed and consequential improvements in exposure assessment identified. Monte Carlo sampling was also used to examine the uncertainties and variability associated with the tanker drivers' exposure assessment, to derive an estimate of the range and to put confidence intervals on the daily mean exposures. The identified uncertainty was less than the variability associated with the estimates. The traditional approach to exposure estimation typically derives only point estimates of mean exposure. The approach developed here allows a range of exposure estimates to be made and provides a more flexible and improved basis for risk assessment.
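The Monte Carlo step described in this abstract can be sketched in a few lines. The task names, geometric means (GM) and geometric standard deviations (GSD) below are illustrative stand-ins for the thesis's Base Estimates, and drawing a single daily mean per simulated career is a deliberate simplification:

```python
import math
import random
import statistics

# Hypothetical task-based exposure parameters: (geometric mean in ppm,
# geometric standard deviation). Illustrative values, not the thesis data.
TASKS = {
    "tanker loading": (0.8, 2.5),
    "driving": (0.1, 2.0),
    "product sampling": (0.3, 3.0),
}

def simulate_daily_exposure(rng):
    """One day's mean exposure: the sum of lognormal task exposures."""
    total = 0.0
    for gm, gsd in TASKS.values():
        total += rng.lognormvariate(math.log(gm), math.log(gsd))
    return total

def cumulative_estimates(years_exposed, n_sims=10_000, seed=42):
    """Monte Carlo distribution of cumulative exposure in ppm-years.
    Simplification: one daily-exposure draw per simulated career."""
    rng = random.Random(seed)
    return [simulate_daily_exposure(rng) * years_exposed
            for _ in range(n_sims)]

sims = cumulative_estimates(years_exposed=15)
median_ppm_years = statistics.median(sims)
```

The output is a distribution of cumulative exposure rather than a single point estimate, which is what supports metrics such as the frequency of short, intense exposures.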
2

Skinner, Laura. "Negotiating uncertainty : mental health professionals’ experiences of the Mental Health Act assessment process." Thesis, University of Leicester, 2006. http://hdl.handle.net/2381/8972.

3

Hridoy, Md Rafiul Sabbir. "An Intelligent Flood Risk Assessment System using Belief Rule Base." Thesis, Luleå tekniska universitet, Institutionen för system- och rymdteknik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-65390.

Abstract:
Natural disasters disrupt our daily life and cause much suffering. Among the various natural disasters, flood is one of the most catastrophic. Assessing flood risk helps in taking necessary precautions and can save human lives. The assessment of risk involves various factors which cannot be measured with one hundred percent certainty. Therefore, present methods of flood risk assessment cannot assess the risk of flooding accurately. This research rigorously investigates the various types of uncertainty associated with flood risk factors. In addition, a comprehensive study of present flood risk assessment approaches has been conducted. Belief Rule Based Expert Systems (BRBESs) are widely used to handle various types of uncertainty. Therefore, this research adopts the BRBES approach to develop an expert system to assess the risk of flooding. In addition, to facilitate the learning procedures of the BRBES, an optimal learning algorithm has been proposed. The developed BRBES has been applied to a real-world case study area located at Cox's Bazar, Bangladesh. Training data were collected from the case study area to obtain the trained BRB and to develop the optimal learning model. The BRBES can generate different "What-If" scenarios, enabling the analysis of an area's flood risk from various perspectives, which makes the system robust and sustainable. The system can be considered intelligent as it has a knowledge base, an inference engine and learning capability.
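As a rough illustration of the belief-rule idea (not the thesis's BRBES, which uses the full evidential reasoning algorithm with trained rule weights), a minimal sketch with hypothetical rainfall referential values might look like:

```python
# Deliberately simplified belief-rule sketch: rules assign belief degrees
# to risk grades [Low, Medium, High]; an input activates rules through
# triangular matching, and beliefs are combined by a weighted average.
# The referential rainfall values (mm) and beliefs are hypothetical.

RULES = [
    (50.0,  [0.9, 0.1, 0.0]),
    (150.0, [0.2, 0.6, 0.2]),
    (300.0, [0.0, 0.2, 0.8]),
]

def matching_degrees(x):
    """Triangular matching of input x to the rule referential values."""
    refs = [r for r, _ in RULES]
    degrees = [0.0] * len(refs)
    for i in range(len(refs) - 1):
        lo, hi = refs[i], refs[i + 1]
        if lo <= x <= hi:
            degrees[i] = (hi - x) / (hi - lo)
            degrees[i + 1] = (x - lo) / (hi - lo)
    return degrees

def assess_flood_risk(rainfall_mm):
    """Combine activated rule beliefs into one belief distribution."""
    w = matching_degrees(rainfall_mm)
    total = sum(w) or 1.0
    combined = [0.0, 0.0, 0.0]
    for weight, (_, belief) in zip(w, RULES):
        for g in range(3):
            combined[g] += (weight / total) * belief[g]
    return combined

belief = assess_flood_risk(225.0)  # halfway between 150 and 300 mm
```

Here an input of 225 mm activates the two upper rules equally, yielding beliefs of 0.1, 0.4 and 0.5 over the three risk grades.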
4

Dixon, William J. "Uncertainty in Aquatic Toxicological Exposure-Effect Models: the Toxicity of 2,4-Dichlorophenoxyacetic Acid and 4-Chlorophenol to Daphnia carinata." RMIT University, Biotechnology and Environmental Biology, 2005. http://adt.lib.rmit.edu.au/adt/public/adt-VIT20070119.163720.

Abstract:
Uncertainty is pervasive in risk assessment. In ecotoxicological risk assessments, it arises from such sources as a lack of data, the simplification and abstraction of complex situations, and ambiguities in assessment endpoints (Burgman 2005; Suter 1993). When evaluating and managing risks, uncertainty needs to be explicitly considered in order to avoid erroneous decisions and to be able to make statements about the confidence that we can place in risk estimates. Although informative, previous approaches to dealing with uncertainty in ecotoxicological modelling have been found to be limited, inconsistent and often based on assumptions that may be false (Ferson & Ginzburg 1996; Suter 1998; Suter et al. 2002; van der Hoeven 2004; van Straalen 2002a; Verdonck et al. 2003a). In this thesis a Generalised Linear Modelling approach is proposed as an alternative, congruous framework for the analysis and prediction of a wide range of ecotoxicological effects. This approach was used to investigate the results of toxicity experiments on the effect of 2,4-Dichlorophenoxyacetic Acid (2,4-D) formulations and 4-Chlorophenol (4-CP, an associated breakdown product) on Daphnia carinata. Differences between frequentist Maximum Likelihood (ML) and Bayesian Markov-Chain Monte-Carlo (MCMC) approaches to statistical reasoning and model estimation were also investigated. These approaches are inferentially disparate and place different emphasis on aleatory and epistemic uncertainty (O'Hagan 2004). Bayesian MCMC and Probability Bounds Analysis methods for propagating uncertainty in risk models are also compared for the first time. For simple models, Bayesian and frequentist approaches to Generalised Linear Model (GLM) estimation were found to produce very similar results when non-informative prior distributions were used for the Bayesian models. 
Potency estimates and regression parameters were found to be similar for identical models, signifying that Bayesian MCMC techniques are at least a suitable and objective replacement for frequentist ML for the analysis of exposure-response data. Applications of these techniques demonstrated that Amicide formulations of 2,4-D are more toxic to Daphnia than their unformulated, Technical Acid parent. Different results were obtained from Bayesian MCMC and ML methods when more complex models and data structures were considered. In the analysis of 4-CP toxicity, the treatment of two different factors as fixed or random in standard and Mixed-Effect models was found to affect variance estimates to the degree that different conclusions would be drawn from the same model, fit to the same data. Associated discrepancies in the treatment of overdispersion between ML and Bayesian MCMC analyses were also found to affect results. Bayesian MCMC techniques were found to be superior to the ML ones employed for the analysis of complex models because they enabled the correct formulation of hierarchical (nested) data structures within a binomial logistic GLM. Application of these techniques to the analysis of results from 4-CP toxicity testing on two strains of Daphnia carinata found that between-experiment variability was greater than that within experiments or between strains. Perhaps surprisingly, this indicated that long-term laboratory culture had not significantly affected the sensitivity of one strain when compared to cultures of another strain that had recently been established from field populations. The results from this analysis highlighted the need for repetition of experiments, proper model formulation in complex analyses and careful consideration of the effects of pooling data on characterising variability and uncertainty.
The GLM framework was used to develop three-dimensional surface models of the effects of different-length pulse exposures, and subsequent delayed toxicity, of 4-CP on Daphnia. These models described the relationship between exposure duration and intensity (concentration) on toxicity, and were constructed for both pulse and delayed effects. Statistical analysis of these models found that significant delayed effects occurred following the full range of pulse exposure durations, and that both exposure duration and intensity interacted significantly and concurrently with the delayed effect. These results indicated that failure to consider delayed toxicity could lead to significant underestimation of the effects of pulse exposure, and therefore increase uncertainty in risk assessments. A number of new approaches to modelling ecotoxicological risk and to propagating uncertainty were also developed and applied in this thesis. In the first of these, a method for describing and propagating uncertainty in conventional Species Sensitivity Distribution (SSD) models was described. This utilised Probability Bounds Analysis to construct a nonparametric 'probability box' on an SSD based on EC05 estimates and their confidence intervals. Predictions from this uncertain SSD and the confidence interval extrapolation methods described by Aldenberg and colleagues (2000; 2002a) were compared. It was found that the extrapolation techniques underestimated the width of uncertainty (confidence) intervals by 63% and the upper bound by 65%, when compared to the Probability Bounds (P-Bounds) approach, which was based on actual confidence estimates derived from the original data. An alternative approach to formulating ecotoxicological risk modelling was also proposed, based on a Binomial GLM. In this formulation, the model is first fit to the available data in order to derive mean and uncertainty estimates for the parameters.
This 'uncertain' GLM model is then used to predict the risk of effect from possible or observed exposure distributions. This risk is described as a whole distribution, with a central tendency and uncertainty bounds derived from the original data and the exposure distribution (if this is also 'uncertain'). Bayesian and P-Bounds approaches to propagating uncertainty in this model were compared using an example of the risk of exposure to a hypothetical (uncertain) distribution of 4-CP for the two Daphnia strains studied. This comparison found that the Bayesian and P-Bounds approaches produced very similar mean and uncertainty estimates, with the P-Bounds intervals always being wider than the Bayesian ones. This difference is due to the different methods for dealing with dependencies between model parameters in the two approaches, and is confirmation that the P-Bounds approach is better suited to situations where data and knowledge are scarce. The advantages of the Bayesian risk assessment and uncertainty propagation method developed are that it allows calculation of the likelihood of any effect occurring, not just the (probability) bounds, and that the same software (WinBUGS) and model construction may be used to fit regression models and predict risks simultaneously. The GLM risk modelling approaches developed here are able to explain a wide range of response shapes (including hormesis) and underlying (non-normal) distributions, and do not involve expression of the exposure-response as a probability distribution, hence solving a number of problems found with previous formulations of ecotoxicological risk. The approaches developed can also be easily extended to describe communities, and to include modifying factors, mixed effects, population growth, carrying capacity and a range of other variables of interest in ecotoxicological risk assessments.
While the lack of data on the toxicological effects of chemicals is the most significant source of uncertainty in ecotoxicological risk assessments today, methods such as those described here can assist by quantifying that uncertainty so that it can be communicated to stakeholders and decision makers. As new information becomes available, these techniques can be used to develop more complex models that will help to bridge the gap between the bioassay and the ecosystem.
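The 'probability box' construction mentioned in this abstract can be illustrated with a small sketch. The EC05 point estimates and confidence intervals below are invented for illustration; a real analysis would use estimates fitted from toxicity data:

```python
# Sketch of a nonparametric 'probability box' on a species sensitivity
# distribution (SSD): each species contributes an EC05 estimate with a
# confidence interval, and the p-box is the pair of empirical CDFs built
# from the interval endpoints. All values are hypothetical (mg/L).

SPECIES_EC05 = [
    # (lower CI, point estimate, upper CI)
    (0.4, 1.0, 2.5),
    (1.2, 3.0, 7.0),
    (2.0, 5.5, 14.0),
    (4.0, 9.0, 20.0),
]

def empirical_cdf(values, x):
    """Fraction of values <= x."""
    return sum(v <= x for v in values) / len(values)

def pbox_bounds(x):
    """Bounds on P(EC05 <= x): the CDF of the upper endpoints bounds it
    from below, the CDF of the lower endpoints bounds it from above."""
    lowers = [lo for lo, _, _ in SPECIES_EC05]
    uppers = [hi for _, _, hi in SPECIES_EC05]
    return empirical_cdf(uppers, x), empirical_cdf(lowers, x)

lo_p, hi_p = pbox_bounds(3.0)
```

The gap between `lo_p` and `hi_p` at a given concentration expresses the uncertainty that a single fitted SSD curve would hide.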
5

Cui, W. C. "Uncertainty analysis in structural safety assessment." Thesis, University of Bristol, 1989. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.303742.

6

Budzinski, Maik. "The differentiation between variability uncertainty and knowledge uncertainty in life cycle assessment." Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2014. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-135913.

Abstract:
The following thesis deals with methods to increase the reliability of results in life cycle assessment (LCA). The paper is divided into two parts. The first part points out the typologies and sources of uncertainty in LCA and summarises the existing methods for dealing with it; the methods are critically discussed and their pros and cons contrasted. In the second part a case study is carried out, calculating the carbon footprint of a cosmetic product of Li-iL GmbH; the whole life cycle of the powder bath Blaue Traube is analysed. To increase the reliability of the result, a procedure derived from the first part is applied. Recommendations to enhance the product's sustainability are then given to the decision-makers of the company. Finally, the applied procedure for dealing with uncertainty in LCAs is evaluated. The aims of the thesis are to contribute to the understanding of uncertainty in life cycle assessment and to deal with it in a more consistent manner, and to base the carbon footprint of the powder bath on appropriate assumptions while considering the uncertainties that occur. Based on the problems discussed, a method is introduced to avoid the problematic merging of variability uncertainty and data uncertainty when generating probability distributions. The introduced uncertainty importance analysis allows a consistent differentiation of these types of uncertainty, and also makes an assessment of the data used in LCA studies possible. The method is applied in a product carbon footprint (PCF) study of the bath powder Blaue Traube of Li-iL GmbH, with the analysis carried out over the whole life cycle (cradle-to-grave) as well as cradle-to-gate. The study gives the company a practical example of determining the carbon footprint of products. In addition, it meets the ISO guidelines' requirements for publishing the study and comparing it with other products.
Within the PCF study the introduced method allows a differentiation of variability uncertainty and knowledge uncertainty. The included uncertainty importance analysis supports the assessment of each aggregated unit process within the analysed product system. Finally, this analysis can provide a basis for collecting additional, more reliable or less uncertain data for critical processes.
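One common way to keep the two types of uncertainty separate, in the spirit of the differentiation described here, is a nested (two-dimensional) Monte Carlo loop: the outer loop samples epistemic (knowledge) uncertainty, the inner loop samples variability. The emission-factor numbers below are illustrative, not taken from the study:

```python
import random

# Nested Monte Carlo sketch: knowledge uncertainty (the true mean emission
# factor is unknown) in the outer loop, batch-to-batch variability in the
# inner loop. All numbers are hypothetical, not the Blaue Traube data.

def nested_monte_carlo(n_outer=200, n_inner=500, seed=1):
    rng = random.Random(seed)
    outer_means = []
    for _ in range(n_outer):
        # Epistemic draw: uncertain mean emission factor (kg CO2e per unit)
        mean_ef = rng.uniform(0.8, 1.2)
        # Variability draws around that mean
        inner = [rng.gauss(mean_ef, 0.1) for _ in range(n_inner)]
        outer_means.append(sum(inner) / n_inner)
    return outer_means

means = nested_monte_carlo()
spread = max(means) - min(means)  # driven mostly by knowledge uncertainty
```

Because the inner averaging cancels most of the variability, the spread of the outer-loop means isolates the contribution of knowledge uncertainty, which is exactly what a merged one-dimensional simulation cannot do.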
7

Burke, Michael Martin. "Software dependability assessment." Thesis, University of Bristol, 1991. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.320203.

8

Mesa-Frias, M. "Modelling uncertainty in environmental health impact assessment." Thesis, London School of Hygiene and Tropical Medicine (University of London), 2015. http://researchonline.lshtm.ac.uk/2391599/.

Abstract:
Quantifying uncertainty in environmental health impact assessment models is important, particularly if the models are to be used for decision support. This thesis develops a new non-probabilistic framework to quantify uncertainty in environmental health impact assessment models. The framework takes into account two different perspectives on uncertainty, conceptual and analytical, in terms of where uncertainty occurs in the model. The first perspective is concerned with uncertainty in the framing assumptions of health impact assessment, whereas the second is concerned with uncertainty in the parameters of a model. The construction of the framework was achieved by focusing on five specific objectives: (i) to describe the complexity of how uncertainty arises in environmental health impact assessment and classify the uncertainty to be amenable to quantitative modelling; (ii) to critically appraise the strengths and limitations of current methods used to handle uncertainty in environmental health impact assessment; (iii) to develop a novel quantitative framework for quantifying uncertainty from the conceptual and analytical perspectives; (iv) to formulate two detailed case-study examples on health impact assessment of indoor housing interventions; (v) to apply the framework to the two case studies. After critiquing the uncertainty quantification methods currently applied in environmental health impact assessment, the thesis develops the framework for quantifying uncertainty, starting with the conceptual uncertainty (uncertainty associated with the framing assumptions or formulation of the model), then quantifying the analytical uncertainty (uncertainty associated with the input parameters and outputs of the model). The first case study was concerned with the health impact assessment of improving housing insulation.
Using fuzzy cognitive maps, the thesis identifies key indoor factors and their pathways highly sensitive to the framing assumptions of the health impact assessment. The second case-study was concerned with estimating the uncertainty in the health burdens in England, associated with three ventilation exposure scenarios using fuzzy sets and interval analysis. The thesis presents a wider uncertainty framework as a first step forward in quantifying conceptual and analytical uncertainty in environmental health impact assessment when dealing with limited information.
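The interval-analysis side of such a framework can be sketched with simple interval arithmetic. The population, baseline-rate and attributable-fraction intervals below are hypothetical, not the thesis's England estimates:

```python
# Interval-arithmetic sketch of an attributable health burden calculation.
# Each uncertain input is an (inf, sup) interval; multiplication takes the
# extremes over endpoint products so the result is guaranteed to enclose
# the true value. All inputs are hypothetical.

def i_mul(a, b):
    """Interval multiplication: envelope of all endpoint products."""
    products = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(products), max(products))

population = (5.0e7, 5.6e7)           # exposed population
baseline_rate = (1.0e-4, 1.4e-4)      # baseline annual incidence
attributable_fraction = (0.02, 0.08)  # fraction attributable to exposure

# Interval of annual attributable cases
burden = i_mul(i_mul(population, baseline_rate), attributable_fraction)
```

Unlike a probabilistic result, the output interval makes no distributional assumptions, which is the appeal of interval and fuzzy methods when information is limited.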
9

Senel, Ozgur. "Infill location determination and assessment of corresponding uncertainty." College Station, Tex.: Texas A&M University, 2008. http://hdl.handle.net/1969.1/ETD-TAMU-2806.

10

Jesus, André H. "Modular Bayesian uncertainty assessment for structural health monitoring." Thesis, University of Warwick, 2018. http://wrap.warwick.ac.uk/109522/.

Abstract:
Civil infrastructures are critical elements of a society's welfare and economic thriving. Understanding their behaviour and monitoring their serviceability are relevant challenges of Structural Health Monitoring (SHM). Despite the impressive improvement in the miniaturisation, standardisation and diversity of monitoring systems, the ability to interpret data has progressed much more slowly over the years. The underlying causes of this disparity are the overall complexity of the proposed challenge, and the inherent errors and lack of information associated with it. Overall, it is necessary to appropriately quantify the uncertainties which undermine the SHM concept. This thesis proposes an enhanced modular Bayesian framework (MBA) for structural identification (st-id) and measurement system design (MSD). The framework is hybrid, in the sense that it uses a physics-based model together with Gaussian processes (mrGp) which are trained against data, for uncertainty quantification. The mrGp act as emulators of the model response surface and its model discrepancy, also quantifying observation error, parametric and interpolation uncertainty. Finally, this framework has been enhanced with the Metropolis–Hastings algorithm for multiple-parameter st-id. In contrast to other probabilistic frameworks, the MBA allows structural parameters (which reflect a performance of interest) to be estimated consistently with their physical interpretation, while highlighting patterns of a model's discrepancy. The MBA's performance can be substantially improved by considering multiple responses which are sensitive to the structural parameters. An extension of the MBA for MSD has been validated on a reduced-scale aluminium bridge subject to thermal expansion (supported at one end with springs and instrumented with strain gauges and thermocouples). A finite element (FE) model of the structure was used to obtain a semi-optimal sensor configuration for st-id.
Results indicate that 1) measuring responses which are sensitive to the structural parameters and more directly related to model discrepancy provides the best results for st-id; 2) prior knowledge of the model discrepancy is essential to capture the latter type of responses. Subsequently, an extension of the MBA for st-id was also applied to identification of the springs' stiffness, and results indicate relative errors five times smaller than other state-of-the-art Bayesian/deterministic methodologies. Finally, a first application to field data was performed, calibrating a detailed FE model of the Tamar suspension bridge using long-term monitoring data. Measurements of temperature, traffic, mid-span displacement and natural frequencies of the bridge were used to identify the initial strain of the bridge's main/stay cables and the friction of its bearings. Validation of results suggests that the identified parameters agree more closely with the true structural behaviour of the bridge, with an error several orders of magnitude smaller than other probabilistic st-id approaches. Additionally, the MBA allowed model discrepancy functions to be predicted in order to assess the predictive ability of the Tamar bridge FE model. It was found that the model predicts the bridge's mid-span displacements more accurately than its natural frequencies, and that the adopted traffic model is less able to simulate the bridge's behaviour during periods of traffic jams. Future developments of the MBA framework include its extension and application to damage detection and to MSD with multiple-parameter identification.
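The core "physics plus learned discrepancy" idea can be sketched with a toy one-dimensional Gaussian process. The RBF kernel, length-scale and jitter below are assumptions for illustration, and the sketch omits the multi-response structure and Metropolis–Hastings calibration of the actual framework:

```python
import numpy as np

# A GP is fitted to the residuals between a (deliberately misspecified)
# physics model and observations, so predictions are physics + learned
# discrepancy. Toy 1-D data, not the bridge application.

def rbf(a, b, ls=1.0, var=1.0):
    """Squared-exponential kernel between 1-D point sets."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return var * np.exp(-0.5 * d2 / ls**2)

def physics_model(x):
    return 0.5 * x  # misses the oscillatory component of the truth

x_train = np.linspace(0.0, 5.0, 8)
y_obs = 0.5 * x_train + np.sin(x_train)   # 'true' response
residual = y_obs - physics_model(x_train)  # model discrepancy data

jitter = 1e-4  # numerical noise term
K = rbf(x_train, x_train) + jitter * np.eye(8)
alpha = np.linalg.solve(K, residual)

def predict(x_new):
    """Physics prediction corrected by the GP posterior-mean discrepancy."""
    x_new = np.atleast_1d(x_new)
    return physics_model(x_new) + rbf(x_new, x_train) @ alpha

y_hat = predict(x_train)  # should closely recover the observations
```

The learned discrepancy both corrects predictions and, as the abstract notes, exposes systematic patterns of model error (here, the sinusoidal term the physics model omits).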
11

Fulchino, Matthew T. "An assessment of uncertainty due to adversary mobility." Thesis, Massachusetts Institute of Technology, 2015. http://hdl.handle.net/1721.1/100373.

Abstract:
Thesis: S.M. in Engineering and Management, Massachusetts Institute of Technology, Engineering Systems Division, System Design and Management Program, 2015.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 47-50).
Uncertainty related to an adversary's tactics, techniques, and procedures is often difficult to characterize, particularly during the period immediately before a conflict, when planning for a face-to-face confrontation with a combatant. Adversarial freedom of maneuver and the fixed nature of asset defense leave limited room for error or half-assessments, yet past analysis of regional defensibility presumes a static, symmetric adversary, rather than a nimble, cunning one. This thesis examines historical events to identify the sources of uncertainty with respect to defensive operations, and proposes that an alternative measure of performance be evaluated to fully characterize the effectiveness and limitations of defensive elements in the face of a determined peer.
by Matthew T. Fulchino.
S.M. in Engineering and Management
12

Chen, Qi. "Uncertainty quantification in assessment of damage ship survivability." Thesis, University of Strathclyde, 2012. http://oleg.lib.strath.ac.uk:80/R/?func=dbin-jump-full&object_id=19511.

Abstract:
Ongoing developments in improving ship safety indicate the gradual transition from a compliance-based culture to a sustainable safety-oriented culture. Sophisticated methods, tools and techniques are demanded to address the dynamic behaviour of a ship in a physical environment. This is particularly true for investigating the flooding phenomenon of a damaged ship, a principal hazard endangering modern ships. In this respect, first-principles tools represent a rational and cost-effective approach to address it at both design and operational stages. Acknowledging the criticality of ship survivability and the various maturity levels of state-of-the-art tools, analyses of the underlying uncertainties in relation to relevant predictions become an inevitable component to be addressed. The research presented in this thesis proposes a formalised Bayesian approach for quantifying uncertainties associated with the assessment of ship survivability. It elaborates a formalised procedure for synthesizing first-principles tools with existing knowledge from various sources. The outcome is a mathematical model for predicting time-domain survivability and quantifying the associated uncertainties. In view of emerging ship life-cycle safety management issues and the recent initiative of "Safe Return to Port", emergency management is recognised as the last remedy to address an evolving flooding crisis. For this reason, an emergency decision support framework is proposed to demonstrate the applicability of the presented Bayesian approach. A case study is enclosed to elucidate the devised shipboard decision support framework for flooding-related emergency control. Various aspects of the presented methodology demonstrate considerable potential for further research, development and application.
In an environment where more emphasis is placed on performance and probabilistic-based solutions, it is believed that this research has contributed positively and substantially towards ship safety, with particular reference to uncertainty analysis and ensuing applications.
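A minimal sketch of Bayesian updating in this setting could treat Monte Carlo flooding simulations as survive/capsize trials. This is a deliberate simplification with hypothetical counts; the thesis builds a full time-domain survivability model, not a conjugate Beta-Binomial one:

```python
# Conjugate Beta-Binomial sketch: simulation outcomes for a damage case
# update a prior belief about the probability of surviving. Prior and
# counts are illustrative.

def update_survival_belief(alpha, beta, survived, capsized):
    """Beta(alpha, beta) prior updated with survive/capsize counts."""
    return alpha + survived, beta + capsized

# Weakly informative Beta(1, 1) prior, then 20 simulated damage scenarios:
a, b = update_survival_belief(1.0, 1.0, survived=18, capsized=2)

posterior_mean = a / (a + b)
# Posterior standard deviation of the survival probability
posterior_sd = (a * b / ((a + b) ** 2 * (a + b + 1))) ** 0.5
```

The posterior standard deviation is one simple way to report how much confidence the limited number of simulations actually supports, which is the role uncertainty quantification plays in the decision support framework.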
13

Jayaraman, Venkataramanan. "Assessment of uncertainty management approaches in construction organizations." Diss., Connect to online resource - MSU authorized users, 2006.

Abstract:
Thesis (M.S.)--Michigan State University, Construction Management Program, 2006.
Title from PDF t.p. (viewed on June 19, 2009). Includes bibliographical references (p. 147-150). Also issued in print.
14

Blasone, Roberta-Serena. "Parameter estimation and uncertainty assessment in hydrological modelling." Kgs. Lyngby, 2007. http://www.er.dtu.dk/publications/fulltext/2007/MR2007-105.pdf.

15

Levin, Rikard. "Uncertainty in risk assessment : contents and modes of communication." Licentiate thesis, Stockholm : Kungliga Tekniska högskolan, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-473.

16

Arkhipov, Ivan, and Marina Boltenko. "Investment Under Uncertainty : Risk Assessment in Emerging Market Countries." Thesis, Jönköping University, JIBS, Economics, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-8029.

Abstract:

The overall purpose of the paper is to examine how crediting institutions assess risks in emerging market countries. The paper first describes prevalent economic and social conditions in each of the selected emerging market countries (Brazil, China, Kazakhstan, India, Russia and Ukraine) as examples of recent attractive investment locations in the quest for higher returns. Second, recognising the importance of ratings for risk management in credit institutions, the authors show what determines the country ratings made by the main rating agencies by running a linear regression of the country ratings on several macroeconomic indicators. It is also explained what the most widely used ratings mean, and the correlations between the ratings, as well as between the macroeconomic indicators and the ratings, are described. The authors also describe the characteristic approach of a Scandinavian bank to dealing with risk factors in emerging market countries. Concluding comments: risks turn out to be embedded in the banks' interest rates; there is no common pattern for banks to apply to all emerging market countries, and each market should be analysed separately. Nordic banks have a relatively safe and careful strategy concerning lending in the emerging markets.
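The rating regression can be sketched with ordinary least squares on a single indicator. The (GDP growth, rating score) pairs below are hypothetical, and the paper regresses on several macroeconomic indicators at once:

```python
# OLS sketch: a numeric country-rating score regressed on GDP growth via
# the closed-form simple-regression formulas. All data are hypothetical.

# (GDP growth %, numeric rating score) for six hypothetical countries
data = [(7.5, 70), (9.0, 75), (3.2, 55), (6.1, 68), (8.0, 72), (2.0, 50)]

n = len(data)
sx = sum(x for x, _ in data)
sy = sum(y for _, y in data)
sxx = sum(x * x for x, _ in data)
sxy = sum(x * y for x, y in data)

slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n
# A positive slope reproduces the expected pattern: stronger growth,
# better rating, other things equal.
```

With multiple indicators the same idea becomes a multivariate regression, which is how the significance of each macroeconomic determinant can be compared.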
17

De Aguinaga, José Guillermo. "Uncertainty Assessment of Hydrogeological Models Based on Information Theory." Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2011. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-71814.

Abstract:
There is a great deal of uncertainty in hydrogeological modeling. Overparametrized models increase uncertainty since the information in the observations is distributed across all of the parameters. The present study proposes a new option to reduce this uncertainty. A way to achieve this goal is to select a model which provides good performance with as few calibrated parameters as possible (a parsimonious model) and to calibrate it using many sources of information. Akaike's Information Criterion (AIC), proposed by Hirotugu Akaike in 1973, is a statistical criterion based on information theory which allows us to select a parsimonious model. AIC formulates the problem of parsimonious model selection as an optimization problem across a set of proposed conceptual models. The AIC assessment is relatively new in groundwater modeling, and applying it with different sources of observations presents a challenge. In this dissertation, important findings in the application of AIC in hydrogeological modeling using different sources of observations are discussed. AIC is tested on groundwater models using three sets of synthetic data: hydraulic pressure, horizontal hydraulic conductivity, and tracer concentration. In the present study, the impact of the following factors is analyzed: the number of observations, the types of observations and the order of calibrated parameters. These analyses reveal not only that the number of observations determines how complex a model can be, but also that their diversity allows for further complexity in the parsimonious model. However, a truly parsimonious model was only achieved when the order of calibrated parameters was properly considered. This means that parameters which provide bigger improvements in model fit should be considered first.
The approach of obtaining a parsimonious model by applying AIC with different types of information was successfully transferred to an unbiased lysimeter model using two different types of real data: evapotranspiration and seepage water. This additional, independent model assessment made it possible to underpin the general validity of the AIC approach.
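The parsimony trade-off that AIC formalises can be illustrated with a small sketch. The candidate models, residual sums of squares and parameter counts below are invented for illustration, and the common least-squares form of AIC is used; this is not the thesis's actual set of groundwater models:

```python
import numpy as np

def aic(rss, n, k):
    """Akaike's Information Criterion for a least-squares fit with
    Gaussian errors: AIC = n*ln(RSS/n) + 2k, where k is the number of
    calibrated parameters. Lower is better: the 2k term penalises
    complexity while the first term rewards a low residual sum of squares."""
    return n * np.log(rss / n) + 2 * k

# Hypothetical candidate models: (name, residual sum of squares
# against the observations, number of calibrated parameters).
candidates = [
    ("homogeneous", 40.0, 2),
    ("zoned", 25.0, 5),
    ("fully distributed", 24.0, 40),
]
n_obs = 50
scores = {name: aic(rss, n_obs, k) for name, rss, k in candidates}
best = min(scores, key=scores.get)  # the parsimonious choice
```

The heavily parametrized model fits marginally better but is punished by its 40 parameters, so the intermediate "zoned" model wins, mirroring the thesis's point that observation count limits admissible complexity.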
APA, Harvard, Vancouver, ISO, and other styles
18

Kentel, Elçin. "Uncertainty Modeling Health Risk Assessment and Groundwater Resources Management." Diss., Georgia Institute of Technology, 2006. http://hdl.handle.net/1853/11584.

Full text
Abstract:
Real-world problems, especially ones that involve natural systems, are complex and composed of many non-deterministic components. Uncertainties associated with these non-deterministic components may originate from randomness or from imprecision due to lack of information. Until recently, uncertainty, regardless of its nature or source, has been treated with probability concepts. However, uncertainties associated with real-world systems are not limited to randomness. Imprecise, vague or incomplete information may be better represented by other mathematical tools, such as fuzzy set theory, possibility theory, belief functions, etc. New approaches which allow utilization of probability theory in combination with these new mathematical tools have found applications in various engineering fields. Uncertainty modeling in human health risk assessment and in groundwater resources management is investigated in this thesis. In the first part of this thesis, two new approaches which utilize both probability theory and fuzzy set theory to treat parameter uncertainties in carcinogenic risk assessment are proposed. These approaches generate fuzzy health risks. For a fuzzy risk to be useful for practical purposes, its acceptability with respect to a compliance guideline has to be evaluated. A new fuzzy measure, the risk tolerance measure, is proposed for this purpose. The risk tolerance measure is a weighted average of the possibility and necessity measures which are currently used for decision-making purposes. In the second part of this thesis, two decision-making frameworks are proposed to determine the best groundwater resources management strategy in the Savannah region, Georgia. Groundwater resources management problems, especially ones in coastal areas, are complex and require treatment of various uncertain inputs.
The first decision-making framework proposed in this study is composed of a coupled simulation-optimization model followed by a fuzzy multi-objective decision-making approach, while the second framework includes a groundwater flow model in which the parameters of the flow equation are characterized by fuzzy numbers, and a decision-making approach which utilizes the risk tolerance measure proposed in the first part of this thesis.
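The idea of judging a fuzzy risk against a guideline through possibility and necessity measures can be sketched as follows. The triangular membership function, the guideline value and the equal weighting are illustrative assumptions; the thesis's actual risk tolerance measure may weight the two measures differently:

```python
import numpy as np

# Hypothetical fuzzy health risk: triangular membership function
# with support [2e-6, 2e-5] and peak at 8e-6 (invented values).
left, peak, right = 2e-6, 8e-6, 2e-5
x = np.linspace(left, right, 181)
mu = np.where(x <= peak, (x - left) / (peak - left),
              (right - x) / (right - peak))

guideline = 1e-5  # assumed compliance threshold

# Possibility and necessity that the risk complies (risk <= guideline)
possibility = mu[x <= guideline].max()
necessity = 1.0 - mu[x > guideline].max()

def risk_tolerance(poss, nec, w=0.5):
    """Weighted average of the possibility and necessity measures,
    in the spirit of the proposed risk tolerance measure (the weight
    w = 0.5 is an assumption)."""
    return w * poss + (1 - w) * nec

tolerance = risk_tolerance(possibility, necessity)
```

The necessity measure is pessimistic and the possibility measure optimistic; the weighted average gives a single acceptability score between the two.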
APA, Harvard, Vancouver, ISO, and other styles
19

Bhatt, Chinmay P. "Assessment of uncertainty in equivalent sand grain roughness methods." Birmingham, Ala. : University of Alabama at Birmingham, 2007. http://www.mhsl.uab.edu/dt/2007m/bhatt.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

BASTOS, BERNARDO LEOPARDI GONCALVES BARRETTO. "UNCERTAINTY QUANTIFICATION AT RISK ASSESSMENT PROCEDURE DUE CONTAMINATED GROUNDWATER." PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO, 2005. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=8184@1.

Full text
Abstract:
Quantitative human health risk assessment (AqR) for a contaminated site has become an important tool in environmental management and in the identification of environmental harm, in Brazil and in other countries. AqR procedures consist of a logical sequence of steps addressing legal aspects, toxicological matters and transport phenomena. Despite the absence of a single law regulating AqR specifically, Environmental Law as a whole allows AqR methodologies to be fully applied at both the administrative and judicial levels. AqR procedures are based on pharmacokinetic models that quantitatively relate exposure to chemicals to the potential for human harm. Environmental geotechnics studies the fate and transport of contaminants in soil and groundwater. AqR is a complex subject, permeated by uncertainties and variabilities. The application of the first-order second-moment (FOSM) method was proposed to quantify the uncertainties related to the estimation of the transport parameters used in an analytical model of solute transport in porous media (Domenico). A software tool meeting this objective (SeRis) was developed and proved to be computationally efficient. The case study identified the parameters with the greatest relative importance and yielded a reasonable total system variance.
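The FOSM method propagates parameter variances through a first-order Taylor expansion of the model. The sketch below applies it to a deliberately simple exponential-decay transport function rather than the Domenico solution used in the thesis; all parameter values are invented:

```python
import numpy as np

def fosm(f, means, variances, rel_step=1e-4):
    """First-Order Second-Moment approximation: evaluates f at the
    parameter means and estimates Var[f] ~ sum_i (df/dx_i)^2 Var[x_i]
    with central-difference gradients. Returns the mean, the total
    variance and each parameter's contribution (for ranking importance)."""
    means = np.asarray(means, dtype=float)
    f0 = f(means)
    contributions = []
    for i, m in enumerate(means):
        h = rel_step * (abs(m) if m != 0 else 1.0)
        xp, xm = means.copy(), means.copy()
        xp[i] += h
        xm[i] -= h
        grad = (f(xp) - f(xm)) / (2 * h)
        contributions.append(grad ** 2 * variances[i])
    return f0, sum(contributions), contributions

# Toy steady-state concentration at distance L for seepage velocity v
# and decay rate k: c(v, k) = c0 * exp(-k * L / v)
c0, L = 100.0, 50.0
model = lambda p: c0 * np.exp(-p[1] * L / p[0])
mean_c, var_c, contrib = fosm(model, means=[0.5, 0.01],
                              variances=[0.01, 1.0e-6])
# contrib shows which parameter dominates the output variance
```

With these numbers the velocity term dominates the variance, which is the kind of relative-importance ranking the thesis extracts from SeRis.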
APA, Harvard, Vancouver, ISO, and other styles
21

Léchelle, Jacques, S. Noyau, Laurence Aufore, Antoine Arredondo, and Fabienne Audubert. "Volume interdiffusion coefficient and uncertainty assessment for polycrystalline materials." Diffusion fundamentals 17 (2012) 2, pp. 1-39, 2012. https://ul.qucosa.de/id/qucosa%3A13726.

Full text
Abstract:
A method has been developed to assess small volume interdiffusion coefficients from experimental Electron Probe MicroAnalysis concentration profiles of polycrystalline materials by means of the Boltzmann-Matano or den Broeder methods and their complementary Hall method. These methods have been used as tools for the investigation of the quasi-binary UO2/U(1-y)PuyO(2-z) interdiffusion, for which obtaining a solid solution in the bulk of grains is of major interest. In this paper, uncertainties on the interdiffusion coefficient as a function of concentration have been computed for each method. Measurements of small volume coefficients were improved by acquiring the profile line at a small angle with respect to the interdiffusion interface.
APA, Harvard, Vancouver, ISO, and other styles
22

Chen, Xiaoju. "Uncertainty Estimation in Matrix-based Life Cycle Assessment Models." Research Showcase @ CMU, 2017. http://repository.cmu.edu/dissertations/891.

Full text
Abstract:
Life Cycle Assessment (LCA) has been applied to help decision-makers understand quantitative environmental effects and impacts through the life stages of a product or process. Matrix-based LCA models are widely incorporated into LCA software tools to simplify the assessment and provide straightforward results. However, these tools do not sufficiently report the uncertainties that arise from the inventory data as well as from the matrix-based models themselves. To address this problem, in this thesis I use a range method to explore three types of uncertainty (parameter, scenario, and model uncertainty) present in matrix-based LCA models. These three types of uncertainty are assessed separately for two different types of LCA model: the Input-Output-based LCA model and process-based LCA models analyzed with matrix methods. IO-based LCA models are studied with the Environmental Input-Output Life Cycle Assessment (EIO-LCA) model, and the US LCI database incorporated into matrix methods is used as an example of process-based LCA models. I demonstrate the results with two environmental effects (greenhouse gas emissions and energy consumption) and five environmental impacts (global warming, ozone depletion, acidification, eutrophication and ecotoxicity). First, I analyzed the parameter uncertainty in the EIO-LCA model. Publicly available data sources and assumptions are used to estimate the parameter uncertainties of the direct energy consumption in US industrial sectors. The direct and indirect energy consumption ranges are estimated through the EIO-LCA model. The results show that the parameter uncertainties are generally within -40% to 40% of the default values, with several outliers. Second, I examined the scenario uncertainties in total carbon dioxide emissions by using alternative inputs in the US LCI database.
I found that the US LCI database fails to take full advantage of matrix-based methods; when incorporated into matrix-based LCA models, less than 10% of the processes contribute to the indirect environmental effects. The results of the scenario uncertainty estimation in the US LCI database show that, on average, the total carbon dioxide emissions across all processes vary between -30% and 30%. Finally, I addressed the model uncertainty by using different Life Cycle Impact Assessment (LCIA) methods incorporated into the matrix-based models. The results show that when the US LCI inventories are applied, the uncertainties due to choosing different impact methods are within 5%. This is possibly caused by the incompleteness of the inventories: more than 50% of the characterized substances are excluded from the inventory, resulting in the neglect of some impact values. The results from this study emphasize the importance of estimating uncertainties in matrix-based LCA models. The variability in LCA results is caused by all three types of uncertainty, as well as by the incomplete inventories embedded in the matrix-based LCA models. Future LCA databases and software should include uncertainty estimation among their features and improve the inventory data to take full advantage of matrix-based LCA models.
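The calculation underlying both model families is the Leontief inverse applied to an environmental-intervention vector. The three-sector technology matrix, emission factors and the ±40% range below are invented for illustration, not EIO-LCA or US LCI data:

```python
import numpy as np

# A: inter-sector requirements ($ of input per $ of output);
# r: direct CO2e emissions per $ of output; f: final demand in $.
A = np.array([[0.10, 0.02, 0.00],
              [0.05, 0.08, 0.03],
              [0.01, 0.04, 0.12]])
r = np.array([0.8, 1.5, 0.4])
f = np.array([1000.0, 0.0, 0.0])

# Total (direct + indirect) sector output: x = (I - A)^-1 f
x = np.linalg.solve(np.eye(3) - A, f)
total_emissions = r @ x

# Crude range estimate in the spirit of the thesis's range method:
# perturb the emission factors by +/-40% and re-propagate.
low, high = 0.6 * total_emissions, 1.4 * total_emissions
```

Solving the linear system rather than inverting `I - A` explicitly is the standard, numerically safer way to evaluate the Leontief inverse for a single demand vector.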
APA, Harvard, Vancouver, ISO, and other styles
23

Léchelle, Jacques, S. Noyau, Laurence Aufore, Antoine Arredondo, and Fabienne Audubert. "Volume interdiffusion coefficient and uncertainty assessment for polycrystalline materials." Universitätsbibliothek Leipzig, 2015. http://nbn-resolving.de/urn:nbn:de:bsz:15-qucosa-184476.

Full text
Abstract:
A method has been developed to assess small volume interdiffusion coefficients from experimental Electron Probe MicroAnalysis concentration profiles of polycrystalline materials by means of the Boltzmann-Matano or den Broeder methods and their complementary Hall method. These methods have been used as tools for the investigation of the quasi-binary UO2/U(1-y)PuyO(2-z) interdiffusion, for which obtaining a solid solution in the bulk of grains is of major interest. In this paper, uncertainties on the interdiffusion coefficient as a function of concentration have been computed for each method. Measurements of small volume coefficients were improved by acquiring the profile line at a small angle with respect to the interdiffusion interface.
APA, Harvard, Vancouver, ISO, and other styles
24

Slaughter, Jean G. "Motives of Uncertainty: Accurate Self-Assessment or Self-Handicapping?" W&M ScholarWorks, 1987. https://scholarworks.wm.edu/etd/1539625424.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Filipsson, Monika. "Uncertainty, variability and environmental risk analysis." Doctoral thesis, Linnéuniversitetet, Institutionen för naturvetenskap, NV, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-11193.

Full text
Abstract:
The negative effects of hazardous substances and possible measures that can be taken are evaluated in the environmental risk analysis process, consisting of risk assessment, risk communication and risk management. Uncertainty due to lack of knowledge and natural variability are always present in this process. The aim of this thesis is to evaluate some tools as well as discuss the management of uncertainty and variability, as it is necessary to treat them both in a reliable and transparent way to gain regulatory acceptance in decision making. The catalytic effects of various metals on the formation of chlorinated aromatic compounds during the heating of fly ash were investigated (paper I). Copper showed a positive catalytic effect, while cobalt, chromium and vanadium showed a catalytic effect for degradation. Knowledge of the catalytic effects may facilitate the choice and design of combustion processes to decrease emissions, but it also provides valuable information to identify and characterize the hazard. Exposure factors of importance in risk assessment (physiological parameters, time use factors and food consumption) were collected and evaluated (paper II). Interindividual variability was characterized by mean, standard deviation, skewness, kurtosis and multiple percentiles, while uncertainty in these parameters was estimated with confidence intervals. How these statistical parameters can be applied was shown in two exposure assessments (papers III and IV). Probability bounds analysis was used as a probabilistic approach, which enables separate propagation of uncertainty and variability even in cases where the availability of data is limited. In paper III it was determined that the exposure cannot be expected to cause any negative health effects for recreational users of a public bathing place. Paper IV concluded that the uncertainty interval in the estimated exposure increased when accounting for possible changes in climate-sensitive model variables. 
Risk managers often need to rely on precaution and an increased uncertainty may therefore have implications for risk management decisions. Paper V focuses on risk management and a questionnaire was sent to employees at all Swedish County Administrative Boards working with contaminated land. It was concluded that the gender, age and work experience of the employees, as well as the funding source of the risk assessment, all have an impact on the reviewing of risk assessments. Gender was the most significant factor, and it also affected the perception of knowledge.
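Separating epistemic uncertainty from interindividual variability, as probability bounds analysis does, can also be illustrated with a nested (two-dimensional) Monte Carlo sketch: an outer loop over uncertain population parameters and an inner loop over variability. This is a related but simpler technique than the probability bounds analysis used in papers III and IV, and all distributions and numbers are invented:

```python
import numpy as np

rng = np.random.default_rng(7)
n_unc, n_var = 200, 5000

p95_doses = []
for _ in range(n_unc):
    # Outer loop: epistemic uncertainty about population parameters
    mean_intake = rng.uniform(0.8, 1.2)   # mean water intake (L/day)
    conc = rng.uniform(4.0, 6.0)          # contaminant conc. (ug/L)
    # Inner loop: interindividual variability around that mean
    intake = rng.lognormal(np.log(mean_intake), 0.3, n_var)
    dose = conc * intake / 70.0           # ug/kg/day, 70 kg adult
    p95_doses.append(np.percentile(dose, 95))

# Uncertainty band on the 95th-percentile (high-end) variability dose
lo, hi = np.percentile(p95_doses, [2.5, 97.5])
```

Each outer draw yields one variability distribution; collecting a percentile of each produces an uncertainty interval around that percentile, the separation the thesis argues is needed for transparent decision making.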
APA, Harvard, Vancouver, ISO, and other styles
26

Clausen, Mork Jonas. "Dealing with uncertainty." Doctoral thesis, KTH, Filosofi, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-72680.

Full text
Abstract:
Uncertainty is, it seems, more or less constantly present in our lives. Even so, grasping the concept philosophically is far from trivial. In this doctoral thesis, uncertainty and its conceptual companion information are studied. Axiomatic analyses are provided and numerical measures suggested. In addition to these basic conceptual analyses, the widespread practice of so-called safety factor use in societal regulation is analyzed along with the interplay between science and policy in European regulation of chemicals and construction.
APA, Harvard, Vancouver, ISO, and other styles
27

Ericok, Ozlen. "Uncertainty Assessment In Reserve Estimation Of A Naturally Fractured Reservoir." Master's thesis, METU, 2004. http://etd.lib.metu.edu.tr/upload/2/12605713/index.pdf.

Full text
Abstract:
UNCERTAINTY ASSESSMENT IN RESERVE ESTIMATION OF A NATURALLY FRACTURED RESERVOIR. Eriçok, Özlen. M.S., Department of Petroleum and Natural Gas Engineering. Supervisor: Prof. Dr. Fevzi Gümrah. December 2004, 169 pages.
Reservoir performance prediction and reserve estimation depend on various petrophysical parameters which have uncertainties due to available technology. For a proper and economical field development, these parameters must be determined by taking into consideration their uncertainty level and probable data ranges. To implement uncertainty assessment in the estimation of original oil in place (OOIP), a naturally fractured carbonate field, Field-A, was chosen. Since field information is obtained by drilling and testing wells throughout the field, uncertainty about the true ranges of reservoir parameters remains, because it is impossible to drill at every location in an area. This study is based on defining the probability distributions of the uncertain variables in reserve estimation and evaluating the probable reserve amount using the Monte Carlo simulation method. Probabilistic reserve estimation gives the whole range of the probable original oil in place of a field. The results are summarised by their likelihood of occurrence as P10, P50 and P90 reserves. In this study, the reserves of Field-A in Southeast Turkey are estimated by probabilistic methods for three producing zones: the Karabogaz Formation, the Kbb-C Member of the Karababa Formation and the Derdere Formation. Probability density functions of the petrophysical parameters are used as inputs to the volumetric reserve estimation method, and probable reserves are calculated with the @Risk software, which implements the Monte Carlo method. The simulation outcomes show that Field-A has P50 reserves of 11.2 MMstb in the matrix and 2.0 MMstb in the fractures of the Karabogaz Formation, 15.7 MMstb in the matrix and 3.7 MMstb in the fractures of the Kbb-C Member, and 10.6 MMstb in the matrix and 1.6 MMstb in the fractures of the Derdere Formation. Sensitivity analysis of the inputs showed that matrix porosity, net thickness and fracture porosity are most significant in the Karabogaz Formation and Kbb-C Member reserve estimates, while water saturation and fracture porosity are most significant in the Derdere Formation estimate.
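The probabilistic volumetric workflow described above can be sketched in a few lines (here with NumPy rather than @Risk). The distributions and constants are illustrative, not Field-A data; note the petroleum convention that P10 denotes the optimistic (90th-percentile) outcome:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Assumed input distributions for one producing zone
area = rng.triangular(800.0, 1000.0, 1300.0, n)        # acres
h = rng.triangular(40.0, 60.0, 90.0, n)                # net thickness, ft
phi = np.clip(rng.normal(0.12, 0.015, n), 0.05, 0.25)  # matrix porosity
sw = np.clip(rng.normal(0.35, 0.05, n), 0.10, 0.70)    # water saturation
bo = rng.uniform(1.1, 1.3, n)                          # FVF, rb/stb

# Volumetric OOIP in stock-tank barrels (7758 bbl per acre-ft)
ooip = 7758.0 * area * h * phi * (1.0 - sw) / bo

p90, p50, p10 = np.percentile(ooip, [10, 50, 90])
```

Sensitivity ranking of the kind reported in the thesis can be obtained by correlating each sampled input with the resulting `ooip` values.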
APA, Harvard, Vancouver, ISO, and other styles
28

Yuan, Chengwu. "An efficient Bayesian approach to history matching and uncertainty assessment." Texas A&M University, 2005. http://hdl.handle.net/1969.1/4962.

Full text
Abstract:
Conditioning reservoir models to production data and assessing uncertainty can be done via Bayes' theorem. This inverse problem can be computationally intensive, generally requiring orders of magnitude more computation time than the forward flow simulation, which makes it impractical to assess uncertainty through multiple history-matching realizations in field applications. We propose a robust adaptation of the Bayesian formulation which overcomes the current limitations and is suitable for large-scale applications. It is based on a generalized travel time inversion and utilizes a streamline-based analytic approach to compute the sensitivity of the travel time with respect to reservoir parameters. Streamlines are computed from the velocity field that is available from finite-difference simulators. We use an iterative minimization algorithm based on efficient SVD (singular value decomposition) and a numerical ‘stencil’ for calculation of the square root of the inverse of the prior covariance matrix. This approach is computationally efficient, and the linear scaling of CPU time with increasing model size makes it suitable for large-scale applications. It then becomes feasible to assess uncertainty by sampling from the posterior probability distribution using the Randomized Maximum Likelihood method, an approximate Markov chain Monte Carlo algorithm. We apply this approach in a field case from the Goldsmith San Andres Unit (GSAU) in West Texas. In the application, we show the effect of prior modeling on posterior uncertainty by comparing the results from prior modeling by Cloud Transform with those from Collocated Sequential Gaussian Simulation. Exhausting prior information reduces both the prior uncertainty and the posterior uncertainty after dynamic data integration, and thus improves the accuracy of predictions of future performance.
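The SVD-based minimization step can be illustrated with a truncated-SVD least-squares update for a linearised inversion. The sensitivity matrix here is random noise standing in for the streamline-derived travel-time sensitivities, and the truncation threshold is an arbitrary choice:

```python
import numpy as np

rng = np.random.default_rng(0)
n_data, n_param = 30, 10

# Toy sensitivity matrix G = d(travel time)/d(parameter); the last
# column is made tiny to mimic a poorly resolved parameter.
G = rng.normal(size=(n_data, n_param))
G[:, -1] *= 1e-8
dt = rng.normal(size=n_data)   # travel-time residuals (data misfit)

# Truncated-SVD pseudo-inverse: drop small singular values that would
# otherwise amplify noise into poorly resolved parameter directions.
U, s, Vt = np.linalg.svd(G, full_matrices=False)
keep = s > 1e-6 * s[0]
dm = Vt[keep].T @ ((U[:, keep].T @ dt) / s[keep])  # model update
```

Truncation regularises the update: the unresolved parameter receives essentially no correction instead of an enormous, noise-driven one.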
APA, Harvard, Vancouver, ISO, and other styles
29

MacAulay, Gavin. "Characterisation of structured surfaces and assessment of associated measurement uncertainty." Thesis, Brunel University, 2016. http://bura.brunel.ac.uk/handle/2438/13473.

Full text
Abstract:
Recently, structured surfaces, consisting of deterministic features designed to produce a particular effect, have shown promise in providing superior functional performance for a range of applications including: low friction surfaces, hydrophobic surfaces and optical effects. Methods have been developed to characterise such structured surfaces. The most widely used characterisation methods are based on segmenting the surface into feature and background regions and then determining the geometrical properties of those features. However, further work is needed to refine these characterisation techniques and provide associated uncertainties. This thesis considers the effect of various segmentation control parameters, such as thresholds, on the final geometric parameters. The effect of varying filter size is also considered. These considerations should help in selecting a suitable characterisation method for future projects. Additionally, uncertainty in the characterisation should be estimated in order to give an indication of the accuracy of the assessment. However, no previous work has assessed uncertainty in the dimensional properties of structured surfaces. Therefore, this thesis presents two methods to characterise the uncertainty in the geometric characteristics of structured surfaces. First, the measurement reproducibility is used, which can be determined by repeated measurement of a feature. However, measurement reproducibility cannot account for all sources of uncertainty and cannot assess any bias in the measurements. Therefore, a second method based on assessment of the metrological characteristics of the instrument is considered. The metrological characteristics estimate errors produced by the instrument in a way that can easily be measured. Monte Carlo techniques are then used to propagate the effects of the metrological characteristics and their uncertainties into the final measurement uncertainty.
For the example used, it was found that the results using the metrological characteristics were in good agreement with the reproducibility results. From these results, it is concluded that the choice of segmentation method, control parameters and filtering can all significantly affect the characterisation of features on a structured surface, often in unexpected ways. Therefore, care must be taken when selecting these values for a specific application. Additionally, two methods of determining the uncertainty of structured surfaces were considered. Both methods are valid and produce similar results. Using the measurement reproducibility is simple to perform, but requires many measurements and cannot account for some uncertainty sources, such as those due to the instrument amplification factors. On the other hand, the use of metrological characteristics can account for all significant sources of uncertainty in a measurement, but is mathematically more complex, requiring Monte Carlo simulations to propagate the uncertainties into the final characteristics. Additionally, artefacts other than the sample being measured are required to determine the metrological characteristics, which may be an issue in some cases.
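Propagating instrument metrological characteristics into a measurement uncertainty by Monte Carlo can be sketched as below. The measurement model, characteristic values and their standard uncertainties are all invented, not taken from the thesis's instrument:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Hypothetical model for a measured step height (micrometres):
# corrected height = (indicated - zero offset) / amplification + noise
h_indicated = 10.002
amplification = rng.normal(1.000, 0.002, n)  # amplification coefficient
offset = rng.normal(0.0, 0.005, n)           # zero offset (um)
noise = rng.normal(0.0, 0.003, n)            # repeatability (um)

h = (h_indicated - offset) / amplification + noise

mean_h = h.mean()
u_h = h.std(ddof=1)                      # standard uncertainty
lo, hi = np.percentile(h, [2.5, 97.5])   # ~95% coverage interval
```

Each metrological characteristic contributes through the measurement model, so the amplification term dominates here because it scales the full 10 um height; the percentile interval gives the coverage interval directly, without assuming normality of the output.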
APA, Harvard, Vancouver, ISO, and other styles
30

Koops, Marten A. "Misinformation and assessment uncertainty in the ecology of information use." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp03/NQ35042.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Dörr, Ulrike [Verfasser]. "Subjective Self-Assessment and Decision Making under Uncertainty / Ulrike Dörr." Kiel : Universitätsbibliothek Kiel, 2013. http://d-nb.info/1036406296/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Goksel, Lorens Sarim. "Fatigue and damage tolerance assessment of aircraft structure under uncertainty." Thesis, Georgia Institute of Technology, 2013. http://hdl.handle.net/1853/49124.

Full text
Abstract:
This thesis presents a new modeling framework and application methodology for the study of aircraft structures. The framework provides a ‘cradle-to-grave’ approach to structural analysis of a component, where structural integrity encompasses all phases of its lifespan. The methodology examines the holistic structural design of aircraft components by integrating fatigue and damage tolerance methodologies. It accomplishes this by marrying the load inputs from a fatigue analysis for a new design into a risk analysis for an existing design. The risk analysis incorporates the variability found in the literature, including recorded defects, loadings, and material strength properties. The methodology is verified via formal conceptualization of the structures, which is demonstrated on an actual hydraulic accumulator and an engine nacelle inlet. The hydraulic accumulator is examined for structural integrity utilizing different base materials undergoing variable amplitude loading. Integrity is assessed through a risk analysis by means of fault tree analysis. The engine nacelle inlet uses the damage tolerance philosophy for a sonic fatigue condition undergoing both constant amplitude loading and a theoretical flight design case. Residual strength changes are examined throughout crack growth, where structural integrity is assessed through a risk analysis of component strength versus probability of failure. Both methodologies can be applied to nearly any structural application, not necessarily limited to aerospace.
APA, Harvard, Vancouver, ISO, and other styles
33

Lea, Francesca C. "Uncertainty in condition and strength assessment of reinforced concrete bridges." Thesis, University of Cambridge, 2005. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.614976.

Full text
34

Lee, Renée. "Uncertainty and correlation in seismic risk assessment of transportation systems /." May be available electronically:, 2007. http://proquest.umi.com/login?COPT=REJTPTU1MTUmSU5UPTAmVkVSPTI=&clientId=12498.

Full text
35

Maharaja, Amisha. "Global net-to-gross uncertainty assessment at reservoir appraisal stage /." May be available electronically:, 2007. http://proquest.umi.com/login?COPT=REJTPTU1MTUmSU5UPTAmVkVSPTI=&clientId=12498.

Full text
36

Keller, Armin. "Assessment of uncertainty in modelling heavy metal balances of regional agroecosystems /." [S.l.] : [s.n.], 2000. http://e-collection.ethbib.ethz.ch/show?type=diss&nr=13944.

Full text
37

Zagonjolli, Migena. "Dam break modelling, risk assessment and uncertainty analysis for flood mitigation /." London : Taylor & Francis, 2007. http://opac.nebis.ch/cgi-bin/showAbstract.pl?u20=9780415455947.

Full text
38

Baalousha, Husam M. [Verfasser]. "Risk Assessment and Uncertainty Analysis in Groundwater Modelling / Husam M Baalousha." Aachen : Shaker, 2004. http://d-nb.info/1172614350/34.

Full text
39

Diggle, Rebecca. "Regulatory science and uncertainty in the risk assessment of pesticide residues." Thesis, University of Nottingham, 2010. http://eprints.nottingham.ac.uk/11451/.

Full text
Abstract:
In this thesis I examine how the scientific advisory system in England and Wales has responded to concerns about the risks of pesticide residues in food and demands for wider engagement in the formulation of advice. Specifically, I explore how the Advisory Committee on Pesticides (ACP) frames scientific uncertainties in risk assessment, and why some bodies outside and within government are critical of the ACP’s approach that is centred in the conventional single-chemical, high-dose-response paradigm of toxicology. Although some of these challenges date back to the early history of pesticide regulation in England and Wales, the emergence of scientific research employing different methods to assess the effects of chemical mixtures and chronic low-level exposure has stimulated new concerns about the risks posed by pesticide residues for human health. Using semi-structured interviews and documentary analysis, a key finding is that concerns about low-level exposure to chemical mixtures have been persistently bracketed in official advice as insufficient for changing current advice and regulation. Drawing from literature in science and technology studies, I account for this finding in three ways. First, it is perceived that change is unnecessary since established methods of pesticide risk assessment represent an exemplar for other domains. Secondly, evidence selection by the ACP and related committees is shaped by regulatory guidelines which aim to provide standardisation and quality assurance, but also constrain judgements about which risk assessment studies are considered admissible. Thirdly, fundamentally different notions are at play in terms of what constitutes legitimate expertise and who should embody it, leading to tensions within government as well as between the ACP and NGOs. 
These limit the impact of post-BSE attempts to make the role of scientific advice in policy-making more participatory and ‘evidence-based’, and the capacity to introduce new paradigms of chemical risk assessment in the pesticide advisory process.
40

Aoudé, Georges Salim. "Threat assessment for safe navigation in environments with uncertainty in predictability." Thesis, Massachusetts Institute of Technology, 2011. http://hdl.handle.net/1721.1/68401.

Full text
Abstract:
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Aeronautics and Astronautics, 2011.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 213-224).
This thesis develops threat assessment algorithms to improve the safety of decision making for autonomous and human-operated vehicles navigating in dynamic and uncertain environments, where the source of uncertainty is the predictability of nearby vehicles' future trajectories. The first part of the thesis introduces two classes of algorithms to classify driver behaviors at road intersections, based on Support Vector Machines (SVM) and Hidden Markov Models (HMM). These algorithms are successfully validated using a large real-world intersection dataset and can be used as part of future driver assistance systems. They are also compared to three popular traditional methods, and the results show significant and consistent improvements with the developed algorithms. The second part of the thesis presents an efficient trajectory prediction algorithm developed to improve the performance of future collision avoidance and detection systems. The proposed approach, RR-GP, combines the Rapidly-exploring Random Tree (RRT)-based algorithm RRT-Reach with mixtures of Gaussian Processes (GP) to compute dynamically feasible paths in real time, while embedding the flexibility of the GP's nonparametric Bayesian model. RR-GP efficiently approximates the reachability sets of surrounding vehicles, and is shown in simulation and on naturalistic data to improve performance over two standard GP-based algorithms. The third part introduces new path planning algorithms that build upon the tools previously introduced in this thesis. The focus is on safe autonomous navigation in the presence of other vehicles with uncertain motion patterns. First, it presents a new threat assessment module (TAM) that combines the RRT-Reach algorithm with an SVM-based intention predictor to develop a threat-aware path planner. The strengths of this approach are demonstrated through simulation and experiments performed in the MIT RAVEN testbed.
Second, another novel path planning technique is developed by integrating the RR-GP trajectory prediction algorithm with a state-of-the-art chance-constrained RRT planner. This framework provides several theoretical guarantees on the probabilistic satisfaction of collision avoidance constraints. Extensive simulation results show that the resulting approach can be used in real-time to efficiently and accurately execute safe paths. The last part of the thesis considers the decision-making problem for a human-driven vehicle crossing a road intersection in the presence of other, potentially errant, drivers. The proposed approach uses the TAM framework to compute the threat level in real-time, and provides the driver with a warning signal and the best escape maneuver through the intersection. Experimental results with small autonomous and human-driven vehicles in the RAVEN testbed demonstrate that this approach can be successfully used in real-time to minimize the risk of collision in urban-like environments.
by Georges Salim Aoudé.
Ph.D.
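A minimal sketch of the HMM-based intention classification idea from the first part of this thesis: score an observed maneuver-symbol sequence under two candidate behavior models with the forward algorithm and pick the likelier class. All matrices and the symbol coding (0 = decelerating, 1 = steady, 2 = accelerating) are illustrative assumptions, not the models from the thesis.

```python
import numpy as np

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the scaled forward algorithm."""
    alpha = pi * B[:, obs[0]]
    loglik = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        c = alpha.sum()
        loglik += np.log(c)
        alpha = alpha / c
    return loglik

pi = np.array([0.5, 0.5])                 # initial state distribution
A = np.array([[0.9, 0.1], [0.1, 0.9]])    # state transition probabilities
B_compliant = np.array([[0.7, 0.2, 0.1],  # emission probabilities per state
                        [0.5, 0.4, 0.1]])
B_violating = np.array([[0.1, 0.3, 0.6],
                        [0.1, 0.2, 0.7]])

def classify(obs):
    """Label a sequence by the higher-likelihood behavior model."""
    ll_c = forward_loglik(obs, pi, A, B_compliant)
    ll_v = forward_loglik(obs, pi, A, B_violating)
    return "compliant" if ll_c > ll_v else "violating"

print(classify([0, 0, 1, 0, 0]))  # mostly decelerating -> "compliant"
```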
41

Allaire, Douglas L. "Uncertainty assessment of complex models with application to aviation environmental systems." Thesis, Massachusetts Institute of Technology, 2009. http://hdl.handle.net/1721.1/50601.

Full text
Abstract:
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Aeronautics and Astronautics, 2009.
Includes bibliographical references (p. 131-136).
Numerical simulation models that support decision-making and policy-making processes are often complex, involving many disciplines and requiring long computation times. These models typically have many factors of different character, such as operational, design-based, technological, and economics-based factors. Such factors generally contain uncertainty, which leads to uncertainty in model outputs. For such models, it is critical to both the application of model results and the future development of the model that uncertainty be properly assessed. This thesis presents a comprehensive approach to the uncertainty assessment of complex models intended to support decision- and policy-making processes. The approach consists of seven steps: establishing assessment goals, documenting assumptions and limitations, documenting model factors and outputs, classifying and characterizing factor uncertainty, conducting uncertainty analysis, conducting sensitivity analysis, and presenting results. Factor uncertainty is represented probabilistically, characterized by the principle of maximum uncertainty, and propagated via Monte Carlo simulation. State-of-the-art methods of global sensitivity analysis are employed to apportion model output variance across model factors, and a fundamental extension of global sensitivity analysis, termed distributional sensitivity analysis, is developed to determine on which factors future research should focus to reduce output variability.
(cont.) The complete approach is demonstrated on a real-world model intended to estimate the impacts of aviation on climate change in support of decision- and policy-making, where it is established that a systematic approach to uncertainty assessment is critical to the proper application and future development of complex models. A novel surrogate modeling methodology designed specifically for uncertainty assessment is also presented and demonstrated for an aircraft emissions prediction model that is being developed and applied to support aviation environmental policy-making. The results demonstrate how confidence intervals on surrogate model predictions can be used to balance the tradeoff between computation time and uncertainty in the estimation of statistical outputs of interest in uncertainty assessment.
by Douglas Lawrence Allaire.
Ph.D.
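The variance apportionment this abstract refers to can be illustrated with a pick-freeze Monte Carlo estimator of first-order Sobol' indices. The toy model below (output variance split 1:4 between two standard-normal inputs) is an assumption for demonstration, not the aviation model from the thesis.

```python
import numpy as np

def model(x):
    # toy model: Var(Y) = 1 + 4, so S1 = 0.2 and S2 = 0.8 analytically
    return x[:, 0] + 2.0 * x[:, 1]

def sobol_first_order(f, d, n=100_000, seed=0):
    """Pick-freeze Monte Carlo estimate of first-order Sobol' indices
    for d independent standard-normal inputs."""
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((n, d))
    B = rng.standard_normal((n, d))
    yA, yB = f(A), f(B)
    var = yA.var()
    S = []
    for i in range(d):
        Ci = B.copy()
        Ci[:, i] = A[:, i]          # freeze factor i, resample the rest
        yC = f(Ci)
        S.append((np.mean(yA * yC) - yA.mean() * yB.mean()) / var)
    return S

S = sobol_first_order(model, d=2)
print([round(s, 2) for s in S])  # close to [0.2, 0.8]
```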
42

Liem, Rhea Patricia. "System level assessment of uncertainty in aviation environmental policy impact analysis." Thesis, Massachusetts Institute of Technology, 2010. http://hdl.handle.net/1721.1/62318.

Full text
Abstract:
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Aeronautics and Astronautics, 2010.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (p. 83-93).
This thesis demonstrates the assessment of uncertainty of a simulation model at the system level, which takes into account the interaction between the modules that comprise the system. Results from this system level assessment process aid policy-makers by identifying the key drivers of uncertainty in model outputs, among the input factors of the various modules that comprise the system. This knowledge can help direct resource allocation for research to reduce the uncertainty in policy outputs. The assessment results can also identify input factors that, when treated as deterministic variables, will not significantly affect the output variability. The system level assessment process is demonstrated on a model that estimates the air quality impacts of aviation. The model comprises two modules: the Aviation Environmental Design Tool (AEDT), which simulates aircraft operations to estimate performance and emissions inventories, and the Aviation environmental Portfolio Management Tool (APMT)-Impacts Air Quality module, which estimates the health and welfare impacts associated with aviation emissions. Global sensitivity analysis is employed to quantify the contribution of uncertainty in each input factor to the variability of system outputs, which here are adult mortality rates and total health cost. The assessment results show that none of the input factors of AEDT contribute significantly to the variability of system outputs. Therefore, if uncertainty reduction in the estimation of adult mortality and total health cost is desired, future research efforts should be directed towards gaining more knowledge on the input factors of the APMT-Impacts Air Quality module. This thesis also demonstrates the application of system level assessment in policy impact analysis, where policy impact is defined as the incremental change between baseline and policy outputs.
In such an analysis, it is important to ensure that the uncertainty in policy impacts only accounts for the uncertainty corresponding to the difference between baseline and policy scenarios. Some input factors have a common source of uncertainty between scenarios, in which case the same representation of uncertainty must be used. Other input factors, on the other hand, are assumed to have independent variability between the different scenarios, and therefore need to have independent representation of uncertainty. This thesis demonstrates uncertainty assessment of a technology infusion policy analysis.
by Rhea Patricia Liem.
S.M.
43

Garibaldi, Jonathan Mark. "Intelligent techniques for handling uncertainty in the assessment of neonatal outcome." Thesis, University of Plymouth, 1997. http://hdl.handle.net/10026.1/1900.

Full text
Abstract:
Objective assessment of the neonatal outcome of labour is important, but it is a difficult and challenging problem. It is an invaluable source of information which can be used to provide feedback to clinicians, to audit a unit's overall performance, and can guide subsequent neonatal care. Current methods are inadequate as they fail to distinguish damage that occurred during labour from damage that occurred before or after labour. Analysis of the chemical acid-base status of blood taken from the umbilical cord of an infant immediately after delivery provides information on any damage suffered by the infant due to lack of oxygen during labour. However, this process is complex and error prone, and requires expertise which is not always available on labour wards. A model of clinical expertise required for the accurate interpretation of umbilical acid-base status was developed, and encapsulated in a rule-based expert system. This expert system checks results to ensure their consistency, identifies whether the results come from arterial or venous vessels, and then produces an interpretation of their meaning. This 'crisp' expert system was validated, verified and commercially released, and has since been installed at twenty-two hospitals all around the United Kingdom. The assessment of umbilical acid-base status is characterised by uncertainty in both the basic data and the knowledge required for its interpretation. Fuzzy logic provides a technique for representing both these forms of uncertainty in a single framework. A 'preliminary' fuzzy-logic based expert system to interpret error-free results was developed, based on the knowledge embedded in the crisp expert system. Its performance was compared against clinicians in a validation test; initially it was found to perform worse than both the clinicians and the crisp expert system.
An automatic tuning algorithm was developed to modify the behaviour of the fuzzy model utilised in the expert system. Sub-normal membership functions were used to weight terms in the fuzzy expert system in a novel manner. This resulted in an improvement in the performance of the fuzzy expert system to a level comparable to the clinicians, and superior to the crisp expert system. Experimental work was carried out to evaluate the imprecision in umbilical cord acid-base parameters. This information, in conjunction with fresh knowledge elicitation sessions, allowed the creation of a more comprehensive fuzzy expert system, to validate and interpret all acid-base data. This 'integrated' fuzzy expert system was tuned using the comparison data obtained previously, and incorporated vessel identification rules and interpretation rules, with numeric and linguistic outputs for each. The performance of each of the outputs was evaluated in a rigorous validation study. This demonstrated excellent agreement with the experts for the numeric outputs, and agreement on a par with the experts for the linguistic outputs. The numeric interpretation produced by the fuzzy expert system is a novel single dimensional measure that accurately represents the severity of acid-base results. The development of the crisp and fuzzy expert systems represents a major achievement and constitutes a significant contribution to the assessment of neonatal outcome.
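The 'sub-normal membership function' device mentioned in this abstract can be sketched as scaling a fuzzy term's membership below 1 so it acts as a term weight in rule evaluation. The triangular 'acidotic pH' term and the weight 0.8 below are invented for illustration, not taken from the thesis.

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def subnormal(mu, w):
    """Sub-normal membership: the peak is scaled down to w <= 1,
    down-weighting the term's influence in rule evaluation."""
    return w * mu

mu = tri(7.05, 6.9, 7.0, 7.2)   # membership of pH 7.05 in the term -> 0.75
print(subnormal(mu, 0.8))       # down-weighted term -> 0.6
```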
44

Srivastava, Anruag. "A Computational Framework for Dam Safety Risk Assessment with Uncertainty Analysis." DigitalCommons@USU, 2013. https://digitalcommons.usu.edu/etd/1480.

Full text
Abstract:
The growing application of risk analysis in dam safety, especially for the owners of large numbers of dams (e.g., U.S. Army Corps of Engineers), has motivated the development of a new tool (DAMRAE) for event tree based dam safety risk analysis. Various theoretical challenges were overcome in formulating the computational framework of DAMRAE and several new computational concepts were introduced. The concepts of Connectivity and Pedigree matrices are proposed to quantify the user-drawn event tree structures with proper accounting of interdependencies among the event tree branches. A generic calculation of Common-Cause Adjustment for the non-mutually exclusive failure modes is implemented along with introducing the new concepts of system response probability and consequence freezing. New output presentation formats such as cumulative risk estimate vs. initiating variable plots to analyze the increase of an incremental (annualized) risk estimate as a function of initiating variable are introduced. An additional consideration is given to the non-breach risk estimates in the risk modeling and new output formats such as non-breach F-N and F-$ charts are included as risk analysis outputs. DAMRAE, a Visual Basic.NET based framework, provides a convenient platform to structure the risk assessment of a dam in its existing state and for alternatives or various stages of implementing a risk reduction plan. The second chapter of the dissertation presents the architectural framework of DAMRAE and describes the underlying theoretical and computational logic employed in the software. An example risk assessment is presented in the third chapter to demonstrate the DAMRAE functionalities. In the fourth chapter, the DAMRAE framework is extended into DAMRAE-U to incorporate uncertainty analysis functionality. 
Various aspects and requirements reviewed for uncertainty analysis in the context of dam safety risk assessment and theoretical challenges overcome to develop the computational framework for DAMRAE-U are described in this chapter. The capabilities of DAMRAE-U are illustrated in the fifth chapter, which contains an example dam safety risk assessment with uncertainty analysis. The dissertation concludes with a summary of DAMRAE features and recommendations for further work in the sixth chapter.
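The event-tree arithmetic described in this abstract can be sketched as products of conditional branch probabilities along failure paths, with a bound-style common-cause adjustment for non-mutually exclusive failure modes. The tree structure and branch probabilities below are hypothetical, not values from DAMRAE.

```python
def path_probability(branches):
    """Product of conditional probabilities along one event-tree path."""
    p = 1.0
    for b in branches:
        p *= b
    return p

# Illustrative failure paths: initiating-event frequency followed by
# conditional system-response probabilities.
paths = [
    [1e-2, 0.10, 0.30],   # flood -> spillway gates fail -> breach
    [1e-2, 0.90, 0.01],   # flood -> gates operate -> breach anyway
]
p_modes = [path_probability(p) for p in paths]

# Common-cause adjustment for non-mutually exclusive modes: the combined
# probability lies between perfect dependence (the max) and independence
# (the de Morgan union), and is below the simple sum.
lower = max(p_modes)
upper = 1.0
for p in p_modes:
    upper *= 1.0 - p
upper = 1.0 - upper
print(f"combined annual failure probability in [{lower:.2e}, {upper:.2e}]")
```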
45

Thorsén, Erik. "Assessment of the uncertainty in small and large dimensional portfolio allocation." Licentiate thesis, Stockholms universitet, Matematiska institutionen, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-176095.

Full text
Abstract:
Portfolio theory is a large subject with many branches. In this thesis we concern ourselves with one of these: the presence of uncertainty in the portfolio allocation problem and, in turn, what it leads to. There are many forms of uncertainty; we consider two of these. The first is the optimization problem itself and optimizing what might be the wrong objective. In the classical mean-variance portfolio problem we aim to provide a portfolio with the smallest risk while we constrain the mean. However, in practice we might not assign a fixed portfolio goal but assign probabilities to the amount of return a portfolio might give and its relation to benchmarks. That is, we assign quantiles of the portfolio return distribution. In this scenario, the use of the portfolio mean as a return measure could be misleading: it does not take any quantile into account! In the first paper, we exchange the portfolio moments for quantile-based measures in the portfolio selection problem. The properties of the quantile-based portfolio selection problem are thereafter investigated with two different (quantile-based) measures of risk. We also present a closed-form solution under the assumption that the returns follow an elliptical distribution; in this specific case the portfolio is shown to be mean-variance efficient. The second paper takes on a different type of uncertainty which is classic to statistics: the problem of estimation uncertainty. We consider the sample estimators of the mean vector and of the covariance matrix of the asset returns and integrate the uncertainty these provide into a large class of optimal portfolios. We derive the sampling distribution of the estimated optimal portfolio weights, which are obtained in both small and large dimensions. This consists of deriving the joint distribution of several quantities and thereafter specifying their high-dimensional asymptotic distribution.
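As a concrete illustration of the quantile-based risk measures the first paper studies, the sketch below computes empirical Value-at-Risk and Conditional Value-at-Risk for a portfolio from simulated returns. The return distribution and weights are invented for the example; the thesis itself works with closed-form results under elliptical distributions.

```python
import numpy as np

def portfolio_var_cvar(w, returns, alpha=0.05):
    """Empirical VaR and CVaR (expected shortfall) at level alpha
    for portfolio weights w over a sample of asset returns."""
    r = returns @ w
    var = -np.quantile(r, alpha)    # loss exceeded with probability alpha
    cvar = -r[r <= -var].mean()     # average loss beyond the VaR
    return var, cvar

rng = np.random.default_rng(1)
# simulated daily returns for a riskier and a safer asset
returns = rng.multivariate_normal(
    mean=[0.001, 0.0005],
    cov=[[4e-4, 1e-4], [1e-4, 1e-4]],
    size=10_000,
)

var_risky, cvar_risky = portfolio_var_cvar(np.array([1.0, 0.0]), returns)
var_safe, cvar_safe = portfolio_var_cvar(np.array([0.0, 1.0]), returns)
print(f"risky: VaR={var_risky:.4f} CVaR={cvar_risky:.4f}")
print(f"safe:  VaR={var_safe:.4f} CVaR={cvar_safe:.4f}")
```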
46

Becerra, Ernesto Jose. "Characterization and assessment of uncertainty in San Juan Reservoir Santa Rosa Field." Thesis, Texas A&M University, 2003. http://hdl.handle.net/1969.1/1628.

Full text
Abstract:
This study proposes a new, easily applied method to quantify uncertainty in production forecasts for a volumetric gas reservoir based on a material balance model (p/z vs. Gp). The new method uses only observed data and mismatches between regression values and observed values to identify the most probable value of gas reserves. The method also provides the range of probability of values of reserves from the minimum to the maximum likely value. The method is applicable even when only limited information is available from a field. Previous methods suggested in the literature require more information than our new method. Quantifying uncertainty in reserves estimation is becoming increasingly important in the petroleum industry. Many current investment opportunities in reservoir development require large investments, many in harsh exploration environments, with intensive technology requirements and possibly marginal investment indicators. Our method of quantifying uncertainty uses a priori information, which could come from different sources, typically from geological data, used to build a static or prior reservoir model. Additionally, we propose a method to determine the uncertainty in our reserves estimate at any stage in the life of the reservoir for which pressure-production data are available. We applied our method to San Juan reservoir at Santa Rosa Field, Venezuela. This field was ideal for this study because it is a volumetric reservoir for which the material balance method, the p/z vs. Gp plot, appears to be appropriate.
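The p/z-vs-Gp material balance this abstract builds on can be sketched in a few lines: for a volumetric gas reservoir, p/z declines linearly with cumulative production, and the x-intercept of the fitted line estimates the gas initially in place. The numbers below are synthetic, not Santa Rosa field data.

```python
import numpy as np

true_G, pz_initial = 100.0, 5000.0            # assumed truth [Bscf], [psia]
Gp = np.array([0.0, 10.0, 20.0, 30.0, 40.0])  # cumulative production [Bscf]
rng = np.random.default_rng(2)
# synthetic observed p/z values with measurement noise
pz = pz_initial * (1.0 - Gp / true_G) + rng.normal(0.0, 20.0, Gp.size)

slope, intercept = np.polyfit(Gp, pz, 1)      # straight-line regression
G_est = -intercept / slope                    # x-intercept = reserves estimate
print(f"estimated reserves G ~ {G_est:.1f} Bscf")
```

The mismatch between the regression line and the observed points is exactly the quantity the study uses to attach a probability range to the reserves estimate.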
47

Spindler, Henry C. (Henry Carlton) 1970. "Residential building energy analysis : development and uncertainty assessment of a simplified model." Thesis, Massachusetts Institute of Technology, 1998. http://hdl.handle.net/1721.1/70305.

Full text
Abstract:
Thesis (M.S.)--Massachusetts Institute of Technology, Dept. of Architecture, 1998.
Includes bibliographical references (p. 163-165).
Effective design of energy-efficient buildings requires attention to energy issues during the preliminary stages of design. To aid in the early consideration of a building's future energy usage, a simplified building energy analysis model was developed. Using this model, a new computer program was written in C/C++ to calculate annual heat and cooling loads for residential buildings and to provide information about the relative importance of load contributions from the different building components. Estimates were made regarding the uncertainties of parameter inputs to the model, such as material properties, heat transfer coefficients and infiltration rates. The new computer program was used to determine the sensitivity of annual heat and cooling loads to model input uncertainties. From the results of these sensitivity studies, it was estimated that the overall uncertainties in the annual sensible heat and cooling load predictions amount to approximately ±30% and ±40%, respectively, for two buildings studied in Boston, Massachusetts. Further model simplification techniques were implemented that reduced annual load calculation times on a 180 MHz computer to about 8 and 12 seconds for a lightweight and massive building, respectively. The error introduced by these simplifications was approximately 4% and 10% for the annual sensible heat and cooling loads, well below the overall uncertainties in the load predictions. Comparison studies were performed with this new computer program and Energy-10. Overall, good agreement between the programs' annual load predictions was found.
by Henry C. Spindler.
M.S.
48

Jehan, Zainab. "Monetary policy rules, total factor productivity growth and uncertainty : an empirical assessment." Thesis, University of Sheffield, 2013. http://etheses.whiterose.ac.uk/4574/.

Full text
Abstract:
This dissertation explores how uncertainty affects different facets of an economy through three empirical essays. First, we present an analytical framework to examine the policy reaction function of a central bank in an open economy context while allowing for asymmetric preferences. This implies that policy makers can weigh negative and positive deviations of target variables (inflation and output gap) from their corresponding targets differently. We use an open economy New-Keynesian forward-looking model where aggregate demand and supply depend on the real exchange rate. Using quarterly data ranging from 1979q1-2007q4 for Canada, Japan, the UK and the US, the empirical evaluation is drawn through generalized methods of moments. The results strongly favour the presence of asymmetries in the response of monetary policy towards both the inflation rate and the output gap for all sample countries. The estimates show that central banks follow an active monetary policy. Also, there is evidence that changes in the foreign interest rate and exchange rate significantly affect domestic monetary policy formation. Second, we examine the role of various sources of uncertainty on total factor productivity growth. Specifically, this essay estimates the role of uncertainty emanating from the global, country, and industry levels on TFP growth in manufacturing industries of sixteen emerging economies. For this purpose, we use annual data covering the period from 1971-2008. Our findings suggest a significant impact of each source of uncertainty on TFP growth. Particularly, we observe that industry- and country-specific uncertainty have a positive impact on TFP growth of manufacturing industries. However, global uncertainty has a statistically significant and negative impact on TFP growth. We also provide evidence that the impact of industry-specific uncertainty strengthens as the size of industry increases, whereas the reverse holds for both country-specific and global uncertainty. In addition, we observe that the positive impact of both industry- and country-specific uncertainty gets stronger at higher levels of factor intensity. Third, we examine the role of uncertainty of technology diffusion in TFP convergence of manufacturing industries of frontier and non-frontier countries. For this purpose, we use annual data covering the time period from 1981-2008 for eighteen manufacturing industries of five emerging economies. We employ a superlative index number approach to compute the TFP level and growth in manufacturing industries of these countries. Our findings suggest significant evidence of TFP convergence in manufacturing industries of non-frontier and frontier countries. Moreover, technology diffusion not only has a positive impact on TFP growth of manufacturing industries of non-frontier countries but also facilitates the process of TFP convergence. More importantly, we report a significant negative impact of uncertainty of technology diffusion on TFP growth.
49

Bottiglieri, Michael John. "Uncertainty assessment for free-running model cases at the IIHR wave basin." Thesis, University of Iowa, 2016. https://ir.uiowa.edu/etd/2049.

Full text
Abstract:
Uncertainty analysis is performed on the motions and maneuvering characteristics of a 1/49-scale surface combatant model during free-running maneuverability testing. The model is designed with twin rudders and twin propellers rotating inwards. Calm-water and wave testing is completed with an initial ship speed corresponding to a Froude number of 0.20, while the wave cases have a wavelength-to-ship-length ratio of 1.0 and a wave-height-to-wavelength ratio of 0.02. These conditions were tested for course keeping, turning circle, and zig-zag maneuvers. The turning circles were completed to both port and starboard. The model is tracked by an overhead carriage with a mounted camera that records the ship's motions and converts them to six-degree-of-freedom motions. The combined tracking system is analyzed to find its systematic standard uncertainty. Uncertainty analysis was performed in accordance with the ASME performance test codes (2013) to find the systematic standard and random uncertainties of measurements. The random uncertainty is found from the standard deviation of repeated measurements, while the systematic standard uncertainty is found from the bias of the measurement system and the sensitivity coefficients derived from the data reduction equations. The data reduction equations are used to non-dimensionalize the measured values for comparison with CFD results as well as results from other model scales. Partial derivatives of the data reduction equations determine how the uncertainty propagates through the sensitivity coefficients. After the uncertainties are calculated, the results are compared with other facilities to evaluate the method used and gauge the repeatability of the measurements. Few other facilities have analyzed uncertainty during free-running tests beyond the random error based on repeated tests. The comparison with these facilities showed that the uncertainty process and measurement repeatability used by IIHR at the wave basin produce consistent results with limited uncertainties when the final maneuvering characteristics are observed. Large uncertainties occur for some measured variables over the full testing time when uncertainties are reported as a percentage of the harmonic amplitudes and the reported harmonic amplitudes are near zero with a small uncertainty.
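The sensitivity-coefficient propagation described in this abstract follows the standard ASME PTC pattern: the combined standard uncertainty of a data-reduction result is the root sum square of input uncertainties weighted by partial derivatives. The sketch below uses the Froude number Fr = U / sqrt(g L) as an illustrative data reduction equation; the speeds and uncertainties are made up.

```python
import math

U, L, g = 1.23, 3.0, 9.81     # carriage speed [m/s], model length [m], gravity
u_U, u_L = 0.01, 0.002        # standard uncertainties of the inputs

Fr = U / math.sqrt(g * L)
# sensitivity coefficients: partial derivatives of Fr w.r.t. each input
c_U = 1.0 / math.sqrt(g * L)
c_L = -0.5 * U / (math.sqrt(g) * L ** 1.5)
# root-sum-square combination of the weighted input uncertainties
u_Fr = math.sqrt((c_U * u_U) ** 2 + (c_L * u_L) ** 2)
print(f"Fr = {Fr:.4f} +/- {u_Fr:.4f}")
```

Equivalently, u_Fr/Fr = sqrt((u_U/U)^2 + (u_L/(2L))^2), a quick consistency check on the coefficients.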
50

Slinskey, Emily Anne. "Assessment of Observational Uncertainty in Extreme Precipitation Over the Continental United States." PDXScholar, 2018. https://pdxscholar.library.pdx.edu/open_access_etds/4450.

Full text
Abstract:
An extreme precipitation categorization scheme, developed to temporally and spatially visualize and track the multi-scale variability of extreme precipitation climatology, is introduced over the continental United States and used as the basis for an observational dataset intercomparison. The categorization scheme groups three-day precipitation totals exceeding 100 mm into five precipitation categories, or "P-Cats". To assess observational uncertainty across a range of precipitation measurement approaches, we compare in situ station data from the Global Historical Climatology Network-Daily (GHCN-D), satellite derived data from the Tropical Rainfall Measuring Mission (TRMM), gridded station data from the Parameter-elevation Regression on Independent Slopes Model (PRISM), global reanalysis from the Modern-Era Retrospective analysis for Research and Applications, version 2 (MERRA-2), and regional reanalysis from the North American Regional Reanalysis (NARR). While all datasets capture the principal spatial patterns of extreme precipitation climatology, results show considerable variability across the five-platform suite in P-Cat frequency, spatial extent, and magnitude. Higher resolution datasets, PRISM and TRMM, most closely resemble GHCN-D and capture a greater frequency of high-end totals relative to lower resolution products, NARR and MERRA-2. When all datasets are regridded to a common coarser grid, differences persist with datasets originally constructed at a high resolution maintaining the highest frequency and magnitude of P-Cats. Potential future applications of this scheme include tracking change in P-Cats over space and time, climate model evaluation, and assessment of model projected change.
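The categorization scheme summarized in this abstract can be sketched as running three-day totals binned above 100 mm. Only the 100 mm threshold comes from the abstract; the upper category edges below are illustrative placeholders, not the thesis values.

```python
import numpy as np

def three_day_totals(daily_mm):
    """Running three-day precipitation totals from a daily series."""
    return np.convolve(np.asarray(daily_mm, dtype=float),
                       np.ones(3), mode="valid")

def p_cat(totals_mm, edges=(100, 150, 200, 300, 400)):
    """Category 0 = below 100 mm; categories 1-5 = increasingly extreme
    'P-Cats' (upper edges here are assumed for illustration)."""
    return np.digitize(totals_mm, edges)

daily = [5, 60, 70, 30, 2, 0, 150, 180, 90]   # made-up daily totals [mm]
totals = three_day_totals(daily)
print(list(p_cat(totals)))
```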