
Doctoral dissertations on the topic "Uncertainty Analysis"


Consult the top 50 doctoral dissertations on the topic "Uncertainty Analysis".


1

Gomolka, Beth. "Service Offering Uncertainty Analysis Tool". Thesis, Linköping University, Department of Management and Engineering, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-19945.

Abstract:

Companies that seek to venture into providing services in addition to products have many business issues to consider, as there are many differences between providing service and product offerings. One factor that must be considered in service offerings is time: services are provided over an extended period, creating a distinctive relationship between the customer and the service provider, whereas with product offerings the point of sale is usually the end of the provider-customer relationship. The added time dimension brings issues of uncertainty, since service contracts cover a future period in which conditions are unknown.

This thesis looked at the types of uncertainty important to service offerings, especially in the manufacturing industry. These uncertainties affect how service offering contracts are constructed, since they influence the costs and profit of the service provider. Three types of uncertainty were examined: product malfunction uncertainty, service delivery uncertainty, and customer requirement uncertainty. Using these, mathematical models were constructed to represent the cost and revenue of different contract types, which were identified through a case study with a product manufacturer in Sweden. Probability distributions for modelling the three types of uncertainty were selected based on a literature review. The mathematical models were then used to build a software program, the uncertainty simulator tool, with which service contract designers can model how uncertainties affect the cost and revenue of their contracts.
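The contract models the abstract describes lend themselves to a Monte Carlo treatment. A minimal sketch of that idea follows; the fee, the Poisson malfunction rate and the lognormal repair costs are all invented for illustration, not values from the thesis:

```python
import numpy as np

# Simulate profit of a fixed-fee service contract under uncertain yearly
# malfunction counts and repair costs (all figures are assumptions).
rng = np.random.default_rng(0)
n_sim, years, annual_fee = 100_000, 5, 12_000.0

failures = rng.poisson(lam=3.0, size=(n_sim, years))   # malfunctions per year
repair = rng.lognormal(mean=8.0, sigma=0.5,
                       size=(n_sim, years))            # representative cost per failure
profit = years * annual_fee - (failures * repair).sum(axis=1)

print(f"expected profit = {profit.mean():,.0f}")
print(f"probability of loss = {(profit < 0).mean():.1%}")
```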

2

Zomlot, Loai M. M. "Handling uncertainty in intrusion analysis". Diss., Kansas State University, 2014. http://hdl.handle.net/2097/17603.

Abstract:
Doctor of Philosophy, Department of Computing and Information Sciences; major professor: Xinming Ou.
Intrusion analysis, i.e., the process of combing through Intrusion Detection System (IDS) alerts and audit logs to identify true successful and attempted attacks, remains a difficult problem in practical network security defense. The primary cause of this problem is the high false positive rate of the IDS sensors used to detect malicious activity, which is attributed to an inability to differentiate nearly certain attacks from those that are merely possible. This inefficacy has created high uncertainty in intrusion analysis and consequently causes an overwhelming amount of work for security analysts. As a workaround, practitioners typically resort to specific IDS rule sets that precisely capture particular attacks. However, this fails to discern other forms of the targeted attack, because an attack's polymorphism reflects human intelligence. Alternatively, adding generic rules, so that any activity remotely indicative of an attack triggers an alert, requires the security analyst to discern true alerts from a multitude of false ones, perpetuating the original problem. This trade-off is a dilemma that has puzzled the cyber-security community for years. A solution to this dilemma is to reduce uncertainty in intrusion analysis by making nearly certain alerts prominently discernible. I therefore propose alert prioritization, which can be attained by integrating multiple methods. I use IDS alert correlation, building attack scenarios in a ground-up manner. In addition, I use Dempster-Shafer Theory (DST), a non-traditional theory for quantifying uncertainty, and I propose a new method for fusing non-independent alerts in an attack scenario. Finally, I propose the use of semi-supervised learning to capture an organization's contextual knowledge, further improving prioritization. These approaches were evaluated on multiple datasets, and the results strongly indicate that the resulting ranking prioritizes IDS alerts well according to their likelihood of indicating true attacks.
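The Dempster-Shafer machinery mentioned above rests on combining evidence expressed as mass functions. Below is a minimal sketch of the textbook Dempster rule of combination over a two-hypothesis frame; note that the thesis itself proposes a modified fusion method for non-independent alerts, so this is background illustration only, with invented masses:

```python
# Dempster's rule of combination for two mass functions whose focal
# elements are frozensets over the frame {attack, benign}.
def dempster_combine(m1, m2):
    combined, conflict = {}, 0.0
    for a, w1 in m1.items():
        for b, w2 in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + w1 * w2
            else:
                conflict += w1 * w2          # mass assigned to contradictions
    # normalize the surviving mass by the non-conflicting total
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

ATTACK, BENIGN = frozenset({"attack"}), frozenset({"benign"})
THETA = ATTACK | BENIGN                      # the full frame (ignorance)
alert1 = {ATTACK: 0.6, THETA: 0.4}           # evidence from one alert
alert2 = {ATTACK: 0.5, BENIGN: 0.2, THETA: 0.3}
print(dempster_combine(alert1, alert2))      # belief shifts toward ATTACK
```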
3

Urganci, Ilksen. "Positional Uncertainty Analysis Using Data Uncertainty Engine: A Case Study on Agricultural Land Parcels". Master's thesis, METU, 2009. http://etd.lib.metu.edu.tr/upload/3/12611409/index.pdf.

Abstract:
Most spatial data extraction and updating procedures require the digitization of geographical entities from satellite imagery. During digitization, errors are introduced by factors such as instrument deficiencies or user error. In this study, the positional uncertainty of geographical objects digitized from high-resolution Quickbird satellite imagery is assessed using the Data Uncertainty Engine (DUE), a software tool for assessing uncertainties in environmental data and generating realisations of uncertain data for use in uncertainty propagation analyses. A case study area in Kocaeli, Turkey, consisting mostly of agricultural land parcels, is selected in order to evaluate positional uncertainty and obtain uncertainty boundaries for manually digitized fields. A geostatistical evaluation of the discrepancy between reference data and digitized polygons is undertaken to analyse the auto- and cross-correlation structures of the errors. This is used to estimate the error model parameters that define an uncertainty model within DUE. Error model parameters obtained from training data are then used to generate simulations for test data. Realisations derived via Monte Carlo simulation with DUE are evaluated to generate uncertainty boundaries for each object, guiding the user in further analyses with pre-defined information on the accuracy of the spatial entities. Area uncertainties arising from the positions of spatial entities are also assessed. Across all correlation structures and object models, the weighted average positional error in this study is between 2.66 and 2.91 metres. The deformable object model, which models cross-correlation, produced the smallest uncertainty bandwidth.
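For intuition about how positional error of this magnitude propagates into parcel area uncertainty, the sketch below perturbs a rectangular parcel's vertices with independent Gaussian noise. This is a simplification: the thesis models auto- and cross-correlated errors through DUE, and the parcel geometry here is invented.

```python
import numpy as np

def polygon_area(xy):
    # shoelace formula for a simple polygon given as an (n, 2) vertex array
    x, y = xy[:, 0], xy[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

rng = np.random.default_rng(0)
parcel = np.array([[0.0, 0.0], [100.0, 0.0], [100.0, 80.0], [0.0, 80.0]])
sigma = 2.8          # positional error [m], cf. the 2.66-2.91 m range above

areas = [polygon_area(parcel + rng.normal(0.0, sigma, parcel.shape))
         for _ in range(10_000)]
print(f"area = {np.mean(areas):.0f} +/- {np.std(areas):.0f} m^2")
```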
4

Filipsson, Monika. "Uncertainty, variability and environmental risk analysis". Doctoral thesis, Linnéuniversitetet, Institutionen för naturvetenskap, NV, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-11193.

Abstract:
The negative effects of hazardous substances and possible measures that can be taken are evaluated in the environmental risk analysis process, consisting of risk assessment, risk communication and risk management. Uncertainty due to lack of knowledge and natural variability are always present in this process. The aim of this thesis is to evaluate some tools as well as discuss the management of uncertainty and variability, as it is necessary to treat them both in a reliable and transparent way to gain regulatory acceptance in decision making. The catalytic effects of various metals on the formation of chlorinated aromatic compounds during the heating of fly ash were investigated (paper I). Copper showed a positive catalytic effect, while cobalt, chromium and vanadium showed a catalytic effect for degradation. Knowledge of the catalytic effects may facilitate the choice and design of combustion processes to decrease emissions, but it also provides valuable information to identify and characterize the hazard. Exposure factors of importance in risk assessment (physiological parameters, time use factors and food consumption) were collected and evaluated (paper II). Interindividual variability was characterized by mean, standard deviation, skewness, kurtosis and multiple percentiles, while uncertainty in these parameters was estimated with confidence intervals. How these statistical parameters can be applied was shown in two exposure assessments (papers III and IV). Probability bounds analysis was used as a probabilistic approach, which enables separate propagation of uncertainty and variability even in cases where the availability of data is limited. In paper III it was determined that the exposure cannot be expected to cause any negative health effects for recreational users of a public bathing place. Paper IV concluded that the uncertainty interval in the estimated exposure increased when accounting for possible changes in climate-sensitive model variables. Risk managers often need to rely on precaution and an increased uncertainty may therefore have implications for risk management decisions. Paper V focuses on risk management and a questionnaire was sent to employees at all Swedish County Administrative Boards working with contaminated land. It was concluded that the gender, age and work experience of the employees, as well as the funding source of the risk assessment, all have an impact on the reviewing of risk assessments. Gender was the most significant factor, and it also affected the perception of knowledge.
5

Söderman, Filip. "Uncertainty Analysis of the Aerodynamic Coefficients". Thesis, KTH, Flygdynamik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-223317.

Abstract:
This thesis presents an error propagation analysis used to estimate the uncertainty of the aerodynamic coefficients. Two propagation methods are used: a Taylor series method and a Monte Carlo method. The Taylor series method uses the partial derivatives with respect to each input variable, whereas the Monte Carlo method draws random, repeated samples from the probability density function of each variable. Comparing the results obtained with the two methods allows them to be validated. Coverage intervals with a coverage probability of 95% are calculated, along with each input variable's percentage contribution to the expanded uncertainty. The results showed that the uncertainty of the coefficients varied between 10% and 20%, with negligible differences between the methods. More accurate measurements of the dynamic pressure and of the position of the center of gravity are needed to decrease the uncertainty.
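As a minimal illustration of the two propagation methods compared in the thesis, the sketch below pushes measurement uncertainty through a lift coefficient C_L = L/(qS). The inputs, their uncertainties and the single-output setting are invented; the actual analysis covers the full set of aerodynamic coefficients.

```python
import numpy as np

L_m, u_L = 1200.0, 15.0    # lift [N] and its standard uncertainty (assumed)
q_m, u_q = 500.0, 10.0     # dynamic pressure [Pa] and uncertainty (assumed)
S = 0.75                   # reference area [m^2], treated as exact

# Taylor series method: combine first-order sensitivities
cl = L_m / (q_m * S)
dC_dL = 1.0 / (q_m * S)
dC_dq = -L_m / (q_m**2 * S)
u_taylor = np.hypot(dC_dL * u_L, dC_dq * u_q)

# Monte Carlo method: sample the inputs and evaluate directly
rng = np.random.default_rng(0)
n = 100_000
samples = rng.normal(L_m, u_L, n) / (rng.normal(q_m, u_q, n) * S)
u_mc = samples.std()
lo, hi = np.percentile(samples, [2.5, 97.5])   # 95% coverage interval

print(f"C_L = {cl:.4f}, u(Taylor) = {u_taylor:.4f}, u(MC) = {u_mc:.4f}")
```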
6

Johnson, David G. "Representations of uncertainty in risk analysis". Thesis, Loughborough University, 1998. https://dspace.lboro.ac.uk/2134/31941.

Abstract:
Uncertainty in situations involving risk is frequently modelled by assuming a plausible form of probability distribution for the uncertain quantities involved, and estimating the relevant parameters of that distribution based on the knowledge and judgement of informed experts or decision makers. The distributions assumed are usually uni-modal (and often bell-shaped) around some most likely value, with the Normal, Beta, Gamma and Triangular distributions being popular choices.
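A small sketch of the elicitation practice described above: an expert's minimum, most likely and maximum values define a triangular distribution, which is then sampled in a risk simulation (the figures are invented):

```python
import numpy as np

# Encode an expert's judgement about an uncertain cost as a triangular
# distribution and sample it (all figures are assumptions).
rng = np.random.default_rng(0)
cost = rng.triangular(left=0.8e6, mode=1.0e6, right=1.5e6, size=50_000)
print(f"mean = {cost.mean():.3e}, "
      f"95th percentile = {np.percentile(cost, 95):.3e}")
```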
7

Walker, A. M. "Uncertainty Analysis of Zone Fire Models". University of Canterbury. Civil Engineering, 1997. http://hdl.handle.net/10092/8298.

Abstract:
Zone fire models are used by practising engineers every day in New Zealand, yet the models have limitations, and their uncertainty has not been well documented. Comparisons with experimental data are simply comparisons; they neither analyse the uncertainty of the models nor validate them. The object of this research has been to discuss the uncertainties in components of zone models and to show how uncertainty in user-supplied data affects the results obtained. The zone fire model selected for analysis is the second version of CFAST. A numerical uncertainty analysis is performed using sensitivity factors as its basis; no assumptions are made about the independence of the input variables. A large amount of information is appended, with a discussion of pertinent results. Several input variables were identified as producing discernible uncertainty in the output: the heat release rate, radiative fraction, ambient temperature, ambient pressure, and ceiling height.
8

Gallagher, Raymond. "Uncertainty modelling in quantitative risk analysis". Thesis, University of Liverpool, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.367676.

9

Cui, W. C. "Uncertainty analysis in structural safety assessment". Thesis, University of Bristol, 1989. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.303742.

10

Ghate, Devendra. "Inexpensive uncertainty analysis for CFD applications". Thesis, University of Oxford, 2014. http://ora.ox.ac.uk/objects/uuid:6be44a1d-6e2f-4bf9-b1e5-1468f92e21e3.

Abstract:
The work presented in this thesis aims to provide tools for the design process that make maximum use of the increasing availability of accurate engine blade measurement data in high-fidelity fluid mechanics simulations, at a reasonable computational expense. A new method of uncertainty propagation for geometric error is proposed for fluid mechanics codes using adjoint error correction. The Inexpensive Monte Carlo (IMC) method targets small uncertainties and provides the complete probability distribution of the objective function at a significantly reduced computational cost. A brief survey of existing methods is followed by the formulation of IMC, and an example algebraic model is used to demonstrate the method. The IMC method is extended to fluid mechanics applications using Principal Component Analysis (PCA) for reduced-order modelling. Implementation details are discussed using an example airfoil code. Finally, the IMC method has been implemented and validated for HYDRA, an industrial fluid mechanics code. A consistent methodology has been developed for the automatic generation of the linear and adjoint codes by selective use of automatic differentiation (AD); it has the advantage of keeping the linear and adjoint codes in sync with changes in the underlying nonlinear fluid mechanics solver. The use of various consistency checks has been demonstrated to ease the development and maintenance of the linear and adjoint codes. AD has also been extended to the calculation of the complete Hessian using a forward-on-forward approach, and the complete mathematical formulation for Hessian calculation from the linear and adjoint solutions has been outlined for fluid mechanics solvers, with an efficient implementation demonstrated using the airfoil code. Finally, a new application of Independent Component Analysis (ICA) is proposed for identifying sources of manufacturing uncertainty. The mathematical formulation is outlined, followed by an example application of ICA to artificially generated uncertainty for the NACA0012 airfoil.
11

Jones, Richard. "Uncertainty analysis in the Model Web". Thesis, Aston University, 2014. http://publications.aston.ac.uk/21397/.

Abstract:
This thesis provides a set of tools for managing uncertainty in Web-based models and workflows. To support the use of these tools, it first provides a framework for exposing models through Web services. An introduction to uncertainty management, Web service interfaces, and workflow standards and technologies is given, with a particular focus on the geospatial domain. An existing specification for exposing geospatial models and processes, the Web Processing Service (WPS), is critically reviewed, and a processing service framework is presented as a solution to usability issues with the WPS standard. The framework implements support for the Simple Object Access Protocol (SOAP), the Web Service Description Language (WSDL) and JavaScript Object Notation (JSON), allowing models to be consumed by a variety of tools and software. Strategies for communicating with models from Web service interfaces are discussed, demonstrating the difficulty of exposing existing models on the Web. The thesis then reviews existing mechanisms for uncertainty management, with an emphasis on emulator methods for building efficient statistical surrogate models. A tool is developed to solve accessibility issues with such methods by providing a Web-based user interface and backend that ease the process of building and integrating emulators. These tools, together with the processing service framework, are applied to a real case study as part of the UncertWeb project. The usability of the framework is demonstrated by the implementation of a Web-based workflow for predicting future crop yields in the UK, which also exercises the tools for emulator building and integration. Future directions for the development of the tools are discussed.
12

Fu, Shuai. "Inverse problems occurring in uncertainty analysis". Thesis, Paris 11, 2012. http://www.theses.fr/2012PA112208/document.

Abstract:
This thesis provides a probabilistic solution to inverse problems through Bayesian techniques. The inverse problem considered here is to estimate the distribution of a non-observed random variable X from noisy observed data Y explained by a time-consuming physical model H. In general, such inverse problems are encountered when treating uncertainty in industrial applications. Bayesian inference is favored as it accounts for prior expert knowledge on X in a small-sample setting. A Metropolis-Hastings-within-Gibbs algorithm is proposed to compute the posterior distribution of the parameters of X through a data augmentation process. Since this requires a high number of calls to the expensive function H, the model is replaced by a kriging meta-model. This approach involves several errors of different natures, and we focus on measuring and reducing their possible impact. A DAC criterion has been proposed to assess the relevance of the numerical design of experiments and of the prior assumption, taking the observed data into account. Another contribution is the construction of adaptive designs of experiments suited to our particular purpose in the Bayesian framework. The main methodology presented in this thesis has been applied to a real hydraulic engineering case study.
13

Di Francesco, Tommaso <1994>. "Italian uncertainty - A Twitter-based analysis". Master's Degree Thesis, Università Ca' Foscari Venezia, 2020. http://hdl.handle.net/10579/16760.

Abstract:
Evidence indicates that uncertainty has a significant effect on macroeconomic and financial variables. In this work we review existing studies of the relationship between uncertainty and economic and financial variables such as bonds and credit default swaps. To investigate these relations we estimate a Structural Topic Model based on textual data about uncertainty from an online social network: specifically, Italian tweets collected in 2018-2019 that explicitly mention uncertainty. The model allows us to categorize tweets about uncertainty by topic and hence to construct domain- and topic-specific uncertainty indexes. To validate the indexes and analyze their relation to market phenomena, we estimate a SVEC model, highlighting the relations between social and market uncertainty phenomena in Italy.
14

Guo, Jia. "Uncertainty analysis and sensitivity analysis for multidisciplinary systems design". Diss., Rolla, Mo. : Missouri University of Science and Technology, 2008. http://scholarsmine.mst.edu/thesis/pdf/Guo_09007dcc8066e905.pdf.

Abstract:
Thesis (Ph. D.)--Missouri University of Science and Technology, 2008.
Vita. The entire thesis text is included in the file. Title from title screen of thesis/dissertation PDF file (viewed May 28, 2009). Includes bibliographical references.
15

Doty, Austin. "Nonlinear Uncertainty Quantification, Sensitivity Analysis, and Uncertainty Propagation of a Dynamic Electrical Circuit". University of Dayton / OhioLINK, 2012. http://rave.ohiolink.edu/etdc/view?acc_num=dayton1355456642.

16

Di, Gessa Giorgio. "Simple strategies for variance uncertainty in meta-analysis". Connect to e-thesis, 2007. http://theses.gla.ac.uk/128/.

Abstract:
Thesis (M.Sc.(R)) - University of Glasgow, 2007.
M.Sc.(R) thesis submitted to the Department of Statistics, Faculty of Information and Mathematical Sciences, University of Glasgow, 2007. Includes bibliographical references. Print version also available.
17

Gajev, Ivan. "Sensitivity and Uncertainty Analysis of BWR Stability". Licentiate thesis, KTH, Kärnkraftsäkerhet, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-26387.

Abstract:
Best-estimate codes are used for licensing, but with conservative assumptions; it is claimed that the uncertainties are covered by the conservatism of the calculation. As nuclear power plants apply for power up-rates and life extension, evaluating the uncertainties could help improve performance while staying within the safety margins. Unstable behavior of Boiling Water Reactors (BWRs), which is known to occur during operation at certain power and flow conditions, can cause SCRAM and decrease the economic performance of the plant. Performing an uncertainty analysis for BWR stability gives a better understanding of the phenomenon and helps to verify and validate (V&V) the codes used to predict plant behavior. This thesis reports an uncertainty study of the impact of thermal-hydraulic, neutronic, and numerical parameters on the prediction of BWR stability within the framework of the OECD Ringhals-1 stability benchmark, using the time-domain code TRACE/PARCS. The work is divided into two parts: a sensitivity study on numerical discretization parameters (nodalization, time step, etc.) and an uncertainty analysis. In the sensitivity study, all possible components were refined until a space-time-converged solution was obtained, i.e., one that further refinement does not change. Compared with the initial discretization, the space-time-converged model gave a much better solution for both stability measures (decay ratio and frequency). Important neutronic and thermal-hydraulic parameters were then identified, and the uncertainty calculation was performed using the Propagation of Input Errors (PIE) methodology. This methodology, also known as the GRS method, was chosen because it has been tested and extensively verified by industry, and because it allows the most influential parameters to be identified using the Spearman rank correlation.
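The sample sizes used in GRS/PIE-style analyses come from Wilks' order-statistics result; a small sketch, assuming the standard one-sided, first-order case in which the largest of N sampled outputs bounds the p-quantile with confidence beta whenever 1 - p^N >= beta:

```python
import math

def wilks_runs(p=0.95, beta=0.95):
    # smallest N with 1 - p**N >= beta (one-sided, first-order Wilks formula)
    return math.ceil(math.log(1.0 - beta) / math.log(p))

print(wilks_runs())   # 59 code runs for the classic 95%/95% statement
```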
18

Ukhov, Ivan. "System-Level Analysis and Design under Uncertainty". Doctoral thesis, Linköpings universitet, Programvara och system, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-140758.

Abstract:
One major problem for the designer of electronic systems is the presence of uncertainty, which is due to phenomena such as process and workload variation. Very often, uncertainty is inherent and inevitable. If ignored, it can lead to degradation of the quality of service in the best case and to severe faults or burnt silicon in the worst case. Thus, it is crucial to analyze uncertainty and to mitigate its damaging consequences by designing electronic systems in such a way that uncertainty is effectively and efficiently taken into account. We begin by considering techniques for deterministic system-level analysis and design of certain aspects of electronic systems. These techniques do not take uncertainty into account, but they serve as a solid foundation for those that do. Our attention revolves primarily around power and temperature, as they are of central importance for attaining robustness and energy efficiency. We develop a novel approach to dynamic steady-state temperature analysis of electronic systems and apply it in the context of reliability optimization. We then proceed to develop techniques that address uncertainty. The first technique is designed to quantify the variability in process parameters, which is induced by process variation, across silicon wafers based on indirect and potentially incomplete and noisy measurements. The second technique is designed to study diverse system-level characteristics with respect to the variability originating from process variation. In particular, it allows for analyzing transient temperature profiles as well as dynamic steady-state temperature profiles of electronic systems. This is illustrated by considering a problem of design-space exploration with probabilistic constraints related to reliability. The third technique that we develop is designed to efficiently tackle the case of sources of uncertainty that are less regular than process variation, such as workload variation. This technique is exemplified by analyzing the effect that workload units with uncertain processing times have on the timing-, power-, and temperature-related characteristics of the system under consideration. We also address the issue of runtime management of electronic systems that are subject to uncertainty. In this context, we perform an early investigation into the utility of advanced prediction techniques for the purpose of fine-grained long-range forecasting of resource usage in large computer systems. All the proposed techniques are assessed by extensive experimental evaluations, which demonstrate the superior performance of our approaches to analysis and design of electronic systems compared to existing techniques.
19

Ibrahim, Hanaa Abdel Hamid. "Analysis of Sudan's agricultural trade under uncertainty /". Aachen : Shaker, 2004. http://www.gbv.de/dms/zbw/389983667.pdf.

20

Oakley, Jeremy. "Bayesian uncertainty analysis for complex computer codes". Thesis, University of Sheffield, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.322915.

21

Levy, Natalia Cordeiro. "Investment Analysis under Uncertainty: An Analytical Approach". Pontifícia Universidade Católica do Rio de Janeiro, 2009. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=14911@1.

Abstract:
The valuation of investment opportunities is undoubtedly a topic of great interest, as it is the manner by which firms guide their investment decisions and assess whether a given project creates value. The valuation theory of productive investments starts from the Net Present Value (NPV) rule and branches out through the literature, always pursuing the goal of incorporating uncertainty into the models. The current stage of this path is real options valuation, and everything that precedes it is now called classical theory. Nevertheless, many of the problems in the approaches found in the real options literature are old ones. Because of the analogy with financial options, the proposed methodology for pricing real options originated in financial option pricing models. But this methodological extension is itself problematic, as so-called real assets and financial assets differ in important ways: private risk, completeness of markets, liquidity, reversibility, and a dramatic difference in the levels of information asymmetry. These differences undermine the significance of the valuation's final results, as they violate some of the assumptions behind the pricing theory of financial options; moreover, only the market component of risk is considered, leaving private risk unattended. Other approaches for pricing real options have emerged to tackle the problem of market incompleteness, but they run into other issues already discussed in classical theory, such as the difficulty of choosing the discount rate and the subjectivity of estimating a certainty-equivalent cash flow. Despite having created a new standard in understanding what the value of an investment project represents, the real options literature still disagrees about valuation methods. The aim of this dissertation is to discuss the practical difficulties in valuing the options present in a real asset, which arise both from the inadequacy of methods designed specifically for financial derivatives and from the subjectivity incurred when alternative methods are used.
22

Taylor, Joshua Adam. "Uncertainty analysis of power systems using collocation". Thesis, Massachusetts Institute of Technology, 2008. http://hdl.handle.net/1721.1/45891.

Abstract:
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 2008.
Includes bibliographical references (p. 93-97).
The next-generation all-electric ship represents a class of design and control problems in which the system is too large to approach analytically, and even with many conventional computational techniques. Additionally, numerous environmental interactions and inaccurate system model information make uncertainty a necessary consideration. Characterizing systems under uncertainty is essentially a problem of representing the system as a function over a random space. This can be accomplished by sampling the function, where in the case of the electric ship a "sample" is a simulation with uncertain parameters set according to the location of the sample. For systems on the scale of the electric ship, simulation is expensive, so we seek an accurate representation of the system from a minimal number of simulations. To this end, collocation is employed to compute statistical moments, from which sensitivity can be inferred, and to construct surrogate models with which interpolation can be used to propagate PDFs. These techniques are applied to three large-scale electric ship models. The conventional formulation for the sparse grid, a collocation algorithm, is modified to yield improved performance. Theoretical bounds and computational examples are given to support the modification. A dimension-adaptive collocation algorithm is implemented in an unscented Kalman filter, and improvement over extended Kalman and unscented filters is seen in two examples.
23

Mastin, Dana Andrew. "Analysis of approximation and uncertainty in optimization". Thesis, Massachusetts Institute of Technology, 2015. http://hdl.handle.net/1721.1/97761.

Abstract:
Thesis: Ph. D., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2015.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 249-260).
We study a series of topics involving approximation algorithms and the presence of uncertain data in optimization. On the first theme of approximation, we derive performance bounds for rollout algorithms. Interpreted as an approximate dynamic programming algorithm, a rollout algorithm estimates the value-to-go at each decision stage by simulating future events while following a heuristic policy, referred to as the base policy. We provide a probabilistic analysis of knapsack problems, proving that rollout algorithms perform significantly better than their base policies. Next, we study the average performance of greedy algorithms for online matching on random graphs. In online matching problems, vertices arrive sequentially and reveal their neighboring edges. Vertices may be matched upon arrival and matches are irrevocable. We determine asymptotic matching sizes obtained by a variety of greedy algorithms on random graphs, both for bipartite and non-bipartite graphs. Moving to the second theme of uncertainty, we analyze losses resulting from uncertain transition probabilities in Markov decision processes. We assume that policies are computed using exact dynamic programming with estimated transition probabilities, but the system evolves according to different, true transition probabilities. Given a bound on the total variation error of estimated transition probability distributions, we derive a general tight upper bound on the loss of expected total reward. Finally, we consider a randomized model for minmax regret in combinatorial optimization under cost uncertainty. This problem can be viewed as a zero-sum game played between an optimizing player and an adversary, where the optimizing player selects a solution and the adversary selects costs with the intention of maximizing the regret of the player. We analyze a model where the optimizing player selects a probability distribution over solutions and the adversary selects costs with knowledge of the player's distribution. We show that under this randomized model, the minmax regret version of any polynomially solvable combinatorial problem is polynomially solvable, both for interval and discrete scenario representations of uncertainty.
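As a concrete illustration of the rollout idea analyzed in the thesis, the toy below applies it to a 0/1 knapsack instance with a greedy value/weight base policy: at each stage the take/skip decision is made by simulating the base policy on the remaining items. The instance is invented, not the thesis implementation.

```python
def greedy_value(items, capacity):
    # base policy: scan items in decreasing value/weight order, take if it fits
    total = 0
    for v, w in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if w <= capacity:
            total, capacity = total + v, capacity - w
    return total

def rollout_value(items, capacity):
    # one-step lookahead: estimate the value-to-go of each action by running
    # the base policy on the remaining items, then commit to the better action
    total = 0
    for i, (v, w) in enumerate(items):
        rest = items[i + 1:]
        skip = greedy_value(rest, capacity)
        take = v + greedy_value(rest, capacity - w) if w <= capacity else -1
        if take >= skip:
            total, capacity = total + v, capacity - w
    return total

items = [(60, 10), (100, 20), (120, 30)]        # (value, weight) pairs
print(greedy_value(items, 50), rollout_value(items, 50))   # 160 vs 220
```

With a sequentially consistent base policy such as this greedy rule, the rollout policy is guaranteed to do no worse than the base policy; how much better it does on average is what the thesis quantifies probabilistically.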
24

Smaling, Rudolf M. "System architecture analysis and selection under uncertainty". Thesis, Massachusetts Institute of Technology, 2005. http://hdl.handle.net/1721.1/28943.

Abstract:
Thesis (Ph. D.)--Massachusetts Institute of Technology, Engineering Systems Division, 2005.
Includes bibliographical references (leaves 183-191).
A system architecture analysis and selection methodology is presented that builds on the Multidisciplinary Analysis and Optimization (MAO) framework. It addresses a need and opportunity to extend MAO techniques to analyze not only the technical domain, but also the external influences that will act on the system once it is in operation. The nature and extent of these external influences is uncertain, and increasingly so for systems with long development timelines; methods for addressing such uncertainty are central to the thesis. The research presented in this document has culminated in a coherent system architecture analysis and selection process consisting of several steps: 1. The introduction of the concept of Fuzzy Pareto Optimality: under uncertainty, one must necessarily consider more than just Pareto optimal solutions to avoid the unintentional exclusion of viable and possibly even desirable designs. 2. The introduction of a proximity-based filtering technique that explicitly links the design and solution spaces; the intent is to preserve diverse designs, even if their resulting performance is similar. 3. The introduction of the concept of Technology Invasiveness through the use of a component Delta Design Structure Matrix (ΔDSM): the component DSM is used to evaluate the changes in the DSM due to the technology insertion, and based on the quantity and type of these changes a Technology Invasiveness metric is computed. 4. Through the use of utility curves, the technical domain analysis is linked to an analysis of external influence factors. The shape of these curves depends wholly on the external influences that may act on the system once it is commercialized or otherwise put into use. The utility curves, in combination with the (technical) performance distributions, are then used to compute risk and opportunity for each system architecture. System architecture selection follows from analysis in the technical domain linked to an analysis of external influences and their impact on each architecture's potential for success. All of the concepts and the integrated process are developed and assessed in the context of a case study of a Hydrogen Enhanced Combustion Engine being considered for insertion into the vehicle fleet.
25

Kavathia, Kepin Bipin. "Uncertainty Analysis of an Engine Test Cell". The Ohio State University, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=osu1532030837916798.

26

Dai, Qiang. "Radar rainfall uncertainty analysis for hydrological applications". Thesis, University of Bristol, 2014. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.681555.

Abstract:
Weather radar is widely used in hydrologic forecasting and decision making; nevertheless, there is increasing attention on its uncertainties, which propagate through hydrologic models. These uncertainties are caused not only by the radar itself in measuring and estimating rainfall (attenuation, extrapolation of the rainfall measured aloft to the ground, sampling methods, partial beam blocking, etc.) but also by complicated synoptic regimes (air motion, vertical variability of temperature, and conversion to and from different hydrometeors). This thesis aims to improve the quality of radar rainfall estimates and to describe their uncertainty. The Brue catchment (135 sq. km) in Southwest England, covering 28 radar pixels and 49 rain gauges, and a hilly area to the east and south of Manchester, of around 5000 sq. km with 50 rain gauges, are chosen as the experimental domains. The studies are composed of three main parts. Firstly, I propose a fully formulated uncertainty model that statistically quantifies the characteristics of the radar rainfall errors and their spatial and temporal structure, a novel method of its kind in the radar data uncertainty field. The uncertainty model is established from the distribution of gauge rainfall conditioned on radar rainfall (GR|RR), and its spatial and temporal dependences are simulated using copula functions. With this uncertainty model, a Multivariate Distributed Ensemble Generator (MDEG) driven by the copula and an autoregressive filter is designed. The products of MDEG are time series of ensemble rainfall fields, each representing a probable true rainfall. As wind is a typical weather factor influencing radar measurement, the thesis also introduces the wind field into the uncertainty model and designs radar rainfall uncertainty models for different wind conditions. In addition, I propose an Empirically-based Ensemble Rainfall Forecasts (ERFEM) model to measure and quantify the combined effect of all the error sources in radar rainfall forecasts; the essence of this uncertainty model is an empirical relation between the radar rainfall forecasts and the corresponding "ground truth" represented by the rainfall field from rain gauges. In modelling the radar rainfall uncertainty, I find that wind has a large impact on radar-gauge comparison: because of wind effects, the raindrops observed by the radar do not always fall vertically to the ground, and not all raindrops arriving at the ground are caught by the rain gauge. This thesis therefore proposes a practical approach to simulate the movement of raindrops in the air and to adjust for these wind-induced errors in the radar bias correction procedure. The scheme is based on numerical simulation of raindrop movements in the three-dimensional wind field: the Weather Research and Forecasting (WRF) model is used to downscale the ERA-40 reanalysis data to obtain a wind field with high spatial and temporal resolution, and a normalized gamma model is adopted to estimate the raindrop size distribution (DSD). This work is the first study to tackle wind effects on both radar and rain gauges, a step in processing radar observation data that should be undertaken after the aforementioned physical corrections and before bias correction.
Finally, this thesis analyzes how the radar rainfall uncertainty propagates through a hydrological model (the Xinanjiang model) and investigates which features of the uncertainty model have significant impacts on flow simulation. The ensemble rainfall values generated by MDEG are input into the Xinanjiang model to produce uncertainty bands of the ensemble flows, and five indicators are used to describe the characteristics of these bands. It is concluded that a Gaussian marginal distribution with spatio-temporal dependence modelled by a Gaussian copula is the preferred configuration of the MDEG model for hydrological uncertainty analysis in the Brue catchment. Keywords: Multivariate Distributed Ensemble Generator (MDEG); copula; flow simulation; radar rainfall estimates; hydrological model uncertainty; wind-induced error; drop size distribution; WRF; wind drift.
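A minimal sketch of the copula mechanism at the core of MDEG, with an assumed exponential spatial correlation between pixels and an assumed lognormal marginal for the GR|RR multiplicative error (illustrative choices, not the thesis configuration):

```python
import numpy as np
from scipy.stats import lognorm, norm

rng = np.random.default_rng(0)
n_pixels, n_members = 28, 100   # cf. the 28 radar pixels of the Brue catchment

# assumed exponential correlation decaying with inter-pixel lag
lag = np.abs(np.subtract.outer(np.arange(n_pixels), np.arange(n_pixels)))
corr = np.exp(-lag / 5.0)

# Gaussian copula: correlated normals mapped to uniform marginals
z = rng.multivariate_normal(np.zeros(n_pixels), corr, size=n_members)
u = norm.cdf(z)

# apply an assumed lognormal multiplicative GR|RR error to a radar field
radar = np.full(n_pixels, 2.0)            # radar rainfall estimate [mm/h]
ensemble = radar * lognorm(s=0.3).ppf(u)  # each row is one probable true field
print(ensemble.shape)                     # (100, 28)
```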
27

Zhang, Yanyang. "Second-order effects on uncertainty analysis calculations". Master's thesis, Mississippi State : Mississippi State University, 2002. http://library.msstate.edu/etd/show.asp?etd=etd-10292002-122359.

28

El-Shanawany, Ashraf Ben Mamdouh. "Quantification of uncertainty in probabilistic safety analysis". Thesis, Imperial College London, 2016. http://hdl.handle.net/10044/1/48104.

Abstract:
This thesis develops methods for quantification and interpretation of uncertainty in probabilistic safety analysis, focussing on fault trees. The output of a fault tree analysis is, usually, the probability of occurrence of an undesirable event (top event) calculated using the failure probabilities of identified basic events. The standard method for evaluating the uncertainty distribution is by Monte Carlo simulation, but this is a computationally intensive approach to uncertainty estimation and does not, readily, reveal the dominant reasons for the uncertainty. A closed form approximation for the fault tree top event uncertainty distribution, for models using only lognormal distributions for model inputs, is developed in this thesis. Its output is compared with the output from two sampling based approximation methods; standard Monte Carlo analysis, and Wilks’ method, which is based on order statistics using small sample sizes. Wilks’ method can be used to provide an upper bound for the percentiles of top event distribution, and is computationally cheap. The combination of the lognormal approximation and Wilks’ Method can be used to give, respectively, the overall shape and high confidence on particular percentiles of interest. This is an attractive, practical option for evaluation of uncertainty in fault trees and, more generally, uncertainty in certain multilinear models. A new practical method of ranking uncertainty contributors in lognormal models is developed which can be evaluated in closed form, based on cutset uncertainty. The method is demonstrated via examples, including a simple fault tree model and a model which is the size of a commercial PSA model for a nuclear power plant. Finally, quantification of “hidden uncertainties” is considered; hidden uncertainties are those which are not typically considered in PSA models, but may contribute considerable uncertainty to the overall results if included. A specific example of the inclusion of a missing uncertainty is explained in detail, and the effects on PSA quantification are considered. It is demonstrated that the effect on the PSA results can be significant, potentially permuting the order of the most important cutsets, which is of practical concern for the interpretation of PSA models. Finally, suggestions are made for the identification and inclusion of further hidden uncertainties.
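For intuition about the closed-form lognormal result, consider the simplest exact case: a top event given by the product of independent lognormal basic-event probabilities (a single AND gate). The product is again lognormal, so percentiles follow in closed form. The parameters below are invented, and OR gates make the top event only approximately lognormal, which is where the thesis's approximation applies.

```python
import numpy as np

mus = np.array([-6.0, -4.5, -5.2])    # ln-medians of basic-event probabilities
sigmas = np.array([0.8, 0.6, 1.0])    # ln-standard deviations

mu_top = mus.sum()                    # lognormal parameters of the product
sigma_top = np.sqrt((sigmas ** 2).sum())

median = np.exp(mu_top)
p95 = np.exp(mu_top + 1.645 * sigma_top)   # 95th percentile, no sampling needed
print(f"top event median = {median:.2e}, 95th percentile = {p95:.2e}")
```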
29

Wang, Cheng 1971. "Parametric uncertainty analysis for complex engineering systems". Thesis, Massachusetts Institute of Technology, 1999. http://hdl.handle.net/1721.1/9507.

Abstract:
Thesis (Ph.D.)--Massachusetts Institute of Technology, Dept. of Chemical Engineering, 1999.
Includes bibliographical references (p. 259-275).
With the rapid advancement of computational science, modeling and simulation have become standard methods to study the behavior of complex systems. As scientists and engineers try to capture more detail, the models become more complex. Given that there are inevitable uncertainties entering at every stage of a model's life cycle, the challenge is to identify those components that contribute most to uncertainties in the predictions. This thesis presents new methodologies for allowing direct incorporation of uncertainty into the model formulation and for identifying the relative importance of different parameters. The basis of these methods is the deterministic equivalent modeling method (DEMM), which applies polynomial chaos expansions and the probabilistic collocation approach to transform the stochastic model into a deterministic equivalent model. By transforming the model, the task of determining the probability density function of the model response surface is greatly simplified. In order to advance the representation of parametric uncertainty, a theoretical study of polynomial chaos representation of uncertain parameters has been performed and an Adomian polynomial expansion for functions of random variables has been developed. While DEMM is applied to various engineering systems to study the propagation of uncertainty in complex models, a systematic framework is introduced to quantitatively assess the effect of uncertain parameters in stochastic optimization problems for chemical product and process design. Furthermore, parametric uncertainty analysis techniques for discrete and correlated random variables have been developed so that the deterministic equivalent modeling method can be applied to a broader range of engineering problems. As a result of these developments, uncertainty analysis can now be performed 2 to 3 orders of magnitude faster than with conventional methods such as Monte Carlo. Examples of models in various engineering systems suggest both the accuracy and the practicality of the new framework for parametric uncertainty analysis established in this thesis.
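A one-variable toy version of the polynomial chaos and probabilistic collocation machinery that DEMM builds on (the function, order and quadrature choices are assumptions for illustration, not the thesis's framework): expand y = f(x) for x ~ N(0,1) in probabilists' Hermite polynomials, computing the coefficients a_k = E[f(x)He_k(x)]/k! at Gauss-Hermite collocation points; the mean and variance then come directly from the coefficients.

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

def f(x):
    return np.exp(0.3 * x)             # stand-in for an expensive model

order = 4
pts, wts = hermegauss(order + 1)       # collocation points and weights
wts = wts / wts.sum()                  # normalize to the N(0,1) measure

coeffs = [np.sum(wts * f(pts) * hermeval(pts, [0] * k + [1]))
          / math.factorial(k) for k in range(order + 1)]

mean = coeffs[0]                                   # E[y] = a_0
var = sum(math.factorial(k) * coeffs[k] ** 2       # Var[y] = sum_k k! a_k^2
          for k in range(1, order + 1))
print(f"mean ~ {mean:.5f} (exact {math.exp(0.045):.5f}), variance ~ {var:.5f}")
```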
30

Ferone, A. "EXPLOITING HIGHER ORDER UNCERTAINTY IN IMAGE ANALYSIS". Doctoral thesis, Università degli Studi di Milano, 2011. http://hdl.handle.net/2434/155479.

Abstract:
Soft computing is a group of methodologies that work synergistically to provide flexible information processing capability for handling real-life ambiguous situations. Its aim is to exploit the tolerance for imprecision, uncertainty, approximate reasoning, and partial truth in order to achieve tractability, robustness, and low-cost solutions. Soft computing methodologies (involving fuzzy sets, neural networks, genetic algorithms, and rough sets) have been successfully employed in various image processing tasks, including image segmentation, enhancement and classification, both individually and in combination with other soft computing techniques. The reason for this success is that soft computing techniques provide powerful tools to describe the uncertainty naturally embedded in images, which can be exploited in various image processing tasks. The main contribution of this thesis is to present tools for handling uncertainty by means of a rough-fuzzy framework for exploiting feature-level uncertainty. The first contribution is the definition of a general framework based on the hybridization of rough and fuzzy sets, along with a new operator called the RF-product, as an effective solution to some problems in image analysis. The second and third contributions are devoted to proving the effectiveness of the proposed framework, by presenting a vector quantization based compression method, with an analysis of its compression capabilities, and an HSV color image segmentation technique.
31

Rapadamnaba, Robert. "Uncertainty analysis, sensitivity analysis, and machine learning in cardiovascular biomechanics". Thesis, Montpellier, 2020. http://www.theses.fr/2020MONTS058.

Abstract:
This thesis follows on from a recent study conducted by researchers from the University of Montpellier, with the aim of proposing to the scientific community an inversion procedure capable of noninvasively estimating patient-specific blood pressure in cerebral arteries. Its first objective is, on the one hand, to examine the accuracy and robustness of the inversion procedure proposed by these researchers with respect to various sources of uncertainty related to the models used, the formulated assumptions, and the patient-specific clinical data, and on the other hand, to set a stopping criterion for the ensemble Kalman filter based algorithm used in their inversion procedure. For this purpose, an uncertainty analysis and several sensitivity analyses are carried out. The second objective is to illustrate how machine learning, mainly convolutional neural networks, can be a very good alternative to the time-consuming and costly inversion procedure for cerebral blood pressure estimation. An approach taking into account the uncertainties related to the processing of the patient-specific medical images and to the blood flow model assumptions, such as assumptions about boundary conditions and physical and physiological parameters, is first presented to quantify uncertainties in the inversion procedure outcomes. Uncertainties related to medical image segmentation are modelled using a Gaussian distribution, and uncertainties related to the choice of modeling assumptions are analyzed by considering several possible hypothesis-choice scenarios. From this approach, it emerges that the uncertainties in the procedure results are of the same order of magnitude as those related to segmentation errors. Furthermore, this analysis shows that the procedure outcomes are very sensitive to the assumptions made about the model boundary conditions. In particular, the choice of symmetrical Windkessel boundary conditions for the model proves to be the most relevant for the case of the patient under study. Next, an approach is presented for ranking the parameters estimated during the inversion procedure in order of importance and for setting a stopping criterion for the algorithm used in the inversion procedure. The results of this strategy show, on the one hand, that most of the model's proximal resistances are the most important parameters for blood flow estimation in the internal carotid arteries and, on the other hand, that the inversion algorithm can be stopped as soon as a certain reasonable convergence threshold for the most influential parameters is reached. Finally, a new numerical platform based on machine learning is presented, which estimates the patient-specific blood pressure in the cerebral arteries much faster than the inversion procedure but with the same accuracy. The application of this platform to the patient-specific data used in the inversion procedure provides a noninvasive, real-time estimate of patient-specific cerebral pressure consistent with the inversion procedure's estimate.
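As a rough illustration of the kind of stopping criterion described, the sketch below halts an ensemble Kalman filter loop once the ensemble means of the most influential parameters stop moving; the tolerance, the window, and `enkf_update` are assumptions, not the thesis's actual rule:

```python
import numpy as np

def converged(history, influential_idx, tol=1e-3, window=3):
    """Stop when the ensemble means of the influential parameters have
    stabilized: relative change below `tol` over the last `window` iterations.

    history         : list of (n_members, n_params) arrays, one per iteration
    influential_idx : indices of the most influential parameters
                      (e.g. taken from a prior sensitivity ranking)
    """
    if len(history) < window + 1:
        return False
    means = np.array([h.mean(axis=0)[influential_idx] for h in history])
    recent = means[-(window + 1):]
    rel_change = np.abs(np.diff(recent, axis=0)) / (np.abs(recent[:-1]) + 1e-12)
    return bool((rel_change < tol).all())

# Usage inside a generic EnKF loop (the update step itself is omitted):
# history = []
# while not converged(history, influential_idx=[0, 2]):
#     ensemble = enkf_update(ensemble, observations)  # hypothetical update step
#     history.append(ensemble.copy())
```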
APA, Harvard, Vancouver, ISO, and other styles
32

Zhang, Guohong. "Estimating uncertainties in integrated reservoir studies". Diss., Texas A&M University, 2003. http://hdl.handle.net/1969.1/204.

Full text source
Abstract:
To make sound investment decisions, decision makers need accurate estimates of the uncertainties present in forecasts of reservoir performance. In this work I propose a method, the integrated mismatch method, which incorporates the misfit of the history match into the estimation of uncertainty in the prediction. I applied the integrated mismatch method, which overcomes some deficiencies of existing methods, to uncertainty estimation in two reservoir studies and compared the results to estimates from existing methods. The integrated mismatch method tends to generate smaller ranges of uncertainty than many existing methods. When starting from nonoptimal reservoir models, in some cases the integrated mismatch method is able to bracket the true reserves value while other methods fail to bracket it. The results show that, even when starting from a nonoptimal reservoir model, as long as the experimental designs encompass the true-case parameters, the integrated mismatch method brackets the true reserves value. If the experimental designs do not encompass all the true-case parameters, but the true reserves value is covered by the experiments, the integrated mismatch method may still bracket the true case. This applies if there is a strong correlation between mismatch and closeness to the true reserves value. The integrated mismatch method does not need a large number of simulation runs for the uncertainty analysis, while some other methods need hundreds of runs.
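The abstract does not state the method's equations. A generic way to fold history-match misfit into forecast uncertainty, sketched below on invented numbers, is to weight each simulation run by a Gaussian-style likelihood exp(-S/2) of its misfit S and read percentiles off the weighted forecast distribution; this illustrates the idea only, not the integrated mismatch method itself:

```python
import numpy as np

def weighted_forecast_interval(forecasts, mismatches, lo=0.1, hi=0.9):
    """Percentile interval of forecasts, weighting each simulation run by its
    history-match quality. `mismatches` are sum-of-squares misfit values;
    weights ~ exp(-S/2), as in a Gaussian likelihood."""
    forecasts = np.asarray(forecasts, dtype=float)
    w = np.exp(-0.5 * np.asarray(mismatches, dtype=float))
    w /= w.sum()
    order = np.argsort(forecasts)
    cdf = np.cumsum(w[order])
    return (forecasts[order][np.searchsorted(cdf, lo)],
            forecasts[order][np.searchsorted(cdf, hi)])

# Toy example: 8 simulation runs with reserves forecasts and their misfits.
reserves = [95, 100, 103, 110, 120, 130, 90, 105]   # e.g. MMbbl
misfit   = [ 8,   1,   2,   6,  15,  30,  12, 1.5]
print(weighted_forecast_interval(reserves, misfit))
```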
APA, Harvard, Vancouver, ISO, and other styles
33

Balonon-Rosen, Mitchell. "An uncertainty analysis of a color tolerance database /". Online version of thesis, 1993. http://hdl.handle.net/1850/11066.

Full text source
APA, Harvard, Vancouver, ISO, and other styles
34

Kreye, Melanie E. "Uncertainty analysis in competitive bidding for service contracts". Thesis, University of Bath, 2011. https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.548104.

Full text source
Abstract:
Sustainable production and consumption have become more important internationally, which has transformed market structures and competitive situations in the direction of servitisation. This means that manufacturing companies are forced to compete through the supply of services as opposed to products. In particular, the suppliers of long-life products such as submarines and airplanes no longer simply sell these products but provide their capability or availability. Companies such as Rolls-Royce Engines achieve 60% of their revenue through selling a service rather than the engine itself. For a manufacturing company, the shift towards being a service provider means that it usually has to bid for service contracts, sometimes competitively. In the context of competitive bidding, decision makers face various uncertainties that influence their decision. Ignoring these uncertainties or their influences can result in problems such as generating too little profit, or even a loss, or exposure to financial risks. Raising the decision maker's awareness of the uncertainties, for example in the form of a decision matrix expressing the trade-off between the probability of winning the contract and the probability of making a profit, aims at integrating these factors into the decision process, enabling the bidding company to make a more informed decision. This was the focus of the research presented in this thesis. The aim of this research was to support the pricing decision by defining a process for modelling the influencing uncertainties and including them in a decision matrix depicting the trade-off between the probability of winning the contract and the probability of making a profit. Three empirical studies are described and the associated decision processes and influencing uncertainties are discussed. Based on these studies, a conceptual framework was defined which depicts the factors influencing a pricing decision at the bidding stage and the uncertainties within them. The framework was validated with a case study in contract bidding where the uncertainties were modelled and included in a decision matrix depicting the probability of winning the contract and the probability of making a profit. The main contributions of this research are the identification of the uncertainties influencing a pricing decision, the depiction of these in a conceptual framework, a method for ascertaining how to model these uncertainties, and an assessment of the use of such an approach via an industrial case study.
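A minimal sketch of the win/profit trade-off such a decision matrix captures, under assumed models: an illustrative logistic win-probability curve and a lognormal delivery cost, neither taken from the thesis:

```python
import numpy as np

rng = np.random.default_rng(0)

def decision_matrix(prices, cost_median=1.0e6, cost_sigma=0.2,
                    ref_price=1.2e6, slope=8.0, n=50_000):
    """For each candidate bid price, estimate P(win) and P(profit).

    P(win)    : illustrative logistic curve, decreasing in price.
    P(profit) : Monte Carlo over an uncertain (lognormal) delivery cost.
    The joint column assumes win probability and cost are independent.
    """
    costs = rng.lognormal(np.log(cost_median), cost_sigma, size=n)
    rows = []
    for p in prices:
        p_win = 1.0 / (1.0 + np.exp(slope * (p / ref_price - 1.0)))
        p_profit = (costs < p).mean()
        rows.append((p, p_win, p_profit, p_win * p_profit))
    return rows

for p, p_win, p_profit, joint in decision_matrix([0.9e6, 1.1e6, 1.3e6, 1.5e6]):
    print(f"price {p:9.0f}  P(win) {p_win:.2f}  P(profit) {p_profit:.2f}  "
          f"P(win & profit) {joint:.2f}")
```

Scanning candidate prices this way makes the trade-off explicit: low bids win often but rarely cover cost, high bids are profitable but rarely won.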
APA, Harvard, Vancouver, ISO, and other styles
35

Duncan, Gregory S. "Milling dynamics prediction and uncertainty analysis using receptance coupling substructure analysis". [Gainesville, Fla.] : University of Florida, 2006. http://purl.fcla.edu/fcla/etd/UFE0015544.

Full text source
APA, Harvard, Vancouver, ISO, and other styles
36

Lange, Matthias. "Analysis of the uncertainty of wind power predictions". [S.l. : s.n.], 2003. http://deposit.ddb.de/cgi-bin/dokserv?idn=969985789.

Full text source
APA, Harvard, Vancouver, ISO, and other styles
37

Sozak, Ahmet. "Uncertainty Analysis Of Coordinate Measuring Machine (CMM) Measurements". Master's thesis, METU, 2007. http://etd.lib.metu.edu.tr/upload/2/12608887/index.pdf.

Full text source
Abstract:
In this thesis, the measurement uncertainty of a Coordinate Measuring Machine (CMM) is analysed and software is designed to simulate it. The analysis begins with an inspection of the measurement process and the structure of CMMs. After that, error sources are defined with respect to their effects on the measurement, and an error model is constructed to compensate for these effects. In other words, the systematic parts of the geometric, kinematic, and thermal errors are compensated for through error modelling. The kinematic and geometric error model is specific to the structure of the CMM under inspection. In addition, a common orthogonal kinematic model is formed, and error maps of the machine volume are obtained using the machine's laser error data. Afterwards, the models are compared with each other by taking their difference and ratio. The definition and compensation of the systematic errors leaves the uncertainty of measurements to be analysed. Measurement uncertainty consists of the uncompensated systematic errors and random errors. The other aim of the thesis is to quantify these uncertainties using different methods and to assess the success of these methods. Uncertainty budgeting, comparison, statistical evaluation by designed experiments, and simulation methods are examined and applied to the CMM under inspection. In addition, Virtual CMM software is designed to simulate the task-specific measurement uncertainty of circle, sphere, and plane measurements without relying on repeated physical measurements. Finally, the performance of the software, which depends heavily on the mathematical modelling of the machine volume, is tested against actual measurements.
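A minimal sketch of what such a Virtual CMM simulation might look like for the circle task: probe points are perturbed with random probing noise plus a residual (uncompensated) scale error, a least-squares circle is fitted, and the spread of fitted radii gives a task-specific uncertainty. The error magnitudes and the Kasa fit are illustrative choices, not the thesis's implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_circle(x, y):
    """Algebraic (Kasa) least-squares circle fit: x^2 + y^2 + D*x + E*y + F = 0,
    so the centre is (-D/2, -E/2) and r^2 = (D/2)^2 + (E/2)^2 - F."""
    A = np.column_stack([x, y, np.ones_like(x)])
    b = -(x**2 + y**2)
    D, E, F = np.linalg.lstsq(A, b, rcond=None)[0]
    cx, cy = -D / 2.0, -E / 2.0
    return cx, cy, np.sqrt(cx**2 + cy**2 - F)

def radius_uncertainty(r_true=10.0, n_points=12, n_trials=2000,
                       sigma_probe=0.002, sigma_scale=1e-4):
    """Task-specific uncertainty of a circle radius by simulation."""
    theta = np.linspace(0.0, 2.0 * np.pi, n_points, endpoint=False)
    radii = np.empty(n_trials)
    for i in range(n_trials):
        scale = 1.0 + rng.normal(0.0, sigma_scale)   # residual axis scale error
        x = scale * r_true * np.cos(theta) + rng.normal(0, sigma_probe, n_points)
        y = scale * r_true * np.sin(theta) + rng.normal(0, sigma_probe, n_points)
        radii[i] = fit_circle(x, y)[2]
    return radii.mean(), 2.0 * radii.std(ddof=1)     # mean, k=2 expanded unc.

print(radius_uncertainty())
```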
APA, Harvard, Vancouver, ISO, and other styles
38

Hapa, Cankat. "Uncertainty In Well Test And Core Permeability Analysis". Master's thesis, METU, 2008. http://etd.lib.metu.edu.tr/upload/2/12610144/index.pdf.

Full text source
Abstract:
Reservoir permeability is one of the important parameters derived from well test analysis. Small-scale permeability measurements in wells are usually made using core plugs or, more recently, probe permeameter measurements. Upscaling of these measurements for comparison with permeabilities derived from well tests (pressure build-up) can be accomplished by statistical averaging methods. Well test permeability is often compared with one of the core plug averages: arithmetic, geometric, or harmonic. A question that often arises is which average the well-test-derived permeability represents, and over what region this average is valid. A second important question is how the data sets should be reconciled when there are discrepancies. In practice, the permeability derived from well tests is often assumed to be equivalent to the arithmetic (in a layered reservoir) or geometric (in a randomly distributed permeability field) average of the plug measurements. These averages are known to be members of a more general power-average solution. This pragmatic approach (which may include an assumption about the near-well geology) is often flawed for a number of reasons, which this study attempts to explain. The assessment of in-situ reservoir permeability requires an understanding of both core (plug and probe) and well test measurements – in terms of their volume scale of investigation, measurement mechanism, interpretation, and integration. Pressure build-up tests for 26 wells and core plug analyses for 32 wells have valid measured data to be evaluated. Core plug permeabilities are upscaled and compared with permeabilities derived from pressure build-up tests. The arithmetic, harmonic, and geometric averages of the core plug permeability data are computed for each facies and formation distribution. The reservoir permeability heterogeneities are evaluated at each step of the upscaling procedure by computing the coefficient of variation, the Dykstra-Parsons coefficient, and the Lorenz coefficient. This study compared core and well test measurements in a heavy-oil carbonate field in southeastern Turkey. An evaluation of well test data and associated core plug data sets from a single field results from the interpretation of small-scale (core) and reservoir-scale (well test) permeability data. The techniques used are traditional volume averaging/homogenization methods, with the contribution of determining the permeability heterogeneities of facies at each step of the upscaling procedure and of transforming data that are not suitable for averaging so that they are approximately normally distributed, in combination with a Lorenz plot to identify the flowing intervals. As a result, the geometric average of the upscaled core plug permeability data is found to be approximately equal to the well-test-derived permeability for the well-interpreted well tests. Carbonates are very heterogeneous, and this exercise is also instructive in understanding such heterogeneity for the guidance of reservoir models in this kind of system.
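The averages and heterogeneity measures named here are standard; a small sketch of how they can be computed, using the lognormal-fit form of the Dykstra-Parsons coefficient and invented sample data:

```python
import numpy as np

def permeability_averages(k):
    """Arithmetic, geometric, and harmonic averages of core-plug permeability."""
    k = np.asarray(k, dtype=float)
    return k.mean(), np.exp(np.log(k).mean()), 1.0 / (1.0 / k).mean()

def power_average(k, p):
    """General power average (mean(k^p))^(1/p): p=1 arithmetic, p=-1 harmonic,
    p -> 0 geometric."""
    k = np.asarray(k, dtype=float)
    if abs(p) < 1e-9:
        return np.exp(np.log(k).mean())
    return np.mean(k**p) ** (1.0 / p)

def dykstra_parsons(k):
    """V = (k50 - k84.1)/k50 from a lognormal fit, i.e. 1 - exp(-sigma_ln_k)."""
    s = np.log(np.asarray(k, dtype=float)).std(ddof=1)
    return 1.0 - np.exp(-s)

def coefficient_of_variation(k):
    k = np.asarray(k, dtype=float)
    return k.std(ddof=1) / k.mean()

# Toy facies sample (mD): a heterogeneous, carbonate-like spread.
k = np.array([0.5, 2.0, 5.0, 12.0, 40.0, 150.0])
print(permeability_averages(k))
print(power_average(k, 0.0), dykstra_parsons(k), coefficient_of_variation(k))
```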
APA, Harvard, Vancouver, ISO, and other styles
39

Kabir, Sohag. "Compositional dependability analysis of dynamic systems with uncertainty". Thesis, University of Hull, 2016. http://hydra.hull.ac.uk/resources/hull:13595.

Full text source
Abstract:
Over the past two decades, research has focused on simplifying dependability analysis by looking at how dependability information can be synthesised from system models automatically. This has led to the field of model-based safety assessment (MBSA), which has attracted a significant amount of interest from industry, academia, and government agencies. Model-based safety analysis methods, such as Hierarchically Performed Hazard Origin & Propagation Studies (HiP-HOPS), are increasingly applied by industry for dependability analysis of safety-critical systems. Such systems may feature multiple modes of operation, where the behaviour of the systems and the interactions between system components can change according to which modes of operation the systems are in. MBSA techniques usually combine different classical safety analysis approaches to allow analysts to perform safety analyses automatically or semi-automatically. For example, HiP-HOPS is a state-of-the-art MBSA approach which enhances an architectural model of a system with logical failure annotations to allow safety studies such as Fault Tree Analysis (FTA) and Failure Modes and Effects Analysis (FMEA). In this way it shows how the failure of a single component or combinations of failures of different components can lead to system failure. As systems become more complex and their behaviour more dynamic, capturing this dynamic behaviour and the many possible interactions between the components is necessary to develop an accurate failure model. One way of modelling this dynamic behaviour is with a state-transition diagram. Introducing a dynamic model compatible with the existing architectural information of systems can provide significant benefits in terms of accurate representation and expressiveness when analysing the dynamic behaviour of modern large-scale and complex safety-critical systems. Thus the first key contribution of this thesis is a methodology that enables MBSA techniques to model the dynamic behaviour of systems. The thesis demonstrates this methodology using the HiP-HOPS tool as an example, and thus extends HiP-HOPS with state-transition annotations. This extension allows HiP-HOPS to model more complex dynamic scenarios and to perform compositional dynamic dependability analysis of complex systems by generating Pandora temporal fault trees (TFTs). As TFTs capture state, the techniques used for solving classical FTs are not suitable for solving them; they require a state-space solution for the quantification of probability. This thesis therefore proposes two methodologies, based on Petri Nets and Bayesian Networks, to provide state-space solutions to Pandora TFTs. Uncertainty is another important, yet underdeveloped, aspect of MBSA: typical MBSA approaches are not capable of performing quantitative analysis under uncertainty. Therefore, in addition to the above contributions, this thesis proposes a fuzzy set theory based methodology to quantify Pandora temporal fault trees when there is uncertainty in the failure data of components. The proposed methodologies are applied to a case study to demonstrate how they can be used in practice. Finally, the overall contributions of the thesis are evaluated by discussing the results produced, and from these, conclusions about the potential benefits of the new techniques are drawn.
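Of the solution techniques the abstract names, the fuzzy-set side is the easiest to illustrate compactly. A minimal sketch propagating triangular fuzzy failure probabilities through static AND/OR gates; Pandora's temporal gates and their Petri Net or Bayesian Network solutions need far more machinery, and all event names and numbers below are invented:

```python
import numpy as np

# Triangular fuzzy probability: (low, mode, high). The AND/OR gate outputs
# below are monotonically increasing in every input probability, so applying
# the gate formula pointwise to the three points is valid.

def fuzzy_and(*events):
    """AND gate: product of independent failure probabilities."""
    return tuple(np.prod([e[i] for e in events]) for i in range(3))

def fuzzy_or(*events):
    """OR gate: 1 - prod(1 - p_i) for independent events."""
    return tuple(1.0 - np.prod([1.0 - e[i] for e in events]) for i in range(3))

# Basic events with uncertain failure probabilities (illustrative values).
pump_a  = (0.010, 0.020, 0.040)
pump_b  = (0.015, 0.025, 0.050)
control = (0.001, 0.002, 0.005)

# Top event: both redundant pumps fail, OR the controller fails.
top = fuzzy_or(fuzzy_and(pump_a, pump_b), control)
print(top)   # fuzzy top-event probability as (low, mode, high)
```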
APA, Harvard, Vancouver, ISO, and other styles
40

McIntyre, Neil Robert. "Analysis of uncertainty in river water quality modelling". Thesis, Imperial College London, 2004. http://hdl.handle.net/10044/1/11828.

Full text source
APA, Harvard, Vancouver, ISO, and other styles
41

Wu, Guangxi. "Sensitivity and uncertainty analysis of subsurface drainage design". Thesis, University of British Columbia, 1988. http://hdl.handle.net/2429/28529.

Full text source
Abstract:
The literature on subsurface drainage theories, the determination of drainage parameters, and approaches to uncertainty analysis was reviewed. Sensitivity analysis was carried out on drain spacing equations for steady state and nonsteady state, in homogeneous soils and in layered soils. It was found that drain spacing is very sensitive to the hydraulic conductivity, the drainage coefficient, and the design midspan water table height. Spacing is not sensitive to the depth of the impermeable layer or the drain radius. In the transient state, spacing is extremely sensitive to the midspan water table heights if the water table fall is relatively small. In that case steady state theory will yield more reliable results and its use is recommended. Drain spacing is usually more sensitive to the hydraulic conductivity of the soil below the drains than to that of the soil above the drains. Therefore, it is desirable to take samples from deeper soil when measuring hydraulic conductivity. A new spacing formula was developed for two-layered soils and for a special case of three-layered soils with drains at the interface of the top two layers. This equation was compared with the Kirkham equation. The new formula yields spacings close to the Kirkham equation if the hydraulic conductivity of the soil above the drains is relatively small; otherwise, it tends to give more accurate results. First- and second-order analysis methods were employed to analyze parameter uncertainty in subsurface drainage design. It was found that conventional design methods based on a deterministic framework may result in inadequate spacing due to the uncertainty involved. Uncertainty may be incorporated into practical design by using the simple equations and graphs presented in this research; the procedure was illustrated through an example. Conclusions were drawn from the present study and recommendations were made for future research.
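The thesis's new spacing formula is not given in the abstract, so the sketch below uses the standard steady-state Hooghoudt drain spacing equation as a stand-in and computes relative (logarithmic) sensitivities by central finite differences; all parameter values are illustrative:

```python
import numpy as np

def hooghoudt_spacing(Ka, Kb, de, h, q):
    """Steady-state Hooghoudt drain spacing L (m): q*L^2 = 8*Kb*de*h + 4*Ka*h^2.

    Ka, Kb : hydraulic conductivity above/below the drains (m/d)
    de     : equivalent depth to the impermeable layer (m)
    h      : design midspan water table height above the drains (m)
    q      : drainage coefficient (m/d)
    """
    return np.sqrt((8.0 * Kb * de * h + 4.0 * Ka * h**2) / q)

def relative_sensitivity(f, params, name, eps=1e-4):
    """d ln L / d ln p: the % change in spacing per % change in parameter p."""
    up, dn = dict(params), dict(params)
    up[name] *= 1.0 + eps
    dn[name] *= 1.0 - eps
    return (np.log(f(**up)) - np.log(f(**dn))) / (2.0 * eps)

base = dict(Ka=0.5, Kb=1.5, de=3.0, h=0.6, q=0.007)
print("L =", hooghoudt_spacing(**base), "m")
for p in base:
    print(p, relative_sensitivity(hooghoudt_spacing, base, p))
```

Run on these numbers, the ranking reproduces the abstract's qualitative finding: spacing responds more strongly to the conductivity below the drains (Kb) than above (Ka).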
APA, Harvard, Vancouver, ISO, and other styles
42

Pourgol-Mohamad, Mohammad. "Integrated Methodology for Thermal-Hydraulics Uncertainty Analysis (IMTHUA)". College Park, Md. : University of Maryland, 2007. http://hdl.handle.net/1903/6681.

Full text source
Abstract:
Thesis (Ph. D.) -- University of Maryland, College Park, 2007.
Thesis research directed by: Mechanical Engineering. Title from t.p. of PDF. Includes bibliographical references. Published by UMI Dissertation Services, Ann Arbor, Mich. Also available in paper.
APA, Harvard, Vancouver, ISO, and other styles
43

McClure, John Douglas. "Sensitivity and uncertainty analysis in atmospheric dispersion models". Thesis, University of Glasgow, 2002. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.270992.

Full text source
APA, Harvard, Vancouver, ISO, and other styles
44

Clough, Robert. "Uncertainty contributions to species specific isotope dilution analysis". Thesis, University of Plymouth, 2003. http://hdl.handle.net/10026.1/2092.

Full text source
Abstract:
Mercury speciation in solid sample matrices has been investigated using high performance liquid chromatography (HPLC) coupled with multicollector sector field (MC-SF) and quadrupole (Q) inductively coupled plasma mass spectrometry (ICP-MS) for species-specific isotope dilution mass spectrometry (IDMS). 199Hg-enriched methylmercury chloride has been synthesised and recovered in solid form for use as a spike material. The stability of methylmercury during the IDMS procedure was investigated using 199Hg- and 13C-labelled methylmercury isotopomers and ¹H nuclear magnetic resonance spectroscopy. Intermolecular exchange of the methylmercury halide counter ion was observed; the halide counter ion order of preference was I > Br > Cl. No evidence was found for the decomposition, or formation, of methylmercury during equilibration with soil (NIST2710 SRM) or dogfish muscle (DORM-2 CRM), or during chromatographic separation. The extent of equilibration between the spike and the particulate-bound mercury compounds was studied by temporal monitoring of the 200Hg:199Hg isotope amount ratio and by determining the amount of Hg species in the liquid phase. For NIST2710, complete equilibration was only achieved when concentrated HNO3 was employed in combination with a microwave digestion. For DORM-2, complete equilibration was achieved when using 1:1 H2O:CH3OH v/v with 0.01% 2-mercaptoethanol as the solvent, even though only 47% of the analyte was extracted into the liquid phase. The mass fraction of methylmercury chloride has been determined in DORM-2 and BCR464 lobster hepatopancreas CRM by two different procedures: single IDMS and approximate-matching double IDMS. Mercury cold vapour generation of the HPLC column eluent allowed isotope amount ratio measurements by MC-SF-ICP-MS. For each CRM, the mass fraction of methylmercury determined by the two IDMS methods was not statistically different, within the limits of uncertainty, from the certified values. An uncertainty budget for both IDMS procedures has been formulated to allow the performance of each method to be compared. For single IDMS, the major uncertainty contribution was derived from the within-replicate uncertainty, u(within). The combined standard uncertainty of each replicate analysis was dominated by two components: the uncertainty associated with the natural isotopic abundance 200Hg:199Hg isotope amount ratio and the uncertainty associated with the mass fraction of the 199Hg-enriched methylmercury chloride spike. The between-blend standard uncertainty, u(between), was the major contributor to the expanded uncertainty for approximate-matching double IDMS. The combined standard uncertainty for each individual replicate was dominated by the contribution from the standard uncertainty associated with the measured 200Hg:199Hg isotope amount ratios in the spiked sample and the mass bias calibration blend.
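The abstract does not reproduce the single-IDMS measurement equation; the sketch below starts from the basic isotope balance of a sample/spike blend (derived in the comments) and propagates input uncertainties by Monte Carlo, which is one common way of assembling such a budget. The spike enrichment, ratio values, and uncertainties are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 100_000

# Measured blend ratio R_B = n(200Hg)/n(199Hg) with its standard uncertainty.
R_B = rng.normal(0.520, 0.002, N)

# Isotopic abundances: natural-composition sample (x) and 199Hg-enriched
# spike (y). Natural values are approximate; spike values are invented.
A199_x = rng.normal(0.1687, 0.0005, N)
A200_x = rng.normal(0.2310, 0.0006, N)
A199_y = rng.normal(0.91, 0.002, N)
A200_y = rng.normal(0.05, 0.001, N)

n_y = rng.normal(1.00, 0.01, N)   # amount of spike added (nmol), illustrative

# Isotope balance of the blend:
#   R_B = (n_x*A200_x + n_y*A200_y) / (n_x*A199_x + n_y*A199_y)
# solved for the analyte amount n_x:
n_x = n_y * (A200_y - R_B * A199_y) / (R_B * A199_x - A200_x)

print(f"n_x = {n_x.mean():.4f} nmol, u(n_x) = {n_x.std(ddof=1):.4f} nmol")
```

Sorting the inputs by their contribution to the variance of n_x (e.g. by freezing one input at a time) gives the kind of budget the abstract describes, with the blend ratio and spike characterisation typically dominating.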
APA, Harvard, Vancouver, ISO, and other styles
45

Cossa, Paul F. (Paul Francois) 1979. "Uncertainty analysis of the cost of climate policies". Thesis, Massachusetts Institute of Technology, 2004. http://hdl.handle.net/1721.1/30074.

Full text source
Abstract:
Thesis (S.M.)--Massachusetts Institute of Technology, Engineering Systems Division, Technology and Policy Program, 2004.
Includes bibliographical references (leaves 84-85).
Every climate change policy issue is inherently limited by two questions: what exactly are the consequences of climate change for our lives, and how much will it cost to deal with them? Almost twelve years after the parties to the United Nations Framework Convention on Climate Change acknowledged that "change in the Earth's climate and its adverse effects are a common concern of humankind" (United Nations, 1992), no global effort is really visible yet. The reason lies in the difficulty scientists and economists have in answering those two questions. This thesis tries to understand how uncertainty about the consequences of climate change drives the cost of policy decisions. It especially tries to find out what the main sources of uncertainty in policy costs are, and where we should therefore put our research and policy efforts. In the first part of this thesis, we perform a sensitivity analysis on the economic parameters relevant to the analysis, in order to identify the ones that most influence the cost of climate change policies. We then develop and run a specific method to elicit experts' opinions on the uncertainty in each of these parameters. This step allows us to conduct our uncertainty analysis under different policy assumptions and to better understand the implications of uncertainty for climate change policies.
APA, Harvard, Vancouver, ISO, and other styles
46

Campbell, Mark E. (Mark Eric) 1968. "Uncertainty modeling for structural control analysis and synthesis". Thesis, Massachusetts Institute of Technology, 1996. http://hdl.handle.net/1721.1/49602.

Full text source
APA, Harvard, Vancouver, ISO, and other styles
47

Genbäck, Minna. "Uncertainty intervals and sensitivity analysis for missing data". Doctoral thesis, Umeå universitet, Statistik, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-127121.

Full text source
Abstract:
In this thesis we develop methods for dealing with missing data in a univariate response variable when estimating regression parameters. Missing outcome data is a problem in a number of applications, one of which is follow-up studies. In follow-up studies data are collected on two (or more) occasions, and it is common that only some of the initial participants return for the second occasion. This is the case in Paper II, where we investigate predictors of decline in self-reported health in older populations in Sweden, the Netherlands and Italy. In that study, around 50% of the study participants drop out. It is common that researchers rely on the assumption that the missingness is independent of the outcome given some observed covariates. This assumption is called missing at random (MAR) or an ignorable missingness mechanism. However, MAR cannot be tested from the data, and if it does not hold, estimators based on this assumption are biased. In the study of Paper II, we suspect that some of the individuals drop out due to bad health. If this is the case, the data are not MAR. One alternative to MAR, which we pursue, is to incorporate the uncertainty due to missing data into the estimates, using interval estimates instead of point estimates and uncertainty intervals instead of confidence intervals. An uncertainty interval is the analog of a confidence interval but wider, due to a relaxation of the assumptions on the missing data. These intervals can be used to visualize the consequences that deviations from MAR have for the conclusions of the study; that is, they can be used to perform a sensitivity analysis of MAR. The thesis covers different types of linear regression. In Papers I and III we have a continuous outcome, in Paper II a binary outcome, and in Paper IV we allow for mixed effects with a continuous outcome. In Paper III we estimate the effect of a treatment, which can be seen as an example of missing outcome data.
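A toy version of the uncertainty-interval idea for a simple mean (not the regression estimators developed in the thesis): a sensitivity parameter delta bounds how far the missing outcomes' mean may deviate from the observed mean (delta = 0 recovers MAR), and the uncertainty interval is the union of confidence intervals over that range:

```python
import numpy as np

Z975 = 1.959964  # 97.5% standard normal quantile

def uncertainty_interval(y_obs, n_total, delta_range):
    """Uncertainty interval for the population mean of an outcome when
    n_total - len(y_obs) subjects have a missing outcome.

    delta_range bounds the difference between the mean of the missing
    values and the observed mean; the interval is the union of 95% CIs
    over all admissible delta."""
    y_obs = np.asarray(y_obs, dtype=float)
    n_obs = len(y_obs)
    p_miss = 1.0 - n_obs / n_total
    m = y_obs.mean()
    se = y_obs.std(ddof=1) / np.sqrt(n_obs)
    lo_d, hi_d = min(delta_range), max(delta_range)
    # Population mean = (1 - p_miss)*m + p_miss*(m + delta) = m + p_miss*delta
    return (m + p_miss * lo_d - Z975 * se, m + p_miss * hi_d + Z975 * se)

rng = np.random.default_rng(3)
y = rng.normal(70.0, 10.0, size=500)   # observed outcomes, e.g. a health score
print(uncertainty_interval(y, n_total=1000, delta_range=(-5.0, 5.0)))
```

The wider the plausible delta range, the wider the uncertainty interval, which makes the cost of deviations from MAR directly visible.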
APA, Harvard, Vancouver, ISO, and other styles
48

Hoops, Christopher Michael. "Uncertainty Analysis for Control Inputs of Diesel Engines". The Ohio State University, 2010. http://rave.ohiolink.edu/etdc/view?acc_num=osu1282067559.

Full text source
APA, Harvard, Vancouver, ISO, and other styles
49

Hayes, Richard. "Efficient analysis of nonlinear aeroelastic systems under uncertainty". Thesis, Queen's University Belfast, 2016. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.707228.

Full text source
Abstract:
In aircraft design, consideration of aeroelastic interactions is necessary to define permitted operating conditions which are safe. Limit cycle oscillations (LCOs) are an example of unwanted and potentially dangerous wing behaviour. These are constant-amplitude structural vibrations driven by the interaction between a wing and the surrounding airflow. The present work is concerned with the numerical analysis of LCOs. More specifically, it addresses the lack of comprehensive stochastic analyses of these highly nonlinear phenomena by utilising techniques which improve efficiency and support high complexity. This will allow simulation to be used more extensively in the aircraft design process, which will enable safety factors to be rationalised, thus improving aircraft performance. The cost of LCO simulation is reduced by implementing a High-Dimensional Harmonic Balance-based (HDHB) formulation. This method showed excellent agreement with the corresponding time-marching models, and efficiency gains surpassed one order of magnitude. An objective of this work is to apply HDHB to problems with nonlinearities in both structural and flow field characteristics. Polynomial Chaos Expansions (PCE) are employed for the propagation of parametric uncertainties, offering large efficiency gains in comparison to Monte Carlo methods. Bifurcations uncovered in the stochastic analysis were handled well by the PCE by partitioning the parameter space along the discontinuity. Stochastic model updating, also enabled by the efficiency of the HDHB method, is explored to tackle the difficulties in establishing the sources of nonlinearities within aeroelastic problems. A Bayesian inference technique is used to perform parameter estimation of elusive structural characteristics for the purpose of model calibration. Parameter values could be identified with high accuracy using sparse observational data. This work has shown that the HDHB method provides an attractive alternative to time-marching methods. In addition, when combined with PCE, the stochastic analysis of highly complex LCOs becomes realisable.
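A minimal, self-contained illustration of the PCE idea for one standard-normal parameter, projecting a nonlinear response onto Hermite polynomials with Gauss-Hermite quadrature; the response function is a stand-in, not the thesis's aeroelastic model:

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

def pce_moments(f, order=6, quad_pts=20):
    """Mean and variance of f(X), X ~ N(0, 1), via a 1-D Hermite PCE.

    Coefficients a_n = E[f(X) He_n(X)] / n!, computed with Gauss-HermiteE
    quadrature (weight exp(-x^2/2), total weight sqrt(2*pi)). Then
    E[f] = a_0 and Var[f] = sum_{n>=1} a_n^2 * n!.
    """
    x, w = He.hermegauss(quad_pts)
    fx = f(x)
    norm = np.sqrt(2.0 * np.pi)
    coeffs = []
    for n in range(order + 1):
        basis = He.hermeval(x, np.eye(order + 1)[n])   # He_n at the nodes
        coeffs.append(np.sum(w * fx * basis) / (math.factorial(n) * norm))
    mean = coeffs[0]
    var = sum(a * a * math.factorial(n) for n, a in enumerate(coeffs) if n > 0)
    return mean, var

# Nonlinear response with an uncertain parameter: g(X) = sin(1 + 0.3*X).
print(pce_moments(lambda x: np.sin(1.0 + 0.3 * x)))
```

The appeal mirrored in the abstract: a handful of deterministic solves at quadrature points replaces thousands of Monte Carlo samples, as long as the response is smooth in the uncertain parameter (hence the special handling of bifurcations).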
APA, Harvard, Vancouver, ISO, and other styles
50

Willis, Thomas D. M. "Systematic analysis of uncertainty in flood inundation modelling". Thesis, University of Leeds, 2014. http://etheses.whiterose.ac.uk/7493/.

Full text source
Abstract:
Recent evaluations of 2D models have analysed uncertainty in the data inputs to flood models, but have treated the model code as a black box. In this work, the influence of the numerical representation of the model on the results is evaluated. The purpose is not only to understand the significance of the physical scheme in the model for the results, but also the importance of this with respect to other known sources of uncertainty, in particular boundary conditions, calibrated parameters such as Manning's friction values, DEM accuracy, and other more subjective forms of uncertainty associated with the choices used by modellers in constructing models, such as building representation. To further explore the impact that the level of physical representation has on model output, models were also analysed using risk- and exposure-based measures. The methods included vulnerability-weighted measures and the use of damage curves from the Multi Coloured Manual. A series of Monte Carlo tests were undertaken for a range of parameters over three test cases using the LISFLOOD-FP code. The LISFLOOD-FP code was chosen as it has several formulations for solving 2D floodplain flow within its framework, each with a different level of physical representation. The test cases included two urban events, a culvert overtopping event in Glasgow and a canal embankment failure in Coventry, and a river overtopping in Mexborough, Yorkshire, a rural-urban domain. The test cases provided a wide range of hydraulic conditions and reflect events typically assessed with inundation models, to ensure the effect of model bias was removed from the results. The results for the test cases indicated that the choice of physical representation was the most critical factor affecting model results, particularly for the urban test cases. However, the interaction between factors and parameters also indicated that for certain scenarios this becomes less critical to model results. The use of risk-based methods also identified areas of variation between parameter sets and numerical schemes that are not identified with traditional model evaluation techniques.
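The binary wet/dry fit statistic F = A/(A+B+C) is the standard measure used to score inundation extents in Monte Carlo tests of this kind; a toy sketch with a one-line stand-in for the hydraulic model (LISFLOOD-FP itself is obviously not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(4)

def fit_measure(observed_wet, modelled_wet):
    """Binary wet/dry fit F = A / (A + B + C): A = correctly predicted wet
    cells, B = overpredicted wet cells, C = underpredicted wet cells."""
    a = np.sum(observed_wet & modelled_wet)
    b = np.sum(~observed_wet & modelled_wet)
    c = np.sum(observed_wet & ~modelled_wet)
    return a / float(a + b + c)

def toy_model(manning_n, ground, water_level=1.0):
    """Stand-in for a 2D hydraulic model: rougher floodplain -> higher stage."""
    return ground < water_level + 2.0 * (manning_n - 0.03)

ground = rng.uniform(0.0, 2.0, size=(100, 100))   # synthetic DEM (m)
observed = toy_model(0.035, ground)               # pretend "observed" extent

# Monte Carlo over Manning's n: score each sample against the observed extent.
for n in rng.uniform(0.02, 0.06, size=5):
    print(f"n = {n:.3f}  F = {fit_measure(observed, toy_model(n, ground)):.3f}")
```

A risk-weighted variant would simply weight each cell in A, B, and C by an exposure or vulnerability value before summing, which is how differences invisible to the plain F measure can emerge.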
APA, Harvard, Vancouver, ISO, and other styles