Academic literature on the topic 'Uncertainty quantification framework'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Uncertainty quantification framework.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Uncertainty quantification framework"

1

Verdonck, H., O. Hach, J. D. Polman, O. Braun, C. Balzani, S. Müller, and J. Rieke. "An open-source framework for the uncertainty quantification of aeroelastic wind turbine simulation tools." Journal of Physics: Conference Series 2265, no. 4 (May 1, 2022): 042039. http://dx.doi.org/10.1088/1742-6596/2265/4/042039.

Full text
Abstract:
The uncertainty quantification of aeroelastic wind turbine simulations is an active research topic. This paper presents a dedicated, open-source framework for this purpose. The framework is built around the uncertainpy package, likewise available as open source. Uncertainty quantification is done with a non-intrusive, global, variance-based surrogate model using polynomial chaos expansion (PCE). Two methods to handle the uncertain parameter distribution along the blades are presented. The framework is demonstrated on the basis of an aeroelastic stability analysis. A sensitivity analysis is performed on the influence of the flapwise, edgewise and torsional stiffness of the blades on the damping of the most critical mode for both a Bladed linearization and a Bladed time domain simulation. The sensitivities of both models are in excellent agreement, and the PCE surrogate models are shown to be accurate approximations of the true models.
APA, Harvard, Vancouver, ISO, and other styles
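The non-intrusive PCE workflow described in the abstract above can be illustrated with a minimal, single-input sketch. The toy model, input distribution, and sample sizes below are illustrative assumptions of mine; the paper itself builds on the uncertainpy package and Bladed simulations.

```python
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermeval

rng = np.random.default_rng(0)

def expensive_model(x):
    # Stand-in for an aeroelastic simulation with one uncertain input
    return np.exp(0.3 * x)

# Non-intrusive PCE: sample the input distribution N(0, 1), run the model,
# and project onto probabilists' Hermite polynomials He_n, which are
# orthogonal under the standard normal: E[He_m * He_n] = n! * delta_mn.
order = 5
xs = rng.standard_normal(20_000)
ys = expensive_model(xs)

coeffs = [np.mean(ys * hermeval(xs, [0.0] * n + [1.0])) / factorial(n)
          for n in range(order + 1)]

# Mean and variance of the surrogate follow directly from the coefficients.
pce_mean = coeffs[0]
pce_var = sum(factorial(n) * coeffs[n] ** 2 for n in range(1, order + 1))
```

For this toy model the analytic mean is exp(0.045) ≈ 1.046, which the surrogate recovers to Monte Carlo accuracy; Sobol'-type sensitivity indices, as used in the paper's variance-based analysis, are likewise ratios of such coefficient sums.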
2

Wang, Jiajia, Hao Chen, Jing Ma, and Tong Zhang. "Research on application method of uncertainty quantification technology in equipment test identification." MATEC Web of Conferences 336 (2021): 02026. http://dx.doi.org/10.1051/matecconf/202133602026.

Full text
Abstract:
This paper introduces the concepts of equipment test qualification and uncertainty quantification, and the analysis framework and process of equipment test uncertainty quantification. It analyzes the data uncertainty, model uncertainty and environmental uncertainty, and studies the corresponding uncertainty quantification theory to provide technical reference for the application of uncertainty quantification technology in the field of test identification.
APA, Harvard, Vancouver, ISO, and other styles
3

Zhang, Juan, Junping Yin, and Ruili Wang. "Basic Framework and Main Methods of Uncertainty Quantification." Mathematical Problems in Engineering 2020 (August 31, 2020): 1–18. http://dx.doi.org/10.1155/2020/6068203.

Full text
Abstract:
Since 2000, the research of uncertainty quantification (UQ) has been successfully applied in many fields and has been highly valued and strongly supported by academia and industry. This review firstly discusses the sources and the types of uncertainties and gives an overall discussion on the goal, practical significance, and basic framework of the research of UQ. Then, the core ideas and typical methods of several important UQ processes are introduced, including sensitivity analysis, uncertainty propagation, model calibration, Bayesian inference, experimental design, surrogate model, and model uncertainty analysis.
APA, Harvard, Vancouver, ISO, and other styles
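Of the UQ processes this review enumerates, forward uncertainty propagation is the simplest to sketch: describe input uncertainty with probability distributions and push samples through the forward model. The toy model and distributions below are my own illustration, not taken from the review.

```python
import numpy as np

rng = np.random.default_rng(42)

def deflection(load, stiffness):
    # Toy forward model: response of a linear spring
    return load / stiffness

# Represent input uncertainty with distributions ...
load = rng.normal(10.0, 1.0, size=100_000)         # N(10, 1)
stiffness = rng.lognormal(0.0, 0.1, size=100_000)  # lognormal, median 1

# ... and propagate it through the model by Monte Carlo sampling.
out = deflection(load, stiffness)
out_mean, out_std = out.mean(), out.std()
```

The output spread quantifies how input uncertainty maps to prediction uncertainty; the review's other processes (sensitivity analysis, calibration, surrogate modeling) refine or accelerate exactly this loop.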
4

DeVolder, B., J. Glimm, J. W. Grove, Y. Kang, Y. Lee, K. Pao, D. H. Sharp, and K. Ye. "Uncertainty Quantification for Multiscale Simulations." Journal of Fluids Engineering 124, no. 1 (November 12, 2001): 29–41. http://dx.doi.org/10.1115/1.1445139.

Full text
Abstract:
A general discussion of the quantification of uncertainty in numerical simulations is presented. A principal conclusion is that the distribution of solution errors is the leading term in the assessment of the validity of a simulation and its associated uncertainty in the Bayesian framework. Key issues that arise in uncertainty quantification are discussed for two examples drawn from shock wave physics and modeling of petroleum reservoirs. Solution error models, confidence intervals and Gaussian error statistics based on simulation studies are presented.
APA, Harvard, Vancouver, ISO, and other styles
5

Mirzayeva, A., N. A. Slavinskaya, M. Abbasi, J. H. Starcke, W. Li, and M. Frenklach. "Uncertainty Quantification in Chemical Modeling." Eurasian Chemico-Technological Journal 20, no. 1 (March 31, 2018): 33. http://dx.doi.org/10.18321/ectj706.

Full text
Abstract:
A module of the PrIMe automated data-centric infrastructure, Bound-to-Bound Data Collaboration (B2BDC), was used for the analysis of systematic uncertainty and data consistency of the H2/CO reaction model (73/17). To this end, a dataset of 167 experimental targets (ignition delay time and laminar flame speed) and 55 active model parameters (pre-exponential factors in the Arrhenius form of the reaction rate coefficients) was constructed. Consistency analysis of the experimental data in the composed dataset revealed disagreement between models and data. Two consistency measures were applied to assess the quality of the experimental targets (Quantities of Interest, QoI): the scalar consistency measure, which quantifies the tightening index of the constraints while still ensuring the existence of a set of model parameter values whose associated modeling output predicts the experimental QoIs within the uncertainty bounds; and a newly developed method of computing the vector consistency measure (VCM), which determines the minimal bound changes for QoIs initially identified as inconsistent, each bound by its own extent, under the same existence requirement. The consistency analysis suggested that elimination of 45 experimental targets, 8 of which were self-inconsistent, would lead to a consistent dataset. After that, the feasible parameter set was constructed by decreasing the uncertainty of several reaction rate coefficients. This dataset was then subjected to model optimization and analysis within the B2BDC framework. Four methods of parameter optimization were applied, including those unique to the B2BDC framework. The optimized models showed improved agreement with experimental values compared to the initially assembled model. Moreover, predictions for experiments not included in the initial dataset were investigated.
The results demonstrate the benefits of applying the B2BDC methodology to the development of predictive kinetic models.
APA, Harvard, Vancouver, ISO, and other styles
6

Neal, Douglas R., Andrea Sciacchitano, Barton L. Smith, and Fulvio Scarano. "Collaborative framework for PIV uncertainty quantification: the experimental database." Measurement Science and Technology 26, no. 7 (June 5, 2015): 074003. http://dx.doi.org/10.1088/0957-0233/26/7/074003.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Rasheed, Muhibur, Nathan Clement, Abhishek Bhowmick, and Chandrajit L. Bajaj. "Statistical Framework for Uncertainty Quantification in Computational Molecular Modeling." IEEE/ACM Transactions on Computational Biology and Bioinformatics 16, no. 4 (July 1, 2019): 1154–67. http://dx.doi.org/10.1109/tcbb.2017.2771240.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Westover, M. Brandon, Nathaniel A. Eiseman, Sydney S. Cash, and Matt T. Bianchi. "Information Theoretic Quantification of Diagnostic Uncertainty." Open Medical Informatics Journal 6, no. 1 (December 14, 2012): 36–50. http://dx.doi.org/10.2174/1874431101206010036.

Full text
Abstract:
Diagnostic test interpretation remains a challenge in clinical practice. Most physicians receive training in the use of Bayes' rule, which specifies how the sensitivity and specificity of a test for a given disease combine with the pre-test probability to quantify the change in disease probability incurred by a new test result. However, multiple studies demonstrate physicians' deficiencies in probabilistic reasoning, especially with unexpected test results. Information theory, a branch of probability theory dealing explicitly with the quantification of uncertainty, has been proposed as an alternative framework for diagnostic test interpretation, but is even less familiar to physicians. We have previously addressed one key challenge in the practical application of Bayes' theorem: the handling of uncertainty in the critical first step of estimating the pre-test probability of disease. This essay aims to present the essential concepts of information theory to physicians in an accessible manner, and to extend previous work regarding uncertainty in pre-test probability estimation by placing this type of uncertainty within a principled information theoretic framework. We address several obstacles hindering physicians' application of information theoretic concepts to diagnostic test interpretation. These include issues of terminology (mathematical meanings of certain information theoretic terms differ from clinical or common parlance) as well as the underlying mathematical assumptions. Finally, we illustrate how, in information theoretic terms, one can understand the effect on diagnostic uncertainty of considering ranges instead of simple point estimates of pre-test probability.
APA, Harvard, Vancouver, ISO, and other styles
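The interplay of Bayes' rule and information-theoretic uncertainty discussed in the abstract above can be sketched for a binary test. The sensitivity, specificity, and pre-test probability below are invented illustrative numbers, not values from the paper.

```python
import math

def post_test_prob(pretest, sens, spec, positive=True):
    """Bayes' rule for a binary diagnostic test result."""
    if positive:
        true_pos = sens * pretest
        return true_pos / (true_pos + (1.0 - spec) * (1.0 - pretest))
    false_neg = (1.0 - sens) * pretest
    return false_neg / (false_neg + spec * (1.0 - pretest))

def binary_entropy(p):
    """Diagnostic uncertainty in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1.0 - p) * math.log2(1.0 - p))

pre = 0.10                                       # pre-test probability
post = post_test_prob(pre, sens=0.90, spec=0.95)  # positive result
# The positive test moves the probability from 0.10 to about 0.67, yet the
# entropy rises from ~0.47 to ~0.92 bits: an individual result can
# increase diagnostic uncertainty even though tests are informative on
# average, illustrating the "unexpected test result" problem.
```

This is exactly the kind of counterintuitive behavior an information-theoretic framing makes explicit, where naive intuition expects every test to reduce uncertainty.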
9

Yin, Zhen, Sebastien Strebelle, and Jef Caers. "Automated Monte Carlo-based quantification and updating of geological uncertainty with borehole data (AutoBEL v1.0)." Geoscientific Model Development 13, no. 2 (February 19, 2020): 651–72. http://dx.doi.org/10.5194/gmd-13-651-2020.

Full text
Abstract:
Geological uncertainty quantification is critical to subsurface modeling and prediction, such as for groundwater, oil or gas, and geothermal resources, and needs to be continuously updated with new data. We provide an automated method for uncertainty quantification and the updating of geological models using borehole data for subsurface developments within a Bayesian framework. Our methodologies are developed with the Bayesian evidential learning protocol for uncertainty quantification. Under this framework, newly acquired borehole data directly and jointly update geological models (structure, lithology, petrophysics, and fluids), globally and spatially, without time-consuming model rebuilding. An ensemble of prior geological models is first constructed by Monte Carlo simulation from the prior distribution. Once the prior model is tested by means of a falsification process, a sequential direct forecasting is designed to perform the joint uncertainty quantification. Direct forecasting is a statistical learning method that learns from a series of bijective operations to establish "Bayes–linear-Gauss" statistical relationships between model and data variables. Such statistical relationships, once conditioned to actual borehole measurements, allow for fast computation of posterior geological models. The proposed framework is completely automated in an open-source project. We demonstrate its application by applying it to a generic gas reservoir dataset. The posterior results show significant uncertainty reduction in both the spatial geological model and the gas volume prediction and cannot be falsified by new borehole observations. Furthermore, our automated framework completes the entire uncertainty quantification process efficiently for such large models.
APA, Harvard, Vancouver, ISO, and other styles
10

Narayan, Akil, and Dongbin Xiu. "Distributional Sensitivity for Uncertainty Quantification." Communications in Computational Physics 10, no. 1 (July 2011): 140–60. http://dx.doi.org/10.4208/cicp.160210.300710a.

Full text
Abstract:
In this work we consider a general notion of distributional sensitivity, which measures the variation in solutions of a given physical/mathematical system with respect to the variation of the probability distribution of the inputs. This is distinctively different from classical sensitivity analysis, which studies the changes of solutions with respect to the values of the inputs. The general idea is the measurement of the sensitivity of outputs with respect to probability distributions, which is a well-studied concept in related disciplines. We adapt these ideas to present a quantitative framework in the context of uncertainty quantification for measuring this kind of sensitivity, and a set of efficient algorithms to approximate the distributional sensitivity numerically. A remarkable feature of the algorithms is that they incur no computational effort beyond a one-time stochastic solve. Therefore, an accurate stochastic computation with respect to a prior input distribution is needed only once, and the ensuing distributional sensitivity computation for different input distributions is a post-processing step. We prove that an accurate numerical model leads to accurate calculations of this sensitivity, which applies not just to slowly converging Monte Carlo estimates but also to exponentially convergent spectral approximations. We provide computational examples to demonstrate the ease of applicability and verify the convergence claims.
APA, Harvard, Vancouver, ISO, and other styles
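The "one-time solve, then post-process" property described in the abstract above can be mimicked with likelihood-ratio (importance) reweighting: run the solver once under a prior input distribution, then estimate outputs under a perturbed distribution purely by reweighting the stored samples. The toy solver and the particular distribution shift below are my own illustration, not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

def solver(x):
    # Stand-in for an expensive stochastic solver, executed only once
    return x ** 2 + 0.5 * x

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# One-time Monte Carlo run under the prior input distribution N(0, 1)
xs = rng.standard_normal(200_000)
ys = solver(xs)

# Post-processing: output mean under a perturbed input N(0.2, 1),
# obtained by reweighting the stored samples -- no new solver runs.
w = normal_pdf(xs, 0.2, 1.0) / normal_pdf(xs, 0.0, 1.0)
mean_perturbed = np.mean(w * ys)   # analytic value for this toy model: 1.14
```

Sweeping the perturbed distribution's parameters and differentiating the reweighted estimate gives a numerical handle on distributional sensitivity, at pure post-processing cost.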

Dissertations / Theses on the topic "Uncertainty quantification framework"

1

Ricciardi, Denielle E. "Uncertainty Quantification and Propagation in Materials Modeling Using a Bayesian Inferential Framework." The Ohio State University, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=osu1587473424147276.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Ogele, Chile. "Integration and quantification of uncertainty of volumetric and material balance analyses using a Bayesian framework." Texas A&M University, 2005. http://hdl.handle.net/1969.1/2621.

Full text
Abstract:
Estimating original hydrocarbons in place (OHIP) in a reservoir is fundamentally important to estimating reserves and potential profitability. Quantifying the uncertainties in OHIP estimates can improve reservoir development and investment decision-making for individual reservoirs and can lead to improved portfolio performance. Two traditional methods for estimating OHIP are volumetric and material balance methods. Probabilistic estimates of OHIP are commonly generated prior to significant production from a reservoir by combining volumetric analysis with Monte Carlo methods. Material balance is routinely used to analyze reservoir performance and estimate OHIP. Although material balance has uncertainties due to errors in pressure and other parameters, probabilistic estimates are seldom done. In this thesis I use a Bayesian formulation to integrate volumetric and material balance analyses and to quantify uncertainty in the combined OHIP estimates. Specifically, I apply Bayes' rule to the Havlena and Odeh material balance equation to estimate original oil in place, N, and relative gas-cap size, m, for a gas-cap drive oil reservoir. The thesis considers uncertainty and correlation in the volumetric estimates of N and m (reflected in the prior probability distribution), as well as uncertainty in the pressure data (reflected in the likelihood distribution). Approximation of the covariance of the posterior distribution allows quantification of uncertainty in the estimates of N and m resulting from the combined volumetric and material balance analyses. Several example applications illustrate the value of this integrated approach. Material balance data reduce the uncertainty in the volumetric estimate, and the volumetric data reduce the considerable non-uniqueness of the material balance solution, resulting in more accurate OHIP estimates than from the separate analyses.
One of the advantages over reservoir simulation is that, with the smaller number of parameters in this approach, we can easily sample the entire posterior distribution, resulting in more complete quantification of uncertainty. The approach can also detect underestimation of uncertainty in either the volumetric data or the material balance data, indicated by insufficient overlap of the prior and likelihood distributions. When this occurs, the volumetric and material balance analyses should be revisited and the uncertainties of each reevaluated.
APA, Harvard, Vancouver, ISO, and other styles
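The way material-balance data tighten a volumetric prior (and vice versa) can be sketched in the simplest conjugate setting: if both information sources about original oil in place are treated as Gaussian, their precisions add. This is a one-dimensional caricature of the thesis' bivariate (N, m) analysis, and the numbers are invented.

```python
def gaussian_update(mu_prior, var_prior, mu_data, var_data):
    """Combine two Gaussian information sources; precisions add."""
    var_post = 1.0 / (1.0 / var_prior + 1.0 / var_data)
    mu_post = var_post * (mu_prior / var_prior + mu_data / var_data)
    return mu_post, var_post

# Hypothetical volumetric prior: N ~ 100 +/- 20; material balance: 120 +/- 15
mu, var = gaussian_update(100.0, 20.0 ** 2, 120.0, 15.0 ** 2)
# The posterior spread is smaller than either source alone; poor overlap of
# the two distributions would instead flag underestimated uncertainty.
```

Here the posterior standard deviation is 12, below both the 20 of the prior and the 15 of the data, mirroring the thesis' finding that the combined analysis is less uncertain than either separate one.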
3

Aprilia, Asti Wulandari. "Uncertainty quantification of volumetric and material balance analysis of gas reservoirs with water influx using a Bayesian framework." Texas A&M University, 2005. http://hdl.handle.net/1969.1/4998.

Full text
Abstract:
Accurately estimating hydrocarbon reserves is important, because it affects every phase of the oil and gas business. Unfortunately, reserves estimation is always uncertain, since perfect information is seldom available from the reservoir, and uncertainty can complicate the decision-making process. Many important decisions have to be made without knowing exactly what the ultimate outcome will be from a decision made today. Thus, quantifying the uncertainty is extremely important. Two methods for estimating original hydrocarbons in place (OHIP) are volumetric and material balance methods. The volumetric method is convenient to calculate OHIP during the early development period, while the material balance method can be used later, after performance data, such as pressure and production data, are available. In this work, I propose a methodology for using a Bayesian approach to quantify the uncertainty of original gas in place (G), aquifer productivity index (J), and the volume of the aquifer (Wi) as a result of combining volumetric and material balance analysis in a water-driven gas reservoir. The results show that we potentially have significant non-uniqueness (i.e., large uncertainty) when we consider only volumetric analyses or material balance analyses. By combining the results from both analyses, the non-uniqueness can be reduced, resulting in OGIP and aquifer parameter estimates with lower uncertainty. By understanding the uncertainty, we can expect better management decision making.
APA, Harvard, Vancouver, ISO, and other styles
4

Wu, Sichao. "Computational Framework for Uncertainty Quantification, Sensitivity Analysis and Experimental Design of Network-based Computer Simulation Models." Diss., Virginia Tech, 2017. http://hdl.handle.net/10919/78764.

Full text
Abstract:
When capturing a real-world, networked system using a simulation model, features are usually omitted or represented by probability distributions. Verification and validation (V and V) of such models is an inherent and fundamental challenge. Central to V and V, but also to model analysis and prediction, are uncertainty quantification (UQ), sensitivity analysis (SA) and design of experiments (DOE). In addition, network-based computer simulation models, as compared with models based on ordinary and partial differential equations (ODE and PDE), typically involve a significantly larger volume of more complex data. Efficient use of such models is challenging since it requires a broad set of skills ranging from domain expertise to in-depth knowledge of modeling, programming, algorithmics, high-performance computing, statistical analysis, and optimization. On top of this, the need to support reproducible experiments necessitates complete data tracking and management. Finally, the lack of standardization of simulation model configuration formats presents an extra challenge when developing technology intended to work across models. While there are tools and frameworks that address parts of the challenges above, to the best of our knowledge, none of them accomplishes all this in a model-independent and scientifically reproducible manner. In this dissertation, we present a computational framework called GENEUS that addresses these challenges. Specifically, it incorporates (i) a standardized model configuration format, (ii) a data flow management system with digital library functions helping to ensure scientific reproducibility, and (iii) a model-independent, expandable plugin-type library for efficiently conducting UQ/SA/DOE for network-based simulation models.
This framework has been applied to systems ranging from fundamental graph dynamical systems (GDSs) to large-scale socio-technical simulation models with a broad range of analyses such as UQ and parameter studies for various scenarios. Graph dynamical systems provide a theoretical framework for network-based simulation models and have been studied theoretically in this dissertation. This includes a broad range of stability and sensitivity analyses offering insights into how GDSs respond to perturbations of their key components. This stability-focused, structure-to-function theory was a motivator for the design and implementation of GENEUS. GENEUS, rooted in the framework of GDS, provides modelers, experimentalists, and research groups access to a variety of UQ/SA/DOE methods with robust and tested implementations without requiring them to necessarily have the detailed expertise in statistics, data management and computing. Even for research teams having all the skills, GENEUS can significantly increase research productivity.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
5

Wang, Jianxun. "Physics-Informed, Data-Driven Framework for Model-Form Uncertainty Estimation and Reduction in RANS Simulations." Diss., Virginia Tech, 2017. http://hdl.handle.net/10919/77035.

Full text
Abstract:
Computational fluid dynamics (CFD) has been widely used to simulate turbulent flows. Although an increased availability of computational resources has enabled high-fidelity simulations (e.g., large eddy simulation and direct numerical simulation) of turbulent flows, models based on the Reynolds-averaged Navier-Stokes (RANS) equations are still the dominant tools for industrial applications. However, the predictive capability of RANS models is limited by potential inaccuracies driven by hypotheses in the Reynolds stress closure. With the ever-increasing use of RANS simulations in mission-critical applications, the estimation and reduction of model-form uncertainties in RANS models have attracted attention in the turbulence modeling community. In this work, I focus on estimating uncertainties stemming from the RANS turbulence closure and calibrating discrepancies in the modeled Reynolds stresses to improve the predictive capability of RANS models. Both on-line and off-line data are utilized to achieve this goal. The main contributions of this dissertation can be summarized as follows: First, a physics-based, data-driven Bayesian framework is developed for estimating and reducing model-form uncertainties in RANS simulations. An iterative ensemble Kalman method is employed to assimilate sparse on-line measurement data and empirical prior knowledge for a full-field inversion. The merits of incorporating prior knowledge and physical constraints in calibrating RANS model discrepancies are demonstrated and discussed. Second, a random matrix theoretic framework is proposed for estimating model-form uncertainties in RANS simulations. The maximum entropy principle is employed to identify the probability distribution that satisfies given constraints without introducing artificial information. Objective prior perturbations of RANS-predicted Reynolds stresses in physical projections are provided based on comparisons between physics-based and random matrix theoretic approaches.
Finally, a physics-informed machine learning framework for predictive RANS turbulence modeling is proposed. The functional forms of model discrepancies with respect to mean flow features are extracted from an off-line database of closely related flows using machine learning algorithms. The RANS-modeled Reynolds stresses of prediction flows can be significantly improved by the trained discrepancy function, which is an important step towards predictive turbulence modeling.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
6

Loughnane, Gregory Thomas. "A Framework for Uncertainty Quantification in Microstructural Characterization with Application to Additive Manufacturing of Ti-6Al-4V." Wright State University / OhioLINK, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=wright1441064431.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Huang, Chao-Min. "Robust Design Framework for Automating Multi-component DNA Origami Structures with Experimental and MD coarse-grained Model Validation." The Ohio State University, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=osu159051496861178.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Janya-anurak, Chettapong [Verfasser]. "Framework for Analysis and Identification of Nonlinear Distributed Parameter Systems using Bayesian Uncertainty Quantification based on Generalized Polynomial Chaos / Chettapong Janya-anurak." Karlsruhe : KIT Scientific Publishing, 2017. http://www.ksp.kit.edu.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Cioaca, Alexandru George. "A Computational Framework for Assessing and Optimizing the Performance of Observational Networks in 4D-Var Data Assimilation." Diss., Virginia Tech, 2013. http://hdl.handle.net/10919/51795.

Full text
Abstract:
A deep scientific understanding of complex physical systems, such as the atmosphere, can be achieved neither by direct measurements nor by numerical simulations alone. Data assimilation is a rigorous procedure to fuse information from a priori knowledge of the system state, the physical laws governing the evolution of the system, and real measurements, all with associated error statistics. Data assimilation produces best (a posteriori) estimates of model states and parameter values, and results in considerably improved computer simulations. The acquisition and use of observations in data assimilation raises several important scientific questions related to optimal sensor network design, quantification of data impact, pruning redundant data, and identifying the most beneficial additional observations. These questions originate in operational data assimilation practice, and have started to attract considerable interest in the recent past. This dissertation advances the state of knowledge in four-dimensional variational (4D-Var) data assimilation by developing, implementing, and validating a novel computational framework for estimating observation impact and for optimizing sensor networks. The framework builds on the powerful methodologies of second-order adjoint modeling and the 4D-Var sensitivity equations. Efficient computational approaches for quantifying the observation impact include matrix-free linear algebra algorithms and low-rank approximations of the sensitivities to observations. The sensor network configuration problem is formulated as a meta-optimization problem. Best values for parameters such as sensor location are obtained by optimizing a performance criterion, subject to the constraint posed by the 4D-Var optimization. Tractable computational solutions to this "optimization-constrained" optimization problem are provided.
The results of this work can be directly applied to the deployment of intelligent sensors and adaptive observations, as well as to reducing the operating costs of measuring networks, while preserving their ability to capture the essential features of the system under consideration.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Uncertainty quantification framework"

1

Sanderson, Benjamin Mark. Uncertainty Quantification in Multi-Model Ensembles. Oxford University Press, 2018. http://dx.doi.org/10.1093/acrefore/9780190228620.013.707.

Full text
Abstract:
Long-term planning for many sectors of society—including infrastructure, human health, agriculture, food security, water supply, insurance, conflict, and migration—requires an assessment of the range of possible futures which the planet might experience. Unlike short-term forecasts, for which validation data exist for comparing forecast to observation, long-term forecasts have almost no validation data. As a result, researchers must rely on supporting evidence to make their projections. A review of methods for quantifying the uncertainty of climate predictions is given. The primary tools for quantifying these uncertainties are climate models, which attempt to model all the relevant processes that are important in climate change. However, neither the construction nor calibration of climate models is perfect, and therefore the uncertainties due to model errors must also be taken into account in the uncertainty quantification. Typically, prediction uncertainty is quantified by generating ensembles of solutions from climate models to span possible futures. For instance, initial condition uncertainty is quantified by generating an ensemble of initial states that are consistent with available observations and then integrating the climate model starting from each initial condition. A climate model is itself subject to uncertain choices in modeling certain physical processes. Some of these choices can be sampled using so-called perturbed physics ensembles, whereby uncertain parameters or structural switches are perturbed within a single climate model framework. For a variety of reasons, there is a strong reliance on so-called ensembles of opportunity, which are multi-model ensembles (MMEs) formed by collecting predictions from different climate modeling centers, each using a potentially different framework to represent relevant processes for climate change. The most extensive collection of these MMEs is associated with the Coupled Model Intercomparison Project (CMIP).
However, the component models have biases, simplifications, and interdependencies that must be taken into account when making formal risk assessments. Techniques and concepts for integrating model projections in MMEs are reviewed, including differing paradigms of ensembles and how they relate to observations and reality. Aspects of these conceptual issues then inform the more practical matters of how to combine and weight model projections to best represent the uncertainties associated with projected climate change.
APA, Harvard, Vancouver, ISO, and other styles
2

Franklin, James. Pre-history of Probability. Edited by Alan Hájek and Christopher Hitchcock. Oxford University Press, 2017. http://dx.doi.org/10.1093/oxfordhb/9780199607617.013.3.

Full text
Abstract:
The history of the evaluation of uncertain evidence before the quantification of probability in 1654 is a mass of examples relevant to current debates. They deal with matters that in general are as unquantified now as ever – the degree to which evidence supports theory, the strength and justification of inductive inferences, the weight of testimony, the combination of pieces of uncertain evidence, the price of risk, the philosophical nature of chance, and the problem of acting in case of doubt. Concepts similar to modern “proof beyond reasonable doubt” were developed especially in the legal theory of evidence. Moral theology discussed “probabilism”, the doctrine that one could follow a probable opinion in ethics even if the opposite was more probable. Philosophers understood the difficult problem of induction. Legal discussion of “aleatory contracts” such as insurance and games of chance developed the framework in which the quantification of probability eventually took place.
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Uncertainty quantification framework"

1

Schöbi, Roland, and Eleni Chatzi. "Maintenance Planning Under Uncertainties Using a Continuous-State POMDP Framework." In Model Validation and Uncertainty Quantification, Volume 3, 135–43. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-04552-8_13.

2

Atamturktur, S., and G. Stevens. "Validation of Strongly Coupled Models: A Framework for Resource Allocation." In Model Validation and Uncertainty Quantification, Volume 3, 25–32. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-04552-8_3.

3

Gomes, H. M., F. A. DiazDelaO, and J. E. Mottershead. "Inferring Structural Variability Using Modal Analysis in a Bayesian Framework." In Model Validation and Uncertainty Quantification, Volume 3, 363–73. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-04552-8_36.

4

Argyris, Costas, and Costas Papadimitriou. "A Bayesian Framework for Optimal Experimental Design in Structural Dynamics." In Model Validation and Uncertainty Quantification, Volume 3, 263–70. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-29754-5_26.

5

Jia, Xinyu, Omid Sedehi, Costas Papadimitriou, Lambros Katafygiotis, and Babak Moaveni. "Two-Stage Hierarchical Bayesian Framework for Finite Element Model Updating." In Model Validation and Uncertainty Quantification, Volume 3, 383–87. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-47638-0_42.

6

Markogiannaki, O., A. Arailopoulos, D. Giagopoulos, and C. Papadimitriou. "Vibration-Based Damage Detection Framework of Large-Scale Structural Systems." In Model Validation and Uncertainty Quantification, Volume 3, 179–86. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-77348-9_22.

7

Tosi, Riccardo, Marc Nuñez, Brendan Keith, Jordi Pons-Prats, Barbara Wohlmuth, and Riccardo Rossi. "Scalable Dynamic Asynchronous Monte Carlo Framework Applied to Wind Engineering Problems." In Advances in Uncertainty Quantification and Optimization Under Uncertainty with Aerospace Applications, 55–68. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-80542-5_4.

8

Edington, Lara J., Nikolaos Dervilis, Paul Gardner, and David J. Wagg. "An Initial Concept for an Error-Based Digital Twin Framework for Dynamics Applications." In Model Validation and Uncertainty Quantification, Volume 3, 81–89. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-77348-9_13.

9

Chadha, Mayank, Zhen Hu, Charles R. Farrar, and Michael D. Todd. "An Optimal Sensor Network Design Framework for Structural Health Monitoring Using Value of Information." In Model Validation and Uncertainty Quantification, Volume 3, 107–10. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-04090-0_12.

10

Bijak, Jakub, and Jason Hilton. "Uncertainty Quantification, Model Calibration and Sensitivity." In Towards Bayesian Model-Based Demography, 71–92. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-83039-7_5.

Abstract:
Better understanding of the behaviour of agent-based models, aimed at embedding them in the broader, model-based line of scientific enquiry, requires a comprehensive framework for analysing their results. Seeing models as tools for experimenting in silico, this chapter discusses the basic tenets and techniques of uncertainty quantification and experimental design, both of which can help shed light on the workings of complex systems embedded in computational models. In particular, we look at: relationships between model inputs and outputs, various types of experimental design, methods of analysis of simulation results, assessment of model uncertainty and sensitivity, which helps identify the parts of the model that matter in the experiments, as well as statistical tools for calibrating models to the available data. We focus on the role of emulators, or meta-models – high-level statistical models approximating the behaviour of the agent-based models under study – and in particular, on Gaussian processes (GPs). The theoretical discussion is illustrated by applications to the Routes and Rumours model of migrant route formation introduced before.
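The Gaussian-process emulation central to this chapter can be illustrated in a few lines. The sketch below is not the authors' implementation; the RBF kernel, hyperparameters, and the sine "simulator" stand-in are invented for illustration:

```python
import numpy as np

def gp_emulator(X_train, y_train, X_test, length_scale=1.0, jitter=1e-6):
    """Minimal Gaussian-process emulator with an RBF kernel.
    Returns the posterior predictive mean and variance at X_test."""
    def rbf(A, B):
        return np.exp(-0.5 * (A[:, None] - B[None, :]) ** 2 / length_scale ** 2)
    K = rbf(X_train, X_train) + jitter * np.eye(len(X_train))  # jitter for numerical stability
    Ks = rbf(X_train, X_test)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks.T @ alpha                                        # predictive mean
    cov = rbf(X_test, X_test) - Ks.T @ np.linalg.solve(K, Ks)  # predictive covariance
    return mean, np.diag(cov)

# "Expensive simulator" stand-in: emulate sin from six training runs
X = np.linspace(0.0, 3.0, 6)
mu, var = gp_emulator(X, np.sin(X), np.array([1.5]))
```

The predictive variance is what makes an emulator useful for uncertainty quantification: it flags inputs where the approximation of the underlying agent-based model is unreliable and more simulation runs are needed.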

Conference papers on the topic "Uncertainty quantification framework"

1

Marelli, Stefano, and Bruno Sudret. "UQLab: A Framework for Uncertainty Quantification in Matlab." In Second International Conference on Vulnerability and Risk Analysis and Management (ICVRAM) and the Sixth International Symposium on Uncertainty, Modeling, and Analysis (ISUMA). Reston, VA: American Society of Civil Engineers, 2014. http://dx.doi.org/10.1061/9780784413609.257.

2

Liang, Chen, and Sankaran Mahadevan. "Bayesian Framework for Multidisciplinary Uncertainty Quantification and Optimization." In 16th AIAA Non-Deterministic Approaches Conference. Reston, Virginia: American Institute of Aeronautics and Astronautics, 2014. http://dx.doi.org/10.2514/6.2014-1499.

3

Cruz-Lozano, Ricardo, Fisseha Alemayehu, and Stephen Ekwaro-Osire. "Quantification of Uncertainty in Sketches." In ASME 2014 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2014. http://dx.doi.org/10.1115/imece2014-39383.

Abstract:
Design Notebooks (DNBs) can be used to assess the information-gathering activities, creativity, and individual participation within design groups. Moreover, DNBs are the communication tools in the overall design process. For communication purposes, DNBs contain information representations (IRs) such as sketches, symbols, text, and equations that are usually imprecisely defined. Imprecise or vague IRs may lead to uncertainty in design communication. This work considered the uncertainty in sketches, one of the most widely used IRs in DNBs. The research question of this study is: can the uncertainty in sketches be quantified? To answer this question, the following specific aims are formulated: identify the type of uncertainty, assess appropriate uncertainty quantification methods, and establish a framework to quantify the uncertainty in sketches. The uncertainty in sketches is found to be mainly epistemic, and a modified numerical approach was implemented to quantify it. Using the established framework, the uncertainty in sketches has been quantified, and further study is recommended to assess its effect on design communication.
4

Ritto, Thiago, and Luiz Nunes. "Ranking Hyperelastic models for simple and pure shear at large deformation using the Bayesian Framework." In 3rd International Symposium on Uncertainty Quantification and Stochastic Modeling. Rio de Janeiro, Brazil: ABCM Brazilian Society of Mechanical Sciences and Engineering, 2015. http://dx.doi.org/10.20906/cps/usm-2016-0002.

5

Hu, Zhen, and Sankaran Mahadevan. "Bayesian Network Learning for Uncertainty Quantification." In ASME 2017 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2017. http://dx.doi.org/10.1115/detc2017-68187.

Abstract:
Bayesian Networks (BNs) have been studied in recent years for system diagnosis, reliability analysis, and design of complex engineered systems. In several practical applications, BNs need to be learned from available data before being used for design or other purposes. Current BN learning algorithms are mainly developed for networks with only discrete variables, whereas engineering design problems often consist of both discrete and continuous variables. This paper develops a framework to handle continuous variables in BN learning by integrating learning algorithms of discrete BNs with Gaussian mixture models (GMMs). We first make the topology learning more robust by optimizing the number of Gaussian components in the univariate GMMs currently available in the literature. Based on the BN topology learning, a new Multivariate Gaussian Mixture (MGM) strategy is developed to improve the accuracy of conditional probability learning in the BN. A method is proposed to address the difficulty of MGM modeling with mixed discrete and continuous data by mapping the data for discrete variables into data for a standard normal variable. The proposed framework is capable of learning BNs without discretizing the continuous variables or making assumptions about their conditional probability distributions (CPDs). The applications of the learned BN to uncertainty quantification and model calibration are also investigated. The results of a mathematical example and an engineering application example demonstrate the effectiveness of the proposed framework.
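The GMM building block of such a framework can be illustrated with a toy univariate expectation-maximisation fit. Everything here (function name, quantile-based initialisation, synthetic data) is an illustrative assumption, not code from the paper:

```python
import numpy as np

def fit_gmm_1d(x, k=2, iters=200):
    """Fit a univariate Gaussian mixture with k components by
    expectation-maximisation. Returns (weights, means, stds)."""
    x = np.asarray(x, dtype=float)
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))  # spread initial means over the data
    sigma = np.full(k, x.std())
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        dens = pi * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and standard deviations
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return pi, mu, sigma

# Synthetic data: two well-separated clusters around 0 and 5
rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(0.0, 0.5, 300), rng.normal(5.0, 0.5, 300)])
pi, mu, sigma = fit_gmm_1d(data, k=2)
```

Choosing the number of components k (here fixed at 2) is the model-selection step the paper optimizes as part of robust topology learning.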
6

Rathi, Amit Kumar, and Arunasis Chakraborty. "MLS BASED SEQUENTIAL SRSM IN SPARSE GRID FRAMEWORK FOR EFFICIENT UNCERTAINTY QUANTIFICATION." In 1st International Conference on Uncertainty Quantification in Computational Sciences and Engineering. Athens: Institute of Structural Analysis and Antiseismic Research School of Civil Engineering National Technical University of Athens (NTUA) Greece, 2017. http://dx.doi.org/10.7712/120217.5387.17110.

7

Wan, Zhiqiang, Jianbing Chen, Jie Li, and Michael Beer. "A PDEM-COM Framework for Quantification of Epistemic Uncertainty." In Proceedings of the 29th European Safety and Reliability Conference (ESREL). Singapore: Research Publishing Services, 2019. http://dx.doi.org/10.3850/978-981-11-2724-3_0969-cd.

8

Rasheed, Muhibur, Nathan Clement, Abhishek Bhowmick, and Chandrajit Bajaj. "Statistical Framework for Uncertainty Quantification in Computational Molecular Modeling." In BCB '16: ACM International Conference on Bioinformatics, Computational Biology, and Health Informatics. New York, NY, USA: ACM, 2016. http://dx.doi.org/10.1145/2975167.2975182.

9

Patelli, Edoardo, Diego A. Alvarez, Matteo Broggi, and Marco de Angelis. "An integrated and efficient numerical framework for uncertainty quantification: application to the NASA Langley multidisciplinary Uncertainty Quantification Challenge." In 16th AIAA Non-Deterministic Approaches Conference. Reston, Virginia: American Institute of Aeronautics and Astronautics, 2014. http://dx.doi.org/10.2514/6.2014-1501.

10

Xu, Meng, Chris J. Dent, and Amy Wilson. "Uncertainty quantification in power system reliability using a Bayesian framework." In 2016 International Conference on Probabilistic Methods Applied to Power Systems (PMAPS). IEEE, 2016. http://dx.doi.org/10.1109/pmaps.2016.7764187.


Reports on the topic "Uncertainty quantification framework"

1

Ye, Ming. Computational Bayesian Framework for Quantification and Reduction of Predictive Uncertainty in Subsurface Environmental Modeling. Office of Scientific and Technical Information (OSTI), January 2019. http://dx.doi.org/10.2172/1491235.

2

Glimm, James, Yunha Lee, Kenny Q. Ye, and David H. Sharp. Prediction Using Numerical Simulations, A Bayesian Framework for Uncertainty Quantification and its Statistical Challenge. Fort Belvoir, VA: Defense Technical Information Center, January 2002. http://dx.doi.org/10.21236/ada417842.

3

Sinclair, Samantha, and Sandra LeGrand. Reproducibility assessment and uncertainty quantification in subjective dust source mapping. Engineer Research and Development Center (U.S.), August 2021. http://dx.doi.org/10.21079/11681/41523.

Abstract:
Accurate dust-source characterizations are critical for effectively modeling dust storms. A previous study developed an approach to manually map dust plume-head point sources in a geographic information system (GIS) framework using Moderate Resolution Imaging Spectroradiometer (MODIS) imagery processed through dust-enhancement algorithms. With this technique, the location of a dust source is digitized and recorded if an analyst observes an unobscured plume head in the imagery. Because airborne dust must be sufficiently elevated for overland dust-enhancement algorithms to work, this technique may include up to 10 km in digitized dust-source location error due to downwind advection. However, the potential for error in this method due to analyst subjectivity has never been formally quantified. In this study, we evaluate a version of the methodology adapted to better enable reproducibility assessments amongst multiple analysts to determine the role of analyst subjectivity on recorded dust source location error. Four analysts individually mapped dust plumes in Southwest Asia and Northwest Africa using five years of MODIS imagery collected from 15 May to 31 August. A plume-source location is considered reproducible if the maximum distance between the analyst point-source markers for a single plume is ≤10 km. Results suggest analyst marker placement is reproducible; however, additional analyst subjectivity-induced error (7 km determined in this study) should be considered to fully characterize locational uncertainty. Additionally, most of the identified plume heads (> 90%) were not marked by all participating analysts, which indicates dust source maps generated using this technique may differ substantially between users.
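The report's ≤10 km reproducibility criterion amounts to a maximum pairwise great-circle distance check over the analysts' markers. A minimal sketch (the marker coordinates are hypothetical, not the study's data):

```python
import math
from itertools import combinations

def haversine_km(p, q):
    """Great-circle distance in km between two (lat, lon) points given in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(a))  # mean Earth radius in km

def is_reproducible(markers, threshold_km=10.0):
    """A plume-source location counts as reproducible when the maximum
    pairwise distance between analyst markers does not exceed threshold_km."""
    return max(haversine_km(p, q) for p, q in combinations(markers, 2)) <= threshold_km

# Hypothetical markers placed by four analysts for one plume head (lat, lon)
markers = [(25.00, 45.00), (25.02, 45.01), (24.99, 45.03), (25.01, 45.02)]
ok = is_reproducible(markers)
```

The study's further point is that even when this check passes, the subjectivity-induced error (about 7 km) should be added to the advection-related error when characterizing total locational uncertainty.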
4

Sinclair, Samantha, and Sandra LeGrand. Reproducibility assessment and uncertainty quantification in subjective dust source mapping. Engineer Research and Development Center (U.S.), August 2021. http://dx.doi.org/10.21079/11681/41542.

5

Adams, Brian M., Mohamed Salah Ebeida, Michael S. Eldred, John Davis Jakeman, Laura Painton Swiler, John Adam Stephens, Dena M. Vigil, et al. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis :. Office of Scientific and Technical Information (OSTI), May 2014. http://dx.doi.org/10.2172/1177077.

6

Swiler, Laura Painton, Jon C. Helton, Eduardo Basurto, Dusty Marie Brooks, Paul Mariner, Leslie Melissa Moore, Sitakanta Mohanty, Stephen David Sevougian, and Emily Stein. Status Report on Uncertainty Quantification and Sensitivity Analysis Tools in the Geologic Disposal Safety Assessment (GDSA) Framework. Office of Scientific and Technical Information (OSTI), November 2019. http://dx.doi.org/10.2172/1574263.

7

Eldred, Michael Scott, Dena M. Vigil, Keith R. Dalbey, William J. Bohnhoff, Brian M. Adams, Laura Painton Swiler, Sophia Lefantzi, Patricia Diane Hough, and John P. Eddy. DAKOTA : a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis. Office of Scientific and Technical Information (OSTI), December 2011. http://dx.doi.org/10.2172/1031910.

8

Gel, Aytekin, Yang Jiao, Heather Emady, and Charles Tong. MFIX-DEM Phi: Performance and Capability Improvements Towards Industrial Grade Open-source DEM Framework with Integrated Uncertainty Quantification. Office of Scientific and Technical Information (OSTI), May 2018. http://dx.doi.org/10.2172/1439328.

9

Tosi, R., R. Amela, M. Nuñez, R. Badia, C. Roig, R. Rossi, and C. Soriano. D1.2 First realease of the softwares. Scipedia, 2021. http://dx.doi.org/10.23967/exaqute.2021.2.011.

Abstract:
This deliverable presents the software release of the Kratos Multiphysics software [3], "a framework for building parallel, multi-disciplinary simulation software, aiming at modularity, extensibility, and high performance. Kratos is written in C++, and counts with an extensive Python interface". In this deliverable we focus on the development of uncertainty quantification inside Kratos, which takes place in the MultilevelMonteCarloApplication, a recent addition to the software that enables uncertainty quantification analyses.
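As a rough illustration of the multilevel Monte Carlo idea behind the MultilevelMonteCarloApplication (this sketch is not Kratos code; the toy quantity of interest and sample counts are invented):

```python
import numpy as np

def mlmc_estimate(q, levels, n_samples, rng):
    """Minimal multilevel Monte Carlo sketch. q(l, u) evaluates the
    level-l approximation of the quantity of interest on random input u;
    each correction uses the SAME u on levels l and l-1, so the
    differences have small variance and cheap coarse levels carry
    most of the sampling work."""
    est = 0.0
    for l, n in zip(levels, n_samples):
        u = rng.random(n)  # shared random inputs for this level
        fine = np.array([q(l, ui) for ui in u])
        if l == 0:
            corr = fine
        else:
            coarse = np.array([q(l - 1, ui) for ui in u])
            corr = fine - coarse  # level correction E[Q_l - Q_{l-1}]
        est += corr.mean()
    return est

# Toy example: level-l approximation of E[u^2] for u ~ U(0, 1), where
# level l quantises u to bins of width 2^-l (purely illustrative)
def q(l, u):
    h = 2.0 ** -l
    return (np.floor(u / h) * h + h / 2) ** 2

rng = np.random.default_rng(0)
estimate = mlmc_estimate(q, levels=[0, 1, 2, 3], n_samples=[4000, 2000, 1000, 500], rng=rng)
```

The telescoping sum of level corrections reproduces the fine-level expectation while concentrating samples where they are cheapest, which is the core efficiency argument for MLMC over plain Monte Carlo.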
10

ELDRED, MICHAEL S., ANTHONY A. GIUNTA, BART G. VAN BLOEMEN WAANDERS, STEVEN F. WOJTKIEWICZ, JR, WILLIAM E. HART, and MARIO ALLEVA. DAKOTA, A Multilevel Parallel Object-Oriented Framework for Design Optimization, Parameter Estimation, Uncertainty Quantification, and Sensitivity Analysis Version 3.0. Office of Scientific and Technical Information (OSTI), April 2002. http://dx.doi.org/10.2172/800774.
