Journal articles on the topic 'Uncertainty quantification framework'

Consult the top 50 journal articles for your research on the topic 'Uncertainty quantification framework.'

1

Verdonck, H., O. Hach, J. D. Polman, O. Braun, C. Balzani, S. Müller, and J. Rieke. "An open-source framework for the uncertainty quantification of aeroelastic wind turbine simulation tools." Journal of Physics: Conference Series 2265, no. 4 (May 1, 2022): 042039. http://dx.doi.org/10.1088/1742-6596/2265/4/042039.

Abstract:
The uncertainty quantification of aeroelastic wind turbine simulations is an active research topic. This paper presents a dedicated, open-source framework for this purpose. The framework is built around the uncertainpy package, likewise available as open source. Uncertainty quantification is done with a non-intrusive, global, variance-based surrogate model using polynomial chaos expansion (PCE). Two methods to handle the uncertain parameter distribution along the blades are presented. The framework is demonstrated on the basis of an aeroelastic stability analysis. A sensitivity analysis is performed on the influence of the flapwise, edgewise and torsional stiffness of the blades on the damping of the most critical mode for both a Bladed linearization and a Bladed time domain simulation. The sensitivities of both models are in excellent agreement and the PCE surrogate models are shown to be accurate approximations of the true models.
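For readers unfamiliar with the non-intrusive PCE workflow summarised above, the following is a minimal illustrative sketch, not taken from the paper: it assumes the chaospy package (which uncertainpy builds on) and a toy damping function standing in for the aeroelastic solver.

```python
import numpy as np
import chaospy as cp

# Hypothetical uncertain inputs: scaling factors on the blade stiffnesses
flap, edge, torsion = cp.Normal(1.0, 0.05), cp.Normal(1.0, 0.05), cp.Normal(1.0, 0.05)
joint = cp.J(flap, edge, torsion)

def damping_model(f, e, t):
    # Toy stand-in for the aeroelastic solver: damping of a critical mode
    return 0.02 + 0.010 * f - 0.004 * e + 0.002 * t

# Non-intrusive PCE: sample the inputs, evaluate the model, fit the expansion
expansion = cp.generate_expansion(2, joint)        # 2nd-order polynomial basis
samples = joint.sample(200, rule="sobol")
evaluations = damping_model(*samples)
surrogate = cp.fit_regression(expansion, samples, evaluations)

print("mean damping:", cp.E(surrogate, joint))
print("std  damping:", cp.Std(surrogate, joint))
print("first-order Sobol indices:", cp.Sens_m(surrogate, joint))  # variance-based sensitivities
```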
2

Wang, Jiajia, Hao Chen, Jing Ma, and Tong Zhang. "Research on application method of uncertainty quantification technology in equipment test identification." MATEC Web of Conferences 336 (2021): 02026. http://dx.doi.org/10.1051/matecconf/202133602026.

Abstract:
This paper introduces the concepts of equipment test qualification and uncertainty quantification, and the analysis framework and process of equipment test uncertainty quantification. It analyzes the data uncertainty, model uncertainty and environmental uncertainty, and studies the corresponding uncertainty quantification theory to provide technical reference for the application of uncertainty quantification technology in the field of test identification.
3

Zhang, Juan, Junping Yin, and Ruili Wang. "Basic Framework and Main Methods of Uncertainty Quantification." Mathematical Problems in Engineering 2020 (August 31, 2020): 1–18. http://dx.doi.org/10.1155/2020/6068203.

Abstract:
Since 2000, the research of uncertainty quantification (UQ) has been successfully applied in many fields and has been highly valued and strongly supported by academia and industry. This review firstly discusses the sources and the types of uncertainties and gives an overall discussion on the goal, practical significance, and basic framework of the research of UQ. Then, the core ideas and typical methods of several important UQ processes are introduced, including sensitivity analysis, uncertainty propagation, model calibration, Bayesian inference, experimental design, surrogate model, and model uncertainty analysis.
4

DeVolder, B., J. Glimm, J. W. Grove, Y. Kang, Y. Lee, K. Pao, D. H. Sharp, and K. Ye. "Uncertainty Quantification for Multiscale Simulations." Journal of Fluids Engineering 124, no. 1 (November 12, 2001): 29–41. http://dx.doi.org/10.1115/1.1445139.

Abstract:
A general discussion of the quantification of uncertainty in numerical simulations is presented. A principal conclusion is that the distribution of solution errors is the leading term in the assessment of the validity of a simulation and its associated uncertainty in the Bayesian framework. Key issues that arise in uncertainty quantification are discussed for two examples drawn from shock wave physics and modeling of petroleum reservoirs. Solution error models, confidence intervals and Gaussian error statistics based on simulation studies are presented.
5

Mirzayeva, A., N. A. Slavinskaya, M. Abbasi, J. H. Starcke, W. Li, and M. Frenklach. "Uncertainty Quantification in Chemical Modeling." Eurasian Chemico-Technological Journal 20, no. 1 (March 31, 2018): 33. http://dx.doi.org/10.18321/ectj706.

Abstract:
A module of the PrIMe automated data-centric infrastructure, Bound-to-Bound Data Collaboration (B2BDC), was used for the analysis of systematic uncertainty and data consistency of the H2/CO reaction model (73/17). In order to achieve this purpose, a dataset of 167 experimental targets (ignition delay time and laminar flame speed) and 55 active model parameters (pre-exponent factors in the Arrhenius form of the reaction rate coefficients) was constructed. Consistency analysis of experimental data from the composed dataset revealed disagreement between models and data. Two consistency measures were applied to identify the quality of experimental targets (Quantities of Interest, QoI): the scalar consistency measure, which quantifies the tightening index of the constraints while still ensuring the existence of a set of the model parameter values whose associated modeling output predicts the experimental QoIs within the uncertainty bounds; and a newly developed method of computing the vector consistency measure (VCM), which determines the minimal bound changes for QoIs initially identified as inconsistent, each bound by its own extent, while still ensuring the existence of a set of the model parameter values whose associated modeling output predicts the experimental QoIs within the uncertainty bounds. The consistency analysis suggested that elimination of 45 experimental targets, 8 of which were self-inconsistent, would lead to a consistent dataset. The feasible parameter set was then constructed by reducing the uncertainty of several reaction rate coefficients. This dataset was subjected to model optimization and analysis within the B2BDC framework. Four methods of parameter optimization were applied, including those unique to the B2BDC framework. The optimized models showed improved agreement with experimental values, as compared to the initially assembled model. Moreover, predictions for experiments not included in the initial dataset were investigated. The results demonstrate the benefits of applying the B2BDC methodology for the development of predictive kinetic models.
6

Neal, Douglas R., Andrea Sciacchitano, Barton L. Smith, and Fulvio Scarano. "Collaborative framework for PIV uncertainty quantification: the experimental database." Measurement Science and Technology 26, no. 7 (June 5, 2015): 074003. http://dx.doi.org/10.1088/0957-0233/26/7/074003.

7

Rasheed, Muhibur, Nathan Clement, Abhishek Bhowmick, and Chandrajit L. Bajaj. "Statistical Framework for Uncertainty Quantification in Computational Molecular Modeling." IEEE/ACM Transactions on Computational Biology and Bioinformatics 16, no. 4 (July 1, 2019): 1154–67. http://dx.doi.org/10.1109/tcbb.2017.2771240.

8

Westover, M. Brandon, Nathaniel A. Eiseman, Sydney S. Cash, and Matt T. Bianchi. "Information Theoretic Quantification of Diagnostic Uncertainty." Open Medical Informatics Journal 6, no. 1 (December 14, 2012): 36–50. http://dx.doi.org/10.2174/1874431101206010036.

Abstract:
Diagnostic test interpretation remains a challenge in clinical practice. Most physicians receive training in the use of Bayes’ rule, which specifies how the sensitivity and specificity of a test for a given disease combine with the pre-test probability to quantify the change in disease probability incurred by a new test result. However, multiple studies demonstrate physicians’ deficiencies in probabilistic reasoning, especially with unexpected test results. Information theory, a branch of probability theory dealing explicitly with the quantification of uncertainty, has been proposed as an alternative framework for diagnostic test interpretation, but is even less familiar to physicians. We have previously addressed one key challenge in the practical application of Bayes theorem: the handling of uncertainty in the critical first step of estimating the pre-test probability of disease. This essay aims to present the essential concepts of information theory to physicians in an accessible manner, and to extend previous work regarding uncertainty in pre-test probability estimation by placing this type of uncertainty within a principled information theoretic framework. We address several obstacles hindering physicians’ application of information theoretic concepts to diagnostic test interpretation. These include issues of terminology (mathematical meanings of certain information theoretic terms differ from clinical or common parlance) as well as the underlying mathematical assumptions. Finally, we illustrate how, in information theoretic terms, one can understand the effect on diagnostic uncertainty of considering ranges instead of simple point estimates of pre-test probability.
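As a concrete illustration of the Bayes' rule step the abstract refers to, here is a minimal sketch with hypothetical numbers; it is not drawn from the paper.

```python
def post_test_probability(pre_test_p, sensitivity, specificity, positive_result=True):
    """Update the probability of disease after a binary test result via Bayes' rule."""
    if positive_result:
        p_result_given_disease = sensitivity
        p_result_given_healthy = 1.0 - specificity
    else:
        p_result_given_disease = 1.0 - sensitivity
        p_result_given_healthy = specificity
    joint_disease = p_result_given_disease * pre_test_p
    joint_healthy = p_result_given_healthy * (1.0 - pre_test_p)
    return joint_disease / (joint_disease + joint_healthy)

# Example: 10% pre-test probability, 90% sensitive / 80% specific test, positive result
print(post_test_probability(0.10, 0.90, 0.80))  # ~0.33: far from certainty despite a positive test
```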
9

Yin, Zhen, Sebastien Strebelle, and Jef Caers. "Automated Monte Carlo-based quantification and updating of geological uncertainty with borehole data (AutoBEL v1.0)." Geoscientific Model Development 13, no. 2 (February 19, 2020): 651–72. http://dx.doi.org/10.5194/gmd-13-651-2020.

Abstract:
Geological uncertainty quantification is critical to subsurface modeling and prediction, such as for groundwater, oil or gas, and geothermal resources, and needs to be continuously updated with new data. We provide an automated method for uncertainty quantification and the updating of geological models using borehole data for subsurface developments within a Bayesian framework. Our methodologies are developed with the Bayesian evidential learning protocol for uncertainty quantification. Under such a framework, newly acquired borehole data directly and jointly update geological models (structure, lithology, petrophysics, and fluids), globally and spatially, without time-consuming model rebuilding. To address the above matters, an ensemble of prior geological models is first constructed by Monte Carlo simulation from the prior distribution. Once the prior model is tested by means of a falsification process, a sequential direct forecasting is designed to perform the joint uncertainty quantification. The direct forecasting is a statistical learning method that learns from a series of bijective operations to establish “Bayes–linear-Gauss” statistical relationships between model and data variables. Such statistical relationships, once conditioned to actual borehole measurements, allow for fast computation of posterior geological models. The proposed framework is completely automated in an open-source project. We demonstrate its application by applying it to a generic gas reservoir dataset. The posterior results show significant uncertainty reduction in both the spatial geological model and the gas volume prediction and cannot be falsified by new borehole observations. Furthermore, our automated framework completes the entire uncertainty quantification process efficiently for such large models.
10

Narayan, Akil, and Dongbin Xiu. "Distributional Sensitivity for Uncertainty Quantification." Communications in Computational Physics 10, no. 1 (July 2011): 140–60. http://dx.doi.org/10.4208/cicp.160210.300710a.

Abstract:
In this work we consider a general notion of distributional sensitivity, which measures the variation in solutions of a given physical/mathematical system with respect to the variation of the probability distribution of the inputs. This is distinctively different from the classical sensitivity analysis, which studies the changes of solutions with respect to the values of the inputs. The general idea is the measurement of sensitivity of outputs with respect to probability distributions, which is a well-studied concept in related disciplines. We adapt these ideas to present a quantitative framework in the context of uncertainty quantification for measuring this kind of sensitivity and a set of efficient algorithms to approximate the distributional sensitivity numerically. A remarkable feature of the algorithms is that they do not incur additional computational effort beyond a one-time stochastic solve. Therefore, an accurate stochastic computation with respect to a prior input distribution is needed only once, and the ensuing distributional sensitivity computation for different input distributions is a post-processing step. We prove that an accurate numerical model leads to accurate calculations of this sensitivity, which applies not just to slowly-converging Monte Carlo estimates, but also to exponentially convergent spectral approximations. We provide computational examples to demonstrate the ease of applicability and verify the convergence claims.
11

Ping, Menghao, Xinyu Jia, Costas Papadimitriou, Xu Han, and Chao Jiang. "Statistics-based Bayesian modeling framework for uncertainty quantification and propagation." Mechanical Systems and Signal Processing 174 (July 2022): 109102. http://dx.doi.org/10.1016/j.ymssp.2022.109102.

12

Xie, Wei, Cheng Li, Yuefeng Wu, and Pu Zhang. "A Nonparametric Bayesian Framework for Uncertainty Quantification in Stochastic Simulation." SIAM/ASA Journal on Uncertainty Quantification 9, no. 4 (January 2021): 1527–52. http://dx.doi.org/10.1137/20m1345517.

13

Saracco, P., and M. G. Pia. "An exact framework for uncertainty quantification in Monte Carlo simulation." Journal of Physics: Conference Series 513, no. 2 (June 11, 2014): 022033. http://dx.doi.org/10.1088/1742-6596/513/2/022033.

14

Sciacchitano, Andrea, Douglas R. Neal, Barton L. Smith, Scott O. Warner, Pavlos P. Vlachos, Bernhard Wieneke, and Fulvio Scarano. "Collaborative framework for PIV uncertainty quantification: comparative assessment of methods." Measurement Science and Technology 26, no. 7 (June 5, 2015): 074004. http://dx.doi.org/10.1088/0957-0233/26/7/074004.

15

Sarrafi, Aral, Zhu Mao, and Michael Shiao. "Uncertainty quantification framework for wavelet transformation of noise-contaminated signals." Measurement 137 (April 2019): 102–15. http://dx.doi.org/10.1016/j.measurement.2019.01.049.

16

Kotteda, V. M. Krushnarao, J. Adam Stephens, William Spotz, Vinod Kumar, and Anitha Kommu. "Uncertainty quantification of fluidized beds using a data-driven framework." Powder Technology 354 (September 2019): 709–18. http://dx.doi.org/10.1016/j.powtec.2019.06.021.

17

Gorodetsky, Alex A., Gianluca Geraci, Michael S. Eldred, and John D. Jakeman. "A generalized approximate control variate framework for multifidelity uncertainty quantification." Journal of Computational Physics 408 (May 2020): 109257. http://dx.doi.org/10.1016/j.jcp.2020.109257.

18

Hu, Mengqi, Yifei Lou, and Xiu Yang. "A General Framework of Rotational Sparse Approximation in Uncertainty Quantification." SIAM/ASA Journal on Uncertainty Quantification 10, no. 4 (October 27, 2022): 1410–34. http://dx.doi.org/10.1137/21m1391602.

19

Kostakis, Filippos, Bradley T. Mallison, and Louis J. Durlofsky. "Multifidelity framework for uncertainty quantification with multiple quantities of interest." Computational Geosciences 24, no. 2 (June 21, 2019): 761–73. http://dx.doi.org/10.1007/s10596-019-9825-1.

20

Shahane, Shantanu, Narayana Aluru, Placid Ferreira, Shiv G. Kapoor, and Surya Pratap Vanka. "Finite volume simulation framework for die casting with uncertainty quantification." Applied Mathematical Modelling 74 (October 2019): 132–50. http://dx.doi.org/10.1016/j.apm.2019.04.045.

21

Garg, Shailesh, and Souvik Chakraborty. "VB-DeepONet: A Bayesian operator learning framework for uncertainty quantification." Engineering Applications of Artificial Intelligence 118 (February 2023): 105685. http://dx.doi.org/10.1016/j.engappai.2022.105685.

22

Marepally, Koushik, Yong Su Jung, James Baeder, and Ganesh Vijayakumar. "Uncertainty quantification of wind turbine airfoil aerodynamics with geometric uncertainty." Journal of Physics: Conference Series 2265, no. 4 (May 1, 2022): 042041. http://dx.doi.org/10.1088/1742-6596/2265/4/042041.

Abstract:
An artificial neural network based reduced order model (ROM) is developed to predict the load coefficients and performance of wind turbine airfoils. The model is trained using a representative database of 972 wind turbine airfoil shapes generated by perturbing the design parameters in each of 12 baseline airfoils defining commercially relevant modern wind turbines. The predictions from our ROM show excellent agreement with the CFD data, with 99th-percentile maximum errors of 0.03 in lift coefficient, 2 in lift-to-drag ratio, and 0.002 in pitching moment coefficient. A Monte Carlo based uncertainty quantification (UQ) and global sensitivity analysis (GSA) framework is developed using this computationally economical ROM. Using UQ, we observed the stall behavior to be very sensitive to geometric uncertainty, with more than 10% deviation in lift coefficient associated with a 5% deviation in geometric features. Sobol analysis is used to identify the most influential geometric feature for stall behavior, which is concentrated at the maximum-thickness location on the airfoil suction surface.
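A rough sketch of a Monte Carlo/Sobol sensitivity step of the kind described above; it is not from the paper and assumes the SALib package, a toy surrogate in place of the neural-network ROM, and hypothetical geometric parameters.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

# Hypothetical geometric perturbations (e.g. +/-5% around a baseline airfoil)
problem = {
    "num_vars": 3,
    "names": ["thickness", "camber", "le_radius"],
    "bounds": [[-0.05, 0.05]] * 3,
}

def rom_lift(x):
    # Toy stand-in for the neural-network reduced order model
    t, c, r = x
    return 1.2 + 4.0 * c - 2.0 * t ** 2 + 0.5 * r

X = saltelli.sample(problem, 1024)            # Saltelli sampling for Sobol indices
Y = np.apply_along_axis(rom_lift, 1, X)
Si = sobol.analyze(problem, Y)
print("first-order indices:", Si["S1"])
print("total-order indices:", Si["ST"])
```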
23

Naozuka, Gustavo Taiji, Emanuelle Arantes Paixão, João Vitor Oliveira Silva, Maurício Pessoa da Cunha Menezes, and Regina Cerqueira Almeida. "Model Comparison and Uncertainty Quantification in Tumor Growth." Trends in Computational and Applied Mathematics 22, no. 3 (September 2, 2021): 495–514. http://dx.doi.org/10.5540/tcam.2021.022.03.00495.

Abstract:
Mathematical and computational modeling have been increasingly applied in many areas of cancer research, aiming to improve the understanding of tumorigenic mechanisms and to suggest more effective therapy protocols. The mathematical description of the tumor growth dynamics is often made using the exponential, logistic, and Gompertz models. However, recent literature has suggested that the Allee effect may play an important role in the early stages of tumor dynamics, including cancer relapse and metastasis. For a model to provide reliable predictions, it is necessary to have a rigorous evaluation of the uncertainty inherent in the modeling process. In this work, our main objective is to show how a model framework that integrates sensitivity analysis, model calibration, and model selection techniques can improve and systematically characterize model and data uncertainties. We investigate five distinct models with different complexities, which encompass the exponential, logistic, Gompertz, and weak and strong Allee effect dynamics. Using tumor growth data published in the literature, we perform a global sensitivity analysis, apply a Bayesian framework for parameter inference, evaluate the associated sensitivity matrices, and use different information criteria for model selection (first- and second-order Akaike Information Criteria and the Bayesian Information Criterion). We show that such a broad methodology provides a more detailed picture of each model's assumptions, uncertainties, and calibration reliability, ultimately improving the mathematical description of the tumor. The in vivo data used suggested the existence of both a competitive effect among tumor cells and a weak Allee effect in the growth dynamics. The proposed model framework highlights the need for more detailed experimental studies on the influence of the Allee effect on the analyzed cancer scenario.
24

Singh, Rishabh, and Jose C. Principe. "Toward a Kernel-Based Uncertainty Decomposition Framework for Data and Models." Neural Computation 33, no. 5 (April 13, 2021): 1164–98. http://dx.doi.org/10.1162/neco_a_01372.

Abstract:
This letter introduces a new framework for quantifying predictive uncertainty for both data and models that relies on projecting the data into a Gaussian reproducing kernel Hilbert space (RKHS) and transforming the data probability density function (PDF) in a way that quantifies the flow of its gradient as a topological potential field (quantified at all points in the sample space). This enables the decomposition of the PDF gradient flow by formulating it as a moment decomposition problem using operators from quantum physics, specifically Schrödinger's formulation. We experimentally show that the higher-order moments systematically cluster the different tail regions of the PDF, thereby providing unprecedented discriminative resolution of data regions having high epistemic uncertainty. In essence, this approach decomposes local realizations of the data PDF in terms of uncertainty moments. We apply this framework as a surrogate tool for predictive uncertainty quantification of point-prediction neural network models, overcoming various limitations of conventional Bayesian-based uncertainty quantification methods. Experimental comparisons with some established methods illustrate the performance advantages that our framework exhibits.
25

Xu, Ting. "Uncertainty, Ignorance and Decision-Making." Amicus Curiae 3, no. 1 (October 27, 2021): 10–32. http://dx.doi.org/10.14296/ac.v3i1.5350.

Abstract:
A great deal of decision-making during crises is about coping with uncertainty. For rulemakers, this poses a fundamental challenge, as there has been a lack of a rigorous framework for understanding and analysing the nature and function of uncertainty in the context of rulemaking. In coping with crises, modelling has become a governance tool to navigate and tame uncertainty and justify decisions. This is because models, in particular mathematical models, can be useful to produce precise answers in numbers. This article examines the challenges rulemakers are facing in an uncertain world and argues that one of the most important challenges lies in rulemakers’ failures to understand the nature of uncertainty and ignorance in the contested arena of science for decision-making. It focuses on the relationship between uncertainty, ignorance and decision-making through a case study of the interaction between modelling and rulemaking in the Covid-19 pandemic. In so doing, this article provides an alternative strategy to number- and model-based rulemaking in an uncertain world. It provokes a rethinking of using science to measure and govern human affairs and of the impact of numbers and quantification on law. Keywords: uncertainty; ignorance; decision-making; rulemaking; models; mathematical modelling; quantification; Covid-19.
26

Oh, Min-hwan, Peder Olsen, and Karthikeyan Natesan Ramamurthy. "Crowd Counting with Decomposed Uncertainty." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 07 (April 3, 2020): 11799–806. http://dx.doi.org/10.1609/aaai.v34i07.6852.

Abstract:
Research in neural networks in the field of computer vision has achieved remarkable accuracy for point estimation. However, the uncertainty in the estimation is rarely addressed. Uncertainty quantification accompanied by point estimation can lead to a more informed decision, and even improve the prediction quality. In this work, we focus on uncertainty estimation in the domain of crowd counting. With increasing occurrences of heavily crowded events such as political rallies, protests, concerts, etc., automated crowd analysis is becoming an increasingly crucial task. The stakes can be very high in many of these real-world applications. We propose a scalable neural network framework with quantification of decomposed uncertainty using a bootstrap ensemble. We demonstrate that the proposed uncertainty quantification method provides additional insight into the crowd counting problem and is simple to implement. We also show that our proposed method exhibits state-of-the-art performance on many benchmark crowd counting datasets.
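The bootstrap-ensemble idea can be summarised in a few lines. The sketch below is a generic illustration under our own assumptions (hypothetical predictors), not the authors' implementation.

```python
import numpy as np

def bootstrap_ensemble_predict(models, x):
    """Combine predictions of models trained on bootstrap resamples of the data.

    The spread across ensemble members approximates epistemic (model) uncertainty;
    the residual error of the mean predictor can be used for aleatoric uncertainty."""
    preds = np.stack([m(x) for m in models])   # shape: (n_models, ...)
    mean = preds.mean(axis=0)
    epistemic_std = preds.std(axis=0)
    return mean, epistemic_std

# Usage with hypothetical count predictors trained on resampled data
models = [lambda x, b=b: x.sum() * (1.0 + 0.01 * b) for b in range(10)]
mean, unc = bootstrap_ensemble_predict(models, np.ones((8, 8)))
print(mean, unc)
```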
27

Chen, Peng, and Nicholas Zabaras. "Adaptive Locally Weighted Projection Regression Method for Uncertainty Quantification." Communications in Computational Physics 14, no. 4 (October 2013): 851–78. http://dx.doi.org/10.4208/cicp.060712.281212a.

Abstract:
We develop an efficient, adaptive locally weighted projection regression (ALWPR) framework for uncertainty quantification (UQ) of systems governed by ordinary and partial differential equations. The algorithm adaptively selects the new input points with the largest predictive variance and decides when and where to add new local models. It effectively learns the local features and accurately quantifies the uncertainty in the prediction of the statistics. The developed methodology provides predictions and confidence intervals at any query input and can deal with multi-output cases. Numerical examples are presented to show the accuracy and efficiency of the ALWPR framework, including problems with non-smooth local features such as discontinuities in the stochastic space.
28

Kuhn, Thomas, Jakob Dürrwächter, Fabian Meyer, Andrea Beck, Christian Rohde, and Claus-Dieter Munz. "Uncertainty Quantification for Direct Aeroacoustic Simulations of Cavity Flows." Journal of Theoretical and Computational Acoustics 27, no. 01 (March 2019): 1850044. http://dx.doi.org/10.1142/s2591728518500445.

Abstract:
We investigate the influence of uncertain input parameters on the aeroacoustic feedback of cavity flows. The so-called Rossiter feedback requires a direct numerical computation of the acoustic noise, which solves hydrodynamics and acoustics simultaneously, in order to capture the interaction of acoustic waves and the hydrodynamics of the flow. Due to the large bandwidth of spatial and temporal scales, a high-order numerical scheme with low dissipation and dispersion error is necessary to preserve important small scale information. Therefore, the open-source CFD solver FLEXI, which is based on a high-order discontinuous Galerkin spectral element method, is used to perform the aforementioned direct simulations of an open cavity configuration with a laminar upstream boundary layer. To analyze the precision of the deterministic cavity simulation with respect to random input parameters, we establish a framework for uncertainty quantification (UQ). In particular, a nonintrusive spectral projection method with Legendre and Hermite polynomial basis functions is employed in order to treat uniform and normal probability distributions of the random input. The results indicate a strong, nonlinear dependency of the acoustic feedback mechanism on the investigated uncertain input parameters. An analysis of the stochastic results offers new insights into the noise generation process of open cavity flows and reveals the strength of the implemented UQ framework.
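A minimal sketch of non-intrusive spectral projection onto probabilists' Hermite polynomials for a single Gaussian input; this is our own toy illustration, not the UQ framework coupled to FLEXI in the paper.

```python
import numpy as np
from math import factorial

def nisp_hermite_coefficients(model, order=4, n_quad=12):
    """Project model(xi), xi ~ N(0,1), onto Hermite polynomials via Gauss quadrature."""
    nodes, weights = np.polynomial.hermite_e.hermegauss(n_quad)
    weights = weights / np.sqrt(2.0 * np.pi)      # normalise to the Gaussian measure
    values = model(nodes)
    coeffs = []
    for k in range(order + 1):
        He_k = np.polynomial.hermite_e.hermeval(nodes, [0.0] * k + [1.0])
        coeffs.append(np.sum(weights * values * He_k) / factorial(k))  # E[He_k^2] = k!
    return np.array(coeffs)

# Example: a smooth nonlinear response of one uncertain parameter
coeffs = nisp_hermite_coefficients(lambda xi: np.exp(0.3 * xi))
print(coeffs)  # coeffs[0] is the mean; variance = sum over k>=1 of coeffs[k]**2 * k!
```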
29

Danquah, Benedikt, Stefan Riedmaier, Yasin Meral, and Markus Lienkamp. "Statistical Validation Framework for Automotive Vehicle Simulations Using Uncertainty Learning." Applied Sciences 11, no. 5 (February 24, 2021): 1983. http://dx.doi.org/10.3390/app11051983.

Abstract:
The modelling and simulation process in the automotive domain is transforming. Increasing system complexity and variant diversity, especially in new electric powertrain systems, lead to complex, modular simulations that depend on virtual vehicle development, testing and approval. Consequently, the emerging key requirements for automotive validation involve a precise reliability quantification across a large application domain. Validation is unable to meet these requirements because its results provide little information, uncertainties are neglected, the model reliability cannot be easily extrapolated and the resulting application domain is small. In order to address these insufficiencies, this paper develops a statistical validation framework for dynamic systems with changing parameter configurations, thus enabling a flexible validation of complex total vehicle simulations including powertrain modelling. It uses non-deterministic models to consider input uncertainties, applies uncertainty learning to predict inherent model uncertainties and enables precise reliability quantification of arbitrary system parameter configurations to form a large application domain. The paper explains the framework with real-world data from a prototype electric vehicle on a dynamometer, validates it with additional tests and compares it to conventional validation methods. It is published as an open-source document. With the validation information from the framework and the knowledge deduced from the real-world problem, the paper solves its key requirements and offers recommendations on how to efficiently revise models with the framework’s validation results.
30

Talarico, Erick Costa e. Silva, Dario Grana, Leandro Passos de Figueiredo, and Sinesio Pesco. "Uncertainty quantification in seismic facies inversion." GEOPHYSICS 85, no. 4 (June 24, 2020): M43–M56. http://dx.doi.org/10.1190/geo2019-0392.1.

Abstract:
In seismic reservoir characterization, facies prediction from seismic data often is formulated as an inverse problem. However, the uncertainty in the parameters that control their spatial distributions usually is not investigated. In a probabilistic setting, the vertical distribution of facies often is described by statistical models, such as Markov chains. Assuming that the transition probabilities in the vertical direction are known, the most likely facies sequence and its uncertainty can be obtained by computing the posterior distribution of a Bayesian inverse problem conditioned by seismic data. Generally, the model hyperparameters such as the transition matrix are inferred from seismic data and nearby wells using a Bayesian inference framework. It is assumed that there is a unique set of hyperparameters that optimally fit the measurements. The novelty of the proposed work is to investigate the nonuniqueness of the transition matrix and show the multimodality of their distribution. We then generalize the Bayesian inversion approach based on Markov chain models by assuming that the hyperparameters, the facies prior proportions and transition matrix, are unknown and derive the full posterior distribution. Including all of the possible transition matrices in the inversion improves the uncertainty quantification of the predicted facies conditioned by seismic data. Our method is demonstrated on synthetic and real seismic data sets, and it has high relevance in exploration studies due to the limited number of well data and in geologic environments with rapid lateral variations of the facies vertical distribution.
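To illustrate the kind of hyperparameter uncertainty the abstract refers to, here is a small sketch (ours, not the authors') that samples facies transition matrices from a row-wise Dirichlet posterior given hypothetical transition counts.

```python
import numpy as np

def sample_transition_matrices(transition_counts, n_samples=1000, prior=1.0, seed=0):
    """Draw plausible facies transition matrices from a Dirichlet posterior.

    Each row is treated independently: posterior concentration equals the
    observed transition counts plus a symmetric Dirichlet prior."""
    rng = np.random.default_rng(seed)
    n_facies = transition_counts.shape[0]
    samples = np.empty((n_samples, n_facies, n_facies))
    for i in range(n_facies):
        samples[:, i, :] = rng.dirichlet(transition_counts[i] + prior, size=n_samples)
    return samples

# Hypothetical counts from well logs for three facies (shale, silt, sand)
counts = np.array([[30.0, 5.0, 2.0], [6.0, 20.0, 8.0], [1.0, 9.0, 25.0]])
T = sample_transition_matrices(counts)
print(T.mean(axis=0))   # posterior mean transition matrix
print(T.std(axis=0))    # its uncertainty
```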
31

Datar, Makarand, David Gorsich, David Lamb, and Dan Negrut. "A framework for terrain-induced uncertainty quantification in vehicle dynamics simulation." International Journal of Vehicle Systems Modelling and Testing 4, no. 4 (2009): 234. http://dx.doi.org/10.1504/ijvsmt.2009.032018.

32

Nagel, Joseph B., and Bruno Sudret. "A unified framework for multilevel uncertainty quantification in Bayesian inverse problems." Probabilistic Engineering Mechanics 43 (January 2016): 68–84. http://dx.doi.org/10.1016/j.probengmech.2015.09.007.

33

Hilton, Samuel, Federico Cairola, Alessandro Gardi, Roberto Sabatini, Nichakorn Pongsakornsathien, and Neta Ezer. "Uncertainty Quantification for Space Situational Awareness and Traffic Management." Sensors 19, no. 20 (October 9, 2019): 4361. http://dx.doi.org/10.3390/s19204361.

Abstract:
This paper presents a sensor-orientated approach to on-orbit position uncertainty generation and quantification for both ground-based and space-based surveillance applications. A mathematical framework based on the least squares formulation is developed to exploit real-time navigation measurements and tracking observables to provide a sound methodology that supports separation assurance and collision avoidance among Resident Space Objects (RSO). In line with the envisioned Space Situational Awareness (SSA) evolutions, the method aims to represent the navigation and tracking errors in the form of an uncertainty volume that accurately depicts their size, shape, and orientation. Simulation case studies are then conducted to verify under which sensor performance the method meets Gaussian assumptions, with a broader view of the implications that uncertainty has for the cyber-physical architecture evolutions and Cognitive Human-Machine Systems required for Space Situational Awareness and the development of a comprehensive Space Traffic Management framework.
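A minimal sketch of the least-squares uncertainty-volume idea described above, using generic weighted least squares with hypothetical matrices; it is not the authors' implementation.

```python
import numpy as np

def weighted_least_squares(H, z, R):
    """Estimate state x from measurements z = H x + noise with covariance R.

    Returns the estimate and its covariance P; the ellipsoid defined by P is the
    uncertainty volume (size, shape and orientation of the position uncertainty)."""
    W = np.linalg.inv(R)
    P = np.linalg.inv(H.T @ W @ H)        # state covariance
    x_hat = P @ H.T @ W @ z
    return x_hat, P

# Hypothetical 2D position observed through three linearised measurements
H = np.array([[1.0, 0.0], [0.0, 1.0], [0.7, 0.7]])
R = np.diag([0.04, 0.09, 0.25])
z = np.array([10.02, 4.95, 10.60])
x_hat, P = weighted_least_squares(H, z, R)
print(x_hat)
print(P)   # eigen-decomposition of P gives the axes of the uncertainty ellipse
```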
34

Feng, Jinchao, Joshua L. Lansford, Markos A. Katsoulakis, and Dionisios G. Vlachos. "Explainable and trustworthy artificial intelligence for correctable modeling in chemical sciences." Science Advances 6, no. 42 (October 2020): eabc3204. http://dx.doi.org/10.1126/sciadv.abc3204.

Abstract:
Data science has primarily focused on big data, but for many physics, chemistry, and engineering applications, data are often small, correlated and, thus, low dimensional, and sourced from both computations and experiments with various levels of noise. Typical statistics and machine learning methods do not work for these cases. Expert knowledge is essential, but a systematic framework for incorporating it into physics-based models under uncertainty is lacking. Here, we develop a mathematical and computational framework for probabilistic artificial intelligence (AI)–based predictive modeling combining data, expert knowledge, multiscale models, and information theory through uncertainty quantification and probabilistic graphical models (PGMs). We apply PGMs to chemistry specifically and develop predictive guarantees for PGMs generally. Our proposed framework, combining AI and uncertainty quantification, provides explainable results leading to correctable and, eventually, trustworthy models. The proposed framework is demonstrated on a microkinetic model of the oxygen reduction reaction.
35

VandenHeuvel, Daniel J., Christopher Drovandi, and Matthew J. Simpson. "Computationally efficient mechanism discovery for cell invasion with uncertainty quantification." PLOS Computational Biology 18, no. 11 (November 16, 2022): e1010599. http://dx.doi.org/10.1371/journal.pcbi.1010599.

Abstract:
Parameter estimation for mathematical models of biological processes is often difficult and depends significantly on the quality and quantity of available data. We introduce an efficient framework using Gaussian processes to discover mechanisms underlying delay, migration, and proliferation in a cell invasion experiment. Gaussian processes are leveraged with bootstrapping to provide uncertainty quantification for the mechanisms that drive the invasion process. Our framework is efficient, parallelisable, and can be applied to other biological problems. We illustrate our methods using a canonical scratch assay experiment, demonstrating how simply we can explore different functional forms and develop and test hypotheses about underlying mechanisms, such as whether delay is present. All code and data to reproduce this work are available at https://github.com/DanielVandH/EquationLearning.jl.
36

Janya-anurak, Chettapong, Thomas Bernard, and Jürgen Beyerer. "Uncertainty quantification of nonlinear distributed parameter systems using generalized polynomial chaos." at - Automatisierungstechnik 67, no. 4 (April 26, 2019): 283–303. http://dx.doi.org/10.1515/auto-2017-0116.

Abstract:
Many industrial and environmental processes are characterized as complex spatio-temporal systems. Such systems, known as distributed parameter systems (DPSs), are usually highly complex, and it is difficult to establish the relation between model inputs, model outputs and model parameters. Moreover, the solutions of physics-based models commonly differ somewhat from the measurements. In this work, appropriate Uncertainty Quantification (UQ) approaches are selected and combined systematically to analyze and identify systems. However, there are two main challenges when applying UQ approaches to nonlinear distributed parameter systems: (1) how uncertainties are modeled and (2) the computational effort, as the conventional methods require numerous evaluations of the model to compute the probability density function of the response. This paper presents a framework to address these two issues. Within the Bayesian framework, incomplete knowledge about the system is considered as uncertainty of the system. The uncertainties are represented by random variables, whose probability density function can be achieved by converting the knowledge of the parameters using the Principle of Maximum Entropy. The generalized Polynomial Chaos (gPC) expansion is employed to reduce the computational effort. The gPC-based Bayesian UQ framework proposed in this work is capable of analyzing systems systematically and of reducing the disagreement between model predictions and measurements of the real processes to fulfill user-defined performance criteria. The efficiency of the framework is assessed by applying it to a benchmark model (neutron diffusion equation) and to a model of a complex rheological forming process. These applications illustrate that the framework is capable of systematically analyzing the system and optimally calibrating the model parameters.
37

Berger, James O., and Leonard A. Smith. "On the Statistical Formalism of Uncertainty Quantification." Annual Review of Statistics and Its Application 6, no. 1 (March 7, 2019): 433–60. http://dx.doi.org/10.1146/annurev-statistics-030718-105232.

Abstract:
The use of models to try to better understand reality is ubiquitous. Models have proven useful in testing our current understanding of reality; for instance, climate models of the 1980s were built for science discovery, to achieve a better understanding of the general dynamics of climate systems. Scientific insights often take the form of general qualitative predictions (i.e., “under these conditions, the Earth's poles will warm more than the rest of the planet”); such use of models differs from making quantitative forecasts of specific events (i.e. “high winds at noon tomorrow at London's Heathrow Airport”). It is sometimes hoped that, after sufficient model development, any model can be used to make quantitative forecasts for any target system. Even if that were the case, there would always be some uncertainty in the prediction. Uncertainty quantification aims to provide a framework within which that uncertainty can be discussed and, ideally, quantified, in a manner relevant to practitioners using the forecast system. A statistical formalism has developed that claims to be able to accurately assess the uncertainty in prediction. This article is a discussion of if and when this formalism can do so. The article arose from an ongoing discussion between the authors concerning this issue, the second author generally being considerably more skeptical concerning the utility of the formalism in providing quantitative decision-relevant information.
38

Wei, Wei, Jiang Wu, Yang Yu, Tong Niu, and Xinxin Deng. "Uncertainty Quantification Analysis of Wind Power: A Data-Driven Monitoring-Forecasting Framework." IEEE Access 9 (2021): 84403–16. http://dx.doi.org/10.1109/access.2021.3086583.

39

Buisson, Bertrand, and Djamel Lakehal. "Towards an integrated machine-learning framework for model evaluation and uncertainty quantification." Nuclear Engineering and Design 354 (December 2019): 110197. http://dx.doi.org/10.1016/j.nucengdes.2019.110197.

40

Ricciardi, Denielle E., Oksana A. Chkrebtii, and Stephen R. Niezgoda. "Uncertainty Quantification Accounting for Model Discrepancy Within a Random Effects Bayesian Framework." Integrating Materials and Manufacturing Innovation 9, no. 2 (June 2020): 181–98. http://dx.doi.org/10.1007/s40192-020-00176-2.

41

Jiang, Chen, Zhen Hu, Yixuan Liu, Zissimos P. Mourelatos, David Gorsich, and Paramsothy Jayakumar. "A sequential calibration and validation framework for model uncertainty quantification and reduction." Computer Methods in Applied Mechanics and Engineering 368 (August 2020): 113172. http://dx.doi.org/10.1016/j.cma.2020.113172.

42

Wate, P., M. Iglesias, V. Coors, and D. Robinson. "Framework for emulation and uncertainty quantification of a stochastic building performance simulator." Applied Energy 258 (January 2020): 113759. http://dx.doi.org/10.1016/j.apenergy.2019.113759.

43

Abbas, Tajammal, and Guido Morgenthal. "Framework for sensitivity and uncertainty quantification in the flutter assessment of bridges." Probabilistic Engineering Mechanics 43 (January 2016): 91–105. http://dx.doi.org/10.1016/j.probengmech.2015.12.007.

44

Roy, Christopher J., and William L. Oberkampf. "A comprehensive framework for verification, validation, and uncertainty quantification in scientific computing." Computer Methods in Applied Mechanics and Engineering 200, no. 25-28 (June 2011): 2131–44. http://dx.doi.org/10.1016/j.cma.2011.03.016.

45

Poliannikov, Oleg V., and Alison E. Malcolm. "The effect of velocity uncertainty on migrated reflectors: Improvements from relative-depth imaging." GEOPHYSICS 81, no. 1 (January 1, 2016): S21–S29. http://dx.doi.org/10.1190/geo2014-0604.1.

Abstract:
We have studied the problem of uncertainty quantification for migrated images. A traditional migrated image contains deterministic reconstructions of subsurface structures. However, input parameters used in migration, such as reflection data and a velocity model, are inherently uncertain. This uncertainty is carried through to the migrated images. We have used Bayesian analysis to quantify the uncertainty of the migrated structures by constructing a joint statistical distribution of the location of these structures. From this distribution, we could deduce the uncertainty in any quantity derived from these structures. We have developed the proposed framework using a simple model with velocity uncertainty in the overburden, and we estimated the absolute positions of the horizons and the relative depth of one horizon with respect to another. By quantifying the difference in the corresponding uncertainties, we found that, in this case, the relative depths of the structures could be estimated much better than their absolute depths. This analysis justifies redatuming below an uncertain overburden for the purposes of the uncertainty reduction.
46

Du, Hongfei, Emre Barut, and Fang Jin. "Uncertainty Quantification in CNN Through the Bootstrap of Convex Neural Networks." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 13 (May 18, 2021): 12078–85. http://dx.doi.org/10.1609/aaai.v35i13.17434.

Abstract:
Despite the popularity of Convolutional Neural Networks (CNN), the problem of uncertainty quantification (UQ) of CNN has been largely overlooked. Lack of efficient UQ tools severely limits the application of CNN in certain areas, such as medicine, where prediction uncertainty is critically important. Among the few existing UQ approaches that have been proposed for deep learning, none of them has the theoretical consistency that can guarantee the uncertainty quality. To address this issue, we propose a novel bootstrap based framework for the estimation of prediction uncertainty. The inference procedure we use relies on convexified neural networks to establish the theoretical consistency of the bootstrap. Our approach has a significantly lower computational load than its competitors, as it relies on warm starts at each bootstrap iteration, which avoids refitting the model from scratch. We further explore a novel transfer learning method so our framework can work on arbitrary neural networks. We experimentally demonstrate that our approach has a much better performance compared to other baseline CNNs and state-of-the-art methods on various image datasets.
47

Tang, Hesheng, Dawei Li, Lixin Deng, and Songtao Xue. "Evidential uncertainty quantification of the Park–Ang damage model in performance based design." Engineering Computations 35, no. 7 (October 1, 2018): 2480–501. http://dx.doi.org/10.1108/ec-11-2017-0466.

Abstract:
Purpose: This paper aims to develop a comprehensive uncertainty quantification method using evidence theory for Park–Ang damage index-based performance design in which epistemic uncertainties are considered. Various sources of uncertainty emanating from the database of cyclic test results of RC members provided by the Pacific Earthquake Engineering Research Center are taken into account. Design/methodology/approach: An uncertainty quantification methodology based on evidence theory is presented for the whole process of performance-based seismic design (PBSD), while considering uncertainty in the Park–Ang damage model. To alleviate the burden of high computational cost in propagating uncertainty, the differential evolution interval optimization strategy is used for efficiently finding the propagated belief structure throughout the whole design process. Findings: The results demonstrate that the uncertainty rooted in the Park–Ang damage model has a significant influence on PBSD design and evaluation. It might be worth noting that the epistemic uncertainty present in the Park–Ang damage model needs to be considered to avoid underestimating the true uncertainty. Originality/value: This paper presents an evidence theory-based uncertainty quantification framework for the whole process of PBSD.
48

Subber, Waad, Sayan Ghosh, Piyush Pandita, Yiming Zhang, and Liping Wang. "Data-Informed Decomposition for Localized Uncertainty Quantification of Dynamical Systems." Vibration 4, no. 1 (December 31, 2020): 49–63. http://dx.doi.org/10.3390/vibration4010004.

Abstract:
Industrial dynamical systems often exhibit multi-scale responses due to material heterogeneity and complex operating conditions. The smallest length-scale of the system's dynamics controls the numerical resolution required to resolve the embedded physics. In practice, however, high numerical resolution is only required in a confined region of the domain where fast dynamics or localized material variability is exhibited, whereas a coarser discretization can be sufficient in the rest of the domain. Partitioning the complex dynamical system into smaller, easier-to-solve problems based on the localized dynamics and material variability can reduce the overall computational cost. The region of interest can be specified based on the localized features of the solution, user interest, and the correlation length of the material properties. For problems where a region of interest is not evident, Bayesian inference can provide a feasible solution. In this work, we employ a Bayesian framework to update the prior knowledge of the localized region of interest using measurements of the system response. Once the region of interest is identified, the localized uncertainty is propagated forward through the computational domain. We demonstrate our framework using numerical experiments on a three-dimensional elastodynamic problem.
49

Delipei, Gregory Kyriakos, Josselin Garnier, Jean-Charles Le Pallec, and Benoit Normand. "High to Low pellet cladding gap heat transfer modeling methodology in an uncertainty quantification framework for a PWR Rod Ejection Accident with best estimate coupling." EPJ Nuclear Sciences & Technologies 6 (2020): 56. http://dx.doi.org/10.1051/epjn/2020018.

Abstract:
High to Low modeling approaches can alleviate the computationally expensive fuel modeling in the uncertainty quantification of nuclear reactor transients. This is especially the case for a Rod Ejection Accident (REA) in Pressurized Water Reactors (PWR), where strong multi-physics interactions occur. In this work, we develop and propose a pellet cladding gap heat transfer (Hgap) High to Low modeling methodology for a PWR REA in an uncertainty quantification framework. The methodology involves the calibration of a simplified Hgap model based on high fidelity simulations with the fuel-thermomechanics code ALCYONE1. The calibrated model is then introduced into the CEA-developed CORPUS Best Estimate (BE) multi-physics coupling between APOLLO3® and FLICA4. This creates an Improved Best Estimate (IBE) coupling that is then used for an uncertainty quantification study. The results indicate that with IBE the uncertainty in the distance to boiling crisis is decreased from 57% to 42%. This is reflected in the decreased sensitivity to Hgap: in the BE coupling, Hgap was responsible for 50% of the output variance, while in IBE it is close to 0. These results show the potential gain of High to Low approaches for Hgap modeling in REA uncertainty analyses.
50

Kekez, Toni, Snježana Knezić, and Roko Andričević. "Incorporating Uncertainty of the System Behavior in Flood Risk Assessment—Sava River Case Study." Water 12, no. 10 (September 24, 2020): 2676. http://dx.doi.org/10.3390/w12102676.

Abstract:
This paper proposes a framework for evaluation of the sources of uncertainty that can disrupt the flood emergency response process. During the flood response, flood emergency managers usually choose between several decision options under limited available lead time, but they must often contend with different sources of uncertainty. These sources can significantly affect the quality of decisions related to adequate response and rapid recovery of the affected system. The proposed framework considers efficient identification, integration, and quantification of system uncertainties related to the flood risk. Uncertainty analysis is performed from a decision-maker’s perspective and focused on the time period near and during the flood event. The main scope of the proposed framework is to recognize and characterize sources of uncertainty that can potentially appear in the behavior of the observed system. Using a Bayesian network approach, a model is developed that is capable of quantifying different sources of uncertainty with respect to their particular type. The proposed approach is validated on the Sava River case study, in the area of the city of Slavonski Brod, following the destructive 2014 flood event. The results indicate that, despite improvements of structural measures, a weir failure can still cause flooding of approximately 1 km² of otherwise safe area, resulting in increased flood risk.