
Journal articles on the topic 'Uncertainty Quantification model'


1

Salehghaffari, S., and M. Rais-Rohani. "Material model uncertainty quantification using evidence theory." Proceedings of the Institution of Mechanical Engineers, Part C: Journal of Mechanical Engineering Science 227, no. 10 (January 8, 2013): 2165–81. http://dx.doi.org/10.1177/0954406212473390.

Abstract:
Uncertainties in material models and their influence on structural behavior and reliability are important considerations in the analysis and design of structures. In this article, a methodology based on evidence theory is presented for uncertainty quantification of constitutive models. The proposed methodology is applied to the Johnson–Cook plasticity model while considering various sources of uncertainty emanating from the experimental stress–strain data as well as the method of fitting the model constants and the representation of the nondimensional temperature. All uncertain parameters are represented in interval form. Rules for agreement, conflict, and ignorance relationships in the data are discussed and subsequently used to construct a belief structure for each uncertain material parameter. The material model uncertainties are propagated through nonlinear crush simulation of an aluminium alloy 6061-T6 circular tube under axial impact load. Surrogate modeling and global optimization techniques are used for efficient calculation of the propagated belief structure of the tube response, whereas Yager’s aggregation rule of evidence is used for multi-model consideration. Evidence-based uncertainty in the structural response is measured and presented in terms of belief, plausibility, and plausibility-decision values.
2

Vallam, P., X. S. Qin, and J. J. Yu. "Uncertainty Quantification of Hydrologic Model." APCBEE Procedia 10 (2014): 219–23. http://dx.doi.org/10.1016/j.apcbee.2014.10.042.

3

Guo, Xianpeng, Dezhi Wang, Lilun Zhang, Yongxian Wang, Wenbin Xiao, and Xinghua Cheng. "Uncertainty Quantification of Underwater Sound Propagation Loss Integrated with Kriging Surrogate Model." International Journal of Signal Processing Systems 5, no. 4 (December 2017): 141–45. http://dx.doi.org/10.18178/ijsps.5.4.141-145.

4

Franck, Isabell M., and P. S. Koutsourelakis. "Constitutive model error and uncertainty quantification." PAMM 17, no. 1 (December 2017): 865–68. http://dx.doi.org/10.1002/pamm.201710400.

5

de Vries, Douwe K., and Paul M. J. Van den Hof. "Quantification of model uncertainty from data." International Journal of Robust and Nonlinear Control 4, no. 2 (1994): 301–19. http://dx.doi.org/10.1002/rnc.4590040206.

6

Kamga, P. H. T., B. Li, M. McKerns, L. H. Nguyen, M. Ortiz, H. Owhadi, and T. J. Sullivan. "Optimal uncertainty quantification with model uncertainty and legacy data." Journal of the Mechanics and Physics of Solids 72 (December 2014): 1–19. http://dx.doi.org/10.1016/j.jmps.2014.07.007.

7

Liu, Chang, and Duane A. McVay. "Continuous Reservoir-Simulation-Model Updating and Forecasting Improves Uncertainty Quantification." SPE Reservoir Evaluation & Engineering 13, no. 04 (August 12, 2010): 626–37. http://dx.doi.org/10.2118/119197-pa.

Abstract:
Most reservoir-simulation studies are conducted in a static context—at a single point in time, using a fixed set of historical data for history matching. Time and budget constraints usually result in a significant reduction in the number of uncertain parameters and incomplete exploration of the parameter space, which leads to underestimation of forecast uncertainty and less-than-optimal decision making. Markov chain Monte Carlo (MCMC) methods have been used in static studies for rigorous exploration of the parameter space and quantification of forecast uncertainty, but these methods suffer from long burn-in times and the many runs required for chain stabilization. In this paper, we apply MCMC in a real-time reservoir-modeling application. The system operates in a continuous process of data acquisition, model calibration, forecasting, and uncertainty quantification. The system was validated on the PUNQ (production forecasting with uncertainty quantification) synthetic reservoir in a simulated multiyear continuous-modeling scenario, and it yielded probabilistic forecasts that narrowed with time. Once the continuous MCMC simulation process has been established sufficiently, the continuous approach usually allows generation of a reasonable probabilistic forecast at a particular point in time with many fewer models than the traditional application of the MCMC method in a one-time, static simulation study starting at the same time. Operating continuously over the many years of a typical reservoir's life, many more realizations can be run than with traditional approaches. This allows more-thorough investigation of the parameter space and more-complete quantification of forecast uncertainty. More importantly, the approach provides a mechanism for, and can thus encourage, calibration of uncertainty estimates over time. Greater investigation of the uncertain parameter space and calibration of uncertainty estimates by using a continuous modeling process should improve the reliability of probabilistic forecasts significantly.
8

Cheng, Xi, Clément Henry, Francesco P. Andriulli, Christian Person, and Joe Wiart. "A Surrogate Model Based on Artificial Neural Network for RF Radiation Modelling with High-Dimensional Data." International Journal of Environmental Research and Public Health 17, no. 7 (April 9, 2020): 2586. http://dx.doi.org/10.3390/ijerph17072586.

Abstract:
This paper focuses on quantifying the uncertainty in the specific absorption rate values of the brain induced by the uncertain positions of the electroencephalography electrodes placed on the patient’s scalp. To avoid running a large number of simulations, an artificial neural network architecture for uncertainty quantification involving high-dimensional data is proposed in this paper. The proposed method is demonstrated to be an attractive alternative to conventional uncertainty quantification methods because of its considerable advantage in the computational expense and speed.
9

Sun, Xianming, and Michèle Vanmaele. "Uncertainty Quantification of Derivative Instruments." East Asian Journal on Applied Mathematics 7, no. 2 (May 2017): 343–62. http://dx.doi.org/10.4208/eajam.100316.270117a.

Abstract:
Model and parameter uncertainties are common whenever some parametric model is selected to value a derivative instrument. Combining the Monte Carlo method with the Smolyak interpolation algorithm, we propose an accurate and efficient numerical procedure to quantify the uncertainty embedded in complex derivatives. Apart from the value function being sufficiently smooth with respect to the model parameters, there are no requirements on the payoff or candidate models. Numerical tests quantify the uncertainty of Bermudan put options and down-and-out put options under the Heston model, with each model parameter specified in an interval.
10

Herty, Michael, and Elisa Iacomini. "Uncertainty quantification in hierarchical vehicular flow models." Kinetic and Related Models 15, no. 2 (2022): 239. http://dx.doi.org/10.3934/krm.2022006.

Abstract:
We consider kinetic vehicular traffic flow models of BGK type [24]. Considering different spatial and temporal scales, these models allow one to derive a hierarchy of traffic models, including a hydrodynamic description. In this paper, the kinetic BGK model is extended by introducing a parametric stochastic variable to describe possible uncertainty in traffic. The interplay of uncertainty with the given model hierarchy is studied in detail. Theoretical results on consistent formulations of the stochastic differential equations on the hydrodynamic level are given. The effect of the possibly negative diffusion in the stochastic hydrodynamic model is studied, and numerical simulations of uncertain traffic situations are presented.
11

Mirzayeva, A., N. A. Slavinskaya, M. Abbasi, J. H. Starcke, W. Li, and M. Frenklach. "Uncertainty Quantification in Chemical Modeling." Eurasian Chemico-Technological Journal 20, no. 1 (March 31, 2018): 33. http://dx.doi.org/10.18321/ectj706.

Abstract:
A module of the PrIMe automated data-centric infrastructure, Bound-to-Bound Data Collaboration (B2BDC), was used for the analysis of systematic uncertainty and data consistency of the H2/CO reaction model (73/17). For this purpose, a dataset of 167 experimental targets (ignition delay time and laminar flame speed) and 55 active model parameters (pre-exponential factors in the Arrhenius form of the reaction rate coefficients) was constructed. Consistency analysis of the experimental data in the composed dataset revealed disagreement between models and data. Two consistency measures were applied to assess the quality of the experimental targets (Quantities of Interest, QoI): a scalar consistency measure, which quantifies the tightening index of the constraints while still ensuring the existence of a set of model parameter values whose associated modeling output predicts the experimental QoIs within the uncertainty bounds; and a newly developed method of computing the vector consistency measure (VCM), which determines the minimal bound changes, each bound by its own extent, for QoIs initially identified as inconsistent, under the same existence requirement. The consistency analysis suggested that eliminating 45 experimental targets, 8 of which were self-inconsistent, would lead to a consistent dataset. After that, the feasible parameter set was constructed by decreasing the uncertainty of several reaction rate coefficients. This dataset was then subjected to model optimization and analysis within the B2BDC framework. Four methods of parameter optimization were applied, including those unique to the B2BDC framework. The optimized models showed improved agreement with experimental values compared with the initially assembled model. Moreover, predictions for experiments not included in the initial dataset were investigated. The results demonstrate the benefits of applying the B2BDC methodology to the development of predictive kinetic models.
12

Chung, Gunhui, Kyu Bum Sim, and Eung Seok Kim. "Uncertainty Quantification Index of SWMM Model Parameters." Journal of the Korean Water Resources Association 48, no. 2 (February 28, 2015): 105–14. http://dx.doi.org/10.3741/jkwra.2015.48.2.105.

13

YAMAZAKI, Wataru. "Uncertainty Quantification via Variable Fidelity Kriging Model." JOURNAL OF THE JAPAN SOCIETY FOR AERONAUTICAL AND SPACE SCIENCES 60, no. 2 (2012): 80–88. http://dx.doi.org/10.2322/jjsass.60.80.

14

Ye, Ming, Philip D. Meyer, Yu-Feng Lin, and Shlomo P. Neuman. "Quantification of model uncertainty in environmental modeling." Stochastic Environmental Research and Risk Assessment 24, no. 6 (April 28, 2010): 807–8. http://dx.doi.org/10.1007/s00477-010-0377-0.

15

Adhikari, S. "On the quantification of damping model uncertainty." Journal of Sound and Vibration 306, no. 1-2 (September 2007): 153–71. http://dx.doi.org/10.1016/j.jsv.2007.05.022.

16

Farmer, C. L. "Uncertainty quantification and optimal decisions." Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences 473, no. 2200 (April 2017): 20170115. http://dx.doi.org/10.1098/rspa.2017.0115.

Abstract:
A mathematical model can be analysed to construct policies for action that are close to optimal for the model. If the model is accurate, such policies will be close to optimal when implemented in the real world. In this paper, the different aspects of an ideal workflow are reviewed: modelling, forecasting, evaluating forecasts, data assimilation and constructing control policies for decision-making. The example of the oil industry is used to motivate the discussion, and other examples, such as weather forecasting and precision agriculture, are used to argue that the same mathematical ideas apply in different contexts. Particular emphasis is placed on (i) uncertainty quantification in forecasting and (ii) how decisions are optimized and made robust to uncertainty in models and judgements. This necessitates full use of the relevant data and, by balancing costs and benefits into the long term, may suggest policies quite different from those relevant to the short term.
17

Wang, Jiajia, Hao Chen, Jing Ma, and Tong Zhang. "Research on application method of uncertainty quantification technology in equipment test identification." MATEC Web of Conferences 336 (2021): 02026. http://dx.doi.org/10.1051/matecconf/202133602026.

Abstract:
This paper introduces the concepts of equipment test qualification and uncertainty quantification, and the analysis framework and process of equipment test uncertainty quantification. It analyzes the data uncertainty, model uncertainty and environmental uncertainty, and studies the corresponding uncertainty quantification theory to provide technical reference for the application of uncertainty quantification technology in the field of test identification.
18

Liu, Xuejun, Hailong Tang, Xin Zhang, and Min Chen. "Gaussian Process Model-Based Performance Uncertainty Quantification of a Typical Turboshaft Engine." Applied Sciences 11, no. 18 (September 8, 2021): 8333. http://dx.doi.org/10.3390/app11188333.

Abstract:
The gas turbine engine is a widely used thermodynamic system for aircraft. The demand for quantifying the uncertainty of engine performance is increasing due to the expectation of reliable engine performance design. In this paper, a fast, accurate, and robust uncertainty quantification method is proposed to investigate the impact of component performance uncertainty on the performance of a classical turboshaft engine. The Gaussian process model is firstly utilized to accurately approximate the relationships between inputs and outputs of the engine performance simulation model. Latin hypercube sampling is subsequently employed to perform uncertainty analysis of the engine performance. The accuracy, robustness, and convergence rate of the proposed method are validated by comparing with the Monte Carlo sampling method. Two main scenarios are investigated, where uncertain parameters are considered to be mutually independent and partially correlated, respectively. Finally, the variance-based sensitivity analysis is used to determine the main contributors to the engine performance uncertainty. Both approximation and sampling errors are explained in the uncertainty quantification to give more accurate results. The final results yield new insights about the engine performance uncertainty and the important component performance parameters.
19

Mueller, Michael E., and Venkat Raman. "Model form uncertainty quantification in turbulent combustion simulations: Peer models." Combustion and Flame 187 (January 2018): 137–46. http://dx.doi.org/10.1016/j.combustflame.2017.09.011.

20

Lei, Chon Lok, Sanmitra Ghosh, Dominic G. Whittaker, Yasser Aboelkassem, Kylie A. Beattie, Chris D. Cantwell, Tammo Delhaas, et al. "Considering discrepancy when calibrating a mechanistic electrophysiology model." Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 378, no. 2173 (May 25, 2020): 20190349. http://dx.doi.org/10.1098/rsta.2019.0349.

Abstract:
Uncertainty quantification (UQ) is a vital step in using mathematical models and simulations to take decisions. The field of cardiac simulation has begun to explore and adopt UQ methods to characterize uncertainty in model inputs and how that propagates through to outputs or predictions; examples of this can be seen in the papers of this issue. In this review and perspective piece, we draw attention to an important and under-addressed source of uncertainty in our predictions—that of uncertainty in the model structure or the equations themselves. The difference between imperfect models and reality is termed model discrepancy, and we are often uncertain as to the size and consequences of this discrepancy. Here, we provide two examples of the consequences of discrepancy when calibrating models at the ion channel and action potential scales. Furthermore, we attempt to account for this discrepancy when calibrating and validating an ion channel model using different methods, based on modelling the discrepancy using Gaussian processes and autoregressive-moving-average models, then highlight the advantages and shortcomings of each approach. Finally, suggestions and lines of enquiry for future work are provided. This article is part of the theme issue ‘Uncertainty quantification in cardiac and cardiovascular modelling and simulation’.
21

Tuczyński, Tomasz, and Jerzy Stopa. "Uncertainty Quantification in Reservoir Simulation Using Modern Data Assimilation Algorithm." Energies 16, no. 3 (January 20, 2023): 1153. http://dx.doi.org/10.3390/en16031153.

Abstract:
Production forecasting using numerical simulation has become a standard in the oil and gas industry. The model construction process requires an explicit definition of multiple uncertain parameters; thus, the outcome of the modelling is also uncertain. For reservoirs with production data, the uncertainty can be reduced by history-matching. However, the manual matching procedure is time-consuming and usually generates one deterministic realization. Due to the ill-posed nature of the calibration process, the uncertainty cannot be captured sufficiently with only one simulation model. In this paper, the uncertainty quantification process carried out for a gas-condensate reservoir is described. The ensemble-based uncertainty approach was used with the ES-MDA algorithm, conditioning the models to the observed data. Along with the results, the authors describe the solutions proposed to improve the algorithm's efficiency and to analyze the factors controlling modelling uncertainty. As part of the calibration process, various geological hypotheses regarding the presence of an active aquifer were verified, leading to important observations about the drive mechanism of the analyzed reservoir.
22

Wang, Lujia, Hyunsoo Yoon, Andrea Hawkins-Daarud, Kyle Singleton, Kamala Clark-Swanson, Kris Smith, Peter Nakaji, et al. "NIMG-52. UNCERTAINTY QUANTIFICATION IN RADIOMICS." Neuro-Oncology 21, Supplement_6 (November 2019): vi172–vi173. http://dx.doi.org/10.1093/neuonc/noz175.721.

Abstract:
INTRODUCTION: The quantification of intratumoral heterogeneity, through radiomics-based approaches, can help resolve the regionally distinct genetic drug targets that may co-exist within a single Glioblastoma (GBM) tumor. While this offers potential diagnostic value under the paradigm of individualized oncology, clinical decision-making must also consider the degree of uncertainty associated with each model. In this study, we evaluate the performance of a novel machine-learning (ML) algorithm, called Gaussian Process (GP) modeling, that can quantify the impact of multiple sources of uncertainty in ML model development and prediction accuracy, including variabilities in the copy number measurement, radiomics features, training sample characteristics, and training sample size. METHOD: We collected 95 image-localized biopsies from 25 primary GBM patients. We coregistered stereotactic locations with preoperative multi-parametric MRI features (conventional MRI, DSC perfusion, Diffusion Tensor Imaging) to generate spatially matched pairs of MRI and copy number variants (CNV) for each biopsy. We developed a Gaussian Process (GP) model to predict CNV for Epidermal Growth Factor Receptor (EGFR) based on MRI radiomic features in each patient. We used leave-one-patient-out cross-validation to quantify prediction accuracy and model uncertainty. Spatial prediction and uncertainty (p-value) maps were overlaid to help visualize regional genetic variation of EGFR and the uncertainty of the radiomic predictions. RESULT: The initial GP radiomics model for EGFR amplification (CNV > 3.5) produced a sensitivity of 0.8 and specificity of 0.8. Samples/regions associated with high uncertainty (p-value > 0.05) correlated with either (1) extrapolation of radiomic features outside the training-set-defined feature space or (2) insufficient training samples in the feature space. CONCLUSION: We present an ML-based model that quantifies spatial genetic heterogeneity in GBM, while also estimating model uncertainties that result from multi-source data variabilities. This approach lays the groundwork for prospective clinical integration of modeling-based diagnostic approaches in the paradigm of individualized medicine.
23

Hu, Juxi, Lei Wang, and Xiaojun Wang. "Non-Probabilistic Uncertainty Quantification of Fiber-Reinforced Composite Laminate Based on Micro- and Macro-Mechanical Analysis." Applied Sciences 12, no. 22 (November 18, 2022): 11739. http://dx.doi.org/10.3390/app122211739.

Abstract:
In this paper, the main aim is to study and predict the macro elastic mechanical parameters of fiber-reinforced composite laminates by combining micro-mechanical analysis models and non-probabilistic set theory. Uncertain input parameters in the quantification models are treated as interval variables. Several kinds of micro-mechanical mathematical models are introduced, and the parameter vertex solution theorem and the Monte Carlo simulation method are used to perform uncertainty quantification of the macro elastic properties of composites. In order to take the correlations between macro elastic properties into consideration, the limited sample points or experimental data obtained are utilized on the basis of grey mathematical modeling theory, where correlated uncertain macro parameters can be treated as a closed and bounded convex polyhedral model. This gives a clear analytical description of the feasible domains of correlated macro elastic properties in the process of uncertainty quantification. Finally, two numerical examples are carried out to demonstrate the validity and feasibility of the proposed quantification method. The results show that the proposed method can serve as a powerful and meaningful supplement for uncertainty quantification of composite laminates and provide data support for further uncertainty propagation analysis.
24

Zimoń, Małgorzata, Robert Sawko, David Emerson, and Christopher Thompson. "Uncertainty Quantification at the Molecular–Continuum Model Interface." Fluids 2, no. 1 (March 21, 2017): 12. http://dx.doi.org/10.3390/fluids2010012.

25

Naozuka, Gustavo Taiji, Emanuelle Arantes Paixão, João Vitor Oliveira Silva, Maurício Pessoa da Cunha Menezes, and Regina Cerqueira Almeida. "Model Comparison and Uncertainty Quantification in Tumor Growth." Trends in Computational and Applied Mathematics 22, no. 3 (September 2, 2021): 495–514. http://dx.doi.org/10.5540/tcam.2021.022.03.00495.

Abstract:
Mathematical and computational modeling have been increasingly applied in many areas of cancer research, aiming to improve the understanding of tumorigenic mechanisms and to suggest more effective therapy protocols. The mathematical description of the tumor growth dynamics is often made using the exponential, logistic, and Gompertz models. However, recent literature has suggested that the Allee effect may play an important role in the early stages of tumor dynamics, including cancer relapse and metastasis. For a model to provide reliable predictions, it is necessary to have a rigorous evaluation of the uncertainty inherent in the modeling process. In this work, our main objective is to show how a model framework that integrates sensitivity analysis, model calibration, and model selection techniques can improve and systematically characterize model and data uncertainties. We investigate five distinct models with different complexities, which encompass the exponential, logistic, Gompertz, and weak and strong Allee effects dynamics. Using tumor growth data published in the literature, we perform a global sensitivity analysis, apply a Bayesian framework for parameter inference, evaluate the associated sensitivity matrices, and use different information criteria for model selection (First- and Second-Order Akaike Information Criteria and Bayesian Information Criterion). We show that such a wider methodology allows having a more detailed picture of each model assumption and uncertainty, calibration reliability, ultimately improving tumor mathematical description. The used in vivo data suggested the existence of both a competitive effect among tumor cells and a weak Allee effect in the growth dynamics. The proposed model framework highlights the need for more detailed experimental studies on the influence of the Allee effect on the analyzed cancer scenario.
26

Lew, Jiann-Shiun, Lee H. Keel, and Jer-Nan Juang. "Quantification of parametric uncertainty via an interval model." Journal of Guidance, Control, and Dynamics 17, no. 6 (November 1994): 1212–18. http://dx.doi.org/10.2514/3.21335.

27

Wang, Hai, and David A. Sheen. "Combustion kinetic model uncertainty quantification, propagation and minimization." Progress in Energy and Combustion Science 47 (April 2015): 1–31. http://dx.doi.org/10.1016/j.pecs.2014.10.002.

28

Park, Inseok, Hemanth K. Amarchinta, and Ramana V. Grandhi. "A Bayesian approach for quantification of model uncertainty." Reliability Engineering & System Safety 95, no. 7 (July 2010): 777–85. http://dx.doi.org/10.1016/j.ress.2010.02.015.

29

Li, Jing, and Panos Stinis. "Mesh refinement for uncertainty quantification through model reduction." Journal of Computational Physics 280 (January 2015): 164–83. http://dx.doi.org/10.1016/j.jcp.2014.09.021.

30

Shen, Maohao, Yuheng Bu, Prasanna Sattigeri, Soumya Ghosh, Subhro Das, and Gregory Wornell. "Post-hoc Uncertainty Learning Using a Dirichlet Meta-Model." Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 8 (June 26, 2023): 9772–81. http://dx.doi.org/10.1609/aaai.v37i8.26167.

Abstract:
It is known that neural networks have the problem of being over-confident when directly using the output label distribution to generate uncertainty measures. Existing methods mainly resolve this issue by retraining the entire model to impose the uncertainty quantification capability so that the learned model can achieve desired performance in accuracy and uncertainty prediction simultaneously. However, training the model from scratch is computationally expensive, and a trade-off might exist between prediction accuracy and uncertainty quantification. To this end, we consider a more practical post-hoc uncertainty learning setting, where a well-trained base model is given, and we focus on the uncertainty quantification task at the second stage of training. We propose a novel Bayesian uncertainty learning approach using the Dirichlet meta-model, which is effective and computationally efficient. Our proposed method requires no additional training data and is flexible enough to quantify different uncertainties and easily adapt to different application settings, including out-of-domain data detection, misclassification detection, and trustworthy transfer learning. Finally, we demonstrate our proposed meta-model approach's flexibility and superior empirical performance on these applications over multiple representative image classification benchmarks.
31

Berends, Koen D., Menno W. Straatsma, Jord J. Warmink, and Suzanne J. M. H. Hulscher. "Uncertainty quantification of flood mitigation predictions and implications for interventions." Natural Hazards and Earth System Sciences 19, no. 8 (August 13, 2019): 1737–53. http://dx.doi.org/10.5194/nhess-19-1737-2019.

Abstract:
Reduction of water levels during river floods is key in preventing damage and loss of life. Computer models are used to design ways to achieve this and assist in the decision-making process. However, the predictions of computer models are inherently uncertain, and it is currently unknown to what extent that uncertainty affects predictions of the effect of flood mitigation strategies. In this study, we quantify the uncertainty of flood mitigation interventions on the Dutch River Waal, based on 39 different sources of uncertainty and 12 intervention designs. The aim of each intervention is to reduce flood water levels. Our objective is to investigate the uncertainty of model predictions of intervention effect and to explore relationships that may aid in decision-making. We identified the relative uncertainty, defined as the ratio between the confidence interval and the expected effect, as a useful metric to compare uncertainty between different interventions. Using this metric, we show that intervention effect uncertainty behaves like a traditional backwater curve with an approximately constant relative uncertainty value. In general, we observe that uncertainty scales with effect: high flood level decreases have high uncertainty, and, conversely, small effects are accompanied by small uncertainties. However, different interventions with the same expected effect do not necessarily have the same uncertainty. For example, our results show that the large-scale but relatively ineffective intervention of floodplain smoothing by removing vegetation has much higher uncertainty compared to alternative options. Finally, we show how a level of acceptable uncertainty can be defined and how this can affect the design of interventions. In general, we conclude that the uncertainty of model predictions is not large enough to invalidate model-based intervention design, nor small enough to neglect altogether. Instead, uncertainty information is valuable in the selection of alternative interventions.
APA, Harvard, Vancouver, ISO, and other styles
32

Karimi, Hamed, and Reza Samavi. "Quantifying Deep Learning Model Uncertainty in Conformal Prediction." Proceedings of the AAAI Symposium Series 1, no. 1 (October 3, 2023): 142–48. http://dx.doi.org/10.1609/aaaiss.v1i1.27492.

Full text
Abstract:
Precise estimation of predictive uncertainty in deep neural networks is a critical requirement for reliable decision-making in machine learning and statistical modeling, particularly in the context of medical AI. Conformal Prediction (CP) has emerged as a promising framework for representing model uncertainty by providing well-calibrated confidence levels for individual predictions. However, the quantification of model uncertainty in conformal prediction remains an active research area, yet to be fully addressed. In this paper, we explore state-of-the-art CP methodologies and their theoretical foundations. We propose a probabilistic approach to quantifying the model uncertainty derived from the prediction sets produced in conformal prediction and provide certified boundaries for the computed uncertainty. By doing so, we allow model uncertainty measured by CP to be compared with other uncertainty quantification methods such as Bayesian (e.g., MC-Dropout and DeepEnsemble) and Evidential approaches.
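The prediction-set construction that conformal prediction builds on can be sketched in a few lines. This is a minimal split-conformal example, not the authors' method; the nonconformity score, the toy data, and all names below are illustrative.

```python
import numpy as np

def split_conformal_sets(cal_probs, cal_labels, test_probs, alpha=0.1):
    """Split conformal prediction: build prediction sets with
    >= 1 - alpha marginal coverage from class-probability scores."""
    n = len(cal_labels)
    # Nonconformity score: 1 minus the probability of the true class.
    scores = 1.0 - cal_probs[np.arange(n), cal_labels]
    # Finite-sample-corrected quantile of the calibration scores.
    q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")
    # A class enters the set when its score is within the threshold.
    return [np.where(1.0 - p <= q)[0] for p in test_probs]

rng = np.random.default_rng(0)
cal_probs = rng.dirichlet(np.ones(3), size=200)
cal_labels = np.array([np.argmax(p) for p in cal_probs])  # toy: label = argmax
sets = split_conformal_sets(cal_probs, cal_labels,
                            rng.dirichlet(np.ones(3), size=5))
```

Because the threshold is a finite-sample quantile of the calibration scores, the sets cover the true label with probability at least 1 - alpha; set size then acts as a per-prediction uncertainty signal.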
APA, Harvard, Vancouver, ISO, and other styles
33

Rajaraman, Sivaramakrishnan, Ghada Zamzmi, Feng Yang, Zhiyun Xue, Stefan Jaeger, and Sameer K. Antani. "Uncertainty Quantification in Segmenting Tuberculosis-Consistent Findings in Frontal Chest X-rays." Biomedicines 10, no. 6 (June 4, 2022): 1323. http://dx.doi.org/10.3390/biomedicines10061323.

Full text
Abstract:
Deep learning (DL) methods have demonstrated superior performance in medical image segmentation tasks. However, selecting a loss function that conforms to the data characteristics is critical for optimal performance. Further, the direct use of traditional DL models does not provide a measure of uncertainty in predictions. Even high-quality automated predictions for medical diagnostic applications demand uncertainty quantification to gain user trust. In this study, we aim to investigate the benefits of (i) selecting an appropriate loss function and (ii) quantifying uncertainty in predictions using a VGG16-based U-Net model with the Monte Carlo Dropout (MCD) method for segmenting Tuberculosis (TB)-consistent findings in frontal chest X-rays (CXRs). We determine an optimal uncertainty threshold based on several uncertainty-related metrics. This threshold is used to select and refer highly uncertain cases to an expert. Experimental results demonstrate that (i) the model trained with a modified Focal Tversky loss function delivered superior segmentation performance (mean average precision (mAP): 0.5710, 95% confidence interval (CI): (0.4021, 0.7399)), (ii) the model with 30 MC forward passes during inference further improved and stabilized performance (mAP: 0.5721, 95% CI: (0.4032, 0.7410)), and (iii) an uncertainty threshold of 0.7 is observed to be optimal to refer highly uncertain cases.
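The Monte Carlo dropout mechanism used here can be illustrated with a toy dense network: keep dropout active at inference, run several stochastic forward passes, and read the spread of the outputs as an uncertainty estimate. This is a sketch under stated assumptions; the random weights stand in for a trained model and do not reproduce the paper's VGG16 U-Net.

```python
import numpy as np

def mc_dropout_predict(x, W1, W2, p=0.5, T=30, rng=None):
    """Monte Carlo dropout: T stochastic forward passes with dropout
    left on; mean is the prediction, std an uncertainty estimate."""
    rng = rng or np.random.default_rng(0)
    preds = []
    for _ in range(T):
        h = np.maximum(x @ W1, 0.0)                  # ReLU hidden layer
        mask = rng.random(h.shape) >= p              # Bernoulli dropout mask
        h = h * mask / (1.0 - p)                     # inverted-dropout scaling
        preds.append(1.0 / (1.0 + np.exp(-(h @ W2))))  # sigmoid outputs
    preds = np.stack(preds)
    return preds.mean(axis=0), preds.std(axis=0)     # prediction, uncertainty

rng = np.random.default_rng(1)
x = rng.normal(size=(4, 8))            # 4 "pixels", 8 features each
W1 = rng.normal(size=(8, 16))
W2 = rng.normal(size=(16, 1))
mean, std = mc_dropout_predict(x, W1, W2)
```

In a segmentation setting the per-pixel `std` map is what gets thresholded (here, the paper's 0.7 threshold) to refer uncertain cases to an expert.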
APA, Harvard, Vancouver, ISO, and other styles
34

Zou, Q., and M. Sester. "UNCERTAINTY REPRESENTATION AND QUANTIFICATION OF 3D BUILDING MODELS." International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLIII-B2-2022 (May 30, 2022): 335–41. http://dx.doi.org/10.5194/isprs-archives-xliii-b2-2022-335-2022.

Full text
Abstract:
Abstract. The quality of environmental perception is of great interest for localization tasks in autonomous systems. Maps, generated from the sensed information, are often used as additional spatial references in these applications. The quantification of the map uncertainties gives an insight into how reliable and complete the map is, avoiding the potential systematic deviation in pose estimation. Mapping 3D buildings in urban areas using Light detection and ranging (LiDAR) point clouds is a challenging task as it is often subject to uncertain error sources in the real world such as sensor noise and occlusions, which should be well represented in the 3D models for the downstream localization tasks. In this paper, we propose a method to model 3D building façades in complex urban scenes with uncertainty quantification, where the uncertainties of windows and façades are indicated in a probabilistic fashion. The potential locations of the missing objects (here: windows) are inferred by the available data and layout patterns with the Monte Carlo (MC) sampling approach. The proposed 3D building model and uncertainty measures are evaluated using the real-world LiDAR point clouds collected by Riegl Mobile Mapping System. The experimental results show that our uncertainty representation conveys the quality information of the estimated locations and shapes for the modelled map objects.
APA, Harvard, Vancouver, ISO, and other styles
35

Tang, Hesheng, Dawei Li, Lixin Deng, and Songtao Xue. "Evidential uncertainty quantification of the Park–Ang damage model in performance based design." Engineering Computations 35, no. 7 (October 1, 2018): 2480–501. http://dx.doi.org/10.1108/ec-11-2017-0466.

Full text
Abstract:
Purpose This paper aims to develop a comprehensive uncertainty quantification method using evidence theory for Park–Ang damage index-based performance design in which epistemic uncertainties are considered. Various sources of uncertainty emanating from the database of the cyclic test results of RC members provided by the Pacific Earthquake Engineering Research Center are taken into account. Design/methodology/approach In this paper, an uncertainty quantification methodology based on evidence theory is presented for the whole process of performance-based seismic design (PBSD), while considering uncertainty in the Park–Ang damage model. To alleviate the burden of high computational cost in propagating uncertainty, the differential evolution interval optimization strategy is used for efficiently finding the propagated belief structure throughout the whole design process. Findings The investigation results of this paper demonstrate that the uncertainties rooted in the Park–Ang damage model have a significant influence on PBSD-based design and evaluation. It might be worth noting that the epistemic uncertainty present in the Park–Ang damage model needs to be considered to avoid underestimating the true uncertainty. Originality/value This paper presents an evidence theory-based uncertainty quantification framework for the whole process of PBSD.
APA, Harvard, Vancouver, ISO, and other styles
36

Gerstoft, Peter, and Ishan D. Khurjekar. "Uncertainty quantification for acoustical problems." Journal of the Acoustical Society of America 155, no. 3_Supplement (March 1, 2024): A213. http://dx.doi.org/10.1121/10.0027342.

Full text
Abstract:
Acoustical parameter estimation is a routine task in many domains and is typically done using signal processing methods. The performance of existing estimation methods is degraded by external uncertainty, and yet the methods provide no measure of confidence in their outputs. Hence it is crucial to quantify uncertainty in the estimates before real-world deployment. Conformal prediction is a simple method to obtain statistically valid prediction intervals from an estimation model. In this work, conformal prediction is used for obtaining statistically valid uncertainty intervals for various acoustical parameter estimation tasks. We consider the tasks of DOA estimation and localization of one or more sources in an acoustical environment. The performance is validated on plane wave data with different sources of uncertainty, including ambient noise, interference, and sensor location uncertainty, using statistical metrics. Results demonstrate that conformal prediction is a suitable and easy-to-use technique to generate statistically valid uncertainty quantification for acoustical estimation tasks.
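For a regression-style estimate such as a DOA angle, split conformal prediction wraps any point estimator with calibrated intervals: compute residuals on held-out calibration data, take their corrected quantile, and pad the point estimate by that amount. A hedged sketch; the linear "estimator" and synthetic data below are stand-ins, not the authors' setup.

```python
import numpy as np

def conformal_interval(predict, X_cal, y_cal, X_test, alpha=0.1):
    """Split conformal regression: (1 - alpha) prediction intervals
    from calibration residuals of an arbitrary point estimator."""
    res = np.abs(y_cal - predict(X_cal))          # calibration residuals
    n = len(y_cal)
    q = np.quantile(res, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")
    y_hat = predict(X_test)
    return y_hat - q, y_hat + q

# toy DOA-style task: noisy linear map from a feature to an angle (degrees)
rng = np.random.default_rng(2)
X_cal = rng.uniform(-1, 1, size=500)
y_cal = 30 * X_cal + rng.normal(0, 2, size=500)
predict = lambda X: 30 * X                        # hypothetical estimator
lo, hi = conformal_interval(predict, X_cal, y_cal, np.array([0.0, 0.5]))
```

The same wrapper applies regardless of whether the underlying estimator is a beamformer, a subspace method, or a neural network, which is what makes the approach attractive for acoustical tasks.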
APA, Harvard, Vancouver, ISO, and other styles
37

Verdonck, H., O. Hach, J. D. Polman, O. Braun, C. Balzani, S. Müller, and J. Rieke. "An open-source framework for the uncertainty quantification of aeroelastic wind turbine simulation tools." Journal of Physics: Conference Series 2265, no. 4 (May 1, 2022): 042039. http://dx.doi.org/10.1088/1742-6596/2265/4/042039.

Full text
Abstract:
Abstract The uncertainty quantification of aeroelastic wind turbine simulations is an active research topic. This paper presents a dedicated, open-source framework for this purpose. The framework is built around the uncertainpy package, likewise available as open source. Uncertainty quantification is done with a non-intrusive, global and variance-based surrogate model, using PCE (i.e., polynomial chaos expansion). Two methods to handle the uncertain parameter distribution along the blades are presented. The framework is demonstrated on the basis of an aeroelastic stability analysis. A sensitivity analysis is performed on the influence of the flapwise, edgewise and torsional stiffness of the blades on the damping of the most critical mode for both a Bladed linearization and a Bladed time domain simulation. The sensitivities of both models are in excellent agreement and the PCE surrogate models are shown to be accurate approximations of the true models.
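A non-intrusive PCE surrogate of the kind described can be fitted by least-squares regression on an orthogonal polynomial basis; the expansion coefficients then give the output mean and variance analytically. A one-dimensional sketch with a single uniform input on [-1, 1] (the toy model function is illustrative; uncertainpy and Bladed are not involved):

```python
import numpy as np

def fit_pce(f, degree=4, n_samples=200, rng=None):
    """Non-intrusive PCE for a scalar model of one uniform input on
    [-1, 1]: regress model samples onto Legendre polynomials."""
    rng = rng or np.random.default_rng(0)
    xi = rng.uniform(-1, 1, n_samples)
    V = np.polynomial.legendre.legvander(xi, degree)   # design matrix
    coeffs, *_ = np.linalg.lstsq(V, f(xi), rcond=None)
    surrogate = lambda x: np.polynomial.legendre.legval(x, coeffs)
    # Orthogonality of the Legendre basis gives the moments directly:
    # E[P_k^2] = 1/(2k+1) for the uniform measure on [-1, 1].
    mean = coeffs[0]
    var = sum(c**2 / (2 * k + 1) for k, c in enumerate(coeffs[1:], start=1))
    return surrogate, mean, var

f = lambda x: 1.0 + 0.5 * x + 0.2 * x**2   # stand-in for the true model
surrogate, mean, var = fit_pce(f)
```

Because the toy model is itself a degree-2 polynomial, the degree-4 expansion reproduces it exactly; for a real aeroelastic code the same regression is run on a modest number of expensive model evaluations.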
APA, Harvard, Vancouver, ISO, and other styles
38

Tian, Yudong, Grey S. Nearing, Christa D. Peters-Lidard, Kenneth W. Harrison, and Ling Tang. "Performance Metrics, Error Modeling, and Uncertainty Quantification." Monthly Weather Review 144, no. 2 (February 1, 2016): 607–13. http://dx.doi.org/10.1175/mwr-d-15-0087.1.

Full text
Abstract:
Abstract A common set of statistical metrics has been used to summarize the performance of models or measurements—the most widely used ones being bias, mean square error, and linear correlation coefficient. They assume linear, additive, Gaussian errors, and they are interdependent, incomplete, and incapable of directly quantifying uncertainty. The authors demonstrate that these metrics can be directly derived from the parameters of the simple linear error model. Since a correct error model captures the full error information, it is argued that the specification of a parametric error model should be an alternative to the metrics-based approach. The error-modeling methodology is applicable to both linear and nonlinear errors, while the metrics are only meaningful for linear errors. In addition, the error model expresses the error structure more naturally, and directly quantifies uncertainty. This argument is further explained by highlighting the intrinsic connections between the performance metrics, the error model, and the joint distribution between the data and the reference.
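The paper's central point, that the common metrics are functions of the linear error model's parameters, can be checked numerically: for y = a + b·x + ε one has bias = a + (b-1)·E[x], MSE = bias² + (b-1)²·Var[x] + σ², and ρ = b·σx / √(b²σx² + σ²). The parameter values below are illustrative, not from the paper.

```python
import numpy as np

def metrics_from_error_model(a, b, sigma_eps, mu_x, var_x):
    """Bias, MSE and linear correlation implied by the error model
    y = a + b*x + eps, eps ~ N(0, sigma_eps^2), for truth x."""
    bias = a + (b - 1) * mu_x
    mse = bias**2 + (b - 1) ** 2 * var_x + sigma_eps**2
    corr = b * np.sqrt(var_x) / np.sqrt(b**2 * var_x + sigma_eps**2)
    return bias, mse, corr

rng = np.random.default_rng(3)
x = rng.normal(10.0, 2.0, 100_000)               # "truth"
a, b, sigma = 1.5, 0.8, 0.5                      # error-model parameters
y = a + b * x + rng.normal(0, sigma, x.size)     # "measurement"
model = metrics_from_error_model(a, b, sigma, x.mean(), x.var())
empirical = ((y - x).mean(),                     # bias
             ((y - x) ** 2).mean(),              # mean square error
             np.corrcoef(x, y)[0, 1])            # correlation
```

The two triples agree to sampling error, which is the paper's argument in miniature: the three metrics carry no information beyond (a, b, σ), so reporting the error-model parameters is at least as informative.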
APA, Harvard, Vancouver, ISO, and other styles
39

Marepally, Koushik, Yong Su Jung, James Baeder, and Ganesh Vijayakumar. "Uncertainty quantification of wind turbine airfoil aerodynamics with geometric uncertainty." Journal of Physics: Conference Series 2265, no. 4 (May 1, 2022): 042041. http://dx.doi.org/10.1088/1742-6596/2265/4/042041.

Full text
Abstract:
Abstract An artificial neural network based reduced order model (ROM) is developed to predict the load coefficients and performance of wind turbine airfoils. The model is trained using a representative database of 972 wind turbine airfoil shapes generated by perturbing the design parameters in each of 12 baseline airfoils defining commercially relevant modern wind turbines. The predictions from our ROM show excellent agreement with the CFD data, with 99th-percentile maximum errors of 0.03 in lift coefficient, 2 in lift-to-drag ratio, and 0.002 in pitching moment coefficient. A Monte Carlo based uncertainty quantification (UQ) and global sensitivity analysis (GSA) framework is developed using this computationally economical ROM. Using UQ, we observed the stall behavior to be very sensitive to geometric uncertainty, with more than 10% deviation in lift coefficient associated with 5% deviation in geometric features. Sobol's analysis identifies the most influential geometric feature for the stall behavior as the maximum thickness location on the airfoil suction surface.
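Monte Carlo propagation of geometric uncertainty through a surrogate, as used here, reduces to sampling perturbed inputs and collecting output statistics. A sketch only: the quadratic "lift" surrogate, its nominal parameters, and the 5% perturbation level below are invented for illustration.

```python
import numpy as np

def mc_uq(surrogate, nominal, rel_sigma, n=10_000, rng=None):
    """Monte Carlo UQ: perturb parameters around their nominal values
    with relative Gaussian noise and summarize the output spread."""
    rng = rng or np.random.default_rng(0)
    X = nominal * (1 + rng.normal(0, rel_sigma, size=(n, len(nominal))))
    y = surrogate(X)
    return y.mean(), y.std()

# hypothetical lift-coefficient surrogate of two shape parameters
cl = lambda X: 1.2 + 4.0 * X[:, 0] - 2.0 * X[:, 1] ** 2
mean, std = mc_uq(cl, nominal=np.array([0.1, 0.3]), rel_sigma=0.05)
```

Ranking which input drives `std` (the paper uses Sobol indices for this) is what turns the spread into an actionable sensitivity statement.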
APA, Harvard, Vancouver, ISO, and other styles
40

Wells, S., A. Plotkowski, J. Coleman, M. Rolchigo, R. Carson, and M. J. M. Krane. "Uncertainty quantification for computational modelling of laser powder bed fusion." IOP Conference Series: Materials Science and Engineering 1281, no. 1 (May 1, 2023): 012024. http://dx.doi.org/10.1088/1757-899x/1281/1/012024.

Full text
Abstract:
Abstract Additive manufacturing (AM) may have many advantages over traditional casting and wrought methods, but our understanding of the various processes is still limited. Computational models are useful to study and isolate underlying physics and improve our understanding of the AM process-microstructure-property relations. However, these models necessarily rely on simplifications and parameters of uncertain value. These assumptions reduce the overall reliability of the predictive capabilities of these models, so it is important to estimate the uncertainty in model output. In doing so, we quantify the effect of model limitations and identify potential areas of improvement, a procedure made possible by uncertainty quantification (UQ). Here we highlight recent work which coupled and propagated statistical and systematic uncertainties from a melt pool transport model based in OpenFOAM through a grain-scale cellular automaton code. We demonstrate how a UQ framework can identify the model parameters which most significantly impact the reliability of predictions across both models, and thus provide insight for future model improvements and suggest measurements to reduce output uncertainty.
APA, Harvard, Vancouver, ISO, and other styles
41

Mannina, Giorgio, and Gaspare Viviani. "An urban drainage stormwater quality model: Model development and uncertainty quantification." Journal of Hydrology 381, no. 3-4 (February 2010): 248–65. http://dx.doi.org/10.1016/j.jhydrol.2009.11.047.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Pirot, Guillaume, Ranee Joshi, Jérémie Giraud, Mark Douglas Lindsay, and Mark Walter Jessell. "loopUI-0.1: indicators to support needs and practices in 3D geological modelling uncertainty quantification." Geoscientific Model Development 15, no. 12 (June 20, 2022): 4689–708. http://dx.doi.org/10.5194/gmd-15-4689-2022.

Full text
Abstract:
Abstract. To support the needs of practitioners regarding 3D geological modelling and uncertainty quantification in the field, in particular from the mining industry, we propose a Python package called loopUI-0.1 that provides a set of local and global indicators to measure uncertainty and feature dissimilarities among an ensemble of voxet models. Results are presented of a survey launched among practitioners in the mineral industry, enquiring about their modelling and uncertainty quantification practice and needs. It reveals that practitioners acknowledge the importance of uncertainty quantification even if they do not perform it. A total of four main factors preventing practitioners from performing uncertainty quantification were identified: a lack of data uncertainty quantification, the (computing) time required to generate one model, poor tracking of assumptions and interpretations, and the relative complexity of uncertainty quantification. The paper reviews and proposes solutions to alleviate these issues. Elements of an answer to these problems are already provided in the special issue hosting this paper and more are expected to come.
APA, Harvard, Vancouver, ISO, and other styles
43

Kaplan, David. "On the Quantification of Model Uncertainty: A Bayesian Perspective." Psychometrika 86, no. 1 (March 2021): 215–38. http://dx.doi.org/10.1007/s11336-021-09754-5.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Jung, Yeong-Ki, Kyoungsik Chang, and Jae Hyun Bae. "Uncertainty Quantification of GEKO Model Coefficients on Compressible Flows." International Journal of Aerospace Engineering 2021 (June 7, 2021): 1–17. http://dx.doi.org/10.1155/2021/9998449.

Full text
Abstract:
In the present work, supersonic flows over an axisymmetric base and a 24-deg compression ramp are investigated using the generalized k-ω (GEKO) model implemented in the commercial software ANSYS FLUENT. GEKO is a two-equation model based on the k-ω formulation, and some specified model coefficients can be tuned depending on the flow characteristics. Uncertainty quantification (UQ) analysis is incorporated to quantify the uncertainty of the model coefficients and to calibrate the coefficients. The Latin hypercube sampling (LHS) method is used for sampling independent input parameters as a uniform distribution. A metamodel is constructed based on generalized polynomial chaos expansion (gPCE) using ordinary least squares (OLS). The influential coefficient closures are identified by using Sobol indices. The affine invariant ensemble algorithm (AIES) is selected to characterize the posterior distribution via Markov chain Monte Carlo sampling. Calibrated model coefficients are extracted from posterior distributions obtained through Bayesian inference, which is based on the point-collocation nonintrusive polynomial chaos (NIPC) method. Results obtained with the Bayesian-calibrated model coefficients show better agreement with available experimental measurements than those obtained with the original coefficients.
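The LHS step mentioned here, one sample per equal-probability stratum in each dimension with strata shuffled independently across dimensions, can be sketched as follows. The coefficient bounds below are illustrative, not the GEKO ranges from the paper.

```python
import numpy as np

def latin_hypercube(n, bounds, rng=None):
    """Latin hypercube sample: split each dimension into n equal bins,
    draw one point per bin, and permute the bins per dimension."""
    rng = rng or np.random.default_rng(0)
    d = len(bounds)
    # One shuffled bin index per (sample, dimension), plus jitter in the bin.
    bins = rng.permuted(np.tile(np.arange(n), (d, 1)), axis=1).T
    u = (bins + rng.random((n, d))) / n            # stratified in [0, 1)
    lo, hi = np.array(bounds).T
    return lo + u * (hi - lo)

# hypothetical ranges for two tunable closure coefficients
samples = latin_hypercube(8, bounds=[(0.7, 2.5), (-2.0, 2.0)])
```

Compared with plain Monte Carlo, the stratification guarantees every marginal range is covered, which keeps the subsequent OLS fit of the gPCE metamodel well conditioned at modest sample counts.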
APA, Harvard, Vancouver, ISO, and other styles
45

Kampouris, Konstantinos, Vassilios Vervatis, John Karagiorgos, and Sarantis Sofianos. "Oil spill model uncertainty quantification using an atmospheric ensemble." Ocean Science 17, no. 4 (July 15, 2021): 919–34. http://dx.doi.org/10.5194/os-17-919-2021.

Full text
Abstract:
Abstract. We investigate the impact of atmospheric forcing uncertainties on the prediction of the dispersion of pollutants in the marine environment. Ensemble simulations consisting of 50 members were carried out using the ECMWF ensemble prediction system and the oil spill model MEDSLIK-II in the Aegean Sea. A deterministic control run using the unperturbed wind of the ECMWF high-resolution system served as reference for the oil spill prediction. We considered oil spill rates and durations similar to major accidents of the past (e.g., the Prestige case), and we performed simulations for different seasons and oil spill types. Oil spill performance metrics and indices were introduced in the context of probabilistic hazard assessment. Results suggest that oil spill model uncertainties were sensitive to the atmospheric forcing uncertainties, especially to phase differences in the intensity and direction of the wind among members. An oil spill ensemble prediction system based on model uncertainty of the atmospheric forcing shows great potential for predicting pathways of oil spill transport alongside a deterministic simulation, increasing the reliability of the model prediction and providing important information for control and mitigation strategies in the event of an oil spill accident.
APA, Harvard, Vancouver, ISO, and other styles
46

Osypov, Konstantin, Yi Yang, Aimé Fournier, Natalia Ivanova, Ran Bachrach, Can Evren Yarman, Yu You, Dave Nichols, and Marta Woodward. "Model-uncertainty quantification in seismic tomography: method and applications." Geophysical Prospecting 61, no. 6 (August 19, 2013): 1114–34. http://dx.doi.org/10.1111/1365-2478.12058.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Islam, Tanvir, Prashant K. Srivastava, and George P. Petropoulos. "Uncertainty Quantification in the Infrared Surface Emissivity Model (ISEM)." IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing 9, no. 12 (December 2016): 5888–92. http://dx.doi.org/10.1109/jstars.2016.2557303.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Liu, D. S., and H. G. Matthies. "Uncertainty quantification with spectral approximations of a flood model." IOP Conference Series: Materials Science and Engineering 10 (June 1, 2010): 012208. http://dx.doi.org/10.1088/1757-899x/10/1/012208.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Li, Mingyang, and Zequn Wang. "Surrogate model uncertainty quantification for reliability-based design optimization." Reliability Engineering & System Safety 192 (December 2019): 106432. http://dx.doi.org/10.1016/j.ress.2019.03.039.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Hariri-Ardebili, M. A., and V. E. Saouma. "Sensitivity and uncertainty quantification of the cohesive crack model." Engineering Fracture Mechanics 155 (April 2016): 18–35. http://dx.doi.org/10.1016/j.engfracmech.2016.01.008.

Full text
APA, Harvard, Vancouver, ISO, and other styles