Selected scientific literature on the topic "Predictive uncertainty quantification"


Consult the list of current articles, books, theses, conference proceedings, and other scholarly sources relevant to the topic "Predictive uncertainty quantification".


Journal articles on the topic "Predictive uncertainty quantification"

1

Cacuci, Dan Gabriel. "Sensitivity Analysis, Uncertainty Quantification and Predictive Modeling of Nuclear Energy Systems". Energies 15, n.º 17 (1 de setembro de 2022): 6379. http://dx.doi.org/10.3390/en15176379.

Abstract:
The Special Issue “Sensitivity Analysis, Uncertainty Quantification and Predictive Modeling of Nuclear Energy Systems” comprises nine articles that present important applications of concepts for performing sensitivity analyses and uncertainty quantifications of models of nuclear energy systems [...]
2

Csillag, Daniel, Lucas Monteiro Paes, Thiago Ramos, João Vitor Romano, Rodrigo Schuller, Roberto B. Seixas, Roberto I. Oliveira e Paulo Orenstein. "AmnioML: Amniotic Fluid Segmentation and Volume Prediction with Uncertainty Quantification". Proceedings of the AAAI Conference on Artificial Intelligence 37, n.º 13 (26 de junho de 2023): 15494–502. http://dx.doi.org/10.1609/aaai.v37i13.26837.

Abstract:
Accurately predicting the volume of amniotic fluid is fundamental to assessing pregnancy risks, though the task usually requires many hours of laborious work by medical experts. In this paper, we present AmnioML, a machine learning solution that leverages deep learning and conformal prediction to output fast and accurate volume estimates and segmentation masks from fetal MRIs with Dice coefficient over 0.9. Also, we make available a novel, curated dataset for fetal MRIs with 853 exams and benchmark the performance of many recent deep learning architectures. In addition, we introduce a conformal prediction tool that yields narrow predictive intervals with theoretically guaranteed coverage, thus aiding doctors in detecting pregnancy risks and saving lives. A successful case study of AmnioML deployed in a medical setting is also reported. Real-world clinical benefits include up to 20x segmentation time reduction, with most segmentations deemed by doctors as not needing any further manual refinement. Furthermore, AmnioML's volume predictions were found to be highly accurate in practice, with mean absolute error below 56mL and tight predictive intervals, showcasing its impact in reducing pregnancy complications.
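As a rough illustration of the split conformal recipe this abstract relies on (hold-out calibration of a nonconformity score to obtain intervals with guaranteed marginal coverage), a minimal Python sketch is given below. It is not the authors' code, and the model and variable names are hypothetical; the paper's tool works on volumes derived from segmentation masks, which is not reproduced here.

    import numpy as np

    def split_conformal_interval(model, X_cal, y_cal, X_test, alpha=0.1):
        # `model` is any fitted regressor with a .predict() method; (X_cal, y_cal)
        # must be held out from the data used to fit it.
        scores = np.sort(np.abs(y_cal - model.predict(X_cal)))
        n = len(scores)
        k = int(np.ceil((n + 1) * (1 - alpha))) - 1   # finite-sample corrected rank
        q_hat = scores[min(k, n - 1)]
        preds = model.predict(X_test)
        return preds - q_hat, preds + q_hat           # lower and upper bounds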
3

Lew, Jiann-Shiun, e Jer-Nan Juang. "Robust Generalized Predictive Control with Uncertainty Quantification". Journal of Guidance, Control, and Dynamics 35, n.º 3 (maio de 2012): 930–37. http://dx.doi.org/10.2514/1.54510.

4

Karimi, Hamed, e Reza Samavi. "Quantifying Deep Learning Model Uncertainty in Conformal Prediction". Proceedings of the AAAI Symposium Series 1, n.º 1 (3 de outubro de 2023): 142–48. http://dx.doi.org/10.1609/aaaiss.v1i1.27492.

Abstract:
Precise estimation of predictive uncertainty in deep neural networks is a critical requirement for reliable decision-making in machine learning and statistical modeling, particularly in the context of medical AI. Conformal Prediction (CP) has emerged as a promising framework for representing model uncertainty by providing well-calibrated confidence levels for individual predictions. However, the quantification of model uncertainty in conformal prediction remains an active research area, yet to be fully addressed. In this paper, we explore state-of-the-art CP methodologies and their theoretical foundations. We propose a probabilistic approach to quantifying the model uncertainty derived from the prediction sets produced in conformal prediction and provide certified boundaries for the computed uncertainty. By doing so, we allow model uncertainty measured by CP to be compared with other uncertainty quantification methods such as Bayesian (e.g., MC-Dropout and DeepEnsemble) and evidential approaches.
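For readers unfamiliar with the prediction sets this abstract builds on, a minimal split conformal classification sketch follows (Python, illustrative only). The set-size proxy noted at the end is a common simplification and is not the certified uncertainty measure proposed by the authors.

    import numpy as np

    def conformal_prediction_sets(cal_probs, cal_labels, test_probs, alpha=0.1):
        # cal_probs: (n_cal, n_classes) softmax outputs on a held-out calibration set;
        # cal_labels: (n_cal,) integer labels; test_probs: (n_test, n_classes).
        cal_scores = 1.0 - cal_probs[np.arange(len(cal_labels)), cal_labels]
        n = len(cal_scores)
        k = int(np.ceil((n + 1) * (1 - alpha))) - 1
        q_hat = np.sort(cal_scores)[min(k, n - 1)]
        # Keep every class whose softmax score clears the calibrated threshold.
        return test_probs >= 1.0 - q_hat

    # A crude per-example proxy: larger prediction sets suggest higher model
    # uncertainty (not the certified measure derived in the paper).
    # set_sizes = conformal_prediction_sets(cal_p, cal_y, test_p).sum(axis=1)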
5

Akitaya, Kento, e Masaatsu Aichi. "Land Subsidence Model Inversion with the Estimation of Both Model Parameter Uncertainty and Predictive Uncertainty Using an Evolutionary-Based Data Assimilation (EDA) and Ensemble Model Output Statistics (EMOS)". Water 16, n.º 3 (28 de janeiro de 2024): 423. http://dx.doi.org/10.3390/w16030423.

Abstract:
The nonlinear nature of land subsidence and the limited number of observations cause premature convergence in typical data assimilation methods, leading to both underestimation and miscalculation of uncertainty in model parameters and predictions. This study focuses on a promising approach, the combination of evolutionary-based data assimilation (EDA) and ensemble model output statistics (EMOS), to investigate its performance in land subsidence modeling, using EDA with a smoothing approach for parameter uncertainty quantification and EMOS for predictive uncertainty quantification. The methodology was tested on a one-dimensional subsidence model in Kawajima (Japan). The results confirmed the EDA's robust capability: model diversity was maintained even after 1000 assimilation cycles on the same dataset, and the obtained parameter distributions were consistent with the soil types. The ensemble predictions were converted to Gaussian predictions with EMOS using statistics of past observations. The Gaussian predictions outperformed the ensemble predictions in predictive performance because EMOS compensated for the over/under-dispersive prediction spread and the short-term bias, a potential weakness of the smoothing approach. This case study demonstrates that combining EDA and EMOS contributes to groundwater management for land subsidence control, considering both the model parameter uncertainty and the predictive uncertainty.
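The EMOS step described above maps an ensemble forecast to a single Gaussian predictive distribution whose mean and variance are regressed on the ensemble mean and variance. A minimal sketch of one common Gaussian EMOS formulation follows (Python); it is fitted here by minimizing the negative log score rather than the CRPS and is not the exact configuration used in the paper.

    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    def fit_emos(ens_mean, ens_var, obs):
        # Gaussian EMOS: y ~ N(a + b*ens_mean, c + d*ens_var), with (a, b, c, d)
        # estimated from past forecast-observation pairs.
        def neg_log_score(theta):
            a, b, c, d = theta
            mu = a + b * ens_mean
            var = np.maximum(c + d * ens_var, 1e-6)   # keep the variance positive
            return -np.sum(norm.logpdf(obs, loc=mu, scale=np.sqrt(var)))
        return minimize(neg_log_score, x0=np.array([0.0, 1.0, 1.0, 1.0]),
                        method="Nelder-Mead").x

    def emos_predict(theta, ens_mean, ens_var):
        a, b, c, d = theta
        return a + b * ens_mean, np.maximum(c + d * ens_var, 1e-6)  # mean, variance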
6

Singh, Rishabh, e Jose C. Principe. "Toward a Kernel-Based Uncertainty Decomposition Framework for Data and Models". Neural Computation 33, n.º 5 (13 de abril de 2021): 1164–98. http://dx.doi.org/10.1162/neco_a_01372.

Abstract:
This letter introduces a new framework for quantifying predictive uncertainty for both data and models that relies on projecting the data into a Gaussian reproducing kernel Hilbert space (RKHS) and transforming the data probability density function (PDF) in a way that quantifies the flow of its gradient as a topological potential field (quantified at all points in the sample space). This enables the decomposition of the PDF gradient flow by formulating it as a moment decomposition problem using operators from quantum physics, specifically Schrödinger's formulation. We experimentally show that the higher-order moments systematically cluster the different tail regions of the PDF, thereby providing unprecedented discriminative resolution of data regions having high epistemic uncertainty. In essence, this approach decomposes local realizations of the data PDF in terms of uncertainty moments. We apply this framework as a surrogate tool for predictive uncertainty quantification of point-prediction neural network models, overcoming various limitations of conventional Bayesian-based uncertainty quantification methods. Experimental comparisons with some established methods illustrate performance advantages that our framework exhibits.
7

Chen, Peng, e Nicholas Zabaras. "Adaptive Locally Weighted Projection Regression Method for Uncertainty Quantification". Communications in Computational Physics 14, n.º 4 (outubro de 2013): 851–78. http://dx.doi.org/10.4208/cicp.060712.281212a.

Abstract:
We develop an efficient, adaptive locally weighted projection regression (ALWPR) framework for uncertainty quantification (UQ) of systems governed by ordinary and partial differential equations. The algorithm adaptively selects the new input points with the largest predictive variance and decides when and where to add new local models. It effectively learns the local features and accurately quantifies the uncertainty in the prediction of the statistics. The developed methodology provides predictions and confidence intervals at any query input and can deal with multi-output cases. Numerical examples are presented to show the accuracy and efficiency of the ALWPR framework including problems with non-smooth local features such as discontinuities in the stochastic space.
8

Omagbon, Jericho, John Doherty, Angus Yeh, Racquel Colina, John O'Sullivan, Julian McDowell, Ruanui Nicholson, Oliver J. Maclaren e Michael O'Sullivan. "Case studies of predictive uncertainty quantification for geothermal models". Geothermics 97 (dezembro de 2021): 102263. http://dx.doi.org/10.1016/j.geothermics.2021.102263.

9

Nitschke, C. T., P. Cinnella, D. Lucor e J. C. Chassaing. "Model-form and predictive uncertainty quantification in linear aeroelasticity". Journal of Fluids and Structures 73 (agosto de 2017): 137–61. http://dx.doi.org/10.1016/j.jfluidstructs.2017.05.007.

10

Mirzayeva, A., N. A. Slavinskaya, M. Abbasi, J. H. Starcke, W. Li e M. Frenklach. "Uncertainty Quantification in Chemical Modeling". Eurasian Chemico-Technological Journal 20, n.º 1 (31 de março de 2018): 33. http://dx.doi.org/10.18321/ectj706.

Abstract:
A module of the PrIMe automated data-centric infrastructure, Bound-to-Bound Data Collaboration (B2BDC), was used for the analysis of systematic uncertainty and data consistency of the H2/CO reaction model (73/17). To achieve this purpose, a dataset of 167 experimental targets (ignition delay time and laminar flame speed) and 55 active model parameters (pre-exponent factors in the Arrhenius form of the reaction rate coefficients) was constructed. Consistency analysis of experimental data from the composed dataset revealed disagreement between models and data. Two consistency measures were applied to identify the quality of experimental targets (Quantities of Interest, QoI): a scalar consistency measure, which quantifies the tightening index of the constraints while still ensuring the existence of a set of model parameter values whose associated modeling output predicts the experimental QoIs within the uncertainty bounds; and a newly developed method of computing the vector consistency measure (VCM), which determines the minimal bound changes for QoIs initially identified as inconsistent, each bound by its own extent, while still ensuring the existence of a set of model parameter values whose associated modeling output predicts the experimental QoIs within the uncertainty bounds. The consistency analysis suggested that elimination of 45 experimental targets, 8 of which were self-inconsistent, would lead to a consistent dataset. The feasible parameter set was then constructed by decreasing the uncertainties of several reaction rate coefficients, and this dataset was used for model optimization and analysis within the B2BDC framework. Four methods of parameter optimization were applied, including those unique to the B2BDC framework. The optimized models showed improved agreement with experimental values, as compared to the initially assembled model. Moreover, predictions for experiments not included in the initial dataset were investigated. The results demonstrate the benefits of applying the B2BDC methodology for the development of predictive kinetic models.

Theses / dissertations on the topic "Predictive uncertainty quantification"

1

Lonsdale, Jack Henry. "Predictive modelling and uncertainty quantification of UK forest growth". Thesis, University of Edinburgh, 2015. http://hdl.handle.net/1842/16202.

Abstract:
Forestry in the UK is dominated by coniferous plantations. Sitka spruce (Picea sitchensis) and Scots pine (Pinus sylvestris) are the most prevalent species and are mostly grown in single age mono-culture stands. Forest strategy for Scotland, England, and Wales all include efforts to achieve further afforestation. The aim of this afforestation is to provide a multi-functional forest with a broad range of benefits. Due to the time scale involved in forestry, accurate forecasts of stand productivity (along with clearly defined uncertainties) are essential to forest managers. These can be provided by a range of approaches to modelling forest growth. In this project model comparison, Bayesian calibration, and data assimilation methods were all used to attempt to improve forecasts and understanding of uncertainty therein of the two most important conifers in UK forestry. Three different forest growth models were compared in simulating growth of Scots pine. A yield table approach, the process-based 3PGN model, and a Stand Level Dynamic Growth (SLeDG) model were used. Predictions were compared graphically over the typical productivity range for Scots pine in the UK. Strengths and weaknesses of each model were considered. All three produced similar growth trajectories. The greatest difference between models was in volume and biomass in unthinned stands where the yield table predicted a much larger range compared to the other two models. Future advances in data availability and computing power should allow for greater use of process-based models, but in the interim more flexible dynamic growth models may be more useful than static yield tables for providing predictions which extend to non-standard management prescriptions and estimates of early growth and yield. A Bayesian calibration of the SLeDG model was carried out for both Sitka spruce and Scots pine in the UK for the first time. Bayesian calibrations allow both model structure and parameters to be assessed simultaneously in a probabilistic framework, providing a model with which forecasts and their uncertainty can be better understood and quantified using posterior probability distributions. Two different structures for including local productivity in the model were compared with a Bayesian model comparison. A complete calibration of the more probable model structure was then completed. Example forecasts from the calibration were compatible with existing yield tables for both species. This method could be applied to other species or other model structures in the future. Finally, data assimilation was investigated as a way of reducing forecast uncertainty. Data assimilation assumes that neither observations nor models provide a perfect description of a system, but combining them may provide the best estimate. SLeDG model predictions and LiDAR measurements for sub-compartments within Queen Elizabeth Forest Park were combined with an Ensemble Kalman Filter. Uncertainty was reduced following the second data assimilation in all of the state variables. However, errors in stand delineation and estimated stand yield class may have caused observational uncertainty to be greater thus reducing the efficacy of the method for reducing overall uncertainty.
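The data assimilation step mentioned in this abstract combines model predictions with LiDAR observations through an Ensemble Kalman Filter. The sketch below shows a generic stochastic EnKF analysis step (Python, illustrative only); the state vector, observation operator, and error covariances are hypothetical placeholders rather than the thesis configuration.

    import numpy as np

    def enkf_update(ensemble, obs, obs_operator, obs_cov, rng=None):
        # Stochastic Ensemble Kalman Filter analysis step.
        # ensemble:     (n_members, n_state) forecast ensemble of state vectors.
        # obs:          (n_obs,) observation vector (e.g., LiDAR-derived stand metrics).
        # obs_operator: (n_obs, n_state) linear map from state space to observation space.
        # obs_cov:      (n_obs, n_obs) observation-error covariance.
        rng = np.random.default_rng() if rng is None else rng
        n_members = ensemble.shape[0]
        X = ensemble - ensemble.mean(axis=0)          # state anomalies
        HX = ensemble @ obs_operator.T                # ensemble mapped to observation space
        HA = HX - HX.mean(axis=0)
        P_xy = X.T @ HA / (n_members - 1)             # state-observation covariance
        P_yy = HA.T @ HA / (n_members - 1) + obs_cov  # innovation covariance
        K = P_xy @ np.linalg.inv(P_yy)                # Kalman gain
        # Perturb the observations so the analysis ensemble keeps the right spread.
        perturbed = obs + rng.multivariate_normal(np.zeros(len(obs)), obs_cov, n_members)
        return ensemble + (perturbed - HX) @ K.T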
2

Gligorijevic, Djordje. "Predictive Uncertainty Quantification and Explainable Machine Learning in Healthcare". Diss., Temple University Libraries, 2018. http://cdm16002.contentdm.oclc.org/cdm/ref/collection/p245801coll10/id/520057.

Abstract:
Computer and Information Science
Ph.D.
Predictive modeling is an ever-increasingly important part of decision making. The advances in Machine Learning predictive modeling have spread across many domains bringing significant improvements in performance and providing unique opportunities for novel discoveries. A notably important domains of the human world are medical and healthcare domains, which take care of peoples' wellbeing. And while being one of the most developed areas of science with active research, there are many ways they can be improved. In particular, novel tools developed based on Machine Learning theory have drawn benefits across many areas of clinical practice, pushing the boundaries of medical science and directly affecting well-being of millions of patients. Additionally, healthcare and medicine domains require predictive modeling to anticipate and overcome many obstacles that future may hold. These kinds of applications employ a precise decision--making processes which requires accurate predictions. However, good prediction by its own is often insufficient. There has been no major focus in developing algorithms with good quality uncertainty estimates. Ergo, this thesis aims at providing a variety of ways to incorporate solutions by learning high quality uncertainty estimates or providing interpretability of the models where needed for purpose of improving existing tools built in practice and allowing many other tools to be used where uncertainty is the key factor for decision making. The first part of the thesis proposes approaches for learning high quality uncertainty estimates for both short- and long-term predictions in multi-task learning, developed on top for continuous probabilistic graphical models. In many scenarios, especially in long--term predictions, it may be of great importance for the models to provide a reliability flag in order to be accepted by domain experts. To this end we explored a widely applied structured regression model with a goal of providing meaningful uncertainty estimations on various predictive tasks. Our particular interest is in modeling uncertainty propagation while predicting far in the future. To address this important problem, our approach centers around providing an uncertainty estimate by modeling input features as random variables. This allows modeling uncertainty from noisy inputs. In cases when model iteratively produces errors it should propagate uncertainty over the predictive horizon, which may provide invaluable information for decision making based on predictions. In the second part of the thesis we propose novel neural embedding models for learning low-dimensional embeddings of medical concepts, such are diseases and genes, and show how they can be interpreted to allow accessing their quality, and show how can they be used to solve many problems in medical and healthcare research. We use EHR data to discover novel relationships between diseases by studying their comorbidities (i.e., co-occurrences in patients). We trained our models on a large-scale EHR database comprising more than 35 million inpatient cases. To confirm value and potential of the proposed approach we evaluate its effectiveness on a held-out set. Furthermore, for select diseases we provide a candidate gene list for which disease-gene associations were not studied previously, allowing biomedical researchers to better focus their often very costly lab studies. 
We furthermore examine how disease heterogeneity can affect the quality of learned embeddings and propose an approach for learning types of such heterogeneous diseases, while in our study we primarily focus on learning types of sepsis. Finally, we evaluate the quality of low-dimensional embeddings on tasks of predicting hospital quality indicators such as length of stay, total charges and mortality likelihood, demonstrating their superiority over other approaches. In the third part of the thesis we focus on decision making in medicine and healthcare domain by developing state-of-the-art deep learning models capable of outperforming human performance while maintaining good interpretability and uncertainty estimates.
Temple University--Theses
3

Zaffran, Margaux. "Post-hoc predictive uncertainty quantification : methods with applications to electricity price forecasting". Electronic Thesis or Diss., Institut polytechnique de Paris, 2024. http://www.theses.fr/2024IPPAX033.

Abstract:
The surge of more and more powerful statistical learning algorithms offers promising prospects for electricity price forecasting. However, these methods provide point forecasts, with no indication of the degree of confidence to be placed in them. To ensure the safe deployment of these predictive models, it is crucial to quantify their predictive uncertainty. This PhD thesis focuses on developing predictive intervals for any underlying algorithm. While motivated by the electrical sector, the methods developed, based on Split Conformal Prediction (SCP), are generic: they can be applied in many sensitive fields. First, this thesis studies post-hoc predictive uncertainty quantification for time series. The first bottleneck to applying SCP in order to obtain guaranteed probabilistic electricity price forecasts in a post-hoc fashion is the highly non-stationary temporal aspect of electricity prices, which breaks the exchangeability assumption. The first contribution proposes a parameter-free algorithm tailored for time series, based on a theoretical analysis of the efficiency of the existing Adaptive Conformal Inference method. The second contribution conducts an extensive application study on a novel data set of recent turbulent French spot prices in 2020 and 2021. Another challenge is missing values (NAs). In a second part, this thesis analyzes the interplay between NAs and predictive uncertainty quantification. The third contribution highlights that NAs induce heteroskedasticity, leading to uneven coverage depending on which features are observed. Two algorithms recovering equalized coverage for any NA pattern, under distributional assumptions on the missingness mechanism, are designed. The fourth contribution pushes forward the theoretical analysis to understand precisely which distributional assumptions are unavoidable for theoretical informativeness. It also unifies the previously proposed algorithms into a general framework that demonstrates empirical robustness to violations of the assumed missingness distribution.
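The Adaptive Conformal Inference method analyzed in the first contribution adjusts the working miscoverage level online according to the observed coverage errors. A minimal sketch of the basic update of Gibbs and Candès (2021) follows (Python, illustrative only); the parameter-free aggregation developed in the thesis is built on top of this rule and is not reproduced here.

    import numpy as np

    def adaptive_conformal_inference(y_pred, y_true, past_scores, alpha=0.1, gamma=0.01):
        # Online ACI: after each step, widen the interval if it missed the
        # observation and tighten it if it covered, by moving the working level.
        alpha_t = alpha
        scores = list(past_scores)                    # absolute residuals seen so far
        intervals = []
        for pred, truth in zip(y_pred, y_true):
            level = min(max(1.0 - alpha_t, 0.0), 1.0)
            q_t = np.quantile(scores, level) if scores else np.inf
            intervals.append((pred - q_t, pred + q_t))
            err_t = 0.0 if abs(truth - pred) <= q_t else 1.0
            alpha_t += gamma * (alpha - err_t)
            scores.append(abs(truth - pred))
        return intervals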
4

Riley, Matthew E. "Quantification of Model-Form, Predictive, and Parametric Uncertainties in Simulation-Based Design". Wright State University / OhioLINK, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=wright1314895435.

5

Freeman, Jacob Andrew. "Optimization Under Uncertainty and Total Predictive Uncertainty for a Tractor-Trailer Base-Drag Reduction Device". Diss., Virginia Tech, 2012. http://hdl.handle.net/10919/77168.

Abstract:
One key outcome of this research is the design for a 3-D tractor-trailer base-drag reduction device that predicts a 41% reduction in wind-averaged drag coefficient at 57 mph (92 km/h) and that is relatively insensitive to uncertain wind speed and direction and uncertain deflection angles due to mounting accuracy and static aeroelastic loading; the best commercial device of non-optimized design achieves a 12% reduction at 65 mph. Another important outcome is the process by which the optimized design is obtained. That process includes verification and validation of the flow solver, a less complex but much broader 2-D pathfinder study, and the culminating 3-D aerodynamic shape optimization under uncertainty (OUU) study. To gain confidence in the accuracy and precision of a computational fluid dynamics (CFD) flow solver and its Reynolds-averaged Navier-Stokes (RANS) turbulence models, it is necessary to conduct code verification, solution verification, and model validation. These activities are accomplished using two commercial CFD solvers, Cobalt and RavenCFD, with four turbulence models: Spalart-Allmaras (S-A), S-A with rotation and curvature, Menter shear-stress transport (SST), and Wilcox 1998 k-ω. Model performance is evaluated for three low subsonic 2-D applications: turbulent flat plate, planar jet, and NACA 0012 airfoil at α = 0°. The S-A turbulence model is selected for the 2-D OUU study. In the 2-D study, a tractor-trailer base flap model is developed that includes six design variables with generous constraints; 400 design candidates are evaluated. The design optimization loop includes the effect of uncertain wind speed and direction, and post processing addresses several other uncertain effects on drag prediction. The study compares the efficiency and accuracy of two optimization algorithms, evolutionary algorithm (EA) and dividing rectangles (DIRECT), twelve surrogate models, six sampling methods, and surrogate-based global optimization (SBGO) methods. The DAKOTA optimization and uncertainty quantification framework is used to interface the RANS flow solver, grid generator, and optimization algorithm. The EA is determined to be more efficient in obtaining a design with significantly reduced drag (as opposed to more efficient in finding the true drag minimum), and total predictive uncertainty is estimated as ±11%. While the SBGO methods are more efficient than a traditional optimization algorithm, they are computationally inefficient due to their serial nature, as implemented in DAKOTA. Because the S-A model does well in 2-D but not in 3-D under these conditions, the SST turbulence model is selected for the 3-D OUU study that includes five design variables and evaluates a total of 130 design candidates. Again using the EA, the study propagates aleatory (wind speed and direction) and epistemic (perturbations in flap deflection angle) uncertainty within the optimization loop and post processes several other uncertain effects. For the best 3-D design, total predictive uncertainty is +15/-42%, due largely to using a relatively coarse (six million cell) grid. That is, the best design drag coefficient estimate is within 15 and 42% of the true value; however, its improvement relative to the no-flaps baseline is accurate within 3-9% uncertainty.
Ph. D.
6

Wu, Jinlong. "Predictive Turbulence Modeling with Bayesian Inference and Physics-Informed Machine Learning". Diss., Virginia Tech, 2018. http://hdl.handle.net/10919/85129.

Abstract:
Reynolds-Averaged Navier-Stokes (RANS) simulations are widely used for engineering design and analysis involving turbulent flows. In RANS simulations, the Reynolds stress needs closure models and the existing models have large model-form uncertainties. Therefore, the RANS simulations are known to be unreliable in many flows of engineering relevance, including flows with three-dimensional structures, swirl, pressure gradients, or curvature. This lack of accuracy in complex flows has diminished the utility of RANS simulations as a predictive tool for engineering design, analysis, optimization, and reliability assessments. Recently, data-driven methods have emerged as a promising alternative to develop the model of Reynolds stress for RANS simulations. In this dissertation I explore two physics-informed, data-driven frameworks to improve RANS modeled Reynolds stresses. First, a Bayesian inference framework is proposed to quantify and reduce the model-form uncertainty of RANS modeled Reynolds stress by leveraging online sparse measurement data with empirical prior knowledge. Second, a machine-learning-assisted framework is proposed to utilize offline high-fidelity simulation databases. Numerical results show that the data-driven RANS models have better prediction of Reynolds stress and other quantities of interest for several canonical flows. Two metrics are also presented for an a priori assessment of the prediction confidence for the machine-learning-assisted RANS model. The proposed data-driven methods are also applicable to the computational study of other physical systems whose governing equations have some unresolved physics to be modeled.
Ph. D.
7

Cortesi, Andrea Francesco. "Predictive numerical simulations for rebuilding freestream conditions in atmospheric entry flows". Thesis, Bordeaux, 2018. http://www.theses.fr/2018BORD0021/document.

Abstract:
Accurate prediction of hypersonic high-enthalpy flows is of main relevance for atmospheric entry missions. However, uncertainties are inevitable on freestream conditions and other parameters of the physico-chemical models. For this reason, a rigorous quantification of the effect of uncertainties is mandatory to assess the robustness and predictivity of numerical simulations. Furthermore, a proper reconstruction of uncertain parameters from in-flight measurements can help reducing the level of uncertainties of the output. In this work, we will use a statistical framework for direct propagation of uncertainties and inverse freestream reconstruction applied to atmospheric entry flows. We propose an assessment of the possibility of exploiting forebody heat flux measurements for the reconstruction of freestream variables and uncertain parameters of the model for hypersonic entry flows. This reconstruction is performed in a Bayesian framework, allowing to account for sources of uncertainties and measurement errors. Different techniques are introduced to enhance the capabilities of the statistical framework for quantification of uncertainties. First, an improved surrogate modeling technique is proposed, based on Kriging and Sparse Polynomial Dimensional Decomposition. Then a method is proposed to adaptively add new training points to an existing experimental design to improve the accuracy of the trained surrogate model. A way to exploit active subspaces in Markov Chain Monte Carlo algorithms for Bayesian inverse problems is also proposed
8

Erbas, Demet. "Sampling strategies for uncertainty quantification in oil recovery prediction". Thesis, Heriot-Watt University, 2007. http://hdl.handle.net/10399/70.

9

Whiting, Nolan Wagner. "Assessment of Model Validation, Calibration, and Prediction Approaches in the Presence of Uncertainty". Thesis, Virginia Tech, 2019. http://hdl.handle.net/10919/91903.

Abstract:
Model validation is the process of determining the degree to which a model is an accurate representation of the true value in the real world. The results of a model validation study can be used to either quantify the model form uncertainty or to improve/calibrate the model. However, the model validation process can become complicated if there is uncertainty in the simulation and/or experimental outcomes. These uncertainties can be in the form of aleatory uncertainties due to randomness or epistemic uncertainties due to lack of knowledge. Four different approaches are used for addressing model validation and calibration: 1) the area validation metric (AVM), 2) a modified area validation metric (MAVM) with confidence intervals, 3) the standard validation uncertainty from ASME VandV 20, and 4) Bayesian updating of a model discrepancy term. Details are given for the application of the MAVM for accounting for small experimental sample sizes. To provide an unambiguous assessment of these different approaches, synthetic experimental values were generated from computational fluid dynamics simulations of a multi-element airfoil. A simplified model was then developed using thin airfoil theory. This simplified model was then assessed using the synthetic experimental data. The quantities examined include the two dimensional lift and moment coefficients for the airfoil with varying angles of attack and flap deflection angles. Each of these validation/calibration approaches will be assessed for their ability to tightly encapsulate the true value in nature at locations both where experimental results are provided and prediction locations where no experimental data are available. Generally it was seen that the MAVM performed the best in cases where there is a sparse amount of data and/or large extrapolations and Bayesian calibration outperformed the others where there is an extensive amount of experimental data that covers the application domain.
Master of Science
Uncertainties often exists when conducting physical experiments, and whether this uncertainty exists due to input uncertainty, uncertainty in the environmental conditions in which the experiment takes place, or numerical uncertainty in the model, it can be difficult to validate and compare the results of a model with those of an experiment. Model validation is the process of determining the degree to which a model is an accurate representation of the true value in the real world. The results of a model validation study can be used to either quantify the uncertainty that exists within the model or to improve/calibrate the model. However, the model validation process can become complicated if there is uncertainty in the simulation (model) and/or experimental outcomes. These uncertainties can be in the form of aleatory (uncertainties which a probability distribution can be applied for likelihood of drawing values) or epistemic uncertainties (no knowledge, inputs drawn within an interval). Four different approaches are used for addressing model validation and calibration: 1) the area validation metric (AVM), 2) a modified area validation metric (MAVM) with confidence intervals, 3) the standard validation uncertainty from ASME V&V 20, and 4) Bayesian updating of a model discrepancy term. Details are given for the application of the MAVM for accounting for small experimental sample sizes. To provide an unambiguous assessment of these different approaches, synthetic experimental values were generated from computational fluid dynamics(CFD) simulations of a multi-element airfoil. A simplified model was then developed using thin airfoil theory. This simplified model was then assessed using the synthetic experimental data. The quantities examined include the two dimensional lift and moment coefficients for the airfoil with varying angles of attack and flap deflection angles. Each of these validation/calibration approaches will be assessed for their ability to tightly encapsulate the true value in nature at locations both where experimental results are provided and prediction locations where no experimental data are available. Also of interest was to assess how well each method could predict the uncertainties about the simulation outside of the region in which experimental observations were made, and model form uncertainties could be observed.
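The area validation metric (AVM) referred to above is the area between the empirical CDF of the simulation outcomes and the empirical CDF of the experimental observations. A minimal sketch of the basic metric follows (Python, illustrative only); the modified AVM with confidence intervals studied in the thesis adds further machinery for small experimental samples.

    import numpy as np

    def area_validation_metric(sim_samples, exp_samples):
        # Area between the empirical CDFs of simulation outputs and experiments.
        sim = np.sort(np.asarray(sim_samples, dtype=float))
        exp = np.sort(np.asarray(exp_samples, dtype=float))
        grid = np.union1d(sim, exp)
        F_sim = np.searchsorted(sim, grid, side="right") / sim.size
        F_exp = np.searchsorted(exp, grid, side="right") / exp.size
        # Both CDFs are step functions, constant between consecutive grid points.
        widths = np.diff(grid)
        return float(np.sum(np.abs(F_sim[:-1] - F_exp[:-1]) * widths))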
10

Phadnis, Akash. "Uncertainty quantification and prediction for non-autonomous linear and nonlinear systems". Thesis, Massachusetts Institute of Technology, 2013. http://hdl.handle.net/1721.1/85476.

Abstract:
Thesis: S.M., Massachusetts Institute of Technology, Department of Mechanical Engineering, 2013.
The science of uncertainty quantification has gained a lot of attention over recent years. This is because models of real processes always contain some elements of uncertainty, and also because real systems can be better described using stochastic components. Stochastic models can therefore be utilized to provide a most informative prediction of possible future states of the system. In light of the multiple scales, nonlinearities and uncertainties in ocean dynamics, stochastic models can be most useful to describe ocean systems. Uncertainty quantification schemes developed in recent years include order reduction methods (e.g. proper orthogonal decomposition (POD)), error subspace statistical estimation (ESSE), polynomial chaos (PC) schemes and dynamically orthogonal (DO) field equations. In this thesis, we focus our attention on DO and various PC schemes for quantifying and predicting uncertainty in systems with external stochastic forcing. We develop and implement these schemes in a generic stochastic solver for a class of non-autonomous linear and nonlinear dynamical systems. This class of systems encapsulates most systems encountered in classic nonlinear dynamics and ocean modeling, including flows modeled by Navier-Stokes equations. We first study systems with uncertainty in input parameters (e.g. stochastic decay models and Kraichnan-Orszag system) and then with external stochastic forcing (autonomous and non-autonomous self-engineered nonlinear systems). For time-integration of system dynamics, stochastic numerical schemes of varied order are employed and compared. Using our generic stochastic solver, the Monte Carlo, DO and polynomial chaos schemes are inter-compared in terms of accuracy of solution and computational cost. To allow accurate time-integration of uncertainty due to external stochastic forcing, we also derive two novel PC schemes, namely, the reduced space KLgPC scheme and the modified TDgPC (MTDgPC) scheme. We utilize a set of numerical examples to show that the two new PC schemes and the DO scheme can integrate both additive and multiplicative stochastic forcing over significant time intervals. For the final example, we consider shallow water ocean surface waves and the modeling of these waves by deterministic dynamics and stochastic forcing components. Specifically, we time-integrate the Korteweg-de Vries (KdV) equation with external stochastic forcing, comparing the performance of the DO and Monte Carlo schemes. We find that the DO scheme is computationally efficient to integrate uncertainty in such systems with external stochastic forcing.

Books on the topic "Predictive uncertainty quantification"

1

McClarren, Ryan G. Uncertainty Quantification and Predictive Computational Science. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-99525-0.

2

Eva, Boegh, e International Association of Hydrological Sciences., eds. Quantification and reduction of predictive uncertainty for sustainable water resources management: Proceedings of an international symposium [held] during IUGG2007, the XXIV General Assembly of the International Union of Geodesy and Geophysics at Perugia, Italy, July 2007. Wallingford: International Association of Hydrological Sciences, 2007.

3

Harrington, Matthew R. Predicting and Understanding the Presence of Water through Remote Sensing, Machine Learning, and Uncertainty Quantification. [New York, N.Y.?]: [publisher not identified], 2022.

4

Hemez, François, e Sez Atamturktur. Predictive Modelling: Verification, Validation and Uncertainty Quantification. Wiley & Sons, Limited, John, 2018.

5

McClarren, Ryan G. Uncertainty Quantification and Predictive Computational Science: A Foundation for Physical Scientists and Engineers. Springer, 2018.

6

Anderson, Mark, Francois Hemez e Scott Doebling. Model Verification and Validation in Engineering Mechanics: Theory and Applications of Uncertainty Quantification and Predictive Accuracy. John Wiley & Sons, 2005.

7

Model Verification and Validation in Engineering Mechanics: Theory and Applications of Uncertainty Quantification and Predictive Accuracy. Wiley & Sons, Limited, John, 2004.

8

Sanderson, Benjamin Mark. Uncertainty Quantification in Multi-Model Ensembles. Oxford University Press, 2018. http://dx.doi.org/10.1093/acrefore/9780190228620.013.707.

Abstract:
Long-term planning for many sectors of society—including infrastructure, human health, agriculture, food security, water supply, insurance, conflict, and migration—requires an assessment of the range of possible futures which the planet might experience. Unlike short-term forecasts for which validation data exists for comparing forecast to observation, long-term forecasts have almost no validation data. As a result, researchers must rely on supporting evidence to make their projections. A review of methods for quantifying the uncertainty of climate predictions is given. The primary tool for quantifying these uncertainties are climate models, which attempt to model all the relevant processes that are important in climate change. However, neither the construction nor calibration of climate models is perfect, and therefore the uncertainties due to model errors must also be taken into account in the uncertainty quantification.Typically, prediction uncertainty is quantified by generating ensembles of solutions from climate models to span possible futures. For instance, initial condition uncertainty is quantified by generating an ensemble of initial states that are consistent with available observations and then integrating the climate model starting from each initial condition. A climate model is itself subject to uncertain choices in modeling certain physical processes. Some of these choices can be sampled using so-called perturbed physics ensembles, whereby uncertain parameters or structural switches are perturbed within a single climate model framework. For a variety of reasons, there is a strong reliance on so-called ensembles of opportunity, which are multi-model ensembles (MMEs) formed by collecting predictions from different climate modeling centers, each using a potentially different framework to represent relevant processes for climate change. The most extensive collection of these MMEs is associated with the Coupled Model Intercomparison Project (CMIP). However, the component models have biases, simplifications, and interdependencies that must be taken into account when making formal risk assessments. Techniques and concepts for integrating model projections in MMEs are reviewed, including differing paradigms of ensembles and how they relate to observations and reality. Aspects of these conceptual issues then inform the more practical matters of how to combine and weight model projections to best represent the uncertainties associated with projected climate change.
9

Chen, Nan. Stochastic Methods for Modeling and Predicting Complex Dynamical Systems: Uncertainty Quantification, State Estimation, and Reduced-Order Models. Springer International Publishing AG, 2023.


Book chapters on the topic "Predictive uncertainty quantification"

1

Svensson, Emma, Hannah Rosa Friesacher, Adam Arany, Lewis Mervin e Ola Engkvist. "Temporal Evaluation of Uncertainty Quantification Under Distribution Shift". In Lecture Notes in Computer Science, 132–48. Cham: Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-72381-0_11.

Abstract:
Uncertainty quantification is emerging as a critical tool in high-stakes decision-making processes, where trust in automated predictions that lack accuracy and precision can be time-consuming and costly. In drug discovery, such high-stakes decisions are based on modeling the properties of potential drug compounds on biological assays. So far, existing uncertainty quantification methods have primarily been evaluated using public datasets that lack the temporal context necessary to understand their performance over time. In this work, we address the pressing need for a comprehensive, large-scale temporal evaluation of uncertainty quantification methodologies in the context of assay-based molecular property prediction. Our novel framework benchmarks three ensemble-based approaches to uncertainty quantification and explores the effect of adding lower-quality data during training in the form of censored labels. We investigate the robustness of the predictive performance and the calibration and reliability of predictive uncertainty by the models as time evolves. Moreover, we explore how the predictive uncertainty behaves in response to varying degrees of distribution shift. By doing so, our analysis not only advances the field but also provides practical implications for real-world pharmaceutical applications.
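The ensemble-based uncertainty quantification benchmarked in this chapter is commonly summarized by the spread of the ensemble members' predictions. A minimal sketch of that summary for regression models follows (Python, illustrative only, with hypothetical argument names); the chapter's censored-label handling and calibration analysis are not reproduced.

    import numpy as np

    def ensemble_predictive_uncertainty(member_means, member_vars=None):
        # member_means: (n_members, n_samples) point predictions from each member.
        # member_vars:  optional per-member predictive variances (e.g., from
        #               heteroscedastic output heads).
        member_means = np.asarray(member_means, dtype=float)
        mean = member_means.mean(axis=0)
        epistemic = member_means.var(axis=0)          # disagreement between members
        aleatoric = (np.asarray(member_vars, dtype=float).mean(axis=0)
                     if member_vars is not None else np.zeros_like(mean))
        return mean, epistemic, aleatoric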
2

McClarren, Ryan G. "Introduction to Uncertainty Quantification and Predictive Science". In Uncertainty Quantification and Predictive Computational Science, 3–17. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-99525-0_1.

3

McClarren, Ryan G. "Gaussian Process Emulators and Surrogate Models". In Uncertainty Quantification and Predictive Computational Science, 257–74. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-99525-0_10.

4

McClarren, Ryan G. "Predictive Models Informed by Simulation, Measurement, and Surrogates". In Uncertainty Quantification and Predictive Computational Science, 275–304. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-99525-0_11.

5

McClarren, Ryan G. "Epistemic Uncertainties: Dealing with a Lack of Knowledge". In Uncertainty Quantification and Predictive Computational Science, 305–22. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-99525-0_12.

6

McClarren, Ryan G. "Probability and Statistics Preliminaries". In Uncertainty Quantification and Predictive Computational Science, 19–51. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-99525-0_2.

7

McClarren, Ryan G. "Input Parameter Distributions". In Uncertainty Quantification and Predictive Computational Science, 53–91. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-99525-0_3.

8

McClarren, Ryan G. "Local Sensitivity Analysis Based on Derivative Approximations". In Uncertainty Quantification and Predictive Computational Science, 95–109. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-99525-0_4.

9

McClarren, Ryan G. "Regression Approximations to Estimate Sensitivities". In Uncertainty Quantification and Predictive Computational Science, 111–28. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-99525-0_5.

10

McClarren, Ryan G. "Adjoint-Based Local Sensitivity Analysis". In Uncertainty Quantification and Predictive Computational Science, 129–43. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-99525-0_6.


Conference papers on the topic "Predictive uncertainty quantification"

1

Mossina, Luca, Joseba Dalmau e Léo Andéol. "Conformal Semantic Image Segmentation: Post-hoc Quantification of Predictive Uncertainty". In 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 3574–84. IEEE, 2024. http://dx.doi.org/10.1109/cvprw63382.2024.00361.

2

Duan, Jinhao, Hao Cheng, Shiqi Wang, Alex Zavalny, Chenan Wang, Renjing Xu, Bhavya Kailkhura e Kaidi Xu. "Shifting Attention to Relevance: Towards the Predictive Uncertainty Quantification of Free-Form Large Language Models". In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 5050–63. Stroudsburg, PA, USA: Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.acl-long.276.

3

Grewal, Ruben, Paolo Tonella e Andrea Stocco. "Predicting Safety Misbehaviours in Autonomous Driving Systems Using Uncertainty Quantification". In 2024 IEEE Conference on Software Testing, Verification and Validation (ICST), 70–81. IEEE, 2024. http://dx.doi.org/10.1109/icst60714.2024.00016.

4

Zhou, Hao, Yanze Zhang e Wenhao Luo. "Safety-Critical Control with Uncertainty Quantification using Adaptive Conformal Prediction". In 2024 American Control Conference (ACC), 574–80. IEEE, 2024. http://dx.doi.org/10.23919/acc60939.2024.10644391.

5

Neumeier, Marion, Sebastian Dorn, Michael Botsch e Wolfgang Utschick. "Reliable Trajectory Prediction and Uncertainty Quantification with Conditioned Diffusion Models". In 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 3461–70. IEEE, 2024. http://dx.doi.org/10.1109/cvprw63382.2024.00350.

6

Zheng, Lang, Ruxue Xing e Yaojun Wang. "GraphCast-Qtf: An improved weather prediction model based on uncertainty quantification methods". In 2024 12th International Conference on Agro-Geoinformatics (Agro-Geoinformatics), 1–5. IEEE, 2024. http://dx.doi.org/10.1109/agro-geoinformatics262780.2024.10660789.

7

Xu, Xinyue, Suman Paneru, Sez A. Russcher e Julian Wang. "Physics-Guided Bayesian Neural Networks and Their Application in ODE Problems". In ASME 2024 Verification, Validation, and Uncertainty Quantification Symposium. American Society of Mechanical Engineers, 2024. http://dx.doi.org/10.1115/vvuq2024-122961.

Abstract:
Deterministic Neural Networks (NNs) have been widely adopted to represent system behaviors in various engineering problems, but they have been criticized for providing overconfident predictions. This challenge, i.e., overfitting, refers to a model that captures data patterns and random noise during the learning process, leading to an inferior predictive capability in unobserved domains. Bayesian inference offers an effective way to perform uncertainty quantification (UQ), helping to enhance model predictions and support informed decision-making. Integrating deterministic NN models with Bayesian inference provides a way to simulate quickly and to quantify both data and model uncertainty simultaneously. Bayesian Neural Networks (BNNs), a combination of standard NNs with Bayesian philosophy, introduce probability to the network parameters, i.e., biases and weights. In BNNs, predictive uncertainty reflects how confident model predictions are in terms of the corresponding credible intervals, which helps identify overfitting and interpret abnormal predictions. Another advantage is that BNNs naturally implement regularization by integrating over the weights, so they can generalize well when data availability is limited. As a purely data-driven approach, however, BNNs rely heavily on observations. In addition, standard BNNs have relatively complex network structures, which makes it hard to provide detailed interpretability of model predictions, and researchers share concerns that predictions may violate real-world phenomena. Combining physics knowledge with the optimization objective can improve the adherence of predictions to known physical principles: physics laws are usually embedded as constraint functions in the optimization process, so that model predictions remain consistent with the known physical principles. The authors refine BNNs with physics-guided information, i.e., Physics-Guided BNNs (PG-BNNs), to reduce dependence on data, improve model interpretability, and offer reliable model predictions. PG-BNNs interpret the known physics knowledge as constraint functions to better guide optimization processes, especially for case studies with sparse and limited data sets. A case study of an ordinary differential equation (ODE) problem is carried out to compare the performance of PG-BNN and unconstrained BNN models and demonstrate the effectiveness of PG-BNN.
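As a rough illustration of the kind of physics-guided objective this abstract describes, the sketch below adds a soft ODE-residual penalty to the data-fit loss of a small network and uses Monte Carlo dropout as an inexpensive stand-in for full Bayesian inference. The toy ODE dy/dx = -y, the penalty weight lam, and the network sizes are illustrative assumptions only; the paper's actual PG-BNN formulation treats the physics as constraints on a proper BNN and may differ in detail.

```python
# Illustrative sketch only: MC-dropout network with a physics-residual penalty
# (a simplified stand-in for the Physics-Guided BNN described in the abstract).
import torch
import torch.nn as nn

torch.manual_seed(0)

class DropoutMLP(nn.Module):
    """Small MLP; keeping dropout active at prediction time yields approximate
    posterior samples (MC dropout) for predictive uncertainty."""
    def __init__(self, hidden=64, p_drop=0.1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1, hidden), nn.Tanh(), nn.Dropout(p_drop),
            nn.Linear(hidden, hidden), nn.Tanh(), nn.Dropout(p_drop),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x)

# Toy ODE: dy/dx = -y with y(0) = 1, whose exact solution is exp(-x).
x_obs = torch.linspace(0.0, 2.0, 8).reshape(-1, 1)           # sparse observations
y_obs = torch.exp(-x_obs) + 0.02 * torch.randn_like(x_obs)   # noisy labels
x_col = torch.linspace(0.0, 4.0, 100).reshape(-1, 1)          # collocation points

model = DropoutMLP()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
lam = 1.0  # physics-penalty weight (an assumption, not a value from the paper)

for step in range(3000):
    opt.zero_grad()
    # Data-fit term (Gaussian likelihood up to a constant).
    data_loss = ((model(x_obs) - y_obs) ** 2).mean()
    # Physics term: penalize the ODE residual dy/dx + y at collocation points.
    xc = x_col.clone().requires_grad_(True)
    yc = model(xc)
    dydx = torch.autograd.grad(yc.sum(), xc, create_graph=True)[0]
    phys_loss = ((dydx + yc) ** 2).mean()
    (data_loss + lam * phys_loss).backward()
    opt.step()

# Predictive uncertainty: keep dropout on and aggregate many stochastic passes.
model.train()
x_test = torch.linspace(0.0, 4.0, 50).reshape(-1, 1)
with torch.no_grad():
    samples = torch.stack([model(x_test) for _ in range(200)])
mean, std = samples.mean(0), samples.std(0)
```

Keeping dropout active at prediction time gives a spread of samples whose standard deviation plays the role of the predictive uncertainty the abstract refers to; a full BNN would instead place distributions over the weights themselves.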
8

Lye, Adolphus, Alice Cicrello, and Edoardo Patelli. "Uncertainty Quantification of Optimal Threshold Failure Probability for Predictive Maintenance Using Confidence Structures". In 2nd International Conference on Uncertainty Quantification in Computational Sciences and Engineering. Athens: Institute of Structural Analysis and Antiseismic Research, School of Civil Engineering, National Technical University of Athens (NTUA), Greece, 2019. http://dx.doi.org/10.7712/120219.6364.18502.

9

Hammam, Ahmed, Frank Bonarens, Seyed Eghbal Ghobadi, and Christoph Stiller. "Predictive Uncertainty Quantification of Deep Neural Networks using Dirichlet Distributions". In CSCS '22: Computer Science in Cars Symposium. New York, NY, USA: ACM, 2022. http://dx.doi.org/10.1145/3568160.3570233.

10

Catanach, Thomas. "Posterior Predictive Variational Inference for Uncertainty Quantification in Machine Learning." In Proposed for presentation at the SIAM Conference on Uncertainty Quantification (UQ22) held April 12-15, 2022 in Atlanta, GA. US DOE, 2022. http://dx.doi.org/10.2172/2002336.


Reports by organizations on the topic "Predictive uncertainty quantification"

1

Adams, Marvin. Phenomena-based Uncertainty Quantification in Predictive Coupled-Physics Reactor Simulations. Office of Scientific and Technical Information (OSTI), June 2017. http://dx.doi.org/10.2172/1364745.

2

Favorite, Jeffrey A., Garrett James Dean, Keith C. Bledsoe, Matt Jessee, Dan Gabriel Cacuci, Ruixian Fang, and Madalina Badea. Predictive Modeling, Inverse Problems, and Uncertainty Quantification with Application to Emergency Response. Office of Scientific and Technical Information (OSTI), April 2018. http://dx.doi.org/10.2172/1432629.

3

Lawson, Matthew, Bert J. Debusschere, Habib N. Najm, Khachik Sargsyan, and Jonathan H. Frank. Uncertainty quantification of cinematic imaging for development of predictive simulations of turbulent combustion. Office of Scientific and Technical Information (OSTI), September 2010. http://dx.doi.org/10.2172/1011617.

4

Ye, Ming. Computational Bayesian Framework for Quantification and Reduction of Predictive Uncertainty in Subsurface Environmental Modeling. Office of Scientific and Technical Information (OSTI), January 2019. http://dx.doi.org/10.2172/1491235.

5

Marzouk, Youssef. Final Report, DOE Early Career Award: Predictive modeling of complex physical systems: new tools for statistical inference, uncertainty quantification, and experimental design. Office of Scientific and Technical Information (OSTI), August 2016. http://dx.doi.org/10.2172/1312896.

6

Cattaneo, Matias D., Richard K. Crump, and Weining Wang. Beta-Sorted Portfolios. Federal Reserve Bank of New York, July 2023. http://dx.doi.org/10.59576/sr.1068.

Abstract:
Beta-sorted portfolios—portfolios comprised of assets with similar covariation to selected risk factors—are a popular tool in empirical finance to analyze models of (conditional) expected returns. Despite their widespread use, little is known of their statistical properties in contrast to comparable procedures such as two-pass regressions. We formally investigate the properties of beta-sorted portfolio returns by casting the procedure as a two-step nonparametric estimator with a nonparametric first step and a beta-adaptive portfolios construction. Our framework rationalizes the well-known estimation algorithm with precise economic and statistical assumptions on the general data generating process. We provide conditions that ensure consistency and asymptotic normality along with new uniform inference procedures allowing for uncertainty quantification and general hypothesis testing for financial applications. We show that the rate of convergence of the estimator is non-uniform and depends on the beta value of interest. We also show that the widely used Fama-MacBeth variance estimator is asymptotically valid but is conservative in general and can be very conservative in empirically relevant settings. We propose a new variance estimator, which is always consistent and provide an empirical implementation which produces valid inference. In our empirical application we introduce a novel risk factor—a measure of the business credit cycle—and show that it is strongly predictive of both the cross-section and time-series behavior of U.S. stock returns.
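For readers unfamiliar with the procedure, the sketch below mimics the classical two-step construction this abstract formalizes: trailing-window betas are estimated asset by asset, then assets are re-sorted into decile portfolios each period. The simulated data, the 60-month estimation window, and the equal weighting are illustrative assumptions and do not reproduce the paper's estimator or its inference results.

```python
# Hypothetical sketch of classical beta sorting: first-pass rolling betas,
# then decile portfolios re-formed each period (illustration only).
import numpy as np

rng = np.random.default_rng(0)
T, N = 240, 500                          # months, assets (simulated data)
factor = rng.normal(0.0, 0.04, T)         # risk-factor returns
true_beta = rng.normal(1.0, 0.5, N)
returns = true_beta[None, :] * factor[:, None] + rng.normal(0, 0.08, (T, N))

window, n_pf = 60, 10
pf_returns = np.full((T, n_pf), np.nan)
for t in range(window, T):
    # First step: estimate each asset's beta over the trailing window.
    f = factor[t - window:t]
    r = returns[t - window:t]
    f_dm = f - f.mean()
    betas = f_dm @ (r - r.mean(axis=0)) / (f_dm @ f_dm)
    # Second step: sort into deciles and hold equal-weighted portfolios at t.
    edges = np.quantile(betas, np.linspace(0, 1, n_pf + 1))
    labels = np.clip(np.searchsorted(edges, betas, side="right") - 1, 0, n_pf - 1)
    for k in range(n_pf):
        pf_returns[t, k] = returns[t, labels == k].mean()

# Average post-formation return per beta decile (low to high beta).
print(np.nanmean(pf_returns, axis=0))
```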
7

Gonzales, Lindsey M., Thomas M. Hall, Kendra L. Van Buren, Steven R. Anton, and Francois M. Hemez. Quantification of Prediction Bounds Caused by Model Form Uncertainty. Office of Scientific and Technical Information (OSTI), September 2013. http://dx.doi.org/10.2172/1095195.

8

Adams, Jason, Brandon Berman, Joshua Michalenko, and Rina Deka. Non-conformity Scores for High-Quality Uncertainty Quantification from Conformal Prediction. Office of Scientific and Technical Information (OSTI), September 2023. http://dx.doi.org/10.2172/2430248.

9

Vecherin, Sergey, Stephen Ketcham, Aaron Meyer, Kyle Dunn, Jacob Desmond, and Michael Parker. Short-range near-surface seismic ensemble predictions and uncertainty quantification for layered medium. Engineer Research and Development Center (U.S.), September 2022. http://dx.doi.org/10.21079/11681/45300.

Abstract:
To make a prediction for seismic signal propagation, one needs to specify physical properties and subsurface ground structure of the site. This information is frequently unknown or estimated with significant uncertainty. This paper describes a methodology for probabilistic seismic ensemble prediction for vertically stratified soils and short ranges with no in situ site characterization. Instead of specifying viscoelastic site properties, the methodology operates with probability distribution functions of these properties taking into account analytical and empirical relationships among viscoelastic variables. This yields ensemble realizations of signal arrivals at specified locations where statistical properties of the signals can be estimated. Such ensemble predictions can be useful for preliminary site characterization, for military applications, and risk analysis for remote or inaccessible locations for which no data can be acquired. Comparison with experiments revealed that measured signals are not always within the predicted ranges of variability. Variance-based global sensitivity analysis has shown that the most significant parameters for signal amplitude predictions in the developed stochastic model are the uncertainty in the shear quality factor and the Poisson ratio above the water table depth.
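A minimal Monte Carlo sketch of the ensemble idea described in this abstract: uncertain site properties are drawn from assumed probability distributions, pushed through a forward model, and summarized as predictive intervals. The lognormal/normal priors and the crude spreading-plus-attenuation amplitude model below are placeholders for illustration; the report's methodology uses layered viscoelastic simulations and empirical relationships among the viscoelastic variables.

```python
# Illustrative Monte Carlo ensemble: sample uncertain soil properties from
# assumed priors, evaluate a toy amplitude model, and report the spread.
import numpy as np

rng = np.random.default_rng(1)
n_ens = 2000
range_m, freq_hz = 50.0, 30.0            # receiver range and dominant frequency

# Assumed priors (placeholders, not the report's calibrated distributions).
shear_q = rng.lognormal(mean=np.log(20.0), sigma=0.5, size=n_ens)  # shear quality factor
vs = rng.normal(200.0, 30.0, size=n_ens)                            # shear speed, m/s
# A fuller model would also sample Poisson ratio, density, water-table depth,
# and layer structure, respecting the relationships among these variables.

# Toy forward model: geometric spreading times anelastic attenuation.
alpha = np.pi * freq_hz / (shear_q * vs)   # attenuation coefficient, 1/m
amplitude = np.exp(-alpha * range_m) / range_m

# Ensemble statistics at the receiver: point summary plus a predictive interval.
print("median amplitude:", np.median(amplitude))
print("90% interval:", np.percentile(amplitude, [5, 95]))
```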
10

Glimm, James, Yunha Lee, Kenny Q. Ye, and David H. Sharp. Prediction Using Numerical Simulations, A Bayesian Framework for Uncertainty Quantification and its Statistical Challenge. Fort Belvoir, VA: Defense Technical Information Center, January 2002. http://dx.doi.org/10.21236/ada417842.
