Academic literature on the topic 'Predictive uncertainty quantification'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Predictive uncertainty quantification.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Journal articles on the topic "Predictive uncertainty quantification"
Cacuci, Dan Gabriel. "Sensitivity Analysis, Uncertainty Quantification and Predictive Modeling of Nuclear Energy Systems." Energies 15, no. 17 (September 1, 2022): 6379. http://dx.doi.org/10.3390/en15176379.
Csillag, Daniel, Lucas Monteiro Paes, Thiago Ramos, João Vitor Romano, Rodrigo Schuller, Roberto B. Seixas, Roberto I. Oliveira, and Paulo Orenstein. "AmnioML: Amniotic Fluid Segmentation and Volume Prediction with Uncertainty Quantification." Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 13 (June 26, 2023): 15494–502. http://dx.doi.org/10.1609/aaai.v37i13.26837.
Lew, Jiann-Shiun, and Jer-Nan Juang. "Robust Generalized Predictive Control with Uncertainty Quantification." Journal of Guidance, Control, and Dynamics 35, no. 3 (May 2012): 930–37. http://dx.doi.org/10.2514/1.54510.
Karimi, Hamed, and Reza Samavi. "Quantifying Deep Learning Model Uncertainty in Conformal Prediction." Proceedings of the AAAI Symposium Series 1, no. 1 (October 3, 2023): 142–48. http://dx.doi.org/10.1609/aaaiss.v1i1.27492.
Akitaya, Kento, and Masaatsu Aichi. "Land Subsidence Model Inversion with the Estimation of Both Model Parameter Uncertainty and Predictive Uncertainty Using an Evolutionary-Based Data Assimilation (EDA) and Ensemble Model Output Statistics (EMOS)." Water 16, no. 3 (January 28, 2024): 423. http://dx.doi.org/10.3390/w16030423.
Singh, Rishabh, and Jose C. Principe. "Toward a Kernel-Based Uncertainty Decomposition Framework for Data and Models." Neural Computation 33, no. 5 (April 13, 2021): 1164–98. http://dx.doi.org/10.1162/neco_a_01372.
Chen, Peng, and Nicholas Zabaras. "Adaptive Locally Weighted Projection Regression Method for Uncertainty Quantification." Communications in Computational Physics 14, no. 4 (October 2013): 851–78. http://dx.doi.org/10.4208/cicp.060712.281212a.
Omagbon, Jericho, John Doherty, Angus Yeh, Racquel Colina, John O'Sullivan, Julian McDowell, Ruanui Nicholson, Oliver J. Maclaren, and Michael O'Sullivan. "Case studies of predictive uncertainty quantification for geothermal models." Geothermics 97 (December 2021): 102263. http://dx.doi.org/10.1016/j.geothermics.2021.102263.
Nitschke, C. T., P. Cinnella, D. Lucor, and J. C. Chassaing. "Model-form and predictive uncertainty quantification in linear aeroelasticity." Journal of Fluids and Structures 73 (August 2017): 137–61. http://dx.doi.org/10.1016/j.jfluidstructs.2017.05.007.
Mirzayeva, A., N. A. Slavinskaya, M. Abbasi, J. H. Starcke, W. Li, and M. Frenklach. "Uncertainty Quantification in Chemical Modeling." Eurasian Chemico-Technological Journal 20, no. 1 (March 31, 2018): 33. http://dx.doi.org/10.18321/ectj706.
Full textDissertations / Theses on the topic "Predictive uncertainty quantification"
Lonsdale, Jack Henry. "Predictive modelling and uncertainty quantification of UK forest growth." Thesis, University of Edinburgh, 2015. http://hdl.handle.net/1842/16202.
Gligorijevic, Djordje. "Predictive Uncertainty Quantification and Explainable Machine Learning in Healthcare." Diss., Temple University Libraries, 2018. http://cdm16002.contentdm.oclc.org/cdm/ref/collection/p245801coll10/id/520057.
Full textPh.D.
Predictive modeling is an increasingly important part of decision making. Advances in machine learning predictive modeling have spread across many domains, bringing significant improvements in performance and providing unique opportunities for novel discoveries. Notably important domains are medicine and healthcare, which look after people's wellbeing. While these are among the most developed and actively researched areas of science, there are many ways in which they can be improved. In particular, novel tools based on machine learning theory have brought benefits across many areas of clinical practice, pushing the boundaries of medical science and directly affecting the well-being of millions of patients. Additionally, the healthcare and medical domains require predictive modeling to anticipate and overcome many obstacles the future may hold. These applications involve precise decision-making processes that require accurate predictions. However, good prediction on its own is often insufficient, and there has been no major focus on developing algorithms with good-quality uncertainty estimates. This thesis therefore aims at providing a variety of solutions by learning high-quality uncertainty estimates or providing interpretability of the models where needed, with the purpose of improving existing tools built in practice and allowing many other tools to be used where uncertainty is the key factor for decision making. The first part of the thesis proposes approaches for learning high-quality uncertainty estimates for both short- and long-term predictions in multi-task learning, developed on top of continuous probabilistic graphical models. In many scenarios, especially in long-term prediction, it may be of great importance for the models to provide a reliability flag in order to be accepted by domain experts. To this end we explored a widely applied structured regression model with the goal of providing meaningful uncertainty estimates on various predictive tasks. Our particular interest is in modeling uncertainty propagation when predicting far into the future. To address this problem, our approach centers on providing an uncertainty estimate by modeling input features as random variables, which allows modeling uncertainty from noisy inputs. When the model iteratively produces errors, it should propagate uncertainty over the predictive horizon, which may provide invaluable information for decision making based on predictions. In the second part of the thesis we propose novel neural embedding models for learning low-dimensional embeddings of medical concepts, such as diseases and genes, show how they can be interpreted to assess their quality, and show how they can be used to solve many problems in medical and healthcare research. We use EHR data to discover novel relationships between diseases by studying their comorbidities (i.e., co-occurrences in patients). We trained our models on a large-scale EHR database comprising more than 35 million inpatient cases. To confirm the value and potential of the proposed approach, we evaluate its effectiveness on a held-out set. Furthermore, for select diseases we provide a candidate gene list for which disease-gene associations were not studied previously, allowing biomedical researchers to better focus their often very costly lab studies.
We furthermore examine how disease heterogeneity can affect the quality of learned embeddings and propose an approach for learning the types of such heterogeneous diseases; in our study we primarily focus on learning the types of sepsis. Finally, we evaluate the quality of the low-dimensional embeddings on the tasks of predicting hospital quality indicators such as length of stay, total charges, and mortality likelihood, demonstrating their superiority over other approaches. In the third part of the thesis we focus on decision making in the medicine and healthcare domain by developing state-of-the-art deep learning models capable of outperforming human performance while maintaining good interpretability and uncertainty estimates.
Temple University--Theses
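A minimal sketch of the input-uncertainty idea described in the Gligorijevic abstract above: treat the input features as random variables and push their covariance through a (here, linear) predictor to obtain a predictive variance alongside the point estimate. The weights, input covariance, and noise level below are illustrative assumptions, not values from the thesis.

```python
import numpy as np

# Illustrative linear predictor y = w.x + noise with uncertain inputs x ~ N(mu_x, Sigma_x).
# Then E[y] = w.mu_x and Var[y] = w' Sigma_x w + sigma_noise^2.
w = np.array([0.8, -1.2, 0.5])           # assumed model weights
mu_x = np.array([1.0, 0.3, 2.0])         # mean of the noisy input features
Sigma_x = np.diag([0.04, 0.10, 0.02])    # assumed input (feature) covariance
sigma_noise = 0.15                        # assumed observation noise (std)

mean_y = w @ mu_x
var_y = w @ Sigma_x @ w + sigma_noise**2
print(f"prediction: {mean_y:.3f} +/- {np.sqrt(var_y):.3f} (one standard deviation)")
```

For nonlinear models the same idea is usually approximated by linearisation or sampling, which is where propagating uncertainty over a multi-step predictive horizon becomes non-trivial.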
Zaffran, Margaux. "Post-hoc predictive uncertainty quantification : methods with applications to electricity price forecasting." Electronic Thesis or Diss., Institut polytechnique de Paris, 2024. http://www.theses.fr/2024IPPAX033.
The surge of ever more powerful statistical learning algorithms offers promising prospects for electricity price forecasting. However, these methods provide ad hoc forecasts, with no indication of the degree of confidence to be placed in them. To ensure the safe deployment of these predictive models, it is crucial to quantify their predictive uncertainty. This PhD thesis focuses on developing predictive intervals for any underlying algorithm. While motivated by the electricity sector, the methods developed, based on Split Conformal Prediction (SCP), are generic: they can be applied in many sensitive fields. First, this thesis studies post-hoc predictive uncertainty quantification for time series. The first bottleneck in applying SCP to obtain guaranteed probabilistic electricity price forecasts in a post-hoc fashion is the highly non-stationary temporal behaviour of electricity prices, which breaks the exchangeability assumption. The first contribution proposes a parameter-free algorithm tailored to time series, based on a theoretical analysis of the efficiency of the existing Adaptive Conformal Inference method. The second contribution conducts an extensive application study on a novel data set of recent turbulent French spot prices in 2020 and 2021. Another challenge is missing values (NAs). In a second part, this thesis analyzes the interplay between NAs and predictive uncertainty quantification. The third contribution highlights that NAs induce heteroskedasticity, leading to uneven coverage depending on which features are observed. Two algorithms recovering equalized coverage for any NA pattern, under distributional assumptions on the missingness mechanism, are designed. The fourth contribution pushes the theoretical analysis forward to understand precisely which distributional assumptions are unavoidable for theoretical informativeness. It also unifies the previously proposed algorithms into a general framework that demonstrates empirical robustness to violations of the assumed missingness distribution.
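Several entries in this bibliography build on Split Conformal Prediction, so a bare-bones sketch may be useful: calibrate a quantile of nonconformity scores on held-out data and wrap any point predictor with intervals that carry finite-sample marginal coverage under exchangeability. The synthetic data, the linear model, and the 90% level below are placeholder assumptions.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic regression data standing in for any forecasting problem
X = rng.normal(size=(500, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.3, size=500)

# Split: proper training set vs. calibration set
X_tr, y_tr, X_cal, y_cal = X[:300], y[:300], X[300:], y[300:]
model = LinearRegression().fit(X_tr, y_tr)

# Nonconformity scores on the calibration set (absolute residuals)
scores = np.abs(y_cal - model.predict(X_cal))
alpha = 0.1                                        # target 90% marginal coverage
n = len(scores)
k = int(np.ceil((n + 1) * (1 - alpha)))            # conformal quantile rank
q = np.sort(scores)[min(k, n) - 1]

# Prediction interval for a new point
x_new = rng.normal(size=(1, 3))
y_hat = model.predict(x_new)[0]
print(f"90% prediction interval: [{y_hat - q:.3f}, {y_hat + q:.3f}]")
```

Exchangeability is exactly the assumption that non-stationary electricity prices break, which is what motivates the adaptive variants studied in the thesis.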
Riley, Matthew E. "Quantification of Model-Form, Predictive, and Parametric Uncertainties in Simulation-Based Design." Wright State University / OhioLINK, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=wright1314895435.
Freeman, Jacob Andrew. "Optimization Under Uncertainty and Total Predictive Uncertainty for a Tractor-Trailer Base-Drag Reduction Device." Diss., Virginia Tech, 2012. http://hdl.handle.net/10919/77168.
Full textPh. D.
Wu, Jinlong. "Predictive Turbulence Modeling with Bayesian Inference and Physics-Informed Machine Learning." Diss., Virginia Tech, 2018. http://hdl.handle.net/10919/85129.
Full textPh. D.
Reynolds-Averaged Navier–Stokes (RANS) simulations are widely used for engineering design and analysis involving turbulent flows. In RANS simulations, the Reynolds stress needs closure models and the existing models have large model-form uncertainties. Therefore, the RANS simulations are known to be unreliable in many flows of engineering relevance, including flows with three-dimensional structures, swirl, pressure gradients, or curvature. This lack of accuracy in complex flows has diminished the utility of RANS simulations as a predictive tool for engineering design, analysis, optimization, and reliability assessments. Recently, data-driven methods have emerged as a promising alternative to develop the model of Reynolds stress for RANS simulations. In this dissertation I explore two physics-informed, data-driven frameworks to improve RANS modeled Reynolds stresses. First, a Bayesian inference framework is proposed to quantify and reduce the model-form uncertainty of RANS modeled Reynolds stress by leveraging online sparse measurement data with empirical prior knowledge. Second, a machine-learning-assisted framework is proposed to utilize offline high fidelity simulation databases. Numerical results show that the data-driven RANS models have better prediction of Reynolds stress and other quantities of interest for several canonical flows. Two metrics are also presented for an a priori assessment of the prediction confidence for the machine-learning-assisted RANS model. The proposed data-driven methods are also applicable to the computational study of other physical systems whose governing equations have some unresolved physics to be modeled.
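As a toy illustration of the Bayesian-inference ingredient mentioned in the Wu abstract above, the sketch below runs a random-walk Metropolis sampler to infer a single scalar parameter of an invented forward model from a few noisy observations. The forward model, prior, and noise level are assumptions made for the example and are unrelated to the actual RANS closures in the dissertation.

```python
import numpy as np

rng = np.random.default_rng(1)

def forward(theta, x):
    # Invented forward model with one uncertain coefficient theta
    return theta * np.sin(x)

x_obs = np.array([0.5, 1.0, 1.5, 2.0])                        # sparse observation locations
y_obs = forward(1.7, x_obs) + rng.normal(scale=0.05, size=4)  # synthetic noisy data
sigma = 0.05                                                  # assumed measurement noise (std)

def log_post(theta):
    log_prior = -0.5 * (theta - 1.0) ** 2                     # Gaussian prior N(1, 1)
    resid = y_obs - forward(theta, x_obs)
    return log_prior - 0.5 * np.sum((resid / sigma) ** 2)     # prior + Gaussian log-likelihood

# Random-walk Metropolis
theta, chain = 1.0, []
for _ in range(5000):
    prop = theta + rng.normal(scale=0.05)
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    chain.append(theta)

post = np.array(chain[1000:])                                 # discard burn-in
print(f"posterior mean {post.mean():.3f}, posterior std {post.std():.3f}")
```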
Cortesi, Andrea Francesco. "Predictive numerical simulations for rebuilding freestream conditions in atmospheric entry flows." Thesis, Bordeaux, 2018. http://www.theses.fr/2018BORD0021/document.
Accurate prediction of hypersonic high-enthalpy flows is of major relevance for atmospheric entry missions. However, uncertainties are inevitable in the freestream conditions and other parameters of the physico-chemical models. For this reason, a rigorous quantification of the effect of uncertainties is mandatory to assess the robustness and predictivity of numerical simulations. Furthermore, a proper reconstruction of uncertain parameters from in-flight measurements can help reduce the level of uncertainty in the output. In this work, we use a statistical framework for direct propagation of uncertainties and inverse freestream reconstruction applied to atmospheric entry flows. We propose an assessment of the possibility of exploiting forebody heat flux measurements for the reconstruction of freestream variables and uncertain model parameters for hypersonic entry flows. This reconstruction is performed in a Bayesian framework, making it possible to account for sources of uncertainty and measurement errors. Different techniques are introduced to enhance the capabilities of the statistical framework for quantification of uncertainties. First, an improved surrogate modeling technique is proposed, based on Kriging and Sparse Polynomial Dimensional Decomposition. Then a method is proposed to adaptively add new training points to an existing experimental design to improve the accuracy of the trained surrogate model. A way to exploit active subspaces in Markov Chain Monte Carlo algorithms for Bayesian inverse problems is also proposed.
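The surrogate-modelling step mentioned in the Cortesi abstract (Kriging) amounts to Gaussian-process regression; a bare-bones version with a squared-exponential kernel and hand-fixed hyperparameters is sketched below. The one-dimensional "simulator" response and the kernel settings are illustrative assumptions.

```python
import numpy as np

def rbf(a, b, lengthscale=0.4, variance=1.0):
    # Squared-exponential covariance between 1-D input arrays a and b
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

# A few "expensive simulator" runs of an assumed 1-D response
x_train = np.array([0.0, 0.3, 0.6, 0.9])
y_train = np.sin(2 * np.pi * x_train)
jitter = 1e-8                                       # numerical stabilisation

K = rbf(x_train, x_train) + jitter * np.eye(x_train.size)
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))

# Surrogate posterior mean and variance at new inputs
x_new = np.linspace(0.0, 1.0, 5)
K_s = rbf(x_train, x_new)
mean = K_s.T @ alpha
v = np.linalg.solve(L, K_s)
var = rbf(x_new, x_new).diagonal() - np.sum(v**2, axis=0)

for xi, m, s2 in zip(x_new, mean, var):
    print(f"x={xi:.2f}  mean={m:+.3f}  std={np.sqrt(max(s2, 0.0)):.3f}")
```

The posterior variance returned by such a surrogate is what adaptive strategies typically use when deciding where to add new training points.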
Erbas, Demet. "Sampling strategies for uncertainty quantification in oil recovery prediction." Thesis, Heriot-Watt University, 2007. http://hdl.handle.net/10399/70.
Whiting, Nolan Wagner. "Assessment of Model Validation, Calibration, and Prediction Approaches in the Presence of Uncertainty." Thesis, Virginia Tech, 2019. http://hdl.handle.net/10919/91903.
Full textMaster of Science
Uncertainties often exist when conducting physical experiments, and whether this uncertainty arises from input uncertainty, uncertainty in the environmental conditions in which the experiment takes place, or numerical uncertainty in the model, it can be difficult to validate and compare the results of a model with those of an experiment. Model validation is the process of determining the degree to which a model is an accurate representation of the true value in the real world. The results of a model validation study can be used either to quantify the uncertainty that exists within the model or to improve/calibrate the model. However, the model validation process can become complicated if there is uncertainty in the simulation (model) and/or experimental outcomes. These uncertainties can be aleatory (uncertainties to which a probability distribution can be assigned for the likelihood of drawing values) or epistemic (no knowledge, inputs drawn within an interval). Four different approaches are used for addressing model validation and calibration: 1) the area validation metric (AVM), 2) a modified area validation metric (MAVM) with confidence intervals, 3) the standard validation uncertainty from ASME V&V 20, and 4) Bayesian updating of a model discrepancy term. Details are given for the application of the MAVM to account for small experimental sample sizes. To provide an unambiguous assessment of these different approaches, synthetic experimental values were generated from computational fluid dynamics (CFD) simulations of a multi-element airfoil. A simplified model was then developed using thin airfoil theory and assessed using the synthetic experimental data. The quantities examined include the two-dimensional lift and moment coefficients for the airfoil at varying angles of attack and flap deflection angles. Each of these validation/calibration approaches is assessed for its ability to tightly encapsulate the true value in nature, both at locations where experimental results are provided and at prediction locations where no experimental data are available. Also of interest was how well each method could predict the uncertainties about the simulation outside the region in which experimental observations were made and model-form uncertainties could be observed.
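A rough sketch of the area validation metric (AVM) listed first among the four approaches above: the area between the empirical CDF of simulation outcomes and the empirical CDF of experimental outcomes. The sample sizes and distributions below are synthetic placeholders, not data from the thesis.

```python
import numpy as np

def area_validation_metric(sim, exp):
    """Area between the empirical CDFs of simulation and experimental samples."""
    grid = np.sort(np.concatenate([sim, exp]))
    # Right-continuous ECDF value on each interval [grid[i], grid[i+1])
    F_sim = np.searchsorted(np.sort(sim), grid, side="right") / sim.size
    F_exp = np.searchsorted(np.sort(exp), grid, side="right") / exp.size
    return np.sum(np.abs(F_sim - F_exp)[:-1] * np.diff(grid))

rng = np.random.default_rng(2)
sim_outcomes = rng.normal(1.00, 0.05, size=200)   # many model evaluations (placeholder)
exp_outcomes = rng.normal(1.03, 0.04, size=15)    # few synthetic "experiments" (placeholder)
print(f"d_AVM = {area_validation_metric(sim_outcomes, exp_outcomes):.4f}")
```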
Phadnis, Akash. "Uncertainty quantification and prediction for non-autonomous linear and nonlinear systems." Thesis, Massachusetts Institute of Technology, 2013. http://hdl.handle.net/1721.1/85476.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 189-197).
The science of uncertainty quantification has gained a lot of attention over recent years. This is because models of real processes always contain some elements of uncertainty, and also because real systems can be better described using stochastic components. Stochastic models can therefore be utilized to provide a most informative prediction of possible future states of the system. In light of the multiple scales, nonlinearities and uncertainties in ocean dynamics, stochastic models can be most useful to describe ocean systems. Uncertainty quantification schemes developed in recent years include order reduction methods (e.g. proper orthogonal decomposition (POD)), error subspace statistical estimation (ESSE), polynomial chaos (PC) schemes and dynamically orthogonal (DO) field equations. In this thesis, we focus our attention on DO and various PC schemes for quantifying and predicting uncertainty in systems with external stochastic forcing. We develop and implement these schemes in a generic stochastic solver for a class of non-autonomous linear and nonlinear dynamical systems. This class of systems encapsulates most systems encountered in classic nonlinear dynamics and ocean modeling, including flows modeled by Navier-Stokes equations. We first study systems with uncertainty in input parameters (e.g. stochastic decay models and Kraichnan-Orszag system) and then with external stochastic forcing (autonomous and non-autonomous self-engineered nonlinear systems). For time-integration of system dynamics, stochastic numerical schemes of varied order are employed and compared. Using our generic stochastic solver, the Monte Carlo, DO and polynomial chaos schemes are inter-compared in terms of accuracy of solution and computational cost. To allow accurate time-integration of uncertainty due to external stochastic forcing, we also derive two novel PC schemes, namely, the reduced space KLgPC scheme and the modified TDgPC (MTDgPC) scheme. We utilize a set of numerical examples to show that the two new PC schemes and the DO scheme can integrate both additive and multiplicative stochastic forcing over significant time intervals. For the final example, we consider shallow water ocean surface waves and the modeling of these waves by deterministic dynamics and stochastic forcing components. Specifically, we time-integrate the Korteweg-de Vries (KdV) equation with external stochastic forcing, comparing the performance of the DO and Monte Carlo schemes. We find that the DO scheme is computationally efficient to integrate uncertainty in such systems with external stochastic forcing.
by Akash Phadnis.
S.M.
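At a much smaller scale than the systems treated in the Phadnis thesis, the Monte Carlo baseline it compares against can be illustrated with a stochastically forced linear decay model, dx = -a x dt + s dW, whose stationary variance s^2/(2a) provides an analytical check. The constants below are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(3)

a, s = 1.5, 0.4                       # decay rate and forcing amplitude (assumed)
dt, n_steps, n_samples = 1e-3, 20_000, 2_000

# Euler-Maruyama Monte Carlo for dx = -a*x dt + s dW, starting from x(0) = 1
x = np.ones(n_samples)
for _ in range(n_steps):
    x += -a * x * dt + s * np.sqrt(dt) * rng.normal(size=n_samples)

print(f"MC mean     : {x.mean():+.4f}   (decays towards 0)")
print(f"MC variance : {x.var():.4f}    (stationary value s^2/(2a) = {s**2 / (2 * a):.4f})")
```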
Books on the topic "Predictive uncertainty quantification"
McClarren, Ryan G. Uncertainty Quantification and Predictive Computational Science. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-99525-0.
Boegh, Eva, and International Association of Hydrological Sciences, eds. Quantification and reduction of predictive uncertainty for sustainable water resources management: Proceedings of an international symposium [held] during IUGG2007, the XXIV General Assembly of the International Union of Geodesy and Geophysics at Perugia, Italy, July 2007. Wallingford: International Association of Hydrological Sciences, 2007.
Harrington, Matthew R. Predicting and Understanding the Presence of Water through Remote Sensing, Machine Learning, and Uncertainty Quantification. [New York, N.Y.?]: [publisher not identified], 2022.
Hemez, François, and Sez Atamturktur. Predictive Modelling: Verification, Validation and Uncertainty Quantification. Wiley & Sons, Limited, John, 2018.
McClarren, Ryan G. Uncertainty Quantification and Predictive Computational Science: A Foundation for Physical Scientists and Engineers. Springer, 2018.
Anderson, Mark, Francois Hemez, and Scott Doebling. Model Verification and Validation in Engineering Mechanics: Theory and Applications of Uncertainty Quantification and Predictive Accuracy. John Wiley & Sons, 2005.
Model Verification and Validation in Engineering Mechanics: Theory and Applications of Uncertainty Quantification and Predictive Accuracy. Wiley & Sons, Limited, John, 2004.
Sanderson, Benjamin Mark. Uncertainty Quantification in Multi-Model Ensembles. Oxford University Press, 2018. http://dx.doi.org/10.1093/acrefore/9780190228620.013.707.
Chen, Nan. Stochastic Methods for Modeling and Predicting Complex Dynamical Systems: Uncertainty Quantification, State Estimation, and Reduced-Order Models. Springer International Publishing AG, 2023.
Book chapters on the topic "Predictive uncertainty quantification"
Svensson, Emma, Hannah Rosa Friesacher, Adam Arany, Lewis Mervin, and Ola Engkvist. "Temporal Evaluation of Uncertainty Quantification Under Distribution Shift." In Lecture Notes in Computer Science, 132–48. Cham: Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-72381-0_11.
McClarren, Ryan G. "Introduction to Uncertainty Quantification and Predictive Science." In Uncertainty Quantification and Predictive Computational Science, 3–17. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-99525-0_1.
McClarren, Ryan G. "Gaussian Process Emulators and Surrogate Models." In Uncertainty Quantification and Predictive Computational Science, 257–74. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-99525-0_10.
McClarren, Ryan G. "Predictive Models Informed by Simulation, Measurement, and Surrogates." In Uncertainty Quantification and Predictive Computational Science, 275–304. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-99525-0_11.
McClarren, Ryan G. "Epistemic Uncertainties: Dealing with a Lack of Knowledge." In Uncertainty Quantification and Predictive Computational Science, 305–22. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-99525-0_12.
McClarren, Ryan G. "Probability and Statistics Preliminaries." In Uncertainty Quantification and Predictive Computational Science, 19–51. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-99525-0_2.
McClarren, Ryan G. "Input Parameter Distributions." In Uncertainty Quantification and Predictive Computational Science, 53–91. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-99525-0_3.
McClarren, Ryan G. "Local Sensitivity Analysis Based on Derivative Approximations." In Uncertainty Quantification and Predictive Computational Science, 95–109. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-99525-0_4.
McClarren, Ryan G. "Regression Approximations to Estimate Sensitivities." In Uncertainty Quantification and Predictive Computational Science, 111–28. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-99525-0_5.
McClarren, Ryan G. "Adjoint-Based Local Sensitivity Analysis." In Uncertainty Quantification and Predictive Computational Science, 129–43. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-99525-0_6.
Conference papers on the topic "Predictive uncertainty quantification"
Mossina, Luca, Joseba Dalmau, and Léo Andéol. "Conformal Semantic Image Segmentation: Post-hoc Quantification of Predictive Uncertainty." In 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 3574–84. IEEE, 2024. http://dx.doi.org/10.1109/cvprw63382.2024.00361.
Duan, Jinhao, Hao Cheng, Shiqi Wang, Alex Zavalny, Chenan Wang, Renjing Xu, Bhavya Kailkhura, and Kaidi Xu. "Shifting Attention to Relevance: Towards the Predictive Uncertainty Quantification of Free-Form Large Language Models." In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 5050–63. Stroudsburg, PA, USA: Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.acl-long.276.
Grewal, Ruben, Paolo Tonella, and Andrea Stocco. "Predicting Safety Misbehaviours in Autonomous Driving Systems Using Uncertainty Quantification." In 2024 IEEE Conference on Software Testing, Verification and Validation (ICST), 70–81. IEEE, 2024. http://dx.doi.org/10.1109/icst60714.2024.00016.
Zhou, Hao, Yanze Zhang, and Wenhao Luo. "Safety-Critical Control with Uncertainty Quantification using Adaptive Conformal Prediction." In 2024 American Control Conference (ACC), 574–80. IEEE, 2024. http://dx.doi.org/10.23919/acc60939.2024.10644391.
Neumeier, Marion, Sebastian Dorn, Michael Botsch, and Wolfgang Utschick. "Reliable Trajectory Prediction and Uncertainty Quantification with Conditioned Diffusion Models." In 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 3461–70. IEEE, 2024. http://dx.doi.org/10.1109/cvprw63382.2024.00350.
Zheng, Lang, Ruxue Xing, and Yaojun Wang. "GraphCast-Qtf: An improved weather prediction model based on uncertainty quantification methods." In 2024 12th International Conference on Agro-Geoinformatics (Agro-Geoinformatics), 1–5. IEEE, 2024. http://dx.doi.org/10.1109/agro-geoinformatics262780.2024.10660789.
Xu, Xinyue, Suman Paneru, Sez A. Russcher, and Julian Wang. "Physics-Guided Bayesian Neural Networks and Their Application in ODE Problems." In ASME 2024 Verification, Validation, and Uncertainty Quantification Symposium. American Society of Mechanical Engineers, 2024. http://dx.doi.org/10.1115/vvuq2024-122961.
Lye, Adolphus, Alice Cicrello, and Edoardo Patelli. "UNCERTAINTY QUANTIFICATION OF OPTIMAL THRESHOLD FAILURE PROBABILITY FOR PREDICTIVE MAINTENANCE USING CONFIDENCE STRUCTURES." In 2nd International Conference on Uncertainty Quantification in Computational Sciences and Engineering. Athens: Institute of Structural Analysis and Antiseismic Research School of Civil Engineering National Technical University of Athens (NTUA) Greece, 2019. http://dx.doi.org/10.7712/120219.6364.18502.
Hammam, Ahmed, Frank Bonarens, Seyed Eghbal Ghobadi, and Christoph Stiller. "Predictive Uncertainty Quantification of Deep Neural Networks using Dirichlet Distributions." In CSCS '22: Computer Science in Cars Symposium. New York, NY, USA: ACM, 2022. http://dx.doi.org/10.1145/3568160.3570233.
Catanach, Thomas. "Posterior Predictive Variational Inference for Uncertainty Quantification in Machine Learning." In Proposed for presentation at the SIAM Conference on Uncertainty Quantification (UQ22) held April 12-15, 2022 in Atlanta, GA. US DOE, 2022. http://dx.doi.org/10.2172/2002336.
Reports on the topic "Predictive uncertainty quantification"
Adams, Marvin. Phenomena-based Uncertainty Quantification in Predictive Coupled-Physics Reactor Simulations. Office of Scientific and Technical Information (OSTI), June 2017. http://dx.doi.org/10.2172/1364745.
Favorite, Jeffrey A., Garrett James Dean, Keith C. Bledsoe, Matt Jessee, Dan Gabriel Cacuci, Ruixian Fang, and Madalina Badea. Predictive Modeling, Inverse Problems, and Uncertainty Quantification with Application to Emergency Response. Office of Scientific and Technical Information (OSTI), April 2018. http://dx.doi.org/10.2172/1432629.
Lawson, Matthew, Bert J. Debusschere, Habib N. Najm, Khachik Sargsyan, and Jonathan H. Frank. Uncertainty quantification of cinematic imaging for development of predictive simulations of turbulent combustion. Office of Scientific and Technical Information (OSTI), September 2010. http://dx.doi.org/10.2172/1011617.
Ye, Ming. Computational Bayesian Framework for Quantification and Reduction of Predictive Uncertainty in Subsurface Environmental Modeling. Office of Scientific and Technical Information (OSTI), January 2019. http://dx.doi.org/10.2172/1491235.
Marzouk, Youssef. Final Report, DOE Early Career Award: Predictive modeling of complex physical systems: new tools for statistical inference, uncertainty quantification, and experimental design. Office of Scientific and Technical Information (OSTI), August 2016. http://dx.doi.org/10.2172/1312896.
Cattaneo, Matias D., Richard K. Crump, and Weining Wang. Beta-Sorted Portfolios. Federal Reserve Bank of New York, July 2023. http://dx.doi.org/10.59576/sr.1068.
Gonzales, Lindsey M., Thomas M. Hall, Kendra L. Van Buren, Steven R. Anton, and Francois M. Hemez. Quantification of Prediction Bounds Caused by Model Form Uncertainty. Office of Scientific and Technical Information (OSTI), September 2013. http://dx.doi.org/10.2172/1095195.
Adams, Jason, Brandon Berman, Joshua Michalenko, and Rina Deka. Non-conformity Scores for High-Quality Uncertainty Quantification from Conformal Prediction. Office of Scientific and Technical Information (OSTI), September 2023. http://dx.doi.org/10.2172/2430248.
Vecherin, Sergey, Stephen Ketcham, Aaron Meyer, Kyle Dunn, Jacob Desmond, and Michael Parker. Short-range near-surface seismic ensemble predictions and uncertainty quantification for layered medium. Engineer Research and Development Center (U.S.), September 2022. http://dx.doi.org/10.21079/11681/45300.
Glimm, James, Yunha Lee, Kenny Q. Ye, and David H. Sharp. Prediction Using Numerical Simulations, A Bayesian Framework for Uncertainty Quantification and its Statistical Challenge. Fort Belvoir, VA: Defense Technical Information Center, January 2002. http://dx.doi.org/10.21236/ada417842.