Dissertations / Theses on the topic 'Parameter uncertainty'


Consult the top 50 dissertations / theses for your research on the topic 'Parameter uncertainty.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and a bibliographic reference to the chosen work will be generated automatically in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Sui, Liqi. "Uncertainty management in parameter identification." Thesis, Compiègne, 2017. http://www.theses.fr/2017COMP2330/document.

Full text
Abstract:
In order to obtain more predictive and accurate simulations of mechanical behaviour in the practical environment, more and more complex material models have been developed. Nowadays, the characterization of material properties remains a top-priority objective. It requires dedicated identification methods and tests in conditions as close as possible to the real ones. This thesis aims at developing an effective identification methodology to find the material property parameters, taking advantage of all available information. The information used for the identification is theoretical, experimental, and empirical: the theoretical information is linked to the mechanical models, whose uncertainty is epistemic; the experimental information consists of full-field measurements, whose uncertainty is aleatory; the empirical information is related to the prior information, with epistemic uncertainty as well. The main difficulty is that the available information is not always reliable and its corresponding uncertainty is heterogeneous. This difficulty is overcome by the introduction of the theory of belief functions. By offering a general framework to represent and quantify the heterogeneous uncertainties, the performance of the identification is improved. A strategy based on belief functions is proposed to identify macro and micro elastic properties of multi-structure materials. In this strategy, model and measurement uncertainties are analysed and quantified. This strategy is subsequently extended to take prior information into consideration and to quantify its corresponding uncertainty.
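The fusion of heterogeneous evidence that this strategy relies on can be illustrated with Dempster's rule of combination, the basic operator of belief function theory. The sketch below is illustrative only: the two-hypothesis frame and the mass values are invented, not taken from the thesis.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dict: frozenset of hypotheses -> mass)."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb  # mass falling on the empty set
    # Dempster normalisation: redistribute the conflicting mass
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# hypothetical evidence about whether a stiffness parameter is "low" or "high"
low, high = frozenset(["low"]), frozenset(["high"])
theta = low | high                            # whole frame = total ignorance
m_model = {low: 0.6, theta: 0.4}              # epistemic (model) evidence
m_meas = {low: 0.5, high: 0.3, theta: 0.2}    # measurement-based evidence
fused = dempster_combine(m_model, m_meas)
```

Mass left on the whole frame `theta` expresses epistemic ignorance, which is exactly what a single probability distribution cannot represent.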
APA, Harvard, Vancouver, ISO, and other styles
2

Mao, Yi. "Domain knowledge, uncertainty, and parameter constraints." Diss., Georgia Institute of Technology, 2010. http://hdl.handle.net/1853/37295.

3

Clouse, Randy Wayne. "Evaluation of GLEAMS considering parameter uncertainty." Thesis, Virginia Tech, 1996. http://hdl.handle.net/10919/44516.

4

Clouse, Randy W. "Evaluation of GLEAMS considering parameter uncertainty /." This resource online, 1996. http://scholar.lib.vt.edu/theses/available/etd-09042008-063009/.

5

Tao, Zuoyu. "Improved uncertainty estimates for geophysical parameter retrieval." Thesis, Massachusetts Institute of Technology, 2010. http://hdl.handle.net/1721.1/61516.

Abstract:
Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2010.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (p. 167-169).
Algorithms for retrieval of geophysical parameters from radiances measured by instruments onboard satellites play a large role in helping scientists monitor the state of the planet. Current retrieval algorithms based on neural networks are superior in accuracy and speed compared to physics-based algorithms like iterated minimum variance (IMV). However, unlike IMV, they do not provide any form of error estimation. This thesis examines the suitability of several different approaches to adding confidence intervals and other methods of error estimation to the retrieval algorithm, as well as alternative machine learning methods that can both retrieve the desired parameters and assign error bars. Test datasets included current-generation operational instruments like AIRS/AMSU as well as a hypothetical future hyperspectral microwave sounder. Mixture density networks (MDN) and Sparse Pseudo-Input Gaussian processes (SPGP) were found to be the most accurate at variance prediction. Both of these are novel methods in the field of remote sensing. MDNs also had training and testing times similar to neural networks, while SPGPs often took three times as long to train in typical cases. As a baseline, neural networks trained to estimate variance were also tested, but were found to lack accuracy and reliability compared to the other methods.
by Zuoyu Tao.
M.Eng.
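The variance prediction that makes MDNs attractive here follows directly from mixture moments: an MDN outputs mixture weights, means, and spreads, and the predictive variance combines within-component and between-component terms via the law of total variance. A minimal sketch (the three-component values are made up for illustration, not retrieved quantities):

```python
import numpy as np

def mixture_mean_var(pi, mu, sigma):
    """Predictive mean and variance of a 1-D Gaussian mixture head."""
    pi, mu, sigma = map(np.asarray, (pi, mu, sigma))
    mean = np.sum(pi * mu)
    # law of total variance: within-component + between-component spread
    var = np.sum(pi * (sigma ** 2 + mu ** 2)) - mean ** 2
    return mean, var

# made-up 3-component output for one retrieved parameter (e.g. a temperature in K)
m, v = mixture_mean_var([0.5, 0.3, 0.2], [280.0, 285.0, 290.0], [2.0, 3.0, 4.0])
```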
6

Kumar, Dipmani. "Parameter uncertainty in nonpoint source pollution modeling." Diss., This resource online, 1995. http://scholar.lib.vt.edu/theses/available/etd-10042006-143856/.

7

Green, Nathan. "Optimal intervention of epidemic models with parameter uncertainty." Thesis, University of Liverpool, 2005. http://www.manchester.ac.uk/escholar/uk-ac-man-scw:76732.

8

Hagen, David Robert. "Parameter and topology uncertainty for optimal experimental design." Thesis, Massachusetts Institute of Technology, 2014. http://hdl.handle.net/1721.1/90148.

Abstract:
Thesis: Ph. D., Massachusetts Institute of Technology, Department of Biological Engineering, 2014.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 157-169).
A major effort of systems biology is the building of accurate and detailed models of biological systems. Because biological models are large, complex, and highly nonlinear, building accurate models requires large quantities of data and algorithms appropriate to translate this data into a model of the underlying system. This thesis describes the development and application of several algorithms for simulation, quantification of uncertainty, and optimal experimental design for reducing uncertainty. We applied a previously described algorithm for choosing optimal experiments for reducing parameter uncertainty as estimated by the Fisher information matrix. We found, using a computational scenario where the true parameters were unknown, that the parameters of the model could be recovered from noisy data in a small number of experiments if the experiments were chosen well. We developed a method for quickly and accurately approximating the probability distribution over a set of topologies given a particular data set. The method was based on a linearization applied at the maximum a posteriori parameters. This method was found to be about as fast as existing heuristics but much closer to the true probability distribution as computed by an expensive Monte Carlo routine. We developed a method for optimal experimental design to reduce topology uncertainty based on the linear method for topology probability. This method was a Monte Carlo method that used the linear method to quickly evaluate the topology uncertainty that would result from possible data sets of each candidate experiment. We applied the method to a model of ErbB signaling. Finally, we developed a method for reducing the size of models defined as rule-based models. Unlike existing methods, this method handles compartments of models and allows for cycles between monomers. 
The methods developed here generally improve the detail at which models can be built, as well as quantify how well they have been built and suggest experiments to build them even better.
by David Robert Hagen.
Ph. D.
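The Fisher-information criterion used above to rank candidate experiments can be sketched as follows: for additive Gaussian noise, F = JᵀJ/σ² with J the local parameter-sensitivity matrix, and a D-optimal choice maximises log det F. The sensitivity matrices below are stand-ins, not the thesis's biological models:

```python
import numpy as np

def d_optimality(J, noise_var=1.0):
    """log det of the Fisher information F = J^T J / noise_var."""
    F = J.T @ J / noise_var
    return np.linalg.slogdet(F)[1]

# stand-in sensitivity matrices: experiment "a" measures each of three
# parameters four times; experiment "b" has ten-times-weaker sensitivities
J_a = np.vstack([np.eye(3)] * 4)
J_b = 0.1 * J_a
scores = {"a": d_optimality(J_a), "b": d_optimality(J_b)}
best = max(scores, key=scores.get)
```

The same machinery extends to choosing among many candidates: compute each candidate's information matrix and pick the one with the largest log-determinant (smallest parameter-uncertainty volume).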
9

Macatula, Romcholo Yulo. "Linear Parameter Uncertainty Quantification using Surrogate Gaussian Processes." Thesis, Virginia Tech, 2020. http://hdl.handle.net/10919/99411.

Abstract:
We consider uncertainty quantification using surrogate Gaussian processes. We take a previous sampling algorithm and provide a closed-form expression of the resulting posterior distribution. We extend the method to weighted least squares and to a Bayesian approach, both with closed-form expressions of the resulting posterior distributions. We test the methods on 1D deconvolution and 2D tomography. Our new methods improve on the previous algorithm but fall short of a typical Bayesian inference method in some aspects.
Master of Science
Parameter uncertainty quantification seeks to determine both estimates and uncertainty regarding estimates of model parameters. Example of model parameters can include physical properties such as density, growth rates, or even deblurred images. Previous work has shown that replacing data with a surrogate model can provide promising estimates with low uncertainty. We extend the previous methods in the specific field of linear models. Theoretical results are tested on simulated computed tomography problems.
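For the linear-model setting, the closed-form posterior referred to above is the standard linear-Gaussian result. A sketch under a zero-mean isotropic prior (the forward matrix and data are toy values, not the thesis's deconvolution or tomography problems):

```python
import numpy as np

def linear_gaussian_posterior(A, y, noise_var, prior_var):
    """Posterior N(mean, cov) for y = A x + e, e ~ N(0, noise_var I),
    under the prior x ~ N(0, prior_var I)."""
    n = A.shape[1]
    precision = A.T @ A / noise_var + np.eye(n) / prior_var
    cov = np.linalg.inv(precision)
    mean = cov @ (A.T @ y) / noise_var
    return mean, cov

# toy forward operator and noise-free data (stand-ins for illustration)
A = np.array([[1.0, 0.5], [0.0, 1.0], [1.0, 1.0]])
x_true = np.array([2.0, -1.0])
mean, cov = linear_gaussian_posterior(A, A @ x_true, noise_var=0.01, prior_var=100.0)
```

The diagonal of `cov` provides the per-parameter uncertainty estimate alongside the point estimate `mean`, which is exactly the pairing parameter uncertainty quantification seeks.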
10

Blasone, Roberta-Serena. "Parameter estimation and uncertainty assessment in hydrological modelling." Kgs. Lyngby, 2007. http://www.er.dtu.dk/publications/fulltext/2007/MR2007-105.pdf.

11

Giampellegrini, Laurent. "Uncertainty in correlation-driven operational modal parameter estimation." Thesis, University College London (University of London), 2007. http://discovery.ucl.ac.uk/1445512/.

Abstract:
Due to the practical advantages over traditional input-output testing, operational or output-only modal analysis is receiving increased attention when the modal parameters of large civil engineering structures are of interest. However, as a consequence of the random nature of ambient loading and the unknown relationship between excitation and response, the identified operational modal parameters are inevitably corrupted by errors. Whether the estimated modal data is used to update a finite element model or different sets of modal parameters are used as a damage indicator, it is desirable to know the extent of the error in the modal data for more accurate response predictions or to assess, if changes in the modal data are indicative of damage or just the result of the random error inherent in the identification process. In this thesis, two techniques are investigated to estimate the error in the modal parameters identified from response data only: a perturbation and a bootstrap based method. The perturbation method, applicable exclusively to the correlation-driven stochastic subspace identification algorithm (SSI/Cov), is a two stage procedure. It operates on correlation functions estimated from a single set of response measurements and, in a first step, the perturbations to these correlation function estimates need to be determined. A robust, data-driven method is developed for this purpose. The next step consists in propagating these perturbations through the algorithm resulting in an estimate of the sensitivities of the modal data to these perturbations. Combining the sensitivities with the perturbations, an estimate of both the random and bias errors in the SSI/Cov-identified modal parameters is found. The bootstrap technique involves creating pseudo time-series by resampling from the only available set of response measurements. 
With this additional data at hand, a modal identification is performed for each set of data and the errors in the modal parameters are determined by sample statistics. However, the bootstrap itself introduces errors in the computed sample statistics. Three bootstrapping schemes are investigated in relation to operational modal analysis, and an automated, optimal block-length selection is implemented to minimise the error introduced by the bootstrap. As opposed to the perturbation method, the bootstrap technique is more versatile and is not restricted to correlation-driven operational modal analysis. Its applicability to the data-driven stochastic subspace identification algorithm (SSI/Data) for error prediction of the SSI/Data-identified modal data is explored. The performance of the two techniques is assessed by simulation on simple systems. Monte Carlo-type error estimates are used as a benchmark against which the predicted errors in the modal parameters computed from a single response history by both techniques are validated. Both techniques are assessed in terms of their accuracy and stability in predicting the uncertainty in the operational modal parameters, and their computational efficiency is compared. Also, the performance of the bootstrap and the perturbation-theoretic method is investigated under hostile ambient excitation conditions such as non-stationarity and the presence of deterministic components, and the limitations of both methods are clearly exposed.
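The block-resampling idea behind these bootstrap schemes can be sketched with a moving-block bootstrap, which preserves short-range correlation within each resampled block; the synthetic signal and the block length below are arbitrary choices, not the thesis's data or its optimal block length:

```python
import numpy as np

def moving_block_bootstrap(x, block_len, rng):
    """Resample a series by concatenating randomly placed contiguous blocks."""
    n = len(x)
    n_blocks = -(-n // block_len)  # ceil division
    starts = rng.integers(0, n - block_len + 1, size=n_blocks)
    return np.concatenate([x[s:s + block_len] for s in starts])[:n]

rng = np.random.default_rng(1)
t = np.arange(500)
x = np.sin(0.1 * t) + 0.1 * rng.normal(size=500)   # synthetic "response history"
replicates = [moving_block_bootstrap(x, 50, rng) for _ in range(200)]
# sample statistics across replicates estimate the identification error
se_of_mean = np.std([r.mean() for r in replicates])
```

In the operational-modal setting, each replicate would be fed through the identification algorithm and the spread of the identified frequencies and damping ratios taken as the error estimate.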
12

Vrugt, Jasper Alexander. "Towards improved treatment of parameter uncertainty in hydrologic modeling." [S.l. : Amsterdam : s.n.] ; Universiteit van Amsterdam [Host], 2004. http://dare.uva.nl/document/77207.

13

Bodine, Andrew James. "The Effect of Item Parameter Uncertainty on Test Reliability." The Ohio State University, 2012. http://rave.ohiolink.edu/etdc/view?acc_num=osu1343316705.

14

Noronha, Alston Marian. "Information theory approach to quantifying parameter uncertainty in groundwater modeling." Diss., UMK access, 2005.

Abstract:
Thesis (M.S.)--School of Computing and Engineering. University of Missouri--Kansas City, 2005.
"A thesis in civil engineering." Typescript. Advisor: Jejung Lee. Vita. Title from "catalog record" of the print edition. Description based on contents viewed March 12, 2007. Includes bibliographical references (leaves 96-100). Online version of the print edition.
15

BEZERRA, BERNARDO VIEIRA. "STOCHASTIC HYDROTHERMAL SCHEDULING WITH PARAMETER UNCERTAINTY IN THE STREAMFLOW MODELS." PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO, 2015. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=25337@1.

Abstract:
PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO
COORDENAÇÃO DE APERFEIÇOAMENTO DO PESSOAL DE ENSINO SUPERIOR
PROGRAMA DE EXCELENCIA ACADEMICA
The objective of the medium- and long-term hydrothermal scheduling problem is to define operational targets for each power plant in order to meet the load at the lowest expected cost while respecting the operational constraints. Stochastic Dynamic Programming (SDP) and Stochastic Dual Dynamic Programming (SDDP) algorithms have been widely applied to determine the optimal operating policy for the hydrothermal dispatch. In both approaches, the stochasticity of the inflows is usually produced by periodic auto-regressive models, PAR(p), whose parameters are estimated from available historical data. As the estimators are a function of random phenomena, besides the inflow uncertainty there is statistical parameter uncertainty, which is not captured in the standard PAR(p) model. The existence of uncertainty in the parameters means that there is a risk that the hydrothermal operating policy will not be optimal. This thesis presents a methodology to incorporate PAR(p) parameter uncertainty into stochastic hydrothermal scheduling and to assess the resulting impact on the computation of a hydro operations policy. Case studies are presented illustrating the impact of parameter uncertainty on system operating costs and how an operating policy that incorporates this uncertainty can reduce this impact.
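The distinction between inflow uncertainty and parameter uncertainty can be sketched with an AR(1) special case of the PAR(p) model: the fitted coefficient is itself an estimate with a standard error, so an uncertainty-aware policy would carry a distribution of plausible coefficients rather than a single point estimate. The data below are synthetic, not a historical inflow record:

```python
import numpy as np

rng = np.random.default_rng(2)
# synthetic standardised inflow anomalies from an AR(1) with true coefficient 0.7
true_phi, n = 0.7, 80
z = np.zeros(n)
for t in range(1, n):
    z[t] = true_phi * z[t - 1] + rng.normal()

# point estimate of the coefficient and the standard error of that estimator
x, y = z[:-1], z[1:]
phi_hat = (x @ y) / (x @ x)
resid = y - phi_hat * x
phi_se = np.sqrt((resid @ resid) / (len(y) - 1) / (x @ x))

# parameter uncertainty: a distribution of coefficients, not one fixed value
phi_samples = rng.normal(phi_hat, phi_se, size=1000)
```

Simulating inflow scenarios under each sampled coefficient, instead of only under `phi_hat`, is the kind of extension the thesis builds into the scheduling problem.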
16

Greenall, Nicholas Robert. "Parameter extraction and uncertainty in terahertz time-domain spectroscopic measurements." Thesis, University of Leeds, 2017. http://etheses.whiterose.ac.uk/19045/.

Abstract:
Terahertz (THz) time-domain spectroscopy is emerging as a powerful tool to characterise samples both chemically and physically. In this work, different methods of estimating the spectroscopic parameters of a sample, its thickness, and the uncertainty of these estimates are presented. A number of case studies are also examined, including paracetamol polymorphs, and a method of creating a spectroscopic simulant of Semtex-H is presented. Approximations of the sample spectroscopic parameters, the real refractive index and the absorption coefficient, were formed by building up a simple model of the sample's interaction with THz radiation. Methods of correcting unwrapping error in the real refractive index were developed, including a method to correct in the presence of discontinuities in the refractive index itself. These approximations were then applied to extract parameters of both lactose and paracetamol samples. An algorithm to generate spectroscopic simulants was developed and applied to Semtex-H. These simulants consisted of simple mixtures of inert compounds, which were measured and found to have spectra similar to the target sample. Methods of fitting resonant models to the sample response were developed to extract both the spectroscopic parameters and the sample thickness. These were refined by calibrating for the Gaussian beam profile of the THz radiation, which was shown to increase the accuracy of the extracted thickness. The thickness and spectroscopic parameters of a lactose sample were measured as a function of temperature, and it was found that the change in spectroscopic parameters was underestimated when the thickness was assumed constant. A resonant model for multilayered samples was then developed and used to characterise IPA in a flow-cell measurement. This was then combined with a method of time segmentation of the sample response to extract spectroscopic parameters and sample thickness simultaneously.
This was then applied to a two-layer sample to extract the spectroscopic parameters of a silicon and a quartz layer from a single measurement. Finally, methods of propagating the uncertainty from the time domain to the spectroscopic parameters were developed. These were based on a multivariate normal statistical model of the measurements and were compared to numerical bootstrap and Monte Carlo estimates. These were used to develop confidence intervals for the extracted refractive index, absorption coefficient, and thickness. These methods were applied to both a lactose and a quartz sample.
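The Monte Carlo end of that comparison can be sketched with the simplest time-of-flight relation for the real refractive index, n = 1 + c·Δt/d, propagating assumed Gaussian uncertainties in the delay Δt and the thickness d into a confidence interval. The numbers below are illustrative, not measured values from the thesis:

```python
import numpy as np

C = 299792458.0  # speed of light in vacuum, m/s
rng = np.random.default_rng(3)

# assumed Gaussian uncertainties on the measured delay and the sample thickness
dt = rng.normal(3.0e-12, 0.05e-12, size=100_000)   # delay, s
d = rng.normal(0.5e-3, 5.0e-6, size=100_000)       # thickness, m

n = 1.0 + C * dt / d                 # time-of-flight refractive index estimate
lo, hi = np.percentile(n, [2.5, 97.5])
```

The analytic multivariate-normal propagation in the thesis replaces this sampling with a linearised covariance calculation, which the Monte Carlo interval then serves to validate.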
17

How, Jonathan P. "Robust control design with real parameter uncertainty using absolute stability." Thesis, Massachusetts Institute of Technology, 1993. http://hdl.handle.net/1721.1/12538.

Abstract:
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Aeronautics and Astronautics, 1993.
GRSN 640480
Includes bibliographical references (p. 185-198).
by Jonathan P. How.
Ph.D.
18

Plaskett, Joseph H. "Parameter uncertainty and modeling of sludge dewatering in one dimension." PDXScholar, 1992. https://pdxscholar.library.pdx.edu/open_access_etds/4432.

Abstract:
Separation of liquid from solids is a necessary step in the ultimate disposal of wastewater sludges. Most commonly, sludges are dewatered by pressure-filtration methods. Mathematical models of the physics of the sludge dewatering process would provide the ability to predict dewatering performance and optimize the design and operation of dewatering facilities.
19

Mangado, López Nerea. "Cochlear implantation modeling and functional evaluation considering uncertainty and parameter variability." Doctoral thesis, Universitat Pompeu Fabra, 2017. http://hdl.handle.net/10803/586214.

Abstract:
Recent innovations in computational modeling have led to important advances towards the development of predictive tools to simulate and optimize surgery outcomes. This thesis is focused on cochlear implantation surgery, technique which allows recovering functional hearing in patients with severe deafness. The success of this intervention, however, relies on factors, which are unpredictable or difficult to control. This, combined with the high variability of hearing restoration levels among patients, makes the prediction of this surgery a very challenging process. The aim of this thesis is to develop computational tools to assess the functional outcome of the cochlear implantation. To this end, this thesis addresses a set of challenges, such as the automatic optimization of the implantation and stimulation parameters by evaluating the neural response evoked by the cochlear implant or the functional evaluation of a large set of virtual patients.
20

Binder, Tanja [Verfasser], and Ekaterina [Akademischer Betreuer] Kostina. "Optimization under uncertainty : robust parameter estimation with erroneous measurements and uncertain model coefficients / Tanja Binder. Betreuer: Ekaterina Kostina." Marburg : Philipps-Universität Marburg, 2013. http://d-nb.info/1032315245/34.

21

Durisek, Nicholas Joseph. "Simultaneous overall measurement uncertainty reduction for multi-parameter macro-measurement system design /." The Ohio State University, 1996. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487942739808246.

22

Söderström, Ulf. "Monetary policy under uncertainty." Doctoral thesis, Handelshögskolan i Stockholm, Samhällsekonomi (S), 1999. http://urn.kb.se/resolve?urn=urn:nbn:se:hhs:diva-646.

Abstract:
This thesis contains four chapters, each of which examines different aspects of the uncertainty facing monetary policymakers. 'Monetary policy and market interest rates' investigates how interest rates set on financial markets respond to policy actions taken by the monetary authorities. The reaction of market rates is shown to depend crucially on market participants' interpretation of the factors underlying the policy move. These theoretical predictions find support in an empirical analysis of the U.S. financial markets. 'Predicting monetary policy using federal funds futures prices' examines how prices of federal funds futures contracts can be used to predict policy moves by the Federal Reserve. Although the futures prices exhibit systematic variation across trading days and calendar months, they are shown to be fairly successful in predicting the federal funds rate target that will prevail after the next meeting of the Federal Open Market Committee from 1994 to 1998. 'Monetary policy with uncertain parameters' examines the effects of parameter uncertainty on the optimal monetary policy strategy. Under certain parameter configurations, increasing uncertainty is shown to lead to more aggressive policy, in contrast to the accepted wisdom. 'Should central banks be more aggressive?' examines why a certain class of monetary policy models leads to more aggressive policy prescriptions than what is observed in reality. These counterfactual results are shown to be due to model restrictions rather than central banks being too cautious in their policy behavior. An unrestricted model, taking the dynamics of the economy and multiplicative parameter uncertainty into account, leads to optimal policy prescriptions which are very close to observed Federal Reserve behavior.

Diss. Stockholm : Handelshögskolan, 1999
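The 'accepted wisdom' the third chapter contrasts with is Brainard-style attenuation: when the policy multiplier is uncertain, the optimal instrument response shrinks as that uncertainty grows. A one-line sketch of this baseline result under the textbook quadratic-loss model (not the thesis's unrestricted model):

```python
# Baseline attenuation result: with outcome y = a*u + e, multiplier a having
# mean a_bar and variance var_a (independent of the shock e), minimising
# E[y^2] over the instrument u gives  u* = -a_bar * E[e] / (a_bar**2 + var_a).
# More multiplier uncertainty therefore means a *smaller* response.
def optimal_policy(a_bar, var_a, e_mean):
    return -a_bar * e_mean / (a_bar ** 2 + var_a)

u_certain = optimal_policy(1.0, 0.0, 1.0)    # fully offsets the expected shock
u_uncertain = optimal_policy(1.0, 0.5, 1.0)  # attenuated response
```

The chapter's finding that uncertainty can instead make optimal policy more aggressive arises under parameter configurations outside this simple static case.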

23

Blanchard, Emmanuel. "Polynomial Chaos Approaches to Parameter Estimation and Control Design for Mechanical Systems with Uncertain Parameters." Diss., Virginia Tech, 2010. http://hdl.handle.net/10919/26727.

Abstract:
Mechanical systems operate under parametric and external excitation uncertainties. The polynomial chaos approach has been shown to be more efficient than Monte Carlo approaches for quantifying the effects of such uncertainties on the system response. This work uses the polynomial chaos framework to develop new methodologies for the simulation, parameter estimation, and control of mechanical systems with uncertainty. This study has led to new computational approaches for parameter estimation in nonlinear mechanical systems. The first approach is a polynomial-chaos based Bayesian approach in which maximum likelihood estimates are obtained by minimizing a cost function derived from the Bayesian theorem. The second approach is based on the Extended Kalman Filter (EKF). The error covariances needed for the EKF approach are computed from polynomial chaos expansions, and the EKF is used to update the polynomial chaos representation of the uncertain states and the uncertain parameters. The advantages and drawbacks of each method have been investigated. This study has demonstrated the effectiveness of the polynomial chaos approach for control systems analysis. For control system design the study has focused on the LQR problem when dealing with parametric uncertainties. The LQR problem was written as an optimality problem using Lagrange multipliers in an extended form associated with the polynomial chaos framework. The solution to the H-infinity problem as well as the H2 problem can be seen as extensions of the LQR problem. This method might therefore have the potential of being a first step towards the development of computationally efficient numerical methods for H-infinity design with parametric uncertainties. I would like to gratefully acknowledge the support provided for this work under NASA Grant NNL05AA18A.
Ph. D.
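The efficiency claim for polynomial chaos over Monte Carlo can be illustrated in its quadrature form: a handful of Gauss-Hermite nodes recovers the exact mean and variance of a smooth response to a Gaussian parameter, where Monte Carlo would need many samples. The quadratic response below is a toy stand-in, not one of the thesis's mechanical models:

```python
import numpy as np

# probabilists' Gauss-Hermite nodes/weights; normalise weights to sum to one
nodes, weights = np.polynomial.hermite_e.hermegauss(8)
weights = weights / weights.sum()

mu, sigma = 1.0, 0.3            # uncertain parameter k ~ N(mu, sigma^2)
k = mu + sigma * nodes
response = k ** 2               # toy nonlinear model response

mean = weights @ response       # exact for low-degree polynomial responses
var = weights @ (response - mean) ** 2
```

With 8 nodes the quadrature is exact for polynomial integrands up to degree 15, so both moments here match the analytic values to machine precision.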
24

Sar, Preeti. "Eco-Inspired Robust Control Design Algorithm For Linear Systems with Real Parameter Uncertainty." The Ohio State University, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=osu1367439491.

25

Alyaseri, Isam. "QUALITATIVE AND QUANTITATIVE PROCEDURE FOR UNCERTAINTY ANALYSIS IN LIFE CYCLE ASSESSMENT OF WASTEWATER SOLIDS TREATMENT PROCESSES." OpenSIUC, 2014. https://opensiuc.lib.siu.edu/dissertations/795.

Abstract:
In order to perform the environmental analysis and find the best management in the wastewater treatment processes using life cycle assessment (LCA) method, uncertainty in LCA has to be evaluated. A qualitative and quantitative procedure was constructed to deal with uncertainty for the wastewater treatment LCA studies during the inventory and analysis stages. The qualitative steps in the procedure include setting rules for the inclusion of inputs and outputs in the life cycle inventory (LCI), setting rules for the proper collection of data, identifying and conducting data collection analysis for the significant contributors in the model, evaluating data quality indicators, selecting the proper life cycle impact assessment (LCIA) method, evaluating the uncertainty in the model through different cultural perspectives, and comparing with other LCIA methods. The quantitative steps in the procedure include assigning the best guess value and the proper distribution for each input or output in the model, calculating the uncertainty for those inputs or outputs based on data characteristics and the data quality indicators, and finally using probabilistic analysis (Monte Carlo simulation) to estimate uncertainty in the outcomes. Environmental burdens from the solids handling unit at Bissell Point Wastewater Treatment Plant (BPWWTP) in Saint Louis, Missouri was analyzed. Plant specific data plus literature data were used to build an input-output model. Environmental performance of an existing treatment scenario (dewatering-multiple hearth incineration-ash to landfill) was analyzed. To improve the environmental performance, two alternative scenarios (fluid bed incineration and anaerobic digestion) were proposed, constructed, and evaluated. System boundaries were set to include the construction, operation and dismantling phases. 
The impact assessment method chosen was Eco-indicator 99 and the impact categories were: carcinogenicity, respiratory organics and inorganics, climate change, radiation, ozone depletion, ecotoxicity, acidification-eutrophication, and minerals and fossil fuels depletion. Analysis of the existing scenario shows that most of the impacts came from the operation phase in the categories related to fossil fuels depletion, respiratory inorganics, and carcinogens, due to the energy consumed and emissions from incineration. The proposed alternatives showed better performance than the existing treatment, and fluid bed incineration performed better than anaerobic digestion. Uncertainty analysis showed a 57.6% probability that fluid bed incineration has less environmental impact than anaerobic digestion. Based on single-score ranking in the Eco-indicator 99 method, the environmental impact order is: multiple hearth incineration > anaerobic digestion > fluid bed incineration. This order was the same for the three model perspectives in the Eco-indicator 99 method and when using other LCIA methods (Eco-point 97 and CML 2000). The study showed that incorporating qualitative/quantitative uncertainty analysis into LCA gives more information than deterministic LCA and can strengthen the LCA study. The procedure tested in this study showed that Monte Carlo simulation can be used to quantify uncertainty in wastewater treatment studies, and the procedure can be used to analyze the performance of other treatment options. Although the analysis under different perspectives and different LCIA methods did not change the order of the scenarios, it showed a possibility of variation in the final outcomes of some categories. The study showed the importance of providing decision makers with the best and worst possible outcomes in any LCA study and informing them about the perspectives and assumptions used in the assessment. 
In comparative LCA, Monte Carlo simulation can perform uncertainty analysis only between two products or scenarios at a time, using the (A-B) approach, because the probability distributions of the outcomes overlap. It is recommended to extend the approach to cover more than two scenarios.
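The (A-B) comparison described in this abstract can be sketched in a few lines of Monte Carlo code. The distributions and parameter values below are illustrative assumptions, not the study's data:

```python
import numpy as np

def prob_a_better(mu_a, sd_a, mu_b, sd_b, runs=100_000, seed=0):
    """Monte Carlo comparison of two scenarios via the (A-B) approach:
    sample a single-score impact for each scenario and count how often
    the difference A - B is negative (scenario A has the lower impact)."""
    rng = np.random.default_rng(seed)
    # Hypothetical lognormal impact scores; mu/sd parameterise the underlying normal
    a = rng.lognormal(mu_a, sd_a, runs)
    b = rng.lognormal(mu_b, sd_b, runs)
    return float(np.mean(a - b < 0.0))

# Hypothetical single-score distributions for two treatment scenarios
p = prob_a_better(mu_a=1.0, sd_a=0.3, mu_b=1.05, sd_b=0.3)
```

Comparing the paired difference, rather than the two marginal distributions, is what makes the comparison meaningful when the outcome distributions overlap.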
APA, Harvard, Vancouver, ISO, and other styles
26

Xu, Yijun. "Uncertainty Quantification, State and Parameter Estimation in Power Systems Using Polynomial Chaos Based Methods." Diss., Virginia Tech, 2019. http://hdl.handle.net/10919/97876.

Full text
Abstract:
It is a well-known fact that a power system contains many sources of uncertainty. These uncertainties, arising from the loads, the renewables, the model and the measurements, influence the steady-state and dynamic response of the power system. Facing this problem, traditional methods, such as the Monte Carlo method and the perturbation method, are either too time-consuming or suffer from the strong nonlinearity of the system. To address this, this dissertation focuses on developing polynomial chaos based methods to replace the traditional ones. Using them, the uncertainties from the model and the measurements are propagated through the polynomial chaos bases at a set of collocation points. The approximated polynomial chaos coefficients contain the statistical information. The method greatly accelerates the calculation without losing accuracy, even when the system is highly stressed. In this dissertation, both the forward problem and the inverse problem of uncertainty quantification are discussed. The forward problems include the probabilistic power flow problem and statistical power system dynamic simulations. The generalized polynomial chaos method, the adaptive polynomial chaos-ANOVA method and the multi-element polynomial chaos method are introduced and compared. The case studies show that the proposed methods perform well in the statistical analysis of large-scale power systems. The inverse problems include the state and parameter estimation problems. A novel polynomial-chaos-based Kalman filter is proposed, and comparison studies with traditional Kalman filters demonstrate its good performance. We further explore the area dynamic parameter estimation problem under the Bayesian inference framework. The polynomial chaos expansions are treated as the response surface of the full dynamic solver. 
Combined with a hybrid Markov chain Monte Carlo method, the proposed method yields very high estimation accuracy while greatly reducing the computing time. For both the forward and the inverse problems, the polynomial chaos based methods have shown great advantages over the traditional methods. These computational techniques can improve the efficiency and accuracy of power system planning, guarantee the rationality and reliability of power system operations, and, finally, speed up power system dynamic security assessment.
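As a minimal illustration of the non-intrusive polynomial chaos idea described here (in one Gaussian dimension, not the power-system setting of the dissertation), a model response can be projected onto Hermite polynomials at quadrature points, after which the mean and variance are read directly off the coefficients:

```python
import numpy as np
from math import factorial
from numpy.polynomial import hermite_e as He

def pce_moments(g, degree=8, quad_order=20):
    """Project g(X), X ~ N(0,1), onto probabilists' Hermite polynomials He_n
    via Gauss-Hermite quadrature, then recover mean and variance from the
    polynomial chaos coefficients."""
    x, w = He.hermegauss(quad_order)            # nodes/weights for weight exp(-x^2/2)
    gx = g(x)
    norm = np.sqrt(2.0 * np.pi)
    # c_n = <g, He_n> / <He_n, He_n>, with <He_n, He_n> = n! * sqrt(2*pi)
    c = [np.sum(w * gx * He.hermeval(x, [0.0] * n + [1.0])) / (factorial(n) * norm)
         for n in range(degree + 1)]
    mean = c[0]                                  # c_0 is E[g(X)]
    var = sum(cn ** 2 * factorial(n) for n, cn in enumerate(c) if n > 0)
    return mean, var

m, v = pce_moments(np.exp)   # exact values: E[e^X] = e^0.5, Var = e^2 - e
```

The same projection idea extends to many random dimensions via tensorized or sparse collocation grids, which is where the computational savings over Monte Carlo sampling come from.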
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
27

Chandavarkar, Rohan Vivek. "Eco-inspired Robust Control Design for Linear Time-Invariant systems with Real Parameter Uncertainty." The Ohio State University, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=osu1373467190.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Qin, Wenyi. "Many server queueing models with heterogeneous servers and parameter uncertainty with customer contact centre applications." Thesis, University of Edinburgh, 2018. http://hdl.handle.net/1842/33167.

Full text
Abstract:
In this thesis, we study queueing systems with heterogeneous servers and service rate uncertainty under the Halfin-Whitt heavy traffic regime. First, we analyse many-server queues with abandonment when service rates are i.i.d. random variables. We derive a diffusion approximation using a novel method. The diffusion has a random drift, and hence, depending on the realisations of the service rates, the system can be in the Quality Driven (QD), Efficiency Driven (ED) or Quality-Efficiency-Driven (QED) regime. When the system is in the QD or QED regime, abandonments are negligible in the fluid limit, but when it is in the ED regime, the probability of abandonment converges to a non-zero value. We then analyse the optimal staffing levels that balance holding costs with staffing costs across these three regimes. We also analyse how the variance of the service rates influences the abandonment rate. Next, we focus on the state space collapse (SSC) phenomenon. We prove that, under some assumptions, the system process collapses to a lower dimensional process without losing essential information. We first formulate a general method for proving SSC results inside pools for heavy traffic systems using the hydrodynamic limit idea. Then we study SSC in multi-class queueing networks under Halfin-Whitt heavy traffic when service rates are i.i.d. random variables within pools. For such systems, exact analysis provides limited insight into their general properties; asymptotic analysis by diffusion approximation proves more effective. Further, limit theorems, which state that the diffusively scaled system process weakly converges to a diffusion process, are usually the central part of such asymptotic analysis, and the SSC result is key to proving such a limit. We conclude by giving examples of how SSC is applied to the analysis of such systems.
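A rough numerical sketch of the regime behaviour described above: drawing i.i.d. service rates and pooling them into one effective rate (an illustrative simplification, not the thesis's diffusion analysis), the classical Erlang-A (M/M/n+M) stationary recursion shows abandonment staying negligible when the realised capacity exceeds the arrival rate and becoming non-negligible in overload:

```python
import numpy as np

def erlang_a_abandonment(lam, mu_rates, theta, kmax=2000):
    """Stationary abandonment probability of an M/M/n+M queue, with the
    heterogeneous servers pooled into a single effective rate (mean of
    mu_rates). Uses the birth-death recursion pi_{k+1} = pi_k * lam / d_{k+1}
    and P(abandon) = theta * E[queue length] / lam."""
    n = len(mu_rates)
    mu = float(np.mean(mu_rates))                 # pooled-rate approximation
    pi = np.zeros(kmax + 1)
    pi[0] = 1.0
    for k in range(kmax):
        # death rate in state k+1: busy servers plus abandoning customers
        d = min(k + 1, n) * mu + max(k + 1 - n, 0) * theta
        pi[k + 1] = pi[k] * lam / d
    pi /= pi.sum()
    queue = np.maximum(np.arange(kmax + 1) - n, 0)
    return float(theta * np.dot(pi, queue) / lam)

rng = np.random.default_rng(0)
rates = rng.uniform(0.9, 1.1, size=100)           # i.i.d. service rates, mean ~1
p_low = erlang_a_abandonment(lam=80.0, mu_rates=rates, theta=0.5)    # underloaded
p_high = erlang_a_abandonment(lam=110.0, mu_rates=rates, theta=0.5)  # ED-type overload
```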
APA, Harvard, Vancouver, ISO, and other styles
29

Shimp, Samuel Kline III. "Vehicle Sprung Mass Parameter Estimation Using an Adaptive Polynomial-Chaos Method." Thesis, Virginia Tech, 2008. http://hdl.handle.net/10919/32056.

Full text
Abstract:
The polynomial-chaos expansion (PCE) approach to modeling provides an estimate of the probabilistic response of a dynamic system with uncertainty in the system parameters. A novel adaptive parameter estimation method exploiting the polynomial-chaos representation of a general quarter-car model is presented. Because the uncertainty was assumed to be concentrated in the sprung mass parameter, a novel pseudo mass matrix was developed for generating the state-space PCE model. In order to implement the PCE model in a real-time adaptation routine, a novel technique for representing PCE output equations was also developed. A simple parameter estimation law based on the output error between measured accelerations and PCE acceleration estimates was developed and evaluated through simulation and experiment. Simulation results of the novel adaptation algorithm demonstrate the estimation convergence properties as well as its limitations. The simulation results are further verified by a real-time experimental implementation on a quarter-car test rig. This work presents the first truly real-time implementation of a PCE model. The experimental real-time implementation of the novel adaptive PCE estimation method shows promising results by its ability to converge and maintain a stable estimate of the unknown parameter.
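A stripped-down sketch of the estimation idea, using a 1-DOF sprung-mass surrogate and a batch least-squares fit of the mass from acceleration data rather than the thesis's adaptive PCE scheme (all parameter values are hypothetical, and the simulation is idealized and noise-free):

```python
import numpy as np

def simulate(m, k, c, dt=1e-3, steps=5000, seed=0):
    """Semi-implicit Euler simulation of a 1-DOF sprung mass:
    m*x'' + c*x' + k*x = F(t), driven by a broadband random force."""
    rng = np.random.default_rng(seed)
    F = rng.normal(0.0, 50.0, steps)
    x = v = 0.0
    X, V, A = [], [], []
    for f in F:
        a = (f - k * x - c * v) / m
        X.append(x); V.append(v); A.append(a)
        v += a * dt
        x += v * dt
    return np.array(X), np.array(V), np.array(A), F

def estimate_mass(X, V, A, F, k, c):
    """Least-squares fit of 1/m in  a = (1/m) * (F - k*x - c*v),
    assuming stiffness k and damping c are known."""
    u = F - k * X - c * V
    inv_m = np.dot(u, A) / np.dot(u, u)
    return 1.0 / inv_m

X, V, A, F = simulate(m=290.0, k=16000.0, c=1000.0)
m_hat = estimate_mass(X, V, A, F, k=16000.0, c=1000.0)
```

In the adaptive setting of the thesis, the same output error (measured versus estimated acceleration) drives a recursive update of the parameter instead of a batch fit.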
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
30

Doehr, Rachel M. "Adventures at the Zero Lower Bound: A Bayesian Time-Varying Parameter Vector Autoregressive Analysis of Monetary Policy Uncertainty Shocks." Scholarship @ Claremont, 2016. http://scholarship.claremont.edu/cmc_theses/1318.

Full text
Abstract:
Using survey-based measures of future interest rate expectations from the Blue Chip Economic Indicators and the Survey of Professional Forecasters, we examine the relationship between monetary policy uncertainty, captured as the dispersion of interest rate forecasts, and fluctuations in real economic activity and core inflation. We use a flexible time-varying parameter vector autoregression (TVP-VAR) model to clearly isolate the dynamic effects of shocks to monetary policy uncertainty. To further study a possible nonlinear relationship between monetary policy uncertainty and the macroeconomic aggregates, we extract the impulse-response functions (IRFs) estimated at each quarter in the time series, and use a multivariate regression relating measures of the shape of the IRFs to the level of monetary policy uncertainty at that quarter in the TVP-VAR model, to gauge the relationship between uncertainty and the effectiveness of traditional monetary policy (shocks to the Federal Funds rate) and forward guidance (shocks to expected interest rates). The results show that monetary policy uncertainty can have a quantitatively significant impact on output, with a one standard deviation shock to uncertainty associated with a 0.6% rise in unemployment. The indirect effects are more substantial, with a one standard deviation increase in monetary policy uncertainty associated with a 23% decrease in the maximum response of unemployment to a forward guidance episode (an interest rate expectations shock). This evidence points to the importance of managing monetary policy uncertainty (clear and direct forward guidance) as a key policy tool both in stimulating economic activity and in propagating other monetary policy actions through the macroeconomy.
APA, Harvard, Vancouver, ISO, and other styles
31

Zhang, Xuesong. "Evaluating and developing parameter optimization and uncertainty analysis methods for a computationally intensive distributed hydrological model." [College Station, Tex.]: Texas A&M University, 2008. http://hdl.handle.net/1969.1/ETD-TAMU-3091.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Beckers, Joseph. "Modelling the Oro Moraine multi-aquifer system, role of geology, numerical model, parameter estimation and uncertainty." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape9/PQDD_0020/NQ38218.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Demaria, Eleonora Maria. "EVALUATING THE IMPACTS OF INPUT AND PARAMETER UNCERTAINTY ON STREAMFLOW SIMULATIONS IN LARGE UNDER-INSTRUMENTED BASINS." Diss., The University of Arizona, 2010. http://hdl.handle.net/10150/195641.

Full text
Abstract:
In data-poor regions around the world, particularly in less-privileged countries, hydrologists cannot always take advantage of available hydrological models to simulate a hydrological system, due to the lack of reliable measurements of hydrological variables, in particular rainfall and streamflow, needed to implement and evaluate these models. Rainfall estimates obtained with remotely deployed sensors constitute an excellent source of precipitation data for these basins; however, they are prone to errors that can potentially affect hydrologic simulations. Concurrently, limited access to streamflow measurements does not allow a detailed representation of the system's structure through parameter estimation techniques. This dissertation presents multiple studies that evaluate the usefulness of remotely sensed products for different hydrological applications and the sensitivity of simulated streamflow to parameter uncertainty across basins with different hydroclimatic characteristics, with the ultimate goal of increasing the applicability of land surface models in ungauged basins, particularly in South America. Paper 1 presents a sensitivity analysis of daily simulated streamflows to changes in model parameters along a hydroclimatic gradient. Parameters controlling the generation of surface and subsurface flow were targeted for the study. Results indicate that the sensitivity is strongly controlled by climate and that a more parsimonious version of the model could be implemented. Paper 2 explores how errors in satellite-estimated precipitation, due to infrequent satellite measurements, propagate through the simulation of a basin's hydrological cycle and impact the characteristics of peak streamflows within the basin. Findings indicate that nonlinearities in the hydrological cycle can introduce bias into streamflows simulated with error-corrupted precipitation. 
They also show that some characteristics of peak discharges are not conditioned by errors in satellite-estimated precipitation at a daily time step. Paper 3 evaluates the dominant sources of error in three satellite products when representing convective storms and how shifts in the location of the storm affect simulated peak streamflows in the basin. Results indicate that satellite products show some deficiencies in retrieving convective processes and that a ground bias correction can mitigate these deficiencies without sacrificing the potential for real-time hydrological applications. Finally, spatially shifted precipitation fields affect the magnitude of the peaks; however, their impact on the timing of the peaks is damped out by the system's response at a daily time scale.
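The bias mechanism explored in Paper 2 (a nonlinear hydrological response acting on zero-mean rainfall errors) can be illustrated with a toy threshold runoff function; by Jensen's inequality, mean-one multiplicative errors inflate the average simulated runoff. All distributions below are assumptions for illustration:

```python
import numpy as np

def runoff(p, capacity=10.0):
    """Convex threshold response: rainfall above a soil-storage capacity runs off."""
    return np.maximum(p - capacity, 0.0)

rng = np.random.default_rng(0)
rain = rng.gamma(shape=2.0, scale=4.0, size=5000)   # hypothetical "true" daily rainfall (mm)
# Mean-one multiplicative retrieval error, as if from infrequent satellite sampling
error = rng.lognormal(mean=-0.125, sigma=0.5, size=rain.size)

# Positive bias from passing noisy rainfall through the convex response
bias = runoff(rain * error).mean() - runoff(rain).mean()
```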
APA, Harvard, Vancouver, ISO, and other styles
34

Sun, Jin. "Conquering Variability for Robust and Low Power Designs." Diss., The University of Arizona, 2011. http://hdl.handle.net/10150/145458.

Full text
Abstract:
As device feature sizes shrink to the nano-scale, continuous technology scaling has led to a large increase in parameter variability in the semiconductor manufacturing process. According to the source of uncertainty, parameter variations can be classified into three categories: process variations, environmental variations, and temporal variations. All these variation sources exert significant influence on circuit performance and make it more challenging to characterize parameter variability and achieve robust, low-power designs. The scope of this dissertation is conquering parameter variability and successfully designing efficient yet robust integrated circuit (IC) systems. Previous experience has indicated that this issue needs to be tackled at every design stage of IC chips. In this dissertation, we propose several robust techniques for accurate variability characterization and efficient performance prediction under parameter variations. For the pre-silicon verification stage, a robust yield prediction scheme under limited descriptions of parameter uncertainties, a robust circuit performance prediction methodology based on the importance of uncertainties, and a robust gate sizing framework based on an ElasticR estimation model have been developed. These techniques provide possible solutions for achieving both prediction accuracy and computational efficiency in the early design stages. For the on-line validation stage, a dynamic workload balancing framework and an on-line self-tuning design methodology have been proposed for application-specific multi-core systems under variability-induced aging effects. These on-line validation techniques help alleviate device performance degradation due to parameter variations and extend device lifetime.
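A minimal Monte Carlo sketch of timing-yield prediction under process variations, using an alpha-power-law-style delay model and assumed threshold-voltage statistics (illustrative constants only, not the dissertation's methods):

```python
import numpy as np

def estimate_yield(spec_delay, n_samples=50_000, seed=0):
    """Monte Carlo timing-yield estimate for a toy gate chain whose delay
    depends on a process-varying threshold voltage Vth ~ N(0.30 V, 0.02 V)."""
    rng = np.random.default_rng(seed)
    vth = rng.normal(0.30, 0.02, n_samples)
    vdd = 1.0
    # Alpha-power-law-style delay model (hypothetical normalisation, alpha = 1.3)
    delay = 1.0 / (vdd - vth) ** 1.3
    return float(np.mean(delay <= spec_delay))

y_tight = estimate_yield(spec_delay=1.60)   # spec near the nominal delay
y_loose = estimate_yield(spec_delay=1.75)   # relaxed spec
```

Relaxing the timing specification increases the estimated yield, which is the basic trade-off a yield prediction scheme quantifies.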
APA, Harvard, Vancouver, ISO, and other styles
35

van, Wyk Hans-Werner. "A Variational Approach to Estimating Uncertain Parameters in Elliptic Systems." Diss., Virginia Tech, 2012. http://hdl.handle.net/10919/27635.

Full text
Abstract:
As simulation plays an increasingly central role in modern science and engineering research, by supplementing experiments, aiding in the prototyping of engineering systems, and informing decisions on safety and reliability, the need to quantify uncertainty in model outputs due to uncertainties in the model parameters becomes critical. However, the statistical characterization of the model parameters is rarely known. In this thesis, we propose a variational approach to solve the stochastic inverse problem of obtaining a statistical description of the diffusion coefficient in an elliptic partial differential equation, based on noisy measurements of the model output. We formulate the parameter identification problem as an infinite dimensional constrained optimization problem for which we establish the existence of minimizers as well as first order necessary conditions. A spectral approximation of the uncertain observations (via a truncated Karhunen-Loève expansion) allows us to approximate the infinite dimensional problem by a smooth, albeit high dimensional, deterministic optimization problem, the so-called 'finite noise' problem, in the space of functions with bounded mixed derivatives. We prove convergence of 'finite noise' minimizers to the appropriate infinite dimensional ones, and devise both a gradient based and a sampling based strategy for locating these numerically. Lastly, we illustrate our methods by means of numerical examples.
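The truncated Karhunen-Loève step mentioned above can be sketched on a discretized random field; the exponential covariance, correlation length and grid below are illustrative assumptions. The leading eigenpairs of the covariance matrix give the 'finite noise' modes, and the cumulative eigenvalue sum measures how much variance a given truncation retains:

```python
import numpy as np

def kl_modes(n=200, corr_len=0.2):
    """Discrete Karhunen-Loeve decomposition of a zero-mean random field on
    [0, 1] with exponential covariance C(s, t) = exp(-|s - t| / corr_len)."""
    t = np.linspace(0.0, 1.0, n)
    C = np.exp(-np.abs(t[:, None] - t[None, :]) / corr_len)
    vals, vecs = np.linalg.eigh(C)            # symmetric eigendecomposition, ascending
    return vals[::-1], vecs[:, ::-1], t       # reorder to descending eigenvalues

vals, vecs, t = kl_modes()
# Fraction of total variance captured by each truncation level
captured = np.cumsum(vals) / vals.sum()
```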
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
36

Bogodorova, Tetiana. "Modeling, Model Validation and Uncertainty Identification for Power System Analysis." Doctoral thesis, KTH, Elkraftteknik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-218008.

Full text
Abstract:
It is widely accepted that correct system modeling and identification are among the most important issues power system operators face when managing instability and post-contingency scenarios. The latter is usually handled using special computational tools that allow the operator to forecast and prevent system failure and to take appropriate actions according to protocols for different contingency cases in the system. To ensure that operators make correct simulation-based decisions, the power system models have to be validated continuously. This thesis investigates power system modeling, identification and validation problems that are formulated and based on data provided by operators, and offers new methods and deeper insight into the stages of an identification cycle, considering the specifics of power systems. One of the problems this thesis tackles is the selection of a modeling and simulation environment that provides transparency and the possibility of unambiguous model exchange between system operators. Modelica, as an equation-based language, fulfills these requirements. In this thesis, Modelica phasor time domain models were developed and software-to-software validated against conventional simulation environments, i.e. SPS/Simulink and PSAT in MATLAB. Parameter estimation tasks for Modelica models require a modular and extensible toolbox. Thus, RaPiD Toolbox, a framework that provides system identification algorithms for Modelica models, was developed in MATLAB. Contributions of this thesis include an implementation of the Particle Filter algorithm and validation metrics for parameter identification. The performance of the proposed algorithm has been compared with the Particle Swarm Optimization (PSO) algorithm combined with simplex search and parallelized for computational speed-up. The Particle Filter outperformed PSO when estimating turbine-governor model parameters in a Greek power plant model using real measurements. 
This thesis also analyses different model structures (the Nonlinear AutoRegressive eXogenous (NARX) model, the Hammerstein-Wiener model, and a high order transfer function) selected to reproduce the nonlinear dynamics of a Static VAR Compensator (SVC) under the incomplete information available to the National Grid system operator. The study has shown that the standard SVC model poorly reproduces the measured dynamics of the real system. Therefore, a black-box mathematical modeling and identification approach has been proposed to solve the problem, and the introduced combination of first-principles and black-box approaches has shown the best output fit. The methodology following the identification cycle, together with model order selection and model validation issues, is presented in detail. Finally, one of the major contributions is a new method for formulating the uncertainty of the estimated parameters in the form of a multimodal Gaussian mixture distribution, estimated from the Particle Filter output by applying statistical methods to select the standard deviations. The proposed methodology gives additional insight into power system properties when estimating the parameters of the model, allowing power system analysts to decide on the design of validation tests for the chosen model.
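A generic bootstrap particle filter on a linear-Gaussian toy model conveys the flavour of the filtering machinery used here; this sketch filters states only (not turbine-governor parameters), and the model is an illustrative assumption:

```python
import numpy as np

def bootstrap_pf(y, a, q, r, n_particles=500, seed=0):
    """Bootstrap particle filter for the linear-Gaussian state-space model
        x_t = a * x_{t-1} + w_t,  w_t ~ N(0, q)
        y_t = x_t + v_t,          v_t ~ N(0, r)
    Returns the filtered state means."""
    rng = np.random.default_rng(seed)
    parts = rng.normal(0.0, 1.0, n_particles)
    means = []
    for obs in y:
        parts = a * parts + rng.normal(0.0, np.sqrt(q), n_particles)  # propagate
        logw = -0.5 * (obs - parts) ** 2 / r                          # Gaussian likelihood
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(np.dot(w, parts))                                # posterior mean
        idx = rng.choice(n_particles, n_particles, p=w)               # multinomial resampling
        parts = parts[idx]
    return np.array(means)

# Simulate data and filter it
rng = np.random.default_rng(1)
T, a, q, r = 200, 0.9, 0.1, 0.5
x = np.zeros(T)
for t in range(1, T):
    x[t] = a * x[t - 1] + rng.normal(0.0, np.sqrt(q))
y = x + rng.normal(0.0, np.sqrt(r), T)
x_hat = bootstrap_pf(y, a, q, r)
```

Parameter estimation, as in the thesis, typically augments the state with the unknown parameters so the same weighting-and-resampling machinery concentrates particles around plausible parameter values.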

QC 20171121


EU FP7 iTesla project
APA, Harvard, Vancouver, ISO, and other styles
37

Muhammad, Ruqiah. "A new dynamic model for non-viral multi-treatment gene delivery systems for bone regeneration: parameter extraction, estimation, and sensitivity." Diss., University of Iowa, 2019. https://ir.uiowa.edu/etd/6996.

Full text
Abstract:
In this thesis we develop new mathematical models, using dynamical systems, to represent localized gene delivery of bone morphogenetic protein 2 into bone marrow-derived mesenchymal stem cells and rat calvarial defects. We examine two treatment approaches, using pDNA or cmRNA, towards the production of calcium deposition and bone regeneration in in vitro and in vivo experiments. We first review the relevant scientific literature and survey existing mathematical representations of similar treatment approaches. We then motivate and develop our new models and determine model parameters from the literature, heuristic approaches, and estimation using sparse data. We next conduct a qualitative analysis using dynamical systems theory. Due to the nature of the parameter estimation, it was important that we obtain local and global sensitivity analyses of model outputs with respect to changes in model inputs. Finally, we compare results from different treatment protocols. Our model suggests that cmRNA treatments may perform better than pDNA treatments towards bone fracture healing. This work is intended to be a foundation for predictive models of non-viral local gene delivery systems.
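A toy version of such a dynamical-system model: a hypothetical two-compartment cascade in which delivered mRNA decays while being translated into BMP-2 protein, with a finite-difference local sensitivity of the protein peak to the translation rate. All rate constants are invented for illustration and are not the thesis's parameters:

```python
import numpy as np

def bmp2_course(k_p=2.0, d_m=0.8, d_p=0.3, m0=1.0, dt=0.01, t_end=20.0):
    """Forward-Euler integration of a toy transfection model: mRNA m decays
    at rate d_m while being translated into BMP-2 protein p at rate k_p;
    the protein degrades at rate d_p. All rates are hypothetical."""
    steps = int(t_end / dt)
    m, p = m0, 0.0
    traj = np.empty(steps)
    for i in range(steps):
        m += dt * (-d_m * m)
        p += dt * (k_p * m - d_p * p)
        traj[i] = p
    return traj

p_base = bmp2_course()
# Local (finite-difference) sensitivity of peak protein to the translation rate k_p
eps = 1e-4
sens = (bmp2_course(k_p=2.0 + eps).max() - p_base.max()) / eps
```

Because this toy model is linear in k_p, the local sensitivity of the peak equals the peak divided by k_p; in the nonlinear models of the thesis, such derivatives must be computed numerically, which is what motivates the local and global sensitivity analyses.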
APA, Harvard, Vancouver, ISO, and other styles
38

Son, Kyongho. "Improving model structure and reducing parameter uncertainty in conceptual water balance models with the use of auxiliary data." University of Western Australia. School of Environmental Systems Engineering, 2006. http://theses.library.uwa.edu.au/adt-WU2006.0094.

Full text
Abstract:
[Truncated abstract] The use of uncertainty analysis is gaining considerable attention in catchment hydrological modeling. In particular, the choice of an appropriate model structure, the identifiability of parameter values, and the reduction of model predictive uncertainty are deemed essential elements of hydrological modelling. The chosen model structure must be parsimonious, and the parameters used must either be derivable from field-measured data or inferred unambiguously from analysis of catchment response data. In this thesis, a long-term water balance model for the Susannah Brook catchment in Western Australia has been pursued using the 'downward approach', a systematic approach to determining the model with the minimum level of complexity, with parameter values that in theory are derivable from existing physiographic data relating to the catchment. Through the analysis of the rainfall-runoff response at different timescales, and the exploration of the climate, soil and vegetation controls on the water balance response, an initial model structure was formulated and a priori model parameter values were estimated. Further investigation with the use of auxiliary data, such as deuterium concentration in the stream and groundwater level data, exposed inadequacies in the model structure. Two more model structures were then proposed and investigated by formulating alternative hypotheses regarding the underlying causes of observed variability, including those associated with the absence of a contribution of deep groundwater flow to the streamflow, which was indicated by the deuterium concentration, and internal dynamics characterized by the observed groundwater levels. ... These differences are due to differences in the time delay between rainfall and recharge between upland and riparian regions. The ages of water recharging the groundwater and discharging from the catchment were estimated by assuming a piston flow mechanism. 
In the deeper, upland soils, the age of recharging water was considerably larger than the unsaturated zone delay would suggest; a recharge response 16 days after an infiltration event may involve water as much as 160 days old. On the other hand, the delay and the age of recharging water were much lower in the shallow riparian zone. Where the upland zone contributes significantly to discharge, the predicted difference between the rainfall-discharge response time and the average age of discharging water can be significant.
APA, Harvard, Vancouver, ISO, and other styles
39

Juutilainen, I. (Ilmari). "Modelling of conditional variance and uncertainty using industrial process data." Doctoral thesis, University of Oulu, 2006. http://urn.fi/urn:isbn:9514282620.

Full text
Abstract:
This thesis presents methods for modelling the conditional variance and the uncertainty of prediction at a query point on the basis of industrial process data. The introductory part of the thesis provides an extensive background to the examined methods and a summary of the results. The results are presented in detail in the original papers. The application presented in the thesis is modelling of the mean and variance of the mechanical properties of steel plates. Both the mean and variance of the mechanical properties depend on many process variables. A method for predicting the probability of rejection in a qualification test is presented and implemented in a tool developed for the planning of strength margins. The developed tool has been successfully utilised in the planning of mechanical properties in a steel plate mill. The methods for modelling the dependence of conditional variance on input variables are reviewed and their suitability for large industrial data sets is examined. In a comparative study, neural network modelling of the mean and dispersion narrowly performed the best. A method is presented for evaluating the uncertainty of regression-type prediction at a query point on the basis of the predicted conditional variance, the model variance and the effect of uncertainty about explanatory variables at early process stages. A method for measuring the uncertainty of prediction on the basis of the density of the data around the query point is proposed. The proposed distance measure is utilised in comparing the generalisation ability of models. The generalisation properties of the most important regression learning methods are studied, and the results indicate that local methods and quadratic regression have poor interpolation capability compared with multi-layer perceptron and Gaussian kernel support vector regression. The possibility of adaptively modelling a time-varying conditional variance function is demonstrated. 
Two methods for adaptive modelling of the variance function are proposed. The background of the developed adaptive variance modelling methods is presented.
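A generic two-stage sketch of mean and dispersion modelling on synthetic data: ordinary least squares for the conditional mean, then a regression of log squared residuals to model how the variance depends on the input. This illustrates the general idea only, not the thesis's neural network approach, and the data-generating parameters are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000
x = rng.normal(0.0, 1.0, n)
# Heteroscedastic data: conditional mean 2*x, log-variance -1 + 0.8*x
y = 2.0 * x + rng.normal(0.0, 1.0, n) * np.exp((-1.0 + 0.8 * x) / 2.0)

# Stage 1: ordinary least squares for the conditional mean
A = np.column_stack([np.ones(n), x])
beta_mean, *_ = np.linalg.lstsq(A, y, rcond=None)
resid = y - A @ beta_mean

# Stage 2: regress log squared residuals to model the conditional variance;
# the slope recovers the log-variance dependence on x (the intercept is
# shifted by E[log chi-squared_1], so only the slope is read off here)
beta_var, *_ = np.linalg.lstsq(A, np.log(resid ** 2 + 1e-12), rcond=None)
```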
APA, Harvard, Vancouver, ISO, and other styles
40

Pinkwart, Nicolas [Verfasser], and Jürgen [Akademischer Betreuer] Kähler. "Parameter Uncertainty and the Interest Rate Smoothing Behavior of the European Central Bank / Nicolas Pinkwart. Betreuer: Jürgen Kähler." Erlangen : Universitätsbibliothek der Universität Erlangen-Nürnberg, 2012. http://d-nb.info/102031382X/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Haller, Vanessa. "Ecological models for threat management: Considering the unknowns using numerical analysis and machine learning." Thesis, Queensland University of Technology, 2021. https://eprints.qut.edu.au/210467/1/Vanessa_Haller_Thesis.pdf.

Full text
Abstract:
The thesis tackles the problem of uncertainty in the modelling of ecosystems of any complexity. This uncertainty can originate from many sources, including missing knowledge of interactions or low data availability. The major contribution of the thesis is a new workflow that extends the traditional scientific method to allow for the thorough investigation of uncertainty in especially data-poor systems, utilising new techniques such as machine learning.
APA, Harvard, Vancouver, ISO, and other styles
42

Claußen, Arndt [Verfasser]. "Essays on risk management of financial institutions : systematic risk, cross-sectional pricing of risk factors, parameter errors affecting risk measures, and credit decisions under parameter uncertainty / Arndt Claußen." Hannover : Technische Informationsbibliothek und Universitätsbibliothek Hannover (TIB), 2015. http://d-nb.info/1078747318/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Sorgatz, Julia [Verfasser], Holger [Akademischer Betreuer] Schüttrumpf, and Kerstin [Akademischer Betreuer] Lesny. "Towards reliability-based bank revetment design : investigation of limit states and parameter uncertainty / Julia Sorgatz ; Holger Schüttrumpf, Kerstin Lesny." Aachen : Universitätsbibliothek der RWTH Aachen, 2020. http://d-nb.info/122799298X/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Kouki, Slim. "An experiment on the parameter uncertainty of hydrological models with different levels of complexity in a climate change context." Doctoral thesis, Université Laval, 2016. http://hdl.handle.net/20.500.11794/26979.

Full text
Abstract:
La possibilité d’estimer l’impact du changement climatique en cours sur le comportement hydrologique des hydro-systèmes est une nécessité pour anticiper les adaptations inévitables et nécessaires que doivent envisager nos sociétés. Dans ce contexte, ce projet doctoral présente une étude sur l'évaluation de la sensibilité des projections hydrologiques futures à : (i) La non-robustesse de l’identification des paramètres des modèles hydrologiques, (ii) l’utilisation de plusieurs jeux de paramètres équifinaux et (iii) l’utilisation de différentes structures de modèles hydrologiques. Pour quantifier l’impact de la première source d’incertitude sur les sorties des modèles, quatre sous-périodes climatiquement contrastées sont tout d’abord identifiées au sein des chroniques observées. Les modèles sont calés sur chacune de ces quatre périodes et les sorties engendrées sont analysées en calage et en validation en suivant les quatre configurations du Different Split-sample Test (Klemeš, 1986; Wilby, 2005; Seiller et al., 2012; Refsgaard et al., 2014). Afin d’étudier la seconde source d’incertitude liée à la structure du modèle, l’équifinalité des jeux de paramètres est ensuite prise en compte en considérant pour chaque type de calage les sorties associées à des jeux de paramètres équifinaux. Enfin, pour évaluer la troisième source d'incertitude, cinq modèles hydrologiques de différents niveaux de complexité sont appliqués (GR4J, MORDOR, HSAMI, SWAT et HYDROTEL) sur le bassin versant québécois de la rivière Au Saumon. Les trois sources d'incertitude sont évaluées à la fois dans les conditions climatiques observées passées et dans les conditions climatiques futures. Les résultats montrent que, en tenant compte de la méthode d'évaluation suivie dans ce doctorat, l'utilisation de différents niveaux de complexité des modèles hydrologiques est la principale source de variabilité dans les projections de débits dans des conditions climatiques futures. 
Ceci est suivi par le manque de robustesse de l'identification des paramètres. Les projections hydrologiques générées par un ensemble de jeux de paramètres équifinaux sont proches de celles associées au jeu de paramètres optimal. Par conséquent, plus d'efforts devraient être investis dans l'amélioration de la robustesse des modèles pour les études d'impact sur le changement climatique, notamment en développant les structures des modèles plus appropriés et en proposant des procédures de calage qui augmentent leur robustesse. Ces travaux permettent d’apporter une réponse détaillée sur notre capacité à réaliser un diagnostic des impacts des changements climatiques sur les ressources hydriques du bassin Au Saumon et de proposer une démarche méthodologique originale d’analyse pouvant être directement appliquée ou adaptée à d’autres contextes hydro-climatiques.
The possibility to estimate the impact of climate change on the hydrological behavior of hydrosystems, the hydrological risks, and the associated resources is a necessity in order to anticipate the inevitable and necessary adaptations that our societies must consider. In this context, the doctoral project presents a study on the evaluation of the uncertainty of hydrological projections for the future climate when considering: (i) the non-robustness of hydrological model parameter identification, (ii) the use of several ensembles of equifinal parameter sets over a given calibration period and (iii) the use of different model structures for the hydrological model. To quantify the impact of the first source of uncertainty on the model outputs, four climatically contrasted sub-periods are first identified within the observed time series. The models are calibrated on each of these four periods, then the generated outputs are analyzed on calibration and validation data. The calibration and validation tests were performed according to the configurations of four Different Split-sample Tests (Klemeš, 1986; Wilby, 2005; Seiller et al., 2012; Refsgaard et al., 2014). In order to study the second source of uncertainty related to the model structure, the equifinality of the parameter sets is taken into account by considering an ensemble of equifinal parameter sets for each sub-period calibration. Finally, to assess the third source of uncertainty, five hydrological models of different levels of complexity are applied (GR4J, MORDOR, HSAMI, SWAT, and HYDROTEL) on the watershed of the Au Saumon River (Québec, Canada). The three sources of uncertainty are assessed in the past observed period and in future climate conditions. Results show that, given the evaluation approach followed in this Ph.D. research, the use of different levels of complexity of hydrological models is the major source of variability in streamflow projections in future climate conditions for the five models tested. 
This is followed by the lack of robustness of parameter identification. The hydrological projections generated by an ensemble of equifinal parameter sets are close to those associated with the optimal set. Therefore, it seems that greater effort should be invested in improving the robustness of models for climate change impact studies, especially by developing more suitable model structures and proposing calibration procedures that increase their robustness. This work serves to provide a detailed response on our ability to make a diagnosis of the impacts of climate change on water resources of the Au Saumon watershed and proposes a novel methodological approach that can be directly applied or adapted to other hydro-climatic contexts.
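The split-sample logic described in this abstract (calibrate under one climate, validate under a contrasted one) can be illustrated with a deliberately toy one-parameter runoff model. Everything below is synthetic and hypothetical; it is not GR4J, MORDOR, HSAMI, SWAT, or HYDROTEL, and the dry/wet split is a crude stand-in for the thesis's climatically contrasted sub-periods.

```python
import numpy as np

def calibrate(model, params_grid, precip, flow_obs):
    """Pick the parameter value minimising RMSE over a calibration period."""
    errors = [np.sqrt(np.mean((model(precip, p) - flow_obs) ** 2))
              for p in params_grid]
    return params_grid[int(np.argmin(errors))]

# Toy one-parameter runoff model: flow = c * precipitation.
model = lambda precip, c: c * precip

rng = np.random.default_rng(0)
precip = rng.gamma(2.0, 5.0, size=200)                 # synthetic forcing
flow = 0.6 * precip + rng.normal(0.0, 0.5, size=200)   # synthetic "observations"

# Differential split-sample test: calibrate on a "dry" sub-period,
# validate on a contrasted "wet" sub-period.
dry = precip < np.median(precip)
wet = ~dry
grid = np.linspace(0.1, 1.0, 91)
c_dry = calibrate(model, grid, precip[dry], flow[dry])
rmse_wet = np.sqrt(np.mean((model(precip[wet], c_dry) - flow[wet]) ** 2))
```

A parameter set that transfers poorly to the contrasted period (large `rmse_wet`) signals the non-robustness of identification that the thesis quantifies.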
APA, Harvard, Vancouver, ISO, and other styles
45

Sorgatz, Julia [Verfasser], Holger [Akademischer Betreuer] Schüttrumpf, and Kerstin [Akademischer Betreuer] Lesny. "Towards reliability-based bank revetment design : investigation of limit states and parameter uncertainty / Julia Sorgatz ; Holger Schüttrumpf, Kerstin Lesny." Aachen : Universitätsbibliothek der RWTH Aachen, 2020. http://d-nb.info/122799298X/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Rajaraman, Srinivasan. "Robust model-based fault diagnosis for chemical process systems." Texas A&M University, 2003. http://hdl.handle.net/1969.1/3956.

Full text
Abstract:
Fault detection and diagnosis have gained central importance in the chemical process industries over the past decade. This is due to several reasons, one of them being that a copious amount of data is available from a large number of sensors in process plants. Moreover, since industrial processes operate in closed loop with appropriate output feedback to attain certain performance objectives, instrument faults have a direct effect on the overall performance of the automation system. Extracting essential information about the state of the system and processing the measurements for detecting, discriminating, and identifying abnormal readings are important tasks of a fault diagnosis system. The goal of this dissertation is to develop such fault diagnosis systems, which use limited information about the process model to robustly detect, discriminate, and reconstruct instrumentation faults. Broadly, the proposed method consists of a novel nonlinear state and parameter estimator coupled with a fault detection, discrimination, and reconstruction system. The first part of this dissertation focuses on designing fault diagnosis systems that not only perform fault detection and isolation but also estimate the shape and size of the unknown instrument faults. This notion is extended to nonlinear processes whose structure is known but whose parameters are a priori uncertain and bounded. Since the uncertainty in the process model and instrument fault detection interact with each other, a novel two-time-scale procedure is adopted to render overall fault diagnosis. Further, some techniques to enhance the convergence properties of the proposed state and parameter estimator are presented. The remaining part of the dissertation extends the proposed model-based fault diagnosis methodology to processes for which first-principles modeling is either expensive or infeasible. 
This is achieved by using an empirical model identification technique called subspace identification for state-space characterization of the process. Finally the proposed methodology for fault diagnosis has been applied in numerical simulations to a non-isothermal CSTR (continuous stirred tank reactor), an industrial melter process, and a debutanizer plant.
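The core idea of model-based instrument fault detection can be sketched generically: compare each sensor reading against a model prediction and flag readings whose residual exceeds a noise-based threshold. This is only an illustrative sketch with synthetic data, not the dissertation's nonlinear estimator or its two-time-scale procedure; here the noise-free signal stands in for a model prediction.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(100)
x_true = 50.0 + 2.0 * np.sin(0.1 * t)            # synthetic process variable
y = x_true + rng.normal(0.0, 0.2, t.size)        # sensor reading with noise
y[60:] += 5.0                                    # additive sensor bias fault from t = 60

# Residual between measurement and (hypothetical) model prediction.
resid = y - x_true

# 5-sigma threshold on the assumed-known sensor noise level.
threshold = 5 * 0.2
fault_flags = np.abs(resid) > threshold
first_fault = int(np.argmax(fault_flags))        # first flagged sample
```

Estimating the residual's post-fault offset (here about 5.0) corresponds to reconstructing the fault's shape and size, as the dissertation does with a far more general estimator.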
APA, Harvard, Vancouver, ISO, and other styles
47

Debchoudhury, Shantanab. "Parameter Estimation from Retarding Potential Analyzers in the Presence of Realistic Noise." Diss., Virginia Tech, 2019. http://hdl.handle.net/10919/88466.

Full text
Abstract:
Retarding Potential Analyzers (RPA) have a rich flight heritage. These instruments are largely popular since a single current-voltage (I-V) profile can provide in-situ measurements of ion temperature, velocity and composition. The estimation of parameters from an RPA I-V curve is affected by grid geometries and non-ideal biasing, which have been studied in the past. In this dissertation, we explore the uncertainties associated with estimated ion parameters from an RPA in the presence of instrument noise. Simulated noisy I-V curves representative of those expected from a mid-inclination low Earth orbit are fitted with standard curve-fitting techniques to reveal the degree of uncertainty and inter-dependence between expected errors, with varying levels of additive noise. The main motive is to provide experimenters working with RPA data with a measure of error scalable for different geometries. In subsequent work, we develop a statistics-based bootstrap technique designed to mitigate the large inter-dependency between spacecraft potential and ion velocity errors, which were seen to be highly correlated when estimated using a standard algorithm. The new algorithm - BATFORD, an acronym for "Bootstrap-based Algorithm with Two-stage Fit for Orbital RPA Data analysis" - was applied to a simulated dataset treated with a realistic noise model derived from laboratory calibration, and also tested on real in-flight data from the C/NOFS mission. BATFORD outperforms a traditional algorithm in simulation and also provides realistic in-situ estimates from a section of a C/NOFS orbit when the satellite passed through a plasma bubble. The low signal-to-noise ratios (SNR) of measured I-Vs in these bubbles make autonomous parameter estimation notoriously difficult. We thus propose a method for robust autonomous analysis of RPA data that is reliable in low-SNR environments and is applicable for all RPA designs.
Doctor of Philosophy
The plasma environment in Earth’s upper atmosphere is dynamic and diverse. Of particular interest is the ionosphere - a region of dense ionized gases that directly affects the variability of space weather and the propagation of radio wave signals across Earth. Retarding potential analyzers (RPA) are instruments that can directly measure the characteristics of this environment in flight. With the growing popularity of small satellites, these probes need to be studied in greater detail to exploit their ability to reveal how ions (the positively charged particles) behave in this region. In this dissertation, we aim to understand how the RPA measurements, obtained as current-voltage relationships, are affected by electronic noise. We propose a methodology to understand the associated uncertainties in the estimated parameters through a simulation study. The results show that a statistics-based algorithm can help to interpret RPA data in the presence of noise, and can make autonomous, robust and more accurate measurements compared to a traditional non-linear curve-fitting routine. The dissertation presents the challenges in analyzing RPA data that is affected by noise and proposes a new method to better interpret measurements in the ionosphere that can enable further scientific progress in the space physics community.
APA, Harvard, Vancouver, ISO, and other styles
48

Abu, Rumman Malek. "Conjunctive Management of Surface Water and Groundwater Resources." Diss., Georgia Institute of Technology, 2005. http://hdl.handle.net/1853/6917.

Full text
Abstract:
Surface water and groundwater systems consist of interconnected reservoirs, rivers, and confined and unconfined aquifers. The integrated management of such resources faces several challenges: High dimensionality refers to the large number of variables that must be considered in the description of surface water and groundwater systems. As the number of these variables increases, the computational requirements quickly saturate the capabilities of the existing management methods. Uncertainty relates to the imprecise nature of many system inputs and parameters, including reservoir and tributary inflows, precipitation, evaporation, aquifer parameters (e.g., hydraulic conductivity and storage coefficient), and various boundary and initial conditions. Uncertainty complicates very significantly the development and application of efficient management models. Nonlinearity is intrinsic to some physical processes and also enters through various facility and operational constraints on reservoir storages, releases, and aquifer drawdown and pumping. Nonlinearities compound the previous difficulties. Multiple objectives pertain to the process of optimizing the use of the integrated surface and groundwater resources to meet various water demands, generate sufficient energy, maintain adequate instream flows, and protect the environment and the ecosystems. Multi-objective decision models and processes continue to challenge professional practice. This research draws on several disciplines including groundwater flow modeling, hydrology and water resources systems, uncertainty analysis, estimation theory, stochastic optimization of dynamical systems, and policy assessment. A summary of the research contributions made in this work follows:
1. High dimensionality issues related to groundwater aquifer systems have been mitigated by the use of transfer functions and their representation by state-space approximations.
2. Aquifer response under uncertainty of inputs and aquifer parameters is addressed by a new statistical procedure that is applicable to regions with relatively few measurements and incorporates management reliability considerations.
3. The conjunctive management problem is formulated in a generally applicable way, taking into consideration all relevant uncertainties and system objectives. This problem is solved via an efficient stochastic optimization method that overcomes dimensionality limitations.
4. The methods developed in this Thesis are applied to the Jordanian water resources system, demonstrating their value for operational planning and management.
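The dimensionality-reduction idea of replacing a transfer-function (convolution over the full pumping history) with a low-order state-space recursion can be sketched minimally. The single-state linear model and coefficients below are hypothetical illustrations, not the thesis's formulation.

```python
import numpy as np

# One drawdown state s updated by a linear recursion instead of storing
# and convolving the entire pumping history q[0..k]:
#   s[k+1] = a * s[k] + b * q[k]
a, b = 0.9, 0.05          # hypothetical recession and gain coefficients
q = np.ones(50)           # constant unit pumping rate
s = np.zeros(51)          # drawdown state over time
for k in range(50):
    s[k + 1] = a * s[k] + b * q[k]

# Analytic steady-state limit of the recursion under constant pumping.
steady_state = b / (1 - a)
```

Under constant pumping the recursion converges geometrically to `b / (1 - a)`, so a handful of such states can approximate a long unit-response function, which is what makes stochastic optimization over the coupled system tractable.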
APA, Harvard, Vancouver, ISO, and other styles
49

Bushnell, Tanner Hans. "Parameter Importance of an Analytical Model for Transport in the Vadose Zone." Diss., Brigham Young University, 2007. http://contentdm.lib.byu.edu/ETD/image/etd1728.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Sharp, Jesse A. "Numerical methods for optimal control and parameter estimation in the life sciences." Thesis, Queensland University of Technology, 2022. https://eprints.qut.edu.au/230762/1/Jesse_Sharp_Thesis.pdf.

Full text
Abstract:
This thesis concerns numerical methods in mathematical optimisation and inference; with a focus on techniques for optimal control, and for parameter estimation and uncertainty quantification. Novel methodological and computational developments are presented, with a view to improving the efficiency, effectiveness and accessibility of these techniques for practitioners. The numerical methods considered in this work are widely applied throughout the life sciences; in areas including ecology, epidemiology and oncology, and beyond the life sciences; in engineering, economics, aeronautics and other disciplines.
APA, Harvard, Vancouver, ISO, and other styles