
Dissertations / Theses on the topic 'Empirical calibration'



Consult the top 15 dissertations / theses for your research on the topic 'Empirical calibration.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Press it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses from a wide variety of disciplines and organise your bibliography correctly.

1

Priest, Angela Lynn. "Calibration of fatigue transfer functions for mechanistic-empirical flexible pavement design." Auburn, Ala., 2005. http://repo.lib.auburn.edu/2005%20Fall/Thesis/PRIEST_ANGELA_32.pdf.

2

Sufian, Abu Ahmed. "Local calibration of the mechanistic empirical pavement design guide for Kansas." Thesis, Kansas State University, 2016. http://hdl.handle.net/2097/34533.

Abstract:
Master of Science, Department of Civil Engineering, Mustaque Hossain.
The Kansas Department of Transportation is transitioning from adherence to the 1993 American Association of State Highway and Transportation Officials (AASHTO) Pavement Design Guide to implementation of the new AASHTO Mechanistic-Empirical Pavement Design Guide (MEPDG) for flexible and rigid pavement design. This study was initiated to calibrate MEPDG distress models for Kansas. Twenty-seven newly constructed projects were selected for flexible pavement distress model calibration, 21 of which were used for calibration and six for validation. In addition, 22 newly constructed jointed plain concrete pavements (JPCPs) were selected to calibrate rigid models; 17 of those projects were used for calibration and five for validation. AASHTOWare Pavement ME Design (ver. 2.2) software was used for design analysis, and the traditional split sampling method was followed in calibration. MEPDG-predicted distresses of Kansas road segments were compared with those from Pavement Management Information System data. Statistical analysis was performed using the Microsoft Excel statistical toolbox. The rutting and roughness models for flexible pavement were successfully calibrated, with reduced bias and acceptance of the null hypothesis. Calibration of the top-down fatigue cracking model was not satisfactory due to variability in measured data, and the bottom-up fatigue cracking model was not calibrated because measured data were unavailable. The AASHTOWare software did not predict transverse cracking for any project with the global coefficient values; thus, the thermal cracking model was not calibrated. The JPCP transverse joint faulting model was calibrated using sensitivity analysis and iterative runs of AASHTOWare to determine optimal coefficients that minimize bias. The IRI model was calibrated using the generalized reduced gradient nonlinear optimization technique in Microsoft Excel Solver. The transverse slab cracking model could not be calibrated due to lack of measured cracking data.
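For readers unfamiliar with this style of local calibration, the sketch below illustrates the general idea of fitting distress-model coefficients by minimizing the squared error between measured and predicted distress, similar in spirit to the Solver-based optimization mentioned in the abstract. The power-law "transfer function", the coefficients, and the data are all invented placeholders, not the MEPDG models or the Kansas data.

```python
# Minimal sketch of local calibration by nonlinear optimization. The power-law
# "transfer function" below is a placeholder, NOT an actual MEPDG distress model;
# coefficients and data are invented.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
traffic = rng.uniform(1e5, 5e6, size=21)                        # hypothetical cumulative traffic per section
measured = 0.004 * traffic ** 0.45 + rng.normal(0, 0.3, 21)     # hypothetical measured rutting

def predicted(coeffs, x):
    """Placeholder distress model: beta1 * x ** beta2."""
    b1, b2 = coeffs
    return b1 * x ** b2

def objective(coeffs):
    resid = measured - predicted(coeffs, traffic)
    return np.sum(resid ** 2)                                   # minimize SSE, as in split-sample calibration

result = minimize(objective, x0=[0.001, 0.5], method="Nelder-Mead")
b1, b2 = result.x
bias = np.mean(measured - predicted(result.x, traffic))
print(f"calibrated coefficients: {b1:.5f}, {b2:.3f}; residual bias: {bias:.4f}")
```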
APA, Harvard, Vancouver, ISO, and other styles
3

Cimino, Joseph A. "Empirical mass balance calibration of analytical hydrograph separation techniques using electrical conductivity." [Tampa, Fla.] : University of South Florida, 2003. http://purl.fcla.edu/fcla/etd/SFE0000213.

4

Nguyen, Minh Khoa. "Estimation and calibration of agent-based models of financial markets using empirical likelihood." Thesis, University of Essex, 2014. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.617040.

Abstract:
This thesis introduces the eBAEL (extended Empirical Balanced Augmented Likelihood) method for general analytical moment conditions. It proves that eBAEL has a χ² limit distribution and provides a consistent estimator. Numerical results demonstrate that the eBAEL ratio statistic exhibits less bias than the AEL ratio statistic and has smaller type I errors. This thesis also presents a general framework for calibrating financial agent-based models using eBAEL, where the aim is to find the model parameter for which the true model moments match the given empirical values. It is demonstrated that our proposed approach is able to retrieve that parameter with probability approaching one as the number of simulations increases. Furthermore, this thesis demonstrates that the EL approach may also be used for estimating financial agent-based models. In contrast to calibration, estimation via moment matching emphasizes that empirical moments are estimates themselves, and the aim is to find a parameter configuration for which the true model moments and the true empirical moments coincide. As a numerical benchmark case, the parameters of a Geometric Brownian Motion are calibrated and estimated from its simulated sample paths in comparison to the SMM. In this case the EL approach provides the best mean squared errors for both calibration and estimation and in particular is the most robust calibration method. In terms of calibration efficiency this robustness carries over, as the SMM is only more efficient in cases where it provides worse mean squared errors. Additionally, this thesis estimates an actual agent-based model of a financial market against empirical moments generated at a known model parameter setting. Here too, the resulting EL mean squared errors are mostly better than those of the SMM.
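As context for the benchmark mentioned above, here is a minimal moment-matching sketch for a Geometric Brownian Motion in the style of the simulated method of moments, not the eBAEL procedure itself; the parameter values, sample sizes, and moment choices are assumptions for illustration only.

```python
# Minimal moment-matching (SMM-style) sketch for a Geometric Brownian Motion.
# This is NOT the eBAEL procedure; all parameters and sample sizes are invented.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
dt, n_obs = 1.0 / 252, 1000
true_mu, true_sigma = 0.08, 0.2
log_ret = rng.normal((true_mu - 0.5 * true_sigma**2) * dt, true_sigma * np.sqrt(dt), n_obs)
emp_moments = np.array([252 * log_ret.mean(), 252 * log_ret.var()])   # annualized drift and variance

def sim_moments(theta, n_sim=20000, seed=2):
    mu, sigma = theta
    sigma = abs(sigma)                                  # keep volatility positive during the search
    r = np.random.default_rng(seed).normal((mu - 0.5 * sigma**2) * dt, sigma * np.sqrt(dt), n_sim)
    return np.array([252 * r.mean(), 252 * r.var()])

def distance(theta):
    g = sim_moments(theta) - emp_moments
    return g @ g                                        # identity weighting for simplicity

fit = minimize(distance, x0=[0.0, 0.1], method="Nelder-Mead")
print("calibrated (mu, sigma):", fit.x)
```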
5

Lu, Min. "A Study of the Calibration Regression Model with Censored Lifetime Medical Cost." Digital Archive @ GSU, 2006. http://digitalarchive.gsu.edu/math_theses/14.

Abstract:
Medical costs have received increasing interest recently in biostatistics and public health. Statistical analysis and inference for lifetime medical cost are challenging because survival times are censored for some study subjects, so their subsequent costs are unknown. Huang (2002) proposed the calibration regression model, a semiparametric regression tool for studying the medical cost associated with covariates. In this thesis, an inference procedure is investigated using the empirical likelihood ratio method. Unadjusted and adjusted empirical likelihood confidence regions are constructed for the regression parameters. We compare the proposed empirical likelihood methods with the normal approximation based method. Simulation results show that the proposed empirical likelihood ratio method outperforms the normal approximation based method in terms of coverage probability. In particular, the adjusted empirical likelihood performs best, overcoming the undercoverage problem.
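To illustrate the empirical likelihood machinery in its simplest form, the sketch below computes an EL confidence interval for a mean with the usual chi-square calibration; it is not the censored-cost calibration regression procedure of the thesis, and the data are invented.

```python
# Minimal sketch of an empirical likelihood ratio confidence interval for a mean,
# using the chi-square(1) calibration. Illustrative only; data are invented.
import numpy as np
from scipy.optimize import brentq
from scipy.stats import chi2

rng = np.random.default_rng(3)
x = rng.exponential(scale=2.0, size=60)          # hypothetical cost-like data

def neg2_log_el(mu):
    """-2 log empirical likelihood ratio for the mean mu."""
    d = x - mu
    if d.min() >= 0 or d.max() <= 0:             # mu outside the convex hull of the data
        return np.inf
    lo, hi = -1.0 / d.max() + 1e-10, -1.0 / d.min() - 1e-10
    lam = brentq(lambda l: np.sum(d / (1.0 + l * d)), lo, hi)   # Lagrange multiplier
    return 2.0 * np.sum(np.log1p(lam * d))

cutoff = chi2.ppf(0.95, df=1)
grid = np.linspace(x.mean() - 1.5, x.mean() + 1.5, 400)
inside = [mu for mu in grid if neg2_log_el(mu) <= cutoff]
print(f"95% EL confidence interval for the mean: [{min(inside):.3f}, {max(inside):.3f}]")
```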
6

Gairing, Jan, Michael Högele, Tetiana Kosenkova, and Alexei Kulik. "On the calibration of Lévy driven time series with coupling distances: an application in paleoclimate." Universität Potsdam, 2014. http://opus.kobv.de/ubp/volltexte/2014/6978/.

Abstract:
This article aims at the statistical assessment of time series with large fluctuations over short times, which are assumed to stem from a continuous process perturbed by a Lévy process exhibiting heavy-tail behavior. We propose an easily implementable procedure to efficiently estimate the statistical difference between the noisy behavior of the data and a given reference jump measure in terms of so-called coupling distances. After a short introduction to Lévy processes and coupling distances, we recall basic statistical approximation results and derive rates of convergence. In the sequel, the procedure is elaborated in detail in an abstract setting and eventually applied in a case study to simulated and paleoclimate data. It indicates the dominant presence of a non-stable heavy-tailed jump Lévy component for some tail index greater than 2.
7

Cimino, Joseph A. "Empirical mass balance calibration of analytical hydrograph separation techniques using electrical conductivity." University of South Florida, 2003. http://purl.fcla.edu/fcla/etd/SFE0000213.

Abstract:
Thesis (M.S.C.E.)--University of South Florida, 2003. 75 pages. Includes bibliographical references.
Analytical baseflow separation techniques such as those used in the automated hydrograph separation program HYSEP rely on a single input parameter that defines the period of time after which surface runoff ceases and all streamflow is considered baseflow. In HYSEP, this input parameter is solely a function of drainage basin contributing area. This method cannot be applied universally, since in most regions the time of surface runoff cessation is a function of a number of different hydrologic and hydrogeologic basin characteristics, not just contributing drainage area. This study demonstrates that streamflow conductivity can be used as a natural tracer that integrates the different hydrologic and hydrogeologic basin characteristics that influence baseflow response. Used as an indicator of baseflow as a component of total flow, streamflow conductivity allows for an empirical approach to hydrograph separation using a simple mass balance algorithm.
Although conductivity values for surface-water runoff and ground-water baseflow must be identified to apply this mass balance algorithm, field studies show that assumptions based on streamflow at low-flow and high-flow conditions are valid for estimating these end-member conductivities. The only data required to apply the mass balance algorithm are streamflow conductivity and discharge measurements. With these minimal data requirements, empirical hydrograph separation techniques can be applied that yield reasonable estimates of baseflow. This procedure was performed on data from 10 USGS gaging stations for which reliable, real-time conductivity data are available. Comparison of empirical hydrograph separations using streamflow conductivity data with analytical hydrograph separations demonstrates that uncalibrated, graphical estimation of baseflow can lead to substantial errors in baseflow estimates.
Results from empirical separations can be used to calibrate the runoff cessation input parameter used in analytical separation for each gaging station. In general, collection of stream conductivity data at gaging stations is relatively recent, while discharge measurements may extend many decades into the past. Results demonstrate that conductivity data available for a relatively short period of record can be used to calibrate the runoff cessation input parameter used for analytical separation. The calibrated analytical method can then be applied over a much longer period of record, since discharge data are the only requirement.
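The core of the empirical separation is a two-component mixing mass balance, which can be written in a few lines; the end-member conductivities and the streamflow record below are invented example values, not data from the study.

```python
# Minimal sketch of the two-component conductivity mass balance described above:
# total flow Q is split into baseflow and runoff using end-member conductivities.
import numpy as np

def baseflow_separation(q, c_stream, c_runoff, c_baseflow):
    """Return the baseflow component of total discharge q.

    Mass balance: Q*C_stream = Qb*C_baseflow + (Q - Qb)*C_runoff
    =>            Qb = Q * (C_stream - C_runoff) / (C_baseflow - C_runoff)
    """
    frac = (c_stream - c_runoff) / (c_baseflow - c_runoff)
    return q * np.clip(frac, 0.0, 1.0)            # keep the fraction physically meaningful

# Hypothetical daily record: discharge (cfs) and specific conductance (uS/cm)
q = np.array([120.0, 260.0, 480.0, 350.0, 210.0, 150.0])
c_stream = np.array([410.0, 330.0, 250.0, 290.0, 360.0, 400.0])
c_runoff, c_baseflow = 80.0, 450.0               # assumed end members (high-flow / low-flow conductivity)

qb = baseflow_separation(q, c_stream, c_runoff, c_baseflow)
print("baseflow (cfs):", np.round(qb, 1))
print("baseflow index:", round(qb.sum() / q.sum(), 2))
```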
8

Flores, Ignacio. "On the empirical measurement of inequality." Thesis, Paris 1, 2019. http://www.theses.fr/2019PA01E003/document.

Abstract:
The 1st chapter presents historical series of Chilean top income shares over a period of half a century, mostly using data from tax statistics and national accounts. The study contradicts evidence based on survey data, according to which inequality has fallen constantly over the past 25 years. Rather, it changes direction, increasing from around the year 2000. Chile ranks as one of the most unequal countries among both OECD and Latin American countries over the whole period of study. The 2nd chapter measures the underestimation of factor income in distributive data. I find that households receive only half of national gross capital income, as opposed to corporations. Due to heterogeneous non-response and misreporting, surveys only capture 20% of it, versus 70% of labor income. This understates inequality estimates, which become insensitive to the capital share and its distribution. I formalize this system based on accounting identities, then compute marginal effects and contributions to changes in fractile shares. The 3rd chapter presents a method to adjust surveys, which generally fail to capture the top of the income distribution. It has several advantages over previous ones: it is consistent with standard survey calibration methods; it has explicit probabilistic foundations and preserves the continuity of density functions; it provides an option to overcome the limitations of bounded survey supports; and it preserves the microdata structure of the survey, maintaining the representativeness of sociodemographic variables. The procedure is illustrated with applications in five countries, covering both developed and less developed contexts.
9

Brimley, Bradford Keith. "Calibration of the Highway Safety Manual Safety Performance Function and Development of Jurisdiction-Specific Models for Rural Two-Lane Two-Way Roads in Utah." BYU ScholarsArchive, 2011. https://scholarsarchive.byu.edu/etd/2611.

Abstract:
This thesis documents the results of the calibration of the Highway Safety Manual (HSM) safety performance function (SPF) for rural two-lane two-way roadway segments in Utah and the development of new SPFs using negative binomial and hierarchical Bayesian modeling techniques. SPFs estimate the safety of a roadway entity, such as a segment or intersection, in terms of number of crashes. The new SPFs were developed for comparison to the calibrated HSM SPF. This research was performed for the Utah Department of Transportation (UDOT). The study area was the state of Utah. Crash data from 2005-2007 on 157 selected study segments provided a 3-year observed crash frequency to obtain a calibration factor for the HSM SPF and develop new SPFs. The calibration factor for the HSM SPF for rural two-lane two-way roads in Utah is 1.16. This indicates that the HSM underpredicts the number of crashes on rural two-lane two-way roads in Utah by sixteen percent. The new SPFs were developed from the same data that were collected for the HSM calibration, with the addition of new data variables that were hypothesized to have a significant effect on crash frequencies. Negative binomial regression was used to develop four new SPFs, and one additional SPF was developed using hierarchical (or full) Bayesian techniques. The empirical Bayes (EB) method can be applied with each negative binomial SPF because the models include the overdispersion parameter required by the EB method. The hierarchical Bayesian technique is a newer, more mathematically intensive method that accounts for the high levels of uncertainty often present in crash modeling. Because the hierarchical Bayesian SPF produces a density function of a predicted crash frequency, a comparison of this density function with an observed crash frequency can help identify segments with significant safety concerns. Each SPF has its own strengths and weaknesses, which include its data requirements and predictive capability. This thesis recommends that UDOT use Equation 5-11 (a new negative binomial SPF) for predicting crashes, because it predicts crashes with reasonable accuracy while requiring much less data than other models. The hierarchical Bayesian process should be used for evaluating observed crash frequencies to identify segments that may benefit from roadway safety improvements.
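Two of the quantities discussed above are simple to compute once the data are assembled: an HSM-style calibration factor (total observed crashes divided by total SPF-predicted crashes over the study segments) and a prediction from an SPF of the usual negative-binomial form. The crash counts and coefficients below are invented, not the thesis's fitted values.

```python
# Minimal sketch of (1) an HSM-style calibration factor and (2) a toy SPF prediction
# of the form N = exp(b0) * AADT^b1 * L. All numbers are invented.
import numpy as np

observed = np.array([3, 0, 1, 5, 2, 0, 4, 1])                          # hypothetical 3-year crash counts
predicted_hsm = np.array([1.9, 0.6, 1.2, 3.5, 2.3, 0.4, 2.8, 1.1])     # hypothetical HSM SPF predictions

calibration_factor = observed.sum() / predicted_hsm.sum()
print(f"calibration factor C = {calibration_factor:.2f}")               # C > 1 means the SPF underpredicts

def spf(aadt, length_mi, b0=-8.0, b1=0.85):
    """Toy SPF: predicted crashes/year = exp(b0) * AADT^b1 * segment length."""
    return np.exp(b0) * aadt ** b1 * length_mi

print(f"predicted crashes/year for AADT=4000, 2.5 mi: {spf(4000, 2.5):.2f}")
```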
10

Rebecq, Antoine. "Méthodes de sondage pour les données massives." Thesis, Paris 10, 2019. http://www.theses.fr/2019PA100014/document.

Abstract:
This thesis presents three parts tied to survey sampling theory. In the first part, we present two original results that led to practical applications in surveys conducted at Insee (the French official statistics institute). The first chapter deals with allocations in stratified sampling. We present a theorem that proves the existence of an optimal compromise between the dispersion of the sampling weights and the allocation yielding optimal precision for a specific variable of interest. Survey data are commonly used to compute estimates for variables that were not included in the survey design; expected precision for these variables is poor, but a low dispersion of the weights limits the risk of very high variance for one or several estimates. The second chapter deals with reweighting factors in calibration estimators. We study an algorithm that computes the minimal bounds around 1 such that a solution to the calibration problem exists, and propose an efficient solution method; this limits the risk of influential units appearing, particularly for domain estimation. We also study, through simulations on real data, the statistical properties of the estimators obtained with these minimal bounds. The second part studies the asymptotic properties of estimators computed from survey data, which are hard to establish in general. We present an original method that establishes weak convergence to a Gaussian process for the Horvitz-Thompson empirical process indexed by classes of functions, for many sampling algorithms used in practice. In the third and last part, we focus on sampling methods for data that can be described as networks, which have practical applications when the graphs are so large that storing them and running algorithms on them is costly. We detail sampling algorithms that allow estimation of statistics of interest for the network. Two applications, one to Twitter data and the other to simulated data used to establish guidelines for designing efficient sampling schemes for graphs, conclude this part.
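The Horvitz-Thompson estimator at the heart of the second part weights each sampled unit by the inverse of its inclusion probability; a minimal sketch under an assumed Poisson sampling design (with invented population values) is shown below.

```python
# Minimal sketch of the Horvitz-Thompson estimator: each sampled unit's value is
# weighted by the inverse of its inclusion probability. Population values and
# inclusion probabilities are invented.
import numpy as np

rng = np.random.default_rng(4)
N = 1000
y = rng.gamma(shape=2.0, scale=50.0, size=N)         # hypothetical population variable
pi = np.clip(0.02 + 0.08 * y / y.max(), 0.02, 0.10)  # unequal inclusion probabilities (size-biased)

sampled = rng.random(N) < pi                          # Poisson sampling design
ht_total = np.sum(y[sampled] / pi[sampled])           # Horvitz-Thompson estimate of the population total

print(f"true total  : {y.sum():,.0f}")
print(f"HT estimate : {ht_total:,.0f}")
print(f"sample size : {sampled.sum()}")
```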
11

Fredette, Marc. "Prediction of recurrent events." Thesis, University of Waterloo, 2004. http://hdl.handle.net/10012/1142.

Abstract:
In this thesis, we study issues related to prediction problems, with an emphasis on those arising when recurrent events are involved. In the first chapter, we define the basic concepts of frequentist and Bayesian statistical prediction. In the second chapter, we study frequentist prediction intervals and their associated predictive distributions, and we present an approach based on asymptotically uniform pivotals that is shown to dominate the plug-in approach under certain conditions. The following three chapters consider the prediction of recurrent events. The third chapter presents different prediction models when these events can be modeled using homogeneous Poisson processes; amongst these models, those using random effects are shown to possess interesting features. In the fourth chapter, the time homogeneity assumption is relaxed and we present prediction models for non-homogeneous Poisson processes. The behavior of these models is then studied for prediction problems with a finite horizon. In the fifth chapter, we apply the concepts discussed previously to a warranty dataset from the automobile industry; the number of processes in this dataset being very large, we focus on methods providing computationally rapid prediction intervals. Finally, we discuss possibilities for future research in the last chapter.
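For orientation, the sketch below shows the simple plug-in prediction interval for a future count from a homogeneous Poisson process, the baseline approach that the pivotal-based method is shown to dominate; the event counts, rate, and horizon are invented.

```python
# Minimal sketch of a plug-in prediction interval for a future count from a homogeneous
# Poisson process: estimate the rate from the observed history, then take Poisson quantiles
# for the next window. Illustrates only the simple plug-in approach; data are invented.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(5)
obs_years, true_rate = 5.0, 3.2
n_observed = rng.poisson(true_rate * obs_years)      # hypothetical number of past events

rate_hat = n_observed / obs_years                    # MLE of the event rate
horizon = 2.0                                        # predict events over the next 2 years
mean_future = rate_hat * horizon

lower = poisson.ppf(0.025, mean_future)
upper = poisson.ppf(0.975, mean_future)
print(f"estimated rate: {rate_hat:.2f}/year")
print(f"95% plug-in prediction interval for next {horizon:.0f} years: [{lower:.0f}, {upper:.0f}]")
```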
12

Ewen, Stephanie. "Evaluation and Calibration of the CroBas-PipeQual Model for Jack Pine (Pinus banksiana Lamb.) using Bayesian Melding. Hybridization of a process-based forest growth model with empirical yield curves." Thesis, Université Laval, 2013. http://www.theses.ulaval.ca/2013/29919/29919.pdf.

Abstract:
CroBas-PipeQual is a process-based forest growth model designed to study foliage development and how growth processes relate to changes in wood quality. As such, CroBas-PipeQual is of interest as a component model in a forest-level decision support model for value assessment. In this thesis, the version of CroBas-PipeQual calibrated for jack pine (Pinus banksiana Lamb.) in Québec, Canada was qualitatively evaluated for use in forest management decision-making. Then, sensitivity analyses and Bayesian melding were used to create and calibrate a stand-level version of CroBas-PipeQual to local empirical height yield models in a hybrid-modelling approach. Key findings included: 1. Height predictions were most sensitive to input values and to parameters related to net photosynthesis; 2. Model performance was improved by varying two net-productivity parameters with site quality; and 3. Model performance needs further improvement before CroBas-PipeQual can be used as a component of a forest-management decision tool.
13

Jiang, Li. "Methods of calibration for the empirical likelihood ratio." Thesis, 2006. http://hdl.handle.net/1828/1854.

Abstract:
This thesis provides several new calibration methods for the empirical log-likelihood ratio. The commonly used chi-square calibration is based on the limiting distribution of this ratio but constantly suffers from the undercoverage problem. The finite sample distribution of the empirical log-likelihood ratio is recognized to have a mixture structure with a continuous component on [0, +∞) and a probability mass at +∞. Consequently, new calibration methods are developed to take advantage of this mixture structure; we propose new calibration methods based on mixture distributions, such as the mixture chi-square and the mixture Fisher's F distribution. The E distribution introduced in Tsao (2004a) has a natural mixture structure, and the calibration method based on this distribution is considered in great detail. We also discuss methods of estimating the E distributions.
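The point mass at +∞ mentioned above arises because the empirical likelihood ratio for a mean is undefined (conventionally set to +∞) whenever the hypothesized mean falls outside the convex hull of the sample. A minimal simulation, with an invented exponential example, makes this mass visible:

```python
# Minimal sketch of the mixture structure: the finite-sample EL ratio for a mean is
# +infinity whenever the hypothesized mean lies outside the convex hull of the sample
# (all observations on one side). Sample size and distribution are invented.
import numpy as np

rng = np.random.default_rng(6)
n, true_mean, n_rep = 10, 2.0, 200_000

samples = rng.exponential(scale=true_mean, size=(n_rep, n))
at_infinity = np.mean((samples.min(axis=1) > true_mean) | (samples.max(axis=1) < true_mean))

# Analytical check: P(all X_i > mu) + P(all X_i < mu) with X ~ Exp(mean=2), mu = 2
p_above = np.exp(-1.0)                      # P(X > mean) for an exponential
exact = p_above ** n + (1.0 - p_above) ** n
print(f"simulated mass at +infinity: {at_infinity:.4f}   analytical: {exact:.4f}")
```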
14

Giusti, Ilaria. "Improvement of piezocone test interpretation for partial drainage conditions and for transitional soils." Doctoral thesis, 2017. http://hdl.handle.net/2158/1138812.

Abstract:
The present study shows the results of experimental analyses of field cone penetration tests as well as calibration chamber mini-piezocone tests on soils of intermediate permeability (silts, clayey and sandy silts). The penetration rate was varied over three orders of magnitude to provide information on partially drained and undrained tip resistance, excess pore water pressure and sleeve friction. Whilst previous experimental research essentially focused on tip resistance and pore water pressure measurements, it is worth underlining that the present study is one of the first experimental studies to explore the effect of penetration rate on sleeve friction measurements. As the penetration rate is reduced, moving from undrained conditions to fully drained conditions, sleeve friction systematically decreases, together with the expected results in terms of increasing tip resistance and decreasing excess pore water pressure. The experimental database of penetration measurements on intermediate soils obtained here can be added to previously collected worldwide data to develop a new general interpretation procedure for cone tests in transitional soils. In addition, numerical analyses have been carried out using the Finite Element Method. The Updated Lagrangian technique has been adopted to simulate the large-strain penetration process. Both the Modified Cam Clay constitutive model and the Mohr-Coulomb model have been used to compare numerical simulation results with, respectively, the experimental results on kaolin clay (Randolph and Hope, 2004; Schneider et al., 2007) and those obtained in the present study. The problem of piezocone misinterpretation in the case of transitional soils, such as loose silt mixtures, has been dealt with using an empirical methodology based on the calibration of the Soil Behaviour Type index against soil characteristics inferred from reference boreholes. Moreover, a new approach has been proposed to overcome misinterpretation of piezocone test results for soil layers belonging to vadose zones, in which the effective stress state is controlled by suction. This procedure allows for the correction of the Soil Behaviour Type (SBT) index in order to correctly place the investigated soils within the SBT classification charts (Robertson, 1990). In addition, the applied method suggests a procedure, based on piezocone measurements, to estimate the effective stress state in the case of a homogeneous soil layer in which a vadose zone above the water table is present.
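As background on the index being corrected, the sketch below evaluates the Soil Behaviour Type index Ic from normalized cone resistance and friction ratio (using the simple stress exponent n = 1); the CPT readings and stresses are invented example values, not data from the thesis.

```python
# Minimal sketch of the Soil Behaviour Type index computation underlying the SBT charts
# mentioned above (non-normalized exponent n = 1 used for simplicity). Readings are invented.
import numpy as np

def sbt_index(qt_kpa, fs_kpa, sigma_v0_kpa, sigma_v0_eff_kpa):
    """Soil Behaviour Type index Ic from normalized cone resistance Qt and friction ratio Fr."""
    qt_net = qt_kpa - sigma_v0_kpa
    Qt = qt_net / sigma_v0_eff_kpa                 # normalized cone resistance
    Fr = 100.0 * fs_kpa / qt_net                   # normalized friction ratio (%)
    return np.sqrt((3.47 - np.log10(Qt)) ** 2 + (np.log10(Fr) + 1.22) ** 2)

# Hypothetical readings for a silty layer at ~8 m depth
ic = sbt_index(qt_kpa=2500.0, fs_kpa=40.0, sigma_v0_kpa=150.0, sigma_v0_eff_kpa=90.0)
print(f"Ic = {ic:.2f}  (roughly, Ic between 2.05 and 2.60 plots as silty sand / sandy silt)")
```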
15

Hsieh, I-Huan (謝亦歡). "Combining Principal Component Analysis and Empirical Orthogonal Function Development of Regional Groundwater Numerical Model Calibration Methodology. A Case Study of Ming-Chu Basin." Thesis, 2017. http://ndltd.ncl.edu.tw/handle/739emg.

Abstract:
Master's thesis, Graduate Institute of Civil Engineering, National Taiwan University, academic year 105 (2016-2017).
This study aims to develop a calibration method for regional groundwater numerical models. First, principal component analysis (PCA) is applied to groundwater level hydrographs along cross-sections to identify their temporal-spatial variability; the line where the eigenvalue equals zero is located and used as a reference for placing new assistance wells. The empirical orthogonal function (EOF) method is then applied to the change hydrograph of groundwater storage and the simulated-error hydrograph of groundwater level to quickly and accurately capture and calibrate the temporal-spatial distribution of water recharge and the hydrogeological parameters. The objective function minimizes the root mean square error (RMSE) between simulated and observed groundwater levels, and the decision variables are the horizontal hydraulic conductivity, the vertical hydraulic conductivity, and the water recharge. The optimization model has three constraints: (1) the water recharge of the groundwater system must obey mass balance in every iteration of the calibration process; (2) the simulated groundwater level must satisfy the governing equation of groundwater flow; and (3) the horizontal and vertical hydraulic conductivities are restricted to reasonable limits. The procedure first sets initial values for the decision variables and supplies them to the groundwater model; the groundwater level is then simulated and the objective function evaluated. If the objective function does not satisfy the stopping condition, the simulated-error hydrograph of groundwater level is computed and analyzed with EOF, and the decision variables are updated according to that error hydrograph and the EOF results. Through these iterations, the optimal temporal-spatial distribution of surface water recharge and hydrogeological parameters is obtained. The method was applied to the calibration of the groundwater system of the Ming Chu Basin, with a daily simulation period from January 2012 to December 2012. The decision variables were the horizontal and vertical hydraulic conductivities of two aquifers and the rainfall, river, and boundary recharge to the first aquifer. The results show that the RMSE decreases dramatically in the early iterations of the calibration and flattens after several iterations. The calibrated hydraulic conductivity and vertical leakance lie within reasonable limits. The simulated groundwater levels reflect the approximate trend in all aquifers and capture the peaks of the observed values in the first aquifer. Hence, the established method can effectively and accurately calibrate the temporal-spatial distribution of surface water recharge and hydrogeological parameters.
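To make the PCA/EOF step concrete, the sketch below decomposes a synthetic set of groundwater-level hydrographs into spatial patterns and temporal amplitudes via the singular value decomposition; the hydrographs are invented, and this is only the decomposition step, not the thesis's coupled groundwater-model calibration loop.

```python
# Minimal sketch of a PCA/EOF decomposition of groundwater-level hydrographs:
# a (time x wells) matrix is decomposed so the leading spatial patterns (EOFs) and
# temporal amplitudes capture most of the variability. Synthetic data, illustration only.
import numpy as np

rng = np.random.default_rng(7)
n_time, n_wells = 365, 12
t = np.arange(n_time)

seasonal = np.sin(2 * np.pi * t / 365.0)                     # shared seasonal signal
trend = t / 365.0                                            # shared drawdown/recovery trend
loads_s = rng.uniform(0.5, 2.0, n_wells)                     # well-specific response to each signal
loads_t = rng.uniform(-1.0, 1.0, n_wells)
heads = np.outer(seasonal, loads_s) + np.outer(trend, loads_t) + 0.1 * rng.standard_normal((n_time, n_wells))

anomaly = heads - heads.mean(axis=0)                         # remove each well's mean level
u, s, vt = np.linalg.svd(anomaly, full_matrices=False)       # EOFs are the rows of vt
explained = s**2 / np.sum(s**2)

print("variance explained by first 3 modes:", np.round(explained[:3], 3))
print("first spatial pattern (EOF 1):", np.round(vt[0], 2))
```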
