Dissertations / Theses on the topic 'Data assimilation'

To see the other types of publications on this topic, follow the link: Data assimilation.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Data assimilation.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Peubey, Carole. "Assimilation of ENVISAT data in an advanced data assimilation system." Thesis, University of Reading, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.485367.

Full text
Abstract:
… into a stratosphere-troposphere version of the Met Office assimilation system, producing one of the first analyses to reproduce the September 2002 split of the Antarctic polar vortex. The aim of the project was to investigate the benefit of assimilating MIPAS retrievals and to assess the Met Office 3D-Var assimilation system by examining its different components. The ozone analysis was found to agree with independent ozone observations through most of the middle and upper stratosphere, biases above 60 hPa being within the range -20% to +10% and typically smaller. More significant positive biases were found in the lower stratosphere and inside the polar vortex. Although ozone amounts are shown to be slightly overestimated by MIPAS retrievals in these same regions, these biases are demonstrated to be caused by shortcomings in the model chemistry and transport. MIPAS data have been shown to have a limited impact on the Met Office temperature analysis, although a positive effect was identified at the mesopause. It is shown that MIPAS could bring larger benefits if more realistic background error statistics were used for ozone, especially in the lower stratosphere. Based on an evaluation of these statistics using independent datasets, it is suggested that background error variances should be decreased near the ozone maximum and increased below 70 hPa. It is also recommended to introduce latitude dependence in vertical error correlations and height dependence in horizontal error correlations. Improvements are also proposed for the ozone assimilation in the polar vortex region. Finally, analysed winds have been found to induce erroneous transport of ozone by increasing vertical diffusion of ozone and enhancing the mean zonal circulations. This especially affects the tropics, where ozone analyses reveal excessive exchanges of air parcels between the stratosphere and the troposphere.
APA, Harvard, Vancouver, ISO, and other styles
2

Barillec, Remi Louis. "Bayesian data assimilation." Thesis, Aston University, 2008. http://publications.aston.ac.uk/15276/.

Full text
Abstract:
This thesis addresses data assimilation, which typically refers to the estimation of the state of a physical system given a model and observations, and its application to short-term precipitation forecasting. A general introduction to data assimilation is given, from both a deterministic and a stochastic point of view. Data assimilation algorithms are reviewed, first in the static case (when no dynamics are involved), then in the dynamic case. A double experiment on two non-linear models, the Lorenz 63 and the Lorenz 96 models, is run, and the comparative performance of the methods is discussed in terms of quality of the assimilation, robustness in the non-linear regime and computational time.
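The Lorenz 63 model used in this comparative experiment is a standard three-variable chaotic testbed for assimilation methods. As a point of reference, a minimal sketch of integrating it with its standard parameters and a fourth-order Runge-Kutta step (a generic illustration, not code from the thesis):

```python
import numpy as np

def lorenz63(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz 63 system with standard parameters."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4_step(f, state, dt):
    """One fourth-order Runge-Kutta step."""
    k1 = f(state)
    k2 = f(state + 0.5 * dt * k1)
    k3 = f(state + 0.5 * dt * k2)
    k4 = f(state + dt * k3)
    return state + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

# Integrate a trajectory onto the attractor.
state = np.array([1.0, 1.0, 1.0])
for _ in range(1000):
    state = rk4_step(lorenz63, state, 0.01)
```

In twin experiments such as the one described above, a trajectory like this serves as the synthetic "truth" from which noisy observations are drawn.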
APA, Harvard, Vancouver, ISO, and other styles
3

Gregory, Alastair. "Multilevel ensemble data assimilation." Thesis, Imperial College London, 2017. http://hdl.handle.net/10044/1/60645.

Full text
Abstract:
This thesis aims to investigate and improve the efficiency of ensemble transform methods for data assimilation, using an application of multilevel Monte Carlo. Multilevel Monte Carlo is an interesting framework for estimating statistics of discretized random variables, since it uses a hierarchy of discretizations with a refinement in resolution. This is in contrast to standard Monte Carlo estimators that only use a discretization at a fine resolution. A linear combination of sub-estimators, on different levels of this hierarchy, can provide new statistical estimators of random variables at the finest level of resolution with significantly greater efficiency than a standard Monte Carlo equivalent. Therefore, the extension to computing filtering estimators for data assimilation is a natural but challenging area of study. These challenges arise due to the fact that correlation must be imparted between ensembles on adjacent levels of resolution and maintained during the assimilation of data. The methodology proposed in this thesis considers coupling algorithms to establish this correlation. This generates multilevel estimators that significantly reduce the computational expense of propagating ensembles of discretizations through time and space, in between stages of data assimilation. An effective benchmark of this methodology is realised by filtering data into high-dimensional spatio-temporal systems, where a high computational complexity is required to solve the underlying partial differential equations. A novel extension of an ensemble transform localisation framework to finite element approximations within random spatio-temporal systems is proposed, in addition to a multilevel equivalent.
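The telescoping multilevel estimator described in this abstract can be sketched generically. Below, `sample_level` is a hypothetical coupled sampler standing in for a hierarchy of discretizations (shared randomness couples the fine and coarse approximations on each level); nothing in this sketch is taken from the thesis itself:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_level(level, n, rng):
    """Hypothetical coupled sampler: fine- and coarse-level approximations
    of the same quantity. Here level l averages 2**(l+2) draws of W ~ N(1,1),
    and the coarse approximation reuses half of the same draws."""
    m_fine = 2 ** (level + 2)
    w = rng.standard_normal((n, m_fine)) + 1.0
    fine = w.mean(axis=1)
    coarse = w[:, : m_fine // 2].mean(axis=1)  # coupled to the fine sample
    return fine, coarse

def mlmc_mean(num_levels, samples_per_level, rng):
    """Telescoping MLMC estimator:
    E[Q_L] ~ E[Q_0] + sum_l E[Q_l - Q_{l-1}]."""
    fine0, _ = sample_level(0, samples_per_level[0], rng)
    estimate = fine0.mean()
    for level in range(1, num_levels):
        fine, coarse = sample_level(level, samples_per_level[level], rng)
        estimate += (fine - coarse).mean()  # cheap, low-variance correction
    return estimate

est = mlmc_mean(4, [4000, 2000, 1000, 500], rng)
```

Because the coupled differences have small variance, most samples are taken on the cheap coarse levels, which is the source of the efficiency gain the abstract refers to.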
APA, Harvard, Vancouver, ISO, and other styles
4

Woodgate, Rebecca A. "Data assimilation in ocean models." Thesis, University of Oxford, 1994. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.359566.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Moore, A. M. "Data assimilation in ocean models." Thesis, University of Oxford, 1986. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.375276.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Da, Dalt Federico. "Ionospheric modelling and data assimilation." Thesis, University of Bath, 2015. https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.665450.

Full text
Abstract:
A New Ionospheric Model (ANIMo) based upon the physics of production, loss, and vertical transport has been developed. The model is driven by estimates of neutral composition, temperature and solar flux and is applicable to the mid-latitude regions of the Earth under quiet and moderate geomagnetic conditions. This model was designed to exhibit specific features that were not easy to find all together in other existing ionospheric models. ANIMo needed to be simple to use and interact with, relatively accurate, reliable, robust and computationally efficient. The definition of these characteristics was mostly driven by the intention to use ANIMo in a Data Assimilation (DA) scheme. DA, or data ingestion, can be described as a technique in which observations and model realizations, called background information, are combined to achieve a level of accuracy that is higher than that of the two elements taken separately. In this project ANIMo was developed to provide a robust and reliable background contribution. The observations are Global Positioning System (GPS) ionospheric measurements, collected from several networks of GPS ground-station receivers and available in on-line repositories. The research benefits from the Multi-Instrument Data Analysis System (MIDAS) [Mitchell and Spencer, 2003; Spencer and Mitchell, 2007], an established ionospheric tomography software package that produces three-dimensional reconstructions of the ionosphere from GPS measurements. Utilizing ANIMo in support of MIDAS therefore has the potential to generate a very stable set-up for monitoring and studying the ionosphere. In particular, the model is expected to compensate for some of the typical limitations of ionospheric tomography techniques described by Yeh and Raymund [1991] and Raymund et al. [1994]. These are associated with the lack of data due to the uneven distribution of ground-based receivers and limitations on viewing angles.
Even in regions of good receiver coverage there is a need to compensate for missing information on the vertical profile of ionisation. MIDAS and other tomography techniques introduce regularization factors that ensure a unique solution in the inversion operation. These issues could be addressed by aiding the operation with external information provided by a physical model, like ANIMo, through a data ingestion scheme; this ensures that the contribution is completely independent and that there is an effective accuracy improvement. Previously, the limitation in vertical resolution has been addressed by applying vertical orthonormal functions based upon empirical models in different ways [Fougere, 1995; Fremouw et al., 1992; Sutton and Na, 1994]. The potential of applying a physical model, such as ANIMo, is that it can provide this information according to the current ionospheric conditions. During the project ANIMo was developed and incorporated with MIDAS. The result is A New Ionospheric Data Assimilation System (ANIDAS); its name suggests that the system is the implementation of ANIMo in MIDAS. Because ANIDAS is a data ingestion scheme, it has the potential to be used to perform not only more accurate now-casting but also forecasting. The outcomes of ANIDAS at the current time can be used to initialise ANIMo for the next time step and therefore trigger another assimilation cycle. In future, it is intended that ANIMo will form the basis of a new system to predict the electron density of the ionosphere – ionospheric forecasting.
APA, Harvard, Vancouver, ISO, and other styles
7

Shukla, Abhishek. "Analysis of data assimilation schemes." Thesis, University of Warwick, 2016. http://wrap.warwick.ac.uk/90880/.

Full text
Abstract:
Data assimilation schemes are methods to estimate the true underlying state of a physical system of interest by combining theoretical knowledge about the underlying system with available observations of the state. However, in most physical systems the observations are often noisy and only partially available. In the first part of this thesis we study the case of a sequential data assimilation scheme, when the underlying system is nonlinear and chaotic and the observations are partial and noisy. We produce a rigorous and quantitative analysis of the data assimilation process for fixed observation modes. We also introduce a novel method of dynamically rearranging observation modes, leading to the requirement of fewer observation modes while maintaining the accuracy of the data assimilation process. In the second part of the thesis we focus on the 4DVAR data assimilation scheme, which is a variational method. 4DVAR data assimilation solves a variational problem: given a set of observations and a numerical model for the underlying physical system, together with a priori information on the initial condition, estimate the initial condition for the underlying model. We propose a hybrid data assimilation scheme where we take the 3DVAR scheme for the model as the constraint in the variational form, rather than constraining the variational form with the original model. We observe that this method reduces the computational cost of the minimization of the 4DVAR variational form; however, it introduces a bias in the estimate of the initial condition. We then explore how the results can be extended to weak constraint 4DVAR.
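For readers unfamiliar with the variational schemes named here, a single linear-Gaussian 3DVAR-style analysis can be written in closed form. This is a minimal generic sketch (the operators and numbers are illustrative; the thesis's hybrid scheme is not reproduced):

```python
import numpy as np

def threedvar_analysis(xb, y, H, B, R):
    """3D-Var analysis: minimise
    J(x) = (x - xb)^T B^{-1} (x - xb) + (y - Hx)^T R^{-1} (y - Hx).
    For linear H the minimiser is xb + K (y - H xb), with gain
    K = B H^T (H B H^T + R)^{-1}."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return xb + K @ (y - H @ xb)

xb = np.array([1.0, 2.0])   # background (a priori) state
H = np.array([[1.0, 0.0]])  # observe the first component only
y = np.array([3.0])         # observation
B = np.eye(2)               # background error covariance
R = np.array([[1.0]])       # observation error covariance
xa = threedvar_analysis(xb, y, H, B, R)
# Equal background and observation weight: the analysis of the observed
# component splits the difference between background (1.0) and observation (3.0).
```

4DVAR generalises this by constraining the fit over a whole time window with the model dynamics, which is where the computational cost the abstract mentions comes from.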
APA, Harvard, Vancouver, ISO, and other styles
8

Lindskog, Magnus. "On errors in meteorological data assimilation." Doctoral thesis, Stockholm : Department of Meteorology, Stockholm university, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-7258.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Milewski, Thomas. "Stratospheric chemical-dynamical ensemble data assimilation." Thesis, McGill University, 2012. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=110352.

Full text
Abstract:
Ensemble data assimilation uses Monte-Carlo methods to estimate flow-dependent error covariances, which allow the transfer of information from observed variables to correlated ones. As the winds are largely unobserved in the stratosphere and models have biases there, the possibility of constraining the dynamical analysis from temperature or ozone observations is explored using ensemble data assimilation. The applicability of coupled chemical/dynamical ensemble data assimilation in the stratosphere is tested in idealized perfect-model observation system simulation experiments with the IGCM-FASTOC chemistry-climate model. Covariance localization is found to be necessary for stability of the Ensemble Kalman Filter (EnKF) data assimilation system, and optimal localization parameters yield a strong constraint on the global dynamical state of the model when assimilating only synthetic limb-sounding stratospheric temperature or ozone observations. The multivariate coupling between ozone, temperature and winds is investigated in the optimized EnKF system. Stratospheric temperature and ozone observations induce valuable dynamical analysis increments during the analysis step. There is additional feedback during the forecast steps in the ensemble data assimilation system, further constraining the global dynamical and ozone states. The potential impact of assimilating observations posterior to analysis time in multivariate mode was estimated with an Ensemble Kalman Smoother (EnKS). Assimilation of additional asynchronous observations up to 48 hours posterior to analysis time provided improvements on the EnKF analysis nearly equal to those obtained from the assimilation of the same amount of additional synchronous observations.
The EnKS assimilation showed beneficial impacts on the analysis state of the unobserved variables but mixed impacts on that of the observed variable. The capacity to constrain the unobserved stratospheric winds by assimilating ozone observations is demonstrated in the ensemble data assimilation system with the EnKF and EnKS. The chemical-dynamical error covariances are critical to reducing the wind error in the model analysis state, particularly through the ozone-wind covariances, which are effective in the upper-troposphere lower-stratosphere region. Additional tests with strongly-biased initial forecasts, within a stratospheric sudden warming experiment, confirm the ability of the EnKF to efficiently propagate information from ozone observations to the dynamical model state.
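The transfer of information from observed to unobserved variables described in this abstract happens through the ensemble-estimated covariances in the EnKF analysis step. A minimal stochastic (perturbed-observation) EnKF sketch, with illustrative values rather than the IGCM-FASTOC setup:

```python
import numpy as np

rng = np.random.default_rng(1)

def enkf_analysis(ensemble, y, H, R, rng):
    """Stochastic EnKF analysis step. ensemble: (n_members, n_state).
    The forecast covariance is estimated from the ensemble itself,
    which is what lets observed variables update correlated unobserved ones."""
    n, _ = ensemble.shape
    X = ensemble - ensemble.mean(axis=0)
    Pf = X.T @ X / (n - 1)  # flow-dependent forecast error covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)
    perturbed = y + rng.multivariate_normal(np.zeros(len(y)), R, size=n)
    return ensemble + (perturbed - ensemble @ H.T) @ K.T

# Two correlated state variables (think: temperature and wind);
# only the first is observed.
truth = np.array([1.0, 2.0])
prior = rng.multivariate_normal(truth, [[1.0, 0.9], [0.9, 1.0]], size=200)
H = np.array([[1.0, 0.0]])
R = np.array([[0.05]])
posterior = enkf_analysis(prior, np.array([truth[0]]), H, R, rng)
```

Because the prior ensemble correlates the two variables, the update sharpens the unobserved second component as well, which is the mechanism exploited to constrain winds from ozone or temperature observations.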
APA, Harvard, Vancouver, ISO, and other styles
10

Stewart, Laura M. "Correlated observation errors in data assimilation." Thesis, University of Reading, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.553080.

Full text
Abstract:
Data assimilation techniques combine observations and prior model forecasts to create initial conditions for numerical weather prediction (NWP). The relative weighting assigned to each observation in the analysis is determined by the error associated with its measurement. Remote sensing data often have correlated errors, but the correlations are typically ignored in NWP. As operational centres move towards high-resolution forecasting, the assumption of uncorrelated errors becomes impractical. This thesis provides new evidence that including observation error correlations in data assimilation schemes is both feasible and beneficial. We study the dual problem of quantifying and modelling observation error correlation structure. Firstly, in original work using statistics from the Met Office 4D-Var assimilation system, we diagnose strong cross-channel error covariances for the IASI satellite instrument. We then see how in a 3D-Var framework, information content is degraded under the assumption of uncorrelated errors, while retention of an approximate correlation gives clear benefits. These novel results motivate further study. We conclude by modelling observation error correlation structure in the framework of a one-dimensional shallow water model. Using an incremental 4D-Var assimilation system we observe that analysis errors are smallest when correlated error covariance matrix approximations are used over diagonal approximations. The new results reinforce earlier conclusions on the benefits of including some error correlation structure.
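The effect of ignoring observation-error correlations can be shown in a tiny linear example: two observations of one scalar state, assimilated once with the full correlated error covariance R and once with its diagonal approximation (a generic sketch with made-up numbers, not the thesis's IASI or shallow-water setup):

```python
import numpy as np

def analysis(xb, y, H, B, R):
    """Best linear unbiased analysis with gain K = B H^T (H B H^T + R)^{-1}."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return xb + K @ (y - H @ xb)

xb = np.zeros(1)
H = np.ones((2, 1))                          # two observations of one scalar state
B = np.array([[1.0]])
y = np.array([1.0, 1.0])
R_corr = np.array([[1.0, 0.8], [0.8, 1.0]])  # strongly correlated obs errors
R_diag = np.diag(np.diag(R_corr))            # diagonal approximation
xa_corr = analysis(xb, y, H, B, R_corr)
xa_diag = analysis(xb, y, H, B, R_diag)
# With correlated errors the two observations carry less independent
# information, so the correct analysis is pulled less far toward them.
```

The diagonal approximation treats the two observations as independent and therefore overweights them; misrepresenting the correlation structure in this way is exactly the source of the degraded information content the abstract discusses.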
APA, Harvard, Vancouver, ISO, and other styles
11

Moodey, Alexander J. F. "Instability and regularization for data assimilation." Thesis, University of Reading, 2013. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.602412.

Full text
Abstract:
In the environmental sciences community, the process of blending observations and numerical models is called data assimilation. Data assimilation schemes produce an analysis state, which is the best estimate of the state of the system. Error in the analysis state, which is due to errors in the observations and numerical models, is called the analysis error. In this thesis we formulate an expression for the analysis error as the data assimilation procedure is cycled in time and derive results on the boundedness of the analysis error for a number of different data assimilation schemes. Our work is focused on infinite-dimensional dynamical systems where the equation which we solve is ill-posed. We present stability results for diagonal dynamical systems for a three-dimensional variational data assimilation scheme. We show that increasing the assumed uncertainty in the background state, which is an a priori estimate of the state of the system, leads to a bounded analysis error. We demonstrate for general linear dynamical systems that if there is uniform dissipation in the model dynamics with respect to the observation operator, then regularization can be used to ensure stability of many cycled data assimilation schemes. Under certain conditions we show that cycled data assimilation schemes that update the background error covariance in a general way remain stable for all time and demonstrate that many of these conditions hold for the Kalman filter. Our results are extended to dynamical systems where the model dynamics are nonlinear and the observation operator is linear. Under certain Lipschitz continuity and dissipativity assumptions we demonstrate that the assumed uncertainty in the background state can be increased to ensure stability of cycled data assimilation schemes that update the background error covariance. The results are demonstrated numerically using the two-dimensional Eady model and the Lorenz 1963 model.
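The stabilising effect of inflating the assumed background uncertainty can be seen in a scalar caricature of a cycled scheme: with model growth factor a and gain K, the analysis error obeys e_{k+1} = a(1-K)e_k + K·(obs noise), which is bounded only if |a(1-K)| < 1. This toy illustration is not the thesis's infinite-dimensional setting:

```python
import numpy as np

rng = np.random.default_rng(3)

def cycled_error(model_growth, gain, steps, rng):
    """Analysis error of a cycled scalar assimilation scheme:
    forecast error grows by `model_growth`, the analysis damps it by
    (1 - gain) and injects gain-weighted observation noise."""
    e = 1.0
    for _ in range(steps):
        e = (1 - gain) * (model_growth * e) + gain * rng.standard_normal()
    return e

a = 1.5                        # unstable model dynamics
# Gain implied by an inflated background variance B = 10 (obs variance R = 1):
k_large = 10.0 / (10.0 + 1.0)  # |a(1 - K)| ~ 0.14  -> error stays bounded
# Gain implied by an overconfident background, B = 0.01:
k_small = 0.01 / (0.01 + 1.0)  # |a(1 - K)| ~ 1.49  -> error blows up
e_stable = cycled_error(a, k_large, 50, rng)
e_unstable = cycled_error(a, k_small, 50, rng)
```

Increasing the assumed background variance raises the gain and pulls the error dynamics inside the stability region, which is the scalar analogue of the boundedness results the abstract describes.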
APA, Harvard, Vancouver, ISO, and other styles
12

Luo, Xiaodong. "Recursive bayesian filters for data assimilation." Thesis, University of Oxford, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.509987.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

Ades, Melanie. "Data assimilation in highly nonlinear systems." Thesis, University of Reading, 2013. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.603528.

Full text
Abstract:
Particle filters are a class of data-assimilation schemes which, unlike current operational data-assimilation methods, make no assumptions about the linearity of the model equations or observation operators. This means they can potentially represent the full, possibly multi-modal, posterior probability density function (pdf). Unfortunately, the standard Sequential Importance Resampling (SIR) particle filter requires too many particles to make it a viable operational data-assimilation scheme in high dimensional systems. This thesis explores an adaptation to the SIR filter, the equivalent-weights particle filter, designed to ensure an ensemble representation of the high probability region of the posterior pdf even in high dimensional systems. The formulation of the equivalent-weights particle filter involves various tuneable parameters. The first part of this thesis focusses on a theoretical and practical examination of the effect of the parameter choices on the ability of the equivalent-weights particle filter to represent the posterior pdf. Theoretically, the importance of ensuring equivalent weights for the majority of particles is shown, and consequently the need to sample from a mixture proposal density at analysis time. Practically, the potentially large influence of the parameters is demonstrated, and how this can be used to establish appropriate choices for some of the parameters is discussed. The second part of the thesis considers two areas related to the potential of the equivalent-weights particle filter as a viable data-assimilation scheme: the capacity to represent the posterior pdf and the effect on any model balances that may be present in the system. The ability of the equivalent-weights particle filter to represent the high probability region of the posterior pdf with relatively few particles in high dimensional systems is demonstrated.
Changes in the observation distribution, frequency or error statistics result in minimal impact to this posterior representation. More distinctive effects are seen when the model error statistics are misrepresented in the ensemble. The final section on model balances relates to the use of the equivalent-weights particle filter in atmosphere and ocean numerical models. The equivalent-weights particle filter is shown to have little effect on the model balances present in a simple ocean model and hence there is no evidence for the introduction of spurious gravity waves.
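The standard SIR filter that the equivalent-weights filter adapts consists of weighting particles by the observation likelihood and then resampling. A scalar toy sketch (Gaussian prior and likelihood; illustrative values, not the thesis's scheme):

```python
import numpy as np

rng = np.random.default_rng(2)

def sir_step(particles, y, obs_std, rng):
    """One analysis step of the standard SIR particle filter:
    weight each particle by the Gaussian observation likelihood,
    then resample with replacement according to those weights."""
    weights = np.exp(-0.5 * ((particles - y) / obs_std) ** 2)
    weights /= weights.sum()
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

particles = rng.normal(0.0, 2.0, size=5000)  # prior ensemble
posterior = sir_step(particles, y=1.0, obs_std=0.5, rng=rng)
# The resampled ensemble concentrates near the Bayesian posterior,
# which here lies between the prior mean (0) and the observation (1).
```

In high dimensions the weights degenerate (one particle takes nearly all the weight), which is the filter-collapse problem motivating the equivalent-weights construction.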
APA, Harvard, Vancouver, ISO, and other styles
14

Jenkins, Siân. "Numerical model error in data assimilation." Thesis, University of Bath, 2015. https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.665395.

Full text
Abstract:
In this thesis, we produce a rigorous and quantitative analysis of the errors introduced by finite difference schemes into strong constraint 4D-Variational (4D-Var) data assimilation. Strong constraint 4D-Var data assimilation is a method that solves a particular kind of inverse problem: given a set of observations and a numerical model for a physical system, together with a priori information on the initial condition, estimate an improved initial condition for the numerical model, known as the analysis vector. This method has many forms of error affecting the accuracy of the analysis vector, and is derived under the assumption that the numerical model is perfect, when in reality this is not true. Therefore it is important to assess whether this assumption is realistic and, if not, how the method should be modified to account for model error. Here we analyse how the errors introduced by finite difference schemes used as the numerical model affect the accuracy of the analysis vector. Initially the 1D linear advection equation is considered as our physical system. All forms of error, other than those introduced by finite difference schemes, are initially removed. The error introduced by 'representative schemes' is considered in terms of numerical dissipation and numerical dispersion. A spectral approach is successfully implemented to analyse the impact on the analysis vector, examining the effects on unresolvable wavenumber components and the l2-norm of the error. Subsequently, a similar, equally successful analysis is conducted when observation errors are re-introduced to the problem. We then explore how the results can be extended to weak constraint 4D-Var. The 2D linear advection equation is then considered as our physical system, demonstrating how the results from the 1D problem extend to 2D. The linearised shallow water equations extend the problem further, highlighting the difficulties associated with analysing a coupled system of PDEs.
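The numerical dissipation this thesis analyses is visible in the simplest finite difference scheme for the 1D linear advection equation u_t + c u_x = 0. A first-order upwind sketch on a periodic domain (illustrative grid and CFL values, not the schemes studied in the thesis):

```python
import numpy as np

def upwind_advect(u0, c, dx, dt, steps):
    """First-order upwind scheme for u_t + c u_x = 0 (c > 0, periodic).
    Stable for CFL = c*dt/dx <= 1, but numerically dissipative:
    advected features lose amplitude every step."""
    u = u0.copy()
    cfl = c * dt / dx
    for _ in range(steps):
        u = u - cfl * (u - np.roll(u, 1))
    return u

n = 100
x = np.linspace(0.0, 1.0, n, endpoint=False)
u0 = np.sin(2 * np.pi * x)
u = upwind_advect(u0, c=1.0, dx=1.0 / n, dt=0.005, steps=200)
# After one full advection period the exact solution equals u0; the
# upwind solution has nearly the right phase but a damped amplitude.
```

In a 4D-Var setting this damping acts like a systematic model error: the perfect-model assumption attributes it to the initial condition instead, biasing the analysis vector.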
APA, Harvard, Vancouver, ISO, and other styles
15

Siddons, Lee Anthony. "Data assimilation of HF radar data into coastal wave models." Thesis, University of Sheffield, 2007. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.444578.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Grunmann, Pablo Javier. "Variational data assimilation of soil moisture information." College Park, Md. : University of Maryland, 2005. http://hdl.handle.net/1903/2476.

Full text
Abstract:
Thesis (Ph. D.) -- University of Maryland, College Park, 2005.
Thesis research directed by: Meteorology. Title from t.p. of PDF. Includes bibliographical references. Published by UMI Dissertation Services, Ann Arbor, Mich. Also available in paper.
APA, Harvard, Vancouver, ISO, and other styles
17

Liu, Liyan Jones C. K. R. T. "Lagrangian data assimilation into layered ocean model." Chapel Hill, N.C. : University of North Carolina at Chapel Hill, 2007. http://dc.lib.unc.edu/u?/etd,786.

Full text
Abstract:
Thesis (Ph. D.)--University of North Carolina at Chapel Hill, 2007.
Title from electronic title page (viewed Dec. 18, 2007). "... in partial fulfillment of the requirements for the degree of Doctor of Philosophy in the Department of Mathematics." Discipline: Mathematics; Department/School: Mathematics.
APA, Harvard, Vancouver, ISO, and other styles
18

Lui, Chiu-sing Gilbert. "Some statistical topics on sequential data assimilation." E-thesis, The University of Hong Kong, 2008. http://sunzi.lib.hku.hk/hkuto/record/b40204005.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Lui, Chiu-sing Gilbert, and 雷照盛. "Some statistical topics on sequential data assimilation." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2008. http://hub.hku.hk/bib/B40204005.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Phillipson, Luke. "Ocean data assimilation in the Angola Basin." Thesis, Imperial College London, 2018. http://hdl.handle.net/10044/1/62645.

Full text
Abstract:
The predictability of the ocean currents and the Congo River plume within the Angola Basin was investigated using the Regional Ocean Modelling System (ROMS) with data assimilation (4D-Var). Firstly, the impact of assimilating a novel remote-sensing data set, satellite-derived ocean currents (OSCAR), as compared to the more conventional satellite sea surface height (SSH), on ocean current predictability was assessed. In comparing 17 simulated and observed drifters throughout January-March 2013 using four different metrics, it was found that OSCAR assimilation improves the Lagrangian predictability of ocean currents only as much as altimetry assimilation does. The impact of combining the aforementioned remote-sensing observations (OSCAR or SSH) with drifters was then investigated throughout the same period, to assess whether this combination could improve upon assimilating the drifters alone for ocean current predictability. It was found that the addition of drifters significantly improves the Lagrangian predictability of the ocean currents in comparison to either altimetry or OSCAR, as expected. More surprisingly, the assimilation of either SSH or OSCAR with the drifter velocities does not significantly improve the Lagrangian predictability compared to the drifter assimilation alone, even degrading predictability in some cases. Additionally, a new metric, denoted the crossover time, was formulated using the drifters, defined as the time it takes for a numerical model to equal the performance of persistence. In addition to ROMS, a global ocean model was also evaluated to demonstrate and quantify the metric fully. Finally, the impact of assimilating a recently available advanced version of a satellite salinity product (SMOS) on the Congo River plume was investigated.
With some metrics specifically focusing on validating the Congo River plume, it was found that the assimilation of SMOS improved the representation of the plume within the model as well as the modelled salinity fields.
APA, Harvard, Vancouver, ISO, and other styles
21

Jafarpour, Behnam. "Oil reservoir characterization using ensemble data assimilation." Thesis, Massachusetts Institute of Technology, 2008. http://hdl.handle.net/1721.1/43046.

Full text
Abstract:
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Civil and Environmental Engineering, 2008.
Pages 211-212 blank.
Includes bibliographical references.
Increasing world energy demand combined with decreasing discoveries of new and accessible hydrocarbon reserves are necessitating optimal recovery from the world's current hydrocarbon resources. Advances in drilling and monitoring technologies have introduced intelligent oilfields that provide real-time measurements of reservoir conditions. These measurements can be used for more frequent reservoir model calibration and characterization that can lead to improved oil recovery through model-based closed-loop control and management. This thesis proposes an efficient method for probabilistic characterization of reservoir states and properties. The proposed algorithm uses an ensemble data assimilation approach to provide stochastic characterization of reservoir attributes by conditioning individual prior ensemble members on dynamic production observations at wells. The conditioning is based on the second-order Kalman filter analysis and is performed recursively, which is suitable for real-time control applications. The prior sample mean and covariance are derived from nonlinear dynamic propagation of an initial ensemble of reservoir properties. Realistic generation of these initial reservoir properties is shown to be critical for successful performance of the filter. When properly designed and implemented, recursive ensemble filtering is concluded to be a practical and attractive alternative to classical iterative history matching algorithms. A reduced representation of reservoir's states and parameters using discrete cosine transform is presented to improve the estimation problem and geological consistency of the results. The discrete cosine transform allows for efficient, flexible, and robust parameterization of reservoir properties and can be used to eliminate redundancy in reservoir description while preserving important geological features.
This improves under-constrained inverse problems such as reservoir history matching in which the number of unknowns significantly exceeds available data. The proposed parameterization approach is general and can be applied with any inversion algorithm. The suitability of the proposed estimation framework for hydrocarbon reservoir characterization is demonstrated through several water flooding examples using synthetic reservoir models.
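The DCT-based reduced parameterization described in this abstract can be illustrated with a minimal sketch (this is not the thesis code; the field, grid size, and number of retained coefficients are invented for illustration): keeping only the lowest-frequency DCT coefficients of a smooth property field removes redundancy while preserving its large-scale structure.

```python
import numpy as np

def dct_matrix(n):
    # Orthonormal DCT-II basis (rows are basis vectors), so the
    # inverse transform is simply the transpose.
    k = np.arange(n)[:, None]
    m = np.arange(n)[None, :]
    C = np.sqrt(2.0 / n) * np.cos(np.pi * (m + 0.5) * k / n)
    C[0, :] /= np.sqrt(2.0)
    return C

def truncated_reconstruction(field, keep):
    # Keep only the `keep` lowest-frequency DCT coefficients: a reduced,
    # smoothness-preserving parameterization of the field.
    C = dct_matrix(field.size)
    coeffs = C @ field
    coeffs[keep:] = 0.0
    return C.T @ coeffs

# Smooth synthetic 1-D "property profile" (illustrative, not reservoir data)
x = np.linspace(0.0, 1.0, 64)
field = 2.0 + np.sin(2 * np.pi * x) + 0.3 * np.cos(4 * np.pi * x)

approx = truncated_reconstruction(field, keep=8)
rel_err = np.linalg.norm(field - approx) / np.linalg.norm(field)
```

With only 8 of 64 coefficients retained, the reconstruction error of this smooth profile stays small, which is the sense in which the transform eliminates redundancy while preserving the dominant features.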
by Behnam Jafarpour.
Ph.D.
APA, Harvard, Vancouver, ISO, and other styles
22

Sheinbaum, Julio. "Assimilation of oceanographic data in numerical models." Thesis, University of Oxford, 1989. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.238143.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Ward, Ben Andrew. "Marine ecosystem model analysis using data assimilation." Thesis, University of Southampton, 2009. https://eprints.soton.ac.uk/145089/.

Full text
Abstract:
Numerical modelling of the marine ecosystem requires the aggregation of diverse chemical and biological species into broad categories. To avoid large bias errors it is preferable to resolve as many explicit state variables and processes as possible. The cost of this increased complexity is greater uncertainty in model parameters and output. When comparing models, the importance of quantifying both bias error and the variability of unconstrained solutions was revealed as two marine ecosystem models were calibrated to data. Results demonstrated that all prior parameter information must include realistic error estimates if model uncertainty is to be quantified. Five simple ecosystem models were calibrated to observations from two North Atlantic sites: the Bermuda Atlantic Time-series Study (BATS) and the North Atlantic Bloom Experiment (NABE). Model-data misfits were reduced by between 45 and 50%. The addition of model complexity (a parameterised microbial loop, a variable chlorophyll a to nitrogen ratio and dissolved organic nitrogen) led to larger improvements in model performance at BATS relative to NABE. Calibrated parameter values developed at NABE performed better than the default parameter values when applied at BATS. Solutions developed at BATS performed worse than the default values at NABE. The models lacked sufficient ecological complexity to function well at BATS. Errors in the model were masked by errors in the calibrated parameters and the models did not perform well with regard to independent data. The models were well suited to reproducing the NABE data, and the calibrated models performed relatively well at BATS. The models were sensitive to the underlying physical forcing. Although the ecosystem models were originally calibrated within a poor representation of the physical environment at BATS, results from experiments using an improved physical model support the conclusion that the ecosystem models lacked the required complexity at that site.
APA, Harvard, Vancouver, ISO, and other styles
24

Ross, Natalie. "The dynamics of point-vortex data assimilation." Connect to online resource, 2008. http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqdiss&rft_dat=xri:pqdiss:3303865.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Nerger, Lars. "Parallel filter algorithms for data assimilation in oceanography." [S.l.] : [s.n.], 2004. http://deposit.ddb.de/cgi-bin/dokserv?idn=975524844.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Aksoy, Altug. "Mesoscale ensemble-based data assimilation and parameter estimation." Texas A&M University, 2005. http://hdl.handle.net/1969.1/2523.

Full text
Abstract:
The performance of the ensemble Kalman filter (EnKF) in forced, dissipative flow under imperfect model conditions is investigated through simultaneous state and parameter estimation where the source of model error is the uncertainty in the model parameters. Two numerical models with increasing complexity are used with simulated observations. For lower complexity, a two-dimensional, nonlinear, hydrostatic, non-rotating, and incompressible sea breeze model is developed with buoyancy and vorticity as the prognostic variables. Model resolution is 4 km horizontally and 50 m vertically. The ensemble size is set at 40. Forcing is maintained through an explicit heating function with additive stochastic noise. Simulated buoyancy observations on land surface with 40-km spacing are assimilated every 3 hours. Up to six model parameters are successfully subjected to estimation attempts in various experiments. The overall EnKF performance in terms of the error statistics is found to be superior to the worst-case scenario (when there is parameter error but no parameter estimation is performed) with an average error reduction in buoyancy and vorticity of 40% and 46%, respectively, for the simultaneous estimation of six parameters. The model chosen to represent the complexity of operational weather forecasting is the Pennsylvania State University-National Center for Atmospheric Research MM5 model with a 36-km horizontal resolution and 43 vertical layers. The ensemble size for all experiments is chosen as 40 and a 41st member is generated as the truth with the same ensemble statistics. Assimilations are performed with a 12-hour interval with simulated sounding and surface observations of horizontal winds and temperature. Only single-parameter experiments are performed focusing on a constant inserted into the code as the multiplier of the vertical eddy mixing coefficient. 
Estimation experiments produce very encouraging results: the mean estimated parameter value converges to the true value while exhibiting a satisfactory level of variability.
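The simultaneous state and parameter estimation strategy described above can be sketched with a toy augmented-state EnKF (a hedged illustration, not the sea breeze or MM5 setup; the scalar model, priors, and noise levels are invented): the uncertain parameter is appended to the state vector and corrected through its sampled correlation with the observed state.

```python
import numpy as np

rng = np.random.default_rng(0)
a_true, sigma_obs = 0.9, 0.1        # true parameter, obs error std
n_ens, n_cycles = 40, 60

# Augmented ensemble: column 0 = state x, column 1 = parameter a.
ens = np.column_stack([rng.normal(0.0, 1.0, n_ens),
                       rng.normal(0.5, 0.3, n_ens)])   # poor prior on a
x_true = 1.0

for _ in range(n_cycles):
    # Forecast: each member advances with its own parameter value; a small
    # random walk on the parameter guards against premature collapse.
    ens[:, 0] = ens[:, 1] * ens[:, 0] + 1.0
    ens[:, 1] += rng.normal(0.0, 0.01, n_ens)
    x_true = a_true * x_true + 1.0

    # Stochastic EnKF analysis with a single observation y = x + noise.
    y = x_true + rng.normal(0.0, sigma_obs)
    hx = ens[:, 0]
    anom = ens - ens.mean(axis=0)
    cov_zx = anom.T @ (hx - hx.mean()) / (n_ens - 1)   # cov of (x, a) with x
    gain = cov_zx / (cov_zx[0] + sigma_obs**2)
    y_pert = y + rng.normal(0.0, sigma_obs, n_ens)
    ens += np.outer(y_pert - hx, gain)

a_est = ens[:, 1].mean()   # ensemble-mean parameter estimate
```

Because the parameter is constant in the dynamics, the artificial random walk on its ensemble is a common practical device to keep enough parameter spread for the filter to continue updating it.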
APA, Harvard, Vancouver, ISO, and other styles
27

Dowd, Michael. "Assimilation of data into limited-area coastal models." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1997. http://www.collectionscanada.ca/obj/s4/f2/dsk3/ftp04/nq24773.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Torn, Ryan. "Using ensemble data assimilation for predictability and dynamics /." Thesis, Connect to this title online; UW restricted, 2007. http://hdl.handle.net/1773/10037.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Lundvall, Johan. "Data Assimilation in Fluid Dynamics using Adjoint Optimization." Doctoral thesis, Linköping : Matematiska institutionen, Linköpings universitet, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-9684.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Watkinson, Laura. "Four Dimensional Variational Data Assimilation for Hamiltonian Problems." Thesis, University of Reading, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.485506.

Full text
Abstract:
In this thesis we bring together two areas of mathematics: Hamiltonian dynamics and data assimilation. We construct a four dimensional variational (4d Var) data assimilation scheme for two Hamiltonian systems. This is to reflect the Hamiltonian behaviour observed in the atmosphere. We know, for example, that potential vorticity is conserved in atmospheric models. However, current data assimilation schemes do not explicitly include such physical relationships. In this thesis, by considering the two and three body problems, we demonstrate how such characteristic behaviour can be included in the data assimilation schemes. In our 4d Var schemes we add a weak constraint that imposes the conservation of the Hamiltonian, the total energy, at the initial time. This is effectively imposing an energy constraint from one data assimilation window to the next. Our results imply that these weak constraints affect the underlying geometry of the resulting data assimilation solution. We also demonstrate that this constraint reduces the error on this solution and the forecast. By imposing this constraint we are including additional information to the system. Due to the additional term in the cost function gradient, the analysis can only change in such a way as to satisfy this weak constraint. This thesis therefore demonstrates that the inclusion of similar weak constraints, perhaps using the conservation of potential vorticity, could improve the analysis and forecast for atmospheric models.
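Schematically, such a weak energy constraint adds a penalty term to the standard strong-constraint 4d Var cost function (the notation here is illustrative, not the thesis's own): with \(\mathcal{E}\) the Hamiltonian (total energy), \(\mathcal{E}_{\mathrm{ref}}\) its reference value, and \(\lambda\) the constraint weight,

```latex
J(x_0) = \tfrac{1}{2}\,(x_0 - x_b)^{\mathrm{T}} B^{-1} (x_0 - x_b)
       + \tfrac{1}{2}\sum_{i=0}^{N} \big(H_i\,\mathcal{M}_{0\to i}(x_0) - y_i\big)^{\mathrm{T}}
         R_i^{-1}\,\big(H_i\,\mathcal{M}_{0\to i}(x_0) - y_i\big)
       + \tfrac{\lambda}{2}\,\big(\mathcal{E}(x_0) - \mathcal{E}_{\mathrm{ref}}\big)^2
```

The extra gradient contribution \(\lambda\,(\mathcal{E}(x_0) - \mathcal{E}_{\mathrm{ref}})\,\nabla\mathcal{E}(x_0)\) is what restricts the analysis increments to directions compatible with energy conservation, consistent with the behaviour described in the abstract.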
APA, Harvard, Vancouver, ISO, and other styles
31

Johnson, Christine. "Information content of observations in variational data assimilation." Thesis, University of Reading, 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.288737.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Lal, Rajnesh. "Data assimilation and uncertainty quantification in cardiovascular biomechanics." Thesis, Montpellier, 2017. http://www.theses.fr/2017MONTS088/document.

Full text
Abstract:
Cardiovascular blood flow simulations can fill several critical gaps in current clinical capabilities. They offer non-invasive ways to quantify hemodynamics in the heart and major blood vessels of patients with cardiovascular diseases that cannot be obtained directly from medical imaging. Patient-specific simulations (incorporating data unique to the individual) enable individualised risk prediction and provide key insights into disease progression and the detection of physiologic abnormalities. They also provide means to systematically design and test new medical devices, and are used as predictive tools for surgical and personalised treatment planning, thus aiding clinical decision-making. Patient-specific predictive simulations require effective assimilation of medical data for reliable simulated predictions. This is usually achieved by the solution of an inverse hemodynamic problem, where uncertain model parameters are estimated using techniques for merging data and numerical models known as data assimilation methods. In this thesis, the inverse problem is solved through a data assimilation method using an ensemble Kalman filter (EnKF) for parameter estimation. By using an ensemble Kalman filter, the solution also comes with a quantification of the uncertainties for the estimated parameters. An ensemble Kalman filter-based parameter estimation algorithm is proposed for patient-specific hemodynamic computations in a schematic arterial network from uncertain clinical measurements. Several in silico scenarios (using synthetic data) are considered to investigate the efficiency of the parameter estimation algorithm using EnKF. The usefulness of the parameter estimation algorithm is also assessed using experimental data from an in vitro test rig and real clinical data from a volunteer (patient-specific case).
The proposed algorithm is evaluated on arterial networks which include single arteries, cases of bifurcation, a simple human arterial network and a complex arterial network including the circle of Willis. The ultimate aim is to perform patient-specific hemodynamic analysis in the network of the circle of Willis. Common hemodynamic properties (parameters), like arterial wall properties (Young's modulus, wall thickness, and viscoelastic coefficient) and terminal boundary parameters (reflection coefficient and Windkessel model parameters) are estimated as the solution to an inverse problem using time series pressure values and blood flow rate as measurements. It is also demonstrated that a proper reduced order zero-dimensional compartment model can lead to a simple and reliable estimation of blood flow features in the circle of Willis. The simulations with the estimated parameters capture target pressure or flow rate waveforms at given specific locations.
APA, Harvard, Vancouver, ISO, and other styles
33

Chatdarong, Virat 1978. "Multi-sensor rainfall data assimilation using ensemble approaches." Thesis, Massachusetts Institute of Technology, 2006. http://hdl.handle.net/1721.1/35493.

Full text
Abstract:
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Civil and Environmental Engineering, 2006.
Includes bibliographical references (p. 195-203).
Rainfall is a major process transferring water mass and energy from the atmosphere to the surface. Rainfall data is needed over large scales for improved understanding of the Earth climate system. Although there are many instruments for measuring rainfall, none of them can provide continuous global coverage at fine spatial and temporal resolutions. This thesis proposes an efficient methodology for obtaining a probabilistic characterization of rainfall over an extended time period and spatial domain. The characterization takes the form of an ensemble of rainfall replicates, each conditioned on multiple measurement sources. The conditional replicates are obtained from ensemble data assimilation algorithms (Kalman filters and smoothers) based on a recursive cluster rainfall model. Satellite measurements of cloud-top temperatures are used to identify areas where rainfall can possibly occur. A variational field alignment algorithm is used to estimate rainfall advective velocity field from successive cloud-top temperature images. A stable pseudo-inverse improves the stability of the algorithms when the ensemble size is small. The ensemble data assimilation is implemented over the United States Great Plains during the summer of 2004.
It combines surface rain-gauge data with three satellite-based instruments. The ensemble output is then validated with ground-based radar precipitation product. The recursive rainfall model is simple, fast and reliable. In addition, the ensemble Kalman filter and smoother are practical for a very large-scale data assimilation problem with a limited ensemble size. Finally, this thesis describes a multi-scale recursive algorithm for estimating scaling parameters for popular multiplicative cascade rainfall models. In addition, this algorithm can be used to merge static rainfall data from multiple sources.
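The "stable pseudo-inverse" mentioned in the abstract is, in spirit, a truncated-SVD pseudo-inverse; a minimal sketch (dimensions, tolerance, and the random test matrix are invented) shows why it stays bounded when an ensemble-derived matrix is rank deficient, as happens whenever the ensemble size is smaller than the state dimension.

```python
import numpy as np

def stable_pinv(A, rtol=1e-6):
    # Truncated-SVD pseudo-inverse: singular values below rtol * s_max
    # are treated as exactly zero instead of being inverted, which keeps
    # the result bounded for (numerically) rank-deficient matrices.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    keep = s > rtol * s[0]
    return Vt[keep].T @ np.diag(1.0 / s[keep]) @ U[:, keep].T

# A rank-deficient "ensemble covariance": 5 members in a 10-dim space,
# so the ordinary inverse does not exist.
rng = np.random.default_rng(1)
X = rng.normal(size=(10, 5))
P = X @ X.T / (5 - 1)

Pinv = stable_pinv(P)
```

The result satisfies the Moore-Penrose identities (e.g. P @ Pinv @ P == P up to round-off), which is all a Kalman-type analysis step needs from the inverse.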
by Virat Chatdarong.
Ph.D.
APA, Harvard, Vancouver, ISO, and other styles
34

Dutt, Arkopal. "High order stochastic transport and Lagrangian data assimilation." Thesis, Massachusetts Institute of Technology, 2018. http://hdl.handle.net/1721.1/115663.

Full text
Abstract:
Thesis: S.M., Massachusetts Institute of Technology, Department of Mechanical Engineering, 2018.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 103-113).
Ocean currents transport a variety of natural (e.g. water masses, phytoplankton, zooplankton, sediments, etc.) and man-made materials (e.g. pollutants, floating debris, particulate matter, etc.). Understanding such uncertain Lagrangian transport is imperative for reducing environmental damage due to natural hazards and for allowing rigorous risk analysis and effective search and rescue. While secondary variables and trajectories have classically been used for the analyses of such transports, Lagrangian Coherent Structures (LCSs) provide a robust and objective description of the important material lines. To ensure accurate and useful Lagrangian hazard scenario predictions and prevention, the first goal of this thesis is to obtain accurate probabilistic prediction of the underlying stochastic velocity fields using the Dynamically Orthogonal (DO) approach. The second goal is to merge data from both Eulerian and Lagrangian observations with predictions such that the whole information content of observations is utilized. In the first part of this thesis, we develop high-order numerical schemes for the DO equations that ensure efficiency, accuracy, stability, and consistency between the Monte Carlo (MC) and DO solutions. We discuss the numerical challenges in applying the DO equations to the unsteady stochastic Navier-Stokes equations. In order to maintain consistent evaluation of advection terms, we utilize linear centered advection schemes with fully explicit and linear Shapiro filters. We then discuss how to combine the semi-implicit projection method with new high order implicit-explicit (IMEX) linear multi-step and multistage IMEX-RK time marching schemes for the coupled DO equations to ensure further stability and accuracy. We also review efficient numerical re-orthonormalization strategies during time marching. 
We showcase our results with test cases of stochastic passive tracer advection in a deterministic swirl flow, stochastic flow past a cylinder, and stochastic lid-driven cavity flow. We show that our schemes improve the consistency between reconstructed DO realizations and the corresponding MC realizations, and that we achieve the expected order of accuracy. In the second part of the work, we first undertake a study of different Lagrangian instruments and outline how the DO methodology can be applied to obtain Lagrangian variables of stochastic flow maps and LCS in uncertain flows. We then review existing methods for Bayesian Lagrangian data assimilation (DA). Disadvantages of earlier methods include the use of approximate measurement models to directly link Lagrangian variables with Eulerian variables, the challenges in respecting the Lagrangian nature of variables, and the assumptions of linearity or of Gaussian statistics during prediction or assimilation. To overcome these, we discuss how the Gaussian Mixture Model (GMM) DO Filter can be extended to fully coupled Eulerian-Lagrangian data assimilation. We define an augmented state vector of the Eulerian and Lagrangian state variables that directly exploits the full mutual information and complete the Bayesian DA in the joint Eulerian-Lagrangian stochastic subspace. Results of such coupled Eulerian-Lagrangian DA are discussed using test cases based on a double gyre flow with random frequency.
by Arkopal Dutt.
S.M.
APA, Harvard, Vancouver, ISO, and other styles
35

Gou, Tianyi. "Computational Tools for Chemical Data Assimilation with CMAQ." Thesis, Virginia Tech, 2010. http://hdl.handle.net/10919/31017.

Full text
Abstract:
The Community Multiscale Air Quality (CMAQ) system is the Environmental Protection Agency's main modeling tool for atmospheric pollution studies. CMAQ-ADJ, the adjoint model of CMAQ, offers new analysis capabilities such as receptor-oriented sensitivity analysis and chemical data assimilation. This thesis presents the construction, validation, and properties of new adjoint modules in CMAQ, and illustrates their use in sensitivity analyses and data assimilation experiments. The new discrete adjoint module for advection is implemented with the aid of the automatic differentiation tool TAMC and is fully validated by comparing the adjoint sensitivities with finite difference values. In addition, adjoint sensitivities with respect to boundary conditions and boundary condition scaling factors are developed and validated in CMAQ. To investigate numerically the impact of the continuous and discrete advection adjoints on data assimilation, various four dimensional variational (4D-Var) data assimilation experiments are carried out with the 1D advection PDE, and with CMAQ advection using synthetic and real observation data. The results show that the optimization procedure gives better estimates of the reference initial condition and converges faster when using gradients computed by the continuous adjoint approach. This counter-intuitive result is explained using the nonlinearity properties of the piecewise parabolic method (the numerical discretization of advection in CMAQ). Data assimilation experiments are carried out using real observation data. The simulation domain encompasses Texas and the simulation period is August 30 to September 1, 2006. Data assimilation is used to improve both initial and boundary conditions. These experiments further validate the tools developed in this thesis.
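The adjoint-validation procedure mentioned above (comparing adjoint sensitivities against finite differences) can be sketched on a toy linear advection scheme, where the discrete adjoint of one step is just the transpose of the forward matrix (an illustration only, not the CMAQ code; the grid, cost function, and Courant number are invented):

```python
import numpy as np

n, steps, c = 20, 10, 0.4      # grid points, time steps, Courant number

# First-order upwind advection on a periodic grid, written as a matrix,
# so the exact discrete adjoint of one step is simply A.T.
A = (1.0 - c) * np.eye(n) + c * np.roll(np.eye(n), 1, axis=0)

def forward(x0):
    x = x0.copy()
    for _ in range(steps):
        x = A @ x
    return x

rng = np.random.default_rng(2)
x0 = rng.normal(size=n)
y = rng.normal(size=n)          # synthetic "observations"

def cost(x):
    return 0.5 * np.sum((forward(x) - y) ** 2)

# Adjoint run: propagate the final-time misfit backwards via transposes.
lam = forward(x0) - y
for _ in range(steps):
    lam = A.T @ lam
grad_adjoint = lam

# Finite-difference check of one gradient component (k is arbitrary).
eps, k = 1e-6, 3
e = np.zeros(n); e[k] = eps
grad_fd = (cost(x0 + e) - cost(x0 - e)) / (2.0 * eps)
```

Agreement of `grad_fd` with `grad_adjoint[k]` to round-off level is the basic correctness check behind the "fully validated by comparing the adjoint sensitivities with finite difference values" statement.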
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
36

Farchi, Alban. "On the localisation of ensemble data assimilation methods." Thesis, Paris Est, 2019. http://www.theses.fr/2019PESC1034.

Full text
Abstract:
Data assimilation is the mathematical discipline which gathers all the methods designed to improve the knowledge of the state of a dynamical system using both observations and modelling results of this system. In the geosciences, data assimilation is mainly applied to numerical weather prediction. It has been used in operational centres for several decades, and it has significantly contributed to the increase in quality of the forecasts. Ensemble methods are powerful tools to reduce the dimension of the data assimilation systems. Currently, the two most widespread classes of ensemble data assimilation methods are the ensemble Kalman filter (EnKF) and the particle filter (PF). The success of the EnKF in high-dimensional geophysical systems is largely due to the use of localisation. Localisation is based on the assumption that correlations between state variables in a dynamical system decrease rapidly with distance. In this thesis, we have studied and improved localisation methods for ensemble data assimilation. The first part is dedicated to the implementation of localisation in the PF. The recent developments in local particle filtering are reviewed, and a generic and theoretical classification of local PF algorithms is introduced, with an emphasis on the advantages and drawbacks of each category. Alongside the classification, practical solutions to the difficulties of local particle filtering are suggested. The local PF algorithms are tested and compared using twin experiments with low- to medium-order systems. Finally, we consider the case study of the prediction of the tropospheric ozone using concentration measurements. Several data assimilation algorithms, including local PF algorithms, are applied to this problem and their performances are compared. The second part is dedicated to the implementation of covariance localisation in the EnKF. 
We show how covariance localisation can be efficiently implemented in the deterministic EnKF using an augmented ensemble. The proposed algorithm is tested using twin experiments with a medium-order model and satellite-like observations. Finally, the consistency of the deterministic EnKF with covariance localisation is studied in detail. A new implementation is proposed and compared to the original one using twin experiments with low-order models.
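Covariance localisation as discussed here is commonly implemented as a Schur (element-wise) product of the sample covariance with a compactly supported correlation function such as Gaspari-Cohn; a minimal sketch (grid, ensemble size, and localisation radius are invented) shows how it damps the spurious long-range correlations of a small ensemble.

```python
import numpy as np

def gaspari_cohn(dist, c):
    # Gaspari-Cohn fifth-order piecewise rational function: a compactly
    # supported correlation that vanishes beyond distance 2c.
    r = np.abs(dist) / c
    f = np.zeros_like(r, dtype=float)
    m1 = r <= 1.0
    f[m1] = (-0.25*r[m1]**5 + 0.5*r[m1]**4 + 0.625*r[m1]**3
             - (5/3)*r[m1]**2 + 1.0)
    m2 = (r > 1.0) & (r <= 2.0)
    f[m2] = ((1/12)*r[m2]**5 - 0.5*r[m2]**4 + 0.625*r[m2]**3
             + (5/3)*r[m2]**2 - 5.0*r[m2] + 4.0 - 2.0/(3.0*r[m2]))
    return f

n, n_ens = 40, 10
rng = np.random.default_rng(3)

# True covariance: correlation decaying with periodic grid distance.
idx = np.arange(n)
dist = np.abs(idx[:, None] - idx[None, :])
dist = np.minimum(dist, n - dist)
P_true = np.exp(-dist / 3.0)

# Small ensemble drawn from the true covariance -> noisy sample covariance.
members = np.linalg.cholesky(P_true) @ rng.normal(size=(n, n_ens))
anom = members - members.mean(axis=1, keepdims=True)
P_samp = anom @ anom.T / (n_ens - 1)

# Schur-product localisation tapers the spurious long-range entries.
P_loc = gaspari_cohn(dist, c=5.0) * P_samp

err_raw = np.linalg.norm(P_samp - P_true)
err_loc = np.linalg.norm(P_loc - P_true)
```

For small ensembles, the localised covariance is typically a markedly better estimate of the true covariance than the raw sample covariance, which is the effect the thesis builds on.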
APA, Harvard, Vancouver, ISO, and other styles
37

Rosenthal, William Steven. "Data Assimilation In Systems With Strong Signal Features." Diss., The University of Arizona, 2014. http://hdl.handle.net/10150/339055.

Full text
Abstract:
Filtering problems in high dimensional geophysical applications often require spatially continuous models to interpolate spatially and temporally sparse data. Many applications in numerical weather and ocean state prediction are concerned with tracking and assessing the uncertainty in the position of large scale vorticity features, such as storm fronts, jet streams, and hurricanes. Quantifying the amplitude variance in these features is complicated by the fact that both height and lateral perturbations in the feature geometry are represented in the same covariance estimate. However, when there are sufficient observations to detect feature information like spatial gradients, the positions of these features can be used to further constrain the filter, as long as the statistical model (cost function) has provisions for both height perturbations and lateral displacements. Several authors since the 1990s have proposed various formalisms for the simultaneous modeling of position and amplitude errors, and the typical approaches to computing the generalized solutions in these applications are variational or direct optimization. The ensemble Kalman filter is often employed in large scale nonlinear filtering problems, but its predication on Gaussian statistics causes its estimators to suffer from analysis deflation or collapse, as well as the usual curse of dimensionality in high dimensional Monte Carlo simulations. Moreover, there is no theoretical guarantee of the performance of the ensemble Kalman filter with nonlinear models. Particle filters, which employ importance sampling to focus attention on the important regions of the likelihood, have shown promise in recent studies on the control of particle size. Consider an ensemble forecast of a system with prominent feature information. 
The correction of displacements in these features, by pushing them into better agreement with observations, is an application of importance sampling, and Monte Carlo methods, including particle filters, and possibly the ensemble Kalman filter as well, are well suited to applications of feature displacement correction. In the present work, we show that the ensemble Kalman filter performs well in problems where large features are displaced both in amplitude and position, as long as it is used on a statistical model which includes both function height and local position displacement in the model state. In a toy model, we characterize the performance-degrading effect that untracked displacements have on filters when large features are present. We then employ tools from classical physics and fluid dynamics to statistically model displacements by area-preserving coordinate transformations. These maps preserve the area of contours in the displaced function, and using strain measures from continuum mechanics, we regularize the statistics on these maps to ensure they model smooth, feature-preserving displacements. The position correction techniques are incorporated into the statistical model, and this modified ensemble Kalman filter is tested on a system of vortices driven by a stochastically forced barotropic vorticity equation. We find that when the position correction term is included in the statistical model, the modified filter provides estimates which exhibit substantial reduction in analysis error variance, using a much smaller ensemble than what is required when the position correction term is removed from the model.
APA, Harvard, Vancouver, ISO, and other styles
38

Bulygina, Nataliya. "Model Structure Estimation and Correction Through Data Assimilation." Diss., The University of Arizona, 2007. http://hdl.handle.net/10150/195345.

Full text
Abstract:
The main philosophy underlying this research is that a model should constitute a representation of both what we know and what we do not know about the structure and behavior of a system. In other words, it should summarize, as far as possible, both our degree of certainty and degree of uncertainty, so that it facilitates statements about prediction uncertainty arising from model structural uncertainty. Based on this philosophy, the following issues were explored in the dissertation:
- Identification of a hydrologic system model based on assumptions about the perceptual and conceptual model structure only, without strong additional assumptions about its mathematical structure
- Development of a novel data assimilation method for extracting mathematical relationships between modeled variables using a Bayesian probabilistic framework, as an alternative to up-scaling of governing equations
- Evaluation of the uncertainty in predicted system response arising from three uncertainty types:
  o uncertainty caused by initial conditions
  o uncertainty caused by inputs
  o uncertainty caused by mathematical structure
- Merging of theory and data to identify a system, as an alternative to parameter calibration and state-updating approaches
- Possibility of correcting existing models and including descriptions of uncertainty about their mapping relationships using the proposed method
- Investigation of a simple hydrological conceptual mass balance model with two-dimensional input, one-dimensional state and two-dimensional output at watershed scale and different temporal scales using the method
APA, Harvard, Vancouver, ISO, and other styles
39

Yan, Hanjun. "Numerical methods for data assimilation in weather forecasting." HKBU Institutional Repository, 2018. https://repository.hkbu.edu.hk/etd_oa/555.

Full text
Abstract:
Data assimilation plays an important role in weather forecasting. Its purpose is to provide a more accurate atmospheric state for future forecasts. The existing methods in this field fall into two categories: statistical data assimilation and variational data assimilation. This thesis focuses mainly on variational data assimilation. The original objective function of three-dimensional variational data assimilation (3D-VAR) consists of two terms: the difference between the previous forecast and the analysis, and the difference between the observations and the analysis in observation space. Considering the inaccuracy of previous forecasting results, we replace the first term by the difference between the previous forecast gradients and the analysis gradients. The associated data fitting term can be interpreted as using a second-order finite difference matrix as the inverse of the background error covariance matrix in the 3D-VAR setting. In our approach, it is therefore not necessary to estimate the background error covariance matrix or to deal with its inverse in the 3D-VAR algorithm. The existence and uniqueness of the analysis solution of the proposed objective function are established, and the solution can be calculated iteratively using the conjugate gradient method. We present experimental results based on WRF simulations, and show that the performance of this forecast gradient based DA model is better than that of 3D-VAR. Next, we propose another optimization model for variational data assimilation, in which the second term of the 3D-VAR cost function is replaced by a tensor completion term for the analysis. This model is motivated by the small number of observations compared with the large number of grid points. Applying the alternating direction method of multipliers to solve this optimization problem, we conduct numerical experiments on real data.
The results show that this tensor completion based DA model is competitive in prediction accuracy with 3D-VAR and the forecast gradient based DA model. Since 3D-VAR and the two models proposed above lack temporal information, we construct a third model in four-dimensional space. To include temporal information, this model builds on the second proposed model and introduces total variation to describe the change of the atmospheric state. Again, we solve the resulting problem with the alternating direction method of multipliers. One set of experimental results shows a positive performance: the prediction accuracy of our third model is better than that of 3D-VAR, the forecast gradient based DA model, and the tensor completion based DA model. In the other sets of experimental results, this model still outperforms 3D-VAR and the forecast gradient based DA model, but its prediction accuracy is slightly lower than that of the tensor completion based model.
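For context, the standard 3D-VAR cost function that these models modify can be written, in common notation (our notation, not necessarily the thesis's), as

```latex
J(\mathbf{x}) \;=\; \tfrac{1}{2}\,(\mathbf{x}-\mathbf{x}_b)^{\mathrm{T}}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
\;+\; \tfrac{1}{2}\,\big(\mathbf{y}-H(\mathbf{x})\big)^{\mathrm{T}}\mathbf{R}^{-1}\big(\mathbf{y}-H(\mathbf{x})\big),
```

where \(\mathbf{x}_b\) is the background (previous forecast), \(\mathbf{y}\) the observations, \(H\) the observation operator, and \(\mathbf{B}\), \(\mathbf{R}\) the background- and observation-error covariances. In the forecast gradient variant described above, the first term instead penalizes differences of gradients, which is equivalent to replacing \(\mathbf{B}^{-1}\) by a second-order finite difference (Laplacian-like) operator, so \(\mathbf{B}\) never needs to be estimated or inverted explicitly.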
APA, Harvard, Vancouver, ISO, and other styles
40

DeChant, Caleb Matthew. "Hydrologic Data Assimilation: State Estimation and Model Calibration." PDXScholar, 2010. https://pdxscholar.library.pdx.edu/open_access_etds/172.

Full text
Abstract:
This thesis is a combination of two separate studies which examine hydrologic data assimilation techniques: 1) to determine the applicability of assimilating remotely sensed data in operational models, and 2) to compare the effectiveness of assimilation and other calibration techniques. The first study examines the ability of data assimilation of remotely sensed microwave radiance data to improve snow water equivalent (SWE) prediction, and ultimately operational streamflow forecasts. Operational streamflow forecasts in the National Weather Service River Forecast Center are produced with a coupled SNOW17 (snow model) and SACramento Soil Moisture Accounting (SAC-SMA) model. A comparison of two assimilation techniques, the Ensemble Kalman Filter (EnKF) and the Particle Filter (PF), is made using SNOW17 coupled with the Microwave Emission Model for Layered Snowpack to assimilate microwave radiance data. Microwave radiance data, in the form of brightness temperature (TB), are gathered from the Advanced Microwave Scanning Radiometer-Earth Observing System at the 36.5 GHz channel. SWE prediction is validated in a synthetic experiment. The distribution of snowmelt from an experiment with real data is then used to run the SAC-SMA model. Several scenarios of state or joint state-parameter updating with TB data assimilation into the SNOW17 and SAC-SMA models were analyzed, and the results show potential benefit for operational streamflow forecasting. The second study compares the effectiveness of different calibration techniques in hydrologic modeling. Currently, the most commonly used methods for hydrologic model calibration are global optimization techniques. While these techniques have become very efficient and effective at optimizing the complicated parameter space of hydrologic models, the uncertainty with respect to parameters is ignored.
This has led to recent research into Bayesian inference through Monte Carlo methods, to analyze the ability to calibrate models and to represent the uncertainty in the parameters. Research has recently been performed on filtering and Markov Chain Monte Carlo (MCMC) techniques for the optimization of hydrologic models. To date, a comparison of the effectiveness of global optimization, filtering, and MCMC techniques has yet to be reported in the hydrologic modeling community. This study compares global optimization, MCMC, the PF, the Particle Smoother, the EnKF, and the Ensemble Kalman Smoother for the purpose of parameter estimation in both the HyMod and SAC-SMA hydrologic models.
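Of the filters compared above, the bootstrap particle filter is perhaps the simplest to sketch. The random-walk model, Gaussian likelihood, and numbers below are illustrative assumptions, not the HyMod or SAC-SMA setup used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)

def bootstrap_pf_step(particles, y_obs, propagate, likelihood):
    """One assimilation cycle of a bootstrap (SIR) particle filter.

    particles  : (N,) array of state particles
    y_obs      : observation for this cycle
    propagate  : stochastic model step, particles -> particles
    likelihood : p(y_obs | particle), vectorized over particles
    """
    particles = propagate(particles)          # forecast each particle
    w = likelihood(particles, y_obs)          # importance weights
    w /= w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]                     # resample with replacement

# toy example: random-walk state, Gaussian observation error (variance 0.25)
N = 500
particles = rng.normal(0.0, 2.0, N)                          # diffuse prior
propagate = lambda x: x + rng.normal(0.0, 0.1, x.shape)      # model noise
likelihood = lambda x, y: np.exp(-0.5 * (y - x) ** 2 / 0.25)
particles = bootstrap_pf_step(particles, 1.0, propagate, likelihood)
```

After assimilating the observation at 1.0, the particle cloud concentrates near the Bayesian posterior mean, which is what makes the method attractive for representing parameter uncertainty rather than a single optimum.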
APA, Harvard, Vancouver, ISO, and other styles
41

Zoccarato, Claudia. "Data Assimilation in Geomechanics: Characterization of Hydrocarbon Reservoirs." Doctoral thesis, Università degli studi di Padova, 2016. http://hdl.handle.net/11577/3424496.

Full text
Abstract:
The prediction of the stress field distribution induced by pore pressure changes in deep hydrocarbon reservoirs, and the consequent compaction of the porous rock formation, is modeled with the aid of a Finite Element (FE) geomechanical model. Despite the reliability of the model, which has been tested in several previous applications, many sources of uncertainty may affect the model outcome in terms of ground surface displacements. The uncertainties are mainly related to the mathematical model itself, which is an approximation of a real and complex system, the initial and boundary conditions, the forcing terms, and the model parameters. The latter are the physical properties of the reservoir, which are usually poorly known a priori. A deterministic estimation of these parameters is discouraged, as several parameter combinations may reproduce the observed data equally well. Instead, the reservoir characterization is performed here with a stochastic approach that also provides a quantification of the uncertainties affecting the parameter calibration. For this purpose, an ensemble-based data assimilation algorithm, the Ensemble Smoother, is selected from the approaches available in the literature. The methodology is investigated and tested both in synthetic cases and in real applications by assimilating the available observations from in-situ measurements of ground-surface displacements. The characterization of the reservoir rock properties is provided for an Underground Gas Storage (UGS) reservoir and for an offshore producing gas reservoir. Different sets of parameters are estimated depending on the information available for the different fields.
The parameters of a transversely isotropic model are calibrated using horizontal and vertical displacements from Persistent Scatterer Interferometry (PSI) measured above the UGS field, while vertical displacements from a time-lapse bathymetry are used to calibrate the uniaxial vertical compressibility of an isotropic constitutive law characterizing the behaviour of the offshore gas reservoir. In general, a satisfactory estimation of the geomechanical parameters is obtained, with a significant spread reduction of the prior probability distributions, when synthetic measurements, i.e., displacements generated by an independent model run, are assimilated. However, more difficulties are encountered with real observations. This study gives indications of the main factors influencing geomechanical characterization when assimilating movements of the land surface. The numerical results underline the importance of consistency between the forward model and the assimilated measurements, with an appropriate selection of the data necessary to eliminate potential biases in the measurements and/or the modeling procedure.
The stress state induced by pressure changes in deep reservoirs, and the consequent compaction of the geological formations, are simulated with the aid of a Finite Element (FEM) geomechanical model. Over the past decades, this model has been used in many applications; nevertheless, the uncertainties introduced in the modeling are numerous and can significantly affect the model response in terms of surface displacements. The uncertainties are mainly related to the simplification intrinsic to the modeling process, to the poorly known initial and boundary conditions, to the external forcings, and to the model parameters, i.e., the physical properties of the reservoir, usually not known a priori. In this thesis, the estimation of the latter is obtained through the development and implementation of probabilistic methodologies that also quantify the degree of uncertainty associated with the estimated model parameters. For this purpose, the so-called Ensemble Smoother is used, a data assimilation algorithm based on a Monte Carlo approach. The proposed methodology has been applied and tested on both synthetic and real cases by assimilating surface displacement data measured in situ. The geomechanical parameters have been estimated for two specific reservoirs: the first is a methane gas storage site, while the second is an offshore site used for gas extraction. In the two cases, different constitutive laws were used to describe the geomechanical behaviour of the reservoir, based on the observations available in the two fields of interest. In one case, the parameters of a transversely isotropic model were estimated using satellite interferometric measurements of both horizontal and vertical surface displacement available at the storage site.
In the other case, a simpler isotropic constitutive law was calibrated at the offshore site, where the available observations provide only the vertical component of displacement, estimated from a differential bathymetry map. In the synthetic tests, the methodology was shown to estimate the geomechanical parameters satisfactorily, with a considerable reduction of the uncertainty initially assumed for the parameters involved. However, their estimation is more difficult in the real cases, where the discrepancy between the FEM model results and the assimilated measurements may suggest a preliminary selection of the available measurements in order to eliminate potentially evident errors in the measurements themselves.
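The Ensemble Smoother update used in this kind of parameter calibration can be written, in common notation (our notation, not necessarily the thesis's), as

```latex
\mathbf{X}^{a} \;=\; \mathbf{X}^{f} \;+\; \mathbf{C}^{f}_{XY}\,\bigl(\mathbf{C}^{f}_{YY} + \mathbf{R}\bigr)^{-1}\,\bigl(\mathbf{D} - \mathbf{Y}^{f}\bigr),
```

where the columns of \(\mathbf{X}^{f}\) are the prior parameter realizations, \(\mathbf{Y}^{f}\) their predicted surface displacements, \(\mathbf{C}^{f}_{XY}\) and \(\mathbf{C}^{f}_{YY}\) the ensemble cross- and auto-covariances, \(\mathbf{R}\) the measurement-error covariance, and \(\mathbf{D}\) holds perturbed copies of the observed displacements. Unlike sequential filters, all observations are assimilated in a single global update, which suits parameter (rather than state) estimation.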
APA, Harvard, Vancouver, ISO, and other styles
42

Hasegawa, Takanori. "Reconstructing Biological Systems Incorporating Multi-Source Biological Data via Data Assimilation Techniques." 京都大学 (Kyoto University), 2015. http://hdl.handle.net/2433/195985.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Žagar, Nedjeljka. "Dynamical aspects of atmospheric data assimilation in the tropics." Doctoral thesis, Stockholm University, Department of Meteorology, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-111.

Full text
Abstract:

A faithful depiction of the tropical atmosphere requires three-dimensional sets of observations. Despite the increasing amount of observations presently available, these will hardly ever encompass the entire atmosphere and, in addition, observations have errors. Additional (background) information will always be required to complete the picture. Valuable added information comes from the physical laws governing the flow, usually mediated via a numerical weather prediction (NWP) model. These models are, however, never going to be error-free, which is why a reliable estimate of their errors poses a real challenge, since the whole truth will never be within our grasp.

The present thesis addresses the question of improving the analysis procedures for NWP in the tropics. Improvements are sought by addressing the following issues:

- the efficiency of the internal model adjustment,

- the potential of the reliable background-error information, as compared to observations,

- the impact of a new, space-borne line-of-sight wind measurements, and

- the usefulness of multivariate relationships for data assimilation in the tropics.

Most NWP assimilation schemes are effectively univariate near the equator. In this thesis, a multivariate formulation of the variational data assimilation in the tropics has been developed. The proposed background-error model supports the mass-wind coupling based on convectively-coupled equatorial waves. The resulting assimilation model produces balanced analysis increments and hereby increases the efficiency of all types of observations.

Idealized adjustment and multivariate analysis experiments highlight the importance of direct wind measurements in the tropics. In particular, the presented results confirm the superiority of wind observations compared to mass data, in spite of the exact multivariate relationships available from the background information. The internal model adjustment is also more efficient for wind observations than for mass data.

In accordance with these findings, new satellite wind observations are expected to contribute towards the improvement of NWP and climate modeling in the tropics. Although incomplete, the new wind-field information has the potential to reduce uncertainties in the tropical dynamical fields, if used together with the existing satellite mass-field measurements.

The results obtained by applying the new background-error representation to the tropical short-range forecast errors of a state-of-art NWP model suggest that achieving useful tropical multivariate relationships may be feasible within an operational NWP environment.

APA, Harvard, Vancouver, ISO, and other styles
44

Žagar, Nedjeljka. "Dynamical aspects of atmospheric data assimilation in the tropics /." Stockholm : Meteorologiska institutionen, Univ, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-111.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Weaver, Anthony T. "Variational data assimilation in numerical models of the ocean." Thesis, University of Oxford, 1994. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.240442.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Partridge, Dale. "Numerical modelling of glaciers : moving meshes and data assimilation." Thesis, University of Reading, 2013. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.602408.

Full text
Abstract:
In this thesis we consider the solution of dynamical ice flow equations using a combination of a moving mesh method and data assimilation. We show that by moving the mesh, the approximation to the ice thickness profile is improved and the location of the domain boundary is significantly better estimated. The method used is derived by utilising a relative mass conservation principle to define a net deformation velocity comprising the internal diffusion of ice and the effect of accumulation or ablation. We use a finite difference numerical approximation in one dimension and a finite element approximation in two dimensions to demonstrate the ability of the methods to simulate different aspects of ice flow. In particular, we focus on the accurate representation of the moving front of the glacier without the need for an interpolation procedure. Results are shown to compare favourably with exact, steady state solutions, while demonstrating improvements over traditional fixed grid methods. The impact of the internal diffusion of the ice on the movement of the glacier front is analysed, and a condition on the local profile near the boundary is constructed to determine when the front is moving as a result of diffusion rather than accumulation or ablation. We utilise the technique of data assimilation to combine the moving mesh method with observational information to obtain a statistically best estimate of the ice thickness profile. In a moving mesh environment there are differences to the scheme, which we detail in both one and two dimensions. We introduce an extension to our data assimilation scheme to directly include the numerical mesh within the update. This allows for the potential inclusion of observations of key features, such as the location of the boundary. We demonstrate the improvement that this extension brings to our prediction of the domain in one dimension, and discuss the challenges encountered when applying it in two dimensions.
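A minimal sketch of the relative mass conservation principle mentioned above, in one dimension with ice thickness h(x, t) and moving node x_i(t); this is the generic conservation-based moving-mesh construction as we read it, not necessarily the thesis's exact formulation:

```latex
% each node x_i(t) retains a fixed fraction c_i of the total mass \theta(t):
\int_{0}^{x_i(t)} h(x,t)\,dx \;=\; c_i\,\theta(t),
\qquad
\theta(t) \;=\; \int_{\Omega(t)} h(x,t)\,dx .
% differentiating in time with the Leibniz rule yields the node velocity:
h(x_i,t)\,\dot{x}_i \;=\; c_i\,\dot{\theta}(t) \;-\; \int_{0}^{x_i(t)} \frac{\partial h}{\partial t}\,dx .
```

Because the nodes carry fixed mass fractions, the outermost node tracks the glacier front directly, which is why no interpolation is needed at the moving boundary.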
APA, Harvard, Vancouver, ISO, and other styles
47

Chakraborty, Soham. "DATA ASSIMILATION AND VISUALIZATION FOR ENSEMBLE WILDLAND FIRE MODELS." UKnowledge, 2008. http://uknowledge.uky.edu/gradschool_theses/529.

Full text
Abstract:
This thesis describes an observation function for a dynamic data driven application system designed to produce short range forecasts of the behavior of a wildland fire. The thesis presents an overview of the atmosphere-fire model, which models the complex interactions between the fire and the surrounding weather, and of the data assimilation module, which is responsible for assimilating sensor information into the model. Observations play an important role in data assimilation, as they are used to estimate the model variables at the sensor locations. Also described is the implementation of a portable and user-friendly visualization tool which displays the locations of wildfires in the Google Earth virtual globe.
APA, Harvard, Vancouver, ISO, and other styles
48

Martin, Matthew J. "Data assimilation in ocean circulation models with systematic errors." Thesis, University of Reading, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.365425.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Griffith, Anne K. "Data assimilation for numerical weather prediction using control theory." Thesis, University of Reading, 1997. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.363416.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Williams, John K. (John Kenneth). "WRF-Var implementation for data assimilation experimentation at MIT." Thesis, Massachusetts Institute of Technology, 2008. http://hdl.handle.net/1721.1/45784.

Full text
Abstract:
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Earth, Atmospheric, and Planetary Sciences, 2008.
Includes bibliographical references (p. 55-57).
The goal of this Master's project is to implement the WRF model with 3D variational assimilation (3DVAR) at MIT. A working version of WRF extends the scope of experimentation to mesoscale problems in both real and idealized scenarios. A state-of-the-art model and assimilation package can now be used to conduct science, or as a benchmark against which to compare new methods. The second goal of this project is to demonstrate MIT's WRF implementation in an ongoing study of the impact of position errors on contemporary data assimilation (DA) methods [21]. In weather forecasting, accurately predicting the position and shape of small-scale features can be as important as predicting their strength. Position errors are unfortunately common in operational forecasts [2, 14, 21, 27] and arise for a number of reasons; it is difficult to factor error into its constituent sources [21]. Traditional data assimilation methods are amplitude adjustment methods, which do not deal with position errors well [4, 21]. In this project, we configured the WRF-Var system for use at MIT to extend experimentation on data assimilation to mesoscale problems. We experiment on position errors with the WRF-Var system using a standard WRF test case, a tropical cyclone. The results of this identical-twin experiment show the distorted analysis that commonly results from 3DVAR in the presence of position errors. A field alignment solution proposed by Ravela et al. [21] explicitly represents and minimizes position errors. We achieve promising results in testing this algorithm with WRF-Var by aligning WRF fields from the identical twin.
by John K. Williams.
S.M.
APA, Harvard, Vancouver, ISO, and other styles
