Journal articles on the topic "Geostatistical procedure"


Below are the top 50 journal articles for research on the topic "Geostatistical procedure".


1

Rocha, Samille Santos, Anabele Lindner, and Cira Souza Pitombo. "Proposal of a Geostatistical Procedure for Transportation Planning Field." Boletim de Ciências Geodésicas 23, no. 4 (December 2017): 636–53. http://dx.doi.org/10.1590/s1982-21702017000400042.

Abstract: The main objective of this study is to estimate variables related to transportation planning, in particular transit trip production, by proposing a geostatistical procedure. The procedure combines semivariogram deconvolution and Kriging with External Drift (KED). The method consists of initially assuming a disaggregated systematic sample from aggregate data. Subsequently, KED was applied to estimate the primary variable, considering the population as a secondary input. This research assesses two types of information related to the city of Salvador (Bahia, Brazil): an origin-destination dataset based on a home-interview survey carried out in 1995 and the 2010 census data. Besides standing out for the application of Geostatistics in the field of transportation planning, this paper introduces the concepts of semivariogram deconvolution applied to aggregated travel data. Thus far, these aspects have not been explored in the research area. In this way, this paper mainly presents three contributions: 1) estimating urban travel data in unsampled spatial locations; 2) obtaining the values of the variable of interest derived from other variables; and 3) introducing a simple semivariogram deconvolution procedure, considering that disaggregated data are not available in order to maintain the confidentiality of individual data.
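The KED step described here reduces to one linear system per target location. A minimal numpy sketch, assuming an exponential covariance model with illustrative parameters (the paper instead uses a deconvolved semivariogram, with population as the drift variable):

```python
import numpy as np

def exp_cov(h, sill=1.0, corr_range=10.0):
    """Exponential covariance model (illustrative parameters, not the
    deconvolved model fitted in the paper)."""
    return sill * np.exp(-h / corr_range)

def ked_predict(xy, z, s, xy0, s0):
    """Kriging with External Drift at one target location.

    xy : (n, 2) sample coordinates; z : (n,) primary variable;
    s  : (n,) secondary (drift) variable at the samples;
    xy0: (2,) target location; s0 : drift value at the target.
    """
    n = len(z)
    dists = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    C = exp_cov(dists)
    F = np.column_stack([np.ones(n), s])            # unbiasedness + drift constraints
    A = np.block([[C, F], [F.T, np.zeros((2, 2))]])
    b = np.concatenate([exp_cov(np.linalg.norm(xy - xy0, axis=1)), [1.0, s0]])
    weights = np.linalg.solve(A, b)[:n]             # kriging weights
    return weights @ z
```

Because the drift enters through the constraints, a primary variable that is an exact linear function of the secondary one is reproduced exactly, which makes a convenient sanity check.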
2

Murakami, H., X. Chen, M. S. Hahn, Y. Liu, M. L. Rockhold, V. R. Vermeul, J. M. Zachara, and Y. Rubin. "Bayesian approach for three-dimensional aquifer characterization at the Hanford 300 Area." Hydrology and Earth System Sciences Discussions 7, no. 2 (23 March 2010): 2017–52. http://dx.doi.org/10.5194/hessd-7-2017-2010.

Abstract. This study presents a stochastic, three-dimensional characterization of a heterogeneous hydraulic conductivity field within DOE's Hanford 300 Area site, Washington, by assimilating large-scale, constant-rate injection test data with small-scale, three-dimensional electromagnetic borehole flowmeter (EBF) measurement data. We first inverted the injection test data to estimate the transmissivity field, using zeroth-order temporal moments of pressure buildup curves. We applied a newly developed Bayesian geostatistical inversion framework, the method of anchored distributions (MAD), to obtain a joint posterior distribution of geostatistical parameters and local log-transmissivities at multiple locations. The unique aspects of MAD that make it suitable for this purpose are its ability to integrate multi-scale, multi-type data within a Bayesian framework and to compute a nonparametric posterior distribution. After we combined the distribution of transmissivities with depth-discrete relative-conductivity profiles from the EBF data, we inferred the three-dimensional geostatistical parameters of the log-conductivity field, using Bayesian model-based geostatistics. Such consistent use of the Bayesian approach throughout the procedure enabled us to systematically incorporate data uncertainty into the final posterior distribution. The method was tested in a synthetic study and validated using actual data that were not part of the estimation. Results showed broader and skewed posterior distributions of geostatistical parameters except for the mean, which suggests the importance of inferring the entire distribution to quantify the parameter uncertainty.
3

Murakami, H., X. Chen, M. S. Hahn, Y. Liu, M. L. Rockhold, V. R. Vermeul, J. M. Zachara, and Y. Rubin. "Bayesian approach for three-dimensional aquifer characterization at the Hanford 300 Area." Hydrology and Earth System Sciences 14, no. 10 (21 October 2010): 1989–2001. http://dx.doi.org/10.5194/hess-14-1989-2010.

Abstract. This study presents a stochastic, three-dimensional characterization of a heterogeneous hydraulic conductivity field within the Hanford 300 Area, Washington, USA, by assimilating large-scale, constant-rate injection test data with small-scale, three-dimensional electromagnetic borehole flowmeter (EBF) measurement data. We first inverted the injection test data to estimate the transmissivity field, using zeroth-order temporal moments of pressure buildup curves. We applied a newly developed Bayesian geostatistical inversion framework, the method of anchored distributions (MAD), to obtain a joint posterior distribution of geostatistical parameters and local log-transmissivities at multiple locations. The unique aspects of MAD that make it suitable for this purpose are its ability to integrate multi-scale, multi-type data within a Bayesian framework and to compute a nonparametric posterior distribution. After we combined the distribution of transmissivities with depth-discrete relative-conductivity profiles from the EBF data, we inferred the three-dimensional geostatistical parameters of the log-conductivity field, using Bayesian model-based geostatistics. Such consistent use of the Bayesian approach throughout the procedure enabled us to systematically incorporate data uncertainty into the final posterior distribution. The method was tested in a synthetic study and validated using actual data that were not part of the estimation. Results showed broader and skewed posterior distributions of geostatistical parameters except for the mean, which suggests the importance of inferring the entire distribution to quantify the parameter uncertainty.
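The first inversion step rests on zeroth-order temporal moments of the pressure buildup curves; these are plain time integrals. A minimal sketch (trapezoidal rule; the exponential curve in the check below is purely illustrative):

```python
import numpy as np

def temporal_moments(t, s):
    """Zeroth (m0) and first (m1) temporal moments of a pressure-buildup
    curve s(t), plus the characteristic time m1/m0 (trapezoidal rule)."""
    dt = np.diff(t)
    m0 = np.sum(0.5 * (s[1:] + s[:-1]) * dt)
    m1 = np.sum(0.5 * (t[1:] * s[1:] + t[:-1] * s[:-1]) * dt)
    return m0, m1, m1 / m0
```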
4

Clarke, R. T. "Classification procedures in the context of PUB: ways forward?" Hydrology and Earth System Sciences Discussions 8, no. 1 (21 January 2011): 855–67. http://dx.doi.org/10.5194/hessd-8-855-2011.

Abstract. Limitations of cluster analysis as a procedure for classifying parameters of rainfall-runoff models are discussed, and a procedure is suggested by which such parameters could be estimated, using site characteristics together with a split record test as a measure of performance. It is suggested that geostatistical models may be a possible alternative to procedures based on cluster analysis, and that long-established principles of experimental design (replication, randomization) be used for comparing alternative PUB procedures.
5

Pugliese, Alessio, Simone Persiano, Stefano Bagli, Paolo Mazzoli, Juraj Parajka, Berit Arheimer, René Capell, Alberto Montanari, Günter Blöschl, and Attilio Castellarin. "A geostatistical data-assimilation technique for enhancing macro-scale rainfall–runoff simulations." Hydrology and Earth System Sciences 22, no. 9 (6 September 2018): 4633–48. http://dx.doi.org/10.5194/hess-22-4633-2018.

Abstract. Our study develops and tests a geostatistical technique for locally enhancing macro-scale rainfall–runoff simulations on the basis of observed streamflow data that were not used in calibration. We consider Tyrol (Austria and Italy) and two different types of daily streamflow data: macro-scale rainfall–runoff simulations at 11 prediction nodes and observations at 46 gauged catchments. The technique consists of three main steps: (1) period-of-record flow–duration curves (FDCs) are geostatistically predicted at target ungauged basins, for which macro-scale model runs are available; (2) residuals between geostatistically predicted FDCs and FDCs constructed from simulated streamflow series are computed; (3) the relationship between duration and residuals is used for enhancing simulated time series at target basins. We apply the technique in cross-validation to 11 gauged catchments, for which simulated and observed streamflow series are available over the period 1980–2010. Our results show that (1) the procedure can significantly enhance macro-scale simulations (regional LNSE increases from nearly zero to ≈0.7) and (2) improvements are significant for low gauging network densities (i.e. 1 gauge per 2000 km²).
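Step (3) of the technique — giving each simulated value the discharge that the predicted FDC assigns to its duration — can be sketched as follows. The plotting positions and the form of the predicted FDC are illustrative assumptions; in the paper the FDC comes from geostatistical prediction:

```python
import numpy as np

def enhance_simulation(q_sim, fdc_pred):
    """Correct a simulated streamflow series using a predicted FDC.

    q_sim    : simulated daily streamflow series
    fdc_pred : callable, duration in (0, 1) -> discharge (e.g. a
               geostatistically predicted FDC)
    Each simulated value keeps its own duration (exceedance probability)
    but takes the discharge the predicted FDC assigns to that duration.
    """
    n = len(q_sim)
    order = np.argsort(q_sim)[::-1]           # descending: largest flow = smallest duration
    dur = (np.arange(1, n + 1) - 0.5) / n     # plotting positions (illustrative choice)
    q_new = np.empty(n)
    q_new[order] = fdc_pred(dur)
    return q_new
```

By construction the correction preserves the simulated timing (the rank order of flows) while replacing the flow magnitudes.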
6

Kozubal, Janusz, Roman Wróblewski, Zbigniew Muszyński, Marek Wyjadłowski, and Joanna Stróżyk. "Non-Deterministic Assessment of Surface Roughness as Bond Strength Parameters between Concrete Layers Cast at Different Ages." Materials 13, no. 11 (3 June 2020): 2542. http://dx.doi.org/10.3390/ma13112542.

The importance of surface roughness and its non-destructive examination has often been emphasised in structural rehabilitation. The presented innovative procedure enables the estimation of concrete-to-concrete strength based on a combination of low-cost, area-limited tests and geostatistical methods. The new method removes the shortcomings of the existing one: it is neither qualitative nor subjective. The interface strength factors, cohesion and friction, can be estimated accurately based on the collected surface-texture data. The data acquisition needed to create digital models of the concrete surface can be performed by terrestrial close-range photogrammetry or other methods. In the presented procedure, limitations to the availability of concrete surfaces are overcome by the generation of sequential Gaussian random fields (via height profiles) based on the semivariograms fitted to the digital surface models. In this way, the randomness of the surface texture is reproduced. The selected roughness parameters, such as mean valley depth and, most importantly, the geostatistical semivariogram parameter sill, were transformed into contact bond strength parameters based on the available strength tests. The proposed procedure estimates the interface bond strength based on the geostatistical methods applied to the numerical surface model and can be used in practical and theoretical applications.
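The simulation step — drawing Gaussian random height profiles that honour a fitted semivariogram — can be sketched via a Cholesky factorisation of the implied covariance. The exponential semivariogram model and its parameters are assumptions for illustration:

```python
import numpy as np

def simulate_profile(x, sill, corr_range, nugget=0.0, seed=0):
    """Unconditional Gaussian simulation of a height profile whose
    exponential semivariogram has the given sill, range and nugget
    (covariance C(h) = sill*exp(-h/range), nugget on the diagonal)."""
    h = np.abs(x[:, None] - x[None, :])
    cov = sill * np.exp(-h / corr_range) + nugget * (h == 0.0)
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(len(x)))  # jitter for stability
    return L @ np.random.default_rng(seed).standard_normal(len(x))
```

Averaged over many realizations, the pointwise variance of the simulated profiles recovers the sill, which is the parameter the paper links to bond strength.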
7

Saito, H., K. Seki, and J. Šimůnek. "Geostatistical modeling of spatial variability of water retention curves." Hydrology and Earth System Sciences Discussions 5, no. 4 (3 September 2008): 2491–522. http://dx.doi.org/10.5194/hessd-5-2491-2008.

Abstract. This study compares the performance of two geostatistical approaches, parametric and non-parametric, to evaluate the spatial distribution of water retention curves. Data used in this study were obtained from the Las Cruces trench site database that contains water retention data for 448 soil samples. In a commonly used parametric approach, three standard water retention models, i.e. Brooks and Corey (BC), van Genuchten (VG), and log-normal (LN), were first fitted to each data set. For each model, a cross validation procedure was used to estimate parameters at each sampling location, allowing computation of prediction errors. In a rarely used non-parametric approach, a cross validation procedure was first used to directly estimate water content values for eleven pressure heads at each sampling location and then the three water retention models were fitted using the same automated procedure to compute prediction errors. The results show that the non-parametric approach significantly lowered prediction errors for the VG model, while moderately reducing them also for the LN and BC models.
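The automated model-fitting step can be sketched for the VG model with a bounded least-squares fit; the starting values and parameter bounds below are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import curve_fit

def van_genuchten(h, theta_r, theta_s, alpha, n):
    """van Genuchten retention curve with the Mualem constraint m = 1 - 1/n."""
    m = 1.0 - 1.0 / n
    return theta_r + (theta_s - theta_r) / (1.0 + (alpha * h) ** n) ** m

def fit_vg(h, theta):
    """Automated least-squares fit of (theta_r, theta_s, alpha, n);
    p0 and bounds are illustrative, not the paper's settings."""
    p0 = (0.05, 0.40, 0.05, 2.0)
    bounds = ([0.0, 0.0, 1e-4, 1.01], [0.3, 0.6, 1.0, 5.0])
    popt, _ = curve_fit(van_genuchten, h, theta, p0=p0, bounds=bounds, maxfev=10000)
    return popt
```

In the non-parametric route described above, `theta` would be the cross-validated water contents at the eleven pressure heads rather than the raw sample data.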
8

Ribeiro, Sara, Júlio Caineta, Ana Cristina Costa, and Roberto Henriques. "gsimcli: a geostatistical procedure for the homogenisation of climatic time series." International Journal of Climatology 37, no. 8 (17 November 2016): 3452–67. http://dx.doi.org/10.1002/joc.4929.

9

Wen, X. H., T. T. Tran, R. A. Behrens, and J. J. Gomez-Hernandez. "Production Data Integration in Sand/Shale Reservoirs Using Sequential Self-Calibration and GeoMorphing: A Comparison." SPE Reservoir Evaluation & Engineering 5, no. 3 (1 June 2002): 255–65. http://dx.doi.org/10.2118/78139-pa.

Summary: The stochastic inversion of the spatial distribution of lithofacies from multiphase production data is a difficult problem. This is true even for the simplest case, addressed here, of a sand/shale distribution and under the assumption that reservoir properties are constant within each lithofacies. Two geostatistically based inverse techniques, sequential self-calibration (SSC) and GeoMorphing (GM), are extended for this purpose and then compared against synthetic reference fields. The extension of both techniques is based on the one-to-one relationship existing between lithofacies and Gaussian deviates in truncated Gaussian simulation. Both techniques attempt to modify the field of Gaussian deviates while maintaining the truncation threshold field through an optimization procedure. Maintaining a fixed threshold field, which has been computed previously on the basis of prior lithofacies proportion data, well data, and other static soft data, guarantees preservation of the initial geostatistical structure. Comparisons of the two techniques using 2D and 3D synthetic data show that SSC is very efficient in producing sand/shale realizations matching production data and reproducing the large-scale patterns displayed in the reference fields, although it has difficulty in reproducing small-scale features. GM is a simpler algorithm than SSC, but it is computationally more intensive and has difficulty in matching complex production data. Better results could be obtained with a combination of the two techniques in which SSC is used to generate realizations identifying large-scale features; then, these realizations could be used as input to GM for a final update to match small-scale details.

Introduction: Reliable predictions of future reservoir performance require reservoir models that incorporate all available relevant information.
Geostatistical methods are widely used and well suited to construct reservoir models of porosity and permeability honoring static data, such as core data, well-log data, seismic data, and geological conceptual data. Dynamic production data, such as production rate, pressure, water cut, and gas/oil ratio (GOR), have been largely overlooked for constraining geostatistical models because of the complication and difficulty of integrating them. Traditional geostatistical methods for integrating static data are not well suited for integrating dynamic data because dynamic data are nonlinearly related to reservoir properties through the flow equations. Typically, an inverse technique is needed for such integration, in which the flow equations must be solved many times within a nonlinear optimization procedure. In recent years, a number of inverse techniques have been developed and shown capable of preconstraining geostatistical models before they go to the manual history matching phase. Ref. 1 provides a review of these inverse techniques. Two geostatistically based approaches that have shown great potential for the integration of dynamic data are SSC and GM. The SSC method iteratively perturbs the given reservoir model at each gridblock to match the production data while preserving the geostatistical features and static hard/soft data conditioning.2–6 The perturbation is computed through an optimization procedure after a parameterization of the optimization problem with a reduced number of parameters that requires the computation of sensitivity coefficients. The reduced number of parameters to optimize and a fast calculation of the sensitivity coefficients make the inversion computationally feasible. Multiple realizations of the reservoir model can be produced, from which uncertainty can be assessed. 
Applications of the SSC method to invert permeability distribution from single-phase and multiphase production data have shown their efficiency and robustness.3–6 In this paper, we extend the SSC method to invert lithofacies distributions from production data within the framework of truncated Gaussian simulation. We limit ourselves to sand/shale reservoirs in which permeability is assumed constant within each facies. GM is an evolution and extension of the Gradual Deformation method.7–9 This method generates realizations of reservoir models by an iterative procedure in which, at each iteration, unconditional realizations are linearly and optimally combined into a new realization with a better reproduction of the production data than any other members of the linear combination. Because the linear combination of a few realizations depends only on a few parameters, the optimization procedure is very easy to implement. Our GM algorithm follows the modification of the gradual deformation algorithm by Ying and Gómez-Hernández10 to honor the well data while preserving the permeability variogram. Our modification here is aimed at inverting a lithofacies distribution from production data within the framework of truncated Gaussian simulation. Comparisons of these two methods in generating multiple geostatistical sand/shale reservoir models that honor dynamic production data are made by using both 2D and 3D synthetic data sets. The comparison of the results against the reference models provides direct assessment of the two methods. A thorough comparison of the two methods is made in terms of reproduction of reservoir spatial patterns, matching of production data, implementation issues, feasibility, CPU time, and generality. We also discuss briefly the possible combination of the strength of the two methods to achieve better, more efficient integration of production data. 
In the following sections, we first recall the methodology of truncated Gaussian simulation to construct a categorical type of reservoir model; then, the SSC and GM methods are presented under the framework of truncated Gaussian simulation to invert lithofacies distributions. Applications of the two methods to invert sand/shale distributions in 2D and 3D reservoir models are made using synthetic data sets, with emphasis on the comparisons of the strengths and weaknesses of the two methods. The production data considered in this paper are fractional-flow rates (water cut) at production wells and water-saturation spatial distribution at a given time in two-phase-flow (oil/water) reservoirs.
10

Phoon, Kok-Kwang, Ser-Tong Quek, and Ping An. "Geostatistical analysis of cone penetration test (CPT) sounding using the modified Bartlett test." Canadian Geotechnical Journal 41, no. 2 (1 April 2004): 356–65. http://dx.doi.org/10.1139/t03-091.

More in situ tests are typically carried out over the same volume of soil in comparison to laboratory tests on undisturbed borehole samples. Hence, geostatistical analysis of in situ test records should in principle provide a more accurate and representative overview of spatial variation. A natural probabilistic model for correlated spatial data is the random field. Although the random field provides a concise description of spatial variation, it poses considerable practical difficulties for statistical inference because of the underlying autocorrelation structure. This note presents an extended discussion of the modified Bartlett random field estimation procedure, which is capable of rejecting the null hypothesis of weak stationarity for spatially correlated data. In comparison with simple visual inspection and the standard run test, the modified Bartlett test is shown to provide three advantages: (i) it is a more consistent measure that is unaffected by the vagaries of subjective interpretation; (ii) it is sufficiently discriminative to decide if a section is stationary, even when visual clues are ambiguous; and (iii) it is capable of accommodating realistic constraints (e.g., short record length). The possibility of identifying secondary soil boundaries that may not be readily apparent from visual inspection of cone soundings, its robustness to alternate transformations of the cone data, and the sensitivity of the proposed procedure to different levels of significance are discussed. Key words: geostatistics, random field, stationarity, modified Bartlett test, level of significance, run test.
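For context, the "standard run test" that the modified Bartlett procedure is benchmarked against is simple to implement. A minimal Wald–Wolfowitz runs test about the median (normal approximation) — not the modified Bartlett statistic itself:

```python
import numpy as np

def runs_test(x):
    """Wald–Wolfowitz runs test about the median; returns the z statistic
    (|z| > 1.96 rejects randomness at the 5% level).  Values tied with the
    median are dropped, a common convention."""
    med = np.median(x)
    keep = x != med
    s = (x > med)[keep]                               # above/below-median indicator
    n1, n2 = int(s.sum()), int((~s).sum())
    runs = 1 + int(np.count_nonzero(s[1:] != s[:-1]))  # number of runs
    mu = 2.0 * n1 * n2 / (n1 + n2) + 1.0
    var = (2.0 * n1 * n2 * (2.0 * n1 * n2 - n1 - n2)
           / ((n1 + n2) ** 2 * (n1 + n2 - 1)))
    return (runs - mu) / np.sqrt(var)
```

A strongly alternating record gives a large positive z (too many runs), while a trending, non-stationary record gives a large negative z (too few).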
11

Scheuerer, M., R. Schaback, and M. Schlather. "Interpolation of spatial data – a stochastic or a deterministic problem?" European Journal of Applied Mathematics 24, no. 4 (7 February 2013): 601–29. http://dx.doi.org/10.1017/s0956792513000016.

Interpolation of spatial data is a very general mathematical problem with various applications. In geostatistics, it is assumed that the underlying structure of the data is a stochastic process which leads to an interpolation procedure known as kriging. This method is mathematically equivalent to kernel interpolation, a method used in numerical analysis for the same problem, but derived under completely different modelling assumptions. In this paper we present the two approaches and discuss their modelling assumptions, notions of optimality and different concepts to quantify the interpolation accuracy. Their relation is much closer than has been appreciated so far, and even results on convergence rates of kernel interpolants can be translated to the geostatistical framework. We sketch different answers obtained in the two fields concerning the issue of kernel misspecification, present some methods for kernel selection and discuss the scope of these methods with a data example from the computer experiments literature.
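The central equivalence is easy to verify numerically: the simple-kriging prediction and the kernel interpolant are the same quantity k0ᵀK⁻¹y, factored two different ways. A sketch with a Gaussian kernel and illustrative 1-D data:

```python
import numpy as np

def gauss_kernel(a, b, ell=0.5):
    """Gaussian (squared-exponential) kernel / covariance matrix."""
    d2 = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=2)
    return np.exp(-d2 / (2.0 * ell**2))

x = np.linspace(0.0, 5.0, 10)[:, None]   # sample locations (illustrative)
y = np.sin(x[:, 0])                      # observed values
K = gauss_kernel(x, x)
x0 = np.array([[2.3]])                   # prediction location
k0 = gauss_kernel(x, x0)[:, 0]

# geostatistics view: kriging weights lambda = K^-1 k0, prediction lambda . y
pred_kriging = np.linalg.solve(K, k0) @ y
# numerical-analysis view: kernel coefficients c = K^-1 y, prediction k0 . c
pred_kernel = k0 @ np.linalg.solve(K, y)
```

Both views give the same number, and both reproduce the data exactly at the sample locations, which is the exact-interpolation property shared by kriging and kernel interpolation.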
12

Pugliese, A., A. Castellarin, and A. Brath. "Geostatistical prediction of flow-duration curves." Hydrology and Earth System Sciences Discussions 10, no. 11 (1 November 2013): 13053–91. http://dx.doi.org/10.5194/hessd-10-13053-2013.

Abstract. We present in this study an adaptation of Topological kriging (or Top-kriging), which makes the geostatistical procedure capable of predicting flow-duration curves (FDCs) in ungauged catchments. Previous applications of Top-kriging mainly focused on the prediction of point streamflow indices (e.g. flood quantiles, low-flow indices, etc.). In this study Top-kriging is used to predict FDCs in ungauged sites as a weighted average of standardised empirical FDCs through the traditional linear-weighting scheme of kriging methods. Our study focuses on the prediction of period-of-record FDCs for 18 unregulated catchments located in Central Italy, for which daily streamflow series with length from 5 to 40 yr are available, together with information on climate referring to the same time-span of each daily streamflow sequence. Empirical FDCs are standardised by a reference streamflow value (i.e. mean annual flow, or mean annual precipitation times the catchment drainage area) and the overall deviation of the curves from this reference value is then used for expressing the hydrological similarity between catchments and for deriving the geostatistical weights. We performed an extensive leave-one-out cross-validation to quantify the accuracy of the proposed technique, and to compare it to traditional regionalisation models that were recently developed for the same study region. The cross-validation points out that Top-kriging is a reliable approach for predicting FDCs, which can significantly outperform traditional regional models in ungauged basins.
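The prediction scheme — a linear weighted average of standardised empirical FDCs, rescaled by the target's reference streamflow — is a one-liner once the kriging weights are known. In this sketch the weights are taken as given (in Top-kriging they come from solving the kriging system):

```python
import numpy as np

def predict_fdc(donor_fdcs, donor_ref_flows, weights, target_ref_flow):
    """FDC at an ungauged site as the weighted average of donor FDCs, each
    standardised by its reference flow (e.g. mean annual flow), then
    rescaled by the target's reference flow."""
    std = np.asarray(donor_fdcs, float) / np.asarray(donor_ref_flows, float)[:, None]
    w = np.asarray(weights, float)
    assert abs(w.sum() - 1.0) < 1e-12, "kriging weights must sum to one"
    return target_ref_flow * (w @ std)
```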
13

Bailey, R. T., and D. Baù. "Estimating geostatistical parameters and spatially-variable hydraulic conductivity within a catchment system using an ensemble smoother." Hydrology and Earth System Sciences 16, no. 2 (2 February 2012): 287–304. http://dx.doi.org/10.5194/hess-16-287-2012.

Abstract. Groundwater flow models are important tools in assessing baseline conditions and investigating management alternatives in groundwater systems. The usefulness of these models, however, is often hindered by insufficient knowledge regarding the magnitude and spatial distribution of the spatially-distributed parameters, such as hydraulic conductivity (K), that govern the response of these models. Proposed parameter estimation methods frequently are demonstrated using simplified aquifer representations, when in reality the groundwater regime in a given watershed is influenced by strongly-coupled surface-subsurface processes. Furthermore, parameter estimation methodologies that rely on a geostatistical structure of K often assume the parameter values of the geostatistical model as known or estimate these values from limited data. In this study, we investigate the use of a data assimilation algorithm, the Ensemble Smoother, to provide enhanced estimates of K within a catchment system using the fully-coupled, surface-subsurface flow model CATHY. Both water table elevation and streamflow data are assimilated to condition the spatial distribution of K. An iterative procedure using the ES update routine, in which geostatistical parameter values defining the true spatial structure of K are identified, is also presented. In this procedure, parameter values are inferred from the updated ensemble of K fields and used in the subsequent iteration to generate the K ensemble, with the process proceeding until parameter values are converged upon. The parameter estimation scheme is demonstrated via a synthetic three-dimensional tilted v-shaped catchment system incorporating stream flow and variably-saturated subsurface flow, with spatio-temporal variability in forcing terms. 
Results indicate that the method is successful in providing improved estimates of the K field, and that the iterative scheme can be used to identify the geostatistical parameter values of the aquifer system. In general, water table data have a much greater ability than streamflow data to condition K. Future research includes applying the methodology to an actual regional study site.
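The ES update applied at each iteration is a single linear-algebra step. A textbook sketch with perturbed observations (not the CATHY-coupled implementation; shapes and the diagonal error model are illustrative):

```python
import numpy as np

def ensemble_smoother_update(X, Y, d, sigma_d, seed=0):
    """One Ensemble Smoother update of a parameter ensemble X (n_par, n_ens)
    given the predicted-data ensemble Y (n_obs, n_ens), observations d and
    observation-error standard deviation sigma_d."""
    rng = np.random.default_rng(seed)
    ne = X.shape[1]
    A = X - X.mean(1, keepdims=True)           # parameter anomalies
    B = Y - Y.mean(1, keepdims=True)           # predicted-data anomalies
    Cxy = A @ B.T / (ne - 1)                   # parameter-data covariance
    Cyy = B @ B.T / (ne - 1)                   # data covariance
    R = sigma_d**2 * np.eye(len(d))
    D = d[:, None] + sigma_d * rng.standard_normal((len(d), ne))  # perturbed obs
    return X + Cxy @ np.linalg.solve(Cyy + R, D - Y)
```

With a linear forward model the update reduces to the Kalman solution, so the posterior ensemble mean moves to the regression estimate and the spread shrinks.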
14

Bailey, R. T., and D. Baù. "Estimating geostatistical parameters and spatially-variable hydraulic conductivity within a catchment system using an ensemble smoother." Hydrology and Earth System Sciences Discussions 8, no. 5 (31 October 2011): 9587–635. http://dx.doi.org/10.5194/hessd-8-9587-2011.

Abstract. Groundwater flow models are important tools in assessing baseline conditions and investigating management alternatives in groundwater systems. The usefulness of these models, however, is often hindered by insufficient knowledge regarding the magnitude and spatial distribution of the spatially-distributed parameters, such as hydraulic conductivity (K), that govern the response of these models. Proposed parameter estimation methods frequently are demonstrated using simplified aquifer representations, when in reality the groundwater regime in a given watershed is influenced by strongly-coupled surface-subsurface processes. Furthermore, parameter estimation methodologies that rely on a geostatistical structure of K often assume the parameter values of the geostatistical model as known or estimate these values from limited data. In this study, we investigate the use of a data assimilation algorithm, the Ensemble Smoother, to provide enhanced estimates of K within a catchment system using the fully-coupled, surface-subsurface flow model CATHY. Both water table elevation and streamflow data are assimilated to condition the spatial distribution of K. An iterative procedure using the ES update routine, in which geostatistical parameter values defining the true spatial structure of K are identified, is also presented. In this procedure, parameter values are inferred from the updated ensemble of K fields and used in the subsequent iteration to generate the K ensemble, with the process proceeding until parameter values are converged upon. The parameter estimation scheme is demonstrated via a synthetic three-dimensional tilted v-shaped catchment system incorporating stream flow and variably-saturated subsurface flow, with spatio-temporal variability in forcing terms. 
Results indicate that the method is successful in providing improved estimates of the K field, and that the iterative scheme can be used to identify the geostatistical parameter values of the aquifer system. In general, water table data have a much greater ability than streamflow data to condition K. Future research includes applying the methodology to an actual regional study site.
15

Parfitt, José Maria Barbat, Luís Carlos Timm, Eloy Antonio Pauletto, Rogério Oliveira de Sousa, Danilo Dufech Castilhos, Conceição Lagos de Ávila, and Nestor Luis Reckziegel. "Spatial variability of the chemical, physical and biological properties in lowland cultivated with irrigated rice." Revista Brasileira de Ciência do Solo 33, no. 4 (August 2009): 819–30. http://dx.doi.org/10.1590/s0100-06832009000400007.

In the areas where irrigated rice is grown in the south of Brazil, few studies have been carried out to investigate the spatial variability structure of soil properties and to establish new forms of soil management as well as determine soil corrective and fertilizer applications. In this sense, this study had the objective of evaluating the spatial variability of chemical, physical and biological soil properties in a lowland area under irrigated rice cultivation in the conventional till system. For this purpose, a 10 x 10 m grid of 100 points was established, in an experimental field of the Embrapa Clima Temperado, in the County of Capão do Leão, State of Rio Grande do Sul. The spatial variability structure was evaluated by geostatistical tools and the number of subsamples required to represent each soil property in future studies was calculated using classical statistics. Results showed that the spatial variability structure of sand, silt, SMP index, cation exchange capacity (pH 7.0), Al3+ and total N properties could be detected by geostatistical analysis. A pure nugget effect was observed for the nutrients K, S and B, as well as macroporosity, mean weighted diameter of aggregates, and soil water storage. The cross validation procedure, based on linear regression and the determination coefficient, was more efficient to evaluate the quality of the adjusted mathematical model than the degree of spatial dependence. It was also concluded that the combination of classical statistics with geostatistics can in many cases simplify the soil sampling process without losing information quality.
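The pure nugget effect reported for K, S and B corresponds to a flat empirical semivariogram. The classical Matheron estimator behind that diagnosis is a few lines (the grid and lag bins below are illustrative):

```python
import numpy as np

def empirical_semivariogram(coords, values, lag_edges):
    """Classical (Matheron) empirical semivariogram: for each lag bin,
    gamma(h) = mean of 0.5*(z_i - z_j)^2 over pairs whose separation
    distance falls in the bin."""
    d = np.linalg.norm(coords[:, None] - coords[None], axis=2)
    iu = np.triu_indices(len(values), 1)          # each pair once
    dist = d[iu]
    half_sq = 0.5 * (values[iu[0]] - values[iu[1]]) ** 2
    return np.array([half_sq[(dist >= a) & (dist < b)].mean()
                     for a, b in zip(lag_edges[:-1], lag_edges[1:])])
```

For spatially uncorrelated data the curve sits at the sample variance for every lag — the pure nugget signature.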
16

Pereira, Ângela, Rúben Nunes, Leonardo Azevedo, Luís Guerreiro, and Amílcar Soares. "Geostatistical seismic inversion for frontier exploration." Interpretation 5, no. 4 (30 November 2017): T477–T485. http://dx.doi.org/10.1190/int-2016-0171.1.

Full text
Abstract:
Numerical 3D high-resolution models of subsurface petroelastic properties are key tools for the exploration and production stages. Stochastic seismic inversion techniques are often used to infer the spatial distribution of the properties of interest by simultaneously integrating seismic reflection and well-log data, which also allows the spatial uncertainty of the retrieved models to be assessed. In frontier exploration areas, the available data set is often composed exclusively of seismic reflection data, due to the lack of drilled wells, and is therefore highly uncertain. In these cases, subsurface models are usually retrieved by deterministic seismic inversion methodologies based exclusively on the existing seismic reflection data and an a priori elastic model. The resulting models are smooth representations of the real, complex geology and do not allow the uncertainty to be assessed. To overcome these limitations, we have developed a geostatistical framework that inverts seismic reflection data without the need for experimental data (i.e., well-log data) within the inversion area. This iterative geostatistical seismic inversion methodology simultaneously integrates the available seismic reflection data and information from geologic analogs (nearby wells and/or analog fields), allowing acoustic impedance models to be retrieved. The model parameter space is perturbed by a stochastic sequential simulation methodology that handles the nonstationary probability distribution function. Convergence from iteration to iteration is ensured by a genetic algorithm driven by the trace-by-trace mismatch between real and synthetic seismic reflection data. The method was successfully applied to a frontier basin offshore southwest Europe, where no well has been drilled yet. Geologic information about the expected impedance distribution was retrieved from nearby wells and integrated within the inversion procedure. The resulting acoustic impedance models are geologically consistent with the available information and data, and the match between the inverted and the real seismic data reaches 85% to 90% in some regions.
Styles APA, Harvard, Vancouver, ISO, etc.
17

Jimeno-Sáez, Patricia, David Pulido-Velazquez, Antonio-Juan Collados-Lara, Eulogio Pardo-Igúzquiza, Javier Senent-Aparicio et Leticia Baena-Ruiz. « A Preliminary Assessment of the “Undercatching” and the Precipitation Pattern in an Alpine Basin ». Water 12, no 4 (8 avril 2020) : 1061. http://dx.doi.org/10.3390/w12041061.

Full text
Abstract:
Gauges modify wind fields, producing important systematic errors (undercatching) in the measurement of solid precipitation (Ps), especially under windy conditions. A methodology that combines geostatistical techniques and hydrological models is proposed to perform a preliminary assessment of global undercatch and precipitation patterns in alpine regions. Temperature and precipitation fields are assessed by applying geostatistical approaches under different hypotheses about the relationship between the climatic fields and altitude. Several experiments using different approximations of the climatic fields in different configurations of a hydrological model are evaluated. A new hydrological model, the Snow-Témez Model (STM), is developed, including two parameters to correct the solid (Cs) and liquid (Cr) precipitation. The procedure allows the best combination of geostatistical approach and hydrological model for estimating streamflow in the Canales Basin, an alpine catchment of the Sierra Nevada (Spain), to be identified. The sensitivity of the results to the correction of the precipitation fields is analyzed, revealing that the streamflow simulation improves considerably when the precipitation is corrected. High values of the solid correction factor Cs are obtained, while the Cr values, although smaller, are also significant.
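The two correction parameters described in this abstract act as multiplicative undercatch factors on the solid and liquid fractions of gauge precipitation; a minimal sketch with a single temperature threshold (the threshold and factor values are illustrative assumptions, not the STM calibration):

```python
def corrected_precipitation(p, temp, cs, cr, t_thresh=0.0):
    """Split gauge precipitation `p` (mm) into solid and liquid using a
    temperature threshold, then apply undercatch correction factors:
    Cs for snow and Cr for rain (both >= 1 when gauges undercatch)."""
    if temp <= t_thresh:
        return cs * p, 0.0   # (corrected snow, rain): all solid
    return 0.0, cr * p       # all liquid

snow, rain = corrected_precipitation(10.0, -2.0, cs=1.5, cr=1.1)
print(snow, rain)  # -> 15.0 0.0
```

A calibrated model would typically treat Cs and Cr as free parameters fitted against observed streamflow, as the abstract describes.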
Styles APA, Harvard, Vancouver, ISO, etc.
18

Lebrenz, Henning, et András Bárdossy. « Geostatistical interpolation by quantile kriging ». Hydrology and Earth System Sciences 23, no 3 (20 mars 2019) : 1633–48. http://dx.doi.org/10.5194/hess-23-1633-2019.

Full text
Abstract:
Abstract. The widely applied geostatistical interpolation methods of ordinary kriging (OK) or external drift kriging (EDK) interpolate the variable of interest to the unknown location, providing a linear estimator and an estimation variance as measure of uncertainty. The methods implicitly pose the assumption of Gaussianity on the observations, which is not given for many variables. The resulting “best linear and unbiased estimator” from the subsequent interpolation optimizes the mean error over many realizations for the entire spatial domain and, therefore, allows a systematic under-(over-)estimation of the variable in regions of relatively high (low) observations. In case of a variable with observed time series, the spatial marginal distributions are estimated separately for one time step after the other, and the errors from the interpolations might accumulate over time in regions of relatively extreme observations. Therefore, we propose the interpolation method of quantile kriging (QK) with a two-step procedure prior to interpolation: we firstly estimate distributions of the variable over time at the observation locations and then estimate the marginal distributions over space for every given time step. For this purpose, a distribution function is selected and fitted to the observed time series at every observation location, thus converting the variable into quantiles and defining parameters. At a given time step, the quantiles from all observation locations are then transformed into a Gaussian-distributed variable by a 2-fold quantile–quantile transformation with the beta- and normal-distribution function. The spatio-temporal description of the proposed method accommodates skewed marginal distributions and resolves the spatial non-stationarity of the original variable. The Gaussian-distributed variable and the distribution parameters are now interpolated by OK and EDK. 
At the unknown location, the resulting outcomes are converted back into the estimator and the estimation variance of the original variable. In summary, QK newly incorporates information from the temporal axis into its spatial marginal distribution and subsequent interpolation and could therefore be interpreted as a space–time version of probability kriging. In this study, QK is applied to observed monthly precipitation from rain gauges in South Africa. The estimators and estimation variances from the interpolation are compared to the respective outcomes from OK and EDK. The cross-validations show that QK improves the estimator and the estimation variance for most of the selected objective functions. QK further enables a reduction of the temporal bias at locations of extreme observations. The performance of QK, however, declines when many zero-value observations are present in the input data. It is further revealed that QK relates the magnitude of its estimator to the magnitude of the respective estimation variance, as opposed to the traditional methods of OK and EDK, whose estimation variances depend only on the spatial configuration of the observation locations and the model settings.
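The two-fold quantile–quantile transformation at the heart of QK can be sketched with SciPy (the synthetic gamma-distributed series, the beta parameters, and the empirical-quantile step are illustrative assumptions; the paper fits station-specific distributions):

```python
import numpy as np
from scipy import stats

def to_gaussian(series, value, a, b):
    """Two-fold quantile-quantile transform in the spirit of QK:
    1) empirical quantile of `value` within the station's time series,
    2) quantile mapped through a beta CDF and then through the inverse
       normal CDF, yielding a Gaussian-distributed variable."""
    q = stats.percentileofscore(series, value) / 100.0
    q = np.clip(q, 1e-6, 1 - 1e-6)     # keep normal scores finite
    u = stats.beta.cdf(q, a, b)        # beta transform of the quantile
    return float(stats.norm.ppf(u))    # normal score

rng = np.random.default_rng(0)
series = rng.gamma(2.0, 30.0, size=240)      # synthetic monthly rainfall
z = to_gaussian(series, float(np.median(series)), a=2.0, b=2.0)
print(abs(z) < 1e-6)  # -> True: the median maps to a normal score of ~0
```

The Gaussian-transformed values would then be interpolated by OK or EDK and back-transformed at the target location, per the procedure above.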
Styles APA, Harvard, Vancouver, ISO, etc.
19

Marashly, O., et M. Dobroka. « Hilbert transform using a robust geostatistical method ». IOP Conference Series : Earth and Environmental Science 942, no 1 (1 novembre 2021) : 012029. http://dx.doi.org/10.1088/1755-1315/942/1/012029.

Full text
Abstract:
Abstract In this paper, we introduce an efficient inversion-based method for calculating the Hilbert transform that is able to eliminate outlier noise. The Most Frequent Value (MFV) method developed by Steiner is merged with an inversion-based Fourier transform to yield a robust Fourier transform (IRLS-FT), whose noise-suppression efficiency and resistance to outliers make it an applicable method in seismic data processing. In the first part of the study, we introduce the Hilbert transform based on robust inversion; as an example, we then compute the absolute value of the analytical signal, which can be used as an attribute measure. The method relies on a dual inversion: first, the Fourier spectrum of the time signal is obtained via inversion; the spectrum is then transformed by the Hilbert transform into the time domain using a robust inversion. The Steiner weights are calculated using the Iteratively Reweighted Least Squares (IRLS) method (robust inverse Fourier transform). Hermite functions in a series expansion are used to discretize the spectrum of the time signal, with the expansion coefficients as the unknowns. The new Hilbert transform was tested on a Ricker wavelet contaminated with Cauchy-distributed noise. The method shows considerably better resistance to outlier noise than the conventional (DFT) method.
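The IRLS scheme named in this abstract, with Cauchy-type weights standing in for the Steiner weights, can be sketched on a simple robust line fit (a generic illustration under assumed weights and a fixed scale, not the authors' Fourier-domain implementation):

```python
import numpy as np

def irls_fit(x, y, eps=1.0, n_iter=20):
    """Iteratively Reweighted Least Squares for y = a + b*x with
    Cauchy-type weights w = eps^2 / (eps^2 + r^2), which downweight
    outliers (scale handling simplified to a fixed eps)."""
    A = np.column_stack([np.ones_like(x), x])
    w = np.ones_like(y)
    for _ in range(n_iter):
        Aw = A * w[:, None]                         # weighted design matrix
        coef = np.linalg.solve(Aw.T @ A, Aw.T @ y)  # weighted normal equations
        r = y - A @ coef                            # residuals
        w = eps**2 / (eps**2 + r**2)                # reweight
    return coef

rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 + 0.5 * x + rng.normal(0.0, 0.05, 50)
y[10] += 50.0                                       # gross outlier
a, b = irls_fit(x, y)
print(round(a, 1), round(b, 1))  # close to the true 2.0 and 0.5
```

The single gross outlier receives a near-zero weight after the first iteration, which is the behaviour that makes such schemes resistant to Cauchy-distributed noise.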
Styles APA, Harvard, Vancouver, ISO, etc.
20

Doyen, Philippe M. « Porosity from seismic data : A geostatistical approach ». GEOPHYSICS 53, no 10 (octobre 1988) : 1263–75. http://dx.doi.org/10.1190/1.1442404.

Full text
Abstract:
Using a geostatistical technique called cokriging, the areal distribution of porosity is estimated first in a numerically simulated reservoir model, then in an oil‐bearing channel‐sand of Alberta, Canada. The cokriging method consistently integrates 3-D reflection seismic data with well measurements of the porosity and provides error‐qualified, linear mean square estimates of this parameter. In contrast to traditional seismically assisted porosity mapping techniques that treat the data as spatially independent observations, the geostatistical approach uses spatial autocorrelation and crosscorrelation functions to model the lateral variations of the reservoir properties. In the simulated model, the experimental root‐mean square porosity error with cokriging is 50 percent smaller than the error in predictions relying on a least‐squares regression of porosity on seismically derived transit time in the reservoir interval. In the Alberta reservoir, a cross‐validation study at the wells demonstrates that the cokriging procedure is 20 percent more accurate, in a mean square sense, than a standard regression method, which accounts only for local correlations between porosity and seismically derived impedances. In both cases, cokriging capitalizes on areally dense seismic measurements that are indirectly related to porosity. As a result, when compared to estimates obtained by interpolating the well data, this technique considerably improves the spatial description of porosity in areas of sparse well control.
Styles APA, Harvard, Vancouver, ISO, etc.
21

Caers, J. K., S. Srinivasan et A. G. Journel. « Geostatistical Quantification of Geological Information for a Fluvial-Type North Sea Reservoir ». SPE Reservoir Evaluation & ; Engineering 3, no 05 (1 octobre 2000) : 457–67. http://dx.doi.org/10.2118/66310-pa.

Full text
Abstract:
Summary Accurate prediction of petroleum reservoir performance requires reliable models of the often complex reservoir heterogeneity. Geostatistical simulation techniques generate multiple realizations of the reservoir model, all equally likely to be drawn. Traditional to geostatistics, geological continuity is represented through the variogram. The variogram is limited in describing complex geological structures as it measures correlation between rock properties at two locations only: it is a two-point statistic. Reservoir analogs such as outcrops can serve as training images depicting the interpreted geological structure. Due to scarcity of well data, the variogram models are often borrowed from such training sets. However, the same training images could be utilized to extract more complex information in the form of multiple-point statistics measuring the joint dependency between multiple locations. This paper compares a traditional variogram-based geostatistical model vs. a novel geostatistical method utilizing multiple-point statistics borrowed from training images. The comparison is made on the basis of flow performance for a typical North Sea reservoir. To obtain such comparison a "true" reference reservoir is generated using object-based simulation that depicts the complex intertwining of fluvial channels. Next, a different but similar reservoir is generated, termed the "training reservoir." The latter is used to extract the necessary structural information, be it variograms or multiple-point statistics, to build multiple geostatistical models of the true reservoir conditioned to sparse well data. A waterflood flow scenario with an inverted five-spot pattern is simulated using ECLIPSE on the true reference and the various geostatistical models. Water breakthrough characteristics and water saturation distributions are used for comparison. Introduction Typically, geostatistical reservoir characterization must address two important issues. 
First, a structural model needs to be established that provides an adequate description of the underlying geology. In geostatistics, the structural model describes the spatial continuity of geology in all directions. Traditional to geostatistics is to take variogram(s) as the basis for that prior structural model. Second, the structural model needs to be conditioned to all available hard and soft data. The intent of this paper is to compare two approaches to reservoir modeling: a traditional variogram-based technique and a novel training image-based simulation method. In traditional geostatistics, one models the variogram from well data, then one produces simulation models that honor or reflect the variogram model. This seems a highly objective procedure: the variogram model, which conditions the patterns generated in the reservoir model, originates from data from the same reservoir. However, the practice of geostatistics has shown that it is difficult to model variograms from the limited well data, and the variogram is often borrowed from ancillary information such as outcrops. Moreover, it is by now understood that the variogram is a very limited measure for quantifying spatial patterns. Every variogram-based simulation algorithm implicitly needs to assume higher-order statistics (e.g., Gaussian simulation methods1). Essentially, any simulation algorithm imposes higher-order statistics beyond the control of the reservoir modeler. These imposed higher-order statistics, termed multiple-point statistics, might conflict with the actual understanding of the reservoir geology. The novel approach presented is based on the fact that outcrop or any other source of ancillary geological information allows us to borrow spatial structures beyond the variogram, which is only a two-point correlation measure.
These patterns are borrowed in the form of multiple-point statistics from so-called training images, allowing a better description of the complex reservoir geology. Such training images could be as simple as a series of hand-drawn sketches by the geologist or a compilation of outcrop data (there may be several at different scales). If enough ancillary geological information is present, it should be possible to construct three-dimensional (3D) training images. If not enough geological information is present, one can fall back on the traditional variogram-based method. Although the proposed methodology is general, this paper shows the application of the novel approach to a North Sea reservoir dataset and attempts to make comparisons with the variogram-based methodology. The comparison is based on the flow performance of a set of reservoir models generated with each geostatistical technique. The geology of many North Sea reservoirs is very heterogeneous due to the presence of high-permeability fluvial channels.2 The amount of hard data available along wells is typically sparse and the soft data (seismic) show a low correlation with petrophysical or facies properties within the reservoir. Hence, the construction of a representative prior structural model, accurately representing the reservoir geology, is of crucial importance. Data Sets The reservoir under study is a Triassic fluvial reservoir typical of a large number of fields in the North Sea. The fluvial channel formation was deposited by streams that range from braided to low-moderate sinuosity. The reservoir is made up of complex patterns of sand intercalated in a silty mudstone matrix. The reservoir is characterized by a trend of upward increasing sandiness. Well-defined fluvial channels of sandstones embedded in a mudstone matrix occur towards the base, while interstratified channels occur towards the top. For more-detailed information about the geology of such reservoirs refer to Refs. 2 and 3. True Reservoir. 
The purpose of this study is to evaluate the impact of alternative geostatistical reservoir models on the result of a flow simulation in a setting approaching a real case. To provide a common reference, a true reservoir, whose properties are exhaustively known at each gridblock, must be established. In order to keep the number of variables limited and be able to make a conclusive comparison, the reservoir is described by two facies only: channel sand and mudstone facies. For the true reservoir, a Boolean (object-based) simulation of channels was constructed from a detailed geological description of the channeling in actual North Sea reservoirs. For more details on the Boolean algorithm used, see Ref. 4. Selected slices of the true reservoir are shown in Fig. 1. The reservoir has the following general characteristics: The reservoir is discretized into 37×66×15 gridblocks in the x, y, and vertical z directions, respectively.
Styles APA, Harvard, Vancouver, ISO, etc.
22

Williamson, N. « Application of a one-dimensional geostatistical procedure to fisheries acoustic surveys of Alaskan pollock ». ICES Journal of Marine Science 53, no 2 (avril 1996) : 423–28. http://dx.doi.org/10.1006/jmsc.1996.0059.

Full text
Styles APA, Harvard, Vancouver, ISO, etc.
23

Folle, Daiane, João F. Costa, Jair C. Koppe et André C. Zingano. « A Procedure to Quantify the Variability of Geotechnical Properties ». Soils and Rocks 31, no 3 (1 septembre 2008) : 127–35. http://dx.doi.org/10.28927/sr.313127.

Full text
Abstract:
The geotechnical properties of soil should be considered for several civil engineering purposes. Geotechnical information is used for urban planning, environmental management, slope stability analysis, and foundation design, among others. Given the importance that geotechnical information assumes in several engineering applications, geotechnical mapping is deemed relevant. Methods for integrating field tests and quantifying estimation uncertainty in the construction of these geotechnical maps are preferably used in the decision-making process. A methodology to build this kind of map is proposed based on geostatistical stochastic simulation. Maps covering an area of 4 km2 were built, based on the information derived from 141 boreholes where standard penetration tests (SPT) were carried out. Sequential Gaussian simulation was used for building these maps, since it reproduces the data statistics and spatial continuity. The soil resistance to penetration of panels of 100 x 100 m2 was estimated and the estimation error was calculated. The results demonstrate the appropriateness and usefulness of the methodology for mapping geotechnical attributes.
Styles APA, Harvard, Vancouver, ISO, etc.
24

Narciso, João, Cristina Paixão Araújo, Leonardo Azevedo, Ruben Nunes, João Filipe Costa et Amílcar Soares. « A Geostatistical Simulation of a Mineral Deposit using Uncertain Experimental Data ». Minerals 9, no 4 (23 avril 2019) : 247. http://dx.doi.org/10.3390/min9040247.

Full text
Abstract:
In the geostatistical modeling and characterization of natural resources, the traditional approach for determining the spatial distribution of a given deposit using stochastic sequential simulation is to use the existing experimental data (i.e., direct measurements) of the property of interest as if there is no uncertainty involved in the data. However, any measurement is prone to error from different sources, for example from the equipment, the sampling method, or the human factor. It is also common to have distinct measurements for the same property with different levels of resolution and uncertainty. There is a need to assess the uncertainty associated with the experimental data and integrate it during the modeling procedure. This process is not straightforward and is often overlooked. For the reliable modeling and characterization of a given ore deposit, measurement uncertainties should be included as an intrinsic part of the geo-modeling procedure. This work proposes the use of a geostatistical simulation algorithm to integrate uncertain experimental data through the use of stochastic sequential simulations with local probability functions. The methodology is applied to the stochastic modeling of a benchmark mineral deposit, where certain and uncertain experimental data co-exist. The uncertain data is modeled by assigning individual probability distribution functions to each sample location. Different strategies are proposed to build these local probability distributions. Each scenario represents variable degrees of uncertainty. The impacts of the different modeling approaches on the final deposit model are discussed. The resulting models of these proposed scenarios are also compared against those retrieved from previous studies that use conventional geostatistical simulation. 
The results from the proposed approaches showed that using stochastic sequential simulation with local probability functions to represent local uncertainties decreased the estimation error of the resulting model, producing fewer misclassified ore blocks.
Styles APA, Harvard, Vancouver, ISO, etc.
25

Hunziker, Jürg, Eric Laloy et Niklas Linde. « Inference of multi-Gaussian relative permittivity fields by probabilistic inversion of crosshole ground-penetrating radar data ». GEOPHYSICS 82, no 5 (1 septembre 2017) : H25—H40. http://dx.doi.org/10.1190/geo2016-0347.1.

Full text
Abstract:
In contrast to deterministic inversion, probabilistic Bayesian inversion provides an ensemble of solutions that can be used to quantify model uncertainty. We have developed a probabilistic inversion approach that uses crosshole first-arrival traveltimes to estimate an underlying geostatistical model, the subsurface structure, and the standard deviation of the data error simultaneously. The subsurface is assumed to be represented by a multi-Gaussian field, which allows us to reduce the dimensionality of the problem significantly. Compared with previous applications in hydrogeology, the novelties of this study include an improved dimensionality-reduction algorithm that avoids streaking artifacts, the first application to geophysics, and the first application to field data. The results of a synthetic example show that the model domain enclosed by one borehole pair is generally too small to provide reliable estimates of the geostatistical variables. A real-data example based on two borehole pairs confirms these findings and demonstrates that the inversion procedure also works under realistic conditions with, for example, unknown measurement errors.
Styles APA, Harvard, Vancouver, ISO, etc.
26

Namysłowska-Wilczyńska, Barbara. « Geostatistical analysis of space variation in underground water various quality parameters in Kłodzko water intake area (SW part of Poland) ». Studia Geotechnica et Mechanica 38, no 3 (1 septembre 2016) : 15–34. http://dx.doi.org/10.1515/sgem-2016-0022.

Full text
Abstract:
Abstract This paper presents selected results of research connected with the development of a 3D geostatistical hydrogeochemical model of the Kłodzko Drainage Basin, dedicated to the spatial variation in the different quality parameters of underground water in the water intake area (SW part of Poland). The research covers the period 2011-2012. Spatial analyses of the variation in various quality parameters, i.e., the contents of iron, manganese, ammonium ion, nitrate ion, phosphate ion and total organic carbon, as well as pH, redox potential and temperature, were carried out on the basis of the chemical determinations of the quality parameters of underground water samples taken from the wells in the water intake area. Spatial variation in the parameters was analyzed on the basis of data obtained (November 2011) from tests of water taken from 14 existing wells with depths ranging from 9.5 to 38.0 m b.g.l. The latest data (January 2012) were obtained from 3 new piezometers, 9-10 m deep, installed at other locations in the relevant area. Data derived from the 14 wells (2011) and the 14 wells + 3 piezometers (2012) were subjected to spatial analyses using geostatistical methods. Basic statistics of the quality parameters, including histograms of their distributions, scatter diagrams and correlation coefficient values r, are presented. The directional semivariogram function γ(h) and the ordinary (block) kriging procedure were used to build the 3D geostatistical model. The geostatistical parameters of the theoretical models of the directional semivariograms of the water quality parameters under study, calculated along the well depth (taking into account the terrain elevation), were used in the ordinary (block) kriging estimation. The obtained estimation results, i.e., block diagrams, allowed us to determine the levels of increased values of the estimated averages Z* of the underground water quality parameters.
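The semivariogram γ(h) used in that model is estimated from point pairs; an omnidirectional version can be sketched as follows (the lag tolerance and the tiny data set are illustrative assumptions):

```python
import numpy as np

def experimental_semivariogram(coords, values, lags, tol):
    """Experimental semivariogram: gamma(h) is the mean of
    0.5 * (z_i - z_j)^2 over point pairs whose separation distance
    lies within `tol` of each lag h."""
    coords = np.asarray(coords, dtype=float)
    values = np.asarray(values, dtype=float)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    out = []
    for h in lags:
        mask = np.triu(np.abs(d - h) <= tol, k=1)  # count each pair once
        out.append(sq[mask].mean() if mask.any() else np.nan)
    return np.array(out)

# Four samples along a line; variability grows with separation
coords = [(0, 0), (1, 0), (2, 0), (3, 0)]
values = [1.0, 2.0, 4.0, 8.0]
gam = experimental_semivariogram(coords, values, lags=[1, 2], tol=0.1)
print(gam)  # gamma(1) = 3.5, gamma(2) = 11.25
```

A theoretical model (e.g., spherical or exponential) fitted to these points would then supply the parameters used in ordinary (block) kriging.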
Styles APA, Harvard, Vancouver, ISO, etc.
27

Pugliese, A., A. Castellarin et A. Brath. « Geostatistical prediction of flow–duration curves in an index-flow framework ». Hydrology and Earth System Sciences 18, no 9 (30 septembre 2014) : 3801–16. http://dx.doi.org/10.5194/hess-18-3801-2014.

Full text
Abstract:
Abstract. An empirical period-of-record flow–duration curve (FDC) describes the percentage of time (duration) in which a given streamflow was equaled or exceeded over a historical period of time. In many practical applications one has to construct FDCs in basins that are ungauged or where very few observations are available. We present an application strategy of top-kriging, which makes the geostatistical procedure capable of predicting FDCs in ungauged catchments. Previous applications of top-kriging mainly focused on the prediction of point streamflow indices (e.g. flood quantiles, low-flow indices, etc.); here the procedure is used to predict the entire curve in ungauged sites as a weighted average of standardised empirical FDCs through the traditional linear-weighting scheme of kriging methods. In particular, we propose to standardise empirical FDCs by a reference index-flow value (i.e. mean annual flow, or mean annual precipitation × the drainage area) and to compute the overall negative deviation of the curves from this reference value. We then propose to use these values, which we term total negative deviation (TND), for expressing the hydrological similarity between catchments and for deriving the geostatistical weights. We focus on the prediction of FDCs for 18 unregulated catchments located in central Italy, and we quantify the accuracy of the proposed technique under various operational conditions through an extensive cross-validation and sensitivity analysis. The cross-validation points out that top-kriging is a reliable approach for predicting FDCs, with Nash–Sutcliffe efficiency measures ranging from 0.85 to 0.96 (depending on the model settings), very low biases over the entire duration range, and an enhanced representation of the low-flow regime relative to other regionalisation models that were recently developed for the same study region.
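A minimal sketch of the two ingredients named above, an empirical FDC and its overall negative deviation from a reference index flow (the discrete mean used for TND here is an assumed simplification of the paper's definition):

```python
import numpy as np

def fdc(q):
    """Empirical flow-duration curve: flows sorted in decreasing order,
    with Weibull plotting-position durations d_i = i / (n + 1)."""
    q = np.sort(np.asarray(q, dtype=float))[::-1]
    d = np.arange(1, q.size + 1) / (q.size + 1)
    return d, q

def total_negative_deviation(q, index_flow):
    """Mean negative deviation of the standardised FDC below the
    reference value 1 (only durations where flow < index flow count)."""
    qs = np.asarray(q, dtype=float) / index_flow
    return float(np.mean(np.clip(1.0 - qs, 0.0, None)))

daily = np.array([0.2, 0.5, 1.0, 2.0, 6.3])       # synthetic flows, m^3/s
durations, flows = fdc(daily)
tnd = total_negative_deviation(daily, index_flow=daily.mean())
print(round(tnd, 3))  # -> 0.43
```

Catchments with similar TND values would then receive larger kriging weights, which is how the similarity measure enters the interpolation.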
Styles APA, Harvard, Vancouver, ISO, etc.
28

Xu, Zhiwei, James Irving, Yu Liu, Peimin Zhu et Klaus Holliger. « Conditional stochastic inversion of common-offset ground-penetrating radar reflection data ». GEOPHYSICS 86, no 5 (2 juillet 2021) : WB147—WB157. http://dx.doi.org/10.1190/geo2020-0639.1.

Full text
Abstract:
We have developed a stochastic inversion procedure for common-offset ground-penetrating radar (GPR) reflection measurements. Stochastic realizations of subsurface properties that offer an acceptable fit to GPR data are generated via simulated annealing optimization. The realizations are conditioned to borehole porosity measurements available along the GPR profile or equivalent measurements of another petrophysical property that can be related to the dielectric permittivity, as well as to geostatistical parameters derived from the borehole logs and the processed GPR image. Validation of our inversion procedure is performed on a pertinent synthetic data set and indicates that our method is capable of reliably recovering strongly heterogeneous porosity structures associated with surficial alluvial aquifers. This finding is largely corroborated through application of the methodology to field measurements from the Boise Hydrogeophysical Research Site near Boise, Idaho, USA.
Styles APA, Harvard, Vancouver, ISO, etc.
29

Persiano, Simone, Alessio Pugliese, Alberto Aloe, Jon Olav Skøien, Attilio Castellarin et Alberto Pistocchi. « Streamflow data availability in Europe : a detailed dataset of interpolated flow-duration curves ». Earth System Science Data 14, no 9 (29 septembre 2022) : 4435–43. http://dx.doi.org/10.5194/essd-14-4435-2022.

Full text
Abstract:
Abstract. For about 24 000 river basins across Europe, we provide a continuous representation of the streamflow regime in terms of empirical flow-duration curves (FDCs), which are key signatures of the hydrological behaviour of a catchment and are widely used for supporting decisions on water resource management as well as for assessing hydrologic change. In this study, FDCs are estimated by means of the geostatistical procedure termed total negative deviation top-kriging (TNDTK), starting from the empirical FDCs made available by the Joint Research Centre of the European Commission (DG-JRC) for about 3000 discharge measurement stations across Europe. Consistent with previous studies, TNDTK is shown to provide high accuracy for the entire study area, albeit with degrees of reliability that vary significantly across it. To provide this kind of information site by site, together with the estimated FDCs we provide, for each catchment, indicators of the accuracy and reliability of the performed large-scale geostatistical prediction. The dataset is freely available at the PANGAEA open-access library (Data Publisher for Earth & Environmental Science) at https://doi.org/10.1594/PANGAEA.938975 (Persiano et al., 2021b).
Styles APA, Harvard, Vancouver, ISO, etc.
30

Schoepfer, Valerie, Amy Burgin, Terry Loecke et Ashley Helton. « Seasonal Salinization Decreases Spatial Heterogeneity of Sulfate Reducing Activity ». Soil Systems 3, no 2 (2 avril 2019) : 25. http://dx.doi.org/10.3390/soilsystems3020025.

Full text
Abstract:
Evidence of sulfate input and reduction in coastal freshwater wetlands is often visible in the black iron monosulfide (FeS) complexes that form in iron-rich reducing sediments. Using a modified Indicator of Reduction in Soils (IRIS) method, digital imaging, and geostatistics, we examine controls on the spatial properties of FeS in a coastal wetland fresh-to-brackish transition zone over a multi-month, drought-induced saltwater incursion event. PVC sheets (10 × 15 cm) were painted with an iron oxide paint and incubated vertically belowground, flush with the surface, for 24 h along a salt-influenced to freshwater wetland transect in coastal North Carolina, USA. Along with the collection of complementary water and soil chemistry data, the size and location of the FeS compounds on each plate were photographed, and geostatistical techniques were employed to characterize FeS formation at the square-centimetre scale. Herein, we describe how the saltwater incursion front is associated with increased sulfate loading and decreased aqueous Fe(II) content. This accompanies an increased number of individual FeS complexes that were more uniformly distributed, as reflected in a lower Magnitude of Spatial Heterogeneity at all sites except the one furthest downstream. Future work should focus on streamlining the plate analysis procedure as well as developing a more robust statistics-based approach to determine sulfide concentrations.
31

Castiglioni, S., A. Castellarin, A. Montanari, J. O. Skøien, G. Laaha, and G. Blöschl. "Geostatistical regionalization of low-flow indices: PSBI and Top-Kriging." Hydrology and Earth System Sciences Discussions 7, no. 5 (September 23, 2010): 7231–61. http://dx.doi.org/10.5194/hessd-7-7231-2010.

Full text
Abstract:
Abstract. Recent studies highlight that geostatistical interpolation, which was originally developed for the spatial interpolation of point data, can be effectively applied to the problem of regionalization of hydrometric information. This study compares two innovative geostatistical approaches for the prediction of low-flows in ungauged basins. The first one, named Physiographic-Space Based Interpolation (PSBI), performs the spatial interpolation of the desired streamflow index (e.g., annual streamflow, low-flow index, flood quantile, etc.) in the space of catchment descriptors. The second technique, named Topological kriging or Top-Kriging, predicts the variable of interest along river networks taking both the area and nested nature of catchments into account. PSBI and Top-Kriging are applied for the regionalization of Q355 (i.e., the streamflow that is equalled or exceeded 355 days in a year, on average) over a broad geographical region in central Italy, which contains 51 gauged catchments. Both techniques are cross-validated through a leave-one-out procedure at all available gauges and applied to a subregion to produce a continuous estimation of Q355 along the river network extracted from a 90 m DEM. The results of the study show that Top-Kriging and PSBI present complementary features and have comparable performances (Nash-Sutcliffe efficiencies in cross-validation of 0.89 and 0.83, respectively). Both techniques provide plausible and accurate predictions of Q355 in ungauged basins and represent promising opportunities for regionalization of low-flows.
32

Jezierski, Pawel, and Cezary Kabala. "Geostatistical Tools to Assess Existing Monitoring Network of Forest Soils in a Mountainous National Park." Forests 12, no. 3 (March 11, 2021): 333. http://dx.doi.org/10.3390/f12030333.

Full text
Abstract:
Environmental changes in national parks are generally subject to constant observation. A particular case is parks located in mountains, which are more vulnerable to climate change and the binding of pollutants in mountain ranges as orographic barriers. The effectiveness of forest soil monitoring networks based on a systematic grid with a predetermined density has not been analysed so far. This study’s analysis was conducted in the Stolowe Mountains National Park (SMNP), SW Poland, using total Pb concentration data obtained from an initial network of 403 circle plots with centroids arranged in a regular 400 × 400 m square grid. The number and distribution of monitoring plots were analysed using geostatistical tools in terms of the accuracy and correctness of soil parameters obtained from spatial distribution imaging. The analysis also aimed at reducing the number of monitoring plots taking into account the economic and logistic aspects of the monitoring investigations in order to improve sampling efficiency in subsequent studies in the SMNP. The concept of the evaluation and modification of the monitoring network presented in this paper is an original solution and included first the reduction and then the extension of plot numbers. Two variants of reduced monitoring networks, constructed using the proposed procedure, allowed us to develop the correct geostatistical models, which were characterised by a slightly worse mean standardised error (MSE) and root mean squared error (RMSE) compared to errors from the original, regular monitoring network. Based on the new geostatistical models, the prediction of Pb concentration in soils in the reduced grids changed the spatial proportions of areas in different pollution classes to a limited extent compared to the original network.
33

Ferrari, Alessia, Marco D'Oria, Renato Vacondio, Paolo Mignosa, and Maria Giovanna Tanda. "Hydrograph estimation at upstream ungauged sections on the Secchia River (Italy) by means of a parallel Bayesian inverse methodology." E3S Web of Conferences 40 (2018): 06034. http://dx.doi.org/10.1051/e3sconf/20184006034.

Full text
Abstract:
In this work, we present a reverse flow routing procedure, which allows estimating discharge hydrographs at upstream ungauged stations by means of information available at downstream monitored sites. The reverse routing problem is solved adopting a Bayesian Geostatistical Approach (BGA). In order to capture the complex hydrodynamic field typical of many real rivers with large floodable areas, while overcoming computational-time limitations, we adopted as forward model a self-developed parallel 2D-SWE numerical model (PARFLOOD) that achieves ratios of physical to computational time of about 500-1000. To exploit the computational capabilities of modern GPU clusters, a parallel procedure to estimate the Jacobian matrix required by the BGA approach has been implemented. The inflow hydrograph in a river reach with several meanders and floodplains has been estimated in “only” 13 hours using an HPC cluster with 10 P100 Nvidia GPUs.
34

Roa-Ureta, Rubén, and Edwin Niklitschek. "Biomass estimation from surveys with likelihood-based geostatistics." ICES Journal of Marine Science 64, no. 9 (October 4, 2007): 1723–34. http://dx.doi.org/10.1093/icesjms/fsm149.

Full text
Abstract:
Abstract Roa-Ureta, R., and Niklitschek, E. 2007. Biomass estimation from surveys with likelihood-based geostatistics. – ICES Journal of Marine Science, 64. A likelihood-based geostatistical method for estimating fish biomass from survey data is presented. Biomass is estimated from a positive random variable with an additional discrete probability mass at zero, so the method accommodates both null observations and positive fish densities. The positive fish density data were used to estimate mean fish density in the subareas where the stock was present. A presence/absence representation of the data in the survey area was modelled with a generalized linear spatial model of the binomial family, leading to an estimate of the area effectively occupied by the stock. As an extension, a procedure is proposed to accommodate extra sources of correlation, such as multiple surveys or multiple vessels. The new methodology was applied to three cases. The simplest case is a scallop trawl survey for which only the positive density data need to be analysed. The intermediate case is a trawl survey of highly mobile squid where the stock area and the mean density inside the stock area are analysed. The most complex case is in estimating the biomass of very localized orange roughy, for which repeat surveys create dependence in the data in addition to spatial correlation.
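The delta-style split at the heart of the method — a presence/absence component times a positive-density component — can be sketched in a few lines. This is our simplification for illustration only: the spatial (kriging and binomial GLM) models of the paper are omitted, and the function name is hypothetical.

```python
def delta_biomass_estimate(densities, survey_area):
    """Delta-style biomass estimate: the proportion of non-zero hauls
    approximates the fraction of area effectively occupied, and the mean
    of the positive densities gives the density where the stock is present.
    Spatial correlation, handled geostatistically in the paper, is ignored."""
    positives = [d for d in densities if d > 0]
    p_presence = len(positives) / len(densities)     # occupied-area fraction
    mean_positive = sum(positives) / len(positives)  # mean positive density
    return mean_positive * p_presence * survey_area

# 10 hauls (t/km2), 4 of them empty, over a 100 km2 survey area
obs = [0, 0, 0, 0, 2.0, 3.0, 1.0, 4.0, 2.5, 2.5]
print(delta_biomass_estimate(obs, 100.0))  # -> 150.0
```

The paper's contribution is precisely in replacing the two naive sample means above with likelihood-based spatial models of each component.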
35

Avansi, Guilherme Daniel, Célio Maschio, and Denis José Schiozer. "Simultaneous History-Matching Approach by Use of Reservoir-Characterization and Reservoir-Simulation Studies." SPE Reservoir Evaluation & Engineering 19, no. 04 (June 14, 2016): 694–712. http://dx.doi.org/10.2118/179740-pa.

Full text
Abstract:
Summary Reservoir characterization is the key to success in history matching and production forecasting. Thus, numerical simulation becomes a powerful tool to achieve a reliable model by quantifying the effect of uncertainties in field development and management planning, calibrating a model with history data, and forecasting field production. History matching is integrated into several areas, such as geology (geological characterization and petrophysical attributes), geophysics (4D-seismic data), statistical approaches (Bayesian theory and Markov field), and computer science (evolutionary algorithms). Although most integrated-history-matching studies use a single objective function (OF), this alone is not sufficient. History matching by simultaneous calibration of different OFs is necessary because all OFs must be within the acceptance range as well as maintain the consistency of generated geological models during reservoir characterization. The main goal of this work is to integrate history matching and reservoir characterization, applying a simultaneous calibration of different OFs in a history-matching procedure, and keeping the geological consistency in an adjustment approach to reliably forecast production. We also integrate virtual wells and geostatistical methods into the reservoir characterization to ensure realistic geomodels, avoiding geological discontinuities, to match the reservoir numerical model. The proposed methodology comprises a geostatistical method to model the spatial reservoir-property distribution on the basis of the well-log data; numerical simulation; and adjusting conditional realizations (models) on the basis of geological modeling (variogram model, vertical-proportion curve, and regularized well-log data). In addition, reservoir uncertainties are included, simultaneously adjusting different OFs to evaluate the history-matching process and virtual wells to perturb geological continuities.
This methodology effectively preserves the consistency of geological models during the history-matching process. We also simultaneously combine different OFs to calibrate and validate the models with well-production data. Reliable numerical and geological models are used in forecasting production under uncertainties to validate the integrated procedure.
36

Soltani-Mohammadi, Saeed, and Mohammad Safa. "A Simulated Annealing based Optimization Algorithm for Automatic Variogram Model Fitting." Archives of Mining Sciences 61, no. 3 (September 1, 2016): 635–49. http://dx.doi.org/10.1515/amsc-2016-0045.

Full text
Abstract:
Abstract. Fitting a theoretical model to an experimental variogram is an important issue in geostatistical studies because if the variogram model parameters are tainted with uncertainty, that uncertainty will propagate into the results of estimations and simulations. Although the most popular fitting method is fitting by eye, in some cases automatic fitting, which combines geostatistical principles with optimization techniques, is used to: 1) provide a basic model to improve fitting by eye, 2) fit a model to a large number of experimental variograms in a short time, and 3) incorporate the variogram-related uncertainty in the model fitting. Effort has been made in this paper to improve the quality of the fitted model by improving the popular objective function (weighted least squares) in the automatic fitting. Also, since the variogram model function and the number of structures (m) also affect the model quality, a program has been provided in MATLAB that can present optimum nested variogram models using the simulated annealing method. Finally, to select the most desirable model from among the single/multi-structured fitted models, the cross-validation method has been used, and the best model is presented to the user as the output. To check the capability of the proposed objective function and procedure, 3 case studies have been presented.
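As a rough sketch of what such an automatic fit looks like — a spherical variogram model, a weighted-least-squares objective, and a simulated-annealing search — the following illustrative Python may help. It is our own simplification, not the authors' MATLAB program: the cooling schedule, move size, and single-structure model are all assumptions.

```python
import math
import random

def spherical(h, nugget, sill, rang):
    # spherical variogram model with range `rang`
    if h >= rang:
        return nugget + sill
    r = h / rang
    return nugget + sill * (1.5 * r - 0.5 * r ** 3)

def wls(params, lags, gammas, npairs):
    # Cressie-style weighted least squares: weights = N(h) / gamma_model^2
    nugget, sill, rang = params
    total = 0.0
    for h, g, n in zip(lags, gammas, npairs):
        m = spherical(h, nugget, sill, rang)
        total += n * (g - m) ** 2 / max(m, 1e-12) ** 2
    return total

def anneal_fit(lags, gammas, npairs, steps=20000):
    # simulated annealing over (nugget, sill, range), multiplicative moves
    random.seed(0)
    cur = [0.1, max(gammas), max(lags)]
    fcur = wls(cur, lags, gammas, npairs)
    best, fbest = cur[:], fcur
    for k in range(steps):
        t = 1.0 * (1 - k / steps) + 1e-6              # linear cooling
        cand = [p * math.exp(random.gauss(0.0, 0.05)) for p in cur]
        fc = wls(cand, lags, gammas, npairs)
        # accept improvements always, worse moves with Metropolis probability
        if fc < fcur or random.random() < math.exp((fcur - fc) / t):
            cur, fcur = cand, fc
            if fc < fbest:
                best, fbest = cand[:], fc
    return best
```

The multiplicative moves keep all three parameters positive; the paper additionally searches over the model function and number of nested structures, which is omitted here.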
37

Storm, B., K. Høgh Jensen, and J. C. Refsgaard. "Estimation of Catchment Rainfall Uncertainty and its Influence on Runoff Prediction." Hydrology Research 19, no. 2 (April 1, 1988): 77–88. http://dx.doi.org/10.2166/nh.1988.0006.

Full text
Abstract:
Interpolation of spatially varying point precipitation depths introduces uncertainties in the estimated mean areal precipitation (MAP). This paper describes a geostatistical approach – the Kriging method – to calculate the daily MAP on a real-time basis. The procedure provides a linear unbiased estimate with minimum estimation variance. The structural analysis of the random precipitation field is automated by relating the time-varying semivariogram model to the sample variance. This is illustrated on data from a Danish IHD catchment. The conceptual rainfall-runoff model NAM incorporated into a Kalman-filter algorithm is applied to investigate the effects of uncertainties in MAP on the runoff predictions. Measurement and processing errors are not included in the investigation.
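The Kriging step behind such a MAP estimate reduces to solving the ordinary-kriging system for the gauge weights. A toy sketch with an illustrative exponential semivariogram (the paper's time-varying, variance-scaled model is not reproduced here):

```python
import math

def solve(A, b):
    # naive Gaussian elimination with partial pivoting
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][c] * x[c] for c in range(i + 1, n))) / M[i][i]
    return x

def ok_weights(coords, target, gamma):
    # ordinary kriging system: [Gamma 1; 1' 0] [w; mu] = [gamma_0; 1]
    n = len(coords)
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    A = [[gamma(dist(coords[i], coords[j])) for j in range(n)] + [1.0]
         for i in range(n)]
    A.append([1.0] * n + [0.0])
    b = [gamma(dist(c, target)) for c in coords] + [1.0]
    return solve(A, b)[:n]   # drop the Lagrange multiplier

gamma = lambda h: 1.0 - math.exp(-h / 5.0)   # illustrative exponential model
w = ok_weights([(0, 0), (2, 0)], (1, 0), gamma)
print(w)   # symmetric setup -> weights of about 0.5 each
```

The unit-sum constraint (enforced by the Lagrange multiplier row) is what makes the estimate unbiased; the minimum estimation variance follows from the semivariogram entries.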
38

Mucha, Jacek, and Monika Wasilewska-Błaszczyk. "Geostatistical support for categorization of metal ore resources in Poland." Gospodarka Surowcami Mineralnymi 31, no. 4 (December 1, 2015): 21–34. http://dx.doi.org/10.1515/gospo-2015-0035.

Full text
Abstract:
Abstract The authors attempted to introduce some components of the Australasian JORC Code system to the categorization of Polish Cu-Ag and Zn-Pb ore resources. The proposed geostatistical method of resource categorization applies two criteria: continuity of deposit parameters described by semivariograms and permissible, relative standard error of resources estimation determined with the ordinary kriging procedure. Considering the first criterion, we propose the following values of autocorrelation coefficients, which define the ranges (distances) of the resources categories around the measurement sites (e.g., exploration wells): “measured” category (A + B in the Polish system) – the values of the autocorrelation coefficient from 1 to 2/3, “indicated” category (C1 in the Polish system) – the values of the autocorrelation coefficient from 2/3 to 1/3, “inferred” category (partly C2 in the Polish system) – the values of the autocorrelation coefficient from 1/3 to 1/20, “out-of-doors” category (partly D in the Polish system) – the values of autocorrelation coefficient from 1/20 to 0. The second criterion of resources categorization is based upon the relative, standard errors of resources estimations calculated for the parts of deposit defined with the first criterion. The following permissible values of errors determined as the errors of ordinary kriging have been proposed: “measured” category (A + B in the Polish system) – 10% error, “indicated” category (C1 in the Polish system) – 20% error, “inferred” category (partly C2 in the Polish system) – 30% error, “out-of-doors” category (partly D in the Polish system) – 50% error. It was found that the Polish metal ore deposits reveal low continuity of deposit parameters, as indicated by a high share of nugget variance in the overall variability of these parameters. 
Moreover, an inconsistency was observed between the semivariograms of deposit parameters based upon samplings of drill cores and underground mine workings, which results in extreme differences in the ranges of resources categories around the sampling sites. This, in turn, causes radical discrepancies in estimated resources. Thus, it was concluded that the sufficiently credible categorization of resources is only possible when a significant part of the deposit is explored with mine workings, in which the grid of sampling sites is much denser than that of exploration wells. It was proposed that the principal criterion of resources categorization should be the permissible error of estimations whereas the continuity of deposit parameters should only be a supplementary criterion.
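The two-criteria categorization proposed above can be expressed as a small decision function. The threshold values are taken from the abstract; the demotion rule combining the criteria is our reading of the proposal, in which the permissible kriging error acts as the principal criterion and continuity as the supplementary one.

```python
def resource_category(autocorr, rel_kriging_error_pct):
    """Assign a resource category from the autocorrelation coefficient
    (continuity criterion) and the relative ordinary-kriging error (%).
    Thresholds follow the proposal in the abstract."""
    by_corr = ("measured" if autocorr >= 2 / 3 else
               "indicated" if autocorr >= 1 / 3 else
               "inferred" if autocorr >= 1 / 20 else
               "out-of-doors")
    limits = {"measured": 10, "indicated": 20, "inferred": 30, "out-of-doors": 50}
    order = ["measured", "indicated", "inferred", "out-of-doors"]
    # demote until the permissible-error criterion is also satisfied
    i = order.index(by_corr)
    while i < len(order) - 1 and rel_kriging_error_pct > limits[order[i]]:
        i += 1
    return order[i]

print(resource_category(0.8, 15))  # -> indicated
```

A block with high continuity (autocorrelation 0.8) but a 15% kriging error exceeds the 10% "measured" limit and is demoted to "indicated", reflecting the paper's conclusion that the estimation error should dominate the classification.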
39

Lamy, Philippe, P. A. Swaby, P. S. Rowbotham, Olivier Dubrule, and A. Haas. "From Seismic to Reservoir Properties With Geostatistical Inversion." SPE Reservoir Evaluation & Engineering 2, no. 04 (August 1, 1999): 334–40. http://dx.doi.org/10.2118/57476-pa.

Full text
Abstract:
Summary The methodology presented in this paper incorporates seismic data, geological knowledge and well logs to produce models of reservoir parameters and uncertainties associated with them. A three-dimensional (3D) seismic dataset is inverted within a geological and stratigraphic model using the geostatistical inversion technique. Several reservoir-scale acoustic impedance blocks are obtained and quantification of uncertainty is determined by computing statistics on these 3D blocks. Combining these statistics with the kriging of the reservoir parameter well logs allows the transformation of impedances into reservoir parameters. This combination is similar to performing a collocated cokriging of the acoustic impedances. Introduction Our geostatistical inversion approach is used to invert seismic traces within a geological and stratigraphic model. At each seismic trace location, a large number of acoustic impedance (AI) traces are generated by conditional simulation, and a local objective function is minimized to find the trace that best fits the actual seismic trace. Several three-dimensional (3D) AI realizations are obtained, all of which are constrained by both the well logs and seismic data. Statistics are then computed in each stratigraphic cell of the 3D results to quantify the nonuniqueness of the solution and to summarize the information provided by individual realizations. Finally, AI are transformed into other reservoir parameters such as Vshale through a statistical petrophysical relationship. This transformation is used to map Vshale between wells, by combining information derived from Vshale logs with information derived from AI blocks. The final block(s) can then be mapped from the time to the depth domain and used for building the flow simulation models or for defining reservoir characterization maps (e.g., net to gross, hydrocarbon pore volume). We illustrate the geostatistical inversion method with results from an actual case study. 
The construction of the a-priori model in time, the inversion, and the final reservoir parameters in depth are described. These results show the benefit of a multidisciplinary approach, and illustrate how the geostatistical inversion method provides clear quantification of uncertainties affecting the modeling of reservoir properties between wells. Methodology The Geostatistical Inversion Approach. This methodology was introduced by Bortoli et al.1 and Haas and Dubrule.2 It is also discussed in Dubrule et al.3 and Rowbotham et al.4 Its application on a synthetic case is described in Dubrule et al.5 A brief review of the method will be presented here, emphasizing how seismic data and well logs are incorporated into the inversion process. The first step is to build a geological model of the reservoir in seismic time. Surfaces are derived from sets of picks defining the interpreted seismic. These surfaces are important since they delineate the main layers of the reservoir and, as we will see below, the statistical model associated with these layers, and they control the 3D stratigraphic grid construction. The structure of this grid (onlap, eroded, or proportional) depends on the geological context. The maximum vertical discretization may be higher than that of the seismic, typically from 1 to 4 milliseconds. The horizontal discretization is equal to the number of seismic traces to invert in each direction (one trace per cell in map view). Raw AI logs at the wells have to be located within this stratigraphic grid since they will be used as conditioning data during the inversion process. It is essential that well logs should be properly calibrated with the seismic. This implies that a representative seismic wavelet has been matched to the wells, by comparing the convolved reflectivity well log response with the seismic response at the same location.
This issue is described more fully in Rowbotham et al.4 Geostatistical parameters are determined by using both the wells and seismic data. Lateral variograms are computed from the seismic mapped into the stratigraphic grid. Well logs are used to both give an a priori model (AI mean and standard deviation) per stratum and to compute vertical variograms. The geostatistical inversion process can then be started. A random path is followed by the simulation procedure, and at each randomly drawn trace location AI trace values can be generated by sequential Gaussian simulation (SGS). A large number of AI traces are generated at the same location and the corresponding reflectivities are calculated. After convolution with the wavelet, the AI trace that leads to the best fit with the actual seismic is kept and merged with the wells and the previously simulated AI traces. The 3D block is therefore filled sequentially, trace after trace (see Fig. 1). It is possible to ignore the seismic data in the simulation process by generating only one trace at any (X, Y) location and automatically keeping it as "the best one." In this case, realizations are only constrained by the wells and the geostatistical model (a-priori parameters and variograms).
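The per-trace selection loop described above — simulate many candidate AI traces, convolve their reflectivity with the wavelet, keep the best match to the observed seismic — can be sketched as follows. This is a deliberately stripped-down illustration: the sequential Gaussian simulation step is replaced by a plain list of pre-drawn candidates, and all names are ours.

```python
def reflectivity(ai):
    # acoustic impedance series -> reflection coefficients at interfaces
    return [(ai[i + 1] - ai[i]) / (ai[i + 1] + ai[i]) for i in range(len(ai) - 1)]

def convolve(r, w):
    # full discrete convolution of reflectivity with the wavelet
    out = [0.0] * (len(r) + len(w) - 1)
    for i, ri in enumerate(r):
        for j, wj in enumerate(w):
            out[i + j] += ri * wj
    return out

def invert_trace(seismic_trace, candidates, wavelet):
    # keep the candidate AI trace whose synthetic best fits the seismic
    best, best_err = None, float("inf")
    for ai in candidates:
        synth = convolve(reflectivity(ai), wavelet)
        err = sum((s - o) ** 2 for s, o in zip(synth, seismic_trace))
        if err < best_err:
            best, best_err = ai, err
    return best

wavelet = [1.0, 0.5]
true_ai = [1.0, 2.0, 1.5, 3.0]
seismic = convolve(reflectivity(true_ai), wavelet)
candidates = [[1.0, 1.2, 1.4, 1.6], true_ai, [2.0, 2.0, 2.0, 2.0]]
print(invert_trace(seismic, candidates, wavelet))  # -> [1.0, 2.0, 1.5, 3.0]
```

In the actual method the retained trace is merged with the wells and previously simulated traces before the random path moves on, so later candidates are conditioned on earlier selections.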
40

Erdin, Rebekka, Christoph Frei, and Hans R. Künsch. "Data Transformation and Uncertainty in Geostatistical Combination of Radar and Rain Gauges." Journal of Hydrometeorology 13, no. 4 (August 1, 2012): 1332–46. http://dx.doi.org/10.1175/jhm-d-11-096.1.

Full text
Abstract:
Abstract Geostatistics provides a popular framework for deriving high-resolution quantitative precipitation estimates (QPE) by combining radar and rain gauge data. However, the skewed and heteroscedastic nature of precipitation is in contradiction to assumptions of classical geostatistics. This study examines the potential of trans-Gaussian kriging to overcome this contradiction. Combination experiments are undertaken with kriging with external drift (KED) using several settings of the Box–Cox transformation. Hourly precipitation data in Switzerland for the year 2008 serve as test bed to compare KED with and without transformation. The impact of transformation is examined with regard to compliance with model assumptions, accuracy of the point estimate, and reliability of the probabilistic estimate. Data transformation improves the compliance with model assumptions, but some level of contradiction remains in situations with many dry gauges. Very similar point estimates are found for KED with untransformed and appropriately transformed data. However, care is needed to avoid excessive transformation (close to log) because this can introduce a positive bias. Strong benefits from transformation are found for the probabilistic estimate, which is rendered positively skewed, sensitive to precipitation amount, and quantitatively more reliable. Without transformation, 44% of all precipitation observations larger than 5 mm h−1 are considered as extremely unlikely by the probabilistic estimate in the test application. Transformation reduces this rate to 4%. Although transformation cannot fully remedy the complications for geostatistics in radar–gauge combination, it seems a useful procedure if realistic and reliable estimation uncertainties are desired, such as for the stochastic simulation of QPE ensembles.
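For reference, the Box–Cox family used for the transformation, together with its naive inverse, can be written as follows (the λ = 0 branch recovers the log transform that the study cautions against):

```python
import math

def box_cox(y, lam):
    # Box-Cox transform of a positive value y; lam = 0 gives log(y)
    return math.log(y) if lam == 0 else (y ** lam - 1) / lam

def inv_box_cox(z, lam):
    # naive inverse back to the original (precipitation) scale
    return math.exp(z) if lam == 0 else (lam * z + 1) ** (1 / lam)

print(inv_box_cox(box_cox(9.0, 0.5), 0.5))  # -> 9.0
```

Note that back-transforming the mean of a predictive distribution with this naive inverse is what introduces bias for strongly skewed data, which is one reason the study warns against excessive transformation (λ close to 0).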
41

Vieira, Sidney Rosa, José Ruy Porto de Carvalho, and Antonio Paz González. "Jack knifing for semivariogram validation." Bragantia 69, suppl (2010): 97–105. http://dx.doi.org/10.1590/s0006-87052010000500011.

Full text
Abstract:
The semivariogram function fitting is the most important aspect of geostatistics, and because of this the model chosen must be validated. Jack knifing may be one of the most efficient ways to perform this validation. The objective of this study was to show the use of the jack knifing technique to validate geostatistical hypotheses and semivariogram models. For that purpose, topographical height data obtained from six distinct field scales and sampling densities were analyzed. Because the topographical data showed a very strong trend for all fields, as verified by the absence of a sill in the experimental semivariograms, the trend was removed with a trend surface fitted by least squares. Semivariogram models were fitted with different techniques and the jack knifing results obtained with them were compared. The jack knifing parameters analyzed were the intercept, slope and correlation coefficient between measured and estimated values, and the mean and variance of the errors calculated as the difference between measured and estimated values divided by the square root of the estimation variances. The ideal number of neighbors used in each estimation was also studied using the jack knifing procedure. The jack knifing results were useful for judging the adequacy of the fitted models, independent of scale and sampling density. It was concluded that the manually fitted semivariogram models produced better jack knifing parameters because the user has the freedom to choose a better fit in distinct regions of the semivariogram.
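The jack-knifing loop itself — re-estimating each datum from the others and standardising the error by the estimation standard deviation — can be sketched generically. `estimator` below is a placeholder for a kriging routine that returns both the estimate and its kriging variance; the trivial mean-based estimator in the example is for illustration only.

```python
def jackknife_stats(points, values, estimator):
    """Leave-one-out validation: re-estimate each datum from the others
    and return the mean and variance of the standardised errors
    (ideally close to 0 and 1 for a well-chosen semivariogram model).
    estimator(train_pts, train_vals, target) -> (estimate, est_variance)."""
    errs = []
    for i in range(len(points)):
        pts = points[:i] + points[i + 1:]
        vals = values[:i] + values[i + 1:]
        est, var = estimator(pts, vals, points[i])
        errs.append((values[i] - est) / var ** 0.5)
    n = len(errs)
    mean = sum(errs) / n
    variance = sum((e - mean) ** 2 for e in errs) / n
    return mean, variance

# toy check: a constant field gives zero standardised errors
def mean_estimator(pts, vals, target):
    return sum(vals) / len(vals), 1.0   # (estimate, estimation variance)

print(jackknife_stats([(0, 0), (1, 0), (2, 0)], [3.0, 3.0, 3.0], mean_estimator))  # -> (0.0, 0.0)
```

With a kriging estimator plugged in, the same loop yields the intercept, slope and correlation diagnostics discussed in the abstract by regressing estimates against measurements.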
42

El Baba, Moustafa, Prabin Kayastha, Marijke Huysmans, and Florimond De Smedt. "Groundwater Vulnerability and Nitrate Contamination Assessment and Mapping Using DRASTIC and Geostatistical Analysis." Water 12, no. 7 (July 16, 2020): 2022. http://dx.doi.org/10.3390/w12072022.

Full text
Abstract:
The Gaza Strip is in a chronic state of water shortage, and the coastal aquifer, its only freshwater source, is increasingly depleted and polluted, especially by nitrate. Assessment of groundwater vulnerability to pollution is essential for adequate protection and management. In this study, the assessment of the aquifer vulnerability to contamination is derived by applying the DRASTIC procedure, first with the original default weights and ratings and, second, improved by estimating rating values by multiple linear regression of observed log-transformed nitrate concentration in groundwater, with the DRASTIC factors extended to land use. The results are very different because high and low vulnerability areas shift considerably. Subsequently, a geostatistical analysis of the spatial distribution of the nitrate concentration is performed, first by ordinary kriging interpolation of the observed nitrate concentration and second by regression kriging using the DRASTIC factors and land use as indicators of the spatial variation in nitrate occurrence. These maps differ because the map obtained by regression kriging shows much more detail of environmental factors such as dunes, ridges, soil types and built-up areas that affect the presence of nitrate in groundwater. The results of this study can be used by the Palestinian authorities concerned with sustainable groundwater management in the Gaza Strip.
43

Ferrari, Alessia, Marco D'Oria, Renato Vacondio, Alessandro Dal Palù, Paolo Mignosa, and Maria Giovanna Tanda. "Discharge hydrograph estimation at upstream-ungauged sections by coupling a Bayesian methodology and a 2-D GPU shallow water model." Hydrology and Earth System Sciences 22, no. 10 (October 16, 2018): 5299–316. http://dx.doi.org/10.5194/hess-22-5299-2018.

Full text
Abstract:
Abstract. This paper presents a novel methodology for estimating the unknown discharge hydrograph at the entrance of a river reach when no information is available. The methodology couples an optimization procedure based on the Bayesian geostatistical approach (BGA) with a forward self-developed 2-D hydraulic model. In order to accurately describe the flow propagation in real rivers characterized by large floodable areas, the forward model solves the 2-D shallow water equations (SWEs) by means of a finite volume explicit shock-capturing algorithm. The two-dimensional SWE code exploits the computational power of graphics processing units (GPUs), achieving a ratio of physical to computational time of up to 1000. With the aim of enhancing the computational efficiency of the inverse estimation, the Bayesian technique is parallelized, developing a procedure based on the Secure Shell (SSH) protocol that allows one to take advantage of remote high-performance computing clusters (including those available on the Cloud) equipped with GPUs. The capability of the methodology is assessed by estimating irregular and synthetic inflow hydrographs in real river reaches, also taking into account the presence of downstream corrupted observations. Finally, the procedure is applied to reconstruct a real flood wave in a river reach located in northern Italy.
44

Mello, Carlos Rogério de, Léo Fernandes Ávila, Lloyd Darrell Norton, Antônio Marciano da Silva, José Márcio de Mello, and Samuel Beskow. "Spatial distribution of top soil water content in an experimental catchment of Southeast Brazil." Scientia Agricola 68, no. 3 (June 2011): 285–94. http://dx.doi.org/10.1590/s0103-90162011000300003.

Full text
Abstract:
Soil water content is essential to understand the hydrological cycle. It controls surface runoff generation, water infiltration, soil evaporation and plant transpiration. This work aims to analyze the spatial distribution of top soil water content and to characterize the spatial mean and standard deviation of top soil water content over time in an experimental catchment located in the Mantiqueira Range region, state of Minas Gerais, Brazil. Measurements of top soil water content were carried out every 15 days, between May/2007 and May/2008. Using time-domain reflectometry (TDR) equipment, 69 points were sampled in the top 0.2 m of the soil profile. Geostatistical procedures were applied in all steps of the study. First, the spatial continuity was evaluated, and the experimental semivariogram was modeled. For the development of top soil water content maps over time, a co-kriging procedure was used with slope as a secondary variable. The rainfall regime controlled the top soil water content during the wet season. Land use was another fundamental local factor. The spatial standard deviation had low values under dry conditions and high values under wet conditions; thus, more variability occurs under wet conditions.
45

Lark, R. M. "A geostatistical extension of the sectioning procedure for disaggregating soil information to the scale of functional models of soil processes." Geoderma 95, no. 1-2 (March 2000): 89–112. http://dx.doi.org/10.1016/s0016-7061(99)00086-5.

Full text
46

Lam, L., and D. G. Fredlund. "A general limit equilibrium model for three-dimensional slope stability analysis." Canadian Geotechnical Journal 30, no. 6 (December 1, 1993): 905–19. http://dx.doi.org/10.1139/t93-089.

Full text
Abstract:
A generalized model for three-dimensional analysis, using the method of columns, is presented. The model is an extension of the two-dimensional general limit equilibrium formulation. Intercolumn force functions of arbitrary shape can be specified to simulate various directions for the intercolumn resultant forces. A unique feature of the model involves the use of a geostatistical procedure (i.e., the Kriging technique) in modelling the geometry of the slope, the stratigraphy, the potential slip surface, and the pore-water pressure conditions. The technique simplifies the data-input procedure and expedites the column discretization and the factor of safety computations. The shape of the intercolumn force functions was investigated for several slope geometries using a three-dimensional finite element stress analysis. The significance of the intercolumn force functions in three-dimensional stability analyses was also studied. The model was utilized to study a case history involving an open-pit mining failure. The results indicate that the model is able to provide a more realistic simulation of the case history than was possible using a conventional two-dimensional model. Key words: stability analysis, general limit equilibrium, three-dimensional, method of columns, factor of safety.
APA, Harvard, Vancouver, ISO, and other styles
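The kriging step that the abstract credits with modelling the slope geometry, stratigraphy, slip surface, and pore-water pressures can be illustrated with a minimal ordinary kriging solver. This is a hedged sketch of the general technique, not the paper's implementation; the variogram is supplied as a callable:

```python
import numpy as np

def ordinary_kriging(coords, values, target, variogram):
    """Estimate the value at `target` by ordinary kriging.

    `variogram` is a callable gamma(h) returning the semivariance at lag h.
    Solves the system [Gamma 1; 1' 0][w; mu] = [gamma0; 1] for weights w."""
    n = len(values)
    h = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(h)          # pairwise semivariances
    A[n, n] = 0.0                     # Lagrange-multiplier corner
    b = np.ones(n + 1)
    b[:n] = variogram(np.linalg.norm(coords - target, axis=1))
    w = np.linalg.solve(A, b)[:n]     # drop the multiplier
    return float(w @ values)
```

With a valid variogram and distinct sample points, the estimator reproduces the data exactly at sampled locations, which is the property that makes it attractive for interpolating stratigraphic surfaces between boreholes.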
47

Nerini, Daniele, Zed Zulkafli, Li-Pen Wang, Christian Onof, Wouter Buytaert, Waldo Lavado-Casimiro, and Jean-Loup Guyot. "A Comparative Analysis of TRMM–Rain Gauge Data Merging Techniques at the Daily Time Scale for Distributed Rainfall–Runoff Modeling Applications". Journal of Hydrometeorology 16, no. 5 (October 1, 2015): 2153–68. http://dx.doi.org/10.1175/jhm-d-14-0197.1.

Full text
Abstract:
This study compares two nonparametric rainfall data merging methods (mean bias correction and double-kernel smoothing) with two geostatistical methods (kriging with external drift and Bayesian combination) for optimizing the hydrometeorological performance of a satellite-based precipitation product over a mesoscale tropical Andean watershed in Peru. The analysis uses 11 years of daily time series from the Tropical Rainfall Measuring Mission (TRMM) Multisatellite Precipitation Analysis (TMPA) research product (TRMM 3B42) and 173 rain gauges from the national weather station network. The results are assessed using 1) a cross-validation procedure and 2) a catchment water balance analysis and hydrological modeling. The double-kernel smoothing method delivered the most consistent improvement over the original satellite product in both the cross-validation and the hydrological evaluation. The mean bias correction also improved hydrological performance scores, particularly at the subbasin scale, where the rain gauge density is higher. Given the spatial heterogeneity of the climate, the size of the modeled catchment, and the sparsity of data, it is concluded that nonparametric merging methods can perform as well as or better than more complex geostatistical methods, whose assumptions may not hold under the studied conditions. Based on these results, a systematic, data-driven approach to selecting a satellite-rain gauge data merging technique is proposed. Finally, the underperformance of an ordinary kriging interpolation of the rain gauge data, compared to TMPA and other merged products, supports the use of satellite-based products over gridded rain gauge products that rely on sparse data for hydrological modeling at large scales.
APA, Harvard, Vancouver, ISO, and other styles
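Of the four merging methods compared, the mean bias correction is the simplest to state. The sketch below captures its core, a spatially uniform multiplicative factor that matches the mean satellite estimate to the mean gauge observation at collocated points; the paper's exact temporal windowing and treatment of zero-rainfall days are not reproduced here:

```python
import numpy as np

def mean_bias_correction(sat_at_gauges, gauge_obs, sat_field):
    """Rescale a satellite rainfall field by the ratio of the mean gauge
    observation to the mean collocated satellite estimate."""
    factor = np.mean(gauge_obs) / np.mean(sat_at_gauges)
    return factor * np.asarray(sat_field, dtype=float)
```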
48

Sidiropoulos, Pantelis, Nicolas R. Dalezios, Athanasios Loukas, Nikitas Mylopoulos, Marios Spiliotopoulos, Ioannis N. Faraslis, Nikos Alpanakis, and Stavros Sakellariou. "Quantitative Classification of Desertification Severity for Degraded Aquifer Based on Remotely Sensed Drought Assessment". Hydrology 8, no. 1 (March 17, 2021): 47. http://dx.doi.org/10.3390/hydrology8010047.

Full text
Abstract:
Natural and anthropogenic causes jointly lead to land degradation and eventually to desertification, which occurs in arid, semiarid, and dry subhumid areas. Extended drought periods may cause soil exposure and erosion, land degradation, and, finally, desertification. Several climatic, geological, hydrological, physiographic, biological, and human factors contribute to desertification. This paper presents a methodological procedure for the quantitative classification of desertification severity over a watershed with degraded groundwater resources. It starts with drought assessment using the Standardized Precipitation Index (SPI), computed from gridded satellite-based precipitation data (taken from the CHIRPS database); erosion potential is then assessed through modeling. Groundwater levels are estimated with a simulation model, and the groundwater-quality components of desertification, based on scattered data, are interpolated with geostatistical tools. Finally, the combination of the desertification severity components leads to the final classified map of desertification severity.
APA, Harvard, Vancouver, ISO, and other styles
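The SPI step above fits a probability distribution to precipitation totals and maps their cumulative probabilities to standard-normal quantiles. A minimal sketch with scipy, assuming a two-parameter gamma fit and ignoring the zero-precipitation probability mass that operational SPI implementations handle separately:

```python
import numpy as np
from scipy import stats

def spi(precip):
    """Standardized Precipitation Index: fit a two-parameter gamma
    distribution to the totals, then map each cumulative probability
    to the corresponding standard-normal quantile."""
    precip = np.asarray(precip, dtype=float)
    shape, loc, scale = stats.gamma.fit(precip, floc=0)  # loc pinned at 0
    cdf = stats.gamma.cdf(precip, shape, loc=loc, scale=scale)
    return stats.norm.ppf(cdf)
```

Negative SPI values indicate drier-than-normal totals; the paper classifies drought severity from thresholds on such an index.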
49

Liu, Peng, and Yeou-Koung Tung. "Spatial interpolation of rain-field dynamic time-space evolution based on radar rainfall data". Hydrology Research 51, no. 3 (March 17, 2020): 521–40. http://dx.doi.org/10.2166/nh.2020.115.

Full text
Abstract:
Accurate and reliable measurement and prediction of the spatial and temporal distribution of rain fields over a wide range of scales are important topics in hydrologic investigations. In this study, a geostatistical approach was adopted: to estimate rainfall intensity over a study domain from the sample values and the spatial structure of radar data, the cumulative distribution functions (CDFs) at all unsampled locations were estimated. Indicator kriging (IK) was used to estimate the exceedance probabilities for different preselected threshold levels, and a procedure was implemented for interpolating CDF values between the thresholds derived from the IK. Different probability distribution functions of the CDF were tested, and their influence on performance was investigated. The performance measures and a visual comparison between the observed rain field and the IK-based estimate suggest that the proposed method gives good estimates of the indicator variables and is capable of producing a realistic image.
APA, Harvard, Vancouver, ISO, and other styles
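Indicator kriging starts from a binary coding of the data at each threshold, and the kriged probabilities must afterwards be corrected to form a valid CDF. Both pieces can be sketched briefly (an illustration of the general technique, not the authors' code; the running-maximum order-relation correction is one common choice):

```python
import numpy as np

def indicator_transform(values, thresholds):
    """Indicator coding: I(x; z_k) = 1 if the value at x is <= z_k."""
    v = np.asarray(values, dtype=float)
    z = np.asarray(thresholds, dtype=float)
    return (v[:, None] <= z[None, :]).astype(float)

def correct_cdf(probs):
    """Order-relation correction of kriged probabilities across increasing
    thresholds: clip to [0, 1], then enforce monotonicity with a running max."""
    return np.maximum.accumulate(np.clip(probs, 0.0, 1.0))
```

Kriging each indicator column separately yields, per unsampled location, one probability per threshold; `correct_cdf` turns that vector into a usable CDF from which intensities can be interpolated.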
50

Moraes, Diego Augusto de Campos, and Anderson Antônio da Conceição Sartori. "AMOSTRAS VIRTUAIS DE ATRIBUTOS DO SOLO COMO SUBSÍDIO AO PLANEJAMENTO PARA ANÁLISE GEOESTATÍSTICA". ENERGIA NA AGRICULTURA 35, no. 3 (September 29, 2020): 426–36. http://dx.doi.org/10.17224/energagric.2020v35n3p426-436.

Full text
Abstract:
The aim of this article was to propose a virtual sampling methodology for soil attributes in an agricultural area, which can support planning for geostatistical analysis. Twenty-three soil samples (depths of 0-20 cm and 20-40 cm) were randomly selected from the original data set for external validation. The Thiessen polygon procedure was applied to the remaining original soil samples (47 samples), and virtual samples (53 samples) were then inserted at random. Variogram analysis, cross-validation, ordinary kriging, and external validation were performed to verify the robustness of the methodology. The insertion of virtual samples proved promising, since the degree of spatial dependence (GDE) and the cross-validation of the soil attributes improved, which was not observed in the originally sampled data. The external validation obtained good results, indicating that virtual sampling can be used solely in the planning stage of a geostatistical analysis. Keywords: variogram, cross-validation, soils.
APA, Harvard, Vancouver, ISO, and other styles
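The Thiessen polygon procedure used to place the virtual samples is equivalent to nearest-sample assignment: a point belongs to the polygon of its closest original sample. A minimal sketch with scipy's k-d tree (illustrative only; the authors' GIS workflow is not reproduced):

```python
import numpy as np
from scipy.spatial import cKDTree

def thiessen_assign(sample_coords, query_coords):
    """Return, for each query point, the index of the nearest sample,
    i.e. the Thiessen (Voronoi) polygon the point falls in."""
    tree = cKDTree(np.asarray(sample_coords, dtype=float))
    _, idx = tree.query(np.asarray(query_coords, dtype=float))
    return idx
```

A virtual sample placed at a query point would then inherit the attribute value of the original sample whose polygon contains it.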
