Journal articles on the topic "Stochastic matching models"

Below are the top 50 journal articles on the topic "Stochastic matching models".

1

Dewilde, Patrick. "Stochastic models based on moment matching". Communications in Information and Systems 20, no. 2 (2020): 209–48. http://dx.doi.org/10.4310/cis.2020.v20.n2.a5.

2

Hu, Lin Y., and Sandra Jenni. "History Matching of Object-Based Stochastic Reservoir Models". SPE Journal 10, no. 3 (September 1, 2005): 312–23. http://dx.doi.org/10.2118/81503-pa.

3

Mairesse, Jean, and Pascal Moyal. "Editorial introduction to the special issue on stochastic matching models, matching queues and applications". Queueing Systems 96, no. 3-4 (December 2020): 357–58. http://dx.doi.org/10.1007/s11134-021-09690-2.

4

Comte, Céline. "Stochastic non-bipartite matching models and order-independent loss queues". Stochastic Models 38, no. 1 (October 10, 2021): 1–36. http://dx.doi.org/10.1080/15326349.2021.1962352.

5

Iwata, Tomoharu, Tsutomu Hirao, and Naonori Ueda. "Unsupervised Cluster Matching via Probabilistic Latent Variable Models". Proceedings of the AAAI Conference on Artificial Intelligence 27, no. 1 (June 30, 2013): 445–51. http://dx.doi.org/10.1609/aaai.v27i1.8558.

Abstract:
We propose a probabilistic latent variable model for unsupervised cluster matching, which is the task of finding correspondences between clusters of objects in different domains. Existing object matching methods find one-to-one matching. The proposed model finds many-to-many matching, and can handle multiple domains with different numbers of objects. The proposed model assumes that there are an infinite number of latent vectors that are shared by all domains, and that each object is generated using one of the latent vectors and a domain-specific linear projection. By inferring a latent vector to be used for generating each object, objects in different domains are clustered in shared groups, and thus we can find matching between clusters in an unsupervised manner. We present efficient inference procedures for the proposed model based on a stochastic EM algorithm. The effectiveness of the proposed model is demonstrated with experiments using synthetic and real data sets.
6

Pankov, Vikentii, and Oleg Granichin. "SPSA Algorithm for Matching of Historical Data for Complex Non-Gaussian Geological Models". Cybernetics and Physics 11, no. 1 (June 2, 2022): 18–24. http://dx.doi.org/10.35470/2226-4116-2022-11-1-18-24.

Abstract:
History matching is the process of integrating dynamic production data into a reservoir model. It consists in estimating uncertain model parameters so that oil or water production data from flow simulation come close to the observed dynamic data. Various optimization methods can be used to estimate the model parameters. Simultaneous perturbation stochastic approximation (SPSA) is one of the stochastic approximation algorithms. It requires only two objective function measurements per iteration for the gradient approximation. Moreover, the parameters estimated by this algorithm may converge to their true values under arbitrary bounded additive noise, while many other optimization algorithms require the noise to have zero mean. The SPSA algorithm has not been well explored for history matching problems and has been applied only to simple Gaussian models. In this paper, we applied SPSA to history matching of binary channelized reservoir models, also in combination with the parameterization method CNN-PCA. We considered the case of complex noise in the observed production data, with an objective function that does not require the assumptions of normality of the observations that are common in the history matching literature. We experimentally showed that the SPSA method can be successfully used for history matching of non-Gaussian geological models with different types of noise in the observations, and that it outperforms Particle Swarm Optimization in convergence speed.
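The gradient trick at the heart of this abstract, two objective evaluations per iteration regardless of the dimension, can be sketched in a few lines. The quadratic toy objective, gain values, and step count below are illustrative assumptions, not the paper's reservoir setup.

```python
import random

random.seed(0)

def spsa_gradient(f, theta, c=0.1):
    # Rademacher (+/-1) simultaneous perturbation: two evaluations of f
    # yield a gradient estimate in every coordinate at once.
    delta = [random.choice((-1.0, 1.0)) for _ in theta]
    plus = [t + c * d for t, d in zip(theta, delta)]
    minus = [t - c * d for t, d in zip(theta, delta)]
    diff = (f(plus) - f(minus)) / (2.0 * c)
    return [diff / d for d in delta]

def spsa_minimize(f, theta, steps=2000, a=0.05, c=0.1):
    # Plain SPSA descent with fixed gains (a, c) for clarity;
    # decaying gain sequences are the textbook choice.
    for _ in range(steps):
        g = spsa_gradient(f, theta, c)
        theta = [t - a * gi for t, gi in zip(theta, g)]
    return theta

# Toy "history matching" objective: squared distance to a known parameter set.
target = [1.0, -2.0, 0.5]
loss = lambda th: sum((t - s) ** 2 for t, s in zip(th, target))
estimate = spsa_minimize(loss, [0.0, 0.0, 0.0])
```

For a quadratic objective the two-sided difference is exact, so the iterates contract toward `target` despite the random perturbation directions.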
7

Zhou, Deng Rong, Jian Chun Gong, and Fang Ling Fan. "A Stochastic Resonance Phenomenon in Linear Models". Applied Mechanics and Materials 401-403 (September 2013): 1301–4. http://dx.doi.org/10.4028/www.scientific.net/amm.401-403.1301.

Abstract:
When certain kinds of matching exist in an electric system between the nonlinear input and the noise, amplifying the input noise may dramatically increase the output SNR rather than decrease it. Stochastic resonance is the phenomenon in which the output SNR reaches its peak when noise is input at a certain amplitude. Generalized stochastic resonance refers to nonlinear phenomena in which the output (output SNR, output mean value, etc.) is a non-monotonic function of some parameter of the noise (amplitude, correlation time) or of the input (amplitude, frequency). We studied the phenomenon of stochastic resonance in linear models with random parameters and discovered that the output amplitude is a non-monotonic function of the noise amplitude, the noise correlation time and the system parameters. The result is a helpful complement both to traditional linear system theory and to the analysis of RL circuits.
8

Meszáros, András, János Papp, and Miklós Telek. "Fitting traffic traces with discrete canonical phase type distributions and Markov arrival processes". International Journal of Applied Mathematics and Computer Science 24, no. 3 (September 1, 2014): 453–70. http://dx.doi.org/10.2478/amcs-2014-0034.

Abstract:
Recent developments of matrix analytic methods make phase type distributions (PHs) and Markov Arrival Processes (MAPs) promising stochastic model candidates for capturing traffic trace behaviour and for efficient usage in queueing analysis. After introducing the basics of these sets of stochastic models, the paper discusses the following subjects in detail: (i) PHs and MAPs have different representations. For efficient use of these models, sparse (defined by a minimal number of parameters) and unique representations of discrete time PHs and MAPs are needed, which are commonly referred to as canonical representations. The paper presents new results on the canonical representation of discrete PHs and MAPs. (ii) The canonical representation allows a direct mapping between experimental moments and the stochastic models, referred to as moment matching. Explicit procedures are provided for this mapping. (iii) Moment matching is not always the best way to model the behavior of traffic traces. Model fitting based on appropriately chosen distance measures might result in better performing stochastic models. We also demonstrate the efficiency of the fitting procedures with experimental results.
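As a hedged illustration of what "moment matching" means in this setting, the sketch below applies the textbook two-moment recipe, not the canonical discrete PH representations developed in the paper: an Erlang-k when the squared coefficient of variation (SCV) is at most 1, and a balanced-means two-phase hyperexponential when it exceeds 1.

```python
import math

def match_two_moments(mean, scv):
    # Fit (mean, SCV): Erlang-k for scv <= 1 (exact only when 1/scv is an
    # integer), balanced-means hyperexponential H2 for scv > 1 (always exact).
    if scv <= 1.0:
        k = max(1, round(1.0 / scv))
        return ("erlang", k, k / mean)          # k phases, each with rate k/mean
    p = 0.5 * (1.0 + math.sqrt((scv - 1.0) / (scv + 1.0)))
    return ("hyperexp", (p, 2.0 * p / mean), (1.0 - p, 2.0 * (1.0 - p) / mean))

kind, (p1, l1), (p2, l2) = match_two_moments(2.0, 4.0)
m1 = p1 / l1 + p2 / l2                          # first moment of the H2 fit
m2 = 2.0 * p1 / l1**2 + 2.0 * p2 / l2**2        # second moment of the H2 fit
```

Here `m1` recovers the target mean 2.0 and `m2 / m1**2 - 1` the target SCV 4.0 exactly; the balanced-means constraint p1/l1 = p2/l2 pins down the remaining free parameter.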
9

Tsioulou, Alexandra, Alexandros A. Taflanidis, and Carmine Galasso. "Modification of stochastic ground motion models for matching target intensity measures". Earthquake Engineering & Structural Dynamics 47, no. 1 (July 9, 2017): 3–24. http://dx.doi.org/10.1002/eqe.2933.

10

Brignone, Riccardo, Ioannis Kyriakou, and Gianluca Fusai. "Moment-matching approximations for stochastic sums in non-Gaussian Ornstein–Uhlenbeck models". Insurance: Mathematics and Economics 96 (January 2021): 232–47. http://dx.doi.org/10.1016/j.insmatheco.2020.12.002.

11

Tadjer, Amine, Reider B. Bratvold, and Remus G. Hanea. "Efficient Dimensionality Reduction Methods in Reservoir History Matching". Energies 14, no. 11 (May 27, 2021): 3137. http://dx.doi.org/10.3390/en14113137.

Abstract:
Production forecasting is the basis for decision making in the oil and gas industry, and can be quite challenging, especially in terms of complex geological modeling of the subsurface. To help solve this problem, assisted history matching built on ensemble-based analysis such as the ensemble smoother and ensemble Kalman filter is useful in estimating models that preserve geological realism and have predictive capabilities. These methods tend, however, to be computationally demanding, as they require a large ensemble size for stable convergence. In this paper, we propose a novel method of uncertainty quantification and reservoir model calibration with much-reduced computation time. This approach is based on a sequential combination of nonlinear dimensionality reduction techniques: t-distributed stochastic neighbor embedding or the Gaussian process latent variable model and clustering K-means, along with the data assimilation method ensemble smoother with multiple data assimilation. The cluster analysis with t-distributed stochastic neighbor embedding and Gaussian process latent variable model is used to reduce the number of initial geostatistical realizations and select a set of optimal reservoir models that have similar production performance to the reference model. We then apply ensemble smoother with multiple data assimilation for providing reliable assimilation results. Experimental results based on the Brugge field case data verify the efficiency of the proposed approach.
12

Alqurashi, M., and J. Wang. "Impacts of stochastic models on real-time 3D UAV mapping". ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XL-1 (November 7, 2014): 35–44. http://dx.doi.org/10.5194/isprsarchives-xl-1-35-2014.

Abstract:
In UAV mapping using direct geo-referencing, the formation of the stochastic model generally takes into account the different types of measurements required to estimate the 3D coordinates of the feature points. Such measurements include image tie point coordinate measurements, camera position measurements and camera orientation measurements. In the commonly used stochastic model, it is assumed that all tie point measurements have the same variance. In fact, these assumptions are not always realistic and can thus lead to biased 3D feature coordinates. Tie point measurements for different image feature objects may not have the same accuracy because the geometric distribution of the features, and particularly their feature matching conditions, differ. More importantly, the accuracies of the geo-referencing measurements should also be incorporated into the mapping process. In this paper, the impacts of typical stochastic models on UAV mapping are investigated. It is demonstrated that the quality of the geo-referencing measurements plays a critical role in real-time UAV mapping scenarios.
13

Wu, Mingqi, Yinsen Miao, Neilkunal Panchal, Daniel R. Kowal, Marina Vannucci, Jeremy Vila, and Faming Liang. "Stochastic clustering and pattern matching for real-time geosteering". GEOPHYSICS 84, no. 5 (September 1, 2019): ID13–ID24. http://dx.doi.org/10.1190/geo2018-0781.1.

Abstract:
We have developed a Bayesian statistical framework for quantitative geosteering in real time. Two types of contemporary geosteering approaches, model based and stratification based, are introduced. The latter is formulated as a Bayesian optimization procedure: The log from a pilot reference well is used as a stratigraphic signature of the geologic structure in a given region; the observed log sequence acquired along the wellbore is projected into the stratigraphic domain given a proposed earth model and directional survey; the pattern similarity between the converted log and the signature is measured by a correlation coefficient; then stochastic searching is performed on the space of all possible earth models to maximize the similarity under constraints of the prior understanding of the drilling process and target formation; finally, an inference is made based on the samples simulated from the posterior distribution using stochastic approximation Monte Carlo in which we extract the most likely earth model and the associated credible intervals as a quantified confidence indicator. We extensively test our method using synthetic and real geosteering data sets. Our method consistently achieves good performance on synthetic data sets with high correlations between the interpreted and the reference logs and provides similar interpretations as the geosteering geologists on four real wells. We also conduct a reliability performance test of the method on a benchmark set of 200 horizontal wells randomly sampled from the Permian Basin. Our Bayesian framework informs geologists with key drilling decisions in real time and helps them navigate the drilling bit into the target formation with confidence.
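The similarity measure named in the abstract, a correlation coefficient between the stratigraphic signature and the converted log, is the sample Pearson coefficient; a minimal version with made-up log values:

```python
def pearson(x, y):
    # Sample Pearson correlation, used as the pattern-similarity score.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return sxy / (sx * sy)

signature = [0.1, 0.9, 4.2, 2.0, 1.1]   # pilot-well log values (illustrative)
converted = [0.6, 1.4, 4.7, 2.5, 1.6]   # same shape, offset by a constant
score = pearson(signature, converted)
```

A constant offset or rescaling of the log leaves the score at 1.0, which is what makes the coefficient a forgiving similarity measure for the stochastic search over candidate earth models.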
14

Yan, Yutong, Wei Zhang, Yahua Yin, and Weidong Huo. "An Ornstein–Uhlenbeck Model with the Stochastic Volatility Process and Tempered Stable Process for VIX Option Pricing". Mathematical Problems in Engineering 2022 (October 6, 2022): 1–14. http://dx.doi.org/10.1155/2022/4018292.

Abstract:
To effectively fit the dynamics and structure of frequent small jumps and sparse large jumps in the VIX time series, we introduce the tempered stable process (the CTS process and CGMY process) into the Ornstein–Uhlenbeck (OU) stochastic volatility model to build an OU model with the stochastic volatility process and tempered stable process. Based on two different assumptions for the underlying assets, we derive the formula of pricing models via two methods. Empirical studies are conducted to prove that our pricing models have a better performance in matching the VIX options. Furthermore, we find that the pricing model via the infinitesimal value method yields better results than pricing models with a measure of change. Overall, our proposed models enrich the derivative pricing theory and help investors understand and hedge risks.
15

Cadas, Arnaud, Josu Doncel, Jean-Michel Fourneau, and Ana Busic. "Flexibility can Hurt Dynamic Matching System Performance". ACM SIGMETRICS Performance Evaluation Review 49, no. 3 (March 22, 2022): 37–42. http://dx.doi.org/10.1145/3529113.3529126.

Abstract:
We study the performance of stochastic matching models with general compatibility graphs. Items of different classes arrive to the system according to independent Poisson processes. Upon arrival, an item is matched with a compatible item according to the First Come First Matched discipline, and both items leave the system immediately. If there are no compatible items, the new arrival joins the queue of unmatched items of the same class. Compatibilities between item classes are defined by a connected graph, where nodes represent the classes of items and the edges the compatibilities between item classes. We show that such a model may exhibit a non-intuitive behavior: increasing the matching flexibility by adding new edges in the matching graph may lead to a larger average population at the steady state. This performance paradox can be viewed as an analog of the Braess paradox. We show sufficient conditions for the existence or non-existence of this paradox. This performance paradox in matching models appears when specific independent sets are in saturation, i.e., the system is close to the stability condition.
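A minimal simulation of the First Come First Matched discipline helps make the model concrete. At arrival epochs, merged independent Poisson streams are equivalent to i.i.d. class draws, so the sketch below samples classes directly; the triangle compatibility graph and uniform probabilities are illustrative choices, not the paper's paradox instance.

```python
import random

def simulate_fcfm(probs, edges, n_arrivals=200_000, seed=1):
    # probs: {class: arrival probability}; edges: compatible class pairs.
    # Returns the average number of unmatched items over arrival epochs.
    rng = random.Random(seed)
    compat = {c: set() for c in probs}
    for a, b in edges:
        compat[a].add(b)
        compat[b].add(a)
    classes, weights = list(probs), list(probs.values())
    waiting, total = [], 0          # global FIFO of unmatched item classes
    for _ in range(n_arrivals):
        c = rng.choices(classes, weights)[0]
        for i, w in enumerate(waiting):
            if w in compat[c]:      # First Come First Matched: oldest wins
                del waiting[i]      # both items leave immediately
                break
        else:
            waiting.append(c)       # no compatible item: join the queue
        total += len(waiting)
    return total / n_arrivals

# Triangle (non-bipartite) graph with uniform arrivals: a stable instance.
avg = simulate_fcfm({1: 1 / 3, 2: 1 / 3, 3: 1 / 3}, [(1, 2), (2, 3), (1, 3)])
```

Re-running with extra edges and asymmetric probabilities is how one would hunt numerically for the Braess-like paradox the paper analyzes: a larger edge set can raise, not lower, this average.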
16

Kwon, Jay Hyoun, and Christopher Jekeli. "The effect of stochastic gravity models in airborne vector gravimetry". GEOPHYSICS 67, no. 3 (May 2002): 770–76. http://dx.doi.org/10.1190/1.1484520.

Abstract:
Measurements of specific force using inertial measurement units (IMU) combined with Global Positioning System (GPS) accelerometry can be used on an airborne platform to determine the total gravitational vector. Traditional methods, originating with inertial surveying systems and based on Kalman filtering, rely on choosing an appropriate stochastic model for the gravity disturbance components included in the set of system error states. An alternative procedure that uses no a priori stochastic model has proven to be as effective, or more so, in extracting the gravity vector from airborne IMU/GPS data. This method is based on inspecting acceleration residuals from a Kalman filter that estimates only sensor biases. Using actual data collected over the Canadian Rocky Mountains, this method was compared to the traditional approach adapted for different types of stochastic models for the gravity disturbance vector. In all test cases, the estimation filter without a gravitational model yielded better results, by up to 50%. This implies that accurate gravity vector determination from airborne IMU/GPS need not rely on an a priori stochastic model of the field, even though the theory of optimal estimation requires it. However, no filter was able to remove all systematic errors from the data; these remaining errors could only be reduced by elementary methods such as endpoint matching and correlative processing of adjacent passes of the system over the gravity field. The final, best gravity estimates had standard deviations with respect to control data of 6 mGal in the horizontal components and 3–4 mGal in the vertical component.
17

Tamegawa, Kenichi, and Shin Fukuda. "Expectation Errors in Credit Market and Business Cycles". Macroeconomic Dynamics 20, no. 5 (June 30, 2016): 1359–80. http://dx.doi.org/10.1017/s1365100514000923.

Abstract:
This study demonstrates how expectation errors in a credit market generate economic fluctuations. To this end, we employ simulation analysis using a dynamic stochastic general equilibrium model. Our model includes two building blocks that are not included in the standard models: the banking sector and matching friction in the labor market. By introducing the banking sector, we can confirm that if economic agents fallaciously expect a rise in future asset prices, such expectations will cause an economic boom and bust. The variation of this fluctuation is quite large and the recession short-lived, but these drawbacks can be avoided by adding matching friction.
18

Nguatem, W., M. Drauschke, and H. Mayer. "Automatic Generation of Building Models with Levels of Detail 1-3". ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLI-B3 (June 10, 2016): 649–54. http://dx.doi.org/10.5194/isprs-archives-xli-b3-649-2016.

Abstract:
We present a workflow for the automatic generation of building models with levels of detail (LOD) 1 to 3 according to the CityGML standard (Gröger et al., 2012). We start with orienting unsorted image sets employing (Mayer et al., 2012), we compute depth maps using semi-global matching (SGM) (Hirschmüller, 2008), and fuse these depth maps to reconstruct dense 3D point clouds (Kuhn et al., 2014). Based on planes segmented from these point clouds, we have developed a stochastic method for roof model selection (Nguatem et al., 2013) and window model selection (Nguatem et al., 2014). We demonstrate our workflow up to the export into CityGML.
19

Nguatem, W., M. Drauschke, and H. Mayer. "Automatic Generation of Building Models with Levels of Detail 1-3". ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLI-B3 (June 10, 2016): 649–54. http://dx.doi.org/10.5194/isprsarchives-xli-b3-649-2016.

20

Claes, Steven, and Hans Janssen. "Towards stochastic generation of 3D pore network models of building materials". MATEC Web of Conferences 282 (2019): 02022. http://dx.doi.org/10.1051/matecconf/201928202022.

Abstract:
Pore-scale-based prediction of the hygric properties of porous building materials is on the rise as an attractive alternative to the current experimental procedure. Pore-scale simulations do, however, require a complete pore network model of the building material. With the currently available characterization techniques, such a complete pore network model cannot be established; instead, typically fragmented direct (pore sizes, shapes, positions, connections, …) or indirect (pore size distribution, pore surface area, …) information is obtained. The aim of this paper is to present stochastic pore network generation, wherein the fragmented pore structure information is used to generate a complete pore network for the building material involved. The novelty of our approach lies in the generation of a PNM by matching the distributions of direct as well as indirect parameters of the input data and the model. Additionally, the positions of the pores are no longer bound to a cubic lattice. This workflow will first be tested on a single-scale material with a relatively straightforward pore space, such as sintered glass. Finally, the hygric properties of the generated network will be compared to the measured properties of the real material as a validation step.
21

Shutts, Glenn, and Alfons Callado Pallarès. "Assessing parametrization uncertainty associated with horizontal resolution in numerical weather prediction models". Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 372, no. 2018 (June 28, 2014): 20130284. http://dx.doi.org/10.1098/rsta.2013.0284.

Abstract:
The need to represent uncertainty resulting from model error in ensemble weather prediction systems has spawned a variety of ad hoc stochastic algorithms based on plausible assumptions about sub-grid-scale variability. Currently, few studies have been carried out to prove the veracity of such schemes and it seems likely that some implementations of stochastic parametrization are misrepresentations of the true source of model uncertainty. This paper describes an attempt to quantify the uncertainty in physical parametrization tendencies in the European Centre for Medium-Range Weather Forecasts (ECMWF) Integrated Forecasting System with respect to horizontal resolution deficiency. High-resolution truth forecasts are compared with matching target forecasts at much lower resolution after coarse-graining to a common spatial and temporal resolution. In this way, model error is defined and its probability distribution function is examined as a function of tendency magnitude. It is found that the temperature tendency error associated with convection parametrization and explicit water phase changes behaves like a Poisson process for which the variance grows in proportion to the mean, which suggests that the assumptions underpinning the Craig and Cohen statistical model of convection might also apply to parametrized convection. By contrast, radiation temperature tendency errors have a very different relationship to their mean value. These findings suggest that the ECMWF stochastic perturbed parametrization tendency scheme could be improved since it assumes that the standard deviation of the tendency error is proportional to the mean. Using our finding that the variance error is proportional to the mean, a prototype stochastic parametrization scheme is devised for convective and large-scale condensation temperature tendencies and tested within the ECMWF Ensemble Prediction System. Significant impact on forecast skill is shown, implying its potential for further development.
22

Mohamed, Linah, Mike Christie, and Vasily Demyanov. "Comparison of Stochastic Sampling Algorithms for Uncertainty Quantification". SPE Journal 15, no. 1 (November 17, 2009): 31–38. http://dx.doi.org/10.2118/119139-pa.

Abstract:
History matching and uncertainty quantification are currently two important research topics in reservoir simulation. In the Bayesian approach, we start with prior information about a reservoir (e.g., from analog outcrop data) and update our reservoir models with observations (e.g., from production data or time-lapse seismic). The goal of this activity is often to generate multiple models that match the history and use the models to quantify uncertainties in predictions of reservoir performance. A critical aspect of generating multiple history-matched models is the sampling algorithm used to generate the models. Algorithms that have been studied include gradient methods, genetic algorithms, and the ensemble Kalman filter (EnKF). This paper investigates the efficiency of three stochastic sampling algorithms: the Hamiltonian Monte Carlo (HMC) algorithm, the Particle Swarm Optimization (PSO) algorithm, and the Neighbourhood Algorithm (NA). HMC is a Markov chain Monte Carlo (MCMC) technique that uses Hamiltonian dynamics to achieve larger jumps than are possible with other MCMC techniques. PSO is a swarm intelligence algorithm that uses similar dynamics to HMC to guide the search but incorporates acceleration and damping parameters to provide rapid convergence to possible multiple minima. NA is a sampling technique that uses the properties of Voronoi cells in high dimensions to achieve multiple history-matched models. The algorithms are compared by generating multiple history-matched reservoir models and comparing the Bayesian credible intervals (p10-p50-p90) produced by each algorithm. We show that all the algorithms are able to find equivalent match qualities for this example, but that some algorithms are able to find good fitting models quickly, whereas others are able to find a more diverse set of models in parameter space.
The effects of the different sampling of model parameter space are compared in terms of the p10-p50-p90 uncertainty envelopes in the forecast oil rate. These results show that algorithms based on Hamiltonian dynamics and swarm intelligence concepts have the potential to be effective tools in uncertainty quantification in the oil industry.
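For readers unfamiliar with the samplers compared here, the common ancestor of HMC is plain random-walk Metropolis, which already produces the posterior samples from which p10-p50-p90 envelopes are read. The one-parameter "reservoir" model and all numbers below are illustrative assumptions, not the paper's test case.

```python
import math
import random

def metropolis(log_post, x0, steps=20000, scale=0.5, seed=0):
    # Random-walk Metropolis: the baseline MCMC scheme that HMC accelerates
    # with Hamiltonian dynamics.
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    chain = []
    for _ in range(steps):
        y = x + rng.gauss(0.0, scale)
        lq = log_post(y)
        if rng.random() < math.exp(min(0.0, lq - lp)):  # accept w.p. min(1, ratio)
            x, lp = y, lq
        chain.append(x)
    return chain

# Toy posterior: standard-normal prior on a permeability multiplier k, times a
# Gaussian misfit of the simulated rate 2*k against one observed rate.
obs, sigma = 3.0, 0.5
log_post = lambda k: -0.5 * k**2 - 0.5 * ((2.0 * k - obs) / sigma) ** 2
samples = sorted(metropolis(log_post, 0.0)[5000:])      # discard burn-in
p10, p50, p90 = (samples[int(q * len(samples))] for q in (0.1, 0.5, 0.9))
```

The three quantiles form exactly the kind of p10-p50-p90 credible interval the paper uses to compare samplers; swapping the proposal mechanism changes how quickly, not what, the chain estimates.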
23

Polukhin, V., D. Berestin, D. Filatova, and O. Glazova. "Biophysical models of pathological and postural tremor". Journal of New Medical Technologies. eJournal 9, no. 4 (December 8, 2015): 0. http://dx.doi.org/10.12737/16779.

Abstract:
Real chaotic and stochastic analysis of the two movements (tremor considered as an involuntary movement, tapping as a voluntary one) shows both to be chaotic movements (involuntary according to the results of the test, not by the presence of a target). The authors introduce new criteria for separating these two types of motion in the form of matrices of pairwise comparisons of samples of tremorograms and tappingrams. Identification of differences between the samples (obtained continuously, during sequential measurement) that are compared in pairs within one group of subjects was performed using the Wilcoxon test. The increase in the number of "common" pairs of samples for tappingrams compared to tremorograms demonstrates a partial increase of phase matching due to afferentation and the engagement of mental activity. This indicates the beginning of a shift from the chaotic regime to the stochastic one. The increase in common pairs for tapping may be due to a change in the patterns of fluctuations. The authors propose a new calculation of quasi-attractors of these two types of movements that allows the identification of differences in the physiological state of the subject. Concrete examples of the changes in the parameters of the matrices of paired comparisons and of the quasi-attractors are demonstrated. The authors present a method of analysis of autocorrelation functions based on partitioning the interval (-1; 1) into four parts. The analysis of the density of the autocorrelation functions of tremorograms and tappingrams shows a significant difference between involuntary movements (tremor) and voluntary movements (tapping).
24

Bovier, Anton, Jiří Černý, and Ostap Hryniv. "The Opinion Game: Stock Price Evolution from Microscopic Market Modeling". International Journal of Theoretical and Applied Finance 9, no. 1 (February 2006): 91–111. http://dx.doi.org/10.1142/s0219024906003421.

Abstract:
We propose a class of Markovian agent-based models for the time evolution of a share price in an interactive market. The models rely on a microscopic description of a market of buyers and sellers who change their opinion about the stock value in a stochastic way. The actual price is determined in a realistic way by matching (clearing) offers until no further transactions can be performed. Some analytic results for simple special cases are presented. We also propose basic interaction mechanisms and show in simulations that these already reproduce certain particular features of prices in real stock markets.
25

Nejadi, Siavash, Juliana Y. Leung, Japan J. Trivedi, and Claudio Virues. "Integrated Characterization of Hydraulically Fractured Shale-Gas Reservoirs—Production History Matching". SPE Reservoir Evaluation & Engineering 18, no. 4 (November 25, 2015): 481–94. http://dx.doi.org/10.2118/171664-pa.

Abstract:
Advancements in horizontal-well drilling and multistage hydraulic fracturing have enabled economically viable gas production from tight formations. Reservoir-simulation models play an important role in the production forecasting and field-development planning. To enhance their predictive capabilities and to capture the uncertainties in model parameters, one should calibrate stochastic reservoir models to both geologic and flow observations. In this paper, a novel approach to characterization and history matching of hydrocarbon production from a hydraulic-fractured shale is presented. This new methodology includes generating multiple discrete-fracture-network (DFN) models, upscaling the models for numerical multiphase-flow simulation, and updating the DFN-model parameters with dynamic-flow responses. First, measurements from hydraulic-fracture treatment, petrophysical interpretation, and in-situ stress data are used to estimate the initial probability distribution of hydraulic-fracture and induced-microfracture parameters, and multiple initial DFN models are generated. Next, the DFN models are upscaled into an equivalent continuum dual-porosity model with analytical techniques. The upscaled models are subjected to the flow simulation, and their production performances are compared with the actual responses. Finally, an assisted-history-matching algorithm is implemented to assess the uncertainties of the DFN-model parameters. Hydraulic-fracture parameters including half-length and transmissivity are updated, and the length, transmissivity, intensity, and spatial distribution of the induced fractures are also estimated. The proposed methodology is applied to facilitate characterization of fracture parameters of a multifractured shale-gas well in the Horn River basin. Fracture parameters and stimulated reservoir volume (SRV) derived from the updated DFN models are in agreement with estimates from microseismic interpretation and rate-transient analysis.
The key advantage of this integrated assisted-history-matching approach is that uncertainties in fracture parameters are represented by the multiple equally probable DFN models and their upscaled flow-simulation models, which honor the hard data and match the dynamic production history. This work highlights the significance of uncertainties in SRV and hydraulic-fracture parameters. It also provides insight into the value of microseismic data when integrated into a rigorous production-history-matching work flow.
26

Elsheikh, Ahmed H., Mary F. Wheeler and Ibrahim Hoteit. "Sparse calibration of subsurface flow models using nonlinear orthogonal matching pursuit and an iterative stochastic ensemble method". Advances in Water Resources 56 (June 2013): 14–26. http://dx.doi.org/10.1016/j.advwatres.2013.02.002.

Full text
APA, Harvard, Vancouver, ISO and other styles
27

Bottke, William F., Richard J. Walker, James M. D. Day, David Nesvorny and Linda Elkins-Tanton. "Stochastic Late Accretion to Earth, the Moon, and Mars". Science 330, no. 6010 (9 December 2010): 1527–30. http://dx.doi.org/10.1126/science.1196874.

Full text
APA, Harvard, Vancouver, ISO and other styles
Abstract:
Core formation should have stripped the terrestrial, lunar, and martian mantles of highly siderophile elements (HSEs). Instead, each world has disparate, yet elevated HSE abundances. Late accretion may offer a solution, provided that ≥0.5% Earth masses of broadly chondritic planetesimals reach Earth’s mantle and that ~10 and ~1200 times less mass goes to Mars and the Moon, respectively. We show that leftover planetesimal populations dominated by massive projectiles can explain these additions, with our inferred size distribution matching those derived from the inner asteroid belt, ancient martian impact basins, and planetary accretion models. The largest late terrestrial impactors, at 2500 to 3000 kilometers in diameter, potentially modified Earth’s obliquity by ~10°, whereas those for the Moon, at ~250 to 300 kilometers, may have delivered water to its mantle.
28

Qaiser Abbas. "Template Matching Based Probabilistic Optical Character Recognition for Urdu Nastaliq Script". Lahore Garrison University Research Journal of Computer Science and Information Technology 5, no. 2 (21 June 2021): 41–47. http://dx.doi.org/10.54692/lgurjcsit.2021.0502207.

Full text
APA, Harvard, Vancouver, ISO and other styles
Abstract:
This paper presents a technique for optical recognition of Urdu characters using template matching based on a probabilistic N-gram language model. The dataset used contains both printed and typed text. The model performs three types of segmentation, namely line, ligature and character segmentation, using horizontal projection, connected component labeling, and corner and pointer techniques, respectively. A separate stochastic lexicon is built from a collected corpus, containing the probability values of the grams. By combining template matching with the N-gram language model, the study predicts complete segmented words with promising results, particularly in the case of bigrams. It outperforms three out of four existing models with an accuracy rate of 97.33%. Results achieved on the test dataset are encouraging, while also indicating directions for further improvement of the model.
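The decoding step described here, combining per-glyph template-matching scores with bigram probabilities from a stochastic lexicon, can be sketched roughly as follows; the candidate glyphs, their scores and the bigram table are invented toy values, not the paper's data:

```python
import math

# Hypothetical per-glyph template-matching scores: for each segmented
# position, a few candidate characters with their correlation scores.
candidates = [
    {"ا": 0.90, "د": 0.55},
    {"ب": 0.60, "پ": 0.58},
]

# Hypothetical bigram probabilities estimated from a corpus lexicon.
bigram = {("ا", "ب"): 0.30, ("ا", "پ"): 0.05, ("د", "ب"): 0.02, ("د", "پ"): 0.01}

def decode(candidates, bigram, smooth=1e-6):
    """Pick the character sequence maximizing template score x bigram probability."""
    best_seq, best_lp = None, -math.inf

    def walk(i, seq, lp):
        nonlocal best_seq, best_lp
        if i == len(candidates):
            if lp > best_lp:
                best_seq, best_lp = seq, lp
            return
        for ch, score in candidates[i].items():
            step = math.log(score)          # template-matching evidence
            if seq:                         # language-model evidence
                step += math.log(bigram.get((seq[-1], ch), smooth))
            walk(i + 1, seq + [ch], lp + step)

    walk(0, [], 0.0)
    return "".join(best_seq)

result = decode(candidates, bigram)  # → "اب"
```

For realistic ligature counts a Viterbi-style dynamic program would replace the exhaustive walk.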
29

Tavakoli, Reza, Sanjay Srinivasan and Mary F. Wheeler. "Rapid Updating of Stochastic Models by Use of an Ensemble-Filter Approach". SPE Journal 19, no. 03 (31 December 2013): 500–513. http://dx.doi.org/10.2118/163673-pa.

Full text
APA, Harvard, Vancouver, ISO and other styles
Abstract:
Summary Applying an ensemble Kalman filter (EnKF) is an effective method for reservoir history matching. The underlying principle is that an initial ensemble of stochastic models can be progressively updated to reflect measured values as they become available. The EnKF performance is only optimal, however, if the prior-state vector is linearly related to the predicted data and if the joint distribution of the prior-state vector is multivariate Gaussian. Therefore, it is challenging to implement the filtering scheme for non-Gaussian random fields, such as channelized reservoirs, in which the continuity of permeability extremes is well-preserved. In this paper, we develop a methodology by combining model classification with multidimensional scaling (MDS) and the EnKF to create rapidly updating models of a channelized reservoir. A dissimilarity matrix is computed by use of the dynamic responses of ensemble members. This dissimilarity matrix is transformed into a lower-dimensional space by use of MDS. Responses mapped in the lower-dimension space are clustered, and on the basis of the distances between the models in a cluster and the actual observed response, the closest models to the observed response are retrieved. Model updates within the closest cluster are performed with EnKF equations. The results of an update are used to resample new models for the next step. Two-dimensional, waterflooding examples of channelized reservoirs are provided to demonstrate the applicability of the proposed method. The obtained results demonstrate that the proposed algorithm is viable both for sequentially updating reservoir models and for preserving channel features after the data-assimilation process.
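The ensemble-filter update that the abstract builds on can be sketched as a generic, textbook-style EnKF analysis step with perturbed observations; the linear observation operator and the toy Gaussian prior are illustrative assumptions, not the authors' channelized-reservoir setup:

```python
import numpy as np

def enkf_update(ensemble, observations, obs_operator, obs_error_cov, rng):
    """One EnKF analysis step with perturbed observations.

    ensemble      : (n_params, n_members) prior state vectors
    observations  : (n_obs,) measured data
    obs_operator  : (n_obs, n_params) linear observation matrix H
    obs_error_cov : (n_obs, n_obs) measurement-error covariance R
    """
    n_obs, n_members = observations.size, ensemble.shape[1]
    # Sample covariance of the prior ensemble
    anomalies = ensemble - ensemble.mean(axis=1, keepdims=True)
    P = anomalies @ anomalies.T / (n_members - 1)
    # Kalman gain K = P H^T (H P H^T + R)^-1
    H = obs_operator
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + obs_error_cov)
    # Perturb the observations so the analysis ensemble keeps the right spread
    perturbed = observations[:, None] + rng.multivariate_normal(
        np.zeros(n_obs), obs_error_cov, size=n_members).T
    return ensemble + K @ (perturbed - H @ ensemble)

rng = np.random.default_rng(0)
prior = rng.normal(1.0, 0.5, size=(3, 200))   # 3 parameters, 200 members
H = np.array([[1.0, 0.0, 0.0]])               # we observe parameter 0 only
R = np.array([[0.01]])
posterior = enkf_update(prior, np.array([2.0]), H, R, rng)
```

Repeating this step as new production data arrive progressively pulls the ensemble toward the measurements, while the remaining spread reflects the residual uncertainty.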
30

Liu, Jian, Jiaqi Guo, Bing Hu, Qiqing Zhai, Can Tang and Wanjia Zhang. "Controlled Symmetry with Woods-Saxon Stochastic Resonance Enabled Weak Fault Detection". Sensors 23, no. 11 (25 May 2023): 5062. http://dx.doi.org/10.3390/s23115062.

Full text
APA, Harvard, Vancouver, ISO and other styles
Abstract:
Weak fault detection with stochastic resonance (SR) is distinct from conventional approaches in that it is a nonlinear optimal signal processing to transfer noise into the signal, resulting in a higher output SNR. Owing to this special characteristic of SR, this study develops a controlled symmetry with Woods-Saxon stochastic resonance (CSwWSSR) model based on the Woods-Saxon stochastic resonance (WSSR), where each parameter of the model may be modified to vary the potential structure. Then, the potential structure of the model is investigated in this paper, along with the mathematical analysis and experimental comparison to clarify the effect of each parameter on it. The CSwWSSR is a tri-stable stochastic resonance, but differs from others in that each of its three potential wells is controlled by different parameters. Moreover, the particle swarm optimization (PSO), which can quickly find the ideal parameter matching, is introduced to attain the optimal parameters of the CSwWSSR model. Fault diagnosis of simulation signals and bearings was carried out to confirm the viability of the proposed CSwWSSR model, and the results revealed that the CSwWSSR model is superior to its constituent models.
31

Liu, Mingliang, Dario Grana and Leandro Passos de Figueiredo. "Uncertainty quantification in stochastic inversion with dimensionality reduction using variational autoencoder". GEOPHYSICS 87, no. 2 (27 December 2021): M43–M58. http://dx.doi.org/10.1190/geo2021-0138.1.

Full text
APA, Harvard, Vancouver, ISO and other styles
Abstract:
Estimating rock and fluid properties in the subsurface from geophysical measurements is a computationally and memory-intensive inverse problem. For nonlinear problems with non-Gaussian variables, analytical solutions are generally not available, and the solutions of those inverse problems must be approximated using sampling and optimization methods. To reduce the computational cost, model and data can be reparameterized into low-dimensional spaces where the solution of the inverse problem can be computed more efficiently. Among the potential dimensionality reduction methods, deep-learning algorithms based on deep generative models provide an efficient approach to reduce the dimension of the model and data vectors. However, such dimension reduction might lead to information loss in the reconstructed model and data, reduction of the accuracy and resolution of the inverted models, and under- or overestimation of the uncertainty of the predicted models. To comprehensively investigate the impact of model and data dimension reduction with deep generative models on uncertainty quantification, we compare the prediction uncertainty in nonlinear inverse problem solutions obtained from Markov chain Monte Carlo and ensemble-based data assimilation methods implemented in lower dimensional data and model spaces using a deep variational autoencoder. Our workflow is applied to two geophysical inverse problems for the prediction of reservoir properties: prestack seismic inversion and seismic history matching. The inversion results consist of the most likely model and a set of realizations of the variables of interest. The application of dimensionality reduction methods makes the stochastic inversion more efficient.
32

Korsunovs, Aleksandrs, Felician Campean, Gaurav Pant, Oscar Garcia-Afonso and Efe Tunc. "Evaluation of zero-dimensional stochastic reactor modelling for a Diesel engine application". International Journal of Engine Research 21, no. 4 (29 April 2019): 592–609. http://dx.doi.org/10.1177/1468087419845823.

Full text
APA, Harvard, Vancouver, ISO and other styles
Abstract:
Prediction of engine-out emissions with high fidelity from in-cylinder combustion simulations is still a significant challenge early in the engine development process. This article contributes to this fast evolving body of knowledge by focusing on the evaluation of the NOx emission prediction capability of probability density function–based stochastic reactor engine models for a Diesel engine. The research implements a systematic approach to the study of stochastic reactor engine model performance, underpinned by a detailed space-filling design of experiments (DoE)-based sensitivity analysis of both external and internal parameters, evaluating their effects on the accuracy in matching physical measurements of both in-cylinder conditions and NOx output. The approach proposed in this article introduces an automatic stochastic reactor engine model calibration methodology across the engine operating envelope, based on a multi-objective optimization approach. This aims to exploit opportunities for internal stochastic reactor engine model parameter tuning to achieve good overall modelling performance as a trade-off between physical in-cylinder measurement accuracy and the output NOx emission prediction error. The results from the case study provide a valuable insight into the effectiveness of the stochastic reactor engine model, showing good capability for NOx emission prediction and trends, while pointing out the critical sensitivity to the external input parameters and modelling conditions.
33

Liu, Jida, and Changqi Dong. "Understanding the Complex Adaptive Characteristics of Cross-Regional Emergency Collaboration in China: A Stochastic Evolutionary Game Approach". Fractal and Fractional 8, no. 2 (5 February 2024): 98. http://dx.doi.org/10.3390/fractalfract8020098.

Full text
APA, Harvard, Vancouver, ISO and other styles
Abstract:
Regional integration and pairing assistance are two forms of cross-regional emergency collaboration practice carried out by the Chinese government. Based on the Chinese government’s emergency management practice, evolutionary game models of cross-regional emergency collaboration were constructed. Further, the traditional evolutionary game model was improved by introducing the stochastic process, and Gaussian white noise was introduced as a random disturbance. The stochastic evolutionary game model was constructed, and the existence and stability of the equilibrium solutions of the two kinds of stochastic evolutionary game systems for cross-regional emergency collaboration were verified based on the stability discrimination theorem of stochastic differential equations. We used numerical simulations to simulate the evolution trajectories of the regional integration and the pairing assistance stochastic evolutionary game system. In the regional integration game system, when the efficiency of emergency collaboration, the emergency capital stock, and the externality coefficients are higher, positive emergency strategies are more likely to become the stable state of the game subjects’ strategy selection. In the pairing assistance game system, the efficiency of emergency collaboration, the rewards and benefits from the central government, and the matching degree between governments all had positive effects on the formation of the positive emergency strategies of the game subjects. In addition, the pairing assistance mechanism for sustainable development requires external support from the central government.
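As a minimal illustration of the modelling idea, replicator-style strategy dynamics perturbed by Gaussian white noise can be integrated with the Euler-Maruyama scheme; the one-dimensional dynamics and the payoff parameter below are simplifications invented for the sketch, not the paper's two-population game:

```python
import numpy as np

def simulate(x0, payoff_gain, sigma, dt=0.01, steps=5000, seed=1):
    """Euler-Maruyama integration of noisy replicator dynamics:
    dx = x(1-x)*payoff_gain dt + sigma*x(1-x) dW,
    where x is the share of players choosing the active emergency strategy."""
    rng = np.random.default_rng(seed)
    x = np.empty(steps + 1)
    x[0] = x0
    for t in range(steps):
        drift = x[t] * (1 - x[t]) * payoff_gain
        diffusion = sigma * x[t] * (1 - x[t])          # noise vanishes at the boundaries
        x[t + 1] = np.clip(
            x[t] + drift * dt + diffusion * np.sqrt(dt) * rng.normal(), 0.0, 1.0)
    return x

# With a positive payoff advantage for collaborating, the share of governments
# choosing the active emergency strategy drifts toward 1 despite the noise.
traj = simulate(x0=0.4, payoff_gain=1.0, sigma=0.2)
```

With a negative payoff gain the same code shows the passive strategy absorbing the population instead.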
34

Burda, Michael C., and Mark Weder. "Complementarity of Labor Market Institutions, Equilibrium Unemployment and the Propagation of Business Cycles". German Economic Review 3, no. 1 (1 February 2002): 1–24. http://dx.doi.org/10.1111/1468-0475.00049.

Full text
APA, Harvard, Vancouver, ISO and other styles
Abstract:
This paper evaluates complementarities of labor market institutions and the business cycle in the context of a stochastic dynamic general equilibrium model economy. Matching between workers and vacancies with endogenous time spent in search, Nash-bargained wages, payroll taxation, and differential support for unemployed labor in search and leisure are central aspects of the model. For plausible regions of the policy and institutional parameter space, the model exhibits more persistence than standard real business cycle models and can exhibit indeterminacy of rational expectations paths without increasing returns in production. Furthermore, labor market institutions act in a complementary fashion in generating these effects.
35

Amara-Ouali, Yvenn, Yannig Goude, Pascal Massart, Jean-Michel Poggi and Hui Yan. "A Review of Electric Vehicle Load Open Data and Models". Energies 14, no. 8 (16 April 2021): 2233. http://dx.doi.org/10.3390/en14082233.

Full text
APA, Harvard, Vancouver, ISO and other styles
Abstract:
The field of electric vehicle charging load modelling has been growing rapidly in the last decade. In light of the Paris Agreement, it is crucial to keep encouraging better modelling techniques for successful electric vehicle adoption. Additionally, numerous papers highlight the lack of charging station data available in order to build models that are consistent with reality. In this context, the purpose of this article is threefold. First, to provide the reader with an overview of the open datasets available and ready to be used in order to foster reproducible research in the field. Second, to review electric vehicle charging load models with their strengths and weaknesses. Third, to provide suggestions on matching the models reviewed to six datasets found in this research that have not previously been explored in the literature. The open data search covered more than 860 repositories and yielded around 60 datasets that are relevant for modelling electric vehicle charging load. These datasets include information on charging point locations, historical and real-time charging sessions, traffic counts, travel surveys and registered vehicles. The models reviewed range from statistical characterization to stochastic processes and machine learning and the context of their application is assessed.
36

Wells, Konstans, Barry W. Brook, Robert C. Lacy, Greg J. Mutze, David E. Peacock, Ron G. Sinclair, Nina Schwensow, Phillip Cassey, Robert B. O'Hara and Damien A. Fordham. "Timing and severity of immunizing diseases in rabbits is controlled by seasonal matching of host and pathogen dynamics". Journal of The Royal Society Interface 12, no. 103 (February 2015): 20141184. http://dx.doi.org/10.1098/rsif.2014.1184.

Full text
APA, Harvard, Vancouver, ISO and other styles
Abstract:
Infectious diseases can exert a strong influence on the dynamics of host populations, but it remains unclear why such disease-mediated control only occurs under particular environmental conditions. We used 16 years of detailed field data on invasive European rabbits (Oryctolagus cuniculus) in Australia, linked to individual-based stochastic models and Bayesian approximations, to test whether (i) mortality associated with rabbit haemorrhagic disease (RHD) is driven primarily by seasonal matches/mismatches between demographic rates and epidemiological dynamics and (ii) delayed infection (arising from insusceptibility and maternal antibodies in juveniles) is an important factor in determining disease severity and local population persistence of rabbits. We found that both the timing of reproduction and exposure to viruses drove recurrent seasonal epidemics of RHD. Protection conferred by insusceptibility and maternal antibodies controlled seasonal disease outbreaks by delaying infection; this could have also allowed escape from disease. The persistence of local populations was a stochastic outcome of recovery rates from both RHD and myxomatosis. If susceptibility to RHD is delayed, myxomatosis will have a pronounced effect on population extirpation when the two viruses coexist. This has important implications for wildlife management, because it is likely that such seasonal interplay and disease dynamics have a strong effect on long-term population viability for many species.
37

Li, Boxiao, Eric W. Bhark, Stephen Gross (ret.), Travis C. Billiter and Kaveh Dehghani. "Best Practices of Assisted History Matching Using Design of Experiments". SPE Journal 24, no. 04 (9 May 2019): 1435–51. http://dx.doi.org/10.2118/191699-pa.

Full text
APA, Harvard, Vancouver, ISO and other styles
Abstract:
Summary Assisted history matching (AHM) using design of experiments (DOE) is one of the most commonly applied history-matching techniques in the oil and gas industry. When applied properly, this stochastic method finds a representative ensemble of history-matched reservoir models for probabilistic uncertainty analysis of production forecasts. Although DOE-based AHM is straightforward in concept, it can be misused in practice because the work flow involves many statistical and modeling principles that should be followed rigorously. In this paper, the entire DOE-based AHM work flow is demonstrated in a coherent and comprehensive case study that is divided into seven key stages: problem framing, sensitivity analysis, proxy building, Monte Carlo simulation, history-match filtering, production forecasting, and representative model selection. The best practices of each stage are summarized to help reservoir-management engineers understand and apply this powerful work flow for reliable history matching and probabilistic production forecasting. One major difficulty in any history-matching method is to define the history-match tolerance, which reflects the engineer's comfort level of calling a reservoir model “history matched” even though the difference between simulated and observed production data is not zero. It is a compromise to the intrinsic and unavoidable imperfectness of reservoir-model construction, data measurement, and proxy creation. A practical procedure is provided to help engineers define the history-match tolerance considering the model, data-measurement, and proxy errors.
38

Saglietti, Luca, Federica Gerace, Alessandro Ingrosso, Carlo Baldassi and Riccardo Zecchina. "From statistical inference to a differential learning rule for stochastic neural networks". Interface Focus 8, no. 6 (19 October 2018): 20180033. http://dx.doi.org/10.1098/rsfs.2018.0033.

Full text
APA, Harvard, Vancouver, ISO and other styles
Abstract:
Stochastic neural networks are a prototypical computational device able to build a probabilistic representation of an ensemble of external stimuli. Building on the relationship between inference and learning, we derive a synaptic plasticity rule that relies only on delayed activity correlations, and that shows a number of remarkable features. Our delayed-correlations matching (DCM) rule satisfies some basic requirements for biological feasibility: finite and noisy afferent signals, Dale’s principle and asymmetry of synaptic connections, locality of the weight update computations. Nevertheless, the DCM rule is capable of storing a large, extensive number of patterns as attractors in a stochastic recurrent neural network, under general scenarios without requiring any modification: it can deal with correlated patterns, a broad range of architectures (with or without hidden neuronal states), one-shot learning with the palimpsest property, all the while avoiding the proliferation of spurious attractors. When hidden units are present, our learning rule can be employed to construct Boltzmann machine-like generative models, exploiting the addition of hidden neurons in feature extraction and classification tasks.
39

Harrison, Jonathan U., and Christian A. Yates. "A hybrid algorithm for coupling partial differential equation and compartment-based dynamics". Journal of The Royal Society Interface 13, no. 122 (September 2016): 20160335. http://dx.doi.org/10.1098/rsif.2016.0335.

Full text
APA, Harvard, Vancouver, ISO and other styles
Abstract:
Stochastic simulation methods can be applied successfully to model exact spatio-temporally resolved reaction–diffusion systems. However, in many cases, these methods can quickly become extremely computationally intensive with increasing particle numbers. An alternative description of many of these systems can be derived in the diffusive limit as a deterministic, continuum system of partial differential equations (PDEs). Although the numerical solution of such PDEs is, in general, much more efficient than the full stochastic simulation, the deterministic continuum description is generally not valid when copy numbers are low and stochastic effects dominate. Therefore, to take advantage of the benefits of both of these types of models, each of which may be appropriate in different parts of a spatial domain, we have developed an algorithm that can be used to couple these two types of model together. This hybrid coupling algorithm uses an overlap region between the two modelling regimes. By coupling fluxes at one end of the interface and using a concentration-matching condition at the other end, we ensure that mass is appropriately transferred between PDE- and compartment-based regimes. Our methodology gives notable reductions in simulation time in comparison with using a fully stochastic model, while maintaining the important stochastic features of the system and providing detail in appropriate areas of the domain. We test our hybrid methodology robustly by applying it to several biologically motivated problems including diffusion and morphogen gradient formation. Our analysis shows that the resulting error is small, unbiased and does not grow over time.
40

Fernández Martínez, Juan Luis, Tapan Mukerji, Esperanza García Gonzalo and Amit Suman. "Reservoir characterization and inversion uncertainty via a family of particle swarm optimizers". GEOPHYSICS 77, no. 1 (January 2012): M1–M16. http://dx.doi.org/10.1190/geo2011-0041.1.

Full text
APA, Harvard, Vancouver, ISO and other styles
Abstract:
History matching provides to reservoir engineers an improved spatial distribution of physical properties to be used in forecasting the reservoir response for field management. The ill-posed character of the history-matching problem yields nonuniqueness and numerical instabilities that increase with the reservoir complexity. These features might cause local optimization methods to provide unpredictable results not being able to discriminate among the multiple models that fit the observed data (production history). Also, the high dimensionality of the inverse problem impedes estimation of uncertainties using classical Markov-chain Monte Carlo methods. We attenuated the ill-conditioned character of this history-matching inverse problem by reducing the model complexity using a spatial principal component basis and by combining as observables flow production measurements and time-lapse seismic crosswell tomographic images. Additionally the inverse problem was solved in a stochastic framework. For this purpose, we used a family of particle swarm optimization (PSO) optimizers that have been deduced from a physical analogy of the swarm system. For a synthetic sand-and-shale reservoir, we analyzed the performance of the different PSO optimizers, both in terms of exploration and convergence rate for two different reservoir models with different complexity and under the presence of different levels of white Gaussian noise added to the synthetic observed data. We demonstrated that PSO optimizers have a very good convergence rate for this example, and provide in addition, approximate measures of uncertainty around the optimum facies model. The PSO algorithms are robust in presence of noise, which is always the case for real data.
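A minimal global-best PSO of the kind used in this stochastic framework can be sketched as follows; this is the standard inertia-weight variant rather than any of the physically derived family members analysed in the paper, and the quadratic misfit function is a stand-in for an actual reservoir-flow simulator:

```python
import numpy as np

def pso(objective, bounds, n_particles=30, n_iters=100,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer (global-best topology)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    dim = lo.size
    x = rng.uniform(lo, hi, size=(n_particles, dim))   # positions
    v = np.zeros_like(x)                               # velocities
    pbest = x.copy()                                   # personal bests
    pbest_f = np.array([objective(p) for p in x])
    g = pbest[pbest_f.argmin()].copy()                 # global best
    for _ in range(n_iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Inertia + cognitive pull toward pbest + social pull toward g
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([objective(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        g = pbest[pbest_f.argmin()].copy()
    return g, pbest_f.min()

# Toy misfit: a quadratic "history-match" objective with optimum at 0.3
best, best_f = pso(lambda p: np.sum((p - 0.3) ** 2),
                   (np.full(4, -1.0), np.full(4, 1.0)))
```

Inspecting the final personal bests across the swarm gives the approximate measure of uncertainty around the optimum that the authors exploit.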
41

Sedighi, F., and K. D. D. Stephen. "Faster Convergence in Seismic History Matching by Dividing and Conquering the Unknowns". SPE Journal 15, no. 04 (1 July 2010): 1077–88. http://dx.doi.org/10.2118/121210-pa.

Full text
APA, Harvard, Vancouver, ISO and other styles
Abstract:
Summary Seismic history matching is the process of modifying a reservoir simulation model to reproduce the observed production data in addition to information gained through time-lapse (4D) seismic data. The search for good predictions requires that many models be generated, particularly if there is an interaction between the properties that we change and their effect on the misfit to observed data. In this paper, we introduce a method of improving search efficiency by estimating such interactions and partitioning the set of unknowns into noninteracting subspaces. We use regression analysis to identify the subspaces, which are then searched separately but simultaneously with an adapted version of the quasiglobal stochastic neighborhood algorithm. We have applied this approach to the Schiehallion field, located on the UK continental shelf. The field model, supplied by the operator, contains a large number of barriers that affect flow at different times during production, and their transmissibilities are highly uncertain. We find that we can successfully represent the misfit function as a second-order polynomial dependent on changes in barrier transmissibility. First, this enables us to identify the most important barriers, and, second, we can modify their transmissibilities efficiently by searching subgroups of the parameter space. Once the regression analysis has been performed, we reduce the number of models required to find a good match by an order of magnitude. By using 4D seismic data to condition saturation and pressure changes in history matching effectively, we have gained a greater insight into reservoir behavior and have been able to predict flow more accurately with an efficient inversion tool. We can now determine unswept areas and make better business decisions.
42

Liao, Qinzhuo, Lingzao Zeng, Haibin Chang and Dongxiao Zhang. "Efficient History Matching Using the Markov-Chain Monte Carlo Method by Means of the Transformed Adaptive Stochastic Collocation Method". SPE Journal 24, no. 04 (27 February 2019): 1468–89. http://dx.doi.org/10.2118/194488-pa.

Full text
APA, Harvard, Vancouver, ISO and other styles
Abstract:
Summary Bayesian inference provides a convenient framework for history matching and prediction. In this framework, prior knowledge, system nonlinearity, and measurement errors can be directly incorporated into the posterior distribution of the parameters. The Markov-chain Monte Carlo (MCMC) method is a powerful tool to generate samples from the posterior distribution. However, the MCMC method usually requires a large number of forward simulations. Hence, it can be a computationally intensive task, particularly when dealing with large-scale flow and transport models. To address this issue, we construct a surrogate system for the model outputs in the form of polynomials using the stochastic collocation method (SCM). In addition, we use interpolation with the nested sparse grids and adaptively take into account the different importance of parameters for high-dimensional problems. Furthermore, we introduce an additional transform process to improve the accuracy of the surrogate model in case of strong nonlinearities, such as a discontinuous or unsmooth relation between the input parameters and the output responses. Once the surrogate system is built, we can evaluate the likelihood with little computational cost. Numerical results demonstrate that the proposed method can efficiently estimate the posterior statistics of input parameters and provide accurate results for history matching and prediction of the observed data with a moderate number of parameters.
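The surrogate-accelerated MCMC idea can be illustrated on a one-parameter toy problem; the cubic forward model, Gaussian prior and noise level here are invented for the sketch, and a plain random-walk Metropolis sampler stands in for the paper's adaptive sparse-grid collocation machinery:

```python
import numpy as np

# Hypothetical forward model (expensive in practice) and its polynomial
# surrogate fitted once on a handful of collocation nodes.
def forward(m):
    return m ** 3 + m

nodes = np.linspace(-2, 2, 7)                   # collocation nodes
coeffs = np.polyfit(nodes, forward(nodes), 3)   # degree-3 surrogate (exact here)
surrogate = lambda m: np.polyval(coeffs, m)

def mh_sample(d_obs, sigma, n_steps=20000, step=0.3, seed=0):
    """Random-walk Metropolis using the cheap surrogate in the likelihood."""
    rng = np.random.default_rng(seed)

    def log_post(m):
        # Gaussian likelihood on the surrogate output + N(0,1) prior
        return -0.5 * ((surrogate(m) - d_obs) / sigma) ** 2 - 0.5 * m ** 2

    m, lp, chain = 0.0, None, []
    lp = log_post(m)
    for _ in range(n_steps):
        prop = m + step * rng.normal()
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:   # Metropolis accept/reject
            m, lp = prop, lp_prop
        chain.append(m)
    return np.array(chain[n_steps // 2:])         # discard burn-in

samples = mh_sample(d_obs=forward(0.8), sigma=0.1)
```

Because every likelihood evaluation now costs a polynomial evaluation instead of a flow simulation, long chains become affordable.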
43

Franzese, Giulio, Simone Rossi, Lixuan Yang, Alessandro Finamore, Dario Rossi, Maurizio Filippone and Pietro Michiardi. "How Much Is Enough? A Study on Diffusion Times in Score-Based Generative Models". Entropy 25, no. 4 (7 April 2023): 633. http://dx.doi.org/10.3390/e25040633.

Full text
APA, Harvard, Vancouver, ISO and other styles
Abstract:
Score-based diffusion models are a class of generative models whose dynamics is described by stochastic differential equations that map noise into data. While recent works have started to lay down a theoretical foundation for these models, a detailed understanding of the role of the diffusion time T is still lacking. Current best practice advocates for a large T to ensure that the forward dynamics brings the diffusion sufficiently close to a known and simple noise distribution; however, a smaller value of T should be preferred for a better approximation of the score-matching objective and higher computational efficiency. Starting from a variational interpretation of diffusion models, in this work we quantify this trade-off and suggest a new method to improve quality and efficiency of both training and sampling, by adopting smaller diffusion times. Indeed, we show how an auxiliary model can be used to bridge the gap between the ideal and the simulated forward dynamics, followed by a standard reverse diffusion process. Empirical results support our analysis; for image data, our method is competitive with regard to the state of the art, according to standard sample quality metrics and log-likelihood.
44

Kaiser, T., C. Clemen and H. G. Maas. "AUTOMATED ALIGNMENT OF LOCAL POINT CLOUDS IN DIGITAL BUILDING MODELS". ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLII-5/W2 (20 September 2019): 35–39. http://dx.doi.org/10.5194/isprs-archives-xlii-5-w2-35-2019.

Full text
APA, Harvard, Vancouver, ISO and other styles
Abstract:
For the correct usage and analysis within a BIM environment, image-based point clouds that were created with Structure from Motion (SfM) tools have to be transformed into the building coordinate system via a seven-parameter Helmert transformation. Usually control points are used for the estimation of the transformation parameters. In this paper we present a novel, highly automated approach to calculate these transformation parameters without the use of control points. The process relies on the relationship between wall or plane information of the BIM and three-dimensional line data extracted from the image data. In a first step, 3D lines are extracted from the oriented input images using the tool Line3D++. These lines are defined by the 3D coordinates of their start and end points. Afterwards the lines are matched to the planes originating from the BIM model representing the walls, floors and ceilings. Besides finding a suitable functional and stochastic model for the observation equations and the adjustment calculation, the most critical aspect is finding a correct match between the lines and the planes. We therefore developed a RANSAC-inspired matching algorithm to get a correct assignment between elements of the two data sources. Synthetic test data sets have been created for evaluating the methodology.
45

Anson, Michael, Kai-Chi Thomas Ying and Ming-Fung Francis Siu. "Analytical models towards explaining the difficulty in efficiently matching site concrete supply resources with placing crew needs". Engineering, Construction and Architectural Management 26, no. 8 (16 September 2019): 1672–95. http://dx.doi.org/10.1108/ecam-02-2018-0049.

Abstract:
Purpose: For parts of the time on a typical construction-site concrete pour, the site placing crew is idle waiting for the arrival of the next truckmixer delivery, whereas for other periods, truckmixers are idle on site waiting to be unloaded. Ideally, the work of the crew should be continuous, with successive truckmixers arriving on site just as the preceding truckmixer has been emptied, providing a perfect match between site and concrete-plant resources. In reality, however, sample benchmark data representing 118 concrete pours of 69 m³ average volume illustrate that significant wastage of both crew and truckmixer time occurs. The purpose of this paper is to present and explain the characteristics of the observed wastage pattern and to provide further understanding of the factors affecting the productivity of this everyday routine site concreting system.
Design/methodology/approach: Analytical algebraic models have been developed that apply to both serial and circulating truckmixer dispatch policies. The models connect crew idle time, truckmixer waiting time, truckmixer round-trip time, truckmixer unloading time and truckmixer numbers. The truckmixer dispatch interval is a further parameter in the serial dispatch model. The models illustrate that perfect resource matching cannot be expected in general, such is the sensitivity of the system to the values of those parameters. The models are derived directly from theoretical truckmixer and crew placing time-based flow charts, which graphically depict crew and truckmixer idle times as affected by truckmixer emptying times and other relevant parameters.
Findings: The models successfully represent the magnitudes of the resource wastage seen in real life but fail to mirror the distribution of crew and truckmixer time wastage across the 118-pour benchmark. When augmented to include the simulation of stochastic activity durations, however, the models produce per-pour combinations of crew and truckmixer wastage that do mirror those of the benchmark.
Originality/value: The basic contribution of the paper consists of the proposed analytical models themselves, and their augmented versions, which describe the site and truckmixer resource-wastage characteristics actually observed in practice. A further contribution is the step this makes towards understanding why such an everyday construction process is so apparently wasteful of resources.
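The crew-idle versus truckmixer-wait bookkeeping behind such models can be sketched deterministically. This is a hypothetical simplification of the serial-dispatch setting, not the paper's algebraic model: unloading starts only once a truck has arrived and the crew has finished the previous one.

```python
def pour_idle_times(arrivals, unload_time):
    """Total crew idle time and total truckmixer waiting time for one pour.

    A deterministic sketch: truckmixer i arrives at arrivals[i] (minutes);
    every truck takes `unload_time` minutes to empty; the crew serves trucks
    first-come-first-served and starts with the first arrival.
    """
    crew_idle = 0.0
    truck_wait = 0.0
    crew_free = arrivals[0]  # time at which the crew is next available
    for a in arrivals:
        start = max(a, crew_free)
        crew_idle += max(0.0, a - crew_free)   # crew waits for the truck
        truck_wait += max(0.0, crew_free - a)  # truck waits for the crew
        crew_free = start + unload_time
    return crew_idle, truck_wait

# Dispatch interval (20 min) longer than unloading (15 min): the crew idles
# 5 min per gap and no truck waits; a shorter interval flips the pattern.
print(pour_idle_times([0, 20, 40, 60], unload_time=15))
```

Even this toy version shows the matching sensitivity the authors describe: nudging the unload time past the dispatch interval swaps all wastage from the crew to the truckmixers.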
46

Vazquez, O., C. Young, V. Demyanov, D. Arnold, A. Fisher, A. MacMillan and Mike Christie. "Produced-Water-Chemistry History Matching in the Janice Field". SPE Reservoir Evaluation & Engineering 18, no. 04 (25 November 2015): 564–76. http://dx.doi.org/10.2118/164903-pa.

Abstract:
Summary. Produced-water-chemistry (PWC) data are the main source of information for monitoring scale precipitation in oilfield operations. Chloride concentration is used to evaluate the seawater fraction of the total produced water per producing well and is included as an extra history-matching constraint to re-evaluate a good conventionally history-matched (HM) reservoir model of the Janice field. PWC is generally not included in conventional history matching, and this approach shows the value of considering the nature of the seawater-injection front and the associated mixing between the distinctive formation water and the injected seawater. Adding the extra constraint resulted in a reconceptualization of the reservoir geology between a key injector and two producers. The transmissibility of a shale layer is locally modified within a range of geologically consistent values. A major lineament is also identified, interpreted as a northwest/southeast-trending fault, and the zero transmissibility of a secondary shale in the Middle Fulmar is locally adjusted to allow crossflow. Both uncertainties are consistent with the complex faulting known to exist in the region of the targeted wells. Other uncertainties carried forward to the assisted-history-matching phase included the water allocation to the major seawater injectors, the thermal fracture orientation of the injectors, and the vertical-to-horizontal permeability ratio (Kv/Kh) of the Fulmar formation. Finally, a stochastic particle-swarm-optimization (PSO) algorithm is used to generate an ensemble of HM models with seawater fraction as an extra constraint in the misfit definition. The use of additional data in history matching has improved the original good HM solution. The match to field oil-production rate is interpreted as improved over a key period, and although no obvious improvement was observed in field water-production rate, the seawater fraction in a number of wells was improved.
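The stochastic PSO loop used for assisted history matching can be sketched generically: particles explore the parameter space under inertia plus pulls toward their own best and the swarm's best misfit. A minimal one-parameter sketch with hypothetical coefficients (inertia 0.7, cognitive/social weights 1.5), not the paper's implementation:

```python
import random

def pso_minimize(misfit, bounds, n_particles=20, iters=100, seed=0):
    """Minimal particle-swarm optimizer for a one-parameter misfit function."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    pbest = pos[:]                        # each particle's best position so far
    pbest_val = [misfit(x) for x in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            vel[i] = 0.7 * vel[i] + 1.5 * r1 * (pbest[i] - pos[i]) + 1.5 * r2 * (gbest - pos[i])
            pos[i] = min(hi, max(lo, pos[i] + vel[i]))   # keep within bounds
            val = misfit(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i], val
    return gbest, gbest_val

# Toy misfit with its minimum at a hypothetical transmissibility multiplier of 0.3;
# a real run would evaluate a reservoir simulator against production and PWC data.
best, val = pso_minimize(lambda x: (x - 0.3) ** 2, bounds=(0.0, 1.0))
print(best, val)
```

In practice the misfit evaluation dominates the cost, and the accepted swarm history (not just the single best point) provides the ensemble of HM models the abstract refers to.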
47

Chen, Yiwei, and Ming Hu. "Pricing and Matching with Forward-Looking Buyers and Sellers". Manufacturing & Service Operations Management 22, no. 4 (July 2020): 717–34. http://dx.doi.org/10.1287/msom.2018.0769.

Abstract:
Problem definition: We study a dynamic market over a finite horizon for a single product or service in which buyers with private valuations and sellers with private supply costs arrive following Poisson processes. A single market-making intermediary decides dynamically on the ask and bid prices that will be posted to buyers and sellers, respectively, and on the matching decisions after buyers and sellers agree to buy and sell. Buyers and sellers can wait strategically for better prices after they arrive.
Academic/practical relevance: This problem is motivated by the emerging sharing economy and directly speaks to the core of operations management, which is about matching supply with demand.
Methodology: The dynamic, stochastic, and game-theoretic nature makes the problem intractable. We employ the mechanism-design methodology to establish a tractable upper bound on the optimal profit, which motivates a simple heuristic policy.
Results: Our heuristic policy is: fixed ask and bid prices plus price adjustments as compensation for waiting costs, in conjunction with the greedy matching policy on a first-come-first-served basis. These fixed base prices balance demand and supply in expectation and can be computed efficiently. The waiting-compensated price processes are time-dependent and tend to have opposite trends at the beginning and end of the horizon. Under this heuristic policy, forward-looking buyers and sellers behave myopically. This policy is shown to be asymptotically optimal.
Managerial implications: Our results suggest that the intermediary might not lose much optimality by maintaining stable prices unless the underlying market conditions have significantly changed, not to mention that frequent surge pricing may antagonize riders and induce riders and drivers to behave strategically in ways that are hard to account for with traditional pricing models.
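The matching layer of the heuristic (fixed posted prices, greedy first-come-first-served matching) can be sketched as follows. This is a simplified illustration only: the paper's waiting-cost price adjustments are omitted, and the prices used are hypothetical.

```python
from collections import deque

def greedy_fcfs_match(events, ask, bid):
    """Greedy first-come-first-served matching under fixed posted prices.

    events: list of ("buy", valuation) / ("sell", cost) tuples in arrival order.
    A buyer joins the market if its private valuation is at least the posted
    ask; a seller joins if its cost is at most the posted bid. Agreeing agents
    on opposite sides are matched FCFS as soon as both queues are non-empty.
    """
    buyers, sellers = deque(), deque()
    matches = 0
    for side, value in events:
        if side == "buy" and value >= ask:
            buyers.append(value)
        elif side == "sell" and value <= bid:
            sellers.append(value)
        while buyers and sellers:        # greedy: match immediately
            buyers.popleft()
            sellers.popleft()
            matches += 1
    return matches

events = [("buy", 10), ("sell", 3), ("buy", 4), ("sell", 6), ("buy", 9)]
print(greedy_fcfs_match(events, ask=5, bid=6))
```

The intermediary's margin per match is ask minus bid; the paper's result is that with base prices balancing expected demand and supply, this simple greedy layer is asymptotically optimal.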
48

Schaaf, Alexander, Miguel de la Varga, Florian Wellmann and Clare E. Bond. "Constraining stochastic 3-D structural geological models with topology information using approximate Bayesian computation in GemPy 2.1". Geoscientific Model Development 14, no. 6 (28 June 2021): 3899–913. http://dx.doi.org/10.5194/gmd-14-3899-2021.

Abstract:
Abstract. Structural geomodeling is a key technology for the visualization and quantification of subsurface systems. Given the limited data and the resulting necessity for geological interpretation to construct these geomodels, uncertainty is pervasive and traditionally unquantified. Probabilistic geomodeling allows for the simulation of uncertainties by automatically constructing geomodel ensembles from perturbed input data sampled from probability distributions. But random sampling of input parameters can lead to construction of geomodels that are unrealistic, either due to modeling artifacts or by not matching known information about the regional geology of the modeled system. We present a method to incorporate geological information in the form of known geomodel topology into stochastic simulations to constrain resulting probabilistic geomodel ensembles using the open-source geomodeling software GemPy. Simulated geomodel realizations are checked against topology information using an approximate Bayesian computation approach to avoid the specification of a likelihood function. We demonstrate how we can infer the posterior distributions of the model parameters using topology information in two experiments: (1) a synthetic geomodel using a rejection sampling scheme (ABC-REJ) to demonstrate the approach and (2) a geomodel of a subset of the Gullfaks field in the North Sea comparing both rejection sampling and a sequential Monte Carlo sampler (ABC-SMC). Possible improvements to processing speed of up to 10.1 times are discussed, focusing on the use of more advanced sampling techniques to avoid the simulation of unfeasible geomodels in the first place. Results demonstrate the feasibility of using topology graphs as a summary statistic to restrict the generation of geomodel ensembles with known geological information and to obtain improved ensembles of probable geomodels which respect the known topology information and exhibit reduced uncertainty using stochastic simulation methods.
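The ABC-REJ scheme described above can be sketched generically: draw parameters from the prior, simulate, and accept only realizations whose discrete summary statistic (in the paper, a topology graph) matches the observed one. The toy "geomodel" below is purely hypothetical; it stands in for GemPy, with the sign of a noisy offset playing the role of the topology descriptor.

```python
import random

def abc_rejection(simulate, summary, observed_summary, prior_sample, n_accept, seed=0):
    """Generic ABC rejection sampler (ABC-REJ) with a discrete summary statistic.

    Draws parameters from the prior, runs the simulator, and accepts only
    realizations whose summary matches the observed one exactly
    (zero-tolerance rejection, as is natural for discrete summaries).
    """
    rng = random.Random(seed)
    accepted = []
    while len(accepted) < n_accept:
        theta = prior_sample(rng)
        if summary(simulate(theta, rng)) == observed_summary:
            accepted.append(theta)
    return accepted

# Hypothetical stand-in for a geomodel: the "topology" is just whether a noisy
# layer offset is positive; observing a positive offset constrains the posterior.
prior = lambda rng: rng.uniform(-1.0, 1.0)
model = lambda theta, rng: theta + rng.gauss(0.0, 0.1)
posterior = abc_rejection(model, lambda x: x > 0, True, prior, n_accept=200)
print(sum(posterior) / len(posterior))   # posterior mean shifts positive
```

The inefficiency the authors target is visible even here: every rejected draw costs a full simulation, which is why ABC-SMC and smarter proposal schemes pay off for expensive geomodels.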
49

Avadhanula, Vashist, Andrea Celli, Riccardo Colini-Baldeschi, Stefano Leonardi and Matteo Russo. "Fully Dynamic Online Selection through Online Contention Resolution Schemes". Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 6 (26 June 2023): 6693–700. http://dx.doi.org/10.1609/aaai.v37i6.25821.

Abstract:
We study fully dynamic online selection problems in an adversarial/stochastic setting that includes Bayesian online selection, prophet inequalities, posted price mechanisms, and stochastic probing problems subject to combinatorial constraints. In the classical "incremental" version of the problem, selected elements remain active until the end of the input sequence. On the other hand, in the fully dynamic version of the problem, elements stay active for a limited time interval, and then leave. This models, for example, the online matching of tasks to workers with task/worker-dependent working times, and sequential posted pricing of perishable goods. A successful approach to online selection problems in the adversarial setting is given by the notion of Online Contention Resolution Scheme (OCRS), that uses a priori information to formulate a linear relaxation of the underlying optimization problem, whose optimal fractional solution is rounded online for any adversarial order of the input sequence. Our main contribution is providing a general method for constructing an OCRS for fully dynamic online selection problems. Then, we show how to employ such OCRS to construct no-regret algorithms in a partial information model with semi-bandit feedback and adversarial inputs.
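The classic single-item prophet inequality, which OCRS-style arguments generalize, makes the flavor of these guarantees concrete: accepting the first value above the threshold E[max]/2 earns at least half the offline "prophet's" value in expectation. A minimal Monte Carlo sketch (a textbook illustration, not the paper's construction):

```python
import random

def threshold_select(values, threshold):
    """Accept the first value clearing the threshold (single-item online selection)."""
    for v in values:
        if v >= threshold:
            return v
    return 0.0

def prophet_ratio(sample_values, threshold, trials=20000, seed=0):
    """Estimate E[online value] / E[offline max] for the threshold rule."""
    rng = random.Random(seed)
    online = offline = 0.0
    for _ in range(trials):
        vals = sample_values(rng)
        online += threshold_select(vals, threshold)
        offline += max(vals)
    return online / offline

# Two i.i.d. uniform(0, 1) values: E[max] = 2/3, so the threshold is 1/3.
print(prophet_ratio(lambda rng: [rng.random(), rng.random()], threshold=1 / 3))
```

The fully dynamic setting of the paper is harder: accepted elements later *leave*, so the contention resolution has to track which selections are still active when a new element arrives.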
50

Zhang, Yanfen, and Dean S. Oliver. "History Matching Using the Ensemble Kalman Filter With Multiscale Parameterization: A Field Case Study". SPE Journal 16, no. 02 (8 December 2010): 307–17. http://dx.doi.org/10.2118/118879-pa.

Abstract:
Summary. The increased use of optimization in reservoir management has placed greater demands on the application of history matching to produce models that not only reproduce the historical production behavior but also preserve geological realism and quantify forecast uncertainty. Geological complexity and limited access to the subsurface typically result in a large uncertainty in reservoir properties and forecasts. However, there is a systematic tendency to underestimate such uncertainty, especially when rock properties are modeled using Gaussian random fields. In this paper, we address one important source of uncertainty: the uncertainty in regional trends, by introducing stochastic trend coefficients. The multiscale parameters, including trend coefficients and heterogeneities, can be estimated using the ensemble Kalman filter (EnKF) for history matching. Multiscale heterogeneities are often important, especially in deepwater reservoirs, but are generally poorly represented in history matching. In this paper, we describe a method for representing and updating multiple scales of heterogeneity in the EnKF. We tested our method for updating these variables using production data from a deepwater field whose reservoir model has more than 200,000 unknown parameters. The match of reservoir simulator forecasts to real field data using a standard application of EnKF had not been entirely satisfactory because it was difficult to match the water cut of a main producer in the reservoir. None of the realizations of the reservoir exhibited water breakthrough using the standard parameterization method. By adding uncertainty in large-scale trends of reservoir properties, the ability to match the water cut and other production data was improved substantially. The results indicate that an improvement in the generation of the initial ensemble and in the variables describing the property fields gives an improved history match with plausible geology. The multiscale parameterization of property fields reduces the tendency to underestimate uncertainty while still providing reservoir models that match data.
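The EnKF analysis step that drives this kind of history matching can be sketched in its simplest scalar form: each ensemble member is shifted by the Kalman gain, estimated from ensemble covariances, times its innovation against a perturbed observation. A minimal sketch with a hypothetical linear "simulator", nothing like the 200,000-parameter field model of the paper:

```python
import random

def enkf_update(ensemble, observation, forward, obs_std, seed=0):
    """One EnKF analysis step for a scalar parameter and a scalar observation.

    The Kalman gain is built from the ensemble cross-covariance between
    parameters and predicted data; each member is updated against an
    independently perturbed copy of the observation.
    """
    rng = random.Random(seed)
    predictions = [forward(m) for m in ensemble]
    n = len(ensemble)
    m_mean = sum(ensemble) / n
    d_mean = sum(predictions) / n
    cov_md = sum((m - m_mean) * (d - d_mean)
                 for m, d in zip(ensemble, predictions)) / (n - 1)
    var_d = sum((d - d_mean) ** 2 for d in predictions) / (n - 1)
    gain = cov_md / (var_d + obs_std ** 2)   # Kalman gain
    return [m + gain * (observation + rng.gauss(0.0, obs_std) - d)
            for m, d in zip(ensemble, predictions)]

# Toy history match: linear simulator d = 2*m, true m = 1.5, observed d = 3.0.
rng = random.Random(1)
prior = [rng.gauss(0.0, 1.0) for _ in range(100)]
posterior = enkf_update(prior, 3.0, lambda m: 2.0 * m, obs_std=0.1)
print(sum(posterior) / len(posterior))   # ensemble mean is pulled toward 1.5
```

The multiscale idea in the paper amounts to augmenting the state vector with trend coefficients so that this same update acts on both large-scale trends and local heterogeneities at once.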
