Academic literature on the topic 'Simulation-Based Inference'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Simulation-Based Inference.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Simulation-Based Inference":

1. Gourieroux, Christian, and Alain Monfort. "Simulation-based inference." Journal of Econometrics 59, no. 1-2 (September 1993): 5–33. http://dx.doi.org/10.1016/0304-4076(93)90037-6.
2. Vogelsang, T. J. "Simulation-Based Inference in Econometrics." Journal of the American Statistical Association 97, no. 458 (June 2002): 657. http://dx.doi.org/10.1198/jasa.2002.s478.
3. Brehmer, Johann. "Simulation-based inference in particle physics." Nature Reviews Physics 3, no. 5 (March 22, 2021): 305. http://dx.doi.org/10.1038/s42254-021-00305-6.
4. Sheu, Ching-fan, and Suzanne L. O’Curry. "Simulation-based Bayesian inference using BUGS." Behavior Research Methods, Instruments, & Computers 30, no. 2 (June 1998): 232–37. http://dx.doi.org/10.3758/bf03200649.
5. Cranmer, Kyle, Johann Brehmer, and Gilles Louppe. "The frontier of simulation-based inference." Proceedings of the National Academy of Sciences 117, no. 48 (May 29, 2020): 30055–62. http://dx.doi.org/10.1073/pnas.1912789117.
Abstract:
Many domains of science have developed complex simulations to describe phenomena of interest. While these simulations provide high-fidelity models, they are poorly suited for inference and lead to challenging inverse problems. We review the rapidly developing field of simulation-based inference and identify the forces giving additional momentum to the field. Finally, we describe how the frontier is expanding so that a broad audience can appreciate the profound influence these developments may have on science.
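The likelihood-free idea surveyed in this paper can be illustrated with a minimal rejection-ABC sketch (this is a generic illustration, not the paper's method; the Gaussian toy simulator, the sample-mean summary statistic, and all tolerances are assumptions):

```python
import random
import statistics

def simulator(theta, rng, n=200):
    # Black-box simulator: n Gaussian draws with unknown mean theta.
    return [rng.gauss(theta, 1.0) for _ in range(n)]

def rejection_abc(s_obs, prior_draw, rng, n_sims=5000, eps=0.05):
    # Keep prior draws whose simulated summary statistic (the sample mean)
    # lands within eps of the observed one; the accepted draws approximate
    # the posterior without ever evaluating a likelihood.
    accepted = []
    for _ in range(n_sims):
        theta = prior_draw()
        if abs(statistics.mean(simulator(theta, rng)) - s_obs) < eps:
            accepted.append(theta)
    return accepted

rng = random.Random(0)
data = simulator(1.5, rng)                      # stand-in for real observations
post = rejection_abc(statistics.mean(data), lambda: rng.uniform(-5, 5), rng)
```

Accepted draws concentrate around parameter values whose simulations resemble the data; the neural methods reviewed in the paper replace this crude rejection step with learned density estimators.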
6. Gouriéroux and Monfort. "Simulation Based Inference in Models with Heterogeneity." Annales d'Économie et de Statistique, no. 20/21 (1990): 69. http://dx.doi.org/10.2307/20075807.
7. Ghysels, Khalaf, and Vodounou. "Simulation Based Inference in Moving Average Models." Annales d'Économie et de Statistique, no. 69 (2003): 85. http://dx.doi.org/10.2307/20076364.
8. Jasra, Ajay, David A. Stephens, and Christopher C. Holmes. "On population-based simulation for static inference." Statistics and Computing 17, no. 3 (July 27, 2007): 263–79. http://dx.doi.org/10.1007/s11222-007-9028-9.
9. McKinley, Trevelyan J., Joshua V. Ross, Rob Deardon, and Alex R. Cook. "Simulation-based Bayesian inference for epidemic models." Computational Statistics & Data Analysis 71 (March 2014): 434–47. http://dx.doi.org/10.1016/j.csda.2012.12.012.
10. Tejero-Cantero, Alvaro, Jan Boelts, Michael Deistler, Jan-Matthis Lueckmann, Conor Durkan, Pedro Gonçalves, David Greenberg, and Jakob Macke. "sbi: A toolkit for simulation-based inference." Journal of Open Source Software 5, no. 52 (August 21, 2020): 2505. http://dx.doi.org/10.21105/joss.02505.

Dissertations / Theses on the topic "Simulation-Based Inference":

1. Rannestad, Bjarte. "Exact Statistical Inference in Nonhomogeneous Poisson Processes, based on Simulation." Thesis, Norwegian University of Science and Technology, Department of Mathematical Sciences, 2007. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-10775.
Abstract:

We present a general approach for Monte Carlo computation of conditional expectations of the form E[φ(T) | S = s] given a sufficient statistic S. The idea of the method was first introduced by Lillegård and Engen [4], and has been further developed by Lindqvist and Taraldsen [7, 8, 9]. If a certain pivotal structure is satisfied in our model, the simulation can be done by direct sampling from the conditional distribution, via a simple parameter adjustment of the original statistical model. In general it is shown by Lindqvist and Taraldsen [7, 8] that a weighted sampling scheme needs to be used. The method is in particular applied to the nonhomogeneous Poisson process (NHPP), in order to develop exact goodness-of-fit tests for the null hypothesis that a set of observed failure times follow an NHPP of a specific parametric form. In addition, exact confidence intervals for unknown parameters in the NHPP model are considered [6]. Different test statistics W = W(T), designed to reveal departure from the null model, are presented [1, 10, 11]. By the method given in the following, the conditional expectation of these test statistics can be simulated in the absence of the pivotal structure mentioned above. This extends results given in [10, 11], and answers a question stated in [1]. We present a power comparison of five of the test statistics considered, under the null hypothesis that a set of observed failure times are from an NHPP with log-linear intensity and under the alternative hypothesis of power-law intensity. Finally, a convergence comparison of the method presented here and an alternative approach based on Gibbs sampling is given.
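As a toy illustration of conditioning on a sufficient statistic (a far simpler special case than the thesis's weighted sampling scheme, and not its method): for a homogeneous Poisson process on [0, T], conditional on the number of events n, the event times are distributed as n sorted uniforms, so exact Monte Carlo p-values are available. The data and statistic below are hypothetical.

```python
import random

def conditional_mc_pvalue(times, T, stat, rng, n_mc=2000):
    # Conditional Monte Carlo test for a homogeneous Poisson process on [0, T]:
    # given the number of events n (sufficient for the unknown rate), the event
    # times are n sorted Uniform(0, T) draws, so the null distribution of any
    # test statistic can be simulated exactly, with no nuisance parameter left.
    s_obs = stat(times)
    hits = sum(
        stat(sorted(rng.uniform(0.0, T) for _ in range(len(times)))) >= s_obs
        for _ in range(n_mc))
    return (1 + hits) / (1 + n_mc)

mean_time = lambda ts: sum(ts) / len(ts)       # sensitive to a time trend
late_events = [8.1, 8.7, 9.0, 9.2, 9.5, 9.8]   # hypothetical failure times
p = conditional_mc_pvalue(late_events, T=10.0, stat=mean_time,
                          rng=random.Random(1))
```

Events bunched near the end of the window yield a small p-value, rejecting constant intensity; the thesis extends this exactness to NHPP null models where no such simple pivotal structure exists.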

2. Rouillard, Louis. "Bridging Simulation-based Inference and Hierarchical Modeling: Applications in Neuroscience." Electronic Thesis or Diss., université Paris-Saclay, 2024. http://www.theses.fr/2024UPASG024.
Abstract:
Neuroimaging investigates the brain's architecture and function using magnetic resonance imaging (MRI). To make sense of the complex observed signal, neuroscientists posit explanatory models governed by interpretable parameters. This thesis tackles statistical inference: guessing which parameters could have yielded the signal through the model. Inference in neuroimaging is complicated by at least three hurdles: large dimensionality, large uncertainty, and the hierarchical structure of the data. We look into variational inference (VI) as an optimization-based method to tackle this regime. Specifically, we combine structured stochastic VI and normalizing flows (NFs) to design expressive yet scalable variational families. We apply these techniques to diffusion and functional MRI, on tasks including individual parcellation, microstructure inference, and directional coupling estimation. Through these applications, we underline the interplay between the forward and reverse Kullback-Leibler (KL) divergences as complementary tools for inference. We also demonstrate the ability of automatic VI (AVI) as a reliable and scalable inference method to tackle the challenges of model-driven neuroscience.
3. Khalaf, Lynda. "Simulation based finite and large sample inference methods in seemingly unrelated regressions and simultaneous equations." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1997. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape11/PQDD_0008/NQ38813.pdf.
4. Follestad, Turid. "Stochastic Modelling and Simulation Based Inference of Fish Population Dynamics and Spatial Variation in Disease Risk." Doctoral thesis, Norwegian University of Science and Technology, Faculty of Information Technology, Mathematics and Electrical Engineering, 2003. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-41.
Abstract:

We present a non-Gaussian and non-linear state-space model for the population dynamics of cod along the Norwegian Skagerak coast, embedded in the framework of a Bayesian hierarchical model. The model takes into account both process error, representing natural variability in the dynamics of a population, and observational error, reflecting the sampling process relating the observed data to true abundances. The data set on which our study is based consists of samples of two juvenile age-groups of cod taken by beach seine hauls at a set of sample stations within several fjords along the coast. The age-structured population dynamics model, constituting the prior of the Bayesian model, is specified in terms of the recruitment process and the processes of survival for these two juvenile age-groups and the mature population, for which we have no data. The population dynamics is specified on abundances at the fjord level, and an explicit down-scaling from the fjord level to the level of the monitored stations is included in the likelihood, modelling the sampling process relating the observed counts to the underlying fjord abundances.

We take a sampling-based approach to parameter estimation using Markov chain Monte Carlo methods. The properties of the model in terms of mixing and convergence of the MCMC algorithm are explored empirically on the basis of a simulated data set, and we show how the mixing properties can be improved by re-parameterisation. Estimation of the model parameters, and not the abundances, is the primary aim of the study, and we also propose an alternative approach to the estimation of the model parameters based on the marginal posterior distribution, integrating over the abundances.

Based on the estimated model we illustrate how we can simulate the release of juvenile cod, imitating an experiment conducted in the early 20th century to resolve a controversy between a fisherman and a scientist who could not agree on the effect of releasing cod larvae on the mature abundance of cod. This controversy initiated the monitoring programme generating the data used in our study.
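The sampling-based estimation this abstract refers to can be sketched, in heavily simplified form, with a random-walk Metropolis sampler on a toy one-dimensional target (the target, step size, and iteration counts are illustrative, not the thesis's hierarchical model):

```python
import math
import random

def metropolis(logpost, x0, rng, n_iter=6000, step=0.8):
    # Random-walk Metropolis, the workhorse behind sampling-based parameter
    # estimation: propose a Gaussian step and accept it with probability
    # min(1, posterior ratio), so only an unnormalised log-posterior is needed.
    x, lp = x0, logpost(x0)
    chain = []
    for _ in range(n_iter):
        prop = x + rng.gauss(0.0, step)
        lp_prop = logpost(prop)
        if math.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        chain.append(x)
    return chain

# Toy target: a N(2, 1) log-density, up to its normalising constant.
rng = random.Random(3)
chain = metropolis(lambda x: -0.5 * (x - 2.0) ** 2, x0=0.0, rng=rng)
post_burn = chain[1000:]
```

The step size plays the role the thesis's re-parameterisation addresses at a larger scale: both govern how quickly the chain mixes through the posterior.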

5. Fuentes, Antonio. "Proactive Decision Support Tools for National Park and Non-Traditional Agencies in Solving Traffic-Related Problems." Diss., Virginia Tech, 2019. http://hdl.handle.net/10919/88727.
Abstract:
Transportation engineers have recently begun to incorporate statistical and machine learning approaches to solve difficult problems, mainly due to the vast quantities of stochastic data collected (from sensors, video, and human collection). In transportation engineering, a transportation system is often denoted by jurisdiction boundaries and evaluated as such. However, it is ultimately defined by the consideration of the analyst in trying to answer the question of interest. In this dissertation, a transportation system located in Jackson, Wyoming, under the jurisdiction of Grand Teton National Park and recognized as the Moose-Wilson Corridor, is evaluated to identify transportation-related factors that influence its operational performance. The evaluation considers its unique prevalent conditions and takes into account future management strategies. The dissertation accomplishes this by detailing four distinct aspects in individual chapters; each chapter is a standalone manuscript with a detailed introduction, purpose, literature review, findings, and conclusion. Chapter 1 provides a general introduction and a summary of Chapters 2–6. Chapter 2 evaluates the operational performance of the Moose-Wilson Corridor's entrance station, where queueing performance and probability mass functions of the vehicle arrival rates are determined. Chapter 3 evaluates a parking system within the Moose-Wilson Corridor at a popular attraction known as the Laurance S. Rockefeller Preserve, in which the system's operational performance is evaluated and probability mass functions under different arrival and service rates are provided. Chapter 4 provides a data science approach to predicting the probability of vehicles stopping along the Moose-Wilson Corridor, using a machine learning classification methodology known as a "decision tree."
In this study, probabilities of stopping at attractions are predicted based on GPS tracking data that include entrance location, time of day and stopping at attractions. Chapter 5 considers many of the previous findings, discusses and presents a developed tool which utilizes a Bayesian methodology to determine the posterior distributions of observed arrival rates and service rates which serve as bounds and inputs to an Agent-Based Model. The Agent-Based Model represents the Moose-Wilson Corridor under prevailing conditions and considers some of the primary operational changes in Grand Teton National Park's comprehensive management plan for the Moose-Wilson Corridor. The implementation of an Agent-Based Model provides a flexible platform to model multiple aspects unique to a National Park, including visitor behavior and its interaction with wildlife. Lastly, Chapter 6 summarizes and concludes the dissertation.
6. Kazakov, Mikhaïl. "A Methodology of semi-automated software integration: an approach based on logical inference. Application to numerical simulation solutions of Open CASCADE." INSA de Rouen, 2004. http://www.theses.fr/2004ISAM0001.
Abstract:
Application integration is the process of bringing data or functionality from one program together with that of other application programs that were not initially created to work together. Recently, the integration of numerical simulation solvers has gained importance. Integration within this domain is highly complex due to the presence of non-standard application interfaces that exchange complex, diverse and often ambiguous data. Nowadays, the integration is done mostly manually. The difficulties of the manual process force an increase in the level of automation of the integration process. The author of this dissertation created a methodology and its software implementation for semi-automated (i.e., partially automated) application integration. Application interfaces are usually represented by their syntactical definitions, but these miss the high-level semantics of applicative domains: the human understanding of what the software does. The author proposes to use formal specifications (ontologies) expressed in Description Logics to specify software interfaces and define their high-level semantics. The author proposes a three-tier informational model for structuring ontologies and the integration process. This model distinguishes among computation-independent domain knowledge (domain ontology), platform-independent interface specifications (interface ontology) and platform-specific technological integration information (technological ontology). A mediation ontology is defined to fuse the specifications. A reasoning procedure over these ontologies searches for semantic links among syntactic definitions of application interfaces. Connectors among applications are generated using the information about semantic links, and the integrated applications then communicate via the connectors. The author designed a meta-model-based data manipulation approach that facilitates and supports the software implementation of the integration process.
7. Cho, B. "Control of a hybrid electric vehicle with predictive journey estimation." Thesis, Cranfield University, 2008. http://hdl.handle.net/1826/2589.
Abstract:
Battery energy management plays a crucial role in the fuel economy improvement of charge-sustaining parallel hybrid electric vehicles. Currently available control strategies consider the battery state of charge (SOC) and the driver's request through the pedal input in decision-making. This method does not achieve optimal performance for saving fuel or maintaining an appropriate SOC level, especially during operation in extreme driving conditions or hilly terrain. The objective of this thesis is to develop a control algorithm using forthcoming traffic conditions and road elevation, which could be fed from navigation systems. This would enable the controller to predict the potential for regenerative charging to capture cost-free energy, and to intentionally deplete battery energy to assist the engine at high power demand. The starting point for this research is the modelling of a small sport-utility vehicle, based on an analysis of the vehicles currently available on the market. The result of the analysis is used to establish a generic mild hybrid powertrain model, which is subsequently examined to compare the performance of controllers. A baseline is established with a conventional powertrain equipped with a spark ignition direct injection engine and a continuously variable transmission. Hybridisation of this vehicle with an integrated starter alternator and a traditional rule-based control strategy is presented. Parameter optimisation in four standard driving cycles is explained, followed by a detailed energy flow analysis. An additional potential improvement is demonstrated by dynamic programming (DP), which shows the benefit of predictive control. Based on these results, a predictive control algorithm using fuzzy logic is introduced. The main tools of the controller design are the DP, an adaptive-network-based fuzzy inference system with subtractive clustering, and design of experiments.
Using a quasi-static backward simulation model, the performance of the controller is compared with the results from the instantaneous control and the DP. The focus is on fuel saving and SOC control at the end of journeys, especially in aggressive driving conditions and hilly terrain. The controller shows good potential for improving fuel economy and maintaining tight SOC control on long journeys and in hilly terrain. Fuel economy improvement and SOC correction are close to the optimal solution from the DP, especially on long trips over steep roads, where there is a large gap between the baseline controller and the DP. However, there is little benefit on short trips and flat roads, owing to the low improvement margin of the mild hybrid powertrain and the limited future journey information. As a further step towards implementation, a software-in-the-loop simulation model is developed: a fully dynamic model of the powertrain and the control algorithm implemented in an AMESim-Simulink co-simulation environment. This shows a small deterioration of controller performance caused by the driver's pedal action, powertrain dynamics, and limited computational precision.
8. Dominicy, Yves. "Quantile-based inference and estimation of heavy-tailed distributions." Doctoral thesis, Universite Libre de Bruxelles, 2014. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/209311.
Abstract:
This thesis is divided into four chapters. The first two chapters introduce a parametric quantile-based estimation method for univariate heavy-tailed distributions and elliptical distributions, respectively. For those interested in estimating the tail index without imposing a parametric form on the entire distribution function, but only on the tail behaviour, we propose a multivariate Hill estimator for elliptical distributions in chapter three. In the first three chapters we assume an independent and identically distributed setting; as a first step towards a dependent setting, using quantiles, we prove in the last chapter the asymptotic normality of marginal sample quantiles for stationary processes under the S-mixing condition.

The first chapter introduces a quantile- and simulation-based estimation method, which we call the Method of Simulated Quantiles, or simply MSQ. Since it is based on quantiles, it is a moment-free approach; and since it is based on simulations, we do not need closed-form expressions of any function that represents the probability law of the process. Thus, it is useful when the probability density function has no closed form and/or moments do not exist. The method is based on a vector of functions of quantiles. The principle consists in matching functions of theoretical quantiles, which depend on the parameters of the assumed probability law, with those of empirical quantiles, which depend on the data. Since the theoretical functions of quantiles may not have a closed-form expression, we rely on simulations.
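A minimal sketch of this quantile-matching idea, assuming a toy Cauchy scale-estimation problem (the candidate grid, sample sizes, and the choice of the interquartile range as the matched function are illustrative, not the chapter's estimator):

```python
import math
import random
import statistics

def iqr(xs):
    q1, _, q3 = statistics.quantiles(xs, n=4)
    return q3 - q1

def msq_scale(data, candidates, rng, n_sim=20000):
    # Method-of-Simulated-Quantiles flavour: choose the Cauchy scale whose
    # *simulated* interquartile range best matches the empirical one.
    # Quantiles always exist, even though the Cauchy has no mean, so the
    # procedure is moment-free; simulation stands in for closed-form quantiles.
    target = iqr(data)
    def sim_iqr(s):
        return iqr([s * math.tan(math.pi * (rng.random() - 0.5))
                    for _ in range(n_sim)])
    return min(candidates, key=lambda s: abs(sim_iqr(s) - target))

rng = random.Random(2)
true_scale = 1.5
data = [true_scale * math.tan(math.pi * (rng.random() - 0.5))
        for _ in range(4000)]
est = msq_scale(data, [0.5, 1.0, 1.5, 2.0, 2.5], rng)
```

A real MSQ implementation would optimise a distance between vectors of quantile functions rather than grid-search a single spread measure, but the moment-free matching principle is the same.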

The second chapter deals with the estimation of the parameters of elliptical distributions by means of a multivariate extension of MSQ. In this chapter we propose inference for vast dimensional elliptical distributions. Estimation is based on quantiles, which always exist regardless of the thickness of the tails, and testing is based on the geometry of the elliptical family. The multivariate extension of MSQ faces the difficulty of constructing a function of quantiles that is informative about the covariation parameters. We show that the interquartile range of a projection of pairwise random variables onto the 45 degree line is very informative about the covariation.

The third chapter consists in constructing a multivariate tail index estimator. In the univariate case, the most popular estimator for the tail exponent is the Hill estimator introduced by Bruce Hill in 1975. The aim of this chapter is to propose an estimator of the tail index in a multivariate context; more precisely, in the case of regularly varying elliptical distributions. Since, for univariate random variables, our estimator boils down to the Hill estimator, we name it after Bruce Hill. Our estimator is based on the distance between an elliptical probability contour and the exceedance observations.

Finally, the fourth chapter investigates the asymptotic behaviour of the marginal sample quantiles for p-dimensional stationary processes and we obtain the asymptotic normality of the empirical quantile vector. We assume that the processes are S-mixing, a recently introduced and widely applicable notion of dependence. A remarkable property of S-mixing is the fact that it doesn't require any higher order moment assumptions to be verified. Since we are interested in quantiles and processes that are probably heavy-tailed, this is of particular interest.



9. Toft, Albin. "Particle-based Parameter Inference in Stochastic Volatility Models: Batch vs. Online." Thesis, KTH, Matematisk statistik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-252313.
Abstract:
This thesis focuses on comparing an online parameter estimator to an offline estimator, both based on the PaRIS algorithm, when estimating parameter values for a stochastic volatility model. By modeling the stochastic volatility model as a hidden Markov model, estimators based on particle filters can be implemented in order to estimate the unknown parameters of the model. The results of this thesis imply that the proposed online estimator could be considered superior to its offline counterpart. The results are, however, somewhat inconclusive, and further research on the subject is recommended.
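A bootstrap particle filter for a basic stochastic-volatility hidden Markov model can be sketched as follows (a generic illustration, not the PaRIS algorithm; the model parameters and toy return series are assumptions):

```python
import math
import random

def bootstrap_pf(ys, rng, phi=0.95, sigma=0.3, n_part=500):
    # Bootstrap particle filter for a basic stochastic-volatility HMM:
    # latent log-volatility x_t = phi * x_{t-1} + sigma * eta_t and
    # observation y_t ~ N(0, exp(x_t)). Returns the log-likelihood
    # estimate that both batch and online parameter estimators build on.
    sd0 = sigma / math.sqrt(1.0 - phi * phi)      # stationary initial spread
    parts = [rng.gauss(0.0, sd0) for _ in range(n_part)]
    loglik = 0.0
    for y in ys:
        parts = [phi * x + rng.gauss(0.0, sigma) for x in parts]
        ws = [math.exp(-0.5 * (math.log(2 * math.pi) + x
                               + y * y * math.exp(-x)))
              for x in parts]
        loglik += math.log(sum(ws) / n_part)
        # Multinomial resampling keeps the particle cloud from degenerating.
        parts = rng.choices(parts, weights=ws, k=n_part)
    return loglik

returns = [0.1, -0.4, 0.2, 0.8, -0.3, 0.05]   # hypothetical daily returns
ll = bootstrap_pf(returns, random.Random(4))
```

Batch (offline) estimators repeatedly rerun such a filter over the full record for different parameter values, while online estimators update parameter statistics as each observation arrives, which is the trade-off the thesis examines.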
10. Wang, Shiwei. "Motion Control for Intelligent Ground Vehicles Based on the Selection of Paths Using Fuzzy Inference." Digital WPI, 2014. https://digitalcommons.wpi.edu/etd-theses/725.
Abstract:
In this paper I describe a motion planning technique for intelligent ground vehicles. The technique is an implementation of a path selection algorithm based on fuzzy inference, extending the motion planning algorithm known as driving with tentacles. The selection of the tentacle (a drivable path) to follow relies on the calculation of a weighted cost function for each tentacle in the current speed set, and depends on variables such as the distance to the desired position, the speed, and the closeness of a tentacle to any obstacles. A Matlab simulation and a practical implementation of the fuzzy inference rule on a Clearpath Husky robot within the Robot Operating System (ROS) framework are provided.
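The weighted cost function described above might look like the following crisp sketch (the weights, metric names, and candidate values are all hypothetical; the thesis replaces such fixed weights with fuzzy inference rules):

```python
def pick_tentacle(tentacles, w_goal=0.5, w_clear=0.35, w_speed=0.15):
    # Weighted-cost tentacle selection: each candidate path is scored on
    # distance to the goal, obstacle clearance (inverted, so tighter gaps
    # cost more), and speed deviation; the lowest-cost tentacle wins.
    def cost(t):
        return (w_goal * t["goal_dist"]
                + w_clear / max(t["clearance"], 1e-6)
                + w_speed * abs(t["speed_err"]))
    return min(tentacles, key=cost)

candidates = [  # hypothetical metrics for three drivable paths
    {"id": 0, "goal_dist": 3.0, "clearance": 0.2, "speed_err": 0.0},
    {"id": 1, "goal_dist": 3.5, "clearance": 1.5, "speed_err": 0.2},
    {"id": 2, "goal_dist": 6.0, "clearance": 2.0, "speed_err": 0.8},
]
best = pick_tentacle(candidates)
```

Here the shortest path to the goal loses to a slightly longer one because its obstacle clearance is poor, which is exactly the trade-off the fuzzy rules are meant to arbitrate smoothly.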

Books on the topic "Simulation-Based Inference":

1. Dufour, Jean-Marie. Simulation based finite and large sample inference methods in multivariate regressions and seemingly unrelated regressions. Bristol: University of Bristol, Department of Economics, 1999.
2. Teichrow, Jon S., University of Houston–Clear Lake Research Institute for Computing and Information Systems, and Lyndon B. Johnson Space Center Information Technology Division, eds. Real-time fuzzy inference based robot path planning: Final report. [Clear Lake City, Tex.?]: Research Institute for Computing and Information Systems, University of Houston-Clear Lake, 1990.
3. Mariano, Roberto, Til Schuermann, and Melvyn J. Weeks, eds. Simulation-based Inference in Econometrics. Cambridge University Press, 2000. http://dx.doi.org/10.1017/cbo9780511751981.
4. Schuermann, Til, Roberto Mariano, and Melvyn J. Weeks. Simulation-Based Inference in Econometrics: Methods and Applications. Cambridge University Press, 2010.
5. Simulation-based Inference in Econometrics: Methods and Applications. Cambridge University Press, 2008.
6. Mariano, Roberto, Til Schuermann, and Melvyn J. Weeks, eds. Simulation-based Inference in Econometrics: Methods and Applications. Cambridge University Press, 2000.
7. Levin, Ines, and Betsy Sinclair. Causal Inference with Complex Survey Designs. Edited by Lonna Rae Atkeson and R. Michael Alvarez. Oxford University Press, 2016. http://dx.doi.org/10.1093/oxfordhb/9780190213299.013.4.
Abstract:
This article discusses methods that combine survey weighting and propensity score matching to estimate population average treatment effects. Beginning with an overview of causal inference techniques that incorporate data from complex surveys and the usefulness of survey weights, it then considers approaches for incorporating survey weights into three matching algorithms, along with their respective methodologies: nearest-neighbor matching, subclassification matching, and propensity score weighting. It also presents the results of a Monte Carlo simulation study that illustrates the benefits of incorporating survey weights into propensity score matching procedures, as well as the problems that arise when survey weights are ignored. Finally, it explores the differences between population-based inferences and sample-based inferences using real-world data from the 2012 panel of The American Panel Survey (TAPS). The article highlights how an apparent impact of social media usage on political participation can arise in the sample even when no such impact is present in the target population.
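A toy sketch of carrying survey weights into nearest-neighbour propensity score matching (the unit records, weight field, and scalar score are all hypothetical; a real procedure would estimate the score from covariates and handle ties and calipers):

```python
def weighted_att(treated, controls, weight_key="w"):
    # Nearest-neighbour matching on a scalar propensity score ("ps"), with
    # survey weights carried into the estimate so that it targets the
    # population average treatment effect on the treated, not the sample's.
    num = den = 0.0
    for unit in treated:
        match = min(controls, key=lambda c: abs(c["ps"] - unit["ps"]))
        num += unit[weight_key] * (unit["y"] - match["y"])
        den += unit[weight_key]
    return num / den

treated = [{"ps": 0.80, "y": 5.0, "w": 2.0},   # hypothetical survey units;
           {"ps": 0.55, "y": 4.0, "w": 1.0}]   # "w" is the survey weight
controls = [{"ps": 0.75, "y": 3.0},
            {"ps": 0.50, "y": 1.0}]
att = weighted_att(treated, controls)
```

Dropping the weights (setting them all to 1) changes the estimate whenever the over-sampled units respond differently, which is precisely the sample-versus-population gap the article documents.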
8. Quintana, José Mario, Carlos Carvalho, James Scott, and Thomas Costigliola. Extracting S&P500 and NASDAQ Volatility: The Credit Crisis of 2007–2008. Edited by Anthony O'Hagan and Mike West. Oxford University Press, 2018. http://dx.doi.org/10.1093/oxfordhb/9780198703174.013.13.
Abstract:
This article demonstrates the utility of Bayesian modelling and inference in financial market volatility analysis, using the 2007–2008 credit crisis as a case study. It first describes the applied problem and the goal of the Bayesian analysis before introducing the sequential estimation models. It then discusses the simulation-based methodology for inference, including Markov chain Monte Carlo (MCMC) and particle filtering methods for filtering and parameter learning. In the study, Bayesian sequential model choice techniques are used to estimate volatility and volatility dynamics for daily data for the year 2007 for three market indices: the Standard and Poor's S&P500, the NASDAQ NDX100 and the financial equity index called XLF. Three models of financial time series are estimated: a model with stochastic volatility, a model with stochastic volatility that also incorporates jumps in volatility, and a GARCH model.
9

Real-time fuzzy inference based robot path planning: Final report. [Clear Lake City, Tex.?]: Research Institute for Computing and Information Systems, University of Houston-Clear Lake, 1990.

10

Bao, Yun, Carl Chiarella, and Boda Kang. Particle Filters for Markov-Switching Stochastic Volatility Models. Edited by Shu-Heng Chen, Mak Kaboudan, and Ye-Rong Du. Oxford University Press, 2018. http://dx.doi.org/10.1093/oxfordhb/9780199844371.013.9.

Abstract:
This chapter proposes an auxiliary particle filter algorithm for inference in regime-switching stochastic volatility models in which the regime state is governed by a first-order Markov chain. It uses a continually updated Dirichlet distribution to estimate the transition probabilities of the Markov chain within the auxiliary particle filter. A simulation-based study demonstrates that the method can estimate a class of models in which the probability that the system state transits from one regime to another is relatively high. The methodology is applied to a real time series, namely the foreign exchange rate between the Australian dollar and the South Korean won.
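The Dirichlet updating idea can be shown in isolation. The sketch below (plain Python with NumPy; the two-regime transition matrix is invented) applies the conjugate Dirichlet count update to an observed state sequence; in the chapter the same update runs inside the auxiliary particle filter over inferred, not observed, states.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate a two-regime first-order Markov chain with a known
# transition matrix (illustrative values).
P_true = np.array([[0.9, 0.1],
                   [0.2, 0.8]])
T = 5000
states = np.empty(T, dtype=int)
states[0] = 0
for t in range(1, T):
    states[t] = rng.choice(2, p=P_true[states[t - 1]])

# Sequential Dirichlet update: each row of the transition matrix gets a
# Dirichlet(1, 1) prior, and every observed transition adds one count.
alpha = np.ones((2, 2))
for t in range(1, T):
    alpha[states[t - 1], states[t]] += 1

# Posterior mean of each row is the normalized count vector.
P_hat = alpha / alpha.sum(axis=1, keepdims=True)
print(np.round(P_hat, 2))
```

Because the Dirichlet is conjugate to the categorical transition likelihood, the update is a single increment per time step, which is what makes it cheap enough to carry inside each particle.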

Book chapters on the topic "Simulation-Based Inference":

1

Jávor, András. "Knowledge Based Inference Controlled Logic Simulation." In Advances in Simulation, 397–405. New York, NY: Springer US, 1988. http://dx.doi.org/10.1007/978-1-4684-6389-7_80.

2

Yurrita, Mireia, Arnaud Grignard, Luis Alonso, and Kent Larson. "Real-Time Inference of Urban Metrics Applying Machine Learning to an Agent-Based Model Coupling Mobility Mode and Housing Choice." In Multi-Agent-Based Simulation XXII, 125–38. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-94548-0_10.

3

Ayusuk, Apiwat, and Songsak Sriboonchitta. "Copula Based Volatility Models and Extreme Value Theory for Portfolio Simulation with an Application to Asian Stock Markets." In Causal Inference in Econometrics, 279–93. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-27284-9_17.

4

Cabarcos, M., R. P. Otero, and S. G. Pose. "Efficient Concurrent Simulation of DEVS Systems Based on Concurrent Inference." In Computer Aided Systems Theory - EUROCAST’99, 307–18. Berlin, Heidelberg: Springer Berlin Heidelberg, 2000. http://dx.doi.org/10.1007/10720123_28.

5

Møller, Jesper, and Rasmus P. Waagepetersen. "An Introduction to Simulation-Based Inference for Spatial Point Processes." In Spatial Statistics and Computational Methods, 143–98. New York, NY: Springer New York, 2003. http://dx.doi.org/10.1007/978-0-387-21811-3_4.

6

Ben Abdessalem, Anis. "Structural Model Updating and Model Selection: Bayesian Inference Approach Based on Simulation." In Lecture Notes in Civil Engineering, 223–33. Cham: Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-57224-1_22.

7

Li, Xiao, and Xiaoping Chen. "Fuzzy Inference Based Forecasting in Soccer Simulation 2D, the RoboCup 2015 Soccer Simulation 2D League Champion Team." In RoboCup 2015: Robot World Cup XIX, 144–52. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-29339-4_12.

8

Kim, Mingoo, and Taeseok Jin. "Simulation of Mobile Robot Navigation Using the Distributed Control Command Based Fuzzy Inference." In Intelligent Robotics and Applications, 457–66. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-13966-1_45.

9

Herbach, Ulysse. "Harissa: Stochastic Simulation and Inference of Gene Regulatory Networks Based on Transcriptional Bursting." In Computational Methods in Systems Biology, 97–105. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-42697-1_7.

10

Flores-Parra, Josue-Miguel, Manuel Castañón-Puga, Carelia Gaxiola-Pacheco, Luis-Enrique Palafox-Maestre, Ricardo Rosales, and Alfredo Tirado-Ramos. "A Fuzzy Inference System and Data Mining Toolkit for Agent-Based Simulation in NetLogo." In Computer Science and Engineering—Theory and Applications, 127–49. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-74060-7_7.


Conference papers on the topic "Simulation-Based Inference":

1

Nevin, Becky. "Simulation-based Inference Cosmology from Strong Lensing." In Simulation-based Inference Cosmology from Strong Lensing. US DOE, 2024. http://dx.doi.org/10.2172/2282440.

2

Tamargo-Arizmendi, Marcos. "Simulation Based Inference with Domain Adaptation for Strong Gravitational Lensing." In Simulation Based Inference with Domain Adaptation for Strong Gravitational Lensing. US DOE, 2023. http://dx.doi.org/10.2172/2246937.

3

Jarugula, Sreevani, Brian Nord, Jason Poh, Aleksandra Ciprijanovic, and Becky Nevin. "Cosmology constraints from Strong Gravitational Lensing using Simulation-Based Inference." In Cosmology constraints from Strong Gravitational Lensing using Simulation-Based Inference. US DOE, 2023. http://dx.doi.org/10.2172/2246728.

4

Poh, Jason, Ashwin Samudre, Aleksandra Ciprijanovic, Brian Nord, Gourav Khullar, Dimitrios Tanoglidis, and Joshua Frieman. "Strong Lensing Parameter Estimation on Ground-Based Imaging Data Using Simulation-Based Inference." In Strong Lensing Parameter Estimation on Ground-Based Imaging Data Using Simulation-Based Inference. US DOE, 2023. http://dx.doi.org/10.2172/1958791.

5

Dunham, Bruce. "Exploring Simulation-Based Inference in a High School Course." In Bridging the Gap: Empowering and Educating Today’s Learners in Statistics. International Association for Statistical Education, 2022. http://dx.doi.org/10.52041/iase.icots11.t14a3.

Abstract:
Although inferential concepts are typically introduced in high school courses, the approaches taught usually mirror the methodologies of introductory university classes. Much research supports that learners have difficulty with classical frequentist inference and that inferential concepts are better understood when first introduced via simulation-based methods. A new course available to high schools in British Columbia, Canada, incorporates several novel aspects, a key feature being its reliance on “intuitive,” simulation-based inference. We describe the pedagogical approaches adopted in this course and how students appear to have learned from their experiences.
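A randomization test is the canonical classroom example of such "intuitive," simulation-based inference. The sketch below (plain Python; the two groups of scores are made up) estimates a p-value by repeatedly shuffling group labels, the kind of exercise such a course might use in place of a classical t-test.

```python
import random

# Made-up scores for two groups; is the observed difference in means
# surprising if labels were assigned at random?
group_a = [12, 15, 14, 16, 13, 17]
group_b = [10, 11, 9, 12, 10, 11]
observed = sum(group_a) / len(group_a) - sum(group_b) / len(group_b)

pooled = group_a + group_b
random.seed(0)
reps = 10000
count = 0
for _ in range(reps):
    random.shuffle(pooled)                       # random relabelling
    diff = sum(pooled[:6]) / 6 - sum(pooled[6:]) / 6
    if diff >= observed:                         # as extreme or more so
        count += 1

p_value = count / reps
print(p_value)
```

The simulated p-value is simply the fraction of shuffles at least as extreme as the data, a definition students can state and verify without any distribution theory.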
6

Kulkarni, Sourabh, Alexander Tsyplikhin, Mario Michael Krell, and Csaba Andras Moritz. "Accelerating Simulation-based Inference with Emerging AI Hardware." In 2020 International Conference on Rebooting Computing (ICRC). IEEE, 2020. http://dx.doi.org/10.1109/icrc2020.2020.00003.

7

Dong, Ming, Jianzhong Cha, and Mingcheng E. "Knowledge-Based Integrated Manufacturing Flexible Simulation System." In ASME 1996 Design Engineering Technical Conferences and Computers in Engineering Conference. American Society of Mechanical Engineers, 1996. http://dx.doi.org/10.1115/96-detc/dac-1041.

Abstract:
In this paper, we present a knowledge-based flexible simulation system for integrated manufacturing. The simulation model knowledge base of a CIMS consists of five parts: FBS models of the CIM-OSA system architecture, an entity classes library, a procedural knowledge base, a database, and inference engines. The knowledge-based simulation models are represented in an object-oriented frame language, and their behaviours are generated by inference engines reasoning over sets of procedural rules. Because the use of various class libraries makes the system flexible, we call it a knowledge-based integrated manufacturing flexible simulation system.
8

Yun-Jie, Ji, Chen Wen-Qi, and He Ling. "Risk Identification and Simulation Based on the Bayesian Inference." In 2018 4th Annual International Conference on Network and Information Systems for Computers (ICNISC). IEEE, 2018. http://dx.doi.org/10.1109/icnisc.2018.00089.

9

Boyali, Ali, Simon Thompson, and David Robert Wong. "Identification of Vehicle Dynamics Parameters Using Simulation-based Inference." In 2021 IEEE Intelligent Vehicles Symposium Workshops (IV Workshops). IEEE, 2021. http://dx.doi.org/10.1109/ivworkshops54471.2021.9669252.

10

IKIDID, Abdelouafi, and El Fazziki Abdelaziz. "Multi-Agent and Fuzzy Inference Based Framework for Urban Traffic Simulation." In 2019 Fourth International Conference on Systems of Collaboration Big Data, Internet of Things & Security ( SysCoBIoTS). IEEE, 2019. http://dx.doi.org/10.1109/syscobiots48768.2019.9028016.


Reports on the topic "Simulation-Based Inference":

1

Chen, Zhenzhen. Model-based analysis in survey: an application in analytic inference and a simulation in Small Area Estimation. Ames (Iowa): Iowa State University, January 2019. http://dx.doi.org/10.31274/cc-20240624-1010.

2

Paule, Bernard, Flourentzos Flourentzou, Tristan de KERCHOVE d’EXAERDE, Julien BOUTILLIER, and Nicolo Ferrari. PRELUDE Roadmap for Building Renovation: set of rules for renovation actions to optimize building energy performance. Department of the Built Environment, 2023. http://dx.doi.org/10.54337/aau541614638.

Abstract:
In the context of climate change and the environmental and energy constraints we face, it is essential to develop methods that encourage the implementation of efficient solutions for building renovation. One of the objectives of the European PRELUDE project [1] is to develop a "Building Renovation Roadmap" (BRR) aimed at facilitating decision-making to foster the most efficient refurbishment actions, the implementation of innovative solutions, and the promotion of renewable energy sources in the renovation of existing buildings. In this context, Estia is developing inference rules that, starting from a diagnosis such as the Energy Performance Certificate, help establish a list of priority actions. This process reduces the subjectivity of human decision-making. While simulation generates digital technical data, interpretation requires translating those data into natural language; the purpose is to automate this translation of the results to provide advice and facilitate decision-making. In medicine, the diagnostic phase is a process by which a disease is identified by its symptoms. Similarly, the idea here is to target the faulty elements potentially responsible for poor performance and to propose remedial solutions. The system is based on fuzzy logic rules [2],[3]. This choice makes it possible to manipulate notions of membership with truth levels between 0 and 1 and to deliver messages in a linguistic form understandable by non-specialist users. For example, if performance is low and parameter x is unfavourable, the algorithm can give an incentive to improve the parameter, such as: "you COULD, SHOULD or MUST change parameter x". Regarding energy performance, the analysis addresses the following domains: heating, domestic hot water, cooling, and lighting. Regarding the parameters, it covers the characteristics of the building envelope and of the technical installations (heat production and distribution, ventilation system, electric lighting, etc.). This paper describes the methodology used, lists the fields studied, and outlines the expected outcomes of the project.
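The graded-advice idea described above can be sketched directly. In the snippet below (Python; all thresholds, membership shapes, and wording are hypothetical, not PRELUDE's actual rules), a crisp performance score is mapped to a truth level between 0 and 1, which is then phrased as COULD/SHOULD/MUST advice.

```python
# Hypothetical membership function for the fuzzy proposition
# "performance is low": 1 below `lo`, 0 above `hi`, linear in between.
def membership_low(performance, lo=0.3, hi=0.7):
    if performance <= lo:
        return 1.0
    if performance >= hi:
        return 0.0
    return (hi - performance) / (hi - lo)

# Translate the truth level into graded natural-language advice,
# mirroring the COULD/SHOULD/MUST phrasing in the abstract.
def advise(performance, parameter_name):
    truth = membership_low(performance)
    if truth > 0.8:
        return f"you MUST change {parameter_name}"
    if truth > 0.5:
        return f"you SHOULD change {parameter_name}"
    if truth > 0.2:
        return f"you COULD change {parameter_name}"
    return f"{parameter_name} is acceptable"

print(advise(0.35, "wall insulation"))
```

The continuous membership value is what lets the same rule produce stronger or weaker advice as the diagnosed performance worsens, instead of a single crisp threshold.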
3

Stewart, Jonathan, Grace Parker, Joseph Harmon, Gail Atkinson, David Boore, Robert Darragh, Walter Silva, and Youssef Hashash. Expert Panel Recommendations for Ergodic Site Amplification in Central and Eastern North America. Pacific Earthquake Engineering Research Center, University of California, Berkeley, CA, March 2017. http://dx.doi.org/10.55461/tzsy8988.

Abstract:
The U.S. Geological Survey (USGS) national seismic hazard maps have historically been produced for a reference site condition of VS30 = 760 m/sec (where VS30 is time averaged shear wave velocity in the upper 30 m of the site). The resulting ground motions are modified for five site classes (A-E) using site amplification factors for peak acceleration and ranges of short- and long-oscillator periods. As a result of Project 17 recommendations, this practice is being revised: (1) maps will be produced for a range of site conditions (as represented by VS30 ) instead of a single reference condition; and (2) the use of site factors for period ranges is being replaced with period-specific factors over the period range of interest (approximately 0.1 to 10 sec). Since the development of the current framework for site amplification factors in 1992, the technical basis for the site factors used in conjunction with the USGS hazard maps has remained essentially unchanged, with only one modification (in 2014). The approach has been to constrain site amplification for low-to-moderate levels of ground shaking using inference from observed ground motions (approximately linear site response), and to use ground response simulations (recently combined with observations) to constrain nonlinear site response. Both the linear and nonlinear site response has been based on data and geologic conditions in the western U.S. (an active tectonic region). This project and a large amount of previous and contemporaneous related research (e.g., NGA-East Geotechnical Working Group for site response) has sought to provide an improved basis for the evaluation of ergodic site amplification in central and eastern North America (CENA). The term ‘ergodic’ in this context refers to regionally-appropriate, but not site-specific, site amplification models (i.e., models are appropriate for CENA generally, but would be expected to have bias for any particular site). 
The specific scope of this project was to review and synthesize relevant research results so as to provide recommendations to the USGS for the modeling of ergodic site amplification in CENA for application in the next version of USGS maps. The panel assembled for this project recommends a model provided as three terms that are additive in natural logarithmic units. Two describe linear site amplification. One of these describes VS30-scaling relative to a 760 m/sec reference, is largely empirical, and has several distinct attributes relative to models for active tectonic regions. The second linear term adjusts site amplification from the 760 m/sec reference to the CENA reference condition (used with NGA-East ground motion models) of VS = 3000 m/sec; this second term is simulation-based. The panel is also recommending a nonlinear model, which is described in a companion report [Hashash et al. 2017a]. All median model components are accompanied by models for epistemic uncertainty. The models provided in this report are recommended for application by the USGS and other entities. The models are considered applicable for VS30 = 200–2000 m/sec site conditions and oscillator periods of 0.08–5 sec. Finally, it should be understood that as ergodic models, they lack attributes that may be important for specific sites, such as resonances at site periods. Site-specific analyses are recommended to capture such effects for significant projects and for any site condition with VS30 < 200 m/sec. We recommend that future site response models for hazard applications consider a two-parameter formulation that includes a measure of site period in addition to site stiffness.
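The model's additive structure, though not its fitted coefficients, can be written down compactly. In the sketch below (Python; the three component functions are placeholders, not the panel's recommendations), the total ergodic site amplification is the exponential of the sum of the three natural-log terms the abstract describes.

```python
import math

# Total ergodic site amplification as the sum of three terms that are
# additive in natural-log units: a linear VS30-scaling term, a
# 760 -> 3000 m/sec reference adjustment, and a nonlinear term.
def ln_site_amp(vs30, period, f_lin, f_ref, f_nl):
    return f_lin(vs30, period) + f_ref(period) + f_nl(vs30, period)

# Hypothetical illustrative components (NOT the report's coefficients):
f_lin = lambda vs30, T: -0.5 * math.log(vs30 / 760.0)  # stiffer -> less amp
f_ref = lambda T: 0.3                                  # 760 vs 3000 m/sec
f_nl = lambda vs30, T: 0.0                             # weak shaking: ~linear

amp = math.exp(ln_site_amp(vs30=400.0, period=1.0,
                           f_lin=f_lin, f_ref=f_ref, f_nl=f_nl))
print(round(amp, 2))
```

Working in natural-log units means each term, including its epistemic-uncertainty model, can be developed and updated independently before the terms are summed.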
4

Tsidylo, Ivan M., Serhiy O. Semerikov, Tetiana I. Gargula, Hanna V. Solonetska, Yaroslav P. Zamora, and Andrey V. Pikilnyak. Simulation of intellectual system for evaluation of multilevel test tasks on the basis of fuzzy logic. CEUR Workshop Proceedings, June 2021. http://dx.doi.org/10.31812/123456789/4370.

Abstract:
The article describes the stages of modeling an intelligent system for evaluating multilevel test tasks based on fuzzy logic in the MATLAB application package, namely the Fuzzy Logic Toolbox. It analyzes existing approaches to fuzzy assessment of tests, with their advantages and disadvantages. The methods considered for assessing students generally fall into two groups: those using fuzzy sets and corresponding membership functions, and those using the fuzzy estimation method and its generalized variant. The present work uses the Sugeno production model as the closest to natural language; this closeness allows closer interaction with a subject-area expert and the building of well-understood, easily interpreted inference systems. The structure of the fuzzy system and the functions and mechanisms of model building are described. The system is presented as a block diagram of fuzzy logical nodes and consists of four input variables, corresponding to the levels of knowledge assimilation, and one output variable. The response surface of the fuzzy system reflects the dependence of the final grade on the level of difficulty of the task and the degree of correctness of the task.
The intelligent system for assessing multilevel test tasks modeled in this way makes it possible to take into account the fuzzy characteristics of the test: the level of difficulty of the task, assessed as “easy”, “average”, “above average”, or “difficult”; the degree of correctness of the task, assessed as “correct”, “partially correct”, “rather correct”, or “incorrect”; the time allotted for a test task or test, assessed as “short”, “medium”, “long”, or “very long”; the percentage of correctly completed tasks, assessed as “small”, “medium”, “large”, or “very large”; and the final mark for the test, assessed as “poor”, “satisfactory”, “good”, or “excellent”. This approach ensures maximum consideration of answers to questions of all levels of complexity by formulating a base of inference rules and selecting weighting coefficients when deriving the final estimate. The robustness of the system is achieved by using Gaussian membership functions. Testing the controller on the test sample confirms the functional suitability of the developed model.
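The Sugeno-style evaluation can be sketched with two inputs instead of four. The snippet below (Python; the membership centres, widths, and rule outputs are hypothetical, not the article's) uses Gaussian membership functions and weighted-average defuzzification, as in a zero-order Sugeno system.

```python
import math

# Gaussian membership function, as in the article's robustness choice.
def gauss(x, centre, sigma=0.2):
    return math.exp(-((x - centre) ** 2) / (2 * sigma ** 2))

# Zero-order Sugeno system: each rule pairs input memberships with a
# constant output; the crisp grade is the weighted average of outputs.
def grade(difficulty, correctness):
    """Both inputs in [0, 1]; returns a crisp mark in [0, 1]."""
    rules = [
        # (difficulty centre, correctness centre, rule output)
        (0.2, 1.0, 0.6),   # easy task solved correctly -> modest mark
        (0.8, 1.0, 1.0),   # hard task solved correctly -> top mark
        (0.2, 0.0, 0.0),   # easy task failed -> no credit
        (0.8, 0.0, 0.2),   # hard task failed -> small credit
    ]
    num = den = 0.0
    for dc, cc, out in rules:
        w = gauss(difficulty, dc) * gauss(correctness, cc, sigma=0.4)
        num += w * out
        den += w
    return num / den

print(round(grade(0.8, 1.0), 2))
```

Because the rule outputs are constants, the final mark varies smoothly with difficulty and correctness, which is what the response surface mentioned in the abstract visualizes.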
