Theses on the topic "Rare events simulation"
Cite a source in APA, MLA, Chicago, Harvard, and many other citation styles
Consult the top 50 dissertations / theses for your research on the topic "Rare events simulation".
Next to every source in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic citation of the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the scientific publication in .pdf format and read the abstract (summary) of the work online, if it is included in the metadata.
Browse theses from many research areas and compile an accurate bibliography.
Liu, Hong. "Rare events, heavy tails, and simulation". Diss., Connect to online resource, 2006. http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqdiss&rft_dat=xri:pqdiss:3239435.
Full text
Booth, Jonathan James. "New applications of boxed molecular dynamics : efficient simulation of rare events". Thesis, University of Leeds, 2016. http://etheses.whiterose.ac.uk/13101/.
Full text
Liu, Gang. "Rare events simulation by shaking transformations : Non-intrusive resampler for dynamic programming". Thesis, Université Paris-Saclay (ComUE), 2016. http://www.theses.fr/2016SACLX043/document.
Full text
This thesis contains two parts: rare event simulation and a non-intrusive stratified resampler for dynamic programming. The first part consists in quantifying statistics related to events which are unlikely to happen but which have serious consequences. We propose Markovian transformations on path spaces and combine them with the theories of interacting particle systems and of Markov chain ergodicity to obtain methods which apply very generally and have good performance. The second part consists in solving dynamic programming problems numerically in a context where we only have historical observations of small size and do not know the values of the model parameters. We propose and analyze a new scheme based on stratification and resampling techniques.
DHAMODARAN, RAMYA. "EFFICIENT ANALYSIS OF RARE EVENTS ASSOCIATED WITH INDIVIDUAL BUFFERS IN A TANDEM JACKSON NETWORK". University of Cincinnati / OhioLINK, 2004. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1099073321.
Full text
Freitas, Rodrigo Moura, 1989. "Molecular simulation = methods and applications = Simulações moleculares : métodos e aplicações". [s.n.], 2013. http://repositorio.unicamp.br/jspui/handle/REPOSIP/278440.
Testo completoDissertação (mestrado) - Universidade Estadual de Campinas, Instituto de Física Gleb Wataghin
Abstract: Due to the conceptual and technical advances being made in computational physics and computational materials science, we have been able to tackle problems that were inaccessible a few years ago. In this dissertation we study the evolution of some of these techniques, presenting the theory and simulation methods used to study first-order phase transitions, with emphasis on state-of-the-art free-energy calculation (Reversible Scaling) and rare event (Forward Flux Sampling) methods using the atomistic simulation technique of Molecular Dynamics. The evolution and efficiency improvement of these techniques is presented together with applications to simple systems that allow exact solutions, as well as the more complex case of martensitic phase transitions. We also present the application of numerical methods to study Pauling's model of ice. We have developed and implemented a new algorithm for efficient generation of disordered ice structures. This ice generator algorithm allows us to create ice Ih cells of sizes not reported before. Using this algorithm we address finite-size effects not studied before.
Master's degree
Physics
Master in Physics
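The Freitas abstract above mentions Forward Flux Sampling (FFS) as a rare event method. As a rough, self-contained illustration of the FFS idea only (not of the systems studied in the dissertation), the sketch below estimates the probability of crossing between the two basins of a 1D double well under overdamped Langevin dynamics; the potential, the interfaces and all numerical parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
beta, dt = 3.0, 1e-3                       # inverse temperature, time step (assumed)
force = lambda x: -4.0 * x * (x**2 - 1.0)  # -dV/dx for V(x) = (x^2 - 1)^2

def step(x):
    """One Euler-Maruyama step of overdamped Langevin dynamics."""
    return x + force(x) * dt + np.sqrt(2.0 * dt / beta) * rng.normal()

# Interfaces between basin A (x near -1) and basin B (x near +1); illustrative values.
lambdas = [-0.8, -0.5, -0.2, 0.1, 0.4, 0.7]

# Stage 0: harvest first crossings of lambda_0 coming from basin A.
x, starts = -1.0, []
while len(starts) < 200:
    x_new = step(x)
    if x < lambdas[0] <= x_new:             # upward crossing of the first interface
        starts.append(x_new)
    x = x_new if x_new < 0.5 else -1.0      # reset if we accidentally reach B

# Stages 1..n: from each interface, fire trials until they reach the next
# interface or fall back towards A; p_i is the fraction that makes it forward.
probs, configs = [], starts
for lam_next in lambdas[1:]:
    successes = []
    for x0 in configs:
        x = x0
        while lambdas[0] - 0.3 < x < lam_next:   # fall-back boundary is an assumption
            x = step(x)
        if x >= lam_next:
            successes.append(x)
    probs.append(len(successes) / len(configs))
    if not successes:
        break
    # Reuse successful endpoints (with replacement) as starting points.
    configs = list(rng.choice(successes, size=len(configs)))

print("stage probabilities:", np.round(probs, 3))
print("P(reach B | crossed lambda_0) ~", np.prod(probs))
```

The product of the stage probabilities is the FFS estimate of the conditional crossing probability; a production implementation would also estimate the flux through the first interface to obtain a rate.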
Hewett, Angela Dawn. "Expecting the unexpected : to what extent does simulation help healthcare professionals prepare for rare, critical events during childbearing?" Thesis, University of Leeds, 2016. http://etheses.whiterose.ac.uk/15431/.
Full text
Rai, Ajit. "Estimation de la disponibilité par simulation, pour des systèmes incluant des contraintes logistiques". Thesis, Rennes 1, 2018. http://www.theses.fr/2018REN1S105/document.
Full text
RAM (Reliability, Availability and Maintainability) analysis forms an integral part of the estimation of Life Cycle Costs (LCC) of passenger rail systems. These systems are highly reliable and include complex logistics. Standard Monte Carlo simulations are rendered useless for efficient estimation of RAM metrics due to the issue of rare events. System failures of these complex passenger rail systems can include rare events and thus need efficient simulation techniques. Importance Sampling (IS) techniques are an advanced class of variance reduction techniques that can overcome the limitations of standard simulations. IS techniques can provide acceleration of simulations, meaning less variance in the estimation of RAM metrics for the same computational budget as a standard simulation. However, IS involves changing the probability laws (change of measure) that drive the mathematical models of the systems during simulations, and the optimal IS change of measure is usually unknown, even though theoretically a perfect one (the zero-variance IS change of measure) exists. In this thesis, we focus on the use of IS techniques and their application to estimate two RAM metrics: reliability (for static networks) and steady-state availability (for dynamic systems). The thesis focuses on finding and/or approximating the optimal IS change of measure to efficiently estimate RAM metrics in a rare-event context. The contribution of the thesis is broadly divided into two main axes: first, we propose an adaptation of the approximate zero-variance IS method to estimate the reliability of static networks and show the application on real passenger rail systems; second, we propose a multi-level Cross-Entropy optimization scheme that can be used during pre-simulation to obtain CE-optimized IS rates for the transitions of Markovian Stochastic Petri Nets (SPNs) and use them in the main simulations to estimate the steady-state unavailability of highly reliable Markovian systems with complex logistics involved. Results from the methods show huge variance reduction and gain compared to MC simulations.
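The multi-level Cross-Entropy scheme mentioned above can be illustrated on a toy problem. The sketch below is my own simplified example (a sum of exponentials rather than the stochastic Petri nets of the thesis): the level is raised gradually and the tilted mean of the sampling distribution is re-estimated from elite samples, after which a final importance sampling run estimates the rare event probability.

```python
import numpy as np

rng = np.random.default_rng(1)
d, gamma, N, rho = 5, 30.0, 2000, 0.1   # dimension, rare threshold, sample size, elite fraction

def sample(v, n):
    """n samples of d iid exponentials with mean v (the nominal mean is 1)."""
    return rng.exponential(v, size=(n, d))

def log_w(x, v):
    """Log likelihood ratio f_1(x) / f_v(x) for iid exponentials with means 1 and v."""
    return -x.sum(axis=1) - (-x.sum(axis=1) / v - d * np.log(v))

# Multilevel cross-entropy: raise the level gradually and retune the mean v.
v, level = 1.0, -np.inf
while level < gamma:
    x = sample(v, N)
    s = x.sum(axis=1)
    level = min(gamma, np.quantile(s, 1.0 - rho))      # current (1 - rho) quantile
    elite = s >= level
    w = np.exp(log_w(x[elite], v))
    v = np.sum(w * x[elite].mean(axis=1)) / np.sum(w)  # CE update of the tilted mean

# Final importance-sampling estimate with the tuned parameter.
x = sample(v, 10 * N)
s = x.sum(axis=1)
est = np.mean((s > gamma) * np.exp(log_w(x, v)))
print(f"tilted mean v = {v:.2f},  P(S > {gamma}) ~ {est:.3e}")
```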
Picciani, Massimiliano. "Rare events in many-body systems : reactive paths and reaction constants for structural transitions". Phd thesis, Université Pierre et Marie Curie - Paris VI, 2012. http://tel.archives-ouvertes.fr/tel-00706510.
Full text
Silva Lopes, Laura. "Méthodes numériques pour la simulation d'évènements rares en dynamique moléculaire". Thesis, Paris Est, 2019. http://www.theses.fr/2019PESC1045.
Full text
In stochastic dynamical systems, such as those encountered in molecular dynamics, rare events naturally appear as events due to some low-probability stochastic fluctuations. Examples of rare events in our everyday life include earthquakes and major floods. In chemistry, protein folding, ligand unbinding from a protein cavity and the opening or closing of channels in cell membranes are examples of rare events. Simulation of rare events has been an important field of research in biophysics over the past thirty years. The events of interest in molecular dynamics generally involve transitions between metastable states, which are regions of the phase space where the system tends to stay trapped. These transitions are rare, making the use of a naive, direct Monte Carlo method computationally impracticable. To deal with this difficulty, sampling methods have been developed to efficiently simulate rare events. Among them are splitting methods, which consist in dividing the rare event of interest into successive nested, more likely events. Adaptive Multilevel Splitting (AMS) is a splitting method in which the positions of the intermediate interfaces, used to split reactive trajectories, are adapted on the fly. The surfaces are defined such that the probability of transition between them is constant, which minimizes the variance of the rare event probability estimator. AMS is a robust method that requires a small number of user-defined parameters, and is therefore easy to use. This thesis focuses on the application of the adaptive multilevel splitting method to molecular dynamics. Two kinds of systems are studied. The first contains simple models that allowed us to improve the way AMS is used. The second contains more realistic and challenging systems, where AMS is used to gain a better understanding of the molecular mechanisms. Hence, the contributions of this thesis include both methodological and numerical results. We first validate the AMS method by applying it to the paradigmatic alanine dipeptide conformational change. We then propose a new technique combining AMS and importance sampling to efficiently sample the initial conditions ensemble when using AMS to obtain the transition time. This is validated on a simple one-dimensional problem, and our results show its potential for applications in complex multidimensional systems. A new way to identify reaction mechanisms is also proposed in this thesis. It consists in applying clustering techniques to the ensemble of reactive trajectories generated by the AMS method. The implementation of the AMS method for NAMD has been improved during this thesis work. In particular, this manuscript includes a tutorial on how to use AMS in NAMD. The use of the AMS method allowed us to study two complex molecular systems. The first consists in the analysis of the influence of the water model (TIP3P and TIP4P/2005) on the β-cyclodextrin and ligand unbinding process. In the second, we apply the AMS method to sample unbinding trajectories of a ligand from the N-terminal domain of the Hsp90 protein.
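As a minimal sketch of the Adaptive Multilevel Splitting idea described above (not the NAMD implementation of the thesis), the code below estimates the probability that an Ornstein-Uhlenbeck path started near the origin reaches a high level B before returning to A, killing the worst replica at each iteration and branching a surviving one above the current level; all parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
dt, kappa, sigma = 1e-2, 1.0, 0.4       # OU parameters (illustrative)
x_A, x_B, x0 = 0.0, 1.0, 0.1            # absorbing sets A and B, starting point

def trajectory(x):
    """Run OU dynamics from x until absorption in A (x <= x_A) or B (x >= x_B)."""
    path = [x]
    while x_A < x < x_B:
        x += -kappa * x * dt + sigma * np.sqrt(dt) * rng.normal()
        path.append(x)
    return np.array(path)

N, n_kill = 100, 1                      # replicas, replicas killed per iteration
paths = [trajectory(x0) for _ in range(N)]
scores = np.array([p.max() for p in paths])
weight = 1.0

while scores.min() < x_B:               # stop when every replica reaches B
    worst = int(np.argmin(scores))
    level = scores[worst]
    donors = [i for i in range(N) if scores[i] > level]
    if not donors:                      # degenerate tie, essentially impossible here
        break
    weight *= 1.0 - n_kill / N          # AMS / last-particle probability update
    dpath = paths[int(rng.choice(donors))]
    j = int(np.argmax(dpath > level))   # first index where the donor exceeds the level
    new = np.concatenate([dpath[: j + 1], trajectory(dpath[j])[1:]])
    paths[worst], scores[worst] = new, new.max()

print("AMS estimate of P(reach B before A) ~", weight)
```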
Saggadi, Samira. "Simulation d'évènements rares par Monte Carlo dans les réseaux hautement fiables". Thesis, Rennes 1, 2013. http://www.theses.fr/2013REN1S055.
Full text
Network reliability determination is an NP-hard problem. For instance, in telecommunications, it is desired to evaluate the probability that a selected group of nodes can communicate or not. In this case, a set of disconnected nodes can lead to critical financial and security consequences. A precise estimation of the reliability is therefore needed. In this work, we are interested in the study and the calculation of the reliability of highly reliable networks. In this case the unreliability is very small, which makes the standard Monte Carlo approach useless, because it requires a large number of iterations. For a good estimation of system reliability at minimum cost, we have developed new simulation techniques based on variance reduction using importance sampling.
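A toy version of importance sampling for highly reliable networks, in the spirit of the abstract above (my own minimal example, not the estimators developed in the thesis): link failure probabilities are inflated for sampling and the bias is corrected by the likelihood ratio.

```python
import numpy as np

rng = np.random.default_rng(3)

# A small 5-node network; each undirected link fails independently with prob q.
nodes = range(5)
links = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (0, 2), (1, 3)]
q = 1e-3                      # true failure probability per link
q_is = 0.2                    # inflated failure probability used for sampling

def connected(up_links):
    """Check connectivity of the surviving graph by a depth-first search."""
    adj = {n: set() for n in nodes}
    for a, b in up_links:
        adj[a].add(b); adj[b].add(a)
    seen, stack = {0}, [0]
    while stack:
        for m in adj[stack.pop()]:
            if m not in seen:
                seen.add(m); stack.append(m)
    return len(seen) == len(list(nodes))

N, acc = 50_000, 0.0
for _ in range(N):
    fails = rng.random(len(links)) < q_is
    # Likelihood ratio of the true failure law w.r.t. the sampling law.
    lr = np.prod(np.where(fails, q / q_is, (1 - q) / (1 - q_is)))
    up = [link for link, f in zip(links, fails) if not f]
    if not connected(up):
        acc += lr

print("P(network disconnected) ~", acc / N)
```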
Lestang, Thibault. "Numerical simulation and rare events algorithms for the study of extreme fluctuations of the drag force acting on an obstacle immersed in a turbulent flow". Thesis, Lyon, 2018. http://www.theses.fr/2018LYSEN049/document.
Full text
This thesis discusses the numerical simulation of extreme fluctuations of the drag force acting on an object immersed in a turbulent medium. Because such fluctuations are rare events, they are particularly difficult to investigate by means of direct sampling. Indeed, such an approach requires simulating the dynamics over extremely long durations. In this work an alternative route is introduced, based on rare event algorithms. The underlying idea of such algorithms is to modify the sampling statistics so as to favour rare trajectories of the dynamical system of interest. These techniques recently led to impressive results for relatively simple dynamics. However, it is not clear yet whether such algorithms are useful for complex deterministic dynamics, such as turbulent flows. This thesis focuses on the study of both the dynamics and the statistics of extreme fluctuations of the drag experienced by a square cylinder mounted in a two-dimensional channel flow. This simple framework allows for very long simulations of the dynamics, thus leading to the sampling of a large number of events with an amplitude large enough to be considered extreme. Subsequently, the application of two different rare event algorithms is presented and discussed. In the first case, a drastic reduction of the computational cost required to sample configurations resulting in extreme fluctuations is achieved. Furthermore, several difficulties related to the flow dynamics are highlighted, paving the way to novel approaches specifically designed for turbulent flows.
Poma Bernaola, Adolfo Maximo. "Simulações atomísticas de eventos raros através de Transition Path Sampling". [s.n.], 2007. http://repositorio.unicamp.br/jspui/handle/REPOSIP/278438.
Testo completoDissertação (mestrado) - Universidade Estadual de Campinas, Instituto de Fisica Gleb Wataghin
Abstract: In this dissertation we approach the study of one of the limitations of atomistic simulation, called the rare event, which is responsible for the temporal limitation. Examples of problems that involve rare events are protein folding, conformational changes in molecules, chemical reactions (in solution), diffusion in solids, and nucleation processes in a first-order phase transition, among others. Conventional methods such as Molecular Dynamics (MD) or Monte Carlo (MC) are useful to explore the potential energy landscape of very complex systems, but in the presence of rare events they become very inefficient, due to the lack of statistics in the sampling of the event. These methods spend much computational time sampling irrelevant configurations and not the transitions of interest. In this sense, the Transition Path Sampling (TPS) method, developed by D. Chandler and his collaborators, can explore the potential energy landscape and obtain a set of true dynamical trajectories that connect the metastable states in the presence of rare events. From this ensemble of trajectories the rate constant and the mechanism of reaction can be extracted with great success. In this work we implemented the TPS method and carried out a quantitative comparison against the configurational MC method on a standard problem, the isomerization of a diatomic molecule immersed in a repulsive Weeks-Chandler-Andersen (WCA) fluid. The application of these methods showed how the environment, in the form of a solvent, can affect the kinetics of a rare event.
Master's degree
Statistical Physics and Thermodynamics
Master in Physics
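A bare-bones sketch of a Transition Path Sampling shooting move, in the spirit of the dissertation above but on a much simpler system (a 1D double well with overdamped Langevin dynamics instead of the WCA isomerization problem); the definition of the stable states, the path length and the temperature are illustrative assumptions, deliberately mild so the demo runs quickly.

```python
import numpy as np

rng = np.random.default_rng(4)
beta, dt, L = 2.0, 1e-2, 400            # low barrier so seeding and moves stay fast
force = lambda x: -4.0 * x * (x**2 - 1.0)
in_A = lambda x: x <= -0.7
in_B = lambda x: x >= 0.7

def propagate(x0, n):
    """Generate n overdamped-Langevin steps starting from x0."""
    xs = np.empty(n + 1); xs[0] = x0
    for i in range(n):
        xs[i + 1] = xs[i] + force(xs[i]) * dt + np.sqrt(2 * dt / beta) * rng.normal()
    return xs

# Seed: brute-force search for one reactive (A -> B) path, cheap at this barrier height.
path = propagate(-1.0, L)
while not in_B(path[-1]):
    path = propagate(-1.0, L)

# One-way forward shooting moves: regrow the path from a random time slice with
# fresh noise and accept the trial only if it is still reactive.
n_moves, accepted, t_cross = 2000, 0, []
for _ in range(n_moves):
    k = rng.integers(1, L)
    trial = np.concatenate([path[: k + 1], propagate(path[k], L - k)[1:]])
    if in_A(trial[0]) and in_B(trial[-1]):
        path, accepted = trial, accepted + 1
    t_cross.append(int(np.argmax(path >= 0.0)))   # first barrier-crossing index

print(f"acceptance rate of shooting moves: {accepted / n_moves:.2f}")
print(f"mean first barrier-crossing index over the path ensemble: {np.mean(t_cross):.0f}")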
Sallin, Mathieu. "Approche probabiliste du diagnostic de l'état de santé des véhicules militaires terrestres en environnement incertain". Thesis, Université Clermont Auvergne (2017-2020), 2018. http://www.theses.fr/2018CLFAC099.
Full text
This thesis is a contribution to the structural health analysis of the body of ground military vehicles. Belonging to the 20 to 30 ton range, such vehicles are deployed in a variety of operational contexts where driving conditions are severe and difficult to characterize. In addition, due to growing industrial competition, the mobility function of the vehicles is acquired from suppliers and is no longer developed by Nexter Systems. As a result, the complete definition of this function is unknown. In this context, the main objective of this thesis is to analyze the health of the vehicle body using a probabilistic approach, in order to master the calculation techniques that take into account the random nature of the loads related to the use of ground military vehicles. In particular, the most relevant strategies for propagating uncertainties due to the terrain within a vehicle dynamics model are defined. This work describes how observation data measured on the vehicle can be used for the purpose of assessing reliability with respect to a given damage criterion. An application on a demonstrator entirely designed by Nexter Systems illustrates the proposed approach.
Jegourel, Cyrille. "Rare event simulation for statistical model checking". Thesis, Rennes 1, 2014. http://www.theses.fr/2014REN1S084/document.
Full text
In this thesis, we consider two problems that statistical model checking must cope with. The first problem concerns heterogeneous systems, which naturally introduce complexity and non-determinism into the analysis. The second problem concerns rare properties, which are difficult to observe, and so to quantify. On the first point, we present original contributions for the formalism of composite systems in the BIP language. We propose SBIP, a stochastic extension, and define its semantics. SBIP allows recourse to the stochastic abstraction of components and eliminates non-determinism. This double effect has the advantage of reducing the size of the initial system by replacing it with a system whose semantics is purely stochastic, a necessary requirement for standard statistical model checking algorithms to be applicable. The second part of this thesis is devoted to the verification of rare properties in statistical model checking. We present a state-of-the-art algorithm for models described by a set of guarded commands. Lastly, we motivate the use of importance splitting for statistical model checking and set up an optimal splitting algorithm. Both methods pursue the common goal of reducing the variance of the estimator and the number of simulations. Nevertheless, they are fundamentally different, the first tackling the problem through the model and the second through the properties.
Walter, Clément. "Using Poisson processes for rare event simulation". Thesis, Sorbonne Paris Cité, 2016. http://www.theses.fr/2016USPCC304/document.
Full text
This thesis addresses the issue of extreme event simulation. From an original understanding of splitting methods, a new theoretical framework is proposed, independent of any particular algorithm. This framework is based on a point process associated with any real-valued random variable and lets one define probability, quantile and moment estimators without any hypothesis on this random variable. The artificial selection of thresholds in splitting vanishes, and the estimator of the probability of exceeding a threshold is in fact an estimator of the whole cumulative distribution function up to the given threshold. These estimators are based on the simulation of independent and identically distributed replicas of the point process, so they allow for the use of massively parallel computer clusters. Suitable practical algorithms are thus proposed. Finally, it can happen that these advanced statistics still require too many samples. In this context the computer code is considered as a random process with known distribution. The point process framework lets one handle this additional source of uncertainty and easily estimate the conditional expectation and variance of the resulting random variable. It also defines new SUR enrichment criteria designed for extreme event probability estimation.
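One concrete instance of this point-process view (a simplified sketch, not the general framework of the thesis) is the "last particle" estimator: in the idealized algorithm the number M of times the minimum of N walkers has to be resampled above increasing levels before all exceed a threshold q is Poisson with mean N log(1/P(X > q)), suggesting the estimator (1 - 1/N)^M. The conditional resampling below uses a short Metropolis chain and is only meant as an illustration; a careful implementation would use longer chains and track duplicates.

```python
import numpy as np

rng = np.random.default_rng(5)
N, q = 100, 4.0                    # number of walkers, target threshold

def mcmc_above(x, level, n_steps=30, step=0.5):
    """A few Metropolis steps targeting N(0,1) conditioned on X > level."""
    for _ in range(n_steps):
        y = x + step * rng.normal()
        if y > level and np.log(rng.random()) < 0.5 * (x * x - y * y):
            x = y
    return x

walkers = rng.normal(size=N)
M = 0                              # number of level-raising events
while walkers.min() < q:
    i = int(np.argmin(walkers))
    level = walkers[i]
    donor = walkers[int(rng.choice([j for j in range(N) if j != i]))]
    walkers[i] = mcmc_above(donor, level)
    M += 1

p_hat = (1.0 - 1.0 / N) ** M
print(f"M = {M} events,  P(X > {q}) ~ {p_hat:.2e}  (exact ~ 3.2e-5)")
```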
Suzuki, Yuya. "Rare-event Simulation with Markov Chain Monte Carlo". Thesis, KTH, Matematisk statistik, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-138950.
Full text
Gudmundsson, Thorbjörn. "Rare-event simulation with Markov chain Monte Carlo". Doctoral thesis, KTH, Matematisk statistik, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-157522.
Phinikettos, Ioannis. "Monitoring in survival analysis and rare event simulation". Thesis, Imperial College London, 2012. http://hdl.handle.net/10044/1/9518.
Full text
Bastide, Dorinel-Marian. "Handling derivatives risks with XVAs in a one-period network model". Electronic Thesis or Diss., université Paris-Saclay, 2024. http://www.theses.fr/2024UPASM027.
Full text
Finance regulators require banking institutions to be able to conduct regular scenario analyses to assess their resistance to various shocks (stress tests) on their exposures, in particular towards clearing houses (CCPs), to which they are largely exposed, by applying market shocks to capture market risk and economic shocks leading some financial players to bankruptcy, known as the default state, to reflect both credit and counterparty risks. By interposing itself between financial actors, one of the main purposes of a CCP is to limit counterparty risk due to contractual payment failures caused by one or several defaults among the engaged parties. CCPs also facilitate the various financial flows of trading activities even in the event of default of one or more of their members, by re-arranging certain positions and allocating any loss that could materialize following these defaults to the surviving members. To develop a relevant view of risks and ensure effective capital-steering tools, it is essential for banks to be able to comprehensively understand the losses and liquidity needs caused by these various shocks within these financial networks, as well as to understand the underlying mechanisms. This thesis project aims at tackling modelling issues to answer those different needs that are at the heart of risk management practices for banks under clearing environments. We begin by defining a one-period static model reflecting the heterogeneous market positions and possible joint defaults of multiple financial players, being members of CCPs and other financial participants, to identify the different costs, known as XVAs, generated by both clearing and bilateral activities, with explicit formulas for these costs. Various use cases of this modelling framework are illustrated with stress test exercises on financial networks from a member's point of view and with the novation of portfolios of defaulted CCP members to other surviving members. Fat-tailed distributions are favoured to generate portfolio losses and defaults, with the application of very large-dimension Monte Carlo methods along with numerical uncertainty quantification. We also expand on the novation aspects of portfolios of defaulted members and the associated XVA cost transfers. These novations can be carried out either on the marketplaces (exchanges) or by the CCPs themselves, by identifying the optimal buyers or by conducting auctions of defaulted positions, with dedicated economic equilibrium problems. Defaults of members on several CCPs in common also lead to the formulation and resolution of multidimensional risk-transfer optimization problems that are introduced in this thesis.
Zhang, Benjamin Jiahong. "A coupling approach to rare event simulation via dynamic importance sampling". Thesis, Massachusetts Institute of Technology, 2017. http://hdl.handle.net/1721.1/112384.
Full text
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 106-109).
Rare event simulation involves using Monte Carlo methods to estimate probabilities of unlikely events and to understand the dynamics of a system conditioned on a rare event. An established class of algorithms based on large deviations theory and control theory constructs provably asymptotically efficient importance sampling estimators. Dynamic importance sampling is one of these algorithms, in which the choice of biasing distribution adapts in the course of a simulation according to the solution of an Isaacs partial differential equation or by solving a sequence of variational problems. However, obtaining the solution of either problem may be expensive, and the cost of solving these problems may be even greater than performing simple Monte Carlo exhaustively. Deterministic couplings induced by transport maps allow one to relate a complex probability distribution of interest to a simple reference distribution (e.g. a standard Gaussian) through a monotone, invertible function. This diverts the complexity of the distribution of interest into a transport map. We extend the notion of transport maps between probability distributions on Euclidean space to probability distributions on path space, following a procedure similar to Itô's coupling. The contraction principle is a key concept from large deviations theory that allows one to relate large deviations principles of different systems through deterministic couplings. We show that, with the ability to computationally construct transport maps, we can leverage the contraction principle to reformulate the sequence of variational problems required to implement dynamic importance sampling and make the computation more amenable. We apply this approach to simple rotorcraft models. We conclude by outlining future directions of research, such as using the coupling interpretation to accelerate rare event simulation via particle splitting, using transport maps to learn large deviations principles, and accelerating inference of rare events.
by Benjamin Jiahong Zhang.
S.M.
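As a much simpler illustration of the change-of-measure idea underlying dynamic importance sampling (not the transport-map or Isaacs-equation constructions of the thesis above), the sketch below uses the exponential tilting suggested by large deviations to estimate the probability that the mean of n standard Gaussians exceeds a level a, and compares it with naive Monte Carlo at the same budget.

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(6)
n, a, N = 50, 0.8, 20_000           # walk length, target mean level, samples (assumed)

# Tilted measure suggested by large deviations: shift each N(0,1) increment to
# N(a, 1); the likelihood ratio of a whole path is exp(-a * sum + n * a^2 / 2).
z = rng.normal(loc=a, size=(N, n))
s = z.sum(axis=1)
lr = np.exp(-a * s + 0.5 * n * a * a)
est = np.mean((s / n > a) * lr)

naive = np.mean(rng.normal(size=(N, n)).mean(axis=1) > a)
exact = 0.5 * erfc(a * sqrt(n) / sqrt(2.0))   # P(Z > a*sqrt(n)) for the Gaussian mean
print(f"IS estimate    P(mean > {a}) ~ {est:.2e}")
print(f"naive MC (same budget): {naive:.2e},  exact: {exact:.2e}")
```

With these parameters the event has probability of order 1e-8, so the naive estimate is almost always exactly zero while the tilted estimator remains accurate.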
Lamers, Eugen. "Contributions to Simulation Speed-up : Rare Event Simulation and Short-term Dynamic Simulation for Mobile Network Planning /". Wiesbaden : Vieweg + Teubner in GWV Fachverlage, 2008. http://d-nb.info/988382016/04.
Full text
Lamers, Eugen. "Contributions to simulation speed-up: rare event simulation and short-term dynamic simulation for mobile network planning". Wiesbaden: Vieweg + Teubner, 2007. http://d-nb.info/988382016/04.
Full text
Gudmundsson, Thorbjörn. "Markov chain Monte Carlo for rare-event simulation in heavy-tailed settings". Licentiate thesis, KTH, Matematisk statistik, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-134624.
Full text
Dean, Thomas Anthony. "A subsolutions approach to the analysis and implementation of splitting algorithms in rare event simulation". View abstract/electronic edition; access limited to Brown University users, 2008. http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqdiss&rft_dat=xri:pqdiss:3318304.
Full text
Turati, Pietro. "Méthodes de simulation adaptative pour l'évaluation des risques de système complexes". Thesis, Université Paris-Saclay (ComUE), 2017. http://www.theses.fr/2017SACLC032/document.
Full text
Risk assessment is conditioned on the knowledge and information available at the moment of the analysis. Modeling and simulation are ways to explore and understand system behavior, to identify critical scenarios and to avoid surprises. A number of simulations of the model are run with different initial and operational conditions to identify scenarios leading to critical consequences and to estimate their probabilities of occurrence. For complex systems, the simulation models can be: i) high-dimensional; ii) black-box; iii) dynamic; and iv) computationally expensive to run, preventing the analyst from running the simulations for the multiple conditions that need to be considered. The present thesis presents advanced frameworks for simulation-based risk assessment. The methods developed within these frameworks are designed to limit the computational cost required by the analysis, in order to keep them scalable to complex systems. In particular, all the proposed methods share the powerful idea of automatically focusing and adaptively driving the simulations towards those conditions that are of interest for the analysis, i.e., for risk-oriented information. The advantages of the proposed methods have been shown with respect to different applications including, among others, a gas transmission subnetwork, a power network and the Advanced Lead Fast Reactor European Demonstrator (ALFRED).
Yu, Xiaomin. "Simulation Study of Sequential Probability Ratio Test (SPRT) in Monitoring an Event Rate". University of Cincinnati / OhioLINK, 2009. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1244562576.
Full text
Razaaly, Nassim. "Rare Event Estimation and Robust Optimization Methods with Application to ORC Turbine Cascade". Thesis, Université Paris-Saclay (ComUE), 2019. http://www.theses.fr/2019SACLX027.
Full text
This thesis aims to formulate innovative Uncertainty Quantification (UQ) methods for both Robust Optimization (RO) and Reliability-Based Design Optimization (RBDO) problems. The targeted application is the optimization of supersonic turbines used in Organic Rankine Cycle (ORC) power systems. Typical energy sources for ORC power systems feature variable heat load and turbine inlet/outlet thermodynamic conditions. The use of organic compounds with a heavy molecular weight typically leads to supersonic turbine configurations featuring supersonic flows and shocks, which grow in relevance in the aforementioned off-design conditions; these features also depend strongly on the local blade shape, which can be influenced by the geometric tolerances of blade manufacturing. A consensus exists about the necessity to include these uncertainties in the design process, requiring fast UQ methods and a comprehensive tool for performing shape optimization efficiently. This work is decomposed into two main parts. The first one addresses the problem of rare event estimation, proposing two original methods for failure probability (metaAL-OIS and eAK-MCS) and one for quantile computation (QeAK-MCS). The three methods rely on surrogate-based (Kriging) adaptive strategies, aiming at refining the so-called Limit-State Surface (LSS) directly, unlike Subset Simulation (SS) derived methods. Indeed, the latter consider intermediate thresholds associated with intermediate LSSs to be refined. This direct refinement property is of crucial importance since it enables the adaptability of the developed methods for RBDO algorithms. Note that the proposed algorithms are not subject to restrictive assumptions on the LSS (unlike the well-known FORM/SORM), such as the number of failure modes; they do, however, need to be formulated in the standard space. The eAK-MCS and QeAK-MCS methods are derived from the AK-MCS method and inherit a parallel adaptive sampling based on weighted K-Means. MetaAL-OIS features a more elaborate sequential refinement strategy based on MCMC samples drawn from a quasi-optimal ISD. It additionally proposes the construction of a Gaussian mixture ISD, permitting the accurate estimation of small failure probabilities when a large number of evaluations (several millions) is tractable, as an alternative to SS. The three methods are shown to perform very well for 2D to 8D analytical examples popular in the structural reliability literature, some featuring several failure modes, all subject to very small failure probability/quantile levels. Accurate estimations are performed in the cases considered using a reasonable number of calls to the performance function. The second part of this work tackles original Robust Optimization (RO) methods applied to the shape design of a supersonic ORC turbine cascade. A comprehensive Uncertainty Quantification (UQ) analysis accounting for operational, fluid-parameter and geometric (aleatoric) uncertainties is illustrated, permitting a general overview of the impact of multiple effects; it constitutes a preliminary study necessary for RO. Then, several mono-objective RO formulations under a probabilistic constraint are considered in this work, including the minimization of the mean or of a high quantile of the objective function. A critical assessment of the (robust) optimal designs is finally carried out.
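The eAK-MCS and QeAK-MCS methods mentioned above build on the AK-MCS idea of adaptively refining a Kriging surrogate of the limit-state function over a Monte Carlo population. The sketch below is a generic AK-MCS-flavoured loop on a toy 2D limit state, using scikit-learn's Gaussian process purely as a convenience; the limit-state function, the U-based learning criterion threshold and the stopping rule are illustrative, not the thesis algorithms.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(7)

def g(x):
    """Toy limit-state function in the standard space; failure when g <= 0."""
    return 3.0 - x[:, 1] + (0.25 * x[:, 0]) ** 2

X_mc = rng.normal(size=(50_000, 2))          # Monte Carlo population in U-space
idx = rng.choice(len(X_mc), size=12, replace=False)
X_doe, y_doe = X_mc[idx], g(X_mc[idx])       # small initial design of experiments

for it in range(60):
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
    gp.fit(X_doe, y_doe)
    mu, sd = gp.predict(X_mc, return_std=True)
    U = np.abs(mu) / np.maximum(sd, 1e-12)   # learning criterion: low U = uncertain sign
    if U.min() >= 2.0:                       # common AK-MCS stopping heuristic
        break
    new = X_mc[int(np.argmin(U))]            # add the most ambiguous point to the DoE
    X_doe = np.vstack([X_doe, new])
    y_doe = np.append(y_doe, g(new[None, :]))

pf = np.mean(mu <= 0.0)
print(f"DoE size: {len(X_doe)},  estimated failure probability ~ {pf:.2e}")
```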
Uribe-Castillo, Felipe [author], Daniel [academic supervisor] Straub, Youssef M. [reviewer] Marzouk, Daniel [reviewer] Straub and Elisabeth [reviewer] Ullmann. "Bayesian analysis and rare event simulation of random fields / Felipe Uribe-Castillo ; Gutachter: Youssef M. Marzouk, Daniel Straub, Elisabeth Ullmann ; Betreuer: Daniel Straub". München : Universitätsbibliothek der TU München, 2020. http://d-nb.info/1232406139/34.
Full text
Reinhardt, Aleks. "Computer simulation of the homogeneous nucleation of ice". Thesis, University of Oxford, 2013. http://ora.ox.ac.uk/objects/uuid:9ec0828b-df99-42e1-8694-14786d7578b9.
Full text
Estecahandy, Maïder. "Méthodes accélérées de Monte-Carlo pour la simulation d'événements rares. Applications aux Réseaux de Petri". Thesis, Pau, 2016. http://www.theses.fr/2016PAUU3008/document.
Full text
The dependability analysis of safety instrumented systems is an important industrial concern. To be able to carry out such safety studies, TOTAL has been developing the dependability software GRIF since the eighties. To take into account the increasing complexity of the operating context of its safety equipment, TOTAL is more frequently led to use the MOCA-RP engine of the GRIF Simulation package. Indeed, MOCA-RP allows the estimation of quantities associated with complex aging systems modeled in Petri nets thanks to standard Monte Carlo (MC) simulation. Nevertheless, deriving accurate estimators, such as the system unavailability, for very reliable systems involves rare event simulation, which requires very long computing times with MC. In order to address this issue, the common fast Monte Carlo methods do not seem appropriate: many of them were originally designed to improve only the estimate of the unreliability and/or are well suited only to Markovian processes. Therefore, the work accomplished in this thesis pertains to the development of acceleration methods adapted to the problem of performing safety studies modeled in Petri nets and estimating, in particular, the unavailability. More specifically, we propose an extension of the "Méthode de Conditionnement Temporel" to accelerate the individual failures of the components, and we introduce the Dissociation Method as well as the Truncated Fixed Effort Method to increase the occurrence of their simultaneous failures. Then, we combine the first technique with the two other ones, and we also associate them with the Randomized Quasi-Monte Carlo method. Through different sensitivity studies and benchmark experiments, we assess the performance of the acceleration methods and observe a significant improvement of the results compared with MC. Furthermore, we discuss the choice of the confidence interval method to be used when considering rare event simulation, which is an unfamiliar topic in the field of dependability. Last, an application to an industrial case illustrates the potential of our solution methodology.
Aboutaleb, Adam. "Empirical study of the effect of stochastic variability on the performance of human-dependent flexible flow lines". Thesis, De Montfort University, 2015. http://hdl.handle.net/2086/12103.
Full text
Lagnoux, Agnès. "Analyse des modeles de branchement avec duplication des trajectoires pour l'étude des événements rares". Toulouse 3, 2006. http://www.theses.fr/2006TOU30231.
Full text
This thesis deals with the splitting method first introduced in rare event analysis in order to speed up simulation. In this technique, the sample paths are split into R multiple copies at various stages during the simulation. Given the cost, the optimization of the algorithm suggests taking the transition probabilities between stages equal to some constant and resampling the inverse of that constant number of subtrials, which may be non-integer and even unknown, but estimated. First, we study the sensitivity of the relative error between the probability of interest P(A) and its estimator depending on the strategy that makes the resampling numbers integers. Then, since in practice the transition probabilities are generally unknown (and so are the optimal resampling numbers), we propose a two-step algorithm to deal with that problem. Several numerical applications and comparisons with other models are proposed.
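A compact sketch of the splitting scheme described above, for the toy problem of a negatively drifted random walk reaching a high level before hitting zero: each trajectory that reaches an intermediate level is split into R copies, and the estimator divides the number of final successes by n0 * R^(number of splitting stages). The levels and the splitting factor are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(8)
p_up, K = 0.3, 15                      # upward step probability, rare level
levels = [3, 6, 9, 12, 15]             # intermediate levels (illustrative)
n0, R = 500, 10                        # initial walkers, splitting factor per level

def run_until(s, target):
    """Walk from s until absorption at 0 or first passage of `target`."""
    while 0 < s < target:
        s += 1 if rng.random() < p_up else -1
    return s

walkers = [1] * n0                     # all start at state 1
for lam in levels:
    survivors = [run_until(s, lam) for s in walkers]
    hits = [s for s in survivors if s >= lam]
    if not hits:
        walkers = []
        break
    # Split every hit into R copies for the next stage (no splitting after the last level).
    walkers = hits if lam == K else [s for s in hits for _ in range(R)]

est = len(walkers) / (n0 * R ** (len(levels) - 1))
exact = (1 - (1 - p_up) / p_up) / (1 - ((1 - p_up) / p_up) ** K)   # gambler's ruin
print(f"splitting estimate ~ {est:.2e},  exact ~ {exact:.2e}")
```

Here R was chosen close to the inverse of the per-level transition probability, in line with the constant-transition-probability recommendation summarized in the abstract.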
Gedion, Michael. "Contamination des composants électroniques par des éléments radioactifs". Thesis, Montpellier 2, 2012. http://www.theses.fr/2012MON20267/document.
Full text
This work studies radioactive elements that can affect the proper functioning of electronic components at ground level. These radioactive elements are called alpha emitters. Intrinsic to electronic components, they decay and emit alpha particles which ionize the material of the electronic device and trigger SEUs (Single Event Upsets). This thesis aims to assess the reliability of digital circuits subject to this internal radiative constraint of electronic components. For that, all alpha-emitting natural or artificial isotopes that can contaminate digital circuits have been identified and classified into two categories: natural impurities and introduced radionuclides. Natural impurities result from natural or accidental contamination of the materials used in nanotechnology. To assess their effects on reliability, the SER (Soft Error Rate) was determined by Monte Carlo simulations for different technology nodes in the case of secular equilibrium. Besides, a new analytical approach was developed to determine the consequences of secular disequilibrium on the reliability of digital circuits. Moreover, with the miniaturization of digital circuits, new chemical elements have been suggested or used in nanoelectronics. The introduced radionuclides include this type of element, consisting of natural alpha emitters. Studies based on Monte Carlo simulations and analytical approaches have been conducted to evaluate the reliability of electronic devices. Subsequently, recommendations were proposed on the use of new chemical elements in nanotechnology.
DANTAS, Renata Araújo. "Análise técnica e econômica da produção de biodiesel utilizando óleo de fritura residual em unidade piloto". Universidade Federal Rural do Rio de Janeiro, 2016. https://tede.ufrrj.br/jspui/handle/jspui/1934.
The increasing economic and technological development around the world has caused a huge demand for energy. In Brazil, the transportation sector is the main energy consumer in the country, with diesel accounting for 45.2% of that use. In this scenario, renewable fuels become an attractive alternative due to the numerous benefits attributed to their use, such as degradability, absence of toxicity, reduction of pollutant emissions and renewable origin. As in other countries, Brazil has mandated the blending of a minimum quantity of biodiesel into diesel. Currently, according to federal law no. 11.097/2005, the minimum blending percentage is 7%, and during 2016 it will increase to 8%. However, the major challenge for producing biodiesel on a large scale is the high operating cost. According to the literature, raw material costs are the highest, corresponding to 70-85% of the total production cost. The use of alternative, low-cost sources of triglycerides for biodiesel production has been extensively studied. Waste oil from domestic and commercial consumption is a potential feedstock because of the large quantity available and its low added value. In this context, the production of biodiesel in a pilot unit at UFRRJ was studied using the homogeneous alkaline transesterification reaction. The operating condition used in the pilot plant was previously determined from an experimental design, choosing the one that gave the highest conversion and acceptable viscosity values. The simulations were performed in SuperPro Designer, and the prices of materials, utilities, consumables and operator costs were determined based on the domestic and international markets. A preliminary analysis of the production of 250 kg/batch of biodiesel proved economically unviable considering the unit cost of biodiesel of $0.62/kg. The sensitivity analysis showed that the plant becomes economically viable for a biodiesel selling price equal to or greater than $0.754/kg, about 34 cents above the initially estimated value. In order to make biodiesel production more profitable, simulations were run with vegetable oils of different acquisition costs. The results were unsatisfactory from the economic point of view, confirming the importance of using low-cost raw materials. A scale-up was made from the pilot unit, and the technical and economic data showed that the project becomes more profitable starting from the processing of 2,500 kg of residual oil, presenting an IRR and payback time of 116.9% and 1 year, respectively. The production of biodiesel in the pilot unit was carried out and the product was compared with the ANP specifications. The values of viscosity, specific mass, flash point, iodine value and copper corrosivity were within the limits established by the ANP.
Malherbe, Victor. "Multi-scale modeling of radiation effects for emerging space electronics : from transistors to chips in orbit". Thesis, Aix-Marseille, 2018. http://www.theses.fr/2018AIXM0753/document.
Full text
The effects of cosmic radiation on electronics have been studied since the early days of space exploration, given the severe reliability constraints arising from harsh space environments. However, recent evolutions in the space industry landscape are changing radiation effects practices and methodologies, with mainstream technologies becoming increasingly attractive for radiation-hardened integrated circuits. Due to their high operating frequencies, new transistor architectures, and short rad-hard development times, chips manufactured in the latest CMOS processes pose a variety of challenges, both from an experimental standpoint and from a modeling perspective. This work thus focuses on simulating single-event upsets and transients in advanced FD-SOI and bulk silicon processes. The soft-error response of 28 nm FD-SOI transistors is first investigated through TCAD simulations, allowing the development of two innovative models for radiation-induced currents in FD-SOI. One of them is mainly behavioral, while the other captures complex phenomena, such as parasitic bipolar amplification and circuit feedback effects, from first semiconductor principles and in agreement with detailed TCAD simulations. These compact models are then interfaced with a complete Monte Carlo Soft-Error Rate (SER) simulation platform, leading to extensive validation against experimental data collected on several test vehicles under accelerated particle beams. Finally, predictive simulation studies are presented on bit-cells and sequential and combinational logic gates in 28 nm FD-SOI and 65 nm bulk Si, providing insights into the mechanisms that contribute to the SER of modern integrated circuits in orbit.
Orire, Endurance. "The techno-economics of bitumen recovery from oil and tar sands as a complement to oil exploration in Nigeria / E. Orire". Thesis, North-West University, 2009. http://hdl.handle.net/10394/5704.
Full text
Thesis (M.Ing. (Development and Management))--North-West University, Potchefstroom Campus, 2010.
NOTARANGELO, NICLA MARIA. "A Deep Learning approach for monitoring severe rainfall in urban catchments using consumer cameras. Models development and deployment on a case study in Matera (Italy) = Un approccio basato sul Deep Learning per monitorare le piogge intense nei bacini urbani utilizzando fotocamere generiche. Sviluppo e implementazione di modelli su un caso di studio a Matera (Italia)". Doctoral thesis, Università degli studi della Basilicata, 2021. http://hdl.handle.net/11563/147016.
Full text
Over the past 50 years, floods have been the most frequent and widespread natural disaster worldwide. The impacts of extreme weather events driven by climate change include alterations of the hydrological regime, with a consequent increase in flood risk. Near-real-time rainfall monitoring at the local scale is essential for flood-risk mitigation in urban and peri-urban areas, which are highly vulnerable. Currently, most rainfall data come from ground measurements or remote sensing, which provide limited information in terms of temporal or spatial resolution; high costs raise further issues. Moreover, rain gauges are unevenly distributed and often located rather far from city centres, leading to gaps and discontinuities in monitoring. In this context, innovative techniques for developing novel low-cost monitoring systems hold great potential. Despite the diversity of aims, methods and epistemological fields, the literature on the visual effects of rain supports the idea of camera-based rain sensors, but tends to be specific to the chosen device. This thesis investigates the use of readily available photographic devices as rain detectors and gauges, in order to build a dense network of low-cost sensors that supports traditional methods with a fast solution embeddable in smart devices. Unlike existing work, the study focuses on maximizing the number of image sources (smartphones, generic surveillance cameras, dashboard cameras, webcams, digital cameras, etc.), including cases in which photographic parameters cannot be adjusted and time-lapse shots or videos are unavailable. Using a deep learning approach, rainfall characterization can be obtained by analysing the perceptual aspects that determine whether and how a photograph represents a rainy condition. The first scenario of interest for supervised learning is binary classification; the binary output (presence or absence of rain) enables the detection of precipitation: the cameras act as rain detectors. Analogously, the second scenario of interest is multi-class classification; the multi-class output describes a range of quasi-instantaneous rainfall intensities: the cameras act as rain gauges. Using transfer learning with convolutional neural networks, the models were compiled, trained, validated and tested. Preparing the classifiers included building a suitable dataset under realistic, unconstrained settings: open data, proprietary data of the National Research Institute for Earth Science and Disaster Prevention - NIED (dashboard cameras in Japan paired with high-precision multi-parameter radar data), and experimental activities conducted in NIED's large-scale rainfall simulator. The results were applied to a real-world scenario, experimenting with a pre-existing surveillance camera using the 5G connectivity provided by Telecom Italia S.p.A. in the city of Matera (Italy).
The analysis was carried out on several levels, providing an overview of issues related to the urban flood-risk paradigm and of territorial questions specific to the case study. The latter include various aspects of the context, the important role of rainfall, from driving the millennia-long evolution of the urban morphology to determining today's critical issues, as well as some components of a web prototype for flood-risk communication at the local scale. The results obtained and the model deployment support the idea that low-cost technologies and local capabilities can help characterize the rainfall forcing in support of early warning systems based on the identification of a significant meteorological state. The binary model achieved an accuracy and F1-score of 85.28% and 0.86 on the test set, and of 83.35% and 0.82 in the case-study deployment. The multi-class models achieved a macro-averaged accuracy and F1-score of 77.71% and 0.73 for the 6-way classifier and 78.05% and 0.81 for the 5-class one. The best performance was obtained for the classes corresponding to heavy rainfall and no rain, while misclassifications were linked to less extreme precipitation. The proposed method has limited operational requirements and can be deployed easily and quickly in real use cases, exploiting pre-existing devices with a parsimonious use of economic and computational resources. Classification can be performed on single photographs taken under disparate conditions by commonly used acquisition devices, i.e. by static or moving cameras without parameter adjustment. This approach could be particularly useful in urban areas where measurement methods such as rain gauges face installation difficulties or operational limitations, or in contexts where remote sensing or radar data are not available. The system does not cope with scenes that are misleading even for human visual perception. Current limitations lie in the approximations intrinsic to the outputs. Further data collection would make it possible to fill the evident gaps and improve the accuracy of rainfall-intensity prediction. Future developments could concern integration with additional field experiments and crowdsourced data, to foster communication, participation and dialogue, increasing resilience through public awareness and civic engagement in a smart-community perspective.
Shi, Yixi. "Rare Events in Stochastic Systems: Modeling, Simulation Design and Algorithm Analysis". Thesis, 2013. https://doi.org/10.7916/D86H4QNS.
Testo completo
McKenzie, Todd G., and 麥陶德. "Rare Events in Monte Carlo Methods with Applications to Circuit Simulation". Thesis, 2019. http://ndltd.ncl.edu.tw/handle/3hu2xz.
Testo completo
National Taiwan University
Graduate Institute of Computer Science and Information Engineering
107
The simulation of rare events by Monte Carlo methods is of practical importance across many disciplines, including circuit simulation, finance, and meteorology, to name a few. For systems in which the events of interest occur with a likelihood of one in billions, making inferences about the behavior of their distributions by traditional Monte Carlo simulation becomes impractical. Intuitively, likely Monte Carlo samples (i.e., samples drawn from regions of higher probability density) have little influence on the tail of the distribution, which is the region associated with rare events. This work provides a novel methodology to directly sample rare events in the input multivariate random-vector space as a means to efficiently learn about the distribution tail in the output space. In addition, the true form of the Monte Carlo simulation is modeled first by linear and then by quadratic forms. A systematic procedure is developed that traces the flow from the linear or quadratic modeling to the computation of distribution statistics, such as moments and quantiles, directly from the modeling form itself. Next, a general moment calculation method based on distribution quantiles is derived, in which no underlying linear or quadratic model is assumed. Finally, each of the proposed methods is grounded in practical circuit simulation examples. Overall, the thesis provides several new methods and approaches to tackle challenging problems in Monte Carlo simulation, probability distribution modeling, and statistical analysis.
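As an illustration of the step from a fitted linear form to tail statistics that this abstract alludes to, the following sketch fits a linear surrogate to a toy "circuit metric" and reads moments and extreme quantiles directly from the fitted coefficients. It is not the thesis's algorithm; the Gaussian input model, the pilot sample size, and the toy function are assumptions made purely for the example.

```python
# Minimal sketch: approximate a circuit metric y = f(x) by a linear model in
# the input random vector x ~ N(0, I), then read tail statistics of y directly
# from the fitted form, since a linear function of a Gaussian vector is itself
# Gaussian. The toy "circuit" function below is an illustrative stand-in.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
dim = 6

def circuit_metric(x):
    # Toy stand-in for an expensive circuit simulation.
    return 1.5 * x[..., 0] - 0.7 * x[..., 1] + 0.1 * np.sum(x, axis=-1) + 3.0

# Small pilot Monte Carlo run to fit the linear surrogate y ≈ a + b·x.
x_pilot = rng.standard_normal((200, dim))
y_pilot = circuit_metric(x_pilot)
A = np.hstack([np.ones((len(x_pilot), 1)), x_pilot])
coef, *_ = np.linalg.lstsq(A, y_pilot, rcond=None)
a, b = coef[0], coef[1:]

# For x ~ N(0, I), the linear form gives y ~ N(a, ||b||^2): moments and
# extreme quantiles follow in closed form instead of by brute-force sampling.
mu, sigma = a, np.linalg.norm(b)
print("mean:", mu, "std:", sigma)
print("P(y > mu + 6*sigma) =", stats.norm.sf(6.0))            # ~1e-9 tail
print("99.9999th percentile:", stats.norm.ppf(1 - 1e-6, mu, sigma))
```

The quadratic-form case mentioned in the abstract would replace this Gaussian closed form with the distribution of a quadratic function of Gaussians, which no longer reduces to a single normal law.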
Marsili, Simone. "Simulations of rare events in chemistry". Doctoral thesis, 2009. http://hdl.handle.net/2158/555496.
Testo completo
Kuruganti, Indira. "Optimal importance sampling for simulating rare events in Markov chains /". 1997. http://wwwlib.umi.com/dissertations/fullcit/9708640.
Testo completo
Hawk, Alexander Timothy. "Calculating rare biophysical events. A study of the milestoning method and simple polymer models". 2012. http://hdl.handle.net/2152/19528.
Testo completotext
Ben, Rached Nadhir. "Rare Events Simulations with Applications to the Performance Evaluation of Wireless Communication Systems". Diss., 2018. http://hdl.handle.net/10754/629483.
Testo completo
Lipsmeier, Florian [Verfasser]. "Rare event simulation for probabilistic models of T-cell activation / Florian Lipsmeier". 2010. http://d-nb.info/1007843659/34.
Testo completo
Wang, Sing-Po, and 王星博. "Optimal Hit-Based Splitting Technique for Rare-Event Simulation and Its Application to Power Grid Blackout Simulation". Thesis, 2011. http://ndltd.ncl.edu.tw/handle/20805668368023114338.
Testo completo
National Taiwan University
Institute of Industrial Engineering
99
Rare-event probability estimation is a crucial issue in areas such as reliability, telecommunications, and aircraft management. When an event rarely occurs, naive Monte Carlo simulation becomes unreasonably demanding of computing power and often yields an unreliable probability estimate, i.e., an estimate with a large variance. Level splitting simulation has emerged as a promising technique to reduce the variance of a probability estimate by creating separate copies (splits) of the simulation whenever it gets close to a rare event. This technique allocates simulation runs to intermediate levels that progressively approach the final rare event. However, determining a good number of simulation runs for each stage can be challenging. An optimal splitting technique called the Optimal Splitting Technique for Rare Events (OSTRE) provides an asymptotically optimal allocation of simulation runs. A splitting simulation based on allocating simulation runs may nevertheless fail to produce a probability estimate when the probability of the event at a given level is so low that the runs allocated to that level are not enough to observe it. In this research, we propose a hit-based splitting method that allocates a number of hits, instead of a number of simulation runs, to each stage, where the number of hits is the number of event occurrences at that stage. Regardless of the number of simulation runs required, the allocated number of hits has to be reached before advancing to the next level of the splitting simulation. We derive an asymptotically optimal allocation of hits to each stage of the splitting simulation, referred to as Optimal Hit-based Splitting Simulation. Experiments indicate that the proposed method performs as well as OSTRE under most conditions. In addition to the allocation problem, the choice of levels is also a critical issue in level splitting simulation. Based on the proposed hit-based splitting method, we develop an algorithm that obtains optimal levels by effectively estimating the initial probability of the events occurring at each stage of the progression. Results indicate that this choice of optimal levels is effective. Furthermore, we apply our technique to an IEEE-bus electric network and demonstrate that our approach is just as effective as conventional techniques in detecting the most vulnerable link in the electric grid, i.e., the link with the highest probability of leading to a blackout event.
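The hit-based idea summarized above can be sketched for a toy drifted random walk: at each level the simulation keeps launching trajectories, restarted from states that reached the previous level, until a prescribed number of hits is recorded. The process, the level placement, and the per-level hit budget below are illustrative assumptions; the asymptotically optimal allocations derived in the thesis (OSTRE and its hit-based counterpart) are not reproduced here.

```python
# Minimal sketch of hit-based level splitting: at each level we keep
# simulating until a target number of "hits" (level crossings) is reached,
# instead of fixing the number of runs in advance. Toy process and settings
# are assumptions, not the thesis's tuned configuration.
import numpy as np

rng = np.random.default_rng(1)

def run_segment(start, target, drift=-0.3, step=1.0, floor=-5.0):
    """Simulate a drifted random walk from `start` until it crosses `target`
    (hit) or falls below `floor` (failure). Returns (hit?, end_state)."""
    x = start
    while floor < x < target:
        x += drift + step * rng.standard_normal()
    return x >= target, x

def hit_based_splitting(levels, hits_per_level=200, max_trials=200_000):
    estimate = 1.0
    states = [0.0]                      # entry states for the current stage
    for target in levels:
        hits, trials, new_states = 0, 0, []
        while hits < hits_per_level and trials < max_trials:
            start = states[rng.integers(len(states))]  # resample an entry state
            trials += 1
            ok, end = run_segment(start, target)
            if ok:
                hits += 1
                new_states.append(end)
        if hits == 0:
            return 0.0                  # budget exhausted without a single hit
        estimate *= hits / trials       # conditional probability of this stage
        states = new_states
    return estimate

levels = [2.0, 4.0, 6.0, 8.0]           # progressively rarer intermediate events
print("P(reach 8 before dropping below -5) ≈", hit_based_splitting(levels))
```

The product of the per-level hit ratios estimates the overall rare-event probability; placing the levels so that each conditional probability is comparable is exactly the level-selection problem the abstract raises.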
Amabili, Matteo. "Stability of the submerged superhydrophobic state via rare event molecular dynamics simulations". Doctoral thesis, 2017. http://hdl.handle.net/11573/936817.
Testo completo
Ben, Issaid Chaouki. "Efficient Monte Carlo Simulations for the Estimation of Rare Events Probabilities in Wireless Communication Systems". Diss., 2019. http://hdl.handle.net/10754/660001.
Testo completo
Peres, Ricardo José Mota. "Contributions to the Xenon dark matter experiment: simulations of detector response to calibration sources and electric field optimization". Master's thesis, 2018. http://hdl.handle.net/10316/86267.
Testo completoA Matéria Escura continua a ser dos maiores mistérios do atual mundo da Física, potenciando imensos esforços no estudo do seu comportamento e das suas propriedades, tanto a nível experimental como teórico. Os detetores XENON tentam detetar de forma direta estas esquivas partículas, focando-se principalmente nas Partículas Pesadas que Interagem por Força Fraca, ou WIMPs, em grandes volumes de Xenon líquido com detetores de Câmaras de Projeção Temporal (TPC) de dupla fase. Desde a publicação dos seus últimos resultados, o detetor XENON1T é reconhecido como a mais sensível experiência de Matéria Escura.Neste trabalho, enumeram-se as principais evidências da existência de Matéria Escura no Universo e alguns dos seus mais reconhecidos modelos teóricos, passando depois à apresentação do detetor XENON1T e, de seguida, do futuro detetor XENONnT. O principal foco desta dissertação é, numa primeira instância, a simulação e análise de calibrações de recuos nucleares (NR) com um gerador de neutrões (NG), feitas no detetor XENON1T. Uma simulação de todo o detetor é feita para estudar o comportamento esperado e os resultados de análise comparados com os resultados em dados do detetor real. Mais ainda, é de seguida feita uma análise a outras campanhas de calibração com o NG de forma a estudar e modelar a banda de recuos nucleares. Mais tarde, no último capítulo, o foco muda para a construção do modelo geométrico e simulações de elementos finitos de campo elétrico da TPC do detetor XENONnT. Aqui, o principal objetivo é otimizar a geometria e diferencças de potencial aplicadas aos anéis de deformação do campo elétrico(FSR), responsáveis por o manter o mais uniforme possível dentro da TPC.
Dark Matter (DM) still stands as one of the great mysteries of current day Physics, fueling massive experimental and theoretical endeavors to understand its behavior and properties. The XENON detectors aim to directly detect these elusive particles, mostly focusing on Weakly Interactive Massive Particles, or WIMPs, in a large volume of liquid Xenon, using double phase time projection chamber (TPC) detectors. Since its latest results published, the XENON1T detector stands as the most sensitive Dark Matter experiment up to date.In this work, a journey through the evidences on the existence of Dark Matter in the Universe and some of its most notorious models leads to a presentation of the current generation XENON1T detector and, later on, the next generation XENONnT detector. The core of this dissertation focuses, at first, on the simulation and analysis of neutron generator (NG) nuclear recoil (NR) calibration data from the XENON1T detector. Here, a full simulation of a NG calibration run is computed and its results compared with data taken with the XENON1T detector. A separate analysis of other NR calibration data to look into the NR band model and ots empirical fit is achieved. Later, in the last chapter, the focus changes to the construction of the geometry model and electric field finite-element simulation of the XENONnT TPC. The main objective is to optimize the resistive chain that ensures the uniformity of the field inside the TPC, changing the geometry and voltage of the field shaping rings, preliminary based on the design used for XENON1T.
Lu, Chun-Yaung. "Dynamical simulation of molecular scale systems : methods and applications". Thesis, 2010. http://hdl.handle.net/2152/ETD-UT-2010-12-2151.
Testo completo