Academic literature on the topic 'Rare event probability'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Rare event probability.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Rare event probability":

1

Sinha, Ashoke Kumar, and Laurens de Haan. "Estimating the probability of a rare event." Annals of Statistics 27, no. 2 (April 1999): 732–59. http://dx.doi.org/10.1214/aos/1018031214.

2

Abbot, Dorian S., Robert J. Webber, Sam Hadden, Darryl Seligman, and Jonathan Weare. "Rare Event Sampling Improves Mercury Instability Statistics." Astrophysical Journal 923, no. 2 (December 1, 2021): 236. http://dx.doi.org/10.3847/1538-4357/ac2fa8.

Abstract:
Due to the chaotic nature of planetary dynamics, there is a non-zero probability that Mercury’s orbit will become unstable in the future. Previous efforts have estimated the probability of this happening between 3 and 5 billion years in the future using a large number of direct numerical simulations with an N-body code, but were not able to obtain accurate estimates before 3 billion years in the future because Mercury instability events are too rare. In this paper we use a new rare-event sampling technique, Quantile Diffusion Monte Carlo (QDMC), to estimate that the probability of a Mercury instability event in the next 2 billion years is approximately 10⁻⁴ in the REBOUND N-body code. We show that QDMC provides unbiased probability estimates at a computational cost of up to 100 times less than direct numerical simulation. QDMC is easy to implement and could be applied to many problems in planetary dynamics in which it is necessary to estimate the probability of a rare event.
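A back-of-the-envelope check makes the cost argument concrete. The sketch below is illustrative only (it is not QDMC and does not use REBOUND; only the probability level 10⁻⁴ is taken from the abstract): it shows how the relative error of a crude Monte Carlo estimate scales with the number of direct simulations.

```python
import numpy as np

# Relative error of the crude Monte Carlo estimator p_hat = k/n
# scales like sqrt((1 - p) / (n * p)), so p ~ 1e-4 forces huge n.
p = 1e-4                              # probability level from the abstract
for n in [10**4, 10**6, 10**8]:
    rel_err = np.sqrt((1 - p) / (n * p))
    print(f"n = {n:>9}: relative error ~ {100 * rel_err:.0f}%")

# One realization of the estimator itself at n = 1e6:
rng = np.random.default_rng(0)
hits = rng.random(10**6) < p          # stand-in for 'instability occurred'
print("p_hat =", hits.mean())         # noisy unless n >> 1/p
```

At n = 10⁶ the relative error is still about 10%, which is why the paper replaces direct simulation with rare-event sampling.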
3

Ho, Yu-Chi, and Michael E. Larson. "Ordinal optimization approach to rare event probability problems." Discrete Event Dynamic Systems: Theory and Applications 5, no. 2-3 (April 1995): 281–301. http://dx.doi.org/10.1007/bf01439043.

4

Chan, Joshua C. C., and Dirk P. Kroese. "Rare-event probability estimation with conditional Monte Carlo." Annals of Operations Research 189, no. 1 (March 24, 2009): 43–61. http://dx.doi.org/10.1007/s10479-009-0539-y.

5

Lagnoux, Agnès. "Rare Event Simulation." Probability in the Engineering and Informational Sciences 20, no. 1 (December 12, 2005): 45–66. http://dx.doi.org/10.1017/s0269964806060025.

Abstract:
This article deals with estimations of probabilities of rare events using fast simulation based on the splitting method. In this technique, the sample paths are split into multiple copies at various stages in the simulation. Our aim is to optimize the algorithm and to obtain a precise confidence interval of the estimator using branching processes. The numerical results presented suggest that the method is reasonably efficient.
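For readers unfamiliar with the splitting technique described here, the following is a minimal fixed-effort sketch on a toy problem (a Gaussian random walk with hand-picked levels, my own illustration; the article's optimization of the algorithm and its branching-process confidence intervals are not reproduced).

```python
import numpy as np

rng = np.random.default_rng(1)

def run_until(level, x, t, T):
    """Propagate a Gaussian random walk from state x at time t;
    report whether `level` is crossed before the horizon T."""
    while t < T:
        x += rng.normal(0.0, 0.1)
        t += 1
        if x >= level:
            return True, x, t
    return False, x, t

# Estimate P(the walk exceeds 4.0 within T steps) by splitting:
# each intermediate level is "not so rare" given the previous one.
T, N = 100, 1000
levels = [1.0, 2.0, 3.0, 4.0]          # hand-picked intermediate levels
starts = [(0.0, 0)] * N                 # entry states for the current stage
p_hat = 1.0
for level in levels:
    hits = []
    for _ in range(N):
        x0, t0 = starts[rng.integers(len(starts))]   # replicate a survivor
        ok, x, t = run_until(level, x0, t0, T)
        if ok:
            hits.append((x, t))
    if not hits:
        p_hat = 0.0
        break
    p_hat *= len(hits) / N
    starts = hits
print("splitting estimate:", p_hat)     # exact answer is ~6e-5
```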
6

Picard, Richard R. "Introduction to Rare Event Simulation." Journal of the American Statistical Association 100, no. 471 (September 2005): 1091–92. http://dx.doi.org/10.1198/jasa.2005.s32.

7

Pienaar, Elsje. "Multifidelity Analysis for Predicting Rare Events in Stochastic Computational Models of Complex Biological Systems." Biomedical Engineering and Computational Biology 9 (January 2018): 117959721879025. http://dx.doi.org/10.1177/1179597218790253.

Abstract:
Rare events such as genetic mutations or cell-cell interactions are important contributors to dynamics in complex biological systems, eg, in drug-resistant infections. Computational approaches can help analyze rare events that are difficult to study experimentally. However, analyzing the frequency and dynamics of rare events in computational models can also be challenging due to high computational resource demands, especially for high-fidelity stochastic computational models. To facilitate analysis of rare events in complex biological systems, we present a multifidelity analysis approach that uses medium-fidelity analysis (Monte Carlo simulations) and/or low-fidelity analysis (Markov chain models) to analyze high-fidelity stochastic model results. Medium-fidelity analysis can produce large numbers of possible rare event trajectories for a single high-fidelity model simulation. This allows prediction of both rare event dynamics and probability distributions at much lower frequencies than high-fidelity models. Low-fidelity analysis can calculate probability distributions for rare events over time for any frequency by updating the probabilities of the rare event state space after each discrete event of the high-fidelity model. To validate the approach, we apply multifidelity analysis to a high-fidelity model of tuberculosis disease. We validate the method against high-fidelity model results and illustrate the application of multifidelity analysis in predicting rare event trajectories, performing sensitivity analyses and extrapolating predictions to very low frequencies in complex systems. We believe that our approach will complement ongoing efforts to enable accurate prediction of rare event dynamics in high-fidelity computational models.
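The low-fidelity idea (updating the rare-event state probabilities after each discrete event instead of re-simulating) can be shown with a generic two-state Markov chain. The rate below is invented; this is not the paper's tuberculosis model.

```python
import numpy as np

# Two states: 0 = rare event has not occurred, 1 = it has (absorbing).
mu = 1e-8                          # assumed per-event occurrence probability
P = np.array([[1.0 - mu, mu],
              [0.0,      1.0]])
prob = np.array([1.0, 0.0])        # initial distribution
for event in range(100_000):       # one update per discrete event
    prob = prob @ P
print("P(rare event by event 100000) =", prob[1])   # ~1e-3
```

Because the update propagates probabilities in closed form, arbitrarily low frequencies can be evaluated without ever sampling a rare trajectory, which is the extrapolation property the abstract describes.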
8

Balesdent, Mathieu, Jérôme Morio, and Julien Marzat. "Recommendations for the tuning of rare event probability estimators." Reliability Engineering & System Safety 133 (January 2015): 68–78. http://dx.doi.org/10.1016/j.ress.2014.09.001.

9

Wiorkowski, John. "Finding the probability of a rare real world event." Mathematical Gazette 103, no. 557 (June 6, 2019): 240–47. http://dx.doi.org/10.1017/mag.2019.55.

Abstract:
One of the nicer things about growing older is that you tend to find a great deal of pleasure in the little rituals of life. Things like morning coffee or tea, a warm bath, or a glass of good wine at the end of the day. There are some rituals, however, that are not at all enjoyable. Take, for example, the monthly ritual of paying bills. I still do this by cheque, so I need to make sure my bank account balance is sufficient to pay the cheque, write the cheque, and subtract the cheque amount from the account balance. Now there is a little pleasure when one actually adds money to the account balance, but the one time that gives me more than a modicum of joy is when the pence portion of the account balance comes out to be exactly zero. This happens very rarely and unpredictably. As a statistician I became intrigued with trying to find the probability of this event occurring. I did an extensive literature search both in traditional hard copy journals and also online, but I could find no reference to this problem.
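Under the naive assumption that the pence part of the balance after a transaction is uniform over its hundred possible values, the answer would be about 1/100. A quick simulation of that assumption (not of the article's more careful analysis):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000
balance = rng.integers(0, 10_000, size=n)   # balances, in pence
amounts = rng.integers(1, 20_000, size=n)   # cheque amounts, in pence
new_balance = balance - amounts
frac_zero = np.mean(new_balance % 100 == 0) # pence part exactly .00
print(frac_zero)                            # ~0.01, i.e. about 1 in 100
```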
10

Dobson, Ian, Benjamin A. Carreras, and David E. Newman. "How Many Occurrences of Rare Blackout Events Are Needed to Estimate Event Probability?" IEEE Transactions on Power Systems 28, no. 3 (August 2013): 3509–10. http://dx.doi.org/10.1109/tpwrs.2013.2242700.


Dissertations / Theses on the topic "Rare event probability":

1

Drozdenko, Myroslav. "Weak Convergence of First-Rare-Event Times for Semi-Markov Processes." Doctoral thesis, Västerås: Mälardalen University, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-394.

2

Razaaly, Nassim. "Rare Event Estimation and Robust Optimization Methods with Application to ORC Turbine Cascade." Thesis, Université Paris-Saclay (ComUE), 2019. http://www.theses.fr/2019SACLX027.

Abstract:
This thesis aims to formulate innovative Uncertainty Quantification (UQ) methods in both Robust Optimization (RO) and Reliability-Based Design Optimization (RBDO) problems. The targeted application is the optimization of supersonic turbines used in Organic Rankine Cycle (ORC) power systems. Typical energy sources for ORC power systems feature variable heat load and turbine inlet/outlet thermodynamic conditions. The use of organic compounds with a heavy molecular weight typically leads to supersonic turbine configurations featuring supersonic flows and shocks, which grow in relevance in the aforementioned off-design conditions; these features also depend strongly on the local blade shape, which can be influenced by the geometric tolerances of the blade manufacturing. A consensus exists about the necessity to include these uncertainties in the design process, thus requiring fast UQ methods and a comprehensive tool for performing shape optimization efficiently. This work is decomposed into two main parts. The first one addresses the problem of rare event estimation, proposing two original methods for failure probability (metaAL-OIS and eAK-MCS) and one for quantile computation (QeAK-MCS). The three methods rely on surrogate-based (Kriging) adaptive strategies, aiming at refining the so-called Limit-State Surface (LSS) directly, unlike Subset Simulation (SS) derived methods. Indeed, the latter consider intermediate thresholds associated with intermediate LSSs to be refined. This direct refinement property is of crucial importance since it enables the adaptability of the developed methods for RBDO algorithms. Note that the proposed algorithms are not subject to restrictive assumptions on the LSS (unlike the well-known FORM/SORM), such as the number of failure modes, but they need to be formulated in the Standard Space. The eAK-MCS and QeAK-MCS methods are derived from the AK-MCS method and inherit a parallel adaptive sampling based on weighted K-Means. MetaAL-OIS features a more elaborate sequential refinement strategy based on MCMC samples drawn from a quasi-optimal ISD. It additionally proposes the construction of a Gaussian mixture ISD, permitting the accurate estimation of small failure probabilities when a large number of evaluations (several millions) is tractable, as an alternative to SS. The three methods are shown to perform very well for 2D to 8D analytical examples popular in the structural reliability literature, some featuring several failure modes, all subject to a very small failure probability/quantile level. Accurate estimations are performed in the cases considered using a reasonable number of calls to the performance function. The second part of this work tackles original Robust Optimization (RO) methods applied to the Shape Design of a supersonic ORC Turbine cascade. A comprehensive Uncertainty Quantification (UQ) analysis accounting for operational, fluid-parameter and geometric (aleatoric) uncertainties is illustrated, providing a general overview of the impact of multiple effects, and constitutes a preliminary study necessary for RO. Then, several mono-objective RO formulations under a probabilistic constraint are considered, including the minimization of the mean or of a high quantile of the Objective Function. A critical assessment of the (Robust) Optimal designs is finally investigated.
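For context, the AK-MCS scheme that these methods extend alternates between fitting a Kriging surrogate and evaluating the performance function at the most ambiguous Monte Carlo sample. Below is a simplified sketch on a toy linear limit state, using scikit-learn's Gaussian process as the Kriging model and the usual U learning function with a stopping threshold of 2; it is not the thesis's eAK-MCS or QeAK-MCS algorithm.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(7)

def g(x):
    """Toy performance function in standard space; failure when g <= 0."""
    return 5.0 - x[:, 0] - x[:, 1]           # linear limit state, p_f ~ 2e-4

X_mc = rng.standard_normal((50_000, 2))      # Monte Carlo population
idx = rng.choice(len(X_mc), size=12, replace=False)
X_doe, y_doe = X_mc[idx], g(X_mc[idx])       # initial design of experiments

for _ in range(50):                          # active-learning iterations
    gp = GaussianProcessRegressor(ConstantKernel() * RBF(), normalize_y=True)
    gp.fit(X_doe, y_doe)
    mu, sd = gp.predict(X_mc, return_std=True)
    U = np.abs(mu) / np.maximum(sd, 1e-12)   # U learning function
    if U.min() >= 2.0:                       # usual AK-MCS stopping rule
        break
    j = int(np.argmin(U))                    # most ambiguous sample
    X_doe = np.vstack([X_doe, X_mc[j]])
    y_doe = np.append(y_doe, g(X_mc[j:j + 1]))
p_f = float(np.mean(mu <= 0.0))              # surrogate-based classification
print("estimated failure probability:", p_f)
```

The stopping rule min U ≥ 2 corresponds to roughly 97.7% confidence in the predicted sign of the performance function at every Monte Carlo sample.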
3

Sinks, Shuxian. "Response Adaptive Design using Auxiliary and Primary Outcomes." VCU Scholars Compass, 2013. http://scholarscompass.vcu.edu/etd/572.

Abstract:
Response adaptive designs intend to allocate more patients to better treatments without undermining the validity and the integrity of the trial. The immediacy of the primary response (e.g. deaths, remission) determines the efficiency of the response adaptive design, which often requires outcomes to be quickly or immediately observed. This presents difficulties for survival studies, which may require long durations to observe the primary endpoint. Therefore, we introduce auxiliary endpoints to assist the adaptation with the primary endpoint, where an auxiliary endpoint is generally defined as any measurement that is positively associated with the primary endpoint. Our proposed design (referred to as bivariate adaptive design) is based on the classical response adaptive design framework. The connection of auxiliary and primary endpoints is established through a Bayesian method. We extend the parameter space from one dimension to two dimensions, say primary and auxiliary efficacies, by implementing a conditional weight function on the loss function of the design. The allocation ratio is updated at each stage by optimization of the loss function subject to the information provided for both the auxiliary and primary outcomes. We demonstrate several methods of jointly modeling the auxiliary and primary outcomes. Through simulation studies, we show that the bivariate adaptive design is more effective in assigning patients to better treatments as compared with univariate optimal and balanced designs. As hoped, this joint approach also reduces the expected number of patient failures and preserves comparable power as compared with other designs.
4

Chabridon, Vincent. "Analyse de sensibilité fiabiliste avec prise en compte d'incertitudes sur le modèle probabiliste - Application aux systèmes aérospatiaux." Thesis, Université Clermont Auvergne (2017-2020), 2018. http://www.theses.fr/2018CLFAC054/document.

Abstract:
Aerospace systems are complex engineering systems for which reliability has to be guaranteed at an early design phase, especially regarding the potential tremendous damage and costs that could be induced by any failure. Moreover, the management of various sources of uncertainties, either impacting the behavior of systems (“aleatory” uncertainty due to natural variability of physical phenomena) and/or their modeling and simulation (“epistemic” uncertainty due to lack of knowledge and modeling choices), is a cornerstone of reliability assessment for those systems. Thus, uncertainty quantification and its underlying methodology consist of several phases. Firstly, one needs to model and propagate uncertainties through the computer model, which is considered as a “black-box”. Secondly, a relevant quantity of interest regarding the goal of the study, e.g., a failure probability here, has to be estimated. For highly-safe systems, the failure probability which is sought is very low and may be costly to estimate. Thirdly, a sensitivity analysis of the quantity of interest can be set up in order to better identify and rank the influential sources of uncertainties in input. Therefore, the probabilistic modeling of input variables (epistemic uncertainty) might strongly influence the value of the failure probability estimate obtained during the reliability analysis. A deeper investigation of the robustness of the probability estimate regarding such a type of uncertainty has to be conducted. This thesis addresses the problem of taking probabilistic modeling uncertainty of the stochastic inputs into account. Within the probabilistic framework, a “bi-level” input uncertainty has to be modeled and propagated all along the different steps of the uncertainty quantification methodology. In this thesis, the uncertainties are modeled within a Bayesian framework in which the lack of knowledge about the distribution parameters is characterized by the choice of a prior probability density function. During a first phase, after the propagation of the bi-level input uncertainty, the predictive failure probability is estimated and used as the current reliability measure instead of the standard failure probability. Then, during a second phase, a local reliability-oriented sensitivity analysis based on the use of score functions is achieved to study the impact of the hyper-parameterization of the prior on the predictive failure probability estimate. Finally, in a last step, a global reliability-oriented sensitivity analysis based on Sobol indices on the indicator function adapted to the bi-level input uncertainty is proposed. All the proposed methodologies are tested and challenged on a representative industrial aerospace test-case simulating the fallout of an expendable space launcher.
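The "bi-level" propagation and the resulting predictive failure probability can be sketched with nested Monte Carlo. All distributions and the failure threshold below are invented for illustration; the thesis's score-function and Sobol-index sensitivity analyses are not shown.

```python
import numpy as np

rng = np.random.default_rng(3)

def failure_prob_given_theta(mean, n=100_000):
    """Aleatory level: failure probability for one value of the
    distribution parameter (here the mean of a Gaussian input)."""
    x = rng.normal(mean, 1.0, size=n)
    return np.mean(x > 4.0)              # failure: input exceeds a threshold

# Epistemic level: integrate over prior draws of the hyper-parameter.
thetas = rng.normal(0.0, 0.3, size=200)  # prior on the input mean (invented)
p_pred = np.mean([failure_prob_given_theta(m) for m in thetas])
p_nominal = failure_prob_given_theta(0.0)
print("nominal    p_f:", p_nominal)
print("predictive p_f:", p_pred)         # typically larger than the nominal
```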
5

Krauth, Timothé. "Modèle génératif profond pour l'estimation de probabilité de collision en vol." Electronic Thesis or Diss., Toulouse, ISAE, 2024. http://www.theses.fr/2024ESAE0018.

Abstract:
It is essential to calculate the probability of aircraft collisions to optimise air traffic while maintaining high safety standards. This need became more pronounced in the 1960s with the increase in transatlantic commercial air traffic. Initially, analytical models such as those of Reich and Anderson-Hsu were benchmarks for assessing in-flight collision risks, but they proved to be less suited for the complex airspace around airports. Data-driven methods, especially Monte Carlo simulations, have become a promising alternative for collision risk assessment. They offer significant flexibility through simplified assumptions, making them adaptable to various contexts. However, traditional Monte Carlo simulations are inefficient for estimating rare event probabilities, requiring a large number of aircraft trajectories and substantial computational resources. This thesis proposes a collision risk model based on Monte Carlo simulations, using a trajectory generation model to overcome these limitations associated with rare events. These generative methods faithfully reproduce observed trajectory distributions while incorporating uncertainties from external factors. Three main research areas are defined: (i) developing a trajectory generation method, (ii) constructing a Monte Carlo-based collision risk model using synthetic trajectories, and (iii) improving the interpretability of collision risk estimates. Generating synthetic samples involves estimating the distribution of observed data to ensure identical distribution in new samples. This is particularly important for aircraft trajectories, where the model must reflect uncertainty sources causing deviations from standard trajectories. We initially use traditional statistical learning methods to estimate complex two-dimensional aircraft trajectories. Despite reducing the problem's dimensionality, conventional methods struggle with high-dimensional distribution estimation. We then explore the use of variational autoencoders for more refined probability density estimation. Suitably adapted for multivariate time-series applications, variational autoencoders prove effective for estimating the distribution of complex aircraft trajectories. Using the developed generation method, we estimate the risk of loss of separation induced by the departure and approach procedures of Paris-Orly Airport using Monte Carlo simulations. The use of a trajectory generation method proves promising, allowing the creation of the equivalent of 20 years of air traffic trajectories from only two months of observations. However, this direct method has limitations for estimating extremely low collision probabilities, requiring the use of one variational autoencoder per flight procedure considered in the studied scenario. The processes of trajectory generation and collision risk evaluation are distinctly separated. Consequently, the inherent constraints of classical Monte Carlo methods are not truly overcome but merely postponed by the production of a set of arbitrarily large trajectories. The thesis's final work unifies the frameworks of variational autoencoders and uncertainty quantification. It demonstrates how variational autoencoders can build suitable input distributions for uncertainty quantification algorithms, enhancing the reliability of Monte Carlo simulations through subset simulation and the explainability of mid-air collision probability estimation through sensitivity analysis. More broadly, we show that the variational autoencoder represents a promising tool to be associated with uncertainty quantification problems.
6

Jacquemart, Damien. "Contributions aux méthodes de branchement multi-niveaux pour les évènements rares, et applications au trafic aérien." Thesis, Rennes 1, 2014. http://www.theses.fr/2014REN1S186/document.

Abstract:
The thesis deals with the design and mathematical analysis of reliable and accurate Monte Carlo methods in order to estimate the (very small) probability that a Markov process reaches a critical region of the state space before a deterministic final time. The underlying idea behind the multilevel splitting methods studied here is to design an embedded sequence of intermediate more and more critical regions, in such a way that reaching an intermediate region, given that the previous intermediate region has already been reached, is not so rare. In practice, trajectories are propagated, selected and replicated as soon as the next intermediate region is reached, and it is easy to accurately estimate the transition probability between two successive intermediate regions. The bias due to time discretization of the Markov process trajectories is corrected using perturbed intermediate regions, as proposed by Gobet and Menozzi. An adaptive version would consist in the automatic design of the intermediate regions, using empirical quantiles. However, it is often difficult if not impossible to remember where (in which state) and when (at which time instant) each successful trajectory reached the empirically defined intermediate region. The contribution of the thesis consists in using a first population of pilot trajectories to define the next threshold, in using a second population of trajectories to estimate the probability of exceeding this empirically defined threshold, and in iterating these two steps (definition of the next threshold, and evaluation of the transition probability) until the critical region is reached. The convergence of this adaptive two-step algorithm is studied in the asymptotic framework of a large number of trajectories. Ideally, the intermediate regions should be defined in terms of the spatial and temporal variables jointly (for example, as the set of states and times for which a scalar function of the state exceeds a time-dependent threshold). The alternate point of view proposed in the thesis is to keep intermediate regions as simple as possible, defined in terms of the spatial variable only, and to make sure that trajectories that manage to exceed a threshold at an early time instant are more replicated than trajectories that exceed the same threshold at a later time instant. The resulting algorithm combines importance sampling and multilevel splitting. Its performance is evaluated in the asymptotic framework of a large number of trajectories, and in particular a central limit theorem is obtained for the relative approximation error.
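The importance-sampling ingredient can be illustrated on its own: drive the process toward the critical region with an added drift and correct with the likelihood ratio accumulated up to the first hitting time. This is a generic sketch for a toy random walk, not the thesis's combined weighted-splitting algorithm.

```python
import numpy as np

rng = np.random.default_rng(5)

# P(random walk reaches level 4.0 before time T): sample under a
# drifted proposal and reweight with the per-step likelihood ratio.
T, level, n = 100, 4.0, 20_000
drift, sigma = 0.04, 0.1
estimates = np.zeros(n)
for i in range(n):
    x, logw = 0.0, 0.0
    for _ in range(T):
        step = rng.normal(drift, sigma)      # proposal: N(drift, sigma^2)
        x += step
        # log [ N(step; 0, sigma) / N(step; drift, sigma) ]
        logw += ((step - drift) ** 2 - step**2) / (2 * sigma**2)
        if x >= level:
            estimates[i] = np.exp(logw)      # weight of the stopped path
            break
print("importance-sampling estimate:", estimates.mean())  # ~6e-5
```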
7

Yu, Xiaomin. "Simulation Study of Sequential Probability Ratio Test (SPRT) in Monitoring an Event Rate." University of Cincinnati / OhioLINK, 2009. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1244562576.

8

Dhamodaran, Ramya. "Efficient Analysis of Rare Events Associated with Individual Buffers in a Tandem Jackson Network." University of Cincinnati / OhioLINK, 2004. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1099073321.

9

Estecahandy, Maïder. "Méthodes accélérées de Monte-Carlo pour la simulation d'événements rares. Applications aux Réseaux de Petri." Thesis, Pau, 2016. http://www.theses.fr/2016PAUU3008/document.

Abstract:
The dependability analysis of safety instrumented systems is an important industrial concern. To be able to carry out such safety studies, TOTAL has been developing the dependability software GRIF since the eighties. To take into account the increasing complexity of the operating context of its safety equipment, TOTAL is more and more frequently led to use the engine MOCA-RP of the GRIF Simulation package. Indeed, MOCA-RP allows one to estimate quantities associated with complex aging systems modeled in Petri nets thanks to standard Monte Carlo (MC) simulation. Nevertheless, deriving accurate estimators, such as the system unavailability, on very reliable systems involves rare event simulation, which requires very long computing times with MC. To address this issue, the common fast Monte Carlo methods do not seem to be appropriate: many of them were originally defined to improve only the estimate of the unreliability and/or are well-suited only for Markovian processes. Therefore, the work accomplished in this thesis pertains to the development of acceleration methods adapted to the problem of performing safety studies modeled in Petri nets and estimating in particular the unavailability. More specifically, we propose the Extension of the "Méthode de Conditionnement Temporel" to accelerate the individual failure of the components, and we introduce the Dissociation Method as well as the Truncated Fixed Effort Method to increase the occurrence of their simultaneous failures. Then, we combine the first technique with the two other ones, and we also associate them with the Randomized Quasi-Monte Carlo method. Through different sensitivity studies and benchmark experiments, we assess the performance of the acceleration methods and observe a significant improvement of the results compared with MC. Furthermore, we discuss the choice of the confidence interval method to be used when considering rare event simulation, which is an unfamiliar topic in the field of dependability. Finally, an application to an industrial case illustrates the potential of our solution methodology.
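On the confidence-interval question raised at the end of the abstract, a single rare-event data set already shows why the choice matters: with only a handful of hits, the CLT interval and the exact binomial (Clopper-Pearson) interval disagree noticeably. A generic illustration, not the thesis's recommendation:

```python
import numpy as np
from scipy import stats

# k rare hits in n Monte Carlo trials (invented example numbers).
n, k = 1_000_000, 3
p_hat = k / n

# Normal-approximation (CLT) 95% interval; note the negative lower bound.
z = stats.norm.ppf(0.975)
half = z * np.sqrt(p_hat * (1 - p_hat) / n)
print("CLT interval:            ", (p_hat - half, p_hat + half))

# Exact Clopper-Pearson interval via Beta quantiles.
lo = stats.beta.ppf(0.025, k, n - k + 1)
hi = stats.beta.ppf(0.975, k + 1, n - k)
print("Clopper-Pearson interval:", (lo, hi))
```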
10

Mattrand, Cécile. "Approche probabiliste de la tolérance aux dommages." PhD thesis, Université Blaise Pascal - Clermont-Ferrand II, 2011. http://tel.archives-ouvertes.fr/tel-00738947.

Abstract:
Given the severity of accidents linked to fatigue crack propagation, the aeronautical industry's concern with guaranteeing the integrity of structures subjected to this loading mode is of truly essential importance. The work presented in this thesis addresses the safety of damage-tolerant aeronautical structures from a probabilistic point of view. Formulating and applying a reliability-based approach that leads to reliable design and maintenance processes for aeronautical structures in an industrial context nevertheless requires removing a number of scientific obstacles. Efforts were concentrated on three areas in this work. First, a methodology was developed to capture and faithfully reproduce the randomness of fatigue loading from load sequences observed on monitored in-service structures, which constitutes a real scientific advance. A second line of research concerned the selection of a mechanical model able to predict crack growth under variable-amplitude loading at moderate computational cost. The work relied on the PREFFAS model, for which extensions were also proposed in order to relax the restrictive assumption of load periodicity. The probabilistic analyses, obtained by coupling the mechanical model with the previously established stochastic models, led, among other things, to the conclusion that loading is a parameter that significantly influences the scatter of the crack propagation phenomenon. The final objective of this work was therefore to formulate and solve the damage-tolerance reliability problem using the stochastic load models retained, which constitutes a real scientific challenge. A dedicated method for solving the reliability problem was developed to meet the stated objectives and applied to structures considered representative of real problems.

Books on the topic "Rare event probability":

1

Rubino, Gerardo, and Bruno Tuffin, eds. Rare event simulation using Monte Carlo methods. Hoboken, NJ: Wiley, 2009.

2

Falk, Michael. Laws of Small Numbers: Extremes and Rare Events. Basel: Springer Basel AG, 2011.

3

Kalashnikov, Vladimir Vi͡acheslavovich. Geometric sums, bounds for rare events with applications: Risk analysis, reliability, queueing. Dordrecht: Kluwer Academic, 1997.

4

Falk, Michael. Laws of small numbers: Extremes and rare events. 2nd ed. Basel: Birkhauser Verlag, 2004.

5

Falk, Michael. Laws of small numbers: Extremes and rare events. Basel: Birkhäuser Verlag, 1994.

6

Kalashnikov, Vladimir. Geometric Sums: Bounds for Rare Events with Applications: Risk Analysis, Reliability, Queueing. Dordrecht: Springer Netherlands, 1997.

7

Schneider, Jörg, and Ton Vrouwenvelder. Introduction to safety and reliability of structures. 3rd ed. Zurich, Switzerland: International Association for Bridge and Structural Engineering (IABSE), 1997. http://dx.doi.org/10.2749/sed005.

Abstract:
Society expects that buildings and other structures are safe for the people who use them or who are near them. The failure of a building or structure is expected to be an extremely rare event. Thus, society implicitly relies on the expertise of the professionals involved in the planning, design, construction, operation and maintenance of the structures it uses.

Structural engineers devote all their effort to meeting society’s expectations efficiently. Engineers and scientists work together to develop solutions to structural problems. Given that nothing is absolutely and eternally safe, the goal is to attain an acceptably small probability of failure for a structure, a facility, or a situation. Reliability analysis is part of the science and practice of engineering today, not only with respect to the safety of structures, but also for questions of serviceability and other requirements of technical systems that might be impacted by some probability.

The present volume takes a rather broad approach to safety and reliability in Structural Engineering. It treats the underlying concepts of safety, reliability and risk and introduces the reader in a first chapter to the main concepts and strategies for dealing with hazards. The next chapter is devoted to the processing of data into information that is relevant for applying reliability theory. Two following chapters deal with the modelling of structures and with methods of reliability analysis. Another chapter focuses on problems related to establishing target reliabilities, assessing existing structures, and on effective strategies against human error. The last chapter presents an outlook to more advanced applications. The Appendix supports the application of the methods proposed and refers readers to a number of related computer programs.

This book is aimed at both students and practicing engineers. It presents the concepts and procedures of reliability analysis in a straightforward, understandable way, making use of simple examples, rather than extended theoretical discussion. It is hoped that this approach serves to advance the application of safety and reliability analysis in engineering practice.

The book is amended with free access to an educational version of a Variables Processor computer program. FreeVaP can be downloaded free of charge and supports the understanding of the subjects treated in this book.
8

Lee-Treweek, Geraldine, and Stephanie Linkogle, eds. Danger in the field: Risk and ethics in social research. London: Routledge, 2000.

9

Clark, James S., Dave Bell, Michael Dietze, Michelle Hersh, Ines Ibanez, Shannon LaDeau, Sean McMahon, et al. Assessing the probability of rare climate events. Edited by Anthony O'Hagan and Mike West. Oxford University Press, 2018. http://dx.doi.org/10.1093/oxfordhb/9780198703174.013.16.

Abstract:
This article focuses on the use of Bayesian methods in assessing the probability of rare climate events, and more specifically the potential collapse of the meridional overturning circulation (MOC) in the Atlantic Ocean. It first provides an overview of climate models and their use to perform climate simulations, drawing attention to uncertainty in climate simulators and the role of data in climate prediction, before describing an experiment that simulates the evolution of the MOC through the twenty-first century. MOC collapse is predicted by the GENIE-1 (Grid Enabled Integrated Earth system model) for some values of the model inputs, and Bayesian emulation is used for collapse probability analysis. Data comprising a sparse time series of five measurements of the MOC from 1957 to 2004 are analysed. The results demonstrate the utility of Bayesian analysis in dealing with uncertainty in complex models, and in particular in quantifying the risk of extreme outcomes.
10

Hüsler, Jürg, Rolf-Dieter Reiss, and Michael Falk. Laws of Small Numbers: Extremes and Rare Events. Birkhauser Verlag, 2013.


Book chapters on the topic "Rare event probability":

1

Caron, Virgile. "Importance Sampling for Multi-Constraints Rare Event Probability." In Springer Proceedings in Mathematics & Statistics, 119–28. New York, NY: Springer New York, 2014. http://dx.doi.org/10.1007/978-1-4939-2104-1_11.

2

Vořechovský, Miroslav. "Active Learning for Efficient Rare Event Probability Estimation and Sensitivity Analyses in Highly Nonlinear Systems." In Lecture Notes in Civil Engineering, 324–33. Cham: Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-60271-9_30.

3

Kalashnikov, Vladimir. "Ruin Probability." In Geometric Sums: Bounds for Rare Events with Applications, 171–200. Dordrecht: Springer Netherlands, 1997. http://dx.doi.org/10.1007/978-94-017-1693-2_6.

4

Kalashnikov, Vladimir. "Miscellaneous Probability Topics." In Geometric Sums: Bounds for Rare Events with Applications, 30–73. Dordrecht: Springer Netherlands, 1997. http://dx.doi.org/10.1007/978-94-017-1693-2_2.

5

Hüsler, Jürg. "Extreme Values and Rare Events of Non-Stationary Random Sequences." In Dependence in Probability and Statistics, 439–56. Boston, MA: Birkhäuser Boston, 1986. http://dx.doi.org/10.1007/978-1-4615-8162-8_21.

6

Longpré, Luc, and Vladik Kreinovich. "How to Describe Hypothetic Truly Rare Events (With Probability 0)." In Uncertainty, Constraints, and Decision Making, 211–15. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-36394-8_35.

7

Joy, Christy, and Marria C. Cyriac. "Phytochemicals as Potential Drug Candidates for SARS Cov-2: An RDRp Based In-Silico Drug Designing." In Proceedings of the Conference BioSangam 2022: Emerging Trends in Biotechnology (BIOSANGAM 2022), 58–69. Dordrecht: Atlantis Press International BV, 2022. http://dx.doi.org/10.2991/978-94-6463-020-6_7.

Abstract:
In the global pandemic that the world is currently witnessing, COVID-19, even with vaccines available, the test positivity rate (TPR) tends to remain highly threatening. This research focuses on identifying phytochemicals, previously known for their broad-spectrum antiviral properties, which can be potential drug candidates for SARS-CoV-2. A total of 225 phytocompounds (downloaded from the PubChem database) are docked against the target protein (downloaded from the PDB database) of SARS-CoV-2 using the POAP pipeline. The target protein is the RDRp complex. They are screened according to their binding affinity values, and the filtered phytochemicals are then subjected to various analyses including ADME properties (preADMET, swissADME), bioactivity score, and molecular properties (molinspiration), drug-likeness (preADMET), lipophilicity, water solubility, and pharmacokinetics (swissADME). The receptor-ligand interactions and the amino acid positions are obtained using Discovery Studio Visualiser. Molecular dynamics simulation studies are performed to reveal key receptor-drug interactions that must be formed to achieve tight drug binding and also to predict stability. Out of the 225, 10 phytochemicals showed the best scores and the highest probability of drug action. Compounds that showed promising drug action potential include oriciacridone, corilagin, cinchophyllamine, sophaline D, amentoflavone, cryptomisrine, ginkgetin, hypericin, pseudojervine, dieckol, hinokiflavone, robustaflavone, solamargine. The research herein provides new possibilities for in vitro and in vivo analyses of the proposed ligands to develop new drugs against SARS-CoV-2.
8

Morio, J., and M. Balesdent. "Introduction to rare event probability estimation." In Estimation of Rare Event Probabilities in Complex Aerospace and Other Systems, 1–2. Elsevier, 2016. http://dx.doi.org/10.1016/b978-0-08-100091-5.00001-0.

9

Morio, J., and M. Balesdent. "Basics of probability and statistics." In Estimation of Rare Event Probabilities in Complex Aerospace and Other Systems, 5–32. Elsevier, 2016. http://dx.doi.org/10.1016/b978-0-08-100091-5.00002-2.

10

Morio, J., D. Jacquemart, and M. Balesdent. "Estimation of conflict probability between aircraft." In Estimation of Rare Event Probabilities in Complex Aerospace and Other Systems, 183–88. Elsevier, 2016. http://dx.doi.org/10.1016/b978-0-08-100091-5.00013-7.


Conference papers on the topic "Rare event probability":

1

Botev, Zdravko I., and Ad Ridder. "An M-estimator for rare-event probability estimation." In 2016 Winter Simulation Conference (WSC). IEEE, 2016. http://dx.doi.org/10.1109/wsc.2016.7822103.

2

Shah, Rohan, Christian Hirsch, Dirk P. Kroese, and Volker Schmidt. "Rare event probability estimation for connectivity of large random graphs." In 2014 Winter Simulation Conference - (WSC 2014). IEEE, 2014. http://dx.doi.org/10.1109/wsc.2014.7019916.

3

Qiu, Yue, Hong Zhou, and Yue-qin Wu. "An importance sampling method with applications to rare event probability." In 2007 IEEE International Conference on Grey Systems and Intelligent Services. IEEE, 2007. http://dx.doi.org/10.1109/gsis.2007.4443499.

4

Elsheikh, A. H., S. Oladyshkin, W. Nowak, and M. Christie. "Estimating the Probability of CO2 Leakage Using Rare Event Simulation." In ECMOR XIV - 14th European Conference on the Mathematics of Oil Recovery. Netherlands: EAGE Publications BV, 2014. http://dx.doi.org/10.3997/2214-4609.20141876.

5

de Boer, Pieter-Tjerk, Pierre L'Ecuyer, Gerardo Rubino, and Bruno Tuffin. "Estimating the probability of a rare event over a finite time horizon." In 2007 Winter Simulation Conference. IEEE, 2007. http://dx.doi.org/10.1109/wsc.2007.4419629.

6

Cavaluzzi, Jack, Chase Gilmore, Bilal Khan, and Minh Hong Tran. "Probability of the Loss of Offsite Power and Damage to Road Network due to a Rare Event." In 2012 20th International Conference on Nuclear Engineering and the ASME 2012 Power Conference. American Society of Mechanical Engineers, 2012. http://dx.doi.org/10.1115/icone20-power2012-54582.

Abstract:
The main goal of the project was to better understand the impact to a nuclear power plant due to the unavailability of critical infrastructure. We evaluated the use of rare event analysis to establish rare event occurrences in the vicinity of the plant. For the purpose of this report, rare events were considered extreme scenarios of natural disasters. The initial data was acquired by viewing STP FSARs and topographical maps of the region. The two specific factors that were extracted by analyzing the initial data of the plant design and location were the road networks and the electrical grid system. These two variables were then analyzed in light of the rare event analysis model that was used as a building block for this project. The rare event analysis also provided information regarding the types of data and distributions that were likely to be seen as a result. For the road networks, the road layout around the plant was mapped and the relevant data was used to consider the possible routes to the plants in case of a rare event. Data regarding LOOP Events were gathered mainly from NRC Documents and Institutional Sources. These data were used to analyze how many events occurred per year and the downtime associated with the rare event.
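Occurrence-per-year data of this kind is commonly turned into a design probability with a Poisson model; the rate below is a made-up placeholder, not a figure from the paper.

```python
import math

# If LOOP events occur at rate lam per site-year (assumed value),
# a Poisson model gives the chance of at least one event over a
# plant's operating life.
lam = 0.05                       # assumed events per year
years = 40
p_none = math.exp(-lam * years)  # Poisson probability of zero events
print("P(at least one LOOP in 40 years) =", 1 - p_none)   # ~0.86
```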
7

Qiu, Yue, Hong Zhou, and Yueqin Wu. "An importance sampling method based on martingale with applications to rare event probability." In 2008 7th World Congress on Intelligent Control and Automation. IEEE, 2008. http://dx.doi.org/10.1109/wcica.2008.4593574.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Bittner, M., M. Broggi, and M. Beer. "Rare Event Modelling for Stochastic Dynamic Systems approximated by the Probability Density Evolution Method." In Proceedings of the 29th European Safety and Reliability Conference (ESREL). Singapore: Research Publishing Services, 2019. http://dx.doi.org/10.3850/978-981-11-2724-3_0735-cd.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Xu, Yanwen, and Pingfeng Wang. "Sequential Sampling Based Reliability Analysis for High Dimensional Rare Events With Confidence Intervals." In ASME 2020 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2020. http://dx.doi.org/10.1115/detc2020-22146.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Accurately analyzing rare failure events at an affordable computational cost is challenging in many engineering applications, especially for problems with high-dimensional system inputs. The extremely low probabilities of occurrence of such rare events often lead to large probability estimation errors and low computational efficiency. It is therefore vital to develop advanced probability analysis methods capable of providing robust estimates of rare event probabilities with narrow confidence bounds. Confidence intervals for an estimator can generally be established using the central limit theorem, but a critical obstacle is low computational efficiency, since the widely used Monte Carlo method often requires a very large number of samples to yield a reasonably narrow confidence interval. This paper develops a new probability analysis approach that derives estimates of rare event probabilities efficiently, with simultaneously narrow estimation bounds, for high-dimensional problems. The asymptotic behavior of the developed estimator is proved theoretically without imposing strong assumptions, and an asymptotic confidence interval is established for it. The study offers important insights into robust estimation of the probability of occurrence of rare events. The accuracy and computational efficiency of the developed technique are assessed with numerical and engineering case studies; their results demonstrate that narrow bounds can be built efficiently with the developed approach and that the true values always lie within the estimation bounds, indicating good estimation accuracy along with significantly improved efficiency.
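As context for the confidence-interval discussion above, the sketch below shows the baseline the abstract identifies as inefficient: crude Monte Carlo with a CLT-based interval, applied to a toy rare event (a standard normal exceeding a high threshold). It is not the authors' method, and the threshold and sample size are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

threshold = 4.0     # true exceedance probability is about 3.17e-5
n = 10_000_000      # crude Monte Carlo needs very many samples for rare events

hits = rng.standard_normal(n) > threshold
p_hat = hits.mean()

# CLT-based 95% confidence interval for a Bernoulli mean.
se = np.sqrt(p_hat * (1.0 - p_hat) / n)
lo, hi = p_hat - 1.96 * se, p_hat + 1.96 * se
print(f"p_hat = {p_hat:.3e}, 95% CI = [{lo:.3e}, {hi:.3e}]")

# The relative interval width scales roughly like 1/sqrt(n * p), which is why
# crude Monte Carlo becomes prohibitively expensive as the event gets rarer.
```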
10

Babanin, Alexander V. "Physics-Based Approach to Wave Statistics and Probability." In ASME 2013 32nd International Conference on Ocean, Offshore and Arctic Engineering. American Society of Mechanical Engineers, 2013. http://dx.doi.org/10.1115/omae2013-10416.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Design criteria in ocean engineering, whether for a one-in-50-years or a one-in-5000-years event, are hardly ever based on measurements, but rather on statistical distributions of relevant metocean properties. Of utmost interest is the tail of these distributions, that is, rare events such as the highest waves with low probability. Engineers have long since realised that the superposition of linear waves with a narrow-banded spectrum, as described by the Rayleigh distribution, underestimates the probability of extreme wave crests and is not adequate for wave heights either, a critical shortcoming as far as engineering design is concerned. Ongoing theoretical and experimental efforts have been under way for decades to address this issue. Here, we concentrate on short-term statistics, i.e. the probability of crests/heights of individual waves. The typical approach is to treat all possible waves in the ocean, or at a particular location, as a single ensemble for which some comprehensive solution can be found. Oceanographic knowledge, however, now indicates that no single, unified comprehensive solution is possible. Probability distributions in different physical circumstances should differ, and combining them introduces inevitable scatter. The scatter and the accuracy will not improve with greater bulk data quality and quantity, and the scatter hides the actual distribution of extreme events. The groups have to be separated and their probability distributions treated individually. The paper offers a review of physical conditions, from simple one-dimensional trains of free waves to realistic two-dimensional wind-forced wave fields, in order to understand where different probability distributions can be expected. If the wave trains/fields in the wave records are stable, distributions for second-order waves should serve well. If modulational instability is active, rare extreme events not predicted by second-order theory become possible; this depends on wave steepness, bandwidth and directionality. Mean steepness also defines wave breaking and therefore the upper limit on wave heights in this group of conditions. Under hurricane-like circumstances, the instability gives way to direct wind forcing, and yet other statistics are to be expected.
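For concreteness, the linear narrow-band baseline mentioned in the abstract can be written down directly: under the Rayleigh model, an individual wave height H exceeds a level h with probability exp(-2(h/Hs)²), where Hs is the significant wave height. The sketch below evaluates this tail; the Hs value is an illustrative assumption, and the abstract's point is precisely that this linear model can underestimate extremes when nonlinearity is active.

```python
import numpy as np

def rayleigh_exceedance(h: float, hs: float) -> float:
    """P(H > h) for individual wave heights under the Rayleigh model."""
    return float(np.exp(-2.0 * (h / hs) ** 2))

hs = 4.0  # significant wave height in metres (illustrative value)
for h in (4.0, 6.0, 8.0):
    print(f"P(H > {h:.0f} m) = {rayleigh_exceedance(h, hs):.2e}")

# With the common 'freak wave' criterion H > 2*Hs, the Rayleigh model gives
# exp(-8) ~ 3.4e-4 per wave; modulational instability can make the true
# probability of extreme crests larger than this linear prediction.
```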

Reports on the topic "Rare event probability":

1

Montalvo-Bartolomei, Axel, Bryant Robbins, and Jamie López-Soto. Backward erosion progression rates from small-scale flume tests. Engineer Research and Development Center (U.S.), September 2021. http://dx.doi.org/10.21079/11681/42135.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Backward erosion piping (BEP) is an internal erosion mechanism by which erosion channels progress upstream, typically through cohesionless or highly erodible foundation materials of dams and levees. Because BEP is one of the primary causes of embankment failures, usually during high pool events, the probability of BEP-induced failure is commonly evaluated by the U.S. Army Corps of Engineers for existing dams and levees. In current practice, BEP failure probability is assessed quantitatively assuming steady-state conditions, with qualitative adjustments for temporal aspects of the process. In cases with short-term hydraulic loads, the progression rate of the erosion pipe may control the failure probability, so that more quantitative treatment of the temporal development of erosion is necessary to arrive at meaningful probabilities of failure. This report builds upon the current state of the practice by investigating BEP progression rates through a series of laboratory experiments. BEP progression rates were measured for nine uniform sands in a series of 55 small-scale flume tests. Results indicate that pipe progression rates are proportional to the seepage velocity and can be predicted using equations recently proposed in the literature.
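The report's predictive equations are not reproduced in the abstract, but the stated proportionality between progression rate and seepage velocity suggests a zero-intercept linear fit as a minimal illustration. The sketch below uses hypothetical flume-style measurements, not the report's data, and the fitted constant is purely illustrative.

```python
import numpy as np

# Hypothetical paired measurements (illustrative values only):
# average seepage velocity (cm/s) and observed pipe progression rate (cm/s).
seepage_velocity = np.array([0.10, 0.15, 0.22, 0.30, 0.41])
progression_rate = np.array([0.25, 0.39, 0.55, 0.74, 1.05])

# Proportional model per the abstract: rate = alpha * velocity.
# Zero-intercept least squares gives alpha = (v . r) / (v . v).
alpha = (seepage_velocity @ progression_rate) / (seepage_velocity @ seepage_velocity)
print(f"alpha = {alpha:.2f}")

# Predicted progression rate at a new seepage velocity of 0.5 cm/s.
print(f"predicted rate = {alpha * 0.5:.2f} cm/s")
```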
2

Bragdon, Sophia, Vuong Truong, and Jay Clausen. Environmentally informed buried object recognition. Engineer Research and Development Center (U.S.), November 2022. http://dx.doi.org/10.21079/11681/45902.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The ability to detect and classify buried objects using thermal infrared imaging is affected by the environmental conditions at the time of imaging, which leads to an inconsistent probability of detection. For example, periods of dense overcast or recent precipitation events suppress the temperature difference between the buried object and the surrounding soil, preventing detection. This work introduces an environmentally informed framework to reduce the false alarm rate in the classification of regions of interest (ROIs) in thermal IR images containing buried objects. Using a dataset of thermal images containing buried objects paired with the corresponding environmental and meteorological conditions, we employ a machine learning approach to determine which environmental conditions most strongly affect the visibility of the buried objects. We find the key environmental conditions to be incoming shortwave solar radiation, soil volumetric water content, and average air temperature. For each image, ROIs are computed using a computer vision approach and coupled with the most important environmental conditions to form the input to the classification algorithm. The environmentally informed classification algorithm decides whether an ROI contains a buried object by simultaneously learning on the ROIs with a classification neural network and on the environmental data with a tabular neural network. On a given set of ROIs, we have shown that the environmentally informed approach improves the detection of buried objects within the ROIs.
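The abstract describes joint learning on image ROIs and tabular environmental data. Below is a minimal sketch of that two-branch fusion pattern, assuming PyTorch; the layer sizes, class name, and 32x32 ROI shape are illustrative assumptions, not the report's architecture.

```python
import torch
import torch.nn as nn

class EnvironmentallyInformedClassifier(nn.Module):
    """Toy two-branch classifier: thermal-IR ROI + environmental features."""

    def __init__(self, n_env_features: int = 3):
        super().__init__()
        # Image branch: small CNN over a single-channel thermal ROI.
        self.image_branch = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),  # -> 16-dim embedding
        )
        # Tabular branch: e.g. shortwave radiation, soil moisture, air temp.
        self.env_branch = nn.Sequential(
            nn.Linear(n_env_features, 16), nn.ReLU(),
            nn.Linear(16, 16), nn.ReLU(),
        )
        self.head = nn.Linear(16 + 16, 2)  # buried object / no object

    def forward(self, roi: torch.Tensor, env: torch.Tensor) -> torch.Tensor:
        # Concatenate the two embeddings and classify the fused representation.
        fused = torch.cat([self.image_branch(roi), self.env_branch(env)], dim=1)
        return self.head(fused)

model = EnvironmentallyInformedClassifier()
logits = model(torch.randn(4, 1, 32, 32), torch.randn(4, 3))  # batch of 4 ROIs
print(logits.shape)  # torch.Size([4, 2])
```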
3

Russo, David, Daniel M. Tartakovsky, and Shlomo P. Neuman. Development of Predictive Tools for Contaminant Transport through Variably-Saturated Heterogeneous Composite Porous Formations. United States Department of Agriculture, December 2012. http://dx.doi.org/10.32747/2012.7592658.bard.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The vadose (unsaturated) zone forms a major hydrologic link between the ground surface and underlying aquifers. To understand properly its role in protecting groundwater from near-surface sources of contamination, one must be able to analyze quantitatively water flow and contaminant transport in variably saturated subsurface environments that are highly heterogeneous, often consisting of multiple geologic units and/or high- and/or low-permeability inclusions. The specific objectives of this research were: (i) to develop efficient and accurate tools for probabilistic delineation of dominant geologic features comprising the vadose zone; (ii) to develop a complementary set of data analysis tools for discerning the fractal properties of hydraulic and transport parameters of a highly heterogeneous vadose zone; (iii) to develop and test the associated computational methods for probabilistic analysis of flow and transport in highly heterogeneous subsurface environments; and (iv) to apply the computational framework to design an "optimal" observation network for monitoring and forecasting the fate and migration of contaminant plumes originating from agricultural activities. During the course of the project, we modified the third objective to include an additional computational method, based on the notion that a heterogeneous formation can be considered a mixture of populations of differing spatial structures. Regarding uncertainty analysis, going beyond approaches based on the mean and variance of system states, we succeeded in developing probability density function (PDF) solutions that enable one to evaluate probabilities of rare events, as required for probabilistic risk assessment. In addition, we developed reduced-complexity models for the probabilistic forecasting of infiltration rates in heterogeneous soils during surface runoff and/or flooding events. Regarding flow and transport in variably saturated, spatially heterogeneous formations associated with fine- and coarse-textured embedded soils (FTES- and CTES-formations, respectively), we succeeded in developing first-order and numerical frameworks for flow and transport in three-dimensional (3-D), variably saturated, bimodal, heterogeneous formations with single and dual porosity, respectively. Regarding the sampling problem, defined as how many sampling points are needed and where to locate them in the horizontal x₂x₃ plane of the field, we succeeded, based on our computational framework, in developing and demonstrating a methodology that might considerably improve our ability to describe quantitatively the response of complicated 3-D flow systems. The results of the project are of theoretical and practical importance; they provide a rigorous framework for modeling water flow and solute transport in a realistic, highly heterogeneous, composite flow system with uncertain properties under-specified by data. Specifically, they: (i) enhance fundamental understanding of the basic mechanisms of field-scale flow and transport in near-surface geological formations under realistic flow scenarios; (ii) provide a means to assess the ability of existing flow and transport models to handle realistic flow conditions; and (iii) provide a means to assess quantitatively the threats posed to groundwater by contamination from agricultural sources.
