Dissertations / Theses on the topic 'Analisi probabilistica'

To see the other types of publications on this topic, follow the link: Analisi probabilistica.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Analisi probabilistica.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Bertoni, Vanessa. "Analisi probabilistica e uso di controventi dissipativi in telai in accaio e composti acciaio - calcestruzzo." Doctoral thesis, Università degli studi di Trieste, 2011. http://hdl.handle.net/10077/4469.

Full text
Abstract:
2009/2010
This thesis investigates various methods for the seismic protection of structures, starting from a correct and realistic evaluation of their seismic response through a probabilistic approach that accounts for the most significant sources of randomness. If the response does not meet expectations, one can intervene in various ways, for example by introducing dissipative braces. Two types are analysed here: braces equipped with viscoelastic devices and braces with fluid-viscous devices. For both, complete design methods are presented, together with comparisons.
XXIII Cycle
APA, Harvard, Vancouver, ISO, and other styles
2

Bella, Maurizio. "Modellazione numerica di strutture sismoresistenti e analisi probabilistiche di tipo montecarlo." Doctoral thesis, Università degli studi di Trieste, 2010. http://hdl.handle.net/10077/3460.

Full text
Abstract:
2008/2009
The aim of this thesis is the development and tuning of a set of numerical models and computer codes for modelling the behaviour of framed earthquake-resistant structures under seismic action, so that structural reliability can be evaluated by means of Monte Carlo probabilistic analyses. In particular, composite beam-column joints were modelled by defining a mechanical component model that describes the behaviour of the joint through an assembly of rigid and deformable elements, suitably connected to one another and assigned appropriate hysteretic models. To this end, a series of hysteretic models taken from the literature were developed and implemented in the ABAQUS and ADAPTIC codes, suitable for the numerical modelling of the main deformation components identifiable in beam-column joints and in the viscoelastic devices used to build dissipative braces. In developing the mechanical model of composite beam-column joints, particular attention was paid to the definition of the elements describing the interaction between the slab and the column. To ease the execution of the large number of numerical analyses required by Monte Carlo probabilistic analyses, a pre-processor and a post-processor capable of interfacing with the codes were also developed. The first chapter introduces the problem of evaluating structural reliability. The second chapter introduces methods of structural analysis, with particular reference to Monte Carlo probabilistic analysis applied to reliability evaluation. The third chapter examines the numerical modelling of earthquake-resistant structures. The fourth chapter illustrates the numerical models developed and implemented, as well as the computer codes. The fifth chapter illustrates the validation of the numerical models against results from experimental investigations conducted by various authors and found in the literature. Finally, the sixth chapter evaluates the structural reliability of a sample framed structure, namely the frame tested at the European Laboratory for Structural Assessment (ELSA) of the Joint Research Centre (JRC) in Ispra (Varese). For this structure, fragility curves are determined with reference to the inter-storey drift (IRDA), taken as the structural damage index.
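As a rough illustration of the Monte Carlo fragility estimation described in this abstract, here is a minimal Python sketch; the toy drift model, its lognormal dispersion and the drift limit are invented for the example (the thesis itself relies on detailed ABAQUS/ADAPTIC models):

```python
# Minimal Monte Carlo fragility sketch (illustrative only; not the thesis code).
# Assumes a toy demand model: peak inter-storey drift grows linearly with
# ground-motion intensity and is scaled by a lognormal model-uncertainty factor.
import numpy as np

rng = np.random.default_rng(42)

def peak_drift(intensity, n_samples=10_000):
    # Lognormal model uncertainty (median 1, 30% dispersion) - an assumption.
    model_factor = rng.lognormal(mean=0.0, sigma=0.3, size=n_samples)
    return 0.004 * intensity * model_factor  # toy linear demand model

def fragility(intensities, drift_limit=0.01):
    # P(drift > limit | intensity), estimated by simple Monte Carlo counting.
    return [float(np.mean(peak_drift(im) > drift_limit)) for im in intensities]

ims = np.linspace(0.5, 5.0, 10)   # intensity-measure grid (e.g., PGA in m/s^2)
print(list(zip(ims.round(2), fragility(ims))))
```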
XXII Cycle
APA, Harvard, Vancouver, ISO, and other styles
3

Tagliaferri, Lorenza. "Probabilistic Envelope Curves for Extreme Rainfall Events - Curve Inviluppo Probabilistiche per Precipitazioni Estreme." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2008. http://amslaurea.unibo.it/99/.

Full text
Abstract:
A regional envelope curve (REC) of flood flows summarises the current bound on our experience of extreme floods in a region. RECs are available for most regions of the world. Recent scientific papers have introduced a probabilistic interpretation of these curves and formulated an empirical estimator of the recurrence interval T associated with a REC, which, in principle, enables us to use RECs for design purposes in ungauged basins. The main aim of this work is twofold. First, it extends the REC concept to extreme rainstorm events by introducing Depth-Duration Envelope Curves (DDECs), defined as the regional upper bound on all record rainfall depths to date for various rainfall durations. Second, it adapts the probabilistic interpretation proposed for RECs to DDECs and assesses the suitability of these curves for estimating the T-year rainfall event associated with a given duration, for large values of T. Probabilistic DDECs are complementary to regional frequency analysis of rainstorms, and their use in combination with a suitable rainfall-runoff model can provide useful indications of the magnitude of extreme floods for gauged and ungauged basins. The study focuses on two national datasets: the peak-over-threshold (POT) series of rainfall depths with durations of 30 min. and 1, 3, 9 and 24 hrs. obtained for 700 Austrian raingauges, and the Annual Maximum Series (AMS) of rainfall depths with durations spanning from 5 min. to 24 hrs. collected at 220 raingauges located in northern-central Italy. The estimation of the recurrence interval of a DDEC requires the quantification of the equivalent number of independent data which, in turn, is a function of the cross-correlation among sequences. While the quantification and modelling of intersite dependence is a straightforward task for AMS series, it may be cumbersome for POT series. This work proposes a possible approach to address this problem.
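The "equivalent number of independent data" idea can be illustrated with a small sketch. The synthetic gauge records and the equi-correlated effective-sample-size formula below are assumptions made for the example, not the estimator developed in the thesis:

```python
# Illustrative sketch of the "equivalent number of independent data" idea.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic annual-maximum rainfall depths at 5 correlated gauges, 40 years each.
years, gauges = 40, 5
common = rng.gumbel(loc=30.0, scale=8.0, size=(years, 1))
series = common + rng.normal(scale=4.0, size=(years, gauges))

rho = np.corrcoef(series, rowvar=False)                   # inter-site correlations
rho_bar = (rho.sum() - gauges) / (gauges * (gauges - 1))  # mean off-diagonal value

n = years * gauges
# Common effective-sample-size approximation for equi-correlated gauges.
n_eff = n / (1.0 + (gauges - 1) * rho_bar)
print(f"nominal n = {n}, mean cross-correlation = {rho_bar:.2f}, n_eff ≈ {n_eff:.0f}")
```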
APA, Harvard, Vancouver, ISO, and other styles
4

LOPES, VALDIR M. "Incidentes em reatores nucleares de pesquisa examinados por analise de probabilidade deterministica e analise probabilistica de seguranca." Repositório Institucional do IPEN, 2010. http://repositorio.ipen.br:8080/xmlui/handle/123456789/9589.

Full text
Abstract:
Thesis (Doctorate)
IPEN/T
Instituto de Pesquisas Energeticas e Nucleares - IPEN-CNEN/SP
APA, Harvard, Vancouver, ISO, and other styles
5

Beccuti, Marco. "Modeling and analisys of probabilistic system : Formalism and efficient algorithm." Paris 9, 2008. https://portail.bu.dauphine.fr/fileviewer/index.php?doc=2008PA090060.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Stanghellini, Andrea. "Analisi costi benefici di un'infrastruttura stradale con approccio probabilistico all'analisi di sensitività e di rischio." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2012. http://amslaurea.unibo.it/4782/.

Full text
Abstract:
The goal of this work is to define the values of the fundamental input items of the calculation method through an extensive literature review on the subject, and to set out a practical reference procedure applied to the field of road infrastructure. The results of the calculation were verified and checked with sensitivity tests on the critical variables (sensitivity analysis, switch analysis and risk analysis).
APA, Harvard, Vancouver, ISO, and other styles
7

Asafu-Adjei, Joseph Kwaku. "Probabilistic Methods." VCU Scholars Compass, 2007. http://hdl.handle.net/10156/1420.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Sato, Fujio 1944. "Um estudo comparativo da analise de curto-circuito probabilistico em ambientes paralelo e distribuido." [s.n.], 1995. http://repositorio.unicamp.br/jspui/handle/REPOSIP/260361.

Full text
Abstract:
Advisors: Alcir Jose Monticelli, Ariovaldo Verandio Garcia
Doctoral thesis - Universidade Estadual de Campinas, Faculdade de Engenharia Eletrica
This work presents the parallelization of a power-system probabilistic short-circuit analysis program using the Monte Carlo method. A sequential version of the code, originally developed for a one-processor machine, was extended to two different high-performance computer system architectures (parallel and distributed). The main objective of the research was to study issues such as portability, performance, scalability and communication. Two programming models were implemented on both architectures: SPMD (Single Process Multiple Data) and Master/Slave. The architectures and the models were evaluated by simulation on four real-life networks of the Brazilian South-Southeast interconnected system.
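A minimal sketch of the master/slave scheme, using Python's multiprocessing in place of the original high-performance environments; the toy fault-current model and the 8 p.u. threshold are invented for the illustration:

```python
# Master/slave-style parallel Monte Carlo sketch (illustrative of the scheme,
# not the original code). Each worker runs an independent batch of samples.
import numpy as np
from multiprocessing import Pool

def worker(args):
    seed, n = args
    rng = np.random.default_rng(seed)
    # Toy probabilistic short-circuit model: fault current = E / (Zs + Zf),
    # with uncertain source impedance Zs and fault impedance Zf (assumptions).
    zs = rng.normal(0.10, 0.01, n)
    zf = rng.exponential(0.05, n)
    i_fault = 1.0 / (zs + zf)          # per-unit fault current
    return np.mean(i_fault > 8.0)      # P(current exceeds an 8 p.u. threshold)

if __name__ == "__main__":
    tasks = [(seed, 50_000) for seed in range(8)]   # 8 slave tasks
    with Pool(processes=4) as pool:                 # master distributes the work
        partial = pool.map(worker, tasks)
    print("P(exceedance) ≈", float(np.mean(partial)))
```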
Doctorate
Electrical Energy
Doctor of Electrical Engineering
APA, Harvard, Vancouver, ISO, and other styles
9

Alhajj, Chehade Hicham. "Geosynthetic-Reinforced Retaining Walls-Deterministic And Probabilistic Approaches." Thesis, Université Grenoble Alpes, 2021. http://www.theses.fr/2021GRALI010.

Full text
Abstract:
The aim of this thesis is to assess the seismic internal stability of geosynthetic-reinforced soil retaining walls. The work first deals with deterministic analyses and then focuses on probabilistic ones. In the first part of the thesis, a deterministic model, based on the upper-bound theorem of limit analysis, is proposed for assessing the reinforced soil wall safety factor or the reinforcement strength required to stabilize the structure. A spatial discretization technique is used to generate the rotational failure surface, making it possible to consider heterogeneous backfills and/or to represent the seismic loading by the pseudo-dynamic approach. The cases of dry, unsaturated and saturated soils are investigated. Additionally, the presence of cracks in the backfill soils is considered. This deterministic model gives rigorous results and is validated by comparison with existing results from the literature. Then, in the second part of the thesis, this deterministic model is used in a probabilistic framework. First, the uncertain input parameters are modeled using random variables. The uncertainties considered involve the soil shear strength parameters, the seismic loading and the reinforcement strength parameters. Sparse Polynomial Chaos Expansion, which consists of replacing the computationally expensive deterministic model with a meta-model, combined with Monte Carlo simulations, is the reliability method used to carry out the probabilistic analysis. The random-variables approach neglects the soil spatial variability, since the soil properties and the other uncertain input parameters are considered constant in each deterministic simulation. Therefore, in the last part of the manuscript, the soil spatial variability is considered using random field theory. The SIR/A-bSPCE method, a combination of the dimension-reduction technique Sliced Inverse Regression (SIR) and an active-learning sparse polynomial chaos expansion (A-bSPCE), is implemented to carry out the probabilistic analysis. The total computational time of the probabilistic analysis, performed using SIR-SPCE, is significantly reduced compared to directly running classical probabilistic methods. Only the soil strength parameters are modeled using random fields, in order to focus on the effect of spatial variability on the reliability results.
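The surrogate-plus-simulation workflow can be sketched as follows. This uses an ordinary least-squares quadratic surrogate rather than the sparse PCE of the thesis, and the toy safety-factor model and input distributions are assumptions:

```python
# Minimal surrogate-plus-Monte-Carlo sketch in the spirit of the PCE approach
# (ordinary least-squares polynomial surrogate; all numbers are invented).
import numpy as np

rng = np.random.default_rng(1)

def expensive_model(phi, psi):
    # Stand-in for the limit-analysis safety factor (a toy assumption).
    return 0.2 + 0.02 * phi + 0.3 * psi - 0.001 * phi * psi

# 1) Small experimental design on the random inputs.
phi = rng.normal(30.0, 3.0, 60)      # friction angle [deg]
psi = rng.lognormal(0.0, 0.2, 60)    # normalised reinforcement strength
y = expensive_model(phi, psi)

# 2) Fit a quadratic polynomial surrogate by least squares.
def basis(p, q):
    return np.column_stack([np.ones_like(p), p, q, p * q, p**2, q**2])

coeff, *_ = np.linalg.lstsq(basis(phi, psi), y, rcond=None)

# 3) Cheap Monte Carlo on the surrogate to estimate P(FS < 1).
P, Q = rng.normal(30.0, 3.0, 1_000_000), rng.lognormal(0.0, 0.2, 1_000_000)
fs = basis(P, Q) @ coeff
print("P_f ≈", float(np.mean(fs < 1.0)))
```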
APA, Harvard, Vancouver, ISO, and other styles
10

Guo, Xiangfeng. "Probabilistic stability analysis of an earth dam using field data." Thesis, Université Grenoble Alpes, 2020. http://www.theses.fr/2020GRALI017.

Full text
Abstract:
Uncertainties in soil properties are widely encountered in geotechnical engineering, especially for earth dams, which are constructed with earthen materials. In recent years there has been an increasing need, motivated by the deficiencies of the traditional deterministic approach or guided by national regulations such as those in France, to account for these uncertainties in the safety assessment of large dams, particularly in the framework of risk analysis studies. However, probabilistic analyses are still complex and not easy to implement in practice, due to the limited number of in-situ measurements, the computational expense and the lack of implementation of reliability methods in commercial simulation tools. Moreover, most previous studies are based on academic cases and hypothetical data. This work attempts to deal with the aforementioned issues by providing a probabilistic analysis of the stability of a real earth dam using available field data. The study includes the following main elements: (1) definition of the soil variability using the available measurements; (2) development of the deterministic models; (3-4) dam probabilistic analyses using the random-variables and random-fields approaches; (5) three-dimensional reliability analysis of the considered dam. Advanced reliability methods, such as adaptive surrogate modelling, are introduced for the studied earth dam problem. This allows the dam failure probability and the safety factor statistics to be estimated accurately with a significantly reduced calculation time. In addition, some issues that remain unclear in the field of dam probabilistic analysis are discussed (e.g. global sensitivity analysis of the soil hydraulic and shear strength parameters; a performance survey of five reliability methods; simulation/comparison of three different kinds of random fields: generic (unconditional-stationary), conditional and non-stationary). The presented work, based on real measurements, could be a good supplement to the existing probabilistic studies of geo-structures. Readers will also find useful information in the obtained results for better solving practical geotechnical problems in a probabilistic framework.
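One generic ingredient of such studies, the simulation of a stationary Gaussian random field by Cholesky factorisation of an exponential covariance, can be sketched as follows; the correlation length and soil statistics are assumed values, and the thesis itself also treats conditional and non-stationary fields:

```python
# Sketch of a 1-D stationary Gaussian random field for a soil property
# (a generic textbook construction, not the thesis implementation).
import numpy as np

rng = np.random.default_rng(7)

z = np.linspace(0.0, 20.0, 200)                 # depth [m]
theta = 4.0                                     # correlation length [m] (assumed)
cov = np.exp(-np.abs(z[:, None] - z[None, :]) / theta)
L = np.linalg.cholesky(cov + 1e-10 * np.eye(z.size))  # jitter for stability

mu, sigma = 25.0, 5.0                           # mean / std of friction angle [deg]
field = mu + sigma * (L @ rng.standard_normal(z.size))
print(field[:5].round(2))                       # one realisation, first 5 nodes
```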
APA, Harvard, Vancouver, ISO, and other styles
11

Hohn, Jennifer Lynn. "Generalized Probabilistic Bowling Distributions." TopSCHOLAR®, 2009. http://digitalcommons.wku.edu/theses/82.

Full text
Abstract:
Have you ever wondered if you are better than the average bowler? If so, there are a variety of ways to compute the average score of a bowling game, including methods that account for a bowler's skill level. In this thesis, we discuss several different ways to generate bowling scores randomly. For each distribution, we give results for the expected value and standard deviation of each frame's score, the expected value of the game's final score, and the correlation coefficient between the scores of the first and second roll of a single frame. Furthermore, we shall generalize the results for each distribution to an n-frame game on p pins. Additionally, we shall generalize the number of possible games when bowling n frames on p pins. Then, we shall derive the frequency distribution of each frame's scores and the arithmetic mean for n frames on p pins. Finally, to summarize the variety of distributions, we shall make tables that display the results obtained from each distribution used to model a particular bowler's score. We evaluate the special case of bowling 10 frames on 10 pins, which represents a standard bowling game.
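One simple generation scheme of the kind discussed above can be sketched in a few lines; the uniform-pinfall assumption is a single illustrative distribution, not the full family analysed in the thesis:

```python
# Monte Carlo sketch of a random bowling game's final score, assuming each roll
# knocks down a uniformly random number of the remaining pins.
import random

def play_game(rng):
    rolls = []
    for frame in range(10):
        first = rng.randint(0, 10)
        rolls.append(first)
        if frame < 9:
            if first < 10:                      # open frame or spare: second roll
                rolls.append(rng.randint(0, 10 - first))
        else:                                   # 10th frame with possible bonus rolls
            if first == 10:
                second = rng.randint(0, 10)     # pins reset after a strike
                third = rng.randint(0, 10) if second == 10 else rng.randint(0, 10 - second)
                rolls += [second, third]
            else:
                second = rng.randint(0, 10 - first)
                rolls.append(second)
                if first + second == 10:        # spare earns one bonus roll
                    rolls.append(rng.randint(0, 10))
    return rolls

def score(rolls):
    total, i = 0, 0
    for _ in range(10):
        if rolls[i] == 10:                      # strike: 10 + next two rolls
            total += 10 + rolls[i + 1] + rolls[i + 2]
            i += 1
        elif rolls[i] + rolls[i + 1] == 10:     # spare: 10 + next roll
            total += 10 + rolls[i + 2]
            i += 2
        else:                                   # open frame
            total += rolls[i] + rolls[i + 1]
            i += 2
    return total

rng = random.Random(3)
games = [score(play_game(rng)) for _ in range(100_000)]
print("expected final score ≈", sum(games) / len(games))
```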
APA, Harvard, Vancouver, ISO, and other styles
12

Larsson, Emelie. "Utvärdering av osäkerhet och variabilitet vid beräkning av riktvärden för förorenad mark." Thesis, Uppsala universitet, Institutionen för geovetenskaper, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-218289.

Full text
Abstract:
In Sweden, approximately 80,000 contaminated areas have been identified. Some of these areas are in need of remediation to cope with the effects that the contaminants have on both humans and the environment. The Swedish Environmental Protection Agency (EPA) has published a methodology for performing risk assessments of contaminated soils, together with a complex model for calculating soil guideline values. The guideline value model is deterministic and calculates single guideline values for contaminants. The model does not account explicitly for uncertainty and variability in parameters, but rather handles them implicitly by using safety factors and reasonable worst-case assumptions for different parameters. One method to account explicitly for uncertainty and variability in a risk assessment is to perform a probabilistic risk assessment (PRA) through Monte Carlo simulations. A benefit of this is that the parameters can be defined with probability density functions (PDFs) that account for their uncertainty and variability. In this Master's thesis a PRA was conducted, followed by calculations of probabilistic guideline values for selected contaminants. The model was run with two sets of PDFs for the parameters: one collected from extensive research in published articles, and one using the deterministic values set by the Swedish EPA for all parameters. The sets generated cumulative probability distributions (CPDs) of guideline values that, depending on the contaminant, agreed to varying degrees with the deterministic guideline values calculated by the Swedish EPA. In general, there was a stronger correlation between the deterministic guideline values and the CPDs for the sensitive land-use scenario than for the less sensitive one. For contaminants such as dioxin and PCB-7, a lowering of the guideline values would be required to fully protect humans and the environment, based on the results in this thesis. Based on a recent soil investigation performed by Geosigma AB, a case study was also conducted. In general there was a correlation between the deterministic site-specific guideline values and the CPDs in the case study, with the exception of copper, which according to the study would require halved guideline values to protect humans and the environment. In addition, a health-oriented risk assessment was performed in which unexpected exposure pathways were found to govern the guideline values. For some contaminants, the exposure pathway governing the guideline values in the PRA differed from the deterministic one in 70-90 % of the simulations. Also, the contribution of the exposure pathways to the unadjusted health-based guideline values differed from the deterministic ones. This indicates the need to always quantify the composition of guideline values and their governing exposure pathways in probabilistic risk assessments.
APA, Harvard, Vancouver, ISO, and other styles
13

Pan, Qiujing. "Deterministic and Probabilistic Assessment of Tunnel Face Stability." Thesis, Université Grenoble Alpes (ComUE), 2017. http://www.theses.fr/2017GREAI044.

Full text
Abstract:
This PhD thesis develops stability analyses for underground structures in two parts: a deterministic model and a probabilistic analysis. The first year of research was mainly devoted to the deterministic model; the second year developed a probabilistic model for high-dimensional problems.
In contemporary society, the utilization and exploitation of underground space has become an inevitable and necessary measure to relieve urban congestion. One of the most important requirements for successful design and construction in tunnelling and underground engineering is to maintain the stability of the soils surrounding the works. Stability analysis, however, requires engineers to have a clear idea of the earth pressure, the pore water pressure, seismic effects and soil variability. Research aimed at providing a workable theory for designing tunnels and underground structures is therefore an issue of high engineering significance. Among the approaches employed to address this problem, limit analysis is a powerful tool for stability analysis and has been widely used for real geotechnical works. This research undertakes further study of the application of the upper-bound theorem to the stability analysis of tunnels and underground engineering. The approach will then be compared with three-dimensional analyses and available experimental data. The final goal is to validate new simplified mechanisms, using limit analysis, to design the collapse and blow-out pressures at the tunnel face. These deterministic models will then be used in a probabilistic framework. The Collocation-based Stochastic Response Surface Methodology will be used, and generalized in order to make possible, at a limited computational cost, a complete parametric study of the probabilistic properties of the input variables. The uncertainty propagation through the models of stability and ground movements will be evaluated, and some methods of reliability-based design will be proposed. The spatial variability of the soil will be taken into account using random field theory and applied to tunnel face collapse. This model will be developed to account for this variability at much smaller computation times than numerical models, will be validated numerically and subjected to extensive random sampling. The effect of the spatial variability will be evaluated.
APA, Harvard, Vancouver, ISO, and other styles
14

Feng, Jianwen. "Probabilistic modelling of heterogeneous media." Thesis, Swansea University, 2013. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.644724.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Mason, Dave. "Probabilistic Program Analysis for Software Component Reliability." Thesis, University of Waterloo, 2002. http://hdl.handle.net/10012/1059.

Full text
Abstract:
Components are widely seen by software engineers as an important technology to address the "software crisis". An important aspect of components in other areas of engineering is that system reliability can be estimated from the reliability of the components. We show how commonly proposed methods of reliability estimation and composition for software are inadequate because of differences between the models and the actual software systems, and we show where the assumptions from system reliability theory cause difficulty when applied to software. This thesis provides an approach to reliability that makes it possible, if not currently plausible, to compose component reliabilities so as to accurately and safely determine system reliability. Firstly, we extend previous work on input sub-domains, or partitions, such that our sub-domains can be sampled in a statistically sound way. We provide an algorithm to generate the most important partitions first, which is particularly important when there are an infinite number of input sub-domains. We combine analysis and testing to provide useful reliabilities for the various input sub-domains of a system, or component. This provides a methodology for calculating true reliability for a software system for any accurate statistical distribution of input values. Secondly, we present a calculus for probability density functions that permits accurately modeling the input distribution seen by each component in the system - a critically important issue in dealing with reliability of software components. Finally, we provide the system structuring calculus that allows a system designer to take components from component suppliers that have been built according to our rules and to determine the resulting system reliability. This can be done without access to the actual components. This work raises many issues, particularly about scalability of the proposed techniques and about the ability of the system designer to know the input profile to the level and kind of accuracy required. There are also large classes of components where the techniques are currently intractable, but we see this work as an important first step.
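The basic composition idea can be sketched as follows, assuming independent failures, a sequential structure and invented sub-domain reliabilities and usage profile; the thesis argues that such plain multiplication needs exactly this kind of per-subdomain care:

```python
# Toy sketch of composing component reliabilities into a system figure,
# assuming independent failures (all numbers are invented for illustration).
def series_reliability(components):
    # P(system works) = product of component reliabilities (independence assumed).
    r = 1.0
    for rel in components:
        r *= rel
    return r

def profile_reliability(subdomain_reliabilities, usage_profile):
    # Reliability under an input profile: weight per-subdomain reliabilities
    # by the probability that an input falls in each sub-domain.
    return sum(p * r for p, r in zip(usage_profile, subdomain_reliabilities))

print(series_reliability([0.999, 0.995, 0.99]))          # three components in sequence
print(profile_reliability([0.999, 0.90], [0.95, 0.05]))  # rare but risky sub-domain
```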
APA, Harvard, Vancouver, ISO, and other styles
16

Kassa, Negede Abate. "Probabilistic safety analysis of dams." Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2010. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-60843.

Full text
Abstract:
Successful dam design involves generating technical solutions that can meet the intended functional objectives and choosing the best among the alternatives. Choosing the best alternative depends on evaluating design conformance with technical specifications and reliability standards (capacity, environmental, safety, social, political and other specifications), and on whether an optimal balance is struck between safety and economy. Evaluating alternative design solutions therefore requires a quantitative expression of lifetime performance and safety. An objective, numerical evaluation of the lifetime performance and safety of dams is an essential but complex undertaking. Its domain involves much uncertainty (in loads, hazards, strength parameters, boundary conditions, models and dam failure consequences), all of which should be characterized. Arguably, uncertainty models and risk analysis provide the most complete characterization of dam performance and safety issues. Risk is a combined measure of the probability and severity of an adverse effect (functional and/or structural failure), and is often estimated as the product of the probability of the adverse event occurring and the expected consequences. Thus, risk analysis requires (1) determination of failure probabilities and (2) probabilistic estimation of consequences. Nonetheless, there is no adequately demonstrated, satisfactorily comprehensive and precise method for the explicit treatment and integration of all uncertainties in the variables of dam design and risk analysis. There is therefore a need to evaluate existing uncertainty models for their applicability, to identify knowledge and realization gaps, to derive or adopt new approaches and tools, and to demonstrate their practicability using real-life case studies. This is required not only to improve the accuracy of the performance and safety evaluation process, but also to gain better acceptance of probabilistic approaches among those whose research and engineering careers have been built on deterministic design. These problems motivated this research, in which the following have been accomplished: (1) identified various ways of analyzing and representing uncertainty in dam design parameters pertinent to three dominant dam failure causes (sliding, overtopping and seepage), and tested a suite of stochastic models capable of capturing design-parameter uncertainty to better facilitate the evaluation of failure probabilities; (2) studied three classical stochastic models, the Monte Carlo Simulation Method (MCSM), First Order Second Moment (FOSM) and Second Order Second Moment (SOSM) methods, and applied them to modeling dam performance and evaluating failure probabilities for the above failure causes; (3) presented an exact, purpose-built analytical method for transforming design-parameter distributions into a distribution representing dam performance (the Analytical Solution for finding Derived Distributions (ASDD) method), laid out proofs of its basic principles, prepared a generic implementation architecture and demonstrated its applicability to the three failure modes using real-life case study data; (4) presented a set of tailor-made reliability equations and solution procedures that enable the implementation of the above stochastic and analytical methods for failure probability evaluation; (5) implemented the stochastic and analytical methods using real-life data pertinent to the three failure mechanisms from Tendaho Dam, Ethiopia, and compared the performance of the various stochastic and analytical methods with each other and with the classical deterministic design approach; and (6) provided solution procedures, implementation architectures, and Mathematica 5.2, Crystal Ball 7 and spreadsheet-based tools for the above analyses. The results indicate that: (1) the proposed approaches provide a valid set of procedures and internally consistent logic, and produce more realistic solutions; using them, engineers could design dams to meet a quantified level of performance (volume of failure) and could set a balance between safety and economy; (2) the research helps bridge the gap between the available probability theories on the one hand and the distribution problems encountered in dam safety evaluation on the other; (3) among the stochastic approaches studied, the ASDD method outperforms the classical methods (MCSM, FOSM and SOSM) in its theoretical foundation, accuracy and reproducibility, although each of the stochastic approaches, compared with the deterministic approach, provides a valid set of procedures and consistent logic and gives more realistic solutions, and it remains good practice to compare results across the proposed probabilistic approaches; (4) the tailor-made reliability equations and solution approaches are proven to work for the stochastic safety evaluation of dams; and (5) the research draws important conclusions and lessons in relation to the stochastic safety analysis of dams against the three dominant failure mechanisms. The end result of the study should provide dam engineers and decision makers with perspectives, methodologies, techniques and tools that help them better understand dam safety issues and enable them to conduct quantitative safety analyses, and thus make intelligent dam design, upgrading and rehabilitation decisions.
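For a single sliding limit state, the FOSM calculation reduces to a textbook reliability-index formula; a minimal sketch with invented load and resistance statistics (not the Tendaho Dam data):

```python
# FOSM-style sketch for a sliding limit state g = R - S
# (textbook first-order second-moment formula; numbers are illustrative).
from math import erf, sqrt

def phi_cdf(x):  # standard normal CDF via the error function
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

mu_R, sd_R = 1200.0, 150.0   # resisting force mean/std [kN] (assumed)
mu_S, sd_S = 800.0, 120.0    # sliding force mean/std [kN] (assumed)

beta = (mu_R - mu_S) / sqrt(sd_R**2 + sd_S**2)   # reliability index
print(f"beta = {beta:.2f}, P_f ≈ {phi_cdf(-beta):.2e}")
```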
APA, Harvard, Vancouver, ISO, and other styles
17

Natarajan, Iniyan. "Probabilistic methods for radio interferometry data analysis." Doctoral thesis, University of Cape Town, 2017. http://hdl.handle.net/11427/27243.

Full text
Abstract:
Probability theory provides a uniquely valid set of rules for plausible reasoning. This enables us to apply this mathematical formalism of probability, also known as Bayesian inference, with great flexibility to problems of scientific inference. In this thesis, we are concerned with applying this method to the analysis of visibility data from radio interferometers. Any radio interferometric observation can be described using the Radio Interferometry Measurement Equation (RIME). Throughout the thesis, we use the RIME to model the visibilities in performing the probabilistic analysis. We first develop the theory for employing the RIME in Bayesian analysis of interferometric data. We then apply this to the problem of super-resolution with radio interferometers by successfully performing model selection between different source structures, all smaller in scale than the point spread function (PSF) of the interferometer, on Westerbork Synthesis Radio Telescope (WSRT) simulations at a frequency of 1.4 GHz. Using simulations, we also quantify how the scale of the sources that can be resolved by WSRT at this frequency changes with the signal-to-noise ratio (SNR) of the data. Following this, we apply the method to a 5 GHz European VLBI Network (EVN) observation of the flaring blazar CGRaBS J0809+5341, to ascertain the presence of a jet emanating from its core, taking into account the imperfections in the station gain calibration performed on the data, especially on the longest baselines, prior to our analysis. We find that the extended source model is preferred over the point source model with an odds ratio of 109 : 1. Using the flux-density and shape parameter estimates of this model, we also derive the brightness temperature of the blazar (10¹¹-10¹² K), which confirms the presence of a relativistically boosted jet with an intrinsic brightness temperature lower than the apparent brightness temperature, consistent with the literature. We also develop a Bayesian criterion for super-resolution in the presence of baseline-dependent noise and calibration errors, and find that these errors play an important role in determining how close one can get to the theoretical super-resolution limit. We then proceed to include fringe-fitting, the process of solving for the time- and frequency-dependent phase variations introduced by the interstellar medium and the Earth's atmosphere, in our probabilistic approach. Fringe-fitting is one of the first corrections made to Very Long Baseline Interferometry (VLBI) observations, and, by extending our method to include simultaneous fringe-fitting and source structure estimation, we will be able to perform end-to-end VLBI analysis using our method. To this end, we estimate source amplitudes and fringe-fitting phase terms (phase offsets and delays) on 43 GHz Very Long Baseline Array and 230 GHz Event Horizon Telescope (EHT) simulations of point sources. We then perform model selection on a 5 μas extended Gaussian source (one-fourth the size of the PSF) in a synthetic 230 GHz EHT observation. Finally, we incorporate turbulent time-varying phase offsets and delays in our model selection and show that the delays can be estimated to within 10-16 per cent error (often better than contemporary software packages) while simultaneously estimating the extended source structure.
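The model-selection step can be caricatured with a toy evidence calculation: a grid-integrated Bayes factor for "point source" versus "Gaussian source" visibility-amplitude models, with synthetic data and uniform priors. This is a schematic of the approach, not the thesis pipeline:

```python
# Toy Bayes-factor sketch: point-source vs Gaussian-source visibility models.
import numpy as np

rng = np.random.default_rng(5)

# Synthetic amplitudes: a slightly resolved Gaussian source plus noise.
b = np.linspace(0.1, 1.0, 30)                  # baseline length (arbitrary units)
true_flux, true_size = 1.0, 0.8
sigma = 0.03
y = true_flux * np.exp(-(b * true_size) ** 2) + rng.normal(0, sigma, b.size)

def log_like(model):
    return (-0.5 * np.sum(((y - model) / sigma) ** 2)
            - y.size * np.log(sigma * np.sqrt(2 * np.pi)))

flux_grid = np.linspace(0.5, 1.5, 200)

# Evidence of the point-source model: marginalise over flux only.
L1 = [np.exp(log_like(f * np.ones_like(b))) for f in flux_grid]
Z1 = np.trapz(L1, flux_grid) / (flux_grid[-1] - flux_grid[0])

# Evidence of the Gaussian model: marginalise over flux and angular size.
size_grid = np.linspace(0.0, 2.0, 200)
L2 = [[np.exp(log_like(f * np.exp(-(b * s) ** 2))) for f in flux_grid]
      for s in size_grid]
Z2 = np.trapz(np.trapz(L2, flux_grid, axis=1), size_grid) / (
    (flux_grid[-1] - flux_grid[0]) * (size_grid[-1] - size_grid[0]))

print("odds (Gaussian vs point) ≈", Z2 / Z1)
```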
APA, Harvard, Vancouver, ISO, and other styles
18

Tsang, Hing-ho. "Probabilistic seismic hazard assessment direct amplitude-based approach /." Click to view the E-thesis via HKUTO, 2006. http://sunzi.lib.hku.hk/hkuto/record/B36783456.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Tsang, Hing-ho, and 曾慶豪. "Probabilistic seismic hazard assessment: direct amplitude-based approach." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2006. http://hub.hku.hk/bib/B36783456.

Full text
Abstract:
The Best PhD Thesis in the Faculties of Dentistry, Engineering, Medicine and Science (University of Hong Kong), Li Ka Shing Prize, 2005-2006.
Civil Engineering
Doctoral
Doctor of Philosophy
APA, Harvard, Vancouver, ISO, and other styles
20

POZZI, FEDERICO ALBERTO. "Probabilistic Relational Models for Sentiment Analysis in Social Networks." Doctoral thesis, Università degli Studi di Milano-Bicocca, 2015. http://hdl.handle.net/10281/65709.

Full text
Abstract:
The huge amount of textual data on the Web has grown rapidly in the last few years, creating unique content of massive dimensions that constitutes fertile ground for Sentiment Analysis. In particular, social networks represent an emerging and challenging sector where the natural-language expressions of people are easily reported through short but meaningful text messages. This unprecedented content of huge dimensions needs to be efficiently and effectively analyzed to create actionable knowledge for decision-making processes. A key piece of information that can be grasped from social environments relates to the polarity of text messages, i.e. the sentiment (positive, negative or neutral) that the messages convey. However, most works on polarity classification usually consider text as the unique source of information for inferring sentiment, not taking into account that social networks are actually networked environments. A representation of real-world data in which instances are considered homogeneous, independent and identically distributed (i.i.d.) leads to a substantial loss of information and to the introduction of a statistical bias. For this reason, the combination of content and relationships is a core task in the recent literature on Sentiment Analysis, where friendships are usually investigated to model the principle of homophily (contact among similar people occurs at a higher rate than among dissimilar people). However, paired with the assumption of homophily, constructuralism explains how social relationships evolve via dynamic and continuous interactions as the knowledge and behavior that two actors share increase. Considering the similarity among users on the basis of constructuralism appears to be a much more powerful force than interpersonal influence within the friendship network. As its first contribution, this Ph.D. thesis proposes the Approval Network as a novel graph representation that jointly models homophily and constructuralism, intended to better represent contagion on social networks. Starting from the classical state-of-the-art methodologies in which only text is used to infer the polarity of social network messages, this thesis presents novel Probabilistic Relational Models at user, document and aspect level which integrate structural information to improve classification performance. The integration is particularly useful when textual features do not provide sufficient or explicit information to infer sentiment (e.g., "I agree!"). The experimental investigations reveal that incorporating network information through approval relations can lead to statistically significant improvements over the performance of complex learning approaches based only on textual features.
APA, Harvard, Vancouver, ISO, and other styles
21

Bagheri, Mehdi. "Block stability analysis using deterministic and probabilistic methods." Doctoral thesis, KTH, Jord- och bergmekanik, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-49447.

Full text
Abstract:
This thesis presents a discussion of design tools for analysing block stability around a tunnel. First, it was determined that joint length and field stress have a significant influence on estimates of block stability. The results of calculations using methods based on kinematic limit equilibrium (KLE) were compared with the results of filtered DFN-DEM, which are closer to reality. The comparison shows that none of the KLE approaches (conventional, limited joint length, limited joint length with stress, and probabilistic KLE) could provide results similar to DFN-DEM. This is due to KLE's unrealistic assumptions in estimating either volume or clamping forces. A simple mechanism for estimating clamping forces, such as continuum mechanics or the solution proposed by Crawford-Bray, leads to an overestimation of clamping forces, and thus unsafe design. The results of such approaches were compared to those of DEM, and it was determined that these simple mechanisms ignore a key stage of relaxation of clamping forces due to the existence of joints. The amount of relaxation is a function of many parameters, such as the stiffness of the joint and the surrounding rock, the joint friction angle and the block half-apical angle. Based on a conceptual model, this key stage was incorporated into a new analytical solution for symmetric blocks, and the amount of joint relaxation was quantified. The results of the new analytical solution were compared to those of DEM, and the model uncertainty of the new solution was quantified. Further numerical investigations based on local and regional stress models were performed to study initial clamping forces. The numerical analyses reveal that local stresses, which are a product of regional stress and joint stiffness, govern block stability. Models with a block assembly show that the clamping forces in a block assembly are equal to the clamping forces in a regional stress model. Therefore, considering a single block in massive rock results in lower clamping forces, and thus safer design, compared to a block assembly under the same in-situ stress conditions and properties. Furthermore, a sensitivity analysis was conducted to determine the most important parameter by assessing sensitivity factors, and to study the applicability of the partial coefficient method for designing block stability. It was determined that the governing parameter is the dispersion of the half-apical angle. For a dip angle with high dispersion, the partial factors become very large and the design value for the clamping forces is close to zero. This suggests that in cases with a high dispersion of the half-apical angle, the clamping forces could be ignored in a stability analysis, unlike in cases with lower dispersion. The cost of gathering more information about the joint dip angle could be compared with the cost of overdesign. The use of partial factors is uncertain, at least without dividing the problem into sub-classes. The application of partial factors is possible in some circumstances but not always, and a FORM analysis is preferable.
APA, Harvard, Vancouver, ISO, and other styles
22

Khan, Khader A. "Probabilistic Stress Analysis of Liquid Storage Tank." Cleveland State University / OhioLINK, 2010. http://rave.ohiolink.edu/etdc/view?acc_num=csu1271639817.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Kaowichakorn, Peerachai. "Probabilistic Analysis of Quality of Service." Thesis, Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-4880.

Full text
Abstract:
Current complex service systems are usually composed of many components, which are often external services performing particular tasks. Quality of service (QoS) attributes such as availability, cost and response time are essential in determining the usability and efficiency of such a system. Obviously, the QoS of such a compound system depends on the QoS of its components. However, the QoS of each component is naturally unstable and differs each time it is called, due to many factors such as network bandwidth, workload and hardware resources. This consequently makes the QoS of the whole system unstable. This uncertainty can be described and represented with probability distributions. This thesis presents an approach to calculating the QoS of a system when the probability distributions of the QoS of each component are provided by the service provider or derived from historical data, along with the structure of their composition. In addition, an analyzer tool is implemented in order to predict the QoS of the given compositions and probability distributions following the proposed approach. The output of the analyzer can be used to predict the behavior of the system to be implemented and to make decisions based on the expected performance. The experimental evaluation shows that the estimation is reliable, with a minimal and acceptable error measurement.
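The composition rules for two services called in sequence can be sketched directly; the gamma-shaped response-time densities and the availability figures are assumed inputs for the illustration:

```python
# Sketch of composing QoS distributions for two services called in sequence:
# response times add (distribution convolution), availabilities multiply.
import numpy as np
from math import gamma

dt = 1.0                                   # time grid step [ms]
t = np.arange(0, 500, dt)

def gamma_pdf(t, k, theta):
    # Gamma density: an assumed shape for per-service response times.
    return t ** (k - 1) * np.exp(-t / theta) / (gamma(k) * theta ** k)

pdf_a = gamma_pdf(t, 2.0, 20.0)            # service A response-time density
pdf_b = gamma_pdf(t, 3.0, 15.0)            # service B response-time density

pdf_seq = np.convolve(pdf_a, pdf_b)[: t.size] * dt   # A then B: times add
avail = 0.999 * 0.995                                 # independent availabilities

mean_rt = np.sum(t * pdf_seq) * dt
print(f"composed availability = {avail:.4f}, mean response ≈ {mean_rt:.1f} ms")
```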
APA, Harvard, Vancouver, ISO, and other styles
24

Bandini, Samantha. "WiFi Analytics e Indoor Positioning: analisi di un caso di studio e valutazione del rispetto della privacy." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2019.

Find full text
Abstract:
This work proposes an innovative approach to protecting the privacy of users' sensitive data in the context of WiFi Analytics. The proposed method relies on probabilistic data structures belonging to the Bloom filter family, in particular Spatial Bloom Filters and Counting Bloom Filters, and makes it possible to store the information needed to measure insights while hiding users' private data, i.e. in a privacy-preserving manner. To validate the approach, a software prototype was developed and several experiments were conducted on a real case study using Cisco Meraki access points. The work discusses the results in detail, demonstrating the soundness of the approach as well as its wide scope for customisation to meet many different application needs.
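A minimal counting Bloom filter of the family the thesis builds on can be sketched as follows; this is a generic implementation with a hypothetical MAC-address lookup, not the prototype's SBF/CBF code:

```python
# Minimal counting Bloom filter sketch (generic data structure, illustrative).
import hashlib

class CountingBloomFilter:
    def __init__(self, size=1024, n_hashes=4):
        self.size, self.n_hashes = size, n_hashes
        self.counters = [0] * size

    def _indexes(self, item):
        # Derive k independent indexes by salting a cryptographic hash.
        for i in range(self.n_hashes):
            h = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(h, 16) % self.size

    def add(self, item):
        for idx in self._indexes(item):
            self.counters[idx] += 1

    def remove(self, item):          # the counting variant supports deletion
        for idx in self._indexes(item):
            self.counters[idx] -= 1

    def contains(self, item):        # may yield false positives, never false negatives
        return all(self.counters[idx] > 0 for idx in self._indexes(item))

cbf = CountingBloomFilter()
cbf.add("aa:bb:cc:dd:ee:ff")         # e.g., a device MAC seen by an access point
print(cbf.contains("aa:bb:cc:dd:ee:ff"), cbf.contains("11:22:33:44:55:66"))
```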
APA, Harvard, Vancouver, ISO, and other styles
25

Cruz, Fernández Francisco. "Probabilistic graphical models for document analysis." Doctoral thesis, Universitat Autònoma de Barcelona, 2016. http://hdl.handle.net/10803/399520.

Full text
Abstract:
Currently, more than 80% of the documents stored on paper belong to the business field. Advances in digitization techniques have fostered the interest in creating digital copies in order to solve maintenance and storage problems, as well as to have efficient ways for transmission and automatic extraction of the information contained therein. This situation has led to the need to create systems that can automatically extract and analyze this kind of information. The great variety of types of documents makes this a non-trivial task. The process of extracting numerical data from tables or invoices differs substantially from handwriting recognition in a document with annotations. However, there is a common link between the two tasks: given a document, we need to identify the region where the information of interest is located. In the area of Document Analysis this process is called Layout Analysis, and it aims at identifying and categorizing the different entities that compose the document. These entities can be text regions, pictures, text lines or tables, among others. This process can be done from two different approaches: physical or logical analysis. Physical analysis focuses on identifying the physical boundaries that define the area of interest, whereas logical analysis also models information about the role and semantics of the entities within the scope of the document. To encode this information it is necessary to incorporate prior knowledge about the task into the analysis process, which can be introduced in terms of contextual relations between entities. The use of context has proven to be useful to reinforce the recognition process and improve the results on many computer vision tasks. It raises two fundamental questions: what kind of contextual information is appropriate, and how to incorporate this information into the model. In this thesis we study several ways to incorporate contextual information into the task of document layout analysis. We focus on the study of Probabilistic Graphical Models and other mechanisms for the inclusion of contextual relations applied to the specific tasks of region identification and handwritten text line segmentation. On the one hand, we present several methods for region identification. First, we present a method for layout analysis based on Conditional Random Fields for maximum a posteriori estimation. We encode a set of structural relations between different classes of regions in a set of features. Second, we present a method based on 2D Probabilistic Context-Free Grammars and perform a comparative study between probabilistic graphical models and this syntactic approach. Third, we propose a statistical approach based on the Expectation-Maximization algorithm devised for structured documents. We perform a thorough evaluation of the proposed methods on two particular collections of documents: a historical dataset composed of ancient structured documents, and a collection of contemporary documents. On the other hand, we present a probabilistic framework applied to the task of handwritten text line segmentation. We successfully combine the EM algorithm and variational approaches for this purpose. We demonstrate that the use of contextual information using probabilistic graphical models is of great utility for these tasks.
APA, Harvard, Vancouver, ISO, and other styles
26

De, Biasio Marco. "Ground motion intensity measures for seismic probabilistic risk analysis." Thesis, Grenoble, 2014. http://www.theses.fr/2014GRENI051/document.

Full text
Abstract:
A fundamental question that arises in the framework of seismic probabilistic risk analysis is the choice of intensity measures for seismic ground motions. In addition to reducing the variability of the structural (or non-structural) response, an improved intensity measure (i.e. one better able to capture the damaging features of ground motions, as well as the site hazard) allows less strict criteria for the selection of seismic records. Two new intensity measures are proposed in this study: the first, named ASAR (Relative Average Spectral Acceleration), is designed for the prediction of structural demand; the second, named E-ASAR (Equipment Relative Average Spectral Acceleration), aims to predict the demand on non-structural components. The performance of the proposed measures is compared with that of measures from the literature, on the basis of: a) thousands of seismic records; b) numerical analyses conducted with models representing different building types; and c) rigorous statistical analyses of the results. According to the comparative study, the developed measures prove to be more "efficient" than the measures in common use. Moreover, ASAR and E-ASAR exhibit the characteristic of "sufficiency" with respect to magnitude, source-to-site distance, and soil type (Vs30). In addition, both new measures can be computed simply from knowledge of the building's fundamental frequency. This characteristic makes ASAR and E-ASAR readily usable in probabilistic seismic hazard studies. Consequently, owing to their efficiency, sufficiency, robustness and simple formulation, ASAR and E-ASAR can be considered promising candidates for defining the seismic hazard within the frameworks of both probabilistic and deterministic seismic risk analysis.
A fundamental issue that arises in the framework of Probabilistic Seismic Risk Analysis is the choice of ground motion Intensity Measures (IMs). In addition to reducing record-to-record variability, an improved IM (i.e. one able to better capture the damaging features of a record, as well as the site hazard) loosens the restrictions on the criteria for selecting input ground motions. Two new structure-specific IMs are proposed in this study: the first, namely ASAR (Relative Average Spectral Acceleration), is conceived for structural demand prediction; the second, namely E-ASAR (Equipment-Relative Average Spectral Acceleration), aims to predict the acceleration demand on non-structural components. The performance of the proposed IMs is compared with that of current IMs, based on: a) a large dataset of thousands of recorded earthquake ground motions; b) numerical analyses conducted with state-of-the-art FE models, representing actual load-bearing walls and frame structures, and validated against experimental tests; and c) systematic statistical analyses of the results. According to the comparative study, the introduced IMs prove to be considerably more "efficient" than the IMs currently used. Likewise, both ASAR and E-ASAR have been shown to possess the characteristic of "sufficiency" with respect to magnitude, source-to-site distance and soil type (Vs30). Furthermore, both of the introduced IMs require merely the knowledge of the building's fundamental frequency in order to be computed, exactly as for the widespread spectral acceleration Spa(f1). This key characteristic makes both ASAR and E-ASAR easily exploitable in Probabilistic Seismic Hazard Analysis. Therefore, due to their proven efficiency, sufficiency, robustness and applicable formulation, both ASAR and E-ASAR can be considered worthy candidates for defining seismic hazard within the frameworks of both Probabilistic and Deterministic Seismic Risk Analysis.
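As a rough illustration of the kind of quantity this abstract describes, the sketch below computes a relative average spectral acceleration from a response spectrum and a building's fundamental frequency. The averaging band and the normalisation by Spa(f1) are illustrative assumptions, not the thesis' exact definition of ASAR.

```python
import numpy as np

def asar(freqs, spa, f1, band=(1.0, 3.0)):
    """Illustrative relative average spectral acceleration.

    freqs : 1-D array of spectrum frequencies (Hz), increasing
    spa   : pseudo-spectral accelerations at those frequencies
    f1    : fundamental frequency of the building (Hz)
    band  : multiples of f1 over which to average (an assumption,
            not the thesis' exact definition)
    """
    lo, hi = band[0] * f1, band[1] * f1
    mask = (freqs >= lo) & (freqs <= hi)
    # Average spectral ordinate over the band, normalised by Spa(f1)
    spa_f1 = np.interp(f1, freqs, spa)
    return spa[mask].mean() / spa_f1

# Toy response spectrum: smooth, with a bump near 2 Hz
freqs = np.linspace(0.2, 25.0, 500)
spa = np.exp(-0.1 * freqs) * (1 + 2 * np.exp(-((freqs - 2.0) ** 2)))
print(asar(freqs, spa, f1=1.8))
```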
APA, Harvard, Vancouver, ISO, and other styles
27

Sproston, Jeremy James. "Model checking of probabilistic timed and hybrid systems." Thesis, University of Birmingham, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.391021.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Bonilha, Murilo Weingarten. "A hybrid deterministic-probabilistic model for vibroacoustic studies." Thesis, University of Southampton, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.242537.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Pilla, Srikanth. "Integration of Micromechanical and Probabilistic Analysis Models of Nanocomposites." University of Toledo / OhioLINK, 2005. http://rave.ohiolink.edu/etdc/view?acc_num=toledo1134422032.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Lieswyn, John. "Probabilistic Risk Analysis in Transport Project Economic Evaluation." Thesis, University of Canterbury. Civil and Natural Resources Engineering, 2012. http://hdl.handle.net/10092/7652.

Full text
Abstract:
Transport infrastructure investment decision making is typically based on a range of inputs such as social, environmental and economic factors. The benefit cost ratio (BCR), a measure of economic efficiency (“value for money”) determined through cost benefit analysis (CBA), is dependent on accurate estimates of the various option costs and net social benefits such as reductions in travel time, accidents, and vehicle operating costs. However, most evaluations are deterministic procedures using point estimates for the inputs and producing point estimates for the outputs. Transport planners have primarily focused on the cost risks and treat risk through sensitivity testing. Probabilistic risk analysis techniques are available which could provide more information about the statistical confidence of the economic evaluation outputs. This research project report investigated how risk and uncertainty are dealt with in the literature and guidelines. The treatment of uncertainty in the Nelson Arterial Traffic Study (ATS) was reviewed and an opportunity to apply risk analysis to develop probabilities of sea level rise impacting on the coastal road options was identified. A simplified transport model and economic evaluation case study based on the ATS was developed in Excel to enable the application of @RISK Monte Carlo simulation software. The simplifications mean that the results are not comparable with the ATS. Seven input variables and their likely distributions were defined for simulation based on the literature review. The simulation of seven variables, five worksheets, and 10,000 iterations takes about 30 seconds of computation time. The input variables in rank order of influence on the BCR were capital cost, car mode share, unit vehicle operating cost, basic employment forecast growth rate, and unit value of time cost. The deterministically derived BCR of 0.75 is associated with a 50% chance that the BCR will be less than 0.6, although this probability is partly based on some statistical parameters without an empirical basis. In practice, probability distribution fitting to appropriate datasets should be undertaken to better support probabilistic risk analysis conclusions. Probabilities for different confidence levels can be reported to suit the risk tolerance of the decision makers. It was determined that the risk analysis approach is feasible and can produce useful outputs, given a clear understanding of the data inputs and their associated distributions.
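The probabilistic evaluation described in this abstract lends itself to a compact sketch: sample the uncertain inputs, compute the BCR for each iteration, and read off exceedance probabilities. The distributions and figures below are illustrative assumptions, not the study's data (which used @RISK over seven variables and five worksheets).

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000  # iterations, as in the study

# Illustrative input distributions (shapes and parameters are
# assumptions, not the values used in the thesis)
capital_cost = rng.lognormal(mean=np.log(50e6), sigma=0.25, size=n)
travel_time_benefit = rng.normal(30e6, 8e6, size=n)
voc_benefit = rng.triangular(2e6, 5e6, 9e6, size=n)  # vehicle operating costs

benefits = travel_time_benefit + voc_benefit
bcr = benefits / capital_cost

print(f"mean BCR       : {bcr.mean():.2f}")
print(f"P(BCR < 0.6)   : {(bcr < 0.6).mean():.1%}")
print(f"5th percentile : {np.percentile(bcr, 5):.2f}")
```

The percentile reported can be chosen to match the risk tolerance of the decision makers, as the abstract suggests.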
APA, Harvard, Vancouver, ISO, and other styles
31

Chrszon, Philipp, Clemens Dubslaff, Sascha Klüppelholz, and Christel Baier. "Family-Based Modeling and Analysis for Probabilistic Systems." Springer, 2016. https://tud.qucosa.de/id/qucosa%3A70790.

Full text
Abstract:
Feature-based formalisms provide an elegant way to specify families of systems that share a base functionality and differ in certain features. They can also facilitate an all-in-one analysis, where all systems of the family are analyzed at once on a single family model instead of one-by-one. This paper presents the basic concepts of the tool ProFeat, which provides a guarded-command language for modeling families of probabilistic systems and an automatic translation of family models to the input language of the probabilistic model checker PRISM. This translational approach enables a family-based quantitative analysis with PRISM. Besides modeling families of systems that differ in system parameters such as the number of identical processes or channel sizes, ProFeat also provides special support for the modeling and analysis of (probabilistic) product lines with dynamic feature switches, multi-features and feature attributes. By means of several case studies we show how ProFeat eases family-based modeling and compare the one-by-one and all-in-one analysis approach.
APA, Harvard, Vancouver, ISO, and other styles
32

Madhira, Venkata Sridhar. "PROBABILISTIC STRESS ANALYSIS OF CIRCULAR FINS OF DIFFERENT PROFILES." Cleveland State University / OhioLINK, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=csu1307567164.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Aytemiz, Tevfik. "A Probabilistic Study of 3-SATISFIABILITY." Diss., Virginia Tech, 2001. http://hdl.handle.net/10919/28202.

Full text
Abstract:
Discrete optimization problems are defined by a finite set of solutions together with an objective function value assigned to each solution. Local search algorithms provide useful tools for addressing a wide variety of intractable discrete optimization problems. Each such algorithm offers a distinct set of rules to intelligently exploit the solution space with the hope of finding an optimal/near optimal solution using a reasonable amount of computing time. This research studies and analyses randomly generated instances of 3-SATISFIABILITY to gain insights into the structure of the underlying solution space. Two random variables are defined and analyzed to assess the probability that a fixed solution will be assigned a particular objective function value in a randomly generated instance of 3-SATISFIABILITY. Then, a random vector is defined and analyzed to investigate how the solutions in the solution space are distributed over their objective function values. These results are then used to define a stopping criterion for local search algorithms applied to MAX 3-SATISFIABILITY. This research also analyses and compares the effectiveness of two local search algorithms, tabu search and random restart local search, on MAX 3-SATISFIABILITY. Computational results with tabu search and random restart local search on randomly generated instances of 3-SATISFIABILITY are reported. These results suggest that, given a limited computing budget, tabu search offers an effective alternative to random restart local search. On the other hand, these two algorithms yield similar results in terms of the best solution found. The computational results also suggest that for randomly generated instances of 3-SATISFIABILITY (of the same size), the globally optimal solution objective function values are typically concentrated over a narrow range.
Ph. D.
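The random variables analyzed in this abstract can be explored empirically: fix a solution, generate random 3-SATISFIABILITY instances, and record the objective value (number of satisfied clauses). A minimal sketch, with instance sizes chosen arbitrarily:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_3sat(n_vars, n_clauses, rng):
    # Each clause: 3 distinct variables, each with a random sign
    clauses = []
    for _ in range(n_clauses):
        vars_ = rng.choice(n_vars, size=3, replace=False)
        signs = rng.choice([True, False], size=3)
        clauses.append(list(zip(vars_, signs)))
    return clauses

def satisfied(clauses, assignment):
    # Objective value: number of clauses with at least one true literal
    return sum(
        any(assignment[v] == s for v, s in clause) for clause in clauses
    )

n_vars, n_clauses = 50, 218
fixed = np.zeros(n_vars, dtype=bool)  # the fixed solution
counts = [
    satisfied(random_3sat(n_vars, n_clauses, rng), fixed)
    for _ in range(2000)
]
# A clause misses a fixed assignment w.p. (1/2)^3, so the mean should
# be near (7/8) * n_clauses ~= 190.75, concentrated in a narrow range
print(np.mean(counts), np.std(counts))
```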
APA, Harvard, Vancouver, ISO, and other styles
34

Mandelli, Diego. "SCENARIO CLUSTERING AND DYNAMIC PROBABILISTIC RISK ASSESSMENT." The Ohio State University, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=osu1306438099.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Fayad, Ghassan Najib. "Probabilistic Finite Element Analysis of Marine Grade Composites." Fogler Library, University of Maine, 2005. http://www.library.umaine.edu/theses/pdf/FayadGN2005.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Qin, Xinzhou. "A Probabilistic-Based Framework for INFOSEC Alert Correlation." Diss., Georgia Institute of Technology, 2005. http://hdl.handle.net/1853/7278.

Full text
Abstract:
Deploying a large number of information security (INFOSEC) systems can provide in-depth protection for systems and networks. However, the sheer number of security alerts output by security sensors can overwhelm security analysts and prevent effective analysis and timely response. Therefore, alert correlation is the core component of a security management system. Most existing alert correlation techniques depend on a priori, hard-coded domain knowledge, which limits their ability to detect new attack strategies. These approaches also focus more on the aggregation and analysis of raw security alerts, and build basic or low-level attack scenarios. This thesis focuses on discovering novel attack strategies through the analysis of security alerts. Our framework helps security administrators aggregate redundant alerts, intelligently correlate security alerts, analyze attack strategies, and take appropriate actions against forthcoming attacks. In alert correlation, we have developed an integrated correlation system with three complementary correlation mechanisms. We have developed a probabilistic-based correlation engine that incorporates domain knowledge to correlate alerts that have a direct causal relationship. We have developed statistical analysis-based and temporal analysis-based correlation engines to discover attack transition patterns in which attack steps do not have a direct causal relationship in terms of security and performance measures but exhibit statistical and temporal patterns. We construct attack scenarios and conduct attack path analysis based on the correlation results. Security analysts are presented with aggregated information on attack strategies from the integrated correlation system. In attack plan recognition, we address the challenges of identifying the attacker's high-level strategies and intentions as well as predicting upcoming attacks. We apply graph-based techniques to correlate isolated attack scenarios derived from low-level alert correlation based on their relationship in attack plans. We conduct probabilistic inference to evaluate the likelihood of attack goal(s) and predict potential upcoming attacks based on observed attack activities. We evaluate our algorithms using DARPA's Grand Challenge Problem (GCP) data sets and live traffic data collected from our backbone network. The results show that our approach can effectively discover novel attack strategies, provide a quantitative analysis of attack scenarios and identify attack plans.
APA, Harvard, Vancouver, ISO, and other styles
37

Palhares, André Vitor de Almeida. "Probabilistic Risk Assessment in Clouds: Models and Algorithms." Universidade Federal de Pernambuco, 2012. https://repositorio.ufpe.br/handle/123456789/10423.

Full text
Abstract:
Cloud reliability is critical to the success of cloud computing. Although fault-tolerance mechanisms are employed by cloud providers, there is always the possibility of failure of infrastructure components. We consequently need to think proactively about how to deal with the occurrence of failures, in an attempt to minimize their effects. In this work, we draw on the risk concept from probabilistic risk analysis in order to achieve this. In probabilistic risk analysis, consequence costs are associated with failure events of the target system, and failure probabilities are associated with infrastructural components. The risk is the expected consequence of the whole system. We use the risk concept in order to present representative mathematical models for which computational optimization problems are formulated and solved, in a Cloud Computing environment. In these problems, consequence costs are associated with incoming applications that must be allocated in the Cloud, and the risk is either seen as an objective function that must be minimized or as a constraint that should be limited. The proposed problems are solved either by optimal algorithm reductions or by approximation algorithms with provable performance guarantees. Finally, the models and problems are discussed from a more practical point of view, with examples of how to assess risk using these solutions. The solutions are also evaluated and results on their performance are established, showing that they can be used in the effective planning of the Cloud.
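The risk notion described in this abstract reduces to an expected-consequence computation over an allocation of applications to infrastructure. A minimal sketch, with hypothetical failure probabilities and consequence costs, treating risk as an objective to minimize:

```python
from itertools import product

# Illustrative numbers only: component failure probabilities and
# per-application consequence costs are assumptions
failure_prob = {"node_a": 0.02, "node_b": 0.05}
consequence = {"app_1": 1000.0, "app_2": 250.0}

def risk(allocation):
    """Expected consequence of an allocation app -> node."""
    return sum(
        failure_prob[node] * consequence[app]
        for app, node in allocation.items()
    )

# Brute-force search for the risk-minimizing allocation (a stand-in
# for the optimal reductions / approximation algorithms of the thesis)
nodes, apps = list(failure_prob), list(consequence)
best = min(
    (dict(zip(apps, combo)) for combo in product(nodes, repeat=len(apps))),
    key=risk,
)
print(best, risk(best))
```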
APA, Harvard, Vancouver, ISO, and other styles
38

Hua, Ke Qian. "Probabilistic power system contingency analysis considering wind." Thesis, Queensland University of Technology, 2015. https://eprints.qut.edu.au/79903/1/Ke%20Qian_Hua_Thesis.pdf.

Full text
Abstract:
This thesis was a step forward in developing probabilistic assessment of power system response to faults in the presence of intermittent renewable generation. It investigated the effect of wind power fluctuations on power system stability, and the fast estimation process developed has demonstrated its feasibility for real-time implementation. A better balance between power network security and efficiency can be achieved based on this research outcome.
APA, Harvard, Vancouver, ISO, and other styles
39

Ma, Zheng. "Probabilistic Boolean network modeling for fMRI study in Parkinson's disease." Thesis, University of British Columbia, 2008. http://hdl.handle.net/2429/4172.

Full text
Abstract:
Recent research has suggested that disrupted interactions between brain regions may contribute to some of the symptoms of motor disorders such as Parkinson's Disease (PD). It is therefore important to develop models for inferring brain functional connectivity from data obtained through non-invasive imaging technologies, such as functional magnetic resonance imaging (fMRI). The complexity of brain activities as well as the dynamic nature of motor disorders require such models to be able to perform complex, large-scale, and dynamic system computation. Traditional models proposed in the literature such as structural equation modeling (SEM), multivariate autoregressive models (MAR), dynamic causal modeling (DCM), and dynamic Bayesian networks (DBNs) have all been suggested as suitable for fMRI data analysis. However, they suffer from their own disadvantages, such as high computational cost (e.g. DBNs), inability to deal with the non-linear case (e.g. MAR), and large sample size requirements (e.g. SEM). In this research, we propose applying Probabilistic Boolean Networks (PBNs) for modeling brain connectivity due to their solid stochastic properties, computational simplicity, robustness to uncertainty, and capability to deal with the small-size data typical of fMRI data sets. Applying the proposed PBN framework to real fMRI data recorded from PD subjects enables us to identify statistically significant abnormality in PD connectivity by comparing it with normal subjects. The PBN results also suggest a mechanism for evaluating the effectiveness of L-dopa, the principal treatment for PD. In addition to the promising application of PBNs in inferring brain connectivity, PBN modeling of brain ROIs also enables researchers to study dynamic activities of the system under stochastic conditions, gaining essential information regarding asymptotic behaviors of ROIs for potential therapeutic intervention in PD. The results indicate significant differences in feature states between PD patients and normal subjects. Hypothesizing the observed feature states for normal subjects as the desired functional states, we further explore possible methods to manipulate the dynamic network behavior of PD patients in favor of the desired states from the viewpoint of random perturbation as well as intervention. The results identified a target ROI with the best intervention performance, and that ROI is a potential candidate for therapeutic exercise.
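A probabilistic Boolean network of the kind used in this thesis can be simulated in a few lines: at each step every node selects one of its candidate Boolean predictors at random and applies it to the current state. The toy network below (three nodes with made-up predictors and selection probabilities) estimates the most visited state as a crude stand-in for steady-state analysis:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy PBN over 3 nodes (ROIs). Predictor functions and their
# selection probabilities are illustrative assumptions.
predictors = {
    0: [(lambda s: s[1] and s[2], 0.7), (lambda s: not s[1], 0.3)],
    1: [(lambda s: s[0] or s[2], 1.0)],
    2: [(lambda s: s[0] != s[1], 0.6), (lambda s: s[2], 0.4)],
}

def step(state):
    new = []
    for node, funcs in predictors.items():
        fs, ps = zip(*funcs)
        f = fs[rng.choice(len(fs), p=np.array(ps))]  # pick a predictor
        new.append(f(state))
    return tuple(new)

# Estimate the long-run distribution by simulation
state, visits = (True, False, True), {}
for _ in range(10_000):
    state = step(state)
    visits[state] = visits.get(state, 0) + 1
print(max(visits, key=visits.get))  # most visited (feature) state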
APA, Harvard, Vancouver, ISO, and other styles
40

Azmi, Mastura Binti. "STUDY ON SLOPE STABILITY OF PENANG ISLAND CONSIDERING EARTHQUAKE AND RAINFALL EFFECTS." 京都大学 (Kyoto University), 2014. http://hdl.handle.net/2433/188539.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Sehhati, Reza. "Probabilistic seismic demand analysis for the near-fault zone." Pullman, Wash. : Washington State University, 2008. http://www.dissertations.wsu.edu/Dissertations/Fall2008/r_sehhati_120108.pdf.

Full text
Abstract:
Thesis (Ph. D.)--Washington State University, December 2008.
Title from PDF title page (viewed on Oct. 22, 2009). "Department of Civil & Environmental Engineering." Includes bibliographical references (p. 166-171).
APA, Harvard, Vancouver, ISO, and other styles
42

Lu, Yuan-Chiao. "Probabilistic Analysis of the Material and Shape Properties for Human Liver." Diss., Virginia Tech, 2014. http://hdl.handle.net/10919/64798.

Full text
Abstract:
Realistic assessments of liver injury risk for the entire occupant population require incorporating inter-subject variations into numerical human models. The main objective of this study was to quantify the variations in shape and material properties of the human liver. Statistical shape analysis was applied to analyze the geometrical variation using a surface set of 15 adult human livers recorded in an occupant posture. Principal component analysis was then utilized to obtain the modes of variation, the mean model, and a set of 95% statistical boundary shape models. Specimen-specific finite element (FE) models were employed to quantify material and failure properties of human liver parenchyma. The mean material model parameters were then determined, and a stochastic optimization approach was utilized to determine the standard deviations of the material model parameters. The distributions of the material parameters were used to develop probabilistic FE models of the liver implemented in THUMS human FE model to simulate oblique impact tests under three impact speeds. In addition, the influence of organ preservation on the biomechanical responses of animal livers was investigated using indentation and tensile tests. Results showed that the first five modes of the human liver shape models accounted for more than 70% of the overall anatomical variations. The Ogden material model with two parameters showed a good fit to experimental tensile data before failure. Significant changes of the biomechanical responses of liver parenchyma were found after cooling or freezing storage. The force-deflection responses of THUMS model with probabilistic liver material models were within the test corridors obtained from cadaveric tests. Significant differences were observed in the maximum and minimum principal Green-Lagrangian strain values recorded in the THUMS liver model with the default and updated average material properties. The results from this study could help in the development of more biofidelic human models, which may provide a better understanding of injury mechanisms of the liver during automobile collisions.
Ph. D.
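The statistical shape analysis summarized above (mean model, PCA modes of variation, 95% boundary shapes) can be sketched directly with an SVD. The random array below merely stands in for the 15 corresponded liver surfaces, which in practice come from segmentation and correspondence steps not shown here:

```python
import numpy as np

rng = np.random.default_rng(7)
n_subjects, n_points = 15, 1000
# Stand-in for corresponded surfaces: each row is one liver,
# flattened (x, y, z) coordinates of all points
shapes = rng.normal(size=(n_subjects, n_points * 3))

mean_shape = shapes.mean(axis=0)
X = shapes - mean_shape
# PCA via SVD: rows of Vt are the modes of variation
U, S, Vt = np.linalg.svd(X, full_matrices=False)
var = S**2 / (n_subjects - 1)
explained = np.cumsum(var) / var.sum()
print("modes for 70% of variation:", np.searchsorted(explained, 0.70) + 1)

# A 95% boundary shape along mode k: mean +/- 1.96 * sigma_k * mode_k
k = 0
boundary = mean_shape + 1.96 * np.sqrt(var[k]) * Vt[k]
```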
APA, Harvard, Vancouver, ISO, and other styles
43

Robertson, Bradford E. "A hybrid probabilistic method to estimate design margin." Diss., Georgia Institute of Technology, 2013. http://hdl.handle.net/1853/50375.

Full text
Abstract:
Weight growth has been a significant factor in nearly every space and launch vehicle development program. In order to account for weight growth, program managers allocate a design margin. However, methods of estimating design margin are not well suited for the task of assigning a design margin for a novel concept. In order to address this problem, a hybrid method of estimating margin is developed. This hybrid method utilizes range estimating, a well-developed method for conducting a bottom-up weight analysis, and a new forecasting technique known as executable morphological analysis. Executable morphological analysis extends morphological analysis in order to extract quantitative information from the morphological field. Specifically, the morphological field is extended by adding attributes (probability and mass impact) to each condition. This extended morphological field is populated with alternate baseline options with corresponding probabilities of occurrence and impact. The overall impact of alternate baseline options can then be estimated by running a Monte Carlo analysis over the extended morphological field. This methodology was applied to two sample problems. First, the historical design changes of the Space Shuttle Orbiter were evaluated utilizing original mass estimates. Additionally, the FAST reference flight system F served as the basis for a complete sample problem; both range estimating and executable morphological analysis were performed utilizing the work breakdown structure created during the conceptual design of this vehicle.
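Executable morphological analysis as described here attaches a probability of occurrence and a mass impact to each alternate baseline option and runs a Monte Carlo analysis over the extended morphological field. A minimal sketch with invented attributes and numbers:

```python
import numpy as np

rng = np.random.default_rng(3)

# Extended morphological field: for each design attribute, a list of
# alternate baseline options as (probability of occurrence, mass
# impact in kg). All entries are illustrative assumptions.
field = {
    "propulsion": [(0.20, 150.0), (0.05, 400.0)],
    "structure":  [(0.30,  80.0)],
    "avionics":   [(0.15,  25.0), (0.10, 60.0)],
}

def sample_growth():
    total = 0.0
    for options in field.values():
        for p, impact in options:
            if rng.random() < p:  # option occurs independently
                total += impact
    return total

growth = np.array([sample_growth() for _ in range(10_000)])
print(f"mean mass growth: {growth.mean():.0f} kg")
print(f"95th percentile : {np.percentile(growth, 95):.0f} kg")
# The percentile matching the programme's risk tolerance becomes
# the design margin estimate.
```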
APA, Harvard, Vancouver, ISO, and other styles
44

Devaney, Shaun. "Development of software for reliability based design of steel framed structures in fire." Thesis, University of Edinburgh, 2015. http://hdl.handle.net/1842/10468.

Full text
Abstract:
Fire in building structures represents a risk both to life and property that cannot be fully eliminated. It is the aim of fire safety engineering to reduce this risk to an acceptable level through the application of scientific and engineering principles to evaluate the risk posed by fire and to determine the optimal set of protective measures. This is increasingly being achieved through performance-based design methods. Performance-based design sets out performance requirements, typically related to life safety and control of property losses, and the designer is free to choose the most suitable approach to meet these requirements. Accurate performance-based design requires the evaluation of the risks to a structure through the evaluation of the range of hazards that may occur and the resulting structural responses. The purpose of this research is to develop simplified methodologies for the reliability based design of steel framed structures in fire. These methodologies are incorporated into a software package, FireLab, which is intended to act as a tool for practicing engineers to aid in learning and applying performance-based design. FireLab is a Matlab-based program that incorporates a number of different models for analysing the response of structural elements exposed to fire. It includes both deterministic and probabilistic analysis procedures. A range of simple fire models are presented for modelling compartment fires. A set of heat transfer processes are discussed for calculating the temperature distribution within common structural elements exposed to fire. A variety of structural models are discussed which may be used to model the effects of fire on a structure. An analytical model for the analysis of composite beams has been implemented in the software program. Interfaces between the software and two separate third-party programs have also been created to allow for the analysis of composite beams using the finite element method. Analytical methods for the analysis of composite slabs under thermo-mechanical load have been implemented in the software. These methods account for the additional load carrying capacity that slabs have in fire due to the positive effects of tensile membrane action. A numerical analysis method for the vertical stability of structures subjected to multi-floor fires has been implemented using the direct stiffness method. This method uses an elastic 2nd order solution in order to check the stability of a column under the fire induced horizontal loads from sagging floors. These models of potential failure scenarios provide the basis for the probabilistic analysis methods. A variety of methods for reliability analysis are evaluated based on ease of use, accuracy and efficiency. A selection of these methods has been implemented in the software program. A selection of sample cases are examined in order to illustrate the procedures and to evaluate the important input variables. These methods provide the probability of failure of a structure under specific loads. The probability of failure is a useful parameter in comparing the level of safety between various design options. A more comprehensive framework is developed for the evaluation of the probable costs due to fire associated with a given design. This framework is based on an existing framework from earthquake engineering. It involves calculating the statistical spread of both the magnitude and likelihood of occurrence of fire and the resulting structural responses. The damage that occurs from the structural response may then be estimated. Finally, given the likely level of damage that will occur, it is possible to estimate the cost of the damage either in terms of monetary cost of repair or downtime due to repair works. This method is applied to a variety of design options for a typical office building in order to illustrate the application of the framework.
APA, Harvard, Vancouver, ISO, and other styles
45

Gudjonsen, Ludvik. "Combining Probabilistic and Discrete Methods for Sequence Modelling." Thesis, University of Skövde, Department of Computer Science, 1999. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-390.

Full text
Abstract:
Sequence modelling is used for analysing newly sequenced proteins, giving an indication of their 3-D structure and functionality. Current approaches to the modelling of protein families are based on either discrete or probabilistic methods. Here we present an approach for combining these two approaches in a hybrid model, where discrete patterns are used to model conserved regions and probabilistic models are used for variable regions. When hidden Markov models are used to model the variable regions, the hybrid method gives increased classification accuracy compared to pure discrete or probabilistic models.
APA, Harvard, Vancouver, ISO, and other styles
46

Aysan, Hüseyin. "Fault-Tolerance Strategies and Probabilistic Guarantees for Real-Time Systems." Doctoral thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-14653.

Full text
Abstract:
Ubiquitous deployment of embedded systems is having a substantial impact on our society, since they interact with our lives in many critical real-time applications. Typically, embedded systems used in safety or mission critical applications (e.g., aerospace, avionics, automotive or nuclear domains) work in harsh environments where they are exposed to frequent transient faults such as power supply jitter, network noise and radiation. They are also susceptible to errors originating from design and production faults. Hence, they have the design objective to maintain the properties of timeliness and functional correctness even under error occurrences. Fault-tolerance plays a crucial role towards achieving dependability, and the fundamental requirement for the design of effective and efficient fault-tolerance mechanisms is a realistic and applicable model of potential faults and their manifestations. An important factor to be considered in this context is the random nature of faults and errors, which, if addressed in the timing analysis by assuming a rigid worst-case occurrence scenario, may lead to inaccurate results. It is also important that the power, weight, space and cost constraints of embedded systems are addressed by efficiently using the available resources for fault-tolerance. This thesis presents a framework for designing predictably dependable embedded real-time systems by jointly addressing the timeliness and the reliability properties. It proposes a spectrum of fault-tolerance strategies particularly targeting embedded real-time systems. Efficient resource usage is attained by considering the diverse criticality levels of the systems' building blocks. The fault-tolerance strategies are complemented with the proposed probabilistic schedulability analysis techniques, which are based on a comprehensive stochastic fault and error model.
APA, Harvard, Vancouver, ISO, and other styles
47

Chrszon, Philipp, Clemens Dubslaff, Sascha Klüppelholz, and Christel Baier. "ProFeat: Feature-oriented engineering for family-based probabilistic model checking." Springer, 2017. https://tud.qucosa.de/id/qucosa%3A70792.

Full text
Abstract:
The concept of features provides an elegant way to specify families of systems. Given a base system, features encapsulate additional functionalities that can be activated or deactivated to enhance or restrict the base system's behaviors. Features can also facilitate the analysis of families of systems by exploiting commonalities of the family members and performing an all-in-one analysis, where all systems of the family are analyzed at once on a single family model instead of one-by-one. Most prominently, the concept of features has been successfully applied to describe and analyze (software) product lines. We present the tool ProFeat that supports the feature-oriented engineering process for stochastic systems by probabilistic model checking. To describe families of stochastic systems, ProFeat extends models for the prominent probabilistic model checker Prism by feature-oriented concepts, including support for probabilistic product lines with dynamic feature switches, multi-features and feature attributes. ProFeat provides a compact symbolic representation of the analysis results for each family member obtained by Prism to support, e.g., model repair or refinement during feature-oriented development. By means of several case studies we show how ProFeat eases family-based quantitative analysis and compare the one-by-one and all-in-one analysis approaches.
APA, Harvard, Vancouver, ISO, and other styles
48

Eimontas, Tadas. "Tikimybinės dinamikos modeliavimas ir patikimumo analizė." Master's thesis, Lithuanian Academic Libraries Network (LABT), 2007. http://vddb.library.lt/obj/LT-eLABa-0001:E.02~2007~D_20070816_144151-63546.

Full text
Abstract:
Owing to the rapid adoption of technology over recent decades, ever more complex systems are being built, whose safety assurance requires assessing the reliability of hardware and software as well as the actions of the human operator. In the analysis of such systems, the time factor, together with the deterministic physical processes and stochastic events related to it, is of particular importance. The aim of this research is to develop a methodology for modelling probabilistic dynamics by extending the related reliability analysis tools and to apply them to the study of a real system. To address these tasks, the advanced theory of stimulated dynamics, which so far lacks broad validation on practical models, was applied. The adopted simulation methodology made it possible to perform a comprehensive safety analysis of a loss-of-coolant accident. The most significant results of the work concern the safety analysis of uncertain events and dynamic systems, with the aim of increasing their reliability. Most of the applications of the work are devoted to technical dynamic systems. The thesis examines and develops modelling methods that are suitable for investigating system reliability related to delayed hazardous events or risky operator decisions.
Current probabilistic safety analysis is not capable of estimating the reliability of complex dynamic systems in which interactions occur between hardware, software and human actions. In the safety analysis of these systems the time factor is especially important, as it joins the evolution of physical variables with stochastic events. This master thesis considers the simulation and reliability analysis of probabilistic dynamics. The new approach of stimulus-based probabilistic dynamics is used for Monte Carlo simulations of the dynamic system. The developed methodology was applied to the safety analysis of a loss-of-coolant accident in a nuclear reactor. Besides assessing the probability of system failure, a scenario analysis was accomplished and the essential events were identified. The uncertainty and sensitivity analysis revealed that the failure probability had a wide distribution due to the uncertainty of twelve simulation parameters. Four main parameters were identified whose uncertainty correlated most strongly with the uncertainty of the system failure. For a complete reliability analysis, the relations between the failure probability and the system characteristics were determined.
APA, Harvard, Vancouver, ISO, and other styles
49

Al-Bittar, Tamara. "Probabilistic analysis of shallow foundations resting on spatially varying soils." Nantes, 2012. http://archive.bu.univ-nantes.fr/pollux/show.action?id=17b61462-4bf8-4bbd-9c16-ad777ebd98ab.

Full text
Abstract:
This thesis presents a probabilistic analysis of shallow foundations resting on spatially varying soil and subjected to static or dynamic (seismic) loading. In the first part of the thesis, the case of static loading is considered. In this part, only the spatial variability of the soil is examined, and the soil properties are modelled by random fields. In the literature, the Monte Carlo Simulation (MCS) method is generally used. In this thesis, the Sparse Polynomial Chaos Expansion (SPCE) methodology is employed. This method aims to replace the finite element/finite difference model with a meta-model, which leads (in the present case of high-dimensional stochastic problems) to a significant reduction in the number of calls to the deterministic model. Moreover, a combined use of SPCE and Global Sensitivity Analysis (denoted SPCE/GSA) is proposed to further reduce the cost of the probabilistic analysis for problems involving an expensive deterministic model. In the second part of the thesis, seismic loading is considered in the probabilistic analysis. In this part, the spatial variability of the soil and/or the time variability of the seismic loading are taken into account. In this case, the seismic loading is modelled by a random process. The numerical results show the significant effect of the time variability of the seismic signal on the probabilistic analysis.
The aim of this thesis is to study the performance of shallow foundations resting on spatially varying soils and subjected to a static or a dynamic (seismic) loading using probabilistic approaches. In the first part of this thesis, a static loading was considered in the probabilistic analysis. In this part, only the soil spatial variability was considered and the soil parameters were modelled by random fields. In such cases, the Monte Carlo Simulation (MCS) methodology is generally used in the literature. In this thesis, the Sparse Polynomial Chaos Expansion (SPCE) methodology was employed. This methodology aims at replacing the finite element/finite difference deterministic model by a meta-model. This leads (in the present case of high-dimensional stochastic problems) to a significant reduction in the number of calls to the deterministic model with respect to the crude MCS methodology. Moreover, an efficient combined use of the SPCE methodology and Global Sensitivity Analysis (GSA) was proposed. The aim is to reduce once again the probabilistic computation time for problems with expensive deterministic models. In the second part of this thesis, a seismic loading was considered. In this part, the soil spatial variability and/or the time variability of the earthquake Ground-Motion (GM) were considered. In this case, the earthquake GM was modelled by a random process. Both the case of a free field and a Soil-Structure Interaction (SSI) problem were investigated. The numerical results have shown the significant effect of the time variability of the earthquake GM on the probabilistic analysis.
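As a simplified stand-in for the SPCE idea, the sketch below fits a low-degree Hermite polynomial chaos surrogate by least squares to a toy model and then reuses the cheap surrogate inside Monte Carlo simulation. The basis, truncation, and sample sizes are assumptions for illustration; the actual SPCE algorithm uses a sparse, adaptively truncated basis.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermeval

rng = np.random.default_rng(5)

def model(x):
    # Stand-in for the expensive finite element/finite difference model
    return np.sin(x[:, 0]) + 0.5 * x[:, 1] ** 2

X = rng.normal(size=(200, 2))  # two standard-normal inputs
y = model(X)

def basis(X):
    # Probabilists' Hermite polynomials, total degree <= 2 (6 terms)
    cols = []
    for i in range(3):
        for j in range(3 - i):
            ci = np.zeros(i + 1); ci[i] = 1.0
            cj = np.zeros(j + 1); cj[j] = 1.0
            cols.append(hermeval(X[:, 0], ci) * hermeval(X[:, 1], cj))
    return np.column_stack(cols)

coef, *_ = np.linalg.lstsq(basis(X), y, rcond=None)

# The cheap surrogate then replaces the model inside Monte Carlo
Xmc = rng.normal(size=(100_000, 2))
y_mc = basis(Xmc) @ coef
print(y_mc.mean(), y_mc.std())
```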
APA, Harvard, Vancouver, ISO, and other styles
50

König, Johan. "Analyzing Substation Automation System Reliability using Probabilistic Relational Models and Enterprise Architecture." Doctoral thesis, KTH, Industriella informations- och styrsystem, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-145006.

Full text
Abstract:
Modern society is unquestionably heavily reliant on the supply of electricity. Hence, the power system is one of the important infrastructures for future growth. However, the power system of today was designed for a stable radial flow of electricity from large power plants to the customers, and not for the type of changes it is presently being exposed to, like large-scale integration of electric vehicles, wind power plants, residential photovoltaic systems etc. One aspect of power system control particularly exposed to these changes is the design of power system control and protection functionality. Problems occur when the flow of electricity changes from a unidirectional radial flow to a bidirectional one. Such a change requires a redesign of control and protection functionality as well as the introduction of new information and communication technology (ICT). To make matters worse, the closer the interaction between the power system and the ICT systems, the more complex the matter becomes from a reliability perspective. This problem is inherently cyber-physical, including everything from system software to power cables and transformers, rather than the traditional reliability concern of only focusing on power system components. The contribution of this thesis is a framework for reliability analysis, utilizing system modeling concepts that support the industrial engineering issues that follow with the implementation of modern substation automation systems. The framework is based on a Bayesian probabilistic analysis engine represented by Probabilistic Relational Models (PRMs) in combination with an Enterprise Architecture (EA) modeling formalism. The gradual development of the framework is demonstrated through a number of application scenarios based on substation automation system configurations. This thesis is a composite thesis consisting of seven papers. Paper 1 presents the framework combining EA, PRMs and Fault Tree Analysis (FTA). Paper 2 adds primary substation equipment as part of the framework. Paper 3 presents a mapping between modeling entities from the EA framework ArchiMate and substation automation system configuration objects from the IEC 61850 standard. Paper 4 introduces object definitions and relations in coherence with the EA modeling formalism suitable for the purpose of the analysis framework. Paper 5 describes an extension of the analysis framework by adding logical operators to the probabilistic analysis engine. Paper 6 presents enhanced failure rates for software components obtained by studying failure logs, and an application of the framework to a utility substation automation system. Finally, Paper 7 describes the ability to utilize domain standards for coherent modeling of functions and their interrelations, and an application of the framework utilizing software-tool support.


APA, Harvard, Vancouver, ISO, and other styles