Theses on the topic "Probabilistic analysis"
Consult the top 50 theses for your research on the topic "Probabilistic analysis".
POZZI, FEDERICO ALBERTO. « Probabilistic Relational Models for Sentiment Analysis in Social Networks ». Doctoral thesis, Università degli Studi di Milano-Bicocca, 2015. http://hdl.handle.net/10281/65709.
Crotta, M. « PROBABILISTIC MODELLING IN FOOD SAFETY : A SCIENCE-BASED APPROACH FOR POLICY DECISIONS ». Doctoral thesis, Università degli Studi di Milano, 2015. http://hdl.handle.net/2434/339138.
SCOZZESE, FABRIZIO. « AN EFFICIENT PROBABILISTIC FRAMEWORK FOR SEISMIC RISK ANALYSIS OF STRUCTURAL SYSTEMS EQUIPPED WITH LINEAR AND NONLINEAR VISCOUS DAMPERS ». Doctoral thesis, Università degli Studi di Camerino, 2018. http://hdl.handle.net/11581/429547.
Tagliaferri, Lorenza. « Probabilistic Envelope Curves for Extreme Rainfall Events - Curve Inviluppo Probabilistiche per Precipitazioni Estreme ». Master's thesis, Alma Mater Studiorum - Università di Bologna, 2008. http://amslaurea.unibo.it/99/.
Saad, Feras Ahmad Khaled. « Probabilistic data analysis with probabilistic programming ». Thesis, Massachusetts Institute of Technology, 2016. http://hdl.handle.net/1721.1/113164.
Texte intégralThis electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 48-50).
Probabilistic techniques are central to data analysis, but different approaches can be challenging to apply, combine, and compare. This thesis introduces composable generative population models (CGPMs), a computational abstraction that extends directed graphical models and can be used to describe and compose a broad class of probabilistic data analysis techniques. Examples include hierarchical Bayesian models, multivariate kernel methods, discriminative machine learning, clustering algorithms, dimensionality reduction, and arbitrary probabilistic programs. We also demonstrate the integration of CGPMs into BayesDB, a probabilistic programming platform that can express data analysis tasks using a modeling language and a structured query language. The practical value is illustrated in two ways. First, CGPMs are used in an analysis that identifies satellite data records which probably violate Kepler's Third Law, by composing causal probabilistic programs with non-parametric Bayes in under 50 lines of probabilistic code. Second, for several representative data analysis tasks, we report on lines of code and accuracy measurements of various CGPMs, plus comparisons with standard baseline solutions from Python and MATLAB libraries.
by Feras Ahmad Khaled Saad.
M. Eng.
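The core of the abstraction described above is a uniform generative interface that heterogeneous models can share. The following is a hypothetical, minimal sketch of that idea in Python: a single-variable population model exposing simulate/logpdf. The class name and interface are invented for illustration and are not BayesDB's actual API.

```python
# Hypothetical sketch of a "composable generative population model":
# every model, however it is implemented internally, exposes the same
# simulate/logpdf contract, so models can be composed and compared.
import math, random

class NormalCGPM:
    """One-variable generative population model: x ~ Normal(mu, sd)."""
    def __init__(self, mu, sd, seed=0):
        self.mu, self.sd = mu, sd
        self.rng = random.Random(seed)

    def simulate(self):
        # Draw one member of the population.
        return self.rng.gauss(self.mu, self.sd)

    def logpdf(self, x):
        # Log-density of the Normal(mu, sd) distribution at x.
        z = (x - self.mu) / self.sd
        return -0.5 * z * z - math.log(self.sd * math.sqrt(2.0 * math.pi))

m = NormalCGPM(0.0, 1.0)
samples = [m.simulate() for _ in range(10_000)]
print(sum(samples) / len(samples))  # sample mean, close to 0
```

A mixture model, a kernel method, or a full probabilistic program could implement the same two methods and be swapped in behind the identical contract.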
Shirmohammadi, Mahsa. « Qualitative analysis of synchronizing probabilistic systems ». Thesis, Cachan, Ecole normale supérieure, 2014. http://www.theses.fr/2014DENS0054/document.
Markov decision processes (MDPs) are finite-state probabilistic systems with both strategic and random choices, and are hence well established for modeling the interactions between a controller and its randomly responding environment. An MDP can be viewed mathematically as a one-and-a-half-player stochastic game played in rounds: the controller chooses an action, and the environment chooses a successor according to a fixed probability distribution. There are two incomparable views on the behavior of an MDP once the strategic choices are fixed. In the traditional view, an MDP is a generator of sequences of states, called the state-outcome; the winning condition of the player is then expressed as a set of desired sequences of states visited during the game, e.g. a Borel condition such as reachability. The computational complexity of the related decision problems and the memory requirements of winning strategies for state-outcome conditions are well studied. More recently, MDPs have been viewed as generators of sequences of probability distributions over states, called the distribution-outcome.
We introduce synchronizing conditions defined on distribution-outcomes, which intuitively require that the probability mass accumulate in some (group of) state(s), possibly in the limit. A probability distribution is p-synchronizing if the probability mass is at least p in some state, and a sequence of probability distributions is always, eventually, weakly, or strongly p-synchronizing if, respectively, all, some, infinitely many, or all but finitely many distributions in the sequence are p-synchronizing. For each synchronizing mode, an MDP can be (i) sure winning if there is a strategy that produces a 1-synchronizing sequence; (ii) almost-sure winning if there is a strategy that produces a sequence that is, for all epsilon > 0, (1-epsilon)-synchronizing; (iii) limit-sure winning if for all epsilon > 0 there is a strategy that produces a (1-epsilon)-synchronizing sequence. We consider the problem of deciding whether an MDP is winning, for each synchronizing and winning mode: we establish matching upper and lower complexity bounds, as well as the memory requirements of optimal winning strategies. As a further contribution, we study synchronization in probabilistic automata (PAs), a kind of MDP in which controllers are restricted to word-strategies, i.e. they cannot observe the history of the system execution, only the number of choices made so far. The synchronizing language of a PA is then the set of all synchronizing word-strategies: we establish the computational complexity of the emptiness and universality problems for all synchronizing languages in all winning modes. We carry over results on synchronizing problems from MDPs and PAs to two-player turn-based games and nondeterministic finite automata.
Along with the main results, we establish new complexity results for alternating finite automata over a one-letter alphabet. In addition, we study different variants of synchronization for timed and weighted automata, as two instances of infinite-state systems.
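The distribution-outcome view above is easy to illustrate on a Markov chain (an MDP with its strategy fixed). The sketch below, with a made-up 3-state chain not taken from the thesis, computes the sequence of distributions and checks p-synchronization.

```python
# Sketch: distribution-outcomes of a Markov chain and a p-synchronization
# check. The 3-state transition matrix is an invented example.
import numpy as np

P = np.array([[0.0, 1.0, 0.0],   # row-stochastic transition matrix
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 1.0]])  # state 2 is absorbing

def distribution_outcome(P, d0, steps):
    """Yield the sequence of distributions d0, d0 P, d0 P^2, ..."""
    d = np.asarray(d0, dtype=float)
    for _ in range(steps + 1):
        yield d
        d = d @ P

def is_p_synchronizing(d, p):
    """A distribution is p-synchronizing if some state carries mass >= p."""
    return d.max() >= p

seq = list(distribution_outcome(P, [0.5, 0.5, 0.0], 5))
flags = [is_p_synchronizing(d, 1.0) for d in seq]
print(flags)
```

Here the mass drains into the absorbing state, so the sequence is eventually (indeed strongly) 1-synchronizing: after two steps every distribution has all its mass in one state.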
Baier, Christel, Benjamin Engel, Sascha Klüppelholz, Steffen Märcker, Hendrik Tews and Marcus Völp. « A Probabilistic Quantitative Analysis of Probabilistic-Write/Copy-Select ». Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2013. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-129917.
Munch, Mélanie. « Améliorer le raisonnement dans l'incertain en combinant les modèles relationnels probabilistes et la connaissance experte ». Thesis, université Paris-Saclay, 2020. http://www.theses.fr/2020UPASB011.
This thesis focuses on integrating expert knowledge to enhance reasoning under uncertainty. Our goal is to guide the learning of probabilistic relations with expert knowledge for domains described by ontologies. To do so, we propose to couple knowledge bases (KBs) with an object-oriented extension of Bayesian networks, the probabilistic relational models (PRMs). Our aim is to complement statistical learning with expert knowledge in order to learn a model as close as possible to reality, and to analyze it quantitatively (with probabilistic relations) and qualitatively (with causal discovery). We developed three algorithms through three distinct approaches, whose main differences lie in their degree of automation and in whether human expert supervision is integrated. The originality of our work is the combination of two broadly opposed philosophies: while the Bayesian approach favors statistical analysis of the given data in order to reason with it, the ontological approach is based on modeling expert knowledge to represent a domain. Combining the strengths of the two improves both reasoning under uncertainty and the expert knowledge.
Echard, Benjamin. « Assessment by kriging of the reliability of structures subjected to fatigue stress ». Thesis, Clermont-Ferrand 2, 2012. http://www.theses.fr/2012CLF22269/document.
Traditional procedures for designing structures against fatigue are grounded in the use of so-called safety factors, in an attempt to ensure structural integrity while masking the uncertainties inherent to fatigue. These engineering methods are simple to use and, fortunately, give satisfactory solutions with regard to safety. However, they provide the designer with neither the structure's safety margin nor the influence of each design parameter on reliability. Probabilistic approaches are considered in this thesis in order to acquire this information, which is essential for an optimal design against fatigue. A general approach for probabilistic analysis in fatigue is proposed in this manuscript. It relies on modelling the uncertainties (load, material properties, geometry, and fatigue curve), and aims at assessing the reliability level of the studied structure in the case of a fatigue failure scenario. Classical reliability methods require a large number of calls to the mechanical model of the structure and are thus not applicable when the model evaluation is time-demanding. A family of methods named AK-RM (Active learning and Kriging-based Reliability Methods) is proposed in this research work in order to solve the reliability problem with a minimum number of mechanical model evaluations. The general approach is applied to two case studies submitted by SNECMA in the framework of the ANR project APPRoFi.
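The reliability problem that AK-RM methods target can be stated very compactly: estimate P[g(X) <= 0] for a limit-state function g. The toy sketch below uses plain Monte Carlo on a cheap, invented resistance-minus-load model; the point of AK-RM is precisely to replace g with an actively trained kriging surrogate when each evaluation is an expensive mechanical model run.

```python
# Minimal illustration of the reliability problem: estimate the failure
# probability P[g(X) <= 0] by Monte Carlo. The limit state g = R - S and
# the distribution parameters are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)

def g(r, s):
    return r - s   # failure when the load s exceeds the resistance r

n = 200_000
r = rng.normal(5.0, 0.8, n)   # resistance samples (illustrative numbers)
s = rng.normal(2.0, 1.0, n)   # load samples
pf = np.mean(g(r, s) <= 0.0)
print(f"estimated failure probability: {pf:.4f}")
```

With 200,000 cheap evaluations this is trivial; with a finite-element fatigue model at minutes per call it is not, which is the motivation for surrogate-based active learning.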
Kassa, Negede Abate. « Probabilistic safety analysis of dams ». Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2010. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-60843.
Glynn, Luke. « A Probabilistic Analysis of Causation ». Thesis, University of Oxford, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.523098.
Texte intégralARAÚJO, Rafael Pereira de. « Probabilistic analysis applied to robots ». Universidade Federal de Pernambuco, 2016. https://repositorio.ufpe.br/handle/123456789/20827.
Robots are increasingly being used in industry and are starting to make their way into our homes as well. Nonetheless, the techniques most frequently used to analyze robot motion are based on simulations or on statistical experiments made by filming robots' movements. In this work we propose an alternative way of performing such analysis by using probabilistic model checking with the language and tool PRISM. With PRISM we can perform simulations as well as check exhaustively whether a robot motion plan satisfies specific probabilistic temporal formulas. We can therefore measure energy consumption, time to complete missions, etc., all in terms of specific motion planning algorithms. As a consequence, we can also determine whether one algorithm is superior to another on certain metrics. Furthermore, to ease the use of our work, we hide the PRISM syntax behind a more user-friendly DSL. We created a translator from the DSL to PRISM by implementing the translation rules, and carried out a preliminary investigation of its relative completeness using the grammatical-element generation tool LGen. We illustrate these ideas with motion planning algorithms for home cleaning robots.
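The quantities mentioned above (expected energy, time to complete a mission) can be prototyped by simulation before being encoded exhaustively as a PRISM model. Below is a hedged sketch with a hypothetical cleaning robot: at each step it cleans a new cell with probability 0.9 or wastes the step with probability 0.1, at one energy unit per step. All numbers are invented for illustration.

```python
# Toy simulation of a cleaning-robot mission: count the steps (energy
# units) needed to clean all cells when each step succeeds with
# probability p_progress. PRISM would compute the expectation exactly;
# here we just estimate it by sampling.
import random

def mission_steps(cells=10, p_progress=0.9, rng=random.Random(42)):
    """Simulate the number of steps needed to clean `cells` cells."""
    steps, done = 0, 0
    while done < cells:
        steps += 1
        if rng.random() < p_progress:
            done += 1
    return steps

runs = [mission_steps() for _ in range(5000)]
avg = sum(runs) / len(runs)
print(f"mean steps (energy units) to finish: {avg:.2f}")
```

The analytic mean is cells / p_progress = 10 / 0.9 ≈ 11.1, which the simulation estimate approaches; a probabilistic model checker would return that value exactly, plus guarantees on temporal-logic properties a simulation cannot certify.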
Asafu-Adjei, Joseph Kwaku. « Probabilistic Methods ». VCU Scholars Compass, 2007. http://hdl.handle.net/10156/1420.
Jreich, Rana. « Distribution verticale du carbone dans les sols - Analyse bayésienne des profils des teneurs en carbone et de C14 ». Thesis, Université Paris-Saclay (ComUE), 2018. http://www.theses.fr/2018SACLV060/document.
Global warming is a major issue for both the scientific world and societies. The concentration of carbon dioxide has increased by 45% since the pre-industrial era (Harris, 2010) as a consequence of human activities, unbalancing the global carbon cycle. This results in global warming, with dramatic impacts on the Earth, particularly for fragile populations. Amongst mitigation solutions, a better use of soil is proposed. Soils have the largest capacity for carbon exchange with the atmosphere and contain a large stock of carbon. A tiny increase in this soil carbon stock and in carbon exchanges between atmosphere and soil would favor soil carbon sequestration and compensate for carbon emissions from burning fossil fuel. However, soil carbon dynamics still suffers from insufficient knowledge; there remains therefore a huge uncertainty about the response of soil carbon to climate and land-use changes. While several mechanistic models have been developed to better understand the dynamics of soil carbon, they provide an incomplete view of the physical processes affecting soil organic matter (OM). It will be a long time before a complete and updated model of soil dynamics becomes available. In my thesis, I propose a Bayesian statistical model aiming at describing the vertical dynamics of soil carbon. This is done through the modeling of both soil organic carbon and radiocarbon data, as they illustrate the residence time of organic matter and thus the soil carbon dynamics. The purpose of this statistical approach is to better represent the uncertainties on soil carbon dynamics and to quantify the effects of climatic and environmental factors on both surface and deep soil carbon. This meta-analysis was performed on a database of 344 profiles, collected from 87 soil science papers and from the literature in archeology and paleoclimatology, under different climate conditions (temperature, precipitation, etc.) and environments (soil type and type of ecosystem).
A hierarchical non-linear model with random effects was proposed to model the vertical dynamics of radiocarbon as a function of depth. Recently published Bayesian selection techniques were applied to the latent layers of the model, which in turn are linked by a linear relationship to the climatic and environmental factors. The Bayesian Group Lasso with Spike and Slab prior (BGL-SS), the Bayesian Sparse Group Selection (BSGS) and the Bayesian Effect Fusion model-based clustering (BEF) were tested to identify the significant categorical explanatory predictors (soil type, ecosystem type), and the Stochastic Search Variable Selection method was used to identify the influential numerical explanatory predictors. These Bayesian techniques were compared using Bayesian model selection criteria (the DIC (Deviance Information Criterion), the posterior predictive check, etc.) to determine which model best predicts and fits the database profiles. In addition to selecting categorical predictors, the BSGS allows the formulation of an a posteriori inclusion probability for each level within the categorical predictors, such as soil type and ecosystem type (9 soil types and 6 ecosystem types were considered in our study). Furthermore, the BEF makes it possible to merge the soil types, as well as the ecosystem types, that it considers to have the same effects on the responses of interest, such as the response of the topsoil radiocarbon. The application of these techniques allowed us to predict, on average and at a global level, the vertical dynamics of radiocarbon in the case of a temperature increase of 1, 1.5 and 2 °C, and in the case of a change in vegetation cover; for example, we studied the impact of deforesting tropical forests and replacing them with cultivated land on soil carbon dynamics. The same statistical analysis was also carried out to better understand the vertical dynamics of soil carbon content.
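The spike-and-slab construction behind the selection priors named above (BGL-SS, SSVS) is simple to state: each coefficient is either "off" (a spike at zero) or "on" (drawn from a wide slab), with some prior inclusion probability. A minimal sketch of prior draws, with invented hyperparameters:

```python
# Sketch of a spike-and-slab prior: each coefficient is zero with
# probability 1-w (the spike) or Normal(0, slab_sd) with probability w
# (the slab). Posterior inference then updates the inclusion
# indicators; here we only draw from the prior for illustration.
import numpy as np

rng = np.random.default_rng(1)

def spike_and_slab_draws(n_draws, w=0.3, slab_sd=2.0):
    on = rng.random(n_draws) < w                               # inclusion indicators
    beta = np.where(on, rng.normal(0.0, slab_sd, n_draws), 0.0)  # coefficients
    return on, beta

on, beta = spike_and_slab_draws(10_000)
print(f"fraction of coefficients 'on': {on.mean():.3f}")  # close to w = 0.3
```

The posterior mean of each indicator is exactly the "a posteriori inclusion probability" the abstract refers to for categorical predictor levels.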
Pan, Qiujing. « Deterministic and Probabilistic Assessment of Tunnel Face Stability ». Thesis, Université Grenoble Alpes (ComUE), 2017. http://www.theses.fr/2017GREAI044.
In contemporary society, the use and exploitation of underground space has become an inevitable and necessary measure to relieve urban congestion. One of the most important requirements for successful design and construction in tunnelling and underground engineering is to maintain the stability of the soils surrounding the works. Stability analysis, however, requires engineers to have a clear idea of the earth pressure, the pore water pressure, the seismic effects and the soil variability. This research therefore aims at employing available theory to design tunnels and underground structures, a topic of high engineering significance. Among the approaches employed to address this problem, limit analysis is a powerful tool for stability analysis and has been widely used for real geotechnical works. This research undertakes further work on the application of the upper bound theorem to the stability analysis of tunnels and underground engineering. The approach is then compared with three-dimensional analyses and with available experimental data. The final goal is to validate new simplified mechanisms, using limit analysis, to design for the collapse and blow-out pressures at the tunnel face. These deterministic models will then be used in a probabilistic framework. The collocation-based stochastic response surface methodology will be used, and generalized in order to make a complete parametric study of the probabilistic properties of the input variables possible at a limited computational cost. The uncertainty propagation through the models of stability and ground movements will be evaluated, and some methods of reliability-based design will be proposed. The spatial variability of the soil will be taken into account using random field theory, and applied to tunnel face collapse. This model will be developed so as to account for this variability at much smaller computation times than numerical models; it will be validated numerically and submitted to extensive random sampling. The effect of the spatial variability will be evaluated.
Goka, Edoh. « Analyse des tolérances des systèmes complexes – Modélisation des imperfections de fabrication pour une analyse réaliste et robuste du comportement des systèmes ». Thesis, Paris, ENSAM, 2019. http://www.theses.fr/2019ENAM0019/document.
Tolerance analysis aims to verify the impact of individual tolerances on the assembly and functional requirements of a mechanical system. Manufactured products have several types of contacts and their geometry is imperfect, which may lead to failures of function and of assembly. Traditional methods for tolerance analysis do not consider form defects. This thesis proposes a new procedure for tolerance analysis that considers form defects and the different types of contact in its geometrical behavior modeling. A method is first proposed to model the form defects so as to make the analysis realistic. Form defects are then integrated into the geometrical behavior modeling of a mechanical system, also considering the different types of contacts; indeed, these contacts behave differently once the imperfections are considered. Monte Carlo simulation coupled with an optimization technique is chosen as the method to perform the tolerance analysis. Nonetheless, this method is computationally expensive. To overcome this problem, probabilistic models using the kernel density estimation method are proposed.
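The Monte Carlo plus kernel-density pattern described above can be sketched on a toy linear stack-up: sample the part dimensions, compute the resulting gap, then fit a KDE to the gap distribution so the costly sampling need not be repeated. The dimensions and tolerances below are invented for illustration.

```python
# Toy Monte Carlo tolerance analysis: gap = housing - (part1 + part2).
# A kernel density estimate of the gap then serves as a cheap stand-in
# for the sampled distribution.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(7)
n = 50_000
housing = rng.normal(30.0, 0.05, n)   # nominal 30 mm, illustrative spread
part1   = rng.normal(14.9, 0.03, n)
part2   = rng.normal(14.9, 0.03, n)
gap = housing - (part1 + part2)

kde = gaussian_kde(gap)               # density surrogate for the gap
p_interference = np.mean(gap <= 0.0)  # estimated non-assembly probability
print(f"P(gap <= 0) ~ {p_interference:.4f}")
```

Real tolerance analyses add form defects and contact constraints, which is exactly where the optimization step and the heavier computational cost mentioned in the abstract come in.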
Alhajj, Chehade Hicham. « Geosynthetic-Reinforced Retaining Walls-Deterministic And Probabilistic Approaches ». Thesis, Université Grenoble Alpes, 2021. http://www.theses.fr/2021GRALI010.
The aim of this thesis is to assess the seismic internal stability of geosynthetic-reinforced soil retaining walls. The work first deals with deterministic analyses and then focuses on probabilistic ones. In the first part of this thesis, a deterministic model, based on the upper bound theorem of limit analysis, is proposed for assessing the reinforced soil wall safety factor or the reinforcement strength required to stabilize the structure. A spatial discretization technique is used to generate the rotational failure surface, giving the possibility of considering heterogeneous backfills and/or representing the seismic loading by the pseudo-dynamic approach. The cases of dry, unsaturated and saturated soils are investigated, and the presence of cracks in the backfill soils is considered. This deterministic model gives rigorous results and is validated by comparison with existing results from the literature. In the second part of the thesis, this deterministic model is used in a probabilistic framework. First, the uncertain input parameters are modeled using random variables; the uncertainties considered involve the soil shear strength parameters, the seismic loading and the reinforcement strength parameters. Sparse polynomial chaos expansion, which consists of replacing the time-expensive deterministic model by a meta-model, combined with Monte Carlo simulations, is adopted as the reliability method for the probabilistic analysis. The random-variables approach neglects the soil spatial variability, since the soil properties and the other uncertain input parameters are considered constant in each deterministic simulation. Therefore, in the last part of the manuscript, the soil spatial variability is considered using random field theory. The SIR/A-bSPCE method, a combination of the dimension-reduction technique Sliced Inverse Regression (SIR) and an active-learning sparse polynomial chaos expansion (A-bSPCE), is implemented to carry out the probabilistic analysis. The total computational time of the probabilistic analysis performed using SIR-SPCE is significantly reduced compared to directly running classical probabilistic methods. Only the soil strength parameters are modeled using random fields, in order to focus on the effect of spatial variability on the reliability results.
Royer, Clément. « Algorithmes d'optimisation sans dérivées à caractère probabiliste ou déterministe : analyse de complexité et importance en pratique ». Thesis, Toulouse 3, 2016. http://www.theses.fr/2016TOU30207/document.
Randomization has had a major impact on the latest developments in the field of numerical optimization, partly due to the rise of machine learning applications. In this increasingly popular context, classical nonlinear programming algorithms have indeed been outperformed by variants relying on randomness. The cost of these variants is usually lower than for the traditional schemes; however, theoretical guarantees may not be straightforward to carry over from the deterministic to the randomized setting. Complexity analysis is a useful tool in the latter case, as it helps provide estimates of the convergence speed of a given scheme, which implies some form of convergence. Such a technique has also gained attention from the deterministic optimization community thanks to recent findings in the nonconvex case, as it brings supplementary indicators of the behavior of an algorithm. In this thesis, we investigate the practical enhancement of deterministic optimization algorithms through the introduction of random elements within those frameworks, as well as the numerical impact of their complexity results. We focus on direct-search methods, one of the main classes of derivative-free algorithms, yet our analysis applies to a wide range of derivative-free methods. We propose probabilistic variants of the classical properties required to ensure convergence of the studied methods, then highlight the practical efficiency induced by their lower consumption of function evaluations. First-order concerns form the basis of our analysis, which we apply to address unconstrained and linearly constrained problems. The observed gains incite us to additionally take second-order considerations into account. Using complexity properties of derivative-free schemes, we develop several frameworks in which information of order two is exploited. Both a deterministic and a probabilistic analysis can be performed on these schemes. The latter is an opportunity to introduce supplementary probabilistic properties, together with their impact on numerical efficiency and robustness.
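A direct-search iteration of the kind discussed above polls the objective along a set of directions and accepts a step only on sufficient decrease, expanding the step size on success and shrinking it on failure. The toy version below uses a single random poll direction and its opposite, which mimics the "probabilistic descent" flavor; it is an illustrative sketch under invented parameters, not the thesis algorithm itself.

```python
# Toy derivative-free direct search with random polling: try a random
# unit direction and its opposite; accept on sufficient decrease;
# double the step on success, halve it on failure.
import math, random

def direct_search(f, x, alpha=1.0, tol=1e-6, rng=random.Random(3)):
    while alpha > tol:
        d = [rng.gauss(0.0, 1.0) for _ in x]
        norm = math.sqrt(sum(v * v for v in d))
        d = [v / norm for v in d]
        improved = False
        for sign in (+1, -1):
            y = [xi + sign * alpha * di for xi, di in zip(x, d)]
            if f(y) < f(x) - 1e-4 * alpha * alpha:  # sufficient decrease
                x, improved = y, True
                break
        alpha = alpha * 2.0 if improved else alpha / 2.0
    return x

sol = direct_search(lambda v: (v[0] - 1.0) ** 2 + v[1] ** 2, [5.0, 5.0])
print(sol)  # approaches the minimizer (1, 0)
```

Using only two opposite random directions per iteration, instead of a full positive spanning set, is exactly the kind of saving in function evaluations that the probabilistic variants exploit.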
Guo, Xiangfeng. « Probabilistic stability analysis of an earth dam using field data ». Thesis, Université Grenoble Alpes, 2020. http://www.theses.fr/2020GRALI017.
Uncertainties in soil properties are widely encountered in geotechnical engineering, especially for earth dams, which are constructed with earthen materials. In recent years, there has been an increasing need, motivated by the deficiencies of the traditional deterministic approach or guided by national regulations such as those in France, to account for these uncertainties in the safety assessment of large dams, particularly in the framework of risk analysis studies. However, probabilistic analyses are still complex and not so easy to implement in practice, due to the limited number of in-situ measurements, expensive computations and the lack of implementation of reliability methods in commercial simulation tools. Moreover, most previous studies are based on academic cases and hypothetical data. This work attempts to deal with the aforementioned issues by providing a probabilistic analysis of the stability of a real earth dam using available field data. The study includes the following main elements: (1) definition of the soil variability using the available measurements; (2) development of the deterministic models; (3-4) probabilistic analyses of the dam using the random-variables and random-fields approaches; (5) three-dimensional reliability analysis of the considered dam. Advanced reliability methods, such as adaptive surrogate modelling, are introduced for the studied earth dam problem. This allows accurately estimating the dam failure probability and the safety factor statistics with a significantly reduced calculation time. In addition, some issues that remain unknown or unclear in the field of dam probabilistic analysis are discussed (e.g. global sensitivity analysis of the soil hydraulic and shear strength parameters; a performance survey of five reliability methods; simulation and comparison of three different kinds of random fields: generic (unconditional stationary), conditional and non-stationary). The presented work, based on real measurements, is a useful supplement to the existing probabilistic studies of geo-structures, and readers will find useful information in the obtained results for solving practical geotechnical problems in a probabilistic framework.
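The random-field idea used in the abstracts above can be sketched in a few lines: a 1D stationary Gaussian field for a soil property, built from an exponential autocorrelation model through its Cholesky factor. The mean, standard deviation and correlation length below are invented for illustration.

```python
# Minimal 1D stationary Gaussian random field: covariance from an
# exponential autocorrelation model, sampled via a Cholesky factor.
import numpy as np

rng = np.random.default_rng(11)

def gaussian_field_1d(z, mean, sd, corr_len):
    """Sample one realization of the field at depths z."""
    C = sd**2 * np.exp(-np.abs(z[:, None] - z[None, :]) / corr_len)
    L = np.linalg.cholesky(C + 1e-10 * np.eye(len(z)))  # jitter for stability
    return mean + L @ rng.standard_normal(len(z))

z = np.linspace(0.0, 10.0, 50)                       # depth (m)
phi = gaussian_field_1d(z, mean=30.0, sd=3.0, corr_len=2.0)
print(phi[:5])  # one spatially correlated realization
```

Conditional and non-stationary variants, as compared in the study, modify this recipe by constraining realizations to pass through measurements or by letting the mean and variance vary with depth.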
Bertsimas, Dimitris J. « The Probabilistic Minimum Spanning Tree, Part II : Probabilistic Analysis and Asymptotic Results ». Massachusetts Institute of Technology, Operations Research Center, 1988. http://hdl.handle.net/1721.1/5284.
Cruz, Fernández Francisco. « Probabilistic graphical models for document analysis ». Doctoral thesis, Universitat Autònoma de Barcelona, 2016. http://hdl.handle.net/10803/399520.
Currently, more than 80% of the documents stored on paper belong to the business field. Advances in digitization techniques have fostered interest in creating digital copies in order to solve maintenance and storage problems, as well as to have efficient ways of transmitting and automatically extracting the information contained therein. This situation has led to the need for systems that can automatically extract and analyze this kind of information. The great variety of document types means this is not a trivial task: the process of extracting numerical data from tables or invoices differs substantially from a handwriting recognition task in an annotated document. However, the two tasks share a common link: given a document, we need to identify the region where the information of interest is located. In the area of document analysis this process is called layout analysis, and it aims at identifying and categorizing the different entities that compose the document. These entities can be text regions, pictures, text lines or tables, among others. The process can be approached in two ways: physical or logical analysis. Physical analysis focuses on identifying the physical boundaries that define the area of interest, whereas logical analysis also models information about the role and semantics of the entities within the scope of the document. To encode this information it is necessary to incorporate prior knowledge about the task into the analysis process, which can be introduced in terms of contextual relations between entities. The use of context has proven useful for reinforcing the recognition process and improving results on many computer vision tasks. It raises two fundamental questions: what kind of contextual information is appropriate, and how to incorporate this information into the model. In this thesis we study several ways to incorporate contextual information into the task of document layout analysis.
We focus on the study of probabilistic graphical models and other mechanisms for the inclusion of contextual relations, applied to the specific tasks of region identification and handwritten text line segmentation. On the one hand, we present several methods for region identification. First, we present a method for layout analysis based on conditional random fields for maximum a posteriori estimation, encoding a set of structural relations between different classes of regions in a set of features. Second, we present a method based on 2D probabilistic context-free grammars and perform a comparative study between probabilistic graphical models and this syntactic approach. Third, we propose a statistical approach based on the Expectation-Maximization algorithm devised for structured documents. We perform a thorough evaluation of the proposed methods on two particular collections of documents: a historical dataset composed of ancient structured documents, and a collection of contemporary documents. On the other hand, we present a probabilistic framework applied to the task of handwritten text line segmentation, successfully combining the EM algorithm and variational approaches for this purpose. We demonstrate that the use of contextual information through probabilistic graphical models is of great utility for these tasks.
Kosmidis, Leonidas. « Enabling caches in probabilistic timing analysis ». Doctoral thesis, Universitat Politècnica de Catalunya, 2017. http://hdl.handle.net/10803/460819.
Texte intégralThe hardware and software complexity of future critical systems challenges the scalability of traditional timing analysis methods. Measurement-based probabilistic timing analysis (MBPTA) has recently emerged as a viable alternative solution for industry to handle complex hardware and software. However, MBPTA requires certain timing properties of the system under analysis that conventional systems do not satisfy. In this thesis we introduce, for the first time, hardware and software solutions to satisfy these requirements and also to improve the applicability of MBPTA. We focus on one of the hardware resources with the greatest impact on both average performance and worst-case execution time (WCET) in current real-time platforms: the cache. Along these lines, the contributions of this thesis follow three distinct axes: hardware solutions and software solutions to enable MBPTA, and improvements of MBPTA analysis in systems using caches. At the hardware level, we lay the foundations for the design of an MBPTA-compatible processor, and we define time-randomized cache designs for memory hierarchies with one or multiple levels of any complexity, including unified caches, which can be timing-analyzed for the first time. We propose three new software randomization approaches (one dynamic and two static variants) to handle, in an MBPTA-compatible manner, the timing variability (jitter) of the cache in commercial off-the-shelf (COTS) processors in real-time systems. To that end, all our proposals randomly vary the position of the program's code and data in memory between executions, in order to achieve random timing properties similar to those obtained with custom hardware designs.
We propose a new method to estimate the WCET of a program using MBPTA without requiring the user to identify the worst-case program paths and inputs, thereby improving the applicability of MBPTA in industry. Furthermore, we introduce probabilistic timing composability, which allows integrated systems to reduce their WCET when they use time-randomized caches. With these contributions, this thesis pushes the limits of the use of complex embedded processor designs in real-time systems equipped with caches and paves the way for the industrialization of MBPTA technology.
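The software randomization idea described above can be illustrated with a toy model (all cache parameters here are invented for illustration): randomly offsetting a program's data between runs changes which cache sets its lines map to, turning layout-dependent timing effects into a random variable that MBPTA can treat statistically.

```python
# Hedged toy model of software randomization for MBPTA.
# A direct-mapped cache with invented parameters; real caches differ.
import random

NUM_SETS, LINE_SIZE = 64, 32   # toy cache geometry (illustrative)

def cache_sets(base_addr, num_lines):
    """Cache set index for each consecutive line of a data block."""
    return [((base_addr + i * LINE_SIZE) // LINE_SIZE) % NUM_SETS
            for i in range(num_lines)]

random.seed(0)
# Two runs of the same program with random base offsets:
# same access pattern, different set mappings (hence different timing).
run1 = cache_sets(random.randrange(0, NUM_SETS) * LINE_SIZE, 8)
run2 = cache_sets(random.randrange(0, NUM_SETS) * LINE_SIZE, 8)
print(run1)
print(run2)
```

Across many randomized runs, the measured execution times form the sample that MBPTA then analyzes probabilistically.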
Muller, Maria Anna Elizabeth. « Probabilistic analysis of repairable redundant systems ». Pretoria : [s.n.], 2005. http://upetd.up.ac.za/thesis/available/etd-10182006-132917.
Texte intégralRabeau, Nicholas Marc. « Probabilistic approach to contingent claims analysis ». Thesis, Imperial College London, 1996. http://hdl.handle.net/10044/1/8195.
Texte intégralKaowichakorn, Peerachai. « Probabilistic Analysis of Quality of Service ». Thesis, Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-4880.
Texte intégralKüttler, Martin, Michael Roitzsch, Claude-Joachim Hamann et Marcus Völp. « Probabilistic Analysis of Low-Criticality Execution ». Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2018. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-233117.
Texte intégralCai, Xing Shi. « A probabilistic analysis of Kademlia networks ». Thesis, McGill University, 2013. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=114333.
Texte intégralAujourd'hui Kademlia est l'un des les plus utilisés DHTs (Distributed Hash Tableau) dans les réseaux P2P (peer-to-peer). Cet article étudie une question essentielle des réseaux "overlay" de Kademlia d'un point de vue mathématique: combien de temps faut-il pour localiser un noeud? Pour y répondre, nous introduisons un graphe aléatoire K pour modéliser un réseau de Kademlia et étudier la complexité d'un algorithme de routage de Kademlia.
Johnson, Elizabeth Alice. « Probabilistic analysis of shingle beach management ». Thesis, University of Bristol, 2005. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.422561.
Texte intégralTrim, A. D. « Probabilistic dynamic analysis of offshore structures ». Thesis, Cranfield University, 1986. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.376215.
Texte intégralKountras, Apostolos 1970. « Probabilistic analysis of turbine blade durability ». Thesis, Massachusetts Institute of Technology, 2004. http://hdl.handle.net/1721.1/28893.
Texte intégralIncludes bibliographical references (leaves 71-72).
The effect of variability on turbine blade durability was assessed for seven design/operating parameters in three blade designs. The parameters included gas path and cooling convective parameters, metal and coating thermal conductivity, and coating thickness. Durability life was modelled as limited by thermo-mechanical low cycle fatigue and creep. A nominal blade design as well as two additional variants were examined using deterministic and probabilistic approaches. External thermal and pressure boundary conditions were generated by three-dimensional CFD calculations. The expected failure location was the bottom of the trailing edge cooling slot and was the same for all three designs examined. The nominal design had higher life and less variability over the ranges of design parameters examined. For the temperature range studied, fatigue was the primary damage mechanism. The variation in cooling air bulk temperature was most important in setting the variation in blade durability life; this life variation was also affected, though to a lesser extent, by the main gas bulk temperature and heat transfer coefficient and by the cooling heat transfer coefficient.
by Apostolos Kountras.
S.M.
Sheel, Minaskshi. « Probabilistic analysis of ground-holding strategies ». Thesis, Massachusetts Institute of Technology, 1997. http://hdl.handle.net/1721.1/28175.
Texte intégralKüttler, Martin, Michael Roitzsch, Claude-Joachim Hamann et Marcus Völp. « Probabilistic Analysis of Low-Criticality Execution ». Technische Universität Dresden, 2017. https://tud.qucosa.de/id/qucosa%3A30798.
Texte intégralYu, Jenn-Hwa. « Probabilistic analysis of some search algorithms / ». The Ohio State University, 1990. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487683756126241.
Texte intégralMapuranga, Victor Philip. « Probabilistic seismic hazard analysis for Zimbabwe ». Diss., University of Pretoria, 2014. http://hdl.handle.net/2263/43166.
Texte intégralDissertation (MSc)--University of Pretoria, 2014.
Physics
MSc
Unrestricted
SAVELY, JAMES PALMER. « PROBABILISTIC ANALYSIS OF FRACTURED ROCK MASSES ». Diss., The University of Arizona, 1987. http://hdl.handle.net/10150/184249.
Texte intégralBlakely, Scott. « Probabilistic Analysis for Reliable Logic Circuits ». PDXScholar, 2014. https://pdxscholar.library.pdx.edu/open_access_etds/1860.
Texte intégralHohn, Jennifer Lynn. « Generalized Probabilistic Bowling Distributions ». TopSCHOLAR®, 2009. http://digitalcommons.wku.edu/theses/82.
Texte intégralMason, Dave. « Probabilistic Program Analysis for Software Component Reliability ». Thesis, University of Waterloo, 2002. http://hdl.handle.net/10012/1059.
Texte intégralFeng, Jianwen. « Probabilistic modelling of heterogeneous media ». Thesis, Swansea University, 2013. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.644724.
Texte intégralNicola, Jérémy. « Robust, precise and reliable simultaneous localization and mapping for and underwater robot. Comparison and combination of probabilistic and set-membership methods for the SLAM problem ». Thesis, Brest, 2017. http://www.theses.fr/2017BRES0066/document.
Texte intégralIn this thesis, we work on the problem of simultaneously localizing an underwater robot while mapping a set of acoustic beacons lying on the seafloor, using an acoustic range-meter and an inertial navigation system. We focus on the two main approaches classically used to solve this type of problem: Kalman filtering and set-membership filtering using interval analysis. The Kalman filter is optimal when the state equations of the robot are linear and the noises are additive, white and Gaussian. The interval-based filter does not model uncertainties in a probabilistic framework and makes only one assumption about their nature: they are bounded. Moreover, the interval-based approach allows the uncertainties to be propagated rigorously, even when the equations are non-linear. This results in highly reliable set estimates, at the cost of reduced precision. We show that in a subsea context, when the robot is equipped with a high-precision inertial navigation system, part of the SLAM equations can reasonably be seen as linear with additive Gaussian noise, making it the ideal playground for a Kalman filter. On the other hand, the equations related to the acoustic range-meter are much more problematic: the system is not observable, the equations are non-linear, and outliers are frequent. These conditions are ideal for a set-based approach using interval analysis. By taking advantage of the properties of Gaussian noises, this thesis reconciles the probabilistic and set-membership treatment of uncertainties for both linear and non-linear systems with additive Gaussian noises. By reasoning geometrically, we are able to express the part of the Kalman filter equations linked to the dynamics of the vehicle in a set-membership context. In the same way, a more rigorous and precise treatment of uncertainties is described for the part of the Kalman filter linked to the range measurements.
These two tools can then be combined to obtain a SLAM algorithm that is reliable, precise and robust. Some of the methods developed during this thesis are demonstrated on real data.
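The contrast the abstract draws between the two filters can be sketched in a few lines. The scalar functions below are illustrative simplifications (a 1-D state with an identity measurement model), not the thesis's actual algorithms.

```python
# Hedged sketch: scalar Kalman update (probabilistic) vs. interval
# contraction (set-membership) for a single range-like measurement.

def kalman_update(x, p, z, r):
    """Scalar Kalman update: prior mean x, variance p, measurement z, noise var r."""
    k = p / (p + r)                      # Kalman gain
    return x + k * (z - x), (1 - k) * p  # posterior mean, posterior variance

def interval_update(lo, hi, z, e):
    """Set-membership update: intersect prior [lo, hi] with [z - e, z + e]."""
    return max(lo, z - e), min(hi, z + e)

x, p = kalman_update(10.0, 4.0, 12.0, 1.0)
lo, hi = interval_update(8.0, 14.0, 12.0, 1.5)
print(x, p)    # posterior mean and variance
print(lo, hi)  # contracted interval
```

The Kalman posterior is a single Gaussian summary, while the interval result is a guaranteed enclosure under the bounded-error assumption, mirroring the precision/reliability trade-off described above.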
Mason, David Victor. « Probabilistic program analysis for software component reliability ». Waterloo, Ont. : University of Waterloo, 2002. http://etd.uwaterloo.ca/etd/dmason2002.pdf.
Texte intégral"A thesis presented to the University of Waterloo in fulfilment of the thesis requirement for the degree of Doctor of Philosophy in Computer Science". Includes bibliographical references. Also available in microfiche format.
Milutinovic, Suzana. « On the limits of probabilistic timing analysis ». Doctoral thesis, Universitat Politècnica de Catalunya, 2019. http://hdl.handle.net/10803/668475.
Texte intégralIn recent years, the critical real-time embedded systems (CRTES) industry, for example the aeronautics and automotive industries, has experienced rapid and sustained growth. In the near future, many of the complex functionalities currently implemented through mechanical systems in CRTES will be controlled by critical software. This trend has two clear consequences. First, the size and complexity of the software will increase with every new embedded product brought to market. Second, hardware techniques aimed at high performance (for example, cache memories) will be used more frequently in real-time processors. The increase in CRTES complexity poses a challenge for processor validation and verification processes, an essential step in certifying that systems can be marketed safely. Timing validation and verification includes worst-case execution time (WCET) estimation, which must be both precise and sound. Unfortunately, traditional timing analysis approaches struggle to analyze the complex interactions between software and hardware, producing poor, overly conservative WCET estimates. To overcome this limitation, new techniques must emerge that help provide tighter WCET estimates in a sound and automated way. This thesis advances the research on measurement-based probabilistic timing analysis (MBPTA), whose first implementations show potential for obtaining precise and sound WCET estimates for tasks executed on complex systems.
First, we propose a methodology to certify that all layouts in the cache, one of the most complex structures in CRTES, have been adequately accounted for during the WCET estimation process. Second, we present a solution to achieve both cache representativeness and full coverage of the program's critical paths. This solution guarantees that the obtained WCET estimate is valid for all execution paths, regardless of how code and data are laid out in the cache. Finally, we analyze and discuss the main misconceptions and obstacles that can prevent the applicability of WCET analysis based on extreme value theory, which is part of MBPTA.
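The extreme-value step at the core of MBPTA can be sketched as follows. The data are synthetic, and the distribution choice (Gumbel) and exceedance probability are illustrative assumptions, not the thesis's protocol; real analyses must first validate the i.i.d. properties of the measurements.

```python
# Hedged sketch: fitting an extreme value distribution to execution-time
# measurements to extrapolate a probabilistic WCET (pWCET) bound.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic execution times (cycles): a baseline plus cache-induced jitter.
times = 10_000 + rng.gamma(shape=2.0, scale=150.0, size=10_000)

# Block maxima: the maximum of each block of 100 runs.
maxima = times.reshape(-1, 100).max(axis=1)

# Fit a Gumbel (Type I extreme value) distribution to the maxima.
loc, scale = stats.gumbel_r.fit(maxima)

# pWCET estimate at an exceedance probability of 1e-6 per block.
pwcet = stats.gumbel_r.isf(1e-6, loc=loc, scale=scale)
print(f"pWCET estimate: {pwcet:.0f} cycles")
```

The fitted tail is extrapolated far beyond the observed maxima, which is exactly why the representativeness and coverage questions discussed in the thesis matter.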
Li, Bin. « Integrating Software into PRA (Probabilistic Risk Analysis) ». College Park, Md. : University of Maryland, 2004. http://hdl.handle.net/1903/1993.
Texte intégralThesis research directed by: Reliability Engineering. Title from t.p. of PDF. Includes bibliographical references. Published by UMI Dissertation Services, Ann Arbor, Mich. Also available in paper.
Rexhepi, Astrit. « Motion analysis using probabilistic and statistical reasoning ». Thesis, University of Surrey, 2007. http://epubs.surrey.ac.uk/843205/.
Texte intégralRejimon, Thara. « Reliability-centric probabilistic analysis of VLSI circuits ». [Tampa, Fla] : University of South Florida, 2006. http://purl.fcla.edu/usf/dc/et/SFE0001707.
Texte intégralHughes, Nicholas Peter. « Probabilistic models for automated ECG interval analysis ». Thesis, University of Oxford, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.433475.
Texte intégralWojtczak, Dominik. « Recursive probabilistic models : efficient analysis and implementation ». Thesis, University of Edinburgh, 2009. http://hdl.handle.net/1842/3217.
Texte intégralWallace, William Frederick. « Design and analysis of probabilistic carry adders ». Thesis, University of Newcastle Upon Tyne, 2002. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.247875.
Texte intégralHo, K. H. L. « Probabilistic scene analysis of two dimensional images ». Thesis, University of Bristol, 1991. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.303747.
Texte intégralGUEDES, MARIA CECILIA SAFADY. « DISCUSSION ON PROBABILISTIC ANALYSIS OF SLOPE STABILITY ». PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO, 1997. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=1924@1.
Texte intégral
Some aspects of probabilistic stability analysis in geotechnical engineering are studied in this thesis. A summary of the basic concepts of probability and statistics used throughout the work is presented. The methodology for obtaining the data needed for probabilistic analysis is described, including the quantity and location of samples, the computation of the means and variances of soil properties, and the quantification of the uncertainties in these values. The procedures of the three probabilistic methods most used in geotechnics are presented, with special emphasis on the first-order second-moment (FOSM) method. Probabilistic analyses are performed considering, separately, variations of the height and inclination of a mine slope under drained conditions. The application of probabilistic analysis to a breakwater on a soft clay deposit under undrained conditions is also presented.
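The FOSM procedure described in the abstract can be sketched numerically: propagate the means and variances of the soil parameters through a performance function via a first-order Taylor expansion. The simplified infinite-slope safety-factor expression and all parameter values below are illustrative assumptions, not data from the thesis.

```python
# Hedged sketch of the first-order second-moment (FOSM) method.
# Safety-factor formula and parameter values are invented for illustration.
import math

def safety_factor(c, phi_deg, gamma=18.0, H=10.0, beta_deg=30.0):
    """Simplified infinite-slope factor of safety (illustrative only)."""
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    return (c / (gamma * H * math.sin(beta) * math.cos(beta))
            + math.tan(phi) / math.tan(beta))

means = {"c": 25.0, "phi_deg": 32.0}       # cohesion (kPa), friction angle (deg)
variances = {"c": 25.0, "phi_deg": 9.0}    # assumed parameter variances

F_mean = safety_factor(**means)
# First-order variance: sum of (dF/dx_i)^2 * Var[x_i], partials taken numerically.
var_F = 0.0
for name in means:
    h = 1e-4
    hi_args = dict(means); hi_args[name] += h
    lo_args = dict(means); lo_args[name] -= h
    dF = (safety_factor(**hi_args) - safety_factor(**lo_args)) / (2 * h)
    var_F += dF ** 2 * variances[name]

beta_index = (F_mean - 1.0) / math.sqrt(var_F)   # reliability index vs. F = 1
print(F_mean, var_F, beta_index)
```

The reliability index summarizes how many standard deviations separate the mean safety factor from failure, which is the quantity FOSM-based designs typically report.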