Dissertations / Theses on the topic 'Probability theory'

Consult the top 50 dissertations / theses for your research on the topic 'Probability theory.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Halliwell, Joe. "Linguistic probability theory." Thesis, University of Edinburgh, 2008. http://hdl.handle.net/1842/29135.

Full text
Abstract:
A theory of linguistic probabilities is developed, patterned after the standard Kolmogorov axioms of probability theory. Since fuzzy numbers lack algebraic inverses, the resulting theory is weaker than, but generalizes, its classical counterpart. Nevertheless, it is demonstrated that analogues of classical probabilistic concepts such as conditional probability and random variables can be constructed. In the classical theory, representation theorems mean that most of the time the distinction between mass/density distributions and probability measures can be ignored. Similar results are proven for linguistic probabilities. From these results it is shown that directed acyclic graphs annotated with linguistic probabilities (under certain identified conditions) represent systems of linguistic random variables. It is then demonstrated that these linguistic Bayesian networks can utilize adapted best-of-breed Bayesian network algorithms (junction-tree-based inference and Bayes' ball irrelevancy calculation). These algorithms are implemented in Arbor, an interactive design, editing and querying tool for linguistic Bayesian networks. To explore the applications of these techniques, a realistic example drawn from the domain of forensic statistics is developed. In this domain the knowledge-engineering problems cited above are especially pronounced and expert estimates are commonplace. Moreover, robust conclusions are of unusually critical importance. An analysis of the resulting linguistic Bayesian network for assessing evidential support in glass-transfer scenarios highlights the potential utility of the approach.
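Classical Bayesian networks, which the linguistic networks above generalize, answer queries by multiplying and renormalizing point probabilities. A minimal sketch of that classical baseline, with hypothetical numbers (the thesis's point is precisely to replace such point values with fuzzy linguistic probabilities):

```python
# Two-node network A -> B with classical point probabilities (hypothetical values).
p_a = {0: 0.7, 1: 0.3}                     # P(A)
p_b_given_a = {0: {0: 0.9, 1: 0.1},        # P(B | A=0)
               1: {0: 0.2, 1: 0.8}}        # P(B | A=1)

def posterior_a_given_b(b):
    """P(A | B=b) by enumeration: multiply along the arc, then renormalize."""
    joint = {a: p_a[a] * p_b_given_a[a][b] for a in (0, 1)}
    z = sum(joint.values())
    return {a: v / z for a, v in joint.items()}

post = posterior_a_given_b(1)   # observing B=1 raises belief in A=1
```

Junction-tree inference performs the same multiplications and renormalizations organized over cliques of the graph; since fuzzy numbers lack algebraic inverses, the renormalizing division is what the adapted algorithms must work around.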
APA, Harvard, Vancouver, ISO, and other styles
2

Youmbi, Norbert. "Probability theory on semihypergroups." [Tampa, Fla.] : University of South Florida, 2005. http://purl.fcla.edu/fcla/etd/SFE0001201.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Sorokin, Yegor. "Probability theory, fourier transform and central limit theorem." Manhattan, Kan. : Kansas State University, 2009. http://hdl.handle.net/2097/1604.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Johns, Richard. "A theory of physical probability." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape9/PQDD_0027/NQ38907.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Perlin, Alex 1974. "Probability theory on Galton-Watson trees." Thesis, Massachusetts Institute of Technology, 2001. http://hdl.handle.net/1721.1/8673.

Full text
Abstract:
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Mathematics, 2001.
Includes bibliographical references (p. 91).
By a Galton-Watson tree T we mean an infinite rooted tree that starts with one node and in which each node has a random number of children, independently of the rest of the tree. In the first chapter of this thesis, we prove a conjecture made in [7] for Galton-Watson trees whose vertices have a bounded number of children not equal to 1. The conjecture states that the electric conductance of such a tree has a continuous distribution. In the second chapter, we study rays in Galton-Watson trees. We establish what concentration of vertices with a given number of children is possible along a ray in a typical tree. We also gauge the size of the collection of all rays with given concentrations of vertices of given degrees.
by Alex Perlin.
Ph.D.
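The branching mechanism in the abstract is straightforward to simulate. A rough sketch under assumed parameters (the offspring distribution below is an arbitrary illustration of the abstract's setting, in which the number of children is bounded and never equal to 1):

```python
import random

def simulate_generations(offspring_dist, max_gen=20):
    """Population sizes Z_0, Z_1, ... of a Galton-Watson tree; each node
    draws its number of children independently from offspring_dist,
    given as a list of (children, probability) pairs."""
    sizes = [1]
    for _ in range(max_gen):
        if sizes[-1] == 0:
            break
        total = 0
        for _ in range(sizes[-1]):          # each node reproduces independently
            r, acc = random.random(), 0.0
            for children, p in offspring_dist:
                acc += p
                if r < acc:
                    total += children
                    break
        sizes.append(total)
    return sizes

random.seed(1)
# Children bounded and never equal to 1; mean offspring is 1.2, so the
# process is supercritical and survives forever with positive probability.
dist = [(0, 0.5), (2, 0.3), (3, 0.2)]
extinct = sum(simulate_generations(dist)[-1] == 0 for _ in range(4000)) / 4000
```

Solving the fixed-point equation f(q) = 0.5 + 0.3q² + 0.2q³ = q gives an eventual extinction probability of about 0.77 for this distribution, which the estimate above approaches from below (a few lineages die out only after generation 20).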
APA, Harvard, Vancouver, ISO, and other styles
6

Wang, Jiun-Chau. "Limit theorems in noncommutative probability theory." [Bloomington, Ind.] : Indiana University, 2008. http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqdiss&rft_dat=xri:pqdiss:3331258.

Full text
Abstract:
Thesis (Ph.D.)--Indiana University, Dept. of Mathematics, 2008.
Title from PDF t.p. (viewed on Jul 27, 2009). Source: Dissertation Abstracts International, Volume: 69-11, Section: B, page: 6852. Adviser: Hari Bercovici.
APA, Harvard, Vancouver, ISO, and other styles
7

Burns, Jonathan. "Recursive Methods in Number Theory, Combinatorial Graph Theory, and Probability." Scholar Commons, 2014. https://scholarcommons.usf.edu/etd/5193.

Full text
Abstract:
Recursion is a fundamental tool of mathematics used to define, construct, and analyze mathematical objects. This work employs induction, sieving, inversion, and other recursive methods to solve a variety of problems in the areas of algebraic number theory, topological and combinatorial graph theory, and analytic probability and statistics. A common theme of recursively defined functions, weighted sums, and cross-referencing sequences arises in all three contexts, and is supplemented by sieving methods, generating functions, asymptotics, and heuristic algorithms. In the area of number theory, this work generalizes the sieve of Eratosthenes to a sequence of polynomial values, called polynomial-value sieving. In the case of quadratics, the method of polynomial-value sieving may be characterized briefly as a product presentation of two binary quadratic forms. Polynomials for which the polynomial-value sieving yields all possible integer factorizations of the polynomial values are called recursively-factorable. The Euler and Legendre prime-producing polynomials of the form n² + n + p and 2n² + p, respectively, and Landau's n² + 1 are shown to be recursively-factorable. Integer factorizations realized by the polynomial-value sieving method, applied to quadratic functions, are in direct correspondence with the lattice point solutions (X, Y) of the conic sections aX² + bXY + cY² + X - nY = 0. The factorization structure of the underlying quadratic polynomial is shown to have geometric properties in the space of the associated lattice point solutions of these conic sections. In the area of combinatorial graph theory, this work considers two topological structures that are used to model the process of homologous genetic recombination: assembly graphs and chord diagrams. The result of a homologous recombination can be recorded as a sequence of signed permutations called a micronuclear arrangement.
In the assembly graph model, each micronuclear arrangement corresponds to a directed Hamiltonian polygonal path within a directed assembly graph. Starting from a given assembly graph, we construct all the associated micronuclear arrangements. Another way of modeling genetic rearrangement is to represent precursor and product genes as a sequence of blocks which form arcs of a circle. Associating matching blocks in the precursor and product gene with chords produces a chord diagram. The braid index of a chord diagram can be used to measure the scope of interaction between the crossings of the chords. We augment the brute force algorithm for computing the braid index to utilize a divide and conquer strategy. Both assembly graphs and chord diagrams are closely associated with double occurrence words, so we classify and enumerate the double occurrence words based on several notions of irreducibility. In the area of analytic probability, moments abstractly describe the shape of a probability distribution. Over the years, numerous varieties of moments such as central moments, factorial moments, and cumulants have been developed to assist in statistical analysis. We use inversion formulas to compute high order moments of various types for common probability distributions, and show how the successive ratios of moments can be used for distribution and parameter fitting. We consider examples for both simulated binomial data and the probability distribution affiliated with the braid index counting sequence. Finally we consider a sequence of multiparameter binomial sums which shares similar properties with the moment sequences generated by the binomial and beta-binomial distributions. This sequence of sums behaves asymptotically like the high order moments of the beta distribution, and has completely monotonic properties.
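For reference, the classical sieve of Eratosthenes that polynomial-value sieving generalizes fits in a few lines, and can be used to check the celebrated behaviour of Euler's polynomial n² + n + 41 (the p = 41 member of the family mentioned in the abstract), which takes prime values for n = 0 through 39:

```python
def eratosthenes(limit):
    """Return all primes <= limit by repeatedly crossing off multiples."""
    is_prime = [True] * (limit + 1)
    is_prime[0:2] = [False, False]
    p = 2
    while p * p <= limit:
        if is_prime[p]:
            for m in range(p * p, limit + 1, p):
                is_prime[m] = False
        p += 1
    return [i for i, b in enumerate(is_prime) if b]

primes = set(eratosthenes(2000))
# Euler's polynomial is prime for n = 0..39, but fails at n = 40
# (40^2 + 40 + 41 = 41^2).
euler_values_prime = all(n * n + n + 41 in primes for n in range(40))
```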
APA, Harvard, Vancouver, ISO, and other styles
8

Christopher, Fisher Ryan. "Are people naive probability theorists? An examination of the probability theory + variation model." Miami University / OhioLINK, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=miami1406657670.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Tarrago, Pierre. "Non-commutative generalization of some probabilistic results from representation theory." Thesis, Paris Est, 2015. http://www.theses.fr/2015PESC1123/document.

Full text
Abstract:
The subject of this thesis is the non-commutative generalization of some probabilistic results that occur in representation theory. The results of the thesis are divided into three different parts. In the first part of the thesis, we classify all unitary easy quantum groups whose intertwiner spaces are described by non-crossing partitions, and develop the Weingarten calculus on these quantum groups. As an application of the previous work, we recover the results of Diaconis and Shahshahani on the unitary group and extend those results to the free unitary group. In the second part of the thesis, we study the free wreath product. First, we study the free wreath product with the free symmetric group by giving a description of the intertwiner spaces: several probabilistic results are deduced from this description. Then, we relate the intertwiner spaces of a free wreath product with the free product of planar algebras, an object which has been defined by Bisch and Jones. This relation allows us to prove the conjecture of Banica and Bichon. In the last part of the thesis, we prove that the minimal and the Martin boundaries of a graph introduced by Gnedin and Olshanski are the same. In order to prove this, we give some precise estimates on the uniform standard filling of a large ribbon Young diagram. This yields several asymptotic results on the filling of large ribbon Young diagrams.
APA, Harvard, Vancouver, ISO, and other styles
10

McGillivray, Ivor Edward. "Some applications of Dirichlet forms in probability theory." Thesis, University of Cambridge, 1992. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.241102.

Full text
APA, Harvard, Vancouver, ISO, and other styles
11

Lundquist, Anders. "Contributions to the theory of unequal probability sampling." Doctoral thesis, Umeå : Department of Mathematics and Mathematical Statistics, Umeå University, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-22459.

Full text
APA, Harvard, Vancouver, ISO, and other styles
12

Narayanan, Bhargav. "Problems in Ramsey theory, probabilistic combinatorics and extremal graph theory." Thesis, University of Cambridge, 2015. https://www.repository.cam.ac.uk/handle/1810/252850.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

Bright, Leslie William. "Matrix-analytic methods in applied probability /." Title page, table of contents and abstract only, 1996. http://web4.library.adelaide.edu.au/theses/09PH/09phb855.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
14

Palafox, Damian. "THINKING POKER THROUGH GAME THEORY." CSUSB ScholarWorks, 2016. https://scholarworks.lib.csusb.edu/etd/314.

Full text
Abstract:
Poker is a complex game to analyze. In this project we will use the mathematics of game theory to solve some simplified variations of the game. Probability is the building block behind game theory. We must understand a few concepts from probability, such as distributions, expected value, variance, and enumeration methods, to aid us in studying game theory. We will solve and analyze games through game theory by using different decision methods, decision trees, and the process of domination and simplification. Poker models, with and without cards, will be provided to illustrate optimal strategies. Extensions to those models will be presented, and we will show that optimal strategies still exist. Finally, we will close this paper with an original extension that can be used as a medium for creating further extensions and/or different games to explore.
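The expected-value machinery described above already solves any 2x2 zero-sum game without a saddle point in closed form, via the indifference principle. A small sketch (matching pennies is an illustrative example, not a model from this project):

```python
from fractions import Fraction

def solve_2x2(a, b, c, d):
    """Mixed-strategy solution of the 2x2 zero-sum game with row-player
    payoff matrix [[a, b], [c, d]], assuming no saddle point.  The row
    player mixes so the column player's expected loss is the same for
    either column (the indifference principle); v is the game's value."""
    denom = Fraction(a - b - c + d)
    p = Fraction(d - c) / denom          # probability of playing row 1
    v = Fraction(a * d - b * c) / denom  # expected payoff under optimal play
    return p, v

# Matching pennies: both players should mix 50/50 and the game is fair.
p, v = solve_2x2(1, -1, -1, 1)
```

Exact rational arithmetic (Fraction) keeps the strategy and value free of floating-point noise, which matters when comparing candidate strategies during domination arguments.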
APA, Harvard, Vancouver, ISO, and other styles
15

Stacey, Alan Martin. "Bounds on the critical probability in oriented percolation models." Thesis, University of Cambridge, 1994. https://www.repository.cam.ac.uk/handle/1810/251746.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Jafri, Syeda Rabab. "Operator inequalities and characterization with applications in probability theory." Thesis, University of Nottingham, 2011. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.539218.

Full text
APA, Harvard, Vancouver, ISO, and other styles
17

Levray, Amélie. "Interval-based possibility theory : conditioning and probability/possibility transformations." Thesis, Artois, 2017. http://www.theses.fr/2017ARTO0408/document.

Full text
Abstract:
This thesis contributes to the development of efficient formalisms to handle uncertain information. Existing formalisms such as probability theory or possibility theory are among the best-known and most widely used settings to represent such information. Extensions and generalizations (e.g. imprecise probability theory, interval-based possibilistic theory) have been provided to handle uncertainty such as incomplete and ill-known knowledge and reasoning with the knowledge of a group of experts. We are particularly interested in reasoning tasks within these theories, such as conditioning. The contributions of this thesis are divided into two parts. In the first part, we tackle conditioning in the interval-based possibilistic framework and the set-valued possibilistic framework. The purpose is to develop conditioning machinery for interval-based possibilistic logic. Conditioning in a standard possibilistic setting differs depending on whether we consider a qualitative or quantitative scale. Our work deals with both definitions of possibilistic conditioning. This leads us to investigate a new extension of possibilistic logic, defined as set-valued possibilistic logic, and its conditioning machinery in the qualitative possibilistic setting. These results, especially in terms of complexity, lead us to study transformations, more precisely from probability to possibility theories. The second part of our contributions deals with probability-possibility transformation procedures. Indeed, we analyze properties of reasoning tasks such as conditioning and marginalization. We also tackle transformations from imprecise probability theory to possibility theory with a particular interest in MAP inference.
APA, Harvard, Vancouver, ISO, and other styles
18

Port, Dan. "Polynomial maps with applications to combinatorics and probability theory." Thesis, Massachusetts Institute of Technology, 1994. http://hdl.handle.net/1721.1/28041.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Elamir, Elsayed Ali Habib. "Probability distribution theory, generalisations and applications of L-moments." Thesis, Durham University, 2001. http://etheses.dur.ac.uk/3987/.

Full text
Abstract:
In this thesis, we have studied L-moments and trimmed L-moments (TL-moments) which are both linear functions of order statistics. We have derived expressions for exact variances and covariances of sample L-moments and of sample TL-moments for any sample size n in terms of first and second-order moments of order statistics from small conceptual sample sizes, which do not depend on the actual sample size n. Moreover, we have established a theorem which characterises the normal distribution in terms of these second-order moments and the characterisation suggests a new test of normality. We have also derived a method of estimation based on TL-moments which gives zero weight to extreme observations. TL-moments have certain advantages over L-moments and method of moments. They exist whether or not the mean exists (for example the Cauchy distribution) and they are more robust to the presence of outliers. Also, we have investigated four methods for estimating the parameters of a symmetric lambda distribution: maximum likelihood method in the case of one parameter and L-moments, LQ-moments and TL-moments in the case of three parameters. The L-moments and TL-moments estimators are in closed form and simple to use, while numerical methods are required for the other two methods, maximum likelihood and LQ-moments. Because of the flexibility and the simplicity of the lambda distribution, it is useful in fitting data when, as is often the case, the underlying distribution is unknown. Also, we have studied the symmetric plotting position for quantile plot assuming a symmetric lambda distribution and conclude that the choice of the plotting position parameter depends upon the shape of the distribution. Finally, we propose exponentially weighted moving average (EWMA) control charts to monitor the process mean and dispersion using the sample L-mean and sample L-scale and charts based on trimmed versions of the same statistics. 
The proposed control charts limits are less influenced by extreme observations than classical EWMA control charts, and lead to tighter limits in the presence of out-of-control observations.
APA, Harvard, Vancouver, ISO, and other styles
20

Bowman, Christopher. "Applications of Bayesian probability theory in fusion data analysis." Thesis, University of York, 2016. http://etheses.whiterose.ac.uk/16978/.

Full text
Abstract:
Bayesian probability theory is a powerful tool for solving complex problems in experimental data analysis. In this thesis we explore the use of Bayesian methods in magnetic confinement fusion with an emphasis toward developing analysis tools and techniques. The original research content is presented in three chapters. In the first we develop a new approach to efficiently characterising multi-dimensional posterior distributions. This is achieved through an algorithm which, for any number of posterior dimensions, can decide which areas of the probability space contain significant information and evaluate only those areas. This addresses the computational challenges which arise in calculating marginal distributions from many-dimensional posteriors. In the second research chapter Bayesian probability theory is applied to the discrete Fourier-transform of an arbitrary real series containing random noise. The effect of the noise on the Fourier coefficients is used to derive a correction to the Fourier magnitudes, which results in a reduction in the overall noise-level after an inverse-transform. Calculating these corrections requires the solution of a challenging inverse problem which is discussed at length, and several methods for obtaining approximate solutions are developed and tested. The correction itself, plus the methods allowing its calculation together form the basis of a new technique for noise correction which is completely general, as no assumptions are made about the series which is to be corrected. In the final research chapter the inference of physics parameters using the DIII-D CER system is discussed. A Bayesian network approach is used to derive a model for the observed charge-exchange spectrum, which is itself used to construct a posterior distribution for the model parameters. The spectrum model is used to explore the possibility of inferring the time-evolution of physical parameters on sub-integration time-scales.
APA, Harvard, Vancouver, ISO, and other styles
21

Calhoun, Grayson Ford. "Limit theory for overfit models." Diss., [La Jolla] : University of California, San Diego, 2009. http://wwwlib.umi.com/cr/ucsd/fullcit?p3359804.

Full text
Abstract:
Thesis (Ph. D.)--University of California, San Diego, 2009.
Title from first page of PDF file (viewed July 23, 2009). Available via ProQuest Digital Dissertations. Vita. Includes bibliographical references (p. 104-109).
APA, Harvard, Vancouver, ISO, and other styles
22

Giamouridis, Daniel. "Implied probability distributions : estimation, testing and applications." Thesis, City University London, 2001. http://openaccess.city.ac.uk/8388/.

Full text
Abstract:
A relatively large number of authors have proposed alternative techniques for the estimation of implied risk-neutral densities. As a general rule, an assumption for a theoretical equilibrium option pricing model is made and, with the use of cross-sections of observed options prices, point estimates of the risk-neutral probability densities are obtained. The present study is primarily concerned with the estimation of implied risk-neutral densities by means of a semi-parametric Edgeworth Series Expansion probability model as an alternative to the widely criticized log-normal parameterization of the Black, Scholes and Merton model. Despite the relatively early introduction of this type of model in the academic literature in the early '80s, it was not until the mid '90s that people started showing interest in its applications. Moreover, no studies by means of the Edgeworth Series Expansion probability model have so far been conducted with American-style options. To this end, the present work initially develops the general theoretical framework and the numerical algorithm for the estimation of implied risk-neutral densities of the Edgeworth Series Expansion type from options prices. The technique is applicable to European options written on a generalized asset that pays dividends in continuous time or American futures options. The empirical part of the study considers data for the oil and interest-rate markets. The first task in the empirical investigation is to address general concerns with regard to the validity of an implied risk-neutral density estimation technique and its ability to stimulate meaningful discussion. To this end, the consistency of the Edgeworth Series Expansion type implied densities with the data is checked. This consistency is viewed in a broader sense: internal consistency (adequate fit to observed data) and economic rationale of the respective densities.
An analysis is, therefore, performed to examine the properties of the implied densities in the presence of large changes in economic conditions. More specifically, the ability of the Edgeworth Series Expansion type implied densities to capture speculation over future eventualities and their capacity to immediately reflect changes in market sentiment are examined. Finally, motivated by existing concerns in the literature, the study examines whether the differences between the estimates from an alternative parameterization and the log-normal Black-Scholes-Merton parameterization are merely apparent (a better fit to observed data) rather than significant.
APA, Harvard, Vancouver, ISO, and other styles
23

Guglielmetti, Fabrizia. "Background-Source separation in astronomical images with Bayesian Probability Theory." Diss., lmu, 2010. http://nbn-resolving.de/urn:nbn:de:bvb:19-127320.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Månsson, Anders. "Quantum State Analysis : Probability theory as logic in Quantum mechanics." Doctoral thesis, KTH, Mikroelektronik och tillämpad fysik, MAP, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-4417.

Full text
Abstract:
Quantum mechanics is basically a mathematical recipe for constructing physical models. Historically its origin and main domain of application have been in the microscopic regime, although, strictly seen, it constitutes a general mathematical framework not limited to this regime. Since it is a statistical theory, the meaning and role of probabilities in it need to be defined and understood in order to gain an understanding of the predictions and validity of quantum mechanics. The interpretational problems of quantum mechanics are also connected with the interpretation of the concept of probability. In this thesis the use of probability theory as extended logic, in particular in the way it was presented by E. T. Jaynes, will be central. With this interpretation of probabilities they become a subjective notion, always dependent on one's state of knowledge or the context in which they are assigned, which has consequences for how things are to be viewed, understood and tackled in quantum mechanics. For instance, the statistical operator, or density operator, is usually defined in terms of probabilities and therefore also needs to be updated when the probabilities are updated by the acquisition of additional data. Furthermore, it is a context-dependent notion, meaning, e.g., that two observers will in general assign different statistical operators to the same phenomenon, which is demonstrated in the papers of the thesis. An alternative and conceptually clear approach to the problematic notion of "probabilities of probabilities", which is related to such things as probability distributions on statistical operators, is also presented. In connection to this, we consider concrete numerical applications of Bayesian quantum state assignment methods to a three-level quantum system, where prior knowledge and various kinds of measurement data are encoded into a statistical operator, which can then be used for deriving probabilities of other measurements.
The thesis also offers examples of an alternative quantum state assignment technique, using maximum entropy methods, which in some cases are compared with the Bayesian quantum state assignment methods. Finally, the interesting and important problem whether the statistical operator, or more generally quantum mechanics, gives a complete description of "objective physical reality" is considered. A related concern is here the possibility of finding a "local hidden-variable theory" underlying the quantum mechanical description. There have been attempts to prove that such a theory cannot be constructed, where the most well-known impossibility proof claiming to show this was given by J. S. Bell. In connection to this, the thesis presents an idea for an interpretation or alternative approach to quantum mechanics based on the concept of space-time.
APA, Harvard, Vancouver, ISO, and other styles
25

Månsson, Anders. "Quantum state analysis : probability theory as logic in Quantum mechanics /." Stockholm : Department of Microelectronics and Applied Physics, Royal Institute of Technology, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-4417.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Kart, Özlem. "A Historical Survey of the Development of Classical Probability Theory." Thesis, Uppsala universitet, Analys och sannolikhetsteori, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-359774.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Spencer, Steven Robert. "Renewal theory for uniform random variables." CSUSB ScholarWorks, 2002. https://scholarworks.lib.csusb.edu/etd-project/2248.

Full text
Abstract:
This project will focus on finding formulas for E[N(t)] using one of the classical problems in the discipline first, and then extending the scope of the problem to include overall times greater than the time t in the original problem. The expected values in these cases will be found using the uniform and exponential distributions of random variables.
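For Uniform(0,1) inter-arrival times the renewal function has the well-known closed form E[N(t)] = e^t - 1 on 0 <= t <= 1, which a quick Monte Carlo sketch can confirm (sample size and seed below are arbitrary):

```python
import math
import random

def renewal_count(t):
    """N(t): the number of renewals whose arrival time S_n = X_1 + ... + X_n
    falls at or before t, with Uniform(0,1) inter-arrival times X_i."""
    s, n = 0.0, 0
    while True:
        s += random.random()
        if s > t:
            return n
        n += 1

random.seed(0)
reps = 20000
estimate = sum(renewal_count(1.0) for _ in range(reps)) / reps
# Closed form at t = 1: E[N(1)] = e - 1, approximately 1.718
```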
APA, Harvard, Vancouver, ISO, and other styles
28

Jónsson, Ragner H. "Adaptive subband coding of video using probability distribution models." Diss., Georgia Institute of Technology, 1994. http://hdl.handle.net/1853/14453.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Ha, Cong Loc. "Time-dependent reliability analysis for deteriorating structures using imprecise probability theory." Thesis, The University of Sydney, 2017. http://hdl.handle.net/2123/17731.

Full text
Abstract:
Reliability analysis, which takes uncertainties into account, is considered the best tool for modern structural evaluation. In this assessment the deterioration model is one of the most important factors, but it is difficult to model because of the inherent uncertainties in the deterioration process. Theoretically, these uncertainties can be modelled using a probabilistic approach. In practice, however, it is difficult to identify a probabilistic model for the deterioration process because actual deterioration data are rather limited; moreover, the dependencies between different uncertainties are often ignored. The present study therefore proposes a probabilistic analysis framework for modelling the deterioration process under incomplete information, using dependent p-boxes in which copulas describe the dependence. The framework has two main parts. Firstly, the theory of statistical inference is developed for the quantification of uncertainties and their dependence structure. Secondly, simulation techniques for structural reliability analysis are developed: two approaches, interval Monte Carlo simulation and importance sampling, are integrated to propagate the dependent p-boxes. The accuracy and efficiency of the framework are verified through numerical examples, and the framework is then applied to the proposed deterioration models. Because different mechanisms are involved, deterioration models for steel structures and reinforced concrete (RC) structures are considered separately. The findings suggest that significant epistemic uncertainties exist in current deterioration models owing to the limited availability of reliable corrosion data. In addition, a new dependence structure, described by the Frank copula, is identified in the deterioration models of steel and RC structures.
In summary, the proposed framework is recommended as a useful tool for modelling the uncertain corrosion process, accounting for both aleatory and epistemic uncertainties. Measurement inaccuracy and insufficient data are both taken into account in the modelling of the uncertainties and their dependence structure.
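The interval Monte Carlo idea described in this abstract can be illustrated with a toy sketch. Everything below (the limit-state function, the Gaussian input, and the fixed epistemic half-width standing in for a p-box focal element) is invented for illustration and is unrelated to the thesis's corrosion models:

```python
import random

random.seed(1)

def g(x):
    # Hypothetical limit-state function: the structure fails when g(x) < 0.
    return 3.0 - x

N = 10_000
surely, possibly = 0, 0
for _ in range(N):
    u = random.gauss(2.0, 1.0)        # aleatory sample
    x_lo, x_hi = u - 0.5, u + 0.5     # epistemic interval around the sample
    # g is monotone, so checking the interval endpoints suffices here.
    if g(x_lo) < 0 and g(x_hi) < 0:   # fails for every value in the interval
        surely += 1
    if g(x_lo) < 0 or g(x_hi) < 0:    # fails for some value in the interval
        possibly += 1

# The failure probability is reported as an interval [pf_low, pf_high]
# rather than a single number, reflecting the epistemic uncertainty.
pf_low, pf_high = surely / N, possibly / N
```

In a full analysis the crude sampling loop would be replaced by importance sampling, as the abstract notes.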
APA, Harvard, Vancouver, ISO, and other styles
30

Kousha, Termeh. "Topics in Random Matrices: Theory and Applications to Probability and Statistics." Thèse, Université d'Ottawa / University of Ottawa, 2011. http://hdl.handle.net/10393/20480.

Full text
Abstract:
In this thesis, we discuss some topics in random matrix theory which have applications to probability, statistics and quantum information theory. In Chapter 2, by relying on the spectral properties of an associated adjacency matrix, we find the distribution of the maximum of a Dyck path and show that it has the same distribution function as the unsigned Brownian excursion which was first derived in 1976 by Kennedy. We obtain a large and moderate deviation principle for the law of the maximum of a random Dyck path. Our result extends the results of Chung, Kennedy and Khorunzhiy and Marckert. In Chapter 3, we discuss a method of sampling called the Gibbs-slice sampler. This method is based on Neal's slice sampling combined with Gibbs sampling. In Chapter 4, we discuss several examples which have applications in physics and quantum information theory.
APA, Harvard, Vancouver, ISO, and other styles
31

Hudson, Derek Lavell. "Improving Accuracy in Microwave Radiometry via Probability and Inverse Problem Theory." Diss., CLICK HERE for online access, 2009. http://contentdm.lib.byu.edu/ETD/image/etd3244.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Balch, Michael Scott. "Methods for Rigorous Uncertainty Quantification with Application to a Mars Atmosphere Model." Diss., Virginia Tech, 2010. http://hdl.handle.net/10919/30115.

Full text
Abstract:
The purpose of this dissertation is to develop and demonstrate methods appropriate for the quantification and propagation of uncertainty in large, high-consequence engineering projects. The term "rigorous uncertainty quantification" refers to methods equal to the proposed task. The motivating practical example is uncertainty in a Mars atmosphere model due to the incompletely characterized presence of dust. The contributions made in this dissertation, though primarily mathematical and philosophical, are driven by the immediate needs of engineers applying uncertainty quantification in the field. Arguments are provided to explain how the practical needs of engineering projects like Mars lander missions motivate the use of the objective probability bounds approach, as opposed to the subjectivist theories which dominate uncertainty quantification in many research communities. An expanded formalism for Dempster-Shafer structures is introduced, allowing for the representation of continuous random variables and fuzzy variables as Dempster-Shafer structures. Then, the correctness and incorrectness of probability bounds analysis and the Cartesian product propagation method for Dempster-Shafer structures under certain dependency conditions are proven. It is also conclusively demonstrated that there exist some probability bounds problems in which the best-possible bounds on probability cannot be represented using Dempster-Shafer structures. Nevertheless, Dempster-Shafer theory is shown to provide a useful mathematical framework for a wide range of probability bounds problems. The dissertation concludes with the application of these new methods to the problem of propagating uncertainty from the dust parameters in a Mars atmosphere model to uncertainty in that model's prediction of atmospheric density. A thirty-day simulation of the weather at Holden Crater on Mars is conducted using a meso-scale atmosphere model, MRAMS.
Although this analysis only addresses one component of Mars atmosphere uncertainty, it demonstrates the applicability of probability bounds methods in practical engineering work. More importantly, the Mars atmosphere uncertainty analysis provides a framework in which to conclusively establish the practical importance of epistemology in rigorous uncertainty quantification.
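As a minimal illustration of the Dempster-Shafer structures discussed in this abstract, a structure over the real line can be stored as focal intervals with masses, from which bounds on the CDF (belief and plausibility of the event X ≤ x) follow directly. The focal elements below are made up for illustration and are not part of the dissertation's expanded formalism:

```python
# Focal intervals (a, b) with their probability masses (masses sum to 1).
focal = [((0.0, 2.0), 0.5), ((1.0, 3.0), 0.3), ((2.0, 4.0), 0.2)]

def cdf_bounds(x):
    # Belief of (-inf, x]: mass of focal elements contained in the event.
    lower = sum(m for (a, b), m in focal if b <= x)
    # Plausibility of (-inf, x]: mass of focal elements intersecting it.
    upper = sum(m for (a, b), m in focal if a <= x)
    return lower, upper
```

For example, `cdf_bounds(2.0)` returns the bounds `(0.5, 1.0)` on P(X ≤ 2): the belief/plausibility pair is exactly the p-box evaluated at that point.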
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
33

Słowiński, Witold. "Autonomous learning of domain models from probability distribution clusters." Thesis, University of Aberdeen, 2014. http://digitool.abdn.ac.uk:80/webclient/DeliveryManager?pid=211059.

Full text
Abstract:
Nontrivial domains can be difficult to understand and the task of encoding a model of such a domain can be difficult for a human expert, which is one of the fundamental problems of knowledge acquisition. Model learning provides a way to address this problem by allowing a predictive model of the domain's dynamics to be learnt algorithmically, without human supervision. Such models can provide insight about the domain to a human or aid in automated planning or reinforcement learning. This dissertation addresses the problem of how to learn a model of a continuous, dynamic domain, from sensory observations, through the discretisation of its continuous state space. The learning process is unsupervised in that there are no predefined goals, and it assumes no prior knowledge of the environment. Its outcome is a model consisting of a set of predictive cause-and-effect rules which describe changes in related variables over brief periods of time. We present a novel method for learning such a model, which is centred around the idea of discretising the state space by identifying clusters of uniform density in the probability density function of variables, which correspond to meaningful features of the state space. We show that using this method it is possible to learn models exhibiting predictive power. Secondly, we show that applying this discretisation process to two-dimensional vector variables in addition to scalar variables yields a better model than only applying it to scalar variables and we describe novel algorithms and data structures for discretising one- and two-dimensional spaces from observations. Finally, we demonstrate that this method can be useful for planning or decision making in some domains where the state space exhibits stable regions of high probability and transitional regions of lesser probability. 
We provide evidence for these claims by evaluating the model learning algorithm in two dynamic, continuous domains involving simulated physics: the OpenArena computer game and a two-dimensional simulation of a bouncing ball falling onto uneven terrain.
APA, Harvard, Vancouver, ISO, and other styles
34

Ong, Chong Tean. "On the undetected error probability of linear codes." Thesis, University of British Columbia, 1990. http://hdl.handle.net/2429/29722.

Full text
Abstract:
The probability of undetected error P_ud(ε) for the primitive triple-error-correcting BCH codes of blocklength 2^m − 1, used solely for error detection on a binary symmetric channel with crossover probability ε ≤ 1/2, is examined. It is shown that for odd values of m, P_ud(ε) increases monotonically with ε. For even values of m, this is not necessarily true. However, for a fixed ε, as m increases, P_ud(ε) approaches 2^−p, where p is the number of parity bits. The extended double- and triple-error-correcting primitive BCH codes are also examined. The undetected error probability of these codes is shown to have similar characteristics to the non-extended cases. An improved upper bound on the probability of undetected error, valid for any linear code, is derived. This improved upper bound is compared with the Kasami upper bound for some classes of codes.
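To make the quantities in this abstract concrete: for any linear code with weight distribution {A_i}, the undetected error probability on a binary symmetric channel is P_ud(ε) = Σ_{i>0} A_i ε^i (1 − ε)^(n−i). The sketch below evaluates this for the [7,4] Hamming code, chosen for its tiny weight distribution; it is not one of the BCH codes studied in the thesis:

```python
n, k = 7, 4                # [7,4] Hamming code; p = n - k = 3 parity bits
A = {3: 7, 4: 7, 7: 1}     # nonzero weights of its weight distribution

def p_undetected(eps):
    # An error goes undetected exactly when the error pattern coincides
    # with a nonzero codeword.
    return sum(a * eps**i * (1 - eps)**(n - i) for i, a in A.items())

# At eps = 1/2 every error pattern is equally likely, so
# p_undetected(0.5) == (2**k - 1) / 2**n = 15/128, just under 2**-p = 1/8.
```

For this code P_ud(ε) is monotone in ε on (0, 1/2], illustrating the behaviour the thesis proves for odd m.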
Applied Science, Faculty of
Electrical and Computer Engineering, Department of
Graduate
APA, Harvard, Vancouver, ISO, and other styles
35

Guinaudeau, Alexandre. "Estimating the probability of event occurrence." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-246338.

Full text
Abstract:
In complex systems, anomalous behaviors can occur intermittently and stochastically. In such cases, it is hard to diagnose real errors among spurious ones. These errors are often hard to troubleshoot and require close attention, but troubleshooting each occurrence is time-consuming and not always an option. In this thesis, we define two models to estimate the underlying probability of occurrence of an error: one based on binary segmentation and null-hypothesis testing, the other on hidden Markov models. Given a threshold level of confidence, these models are tuned to trigger alerts when a change is detected with sufficiently high probability. To benchmark the two candidate models, we generated events drawn from Bernoulli distributions emulating these anomalous behaviors. Both models have the same sensitivity, δp ≈ 10%, and delay, δt ≈ 100 observations, in detecting change points. However, they do not generalize in the same way to broader problems and therefore provide two complementary solutions.
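A toy version of the change-point problem described in this abstract (a Bernoulli stream whose rate jumps) can be sketched with a CUSUM-style split statistic; this generic estimator stands in for, and is not, either of the thesis's two models:

```python
import random

random.seed(0)
# Bernoulli error stream whose underlying rate jumps from 0.1 to 0.3 at t = 500.
obs = [1 if random.random() < (0.1 if t < 500 else 0.3) else 0
       for t in range(1000)]

def cusum_change_point(x):
    # Locate the split t maximizing |S_t - (t/n) * S_n|: the largest
    # deviation of the partial sums from the no-change straight line.
    n, total = len(x), sum(x)
    best_t, best_stat, s = 0, -1.0, 0
    for t in range(1, n):
        s += x[t - 1]
        stat = abs(s - t * total / n)
        if stat > best_stat:
            best_t, best_stat = t, stat
    return best_t

t_hat = cusum_change_point(obs)   # lands in the vicinity of t = 500
```

With a rate jump of this size, the estimate typically falls within roughly a hundred observations of the true change point, consistent with the δt ≈ 100 delay reported in the abstract.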
APA, Harvard, Vancouver, ISO, and other styles
36

Pashley, Peter J. "The analysis of latency data using the inverse Gaussian distribution /." Thesis, McGill University, 1987. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=75343.

Full text
Abstract:
The inverse Gaussian distribution is investigated as a basis for statistical analyses of skewed and possibly censored response times. This distribution arises from a random walk process, is a member of the exponential family, and admits the sample arithmetic and harmonic means as complete sufficient statistics. In addition, the inverse Gaussian provides a reasonable alternative to the more commonly used lognormal statistical model due to the attractive properties of its parameter estimates.
Three modifications were made to the basic distribution definition: adding a shift parameter to account for minimum latencies, allowing for Type I censoring, and convoluting two inverse Gaussian random variables in order to model components of response times. Corresponding parameter estimation and large sample test procedures were also developed.
Results from analysing two extensive sets of simple and two-choice reaction times suggest that shifting the origin and accounting for Type I censoring can substantially improve the reliability of inverse Gaussian parameter estimates. The results also indicate that the convolution model provides a convenient medium for probing underlying psychological processes.
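The sufficiency of the arithmetic and harmonic means mentioned above shows up directly in the textbook maximum-likelihood estimates for the basic (unshifted, uncensored) inverse Gaussian IG(μ, λ): μ̂ is the sample mean, and 1/λ̂ is the mean of (1/xᵢ − 1/μ̂). A sketch on made-up latencies, not the thesis's data:

```python
data = [0.8, 1.1, 1.4, 0.9, 1.3, 2.0]   # hypothetical response times (s)
n = len(data)

mu_hat = sum(data) / n                               # arithmetic mean
lam_hat = n / sum(1 / x - 1 / mu_hat for x in data)  # uses the harmonic mean

# mu_hat == 1.25; lam_hat is roughly 13.2 for these values.
```

The estimates depend on the data only through the arithmetic and harmonic means, matching the complete sufficient statistics noted in the abstract.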
APA, Harvard, Vancouver, ISO, and other styles
37

Seidel, Karen. "Probabilistic communicating processes." Thesis, University of Oxford, 1992. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.306194.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Etheridge, Alison Mary. "Asymptotic behaviour of some measure-valued diffusions." Thesis, University of Oxford, 1989. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.329943.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Johnston, John C. "Bayesian analysis of inverse problems in physics." Thesis, University of Oxford, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.337737.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Ding, Xiqian, and 丁茜茜. "Some new statistical methods for a class of zero-truncated discrete distributions with applications." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2015. http://hdl.handle.net/10722/211126.

Full text
Abstract:
Count data without a zero category often occur in various fields. Examples include days of hospital stay for patients, numbers of publications for tenure-track faculty in a university, numbers of traffic violations for drivers during a certain period, and so on. A class of zero-truncated discrete models, such as the zero-truncated Poisson, zero-truncated binomial and zero-truncated negative-binomial distributions, has been proposed in the literature to model such count data. In this thesis, Chapter 1 first reviews a class of commonly used univariate zero-truncated discrete distributions. In Chapter 2, a unified method is proposed to derive the distribution of the sum of i.i.d. zero-truncated random variables, which has important applications in the construction of the shortest Clopper-Pearson confidence intervals of parameters of interest and in the calculation of the exact p-value of a two-sided test for small sample sizes in the one-sample problem. These problems are discussed in Section 2.4. Then a novel expectation-maximization (EM) algorithm is developed for calculating the maximum likelihood estimates (MLEs) of parameters in general zero-truncated discrete distributions. An important feature of the proposed EM algorithm is that the latent variables and the observed variables are independent, which is unusual in general EM-type algorithms. In addition, a unified minorization-maximization (MM) algorithm for obtaining the MLEs of parameters in a class of zero-truncated discrete distributions is provided. The first objective of Chapter 3 is to propose the multivariate zero-truncated Charlier series (ZTCS) distribution by developing its important distributional properties, and providing efficient MLE methods via a novel data augmentation in the framework of the EM algorithm.
Since the joint marginal distribution of any r-dimensional sub-vector of the m-dimensional multivariate ZTCS random vector is an r-dimensional zero-deflated Charlier series (ZDCS) distribution (1 ≤ r < m), the second objective of Chapter 3 is to propose a new family of multivariate zero-adjusted Charlier series (ZACS) distributions (including the multivariate ZDCS distribution as a special member) with a more flexible correlation structure by accounting for both inflation and deflation at zero. The corresponding distributional properties are explored and the associated MLE method via the EM algorithm is provided for analyzing correlated count data.
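For the simplest member of this class, the zero-truncated Poisson, the MLE of λ solves λ/(1 − e^(−λ)) = x̄, where x̄ is the sample mean of the truncated counts. A textbook fixed-point iteration (not the EM or MM algorithms developed in the thesis) suffices:

```python
import math

def ztp_mle(xbar, iters=200):
    # Iterate lam <- xbar * (1 - exp(-lam)); the map is a contraction
    # near the root for xbar > 1 (the zero-truncated mean always
    # exceeds 1), so the iteration converges to the MLE.
    lam = xbar
    for _ in range(iters):
        lam = xbar * (1 - math.exp(-lam))
    return lam

lam_hat = ztp_mle(2.0)   # MLE when the truncated sample mean is 2
```

The fixed point satisfies lam_hat / (1 − exp(−lam_hat)) = 2 to machine precision, i.e. the truncated-Poisson mean matches the sample mean.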
published_or_final_version
Statistics and Actuarial Science
Master
Master of Philosophy
APA, Harvard, Vancouver, ISO, and other styles
41

Russell, N. S. "Stochastic techniques for time series with applications to materials accountancy." Thesis, University of Southampton, 1985. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.356093.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Oldfield, Martin John. "Advances in probabilistic modelling." Thesis, University of Cambridge, 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.362928.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Baxter, Martin William. "Discounted functionals of Markov processes." Thesis, University of Cambridge, 1993. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.309008.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Mbah, Alfred Kubong. "On the theory of records and applications." [Tampa, Fla.] : University of South Florida, 2007. http://purl.fcla.edu/usf/dc/et/SFE0002216.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Sheehy, Anne. "Kullback-Leibler estimation of probability measures with an application to clustering /." Thesis, Connect to this title online; UW restricted, 1987. http://hdl.handle.net/1773/8968.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Nordvall, Lagerås Andreas. "Markov Chains, Renewal, Branching and Coalescent Processes : Four Topics in Probability Theory." Doctoral thesis, Stockholm University, Department of Mathematics, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-6637.

Full text
Abstract:

This thesis consists of four papers.

In paper 1, we prove central limit theorems for Markov chains under (local) contraction conditions. As a corollary we obtain a central limit theorem for Markov chains associated with iterated function systems with contractive maps and place-dependent Dini-continuous probabilities.

In paper 2, properties of inverse subordinators are investigated, in particular similarities with renewal processes. The main tool is a theorem on processes that are both renewal and Cox processes.

In paper 3, distributional properties of supercritical and especially immortal branching processes are derived. The marginal distributions of immortal branching processes are found to be compound geometric.

In paper 4, a description of a dynamic population model is presented, such that samples from the population have genealogies as given by a Lambda-coalescent with mutations. Depending on whether the sample is grouped according to litters or families, the sampling distribution is either regenerative or non-regenerative.

APA, Harvard, Vancouver, ISO, and other styles
47

Nordvall, Lagerås Andreas. "Markov chains, renewal, branching and coalescent processes : four topics in probability theory /." Stockholm : Department of Mathematics, Stockholm university, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-6637.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Gusakova, Anna [Verfasser]. "Application of Probability Methods in Number Theory and Integral Geometry / Anna Gusakova." Bielefeld : Universitätsbibliothek Bielefeld, 2018. http://d-nb.info/1174670371/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

S, Suslova O. "Consideration of some problems of probability theory in the field of insurance." Thesis, National Aviation University, 2021. https://er.nau.edu.ua/handle/NAU/50738.

Full text
Abstract:
1. Higher mathematics. Probability Theory. Random events. Method Guide to self study / І. О. Lastivka, I. S. Klyus, V. I. Trofymenko. – К.: NАU, 2018. – 48 p. 2. Kuzmichov A. I. Optimizatsiyni metodi i modeli: praktikum v Excel: navchalniy posibnik / A. I. Kuzmichov. – Kyiv: VPTs AMU, 2013. – 438 p.
In the economic sphere, one of the most important spheres of society, probability theory plays an important role and is therefore an integral part of the training of specialists such as economists and financiers. Probability-theory techniques should be used wherever it is possible to create and analyze probabilistic models of actions or phenomena. Insurance is one branch of the economy in which calculations allow different methods of probability theory to be combined.
APA, Harvard, Vancouver, ISO, and other styles
50

Apedaile, Thomas J. "Computational Topics in Lie Theory and Representation Theory." DigitalCommons@USU, 2014. https://digitalcommons.usu.edu/etd/2156.

Full text
Abstract:
The computer algebra system Maple contains a basic set of commands for working with Lie algebras. The purpose of this thesis was to extend the functionality of these Maple packages in a number of important areas. First, programs for defining multiplication in several types of Cayley algebras, Jordan algebras and Clifford algebras were created to allow users to perform a variety of calculations. Second, commands were created for calculating some basic properties of finite-dimensional representations of complex semisimple Lie algebras. These commands allow one to identify a given representation as a direct sum of irreducible subrepresentations, each one identified by an invariant highest weight. Third, creating an algorithm to calculate the Lie bracket for Vinberg's symmetric construction of Freudenthal's Magic Square allowed for a uniform construction of all five exceptional Lie algebras. Maple examples and tutorials are provided to illustrate the implementation and use of the algebras now available in Maple as well as the tools for working with Lie algebra representations.
APA, Harvard, Vancouver, ISO, and other styles