Dissertations / Theses on the topic 'Gamma mixture'

Consult the top 23 dissertations / theses for your research on the topic 'Gamma mixture.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Ni, Ying. "Modeling Insurance Claim Sizes using the Mixture of Gamma & Reciprocal Gamma Distributions." Thesis, Mälardalen University, Department of Mathematics and Physics, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-454.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Schwander, Olivier. "Information-geometric methods for mixture models." Palaiseau, Ecole polytechnique, 2013. http://pastel.archives-ouvertes.fr/docs/00/93/17/22/PDF/these.pdf.

Full text
Abstract:
This thesis presents new methods for mixture model learning based on information geometry. We focus on mixtures of exponential families, which encompass a large number of the mixtures used in practice. With information geometry, statistical problems can be studied with geometrical tools. This framework offers new perspectives for designing algorithms that are both fast and generic. Two main contributions are proposed here. The first is a method for simplifying kernel density estimators. This simplification is carried out with clustering algorithms, first with the Bregman divergence and then, for speed, with the Fisher-Rao distance and model centroids. The second contribution is a generalization of the k-MLE algorithm that handles mixtures whose components do not all belong to the same family: this method is applied to mixtures of generalized Gaussians and of Gamma laws and is faster than existing methods. The description of these two methods is accompanied by a complete software implementation, and their efficiency is evaluated through applications in bioinformatics and texture classification.
APA, Harvard, Vancouver, ISO, and other styles
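The k-MLE family of algorithms alternates hard assignments with per-component maximum-likelihood updates. As an illustration only (not Schwander's implementation, and for a single-family gamma mixture rather than the mixed-family case treated in the thesis), a classification-EM sketch:

```python
# Hard-assignment (k-MLE-style) fit of a two-component gamma mixture.
# Per-component parameters are re-estimated with scipy.stats.gamma.fit.
import numpy as np
from scipy import stats

def k_mle_gamma(x, n_iter=20):
    # initialise labels by splitting at the median
    labels = (x > np.median(x)).astype(int)
    for _ in range(n_iter):
        params, weights = [], []
        for k in (0, 1):
            xk = x[labels == k]
            a, _, scale = stats.gamma.fit(xk, floc=0)   # component MLE
            params.append((a, scale))
            weights.append(len(xk) / len(x))
        # hard reassignment: each point goes to the component maximising
        # its weighted log-density (the classification-EM / k-MLE step)
        ll = np.stack([np.log(w) + stats.gamma.logpdf(x, a, scale=s)
                       for (a, s), w in zip(params, weights)])
        labels = ll.argmax(axis=0)
    return params, weights

# synthetic data from two well-separated gamma laws
rng = np.random.default_rng(1)
x = np.concatenate([rng.gamma(2.0, 1.0, 500), rng.gamma(9.0, 1.0, 500)])
params, weights = k_mle_gamma(x)
```

Hard assignment avoids the per-point responsibility computations of soft EM, which is one source of the speed gains reported for k-MLE-type methods.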
3

Malsiner-Walli, Gertraud, Sylvia Frühwirth-Schnatter, and Bettina Grün. "Identifying mixtures of mixtures using Bayesian estimation." Taylor & Francis, 2017. http://dx.doi.org/10.1080/10618600.2016.1200472.

Full text
Abstract:
The use of a finite mixture of normal distributions in model-based clustering makes it possible to capture non-Gaussian data clusters. However, identifying the clusters from the normal components is challenging and is generally achieved either by imposing constraints on the model or by using post-processing procedures. Within the Bayesian framework we propose a different approach based on sparse finite mixtures to achieve identifiability. We specify a hierarchical prior whose hyperparameters are carefully selected to reflect the cluster structure aimed at. In addition, this prior allows the model to be estimated using standard MCMC sampling methods. In combination with a post-processing approach that resolves the label-switching issue and yields an identified model, our approach makes it possible to simultaneously (1) determine the number of clusters, (2) flexibly approximate the cluster distributions in a semi-parametric way using finite mixtures of normals, and (3) identify cluster-specific parameters and classify observations. The proposed approach is illustrated in two simulation studies and on benchmark data sets.
APA, Harvard, Vancouver, ISO, and other styles
4

Borketey, Martha A. "Effects of Select Vitamin E Isoforms on the Production of Polyunsaturated Fatty Acid Metabolites in Colorectal Cancer." Digital Commons @ East Tennessee State University, 2015. https://dc.etsu.edu/etd/2480.

Full text
Abstract:
Vitamin E exhibits anti-tumor activity by regulating pathways in cancer cells, potentially the lipoxygenase (LOX) pathway. We studied the effects of alpha tocopherol (AT), gamma tocopherol (GT), gamma tocotrienol (GT3), and an alpha-gamma tocopherol mixture (ATGT) on the production of the LOX metabolites 13-hydroxyoctadecadienoic acid (13-HODE), 15-hydroxyeicosatetraenoic acid (15-HETE), 12-HETE, and 5-HETE in colorectal cancer. These metabolites were examined in the HCT-116 cell line after 24 h of treatment with the selected vitamin E isoforms and quantified by LC/MS/MS. Under physiological conditions, we find that treatments with different vitamin E isoforms have different effects on the production of 13-HODE, 15-HETE, 12-HETE, and 5-HETE. GT increases 13-HODE and decreases 12-HETE. AT reverses the effects of GT regulation on the LOX pathway, while GT3 has no significant effect on the metabolites tested. GT shows superiority in regulating the LOX pathway, as it increases 13-HODE and decreases 12-HETE, suggesting possible prevention of colorectal cancer.
APA, Harvard, Vancouver, ISO, and other styles
5

Bere, Alphonce. "Some non-standard statistical dependence problems." University of the Western Cape, 2016. http://hdl.handle.net/11394/4868.

Full text
Abstract:
Philosophiae Doctor - PhD
The major result of this thesis is the development of a framework for applying pair-mixtures of copulas to model asymmetric dependencies in bivariate data. The main motivation is the inadequacy of the mixtures of bivariate Gaussian models that are commonly fitted to data. Mixtures of rotated single-parameter Archimedean and Gaussian copulas are fitted to real data sets. The method of maximum likelihood is used for parameter estimation. Goodness-of-fit tests performed on the models giving the highest log-likelihood values show that the models fit the data well. We use mixtures of univariate Gaussian models and mixtures of regression models to investigate the existence of bimodality in the distribution of the widths of autocorrelation functions in a sample of 119 gamma-ray bursts. Contrary to previous findings, our results do not reveal any evidence of bimodality. We extend a study by Genest et al. (2012) of the power and significance levels of tests of copula symmetry to two copula models which have not been considered previously. Our results confirm that for small sample sizes these tests fail to maintain their 5% significance level and that the Cramér-von Mises-type statistics are the most powerful.
APA, Harvard, Vancouver, ISO, and other styles
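Bimodality checks of the kind described above (univariate Gaussian mixtures fitted to 119 autocorrelation widths) are commonly screened by comparing one- and two-component fits via BIC. A sketch with scikit-learn on synthetic data; the thesis data are not reproduced here:

```python
# Compare 1- vs 2-component univariate Gaussian mixtures by BIC.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# a unimodal synthetic sample standing in for the 119 widths
x = rng.normal(loc=5.0, scale=1.0, size=119).reshape(-1, 1)

bic = {}
for k in (1, 2):
    gm = GaussianMixture(n_components=k, n_init=5, random_state=0).fit(x)
    bic[k] = gm.bic(x)   # lower BIC = preferred model

# for a unimodal sample the one-component model should usually win
unimodal_preferred = bool(bic[1] < bic[2])
```

BIC penalises the extra three parameters of the two-component model, so a two-component fit is preferred only when the data genuinely support two modes.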
6

Zens, Gregor. "Bayesian shrinkage in mixture-of-experts models: identifying robust determinants of class membership." Springer, 2019. http://dx.doi.org/10.1007/s11634-019-00353-y.

Full text
Abstract:
A method for implicit variable selection in mixture-of-experts frameworks is proposed. We introduce a prior structure where information is taken from a set of independent covariates. Robust class membership predictors are identified using a normal gamma prior. The resulting model setup is used in a finite mixture of Bernoulli distributions to find homogeneous clusters of women in Mozambique based on their information sources on HIV. Fully Bayesian inference is carried out via the implementation of a Gibbs sampler.
APA, Harvard, Vancouver, ISO, and other styles
7

Ke, Xiao. "On lower bounds of mixture L₂-discrepancy, construction of uniform design and gamma representative points with applications in estimation and simulation." HKBU Institutional Repository, 2015. https://repository.hkbu.edu.hk/etd_oa/152.

Full text
Abstract:
Two topics related to experimental design are considered in this thesis. On the one hand, the uniform experimental design (UD), a major kind of space-filling design, is widely used in applications. The majority of UD tables with good uniformity are generated under the centered L2-discrepancy (CD) and the wrap-around L2-discrepancy (WD). Recently, the mixture L2-discrepancy (MD) was proposed and shown to be more reasonable than CD and WD in terms of uniformity. In the first part of the thesis we review lower bounds for the MD of two-level designs from a different point of view and provide a new lower bound. Following the same idea we obtain a lower bound for the MD of three-level designs. Moreover, we construct UDs under the MD measure with the threshold accepting (TA) algorithm, and we attach two new UD tables with good properties derived from TA under the MD measure. On the other hand, the problem of selecting a given number of representative points (RPs) that retain as much information as possible about a distribution has attracted attention. Previously, a method was given to select type-II representative points (RP-II) from the normal distribution. These point sets have good properties and minimize the information loss. Following a similar idea, Fu (1985) discussed RP-II for the gamma distribution. In the second part of the thesis, we improve the selection of gamma RP-II and provide more RP-II tables for a range of parameters. Further, in statistical simulation, we evaluate the estimation performance of point sets resampled from gamma RP-II by making comparisons in different situations.
APA, Harvard, Vancouver, ISO, and other styles
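One common construction of representative points (the mean-squared-error sense; the thesis's RP-II definition may differ) minimises the expected squared distance to the distribution, and for a gamma law it can be computed with a Lloyd-type fixed-point iteration using the identity E[X | l < X ≤ u] = aθ · (F(u; a+1) − F(l; a+1)) / (F(u; a) − F(l; a)):

```python
# MSE representative points of Gamma(a, theta) via Lloyd's fixed point:
# cell boundaries are midpoints; each point is its cell's conditional mean.
import numpy as np
from scipy import stats

def gamma_rep_points(a, theta, m, n_iter=200):
    # initial points: equally spaced quantiles
    p = stats.gamma.ppf((np.arange(m) + 0.5) / m, a, scale=theta)
    for _ in range(n_iter):
        b = np.concatenate(([0.0], (p[:-1] + p[1:]) / 2, [np.inf]))
        Fa = stats.gamma.cdf(b, a, scale=theta)
        Fa1 = stats.gamma.cdf(b, a + 1, scale=theta)
        # conditional mean of Gamma(a, theta) on each cell
        p = a * theta * np.diff(Fa1) / np.diff(Fa)
    return p

pts = gamma_rep_points(a=3.0, theta=2.0, m=5)
```

By construction, resampling from the cell probabilities with these points preserves the distribution's mean exactly, which is one reason such point sets behave well in estimation and simulation.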
8

Malsiner-Walli, Gertraud, Sylvia Frühwirth-Schnatter, and Bettina Grün. "Model-based clustering based on sparse finite Gaussian mixtures." Springer, 2016. http://dx.doi.org/10.1007/s11222-014-9500-2.

Full text
Abstract:
In the framework of Bayesian model-based clustering based on a finite mixture of Gaussian distributions, we present a joint approach to estimate the number of mixture components and identify cluster-relevant variables simultaneously as well as to obtain an identified model. Our approach consists in specifying sparse hierarchical priors on the mixture weights and component means. In a deliberately overfitting mixture model the sparse prior on the weights empties superfluous components during MCMC. A straightforward estimator for the true number of components is given by the most frequent number of non-empty components visited during MCMC sampling. Specifying a shrinkage prior, namely the normal gamma prior, on the component means leads to improved parameter estimates as well as identification of cluster-relevant variables. After estimating the mixture model using MCMC methods based on data augmentation and Gibbs sampling, an identified model is obtained by relabeling the MCMC output in the point process representation of the draws. This is performed using K-centroids cluster analysis based on the Mahalanobis distance. We evaluate our proposed strategy in a simulation setup with artificial data and by applying it to benchmark data sets. (authors' abstract)
APA, Harvard, Vancouver, ISO, and other styles
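The sparse-overfitting-mixture idea above, where a shrinkage prior on the weights empties superfluous components, has a variational analogue in scikit-learn's BayesianGaussianMixture. The sketch below uses it as a stand-in for the MCMC approach described in the abstract; it is not the authors' implementation:

```python
# Deliberately overfitting mixture with a small Dirichlet concentration,
# so extra components receive negligible weight.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-4, 1, 300),
                    rng.normal(4, 1, 300)]).reshape(-1, 1)

bgm = BayesianGaussianMixture(
    n_components=10,                  # more components than clusters
    weight_concentration_prior=1e-3,  # sparse prior on the weights
    max_iter=500,
    random_state=0,
).fit(x)

# estimate the number of clusters as the number of non-negligible weights,
# mirroring the "most frequent number of non-empty components" estimator
n_clusters = int(np.sum(bgm.weights_ > 0.01))
```

With two well-separated groups, most of the ten components end up with weight near zero, and the weight threshold plays the role of the "non-empty component" count in the MCMC version.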
9

Janeiro, Vanderly. "Modelagem de dados contínuos censurados, inflacionados de zeros." Universidade de São Paulo, 2010. http://www.teses.usp.br/teses/disponiveis/11/11134/tde-20092010-090511/.

Full text
Abstract:
Much equipment used to quantify substances, such as toxins in foods, is unable to measure low amounts. In cases where the substance exists but in an amount below a small fixed value ξ, the equipment usually indicates that the substance is not present, producing values equal to zero. In cases where the quantity is between ξ and a known threshold value τ, it detects the presence of the substance but is unable to measure the amount. When the substance exists in amounts above the threshold value τ, it is measured continuously, giving rise to a continuous random variable X whose domain can be written as the union of the intervals [0, ξ), [ξ, τ] and (τ, ∞). This random variable commonly has an excess of zero values. In this work we propose models that can identify the probability of true zeros, such as a two-component mixture model, one component degenerate at zero and the other with a continuous distribution, for which we considered the exponential, Weibull and gamma distributions. Then, for each model, its characteristics were studied, procedures for estimating its parameters were proposed, and its goodness of fit was evaluated by simulation. Finally, the methodology was illustrated by modelling measures of contamination with aflatoxin B1, detected in grains of corn from three sub-samples of a batch of corn analyzed at the Mycotoxin Laboratory of the Department of Agribusiness, Food and Nutrition, ESALQ/USP. In conclusion, in the majority of cases the simulations indicated that the proposed methods are efficient in estimating the model parameters, in particular the parameter δ and the expected value E(Y). The modelling of the aflatoxin measures, in turn, showed that the proposed models are appropriate for the actual data, with the Weibull mixture model fitting the data best.
APA, Harvard, Vancouver, ISO, and other styles
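A zero-inflated censored model of this kind has a likelihood with three contributions: reported zeros (true zeros plus values below ξ), detected-but-unquantified values on [ξ, τ], and exact measurements above τ. A hedged sketch with a gamma continuous part; the parameterisation and names are ours, not the thesis code:

```python
# MLE for a zero-inflated, partially censored gamma model:
# with probability delta the value is a true zero; otherwise it is
# Gamma(a, theta), reported as 0 below xi, "detected" on [xi, tau],
# and measured exactly above tau.
import numpy as np
from scipy import stats, optimize

xi, tau = 0.5, 2.0

def neg_log_lik(params, n_zero, n_detect, exact):
    delta, a, theta = params
    if not (0 < delta < 1 and a > 0 and theta > 0):
        return np.inf   # reject invalid parameter values
    F = lambda q: stats.gamma.cdf(q, a, scale=theta)
    ll = n_zero * np.log(delta + (1 - delta) * F(xi))
    ll += n_detect * np.log((1 - delta) * (F(tau) - F(xi)))
    ll += (len(exact) * np.log(1 - delta)
           + stats.gamma.logpdf(exact, a, scale=theta).sum())
    return -ll

# synthetic data from the model (delta=0.3, a=2, theta=2)
rng = np.random.default_rng(0)
true = np.where(rng.random(2000) < 0.3, 0.0, rng.gamma(2.0, 2.0, 2000))
n_zero = int(np.sum(true < xi))       # true zeros plus sub-detection values
n_detect = int(np.sum((true >= xi) & (true <= tau)))
exact = true[true > tau]

res = optimize.minimize(neg_log_lik, x0=[0.2, 1.0, 1.0],
                        args=(n_zero, n_detect, exact), method="Nelder-Mead")
delta_hat, a_hat, theta_hat = res.x
```

The exact observations above τ pin down the gamma parameters, which in turn separates the true-zero probability δ from the mass the continuous part places below ξ.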
10

Graversen, Therese. "Statistical and computational methodology for the analysis of forensic DNA mixtures with artefacts." Thesis, University of Oxford, 2014. http://ora.ox.ac.uk/objects/uuid:4c3bfc88-25e7-4c5b-968f-10a35f5b82b0.

Full text
Abstract:
This thesis proposes and discusses a statistical model for interpreting forensic DNA mixtures. We develop methods for estimation of model parameters and assessing the uncertainty of the estimated quantities. Further, we discuss how to interpret the mixture in terms of predicting the set of contributors. We emphasise the importance of challenging any interpretation of a particular mixture, and for this purpose we develop a set of diagnostic tools that can be used in assessing the adequacy of the model to the data at hand as well as in a systematic validation of the model on experimental data. An important feature of this work is that all methodology is developed entirely within the framework of the adopted model, ensuring a transparent and consistent analysis. To overcome the challenge that lies in handling the large state space for DNA profiles, we propose a representation of a genotype that exhibits a Markov structure. Further, we develop methods for efficient and exact computation in a Bayesian network. An implementation of the model and methodology is available through the R package DNAmixtures.
APA, Harvard, Vancouver, ISO, and other styles
11

Zhao, Fangwei. "Multiresolution analysis of ultrasound images of the prostate." University of Western Australia. School of Electrical, Electronic and Computer Engineering, 2004. http://theses.library.uwa.edu.au/adt-WU2004.0028.

Full text
Abstract:
[Truncated abstract] Transrectal ultrasound (TRUS) has become the urologist’s primary tool for diagnosing and staging prostate cancer due to its real-time and non-invasive nature, low cost, and minimal discomfort. However, the interpretation of a prostate ultrasound image depends critically on the experience and expertise of a urologist and is still difficult and subjective. To overcome the subjective interpretation and facilitate objective diagnosis, computer aided analysis of ultrasound images of the prostate would be very helpful. Computer aided analysis of images may improve diagnostic accuracy by providing a more reproducible interpretation of the images. This thesis is an attempt to address several key elements of computer aided analysis of ultrasound images of the prostate. Specifically, it addresses the following tasks: 1. modelling B-mode ultrasound image formation and statistical properties; 2. reducing ultrasound speckle; and 3. extracting prostate contour. Speckle refers to the granular appearance that compromises the image quality and resolution in optics, synthetic aperture radar (SAR), and ultrasound. Due to the existence of speckle the appearance of a B-mode ultrasound image does not necessarily relate to the internal structure of the object being scanned. A computer simulation of B-mode ultrasound imaging is presented, which not only provides an insight into the nature of speckle, but also a viable test-bed for any ultrasound speckle reduction methods. Motivated by analysis of the statistical properties of the simulated images, the generalised Fisher-Tippett distribution is empirically proposed to analyse statistical properties of ultrasound images of the prostate. A speckle reduction scheme is then presented, which is based on Mallat and Zhong’s dyadic wavelet transform (MZDWT) and modelling statistical properties of the wavelet coefficients and exploiting their inter-scale correlation. 
Specifically, the squared modulus of the component wavelet coefficients is modelled as a two-state Gamma mixture. Inter-scale correlation is exploited by taking the harmonic mean of the posterior probability functions, which are derived from the Gamma mixture. This noise reduction scheme is applied to both simulated and real ultrasound images, and its performance is quite satisfactory in that the important features of the original noise-corrupted image are preserved while most of the speckle noise is removed successfully. It is also evaluated both qualitatively and quantitatively by comparison with median, Wiener, and Lee filters, and the results show that it surpasses all of them. A novel contour extraction scheme (CES), which fuses MZDWT and snakes, is proposed on the basis of multiresolution analysis (MRA). Extraction of the prostate contour is placed in a multi-scale framework provided by MZDWT. Specifically, the external potential functions of the snake are designated as the modulus of the wavelet coefficients at different scales, and thus are “switchable”. Such a multi-scale snake, which deforms and migrates from coarse to fine scales, eventually extracts the contour of the prostate.
APA, Harvard, Vancouver, ISO, and other styles
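The denoising step described above — posterior signal probabilities from a two-state Gamma mixture on squared wavelet moduli, combined across scales by a harmonic mean — can be sketched as follows. The mixture parameters here are illustrative, not fitted to ultrasound data:

```python
# Posterior "signal" probability under a two-state Gamma mixture, combined
# across two scales by a harmonic mean, then used as a shrinkage weight.
import numpy as np
from scipy import stats

def signal_posterior(w2, pi_s, noise, signal):
    # w2: squared wavelet modulus; noise/signal: (shape, scale) pairs
    p_n = (1 - pi_s) * stats.gamma.pdf(w2, noise[0], scale=noise[1])
    p_s = pi_s * stats.gamma.pdf(w2, signal[0], scale=signal[1])
    return p_s / (p_n + p_s)

def combine_scales(post_a, post_b):
    # harmonic mean of the posteriors at two adjacent scales
    return 2 * post_a * post_b / (post_a + post_b + 1e-300)

w2 = np.array([0.1, 1.0, 10.0, 100.0])
post1 = signal_posterior(w2, 0.2, noise=(1.0, 0.5), signal=(2.0, 20.0))
post2 = signal_posterior(w2, 0.2, noise=(1.0, 0.8), signal=(2.0, 25.0))
weights = combine_scales(post1, post2)
denoised = weights * w2   # attenuate coefficients unlikely to be signal
```

The harmonic mean is small unless both scales agree a coefficient is signal, which is what suppresses speckle (uncorrelated across scales) while keeping edges (correlated across scales).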
12

Chapman, Joanne Shirley. "Statistical methods for gamma mixtures of proportional hazards survival models." Thesis, Lancaster University, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.340567.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

Malec, Peter. "Three essays on the econometric analysis of high-frequency data." Doctoral thesis, Humboldt-Universität zu Berlin, Wirtschaftswissenschaftliche Fakultät, 2013. http://dx.doi.org/10.18452/16765.

Full text
Abstract:
In three essays, this thesis deals with the econometric analysis of financial market data sampled at intraday frequencies. Chapter 1 presents a novel approach to model serially dependent positive-valued variables realizing a nontrivial proportion of zero outcomes. This is a typical phenomenon in financial high-frequency time series. We introduce a flexible point-mass mixture distribution, a tailor-made semiparametric specification test and a new type of multiplicative error model (MEM). Chapter 2 addresses the problem that fixed symmetric kernel density estimators exhibit low precision for positive-valued variables with a large probability mass near zero, which is common in high-frequency data. We show that gamma kernel estimators are superior, while their relative performance depends on the specific density and kernel shape. We suggest a refined gamma kernel and a data-driven method for choosing the appropriate type of gamma kernel estimator. Chapter 3 turns to the debate about the merits of high-frequency data in large-scale portfolio allocation. We consider the problem of constructing global minimum variance portfolios based on the constituents of the S&P 500. We show that forecasts based on high-frequency data can yield a significantly lower portfolio volatility than approaches using daily returns, implying noticeable utility gains for a risk-averse investor.
APA, Harvard, Vancouver, ISO, and other styles
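The gamma kernel estimator discussed in Chapter 2 replaces a fixed symmetric kernel by a gamma density whose shape varies with the evaluation point, avoiding boundary bias at zero. A minimal sketch of the standard (Chen-type) estimator; the refined variant proposed in the thesis is not reproduced here:

```python
# Gamma kernel density estimate: at point x, average over the sample the
# Gamma(x/b + 1, b) density evaluated at each observation.
import numpy as np
from scipy import stats

def gamma_kde(x_grid, sample, b):
    x_grid = np.asarray(x_grid, dtype=float)
    return np.array([
        stats.gamma.pdf(sample, x / b + 1.0, scale=b).mean()
        for x in x_grid
    ])

rng = np.random.default_rng(0)
sample = rng.exponential(1.0, size=2000)   # true density e^{-x} on [0, inf)
grid = np.array([0.0, 0.5, 1.0, 2.0])
est = gamma_kde(grid, sample, b=0.05)
```

Because each kernel is itself supported on [0, ∞), no probability mass leaks below zero, which is exactly the failure mode of fixed symmetric kernels for positive-valued high-frequency variables.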
14

Gaglione, R. "Electronique d'acquisition d'une gamma-caméra." Phd thesis, Université Blaise Pascal - Clermont-Ferrand II, 2005. http://tel.archives-ouvertes.fr/tel-00011068.

Full text
Abstract:
This thesis work is part of a collaboration between the Application et Valorisation des Interactions Rayonnement-Matière group and the company Hamamatsu, studying dedicated, highly integrated electronics for the H8500 multi-anode photomultiplier. Thanks to their small dead zone and multi-anode configuration, these photomultipliers improve the performance of the gamma cameras used in particular for breast cancer screening (scintimammography). After drawing up a specification based on tests of these photomultiplier tubes, a dedicated acquisition electronics is proposed. It consists of a multi-gain current preamplifier, a switched integrator and a ramp analogue-to-digital converter, the whole being self-triggered on the signal. Several multi-channel prototypes of this electronics have been produced; their design and test results are presented.
APA, Harvard, Vancouver, ISO, and other styles
15

Qaraguly, Rajiha al. "Dosimétrie mixte neutrons-gammas par thermoluminescence impulsionnelle." Grenoble 2 : ANRT, 1986. http://catalogue.bnf.fr/ark:/12148/cb376005253.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Tedjar, Farouk. "Contribution à l'étude physico-chimique de [gamma]MnO2 : conduction mixte électronique-protonique." Grenoble INPG, 1988. http://www.theses.fr/1988INPG0103.

Full text
Abstract:
Study by TGA and by X-ray and neutron diffraction. A structural model is proposed that accounts for the initial presence of Mn(III) and divides the water into free water and water bound to MnOOH groups, whose proton it solvates. A thermodynamic approach established a relation between the standard potential and the stoichiometric deficiency. The study of the electrochemical efficiency showed the importance of free water in these mechanisms. The proton diffusion coefficient in gamma-MnO2 was evaluated, and a correlation was established between this diffusion coefficient and the proton-transfer ability. The possibilities of recharge were studied as a function of the depth of discharge.
APA, Harvard, Vancouver, ISO, and other styles
17

Tedjar, Farouk. "Contribution à l'étude physico-chimique de MnO2 conduction mixte électronique-protonique dans gamma MnO2 /." Grenoble 2 : ANRT, 1988. http://catalogue.bnf.fr/ark:/12148/cb37618829f.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Tabone, Elisabeth. "Influence d'une irradiation gamma chronique sur le système sol d'une chênaie mixte méditerranéenne à Cadache." Grenoble 2 : ANRT, 1986. http://catalogue.bnf.fr/ark:/12148/cb37601366c.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Tabone, Elisabeth. "Influence d'une irradiation gamma chronique sur le système sol d'une chênaie mixte méditerranéenne à Cadarache." Aix-Marseille 1, 1986. http://www.theses.fr/1986AIX11059.

Full text
Abstract:
In France, a 1200-curie caesium-137 source was installed in 1969 in a holm oak - downy oak forest (Quercus ilex, Quercus pubescens) at the Cadarache nuclear research centre (Bouches-du-Rhône). As in the USA, the experiment aims to observe the influence of chronic gamma irradiation on various components of the forest ecosystem and to assess its overall response. Before and since the installation of the source, studies have been carried out, mainly concerning the evolution of the plant community in general, with particular attention to certain aspects of plant physiology, soil microbiology, and the arthropods and microfauna of the soil.
APA, Harvard, Vancouver, ISO, and other styles
20

Larmier, Kim. "Transformations de l'isopropanol sur solides aluminiques : une approche mixte expérimentale / modélisation multi-échelle." Thesis, Paris 6, 2015. http://www.theses.fr/2015PA066396/document.

Full text
Abstract:
The upgrading of lignocellulosic biomass into strategic molecules for the chemical industry requires the adaptation of refining procedures to the transformation of oxygenated species. In this context, the dehydration of alcohols has seen renewed interest over the last decade. The work presented here aims at unravelling the reactivity of a model alcohol (isopropanol) over aluminic catalysts at the molecular scale, through a study combining experiments and modelling at the molecular scale (DFT) and at the reactor scale (kinetic modelling). By combining infrared spectroscopy experiments, kinetic measurements and molecular modelling of the adsorption and reaction pathways of isopropanol on gamma alumina, it is shown that this reactivity is mainly governed by the (100) facets of alumina. The competing formation of propene (the major product) and diisopropyl ether (the minor product) involves a common alkoxide intermediate adsorbed on a Lewis-acidic aluminium atom, which evolves either by direct elimination of a water molecule (E2 mechanism) or by condensation with a second alcohol molecule adsorbed in the vicinity (SN2 mechanism). A micro-kinetic model based on this single reaction site, and including the decomposition of the ether into isopropanol and propene, reproduces the experimental results provided that the effect of water and alcohol molecules co-adsorbed in the environment of the active site is taken into account: the formation of water-intermediate dimers and the stabilization of the second alcohol molecule both contribute to adjusting the ether/propene ratio.
APA, Harvard, Vancouver, ISO, and other styles
21

Kato, Fernando Hideki. "Análise de carteiras em tempo discreto." Universidade de São Paulo, 2004. http://www.teses.usp.br/teses/disponiveis/12/12139/tde-24022005-005812/.

Full text
Abstract:
In this thesis, Markowitz’s portfolio selection model will be extended by means of a discrete-time analysis and more realistic hypotheses. A finite tensor product of Erlang densities will be used to approximate the multivariate probability density function of the single-period discrete returns of dependent assets. The Erlang is a particular case of the Gamma distribution. A finite mixture can generate multimodal asymmetric densities, and the tensor product generalizes this concept to higher dimensions. Assuming that the multivariate density was independent and identically distributed (i.i.d.) in the past, the approximation can be calibrated with historical data using the maximum likelihood criterion. This is a large-scale optimization problem, but one with a special structure. Assuming that this multivariate density will be i.i.d. in the future, the density of the discrete returns of a portfolio of assets with nonnegative weights will be a finite mixture of Erlang densities. The risk will be calculated with the Downside Risk measure, which is convex for certain parameters, is not based on quantiles, does not cause risk underestimation, and makes the single- and multiperiod optimization problems convex. The discrete return is a multiplicative random variable over time. The multiperiod distribution of the discrete returns of a sequence of T portfolios will be a finite mixture of Meijer G distributions. After a change of the probability measure to the average compound, it is possible to calculate the risk and the return, which will lead to the multiperiod efficient frontier, where each point represents one or more ordered sequences of T portfolios. The portfolios of each sequence must be calculated from the future to the present, keeping the expected return at the desired level, which can be a function of time. A dynamic asset allocation strategy is to redo the calculations at each period, using the newly available information.
If the time horizon tends to infinity, then the efficient frontier, in the average compound probability measure, will tend to a single point, given by the Kelly portfolio, whatever the risk measure. To select one among several portfolio optimization models, it is necessary to compare their relative performances. The efficient frontier of each model must be plotted in its respective graph. As the weights of the assets of the portfolios on these curves are known, it is possible to plot all the curves in the same graph. For a given expected return, the efficient portfolios of the models can be calculated, and the realized returns and their differences along a backtest can be compared.
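The one-dimensional building block of the abstract's approximation, a finite mixture of Erlang densities (an Erlang is a Gamma density with integer shape), can be sketched in a few lines. The weights and parameters below are placeholders, and the thesis's tensor-product extension to the multivariate case is not shown:

```python
import math

def erlang_pdf(x, k, lam):
    """Erlang(k, lam) density: a Gamma(k, lam) density with integer shape k."""
    if x < 0:
        return 0.0
    return lam**k * x**(k - 1) * math.exp(-lam * x) / math.factorial(k - 1)

def mixture_pdf(x, components):
    """Finite Erlang mixture; components is a list of (weight, k, lam),
    with the nonnegative weights summing to one."""
    return sum(w * erlang_pdf(x, k, lam) for w, k, lam in components)

if __name__ == "__main__":
    # A two-component mixture with illustrative parameters: such mixtures
    # can be asymmetric and, with well-separated components, multimodal.
    comps = [(0.5, 1, 1.0), (0.5, 2, 1.0)]
    print(mixture_pdf(1.0, comps))
```

Calibrating the weights by maximum likelihood over historical returns is the large-scale optimization problem the abstract mentions; only density evaluation is sketched here.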
APA, Harvard, Vancouver, ISO, and other styles
22

FANG, MINGLIANG. "Characterizing the Binding Potential, Activity, and Bioaccessibility of Peroxisome Proliferator Activated Receptor Gamma (PPARγ) Ligands in Indoor Dust." Diss., 2015. http://hdl.handle.net/10161/9826.

Full text
Abstract:

Accumulating evidence suggests that exposure to some environmental contaminants may alter adipogenesis, resulting in the accumulation of adipocytes and often in significant weight gain. Contaminants of this kind are therefore often referred to as obesogens. Many of them act via activation (i.e. agonism) of the peroxisome proliferator activated receptor γ (PPARγ) nuclear receptor. To date, very few chemicals have been identified as possible PPARγ ligands. In this thesis, our goal was to determine the PPARγ ligand binding potency and activation of several groups of major semi-volatile organic compounds (SVOCs) that are ubiquitously detected in indoor environments, including flame retardants such as polybrominated diphenyl ethers (PBDEs) and Firemaster 550 (FM550), and other SVOCs such as phthalates, organotins, halogenated phenols and bisphenols. Additional attention was also given to the potential activity of the major metabolites of several of these compounds. Since the primary sink for many of these SVOCs is dust, and dust ingestion has been confirmed as an important pathway for SVOC accumulation in humans, the potential PPAR binding and activation in extracts from environmentally relevant dust samples was also investigated.

Previous studies have also shown that SVOCs sorbed to organic matrices (e.g., soil and sediment) were only partially bioaccessible (bioavailable), but it was unclear how bioaccessible these compounds are from indoor dust matrices. In addition, bioactivation of SVOCs (via metabolism) could exacerbate their PPAR potency. Therefore, to adequately assess the potential risk of PPARγ activation from exposure to SVOC mixtures in house dust, it is essential to also investigate the bioaccessibility and bioactivation of these chemicals following ingestion.

In the first research aim of this thesis, the bioaccessibility and bioactivation of several important SVOCs in house dust were investigated. To accomplish this, Tenax beads (TA) encapsulated within a stainless steel insert were used as an infinite adsorption sink to estimate the dynamic absorption of a suite of flame retardants (FRs) commonly detected in indoor dust samples, and from a few polyurethane foam samples for comparison. Experimental results demonstrate that the bioaccessibility and stability of FRs following ingestion vary both by chemical and by matrix. Organophosphate flame retardants (OPFRs) had the highest estimated bioaccessibility (~80%) compared to brominated compounds (e.g. PBDEs), and values generally decreased with increasing log Kow, with <30% bioaccessibility measured for the most hydrophobic compound tested, BDE209. In addition, the stability of the more labile SVOCs that contain ester groups (e.g. OPFRs and 2-ethylhexyl tetrabromobenzoate (TBB)) was examined in a simulated digestive fluid matrix. No significant changes in the OPFR concentrations were observed in this fluid; however, TBB was found to readily hydrolyze to tetrabromobenzoic acid (TBBA) in the intestinal fluid in the presence of lipases.

In research aims 2 and 3, two commercially available high-throughput bioassays, a fluorescence polarization PPAR ligand binding assay (PolarScreenTM PPARγ-competitor assay kit, Invitrogen; Aim 2) and a PPAR reporter gene assay (GeneBLAzer PPARγ non-DA Assay, Invitrogen; Aim 3), were used to investigate, respectively, the binding potency and activation of several groups of SVOCs and dust extracts against the human PPARγ ligand-binding domain (LBD). In the PPAR binding assay (Aim 2), most of the tested compounds exhibited dose-dependent binding to PPARγ. Mono(2-ethylhexyl) tetrabromophthalate (TB-MEHP), halogenated bisphenols/phenols, triphenyl phosphate and hydroxylated PBDEs were found to be potent or moderate PPARγ ligands, based on the measured ligand binding dissociation constant (Kd). The most potent compound was 3-OH-BDE47, with an IC50 of 0.24 μM. The extent of halogenation and the position of the hydroxyl group strongly affected binding. Of the dust samples tested, 21 of 24 showed significant PPAR binding potency at a concentration of 3 mg dust equivalents (DEQ)/mL. In the PPAR reporter assay (Aim 3), many SVOCs or their metabolites were either confirmed (based on previous reports) or identified for the first time as potential PPARγ agonists with varying potency and efficacy. We also observed that 15 of the 25 dust extracts examined showed an activation percentage greater than 8% (the calculated activation threshold) of the maximal activation induced by rosiglitazone (positive control); in some cases, activation reached 50% of the rosiglitazone response for the most efficacious dust extracts. Furthermore, the correlation between the reporter assay and the ligand binding assay among the house dust extracts was significant and positive (r = 0.7, p < 0.003), suggesting that binding potency predicts activation. In research aim 2, the effect of bioactivation on PPARγ binding potency was also investigated.
In vitro bioactivation of house dust extracts incubated with rat and human hepatic S9 fractions was used to investigate the role of in vivo biotransformation in PPARγ activity. The results showed that metabolism may increase binding affinity, as a 3-16% increase in PPARγ binding activity was observed following bioactivation of the dust extracts.
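As a reading aid, the reported IC50 is the competitor concentration at which probe binding is halved. A minimal one-site competition curve makes this concrete; fixing the Hill slope at 1 is an assumption for illustration, not a detail taken from the assay description:

```python
def fraction_bound(conc_uM, ic50_uM):
    """One-site competition curve (Hill slope = 1, an illustrative
    assumption): fraction of the fluorescent probe still bound at
    competitor concentration conc_uM, given the competitor's IC50."""
    return 1.0 / (1.0 + conc_uM / ic50_uM)

if __name__ == "__main__":
    # For 3-OH-BDE47 the abstract reports IC50 = 0.24 uM: at that
    # concentration, exactly half the probe remains bound.
    print(fraction_bound(0.24, ic50_uM=0.24))
```

A tenfold excess over the IC50 would leave only about 9% of the probe bound under this idealized model, which is why dose-dependent displacement curves are used to rank ligand potency.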

In research aim 4, effect-directed analysis (EDA) was used to identify the compounds likely contributing to the observed PPAR activity in the dust extracts. Three dust extracts showing significant PPAR activity, with approximately 25, 30, and 50% of the maximal response induced by rosiglitazone, were fractionated using normal-phase high-performance liquid chromatography (NP-HPLC), and each fraction was individually tested for PPAR activity. Active fractions were then analyzed using gas chromatography-mass spectrometry (GC-MS) and candidate compounds identified. The three dust extracts showed a similar distribution of PPAR activity among the NP-HPLC fractions. In the most active fractions, fatty acids (FAs) were identified as the most active chemicals. The concentrations of four FAs were measured in the house dust extracts and found to be highly correlated with the observed PPAR activity. These four FAs were also tested for PPAR activity and found to be partial PPAR agonists, particularly oleic and myristic acid. To tentatively identify the sources of these FAs, their levels in human/animal hair, dead skin cells, and two brands of cooking oil were analyzed. The same FAs were found in those samples at relatively abundant concentrations, ranging from 186 to 14,868 µg/g. These results therefore suggest that FAs are likely responsible for the observed PPAR activity in indoor dust. This is also the first study reporting the levels of FAs in dust samples. The FAs in dust may originate either from cooking or from the accumulation of human/animal cells indoors.

In conclusion, this research demonstrates that many SVOCs ubiquitously detected in house dust, and/or their metabolites, can be weak or moderate PPAR ligands. In addition, chemical mixtures in house dust can effectively bind to and activate PPAR; however, our results suggest that FAs are probably responsible for these observations, likely outcompeting the synthetic environmental contaminants present in the dust extracts. Furthermore, bioactivation of contaminants present in house dust can potentially increase their affinity for PPAR. Lastly, the bioaccessibility and stability of SVOCs in house dust after ingestion are likely to modulate the PPAR activity of these environmental mixtures and should be considered in future risk assessments.


Dissertation
APA, Harvard, Vancouver, ISO, and other styles
23

Evgeniou, Theodoros, and Massimiliano Pontil. "A Note on the Generalization Performance of Kernel Classifiers with Margin." 2000. http://hdl.handle.net/1721.1/7169.

Full text
Abstract:
We present distribution-independent bounds on the generalization misclassification performance of a family of kernel classifiers with margin. Support Vector Machine (SVM) classifiers stem from this class of machines. The bounds are derived through computations of the $V_\gamma$ dimension of a family of loss functions to which the SVM loss belongs. Bounds that use functions of the margin distribution (i.e. functions of the slack variables of the SVM) are derived.
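Bounds of this kind typically take the following schematic shape, shown here only as a generic illustration of how a $V_\gamma$ dimension replaces the VC dimension in a capacity term; the constants and exact logarithmic factors of the note's actual statement are omitted. With probability at least $1-\delta$ over $l$ i.i.d. samples,

```latex
\Pr\{\, y f(x) \le 0 \,\}
\;\le\;
\nu_\gamma(f)
\;+\;
\sqrt{\frac{c}{l}\left( V_\gamma \ln\frac{l}{V_\gamma} - \ln\delta \right)},
```

where $\nu_\gamma(f)$ is the fraction of training points with margin below $\gamma$, $V_\gamma$ is the capacity of the margin-loss family, and $c$ is an unspecified universal constant.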
APA, Harvard, Vancouver, ISO, and other styles
