Dissertations / Theses on the topic 'Bivariate analysis'

To see the other types of publications on this topic, follow the link: Bivariate analysis.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Bivariate analysis.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Cairo, Beatrice. "Estimating cardiorespiratory coupling from spontaneous variability in health and pathology." Doctoral thesis, Università degli Studi di Milano, 2021. http://hdl.handle.net/2434/816612.

Full text
Abstract:
Several mechanisms are responsible for cardiorespiratory interactions observed in humans. The action of these mechanisms results in specific patterns in heart rate variability (HRV) and affects the interaction between heart and respiratory activities. The four main types of phenomena resulting from the interactions between heart and respiratory system are: i) respiratory sinus arrhythmia (RSA); ii) cardioventilatory coupling; iii) cardiorespiratory phase synchronization; iv) cardiorespiratory frequency synchronization. The aim of this thesis is to describe and quantify different aspects of cardiorespiratory interactions employing a variety of methods from literature, adapted and optimized for the usual experimental settings in which HRV and respiratory signal are commonly acquired. Six analytical methods were exploited for this purpose assessing transfer entropy (TE), cross-conditional entropy via normalized corrected cross-conditional entropy (NCCCE), squared coherence (K2), cardioventilatory coupling via normalized Shannon entropy (NSE) of the time interval between QRS complex and inspiratory, or expiratory, onsets, phase synchronization via a synchronization index (SYNC%) and pulse-respiration quotient (PRQ). These approaches were employed with the goal of testing the effects of a sympathetic challenge, namely postural stimuli like head-up tilt (TILT) and active standing (STAND), on cardiorespiratory interactions. The proposed approaches were tested on three protocols: i) amateur athletes undergoing an inspiratory muscle training (IMT) during supine rest (REST) and STAND; ii) healthy volunteers undergoing a prolonged bed rest deconditioning (HDBR), during REST and TILT; iii) patients suffering from postural orthostatic tachycardia syndrome (POTS), during REST and TILT, at baseline and at one-year follow-up. The most important findings of the present doctoral thesis concern the effect of postural stimuli on cardiorespiratory interactions in health and disease. Indeed, all proposed indexes gave a coherent view of cardiorespiratory interaction strength in response to the orthostatic challenge, as it decreased in all protocols. However, the statistical power of the indexes was different. TE and K2 appeared to be particularly weak in detecting the effect of postural challenge on cardiorespiratory interactions. NCCCE, NSE and SYNC% exhibited much stronger ability in this regard, while PRQ seemed too closely related to heart rate, in presence of no significant modification of the respiratory rate. Conversely, all indexes appeared to be weak in detecting the chronic effects of IMT and HDBR on a healthy population and the long-term consequences of the clinical management in POTS patients. The thesis concludes that the different aspects of cardiorespiratory interactions can be modified acutely but the chronic effects of a long-term treatment or intervention on the magnitude of cardiorespiratory interactions are negligible and/or could be confused with the variability of markers. Considerations about the methodological dissimilarities and differences in effectiveness of the proposed indexes suggest that the simultaneous exploitation of all bivariate methodologies in cardiorespiratory studies is advantageous, as different aspects of cardiorespiratory interactions can be evaluated concurrently. This simultaneous evaluation can be carried out with a relatively negligible computational cost and in applicative contexts when only an ECG signal is available.
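As a rough illustration of two of the simpler markers mentioned in this abstract, the following Python sketch computes squared coherence (K2) between a heart-period series and a respiratory signal with Welch's method, and a pulse-respiration quotient (PRQ) as the ratio of mean heart rate to breathing rate. The signals, sampling rate and band limits are synthetic placeholders, not the thesis's recordings or its optimized estimators.

```python
# Minimal sketch (not the thesis pipeline): squared coherence (K2) between a
# heart-period-like series and a respiratory-like signal via Welch's method, plus
# the pulse-respiration quotient (PRQ). All signals and parameters are synthetic.
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(0)
fs = 4.0                                   # series resampled at 4 Hz (assumption)
t = np.arange(0, 300, 1 / fs)
resp = np.sin(2 * np.pi * 0.25 * t)        # ~15 breaths/min respiratory signal
rr = 0.8 + 0.05 * np.sin(2 * np.pi * 0.25 * t) + 0.02 * rng.normal(size=t.size)  # RR intervals [s]

f, k2 = coherence(rr, resp, fs=fs, nperseg=256)   # squared coherence spectrum
band = (f >= 0.15) & (f <= 0.4)                   # high-frequency (respiratory) band
print("max K2 in HF band:", k2[band].max())

heart_rate = 60.0 / rr.mean()                     # beats per minute
breath_rate = 0.25 * 60.0                         # breaths per minute
print("PRQ =", heart_rate / breath_rate)
```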
2

Crampton, James Scutts. "Palaeobiology of cretaceous inoceramid bivalves." Thesis, University of Cambridge, 1993. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.308302.

Full text
3

Fan, Juanjuan. "Dependency estimation over a finite bivariate failure time region /." Thesis, Connect to this title online; UW restricted, 1997. http://hdl.handle.net/1773/9548.

Full text
4

Caruso, Elise M. "A Simulation Analysis of Bivariate Availability Models." Thesis, Virginia Tech, 2000. http://hdl.handle.net/10919/34153.

Full text
Abstract:
Equipment behavior is often discussed in terms of age and use. For example, an automobile is frequently referred to as 3 years old with 30,000 miles. Bivariate failure modeling provides a framework for studying system behavior as a function of two variables. This is meaningful when studying the reliability/availability of systems and equipment. This thesis extends work done in the area of bivariate failure modeling. Four bivariate failure models are selected for analysis. The study includes exploration of bivariate random number generation. The random data are utilized in estimating the bivariate renewal function and bivariate availability function. The two measures provide insight into system behavior characterized by multiple variables. A method for generating bivariate failure and repair data is developed for each model. Of the four models, two represent correlated random variables; the other two, stochastic functionally dependent variables. Also, methods of estimating the bivariate renewal function and bivariate availability function are constructed. The bivariate failure and repair data from the four failure models are incorporated into the estimation processes to study various failure scenarios.
Master of Science
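The following is a minimal Monte Carlo sketch of the kind of computation the abstract describes: generating correlated bivariate failure data (here via a Gaussian copula with lognormal margins, purely as an assumption) and estimating a bivariate renewal function by simulation. It is not one of the four failure models analyzed in the thesis.

```python
# Minimal Monte Carlo sketch: correlated (age, usage) lives and an estimate of the
# bivariate renewal function H(t, u). Distributions and parameters are illustrative
# assumptions, not the four failure models from the thesis.
import numpy as np

rng = np.random.default_rng(0)

def correlated_lives(n, rho=0.7):
    # lognormal (age, usage) pairs with dependence from a Gaussian copula
    cov = [[1.0, rho], [rho, 1.0]]
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    return np.exp(0.0 + 0.5 * z[:, 0]), np.exp(0.3 + 0.5 * z[:, 1])

def bivariate_renewal(t, u, reps=5000, horizon=50):
    # H(t, u) = E[# renewals whose cumulative age <= t AND cumulative usage <= u]
    counts = np.zeros(reps)
    for r in range(reps):
        age, use = correlated_lives(horizon)
        ca, cu = np.cumsum(age), np.cumsum(use)
        counts[r] = np.sum((ca <= t) & (cu <= u))
    return counts.mean()

print("H(5, 8) ~", bivariate_renewal(5.0, 8.0))
```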
5

Koen, Marthinus Christoffel. "The analysis of some bivariate astronomical time series." Master's thesis, University of Cape Town, 1993. http://hdl.handle.net/11427/17341.

Full text
Abstract:
Bibliography: pages 75-76.
In the first part of the thesis, a linear time domain transfer function is fitted to satellite observations of a variable galaxy, NGC5548. The transfer functions relate an input series (ultraviolet continuum flux) to an output series (emission line flux). The methodology for fitting transfer functions is briefly described. The autocorrelation structure of the observations of NGC5548 in different electromagnetic spectral bands is investigated, and appropriate univariate autoregressive moving average models are given. The results of extensive transfer function fitting, using the λ1337 and λ1350 continuum variations respectively as input series, are presented. There is little evidence for a dead time in the response of the emission line variations, which are presumed to be driven by the continuum. Part 2 of the thesis is devoted to the estimation of the lag between two irregularly spaced astronomical time series. Lag estimation methods which have been used in the astronomy literature are reviewed. Some problems are pointed out, particularly the influence of autocorrelation and non-stationarity of the series. If the two series can be modelled as random walks, both these problems can be dealt with efficiently. Maximum likelihood estimation of the random walk and measurement error variances, as well as the lag between the two series, is discussed. Large-sample properties of the estimators are derived. An efficient computational procedure for the likelihood, which exploits the sparseness of the covariance matrix, is briefly described. Results are derived for two example data sets: the variations in the two gravitationally lensed images of a quasar, and brightness changes of the active galaxy NGC3783 at two different wavelengths.
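As a simplified illustration of lag estimation between a driving series and its delayed response, the sketch below picks the lag that maximizes the cross-correlation on an evenly sampled synthetic pair. The random-walk maximum-likelihood treatment of irregular sampling and autocorrelation developed in the thesis is not reproduced here.

```python
# Minimal sketch: estimate the lag between an "input" (continuum) and "output"
# (emission-line) series by maximizing cross-correlation over candidate lags.
# Synthetic, evenly sampled data; a positive lag means the output follows the input.
import numpy as np

rng = np.random.default_rng(1)
n, true_lag = 500, 7
x_full = np.cumsum(rng.normal(size=n + true_lag))   # driving series (random walk)
y = x_full[:n] + 0.3 * rng.normal(size=n)            # delayed, noisy response
x = x_full[true_lag:]                                 # observed input series

def best_lag(x, y, max_lag=30):
    lags = list(range(-max_lag, max_lag + 1))
    def corr(a, b):
        return np.corrcoef(a, b)[0, 1]
    scores = []
    for k in lags:
        if k >= 0:
            scores.append(corr(x[: n - k], y[k:]))
        else:
            scores.append(corr(x[-k:], y[: n + k]))
    return lags[int(np.argmax(scores))]

print("estimated lag:", best_lag(x, y))   # should recover ~7 samples
```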
6

Haug, Mark. "Nonparametric density estimation for univariate and bivariate distributions with applications in discriminant analysis for the bivariate case." Thesis, Kansas State University, 1986. http://hdl.handle.net/2097/9916.

Full text
7

Heise, Mark A. "Optimal designs for a bivariate logistic regression model." Diss., Virginia Tech, 1993. http://hdl.handle.net/10919/38538.

Full text
Abstract:
In drug-testing experiments the primary responses of interest are efficacy and toxicity. These can be modeled as a bivariate quantal response using the Gumbel model for bivariate logistic regression. D-optimal and Q-optimal experimental designs are developed for this model. The Q-optimal design minimizes the average asymptotic prediction variance of p(1,0;d), the probability of efficacy without toxicity at dose d, over a desired range of doses. In addition, a new optimality criterion, T-optimality, is developed which minimizes the asymptotic variance of the estimate of the therapeutic index. Most experimenters will be less familiar with the Gumbel bivariate logistic regression model than with the univariate logistic regression models which comprise its marginals. Therefore, the optimal designs based on the Gumbel model are evaluated based on univariate logistic regression D-efficiencies; conversely, designs derived from the univariate logistic regression model are evaluated with respect to the Gumbel optimality criteria. Further practical considerations motivate an exploration of designs providing a maximum compromise between the three Gumbel-based criteria D, Q and T. Finally, 5-point designs which can be generated by fitted equations are proposed as a practical option for experimental use.
Ph. D.
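For readers unfamiliar with D-optimality, the sketch below compares two candidate dose designs for a univariate two-parameter logistic model by the log-determinant of the Fisher information matrix. The designs and parameter values are arbitrary illustrations; the Gumbel bivariate model and the Q- and T-criteria of the dissertation are not implemented.

```python
# Minimal sketch of the D-criterion for a univariate two-parameter logistic model:
# larger log det of the Fisher information means a more informative design.
# Designs and parameter values below are illustrative assumptions only.
import numpy as np

def logistic_info(doses, beta0, beta1):
    # Fisher information for p(d) = 1 / (1 + exp(-(beta0 + beta1 * d)))
    X = np.column_stack([np.ones(len(doses)), doses])
    p = 1.0 / (1.0 + np.exp(-(X @ np.array([beta0, beta1]))))
    w = p * (1 - p)
    return X.T @ (w[:, None] * X)

designs = {
    "equally spaced": np.linspace(-3, 3, 5),
    "clustered near ED50": np.array([-1.0, -0.5, 0.0, 0.5, 1.0]),
}
for name, d in designs.items():
    M = logistic_info(d, beta0=0.0, beta1=1.0)
    print(name, "log det M =", np.linalg.slogdet(M)[1])
```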
8

Blatchford, Patrick Judson. "Monitoring bivariate endpoints in group sequential clinical trials /." Connect to full text via ProQuest. Limited to UCD Anschutz Medical Campus, 2007.

Find full text
Abstract:
Thesis (Ph.D. in Biostatistics) -- University of Colorado Denver, 2007.
Typescript. Includes bibliographical references (leaves 104-106). Free to UCD affiliates. Online version available via ProQuest Digital Dissertations.
9

Wang, Chunnan. "Analysis of a new bivariate distribution in reliability theory." Diss., The University of Arizona, 2000. http://hdl.handle.net/10150/284128.

Full text
Abstract:
Freund [1961] introduced a bivariate extension of the exponential distribution that provides a model in which the exponential residual lifetime of one component depends on the working status of another component. We define and study an extension of the Freund distribution in this dissertation. In the first chapter we define some basic concepts that are needed for later developments. We give the definition of the multivariate conditional hazard rate functions of a nonnegative absolutely continuous random vector and study a characterization of these functions in Section 1.1. Then we study some notions of aging: an increasing failure rate (IFR) distribution, a decreasing failure rate (DFR) distribution, an increasing failure rate average (IFRA) distribution, and a decreasing failure rate average (DFRA) distribution in Section 1.2. In Section 1.3 we study two concepts of multivariate dependence: association and positive quadrant dependence. In Chapter 2 we construct a shock model and the new bivariate distribution is the joint distribution of the resulting lifetimes. We explicitly compute the density function, survival function, moment generating function, marginal density functions and marginal survival functions. Also in this chapter, we study the correlation coefficient and other senses of positive dependence of the two random variables of the new bivariate distribution. Then we extend the new distribution to the multivariate case. In Chapter 3 we study some aging properties. We obtain two results about the new distribution in n dimensions. The first result says that the marginal distributions of the new multivariate distribution have decreasing failure rate if the conditional hazard rates are decreasing and bounded above by 1. The second one concerns an (n-1)-out-of-n system such that the joint distribution of the lifetimes of each component is the new distribution in n dimensions. It gives conditions on the parameters under which the system has an IFRA distribution. In Chapter 4 we develop some estimation procedures for the parameters a and b of the new bivariate distribution. We apply the method of moments and the maximum likelihood principle to estimate the parameters. We prove that the method of moments estimator is a consistent asymptotically normal estimator. Then we use Mathematica to run simulations and compare the method of moments estimator with the maximum likelihood estimator. We also compute the 95% confidence interval for a and b from the method of moments estimator. In the last chapter we study a stochastic ordering problem. We have two nonnegative n-dimensional random vectors X and Y. We assume that X and Y have the same conditional hazard rates up to a certain level. We give a condition under which the two vectors X and Y are stochastically ordered.
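A minimal sketch of the shock-model idea behind this family of distributions: Freund's original bivariate exponential, in which the surviving component's failure rate changes once the other component fails. The rates are arbitrary illustrations, and the dissertation's extended model and its estimators are not reproduced here.

```python
# Minimal sketch of Freund's (1961) construction: two exponential components where
# the survivor's rate changes once the other fails. Rates are arbitrary assumptions.
import numpy as np

rng = np.random.default_rng(2)

def freund_sample(n, a=1.0, b=0.5, a1=2.0, b1=1.5):
    # a, b: initial rates of components 1 and 2; a1, b1: rates after the other fails
    x = np.empty(n)
    y = np.empty(n)
    for i in range(n):
        v = rng.exponential(1.0 / (a + b))            # time of the first failure
        if rng.random() < a / (a + b):                # component 1 fails first
            x[i] = v
            y[i] = v + rng.exponential(1.0 / b1)      # component 2 now has rate b1
        else:                                         # component 2 fails first
            y[i] = v
            x[i] = v + rng.exponential(1.0 / a1)      # component 1 now has rate a1
    return x, y

x, y = freund_sample(100000)
print("sample means:", x.mean(), y.mean())
print("sample correlation:", np.corrcoef(x, y)[0, 1])
```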
10

Lavorenti, Norberto A. "Fitting models in a bivariate analysis of intercropping." Thesis, University of Reading, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.266039.

Full text
11

Beversdorf, Louanne Margaret. "Tests for Correlation on Bivariate Nonnormal Distributions." UNF Digital Commons, 2008. http://digitalcommons.unf.edu/etd/284.

Full text
Abstract:
Many samples in the real world are very small in size and often do not follow a normal distribution. Existing tests for correlation have restrictions on the distribution of the data and the sample size, and therefore the current tests cannot be used in some real-world situations. In this thesis, two tests are considered to test hypotheses about the population correlation coefficient. The tests are based on statistics transformed by a saddlepoint approximation and by Fisher's Z-transformation. The tests are conducted on small samples of bivariate nonnormal data and are found to perform well. Simulations were run in order to compare the type I error rates and power of the new tests with other commonly used tests. The new tests controlled type I error rates well and had reasonable power performance.
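The Fisher Z-transformation test mentioned in the abstract can be sketched in a few lines; the saddlepoint-approximation test is not shown, and the simulated data below are only a placeholder for a small bivariate nonnormal sample.

```python
# Minimal sketch of the Fisher Z-transformation test of H0: rho = rho0.
# The saddlepoint-approximation test from the thesis is not reproduced.
import numpy as np
from scipy.stats import norm

def fisher_z_test(x, y, rho0=0.0):
    n = len(x)
    r = np.corrcoef(x, y)[0, 1]
    z = np.arctanh(r) - np.arctanh(rho0)     # difference on Fisher's z scale
    se = 1.0 / np.sqrt(n - 3)                # approximate standard error
    stat = z / se
    return r, 2 * (1 - norm.cdf(abs(stat)))  # sample r and two-sided p-value

rng = np.random.default_rng(3)
x = rng.standard_t(df=5, size=20)            # small nonnormal sample (illustrative)
y = 0.5 * x + rng.standard_t(df=5, size=20)
print(fisher_z_test(x, y))
```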
12

Lee, Sukhoon. "Inference for a bivariate survival function induced through the environment /." The Ohio State University, 1986. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487267024997374.

Full text
13

Moodie, Felicity Zoe. "A new framework for nonparametric estimation of the bivariate survivor function /." Thesis, Connect to this title online; UW restricted, 2001. http://hdl.handle.net/1773/9535.

Full text
14

潘成達 and Shing-Tat Poon. "Measuring the degree of dependence of lifetimes in some bivariate survival distributions." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1993. http://hub.hku.hk/bib/B31977443.

Full text
15

Poon, Shing-Tat. "Measuring the degree of dependence of lifetimes in some bivariate survival distributions." [Hong Kong] : University of Hong Kong, 1993. http://sunzi.lib.hku.hk/hkuto/record.jsp?B13787421.

Full text
16

Andrew, Steven Paul. "Tools for the simulation and analysis of aerodynamic models." Ohio : Ohio University, 1999. http://www.ohiolink.edu/etd/view.cgi?ohiou1176224995.

Full text
17

Krämer, Romy, Matthias Richter, and Bernd Hofmann. "Parameter estimation in a generalized bivariate Ornstein-Uhlenbeck model." Universitätsbibliothek Chemnitz, 2005. http://nbn-resolving.de/urn:nbn:de:swb:ch1-200501307.

Full text
Abstract:
In this paper, we consider the inverse problem of calibrating a generalization of the bivariate Ornstein-Uhlenbeck model introduced by Lo and Wang. Even though the generalized Black-Scholes option pricing formula still holds, option prices change in comparison to the classical Black-Scholes model. The time-dependent volatility function and the other (real-valued) parameters in the model are calibrated simultaneously from option price data and from some empirical moments of the logarithmic returns. This gives an ill-posed inverse problem, which requires a regularization approach. Applying the theory of Engl, Hanke and Neubauer concerning Tikhonov regularization we show convergence of the regularized solution to the true data and study the form of source conditions which ensure convergence rates.
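As a generic illustration of the regularization approach named in the abstract, the sketch below applies Tikhonov regularization to an ill-conditioned linear least-squares problem for several values of the regularization parameter. The operator, data and parameter-choice rule of the actual calibration problem are not reproduced.

```python
# Minimal sketch of Tikhonov regularization for an ill-conditioned linear problem
# A x = b (a stand-in for the calibration problem in the paper; operator, data and
# the parameter-choice rule are illustrative assumptions).
import numpy as np

rng = np.random.default_rng(4)
n = 50
A = np.vander(np.linspace(0, 1, n), 8, increasing=True)   # nearly collinear columns
x_true = rng.normal(size=8)
b = A @ x_true + 1e-3 * rng.normal(size=n)                # noisy data

def tikhonov(A, b, alpha):
    # minimize ||A x - b||^2 + alpha * ||x||^2
    k = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(k), A.T @ b)

for alpha in (1e-8, 1e-4, 1e-1):
    x_alpha = tikhonov(A, b, alpha)
    print(alpha, "reconstruction error:", np.linalg.norm(x_alpha - x_true))
```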
18

Krämer, Romy, and Matthias Richter. "A Generalized Bivariate Ornstein-Uhlenbeck Model for Financial Assets." Universitätsbibliothek Chemnitz, 2008. http://nbn-resolving.de/urn:nbn:de:bsz:ch1-200800572.

Full text
Abstract:
In this paper, we study mathematical properties of a generalized bivariate Ornstein-Uhlenbeck model for financial assets. Originally introduced by Lo and Wang, this model possesses a stochastic drift term which influences the statistical properties of the asset in the real (observable) world. Furthermore, we generalize the model with respect to a time-dependent (but still non-random) volatility function. Although it is well known that drift terms, under weak regularity conditions, do not affect the behaviour of the asset in the risk-neutral world, and consequently the Black-Scholes option pricing formula holds true, it makes sense to point out that these regularity conditions are fulfilled in the present model and that option pricing can be treated in analogy to the Black-Scholes case.
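A minimal Euler-Maruyama sketch of the model class described here: a log-price whose drift itself follows an Ornstein-Uhlenbeck process, combined with a deterministic time-dependent volatility function. All parameter values, the volatility function and the independence of the two driving noises are illustrative assumptions.

```python
# Minimal Euler-Maruyama sketch: log-price X with an Ornstein-Uhlenbeck drift mu
# and a deterministic time-dependent volatility sigma(t). Parameters are assumptions.
import numpy as np

rng = np.random.default_rng(5)
T, n = 1.0, 1000
dt = T / n
kappa, theta, eta = 3.0, 0.05, 0.2                     # OU reversion, level, drift vol

def sigma(t):
    return 0.2 + 0.1 * np.sin(2 * np.pi * t)           # time-dependent volatility

x, mu = np.empty(n + 1), np.empty(n + 1)
x[0], mu[0] = np.log(100.0), 0.05
for i in range(n):
    t = i * dt
    dw1, dw2 = rng.normal(scale=np.sqrt(dt), size=2)   # independent Brownian increments
    x[i + 1] = x[i] + mu[i] * dt + sigma(t) * dw1
    mu[i + 1] = mu[i] + kappa * (theta - mu[i]) * dt + eta * dw2

print("terminal price:", np.exp(x[-1]))
```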
19

Kiss, Alexander J. "The inferential performance of five estimators in bivariate repeated measures regression analysis." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1998. http://www.collectionscanada.ca/obj/s4/f2/dsk2/tape15/PQDD_0008/MQ34027.pdf.

Full text
20

Van, der Bijl Rinske. "Bivariate wavelet construction based on solutions of algebraic polynomial identities." Thesis, Stellenbosch : Stellenbosch University, 2012. http://hdl.handle.net/10019.1/20175.

Full text
Abstract:
Thesis (PhD)--Stellenbosch University, 2012.
ENGLISH ABSTRACT: Multi-resolution analysis (MRA) has become a very popular field of mathematical study in the past two decades, being not only an area rich in applications but one that remains filled with open problems. Building on the foundation of refinability of functions, MRA seeks to filter through levels of ever-increasing detail components in data sets, a concept enticing to an age where the development of digital equipment (to name but one example) needs to capture more and more information and then store this information in different levels of detail. Apart from the design of digital objects such as animation movies, one of the most recent popular research areas in which MRA is applied is inpainting, where "lost" data (for example, in a photograph) is repaired by using boundary values of the data set and "smudging" these values into the empty entries. Two main branches of application in MRA are subdivision and wavelet analysis. The former uses refinable functions to develop algorithms with which digital curves are created from a finite set of initial points as input, the resulting curves (or drawings) of which possess certain levels of smoothness (or, mathematically speaking, continuous derivatives). Wavelets, on the other hand, yield filters with which certain levels of detail components (or noise) can be edited out of a data set. One of the greatest advantages of using wavelets is that the detail data is never lost, and the user can re-insert it into the original data set by merely applying the wavelet algorithm in reverse. This opens up a wonderful application for wavelets, namely that an existing data set can be edited by inserting detail components into it that were never there, by also using such a wavelet algorithm. In the recent book by Chui and De Villiers (see [2]), algorithms for both subdivision and wavelet applications were developed without using Fourier analysis as a foundation, as had been done by researchers in earlier years and which had left such algorithms inaccessible to end users such as computer programmers. The fundamental result of Chapter 9 on wavelets of [2] was that feasibility of wavelet decomposition is equivalent to the solvability of a certain set of identities consisting of Laurent polynomials, referred to as Bezout identities, and it was shown how such a system of identities can be solved in a systematic way. The work in [2] was done in the univariate case only, and it is the purpose of this thesis to develop similar results in the bivariate case, where such a generalization is entirely non-trivial. After introducing MRA in Chapter 1, as well as discussing the refinability of functions and introducing box splines as prototype examples of functions that are refinable in the bivariate setting, our fundamental result will also be that wavelet decomposition is equivalent to solving a set of Bezout identities; this will be shown rigorously in Chapter 2. In Chapter 3, we give a set of Laurent polynomials of shortest possible length satisfying the system of Bezout identities in Chapter 2, for the particular case of the Courant hat function, which will have been introduced as a linear box spline in Chapter 1. In Chapter 4, we investigate an application of our result in Chapter 3 to bivariate interpolatory subdivision.
With a view to establishing a general class of wavelets corresponding to the Courant hat function, we proceed in the subsequent Chapters 5 to 8 to develop a general theory for solving the Bezout identities of Chapter 2 separately, before suggesting strategies for reconciling these solution classes in order to obtain a simultaneous solution of the system.
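The "detail is never lost" property described in the abstract can be illustrated with the simplest possible univariate example, a one-level Haar decomposition and its exact inverse; the bivariate wavelets built on the Courant hat function in the thesis are not reproduced here.

```python
# Minimal univariate Haar sketch: one decomposition step splits a signal into a coarse
# part and detail coefficients, and applying the step in reverse restores the data
# exactly. Illustrative only; not the thesis's bivariate construction.
import numpy as np

def haar_decompose(signal):
    even, odd = signal[0::2], signal[1::2]
    coarse = (even + odd) / np.sqrt(2)
    detail = (even - odd) / np.sqrt(2)
    return coarse, detail

def haar_reconstruct(coarse, detail):
    even = (coarse + detail) / np.sqrt(2)
    odd = (coarse - detail) / np.sqrt(2)
    out = np.empty(2 * coarse.size)
    out[0::2], out[1::2] = even, odd
    return out

data = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0])
c, d = haar_decompose(data)
print("coarse:", c, "detail:", d)
print("exactly recovered:", np.allclose(haar_reconstruct(c, d), data))
```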
21

Saint, Pierre Aude. "Méthodes d'analyse génétique de traits quantitatifs corrélés : application à l'étude de la densité minérale osseuse." Phd thesis, Université Paris Sud - Paris XI, 2011. http://tel.archives-ouvertes.fr/tel-00633981.

Full text
Abstract:
Most human diseases have a complex etiology involving interacting genetic and environmental factors. Using correlated phenotypes can increase the power to detect quantitative trait loci. This work evaluates different bivariate analysis approaches for correlated traits using the information provided by markers at both the linkage and the association level. The relative gain of these approaches is compared with univariate analyses. The methods were applied to the variation of bone mineral density at two skeletal sites in a cohort of men selected for extreme phenotypic values. Our results show the value of using bivariate approaches, in particular for association analysis. In addition, within the GAW16 working group, we compared the relative performance of three association methods in family data.
22

Orjuela, Maria del Pilar. "A Study on the Correlation of Bivariate And Trivariate Normal Models." FIU Digital Commons, 2013. http://digitalcommons.fiu.edu/etd/976.

Full text
Abstract:
Suppose two or more variables are jointly normally distributed. If there is a common relationship between these variables, it is important to quantify this relationship by a parameter called the correlation coefficient, which measures its strength; it can then be used to develop a prediction equation and, ultimately, to draw testable conclusions about the parent population. This research focused on the correlation coefficient ρ for the bivariate and trivariate normal distributions when equal variances and equal covariances are considered. In particular, we derived the maximum likelihood estimators (MLEs) of the distribution parameters, assuming all of them are unknown, and we studied the properties and asymptotic distribution of the resulting estimator of ρ. Having shown this asymptotic normality, we were able to construct confidence intervals for the correlation coefficient ρ and test hypotheses about ρ. With a series of simulations, the performance of our new estimators was studied and compared with that of estimators that already exist in the literature. The results indicated that the MLE performs as well as or better than the others.
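A minimal numerical sketch of the estimation problem described here: maximizing the bivariate normal likelihood under the equal-variance restriction to obtain the MLE of ρ. The data are simulated, and the closed-form estimators and asymptotic results of the thesis are not reproduced.

```python
# Minimal sketch: MLE of rho for a bivariate normal with equal variances, obtained by
# direct numerical maximization of the likelihood. Simulated data; illustrative only.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(6)
rho_true, sigma, n = 0.6, 2.0, 200
cov = sigma**2 * np.array([[1.0, rho_true], [rho_true, 1.0]])
data = rng.multivariate_normal([1.0, -1.0], cov, size=n)
x, y = data[:, 0], data[:, 1]

def nll(params):
    # negative log-likelihood with parameters (mu1, mu2, log sigma, rho)
    mu1, mu2, log_s, rho = params
    s2 = np.exp(2 * log_s)
    dx, dy = x - mu1, y - mu2
    q = (dx**2 - 2 * rho * dx * dy + dy**2) / s2
    return (n * np.log(2 * np.pi) + 2 * n * log_s
            + 0.5 * n * np.log(1 - rho**2) + q.sum() / (2 * (1 - rho**2)))

res = minimize(nll, x0=[0.0, 0.0, 0.0, 0.0],
               bounds=[(None, None), (None, None), (None, None), (-0.99, 0.99)])
print("MLE of rho:", res.x[3])
```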
23

Chen, Cuixian. "Asymptotic properties of the Buckley-James estimator for a bivariate interval censorship regression model." Diss., Online access via UMI, 2007.

Find full text
24

Erte, Idil. "Bivariate Random Effects And Hierarchical Meta-analysis Of Summary Receiver Operating Characteristic Curve On Fine Needle Aspiration Cytology." Master's thesis, METU, 2011. http://etd.lib.metu.edu.tr/upload/12613619/index.pdf.

Full text
Abstract:
In this study, meta-analysis of diagnostic tests, the Summary Receiver Operating Characteristic (SROC) curve, bivariate random effects, and Hierarchical Summary Receiver Operating Characteristic (HSROC) curve theories are discussed, and the accuracy reported in the literature for Fine Needle Aspiration (FNA) biopsy, which is used in the diagnosis of breast masses (malignant or benign), is analyzed. FNA Cytological (FNAC) examination of breast tumors is easy, effective, effortless, and does not require special training for clinicians. Because of the uncertainty related to FNAC's accurate usage in publications, 25 FNAC studies have been gathered in the meta-analysis. In plotting the summary ROC curve, the logit differences and sums of the true positive rates and the false positive rates included in the meta-analysis were generated with SAS code. The formulation of the bivariate random effects model and the hierarchical summary ROC curve is presented in the context of the literature. A bivariate random effects implementation is then generated with the new SAS PROC GLIMMIX. Moreover, an HSROC implementation is generated with SAS PROC NLMIXED. Curves are plotted with RevMan Version 5 (2008). The meta-analytic results of the bivariate random effects model are nearly identical to the results from the HSROC approach. The results achieved through both random effects meta-analytic methods show that FNA cytology is a diagnostic test with a high ability to discriminate breast tumors.
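For orientation, the classical Moses-Littenberg summary ROC construction built from the logit sums and differences mentioned above can be sketched as follows; the study counts are invented, and the bivariate random-effects (SAS PROC GLIMMIX) and HSROC (SAS PROC NLMIXED) fits of the thesis are not reproduced.

```python
# Minimal sketch of the Moses-Littenberg SROC construction from per-study 2x2 counts:
# D = logit(TPR) - logit(FPR), S = logit(TPR) + logit(FPR), regress D on S.
# Counts are invented; random-effects fits are not reproduced here.
import numpy as np

# (true positives, false negatives, false positives, true negatives) per study
studies = np.array([[45, 5, 4, 46], [30, 10, 6, 54], [58, 2, 9, 31], [22, 8, 3, 67]])
tp, fn, fp, tn = studies.T.astype(float)
tpr = (tp + 0.5) / (tp + fn + 1.0)          # continuity-corrected rates
fpr = (fp + 0.5) / (fp + tn + 1.0)

def logit(p):
    return np.log(p / (1 - p))

D = logit(tpr) - logit(fpr)
S = logit(tpr) + logit(fpr)
b, a = np.polyfit(S, D, 1)                   # fitted line D = a + b * S

# SROC curve: expected sensitivity at chosen false positive rates
fpr_grid = np.linspace(0.01, 0.99, 5)
sens = 1 / (1 + np.exp(-(a / (1 - b) + (1 + b) / (1 - b) * logit(fpr_grid))))
print(dict(zip(np.round(fpr_grid, 2), np.round(sens, 3))))
```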
25

Provizionatou, Vikentia. "Models of risk management with heavy-tailed stock returns : univariate time series and bivariate copula analysis." Thesis, University of Essex, 2007. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.437678.

Full text
26

Li, Han. "Statistical Modeling and Analysis of Bivariate Spatial-Temporal Data with the Application to Stream Temperature Study." Diss., Virginia Tech, 2014. http://hdl.handle.net/10919/70862.

Full text
Abstract:
Water temperature is a critical factor for the quality and biological condition of streams. Among various factors affecting stream water temperature, air temperature is one of the most important factors related to water temperature. To appropriately quantify the relationship between water and air temperatures over a large geographic region, it is important to accommodate the spatial and temporal information of the stream temperature. In this dissertation, I devote effort to several statistical modeling techniques for analyzing bivariate spatial-temporal data in a stream temperature study. In the first part, I focus the analysis on the individual stream. A time varying coefficient model (VCM) is used to study the relationship between air temperature and water temperature for each stream. The time varying coefficient model enables dynamic modeling of the relationship, and therefore can be used to enhance the understanding of water and air temperature relationships. The proposed model is applied to 10 streams in Maryland, West Virginia, Virginia, North Carolina and Georgia using daily maximum temperatures. The VCM approach increases the prediction accuracy by more than 50% compared to the simple linear regression model and the nonlinear logistic model. The VCM that describes the relationship between water and air temperatures for each stream is represented by slope and intercept curves from the fitted model. In the second part, I consider water and air temperatures for different streams that are spatially correlated. I focus on clustering multiple streams by using intercept and slope curves estimated from the VCM. Spatial information is incorporated to make clustering results geographically meaningful. I further propose a weighted distance as a dissimilarity measure for streams, which provides a flexible framework to interpret the clustering results under different weights. Real data analysis shows that streams in the same cluster share similar geographic features such as solar radiation, percent forest and elevation. In the third part, I develop a spatial-temporal VCM (STVCM) to deal with missing data. The STVCM takes both spatial and temporal variation of water temperature into account. I develop a novel estimation method that emphasizes the time effect and treats the space effect as a varying coefficient for the time effect. A simulation study shows that the performance of the STVCM on missing data imputation is better than several existing methods such as the neural network and the Gaussian process. The STVCM is also applied to all 156 streams in this study to obtain a complete data record.
Ph. D.
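A minimal sketch of a time-varying coefficient fit of the form water_t = a(t) + b(t) * air_t + e_t, estimated by kernel-weighted least squares at each time point. The data, kernel and bandwidth are illustrative assumptions, not the estimator developed in the dissertation.

```python
# Minimal sketch of a time-varying coefficient fit via Gaussian-kernel-weighted least
# squares at each time point. Synthetic data; smoothing choices are assumptions.
import numpy as np

rng = np.random.default_rng(7)
n = 365
t = np.arange(n)
air = 15 + 10 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 2, n)
a_true = 2 + 1.5 * np.sin(2 * np.pi * t / 365)          # slowly varying intercept
b_true = 0.6 + 0.2 * np.cos(2 * np.pi * t / 365)        # slowly varying slope
water = a_true + b_true * air + rng.normal(0, 1, n)

def vcm_fit(t0, bandwidth=30.0):
    w = np.exp(-0.5 * ((t - t0) / bandwidth) ** 2)       # kernel weights around day t0
    X = np.column_stack([np.ones(n), air])
    XtW = X.T * w
    coef = np.linalg.solve(XtW @ X, XtW @ water)          # weighted least squares
    return coef                                           # (intercept a(t0), slope b(t0))

print("day 100 estimate:", vcm_fit(100), "truth:", (a_true[100], b_true[100]))
```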
27

Romeiro, Renata Guimarães 1987. "Modelo de regressão Birnbaum-Saunders bivariado." [s.n.], 2014. http://repositorio.unicamp.br/jspui/handle/REPOSIP/307091.

Full text
Abstract:
Advisor: Filidor Edilfonso Vilca Labra
Master's dissertation - Universidade Estadual de Campinas, Instituto de Matemática, Estatística e Computação Científica
Abstract: The Birnbaum-Saunders regression model of Rieck and Nedelman (1991) has been extensively discussed by various authors, with applications in survival and reliability studies. In this work a bivariate Birnbaum-Saunders regression model is developed through the use of the Sinh-Normal distribution proposed by Rieck (1989). This bivariate regression model can be used to analyze the correlated log-times of two units, and its marginals are univariate Birnbaum-Saunders regression models. For the bivariate Birnbaum-Saunders regression model we discuss some of its properties, moment estimation, maximum likelihood estimation and the observed Fisher information matrix. Hypothesis testing is performed by using the asymptotic normality of the maximum likelihood estimators. Influence diagnostic methods are developed for this model based on Cook's (1986) approach. Finally, the results of a simulation study as well as an application to a real data set are presented.
Master's degree in Statistics
28

Brunner, Manuela. "Hydrogrammes synthétiques par bassin et types d'événements. Estimation, caractérisation, régionalisation et incertitude." Thesis, Université Grenoble Alpes (ComUE), 2018. http://www.theses.fr/2018GREAU003/document.

Full text
Abstract:
Design flood estimates are needed in hydraulic design for the construction of dams and retention basins and in flood management for drawing hazard maps or modeling inundation areas. Traditionally, such design floods have been expressed in terms of peak discharge estimated in a univariate flood frequency analysis. However, design or flood management tasks involving storage, in addition to peak discharge, also require information on hydrograph volume, duration, and shape. A bivariate flood frequency analysis allows the joint estimation of peak discharge and hydrograph volume and the consideration of their dependence. While such bivariate design quantiles describe the magnitude of a design flood, they lack information on its shape. An attractive way of modeling the whole shape of a design flood is to express a representative normalized hydrograph shape as a probability density function. The combination of such a probability density function with bivariate design quantiles allows the construction of a synthetic design hydrograph for a certain return period which describes the magnitude of a flood along with its shape. Such synthetic design hydrographs have the potential to be a useful and simple tool in design flood estimation. However, they currently have some limitations. First, they rely on the definition of a bivariate return period which is not uniquely defined. Second, they usually describe the specific behavior of a catchment and do not express process variability represented by different flood types. Third, they are neither available for ungauged catchments nor are they usually provided together with an uncertainty estimate. This thesis therefore explores possibilities for the construction of synthetic design hydrographs in gauged and ungauged catchments and ways of representing process variability in design flood construction. It proposes tools for both catchment- and flood-type specific design hydrograph construction and regionalization and for the assessment of their uncertainty. The thesis shows that synthetic design hydrographs are a flexible tool allowing for the consideration of different flood or event types in design flood estimation. A comparison of different regionalization methods, including spatial, similarity, and proximity based approaches, showed that catchment-specific design hydrographs can be best regionalized to ungauged catchments using linear and nonlinear regression methods. It was further shown that event-type specific design hydrograph sets can be regionalized using a bivariate index flood approach. In such a setting, a functional representation of hydrograph shapes was found to be a useful tool for the delineation of regions with similar flood reactivities. An uncertainty assessment showed that the record length and the choice of the sampling strategy are major uncertainty sources in the construction of synthetic design hydrographs and that this uncertainty propagates through the regionalization process. This thesis highlights that an ensemble-based design flood approach allows for the consideration of different flood types and runoff processes. This is a step from flood frequency statistics to flood frequency hydrology which allows better-informed decision making.
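The construction described in the abstract, combining a design peak and volume with a normalized hydrograph shape expressed as a probability density, can be sketched as follows. The gamma shape, design values and scaling rule are purely illustrative assumptions, not the thesis's estimated shapes or quantiles.

```python
# Minimal sketch: build a synthetic design hydrograph from a design peak Qp and
# volume V plus a normalized shape given as a probability density (gamma, assumed).
import numpy as np
from scipy.stats import gamma

Qp, V = 120.0, 4.0e6                     # hypothetical design peak [m^3/s] and volume [m^3]
shape = gamma(a=3.0, scale=1.0)          # normalized hydrograph shape as a pdf (assumption)

# Choose the time scale tc so that the peak of Q(t) = (V / tc) * f(t / tc) equals Qp.
t_mode = (3.0 - 1.0) * 1.0               # mode of the gamma(3, 1) density
f_max = shape.pdf(t_mode)
tc = V * f_max / Qp                      # time scale [s]

t = np.linspace(0, 12 * tc, 500)
Q = (V / tc) * shape.pdf(t / tc)         # synthetic design hydrograph [m^3/s]
dt = t[1] - t[0]
print("peak [m^3/s]:", Q.max(), "volume [m^3]:", Q.sum() * dt)
```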
29

Teese, Robert. "Reckless Behaviour in Emerging Adulthood: A Psychosocial Approach." Thesis, Griffith University, 2018. http://hdl.handle.net/10072/380673.

Full text
Abstract:
The emerging adult years, extending from the late teens through the 20s, are commonly associated with peak physical health and being in the “prime” of one’s life. However, these years are also associated with peak lifetime prevalence rates of a number of reckless health-risk behaviours including alcohol and illicit substance use, dangerous driving, and unsafe sexual behaviour. Despite increased research attention, these preventable behaviours remain amongst the major contributing factors to morbidity and mortality rates of young people. The current thesis presents and tests a psychosocial model of emerging adult reckless behaviour comprising factors from each of the personality, social, cognitive, and affective domains of influence. More specifically, the research program assessed the predictive utility of each of sensation seeking, impulsivity, peer pressure, parental support, perceived risks, perceived benefits, positive affect, negative affect, and affective intensity in accounting for emerging adult involvement in each of reckless alcohol use, reckless illicit drug use, reckless driving, reckless sexual behaviour, and overall recklessness. In addition to two pilot studies, cross-sequential self-report research methods yielded three main studies of Australian emerging adults, one longitudinal (6 month time frame), and two cross-sectional. The first main cross-sectional study tested the proposed psychosocial model of recklessness on a sample of 283 emerging adults. Bivariate correlational analyses demonstrated that each of the components of the psychosocial model, except parental influence, were associated with at least one form of reckless behaviour. Hierarchical multiple regression analyses consistently demonstrated the utility of perceived benefits as the main predictor of emerging adult reckless behaviour involvement. Preliminary evidence of moderator effects were also found. The second main cross-sectional study tested a slightly revised version of the model on a sample of 952 emerging adults. Hierarchical multiple regression analyses demonstrated that the psychosocial model accounted for significant variance in emerging adult reckless behaviour, ranging between 20.9% (reckless driving behaviour) to 47.0% (overall recklessness). All psychosocial factors demonstrated either significant unique contributions, or were involved in a significant moderation effect, for at least one form of reckless behaviour. The study replicated the previous finding that perceived benefits was the strongest unique predictor of reckless behaviour involvement, however, this association was interpreted in light of significant moderation effects. The third main study was a two-wave (six months apart) longitudinal analysis of the psychosocial model, investigating both stability and change effects in a sample of 583 emerging adults. Lagged regression analyses demonstrated that T1 predictors accounted for significant variance in T2 outcomes over the 6 month time frame, ranging from 16.0% (reckless driving behaviour) to 37.2% (overall reckless behaviour), with predictive relationships that mostly mirrored cross-sectional findings. One notable exception to this was in relation to perceived risks, which did not demonstrate strong predictive utility in the cross-sectional findings, but was predictive of T1 to T2 stability for all forms of recklessness, except sexual behaviour. 
Whilst change in behaviour over the six-month time frame was minimal, investigation of T1 to T2 change effects demonstrated that change in predictor variables accounted for significant change in reckless behaviour involvement across all forms of recklessness. Results as a whole demonstrate the significance of perceived benefits as a key factor in emerging adult reckless behaviour. Whilst its influence can occur via moderating effects, it was consistently the strongest unique predictor across each of the various forms of recklessness. This could have significant implications for how behavioural interventions target cognitive influence pathways, and suggests that rather than continuing to ensure that young people are aware of the possible dangers associated with excessive drinking, illicit substance use, dangerous driving or unsafe sexual practices, increased emphasis should be placed on challenging their perceptions of the benefits derived from these behaviours. When combined with the lagged regression effects, the findings demonstrate the continued influence of cognitive factors on reckless behaviour engagement, even after accounting for demographic, personality, and social influences. The research program further demonstrated the importance of assessing the relative influence of multiple predictors simultaneously across various forms of reckless behaviour. Whilst some predictors demonstrated consistent links to recklessness regardless of type, others such as peer pressure and impulsivity were predictive of some forms of reckless engagement but not others. Such findings demonstrate that there are likely to be multiple pathways towards reckless behaviour involvement, with these pathways differing with the behaviour of interest. This suggests that interventions continuing down a ‘one approach fits all’ approach are likely to have limited success in reducing emerging adult reckless behaviour involvement.
Thesis (PhD Doctorate)
Doctor of Philosophy (PhD)
School of Applied Psychology
Griffith Health
Full Text
30

Strugnell, James Paul. "Paintings by numbers : applications of bivariate correlation and descriptive statistics to Russian avant-garde artwork." Thesis, University of St Andrews, 2017. http://hdl.handle.net/10023/10722.

Full text
Abstract:
In this thesis artwork is defined, through analogy with quantum mechanics, as the conjoining of the nonsimultaneously measurable momentum (waves) of artwork-text (words within the primary sources and exhibition catalogues) with the position (particles) of artwork-objects (artist-productivity/exhibition-quantities). Such a proposition allows for the changes within the artwork of the Russian avant-garde to be charted, as such artwork-objects are juxtaposed with different artwork-texts from 1902 to 2009. The artwork of an initial period from 1902 to 1934 is examined using primary-source artwork-text produced by Russian artists and critics in relation to the contemporaneous production-levels of various types of Russian-avant-garde artwork-objects. The primary sources in this dataset are those reproduced in the artwork-text produced by the 62 exhibitions described below, and those published in John E. Bowlt's 1991 edition of Russian Art of the Avant-Garde: Theory and Criticism. The production of artwork in the latter period from 1935 to 2009 is examined through consecutive exhibitions, and the relationship between the artwork-text produced by these exhibitions and the artwork-objects exhibited at them. The exhibitions examined within this thesis are 62 exhibitions containing Russian avant-garde artwork, held in Britain from 1935 to 2009. Content analysis, using an indices-and-symptom analytical construct, functions to convert the textual, unstructured data of the artwork-text words to numerical, structured data of recording-unit weighted percentages, whilst artist-productivity and exhibition-quantities of types of artwork-object convert the individual artwork-objects to structured data. Bivariate correlation, descriptive statistics, graphs and charts are used to define and compare relationships between: the recording units of the artwork-texts; the artist-productivity/exhibition-quantities of types of artwork-objects; the structured artwork-text data and structured artwork-object data. These various correlations between structured artwork-text data and structured artwork-object data are calculated in relationship to time (years) to chart the changes within these relationships. The changes within these relationships are synonymous with changes within Russian avant-garde artwork as presented from 1902 to 1934 and within the 62 British exhibitions from 1935 to 2009. Bivariate correlations between structured artwork-text data and structured artwork-object data express numerically (quantitatively) the ineffable relationships formed over time by large sets of unstructured data in the continued (re)creation of artwork.
APA, Harvard, Vancouver, ISO, and other styles
31

Godfrey, Jodi Anne. "Risk-Taking Characteristics as Explanatory Variables in Variations of Fatality Rates in the Southeastern United States." Scholar Commons, 2015. https://scholarcommons.usf.edu/etd/5483.

Full text
Abstract:
Traffic fatalities accounted for 1.24 million lives lost in 2013 worldwide, and almost 33 thousand of those fatalities were in the U.S. The southeastern region of the nation stands out for continuously having higher fatality rates per mile driven than the national average. If one can establish compelling relationships between various factors and fatality rates, then policies and investments can be targeted to increase the safety on the network by focusing on policies that mitigate those factors. In this research effort risk-taking characteristics are explored. These factors have not been as comprehensively reviewed as conventional factors such as vehicle and facility conditions associated with safety. The hypothesis assumes that if a person exhibits risk-taking behavior, that risk-taking behavior is not limited to only one aspect of risk, but is likely to occur in multiple facets of the person's life. Some of the risk-taking characteristics explored include credit score, safety belt use, smoking and tobacco use, drug use, mental health, educational attainment, obesity, and overall general health characteristics. All risk-taking characteristics with the exception of mental health were found to have statistically significant correlations with fatality rates alone. However, when a regression model was formed to estimate fatality rates by risk-taking characteristics, only four risk-taking characteristics - credit score, educational attainment, overall poor health, and seat belt use - were found to be statistically significant at an integrated level with other demographic characteristics such as unemployment levels and population born in state of residency. By identifying at-risk population segments, education, counseling, enforcement, or other strategies may be deployed to help improve travel safety.
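A minimal sketch of this two-step approach (bivariate correlations followed by a multiple regression entered simultaneously) might look as follows in Python; the variable names and data are entirely hypothetical, not values from the dissertation:
import numpy as np
import statsmodels.api as sm
from scipy.stats import pearsonr
# Hypothetical area-level data: fatality rate per miles driven and two risk proxies.
fatality_rate = np.array([1.4, 1.1, 1.8, 0.9, 1.6, 1.3])
seat_belt_use = np.array([0.82, 0.90, 0.75, 0.93, 0.78, 0.85])
credit_score = np.array([660, 705, 640, 720, 655, 690])
# Step 1: bivariate correlations with the fatality rate.
for name, x in [("seat belt use", seat_belt_use), ("credit score", credit_score)]:
    r, p = pearsonr(x, fatality_rate)
    print(name, round(r, 2), round(p, 3))
# Step 2: multiple regression with both predictors entered together.
X = sm.add_constant(np.column_stack([seat_belt_use, credit_score]))
model = sm.OLS(fatality_rate, X).fit()
print(model.params)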
APA, Harvard, Vancouver, ISO, and other styles
32

Zong, Yujie. "A Sensitivity Analysis of a Nonignorable Nonresponse Model Via EM Algorithm and Bootstrap." Digital WPI, 2011. https://digitalcommons.wpi.edu/etd-theses/208.

Full text
Abstract:
The Slovenian Public Opinion survey (SPOS), which was carried out in 1990, was used by the government of Slovenia as a benchmark to prepare for an upcoming plebiscite, which asked the respondents whether they support independence from Yugoslavia. However, the sample size was large and it is quite likely that the respondents and nonrespondents had divergent viewpoints. We first develop an ignorable nonresponse model which is an extension of a bivariate binomial model. In order to accommodate the nonrespondents, we then develop a nonignorable nonresponse model which is an extension of the ignorable model. Our methodology uses an EM algorithm to fit both the ignorable and nonignorable nonresponse models, and estimation is carried out using the bootstrap mechanism. We also perform sensitivity analysis to study different degrees of departures of the nonignorable nonresponse model from the ignorable nonresponse model. We found that the nonignorable nonresponse model is mildly sensitive to departures from the ignorable nonresponse model. In fact, our finding based on the nonignorable model is better than an earlier conclusion about another nonignorable nonresponse model fitted to these data.
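A minimal sketch of the bootstrap step described above (Python; fit_em is a hypothetical stand-in for the EM fit of the nonresponse model, and the data are simulated, not the SPOS data):
import numpy as np
rng = np.random.default_rng(0)
def fit_em(sample):
    # Placeholder for the EM fit of the (non)ignorable nonresponse model;
    # here it simply returns the observed proportion of 'yes' answers.
    return np.mean(sample)
data = rng.binomial(1, 0.88, size=2000)        # assumed survey responses
estimates = []
for _ in range(500):                           # bootstrap replicates
    resample = rng.choice(data, size=data.size, replace=True)
    estimates.append(fit_em(resample))
print("estimate:", fit_em(data), "bootstrap SE:", np.std(estimates, ddof=1))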
APA, Harvard, Vancouver, ISO, and other styles
33

Flamant, Julien. "Une approche générique pour l'analyse et le filtrage des signaux bivariés." Thesis, Ecole centrale de Lille, 2018. http://www.theses.fr/2018ECLI0008/document.

Full text
Abstract:
Les signaux bivariés apparaissent dans de nombreuses applications (optique, sismologie, océanographie, EEG, etc.) dès lors que l'analyse jointe de deux signaux réels est nécessaire. Les signaux bivariés simples ont une interprétation naturelle sous la forme d'une ellipse dont les propriétés (taille, forme, orientation) peuvent évoluer dans le temps. Cette propriété géométrique correspondant à la notion de polarisation en physique est fondamentale pour la compréhension et l'analyse des signaux bivariés. Les approches existantes n'apportent cependant pas de description directe des signaux bivariés ou des opérations de filtrage en termes de polarisation. Cette thèse répond à cette limitation par l'introduction d'une nouvelle approche générique pour l'analyse et le filtrage des signaux bivariés. Celle-ci repose sur deux ingrédients essentiels : (i) le plongement naturel des signaux bivariés -- vus comme signaux à valeurs complexes -- dans le corps des quaternions H et (ii) la définition d'une transformée de Fourier quaternionique associée pour une représentation spectrale interprétable de ces signaux. L'approche proposée permet de définir les outils de traitement de signal usuels tels que la notion de densité spectrale, de filtrage linéaire ou encore de spectrogramme ayant une interprétation directe en termes d'attributs de polarisation. Nous montrons la validité de l'approche grâce à des garanties mathématiques et une implémentation numériquement efficace des outils proposés. Diverses expériences numériques illustrent l'approche. En particulier, nous démontrons son potentiel pour la caractérisation de la polarisation des ondes gravitationnelles
Bivariate signals appear in a broad range of applications (optics, seismology, oceanography, EEG, etc.) where the joint analysis of two real-valued signals is required. Simple bivariate signals take the form of an ellipse, whose properties (size, shape, orientation) may evolve with time. This geometric feature of bivariate signals has a natural physical interpretation called polarization. This notion is fundamental to the analysis and understanding of bivariate signals. However, existing approaches do not provide straightforward descriptions of bivariate signals or filtering operations in terms of polarization or ellipse properties. To this end, this thesis introduces a new and generic approach for the analysis and filtering of bivariate signals. It essentially relies on two key ingredients: (i) the natural embedding of bivariate signals -- viewed as complex-valued signals -- into the set of quaternions H and (ii) the definition of a dedicated quaternion Fourier transform to enable a meaningful spectral representation of bivariate signals. The proposed approach features the definition of standard signal processing quantities such as spectral densities, linear time-invariant filters or spectrograms that are directly interpretable in terms of polarization attributes. More importantly, the framework does not sacrifice any mathematical guarantee and the newly introduced tools admit computationally fast implementations. Numerical experiments support our theoretical developments throughout. We also demonstrate the potential of the approach for the nonparametric characterization of the polarization of gravitational waves.
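The polarization attributes mentioned above can be illustrated, independently of the quaternion framework of the thesis, by computing instantaneous Stokes-like parameters from the analytic signals of the two components. The Python sketch below uses a toy elliptical signal, and the sign convention for S3 is an assumption:
import numpy as np
from scipy.signal import hilbert
# Toy bivariate signal: an ellipse traced in the (x, y) plane.
t = np.linspace(0, 1, 4000)
x = 1.0 * np.cos(2 * np.pi * 40 * t)
y = 0.4 * np.sin(2 * np.pi * 40 * t + 0.3)
X, Y = hilbert(x), hilbert(y)                  # analytic signals of both components
S0 = np.abs(X) ** 2 + np.abs(Y) ** 2           # instantaneous power
S1 = np.abs(X) ** 2 - np.abs(Y) ** 2
S2 = 2 * np.real(X * np.conj(Y))
S3 = -2 * np.imag(X * np.conj(Y))              # sign convention is an assumption
orientation = 0.5 * np.arctan2(S2, S1)         # ellipse orientation (rad)
ellipticity = 0.5 * np.arcsin(np.clip(S3 / S0, -1, 1))
print(float(np.degrees(orientation.mean())), float(ellipticity.mean()))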
APA, Harvard, Vancouver, ISO, and other styles
34

Gawde, Purva R. "INTEGRATED ANALYSIS OF TEMPORAL AND MORPHOLOGICAL FEATURES USING MACHINE LEARNING TECHNIQUES FOR REAL TIME DIAGNOSIS OF ARRHYTHMIA AND IRREGULAR BEATS." Kent State University / OhioLINK, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=kent1544106866041632.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Santos, Carlos Aparecido dos. "Dados de sobrevivência multivariados na presença de covariáveis e observações censuradas: uma abordagem bayesiana." Universidade Federal de São Carlos, 2010. https://repositorio.ufscar.br/handle/ufscar/4483.

Full text
Abstract:
In this work, we introduce a Bayesian analysis for multivariate survival data in the presence of a covariate vector and censored observations. Different frailties or latent variables are considered to capture the correlation among the survival times for the same individual. We also introduce a Bayesian analysis for some of the most popular bivariate exponential distributions introduced in the literature. A Bayesian analysis is also introduced for the Block & Basu bivariate exponential distribution using Markov Chain Monte Carlo (MCMC) methods and considering lifetimes in the presence of covariates and censored data. In another topic, we introduce a Bayesian analysis for bivariate lifetime data in the presence of covariates and censored observations, assuming different bivariate Weibull distributions derived from some existing copula functions. A great computational simplification for simulating samples from the joint posterior distribution is obtained using the WinBUGS software. Numerical illustrations are presented using real data sets for each proposed methodology.
Nesta tese introduzimos uma análise Bayesiana para dados de sobrevivência multivariados, na presença de um vetor de covariáveis e observações censuradas. Diferentes fragilidades ou variáveis latentes são consideradas para capturar a correlação existente entre os tempos de sobrevivência, para o mesmo indivíduo. Também apresentamos uma análise Bayesiana para algumas das mais populares distribuições exponenciais bivariadas introduzidas na literatura. Uma análise Bayesiana também é introduzida para a distribuição exponencial bivariada de Block & Basu, usando métodos MCMC (Monte Carlo em Cadeias de Markov) e considerando os tempos de sobrevivência na presença de covariáveis e dados censurados. Em outro tópico, introduzimos uma análise Bayesiana para dados de sobrevivência bivariados na presença de covariáveis e observações censuradas, assumindo diferentes distribuições bivariadas Weibull derivadas de algumas funções cópulas existentes. Uma grande simplificação computacional para simular amostras da distribuição a posteriori conjunta de interesse é obtida usando o software WinBUGS. Ilustrações numéricas são introduzidas considerando conjunto de dados reais, para cada uma das metodologias propostas.
APA, Harvard, Vancouver, ISO, and other styles
36

Marius, Matei. "A Contribution to Multivariate Volatility Modeling with High Frequency Data." Doctoral thesis, Universitat Ramon Llull, 2012. http://hdl.handle.net/10803/81072.

Full text
Abstract:
La tesi desenvolupa el tema de la predicció de la volatilitat financera en el context de l’ús de dades d’alta freqüència, i se centra en una línia de recerca doble: proposar models alternatius que millorarien la predicció de la volatilitat i classificar els models de volatilitat ja existents com els que es proposen en aquesta tesi. Els objectius es poden classificar en tres categories. El primer consisteix en la proposta d’un nou mètode de predicció de la volatilitat que segueix una línia de recerca desenvolupada recentment, la qual apunta al fet de mesurar la volatilitat intradia, com també la nocturna. Es proposa una categoria de models realized GARCH bivariants. El segon objectiu consisteix en la proposta d’una metodologia per predir la volatilitat diària multivariant amb models autoregressius que utilitzen estimacions de volatilitat diària (i nocturna, en el cas dels bivariants), a més d’informació d’alta freqüència, quan se’n disposava. S’aplica l’anàlisi de components principals (ACP) a un conjunt de models de tipus realized GARCH univariants i bivariants. El mètode representa una extensió d’un model ja existent (PC-GARCH) que estimava un model GARCH multivariant a partir de l’estimació de models GARCH univariants dels components principals de les variables inicials. El tercer objectiu de la tesi és classificar el rendiment dels models de predicció de la volatilitat ja existents o dels nous, a més de la precisió de les mesures intradia que s’utilitzaven en les estimacions dels models. En relació amb els resultats, s’observa que els models EGARCHX, realized EGARCH i realized GARCH(2,2) obtenen una millor valoració, mentre que els models GARCH i no realized EGARCH obtenen uns resultats inferiors en gairebé totes les proves. Això permet concloure que el fet d’incorporar mesures de volatilitat intradia millora el problema de la modelització. Quant a la classificació dels models realized bivariants, s’observa que tant els models realized GARCH bivariant (en versions completes i parcials) com el model realized EGARCH bivariant obtenen millors resultats; els segueixen els models realized GARCH(2,2) bivariant, EGARCH bivariant I EGARCHX bivariant. En comparar les versions bivariants amb les univariants, amb l’objectiu d’investigar si l’ús de mesures de volatilitat nocturna a les equacions dels models millora l’estimació de la volatilitat, es mostra que els models bivariants superen els univariants. Els resultats proven que els models bivariants no són totalment inferiors als seus homòlegs univariants, sinó que resulten ser bones alternatives per utilitzar-los en la predicció, juntament amb els models univariants, per tal d’obtenir unes estimacions més fiables.
La tesis desarrolla el tema de la predicción de la volatilidad financiera en el contexto del uso de datos de alta frecuencia, y se centra en una doble línea de investigación: la de proponer modelos alternativos que mejorarían la predicción de la volatilidad y la de clasificar modelos de volatilidad ya existentes como los propuestos en esta tesis. Los objetivos se pueden clasificar en tres categorías. El primero consiste en la propuesta de un nuevo método de predicción de la volatilidad que sigue una línea de investigación recientemente desarrollada, la cual apunta al hecho de medir la volatilidad intradía, así como la nocturna. Se propone una categoría de modelos realized GARCH bivariantes. El segundo objetivo consiste en proponer una metodología para predecir la volatilidad diaria multivariante con modelos autorregresivos que utilizaran estimaciones de volatilidad diaria (y nocturna, en el caso de los bivariantes), además de información de alta frecuencia, si la había disponible. Se aplica el análisis de componentes principales (ACP) a un conjunto de modelos de tipo realized GARCH univariantes y bivariantes. El método representa una extensión de un modelo ya existente (PCGARCH) que calculaba un modelo GARCH multivariante a partir de la estimación de modelos GARCH univariantes de los componentes principales de las variables iniciales. El tercer objetivo de la tesis es clasificar el rendimiento de los modelos de predicción de la volatilidad ya existentes o de los nuevos, así como la precisión de medidas intradía utilizadas en las estimaciones de los modelos. En relación con los resultados, se observa que los modelos EGARCHX, realized EGARCH y GARCH(2,2) obtienen una mejor valoración, mientras que los modelos GARCH y no realized EGARCH obtienen unos resultados inferiores en casi todas las pruebas. Esto permite concluir que el hecho de incorporar medidas de volatilidad intradía mejora el problema de la modelización. En cuanto a la clasificación de modelos realized bivariantes, se observa que tanto los modelos realized GARCH bivariante (en versiones completas y parciales) como realized EGARCH bivariante obtienen mejores resultados; les siguen los modelos realized GARCH(2,2) bivariante, EGARCH bivariante y EGARCHX bivariante. Al comparar las versiones bivariantes con las univariantes, con el objetivo de investigar si el uso de medidas de volatilidad nocturna en las ecuaciones de los modelos mejora la estimación de la volatilidad, se muestra que los modelos bivariantes superan los univariantes. Los resultados prueban que los modelos bivariantes no son totalmente inferiores a sus homólogos univariantes, sino que resultan ser buenas alternativas para utilizarlos en la predicción, junto con los modelos univariantes, para lograr unas estimaciones más fiables.
The thesis develops the topic of financial volatility forecasting in the context of the usage of high frequency data, and focuses on a twofold line of research: that of proposing alternative models that would enhance volatility forecasting and that of ranking existing or newly proposed volatility models. The objectives may be grouped into three categories. The first consists of the proposal of a new method of volatility forecasting that follows a recently developed research line that pointed to using measures of intraday volatility and also measures of night volatility, the need for new models being given by the question of whether adding measures of night volatility improves day volatility estimations. As a result, a class of bivariate realized GARCH models was proposed. The second objective was to propose a methodology to forecast multivariate day volatility with autoregressive models that used day (and night, for the bivariate case) volatility estimates, as well as high frequency information when that was available. For this, Principal Component Analysis (PCA) was applied to a class of univariate and bivariate realized GARCH-type models. The method represents an extension of one existing model (PC GARCH) that estimated a multivariate GARCH model by estimating univariate GARCH models of the principal components of the initial variables. The third goal of the thesis was to rank the performance of existing or newly proposed volatility forecasting models, as well as the accuracy of the intraday measures used in the realized model estimations. With regard to the univariate realized models' rankings, it was found that the EGARCHX, Realized EGARCH and Realized GARCH(2,2) models persistently ranked better, while the non-realized GARCH and EGARCH models performed poorly in almost every instance. This allowed us to conclude that incorporating measures of intraday volatility improves the modeling. With respect to the bivariate realized models' ranking, it was found that the Bivariate Realized GARCH (partial and complete versions) and Bivariate Realized EGARCH models performed the best, followed by the Bivariate Realized GARCH(2,2), Bivariate EGARCH and Bivariate EGARCHX models. When the bivariate versions were compared to the univariate ones in order to investigate whether using night volatility measurements in the models' equations improves volatility estimation, it was found that the bivariate models surpassed the univariate ones when specific methodology, ranking criteria and stocks were used. The results were mixed, allowing us to conclude that the bivariate models did not prove totally inferior to their univariate counterparts, proving to be good alternative options to be used in the forecasting exercise, together with the univariate models, for more reliable estimates. Finally, the PC realized models and PC bivariate realized models were estimated and their performances were ranked; improvements the PC methodology brought to high frequency multivariate modeling of stock returns were also discussed. PC models were found to be highly effective in estimating the multivariate volatility of highly correlated stock assets, and suggestions on how investors could use them for portfolio selection were made.
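A rough sketch of the PC-GARCH idea described above (not of the thesis's realized GARCH specifications) might look as follows in Python, assuming the third-party arch and scikit-learn packages are available and using simulated returns:
import numpy as np
from sklearn.decomposition import PCA
from arch import arch_model       # third-party 'arch' package, assumed installed
rng = np.random.default_rng(1)
returns = rng.standard_normal((1000, 5)) * 0.01   # hypothetical daily returns, 5 assets
pca = PCA(n_components=5)
components = pca.fit_transform(returns)           # principal components of the returns
# Univariate GARCH(1,1) on each component, as in the PC-GARCH idea.
fits = [arch_model(100 * components[:, k], vol="GARCH", p=1, q=1).fit(disp="off")
        for k in range(components.shape[1])]
# Reconstruct a covariance forecast, diagonal in PC space, for the next day.
h = np.array([f.forecast(horizon=1).variance.values[-1, 0] for f in fits]) / 100**2
cov_forecast = pca.components_.T @ np.diag(h) @ pca.components_
print(cov_forecast.round(6))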
APA, Harvard, Vancouver, ISO, and other styles
37

Liu, Ke. "A joint model of an internal time-dependent covariate and bivariate time-to-event data with an application to muscular dystrophy surveillance, tracking and research network data." Diss., University of Iowa, 2015. https://ir.uiowa.edu/etd/2237.

Full text
Abstract:
Joint modeling of a single event time response with a longitudinal covariate dates back to the 1990s. The three basic types of joint modeling formulations are selection models, pattern mixture models and shared parameter models. The shared parameter models are most widely used. One type of a shared parameter model (Joint Model I) utilizes unobserved random effects to jointly model a longitudinal sub-model and a survival sub-model to assess the impact of an internal time-dependent covariate on the time-to-event response. Motivated by the Muscular Dystrophy Surveillance, Tracking and Research Network (MD STARnet), we constructed a new model (Joint Model II), to jointly analyze correlated bivariate time-to-event responses associated with an internal time-dependent covariate in the Frequentist paradigm. This model exhibits two distinctive features: 1) a correlation between bivariate time-to-event responses and 2) a time-dependent internal covariate in both survival models. Developing a model that sufficiently accommodates both characteristics poses a challenge. To address this challenge, in addition to the random variables that account for the association between the time-to-event responses and the internal time-dependent covariate, a Gamma frailty random variable was used to account for the correlation between the two event time outcomes. To estimate the model parameters, we adopted the Expectation-Maximization (EM) algorithm. We built a complete joint likelihood function with respect to both latent variables and observed responses. The Gauss-Hermite quadrature method was employed to approximate the two-dimensional integrals in the E-step of the EM algorithm, and the maximum profile likelihood type of estimation method was implemented in the M-step. The bootstrap method was then applied to estimate the standard errors of the estimated model parameters. Simulation studies were conducted to examine the finite sample performance of the proposed methodology. Finally, the proposed method was applied to MD STARnet data to assess the impact of shortening fractions and steroid use on the onsets of scoliosis and mental health issues.
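As a hedged illustration of the Gauss-Hermite step mentioned above, the following Python sketch approximates a two-dimensional expectation against independent standard normal random effects with a tensor-product rule; the integrand is a hypothetical placeholder, not the thesis's likelihood contribution:
import numpy as np
from numpy.polynomial.hermite import hermgauss
nodes, weights = hermgauss(20)                     # 20-point Gauss-Hermite rule
def g(z1, z2):
    # Hypothetical integrand, e.g. one contribution to an E-step expectation.
    return np.exp(0.3 * z1 - 0.2 * z2) / (1 + z1**2 + z2**2)
# E[g(Z1, Z2)] for independent standard normal Z1, Z2 via a tensor-product rule.
Z1, Z2 = np.meshgrid(np.sqrt(2) * nodes, np.sqrt(2) * nodes, indexing="ij")
W = np.outer(weights, weights)
approx = np.sum(W * g(Z1, Z2)) / np.pi
print(approx)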
APA, Harvard, Vancouver, ISO, and other styles
38

Serengil, Volkan. "Bagarna i den nya och gamla gymnasieskolan; Vägen till akademiska studier." Thesis, Malmö högskola, Lärarutbildningen (LUT), 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:mau:diva-32381.

Full text
Abstract:
The thesis compares (a) the Restaurant and Food Programme with the bakery and pastry orientation (RL-BAG) and (b) the Food Programme with a bakery orientation (LP-BAG) in relation to each other and to continued academic studies. They are the corresponding programmes from Gy11 and Gy2000, respectively. The research question focuses on the difference between RL-BAG and LP-BAG regarding eligibility for, and access to, academic programmes at two higher education institutions. The aim of the thesis is to determine what this difference looks like for the upper secondary programmes in question. That a difference exists has already been established, among other things in the design of the new upper secondary school. One way to view the thesis is that it has an evaluative function with respect to the vocational programmes of the most recent upper secondary school ordinance (Gy11). The methods used in the study are mainly quantitative (in the presentation of the results). The study measures the bivariate relationship between combinations of eligibility-granting upper secondary courses and programme access at the two institutions. The method also uses SWOT analysis, among other tools, to define the differences. The results show that there is a difference in the selection. The basic eligibility provided by Gy11 is considerably more advantageous than the corresponding level for Gy2000, even when the core subjects are counted for LP-BAG. On the other hand, LP-BAG gives access to more courses granting specific eligibility, which in various combinations open access to more programmes.
APA, Harvard, Vancouver, ISO, and other styles
39

Kahlenberg, Jens [Verfasser]. "Storno und Profitabilität in der Privathaftpflichtversicherung : Eine Analyse unter Verwendung von univariaten und bivariaten verallgemeinerten linearen Modellen / Jens Kahlenberg." Aachen : Shaker, 2005. http://d-nb.info/1181607949/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Moore, Alvin R. "An Evaluation of a Program for Incarcerated Mothers: Parenting Training and the Enhancement of Self-Esteem." VCU Scholars Compass, 1995. http://scholarscompass.vcu.edu/etd/1493.

Full text
Abstract:
The purpose of this study was to examine the effects of parenting training on the acquisition of parenting skills and its impact on self-esteem of incarcerated mothers. The program under study is the "Mothers Inside Loving Kids" (M.I.L.K.) program, which is a holistic training/visitation program designed for incarcerated mothers. Study participants included 40 volunteer incarcerated mothers at the Virginia Correctional Center for women. The treatment group consisted of 20 participants who were already involved in the "M.I.L.K." program. The comparison group was made up of 20 mothers who were on the waiting list for the program due to the lack of space. All participants were administered a battery of pre-tests and post-tests. Instruments utilized for the study included the Adult-Adolescent Parenting Inventory (AAPI), the Nurturing Quiz, the Index of Self Esteem (ISE), and a participant satisfaction survey. Bivariate analyses were used to test the difference between pre-test and post-test mean scores. Both parametric and non-parametric tests were conducted to determine if change scores revealed significant differences. Using independent t-tests to determine if there were significant differences between treatment and comparison groups on change scores, no significant differences were noted. However, in reviewing the direction of change scores for the two groups, the treatment group did show changes in the desired direction in four areas. Specifically, positive directional change occurred on the "Lack of Empathy for the Child" sub-scale, the "Belief in Corporal Punishment" sub-scale, the "Reversing Family Roles" sub-scale, and on the "Nurturing Quiz." Using the Wilcoxon non-parametric test, one measure revealed statistically significant differences between pre-test and post-test scores. Specifically, participants in the treatment group revealed significantly higher scores on the "Nurturing Quiz" at post-testing from pre-testing (z = -2.81, p = .005). This indicates an overall increase in knowledge about positive child management techniques. No significant pre-test to post-test differences were noted in any of the remaining areas under study. However, positive directional change scores were noted in the three different areas of "Inappropriate Expectations of the Child", "Nurturing", and "Self-Esteem." Overall, the findings suggest that the M.I.L.K. Program training positively impacts parenting techniques. Self-esteem appears more difficult to impact.
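A minimal Python sketch of the kind of comparisons reported above, an independent t-test on change scores between groups and a Wilcoxon signed-rank test within the treatment group; the scores below are simulated, not the study's data:
import numpy as np
from scipy.stats import ttest_ind, wilcoxon
rng = np.random.default_rng(7)
# Hypothetical pre/post "Nurturing Quiz" scores for treatment and comparison groups (n = 20 each).
pre_t = rng.normal(60, 8, 20)
post_t = pre_t + rng.normal(4, 6, 20)     # assumed average gain for the treatment group
pre_c = rng.normal(60, 8, 20)
post_c = pre_c + rng.normal(1, 6, 20)     # assumed smaller drift for the comparison group
t_stat, t_p = ttest_ind(post_t - pre_t, post_c - pre_c)   # between-group test on change scores
w_stat, w_p = wilcoxon(post_t, pre_t)                     # within-group pre/post comparison
print(round(t_stat, 2), round(t_p, 3), round(w_stat, 1), round(w_p, 3))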
APA, Harvard, Vancouver, ISO, and other styles
41

Lu, Yang. "Analyse de survie bivariée à facteurs latents : théorie et applications à la mortalité et à la dépendance." Thesis, Paris 9, 2015. http://www.theses.fr/2015PA090020/document.

Full text
Abstract:
Cette thèse étudie quelques problèmes d’identification et d’estimation dans les modèles de survie bivariée, avec présence d’hétérogénéité individuelle et des facteurs communs stochastiques.Chapitre I introduit le cadre général.Chapitre II propose un modèle pour la mortalité des deux époux dans un couple. Il permet de distinguer deux types de dépendance : l’effet de deuil et l’effet lié au facteur de risque commun des deux époux. Une analyse de leurs effets respectifs sur les primes d’assurance écrites sur deux têtes est proposée.Chapitre III montre que, sous certaines hypothèses raisonnables, on peut identifier l’évolution jointe du risque d’entrer en dépendance et du risque de mortalité, à partir des données de mortalité par cohortes. Une application à la population française est proposée.Chapitre IV étudie la queue de distribution dans les modèles de survie bivariée. Sous certaines hypothèses, la loi jointe des deux durées résiduelles converge, après une normalisation adéquate. Cela peut être utilisé pour analyser le risque parmi les survivants aux âges élevés. Parallèlement, la distribution d’hétérogénéité parmi les survivants converge vers une distribution semi-paramétrique
This thesis comprises three essays on identification and estimation problems in bivariate survival models with individual and common frailties.The first essay proposes a model to capture the mortality dependence of the two spouses in a couple. It allows to disentangle two types of dependencies : the broken heart syndrome and the dependence induced by common risk factors. An analysis of their respective effects on joint insurance premia is also proposed.The second essay shows that, under reasonable model specifications that take into account the longevity effect, we can identify the joint distribution of the long-term care and mortality risks from the observation of cohort mortality data only. A numerical application to the French population data is proposed.The third essay conducts an analysis of the tail of the joint distribution for general bivariate survival models with proportional frailty. We show that under appropriate assumptions, the distribution of the joint residual lifetimes converges to a limit distribution, upon normalization. This can be used to analyze the mortality and long-term care risks at advanced ages. In parallel, the heterogeneity distribution among survivors converges also to a semi-parametric limit distribution. Properties of the limit distributions, their identifiability from the data, as well as their implications are discussed
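The dependence induced by a common risk factor can be illustrated by simulating two lifetimes that share a Gamma frailty; this Python sketch uses assumed baseline hazards and a frailty variance chosen purely for illustration, not the models estimated in the thesis:
import numpy as np
rng = np.random.default_rng(42)
n = 10_000
theta = 2.0                                          # frailty variance parameter (assumed)
Z = rng.gamma(shape=1 / theta, scale=theta, size=n)  # shared frailty with mean 1
lam1, lam2 = 0.02, 0.03                              # assumed baseline hazards
T1 = rng.exponential(1 / (Z * lam1))                 # lifetime of spouse 1
T2 = rng.exponential(1 / (Z * lam2))                 # lifetime of spouse 2
print("correlation induced by the shared frailty:", np.corrcoef(T1, T2)[0, 1].round(3))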
APA, Harvard, Vancouver, ISO, and other styles
42

Lladser, Manuel Eugenio. "Asymptotic enumeration via singularity analysis." Connect to this title online, 2003. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1060976912.

Full text
Abstract:
Thesis (Ph. D.)--Ohio State University, 2003.
Title from first page of PDF file. Document formatted into pages; contains x, 227 p.; also includes graphics. Includes bibliographical references (p. 224-227). Available online via OhioLINK's ETD Center.
APA, Harvard, Vancouver, ISO, and other styles
43

Sadeghi, Mehdi. "Potential of the Empirical Mode Decomposition to analyze instantaneous flow fields in Direct Injection Spark Ignition engine : Effect of transient regimes." Thesis, Orléans, 2017. http://www.theses.fr/2017ORLE2069/document.

Full text
Abstract:
Cette étude introduit une nouvelle approche appelée Bivariate 2D-EMD pour séparer le mouvement organisé à grande échelle, soit la composante basse fréquence de l’écoulement des fluctuations turbulentes, soit la composante haute fréquence dans un champ de vitesse instantané bidimensionnel.Cette séparation nécessite un seul champ de vitesse instantané contrairement aux autres méthodes plus couramment utilisées en mécanique des fluides, comme le POD. La méthode proposée durant cette thèse est tout à fait appropriée à l’analyse des écoulements qui sont intrinsèquement instationnaires et non linéaires comme l'écoulement dans le cylindre lorsque le moteur fonctionne dans des conditions transitoires. Bivariate 2D-EMD est validé à travers différents cas test, sur un écoulement turbulent homogène et isotrope (THI) expérimental, qui a été perturbé par un tourbillon synthétique de type Lamb-Ossen, qui simule le mouvement organisé dans le cylindre. Enfin, Il est appliqué sur un écoulement expérimental obtenu dans un cylindre et les résultats de la séparation d'écoulement sont comparés à ceux basés sur l'analyse POD. L’évolution d’écoulement dans le cylindre pendant le fonctionnement du moteur transitoire, c’est à dire une accélération du régime moteur de 1000 à 2000tr/min en différentes rampes, sont mesurée en utilisant de PIV 2D-2C haute cadence. Les champs de vitesse sont obtenus dans le plan de tumble dans un moteur un moteur GDI mono-cylindre transparent et forment une base de données nécessaire pour valider les résultats de simulation numérique
This study introduces a new approach called Bivariate 2D-EMD to separate the large-scale organized motion, i.e., the low-frequency component of the flow, from the random turbulent fluctuations, i.e., the high-frequency component, in a given in-cylinder instantaneous 2D velocity field. This signal processing method needs only one instantaneous velocity field, contrary to the other methods commonly used in fluid mechanics, such as POD. The proposed method is well suited to analysing flows that are intrinsically both unsteady and nonlinear, such as in-cylinder flows. The Bivariate 2D-EMD is validated through different test cases, by optimizing it and applying it to an experimental homogeneous and isotropic turbulent flow (HIT), perturbed by a synthetic Lamb-Ossen vortex, to simulate the features of in-cylinder flows. Furthermore, it is applied to experimental in-cylinder flows, and the results obtained by EMD and POD analysis are compared. The evolution of the in-cylinder flow during transient engine operation, i.e., engine speed acceleration from 1000 to 2000 rpm over different time periods, was obtained by high-speed PIV 2D-2C. The velocity fields are obtained within the tumble plane in a transparent single-cylinder DISI engine and provide a database needed to validate CFD simulation results.
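For intuition only, the following Python sketch separates a toy one-dimensional signal into low- and high-frequency parts with standard empirical mode decomposition, assuming the third-party PyEMD (EMD-signal) package is installed; the thesis's Bivariate 2D-EMD works on full two-component velocity fields and is not reproduced here:
import numpy as np
from PyEMD import EMD        # third-party 'EMD-signal' (PyEMD) package, assumed installed
t = np.linspace(0, 1, 2000)
organized = np.sin(2 * np.pi * 3 * t)                       # slow, large-scale motion (toy)
turbulence = 0.3 * np.random.default_rng(0).standard_normal(t.size)
signal = organized + turbulence
imfs = EMD().emd(signal, t)                                 # intrinsic mode functions
# Heuristic split: the last modes (and residue) carry the low-frequency content;
# the number of IMFs returned varies with the data.
low_freq = imfs[-2:].sum(axis=0)
high_freq = imfs[:-2].sum(axis=0)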
APA, Harvard, Vancouver, ISO, and other styles
44

Yalcinoz, Zerrin. "A Simulation Study On Marginalized Transition Random Effects Models For Multivariate Longitudinal Binary Data." Master's thesis, METU, 2008. http://etd.lib.metu.edu.tr/upload/12609568/index.pdf.

Full text
Abstract:
In this thesis, a simulation study is conducted and a statistical model is fitted to the simulated data. The data are assumed to represent the satisfaction of customers who withdraw their salary from a particular bank. They form a longitudinal data set with a bivariate binary response, assumed to be collected from 200 individuals at four different time points. In such data sets, two types of dependence - the dependence within subject measurements and the dependence between responses - are important, and these are considered in the model. The model is the Marginalized Transition Random Effects Model, which has three levels. The first level measures the effect of covariates on responses, the second level accounts for temporal changes, and the third level measures the differences between individuals. Markov Chain Monte Carlo methods are used for the model fit. In the simulation study, the discrepancies between the estimated values and the true parameters are examined under two conditions, when the model is correctly specified or not. Results suggest that better convergence is obtained with the full model. The third level, which captures the individual changes, is more sensitive to model misspecification than the other levels of the model.
APA, Harvard, Vancouver, ISO, and other styles
45

Durand, Marianne. "Combinatoire analytique et algorithmique des ensembles de données." Phd thesis, Ecole Polytechnique X, 2004. http://pastel.archives-ouvertes.fr/pastel-00000810.

Full text
Abstract:
This thesis deals with the algorithmics of data sets from the viewpoint of analytic combinatorics. Three problems illustrating this approach are treated here: jump lists associated with bivariate asymptotic analysis, hashing with random probing under paging, and probabilistic counting. Jump lists are a data structure intermediate between skiplists and binary search trees. The study of this structure gave rise to a bivariate asymptotics problem with coalescence of singularities. Hashing with random probing is an algorithm that handles the collisions of a hash table. In the paging context studied here, the mean as well as all successive moments of the construction cost are obtained. The original probabilistic counting algorithms Loglog and Super Loglog make it possible to estimate the cardinality of a set using one kilobyte of memory with a precision of about 3%.
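A compact sketch of the LogLog estimator mentioned above (Python; the hash choice, the bucket constant and the rank definition are simplifying assumptions, not the exact construction analysed in the thesis):
import hashlib
def loglog_estimate(items, k=10):
    # LogLog sketch with m = 2**k buckets; alpha is an approximate bias constant for large m.
    m = 1 << k
    alpha = 0.39701
    maxima = [0] * m
    for item in items:
        h = int(hashlib.sha1(str(item).encode()).hexdigest(), 16)
        bucket = h & (m - 1)             # low k bits choose the bucket
        w = h >> k                       # remaining hash bits
        rho = (w & -w).bit_length()      # rank: position of the lowest set bit
        maxima[bucket] = max(maxima[bucket], rho)
    return alpha * m * 2 ** (sum(maxima) / m)
print(loglog_estimate(range(100_000)))   # roughly 1e5, within a few percent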
APA, Harvard, Vancouver, ISO, and other styles
46

Li, Wei. "Numerical Modelling and Statistical Analysis of Ocean Wave Energy Converters and Wave Climates." Doctoral thesis, Uppsala universitet, Elektricitetslära, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-305870.

Full text
Abstract:
Ocean wave energy is considered to be one of the important potential renewable energy resources for sustainable development. Various wave energy converter technologies have been proposed to harvest the energy from ocean waves. This thesis is based on the linear generator wave energy converter developed at Uppsala University. The research in this thesis focuses on the foundation optimization and the power absorption optimization of the wave energy converters and on the wave climate modelling at the Lysekil wave converter test site. The foundation optimization study of the gravity-based foundation of the linear wave energy converter is based on statistical analysis of wave climate data measured at the Lysekil test site. The 25-year return extreme significant wave height and its associated mean zero-crossing period are chosen as the maximum wave for evaluating the maximum heave and surge forces. The power absorption optimization study on the linear generator wave energy converter is based on the wave climate at the Lysekil test site. A simplified frequency-domain numerical model is used with the power take-off damping coefficient chosen as the control parameter for optimizing the power absorption. The results show a large improvement with an optimized power take-off damping coefficient adjusted to the characteristics of the wave climate at the test site. The wave climate modelling studies are based on the wave climate data measured at the Lysekil test site. A new mixed distribution method is proposed for modelling the significant wave height. This method gives impressive goodness of fit with the measured wave data. A copula method is applied to the bivariate joint distribution of the significant wave height and the wave period. The results show an excellent goodness of fit for the Gumbel model. The general applicability of the proposed mixed-distribution method and the copula method is illustrated with wave climate data from four other sites. The results confirm the good performance of the mixed-distribution and the Gumbel copula model for the modelling of significant wave height and bivariate wave climate.
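As an illustrative sketch of the marginal-plus-copula construction described above (Python with SciPy; a single Weibull marginal stands in for the thesis's mixed distribution, and all data and parameter values are invented):
import numpy as np
from scipy.stats import weibull_min
rng = np.random.default_rng(3)
hs = weibull_min.rvs(c=1.6, scale=1.2, size=5000, random_state=rng)  # toy Hs sample (m)
# Marginal fit for significant wave height (location fixed at 0).
c_hat, loc_hat, scale_hat = weibull_min.fit(hs, floc=0)
def gumbel_copula_cdf(u, v, theta):
    # Gumbel copula C(u, v) = exp(-[(-ln u)^theta + (-ln v)^theta]^(1/theta)), theta >= 1.
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1 / theta)))
u = weibull_min.cdf(2.0, c_hat, loc_hat, scale_hat)   # P(Hs <= 2 m) under the fitted marginal
v = 0.8                                               # assumed marginal CDF value for the wave period
print(gumbel_copula_cdf(u, v, theta=1.8))             # joint non-exceedance probability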
APA, Harvard, Vancouver, ISO, and other styles
47

Sztukowski, Lisa Ann. "Foraging ecology of the Campbell Albatross : individual specialisation and fishery interactions." Thesis, University of Plymouth, 2016. http://hdl.handle.net/10026.1/5377.

Full text
Abstract:
Most albatrosses are critically endangered, endangered or vulnerable due to the deleterious impact of fisheries, pollution, introduced species, habitat alteration, and climate change. Foraging behaviour influences many aspects of seabird biology, and a detailed understanding of foraging ecology is required to better predict the impacts of significant changes to the marine environment. Campbell Albatross (Thalassarche impavida) is a threatened endemic, confined to a small number of locations on Campbell Island, New Zealand and was recently split from the closely related Black-browed Albatross (T. melanophrys). We currently lack much basic information on the foraging behaviour of this species, hindering our ability to understand how change may have occurred in the past and make predictions about its long-term future. First, I used GPS loggers and stable isotope analysis of blood to investigate how distribution and foraging effort (distance travelled and duration) varied with sex and breeding stage. I found that Campbell Albatrosses are sexually dimorphic and showed sex-specific foraging behaviour and habitat use – although this varied by stage of reproduction. Because males and females may be vulnerable to different threats, such as interactions with fisheries, I compared the spatial overlap and high resolution spatio-temporal overlaps between fisheries vessels and albatrosses within New Zealand's Exclusive Economic Zone (EEZ). Albatrosses utilised 32% of the EEZ; however, they overlapped with fisheries vessels in only 0.20% of the area. Previous research has demonstrated that the influence of fisheries vessels goes beyond the immediate location of the boat itself. Campbell Albatross have low levels of spatio-temporal overlap with fisheries – with males overlapping more than females. More generally, my results indicate that adding data on fine scale interactions will improve fisheries risk assessments, and provide information needed for the conservation and management of the Campbell Albatross. A key development in recent ecological research has been a greater appreciation that inter-individual variation in foraging behaviour can have profound population-level consequences. Accordingly, I tested for individual differences in foraging behaviour in Campbell Albatrosses. The majority of individuals demonstrated both annual and inter-annual individual consistency in foraging locations, and the degree of specialisation was influenced by both sex and year. Consistent terminal latitude and longitude of foraging trips indicated high foraging area fidelity with a degree of flexibility in the fine-scale location. During brooding, females used the Campbell Plateau and showed more consistent behaviours than males, which tended to forage in the Southern Ocean. This adds to a growing body of evidence of individual foraging specialisation among seabirds in general and albatrosses in particular and reveals marked inter-individual differences in vulnerability to threats. In light of the evidence of individual foraging specialisations in the Campbell Albatross, I also performed a literature review of individual foraging specialisations across all seabirds. I found studies examining foraging specialisation for 35 species, with 28 (80%) providing evidence of consistent inter-individual differences (i.e. specialisation).
Current studies suggest that specialisation is influenced by environmental variability and resource predictability; however, with limited data in tropical regions, more studies are needed to test these links. In summary, my thesis has provided new information on Campbell Albatross foraging ecology. Sex-specific variations in behaviour and habitat use may influence conservation and management strategies. I have been able to contextualise the consistent individual differences in foraging distribution described for this species in light of global patterns of individual foraging specialisation in seabirds and highlight future areas of research.
APA, Harvard, Vancouver, ISO, and other styles
48

Ribeiro, Taís Roberta. "Modelagens estatística para dados de sobrevivência bivariados : uma abordagem bayesiana." Universidade Federal de São Carlos, 2017. https://repositorio.ufscar.br/handle/ufscar/9015.

Full text
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Frailty models are used to model the possible associations between survival times. Another alternative developed for modeling the dependence between multivariate data is the use of models based on copula functions. In this work we propose two survival models, derived from the Ali-Mikhail-Haq (AMH) and Frank copulas, to model the dependence of bivariate data in the presence of covariates and censored observations. For inferential purposes, we adopted a Bayesian approach using Markov Chain Monte Carlo (MCMC) methods. Some discussions on model selection criteria are presented. In order to detect influential observations, we use a Bayesian case-deletion influence analysis based on divergence measures. We then show the applicability of the proposed models to simulated and real data sets. We also present a new bivariate survival model with a cure fraction, which takes into account three settings for the latent activation mechanism: random activation, first activation and last activation. We apply this model to a set of Direct Consumer Credit (DCC) loan data and compare, through Bayesian model selection criteria, which of the three settings fits the data best. Finally, we outline our proposal for future research.
Os modelos de fragilidade são utilizados para modelar as possíveis associações entre os tempos de sobrevivência. Uma outra alternativa desenvolvida para modelar a dependência entre dados multivariados é o uso dos modelos baseados em funções cópulas. Neste trabalho propusemos dois modelos de sobrevivência derivados das cópulas de Ali-Mikhail-Haq (AMH) e de Frank para modelar a dependência de dados bivariados na presença de covariáveis e observações censuradas. Para fins inferenciais, realizamos uma abordagem bayesiana usando métodos Monte Carlo em Cadeias de Markov (MCMC). Algumas discussões sobre os critérios de seleção de modelos são apresentadas. Com o objetivo de detectar observações influentes, utilizamos o método bayesiano de análise de influência de deleção de casos baseado na divergência. Por fim, mostramos a aplicabilidade dos modelos propostos a conjuntos de dados simulados e reais. Apresentamos, também, um novo modelo de sobrevivência bivariado com fração de cura, que leva em consideração três configurações para o mecanismo de ativação latente: ativação aleatória, primeira ativação e última ativação. Aplicamos este modelo a um conjunto de dados de empréstimo de Crédito Direto ao Consumidor (DCC) e comparamos os ajustes por meio dos critérios bayesianos de seleção de modelos para verificar qual dos três modelos melhor se ajustou. Por fim, mostramos nossa proposta futura para a continuação da pesquisa.
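For illustration, the Frank copula mentioned above can be evaluated and sampled with the conditional-distribution method; this Python sketch uses an assumed dependence parameter and is not the Bayesian fit carried out in the thesis:
import numpy as np
def frank_cdf(u, v, theta):
    # Frank copula C(u, v) for theta != 0.
    num = (np.exp(-theta * u) - 1) * (np.exp(-theta * v) - 1)
    return -np.log(1 + num / (np.exp(-theta) - 1)) / theta
def frank_sample(n, theta, seed=0):
    # Conditional-distribution method: draw U, then invert C(v | u) at a uniform W.
    rng = np.random.default_rng(seed)
    u = rng.uniform(size=n)
    w = rng.uniform(size=n)
    a = np.exp(-theta * u)
    v = -np.log(1 + w * (np.exp(-theta) - 1) / (a - w * (a - 1))) / theta
    return u, v
u, v = frank_sample(5000, theta=4.0)             # assumed dependence parameter
print(frank_cdf(0.5, 0.5, theta=4.0))            # joint probability at the medians
print(np.corrcoef(u, v)[0, 1].round(2))          # positive association for theta > 0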
APA, Harvard, Vancouver, ISO, and other styles
49

NAKMAI, SIWAT. "'Correlation and portfolio analysis of financial contagion and capital flight'." Doctoral thesis, Università Cattolica del Sacro Cuore, 2018. http://hdl.handle.net/10280/79341.

Full text
Abstract:
This dissertation mainly studies correlation and then portfolio analysis of financial contagion and capital flight, focusing on currency co-movements around the political uncertainty due to the Brexit referendum on 26 June 2016. The correlation, mean, and covariance computations in the analysis are both time-unconditional and time-conditional, and the generalized autoregressive conditional heteroskedasticity (GARCH) and exponentially weighted moving average (EWMA) methods are applied. The correlation analysis in this dissertation (Chapter 1) extends the previous literature on contagion testing based on a single global factor model, bivariate correlation analysis, and heteroskedasticity bias correction. Chapter 1 proposes an alternatively extended framework, assuming that intensification of financial correlations in a state of distress could coincide with rising global-factor-loading variability, provides simple tests to verify the assumptions of the literature and of the extended framework, and considers capital flight other than merely financial contagion. The outcomes show that, compared to the literature, the extended framework can be deemed more verified to the Brexit case. Empirically, with the UK being the shock-originating economy and the sterling value plummeting on the US dollar, there exist contagions to some other major currencies as well as a flight to quality, particularly to the yen, probably suggesting diversification benefits. When the correlation coefficients are time-conditional, or depend more on more recent data, the evidence shows fewer contagions and flights since the political uncertainty in question disappeared gradually over time. After relevant interest rates were partialled out, some previous statistical contagion and flight occurrences became less significant or even insignificant, possibly due to the significant impacts of the interest rates on the corresponding currency correlations. The portfolio analysis in this dissertation (Chapter 2) examines financial contagion and capital flight implied by portfolio reallocations through mean-variance portfolio analysis, and builds on the correlation analysis in Chapter 1. In the correlation analysis, correlations are bivariate, whereas in the portfolio analysis they are multivariate and the risk-return tradeoff is also vitally involved. Portfolio risk minimization and reward-to-risk maximization are the two analytical cases of portfolio optimality taken into consideration. Robust portfolio optimizations, using shrinkage estimations and newly proposed risk-based weight constraints, are also applied. The evidence demonstrates that the portfolio analysis outcomes regarding currency contagions and flights, implying diversification benefits, vary and are noticeably dissimilar from the correlation analysis outcomes of Chapter 1. Subsequently, it could be inferred that the diversification benefits deduced from the portfolio and correlation analyses differ owing to the dominance, during market uncertainty, of the behaviors of the means and (co)variances of all the shock-originating and shock-receiving returns, over the behaviors of just bivariate correlations between the shock-originating and shock-receiving returns. Moreover, corrections of the heteroskedasticity bias inherent in the shock-originating returns, overall, do not have an effect on currency portfolio rebalancing. 
Additionally, hedging demands could be implied from detected structural portfolio reallocations, probably as a result of variance-covariance shocks rising from Brexit.
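As a generic illustration of the mean-variance machinery referred to above (not the dissertation's estimators, shrinkage rules or weight constraints), global minimum-variance and tangency-style weights can be computed from a covariance matrix in a few lines of Python, here with simulated currency returns:
import numpy as np
rng = np.random.default_rng(5)
# Hypothetical daily log-returns of 5 currencies against the USD.
cov_true = 1e-4 * (0.7 * np.eye(5) + 0.3 * np.ones((5, 5)))
returns = rng.multivariate_normal(mean=np.full(5, 2e-4), cov=cov_true, size=500)
sigma = np.cov(returns, rowvar=False)
ones = np.ones(sigma.shape[0])
w_minvar = np.linalg.solve(sigma, ones)
w_minvar /= w_minvar.sum()                 # global minimum-variance weights
mu = returns.mean(axis=0)
w_tangency = np.linalg.solve(sigma, mu)    # reward-to-risk (tangency) direction
w_tangency /= w_tangency.sum()
print(np.round(w_minvar, 3), np.round(w_tangency, 3))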
APA, Harvard, Vancouver, ISO, and other styles
50

Gismalla, Yousif Ebtihal. "Performance analysis of spectrum sensing techniques for cognitive radio systems." Thesis, University of Manchester, 2013. https://www.research.manchester.ac.uk/portal/en/theses/performance-analysis-of-spectrum-sensing-techniques-for-cognitive-radio-systems(157fe1af-717c-4705-a649-d809766cf5cb).html.

Full text
Abstract:
Cognitive radio is a technology that aims to maximize the current usage of the licensed frequency spectrum. Cognitive radio aims to provide services for license-exempt users by making use of dynamic spectrum access (DSA) and opportunistic spectrum sharing strategies (OSS). Cognitive radios are defined as intelligent wireless devices capable of adapting their communication parameters in order to operate within underutilized bands while avoiding causing interference to licensed users. An underused band of frequencies in a specific location or time is known as a spectrum hole. Therefore, in order to locate spectrum holes, reliable spectrum sensing algorithms are crucial to facilitate the evolution of cognitive radio networks. Since a large and growing body of literature has mainly focused on the conventional time domain (TD) energy detector, throughout this thesis the problem of spectrum sensing is investigated within the context of a frequency domain (FD) approach. The purpose of this study is to investigate detection based on methods of nonparametric power spectrum estimation. The considered methods are the periodogram, Bartlett's method, Welch overlapped segments averaging (WOSA) and the Multitaper estimator (MTE). Another major motivation is that the MTE is strongly recommended for the application of cognitive radios. This study aims to derive the detector performance measures for each case. Another aim is to investigate and highlight the main differences between the TD and the FD approaches. The performance is addressed for independent and identically distributed (i.i.d.) Rayleigh channels and the general Rician and Nakagami fading channels. For each of the investigated detectors, the analytical models are obtained by studying the characteristics of the Hermitian quadratic form representation of the decision statistic, and the matrix of the Hermitian form is identified. The results of the study have revealed the high accuracy of the derived mathematical models. Moreover, it is found that the TD detector differs from the FD detector in a number of aspects. One principal and generalized conclusion is that all the investigated FD methods provide a reduced probability of false alarm when compared with the TD detector. Also, for the case of the periodogram, the probability of sensing errors is independent of the length of observations, whereas in the time domain the probability of false alarm is increased when the sample size increases. The probability of false alarm is further reduced when diversity reception is employed. Furthermore, compared to the periodogram, both Bartlett's method and Welch's method provide better performance in terms of lower probability of false alarm but an increased probability of detection for a given probability of false alarm. Also, the performance of both Bartlett's method and WOSA is sensitive to the number of segments, whereas WOSA is also sensitive to the overlapping factor. Finally, the performance of the MTE is dependent on the number of employed discrete prolate spheroidal (Slepian) sequences, and the MTE outperforms the periodogram, Bartlett's method and WOSA, as it provides the minimal probability of false alarm.
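As a toy illustration of frequency-domain detection with a WOSA (Welch) spectrum estimate (Python with SciPy; the signal, band of interest and threshold are assumptions for illustration, not the analytical thresholds derived in the thesis):
import numpy as np
from scipy.signal import welch
rng = np.random.default_rng(9)
fs = 1e6                                        # sample rate (assumed)
t = np.arange(20_000) / fs
primary = 0.3 * np.cos(2 * np.pi * 120e3 * t)   # weak licensed signal at 120 kHz (assumed)
received = primary + rng.normal(scale=1.0, size=t.size)
f, psd = welch(received, fs=fs, nperseg=1024)   # WOSA estimate of the power spectral density
band = (f > 110e3) & (f < 130e3)                # band of interest (assumed)
test_statistic = np.sum(psd[band])
threshold = 1.1 * np.median(psd) * band.sum()   # crude noise-floor based threshold
print("occupied" if test_statistic > threshold else "vacant")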
APA, Harvard, Vancouver, ISO, and other styles
We offer discounts on all premium plans for authors whose works are included in thematic literature selections. Contact us to get a unique promo code!

To the bibliography