Dissertations / Theses on the topic 'Cox Proportional Hazard Regression Model'

Consult the top 46 dissertations / theses for your research on the topic 'Cox Proportional Hazard Regression Model.'

1

Crumer, Angela Maria. "Comparison between Weibull and Cox proportional hazards models." Kansas State University, 2011. http://hdl.handle.net/2097/8787.

Full text
Abstract:
Master of Science
Department of Statistics
James J. Higgins
The time for an event to take place in an individual is called a survival time. Examples include the time that an individual survives after being diagnosed with a terminal illness or the time that an electronic component functions before failing. A popular parametric model for this type of data is the Weibull model, a flexible model that allows for the inclusion of covariates of the survival times. If distributional assumptions are not met or cannot be verified, researchers may turn to the semi-parametric Cox proportional hazards model. This model also allows for the inclusion of covariates of survival times but with less restrictive assumptions. This report compares estimates of the slope of the covariate in the proportional hazards model obtained from the parametric Weibull model and from the semi-parametric Cox proportional hazards model. Properties of these models are discussed in Chapter 1. Numerical examples and a comparison of the mean square errors of the estimates of the slope of the covariate for various sample sizes and for uncensored and censored data are discussed in Chapter 2. When the shape parameter is known, the Weibull model far outperforms the Cox proportional hazards model, but when the shape parameter is unknown, the Cox proportional hazards model and the Weibull model give comparable results.
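A minimal illustrative sketch of the comparison described above (not the thesis code): survival times with one covariate are simulated from a Weibull proportional-hazards model, and the slope is then estimated both by direct Weibull maximum likelihood and by the semi-parametric Cox model. The simulation settings and the use of the Python packages lifelines and scipy are assumptions made purely for illustration.

```python
# Sketch: compare the covariate-slope estimate from a parametric Weibull
# proportional-hazards fit with the semi-parametric Cox fit on simulated data.
import numpy as np
import pandas as pd
from scipy.optimize import minimize
from lifelines import CoxPHFitter

rng = np.random.default_rng(42)
n, beta_true, shape, scale = 500, 0.7, 1.5, 1.0
x = rng.normal(size=n)
# Weibull PH: H(t|x) = scale * t**shape * exp(beta*x); invert the survival function
t = (rng.exponential(size=n) / (scale * np.exp(beta_true * x))) ** (1.0 / shape)
c = rng.exponential(scale=2 * np.median(t), size=n)        # independent censoring times
obs, event = np.minimum(t, c), (t <= c).astype(int)

def neg_loglik(par):                                       # Weibull PH log-likelihood
    log_shape, log_scale, beta = par
    k, lam = np.exp(log_shape), np.exp(log_scale)
    log_h = np.log(k) + np.log(lam) + (k - 1) * np.log(obs) + beta * x
    H = lam * obs ** k * np.exp(beta * x)
    return -(np.sum(event * log_h) - np.sum(H))

weibull_fit = minimize(neg_loglik, x0=np.zeros(3), method="Nelder-Mead")
cox = CoxPHFitter().fit(pd.DataFrame({"T": obs, "E": event, "x": x}),
                        duration_col="T", event_col="E")
print("Weibull PH slope:", weibull_fit.x[2])
print("Cox PH slope:    ", cox.params_["x"])
```

When the shape parameter is estimated from the data, the two slope estimates are typically close, consistent with the report's conclusion for the unknown-shape case.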
APA, Harvard, Vancouver, ISO, and other styles
2

Sasieni, Peter D. "Beyond the Cox model : extensions of the model and alternative estimators /." Thesis, Connect to this title online; UW restricted, 1989. http://hdl.handle.net/1773/9556.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Lindberg, Erik. "A study of the effect of inbreeding in Skellefteå during the 19th century : Using Cox Proportional hazard model to analyze lifespans and Poisson/Negative Binomial regression to analyze fertility." Thesis, Umeå universitet, Statistik, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-122687.

Full text
Abstract:
Inbreeding is defined as when two individuals who are related mate and produce offspring. The level of inbreeding of an individual can be determined by calculating an inbreeding coefficient. Inbreeding can enhance both positive and negative traits; the risk of recessive diseases also increases. Data from old church records from the region of Skellefteå, covering individuals from the late 17th century to the early 20th century, have been made available. From these data, parent-child relations can be observed and levels of inbreeding calculated. By analyzing the available data using a Cox proportional hazards regression model, it was shown that the level of inbreeding affected the lifespan of an individual negatively if the parents were second cousins or more closely related. Using Poisson and negative binomial regression, no evidence of an effect of inbreeding on fertility could be found.
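A compact sketch of the two analyses described above, on simulated data with illustrative variable names (an inbreeding coefficient f, lifespan, and number of children); it assumes the Python packages lifelines and statsmodels and is not the thesis code.

```python
# Sketch: Cox regression of lifespan on the inbreeding coefficient, and
# Poisson / negative binomial regression of fertility on the same covariate.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 1000
f = rng.choice([0.0, 0.0156, 0.0625], size=n, p=[0.90, 0.07, 0.03])  # inbreeding coefficient
lifespan = rng.exponential(scale=60 * np.exp(-2.0 * f))              # higher f -> shorter life
dead = (rng.random(n) < 0.8).astype(int)                             # simplified: ~20% censored
children = rng.poisson(lam=np.exp(1.3 - 0.5 * f))                    # fertility outcome

df = pd.DataFrame({"lifespan": lifespan, "dead": dead, "f": f})
cox = CoxPHFitter().fit(df, duration_col="lifespan", event_col="dead")
print(cox.summary[["coef", "exp(coef)", "p"]])

X = sm.add_constant(f)
print(sm.GLM(children, X, family=sm.families.Poisson()).fit().params)
print(sm.GLM(children, X, family=sm.families.NegativeBinomial()).fit().params)
```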
APA, Harvard, Vancouver, ISO, and other styles
4

Calsavara, Vinícius Fernando. "Estimação de efeitos variantes no tempo em modelos tipo Cox via bases de Fourier e ondaletas Haar." Universidade de São Paulo, 2015. http://www.teses.usp.br/teses/disponiveis/45/45133/tde-26082015-140547/.

Full text
Abstract:
The semiparametric Cox model is often considered when modeling survival data. It is very flexible, allowing for the evaluation of covariate effects on the hazard rate. One of its main advantages is the ease of interpretation, since the hazard ratio of two individuals does not vary over time. However, this proportionality of the hazards for a given covariate may not hold in some practical situations and, in this case, an approach not relying on such an assumption is needed. In this thesis we propose a Cox-type model that allows for time-varying covariate effects, in which the covariate effect and the baseline hazard function are represented via Fourier bases and classical and warped Haar wavelets. We also propose a procedure for predicting the survival function of a specific patient. Simulation studies and applications to real data sets suggest that our method may be a valuable tool in practical situations where the covariate effect is time-dependent. Through these studies, we compare the two proposed approaches with each other and with another approach already known in the literature, and we verify satisfactory results.
APA, Harvard, Vancouver, ISO, and other styles
5

Thapa, Ram. "Modeling Mortality of Loblolly Pine Plantations." Diss., Virginia Tech, 2014. http://hdl.handle.net/10919/46726.

Full text
Abstract:
Accurate prediction of mortality is an important component of forest growth and yield prediction systems, yet mortality remains one of the least understood components of the system. Whole-stand and individual-tree mortality models were developed for loblolly pine plantations throughout its geographic range in the United States. The models for predicting stand mortality were developed using stand characteristics and biophysical variables, and were constructed using two modeling approaches. In the first approach, mortality functions for directly predicting the reduction in tree number were developed using the algebraic difference equation method. In the second approach, a two-step modeling strategy was used in which a model predicting the probability of tree death occurring over a period was developed in the first step and a function estimating the reduction in tree number was developed in the second step. Individual-tree mortality models were developed using multilevel logistic regression and survival analysis techniques. The multilevel data structure inherent in permanent sample plot data, i.e., measurement occasions (repeated measurements) nested within trees and trees nested within plots, is often ignored when modeling tree mortality in forestry applications. Multilevel mixed-effects logistic regression takes this full hierarchical structure of the data into account. Multilevel mixed-effects models gave better predictions than the fixed-effects model, and the model fits and predictions were further improved by taking into account the full hierarchical structure of the data. Semiparametric proportional hazards regression was also used to develop a model for individual-tree mortality. A shared frailty model, the mixed-model extension of the Cox proportional hazards model, was used to account for unobserved heterogeneity not explained by the observed covariates in the Cox model.
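The dissertation's individual-tree models (multilevel mixed-effects logistic regression and a shared frailty Cox model) are more elaborate than what fits in a short sketch; the fragment below shows only simplified stand-ins on simulated data: a plain fixed-effects logistic fit for tree mortality and a Cox fit stratified by plot, which absorbs plot-level heterogeneity through separate baseline hazards rather than a frailty term. Variable names and the statsmodels/lifelines packages are assumptions for illustration.

```python
# Sketch: fixed-effects logistic mortality model and a plot-stratified Cox model.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)
n_plots, trees_per_plot = 40, 25
plot = np.repeat(np.arange(n_plots), trees_per_plot)
plot_effect = rng.normal(0.0, 0.5, n_plots)[plot]          # unobserved plot heterogeneity
dbh = rng.normal(20, 5, plot.size)                         # tree diameter (cm)
lin = -2.0 - 0.08 * (dbh - 20) + plot_effect
died = (rng.random(plot.size) < 1 / (1 + np.exp(-lin))).astype(int)

# Plain logistic model of mortality probability (ignores the clustering)
logit = sm.Logit(died, sm.add_constant(dbh)).fit(disp=0)
print(logit.params)

# Plot-stratified Cox model on simulated times to death
time_to_death = rng.exponential(scale=30 * np.exp(0.05 * (dbh - 20) - plot_effect))
df = pd.DataFrame({"T": time_to_death, "E": 1, "dbh": dbh, "plot": plot})
cox = CoxPHFitter().fit(df, duration_col="T", event_col="E", strata=["plot"])
print(cox.summary[["coef", "p"]])
```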
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
6

Sauls, Beverly J. "Relative Survival of Gags Mycteroperca microlepis Released Within a Recreational Hook-and-Line Fishery: Application of the Cox Regression Model to Control for Heterogeneity in a Large-Scale Mark-Recapture Study." Scholar Commons, 2013. http://scholarcommons.usf.edu/etd/4940.

Full text
Abstract:
The objectives of this study were to measure injuries and impairments directly observed from gags Mycteroperca microlepis caught and released within a large-scale recreational fishery, develop methods that may be used to rapidly assess the condition of reef fish discards, and estimate the total portion of discards in the fishery that suffer latent mortality. Fishery observers were placed on for-hire charter and headboat vessels operating in the Gulf of Mexico from June 2009 through December 2012 to directly observe reef fishes as they were caught by recreational anglers fishing with hook-and-line gear. Fish that were not retained by anglers were inspected and marked with conventional tags prior to release. Fish were released in multiple regions over a large geographic area throughout the year and over multiple years. The majority of recaptured fish were reported by recreational and commercial fishers, and fishing effort fluctuated both spatially and temporally over the course of this study in response to changes in recreational harvest restrictions and the Deepwater Horizon oil spill. Therefore, it could not be assumed that encounter probabilities were equal for all individual tagged fish in the population. Fish size and capture depth when fish were initially caught-and-released also varied among individuals in the study and potentially influenced recapture reporting probabilities. The Cox proportional hazards regression model was used to control for potential covariates on both the occurrence and timing of recapture reporting events so that relative survival among fish released in various conditions could be compared. A total of 3,954 gags were observed in this study, and the majority (77.26%) were released in good condition (condition category 1), defined as fish that immediately submerged without assistance from venting and had not suffered internal injuries from embedded hooks or visible damage to the gills. However, compared to gags caught in shallower depths, a greater proportion of gags caught and released from depths deeper than 30 meters were in fair or poor condition. Relative survival was significantly reduced (alpha ≤ 0.05) for gags released in fair and poor condition after controlling for variable mark-recapture reporting rates for different-sized discards among regions and across months and years when individual fish were initially captured, tagged and released. Gags released within the recreational fishery in fair and poor condition were 66.4% (95% C.I. 46.9 to 94.0%) and 50.6% (26.2 to 97.8%) as likely to be recaptured, respectively, as gags released in good condition. Overall discard mortality was calculated for gags released in all condition categories at ten-meter depth intervals. There was a significant linear increase in estimated mortality from less than 15% (range of uncertainty, 0.1-25.2%) in shallow depths up to 30 meters, to 35.6% (5.6-55.7%) at depths greater than 70 meters (p < 0.001, R² = 0.917). This analysis demonstrated the utility of the proportional hazards regression model for controlling for potential covariates on both the occurrence and timing of recapture events in a large-scale mark-recapture study and for detecting significant differences in the relative survival of fish released in various conditions measured under highly variable conditions within a large-scale fishery.
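A minimal sketch of the core modelling idea, with made-up data and illustrative variable names (release condition and fish size); it assumes the Python lifelines package and is not the study's analysis, which additionally controlled for region, month and year of release.

```python
# Sketch: Cox model comparing relative recapture rates of fish released in
# 'fair' or 'poor' condition against the 'good' baseline, adjusting for size.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 2000
condition = rng.choice(["good", "fair", "poor"], size=n, p=[0.77, 0.15, 0.08])
size_cm = rng.normal(55, 10, n)
rate = 0.002 * np.exp(0.01 * (size_cm - 55)) * np.where(
    condition == "good", 1.0, np.where(condition == "fair", 0.66, 0.50))
days_to_recapture = rng.exponential(1 / rate)              # lower rate = fewer recaptures
observed = (days_to_recapture < 365).astype(int)           # reports within one year
days = np.minimum(days_to_recapture, 365)

df = pd.DataFrame({"T": days, "E": observed, "size_cm": size_cm,
                   "fair": (condition == "fair").astype(float),
                   "poor": (condition == "poor").astype(float)})
cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
print(cph.summary[["exp(coef)", "p"]])   # exp(coef) of fair/poor: recapture rate vs. 'good'
```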
APA, Harvard, Vancouver, ISO, and other styles
7

Gwaze, Arnold Rumosa. "A cox proportional hazard model for mid-point imputed interval censored data." Thesis, University of Fort Hare, 2011. http://hdl.handle.net/10353/385.

Full text
Abstract:
There has been an increasing interest in survival analysis with interval-censored data, where the event of interest (such as infection with a disease) is not observed exactly but is only known to happen between two examination times. However, because most research has focused on right-censored data, many statistical tests and techniques are available for right-censoring, whereas methods for interval-censored data are not as abundant. In this study, right-censoring methods are used to fit a proportional hazards model to interval-censored data. The interval-censored observations were transformed using a method called mid-point imputation, which assumes that an event occurs at the midpoint of its recorded interval. The results gave conservative regression estimates, but a comparison with the conventional methods showed that the estimates were not significantly different. However, the censoring mechanism and interval lengths should be given serious consideration before deciding to use mid-point imputation on interval-censored data.
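A short sketch of mid-point imputation followed by an ordinary right-censoring Cox fit, on simulated periodic-examination data; the visit schedule, variable names and use of the lifelines package are illustrative assumptions, not the thesis setup.

```python
# Sketch: impute interval-censored event times at the interval midpoint,
# treat never-observed events as right-censored, then fit a standard Cox model.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 200
x = rng.normal(size=n)
t_true = rng.exponential(scale=5 * np.exp(-0.6 * x))       # true (unobserved) event times
visit_gap = 2.0                                            # examinations every 2 time units
left = np.floor(t_true / visit_gap) * visit_gap            # last visit before the event
right = left + visit_gap                                   # first visit after the event
lost = rng.random(n) < 0.25                                # some subjects never seen to fail
right = np.where(lost, np.inf, right)

censored = np.isinf(right)
time = np.where(censored, left, (left + right) / 2.0)      # mid-point imputation
df = pd.DataFrame({"time": time, "event": (~censored).astype(int), "x": x})
df = df[df["time"] > 0]                                    # Cox needs strictly positive times
cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
print(cph.summary[["coef", "p"]])
```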
APA, Harvard, Vancouver, ISO, and other styles
8

Sandström, Caroline, and Karl Norling. "Female longevity : A survival analysis on 19th century women using the Cox Proportional Hazard model." Thesis, Umeå universitet, Statistiska institutionen, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-49700.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Minya, Kristoffer. "Överlevnadsanalys i tjänsteverksamhet : Tidspåverkan i överklagandeprocessen på Migrationsverket." Thesis, Linköpings universitet, Statistik, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-110428.

Full text
Abstract:
The Swedish Migration Board is an agency that reviews applications from individuals who wish to seek shelter, obtain citizenship, study or work in Sweden. In recent years there has been a large increase in such applications, and the time taken to reach a decision has increased. Each type of application (such as citizenship) is a process consisting of several stages, and how a decision moves through these steps is called the flow. The Swedish Migration Board would therefore like to increase its flow efficiency. When the decision is made and the person has been notified of it but is not satisfied, he or she can appeal. This is one of the most complex processes at the Board. The aim is to analyze how long this process takes and which steps in the process affect that time. One step (which was later found to have a significant effect on time) is opinions: the court requests information on what the person who is appealing has to say about why he or she is appealing. To analyze this, two methods were relevant: accelerated failure time (AFT) models and multi-state models (MSM). The former can predict the time to an event, while the latter can analyze how the time is affected in the different steps of the flow. Opinions early in the process are crucial to how quickly an appeal receives a judgment, while the number of opinions increases the time enormously. There are other factors that affect the time, but not as much as opinions. Flow efficiency can therefore be increased by taking the time to write an informative opinion, so that the court does not need to ask for further opinions.
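A minimal sketch of the AFT side of the analysis only (the multi-state part is not shown), on simulated data with illustrative covariates such as whether an opinion arrived early and how many opinions were requested; it assumes the Python lifelines package and is not the agency's data or model.

```python
# Sketch: Weibull accelerated failure time model for appeal processing time,
# plus a median-time prediction for a hypothetical new case.
import numpy as np
import pandas as pd
from lifelines import WeibullAFTFitter

rng = np.random.default_rng(11)
n = 800
early_opinion = rng.integers(0, 2, n)                 # opinion submitted early in the process?
n_opinions = rng.poisson(1.2, n)                      # number of opinions requested
days = rng.weibull(1.5, n) * 120 * np.exp(-0.3 * early_opinion + 0.25 * n_opinions)
closed = (rng.random(n) < 0.9).astype(int)            # a few cases still open (censored)
days_obs = np.where(closed == 1, days, days * rng.random(n))   # open cases seen only part-way

df = pd.DataFrame({"days": days_obs, "closed": closed,
                   "early_opinion": early_opinion, "n_opinions": n_opinions})
aft = WeibullAFTFitter().fit(df, duration_col="days", event_col="closed")
print(aft.summary.loc["lambda_", ["coef", "exp(coef)", "p"]])

new_case = pd.DataFrame({"early_opinion": [1], "n_opinions": [1]})
print(aft.predict_median(new_case))                   # predicted median processing time
```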
APA, Harvard, Vancouver, ISO, and other styles
10

Sposito, Ítalo Beltrão. "Continuidade e mudança na política externa dos estados latino-americanos (1945-2008)." Universidade de São Paulo, 2016. http://www.teses.usp.br/teses/disponiveis/101/101131/tde-28032016-141512/.

Full text
Abstract:
The object of this thesis is foreign policy restructuring (FPR), conceptualized as the most radical, comprehensive, and rapid changes in foreign policy. To analyze this phenomenon, the main conjunctural conditions that might increase the chances of this event occurring are sought. These conditions are related to the Policy Window concept, which represents a period during which political inertia is disrupted and decision makers have the circumstances to undertake an FPR process. Objective: to find and outline the conjunctural conditions and variables that increase the chances of occurrence of an FPR. Methods: qualitative and quantitative methodological tools are used. In the second chapter, a survival model (Cox Proportional Hazard Model) analyses the effect of variables related to the Policy Window concept on the risk of an FPR, defined as the most extreme changes of behavior in United Nations General Assembly roll-call votes. In the third chapter, a historical qualitative analysis is undertaken, focusing exclusively on the most radical cases of FPR, to develop explanatory typologies and identify causal conjunctures and common patterns that lead to the outcome. Results: we identified that regime and political leader changes, in the national context, and military interventions by foreign powers increase the risk of FPR occurrence; additionally, high political polarization combined with regime change, domestic political crisis with the involvement of international actors, processes of international isolation with the enforcement of economic sanctions, and economic crises in which political actors question the current economic model may combine, configuring causal paths to an FPR. Conclusion: despite the importance of the main political actors' interest in implementing an FPR process, we identified that specific conjunctures and events raise the risks of a positive outcome.
APA, Harvard, Vancouver, ISO, and other styles
11

Belcon, Michael C. "Determinants and Disparities of Survival in Triple-Negative Breast Cancer Patients: A Population-Based Retrospective Longitudinal Cohort Design Utilizing the Cox Proportional Hazard Analytical Model." FIU Digital Commons, 2015. http://digitalcommons.fiu.edu/etd/2311.

Full text
Abstract:
A significant racial disparity in breast cancer mortality exists among women in the United States. Triple-negative breast cancer (TNBC) is a breast cancer phenotype that may explain, in part, this disparity between white and African American women. The objective of this study was to determine the predictors of survival in TNBC and non-triple-negative breast cancer (NTNBC) patients. Data on 168,756 female patients with a diagnosis of invasive breast cancer in the Surveillance Epidemiology and End Results (SEER) program were stratified based on breast cancer receptor phenotypes in this retrospective longitudinal cohort study design. Multiple logistic regressions were used to explore predictors of treatment, showing that not receiving surgery as standard treatment was associated (odds ratio; 95% CI) with TNBC (OR 1.151; 1.042, 1.177), being uninsured (OR 3.552; 3.206, 3.937) and being African American (OR 1.804; 1.702, 1.912), while not receiving radiation was associated with TNBC (OR 1.151; 1.113, 1.190) and being uninsured (OR 1.318; 1.217, 1.429). Cox proportional hazards models were used, regressing survival time, the outcome measure, on age, race, ethnicity, marital status, health insurance status, histological tumor grade, and treatment status. The analysis revealed that the mean survival time is lower for TNBC [15.60 (± 10.29) months] compared with NTNBC [16.01 (± 10.18) months] (p < 0.0001), a difference that, though small, is statistically significant. The independent determinants of survival in TNBC were: young age at diagnosis [β = 0.033, HR 1.033 (1.026, 1.041)]; being African American [β = 0.182, HR 1.200 (1.117, 1.289)]; being married [β = -0.362, HR 0.697 (0.658, 0.737)]; higher tumor histological grade [β = 1.034, HR 2.812 (2.159, 3.661)]; being uninsured [β = 0.541, HR 1.717 (1.481, 1.992)]; no surgery [β = 2.156, HR 8.633 (8.152, 9.143)]; and no radiation treatment [β = 0.489, HR 1.630 (1.535, 1.730)]. African American race, uninsured status, higher grade at diagnosis, and inadequate treatment are independent predictors of poor survival among breast cancer patients; importantly, TNBC patients had lower survival than NTNBC patients. A higher proportion of TNBC patients were diagnosed at a younger age, had a higher tumor grade, and were African American. The survival disparity in African American patients may be partially explained by the disproportionately higher number of TNBC cases among them, as well as by rates of not receiving standard treatments.
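A small sketch of the treatment-predictor step described above (logistic regression reported as odds ratios with 95% confidence intervals), on simulated data with illustrative variable names; it assumes the Python statsmodels package and is not the SEER analysis itself.

```python
# Sketch: logistic regression for the odds of not receiving surgery,
# reported on the odds-ratio scale with 95% confidence intervals.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(13)
n = 5000
tnbc = rng.integers(0, 2, n)
uninsured = (rng.random(n) < 0.08).astype(int)
african_american = (rng.random(n) < 0.12).astype(int)
lin = -2.5 + 0.14 * tnbc + 1.27 * uninsured + 0.59 * african_american
no_surgery = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(int)

X = sm.add_constant(pd.DataFrame({"tnbc": tnbc, "uninsured": uninsured,
                                  "african_american": african_american}))
fit = sm.Logit(no_surgery, X).fit(disp=0)
odds_ratios = np.exp(fit.params).rename("OR")
ci = np.exp(fit.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})
print(pd.concat([odds_ratios, ci], axis=1))
```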
APA, Harvard, Vancouver, ISO, and other styles
12

Yuan, Xingchen. "Survival Model and Estimation for Lung Cancer Patients." Digital Commons @ East Tennessee State University, 2005. https://dc.etsu.edu/etd/1002.

Full text
Abstract:
Lung cancer is the most frequent fatal cancer in the United States. Following the notion in actuarial math analysis, we assume an exponential form for the baseline hazard function and combine it with Cox proportional hazards regression for the survival study of a group of lung cancer patients. The covariate coefficients in the hazard function are estimated by maximum likelihood estimation following the proportional hazards regression analysis. Although the proportional hazards model does not give an explicit baseline hazard function, the baseline hazard function can be estimated by fitting the data with a non-linear least squares technique. The survival model is then examined by a neural network simulation. The neural network learns the survival pattern from available hospital data and gives survival predictions for random covariate combinations. The simulation results support the covariate estimation in the survival model.
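One standard way to fit a proportional hazards model with an exponential (constant) baseline hazard, as assumed above, is to exploit its equivalence with a Poisson generalized linear model that uses the log follow-up time as an offset. The sketch below shows that route on simulated data; the covariates, the equivalence trick and the statsmodels package are illustrative assumptions, not necessarily the thesis's estimation procedure.

```python
# Sketch: exponential-baseline PH model h(t|x) = lambda * exp(b'x) fitted as a
# Poisson GLM on the event indicator with log(follow-up time) as an offset.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(17)
n = 1000
age = rng.normal(65, 10, n)
stage = rng.integers(1, 5, n)
lam = 0.02 * np.exp(0.03 * (age - 65) + 0.4 * (stage - 1))   # true hazard per month
t_event = rng.exponential(1 / lam)
t_obs = np.minimum(t_event, 60.0)                            # administrative censoring at 60 months
event = (t_event <= 60.0).astype(int)

X = sm.add_constant(np.column_stack([age - 65, stage - 1]))
fit = sm.GLM(event, X, family=sm.families.Poisson(), offset=np.log(t_obs)).fit()
print(fit.params)    # intercept ~ log(0.02); slopes ~ 0.03 and 0.4
```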
APA, Harvard, Vancouver, ISO, and other styles
13

Cong, Chunling. "Statistical Analysis and Modeling of Breast Cancer and Lung Cancer." Scholar Commons, 2010. http://scholarcommons.usf.edu/etd/3563.

Full text
Abstract:
The objective of the present study is to investigate various problems associated with breast cancer and lung cancer patients. In this study, we compare the effectiveness of breast cancer treatments using decision tree analysis and conclude that, although a certain treatment shows overall effectiveness over the others, physicians should use discretion in giving different treatments to breast cancer patients based on their characteristics. Recurrence times of breast cancer patients who receive different treatments are compared in an overall sense; histology type is also taken into consideration. To further understand the relation between relapse time and other variables, statistical models are applied to identify the attribute variables and predict the relapse time. Of equal importance, the transitions between different breast cancer stages are analyzed through a Markov chain, which not only gives the transition probabilities between stages for a specific treatment but also provides guidance on breast cancer treatment based on staging information. A sensitivity analysis is conducted on breast cancer doubling time, which involves two commonly used assumptions, a spherical tumor and exponential tumor growth; the analysis reveals that deviations from those assumptions could cause very different statistical behavior of the doubling time. In the lung cancer study, we investigate the mortality time of lung cancer patients from several different perspectives: gender, cigarettes per day and duration of smoking. A statistical model is also used to predict the mortality time of lung cancer patients.
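A tiny sketch of the stage-transition piece: with made-up transition counts, the maximum-likelihood estimate of a Markov chain's transition matrix is just the row-normalised count matrix, which can then be powered to project a patient's stage distribution several periods ahead. The counts below are invented for illustration.

```python
# Sketch: estimate a stage-to-stage transition matrix and project it forward.
import numpy as np

stages = ["I", "II", "III", "IV"]
# counts[i, j] = patients observed moving from stage i to stage j between check-ups
counts = np.array([[120, 30, 8, 2],
                   [0, 95, 25, 10],
                   [0, 0, 60, 28],
                   [0, 0, 0, 40]], dtype=float)

transition = counts / counts.sum(axis=1, keepdims=True)          # MLE: row-normalised counts
print(np.round(transition, 3))

p0 = np.array([1.0, 0.0, 0.0, 0.0])                              # patient currently in stage I
print(np.round(p0 @ np.linalg.matrix_power(transition, 2), 3))   # distribution two steps ahead
```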
APA, Harvard, Vancouver, ISO, and other styles
14

Tran, Xuan Quang. "Les modèles de régression dynamique et leurs applications en analyse de survie et fiabilité." Thesis, Bordeaux, 2014. http://www.theses.fr/2014BORD0147/document.

Full text
Abstract:
This thesis was designed to explore dynamic regression models, assessing statistical inference for survival and reliability data analysis. The dynamic regression models considered include the parametric proportional hazards and accelerated failure time models with possibly time-dependent covariates. We discuss the following problems in this thesis. At first, we present a generalized chi-squared test statistic Y2n that is convenient for survival and reliability data analysis in the presence of three cases: complete data, right-censored data, and right-censored data with covariates. We describe in detail the theory and the mechanism for using the Y2n test statistic in survival and reliability data analysis. Next, we consider flexible parametric models, evaluating their statistical significance by using the Y2n and log-likelihood test statistics. These parametric models include an accelerated failure time (AFT) model and a proportional hazards (PH) model based on the Hypertabastic distribution. These two models are proposed to investigate the distribution of the survival and reliability data in comparison with some other parametric models. Simulation studies were designed to demonstrate the asymptotic normality of the maximum likelihood estimators of the Hypertabastic distribution's parameters, and to validate the asymptotic properties of the Y2n test statistic for the Hypertabastic distribution when the right-censoring probability equals 0% and 20%. In the last chapter, we apply these two parametric models to three real-life data sets. The first is the data set given by Freireich et al. on the comparison of two treatment groups, with additional information about the log white blood cell count, to test the ability of a therapy to prolong the remission times of acute leukemia patients; it showed that the Hypertabastic AFT model is an accurate model for this data set. The second is the brain tumour study with malignant glioma patients given by Sauerbrei & Schumacher; it showed that the best model is the Hypertabastic PH model with five significant covariates added. The third application is the data set given by Semenova & Bitukov on the survival times of multiple myeloma patients. We did not propose an exact model for this data set, because of an existing intersection of the survival times; we therefore suggest fitting another dynamic model, such as the Simple Cross-Effect model, to this data set.
APA, Harvard, Vancouver, ISO, and other styles
15

Turk, Ana. "Warranty claims analysis for household appliances produced by ASKO Appliances AB." Thesis, Linköpings universitet, Statistik, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-96955.

Full text
Abstract:
The input collected from warranty claims data links customer feedback with product quality. Results from warranty claim analysis can potentially improve product quality and customer relationships and positively affect business. However, working with warranty claims data holds many challenges that require a significant share of time to be devoted to data cleaning and data processing. The purpose of warranty claims analysis is to get a comprehensive overview of the reliability, costs and quality of household appliances produced by ASKO. While there are different ways to approach this problem, we focus on non-parametric and semi-parametric methods, using Kaplan-Meier estimators and the Cox proportional hazards model, respectively. These kinds of models are time dependent and are therefore used for prediction of household appliance reliability. Even though non-parametric models are quite informative, they cannot handle additional characteristics of the observed product; hence the semi-parametric Cox proportional hazards model was proposed. Apart from the reliability analysis, we also predict warranty costs with a probit model and examine inequality in part failures of household appliances as part of the quality control analysis. The described methods were selected because the warranty claims analysis will be carried out in the future by ASKO's quality department, and therefore straightforward methods with very informative results are needed.
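A minimal sketch of the two reliability tools named above, on simulated claims with an invented product attribute; it assumes the Python lifelines package and is not ASKO's data or code.

```python
# Sketch: Kaplan-Meier estimate of time from sale to first warranty claim,
# and a Cox model adding a product attribute as a covariate.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(19)
n = 3000
heavy_use_model = rng.integers(0, 2, n)                               # hypothetical product attribute
t_claim = rng.weibull(1.2, n) * 40 * np.exp(-0.3 * heavy_use_model)   # months to first claim
months_observed = rng.uniform(6, 24, n)                               # appliances sold at different dates
time = np.minimum(t_claim, months_observed)
claimed = (t_claim <= months_observed).astype(int)

kmf = KaplanMeierFitter().fit(time, event_observed=claimed)
print(kmf.survival_function_.tail())                                  # share still claim-free over time

df = pd.DataFrame({"T": time, "E": claimed, "heavy_use_model": heavy_use_model})
cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
print(cph.summary[["exp(coef)", "p"]])
```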
APA, Harvard, Vancouver, ISO, and other styles
16

Frazão, Italo Marcus da Mota. "Modelos com sobreviventes de longa duração paramétricos e semi-paramétricos aplicados a um ensaio clínico aleatorizado." Universidade de São Paulo, 2012. http://www.teses.usp.br/teses/disponiveis/11/11134/tde-13032013-093628/.

Full text
Abstract:
Several models have been proposed in the literature with the aim of analyzing survival data when the population under study is assumed to be a mixture of individuals susceptible (at risk) and not susceptible to a specific event of interest. Such models are usually called long-term survivors models or cure rate models. In this work, several of these models (under both parametric and semi-parametric approaches) were considered to analyze the data from a randomized clinical trial conducted to compare three therapeutic strategies (surgery, angioplasty and medical treatment) used in the treatment of patients with multivessel coronary artery disease. For all models, the logit and complementary log-log link functions were used to model the proportion of long-term survivors (not susceptible individuals). Regarding the survival function of the susceptible individuals, the Weibull and Cox models were used. Covariates were considered both in the proportion of long-term survivors and in the survival function of the susceptible individuals. Overall, the models considered proved suitable for analyzing the data from the randomized clinical trial, indicating surgery as the most effective therapeutic strategy. They also indicated that the covariates age, hypertension and diabetes mellitus influence the occurrence of cardiac death, but not the time to the occurrence of this death in susceptible patients.
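A minimal sketch of a mixture cure ("long-term survivors") model of the kind described above, with a logistic link for the non-susceptible fraction and, for simplicity, an exponential survival time for susceptible individuals, fitted by maximum likelihood with scipy on simulated data; the thesis itself also uses Weibull and Cox specifications and a complementary log-log link.

```python
# Sketch: mixture cure model. Population survival is S(t|x) = pi(x) + (1-pi(x)) * Su(t|x),
# with pi(x) the cured fraction (logit link) and Su an exponential survival function.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(23)
n = 1500
x = rng.integers(0, 2, n)                                 # e.g. treatment indicator
p_cure = 1 / (1 + np.exp(-(0.3 + 0.8 * x)))               # true cured (not susceptible) fraction
cured = rng.random(n) < p_cure
t_event = rng.exponential(scale=1 / (0.15 * np.exp(0.5 * x)))
t_cens = rng.uniform(0, 20, n)
time = np.where(cured, t_cens, np.minimum(t_event, t_cens))
event = (~cured & (t_event <= t_cens)).astype(int)

def neg_loglik(par):
    b0, b1, g0, g1 = par
    pi = 1 / (1 + np.exp(-(b0 + b1 * x)))                 # P(cured | x), logit link
    lam = np.exp(g0 + g1 * x)                             # hazard of susceptible individuals
    f = lam * np.exp(-lam * time)                         # event density for susceptibles
    S = np.exp(-lam * time)                               # survival for susceptibles
    lik = np.where(event == 1, (1 - pi) * f, pi + (1 - pi) * S)
    return -np.sum(np.log(np.clip(lik, 1e-300, None)))

fit = minimize(neg_loglik, x0=np.zeros(4), method="Nelder-Mead")
print(fit.x)   # (cure-fraction intercept and slope, log-hazard intercept and slope)
```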
APA, Harvard, Vancouver, ISO, and other styles
17

Holcman, Marcia Moreira. "Avaliação do efeito das perdas de seguimento nas análises feitas pelo estimador produto - limite de Kaplan - Meier e pelo modelo de riscos proporcionais de Cox." Universidade de São Paulo, 2006. http://www.teses.usp.br/teses/disponiveis/6/6132/tde-08032007-100521/.

Full text
Abstract:
Introduction: The Kaplan-Meier product-limit estimator (KM) and the Cox proportional hazards model are the techniques most commonly used in survival analysis with censored data. Both methods rest on the key assumption that censoring is independent of the survival time. Objective: To analyze the effect of loss to follow-up on these two techniques. Methods: The study used the database of patients registered in 1994 in the Hospital Registry of the Hospital do Câncer in São Paulo. Twenty-eight data sets were built simulating informative and non-informative losses. Informative loss was simulated by converting 5% to 50% of the deaths into censored (alive) observations; non-informative loss was simulated by randomly drawing 5% to 50% of the records in the database. The Kaplan-Meier estimator was used to estimate the cumulative survival at the first, third and fifth year of follow-up, and the Cox proportional hazards model was used to estimate hazard ratios (HRs). All KM estimates and HRs were compared with the results from the original database. Results: Losses were more frequent among patients with higher schooling, those admitted through health plans or as private patients, and those with less severe disease (stage I or II). The larger the proportion of informative loss, the larger the differences in the KM estimates; losses to follow-up above 15% led to differences above 20% in the estimated survival probabilities. The HRs were less affected, with losses above 20% producing variations of about 10% in the estimates. When the losses were non-informative, there were no significant differences in the KM estimates or the HRs relative to the original database. Conclusions: It is important to assess whether the losses occurring in cohort studies are informative, since informative losses can distort the estimates, especially those obtained by the KM method.
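A small simulation sketch of the study's central point: converting deaths into "alive" records (informative loss) biases the Kaplan-Meier survival estimate upward, whereas censoring randomly chosen subjects at independent drop-out times (non-informative loss) leaves it essentially unchanged. The cohort below is simulated and assumes the Python lifelines package; it is not the hospital registry data.

```python
# Sketch: effect of informative vs. non-informative loss to follow-up on the
# Kaplan-Meier estimate of 5-year survival.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(29)
n = 5000
t_death = rng.exponential(scale=6.0, size=n)               # true survival times (years)
time = np.minimum(t_death, 5.0)                            # administrative censoring at 5 years
event = (t_death <= 5.0).astype(int)

def km_at_5(t, e):
    return float(KaplanMeierFitter().fit(t, event_observed=e).predict(5.0))

# Informative loss: 20% of the deaths are recorded as censored ("alive")
informative = event.copy()
deaths = np.flatnonzero(event == 1)
informative[rng.choice(deaths, size=int(0.2 * deaths.size), replace=False)] = 0

# Non-informative loss: 20% of subjects get an independent uniform drop-out time
lost = rng.choice(n, size=int(0.2 * n), replace=False)
c_lost = rng.uniform(0, 5.0, size=lost.size)
t_ni, e_ni = time.copy(), event.copy()
t_ni[lost] = np.minimum(time[lost], c_lost)
e_ni[lost] = event[lost] * (time[lost] <= c_lost).astype(int)

print("original       :", km_at_5(time, event))
print("informative    :", km_at_5(time, informative))      # biased upward
print("non-informative:", km_at_5(t_ni, e_ni))             # close to the original
```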
APA, Harvard, Vancouver, ISO, and other styles
18

El, Nagar Ayman Gamal Fawzy. "Genetic analysis of longevity in specialized lines of rabbits." Doctoral thesis, Universitat Politècnica de València, 2015. http://hdl.handle.net/10251/52390.

Full text
Abstract:
The global objective of the present thesis was to study functional longevity, defined as length of productive life (LPL), in five Spanish specialized lines of rabbits (A, V, H, LP and R). Chapter 3 aimed to check the genetic heterogeneity for longevity between the five lines, estimating the additive variances and the corresponding effective heritabilities, and to test the genetic importance of time-dependent factors such as order of positive palpation (OPP), physiological status (PS) and number of kits born alive (NBA) on the genetics of longevity. This was assessed using four different Cox proportional hazards models; the first (Model 1) included all the previous factors in addition to the year-season effect, the inbreeding coefficient and, finally, the animal additive effect as a random factor. The remaining three models were the same as Model 1 but excluded OPP (Model 2), PS (Model 3) or NBA (Model 4). The complete data set comprised 15,670 does, 35.6% of which had censored records, and the full pedigree file involved 19,405 animals. The heritability estimates for longevity in the five lines were low, ranging from 0.02±0.01 to 0.14±0.09; consequently, it is not recommended to include this trait as a selection criterion in rabbit breeding programs. Despite the large variation of the heritability estimates, the corresponding 95% HPD intervals always overlapped, so the hypothesis that all lines have the same heritability cannot be discarded. Comparing the additive variance estimates of the four models, correcting for PS removed 51, 39, 38, 83 and 75% of the additive variance in lines A, V, H, LP and R, respectively. The risk of death or culling decreased as OPP advanced. Non-pregnant, non-lactating females were those under the highest risk. Does with zero NBA had the highest risk; apart from this special figure (zero NBA), the risk decreased as NBA increased. Chapter 4 aimed to estimate the genetic and environmental correlations between longevity and two prolificacy traits (number of kits born alive, NBA, and number of kits alive at weaning, NW), and between longevity and the percentage of days that the doe spent in the different physiological statuses with respect to its entire productive life. The complete pedigree file comprised 19,405 animals, and the data sets included records on 15,670 does with 58,329 kindlings and 57,927 weanings. In general, the genetic correlations between NBA or NW and the hazard were low to very low, and the only line for which these genetic correlations can be said to differ from zero was the LP line. Regarding the correlations between longevity and the percentage of days the doe spent in each physiological status, there was evidence of non-negligible genetic correlations between the two traits. Chapter 5 aimed to compare the five lines at their foundation and at fixed time periods during their selection programs. The first comparison was done at the origin of the lines, involving the complete data set and using a genetic model (CM) that included the additive values of the animals, so the effect of selection was taken into account. The second comparison used the same model but excluded the additive effects from the analysis (IM) and involved only the data corresponding to each period, so the differences between the lines included the additive values of the animals. The lines V, H and LP showed a substantial superiority over line A at foundation. Line R had a higher risk of death or culling, with relevant differences when compared to the V, H and LP lines. The maximum relative risks were observed between the lines LP and R (0.239) and between LP and A (0.317). For the comparisons at fixed times, the pattern of the differences between the A line and the others was similar to that observed at foundation.
El Nagar, AGF. (2015). Genetic analysis of longevity in specialized lines of rabbits [Tesis doctoral no publicada]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/52390
APA, Harvard, Vancouver, ISO, and other styles
19

Lee, Kyeong Eun. "Bayesian models for DNA microarray data analysis." Diss., Texas A&M University, 2005. http://hdl.handle.net/1969.1/2465.

Full text
Abstract:
Selection of significant genes via expression patterns is important in a microarray problem. Owing to the small sample size and the large number of variables (genes), the selection process can be unstable. This research proposes a hierarchical Bayesian model for gene (variable) selection. We employ latent variables in a regression setting and use a Bayesian mixture prior to perform the variable selection. Due to the binary nature of the data, the posterior distributions of the parameters are not in explicit form, and we need to use a combination of truncated sampling and Markov chain Monte Carlo (MCMC) based computation techniques to simulate the posterior distributions. The Bayesian model is flexible enough to identify the significant genes as well as to perform future predictions. The method is applied to cancer classification via cDNA microarrays. In particular, the genes BRCA1 and BRCA2 are associated with a hereditary disposition to breast cancer, and the method is used to identify the set of significant genes to classify BRCA1 and others. Microarray data can also be applied to survival models. We address the issue of how to reduce the dimension in building the model by selecting significant genes as well as assessing the estimated survival curves. Additionally, we consider the well-known Weibull regression and semiparametric proportional hazards (PH) models for survival analysis. With microarray data, we need to consider the case where the number of covariates p exceeds the number of samples n. Specifically, for a given vector of response values, which are times to event (death or censored times), and p gene expressions (covariates), we address the issue of how to reduce the dimension by selecting the responsible genes, which are controlling the survival time. This approach enables us to estimate the survival curve when n << p. In our approach, rather than fixing the number of selected genes, we assign a prior distribution to this number. The approach creates additional flexibility by allowing the imposition of constraints, such as bounding the dimension via a prior, which in effect works as a penalty. To implement our methodology, we use a Markov chain Monte Carlo (MCMC) method. We demonstrate the use of the methodology with (a) diffuse large B-cell lymphoma (DLBCL) complementary DNA (cDNA) data and (b) breast carcinoma data. Lastly, we propose a mixture of Dirichlet process models using the discrete wavelet transform for curve clustering. In order to characterize these time-course gene expressions, we consider them as trajectory functions of time and gene-specific parameters and obtain their wavelet coefficients by a discrete wavelet transform. We then build cluster curves using a mixture of Dirichlet process priors.
APA, Harvard, Vancouver, ISO, and other styles
20

Li, Qiuju. "Statistical inference for joint modelling of longitudinal and survival data." Thesis, University of Manchester, 2014. https://www.research.manchester.ac.uk/portal/en/theses/statistical-inference-for-joint-modelling-of-longitudinal-and-survival-data(65e644f3-d26f-47c0-bbe1-a51d01ddc1b9).html.

Full text
Abstract:
In longitudinal studies, data collected within a subject or cluster are inherently correlated, and special care is needed to account for such correlation in the analysis. Under the framework of longitudinal studies, three topics are discussed in this thesis. In Chapter 2, the joint modelling of a multivariate longitudinal process consisting of different types of outcomes is discussed. In the large cohort study of the UK North Staffordshire osteoarthritis project, longitudinal trivariate outcomes of continuous, binary and ordinal data are observed at baseline, year 3 and year 6. Instead of analysing each process separately, joint modelling is proposed for the trivariate outcomes to account for the inherent association by introducing random effects and the covariance matrix G. The influence of the covariance matrix G on statistical inference for the fixed-effects parameters has been investigated within the Bayesian framework. The study shows that joint modelling of the multivariate longitudinal process can reduce bias and provide more reliable results than modelling each process separately. Together with longitudinal measurements taken intermittently, a counting process of events in time is often observed as well during a longitudinal study. It is of interest to investigate the relationship between time to event and the longitudinal process; on the other hand, measurements of the longitudinal process may be truncated by terminal events, such as death. Thus, it may be crucial to jointly model the survival and longitudinal data. It is popular to propose linear mixed-effects models for the longitudinal process of continuous outcomes and a Cox regression model for the survival data to characterize the relationship between time to event and the longitudinal process, and some standard assumptions have been made. In Chapter 3, we investigate the influence on statistical inference for survival data when the assumption of mutual independence of the random errors in the linear mixed-effects model is violated. The study is conducted using the conditional score estimation approach, which provides robust estimators and has computational advantages. A generalised sufficient statistic for the random effects is proposed to account for the correlation remaining among the random errors, which is characterized by the data-driven method of modified Cholesky decomposition. The simulation study shows that, by doing so, the approach provides nearly unbiased estimation and efficient statistical inference. Chapter 4 accounts for both the current and past information of the longitudinal process in the survival part of the joint model. In the last 15 to 20 years it has been popular, or even standard, to assume that the longitudinal process affects the counting process of events only through its current value, which, however, need not always be true, as recognised by investigators in more recent studies. An integral over the trajectory of the longitudinal process, along with a weighting curve, is proposed to account for both current and past information in order to improve inference and reduce underestimation of the effects of the longitudinal process on the hazard. A plausible approach to statistical inference for the proposed models is presented, along with a real data analysis and a simulation study.
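For readers who want to see the data structure involved, the sketch below illustrates the naive two-stage analysis that joint models such as those in this thesis are designed to improve upon: a linear mixed model is fitted to the longitudinal biomarker, and the estimated subject-level random effects are then plugged into a Cox model. The data, variable names and effect sizes are simulated assumptions (not the thesis's data), and Python (statsmodels/lifelines) is used purely for illustration.

```python
# Naive two-stage longitudinal + survival analysis: the baseline that joint models improve upon.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 200
b = rng.normal(0, 1, n)                                  # true subject-level random intercepts
subj = np.repeat(np.arange(n), 4)                        # 4 longitudinal visits per subject
visit = np.tile([0.0, 1.0, 2.0, 3.0], n)
biomarker = 1.0 + 0.5 * visit + b[subj] + rng.normal(0, 0.5, subj.size)
long_df = pd.DataFrame({"id": subj, "visit": visit, "biomarker": biomarker})

T = rng.exponential(scale=np.exp(-0.7 * b))              # survival depends on the random intercept
C = rng.exponential(scale=2.0, size=n)                   # censoring times

# Stage 1: linear mixed model for the longitudinal process (random intercept per subject).
mixed = smf.mixedlm("biomarker ~ visit", long_df, groups=long_df["id"]).fit()
re_hat = np.array([mixed.random_effects[i].iloc[0] for i in range(n)])

# Stage 2: plug the estimated random intercepts into a Cox proportional hazards model.
surv_df = pd.DataFrame({"T": np.minimum(T, C), "E": (T <= C).astype(int), "re_hat": re_hat})
CoxPHFitter().fit(surv_df, duration_col="T", event_col="E").print_summary()
```

Because the plug-in step ignores the uncertainty in the estimated random effects, the Cox coefficient is typically attenuated; that bias is precisely what a genuine joint likelihood addresses.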
APA, Harvard, Vancouver, ISO, and other styles
21

Hoglin, Phillip J. "Survival analysis and accession optimization of prior enlisted United States Marine Corps officers." Thesis, Monterey, California. Naval Postgraduate School, 2004. http://hdl.handle.net/10945/1673.

Full text
Abstract:
Approved for public release, distribution is unlimited
The purpose of this thesis is, first, to analyze the determinants of the survival of United States Marine Corps officers and, second, to develop the methodology to optimize the accessions of prior and non-prior enlisted officers. Using data from the Marine Corps Officer Accession Career file (MCCOAC), the Cox Proportional Hazards Model is used to estimate the effects of officer characteristics on their survival as commissioned officers in the USMC. A Markov model for career transition is combined with fiscal data to determine the optimum number of prior and non-prior enlisted officers under the constraints of force structure and budget. The findings indicate that prior enlisted officers have a better survival rate than their non-prior enlisted counterparts. Additionally, officers who are married, commissioned through MECEP, graduate in the top third of their TBS class, and are assigned to a combat support MOS have a better survival rate than officers who are unmarried, commissioned through USNA, graduate in the middle third of their TBS class, and are assigned to either a combat or combat service support MOS. The findings also indicate that the optimum number of prior enlisted officer accessions may be considerably lower than recent trends and may differ across MOS. Based on the findings, it is recommended that prior enlisted officer accession figures be reviewed.
Major, Australian Army
APA, Harvard, Vancouver, ISO, and other styles
22

Karimi, Maryam. "Modélisation conjointe de trajectoire socioprofessionnelle individuelle et de la survie globale ou spécifique." Thesis, Université Paris-Saclay (ComUE), 2016. http://www.theses.fr/2016SACLS120/document.

Full text
Abstract:
Being in a low socioeconomic position is associated with increased mortality risk from various causes of death. Previous studies have already shown the importance of considering different dimensions of socioeconomic trajectories across the life course. Analyses of professional trajectories constitute a crucial step in better understanding the association between socioeconomic position and mortality. The main challenge in measuring this association is to decompose the respective share of these factors in explaining the survival level of individuals. The complexity lies in the bidirectional causality underlying the observed associations: are mortality differentials due to differences in initial health conditions that jointly influence employment status and mortality, or does the professional trajectory directly influence health conditions and then mortality? Standard methods do not consider the interdependence of changes in occupational status and the bidirectional causal effect underlying the observed association, and this leads to substantial bias in estimating the causal link between professional trajectory and mortality. Therefore, it is necessary to propose statistical methods that consider repeated measurements (careers) and survival variables simultaneously. This study was motivated by the Cosmop-DADS database, which is a sample of the French salaried population. The first aim of this dissertation was to consider whole professional trajectories and an accurate occupational classification, instead of the limited number of life-course stages and the simple occupational classification considered previously. For this purpose, we defined time-dependent variables to capture different life-course dimensions, namely the critical period, accumulation and social mobility models, and we highlighted the association between professional trajectories and cause-specific mortality using the defined variables in a Cox proportional hazards model. The second aim was to incorporate the employment episodes in a longitudinal sub-model within the joint model framework, to reduce the bias resulting from the inclusion of internal time-dependent covariates in the Cox model. We proposed a joint model for longitudinal nominal outcomes and competing risks data in a likelihood-based approach. In addition, we proposed an approach mimicking meta-analysis to address the computational problems of joint models applied to large datasets, by extracting independent stratified samples from the large dataset, applying the joint model on each sample and then combining the results. With the same objective, that is, fitting a joint model on large-scale data, we propose a procedure based on the computational appeal of the Poisson regression model. This approach consists of finding representative trajectories by means of clustering methods and then applying the joint model on these representative trajectories.
APA, Harvard, Vancouver, ISO, and other styles
23

Persson, Daniel, and Johannes Ahlström. "Går det att prediktera konkurs i svenska aktiebolag? : En kvantitativ studie om hur finansiella nyckeltal kan användas vid konkursprediktion." Thesis, Linköpings universitet, Företagsekonomi, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-119867.

Full text
Abstract:
From the early 1900s, banks and lending institutions have used financial ratios as an aid in the assessment and quantification of credit risk. For today's investors the economic environment is far more complicated than 40 years ago, when technology and computerization opened up the world's markets. Credit risk assessment today requires effective analysis of quantitative data and models that can predict risks with good accuracy. During the second half of the 20th century there was a rapid development of the tools used for bankruptcy prediction, from simple univariate models to complex data mining models with thousands of observations. This study investigates whether it is possible to predict bankruptcy in Swedish limited companies and which variables contain information relevant for this purpose. The methods used in the study are discriminant analysis, logistic regression and survival analysis on 50 active companies and 50 failed companies. The results indicate an accuracy between 67.5% and 75% depending on the choice of statistical method. Regardless of the statistical method used, it is possible to classify companies as bankrupt two years before the bankruptcy occurs using financial ratios that measure profitability and solvency. Societal costs are reduced by better bankruptcy prediction using financial ratios, which contribute to increasing the ability of companies to apply financial management with relevant key ratios in the form of stock, retained earnings, net income and operating income.
APA, Harvard, Vancouver, ISO, and other styles
24

Liu, Chia-Chiung, and 劉佳峻. "A simulation study for cut points analysis in Logist regression and Cox proportional hazards model." Thesis, 2019. http://ndltd.ncl.edu.tw/handle/yn3698.

Full text
Abstract:
Master's
National Sun Yat-sen University
Department of Applied Mathematics
107
In clinical practice, it is often necessary to segment continuous variables for risk assessment, that is, to convert continuous prognostic factors into categories to facilitate clinical judgment and interpretation. There are three problems to be solved when estimating cut points. The first is to determine the optimal number of cut points. Many traditional methods have been developed to find one optimal cut point that categorizes a variable into two subgroups; in many situations, however, finding more than one cut point is of interest. The second is to find the locations of the optimal cut points. The last is the statistical inference after the optimal number and locations of cut points have been found, including correcting the p-value, relative risks, power, and so on. In previous theses (Tsai, Y.H. (2018) and Chiu, Y.C. (2018)), a new approach combining cross-validation and Monte Carlo methods (CVM) was proposed for both the logistic and Cox regression models to find the optimal number and locations of cut points. However, the proposed method was not compared with other methods. In this thesis, we conducted simulation studies to compare the proposed CVM with three other methods: the naive approach (without any correction, NA), the split-sample approach (SS), and the cross-validation approach (CV). We compare the performance of these four methods in estimating the number and locations of cut points, relative risks, and power, in both univariate and multivariate analyses and for different sample sizes.
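As a concrete illustration of the cut-point problem described above, the sketch below runs a naive grid search for a single cut point by maximizing the log-rank statistic on simulated data. The column names and the true cut point are assumptions chosen for illustration; the resulting minimum p-value is optimistic and needs the kind of correction (cross-validation/Monte Carlo) studied in the thesis.

```python
# Naive single cut-point search: maximize the log-rank statistic over a grid of candidate cuts.
import numpy as np
from lifelines.statistics import logrank_test

rng = np.random.default_rng(1)
n = 300
x = rng.uniform(0, 10, n)                                 # continuous prognostic factor
T = rng.exponential(scale=np.where(x > 6, 0.5, 1.5))      # true cut point at 6 (higher risk above it)
C = rng.exponential(scale=2.0, size=n)                    # censoring times
time, event = np.minimum(T, C), (T <= C).astype(int)

best = None
for cut in np.quantile(x, np.linspace(0.1, 0.9, 33)):     # candidate cut points
    lo, hi = x <= cut, x > cut
    res = logrank_test(time[lo], time[hi],
                       event_observed_A=event[lo], event_observed_B=event[hi])
    if best is None or res.test_statistic > best[1]:
        best = (cut, res.test_statistic, res.p_value)

print("optimal cut point: %.2f, chi-square: %.1f, naive (uncorrected) p-value: %.2g" % best)
```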
APA, Harvard, Vancouver, ISO, and other styles
25

Joy, Nathaniel Allen. "A Duration Analysis of Food Safety Recall Events in the United States: January, 2000 to October, 2009." Thesis, 2010. http://hdl.handle.net/1969.1/ETD-TAMU-2010-12-8826.

Full text
Abstract:
The safety of the food supply in the United States has become an issue of prominence in the minds of ordinary Americans. Several government agencies, including the United States Department of Agriculture and the Food and Drug Administration, are charged with the responsibility of preserving the safety of the food supply. Food is withdrawn from the market in a product recall when it is tainted or mislabeled and has the potential to harm the consumer in some manner. This research examines recall events issued by firms in the United States over the period January, 2000 through October, 2009. Utilizing economic and management theory to establish predictions, this study employs the Cox proportional hazard regression model to analyze the effects of firm size and branding on the risk of recall recurrence. The size of the firm was measured both in billions of dollars of sales and in thousands of employees. Branding by the firm was measured as a binary variable that expressed whether a firm had a brand and as a count of the number of brands within a firm. This study also provides a descriptive statistical analysis and several findings based on the recall data, specifically relating to annual occurrences, geographical locations of the firms involved, types of products recalled, and reasons for recall. We hypothesized that increasing firm size would be associated with increased relative risk of a recall event, while branding and an increasing portfolio of brands would be associated with decreased relative risk of a recall event. However, it was found that increased firm size and branding by the firm are both associated with an increased risk of recall occurrence. The results of this research can have implications for food safety standards in both the public and private sectors.
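A minimal sketch of the type of Cox regression described above, with a continuous firm-size covariate and a binary branding indicator applied to simulated recall durations (the variable names and data are illustrative assumptions, not the study's recall records):

```python
# Cox proportional hazards regression of time-to-recall on firm size and branding (simulated data).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 400
size = rng.gamma(2.0, 2.0, n)            # firm size, e.g. billions of dollars of sales
brand = rng.binomial(1, 0.5, n)          # 1 if the firm markets branded products
risk = np.exp(0.15 * size + 0.4 * brand)
T = rng.exponential(scale=1.0 / risk)    # time to next recall event
C = rng.exponential(scale=3.0, size=n)   # end of observation (censoring)
df = pd.DataFrame({"duration": np.minimum(T, C), "recall": (T <= C).astype(int),
                   "size": size, "brand": brand})

cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="recall")
print(cph.hazard_ratios_)                # ratios > 1 indicate higher relative risk of recurrence
```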
APA, Harvard, Vancouver, ISO, and other styles
26

Hsiao, Han, and 蕭涵. "Analysis of high dimensional gene expression and mutation data in bladder cancer using Cox proportional hazards model and logistic regression via different penalizations." Thesis, 2018. http://ndltd.ncl.edu.tw/handle/amf4n9.

Full text
Abstract:
Master's
National Sun Yat-sen University
Department of Applied Mathematics
106
Bladder cancer is one of the malignant diseases of the urinary system. Its common symptoms include hematuria, which can be seen by eye or detected by urine analysis. In order to understand the effect of gene expression and mutation data on subtypes and recurrent events in patients with bladder cancer, we downloaded data from The Cancer Genome Atlas (TCGA) and applied high-dimensional methods such as LASSO, Ridge and Adaptive LASSO penalization of the Cox model to screen gene variables, compare the performance of different models and predict the hazard for each patient. Among the selected gene candidates, we found that TP53 and ERBB3 have been reported in quite a few papers, which supports our method. The list of genes could not only help the laboratory perform further analysis but also screen out potential patients in advance. In addition, we wrote functions in the R language to access and process the gene database, which can be used by other researchers in the future.
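A minimal sketch, assuming a recent lifelines version with the penalizer/l1_ratio interface, of how a lasso-type penalized Cox model can screen gene covariates when p exceeds n; the data are simulated and the gene names are placeholders:

```python
# Lasso-penalized Cox regression for gene screening with p > n (simulated data).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n, p = 100, 300
X = rng.normal(size=(n, p))
beta = np.zeros(p); beta[:5] = 0.8                      # only 5 genes truly affect the hazard
T = rng.exponential(scale=np.exp(-X @ beta))
C = rng.exponential(scale=2.0, size=n)                  # censoring times
df = pd.DataFrame(X, columns=[f"gene_{j}" for j in range(p)])
df["T"], df["E"] = np.minimum(T, C), (T <= C).astype(int)

cph = CoxPHFitter(penalizer=0.3, l1_ratio=1.0)          # pure L1 penalty; l1_ratio=0.0 gives ridge
cph.fit(df, duration_col="T", event_col="E")
screened = cph.params_[cph.params_.abs() > 0.05]        # genes with non-negligible shrunken coefficients
print(screened.sort_values())
```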
APA, Harvard, Vancouver, ISO, and other styles
27

Martínez, Vargas Danae Mirel. "Régression de Cox avec partitions latentes issues du modèle de Potts." Thèse, 2019. http://hdl.handle.net/1866/22552.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Liao, Yi-Nan, and 廖宜楠. "Event Per Variable in Cox Proportional Hazard Model." Thesis, 2010. http://ndltd.ncl.edu.tw/handle/38265148402104684976.

Full text
Abstract:
Master's
National Yang-Ming University
Institute of Public Health
98
Background: A small number of events often occurs due to short follow-up times or too few individuals being followed in clinical, epidemiological or environmental studies. When the number of events is insufficient, the accuracy and efficiency of the estimated regression coefficients in the Cox proportional hazards model can be seriously affected. Previous studies have shown that the parameter estimates are valid when there are more than 10 events per variable (EPV). However, these EPV-related simulation studies of the Cox proportional hazards model only use methods that are appropriate for large samples. Objectives: To compare the accuracy and efficiency of the parameter estimates of Cox models using different methods, such as the partial likelihood method, Firth's penalized likelihood method, and Bayesian methods. Methods: In the context of the Cox proportional hazards model, this simulation study compared (1) different parameter estimation methods, including the standard Cox method, Firth's penalized likelihood method and a Bayesian method; (2) different methods of calculating confidence intervals, including the Wald-based method, profile likelihood methods, and Bayesian credible intervals; (3) different numbers of EPV; (4) categorical versus continuous variables; (5) variables with or without collinearity; and (6) different methods of generating survival data. Results and Conclusions: For categorical variables, when EPV is less than or equal to 4, the accuracy and efficiency of the parameter estimates from Firth's method are better than those from the standard Cox method. When EPV is greater than or equal to 10, Firth's method performs as well as the standard Cox method. For continuous variables, the accuracy and efficiency of Firth's method and the standard Cox method are similar. The estimate from the Bayesian method is close to the pre-set prior distribution when EPV is small; as EPV increases, the estimate approaches the true parameter.
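The simulation design can be illustrated with a small Monte Carlo sketch for the standard partial-likelihood estimator only (Firth's penalized likelihood and the Bayesian fits compared in the thesis are not implemented in lifelines). Sample sizes, effect size and censoring below are assumptions chosen just to vary the events per variable:

```python
# Monte Carlo sketch: bias of the standard Cox estimator at different events-per-variable levels.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
true_beta = 0.7

def one_fit(n, p=5):
    X = rng.normal(size=(n, p))                       # p covariates, only the first matters
    beta = np.zeros(p); beta[0] = true_beta
    T = rng.exponential(scale=np.exp(-(X @ beta)))
    C = rng.exponential(scale=1.0, size=n)            # roughly half the subjects are censored
    df = pd.DataFrame(X, columns=[f"x{j}" for j in range(p)])
    df["T"], df["E"] = np.minimum(T, C), (T <= C).astype(int)
    cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
    return cph.params_["x0"], df["E"].sum() / p       # estimate of beta_1 and realized EPV

for n in (40, 100, 250):
    fits = [one_fit(n) for _ in range(200)]
    b = np.array([f[0] for f in fits]); epv = np.mean([f[1] for f in fits])
    print(f"n={n:3d}  mean EPV={epv:4.1f}  mean estimate of beta1={b.mean():.3f}  (true {true_beta})")
```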
APA, Harvard, Vancouver, ISO, and other styles
29

Yu-Jheng, Su, and 蘇育正. "Adaptive model selection criterion:An application of Cox proportional hazard model." Thesis, 2011. http://ndltd.ncl.edu.tw/handle/19790975378637992966.

Full text
Abstract:
Master's
Tunghai University
Department of Statistics
99
The Cox proportional hazards model has been widely used to describe the relationship between survival information and covariates. For model selection, the Akaike information criterion (AIC) and the Bayesian information criterion (BIC) are the most frequently used methods. However, the two information criteria may perform differently under different circumstances, so that neither the AIC nor the BIC is superior in all cases. In practice, the underlying true model is unknown and model selection is based only on the chosen selection criterion. Therefore, in this article, we propose a data-driven method to choose the optimal penalty parameter based on a concept of generalized degrees of freedom, resulting in an approximately unbiased estimator of the Kullback-Leibler loss via a data perturbation technique. The effectiveness of the proposed method is justified by a simulation study. Finally, we illustrate the feasibility of the method with the German Breast Cancer Study Group's data set.
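For context, the fixed-penalty criterion that this thesis generalizes can be computed directly from a fitted Cox model's log partial likelihood. The sketch below compares nested covariate sets by partial-likelihood AIC on simulated data; the adaptive, perturbation-based choice of penalty proposed in the thesis is not reproduced here.

```python
# Comparing candidate Cox models by partial-likelihood AIC = -2*log PL + 2*(number of coefficients).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 300
x1, x2, x3 = rng.normal(size=(3, n))
T = rng.exponential(scale=np.exp(-(0.8 * x1 + 0.4 * x2)))   # x3 is pure noise
C = rng.exponential(scale=2.0, size=n)
df = pd.DataFrame({"T": np.minimum(T, C), "E": (T <= C).astype(int),
                   "x1": x1, "x2": x2, "x3": x3})

for cols in (["x1"], ["x1", "x2"], ["x1", "x2", "x3"]):
    cph = CoxPHFitter().fit(df[cols + ["T", "E"]], duration_col="T", event_col="E")
    aic = -2 * cph.log_likelihood_ + 2 * len(cph.params_)
    print(cols, "partial AIC =", round(aic, 1))             # smallest value is preferred
```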
APA, Harvard, Vancouver, ISO, and other styles
30

Tung, Feng-Cheng, and 董峰呈. "Mutual Fund’s Failure Prediction - Cox''s Proportional Hazard Model." Thesis, 2001. http://ndltd.ncl.edu.tw/handle/26173602273714817022.

Full text
Abstract:
Master's
National Kaohsiung First University of Science and Technology
Master's Program, Department of Financial Operations
89
This study applies Cox's (1984) proportional hazard model (PHM) to set up pre-warning models for the failure of mutual funds. The study covers 40 mutual funds on the Taiwan Stock Exchange during the period 1996-2000 and uses twelve financial variables as endogenous factors of mutual fund survival. It is expected that the pre-warning models for mutual fund failure will help investors take preventive action and reduce the risk of buying mutual funds. The results indicate that the rate of three-monthly return and the number of mutual fund beneficiary units are significant factors affecting the failure of mutual funds, and that the possibility of financial distress is positively associated with these two variables. After the pre-warning models are set up, the study selects 8 mutual funds on the Taiwan Stock Exchange, based on information available in February 2000, and predicts their survival probabilities. The survival probabilities of the Nation Wan-Tong and Chinese Wan-Bang funds after 30 months are the lowest among the 8 mutual funds. The study suggests that investors should select mutual funds with a higher rate of three-monthly return and more mutual fund beneficiary units.
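A minimal sketch of the prediction step described above: after fitting a Cox model to fund-level covariates, the survival probability at a 30-month horizon can be read off the predicted survival functions. The covariates and data below are simulated stand-ins, not the study's twelve financial variables.

```python
# Predicting fund survival probabilities at a 30-month horizon from a fitted Cox model (simulated data).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(6)
n = 40
ret3m = rng.normal(0.01, 0.05, n)            # three-monthly return (stand-in)
units = rng.lognormal(4, 1, n)               # beneficiary units, arbitrary scale (stand-in)
risk = np.exp(-8.0 * ret3m - 0.002 * units)
T = rng.exponential(scale=12.0 / risk)       # months until the fund fails
C = np.full(n, 60.0)                         # five years of observation (administrative censoring)
df = pd.DataFrame({"months": np.minimum(T, C), "failed": (T <= C).astype(int),
                   "ret3m": ret3m, "units": units})

cph = CoxPHFitter().fit(df, duration_col="months", event_col="failed")
candidates = df[["ret3m", "units"]].head(8)                     # e.g. eight funds to screen
print(cph.predict_survival_function(candidates, times=[30]))    # P(fund survives past 30 months)
```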
APA, Harvard, Vancouver, ISO, and other styles
31

Hsu, Wen Chu, and 許雯筑. "The Determinants of stock repurchse: cox proportional hazard model." Thesis, 2008. http://ndltd.ncl.edu.tw/handle/66161651537784230258.

Full text
Abstract:
Master's
National Chengchi University
Graduate Institute of International Business and Trade
96
In this study, we apply the Cox proportional hazard model for recurrent event analysis, which is usually used in medical and scientific studies, to analyze the determinants of the stock repurchase events of S&P 500 companies. We investigate three main incentives for companies to conduct stock repurchases. The empirical results show that companies employ repurchases as a technique to alter capital structure and to distribute excess capital. In contrast, there is little evidence to support signaling of undervaluation.
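Recurrent repurchase events of this kind are usually analyzed in the counting-process (start/stop) form of the Cox model. The sketch below shows that data layout with a few invented firm-interval records; the covariates are illustrative proxies, not the study's variables.

```python
# Recurrent-event Cox regression in counting-process (start/stop) format with lifelines.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Each row is one "at risk" interval for a firm; event = 1 if a repurchase ends the interval.
long_df = pd.DataFrame({
    "firm":     [1, 1, 1, 2, 2, 3, 3, 4, 5, 5],
    "start":    [0, 4, 9, 0, 6, 0, 5, 0, 0, 7],
    "stop":     [4, 9, 12, 6, 12, 5, 12, 12, 7, 12],
    "event":    [1, 1, 0, 1, 0, 1, 1, 0, 1, 0],
    "leverage": [0.20, 0.35, 0.30, 0.50, 0.55, 0.15, 0.25, 0.40, 0.10, 0.30],  # capital-structure proxy
    "cash":     [0.12, 0.08, 0.10, 0.03, 0.02, 0.18, 0.15, 0.05, 0.22, 0.09],  # excess-capital proxy
})

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="firm", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()
```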
APA, Harvard, Vancouver, ISO, and other styles
32

Tsai, Tsungche, and 蔡宗哲. "Adaptive Model Selection Between Cox Proportional Hazard Model And Aalen Additive Model." Thesis, 2012. http://ndltd.ncl.edu.tw/handle/87529691313741577966.

Full text
Abstract:
Master's
Tunghai University
Department of Statistics
100
In medical studies, the Cox proportional hazards model (Cox, 1972) is the most commonly used method to analyze the survival function of patients when right-censored survival data are accompanied by covariates associated with the patients' physiology and condition. However, the proportional hazards assumption is often violated in practice, and the Aalen additive model (Aalen, 1989) is then an alternative choice. In this model, the covariates act in an additive manner on an unknown baseline hazard rate, and the unknown risk coefficients are allowed to be functions of time, so that the effect of a covariate may vary over time. However, the two models generally perform differently under different circumstances, so that neither the Cox model nor the Aalen model is superior in all cases, and how to select between them has not been explored in the literature. We therefore propose a data-driven method for making this selection based on a concept of generalized degrees of freedom, resulting in an approximately unbiased estimator of the Kullback-Leibler loss via a data perturbation technique. The effectiveness of the proposed method is justified by a simulation study, and the method is also applied to two real data sets.
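A minimal sketch of fitting the two candidate specifications to the same censored data with lifelines; the simulated data and covariates are assumptions, and the thesis's perturbation-based selection criterion itself is not reproduced.

```python
# Fitting a Cox proportional hazards model and Aalen's additive hazards model to the same data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter, AalenAdditiveFitter

rng = np.random.default_rng(7)
n = 300
age = rng.normal(60, 10, n)
trt = rng.binomial(1, 0.5, n)
T = rng.exponential(scale=np.exp(-(0.03 * (age - 60) + 0.5 * trt)))
C = rng.exponential(scale=2.0, size=n)
df = pd.DataFrame({"T": np.minimum(T, C), "E": (T <= C).astype(int), "age": age, "trt": trt})

cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
aaf = AalenAdditiveFitter(coef_penalizer=0.5).fit(df, duration_col="T", event_col="E")

print(cph.params_)                         # constant log hazard ratios (multiplicative effects)
print(aaf.cumulative_hazards_.tail())      # time-varying cumulative regression functions (additive effects)
```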
APA, Harvard, Vancouver, ISO, and other styles
33

"Bootstrap distribution for testing a change in the cox proportional hazard model." 2000. http://library.cuhk.edu.hk/record=b5890302.

Full text
Abstract:
Lam Yuk Fai.
Thesis (M.Phil.)--Chinese University of Hong Kong, 2000.
Includes bibliographical references (leaves 41-43).
Abstracts in English and Chinese.
Chapter 1 --- Basic Concepts --- p.9
Chapter 1.1 --- Survival data --- p.9
Chapter 1.1.1 --- An example --- p.9
Chapter 1.2 --- Some important functions --- p.11
Chapter 1.2.1 --- Survival function --- p.12
Chapter 1.2.2 --- Hazard function --- p.12
Chapter 1.3 --- Cox Proportional Hazards Model --- p.13
Chapter 1.3.1 --- A special case --- p.14
Chapter 1.3.2 --- An example (continued) --- p.15
Chapter 1.4 --- Extension of the Cox Proportional Hazards Model --- p.16
Chapter 1.5 --- Bootstrap --- p.17
Chapter 2 --- A New Method --- p.19
Chapter 2.1 --- Introduction --- p.19
Chapter 2.2 --- Definition of the test --- p.20
Chapter 2.2.1 --- Our test statistic --- p.20
Chapter 2.2.2 --- The alternative test statistic I --- p.22
Chapter 2.2.3 --- The alternative test statistic II --- p.23
Chapter 2.3 --- Variations of the test --- p.24
Chapter 2.3.1 --- Restricted test --- p.24
Chapter 2.3.2 --- Adjusting for other covariates --- p.26
Chapter 2.4 --- Apply with bootstrap --- p.28
Chapter 2.5 --- Examples --- p.29
Chapter 2.5.1 --- Male mice data --- p.34
Chapter 2.5.2 --- Stanford heart transplant data --- p.34
Chapter 2.5.3 --- CGD data --- p.34
Chapter 3 --- Large Sample Properties and Discussions --- p.35
Chapter 3.1 --- Large sample properties and relationship to goodness of fit test --- p.35
Chapter 3.1.1 --- Large sample properties of A and Ap --- p.35
Chapter 3.1.2 --- Large sample properties of Ac and A --- p.36
Chapter 3.2 --- Discussions --- p.37
APA, Harvard, Vancouver, ISO, and other styles
34

Kao, Xin-ru, and 高欣如. "Discussion on Cox Proportional Hazards Assumption and Application of Extended Hazard Model." Thesis, 2009. http://ndltd.ncl.edu.tw/handle/wh9svf.

Full text
Abstract:
Master's
National Central University
Graduate Institute of Statistics
97
The Cox proportional hazards model has been widely used to describe the relationship between survival information and covariates. The validity of applying the Cox model to data is usually assessed by checking the proportional hazards assumption, and it is an interesting problem to investigate whether checking this assumption is sufficient evidence for fitting the data with the Cox model. On the other hand, when the proportional hazards assumption fails, the accelerated failure time (AFT) model is a popular alternative to the Cox model. However, when the data include time-dependent covariates, there are no convenient tools to check whether the AFT model is appropriate. A general class of models, termed the "extended hazard model", which contains the Cox and AFT models as special cases, may be helpful for studying these problems, because under the nested structure we may test the fit of the Cox and AFT models to the data. Finally, we demonstrate the new model through a case study of Taiwanese HIV/AIDS cohort data.
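The proportional hazards check that motivates this comparison can be illustrated as follows: fit a Cox model, then test the assumption via scaled Schoenfeld residuals. The data below are simulated with a deliberately time-varying treatment effect so that the test should flag a violation; this is the standard lifelines check, not the thesis's extended-hazard-model test.

```python
# Checking the proportional hazards assumption after fitting a Cox model.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import proportional_hazard_test

rng = np.random.default_rng(8)
n = 500
trt = rng.binomial(1, 0.5, n)
# Treatment delays events early on but has no effect later, so the hazard ratio varies over time.
T = np.where(trt == 1, 0.5 + rng.exponential(1.0, n), rng.exponential(1.0, n))
C = rng.exponential(scale=2.0, size=n)
df = pd.DataFrame({"T": np.minimum(T, C), "E": (T <= C).astype(int), "trt": trt})

cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
result = proportional_hazard_test(cph, df, time_transform="rank")
result.print_summary()            # a small p-value suggests the PH assumption is violated
```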
APA, Harvard, Vancouver, ISO, and other styles
35

Wu, Chi-ruei, and 吳啓瑞. "The study of mobile commerce and operational efficiency-an application of Cox proportional hazard model." Thesis, 2012. http://ndltd.ncl.edu.tw/handle/49883964522804226652.

Full text
Abstract:
Master's
National Central University
Graduate Institute of Business Administration
100
As mobile communication technology matures, banks have introduced mobile banking services to their customers and can increase their competitiveness through the channel's unique convenience, immediacy and personalized services. The purpose of this study is therefore to investigate the relationship between mobile banking services and operational efficiency. The study adopts a single bank as a case study and collects 30 quarterly records, from the first quarter of 2004 to the first quarter of 2011 plus the first quarter of 2012. First, efficiency was assessed with Data Envelopment Analysis and the Slack-Based Measure to understand the operational efficiency of the target bank; the factors that improve operational efficiency were then analyzed with the Cox proportional hazard model. The results indicate that the volume of mobile banking transactions had a positive but insignificant effect on the rate at which efficiency improved. This suggests that mobile banking, under the limitations of technology and transaction security, fails to mitigate the workload of the physical banking channel; in other words, the physical channel is not able to redistribute resources to more profitable business, and therefore the improvement in the bank's operational efficiency is not significant.
APA, Harvard, Vancouver, ISO, and other styles
36

Otto, Simon James Garfield. "ANTIMICROBIAL RESISTANCE OF HUMAN CAMPYLOBACTER JEJUNI INFECTIONS FROM SASKATCHEWAN." Thesis, 2011. http://hdl.handle.net/10214/2658.

Full text
Abstract:
Saskatchewan is the only province in Canada to have routinely tested the antimicrobial susceptibility of all provincially reported human cases of campylobacteriosis. From 1999 to 2006, 1378 human Campylobacter species infections were tested for susceptibility at the Saskatchewan Disease Control Laboratory using the Canadian Integrated Program for Antimicrobial Resistance Surveillance panel and minimum inhibitory concentration (MIC) breakpoints. Of these, 1200 were C. jejuni and 129 were C. coli, with the remainder made up of C. lari, C. laridis, C. upsaliensis and undifferentiated Campylobacter species. Campylobacter coli had significantly higher prevalences of ciprofloxacin resistance (CIPr), erythromycin resistance (ERYr), combined CIPr-ERYr resistance and multidrug resistance (to three or more drug classes) than C. jejuni. Logistic regression models indicated that CIPr in C. jejuni decreased from 1999 to 2004 and subsequently increased in 2005 and 2006. The risk of CIPr was significantly increased in the winter months (January to March) compared to other seasons. A comparison of logistic regression and Cox proportional hazard survival models found that the latter were better able to detect significant temporal trends in CIPr and tetracycline resistance by directly modeling MICs, but that these trends were more difficult to interpret. Scan statistics detected significant spatial clusters of CIPr C. jejuni infections in urban centers (Saskatoon and Regina) and temporal clusters in the winter months; the space-time permutation model did not detect any space-time clusters. Bernoulli scan tests were computationally the fastest for cluster detection, compared to ordinal MIC and multinomial antibiogram models. eBURST analysis of antibiogram patterns showed a marked distinction between case and non-case isolates from the scan statistic clusters. Multilevel logistic regression models detected significant individual and regional contextual risk factors for infection with CIPr C. jejuni. Patients infected in the winter, who were between 40 and 45 years of age, who lived in urban regions and who lived in regions of moderately high poultry density had higher risks of a resistant infection. These results advance the epidemiologic knowledge of CIPr C. jejuni in Saskatchewan and provide novel analytical methods for antimicrobial resistance surveillance data in Canada.
Saskatchewan Disease Control Laboratory (Saskatchewan Ministry of Health); Laboratory for Foodborne Zoonoses (Public Health Agency of Canada); Centre for Foodborne, Environmental and Zoonotic Infectious Diseases (Public Health Agency of Canada); Ontario Veterinary College Blake Graham Fellowship
APA, Harvard, Vancouver, ISO, and other styles
37

Guan-Lin, Huang, and 黃冠霖. "Apply Cox proportional hazard model to create customer loyalty index with case study of smartphone service industry." Thesis, 2013. http://ndltd.ncl.edu.tw/handle/h4b7bd.

Full text
Abstract:
Master's
National Cheng Kung University
Department of Industrial and Information Management (Master's and Doctoral Program)
101
The goal of enterprise operations is to earn profit with limited resources. Customers are the source of profit, and loyal customers bring stable income. Since the cost of acquiring new customers is much higher than that of retaining existing customers, creating a customer loyalty index and then managing customer loyalty has become a major issue for business today. In this study, we propose a customer loyalty index based on the Cox proportional hazards model. Our method integrates customer prognostic factors, consumer behavior, service quality, customer satisfaction, and switching costs into the model to evaluate the probability of losing customers. Taking the loss of a customer as the event, we calculate a hazard ratio for each customer with respect to a reference point. The method is applied to a case from the smartphone service industry. The results show that mobile connection quality, the gap between expectations and reality, customer re-patronage intention, and customer age may affect the loss of customers. Finally, by grouping and sorting customers according to their hazard ratios, management gains a better view of the current distribution of customer loyalty.
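One simple way to turn a fitted Cox model into a customer-level index, in the spirit of the loyalty index described above, is to rank customers by their predicted partial hazard (relative risk of churning versus a baseline customer). The sketch below uses simulated data and hypothetical covariate names:

```python
# Building a churn-risk index from the partial hazards of a fitted Cox model (simulated data).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(9)
n = 500
quality = rng.normal(0, 1, n)            # perceived connection quality (standardized)
gap = rng.normal(0, 1, n)                # expectation-reality gap
age = rng.normal(35, 10, n)
risk = np.exp(-0.6 * quality + 0.5 * gap - 0.01 * (age - 35))
T = rng.exponential(scale=1.0 / risk)    # time until the customer churns
C = rng.exponential(scale=2.0, size=n)   # end of observation
df = pd.DataFrame({"T": np.minimum(T, C), "E": (T <= C).astype(int),
                   "quality": quality, "gap": gap, "age": age})

cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
df["risk_index"] = cph.predict_partial_hazard(df)                     # relative hazard of churning
df["loyalty_tier"] = pd.qcut(df["risk_index"], 4, labels=["Q1", "Q2", "Q3", "Q4"])
print(df.groupby("loyalty_tier")["E"].mean())                         # observed churn rate by tier
```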
APA, Harvard, Vancouver, ISO, and other styles
38

Chen, Yi-Chun, and 陳怡君. "A study on merger & acquisition and duration in Taiwan airline markets: Application cox proportional hazard model." Thesis, 2008. http://ndltd.ncl.edu.tw/handle/46828843673961945880.

Full text
Abstract:
Master's
Chang Jung Christian University
Graduate Institute of Business Administration
96
In 1997, demand for domestic air transportation in Taiwan reached its peak. After 1998, however, the domestic aviation market declined steadily, and operating high-cost passenger aircraft became a heavy burden for the airlines. In addition, Taiwan has promoted road construction and the high speed railway in recent years, and together with the pressure of the global recession this has been a great test of the operational management of the domestic aviation industry. This research investigates the domestic airlines that operated domestic routes from 1996 to 2007. It seeks to identify the key factors affecting the operating performance of airline companies by using survival analysis to examine combinations of variables: frequency of flights, available seats, passengers carried, passenger load factor, commercial versus recreational routes, overlap of routes with the high speed railway, the performance of merged routes, and per capita GDP. The results provide an important reference for airline executives to improve deficiencies and to set future operating directions. The study finds that available seats, commercial routes, merged routes, and overlap of routes with the high speed railway are factors influencing the domestic aviation industry, and that merging lowers the hazard rate of a route. In terms of route survival rates, commercial routes after merging rank higher than recreational routes after merging, which rank higher than commercial routes that were not merged, which in turn rank higher than recreational routes that were not merged.
APA, Harvard, Vancouver, ISO, and other styles
39

Tzu-HsuanLin and 林子暄. "Efficient estimation of the Cox proportional hazard model incorporating with the auxiliary subgroup information of incidence rate." Thesis, 2019. http://ndltd.ncl.edu.tw/handle/kp962r.

Full text
Abstract:
Master's
National Cheng Kung University
Department of Statistics
107
The incidence rate of a disease has been widely used in medical research because of its clear physical meaning and simple clinical interpretation. In this thesis, we propose an efficient estimator that incorporates auxiliary subgroup information on incidence rates into the estimation of the Cox proportional hazard model. Compared with conventional models that do not use the available auxiliary information, utilizing the external information improves the efficiency of the estimation of the regression parameters, based on the maximum empirical likelihood method. In addition, simulation studies demonstrate that the proposed method gains efficiency over the conventional approach.
APA, Harvard, Vancouver, ISO, and other styles
40

Lee, Kuan-Ting, and 李冠霆. "Statistical analysis of censored endpoints under the Cox proportional hazard model for evaluation of targeted drug products under the enrichment design." Thesis, 2015. http://ndltd.ncl.edu.tw/handle/01371322286722738530.

Full text
Abstract:
Master's
National Taiwan University
Division of Biometry, Graduate Institute of Agronomy
103
In traditional clinical trials, inclusion and exclusion criteria are based on clinical endpoints, and the genetic or genomic variability of the trial participants is not fully utilized in the criteria. Since the completion of the Human Genome Project, many molecules underlying disease can be identified, and it has become possible to develop targeted molecular therapies. However, the accuracy of the diagnostic devices used to identify such molecular targets is usually not perfect: some patients with a positive diagnostic result may not actually carry the specific molecular target. As a result, the treatment effect may be underestimated in the patient population that truly has the molecular target. To resolve this issue, we propose a method based on the mixture Cox proportional hazards model with k latent classes (Eng, K.H. and Hanlon, B.M., 2014) under the enrichment design. We develop inferential procedures for the treatment effects of the targeted drug based on censored endpoints in the patients who truly have the molecular target, which also incorporate the inaccuracy of the diagnostic device into the inference on the treatment effects. We propose using the EM algorithm in conjunction with the bootstrap technique to estimate the hazard ratio and its variance. Through a simulation study, we empirically investigate the performance of the proposed methods and compare them with the current method. Numerical examples illustrate the proposed procedures.
APA, Harvard, Vancouver, ISO, and other styles
41

Zavřelová, Adéla. "Semiparametrický model aditivního rizika." Master's thesis, 2020. http://www.nusl.cz/ntk/nusl-415878.

Full text
Abstract:
The Cox proportional hazard model is often used to estimate the effect of covariates on the hazard for censored event times. In this thesis we study semiparametric additive risk models for censored data, in which the hazard is given as the sum of an unknown baseline hazard function and a product of covariates and coefficients. Further, the general additive-multiplicative model is considered, in which the effect of a covariate can be multiplicative, additive, or both at the same time. We focus on determining the effect of a covariate in the general model, which can be used to test for a multiplicative or additive effect of a covariate on the hazard.
APA, Harvard, Vancouver, ISO, and other styles
42

Chang, Chiayu, and 張家瑜. "Factors affecting the transition state of depression for the elderly in Taiwan - An application of the Cox proportional hazard model in competing risk." Thesis, 2011. http://ndltd.ncl.edu.tw/handle/80316431886090881232.

Full text
Abstract:
Master's
Tunghai University
Department of Statistics
100
Due to the rapid decline in fertility and increased longevity in recent years, Taiwan is experiencing a dramatic demographic transition in which the overall proportion of old people in the population is increasing. Many studies show a steady increase in the incidence and prevalence of depression and related problems among the elderly population. Depression symptoms experienced in late life have serious implications for the health and functioning of older individuals and may lead to high mortality rates if the depression status of the elderly cannot be improved. Dealing with depression is therefore a major task in the elderly population. Since many aspects of mental health can be targeted for improvement in health care, the goal of this study is to examine how variables related to demographics, self-rated health, physical function and home conditions affect the transition state of depression for the elderly in Taiwan, using the Cox proportional hazard model for competing risks. The elderly data collected by the Bureau of Health Promotion of the Department of Health in Taiwan from 1989 to 2003 are used to explore the variables related to depression symptoms, based on a complete set of depression scale (CES-D) data for the elderly. People aged 60 and over were first interviewed in 1989 and re-interviewed in 1993, 1996, 1999 and 2003. The Cox proportional hazard model for competing risks is used to explore whether covariates such as demographic characteristics, home conditions and health characteristics are related to three different types of depression transition: transition, reverse transition and repeated transition. Regardless of whether the initial state of the elderly is with or without depression, the cumulative incidence functions and the Cox proportional hazard model for competing risks reveal that home conditions (economic status) and health characteristics (self-rated health and physical function) are strongly related to depression in the elderly.
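The competing-risks toolkit referred to above can be sketched as follows: an Aalen-Johansen estimate of the cumulative incidence of the depression transition with death as a competing event, plus a cause-specific Cox model in which competing events are treated as censored. The data, covariate and effect sizes below are simulated assumptions:

```python
# Cumulative incidence (Aalen-Johansen) and a cause-specific Cox model for competing risks (simulated data).
import numpy as np
import pandas as pd
from lifelines import AalenJohansenFitter, CoxPHFitter

rng = np.random.default_rng(10)
n = 600
poor_health = rng.binomial(1, 0.4, n)
t1 = rng.exponential(scale=np.exp(-0.8 * poor_health))    # time to depression transition (cause 1)
t2 = rng.exponential(scale=1.5, size=n)                    # time to death without depression (cause 2)
C = rng.uniform(0.5, 3.0, n)                                # end of follow-up
time = np.minimum.reduce([t1, t2, C])
cause = np.select([t1 <= np.minimum(t2, C), t2 <= np.minimum(t1, C)], [1, 2], default=0)

# Cumulative incidence of the depression transition, accounting for death as a competing risk.
ajf = AalenJohansenFitter().fit(time, cause, event_of_interest=1)
print(ajf.cumulative_density_.tail())

# Cause-specific Cox model: competing events are treated as censored observations.
df = pd.DataFrame({"T": time, "E": (cause == 1).astype(int), "poor_health": poor_health})
CoxPHFitter().fit(df, duration_col="T", event_col="E").print_summary()
```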
APA, Harvard, Vancouver, ISO, and other styles
43

"An application of cox hazard model and CART model in analyzing the mortality data of elderly in Hong Kong." 2002. http://library.cuhk.edu.hk/record=b5891190.

Full text
Abstract:
Pang Suet-Yee.
Thesis (M.Phil.)--Chinese University of Hong Kong, 2002.
Includes bibliographical references (leaves 85-87).
Abstracts in English and Chinese.
Chapter 1 --- Introduction --- p.1
Chapter 1.1 --- Overview --- p.1
Chapter 1.1.1 --- Survival Analysis --- p.2
Chapter 1.1.2 --- Tree-structured Statistical Method --- p.2
Chapter 1.1.3 --- Mortality Study --- p.3
Chapter 1.2 --- Motivation --- p.3
Chapter 1.3 --- Background Information --- p.4
Chapter 1.4 --- Data Content --- p.7
Chapter 1.5 --- Thesis Outline --- p.8
Chapter 2 --- Imputation and File Splitting --- p.10
Chapter 2.1 --- Imputation of Missing Values --- p.10
Chapter 2.1.1 --- Purpose of Imputation --- p.10
Chapter 2.1.2 --- Procedure of Hot Deck Imputation --- p.11
Chapter 2.1.3 --- List of Variables for Imputation --- p.12
Chapter 2.2 --- File Splitting --- p.14
Chapter 2.2.1 --- Splitting by Gender --- p.14
Chapter 2.3 --- Splitting for Validation Check --- p.16
Chapter 3 --- Cox Hazard Model --- p.17
Chapter 3.1 --- Basic Idea --- p.17
Chapter 3.1.1 --- Survival Analysis --- p.17
Chapter 3.1.2 --- Survivor Function --- p.18
Chapter 3.1.3 --- Hazard Function --- p.18
Chapter 3.2 --- The Cox Proportional Hazards Model --- p.19
Chapter 3.2.1 --- Kaplan-Meier Estimate and Log-Rank Test --- p.20
Chapter 3.2.2 --- Hazard Ratio --- p.23
Chapter 3.2.3 --- Partial Likelihood --- p.24
Chapter 3.3 --- Extension of the Cox Proportional Hazards Model for Time-dependent Variables --- p.25
Chapter 3.3.1 --- Modification of the Cox's Model --- p.25
Chapter 3.4 --- Results of Model Fitting --- p.26
Chapter 3.4.1 --- Extract the Significant Covariates from the Models --- p.31
Chapter 3.5 --- Model Interpretation --- p.32
Chapter 4 --- CART --- p.37
Chapter 4.1 --- CART Procedure --- p.38
Chapter 4.2 --- Selection of the Splits --- p.39
Chapter 4.2.1 --- Goodness of Split --- p.39
Chapter 4.2.2 --- Type of Variables --- p.40
Chapter 4.2.3 --- Estimation --- p.40
Chapter 4.3 --- Pruning the Tree --- p.41
Chapter 4.3.1 --- Misclassification Cost --- p.42
Chapter 4.3.2 --- Class Assignment Rule --- p.44
Chapter 4.3.3 --- Minimal Cost Complexity Pruning --- p.44
Chapter 4.4 --- Cross Validation --- p.47
Chapter 4.4.1 --- V-fold Cross-validation --- p.47
Chapter 4.4.2 --- Selecting the right sized tree --- p.49
Chapter 4.5 --- Missing Value --- p.49
Chapter 4.6 --- Results of CART program --- p.51
Chapter 4.7 --- Model Interpretation --- p.53
Chapter 5 --- Model Prediction --- p.58
Chapter 5.1 --- Application to Test Sample --- p.58
Chapter 5.1.1 --- Fitting test sample to Cox's Model --- p.59
Chapter 5.1.2 --- Fitting test sample to CART model --- p.61
Chapter 5.2 --- Comparison of Model Prediction --- p.62
Chapter 5.2.1 --- Misclassification Rate --- p.62
Chapter 5.2.2 --- Misclassification Rate of Cox's model --- p.63
Chapter 5.2.3 --- Misclassification Rate of CART model --- p.64
Chapter 5.2.4 --- Prediction Result --- p.64
Chapter 6 --- Conclusion --- p.67
Chapter 6.1 --- Comparison of Results --- p.67
Chapter 6.2 --- Comparison of the Two Statistical Techniques --- p.68
Chapter 6.3 --- Limitation --- p.70
Appendix A: Coding Description for the Health Factors --- p.72
Appendix B: Log-rank Test --- p.75
Appendix C: Longitudinal Plot of Time Dependent Variables --- p.76
Appendix D: Hypothesis Testing of Suspected Covariates --- p.78
Appendix E: Terminal node report for both gender --- p.81
Appendix F: Calculation of Critical Values --- p.83
Appendix G: Distribution of Missing Value in Learning sample and Test Sample --- p.84
Bibliography --- p.85
APA, Harvard, Vancouver, ISO, and other styles
44

Ejones, Francis. "Regional integration, trade duration and economic growth in the East African community: three empirical essays." Thesis, 2021. http://hdl.handle.net/1959.13/1421878.

Full text
Abstract:
Research Doctorate - Doctor of Philosophy (PhD)
The objective of this thesis is to investigate empirically the impact of regional trade agreements (RTAs) on trade, the duration of trade and economic growth in the East African Community (EAC) consisting of the republics of Kenya, Uganda, Burundi and Rwanda and the United Republic of Tanzania. Since the birth of the World Trade Organization (WTO) in 1995 and the establishment of the multilateral trading system, both developed and developing countries have initiated policies aimed at promoting North–South and South–South trade. Despite these efforts, the impact of multilateral trade agreements on trade and economic growth has been complex and is not well understood. The changing global marketplace—a consequence of external shocks and trade policy reforms across the developed and developing world—has led to increased regionalisation globally. Lack of scholarship on specific regional integration (RI) entities, such as the EAC, motivates the current research. In addition, as much as trade liberalisation in the EAC is commendable, there is strong evidence that trade reforms perform very poorly (Mishra, 2018; Rodrik, 1992). For instance, the EAC remains a marginal player in the global trade in goods (United Nations Conference on Trade & Development [UNCTAD], 2019). Further, the EAC’s level of intraregional trade is the lowest among all African RTAs, implying that the EAC does not perform as well as it does in the area of trade integration (Economic Commission for Africa [ECA], 2017). Such low levels of intraregional trade are still observed even though tremendous resources and strong political will continue to back progress towards implementation of the bloc (Vickers, 2017). Further, there are still discrepancies in the size and relative strength of the economies of the countries participating in the EAC, creating tensions over the perceived distribution of the benefits of RI (ECA, 2017). This thesis reports three empirical studies. The first examines the impact of RTAs on trade in the EAC. Utilising the traditional gravity model, this study extends the model to account for zero trade, endogeneity and heterogeneity. The model is estimated using the Poisson pseudo-maximum likelihood estimator and a comprehensive panel dataset for the EAC for the period 1990–2017. The empirical results indicate, first, that although RTAs enhance trade in the EAC, the impact varies across the Common Market for Eastern and Southern Africa (COMESA), WTO markets and regional blocs. Second, although the RTAs enhance trade at the bloc level, results vary by country. Kenya, Rwanda and Uganda experience pure trade creation in the EAC market, though Uganda’s intra-bloc trade is below expectation. Third, the results indicate that there is asymmetry across products. For example, food trade leads to pure trade creation in the EAC and COMESA markets, but pure trade diversion in the WTO. Fourth, there is variation in the performance of products within countries, though the EAC RTA leads to trade creation for all products across the EAC. These empirical findings are robust to alternative model specifications. The second empirical study examines the impact of RTAs on the duration of trade in the EAC. The study specifies a variant of survival models and estimates the models using a comprehensive dataset for the period 1988–2015. The model is estimated using Kaplan–Meier, Cox proportional hazards and discrete time estimators.
The empirical results show, first, that RTAs enhance the duration of exports in the EAC market, and inconsistently drive the duration of exports of the EAC in the COMESA market. However, RTAs do not lengthen the duration of EAC exports in the WTO trading market in aggregate. Second, the impact of RTAs on the duration of exports varies across countries: the EAC bloc leads to the persistence of exports in Tanzania and Uganda, while the COMESA bloc increases the duration of Kenya’s exports. Third, the impact of RTAs is heterogeneous across products. Fourth, the impact of RTAs on the duration of EAC exports is short lived with 50% of exports coming to an end within 2–3 years. Fifth, export hazards are quite high at the beginning of trading relationships but stabilise over time, albeit for only a few trade spells. That is, the key drivers of trade duration are gravity-like covariates, fixed trade cost variables and duration ‘type’ covariates. The third empirical study examines the impact of RTAs on economic growth in the EAC. The study specifies an endogenous growth model and estimates the model using feasible generalised least squares and panel corrected standard error estimators. The empirical results indicate, first, that RTAs and trade openness enhance economic growth in the EAC. Second, RTAs have impacts that are more significant for economic growth in the EAC than for trade openness measures. Third, the impact of trade liberalisation varies across regional markets. For instance, the EAC regional market has a more significant impact on economic growth than do plurilateral (COMESA) and multilateral (the WTO) trade agreements. Fourth, the impacts of RTAs on economic growth vary across countries in the EAC. These empirical results are robust to alternative model specifications. The thesis makes three major contributions that have important policy implications. The first draws from the empirical findings of Study 1, which provides new evidence that RTAs do have a positive effect on trade in the EAC and support third countries’ trade. However, the impacts of RTAs on trade vary across countries and product groupings. The policy implication is that there is a need to strengthen trade liberalisation within the EAC with a special focus on strengthening RTAs. Adopting holistic policies may not be appropriate because of their varying impacts at country and sectoral levels. Policies should be particularly cognisant of each country’s economic conditions. The second contribution draws from the empirical findings of Study 2. Utilising a new and comprehensive dataset, the study provides new empirical evidence that RTAs increase the duration of exports in the EAC. That is, it takes 2–3 years for half of the exports to dissipate in the EAC. The policy implication is that there is a need to explore ways in which RTAs can be used to increase the duration of trade in the EAC. Country and product characteristics in the EAC should be taken into account in plans to expand trade opportunities within particular regional markets, if trade relationships are to be extended. The third contribution draws on the empirical findings of Study 3. The study provides new empirical evidence that although RTAs do have a positive effect on economic growth, this varies across countries in the EAC. Empirical results reveal that investment and human capital are growth enhancing, while expansion of domestic credit by the private sector and ‘import openness’ are growth impeding. 
The policy implication is that there is a need to factor in trade liberalisation through RTAs in economic growth strategies other than via trade openness measures alone.
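The discrete-time estimator mentioned alongside the Kaplan-Meier and Cox approaches can be sketched by expanding each trade spell into one record per year at risk and fitting a logistic regression of "spell ends this year" on duration dummies and covariates. The data below are simulated, and the single covariate is a stand-in for the study's gravity-type variables:

```python
# Discrete-time hazard model for trade-spell duration via logistic regression on pair-year data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(11)
n = 400
rta = rng.binomial(1, 0.5, n)                       # exporter-importer pair trades inside an RTA
base_hazard = 0.35 * np.exp(-0.5 * rta)             # RTA lowers the yearly exit hazard
rows = []
for i in range(n):
    for year in range(1, 7):                        # up to six years of observation per spell
        ended = rng.random() < base_hazard[i]
        rows.append({"pair": i, "year": year, "rta": rta[i], "ended": int(ended)})
        if ended:
            break
pp = pd.DataFrame(rows)                             # person-period (pair-year) data

model = smf.logit("ended ~ C(year) + rta", data=pp).fit(disp=0)
print(model.params["rta"], "log-odds effect of RTA membership on the yearly exit hazard")
```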
APA, Harvard, Vancouver, ISO, and other styles
45

Xia, Jun. "STATISTICAL MODELS AND ANALYSIS OF GROWTH PROCESSES IN BIOLOGICAL TISSUE." 2016. http://scholarworks.gsu.edu/math_diss/39.

Full text
Abstract:
The mechanisms that control growth processes in biological tissues have attracted continuous research interest despite their complexity. With the emergence of big-data experimental approaches there is an urgent need to develop statistical and computational models that fit the experimental data and can be used to make predictions to guide future research. In this work we apply statistical methods to the growth processes of different biological tissues, focusing on the development of neuron dendrites and on tumor cells. We first examine the neuron cell growth process, which has implications for neural tissue regeneration, using a computational model with a uniform branching probability and a maximum overall length constraint. One crucial outcome is that we can relate the parameter fits from our model to real data from our experimental collaborators in order to examine the usefulness of our model under different biological conditions. Our methods can now directly compare branching probabilities across experimental conditions and provide confidence intervals for these population-level measures. In addition, we have obtained analytical results showing that the underlying probability distribution for this process increases as a geometric progression at nearby distances and decreases approximately as a geometric series in far-away regions, which can be used to estimate the spatial location of the maximum of the probability distribution. This result is important, since we would expect the maximum number of dendrites in this region; the estimate is related to the probability of success of finding a neural target at that distance during a blind search. We then examined tumor growth processes, which evolve in a similar way in the sense that an initial rapid growth phase eventually becomes limited by resource constraints. For the evolution of tumor cells, we found that an exponential growth model best describes the experimental data, based on the accuracy and robustness of the candidate models. Furthermore, we incorporated this growth-rate model into logistic regression models that predict each patient's growth rate from biomarkers; this formulation can be very useful for clinical trials. Overall, this study aimed to assess the molecular and clinicopathological determinants of breast cancer (BC) growth rate in vivo.
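The exponential tumor growth model mentioned above can be illustrated with a short fitting sketch. The code below fits V(t) = V0·exp(r·t) to made-up tumor volume measurements using scipy; it is only an illustration of the general approach, not the data, code, or exact procedure used in the dissertation.

# Sketch only: fitting an exponential growth model to hypothetical tumor volumes.
import numpy as np
from scipy.optimize import curve_fit

def exponential_growth(t, v0, r):
    # Tumor volume at time t under exponential growth with rate r.
    return v0 * np.exp(r * t)

t = np.array([0.0, 2.0, 4.0, 6.0, 8.0])   # hypothetical measurement times (months)
v = np.array([1.1, 1.6, 2.3, 3.5, 5.2])   # hypothetical tumor volumes (cm^3)

params, cov = curve_fit(exponential_growth, t, v, p0=(1.0, 0.1))
v0_hat, r_hat = params
print(f"Estimated initial volume: {v0_hat:.2f} cm^3, growth rate: {r_hat:.3f} per month")

# The estimated per-patient growth rate could then be related to biomarkers in a
# downstream regression model, in the spirit of the approach described above.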
APA, Harvard, Vancouver, ISO, and other styles
46

Šlegerová, Lenka. "Hodnocení zdravotní technologie (HTA): léčba karcinomu prsu, případová studie ČR." Master's thesis, 2019. http://www.nusl.cz/ntk/nusl-392652.

Full text
Abstract:
Health technology assessment: case study on breast carcinoma treatment in the Czech Republic. This thesis proposes an original method for assessing the total costs of medical treatment. It defines a semi-Markov model with four states that are associated with specific costs of the treatment rather than with patients' health statuses. The method is applied to individual treatment data drawn from Czech clinical practice in the treatment of metastatic HER2+ breast cancer. The aim is to assess the cost-effectiveness of adding pertuzumab to the combination of trastuzumab+docetaxel within first-line therapy and to examine whether using individual data on Czech patients and Czech economic conditions leads to results that differ from foreign studies. Furthermore, employing censored data from clinical practice complicates the estimation of patients' overall survival in comparison to clinical-trial data, which form random samples. Therefore, survival functions were estimated not only by the Kaplan-Meier estimator but also using the Cox proportional hazards model and the accelerated failure time model, both of which control for the effects of included covariates. The addition of pertuzumab does not result in significantly longer patients'...
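The combination of estimators described in the abstract (Kaplan-Meier, Cox proportional hazards and an accelerated failure time model) can be sketched as follows in Python with the lifelines library. The dataset and column names (os_months, death_observed, pertuzumab, age) are hypothetical; this is a generic illustration of the modelling approach, not the thesis's actual analysis.

# Sketch only: overall-survival estimation with hypothetical patient data.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter, WeibullAFTFitter

patients = pd.read_csv("patients.csv")  # hypothetical clinical dataset

# Non-parametric estimate of overall survival.
kmf = KaplanMeierFitter()
kmf.fit(patients["os_months"], event_observed=patients["death_observed"])

cols = ["os_months", "death_observed", "pertuzumab", "age"]

# Semi-parametric Cox model: covariates act multiplicatively on the hazard.
cph = CoxPHFitter()
cph.fit(patients[cols], duration_col="os_months", event_col="death_observed")
cph.print_summary()

# Parametric Weibull AFT model: covariates accelerate or decelerate survival time.
aft = WeibullAFTFitter()
aft.fit(patients[cols], duration_col="os_months", event_col="death_observed")
aft.print_summary()

Estimating all three side by side is useful with censored observational data, because the Kaplan-Meier curve gives an unadjusted picture while the Cox and AFT models adjust for covariates under different distributional assumptions.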
APA, Harvard, Vancouver, ISO, and other styles
