Dissertations / Theses on the topic 'Hazard ratios'

To see the other types of publications on this topic, follow the link: Hazard ratios.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Hazard ratios.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Browse dissertations / theses from a wide variety of disciplines and organise your bibliography correctly.

1

Winnett, Angela Susan. "Flexible estimators of hazard ratios for exploratory and residual analysis." Thesis, University College London (University of London), 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.312945.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Ligges, Sandra [Verfasser], Christine [Akademischer Betreuer] Müller, and Jörg [Akademischer Betreuer] Rahnenführer. "Schätzung des Hazard-Ratios in zweiarmigen Überlebenszeitstudien / Sandra Ligges. Betreuer: Christine Müller. Gutachter: Jörg Rahnenführer." Dortmund : Universitätsbibliothek Dortmund, 2013. http://d-nb.info/1099709687/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Nåtman, Jonatan. "The performance of inverse probability of treatment weighting and propensity score matching for estimating marginal hazard ratios." Thesis, Uppsala universitet, Statistiska institutionen, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-385502.

Full text
Abstract:
Propensity score methods are increasingly being used to reduce the effect of measured confounders in observational research. In medicine, censored time-to-event data is common. Using Monte Carlo simulations, this thesis evaluates the performance of nearest neighbour matching (NNM) and inverse probability of treatment weighting (IPTW) in combination with Cox proportional hazards models for estimating marginal hazard ratios. Focus is on the performance for different sample sizes and censoring rates, aspects which have not been fully investigated in this context before. The results show that, in the absence of censoring, both methods can reduce bias substantially. IPTW consistently had better performance in terms of bias and MSE compared to NNM. For the smallest examined sample size with 60 subjects, the use of IPTW led to estimates with bias below 15 %. Since the data were generated using a conditional parametrisation, the estimation of univariate models violates the proportional hazards assumption. As a result, censoring the data led to an increase in bias.
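The weighting step that IPTW performs before the Cox fit can be sketched in a few lines; this is a minimal illustration with hypothetical propensity scores, not code from the thesis.

```python
import numpy as np

def iptw_weights(treated, propensity):
    """Inverse probability of treatment weights: 1/e(x) for treated
    subjects, 1/(1 - e(x)) for controls, where e(x) is the propensity
    score (probability of treatment given baseline covariates)."""
    treated = np.asarray(treated, dtype=bool)
    e = np.asarray(propensity, dtype=float)
    return np.where(treated, 1.0 / e, 1.0 / (1.0 - e))

# Hypothetical propensity scores for 4 subjects (2 treated, 2 control):
w = iptw_weights([1, 1, 0, 0], [0.8, 0.5, 0.2, 0.5])
# Treated with e=0.8 gets weight 1/0.8 = 1.25; a control with e=0.2
# gets 1/(1-0.2) = 1.25, and so on.
```

Weighting treated subjects by 1/e(x) and controls by 1/(1-e(x)) creates a pseudo-population in which treatment is independent of the measured confounders, so a Cox model fitted with these weights targets the marginal hazard ratio.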
APA, Harvard, Vancouver, ISO, and other styles
4

Haller, Bernhard. "The analysis of competing risks data with a focus on estimation of cause-specific and subdistribution hazard ratios from a mixture model." Diss., Ludwig-Maximilians-Universität München, 2014. http://nbn-resolving.de/urn:nbn:de:bvb:19-170319.

Full text
Abstract:
Treatment efficacy in clinical trials is often assessed by the time from treatment initiation to the occurrence of a certain critical or beneficial event. In most cases the event of interest cannot be observed for all patients, as patients are only followed for a limited time or contact with patients is lost during follow-up. Therefore, methods were developed in the framework of so-called time-to-event or survival analysis to obtain valid and consistent estimates in the presence of these "censored observations", using all available information. In classical event time analysis only one endpoint exists, such as the death of a patient. As patients can die from different causes, in some clinical trials the time to one of two or more mutually exclusive types of event may be of interest. In many oncological studies, for example, time to cancer-specific death is considered the primary endpoint, with deaths from other causes acting as so-called competing risks. Different methods for data analysis in the competing risks framework were developed in recent years, which either focus on modelling the cause-specific or the subdistribution hazard rate, or split the joint distribution of event times and event types into quantities that can be estimated from observable data. In this work the analysis of event time data in the presence of competing risks is described, including the presentation and discussion of different regression approaches. A major topic of this work is the estimation of cause-specific and subdistribution hazard rates from a mixture model, and a new approach using penalized B-splines (P-splines) for the estimation of conditional hazard rates in a mixture model is proposed. In order to evaluate the behaviour of the new approach, a simulation study was conducted, using simulation techniques for competing risks data that are described in detail in this work.
The presented regression models were applied to data from a clinical cohort study investigating a risk stratification for cardiac mortality in patients who survived a myocardial infarction. Finally, the use of the presented methods for event time analysis in the presence of competing risks, and the results obtained from the simulation study and the data analysis, are discussed.
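The cause-specific quantities discussed above can be made concrete with a small nonparametric sketch: the cumulative incidence function (CIF) for one event type accumulates the cause-specific hazard weighted by all-cause survival. This example assumes complete (uncensored) data and does not reproduce the thesis's mixture-model or P-spline machinery.

```python
import numpy as np

def cumulative_incidence(times, causes, cause):
    """Nonparametric cumulative incidence function (CIF) for one event
    type under competing risks, assuming no censoring for simplicity:
    CIF_k(t) = sum over event times s <= t of S(s-) * d_k(s) / n(s),
    where S is all-cause survival and n(s) the number still at risk."""
    times = np.asarray(times, float)
    causes = np.asarray(causes)
    surv = 1.0        # all-cause survival just before the current time
    running = 0.0     # accumulated incidence for the cause of interest
    cif = {}
    for t in np.unique(times):
        at_risk = np.sum(times >= t)
        d_all = np.sum(times == t)
        d_k = np.sum((times == t) & (causes == cause))
        running += surv * d_k / at_risk
        surv *= 1.0 - d_all / at_risk
        cif[float(t)] = running
    return cif

# Four subjects failing at t = 1..4 from alternating causes 1 and 2:
cif1 = cumulative_incidence([1, 2, 3, 4], [1, 2, 1, 2], cause=1)
```

Because every subject eventually fails from one of the two causes, the two CIFs sum to 1 at the last event time, illustrating why each CIF alone need not reach 1 under competing risks.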
APA, Harvard, Vancouver, ISO, and other styles
5

Singh, Bina Aruna. "GIS based assessment of seismic risk for the Christchurch CBD and Mount Pleasant, New Zealand." Thesis, University of Canterbury. Geography, 2006. http://hdl.handle.net/10092/1302.

Full text
Abstract:
This research employs a deterministic seismic risk assessment methodology to assess the potential damage and loss at meshblock level in the Christchurch CBD and Mount Pleasant primarily due to building damage caused by earthquake ground shaking. Expected losses in terms of dollar value and casualties are calculated for two earthquake scenarios. Findings are based on: (1) data describing the earthquake ground shaking and microzonation effects; (2) an inventory of buildings by value, floor area, replacement value, occupancy and age; (3) damage ratios defining the performance of buildings as a function of earthquake intensity; (4) daytime and night-time population distribution data and (5) casualty functions defining casualty risk as a function of building damage. A GIS serves as a platform for collecting, storing and analyzing the original and the derived data. It also allows for easy display of input and output data, providing a critical functionality for communication of outcomes. The results of this study suggest that economic losses due to building damage in the Christchurch CBD and Mount Pleasant will possibly be in the order of $5.6 and $35.3 million in a magnitude 8.0 Alpine fault earthquake and a magnitude 7.0 Ashley fault earthquake respectively. Damage to non-residential buildings constitutes the vast majority of the economic loss. Casualty numbers are expected to be between 0 and 10.
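The deterministic loss computation behind step (3) of the abstract reduces to multiplying each building's replacement value by the damage ratio for the scenario's shaking intensity and summing; the inventory values and ratios below are hypothetical illustrations, not data from the study.

```python
# Hypothetical building inventory: (replacement value in $M, damage
# ratio at the scenario's shaking intensity). Damage ratios express the
# expected fraction of replacement value lost at a given intensity.
inventory = [
    (12.0, 0.10),   # non-residential block
    (4.5, 0.05),    # residential block
    (8.0, 0.20),    # non-residential block
]

def scenario_loss(inventory):
    """Expected economic loss for one earthquake scenario:
    sum of replacement value * damage ratio over all buildings."""
    return sum(value * ratio for value, ratio in inventory)

loss = scenario_loss(inventory)  # 12*0.10 + 4.5*0.05 + 8*0.20 = 3.025
```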
APA, Harvard, Vancouver, ISO, and other styles
6

PACIFICO, CLAUDIA. "Comparison of propensity score based methods for estimating marginal hazard ratios with composite unweighted and weighted endpoints: simulation study and application to hepatocellular carcinoma." Doctoral thesis, Università degli Studi di Milano-Bicocca, 2021. http://hdl.handle.net/10281/306601.

Full text
Abstract:
Introduction: My research uses data from the HERCOLES study, a retrospective study of hepatocellular carcinoma, as a worked example for comparing statistical methods that estimate the marginal effect of a treatment on standard (unweighted) survival endpoints and on weighted composite endpoints. The latter approach, unexplored to date, is motivated by the need to take into account the different clinical relevance of cause-specific events: death is considered the worst event, but local recurrence is also given greater weight than non-local recurrence. To evaluate the statistical performance of these methods, two simulation protocols were developed. Methods: To remove or reduce the effect of confounders (subject characteristics and other baseline factors that create systematic differences between treatment groups) when quantifying a marginal effect, appropriate statistical methods based on the propensity score (PS) are required; the PS is the probability that a subject is assigned to a treatment conditional on the covariates measured at baseline. In my thesis I considered several of the PS-based methods available in the literature (Austin 2013): PS as a covariate with a spline transformation; PS as a categorical covariate stratified on quantiles; PS matching; and inverse probability weighting (IPW). The marginal effect on the unweighted composite endpoint is measured by the marginal hazard ratio (HR) estimated with a Cox model. For the weighted composite endpoint, the treatment effect is measured by the non-parametric estimator of the ratio of cumulative hazards proposed by Ozga and Rauch (2019). Simulation protocol: In both simulation protocols, the data generation mechanism is similar to that used by Austin (2013).
Specifically, for the unweighted endpoint (disease-free survival), I simulated three scenarios with three values of the marginal HR: HR=1 (scenario a), HR=1.5 (scenario b) and HR=2 (scenario c). In each scenario I simulated 10,000 datasets of 1,000 subjects each, generating 12 confounders for the PS estimation. The simulation study for the weighted endpoint uses the same scenarios (a, b, c) combined with three sets of weights for the two component endpoints: (w1,w2)=(1,1); (w1,w2)=(1,0.5); (w1,w2)=(1,0.8). In each scenario I simulated 1,000 datasets of 1,000 subjects each, generating 3 confounders for the PS estimation. Furthermore, I considered only the two methods regarded in the literature as the most robust: IPW and PS matching (Austin 2016). Results: The results for the unweighted composite endpoint confirm what is already known in the literature: IPW is the most robust PS-based method, followed by PS matching. The innovative aspect of my thesis is the implementation of simulation studies evaluating the performance of PS-based methods in estimating the marginal effect of a treatment on a weighted composite survival endpoint: IPW is confirmed as the most accurate and precise method.
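Of the two most robust methods compared above, PS matching is the simpler to sketch: a greedy 1:1 nearest-neighbour matcher on the propensity score might look like the following (hypothetical scores; not the thesis's implementation, which follows Austin's recommendations).

```python
import numpy as np

def nn_match(ps_treated, ps_control, caliper=None):
    """Greedy 1:1 nearest-neighbour matching on the propensity score.
    Each treated subject is paired with the closest unused control;
    an optional caliper discards matches farther apart than the cap.
    Returns a list of (treated_index, control_index) pairs."""
    used = set()
    pairs = []
    for i, p in enumerate(ps_treated):
        best, best_d = None, np.inf
        for j, q in enumerate(ps_control):
            if j in used:
                continue
            d = abs(p - q)
            if d < best_d:
                best, best_d = j, d
        if best is not None and (caliper is None or best_d <= caliper):
            used.add(best)
            pairs.append((i, best))
    return pairs

# Hypothetical scores: treated [0.30, 0.70], controls [0.68, 0.10, 0.31].
pairs = nn_match([0.30, 0.70], [0.68, 0.10, 0.31])
```

The matched sample is then analysed with a Cox model, as in the unweighted-endpoint comparison above; unmatched controls are simply discarded, which is one reason matching can be less efficient than IPW.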
APA, Harvard, Vancouver, ISO, and other styles
7

Haller, Bernhard [Verfasser], and Kurt [Akademischer Betreuer] Ulm. "The analysis of competing risks data with a focus on estimation of cause-specific and subdistribution hazard ratios from a mixture model / Bernhard Haller. Betreuer: Kurt Ulm." München : Universitätsbibliothek der Ludwig-Maximilians-Universität, 2014. http://d-nb.info/1052778984/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

MASO, L. DAL. "LEGAME TRA L'IMMUNODEFICIENZA HIV-CORRELATA E L'INSORGENZA DI TUMORI: ASPETTI DI METODOLOGIA STATISTICA." Doctoral thesis, Università degli Studi di Milano, 2012. http://hdl.handle.net/2434/168454.

Full text
Abstract:
The aim of the present Ph.D. program was to explore methodological issues arising in studies of cancer incidence, relative risk, and survival in patients with HIV/AIDS (PHA). Aspects related to the first two objectives were explored in the first two years. This thesis describes a record-linkage study conducted between the national Italian AIDS Registry and 24 Italian cancer registries to estimate survival after a cancer diagnosis in PHA. More than 2600 cancer cases diagnosed between 1986 and 2005 were included. Survival in PHA was compared with that of patients without AIDS, using as a comparison group patients matched for site (1:1 for Kaposi sarcoma, 1:2 for non-Hodgkin lymphoma, 1:5 for other cancers), sex, age, period of diagnosis, and area of residence. Overall survival and death hazard ratios (HRs) comparing survival in PHA with cancer to that in cancer patients without AIDS were calculated. Overall, the 3-year survival rate of PHA with cancer increased from 16% in 1986-1995 to 41% in 1996-2005, after the widespread adoption of combination antiretroviral therapy (cART). In this period, the HR remained higher in PHA than in persons without AIDS (3.0, 95% confidence interval [CI]: 2.7-3.4), in particular for cancers with a good prognosis, e.g., Hodgkin lymphoma (HR=8.6), non-melanoma skin cancer (HR=5.0), and anal cancer (HR=4.0). A sensitivity analysis was performed to evaluate the impact of different study designs and comparison groups on survival and on the HR.
APA, Harvard, Vancouver, ISO, and other styles
9

Petit, Claire. "Méta-analyse en réseau et cancer ORL : utilité des critères de jugement multiples Individual Patient Data Network Meta-Analysis Using Either Restricted Mean Survival Time Difference or Hazard Ratios: Is There a Difference? A Case Study on Locoregionally Advanced Nasopharyngeal Carcinomas." Thesis, université Paris-Saclay, 2020. http://www.theses.fr/2020UPASR010.

Full text
Abstract:
Locally advanced head and neck cancers, whether "classical" squamous cell carcinomas or undifferentiated carcinomas of the nasopharynx, are treated with multimodality therapy. Loco-regional treatment is its backbone, with surgery and/or radiotherapy. Radiotherapy can be standard (66-70 Gray in 33-35 fractions) or use modified fractionation (hyperfractionation or acceleration). Chemotherapy is often combined with these treatments, with different possible timings: induction, concomitant or adjuvant. Several randomized trials have compared different combinations of treatments. A network meta-analysis allows a pooled analysis of all these randomized trials, using the available direct and indirect information to determine the relative efficacy of the treatments. The objective of this doctoral thesis was to perform frequentist network meta-analyses using individual patient data from three standard meta-analyses: a combination of two meta-analyses for squamous cell carcinomas (MACH-NC and MARCH, 115 trials, 28,978 patients and 16 treatment modalities), and one meta-analysis for nasopharyngeal carcinomas (MAC-NPC, 28 trials, 8,214 patients and 8 treatment modalities), with different endpoints (overall survival, progression-free or event-free survival, locoregional and metastatic control, cancer-related and non-cancer-related deaths) and different measures for these endpoints: the hazard ratio and the restricted mean survival time difference.
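The restricted mean survival time (RMST) used here as an alternative to the hazard ratio is just the area under the survival curve up to a horizon tau; differences in RMST between arms are then compared across trials. A minimal sketch for a step survival function (hypothetical curve, not the meta-analysis data):

```python
def rmst(times, surv, tau):
    """Restricted mean survival time: area under a step survival curve
    S(t) up to horizon tau. `times` are the drop points of the curve and
    `surv` the value of S just after each drop (Kaplan-Meier style);
    S(0) = 1 is assumed."""
    area = 0.0
    prev_t, prev_s = 0.0, 1.0
    for t, s in zip(times, surv):
        if t >= tau:
            break
        area += prev_s * (t - prev_t)
        prev_t, prev_s = t, s
    area += prev_s * (tau - prev_t)
    return area

# Survival dropping to 0.8 at t=1 and 0.5 at t=3, horizon tau=4:
# area = 1*1 + 0.8*2 + 0.5*1 = 3.1 (e.g. 3.1 years lived out of 4).
r = rmst([1, 3], [0.8, 0.5], tau=4)
```

Unlike the hazard ratio, the RMST difference remains interpretable when the proportional hazards assumption fails, which is one motivation for comparing the two measures in this thesis.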
APA, Harvard, Vancouver, ISO, and other styles
10

Cain, Samuel Franklin. "Rating Rockfall Hazard in Tennessee." Thesis, Virginia Tech, 2004. http://hdl.handle.net/10919/9972.

Full text
Abstract:
Rockfall from rock slopes adjacent to roadways is a major hazard and poses a problem for transportation agencies across the country. The state of Tennessee has implemented the Tennessee Rockfall Management System (RMS) as a means of reducing the liabilities associated with rockfall hazard. It uses digital data acquisition via PDAs coupled with distribution through an expandable web-based GIS database. The Tennessee Rockfall Hazard Rating System (RHRS), part of the Tennessee RMS, assigns a numeric rating according to relative hazard for all slopes identified as having a high potential for delivering rock blocks onto roadways maintained by the Tennessee Department of Transportation. The Tennessee RHRS uses standard rock slope failure mechanisms (planar failure, wedge failure, topple failure, differential weathering, and raveling) along with the site and roadway geometry to assess the rockfall hazard of an individual slope. This study suggests methods that will expedite fieldwork, including an informational guide on how to properly identify individual failure mechanisms in the field. It also examines the current method of scoring abundance, proposes an alternative multiplicative approach, and summarizes the results of that alternative.
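The additive-versus-multiplicative abundance question examined in the study can be illustrated schematically; the point values below are hypothetical, chosen only to show how a multiplicative abundance factor rescales a mechanism score rather than adding a fixed term to it.

```python
def additive_score(mechanism_score, abundance_score):
    """Additive-style scoring: abundance contributes a separate,
    added term (hypothetical point values, for illustration only)."""
    return mechanism_score + abundance_score

def multiplicative_score(mechanism_score, abundance_factor):
    """Alternative discussed in the thesis: abundance scales the
    failure-mechanism score instead of adding to it."""
    return mechanism_score * abundance_factor

# A severe but rare failure mechanism (score 81, abundance factor 0.25)
# is down-weighted multiplicatively to 81 * 0.25 = 20.25, whereas an
# additive abundance term leaves the severe score of 81 largely intact.
```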
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
11

Chen, Yi-Ting Civ E. Massachusetts Institute of Technology. "Rainfall-induced Landslide Hazard Rating System." Thesis, Massachusetts Institute of Technology, 2011. http://hdl.handle.net/1721.1/66858.

Full text
Abstract:
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Civil and Environmental Engineering, 2011.
This research develops a Landslide Hazard Rating System for rainfall-induced landslides in the Chenyulan River basin area in central Taiwan. The system is designed to provide a simplified and quick evaluation of the possibility of landslide occurrence, which can be used for planning and risk management. A systematic procedure to characterize rainfall distribution in a regional area is developed in the first part of the thesis. Rainfall data from 2002 to 2008, recorded at 9 rainfall stations in the study area, are included; from these, a total of 46 typhoons are selected and categorized into 3 typhoon paths: Northeastern, Northwestern, and Western. The rainfall distribution in the region as affected by typhoon path is thereby determined. The second part of the thesis is the Landslide Hazard Rating System, which integrates different hazard factors: bedrock geology, aspect, and slope gradient. This analysis is based on the specific characterization of the study area, which consists of the relative topographic relief (aspect and slope gradient) and variable bedrock geology. The method of normalized difference is used to examine the relationship of the topographic features to landslide occurrence. Although this study is conducted in a specific area, the landslide hazard rating system can be applied to other locations. Finally, a concept for a rainfall-induced landslide analytical system is proposed, combining the rainfall distribution analysis and the landslide hazard rating system. This analytical system is intended to capture the relationship of rainfall to landslide occurrence by combining characterizations of rainfall, topography, and landslide potential. Additionally, this study recommends that, in future work, theoretical models of rainfall distribution and laboratory tests of soil and rock samples be included. Together, these will constitute a basis for the prediction of landslide occurrence.
The ultimate goal of future work should be the development of a system for assessing and forecasting rainfall-induced landslide risks, which can become the foundation for a comprehensive risk management system for use in planning.
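The "method of normalized difference" mentioned above is commonly computed as the difference of two fractions over their sum; assuming that form (the thesis's exact definition may differ), a terrain class's index can be sketched as:

```python
def normalized_difference(landslide_frac, area_frac):
    """Normalized difference index comparing the fraction of mapped
    landslides falling in a terrain class with the fraction of basin
    area that class occupies: (L - A) / (L + A), in [-1, 1]. Positive
    values flag classes with more landslides than their areal share."""
    return (landslide_frac - area_frac) / (landslide_frac + area_frac)

# A slope-gradient class covering 10% of the basin but hosting 30% of
# mapped landslides scores (0.3 - 0.1) / (0.3 + 0.1) = 0.5.
nd = normalized_difference(0.3, 0.1)
```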
APA, Harvard, Vancouver, ISO, and other styles
12

Mackenzie, Todd. "Modelling a time-dependant hazard ratio with regression splines." Thesis, McGill University, 1993. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=69720.

Full text
Abstract:
The proportional hazards model proposed by Cox (1972) is by far the most popular method of regressing survival data. This model is attractive because: (i) it has a simple interpretation, the impact of a variable on survival being a constant, multiplicative effect on the hazard function; and (ii) it permits the partial likelihood inference technique, so that no assumptions about the baseline distribution of survival times are required. Many numerical tests as well as graphical approaches have been proposed for assessing the adequacy of the proportional hazards model. However, only a few authors have discussed strategies for modelling data for which the hazard ratio varies over time.
In this thesis the topic of survival analysis is overviewed, and methods for assessing the validity of the proportional hazards assumption are reviewed. Finally, a method of estimating the hazard ratio as a flexible function of time, using regression splines and the AIC model selection criterion, is proposed. We report the results of a simulation study designed to examine the small-sample properties of this technique.
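The proposed estimator expands the log hazard ratio in a spline basis, beta(t) = sum_j beta_j B_j(t), and selects the basis dimension by AIC. The sketch below uses a trivial constant basis in place of true regression splines, purely to show the structure; the coefficients would in practice be fitted by partial likelihood.

```python
import numpy as np

def hazard_ratio_curve(beta, basis, tgrid):
    """Time-dependent hazard ratio HR(t) = exp(sum_j beta_j * B_j(t)),
    where B_j are basis functions (regression splines in the thesis;
    a simple stand-in basis here) evaluated on a time grid."""
    B = np.column_stack([b(tgrid) for b in basis])
    return np.exp(B @ np.asarray(beta, dtype=float))

def aic(loglik, n_params):
    """Akaike information criterion, used to choose the number of
    spline knots: smaller is better."""
    return 2.0 * n_params - 2.0 * loglik

# Constant-coefficient special case: a single basis function B_0(t) = 1
# recovers the proportional hazards model, HR(t) = exp(beta_0).
t = np.linspace(0.0, 5.0, 6)
hr = hazard_ratio_curve([0.5], [lambda s: np.ones_like(s)], t)
```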
APA, Harvard, Vancouver, ISO, and other styles
13

Dang, Huong Dieu. "Rating History, Time and The Dynamic Estimation of Rating Migration Hazard." Thesis, The University of Sydney, 2010. http://hdl.handle.net/2123/6397.

Full text
Abstract:
This thesis employs a survival analysis framework (Allison, 1984) and the Cox hazard model (Cox, 1972) to estimate the probability that a credit rating survives in its current grade at a certain forecast horizon. The Cox model resolves some significant drawbacks of the conventional estimation approaches: it allows rigorous testing of non-Markovian behaviours and time heterogeneity in rating dynamics, it accounts for changes in risk factors over time, and it captures the time structure of the survival probability estimates. The thesis estimates three stratified Cox models: a proportional hazards model and two dynamic hazard models that account for changes in macro-economic conditions and for the passage of survival time over rating durations. Estimating these stratified models for downgrades and upgrades offers improved understanding of the impact of rating history in both static and dynamic estimation frameworks. The thesis overcomes the computational challenges involved in forming dynamic probability estimates when the standard proportionality assumption of the Cox model does not hold and when the data sample includes multiple strata. It is found that the probability of a rating migration is a function of rating history, and that rating history is more important than the current rating in determining the probability of a rating change. Switching from a static to a dynamic estimation framework does not alter the effect of rating history on the rating migration hazard. It is also found that rating history and the current rating interact with time: as the rating duration extends, the main effects of the rating history and current rating variables decay. Accounting for this decay has a substantial impact on the estimated risk of rating transitions. Downgrades are more affected by rating history and time interactions than upgrades.
To evaluate the predictive performance of rating history, the Brier score (Brier, 1950) and its covariance decomposition (Yates, 1982) were employed. Tests of forecast accuracy suggest that rating history has some predictive power for future rating changes. The findings suggest that an accurate forecasting framework is more likely to be constructed if non-Markovian behaviours and time heterogeneity are incorporated into credit risk models.
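The Brier score used for forecast evaluation here is simply the mean squared difference between probability forecasts and binary outcomes (the Yates covariance decomposition is not reproduced in this sketch):

```python
def brier_score(forecasts, outcomes):
    """Brier (1950) score: mean squared difference between probability
    forecasts and binary outcomes. Lower is better; 0 is perfect."""
    n = len(forecasts)
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / n

# Hypothetical migration-probability forecasts vs. observed rating
# changes (1 = the rating moved within the horizon, 0 = it survived):
bs = brier_score([0.9, 0.2, 0.7], [1, 0, 1])
# (0.1^2 + 0.2^2 + 0.3^2) / 3 = 0.14 / 3
```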
APA, Harvard, Vancouver, ISO, and other styles
14

Dang, Huong Dieu. "Rating History, Time and The Dynamic Estimation of Rating Migration Hazard." University of Sydney, 2010. http://hdl.handle.net/2123/6397.

Full text
Abstract:
Doctor of Philosophy (PhD)
This thesis employs survival analysis framework (Allison, 1984) and the Cox’s hazard model (Cox, 1972) to estimate the probability that a credit rating survives in its current grade at a certain forecast horizon. The Cox’s hazard model resolves some significant drawbacks of the conventional estimation approaches. It allows a rigorous testing of non-Markovian behaviours and time heterogeneity in rating dynamics. It accounts for the changes in risk factors over time, and features the time structure of probability survival estimates. The thesis estimates three stratified Cox’s hazard models, including a proportional hazard model, and two dynamic hazard models which account for the changes in macro-economic conditions, and the passage of survival time over rating durations. The estimation of these stratified Cox’s hazard models for downgrades and upgrades offers improved understanding of the impact of rating history in a static and a dynamic estimation framework. The thesis overcomes the computational challenges involved in forming dynamic probability estimates when the standard proportionality assumption of Cox’s model does not hold and when the data sample includes multiple strata. It is found that the probability of rating migrations is a function of rating history and that rating history is more important than the current rating in determining the probability of a rating change. Switching from a static estimation framework to a dynamic estimation framework does not alter the effect of rating history on the rating migration hazard. It is also found that rating history and the current rating interact with time. As the rating duration extends, the main effects of rating history and current rating variables decay. Accounting for this decay has a substantial impact on the risk of rating transitions. Downgrades are more affected by rating history and time interactions than upgrades. 
To evaluate the predictive performance of rating history, the Brier score (Brier, 1950) and its covariance decomposition (Yates, 1982) were employed. Tests of forecast accuracy suggest that rating history has some predictive power for future rating changes. The findings suggest that an accurate forecast framework is more likely to be constructed if non-Markovian behaviours and time heterogeneity are incorporated into credit risk models.
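The Brier score used for forecast evaluation in this thesis is straightforward to reproduce. A minimal sketch (the forecast probabilities and outcomes below are invented for illustration, not taken from the thesis):

```python
def brier_score(forecasts, outcomes):
    """Mean squared difference between predicted probabilities and
    observed binary outcomes (Brier, 1950). Lower is better."""
    if len(forecasts) != len(outcomes):
        raise ValueError("forecasts and outcomes must align")
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Hypothetical one-year downgrade forecasts for five issuers
# (1 = the issuer was downgraded, 0 = it was not).
probs = [0.10, 0.80, 0.30, 0.05, 0.60]
actual = [0, 1, 0, 0, 1]
print(round(brier_score(probs, actual), 4))
```

Yates' covariance decomposition then splits this score into calibration and resolution components, but the raw score alone already ranks competing forecast frameworks.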
APA, Harvard, Vancouver, ISO, and other styles
15

Deneke, Fred. "Wildfire Hazard Severity Rating Checklist for Arizona Homes and Communities." College of Agriculture and Life Sciences, University of Arizona (Tucson, AZ), 2002. http://hdl.handle.net/10150/146948.

Full text
Abstract:
6 pp.
Many Arizona residents own homes in or near the forests, woodlands, chaparral, and grasslands to take advantage of the amenities of living in a natural environment. Fire protection for homes in rural and remote areas is limited when compared to living in an urban area. This checklist is designed to assist an individual homeowner or a group of homeowners living in a remote area to assess the relative wildfire hazard severity around a home, neighborhood, subdivision, or community.
APA, Harvard, Vancouver, ISO, and other styles
16

DeGomez, Tom. "Wildfire Hazard Severity Rating Checklist for Arizona Homes and Communities." College of Agriculture and Life Sciences, University of Arizona (Tucson, AZ), 2011. http://hdl.handle.net/10150/239576.

Full text
APA, Harvard, Vancouver, ISO, and other styles
17

Zhao, Meng. "Empirical Likelihood Confidence Intervals for the Ratio and Difference of Two Hazard Functions." Digital Archive @ GSU, 2008. http://digitalarchive.gsu.edu/math_theses/57.

Full text
Abstract:
In biomedical research and lifetime data analysis, the comparison of two hazard functions usually plays an important role in practice. In this thesis, we consider the standard independent two-sample framework under right censoring. We construct efficient and useful confidence intervals for the ratio and difference of two hazard functions using smoothed empirical likelihood methods. The empirical log-likelihood ratio is derived and its asymptotic distribution is a chi-squared distribution. Furthermore, the proposed method can be applied to medical diagnosis research. Simulation studies show that the proposed EL confidence intervals have better performance in terms of coverage accuracy and average length than the traditional normal approximation method. Finally, our methods are illustrated with real clinical trial data. It is concluded that the empirical likelihood methods provide better inferential outcomes.
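The raw ingredients for comparing two hazard functions are cumulative hazard estimates from each sample. A minimal sketch of the Nelson-Aalen estimator under right censoring (the data are invented; this is the classical estimator, not the smoothed empirical likelihood method of the thesis):

```python
def nelson_aalen(times, events):
    """Nelson-Aalen estimate of the cumulative hazard from right-censored
    data: sum of d_i / n_i over observed event times. Ties between events
    and censorings are not handled in this sketch."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    H = 0.0
    curve = []
    for t, d in data:
        if d:                      # event (d = 1), not censored (d = 0)
            H += 1.0 / at_risk
            curve.append((t, H))
        at_risk -= 1
    return curve

# Tiny made-up sample: observation times with event indicators.
sample = nelson_aalen([2, 3, 5, 8, 10], [1, 1, 0, 1, 0])
print(sample)
```

The ratio of two such estimates at a common time point gives a simple plug-in estimate of the cumulative hazard ratio, against which interval methods like the one above can be benchmarked.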
APA, Harvard, Vancouver, ISO, and other styles
18

Barton, William H. "COMPARISON OF TWO SAMPLES BY A NONPARAMETRIC LIKELIHOOD-RATIO TEST." UKnowledge, 2010. http://uknowledge.uky.edu/gradschool_diss/99.

Full text
Abstract:
In this dissertation we present a novel computational method, as well as its software implementation, to compare two samples by a nonparametric likelihood-ratio test. The basis of the comparison is a mean-type hypothesis. The software is written in the R-language [4]. The two samples are assumed to be independent. Their distributions, which are assumed to be unknown, may be discrete or continuous. The samples may be uncensored, right-censored, left-censored, or doubly-censored. Two software programs are offered. The first program covers the case of a single mean-type hypothesis. The second program covers the case of multiple mean-type hypotheses. For the first program, an approximate p-value for the single hypothesis is calculated, based on the premise that -2 log-likelihood-ratio is asymptotically distributed as χ²(1). For the second program, an approximate p-value for the p hypotheses is calculated, based on the premise that -2 log-likelihood-ratio is asymptotically distributed as χ²(p). In addition we present a proof relating to the use of a hazard-type hypothesis as the basis of comparison. We show that -2 log-likelihood-ratio is asymptotically distributed as χ²(1) for this hypothesis. The R programs we have developed can be downloaded free of charge from the Comprehensive R Archive Network (CRAN) at http://cran.r-project.org, package name emplik2. The R-language itself is also available free of charge at the same site.
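The χ²(1) calibration described above is easy to reproduce without a statistical package: for one degree of freedom, the upper-tail probability of a chi-squared variate x equals erfc(√(x/2)). A sketch (the test-statistic value below is hypothetical):

```python
import math

def chi2_sf_1df(x):
    """Upper-tail probability P(X > x) for X ~ chi-squared with one
    degree of freedom, via the complementary error function."""
    return math.erfc(math.sqrt(x / 2.0))

# Hypothetical -2 log-likelihood-ratio statistic from a two-sample test;
# 3.84 sits at the conventional 5 % critical value.
stat = 3.84
print(round(chi2_sf_1df(stat), 4))
```

For p hypotheses the reference distribution becomes χ²(p), for which a regularized incomplete gamma function (or `scipy.stats.chi2.sf`) would be needed instead.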
APA, Harvard, Vancouver, ISO, and other styles
19

Dalqamouni, Ahmad Yousef. "Development of a Landslide Hazard Rating System for Selected Counties in Northeastern Ohio." Kent State University / OhioLINK, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=kent1299284289.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Hendy, Setiawan. "Landslide Hazard Assessment on the Upstream of Dam Reservoir." 京都大学 (Kyoto University), 2017. http://hdl.handle.net/2433/225565.

Full text
Abstract:
Appended degree programme: Inter-Graduate School Program for Sustainable Development and Survivable Societies (GSS)
Kyoto University (京都大学)
0048
New doctoral course
Doctor of Engineering
Degree report no. 甲第20340号 (工博第4277号)
新制||工||1662 (University Library)
Department of Civil and Earth Resources Engineering, Graduate School of Engineering, Kyoto University
(Examination committee) Professor 寶 馨, Professor 角 哲也, Associate Professor 佐山 敬洋
Eligible under Article 4, Paragraph 1 of the Degree Regulations
APA, Harvard, Vancouver, ISO, and other styles
21

Pham, Van Tien. "MECHANISMS AND HAZARD ASSESSMENT OF RAINFALL-INDUCED LANDSLIDE DAMS." Kyoto University, 2018. http://hdl.handle.net/2433/231989.

Full text
Abstract:
Appended degree programme: Inter-Graduate School Program for Sustainable Development and Survivable Societies (GSS)
Kyoto University (京都大学)
0048
New doctoral course
Doctor of Engineering
Degree report no. 甲第21056号 (工博第4420号)
新制||工||1687 (University Library)
Department of Civil and Earth Resources Engineering, Graduate School of Engineering, Kyoto University
(Examination committee) Professor 寶 馨, Professor 角 哲也, Associate Professor 佐山 敬洋
Eligible under Article 4, Paragraph 1 of the Degree Regulations
APA, Harvard, Vancouver, ISO, and other styles
22

Williams, Matthew Richard. "Likelihood-based testing and model selection for hazard functions with unknown change-points." Diss., Virginia Tech, 2011. http://hdl.handle.net/10919/26835.

Full text
Abstract:
The focus of this work is the development of testing procedures for the existence of change-points in parametric hazard models of various types. Hazard functions and the related survival functions are common units of analysis for survival and reliability modeling. We develop a methodology to test for the alternative of a two-piece hazard against a simpler one-piece hazard. The location of the change is unknown and the tests are irregular due to the presence of the change-point only under the alternative hypothesis. Our approach is to consider the profile log-likelihood ratio test statistic as a process with respect to the unknown change-point. We then derive its limiting process and find the supremum distribution of the limiting process to obtain critical values for the test statistic. We first reexamine existing work based on Taylor Series expansions for abrupt changes in exponential data. We generalize these results to include Weibull data with known shape parameter. We then develop new tests for two-piece continuous hazard functions using local asymptotic normality (LAN). Finally we generalize our earlier results for abrupt changes to include covariate information using the LAN techniques. While we focus on the cases of no censoring, simple right censoring, and censoring generated by staggered-entry; our derivations reveal that our framework should apply to much broader censoring scenarios.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
23

Cabas, Mijares Ashly Margot. "Improvements to the Assessment of Site-Specific Seismic Hazards." Diss., Virginia Tech, 2016. http://hdl.handle.net/10919/82352.

Full text
Abstract:
The understanding of the impact of site effects on ground motions is crucial for improving the assessment of seismic hazards. Site response analyses (SRA) can numerically accommodate the mechanics behind the wave propagation phenomena near the surface as well as the variability associated with the input motion and soil properties. As a result, SRA constitute a key component of the assessment of site-specific seismic hazards within the probabilistic seismic hazard analysis framework. This work focuses on limitations in SRA, namely, the definition of the elastic half-space (EHS) boundary condition, the selection of input ground motions so that they are compatible with the assumed EHS properties, and the proper consideration of near-surface attenuation effects. Input motions are commonly selected based on similarities between the shear wave velocity (Vs) at the recording station and the materials below the reference depth at the study site (among other aspects such as the intensity of the expected ground motion, distance to rupture, type of source, etc.). This traditional approach disregards the influence of the attenuation in the shallow crust and the degree to which it can alter the estimates of site response. A Vs-κ correction framework for input motions is proposed to render them compatible with the properties of the assumed EHS at the site. An ideal EHS must satisfy the conditions of linearity and homogeneity. It is usually defined at a horizon where no strong impedance contrast will be found below that depth (typically the top of bedrock). However, engineers face challenges when dealing with sites where this strong impedance contrast takes place far beyond the depth of typical Vs measurements. Case studies are presented to illustrate potential issues associated with the selection of the EHS boundary in SRA. 
Additionally, the relationship between damping values as considered in geotechnical laboratory-based models, and as implied by seismological attenuation parameters measured using ground motions recorded in the field is investigated to propose alternative damping models that can match more closely the attenuation of seismic waves in the field.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
24

VanDerHurst, Jeffrey J. "The development of an interactive computer model for managing geologic hazard databases." Thesis, This resource online, 1996. http://scholar.lib.vt.edu/theses/available/etd-08292008-063619/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Cicek, Ceren. "Implementation Of A Hazard Rating System To The Cut Slopes Along Kizilcahamam-Gerede Segment Of D750 Highway." Master's thesis, METU, 2009. http://etd.lib.metu.edu.tr/upload/12610547/index.pdf.

Full text
Abstract:
The purpose of this study is to implement a rock fall hazard rating system on the cut slopes along the Kızılcahamam-Gerede segment of the D750 (Ankara-İstanbul) Highway. The rating system developed by the Tennessee Department of Transportation was assessed for thirty-six cut slopes, selected on the basis of a reconnaissance survey along the D750 highway between Kurtboğazı Dam (50 km northwest of Ankara) and Aktaş village (15 km from the town of Gerede in Bolu province). The stages of the investigation consist of project conception, field investigations, application of the system, and assessment and presentation of data. The cut slopes were classified with this method, which assigns scores on an exponential scale to various parameters related to site and roadway geometry and geologic characteristics. The rating process was completed in two stages, preliminary and detailed rating. Based on the Tennessee RHRS, nineteen cut slopes were assessed in both stages, while the other seventeen cut slopes could be classified only at the preliminary rating stage. Different modes of slope failure (planar, wedge, toppling, rock fall with differential weathering, ravelling) along the selected segments of the highway were investigated, and slope- and highway-related parameters such as slope height, ditch effectiveness, average vehicle risk, road width, percent decision sight distance and rockfall history were identified for these nineteen cut slopes. After the scoring process was completed, all cut slopes were classified on the basis of their hazard ratings with respect to the problems they may cause for transportation. According to the rules of the Tennessee RHRS, a total of thirty-five cut slopes were rated. Among these, nineteen are rated as A slopes, which are considered potentially hazardous, while seven are rated as C slopes, which pose no danger.
Slopes placed in the B category are considered less prone than A slopes to create a danger; a total of nine B slopes were detected. The detailed rating was carried out for the nineteen A slopes and, as a result of the scoring, the final RHRS scores range from 164 to 591. Slopes with scores over 500 can be counted as the more hazardous ones, since they receive very high scores both from the site and roadway geometry part and from the geologic hazard part.
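The "exponential scale" used by RHRS-type systems can be illustrated as follows: in Pierson-style rating, each criterion falls into one of four categories worth 3, 9, 27 or 81 points, and the section score is the sum over criteria. The criterion names and ratings below are hypothetical, and the continuous interpolation formulae of the full RHRS are omitted:

```python
def rhrs_score(category_ratings):
    """Total RHRS-style score: each criterion is rated 1-4 and
    contributes 3**rating points (3, 9, 27 or 81), summed over criteria."""
    for r in category_ratings.values():
        if not 1 <= r <= 4:
            raise ValueError("each criterion rating must be 1..4")
    return sum(3 ** r for r in category_ratings.values())

# Hypothetical ratings for a single cut slope (illustrative only,
# not taken from the thesis data).
slope = {
    "slope_height": 3,
    "ditch_effectiveness": 4,
    "average_vehicle_risk": 2,
    "decision_sight_distance": 3,
    "roadway_width": 2,
    "rockfall_history": 4,
}
print(rhrs_score(slope))
```

The exponential spacing means one very hazardous criterion (81 points) outweighs several benign ones, which is why slopes scoring above a threshold dominate the remediation priority list.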
APA, Harvard, Vancouver, ISO, and other styles
26

Goetz, Ryan P. Rosenblad Brent L. "Study of the horizontal-to-vertical spectral ratio (HVSR) method for characterization of deep soils in the Mississippi Embayment." Diss., Columbia, Mo. : University of Missouri--Columbia, 2009. http://hdl.handle.net/10355/5334.

Full text
Abstract:
The entire thesis text is included in the research.pdf file; the official abstract appears in the short.pdf file; a non-technical public abstract appears in the public.pdf file. Title from PDF of title page (University of Missouri--Columbia, viewed on December 22, 2009). Thesis advisor: Dr. Brent L. Rosenblad. Includes bibliographical references.
APA, Harvard, Vancouver, ISO, and other styles
27

Brilli, Nicolò. "Valutazione del rischio da caduta massi con Rockfall Hazard Rating System lungo la strada comunale della Montagna (Sansepolcro,AR)." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2017.

Find full text
Abstract:
This thesis assesses rockfall hazard and the associated risk. With reference to rock masses outcropping along road sections, two classification methods were considered, which made it possible to identify sections of differing risk. The classification methodologies used are: RHRS, Pierson et al., 1990; and RHRS, Russell et al., 2008. Each methodology uses specific tables that return scores whose cumulative sum constitutes the Total Hazard Risk Score characteristic of each section. After discussing the results obtained and analysing the factors that determine the subdivision of the road stretch into sections of differing risk, the compatibility of, and the differences between, the two classification methods were analysed.
APA, Harvard, Vancouver, ISO, and other styles
28

Bísková, Hana. "Sekuritizace a její role v soudobé finanční a hospodářské krizi." Master's thesis, Vysoká škola ekonomická v Praze, 2014. http://www.nusl.cz/ntk/nusl-191777.

Full text
Abstract:
Securitization has been one of the most important causes of the recent financial crisis. This diploma thesis characterizes subprime mortgage securitization, with emphasis on the entities involved in the process and their motivations for using securitization. The thesis describes the most commonly securitized assets, with emphasis on the risks arising from their characteristics. It also clarifies the negatives associated with securitization, especially moral hazard, asymmetric information in the market and the decline in lending standards, and shows how securitization led to the failure of rating agencies' evaluations. The thesis connects cheap-money policy with the increased demand for mortgages that were subsequently securitized, and with the losses of investors in securitized assets.
APA, Harvard, Vancouver, ISO, and other styles
29

Misconel, Michele. "Analisi del rischio da caduta massi con Rockfall Hazard Rating System lungo la strada statale 612 (Molina di Fiemme, TN)." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2016. http://amslaurea.unibo.it/12224/.

Full text
Abstract:
The aim of this work is to analyse, with the Rockfall Hazard Rating System method (Pierson et al., 1990), a stretch of road along the SS612 in the municipalities of Molina di Fiemme (TN) and Anterivo (BZ), in three distinct zones, and to assess its risk with respect to rockfall phenomena.
APA, Harvard, Vancouver, ISO, and other styles
30

Andreini, Lorenzo. "Analisi del rischio da caduta massi con Rockfall Hazard Rating System lungo la SP29 di Sovramonte (BL) e la SP19 di Lamon (BL)." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2019.

Find full text
Abstract:
The aim of this thesis is the analysis of rockfall risk with the Rockfall Hazard Rating System along the SP29 of Sovramonte (BL) and the SP19 of Lamon (BL). The original RHRS method (Pierson et al., 1990) and the modified RHRS method (P. Santi et al., 2009) were used to evaluate the geological aspects of twelve sections subject to rock detachment, together with the road characteristics. With the scores obtained from the study of each section, a priority scale for remediation works was finally provided.
APA, Harvard, Vancouver, ISO, and other styles
31

Deniz, Aykut. "Estimation Of Earthquake Insurance Premium Rates Based On Stochastic Methods." Master's thesis, METU, 2006. http://etd.lib.metu.edu.tr/upload/12607063/index.pdf.

Full text
Abstract:
In this thesis, stochastic methods are utilized to improve a familiar comprehensive probabilistic model to obtain realistic estimates of the earthquake insurance premium rates in different seismic zones of Turkey. The model integrates information on the future earthquake threat with information on the expected earthquake damage to buildings. The quantification of the future earthquake threat is achieved by making use of seismic hazard analysis techniques. Due to the uncertainties involved, the hazard that may occur at a site during future earthquakes has to be treated in a probabilistic manner. The accessibility of past earthquake data from a number of different data sources encourages the consideration of every single earthquake report. Seismic zonation of active earthquake-generating regions has been improved as recent contributions have become available. Finally, up-to-date databases have been utilized to establish local attenuation relationships reflecting the expected earthquake wave propagation and its randomness more effectively. The damage that may occur to structures during future earthquakes involves various uncertainties and also has to be treated in a probabilistic manner. For this purpose, damage probability matrices (DPM), expressing what will happen to buildings designed according to some particular set of requirements during earthquakes of various intensities, are constructed from observational and estimated data. With the above considerations, and in order to demonstrate the application of the improved probabilistic method, earthquake insurance premium rates are computed for reinforced concrete and masonry buildings constructed in different seismic zones of Turkey.
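The way a damage probability matrix feeds a premium rate can be sketched as an expected annual damage ratio: intensity probabilities from the hazard analysis weight the conditional mean damage ratios implied by the DPM. All numbers below are invented for illustration:

```python
def expected_damage_ratio(intensity_probs, dpm, damage_ratios):
    """Expected annual damage ratio: sum over intensities of
    P(intensity) * sum over damage states of P(state | intensity) * ratio."""
    edr = 0.0
    for intensity, p_i in intensity_probs.items():
        row = dpm[intensity]
        edr += p_i * sum(p * damage_ratios[state] for state, p in row.items())
    return edr

# Hypothetical annual intensity probabilities for one seismic zone.
intensity_probs = {"VII": 0.03, "VIII": 0.01}
# Hypothetical damage probability matrix: P(damage state | intensity).
dpm = {
    "VII":  {"none": 0.70, "light": 0.25, "heavy": 0.05},
    "VIII": {"none": 0.40, "light": 0.40, "heavy": 0.20},
}
# Central damage ratio (repair cost / building value) per damage state.
damage_ratios = {"none": 0.0, "light": 0.1, "heavy": 0.6}

rate = expected_damage_ratio(intensity_probs, dpm, damage_ratios)
print(round(rate, 5))
```

A pure premium rate is then this expected damage ratio (per unit of insured value), before loadings for expenses and risk margin.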
APA, Harvard, Vancouver, ISO, and other styles
32

Persson, Daniel, and Johannes Ahlström. "Går det att prediktera konkurs i svenska aktiebolag? : En kvantitativ studie om hur finansiella nyckeltal kan användas vid konkursprediktion." Thesis, Linköpings universitet, Företagsekonomi, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-119867.

Full text
Abstract:
Från 1900-talets början har banker och låneinstitut använt nyckeltal som hjälpmedel vid bedömning och kvantifiering av kreditrisk. För dagens investerare är den ekonomiska miljön mer komplicerad än för bara 40 år sedan då teknologin och datoriseringen öppnade upp världens marknader mot varandra. Bedömning av kreditrisk idag kräver effektiv analys av kvantitativa data och modeller som med god träffsäkerhet kan förutse risker. Under 1900-talets andra hälft skedde en snabb utveckling av de verktyg som används för konkursprediktion, från enkla univariata modeller till komplexa data mining-modeller med tusentals observationer. Denna studie undersöker om det är möjligt att prediktera att svenska företag kommer att gå i konkurs och vilka variabler som innehåller relevant information för detta. Metoderna som används är diskriminantanalys, logistisk regression och överlevnadsanalys på 50 aktiva och 50 företag försatta i konkurs. Resultaten visar på en träffsäkerhet mellan 67,5 % och 75 % beroende på vald statistisk metod. Oavsett vald statistisk metod är det möjligt att klassificera företag som konkursmässiga två år innan konkursens inträffande med hjälp av finansiella nyckeltal av typerna lönsamhetsmått och solvensmått. Samhällskostnader reduceras av bättre konkursprediktion med hjälp av finansiella nyckeltal vilka bidrar till ökad förmåga för företag att tillämpa ekonomistyrning med relevanta nyckeltal i form av lager, balanserad vinst, nettoresultat och rörelseresultat.
From the early 1900s, banks and lending institutions have used financial ratios as an aid in the assessment and quantification of credit risk. For today's investors the economic environment is far more complicated than 40 years ago, when technology and computerization opened up the world's markets. Credit risk assessment today requires effective analysis of quantitative data and models that can predict risks with good accuracy. During the second half of the 20th century there was a rapid development of the tools used for bankruptcy prediction, moving from simple univariate models to complex data-mining models with thousands of observations. This study investigates whether it is possible to predict bankruptcy in Swedish limited companies and which variables contain information relevant for this purpose. The methods used in the study are discriminant analysis, logistic regression and survival analysis on 50 active and 50 failed companies. The results indicate an accuracy between 67.5 % and 75 % depending on the choice of statistical method. Regardless of the statistical method used, it is possible to classify companies as bankrupt two years before the bankruptcy occurs using financial ratios which measure profitability and solvency. Societal costs are reduced by better bankruptcy prediction using financial ratios, which contribute to increasing the ability of companies to apply financial management with relevant key ratios in the form of stock, retained earnings, net income and operating income.
APA, Harvard, Vancouver, ISO, and other styles
33

Nguyen, Duytrac Vu. "Omnibus Tests for Comparison of Competing Risks with Covariate Effects via Additive Risk Model." Digital Archive @ GSU, 2007. http://digitalarchive.gsu.edu/math_theses/25.

Full text
Abstract:
It is of interest for researchers to study competing risks in which subjects may fail from any one of K causes. Comparing any two competing risks with covariate effects is very important in medical studies. This thesis develops omnibus tests for comparing cause-specific hazard rates and cumulative incidence functions at specified covariate levels. The omnibus tests are derived under the additive risk model, an alternative to the proportional hazards model, by a weighted difference of estimates of cumulative cause-specific hazard rates. Simultaneous confidence bands for the difference of two conditional cumulative incidence functions are also constructed. A simulation procedure is used to sample from the null distribution of the test process, in which graphical and numerical techniques are used to detect significant differences in the risks. A melanoma data set is used for illustration.
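Cumulative incidence functions can be built up from the jumps of the cause-specific hazards. A minimal Aalen-Johansen-style sketch for uncensored data (invented sample; the thesis works under the additive risk model with covariates, which this sketch does not implement):

```python
def cumulative_incidence(times, causes):
    """Aalen-Johansen estimate of cumulative incidence functions from
    complete (uncensored, for simplicity) competing-risks data.
    causes[i] identifies which of the K causes produced event i."""
    data = sorted(zip(times, causes))
    at_risk = len(data)
    surv = 1.0                 # overall survival just before current time
    cif = {}
    curves = {}
    for t, k in data:
        jump = surv * (1.0 / at_risk)     # S(t-) * dN_k / Y(t)
        cif[k] = cif.get(k, 0.0) + jump
        curves.setdefault(k, []).append((t, cif[k]))
        surv *= 1.0 - 1.0 / at_risk       # overall Kaplan-Meier update
        at_risk -= 1
    return curves

# Tiny invented sample: event times and causes (1 or 2).
curves = cumulative_incidence([1, 2, 4, 5], [1, 2, 1, 2])
total = sum(c[-1][1] for c in curves.values())
print(round(total, 6))
```

With no censoring the cause-specific incidences sum to one, a useful sanity check; tests like those in the thesis then compare such curves between groups at fixed covariate levels.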
APA, Harvard, Vancouver, ISO, and other styles
34

Wilder, Jessica A. "Operationalizing the Pressure and Release Theoretical Framework Using Risk Ratio Analysis to Measure Vulnerability and Predict Risk from Natural Hazards in the Tampa, FL Metropolitan Area." Scholar Commons, 2018. http://scholarcommons.usf.edu/etd/7245.

Full text
Abstract:
Significant damage and loss is experienced every year due to natural hazards such as hurricanes, tornadoes, droughts, floods, wildfires, volcanoes, and earthquakes. NOAA’s National Center for Environmental Information (NCEI) reports that in 2016 the United States experienced more than a dozen climate disaster events with damages and losses in excess of a billion dollars (NOAA National Centers for Environmental Information, 2017). Identifying the vulnerabilities and risk associated with disaster threats is now a major focus of natural hazards research. Natural hazards research has yielded numerous theoretical frameworks over the last 25 years that have explained important elements of risk and vulnerability in disasters (Birkmann, 2016b). However, much less progress has been made in operationalizing these frameworks. While the theory is well established, one of the more pressing challenges before us is the lack of development of user-friendly and flexible risk assessment techniques for emergency managers (Mustafa et al., 2011). The trend in operationalizing natural hazards theoretical frameworks has been the development of general, all-purpose, static models to measure vulnerability. However, an important missing element in the current hazards literature is an operationalized risk model that is (1) simple, quick and easy to use, (2) flexible for changing conditions, and (3) site-specific for various geographic locations. Many of the current models for determining risk and vulnerability are very complex and time-consuming to calculate, which makes them of little use for emergency and risk managers. In addition, little analysis has been conducted to see whether a flexible risk identification measurement system could be developed.
As vulnerability and risk become fluid due to changing conditions (environmental: hazard and location) and circumstances (social, economic, and political), our measurement tools need to be able to capture these differences in order to be effective. This dissertation examines whether the Pressure and Release (PAR) natural hazards theoretical framework can be operationalized using financial risk-ratio methods. Specifically, it analyzes risk ratios using key vulnerability indicators to identify escalating vulnerability and ultimately predict risk. A structured modeling approach was used to identify key vulnerability indicators and develop risk ratios. These are applied to a case study to demonstrate whether this new approach can identify emerging risk trends. My research suggests that instead of operationalizing natural hazards theoretical frameworks using the current static, aggregate-index method, a flexible risk-ratio method could provide a new, viable option.
APA, Harvard, Vancouver, ISO, and other styles
35

Worku, Eshetu Bekele. "Efficiency and Social Capital in Micro, Small and Medium Enterprises: the Case of Ethiopia." Thesis, University of the Western Cape, 2008. http://etd.uwc.ac.za/index.php?module=etd&action=viewtitle&id=gen8Srv25Nme4_2168_1263780307.

Full text
Abstract:

This study extends the existing literature on how social networks enhance the performance and sustainability of small enterprises. More specifically, the study isolates and investigates the mechanisms through which social capital helps with the growth and survival of MSMEs. The evidence presented in this study strongly suggests that an indigenous social network widely practiced in Ethiopia, the “iqqub”, contributes significantly to the start-up, survival and development of urban MSMEs.

APA, Harvard, Vancouver, ISO, and other styles
36

Taghvatalab, Golnaz. "The Economics of Marriage and Divorce in Iran." Diss., Virginia Tech, 2012. http://hdl.handle.net/10919/77981.

Full text
Abstract:
This dissertation consists of four chapters on the economics of marriage and divorce in Iran. The first chapter outlines major forces driving the recent transitions in Iran's marriage market. Age structure of the population, the rise of women's education, marriage and divorce laws, and fertility decline are the four main forces influencing marriage transitions, that is, the age of marriage, couple's age and education gaps, quality of marriage (stability, education status of children), and women's power within marriage. Chapter two looks at the change in age structure that influences the sex ratio. I consider the influence of the sex ratio on couples' age and education gaps using data from multiple national surveys from 1984-2007. The findings of this chapter show that a lower sex ratio, i.e. a greater supply of marriage-age women, increases the bargaining power of men at the time of marriage and thereby increases their ability to marry younger and more educated women. In chapter three, I evaluate the effects of demographic change, the sex ratio, and policy change, particularly the provision of family planning programs through health clinics on delayed marriage in rural Iran. I use data from Iran's 2000 Demographic and Health Survey to estimate a hazard model of timing of marriage. The results show that a lower sex ratio decreases the chances of a woman finding a man five years older, and easier access to family planning decreases her probability of marriage. In chapter four, I provide a legal history of Iran's marriage and divorce laws and then discuss how changes in the legal structure of marriage and divorce alter the terms of marital bargaining and force women to circumvent inequitable Iranian laws to improve their position. Then, I present a model of how Mahrieh could improve a woman's position within the household in light of the unequal divorce rights favoring men. 
As women cannot exit their marriage, they request a conditional and legally enforceable bond known as Mahrieh from their husbands to secure themselves against the risks of divorce or maltreatment within marriage.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
37

Clarke, Samantha. "Submarine landslides of the upper east Australian continental margin." Thesis, The University of Sydney, 2014. http://hdl.handle.net/2123/12456.

Full text
Abstract:
Stable continental margins experience submarine landslides relatively frequently, and some of the largest slides on record have been shed from these relatively passive terrains. Despite this, and the obvious accompanying tsunami hazard, slides from passive margins such as Australia's are poorly understood when compared to other settings, such as the flanks of volcanic islands, active subduction-zone margins and submarine fans. This work presents an investigation into the submarine landslides occurring along east Australia’s (EA) continental margin, with a focus on investigating the causes, timing, and mechanisms responsible for these features. It has focused on analysing gravity core samples and interpreting high-resolution multibeam and subbottom profiles. The age, morphology, composition, and origin of particular submarine landslides on the EA continental margin offshore New South Wales/Queensland have been described, and the mechanical characteristics of sediments from the EA continental slope have been presented. The hazard posed by these submarine landslides has also been evaluated by investigating their potential to generate tsunamis along this margin. The widespread occurrence of slides across the EA margin indicates that submarine sliding should be considered a common characteristic of this margin. Engineering properties imply that the sediment forming the margin is reasonably strong and inherently stable, and classical limit-equilibrium modeling indicates that submarine landslides should not be a common occurrence. This indicates that a pre-conditioning trigger, or some other mechanism, is required to destabilise the slope and enable failure. The most likely suspected processes include: 1) dramatic reduction of the shear strength of the sediments to extremely low values; 2) long-term modification of the slope geometry; and/or 3) seismic events large enough to trigger sediment liquefaction or a sudden increase of pore-fluid pressure.
APA, Harvard, Vancouver, ISO, and other styles
38

Marques, Inês Filipa Costa. "Mortality of elite athletes : an application to football players." Master's thesis, Instituto Superior de Economia e Gestão, 2018. http://hdl.handle.net/10400.5/18092.

Full text
Abstract:
Master's in Actuarial Science
The health benefits of moderate regular physical activity have been clearly demonstrated and are widely consensual. However, there is a growing debate over the potential adverse effects of strenuous physical activity, particularly at a professional level. Recent findings of cardiovascular anomalies in elite athletes, coupled with the high frequency of injuries, have brought some sports under increased scrutiny. In this context, the main goal of this work is to investigate whether elite athletes live longer than the general population. After an initial review of the literature on elite athletes' mortality, a comprehensive survival analysis is applied to two populations of professional football players. Lifespan data and specific occupational variables of Portuguese and Spanish football players who have represented their national teams in their career were collected from recognized publicly available sources. Each cohort is then compared to the respective standard population, using data available in the Human Mortality Database, through the estimation of standardised mortality ratios and survival curves. The years-lost method is applied to provide a time-dimension measure of these elite athletes' longevity. Furthermore, the association of position on the field and number of games with overall mortality is assessed using Cox proportional hazards models. Finally, a comparison between the mortality of Portuguese and Spanish football players is carried out.
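The standardised mortality ratio this abstract relies on is simply observed deaths divided by the deaths expected under the reference population's age-specific rates. A minimal sketch, with entirely invented figures (the real study draws its rates from the Human Mortality Database):

```python
# Illustrative sketch (not the thesis' data): a standardised mortality ratio
# compares observed cohort deaths with the deaths expected if the cohort
# experienced the reference population's age-specific rates.

def smr(observed_deaths, person_years_by_age, reference_rates_by_age):
    """SMR = observed deaths / expected deaths under reference rates."""
    expected = sum(person_years_by_age[age] * reference_rates_by_age[age]
                   for age in person_years_by_age)
    return observed_deaths / expected

# Hypothetical football-player cohort: person-years at risk per age band
person_years = {"40-49": 1200.0, "50-59": 900.0, "60-69": 400.0}
# Hypothetical general-population death rates per person-year
ref_rates = {"40-49": 0.002, "50-59": 0.006, "60-69": 0.015}

print(round(smr(15, person_years, ref_rates), 3))
```

An SMR below 1 would indicate the cohort dies at a lower rate than the general population, which is the comparison at the heart of the thesis.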
APA, Harvard, Vancouver, ISO, and other styles
39

Akyuz, Emre. "Development Of Site Specific Vertical Design Spectrum For Turkey." Master's thesis, METU, 2013. http://etd.lib.metu.edu.tr/upload/12615403/index.pdf.

Full text
Abstract:
Vertical design spectra may be developed in a probabilistic seismic hazard assessment (PSHA) by computing the hazard using vertical ground motion prediction equations (GMPEs), or by using vertical-to-horizontal spectral acceleration (V/H) ratio GMPEs to scale the horizontal spectrum developed from the results of a horizontal-component PSHA. The objective of this study is to provide GMPEs compatible with regional ground motion characteristics for both alternatives. GMPEs for the V/H ratio were developed recently by Gülerce and Abrahamson (2011) using the NGA-W1 database. A strong motion dataset consistent with the V/H ratio model parameters is developed by including strong motion data from earthquakes that occurred in Turkey with at least three recordings per earthquake. The compatibility of the GA2011 V/H ratio model with the magnitude, distance, and site amplification scaling of the Turkish ground motion dataset is evaluated using inter-event and intra-event residual plots, and the model coefficients are adjusted to reflect the regional characteristics. Analysis of the model performance in recent moderate-to-large magnitude earthquakes in Turkey shows that the Turkey-Adjusted GA2011 model is a suitable candidate V/H ratio model for PSHA studies conducted in Turkey. Using the same dataset, a preliminary vertical ground motion prediction equation for Turkey, consistent with the preliminary vertical model based on the NGA-W1 dataset, is developed. The proposed preliminary model is applicable to magnitudes 5-8.5, distances 0-200 km, and spectral periods of 0-10 seconds, and offers an up-to-date alternative to the regional vertical GMPEs proposed by Kalkan and Gülkan (2004).
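The second alternative the abstract describes, scaling a horizontal spectrum by a V/H ratio model, is just a period-by-period multiplication. A minimal sketch with invented placeholder values (these are not the GA2011 coefficients):

```python
# Illustrative sketch (not the study's code): obtaining a vertical design
# spectrum by scaling a horizontal spectrum with a V/H ratio model.
# Sa_V(T) = Sa_H(T) * (V/H)(T). All values below are invented placeholders.

def vertical_spectrum(horizontal_sa, vh_ratio):
    """Multiply horizontal spectral accelerations by the V/H ratio per period."""
    return {t: sa * vh_ratio[t] for t, sa in horizontal_sa.items()}

# Hypothetical horizontal uniform hazard spectrum (in g) at three periods (s)
sa_h = {0.01: 0.40, 0.2: 0.90, 1.0: 0.30}
# Hypothetical period-dependent V/H ratios
vh = {0.01: 0.75, 0.2: 0.65, 1.0: 0.50}

print(vertical_spectrum(sa_h, vh))
```

In practice the V/H ratio itself depends on magnitude, distance, and site conditions, which is why the study adjusts a full V/H GMPE rather than a fixed set of factors.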
APA, Harvard, Vancouver, ISO, and other styles
40

Mariniere, Judith. "Amélioration des modèles prédictifs de séismes pour le PSHA grâce aux données géodésiques : application en Equateur." Thesis, Université Grenoble Alpes, 2020. http://www.theses.fr/2020GRALU019.

Full text
Abstract:
Probabilistic Seismic Hazard Assessment (PSHA) relies on long-term earthquake forecasts and ground-motion models. Up to now, geodetic data has been rather under-used in PSHA, although it provides unique and unprecedented information on the deformation rates of tectonic structures from local to regional scales. The aim of this PhD thesis is to improve earthquake recurrence models by quantitatively including the information derived from geodetic measurements, with an application to Ecuador, a country exposed both to shallow crustal earthquakes and to megathrust subduction events. The second chapter presents the building of a probabilistic seismic hazard model for Ecuador, using historical and contemporary seismicity and recent knowledge about active tectonics, geodynamics, and geodesy. I contributed to this collective effort in two ways: 1) the building of earthquake catalogs from global seismic datasets; 2) the establishment of average slip rates on a set of simplified crustal faults, from GPS velocities. The hazard calculations carried out at the country scale indicate that uncertainties are largest for sites on the northern coast and along the faults in the Cordillera. The third chapter of this thesis focuses on the determination of the seismic potential of the Quito fault system. Quito city lies on the hanging wall of this ∼60-km-long reverse active fault, which represents a significant risk due to the high population density. We constrain the present-day strain accumulation associated with the fault system with GPS data and Persistent Scatterer Interferometric Synthetic Aperture Radar (PS-InSAR) analysis. 3-D spatially variable locking models show that a large part of the fault is presently experiencing shallow creep, hence reducing the energy available for future earthquakes, which has a significant impact on hazard calculations. 
In the final chapter of this thesis, we evaluate the ability of geodetic data to constrain earthquake recurrence models for the subduction zone in northern Ecuador. We quantify the annual rate of moment-deficit accumulation at the interface using interseismic coupling models, and identify the uncertainties related to the conversion in terms of total seismic moment release. Based on a newly developed earthquake catalog, we propose to establish recurrence models that match both the catalog-based seismicity rates and the geodetic moment budget. We set up a logic tree to explore the uncertainties on the seismicity rates and on the geodetic moment budget to be released in earthquakes. The exploration of the logic tree leads to a distribution of possible maximum magnitudes Mmax bounding the earthquake recurrence model; we extract only those models that lead to Mmax values compatible with the extent of the interface segment according to earthquake scaling laws. This new method makes it possible 1) to identify which magnitude-frequency form is adapted to the Ecuadorian subduction zone; 2) to generate a distribution of moment-balanced recurrence models representative of the uncertainties and to propagate this uncertainty up to the uniform hazard spectra; and 3) to evaluate a range for the aseismic component of slip on the interface. Considering the recent availability of a massive quantity of geodetic data, this new approach could be used in other regions of the world to develop recurrence models consistent both with past seismicity and with measured tectonic deformation.
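The moment-balancing idea in this abstract can be sketched numerically: a truncated Gutenberg-Richter model implies an annual seismic moment release, and its productivity (the a-value) can be tuned so that this release matches a geodetically measured moment-deficit rate. This is a generic illustration under invented parameter values, not the thesis' actual logic-tree implementation.

```python
# Illustrative sketch (not the thesis' code): balancing a truncated
# Gutenberg-Richter recurrence model against a geodetic moment-deficit rate.
# Seismic moment (N*m) from moment magnitude: M0 = 10**(1.5*Mw + 9.05).
import math

def moment_rate(a, b, m_min, m_max, dm=0.1):
    """Annual seismic moment release of a truncated G-R model, by discrete bins."""
    total = 0.0
    m = m_min
    while m < m_max - 1e-9:
        # incremental annual rate of events in [m, m + dm)
        n = 10 ** (a - b * m) - 10 ** (a - b * (m + dm))
        total += n * 10 ** (1.5 * (m + dm / 2) + 9.05)
        m += dm
    return total

def balance_a(b, m_min, m_max, geodetic_rate):
    """a-value whose moment release matches the geodetic budget.
    The moment rate scales as 10**a, so the fix is a simple log10 correction."""
    base = moment_rate(0.0, b, m_min, m_max)
    return math.log10(geodetic_rate / base)

# Hypothetical budget: 2e19 N*m/yr of moment deficit accumulating on the interface
a = balance_a(b=1.0, m_min=5.0, m_max=8.5, geodetic_rate=2e19)
print(round(moment_rate(a, 1.0, 5.0, 8.5) / 2e19, 6))
```

Raising or lowering Mmax changes how much of the budget is spent in rare large events versus frequent small ones, which is why the thesis' logic tree over Mmax produces a distribution of admissible recurrence models.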
APA, Harvard, Vancouver, ISO, and other styles
41

Stavrou, Eftyhia P. "Vision, functional and cognitive determinants of motor vehicle incidents in older drivers." Thesis, Queensland University of Technology, 2006. https://eprints.qut.edu.au/28503/1/Efty_Stavrou_Thesis.pdf.

Full text
Abstract:
Background: The proportion of older individuals in the driving population is predicted to increase in the next 50 years. This has important implications for driving safety, as abilities which are important for safe driving, such as vision (which accounts for the majority of the sensory input required for driving), processing ability and cognition, have been shown to decline with age. The current methods employed for screening older drivers upon re-licensure are also vision based. This study, which investigated social, behavioural and professional aspects involved with older drivers, aimed to determine: (i) if the current visual standards in place for testing upon re-licensure are effective in reducing the older driver fatality rate in Australia; (ii) if the recommended visual standards are actually implemented as part of the testing procedures by Australian optometrists; and (iii) if there are other non-standardised tests which may be better at predicting the on-road incident risk (including near misses and minor incidents) in older drivers than those tests recommended in the standards. Methods: For the first phase of the study, state-based age- and gender-stratified numbers of older driver fatalities for 2000-2003 were obtained from the Australian Transportation Safety Bureau database. Poisson regression analyses of fatality rates were considered by renewal frequency and jurisdiction (as separate models), adjusting for possible confounding variables of age, gender and year. For the second phase, all practising optometrists in Australia were surveyed on the vision tests they conduct in consultations relating to driving and their knowledge of vision requirements for older drivers. Finally, for the third phase of the study, to investigate determinants of on-road incident risk, a stratified random sample of 600 Brisbane residents aged 60 years and over were selected and invited to participate using an introductory letter explaining the project requirements. 
In order to capture the number and type of road incidents which occurred for each participant over 12 months (including near misses and minor incidents), an important component of the prospective research study was the development and validation of a driving diary. The diary was a tool in which incidents could be logged at the time (or very close to the time) they occurred and thus, in comparison with relying on participant memory over time, recall bias of incident occurrence was minimised. The association of all visual tests, cognition and scores obtained for non-standard functional tests with retrospective and prospective incident occurrence was investigated. Results: In the first phase, drivers aged 60-69 years had a 33% lower fatality risk (Rate Ratio [RR] = 0.75, 95% CI 0.32-1.77) in states with vision testing upon re-licensure compared with states with no vision testing upon re-licensure; however, because the CIs are wide, crossing 1.00, this result should be regarded with caution. However, overall fatality rates and fatality rates for those aged 70 years and older (RR = 1.17, CI 0.64-2.13) did not differ between states with and without licence renewal procedures, indicating no apparent benefit of vision testing legislation. For the second phase of the study, nearly all optometrists measured visual acuity (VA) as part of a vision assessment for re-licensing; however, 20% of optometrists did not perform any visual field (VF) testing and only 20% routinely performed automated VF on older drivers, despite the standards for licensing advocating automated VF as part of the vision standard. This demonstrates the need for more effective communication between the policy makers and those responsible for carrying out the standards. It may also indicate that the overall higher driver fatality rate in jurisdictions with vision testing requirements arises because the tests recommended by the standards are only partially being conducted by optometrists. 
Hence a standardised protocol for the screening of older drivers for re-licensure across the nation must be established. The opinions of Australian optometrists with regard to the responsibility of reporting older drivers who fail to meet the licensing standards highlighted the conflict between maintaining patient confidentiality and upholding public safety. Mandatory reporting requirements for those drivers who fail to reach the standards necessary for driving would minimise potential conflict between the patient and their practitioner, and help maintain patient trust and goodwill. The final phase of the PhD program investigated the efficacy of vision, functional and cognitive tests to discriminate between at-risk and safe older drivers. Nearly 80% of the participants experienced an incident of some form over the prospective 12 months, with the total incident rate being 4.65/10,000 km. Sixty-three percent reported having a near miss and 28% had a minor incident. The results from the prospective diary study indicate that the current vision screening tests (VA and VF) used for re-licensure do not accurately predict older drivers who are at increased odds of having an on-road incident. However, the variation in visual measurements of the cohort was narrow, which also affected the results seen with the visual function questionnaires. Hence a larger cohort with greater variability should be considered for a future study. A slightly lower cognitive level (as measured with the Mini-Mental State Examination [MMSE]) did show an association with incident involvement, as did slower reaction time (RT); however, the Useful Field of View (UFOV) provided the most compelling results of the study. Cut-off values for UFOV processing (>23.3 ms), divided attention (>113 ms), selective attention (>258 ms) and overall score (moderate/high/very high risk) were effective in determining older drivers at increased odds of having any on-road incident and of the occurrence of minor incidents. 
Discussion: The results have shown that, for the 60-69-year age group, there is a potential benefit in testing vision upon licence renewal. However, overall fatality rates and fatality rates for those aged 70 years and older indicated no benefit of vision testing legislation, suggesting a need for the inclusion of screening tests which better predict on-road incidents. Although VA is routinely performed by Australian optometrists on older drivers renewing their licence, VF is not. Therefore there is a need for a protocol to be developed and administered which would result in standardised methods being conducted throughout the nation for the screening of older drivers upon re-licensure. Communication between the community, policy makers and those conducting the protocol should be maximised. By implementing a standardised screening protocol which incorporates a level of mandatory reporting by the practitioner, the ethical dilemma of breaching patient confidentiality would also be resolved. The tests included in such a screening protocol, however, cannot solely be ones which have been implemented in the past. In this investigation, RT, MMSE and UFOV were shown to be better determinants of on-road incidents in older drivers than VA and VF; however, as previously mentioned, there was a lack of variability in visual status within the cohort. Nevertheless, the recommendation from this investigation is that, subject to appropriate sensitivity and specificity being demonstrated in the future using a cohort with wider variation in vision, functional performance and cognition, these tests of cognition and information processing should be added to the current protocol for the screening of older drivers conducted at licensing centres across the nation.
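The rate ratios this abstract reports (e.g. RR = 0.75, 95% CI 0.32-1.77) come from Poisson models of fatality counts against exposure. A crude version of that comparison, with a log-scale Wald interval and entirely invented counts (not the thesis' data), can be sketched as:

```python
# Illustrative sketch (not the thesis' dataset): a crude Poisson rate ratio
# with a Wald 95% confidence interval, comparing fatality counts between
# jurisdictions with and without vision testing. All counts are invented.
import math

def rate_ratio(d1, py1, d0, py0, z=1.96):
    """Rate ratio (d1/py1)/(d0/py0) with a log-scale Wald interval."""
    rr = (d1 / py1) / (d0 / py0)
    se = math.sqrt(1 / d1 + 1 / d0)      # SE of log(RR) for Poisson counts
    lo = rr * math.exp(-z * se)
    hi = rr * math.exp(z * se)
    return rr, lo, hi

# Hypothetical: 12 deaths over 80,000 driver-years vs 20 over 100,000
rr, lo, hi = rate_ratio(12, 80_000, 20, 100_000)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

As in the study, an interval that crosses 1.00 means the apparent reduction is compatible with no true difference; the full analysis additionally adjusts for age, gender and year within a Poisson regression.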
APA, Harvard, Vancouver, ISO, and other styles
42

Stavrou, Eftyhia P. "Vision, functional and cognitive determinants of motor vehicle incidents in older drivers." Queensland University of Technology, 2006. http://eprints.qut.edu.au/28503/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Bohman, Peter, and Erik Karlsson. "Leasing Risks and Commercial Real Estate : A Study on the Relationship Between Risk Premium and Leasing Risks." Thesis, KTH, Fastigheter och byggande, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-254721.

Full text
Abstract:
Purpose: The purpose of this thesis is to evaluate current market practice in real estate valuation and investment decisions with respect to different leasing risks and the risk premium. With regard to some of the ongoing trends within real estate, it is believed that investor preferences affect market practice and that the underlying theories of valuation do not fully comply with current market practice. Method: The method was implemented in stages. First, existing research and literature was evaluated and triangulated to find relevant knowledge as a basis for the theoretical framework. An analysis was then performed to answer whether there is a research gap or not. By analyzing the literature, a research gap as well as potential problems related to leasing risks was found. The second phase consisted of a qualitative method in which experts in the field were interviewed regarding leasing risk, to evaluate whether the problem exists in practice or only in the literature. Experts on the topic also helped to develop the questions subsequently delivered to the interviewees. This strategy was carried out under the guidance of our tutor Han-Suck Song at KTH and Daniel Holmkvist at CBRE. Interviews: Nine interviews were conducted in which experts in the business (consultants and property firms) participated to deliver different perspectives on the research question. All interviews were made in Stockholm, held in Swedish, and afterwards translated to English. Results: The results consist of the answers from the interview part, where the relevant findings were summarized and pin-pointed with regard to the respective field of business and property segment. The general themes that arose throughout the interviews are presented, as well as the extremes in terms of opinions and answers. It was found that there is a clear relationship between the leasing risk and the risk premium for commercial real estate. 
The relationship depends on several factors such as geographical location, the different submarkets and, finally, the segment. A municipal or corporate bond cannot be fully comparable to a leasing contract, but for a contract of 20 years or longer where the tenant is publicly financed, the contract can become an interesting investment alternative given the current interest rate cycle. Finally, the leasing contract needs to be easier to liquidate in order to be comparable to the bond situation. Scientific relevance: Transaction activity on the Swedish real estate market has been rather defensive for multiple segments over the last twelve months, with the exception of community properties. A common understanding is that such objects feature "stable tenants" and are viewed as a safe investment by the market. This investment practice raises the question of what a stable tenant is, and how consultants and property owners reason during investment and appraisal decisions. This research paper illustrates that a common perception on the subject is that the risk exposure completely depends on the specific segment, location, contract length, etc. The academic research explains the theory behind how to derive the discount rate for an investment decision; however, during the literature review this study has shown that several important concepts are left out of the theory, which thus does not fully cover phenomena that investors and appraisers are exposed to in market practice. The most critical part is how to relate leasing risk to the risk premium on the Swedish market. Since this study focuses specifically on the Swedish market, it is crucial to relate to suitable literature for further discussion. On foreign markets, more rigid literature on the subject was found.
APA, Harvard, Vancouver, ISO, and other styles
44

Křížová, Eliška. "Příčiny a souvislosti finanční krize v USA." Master's thesis, Vysoká škola ekonomická v Praze, 2011. http://www.nusl.cz/ntk/nusl-113685.

Full text
Abstract:
The diploma thesis investigates the causes and progression of the financial crisis that began in 2007 in the United States and led to the economic recession. The theoretical part of the thesis describes business cycles and their explanations according to the Austrian theory of the business cycle and other theories. The analytical part explores the period before the crisis and the significant events relevant to it. The main subjects of the thesis are the institutions and regulatory measures of major importance for the U.S. real estate market, including the monetary and interventionist policy of the Fed, the Community Reinvestment Act, government-sponsored enterprises, and the three major rating agencies. The goal of the work is to provide a comprehensive view of the financial crisis and to analyse the main factors that influenced its emergence: credit expansion, the mortgage market, the Fed's monetary policy, bank behaviour, etc. The thesis seeks to demonstrate the inadequacy of state interventions and their impact on the economy and the market system.
APA, Harvard, Vancouver, ISO, and other styles
45

Crespo, António Pedro Marcos Avérous Mira. "Controlo de pragas no Jardim Zoológico de Lisboa : particular relevância para o controlo de roedores e sua infeção parasitária." Master's thesis, Universidade Técnica de Lisboa. Faculdade de Medicina Veterinária, 2012. http://hdl.handle.net/10400.5/4937.

Full text
Abstract:
Master's dissertation in Food Safety
ABSTRACT - Pest control in the Zoological Garden of Lisbon, with special relevance for rodent control and their parasitic infection - Zoological parks constitute special ecosystems shared by domestic and wild species and man, which facilitates the entrance of pathogens; the establishment of preventive programmes, with compliance with all hygiene and safety rules and with pest control, is therefore essential. The present study, carried out at the Lisbon Zoo during 2011, aimed to characterise hygiene and safety practices, with special emphasis on pest control, through in situ observations and interviews with keepers and pest control personnel. A formal questionnaire was given to keepers in order to establish infestation levels and current prevention and control practices. Considering the importance of rodents as hosts of a large number of parasitic species, another objective of this study was to determine the parasitic infection in 100 captured rodents (50 Mus musculus, 50 Rattus norvegicus). The observations and the analysis of the keepers' survey revealed that, in general, the hygiene and safety practised at the Lisbon Zoo follow the recommendations of several authors; however, some aspects of pest control would benefit from improvement. Suggestions for improvement are presented, and it is also suggested that training activities for keepers be extended to this subject, covering the characteristics of the existing pests and the diseases they can transmit, in order to improve pest detection and control. In the parasitological study it was found that 82 rodents (82.0%) were shedding parasitic forms, with a greater proportion of positive animals in the species Rattus norvegicus (84.0%). Nine species of parasites were identified, regardless of the rodent species: Eimeria spp., Cryptosporidium parvum, Cysticercus fasciolaris (the larval form of Taenia taeniaeformis), Hymenolepis diminuta, Nippostrongylus brasiliensis, Heterakis spumosa, Syphacia obvelata, Calodium hepaticum and Trichuris muris. Of these, Cryptosporidium parvum, Calodium hepaticum and Syphacia obvelata may be transmitted directly to primates, including humans, while Cysticercus fasciolaris and Hymenolepis diminuta may be transmitted indirectly to the same hosts. These findings highlight the importance of pest control in the Lisbon Zoo ecosystem, especially considering the role that some pest species play as reservoirs of parasitic agents and other pathogens of animals and humans.
APA, Harvard, Vancouver, ISO, and other styles
46

Candido, Guilherme Amaral. "Aplicação de um modelo de intensidade para apreçamento de credit default swaps sobre emissor corporativo no Brasil." reponame:Repositório Institucional do FGV, 2018. http://hdl.handle.net/10438/20441.

Full text
Abstract:
Extensive literature exists on the pricing of credit derivatives, particularly Credit Default Swaps, yet little has been discussed about the distinctive Brazilian case, with its specific legislation and interest-rate conventions. This work focuses on implementing an intensity model, in particular the ISDA standard model, adapted to a CDS contract in Brazil on a corporate issuer. Spreads of Credit Default Swaps traded in the offshore market, offshore bond yields and local debenture yields were used as inputs for obtaining the implied default hazard rates and for backtesting the model. The data cover the period from 2015 to 2017, including moments of stress related to the Brazilian political crisis. Some applications are then presented, including hedging, basis trading and the structuring of Credit Linked Notes.
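The intensity-model pricing summarised above can be illustrated with a deliberately simplified sketch. This is not the thesis's calibrated ISDA model: it assumes a single constant hazard rate, and the spread and recovery values below are hypothetical. Under those assumptions the "credit triangle" approximation ties a CDS par spread to the implied hazard rate, and survival probabilities follow from the constant-intensity default model.

```python
import math

def implied_hazard_rate(spread: float, recovery: float) -> float:
    """Flat hazard rate implied by a CDS par spread via the credit
    triangle approximation: lambda ~ spread / (1 - recovery)."""
    return spread / (1.0 - recovery)

def survival_probability(lam: float, t: float) -> float:
    """Survival probability to time t under a constant default intensity lam."""
    return math.exp(-lam * t)

# Hypothetical inputs: a 250 bp par spread and a 40% recovery rate
lam = implied_hazard_rate(0.025, 0.40)   # ~0.0417 per year
p5 = survival_probability(lam, 5.0)      # ~0.81 five-year survival probability
```

A full implementation along the lines of the ISDA standard model would instead bootstrap a piecewise-constant hazard curve from the whole CDS term structure, discounting the premium and protection legs with the local day-count and rate conventions the thesis adapts.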
APA, Harvard, Vancouver, ISO, and other styles
47

Skučaitė-Gribauskienė, Eglė. "Maisto kokybės ir saugos politika Lietuvoje." Master's thesis, Lithuanian Academic Libraries Network (LABT), 2006. http://vddb.library.lt/obj/LT-eLABa-0001:E.02~2006~D_20060526_094146-61968.

Full text
Abstract:
Final thesis of university postgraduate studies: 80 pages, 15 figures, 8 tables, 80 references, 5 appendices, in Lithuanian. KEY WORDS: food quality, food safety, food quality and safety policy, food industry, food market, economic indicators, hazard analysis. Research object – the policy of food quality and safety. Research aim – to assess the implementation of food quality and safety policy in Lithuania and to evaluate food quality and safety in food industry companies. Objectives: • collect and analyse information on the theoretical aspects of food quality and safety; • gather and examine information about food quality and safety policy in Lithuania; • analyse and evaluate the economic indicators of the Lithuanian food industry; • evaluate the food quality and safety assurance systems in food industry companies; • examine the expected outlook for the development of food quality and safety in Lithuania. Research methods – logical analysis and synthesis of scientific literature, logical abstraction, comparison, diagrammatic presentation, grouping and quantitative analysis of statistics, forecasting and other economic research methods. In writing this thesis, scientific literature by Lithuanian and foreign authors, monographs, Lithuanian legislation, EU directives and regulations, other legal acts and various statistics were used.
APA, Harvard, Vancouver, ISO, and other styles
48

Puzanova, Daria. "Americká ekonomická krize 2007-2009." Master's thesis, Vysoká škola ekonomická v Praze, 2009. http://www.nusl.cz/ntk/nusl-10912.

Full text
Abstract:
This diploma thesis describes the financial and economic crisis that emerged in the USA during 2007. The work analyses the preceding recessions and the course of the current crisis. Attention is also given to a detailed study of the pre-crisis period in the US economy and to identifying the root causes of the crisis and their interrelationships. The final part of the work examines the consequences of the crisis and the possible ways it may develop further.
APA, Harvard, Vancouver, ISO, and other styles
49

Öberg, Mattias U. L. "Health risk assessment of dioxin-like compounds in complex samples /." Stockholm : Karolinska inst, 2003. http://diss.kib.ki.se/2003/91-7349-692-8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Fernandes, Alfredo Manoel da Silva. "Duração da hospitalização e faturamento das despesas hospitalares em portadores de cardiopatia congênita e de cardiopatia isquêmica submetidos à intervenção cirúrgica cardiovascular assistidos no protocolo da via rápida." Universidade de São Paulo, 2003. http://www.teses.usp.br/teses/disponiveis/5/5131/tde-21072014-110315/.

Full text
Abstract:
Objective - To evaluate patient care in the pre-, intra- and postoperative phases of cardiac surgery under a fast-track recovery protocol compared with the conventional protocol. Patients - 175 patients were studied, 107 (61%) men and 68 (39%) women, aged 2 months to 81 years. Inclusion criteria: first surgical intervention, non-complex congenital or ischemic heart disease, normal ventricular function and at least 2 preoperative outpatient consultations. Patients undergoing emergency surgery were excluded. Interventions - care delivered under the fast-track and the conventional protocol. Statistical analysis (measures) - exploratory, univariate (Kaplan-Meier) and multivariate (Cox) analysis of the time spent in each hospital unit. Hospital facilities were classified as outpatient clinic, preoperative admission unit, surgical center, postoperative recovery unit and postoperative admission unit; their use was expressed as the discharge rate per unit of time, with a significant interaction observed between care protocol and type of heart disease for the stay in the surgical center, surgical intervention time, stay in the postoperative recovery unit, anesthesia time and the time between admission and surgery dates. 
Results - with 95% confidence, the discharge rate per unit of time of the congenital heart disease patients assisted under the fast-track protocol, relative to the conventional protocol, was: 11.3 times the conventional rate for the stay in the surgical center; 6.3 times for the duration of the surgical intervention; 6.8 times for the duration of anesthesia; 1.5 times for the duration of perfusion; 2.8 times for the stay in the postoperative recovery unit; 6.7 times for the hospital stay (time between admission and discharge); 2.8 times for the stay in the preoperative admission unit (time between the admission date and the surgery date); and 2.1 times for the stay in the postoperative unit (time between leaving the postoperative recovery unit and hospital discharge). For the ischemic heart disease patients, the discharge rates under the conventional and fast-track protocols were the same. CONCLUSIONS - The data from this study suggest that care can be more efficient, with respect to the variables studied, under the fast-track recovery protocol. The congenital and ischemic heart disease patients spent less time in the installed medical and hospital facilities when assisted under the fast-track recovery protocol, and incurred lower medical and hospital care expenses.
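The "discharge rate per unit of time" reported above is a hazard, so the quoted multiples (11.3x, 6.3x, and so on) are hazard ratios for leaving each hospital unit. As a hedged illustration, not the study's actual Cox analysis, and with invented lengths of stay: under an exponential model the discharge hazard is the number of discharges divided by the total time at risk, and the ratio of two such hazards reproduces this kind of multiple.

```python
def discharge_rate(stays):
    """Discharge hazard under an exponential model: events / total time at risk."""
    return len(stays) / sum(stays)

def rate_ratio(fast_track_stays, conventional_stays):
    """Ratio of discharge hazards (fast-track vs conventional protocol)."""
    return discharge_rate(fast_track_stays) / discharge_rate(conventional_stays)

# Hypothetical stays (hours) in one hospital unit under each protocol
fast = [2.0, 3.0, 2.5, 3.5]       # mean stay 2.75 h
conv = [20.0, 30.0, 25.0, 35.0]   # mean stay 27.5 h
ratio = rate_ratio(fast, conv)    # ~10: fast-track patients leave ~10x faster
```

A constant ratio like this is exactly the proportional hazards assumption that the Cox model used in the study relies on; with covariates and censoring, the same quantity is estimated by the Cox partial likelihood rather than this simple events-over-time calculation.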
APA, Harvard, Vancouver, ISO, and other styles