Dissertations / Theses on the topic 'Risk of failure'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the top 50 dissertations / theses for your research on the topic 'Risk of failure.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Browse dissertations and theses from a wide variety of disciplines and organise your bibliography correctly.

1

Fafatas, Stephan A. "Auditor risk management following audit failure." Diss., Connect to online resource, 2006. http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqdiss&rft_dat=xri:pqdiss:3239393.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Ek, Gabrielle, and Ciriak Eszter. "The high risk of failure in micro-enterprises : Reducing failure-risk by evolving the traditional business plan." Thesis, Umeå universitet, Företagsekonomi, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-150029.

Full text
Abstract:
Today's economy of the European Union is, statistically, made up largely of startup enterprises, and startups have been and will remain an essential part of that economy. However, it is well known that startup failure rates are quite high, both in the economy as a whole and in the restaurant industry, which is the focus of this paper. There is therefore a pressing need among both scholars and entrepreneurs to work out how to reduce micro-enterprise startup failure rates. This paper was written with the purpose of studying the components of a traditional business plan model, looking for gaps and parts worth developing further. The primary concern was to identify the steps a startup must take in the business plan to better avoid financial failure in the pre-established startup period, which extends over 42 months. The following research question was therefore posed: "How can the components of the traditional business plan be adapted or complemented by contemporary research and by entrepreneurs' views and experiences in order to better avoid financial failure of a micro-enterprise start-up within the European restaurant industry?" To answer this question a qualitative study was conducted: contemporary research was reviewed and compared with primary data gathered through semi-structured interviews with managers and employees of restaurants. The abductive approach allowed the authors to enrich the established theories used. Two prominent gaps were identified in the traditional business plan models: networking and a red-thread strategy. The first gap, networking, covers the need to establish a "network identity" within the network in which the startup operates, to plan how that network can be used, and to state clearly what purposes and benefits it provides. The second gap, the red-thread strategy, emphasises the need for a strong overall focus on the organisation's desired goals and visions, and specifically for a plan for how that focus is to be implemented so that it permeates daily operations: the operational implications of the vision must be made clear, together with how the startup will ensure that the aim stays the same through its day-to-day operations. To conclude, it was found that by filling these two gaps, by addressing them in detail in the business plan, a startup could have a better chance of avoiding financial failure within the startup period.
APA, Harvard, Vancouver, ISO, and other styles
3

Goncalves, Alexandra. "Alcohol Consumption and Risk of Heart Failure." Thesis, Harvard University, 2015. http://nrs.harvard.edu/urn-3:HUL.InstRepos:17613726.

Full text
Abstract:
Aim: Alcohol is a known cardiac toxin and heavy consumption can lead to heart failure (HF). However, the relationship between moderate alcohol consumption and risk for HF, in either men or women, remains unclear. Methods and results: We examined 14,629 participants of the Atherosclerosis Risk in Communities (ARIC) study (54±6 years, 55% women) without prevalent HF at baseline (1987-89) who were followed for 24±1 years. Self-reported alcohol consumption was assessed as the number of drinks/week (1 drink=14g of alcohol) at baseline, and updated cumulative average alcohol intake was calculated over 8.9±0.3 years. Using multivariable Cox proportional hazards models, we examined the relation of alcohol intake with incident HF and assessed whether associations were modified by sex. Overall, most participants were abstainers (42%) or former drinkers (19%), with 25% reporting up to 7 drinks per week, 8% reporting ≥7 to 14 drinks per week, and 3% reporting ≥14 to 21 and ≥ 21 drinks per week, respectively. Incident HF occurred in 1,271 men and 1,237 women. Men consuming up to 7 drinks/week had reduced risk of HF relative to abstainers (HR 0.80, 95% CI 0.68-0.94, p=0.006); this effect was less robust in women (HR 0.84, 95% CI 0.71-1.00, p=0.05). In the higher drinking categories the risk of HF was not significantly different from abstainers, either in men or in women. Conclusion: In the community, alcohol consumption of up to 7 drinks/week at early-middle age is associated with lower risk for future HF, with a similar but less definite association in women than in men. These findings suggest that despite the dangers of heavy drinking, modest alcohol consumption in early-middle age may be associated with a lower risk of HF.
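For readers who want the form of the multivariable Cox proportional hazards model used in analyses like this one, the standard textbook specification (not reproduced from the thesis itself) is

$$ h(t \mid \mathbf{x}) = h_0(t)\,\exp(\beta_1 x_1 + \cdots + \beta_p x_p), $$

where $h_0(t)$ is the baseline hazard and $x_1, \dots, x_p$ are covariates such as the alcohol-intake category and the adjustment variables; the hazard ratio reported for a drinking category versus abstainers is $\mathrm{HR} = \exp(\beta_k)$, so HR = 0.80 corresponds to $\beta_k \approx \ln(0.80) \approx -0.22$.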
APA, Harvard, Vancouver, ISO, and other styles
4

Ombete, Kenneth. "Preventing chemical product failure." Diss., Rolla, Mo. : Missouri University of Science and Technology, 2009. http://scholarsmine.mst.edu/thesis/pdf/Ombete_09007dcc80706a6e.pdf.

Full text
Abstract:
Thesis (M.S.)--Missouri University of Science and Technology, 2009.
Vita. The entire thesis text is included in the file. Title from title screen of thesis/dissertation PDF file (viewed November 17, 2009). Includes bibliographical references (p. 27-30).
APA, Harvard, Vancouver, ISO, and other styles
5

Qiu, Qun. "Risk Assessment of Power System Catastrophic Failures and Hidden Failure Monitoring & Control System." Diss., Virginia Tech, 2003. http://hdl.handle.net/10919/11075.

Full text
Abstract:
One of the objectives of this study is to develop a methodology, together with a set of software programs that evaluate, in a power system, the risks of catastrophic failures caused by hidden failures in the hardware or software components of the protection system. The disturbance propagation mechanism is revealed by the analysis of the 1977 New York Blackout. The step-by-step process of estimating the relay hidden failure probability is presented. A Dynamic Event Tree for the risk-based analysis of system catastrophic failures is proposed. A reduced 179-bus WSCC sample system is studied and the simulation results obtained from California sub-system are analyzed. System weak links are identified in the case study. The issues relating to the load and generation uncertainties for the risk assessment of system vulnerabilities are addressed. A prototype system - the Hidden Failure Monitoring and Control System (HFMCS) - is proposed to mitigate the risk of power system catastrophic failures. Three main functional modules - Hidden Failure Monitoring, Hidden Failure Control and Misoperation Tracking Database - and their designs are presented. Hidden Failure Monitoring provides the basis that allows further control actions to be initiated. Hidden Failure Control is realized by using Adaptive Dependability/Security Protection, which can effectively stop possible relay involvement from triggering or propagating disturbance under stressed system conditions. As an integrated part of the HFMCS, a Misoperation Tracking Database is proposed to track the performance of automatic station equipment, hence providing automatic management of misoperation records for hidden failure analysis.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
6

Vantine, William L. "Managing the Risk of Failure in Complex Systems: Insight into the Space Shuttle Challenger Failure." Diss., Virginia Tech, 1998. http://hdl.handle.net/10919/40429.

Full text
Abstract:
This dissertation presents a new approach for identifying, assessing, mitigating, and managing the risks of failure in complex systems. It describes the paradigm commonly used today to explain such failures and proposes an alternative paradigm that expands the lens for viewing failures to include alternative theories derived from modern theories of physics. Further, it describes the foundation for each paradigm and illustrates how the paradigms may be applied to a particular system failure. Today, system failure commonly is analyzed using a paradigm grounded in classical or Newtonian physics. This branch of science embraces the principles of reductionism, cause and effect, and determinism. Reductionism is used to dissect the system failure into its fundamental elements. The principle of cause and effect links the actions that led to the failure to the consequences that result. Analysts use determinism to establish the linear link from one event to another to form the chain that reveals the path from cause to consequence. As a result, each failure has a single cause and a single consequence. An alternative paradigm, labeled contemporary, incorporates the Newtonian foundation of the classical paradigm, but it does not accept the principles as inviolate. Instead, this contemporary paradigm adopts the principles found in the theories of relativity, quantum mechanics, chaos, and complexity. These theories hold that any analysis of the failure is affected by the frame of reference of the observer. Causes may create non-linear effects and these effects may not be observable directly. In this paradigm, there are assumed to be multiple causes for any system failure. Each cause contributes to the failure to a degree that may not be measurable using techniques of classical physics. The failure itself generates multiple consequences that may be remote in place or time from the site of the failure, and which may affect multiple individuals and organizations. Further, these consequences are not inevitable, but may be altered by actions taken prior to and responses taken after the occurrence of the failure. The classical and contemporary paradigms are applied using a single embedded case study, the failure of the space shuttle Challenger. Sources, including literature and popular press articles published prior to and after the failure, as well as NASA documents, are reviewed to determine the utility of each paradigm. These reviews are supplemented by interviews with individuals involved in the failure and the official investigations that followed. This dissertation demonstrates that a combination of the classical and contemporary paradigms provides a more complete, and more accurate, picture of system failure. This combination links the non-deterministic elements of system failure analysis to the more conventional, deterministic theories. This new framework recognizes that the complete prevention of failure cannot be achieved; instead it makes provisions for preparing for and responding to system failure.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
7

Forsberg, Fredrik. "Probabilistic Assessment of Failure Risk in Gas Turbine Discs." Thesis, Linköping University, Department of Management and Engineering, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-15909.

Full text
Abstract:
Gas turbine discs are heavily loaded due to centrifugal and thermal loads and are therefore designed for a service lifetime specified in hours and cycles. New probabilistic design criteria have been developed at Siemens Industrial Turbomachinery AB, and this report evaluates whether existing turbine discs meet the new design criteria. The evaluation comprises two tasks: estimation of the failure risk and investigation of which parameters have a large effect on the results.

The evaluations show that the failure risks are smaller than the maximum failure risks allowed in the design criteria. Further, creep strain rate, temperature and creep rupture strain are identified as having a large effect on the results in the first case. In the second case, blade load and other mechanical loads, as well as yield stress, show a large effect on the results.
APA, Harvard, Vancouver, ISO, and other styles
8

Rezaei, Pooya. "Cascading Failure Risk Estimation and Mitigation in Power Systems." ScholarWorks @ UVM, 2016. http://scholarworks.uvm.edu/graddis/482.

Full text
Abstract:
Electricity is a critical component in our daily life. Because it is almost always available, we take it for granted. However, given the proper conditions, blackouts do happen every once in a while and can cause discomfort at a minimum, and a catastrophe in rare circumstances. The largest blackouts typically include cascading failures, which are sequences of interdependent outages. Although timely and effective operator intervention can often prevent a cascade from spreading, such interventions require ample situational awareness. The goals of this dissertation are twofold: to provide power system operators with insight into the risk of blackouts given the space of potential initiating outages, and to evaluate control systems that might mitigate cascading failure risk. Accordingly, this dissertation proposes a novel method to estimate cascading failure risk. It is shown that this method is at least two orders of magnitude faster in estimating risk, compared with a traditional Monte-Carlo simulation in two test systems including a large-scale real power grid model. This method allows one to find critical components in a system and suggests ideas for how to reduce blackout risk by preventive measures, such as adjusting initial dispatch of a system. In addition to preventive measures, it is also possible to use corrective control strategies to reduce blackout sizes. These methods could be used once the system is under stress (for example if some of the elements are overloaded) to stop a potential cascade before it unfolds. This dissertation focuses on a distributed receding horizon model predictive control strategy to mitigate overloads in a system, in which each node can only control other nodes in its local neighborhood. A distributed approach not only needs less communication and computation, but is also a more natural fit with modern power system operations, in which many control centers manage disjoint regional networks. In addition, a distributed controller may be more robust to random failures and attacks. A central controller benefits from perfect information, and thus provides the optimal solution. This dissertation shows that as long as the local neighborhood of the distributed method is large enough, distributed control can provide high quality solutions that are similar to what an omniscient centralized controller could achieve, but with less communication requirements (per node), relative to the centralized approach.
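For context on what is being estimated here, cascading failure risk is conventionally written as an expectation over the space of initiating outages; the generic form below is a standard formulation and is not taken verbatim from the dissertation:

$$ R = \sum_{i} \Pr(\text{outage}_i)\; \mathbb{E}\big[\, S \mid \text{outage}_i \,\big], $$

where $S$ is a measure of blackout size (for example, load shed or customers disconnected). A Monte-Carlo baseline estimates each conditional expectation by repeatedly simulating cascades, which is the computation the proposed method reportedly accelerates by at least two orders of magnitude.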
APA, Harvard, Vancouver, ISO, and other styles
9

Phan, Vuong Khac Thai. "Risk factors for treatment failure in isoniazid resistant tuberculosis." Thesis, Open University, 2014. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.664470.

Full text
Abstract:
There were 8.6 million cases and 1.3 million deaths from tuberculosis (TB) globally in 2012. Among the major challenges to global TB control and the ultimate goal of TB elimination is the increasing prevalence of drug-resistant strains of Mycobacterium tuberculosis worldwide, coupled with an extremely limited pipeline of novel drug development. In 2012 there were an estimated 450,000 cases of multi-drug resistant TB (resistant to at least rifampicin and isoniazid) and 170,000 deaths due to multi-drug resistant (MDR) TB worldwide. The prevalence of resistance to isoniazid is extremely high in some regions of the world, including Vietnam, where 25% of new smear-positive patients and 54% of retreatment patients are infected with strains resistant to isoniazid. Treatment outcomes are known to be worse for patients with undiagnosed isoniazid-resistant (INHR) TB treated with standard regimens, but the majority of patients have successful outcomes. This thesis investigated risk factors for treatment failure among patients with isoniazid-resistant TB in Ho Chi Minh City, Vietnam. Chapter one provides an introduction to tuberculosis and isoniazid-resistant TB, and chapter two describes the methodology of the studies described in the thesis. In chapter three, I investigate the treatment outcomes among a cohort of patients with isoniazid-resistant tuberculosis treated according to National TB guidelines. The data show that unfavourable treatment outcomes are unacceptably high, at 19% among patients with INHR TB.
APA, Harvard, Vancouver, ISO, and other styles
10

Alsoghayer, Raid Abdullah. "Risk assessment models for resource failure in grid computing." Thesis, University of Leeds, 2011. http://etheses.whiterose.ac.uk/1909/.

Full text
Abstract:
Service Level Agreements (SLAs) are introduced in order to overcome the limitations associated with the best-effort approach in Grid computing, and thereby to make Grid computing more attractive for commercial use. However, commercial Grid providers are reluctant to adopt SLAs since there is a risk of SLA violation as a result of resource failure, which results in a penalty fee; the need to model a resource's risk of failure is therefore critical for Grid resource providers. Essentially, moving from the best-effort approach for accepting SLAs to a risk-aware approach helps the Grid resource provider deliver a high level of Quality of Service (QoS). Moreover, risk is an important factor in establishing the resource price and the penalty fee in the case of resource failure. In light of this, we propose a mathematical model to predict the risk of failure of a Grid resource using a discrete-time analytical model driven by reliability functions fitted to observed data. The model relies on the resource's historical information to predict the probability of resource failure (risk of failure) over a given time interval. The model was evaluated by comparing the predicted risk of failure with the observed risk of failure using availability data gathered from Grid resources. The risk of failure is an important property of a Grid resource, especially when scheduling jobs optimally in relation to resources so as to achieve a business objective. However, in Grid computing, user-centric scheduling algorithms ignore the risk factor and mostly address the minimisation of the cost of resource allocation, or the overall deadline by which the job must be executed completely. Therefore, we propose a novel user-centric scheduling algorithm for scheduling Bag of Tasks (BoT) applications. The algorithm, which aims to meet user requirements, takes into account the risk of failure, the cost of resources and the job deadline. Through simulation, we demonstrate that the algorithm provides a near-optimal solution for minimising the cost of executing BoT jobs. We also show that the execution time of the proposed algorithm is very low, making it suitable for solving scheduling problems in real time. Risk assessment benefits the resource provider by providing methods to support either accepting or rejecting an SLA. Moreover, it enables the resource provider to understand the capacity of the infrastructure and thereby plan future investment. Scheduling algorithms benefit the resource provider by providing methods to meet user requirements and better utilise resources. The ability to adopt a risk assessment method and user-centric algorithms makes the exploitation of Grid systems more realistic.
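As a rough illustration of the kind of discrete-time, reliability-function-driven risk estimate described here (the thesis's own model is more elaborate, and the Weibull form and parameter values below are assumptions made purely for this sketch), the conditional probability that a resource which has survived to the present time fails within the next interval can be computed as follows:

```python
import math

def weibull_reliability(t, shape, scale):
    """Weibull reliability (survival) function: R(t) = exp(-(t/scale)**shape)."""
    return math.exp(-((t / scale) ** shape))

def risk_of_failure(t_now, interval, shape, scale):
    """Probability that a resource which has survived until t_now
    fails within the next `interval` time units:
        P(fail in (t_now, t_now + interval] | survived to t_now)
            = 1 - R(t_now + interval) / R(t_now)
    """
    return 1.0 - (weibull_reliability(t_now + interval, shape, scale)
                  / weibull_reliability(t_now, shape, scale))

# Hypothetical parameters fitted to a resource's observed uptimes (hours).
print(round(risk_of_failure(t_now=100.0, interval=24.0, shape=1.3, scale=400.0), 3))
```

In practice the shape and scale parameters would be re-fitted from each resource's availability history rather than fixed as in this example.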
APA, Harvard, Vancouver, ISO, and other styles
11

Lauder, Michael Alan. "Conceptualisation in Preparation for Risk Discourse: A Qualitative Step toward Risk Governance." Thesis, Cranfield University, 2011. http://dspace.lib.cranfield.ac.uk/handle/1826/6793.

Full text
Abstract:
The purpose of this research was, in order to forestall future failures of foresight, to provoke those responsible for risk governance into new ways of thinking through a greater exposure to and understanding of the body of existing academic knowledge. The research, which focused on the scholarship of application, synthesised the existing knowledge into a "coherent whole" in order to assess its practical utility and to examine what is to be learnt about existing knowledge by trying to use it in practice. The findings are in two parts. The first focuses on how one "thinks about thinking" about an issue. Early work identified three issues that were seen as being central to the understanding of risk governance. The first is the concept of risk itself, the second is to question whether there is a single paradigm used and the third is what is meant by the term "risk indicator". A "coherent whole", structured around seven dimensions, was created from the range of definitions used within existing literature. No single paradigm was found to be used when discussing risk issues. Three paradigms were identified and labelled "Line", "Circle" and "Dot". It was concluded that Risk Indicators were used to performance-manage risk mitigation barriers rather than as a mechanism by which organisations may identify emerging risks. The second focus was the synthesis of academic work relevant to risk governance. It produced a list of statements which encapsulated the concerns of previous writers on this subject. The research then operationalised the issues as questions, which were seen to have practical utility. The elements of the "coherent whole" suggest a way to provide access into the original research. The research suggests that it is unlikely that practitioners would wish to access the original research in its academic format. Further work therefore needs to be done to present the original work in a format that is more digestible to the practitioner community if it is to be used effectively. The results of this research are considered to be preliminary. No claim is being made that these questions are definitive. The research is, however, addressing an area which is of concern to those in practice and has not been previously examined.
APA, Harvard, Vancouver, ISO, and other styles
12

Christensen, Charles R. "Failure to prosper: Risk premia in peer to peer lending." Laramie, Wyo. : University of Wyoming, 2007. http://proquest.umi.com/pqdweb?did=1445044811&sid=1&Fmt=2&clientId=18949&RQT=309&VName=PQD.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

O'Donnell, Johanna. "Predicting heart failure deterioration." Thesis, University of Oxford, 2017. http://ora.ox.ac.uk/objects/uuid:f7e51226-128b-44eb-8f6a-557f1d0c9a53.

Full text
Abstract:
Chronic heart failure (HF) is a condition that affects more than 900,000 people in the UK. Mortality rates associated with the condition are high, with nearly 20% of patients dying within one year of diagnosis. Continuous monitoring and risk stratification can help identify patients at risk of deterioration and may consequently improve patients' likelihood of survival. Current repeated-measure risk stratification techniques for HF patients often rely on subjective perception of symptoms, such as breathlessness, and markers of fluid retention in the body (e.g. weight). Despite the common use of such markers, studies have shown that they offer limited effectiveness in predicting HF-related events. This thesis set out to identify and evaluate new markers for repeated-measure risk stratification of HF patients. It started with an exploration of traditional HF measurements, including weight, blood pressure, heart rate and symptom scores, and aimed to improve the performance of these measurements using a data-driven approach. A multi-variate model was developed from data acquired during a randomised controlled trial of remotely-monitored HF patients. The rare occurrence of HF-related adverse events during the trial required the development of a careful methodology. This methodology helped identify the markers with most predictive ability, which achieved moderate performance at identifying patients at risk of HF-related adverse events, clearly outperforming commonly-used thresholds. Subsequently, this thesis explored the potential value of additional, accelerometer-derived physical activity (PA) and sleep markers. For this purpose, the ability of accelerometer-derived markers to differentiate between individuals with and without HF was evaluated. It was found that markers that summarise the frequency and duration of different PA intensities performed best at differentiating between the two groups and may therefore be most suitable for future use in repeated-measure applications. As part of the analysis of accelerometer-derived HF markers, a gap in the methodology of automated accelerometer processing was identified, namely the need for self-reported sleep-onset and wake-up information. As a result, Chapter 5 of this thesis describes the development and evaluation of a data-driven solution for this problem. In summary, this thesis explored both traditional and new, accelerometer-derived markers for the early detection of HF deterioration. It utilised sound methodology to overcome limitations faced by sparse and unbalanced datasets and filled a methodological gap in the processing of signals from wrist-worn accelerometers.
APA, Harvard, Vancouver, ISO, and other styles
14

Atsu, Francis. "Essays on failure risk of firms using multivariate frailty models." Thesis, Brunel University, 2016. http://bura.brunel.ac.uk/handle/2438/12994.

Full text
Abstract:
The post-2007 global financial crisis, characterised by huge firm losses, especially in the USA and Europe, initiated a new strand of literature in which default models are adjusted for unobserved risk factors, including measurement errors and missing firm-specific and macroeconomic variables. These new models assume that default correlations are driven not only by observable firm-specific and macroeconomic factors, but also by unobserved risk factors. This thesis presents three empirical essays. The first essay estimates and predicts the within-sector failure rate and dependence of firms on the London Stock Exchange. The study offers an additive lognormal frailty model that accounts for both unobserved factors and regime changes. The analysis reveals that during distressed market periods the sector-based failure rates and dependencies tend to be high. The second essay proposes a novel approach based on a bias-corrected estimator to investigate the impact of informative firm censoring and unobserved factors on the hazard rates of US firms. The approach uses an inverse-probability-of-censoring weighting scheme that explicitly accounts for firm-specific factors, economic cycles, industry-level dependence and market activities induced by unobservable factors. The analysis shows that during distressed market periods the effect of informative censoring increases the hazard rates on average, and varies across industries. The third essay employs a mixed effects Cox model to estimate the failure dependence caused by firms' exposure to country-based and group-level unobserved factors within the Eurozone. The empirical results show higher failure dependence among firms in groups of countries with similar economic and financial conditions than in countries with different conditions. Overall, the thesis contributes to the empirical literature on firm default in the broad area of corporate finance by offering a different approach to capturing default dependence and its variation during unfavourable market conditions, and by adjusting for the effects of non-default firm exit on active firms.
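The frailty models used across these essays share a common structure; a generic shared-frailty (mixed effects) proportional hazards specification, given here as a textbook form rather than the thesis's exact model, is

$$ h_{ij}(t) = h_0(t)\, Z_j \exp\big(\boldsymbol{\beta}^{\top} \mathbf{x}_{ij}\big), $$

where $h_{ij}(t)$ is the default hazard of firm $i$ in group $j$ (a sector or country), $\mathbf{x}_{ij}$ collects observed firm-specific and macroeconomic covariates, and $Z_j > 0$ is an unobserved group-level frailty (for example, log-normally distributed) whose shared value induces default correlation among firms in the same group.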
APA, Harvard, Vancouver, ISO, and other styles
15

Kreskey, Donna D. "Headsprout Early Reading for Students At Risk for Reading Failure." Digital Archive @ GSU, 2012. http://digitalarchive.gsu.edu/cps_diss/73.

Full text
Abstract:
This study examined the efficacy of using Headsprout Early Reading (Headsprout, 2007) to supplement a balanced literacy curriculum for kindergarten and first grade students in a suburban public school system. Headsprout, which is an example of computer aided instruction (CAI), provided internet-based, supplemental reading instruction that incorporates the five critical components of reading instruction cited by the National Reading Panel (NRP, 2000). The school system implemented Headsprout as a standard protocol, Tier 2 intervention within their Response to Intervention (RTI) process. The study included kindergarten and first grade students from across the school system who were identified as at risk for reading failure based on fall Dynamic Indicators of Basic Early Literacy (DIBELS) scores. Kindergarten and first grade students identified as at risk for reading failure who participated in Headsprout were compared with matched groups of kindergarten and first grade students who did not participate in Headsprout. Overall, neither kindergarten nor first grade students who participated in Headsprout gained meaningful educational benefit from the CAI instruction provided by Headsprout beyond the benefit they received from participating in the general education, RTI Tier 1, balanced literacy curriculum that was available to all kindergarten and first grade students.
APA, Harvard, Vancouver, ISO, and other styles
16

Kwon, Seungmin. "A risk-based ship design approach to progressive structural failure." Thesis, University of Strathclyde, 2012. http://oleg.lib.strath.ac.uk:80/R/?func=dbin-jump-full&object_id=17818.

Full text
Abstract:
Although substantial effort has been devoted in the design process of ships to reducing the operational risk level by preventing and mitigating accidental events, societal expectations of safety at sea are growing faster than ever. The framework of the safe return to port for passenger ship safety reflects this trend, in pursuance of zero tolerance to loss of human life in the event of an accident. Along these lines, the emphasis in assessing the survivability of a damaged ship is placed on damage stability and hull girder collapse, under the explicit assumption that the initial damage extent is fixed. In practice, however, it is often observed that progressive degradation of the damaged structure threatens the survival of a ship by causing a significant reduction of its strength, as witnessed in the loss of MV Prestige. Hence, information on progressive structural failure over time and its effect on the hull girder residual strength is of paramount importance in evaluating the survivability of a damaged ship and mitigating the ensuing consequences. This provides the objective of this study: the elaboration of a method for progressive structural failure analysis under time-varying wave loads and the development of a parametric tool for fast and reliable assessment of the structural survivability of a damaged ship with respect to damage propagation. The developed tool provides the probability of unstable damage propagation over time, from which the window of safe intervention in emergency operations can be extracted, supporting the decision-making process in the course of rescue and salvage operations. Moreover, this work also sets the foundation of a new dimension in the early ship design phase, namely structural survivability with respect to progressive structural failure. In this way, it contributes to the holistic safety assessment approach advocated by the design-for-safety philosophy and the risk-based ship design methodology. The developed tool is fully parametric so as to support decision-making both in emergency operations, where fast and reliable information is required, and in the early design stage, where various damage cases need to be assessed in order to administer appropriate structural design solutions.
APA, Harvard, Vancouver, ISO, and other styles
17

Silva-Cruz, Aracely Lizet, Karina Velarde-Jacay, Nilton Yhuri Carreazo, and Raffo Escalante-Kanashiro. "Risk factors for extubation failure in the intensive care unit." Associacao de Medicina Intensiva Brasileira - AMIB, 2018. http://hdl.handle.net/10757/624625.

Full text
Abstract:
Objective: To determine the risk factors for extubation failure in the intensive care unit. Methods: The present case-control study was conducted in an intensive care unit. Failed extubations were used as cases, while successful extubations were used as controls. Extubation failure was defined as reintubation being required within the first 48 hours of extubation. Results: Out of a total of 956 patients who were admitted to the intensive care unit, 826 were subjected to mechanical ventilation (86%). There were 30 failed extubations and 120 successful extubations. The proportion of failed extubations was 5.32%. The risk factors found for failed extubations were a prolonged length of mechanical ventilation of greater than 7 days (OR = 3.84, 95%CI = 1.01 - 14.56, p = 0.04), time in the intensive care unit (OR = 1.04, 95%CI = 1.00 - 1.09, p = 0.03) and the use of sedatives for longer than 5 days (OR = 4.81, 95%CI = 1.28 - 18.02; p = 0.02). Conclusion: Pediatric patients on mechanical ventilation were at greater risk of failed extubation if they spent more time in the intensive care unit and if they were subjected to prolonged mechanical ventilation (longer than 7 days) or greater amounts of sedative use.
Peer-reviewed
APA, Harvard, Vancouver, ISO, and other styles
18

Bosco-Lévy, Pauline. "Heart failure in France : chronic heart failure therapeutic management and risk of cardiac decompensation in real-life setting." Thesis, Bordeaux, 2019. http://www.theses.fr/2019BORD0348.

Full text
Abstract:
In France, around one million people are estimated to be affected by heart failure (HF); there are nearly 70 000 deaths related to HF and more than 150 000 hospitalizations, despite well-defined treatment management. These numbers are expected to increase in the coming years, due in particular to the ageing of the population. The objective of this work was to study the use of the pharmacological treatments indicated in HF (beta-blockers, angiotensin-converting enzyme inhibitors, angiotensin receptor blockers, aldosterone antagonists, diuretics, digoxin, ivabradine) in a real-world setting and to identify the clinical or pharmacological predictors associated with a new episode of cardiac decompensation. A first study assessed the accuracy of French claims databases in identifying HF patients. A second study estimated that 17 to 37% of HF patients were not exposed to any HF treatment in the year following an incident HF hospitalization. The third and fourth parts of this thesis showed that almost one fourth of HF patients were rehospitalized within the 2 years following a first hospitalization. The main clinical predictors of rehospitalization were age, high blood pressure, atrial fibrillation and diabetes. The association found between bivalent iron use and HF rehospitalization underlines the importance of the risk related to anemia or iron deficiency in the occurrence of a cardiac exacerbation episode. These results prompt a reconsideration of the treatment management of HF patients and highlight the need to reinforce the surveillance of patients at highest risk of cardiac exacerbation.
APA, Harvard, Vancouver, ISO, and other styles
19

Xie, Xiangwen. "Covariate measurement error methods in failure time regression /." Thesis, Connect to this title online; UW restricted, 1997. http://hdl.handle.net/1773/9538.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

MacLeod, Stefanie. "Clinical nursing instructors' experiences teaching students deemed at risk of failure." Thesis, University of British Columbia, 2015. http://hdl.handle.net/2429/52688.

Full text
Abstract:
The experience of the clinical nursing instructor (CNI) in teaching nursing students deemed at risk of failure has not been well explored in nursing literature. It may be difficult for the CNI to support as well as evaluate a student when that student’s performance is judged to be unsatisfactory or unsuccessful. The purpose of this study was to explore CNIs’ experiences in teaching undergraduate nursing students deemed at risk of failure, to discover how CNIs identify potentially unsuccessful students and to describe what supports and resources CNIs utilize to help them manage such students. A pilot study using a qualitative phenomenological approach was used to interview CNIs who had at least one experience teaching an undergraduate nursing student deemed at risk of failure at the University of British Columbia (UBC) and the British Columbia Institute of Technology (BCIT) schools of nursing. The study found that CNIs identified students at risk of failure using “red flags” that included a range of actions, behaviours, and attitudes. These red flags included deficits in the demonstrated thinking, knowledge, and skills; deficits in the social and cultural aspects of nursing practice; disorganization and tardiness; and lack of integrity. CNIs felt that early and clear communication of their concerns with faculty and students deemed at risk of failure was beneficial for both the student and CNI. CNIs made decisions to fail students by considering patient safety and objective evidence while at the same time supporting and nurturing these students by providing opportunities for success.
Faculty of Applied Science
School of Nursing
Graduate
APA, Harvard, Vancouver, ISO, and other styles
21

Ingelsson, Erik. "Insulin Resistance and Inflammation as Risk Factors for Congestive Heart Failure." Doctoral thesis, Uppsala : Acta Universitatis Upsaliensis : Univ.-bibl. [distributör], 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-5879.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Kotsopulos, Spiridon I. "On the evaluation of risk of failure in irrigation water delivery." Thesis, University of Southampton, 1989. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.236328.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Mansaray, Alhassan A. "Public-Private Partnership : countries' attractiveness and the risk of project failure." Thesis, Loughborough University, 2018. https://dspace.lboro.ac.uk/2134/33333.

Full text
Abstract:
The primary objective of this thesis is to analyse the public-private partnership (PPP) framework for infrastructure development in developing countries across the six regions of the world. The thesis utilises the World Bank's private participation in infrastructure (PPI) dataset for the period 1980–2014 and examines three thematic areas. The first comprises an exploratory analysis of the PPI dataset. The second research area focuses on the relationship between countries' attractiveness for PPPs and the characteristics of the countries, including macroeconomic and market conditions, fiscal constraints, regulatory and governance quality, and experience in PPPs, by utilising the Zero-Inflated Negative Binomial and Cragg's Double Hurdle models in an attempt to model private investors' decision to engage in PPPs as separate participation and consumption decisions. The third research area employs the methodology of survival analysis to investigate the risk of failure of PPP projects based on the allocation of residual facility ownership between the partners. The thesis's primary contributions include the utilisation of a wider and more informative range of econometric methodologies that have not previously been applied to the PPI dataset and, for the first time, a framework to select an appropriate structure for PPPs that will enhance project survival. A key finding of the thesis is that private investors prioritise macroeconomic and market variables, such as price stability, over regulatory and governance variables, such as corruption, in determining in which country to engage in PPPs. Contrary to previous research, corruption was found to be of no consequence to private investors who wish to engage in PPPs, even in developing countries. Another key finding is that PPP projects which confer residual ownership on the public sector have a lower risk of failure than those in which such ownership is conferred on the private sector. Evidence also suggests that the size of the project and the participation of multilateral institutions in PPPs affect the risk of project failure.
APA, Harvard, Vancouver, ISO, and other styles
24

Saine, Kathleen C. (Kathleen Chen). "College Students at Risk of Academic Failure: Neurocognitive Strengths and Weaknesses." Thesis, University of North Texas, 1995. https://digital.library.unt.edu/ark:/67531/metadc278348/.

Full text
Abstract:
This study examined the neurocognitive skills, incidence of mild head injury, incidence of learning disabilities, and study habits among college students with grade point average of 2.00 or below (N = 25) as contrasted with college students with grade point average above 2.00 (N = 70). The intent of this research was to extend the work of Segalowitz and Brown (1991) and Segalowitz and Lawson (1993) who found significant associations between reported history of mild head injury and developmental disabilities among high school and college samples. MANOVAs conducted on measures of academic achievement, global cognitive skills, verbal and nonverbal memory, motor and tactile functioning, and study habits did not discriminate between probationary and non-probationary students. Probationary and non-probationary students also did not differ with regard to incidence of reported head injury, frequency of diagnosed learning disabilities, and study habits. Measures of neurocognitive functioning and study habits did not contribute to the prediction of grade point average over and above that predicted by Scholastic Aptitude Test composite score. Several exploratory analyses were performed examining the relationship between study habits and neurocognitive skills. Gender differences, implications for future research and development of study skills courses, and limitations of this study were discussed.
APA, Harvard, Vancouver, ISO, and other styles
25

Todd, Lauren Louise. "IPO Failure Risk by Industry, Auditor Industry Specialization, and Audit Fees." Thesis, The University of Arizona, 2011. http://hdl.handle.net/10150/144995.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Thacker, Scott. "Reducing the risk of failure in interdependent national infrastructure network systems." Thesis, University of Oxford, 2015. https://ora.ox.ac.uk/objects/uuid:02e7313c-0967-47e3-becc-2e7da376f745.

Full text
Abstract:
Infrastructure network systems support society and the economy by facilitating the distribution of essential services across broad spatial extents, at a range of scales. The complex and interdependent nature of these systems provides the conditions for which localised failures can dramatically cascade, resulting in disruptions that are widespread and very often unforeseen. This systemic vulnerability has been highlighted multiple times over the previous decades in infrastructure systems from around the world. In the future, the hazards to which infrastructure systems are exposed are set to grow with increasing extreme event risks caused by climate change. The aim of this thesis is to develop methodology and analysis for understanding and reducing the risk of failure of national interdependent infrastructure network systems. This study introduces multi-scale, system-of-systems based methodology and applied analysis that provides important new insights into interdependent infrastructure network risk and adaptation. Adopting a complex network based approach, real-world asset data is integrated from the energy, transport, water, waste and digital communications sectors to represent the physical interconnectivity that exists within and between interdependent infrastructure systems. Given the often limited scope of real-world datasets, an algorithm is presented that is used to synthesise missing network data, providing continuous network representations that preserve the most salient spatial and topological properties of real multi-level infrastructure systems. Using the resultant network representations, the criticality of individual assets is calculated by summing the direct and indirect customer disruptions that can occur in the event of failure. This is achieved by disrupting sets of functional service flow pathways that transcend sectoral and operational boundaries, providing long-range connectivity between service-originating source nodes and customer-allocated sink nodes. Kernel density estimation is used to integrate discrete asset criticality values into a continuous surface from which statistically significant infrastructure geographical criticality hotspots are identified. Finally, a business case is presented for investment in infrastructure adaptation, where adaptation costs are compared to the reduction in expected damages that arise from interdependency-related failures over an asset's lifetime. By representing physical and geographic interdependence at a range of scales, this analysis provides new evidence to inform the targeting of investments to reduce risks and enhance system resilience. It is concluded that the research presented within this thesis provides new theoretical insights and practical techniques for a range of academic, industrial and governmental infrastructure stakeholders, from the UK and beyond.
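A minimal sketch of the kernel-density hotspot step described above, using synthetic asset coordinates and criticality scores (all names and values below are illustrative assumptions, and a simple percentile threshold stands in for the thesis's statistical-significance test):

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical asset data: (x, y) coordinates and a criticality score per asset
# (e.g. customer disruptions caused by that asset's failure).
rng = np.random.default_rng(0)
coords = rng.uniform(0, 100, size=(2, 200))        # 200 assets in a 100 x 100 km region
criticality = rng.lognormal(mean=2.0, sigma=1.0, size=200)

# Criticality-weighted kernel density estimate over space.
kde = gaussian_kde(coords, weights=criticality / criticality.sum())

# Evaluate the surface on a grid; high-density cells are candidate criticality hotspots.
xs, ys = np.mgrid[0:100:200j, 0:100:200j]
surface = kde(np.vstack([xs.ravel(), ys.ravel()])).reshape(xs.shape)
hotspot_threshold = np.percentile(surface, 95)     # simple proxy for a significance test
print("Hotspot cells:", int((surface >= hotspot_threshold).sum()))
```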
APA, Harvard, Vancouver, ISO, and other styles
27

Sambo, Mogamat Fadeel. "An investigation into Business Continuity Plan (BCP) failure during a disaster event." University of the Western Cape, 2012. http://hdl.handle.net/11394/4575.

Full text
Abstract:
Magister Commercii (Information Management) - MCom(IM)
This thesis examines what a Business Continuity Plan (BCP) should comprise, the difference between a BCP and a Disaster Recovery Plan (DRP), the key elements of an effective BCP, and the different types of disasters. It also investigates why companies that have a BCP in place, and that test their plan on a regular basis (quarterly or bi-annually), still experience prolonged downtime during a disaster, resulting in Service Level Agreements (SLAs) not being met or major financial losses. It also examines accepted processes within a BCP to determine whether there are ways of improving these processes to prevent companies from experiencing prolonged downtime. The objective of this research is to determine and understand why organisations within the Western Cape experience prolonged downtimes during a disaster event, and the potential deficiencies in a BCP and how they can be amended. A case study of four companies based in the Western Cape was conducted. These companies were chosen because each of them has a BCP in place and each has experienced prolonged downtime during a disaster. Qualitative interviews with the aid of an open-ended questionnaire were used to interview the BCP or Risk Manager of each company. The data were analysed to determine the causes of their prolonged downtime during a disaster. In the analysis and findings, each company is presented as a separate case study. The intention of this research study is to add an additional concept to the Common BCP Process that was identified within this study and that formed the basis for the Conceptual Framework, thereby reducing downtime during a disaster for the companies that formed part of the research.
APA, Harvard, Vancouver, ISO, and other styles
28

Guy, Jonathan Arthur. "The regulator and the regulated : risk perception of wastewater treatment plant non-compliance events." Thesis, Imperial College London, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.324889.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Jongsma, Michael Howard. "Care Transition Gaps: Risk Identification and Intervention." ScholarWorks, 2015. https://scholarworks.waldenu.edu/dissertations/446.

Full text
Abstract:
Hospital readmissions related to chronic heart failure (CHF) are costly, widespread, and often avoidable. Patient education that includes diagnosis, causes, medications, diet, exercise, and exacerbation warning signs has been shown to reduce the number of CHF readmissions. The purpose of this study was to use risk stratification to identify CHF patients at high risk for 30-day readmission. Once a high-risk CHF patient was identified, nursing interventions would be triggered to reduce readmissions and close the gaps in the continuum of care following acute care admission. Transitions of care theory was used as the framework for this project. The methodology had a quality improvement focus. The patient population consisted of high-risk CHF patients (n = 25) with NYHA classification of II-IV using the risk identification tool. Patients were identified using the tool, were followed for 30 days, and received nursing interventions to reduce the possibility of readmission. Only one of the identified patients was readmitted within 30 days for a diagnosis unrelated to CHF, resulting in no readmissions within this sub group. This study suggests that risk stratification can identify and direct resources to CHF patients, decreasing their likelihood for readmission. Nurse leaders can use standardized tools such as the risk identification tool, thereby reducing readmissions along with associated costs for readmissions.
APA, Harvard, Vancouver, ISO, and other styles
30

Sambo, Mogamat Fadeel. "An investigation into business continuity plan (BCP) failure during a disaster event." Thesis, University of the Western Cape, 2014. http://hdl.handle.net/11394/4241.

Full text
Abstract:
Magister Commercii (Information Management) - MCom(IM)
This thesis examines what a Business Continuity Plan (BCP) should comprise, the difference between a BCP and a Disaster Recovery Plan (DRP), the key elements of an effective BCP, and the different types of disasters. It also investigates why companies that have a BCP in place, and that test their plan on a regular basis (quarterly or bi-annually), still experience prolonged downtime during a disaster, resulting in Service Level Agreements (SLAs) not being met or major financial losses. It also examines accepted processes within a BCP to determine whether there are ways of improving these processes to prevent companies from experiencing prolonged downtime. The objective of this research is to determine and understand: • why organisations within the Western Cape experience prolonged downtimes during a disaster event; • the potential deficiencies in a BCP and how they can be amended.
APA, Harvard, Vancouver, ISO, and other styles
31

Bole, Brian McCaslyn. "Load allocation for optimal risk management in systems with incipient failure modes." Diss., Georgia Institute of Technology, 2013. http://hdl.handle.net/1853/50394.

Full text
Abstract:
The development and implementation challenges associated with a proposed load allocation paradigm for fault risk assessment and system health management based on uncertain fault diagnostic and failure prognostic information are investigated. Health management actions are formulated in terms of a value associated with improving system reliability, and a cost associated with inducing deviations from a system's nominal performance. Three simulated case study systems are considered to highlight some of the fundamental challenges of formulating and solving an optimization on the space of available supervisory control actions in the described health management architecture. Repeated simulation studies on the three case-study systems are used to illustrate an empirical approach for tuning the conservatism of health management policies by way of adjusting risk assessment metrics in the proposed health management paradigm. The implementation and testing of a real-world prognostic system is presented to illustrate model development challenges not directly addressed in the analysis of the simulated case study systems. Real-time battery charge depletion prediction for a small unmanned aerial vehicle is considered in the real-world case study. An architecture for offline testing of prognostics and decision making algorithms is explained to facilitate empirical tuning of risk assessment metrics and health management policies, as was demonstrated for the three simulated case study systems.
APA, Harvard, Vancouver, ISO, and other styles
32

Morley-Davies, A. J. "Predicting death in chronic heart failure : electrocardiographic, autonomic and neuroendocrine risk assessment." Thesis, University of Glasgow, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.272860.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Lu, Fei. "The autonomic nervous system, ventricular repolarisation and risk of sudden cardiac failure." Thesis, University of London, 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.309312.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Hashemi, Djawid [Verfasser]. "Synkopen – Risikofaktor bei Herzinsuffizienz? : Syncopes – risk factor in heart failure? / Djawid Hashemi." Berlin : Medizinische Fakultät Charité - Universitätsmedizin Berlin, 2021. http://d-nb.info/1241538476/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Pellicori, Pierpaolo. "Newer imaging modalities to identify high-risk ambulatory patients with heart failure." Thesis, University of Hull, 2016. http://hydra.hull.ac.uk/resources/hull:14756.

Full text
Abstract:
The lack of widely accepted objective measures of cardiac dysfunction other than left ventricular ejection fraction (LVEF) has hampered, and continues to hamper, clinical research in patients with heart failure (HF). Identifying patients at higher risk of adverse outcome would allow better targeting of therapy to those with most to gain. The thesis is divided in three parts. In the first part, I report the results of studies of the association between echocardiographic measures of right atrial pressure (by measuring the inferior vena cava (IVC) diameter) and outcome in ambulatory patients with HF. I also studied the associations with prognosis of a newer echocardiographic method (global longitudinal strain, GLS) to assess left ventricular systolic function in patients with normal LVEF on conventional imaging. In the second part, I report the results of studies of the associations of left atrial function by cardiac magnetic resonance (cMRI) with outcome in ambulatory patients with HF. I also studied the relationship between QRS morphology on ECG with cardiac structure and function measured by cMRI in ambulatory patients with HF. In the third part, I report the results of developing and prospectively evaluating an ultrasound method to measure the internal jugular vein diameter (as an objective estimate of the right atrial pressure) and its changes with respiratory manoeuvres. I studied the association between the jugular vein diameter, clinical and echocardiographic variables, and its relations with outcome in ambulatory patients with HF and controls. My results showed that upstream consequences of a dysfunctional left ventricle, such as impaired left atrial function measured by cMRI, a distended IVC or internal jugular vein by ultrasound, provide powerful prognostic information, similar to that obtained by measuring N-terminal pro-B-type natriuretic peptide plasma levels, in individuals with HF regardless of whether they have a reduced or normal LVEF. As residual congestion (dilated IVC or jugular vein) and impaired left atrial function appear strongly related to an adverse outcome, tailoring treatment to minimise congestion or improving left atrial function is an attractive concept worth testing.
APA, Harvard, Vancouver, ISO, and other styles
36

Rosengarten, James A. "Risk stratification in sudden cardiac death : engineering novel solutions in heart failure." Thesis, University of Southampton, 2014. https://eprints.soton.ac.uk/407449/.

Full text
Abstract:
Sudden cardiac death (SCD) risk is reduced by implantable cardioverter defibrillator (ICD) use in appropriately selected patients. Established markers such as impairment of left ventricular function and QRS duration are non-specific for arrhythmic death, and therefore many patients receive ICD therapy from which they gain no benefit, either due to survival without arrhythmia or death from pump failure. Both myocardial scar and serum protein biomarkers have potential as SCD risk stratifiers, but novel solutions are needed to deliver non-invasive tests that are suitable for point-of-care testing. The aims of this thesis were to explore novel assessment methods for the risk stratification of SCD, with particular focus on heart failure. Several approaches were chosen to explore these concepts: (i) meta-analysis to assess the utility of fragmented QRS, (ii) retrospective evaluation of ECG and CMR to assess ECG markers of repolarisation and (iii) QRS scoring, (iv) prospective evaluation of an automated QRS scoring algorithm to predict myocardial scar, (v) artificial intelligence machine learning techniques to develop and validate an algorithm capable of classifying ECG scar, (vi) a novel high-resolution proteomic technique to propose biomarkers of SCD risk, and (vii) validation of those biomarkers using ELISA. The hypothesis is that novel clinical tools, encompassing technologies and techniques which could stretch across the clinical landscape from primary to specialised care services, can be identified as indicators of ICD benefit in patients at risk of SCD. My results indicate that simpler ECG markers such as T-peak-end, fQRS and QRS scoring have a significant association with myocardial scar, although the strength of association varies according to scar characteristics and is not specific. The specificity of these markers for mode of death is also weak. Computerised algorithms can serve to speed up manual ECG scoring whilst maintaining overall accuracy, but the greatest potential is seen in using a novel marker custom-developed using artificial intelligence techniques. I also found that candidate serum biomarkers, predictive of death or ventricular arrhythmia, could be identified through high-resolution proteomic techniques. Clinical and technical validation with ELISA is possible. Novel non-invasive markers, such as serum proteins and computer ECG analysis, may be valuable tools to improve risk prediction. The incremental benefit of these tools to determine prognosis, and to select those who will most benefit from ICD therapy, can now be addressed by future prospective studies.
APA, Harvard, Vancouver, ISO, and other styles
37

Foulks, Barbara. "Academic Survival Skills for the Young Child At-Risk for School Failure." Scholarly Commons, 1987. https://scholarlycommons.pacific.edu/uop_etds/3193.

Full text
Abstract:
The primary purpose of this study was to identify social competence and academic survival skills necessary for success in kindergarten. The study was designed to indicate similarities and differences among early childhood educators in (a) academic survival skills considered necessary for success in kindergarten, and (b) behaviors considered inappropriate for kindergarten. A review of the literature revealed minimal research related to academic survival skills and social competence in kindergarten. In order to ascertain which skills early childhood educators consider crucial for the child's successful survival in kindergarten, the Social Behavior Skills Inventory (Walker & Rankin, 1980) was utilized as a survey instrument to obtain the relevant information. The survey obtained data that determined the specific social competence and academic survival skills considered important for kindergarten children in Calaveras, Amador, and Tuolumne Counties in rural California. The results of the study revealed information regarding social competence and academic survival skills needed for success in kindergarten. There were only two academic survival skills agreed upon by kindergarten teachers, preschool professionals, and family day care providers as being necessary for success in kindergarten. Social skills and positive interactions with peers were not as critical for academic survival as other types of skills. Kindergarten teachers considered more of the adaptive skills to be necessary for academic survival than either preschool professionals or family day care providers. All groups rated far more maladaptive behaviors as highly important than appropriate behaviors. All participant groups felt more strongly about unacceptable, maladaptive behaviors than about critical, appropriate behaviors. Altogether, 16 of 51 maladaptive behaviors were rated unacceptable by all three participant groups. There was uniform agreement among all survey participants that two behaviors were not tolerated in kindergarten. The majority of behaviors rated as unacceptable in kindergarten were behaviors that challenged the teacher's control and authority. The least important maladaptive behaviors were related to peer socialization. This study was a beginning in determining the particular adaptive precursor skills needed by the at-risk child. By identifying academic survival skills considered necessary for a successful adjustment to kindergarten, the study provided data on skills needed by the young child at-risk for school failure. A number of recommendations for further research were generated.
APA, Harvard, Vancouver, ISO, and other styles
38

Bucknor, Matthew D. "Modeling of Electrical Cable Failure in a Dynamic Assessment of Fire Risk." The Ohio State University, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=osu1373996537.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Qi, Lihong. "Analysis of failure time data under risk set sampling and missing covariates /." Thesis, Connect to this title online; UW restricted, 2003. http://hdl.handle.net/1773/9550.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Betihavas, Vasiliki. "Predicting risk: developing and testing of a nomogram to predict hospitalisation in chronic heart failure (CHF- Risk Study)." Thesis, Curtin University, 2013. http://hdl.handle.net/20.500.11937/552.

Full text
Abstract:
Chronic heart failure (CHF) is the leading cause of hospital admission in the elderly. Currently, no absolute risk model for rehospitalisation exists. The CHF-Risk Study was a three-phase study that led to the development of a nomogram using a derivation cohort drawn from a contemporaneous Australian CHF population. Factors associated with an increased risk of cardiovascular rehospitalisation were age, living alone, a sedentary lifestyle and the presence of multiple co-morbid conditions.
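As an illustration of how a nomogram of this kind is typically used, the sketch below computes a rehospitalisation probability from a logistic model over the factors named in the abstract. The coefficients are invented for illustration only and are not those of the CHF-Risk Study.

```python
# Minimal illustration of how a nomogram-style risk score is typically computed
# from a fitted logistic regression model. The coefficients below are invented
# for illustration and are NOT those of the CHF-Risk Study.
import math

COEFFS = {
    "intercept": -3.0,
    "age_per_decade_over_60": 0.35,
    "lives_alone": 0.50,
    "sedentary": 0.45,
    "per_comorbidity": 0.30,
}

def rehospitalisation_probability(age, lives_alone, sedentary, n_comorbidities):
    """Logistic model: probability = 1 / (1 + exp(-linear predictor))."""
    lp = (COEFFS["intercept"]
          + COEFFS["age_per_decade_over_60"] * max(age - 60, 0) / 10.0
          + COEFFS["lives_alone"] * int(lives_alone)
          + COEFFS["sedentary"] * int(sedentary)
          + COEFFS["per_comorbidity"] * n_comorbidities)
    return 1.0 / (1.0 + math.exp(-lp))

# Hypothetical patient: 78 years old, lives alone, sedentary, three comorbidities.
print(f"{rehospitalisation_probability(78, True, True, 3):.2f}")
```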
APA, Harvard, Vancouver, ISO, and other styles
41

Halvorson, Daniel. "Capabilities, International Order and Risk: State Failure and Governance Intervention in Theory and History." Thesis, Griffith University, 2010. http://hdl.handle.net/10072/365882.

Full text
Abstract:
This study examines the phenomenon of “failed” states and governance intervention in comparative historical and international systemic context. The dissertation argues that state failure is a condition partly constructed by the leading actors of international society. The study advances a three-part framework for analysis to understand how leading states and their close allies interpret what constitutes a state failure and how an interventionist policy response is formulated. Interpretations of state failure and modes of governance intervention are based on the interplay of transnational disorder with (1) the distribution of capabilities in the international system, (2) the pattern of order in international society, and (3) the sensitivity of the domestic polities of leading actors to risk. This framework for analysis is applied to three qualitative case studies of state “failure” and “governance” intervention, selected on the basis of system polarity: the 1882 British occupation of Egypt; the United States combat intervention in South Vietnam in 1965; and Australia’s Regional Assistance Mission to the Solomon Islands (RAMSI) in 2003.
Thesis (PhD Doctorate)
Doctor of Philosophy (PhD)
Centre for Governance and Public Policy
Griffith Business School
APA, Harvard, Vancouver, ISO, and other styles
42

Salman, Baris. "Infrastructure Management and Deterioration Risk Assessment of Wastewater Collection Systems." University of Cincinnati / OhioLINK, 2010. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1282051343.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Bartone, Cheryl L. "Variables that increase heart failure patients' risk of early readmission: a retrospective analysis." University of Cincinnati / OhioLINK, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1377869498.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Chandarana, Upasna Piyush. "Optimizing Geotechnical Risk Management Analysis." Diss., The University of Arizona, 2017. http://hdl.handle.net/10150/625550.

Full text
Abstract:
Mines have an inherent risk of geotechnical failure in both rock excavations and tailings storage facilities. Geotechnical failure occurs when there is a combination of exceptionally large forces acting on a structure and/or low material strength, resulting in the structure not withstanding its designed service load. The excavation of rocks can cause unintended rock mass movements. If the movement is monitored promptly, accidents, loss of ore reserves and equipment, loss of lives, and closure of the mine can be prevented. Mining companies routinely use deformation monitoring to manage the geotechnical risk associated with the mining process. The aim of this dissertation is to review the geotechnical risk management process in order to optimize geotechnical risk management analysis. In order to perform a proper analysis of slope instability, understanding the importance as well as the limitations of any monitoring system is crucial. Because of the threat posed by slope instability, predicting the time of slope failure has become a top priority in risk management programs. Datasets from monitoring systems are used to perform slope failure analysis. Innovations in slope monitoring equipment in recent years have made it possible to scan a broad rock face in a short period with sub-millimetric accuracy. Instruments like Slope Stability Radars (SSR) provide the quantitative data that is commonly used to perform risk management analysis. However, it is challenging to find a method that can provide accurate time-of-failure predictions. Many studies in the recent past have attempted to predict the time of slope failure using the Inverse Velocity (IV) method, and to analyze the probability of failure with fuzzy neural networks. The various methods investigated in this dissertation include: Minimum Inverse Velocity (MIV), Maximum Velocity (MV), Log Velocity (LV), Log Inverse Velocity (LIV), Spline Regression (SR) and Machine Learning (ML). Based on the results of these studies, the ML method has the highest rate of success in predicting the time of slope failures. The predictions provided by the ML method showed ~86% improvement in the results in comparison to the traditional IV method and ~72% improvement when compared with the MIV method. The MIV method also performed well, with ~75% improvement in the results in comparison to the traditional IV method. Overall, both of the newly proposed methods, ML and MIV, outperformed the traditional inverse velocity technique used for predicting slope failure.
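For readers unfamiliar with the inverse velocity (IV) method referenced above, the sketch below shows its core idea: as a slope accelerates towards failure, the inverse of the displacement velocity tends towards zero, so a straight line fitted to inverse velocity against time crosses zero near the predicted time of failure. The monitoring data are synthetic; real radar datasets would need smoothing and windowing, and this sketch is not the thesis's implementation.

```python
# Sketch of the classical inverse-velocity (IV) idea referenced in the abstract:
# fit 1/velocity against time by least squares and take the zero-crossing of the
# fitted line as the predicted time of failure. Data below are synthetic.
import numpy as np

def predict_failure_time(times, velocities):
    """Fit 1/velocity = a*t + b and return the time at which the line reaches zero."""
    inv_v = 1.0 / np.asarray(velocities, dtype=float)
    a, b = np.polyfit(np.asarray(times, dtype=float), inv_v, 1)
    if a >= 0:
        return None          # not accelerating towards failure under this model
    return -b / a            # zero-crossing of the fitted inverse velocity

# Synthetic example: displacement rate (mm/day) accelerating over 10 days,
# constructed so that the true failure time is t = 12 days.
t = np.arange(10)
v = 1.0 / (0.05 * (12.0 - t))
print(f"predicted failure at t = {predict_failure_time(t, v):.1f} days")
```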
APA, Harvard, Vancouver, ISO, and other styles
45

Maurer, Michaela Burri Sonja. "Plasma homocysteine and cardiovascular risk in heart failure with and without cardiorenal syndrome /." [S.l.] : [s.n.], 2009. http://www.ub.unibe.ch/content/bibliotheken_sammlungen/sondersammlungen/dissen_bestellformular/index_ger.html.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Hewitt, Dolan. "Risk analysis associated with flank failure from Putauaki, Bay of Plenty, New Zealand." The University of Waikato, 2007. http://hdl.handle.net/10289/2337.

Full text
Abstract:
Volcanoes are dynamic evolving structures, with life cycles that are punctuated by episodes of flank instability. Putauaki (Mount Edgecumbe) is a stratovolcano located onshore in the Bay of Plenty, New Zealand. The aim of this study was to assess the stability of Putauaki and analyse the risk associated with volcanic collapse. To achieve this objective, a multidisciplinary approach was used, incorporating geomorphological and geological mapping, rock mass classification, laboratory testing to identify geotechnical properties of materials representative of the volcano, stability modelling, and analysis of landslide run-out zones. Putauaki comprises two predominant features including the larger and younger Main Cone (the summit lying 820 m a.s.l., slope angles up to 36°), and smaller and older Main Dome (the summit lying 420 m a.s.l., slope angle of 24°). Both features show little evidence of erosion or surface water. Rock mass description defined six lithotechnical units including indurated andesite, indurated dacite, scoriaceous andesite, altered andesite (all categorised as hard rocks), and block and ash flow and Matahina Ignimbrite (both categorised as soft rocks). The uniaxial compressive strength (UCS) of indurated andesite and indurated dacite was 60 ± 4 MPa and 44.7 ± 0.9 MPa respectively, correlating with moderately strong rock. Discontinuities of the indurated units were widely spaced, showed medium persistence and wide aperture, and were slightly weathered. Infill comprised predominantly loosely packed, very strong, coarse gravel. UCS of scoriaceous andesite and altered andesite was 25 ± 5 MPa and 15 ± 1 MPa respectively, allowing categorisation as very weak rock. Discontinuities of scoriaceous andesite were widely spaced, showed high persistence and wide aperture, and were moderately weathered. Discontinuities of the altered andesite were moderately spaced, showed low persistence and wide aperture, and were highly weathered. Infill of scoriaceous and altered andesite was loosely packed, moist, weak to very weak medium gravel. The block and ash flow was a poorly sorted, loosely packed, sandy, gravelly and cobble-rich matrix-supported deposit. The Matahina Ignimbrite was a very weak, discontinuity-poor deposit. Shear box testing indicated cohesion and friction angle of 0 MPa and 42.1° (block and ash flow) and 1.4 × 10⁻³ MPa and 41.7° (Matahina Ignimbrite) respectively. These values are similar to published values. Correlation of each lithotechnical unit to its respective rock mass description site allowed approximate boundaries of each unit to be mapped. Each unit's mass strength was combined with measured bulk densities and incorporated into two-dimensional slope profiles using the stability modelling package Galena™. Ten slope profiles of Putauaki were constructed. Failure surfaces for each slope profile were defined using the Bishop simplified multiple analysis method. Four slope profiles showed the potential for small scale failure (less than 0.1 km² of material). The remaining six slope profiles showed the potential for large scale failure (greater than 0.1 km² of material). Stability of these six slope profiles was investigated further in relation to earthquake force, watertable elevation, and a disturbance factor of the rock mass (D).
Conditions-of-failure graphs for profile 6a showed that at low D (less than 0.4), earthquake forces and watertable elevation must be unrealistically high for the region (greater than 0.33 g; greater than 15% watertable elevation) in order to produce a factor of safety of less than 1. The remaining five slope profiles showed the potential to be unstable under realistic earthquake forces and watertable elevations. Two of these profiles were unable to achieve stability at D greater than 0.8 (profile 4) and D greater than 0.9 (profile 5). A D value of 0.6 (intermediate between 0.4 and 0.8) is argued to most realistically represent Putauaki. The fact that Putauaki has not undergone large scale failure to date supports the conclusion that the constructed models overestimate the influence of those factors which promote slope instability. Maximum and minimum landslide run-out zones were constructed for the slope profiles exhibiting the potential for large scale failure. Definition of the position and extent of maximum and minimum run-out zones assumed H/L (fall height to run-out length) ratios of 0.09 and 0.18 respectively, as well as the 'credible flow path' concept. Identified impacts of landslides sourced from Putauaki include inundation of Kawerau Township, the Tarawera River, forestry operations, road networks, and power supplies. Based on these impacts, the risk posed by landslides from each slope profile was categorised as ranging from relatively low to relatively high. Landslides sourced from the south-west flanks pose a relatively low risk because they would require unrealistically high watertable elevations and earthquake forces. Landslides sourced from the north-west flanks pose a relatively high risk, as even the minimum run-out would inundate north-eastern parts of Kawerau Township. Landslides sourced from the eastern flanks pose a moderate risk because their run-out zones avoid Kawerau Township.
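The H/L ratios quoted above translate directly into run-out lengths via L = H / (H/L). The short sketch below works through that arithmetic for an illustrative fall height; the 700 m figure is an assumption for the example, not a value taken from the thesis, where the fall height would depend on the failure surface and deposition elevation.

```python
# Worked illustration of the H/L (fall height to run-out length) ratios quoted
# above: run-out length L = H / (H/L). The fall height is illustrative only.
def runout_length(fall_height_m: float, h_over_l: float) -> float:
    return fall_height_m / h_over_l

H = 700.0  # illustrative fall height in metres (assumption, not from the thesis)
for ratio, label in [(0.18, "minimum run-out"), (0.09, "maximum run-out")]:
    print(f"{label}: H/L = {ratio} -> L ≈ {runout_length(H, ratio) / 1000:.1f} km")
```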
APA, Harvard, Vancouver, ISO, and other styles
47

Bicik, Josef. "A risk-based decision support system for failure management in water distribution networks." Thesis, University of Exeter, 2010. http://hdl.handle.net/10036/110414.

Full text
Abstract:
The operational management of Water Distribution Systems (WDS), particularly under failure conditions when the behaviour of a WDS is not well understood, is a challenging problem. The research presented in this thesis describes the development of a methodology for risk-based diagnostics of failures in WDS and its application in a near real-time Decision Support System (DSS) for WDS operation. In this thesis, the use of evidential reasoning to estimate the likely location of a burst pipe within a WDS by combining the outputs of several models is investigated. A novel Dempster-Shafer model is developed, which fuses evidence provided by a pipe burst prediction model, a customer contact model and a hydraulic model to increase confidence in correctly locating a burst pipe. A new impact model, based on a pressure-driven hydraulic solver coupled with a Geographic Information System (GIS) to capture the adverse effects of failures from an operational perspective, is created. A set of Key Performance Indicators used to quantify impact is aggregated according to the preferences of a Decision Maker (DM) using Multi-Attribute Value Theory. The potential of distributed computing to deliver near real-time performance of computationally expensive impact assessment is explored. A novel methodology to prioritise alarms (i.e., detected abnormal flow events) in a WDS is proposed. The relative significance of an alarm is expressed using a measure of the overall risk represented by the set of all potential incidents (e.g., pipe bursts) that might have caused it. The DM’s attitude towards risk is taken into account during the aggregation process. The implementation of the main constituents of the proposed risk-based pipe burst diagnostics methodology, which forms a key component of the aforementioned DSS prototype, is tested on a number of real-life and semi-real case studies. The methodology has the potential to enable more informed decisions to be made in near real-time failure management in WDS.
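The evidential reasoning described above rests on Dempster's rule of combination. The sketch below shows that rule for two hypothetical sources of evidence about a single candidate pipe, over a tiny frame of discernment {burst, no burst}. The mass values are invented, and the thesis itself fuses three model outputs across many pipes; this is only the combination step, not the thesis's software.

```python
# Minimal sketch of Dempster's rule of combination for two sources of evidence
# about a single candidate pipe, over the frame {burst, no_burst}. The mass
# assignments are invented for illustration.
from itertools import product

def combine(m1: dict, m2: dict) -> dict:
    """Dempster's rule: intersect focal elements, renormalise by (1 - conflict)."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y                     # mass assigned to the empty set
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

THETA = frozenset({"burst", "no_burst"})          # full frame (ignorance)
BURST = frozenset({"burst"})

# Hypothetical evidence from a pipe burst prediction model and customer contacts.
m_prediction = {BURST: 0.4, THETA: 0.6}
m_contacts = {BURST: 0.7, frozenset({"no_burst"}): 0.1, THETA: 0.2}

fused = combine(m_prediction, m_contacts)
print({tuple(sorted(k)): round(v, 3) for k, v in fused.items()})
```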
APA, Harvard, Vancouver, ISO, and other styles
48

Zelt, Jason. "Risk Factors, Mechanisms and Therapeutics for Right Heart Failure Associated with Pulmonary Hypertension." Thesis, Université d'Ottawa / University of Ottawa, 2020. http://hdl.handle.net/10393/40737.

Full text
Abstract:
Right ventricular (RV) function is one of the most important predictors of prognosis in many cardiovascular disease states. Despite the significance of RV function to survival, there are no therapies that directly or selectively improve RV function, and the basis for RV failure is poorly understood. This is particularly relevant for patients with pulmonary arterial hypertension (PAH), where RV failure in the setting of pressure overload is the leading cause of death. PAH is introduced in the second chapter of this thesis by comparing and refining contemporary mortality risk assessment strategies. I then explore (1) RV neurohormonal function and (2) RV energetics, two molecular pathways thought to be involved in the pathogenesis and progression of maladaptive RV failure. I employed small animal molecular imaging using positron emission tomography (PET) to non-invasively investigate these pathways. The PET imaging techniques employed in this thesis have the unique potential for translation to human studies, to further explore disease mechanisms.
APA, Harvard, Vancouver, ISO, and other styles
49

Subramanian, Rohit. "COMPUTATIONAL FRAMEWORK TO ASSESS ROLE OF MANUFACTURING IN MATERIAL-DEFECT RELATED FAILURE RISK." The Ohio State University, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=osu1398998282.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Alhadab, Mohammad Muflih Salem. "Real and accrual earnings management, regulatory environments, audit quality and IPO failure risk." Thesis, University of Leeds, 2012. http://etheses.whiterose.ac.uk/4876/.

Full text
Abstract:
This thesis builds on information asymmetry, agency conflicts and litigation-risk backgrounds to examine real and accrual earnings management activities around Initial Public Offerings (IPOs), mitigating factors (regulators and auditors), and consequences for future performance (stock return and IPO survivability). The IPO event is associated with higher levels of information asymmetry and agency conflicts between insiders and outsiders that are found to provide managers with strong incentives and more flexibility to engage in earnings management activities to maximise their own wealth rather than that of shareholders. Due to the existence of information asymmetry around IPOs, IPO firms hire high quality auditors during the IPO to send positive signals about the offer to outside investors (Titman and Trueman, 1986). The first empirical study (chapter five) of this thesis examines whether different regulatory environments impact the use of real and accrual earnings management around IPOs via an analysis of the heavily regulated Main market of the London Stock Exchange and the more lightly regulated Alternative Investment Market (AIM), and whether these different regulatory burdens (restrictive vs. lighter) have different mechanisms/capabilities to correct stock prices that were inflated by earnings management during the IPO. The results of this study show that IPO firms in the UK manage earnings upward utilizing both real and accrual earnings management around IPOs, and that IPO firms on the lightly regulated AIM market exhibit higher levels of sales-based and accrual-based earnings management and a lower level of discretionary expenses-based earnings management than IPO firms on the heavily regulated Main market. Further, the results show that real and accrual earnings management, which take place during the IPO year, have severe negative consequences for post-IPO stock return performance, and that the heavily regulated Main market of the London Stock Exchange has better mechanisms/capabilities to correct stock prices that were inflated by earnings management during the IPO year than the more lightly regulated AIM market. The second empirical study (chapter six) examines whether enhanced audit quality impacts real earnings management activities that occur during the IPO, whether enhanced audit quality impacts managers’ tendency to choose between real and accrual earnings management, and whether enhanced audit quality affects the association between real and accrual earnings management and post-IPO stock return performance. The results show that high quality auditors mitigate real earnings management activities that occur through discretionary expenses-based manipulation during the IPO year, and that IPO firms audited by high quality auditors (big N audit firms) undertake a higher level of sales-based manipulation to avoid the monitoring of discretionary expenses-based and accrual-based manipulations. Further, IPO firms audited by high quality auditors are found to experience a severe decline in post-IPO stock return performance due to the extensive use of sales-based manipulation in the IPO year. Thus, this evidence confirms that high quality auditors impact the relationship between real and accrual earnings management and post-IPO stock return performance. Finally, the third empirical study (chapter seven) explores whether real and accrual earnings management that occur during the IPO year are associated with post-IPO failure and survivability in the subsequent periods.
The results show that IPO firms with high levels of real and accrual earnings management during the IPO year have a higher probability of failure in the subsequent period. Further, IPO firms that engage in high levels of real and accrual earnings management during the IPO year have lower survival rates in the post-IPO period. In summary, the main findings of this thesis suggest that real and accrual earnings management activities are utilized by IPO firms, that the extent of these activities depends on the regulatory environment and audit quality, and that these activities are negatively associated with future stock performance and post-IPO survivorship. Regulators and audit firms should consider the fact that managers switch between real and accrual earnings management to avoid external monitoring. Further, the greater restriction on discretionary expenses-based and accrual-based manipulation seems to lead managers to engage extensively in sales-based manipulation.
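Accrual earnings management in this literature is commonly measured as discretionary accruals, i.e. the residuals from a cross-sectional Jones-type regression. The sketch below illustrates that general approach with synthetic data; it is a simplified specification (the modified Jones model, for example, adjusts revenue changes for receivables only when computing the non-discretionary component) and is not necessarily the exact model estimated in the thesis.

```python
# Hedged sketch of how accrual earnings management is commonly measured:
# discretionary accruals as the residual of a cross-sectional Jones-type
# regression of scaled total accruals on an asset-scaled constant, adjusted
# revenue change and PP&E. All data below are synthetic.
import numpy as np

def discretionary_accruals(total_accruals, lagged_assets, d_revenue, d_receivables, ppe):
    """Regress TA/A on 1/A, (dREV - dREC)/A and PPE/A; residuals are discretionary."""
    y = total_accruals / lagged_assets
    X = np.column_stack([
        1.0 / lagged_assets,
        (d_revenue - d_receivables) / lagged_assets,
        ppe / lagged_assets,
    ])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ coef          # residual = abnormal (discretionary) component

# Synthetic industry-year sample of six firms (arbitrary monetary units).
rng = np.random.default_rng(0)
assets = rng.uniform(50, 500, 6)
da = discretionary_accruals(
    total_accruals=rng.normal(0, 5, 6),
    lagged_assets=assets,
    d_revenue=rng.normal(10, 3, 6),
    d_receivables=rng.normal(2, 1, 6),
    ppe=rng.uniform(20, 200, 6),
)
print(np.round(da, 3))
```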
APA, Harvard, Vancouver, ISO, and other styles
