Dissertations / Theses on the topic 'Risk modelling'

Consult the top 50 dissertations / theses for your research on the topic 'Risk modelling.' Full texts and abstracts are available below where the metadata permits.

1

Gilbert, Emmeleen Ulita. "Risk-return portfolio modelling." Master's thesis, University of Cape Town, 2007. http://hdl.handle.net/11427/19030.

Full text
Abstract:
Markowitz introduced the concept of modelling the risk associated with a given security as the variance of the expected return and showed how, under certain conditions, an investor's portfolio can be managed by balancing the expected return of the portfolio and its variance. Building on Markowitz's original framework, William Sharpe extended these ideas by connecting a portfolio to a risky asset. This extension became known as the Sharpe Index Model. There are a number of assumptions governing the residuals of the Sharpe Index Model, one being that the error terms of the stocks are uncorrelated. The Troskie-Hossain innovation to the Sharpe Index Model relaxes this assumption. We evaluate the Troskie-Hossain model relative to the Sharpe Index Model and the Markowitz portfolio, and find that the Troskie-Hossain model approximates the Markowitz efficient frontier and optimal portfolio very closely. Further examining the residuals, we find evidence of autocorrelation and heteroskedasticity. Using ARMA to model the autocorrelation of the residuals has very little impact on the efficient frontier when working with log returns; however, when working with simple returns the ARMA shifts the efficient frontier to the left. We find that GARCH(1,1) models capture most of the autocorrelation in the squared residuals for both simple returns and log returns and shift the efficient frontier to the left. Modelling a non-constant conditional mean and non-constant conditional variance (ARMA and GARCH) has proven difficult: the more complex a model becomes, the more difficult the estimation. We investigate the effects of dividend yields on the efficient frontier, as well as using simple returns vs log returns in portfolio construction. Including dividend yields in our return data shifts the efficient frontier upwards. However, only the α's are increased; the β's and β t-statistics of the shares remain the same. This shift effect of dividends has no impact on the time series or heteroskedastic models. The simple returns efficient frontier lies above that of the log returns efficient frontier. The α's for simple returns are very different to those of log returns, but the β's lie in a similar region to those of log returns.
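For orientation, a minimal sketch (not from the thesis) of the mean-variance optimisation that Markowitz's framework rests on: given a covariance matrix of returns, the minimum-variance weights for a target expected return follow in closed form. The simulated returns, asset count and target level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
returns = rng.normal(0.0005, 0.01, size=(1000, 4))      # hypothetical daily returns, 4 assets

mu = returns.mean(axis=0)                 # expected returns
sigma = np.cov(returns, rowvar=False)     # covariance matrix (the Markowitz risk measure)

def frontier_weights(target, mu, sigma):
    """Minimum-variance weights achieving a target expected return
    (fully invested, short sales allowed), via the standard two-constraint solution."""
    inv = np.linalg.inv(sigma)
    ones = np.ones(len(mu))
    a = ones @ inv @ ones
    b = ones @ inv @ mu
    c = mu @ inv @ mu
    d = a * c - b ** 2
    lam = (c - b * target) / d
    gam = (a * target - b) / d
    return inv @ (lam * ones + gam * mu)

w = frontier_weights(mu.mean(), mu, sigma)
print("weights:", w.round(3), " portfolio variance:", w @ sigma @ w)
```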
APA, Harvard, Vancouver, ISO, and other styles
2

Shao, Jia. "Modelling catastrophe risk bonds." Thesis, University of Liverpool, 2015. http://livrepository.liverpool.ac.uk/2033679/.

Full text
Abstract:
Insurance companies are seeking more adequate liquidity funds to cover the insured property losses related to natural and man-made disasters. Past experience shows that the losses caused by catastrophic events, such as earthquakes, tsunamis, floods or hurricanes, are extremely large. One of the alternative methods of covering these extreme losses is to transfer part of the risk to the financial markets by issuing catastrophe-linked bonds. This thesis focuses on modelling and valuing Catastrophe (CAT) risk bonds. The findings of this thesis are twofold. First, we study the pricing process for CAT bonds under different model setups. Second, based on different frameworks, we structure three catastrophe-based (earthquake, general and nuclear risk) bonds, estimate the parameters of the models using real-world data and obtain numerical results using Monte Carlo simulation. A comparison between the different models is also conducted. The first model employs a structure of n financial and m catastrophe-independent risks, and obtains the valuation framework. This generalised extension allows an easier application in the industry. As an illustration, a structured earthquake bond is considered with a parametric trigger type (the annual maximum magnitude of the earthquake) and the pricing formulas are derived. The second model presents a contingent claim model in which the aggregate claims follow compound forms and the claim inter-arrival times depend on the claim sizes, by employing a two-dimensional semi-Markov process. The final model derives nuclear catastrophe (N-CAT) risk bond prices by extending the previous model. A two-coverage-type trigger CAT bond is analysed by adding a perturbed state into the claims system, i.e. the system stops (the N-CAT bond contract terminates) immediately after a major catastrophe.
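For illustration, a rough Monte Carlo sketch of pricing a CAT bond with a parametric trigger on the annual maximum earthquake magnitude, in the spirit of the first model described above; the contract terms, discount rate and magnitude distribution are assumptions, not the thesis's calibration.

```python
import numpy as np

rng = np.random.default_rng(1)

face, coupon, r, T = 100.0, 0.06, 0.03, 3        # assumed contract terms (annual coupons)
trigger = 7.0                                     # assumed magnitude trigger
n_events_mean = 5                                 # assumed mean number of earthquakes per year

def annual_max_magnitude(rng):
    n = rng.poisson(n_events_mean)
    if n == 0:
        return 0.0
    # assumed shifted-exponential magnitude distribution (Gutenberg-Richter style)
    return (4.0 + rng.exponential(0.6, size=n)).max()

def bond_payoff_pv(rng):
    pv = 0.0
    for t in range(1, T + 1):
        if annual_max_magnitude(rng) >= trigger:
            return pv                             # trigger hit: remaining coupons and principal forfeited
        pv += coupon * face * np.exp(-r * t)
        if t == T:
            pv += face * np.exp(-r * T)
    return pv

price = np.mean([bond_payoff_pv(rng) for _ in range(50_000)])
print(f"Monte Carlo CAT bond price: {price:.2f}")
```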
APA, Harvard, Vancouver, ISO, and other styles
3

Aas, Kjersti. "Statistical Modelling of Financial Risk." Doctoral thesis, Norwegian University of Science and Technology, Department of Mathematical Sciences, 2007. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-1780.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Esparragoza, Rodriguez Juan Carlos. "Large portfolio credit risk modelling." Thesis, Imperial College London, 2008. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.486274.

Full text
Abstract:
A model for large portfolio credit risk is developed by using results on the asymptotic behaviour of stochastic networks. We analyse some of the characteristics of the model by studying the infinitesimal generator of the portfolio default process, using results from the theory of Piecewise Deterministic Processes (PDPs). An efficient pricing technique is proposed using a newly-introduced quadrature algorithm based on a decomposition of the sample space similar to the canonical Poisson space decomposition. Accurate calibration to iTraxx spreads is demonstrated.
APA, Harvard, Vancouver, ISO, and other styles
5

Scarrott, Carl John. "Reactor modelling and risk assessment." Thesis, Lancaster University, 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.414910.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Anastassopoulou, Nikolitsa. "Credit risk measurement and modelling." Thesis, City University London, 2006. http://openaccess.city.ac.uk/8497/.

Full text
Abstract:
This thesis aims to make a contribution to the understanding of the key economic and company-specific components of credit spreads in the investment and non-investment grade US bond market, for bond indices of different maturities. It calls for the full integration of different market and firm-specific variables into a unique framework, in order to predict credit spread changes. Key determinants of default risk are employed to determine credit migration risk. In particular, this thesis provides evidence on the relation between different macroeconomic factors and credit spread changes across maturities and rating categories; it supports the use of the consumer confidence index as the most important variable explaining changes in credit spreads in investment and high yield companies; and, most importantly, it provides support for the strong informational content of high yield spreads as predictors of output growth, based on Option Adjusted Spreads (OAS). It favours the inclusion of implied volatilities in explaining credit spread changes, while it criticises the incorporation of historical ones. Throughout the thesis, it becomes evident that BBB-rated bonds exhibit highly volatile patterns and are very difficult to model. Financial ratios adjusted to reflect depreciation and amortisation expenses, which are usually very high for non-investment grade companies, prove to be very important in explaining changes of high yield spreads; however, firm-specific risk accounts only for a small fraction of the variation in the investment grade category. Ultimately, it is shown that by using solely market (equity and macro) and firm-specific variables, i.e. some of the key determinants of default risk and of the price of credit-risky debt in most Merton-type models, we can accurately forecast credit spread changes at least one year ahead, particularly based on results from the investment grade sample. Moreover, credit spread forecasts based on our set of OAS tend to be overestimated rather than underestimated, as opposed to results provided by previous studies. This makes forecasts more conservative and therefore more appealing for risk management purposes. In particular, this thesis is focused on the main drivers of credit spread movements in the US corporate bond market. Four issues are mainly considered. The first part of the thesis examines a question that is a point of central focus in the fixed income literature, i.e. the relation between credit spread changes and the macroeconomic cycle. This chapter is inspired by the relatively little work that has been done on the empirical relationship between credit spread changes and the macroeconomy, since most of the literature on this issue focuses on macroeconomic variables and the modelling of default risk. We investigate how this relation evolves, not only with respect to short, medium and long term maturities but also for investment and non-investment rated companies, by testing the direction of causation among economic variables and credit spreads and by employing different sets of data and estimation techniques. We find that, irrespective of the statistical method used or the time period tested, the most important variable in explaining the variation of credit spread changes is the US Consumer Confidence Index. We affirm the negative relation between the consumer confidence index, money supply and changes in credit spreads, but not for GDP and industrial production.
The negative relation between the term structure and credit spreads is also asserted for investment grade bonds of all maturities, consistent with structural model theory, while we find this relation to be positive for non-investment grade companies. Results from the OLS regressions suggest that macroeconomic variables alone can explain at best 17% of the variation in medium and long term maturing indices, and 20.5% in short term indices. Findings from cross-sectional regressions suggest that macroeconomic factors alone can explain 27.9% of the variation in credit spreads for investment grade bonds and 44.4% for high yield ones. When testing the direction of causation, we find that for long and medium term maturity investment grade indices we reject the null hypothesis that macroeconomic variables do not Granger-cause changes in credit spreads, but not for short term maturities or the high yield sector. Indeed, results from the high yield category provide evidence that non-investment grade spreads may be a good proxy for predicting overall financial conditions. Secondly, the relation between credit spreads and equities, together with their implied and historical volatilities, is examined. This chapter constitutes an effort to fill the gap in the existing literature, which has focused mainly on bond returns or yield changes, while very limited work has been done in modelling credit spread changes. Empirical evidence points to the fact that debt markets, not only in the US but also in Europe and elsewhere, seem to be greatly affected by movements in the equity markets. If that is the case, we should expect changes in equity prices to affect changes in credit spreads. This assumption is tested on a cross-sectional and time series basis, for quarterly and monthly frequencies, using company-specific equity prices against the respective credit spreads, but also by including equity and volatility indices. We find that there is a negative relation between credit spread and equity changes, irrespective of maturity or rating category. Results provided by univariate regressions, based on changes in equity prices alone, explain half of the variation of B-rated corporate spreads. Results affirm the positive relation between implied volatilities and credit spread changes and their high explanatory power, while findings derived from historical volatilities, although statistically significant, do not even marginally support the hypothesis of explaining the variation in credit spreads. In particular, results from pooled regressions suggest that when implied volatilities are replaced by historical ones, adjusted R²s fall to 6% and 28% for the investment and non-investment grade samples respectively (from 25% and 50.3% when implied volatilities are considered). Results from OLS regressions suggest that equity variables explain at best 44% of the variation for short term maturing indices, and 35% and 37% for medium and long term maturing indices, as reflected by the adjusted R²s. We also strongly reject the null hypothesis that implied volatilities do not Granger-cause changes in credit spreads, but only with regard to short and medium term maturities. The next chapter of the thesis focuses on how changes in a company's financials, as presented by ratios, actually influence changes in credit spreads.
The reason for including this chapter is that although traditional ratio analysis has been widely investigated, it has mainly been tested within the context of default risk, while very limited literature exists on the use of traditional credit risk analysis in determining credit spread changes. Cross-sectional analysis is employed in this chapter to test the hypothesis that credit spread changes are influenced by changes in accounting factors, in both the investment and high yield categories. On a multivariate basis, we find that 63.5% of the variation in high yield credit spreads is explained by changes in financial ratios, as reflected by the adjusted R², compared with an adjusted R² of 19.2% for investment grade companies. Consistently, in the randomly selected group of companies, we find that traditional ratios can explain one third of the variation in credit spreads in the high yield sector, although less than 10% in the investment grade sample. A reason for the higher explanatory power in the high yield sector is the use of ratios adjusted to reflect depreciation and amortisation expenses, which has not been considered before. The most statistically and economically significant coefficient was obtained from current market capitalisation, which was used as a proxy for firm size. The last part of the thesis constitutes an effort to combine all the above factors (macroeconomic, equity and financials) in order to forecast credit spread changes one and two years ahead. We show that, in a multiple regression context, the results are consistent with previous chapters and indeed highly significant in explaining credit spread variation, irrespective of the time period tested. For the total sample we obtain an adjusted R² of 95% or 52% for the weighted and unweighted statistics respectively. A robust model is identified for forecasting credit spread changes one year ahead, with the employment of the dynamic solution method. The accuracy of the model does not fall below 85% within the first year, and we choose the GLS method adjusted for heteroscedasticity as the most rigorous method for estimating coefficients, since it consistently provides more conservative forecasts.
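As a rough illustration of the kind of regression used throughout, a hedged sketch of regressing credit spread changes on macroeconomic factors with simulated data; the variable names, coefficient signs and magnitudes are assumptions, not the thesis's estimates.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 120                                          # e.g. ten years of monthly observations (assumed)

# hypothetical explanatory variables: consumer confidence, term spread, industrial production
X = rng.normal(size=(n, 3))
beta_true = np.array([-0.30, -0.10, 0.05])       # assumed signs, for illustration only
d_spread = X @ beta_true + rng.normal(scale=0.2, size=n)

model = sm.OLS(d_spread, sm.add_constant(X)).fit()
print(model.summary())                           # coefficients, t-statistics and adjusted R-squared
```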
APA, Harvard, Vancouver, ISO, and other styles
7

Zheng, Teng. "Model risk in financial modelling." Thesis, University of Kent, 2017. https://kar.kent.ac.uk/66707/.

Full text
Abstract:
Motivated by current post-crisis discussions and the corresponding shift in regulatory requirements, this thesis is dedicated to the study of model risk in financial modelling. It is well known that the majority of financial quantities involved in asset pricing, trading, and risk management activities depend on the chosen financial models. This gives rise to model risk in all financial activities. Even when the chosen model form is appropriate, model outputs are still subject to parameter estimation uncertainty. Therefore, among the different sources of model risk, we mainly focus on investigating the impact of parameter estimation risk and model selection risk in different financial models. The models investigated in this thesis are key models in option pricing, credit risk management, the stochastic process of security returns and hedge fund return forecasting. We propose a solution which naturally stems from the Bayesian framework. Regarding parameter estimation risk, instead of focusing on a point estimate, it is possible to gauge the rich information about parameter uncertainty from the posterior distribution of the parameters. The subsequent impact on final model outputs can then be easily assessed by passing the posterior distribution of the parameters through the model. Depending on the related financial activities, model users may find it useful to adopt the estimated value at a certain percentile (e.g. 97.5%) of the posterior distribution as an overlay to the estimated mean value. Where more than one candidate model is considered, the posterior or predictive probability of a candidate model, derived from the likelihood of the model output in fitting the data, is applied in a model averaging exercise to account for model selection risk.
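A toy sketch of the Bayesian point made above: push posterior draws of a parameter through the model and read off a high percentile of the output, rather than relying on a single plug-in estimate. The posterior, the model (a simple normal VaR) and the 97.5% overlay are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)

# Pretend these are MCMC draws from the posterior of an annual volatility parameter.
posterior_sigma = rng.normal(0.20, 0.02, size=10_000)

def var_99(sigma, mu=0.05):
    """One-year 99% normal VaR of a unit position, as a function of the volatility parameter."""
    return -(mu + norm.ppf(0.01) * sigma)

outputs = var_99(posterior_sigma)                        # push every posterior draw through the model
print("plug-in VaR:          ", round(var_99(posterior_sigma.mean()), 4))
print("97.5th pct of the VaR:", round(np.quantile(outputs, 0.975), 4))
```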
APA, Harvard, Vancouver, ISO, and other styles
8

Kratz, Gutstav. "Risk Modelling in Payment Guarantees." Thesis, KTH, Matematisk statistik, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-229418.

Full text
Abstract:
The Swedish Export Credit Agency (EKN) issues payment guarantees to Swedish companies that face the risk of non-payment in export transactions. Commitments are typically correlated, as company defaults are driven not only by factors specific to the company but also by common factors such as the economic cycle or regional conditions. When deciding how much capital should be reserved to remain solvent even in an unlikely scenario, this has to be accounted for in order not to underestimate financial risks. By studying models for credit risk and the research available in the area, the popular CreditRisk+ has been chosen as a suitable model for EKN to use in risk assessments. The model, together with a few customisations, is described in detail and tested on data from EKN.
The Swedish Export Credit Agency (Exportkreditnämnden, EKN) issues payment guarantees to Swedish exporters who face the risk of non-payment. Defaults of different counterparties are typically correlated, and when assessing the risk in the portfolio of guarantees this must be taken into account so as not to underestimate the risk substantially. By studying existing credit risk models and the research available in the area, a model has been proposed for use in EKN's risk assessments. The model is described in detail and tested on data from EKN.
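For orientation, a simplified one-sector sketch in the spirit of CreditRisk+, where default counts are Poisson conditional on a gamma-distributed systematic factor, which induces the correlation between commitments mentioned above; exposures, default probabilities and the factor variance are assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

n_firms = 200
pd_i = np.full(n_firms, 0.02)                # assumed default probabilities
exposure = rng.uniform(0.5, 5.0, n_firms)    # assumed exposures (loss given default)
factor_sigma2 = 0.5                          # assumed variance of the sector factor

def portfolio_loss(rng):
    s = rng.gamma(1 / factor_sigma2, factor_sigma2)   # systematic factor, E[s]=1, Var[s]=factor_sigma2
    defaults = rng.poisson(pd_i * s)                   # conditionally independent Poisson default counts
    return float(exposure @ defaults)

losses = np.array([portfolio_loss(rng) for _ in range(50_000)])
print("expected loss:", losses.mean().round(2))
print("99.9% VaR    :", np.quantile(losses, 0.999).round(2))
```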
APA, Harvard, Vancouver, ISO, and other styles
9

Kasprowicz, Tomasz. "Threshold Theory--modelling risk attitude /." Available to subscribers only, 2008. http://proquest.umi.com/pqdweb?did=1650506301&sid=11&Fmt=2&clientId=1509&RQT=309&VName=PQD.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Tran, Cao Son. "Structures in credit risk modelling." Thesis, Queensland University of Technology, 2021. https://eprints.qut.edu.au/207989/1/Cao%20Son_Tran_Thesis.pdf.

Full text
Abstract:
The thesis is concerned with the design of conceptual structures in credit risk modelling. The focus is on designing mathematical constructs that serve as a unified framework for reasoning about credit risk modelling. Three contributions are made to this area of research: category theory, which provides a powerful tool for studying the relations between the common structures underlying credit risk modelling; a stacking model, to address issues of inconsistent and biased performance measurement; and the Kelly criterion, which shifts the focus from dichotomous classification to optimal credit risk allocation.
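A minimal example of the Kelly criterion mentioned in the abstract, in its simplest binary form; the repayment probability and return below are assumptions chosen only for illustration.

```python
def kelly_fraction(p_win: float, b: float) -> float:
    """Kelly fraction for a bet that pays b per unit staked with probability p_win
    and loses the stake otherwise: f* = p - (1 - p) / b."""
    return p_win - (1.0 - p_win) / b

# e.g. lending to a borrower who repays with probability 0.97 at a net return of 8%:
# roughly 60% of capital under these assumed odds; a negative value would mean do not take the position.
print(round(kelly_fraction(0.97, 0.08), 3))
```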
APA, Harvard, Vancouver, ISO, and other styles
11

Sylta, Øyvind. "Hydrocarbon migration modelling and exploration risk." Doctoral thesis, Norwegian University of Science and Technology, Department of Geology and Mineral Resources Engineering, 2004. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-1492.

Full text
APA, Harvard, Vancouver, ISO, and other styles
12

Larneback, Marcus. "Modelling Operational Risk using Actuarial Methods." Thesis, Umeå universitet, Institutionen för matematik och matematisk statistik, 2006. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-51340.

Full text
Abstract:
Within the financial industry Operational Risk is a relatively new concept, but in recent years it has gained more attention due to prior economically devastating events; these are events that cannot be categorized as market or credit risks. The purpose of this thesis is to study the Loss Distribution Approach (LDA). This is one of the more rigorous models proposed by the Basel Committee on Banking Supervision for calculating the required capital charge that should be set aside to cover future operational loss events within financial institutions. The focus is on the close connection between the LDA and modelling techniques used in the field of insurance mathematics. A summary of relevant theoretical results underlying the (aggregate) loss distribution is given, and then detailed ways to model the specific components are provided. More specifically, certain techniques are emphasized: for example, extreme value theory is used to model certain loss severities, and generalized linear models are used to link event frequency to possible causes, thereby also allowing for scenario-based modelling. The models are illustrated in a numerical example where parameter calibration and exploratory techniques are addressed, and several useful tools are identified. Finally, capital charges in terms of VaR and CVaR measures are numerically calculated and compared with asymptotic approximations.
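A minimal Monte Carlo sketch of the aggregate (compound) loss distribution at the heart of the LDA, with Poisson frequency and lognormal severity, and the VaR and CVaR capital measures read off the simulated distribution; the parameters are assumptions, not calibrated values.

```python
import numpy as np

rng = np.random.default_rng(5)

lam = 25.0                      # assumed annual loss frequency (Poisson mean)
mu_sev, sigma_sev = 10.0, 2.0   # assumed lognormal severity parameters

def annual_loss(rng):
    n = rng.poisson(lam)
    return rng.lognormal(mu_sev, sigma_sev, size=n).sum()

losses = np.array([annual_loss(rng) for _ in range(100_000)])
alpha = 0.999
var = np.quantile(losses, alpha)
cvar = losses[losses >= var].mean()            # expected loss given the VaR level is breached
print(f"VaR {alpha:.1%}: {var:,.0f}   CVaR: {cvar:,.0f}")
```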
APA, Harvard, Vancouver, ISO, and other styles
13

Kwok, Ho King Calvin (Actuarial Studies, Australian School of Business, UNSW). "Energy price modelling and risk management." Awarded by: University of New South Wales, Actuarial Studies, 2007. http://handle.unsw.edu.au/1959.4/40602.

Full text
Abstract:
This thesis focuses on the development of a forecasting model for short- to medium-term electricity spot prices, based on modelling the dynamics of the supply and demand functions. It is found that the equilibrium assumption frequently adopted in electricity price models does not always hold; to overcome this problem, a notional demand process derived from the market clearing condition is proposed. Not only is this demand process able to capture all the price-affecting factors in one variable, but it also allows the equilibrium assumption to be satisfied and a spot price model to be built, using any appropriate form of hypothetical supply function. In addition, this thesis presents a model for approximating and modelling the bid stacks by capturing the points that govern their shape and location. Integrating these two models provides a realistic model that has a mean absolute percentage error of approximately 19% and 24% for week- and month-ahead forecasts respectively, when applied to the New South Wales (NSW) half-hourly electricity spot prices. Additionally, the density forecasting evaluation method proposed by Diebold et al. (1998) is employed in the thesis to assess the performance of the model. Besides the development of a spot price model, a two-part empirical study is made of the prices of NSW electricity futures contracts. The first part of the study develops a method based on the principle of certainty equivalence, which enables the market utility function to be recovered from a set of futures market quotes. The method is tested with two different sets of simulated data and works as expected. However, it is unable to obtain useful results from the NSW market quotes due to the poor data quality. The second part uses a regression method to investigate the relationship between futures prices and the descriptive statistics of the underlying spot prices. The result suggests that futures prices in NSW are linear combinations of the median and volatility of the final payoff.
APA, Harvard, Vancouver, ISO, and other styles
14

Jackson, Alexander. "Interest rate and credit risk modelling." Thesis, University of Oxford, 2004. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.400043.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Ma, Zishun. "Topics in financial market risk modelling." Thesis, University of Newcastle Upon Tyne, 2012. http://hdl.handle.net/10443/1675.

Full text
Abstract:
The growth of the financial risk management industry has been motivated by the increased volatility of financial markets combined with the rapid innovation of derivatives. Since the 1970s, several financial crises have occurred globally with devastating consequences for financial and non-financial institutions and for the real economy. The most recent US subprime crisis led to enormous losses for financial and non-financial institutions and to a recession in many countries including the US and UK. A common lesson from these crises is that advanced financial risk management systems are required. Financial risk management is a continuous process of identifying, modeling, forecasting and monitoring risk exposures arising from financial investments. The Value at Risk (VaR) methodology has served as one of the most important tools used in this process. This quantitative tool, which was first invented by JPMorgan in its Risk-Metrics system in 1995, has undergone a considerable revolution and development during the last 15 years. It has now become one of the most prominent tools employed by financial institutions, regulators, asset managers and nonfinancial corporations for risk measurement. My PhD research undertakes a comprehensive and practical study of market risk modeling in modern finance using the VaR methodology. Two newly developed risk models are proposed in this research, which are derived by integrating volatility modeling and the quantile regression technique. Compared to the existing risk models, these two new models place more emphasis on dynamic risk adjustment. The empirical results on both real and simulated data shows that under certain circumstances, the risk prediction generated from these models is more accurate and efficient in capturing time varying risk evolution than traditional risk measures. Academically, the aim of this research is to make some improvements and extensions of the existing market risk modeling techniques. In practice, the purpose of this research is to support risk managers developing a dynamic market risk measurement system, which will function well for different market states and asset categories. The system can be used by financial institutions and non-financial institutions for either passive risk measurement or active risk control.
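For context, the two textbook VaR estimators that risk models of this kind refine, evaluated on simulated fat-tailed returns; the window length, return distribution and confidence level are assumptions.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(6)
returns = rng.standard_t(df=5, size=1000) * 0.01     # hypothetical daily returns, fat-tailed

alpha = 0.99
hist_var = -np.quantile(returns, 1 - alpha)                           # historical simulation VaR
param_var = -(returns.mean() + norm.ppf(1 - alpha) * returns.std())   # variance-covariance (normal) VaR
print(f"historical 99% VaR: {hist_var:.4f}   parametric 99% VaR: {param_var:.4f}")
```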
APA, Harvard, Vancouver, ISO, and other styles
16

Bedendo, Mascia. "Density forecasting in financial risk modelling." Thesis, University of Warwick, 2003. http://wrap.warwick.ac.uk/2661/.

Full text
Abstract:
As a result of an increasingly stringent regulation aimed at monitoring financial risk exposures, nowadays the risk measurement systems play a crucial role in all banks. In this thesis we tackle a variety of problems related to density forecasting that are fundamental to market risk managers. The computation of risk measures (e.g. Value-at-Risk) for any portfolio of financial assets requires the generation of density forecasts for the driving risk factors. Appropriate testing procedures must then be identified for an accurate appraisal of these forecasts. We start our research by assessing whether option-implied densities, which constitute the most obvious forecasts of the distribution of the underlying asset at expiry, do actually represent unbiased forecasts. We first extract densities from options on currency and equity index futures, by means of both traditional and original specifications. We then appraise them, via rigorous density forecast evaluation tools, and we find evidence of the presence of biases. In the second part of the thesis, we focus on modelling the dynamics of the volatility curve, in order to measure the vega risk exposure for various delta-hedged option portfolios. We propose to use a linear Kalman filter approach, which gives more precise forecasts of the vega risk exposure than alternative, well-established models. In the third part, we derive a continuous time model for the dynamics of equity index returns from a data set of 5-minute returns. A model inferred from high-frequency data is well suited to the short horizons typical of risk measure calculations. The last part of our work deals with evaluating density forecasts of the joint distribution of the risk factors. We find that, given certain specifications for the multivariate density forecast, a goodness-of-fit procedure based on the Empirical Characteristic Function displays good statistical properties in detecting misspecifications of different natures in the forecasts.
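A sketch of the standard probability integral transform (PIT) check used in density forecast evaluation: if the forecast densities are correct, the transformed observations are i.i.d. uniform. The forecaster below is deliberately misspecified (normal forecasts for fat-tailed outcomes) to show how a bias is detected; all distributions are assumptions.

```python
import numpy as np
from scipy.stats import norm, kstest

rng = np.random.default_rng(7)

# Realised outcomes are fat-tailed, but the density forecast (wrongly) assumes normality.
realised = rng.standard_t(df=4, size=2000)
pit = norm.cdf(realised)                 # probability integral transform under the forecast CDF

# Under a correct forecast the PIT values are Uniform(0,1); check with a Kolmogorov-Smirnov test.
print(kstest(pit, "uniform"))
```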
APA, Harvard, Vancouver, ISO, and other styles
17

Parslow, Gary Iain. "Erosion risk modelling of subsea components." Thesis, Cranfield University, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.267498.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Gallagher, Raymond. "Uncertainty modelling in quantitative risk analysis." Thesis, University of Liverpool, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.367676.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Xu, Dapeng. "Essays on theoretical credit risk modelling." Thesis, University of Strathclyde, 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.275173.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Apere, Pius Oyabramo. "Modelling life insurance new business risk." Thesis, City University London, 2005. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.435038.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Ledezma, Rosalía Diana Díaz. "Three studies in credit risk modelling." Thesis, City University London, 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.397858.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Hunt, A. "Mortality modelling and longevity risk management." Thesis, City University London, 2015. http://openaccess.city.ac.uk/13532/.

Full text
Abstract:
The 20th century witnessed some of the largest and most widespread gains in human longevity ever seen, and these show no sign of slowing down during the early years of the 21st century. The risk of further, higher-than-anticipated improvements in life expectancy, known as longevity risk, is now a major and growing field of study. This thesis investigates a number of theoretical and practical problems within the field of longevity risk: the structure and identifiability issues within many of the most common models used to study mortality rates, the construction of new mortality models, the projection of these models into the future, the impact of differences in the level and evolution of mortality rates in different populations (such as pension schemes), and the market-consistent valuation and measurement of risk in longevity-linked liabilities and securities.
APA, Harvard, Vancouver, ISO, and other styles
23

Hossain, Nafees. "Accurate portfolio risk-return structure modelling." Doctoral thesis, University of Cape Town, 2006. http://hdl.handle.net/11427/18423.

Full text
Abstract:
Markowitz's modern portfolio theory has played a vital role in investment portfolio management, which is constantly pushing the development of volatility models, particularly the stochastic volatility model, which reveals the dynamics of conditional volatility. Financial time series and volatility modelling have become one of the hot spots in operations research. In this thesis, one of the areas we explore is the theoretical formulation of the optimal portfolio selection problem in the Ito calculus framework; in particular, a stochastic variational calculus problem, i.e., seeking the optimal stochastic volatility diffusion family for facilitating the best portfolio selection identified under continuous-time stochastic optimal control theory. One of the properties this study examines is the left-shifting role of the GARCH(1,1) (Generalised Autoregressive Conditional Heteroskedasticity) model's efficient frontier. This study considers many instances where the left-shifting superior behaviour of the GARCH(1,1) frontier is observed. One such instance is when GARCH(1,1) is compared with other volatility modelling extensions of the GARCH environment in a single index framework. This study demonstrates the persistence of the superiority of the GARCH(1,1) frontier within both multiple and single index contexts of modern portfolio theory. Many portfolio optimisation models are investigated, particularly the Markowitz model and the Sharpe Multiple and Single Index models. Includes bibliographical references (p. 313-323).
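A small sketch of the GARCH(1,1) variance recursion central to the abstract, simulating a return path that exhibits volatility clustering; the parameter values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(8)
omega, alpha, beta = 1e-6, 0.08, 0.90      # assumed GARCH(1,1) parameters (alpha + beta < 1)
n = 2000

sigma2 = np.empty(n)
eps = np.empty(n)
sigma2[0] = omega / (1 - alpha - beta)      # start at the unconditional variance
eps[0] = rng.normal(0, np.sqrt(sigma2[0]))
for t in range(1, n):
    sigma2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sigma2[t - 1]   # variance recursion
    eps[t] = rng.normal(0, np.sqrt(sigma2[t]))

print("annualised volatility range:",
      (np.sqrt(sigma2.min() * 252).round(3), np.sqrt(sigma2.max() * 252).round(3)))
```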
APA, Harvard, Vancouver, ISO, and other styles
24

Singh, Abhay Kumar. "Modelling Extreme Market Risk - A Study of Tail Related Risk Measures." Thesis, Edith Cowan University, Research Online, Perth, Western Australia, 2011. https://ro.ecu.edu.au/theses/417.

Full text
Abstract:
Market risk modelling is one of the most dynamic domains in finance. Risk is the uncertainty that affects the values of assets in the system in an unknown fashion causing fluctuations in their values and in investment outcomes. Market risk is defined as the losses due to fluctuations in the prices of financial assets which are caused by changing market conditions. Market risk modelling comprises tools and techniques which quantify the risk associated with financial instruments. Risk quantification is necessary to devise strategies such as hedging or diversification against the risk, to avoid severe losses. With the recent financial market events like the Global Financial Crisis, there is a need to evaluate the traditional risk return relationships presented in Asset Pricing models and more sophisticated risk modelling tools like Value at Risk (VaR). Along with Asset Pricing and VaR modelling another important risk issue between financial assets is the asymptotic tail dependence, which plays a vital role in accurate risk measurement in portfolio selection and hedging amongst other considerations. The usual measure of dependence, the Pearson Correlation coefficient works on the assumption of normality in the data distribution and hence is unable to capture the tail dependence between financial assets which is an important characteristic for tail risk modelling. The research presented in this dissertation models the risk quantification techniques of Asset Pricing, VaR modelling and Tail dependence, with the more sophisticated statistical tools of Quantile Regression and Extreme Value Theory (EVT), which are particularly useful in modelling the tail behaviour of the distributions. The research targets four broad objectives to evaluate extreme risk and dependence measures in the Australian stock market which are realised with the robust techniques of Quantile Regression and EVT. The thesis comprises six chapters with chapter-1 introducing the thesis presenting the driving motivations for the research and the four major objectives (which are detailed in individual chapters following chapter-1) along with the contribution of the research and finally chapter-6 presenting the conclusion. The structure of rest of the thesis is also outlined in chapter-1.
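A sketch of the peaks-over-threshold step used in EVT-based tail risk measurement: fit a generalised Pareto distribution to losses above a threshold and read off tail VaR and expected shortfall with the standard POT formulas; the data and threshold are simulated assumptions.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(9)
losses = rng.standard_t(df=4, size=5000) * 0.01            # hypothetical daily losses
u = np.quantile(losses, 0.95)                              # assumed threshold (95th percentile)
exceedances = losses[losses > u] - u

xi, loc, beta = genpareto.fit(exceedances, floc=0)         # fit GPD to the excesses over the threshold
n, n_u, alpha = len(losses), len(exceedances), 0.99

# Standard POT formulas for tail VaR and expected shortfall (valid for 0 < xi < 1).
var = u + beta / xi * ((n / n_u * (1 - alpha)) ** (-xi) - 1)
es = (var + beta - xi * u) / (1 - xi)
print(f"EVT VaR 99%: {var:.4f}   EVT ES 99%: {es:.4f}")
```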
APA, Harvard, Vancouver, ISO, and other styles
25

Schmelck, Anders. "Modelling risk in multi asset-class portfolios." Thesis, Norges teknisk-naturvitenskapelige universitet, Institutt for matematiske fag, 2010. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-14977.

Full text
Abstract:
Using a simulation-based model, with the Black-Scholes framework for equity and the LIBOR Market Model for interest rates, we study market risk in multi asset-class portfolios, with static and dynamic weighting. The risk measures considered are Value-at-Risk and Expected-Tail-Loss. The theoretical foundation is introduced and imperfections in the models and their assumptions are pointed out. The validity of the models and risk measures is tested using a backtesting procedure against data ranging from September 1999 to September 2009, with particular emphasis on the turbulent period of 2007 to September 2009. The results indicate that the models perform slightly worse on the portfolio with the added complexity of a dynamic weighting regime. No evidence is found of the models performing less satisfactorily under the latest financial turbulence.
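A sketch of the kind of backtest described above: count the days on which realised losses exceed the VaR forecast and compare with the number expected at the chosen confidence level; the returns and forecasts are simulated assumptions.

```python
import numpy as np
from scipy.stats import binomtest

rng = np.random.default_rng(10)
n_days, alpha = 2500, 0.99

returns = rng.normal(0, 0.01, n_days)            # hypothetical realised portfolio returns
var_forecast = np.full(n_days, 0.0233)           # hypothetical (constant) 99% VaR forecasts

exceptions = int((returns < -var_forecast).sum())
expected = n_days * (1 - alpha)
test = binomtest(exceptions, n_days, 1 - alpha)  # Kupiec-style unconditional coverage check
print(f"exceptions: {exceptions} (expected ~{expected:.0f}), p-value: {test.pvalue:.3f}")
```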
APA, Harvard, Vancouver, ISO, and other styles
26

Katsigiannakis, Konstantinos. "Electricity price risk : modelling the supply stack." Thesis, Imperial College London, 2006. http://hdl.handle.net/10044/1/7429.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Yu, Lanhua. "Risk management : modelling dependence between asset returns." Thesis, Imperial College London, 2005. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.420966.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Jobst, Norbert Josef. "On credit risk modelling, measurement and optimisation." Thesis, Brunel University, 2002. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.270193.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Cheung, W. "Credit risk modelling with default-triggered acquisition." Thesis, University of Cambridge, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.597588.

Full text
Abstract:
In this thesis, I propose that, given the opportunities for default-triggered acquisition (DTA), it is liquidation risk rather than just default risk that should really concern creditors, particularly senior debt investors. DTA is an important market phenomenon with significant implications for credit risk, but is under-researched in the credit risk modelling literature. This thesis attempts to bridge the gap. It develops three diffusion-based models: one for risky bond pricing (Chapter 3), and two for the potential acquirers' positions, i.e. the European type of option-to-acquire (Chapter 6) and the Quasi-American type (Chapter 7). Far from being three unrelated models, these are developed to serve one common theme: credit risk modelling. The research pursues the establishment of a risky bond pricing model under the assumption that DTA is a wise decision, as opposed to pre-default acquisition; this assumption is justified through an analysis of the Quasi-American-European spread. All models are implemented numerically in this thesis, and the numerical results derived are intuitive. The following summarises the key models developed in this thesis. The risky bond pricing model: Chapter 3 develops a structural model of the defaultable bond, taking into account the possibility that the borrowing firm might be acquired following a default. In this model, post-default acquisition is treated as an event triggered where the alternative valuation of the firm's assets exceeds the failing firm's value and covers the acquisition costs (which are essentially the conceded debt liability). Analytical results are derived for situations where the default time is assumed to be either fixed (as in Merton (1974)) or random (as in Black and Cox (1976)) and there are one or more alternative valuations. The implementation and numerical analysis of the model are presented in Chapters 4 and 5 respectively. The European option-to-acquire model: instead of modelling from the bond investors' point of view, Chapter 6 analyses the DTA problem from the perspective of potential acquirers, and develops a structural model of their position, i.e. the European option-to-acquire, with analytical results. A comprehensive numerical comparative static analysis is also undertaken, which provides some insights into the Greeks of the real option. The Quasi-American option-to-acquire model: the risky bond pricing model addresses the DTA issue under the assumption that acquisitions only take place at default; Chapter 7 aims to examine whether this assumption is viable or not. It considers pre-default acquisition opportunities, leading from the extension of the European option to the corresponding Quasi-American option-to-acquire. Numerical results are derived for a comprehensive numerical comparative static analysis. The comparison of the European and Quasi-American positions sheds some light on the circumstances in which DTAs become important.
APA, Harvard, Vancouver, ISO, and other styles
30

Weigel, Peter. "Term structure modelling : pricing and risk management." Thesis, University of Warwick, 2003. http://wrap.warwick.ac.uk/63584/.

Full text
Abstract:
This thesis is about interest rate modelling with applications in pricing and risk management of interest rate derivatives and portfolios. The first part of the thesis is developed within the random field framework suggested by Kennedy (1994). The framework is rich enough to be used for both pricing and risk management, but we believe its real value lies in the latter. Our main objective is to construct infinite-factor Gaussian field models that can fit the sample covariance matrices observed in the market. This task has not previously been addressed by the work on field methodology. We develop three methodologies for constructing strictly positive definite covariance functions, characterising infinite-factor Gaussian fields. We test all three constructions on the sample covariance and correlation matrices obtained from US and Japanese bond market data. The empirical and numerical tests suggest that these classes of field models present very satisfactory solutions to the posed problem. The models we develop make the random field methodology a much more practical tool. They allow calibration of field models to key market information, namely the covariation of the yields. The second part of the thesis deals with pricing kernel (potential) models of the term structure. These were first introduced by Constantinides (1992), but were subsequently overshadowed by the market models, which were developed by Miltersen et al. (1997) and Brace et al. (1997), and are very popular among practitioners. Our objective is to construct a class of arbitrage-free term structure models that enjoy the same ease of calibration as the market models, but do not suffer from non-Markov evolution as is the case with the market models. We develop a class of models within the pricing kernel framework; i.e., we model the pricing kernel directly, and not a particular interest rate or a set of rates. The construction of the kernel is explicitly linked to the calibrating set of instruments. Thus, once the kernel is constructed it will price the chosen set of instruments correctly, and have a low-dimensional Markov structure. We test our model on yield, at-the-money cap, caplet implied volatility surface, and swaption data. We achieve a very good quality of fit.
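For illustration, a small sketch of one classical strictly positive definite covariance function (the exponential kernel) of the kind the first part of the thesis generalises, together with a draw from the corresponding Gaussian field along a maturity grid; the grid and length scale are assumptions.

```python
import numpy as np

# Grid of bond maturities (years); the exponential kernel below is a textbook example of a
# strictly positive definite covariance function for a Gaussian random field.
maturities = np.linspace(0.5, 30.0, 60)
length_scale = 5.0                                    # assumed correlation length

K = np.exp(-np.abs(maturities[:, None] - maturities[None, :]) / length_scale)
print("smallest eigenvalue:", np.linalg.eigvalsh(K).min())    # > 0 confirms positive definiteness

# One draw from the corresponding zero-mean Gaussian field on the maturity grid.
rng = np.random.default_rng(15)
sample_path = np.linalg.cholesky(K + 1e-10 * np.eye(len(K))) @ rng.normal(size=len(K))
print("sample field values (first 5):", sample_path[:5].round(3))
```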
APA, Harvard, Vancouver, ISO, and other styles
31

Gonzalez, Jhonny. "Modelling and controlling risk in energy systems." Thesis, University of Manchester, 2015. https://www.research.manchester.ac.uk/portal/en/theses/modelling-and-controlling-risk-in-energy-systems(b575d2c7-154f-4aca-b15e-4b99e0b3c661).html.

Full text
Abstract:
The Autonomic Power System (APS) grand challenge was a multi-disciplinary EPSRC-funded research project that examined novel techniques that would enable the transition between today's energy network and the highly uncertain and complex network of 2050. Being part of the APS, this thesis reports on the sub-project 'RR2: Avoiding High-Impact Low Probability events'. The goal of RR2 is to develop new algorithms for controlling risk exposure to high-impact low probability (Hi-Lo) events through the provision of appropriate risk-sensitive control strategies. Additionally, RR2 is concerned with new techniques for identifying and modelling risk in future energy networks, in particular the risk of Hi-Lo events. In this context, this thesis investigates two distinct problems arising from energy risk management. On the one hand, we examine the problem of finding managerial strategies for exercising the operational flexibility of energy assets. We look at this problem from a risk perspective, taking into account non-linear risk preferences of energy asset managers. Our main contribution is the development of a risk-sensitive approach to the class of optimal switching problems. By recasting the problem as an iterative optimal stopping problem, we are able to characterise the optimal risk-sensitive switching strategies. As a byproduct, we obtain a multiplicative dynamic programming equation for the value function, upon which we propose a numerical algorithm based on least squares Monte Carlo regression. On the other hand, we develop tools to identify and model the risk factors faced by energy asset managers. For this, we consider a class of models consisting of superpositions of Gaussian and non-Gaussian Ornstein-Uhlenbeck processes. Our main contribution is the development of a Bayesian methodology based on Markov chain Monte Carlo (MCMC) algorithms to make inference in this class of models. In extensive simulations, we demonstrate the robustness and efficiency of the algorithms to different data features. Furthermore, we construct a diagnostic tool based on Bayesian p-values to check goodness-of-fit of the models within a Bayesian framework. We apply this tool to MCMC results from fitting historical electricity and gas spot price data sets corresponding to the UK and German energy markets. Our analysis demonstrates that the MCMC-estimated models are able to capture not only long- and short-lived positive price spikes, but also short-lived negative price spikes which are typical of UK gas prices and German electricity prices. Combining the solutions to the two problems above, we strive to capture the interplay between risk, uncertainty, flexibility and performance in various applications to energy systems. In these applications, which include power stations, energy storage and district energy systems, we consistently show that our risk management methodology offers a tradeoff between maximising average performance and minimising risk, while accounting for the jump dynamics of energy prices. Moreover, the tradeoff is achieved in such a way that the benefits in terms of risk reduction outweigh the loss in average performance.
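A sketch of exact simulation of a single Gaussian Ornstein-Uhlenbeck component of the kind superposed in the models above; the mean-reversion parameters are assumptions, not the MCMC estimates reported in the thesis.

```python
import numpy as np

rng = np.random.default_rng(11)
theta, mu, sigma = 5.0, 3.5, 0.8       # assumed mean-reversion speed, level and volatility
dt, n = 1 / 365, 365                   # daily steps over one year

x = np.empty(n)
x[0] = mu
decay = np.exp(-theta * dt)
stdev = sigma * np.sqrt((1 - decay ** 2) / (2 * theta))    # exact transition standard deviation
for t in range(1, n):
    x[t] = mu + (x[t - 1] - mu) * decay + stdev * rng.normal()

print("simulated path range:", (x.min().round(3), x.max().round(3)))
```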
APA, Harvard, Vancouver, ISO, and other styles
32

Li, Wang. "Default contagion modelling and counterparty credit risk." Thesis, University of Manchester, 2017. https://www.research.manchester.ac.uk/portal/en/theses/default-contagion-modelling-and-counterparty-credit-risk(76eee42a-d83d-4af9-956e-050615298b65).html.

Full text
Abstract:
This thesis introduces models for pricing credit default swaps (CDS) and evaluating the counterparty risk when buying a CDS in the over-the-counter (OTC) market from a counterparty subject to default risk. Rather than assuming that the default of the referencing firm of the CDS is independent of the trading parties in the CDS, this thesis proposes models that capture the default correlation amongst the three parties involved in the trade, namely the referencing firm, the buyer and the seller. We investigate how the counterparty risk that CDS buyers face can be affected by default correlation and how their balance sheet could be influenced by the changes in counterparty risk. The correlation of corporate default events has been frequently observed in credit markets due to the close business relationships of certain firms in the economy. One of the many mathematical approaches to model that correlation is default contagion. We propose an innovative model of default contagion which provides more flexibility by allowing the affected firm to recover from a default contagion event. We give a detailed derivation of the partial differential equations (PDEs) for valuing both the CDS and the credit value adjustment (CVA). Numerical techniques are exploited to solve these PDEs. We compare our model against other models from the literature when measuring the CVA of an OTC CDS where the default risk of the referencing firm and the CDS seller is correlated. Further, the model is extended to incorporate economy-wide events that damage all firms' credit at the same time; this is another kind of default correlation. Advanced numerical techniques are proposed to solve the resulting partial integro-differential equations (PIDEs). We focus on investigating the different roles that default contagion and economy-wide events play in shaping the default correlation and counterparty risk. We complete the study by extending the model to include bilateral counterparty risk, which considers the default of the buyer and the correlation among the three parties. Again, our extension leads to a higher-dimensional problem that we must tackle with hybrid numerical schemes. The CVA and debit value adjustment (DVA) are analysed in detail, and we are able to value the profit and loss to the investor's balance sheet due to CVA and DVA under different market circumstances, including default contagion.
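For orientation only, the textbook unilateral CVA approximation as a discrete sum over exposure dates, CVA ≈ (1 - R) Σ DF·EE·ΔPD; this is not the thesis's PDE or contagion model, and all inputs below are assumptions.

```python
import numpy as np

recovery = 0.4
times = np.arange(0.5, 5.5, 0.5)                    # semi-annual exposure dates (years)
df = np.exp(-0.02 * times)                          # assumed flat 2% discount curve
ee = 1.5 * np.sqrt(times)                           # assumed expected exposure profile
hazard = 0.03                                       # assumed flat counterparty hazard rate

surv = np.exp(-hazard * np.concatenate(([0.0], times)))
dpd = surv[:-1] - surv[1:]                          # default probability in each interval

cva = (1 - recovery) * np.sum(df * ee * dpd)
print(f"unilateral CVA: {cva:.4f}")
```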
APA, Harvard, Vancouver, ISO, and other styles
33

Hunt, Laurence T. "Modelling human decision under risk and uncertainty." Thesis, University of Oxford, 2011. http://ora.ox.ac.uk/objects/uuid:244ce799-7397-4698-8dac-c8ca5d0b3e28.

Full text
Abstract:
Humans are unique in their ability to flexibly and rapidly adapt their behaviour and select courses of action that lead to future reward. Several ‘component processes’ must be implemented by the human brain in order to facilitate this behaviour. This thesis examines two such components: (i) the neural substrates supporting action selection during value-guided choice using magnetoencephalography (MEG), and (ii) learning the value of environmental stimuli and other people’s actions using functional magnetic resonance imaging (fMRI). In both situations, it is helpful to formally model the underlying component process, as this generates predictions of trial-to-trial variability in the signal from a brain region involved in its implementation. In the case of value-guided action selection, a biophysically realistic implementation of a drift diffusion model is used. Using this model, it is predicted that there are specific times and frequency bands at which correlates of value are seen. Firstly, there are correlates of the overall value of the two presented options, and secondly the difference in value between the options. Both correlates should be observed in the local field potential, which is closely related to the signal measured using MEG. Importantly, the content of these predictions is quite distinct from the function of the model circuit, which is to transform inputs relating to the value of each option into a categorical decision. In the case of social learning, the same reinforcement learning model is used to track both the value of two stimuli that the subject can choose between, and the advice of a confederate who is playing alongside them. As the confederate advice is actually delivered by a computer, it is possible to keep prediction error and learning rate terms for stimuli and advice orthogonal to one another, and so look for neural correlates of both social and non-social learning in the same fMRI data. Correlates of intentional inference are found in a network of brain regions previously implicated in social cognition, notably the dorsomedial prefrontal cortex, the right temporoparietal junction, and the anterior cingulate gyrus.
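A minimal Rescorla-Wagner style sketch of the reinforcement learning terms (prediction error and learning rate) that the fMRI analysis tracks; the learning rate and reward probability are assumptions.

```python
import numpy as np

rng = np.random.default_rng(14)
alpha_lr = 0.2                  # assumed learning rate
v = 0.0                         # value estimate of a stimulus
true_p = 0.7                    # assumed reward probability

for trial in range(200):
    reward = float(rng.random() < true_p)
    delta = reward - v          # prediction error (the trial-by-trial quantity used as a regressor)
    v += alpha_lr * delta       # value update

print(f"learned value after 200 trials: {v:.2f} (true reward rate {true_p})")
```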
APA, Harvard, Vancouver, ISO, and other styles
34

Kyselá, Eva. "Modelling portfolios with heavy-tailed risk factors." Master's thesis, Vysoká škola ekonomická v Praze, 2015. http://www.nusl.cz/ntk/nusl-264017.

Full text
Abstract:
The thesis aims to investigate some of the approaches to modelling portfolio returns with heavy-tailed risk factors. It first elaborates on univariate time series models, and compares the predictive performance of the benchmark model (GARCH with Student t innovations, or its GJR extension) with two competitors, the EVT-GARCH model and the Markov-Switching Multifractal (MSM) model. The motivation for the EVT extension of the GARCH specification is to use a more appropriate distribution of the innovations, based on the empirical distribution function. The MSM is one of the best performing models in the multifractal literature, a Markov-switching model which is unique in its parsimonious specification and variability. The performance of these models is assessed with Mincer-Zarnowitz regressions as well as by comparing the quality of VaR and expected shortfall predictions, and the empirical analysis shows that for risk management purposes the EVT-GARCH dominates the benchmark as well as the MSM. The second part addresses dependence structure modelling, using the Gaussian and t-copulas to model the portfolio returns, and compares the results with the classic variance-covariance approach, concluding that copulas offer more realistic estimates of future extreme quantiles.
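A sketch of the copula approach from the last part of the abstract: simulate dependence with a Gaussian copula, map to Student-t margins, and compare a portfolio tail quantile with the variance-covariance answer; the correlation, degrees of freedom and weights are assumptions.

```python
import numpy as np
from scipy.stats import norm, t

rng = np.random.default_rng(12)
corr = np.array([[1.0, 0.6], [0.6, 1.0]])           # assumed correlation matrix
df_margin = 4                                        # assumed degrees of freedom of the margins
w = np.array([0.5, 0.5])                             # equally weighted portfolio

L = np.linalg.cholesky(corr)
z = rng.normal(size=(100_000, 2)) @ L.T              # correlated standard normals
u = norm.cdf(z)                                      # Gaussian copula: dependent uniforms
x = t.ppf(u, df_margin) * 0.01                       # heavy-tailed Student-t margins

port = x @ w
print("copula 1% quantile:       ", np.quantile(port, 0.01).round(4))
print("normal approx 1% quantile:", (norm.ppf(0.01) * port.std()).round(4))
```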
APA, Harvard, Vancouver, ISO, and other styles
35

Du, Toit Carl. "Modelling market risk with SAS Risk Dimensions : a step by step implementation." Thesis, Link to the online version, 2005. http://hdl.handle.net/10019/1015.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Starobinskaya, Irina. "Structural modelling of operational risk in financial institutions : application of Bayesian networks and balanced scorecards to IT infrastructure risk modelling /." Berlin : Pro Business, 2008. http://d-nb.info/991725328/04.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Crossland, Ross. "Risk in the development design." Thesis, University of Bristol, 1997. http://hdl.handle.net/1983/aa5c6f5c-8e74-44ab-a6da-545ec6d39cfd.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Vadeboncoeur, Nathan Noel. "Knowing climate change : modelling, understanding, and managing risk." Thesis, University of British Columbia, 2014. http://hdl.handle.net/2429/50777.

Full text
Abstract:
Climate change is a complex problem. Approaches to understanding climate change risk and preparing for its management include assessments of biophysical changes, the influence of public risk perceptions on support for policies aimed at adapting to these changes, and analysis of the governance structures charged with developing and implementing climate action plans. Climate change issues, however, are often approached from a disciplinary perspective and there are few studies examining how climate risk is viewed from multiple perspectives in a particular locale. This thesis takes a bottom-up approach to understanding climate change by focusing on how climate risk is understood on the Sunshine Coast, British Columbia, as a biophysical, social, and governance issue. It begins by surveying the available biophysical information of climate change and presents a sea level rise impact model for the Sunshine Coast. Next, it explores how public perceptions of climate risk (as distinct from climate change knowledge as scientific literacy) develop and how these affect support for climate change policies. It then examines the perspective of a local government, the Town of Gibsons, in planning for climate change adaptation. Here, it focuses on how decision-makers plan for climate change by examining their perspectives on biophysical risks and the social context within which climate issues are located. Throughout the thesis, I argue that the process of adapting to climate change (a risk management strategy) has strongly social roots and that understanding how climate change fits within the context of individual communities is, along with knowledge of biophysical hazards, an essential component of adaptation.
APA, Harvard, Vancouver, ISO, and other styles
39

Dimitrova, Dimitrina S. "Dependent risk modelling in (re)insurance and ruin." Thesis, City, University of London, 2007. http://openaccess.city.ac.uk/18910/.

Full text
Abstract:
The work presented in this dissertation is motivated by the observation that the classical (re)insurance risk modelling assumptions of independent and identically distributed claim amounts, Poisson claim arrivals and premium income accumulating linearly at a certain rate, starting from possibly non-zero initial capital, are often unrealistic and violated in practice. There is an abundance of examples in which dependence is observed at various levels of the underlying risk model. Developing risk models which are more general than the classical one and can successfully incorporate dependence between claim amounts arriving consecutively at the insurance company, and/or dependence between the claim inter-arrival times, is at the heart of this dissertation. The main objective is to consider such general models and to address the problem of (non-)ruin of an insurance company within a finite-time horizon. Furthermore, the aim is to consider general risk and performance measures in the context of a risk sharing arrangement such as an excess of loss (XL) reinsurance contract. There are two parties involved in an XL reinsurance contract and their interests are conflicting, as first noted by Karl Borch in the 1960s. Therefore, we define joint risk and performance measures for the cedent and the reinsurer, both based on the probability of ruin, and show how the latter can be used to optimally set the parameters of an XL reinsurance treaty. Explicit expressions for the proposed risk and performance measures are derived and are used efficiently in numerical illustrations.
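A minimal Monte Carlo sketch of the central quantity studied here, the cedent's finite-time ruin probability under an XL treaty, is given below. It is not the dissertation's model: claims are independent Pareto amounts with Poisson arrivals, the premium rate is not adjusted for the cost of reinsurance, and all figures are hypothetical.

```python
import numpy as np

def cedent_ruin_prob(u, c, lam, retention, horizon, n_sims=20_000, seed=1):
    """Monte Carlo estimate of the cedent's finite-time ruin probability under an
    excess-of-loss (XL) treaty: the cedent retains min(X, M) of each claim X."""
    rng = np.random.default_rng(seed)
    ruined = 0
    for _ in range(n_sims):
        t, aggregate = 0.0, 0.0
        while True:
            t += rng.exponential(1.0 / lam)              # Poisson claim arrivals
            if t > horizon:
                break
            claim = rng.pareto(2.5) + 1.0                # heavy-tailed (Pareto) claim amount
            aggregate += min(claim, retention)           # cedent's retained part of the claim
            if u + c * t - aggregate < 0:                # ruin can only occur at a claim instant
                ruined += 1
                break
    return ruined / n_sims

# initial capital 10, premium rate 2.5 per year, 1.2 claims per year, 5-year horizon
for M in (2.0, 5.0, np.inf):
    print(f"retention {M}: ruin probability = {cedent_ruin_prob(10.0, 2.5, 1.2, M, 5.0):.3f}")
```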
APA, Harvard, Vancouver, ISO, and other styles
40

Roberts, C. M. "Modelling cybercrime and risk for New Zealand organisations." University of Otago. Department of Information Science, 2009. http://adt.otago.ac.nz./public/adt-NZDU20091009.162528.

Full text
Abstract:
The Internet is now fundamental to the global economy. Growing from an experimental and research network in the late 1960's, it is now the foundation of a wide range of economic, infrastructure support, communication and information sharing activities. In doing so it has also provided a vehicle for cybercrime. Organised cybercrime and state-sponsored malicious cyber activity are predicted to become the predominant cyber threats over the next five to ten years. Corporate governance is playing an increasingly important role in ensuring compliance with the growing body of legislation and regulation, protecting the interests of stakeholders. At the same time there is a divergence in organisational awareness, understanding, strategy and application between business objectives, risk management and good security practices. Organisations are finding increasing difficulty in managing the scope and extent of the cyber-threat environment, exacerbated by confusion over risk tools, approaches and requirements. This study provides a pragmatic and practical framework for organisational risk assessment, already proved over several years of use. This is supported by three national surveys which provide important data for sound risk identification and assessments. This survey data is organised through a Data Schema which is simple, rational and flexible enough to accommodate new technologies and types of cyber-attacks, as well as allowing for the decommissioning of technologies and the abandonment of attack methods. For many organisations this risk framework will be sufficient to meet their corporate governance and risk management requirements. For organisations wishing to refine their approach, a Bayesian model has also been developed, building on previous work, incorporating data from the surveys and, through the Data Schema, allowing the incorporation of probabilities and other evidence to enhance the risk assessment framework. Again this model is flexible, accommodating changes, growth and new technologies.
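As a generic illustration of how probabilities and evidence can be combined in a risk assessment framework of this kind (not the thesis's Bayesian model), a single Bayes-rule update over hypothetical attack categories might look like this:

```python
# Prior weights of attack categories and the likelihood of observing a given
# indicator (say, anomalous outbound traffic) under each category -- all figures invented.
prior = {"phishing": 0.55, "malware": 0.30, "targeted_intrusion": 0.15}
likelihood = {"phishing": 0.10, "malware": 0.40, "targeted_intrusion": 0.80}

evidence = sum(prior[a] * likelihood[a] for a in prior)
posterior = {a: round(prior[a] * likelihood[a] / evidence, 3) for a in prior}
print(posterior)   # Bayes' rule shifts weight towards categories consistent with the evidence
```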
APA, Harvard, Vancouver, ISO, and other styles
41

Margonis, Efstathios. "Modelling credit risk with applications to credit derivatives." Thesis, Imperial College London, 2002. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.398142.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Sangrasi, Asif. "Component level risk assessment modelling for grid resources." Thesis, University of Leeds, 2012. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.590154.

Full text
Abstract:
Service level agreements (SLAs), as formal contractual agreements, increase the confidence level between the end user and the Grid resource provider compared to the best-effort approach. However, SLAs fall short of assessing the risk involved in accepting an SLA; risk assessment in Grid computing fills that gap. Risk assessment is crucial for the resource provider, since failing to fulfil an SLA results in a financial penalty. Thus risk, a deterrent to the commercial viability of Grids, needs to be assessed and mitigated to overcome the pitfalls associated with SLAs. Current approaches to assessing and managing risk in Grid computing are a major step towards provisioning Quality of Service (QoS) to the end user, but they are based on node- or machine-level assessment. As a node may contain CPU(s), storage devices, connections for communication and software resources, a node failure may actually be a failure of any of these components. Our approach to risk assessment is therefore aimed at the granularity of individual components rather than the machine level used in previous efforts. Moreover, past node-level efforts fail to consider the nature of the Grid failure data, namely that resources are repairable or replaceable. To overcome these shortcomings, we propose risk assessment models at the component level that treat resources as repairable or replaceable. A three-step methodology is used, consisting of data analysis, risk modelling and experimentation. A probabilistic model, proposed at the component level and based on series and parallel models, treats Grid resources as replaceable. Similarly, an R-out-of-N model is proposed for aggregating risk values over a number of nodes; it provides more detailed results than the parallel model, but with some drawbacks. In addition, a component-level risk assessment model based on a Non-Homogeneous Poisson Process (NHPP) is proposed, treating Grid resources as repairable. Grid failure data are used for the experimentation at the component level, and the proposed NHPP-based model selection is validated using a goodness-of-fit test along with graphical approaches. Also treating Grid resources as repairable, a semi-Markov based risk assessment model is proposed; it offers slight advantages over the NHPP-based model, such as handling repairability extrinsically and assessing the probabilities of repair for individual components within a node. The three proposed risk models are evaluated experimentally and through a comparative evaluation and performance analysis. The experimental results provide detailed component-level risk assessment information that can help the Grid resource provider manage and use Grid resources efficiently; in turn, these results can enhance commercial viability and QoS provisioning to end users through risk-aware scheduling by the Grid resource provider.
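A minimal numerical sketch of the building blocks named above (series/parallel component models and an NHPP for repairable components) is given below; the component list, failure probabilities and intensity parameters are all hypothetical.

```python
import numpy as np

# Series model: the node works only if every component works, so over the SLA
# window the node-level failure probability is 1 - prod(1 - p_i).
p_fail = {"cpu": 0.010, "disk": 0.030, "nic": 0.005, "software": 0.020}
p_node_series = 1.0 - np.prod([1.0 - p for p in p_fail.values()])

# Parallel (redundant) components fail only if every replica fails.
p_disk_mirrored = p_fail["disk"] ** 2

# NHPP with power-law intensity lambda(t) = a*b*t**(b-1): the expected number of
# failures of a repairable component over (0, T] is a*T**b.
a, b, T = 0.02, 1.3, 30.0                      # hypothetical shape/scale and a 30-day SLA window
expected_failures = a * T ** b
p_at_least_one = 1.0 - np.exp(-expected_failures)

print(f"series node failure prob:   {p_node_series:.4f}")
print(f"mirrored disk failure prob: {p_disk_mirrored:.6f}")
print(f"NHPP: E[failures in {T:.0f} d] = {expected_failures:.2f}, P(>=1) = {p_at_least_one:.3f}")
```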
APA, Harvard, Vancouver, ISO, and other styles
43

Miao, Daniel Wei-Chung. "Markov Modulated Poisson Processes in Credit Risk Modelling." Thesis, University of Oxford, 2008. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.490112.

Full text
Abstract:
In this thesis, we use the Markov Modulated Poisson Process (MMPP) to model default arrival, a central issue of credit risk modelling. We work within the framework of reduced-form models to describe default rates as Markov chains, as an alternative to diffusion-based models. On one hand, the Markov chain models are able to approximate closely the diffusion models. On the other hand, their discrete nature provides more modelling flexibility and allows for the incorporation of advanced features. With these benefits they can be applied to a range of credit derivative pricing problems.
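A short sketch of simulating default arrivals from a two-state MMPP follows; it is a toy illustration rather than the thesis's calibrated models, and the switching rates and intensities are hypothetical.

```python
import numpy as np

def simulate_mmpp(q01, q10, lam, horizon, seed=0):
    """Simulate default arrival times from a two-state Markov Modulated Poisson Process:
    a hidden chain switches 0<->1 at rates q01/q10, and while in state i defaults
    arrive as a Poisson process with intensity lam[i]."""
    rng = np.random.default_rng(seed)
    t, state, arrivals = 0.0, 0, []
    while t < horizon:
        # competing exponential clocks: next default vs. next regime switch
        t_event = rng.exponential(1.0 / lam[state])
        t_switch = rng.exponential(1.0 / (q01 if state == 0 else q10))
        if t_event < t_switch:
            t += t_event
            if t < horizon:
                arrivals.append(t)
        else:
            t += t_switch
            state = 1 - state
    return np.array(arrivals)

arrivals = simulate_mmpp(q01=0.5, q10=1.0, lam=(0.2, 2.0), horizon=50.0)
print(f"{len(arrivals)} simulated defaults; first arrivals: {np.round(arrivals[:5], 2)}")
```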
APA, Harvard, Vancouver, ISO, and other styles
44

Ahlgren, Markus. "Internal Market Risk Modelling for Power Trading Companies." Thesis, KTH, Matematisk statistik, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-173917.

Full text
Abstract:
Since the financial crisis of 2008, risk awareness has increased in the financial sector. Companies are regulated with regard to risk exposure; these regulations are driven by the Basel Committee, which formulates broad supervisory standards and guidelines and recommends statements of best practice in banking supervision. Under these regulations, companies face own-funds requirements for market risk. This thesis constructs an internal model for risk management that computes the regulatory capital requirements for market risk according to the "Capital Requirements Regulation" (CRR) and the "Fundamental Review of the Trading Book" (FRTB) respectively. The capital requirements under CRR and FRTB are compared to show how the proposed move to an expected shortfall (ES) based model in the FRTB will affect the capital requirements. All computations are performed with data provided by a power trading company so that the results reflect practice. The results show that, when comparing the risk capital requirements under CRR and FRTB for a power portfolio with only linear assets, the risk capital is higher under the value-at-risk (VaR) based model. The study shows that the changes in risk capital depend mainly on the different methods of calculating risk capital under CRR and FRTB, and only to a minor extent on the change of risk measure.
In connection with the financial crisis of 2008, risk awareness in the financial sector has increased. Companies are regulated with respect to risk exposure by rules driven by the Basel Committee, which formulates supervisory standards and guidelines and recommends best-practice measures. Under these rules, companies are subject to own-funds requirements for market risk. This thesis describes the process of developing an internal risk model, under the "Capital Requirements Regulation" (CRR) and the "Fundamental Review of the Trading Book" (FRTB) respectively, for computing the statutory capital requirements for market risk. The capital requirements under the two frameworks are compared in order to understand how the proposed move to an expected shortfall (ES) based model in the FRTB will affect them. All computations use data from a power trading company in order to make the results more relevant and realistic. When the risk capital requirements under CRR and FRTB are compared for an energy portfolio with only linear assets, the risk capital is higher with a value-at-risk (VaR) based model. The key finding is that the difference in risk capital requirements is driven not primarily by the different risk measures, but rather by the different methods of computing the risk capital under CRR and FRTB.
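To illustrate only the contrast between the two risk measures (not the full CRR/FRTB capital calculations, which involve stressed calibration, multipliers and liquidity horizons), a minimal sketch comparing historical-simulation 99% VaR with 97.5% ES on a hypothetical P&L series might look as follows:

```python
import numpy as np

def hist_var(pnl, alpha=0.99):
    """Historical-simulation VaR: the alpha-quantile of the loss (-P&L) distribution."""
    return np.quantile(-pnl, alpha)

def hist_es(pnl, alpha=0.975):
    """Historical-simulation expected shortfall: mean loss at or beyond the alpha-VaR."""
    losses = -pnl
    return losses[losses >= np.quantile(losses, alpha)].mean()

rng = np.random.default_rng(42)
pnl = 1e6 * 0.01 * rng.standard_t(df=4, size=500)   # hypothetical daily P&L of a linear portfolio

print(f"99.0% VaR (CRR-style measure):  {hist_var(pnl, 0.99):>12,.0f}")
print(f"97.5% ES  (FRTB-style measure): {hist_es(pnl, 0.975):>12,.0f}")
```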
APA, Harvard, Vancouver, ISO, and other styles
45

Alwohaibi, Maram. "Modelling the risk of underfunding in ALM models." Thesis, Brunel University, 2017. http://bura.brunel.ac.uk/handle/2438/16337.

Full text
Abstract:
Asset and Liability Management (ALM) models have become well-established decision tools for pension funds. ALM problems are commonly modelled as multi-stage: a large terminal wealth is required, while at intermediate time periods constraints are imposed on the funding ratio, that is, the ratio of assets to liabilities. Underfunding occurs when the funding ratio is too low; a target value for the funding ratio is pre-specified by the decision maker. The risk of underfunding has usually been modelled by employing established risk measures, which control only a single aspect of the funding ratio distribution. For example, controlling the expected shortfall below the target has limited power in controlling shortfall under worst-case scenarios. We propose ALM models in which the risk of underfunding is modelled based on the concept of Second Order Stochastic Dominance (SSD). This is a criterion for ranking random variables - in our case funding ratios - that takes the entire distributions of interest into account and works under the widely accepted assumptions that decision makers are rational and risk averse. In the proposed SSD models, investment decisions are taken such that the resulting short-term distribution of the funding ratio is non-dominated with respect to SSD, while a constraint is imposed on the expected terminal wealth. This is done by considering progressively larger tails of the funding ratio distribution and setting target levels for them; a target distribution is thus implied. Different target distributions lead to different SSD-efficient solutions, and improved distributions of funding ratios may thus be achieved compared to existing risk models for ALM. This is the first contribution of this thesis. Interesting results are obtained in the special case when the target distribution is deterministic, specified by a single outcome; in this case we can obtain equivalent risk minimisation models, with risk defined as expected shortfall or as worst-case loss. This represents the second contribution. The third contribution is a framework for scenario generation based on the "Birth, Immigration, Death, Emigration" (BIDE) population model and the empirical copula; the scenarios are used to evaluate the proposed models and their special cases both in-sample and out-of-sample. As an application, we consider the planning problem of a large DB pension fund in Saudi Arabia.
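A minimal sketch of the SSD criterion itself, assuming equally likely scenarios of equal number and using invented funding-ratio figures unrelated to the thesis's ALM models, is the following empirical dominance check:

```python
import numpy as np
from scipy import stats

def ssd_dominates(a, b):
    """Empirical SSD test for two sets of equally likely outcomes of the same size:
    `a` dominates `b` if every cumulative sum of its sorted (worst-first) outcomes
    is at least the corresponding cumulative sum for `b`."""
    return bool(np.all(np.cumsum(np.sort(a)) >= np.cumsum(np.sort(b)) - 1e-9))

# equally likely funding-ratio scenarios built from normal quantiles (hypothetical numbers):
# the candidate strategy has the same mean as the benchmark but a tighter distribution
u = (np.arange(1000) + 0.5) / 1000
fr_strategy = 1.05 + 0.08 * stats.norm.ppf(u)
fr_benchmark = 1.05 + 0.12 * stats.norm.ppf(u)

print("strategy SSD-dominates benchmark:", ssd_dominates(fr_strategy, fr_benchmark))
print("benchmark SSD-dominates strategy:", ssd_dominates(fr_benchmark, fr_strategy))
```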
APA, Harvard, Vancouver, ISO, and other styles
46

Mavaddat, Nasim. "Risk modelling in BRCA1 and BRCA2 mutation carriers." Thesis, University of Cambridge, 2012. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.610839.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Villegas, Ramirez Andres. "Mortality : modelling, socio-economic differences and basis risk." Thesis, City University London, 2015. http://openaccess.city.ac.uk/13574/.

Full text
Abstract:
During the last two centuries the developed world experienced a persistent increase in life expectancy. Although past trends suggest that life expectancy will continue to increase, there is considerable uncertainty surrounding the future evolution of mortality. In addition, past mortality improvements have not been shared equally across the population, resulting in a widening of socio-economic inequalities in mortality. The uncertainty and socio-economic variability of life expectancy pose a challenge for the design of pension systems and the management of longevity risk in pension funds and annuity portfolios. This thesis is devoted to the investigation of the trends and financial implications of socio-economic differences in mortality. It comprises three parts. The first part introduces new modelling techniques for the quantification of socio-economic mortality differentials in aggregate and cause-specific mortality, which are applied in the study of the relationship between mortality and deprivation in the English population. The second part evaluates the suitability of several multipopulation stochastic mortality models for assessing basis risk in longevity hedges and provides guidelines on how to use these models in practical situations. Finally, the third part introduces new modelling tools which aim to permit a more effective and widespread use of stochastic mortality models.
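The abstract does not name the underlying models, but single-population stochastic mortality models of the Lee-Carter type are the usual building blocks behind multipopulation extensions; purely for orientation, a toy Lee-Carter fit on synthetic data (all numbers invented) is sketched below.

```python
import numpy as np

# Synthetic matrix of central death rates, ages in rows and calendar years in columns.
rng = np.random.default_rng(5)
ages = np.arange(60, 90)
years = np.arange(1990, 2015)
kt_true = -0.04 * (years - years[0])                          # steadily improving mortality
log_m = (-9.5 + 0.09 * ages)[:, None] + 0.8 * kt_true[None, :] \
        + rng.normal(0, 0.02, (len(ages), len(years)))        # synthetic log m(x,t)

# Lee-Carter fit: log m(x,t) ~ a_x + b_x * k_t via SVD of the age-centred surface
a_x = log_m.mean(axis=1)
U, s, Vt = np.linalg.svd(log_m - a_x[:, None], full_matrices=False)
b_x = U[:, 0] / U[:, 0].sum()                                 # identifiability constraint sum(b_x) = 1
k_t = s[0] * Vt[0, :] * U[:, 0].sum()

# Random-walk-with-drift projection of k_t, the simplest stochastic forecast
drift = np.diff(k_t).mean()
k_future = k_t[-1] + drift * np.arange(1, 11)
log_m_future = a_x[:, None] + b_x[:, None] * k_future[None, :]
print("projected m(65) in 10 years:", round(np.exp(log_m_future[ages == 65, -1]).item(), 5))
```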
APA, Harvard, Vancouver, ISO, and other styles
48

Yan, Yang. "Essays in modelling and estimating Value-at-Risk." Thesis, London School of Economics and Political Science (University of London), 2014. http://etheses.lse.ac.uk/1033/.

Full text
Abstract:
The thesis concerns semiparametric modelling and forecasting of Value-at-Risk (VaR), and the application of these models to financial data. Two general classes of semiparametric VaR models are proposed. The first is introduced by defining efficient estimators of the risk measures in a semiparametric GARCH model through moment constraints, together with a quantile estimator based on inverting an empirical likelihood weighted distribution. The new quantile estimator is found to be uniformly more efficient than the simple empirical quantile and a quantile estimator based on normalized residuals. At the same time, the efficiency gain in error quantile estimation hinges on the efficiency of the estimators of the variance parameters. We show that the same conclusion applies to the estimation of conditional Expected Shortfall. The second model proposes a new method to forecast one-period-ahead VaR in general ARCH(1) models with possibly heavy-tailed errors. The proposed method is based on least squares estimation for the log-transformed model and imposes weak moment conditions on the errors; the asymptotic distribution also accounts for the parameter uncertainty in volatility estimation. We test our models against some conventional VaR forecasting methods, and the results demonstrate that our models are among the best at forecasting VaR.
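As a loose illustration of volatility-filtered quantile estimation, the sketch below uses a simple EWMA filter as a stand-in for the semiparametric GARCH estimators developed in the thesis, and contrasts an empirical residual quantile with a Gaussian quantile; the data and parameters are hypothetical.

```python
import numpy as np

def ewma_vol(r, lam=0.94):
    """RiskMetrics-style EWMA conditional-variance filter, a simple stand-in
    for the GARCH-type volatility models assumed in the abstract."""
    s2 = np.empty(len(r))
    s2[0] = np.var(r)
    for t in range(1, len(r)):
        s2[t] = lam * s2[t - 1] + (1 - lam) * r[t - 1] ** 2
    return np.sqrt(s2)

rng = np.random.default_rng(3)
r = 0.01 * rng.standard_t(df=5, size=1500)            # hypothetical daily returns

vol = ewma_vol(r)
z = r / vol                                           # standardised residuals
sigma_next = np.sqrt(0.94 * vol[-1] ** 2 + 0.06 * r[-1] ** 2)

var_empirical = -sigma_next * np.quantile(z, 0.01)    # quantile of the residuals' empirical distribution
var_gaussian = 2.326 * sigma_next                     # the same VaR under a normal-quantile assumption
print(f"empirical-residual 99% VaR: {var_empirical:.4%}   Gaussian 99% VaR: {var_gaussian:.4%}")
```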
APA, Harvard, Vancouver, ISO, and other styles
49

Albaz, Naif. "Modelling credit risk for SMEs in Saudi Arabia." Thesis, Cranfield University, 2017. http://dspace.lib.cranfield.ac.uk/handle/1826/13044.

Full text
Abstract:
The Saudi Government’s 2030 Vision directs local banks to increase and improve credit for the Small and Medium Enterprises (SMEs) of the economy (Jadwa, 2017). Banks are, however, still finding it difficult to provide credit for small businesses that meet Basel’s capital requirements. Most current credit-risk models apply only to large corporations, with few constructed for SME applications (Altman and Sabato, 2007). This study fills this gap by focusing on the Saudi SME perspective. My empirical work constructs a bankruptcy prediction model based on logistic regressions that cover 14,727 firm-year observations for an 11-year period between 2001 and 2011. I use the first eight years of data (2001-2008) to build the model and use it to predict the last three years (2009-2011) of the sample, i.e. conducting an out-of-sample test. This approach yields a highly accurate model with great prediction power, though the results are partially influenced by the external economic and geopolitical volatilities that took place during 2009-2010 (the world financial crisis). To avoid making predictions in such a volatile period, I rebuild the model based on 2003-2010 data and use it to predict the default events for 2011. The new model is highly consistent and accurate. My model suggests that, from an academic perspective, some key quantitative variables, such as gross profit margin, days inventory, revenues, days payable and age of the entity, have significant power in predicting the default probability of an entity. I further price the risks of the SMEs by using a credit-risk pricing model similar to Bauer and Agarwal (2014), which enables us to determine the risk-return trade-offs for Saudi SMEs.
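A minimal sketch of a bankruptcy-prediction logistic regression with an out-of-time split, in the spirit of the abstract but on entirely synthetic data with made-up coefficients, might look like this:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(11)
n = 5000
# hypothetical firm-year predictors echoing the variables named in the abstract
X = np.column_stack([
    rng.normal(0.15, 0.10, n),   # gross profit margin
    rng.normal(60, 25, n),       # days inventory
    rng.normal(45, 20, n),       # days payable
    rng.normal(8, 5, n),         # age of the entity (years)
])
years = rng.integers(2001, 2012, n)
# synthetic default flag, loosely tied to low margins (purely illustrative)
p_default = 1.0 / (1.0 + np.exp(1.0 + 20.0 * X[:, 0] - 0.01 * X[:, 1]))
y = rng.binomial(1, p_default)

train, test = years <= 2008, years > 2008            # out-of-time split, as in the abstract
model = LogisticRegression(max_iter=1000).fit(X[train], y[train])
auc = roc_auc_score(y[test], model.predict_proba(X[test])[:, 1])
print(f"out-of-sample AUC: {auc:.3f}")
```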
APA, Harvard, Vancouver, ISO, and other styles
50

Thomas, Aleysha. "Ensemble statistical modelling of risk factors in health." Thesis, Queensland University of Technology, 2018. https://eprints.qut.edu.au/120675/1/Aleysha_Thomas_Thesis.pdf.

Full text
Abstract:
This thesis aims to identify and characterise the combined effects of non-genetic risk factors, particularly quantitative measurements of organochlorine pesticide (OCP) exposure, on the age at onset of Parkinson's disease. It uses an ensemble modelling approach combining linear models, decision trees, hierarchical models, meta-analyses and Bayesian networks, suited to settings where only disparate data sources are available and coordinating large-scale studies is expensive and time-consuming. Although the results of the analysis are not conclusive owing to the nature of the data sources, they justify further study of the combined effects of quantitative non-genetic risk factors, particularly OCP exposure, in a population study.
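As a toy illustration of combining heterogeneous learners into an ensemble (not the thesis's actual models or data; the predictors and coefficients below are invented):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(8)
n = 400
# invented non-genetic risk factors: an OCP exposure score plus two other covariates
X = np.column_stack([rng.gamma(2.0, 1.5, n), rng.binomial(1, 0.3, n), rng.normal(2.0, 1.0, n)])
age_onset = 65 - 1.5 * X[:, 0] + 2.0 * X[:, 1] + rng.normal(0, 4, n)   # synthetic outcome

members = [LinearRegression(), DecisionTreeRegressor(max_depth=3, random_state=0)]
preds = np.column_stack([m.fit(X, age_onset).predict(X) for m in members])
ensemble = preds.mean(axis=1)                        # simple average of the member predictions

rmse = float(np.sqrt(np.mean((ensemble - age_onset) ** 2)))
print(f"ensemble in-sample RMSE: {rmse:.2f} years")
```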
APA, Harvard, Vancouver, ISO, and other styles