Doctoral dissertations on the topic "Risk assessment - Mathematical models"

Follow this link to see other types of publications on this topic: Risk assessment - Mathematical models.

Create a correct reference in APA, MLA, Chicago, Harvard, and many other styles

Select a source type:

Check out the 50 best doctoral dissertations on the topic "Risk assessment - Mathematical models".

An "Add to bibliography" button is available next to each work in the list. Use it, and we will automatically create a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication in ".pdf" format and read the annotation of the work online, if the relevant parameters are available in the metadata.

Browse doctoral dissertations from various fields and build appropriate bibliographies.

1

Yeo, Keng Leong Actuarial Studies Australian School of Business UNSW. "Claim dependence in credibility models". Awarded by: University of New South Wales. School of Actuarial Studies, 2006. http://handle.unsw.edu.au/1959.4/25971.

Full text of the source
Abstract:
Existing credibility models have mostly allowed for only one source of claim dependence: that across time for an individual insured risk or a group of homogeneous insured risks. Numerous circumstances demonstrate that this may be inadequate. In this dissertation, we develop a two-level common effects model, based loosely on the Bayesian model, which allows for two possible sources of dependence: that across time for the same individual risk and that between risks. For the case of Normal common effects, we are able to derive explicit formulas for the credibility premium. This takes the intuitive form of a weighted average of the individual risk's claims experience, the group's claims experience and the prior mean. We also consider the use of copulas, a tool widely used in other areas of work involving dependence, in constructing credibility premiums. Specifically, we utilise copulas to model the dependence across time for an individual risk or group of homogeneous risks. We develop the construction with several well-known families of copulas and are able to derive explicit formulas for their respective conditional expectations. Whilst some recent work has been done on constructing credibility models with copulas, explicit formulas for the conditional expectations have rarely been made available. Finally, we calibrate these copula credibility models using a real data set. This data set relates to the claims experience of workers' compensation insurance by occupation over a 7-year period for a particular state in the United States. Our results show that for each occupation, claims dependence across time is indeed present. Amongst the copulas considered in our empirical analysis, the Cook-Johnson copula model is found to be the best fit for the data set used. The calibrated copula models are then used for prediction of the next period's claims. We find that the Cook-Johnson copula model gives superior predictions. Furthermore, this calibration exercise allows us to uncover the importance of examining the nature of the data and comparing it with the characteristics of the copulas being calibrated.
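For orientation, the following sketch computes the classical one-level Bühlmann credibility premium, the weighted-average structure the abstract alludes to before its two-level and copula extensions. It is a minimal illustration only; the claim figures and variance components are assumed for the example.

```python
import numpy as np

def buhlmann_premium(claims, prior_mean, var_within, var_between):
    """One-level Buhlmann credibility premium: a weighted average of the
    individual claims experience and the prior (collective) mean."""
    n = len(claims)
    k = var_within / var_between      # credibility coefficient
    z = n / (n + k)                   # credibility weight in [0, 1)
    return z * np.mean(claims) + (1 - z) * prior_mean

# Five years of claims for one risk (illustrative figures)
premium = buhlmann_premium([120.0, 95.0, 130.0, 110.0, 105.0],
                           prior_mean=100.0, var_within=400.0, var_between=25.0)
```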
APA, Harvard, Vancouver, ISO, and other styles
2

Wang, Na. "Estimation of Extra Risk and Benchmark Dose in Dose Response Models". Fogler Library, University of Maine, 2008. http://www.library.umaine.edu/theses/pdf/WangN2008.pdf.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
3

Zhu, Dongming 1963. "Asymmetric heavy-tailed distributions : theory and applications to finance and risk management". Thesis, McGill University, 2007. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=102854.

Full text of the source
Abstract:
This thesis focuses on the construction, properties and estimation of asymmetric heavy-tailed distributions, as well as on their applications to financial modeling and risk measurement. First of all, we suggest a general procedure to construct a fully asymmetric distribution from a symmetric parametric distribution, and establish some natural relationships between the symmetric and asymmetric distributions. Then, three new classes of asymmetric distributions are proposed by using the procedure: the Asymmetric Exponential Power Distributions (AEPD), the Asymmetric Student-t Distributions (ASTD) and the Asymmetric Generalized t Distribution (AGTD). For the first two distributions, we give an interpretation of their parameters and explore their basic properties, including moments, expected shortfall, characterization by the maximum entropy property, and the stochastic representation. Although neither distribution satisfies the regularity conditions under which the ML estimators have the usual asymptotics, owing to a non-differentiable likelihood function, we nonetheless establish asymptotics for the full MLE of the parameters. A closed-form expression for the Fisher information matrix is derived, and Monte Carlo studies are provided. We also illustrate the usefulness of GARCH-type models with AEPD and ASTD innovations in the context of predicting downside market risk of financial assets and demonstrate their superiority over skew-normal and skew-Student's t GARCH models. Finally, two new classes of generalized extreme value distributions, which include Jenkinson's GEV (Generalized Extreme Value) distribution (Jenkinson, 1955) as a special case, are proposed by using the maximum entropy principle, and their properties are investigated in detail.
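To make the construction idea concrete, here is a minimal two-piece sketch in which a symmetric base density is rescaled separately to the left and right of the origin so that the result is still a proper density. This shows only the generic device, not the AEPD/ASTD/AGTD families of the thesis; the base density and scale parameters are assumed.

```python
import numpy as np

def two_piece_pdf(x, base_pdf, a, b):
    """Two-piece asymmetric density built from a symmetric base density:
    scale a on the left of 0, scale b on the right; integrates to 1."""
    x = np.asarray(x, dtype=float)
    scale = np.where(x < 0.0, a, b)
    return 2.0 / (a + b) * base_pdf(x / scale)

# Example with a standard normal base density (illustrative parameters)
std_normal = lambda u: np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)
density = two_piece_pdf(np.linspace(-5.0, 5.0, 11), std_normal, a=1.0, b=2.0)
```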
APA, Harvard, Vancouver, ISO, and other styles
4

Owen, Michelle L. "Exposure model : detailed profiling and quantification of the exposure of personnel to geotechnical hazards in underground mines". University of Western Australia. School of Civil and Resource Engineering, 2004. http://theses.library.uwa.edu.au/adt-WU2005.0031.

Full text of the source
Abstract:
[Truncated abstract] This thesis presents an operationally applicable and reliable model for quantifying the exposure of underground mining personnel to geotechnical hazards. The model is shown to have the flexibility to apply to very different operational environments, within the context of mechanised metalliferous mines. It provides an essential component for carrying out quantitative geotechnical risk analyses of underground mines. Moves towards a risk-based philosophy, instead of prescriptive design procedures, are increasingly prevalent within the Australian mining industry. A barrier to this has been the lag in availability of resources (personnel and technical) required for the intensive effort of applying probabilistic methods to geotechnical engineering at mines ... One of the missing components for quantitative risk analysis in mines has been an accurate model of personnel exposure to geotechnical hazards, from which meaningful estimates can be made of the probabilities of serious or fatal injury given a rockfall. Exposure profiling for geotechnical risk analysis at minesites has traditionally involved the simple classification of travelways and entry areas by their occupancy rate, not taking into account traffic and work characteristics which may significantly influence the risks. The focus of this thesis was therefore to address that deficiency and advance the ability to perform semi-quantitative and quantitative risk analyses in mines.
APA, Harvard, Vancouver, ISO, and other styles
5

Kusnetsov, Michael. "Clearing models for systemic risk assessment in interbank networks". Thesis, London School of Economics and Political Science (University of London), 2018. http://etheses.lse.ac.uk/3804/.

Full text of the source
Abstract:
In this thesis I consider the problem of clearing models used for systemic risk assessment in interbank networks. I investigate two extensions of the classical Eisenberg & Noe (2001) model. The first extension permits the analysis of networks with interbank liabilities of several maturities. I describe a clearing mechanism that relies on a fixed-point formulation of the vector of each bank’s liquid assets at each maturity date for a given set of defaulted banks. This formulation is consistent with the main stylised principles of insolvency law, permits the construction of simple dynamic models and furthermore demonstrates that systemic risk can be underestimated by single maturity models. In the context of multiple maturities, specifying a set of defaulted banks is challenging. Two approaches to overcome this challenge are proposed. The algorithmic approach leads to a well-defined liquid asset vector for all financial networks with multiple maturities. The simpler functional approach leads to the definition of the liquid asset vector that need not exist but under a regularity condition does exist and coincides with the algorithmic approach. The second extension concerns the non-uniqueness of clearing solutions. When more than one solution exists, the standard approach is to select the greatest solution. I argue that there are circumstances when finding the least solution is desirable. An algorithm for constructing the least solution is proposed. Moreover, the solution is obtainable under an arbitrary lower bound constraint. In models incorporating default costs, clearing functions can be discontinuous, which renders the problem of constructing the least clearing solution non-trivial. I describe the properties of the construction algorithm by means of transfinite sequences and show that it always terminates. Unlike the construction of the greatest solution, the number of steps taken by the algorithm need not be bounded by the size of the network.
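For background, the sketch below implements the single-maturity Eisenberg & Noe (2001) clearing iteration that the thesis extends; the multi-maturity mechanism, default costs and the least-solution algorithm are beyond its scope. The two-bank liability matrix and outside assets are assumed for illustration.

```python
import numpy as np

def clearing_vector(L, e, tol=1e-10, max_iter=10_000):
    """Greatest clearing payment vector of the Eisenberg-Noe (2001) model,
    found by iterating downwards from full payment.
    L[i, j]: nominal liability of bank i to bank j; e[i]: outside assets."""
    p_bar = L.sum(axis=1)                         # total obligations per bank
    pi = np.divide(L, p_bar[:, None], out=np.zeros_like(L),
                   where=p_bar[:, None] > 0)      # relative liability matrix
    p = p_bar.copy()
    for _ in range(max_iter):
        p_new = np.minimum(p_bar, e + pi.T @ p)   # pay all owed, or all available
        if np.max(np.abs(p_new - p)) < tol:
            break
        p = p_new
    return p

# Two banks owing each other 10, with outside assets 5 and 12 (illustrative)
p = clearing_vector(np.array([[0.0, 10.0], [10.0, 0.0]]), np.array([5.0, 12.0]))
```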
APA, Harvard, Vancouver, ISO, and other styles
6

Shen, Yunxiang. "Risk analysis and its application in mining project evaluation". Thesis, McGill University, 1987. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=64009.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
7

Galane, Lesiba Charles. "The risk parity approach to asset allocation". Thesis, Stellenbosch : Stellenbosch University, 2014. http://hdl.handle.net/10019.1/95974.

Full text of the source
Abstract:
Thesis (MSc)--Stellenbosch University, 2014.
ENGLISH ABSTRACT: We consider the problem of portfolio asset allocation characterised by risk and return. Prior to the 2007-2008 financial crisis, this important problem was tackled mainly within the Markowitz mean-variance framework. However, throughout the past decade of challenging markets, particularly for equities, this framework has exhibited multiple drawbacks. Today many investors approach this problem with a 'safety first' rule that puts risk management at the heart of decision-making. Risk-based strategies have gained a lot of popularity since the recent financial crisis. One of the 'trendiest' of the modern risk-based strategies is the Risk Parity model, which puts diversification in terms of risk, rather than in terms of dollar values, at the core of portfolio risk management. Inspired by the works of Maillard et al. (2010), Bruder and Roncalli (2012), and Roncalli and Weisang (2012), we examine the reliability of, and relationship between, the traditional mean-variance framework and risk parity. We emphasise, through multiple examples, the non-diversification of the traditional mean-variance framework. The central focus of this thesis is on examining the main Risk Parity strategies, i.e. the Inverse Volatility, Equal Risk Contribution and Risk Budgeting strategies. Lastly, we turn our attention to the problem of maximizing the absolute expected value of the logarithmic portfolio wealth (sometimes called the drift term) introduced by Oderda (2013). The drift term of the portfolio is given by the sum of the expected price logarithmic growth rate, the expected cash flow, and half of its variance. The solution to this problem is a linear combination of three famous risk-based strategies and the high cash flow return portfolio.
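As a minimal illustration of the simplest of these strategies, the sketch below computes inverse-volatility ('naive risk parity') weights. The Equal Risk Contribution and Risk Budgeting strategies require the full covariance matrix and a numerical solver, which this sketch omits; the volatility figures are assumed.

```python
import numpy as np

def inverse_volatility_weights(vols):
    """Inverse-volatility ('naive risk parity') weights: each asset's
    weight is proportional to 1/sigma_i; weights sum to one."""
    inv = 1.0 / np.asarray(vols, dtype=float)
    return inv / inv.sum()

# Illustrative annualised volatilities: equities, bonds, commodities
w = inverse_volatility_weights([0.20, 0.06, 0.15])
```

Note that inverse-volatility weights coincide with equal risk contributions only when all pairwise correlations are equal.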
APA, Harvard, Vancouver, ISO, and other styles
8

Blatt, Sharon L. "An in-depth look at the information ratio". Link to electronic thesis, 2004. http://www.wpi.edu/Pubs/ETD/Available/etd-0824104-155216/.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
9

Baade, Ingrid Annette. "Survival analysis diagnostics". Thesis, Queensland University of Technology, 1997.

Find the full text of the source
APA, Harvard, Vancouver, ISO, and other styles
10

Ghosh, Gregory. "Lifesafety Analysis in the Building Firesafety Method". Digital WPI, 2004. https://digitalcommons.wpi.edu/etd-theses/1106.

Full text of the source
Abstract:
"The purpose of this thesis is to demonstrate and enhance the technical basis of the procedure for evaluating lifesafety within the Building Firesafety Engineering Method (BFSEM). A framework for the analysis has been documented, but not extensively tested in a building situation. Hence, procedures to obtain the necessary input data and to evaluate that data needed to be developed. In addition, the general framework had to be tested rigorously enough to identify weaknesses. "
APA, Harvard, Vancouver, ISO, and other styles
11

Cross, Richard J. (Richard John). "Inference and Updating of Probabilistic Structural Life Prediction Models". Diss., Georgia Institute of Technology, 2007. http://hdl.handle.net/1853/19828.

Full text of the source
Abstract:
Aerospace design requirements mandate acceptable levels of structural failure risk. Probabilistic fatigue models enable estimation of the likelihood of fatigue failure. A key step in the development of these models is the accurate inference of the probability distributions for dominant parameters. Since data sets for these inferences are of limited size, the fatigue model parameter distributions are themselves uncertain. A hierarchical Bayesian approach is adopted to account for the uncertainties in both the parameters and their distribution. Variables specifying the distribution of the fatigue model parameters are cast as hyperparameters whose uncertainty is modeled with a hyperprior distribution. Bayes' rule is used to determine the posterior hyperparameter distribution, given available data, thus specifying the probabilistic model. The Bayesian formulation provides an additional advantage by allowing the posterior distribution to be updated as new data becomes available through inspections. By updating the probabilistic model, uncertainty in the hyperparameters can be reduced, and the appropriate level of conservatism can be achieved. In this work, techniques for Bayesian inference and updating of probabilistic fatigue models for metallic components are developed. Both safe-life and damage-tolerant methods are considered. Uncertainty in damage rates, crack growth behavior, damage, and initial flaws are quantified. Efficient computational techniques are developed to perform the inference and updating analyses. The developed capabilities are demonstrated through a series of case studies.
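As a toy illustration of the updating step described above, the sketch below performs the simplest conjugate update of a Normal prior on a single parameter as inspection data arrive. The hierarchical formulation of the thesis, with hyperpriors over the parameter distribution, is not captured here, and all numbers are assumed.

```python
import numpy as np

def update_normal_mean(prior_mean, prior_var, data, noise_var):
    """Conjugate Bayesian update of a Normal prior on a model parameter,
    given Normally distributed observations with known noise variance."""
    data = np.asarray(data, dtype=float)
    n = data.size
    post_var = 1.0 / (1.0 / prior_var + n / noise_var)
    post_mean = post_var * (prior_mean / prior_var + data.sum() / noise_var)
    return post_mean, post_var

# Prior belief about a (hypothetical) log crack-growth rate, updated
# with three inspection observations; all values illustrative
mean1, var1 = update_normal_mean(prior_mean=-6.0, prior_var=0.25,
                                 data=[-5.8, -6.1, -5.9], noise_var=0.04)
```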
APA, Harvard, Vancouver, ISO, and other styles
12

Dicks, Anelda. "Value at risk and expected shortfall : traditional measures and extreme value theory enhancements with a South African market application". Thesis, Stellenbosch : Stellenbosch University, 2013. http://hdl.handle.net/10019.1/85674.

Full text of the source
Abstract:
Thesis (MComm)--Stellenbosch University, 2013.
ENGLISH ABSTRACT: Accurate estimation of Value at Risk (VaR) and Expected Shortfall (ES) is critical in the management of extreme market risks. These risks occur with small probability, but the financial impacts can be large. Traditional models to estimate VaR and ES are investigated. Following usual practice, 99% 10-day VaR and ES measures are calculated. A comprehensive theoretical background is first provided and then the models are applied to the Africa Financials Index from 29/01/1996 to 30/04/2013. The models considered include independent, identically distributed (i.i.d.) models and Generalized Autoregressive Conditional Heteroscedasticity (GARCH) stochastic volatility models. Extreme Value Theory (EVT) models that focus especially on extreme market returns are also investigated. For this, the Peaks Over Threshold (POT) approach to EVT is followed. For the calculation of VaR, various scaling methods from one day to ten days are considered and their performance evaluated. The GARCH models fail to converge during periods of extreme returns. During these periods, EVT forecast results may be used. As a novel approach, this study considers the augmentation of the GARCH models with EVT forecasts. The two-step procedure of pre-filtering with a GARCH model and then applying EVT, as suggested by McNeil (1999), is also investigated. This study identifies some of the practical issues in model fitting. It is shown that no single forecasting model is universally optimal and the choice will depend on the nature of the data. For this data series, the best approach was to augment the GARCH stochastic volatility models with EVT forecasts during periods where the former do not converge. Model performance is judged by the actual number of VaR and ES violations compared to the expected number. The expected number is taken as the number of return observations over the entire sample period, multiplied by 0.01 for the 99% VaR and ES calculations.
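For reference, the sketch below gives a minimal empirical (historical-simulation) estimator of 99% VaR and ES; the GARCH, EVT (POT) and augmented models investigated in the thesis go well beyond this. The heavy-tailed sample merely stands in for daily losses.

```python
import numpy as np

def var_es(losses, level=0.99):
    """Empirical Value at Risk and Expected Shortfall of a loss sample:
    VaR is the level-quantile, ES the mean loss beyond the VaR."""
    losses = np.asarray(losses, dtype=float)
    var = np.quantile(losses, level)
    return var, losses[losses >= var].mean()

# A heavy-tailed stand-in for daily losses (illustrative only)
rng = np.random.default_rng(0)
var99, es99 = var_es(rng.standard_t(df=4, size=2000))
```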
APA, Harvard, Vancouver, ISO, and other styles
13

Kroon, Rodney Stephen. "A framework for estimating risk". Thesis, Link to the online version, 2008. http://hdl.handle.net/10019.1/1104.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
14

Allen, H. Joel. "A Behavioral Model for Detection of Acute Stress in Bivalves". Thesis, University of North Texas, 1998. https://digital.library.unt.edu/ark:/67531/metadc277998/.

Full text of the source
Abstract:
A behavioral model for acute responses in bivalves was developed using time series analysis, for use in a real-time biomonitoring unit. Stressed bivalves closed their shells and waited for the stressful conditions to pass. Baseline data showed that the group behavior of fifteen bivalves was periodic; individuals, however, behaved independently. Group behavior did not change by more than 30 percent over a period of 20 minutes, whereas following toxic exposures the group behavior changed by more than 30 percent within 20 minutes. Behavior was mathematically modeled using autoregression to compare current and past behavior. A logical alarm applied to the behavior model determined when organisms were stressed. The ability to disseminate data collected in real time via the Internet was demonstrated.
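A minimal sketch of the autoregression-plus-alarm idea: fit an AR(1) model to baseline group behaviour and flag observations deviating from the one-step prediction by more than 30 percent. The data, the AR order and the relative-deviation alarm rule are assumptions for illustration, not the thesis's exact specification.

```python
import numpy as np

def fit_ar1(x):
    """Least-squares fit of an AR(1) model x_t = c + phi * x_{t-1} + e_t."""
    X = np.column_stack([np.ones(len(x) - 1), x[:-1]])
    (c, phi), *_ = np.linalg.lstsq(X, x[1:], rcond=None)
    return c, phi

def alarm(x, c, phi, threshold=0.30):
    """Flag stress when behaviour deviates from the one-step AR
    prediction by more than the threshold fraction."""
    pred = c + phi * x[:-1]
    return np.abs(x[1:] - pred) / np.maximum(np.abs(pred), 1e-9) > threshold

# x: fraction of the fifteen bivalves open per time step (illustrative)
x = np.array([0.80, 0.82, 0.79, 0.81, 0.80, 0.45])
c, phi = fit_ar1(x)
stressed = alarm(x, c, phi)     # True where the 30% rule is breached
```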
APA, Harvard, Vancouver, ISO, and other styles
15

Sun, Yu. "Risk-based framework for freight movement analysis". Thesis, Queensland University of Technology, 2002.

Find the full text of the source
Abstract:
Decision-making models have, in recent years, been developed to provide systematic and comprehensive tools to analyse, evaluate and manage freight movement. Freight transport models developed thus far have not precisely defined the risk agents brought by travelling vehicles, which leads to indistinct risk types. Instead, most of the models developed discuss mainly the risks related to direct impacts of traffic accidents. On the other hand, transport efficiency, which is of growing concern, has not been sufficiently emphasised in previous models. This thesis studies factors relating not only to safety issues but also to efficiency issues. The risks due to freight movement have been classified into categories in accordance with their distinct natures and, typically, the affected population groups, including humans, the environment and economic infrastructure. Two new concepts, the risk agent and the risk response factor, have been introduced into the framework to precisely define and evaluate the various risk types. A vehicle travelling on a specific route may encounter various situations arising from combinations of road characteristics, traffic flow, weather conditions, etc. In order to assist in analysing freight movement in a systematic manner, freight movement behaviours which have negative impacts on the nearby population are divided into different modes, interpreted as "risk-producing activities" in the Risk-based Framework for Freight Movement Analysis (RBF-FMA), in order to identify the characteristics of the risk agents. It is important to differentiate segments with significant changes in the risk-producing travel conditions. This study therefore divides the travel route into segments, each assessed separately, and differentiates among three segment types according to their different contributing factors: travel segment, intersection and roundabout. The framework developed in this study also considers the availability of emergency response facilities and support systems as a major risk-reducing factor. When applied and compared with the risk ratings estimated by the Queensland Transport Department (QTD) using their risk-rating model, the RBF-FMA gave highly comparable results. In the evaluation, both the QTD and RBF-FMA models were applied to assess the risk associated with the release of hazardous materials at 25 segments identified as high risk by the QTD. The RBF-FMA was also successfully applied to compare two routes between two common points, and the results were generally consistent with the concentration of human population, environmental population and economic activity and infrastructure along the two routes. The basic data needed to conduct the RBF-FMA was easily generated from site visits and available databases. While the RBF-FMA presents a logical framework based on the risk assessment and management methodology, the process of assigning scores (ranks) and weights to the various factors and sub-factors is essentially subjective and reflects the education, values, judgement and understanding of the model by the user. It is recognised that this subjectivity can lead to variability in the results. The contribution of this study is significant in that it translates the basic risk assessment model used in the public health field into a framework suitable for rating the risk associated with freight movement. In addition, it presents a basic modelling approach that can be modified or built on by other researchers in this field.
The framework formulated in this study is worthy of further research and development, as it can serve as a useful system for making decisions related to moving freight along selected routes. Further work could include the development of a GIS-based computer program able to hold the large amounts of risk assessment data associated with freight movement and to provide a visual interface for the risk analysis.
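A toy sketch of the scoring-and-weighting step the abstract describes: each segment receives factor scores that are combined with weights, and segment scores are summed along a route. The factor names, scores and weights are assumptions for illustration, not values from the RBF-FMA.

```python
def segment_risk_score(scores, weights):
    """Weighted risk score for one route segment: factor scores (ranks
    assigned by the assessor) combined with factor weights."""
    return sum(s * w for s, w in zip(scores, weights))

# Hypothetical factors: road condition, traffic flow, weather,
# nearby population, emergency-response availability (low = risk-reducing)
weights = [0.25, 0.20, 0.15, 0.30, 0.10]
segments = [[4, 3, 2, 5, 1],      # segment 1 scores
            [2, 4, 3, 2, 3]]      # segment 2 scores
route_score = sum(segment_risk_score(s, weights) for s in segments)
```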
APA, Harvard, Vancouver, ISO, and other styles
16

Mello, Bernardo Brazão Rego. "Classificação de risco setorial com base nos métodos Weighted Influence Non-linear Gauge System e Analytic Hierarchy Process". Biblioteca Digital do Banco Nacional de Desenvolvimento Econômico e Social, 2014. http://web.bndes.gov.br/bib/jspui/handle/1408/5341.

Full text of the source
Abstract:
Bibliography: p. 46-48
Dissertation (Master's) - Faculdade de Economia e Finanças Ibmec, Rio de Janeiro, 2014.
Due to the increasing importance of the financial markets over the past decades, credit risk has become a paramount issue in investment, loan spreads, corporate solvency, trends and prospects, etc. Credit risk evaluation models may be classified in two broad categories: quantitative and qualitative. Quantitative models seek to analyze information from financial statements and their indexes, while qualitative models focus on the analysis of intangible variables that affect global business. These models typically follow a top-down approach by analyzing industry risk, competitiveness, peer comparison and management. The aim of this thesis is to present an industry risk assessment model, based on multicriteria analysis methods, that can measure the strength of the variables that affect the industries of the Brazilian economy, as well as the influence among them. The model is based primarily on the Weighted Influence Non-Linear Gauge System method. Concerning human judgements about the variables, the model is founded on the use of the Analytic Hierarchy Process method. The result of the model is presented through risk levels, applied to fourteen industries of the Brazilian economy. The thesis closes with a discussion of the results, as well as an outline of future research directions.
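As a small illustration of the Analytic Hierarchy Process component, the sketch below extracts priority weights from a pairwise-comparison matrix as its normalised principal eigenvector. The WINGS method and the full sector-rating model are not reproduced, and the judgement matrix is assumed.

```python
import numpy as np

def ahp_priorities(A):
    """Priority weights from an AHP pairwise-comparison matrix A
    (A[i, j] = relative importance of criterion i over j), taken as
    the normalised principal eigenvector."""
    eigvals, eigvecs = np.linalg.eig(np.asarray(A, dtype=float))
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    return w / w.sum()

# Three illustrative criteria with consistent pairwise judgements
w = ahp_priorities([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 5/3],
                    [1/5, 3/5, 1.0]])
```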
APA, Harvard, Vancouver, ISO, and other styles
17

Sahlin, Carl, and Carl-Johan Hugner. "Dealing with the ORSA : A Dynamic Risk-Factor Based Approach for the Small, Swedish Non-Life Insurer". Thesis, KTH, Industriell Management, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-133477.

Full text of the source
Abstract:
The Own Risk and Solvency Assessment, ORSA, is referred to as the heart of the forthcoming regulation for European insurance companies, Solvency II. The aim of the ORSA process is to provide an overall and holistic view of the insurer's risks by analyzing its current financial status and the business strategy at hand. There is no predefined way to implement this process, which means that companies are forced to develop a model themselves, as they see fit. In collaboration with a regional insurance company in Sweden, we develop a structure and framework for an ORSA model that is flexible enough to be used by similar insurers yet standardized enough to overcome the issue of constrained resources within these smaller organizations. We apply a risk-factor based approach and tie together a balance sheet projection and stress testing, designed to be further developed as the individual insurer sees fit. The suggested approach yields partially satisfying results, and we consider the model to be particularly well suited for assessing risk in the context of the small, non-life insurer.
APA, Harvard, Vancouver, ISO, and other styles
18

Omrane, Fatma. "Human health risk assessment of occupational exposure to trace metallic elements mixtures in metalworking industries in the Sfax metropolis (Tunisia)". Thesis, Université de Lorraine, 2018. http://www.theses.fr/2018LORR0097/document.

Full text of the source
Abstract:
Trace metallic elements (TMEs) are pollutants of great concern even in trace amounts because of their toxicity and cumulative properties. Some of them can be carcinogenic. The Sfax metropolis, located in the southern region of Tunisia, has been affected by releases of TMEs for decades. Several studies confirmed that this pollution is predominantly of anthropogenic origin, mainly from industrial activities. It represents a threat to the health of residents, particularly for those also exposed during occupational activities in industrial processes. The present study aims to assess the health risks associated with occupational exposure in industries handling TMEs in their production processes, following the human health risk assessment approach. To this end, five companies using raw material containing TMEs to produce a variety of metallic products accepted to participate in the study. The metals investigated are Al, Cr, Ni, Cu, Zn and Pb. Mathematical models for estimating occupational exposure to chemicals were used to predict indoor air TME exposure levels in 15 different job tasks. Air monitoring was conducted in order to compare the predicted workplace air concentrations with the directly measured ones, using both workplace-fixed monitors and personal samplers. Finally, urine samples were collected from 61 workers to assess whether TME excretion correlates with job exposure levels. Globally, the predicted air estimates agree well with the measured concentrations over the whole set of job tasks. Better predictions were found for certain activities, in particular for steel cutting and welding processes. The values corresponding to the 90th percentile of the exposure distribution were then used in the interaction-based hazard index HIint to assess health risks associated with the mixtures of TMEs. Total cancer risk was also investigated. Results showed high exposures to metals that may elicit respiratory conditions, with a HIint reaching 93.6, the highest levels being for the shielded metal arc welding and the metal shearing and slitting tasks. The risk is enhanced by a synergetic effect between Cr, Ni and Cu. High risks of lung and kidney cancers were demonstrated (the predicted lifelong total cancer risk for exposed workers is 3.7×10-4). This work shows that mathematical models can be accurate in predicting TME airborne exposure levels for several processes in the metallurgical industry, a result that can help the different stakeholders manage exposure surveillance and abatement efficiently. Progress in industrial hygiene is needed in this industrial sector to reduce the high level of health risk currently experienced by metalworking workers.
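A minimal sketch of the additive hazard-index idea underlying HIint: exposure-to-reference ratios are summed, and a total above 1 flags potential concern. The interaction weighting that defines HIint proper is omitted, and the concentrations and reference values are assumptions, not measurements from the study.

```python
import numpy as np

def hazard_index(concentrations, reference_values):
    """Additive hazard index: sum of exposure-to-reference ratios.
    HI > 1 flags potential risk; interaction-based variants (HIint)
    additionally weight each term for synergy or antagonism."""
    c = np.asarray(concentrations, dtype=float)
    ref = np.asarray(reference_values, dtype=float)
    return float(np.sum(c / ref))

# Hypothetical 90th-percentile air levels and reference values (mg/m3)
hi = hazard_index([0.04, 0.02, 0.10], [0.0005, 0.0002, 0.2])
```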
APA, Harvard, Vancouver, ISO, and other styles
19

Khajeh-Hosseini, Ali. "Supporting system deployment decisions in public clouds". Thesis, University of St Andrews, 2013. http://hdl.handle.net/10023/3412.

Full text of the source
Abstract:
Decisions to deploy IT systems on public Infrastructure-as-a-Service clouds can be complicated, as evaluating the benefits, risks and costs of using such clouds is not straightforward. The aim of this project was to investigate the challenges that enterprises face when making system deployment decisions in public clouds, and to develop vendor-neutral tools to inform decision makers during this process. Three tools were developed to support decision makers:

1. Cloud Suitability Checklist: a simple list of questions to provide a rapid assessment of the suitability of public IaaS clouds for a specific IT system.
2. Benefits and Risks Assessment tool: a spreadsheet that includes the general benefits and risks of using public clouds; this provides a starting point for risk assessment and helps organisations start discussions about cloud adoption.
3. Elastic Cost Modelling: a tool that enables decision makers to model their system deployment options in public clouds and forecast their costs.

These three tools collectively enable decision makers to investigate the benefits, risks and costs of using public clouds, and effectively support them in making system deployment decisions. Data was collected from five case studies and hundreds of users to evaluate the effectiveness of the tools. This data showed that the cost effectiveness of using public clouds is situation dependent rather than universally less expensive than traditional forms of IT provisioning. Running systems on the cloud using a traditional 'always on' approach can be less cost effective than on-premise servers, and the elastic nature of the cloud has to be considered if costs are to be reduced. Decision makers have to model the variations in resource usage and their systems' deployment options to obtain accurate cost estimates. Performing upfront cost modelling is beneficial, as there can be significant cost differences between different cloud providers, and between different deployment options within a single cloud. During such modelling exercises, the variations in a system's load (over time) must be taken into account to produce more accurate cost estimates, and the notion of elasticity patterns presented in this thesis provides one simple way to do this.
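A toy version of the 'always on' versus elastic comparison described above; the hourly rate and usage hours are assumptions, not figures from the case studies.

```python
def annual_cost(hourly_rate, hours_per_day, servers, days=365):
    """Annual cloud cost: rate per server-hour times hours used."""
    return hourly_rate * hours_per_day * servers * days

# An 'always on' deployment versus one run only during office hours
always_on = annual_cost(hourly_rate=0.10, hours_per_day=24, servers=4)
elastic = annual_cost(hourly_rate=0.10, hours_per_day=9, servers=4)
saving = always_on - elastic    # elasticity pays only if usage varies
```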
APA, Harvard, Vancouver, ISO, and other styles
20

LEITE, ELIANA R. "Indicadores de segurança para um depósito final de fontes radioativas seladas". Repositório Institucional do IPEN, 2012. http://repositorio.ipen.br:8080/xmlui/handle/123456789/10142.

Full text of the source
Abstract:
Dissertation (Master's)
IPEN/D
Instituto de Pesquisas Energeticas e Nucleares - IPEN-CNEN/SP
APA, Harvard, Vancouver, ISO, and other styles
21

Getley, Ian L. Department of Aviation Faculty of Science UNSW. "Cosmic and solar radiation monitoring of Australian commercial flight crew at high southern latitudes as measured and compared to predictive computer modelling". Awarded by: University of New South Wales, 2007. http://handle.unsw.edu.au/1959.4/40536.

Full text of the source
Abstract:
This study set out to examine the levels of galactic cosmic radiation exposure received by Australian aircrew during routine flight operations, with particular attention to the high southern latitude flights between Australia and South Africa. Latitudes as high as 65° South were flown to gain the data and are typical of the normal flight routes flown between Sydney and Johannesburg on a daily basis. In achieving this objective it became evident that suitable commercially available radiation monitoring equipment was not readily available, and scientific radiation monitors were sourced from overseas research facilities to complement my own FH41B and Liulin monitors provided by UNSW. At the same time it became apparent that several predictive codes had been developed to attempt to model the radiation doses received by aircrew based on flight route, latitudes and altitudes. Further, it became apparent that these codes had not been subjected to verification at high southern latitudes and that they had not been validated for the effects of solar particle events. Initially, measurements were required at the high latitudes, followed by mid-latitude data to further balance the PCAIRE code to ensure reasonableness of results for both equatorial and high latitudes. Whilst undertaking this study, new scientific monitors became available, which provided an opportunity to observe comparative data and results. The Liulin, QDOS and a number of smaller personal dosimeters were subsequently obtained and evaluated. This appears to be the first time that such an extensive cross comparison of these monitors has been conducted over such a wide range of latitudes and altitudes. During the course of this study a fortuitous encounter with GLE 66 enabled several aspects of code validation to be examined, namely the inability of predictive codes to estimate the increased dose associated with a GLE or the effects of a Forbush decrease on the code results. Finally, I review the known biological effects as discussed by numerous authors based on current epidemiological studies, with a view to highlighting where the advent of future technology in aviation may project aircrew dose levels.
APA, Harvard, Vancouver, ISO, and other styles
22

Marshall, Scott. "An Empirical Approach to Evaluating Sufficient Similarity: Utilization of Euclidean Distance As A Similarity Measure". VCU Scholars Compass, 2010. http://scholarscompass.vcu.edu/etd/102.

Full text of the source
Abstract:
Individuals are exposed to chemical mixtures while carrying out everyday tasks, with unknown risk associated with exposure. Given the number of resulting mixtures it is not economically feasible to identify or characterize all possible mixtures. When complete dose-response data are not available on a (candidate) mixture of concern, EPA guidelines define a similar mixture based on chemical composition, component proportions and expert biological judgment (EPA, 1986, 2000). Current work in this literature is by Feder et al. (2009), evaluating sufficient similarity in exposure to disinfection by-products of water purification using multivariate statistical techniques and traditional hypothesis testing. The work of Stork et al. (2008) introduced the idea of sufficient similarity in dose-response (making a connection between exposure and effect). They developed methods to evaluate sufficient similarity of a fully characterized reference mixture, with dose-response data available, and a candidate mixture with only mixing proportions available. A limitation of the approach is that the two mixtures must contain the same components. It is of interest to determine whether a fully characterized reference mixture (representative of the random process) is sufficiently similar in dose-response to a candidate mixture resulting from a random process. Four similarity measures based on Euclidean distance are developed to aid in the evaluation of sufficient similarity in dose-response, allowing for mixtures to be subsets of each other. If a reference and candidate mixture are concluded to be sufficiently similar in dose-response, inference about the candidate mixture can be based on the reference mixture. An example is presented demonstrating that the benchmark dose (BMD) of the reference mixture can be used as a surrogate measure of BMD for the candidate mixture when the two mixtures are determined to be sufficiently similar in dose-response. Guidelines are developed that enable the researcher to evaluate the performance of the proposed similarity measures.
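The sketch below shows the basic ingredient of such similarity measures: the Euclidean distance between mean dose-response profiles of a reference and a candidate mixture on a common dose grid. The four measures developed in the thesis, and the handling of mixtures whose components differ, are not reproduced; the dose grid and responses are assumed.

```python
import numpy as np

def dose_response_distance(ref_means, cand_means):
    """Euclidean distance between the mean responses of a reference and
    a candidate mixture evaluated on a common dose grid; a small
    distance supports a claim of sufficient similarity (the cut-off
    must be calibrated)."""
    ref = np.asarray(ref_means, dtype=float)
    cand = np.asarray(cand_means, dtype=float)
    return float(np.linalg.norm(ref - cand))

# Mean responses at four common dose levels (illustrative values)
d = dose_response_distance([0.0, 0.12, 0.35, 0.60], [0.0, 0.10, 0.33, 0.66])
```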
APA, Harvard, Vancouver, ISO, and other styles
23

Gobira, Diogo Barboza. "Precificação de derivativos exóticos no mercado de petróleo". Repositório Institucional do BNDES, 2014. http://web.bndes.gov.br/bib/jspui/handle/1408/7023.

Full text of the source
Abstract:
Bibliography: p. 109-111
Dissertation (Master's) - Instituto Nacional de Matemática Pura e Aplicada, Rio de Janeiro, 2014.
We study the pricing of exotic options in the markets for oil and its derivatives. We begin with an exploratory analysis of the data, revisiting statistical properties and stylized facts related to volatilities and correlations. Based on the results of this analysis, we present some of the main commodity forward models and a wide range of deterministic volatility structures, as well as their calibration methods, for which we run tests with real market data. To improve the performance of such models in pricing the volatility smile, we reformulate the Heston stochastic volatility model to cope with one or multiple forward curves together, allowing its use for the pricing of contracts written on multiple commodities. We calibrate and test such models for the oil, gasoline and natural gas markets, confirming their superiority over deterministic volatility models. To support the pricing of exotic options and OTC contracts, we also revisit, from both theoretical and practical points of view, tools and issues such as Monte Carlo simulation, numerical solutions to SDEs and American exercise. Finally, through a battery of numerical simulations, we show how the presented models can be used to price typical exotic options occurring in commodity markets, such as calendar spread options, crack spread options and Asian options.
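As a small self-contained example of the Monte Carlo machinery mentioned above, the sketch below prices an arithmetic-average Asian call under plain geometric Brownian motion. The thesis's forward-curve and Heston-type models, and the spread-option payoffs, are not reproduced; all market parameters are assumed.

```python
import numpy as np

def asian_call_mc(s0, strike, r, sigma, T, steps=64, paths=100_000, seed=0):
    """Monte Carlo price of an arithmetic-average Asian call under
    risk-neutral geometric Brownian motion."""
    rng = np.random.default_rng(seed)
    dt = T / steps
    z = rng.standard_normal((paths, steps))
    log_paths = np.cumsum((r - 0.5 * sigma**2) * dt
                          + sigma * np.sqrt(dt) * z, axis=1)
    s = s0 * np.exp(log_paths)                      # simulated price paths
    payoff = np.maximum(s.mean(axis=1) - strike, 0.0)
    return np.exp(-r * T) * payoff.mean()

price = asian_call_mc(s0=100.0, strike=100.0, r=0.05, sigma=0.30, T=1.0)
```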
APA, Harvard, Vancouver, ISO, and other styles
24

Siu, Kin-bong Bonny, and 蕭健邦. "Expected shortfall and value-at-risk under a model with market risk and credit risk". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2006. http://hub.hku.hk/bib/B37727473.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
25

Ngwenza, Dumisani. "Quantifying Model Risk in Option Pricing and Value-at-Risk Models". Master's thesis, Faculty of Commerce, 2019. http://hdl.handle.net/11427/31059.

Full text of the source
Abstract:
Financial practitioners use models in order to price, hedge and measure risk. These models are reliant on assumptions and are prone to "model risk". Increased innovation in complex financial products has led to increased risk exposure and has spurred research into understanding model risk and its underlying factors. This dissertation quantifies the model risk inherent in Value-at-Risk (VaR) on a variety of portfolios comprised of European options written on the ALSI futures index across various maturities. The European options under consideration are modelled using the Black-Scholes, Heston and Variance-Gamma models.
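For context, the sketch below prices a European call with the Black-Scholes formula and takes the spread of prices across candidate inputs as a crude gauge of model risk; the dissertation's comparison of Black-Scholes, Heston and Variance-Gamma models is far more involved, and all option parameters here are assumed.

```python
import numpy as np
from scipy.stats import norm

def bs_call(s, k, r, sigma, T):
    """Black-Scholes price of a European call."""
    d1 = (np.log(s / k) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return s * norm.cdf(d1) - k * np.exp(-r * T) * norm.cdf(d2)

# A crude model-risk gauge: spread of prices across candidate models,
# proxied here by candidate volatilities for the same option
prices = [bs_call(100.0, 100.0, 0.06, sig, 0.5) for sig in (0.18, 0.22, 0.26)]
model_risk = max(prices) - min(prices)
```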
APA, Harvard, Vancouver, ISO, and other styles
26

Gu, Jiawen, and 古嘉雯. "On credit risk modeling and credit derivatives pricing". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2014. http://hdl.handle.net/10722/202367.

Full text of the source
Abstract:
In this thesis, efforts are devoted to the stochastic modeling, measurement and evaluation of credit risks, the development of mathematical and statistical tools to estimate and predict these risks, and methods for solving the significant computational problems arising in this context. Reduced-form intensity-based credit risk models are studied. A new type of reduced-form intensity-based model is introduced, which can incorporate the impacts of both observable trigger events and the economic environment on corporate defaults. The key idea of the model is to augment a Cox process with trigger events. In addition, this thesis focuses on the relationship between the structural firm-value model and the reduced-form intensity-based model. A continuous-time structural asset value model for the asset value of two correlated firms with a two-dimensional Brownian motion is studied. With incomplete information introduced, the information set available to market participants includes the default time of each firm and the periodic asset value reports. The original structural model is first transformed into a reduced-form model. Then the conditional distribution of the default time as well as the asset value of each name are derived. The existence of the intensity processes of the default times is proven and an explicit form of the intensity processes is given. Discrete-time Markovian models of credit crises are considered. Markovian models are proposed to capture the default correlation in a multi-sector economy. The main idea is to describe the infection (defaults) in various sectors by using an epidemic model. Green's model, an epidemic model, is applied to characterize the infectious effect in each sector, and dependence structures among the various sectors are also proposed. The models are then applied to the computation of Crisis Value-at-Risk (CVaR) and Crisis Expected Shortfall (CES). The relationship between correlated defaults in different industrial sectors and business cycles, as well as the impact of business cycles on modeling and predicting correlated defaults, is investigated using the Probabilistic Boolean Network (PBN). The idea is to model the credit default process by a PBN, whose network structure can be inferred using Markov chain theory and real-world data. A reduced-form model for economic and recorded default times is proposed and the probability distributions of these two default times are derived. A numerical study of the difference between the two shows that the proposed model can both capture the features and fit the empirical data. A simple and efficient method, based on the ordered default rate, is derived to compute the ordered default time distributions in both the homogeneous case and the two-group heterogeneous case under the interacting-intensity default contagion model. Analytical expressions for the ordered default time distributions, with recursive formulas for the coefficients, are given, which makes the calculation fast and efficient in finding the rates of basket CDSs.
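As a small illustration of the reduced-form idea, the sketch below simulates a default time as the first moment the integrated intensity exceeds an independent unit-exponential threshold. The trigger events, regime dependence and contagion mechanisms of the thesis are not represented; the constant intensity is assumed.

```python
import numpy as np

def default_time(intensity, horizon, dt=1e-3, rng=None):
    """Simulate a default time in a reduced-form intensity model:
    default occurs when the integrated intensity first exceeds an
    independent Exp(1) threshold; np.inf if no default by the horizon."""
    rng = rng or np.random.default_rng()
    e = rng.exponential(1.0)
    t, cum = 0.0, 0.0
    while t < horizon:
        cum += intensity(t) * dt
        if cum >= e:
            return t
        t += dt
    return np.inf

# Constant 2%-per-year intensity over a 10-year horizon (illustrative)
tau = default_time(lambda t: 0.02, horizon=10.0)
```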
Mathematics
Doctoral
Doctor of Philosophy
APA, Harvard, Vancouver, ISO, and other styles
27

Ikwuegbu, Chigozie Charles. "Models for Risk assessment of Mobile applications". Thesis, Blekinge Tekniska Högskola, Institutionen för datavetenskap, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-20119.

Full text of the source
Abstract:
Mobile applications are software that extend the functionality of our smartphones by connecting us with friends and a wide range of other services. Android, an operating system based on the Linux kernel, leads the market with over 2.6 million applications recorded on its official store. Application developers, driven by the ever-growing pace of innovation in smartphones, are compelled to release new ideas on limited budgets and timelines, which results in the deployment of malicious applications. Although there exists a security mechanism on the Google Play Store to remove these applications, studies have shown that most of the applications on the app store compromise privacy or pose security-related risks. It is therefore essential to investigate the security risk of installing any of these applications on a device. The objectives are to identify methods and techniques for assessing mobile application security, investigate how attributes indicate the harmfulness of applications, and evaluate the performance of K Nearest Neighbors (K-NN) and Random Forest machine learning models in assessing the security risk of installing mobile applications, based on information available on the application distribution platform. A literature analysis was done to gather information on the different methods and techniques for assessing security in mobile applications, and on how different attributes on the application distribution platform indicate the harmfulness of an application. An experiment was also conducted to examine how various machine learning models perform in evaluating the security risk associated with installing applications, based on information on the application distribution platform. The literature analysis presents the various methods and techniques for mobile application security assessment and identifies how mobile application attributes indicate the harmfulness of mobile applications. The experimental results demonstrate the performance of the aforementioned machine learning models in evaluating the security risk of installing mobile applications. In conclusion, static, dynamic, and grey-box analysis are the methods used to evaluate mobile application security, and machine learning models including K-NN and Random Forest are suitable techniques for evaluating mobile application security risk. Attributes such as the permissions, number of installations, and ratings reveal the likelihood and impact of an underlying security threat. The K-NN and Random Forest models, when compared on the task of evaluating the security risk of installing mobile applications based on information on the application distribution platform, both showed high performance, with little difference between them.
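The sketch below shows, in outline, the kind of model comparison described: a K-NN and a Random Forest classifier trained on store-level attributes (permission count, installations, rating). The data and labelling rule are synthetic stand-ins, not the study's dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in for store metadata: [n_permissions, log10(installs), rating]
rng = np.random.default_rng(0)
X = np.column_stack([rng.integers(1, 30, 500),
                     rng.uniform(2.0, 9.0, 500),
                     rng.uniform(1.0, 5.0, 500)])
y = (X[:, 0] > 15).astype(int)      # toy label: permission-hungry apps flagged

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
models = {"K-NN": KNeighborsClassifier(n_neighbors=5),
          "Random Forest": RandomForestClassifier(n_estimators=100,
                                                  random_state=0)}
accuracy = {name: m.fit(X_tr, y_tr).score(X_te, y_te)
            for name, m in models.items()}
```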
Style APA, Harvard, Vancouver, ISO itp.
28

Liu, Binbin, i 刘彬彬. "Some topics in risk theory and optimal capital allocation problems". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2012. http://hub.hku.hk/bib/B48199291.

Pełny tekst źródła
Streszczenie:
In recent years, the Markov regime-switching model and the class of Archimedean copulas have been widely applied to a variety of finance-related fields. The Markov regime-switching model reflects the reality that the underlying economy changes over time. Archimedean copulas are one of the most popular classes of copulas because they have closed-form expressions and great flexibility in modeling different kinds of dependencies. In this thesis, we first consider a discrete-time risk process based on the compound binomial model with regime switching. Some general recursive formulas for the expected penalty function are obtained, and the orderings of ruin probabilities are investigated. In particular, we show that if there exists a stochastic dominance relationship between random claims in different regimes, then we can order the ruin probabilities under different initial regimes. Regarding capital allocation problems, an important area in finance and risk management, this thesis studies the optimal allocation of policy limits and deductibles when the dependence structure among risks is modeled by an Archimedean copula. By employing the concepts of arrangement increasing functions and stochastic dominance, useful qualitative results on the optimal allocations are obtained. We then turn our attention to a new family of risk measures satisfying a set of proposed axioms, which includes the class of distortion risk measures with concave distortion functions. By minimizing the new risk measures, we consider the optimal allocation of policy limits and deductibles under the assumption that for each risk there exists an indicator random variable which determines whether the risk occurs or not. Several sufficient conditions for ordering the optimal allocations are obtained using tools from stochastic dominance theory.
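A small Monte Carlo sketch of a regime-switching compound binomial surplus process, illustrating how ruin probabilities can differ by initial regime; the thesis's recursive penalty-function formulas are not reproduced here, and all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two-regime compound binomial surplus: each period a claim occurs with
# probability p[r] and, if it does, has geometric size; the regime then
# switches according to the Markov chain P. Premium income is 1 per period.
p = [0.3, 0.6]                   # claim probability per regime
claim_mean = [1.2, 2.0]          # mean claim size per regime
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])       # regime transition matrix

def ruined(u0, r0, horizon=500):
    u, r = u0, r0
    for _ in range(horizon):
        u += 1                                     # premium income
        if rng.random() < p[r]:
            u -= rng.geometric(1 / claim_mean[r])  # claim payment
        if u < 0:
            return True
        r = rng.choice(2, p=P[r])
    return False

for r0 in (0, 1):
    est = np.mean([ruined(10, r0) for _ in range(5000)])
    print(f"ruin probability, initial regime {r0}: ≈ {est:.3f}")
```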
published_or_final_version
Statistics and Actuarial Science
Doctoral
Doctor of Philosophy
Style APA, Harvard, Vancouver, ISO itp.
29

蕭德權 i Tak-kuen Siu. "Risk measures in finance and insurance". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2001. http://hub.hku.hk/bib/B31242297.

Pełny tekst źródła
Style APA, Harvard, Vancouver, ISO itp.
30

Rong, Yian, i 戎軼安. "Applications of comonotonicity in risk-sharing and optimal allocation". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2014. http://hdl.handle.net/10722/207205.

Pełny tekst źródła
Streszczenie:
Over the past decades, researchers in economics, financial mathematics and actuarial science have contributed results on the concept of comonotonicity in their respective fields of interest. Comonotonicity is a very strong dependence structure and is often dismissed as too extreme and unrealistic. However, the concept is actually a useful tool for solving several research and practical problems in capital allocation, risk sharing and optimal allocation. The first topic of this thesis is the application of comonotonicity to optimal capital allocation. The enterprise risk management process of a financial institution usually contains a procedure to allocate the total risk capital of the company to its different business units. Dhaene et al. (2012) proposed a unifying capital allocation framework based on some general deviation measures. This framework is extended to the more general optimization problem of minimizing a separable convex function under a linear constraint and box constraints. A new approach for solving this constrained minimization problem explicitly via comonotonicity is developed. Instead of the traditional Kuhn-Tucker theory, each convex function is expressed as the expected stop-loss of a suitable random variable, which solves the optimization problem; the identity behind this step is sketched below. Some results in convex analysis involving infimum-convolution are then derived using this new approach. Next, Borch's theorem is revisited from the perspective of comonotonicity. The optimal solution to the Pareto-optimal risk-sharing problem can be obtained by the Lagrangian method or by variational arguments. Here, a new method is proposed, based on a Breeden-Litzenberger type integral representation formula for increasing convex functions. It enables the transformation of the objective function into a sum of mixtures of stop-losses. Necessary conditions for the existence of an optimal solution are then discussed. The explicit solution obtained shows that the risk-sharing problem is in fact a "point-wise" problem, and hence the value function can be obtained immediately using the notion of supremum-convolution in convex analysis. In addition to these classical risk-sharing and capital allocation problems, the problem of minimizing a separable convex objective subject to an ordering restriction is studied. Best et al. (2000) proposed a pool-adjacent-violators algorithm to compute the optimal solution. Instead, we show that using the concept of comonotonicity and the technique of dynamic programming, the solution can be derived recursively. By identifying the right-hand derivatives of the convex functions with distribution functions of suitable random variables, the objective function is rewritten as a sum of expected deviations. This transformation, together with the fact that the expected deviation is a convex function, enables us to solve the minimization problem.
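The stop-loss identity referred to above can be sketched as follows, assuming f is convex on [0, ∞) with f(0) finite and its right derivative coinciding with the distribution function of a non-negative random variable X:

```latex
% Sketch of the expected stop-loss representation of a convex function:
% if f'_+ = F_X for a non-negative random variable X, then for x >= 0,
f(x) \;=\; f(0) + \int_0^x F_X(t)\,\mathrm{d}t
      \;=\; f(0) + \mathbb{E}\!\left[(x - X)_+\right].
```

Differentiating the right-hand side recovers F_X, which is exactly the stated right derivative; this is what lets convex terms in the separable objective be traded for expected stop-losses.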
published_or_final_version
Statistics and Actuarial Science
Doctoral
Doctor of Philosophy
Style APA, Harvard, Vancouver, ISO itp.
31

Basak, Rishi. "Environmental management systems and the intra-firm risk relationship". Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape3/PQDD_0034/MQ64316.pdf.

Pełny tekst źródła
Style APA, Harvard, Vancouver, ISO itp.
32

Alharthi, Muteb. "Bayesian model assessment for stochastic epidemic models". Thesis, University of Nottingham, 2016. http://eprints.nottingham.ac.uk/33182/.

Pełny tekst źródła
Streszczenie:
A crucial practical advantage of infectious disease modelling as a public health tool lies in its application to evaluating various disease-control policies. However, such evaluation is of limited use unless a sufficiently accurate epidemic model is applied. If the model provides an adequate fit, it is possible to interpret parameter estimates, compare disease epidemics and implement control procedures. Methods to assess and compare stochastic epidemic models in a Bayesian framework are not well established, particularly in epidemic settings with missing data. In this thesis, we develop novel methods for both model adequacy and model choice for stochastic epidemic models. We work with continuous-time epidemic models and assume that only case detection times of infected individuals are available, corresponding to removal times. Throughout, we illustrate our methods using both simulated outbreak data and real disease data. Data-augmented Markov chain Monte Carlo (MCMC) algorithms are employed to make inference for unobserved infection times and model parameters. Under a Bayesian framework, we first conduct a systematic investigation of three different but natural methods of model adequacy for SIR (Susceptible-Infective-Removed) epidemic models. We proceed to develop a new two-stage method for assessing the adequacy of epidemic models. In this two-stage method, two predictive distributions are examined, namely the predictive distribution of the final size of the epidemic and the predictive distribution of the removal times. The idea is to look explicitly at the discrepancy between the observed and predicted removal times using the posterior predictive model checking approach, in which the notions of Bayesian residuals and the posterior predictive p-value are utilized. This approach differs, most importantly, from classical likelihood-based approaches by taking into account uncertainty in both model stochasticity and model parameters. The two-stage method explores how SIR models with different infection mechanisms, infectious periods and population structures can be assessed and distinguished given only a set of removal times. In the last part of this thesis, we consider Bayesian model choice methods for epidemic models. We derive explicit forms for Bayes factors in two different epidemic settings, given complete epidemic data. Additionally, in the setting where the available data are partially observed, we extend the existing power posterior method for estimating Bayes factors to models incorporating missing data and successfully apply our missing-data extension of the power posterior method to various epidemic settings. We further consider the performance of the deviance information criterion (DIC) for selecting between epidemic models.
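A schematic of the first-stage check on the final size: replicate the epidemic under posterior draws and compare the observed final size with its posterior predictive distribution. The SIR simulator is standard; the observed count and the "posterior draws" below are placeholders, not results from the thesis.

```python
import numpy as np

rng = np.random.default_rng(3)

def sir_final_size(n, beta, gamma):
    """Embedded jump chain of a Markovian SIR epidemic; returns the
    final number of removals (event times are not needed for this)."""
    s, i, r = n - 1, 1, 0
    while i > 0:
        inf_rate = beta * s * i / n
        rem_rate = gamma * i
        if rng.random() < inf_rate / (inf_rate + rem_rate):
            s, i = s - 1, i + 1   # infection event
        else:
            i, r = i - 1, r + 1   # removal event
    return r

observed_final_size = 42   # illustrative observed count
n = 100
# Pretend these are posterior draws of (beta, gamma) from a data-augmented MCMC run.
posterior_draws = [(rng.gamma(2, 0.5), rng.gamma(2, 0.5)) for _ in range(1000)]

replicated = [sir_final_size(n, b, g) for b, g in posterior_draws]
p_value = np.mean([rep >= observed_final_size for rep in replicated])
print("posterior predictive p-value (final size):", p_value)
```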
Style APA, Harvard, Vancouver, ISO itp.
33

Wei, Zhenghong. "Empirical likelihood based evaluation for value at risk models". HKBU Institutional Repository, 2007. http://repository.hkbu.edu.hk/etd_ra/896.

Pełny tekst źródła
Style APA, Harvard, Vancouver, ISO itp.
34

Li, Tang, i 李唐. "Markov chain models for re-manufacturing systems and credit risk management". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2008. http://hub.hku.hk/bib/B40203700.

Pełny tekst źródła
Style APA, Harvard, Vancouver, ISO itp.
35

Hao, Fangcheng, i 郝方程. "Options pricing and risk measures under regime-switching models". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2011. http://hub.hku.hk/bib/B4714726X.

Pełny tekst źródła
Style APA, Harvard, Vancouver, ISO itp.
36

Zhao, Bo. "Overview of Financial Risk Assessment". Kent State University Honors College / OhioLINK, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=ksuhonors1399203159.

Pełny tekst źródła
Style APA, Harvard, Vancouver, ISO itp.
37

Manasse, Paul Reuben. "Time-dependent stochastic models for fire risk assessment". Thesis, University of Liverpool, 1991. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.317171.

Pełny tekst źródła
Style APA, Harvard, Vancouver, ISO itp.
38

Kallis, Constantinos. "Construction and assessment of risk models in medicine". Thesis, University of Warwick, 2005. http://wrap.warwick.ac.uk/79266/.

Pełny tekst źródła
Streszczenie:
This thesis investigates the application of classical and contemporary statistical methods in medical research, attempting to bridge the gap between statistics and clinical medicine. The importance of using both simple and advanced statistical methods in constructing and assessing risk models in medicine is demonstrated by empirical studies of vascular complications, namely abdominal aortic aneurysm (AAA) and diabetic retinopathy. First, data preprocessing and preliminary statistical analysis are examined and their application is investigated using data on abdominal aortic aneurysm. We illustrate that when dealing with missing data, co-operation between statisticians and clinicians is necessary, and we show the advantages and disadvantages of exploratory analysis. Second, we describe and compare classification models for AAA selective screening. Two logistic regression models are proposed. We also show that it is important to assess the performance of classifiers by cross-validation and bootstrapping, and we examine models that include other definitions of abnormality, weighted classification and multiple-class models. Third, we consider the application of graphical models. We look at different types of graphical models that can be used for classification and for identifying the underlying data structure. The use of the Naïve Bayes classifier (NBC) is shown, and subsequently we illustrate Occam's window model selection in a statistical package for Mixed Interactions Modelling (MIM). The EM algorithm and multiple imputation are used to deal with inconsistent entries in the dataset. Modelling mixtures of Normal components is then investigated by graphical modelling and compared with an alternative minimisation procedure. Finally, we examine risk factors for diabetic sight-threatening retinopathy (STR). We show the complexity of data preparation and preliminary analysis, as well as the importance of using the clinicians' opinion when selecting appropriate variables. Blood pressure measurements are examined as predictors of STR, and the fundamental role of imputation and its influence on the conclusions of the study are demonstrated. From this study, we conclude that the application of statistics in medicine is an optimisation procedure where both statistical and clinical validity need to be taken into account. The combination of simple and advanced methods should be used, as it provides additional information. Data, software and time limitations should be considered before and during statistical analysis, and appropriate modifications might be implemented to avoid compromising the quality of the study. Finally, medical research should be regarded by statisticians and clinicians as part of a learning process.
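A small sketch of the kind of classifier assessment mentioned above: a Naïve Bayes classifier evaluated by cross-validation and a simple bootstrap, on synthetic data standing in for the clinical datasets.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score
from sklearn.utils import resample
from sklearn.datasets import make_classification

# Synthetic stand-in for a clinical dataset (the thesis uses AAA screening data).
X, y = make_classification(n_samples=300, n_features=6, random_state=0)

nbc = GaussianNB()
cv_auc = cross_val_score(nbc, X, y, cv=10, scoring="roc_auc")
print("10-fold CV AUC:", cv_auc.mean().round(3))

# A simple (and somewhat optimistic) bootstrap of the same performance measure.
aucs = []
for b in range(200):
    Xb, yb = resample(X, y, random_state=b)
    aucs.append(cross_val_score(nbc, Xb, yb, cv=5, scoring="roc_auc").mean())
print("bootstrap mean AUC:", np.mean(aucs).round(3))
```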
Style APA, Harvard, Vancouver, ISO itp.
39

Palhares, André Vitor de Almeida. "Probabilistic Risk Assessment in Clouds: Models and Algorithms". Universidade Federal de Pernambuco, 2012. https://repositorio.ufpe.br/handle/123456789/10423.

Pełny tekst źródła
Streszczenie:
Cloud reliability is critical to the cloud's success. Although fault-tolerance mechanisms are employed by cloud providers, there is always the possibility of failure of infrastructure components. We consequently need to think proactively about how to deal with the occurrence of failures, in an attempt to minimize their effects. In this work, we draw on the risk concept from probabilistic risk analysis in order to achieve this. In probabilistic risk analysis, consequence costs are associated with failure events of the target system, and failure probabilities are associated with infrastructural components; the risk is the expected consequence over the whole system. We use this risk concept to present representative mathematical models for which computational optimization problems are formulated and solved in a cloud computing environment. In these problems, consequence costs are associated with incoming applications that must be allocated in the cloud, and the risk is treated either as an objective function to be minimized or as a constraint to be bounded. The proposed problems are solved either by reductions to optimally solvable problems or by approximation algorithms with provable performance guarantees. Finally, the models and problems are discussed from a more practical point of view, with examples of how to assess risk using these solutions. The solutions are also evaluated and results on their performance are established, showing that they can be used in the effective planning of the cloud.
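The risk notion used here reduces to an expected consequence; a toy sketch with hypothetical components, costs and allocations:

```python
# Each application is mapped to an infrastructure component whose failure
# probability is known; the risk of an allocation is its expected
# consequence cost. All names and numbers are illustrative.
failure_prob = {"node_a": 0.01, "node_b": 0.05}   # component failure probabilities
consequence = {"app1": 1000.0, "app2": 250.0}     # cost if the hosting node fails

def risk(allocation):
    """Expected consequence of an allocation {app: node}."""
    return sum(consequence[app] * failure_prob[node]
               for app, node in allocation.items())

candidates = [
    {"app1": "node_a", "app2": "node_b"},
    {"app1": "node_b", "app2": "node_a"},
]
best = min(candidates, key=risk)   # risk used as an objective to minimize
print(best, "risk =", risk(best))
```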
Style APA, Harvard, Vancouver, ISO itp.
40

Alexander, Byron Vernon Terry. "Legacy system upgrade for software risk assessment". Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2001. http://handle.dtic.mil/100.2/ADA401409.

Pełny tekst źródła
Streszczenie:
Thesis (M.S. in Computer Science)--Naval Postgraduate School, December 2001.
Thesis advisor(s): Berzins, Valdis; Murrah, Michael. Includes bibliographical references (p. 91). Also available online.
Style APA, Harvard, Vancouver, ISO itp.
41

Powell, Robert. "Industry value at risk in Australia". Thesis, Edith Cowan University, Research Online, Perth, Western Australia, 2007. https://ro.ecu.edu.au/theses/297.

Pełny tekst źródła
Streszczenie:
Value at Risk (VaR) models have gained increasing momentum in recent years. Market VaR is an important issue for banks since its adoption as a primary risk metric in the Basel Accords, together with the requirement that it be calculated on a daily basis. Credit risk modelling has become increasingly important to banks since the advent of Basel II, which allows banks with sophisticated modelling techniques to use internal models for the purpose of calculating capital requirements. A high level of credit risk is often the key reason behind banks failing or experiencing severe difficulty. Conditional Value at Risk (CVaR) measures extreme risk and is gaining popularity with the recognition that high losses are often driven by a small number of extreme events.
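For concreteness, a minimal historical estimator of the two measures discussed, applied to a heavy-tailed toy return series; this is the generic textbook method, not the thesis's methodology.

```python
import numpy as np

def var_cvar(returns, alpha=0.99):
    """Historical VaR and CVaR (expected shortfall) of a return series,
    both reported as positive loss numbers."""
    losses = -np.asarray(returns)
    var = np.quantile(losses, alpha)
    cvar = losses[losses >= var].mean()   # mean loss beyond the VaR level
    return var, cvar

rng = np.random.default_rng(4)
daily_returns = rng.standard_t(df=4, size=2500) * 0.01   # heavy-tailed toy series
var, cvar = var_cvar(daily_returns)
print(f"99% VaR = {var:.4f}, 99% CVaR = {cvar:.4f}")
```

By construction CVaR is at least as large as VaR, which is why it is preferred for capturing the few extreme losses the abstract refers to.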
Style APA, Harvard, Vancouver, ISO itp.
42

Veraart, Luitgard Anna Maria. "Mathematical models for market making, option pricing and systemic risk". Thesis, University of Cambridge, 2007. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.613365.

Pełny tekst źródła
Style APA, Harvard, Vancouver, ISO itp.
43

Zhu, Jinxia, i 朱金霞. "Ruin theory under Markovian regime-switching risk models". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2008. http://hub.hku.hk/bib/B40203980.

Pełny tekst źródła
Style APA, Harvard, Vancouver, ISO itp.
44

Reynolds, Joel Howard. "Multi-criteria assessment of ecological process models using pareto optimization /". Thesis, Connect to this title online; UW restricted, 1997. http://hdl.handle.net/1773/6377.

Pełny tekst źródła
Style APA, Harvard, Vancouver, ISO itp.
45

Patil, Rohit A. "Novel application of quantitative risk assessment modelling to a continuous fermenter". Thesis, 2006. http://hdl.handle.net/2440/69737.

Pełny tekst źródła
Streszczenie:
In the food and pharmaceutical industries, plant failure can be costly and sometimes catastrophic to public health. This thesis uses the notion of "Friday 13th syndrome", i.e. the unexpected failure of a well-operated plant, to develop a new and rigorous mathematical model of a generalised continuous fermenter and to gain insight into the likelihood of bioprocess failure. The new model is developed for a continuous, anaerobic fermenter based on the widely employed Monod process model.
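The Monod backbone of such a fermenter model can be sketched as a standard chemostat; the thesis's anaerobic model is richer, and the rate constants below are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Standard chemostat with Monod kinetics.
mu_max, K_s, Y = 0.4, 0.5, 0.5    # 1/h, g/L, g biomass per g substrate (illustrative)
D, S_in = 0.2, 10.0               # dilution rate (1/h) and feed substrate (g/L)

def chemostat(t, z):
    X, S = z
    mu = mu_max * S / (K_s + S)           # Monod specific growth rate
    return [(mu - D) * X,                 # biomass balance
            D * (S_in - S) - mu * X / Y]  # substrate balance

sol = solve_ivp(chemostat, (0, 100), [0.1, 10.0])
print("steady state (X, S) ≈", sol.y[:, -1].round(3))
```

At the washout-free steady state, mu = D gives S* = K_s D / (mu_max - D) = 0.5 g/L and X* = Y (S_in - S*) = 4.75 g/L for these parameters, which the simulation approaches.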
Thesis (M.Eng.Sc.) -- University of Adelaide, School of Chemical Engineering, 2006
Style APA, Harvard, Vancouver, ISO itp.
46

"Optimal dynamic portfolio selection under downside risk measure". 2014. http://library.cuhk.edu.hk/record=b6116127.

Pełny tekst źródła
Streszczenie:
Instead of controlling "symmetric" risks measured by the central moments of terminal wealth, more and more portfolio models have shifted their focus to managing "asymmetric" downside risks, namely the risk that the investment return falls below a certain threshold. Among the existing downside risk measures, the safety-first principle, the value-at-risk (VaR), the conditional value-at-risk (CVaR) and the lower partial moments (LPM) are probably the most promising representatives.
In this dissertation, we investigate a general class of dynamic mean-downside risk portfolio selection formulations in continuous time, including the mean-exceeding probability, the dynamic mean-VaR, the dynamic mean-LPM and the dynamic mean-CVaR portfolio selection formulations, whereas the current literature has only considered their static versions. Our contributions are two-fold: building up tractable formulations and deriving the corresponding optimal policies. By imposing a limit funding level on the terminal wealth, we overcome the ill-posedness exhibited by this class of mean-downside risk portfolio models. The limit funding level not only enables us to solve dynamic mean-downside risk portfolio optimization problems, but also offers flexibility to tame the aggressiveness of the portfolio policies generated by the mean-downside risk optimization models. Using the quantile method and the martingale approach, we derive optimal solutions for all of the above mean-downside risk models. More specifically, for a general market setting, we prove the existence and uniqueness of the Lagrange multipliers, which is a key step in applying the martingale approach, and establish a theoretical foundation for developing efficient numerical solution approaches. Furthermore, for situations where the opportunity set of the market setting is deterministic, we derive analytical portfolio policies.
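Schematically, one member of this family (the dynamic mean-CVaR problem with a limit funding level b) can be written as follows; the symbols and sign conventions are illustrative rather than the dissertation's exact formulation:

```latex
% x_T^\pi is terminal wealth under policy \pi, d a target mean,
% b the limit funding level that restores well-posedness.
\min_{\pi}\ \mathrm{CVaR}_{\alpha}\!\left(-x_T^{\pi}\right)
\quad \text{s.t.} \quad \mathbb{E}\!\left[x_T^{\pi}\right] \ge d,
\qquad x_T^{\pi} \le b \ \ \text{a.s.}
```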
Zhou, Ke.
Thesis (Ph.D.)--Chinese University of Hong Kong, 2014.
Includes bibliographical references (leaves i-vi).
Abstracts also in Chinese.
Style APA, Harvard, Vancouver, ISO itp.
47

"Bayesian approach for risk bucketing". 2009. http://library.cuhk.edu.hk/record=b5894184.

Pełny tekst źródła
Streszczenie:
Lau, Ka Ho.
Thesis (M.Phil.)--Chinese University of Hong Kong, 2009.
Includes bibliographical references (leaves 46-48).
Abstract also in Chinese.
Chapter 1 --- Introduction to Global Credit Risk Management Standard --- p.1
Chapter 1.1 --- Background --- p.2
Chapter 1.2 --- Basel Accords --- p.2
Chapter 1.3 --- Risk Bucketing --- p.7
Chapter 2 --- Current Practices of Risk Bucketing and PD Estimation --- p.10
Chapter 2.1 --- Credit Scoring --- p.10
Chapter 2.2 --- Risk Bucketing after Credit Scoring --- p.12
Chapter 2.3 --- Related Literature Review --- p.14
Chapter 2.4 --- Objective --- p.16
Chapter 3 --- Bayesian Model for risk bucketing --- p.17
Chapter 3.1 --- The Model --- p.17
Chapter 3.2 --- Posterior Distribution --- p.19
Chapter 3.3 --- Gibbs Sampler for the Posterior Distribution --- p.22
Chapter 3.3.1 --- General Gibbs Sampler Theory --- p.22
Chapter 3.3.2 --- The Gibbs Sampler for the Proposed Model --- p.23
Chapter 3.4 --- Monitoring Convergence of the Gibbs Sampler --- p.26
Chapter 3.5 --- "Estimation, Bucketing and Prediction" --- p.28
Chapter 3.5.1 --- Estimation --- p.28
Chapter 3.5.2 --- Bucketing --- p.28
Chapter 3.5.3 --- Prediction --- p.29
Appendix --- p.29
Chapter 4 --- Simulation Studies and Real Data Analysis --- p.32
Chapter 4.1 --- Simulation Studies --- p.32
Chapter 4.1.1 --- Details of Simulation --- p.32
Chapter 4.1.2 --- Simulation Procedures --- p.34
Chapter 4.1.3 --- Predictive Performance --- p.35
Chapter 4.1.4 --- Summary of Simulation Results --- p.36
Chapter 4.2 --- Real Data Analysis --- p.37
Chapter 5 --- Conclusion and Discussion --- p.44
Bibliography --- p.46
Style APA, Harvard, Vancouver, ISO itp.
48

"Risk management of the financial markets". Chinese University of Hong Kong, 1996. http://library.cuhk.edu.hk/record=b5895616.

Pełny tekst źródła
Streszczenie:
by Chan Pui Man.
Thesis (M.B.A.)--Chinese University of Hong Kong, 1996.
Includes bibliographical references (leaves 108-111).
ABSTRACT --- p.II
TABLE OF CONTENTS --- p.III
ACKNOWLEDGEMENT --- p.VI
Chapter I. --- INTRODUCTION --- p.1
Chapter II. --- LITERATURE REVIEW --- p.4
Impact due to Deregulation --- p.5
Impact due to Globalization --- p.5
Impact due to Securitization --- p.6
Impact due to Institutionalisation --- p.6
Impact due to Computerisation --- p.7
Chapter III. --- CONCEPT: MANAGEMENT OF RISK --- p.8
Definition of Risk --- p.9
Risk Analysis --- p.10
Risk Assessment --- p.10
Risk Measurement --- p.10
Risk Management --- p.11
Chapter IV. --- TYPE OF RISK --- p.13
Market/Capital Risk --- p.14
Reinvestment Risk --- p.15
Interest Rate Risk --- p.16
Credit Risk --- p.17
Liquidity or Funding Risk --- p.18
Currency and Foreign Exchange Risk --- p.19
Inflation Risk --- p.19
Operations Risk --- p.20
Legal Risk --- p.20
Political Risk --- p.21
Systemic Risk --- p.22
Portfolio Risk --- p.22
Control Risk --- p.23
Settlement Risk --- p.23
Country Risk --- p.24
Underwriting Risk --- p.24
Residual or Moral Risk --- p.24
Strategy Risk and Environment Risk --- p.25
Chapter V. --- MEASURING CHANGING RISK --- p.26
Historical Estimates --- p.28
Non-parametric Methods --- p.29
Parametric Methods --- p.30
Chapter VI. --- EVOLUTION OF RISK ESTIMATION --- p.35
Chapter VII. --- APPLYING PORTFOLIO THEORY INTO RISK ANALYSIS --- p.41
Modelling Bank Risk --- p.43
Identification of linkages between an individual loan and bank's overall risk profile --- p.43
Distribution of expected values --- p.44
Portfolio expected value --- p.44
Scenario Analysis and Formation of Loan Risk Measurement --- p.45
Subsystem --- p.45
Formation of an Integrated Risk Measurement --- p.45
Active Management of Portfolio Risk --- p.49
Chapter VIII. --- RISK ANALYSIS OF INTERNATIONAL INVESTMENT --- p.51
Discounted-Cash-Flow Analysis --- p.51
Net Present Value Approach --- p.51
Internal Rate of Return Approach --- p.54
Break-even Probability Analysis --- p.55
Certainty-Equivalent Method --- p.56
Chapter IX. --- CONSTRUCTING A MODEL FOR RISK ASSESSMENT --- p.58
"Set up a Model to Estimate ""Capital at Risk""" --- p.58
Obey the Minimum Standards --- p.60
Audit and Verify the Model --- p.62
Chapter X. --- METHODOLOGIES OF RISK MEASUREMENT
Measuring Market Risk: J.P. Morgan Risk Management Methodology - RiskMetrics™ --- p.64
Statistical Analysis of Returns and Risk --- p.66
Market Moves and Locally Gaussian Processes --- p.72
Stochastic Volatility --- p.72
Risk and Optionality --- p.73
Mapping and Term Structure of Interest Rates --- p.73
Measuring Position Risk --- p.75
The Simplified Portfolio Approach --- p.77
The Comprehensive Approach --- p.81
The Building-Block Approach --- p.83
Chapter XI. --- ITEMS INVOLVED IN RISK MANAGEMENT --- p.85
Management Control --- p.85
Constructing Valuation Methodology --- p.90
Contents of Reporting --- p.92
Evaluation of Risk --- p.93
Counterparty Relationships --- p.93
Chapter XII. --- AFTERTHOUGHT --- p.95
APPENDIX --- p.98
BIBLIOGRAPHY --- p.108
Style APA, Harvard, Vancouver, ISO itp.
49

"Robust approach to risk management and statistical analysis". 2012. http://library.cuhk.edu.hk/record=b5549601.

Pełny tekst źródła
Streszczenie:
In this thesis we study some structural results in polynomial optimization, with an emphasis on applications to risk management problems and estimation in statistical analysis. The key underlying tool is the so-called S-lemma from control theory and robust optimization. The original S-lemma, developed by Yakubovich, states an equivalent condition for a quadratic polynomial to be non-negative over the non-negative domain of other quadratic polynomial(s). In this thesis, we extend the S-lemma to univariate polynomials of any degree. Since robust optimization has a strong connection to the S-lemma, our results lead to many applications in risk management and statistical analysis, including estimating certain nonlinear risk measures under moment bound constraints and an SDP formulation for simultaneous confidence bands. Numerical experiments are conducted and presented to illustrate the effectiveness of the methods.
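For reference, the classical quadratic S-lemma that the thesis generalizes can be stated as follows (with the usual strict-feasibility condition that g(x̄) > 0 for some x̄):

```latex
% Classical S-lemma (Yakubovich): for quadratic f, g : R^n -> R,
\Big( g(x) \ge 0 \;\Rightarrow\; f(x) \ge 0 \Big)
\quad\Longleftrightarrow\quad
\exists\, \lambda \ge 0 :\ f(x) - \lambda g(x) \ge 0 \quad \forall x \in \mathbb{R}^n.
```

The right-hand condition is a linear matrix inequality in the coefficients of f and g, which is what ties the lemma to SDP and, through it, to the robust formulations used in the thesis.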
Wong, Man Hong.
Thesis (Ph.D.)--Chinese University of Hong Kong, 2012.
Includes bibliographical references (leaves 134-147).
Abstract also in Chinese.
Abstract --- p.i
Abstract (in Chinese) --- p.ii
Acknowledgement --- p.iii
Chapter 1 --- Introduction --- p.1
Chapter 2 --- Meeting the S-Lemma --- p.5
Chapter 3 --- A strongly robust formulation --- p.13
Chapter 3.1 --- A more practical extension for robust optimization --- p.13
Chapter 3.1.1 --- Motivation from modeling aspect --- p.13
Chapter 3.1.2 --- Discussion of a more robust condition --- p.15
Chapter 4 --- Theoretical developments --- p.19
Chapter 4.1 --- Definition of several order relations --- p.19
Chapter 4.2 --- S-Lemma with a single condition g(x)≥0 --- p.20
Chapter 5 --- Confidence bands in polynomial regression --- p.47
Chapter 5.1 --- An introduction --- p.47
Chapter 5.1.1 --- A review on robust optimization, nonnegative polynomials and SDP --- p.49
Chapter 5.1.2 --- A review on the confidence bands --- p.50
Chapter 5.1.3 --- Our contribution --- p.51
Chapter 5.2 --- Some preliminaries on optimization --- p.52
Chapter 5.2.1 --- Robust optimization --- p.52
Chapter 5.2.2 --- Semidefinite programming and LMIs --- p.53
Chapter 5.2.3 --- Nonnegative polynomials with SDP --- p.55
Chapter 5.3 --- Some preliminaries on linear regression and confidence region --- p.59
Chapter 5.4 --- Optimization approach to the confidence bands construction --- p.63
Chapter 5.5 --- Numerical experiments --- p.66
Chapter 5.5.1 --- Linear regression example --- p.66
Chapter 5.5.2 --- Polynomial regression example --- p.67
Chapter 5.6 --- Conclusion --- p.70
Chapter 6 --- Moment bound of nonlinear risk measures --- p.72
Chapter 6.1 --- Introduction --- p.72
Chapter 6.1.1 --- Motivation --- p.72
Chapter 6.1.2 --- Robustness and moment bounds --- p.74
Chapter 6.1.3 --- Literature review in general --- p.76
Chapter 6.1.4 --- More literature review in actuarial science --- p.78
Chapter 6.1.5 --- Our contribution --- p.79
Chapter 6.2 --- Methodological fundamentals behind the moment bounds --- p.81
Chapter 6.2.1 --- Dual formulations, duality and tight bounds --- p.82
Chapter 6.2.2 --- SDP and LMIs for some dual problems --- p.84
Chapter 6.3 --- Worst expectation and worst risk measures on annuity payments --- p.87
Chapter 6.3.1 --- The worst mortgage payments --- p.88
Chapter 6.3.2 --- The worst probability of repayment failure --- p.89
Chapter 6.3.3 --- The worst expected downside risk of exceeding the threshold --- p.90
Chapter 6.4 --- Numerical examples for risk management --- p.94
Chapter 6.4.1 --- A mortgage example --- p.94
Chapter 6.4.2 --- An annuity example --- p.97
Chapter 6.5 --- Conclusion --- p.100
Chapter 7 --- Computing distributional robust probability functions --- p.101
Chapter 7.1 --- Distributional robust function with a single random variable --- p.105
Chapter 7.2 --- Moment bound of joint probability --- p.108
Chapter 7.2.1 --- Constraint (7.5) in LMIs --- p.112
Chapter 7.2.2 --- Constraint (7.6) in LMIs --- p.112
Chapter 7.2.3 --- Constraint (7.7) in LMIs --- p.116
Chapter 7.3 --- Several model extensions --- p.119
Chapter 7.3.1 --- Moment bound of probability of union events --- p.119
Chapter 7.3.2 --- The variety of domain of x --- p.120
Chapter 7.3.3 --- Higher moments incorporated --- p.123
Chapter 7.4 --- Applications of the moment bound --- p.124
Chapter 7.4.1 --- The Riemann integrable set approximation --- p.124
Chapter 7.4.2 --- Worst-case simultaneous VaR --- p.124
Chapter 7.5 --- Conclusion --- p.126
Chapter 8 --- Concluding Remarks and Future Directions --- p.127
Chapter A --- Nonnegative univariate polynomials --- p.129
Chapter B --- First and second moment of (7.2) --- p.131
Bibliography --- p.134
Style APA, Harvard, Vancouver, ISO itp.
50

Schwartz, Carmit M. Economics Australian School of Business UNSW. "Individuals' responses to changes in risk: a person-specific analysis". 2007. http://handle.unsw.edu.au/1959.4/40575.

Pełny tekst źródła
Streszczenie:
In this thesis we consider two comparative statics questions concerning changes in risk. The first question concerns situations where an individual faces some risk and has no control over the uncertain environment. In these situations we ask what kind of changes in risk will cause the individual's expected utility to increase. The second comparative statics question concerns situations where an individual faces some risk and has some control over the uncertain environment. In particular, we consider situations where the individual maximizes her expected utility with respect to some control parameter. Here we ask what kind of changes in risk will cause the individual's optimal value of the control parameter to increase. The existing literature has answered these questions for a class of individuals (for example, the class of risk-averse individuals). This thesis differs from the existing literature in that it focuses on a given individual, and thus reveals some of the person-specific factors that affect the individual's responses to changes in risk. The aim of the thesis is to show how an order on distributions, termed the single crossing likelihood ratio (SCLR) order, can intuitively answer both questions for a given individual. The main contributions of the thesis are as follows. First, the thesis presents the SCLR order and its main properties. Second, the thesis shows that the SCLR order can answer the above comparative statics questions in an intuitive way. In particular, it shows that the answer to the above questions, with the use of the SCLR order, depends on a risk reference point which can be interpreted as a "certainty equivalent" point. Thus it is demonstrated that an individual's responses to changes in risk are affected by her "certainty equivalent" point. Lastly, the results of the thesis can be used to provide an intuitive explanation of related existing results that were obtained for a class of individuals.
Style APA, Harvard, Vancouver, ISO itp.