Dissertations / Theses on the topic 'Risk – Mathematical models'




Consult the top 50 dissertations / theses for your research on the topic 'Risk – Mathematical models.'


You can also download the full text of each publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Capelli, Giacomo <1991>. "Mathematical Models for Operational Risk Management." Master's Degree Thesis, Università Ca' Foscari Venezia, 2016. http://hdl.handle.net/10579/8599.

Full text
Abstract:
The thesis presents the state-of-the-art mathematical and statistical models for Operational Risk measurement and management, as well as the organisational and managerial processes supporting the creation of OR risk measures. We emphasise the most interesting probabilistic ideas employed in the field and apply the models to real data; these are used in the actuarial modelling paradigm following a Loss Distribution Approach. Extreme Value Theory, convolution transforms and copulæ theory will all be part of OR analysis to arrive at a regulatory risk measure. The thesis accompanies the theoretical modelling part with the management processes that are needed to implement these models in financial institutions, along with the primary risk mitigation techniques used against OR events. Finally, in light of the recent consultations carried out by the Bank for International Settlements regarding the partial substitution of actuarial models, we will also reflect on the differences, commonalities, and limitations of present and future OR modelling. The thesis is organised as follows. Chapter I introduces the major risk types institutions face. Chapter II describes the non-actuarial risk measurement techniques used by banks without internally developed actuarial models. The central mathematics of the thesis is presented and applied in Chapter III. Chapter IV compares the present modelling techniques with recently proposed modifications. Chapter V treats the managerial aspects of OR and Chapter VI studies the risk mitigation tools usually adopted. We conclude in Chapter VII with reflections and final comments.
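A minimal, illustrative sketch of the Loss Distribution Approach mentioned in this abstract, assuming a Poisson claim frequency and lognormal severities with hypothetical parameters (a generic Monte Carlo illustration, not the code or calibration used in the thesis):

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Hypothetical LDA parameters: annual loss frequency ~ Poisson(lam),
# individual loss severity ~ Lognormal(mu, sigma).
lam, mu, sigma = 25.0, 10.0, 2.0
n_years = 100_000

annual_losses = np.empty(n_years)
for i in range(n_years):
    n = rng.poisson(lam)                      # number of loss events in the year
    severities = rng.lognormal(mu, sigma, n)  # individual loss amounts
    annual_losses[i] = severities.sum()       # aggregate (compound) annual loss

var_999 = np.quantile(annual_losses, 0.999)   # 99.9% quantile, a Basel-style OR measure
print(f"Expected annual loss: {annual_losses.mean():,.0f}")
print(f"99.9% aggregate loss quantile: {var_999:,.0f}")
```

Extreme Value Theory for the severity tail and copulas for dependence across risk cells, which the thesis also covers, would sit on top of this basic compound-loss simulation.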
2

Ngwenza, Dumisani. "Quantifying Model Risk in Option Pricing and Value-at-Risk Models." Master's thesis, Faculty of Commerce, 2019. http://hdl.handle.net/11427/31059.

Full text
Abstract:
Financial practitioners use models in order to price, hedge and measure risk. These models are reliant on assumptions and are prone to "model risk". Increased innovation in complex financial products has led to increased risk exposure and has spurred research into understanding model risk and its underlying factors. This dissertation quantifies the model risk inherent in Value-at-Risk (VaR) for a variety of portfolios comprised of European options written on the ALSI futures index across various maturities. The European options under consideration are modelled using the Black-Scholes, Heston and Variance-Gamma models.
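As a rough illustration of what "model risk in a VaR figure" can mean in practice, the sketch below simply contrasts the 99% VaR of the same hypothetical position under two candidate return distributions and takes the spread of the estimates as a crude model-risk proxy; it is not the dissertation's methodology, which works with option portfolios under the Black-Scholes, Heston and Variance-Gamma models.

```python
import numpy as np

rng = np.random.default_rng(seed=7)
position_value = 1_000_000   # hypothetical portfolio value
n_sims, alpha = 200_000, 0.99

# Candidate model A: normally distributed daily returns
ret_normal = rng.normal(0.0, 0.015, n_sims)
# Candidate model B: fat-tailed Student-t daily returns, rescaled to similar volatility
ret_t = 0.015 * rng.standard_t(df=4, size=n_sims) / np.sqrt(4 / 2)

def var(returns, level):
    """Value-at-Risk as the loss quantile (reported as a positive number)."""
    return -np.quantile(position_value * returns, 1 - level)

vars_ = {"normal": var(ret_normal, alpha), "student-t": var(ret_t, alpha)}
model_risk = max(vars_.values()) - min(vars_.values())
print(vars_, "spread (crude model-risk proxy):", round(model_risk))
```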
3

Siu, Kin-bong Bonny, and 蕭健邦. "Expected shortfall and value-at-risk under a model with market risk and credit risk." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2006. http://hub.hku.hk/bib/B37727473.

Full text
4

Gu, Jiawen, and 古嘉雯. "On credit risk modeling and credit derivatives pricing." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2014. http://hdl.handle.net/10722/202367.

Full text
Abstract:
In this thesis, efforts are devoted to the stochastic modeling, measurement and evaluation of credit risks, the development of mathematical and statistical tools to estimate and predict these risks, and methods for solving the significant computational problems arising in this context. Reduced-form intensity-based credit risk models are studied. A new type of reduced-form intensity-based model is introduced, which can incorporate the impacts of both observable trigger events and the economic environment on corporate defaults. The key idea of the model is to augment a Cox process with trigger events. In addition, this thesis focuses on the relationship between the structural firm value model and the reduced-form intensity-based model. A continuous-time structural asset value model for the asset value of two correlated firms with a two-dimensional Brownian motion is studied. With incomplete information introduced, the information set available to the market participants includes the default time of each firm and the periodic asset value reports. The original structural model is first transformed into a reduced-form model. Then the conditional distribution of the default time as well as the asset value of each name are derived. The existence of the intensity processes of default times is proven and an explicit form of the intensity processes is given in this thesis. Discrete-time Markovian models in credit crisis are considered. Markovian models are proposed to capture the default correlation in a multi-sector economy. The main idea is to describe the infection (defaults) in various sectors by using an epidemic model. Green's model, an epidemic model, is applied to characterize the infectious effect in each sector, and dependence structures among various sectors are also proposed. The models are then applied to the computation of Crisis Value-at-Risk (CVaR) and Crisis Expected Shortfall (CES). The relationship between correlated defaults of different industrial sectors and business cycles, as well as the impacts of business cycles on modeling and predicting correlated defaults, is investigated using the Probabilistic Boolean Network (PBN). The idea is to model the credit default process by a PBN, and the network structure can be inferred by using Markov chain theory and real-world data. A reduced-form model for economic and recorded default times is proposed and the probability distributions of these two default times are derived. The numerical study on the difference between these two shows that our proposed model can both capture the features and fit the empirical data. A simple and efficient method, based on the ordered default rate, is derived to compute the ordered default time distributions in both the homogeneous case and the two-group heterogeneous case under the interacting intensity default contagion model. Analytical expressions for the ordered default time distributions with recursive formulas for the coefficients are given, which makes the calculation fast and efficient in finding rates of basket CDSs.
published_or_final_version
Mathematics
Doctoral
Doctor of Philosophy
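For orientation, the basic reduced-form (intensity-based) relation underlying models of this kind, stated in its generic textbook form rather than in the thesis's specific construction, is

```latex
P(\tau > t) \;=\; \mathbb{E}\!\left[\exp\!\left(-\int_0^t \lambda_s \,\mathrm{d}s\right)\right],
```

where $\tau$ is the default time and $\lambda_s$ the default intensity; the thesis augments such a Cox process with observable trigger events and economic-environment effects.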
5

Gong, Qi, and 龔綺. "Gerber-Shiu function in threshold insurance risk models." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2008. http://hub.hku.hk/bib/B40987966.

Full text
6

Liu, Binbin, and 刘彬彬. "Some topics in risk theory and optimal capital allocation problems." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2012. http://hub.hku.hk/bib/B48199291.

Full text
Abstract:
In recent years, the Markov Regime-Switching model and the class of Archimedean copulas have been widely applied to a variety of finance-related fields. The Markov Regime-Switching model can reflect the reality that the underlying economy is changing over time. Archimedean copulas are one of the most popular classes of copulas because they have closed form expressions and have great flexibility in modeling different kinds of dependencies. In the thesis, we first consider a discrete-time risk process based on the compound binomial model with regime-switching. Some general recursive formulas of the expected penalty function have been obtained. The orderings of ruin probabilities are investigated. In particular, we show that if there exists a stochastic dominance relationship between random claims at different regimes, then we can order ruin probabilities under different initial regimes. Regarding capital allocation problems, which are important areas in finance and risk management, this thesis studies the problems of optimal allocation of policy limits and deductibles when the dependence structure among risks is modeled by an Archimedean copula. By employing the concept of arrangement increasing and stochastic dominance, useful qualitative results of the optimal allocations are obtained. Then we turn our attention to a new family of risk measures satisfying a set of proposed axioms, which includes the class of distortion risk measures with concave distortion functions. By minimizing the new risk measures, we consider the optimal allocation of policy limits and deductibles problems based on the assumption that for each risk there exists an indicator random variable which determines whether the risk occurs or not. Several sufficient conditions to order the optimal allocations are obtained using tools in stochastic dominance theory.
published_or_final_version
Statistics and Actuarial Science
Doctoral
Doctor of Philosophy
7

蕭德權 and Tak-kuen Siu. "Risk measures in finance and insurance." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2001. http://hub.hku.hk/bib/B31242297.

Full text
8

Rong, Yian, and 戎軼安. "Applications of comonotonicity in risk-sharing and optimal allocation." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2014. http://hdl.handle.net/10722/207205.

Full text
Abstract:
Over the past decades, researchers in economics, financial mathematics and actuarial science have introduced results related to the concept of comonotonicity in their respective fields of interest. Comonotonicity is a very strong dependence structure and is very often mistaken as a dependence structure that is too extreme and unrealistic. However, the concept of comonotonicity is actually a useful tool for solving several research and practical problems in capital allocation, risk sharing and optimal allocation. The first topic of this thesis is focused on the application of comonotonicity in optimal capital allocation. The Enterprise Risk Management process of a financial institution usually contains a procedure to allocate the total risk capital of the company into its different business units. Dhaene et al. (2012) proposed a unifying capital allocation framework by considering some general deviation measures. This general framework is extended to a more general optimization problem of minimizing a separable convex function with a linear constraint and box constraints. A new approach of solving this constrained minimization problem explicitly by the concept of comonotonicity is developed. Instead of the traditional Kuhn-Tucker theory, a method of expressing each convex function as the expected stop-loss of some suitable random variable is used to solve the optimization problem. Then, some results in convex analysis with infimum-convolution are derived using the result of this new approach. Next, Borch's theorem is revisited from the perspective of comonotonicity. The optimal solution to the Pareto optimal risk-sharing problem can be obtained by the Lagrangian method or variational arguments. Here, I propose a new method, which is based on a Breeden-Litzenberger type integral representation formula for increasing convex functions. It enables the transformation of the objective function into a sum of mixtures of stop-losses. Necessary conditions for the existence of an optimal solution are then discussed. The explicit solution obtained allows us to show that the risk-sharing problem is indeed a “point-wise” problem, and hence the value function can be obtained immediately using the notion of supremum-convolution in convex analysis. In addition to the above classical risk-sharing and capital allocation problems, the problem of minimizing a separable convex objective subject to an ordering restriction is then studied. Best et al. (2000) proposed a pool adjacent violators algorithm to compute the optimal solution. Instead, we show that using the concept of comonotonicity and the technique of dynamic programming the solution can be derived in a recursive manner. By identifying the right-hand derivative of the convex functions with distribution functions of some suitable random variables, we rewrite the objective function into a sum of expected deviations. This transformation and the fact that the expected deviation is a convex function enable us to solve the minimizing problem.
published_or_final_version
Statistics and Actuarial Science
Doctoral
Doctor of Philosophy
9

Basak, Rishi. "Environmental management systems and the intra-firm risk relationship." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape3/PQDD_0034/MQ64316.pdf.

Full text
10

Li, Tang, and 李唐. "Markov chain models for re-manufacturing systems and credit risk management." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2008. http://hub.hku.hk/bib/B40203700.

Full text
11

Veraart, Luitgard Anna Maria. "Mathematical models for market making, option pricing and systemic risk." Thesis, University of Cambridge, 2007. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.613365.

Full text
12

Hao, Fangcheng, and 郝方程. "Options pricing and risk measures under regime-switching models." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2011. http://hub.hku.hk/bib/B4714726X.

Full text
13

Wei, Zhenghong. "Empirical likelihood based evaluation for value at risk models." HKBU Institutional Repository, 2007. http://repository.hkbu.edu.hk/etd_ra/896.

Full text
14

Powell, Robert. "Industry value at risk in Australia." Thesis, Edith Cowan University, Research Online, Perth, Western Australia, 2007. https://ro.ecu.edu.au/theses/297.

Full text
Abstract:
Value at Risk (VaR) models have gained increasing momentum in recent years. Market VaR is an important issue for banks since its adoption as a primary risk metric in the Basel Accords and the requirement that it is calculated on a daily basis. Credit risk modelling has become increasingly important to banks since the advent of Basel II, which allows banks with sophisticated modelling techniques to use internal models for the purpose of calculating capital requirements. A high level of credit risk is often the key reason behind banks failing or experiencing severe difficulty. Conditional Value at Risk (CVaR) measures extreme risk, and is gaining popularity with the recognition that high losses are often impacted by a small number of extreme events.
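To make the VaR/CVaR distinction in this abstract concrete, here is a minimal generic sketch; the simulated returns and the 95% level are purely illustrative and unrelated to the thesis data.

```python
import numpy as np

rng = np.random.default_rng(seed=0)
returns = rng.standard_t(df=5, size=5000) * 0.01   # stand-in for observed daily returns

level = 0.95
var_95 = -np.quantile(returns, 1 - level)          # 95% Value at Risk (loss, positive)
tail = returns[returns <= -var_95]                 # days at least as bad as the VaR
cvar_95 = -tail.mean()                             # CVaR / expected shortfall

print(f"95% VaR : {var_95:.4f}")
print(f"95% CVaR: {cvar_95:.4f}  (average loss in the worst 5% of days)")
```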
15

Terciyanli, Erman. "Alternative Mathematical Models For Revenue Management Problems." Master's thesis, METU, 2009. http://etd.lib.metu.edu.tr/upload/12610711/index.pdf.

Full text
Abstract:
In this study, the seat inventory control problem is considered for airline networks from the perspective of a risk-averse decision maker. In the revenue management literature, it is generally assumed that decision makers are risk-neutral; therefore, the expected revenue is maximized without taking the variability or any other risk factor into account. A risk-sensitive approach, on the other hand, provides more information about the behavior of the revenue. The risk measure we consider in this study is the probability that revenue is less than a predetermined threshold level. In the risk-neutral case, while the expected revenue is maximized, the probability of revenue being less than such a predetermined level might be high. We propose three mathematical models to incorporate the risk measure under consideration. The optimal allocations obtained by these models are numerically evaluated in simulation studies for example problems. Expected revenue, coefficient of variation, load factor and probability of poor performance are the performance measures in the simulation studies. According to the results of these simulations, it is shown that the proposed models can decrease the variability of the revenue considerably; in other words, the probability of revenue being less than the threshold level is decreased. Moreover, expected revenue can be increased in some scenarios by using the proposed models. The approach considered in this thesis is proposed especially for small-scale airlines, because the risk of obtaining revenue below the threshold level is greater for such airlines than for large-scale airlines.
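The risk measure used in the study, the probability that revenue falls below a predetermined threshold, is straightforward to estimate by simulation. The toy sketch below uses an entirely hypothetical single-leg, two-fare-class setting (invented fares, protection level and Poisson demands) simply to show how expected revenue and the shortfall probability are computed side by side; it does not reproduce the thesis's network models.

```python
import numpy as np

rng = np.random.default_rng(seed=3)

capacity, protection_high = 100, 30        # seats protected for the high fare
fare_low, fare_high = 100.0, 250.0         # hypothetical fares
threshold = 14_000.0                       # revenue threshold of the risk measure
n_sims = 50_000

revenues = np.empty(n_sims)
for i in range(n_sims):
    d_low = rng.poisson(90)                # low-fare demand (assumed to arrive first)
    d_high = rng.poisson(35)               # high-fare demand
    sold_low = min(d_low, capacity - protection_high)
    sold_high = min(d_high, capacity - sold_low)
    revenues[i] = sold_low * fare_low + sold_high * fare_high

print("Expected revenue:", round(revenues.mean(), 1))
print("P(revenue < threshold):", round((revenues < threshold).mean(), 4))
```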
16

Li, Xiaofei 1972. "Three essays on the pricing of fixed income securities with credit risk." Thesis, McGill University, 2004. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=84523.

Full text
Abstract:
This thesis studies the impacts of credit risk, or the risk of default, on the pricing of fixed income securities. It consists of three essays. The first essay extends the classical corporate debt pricing model in Merton (1974) to incorporate stochastic volatility (SV) in the underlying firm asset value and derive a closed-form solution for the price of corporate bond. Simulation results show that the SV specification for firm asset value greatly increases the resulting credit spread levels. Therefore, the SV model addresses one major deficiency of the Merton-type models: namely, at short maturities the Merton model is unable to generate credit spreads high enough to be compatible with those observed in the market. In the second essay, we develop a two-factor affine model for the credit spreads on corporate bonds. The first factor can be interpreted as the level of the spread, and the second factor is the volatility of the spread. Our empirical results show that the model is successful at fitting actual corporate bond credit spreads. In addition, key properties of actual credit spreads are better captured by the model. Finally, the third essay proposes a model of interest rate swap spreads. The model accommodates both the default risk inherent in swap contracts and the liquidity difference between the swap and Treasury markets. The default risk and liquidity components of swap spreads are found to behave very differently: first, the default risk component is positively related to the riskless interest rate, whereas the liquidity component is negatively correlated with the riskless interest rate; second, although default risk accounts for the largest share of the levels of swap spreads, the liquidity component is much more volatile; and finally, while the default risk component has been historically positive, the liquidity component was negative for much of the 1990s and has become positive since the financial market turmoil in 1998.
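For context, the constant-volatility Merton (1974) benchmark that the first essay extends can be sketched in a few lines: equity is a European call on firm assets, risky debt is assets minus equity, and the credit spread follows from the implied debt yield. The inputs below are hypothetical and the code is a generic textbook sketch, not the stochastic-volatility model developed in the essay.

```python
import numpy as np
from scipy.stats import norm

def merton_credit_spread(V, F, r, sigma, T):
    """Credit spread (continuous compounding) on zero-coupon debt in the Merton model.
    V: firm asset value, F: debt face value, r: risk-free rate,
    sigma: asset volatility, T: debt maturity in years."""
    d1 = (np.log(V / F) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    equity = V * norm.cdf(d1) - F * np.exp(-r * T) * norm.cdf(d2)  # call on assets
    debt = V - equity                                              # risky debt value
    debt_yield = -np.log(debt / F) / T                             # implied debt yield
    return debt_yield - r

# Hypothetical firm: assets 120, debt face 100, one-year maturity
print(merton_credit_spread(V=120.0, F=100.0, r=0.03, sigma=0.25, T=1.0))
```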
17

Lin, Erlu, and 林尔路. "Analysis of dividend payments for insurance risk models with correlated aggregate claims." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2008. http://hub.hku.hk/bib/B40203992.

Full text
18

Liu, Luyin, and 劉綠茵. "Analysis of some risk processes in ruin theory." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2013. http://hdl.handle.net/10722/195992.

Full text
Abstract:
In the literature of ruin theory, there have been extensive studies trying to generalize the classical insurance risk model. In this thesis, we look into two particular risk processes, considering multi-dimensional risk and dependence structures respectively. The first one is a bivariate risk process with a dividend barrier, which concerns a two-dimensional risk model under a barrier strategy. A copula is used to represent the dependence between two business lines when a common shock strikes. By defining the time of ruin to be the first time that either of the two lines has its surplus level below zero, we derive a discrete approximation procedure to calculate the expected discounted dividends until ruin under such a model. A thorough discussion of applications in proportional reinsurance with numerical examples is provided, as well as an examination of the joint optimal dividend barrier for the bivariate process. The second risk process is a semi-Markovian dual risk process. Assuming that the dependence among innovations and waiting times is driven by a Markov chain, we analyze a quantity resembling the Gerber-Shiu expected discounted penalty function that incorporates random variables defined before and after the time of ruin, such as the minimum surplus level before ruin and the time of the first gain after ruin. General properties of the function are studied, and some exact results are derived upon distributional assumptions on either the inter-arrival times or the gain amounts. Applications to a perpetual insurance and the last inter-arrival time before ruin are given along with some numerical examples.
published_or_final_version
Statistics and Actuarial Science
Master
Master of Philosophy
19

Zhu, Jinxia, and 朱金霞. "Ruin theory under Markovian regime-switching risk models." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2008. http://hub.hku.hk/bib/B40203980.

Full text
20

McTaggart, Kevin Andrew. "Hydrodynamics and risk analysis of iceberg impacts with offshore structures." Thesis, University of British Columbia, 1989. http://hdl.handle.net/2429/30733.

Full text
Abstract:
The evaluation of design iceberg impact loads for offshore structures and the influence of hydrodynamic effects on impact loads are examined. Important hydrodynamic effects include iceberg added mass, wave-induced oscillatory iceberg motions, and the influence of a large structure on the surrounding flow field and subsequent velocities of approaching icebergs. The significance of these phenomena has been investigated using a two-body numerical diffraction model and through a series of experiments modelling the drift of various sized icebergs driven by waves and currents approaching a large offshore structure. Relevant findings from the hydrodynamic studies have been incorporated into two probabilistic models which can be used to determine design iceberg collision events with a structure based on either iceberg kinetic energy upon impact or global sliding force acting on the structure. Load exceedance probabilities from the kinetic energy and sliding force models are evaluated using the second-order reliability method. Output from the probabilistic models can be used to determine design collision parameters and to assess whether more sophisticated modelling of various impact processes is required. The influence of the structure on velocities of approaching icebergs is shown to be significant when the structure horizontal dimension is greater than twice the iceberg dimension. As expected, wave-induced oscillatory motions dominate the collision velocity for smaller icebergs but have a negligible effect on velocity for larger icebergs.
Applied Science, Faculty of
Civil Engineering, Department of
Graduate
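As a point of reference for the kinetic-energy collision criterion mentioned above, the impact energy of a drifting iceberg is commonly written with an added-mass correction for the entrained water (a standard hydrodynamic relation given for orientation, not a formula quoted from the thesis):

```latex
E \;=\; \tfrac{1}{2}\,(1 + C_a)\,m\,v^2 ,
```

where $m$ is the iceberg mass, $v$ its velocity at impact and $C_a$ the added-mass coefficient.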
21

Chen, Yiqing, and 陳宜清. "Study on insurance risk models with subexponential tails and dependence structures." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2009. http://hub.hku.hk/bib/B42841768.

Full text
22

Li, Yuming. "Univariate and multivariate measures of risk aversion and risk premiums with joint normal distribution and applications in portfolio selection models." Thesis, University of British Columbia, 1987. http://hdl.handle.net/2429/26110.

Full text
Abstract:
This thesis gives formal derivations of the so-called Rubinstein measures of risk aversion and their multivariate generalizations. The applications of these measures in portfolio selection models are also presented. Assuming that a decision maker's preferences can be represented by a unidimensional von Neumann and Morgenstern utility function, we consider a model with an uninsurable initial random wealth and an insurable risk. Under the assumption that the two random variables have a bivariate normal distribution, the second-order covariance operator is developed from the Stein/Rubinstein first-order covariance operator and is used to derive Rubinstein's measures of risk aversion from the approximations of risk premiums. Rubinstein's measures of risk aversion are proved to be the appropriate generalizations of the Arrow-Pratt measures of risk aversion. In a portfolio selection model with two risky investments having a bivariate normal distribution, we show that Rubinstein's measures of risk aversion can yield the desirable characterizations of risk aversion and wealth effects on the optimal portfolio. These properties of Rubinstein's measures of risk aversion are analogous to those of the Arrow-Pratt measures of risk aversion in the portfolio selection model with one riskless and one risky investment. In multi-dimensional decision problems, we assume that a decision maker's preferences can be represented by a multivariate utility function. From the model with an uninsurable initial wealth vector and an insurable risk vector having a joint normal distribution in the wealth space, we derive the matrix measures of risk aversion, which are the multivariate extension of Rubinstein's measures of risk aversion. The derivations are based on the multivariate version of the Stein/Rubinstein covariance operator developed by Gassmann and its second-order generalization to be developed in this thesis. We finally present an application of the matrix measures of risk aversion in a portfolio selection model with a multivariate utility function and two risky investments. In this model, if we assume that the random returns on the two investments and other random variables have a joint normal distribution, the optimal portfolio can be characterized by the matrix measures of risk aversion.
Business, Sauder School of
Graduate
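For readers wanting the baseline definitions behind this abstract, the Arrow-Pratt measures of absolute and relative risk aversion, and the Rubinstein-type generalisation to a random initial wealth $\tilde{w}$, are usually written as follows (standard textbook forms; the thesis derives these measures and their matrix-valued multivariate extensions formally):

```latex
A(w) = -\frac{u''(w)}{u'(w)}, \qquad
R(w) = -\frac{w\,u''(w)}{u'(w)}, \qquad
\rho(\tilde{w}) = -\frac{\mathbb{E}[u''(\tilde{w})]}{\mathbb{E}[u'(\tilde{w})]} .
```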
23

Lee, Boram. "Risk perceptions and financial decisions of individual investors." Thesis, University of Stirling, 2013. http://hdl.handle.net/1893/16951.

Full text
Abstract:
Standard finance theory portrays investors as rational utility maximisers. Persisting market anomalies and observed investor practice, however, have led to widespread recognition that the fundamental axioms of rationality are often violated. In response to the limitations inherent in standard theory, the Behavioural Finance approach relaxes the rationality assumption and takes account of psychological influences on individuals’ decision-making processes. Adopting the behavioural approach, this thesis, which includes two empirical studies, examines why, and to what extent, investors depart from rational or optimal investment practices. The thesis examines the effect of Myopic Loss Aversion (MLA) suggested by Benartzi and Thaler (1995) as a response to the Equity Premium Puzzle highlighted by Mehra and Prescott (1985). While previous studies are almost exclusively based on experiments in a laboratory setting, this approach provides more compelling empirical evidence by investigating the effects of MLA on real individual investors’ portfolio allocations through the use of the Dutch National Bank Household Survey. For the first time, the concept of MLA is identified through the interaction of two separate effects, firstly, individuals’ myopia, reflected in portfolio evaluation and rebalancing frequencies, and secondly, loss aversion. The thesis finds that individuals who are less affected by MLA invest more in risky financial assets. Further, individuals who are less myopic increase their share of risky assets invested in their financial portfolios over time, although this is unrelated to their loss aversion. These findings support the prediction of MLA theory that short investment horizons and high loss aversion lead to a significantly lower share of risky investments. In summary, the high equity premium can be explained by the notion of MLA. If individuals evaluate their investment performance over the long-term, they perceive much smaller risks relative to stockholding returns; consequently, they will be prepared to accept smaller equity premiums. The findings suggest possible interventions by policy makers and investment advisors to encourage individuals to remain in the stock market, such as providing long-term investment instruments, or restricting evaluation frequency to the annual reporting of investment performance. In response to the stockholding puzzle (Haliassos and Bertaut, 1995), this thesis also investigates individuals’ stock market returns expectations and their varying levels of risk aversion. Previous studies find that individuals’ heterogeneous stock market expectations determine variations in their stockholdings. The thesis accounts for the effect of risk aversion on stock market expectations, as well as on stockholding decisions. Additionally, the causality issue as between individuals’ expectations and stockholding status is controlled. The thesis finds that more risk averse individuals hold lower stock market expectations, and that the stock market return expectations of more risk averse individuals affect their stock market participation decisions negatively. The portfolio allocation decisions of individuals who already hold stocks are only affected by their expectations, with risk aversion being no longer significant. The thesis argues that persistent risk aversion effects cause individuals to hold pessimistic views of stock market returns, thus contributing to the enduring stockholding puzzle. 
The thesis reinforces existing perceptions that individuals in the real world may not make fully rational decisions due to their judgments which are based on heuristics and affected by cognitive biases. Individual investors often fail to maximise their utility given their preferences and constraints. Consequently, this thesis draws attention to the possible role of institutions, policy makers, and financial advisory bodies in providing effective interventions and guidelines to improve individuals’ financial decisions.
24

Kwan, Kwok-man, and 關國文. "Ruin theory under a threshold insurance risk model." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2007. http://hub.hku.hk/bib/B38320034.

Full text
25

Sagi, Jacob S. "Partial ordering of risky choices : anchoring, preference for flexibility and applications to asset pricing." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2000. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape3/PQDD_0019/NQ56611.pdf.

Full text
26

Wong, Tsun-yu Jeff, and 黃峻儒. "On some Parisian problems in ruin theory." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2014. http://hdl.handle.net/10722/206448.

Full text
Abstract:
Traditionally, in the context of ruin theory, most judgements are made in an immediate sense. An example would be the determination of ruin, in which a business is declared broke right away when it attains a negative surplus. Another example would be the decision on dividend payment, in which a business pays dividends whenever the surplus level overshoots a certain threshold. Such a scheme of decision making is generally criticized as unrealistic from a practical point of view. The Parisian concept is therefore invoked to handle this issue. This idea is deemed more realistic since it allows a certain delay in the execution of decisions. In this thesis, the Parisian concept is utilized in two different aspects. The first one is to incorporate this concept in defining ruin, leading to the introduction of the Parisian ruin time. Under such a setting, a business is considered ruined only when the surplus level stays negative continuously for a prescribed length of time. The case of a fixed delay is considered. Both the renewal risk model and the dual renewal risk model are studied. Under a mild distributional assumption that either the inter-arrival time or the claim size is exponentially distributed (while keeping the other arbitrary), the Laplace transform of the Parisian ruin time is derived. A numerical example is performed to confirm the reasonableness of the results. The methodology for obtaining the Laplace transform of the Parisian ruin time is also demonstrated to be useful in deriving the joint distribution of the number of negative surplus episodes causing or not causing Parisian ruin. The second contribution is to incorporate this concept in the decision for dividend payment. Specifically, a business only pays lump-sum dividends when the surplus level stays above a certain threshold continuously for a prescribed length of time. The cases of a fixed delay and an Erlang(n) delay are considered. The dual compound Poisson risk model is studied. The Laplace transform of the ordinary ruin time is derived. Numerical examples are performed to illustrate the results.
published_or_final_version
Statistics and Actuarial Science
Master
Master of Philosophy
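The Parisian ruin criterion is easy to state algorithmically: ruin is declared only once the surplus has stayed strictly negative for longer than a fixed delay d. The Monte Carlo sketch below applies that definition to a crudely discretised compound Poisson surplus process with hypothetical parameters; it only illustrates the definition and is not the Laplace-transform analysis carried out in the thesis.

```python
import numpy as np

rng = np.random.default_rng(seed=11)

def parisian_ruin(u, c, lam, claim_mean, delay, horizon, dt=0.02):
    """True if the surplus stays negative for longer than `delay` before `horizon`.
    Surplus: U(t) = u + c*t minus exponential claims arriving at Poisson rate lam."""
    t, surplus, below_since = 0.0, u, None
    while t < horizon:
        t += dt
        surplus += c * dt
        if rng.random() < lam * dt:                 # a claim arrives in (t, t+dt]
            surplus -= rng.exponential(claim_mean)
        if surplus < 0:
            below_since = t if below_since is None else below_since
            if t - below_since > delay:             # negative excursion exceeded the delay
                return True
        else:
            below_since = None                      # excursion ended before the delay
    return False

n_paths = 1000
prob = np.mean([parisian_ruin(u=5.0, c=1.2, lam=1.0, claim_mean=1.0,
                              delay=1.0, horizon=20.0) for _ in range(n_paths)])
print("Estimated Parisian ruin probability:", prob)
```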
27

Chau, Ki-wai, and 周麒偉. "Fourier-cosine method for insurance risk theory." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2014. http://hdl.handle.net/10722/208586.

Full text
Abstract:
In this thesis, a systematic study is carried out for effectively approximating Gerber-Shiu functions under Lévy subordinator models. This topic has hardly been touched in the recent literature, and our approach is via the popular Fourier-cosine method. In theory, classical Gerber-Shiu functions can be expressed in terms of an infinite sum of convolutions, but their inherent complexity makes efficient computation almost impossible. In contrast, Fourier transforms of convolutions can be evaluated in a far simpler manner. Therefore, an efficient numerical method based on the Fourier transform is pursued in this thesis for evaluating Gerber-Shiu functions. The Fourier-cosine method is a numerical method based on the Fourier transform and has been very popular in option pricing since its introduction. It has since evolved into a number of extensions, and we here adopt its spirit in insurance risk theory. In this thesis, the proposed approximant of Gerber-Shiu functions under a Lévy subordinator model has O(n) computational complexity, in comparison with that of O(n log n) via the usual numerical Fourier inversion. Also, for Gerber-Shiu functions within the proposed refined Sobolev space, an explicit error bound is given, and an error bound of this type is seemingly absent in the literature. Furthermore, the error bound for our estimation can be further enhanced under extra assumptions, which are not immediate from Fang and Oosterlee's works. We also suggest a robust method for the estimation of ruin probabilities (one special class of Gerber-Shiu functions) based on the moments of both the claim size and claim arrival distributions. A rearrangement inequality will also be adopted to amplify the use of our Fourier-cosine method in ruin probability, resulting in an effective global estimation. Finally, the effectiveness of our results will be further illustrated in a number of numerical studies, and our enhanced error bound is apparently optimal in our demonstration; more precisely, empirical evidence exhibiting the biggest possible error convergence rate agrees with our theoretical conclusion.
published_or_final_version
Mathematics
Master
Master of Philosophy
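The heart of the Fourier-cosine (COS) method is recovering a function from its Fourier transform with a truncated cosine series. The sketch below shows the generic Fang-Oosterlee density-recovery step on a simple test case (an Exp(1) density and its characteristic function); it illustrates the expansion only, not the thesis's Lévy-subordinator Gerber-Shiu computations or error bounds.

```python
import numpy as np

def cos_density(phi, a, b, N, x):
    """Recover a density on [a, b] from its characteristic function phi
    using the Fourier-cosine (COS) expansion of Fang & Oosterlee."""
    k = np.arange(N)
    u = k * np.pi / (b - a)
    F = 2.0 / (b - a) * np.real(phi(u) * np.exp(-1j * u * a))  # cosine coefficients
    F[0] *= 0.5                                                # first term weighted by 1/2
    return np.array([np.sum(F * np.cos(u * (xi - a))) for xi in x])

# Test case: Exp(1) density with characteristic function 1 / (1 - i u)
phi_exp = lambda u: 1.0 / (1.0 - 1j * u)
x = np.linspace(0.5, 6.0, 5)
approx = cos_density(phi_exp, a=0.0, b=20.0, N=256, x=x)
print(np.c_[x, approx, np.exp(-x)])   # COS approximation vs exact density
```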
28

Wan, Lai-mei. "Ruin analysis of correlated aggregate claims models." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2005. http://sunzi.lib.hku.hk/hkuto/record/B30705708.

Full text
29

Rouah, Fabrice. "Essays on hedge funds, operational risk, and commodity trading advisors." Thesis, McGill University, 2007. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=103290.

Full text
Abstract:
Hedge funds report performance information voluntarily. When they stop reporting, they are transferred from the "live" pool of funds to the "defunct" pool. Consequently, liquidated funds constitute a subset of the defunct pool. I present models of hedge fund survival, attrition, and survivorship bias based on liquidation alone. This refines estimates of predictor variables in models of survival, leads to attrition rates of hedge funds that are roughly one half of those previously thought, and produces larger estimates of survivorship bias. Survival models based on liquidated funds only lead to an increase in survival time of 50 to 100 percent relative to survival based on all defunct funds.
In addition to refining estimates of survival time, it is useful to examine how the double fee structure of hedge funds and Commodity Trading Advisors (CTAs) affects the incentives of their managers. Young CTAs are usually very small --- they hold few financial assets --- and may not meet their operating expenses with their management fee alone, so their incentive is to take on risk and post good returns. As they grow, their incentive to take on risk diminishes. CTAs in their fifth year diminish their volatility by 25 percent relative to their first year, and diminish returns by 70 percent. We find that CTAs behave more like indexers as they grow, concerned more with capital preservation than with asset management.
Operational risk is a major cause of hedge fund and CTA liquidation. In the banking industry, regulators have called upon institutions to develop models for measuring capital charge for operational losses, and to subject these models to stress testing. Losses are found to be inversely related to GDP growth, and positively related to unemployment. Since losses are thus cyclical, one way to stress test models is to calculate capital charge during good and bad economic regimes. We find loss distributions to have thicker tails during bad regimes. One implication is that banks will likely need to increase their capital charge when economic conditions deteriorate.
30

Malwandla, Musa. "Loss distributions in consumer credit risk : macroeconomic models for expected and unexpected loss." Master's thesis, University of Cape Town, 2016. http://hdl.handle.net/11427/20414.

Full text
Abstract:
This thesis focuses on modelling the distributions of loss in consumer credit arrangements, both at an individual level and at a portfolio level, and how these might be influenced by loan-specific factors and economic factors. The thesis primarily aims to examine how these factors can be incorporated into a credit risk model through logistic regression models and threshold regression models. Considering the fact that the specification of a credit risk model is influenced by its purpose, the thesis considers the IFRS 7 and IFRS 9 accounting requirements for impairment disclosure as well as Basel II regulatory prescriptions for capital requirements. The thesis presents a critique of the unexpected loss calculation under Basel II by considering the different ways in which loans can correlate within a portfolio. Two distributions of portfolio losses are derived. The Vašíček distribution, which is assumed in the Basel II requirements, was originally derived for corporate loans and was never adapted for application in consumer credit. This makes it difficult to interpret and validate the correlation parameters prescribed under Basel II. The thesis re-derives the Vašíček distribution under a threshold regression model that is specific to consumer credit risk, thus providing a way to estimate the model parameters from observed experience. The thesis also discusses how, if the probability of default is modelled through logistic regression, the portfolio loss distribution can be modelled as a log-log-normal distribution.
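For reference, the one-factor threshold model behind the Vašíček portfolio loss distribution discussed here gives, for a loan with unconditional default probability p and asset correlation ρ, the conditional default probability and the limiting loss-fraction distribution in their standard forms (the thesis re-derives these in a consumer-credit-specific threshold regression setting):

```latex
p(z) = \Phi\!\left(\frac{\Phi^{-1}(p) - \sqrt{\rho}\,z}{\sqrt{1-\rho}}\right),
\qquad
P(L \le x) = \Phi\!\left(\frac{\sqrt{1-\rho}\,\Phi^{-1}(x) - \Phi^{-1}(p)}{\sqrt{\rho}}\right), \quad 0 < x < 1,
```

where z is the standard normal systematic factor. The Basel II 99.9% capital formula corresponds to evaluating p(z) at an adverse 99.9% realisation of that factor.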
31

YE, Zuobin. "A risk-averse newsvendor model with pricing consideration." Digital Commons @ Lingnan University, 2004. https://commons.ln.edu.hk/otd/18.

Full text
Abstract:
A decision maker who is facing a random demand for a perishable product, such as newspapers, decides how many units to order for a single selling period. This single-period inventory problem is often referred to as the "classic newsvendor problem", in which the selling price is fixed, the order must be made before the selling period, and the decision maker is risk-neutral. If the decision maker orders too many (overage), the inventory cost will be too high. If the decision maker orders too few (underage), the potential profit will be lost. The optimal order quantity is a balance between the expected costs of overage and underage. This thesis investigates an extension of the classic newsvendor problem. In this extension the demand depends on the selling price, the decision maker may obtain an additional order at a higher price during the selling period, and the decision maker is risk-averse (not risk-neutral). The problem is to find the optimal order quantity and selling price so that the expected utility of the risk-averse decision maker is maximized. This thesis examines the relationship between the order quantity and the selling price for different risk-averse decision makers in the extended newsvendor problem defined above. The result shows that the relationships are consistent for some decision makers but not for others. For example, if the decision maker exhibits constant absolute risk aversion (CARA), the optimal order quantity will decline when the selling price increases. If the decision maker has constant relative risk aversion (CRRA), the relationship is complex. This thesis finds that if it is just known that the decision maker is risk-averse, the optimal order quantity placed is less than that made by a risk-neutral decision maker. Furthermore, the risk-averse decision maker's optimal order quantity falls when her/his risk aversion increases. However, the relationship between order quantity and selling price is still indeterminate in this case. This extension of the classic newsvendor problem provides a more realistic dynamic setting than before, therefore providing an excellent framework for examining how the inventory problem, interacting with the marketing issue (selling price), will influence decision makers at the firm level. It also provides an integrated framework for investigating different variations of newsvendor problems. Thus, this thesis will motivate and encourage more applications of the newsvendor problem, which is a foundation of many supply chain management problems.
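As a baseline for the risk-averse, price-dependent extension studied in this thesis, the risk-neutral classic newsvendor balances the expected costs of overage and underage at the critical fractile (a standard result stated only for orientation):

```latex
q^* = F^{-1}\!\left(\frac{c_u}{c_u + c_o}\right),
\qquad c_u = p - c, \qquad c_o = c - s,
```

where F is the demand distribution, p the selling price, c the unit cost and s the salvage value. The thesis shows, among other results, that the risk-averse optimal quantity lies below this risk-neutral benchmark.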
32

Yeo, Keng Leong Actuarial Studies Australian School of Business UNSW. "Claim dependence in credibility models." Awarded by:University of New South Wales. School of Actuarial Studies, 2006. http://handle.unsw.edu.au/1959.4/25971.

Full text
Abstract:
Existing credibility models have mostly allowed for one source of claim dependence only, that across time for an individual insured risk or a group of homogeneous insured risks. Numerous circumstances demonstrate that this may be inadequate and insufficient. In this dissertation, we developed a two-level common effects model, based loosely on the Bayesian model, which allows for two possible sources of dependence, that across time for the same individual risk and that between risks. For the case of Normal common effects, we are able to derive explicit formulas for the credibility premium. This takes the intuitive form of a weighted average between the individual risk's claims experience, the group's claims experience and the prior mean. We also consider the use of copulas, a tool widely used in other areas of work involving dependence, in constructing credibility premiums. Specifically, we utilise copulas to model the dependence across time for an individual risk or group of homogeneous risks. We develop the construction with several well-known families of copulas and are able to derive explicit formulas for their respective conditional expectations. Whilst some recent work has been done on constructing credibility models with copulas, explicit formulas for the conditional expectations have rarely been made available. Finally, we calibrate these copula credibility models using a real data set. This data set relates to the claims experience of workers' compensation insurance by occupation over a 7-year period for a particular state in the United States. Our results show that for each occupation, claims dependence across time is indeed present. Amongst the copulas considered in our empirical analysis, the Cook-Johnson copula model is found to be the best fit for the data set used. The calibrated copula models are then used for prediction of the next period's claims. We found that the Cook-Johnson copula model gives superior predictions. Furthermore, this calibration exercise allowed us to uncover the importance of examining the nature of the data and comparing it with the characteristics of the copulas we are calibrating to.
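The "intuitive weighted average" form mentioned in this abstract generalises the classical Bühlmann credibility premium, which for a single source of dependence reads (standard form, given for orientation; the thesis derives the analogous two-level common-effects and copula-based versions):

```latex
P_{\mathrm{cred}} = Z\,\bar{X} + (1 - Z)\,\mu,
\qquad
Z = \frac{n}{n + k},
\qquad
k = \frac{\mathbb{E}\!\left[\operatorname{Var}(X \mid \Theta)\right]}{\operatorname{Var}\!\left(\mathbb{E}[X \mid \Theta]\right)},
```

where $\bar{X}$ is the risk's own claims average over n periods and $\mu$ the overall (prior) mean.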
33

O'Neill, Martin Joseph, II. "Computational Epidemiology - Analyzing Exposure Risk: A Deterministic, Agent-Based Approach." Thesis, University of North Texas, 2009. https://digital.library.unt.edu/ark:/67531/metadc11017/.

Full text
Abstract:
Many infectious diseases are spread through interactions between susceptible and infectious individuals. Keeping track of where each exposure to the disease took place, when it took place, and which individuals were involved in the exposure can give public health officials important information that they may use to formulate their interventions. Further, knowing which individuals in the population are at the highest risk of becoming infected with the disease may prove to be a useful tool for public health officials trying to curtail the spread of the disease. Epidemiological models are needed to allow epidemiologists to study the population dynamics of transmission of infectious agents and the potential impact of infectious disease control programs. While many agent-based computational epidemiological models exist in the literature, they focus on the spread of disease rather than exposure risk. These models are designed to simulate very large populations, representing individuals as agents, and using random experiments and probabilities in an attempt to more realistically guide the course of the modeled disease outbreak. The work presented in this thesis focuses on tracking exposure risk to chickenpox in an elementary school setting. This setting is chosen due to the high level of detailed information realistically available to school administrators regarding individuals' schedules and movements. Using an agent-based approach, contacts between individuals are tracked and analyzed with respect to both individuals and locations. The results are then analyzed using a combination of tools from computer science and geographic information science.
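To illustrate the kind of deterministic, agent-based exposure bookkeeping described above, the toy sketch below (with invented agents and schedules, not the thesis's school model) counts, for each susceptible individual, the time slots shared with an infectious individual in the same location.

```python
from collections import defaultdict

# Hypothetical schedules: agent -> list of (time_slot, location)
schedules = {
    "alice": [(1, "room_a"), (2, "gym"), (3, "room_a")],
    "bob":   [(1, "room_a"), (2, "room_b"), (3, "room_a")],
    "carol": [(1, "gym"),    (2, "gym"),    (3, "room_b")],
}
infectious = {"bob"}  # agents currently infectious

# Index presence by (time_slot, location) so co-location is easy to detect
presence = defaultdict(set)
for agent, slots in schedules.items():
    for slot, loc in slots:
        presence[(slot, loc)].add(agent)

# Exposure proxy: number of slots a susceptible agent shares with an infectious one
exposure = defaultdict(int)
for (slot, loc), agents in presence.items():
    if agents & infectious:
        for agent in agents - infectious:
            exposure[agent] += 1

print(dict(exposure))   # e.g. {'alice': 2} -- alice shared two slots with bob
```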
34

Deng, Hui, and 鄧惠. "Mean-variance optimal portfolio selection with a value-at-risk constraint." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2009. http://hub.hku.hk/bib/B41897213.

Full text
35

Liao, Mingwei, and 廖明瑋. "Futures hedging on both procurement risk and sales risk under correlated prices and demand." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2014. http://hdl.handle.net/10722/206683.

Full text
Abstract:
The profitability of a manufacturer can be largely affected by the underlying uncertainties embedded in a fast-changing business environment. Random factors, such as input material price at the procurement end or output product price and demand at the sales end, might produce significant risks. Effective financial hedging therefore needs to be undertaken to mitigate these risk exposures. Although it is common to use commodity futures to control the risks at either end separately, little has been done on hedging these risk exposures in an integrated manner. Therefore, this study aims to develop a planning approach that performs financial hedging on both the procurement risk and the sales risk in a joint manner. This planning approach is based on a framework with a risk-averse commodity processor that procures the input commodity and sells the output commodity in the spot market, while hedging the procurement risk and sales risk by trading futures contracts in the commodity markets. Both input and output commodity futures are used for the hedging. A both-end-hedging model is developed to quantitatively evaluate the approach. The evaluation is based on an objective function that considers both profit maximisation and risk mitigation. Decisions on spot procurement, the input futures hedging position, and the output futures hedging position are optimised simultaneously. As the input commodity is the main production material for the output commodity, positive correlation between the input material price and the output product price is considered. The customer demand is considered negatively correlated with the output product price. An ethanol plant using corn as the main input material is employed as an example to implement the proposed model. The model is represented as a stochastic program, and the Gibson-Schwartz two-factor model is employed to describe the stochastic commodity prices. Historical commodity price data are used to estimate the parameters of the two-factor model in state-space form with the Kalman filter. By generating various scenarios representing evolving prices and the random customer demand, the stochastic program can be solved using linear programming algorithms via its deterministic equivalent. Numerical experiments are carried out to demonstrate the benefit that can be gained from applying the both-end-hedging approach proposed in this study. Compared with the traditional no-hedging model and single-end-hedging models, the improvement obtained from the proposed model is found to be significant. The effectiveness of the model is further tested under various price trends and price correlations, demand elasticities and volatilities, and risk attitudes of the decision maker. It is found that the proposed approach is robust in these various circumstances, and the approach is especially effective when the price trend is uncertain and when the decision maker has a strong risk-averse attitude.
published_or_final_version
Industrial and Manufacturing Systems Engineering
Master
Master of Philosophy
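As a simple point of comparison for the integrated stochastic-programming approach described above, the textbook single-commodity minimum-variance futures hedge ratio (not the thesis's both-end model) is

```latex
h^{*} = \frac{\operatorname{Cov}(\Delta S, \Delta F)}{\operatorname{Var}(\Delta F)} = \rho_{SF}\,\frac{\sigma_S}{\sigma_F},
```

where ΔS and ΔF are changes in the spot and futures prices over the hedging horizon. The both-end-hedging model instead chooses the spot procurement and the input and output futures positions jointly over correlated price and demand scenarios.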
36

Wang, Shuoyu, and 王硕玉. "Optimal inventory strategies in supply chains under a value-at-risk constraint." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2010. http://hub.hku.hk/bib/B4440704X.

Full text
37

Weber, Stefan. "Measures and models of financial risk." Doctoral thesis, [S.l. : s.n.], 2004. http://deposit.ddb.de/cgi-bin/dokserv?idn=973223421.

Full text
38

Owen, Michelle L. "Exposure model : detailed profiling and quantification of the exposure of personnel to geotechnical hazards in underground mines." University of Western Australia. School of Civil and Resource Engineering, 2004. http://theses.library.uwa.edu.au/adt-WU2005.0031.

Full text
Abstract:
[Truncated abstract] This thesis presents an operationally applicable and reliable model for quantification of the exposure of underground mining personnel to geotechnical hazards. The model is shown to have the flexibility to apply to very different operational environments, within the context of mechanised metalliferous mines. It provides an essential component for carrying out quantitative geotechnical risk analyses of underground mines. Increasingly prevalent within the Australian mining industry are moves towards a risk-based philosophy instead of prescriptive design procedures. A barrier to this has been the lag in availability of resources (personnel and technical) required for the intensive effort of applying probabilistic methods to geotechnical engineering at mines ... One of the missing components for quantitative risk analysis in mines has been an accurate model of personnel exposure to geotechnical hazards, from which meaningful estimates can be made of the probabilities of serious or fatal injury given a rockfall. Exposure profiling for geotechnical risk analysis at mine sites has traditionally involved the simple classification of travelways and entry areas by their occupancy rate, not taking into account traffic and work characteristics which may significantly influence the risks. Therefore, it was the focus of this thesis to address that deficiency and progress the ability to perform semi-quantitative and quantitative risk analyses in mines.
39

Leboho, Nakedi Wilson. "Quantitative Risk Management and Pricing for Equity Based Insurance Guarantees." Thesis, Stellenbosch : Stellenbosch University, 2015. http://hdl.handle.net/10019.1/96980.

Full text
Abstract:
Thesis (MSc)--Stellenbosch University, 2015
ENGLISH ABSTRACT : Equity-based insurance guarantees, also known as unit-linked annuities, are annuities with embedded exotic, long-term and path-dependent options which can be categorised into variable and equity-indexed annuities, whereby investors participate in the security markets through insurance companies that guarantee them a minimum of their invested premiums. The difference between financial options and the options embedded in equity-based policies is that the financial ones are financed by the option buyers' premiums, whereas the options in equity-based policies are also financed by continuous fees that follow the premium first paid by the policyholders during the life of the contracts. Other important dissimilarities are that equity-based policies do not give the owner the right to sell the contract, and carry not just security-market-related risk, but also insurance-related risks such as selection, behavioural, mortality, systematic longevity and other risks. Thus equity-based annuities are much more complicated insurance products to precisely value and hedge. For insurance companies to successfully fulfil their promise of eventually returning at least the initially invested amount to the policyholders, they have to be able to measure and manage the risks within the equity-based policies. So in this thesis, we carry out fair pricing of variable and equity-indexed annuities, and then discuss the management of financial market and insurance risks.
AFRIKAANSE OPSOMMING : Equity-based insurance guarantees, also known as unit-linked annuities, are annuities with embedded exotic, long-term and path-dependent options that can be classified into variable and equity-indexed annuities, through which investors participate in the security markets via insurance companies that guarantee them a minimum of their invested premiums. The difference between financial options and the options embedded in equity-based policies is that the financial ones are financed by the option buyers' premiums, whereas the options in equity-based policies are also financed by continuous fees that follow the premium first paid by the policyholders during the life of the contracts. Other important differences are that equity-based policies do not give the owner the right to sell the contract, and carry not only security-market-related risk but also insurance risks such as selection, behavioural, mortality and systematic longevity risks. Equity-based annuities are therefore very complicated insurance products to value and hedge precisely. For insurance companies to successfully fulfil their promise of eventually returning at least the initially invested amount to the policyholders, they must be able to measure and manage the risks within the equity-based policies. In this thesis we therefore carry out fair pricing of the variable and equity-indexed annuities, and then discuss the management of financial market and insurance risks.
APA, Harvard, Vancouver, ISO, and other styles
40

ROGANTINI, PICCO Anna. "Essays in macroeconomics : fiscal policy, hiring frictions, uncertainty, and risk sharing." Doctoral thesis, European University Institute, 2020. https://hdl.handle.net/1814/69000.

Full text
Abstract:
Defence date: 24 November 2020
Examining Board: Prof. Evi Pappa (University Carlos III of Madrid); Prof. Leonardo Melosi (European University Institute and Federal Reserve Bank of Chicago); Prof. John Fernald (INSEAD and Federal Reserve Bank of San Francisco); Prof. Antonella Trigari (Bocconi University)
The three chapters of this thesis are inspired by aspects of the complex world we live in. The first chapter uncovers the role of firms' hiring decisions as a key source of state dependence in the fiscal spending multiplier. When the hiring rate is high, a larger share of workers has to be relocated from production to recruitment and training of new hires. This diversion of resources lowers firms' productivity and reduces the effect of government spending stimulus on output. I establish this result using local projections and illustrate the mechanism by building a non-linear dynamic general equilibrium model. The second chapter, joint with Joonseok Oh, shows how uninsurable unemployment risk is crucial to matching, qualitatively and quantitatively, the macro responses to uncertainty shocks. Empirically, uncertainty shocks i) generate deflationary pressure; ii) have considerable negative consequences for economic activity; and iii) produce a drop in aggregate consumption, which is mainly driven by the response of households in the bottom 60% of the income distribution. Standard representative-agent New Keynesian models have difficulty delivering these effects. A heterogeneous-agent framework with search and matching frictions and Calvo pricing allows us to attain these results jointly. Uncertainty shocks induce households' precautionary saving and firms' precautionary pricing behaviors, triggering a fall in aggregate demand and supply. These precautionary behaviors increase the unemployment risk of the imperfectly insured households, who strengthen precautionary saving further. When the feedback loop between unemployment risk and precautionary saving is strong enough, a rise in uncertainty leads to i) a drop in inflation; ii) amplified negative responses of macro variables; and iii) heterogeneous consumption responses across households, consistent with the empirical evidence. The third chapter, joint with Alessandro Ferrari, empirically evaluates whether adopting a common currency has changed the ability of euro area member states to share risk. We construct a counterfactual dataset of macroeconomic variables through the synthetic control method. We then apply the output variance decomposition of Asdrubali, Sorensen and Yosha (1996) to both the actual and the synthetic data to study whether there has been a change in risk sharing and through which channels. We find that the euro has reduced consumption smoothing. We further show that this reduction is mainly driven by the periphery countries of the euro area, which have experienced a decrease in risk sharing through private credit.
-- 1. Fiscal multipliers : a tale from the labor market -- 2. Macro uncertainty and unemployment risk -- 3. Risk sharing and the adoption of the euro
Chapter 2 ‘Macro uncertainty and unemployment risk' of the PhD thesis draws upon an earlier version published as EUI ECO WP 2019/02 and Chapter 3 ‘Risk sharing and the adoption of the Euro' of the PhD thesis draws upon an earlier version published as ESM Working Paper Series 17/2016 and as ADEMU Working Paper Series 2017/055.
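As a concrete illustration of the channel decomposition used in the third chapter, the sketch below runs the Asdrubali, Sorensen and Yosha (1996) panel regressions on a fabricated panel. The variable names, the fixed-effect treatment via cross-sectional demeaning, and the synthetic data are illustrative assumptions, not the chapter's euro-area dataset or code.

```python
# Hedged sketch of the Asdrubali, Sorensen and Yosha (1996) output variance decomposition
# on synthetic panel data. The four channel betas sum to one by construction.
import numpy as np

rng = np.random.default_rng(1)
n_countries, n_years = 12, 25
d_gdp = rng.normal(0.02, 0.02, (n_countries, n_years))            # Δlog GDP
d_gnp = d_gdp - 0.15 * d_gdp + rng.normal(0, 0.004, d_gdp.shape)  # factor-income smoothing
d_inc = d_gnp - 0.10 * d_gdp + rng.normal(0, 0.004, d_gdp.shape)  # fiscal-transfer smoothing
d_con = d_inc - 0.30 * d_gdp + rng.normal(0, 0.004, d_gdp.shape)  # credit-market smoothing

def channel_beta(lhs, rhs):
    """Panel OLS slope of lhs on rhs with year fixed effects (cross-sectional demeaning)."""
    lhs = lhs - lhs.mean(axis=0, keepdims=True)
    rhs = rhs - rhs.mean(axis=0, keepdims=True)
    return float((lhs * rhs).sum() / (rhs * rhs).sum())

beta_k = channel_beta(d_gdp - d_gnp, d_gdp)   # capital markets channel
beta_f = channel_beta(d_gnp - d_inc, d_gdp)   # fiscal transfers channel
beta_c = channel_beta(d_inc - d_con, d_gdp)   # credit markets channel
beta_u = channel_beta(d_con, d_gdp)           # unsmoothed share

print(f"capital {beta_k:.2f}, fiscal {beta_f:.2f}, credit {beta_c:.2f}, "
      f"unsmoothed {beta_u:.2f}, sum {beta_k + beta_f + beta_c + beta_u:.2f}")
```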
APA, Harvard, Vancouver, ISO, and other styles
41

Almohri, Hussain. "High Assurance Models for Secure Systems." Diss., Virginia Tech, 2013. http://hdl.handle.net/10919/22030.

Full text
Abstract:
Despite the recent advances in systems and network security, attacks on large enterprise networks consistently impose serious challenges to maintaining data privacy and software service integrity. We identify two main problems that contribute to increasing the security risk in a networked environment: (i) servers, workstations, and mobile devices that suffer from vulnerabilities which allow the execution of various cyber attacks, and (ii) poor security and system configurations that create loopholes used by attackers to bypass implemented security defenses.

Complex attacks on large networks are only possible with the existence of vulnerable intermediate machines, routers, or mobile devices (which we refer to as network components) in the network. Vulnerabilities in the highly connected servers and workstations that comprise the heart of today's networks are inevitable. Modern mobile devices with known vulnerabilities also pose an increasing risk to large networks. Thus, weak security mechanisms in vulnerable network components open the possibility of effective network attacks.

On the other hand, the lack of systematic methods for an effective static analysis of an overall complex network results in inconsistent and vulnerable configurations at individual network components as well as at the network level. For example, inconsistency and faults in designing firewall rules at a host may result in enabling additional attack vectors. Further, the dynamic nature of networks, with changing configurations, machine availability, and connectivity, makes security analysis a challenging task.

This work presents a hybrid approach to security by providing two solutions for analyzing the overall security of large organizational networks, and a runtime framework for protecting individual network components against misuse of system resources by cyber attackers. We observe that to secure an overall computing environment, a static analysis of a network is not sufficient. Thus, we couple our analysis with a framework to secure individual network components, including high-performance machines as well as mobile devices that repeatedly enter and leave networks. We also realize the need for advancing the theoretical foundations for analyzing the security of large networks.


To analyze the security of a large enterprise network, we present the first scientific attempt to compute an optimized distribution of defensive resources with the objective of minimizing the chances of successful attacks. To achieve this minimization, we develop a rigorous probabilistic model that quantitatively measures the chances of a successful attack on any network component. Our model provides a solid theoretical foundation that enables efficient computation of unknown success probabilities at every stage of a network attack. We design an algorithm that uses the computed attack probabilities to optimize the security configuration of a network. Our optimization algorithm uses state-of-the-art sequential linear programming to approximate the solution to a complex single-objective nonlinear minimization problem that formalizes various attack steps and candidate defenses at the granularity of attack stages.
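For intuition about what computing success probabilities over attack stages can look like, the toy sketch below propagates compromise probabilities through a small acyclic attack graph under an independence assumption. It is not the dissertation's model; the node names, edge structure, and probabilities are invented for illustration.

```python
# Toy propagation of attack-success probabilities over an acyclic attack graph.
# Assumes exploit attempts along different inbound paths are independent.

# p_exploit[node]: probability that an attacker who can reach the node exploits it.
p_exploit = {"web": 0.6, "app": 0.4, "db": 0.3, "workstation": 0.5}
# parents[node]: nodes from which this node can be attacked ("internet" is always reachable).
parents = {"web": ["internet"], "workstation": ["internet"],
           "app": ["web", "workstation"], "db": ["app"]}

def compromise_probability(node, cache=None):
    """P(node compromised): reachable if at least one parent is compromised, then exploited."""
    cache = {} if cache is None else cache
    if node == "internet":
        return 1.0
    if node in cache:
        return cache[node]
    p_not_reached = 1.0
    for parent in parents[node]:
        p_not_reached *= 1.0 - compromise_probability(parent, cache)
    cache[node] = (1.0 - p_not_reached) * p_exploit[node]
    return cache[node]

for n in ["web", "workstation", "app", "db"]:
    print(f"P(compromise {n:12s}) = {compromise_probability(n):.3f}")
```

Optimizing defenses would then amount to choosing which `p_exploit` values to lower, subject to a budget, so that the probabilities at critical nodes are minimized; the dissertation formulates this far more rigorously.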

To protect individual network components, we develop a new approach based on our novel idea of process authentication.
We argue that to provide high-assurance security, enforcing authorization is necessary but not sufficient. In fact, existing authorization systems lack a strong and reliable process authentication model for preventing the execution of malicious processes (i.e., processes that intentionally pursue malicious goals that violate the integrity and confidentiality of legitimate processes and data). Authentication is especially critical when malicious processes may use various system vulnerabilities to install themselves on the system and execute stealthily without the user's consent.

We design and implement the Application Authentication (A2) framework that is capable of monitoring application executions and ensuring proper authentication of application processes. A2 has the advantage of strong security guarantees, efficient runtime execution, and compatibility with legacy applications. This authentication framework reduces the risk of infection by powerful malicious applications that may disrupt proper execution of legitimate applications, steal users' private data, and spread across the entire organizational network.

Our process authentication model is extended and applied to the Android platform. As Android poses its own unique challenges (e.g., a virtualized application execution model), our design and implementation of process authentication are extended to address them. Per our results, process authentication in Android can protect the system against various critical vulnerabilities such as privilege-escalation attacks and drive-by downloads.

To demonstrate process authentication in Android, we implement DroidBarrier. As a runtime system, DroidBarrier includes an authentication component and a lightweight permission system to protect legitimate applications and secret authentication information in the file system. Our implementation of DroidBarrier is compatible with the Android runtime (with no need for modifications) and shows efficient performance with negligible penalties in I/O operations and process creations.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
42

"Inventory models with downside risk measures." Thesis, 2007. http://library.cuhk.edu.hk/record=b6074502.

Full text
Abstract:
Finally, we study a multi-period, risk-averse inventory model. The objective is to maximize the expected pay-off, with risk aversion modeled by penalizing the decision maker whenever a target profit level is not met in a financial reporting cycle. We recognize that the operational period is usually shorter than the financial reporting cycle, so the reporting cycle can be treated as an integer multiple of the operational period. We study this model under both the accrual-basis and the cash-basis accounting principles. We prove that the optimal inventory policy is a state-dependent base-stock policy under accrual-basis accounting, and we then show that the structure of an optimal policy is more complicated under cash-basis accounting.
In this thesis we study three supply chain models which address downside risk from a different angle. We start with a commitment-option supply contract in a Conditional Value-at-Risk (CVaR) framework. We show that a CVaR trade-off analysis with advanced reservation can be carried out efficiently. Moreover, our study indicates how the corresponding contract decisions differ from decisions for optimizing an expected value.
Key words. Downside Risk Measure; CVaR; Risk; Loss-Averse; Dynamic Programming.
Owing to the growing globalization of the economy and advances in commerce, research in supply chain management has attracted a large number of researchers in the last two decades. Yet standard treatments of supply chain models are mainly confined to the optimization of expected values, with little reflection on risk considerations. Even among those that consider a risk measure in the objective function, few employ downside risk measures. A downside risk measure takes into account only the part of the distribution that falls below a critical value, and thus corresponds to a safety-first strategy for the decision maker.
The thesis is organized in five chapters. In Chapter 1, we provide the background and research motivation for considering downside risk measures in supply chain models. In Chapter 2, we study the pay-to-delay supply contracts with a Conditional Value-at-Risk (CVaR) framework. In Chapter 3, we study the loss-averse newsvendor problem. In Chapter 4, we extend the loss-averse model to a multi-period setting. We conclude the thesis in Chapter 5 with discussions for future research.
Then, we employ a loss-aversion utility function to characterize the newsvendor's decision-making behavior. We find that when there is no shortage cost, the loss-averse newsvendor consistently orders less than a risk-neutral newsvendor. Further, we discover that the loss-averse newsvendor orders a constant quantity when the reference target is sufficiently large. We discuss the importance of initial inventory to achieve the target profit level. When the target is a decision variable, the newsvendor always sets the target no higher or no lower.
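The qualitative comparison above can be reproduced numerically. The sketch below evaluates a piecewise-linear loss-averse utility over a grid of order quantities by Monte Carlo and recovers a smaller optimal order than the risk-neutral benchmark; the demand distribution, prices, and loss-aversion coefficient are hypothetical, and this is not the thesis's exact formulation.

```python
# Loss-averse vs risk-neutral newsvendor with zero shortage cost and zero salvage value.
import numpy as np

rng = np.random.default_rng(2)
price, cost = 10.0, 6.0
demand = rng.gamma(shape=4.0, scale=25.0, size=200_000)   # hypothetical demand, mean 100

def expected_utility(q, loss_aversion):
    """Expected piecewise-linear utility of profit; losses are weighted by loss_aversion."""
    profit = price * np.minimum(q, demand) - cost * q      # unsold units are worthless
    utility = np.where(profit >= 0.0, profit, loss_aversion * profit)
    return utility.mean()

grid = np.arange(1.0, 201.0)
q_neutral = grid[np.argmax([expected_utility(q, 1.0) for q in grid])]
q_lossavr = grid[np.argmax([expected_utility(q, 2.25) for q in grid])]

print(f"risk-neutral order quantity: {q_neutral:.0f}")
print(f"loss-averse order quantity:  {q_lossavr:.0f}  (smaller, as the abstract states)")
```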
Ma, Lijun.
"October 2007."
Adviser: Houmin Yan.
Source: Dissertation Abstracts International, Volume: 69-08, Section: B, page: 5003.
Thesis (Ph.D.)--Chinese University of Hong Kong, 2007.
Includes bibliographical references (p. 140-154).
Electronic reproduction. Hong Kong : Chinese University of Hong Kong, [2012] System requirements: Adobe Acrobat Reader. Available via World Wide Web.
Electronic reproduction. [Ann Arbor, MI] : ProQuest Information and Learning, [200-] System requirements: Adobe Acrobat Reader. Available via World Wide Web.
Abstracts in English and Chinese.
School code: 1307.
APA, Harvard, Vancouver, ISO, and other styles
43

"Systematic component in default risk." 2009. http://library.cuhk.edu.hk/record=b5894038.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

"The term structure of credit risk." Thesis, 2000. http://library.cuhk.edu.hk/record=b6073914.

Full text
Abstract:
Credit risk is an important source of risk for almost all financial securities. Frequent and serious financial crises have made credit risk a sensitive and crucial consideration for financial institutions, corporations, and individual investors. Accurate pricing of credit risk and credit-risky assets depends crucially upon the credit risk term structure, which reflects the market's expectation of future credit risk. However, credit risk analysis is still in its early stages of development, and the investigation of the credit risk term structure, especially its empirical exploration, has many gaps. Earlier research on the credit risk term structure concentrates mainly on slopes, that is, on a simple linear term structure that is not applicable to middle-credit-quality assets. The curvature of spread curves may thus convey more information about changes in future credit quality, credit cycles, and recurring business cycles. In this thesis, a bond-pair approach is developed to study the shape (curvature as well as slope) of individual spread curves and the relationship among spread curves for bonds with different ratings. We uncover downward sloping spread curves for triple-C and double-C bonds and upward sloping spread curves for triple-A+ and triple-A bonds. We also uncover hump-shaped spread curves for middle-graded bonds from double A to single B, with peak points on these spread curves. We document the relationship among spread curves for bonds with different ratings in terms of time to peak and peak spread. We conclude that, comparing higher-rated bonds (say, double A) with lower-rated bonds (say, single B), the credit spread is higher and the time to peak shorter for the latter than for the former. In particular, these hump-shaped curves are bounded from above by the downward sloping spread curves of triple-C and double-C bonds and from below by the upward sloping spread curves of triple-A+ and triple-A bonds. These findings provide a good explanation of middle-rated bonds' spread curves. This evidence helps us to better understand the credit risk term structure, to price credit risk and credit-risky assets accurately, and to manage credit risk appropriately.
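The shapes documented above echo a well-known property of structural models. As a hedged illustration (not the bond-pair methodology of the thesis), the sketch below computes Merton-model credit spreads across maturities for three hypothetical leverage levels, producing upward sloping, hump-shaped, and downward sloping curves respectively.

```python
# Merton (1974) credit-spread term structures for low, medium and high leverage.
import numpy as np
from scipy.stats import norm

def merton_spread(V, F, r, sigma, T):
    """Credit spread (continuously compounded) on a zero-coupon bond with face value F."""
    d1 = (np.log(V / F) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    put = F * np.exp(-r * T) * norm.cdf(-d2) - V * norm.cdf(-d1)   # Black-Scholes put on V
    debt = F * np.exp(-r * T) - put                                 # value of the risky debt
    return -np.log(debt / (F * np.exp(-r * T))) / T

maturities = [0.5, 1, 2, 3, 5, 7, 10, 15, 20]
for label, face in [("high grade", 0.3), ("middle grade", 0.75), ("distressed", 1.1)]:
    spreads = [merton_spread(V=1.0, F=face, r=0.04, sigma=0.25, T=t) * 1e4 for t in maturities]
    print(label, [f"{s:.0f}bp" for s in spreads])
```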
Hu Wen-wei.
"August 2000."
Adviser: Jia He.
Source: Dissertation Abstracts International, Volume: 61-08, Section: A, page: 3284.
Thesis (Ph.D.)--Chinese University of Hong Kong, 2000.
Includes bibliographical references (p. 93-111).
Electronic reproduction. Hong Kong : Chinese University of Hong Kong, [2012] System requirements: Adobe Acrobat Reader. Available via World Wide Web.
Electronic reproduction. Ann Arbor, MI : ProQuest dissertations and theses, [200-] System requirements: Adobe Acrobat Reader. Available via World Wide Web.
Abstracts in English and Chinese.
School code: 1307.
APA, Harvard, Vancouver, ISO, and other styles
45

Sokolova, Ekaterina 1978. "Indifference valuation in non-reduced incomplete models with a stochastic risk factor." Thesis, 2007. http://hdl.handle.net/2152/3695.

Full text
Abstract:
This work contributes to the methodology of valuation of financial derivative contracts in an incomplete market. It focuses on a special type of incompleteness caused by the presence of a non-traded stochastic risk factor affecting the value of the contract. The non-traded risk factor may only appear in the payoff of the contract or, in addition, may enter the dynamics of the traded asset. We consider both cases. We suggest a discrete-time, discrete-space binomial model for the traded stock and the non-traded risk factor. We work in the utility maximization framework with dynamically changing agent preferences. We present a discrete-time, multi-period analog of the forward and backward utility processes recently developed in continuous time. We use methods of stochastic control and provide the indifference valuation algorithm with both the forward and backward dynamic utilities. We compare the two approaches and provide conditions under which they assign the same value to the contract. We show that unlike the backward dynamic utility, the forward dynamic utility yields prices that do not depend on the end of the investment horizon. We pay attention to the choice of the equivalent martingale measure used for valuation (i.e., the minimal martingale measure and the minimal entropy measure for the forward and backward utility processes, respectively). We explicitly characterize both measures and give conditions under which they coincide. We extend our algorithm to the case of American and partial-exercise contracts. We illustrate our work with numerical examples, showing that in an incomplete market a call option on a non-traded risk factor may optimally be exercised early, and that it may be optimal to exercise only a fraction of the total number of contracts held if partial exercise is allowed. In continuous time we extend the existing results to the case of American contracts with both the backward and the forward utilities. We emphasize the similarities between our discrete-time valuation algorithm and the continuous-time valuation. The two approaches use the same pricing measures, yield prices through nonlinear functionals of similar form, exhibit a similar relationship between the backward and forward prices, and a similar structure for the aggregate minimal entropy. We believe that our work makes a contribution by exposing the two above-mentioned ways of dependence on the non-traded risk factor, and by providing a new dynamic indifference pricing algorithm that allows consistent valuation across different investment horizons.
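A one-period toy version of this setup may help fix ideas. Assuming exponential utility, the sketch below computes a buyer's indifference price for a claim on a non-traded binomial factor that is imperfectly hedged with a traded binomial stock; all probabilities and payoffs are invented, and this is only a sketch of the indifference-pricing idea, not the thesis's multi-period forward/backward algorithm.

```python
# One-period exponential-utility indifference price: p = (1/gamma) * ln(V(0) / V(h)),
# where V(h) = inf over hedge positions of E[exp(-gamma * (hedge P&L + claim payoff))].
import numpy as np
from scipy.optimize import minimize_scalar

gamma, r = 1.0, 0.0
s0, s_up, s_down = 1.0, 1.2, 0.9              # traded stock moves (hypothetical)
payoff = {"u": 1.0, "d": 0.0}                 # claim h(Y) on the non-traded factor Y

# Joint real-world probabilities over (stock move, factor move); positive dependence
# makes the stock a partial, imperfect hedge for the claim.
probs = {("u", "u"): 0.35, ("u", "d"): 0.15, ("d", "u"): 0.15, ("d", "d"): 0.35}
stock_gain = {"u": s_up - s0 * (1 + r), "d": s_down - s0 * (1 + r)}

def min_expected_exponential(include_claim: bool) -> float:
    """inf over hedge alpha of E[exp(-gamma * (alpha * stock_gain + claim))]."""
    def objective(alpha):
        return sum(p * np.exp(-gamma * (alpha * stock_gain[s]
                                        + (payoff[y] if include_claim else 0.0)))
                   for (s, y), p in probs.items())
    return minimize_scalar(objective, bounds=(-50.0, 50.0), method="bounded").fun

price = np.log(min_expected_exponential(False) / min_expected_exponential(True)) / gamma
print(f"buyer's indifference price of the claim: {price:.4f}")
```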
APA, Harvard, Vancouver, ISO, and other styles
46

"On testing structural models of credit risk." 2005. http://library.cuhk.edu.hk/record=b5892687.

Full text
Abstract:
Li Ka-leung.
Thesis (M.Phil.)--Chinese University of Hong Kong, 2005.
Includes bibliographical references (leaves 85-88).
Abstracts in English and Chinese.
Chapter 1 --- Introduction --- p.1
Chapter 2 --- Structural models of credit risk --- p.9
Chapter 2.1 --- The original Merton model --- p.10
Chapter 2.2 --- The extended Merton model --- p.11
Chapter 2.3 --- The Black and Cox model --- p.12
Chapter 2.4 --- The LS model --- p.14
Chapter 2.5 --- The CDG model --- p.16
Chapter 2.6 --- Comments on structural models --- p.19
Chapter 3 --- Proxies and their implications --- p.20
Chapter 3.1 --- Reviews of the EHH's empirical studies --- p.20
Chapter 3.2 --- The proxy for market values of firms --- p.23
Chapter 3.2.1 --- Zero coupon bonds under the Merton model --- p.23
Chapter 3.2.2 --- Coupon bearing bonds under the extended Merton model --- p.25
Chapter 3.2.3 --- Zero coupon bonds under the LS model --- p.26
Chapter 3.2.4 --- Coupon bearing bonds under the LS model --- p.28
Chapter 3.3 --- Implications of other proxies --- p.29
Chapter 4 --- Maximum Likelihood Estimation --- p.33
Chapter 4.1 --- The MLE approach for the Merton model --- p.33
Chapter 4.2 --- The MLE approach for the barrier dependent models --- p.35
Chapter 4.3 --- Survivorship consideration --- p.36
Chapter 4.4 --- Simulation tests --- p.37
Chapter 4.5 --- Simulation results --- p.39
Chapter 4.5.1 --- Simulation results for the Merton model --- p.39
Chapter 4.5.2 --- Simulation results for the LS model --- p.42
Chapter 5 --- Empirical test --- p.47
Chapter 5.1 --- Criteria of bond selection --- p.47
Chapter 5.2 --- Parameters of models --- p.51
Chapter 5.2.1 --- Firm specific parameters --- p.51
Chapter 5.2.2 --- Interest rate parameters --- p.54
Chapter 5.2.3 --- Stationary leverage process parameters --- p.55
Chapter 5.2.4 --- Bond specific parameters --- p.57
Chapter 5.3 --- Empirical results --- p.58
Chapter 5.3.1 --- Empirical results for the Merton model --- p.59
Chapter 5.3.2 --- Empirical results for the LS model --- p.66
Chapter 5.3.3 --- Empirical results for the CDG model --- p.71
Chapter 6 --- Conclusion --- p.77
Appendix --- p.80
Chapter A.1 --- Appendix 1 --- p.80
Chapter A.2 --- Appendix 2 --- p.82
Chapter A.3 --- Appendix 3 --- p.84
Bibliography --- p.85
APA, Harvard, Vancouver, ISO, and other styles
47

Sypkens, Roelf. "Risk properties and parameter estimation on mean reversion and Garch models." Diss., 2010. http://hdl.handle.net/10500/4049.

Full text
Abstract:
Most of the notation and terminological conventions used in this thesis are statistical. The aim in risk management is to describe the risk factors present in time series. In order to group these risk factors, one needs to distinguish between different stochastic processes and put them into different classes. The risk factors discussed in this thesis are fat tails and mean reversion. The presence of these risk factors first needs to be established in the historical dataset, which I will refer to as the original dataset. The Ljung-Box-Pierce test will be used in this thesis to determine whether the original dataset exhibits mean reversion or not.
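Since the abstract leans on the Ljung-Box-Pierce statistic, a minimal implementation may be useful. The sketch below computes Q and its chi-square p-value for a simulated white-noise series and a mean-reverting AR(1) series; the simulated data and lag choice are illustrative assumptions only.

```python
# Ljung-Box-Pierce test: Q = n(n+2) * sum_{k=1..h} acf(k)^2 / (n - k), compared to chi2(h).
import numpy as np
from scipy.stats import chi2

def ljung_box(x, n_lags=10):
    """Return the Ljung-Box Q statistic and its chi-square p-value for lags 1..n_lags."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    denom = np.dot(x, x)
    acf = np.array([np.dot(x[:-k], x[k:]) / denom for k in range(1, n_lags + 1)])
    q = n * (n + 2) * np.sum(acf**2 / (n - np.arange(1, n_lags + 1)))
    return q, chi2.sf(q, df=n_lags)

rng = np.random.default_rng(3)
noise = rng.standard_normal(1000)
mean_reverting = np.zeros(1000)               # AR(1) with negative autocorrelation
for t in range(1, 1000):
    mean_reverting[t] = -0.4 * mean_reverting[t - 1] + noise[t]

for name, series in [("white noise", noise), ("mean-reverting AR(1)", mean_reverting)]:
    q, p = ljung_box(series, n_lags=10)
    print(f"{name:22s} Q = {q:8.2f}, p-value = {p:.4f}")
```

A small p-value flags significant autocorrelation, which the thesis uses as the signal for mean reversion in the original dataset.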
Mathematical Sciences
M.Sc. (Applied Mathematics)
APA, Harvard, Vancouver, ISO, and other styles
48

"A downside risk analysis based on financial index tracking models." 2003. http://library.cuhk.edu.hk/record=b5891530.

Full text
Abstract:
Yu Lian.
Thesis (M.Phil.)--Chinese University of Hong Kong, 2003.
Includes bibliographical references (leaves 81-84).
Abstracts in English and Chinese.
Chapter 1 --- Introduction --- p.1
Chapter 2 --- Literature Review --- p.4
Chapter 3 --- An Index Tracking Model with Downside Chance Risk Measure --- p.12
Chapter 3.1 --- Statement of the Model --- p.13
Chapter 3.2 --- Efficient Frontier --- p.16
Chapter 3.3 --- Application of the Downside Chance Index Tracking Model --- p.29
Chapter 3.4 --- Chapter Summary --- p.34
Chapter 4 --- Index Tracking Models with High Order Moment Downside Risk Measure --- p.35
Chapter 4.1 --- Statement of the Models --- p.35
Chapter 4.2 --- Mean-Downside Deviation Financial Index Tracking Model --- p.38
Chapter 4.3 --- Chapter Summary --- p.45
Chapter 5 --- Numerical Analysis --- p.45
Chapter 5.1 --- Data Analysis --- p.45
Chapter 5.2 --- Experiment Description and Discussion --- p.48
Chapter 5.2.1 --- Efficient Frontiers --- p.48
Chapter 5.2.2 --- Monthly Expected Rate of Return --- p.50
Chapter 5.3 --- Chapter Summary --- p.52
Chapter 6 --- Summary --- p.54
Chapter A --- List of Companies --- p.57
Chapter B --- Graphical Result of Section 5.2.1 --- p.61
Chapter C --- Graphical Result of Section 5.2.2 --- p.67
Chapter D --- Proof in Chapter 3 and Chapter 4 --- p.73
Bibliography --- p.81
APA, Harvard, Vancouver, ISO, and other styles
49

Doran, James Stephen. "On the market price of volatility risk." Thesis, 2004. http://hdl.handle.net/2152/1951.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

"Multi-period cooperative investment game with risk." 2008. http://library.cuhk.edu.hk/record=b5893772.

Full text
Abstract:
Zhou, Ying.
Thesis (M.Phil.)--Chinese University of Hong Kong, 2008.
Includes bibliographical references (leaves 89-91).
Abstracts in English and Chinese.
Chapter 1 --- Introduction --- p.1
Chapter 1.1 --- Background --- p.1
Chapter 1.2 --- Aims and objectives --- p.2
Chapter 1.3 --- Outline of the thesis --- p.3
Chapter 2 --- Literature Review --- p.6
Chapter 2.1 --- Portfolio Optimization Problems --- p.6
Chapter 2.2 --- Cooperative Games and Cooperative Investment Models --- p.8
Chapter 2.2.1 --- Linear Production Games and Basic Concepts of Cooperative Game Theory --- p.9
Chapter 2.2.2 --- Investment Models Using Linear Production Games --- p.12
Chapter 3 --- Multi-period Cooperative Investment Games: Basic Model --- p.15
Chapter 3.1 --- Cooperative Investment Game under Deterministic Case --- p.16
Chapter 3.2 --- Cooperative Investment Game with Stochastic Return --- p.18
Chapter 3.2.1 --- Basic Assumptions --- p.18
Chapter 3.2.2 --- Choose the Proper Risk Measure --- p.20
Chapter 3.2.3 --- One Period Case --- p.21
Chapter 3.2.4 --- Multi-Period Case --- p.23
Chapter 4 --- The Two-Period Investment Game under L∞ Risk Measure --- p.26
Chapter 4.1 --- The Two Period Model --- p.26
Chapter 4.2 --- The Algorithm --- p.35
Chapter 4.3 --- Optimal Solution of the Dual --- p.41
Chapter 5 --- Primal Solution and Stability of the Core under Two-Period Case --- p.43
Chapter 5.1 --- Direct Results --- p.44
Chapter 5.2 --- Find the Optimal Solutions of the Primal Problem --- p.46
Chapter 5.3 --- Relationship between A and the Core --- p.53
Chapter 5.3.1 --- Tracing out the efficient frontier --- p.54
Chapter 6 --- Multi-Period Case --- p.63
Chapter 6.1 --- Common Risk Price and the Negotiation Process with Concave Risk Utility --- p.64
Chapter 6.1.1 --- Existence of Common Risk Price and Core --- p.65
Chapter 6.1.2 --- Negotiation Process --- p.68
Chapter 6.2 --- Modified Simplex Method --- p.71
Chapter 7 --- Other Risk Measures --- p.76
Chapter 7.1 --- The Downside Risk Measure --- p.76
Chapter 7.1.1 --- Discrete (Finite Scenario) Distributions --- p.78
Chapter 7.1.2 --- General Distributions --- p.81
Chapter 7.2 --- Coherent Risk Measure and CVaR --- p.83
Chapter 8 --- Conclusion and Future Work --- p.87
APA, Harvard, Vancouver, ISO, and other styles