Academic literature on the topic 'Conditional risk measure'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Conditional risk measure.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Press it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Conditional risk measure"

1

Xun, Li, Renqiao Jiang, and Jianhua Guo. "The conditional Haezendonck–Goovaerts risk measure." Statistics & Probability Letters 169 (February 2021): 108968. http://dx.doi.org/10.1016/j.spl.2020.108968.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Ding, Rui, and Stan Uryasev. "CoCDaR and mCoCDaR: New Approach for Measurement of Systemic Risk Contributions." Journal of Risk and Financial Management 13, no. 11 (November 3, 2020): 270. http://dx.doi.org/10.3390/jrfm13110270.

Full text
Abstract:
Systemic risk is the risk that the distress of one or more institutions triggers a collapse of the entire financial system. We extend CoVaR (value-at-risk conditioned on an institution) and CoCVaR (conditional value-at-risk conditioned on an institution) systemic risk contribution measures and propose a new CoCDaR (conditional drawdown-at-risk conditioned on an institution) measure based on drawdowns. This new measure accounts for consecutive negative returns of a security, while CoVaR and CoCVaR combine negative returns from different time periods. For instance, ten consecutive 2% losses resulting in a 20% drawdown will be noticed by CoCDaR, while CoVaR and CoCVaR are not sensitive to relatively small one-period losses. The proposed measure provides insights for systemic risks under extreme stresses related to drawdowns. CoCDaR and its multivariate version, mCoCDaR, estimate the impact on big cumulative losses of the entire financial system caused by an individual firm’s distress. It can be used for ranking individual systemic risk contributions of financial institutions (banks). CoCDaR and mCoCDaR are computed with CVaR regression of drawdowns. Moreover, mCoCDaR can be used to estimate drawdowns of a security as a function of some other factors. For instance, we show how to perform fund drawdown style classification depending on drawdowns of indices. Case study results, data, and codes are posted on the web.
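For readers who want to see the drawdown idea in code, the following minimal sketch (illustrative only, not the authors' implementation; the return series and the 95% level are made-up assumptions) computes a drawdown series and its conditional drawdown-at-risk. The institution-conditional CoCDaR of the paper would additionally require a CVaR (quantile) regression of system drawdowns on the firm's drawdowns, which is not reproduced here.

```python
import numpy as np

def drawdowns(returns):
    """Drawdown at each date: distance of cumulative return from its running peak."""
    cum = np.cumsum(returns)           # cumulative (additive) returns
    peak = np.maximum.accumulate(cum)  # running maximum
    return peak - cum                  # non-negative, expressed as a loss

def cdar(returns, alpha=0.95):
    """Conditional drawdown-at-risk: mean of the worst (1 - alpha) share of drawdowns."""
    dd = drawdowns(returns)
    threshold = np.quantile(dd, alpha)
    return dd[dd >= threshold].mean()

rng = np.random.default_rng(0)
r = rng.normal(0.0003, 0.01, 2500)     # hypothetical daily returns
print(cdar(r, 0.95))
```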
APA, Harvard, Vancouver, ISO, and other styles
3

Kuzmina, Jekaterina, Gaida Pettere, and Irina Voronova. "Conditional risk measure modeling for Latvian insurance companies." Perspectives of Innovations, Economics and Business 2, no. 2 (October 9, 2009): 59–61. http://dx.doi.org/10.15208/pieb.2009.56.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Dmitrasinovic-Vidovic, Gordana, Ali Lari-Lavassani, Xun Li, and Antony Ware. "Continuous Time Portfolio Selection under Conditional Capital at Risk." Journal of Probability and Statistics 2010 (2010): 1–26. http://dx.doi.org/10.1155/2010/976371.

Full text
Abstract:
Portfolio optimization with respect to different risk measures is of interest to both practitioners and academics. For there to be a well-defined optimal portfolio, it is important that the risk measure be coherent and quasiconvex with respect to the proportion invested in risky assets. In this paper we investigate one such measure—conditional capital at risk—and find the optimal strategies under this measure, in the Black-Scholes continuous time setting, with time dependent coefficients.
APA, Harvard, Vancouver, ISO, and other styles
5

Kim, Joseph H. T., and Mary R. Hardy. "Estimating the Variance of Bootstrapped Risk Measures." ASTIN Bulletin 39, no. 1 (May 2009): 199–223. http://dx.doi.org/10.2143/ast.39.1.2038062.

Full text
Abstract:
In Kim and Hardy (2007) the exact bootstrap was used to estimate certain risk measures including Value at Risk and the Conditional Tail Expectation. In this paper we continue this work by deriving the influence function of the exact-bootstrapped quantile risk measure. We can use the influence function to estimate the variance of the exact-bootstrap risk measure. We then extend the result to the L-estimator class, which includes the conditional tail expectation risk measure. The resulting formula provides an alternative way to estimate the variance of the bootstrapped risk measures, or the whole L-estimator class, in an analytic form. A simulation study shows that this new method is comparable to the ordinary resampling-based bootstrap method, with the advantages of an analytic approach.
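As a rough illustration of how an influence function yields an analytic variance estimate, the sketch below treats the plain empirical quantile rather than the exact-bootstrapped L-estimators derived in the paper; the kernel density estimate of the density at the quantile is an assumption of the sketch.

```python
import numpy as np
from scipy.stats import gaussian_kde

def quantile_variance_via_if(sample, p=0.99):
    """Approximate variance of the empirical p-quantile via its influence function:
    IF(x) = (p - 1{x <= q_p}) / f(q_p), so Var ~ E[IF^2] / n = p(1-p) / (n f(q_p)^2)."""
    n = len(sample)
    q = np.quantile(sample, p)
    f_q = gaussian_kde(sample)(q)[0]   # kernel estimate of the density at the quantile
    infl = (p - (sample <= q)) / f_q   # influence function evaluated at each observation
    return np.mean(infl ** 2) / n

rng = np.random.default_rng(1)
losses = rng.lognormal(0.0, 0.5, 5000) # hypothetical loss sample
print(quantile_variance_via_if(losses, 0.99))
```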
APA, Harvard, Vancouver, ISO, and other styles
6

Brownlees, Christian, and Robert F. Engle. "SRISK: A Conditional Capital Shortfall Measure of Systemic Risk." Review of Financial Studies 30, no. 1 (August 6, 2016): 48–79. http://dx.doi.org/10.1093/rfs/hhw060.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Möller, Philipp M. "Drawdown Measures and Return Moments." International Journal of Theoretical and Applied Finance 21, no. 07 (November 2018): 1850042. http://dx.doi.org/10.1142/s0219024918500425.

Full text
Abstract:
This paper provides an investigation of the effects of an investment’s return moments on drawdown-based measures of risk, including Maximum Drawdown (MDD), Conditional Drawdown (CDD), and Conditional Expected Drawdown (CED). Additionally, a new end-of-period drawdown measure is introduced, which incorporates a psychological aspect of risk perception that previous drawdown measures had been unable to capture. While simulation results indicate many similarities in the first and second moments, skewness and kurtosis affect different drawdown measures in radically different ways. Thus, users should assess whether their choice of drawdown measure accurately reflects the kind of risk they want to measure.
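A minimal numerical sketch of two of the measures discussed above, Maximum Drawdown and Conditional Expected Drawdown (taken here as the tail mean of a simulated maximum-drawdown distribution), may help fix ideas; the Gaussian return paths and the 90% level are made-up assumptions, not the simulation design of the paper.

```python
import numpy as np

def max_drawdown(returns):
    """Maximum drawdown of a single return path (additive approximation)."""
    cum = np.cumsum(returns)
    return np.max(np.maximum.accumulate(cum) - cum)

def ced(return_matrix, alpha=0.9):
    """Conditional Expected Drawdown: tail mean of the maximum-drawdown distribution,
    estimated from simulated paths (one path per row)."""
    mdds = np.array([max_drawdown(path) for path in return_matrix])
    threshold = np.quantile(mdds, alpha)
    return mdds[mdds >= threshold].mean()

rng = np.random.default_rng(2)
paths = rng.normal(0.0002, 0.012, size=(1000, 250))  # hypothetical one-year paths
print("MDD of first path:", max_drawdown(paths[0]))
print("CED(0.9):", ced(paths, 0.9))
```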
APA, Harvard, Vancouver, ISO, and other styles
8

Hürlimann, Werner. "Multivariate Fréchet copulas and conditional value-at-risk." International Journal of Mathematics and Mathematical Sciences 2004, no. 7 (2004): 345–64. http://dx.doi.org/10.1155/s0161171204210158.

Full text
Abstract:
Based on the method of copulas, we construct a parametric family of multivariate distributions using mixtures of independent conditional distributions. The new family of multivariate copulas is a convex combination of products of independent and comonotone subcopulas. It fulfills the four most desirable properties that a multivariate statistical model should satisfy. In particular, the bivariate margins belong to a simple but flexible one-parameter family of bivariate copulas, called linear Spearman copula, which is similar but not identical to the convex family of Fréchet. It is shown that the distribution and stop-loss transform of dependent sums from this multivariate family can be evaluated using explicit integral formulas, and that these dependent sums are bounded in convex order between the corresponding independent and comonotone sums. The model is applied to the evaluation of the economic risk capital for a portfolio of risks using conditional value-at-risk measures. A multivariate conditional value-at-risk vector measure is considered. Its components coincide for the constructed multivariate copula with the conditional value-at-risk measures of the risk components of the portfolio. This yields a “fair” risk allocation in the sense that each risk component becomes allocated to its coherent conditional value-at-risk.
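Following the abstract's description of the bivariate margins as convex combinations of the independence and comonotone copulas, the positive-dependence branch of the linear Spearman copula can be written schematically as follows (a sketch of the usual parametrization, not copied from the paper):

```latex
% Positive-dependence branch of the linear Spearman copula, theta in [0,1]:
%   theta = 0 gives independence, theta = 1 the comonotone (upper Fréchet bound) copula,
%   and Spearman's rho equals theta because rho is linear in the copula.
C_\theta(u,v) \;=\; (1-\theta)\,uv \;+\; \theta\,\min(u,v), \qquad 0 \le u, v \le 1 .
```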
APA, Harvard, Vancouver, ISO, and other styles
9

Ghosh, Indranil, and Filipe J. Marques. "Tail Conditional Expectations Based on Kumaraswamy Dispersion Models." Mathematics 9, no. 13 (June 24, 2021): 1478. http://dx.doi.org/10.3390/math9131478.

Full text
Abstract:
Recently, there seems to be an increasing amount of interest in the use of the tail conditional expectation (TCE) as a useful measure of risk associated with a production process, for example, in the measurement of risk associated with stock returns corresponding to the manufacturing industry, such as the production of electric bulbs, investment in housing development, and financial institutions offering loans to small-scale industries. Companies typically face three types of risk (and associated losses from each of these sources): strategic (S); operational (O); and financial (F) (insurance companies additionally face insurance risks) and they come from multiple sources. For asymmetric and bounded losses (properly adjusted as necessary) that are continuous in nature, we conjecture that risk assessment measures via univariate/bivariate Kumaraswamy distribution will be efficient in the sense that the resulting TCE based on bivariate Kumaraswamy type copulas do not depend on the marginals. In fact, almost all classical measures of tail dependence are such, but they investigate the amount of tail dependence along the main diagonal of copulas, which has often little in common with the concentration of extremes in the copula’s domain of definition. In this article, we examined the above risk measure in the case of a univariate and bivariate Kumaraswamy (KW) portfolio risk, and computed TCE based on bivariate KW type copulas. For illustrative purposes, a well-known Stock indices data set was re-analyzed by computing TCE for the bivariate KW type copulas to determine which pairs produce minimum risk in a two-component risk scenario.
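The TCE used throughout the abstract has a simple empirical counterpart; the sketch below estimates it from a bounded loss sample drawn from a Kumaraswamy distribution via its quantile function. The parameters and the 95% level are hypothetical, and this is not the copula-based bivariate analysis of the paper.

```python
import numpy as np

def tail_conditional_expectation(losses, q=0.95):
    """Empirical TCE_q = E[X | X > VaR_q(X)]: average loss beyond the q-quantile."""
    var_q = np.quantile(losses, q)
    tail = losses[losses > var_q]
    return tail.mean() if tail.size else var_q

rng = np.random.default_rng(3)
a, b = 2.0, 5.0                               # hypothetical Kumaraswamy shape parameters
u = rng.uniform(size=10_000)
x = (1 - (1 - u) ** (1 / b)) ** (1 / a)       # Kumaraswamy sample via the quantile function
print(tail_conditional_expectation(x, 0.95))
```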
APA, Harvard, Vancouver, ISO, and other styles
10

Di Bernardino, E., J. M. Fernández-Ponce, F. Palacios-Rodríguez, and M. R. Rodríguez-Griñolo. "On multivariate extensions of the conditional Value-at-Risk measure." Insurance: Mathematics and Economics 61 (March 2015): 1–16. http://dx.doi.org/10.1016/j.insmatheco.2014.11.006.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Dissertations / Theses on the topic "Conditional risk measure"

1

Doldi, Alessandro. "Equilibrium, Systemic Risk Measures and Optimal Transport: A Convex Duality Approach." Doctoral thesis, Università degli Studi di Milano, 2021. http://hdl.handle.net/2434/812668.

Full text
Abstract:
This Thesis focuses on two main topics. Firstly, we introduce and analyze the novel concept of Systemic Optimal Risk Transfer Equilibrium (SORTE), and we progressively generalize it (i) to a multivariate setup and (ii) to a dynamic (conditional) setting. Additionally we investigate its relation to a recently introduced concept of Systemic Risk Measures (SRM). We present Conditional Systemic Risk Measures and study their properties, dual representation and possible interpretations of the associated allocations as equilibria in the sense of SORTE. On a parallel line of work, we develop a duality for the Entropy Martingale Optimal Transport problem and provide applications to problems of nonlinear pricing-hedging. The mathematical techniques we exploit are mainly borrowed from functional and convex analysis, as well as probability theory. More specifically, apart from a wide range of classical results from functional analysis, we extensively rely on Fenchel-Moreau-Rockafellar type conjugacy results, Minimax Theorems, theory of Orlicz spaces, compactness results in the spirit of Komlós Theorem. At the same time, mathematical results concerning utility maximization theory (existence of optima for primal and dual problems, just to mention an example) and optimal transport theory are widely exploited. The notion of SORTE is inspired by the Bühlmann's classical Equilibrium Risk Exchange (H. Bühlmann, "The general economic premium principle", Astin Bulletin, 1984). In both the Bühlmann and the SORTE definition, each agent is behaving rationally by maximizing his/her expected utility given a budget constraint. The two approaches differ by the budget constraints. In Bühlmann's definition the vector that assigns the budget constraint is given a priori. In the SORTE approach, on the contrary, the budget constraint is endogenously determined by solving a systemic utility maximization problem. SORTE gives priority to the systemic aspects of the problem, in order to first optimize the overall systemic performance, rather than to individual rationality. Single agents' preferences are, however, taken into account by the presence of individual optimization problems. The two aspects are simultaneously considered via an optimization problem for a value function given by summation of single agents' utilities. After providing a financial and theoretical justification for this new idea, in this research sufficient general assumptions that guarantee existence, uniqueness, and Pareto optimality of such a SORTE are presented. Once laid the theoretical foundation for the newly introduced SORTE, this Thesis proceeds in extending such a notion to the case when the value function to be optimized has two components, one being the sum of the single agents' utility functions, as in the aforementioned case of SORTE, the other consisting of a truly systemic component. This marks the progress from SORTE to Multivariate Systemic Optimal Risk Transfer Equilibrium (mSORTE). Technically, the extension of SORTE to the new setup requires developing a theory for multivariate utility functions and selecting at the same time a suitable framework for the duality theory. Conceptually, this more general setting allows us to introduce and study a Nash Equilibrium property of the optimizers. Existence, uniqueness, Pareto optimality and the Nash Equilibrium property of the newly defined mSORTE are proved in this Thesis. 
Additionally, it is shown how mSORTE is in fact a proper generalization, and covers both from the conceptual and the mathematical point of view the notion of SORTE. Proceeding further in the analysis, the relations between the concepts of mSORTE and SRM are investigated in this work. The notion of SRM we start from was introduced in the papers "A unified approach to systemic risk measures via acceptance sets" (Math. Finance, 2019) and "On fairness of systemic risk measures" (Finance Stoch., 2020) by F. Biagini, J.-P. Fouque, M. Frittelli, and T. Meyer-Brandis. SRM of Biagini et al. are generalized in this Thesis to a dynamic (namely conditional) setting, adding also a systemic, multivariate term in the threshold functions that Biagini et al. consider in their papers. The dynamic version of mSORTE is introduced, and it is proved that the optimal allocations of dynamic SRM, together with the corresponding fair pricing measures, yield a dynamic mSORTE. This in particular remains true if conditioning is taken with respect to the trivial sigma algebra, which is tantamount to working in the non-dynamic setting covered in Biagini et al. for SRM, and in the previous parts of our work for mSORTE. The case of exponential utility functions is thoroughly examined, and the explicit formulas we obtain for this specific choice of threshold functions allow for providing a time consistency property for allocations, dynamic SRM and dynamic mSORTE. The last part of this Thesis is devoted to a conceptually separate topic. Nonetheless, a clear mathematical link between the previous work and the one we are to describe is established by the use of common techniques. A duality between a novel Entropy Martingale Optimal Transport (EMOT) problem (D) and an associated optimization problem (P) is developed. In (D) the approach taken in Liero et al. (M. Liero, A. Mielke, and G. Savaré, "Optimal entropy-transport problems and a new Hellinger-Kantorovich distance between positive measures", Inventiones mathematicae, 2018) serves as a basis for adding the constraint, typical of Martingale Optimal Transport (MOT) theory, that the infimum of the cost functional is taken over martingale probability measures, instead of finite positive measures, as in Liero et al.. The Problem (D) differs from the corresponding problem in Liero et al. not only by the martingale constraint, but also because we admit less restrictive penalization terms D, which may not have a divergence formulation. In Problem (P) the objective functional, associated via Fenchel conjugacy to the terms D, is not any more linear, as in Optimal Transport or in MOT. This leads to a novel optimization problem which also has a clear financial interpretation as a non linear subhedging value. Our results in this Thesis establish a novel nonlinear robust pricing-hedging duality in financial mathematics, which covers a wide range of known robust results in its generality. The research for this Thesis resulted in the production of the following works: F. Biagini, A. Doldi, J.-P. Fouque, M. Frittelli, and T. Meyer-Brandis, "Systemic optimal risk transfer equilibrium", Mathematics and Financial Economics, 2021; A. Doldi and M. Frittelli, "Multivariate Systemic Optimal Risk Transfer Equilibrium", Preprint: arXiv:1912.12226, 2019; A. Doldi and M. Frittelli, "Conditional Systemic Risk Measures", Preprint: arXiv:2010.11515, 2020; A. Doldi and M. Frittelli, "Entropy Martingale Optimal Transport and Nonlinear Pricing-Hedging Duality", Preprint: arXiv:2005.12572, 2020.
APA, Harvard, Vancouver, ISO, and other styles
2

Karniychuk, Maryna. "Comparing Approximations for Risk Measures Related to Sums of Correlated Lognormal Random Variables." Master's thesis, Universitätsbibliothek Chemnitz, 2007. http://nbn-resolving.de/urn:nbn:de:swb:ch1-200700024.

Full text
Abstract:
In this thesis the performances of different approximations are compared for a standard actuarial and financial problem: the estimation of quantiles and conditional tail expectations of the final value of a series of discrete cash flows. To calculate risk measures such as quantiles and Conditional Tail Expectations, one needs the distribution function of the final wealth. The final value of a series of discrete payments in the considered model is the sum of dependent lognormal random variables. Unfortunately, its distribution function cannot be determined analytically, so usually one has to use time-consuming Monte Carlo simulations. Computational time still remains a serious drawback of Monte Carlo simulations, thus several analytical techniques for approximating the distribution function of final wealth are proposed within the framework of this thesis. These are the widely used moment-matching approximations and innovative comonotonic approximations. Moment-matching methods approximate the unknown distribution function by a given one in such a way that some characteristics (in the present case the first two moments) coincide. The ideas of two well-known approximations are described briefly. Analytical formulas for valuing quantiles and Conditional Tail Expectations are derived for both approximations. Recently, a large group of scientists from the Catholic University of Leuven in Belgium has derived comonotonic upper and comonotonic lower bounds for sums of dependent lognormal random variables. These are bounds in terms of "convex order". In order to provide the theoretical background for comonotonic approximations, several fundamental ordering concepts such as stochastic dominance, stop-loss and convex order, and some important relations between them, are introduced. The last two concepts are closely related. Both stochastic orders express which of two random variables is the "less dangerous/more attractive" one. The central idea of the comonotonic upper bound approximation is to replace the original sum, representing the final wealth, by a new sum whose components have the same marginal distributions as the components in the original sum, but with a "more dangerous/less attractive" dependence structure. The upper bound, or mathematically speaking the convex-largest sum, is obtained when the components of the sum are the components of a comonotonic random vector. Therefore, fundamental concepts of comonotonicity theory which are important for the derivation of convex bounds are introduced. The most widespread examples of comonotonicity which emerge in a financial context are described. In addition to the upper bound, a lower bound can be derived as well. This provides one with a measure of the reliability of the upper bound. The lower bound approach is based on the technique of conditioning. It is obtained by applying Jensen's inequality for conditional expectations to the original sum of dependent random variables. Two slightly different versions of the conditioning random variable are considered in the context of this thesis. They give rise to two different approaches, which are referred to as the comonotonic lower bound and comonotonic "maximal variance" lower bound approaches. Special attention is given to the class of distortion risk measures. It is shown that the quantile risk measure as well as the Conditional Tail Expectation (under some additional conditions) belong to this class. It is proved that both risk measures under consideration are additive for a sum of comonotonic random variables, i.e. the quantile and the Conditional Tail Expectation of the comonotonic upper and lower bounds can easily be obtained by summing the corresponding risk measures of the marginals involved. A special subclass of distortion risk measures, referred to as the class of concave distortion risk measures, is also under consideration. It is shown that the quantile risk measure is not a concave distortion risk measure, while the Conditional Tail Expectation (under some additional conditions) is a concave distortion risk measure. A theoretical justification for the fact that the "concave" Conditional Tail Expectation preserves the convex order relation between random variables is given. It is shown that this property does not necessarily hold for the quantile risk measure, as it is not a concave risk measure. Finally, the accuracy and efficiency of the two moment-matching, comonotonic upper bound, comonotonic lower bound and "maximal variance" lower bound approximations are examined for a wide range of parameters by comparing with the results obtained by Monte Carlo simulation. The numerical results justify that, generally, in the current situation the lower bound approach outperforms the other methods. Moreover, the preservation of the convex order relation between the convex bounds for the final wealth by the Conditional Tail Expectation is demonstrated by numerical results. It is justified numerically that this property does not necessarily hold true for the quantile.
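The additivity property mentioned at the end of the abstract is easy to check numerically. The sketch below is a minimal illustration with made-up lognormal parameters (not the thesis's cash-flow model): it builds a comonotonic sum and compares its quantile and CTE with the sums of the corresponding marginal quantities.

```python
import numpy as np
from scipy.stats import lognorm, norm

# Hypothetical lognormal margins of a dependent sum (parameters are made up).
params = [(0.0, 0.4), (0.2, 0.7)]          # (mu, sigma) pairs
p = 0.99
U = np.random.default_rng(4).uniform(size=200_000)

# Comonotonic upper bound: every component is driven by the same uniform U.
S_c = sum(lognorm.ppf(U, s, scale=np.exp(m)) for m, s in params)

# Quantile additivity for comonotonic sums: Q_p(S_c) equals the sum of marginal p-quantiles.
print(np.quantile(S_c, p),
      sum(lognorm.ppf(p, s, scale=np.exp(m)) for m, s in params))

# CTE additivity (continuous case): CTE_p(S_c) equals the sum of marginal CTEs, where the
# lognormal CTE_p is exp(mu + sigma^2/2) * Phi(sigma - z_p) / (1 - p).
z = norm.ppf(p)
print(S_c[S_c > np.quantile(S_c, p)].mean(),
      sum(np.exp(m + s ** 2 / 2) * norm.cdf(s - z) / (1 - p) for m, s in params))
```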
APA, Harvard, Vancouver, ISO, and other styles
3

Drapeau, Samuel. "Risk preferences and their robust representation." Doctoral thesis, Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftliche Fakultät II, 2010. http://dx.doi.org/10.18452/16135.

Full text
Abstract:
The goal of this thesis is the conceptual study of risk and its quantification via robust representations. We concentrate in a first part on context invariant features related to this notion: diversification and monotonicity. We introduce and study the general properties of three key concepts, risk order, risk measure and risk acceptance family and their one-to-one relations. Our main result is a uniquely characterized dual robust representation of lower semicontinuous risk orders on topological vector space. We also provide automatic continuity and robust representation results on specific convex sets. This approach allows multiple interpretation of risk depending on the setting: model risk in the case of random variables, distributional risk in the case of lotteries, discounting risk in the case of consumption streams... Various explicit computations in those different settings are then treated (economic index of riskiness, certainty equivalent, VaR on lotteries, variational preferences...). In the second part, we consider preferences which might require additional information in order to be expressed. We provide a mathematical framework for this idea in terms of preorders, called conditional preference orders, which are locally compatible with the available information. This allows us to construct conditional numerical representations of conditional preferences. We obtain a conditional version of the von Neumann and Morgenstern representation for measurable stochastic kernels and extend then to a conditional version of the variational preferences. We finally clarify the interplay between model risk and distributional risk on the axiomatic level.
APA, Harvard, Vancouver, ISO, and other styles
4

Eksi, Zehra. "Comparative Study Of Risk Measures." Master's thesis, METU, 2005. http://etd.lib.metu.edu.tr/upload/12606501/index.pdf.

Full text
Abstract:
There is little doubt that, for a decade, risk measurement has been one of the most important topics in finance. Indeed, it is natural to observe such a development, since in the last ten years a huge number of financial transactions ended with severe losses due to severe convulsions in financial markets. Value at risk, as the most widely used risk measure, fails to quantify the risk of a position accurately in many situations. For this reason a number of consistent risk measures have been introduced in the literature. The main aim of this study is to present and compare coherent, convex, conditional convex and some other risk measures both in theoretical and practical settings.
APA, Harvard, Vancouver, ISO, and other styles
5

Prastorfer, Andreas. "Simulation-Based Portfolio Optimization with Coherent Distortion Risk Measures." Thesis, KTH, Matematisk statistik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-266382.

Full text
Abstract:
This master's thesis studies portfolio optimization using linear programming algorithms. The contribution of this thesis is an extension of the convex framework for portfolio optimization with Conditional Value-at-Risk, introduced by Rockafeller and Uryasev. The extended framework considers risk measures in this thesis belonging to the intersecting classes of coherent risk measures and distortion risk measures, which are known as coherent distortion risk measures. The considered risk measures belonging to this class are the Conditional Value-at-Risk, the Wang Transform, the Block Maxima and the Dual Block Maxima measures. The extended portfolio optimization framework is applied to a reference portfolio consisting of stocks, options and a bond index. All assets are from the Swedish market. The returns of the assets in the reference portfolio are modelled with elliptical distribution and normal copulas with asymmetric marginal return distributions. The portfolio optimization framework is a simulation-based framework that measures the risk using the simulated scenarios from the assumed portfolio distribution model. To model the return data with asymmetric distributions, the tails of the marginal distributions are fitted with generalized Pareto distributions, and the dependence structure between the assets are captured using a normal copula. The result obtained from the optimizations is compared to different distributional return assumptions of the portfolio and the four risk measures. A Markowitz solution to the problem is computed using the mean average deviation as the risk measure. The solution is the benchmark solution which optimal solutions using the coherent distortion risk measures are compared to. The coherent distortion risk measures have the tractable property of being able to assign user-defined weights to different parts of the loss distribution and hence value increasing loss severities as greater risks. The user-defined loss weighting property and the asymmetric return distribution models are used to find optimal portfolios that account for extreme losses. An important finding of this project is that optimal solutions for asset returns simulated from asymmetric distributions are associated with greater risks, which is a consequence of more accurate modelling of distribution tails. Furthermore, weighting larger losses with increasingly larger weights show that the portfolio risk is greater, and a safer position is taken.
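The convex framework the thesis extends rests on the Rockafellar–Uryasev linearization of CVaR. A minimal scenario-based version of that linear program (long-only weights summing to one, illustrative random scenarios, none of the thesis's copula or generalized Pareto modelling) could look like this.

```python
import numpy as np
from scipy.optimize import linprog

def min_cvar_weights(scenario_returns, beta=0.95):
    """Scenario-based CVaR minimization via the Rockafellar-Uryasev LP.
    Variables: asset weights w (long-only, sum to 1), threshold zeta, excess losses u."""
    S, n = scenario_returns.shape
    c = np.concatenate([np.zeros(n), [1.0], np.ones(S) / ((1 - beta) * S)])
    # u_s >= -r_s . w - zeta   <=>   -r_s . w - zeta - u_s <= 0
    A_ub = np.hstack([-scenario_returns, -np.ones((S, 1)), -np.eye(S)])
    b_ub = np.zeros(S)
    A_eq = np.concatenate([np.ones(n), [0.0], np.zeros(S)]).reshape(1, -1)
    b_eq = np.array([1.0])
    bounds = [(0, None)] * n + [(None, None)] + [(0, None)] * S
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    return res.x[:n], res.fun                     # optimal weights and minimal CVaR

rng = np.random.default_rng(5)
R = rng.normal(0.0005, 0.01, size=(1000, 4))      # hypothetical return scenarios
w, cvar = min_cvar_weights(R, 0.95)
print(w.round(3), cvar)
```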
APA, Harvard, Vancouver, ISO, and other styles
6

Koren, Øystein Sand. "Contrasting broadly adopted model-based portfolio risk measures with current market conditions." Thesis, Norwegian University of Science and Technology, Department of Mathematical Sciences, 2009. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-9824.

Full text
Abstract:

The last two years have seen the most volatile financial markets in decades, with steep losses in asset values and a deteriorating world economy. The insolvency of several banks and their negative impact on the economy has led to criticism of their risk management systems for not being adequate and lacking foresight. This thesis will study the performance of two broadly adopted portfolio risk measures before and during the current financial turbulence to examine their accuracy and reliability. The study will be carried out on a case portfolio consisting of American and European fixed income and equity. The portfolio uses a dynamic asset allocation scheme to maximize the ratio between expected return and portfolio risk. The market risk of the portfolio will be calculated on a daily basis using both Value-at-Risk (VaR) and expected shortfall (ES) in a Monte Carlo framework. These risk measures are then compared with prior measurements and the actual loss over the period. The results from the study indicate that the implemented risk model does not give totally reliable estimates, with more frequent and larger real losses than predicted. Nevertheless, the study sees a significant worsening in the performance of the risk measures during the current financial crisis from June 2007 to December 2008 compared with the previous years. This thesis argues that VaR and ES are useful risk measures, but that users should be well aware of the pitfalls in the underlying models and take appropriate precautions.
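For orientation, a bare-bones version of the two risk measures compared in the thesis, computed from Monte Carlo scenarios together with a crude exception count, might look as follows; all figures are synthetic and unrelated to the thesis's case portfolio.

```python
import numpy as np

def var_es(simulated_pnl, alpha=0.99):
    """One-day VaR and expected shortfall from Monte Carlo P&L scenarios (losses positive)."""
    losses = -np.asarray(simulated_pnl)
    var = np.quantile(losses, alpha)
    es = losses[losses >= var].mean()
    return var, es

rng = np.random.default_rng(6)
scenarios = rng.standard_t(df=4, size=100_000) * 0.01   # heavy-tailed P&L scenarios
var99, es99 = var_es(scenarios, 0.99)

# A crude backtest: count days on which the realized loss exceeded the forecast VaR.
realized = rng.standard_t(df=4, size=500) * 0.012       # hypothetical realized P&L
exceptions = int(np.sum(-realized > var99))
print(var99, es99, exceptions, "exceptions out of", realized.size)
```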

APA, Harvard, Vancouver, ISO, and other styles
7

Hoffmann, Hannes. "Multivariate conditional risk measures: with a view towards systemic risk in financial networks." Doctoral thesis, supervised by Thilo Meyer-Brandis. München: Universitätsbibliothek der Ludwig-Maximilians-Universität, 2017. http://d-nb.info/1137835222/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Loukrati, Hicham. "Tail Empirical Processes: Limit Theorems and Bootstrap Techniques, with Applications to Risk Measures." Thesis, Université d'Ottawa / University of Ottawa, 2018. http://hdl.handle.net/10393/37594.

Full text
Abstract:
In recent years, important changes in the fields of insurance and finance have drawn increasing attention to the need for a standardized framework for risk measurement. Recently, there has been growing interest among insurance experts in the use of the conditional tail expectation (CTE), because it shares properties considered desirable and applicable in a variety of situations. In particular, it meets the requirements of a "coherent" risk measure in the sense of Artzner [2]. This thesis contributes to statistical inference by developing tools, based on the convergence of functional integrals, for the estimation of the CTE that are of considerable interest to actuarial science. First, we develop a tool for estimating the conditional mean E[X|X > x]; we then construct estimators of the CTE, develop the asymptotic theory needed for these estimators, and use this theory to build confidence intervals. For the first time, the nonparametric bootstrap approach is explored in this thesis by developing new results applicable to Value-at-Risk (VaR) and to the CTE. Simulation studies illustrate the performance of the bootstrap technique.
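A plain percentile-bootstrap interval for the CTE, far cruder than the tail-empirical-process theory developed in the thesis, can serve as a reference point; the Pareto loss sample and all tuning choices below are assumptions of the sketch.

```python
import numpy as np

def bootstrap_cte_ci(losses, p=0.95, n_boot=2000, level=0.90, seed=0):
    """Nonparametric bootstrap percentile interval for the CTE (mean loss above the p-quantile)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(losses)

    def cte(sample):
        q = np.quantile(sample, p)
        return sample[sample >= q].mean()

    stats = np.array([cte(rng.choice(x, size=x.size, replace=True)) for _ in range(n_boot)])
    lo, hi = np.quantile(stats, [(1 - level) / 2, (1 + level) / 2])
    return cte(x), (lo, hi)

rng = np.random.default_rng(9)
sample = rng.pareto(3.0, 2000) + 1.0        # heavy-tailed hypothetical losses
print(bootstrap_cte_ci(sample))
```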
APA, Harvard, Vancouver, ISO, and other styles
9

Chan, Meanne. "Implicit measures of early-life family conditions : relationships to psychosocial characteristics and cardiovascular disease risk in adulthood." Thesis, University of British Columbia, 2011. http://hdl.handle.net/2429/36745.

Full text
Abstract:
An implicit measure of early-life family conditions was created to help address potential biases in responses to self-reported questionnaires of early-life family environments. We investigated whether a computerized affect attribution paradigm designed to capture implicit, affective responses (anger, fear, warmth) regarding early-life family environments was a) stable over time, b) associated with self-reports of childhood family environments, c) able to predict adult psychosocial profiles (perceived social support, heightened vigilance), and d) able to predict adult cardiovascular risk (blood pressure) either alone or in conjunction with a measure of early-life socioeconomic status. Two studies were conducted to examine reliability and validity of the affect attribution paradigm (Study 1, N = 94) and associated adult psychosocial outcomes and cardiovascular risk (Study 2, N = 122). Responses on the affect attribution paradigm showed significant correlations over a 6-month period, and were moderately associated with self-reports of childhood family environments. Greater attributed negative affect about early-life family conditions predicted lower levels of current perceived social support and heightened vigilance in adulthood. Attributed negative affect also interacted with early-life socioeconomic status to marginally predict resting systolic blood pressure, such that those individuals high in early-life SES but who had implicit negative affect attributed to early-life family conditions had SBP levels that were as high as individuals low in early-life SES. Implicit measures of early-life family conditions are a useful approach for assessing the psychosocial nature of early-life environments and linking them to adult psychosocial and physiological health profiles.
APA, Harvard, Vancouver, ISO, and other styles
10

Maggis, M. "On Quasiconvex Conditional Maps. Duality Results and Applications to Finance." Doctoral thesis, Università degli Studi di Milano, 2010. http://hdl.handle.net/2434/150201.

Full text
Abstract:
Motivated by many financial insights, we provide dual representation theorems for quasiconvex conditional maps defined on vector spaces or modules and taking values in sets of random variables. These results match the standard dual representation for quasiconvex real-valued maps provided by Penot and Volle. As a financial byproduct, we apply this theory to the case of dynamic certainty equivalents and conditional risk measures.
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Conditional risk measure"

1

Centre for the Study of Adolescence (Nairobi, Kenya) and Population Action International, eds. A measure of commitment: Women's sexual and reproductive risk index for sub Saharan Africa. Nairobi, Kenya: Centre for the Study of Adolescence, 2009.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Fuqiang, Nie, ed. Zhongguo guo jia jing ji an quan yu jing xi tong yan jiu. Beijing Shi: Zhongguo tong ji chu ban she, 2005.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Manning, Roberta Thompson. The rise and fall of "the extraordinary measures," January-June, 1928: Toward a reexamination of the onset of the Stalin Revolution. Pittsburgh, PA: Center for Russian & East European Studies. University Center for International Studies, University of Pittsburgh, 2001.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

United States. General Accounting Office, ed. Blood plasma safety: Plasma product risks are low if good manufacturing practices are followed : report to the chairman, Subcommittee on Human Resources, Committee on Government Reform and Oversight, House of Representatives. Washington, D.C: The Office, 1998.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Daojiong, Zha, ed. Zhongguo xue zhe kan shi jie: World politics, views from China. Beijing: Xin shi jie chu ban she, 2007.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Galinovskaya, Elena, Elena Boltanova, Gennadiy Volkov, Galina Vyphanova, I. Ignat'eva, N. Kichigin, E. Kovaleva, et al. Zones with special conditions of use of territories (problems of the establishment and implementation of the legal regime). ru: INFRA-M Academic Publishing LLC., 2020. http://dx.doi.org/10.12737/1080400.

Full text
Abstract:
The peculiarities of modern spatial development have necessitated organizational, managerial and legal measures to reduce the risks posed by neighbouring objects that have a negative impact on humans and the environment, as well as to strengthen the protection of especially dangerous or sensitive objects. The introduction into the Land Code of the Russian Federation of the concept of "zones with special conditions of use of territories" is one of the promising solutions to the above tasks and is aimed at ensuring the sanitary and epidemiological welfare of the population, industrial safety, safety in operating all types of transport, defence and state security, environmental protection, etc. The handbook describes the concept and legal nature of zones with special conditions of use of territories as a new category, which should become a full part of the mechanism of land law regulation. It describes the evolution of national legislation on conservation and protection zones and analyses the regulation of similar zones in foreign legislation. Special attention is paid to general issues of the legal regime of these zones and the specifics of their establishment and accounting. The legal requirements for compliance with all types of zones with special conditions of use are also examined. For practitioners and specialists in the field of state and municipal administration, scientific workers and lecturers of higher and secondary professional educational institutions, students, graduates, and also for a wide range of readers.
APA, Harvard, Vancouver, ISO, and other styles
7

Lobanov, Aleksey. Biomedical foundations of security. ru: INFRA-M Academic Publishing LLC., 2019. http://dx.doi.org/10.12737/1007643.

Full text
Abstract:
The textbook discusses the threats and risks to life and health of people in post-industrial society. The role and place of medical and biological technologies in the system of ensuring the safety of the population of the Russian Federation are shown from the standpoint of an interdisciplinary approach. Briefly, but quite informative, the structure of the human body and the principles of its functioning are described. The specificity and mechanism of toxic effects on humans of harmful substances, energy effects and combined action of the main damaging factors of sources of emergency situations of peace and war are shown. The medical and biological aspects of ensuring the safety of human life in adverse environmental conditions, including in regions with hot and cold climates (Arctic) are considered. Means and methods of first aid to victims are shown. The questions of organization and carrying out of measures of medical support of the population in zones of emergency situations and the centers of defeat are covered. Designed for students, students and cadets of educational institutions of higher education, studying under the bachelor's program. It can also be useful for teachers, researchers and a wide range of professionals engaged in practical work on the planning and organization of biomedical protection of the population.
APA, Harvard, Vancouver, ISO, and other styles
8

The Measure of America 2010-2011: Mapping Risks and Resilience. New York University Press, 2010.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Burd-Sharps, Sarah, Kristen Lewis, and Jeffrey Sachs. Measure of America, 2010-2011: Mapping Risks and Resilience. New York University Press, 2010.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
10

Cardarelli, John. Ionizing and Non-ionizing Radiation. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780190662677.003.0015.

Full text
Abstract:
This chapter describes ionizing and non-ionizing radiation, ways to measure them in the environment, the potential health effects of chronic and acute exposures, and concerns related to pregnancy. Background radiation from both sources is described in relation to occupational or public exposure limits and how these limits were derived. Among the subjects described are acute radiation syndrome, exposure assessment, radon, and the assessment of radiation risk. Radiation protection and control measures are described, along with how their application may change based on routine versus emergency response conditions and the scale of the incident.
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Conditional risk measure"

1

Boduroğlu, İ. İlkay. "Portfolio Optimization via a Surrogate Risk Measure: Conditional Desirability Value at Risk (CDVaR)." In Lecture Notes in Computer Science, 257–70. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-030-05348-2_23.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Boduroğlu, İ. İlkay, and Bartu Köksal. "Mean-Reverting Portfolio Optimization via a Surrogate Risk Measure - Conditional Desirability Value at Risk." In Advances in Systems Engineering, 151–64. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-92604-5_14.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Coletti, Giulianella, Davide Petturiti, and Barbara Vantaggi. "Conditional Submodular Coherent Risk Measures." In Communications in Computer and Information Science, 239–50. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-91476-3_20.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Ferreira, Óscar. "Modelling Risk Reduction Measures to Minimise Future Impacts of Storms at Coastal Areas." In Springer Climate, 59–66. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-86211-4_8.

Full text
Abstract:
AbstractCoastal storms often cause damages and losses in occupied areas. Under climate change conditions (i.e. sea-level rise and increased frequency of extreme sea levels) and increasing human occupation, the consequences of coastal storms will be amplified if no adaptation actions are implemented. The selection of the best possible coastal management measures to reduce risks at coastal areas, considering costs, effectiveness and acceptance, will be mandatory in the future. This work presents a generic approach to model disaster risk reduction measures at coastal areas, including climate change effects. The proposed methodology is adaptable to any coastal region and can be used to test (and improve) management options at a broad number of coastal areas. It can also be used to define a timeframe for the implementation of the defined measures since not all risk reduction measures, under a climate change scenario, need to be implemented at the same time. This would help to optimise implementation costs while reducing the risk to the occupation and people.
APA, Harvard, Vancouver, ISO, and other styles
5

Panek, Tomasz, and Jan Zwierzchowski. "Fuzzy and Multi-Dimensional Measures of the Degree of Social Exclusion Risk." In Analysis of Socio-Economic Conditions, 180–99. Abingdon, Oxon; New York, NY: Routledge, 2021. http://dx.doi.org/10.4324/9781003053712-12.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Crisafulli, Ernesto, Stefania Costi, and Enrico M. Clini. "Anthropometry in Special and Selective Conditions and Circumstances: Anthropometry as Measure of Risk in COPD Patients." In Handbook of Anthropometry, 2357–71. New York, NY: Springer New York, 2012. http://dx.doi.org/10.1007/978-1-4419-1788-1_145.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Geraskin, Mikhail, and Elena Rostova. "Impact of Preventive Measures on Conditions of Risk Insurance in Cyber-Physical System of Industrial Enterprise." In Cyber-Physical Systems: Modelling and Industrial Application, 235–42. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-95120-7_20.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Condemine, Cyril, Loic Grau, Yves Masson, and Sebastien Aubry. "Live Digital Twin for Hydraulic Structures Fatigue Estimation." In Lecture Notes in Civil Engineering, 494–505. Singapore: Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-19-6138-0_43.

Full text
Abstract:
AbstractMaintaining hydraulic structures such as dams, penstocks, or water lock gates in operating conditions and optimizing their maintenance costs are key issues for energy production or river navigation. The ultimate objective is to know the real state of fatigue and damage of the structure and identify any related anomalies. In this paper, we introduce a digital twin, for fatigue evaluation merging measured data obtained with an embedded sensor network and a 3D numerical model that converts in real time measured data into fatigue. After 3 years of R&D collaboration between CNR and Morphosense in the maintenance of navigation lock gates or dam gates, this presentation exposes how the proposed Live Digital Twin solution contributes to fatigue evaluation and more generally to global structural monitoring in dealing with fundamental issues of hydraulic structures: risk assessment, maintenance in operating conditions and maintenance costs optimization. After a context and state of the art introduction, the second part will detail the system overview. In the third part, the monitoring system will be addressed.
APA, Harvard, Vancouver, ISO, and other styles
9

Ife-Adediran, Oluwatobi Ololade, and Oluyemi Bright Aboyewa. "Climate Change Resistant Energy Sources for Global Adaptation." In African Handbook of Climate Change Adaptation, 1955–66. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-45106-6_106.

Full text
Abstract:
AbstractA holistic response and adaptation to climatic vicissitudes and extreme conditions as well as their associated risks to human and ecological sustainability must adequately cater for energy needs and optimization. An interventional approach should, among other measures, seek to improve the resilience of existing and prospective energy systems to climate change. The structured and policy-driven nature of adaptation measures require a bottom-up proactive approach that envisages the performance and efficiency of these systems, especially in terms of their sensitivity and vulnerability to changing climate conditions. Therefore, this chapter seeks to scrutinize various sources of energy concerning their resistance capabilities to climate change in the face of increasing global energy demands and consumption. Renewable and conventional energy sources are co-examined and compared vis-à-vis the current trends and predictions on climatic factors that are bearing on their principles of production, supply, and distribution. Findings from this chapter will serve as assessment tools for decision makers and corroborate other ongoing discourse on climate actions towards socioeconomic development and a sustainable environment.
APA, Harvard, Vancouver, ISO, and other styles
10

Ren, Hongmei, Jianping Zhu, Yanyan Lv, and Weiwei Qin. "Aseismic Design of an Out-of-Code High-Rise Building in Shanghai." In Advances in Frontier Research on Engineering Structures, 21–31. Singapore: Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-19-8657-4_3.

Full text
Abstract:
AbstractProper structural system and performance-based seismic design are the key issues in designing high-rise building structures. This project has unique building facade shape and complex plane function layout, and the structural plane and vertical layout are irregular. The superstructure adopts assembled integral concrete frame-shear wall structure, which is judged as out-of-code high-rise building by seismic review. Firstly, the site conditions, foundation design and structural form selection are introduced. Then, YJK software is used to calculate and analyze the seismic force of the superstructure, and the seismic performance indexes of the structure can meet the requirements of the code. Finally, the regularity of each structural unit of the superstructure is judged, and the corresponding main seismic strengthening measures are put forward.
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Conditional risk measure"

1

Ma, Xiaoxian, Jilin Qu, and Jianquan Sun. "A Risk Measure with Conditional Expectation and Portfolio Optimization with Fuzzy Uncertainty." In 2009 International Conference on Business Intelligence and Financial Engineering (BIFE). IEEE, 2009. http://dx.doi.org/10.1109/bife.2009.32.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Xiao-mei, Zhu, Zhang Qun-yan, and Ren Xin. "The research of software reliability measure based on conditional value at risk." In Mechanical Engineering and Information Technology (EMEIT). IEEE, 2011. http://dx.doi.org/10.1109/emeit.2011.6023741.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Tsyarmasto, Peter, and Stan Uryasev. "Advanced risk measures in estimation and classification." In International Workshop of "Stochastic Programming for Implementation and Advanced Applications". The Association of Lithuanian Serials, 2012. http://dx.doi.org/10.5200/stoprog.2012.20.

Full text
Abstract:
This paper considers several well-known Support Vector Machine (SVM) algorithms for classification. We suggested a general risk management framework for describing considered SVMs. We introduced a loss function and expressed each SVM with a risk measure, such as Expected Value, Conditional Value-at-Risk, Supremum. We tested SVM algorithms on five classification data sets. The computational experiments were performed with Portfolio Safeguard (PSG). Risk functions, used in our framework, were precoded in PSG, which allowed for optimization of SVMs with several lines of code.
APA, Harvard, Vancouver, ISO, and other styles
4

Nakagawa, Kei, Shuhei Noma, and Masaya Abe. "RM-CVaR: Regularized Multiple β-CVaR Portfolio." In Twenty-Ninth International Joint Conference on Artificial Intelligence and Seventeenth Pacific Rim International Conference on Artificial Intelligence {IJCAI-PRICAI-20}. California: International Joint Conferences on Artificial Intelligence Organization, 2020. http://dx.doi.org/10.24963/ijcai.2020/629.

Full text
Abstract:
The problem of finding the optimal portfolio for investors is called the portfolio optimization problem. Such a problem mainly concerns the expectation and variability of return (i.e., mean and variance). Although the variance would be the most fundamental risk measure to be minimized, it has several drawbacks. Conditional Value-at-Risk (CVaR) is a relatively new risk measure that addresses some of the shortcomings of well-known variance-related risk measures, and because of its computational efficiencies, it has gained popularity. CVaR is defined as the expected value of the loss that occurs beyond a certain probability level (β). However, portfolio optimization problems that use CVaR as a risk measure are formulated with a single β and may output significantly different portfolios depending on how the β is selected. We confirm that even small changes in β can result in huge changes in the whole portfolio structure. To address this problem, we propose RM-CVaR: Regularized Multiple β-CVaR Portfolio. We perform experiments on well-known benchmarks to evaluate the proposed portfolio. Compared with various portfolios, RM-CVaR demonstrates superior performance, with both higher risk-adjusted returns and a lower maximum drawdown.
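The β-sensitivity the authors highlight is easy to reproduce on synthetic data; the short check below only illustrates how the single-β empirical CVaR moves with β and does not implement the RM-CVaR portfolio itself.

```python
import numpy as np

def cvar(losses, beta):
    """Empirical Conditional Value-at-Risk: mean loss beyond the beta-quantile."""
    q = np.quantile(losses, beta)
    return losses[losses >= q].mean()

rng = np.random.default_rng(7)
losses = rng.standard_t(df=3, size=50_000)      # heavy-tailed hypothetical losses
for beta in (0.90, 0.95, 0.99):
    print(beta, round(cvar(losses, beta), 3))   # the measured risk moves a lot with beta
```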
APA, Harvard, Vancouver, ISO, and other styles
5

Akduğan, Umut, and Yasemin Koldere Akın. "Volatility Modelling in Parametric Value at Risk Calculation: An Application on Pension Funds in Turkey." In International Conference on Eurasian Economies. Eurasian Economists Association, 2013. http://dx.doi.org/10.36880/c04.00713.

Full text
Abstract:
Risk management has gained strategic importance with the deepening and globalization of financial markets. Financial markets face both systematic and non-systematic risk factors. Risk management is of great importance for banks, as well as for other financial institutions and investors. In this context, "Value at Risk (VaR)" is recommended as one of the methods to measure market risk by international organizations and by the Turkish Banking Regulation and Supervision Agency (BRSA). The parametric method (variance-covariance method), one of the methods used in VaR calculation, is applied to pension funds, which have reached large numbers and were established by Turkish pension companies. Moreover, this method is applied in two different ways, one based on the assumption of constant variance and the other on conditional heteroscedasticity, and the results are compared. The accuracy of the calculations is tested by backtesting. For this purpose, the daily returns of the growth equity funds between 03/01/2011 and 04/30/2013 are used for the four pension companies with the largest fund size operating in Turkey.
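To make the two variants of the parametric method concrete, here is a minimal sketch, assuming simulated returns and illustrative GARCH(1,1) parameters rather than the funds and estimated models used in the paper.

```python
import numpy as np
from scipy.stats import norm

def var_constant(returns, alpha=0.99):
    """Parametric (variance-covariance) VaR with a constant volatility."""
    sigma = returns.std(ddof=1)
    return -(returns.mean() + norm.ppf(1 - alpha) * sigma)

def var_garch11(returns, alpha=0.99, omega=1e-6, a=0.08, b=0.90):
    """One-day-ahead parametric VaR with a GARCH(1,1) conditional variance.
    The parameters (omega, a, b) are illustrative, not estimated."""
    sigma2 = returns.var(ddof=1)                 # initialize at the sample variance
    for r in returns:
        sigma2 = omega + a * r**2 + b * sigma2   # conditional variance recursion
    return -norm.ppf(1 - alpha) * np.sqrt(sigma2)

# Hypothetical daily fund returns standing in for a pension fund series.
rng = np.random.default_rng(2)
r = rng.normal(0.0003, 0.01, 600)

print("99% VaR, constant variance  :", var_constant(r))
print("99% VaR, conditional (GARCH):", var_garch11(r))
```

A backtest in the spirit of the paper would then count how often realized losses exceed each VaR figure.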
APA, Harvard, Vancouver, ISO, and other styles
6

Yi, Wen-de, and Ai-hua Huang. "Measures of Conditional Tail-Dependence Risk with Copulas." In 2008 International Symposium on Information Science and Engineering (ISISE). IEEE, 2008. http://dx.doi.org/10.1109/isise.2008.291.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

La¨uferts, Ulrike, Charlotte Halbe, and Aliki van Heek. "Value-Creating Investment Strategies to Manage Risk From Structural Market Uncertainties: Switching and Compound Options in (V)HTR Technologies." In Fourth International Topical Meeting on High Temperature Reactor Technology. ASMEDC, 2008. http://dx.doi.org/10.1115/htr2008-58157.

Full text
Abstract:
Measuring the value of a technology investment under uncertainty with standard techniques like net present value (NPV) or return on investment (ROI) often reveals the difficulty of presenting a convincing business case: projected cash flows are insufficient, or the discount rate chosen to compensate for the risk is so high that it conflicts with the investor's requirements. Decision making and feasibility studies have to look beyond traditional analysis to reveal the strategic value of a technology investment. Here, Real Option Analysis (ROA) offers a powerful alternative to standard discounted cash-flow (DCF) methodology by risk-adjusting the cash flow along the decision path rather than risk-adjusting the discount rate. Within the GEN IV initiative, attention is brought not only to better sustainability, but also to broader industrial application and improved financing. The HTR design in particular is full of strategic optionalities: the high temperature output facilitates penetration into non-electricity energy markets such as industrial process heat applications and the hydrogen market. The flexibility to switch output in markets with multi-source uncertainties reduces downside risk and creates an additional value of over 50% relative to the net present value without flexibility. The supplementary value of deploying a modular (V)HTR design adds over 100% to the project value using real option evaluation tools. The focus of this paper is to quantify the strategic value that comes a) with the modular design, a design that offers the managerial flexibility to adapt a step-by-step investment strategy to actual market demand, and b) with the option to switch between two modes of operation, namely electricity and hydrogen production. We demonstrate that the effect of uncertain electricity prices can be dampened with a modular HTR design. Using a real option approach, we view the project as a series of compound options, each option depending on the exercise of those that preceded it. At the end of each design phase, viability is reviewed conditional on the operating spread at that time step. We quantify the value of being able to wait with the investment in the next block until market conditions are favourable, and of being able to abandon one block if market conditions are unfavourable. To derive the intrinsic value of this multi-block HTR design, it is compared with a reference investment in a full-commitment light water reactor without any managerial flexibility. In another case, we raise the question of to what extent product output diversification is a suitable strategy to cope with long-term market uncertainty in electricity prices. What is the value of a multi-potent technology that is able to produce output for energy markets other than the electricity market? To investigate this, we concentrate on the Netherlands, a country with intense industrial demand for electricity and hydrogen.
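A much-reduced sketch of the switching-option logic described above, with entirely hypothetical payoffs, probabilities, and discount rate (a full analysis would use risk-neutral valuation and the compound-option structure the authors describe):

```python
# Minimal one-period binomial sketch of an output-switching option.
# All cash flows, probabilities, and rates are hypothetical illustrations,
# not the plant economics analysed in the paper.

def switch_option_value(v_elec_up, v_elec_down, v_h2, p_up, rf):
    """Value of a plant that can switch to hydrogen output in either state."""
    rigid = (p_up * v_elec_up + (1 - p_up) * v_elec_down) / (1 + rf)
    flexible = (p_up * max(v_elec_up, v_h2)
                + (1 - p_up) * max(v_elec_down, v_h2)) / (1 + rf)
    return rigid, flexible, flexible - rigid

rigid, flexible, option = switch_option_value(
    v_elec_up=150.0,   # payoff if electricity prices rise
    v_elec_down=60.0,  # payoff if electricity prices fall
    v_h2=100.0,        # payoff from switching to hydrogen production
    p_up=0.5, rf=0.05)

print(f"value without flexibility: {rigid:.1f}")
print(f"value with switching     : {flexible:.1f}")
print(f"option value of switching: {option:.1f}")
```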
APA, Harvard, Vancouver, ISO, and other styles
8

Górny, Adam. "Occupational Risk In Improving The Quality Of Working Conditions." In Applied Human Factors and Ergonomics Conference. AHFE International, 2020. http://dx.doi.org/10.54941/ahfe100327.

Full text
Abstract:
Adherence to the systemic approach to improving working conditions is increasingly becoming a central prerequisite for the successful operation of business organizations. By adopting systemic principles to improve the quality of working conditions, organizations gain access to effective tools for eliminating hazards and strenuousness and consequently acquire the ability to grow and improve themselves. Any measures adopted within that framework are undertaken in recognition of the roles and tasks of employees seen as the internal clients of specific processes. The article demonstrates that improvements can be achieved by assessing risks. In this context, risk assessment is viewed as a tool for gathering information on irregularities. By assessing risks, businesses can identify any hazardous, deleterious and strenuous factors which require improvement (through corrective and preventive measures) and whose scope and characteristics depend on the level of occupational risk. The use of occupational risk as a criterion for selecting improvement measures helps identify adequate technical means and organizational arrangements to be applied to bring the working environment to the required quality standard. In particular cases, such means and arrangements should be complemented by using personal protection items. An essential consideration in improving working conditions is to incorporate any selected elements of the systemic approach that are critical for shaping the working environment. Only then will the proper improvement measures be effective.
APA, Harvard, Vancouver, ISO, and other styles
9

Shang, Zhaoxia, Hong Liu, Xiaoxian Ma, and Yanmin Liu. "Notice of Retraction: Fuzzy Value-at-Risk and Fuzzy Conditional Value-at-Risk: Two risk measures under fuzzy uncertainty." In 2010 IEEE 2nd Symposium on Web Society (SWS 2010). IEEE, 2010. http://dx.doi.org/10.1109/sws.2010.5607440.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Yi, Wen-de, and Ai-hua Huang. "Study on Measures of Dependence Conditional Risk Based-on Copulas in Financial Portfolio." In 2008 International Seminar on Future Information Technology and Management Engineering. IEEE, 2008. http://dx.doi.org/10.1109/fitme.2008.131.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Conditional risk measure"

1

Ayoul-Guilmard, Q., S. Ganesh, F. Nobile, R. Rossi, and C. Soriano. D6.3 Report on stochastic optimisation for simple problems. Scipedia, 2021. http://dx.doi.org/10.23967/exaqute.2021.2.001.

Full text
Abstract:
This report addresses the general matter of optimisation under uncertainties, following a previous report on stochastic sensitivities (deliverable 6.2). It describes several theoretical methods, as well as their application in implementable algorithms. The specific case of the conditional value at risk, chosen as the risk measure, is prominently discussed together with its challenges. In particular, the issue of smoothness – or lack thereof – is addressed through several possible approaches. The whole report is written in the context of high-performance computing, with concern for parallelisation and cost-efficiency.
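One common way to handle the non-smoothness mentioned here, which may or may not coincide with the approaches in the deliverable, is to smooth the plus-function in the Rockafellar–Uryasev CVaR estimator; a minimal sketch with synthetic losses:

```python
import numpy as np

def cvar_ru(t, losses, beta):
    """Rockafellar-Uryasev estimator: t + mean(max(L - t, 0)) / (1 - beta).
    Minimizing over t yields CVaR_beta; the max(., 0) kink is the
    non-smoothness discussed in the report."""
    return t + np.mean(np.maximum(losses - t, 0.0)) / (1.0 - beta)

def cvar_ru_smooth(t, losses, beta, eps=1e-2):
    """Same estimator with the kink replaced by a softplus of width eps."""
    softplus = eps * np.logaddexp(0.0, (losses - t) / eps)
    return t + np.mean(softplus) / (1.0 - beta)

rng = np.random.default_rng(3)
L = rng.lognormal(mean=0.0, sigma=0.5, size=10_000)   # toy loss sample
ts = np.linspace(0.5, 4.0, 200)                       # candidate thresholds

beta = 0.95
exact = min(cvar_ru(t, L, beta) for t in ts)
smooth = min(cvar_ru_smooth(t, L, beta) for t in ts)
print(f"CVaR_0.95 (exact estimator)   : {exact:.4f}")
print(f"CVaR_0.95 (smoothed, eps=0.01): {smooth:.4f}")
```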
APA, Harvard, Vancouver, ISO, and other styles
2

Nobile, F., Q. Ayoul-Guilmard, S. Ganesh, M. Nuñez, A. Kodakkal, C. Soriano, and R. Rossi. D6.5 Report on stochastic optimisation for wind engineering. Scipedia, 2022. http://dx.doi.org/10.23967/exaqute.2022.3.04.

Full text
Abstract:
This report presents the latest methods of optimisation under uncertainties investigated in the ExaQUte project, and their applications to problems related to civil and wind engineering. The measure of risk throughout the report is the conditional value at risk. First, the reference method is presented: the derivation of sensitivities of the risk measure; their accurate computation; and lastly, a practical optimisation algorithm with adaptive statistical estimation. Second, this method is directly applied to a nonlinear relaxation oscillator (FitzHugh–Nagumo model) with numerical experiments to demonstrate its performance. Third, the optimisation method is adapted to the shape optimisation of an airfoil and illustrated by a large-scale experiment on a computing cluster. Finally, the benchmark of the shape optimisation of a tall building under a turbulent flow is presented, followed by an adaptation of the optimisation method. All numerical experiments showcase the open-source software stack of the ExaQUte project for large-scale computing in a distributed environment.
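As a toy illustration of sensitivity-based CVaR optimisation (not the ExaQUte algorithm, and without its adaptive statistical estimation), the following sketch runs stochastic subgradient descent on the Rockafellar–Uryasev objective for a one-dimensional design variable:

```python
import numpy as np

# Toy CVaR minimisation: design variable x, random scenario xi ~ N(0, 1),
# loss L(x, xi) = (x - xi)**2, risk measure CVaR_beta.  We apply stochastic
# (sub)gradient descent to the Rockafellar-Uryasev objective
#   F(x, t) = t + E[max(L - t, 0)] / (1 - beta),
# whose minimum over t equals CVaR_beta(L).  The step sizes, sample sizes,
# and problem are illustrative only.

rng = np.random.default_rng(4)
beta, lr, batch = 0.9, 0.01, 256
x, t = 2.0, 1.0                                  # initial guesses

for _ in range(2000):
    xi = rng.standard_normal(batch)
    loss = (x - xi) ** 2
    tail = loss > t                              # scenarios in the CVaR tail
    grad_t = 1.0 - tail.mean() / (1.0 - beta)    # dF/dt (subgradient)
    grad_x = (2.0 * (x - xi) * tail).mean() / (1.0 - beta)   # dF/dx
    t -= lr * grad_t
    x -= lr * grad_x

print(f"optimised design x ~ {x:.3f} (by symmetry the optimum is x = 0)")
print(f"VaR-like threshold t ~ {t:.3f}")
```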
APA, Harvard, Vancouver, ISO, and other styles
3

Shang, Dajing, Yang Yan, and Oliver Linton. Efficient estimation of conditional risk measures in a semiparametric GARCH model. Institute for Fiscal Studies, September 2012. http://dx.doi.org/10.1920/wp.cem.2012.2512.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Alt, Jonathan, Willie Brown, George Gallarno, John Richards, and Titus Rice. Risk-based prioritization of operational condition assessments : Jennings Randolph case study. Engineer Research and Development Center (U.S.), April 2022. http://dx.doi.org/10.21079/11681/43862.

Full text
Abstract:
The US Army Corps of Engineers (USACE) operates, maintains, and manages over $232 billion worth of the Nation’s water resource infrastructure. Using Operational Condition Assessments (OCA), the USACE allocates limited resources to assess asset condition in efforts to minimize risks associated with asset performance degradation, but decision makers require a greater understanding of those risks. The analysis of risk associated with Flood Risk Management assets in the context of its associated watershed system includes understanding the consequences of the asset’s failure and a determination of the likelihood that the asset will perform as expected given the current OCA ratings of critical components. This research demonstrates an application of a scalable methodology to model the probability of a dam performing as expected given the state of its subordinate gates and their components. The research team combines this likelihood with consequences generated by the application of designed simulation experiments with hydrological models to develop a measure of risk. The resulting risk scores serve as an input for an optimization program that outputs the optimal set of components to conduct OCAs on to minimize risk in the watershed. Proof-of-concept results for an initial case study on the Jennings Randolph Dam are provided.
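A simplified sketch of the kind of risk score described in this abstract, with hypothetical OCA ratings, performance probabilities, consequence value, and a series-system independence assumption that need not match the ERDC model:

```python
# Hypothetical mapping from OCA condition ratings to the probability that a
# component performs as expected; the numbers are illustrative only.
P_PERFORM = {"A": 0.99, "B": 0.95, "C": 0.85, "D": 0.70, "F": 0.40}

def gate_performs(component_ratings):
    """Probability a gate performs, assuming its components act in series
    and fail independently (a simplifying assumption, not the ERDC model)."""
    p = 1.0
    for rating in component_ratings:
        p *= P_PERFORM[rating]
    return p

def dam_risk(gates, consequence):
    """Risk = P(dam does not perform as expected) x consequence, where the
    dam is assumed to perform only if every gate performs."""
    p_dam = 1.0
    for ratings in gates.values():
        p_dam *= gate_performs(ratings)
    return (1.0 - p_dam) * consequence

gates = {                      # hypothetical gates and component ratings
    "gate_1": ["A", "B", "C"],
    "gate_2": ["B", "B", "D"],
    "gate_3": ["A", "A", "B"],
}
consequence = 250e6            # hypothetical consequence of degraded performance ($)

print(f"risk score: ${dam_risk(gates, consequence):,.0f}")
```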
APA, Harvard, Vancouver, ISO, and other styles
5

Hassan, Tarek A., Jesse Schreger, Markus Schwedeler, and Ahmed Tahoun. Country Risk. Institute for New Economic Thinking Working Paper Series, March 2021. http://dx.doi.org/10.36687/inetwp157.

Full text
Abstract:
We construct new measures of country risk and sentiment as perceived by global investors and executives using textual analysis of the quarterly earnings calls of publicly listed firms around the world. Our quarterly measures cover 45 countries from 2002-2020. We use our measures to provide a novel characterization of country risk and to provide a harmonized definition of crises. We demonstrate that elevated perceptions of a country's riskiness are associated with significant falls in local asset prices and capital outflows, even after global financial conditions are controlled for. Increases in country risk are associated with reductions in firm-level investment and employment. We also show direct evidence of a novel type of contagion, where foreign risk is transmitted across borders through firm-level exposures. Exposed firms suffer falling market valuations and significantly retrench their hiring and investment in response to crises abroad. Finally, we provide direct evidence that heterogeneous currency loadings on global risk help explain the cross-country pattern of interest rates and currency risk premia.
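A toy, hedged rendition of a transcript-based country-risk score loosely inspired by this description (the word list, window, and scoring rule are arbitrary and are not the authors' construction):

```python
import re

RISK_WORDS = {"risk", "uncertainty", "crisis", "threat", "instability"}

def country_risk_score(transcript, country, window=10):
    """Fraction of mentions of `country` with a risk word within `window`
    tokens -- a toy text measure, not the authors' actual methodology."""
    tokens = re.findall(r"[a-z']+", transcript.lower())
    mentions = [i for i, tok in enumerate(tokens) if tok == country.lower()]
    if not mentions:
        return 0.0
    risky = 0
    for i in mentions:
        nearby = tokens[max(0, i - window): i + window + 1]
        if any(tok in RISK_WORDS for tok in nearby):
            risky += 1
    return risky / len(mentions)

call = ("Our exposure to Brazil grew this quarter, although political "
        "uncertainty in Brazil remains a key risk for our supply chain.")
print(country_risk_score(call, "Brazil"))   # 1.0 in this toy example
```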
APA, Harvard, Vancouver, ISO, and other styles
6

Alt, Jonathan, Willie Brown, George Gallarno, John Richards, Jennifer Olszewski, and Titus Rice. Risk-based prioritization of operational condition assessments : methodology and case study results. Engineer Research and Development Center (U.S.), November 2022. http://dx.doi.org/10.21079/11681/46123.

Full text
Abstract:
USACE operates, maintains, and manages more than $232 billion of the Nation’s water resource infrastructure. USACE uses the Operational Condition Assessment (OCA) to allocate limited resources to assess condition of this infrastructure in efforts to minimize risks associated with performance degradation. The analysis of risk associated with flood risk management (FRM) assets includes consideration of how each asset contributes to its associated FRM watershed system, understanding the consequences of the asset’s performance degradation, and a determination of the likelihood that the asset will perform as expected given the current OCA condition ratings of critical components. This research demonstrates a proof-of-concept application of a scalable methodology to model the probability of a dam performing as expected given the state of its gates and their components. The team combines this likelihood of degradation with consequences generated by the application of designed simulation experiments with hydrological models to develop a risk measure. The resulting risk scores serve as an input for a mixed-integer optimization program that outputs the optimal set of components to conduct OCAs on to minimize risk in the watershed. This report documents the results of the application of this methodology to two case studies.
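Complementing the risk-score sketch above, here is a brute-force stand-in for the selection step: choosing which components to assess under a budget so as to maximize expected risk reduction. The abstract's actual formulation is a mixed-integer program, and all costs and reductions below are hypothetical.

```python
from itertools import combinations

# Hypothetical components: (name, assessment cost, expected risk reduction
# if an OCA is performed).  The values are illustrative only.
components = [
    ("gate_1_hoist",   20, 3.0e6),
    ("gate_2_seals",   15, 1.8e6),
    ("gate_3_motor",   25, 2.5e6),
    ("spillway_valve", 10, 0.9e6),
    ("control_system", 30, 3.5e6),
]
budget = 50

best_set, best_reduction = (), 0.0
for k in range(len(components) + 1):
    for subset in combinations(components, k):
        cost = sum(c for _, c, _ in subset)
        reduction = sum(r for _, _, r in subset)
        if cost <= budget and reduction > best_reduction:
            best_set, best_reduction = subset, reduction

print("assess:", [name for name, _, _ in best_set])
print(f"expected risk reduction: ${best_reduction:,.0f}")
```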
APA, Harvard, Vancouver, ISO, and other styles
7

Mazzoni, Silvia, Nicholas Gregor, Linda Al Atik, Yousef Bozorgnia, David Welch, and Gregory Deierlein. Probabilistic Seismic Hazard Analysis and Selecting and Scaling of Ground-Motion Records (PEER-CEA Project). Pacific Earthquake Engineering Research Center, University of California, Berkeley, CA, November 2020. http://dx.doi.org/10.55461/zjdn7385.

Full text
Abstract:
This report is one of a series of reports documenting the methods and findings of a multi-year, multi-disciplinary project coordinated by the Pacific Earthquake Engineering Research Center (PEER) and funded by the California Earthquake Authority (CEA). The overall project is titled “Quantifying the Performance of Retrofit of Cripple Walls and Sill Anchorage in Single-Family Wood-Frame Buildings,” henceforth referred to as the “PEER–CEA Project.” The overall objective of the PEER–CEA Project is to provide scientifically based information (e.g., testing, analysis, and resulting loss models) that measure and assess the effectiveness of seismic retrofit to reduce the risk of damage and associated losses (repair costs) of wood-frame houses with cripple wall and sill anchorage deficiencies as well as retrofitted conditions that address those deficiencies. Tasks that support and inform the loss-modeling effort are: (1) collecting and summarizing existing information and results of previous research on the performance of wood-frame houses; (2) identifying construction features to characterize alternative variants of wood-frame houses; (3) characterizing earthquake hazard and ground motions at representative sites in California; (4) developing cyclic loading protocols and conducting laboratory tests of cripple wall panels, wood-frame wall subassemblies, and sill anchorages to measure and document their response (strength and stiffness) under cyclic loading; and (5) the computer modeling, simulations, and the development of loss models as informed by a workshop with claims adjustors. This report is a product of Working Group 3 (WG3), Task 3.1: Selecting and Scaling Ground-motion records. The objective of Task 3.1 is to provide suites of ground motions to be used by other working groups (WGs), especially Working Group 5: Analytical Modeling (WG5) for Simulation Studies. The ground motions used in the numerical simulations are intended to represent seismic hazard at the building site. The seismic hazard is dependent on the location of the site relative to seismic sources, the characteristics of the seismic sources in the region and the local soil conditions at the site. To achieve a proper representation of hazard across the State of California, ten sites were selected, and a site-specific probabilistic seismic hazard analysis (PSHA) was performed at each of these sites for both a soft soil (Vs30 = 270 m/sec) and a stiff soil (Vs30=760 m/sec). The PSHA used the UCERF3 seismic source model, which represents the latest seismic source model adopted by the USGS [2013] and NGA-West2 ground-motion models. The PSHA was carried out for structural periods ranging from 0.01 to 10 sec. At each site and soil class, the results from the PSHA—hazard curves, hazard deaggregation, and uniform-hazard spectra (UHS)—were extracted for a series of ten return periods, prescribed by WG5 and WG6, ranging from 15.5–2500 years. For each case (site, soil class, and return period), the UHS was used as the target spectrum for selection and modification of a suite of ground motions. Additionally, another set of target spectra based on “Conditional Spectra” (CS), which are more realistic than UHS, was developed [Baker and Lee 2018]. The Conditional Spectra are defined by the median (Conditional Mean Spectrum) and a period-dependent variance. A suite of at least 40 record pairs (horizontal) were selected and modified for each return period and target-spectrum type. 
Thus, for each ground-motion suite, 40 or more record pairs were selected using the deaggregation of the hazard, resulting in more than 200 record pairs per target-spectrum type at each site. The suites contained more than 40 records in case some were rejected by the modelers due to secondary characteristics; however, none were rejected, and the complete set was used. For the case of UHS as the target spectrum, the selected motions were modified (scaled) such that the average of the median spectrum (RotD50) [Boore 2010] of the ground-motion pairs follow the target spectrum closely within the period range of interest to the analysts. In communications with WG5 researchers, for ground-motion (time histories, or time series) selection and modification, a period range between 0.01–2.0 sec was selected for this specific application for the project. The duration metrics and pulse characteristics of the records were also used in the final selection of ground motions. The damping ratio for the PSHA and ground-motion target spectra was set to 5%, which is standard practice in engineering applications. For the cases where the CS was used as the target spectrum, the ground-motion suites were selected and scaled using a modified version of the conditional spectrum ground-motion selection tool (CS-GMS tool) developed by Baker and Lee [2018]. This tool selects and scales a suite of ground motions to meet both the median and the user-defined variability. This variability is defined by the relationship developed by Baker and Jayaram [2008]. The computation of CS requires a structural period for the conditional model. In collaboration with WG5 researchers, a conditioning period of 0.25 sec was selected as a representative of the fundamental mode of vibration of the buildings of interest in this study. Working Group 5 carried out a sensitivity analysis of using other conditioning periods, and the results and discussion of selection of conditioning period are reported in Section 4 of the WG5 PEER report entitled Technical Background Report for Structural Analysis and Performance Assessment. The WG3.1 report presents a summary of the selected sites, the seismic-source characterization model, and the ground-motion characterization model used in the PSHA, followed by selection and modification of suites of ground motions. The Record Sequence Number (RSN) and the associated scale factors are tabulated in the Appendices of this report, and the actual time-series files can be downloaded from the PEER Ground-motion database Portal (https://ngawest2.berkeley.edu/)(link is external).
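A minimal sketch of the scaling idea described in this abstract: choose per-record scale factors so each record's response spectrum tracks a target spectrum over the 0.01–2.0 sec range. The spectra below are synthetic stand-ins; the real workflow uses RotD50 spectra and the CS-GMS tool rather than this simple fit.

```python
import numpy as np

def scale_factors(record_spectra, target, periods, t_min=0.01, t_max=2.0):
    """Per-record scale factors fitting each spectrum to the target in a
    least-squares sense on log spectral ordinates, over the period range
    of interest (0.01-2.0 sec, as in the abstract)."""
    mask = (periods >= t_min) & (periods <= t_max)
    factors = []
    for sa in record_spectra:
        log_misfit = np.log(target[mask]) - np.log(sa[mask])
        factors.append(np.exp(log_misfit.mean()))
    return np.array(factors)

# Synthetic stand-ins for a target spectrum and three record response spectra.
periods = np.geomspace(0.01, 10.0, 50)
target = 0.8 * np.exp(-0.5 * np.log(periods / 0.3) ** 2)      # toy spectral shape
rng = np.random.default_rng(5)
records = [target * rng.uniform(0.3, 3.0) * rng.lognormal(0.0, 0.2, periods.size)
           for _ in range(3)]

s = scale_factors(records, target, periods)
print("scale factors:", np.round(s, 3))

# The geometric mean of the scaled spectra should now track the target
# closely within the 0.01-2.0 sec range.
scaled_mean = np.exp(np.mean([np.log(r * f) for r, f in zip(records, s)], axis=0))
mask = (periods >= 0.01) & (periods <= 2.0)
print("max |log misfit| of suite mean:",
      float(np.abs(np.log(scaled_mean / target))[mask].max()))
```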
APA, Harvard, Vancouver, ISO, and other styles
8

TARAKANOVA, V., A. ROMANENKO, and O. PRANTSUZ. MEASURES TO PREVENT POSSIBLE EMERGENCIES AT THE ENTERPRISE. Science and Innovation Center Publishing House, 2022. http://dx.doi.org/10.12731/2070-7568-2022-11-1-4-32-43.

Full text
Abstract:
In the article, the authors consider emergency situations at the enterprise of the Joint-Stock Company “Scientific and Production Complex “Alternative Energy” (JSC “NPK “ALTEN”) and examine measures to prevent emergency situations at the enterprise and the readiness to eliminate their consequences. Compliance with these measures will improve the efficiency of the company’s industrial safety management system. The relevance of the research lies in an effective system for organizing and managing industrial safety, which makes it possible to manage risks and helps ensure working conditions favorable to employees’ health at the enterprise. A mobile emergency-response system was created. The system can also be used for accident recording and investigation, based on the use of corporate communication devices and applications for mobile operating systems.
APA, Harvard, Vancouver, ISO, and other styles
9

Stall, Nathan M., Kevin A. Brown, Aaron Jones, Andrew P. Costa, Vanessa Allen, Adalsteinn D. Brown, Gerald A. Evans, et al. COVID-19 and Ontario’s Long-Term Care Homes. Ontario COVID-19 Science Advisory Table, December 2020. http://dx.doi.org/10.47326/ocsat.2020.01.05.1.0.

Full text
Abstract:
Ontario long-term care (LTC) home residents have experienced disproportionately high morbidity and mortality, both from COVID-19 and from the conditions associated with the COVID-19 pandemic. There are several measures that could be effective in preventing COVID-19 outbreaks, hospitalizations, and deaths in Ontario’s LTC homes, if implemented. First, temporary staffing could be minimized by improving staff working conditions. Second, homes could be further decrowded by a continued disallowance of three- and four-resident rooms and additional temporary housing for the most crowded homes. Third, the risk of SARS-CoV-2 infection in staff could be minimized by approaches that reduce the risk of transmission in communities with a high burden of COVID-19.
APA, Harvard, Vancouver, ISO, and other styles
10

Schiller, Brandon, Tara Hutchinson, and Kelly Cobeen. Cripple Wall Small-Component Test Program: Wet Specimens II (PEER-CEA Project). Pacific Earthquake Engineering Research Center, University of California, Berkeley, CA, November 2020. http://dx.doi.org/10.55461/ldbn4070.

Full text
Abstract:
This report is one of a series of reports documenting the methods and findings of a multi-year, multi-disciplinary project coordinated by the Pacific Earthquake Engineering Research Center (PEER) and funded by the California Earthquake Authority (CEA). The overall project is titled “Quantifying the Performance of Retrofit of Cripple Walls and Sill Anchorage in Single-Family Wood-Frame Buildings,” henceforth referred to as the “PEER–CEA Project.” The overall objective of the PEER–CEA Project is to provide scientifically based information (e.g., testing, analysis, and resulting loss models) that measure and assess the effectiveness of seismic retrofit to reduce the risk of damage and associated losses (repair costs) of wood-frame houses with cripple wall and sill anchorage deficiencies as well as retrofitted conditions that address those deficiencies. Tasks that support and inform the loss-modeling effort are: (1) collecting and summarizing existing information and results of previous research on the performance of wood-frame houses; (2) identifying construction features to characterize alternative variants of wood-frame houses; (3) characterizing earthquake hazard and ground motions at representative sites in California; (4) developing cyclic loading protocols and conducting laboratory tests of cripple wall panels, wood-frame wall subassemblies, and sill anchorages to measure and document their response (strength and stiffness) under cyclic loading; and (5) the computer modeling, simulations, and the development of loss models as informed by a workshop with claims adjustors. This report is a product of Working Group 4 (WG4): Testing, whose central focus was to experimentally investigate the seismic performance of retrofitted and existing cripple walls. This report focuses on stucco or “wet” exterior finishes. Paralleled by a large-component test program conducted at the University of California, Berkeley (UC Berkeley) [Cobeen et al. 2020], the present study involves two of multiple phases of small-component tests conducted at the University of California San Diego (UC San Diego). Details representative of era-specific construction, specifically the most vulnerable pre-1960s construction, are of predominant focus in the present effort. Parameters examined are cripple wall height, finish style, gravity load, boundary conditions, anchorage, and deterioration. This report addresses the third phase of testing, which consisted of eight specimens, as well as half of the fourth phase of testing, which consisted of six specimens, three of which are discussed here. Although conducted in different phases, their results are combined here to co-locate observations regarding the behavior of the second phase of wet (stucco) finished specimens. The results of the first phase of wet specimen tests were presented in Schiller et al. [2020(a)]. Experiments involved the imposition of combined vertical loading and quasi-static reversed cyclic lateral load onto ten cripple walls, each 12 ft long and 2 or 6 ft high. One cripple wall was tested with a monotonic loading protocol. All specimens in this report were constructed with the same boundary conditions on the top and corners of the walls as well as being tested with the same vertical load.
Parameters addressed in this report include: wet exterior finishes (stucco over framing, stucco over horizontal lumber sheathing, and stucco over diagonal lumber sheathing), cripple wall height, loading protocol, anchorage condition, boundary condition at the bottom of the walls, and the retrofitted condition. Details of the test specimens and testing protocol, including instrumentation, together with measured responses and physical observations, are summarized in this report. Companion reports present phases of the tests considering, amongst other variables, impacts of various boundary conditions, stucco (wet) and non-stucco (dry) finishes, vertical load, cripple wall height, and anchorage condition. Results from these experiments are intended to support advancement of numerical modeling tools, which ultimately will inform seismic loss models capable of quantifying the reduction of loss achieved by applying state-of-practice retrofit methods as identified in FEMA P-1100, Vulnerability-Based Seismic Assessment and Retrofit of One- and Two-Family Dwellings.
APA, Harvard, Vancouver, ISO, and other styles