Academic literature on the topic 'Multivariate risk measure'


Journal articles on the topic "Multivariate risk measure"

1

Landsman, Zinoviy, and Tomer Shushi. "Multivariate Tail Moments for Log-Elliptical Dependence Structures as Measures of Risks." Symmetry 13, no. 4 (March 28, 2021): 559. http://dx.doi.org/10.3390/sym13040559.

Abstract:
The class of log-elliptical distributions is widely used and studied in risk measurement and actuarial science. The reason is that risks are often skewed and positive when they describe pure risks, i.e., risks in which there is no possibility of profit. In practice, risk managers confront a system of mutually dependent risks, not just a single risk. Thus, it is important to measure risks while capturing their dependence structure. In this short paper, we compute two multivariate risk measures, the multivariate tail conditional expectation and the multivariate tail covariance, for the family of log-elliptical distributions; these capture the dependence structure of the risks while focusing on the tail of their distributions, i.e., on extreme loss events. We then study our result and examine special cases, as well as optimal portfolio selection using such measures. Finally, we show how the given multivariate tail moments can also be computed for log-skew-elliptical models, based on approaches similar to those given for the log-elliptical case.
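As an illustration of the kind of quantity studied here (a sketch only, not the authors' closed-form results for log-elliptical laws), a multivariate tail conditional expectation can be estimated empirically by averaging each loss component over the joint tail event; the correlated lognormal sample below is an illustrative stand-in for a log-elliptical model:

```python
import numpy as np

def mtce(sample, q):
    """Empirical multivariate tail conditional expectation:
    E[X | X_j > VaR_q(X_j) for every component j]."""
    var_q = np.quantile(sample, q, axis=0)    # componentwise VaR at level q
    in_tail = np.all(sample > var_q, axis=1)  # joint tail event
    return sample[in_tail].mean(axis=0)

rng = np.random.default_rng(0)
# correlated lognormal losses, a toy stand-in for a log-elliptical model
z = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.6], [0.6, 1.0]], size=100_000)
losses = np.exp(z)
print(mtce(losses, 0.95))
```

Each component of the output exceeds its marginal 95% VaR, reflecting the measure's focus on joint extreme losses.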
2

Ararat, Çağın, Andreas H. Hamel, and Birgit Rudloff. "Set-Valued Shortfall and Divergence Risk Measures." International Journal of Theoretical and Applied Finance 20, no. 05 (July 30, 2017): 1750026. http://dx.doi.org/10.1142/s0219024917500261.

Abstract:
Risk measures for multivariate financial positions are studied in a utility-based framework. Under a certain incomplete preference relation, shortfall and divergence risk measures are defined as the optimal values of specific set minimization problems. The dual relationship between these two classes of multivariate risk measures is constructed via a recent Lagrange duality for set optimization. In particular, it is shown that a shortfall risk measure can be written as an intersection over a family of divergence risk measures indexed by a scalarization parameter. Examples include set-valued versions of the entropic risk measure and the average value at risk. As a second step, the minimization of these risk measures subject to trading opportunities is studied in a general convex market in discrete time. The optimal value of the minimization problem, called the market risk measure, is also a set-valued risk measure. A dual representation for the market risk measure that decomposes the effects of the original risk measure and the frictions of the market is proved.
3

Feinstein, Zachary, and Birgit Rudloff. "Time consistency for scalar multivariate risk measures." Statistics & Risk Modeling 38, no. 3-4 (July 1, 2021): 71–90. http://dx.doi.org/10.1515/strm-2019-0023.

Abstract:
In this paper we present results on dynamic multivariate scalar risk measures, which arise in markets with transaction costs and systemic risk. Dual representations of such risk measures are presented. These are then used to obtain the main results of this paper on time consistency; namely, an equivalence between a recursive formulation of multivariate scalar risk measures and multiportfolio time consistency. We are motivated to study time consistency of multivariate scalar risk measures because the superhedging risk measure in markets with transaction costs (with a single eligible asset) (Jouini and Kallal (1995), Löhne and Rudloff (2014), Roux and Zastawniak (2016)) does not satisfy the usual scalar concept of time consistency. In fact, as demonstrated in Feinstein and Rudloff (2021), scalar risk measures with the same scalarization weight at all times would not be time consistent in general. The recursive relation deduced in this paper for the scalarizations of multiportfolio time consistent set-valued risk measures requires consideration of the entire family of scalarizations. In this way we develop a direct notion of a “moving scalarization” for scalar time consistency that corroborates recent research on scalarizations of dynamic multi-objective problems (Karnam, Ma and Zhang (2017), Kováčová and Rudloff (2021)).
4

Haier, Andreas, and Ilya Molchanov. "Multivariate risk measures in the non-convex setting." Statistics & Risk Modeling 36, no. 1-4 (December 1, 2019): 25–35. http://dx.doi.org/10.1515/strm-2019-0002.

Abstract:
The family of admissible positions in a transaction costs model is a random closed set, which is convex in the case of proportional transaction costs. However, convexity fails, e.g., in the case of fixed transaction costs or when only a finite number of transfers are possible. The paper presents an approach to measuring the risk of such positions based on the idea of considering all selections of the portfolio and checking whether one of them is acceptable. Properties and basic examples of risk measures of non-convex portfolios are presented.
5

Fougères, Anne-Laure, and Cécile Mercadier. "Risk Measures and Multivariate Extensions of Breiman's Theorem." Journal of Applied Probability 49, no. 2 (June 2012): 364–84. http://dx.doi.org/10.1239/jap/1339878792.

Abstract:
The modeling of insurance risks has received an increasing amount of attention because of solvency capital requirements. The ruin probability has become a standard risk measure to assess regulatory capital. In this paper we focus on discrete-time models for the finite time horizon. Several results are available in the literature to calibrate the ruin probability by means of the sum of the tail probabilities of individual claim amounts. The aim of this work is to obtain asymptotics for such probabilities under multivariate regular variation and, more precisely, to derive them from extensions of Breiman's theorem. We thus present new situations where the ruin probability admits computable equivalents. We also derive asymptotics for the value at risk.
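The core asymptotic tool here, Breiman's theorem, states that for a regularly varying X with tail index alpha and an independent, comparatively light-tailed Y, P(XY > t) ~ E[Y^alpha] P(X > t) as t grows. A quick Monte Carlo sanity check (with illustrative distribution choices, not the paper's ruin model):

```python
import numpy as np

# Breiman's theorem: P(XY > t) ~ E[Y^alpha] * P(X > t) for X regularly
# varying with index alpha and an independent, light-tailed Y.
rng = np.random.default_rng(6)
alpha = 2.0
n = 2_000_000
x = (1.0 - rng.random(n)) ** (-1.0 / alpha)  # Pareto(alpha): P(X > t) = t^-alpha
y = rng.uniform(0.5, 1.5, n)                 # bounded, hence light-tailed
t = 50.0
lhs = (x * y > t).mean()                     # empirical P(XY > t)
rhs = np.mean(y**alpha) * t**(-alpha)        # Breiman approximation
print(lhs, rhs, lhs / rhs)                   # ratio close to 1 for large t
```

For this particular pair (bounded Y, exact Pareto X) the approximation is in fact exact for t above the support bound, so the ratio fluctuates around 1 only through sampling noise.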
6

Fougères, Anne-Laure, and Cécile Mercadier. "Risk Measures and Multivariate Extensions of Breiman's Theorem." Journal of Applied Probability 49, no. 02 (June 2012): 364–84. http://dx.doi.org/10.1017/s0021900200009141.

7

Wei, Linxiao, and Yijun Hu. "CAPITAL ALLOCATION WITH MULTIVARIATE RISK MEASURES: AN AXIOMATIC APPROACH." Probability in the Engineering and Informational Sciences 34, no. 2 (March 6, 2019): 297–315. http://dx.doi.org/10.1017/s0269964819000032.

Abstract:
Capital allocation is of central importance in portfolio management and risk-based performance measurement. Capital allocations for univariate risk measures have been extensively studied in the finance literature. In contrast, few papers have dealt with capital allocations for multivariate risk measures. In this paper, we propose an axiom system for capital allocation with multivariate risk measures. We first recall the class of positively homogeneous and subadditive multivariate risk measures, and provide the corresponding representation results. Then it is shown that for a given positively homogeneous and subadditive multivariate risk measure, there exists a capital allocation principle. Furthermore, the uniqueness of the capital allocation principle is characterized. Finally, examples are given to derive explicit capital allocation principles for multivariate risk measures based on mean and standard deviation, including the multivariate mean-standard-deviation risk measures.
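For intuition on what a capital allocation principle looks like in the mean-standard-deviation case, here is a hedged sketch of the classical Euler (gradient) allocation for the aggregate measure rho(S) = E[S] + c * sd(S); this is a standard textbook principle, not necessarily the axiomatically characterized one of the paper:

```python
import numpy as np

def mean_sd_risk(losses, c=1.5):
    """rho(S) = E[S] + c * sd(S) for the aggregate loss S."""
    s = losses.sum(axis=1)
    return s.mean() + c * s.std()

def euler_allocation(losses, c=1.5):
    """Euler (gradient) allocation for the mean-sd measure:
    A_i = E[X_i] + c * Cov(X_i, S) / sd(S)."""
    s = losses.sum(axis=1)
    cov = np.array([np.cov(losses[:, i], s, bias=True)[0, 1]
                    for i in range(losses.shape[1])])
    return losses.mean(axis=0) + c * cov / s.std()

rng = np.random.default_rng(1)
x = rng.multivariate_normal([1.0, 2.0], [[1.0, 0.3], [0.3, 2.0]], size=50_000)
alloc = euler_allocation(x)
print(alloc, alloc.sum(), mean_sd_risk(x))
```

By construction the component allocations sum exactly to the total risk (the "full allocation" property), since the covariances with S sum to Var(S).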
8

Zuo, Baishuai, and Chuancun Yin. "Multivariate tail covariance risk measure for generalized skew-elliptical distributions." Journal of Computational and Applied Mathematics 410 (August 2022): 114210. http://dx.doi.org/10.1016/j.cam.2022.114210.

9

Di Bernardino, E., J. M. Fernández-Ponce, F. Palacios-Rodríguez, and M. R. Rodríguez-Griñolo. "On multivariate extensions of the conditional Value-at-Risk measure." Insurance: Mathematics and Economics 61 (March 2015): 1–16. http://dx.doi.org/10.1016/j.insmatheco.2014.11.006.

10

Hürlimann, Werner. "Multivariate Fréchet copulas and conditional value-at-risk." International Journal of Mathematics and Mathematical Sciences 2004, no. 7 (2004): 345–64. http://dx.doi.org/10.1155/s0161171204210158.

Abstract:
Based on the method of copulas, we construct a parametric family of multivariate distributions using mixtures of independent conditional distributions. The new family of multivariate copulas is a convex combination of products of independent and comonotone subcopulas. It fulfills the four most desirable properties that a multivariate statistical model should satisfy. In particular, the bivariate margins belong to a simple but flexible one-parameter family of bivariate copulas, called linear Spearman copula, which is similar but not identical to the convex family of Fréchet. It is shown that the distribution and stop-loss transform of dependent sums from this multivariate family can be evaluated using explicit integral formulas, and that these dependent sums are bounded in convex order between the corresponding independent and comonotone sums. The model is applied to the evaluation of the economic risk capital for a portfolio of risks using conditional value-at-risk measures. A multivariate conditional value-at-risk vector measure is considered. Its components coincide for the constructed multivariate copula with the conditional value-at-risk measures of the risk components of the portfolio. This yields a “fair” risk allocation in the sense that each risk component becomes allocated to its coherent conditional value-at-risk.
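The building block described above, a convex combination of comonotone and independence copulas, is easy to simulate. A minimal sketch of sampling from the bivariate mixture C = theta*M + (1 - theta)*Pi (the linear Spearman form for theta in [0, 1]), for which Spearman's rho equals theta:

```python
import numpy as np

def sample_linear_spearman(n, theta, rng):
    """Sample from C(u, v) = theta*M(u, v) + (1 - theta)*Pi(u, v):
    with probability theta the pair is comonotone (v = u),
    otherwise u and v are independent uniforms."""
    u = rng.random(n)
    v = np.where(rng.random(n) < theta, u, rng.random(n))
    return u, v

def spearman_rho(x, y):
    """Empirical Spearman rank correlation (no ties expected here)."""
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1]

rng = np.random.default_rng(2)
u, v = sample_linear_spearman(200_000, 0.4, rng)
print(spearman_rho(u, v))  # close to theta = 0.4
```

Spearman's rho is linear in the copula (1 for the comonotone copula M, 0 for independence Pi), so the mixture weight theta is recovered directly from the rank correlation.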

Dissertations / Theses on the topic "Multivariate risk measure"

1

Doldi, Alessandro. "Equilibrium, Systemic Risk Measures and Optimal Transport: A Convex Duality Approach." Doctoral thesis, Università degli Studi di Milano, 2021. http://hdl.handle.net/2434/812668.

Abstract:
This Thesis focuses on two main topics. Firstly, we introduce and analyze the novel concept of Systemic Optimal Risk Transfer Equilibrium (SORTE), and we progressively generalize it (i) to a multivariate setup and (ii) to a dynamic (conditional) setting. Additionally, we investigate its relation to the recently introduced concept of Systemic Risk Measures (SRM). We present Conditional Systemic Risk Measures and study their properties, dual representation and possible interpretations of the associated allocations as equilibria in the sense of SORTE. On a parallel line of work, we develop a duality for the Entropy Martingale Optimal Transport problem and provide applications to problems of nonlinear pricing-hedging. The mathematical techniques we exploit are mainly borrowed from functional and convex analysis, as well as probability theory. More specifically, apart from a wide range of classical results from functional analysis, we extensively rely on Fenchel-Moreau-Rockafellar type conjugacy results, Minimax Theorems, the theory of Orlicz spaces, and compactness results in the spirit of the Komlós Theorem. At the same time, mathematical results concerning utility maximization theory (existence of optima for primal and dual problems, to mention one example) and optimal transport theory are widely exploited.
The notion of SORTE is inspired by Bühlmann's classical Equilibrium Risk Exchange (H. Bühlmann, "The general economic premium principle", Astin Bulletin, 1984). In both Bühlmann's and the SORTE definition, each agent behaves rationally by maximizing his/her expected utility given a budget constraint. The two approaches differ in the budget constraints. In Bühlmann's definition the vector that assigns the budget constraint is given a priori. In the SORTE approach, on the contrary, the budget constraint is endogenously determined by solving a systemic utility maximization problem. SORTE gives priority to the systemic aspects of the problem, in order to first optimize the overall systemic performance, rather than to individual rationality. Single agents' preferences are, however, taken into account through the presence of individual optimization problems. The two aspects are simultaneously considered via an optimization problem for a value function given by the summation of the single agents' utilities. After providing a financial and theoretical justification for this new idea, we present sufficiently general assumptions that guarantee existence, uniqueness, and Pareto optimality of such a SORTE.
Having laid the theoretical foundation for the newly introduced SORTE, the Thesis proceeds to extend this notion to the case in which the value function to be optimized has two components: one being the sum of the single agents' utility functions, as in the case of SORTE, the other consisting of a truly systemic component. This marks the progress from SORTE to the Multivariate Systemic Optimal Risk Transfer Equilibrium (mSORTE). Technically, the extension of SORTE to the new setup requires developing a theory for multivariate utility functions and selecting at the same time a suitable framework for the duality theory. Conceptually, this more general setting allows us to introduce and study a Nash Equilibrium property of the optimizers. Existence, uniqueness, Pareto optimality and the Nash Equilibrium property of the newly defined mSORTE are proved in this Thesis. Additionally, it is shown that mSORTE is a proper generalization of SORTE, covering it from both the conceptual and the mathematical point of view.
Proceeding further in the analysis, the relations between the concepts of mSORTE and SRM are investigated. The notion of SRM we start from was introduced in the papers "A unified approach to systemic risk measures via acceptance sets" (Math. Finance, 2019) and "On fairness of systemic risk measures" (Finance Stoch., 2020) by F. Biagini, J.-P. Fouque, M. Frittelli, and T. Meyer-Brandis. The SRM of Biagini et al. are generalized in this Thesis to a dynamic (namely conditional) setting, adding a systemic, multivariate term to the threshold functions that Biagini et al. consider in their papers. The dynamic version of mSORTE is introduced, and it is proved that the optimal allocations of dynamic SRM, together with the corresponding fair pricing measures, yield a dynamic mSORTE. This remains true in particular if conditioning is taken with respect to the trivial sigma algebra, which is tantamount to working in the non-dynamic setting covered in Biagini et al. for SRM, and in the previous parts of our work for mSORTE. The case of exponential utility functions is thoroughly examined, and the explicit formulas we obtain for this specific choice of threshold functions allow us to provide a time consistency property for allocations, dynamic SRM and dynamic mSORTE.
The last part of this Thesis is devoted to a conceptually separate topic. Nonetheless, a clear mathematical link with the previous work is established by the use of common techniques. A duality between a novel Entropy Martingale Optimal Transport (EMOT) problem (D) and an associated optimization problem (P) is developed. In (D) the approach taken in Liero et al. (M. Liero, A. Mielke, and G. Savaré, "Optimal entropy-transport problems and a new Hellinger-Kantorovich distance between positive measures", Inventiones mathematicae, 2018) serves as a basis for adding the constraint, typical of Martingale Optimal Transport (MOT) theory, that the infimum of the cost functional is taken over martingale probability measures instead of finite positive measures, as in Liero et al. Problem (D) differs from the corresponding problem in Liero et al. not only by the martingale constraint, but also because we admit less restrictive penalization terms D, which may not have a divergence formulation. In Problem (P) the objective functional, associated via Fenchel conjugacy to the terms D, is no longer linear, as it is in Optimal Transport or in MOT. This leads to a novel optimization problem which also has a clear financial interpretation as a nonlinear subhedging value. Our results establish a novel nonlinear robust pricing-hedging duality in financial mathematics, which covers a wide range of known robust results in its generality.
The research for this Thesis resulted in the production of the following works: F. Biagini, A. Doldi, J.-P. Fouque, M. Frittelli, and T. Meyer-Brandis, "Systemic optimal risk transfer equilibrium", Mathematics and Financial Economics, 2021; A. Doldi and M. Frittelli, "Multivariate Systemic Optimal Risk Transfer Equilibrium", Preprint: arXiv:1912.12226, 2019; A. Doldi and M. Frittelli, "Conditional Systemic Risk Measures", Preprint: arXiv:2010.11515, 2020; A. Doldi and M. Frittelli, "Entropy Martingale Optimal Transport and Nonlinear Pricing-Hedging Duality", Preprint: arXiv:2005.12572, 2020.
2

Hua, Lei. "Multivariate extremal dependence and risk measures." Thesis, University of British Columbia, 2012. http://hdl.handle.net/2429/42475.

Abstract:
Overlooking non-Gaussian and tail dependence phenomena has emerged as an important reason for underestimating aggregate financial or insurance risks. For modeling the dependence structure between non-Gaussian random variables, the concept of copula plays an important role and provides practitioners with promising quantitative tools. In order to study copula families that have different tail patterns and tail asymmetry than the multivariate Gaussian and t copulas, we introduce the concepts of tail order and tail order functions. These provide a unified way to study three types of dependence in the tails: tail dependence, intermediate tail dependence and tail orthant independence. Some fundamental properties of tail order and tail order functions are obtained. For multivariate Archimedean copulas, we relate the tail heaviness of a positive random variable to the tail behavior of the Archimedean copula constructed from the Laplace transform of that random variable. Quantitative risk measurement pays particular attention to large losses, and a good statistical fit for the whole dataset does not guarantee good risk assessments. We use tail comonotonicity as a conservative dependence structure for modeling multivariate dependent losses. In this way, we do not lose too much accuracy but gain reasonably conservative risk measures, especially when we consider high-risk scenarios. We have conducted a thorough investigation of the properties and constructions of tail comonotonicity, and found interesting properties such as asymptotic additivity of risk measures. Sufficient conditions have also been obtained to justify the conservativity of tail comonotonicity. For large losses, the tail behavior of loss distributions is more critical than the whole distribution. An asymptotic study assuming that each marginal risk goes to infinity is more mathematically tractable.
However, the asymptotic study that leads to a first order approximation is only a crude way and may not be sufficient. To this end, we study the second order conditions for risk measures of sub-extremal multiple risks. Some relationships between Value at Risk and Conditional Tail Expectation have been obtained under the condition of Second Order Regular Variation. We also find that the second order parameter determines whether a higher order approximation is necessary.
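To make the first-order picture concrete (an illustrative sketch, not the thesis's second-order expansions): for a Pareto tail with index alpha, the conditional tail expectation and VaR satisfy CTE_q / VaR_q -> alpha / (alpha - 1) as q -> 1, which a simple empirical check reproduces:

```python
import numpy as np

def var_cte(sample, q):
    """Empirical Value at Risk and Conditional Tail Expectation at level q."""
    var = np.quantile(sample, q)
    cte = sample[sample > var].mean()
    return var, cte

rng = np.random.default_rng(3)
alpha = 3.0                                          # Pareto tail index
x = (1.0 - rng.random(1_000_000)) ** (-1.0 / alpha)  # Pareto(alpha) sample
var, cte = var_cte(x, 0.99)
print(var, cte, cte / var)  # ratio near alpha/(alpha - 1) = 1.5
```

For an exact Pareto law the ratio is alpha/(alpha - 1) at every level; for distributions that are only asymptotically Pareto, the second-order term studied in the thesis governs how fast the ratio approaches this limit.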
3

Tavin, Bertrand. "Trois essais en finance de marché." Thesis, Paris 1, 2013. http://www.theses.fr/2013PA010029.

Abstract:
This thesis is dedicated to the study of a market with several risky assets and options written on these assets. In the first essay, we express the implied distribution of an underlying asset price as a function of the implied volatility smile of its options. The obtained expression for the implied density takes the form of a log-normal density plus two adjustment terms. We then develop two practical applications: valuing a portfolio of digital options and fitting a parametric distribution. In the second essay, we propose a twofold characterization of the absence of arbitrage opportunity in terms of copula functions, each leading to a detection method for arbitrage situations. The first method relies on a particular property of Bernstein copulas. The second method, valid only in the case of a market with two risky assets, is based on results on improved Fréchet-Hoeffding bounds in the presence of additional information about the dependence. We also present results obtained by applying the proposed methods to empirical data. Finally, in the third essay, we develop an approach to hedge, with spread options, the exposure to dependence risk of a portfolio comprising two-asset options. The approach is based on two parametric models of dependence that we introduce: the copula functions named Power Frank (PF) and Power Student's t (PST). The workings and results of the proposed approach are illustrated in a numerical study.
4

Hoffmann, Hannes. "Multivariate conditional risk measures: with a view towards systemic risk in financial networks." Doctoral thesis (advisor: Thilo Meyer-Brandis), München: Universitätsbibliothek der Ludwig-Maximilians-Universität, 2017. http://d-nb.info/1137835222/34.

5

Li, Yuming. "Univariate and multivariate measures of risk aversion and risk premiums with joint normal distribution and applications in portfolio selection models." Thesis, University of British Columbia, 1987. http://hdl.handle.net/2429/26110.

Abstract:
This thesis gives the formal derivations of the so-called Rubinstein's measures of risk aversion and their multivariate generalizations. The applications of these measures in portfolio selection models are also presented. Assuming that a decision maker's preferences can be represented by a unidimensional von Neumann and Morgenstern utility function, we consider a model with an uninsurable initial random wealth and an insurable risk. Under the assumption that the two random variables have a bivariate normal distribution, the second-order covariance operator is developed from the Stein/Rubinstein first-order covariance operator and is used to derive Rubinstein's measures of risk aversion from approximations of risk premiums. Rubinstein's measures of risk aversion are proved to be the appropriate generalizations of the Arrow-Pratt measures of risk aversion. In a portfolio selection model with two risky investments having a bivariate normal distribution, we show that Rubinstein's measures of risk aversion yield the desirable characterizations of risk aversion and of wealth effects on the optimal portfolio. These properties of Rubinstein's measures of risk aversion are analogous to those of the Arrow-Pratt measures of risk aversion in the portfolio selection model with one riskless and one risky investment. In multi-dimensional decision problems, we assume that a decision maker's preferences can be represented by a multivariate utility function. From the model with an uninsurable initial wealth vector and an insurable risk vector having a joint normal distribution in the wealth space, we derive the matrix measures of risk aversion, which are the multivariate extension of Rubinstein's measures of risk aversion. The derivations are based on the multivariate version of the Stein/Rubinstein covariance operator developed by Gassmann and on its second-order generalization developed in this thesis.
We finally present an application of the matrix measures of risk aversion in a portfolio selection model with a multivariate utility function and two risky investments. In this model, if we assume that the random returns on the two investments and other random variables have a joint normal distribution, the optimal portfolio can be characterized by the matrix measures of risk aversion.
6

Loregian, Angela. "Multivariate Lévy models: estimation and asset allocation." Doctoral thesis, Università degli Studi di Milano-Bicocca, 2013. http://hdl.handle.net/10281/49727.

Abstract:
Multidimensional asset models based on Lévy processes have been introduced to meet the necessity of capturing market shocks using more refined distribution assumptions compared to the standard Gaussian framework. In particular, along with accurately modeling marginal distributions of asset returns, capturing the dependence structure among them is of paramount importance, for example, to correctly price derivatives written on more than one underlying asset. Most of the literature on multivariate Lévy models focuses in fact on pricing multi-asset products, which is also the case of the model introduced in Ballotta and Bonfiglioli (2014). Believing that risk and portfolio management applications may benefit from a better description of the joint distribution of the returns as well, we choose to adopt Ballotta and Bonfiglioli (2014) model for asset allocation purposes and we empirically test its performances. We choose this model since, besides its flexibility and the ability to properly capture the dependence among assets, it is simple, relatively parsimonious and it has an immediate and intuitive interpretation, retaining a high degree of mathematical tractability. In particular we test two specifications of the general model, assuming respectively a pure jump process, more precisely the normal inverse Gaussian process, or a jump-diffusion process, precisely Merton’s jump-diffusion process, for all the components involved in the model construction. To estimate the model we propose a simple and easy-to-implement three-step procedure, which we assess via simulations, comparing the results with those obtained through a more computationally intensive one-step maximum likelihood estimation. 
We empirically test portfolio construction based on multivariate Lévy models assuming a standard utility maximization framework; for the exponential utility function we obtain a closed-form expression for the expected utility, while for other utility functions (we choose to test the power utility) we resort to numerical approximations. Among the benchmark strategies, we consider what we call a ‘non-parametric optimization approach’, based on Gaussian kernel estimation of the portfolio return distribution, which to our knowledge has not been used before. A different approach to allocation decisions aims at minimizing portfolio riskiness subject to a minimum expected return. Following Rockafellar and Uryasev (2000), we describe how to solve this optimization problem in our multivariate Lévy framework when risk is measured by CVaR. Moreover, we present formulas and methods to compute, as efficiently as possible, several downside risk measures for portfolios made of assets following the multivariate Lévy model of Ballotta and Bonfiglioli (2014). More precisely, we consider traditional risk measures (VaR and CVaR), the corresponding marginal measures, which evaluate their sensitivity to changes in the portfolio weights, and intra-horizon risk measures, which take into account the magnitude of losses that can occur before the end of the investment horizon. Formulas for CVaR in monetary terms and for the marginal measures, together with our approach to evaluating intra-horizon risk, are among the original contributions of this work.
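A hedged sketch of the scenario-based Rockafellar-Uryasev representation mentioned above, CVaR_q(L) = min over a of { a + E[(L - a)+] / (1 - q) }, evaluated here by minimizing over the sample points (with illustrative normal losses, not the Lévy model of the thesis):

```python
import numpy as np

def cvar_ru(losses, q):
    """CVaR via the Rockafellar-Uryasev auxiliary function
    F(a) = a + E[(L - a)+] / (1 - q); for an empirical distribution
    the minimum over a can be taken over the observed scenarios."""
    a = np.sort(losses)[:, None]  # candidate thresholds, one per scenario
    f = a[:, 0] + np.maximum(losses[None, :] - a, 0.0).mean(axis=1) / (1.0 - q)
    return f.min()

rng = np.random.default_rng(4)
loss = rng.normal(0.0, 1.0, 2_000)
# For a standard normal loss, CVaR at q = 0.95 is about 2.06
print(cvar_ru(loss, 0.95))
```

Because F is convex and piecewise linear in a, its minimizer is the VaR and its minimum is the CVaR, which is what makes CVaR portfolio optimization tractable as a linear program in Rockafellar and Uryasev's formulation.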
7

Said, Khalil. "Mesures de risque multivariées et applications en science actuarielle." Thesis, Lyon, 2016. http://www.theses.fr/2016LYSE1245.

Abstract:
The entry into force, on January 1st, 2016, of Solvency 2, the European regulatory reform of the insurance industry, is a historic event that will radically change risk-management practices. It is based on taking the insurer's own risk profile and internal view of risk into account, through the ability to use internal models for calculating the solvency capital requirement and through the ORSA (Own Risk and Solvency Assessment) approach for internal risk management. This makes mathematical modeling an essential tool for a successful regulatory exercise. Risk theory must support this development by providing answers to practical problems, especially those related to dependence modeling and the choice of risk measures. In this context, this thesis presents a contribution to improving the management of insurance risks. In four chapters we present multivariate risk measures and their application to the allocation of solvency capital. The first part of this thesis is devoted to the introduction and study of a new family of multivariate elicitable risk measures that we call multivariate expectiles. Its first chapter presents these measures and explains the different construction approaches. The multivariate expectiles satisfy a set of coherence properties that we also discuss in this chapter, before proposing a stochastic approximation tool for these risk measures. Since the performance of this method is insufficient near asymptotic threshold levels, a theoretical analysis of the asymptotic behavior is necessary; this analysis is the subject of the second chapter of this part. It is carried out in a multivariate regular variations framework, and results are given in the case of equivalent marginal tails.
The second chapter also studies the asymptotic behavior of multivariate expectiles, under the previous assumptions, in the presence of perfect dependence or of asymptotic independence, and proposes estimators of the asymptotic expectile in these cases using extreme value statistics. The second part of the thesis focuses on the problem of solvency capital allocation in insurance. It consists of two chapters, each in the form of a published paper. The first presents an axiomatic characterization of the coherence of a capital allocation method in a general framework, and then studies the coherence properties of an allocation approach based on the minimization of multivariate risk indicators. The second paper is a probabilistic analysis of the behavior of this allocation method as a function of the marginal distributions of the risks and of the dependence structure. The asymptotic behavior of the optimal allocation is also studied, and the impact of dependence is illustrated with selected marginal models and copulas. Given the significant dependence between the various risks borne by insurance companies, a multivariate approach is a more appropriate response to the different problems of risk management. This thesis is built on a multidimensional vision of risk and proposes multivariate risk measures that can be applied to several actuarial problems of this nature.
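For intuition about the objects this thesis generalizes: a univariate expectile of level tau is the minimizer of an asymmetric quadratic loss, characterized by the first-order condition tau*E[(X - e)_+] = (1 - tau)*E[(e - X)_+]. The following sketch is illustrative only (it is neither the thesis's multivariate construction nor its stochastic-approximation algorithm); it computes a sample expectile by bisection on that condition:

```python
import numpy as np

def expectile(x, tau, tol=1e-10):
    """Sample expectile of level tau: the root e of
    g(e) = tau*mean((x - e)_+) - (1 - tau)*mean((e - x)_+),
    which is strictly decreasing in e, found here by bisection."""
    x = np.asarray(x, dtype=float)
    lo, hi = x.min(), x.max()
    while hi - lo > tol:
        e = 0.5 * (lo + hi)
        g = tau * np.mean(np.maximum(x - e, 0.0)) \
            - (1 - tau) * np.mean(np.maximum(e - x, 0.0))
        if g > 0:
            lo = e  # e is still below the expectile
        else:
            hi = e
    return 0.5 * (lo + hi)

sample = np.array([1.0, 2.0, 3.0, 4.0, 10.0])
print(expectile(sample, 0.5))  # tau = 0.5 recovers the sample mean (4.0 here)
print(expectile(sample, 0.9))  # larger tau shifts the expectile into the right tail
```

For tau = 1/2 the expectile is the mean; as tau tends to 1 it moves into the tail, which is the asymptotic regime the second chapter analyzes.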
APA, Harvard, Vancouver, ISO, and other styles
8

Chautru, Emilie. "Statistiques multivariées pour l'analyse du risque alimentaire." Thesis, Paris, ENST, 2013. http://www.theses.fr/2013ENST0045/document.

Full text
Abstract:
At a crossroads of economic, sociological, cultural and sanitary issues, dietary analysis is of major importance for public health institutes. As international trade facilitates the transportation of foodstuffs produced in very different environmental conditions, and as mass consumption encourages cost-cutting, profit-driven strategies (GMOs, pesticides, etc.), it becomes necessary to quantify the sanitary risks engendered by such economic behaviors. We are interested in the evaluation of chronic exposure (on a yearly scale) to food contaminants whose long-term toxicity is already well documented. Because dietary risks and benefits are not limited to the ingestion or avoidance of toxic substances, nutritional intakes are also considered. Our work is thus organized along three main lines of research. We first consider the statistical analysis of very high long-term exposure to one or more chemical elements present in food, adopting approaches in keeping with extreme value theory. We then adapt classical techniques from statistical learning for minimum volume set estimation in order to identify dietary habits that strike a compromise between toxicological risk and nutritional benefit. Finally, we study the asymptotic properties of a number of statistics that assess the characteristics of the distribution of individual exposure, taking into account the survey scheme from which the data originate.
9

Kato, Fernando Hideki. "Análise de carteiras em tempo discreto." Universidade de São Paulo, 2004. http://www.teses.usp.br/teses/disponiveis/12/12139/tde-24022005-005812/.

Full text
Abstract:
In this thesis, Markowitz's portfolio selection model will be extended by means of a discrete-time analysis and more realistic hypotheses. A finite tensor product of Erlang densities will be used to approximate the multivariate probability density function of the single-period discrete returns of dependent assets. The Erlang is a particular case of the Gamma distribution. A finite mixture can generate multimodal asymmetric densities, and the tensor product generalizes this concept to higher dimensions. Assuming that the multivariate density was independent and identically distributed (i.i.d.) in the past, the approximation can be calibrated with historical data using the maximum likelihood criterion. This is a large-scale optimization problem, but one with a special structure. Assuming that this multivariate density will be i.i.d. in the future, the density of the discrete returns of a portfolio of assets with nonnegative weights will be a finite mixture of Erlang densities. The risk will be calculated with the Downside Risk measure, which is convex for certain parameters, is not based on quantiles, does not cause risk underestimation, and makes the single-period and multiperiod optimization problems convex. The discrete return is a multiplicative random variable over time. The multiperiod distribution of the discrete returns of a sequence of T portfolios will be a finite mixture of Meijer G distributions. After a change of the probability measure to the average compound measure, it is possible to calculate the risk and the return, which will lead to the multiperiod efficient frontier, on which each point represents one or more ordered sequences of T portfolios. The portfolios of each sequence must be calculated from the future to the present, keeping the expected return at the desired level, which can be a function of time. A dynamic asset allocation strategy is to redo the calculations at each period, using the new information available.
If the time horizon tends to infinity, then the efficient frontier, under the average compound probability measure, will tend to a single point, given by the Kelly portfolio, whatever the risk measure. To select one among several portfolio optimization models, it is necessary to compare their relative performances. The efficient frontier of each model must be plotted in its respective graph. As the weights of the assets of the portfolios on these curves are known, it is possible to plot all curves in the same graph. For a given expected return, the efficient portfolios of the models can be calculated, and the realized returns and their differences along a backtest can be compared.
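The Downside Risk measure referred to above is a lower partial moment: the expectation of the shortfall below a target, raised to a given order. A minimal sample-based sketch (the target and order are hypothetical, and this is independent of the dissertation's Erlang-mixture machinery):

```python
import numpy as np

def downside_risk(returns, target=0.0, order=2):
    """Sample lower partial moment: mean of (target - r)**order over
    the shortfalls below the target (zero elsewhere). Unlike a
    quantile-based measure, it uses the entire lower tail."""
    shortfall = np.maximum(target - np.asarray(returns, dtype=float), 0.0)
    return float(np.mean(shortfall ** order))

rets = np.array([0.05, -0.02, 0.01, -0.04, 0.03])
print(downside_risk(rets))  # (0.02**2 + 0.04**2) / 5 = 4e-4 up to float rounding
```

For order >= 1 this quantity is convex in the portfolio weights, which is what makes the optimization problems in the dissertation tractable.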
10

Omidi, Firouzi Hassan. "On the design of customized risk measures in insurance, the problem of capital allocation and the theory of fluctuations for Lévy processes." Thèse, 2014. http://hdl.handle.net/1866/11669.

Full text

Books on the topic "Multivariate risk measure"

1

van der Hoeven, Frank, and Alexander Wandl. Hotterdam: How space is making Rotterdam warmer, how this affects the health of its inhabitants, and what can be done about it. TU Delft Open, 2015. http://dx.doi.org/10.47982/bookrxiv.1.

Full text
Abstract:
Heat waves will occur in Rotterdam with greater frequency in the future. Those affected most will be the elderly – a group that is growing in size. In the light of the Paris heat wave of August 2003 and the one in Rotterdam in July 2006, mortality rates among the elderly in particular are likely to rise in the summer. METHOD The aim of the Hotterdam research project was to gain a better understanding of urban heat. The heat was measured and the surface energy balance modelled from that perspective. We identified the social and physical features of the city in detail with the help of satellite images, GIS and 3D models. We determined the links between urban heat/surface energy balance and the social/physical features of Rotterdam by multivariable regression analysis. The crucial elements of the heat problem were then clustered and illustrated on a social and a physical heat map. RESULTS The research project produced two heat maps, an atlas of underlying data and a set of adaptation measures which, when combined, will make the city of Rotterdam and its inhabitants more aware of and less vulnerable to heat wave-related health effects. CONCLUSION In different ways, the pre-war districts of the city (North, South, and West) are warmer and more vulnerable to urban heat than other areas of Rotterdam. The temperature readings that we carried out confirm these findings as far as outdoor temperatures are concerned. Indoor temperatures vary widely. Homes seem to have their own dynamics, in which the house's age plays a role. The above-average mortality of those aged 75 and over during the July 2006 heat wave in Rotterdam can be explained by a) the concentration of people in this age group, b) the age of the homes they live in, and c) the sum of sensible heat and ground heat flux. A diverse mix of impervious surfaces, surface water, foliage, building envelopes and shade makes one area or district warmer than another.
Adaptation measures are in the hands of residents, homeowners and the local council alike, and relate to changing behaviour, physical measures for homes, and urban design respectively.

Book chapters on the topic "Multivariate risk measure"

1

Guégan, Dominique, and Bertrand K. Hassani. "Extensions for Risk Measures: Univariate and Multivariate Approaches." In Risk Measurement, 115–42. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-02680-6_5.

Full text
2

Cardin, Marta, and Elisa Pagani. "Some classes of multivariate risk measures." In Mathematical and Statistical Methods for Actuarial Sciences and Finance, 63–73. Milano: Springer Milan, 2010. http://dx.doi.org/10.1007/978-88-470-1481-7_7.

Full text
3

Feinstein, Zachary, and Birgit Rudloff. "A Comparison of Techniques for Dynamic Multivariate Risk Measures." In Set Optimization and Applications - The State of the Art, 3–41. Berlin, Heidelberg: Springer Berlin Heidelberg, 2015. http://dx.doi.org/10.1007/978-3-662-48670-2_1.

Full text
4

Lee, Sharon X., and Geoffrey J. McLachlan. "Risk Measures Based on Multivariate Skew Normal and Skew t-Mixture Models." In Asymmetric Dependence in Finance, 152–68. Chichester, UK: John Wiley & Sons Ltd, 2018. http://dx.doi.org/10.1002/9781119288992.ch7.

Full text
5

Selman Çolak, Mehmet, İbrahim Ethem Güney, and Yavuz Selim Hacıhasanoğlu. "The Relationship between Economic Uncertainty and Firms’ Balance Sheet Strength." In Banking and Finance. IntechOpen, 2020. http://dx.doi.org/10.5772/intechopen.91860.

Full text
Abstract:
This chapter aims to elaborate on the relationship between economic uncertainty and the balance sheet strength of nonfinancial firms in the Turkish economy. In order to measure balance sheet strength effectively, we make use of a multivariate indicator, namely the Multivariate Firm Assessment Score (MFA Score), which is a composite index gauging the credit risk of nonfinancial firms quoted on Borsa İstanbul. MFA scores are compared with several uncertainty indicators for the period 2005–2019. Our results suggest that when uncertainties in the global or Turkish economy are high, there is a significant causal relationship from uncertainty indicators to firms' balance sheet strength. More specifically, economic uncertainties negatively affect firms' balance sheet performance in such an environment. Moreover, different types of uncertainty, such as trade policy uncertainty and consumer perceptions about the economy, are found to have differential impacts on exporter and non-exporter firms.
6

López Pérez, Jesús-Fabian, Ana Elena De la Mora, and Rosalba Trevino Reyes. "Clustering for Innovative Business Model Design for Products and Services." In Handbook of Research on Industrial Applications for Improved Supply Chain Performance, 125–48. IGI Global, 2020. http://dx.doi.org/10.4018/978-1-7998-0202-0.ch006.

Full text
Abstract:
The acceleration of technological evolution, customers' demands for agility in operations, and rapid product development require supply chains to become more active and connected in order to serve clients and new markets. The chapter is structured around three business use cases. The first section relates to the aerospace industry. The purpose is to identify the influence of the quality and quantity of the supplier base on the innovation activities of aerospace companies participating in a cluster. The authors applied a framework based on factor analysis and multivariate linear regression to measure the impact of the quality and quantity of a set of suppliers. The second section relates to the operation of microfinance institutions (MFIs). The authors design and propose a full-featured optimization framework based on a mixed integer programming model, and discuss the impact of risk balancing and the merits of the proposed model.
7

"Repeated Measures." In Multivariate Survival Analysis and Competing Risks, 163–84. Chapman and Hall/CRC, 2012. http://dx.doi.org/10.1201/b11893-14.

Full text
8

Li, Yuming, and William T. Ziemba. "Univariate and multivariate measures of risk aversion and risk premiums." In Handbook of the Fundamentals of Financial Decision Making, 333–64. WORLD SCIENTIFIC, 2013. http://dx.doi.org/10.1142/9789814417358_0020.

Full text
9

"Multivariate Static Hedge Designs Using Measure-Distorted Valuations." In Nonlinear Valuation and Non-Gaussian Risks in Finance, 135–49. Cambridge University Press, 2022. http://dx.doi.org/10.1017/9781108993876.012.

Full text
10

Hitaj, Ermal, Chris Lane, Paulomi Mehta, and Rima Turk. "Tailoring IMF-Supported Programs to Fragile and Conflict-Affected States’ Needs." In Macroeconomic Policy in Fragile States, 548–67. Oxford University Press, 2021. http://dx.doi.org/10.1093/oso/9780198853091.003.0018.

Full text
Abstract:
We consider the impact of Fund-supported programs in fragile and conflict-affected states (FCS) on aid flows and what factors determine program success. Using several indicators of fragility in a multivariate setting, we find that the catalytic role of IMF engagement on aid is significant in general and particularly so in fragile states. There is clear evidence that risks are more elevated in FCS due to conflict and political instability. Probit analysis for metrics of program success indicates that programs in FCS have a significantly higher probability of non-completion than non-FCS. We find a strong negative effect of high public debt on program completion. Usage of prior actions is associated with weaker program performance, suggesting that their use occurs in situations that are perceived as more risky. We recommend a greater focus on the drivers of conflict and instability in FCS and consideration of program measures that contribute to better economic outcomes.

Conference papers on the topic "Multivariate risk measure"

1

Malinovskii, V. K. "Risk measures and their application in the regulation of insurance and financial markets." In X-th International School-Seminar "Multivariate statistical analysis, econometrics and simulation of real processes". CEMI RAS, 2021. http://dx.doi.org/10.33276/978-5-8211-0797-8-79-80.

Full text
2

Llanes, Jose Damian, Alejo Viñales, and Juan Juri. "Assisted 3D Model Construction and Facies Propagation in Golfo San Jorge Basin Reservoirs for Modelling EOR." In SPE Improved Oil Recovery Conference. SPE, 2022. http://dx.doi.org/10.2118/209400-ms.

Full text
Abstract:
Abstract Three-dimensional modelling is on the critical path to mapping by-passed oil in the multilayer fluvial systems of the San Jorge Basin. Integrated reservoir modelling teams dedicate a significant amount of time to creating these three-dimensional models to reduce the risk of pursuing chemical injection for enhanced oil recovery. Traditional static reservoir modelling requires an important effort from the geologist to construct the interwell correlation. The objective of this work is to show the implementation of two unsupervised algorithms to automate and assist integrated reservoir modelling. We create multiple possible three-dimensional models of real multilayer static reservoirs and accelerate simulation. The first part of the work obtains the stratigraphic representation of the entire reservoir structure. We use the available lithology well logs, such as spontaneous potential and gamma ray, to automatically identify the permeable and shale rocks, with unbiased interpretation, by their deflection responses in each well for all the target reservoirs. We then construct a graph in which each of the deflections is represented by a node. The edges that join each pair of nodes have an assigned weight depending on the difference in depth and the distance in plan between the nodes. We draw edge weights from a multivariate distribution with interwell distances and dipping angle. We then use an adapted version of the Girvan-Newman algorithm to perform community detection, eliminating nongeological connections and features to find the partition with the greatest modularity. These communities represent the existing correlations between the deflections of the different wells. In the second part of this work, we obtain the facies distribution in the reservoir using one-, two- and three-dimensional Markov chains. We implemented the Jaccard distance to measure the mismatch of geological features and objects between the true synthetic case and the reconstructed model.
With the modified Girvan-Newman algorithm we obtained multiple stratigraphic representations similar to the 3D model created by a geologist. By modelling two incomplete synthetic cross-section cases using Markov-chain propagation of a transition matrix, the reconstruction reveals that we recover 90% of the original case even when we input only 5% of the true data into the model initially. We then tested a very fine real three-dimensional case created by an experienced geologist. The Markov reconstruction algorithm was able to recover up to 60 percent of the true three-dimensional model. The analysis of the reconstructed model features reveals that the Jaccard distance is a reliable indicator of the geological features. Using the computational algorithms implemented, it is possible to obtain a stratigraphic model and the facies model in less than four hours, speeding up reservoir modelling. Core data is sufficient to recover a reasonable model to input to the simulator.
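The facies-propagation step described above rests on Markov chains: each cell's facies is drawn from the transition row of the previous cell's state. A one-dimensional sketch with a made-up three-facies transition matrix (not the paper's calibrated one), including the long-run facies proportions:

```python
import numpy as np

# Hypothetical 1-D facies transition matrix (rows sum to 1).
# States: 0 = sand, 1 = shale, 2 = silt.
P = np.array([[0.6, 0.3, 0.1],
              [0.2, 0.7, 0.1],
              [0.3, 0.3, 0.4]])

def propagate_facies(P, start, n_steps, rng):
    """Propagate a facies column by sampling each state
    from the transition row of the previous state."""
    states = [start]
    for _ in range(n_steps):
        states.append(int(rng.choice(len(P), p=P[states[-1]])))
    return states

rng = np.random.default_rng(seed=0)
column = propagate_facies(P, start=0, n_steps=20, rng=rng)

# Long-run facies proportions: left eigenvector of P for eigenvalue 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi /= pi.sum()
print(column)
print(pi)  # stationary distribution, satisfying pi @ P == pi
```

The paper's two- and three-dimensional chains and its Jaccard-distance mismatch score generalize this, but the per-cell sampling mechanism is the same.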
3

Saczalski, Kenneth J., Mark N. West, Todd K. Saczalski, Luis Frausto, and Mark C. Pozzi. "Test Analysis of Youth and Adult Football Helmet Head Injury Risk Resulting From Repeat Impacts in High Humidity and Temperature." In ASME 2017 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2017. http://dx.doi.org/10.1115/imece2017-70754.

Full text
Abstract:
Design of an optimally safe football helmet system requires an awareness and evaluation of the factors and variables that can adversely affect the impact-attenuating performance of energy absorbing (EA) pad materials needed to minimize transmission of linear and rotational forces applied to the head so that risk of head injury is reduced. For instance, player head sweating can induce high temperatures and moisture within a helmet system (i.e. a Hot-Wet condition), which can result in degradation of helmet EA capacity and cause increased measures of head injury risk levels, which are often used for comparative evaluation of helmet designs. In this study, a "multivariable" experimental method was utilized to demonstrate an efficient means for assessment and comparison of currently representative adult and youth football helmet system designs when subjected to a range of variables that included, among other factors: temperature-moisture effects; impact energy; and repeat impacts. Both quasi-static (QS) compression testing of commonly used EA materials and dynamic impact testing of full helmet systems were conducted, and the results are presented in tables and in graphic form. The EA pad types that were QS tested included: Thermoplastic-Polyurethane (TPU) "waffle shaped" EA pad configurations; load-rate-sensitive "Gel" foam padding; and dual and single density elastomeric foam padding. Dynamic helmet repeat impact tests were conducted by using a pendulum impact test device where various helmet designs were mounted to a Hybrid-III head and neck system and impacted against a non-yielding surface at energy levels of 108 J and 130 J after being subjected to ambient and Hot-Wet conditions. The QS tests showed that a short Hot-Wet soak time of only a few hours noticeably diminished EA levels.
Also, the dynamic full helmet system testing demonstrated that the “Hot-Wet” condition tended to degrade helmet impact attenuation performance such that, depending on the size and type of EA material provided in the crush zone, head injury risk measures tended to increase. Finally, examples of the use and benefits of a “multivariable” experimental method for helmet injury risk assessment, not reported on previously, are provided.
4

Yakti, Fatima alzahra Hasan, Hissa Al-Mannai, Dana Saad, Abdelhamid Kerkadi, Grace Attieh, and Hiba Bawadi. "Clustering of lifestyle risk factors among Algerian adolescents: Comparison between urban and rural area." In Qatar University Annual Research Forum & Exhibition. Qatar University Press, 2021. http://dx.doi.org/10.29117/quarfe.2021.0140.

Full text
Abstract:
Background: Lifestyle behavior risk factors (LBRs) such as sedentary behavior, physical inactivity, smoking, unhealthy eating patterns and being overweight/obese play a major role in the development or prevention of NCDs. Objective: To compare the clustering of LBRs between urban and rural Algerian adolescents; we expect differences in LBRs between urban and rural areas. Design: Data for this cross-sectional study were derived from the GSHS. A self-administered, anonymous questionnaire addressing the LBRs of NCDs was filled out by 4532 adolescents (11–16 years). LBR clustering was measured by the ratio of the observed (O) to the expected (E) prevalence of one or more simultaneously occurring LBRs, for urban and rural areas separately. Multivariate logistic regression was performed to examine the association of LBRs, as the dependent variable, with demographic variables (location, age, gender). Results: The most common LBR was physical inactivity (84.6%: 50.9% for urban and 49.1% for rural). Adolescents in urban areas had a higher prevalence of two (56.8% vs. 43.2%) and of three or more (61.3% vs. 38.7%) LBRs than rural adolescents. In urban areas, a significant positive association was found between (low fruits and vegetables + physical inactivity) [2.06 (1.61-2.64)] and (high SB + smoking) [2.10 (1.54-2.76)], while (physical inactivity + high SB) [0.70 (0.54-0.91)] showed a significant negative association. In rural areas, (high SB + overweight/obesity) [1.49 (1.09-2.04)] had a significant positive association, while (low fruits and vegetables + high SB) [0.75 (0.60-0.94)], (physical inactivity + high SB) [0.65 (0.49-0.86)] and (physical inactivity + smoking) [0.70 (0.49-0.99)] had negative associations. Conclusions: Several socio-demographic factors have been identified as playing a role in LBR clustering among Algerian adolescents. The results of the study suggest developing interventions that tackle several LBRs together rather than focusing on a single LBR.
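The clustering measure used in this study is the ratio of the observed prevalence of co-occurring risk factors to the prevalence expected under independence (the product of the marginal prevalences). A minimal sketch on hypothetical 0/1 data (the shared latent tendency and flip probability below are invented for illustration):

```python
import numpy as np

def oe_ratio(data, factors):
    """Observed/expected prevalence ratio for the joint occurrence of
    the given binary risk factors. E assumes independence, so
    O/E > 1 indicates clustering of the factors."""
    sub = np.asarray(data)[:, factors]
    observed = np.mean(np.all(sub == 1, axis=1))
    expected = np.prod(np.mean(sub == 1, axis=0))
    return float(observed / expected)

# Hypothetical data: 1000 adolescents, 3 binary risk factors driven by
# a shared latent tendency (each factor flips it with probability 0.15).
rng = np.random.default_rng(seed=1)
latent = rng.random(1000) < 0.5
flips = rng.random((1000, 3)) < 0.15
data = (latent[:, None] ^ flips).astype(int)

print(oe_ratio(data, [0, 1]))  # > 1: the two factors cluster
```

On independent factors the ratio is close to 1 by construction, which is what makes O/E a direct clustering diagnostic.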
5

Bitetto, Alessandro, Stefano Filomeni, and Michele Modina. "Can unlisted firms benefit from market information? A data-driven approach." In CARMA 2022 - 4th International Conference on Advanced Research Methods and Analytics. valencia: Universitat Politècnica de València, 2022. http://dx.doi.org/10.4995/carma2022.2022.15045.

Full text
Abstract:
We employ a sample of 10,136 Italian micro-, small-, and mid-sized enterprises (MSMEs) that borrow from 113 cooperative banks to examine whether the market pricing of public firms adds information to accounting measures in predicting the default of private firms. Specifically, we first match MSMEs to the asset prices of listed firms through a data-driven clustering based on a neural network autoencoder, so as to evaluate the firm-wise probability of default (PD) of the MSMEs. We then adopt three statistical techniques, namely linear models, multivariate adaptive regression splines, and random forests, to assess the performance of the models and to explain the relevance of each predictor. Our results provide novel evidence that market information is a crucial indicator for predicting the corporate default of unlisted firms. Indeed, we show a significant improvement in model performance, both on class-specific metrics (the F1-score for the defaulted class) and on overall metrics (AUC), when market information is used in credit risk assessment in addition to accounting information. Moreover, using global and local variable-importance techniques, we show that the increase in performance is effectively attributable to market information, highlighting its relevant effect in predicting corporate default.
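The two evaluation metrics named above can be computed directly from predicted default probabilities. The labels and scores below are made up for illustration (this is not the paper's data or code); AUC is computed via the Mann-Whitney pairwise-ranking identity, and the F1-score is taken for the defaulted class at a fixed cut-off:

```python
import numpy as np

def auc(y_true, scores):
    """AUC via the Mann-Whitney identity: the probability that a
    randomly chosen defaulted firm scores above a randomly chosen
    healthy one (ties count one half)."""
    y = np.asarray(y_true)
    s = np.asarray(scores, dtype=float)
    pos, neg = s[y == 1], s[y == 0]
    wins = (pos[:, None] > neg[None, :]).sum() \
        + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))

def f1_defaulted(y_true, scores, threshold=0.5):
    """F1-score for the defaulted (positive) class at a cut-off."""
    y = np.asarray(y_true)
    pred = (np.asarray(scores, dtype=float) >= threshold).astype(int)
    tp = np.sum((pred == 1) & (y == 1))
    fp = np.sum((pred == 1) & (y == 0))
    fn = np.sum((pred == 0) & (y == 1))
    return 2 * tp / (2 * tp + fp + fn)

y = np.array([0, 0, 1, 1, 0, 1])               # 1 = defaulted
p = np.array([0.1, 0.4, 0.35, 0.8, 0.2, 0.7])  # predicted PDs
print(auc(y, p))           # 8 of the 9 positive-negative pairs are ranked correctly
print(f1_defaulted(y, p))  # tp=2, fp=0, fn=1 -> F1 = 0.8
```

The AUC is threshold-free, whereas the class-specific F1 depends on the chosen cut-off, which is why the paper reports both.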
APA, Harvard, Vancouver, ISO, and other styles
6

Al Ghazali, Kateba, Sana El Tayeb, Ayesha Musleh, Tamara Al-Abdi, and Zumin Shi. "Serum Magnesium and Cognitive Function among Qatari Adult." In Qatar University Annual Research Forum & Exhibition. Qatar University Press, 2020. http://dx.doi.org/10.29117/quarfe.2020.0207.

Full text
Abstract:
Background: Previous studies found that low levels of magnesium can increase the risk of several diseases, including cardiovascular disease, diabetes, and hypertension, which are associated with cognitive dysfunction. Objective: To examine the association between serum magnesium and cognitive function among Qatari adults, and to assess the interaction between low serum magnesium, hypertension, and diabetes in relation to cognitive function. Methods: Data from 1000 Qatari participants aged ≥20 years attending the Qatar Biobank Study were analyzed. Serum magnesium was measured by an automated colorimetric method, and suboptimal magnesium was defined as <0.85 mmol/L. Cognitive function was measured by a computer-based, self-administered test focusing on mean reaction time (MRT). Multivariable linear regression and logistic regression were used. Results: The prevalence of suboptimal magnesium was 57.1%. Across the quartiles of serum magnesium from high to low, the regression coefficients (95% CI) for MRT were 0, -17.79, -18.27, and -31.93 (95% CI 2.38–3.05), respectively (p for trend <0.033). The presence of hypertension and diabetes significantly increased MRT across a wide range of low serum magnesium levels. Women with diabetes or hypertension were affected the most by low magnesium levels. Conclusion: There was a positive association between serum magnesium and cognitive function. Low magnesium levels were associated with a longer MRT.
APA, Harvard, Vancouver, ISO, and other styles
7

Korneeva, Yana, and Natalia Simonova. "The Functional State Assessment as the Psychological Safety Marker of the Offshore Production Platform Workers." In Offshore Technology Conference. OTC, 2021. http://dx.doi.org/10.4043/31262-ms.

Full text
Abstract:
Abstract The present study is devoted to the identification and description of functional states as a psychological safety marker of offshore oil-producing platform workers with a fly-in-fly-out work organization. This makes it possible to identify a risk group of employees with low psychological safety so that measures can be developed to improve it and to preserve their health and work efficiency. The research was carried out by means of a scientific expedition in April 2019 over an entire fly-in-fly-out rotation on an offshore ice-resistant platform in the Caspian Sea. It was attended by 50 employees (average age 36.17 ± 1.064, average fly-in-fly-out work experience 7.97 ± 0.839, fly-in-fly-out period 14 days). The research methods were: 1) instrumental psychophysiological methods for assessing state using the «AngioScan» (stress level) and «Psychophysiologist» (operator performance, functional state level, functional reserves level) devices; 2) psychological testing methods, namely M. Luscher's color test, the "Well-being. Activity. Insistence" questionnaire, and psychological testing of employees' personality traits. Statistical analysis was performed using descriptive statistics and multivariate methods in the SPSS 23.00 software package. As a result of the study, all employees were divided into two large groups according to functional state indicators: 1) a group with high performance, since these employees show optimal speed, high-quality work performance and good well-being; 2) a group with low performance, because despite a favorable general state of health and task-performance speed, these employees show low performance. The relationship between the subjective characteristics of efficiency and safety in the two groups of oil-producing platform employees was studied. It was found that employees with high performance are adapted to the negative environmental impact and are characterized by high psychological safety.
Representatives of the second, low-performance group give higher assessments of the danger of professional situations and are not satisfied with the work schedule; they therefore belong to the risk group and require additional measures to ensure psychological safety. Personal markers of attribution to the groups with different efficiency are independence, a cyclothymic character accentuation type, planning, and the general level of subjective control.
APA, Harvard, Vancouver, ISO, and other styles
8

AlMukdad, Sawsan Ibrahim, Hazem Elewa, and Daoud Al-Badriyeh. "Economic Evaluation of CYP2C19 Genotype-Guided Antiplatelet Therapy Compared to Universal use of Ticagrelor or Clopidogrel in Qatar." In Qatar University Annual Research Forum & Exhibition. Qatar University Press, 2020. http://dx.doi.org/10.29117/quarfe.2020.0170.

Full text
Abstract:
Background: Patients having CYP2C19 loss-of-function alleles and receiving clopidogrel are at higher risk of adverse cardiovascular outcomes. Ticagrelor is a more effective and more expensive antiplatelet that is unaffected by the CYP2C19 polymorphism. The main aim of the current research is to evaluate the cost-effectiveness of CYP2C19 genotype-guided therapy, universal ticagrelor, and universal clopidogrel after a percutaneous coronary intervention (PCI). Methods: A two-part simulation model, including a one-year decision-analytic model and a 20-year follow-up Markov model, was created to follow the use of (i) universal clopidogrel, (ii) universal ticagrelor, and (iii) genotype-guided antiplatelet therapy. Outcome measures were the incremental cost-effectiveness ratio (ICER, cost/success) and the incremental cost-utility ratio (ICUR, cost/quality-adjusted life year [QALY]). Therapy success was defined as survival without myocardial infarction, stroke, cardiovascular death, stent thrombosis, or therapy discontinuation because of adverse events, i.e., major bleeding and dyspnea. The model was based on a multivariate analysis, and a sensitivity analysis confirmed the robustness of the model outcomes. Results: Against universal clopidogrel, genotype-guided therapy was cost-effective over the one-year duration (ICER, USD 6,102/success) and dominant over the long term. Genotype-guided therapy was dominant over universal ticagrelor over the one-year duration and cost-effective over the long term (ICUR, USD 1,383/QALY). Universal clopidogrel was dominant over ticagrelor over the short term and cost-effective over the long term (ICUR, USD 10,616/QALY). Conclusion: CYP2C19 genotype-guided therapy appears to be the preferred antiplatelet strategy, followed by universal clopidogrel and then universal ticagrelor, for post-PCI patients in Qatar.
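The incremental cost-effectiveness logic underlying the ICER and ICUR results above can be sketched in a few lines; the costs and effectiveness values below are invented placeholders, not the study's model inputs.

```python
# Incremental cost-effectiveness ratio (ICER): extra cost per extra unit of
# effect (success, or QALY for an ICUR) when moving from a comparator
# strategy to a new strategy. "Dominant" means cheaper and more effective;
# "dominated" means costlier and no more effective.

def icer(cost_new, eff_new, cost_old, eff_old):
    d_cost = cost_new - cost_old
    d_eff = eff_new - eff_old
    if d_eff > 0 and d_cost <= 0:
        return "dominant"
    if d_eff <= 0 and d_cost >= 0:
        return "dominated"
    return d_cost / d_eff  # cost per additional success (or QALY)

# Hypothetical comparison: genotype-guided therapy vs. universal clopidogrel.
print(icer(cost_new=1500.0, eff_new=0.90, cost_old=1200.0, eff_old=0.85))
```

A computed ratio is then compared against a willingness-to-pay threshold to decide whether the new strategy is "cost-effective", which is how the abstract's verdicts should be read.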
APA, Harvard, Vancouver, ISO, and other styles
9

Schwarz, Aubriana, Patricia Goodhines, Amelia Wedel, Lisa LaRowe, and Aesoon Park. "Sleep-Related Cannabis Expectancies Questionnaire (SR-CEQ): Replication and Psychometric Validation among College Students using Cannabis for Sleep Aid." In 2021 Virtual Scientific Meeting of the Research Society on Marijuana. Research Society on Marijuana, 2022. http://dx.doi.org/10.26828/cannabis.2022.01.000.45.

Full text
Abstract:
Introduction: Emerging evidence suggests that cannabis is commonly used as a sleep aid among college students. Although outcome expectancies have been associated with the progression of cannabis use, sleep-related expectancies have not been included in widely used cannabis expectancy measures. Recently, the Sleep-Related Cannabis Expectancies Questionnaire (SR-CEQ; Goodhines et al., 2020) was developed, and initial evidence for its 2-factor structure was obtained in a general college sample (including non-cannabis users). However, the SR-CEQ's associations with sleep and cannabis use behaviors among cannabis sleep-aid users are unknown. This study aimed to replicate the previous factor structure and test the construct and concurrent validity of the SR-CEQ among college students using cannabis as a sleep aid. Method: Cross-sectional data were drawn from 94 college students reporting at least bimonthly cannabis use for sleep aid. Five multivariate outliers on the SR-CEQ were excluded, resulting in an analytic sample of 89 (Mage=19.92 [SD=1.19; range=18-22]; 66% female; 72% White, 12% Multiracial, 7% Asian, 5% Black or African-American, 1% self-reported Other, and 3% did not disclose; 14% Hispanic/Latinx). Students completed an online survey of sleep and substance use behaviors. A confirmatory factor analysis (CFA) replicated the 2-factor structure (Positive and Negative Sleep-Related Cannabis Expectancies), bivariate correlations tested associations with related constructs (sleep and cannabis use behaviors/beliefs), and independent-samples t-tests further explicated relevant group differences. Results: After dropping item 5 (factor loading <.40), a CFA with a 2-factor structure indicated good fit to the data (χ2(41)=66.76, p=.01; CFI=0.94; SRMR=0.07; RMSEA=0.08 [90% CI=0.05, 0.12]).
Positive sleep-related cannabis expectancies (α=.84) were associated with dysfunctional beliefs about sleep (r=.24, p=.02), but not with insomnia symptoms, poor sleep quality, or frequency of cannabis use (ps>.05). Students who used cannabis more frequently in general (≥36 of 60 days, per median split) reported more positive sleep-related cannabis expectancies (t[86]=1.99, p=.05, Cohen's d=0.42). Negative sleep-related cannabis expectancies (α=.80) were not associated with any cannabis or sleep variables assessed (ps>.05). Negative sleep-related cannabis expectancies were marginally lower among students with greater frequency of general cannabis use (t[87]=-1.89, p=.06, Cohen's d=0.40) and cannabis use for sleep aid (≥3 times/week, per median split; t[87]=-1.87, p=.06, Cohen's d=0.40). Further, greater negative sleep-related cannabis expectancies were reported among male (versus female) students (t[87]=2.30, p=.02, Cohen's d=0.51). Conclusion: Overall, the replicated 2-factor structure showed good fit to the data, and both subscales demonstrated good internal consistency. Although replication is needed, the results suggest that college students using cannabis as a sleep aid may hold less negative sleep-related cannabis expectancies. Positive sleep-related cannabis expectancies were associated with dysfunctional beliefs about sleep, but not with sleep behaviors or cannabis use. These novel findings extend existing knowledge of general, non-sleep-related cannabis expectancies among cannabis users in terms of cannabis use correlates. The findings can help identify at-risk students and modifiable risk factors that can be targeted to minimize harm from cannabis sleep-aid use. Future research among larger samples is needed to (a) assess generalizability to varied populations and (b) clarify the temporal sequencing of potential consequences through longitudinal designs.
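The internal-consistency coefficients reported for the two subscales (Cronbach's α = .84 and .80) are computed from item-level responses as sketched below; the response matrix is invented purely for illustration.

```python
# Cronbach's alpha for a questionnaire subscale: respondents in rows,
# items in columns (illustrative data, not the SR-CEQ sample).

def cronbach_alpha(items):
    k = len(items[0])                      # number of items
    def var(xs):                           # sample variance (ddof = 1)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = [var([row[j] for row in items]) for j in range(k)]
    total_var = var([sum(row) for row in items])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

responses = [
    [4, 5, 4], [2, 2, 3], [5, 4, 5],
    [3, 3, 2], [1, 2, 1], [4, 4, 5],
]
print(round(cronbach_alpha(responses), 2))
```

Values of α around .8 or higher, as in the abstract, are conventionally read as good internal consistency for a subscale.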
APA, Harvard, Vancouver, ISO, and other styles
10

Alsaeedi, Ayesha, Mohamed Mubarak Albadi, Ibrahim Eltony, Noora Al Mahri, Reem Alhammadi, Ammar Al-Ameri, Zeeshan Ahmad, et al. "Novel Direct Multiphase Real Time Wellhead Measurement Using Wet-Gas Coriolis Technology in a Giant Gas Field- Case Study." In ADIPEC. SPE, 2022. http://dx.doi.org/10.2118/211236-ms.

Full text
Abstract:
Abstract One of the key aspects of production optimization and monitoring towards digital transformation is having reliable and accurate data and measurement. This submission demonstrates how smart technology for direct wellhead measurement using a Coriolis meter in a gas field significantly helps the asset in production monitoring and OPEX optimization while accurately regulating field rates. The use of smart technology in direct wellhead measurement provided real-time insight into well production, with continuous, reliable production data enabling Reservoir and Production Engineers to take timely decisions to optimize production, improve mass balance, achieve better reservoir management, and reduce operating cost. The technology also has built-in advanced diagnostics and connectivity features, paving the way for the digitalization of oil fields and remote online access to data. Its high operating range and the flexibility of switching between wells offered ease of adaptability, reduced engineering and commissioning costs, and reduced maintenance and inventory requirements for mass implementation. Prior to scale-up, a trial/pilot was conducted over a period of two months in collaboration between ADNOC TC, the Bab site Production & Instrumentation team, and the technology vendor. First, multiple wells targeting different reservoirs were selected from different operating envelopes. Then, the technology was tested by installing the Wet-Gas Coriolis meter in series with a portable well-testing package, and the well-test measurements from both techniques were compared to evaluate whether the technology provides reliable data relative to the current periodic measurement. Finally, an evaluation report was produced covering the challenges, data verification and validation, and a plan for wide field implementation.
The technology provides real-time insight into, and continuous monitoring of, well production and its profile, resulting in better production management. With direct real-time wellhead measurement, significant CAPEX and OPEX savings can be achieved in addition to optimizing production with a digital-transformation-capable technology. The technology also supports the asset's digitalization and back-allocation initiatives and significantly reduces the need for periodic portable-separator well testing. Permanent installation of the Wet-Gas Coriolis meter on all wells will enable continuous production-monitoring capabilities by increasing the availability of real-time multivariable well data. Continuous data and actionable insights will be used to identify new development opportunities to maximize recovery. Advanced diagnostics capabilities enable predictive maintenance and reduce operating costs, improve reservoir and well modeling and production management, require no intervention cost, and enhance HSE measures. Well behaviors change over time and cannot be accurately predicted, especially for depleted wells. The current method of relying on intermittent well-testing data is insufficient, expensive, and labor-intensive, and it increases safety risk. Continuous real-time insight into production is essential for optimizing production, improving mass balance, achieving better reservoir management, and reducing operating cost. The implementation in 60 wells has a value proposition of $5.98 M over a period of 5 years.
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Multivariate risk measure"

1

Patston, L. L. M., A. N. Henry, M. McEwen, J. Mannion, and L. A. Ewens-Volynkina. Thinking While Standing: An exploratory study on the effect of standing on cognitive performance. Unitec ePress, September 2017. http://dx.doi.org/10.34074/ocds.32017.

Full text
Abstract:
Sedentary behaviour is extremely prevalent in Western societies and is significantly associated with an elevated risk of all-cause mortality that cannot be mitigated by physical activity. The introduction of standing desks into the workplace offers a solution to this inactivity, but there has been limited investigation of the effects of standing on cognition, which is a major consideration in much office-based work. In this study we aimed to provide an exploratory investigation of the effect of standing while working on cognitive performance. We tested 30 office-based adults on a battery of 19 cognitive tasks (tapping five cognitive domains) in a randomised, repeated-measures crossover design study. Two conditions (standing versus sitting) were investigated over two 7.5-hour work days, including morning, midday and afternoon sessions (Time of Day). Effects were analysed using multivariate two-way repeated-measures ANOVAs (Condition by Time of Day) for the five cognitive domains. Overall, after correcting for multiple comparisons, there were no differences in performance between sitting and standing. At an uncorrected level, however, significant effects of Condition were found in three of the 19 tasks, with all demonstrating better performance while standing. Importantly, these results suggest that there is no detriment to cognitive performance from standing. They also provide an initial indication that there may be cognitive benefits of standing in the attention and working-memory domains, which may be a promising avenue for future inquiry.
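The multiple-comparison correction mentioned above, under which the three uncorrected effects no longer reach significance, can be illustrated with a simple Bonferroni adjustment across the 19 tasks; the p-values below are hypothetical, not the study's.

```python
# Bonferroni correction: with m tests, a result is significant only if
# p < alpha / m, so effects that pass the uncorrected alpha = .05 threshold
# can disappear after correction (illustrative p-values, not the study's).

def bonferroni_significant(p_values, alpha=0.05):
    m = len(p_values)
    return [p < alpha / m for p in p_values]

p_values = [0.012, 0.030, 0.048] + [0.20] * 16   # 19 tasks in total
uncorrected = [p < 0.05 for p in p_values]
corrected = bonferroni_significant(p_values)
print(sum(uncorrected), sum(corrected))  # 3 uncorrected hits, 0 survive
```

With 19 tasks the corrected threshold is 0.05/19 ≈ 0.0026, which mirrors the abstract's pattern of uncorrected effects that vanish once the correction is applied.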
APA, Harvard, Vancouver, ISO, and other styles