Theses on the topic « Risk decomposition »
Browse the 46 best dissertations for your research on the topic « Risk decomposition ».
Surucu, Oktay. « Decomposition Techniques In Energy Risk Management ». Master's thesis, METU, 2005. http://etd.lib.metu.edu.tr/upload/12606552/index.pdf.
Gérard, Henri. « Stochastic optimization problems : decomposition and coordination under risk ». Thesis, Paris Est, 2018. http://www.theses.fr/2018PESC1111/document.
We consider stochastic optimization and game theory problems with risk measures. In a first part, we focus on time consistency. We begin by proving an equivalence between time consistent mappings and the existence of a nested formula. Motivated by well-known examples in risk measures, we investigate three classes of mappings: translation invariant, Fenchel-Moreau transform and supremum mappings. Then, we extend the concept of time consistency to player consistency, by replacing the sequential time by any unordered set and mappings by any relations. Finally, we show how player consistency relates to sequential and parallel forms of decomposition in optimization. In a second part, we study how risk measures impact the multiplicity of equilibria in dynamic game problems in complete and incomplete markets. We design an example where the introduction of risk measures leads to the existence of three equilibria instead of one in the risk neutral case. We analyze the ability of two different algorithms to recover the different equilibria. We discuss links between player consistency and equilibrium problems in games. In a third part, we study distributionally robust optimization in machine learning. Using convex risk measures, we provide a unified framework and propose an adapted algorithm covering three ambiguity sets discussed in the literature.
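As an illustration of the nested formulas mentioned in this abstract, a dynamic risk measure is commonly called time consistent when it can be evaluated by backward composition of one-step conditional risk mappings (our notation, not necessarily the thesis's own):

```latex
% Time consistency via a nested formula: the risk of a terminal position X
% over [0,T] equals the backward composition of one-step mappings \rho_t.
\rho_{0,T}(X) \;=\; \rho_0\Bigl(\rho_1\bigl(\cdots \rho_{T-1}(X)\cdots\bigr)\Bigr)
```

Replacing the time index set by an arbitrary unordered set of players is what the abstract calls player consistency.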
Amaxopoulos, Fotios. « Hedge funds : risk decomposition, replication and the disposition effect ». Thesis, Imperial College London, 2011. http://hdl.handle.net/10044/1/6935.
Voßmann, Frank. « Decision weights in choice under risk and uncertainty : measurement and decomposition / ». [S.l. : s.n.], 2004. http://www.gbv.de/dms/zbw/490610218.pdf.
Kagaruki-Kakoti, Generosa. « An Economic Analysis of School and Labor Market Outcomes For At-Risk Youth ». Digital Archive @ GSU, 2005. http://digitalarchive.gsu.edu/econ_diss/6.
Texte intégralColletta, Renato Dalla. « Cash flow and discount rate risk decomposition and ICAPM for the US and brazilian stock markets ». reponame:Repositório Institucional do FGV, 2013. http://hdl.handle.net/10438/10566.
Texte intégralApproved for entry into archive by Vera Lúcia Mourão (vera.mourao@fgv.br) on 2013-02-27T21:01:52Z (GMT) No. of bitstreams: 1 Tese MPFE Renato D Colletta.pdf: 1766902 bytes, checksum: 4daf523c0cf56d0533692bcd81b813db (MD5)
Made available in DSpace on 2013-02-27T21:05:31Z (GMT). No. of bitstreams: 1 Tese MPFE Renato D Colletta.pdf: 1766902 bytes, checksum: 4daf523c0cf56d0533692bcd81b813db (MD5) Previous issue date: 2013-01-31
This work applies the intertemporal asset pricing model developed by Campbell (1993) and Campbell and Vuolteenaho (2004) to the Brazilian 2x3 Fama-French stock portfolios from January 2003 to April 2012 and to the US 5x5 Fama-French portfolios in different time periods. The variables suggested by Campbell and Vuolteenaho (2004) to forecast US market excess returns from 1929 to 2001 were also good excess return predictors for the Brazilian market in the recent period, except the term structure yield spread. However, we found that an increase in the small stock value spread predicts a higher market excess return, which is not consistent with the intertemporal model explanation for the value premium. Moreover, using the residuals of the forecasting VAR to define the test portfolios' cash flow and discount rate shock risk sensitivity, we found that the resulting intertemporal model explains little of the variance in the cross section of returns. For the US market, we conclude that the proposed variables' ability to forecast market excess returns is not constant in time. Campbell and Vuolteenaho's (2004) success in explaining the value premium for the US market in the 1963 to 2001 sub-sample is a result of the VAR specification in the full sample, since we show that none of the variables are statistically significant return predictors in this sub-sample.
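For reference, the cash-flow/discount-rate risk split used in this line of work rests on the Campbell (1991) log-linear decomposition of unexpected returns (standard notation; $\rho$ is a log-linearization constant close to one):

```latex
r_{t+1} - \mathrm{E}_t\, r_{t+1}
  \;=\; (\mathrm{E}_{t+1} - \mathrm{E}_t)\sum_{j=0}^{\infty} \rho^{j}\,\Delta d_{t+1+j}
  \;-\; (\mathrm{E}_{t+1} - \mathrm{E}_t)\sum_{j=1}^{\infty} \rho^{j}\, r_{t+1+j}
  \;=\; N_{CF,t+1} - N_{DR,t+1}
```

A test portfolio's risk is then attributed to its sensitivity to cash-flow news $N_{CF}$ ("bad beta") and to discount-rate news $N_{DR}$ ("good beta"), with both news terms extracted from the residuals of the forecasting VAR.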
Soberanis, Policarpio Antonio. « Risk optimization with p-order conic constraints ». Diss., University of Iowa, 2009. https://ir.uiowa.edu/etd/437.
RROJI, EDIT. « Risk attribution and semi-heavy tailed distributions ». Doctoral thesis, Università degli Studi di Milano-Bicocca, 2013. http://hdl.handle.net/10281/49833.
Iucci, Alessandro. « Explainable Reinforcement Learning for Risk Mitigation in Human-Robot Collaboration Scenarios ». Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-296162.
Reinforcement Learning (RL) algorithms are highly popular in the robotics field for solving complex problems, learning from dynamic environments and generating optimal outcomes. However, one of the main limitations of RL is the lack of model transparency, including the inability to explain the underlying process (algorithm or model) that generated a given output. Explainability becomes even more important when the output of an RL algorithm influences human decisions, for example in Human-Robot Collaboration (HRC) scenarios where safety requirements must be met. This work focuses on the use of two explainability techniques, "Reward Decomposition" and "Autonomous Policy Explanation", applied to an RL algorithm that is the core of a risk-mitigation module for the operation of collaborative robots in an automated warehouse. Reward Decomposition gives insight into which factors influenced the robot's choice by breaking the reward function down into smaller sub-functions; it also makes it possible to formulate a Minimal Sufficient Explanation (MSX), a set of relevant reasons for each decision taken during the robot's operation. The second technique, Autonomous Policy Explanation, provides a general perspective on the robot's behaviour by letting human users ask the robot questions, which also gives insight into the decision guidelines embedded in the robot's policy. Since the synthesized policy descriptions and query answers are in natural language, this facilitates algorithm diagnosis even for non-expert users. The results showed an improvement of the RL algorithm, which now selects more evenly distributed actions; moreover, a complete policy of the robot's decisions was produced that mostly matched expectations. The report provides an analysis of the results of applying both techniques, which showed that both led to increased transparency in the robot's decision-making process.
The explanation methods not only provided confidence in the robot's choices, which proved to be among the optimal ones in most cases, but also made it possible to find weaknesses in the robot's policy, making them a useful tool for debugging purposes.
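A minimal sketch of reward decomposition and an MSX-style explanation, with made-up component names and Q-values (not the thesis's actual module): the total Q-value is the sum of per-component Q-values, and the MSX is the smallest set of components whose advantage for the chosen action outweighs the total disadvantage of the remaining components.

```python
import numpy as np

# Hypothetical per-component Q-values for two candidate actions (assumed numbers).
q_components = {
    "collision_avoidance": np.array([0.9, 0.2]),
    "task_progress":       np.array([0.3, 0.6]),
    "energy_cost":         np.array([0.1, 0.1]),
}
q_total = sum(q_components.values())     # reward decomposition: Q = sum of components
best = int(np.argmax(q_total))           # action the policy picks
alt = int(np.argmin(q_total))            # action we explain it against

# MSX: smallest set of components whose advantage for `best` over `alt`
# exceeds the total disadvantage contributed by the negative components.
advantage = {k: float(v[best] - v[alt]) for k, v in q_components.items()}
disadvantage = -sum(a for a in advantage.values() if a < 0)
msx, acc = [], 0.0
for name, a in sorted(advantage.items(), key=lambda kv: -kv[1]):
    if acc > disadvantage or a <= 0:
        break
    msx.append(name)
    acc += a
print(best, msx)   # → 0 ['collision_avoidance']
```

Here the single component "collision_avoidance" already justifies the choice, which is the kind of compact reason set the abstract describes.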
Berg, Simon, and Victor Elfström. « IRRBB in a Low Interest Rate Environment ». Thesis, KTH, Matematisk statistik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-273589.
Financial institutions are exposed to several types of risk. One risk that can have a large impact is interest rate risk in the banking book (IRRBB). In 2018, the European Banking Authority (EBA) released a framework regarding IRRBB intended to ensure that institutions make adequate risk calculations. This paper proposes an IRRBB model that follows the EBA framework. The framework includes, among other things, a deterministic stress test of the risk-free yield curve; in addition, two types of stochastic stress tests of the yield curve were performed. The results show that the deterministic stress tests produce the largest risk figures, but that their outcomes are considered less likely to occur than the outcomes generated by the stochastic models. It is also shown that the EBA's proposed stress model could be better adapted to the low interest rate environment we are currently in. Finally, there is a discussion of the need for a more standardized framework to clarify, both for the institutions themselves and for supervisory authorities, which risks institutions are exposed to.
Andronoudis, Dimos. « Essays on risk, stock return volatility and R&D intensity ». Thesis, University of Exeter, 2015. http://hdl.handle.net/10871/21278.
Wang, Yi. « Default risk in equity returns an industrial and cross-industrial study / ». Cleveland, Ohio : Cleveland State University, 2009. http://rave.ohiolink.edu/etdc/view?acc_num=csu1251906476.
Abstract. Title from PDF t.p. (viewed on Sept. 8, 2009). Includes bibliographical references (p. 149-154). Available online via the OhioLINK ETD Center and also available in print.
Souza, Enio Bonafé Mendonça de. « Mensuração e evidenciação contábil do risco financeiro de derivativos ». Universidade de São Paulo, 2015. http://www.teses.usp.br/teses/disponiveis/12/12136/tde-05032015-182918/.
The great advantage of accounting as a science of financial control is the set of techniques that guarantee the integrity of the information presented, primarily through the debit/credit identity. This thesis presents a new way to recognize and record derivatives that preserves accounting principles while providing a much clearer and more precise estimate of the financial risks carried in the balance sheet items. An accounting decomposition is performed on derivative transactions by splitting each of them into an asset and a liability, the difference being the fair value of the transaction. Subsequently, a risk decomposition opens up assets and liabilities into their primitive risk factors, highlighting the exposures to each type of risk. Finally, a global reaggregation of all the decompositions by risk factor generates the SFR (Statement of Financial Risks), which synthetically shows the exposures to all risks involved in the transactions carried in the balance sheet. It is shown how the SFR makes the effectiveness of hedges applied on the balance sheet clearer, both for internal management and for external users, and makes evident the amount of exposure to each market risk factor. The greatest advantage of this procedure is that the risk exposures of derivatives are obtained automatically and directly reconciled with the accounting records.
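A hypothetical sketch of the accounting decomposition described above, for a single long forward contract (simplified, no dividends; the function name and numbers are illustrative, not from the thesis): the transaction is split into an asset leg (value to be received) and a liability leg (present value to be delivered), and the difference is the fair value, so the debit/credit identity holds by construction.

```python
import math

def decompose_forward(spot, strike, r, t):
    """Split a long forward into an asset leg and a liability leg (simplified)."""
    asset = spot                           # PV of the underlying to be received
    liability = strike * math.exp(-r * t)  # PV of the cash to be delivered
    fair_value = asset - liability         # net amount recognised on the balance sheet
    return {"asset": asset, "liability": liability, "fair_value": fair_value}

legs = decompose_forward(spot=100.0, strike=95.0, r=0.05, t=1.0)
# debit/credit identity preserved: asset = liability + fair value
assert abs(legs["asset"] - (legs["liability"] + legs["fair_value"])) < 1e-9
```

Each leg can then be mapped to its primitive risk factors (equity price for the asset leg, interest rate for the liability leg) before reaggregating into an SFR-style statement.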
Motyka, Matt. « Risk measurement of mortgage-backed security portfolios via principal components and regression analyses ». Link to electronic thesis, 2003. http://www.wpi.edu/Pubs/ETD/Available/etd-0429103-231210.
Keywords: portfolio risk decomposition; principal components regression; principal components analysis; mortgage-backed securities. Includes bibliographical references (p. 88-89).
Berg, Edvin, and Karl Wilhelm Lange. « Enhancing ESG-Risk Modelling - A study of the dependence structure of sustainable investing ». Thesis, KTH, Matematisk statistik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-266378.
Interest in sustainable investing has increased considerably in recent years. Fund managers and institutional investors are urged by their stakeholders to invest more sustainably, which shrinks the managers' investment universe. This thesis finds that sustainable investments have a dependence structure distinct from the regional markets in Europe and North America, but not in the Asia-Pacific region. The largest drawdowns of a sustainable portfolio have historically been smaller than those of a random market portfolio, especially in Europe and North America.
Celidoni, Martina. « Essays on vulnerability to poverty and inequality ». Doctoral thesis, Università degli studi di Padova, 2012. http://hdl.handle.net/11577/3422143.
In light of the recent report of the Commission on the Measurement of Economic Performance and Social Progress (CMEPSP), whose members included Joseph Stiglitz, Amartya Sen and Jean-Paul Fitoussi, statistical indicators are important for the design and evaluation of public policies in terms of social progress (Stiglitz, Sen, and Fitoussi, 2009). The main objective of this thesis is the analysis of the information provided by indices of vulnerability to poverty and inequality. The first chapter empirically compares the individual measures of vulnerability to poverty proposed in the literature. The aim is to understand which index is the most accurate in predicting poverty, so that it can be used as a source of information for public policy. The Receiver Operating Characteristic (ROC) curve and the Pearson and Spearman correlation coefficients are used as accuracy criteria. Using data from the British Household Panel Survey (BHPS), the German Socio-Economic Panel (SOEP) and the Survey on Household Income and Wealth (SHIW), the results show that two categories of indices can be identified, high- and low-performers; among the former, the index proposed by Dutta, Foster, and Mishra (2011) is the most accurate in identifying the future poor. The second chapter applies a non-parametric decomposition of the Foster-Greer-Thorbecke poverty index to individual vulnerability to poverty. This approach shows how poverty risk can be expressed as a function of expected incidence, expected intensity and expected downside variability. The proposed decomposition is useful for risk-management policies because of the information it provides on the characteristics of poverty risk. The chapter includes two empirical illustrations with data from the British Household Panel Survey and the Survey on Household Income and Wealth.
The third chapter focuses on inequality indices. According to Atkinson (1971), the inequality attributable to age is irrelevant if interest centres on the long-run (lifetime) distribution of income and wealth. In this regard, the third chapter proposes inequality measures based on net wealth and corrected for the effect of demographic changes in the Italian population between 1991 and 2008. Using data from the Bank of Italy's Survey on Household Income and Wealth, the results confirm what has already been observed in the literature: demographic adjustments are not decisive for the dynamics of net wealth inequality.
Fausch, Jürg. « Essays on Financial Markets and the Macroeconomy ». Doctoral thesis, Stockholms universitet, Nationalekonomiska institutionen, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-140151.
Wang, Nancy. « Spectral Portfolio Optimisation with LSTM Stock Price Prediction ». Thesis, KTH, Matematisk statistik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-273611.
The Nobel Prize-winning Modern Portfolio Theory (MPT) is undoubtedly one of the most successful investment models in finance and investment strategy. MPT assumes that investors are risk-averse and approximates risk exposure by the variance of the assets' returns. The key to successful portfolio management is therefore good risk estimates and good predictions of asset prices. Risk estimation is usually done with traditional pricing models that allow risk to vary in time, but not in the frequency domain; among other things, this limitation contributes to larger errors in risk estimation. To address this, interest in applying spectral analysis to financial time series has grown in recent years. In particular, the recently introduced spectral portfolio theory and spectral factor model have demonstrated improved portfolio performance through spectral risk estimation [1][11]. Meanwhile, stock price prediction has long been a major challenge because of its non-linear and non-stationary properties, while machine learning has been used to solve otherwise intractable tasks; recent studies have shown significant results in stock price prediction using artificial LSTM neural networks [6][34]. This thesis investigates the combined effect of these two advances in a portfolio optimisation problem by optimising a spectral portfolio with future returns predicted by an LSTM neural network. The work begins with mathematical derivations and a theoretical introduction, and then studies the portfolio performance generated by spectral risk, by the LSTM stock price prediction, and by a combination of the two. The results show that the LSTM prediction alone performed better than the combination, which in turn performed better than spectral risk estimation alone.
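An illustrative sketch (not the thesis's actual pipeline) of how predicted returns and a risk estimate combine in a portfolio optimisation step: given predicted returns mu_hat (e.g. from an LSTM) and a covariance estimate Sigma (e.g. from a spectral method), the classical unconstrained mean-variance weights are proportional to Sigma^{-1} mu_hat; all numbers below are assumptions.

```python
import numpy as np

def mean_variance_weights(mu_hat, sigma):
    """Unconstrained mean-variance weights, normalised to sum to 1."""
    raw = np.linalg.solve(sigma, mu_hat)  # solves sigma @ raw = mu_hat
    return raw / raw.sum()

mu_hat = np.array([0.08, 0.05, 0.03])        # predicted returns (assumed)
sigma = np.array([[0.10, 0.02, 0.01],        # covariance estimate (assumed)
                  [0.02, 0.08, 0.02],
                  [0.01, 0.02, 0.06]])
w = mean_variance_weights(mu_hat, sigma)
assert abs(w.sum() - 1.0) < 1e-12            # fully invested portfolio
```

In this toy case the asset with the highest predicted return and modest variance gets the largest weight; the thesis's contribution lies in how mu_hat and sigma are estimated, not in this textbook final step.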
Leclere, Vincent. « Contributions to decomposition methods in stochastic optimization ». Thesis, Paris Est, 2014. http://www.theses.fr/2014PEST1092/document.
Stochastic optimal control addresses sequential decision-making under uncertainty. As applications lead to large-scale optimization problems, we rely on decomposition methods to tackle their mathematical analysis and their numerical resolution. We distinguish two forms of decomposition. In chained decomposition, like Dynamic Programming, the original problem is solved by means of successive smaller subproblems, solved one after the other. In parallel decomposition, like Progressive Hedging, the original problem is solved by means of parallel smaller subproblems, coordinated and updated by a master algorithm. In the first part of this manuscript, Dynamic Programming: Risk and Convexity, we focus on chained decomposition; we address the well-known time decomposition that constitutes Dynamic Programming with two questions. In Chapter 2, we extend the traditional additive-in-time and risk-neutral setting to more general ones for which we establish time consistency. In Chapter 3, we prove a convergence result for the Stochastic Dual Dynamic Programming algorithm in the case where (convex) cost functions are no longer polyhedral. Then, we turn to parallel decomposition, especially decomposition methods obtained by dualizing constraints (spatial or non-anticipative). In the second part of this manuscript, Duality in Stochastic Optimization, we first point out that such constraints lead to delicate duality issues (Chapter 4). We establish a duality result in the pairing $(\mathrm{L}^\infty, \mathrm{L}^1)$ in Chapter 5. Finally, in Chapter 6, we prove the convergence of the Uzawa algorithm in $\mathrm{L}^\infty(\Omega, \mathcal{F}, \mathbb{P}; \mathbb{R}^n)$. The third part of this manuscript, Stochastic Spatial Decomposition Methods, is devoted to the so-called Dual Approximate Dynamic Programming algorithm.
In Chapter 7, we prove that a sequence of relaxed optimization problems, in which almost sure constraints are replaced by weaker conditional expectation constraints and the corresponding $\sigma$-fields converge, epiconverges to the original problem. In Chapter 8, we give theoretical foundations and interpretations of the Dual Approximate Dynamic Programming algorithm.
Alais, Jean-Christophe. « Risque et optimisation pour le management d'énergies : application à l'hydraulique ». Thesis, Paris Est, 2013. http://www.theses.fr/2013PEST1071/document.
Hydropower is the main renewable energy produced in France. It provides both an energy reserve and a flexibility of great interest in a context of penetration of intermittent sources in the production of electricity. Its management raises difficulties stemming from the number of dams, from uncertainties in water inflows and prices, and from multiple uses of water. This PhD thesis was carried out in partnership with Électricité de France and addresses two hydropower management issues, modeled as stochastic dynamic optimization problems. The manuscript is divided in two parts. In the first part, we consider the management of a hydroelectric dam subject to a so-called tourist constraint, which ensures that a given minimum dam stock level is respected during the summer months with a prescribed probability. We propose different original modelings, provide corresponding numerical algorithms, and present numerical results that highlight the problem from various angles useful to dam managers. In the second part, we focus on the management of a cascade of dams. We present the approximate decomposition-coordination algorithm called Dual Approximate Dynamic Programming (DADP). We show how to decompose an original (large-scale) problem into smaller subproblems by dualizing the spatial coupling constraints. On a three-dam instance, we are able to compare the results of DADP with the exact solution (obtained by dynamic programming); we obtain approximate gains that are within a few percent of the optimum, with interesting running times. The conclusions we arrived at offer encouraging perspectives for the stochastic optimization of large-scale problems.
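Schematically, the spatial decomposition by dualization that DADP builds on can be written as follows (our notation: $x_i$ is the control of dam $i$, and the coupling constraint links the dams of the cascade):

```latex
\min_{x_1,\dots,x_N} \sum_{i=1}^{N} f_i(x_i)
\quad \text{s.t.} \quad \sum_{i=1}^{N} \theta_i(x_i) = 0
\qquad\Longrightarrow\qquad
L(x,\lambda) = \sum_{i=1}^{N} \Bigl( f_i(x_i) + \bigl\langle \lambda,\, \theta_i(x_i) \bigr\rangle \Bigr)
```

For a fixed multiplier $\lambda$, the Lagrangian splits into $N$ independent subproblems, one per dam, each solvable by dynamic programming; a master step then updates $\lambda$ to coordinate them.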
Moataz, Fatima Zahra. « Vers des réseaux optiques efficaces et tolérants aux pannes : complexité et algorithmes ». Thesis, Nice, 2015. http://www.theses.fr/2015NICE4077/document.
We study in this thesis optimization problems with applications in optical networks. The problems we consider are related to fault tolerance and efficient resource allocation, and the results we obtain mainly concern the computational complexity of these problems. The first part of this thesis is devoted to finding paths and disjoint paths. Finding a path is crucial in all types of networks in order to set up connections, and finding disjoint paths is a common approach used to provide some degree of protection against failures in networks. We study these problems under different settings. We first focus on finding paths and node- or link-disjoint paths in networks with asymmetric nodes, which are nodes with restrictions on their internal connectivity. Afterwards, we consider networks with star Shared Risk Link Groups (SRLGs), which are groups of links that might fail simultaneously due to a localized event. In these networks, we investigate the problem of finding SRLG-disjoint paths. The second part of this thesis focuses on the problem of Routing and Spectrum Assignment (RSA) in Elastic Optical Networks (EONs). EONs are proposed as the new generation of optical networks and aim at an efficient and flexible use of optical resources. RSA is the key problem in EONs; it deals with allocating resources to requests under multiple constraints. We first study the static version of RSA in tree networks. Afterwards, we examine a dynamic version of RSA in which a non-disruptive spectrum defragmentation technique is used. Finally, we present in the appendix another problem that was studied during this thesis.
Dibaji, Seyed Ahmad Reza. « Nonlinear Derating of High Intensity Therapeutic Ultrasound Beams using Decomposition of Gaussian Mode ». University of Cincinnati / OhioLINK, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1458900246.
Joshi, Neekita. « ASSESSING THE SIGNIFICANCE OF CLIMATE VARIABILITY ON GROUNDWATER RISE AND SEA LEVEL CHANGES ». OpenSIUC, 2021. https://opensiuc.lib.siu.edu/dissertations/1908.
Texte intégralPilon, Rémi. « Dynamique du système racinaire de l'écosystème prairial et contribution au bilan de carbone du sol sous changement climatique ». Phd thesis, Université Blaise Pascal - Clermont-Ferrand II, 2011. http://tel.archives-ouvertes.fr/tel-00673439.
Puzyrova, Polina. « The algorithm for constructing a decomposition matrix of innovative risks : the degree of their influence on innovation potential for integrated business structures in dynamic conditions of modern development ». Thesis, Perfect Publishing, Vancouver, Canada, 2021. https://er.knutd.edu.ua/handle/123456789/19280.
Texte intégralNosjean, Nicolas. « Management et intégration des risques et incertitudes pour le calcul de volumes de roches et de fluides au sein d’un réservoir, zoom sur quelques techniques clés d’exploration Integrated Post-stack Acoustic Inversion Case Study to Enhance Geological Model Description of Upper Ordovicien Statics : from imaging to interpretation pitfalls and an efficient way to overcome them Improving Upper Ordovician reservoir characterization - an Algerian case study Tracking Fracture Corridors in Tight Gas Reservoirs : An Algerian Case Study Integrated sedimentological case study of glacial Ordovician reservoirs in the Illizi Basin, Algeria A Case Study of a New Time-Depth Conversion Workflow Designed for Optimizing Recovery Proper Systemic Knowledge of Reservoir Volume Uncertainties in Depth Conversion Integration of Fault Location Uncertainty in Time to Depth Conversion Emergence of edge scenarios in uncertainty studies for reservoir trap analysis Enhancing geological model with the use of Spectral Decomposition - A case study of a prolific stratigraphic play in North Viking Graben, Norway Fracture corridor identification through 3D multifocusing to improve well deliverability, an Algerian tight reservoir case study Geological Probability Of Success Assessment for Amplitude-Driven Prospects, A Nile Delta Case Study ». Thesis, université Paris-Saclay, 2020. http://www.theses.fr/2020UPASS085.
In the last 20 years, I have been conducting various research projects focused on the management of risks and uncertainties in the petroleum exploration domain. The various research projects detailed in this thesis deal with problematics located throughout the whole Exploration and Production chain, from seismic acquisition and processing to the optimal placement of exploration and development wells. Focus is placed on geophysical risks and uncertainties, where these problematics are the most pronounced and, paradoxically, the least worked on in the industry. We can subdivide my research projects into three main axes, which follow the hydrocarbon exploration process, namely: seismic processing; seismic interpretation through integration with various well data; and finally the analysis and extraction of key uncertainties, which form the basis for the optimal calculation of in-place and recoverable volumes, in addition to the associated risk analysis on a given target structure. The various research projects detailed in this thesis have been applied successfully on operational North Africa and North Sea projects. After introducing risk and uncertainty notions, we detail the exploration process and its key links with these issues. I then present four major research projects with their theoretical aspects and an applied case study on an Algerian asset.
Dakkoune, Amine. « Méthodes pour l'analyse et la prévention des risques d'emballement thermique Zero-order versus intrinsic kinetics for the determination of the time to maximum rate under adiabatic conditions (TMR_ad) : application to the decomposition of hydrogen peroxide Risk analysis of French chemical industry Fault detection in the green chemical process : application to an exothermic reaction Analysis of thermal runaway events in French chemical industry Early detection and diagnosis of thermal runaway reactions using model-based approaches in batch reactors ». Thesis, Normandie, 2019. http://www.theses.fr/2019NORMIR30.
The history of accidental events in the chemical industry shows that their human, environmental and economic consequences are often serious. This thesis aims at proposing an approach for fault detection and diagnosis in chemical processes in order to prevent such accidental events. A preliminary study serves to identify the major causes of chemical industrial events based on experience feedback. In France, according to the ARIA database, 25% of the events are due to thermal runaway caused by human error. It is therefore appropriate to develop a method for early detection and diagnosis of faults leading to thermal runaway. For that purpose, we develop an approach that uses dynamic thresholds for detection and collects measurements for diagnosis. The localization of faults is based on a classification of the statistical characteristics of the temperature according to several defective modes. A multiset of linear classifiers and binary decision diagrams indexed with respect to time are used for that purpose. Finally, the synthesis of peroxyformic acid in batch and semi-batch reactors is considered to validate the proposed method, first by numerical simulations and then by experiments. Fault detection performance proved satisfactory, and the classifiers achieved a high fault-isolability rate.
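A generic sketch of dynamic-threshold fault detection of the kind described above (illustrative only, not the thesis's exact scheme; window size, gain and temperatures are assumptions): an alarm is raised when the reactor temperature leaves a band built from the rolling mean and standard deviation of recent samples.

```python
import statistics
from collections import deque

def detect_faults(temps, window=5, k=3.0):
    """Return indices where a sample exceeds mean + k*std of the previous `window` samples."""
    history, alarms = deque(maxlen=window), []
    for i, t in enumerate(temps):
        if len(history) == window:
            mu = statistics.fmean(history)
            sd = statistics.pstdev(history)
            if t > mu + k * sd + 1e-9:   # epsilon guards the zero-variance case
                alarms.append(i)
        history.append(t)                # threshold adapts as the process evolves
    return alarms

temps = [20.0, 20.1, 19.9, 20.0, 20.1, 20.0, 25.0, 30.0]
print(detect_faults(temps))  # → [6, 7]
```

Because the threshold tracks recent behaviour, slow normal drifts widen the band while an abrupt temperature excursion, the signature of an incipient runaway, is flagged early; the classification of the flagged samples into defective modes is then a separate diagnosis step.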
Riahi, Hassen. « Analyse de structures à dimension stochastique élevée : application aux toitures bois sous sollicitation sismique ». Phd thesis, Université Blaise Pascal - Clermont-Ferrand II, 2013. http://tel.archives-ouvertes.fr/tel-00881187.
Ozcan, Berkay. « The effects of marital transitions and spousal characteristic on economic outcomes ». Doctoral thesis, Universitat Pompeu Fabra, 2008. http://hdl.handle.net/10803/7251.
Texte intégralEsta tesis tiene el objetivo de ampliar y perfeccionar nuestra comprensión de por qué y cómo la dinámica de pareja afecta cuatro críticos resultados económicos que están directamente realacionados con la desigualdad y la estratificación. Estos resultados son, respectivamente; ser autónomo, la oferta de trabajo, el ahorro de los hogares y la distribución del ingreso. A lo largo de la tesis, con la dinámica de pareja, concibo dos conceptos: en primer lugar implica formar parte de una pareja (es decir, tener una esposa/o con ciertas características) versus ser soltero/a y transiciones entre estos dos estados. Y la segunda se refiere a los cambios en el comportamiento de los esposos debido a un cambio de contexto, como un aumento en el riesgo de disolución de la pareja. Por consiguiente, analiza las implicaciones de estos dos conceptos en cada una de estas variables económicas. La tesis se utiliza una serie de grandes conjuntos de datos longitudinales de diferentes países (p.e. PSID, GSOEP, PHCE, Living in Ireland Survey) y estratégias econométricas. Estas características incluyen el análisis de supervivencia, las estimaciones de diff-en-diff, simulaciones y descomposiciones.
Карчева, Ганна Тимофіївна, et Hanna Karcheva. « Забезпечення ефективного функціонування та розвитку банківської системи України ». Thesis, Університет банківської справи Національного банку України, 2013. http://ir.kneu.edu.ua/handle/2010/3090.
Texte intégral
Doctoral thesis in economics, specialty 08.00.08 (Money, Finance and Credit), University of Banking of the National Bank of Ukraine, Kyiv, 2013. The thesis develops and substantiates theoretical and methodological foundations for the effective functioning and development of the banking system of Ukraine, based on cybernetic and synergetic approaches and on maintaining dynamic equilibrium and financial stability, aimed at a positive impact on economic development and high-quality services for households and firms. It examines modern approaches to defining the nature of a banking system and its main characteristics, functions and structure, and sets out the theoretical basis of effective functioning and development, treating effectiveness as a multicomponent system characteristic that depends on many factors. A synergetic concept of the development of banking systems under bifurcation is founded, together with the conditions for synergetic effectiveness and the importance of resonance effects of the regulator in securing it. The thesis analyzes the multivariate and alternative development paths of banking systems and argues that the banking system of Ukraine should develop further along an intensive, endogenously based model, which will promote its financial stability and effectiveness. A methodical approach is developed for a comprehensive evaluation of the effectiveness of the banking system of Ukraine, including a decomposition analysis of profitability, an evaluation of the effectiveness of asset-liability and interest-rate-risk management, and of the state recapitalization of banks, using economic and statistical methods.
It also develops theoretical and methodological principles for effective control of the liquidity risk of the banking system, taking into account internal and external (system-wide) factors, the new requirements of the Basel Committee and improved quantitative risk assessment. The development of the banking system of Ukraine is analyzed as a multidimensional process of improvement and modernization; its main phases are identified according to the model of linear stages of development and the three financing regimes of H. Minsky (hedge, speculative and Ponzi finance), along with the main post-crisis development trends and ways to increase efficiency. Finally, the thesis proposes ways to achieve stable development of the banking system, including a development strategy that would shift it from an extensive to an intensive model and improve banking regulation in line with international standards.
Dissertation for the degree of Doctor of Economic Sciences, specialty 08.00.08 (Money, Finance and Credit), University of Banking of the National Bank of Ukraine, Kyiv, 2013. The dissertation develops theoretical and methodological approaches and practical tools for ensuring the effective functioning and development of the banking system of Ukraine, with the aim of increasing its stability, strengthening its influence on the country's socio-economic development, and improving the availability of banking services for households and enterprises. It examines modern approaches to defining the essence and effectiveness of a banking system, its functions and structure, and lays out theoretical foundations based on maintaining dynamic equilibrium and stability and on treating effectiveness as an integral, multicomponent system characteristic depending on many factors and conditions. Maintaining the stability of the banking system requires a certain "stability potential" in the form of capital, liquidity, profitability and reserves against risks; the dissertation proposes to assess this potential with an integral indicator computed from risk estimates and the capital adequacy available to cover them, and distinguishes five levels of banking-system stability depending on the values of the stability parameters, their volatility and the available potential. It is shown that an unstable banking system that does not maintain dynamic equilibrium cannot be effective. A synergetic concept of the development of banking systems under bifurcation is elaborated, with criteria for a banking system's entry into a bifurcation phase, and a method is proposed for measuring the level of chaos in the banking system of Ukraine and its entry into a bifurcation phase, based on the volatility of current liabilities and the feasibility of fitting an analytical trend.
A theoretical and methodological approach to studying the effective functioning and development of the banking system is developed, with criteria and principles of effectiveness, conditions for achieving synergetic effects, structural components of effectiveness, and a balanced scorecard for evaluating each identified type of effectiveness, enabling effective mechanisms for maintaining financial stability and system-wide effectiveness. A methodical approach to a comprehensive analysis of the effectiveness of the banking system of Ukraine is proposed, including a decomposition analysis of profitability, an evaluation of asset-liability and interest-rate-risk management, and of state-supported bank recapitalization. The multivariate and alternative development of banking systems is analyzed, and further development of the banking system of Ukraine along an endogenously oriented model is justified as promoting its financial stability and effectiveness. The system's development is studied as a multidimensional process of improvement and modernization, with its main stages identified according to the model of linear stages of development and H. Minsky's three financing regimes (hedge, speculative and Ponzi finance), and the main post-crisis development trends and ways of increasing effectiveness are determined. Theoretical and methodological foundations are developed for improving the management of the banking system's liquidity risk, taking into account internal and external (system-wide) factors, the new Basel Committee requirements and improved quantitative risk assessment.
Methods for determining and managing the stable minimum balance of current liabilities are further developed, and a methodological approach is proposed for an integral indicator of banking-system liquidity risk, a dynamic liquidity indicator, computed on the principles of comprehensiveness, consistency, and sensitivity to risks and to the consequences of decisions taken. A set of measures is proposed for ensuring the effective functioning and development of the banking system of Ukraine based on improved asset-liability and interest-rate-risk management using economic and statistical models, and the role of, and conceptual approaches to, a state strategy for the development of the banking system of Ukraine to 2020 are defined, providing for a transition from an extensive to an intensive development model, improved banking regulation in line with international standards, the introduction of innovations, a stronger influence on economic development and better availability of banking services for households and enterprises.
Lee, Tai-Chin, et 李岱瑾. « Decomposition of Risk Assessments for Financial Structured Notes ». Thesis, 2009. http://ndltd.ncl.edu.tw/handle/95630339033826002865.
Texte intégral
Yuan Ze University
Department of Finance
97
In order to accelerate the development of the financial market, the authorities recently deregulated some rules and allowed many new products: Equity-Linked Notes and Principal-Guaranteed Notes denominated in NT dollars were deregulated in 2003. In addition, interest rates are relatively low and structured notes are heavily promoted by financial planners, so structured notes have become very popular. The goal of this thesis is to clarify the risk of structured notes sold in the market and to help investors determine the true risk of each product. We simulate eleven products, price each under a different pricing strategy, and calculate Value at Risk (VaR) by historical simulation, the variance-covariance method and Monte Carlo simulation; we also compute Marginal VaR and Incremental VaR to assess risk. We find that when the underlying asset is a stock, the coefficient of variation of risk is relatively high, which makes risk management difficult. As expected, products with more underlying assets show a stronger diversification effect; however, the risk of main indices in developed countries is not lower than in emerging countries, a counterintuitive result that may be related to the pricing strategy. Risk is relatively high when the pricing strategy contains only a single option. Interest-rate underlyings seem more stable than others, yet their coefficient of variation of risk is relatively high. Most investors believe oil-linked products carry higher risk than others, but we find that the variance of risk is low when the pricing strategy is appropriate. Overall, correctly analyzing a product's pricing strategy and underlying assets helps investors make better decisions.
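The three VaR estimators named above can be sketched side by side. The P&L series and all parameters below are invented for illustration; none of the figures come from the thesis.

```python
import numpy as np

# Hypothetical daily P&L history for a structured-note position.
rng = np.random.default_rng(0)
pnl = rng.normal(0.0, 1000.0, size=2500)

alpha = 0.99                      # confidence level
z99 = 2.3263                      # standard normal 99% quantile

# 1) Historical simulation: empirical loss quantile of observed P&L.
var_hist = -np.quantile(pnl, 1 - alpha)

# 2) Variance-covariance: assume normal P&L, use only mean and std.
var_param = -(pnl.mean() - z99 * pnl.std(ddof=1))

# 3) Monte Carlo: simulate from the fitted distribution and take the
#    empirical quantile of the simulated P&L.
sims = rng.normal(pnl.mean(), pnl.std(ddof=1), size=100_000)
var_mc = -np.quantile(sims, 1 - alpha)
```

On normal data the three estimates agree closely; they diverge on fat-tailed or skewed P&L, which is what makes the comparison across products informative.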
He, Ying active 2013. « Decomposition of multiple attribute preference models ». 2013. http://hdl.handle.net/2152/22980.
Texte intégral
« Decomposition of the market risk : listed location and operation location ». 2005. http://library.cuhk.edu.hk/record=b5892590.
Texte intégral
Thesis (M.Phil.)--Chinese University of Hong Kong, 2005.
Includes bibliographical references (leaf 31).
Abstracts in English and Chinese.
Chapter I --- Introduction --- p.1
Chapter II --- Data Description --- p.4
Chapter III --- Market risks for stocks --- p.6
Chapter 1. --- Listing Location --- p.7
Chapter 2. --- Operation Location --- p.9
Chapter 3. --- Measurements --- p.10
Chapter IV --- The Model --- p.13
Chapter V --- Empirical Results --- p.16
Chapter 1. --- Summary statistics --- p.16
Chapter 2. --- Diagnostics Test --- p.17
Chapter 3. --- The co-efficient --- p.18
Chapter 4. --- Comparing the result with US dollar-denominated returns --- p.21
Chapter VI --- Sub-period analysis --- p.26
Chapter VII --- Market analysis --- p.29
Chapter VIII --- Industrial analysis --- p.31
Chapter IX --- Conclusion --- p.35
Chapter X --- References --- p.37
Chapter XI --- Appendix --- p.39
Cotton, Tanisha Green. « Computational Study of Mean-Risk Stochastic Programs ». Thesis, 2013. http://hdl.handle.net/1969.1/149619.
Texte intégral
Jiang, Yun. « Decomposition, Ignition and Flame Spread on Furnishing Materials ». 2006. http://eprints.vu.edu.au/481/1/481contents.pdf.
Texte intégral
Lee, Chung-Yi, et 李仲益. « Private Placements of Equity and Systematic Risk – Application of the Beta Decomposition Model ». Thesis, 2014. http://ndltd.ncl.edu.tw/handle/ck73xf.
Texte intégral
National Chung Hsing University
Department of Finance
102
The existing literature finds that firms perform poorly after private placements, which is commonly explained by investor overoptimism. This study investigates both issues using the two-beta model of Campbell and Vuolteenaho (2004), which splits market beta into a cash-flow beta and a discount-rate beta. Cash-flow beta represents the risk of future investment opportunities, and discount-rate beta represents a company's sensitivity to the market discount rate. The results show that firms with low cash-flow beta have poor long-run performance, implying that firms with low sensitivity to cash-flow news are likely to perform poorly following private placements. Further, the negative relation between discount-rate beta and long-run performance indicates that investors are prone to be overoptimistic about high discount-rate-beta firms.
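The beta split described above can be illustrated with a simplified regression stand-in. The cash-flow and discount-rate news series below are synthetic, not the VAR-extracted news terms of Campbell and Vuolteenaho, and the loadings are invented; the sketch only shows how the two sensitivities are recovered once news series are in hand.

```python
import numpy as np

# Synthetic cash-flow news and discount-rate news series.
rng = np.random.default_rng(1)
T = 1000
n_cf = rng.normal(0.0, 0.02, T)         # cash-flow news
n_dr = rng.normal(0.0, 0.03, T)         # discount-rate news
eps = rng.normal(0.0, 0.01, T)          # firm-specific noise

# Simulated stock return: loading 1.2 on cash-flow news and 0.4 on
# (minus) discount-rate news, matching the usual sign convention.
r = 1.2 * n_cf + 0.4 * (-n_dr) + eps

# Recover the two betas by OLS on [1, n_cf, -n_dr].
X = np.column_stack([np.ones(T), n_cf, -n_dr])
coef, *_ = np.linalg.lstsq(X, r, rcond=None)
beta_cf, beta_dr = coef[1], coef[2]
```

With uncorrelated news series the OLS coefficients recover the simulated loadings; in the literature the betas are instead defined through covariances with market-return news components.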
Lai, Ying-chu, et 賴映竹. « Applying Spectral Decomposition in Value-at-Risk : Empirical Evidence of Taiwan Stock Market ». Thesis, 2007. http://ndltd.ncl.edu.tw/handle/96514113920472666338.
Texte intégral
National Central University
Graduate Institute of Finance
95
Since more and more financial products have been invented, investors have more ways to obtain a variety of payoffs and to hedge; as a result, controlling financial risk has become increasingly important. Monte Carlo simulation is the most widely used method for computing value-at-risk. A key step in computing the value-at-risk of an asset portfolio is decomposing the correlation matrix of the assets for use in the Monte Carlo simulation. Generally, the Cholesky decomposition is used, but it has a limitation: it cannot decompose a correlation matrix that is not positive definite. In that case the spectral (eigenvalue) decomposition can be adopted instead. This thesis shows the efficiency of the spectral decomposition when facing a non-positive-definite correlation matrix. Having fewer limitations, the spectral decomposition can be applied more widely than the Cholesky decomposition.
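The fallback described above can be sketched directly: when the Cholesky factorization rejects a matrix, an eigendecomposition with negative eigenvalues clipped to zero still yields a usable factor. The matrix below is invented to be non-positive-definite.

```python
import numpy as np

# A correlation matrix that is not positive definite (pairwise-estimated
# correlations can produce this); Cholesky factorization fails on it.
C = np.array([[1.0, 0.9, 0.2],
              [0.9, 1.0, 0.9],
              [0.2, 0.9, 1.0]])

try:
    np.linalg.cholesky(C)
    chol_ok = True
except np.linalg.LinAlgError:
    chol_ok = False

# Spectral decomposition: C = V diag(w) V'. Clip negative eigenvalues to
# zero, then build a factor L with L @ L.T equal to the repaired matrix.
w, V = np.linalg.eigh(C)
L = V @ np.diag(np.sqrt(np.clip(w, 0.0, None)))
C_psd = L @ L.T

# L can now drive correlated draws in a Monte Carlo VaR engine.
rng = np.random.default_rng(2)
correlated = L @ rng.standard_normal((3, 10000))
```

The repaired matrix is the nearest positive semidefinite matrix in the eigenvalue-clipping sense; its diagonal is no longer exactly 1, so in practice a rescaling step is often added afterwards.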
Lin, Geng-Xian, et 林耕賢. « Causal effect decomposition and risk attribution - unification of counterfactual and sufficient cause framework ». Thesis, 2019. http://ndltd.ncl.edu.tw/handle/uvf4rx.
Texte intégral
National Chiao Tung University
Institute of Statistics
107
The purpose of this study, covering attribution fractions and effect decomposition, is to unify the counterfactual framework and the sufficient-cause framework. The study builds on the SCC model, which is logically finer, and the counterfactual model, which can quantify causality. Effect decomposition splits the effect of exposure A on the outcome under the mSCC model into six pathways: DEs, Ago, MEs, Syns, Synmed and Agomed. We define the effects and mechanisms of the six pathways and estimate the effects under the needed assumptions. Attribution analysis decomposes the total effect on outcome Y into nine pathways: ARn, PAE, ARm, PME, SYNref, SYNmed, AGONref, AGONmed, and AEagon; we calculate the proportion attributable to each pathway, define the corresponding effects and mechanisms, and estimate them under the needed assumptions. The study also links to other causal-inference methodologies: the effect decomposition relates to VanderWeele's four-way decomposition, and the attribution fractions relate to the six-way decomposition. We validate the proposed methods with bootstrap simulations, and the simulation results support both methods. Finally, we demonstrate the methodology on an HCC, HBV and HCV dataset.
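The four-way decomposition mentioned above can be checked numerically on the risk-difference scale, where the total effect splits exactly into a controlled direct effect, reference interaction, mediated interaction and pure indirect effect. The outcome and mediator probabilities below are hypothetical and assume no confounding.

```python
# Hypothetical risks p[a, m] = P(Y=1 | A=a, M=m) and mediator
# probabilities q[a] = P(M=1 | A=a).
p = {(0, 0): 0.10, (0, 1): 0.15, (1, 0): 0.20, (1, 1): 0.40}
q = {0: 0.30, 1: 0.60}

# Four components of VanderWeele's decomposition (risk-difference scale).
interaction = p[1, 1] - p[1, 0] - p[0, 1] + p[0, 0]
cde = p[1, 0] - p[0, 0]                        # controlled direct effect (M = 0)
int_ref = interaction * q[0]                   # reference interaction
int_med = interaction * (q[1] - q[0])          # mediated interaction
pie = (p[0, 1] - p[0, 0]) * (q[1] - q[0])      # pure indirect effect

# Total effect computed directly for comparison.
ey = {a: p[a, 0] * (1 - q[a]) + p[a, 1] * q[a] for a in (0, 1)}
te = ey[1] - ey[0]
```

With these numbers the four components sum exactly to the total effect of 0.205, which is the identity the decomposition rests on.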
Jiang, Yun. « Decomposition, Ignition and Flame Spread on Furnishing Materials ». Thesis, 2006. https://vuir.vu.edu.au/481/.
Texte intégral
Atemnkeng, Tabi Rosy Christy. « Estimation of Longevity Risk and Mortality Modelling ». Master's thesis, 2022. http://hdl.handle.net/10362/135573.
Texte intégral
Previous mortality models failed to account for improvements in human mortality rates, so life expectancy was generally underestimated. Declining mortality and increasing life expectancy (longevity) profoundly alter the population age distribution. This demographic transition has received considerable attention from pension and annuity providers, and concerns have been expressed about the implications of increased life expectancy for government spending on old-age support. The goal of this thesis is to lay out a framework for measuring, understanding and analyzing longevity risk, with a focus on defined pension plans. The study examines how well the widely used mortality forecasting model proposed by Lee and Carter in 1992 performs for the female and male populations of the selected country (France) from 1816 to 2018, estimating the parameters of the Lee-Carter model by the Singular Value Decomposition (SVD) method. Mortality tables are then used to assess future improvements in mortality and life expectancy, taking into account mortality assumptions (predicted death rates based on a mortality table), to see whether pension funds and annuity providers are exposed to longevity risk. Two types of mortality are considered: mortality at birth and mortality in old age. Longevity risk must be managed effectively: because mortality rates tend to decrease over time, pension and annuity providers must factor future improvements in mortality and life expectancy into their provisions. The findings show that failing to account for future improvements in mortality results in an expected provision shortfall; protection mechanisms and policy recommendations for managing longevity risk can help mitigate the financial impact of an unexpected increase in longevity.
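The SVD estimation step of the Lee-Carter model can be sketched on synthetic data. The age-by-year log-mortality surface below is generated from a known Lee-Carter structure so the fit can be checked; none of the numbers are from the thesis.

```python
import numpy as np

# Synthetic log-mortality surface log m(x, t) = a_x + b_x * k_t + noise.
rng = np.random.default_rng(3)
ages, years = 10, 40
a = np.linspace(-6.0, -1.0, ages)        # age profile
b = np.full(ages, 1.0 / ages)            # age sensitivities, sum to 1
k = np.linspace(20.0, -20.0, years)      # declining mortality index, sums to 0
logm = a[:, None] + np.outer(b, k) + rng.normal(0.0, 0.01, (ages, years))

# Lee-Carter fit: a_x is the row mean; the leading singular triplet of
# the centered surface gives b_x and k_t, scaled so that sum(b) = 1.
a_hat = logm.mean(axis=1)
U, s, Vt = np.linalg.svd(logm - a_hat[:, None], full_matrices=False)
scale = U[:, 0].sum()
b_hat = U[:, 0] / scale
k_hat = s[0] * Vt[0] * scale
```

The rescaling by `scale` enforces the usual identification constraints (b summing to one, k absorbing the singular value) and also cancels the sign ambiguity of the SVD.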
Wang, Sheng Yuan, et 王聖元. « The Pricing, Credit Risk Decomposition and Hedging Analysis of CPDO Under The Jump Diffusion Model ». Thesis, 2011. http://ndltd.ncl.edu.tw/handle/01659379509792560049.
Texte intégral
National Chengchi University
Graduate Institute of Finance
99
The increasing trading volumes and innovative structures of credit derivatives have attracted great academic attention in the quantification and analysis of their complex risk characteristics. The pricing and hedging issues of complex credit structuers after the 2009 financial crisis are especially vital, and they present great challegens to both the academic community and industry practitioners. Constant Proportion Debt Obligations (CPDOs) are one of the new credit-innovations that claim to provide risk-adverse investors with fixed-income cash flows and minimal risk-bearing, yet the cash-outs events of such products during the crisis unfolded risk characteristics that had been unseen to investors. This research focuses on the pricing risk quantification, and dynamic hedging issues of CPDOs under a Levy jump diffusion setting. Based on decomposing the product's risk structure, we derive explicit closed-form solutions in the form of time-dependent double digital knock-out barrier options. This enables us to explore, in terms of the associated hedging greeks, the embeded risk characteristics of CPDOs and propose feasible delta-netral strategies that are feasible to hedge such products. Numerical simulations are subsequently performed to provide benchmark measures for the proposed hedging strategies.
Jootar, Jay, et Steven D. Eppinger. « A System Architecture-based Model for Planning Iterative Development Processes : General Model Formulation and Analysis of Special Cases ». 2002. http://hdl.handle.net/1721.1/4042.
Texte intégral
Singapore-MIT Alliance (SMA)
Li, Yuan-Hung, et 李沅泓. « Novel Weather Generator Using Empirical Mode Decomposition and K-NN Moving Window Method and Application to Climate Change Risk Assessment of Hsinchu Water Supply System ». Thesis, 2015. http://ndltd.ncl.edu.tw/handle/46885504872403627354.
Texte intégral
National Taiwan University
Graduate Institute of Bioenvironmental Systems Engineering
103
Climate change has drawn much attention; however, not only human-induced climate change but also natural low-frequency climate variability should be of concern. Weather generators are often used to produce weather data for climate change risk assessment, so a novel weather generator is needed that can both produce weather data under climate scenarios and reproduce low-frequency climate characteristics. The main purpose of this study is to develop such a generator, preserving the low-frequency climate characteristics of the rainfall record, and to apply the generated data to evaluate the climate change risk of the Hsinchu water supply system. The first part of the study develops the weather generator in two steps. The first step applies Empirical Mode Decomposition (EMD) to decompose the time series of historical monthly rainfall into Intrinsic Mode Functions (IMFs) and a trend, each IMF representing a generally simple component of the rainfall series. During generation, the envelope curve and the monthly phases derived from the IMFs are used to form new IMFs; the new IMFs and the trend are then assembled into a generated monthly rainfall series. The second step downscales the generated monthly series to daily rainfall: once monthly amounts have been produced, daily amounts are allocated from the monthly totals by the k-NN moving window method, yielding simulated daily rainfall that retains the long-term, low-frequency climate characteristics. Weather parameters at nearby stations can also be produced by the k-NN method. The statistical characteristics of the generated monthly and daily data, such as mean, standard deviation and autoregressive coefficient, must be comparable to the historical ones.
Once this similarity is confirmed, GCM scenarios can be adopted by adding their statistical properties to the historical data and repeating all steps to produce future weather data. The second part of the study evaluates the climate change risk of a water supply system. The generated weather data are used as inputs to the GWLF model to simulate stream flows in the Hsinchu area, and the simulated flows are then fed into a water-supply system dynamics model to analyze the risk of water deficits. The system dynamics model considers current hydraulic facilities, different climate scenarios, hydrological conditions, and social and economic development. Several evaluation indices and climate scenarios are introduced to assess future risks of the Hsinchu water supply system, and the results are compared with results based on Richardson-type generated weather data to identify the contribution of the proposed weather generator.
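The k-NN disaggregation step can be sketched as follows. The historical record is synthetic and the moving window is reduced to a single calendar month for brevity, so this only illustrates the mechanics, not the study's implementation.

```python
import numpy as np

# Synthetic historical record: 30 years of one calendar month, 30 days each.
rng = np.random.default_rng(5)
hist_daily = rng.gamma(shape=0.3, scale=8.0, size=(30, 30))
hist_monthly = hist_daily.sum(axis=1)

def knn_disaggregate(target_total, k=5):
    """Distribute a generated monthly total over days by borrowing the
    daily pattern of a similar historical month."""
    # k nearest historical months by monthly total
    idx = np.argsort(np.abs(hist_monthly - target_total))[:k]
    pick = rng.choice(idx)                          # random analog among the k
    pattern = hist_daily[pick] / hist_monthly[pick]  # daily fractions, sum to 1
    return target_total * pattern                    # rescale to the target

daily = knn_disaggregate(120.0)
```

Because the analog's daily fractions sum to one, the disaggregated days reproduce the generated monthly total exactly while inheriting a realistic wet/dry day structure.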
Ha, Hongjun. « Essays on Computational Problems in Insurance ». 2016. http://scholarworks.gsu.edu/rmi_diss/40.
Texte intégralOndo, Guy-Roger Abessolo. « Mathematical methods for portfolio management ». Diss., 2002. http://hdl.handle.net/10500/784.
Texte intégral
Statistics
M. Sc. (Operations Research)
Anjos, Helena Maria Chaves. « What drives insurance companies´ stock returns ? The impact of new rules and regulation ». Master's thesis, 2015. http://hdl.handle.net/10362/30148.
Texte intégral