Dissertations on the topic "Risk assessment mathematical models; fermentation mathematical models"

To view other types of publications on this topic, follow the link: Risk assessment mathematical models; fermentation mathematical models.

Format your source in APA, MLA, Chicago, Harvard and other citation styles

Choose a source type:

Browse the top 50 dissertations for research on the topic "Risk assessment mathematical models; fermentation mathematical models".

Next to every entry in the bibliography there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the publication as a .pdf file and read its online abstract, whenever these are available in the metadata.

Browse dissertations across a wide variety of disciplines and compile your bibliography correctly.

1

Yeo, Keng Leong (Actuarial Studies, Australian School of Business, UNSW). "Claim dependence in credibility models." Awarded by: University of New South Wales, School of Actuarial Studies, 2006. http://handle.unsw.edu.au/1959.4/25971.

Full text of the source
Abstract:
Existing credibility models have mostly allowed for one source of claim dependence only, that across time for an individual insured risk or a group of homogeneous insured risks. Numerous circumstances demonstrate that this may be inadequate and insufficient. In this dissertation, we developed a two-level common effects model, based loosely on the Bayesian model, which allows for two possible sources of dependence, that across time for the same individual risk and that between risks. For the case of Normal common effects, we are able to derive explicit formulas for the credibility premium. This takes the intuitive form of a weighted average between the individual risk's claims experience, the group's claims experience and the prior mean. We also consider the use of copulas, a tool widely used in other areas of work involving dependence, in constructing credibility premiums. Specifically, we utilise copulas to model the dependence across time for an individual risk or group of homogeneous risks. We develop the construction with several well-known families of copulas and are able to derive explicit formulas for their respective conditional expectations. Whilst some recent work has been done on constructing credibility models with copulas, explicit formulas for the conditional expectations have rarely been made available. Finally, we calibrate these copula credibility models using a real data set. This data set relates to the claims experience of workers' compensation insurance by occupation over a 7-year period for a particular state in the United States. Our results show that for each occupation, claims dependence across time is indeed present. Amongst the copulas considered in our empirical analysis, the Cook-Johnson copula model is found to be the best fit for the data set used. The calibrated copula models are then used for prediction of the next period's claims. We found that the Cook-Johnson copula model gives superior predictions. 
Furthermore, this calibration exercise allowed us to uncover the importance of examining the nature of the data and comparing it with the characteristics of the copulas we are calibrating to.
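The credibility premium described in this abstract takes the form of a weighted average of the individual risk's experience, the group's experience, and the prior mean. As a hedged illustration only (the function, weights and numbers below are hypothetical, not taken from the thesis), that form can be sketched as:

```python
def credibility_premium(individual_mean, group_mean, prior_mean, z_ind, z_grp):
    """Two-level credibility premium: a weighted average of the individual
    risk's claims experience, the group's claims experience, and the prior
    mean. z_ind and z_grp are credibility weights with z_ind + z_grp <= 1."""
    assert 0.0 <= z_ind and 0.0 <= z_grp and z_ind + z_grp <= 1.0
    prior_weight = 1.0 - z_ind - z_grp
    return z_ind * individual_mean + z_grp * group_mean + prior_weight * prior_mean

# Hypothetical example: individual experience 1200, group 1000, prior 900.
premium = credibility_premium(1200.0, 1000.0, 900.0, z_ind=0.5, z_grp=0.3)
```

With these weights the premium is 0.5·1200 + 0.3·1000 + 0.2·900 = 1080; the actual weights in the thesis are derived from the Normal common-effects model rather than chosen by hand.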
Styles: APA, Harvard, Vancouver, ISO, etc.
2

Wang, Na. "Estimation of Extra Risk and Benchmark Dose in Dose Response Models." Fogler Library, University of Maine, 2008. http://www.library.umaine.edu/theses/pdf/WangN2008.pdf.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
3

Owen, Michelle L. "Exposure model : detailed profiling and quantification of the exposure of personnel to geotechnical hazards in underground mines." University of Western Australia. School of Civil and Resource Engineering, 2004. http://theses.library.uwa.edu.au/adt-WU2005.0031.

Full text of the source
Abstract:
[Truncated abstract] This thesis presents an operationally applicable and reliable model for quantification of the exposure of underground mining personnel to geotechnical hazards. The model is shown to have the flexibility to apply to very different operational environments, within the context of mechanised metalliferous mines. It provides an essential component for carrying out quantitative geotechnical risk analyses of underground mines. Increasingly prevalent within the Australian mining industry are moves towards a risk-based philosophy instead of prescriptive design procedures. A barrier to this has been the lag in availability of resources (personnel and technical) required for the intensive effort of applying probabilistic methods to geotechnical engineering at mines ... One of the missing components for quantitative risk analysis in mines has been an accurate model of personnel exposure to geotechnical hazards, from which meaningful estimates can be made of the probabilities of serious or fatal injury given a rockfall. Exposure profiling for geotechnical risk analysis at minesites has traditionally involved the simple classification of travelways and entry areas by their occupancy rate, not taking into account traffic and work characteristics which may significantly influence the risks. Therefore, it was the focus of this thesis to address that deficiency and progress the ability to perform semi-quantitative and quantitative risk analyses in mines.
Styles: APA, Harvard, Vancouver, ISO, etc.
4

Zhu, Dongming 1963. "Asymmetric heavy-tailed distributions : theory and applications to finance and risk management." Thesis, McGill University, 2007. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=102854.

Full text of the source
Abstract:
This thesis focuses on construction, properties and estimation of asymmetric heavy-tailed distributions, as well as on their applications to financial modeling and risk measurement. First of all, we suggest a general procedure to construct a fully asymmetric distribution based on a symmetric parametric distribution, and establish some natural relationships between the symmetric and asymmetric distributions. Then, three new classes of asymmetric distributions are proposed by using the procedure: the Asymmetric Exponential Power Distributions (AEPD), the Asymmetric Student-t Distributions (ASTD) and the Asymmetric Generalized t Distribution (AGTD). For the first two distributions, we give an interpretation of their parameters and explore their basic properties, including moments, expected shortfall, characterization by the maximum entropy property, and the stochastic representation. Although neither distribution satisfies the regularity conditions under which the ML estimators have the usual asymptotics, due to a non-differentiable likelihood function, we nonetheless establish asymptotics for the full MLE of the parameters. A closed-form expression for the Fisher information matrix is derived, and Monte Carlo studies are provided. We also illustrate the usefulness of the GARCH-type models with the AEPD and ASTD innovations in the context of predicting downside market risk of financial assets and demonstrate their superiority over skew-normal and skew-Student's t GARCH models. Finally, two new classes of generalized extreme value distributions, which include Jenkinson's GEV (Generalized Extreme Value) distribution (Jenkinson, 1955) as a special case, are proposed by using the maximum entropy principle, and their properties are investigated in detail.
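The thesis develops its own general procedure for building a fully asymmetric distribution from a symmetric one. As a hedged illustration of the general idea only (this is the classic two-piece construction, not the AEPD or ASTD of the thesis), one can rescale the two half-axes of a symmetric kernel differently:

```python
import math

def two_piece_normal_pdf(x, sigma_left, sigma_right):
    """Two-piece ('split') normal density: a symmetric standard normal
    kernel with a different scale on each side of the mode at 0.
    Integrates to 1 for any positive sigma_left, sigma_right."""
    phi = lambda z: math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    scale = 2.0 / (sigma_left + sigma_right)
    if x < 0:
        return scale * phi(x / sigma_left)
    return scale * phi(x / sigma_right)
```

With sigma_left != sigma_right the density is continuous at the mode but skewed, which is the qualitative behaviour the asymmetric families in the thesis generalize.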
Styles: APA, Harvard, Vancouver, ISO, etc.
5

Shen, Yunxiang. "Risk analysis and its application in mining project evaluation." Thesis, McGill University, 1987. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=64009.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
6

Blatt, Sharon L. "An in-depth look at the information ratio." Link to electronic thesis, 2004. http://www.wpi.edu/Pubs/ETD/Available/etd-0824104-155216/.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
7

Cross, Richard J. (Richard John). "Inference and Updating of Probabilistic Structural Life Prediction Models." Diss., Georgia Institute of Technology, 2007. http://hdl.handle.net/1853/19828.

Full text of the source
Abstract:
Aerospace design requirements mandate acceptable levels of structural failure risk. Probabilistic fatigue models enable estimation of the likelihood of fatigue failure. A key step in the development of these models is the accurate inference of the probability distributions for dominant parameters. Since data sets for these inferences are of limited size, the fatigue model parameter distributions are themselves uncertain. A hierarchical Bayesian approach is adopted to account for the uncertainties in both the parameters and their distribution. Variables specifying the distribution of the fatigue model parameters are cast as hyperparameters whose uncertainty is modeled with a hyperprior distribution. Bayes' rule is used to determine the posterior hyperparameter distribution, given available data, thus specifying the probabilistic model. The Bayesian formulation provides an additional advantage by allowing the posterior distribution to be updated as new data becomes available through inspections. By updating the probabilistic model, uncertainty in the hyperparameters can be reduced, and the appropriate level of conservatism can be achieved. In this work, techniques for Bayesian inference and updating of probabilistic fatigue models for metallic components are developed. Both safe-life and damage-tolerant methods are considered. Uncertainty in damage rates, crack growth behavior, damage, and initial flaws are quantified. Efficient computational techniques are developed to perform the inference and updating analyses. The developed capabilities are demonstrated through a series of case studies.
Styles: APA, Harvard, Vancouver, ISO, etc.
8

Galane, Lesiba Charles. "The risk parity approach to asset allocation." Thesis, Stellenbosch : Stellenbosch University, 2014. http://hdl.handle.net/10019.1/95974.

Full text of the source
Abstract:
Thesis (MSc)--Stellenbosch University, 2014.
ENGLISH ABSTRACT: We consider the problem of portfolio asset allocation characterised by risk and return. Prior to the 2007-2008 financial crisis, this important problem was tackled using mainly the Markowitz mean-variance framework. However, throughout the past decade of challenging markets, particularly for equities, this framework has exhibited multiple drawbacks. Today many investors approach this problem with a 'safety first' rule that puts risk management at the heart of decision-making. Risk-based strategies have gained a lot of popularity since the recent financial crisis. One of the 'trendiest' of the modern risk-based strategies is the Risk Parity model, which puts diversification in terms of risk, rather than in terms of dollar values, at the core of portfolio risk management. Inspired by the works of Maillard et al. (2010), Bruder and Roncalli (2012), and Roncalli and Weisang (2012), we examine the reliability of, and the relationship between, the traditional mean-variance framework and risk parity. We emphasise, through multiple examples, the non-diversification of the traditional mean-variance framework. The central focus of this thesis is on examining the main Risk-Parity strategies, i.e. the Inverse Volatility, Equal Risk Contribution and the Risk Budgeting strategies. Lastly, we turn our attention to the problem of maximizing the absolute expected value of the logarithmic portfolio wealth (sometimes called the drift term) introduced by Oderda (2013). The drift term of the portfolio is given by the sum of the expected price logarithmic growth rate, the expected cash flow, and half of its variance. The solution to this problem is a linear combination of three famous risk-based strategies and the high cash flow return portfolio.
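The Inverse Volatility and Equal Risk Contribution strategies examined in the thesis can be sketched numerically. The covariance matrix below is hypothetical and the code is only a minimal illustration of the definitions, not the thesis's own implementation:

```python
import numpy as np

# Hypothetical annualized covariance matrix for three assets.
cov = np.array([[0.040, 0.006, 0.002],
                [0.006, 0.090, 0.010],
                [0.002, 0.010, 0.160]])

# Inverse Volatility strategy: weights proportional to 1 / sigma_i.
vols = np.sqrt(np.diag(cov))
w_iv = (1.0 / vols) / (1.0 / vols).sum()

def risk_contributions(w, cov):
    """Risk contribution of each asset, RC_i = w_i * (cov @ w)_i / sigma_p.
    The contributions sum to the portfolio volatility sigma_p."""
    sigma_p = np.sqrt(w @ cov @ w)
    return w * (cov @ w) / sigma_p

rc = risk_contributions(w_iv, cov)
```

Under an Equal Risk Contribution portfolio every entry of `rc` would be identical; inverse-volatility weights achieve that exactly only when all pairwise correlations are equal, which is one of the relationships the risk-parity literature cited in the abstract makes precise.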
Styles: APA, Harvard, Vancouver, ISO, etc.
9

Ghosh, Gregory. "Lifesafety Analysis in the Building Firesafety Method." Digital WPI, 2004. https://digitalcommons.wpi.edu/etd-theses/1106.

Full text of the source
Abstract:
The purpose of this thesis is to demonstrate and enhance the technical basis of the procedure for evaluating lifesafety within the Building Firesafety Engineering Method (BFSEM). A framework for the analysis has been documented, but not extensively tested in a building situation. Hence, procedures to obtain the necessary input data and to evaluate that data needed to be developed. In addition, the general framework had to be tested rigorously enough to identify weaknesses.
Styles: APA, Harvard, Vancouver, ISO, etc.
10

Baade, Ingrid Annette. "Survival analysis diagnostics." Thesis, Queensland University of Technology, 1997.

Find the full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
11

Dicks, Anelda. "Value at risk and expected shortfall : traditional measures and extreme value theory enhancements with a South African market application." Thesis, Stellenbosch : Stellenbosch University, 2013. http://hdl.handle.net/10019.1/85674.

Full text of the source
Abstract:
Thesis (MComm)--Stellenbosch University, 2013.
ENGLISH ABSTRACT: Accurate estimation of Value at Risk (VaR) and Expected Shortfall (ES) is critical in the management of extreme market risks. These risks occur with small probability, but the financial impacts could be large. Traditional models to estimate VaR and ES are investigated. Following usual practice, 99% 10-day VaR and ES measures are calculated. A comprehensive theoretical background is first provided and then the models are applied to the Africa Financials Index from 29/01/1996 to 30/04/2013. The models considered include independent, identically distributed (i.i.d.) models and Generalized Autoregressive Conditional Heteroscedasticity (GARCH) stochastic volatility models. Extreme Value Theory (EVT) models that focus especially on extreme market returns are also investigated. For this, the Peaks Over Threshold (POT) approach to EVT is followed. For the calculation of VaR, various scaling methods from one day to ten days are considered and their performance evaluated. The GARCH models fail to converge during periods of extreme returns. During these periods, EVT forecast results may be used. As a novel approach, this study considers the augmentation of the GARCH models with EVT forecasts. The two-step procedure of pre-filtering with a GARCH model and then applying EVT, as suggested by McNeil (1999), is also investigated. This study identifies some of the practical issues in model fitting. It is shown that no single forecasting model is universally optimal and the choice will depend on the nature of the data. For this data series, the best approach was to augment the GARCH stochastic volatility models with EVT forecasts during periods where the former fail to converge. Model performance is judged by the actual number of VaR and ES violations compared to the expected number. The expected number is taken as the number of return observations over the entire sample period, multiplied by 0.01 for 99% VaR and ES calculations.
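The simplest of the "traditional measures" compared in the thesis, and the violation-counting backtest it uses, can be sketched as follows. The data below are simulated heavy-tailed returns, not the Africa Financials Index, and the code is only an illustrative historical-simulation estimator:

```python
import numpy as np

def var_es_historical(returns, level=0.99):
    """Historical-simulation VaR and ES at the given level, reported as
    positive loss numbers (an i.i.d. 'traditional measure')."""
    losses = -np.asarray(returns)
    var = np.quantile(losses, level)
    es = losses[losses >= var].mean()
    return var, es

def count_violations(returns, var_forecasts):
    """Backtest: days on which the realised loss exceeded the VaR forecast.
    For 99% VaR the expected count is 0.01 * len(returns)."""
    return int(np.sum(-np.asarray(returns) > np.asarray(var_forecasts)))

# Hypothetical heavy-tailed daily returns (Student-t innovations).
rng = np.random.default_rng(0)
r = rng.standard_t(df=4, size=2000) * 0.01
var99, es99 = var_es_historical(r, 0.99)
```

Comparing `count_violations` against the expected count (here 0.01 × 2000 = 20) is exactly the performance criterion described at the end of the abstract; the thesis applies it to GARCH and EVT forecasts rather than a static historical quantile.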
Styles: APA, Harvard, Vancouver, ISO, etc.
12

Allen, H. Joel. "A Behavioral Model for Detection of Acute Stress in Bivalves." Thesis, University of North Texas, 1998. https://digital.library.unt.edu/ark:/67531/metadc277998/.

Full text of the source
Abstract:
A behavioral model for acute stress responses in bivalves was developed using time series analysis, for use in a real-time biomonitoring unit. Stressed bivalves closed their shells and waited for the stressful conditions to pass. Baseline data showed that the group behavior of fifteen bivalves was periodic; individuals, however, behaved independently. Group behavior did not change by more than 30 percent over a period of 20 minutes, whereas following toxic exposures the group behavior changed by more than 30 percent within 20 minutes. Behavior was mathematically modeled using autoregression to compare current and past behavior. A logical alarm applied to the behavior model determined when organisms were stressed. The ability to disseminate data collected in real time via the Internet was demonstrated.
Styles: APA, Harvard, Vancouver, ISO, etc.
13

Getley, Ian L. (Department of Aviation, Faculty of Science, UNSW). "Cosmic and solar radiation monitoring of Australian commercial flight crew at high southern latitudes as measured and compared to predictive computer modelling." Awarded by: University of New South Wales, 2007. http://handle.unsw.edu.au/1959.4/40536.

Full text of the source
Abstract:
This study set out to examine the levels of galactic cosmic radiation exposure to Australian aircrew during routine flight operations, with particular attention to the high southern latitude flights between Australia and South Africa. Latitudes as high as 65° South were flown to gain the data and are typical of the normal flight routes flown between Sydney and Johannesburg on a daily basis. In achieving this objective it became evident that suitable commercially available radiation monitoring equipment was not readily available, and scientific radiation monitors were sourced from overseas research facilities to complement my own FH4lB and Liulin monitors provided by UNSW. At the same time it became apparent that several predictive codes had been developed to attempt to model the radiation doses received by aircrew based on flight route, latitudes and altitudes. Further, it became apparent that these codes had not been subjected to verification at high southern latitudes and that they had not been validated for the effects of solar particle events. Initially measurements were required at the high latitudes, followed by mid-latitude data to further balance the PCAIRE code to ensure reasonableness of results for both equatorial and high latitudes. Whilst undertaking this study new scientific monitors became available, which provided an opportunity to observe comparative data and results. The Liulin, QDOS and a number of smaller personal dosimeters were subsequently obtained and evaluated. This appears to be the first time that such an extensive cross comparison of these monitors has been conducted over such a wide range of latitudes and altitudes. During the course of this study a fortuitous encounter with GLE 66 enabled several aspects of code validation to be examined, namely the inability of predictive codes to estimate the increased dose associated with a GLE or the effects of a Forbush decrease on the code results.
Finally, I review the known biological effects discussed by numerous authors on the basis of current epidemiological studies, with a view to highlighting where the advent of future technology in aviation may affect projected aircrew dose levels.
Styles: APA, Harvard, Vancouver, ISO, etc.
14

Kroon, Rodney Stephen. "A framework for estimating risk." Thesis, Link to the online version, 2008. http://hdl.handle.net/10019.1/1104.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
15

LEITE, ELIANA R. "Indicadores de segurança para um depósito final de fontes radioativas seladas." reponame:Repositório Institucional do IPEN, 2012. http://repositorio.ipen.br:8080/xmlui/handle/123456789/10142.

Full text of the source
Abstract:
Dissertation (Master's)
IPEN/D
Instituto de Pesquisas Energeticas e Nucleares - IPEN-CNEN/SP
Styles: APA, Harvard, Vancouver, ISO, etc.
16

Sun, Yu. "Risk-based framework for freight movement analysis." Thesis, Queensland University of Technology, 2002.

Find the full text of the source
Abstract:
Decision-making models have, in recent years, been developed to provide systematic and comprehensive tools to analyse, evaluate and manage freight movement. Freight transport models developed thus far have not precisely defined the risk agents introduced by travelling vehicles, leading to indistinct risk types. Instead, most of the models developed discussed the risks mainly related to direct impacts of traffic accidents. On the other hand, transport efficiency, which is of growing concern, has not been sufficiently emphasised in previous models. This thesis studies the factors relating not only to safety issues but also to efficiency issues. The risks due to freight movement have been classified into categories in accordance with their distinct natures and, typically, the affected population groups, including humans, the environment and economic infrastructure. Two new concepts, the risk agent and the risk response factor, have been introduced to the framework to precisely define and evaluate the various risk types. A vehicle travelling on a specific route may encounter various situations arising from combinations of road characteristics, traffic flow, weather conditions, etc. In order to assist in analysing freight movement in a systematic manner, freight movement behaviours which pose negative impacts on the nearby population are divided into different modes, interpreted as "risk-producing activities" in the Risk-based Framework for Freight Movement Analysis (RBF-FMA), in order to identify the characteristics of the risk agents. It is important to differentiate segments with significant changes in the risk-producing travel conditions. This study divides a travel route into segments, assesses each segment separately, and differentiates among three segment types according to their different contributing factors: travel segment, intersection, and roundabout.
The framework developed in this study also considers the availability of emergency response facilities and support systems as a major risk-reducing factor. When applied and compared with the risk rating results estimated by the Queensland Transport Department (QTD) using their risk-rating model, the RBF-FMA gave highly comparable results. In the evaluation, both the QTD and RBF-FMA models were applied to assess the risk associated with the release of hazardous materials at 25 segments identified as having high risk by the QTD. The RBF-FMA was also successfully applied to compare two routes between two common points, and the results were generally consistent with the concentration of human population, environmental population and economic activity and infrastructure along the two routes. The basic data needed to conduct the RBF-FMA was easily generated using site visits and from available databases. While the RBF-FMA presents a logical framework based on the risk assessment and management methodology, the process of assigning scores (ranks) and weights to the various factors and sub-factors is basically subjective and reflects the education, values, judgement and understanding of the model by the user. It is recognised that this subjectivity can lead to variability in the results. The contribution of this study is significant in that it translates the basic risk assessment model used in the public health field into a framework suitable for rating the risk associated with freight movement. In addition, the effort presented a basic modelling approach that can be modified or built on by other researchers in this field. The framework formulated in this study is worthy of further research and development, as it can be used as a useful system for making decisions related to moving freight along selected routes.
Further work could include the development of a GIS-based computer program able to hold the large amounts of risk assessment data associated with freight movement and provide a visual operation of the risk analysis.
Styles: APA, Harvard, Vancouver, ISO, etc.
17

Khajeh-Hosseini, Ali. "Supporting system deployment decisions in public clouds." Thesis, University of St Andrews, 2013. http://hdl.handle.net/10023/3412.

Full text of the source
Abstract:
Decisions to deploy IT systems on public Infrastructure-as-a-Service clouds can be complicated as evaluating the benefits, risks and costs of using such clouds is not straightforward. The aim of this project was to investigate the challenges that enterprises face when making system deployment decisions in public clouds, and to develop vendor-neutral tools to inform decision makers during this process. Three tools were developed to support decision makers: 1. Cloud Suitability Checklist: a simple list of questions to provide a rapid assessment of the suitability of public IaaS clouds for a specific IT system. 2. Benefits and Risks Assessment tool: a spreadsheet that includes the general benefits and risks of using public clouds; this provides a starting point for risk assessment and helps organisations start discussions about cloud adoption. 3. Elastic Cost Modelling: a tool that enables decision makers to model their system deployment options in public clouds and forecast their costs. These three tools collectively enable decision makers to investigate the benefits, risks and costs of using public clouds, and effectively support them in making system deployment decisions. Data was collected from five case studies and hundreds of users to evaluate the effectiveness of the tools. This data showed that the cost effectiveness of using public clouds is situation dependent rather than universally less expensive than traditional forms of IT provisioning. Running systems on the cloud using a traditional 'always on' approach can be less cost effective than on-premise servers, and the elastic nature of the cloud has to be considered if costs are to be reduced. Decision makers have to model the variations in resource usage and their systems' deployment options to obtain accurate cost estimates. Performing upfront cost modelling is beneficial as there can be significant cost differences between different cloud providers, and different deployment options within a single cloud. 
During such modelling exercises, the variations in a system's load (over time) must be taken into account to produce more accurate cost estimates, and the notion of elasticity patterns that is presented in this thesis provides one simple way to do this.
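The always-on versus elastic cost comparison described in this abstract reduces to simple arithmetic over an hourly demand profile. The prices and load pattern below are entirely hypothetical, not from any real cloud provider or from the thesis's case studies:

```python
# Hypothetical prices and load profile -- not from any real cloud provider.
HOURLY_PRICE = 0.20  # $/hour for one server instance

def always_on_cost(n_servers, hours):
    """Traditional 'always on' deployment: n servers running around the clock."""
    return n_servers * HOURLY_PRICE * hours

def elastic_cost(hourly_demand):
    """Elastic deployment: pay only for the servers needed in each hour."""
    return sum(n * HOURLY_PRICE for n in hourly_demand)

# A simple elasticity pattern: 4 servers for 10 busy hours a day, 1 otherwise.
demand = ([4] * 10 + [1] * 14) * 30          # one month of hourly demand
fixed = always_on_cost(4, hours=len(demand))  # provisioned for peak load
elastic = elastic_cost(demand)
```

Here the elastic bill is well under the peak-provisioned one, but with a flat 24/7 demand profile the two converge, which is the situation-dependence the abstract emphasises.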
Styles: APA, Harvard, Vancouver, ISO, etc.
18

Mello, Bernardo Brazão Rego. "Classificação de risco setorial com base nos métodos Weighted Influence Non-linear Gauge System e Analytic Hierarchy Process." reponame:Biblioteca Digital do Banco Nacional de Desenvolvimento Econômico e Social, 2014. http://web.bndes.gov.br/bib/jspui/handle/1408/5341.

Full text of the source
Abstract:
Bibliography: pp. 46-48
Dissertation (Master's), Faculdade de Economia e Finanças Ibmec, Rio de Janeiro, 2014.
Due to the increasing importance of the financial markets over the past decades, credit risk has become a paramount issue in decisions about investment, loan spreads, corporate solvency, trends and prospects, etc. Credit risk evaluation models may be classified into two broad categories: quantitative and qualitative. Quantitative models seek to analyze information from financial statements and their indicators, while qualitative models focus on the analysis of intangible variables that affect global business. These models typically follow a top-down approach, analyzing industry risk, competitiveness, peer comparison and management. The aim of this thesis is to present an industry risk assessment model based on multicriteria analysis methods that can measure the strength of the variables that affect the industries of the Brazilian economy, as well as the influence among them. The model is based primarily on the Weighted Influence Non-Linear Gauge System method. For human judgements about the variables, the model relies on the Analytic Hierarchy Process method. The model's result is presented as risk levels, applied to fourteen industries of the Brazilian economy. The thesis closes with a discussion of the results and an outline of future research directions.
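Where judgements are elicited with the Analytic Hierarchy Process, criterion weights are obtained as the principal eigenvector of a pairwise-comparison matrix. A minimal sketch, assuming a hypothetical 3-criterion comparison matrix (not data from the dissertation):

```python
def ahp_weights(M, iters=100):
    """Approximate the principal eigenvector of a pairwise-comparison
    matrix by power iteration, normalised to sum to 1."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w

# Hypothetical judgements: criterion A is 3x as important as B, 5x as C, etc.
M = [[1.0, 3.0, 5.0],
     [1 / 3, 1.0, 2.0],
     [1 / 5, 1 / 2, 1.0]]
w = ahp_weights(M)
```

With these illustrative judgements the first criterion dominates, as expected from its row of the matrix.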
19

Omrane, Fatma. "Human health risk assessment of occupational exposure to trace metallic elements mixtures in metalworking industries in the Sfax metropolis (Tunisia)." Thesis, Université de Lorraine, 2018. http://www.theses.fr/2018LORR0097/document.

Full text source
Abstract:
Trace metallic elements (TMEs) are pollutants of great concern even in trace amounts because of their toxicity and cumulative properties. Some of them can be carcinogenic. The Sfax metropolis, located in southern Tunisia, has been affected by releases of TMEs for decades. Several studies have confirmed that this pollution originates predominantly from anthropogenic sources, mainly industrial activities. It represents a threat to the health of residents, particularly for those also exposed during occupational activities in industrial processes. The present study aims to assess the health risks associated with occupational exposure in industries handling TMEs in their production processes, following the human health risk assessment approach. To this end, five companies using raw material containing TMEs to produce a variety of metallic products agreed to participate in the study. The metals investigated were Al, Cr, Ni, Cu, Zn and Pb. Mathematical models for estimating occupational exposure to chemicals were used to predict indoor air TME exposure levels in 15 different job tasks. Air monitoring was conducted in order to compare the predicted workplace air concentrations with the directly measured ones, using both workplace-fixed monitors and personal samplers. Finally, urine samples were collected from 61 workers to assess whether TME excretion correlates with job exposure levels. Globally, the predicted air estimates agree well with the measured concentrations over the whole set of job tasks. Better predictions were found for certain activities, in particular for steel cutting and welding processes. The values corresponding to the 90th percentile of the exposure distribution were then used in the interaction-based hazard index HIint to assess the health risks associated with the mixtures of TMEs. Total cancer risk was also investigated.
Results showed high exposures to metals that may elicit respiratory conditions, with a HIint reaching 93.6, the highest levels being for the shielded metal arc welding and the metal shearing and slitting tasks. The risk is enhanced by a synergistic effect between Cr, Ni and Cu. High risks of lung and kidney cancers were demonstrated (the predicted lifetime total cancer risk for exposed workers is 3.7×10-4). This work shows that mathematical models can accurately predict TME airborne exposure levels for several processes in the metallurgical industry, a result that can help the different stakeholders monitor exposure surveillance and abatement efficiently. Progress in industrial hygiene is needed in this sector to reduce the high level of health risk currently experienced by metalworking workers.
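The hazard-index and cumulative cancer-risk arithmetic referred to above can be sketched as follows. The additive form below omits the interaction weighting that distinguishes HIint, and all exposure and reference values are invented for illustration:

```python
def hazard_index(exposures, reference_levels):
    # Additive hazard index: sum of exposure-to-reference ratios.
    # The interaction-based HIint used in the thesis further weights each
    # term for synergy/antagonism between metals (e.g. Cr, Ni, Cu).
    return sum(e / r for e, r in zip(exposures, reference_levels))

def total_cancer_risk(concentrations, unit_risks):
    # Lifetime excess cancer risk, assuming small risks add linearly.
    return sum(c * u for c, u in zip(concentrations, unit_risks))

# Hypothetical airborne levels against hypothetical reference values.
hi = hazard_index([10.0, 2.0], [5.0, 4.0])          # above 1 signals concern
risk = total_cancer_risk([1e-3, 2e-3], [1e-2, 5e-3])
```

A hazard index above 1 flags a mixture of concern; the linear-additivity assumption for cancer risk is the standard screening-level convention.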
20

Gobira, Diogo Barboza. "Precificação de derivativos exóticos no mercado de petróleo." Repositório Institucional do BNDES, 2014. http://web.bndes.gov.br/bib/jspui/handle/1408/7023.

Full text source
Abstract:
Bibliography: p. 109-111
Dissertation (master's) - Instituto Nacional de Matemática Pura e Aplicada, Rio de Janeiro, 2014.
We study the pricing of exotic options in the markets for oil and its derivatives. We begin with an exploratory analysis of the data, revisiting statistical properties and stylized facts related to volatilities and correlations. Based on these results, we present some of the main commodity forward models and a wide range of deterministic volatility structures, together with their calibration methods, for which we ran tests on real market data. To improve the performance of such models in pricing the volatility smile, we reformulate the Heston stochastic volatility model to handle one or multiple forward curves, allowing its use in pricing contracts written on multiple commodities. We calibrate and test these models on the oil, gasoline and natural gas markets, confirming their superiority over deterministic volatility models. To support the pricing of exotic options and OTC contracts, we also revisit, from theoretical and practical points of view, tools and issues such as Monte Carlo simulation, numerical solutions to SDEs and American exercise. Finally, through a battery of numerical simulations, we show how the presented models can be used to price exotic options that typically occur in commodity markets, such as calendar spread options, crack spread options and Asian options.
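A single-asset version of the Monte Carlo machinery mentioned above can be sketched with an Euler scheme for Heston dynamics (variance truncated at zero). Parameter values are arbitrary, and the thesis' forward-curve reformulation is not reproduced here:

```python
import math
import random

def heston_paths(s0, v0, kappa, theta, xi, rho, r, T, steps, n_paths, seed=42):
    """Euler simulation of Heston dynamics with truncated variance:
    dS = r S dt + sqrt(v) S dW1,  dv = kappa (theta - v) dt + xi sqrt(v) dW2,
    corr(dW1, dW2) = rho.  Returns terminal prices."""
    rng = random.Random(seed)
    dt = T / steps
    out = []
    for _ in range(n_paths):
        s, v = s0, v0
        for _ in range(steps):
            z1 = rng.gauss(0.0, 1.0)
            z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
            vp = max(v, 0.0)  # truncate variance at zero
            s *= math.exp((r - 0.5 * vp) * dt + math.sqrt(vp * dt) * z1)
            v += kappa * (theta - v) * dt + xi * math.sqrt(vp * dt) * z2
        out.append(s)
    return out

# Illustrative parameters; price an at-the-money European call as a sanity check.
terminal = heston_paths(100.0, 0.04, 1.5, 0.04, 0.3, -0.7, 0.02, 1.0, 50, 2000)
call = math.exp(-0.02) * sum(max(s - 100.0, 0.0) for s in terminal) / len(terminal)
```

Pricing the path-dependent contracts named in the abstract (Asian, calendar spread, crack spread options) would average over the simulated path rather than the terminal value only.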
21

Bula, Gustavo Alfredo. "Vehicle Routing for Hazardous Material Transportation." Thesis, Troyes, 2018. http://www.theses.fr/2018TROY0014.

Full text source
Abstract:
The main objective of this thesis is to study the hazardous materials (HazMat) transportation problem, considered as a heterogeneous fleet vehicle routing problem. HazMat transportation decisions involve different and sometimes conflicting objectives; two are considered in this work, the total routing cost and the total routing risk. The first task undertaken was the formulation of a mathematical model for routing risk minimization, in which the risk depends on the type of vehicle, the material being transported, and the load change when the vehicle goes from one customer to another. A piecewise linear approximation is employed to keep a mixed integer linear programming formulation. Hybrid solution methods based on neighborhood search are explored for solving the routing risk minimization problem. This includes the study of neighborhood structures, the development of a Variable Neighborhood Descent (VND) algorithm for local search, and a perturbation mechanism (shaking neighborhoods). A post-optimization procedure is applied to improve solution quality. Finally, two different solution approaches, a multi-objective dominance-based algorithm and a meta-heuristic epsilon-constraint method, are employed to address the multi-objective version of the problem. Two performance metrics are used: the hypervolume and the ∆-metric. The front approximations show that a small increase in the total routing cost can produce a large percentage reduction in the expected consequences, given the probability of a HazMat transportation incident.
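The VND scheme named above cycles through an ordered list of neighborhoods, restarting from the first whenever an improvement is found. A generic first-improvement sketch, using a toy routing cost on points of a line rather than the thesis' risk objective and operators:

```python
def vnd(start, neighborhoods, cost):
    """Variable Neighborhood Descent with a first-improvement strategy."""
    best, best_cost = start, cost(start)
    k = 0
    while k < len(neighborhoods):
        improved = False
        for cand in neighborhoods[k](best):
            c = cost(cand)
            if c < best_cost:
                best, best_cost, improved = cand, c, True
                break
        k = 0 if improved else k + 1  # restart on improvement, else next neighborhood
    return best, best_cost

# Toy instance: visit points on a line; cost is total travel distance.
def route_cost(route):
    return sum(abs(route[i + 1] - route[i]) for i in range(len(route) - 1))

def swaps(route):          # neighborhood 1: exchange two positions
    for i in range(len(route)):
        for j in range(i + 1, len(route)):
            r = list(route)
            r[i], r[j] = r[j], r[i]
            yield tuple(r)

def reversals(route):      # neighborhood 2: 2-opt-style segment reversal
    for i in range(len(route)):
        for j in range(i + 2, len(route) + 1):
            yield tuple(route[:i]) + tuple(reversed(route[i:j])) + tuple(route[j:])

best_route, best_cost = vnd((2, 0, 4, 1, 5, 3), [swaps, reversals], route_cost)
```

Because the cost strictly decreases at every accepted move, the descent terminates at a solution that is locally optimal with respect to every neighborhood in the list.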
22

Ripka, Wagner Luis. "Modelos matemáticos para estimativa da gordura corporal de adolescentes utilizando dobras cutâneas, a partir da absorciometria de raios-X de dupla energia." Universidade Tecnológica Federal do Paraná, 2017. http://repositorio.utfpr.edu.br/jspui/handle/1/2865.

Full text source
Abstract:
Introduction: Studies have found obesity shifting from the adult population to children and adolescents, which in turn can lead to clinical manifestations such as coronary disease, type 2 diabetes, and psychosocial complications at increasingly early ages. However, methods for evaluating body composition in this age group, especially low-cost techniques such as skinfold thickness (ST) measurements, are imprecise in Brazilian studies, which can lead to a mistaken interpretation of the body composition of those evaluated. Objective: To develop new mathematical models using ST measurements, with dual-energy X-ray absorptiometry (DXA) as the reference, to estimate fat mass (G) in adolescents. Methods: This was an exploratory descriptive study in which 416 male adolescents aged 12 to 17 years were evaluated, 42 of whom were set aside to compose the validation sample. Measurements of total body mass, stature, and waist and hip circumference were obtained, along with nine ST sites: biceps, triceps, subscapular, pectoral, mid-axillary, abdominal, suprailiac, thigh and calf, as well as G and bone mineral density (BMD) measured with DXA. For the development of the equations, a multiple linear regression model was fitted by ordinary least squares (OLS). Results: The group had a mean body mass index (BMI) of 21.25 ± 4.12 kg/m² and %G = 20.57 ± 5.80%. Based on %G, excess fat was found in 38.3% of the adolescents. The impact of fat on adolescent BMD indicated an association of r = -0.358, p < 0.005, with BMD reductions of up to 14% in the spine region in adolescents with obesity compared to eutrophic ones.
The development of new mathematical models meeting the criteria of high coefficient of determination (R²), low standard error of estimate (SEE), collinearity control, residual normality, homoscedasticity and practicality allowed the presentation of three options, with R² = 0.932 and SEE = 1.79; R² = 0.912 and SEE = 1.78; and R² = 0.850 and SEE = 1.87, respectively. All options use the variables age and height, as well as the triceps and subscapular STs. Conclusion: The results show the feasibility of developing new mathematical models for assessing body fat in adolescents, with results superior to the existing models in the literature.
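The OLS fitting step described in the Methods can be sketched via the normal equations. The predictor set below (age, height, a skinfold sum) mirrors the variables named in the abstract, but every data row and coefficient is fabricated for illustration:

```python
def ols_fit(X, y):
    """Ordinary least squares via the normal equations (X'X) b = X'y,
    solved by Gaussian elimination with partial pivoting.  Rows of X are
    observations; a column of ones is prepended for the intercept."""
    rows = [[1.0] + list(r) for r in X]
    n, p = len(rows), len(rows[0])
    A = [[sum(rows[k][i] * rows[k][j] for k in range(n)) for j in range(p)]
         for i in range(p)]
    b = [sum(rows[k][i] * y[k] for k in range(n)) for i in range(p)]
    for i in range(p):                       # forward elimination
        piv = max(range(i, p), key=lambda r: abs(A[r][i]))
        A[i], A[piv] = A[piv], A[i]
        b[i], b[piv] = b[piv], b[i]
        for r in range(i + 1, p):
            f = A[r][i] / A[i][i]
            for c in range(i, p):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    beta = [0.0] * p                         # back substitution
    for i in range(p - 1, -1, -1):
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, p))) / A[i][i]
    return beta

# Fabricated rows: (age, height cm, triceps + subscapular skinfolds mm),
# with responses generated from y = 2 + 0.5*age + 0.1*height + 0.3*skinfolds.
X = [[13, 152, 18], [14, 161, 22], [15, 163, 15],
     [16, 172, 25], [17, 174, 20], [12, 150, 30]]
y = [2 + 0.5 * a + 0.1 * h + 0.3 * s for a, h, s in X]
beta = ols_fit(X, y)
```

Since the synthetic responses are exactly linear in the predictors, the fit recovers the generating coefficients, which is a convenient self-check for the solver.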
23

Broy, Perrine. "Evaluation de la sûreté de systèmes dynamiques hybrides complexes : application aux systèmes hydrauliques." Phd thesis, Université de Technologie de Troyes, 2014. http://tel.archives-ouvertes.fr/tel-01006308.

Full text source
Abstract:
This work addresses the reliability assessment of gated spillways. The reliability behaviour of these hydraulic systems depends both on discrete random events and on the evolution of a continuous deterministic variable: they are hybrid dynamic systems. For these systems, the dreaded event occurs when the reservoir level reaches a safety threshold. The dynamic reliability approach proposed in this thesis aims to take temporal information into account, from modelling through to the synthesis of reliability indicators for decision support, and develops two contributions: 1) The construction of a knowledge base dedicated to describing gated spillways in terms of dynamic reliability. Each component class is described by a hybrid stochastic automaton whose states are the different phases of its operation. 2) The monitoring of the Monte Carlo simulation, and the processing and analysis of the "histories" (the sequence of all activated states and their activation dates) obtained by simulation. This makes it possible to build classical reliability indicators (probability of occurrence of the dreaded event, identification of the dominant equivalent cut sets, ...). Dynamic reliability indicators, based on classifying histories according to the failure dates of the components involved and on estimating dynamic importance, are also proposed.
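The simulation loop of such a hybrid model can be caricatured as follows: a deterministic flood inflow drives the continuous reservoir level, gates fail on demand at random, and Monte Carlo estimates the probability of reaching the safety threshold. Every rate, capacity and failure probability below is invented for illustration:

```python
import random

def spillway_failure_prob(n_runs=10000, seed=1):
    """Crude Monte Carlo sketch of a hybrid dynamic system: continuous
    reservoir level + discrete gate states (working / failed on demand).
    The dreaded event is the level reaching the safety threshold."""
    rng = random.Random(seed)
    dt, horizon = 0.5, 50.0
    failures = 0
    for _ in range(n_runs):
        level, t = 0.0, 0.0
        gates = [rng.random() > 0.05 for _ in range(3)]  # 5% failure on demand
        while t < horizon:
            inflow = 2.0                      # deterministic flood inflow
            outflow = 1.0 * sum(gates)        # each working gate evacuates 1.0
            level = max(0.0, level + (inflow - outflow) * dt)
            if level >= 20.0:                 # safety threshold reached
                failures += 1
                break
            t += dt
    return failures / n_runs

p_fail = spillway_failure_prob()
```

With these toy numbers the event requires at least two of the three gates to fail, so the estimate should sit near the binomial value 3p²(1-p) + p³ ≈ 0.0073; the thesis' approach additionally records and classifies the full state histories, which this sketch discards.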
24

Patil, Rohit A. "Novel application of quantitative risk assessment modelling to a continuous fermenter." Thesis, 2006. http://hdl.handle.net/2440/69737.

Full text source
Abstract:
In the food and pharmaceutical industries, plant failure can be costly and sometimes catastrophic to public health. This thesis uses the notion of Friday 13th Syndrome, i.e. the unexpected failure of a well-operated plant, to develop a new and rigorous mathematical model of a generalised continuous fermenter and gain insight into the likelihood of bioprocess failure. The new model is developed for a continuous, anaerobic fermenter based on the widely employed Monod process model.
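The Monod-based continuous fermenter named above is commonly written as a pair of ODEs for biomass X and substrate S. A simple Euler-integration sketch with illustrative parameter values (not those of the thesis):

```python
def simulate_chemostat(D=0.2, S_in=10.0, mu_max=0.5, Ks=2.0, Y=0.4,
                       X0=0.1, S0=10.0, dt=0.01, T=200.0):
    """Euler integration of the Monod model of a continuous fermenter:
        dX/dt = (mu(S) - D) X
        dS/dt = D (S_in - S) - mu(S) X / Y
    with mu(S) = mu_max S / (Ks + S).  Washout occurs when the dilution
    rate D exceeds the attainable growth rate."""
    X, S = X0, S0
    for _ in range(int(T / dt)):
        mu = mu_max * S / (Ks + S)
        X += (mu - D) * X * dt
        S += (D * (S_in - S) - mu * X / Y) * dt
        S = max(S, 0.0)
    return X, S

# With the defaults, the steady state is S* = Ks*D/(mu_max - D), X* = Y*(S_in - S*).
biomass, substrate = simulate_chemostat()
```

Setting the dilution rate above the maximum growth rate (e.g. D = 0.6 > mu_max = 0.5) drives the biomass to washout, one of the failure modes a risk model of such a fermenter must capture.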
Thesis (M.Eng.Sc.) -- University of Adelaide, School of Chemical Engineering, 2006
25

"Optimal dynamic portfolio selection under downside risk measure." 2014. http://library.cuhk.edu.hk/record=b6116127.

Full text source
Abstract:
Instead of controlling "symmetric" risks measured by central moments of terminal wealth, more and more portfolio models have shifted their focus to managing "asymmetric" downside risks, in which the investment return falls below a certain threshold. Among the existing downside risk measures, the safety-first principle, the value-at-risk (VaR), the conditional value-at-risk (CVaR) and the lower partial moments (LPM) are probably the most promising representatives.
In this dissertation, we investigate a general class of dynamic mean-downside-risk portfolio selection formulations in continuous time, including the mean-exceeding-probability, mean-VaR, mean-LPM and mean-CVaR formulations, whereas the current literature has only witnessed their static versions. Our contributions are two-fold: building tractable formulations and deriving the corresponding optimal policies. By imposing a limit funding level on the terminal wealth, we conquer the ill-posedness exhibited in this class of mean-downside-risk portfolio models. The limit funding level not only enables us to solve dynamic mean-downside-risk portfolio optimization problems, but also offers the flexibility to tame the aggressiveness of the portfolio policies generated by these models. Using the quantile method and the martingale approach, we derive optimal solutions for all the above-mentioned mean-downside-risk models. More specifically, for a general market setting, we prove the existence and uniqueness of the Lagrange multipliers, which is a key step in applying the martingale approach, and establish a theoretical foundation for developing efficient numerical solution approaches. Furthermore, for situations where the opportunity set of the market setting is deterministic, we derive analytical portfolio policies.
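For a discrete loss sample, the downside risk measures named above reduce to simple order statistics. A sketch of the sample versions (ignoring fractional-tail corrections, which the exact estimators include):

```python
def var(losses, alpha=0.95):
    # Sample Value-at-Risk: the alpha-quantile of the loss distribution.
    xs = sorted(losses)
    return xs[min(len(xs) - 1, int(alpha * len(xs)) - 1)]

def cvar(losses, alpha=0.95):
    # Sample CVaR / expected shortfall: the mean of the worst (1 - alpha)
    # share of losses.  It is also the minimum over c of the
    # Rockafellar-Uryasev function c + mean(max(L - c, 0)) / (1 - alpha).
    xs = sorted(losses, reverse=True)
    k = max(1, round(len(xs) * (1 - alpha)))
    return sum(xs[:k]) / k
```

Because CVaR averages the whole tail beyond the VaR point, it always weakly exceeds VaR at the same level, which is one reason it is the better-behaved (coherent) measure of the two.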
Detailed summary in vernacular field only.
Zhou, Ke.
Thesis (Ph.D.) Chinese University of Hong Kong, 2014.
Includes bibliographical references (leaves i-vi).
Abstracts also in Chinese.
26

Schwartz, Carmit M. Economics Australian School of Business UNSW. "Individuals' responses to changes in risk: a person-specific analysis." 2007. http://handle.unsw.edu.au/1959.4/40575.

Full text source
Abstract:
In this thesis we consider two comparative statics questions concerning changes in risk. The first question concerns situations where an individual faces some risk and has no control over the uncertain environment. In these situations we ask what kind of changes in risk will cause the individual's expected utility to increase. The second comparative statics question concerns situations where an individual faces some risk and has some control over the uncertain environment. In particular, we consider situations where the individual maximizes her expected utility with respect to some control parameter. Here we ask what kind of changes in risk will cause the individual's optimal value of the control parameter to increase. The existing literature has answered these questions for a class of individuals (for example, the class of risk-averse individuals). This thesis differs from the existing literature in that it focuses on a given individual, and thus reveals some of the person-specific factors that affect the individual's responses to changes in risk. The aim of the thesis is to show how an order on distributions, termed the single crossing likelihood ratio (SCLR) order, can intuitively answer both questions for a given individual. The main contributions of the thesis are as follows. First, the thesis presents the SCLR order and its main properties. Second, it shows that the SCLR order can answer the above comparative statics questions in an intuitive way. In particular, it shows that the answer to the above questions, with the use of the SCLR order, depends on a risk reference point which can be interpreted as a "certainty equivalent" point. It is thus demonstrated that an individual's responses to changes in risk are affected by her "certainty equivalent" point. Lastly, the results of the thesis can be used to provide an intuitive explanation of related existing results that were obtained for a class of individuals.
27

"Shrinkage method for estimating optimal expected return of self-financing portfolio." Thesis, 2011. http://library.cuhk.edu.hk/record=b6075221.

Full text source
Abstract:
By the seminal work of Markowitz in 1952, modern portfolio theory studies how to maximize the portfolio expected return for a given risk, or minimize the risk for a given expected return. Since these two issues are equivalent, this thesis focuses only on the optimal expected return of a self-financing portfolio for a given risk.
The mean-variance portfolio optimization procedure requires two crucial inputs: the theoretical mean vector and the theoretical covariance matrix of the portfolio in one period. Since the traditional plug-in method, using the sample mean vector and the sample covariance matrix of the historical data, incurs substantial estimation errors, this thesis explores how the sample mean vector and the sample covariance matrix behave in the optimization procedure, based on the idea of conditional expectation, and finds that the effect of the sample mean vector is an additive process while the effect of the sample covariance matrix is a multiplicative process.
A new estimator for calculating the optimal expected return of a self-financing portfolio is proposed by considering the joint impact of the sample mean vector and the sample covariance matrix. A shrinkage covariance matrix is designed to substitute for the sample covariance matrix in the optimization procedure, which leads to an estimate of the optimal expected return smaller than the plug-in estimate. The new estimator is applicable for both p < n and p ≥ n. Simulation studies are conducted on two empirical data sets; the results show that the new estimator is superior to the previous methods.
Finally, under certain assumptions, we extend our research into the framework of random matrix theory.
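A fixed-intensity version of the covariance shrinkage idea can be sketched as follows. The identity-type target and the constant intensity delta are assumptions for illustration; the thesis designs its own shrinkage matrix, and data-driven methods estimate the intensity from the sample:

```python
def shrink_cov(samples, delta=0.2):
    """Shrink the sample covariance matrix toward a scaled identity target:
        S_shrunk = (1 - delta) * S + delta * (tr(S)/p) * I.
    Off-diagonal entries are damped by (1 - delta), which stabilises the
    matrix when the number of assets p is close to the sample size n."""
    n, p = len(samples), len(samples[0])
    mean = [sum(x[j] for x in samples) / n for j in range(p)]
    S = [[sum((x[i] - mean[i]) * (x[j] - mean[j]) for x in samples) / (n - 1)
          for j in range(p)] for i in range(p)]
    mu = sum(S[i][i] for i in range(p)) / p     # average variance
    return [[(1 - delta) * S[i][j] + (delta * mu if i == j else 0.0)
             for j in range(p)] for i in range(p)]

# Tiny two-asset example with four return observations.
C = shrink_cov([[1, 2], [2, 1], [3, 4], [4, 3]])
```

A convenient property of this target is that the total variance tr(S) is preserved exactly while off-diagonal covariances shrink, so the shrunk matrix is better conditioned without changing the overall risk scale.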
Liu, Yan.
Adviser: Ngai Hang Chan.
Source: Dissertation Abstracts International, Volume: 73-06, Section: B, page: .
Thesis (Ph.D.)--Chinese University of Hong Kong, 2011.
Includes bibliographical references (leaves 76-80).
Electronic reproduction. Hong Kong : Chinese University of Hong Kong, [2012] System requirements: Adobe Acrobat Reader. Available via World Wide Web.
Electronic reproduction. [Ann Arbor, MI] : ProQuest Information and Learning, [201-] System requirements: Adobe Acrobat Reader. Available via World Wide Web.
Abstract also in Chinese.
28

"Bayesian approach for risk bucketing." 2009. http://library.cuhk.edu.hk/record=b5894184.

Full text source
Abstract:
Lau, Ka Ho.
Thesis (M.Phil.)--Chinese University of Hong Kong, 2009.
Includes bibliographical references (leaves 46-48).
Abstract also in Chinese.
Chapter 1. Introduction to Global Credit Risk Management Standard (p. 1)
  1.1 Background (p. 2)
  1.2 Basel Accords (p. 2)
  1.3 Risk Bucketing (p. 7)
Chapter 2. Current Practices of Risk Bucketing and PD Estimation (p. 10)
  2.1 Credit Scoring (p. 10)
  2.2 Risk Bucketing after Credit Scoring (p. 12)
  2.3 Related Literature Review (p. 14)
  2.4 Objective (p. 16)
Chapter 3. Bayesian Model for Risk Bucketing (p. 17)
  3.1 The Model (p. 17)
  3.2 Posterior Distribution (p. 19)
  3.3 Gibbs Sampler for the Posterior Distribution (p. 22)
    3.3.1 General Gibbs Sampler Theory (p. 22)
    3.3.2 The Gibbs Sampler for the Proposed Model (p. 23)
  3.4 Monitoring Convergence of the Gibbs Sampler (p. 26)
  3.5 Estimation, Bucketing and Prediction (p. 28)
    3.5.1 Estimation (p. 28)
    3.5.2 Bucketing (p. 28)
    3.5.3 Prediction (p. 29)
  Appendix (p. 29)
Chapter 4. Simulation Studies and Real Data Analysis (p. 32)
  4.1 Simulation Studies (p. 32)
    4.1.1 Details of Simulation (p. 32)
    4.1.2 Simulation Procedures (p. 34)
    4.1.3 Predictive Performance (p. 35)
    4.1.4 Summary of Simulation Results (p. 36)
  4.2 Real Data Analysis (p. 37)
Chapter 5. Conclusion and Discussion (p. 44)
Bibliography (p. 46)
29

"Efficient portfolio optimisation by hydridised machine learning." Thesis, 2015. http://hdl.handle.net/10210/13583.

Full text source
Abstract:
D.Ing.
The task of managing an investment portfolio is one that continues to challenge both professionals and private individuals on a daily basis. Contrary to popular belief, the desire of these actors is not in all (or even most) instances to generate the highest profits imaginable, but rather to achieve an acceptable return for a given level of risk. In other words, the investor desires to have his funds generate money for him, while not feeling that he is gambling away his (or his clients') funds. The reasons for a given risk tolerance (or risk appetite) are as varied as the clients themselves: in some instances, clients will simply have their own arbitrary risk appetites, while others may need to maintain certain values to satisfy their mandates, and still others must meet regulatory requirements. In order to accomplish this task, many measures and representations of performance data are employed to both communicate and understand the risk-reward trade-offs involved in the investment process. In light of the recent economic crisis, greater understanding and control of investment is being clamoured for around the globe, along with the concomitant finger-pointing and blame-assignation that inevitably follows such turmoil, and such heavy costs. The reputation of the industry, always dubious in the best of times, has also taken a significant knock after the events, and while this author would not like to point fingers, clearly the managers of funds, custodians of other people's money, are in no small measure responsible for the loss of the funds under their care. It is with these concerns in mind that this thesis explores the potential for utilising the powerful tools found within the disciplines of artificial intelligence and machine learning to aid fund managers in the balancing of portfolios, tailored specifically to their clients' individual needs.
These fields hold particular promise due to their focus on generalised pattern recognition, multivariable optimisation and continuous learning. With these tools in hand, a fund manager is able to continuously rebalance a portfolio for a client, given the client’s specific needs, and achieve optimal results while staying within the client’s risk parameters (in other words, keeping within the client’s comfort zone in terms of price/value fluctuations). This thesis will first explore the drivers and constraints behind the investment process, as well as the process undertaken by the fund manager as recommended by the CFA (Chartered Financial Analyst) Institute. The thesis will then elaborate on the existing theory behind modern investment theory, and the mathematics and statistics that underlie the process. Some common tools from the field of Technical Analysis will be examined, and their implicit assumptions and limitations will be shown, both for understanding and to show how they can still be utilised once their limitations are explicitly known. Thereafter the thesis will show the various tools from within the fields of machine learning and artificial intelligence that form the heart of the thesis herein. A highlight will be placed on data structuring, and the inherent dangers to be aware of when structuring data representations for computational use. The thesis will then illustrate how to create an optimiser using a genetic algorithm for the purpose of balancing a portfolio. Lastly, it will be shown how to create a learning system that continues to update its own understanding, and create a hybrid learning optimiser to enable fund managers to do their job effectively and safely.
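The genetic-algorithm portfolio optimiser the abstract describes can be sketched in miniature. This is an illustrative toy, not code from the thesis: the fitness function (expected return with a heavy penalty once portfolio variance exceeds a risk cap standing in for the client's risk appetite), the penalty weight, the mutation scale, and the population sizes are all assumptions.

```python
import random

def fitness(weights, mean_returns, cov, risk_cap):
    # Expected return, penalised when portfolio variance exceeds the risk cap
    # (the penalty weight 100.0 is an arbitrary illustrative choice).
    ret = sum(w * m for w, m in zip(weights, mean_returns))
    var = sum(wi * wj * cov[i][j]
              for i, wi in enumerate(weights)
              for j, wj in enumerate(weights))
    return ret - 100.0 * max(0.0, var - risk_cap)

def normalise(w):
    s = sum(w)
    return [x / s for x in w]

def genetic_rebalance(mean_returns, cov, risk_cap, pop=30, gens=60, seed=1):
    # Assumes at least two assets; weights are kept positive and summing to 1.
    rng = random.Random(seed)
    n = len(mean_returns)
    population = [normalise([rng.random() + 1e-6 for _ in range(n)]) for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda w: fitness(w, mean_returns, cov, risk_cap), reverse=True)
        survivors = population[: pop // 2]          # elitist selection
        children = []
        while len(survivors) + len(children) < pop:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n)               # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(n)
            child[i] = max(1e-6, child[i] + rng.gauss(0.0, 0.05))  # mutation
            children.append(normalise(child))
        population = survivors + children
    return max(population, key=lambda w: fitness(w, mean_returns, cov, risk_cap))
```

Rerunning `genetic_rebalance` as returns and covariances are re-estimated gives the continuous rebalancing loop the abstract envisages.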
APA, Harvard, Vancouver, ISO, and other styles
30

"Risk management of the financial markets." Chinese University of Hong Kong, 1996. http://library.cuhk.edu.hk/record=b5895616.

Full text source
Abstract:
by Chan Pui Man.
Thesis (M.B.A.)--Chinese University of Hong Kong, 1996.
Includes bibliographical references (leaves 108-111).
ABSTRACT --- p.II
TABLE OF CONTENTS --- p.III
ACKNOWLEDGEMENT --- p.VI
Chapter I. --- INTRODUCTION --- p.1
Chapter II. --- LITERATURE REVIEW --- p.4
Impact due to Deregulation --- p.5
Impact due to Globalization --- p.5
Impact due to Securitization --- p.6
Impact due to Institutionalisation --- p.6
Impact due to Computerisation --- p.7
Chapter III. --- CONCEPT: MANAGEMENT OF RISK --- p.8
Definition of Risk --- p.9
Risk Analysis --- p.10
Risk Assessment --- p.10
Risk Measurement --- p.10
Risk Management --- p.11
Chapter IV. --- TYPE OF RISK --- p.13
Market/Capital Risk --- p.14
Reinvestment Risk --- p.15
Interest Rate Risk --- p.16
Credit Risk --- p.17
Liquidity or Funding Risk --- p.18
Currency and Foreign Exchange Risk --- p.19
Inflation Risk --- p.19
Operations Risk --- p.20
Legal Risk --- p.20
Political Risk --- p.21
Systemic Risk --- p.22
Portfolio Risk --- p.22
Control Risk --- p.23
Settlement Risk --- p.23
Country Risk --- p.24
Underwriting Risk --- p.24
Residual or Moral Risk --- p.24
Strategy Risk and Environment Risk --- p.25
Chapter V. --- MEASURING CHANGING RISK --- p.26
Historical Estimates --- p.28
Non-parametric Methods --- p.29
Parametric Methods --- p.30
Chapter VI. --- EVOLUTION OF RISK ESTIMATION --- p.35
Chapter VII. --- APPLYING PORTFOLIO THEORY INTO RISK ANALYSIS --- p.41
Modelling Bank Risk --- p.43
Identification of linkages between an individual loan and bank's overall risk profile --- p.43
Distribution of expected values --- p.44
Portfolio expected value --- p.44
Scenario Analysis and Formation of Loan Risk Measurement Subsystem --- p.45
Formation of an Integrated Risk Measurement --- p.45
Active Management of Portfolio Risk --- p.49
Chapter VIII. --- RISK ANALYSIS OF INTERNATIONAL INVESTMENT --- p.51
Discounted-Cash-Flow Analysis --- p.51
Net Present Value Approach --- p.51
Internal Rate of Return Approach --- p.54
Break-even Probability Analysis --- p.55
Certainty-Equivalent Method --- p.56
Chapter IX. --- CONSTRUCTING A MODEL FOR RISK ASSESSMENT --- p.58
Set up a Model to Estimate "Capital at Risk" --- p.58
Obey the Minimum Standards --- p.60
Audit and Verify the Model --- p.62
Chapter X. --- METHODOLOGIES OF RISK MEASUREMENT
Measuring Market Risk: J.P. Morgan Risk Management Methodology - RiskMetrics™ --- p.64
Statistical Analysis of Returns and Risk --- p.66
Market Moves and Locally Gaussian Processes --- p.72
Stochastic Volatility --- p.72
Risk and Optionality --- p.73
Mapping and Term Structure of Interest Rates --- p.73
Measuring Position Risk --- p.75
The Simplified Portfolio Approach --- p.77
The Comprehensive Approach --- p.81
The Building-Block Approach --- p.83
Chapter XI. --- ITEMS INVOLVED IN RISK MANAGEMENT --- p.85
Management Control --- p.85
Constructing Valuation Methodology --- p.90
Contents of Reporting --- p.92
Evaluation of Risk --- p.93
Counterparty Relationships --- p.93
Chapter XII. --- AFTERTHOUGHT --- p.95
APPENDIX --- p.98
BIBLIOGRAPHY --- p.108
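The RiskMetrics methodology outlined in Chapter X centres on an exponentially weighted moving average (EWMA) of squared returns. A minimal sketch, assuming the standard RiskMetrics decay factor λ = 0.94 and the 95% normal quantile (≈1.65); this is a generic illustration of the published methodology, not material from the thesis.

```python
import math

def ewma_volatility(returns, lam=0.94):
    """RiskMetrics-style exponentially weighted volatility estimate."""
    var = returns[0] ** 2                 # seed with the first squared return
    for r in returns[1:]:
        var = lam * var + (1 - lam) * r ** 2
    return math.sqrt(var)

def value_at_risk(position, returns, z=1.65, lam=0.94):
    # One-day VaR under the normality assumption used by RiskMetrics
    # (95% confidence -> z = 1.65).
    return position * z * ewma_volatility(returns, lam)
```

The decay factor makes recent observations dominate, which is how the methodology tracks "changing risk" without a full re-estimation.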
APA, Harvard, Vancouver, ISO, and other styles
31

"Robust approach to risk management and statistical analysis." 2012. http://library.cuhk.edu.hk/record=b5549601.

Full text source
Abstract:
博士論文著重研究關於多項式優化的理論,並討論其在風險管理及統計分析中的應用。我們主要研究的對象乃為在控制理論和穩健優化中常見的所謂S 引理。原始的S 引理最早由Yakubovich 所引入。它給出一個二次多項式在另一個二次多項式的非負域上為非負的等價條件。在本論文中,我們把S 引理推廣到一元高次多項式。由於S 引理與穩健優化密切相關,所以我們的結果可廣泛應用於風險管理及統計分析,包括估算在高階矩約束下的非線性風險量度問題,以及利用半正定規劃來計算同時置信區域帶等重要課題。同時,在相關章節的末段,我們以數值實驗結果來引證有關的新理論的有效性和應用前景。
In this thesis we study some structural results in polynomial optimization, with emphasis on applications to risk management problems and estimation in statistical analysis. The key underlying method is related to the so-called S-lemma in control theory and robust optimization. The original S-lemma was developed by Yakubovich; it states an equivalent condition for a quadratic polynomial to be non-negative over the non-negative domain of other quadratic polynomial(s). In this thesis, we extend the S-lemma to univariate polynomials of any degree. Since robust optimization has a strong connection to the S-lemma, our results lead to many applications in risk management and statistical analysis, including estimating certain nonlinear risk measures under moment bound constraints, and an SDP formulation for simultaneous confidence bands. Numerical experiments are conducted and presented to illustrate the effectiveness of the methods.
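The classical quadratic S-lemma that the thesis generalises can be checked numerically. A sketch under stated assumptions: each quadratic q(x) = xᵀAx + 2bᵀx + c is encoded as a homogenised symmetric matrix, and a grid search over the multiplier λ stands in for the SDP solver a real implementation would use; the example polynomials are invented, and the thesis's extension to higher-degree univariate polynomials is not reproduced here.

```python
import numpy as np

def quad_matrix(A, b, c):
    """Homogenised symmetric matrix of q(x) = x'Ax + 2b'x + c."""
    A = np.atleast_2d(np.asarray(A, float))
    b = np.asarray(b, float).reshape(-1, 1)
    return np.block([[A, b], [b.T, np.array([[float(c)]])]])

def s_lemma_certificate(Mf, Mg, lambdas=np.linspace(0.0, 10.0, 1001)):
    """Search for lambda >= 0 with Mf - lambda*Mg positive semidefinite.

    By the S-lemma (under a Slater condition), such a lambda certifies that
    f(x) >= 0 whenever g(x) >= 0. Grid search is a crude stand-in for SDP.
    """
    for lam in lambdas:
        if np.linalg.eigvalsh(Mf - lam * Mg).min() >= -1e-9:
            return lam
    return None
```

For f(x) = 2 − x² and g(x) = 1 − x², any λ in [1, 2] certifies f ≥ 0 on {g ≥ 0}, and the search finds one.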
Detailed summary in vernacular field only.
Wong, Man Hong.
Thesis (Ph.D.)--Chinese University of Hong Kong, 2012.
Includes bibliographical references (leaves 134-147).
Abstract also in Chinese.
Abstract --- p.i
摘要 --- p.ii
Acknowledgement --- p.iii
Chapter 1 --- Introduction --- p.1
Chapter 2 --- Meeting the S-Lemma --- p.5
Chapter 3 --- A strongly robust formulation --- p.13
Chapter 3.1 --- A more practical extension for robust optimization --- p.13
Chapter 3.1.1 --- Motivation from modeling aspect --- p.13
Chapter 3.1.2 --- Discussion of a more robust condition --- p.15
Chapter 4 --- Theoretical developments --- p.19
Chapter 4.1 --- Definition of several order relations --- p.19
Chapter 4.2 --- S-Lemma with a single condition g(x)≥0 --- p.20
Chapter 5 --- Confidence bands in polynomial regression --- p.47
Chapter 5.1 --- An introduction --- p.47
Chapter 5.1.1 --- A review on robust optimization, nonnegative polynomials and SDP --- p.49
Chapter 5.1.2 --- A review on the confidence bands --- p.50
Chapter 5.1.3 --- Our contribution --- p.51
Chapter 5.2 --- Some preliminaries on optimization --- p.52
Chapter 5.2.1 --- Robust optimization --- p.52
Chapter 5.2.2 --- Semidefinite programming and LMIs --- p.53
Chapter 5.2.3 --- Nonnegative polynomials with SDP --- p.55
Chapter 5.3 --- Some preliminaries on linear regression and confidence region --- p.59
Chapter 5.4 --- Optimization approach to the confidence bands construction --- p.63
Chapter 5.5 --- Numerical experiments --- p.66
Chapter 5.5.1 --- Linear regression example --- p.66
Chapter 5.5.2 --- Polynomial regression example --- p.67
Chapter 5.6 --- Conclusion --- p.70
Chapter 6 --- Moment bound of nonlinear risk measures --- p.72
Chapter 6.1 --- Introduction --- p.72
Chapter 6.1.1 --- Motivation --- p.72
Chapter 6.1.2 --- Robustness and moment bounds --- p.74
Chapter 6.1.3 --- Literature review in general --- p.76
Chapter 6.1.4 --- More literature review in actuarial science --- p.78
Chapter 6.1.5 --- Our contribution --- p.79
Chapter 6.2 --- Methodological fundamentals behind the moment bounds --- p.81
Chapter 6.2.1 --- Dual formulations, duality and tight bounds --- p.82
Chapter 6.2.2 --- SDP and LMIs for some dual problems --- p.84
Chapter 6.3 --- Worst expectation and worst risk measures on annuity payments --- p.87
Chapter 6.3.1 --- The worst mortgage payments --- p.88
Chapter 6.3.2 --- The worst probability of repayment failure --- p.89
Chapter 6.3.3 --- The worst expected downside risk of exceeding the threshold --- p.90
Chapter 6.4 --- Numerical examples for risk management --- p.94
Chapter 6.4.1 --- A mortgage example --- p.94
Chapter 6.4.2 --- An annuity example --- p.97
Chapter 6.5 --- Conclusion --- p.100
Chapter 7 --- Computing distributional robust probability functions --- p.101
Chapter 7.1 --- Distributional robust function with a single random variable --- p.105
Chapter 7.2 --- Moment bound of joint probability --- p.108
Chapter 7.2.1 --- Constraint (7.5) in LMIs --- p.112
Chapter 7.2.2 --- Constraint (7.6) in LMIs --- p.112
Chapter 7.2.3 --- Constraint (7.7) in LMIs --- p.116
Chapter 7.3 --- Several model extensions --- p.119
Chapter 7.3.1 --- Moment bound of probability of union events --- p.119
Chapter 7.3.2 --- The variety of domain of x --- p.120
Chapter 7.3.3 --- Higher moments incorporated --- p.123
Chapter 7.4 --- Applications of the moment bound --- p.124
Chapter 7.4.1 --- The Riemann integrable set approximation --- p.124
Chapter 7.4.2 --- Worst-case simultaneous VaR --- p.124
Chapter 7.5 --- Conclusion --- p.126
Chapter 8 --- Concluding Remarks and Future Directions --- p.127
Chapter A --- Nonnegative univariate polynomials --- p.129
Chapter B --- First and second moment of (7.2) --- p.131
Bibliography --- p.134
APA, Harvard, Vancouver, ISO, and other styles
32

Crawley, P. D. (Philip David). "Risk and reliability assessment of multiple reservoir water supply headworks systems." 1995. http://web4.library.adelaide.edu.au/theses/09PH/09phc911.pdf.

Full text source
APA, Harvard, Vancouver, ISO, and other styles
33

Crawley, P. D. (Philip David). "Risk and reliability assessment of multiple reservoir water supply headworks systems / by Philip David Crawley." Thesis, 1995. http://hdl.handle.net/2440/18555.

Full text source
APA, Harvard, Vancouver, ISO, and other styles
34

Okunev, John Uwe. "An analysis of the extended mean Gini coefficient as an alternative measure of risk in investment decision making under uncertainty." Phd thesis, 1989. http://hdl.handle.net/1885/128334.

Full text source
Abstract:
The methods traditionally used for comparing uncertain prospects in investment decision making under uncertainty are the mean-variance and stochastic dominance approaches. Yitzhaki (1982) has recently presented an alternative model based upon the extended mean Gini coefficient (EMG) to compare uncertain prospects. The EMG model possesses the attractive features of the mean-variance and stochastic dominance models without their apparent disadvantages: it has the simplicity of a two-parameter model, and EMG efficiency implies second-order stochastic dominance. Theoretically the EMG model is consistent with the behaviour of investors under conditions of uncertainty for a wider class of probability distributions, and thus appears more adequate than variance as a measure of risk. This study empirically compares the EMG model with the mean-variance model with respect to the generation of efficient frontiers, capital asset pricing, portfolio performance, and farm planning under uncertainty. The major finding of the study is that the extended mean Gini coefficient is a viable alternative to variance as a measure of risk.
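Yitzhaki's extended Gini has a standard sample estimator, Γ(ν) = −ν·cov(x, (1 − F(x))^(ν−1)) with F the empirical CDF and ν ≥ 1 the risk-aversion parameter (ν = 2 recovers the ordinary Gini mean difference). The sketch below implements that textbook formula; it is not code from the thesis, and the midpoint empirical-CDF convention is a choice.

```python
import numpy as np

def extended_mean_gini(returns, nu=2.0):
    """Sample extended Gini risk measure: -nu * cov(x, (1 - F(x))**(nu - 1))."""
    x = np.sort(np.asarray(returns, float))
    n = len(x)
    F = (np.arange(1, n + 1) - 0.5) / n        # midpoint empirical CDF
    z = (1.0 - F) ** (nu - 1.0)                # decreasing rank weights
    cov = np.mean((x - x.mean()) * (z - z.mean()))
    return -nu * cov                           # nonnegative: x up, z down => cov <= 0
```

Raising ν puts more weight on the lower tail, which is how the EMG encodes stronger risk aversion than variance does.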
APA, Harvard, Vancouver, ISO, and other styles
35

"Bayesian analysis of structure credit risk models with micro-structure noises and jump diffusion." 2013. http://library.cuhk.edu.hk/record=b5549266.

Full text source
Abstract:
有實證研究表明,傳統的信貸風險結構模型顯著低估了違約概率以及信貸收益率差。傳統的結構模型有三個可能的問題:1. 因為正態假設,布朗模型在模擬公司資產價值的過程中未能捕捉到極端事件;2. 市場微觀結構噪聲扭曲了股票價格所包含信息;3. 在到期日前任何時間,標準BS 期權理論方法不足以描述任何破產的可能性。這些問題在過去的文獻中曾分別提及。而在本文中,在不同的信用風險結構模型的基礎上,我們提出了貝葉斯方法去估算公司價值的跳躍擴散過程和微觀結構噪聲。因為企業的資產淨值不能在市場上觀察,本文建議的貝葉斯方法可對隱藏變量和泊松衝擊作出一定的估算,並就後驗分佈進行財務分析。我們應用馬爾可夫鏈蒙特卡羅方法(MCMC)和吉布斯採樣計算每個參數的後驗分佈。以上的做法,允許我們檢查結構性信用風險模型的偏差主要是來自公司價值的分佈、期權理論方法或市場微觀結構噪聲。我們進行模擬研究以確定模型的表現。最後,我們以新興市場的數據實踐我們的模型。
There is empirical evidence that structural models of credit risk significantly underestimate both the probability of default and credit yield spreads. There are three potential sources of the problems in traditional structural models. First, the Brownian model driving the firm asset value process may fail to capture extreme events because of the normality assumption. Second, the market micro-structure noise in trading may distort the information contained in equity prices within the estimation process. Third, the standard Black and Scholes option-theoretic approach may be inadequate to describe the consequences of bankruptcy at any time before maturity. These potential problems have been handled separately in the literature. In this paper, we propose a Bayesian approach to simultaneously estimate the jump-diffusion firm value process and micro-structure noise from equity prices based on different structural credit risk models. As the firm asset value is not observable but the equity price is, the proposed Bayesian approach is useful in the estimation with hidden variable and Poisson shocks, and produces posterior distributions for financial analysis. We demonstrate the application using the Markov chain Monte Carlo (MCMC) method to obtain the posterior distributions of parameters and latent variable. The proposed approach enables us to check whether the bias of the structural credit risk model is mainly caused by the firm value distribution, the option-theoretic method or the micro-structure noise of the market. A simulation study is conducted to ascertain the performance of our model. We also apply our model to the emerging market data.
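The MCMC machinery the abstract relies on can be illustrated with a far simpler model than the jump-diffusion one studied: a random-walk Metropolis sampler for the drift of Gaussian log-returns with known volatility and a Gaussian prior. Every numeric setting below (σ = 0.2, prior, step size, iteration counts) is an assumption chosen for illustration, not a detail of the thesis.

```python
import math
import random

def log_posterior(mu, data, sigma=0.2, prior_mu=0.0, prior_sd=1.0):
    # Gaussian likelihood for log-returns plus a Gaussian prior on the drift mu
    # (unnormalised: constants cancel in the Metropolis acceptance ratio).
    ll = sum(-0.5 * ((x - mu) / sigma) ** 2 for x in data)
    lp = -0.5 * ((mu - prior_mu) / prior_sd) ** 2
    return ll + lp

def metropolis_hastings(data, n_iter=5000, step=0.05, seed=7):
    rng = random.Random(seed)
    mu, samples = 0.0, []
    for _ in range(n_iter):
        prop = mu + rng.gauss(0.0, step)        # random-walk proposal
        if math.log(rng.random()) < log_posterior(prop, data) - log_posterior(mu, data):
            mu = prop                            # accept
        samples.append(mu)
    return samples[n_iter // 2:]                 # discard burn-in half
```

The thesis alternates moves like this within a Gibbs scheme over parameters, latent firm values, and jump times; the acceptance step itself is the same.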
Detailed summary in vernacular field only.
Chan, Sau Lung.
Thesis (M.Phil.)--Chinese University of Hong Kong, 2013.
Includes bibliographical references (leaves 62-65).
Electronic reproduction. Hong Kong : Chinese University of Hong Kong, [2012] System requirements: Adobe Acrobat Reader. Available via World Wide Web.
Abstracts also in Chinese.
List of Tables --- p.vii
List of Figures --- p.viii
Chapter 1 --- Introduction --- p.1
Chapter 2 --- Background and Intuition --- p.5
Chapter 2.1 --- Merton Model with Trading Noise --- p.7
Chapter 2.2 --- Black-Cox Model with Default Barrier --- p.10
Chapter 2.3 --- Double Exponential Jump Diffusion Model (KJD Model) --- p.11
Chapter 2.4 --- Equity Value via Laplace Transforms --- p.13
Chapter 2.5 --- KJD Model with Trading Noises --- p.15
Chapter 3 --- Bayesian Analysis --- p.17
Chapter 3.1 --- Gibbs Sampling and Metropolis-Hastings Method --- p.17
Chapter 3.2 --- Merton Model with Trading Noises (M1) --- p.19
Chapter 3.2.1 --- Prior Distribution for M1 --- p.19
Chapter 3.2.2 --- Posterior Distribution for M1 --- p.20
Chapter 3.3 --- Merton Model with Default Barrier (M2) --- p.22
Chapter 3.3.1 --- Prior Distribution for M2 --- p.23
Chapter 3.3.2 --- Posterior Distribution for M2 --- p.23
Chapter 3.4 --- KJD Model with Trading Noises (M3) --- p.25
Chapter 3.4.1 --- Prior Distribution for M3 --- p.26
Chapter 3.4.2 --- Posterior Distribution for M3 --- p.27
Chapter 3.5 --- KJD Model with Default Barrier (M4) --- p.33
Chapter 3.5.1 --- Prior Distribution for M4 --- p.34
Chapter 3.5.2 --- Posterior Distribution for M4 --- p.35
Chapter 4 --- Numerical Examples --- p.42
Chapter 4.1 --- Simulation Analysis --- p.42
Chapter 4.2 --- Empirical Study --- p.46
Chapter 4.2.1 --- BEA and DBS, 2003-2004 --- p.46
Chapter 4.2.2 --- HSBC, 2008-2009 --- p.49
Chapter 5 --- Conclusion --- p.60
Bibliography --- p.62
APA, Harvard, Vancouver, ISO, and other styles
36

Zhang, Zhizun. "Modeling and Analyzing Systemic Risk in Complex Sociotechnical Systems The Role of Teleology, Feedback, and Emergence." Thesis, 2018. https://doi.org/10.7916/D83R24TQ.

Full text source
Abstract:
Recent systemic failures such as the BP Deepwater Horizon Oil Spill, the Global Financial Crisis, and the Northeast Blackout have reminded us, once again, of the fragility of complex sociotechnical systems. Although the failures occurred in very different domains and were triggered by different events, there are certain common underlying mechanisms of abnormalities driving these systemic failures. Understanding these mechanisms is essential to avoid such disasters in the future. Moreover, these disasters happened in sociotechnical systems, where both social and technical elements can interact with each other and with the environment. The nonlinear interactions among these components can lead to an “emergent” behavior – i.e., the behavior of the whole is more than the sum of its parts – that can be difficult to anticipate and control. Abnormalities can propagate through the systems to cause systemic failures. To ensure the safe operation and production of such complex systems, we need to understand and model the associated systemic risk. Traditional emphasis of chemical engineering risk modeling is on the technical components of a chemical plant, such as equipment and processes. However, a chemical plant is more than a set of equipment and processes, with the human elements playing a critical role in decision-making. Industrial statistics show that about 70% of accidents are caused by human error. So, new modeling techniques that go beyond the classical equipment/process-oriented approaches to include the human elements (i.e., the “socio” part of the sociotechnical systems) are needed for analyzing systemic risk of complex sociotechnical systems. This thesis presents such an approach: a new knowledge modeling paradigm for systemic risk analysis that goes beyond chemical plants by unifying different perspectives. First, we develop a unifying teleological, control-theoretic framework to model decision-making knowledge in a complex system.
The framework allows us to identify systematically the common failure mechanisms behind systemic failures in different domains. We show how cause-and-effect knowledge can be incorporated into this framework by using signed directed graphs. We also develop an ontology-driven knowledge modeling component and show how this can support decision-making by using a case study in public health emergency. This is the first such attempt to develop an ontology for public health documents. Lastly, from a control-theoretic perspective, we address the question, “how do simple individual components of a system interact to produce a system behavior that cannot be explained by the behavior of just the individual components alone?” Through this effort, we attempt to bridge the knowledge gap between control theory and complexity science.
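The signed directed graphs mentioned for cause-and-effect knowledge admit a compact qualitative-propagation sketch: nodes are process variables, edges carry +1 (reinforces) or −1 (opposes), and a deviation's sign is multiplied along each path. The three-node reactor example and its edge signs below are invented for illustration, not taken from the thesis.

```python
# Signed directed graph: cause -> list of (effect, edge sign).
SDG = {
    "cooling_flow":  [("reactor_temp", -1)],   # more cooling lowers temperature
    "reactor_temp":  [("reaction_rate", +1)],
    "reaction_rate": [("reactor_temp", +1)],   # exothermic positive feedback loop
}

def propagate(graph, start, sign, max_depth=5):
    """Qualitatively propagate a +1/-1 deviation through the signed digraph."""
    effects = {start: sign}
    frontier = [(start, sign)]
    for _ in range(max_depth):
        nxt = []
        for node, s in frontier:
            for child, edge_sign in graph.get(node, []):
                new = s * edge_sign
                if effects.get(child) != new:   # record first/changed effect only
                    effects[child] = new
                    nxt.append((child, new))
        frontier = nxt
        if not frontier:
            break
    return effects
```

A loss of cooling (−1 on `cooling_flow`) propagates to a temperature rise and an accelerating reaction, exposing the feedback loop such graphs are designed to surface.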
APA, Harvard, Vancouver, ISO, and other styles
37

Shen, Weiwei. "Portfolio optimization with transaction costs and capital gain taxes." Thesis, 2014. https://doi.org/10.7916/D8PK0D76.

Full text source
Abstract:
This thesis is concerned with a new computational study of optimal investment decisions with proportional transaction costs or capital gain taxes over multiple periods. The decisions are studied for investors who have access to a risk-free asset and multiple risky assets to maximize the expected utility of terminal wealth. The risky asset returns are modeled by a discrete-time multivariate geometric Brownian motion. As in the model in Davis and Norman (1990) and Lynch and Tan (2010), the transaction cost is modeled to be proportional to the amount of transferred wealth. As in the model in Dammon et al. (2001) and Dammon et al. (2004), the taxation rule is linear, uses the weighted average tax basis price, and allows an immediate tax credit for a capital loss. For the transaction costs problem, we compute both lower and upper bounds for optimal solutions. We propose three trading strategies to obtain the lower bounds: the hyper-sphere strategy (termed HS); the hyper-cube strategy (termed HC); and the value function optimization strategy (termed VF). The first two strategies parameterize the associated no-trading region by a hyper-sphere and a hyper-cube, respectively. The third strategy relies on approximate value functions used in an approximate dynamic programming algorithm. In order to examine their quality, we compute the upper bounds by a modified gradient-based duality method (termed MG). We apply the new methods across various parameter sets and compare their results with those from the methods in Brown and Smith (2011). We are able to numerically solve problems up to the size of 20 risky assets and a 40-year-long horizon. Compared with their methods, the three novel lower bound methods can achieve higher utilities. HS and HC are about one order of magnitude faster in computation times. The upper bounds from MG are tighter in various examples. The new duality gap is ten times narrower than the one in Brown and Smith (2011) in the best case. 
In addition, we illustrate how the no-trading region deforms when it reaches the borrowing constraint boundary in state space. To the best of our knowledge, this is the first study of the deformation in no-trading region shape resulting from the borrowing constraint. In particular, we demonstrate how the rectangular no-trading region generated in uncorrelated risky asset cases (see, e.g., Lynch and Tan, 2010; Goodman and Ostrov, 2010) transforms into a non-convex region due to the binding of the constraint. For the capital gain taxes problem, we allow wash sales and rule out "shorting against the box" by imposing nonnegativity on portfolio positions. In order to produce accurate results, we sample the risky asset returns from their continuous distribution directly, leading to a dynamic program with continuous decision and state spaces. We provide ingredients of effective error control in an approximate dynamic programming solution method. Accordingly, the relative numerical error in approximating value functions by a polynomial basis function is about 10E-5 measured by the l1 norm and about 10E-10 by the l2 norm. Through highly accurate numerical solutions and transformed state variables, we are able to explain the optimal trades through an associated no-trading region. We numerically show in the new state space the no-trading region has a similar shape and parameter sensitivity to that of the transaction costs problem in Muthuraman and Kumar (2006) and Lynch and Tan (2010). Our computational results elucidate the impact on the no-trading region from volatilities, tax rates, risk aversion of investors, and correlations among risky assets. To the best of our knowledge, this is the first time the no-trading region of the capital gain taxes problem has been shown to share such traits with that of the transaction costs problem. We also compute lower and upper bounds for the problem.
To obtain the lower bounds we propose five novel trading strategies: the value function optimization (VF) strategy from approximate dynamic programming; the myopic optimization and the rolling buy-and-hold heuristic strategies (MO and RBH); and the realized Merton's and hyper-cube strategies (RM and HC) from policy approximation. In order to examine their performance, we develop two upper bound methods (VUB and GUB) based on the duality technique in Brown et al. (2009) and Brown and Smith (2011). Across various sets of parameters, duality gaps between lower and upper bounds are smaller than 3% in most examples. We are able to solve the problem up to the size of 20 risky assets and a 30-year-long horizon.
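The hyper-cube (HC) strategy can be sketched as a rule that trades only when a weight drifts outside a no-trading cube around the target allocation, and then only back to the nearest cube boundary; any residual is assumed to sit in the risk-free asset. An illustrative sketch of that idea, not the thesis's implementation (where the cube is itself optimised):

```python
def rebalance_hypercube(weights, target, half_width):
    """Trade back to the cube boundary only when a weight leaves the
    no-trading hyper-cube [target - half_width, target + half_width]."""
    new = list(weights)
    for i, (w, t) in enumerate(zip(weights, target)):
        lo, hi = t - half_width, t + half_width
        if w < lo:
            new[i] = lo          # buy back up to the boundary
        elif w > hi:
            new[i] = hi          # sell down to the boundary
    return new
```

Trading to the boundary rather than to the target is what keeps transaction costs low: inside the cube the cost of trading outweighs the benefit of exact tracking.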
APA, Harvard, Vancouver, ISO, and other styles
38

Khanghahi, Houshang Farabi. "A risk-based approach to control of water quality impacts caused by forest road systems." Phd thesis, 2005. http://hdl.handle.net/1885/151699.

Full text source
APA, Harvard, Vancouver, ISO, and other styles
39

Alie, Kaylene Jean. "Assessment of business risk economic capital for South Africa banks : a response to Pillar 2 of Basel II." Thesis, 2016. https://hdl.handle.net/10539/23989.

Full text source
Abstract:
Thesis (M.M. (Finance & Investment))--University of the Witwatersrand, Faculty of Commerce, Law and Management, Wits Business School, 2016.
The study is an assessment of the current treatment of business risk as a significant risk type for financial institutions. It includes an industry analysis of the five major banks in South Africa, as well as international banks, and of how these banks currently manage business risk in the Pillar 2 supervisory process. It assesses economic capital frameworks and the importance of business risk in the risk assessment and measurement process in the global and local industry. Various methodologies have been researched to assess which statistical methods are best suited to the measurement of this risk type and to the quantification of the required capital levels. The study compares the statistical methodologies currently used in the industry and concludes which is best suited, given the issues pertaining to modelling business risk quantification. A statistical model has been developed to quantify business risk for a specific bank using bank-specific data, with a methodology that is relatively generic and could be applied widely across financial institutions. The model serves to illustrate the principles surrounding the quantification of business risk economic capital.
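A common illustration of business risk economic capital is an earnings-at-risk calculation: simulate earnings under revenue shocks with costs that do not flex, then take expected earnings minus a far-tail earnings quantile. The model below (normal revenue, fixed cost ratio, 99.9% confidence) is a generic sketch of that principle, not the methodology developed in the thesis.

```python
import random
import statistics

def business_risk_capital(revenue_mu, revenue_sd, cost_ratio=0.7,
                          confidence=0.999, n_sims=100_000, seed=42):
    """Economic capital = expected earnings minus the (1 - confidence)
    earnings quantile, under an illustrative sticky-cost earnings model."""
    rng = random.Random(seed)
    earnings = []
    for _ in range(n_sims):
        revenue = rng.gauss(revenue_mu, revenue_sd)
        costs = cost_ratio * revenue_mu      # costs do not flex with revenue
        earnings.append(revenue - costs)
    earnings.sort()
    tail = earnings[int((1 - confidence) * n_sims)]   # ~0.1% worst outcome
    return statistics.mean(earnings) - tail
```

The sticky-cost assumption is what makes this *business* risk: the loss driver is a volume/margin shock the cost base cannot absorb, rather than a credit or market event.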
GR2018
APA, Harvard, Vancouver, ISO, and other styles
40

Aristeguieta, Alfonzo Otto D. "Multi-objective portfolio optimisation of upstream petroleum projects." 2008. http://hdl.handle.net/2440/47918.

Full text source
Abstract:
The shareholders of E&P companies evaluate the future performance of these companies in terms of multiple performance attributes. Hence, E&P decision makers have the task of allocating limited resources to available project proposals to deliver the best performance on these various attributes. Additionally, the performance of these proposals on these attributes is uncertain and the attributes of the various proposals are usually correlated. As a result of the above, the E&P portfolio optimisation decision setting is characterised by multiple attributes with uncertain future performance. Most recent contributions in the E&P portfolio optimisation arena seek to adapt modern financial portfolio theory concepts to the E&P project portfolio selection problem. These contributions generally focus on understanding the tradeoffs between risk and return for the attribute NPV while acknowledging the presence of correlation among the assets of the portfolio. The result is usually an efficient frontier where one objective is set over the expected value of the NPV and the other is set over a risk metric calculated from the same attribute where, typically, the risk metric has a closed form solution (e.g., variance, standard deviation, semi-standard deviation). However, this methodology fails to acknowledge the presence of multiple attributes in the E&P decision setting. To fill this gap, this thesis proposes a decision support model to optimise risk and return objectives extracted from the NPV attribute and from other financial and/or operational attributes simultaneously. The result of this approach is an approximate Pareto front that explicitly shows the tradeoffs among these objectives whilst honouring intra-project and inter-project correlations. Intra-project correlations are incorporated into the optimisation by integrating the single project models into the portfolio model to be optimised. Inter-project correlation is included by modelling the oil price as a global variable.
Additionally, the model uses a multi-objective simulation-optimisation approach and hence overcomes the need to use risk metrics with closed-form solutions. The model is applied to a set of realistic hypothetical offshore E&P projects. The results show the presence of complex relationships among the objectives in the approximate Pareto set. The ability of the method to unveil these relationships promises to bring more insight to decision makers and hence promote better investment decisions in the E&P industry.
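The approximate Pareto front at the heart of the approach rests on non-domination filtering, which can be sketched directly. Two objectives are assumed here for illustration (maximise expected NPV, minimise a risk metric), and the sample points are invented; a real multi-objective optimiser would generate candidates by simulation rather than take a fixed list.

```python
def dominates(a, b):
    """a dominates b if it is at least as good on both objectives and strictly
    better on one (objectives: maximise NPV, minimise risk)."""
    npv_a, risk_a = a
    npv_b, risk_b = b
    return (npv_a >= npv_b and risk_a <= risk_b) and (npv_a > npv_b or risk_a < risk_b)

def pareto_front(portfolios):
    """Keep the non-dominated (expected NPV, risk) points: the approximate Pareto set."""
    return [p for i, p in enumerate(portfolios)
            if not any(dominates(q, p) for j, q in enumerate(portfolios) if j != i)]
```

Presenting the surviving points, rather than a single "optimal" portfolio, is what lets decision makers inspect the tradeoffs the abstract emphasises.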
Thesis (M.Eng.Sc.) -- University of Adelaide, Australian School of Petroleum, 2008
APA, Harvard, Vancouver, ISO, and other styles
41

Aristeguieta, Alfonzo Otto D. "Multi-objective portfolio optimisation of upstream petroleum projects." Thesis, 2008. http://hdl.handle.net/2440/47918.

Full text source
Abstract:
The shareholders of E&P companies evaluate the future performance of these companies in terms of multiple performance attributes. Hence, E&P decision makers have the task of allocating limited resources to available project proposals to deliver the best performance on these various attributes. Additionally, the performance of these proposals on these attributes is uncertain and the attributes of the various proposals are usually correlated. As a result of the above, the E&P portfolio optimisation decision setting is characterised by multiple attributes with uncertain future performance. Most recent contributions in the E&P portfolio optimisation arena seek to adapt modern financial portfolio theory concepts to the E&P project portfolio selection problem. These contributions generally focus on understanding the tradeoffs between risk and return for the attribute NPV while acknowledging the presence of correlation among the assets of the portfolio. The result is usually an efficient frontier where one objective is set over the expected value of the NPV and the other is set over a risk metric calculated from the same attribute where, typically, the risk metric has a closed form solution (e.g., variance, standard deviation, semi-standard deviation). However, this methodology fails to acknowledge the presence of multiple attributes in the E&P decision setting. To fill this gap, this thesis proposes a decision support model to optimise risk and return objectives extracted from the NPV attribute and from other financial and/or operational attributes simultaneously. The result of this approach is an approximate Pareto front that explicitly shows the tradeoffs among these objectives whilst honouring intra-project and inter-project correlations. Intra-project correlations are incorporated into the optimisation by integrating the single project models into the portfolio model to be optimised. Inter-project correlation is included by modelling the oil price as a global variable.
Additionally, the model uses a multi-objective simulation-optimisation approach and hence overcomes the need to use risk metrics with closed-form solutions. The model is applied to a set of realistic hypothetical offshore E&P projects. The results show the presence of complex relationships among the objectives in the approximate Pareto set. The ability of the method to unveil these relationships promises to bring more insight to decision makers and hence promote better investment decisions in the E&P industry.
Thesis (M.Eng.Sc.) -- University of Adelaide, Australian School of Petroleum, 2008
APA, Harvard, Vancouver, ISO, and other styles
42

"Benefit, cost and risk analysis of designing: a third-party e-commerce logistics center." 2001. http://library.cuhk.edu.hk/record=b5895878.

Full text of the source
Abstract:
Fu Gang.
Thesis (M.Phil.)--Chinese University of Hong Kong, 2001.
Includes bibliographical references (leaves 71-72).
Abstracts in English and Chinese.
ABSTRACT OF THESIS ENTITLED --- p.I
ACKNOWLEDGEMENT --- p.III
TABLE OF CONTENT --- p.IV
LIST OF FIGURES --- p.VII
LIST OF TABLES --- p.VIII
Chapter CHAPTER 1 --- INTRODUCTION --- p.1
Chapter 1.1 --- A Third-party E-commerce Logistics Center in Need --- p.1
Chapter 1.2 --- Difficulty in Designing the Logistics Center --- p.2
Chapter 1.3 --- AHP and ANP --- p.3
Chapter 1.4 --- Scope of the Study --- p.4
Chapter 1.5 --- Organization of the Thesis --- p.5
Chapter CHAPTER 2 --- BACKGROUND AND LITERATURE REVIEW --- p.7
Chapter 2.1 --- Third-party E-commerce Logistics Center --- p.7
Chapter 2.2 --- "Government, Investors, and Users" --- p.8
Chapter 2.3 --- Center Design --- p.11
Chapter 2.3.1 --- Information and Physical Infrastructure --- p.11
Chapter 2.3.2 --- Ownership Arrangement --- p.12
Chapter 2.3.3 --- Design Alternatives --- p.13
Chapter 2.4 --- Evaluating Design Alternatives --- p.17
Chapter CHAPTER 3 --- AHP MODEL --- p.19
Chapter 3.1 --- Introduction of AHP --- p.19
Chapter 3.2 --- AHP Models for Government --- p.20
Chapter 3.2.1 --- Benefit to Government --- p.20
Chapter 3.2.2 --- Cost to Government --- p.23
Chapter 3.2.3 --- Risk to Government --- p.24
Chapter 3.3 --- AHP Models for Investors --- p.25
Chapter 3.3.1 --- Benefit to Investors --- p.25
Chapter 3.3.2 --- Cost to Investors --- p.28
Chapter 3.3.3 --- Risk to Investors --- p.29
Chapter 3.4 --- AHP Models for Users --- p.32
Chapter 3.4.1 --- Benefit to Users --- p.32
Chapter 3.4.2 --- Cost to Users --- p.34
Chapter 3.4.3 --- Risk to Users --- p.36
Chapter CHAPTER 4 --- RISK SHARING IN CENTER DESIGN - USING AHP MODEL --- p.38
Chapter 4.1 --- "Solution Methodology of Aggregating Benefit, Cost and Risk in AHP" --- p.38
Chapter 4.2 --- Aspects in Determining an Agreeable Solution --- p.40
Chapter 4.3 --- Sensitivity Analysis in AHP --- p.42
Chapter 4.4 --- A Conflict-Resolving Solution Procedure for AHP --- p.44
Chapter 4.5 --- An Illustrative Numerical Example in AHP --- p.48
Chapter CHAPTER 5 --- ANP MODEL --- p.51
Chapter 5.1 --- Introduction of ANP --- p.51
Chapter 5.2 --- ANP Models for Government --- p.53
Chapter 5.2.1. --- Benefit to Government --- p.55
Chapter 5.2.2. --- Cost to Government --- p.54
Chapter 5.2.3. --- Risk to Government --- p.54
Chapter 5.3 --- ANP Models for Investors --- p.56
Chapter 5.3.1 --- Benefit to Investors --- p.56
Chapter 5.3.2 --- Cost to Investors --- p.56
Chapter 5.3.3 --- Risk to Investors --- p.56
Chapter 5.4 --- ANP Models for Users --- p.56
Chapter 5.4.1 --- Benefit to Users --- p.56
Chapter 5.4.2 --- Cost to Users --- p.58
Chapter 5.4.3 --- Risk to Users --- p.58
Chapter CHAPTER 6 --- RISK SHARING IN CENTER DESIGN - USING ANP MODEL --- p.60
Chapter 6.1 --- Aggregated Benefit-Cost-Risk ANP Model --- p.60
Chapter 6.2 --- Sensitivity Analysis of ANP Model in an AHP Fashion --- p.61
Chapter 6.3 --- Sensitivity Analysis of General ANP Model --- p.62
Chapter 6.4 --- A Conflict-Resolving Solution Procedure for ANP --- p.63
Chapter 6.5 --- An Illustrative Numerical Example in ANP --- p.66
Chapter CHAPTER 7 --- CONCLUSION --- p.69
BIBLIOGRAPHY --- p.71
43

Brock, Terry A. "A comparison of deterministic and probabilistic radiation dose assessments at three fictitious ¹³⁷Cs contaminated sites in California, Colorado, and Florida." Thesis, 1997. http://hdl.handle.net/1957/34111.

Full text of the source
44

Le, Roux Gabriël Jacobus. "A quantitative risk analysis model for private security managers." Thesis, 2004. http://hdl.handle.net/10500/1073.

Full text of the source
Abstract:
An easy-to-use quantitative risk analysis model is developed for the private security industry in South Africa, which can serve as a suitable analysis tool in the hands of the private security manager. This model incorporates interrelated concepts such as probability, impact, cost of risk, degree of correction and the newly established human factor concept, which cannot be seen in isolation. This latter concept plays a major part in the overall risk quantification process by establishing a more accurate risk score rating. The human factor concept, also known as the "CHHP" approach, rounds the model off as an effective measuring instrument. Human factors such as (i) control measures, (ii) human attitude towards the risk, (iii) handling of the risk and (iv) understanding and implementation of policies and procedures are combined to form part of the total integrated quantitative risk analysis model, also known as the "TIQCAM" model. The "TIQCAM" model uses an Excel spreadsheet format as the principal means to illustrate the total integration of all risk concepts and provides the user of the model with a solid foundation for analysing physical and quantifiable security risks. The model also enables users to offer it as a value-added service to their clients.
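The kind of composite scoring this abstract describes can be sketched as a small weighted-sum calculation. The weights and the simple averaging of the four CHHP sub-factors below are hypothetical illustrations, not the calibrated TIQCAM values, which the thesis implements in an Excel spreadsheet.

```python
def chhp_factor(control, attitude, handling, policies):
    """Average of the four CHHP human-factor sub-scores (each assumed in [0, 1])."""
    return (control + attitude + handling + policies) / 4

def risk_score(probability, impact, cost, degree_of_correction, human_factor):
    """Illustrative composite risk score on a 0-100 scale.
    All inputs are assumed normalised to [0, 1]; a higher degree of
    correction lowers the score.  The weights are hypothetical."""
    score = (0.30 * probability
             + 0.30 * impact
             + 0.15 * cost
             + 0.10 * (1 - degree_of_correction)
             + 0.15 * human_factor)
    return round(100 * score, 1)

# Example: a moderate risk with poor staff handling and weak controls.
example = risk_score(0.7, 0.6, 0.4, 0.2, chhp_factor(0.8, 0.7, 0.9, 0.6))
```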
Criminology
D. Litt. et Phil. (Police Science)
45

Mac, Nicol Richard. "Sources and management of risk in large-scale sugarcane farming in KwaZulu-Natal, South Africa." Thesis, 2007. http://hdl.handle.net/10413/4502.

Full text of the source
Abstract:
The South African (SA) sugar industry supports approximately 50,940 small- and large-scale producers who collectively produce 22 million tons of sugarcane seasonally, on average. SA farmers face many challenges that lead to an uncertain decision-making environment. Despite a general consensus among agricultural economists that risk constitutes a prevalent feature of the production and marketing environment, various authors have recently stated that risk-related research has failed to provide a convincing argument that risk matters in farmers' decisions. The various shortcomings of previous research have been identified and recommendations for the future proposed. Recommendations include that the focus of future risk research should be on holistic risk management. This study first identified the perceived importance of 14 separate sources of risk for a sample of 76 large-scale commercial sugarcane farmers in KwaZulu-Natal. Once a sufficient understanding of the risk perceptions of respondents had been attained, their use of 12 risk-related management strategies was determined. Principal components analysis (PCA) was used to investigate how individual management instruments are grouped together by respondents into choice brackets in order to make use of complementary and substitution effects. The study then proposed and demonstrated a technique that may be used in future research to isolate the effects of risk on individual risk-related management responses by modelling the management strategies contained within individual choice brackets with two-stage least squares regression analysis (2SLS). The most important risk sources were found to be the threats posed by land reform, minimum wage legislation and the variability of the sugar price, in that order. PCA identified seven risk dimensions, collectively explaining 78% of the variance in all 14 risk sources considered.
These dimensions were: the "Crop Gross Income Index", "Macroeconomic and Political Index", "Legislation Index", "Labour and Inputs Index", "Human Capital and Credit Access Index", "Management Index" and the "Water Rights Index". Respondents were also asked questions regarding risk-related management strategies, including diversification of on-farm enterprises, investments and management time. PCA identified six management response brackets, collectively explaining 77% of the variance in the 12 responses considered. These response indexes were: the "Mechanisation and Management Bracket", "Enterprise and Time Diversification Bracket", "Insurance and Credit Reserve Bracket", "Geographic and Investment Diversification Bracket", "Land Trade Bracket" and the "Labour Bracket". Lastly, the study proposed a methodology for investigating the role of individuals' risk preferences in decision making. The recommended technique involves the simultaneous modelling of the major risk-related management strategies within each management response bracket, using 2SLS. A measure of risk preference was included in the 2SLS analysis to establish the influence of risk on decision making. By applying this methodology to the data obtained in this study, respondents were shown to be taking advantage of various complementary and substitution effects that exist between management responses. This was evident from the PCA and confirmed for the first previously identified management response bracket using 2SLS regression analysis. Risk attitude was shown to be a significant determinant of management decisions regarding the extent to which back-up management is kept in reserve. Important policy recommendations stemming from this study include that government review restrictive labour legislation and decrease the uncertainty surrounding new land redistribution legislation. 
Farmers need to make better use of available information by considering the effects of any single management decision on separate decisions, enabling them to take further advantage of substitution and complementary effects that may exist between management strategies previously considered in separate decision brackets. The fact that mechanisation and labour use occur in separate risk-related management response brackets in this study is an example of one such substitution effect that farmers do not seem to be utilising in terms of their management decision making. Future research using time series data is important in order to identify how risk perceptions and management portfolios change over time. Also, further research using the methodology proposed in this study may prove to be a useful means of more adequately addressing the question "Does risk matter in farmers' decisions?"
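The variance-explained computation behind the PCA this study applies can be sketched as follows; the respondent ratings here are synthetic stand-ins generated from a few latent dimensions, not the survey data.

```python
import numpy as np

# Synthetic stand-in for the survey: 76 respondents rating 14 risk sources,
# generated from 4 hidden risk dimensions plus noise (illustrative only).
rng = np.random.default_rng(0)
n_respondents, n_sources = 76, 14
latent = rng.normal(size=(n_respondents, 4))
loadings = rng.normal(size=(4, n_sources))
ratings = latent @ loadings + 0.5 * rng.normal(size=(n_respondents, n_sources))

# PCA on the correlation matrix (equivalent to standardising each variable).
corr = np.corrcoef(ratings, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]
explained = eigvals / eigvals.sum()
cumulative = np.cumsum(explained)

# Number of components needed to explain at least 78% of the variance,
# mirroring the seven-dimensions / 78% figure reported in the study.
n_components = int(np.searchsorted(cumulative, 0.78) + 1)
```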
Thesis (M.Sc.)-University of KwaZulu-Natal, Pietermaritzburg, 2007.
46

Hallee, Brian Todd. "Feed-and-bleed transient analysis of OSU APEX facility using the modern Code Scaling, Applicability, and Uncertainty method." Thesis, 2013. http://hdl.handle.net/1957/37872.

Full text of the source
Abstract:
The nuclear industry has long relied upon bounding parametric analyses in predicting the safety margins of reactor designs undergoing design-basis accidents. These methods are known to return highly conservative results, limiting the operating conditions of the reactor. The Best-Estimate Plus Uncertainty (BEPU) method, using a modernized version of the Code Scaling, Applicability, and Uncertainty (CSAU) methodology, has been applied to more accurately predict the safety margins of the Oregon State University Advanced Plant Experiment (APEX) facility experiencing a Loss-of-Feedwater Accident (LOFA). The statistical advantages of the Bayesian paradigm of probability were utilized to incorporate prior knowledge when determining the analysis required to justify the safety margins. RELAP5 Mod 3.3 was used to predict the thermal hydraulics of a primary feed-and-bleed response to the accident, with assumptions accompanying the lumped-parameter calculation approach. A novel coupling of thermal-hydraulic and statistical software was accomplished using the Symbolic Nuclear Analysis Package (SNAP). Uncertainty in Peak Cladding Temperature (PCT) was calculated at the 95/95 probability/confidence level under a series of four separate sensitivity studies.
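The 95/95 probability/confidence level mentioned above is commonly obtained nonparametrically from order statistics. A sketch of the first-order Wilks sample-size calculation, a standard BEPU ingredient assumed here rather than taken from this thesis, is:

```python
def wilks_sample_size(coverage=0.95, confidence=0.95):
    """Smallest number of random code runs n such that the largest observed
    value is a one-sided upper tolerance limit covering the `coverage`
    quantile with probability `confidence`.
    First-order Wilks criterion: 1 - coverage**n >= confidence."""
    n = 1
    while 1 - coverage ** n < confidence:
        n += 1
    return n

n_runs = wilks_sample_size()  # classic one-sided 95/95 limit: 59 runs
```

In a BEPU analysis, each run samples the uncertain inputs once, and the maximum PCT over the `n_runs` calculations serves as the 95/95 upper bound.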
Graduation date: 2013
47

Song, Zhibin. "Modeling and simulation of heat of mixing in li ion batteries." Thesis, 2015. http://hdl.handle.net/1805/7971.

Full text of the source
Abstract:
Indiana University-Purdue University Indianapolis (IUPUI)
Heat generation is a major safety concern in the design and development of Li-ion batteries (LIBs) for large-scale applications, such as electric vehicles. The total heat generation in LIBs includes entropic heat, enthalpy, reaction heat, and heat of mixing. The main objective of this study is to investigate the influence of heat of mixing in LIBs and to understand whether it is necessary to consider the heat of mixing during the design and development of LIBs. In previous research, Thomas and Newman derived methods to compute heat of mixing in LIB cells. Their results show that the heat of mixing cannot be neglected in comparison with the other heat sources at a 2 C rate. In this study, the heat of mixing for different materials, porosities, particle sizes, and charging/discharging rates was investigated. A COMSOL mathematical model was built to simulate the heat generation of LIBs. The LIB model was based on Newman's model. LiMn2O4 and LiCoO2 were applied as the cathode materials, and LiC6 was applied as the anode material. The results for heat of mixing were compared with the other heat sources to investigate the weight of heat of mixing in the total heat generation. The heat of mixing in the cathode is smaller than the heat of mixing in the anode because the diffusivity of LiCoO2, 1 × 10⁻¹³ m²/s, is larger than the LiC6 diffusivity of 2.52 × 10⁻¹⁴ m²/s. In the comparison, the heat of mixing is not as large as the irreversible heat and reversible heat, but it still cannot be neglected. Finally, a special situation is discussed: the heat of mixing under the relaxation status. For instance, after drivers turn off their vehicles, the entropy, enthalpy and reaction heat in LIBs stop generating, but heat will still be generated due to the release of heat of mixing. Therefore, it is meaningful to investigate whether this process has a significant influence on the safety and cycle life of LIBs.
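Why the anode dominates the heat of mixing can be made concrete with the characteristic solid-state diffusion time, tau = r²/D. The diffusivities below are those quoted in the abstract; the particle radius is an assumed illustrative value, not one from the thesis.

```python
# Characteristic diffusion time tau = r^2 / D for a spherical particle of radius r.
D_LiCoO2 = 1.0e-13     # m^2/s, cathode diffusivity (from the abstract)
D_LiC6   = 2.52e-14    # m^2/s, anode diffusivity (from the abstract)
r = 5e-6               # m, assumed particle radius (illustrative)

tau_cathode = r ** 2 / D_LiCoO2   # s
tau_anode   = r ** 2 / D_LiC6     # s

# Slower diffusion in the anode means concentration gradients persist longer,
# so more heat of mixing is released there, including during relaxation
# after the cell stops being charged or discharged.
```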
48

Han, Baoguang. "Statistical analysis of clinical trial data using Monte Carlo methods." Thesis, 2014. http://hdl.handle.net/1805/4650.

Full text of the source
Abstract:
Indiana University-Purdue University Indianapolis (IUPUI)
In medical research, data analysis often requires complex statistical methods for which no closed-form solutions are available. Under such circumstances, Monte Carlo (MC) methods have found many applications. In this dissertation, we proposed several novel statistical models in which MC methods are utilized. In the first part, we focused on semicompeting risks data in which a non-terminal event is subject to dependent censoring by a terminal event. Based on an illness-death multistate survival model, we proposed flexible random effects models. Further, we extended our model to the setting of joint modeling, where both semicompeting risks data and repeated marker data are analyzed simultaneously. Since the proposed methods involve high-dimensional integrations, Bayesian Markov chain Monte Carlo (MCMC) methods were utilized for estimation. The use of Bayesian methods also facilitates the prediction of individual patient outcomes. The proposed methods were demonstrated in both simulation and case studies. In the second part, we focused on the re-randomization test, a nonparametric method that makes inferences solely based on the randomization procedure used in clinical trials. With this type of inference, a Monte Carlo method is often used to generate the null distribution of the treatment difference. However, an issue was recently discovered when subjects in a clinical trial were randomized with unbalanced treatment allocation to two treatments according to the minimization algorithm, a randomization procedure frequently used in practice. The null distribution of the re-randomization test statistic was found not to be centered at zero, which compromised the power of the test. In this dissertation, we investigated the properties of the re-randomization test and proposed a weighted re-randomization method to overcome this issue. The proposed method was demonstrated through extensive simulation studies.
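The basic Monte Carlo re-randomization test this dissertation builds on can be sketched as follows. This simple version re-randomizes by shuffling a balanced allocation; the dissertation's problematic setting (minimization with unbalanced allocation) would replace the shuffle with the actual allocation algorithm and, per its findings, a weighted statistic. The toy data are illustrative.

```python
import random

def rerandomization_test(outcomes, assignments, n_mc=2000, seed=1):
    """Monte Carlo re-randomization p-value for the observed difference in
    group means (treatment == 1 vs control == 0)."""
    rng = random.Random(seed)

    def mean_diff(labels):
        treated = [y for y, g in zip(outcomes, labels) if g == 1]
        control = [y for y, g in zip(outcomes, labels) if g == 0]
        return sum(treated) / len(treated) - sum(control) / len(control)

    observed = mean_diff(assignments)
    labels = list(assignments)
    null = []
    for _ in range(n_mc):
        rng.shuffle(labels)          # re-randomize under the null
        null.append(mean_diff(labels))
    p_value = sum(abs(d) >= abs(observed) for d in null) / n_mc
    return observed, p_value

# Toy trial: eight subjects, balanced 1:1 allocation (illustrative data).
outcomes = [1.2, 0.8, 1.5, 0.9, 2.9, 3.1, 2.7, 3.3]
assignments = [0, 0, 0, 0, 1, 1, 1, 1]
observed, p_value = rerandomization_test(outcomes, assignments)
```

Because shuffling preserves the group sizes, the null distribution here is centered at zero; the centering problem arises only when the re-randomization procedure itself produces unbalanced allocations.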
49

Bruder, Slawa Romana. "Prediction of Spatial-Temporal Distribution of Algal Metabolites in Eagle Creek Reservoir, Indianapolis, IN." Thesis, 2012. http://hdl.handle.net/1805/3043.

Full text of the source
Abstract:
Indiana University-Purdue University Indianapolis (IUPUI)
In this research, Environmental Fluid Dynamics Code (EFDC) and Adaptive Network-based Fuzzy Inference System (ANFIS) models were developed and implemented to determine the spatial-temporal distribution of the cyanobacterial metabolites 2-MIB and geosmin in Eagle Creek Reservoir, IN. The research is based on the current need for understanding algae dynamics and developing prediction methods for algal taste and odor release events. In this research the methodology for prediction of 2-MIB and geosmin production was explored. The approach incorporated a combination of numerical and heuristic modeling to show its capabilities in predicting cyanobacteria metabolites. The reservoir's variable data measured at monitoring stations, consisting of chemical/physical and biological parameters with the addition of calculated mixing conditions within the reservoir, were used to train and validate the models. The ANFIS model performed satisfactorily in predicting the metabolites, in spite of multiple model constraints. The predictions followed the generally observed trends of algal metabolites during the three seasons over three years (2008-2010). The randomly selected data pairs for geosmin validation achieved a coefficient of determination of 0.78, while the 2-MIB validation was not accepted due to large differences between two observations and their model predictions. Although these ANFIS results were accepted, the further application of the ANFIS model coupled with the numerical models to predict the spatio-temporal distribution of metabolites showed serious limitations, due to numerical model calibration errors. The EFDC-ANFIS model over-predicted Pseudanabaena spp. biovolumes for selected stations: the predicted value was 18,386,540 mm³/m³, while the observed value was 942,478 mm³/m³. The model simulating Planktothrix agardhii gave negative biovolumes, which were assumed to represent the zero values observed at the station.
The taste and odor metabolite geosmin was under-predicted, with a predicted concentration of 3.43 ng/L compared to the observed value of 11.35 ng/L. The 2-MIB model did not validate during the EFDC-to-ANFIS model evaluation. The proposed approach and developed methodology could be used for future applications if the limitations are appropriately addressed.
50

Li, Li. "Spatio-temporal analyses of the distribution of alcohol outlets in California." Thesis, 2014. http://hdl.handle.net/1805/6463.

Full text of the source
Abstract:
Indiana University-Purdue University Indianapolis (IUPUI)
The objective of this research is to examine the development of California alcohol outlets over time and the relationship between neighborhood characteristics and the densities of the alcohol outlets. Two types of advanced analyses were performed after the usual preliminary description of the data. First, fixed- and random-effects linear regressions were used for the county panel data across time (1945-2010), with a dummy variable added to capture the change in law regarding limitations on alcohol outlet density. Second, a Bayesian spatio-temporal Poisson regression of the census tract panel data was conducted to capture recently available population characteristics affecting outlet density. A spatial conditional autoregressive (CAR) model was embedded in the Poisson regression to detect spatial dependency in the unexplained variance of alcohol outlet density. The results show that alcohol outlet density was reduced under the limitation law over time. However, the law was no longer effective in reducing the growth of alcohol outlets after the limitation was modified to be more restrictive. Neighborhoods that were poorer, had higher vacancy rates and had a lower percentage of Black residents tended to have higher alcohol outlet density (the ratio of the number of alcohol outlets to population) for both on-sale general and off-sale general outlets. Other characteristics, such as the percentage of Hispanics, the percentage of Asians, the percentage of younger population and the median income of adjacent neighborhoods, were associated with the densities of on-sale general and off-sale general alcohol outlets. Some regions, like the San Francisco Bay area and the Greater Los Angeles area, have more alcohol outlets than predicted by the neighborhood characteristics included in the model.
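The spatial dependency structure described above is typically encoded through a CAR prior on tract-level random effects, with outlet counts modeled as Poisson with a population offset. The toy four-tract map below and the values of tau and rho are illustrative assumptions, not fitted results from this thesis.

```python
import numpy as np

# Proper CAR prior on area random effects u: precision Q = tau * (D - rho * W),
# where W is the 0/1 adjacency matrix of neighborhoods and D holds each
# tract's neighbor count on the diagonal.  A 2x2 grid of four tracts with
# rook adjacency serves as a toy map.
W = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)
D = np.diag(W.sum(axis=1))
tau, rho = 2.0, 0.9      # illustrative precision and spatial-association values
Q = tau * (D - rho * W)

# With |rho| < 1 the precision matrix is positive definite, so the prior is
# proper; the Poisson layer would then model counts as
# outlets_i ~ Poisson(pop_i * exp(x_i' beta + u_i)).
eigvals = np.linalg.eigvalsh(Q)
```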