
Dissertations / Theses on the topic 'Risk and uncertainty theory'



Consult the top 50 dissertations / theses for your research on the topic 'Risk and uncertainty theory.'


You can also download the full text of each publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Martinez-Correa, Jimmy. "Decisions under Risk, Uncertainty and Ambiguity: Theory and Experiments." Digital Archive @ GSU, 2012. http://digitalarchive.gsu.edu/rmi_diss/29.

Abstract:
I combine theory, experiments and econometrics to undertake the task of disentangling the subtleties and implications of the distinction between risk, uncertainty and ambiguity. One general conclusion is that the elements of this methodological trilogy are not equally advanced. For example, new experimental tools must be developed to adequately test the predictions of theory. My dissertation is an example of this dynamic between theoretical and applied economics.
2

Walker, Kenneth C. "Rhetorics of Uncertainty: Networked Deliberations in Climate Risk." Diss., The University of Arizona, 2015. http://hdl.handle.net/10150/556604.

Abstract:
This dissertation applies a mixed-methods model across three cases of climate risk in order to examine the rhetorical dynamics of uncertainties. I argue that a rhetorical approach to uncertainties can effectively scaffold civic agency in risk communication by translating conflicting interests and creating sites of public participation. By tracing the networks of scientists and their artifacts through cases of climate risk, I demonstrate how the performances of scientific ethos and their material-discursive technologies facilitate the personalization of risk as a form of scientific prudence, and thus a channel to feasible political action. I support these claims through a rhetorical model of translation, which hybridizes methods from discourse analysis and Actor-Network Theory (ANT) in order to assemble a data-driven and corpus-based approach to rhetorical analysis. From this rhetorical perspective, uncertainties expand on our notions of risk because they reveal associations between scientific inquiries, probability assessments, and the facilitation of political dialogues. In each case, the particular insight of the model reveals a range of rhetorical potentials in climate risk that can be confronted through uncertainties.
3

Pank Roulund, Rasmus. "Essays in empirical economics." Doctoral thesis, European University Institute, 2019. http://hdl.handle.net/1814/62944.

Abstract:
Defence date: 20 May 2019
Examining Board: Prof. Jerome Adda (Supervisor); Prof. Piero Gottardi, University of Essex; Prof. Rosemarie Nagel, Universitat Pompeu Fabra; Prof. Glenn W. Harrison, Georgia State University
The first chapter is co-authored with Nicolás Aragón and examines how participant and market confidence affect the outcomes in an experimental asset market where the fundamental value is known by all participants. Such a market should, in theory, clear at the expected value in each period. However, the literature has shown that bubbles often occur in these markets. We measure the confidence of each participant by asking them to forecast the one-period-ahead price as a discrete probability mass distribution. We find that confidence not only affects price formation in markets, but is important in explaining the dynamics of bubbles. Moreover, as traders' confidence grows, they become increasingly optimistic, thus increasing the likelihood of price bubbles. The second chapter also deals with expectations and uncertainty, but from a different angle. It asks how increased uncertainty affects economic demand in a particular sector, using a discrete-choice demand framework. To investigate this issue, I examine empirically to what extent varying uncertainty affects consumer demand for flight traffic using US micro demand data. I find that the elasticity of uncertainty on demand is economically and statistically significant. The third chapter presents a more practical side to the issue examined in the first chapter. It describes how to elicit participants' expectations in an economic experiment. The methodology is based on Harrison et al. (2017). The tool makes it easier for participants in economic experiments to forecast the movements of a key variable as discrete values using a discrete probability mass distribution that can be "drawn" on a virtual canvas using the mouse. The module I wrote is general enough that it can be included in other economic experiments.
1. Certainty and Decision-Making in Experimental Asset Markets
   1.1. Literature Review
   1.2. Hypotheses
   1.3. Experimental Design
      1.3.1. The asset market
      1.3.2. Eliciting traders' beliefs
      1.3.3. Risk, Ambiguity and Hedging
   1.4. Overview of experimental data
      1.4.1. Summary of the trade data
      1.4.2. Expectation data
   1.5. Results
      1.5.1. Predictions and forecast
      1.5.2. Convergence of expectations
      1.5.3. Market volatility and initial expectations
      1.5.4. Explanatory power of certainty on price formation
   1.6. Conclusion
2. The impact of macroeconomic uncertainty on demand:
   2.1. Introduction
   2.2. Literature review
   2.3. A model of demand for flights
      2.3.1. Demand
      2.3.2. Firms
   2.4. Data
      2.4.1. The characteristics of the products
      2.4.2. Market and macroeconomic characteristics
      2.4.3. Instruments
      2.4.4. Product shares
   2.5. Results
   2.6. Conclusion
3. forecast.js: a module for measuring expectation in economic experiments
   3.1. Background
      3.1.1. Eliciting Expectations in Experimental Finance
      3.1.2. Eliciting a Distribution of Beliefs: Theoretical Considerations
   3.2. Using the forecast.js module
      3.2.1. Calibration
      3.2.2. Accessing the forecast data
   3.3. The generated data
      3.3.1. Example of individual expectations
      3.3.2. Timing Considerations
      3.3.3. Prediction precision over time
   3.4. Conclusion
Bibliography
A. Appendix to Chapter 1
   A.1. Further robustness checks
      A.1.1. Additional graph for Hypothesis 2
      A.1.2. Increased agreement with the Bhattacharyya coefficient
      A.1.3. Additional robustness checks for Hypothesis 3
   A.2. Instructions for experiment
      A.2.1. General Instructions
      A.2.2. How to use the computerized market
   A.3. Questionnaire
      A.3.1. Before Session
      A.3.2. After Session
B. Appendix to Chapter 3
   B.1. Robustness check of precision
   B.2. Using forecast.js in a standalone HTML page
   B.3. Using forecast.js with oTree
      B.3.1. Setting up models.py
      B.3.2. The pages.py file
      B.3.3. Display forecast modules on the pages
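Appendix item A.1.2 above measures agreement between traders' elicited probability mass forecasts with the Bhattacharyya coefficient. A minimal sketch of that computation follows; the price bins and the two forecasts are invented for illustration only.

```python
import math

def bhattacharyya(p, q):
    """Bhattacharyya coefficient between two discrete pmfs on the same
    support: 1.0 means identical forecasts, 0.0 means disjoint support."""
    if not math.isclose(sum(p), 1.0) or not math.isclose(sum(q), 1.0):
        raise ValueError("inputs must be normalized pmfs")
    return sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))

# Two traders' one-period-ahead price forecasts over the same five price bins
trader_a = [0.0, 0.1, 0.6, 0.3, 0.0]
trader_b = [0.0, 0.2, 0.5, 0.3, 0.0]
print(round(bhattacharyya(trader_a, trader_b), 3))   # 0.989
```

Values near 1 would indicate the convergence of expectations studied in section 1.5.2.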
4

Li, Kehan. "Stress, uncertainty and multimodality of risk measures." Thesis, Paris 1, 2017. http://www.theses.fr/2017PA01E068.

Abstract:
In this thesis, we focus on the stress, uncertainty and multimodality of risk measures, with special attention to two parts. The results have a direct influence on the computation of bank economic and regulatory capital. First, we provide a novel risk measure, the Spectrum Stress VaR (SSVaR), to quantify and integrate the uncertainty of the Value-at-Risk. It is an implementation model of the stressed VaR proposed in Basel III. The SSVaR is based on the confidence interval of the VaR. We investigate the asymptotic distribution of the order statistic, which is a nonparametric estimator of the VaR, in order to build the confidence interval. Two confidence intervals are derived from either the asymptotic Gaussian result or the saddlepoint approach. We compare them with the bootstrapping confidence interval by simulations, showing that the confidence interval built from the saddlepoint approach is robust for different sample sizes, underlying distributions and confidence levels. Stress testing applications using the SSVaR are performed with historical stock index returns during a financial crisis, in order to identify potential violations of the VaR during turmoil periods on financial markets. Second, we investigate the impact of multimodality of distributions on VaR and ES calculations. Unimodal probability distributions have been widely used for parametric VaR computation by investors, risk managers and regulators. However, financial data may be characterized by distributions having more than one mode. For these data, we show that multimodal distributions may outperform unimodal distributions in the sense of goodness-of-fit. Two classes of multimodal distributions are considered: Cobb's family and the Distortion family. We develop an adapted rejection sampling algorithm that efficiently generates random samples from the probability density function of Cobb's family.
For the empirical study, two data sets are considered: a daily data set concerning operational risk, and a three-month scenario of market portfolio returns built with five-minute intraday data. With a complete spectrum of confidence levels, the VaR and the ES are calculated for both unimodal and multimodal distributions. We analyze the results to assess the practical value of using multimodal instead of unimodal distributions.
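The abstract's nonparametric VaR estimator is an order statistic, and its asymptotic Gaussian result yields a distribution-free confidence interval for the quantile. The sketch below illustrates that idea; the sample data and index arithmetic are illustrative assumptions, and the thesis's saddlepoint interval is not reproduced here.

```python
import math
import random
from statistics import NormalDist

def var_confidence_interval(losses, p=0.99, conf=0.95):
    """Nonparametric VaR as the empirical p-quantile (an order statistic),
    with a distribution-free confidence interval from the asymptotic
    Gaussian result: the p-quantile order-statistic index is approximately
    Normal(n*p, n*p*(1-p))."""
    x = sorted(losses)
    n = len(x)
    k = math.ceil(n * p)                       # order-statistic index of the VaR
    z = NormalDist().inv_cdf((1 + conf) / 2)   # two-sided normal quantile
    half = z * math.sqrt(n * p * (1 - p))
    lo = max(int(math.floor(n * p - half)), 1)
    hi = min(int(math.ceil(n * p + half)), n)
    return x[lo - 1], x[k - 1], x[hi - 1]      # lower bound, VaR, upper bound

# Illustrative standard-normal losses; the true 99% quantile is about 2.326
random.seed(0)
sample = [random.gauss(0.0, 1.0) for _ in range(5000)]
lo, var99, hi = var_confidence_interval(sample)
print(lo <= var99 <= hi)   # True
```

In a stress-testing application, VaR violations would be flagged whenever realized losses exceed the upper bound of this interval.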
5

Raykov, Radoslav S. "Essays in Applied Microeconomic Theory." Thesis, Boston College, 2012. http://hdl.handle.net/2345/bc-ir:104087.

Abstract:
Thesis advisor: Utku Unver
This dissertation consists of three essays in microeconomic theory: two focusing on insurance theory and one on matching theory. The first chapter is concerned with catastrophe insurance. Motivated by the aftermath of Hurricane Katrina, it studies a strategic model of catastrophe insurance in which consumers know that they may not get reimbursed if too many other people file claims at the same time. The model predicts that the demand for catastrophe insurance can "bend backwards" to zero, resulting in multiple equilibria and especially in market failure, which is always an equilibrium. This shows that a catastrophe market can fail entirely for demand-driven reasons, a result new to the literature. The model suggests that pricing is key to the credibility of catastrophe insurers: instead of increasing demand, price cuts may backfire and cause a "race to the bottom." However, small amounts of extra liquidity can restore the system to stable equilibrium, highlighting the importance of a functioning reinsurance market for large risks. These results remain robust both for expected utility consumer preferences and for expected utility's most popular alternative, rank-dependent expected utility. The second chapter develops a model of quality differentiation in insurance markets, focusing on two of their specific features: the fact that costs are uncertain, and the fact that firms are averse to risk. Cornerstone models of price competition predict that firms specialize in products of different quality (differentiate their products) as a way of softening price competition. However, real-world insurance markets feature very little differentiation. This chapter offers an explanation for this phenomenon by showing that cost uncertainty fundamentally alters the nature of price competition among risk-averse firms by creating a drive against differentiation.
This force becomes particularly pronounced when consumers are picky about quality, and is capable of reversing standard results, leading to minimum differentiation instead. The chapter concludes with a study of how the costs of quality affect differentiation by considering two benchmark cases: when quality is costless and when quality costs are convex (quadratic). The third chapter focuses on the theory of two-sided matching. Its main topic is inefficiencies that arise when agent preferences permit indifferences. It is well known that two-sided matching under weak preferences can result in matchings that are stable, but not Pareto efficient, which creates bad incentives for inefficiently matched agents to stay together. In this chapter I show that in one-to-one matching with weak preferences, the fraction of inefficiently matched agents decreases with market size if agents are sufficiently diverse; in particular, the proportion of agents who can Pareto improve in a randomly chosen stable matching approaches zero as the number of agents goes to infinity. This result shows that the relative degree of inefficiency vanishes in sufficiently large markets, but it does not provide a "cure-all" solution in absolute terms, because inefficient individuals remain even when their fraction is vanishing. Agent diversity is represented by the diversity of each person's preferences, which are assumed randomly drawn, i.i.d., from the set of all possible weak preferences. To demonstrate its main result, the chapter relies on the combinatorial properties of random weak preferences.
Thesis (PhD) — Boston College, 2012
Submitted to: Boston College. Graduate School of Arts and Sciences
Discipline: Economics
6

Kentel, Elçin. "Uncertainty Modeling Health Risk Assessment and Groundwater Resources Management." Diss., Georgia Institute of Technology, 2006. http://hdl.handle.net/1853/11584.

Abstract:
Real-world problems, especially ones that involve natural systems, are complex and composed of many non-deterministic components. Uncertainties associated with these non-deterministic components may originate from randomness or from imprecision due to lack of information. Until recently, uncertainty, regardless of its nature or source, has been treated by probability concepts. However, uncertainties associated with real-world systems are not limited to randomness. Imprecise, vague or incomplete information may be better represented by other mathematical tools, such as fuzzy set theory, possibility theory, belief functions, etc. New approaches which allow utilization of probability theory in combination with these new mathematical tools have found applications in various engineering fields. Uncertainty modeling in human health risk assessment and groundwater resources management is investigated in this thesis. In the first part of this thesis, two new approaches which utilize both probability theory and fuzzy set theory concepts to treat parameter uncertainties in carcinogenic risk assessment are proposed. As a result of these approaches, fuzzy health risks are generated. For the fuzzy risk to be useful for practical purposes, its acceptability with respect to a compliance guideline has to be evaluated. A new fuzzy measure, the risk tolerance measure, is proposed for this purpose. The risk tolerance measure is a weighted average of the possibility and the necessity measures which are currently used for decision making purposes. In the second part of this thesis, two decision making frameworks are proposed to determine the best groundwater resources management strategy in the Savannah region, Georgia. Groundwater resources management problems, especially ones in coastal areas, are complex and require treatment of various uncertain inputs.
The first decision making framework proposed in this study is composed of a coupled simulation-optimization model followed by a fuzzy multi-objective decision making approach, while the second framework includes a groundwater flow model in which the parameters of the flow equation are characterized by fuzzy numbers, together with a decision making approach which utilizes the risk tolerance measure proposed in the first part of this thesis.
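The abstract defines the risk tolerance measure as a weighted average of the possibility and necessity measures of a fuzzy risk meeting a compliance guideline. A rough sketch under assumptions of mine: a triangular membership function for the fuzzy risk, and an even weight between the two measures.

```python
def possibility_below(g, a, b, c):
    """Pos(R <= g) for a triangular fuzzy risk R = (a, b, c), a <= b <= c."""
    if g >= b:
        return 1.0
    if g <= a:
        return 0.0
    return (g - a) / (b - a)

def necessity_below(g, a, b, c):
    """Nec(R <= g) = 1 - Pos(R > g) for the same triangular fuzzy risk."""
    if g >= c:
        return 1.0
    if g <= b:
        return 0.0
    return (g - b) / (c - b)

def risk_tolerance(g, a, b, c, w=0.5):
    """Weighted average of necessity and possibility, in the spirit of the
    thesis's risk tolerance measure; w encodes the decision maker's
    conservatism (w = 1 reduces to the pessimistic necessity measure)."""
    return w * necessity_below(g, a, b, c) + (1 - w) * possibility_below(g, a, b, c)

# Illustrative fuzzy lifetime cancer risk (in 1e-6 units) vs. a guideline of 1.2
print(risk_tolerance(1.2, 0.5, 1.0, 2.0, w=0.5))
```

A computed tolerance near 1 would indicate the fuzzy health risk comfortably satisfies the guideline; values near 0 would flag non-compliance.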
7

Zargar, Yaghoobi Amin H. "Handling uncertainty in hydrologic analysis and drought risk assessment using Dempster-Shafer theory." Thesis, University of British Columbia, 2012. http://hdl.handle.net/2429/43814.

Abstract:
The aim of this thesis is to turn some of the hydrologic analyses involved in drought risk assessment (DRA) into uncertainty-driven analyses, thereby improving the accuracy and informativeness of DRA. In DRA, risk, or the expected loss from drought hazard, is estimated by integrating the magnitude of hazard (i.e., drought severity) with vulnerability (i.e., susceptibility to losses from drought). Most hydrologic analyses, including DRA, are traditionally performed in a deterministic setting, ignoring data quality and uncertainty issues. Uncertainty can affect the accuracy of modeling results and undermine subsequent decision making. In order to handle uncertainty in DRA, this thesis uses the Dempster-Shafer theory (DST), which provides a unified platform for modeling and propagating uncertainty in the forms of variability, conflict and incompleteness. First, DST is used to model and propagate uncertainty arising from a high degree of conflict between two datasets of a drought hazard indicator, the snow water equivalent. Four DST combination rules are used for conflict resolution, and the results unanimously indicate a high possibility of drought. Second, the Standardized Precipitation Index (SPI) is used as a generic measure of hazard and is linked directly with wildfire risk in current and future climate scenarios. Using DST, modifications are introduced into the SPI, enabling the integration of uncertainty analysis with SPI processes. The resulting enhanced SPI can model the effects of long-term shifts in climate normals on drought hazard while simultaneously evaluating the significance of these shifts within the range of surrounding uncertainty. Finally, vulnerability to wildfire is simulated using the enhanced SPI and two additional variables: evaporation and firefighting capacity. The estimated risk indicates that forests in the Okanagan Basin are vulnerable to wildfires during the periods 2040-2069 and 2070-2099 unless firefighting capacity is enhanced at a presumed rate.
Through the successful implementation of DST into DRA processes, this research demonstrates the capability of DST in improving hydrologic analyses and enhancing informativeness in the water resources arena in general.
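The thesis applies four DST combination rules to conflicting evidence; the classical one is Dempster's rule, sketched below. The frame of discernment ({drought, no_drought}) and the mass values are invented for illustration, not taken from the thesis's snow-water-equivalent data.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose focal
    elements are frozensets over a common frame of discernment.  Mass
    falling on the empty set (conflict) is renormalized away."""
    combined = {}
    conflict = 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y
    if conflict >= 1.0:
        raise ValueError("total conflict: the sources are incompatible")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# Two (hypothetical) SWE-based assessments over the frame {drought, no_drought}
m1 = {frozenset({"drought"}): 0.7, frozenset({"drought", "no_drought"}): 0.3}
m2 = {frozenset({"drought"}): 0.5, frozenset({"no_drought"}): 0.2,
      frozenset({"drought", "no_drought"}): 0.3}
print(dempster_combine(m1, m2))
```

With these inputs most of the combined mass lands on {drought}, the kind of "high possibility of drought" conclusion the abstract reports; alternative rules (e.g. Yager's) handle the conflict mass differently.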
8

Niculescu, Mihai. "Towards a Unified Treatment of Risk and Uncertainty in Choice Research." University of Cincinnati / OhioLINK, 2009. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1249493228.

9

Zhao, Mingjun. "Essays on model uncertainty in macroeconomics." Columbus, Ohio : Ohio State University, 2006. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1153244452.

10

Garcia, Thomas. "A behavioral approach of decision making under risk and uncertainty." Thesis, Queensland University of Technology, 2019. https://eprints.qut.edu.au/132313/1/Thomas%20Jean-Christophe%20Lucien_Garcia_Thesis.pdf.

Abstract:
This thesis investigates how individuals make decisions under risk and uncertainty. It is composed of four essays that theoretically and experimentally investigate decision-making. First, I study situations where individuals must decide whether an event has occurred using uncertain evidence. I highlight that individuals tend to maximize accuracy instead of maximizing expected payoffs, and find that this is partially due to the existence of a value of being right and a recency bias. Second, I study how ambiguity about the costs or the benefits of a donation affects donation behavior. I show that individuals use ambiguity strategically as an excuse to behave less generously without feeling guilty. Finally, I study the external validity of risk preference measures based on a representative panel of the Dutch population. I find that risk preference measures are related to behavior in experimental risk tasks; however, they are not related to risk-taking in the field.
11

Cook, Victoria Tracy 1960. "The effects of temporal uncertainty resolution on the overall utility and suspense of risky monetary and survival gambles /." Thesis, McGill University, 1989. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=75966.

Abstract:
We extend Kreps and Porteus' (1978, 1979a,b) temporal utility theory to include measures of suspense for gambles that vary in the timing of uncertainty resolution. Our $f^t$-modification (of their theory) defines overall utility and suspense in terms of two functions: a standard utility function and an iterative function whose properties determine attitude towards temporal uncertainty resolution. Suspense, which is increasing with time delay to uncertainty resolution, is defined as the "variance" of the standard utilities of the outcome streams taken about our measure of overall utility (rather than about the standard mean utility). We explore the properties of our measures and their implications for the overall utility and suspense of various key examples. Two preliminary experiments are reported which give some support for our overall utility and suspense measures, and which suggest that risk and suspense are different concepts. Iteration theory is also discussed in some detail.
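The suspense measure described above, the "variance" of standard utilities taken about the overall utility, can be sketched directly. As a placeholder I substitute plain expected utility for the overall-utility measure, since the abstract does not specify the iterative function; the gamble is invented for illustration.

```python
def suspense(probs, utilities, overall_utility):
    """'Variance' of the standard utilities of the outcome streams,
    taken about the overall-utility measure (not the mean utility),
    per the abstract's definition of suspense."""
    return sum(p * (u - overall_utility) ** 2 for p, u in zip(probs, utilities))

# Placeholder: ordinary expected utility stands in for the overall-utility
# measure; the thesis derives it from an iterative function that captures
# attitude toward the timing of uncertainty resolution.
probs, utils = [0.5, 0.5], [0.0, 1.0]
eu = sum(p * u for p, u in zip(probs, utils))
print(suspense(probs, utils, eu))   # 0.25 for a 50/50 gamble
```

Under this placeholder, a degenerate gamble has zero suspense and a 50/50 gamble between the extreme utilities maximizes it, matching the intuition that suspense peaks when the outcome is most in doubt.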
12

Humphrey, Steven James. "The economics and psychology of decision making under risk and uncertainty : an experimental investigation and integrating behavioural framework." Thesis, University of East Anglia, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.338218.

13

Garcia, Thomas. "A behavioral approach of decision making under risk and uncertainty." Thesis, Lyon, 2019. http://www.theses.fr/2019LYSE2042/document.

Abstract:
This thesis investigates how individuals make decisions under risk and uncertainty. It is composed of four essays that theoretically and experimentally investigate decision-making. The first two essays study situations where a decision maker has to decide whether an event has occurred using uncertain evidence. Accurately identifying that this event has occurred is rewarded more than accurately identifying that it has not occurred. This decision problem induces a divergence between two qualities of a decision: optimality and accuracy. Both essays reproduce such situations in a laboratory experiment based on perceptual tasks and analyze behavior using Signal Detection Theory to study the optimality-accuracy trade-off. The first essay confirms the existence of the trade-off, with a leading role for accuracy, and explains the trade-off by individuals' concern for being right. The second essay finds that presenting perceptual evidence last contributes to the existence of the optimality-accuracy trade-off. The third essay studies how other-regarding preferences interact with attitudes toward ambiguity. It reports the results of an experiment where subjects have to make donations to charities. Donations may have either ambiguous costs or ambiguous benefits. We find that other-regarding preferences are weakened under ambiguity; in other terms, we highlight that individuals use ambiguity as an excuse not to give. This excuse-driven behavior is stronger for ambiguous costs than for ambiguous benefits. The fourth essay challenges the external validity of laboratory risk preference measures using behavior in experimental risk tasks and naturally occurring behavior under risk. We find that risk preference measures are related to the former but fail to explain the latter.
14

Voßmann, Frank. "Decision weights in choice under risk and uncertainty : measurement and decomposition /." [S.l. : s.n.], 2004. http://www.gbv.de/dms/zbw/490610218.pdf.

15

Broll, Udo, Peter Wenzel, and Kit Pong Wong. "Multinational Firm, Exchange Rate Risk and the Impact of Regret on Trade." Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2014. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-150460.

Abstract:
This paper examines the behavior of the regret-averse multinational firm under exchange rate uncertainty. The multinational firm simultaneously sells in the home market and exports to a foreign country. We characterize the multinational firm's regret-averse preferences by a modified utility function that includes disutility from having chosen ex-post suboptimal alternatives. The extent of regret depends on the difference between the actual home currency profit and the maximum home currency profit attained by making the optimal production and export decisions had the multinational firm observed the true realization of the random spot exchange rate. We show that the conventional results that the multinational firm optimally produces less, sells more domestically, and exports less abroad under uncertainty than under certainty hold if the multinational firm is not too regret averse. Using a simple binary model wherein the random spot exchange rate can take on either a low value or a high value with positive probability, we show that the multinational firm may optimally produce more, sell less domestically, and export more abroad under uncertainty than under certainty, particularly when the multinational firm is sufficiently regret averse and the low spot exchange rate is very likely to prevail.
16

Guimarães, Pedro Henrique Engel. "Three essays on macro-finance: robustness and portfolio theory." Repositório Institucional do FGV, 2017. http://hdl.handle.net/10438/19926.

Abstract:
This doctoral thesis is composed of three chapters related to portfolio theory and model uncertainty. The first paper investigates how ambiguity-averse agents explain the equity premium puzzle for a large group of countries, including both Advanced Economies (AE) and Emerging Markets (EM). In the second article, we develop a general robust allocation framework that is capable of dealing with parametric and non-parametric asset allocation models. In the final paper, I investigate portfolio selection criteria and analyze a set of portfolios' out-of-sample performance in terms of Sharpe ratio (SR) and Certainty Equivalent (CEQ).
APA, Harvard, Vancouver, ISO, and other styles
17

Nybrant, Arvid, and Henrik Rundberg. "Predicting Uncertainty in Financial Markets : -An empirical study on ARCH-class models ability to estimate Value at Risk." Thesis, Uppsala universitet, Statistiska institutionen, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-352381.

Full text
Abstract:
Value at Risk has over the last couple of decades become one of the most widely used measures of market risk. Several methods to compute this measure have been suggested. In this paper, we evaluate the use of the GARCH(1,1)-, EGARCH(1,1)- and the APARCH(1,1) model for estimation of this measure under the assumption that the conditional error distribution is normally-, t-, skewed t- and NIG-distributed respectively. For each model, the 95% and 99% one-day Value at Risk is computed using rolling out-of-sample forecasts for three equity indices. These forecasts are evaluated with Kupiec's test for unconditional coverage and Christoffersen's test for conditional coverage. The results imply that the models generally perform well. The APARCH(1,1) model seems to be the most robust model. However, the GARCH(1,1) and the EGARCH(1,1) models also provide accurate predictions. The results indicate that the assumption of conditional distribution matters more for 99% than 95% Value at Risk. Generally, a leptokurtic distribution appears to be a sound choice for the conditional distribution.
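The forecasting and backtesting machinery evaluated above can be sketched in a few lines. This is a simplified sketch assuming the GARCH(1,1) parameters have already been estimated and using the normal-error case only; the parameter values below are illustrative, not taken from the thesis.

```python
import math

def garch11_var(returns, omega, alpha, beta, z=2.326):
    """One-day-ahead Value at Risk from a GARCH(1,1) variance filter
    with normal errors; z is approximately the 99% normal quantile."""
    var_t = omega / (1.0 - alpha - beta)   # seed at the unconditional variance
    for r in returns:
        var_t = omega + alpha * r * r + beta * var_t
    return z * math.sqrt(var_t)            # loss level exceeded ~1% of days

def kupiec_lr(n, x, p=0.01):
    """Kupiec's unconditional-coverage likelihood ratio: x VaR violations
    in n days against expected rate p; approximately chi-squared(1)
    under correct coverage."""
    phat = x / n
    ll_null = (n - x) * math.log(1.0 - p) + x * math.log(p)
    ll_alt = 0.0
    if x > 0:
        ll_alt += x * math.log(phat)
    if x < n:
        ll_alt += (n - x) * math.log(1.0 - phat)
    return -2.0 * (ll_null - ll_alt)
```

With 3 violations in a 250-day window against an expected 2.5, the statistic is far below the 3.84 chi-squared critical value, so the 99% VaR model would not be rejected.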
APA, Harvard, Vancouver, ISO, and other styles
18

Olson, Erik Davin. "Conceptual Design and Technical Risk Analysis of Quiet Commercial Aircraft Using Physics-Based Noise Analysis Methods." Diss., Georgia Institute of Technology, 2006. http://hdl.handle.net/1853/11486.

Full text
Abstract:
An approach was developed which allows for design studies of commercial aircraft using physics-based noise analysis methods while retaining the ability to perform the rapid tradeoff and risk analysis studies needed at the conceptual design stage. A prototype integrated analysis process was created for computing the total aircraft EPNL at the Federal Aviation Regulations Part 36 certification measurement locations using physics-based methods for fan rotor-stator interaction tones and jet mixing noise. The analysis process was then used in combination with design of experiments to create response surface equations (RSEs) for the engine and aircraft performance metrics, geometric constraints and takeoff and landing noise levels. In addition, Monte Carlo analysis was used to assess the expected variability of the metrics under the influence of uncertainty, and to determine how the variability is affected by the choice of engine cycle. Finally, the RSEs were used to conduct a series of proof-of-concept conceptual-level design studies demonstrating the utility of the approach. The study found that a key advantage to using physics-based analysis during conceptual design lies in the ability to assess the benefits of new technologies as a function of the design to which they are applied. The greatest difficulty in implementing the physics-based analysis proved to be the generation of design geometry at a sufficient level of detail for high-fidelity analysis.
APA, Harvard, Vancouver, ISO, and other styles
19

Moon, Hyeun Jun. "Assessing Mold Risks in Buildings under Uncertainty." Diss., Georgia Institute of Technology, 2005. http://hdl.handle.net/1853/7279.

Full text
Abstract:
Microbial growth is a major cause of Indoor Air Quality (IAQ) problems. The implications of mold growth range from unacceptable musty smells and defacement of interior finishes, to structural damage and adverse health effects, not to mention lengthy litigation processes. Mold is likely to occur when a favorable combination of humidity, temperature, and substrate nutrient is maintained long enough. As many modern buildings use products that increase the likelihood of molds (e.g., paper and wood based products), reported cases have increased in recent years. Despite decades of intensive research efforts to prevent mold, modern buildings continue to suffer from mold infestation. The main reason is that current prescriptive regulations focus on the control of relative humidity only. However, recent research has shown that mold occurrences are influenced by a multitude of parameters with complex physical interactions. The set of relevant building parameters includes physical properties of building components, aspects of building usage, certain materials, occupant behavior, cleaning regime, HVAC system components and their operation, and others. Mold occurs mostly as the unexpected result of an unforeseen combination of the uncertain building parameters. Current deterministic mold assessment studies fail to give conclusive results. These simulations are based on idealizations of the building and its use, and are therefore unable to capture the effect of the random, situational, and sometimes idiosyncratic nature of building use and operation. The presented research takes a radically different approach, based on the assessment of the uncertainties of all parameters and their propagation through a mixed set of simulations using a Monte Carlo technique. This approach generates a mold risk distribution that reveals the probability of mold occurrence in selected trouble spots in a building. The approach has been tested on three building cases located in Miami and Atlanta.
In all cases the new approach was able to show the circumstances under which the mold risk could increase substantially, leading to a set of clear specifications for remediation and, for new designs, to A/E procurement methods that will significantly reduce any mold risk.
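The core of the probabilistic approach, sampling uncertain inputs and reading a risk off the resulting distribution, can be illustrated in a few lines. This is a deliberately toy sketch: the input distributions and the "favorability" index below are invented stand-ins for the mixed hygrothermal simulations the thesis actually uses.

```python
import random

def mold_risk_probability(n_draws=10_000, threshold=0.6, seed=1):
    """Monte Carlo sketch: draw uncertain building parameters (here,
    hypothetical indoor relative humidity and surface temperature),
    push each draw through a toy growth-favorability model, and report
    the share of draws above a threshold as the mold risk."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_draws):
        rh = rng.gauss(0.70, 0.08)         # relative humidity (fraction)
        temp = rng.gauss(22.0, 3.0)        # surface temperature, deg C
        favorability = rh * (1 - abs(temp - 25) / 25)  # toy index
        if favorability > threshold:
            hits += 1
    return hits / n_draws
```

The point of the design is that the output is a probability over trouble spots rather than the single pass/fail answer a deterministic simulation gives.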
APA, Harvard, Vancouver, ISO, and other styles
20

Gröhn, John Henrik, and Stefan Eriksson. "Jorden runt på fyra företag : En studie om hur rädsla för misslyckande påverkar internationaliseringsbeslut." Thesis, Linnéuniversitetet, Institutionen för organisation och entreprenörskap (OE), 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-45723.

Full text
Abstract:
The fear of failure is something most people encounter on a daily basis, and it is commonly accepted that the more that is at stake, the harder the process of reaching the right decision becomes. This study examines how the variable "fear of failure" affects a strategic decision to expand abroad. The study is based on a qualitative method, and four CEOs of internationalized companies have been interviewed. Positivistic and deductive approaches are applied. Among the four companies, risks were seen as a necessity for developing the organization, but unnecessary risks were avoided. Finally, the study shows that fear affects internationalization decisions, especially in the form of loss aversion and uncertainty avoidance, where the uncertainty increased as physical and psychological distances increased.
Att rädsla för att misslyckas finns omkring oss är något som de flesta är medvetna om och oftast är det så att ju mer som står på spel, desto svårare blir processen att komma fram till rätt beslut. Studien har undersökt hur variabeln rädsla för att misslyckas påverkar ett strategiskt beslut om att etablera sig utomlands. Uppsatsen är byggd på en kvalitativ metod där fyra internationaliserade företag har studerats genom intervjuer. Vidare utgår studien från ett positivistiskt synsätt och ett deduktivt angreppssätt tillämpas. Bland de fyra företagen sågs risker som nödvändigt för att utveckla organisationen, men man tog helst inte onödiga risker. Avslutningsvis visar studien att rädsla påverkar internationaliseringsbeslut framförallt i form av förlust- och osäkerhetsaversion, där osäkerheten ökade med ökade fysiska och psykiska avstånd.
APA, Harvard, Vancouver, ISO, and other styles
21

Cooksey, Kenneth Daniel. "A portfolio approach to design in the presence of scenario-based uncertainty." Diss., Georgia Institute of Technology, 2013. http://hdl.handle.net/1853/49036.

Full text
Abstract:
Current aircraft conceptual design practices result in the selection of a single (hopefully) Pareto optimal design to be carried forward into preliminary design. This paradigm is based on the assumption that carrying a significant number of concepts forward is too costly and thus early down-selection between competing concepts is necessary. However, this approach requires that key architectural design decisions which drive performance and market success are fixed very early in the design process, sometimes years before the aircraft actually goes to market. In the presence of uncertainty, if the design performance is examined for individual scenarios as opposed to measuring performance of the design with aggregate statistics, the author finds that the single concept approach can lead to less than desirable design outcomes. This thesis proposes an alternate conceptual design paradigm which leverages principles from economics (specifically the Nobel prize-winning modern portfolio theory) to improve design outcomes by intelligently selecting a small well diversified portfolio of concepts to carry forward through preliminary design, thus reducing the risk from external events that are outside of the engineer’s control. This alternate paradigm is expected to result in an increase in the overall profit by increasing the probability that the final design matches market needs at the time it goes to market. This thesis presents a portfolio based design approach, which leverages dynamic programming to enable a stochastic optimization of alternative portfolios of concepts. This optimization returns an optimized portfolio of concepts which are iteratively pruned to improve design outcomes in the presence of scenario-driven uncertainties. While dynamic programming is identified as a means for doing a stochastic portfolio optimization, dynamic programming is an analytical optimization process which suffers heavily from the curse of dimensionality. 
As a result, a new hybrid stochastic optimization process called the Evolutionary Cooperative Optimization with Simultaneous Independent Sub-optimization (ECOSIS) has been introduced. The ECOSIS algorithm leverages a co-evolutionary algorithm to optimize a multifaceted problem under uncertainty. ECOSIS allows for a stochastic portfolio optimization including the desired benefit-to-cost tradeoff for a well-diversified portfolio at the size and scope required for use in design problems. To demonstrate the applicability and value of a portfolio based design approach, an example application of the approach to the selection of a new 300 passenger aircraft is presented.
APA, Harvard, Vancouver, ISO, and other styles
22

Lee, Jae Min. "Households Saving and Reference Dependent Changes in Income and Uncertainty." The Ohio State University, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=osu1408967943.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Hollender, Julian. "Lévy-Type Processes under Uncertainty and Related Nonlocal Equations." Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2016. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-211795.

Full text
Abstract:
The theoretical study of nonlinear expectations is the focus of attention for applications in a variety of different fields — often with the objective to model systems under incomplete information. Especially in mathematical finance, advances in the theory of sublinear expectations (also referred to as coherent risk measures) lay the theoretical foundation for modern approaches to evaluations under the presence of Knightian uncertainty. In this book, we introduce and study a large class of jump-type processes for sublinear expectations, which can be interpreted as Lévy-type processes under uncertainty in their characteristics. Moreover, we establish an existence and uniqueness theory for related nonlinear, nonlocal Hamilton-Jacobi-Bellman equations with non-dominated jump terms.
APA, Harvard, Vancouver, ISO, and other styles
24

Smolarski, Jan M. (Jan Mietek). "Environmental Determinants and Choice of Project Evaluation Techniques in US and UK Firms." Thesis, University of North Texas, 1996. https://digital.library.unt.edu/ark:/67531/metadc277767/.

Full text
Abstract:
The purpose of this dissertation is to develop a theory that helps explain the conditions under which firms select certain project evaluation techniques. This study uses contingency theory to analyze the impact of environmental uncertainty on the choice of project evaluation techniques. In addition to a direct measure of uncertainty, several dimensions of uncertainty are included in this study. These dimensions of uncertainty include control structure, method of financing, foreign assets, method of growth, and product domination. This study also analyzes the use of project evaluation, management science and risk management techniques in US firms over time and in UK firms over time in order to compare to prior research. A comparison of firms in the two countries is also provided. The primary method of data collection was a survey instrument. Data were also collected from annual reports and various other public sources. The variables that appear significant in the choice of project evaluation technique in US firms are environmental uncertainty, control structure, method of financing, foreign assets, and product domination. The variable that appears significant in the choice of project evaluation technique in UK firms is method of financing. US firms favor discounted cash flow techniques, although this study detected a slight decrease over time. UK firms continue to use non-discounted cash flow techniques, although the use of discounted cash flow techniques is widespread. There are significant differences between US and UK firms. US firms tend to use discounted cash flow techniques to a greater extent than UK firms. This research makes a significant contribution in attempting to develop a theory explaining the use of project evaluation techniques in firms in the US and UK. In addition, several other developments relating to project evaluation, management science and risk management are discussed.
The results of this study can be used by managers in refining and improving their existing project evaluation processes.
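The contrast the survey tracks, discounted versus non-discounted cash flow techniques, can be made concrete with the two textbook representatives. A minimal sketch; the hurdle rate and cash flows in the test are illustrative.

```python
def npv(rate, cashflows):
    """Discounted cash flow technique: net present value of a cash flow
    received at period t, discounted at the given hurdle rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def payback_period(cashflows):
    """Non-discounted technique: number of periods until the cumulative
    (undiscounted) cash flow first turns non-negative."""
    total = 0.0
    for t, cf in enumerate(cashflows):
        total += cf
        if total >= 0:
            return t
    return None  # never recovered within the horizon
```

The two rules can rank the same project differently, which is why the choice of technique is worth explaining rather than assuming.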
APA, Harvard, Vancouver, ISO, and other styles
25

Jonsson, Robin. "Optimal Linear Combinations of Portfolios Subject to Estimation Risk." Thesis, Mälardalens högskola, Akademin för utbildning, kultur och kommunikation, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-28524.

Full text
Abstract:
The combination of two or more portfolio rules is theoretically convex in return-risk space, which provides for a new class of portfolio rules that gives purpose to the Mean-Variance framework out-of-sample. The author investigates the performance loss from estimation risk between the unconstrained Mean-Variance portfolio and the out-of-sample Global Minimum Variance portfolio. A new two-fund rule is developed in a specific class of combined rules, between the equally weighted portfolio and a mean-variance portfolio with the covariance matrix estimated by linear shrinkage. The study shows that this rule performs well out-of-sample when covariance estimation error and bias are balanced. The rule performs at least as well as its peer group in this class of combined rules.
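A combined rule of this kind can be sketched as a convex combination of weight vectors. This is a sketch, not the thesis' exact estimator: the shrinkage target (a scaled identity) and the fixed intensities `delta` and `kappa` are assumptions for illustration, where the thesis would calibrate them.

```python
import numpy as np

def combined_weights(returns, delta=0.5, kappa=0.5):
    """Two-fund rule sketch: convex combination of the equally weighted
    portfolio and a minimum-variance portfolio whose covariance matrix
    is linearly shrunk toward a scaled identity with intensity delta."""
    _, n_assets = returns.shape
    sample = np.cov(returns, rowvar=False)
    target = np.trace(sample) / n_assets * np.eye(n_assets)
    sigma = (1 - delta) * sample + delta * target       # shrunk covariance
    inv = np.linalg.inv(sigma)
    ones = np.ones(n_assets)
    w_mv = inv @ ones / (ones @ inv @ ones)             # min-variance weights
    w_ew = ones / n_assets                              # equally weighted
    return (1 - kappa) * w_ew + kappa * w_mv            # combined rule
```

Because both component portfolios are fully invested, any convex combination is too, which is the sense in which the class of combined rules is closed.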
APA, Harvard, Vancouver, ISO, and other styles
26

Nybrant, Arvid. "On Robust Forecast Combinations With Applications to Automated Forecasting." Thesis, Uppsala universitet, Statistiska institutionen, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-450807.

Full text
Abstract:
Combining forecasts have been proven as one of the most successful methods to improve predictive performance. However, while there often is a focus on theoretically optimal methods, this is an ill-posed issue in practice where the problem of robustness is of more empirical relevance. This thesis focuses on the latter issue, where the risk associated with different combination methods is examined. The problem is addressed using Monte Carlo experiments and an application to automated forecasting with data from the M4 competition. Overall, our results indicate that the choice of combining methodology could constitute an important source of risk. While equal weighting of forecasts generally works well in the application, there are also cases where estimating weights improve upon this benchmark. In these cases, many robust and simple alternatives perform the best. While estimating weights can be beneficial, it is important to acknowledge the role of estimation uncertainty as it could outweigh the benefits of combining. For this reason, it could be advantageous to consider methods that effectively acknowledge this source of risk. By doing so, a forecaster can effectively utilize the benefits of combining forecasts while avoiding the risk associated with uncertainty in weights.
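The contrast between the equal-weight benchmark and estimated weights can be sketched as follows. Least squares stands in for the weight estimators the thesis compares, and the data in the test are illustrative.

```python
import numpy as np

def equal_weight_combo(forecasts):
    """Equal weighting across columns (one column per model):
    the simple benchmark that is hard to beat in practice."""
    return forecasts.mean(axis=1)

def estimated_weight_combo(train_fc, train_actual, test_fc):
    """Weights estimated by least squares on a training window, then
    applied out of sample. The estimation error in these weights is
    itself a source of risk that can outweigh the gain from combining."""
    w, *_ = np.linalg.lstsq(train_fc, train_actual, rcond=None)
    return test_fc @ w
```

With a short training window the estimated weights are noisy, which is exactly the regime where the equal-weight benchmark tends to win.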
APA, Harvard, Vancouver, ISO, and other styles
27

Beisler, Matthias Werner. "Modelling of input data uncertainty based on random set theory for evaluation of the financial feasibility for hydropower projects." Doctoral thesis, Technische Universitaet Bergakademie Freiberg Universitaetsbibliothek "Georgius Agricola", 2011. http://nbn-resolving.de/urn:nbn:de:bsz:105-qucosa-71564.

Full text
Abstract:
The design of hydropower projects requires a comprehensive planning process in order to achieve the objective to maximise exploitation of the existing hydropower potential as well as future revenues of the plant. For this purpose and to satisfy approval requirements for a complex hydropower development, it is imperative at planning stage, that the conceptual development contemplates a wide range of influencing design factors and ensures appropriate consideration of all related aspects. Since the majority of technical and economical parameters that are required for detailed and final design cannot be precisely determined at early planning stages, crucial design parameters such as design discharge and hydraulic head have to be examined through an extensive optimisation process. One disadvantage inherent to commonly used deterministic analysis is the lack of objectivity for the selection of input parameters. Moreover, it cannot be ensured that the entire existing parameter ranges and all possible parameter combinations are covered. Probabilistic methods utilise discrete probability distributions or parameter input ranges to cover the entire range of uncertainties resulting from an information deficit during the planning phase and integrate them into the optimisation by means of an alternative calculation method. The investigated method assists with the mathematical assessment and integration of uncertainties into the rational economic appraisal of complex infrastructure projects. The assessment includes an exemplary verification to what extent the Random Set Theory can be utilised for the determination of input parameters that are relevant for the optimisation of hydropower projects and evaluates possible improvements with respect to accuracy and suitability of the calculated results
Die Auslegung von Wasserkraftanlagen stellt einen komplexen Planungsablauf dar, mit dem Ziel das vorhandene Wasserkraftpotential möglichst vollständig zu nutzen und künftige, wirtschaftliche Erträge der Kraftanlage zu maximieren. Um dies zu erreichen und gleichzeitig die Genehmigungsfähigkeit eines komplexen Wasserkraftprojektes zu gewährleisten, besteht hierbei die zwingende Notwendigkeit eine Vielzahl für die Konzepterstellung relevanter Einflussfaktoren zu erfassen und in der Projektplanungsphase hinreichend zu berücksichtigen. In frühen Planungsstadien kann ein Großteil der für die Detailplanung entscheidenden, technischen und wirtschaftlichen Parameter meist nicht exakt bestimmt werden, wodurch maßgebende Designparameter der Wasserkraftanlage, wie Durchfluss und Fallhöhe, einen umfangreichen Optimierungsprozess durchlaufen müssen. Ein Nachteil gebräuchlicher, deterministischer Berechnungsansätze besteht in der zumeist unzureichenden Objektivität bei der Bestimmung der Eingangsparameter, sowie der Tatsache, dass die Erfassung der Parameter in ihrer gesamten Streubreite und sämtlichen, maßgeblichen Parameterkombinationen nicht sichergestellt werden kann. Probabilistische Verfahren verwenden Eingangsparameter in ihrer statistischen Verteilung bzw. in Form von Bandbreiten, mit dem Ziel, Unsicherheiten, die sich aus dem in der Planungsphase unausweichlichen Informationsdefizit ergeben, durch Anwendung einer alternativen Berechnungsmethode mathematisch zu erfassen und in die Berechnung einzubeziehen. Die untersuchte Vorgehensweise trägt dazu bei, aus einem Informationsdefizit resultierende Unschärfen bei der wirtschaftlichen Beurteilung komplexer Infrastrukturprojekte objektiv bzw. mathematisch zu erfassen und in den Planungsprozess einzubeziehen. 
Es erfolgt eine Beurteilung und beispielhafte Überprüfung, inwiefern die Random Set Methode bei Bestimmung der für den Optimierungsprozess von Wasserkraftanlagen relevanten Eingangsgrößen Anwendung finden kann und in wieweit sich hieraus Verbesserungen hinsichtlich Genauigkeit und Aussagekraft der Berechnungsergebnisse ergeben
APA, Harvard, Vancouver, ISO, and other styles
28

Higgins, Paul Anthony. "Reducing uncertainty in new product development." Thesis, Queensland University of Technology, 2008. https://eprints.qut.edu.au/20273/1/Paul_Higgins_Thesis.pdf.

Full text
Abstract:
Research and Development engineering is the cornerstone of humanity's evolution. It is perceived to be a systematic creative process which ultimately improves the living standard of a society through the creation of new applications and products. The commercial paradigm that governs project selection, resource allocation and market penetration prevails when the focus shifts from pure research to applied research. Furthermore, the road to success through commercialisation is difficult for most inventors, especially in a vast and isolated country such as Australia, which is located a long way from wealthy and developed economies. While market leading products are considered unique, the actual process to achieve these products is essentially the same; progressing from an idea, through development to an outcome (if successful). Unfortunately, statistics indicate that only 3% of ‘ideas’ are significantly successful, 4% are moderately successful, and the remainder ‘evaporate’ in that form (Michael Quinn, Chairman, Innovation Capital Associates Pty Ltd). This study demonstrates and analyses two techniques developed by the author which reduce uncertainty in the engineering design and development phase of new product development and therefore increase the probability of a successful outcome. This study expands the existing knowledge of the engineering design and development stage in the new product development process and is couched in the identification of practical methods, which have been successfully used to develop new products by Australian Small Medium Enterprise (SME) Excel Technology Group Pty Ltd (ETG). Process theory is the term most commonly used to describe scientific study that identifies occurrences that result from a specified input state to an output state, thus detailing the process used to achieve an outcome. The thesis identifies relevant material and analyses recognised and established engineering processes utilised in developing new products.
The literature identified that case studies are a particularly useful method for supporting problem-solving processes in settings where there are no clear answers or where problems are unstructured, as in New Product Development (NPD). This study describes, defines, and demonstrates the process of new product development within the context of historical product development and a ‘live’ case study associated with an Australian Government START grant awarded to Excel Technology Group in 2004 to assist in the development of an image-based vehicle detection product. This study proposes two techniques which reduce uncertainty and thereby improve the probability of a successful outcome. The first technique provides a predicted project development path or forward engineering plan which transforms the initial ‘fuzzy idea’ into a potential and achievable outcome. This process qualifies the ‘fuzzy idea’ as a potential, rational or tangible outcome which is within the capability of the organisation. Additionally, this process proposes that a tangible or rational idea can be deconstructed in a reverse engineering process in order to create a forward engineering development plan. A detailed, structured forward engineering plan reduces the uncertainty associated with new product development unknowns and therefore contributes to a successful outcome. This is described as the RETRO technique. The study recognises however that this claim requires qualification and proposes a second technique. The second technique proposes that a two-dimensional spatial representation which has productivity and consumed resources as its axes provides an effective means to qualify progress and expediently identify variation from the predicted plan. This spatial representation technique allows a quick response which in itself has a prediction attribute associated with directing the project back onto its predicted path.
This process involves a coterminous comparison between the predicted development path and the evolving actual project development path. A consequence of this process is verification of progress or the application of informed, timely and quantified corrective action. This process also identifies the degree of success achieved in the engineering design and development phase of new product development, where success is defined as achieving a predicted outcome. This spatial representation technique is referred to as NPD Mapping. The study demonstrates that these are useful techniques which aid SMEs in achieving successful new product outcomes because the techniques are easily administered, measure and represent relevant development process related elements and functions, and enable expedient quantified responsive action when the evolving path varies from the predicted path. These techniques go beyond timeline representations such as Gantt charts and PERT analysis, and represent the base variables of consumed resource and productivity/technical achievement in a manner that facilitates higher level interpretation of time, effort, degree of difficulty, and product complexity in order to facilitate informed decision making. This study presents, describes, analyses and demonstrates an SME-focused engineering development technique, developed by the author, that produces a successful new product outcome which begins with a ‘fuzzy idea’ in the mind of the inventor and concludes with a successful new product outcome that is delivered on time and within budget. Further research on a wider range of SME organisations undertaking new product development is recommended.
APA, Harvard, Vancouver, ISO, and other styles
29

Higgins, Paul Anthony. "Reducing uncertainty in new product development." Queensland University of Technology, 2008. http://eprints.qut.edu.au/20273/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Bosco, Estevão 1983. "Ulrick Beck = a teoria da sociedade de risco mundial." [s.n.], 2011. http://repositorio.unicamp.br/jspui/handle/REPOSIP/278753.

Full text
Abstract:
Advisor: Leila da Costa Ferreira
Master's dissertation - Universidade Estadual de Campinas, Instituto de Filosofia e Ciências Humanas
Made available in DSpace on 2018-08-18T09:50:37Z (GMT). No. of bitstreams: 1 Bosco_Estevao_M.pdf: 1064536 bytes, checksum: 6932d843d823c854b8aeca8199aa9cd9 (MD5) Previous issue date: 2011
Resumo: The first objective is to understand and interpret the theory of world risk society elaborated by Ulrich Beck, so as to unravel the key aspects that allow its characterisation as a theory. Defining this objective as the research problem is justified by the author's use of the essay as an analytical/discursive strategy. To this end, the theoretical mediation is established immanently, defining the theory's regulating concepts, reflexivity and risk, as the conductors of the analysis. The main theses of the theory are thus delineated, with their specific dilemmas, innovations and practical-theoretical possibilities. From this, immanent critique becomes possible, which, by means of specific propositions, allows new conceptual formulations, here circumscribed to the following questions: procedural aspects of the concept of reflexivity; continuity and discontinuity in the conception of the socio-historical process; and the relation between reflexivity, modernity and uncertainty, from the perspective of the meanings of social becoming. Beyond these specific contributions, the research is justified by addressing a theory on which there is no established criticism, despite its diffusion in the academic circuits of a globalised sociology, its significant contributions to the sociological understanding of contemporary problems, and the controversies it raises regarding the justification of its argument.
Abstract: The first aim is to comprehend and interpret the theory of the world risk society, formulated by Ulrich Beck, in order to unravel the key aspects that allow its characterization as a theory. The definition of this main purpose as a research question is justified by the author's use of the essay as an analytical/discursive strategy. To achieve this goal, the theoretical mediation is established through an immanent perspective, in which the regulatory concepts of the theory, reflexivity and risk, are defined as the analytical conductors. The main theses of the theory are thus delineated according to their specific dilemmas, innovations, and practical and theoretical possibilities. From this, the immanent critique becomes possible, allowing new conceptual formulations by means of specific propositions, which are related to the following issues: procedural aspects of the concept of reflexivity; continuity and discontinuity in the conception of the socio-historical process; and the relation between reflexivity, modernity and uncertainty, from the perspective of the meanings of social becoming (devir social). Beyond these specific contributions, this research is justified by discussing a theory on which there is no established critical review, despite its spread in the academic circuit of a global sociology, its significant contributions to the sociological comprehension of contemporary issues, and the controversies that the theory raises in the realm of the justification of its argument.
Master's
Sociology
Master in Sociology
APA, Harvard, Vancouver, ISO, and other styles
31

Xavier, Alexandre Monticuco. "Analise do valor da informação na avaliação e desenvolvimento de campos de petroleo." [s.n.], 2004. http://repositorio.unicamp.br/jspui/handle/REPOSIP/263720.

Full text
Abstract:
Advisor: Denis Jose Schiozer
Master's dissertation - Universidade Estadual de Campinas, Faculdade de Engenharia Mecânica, Instituto de Geociências
Made available in DSpace on 2018-08-04T14:56:10Z (GMT). No. of bitstreams: 1 Xavier_AlexandreMonticuco_M.pdf: 1314652 bytes, checksum: f1d19635e80ee2c542dd65483505cb25 (MD5) Previous issue date: 2004
Resumo: The ability to deal with uncertainties can be a decisive factor in making appraisal and development projects for petroleum fields viable. One economic criterion used in decision-making processes is the value of information (VOI), which involves the quantification of uncertainties, the economic evaluation of several development scenarios, and the quantification of the benefits that additional data can bring to the process. The calculation of the VOI can be complex and time-consuming, mainly in the appraisal and development phases, in which detailed modeling of the problem may be necessary. In these phases, the quantification of the VOI, as well as of adding flexibility to the process (value of flexibility, VOF), must take into account the benefits that can be extracted from the process through the application of the production strategies best suited to the various possible scenarios. The quantification of the VOI and VOF therefore requires that the production strategy be determined for each possible scenario. As this is generally not viable, due to the great effort that would be required, simplifications are possible, such as the determination of geologic representative models (GRM) that can provide the aggregated uncertainty of the geological attributes. Thus, the objective of this work is to develop and apply a methodology for calculating the VOI during the appraisal and development phases of petroleum fields, with application to simple and complex cases, considering different numbers of analyzed parameters. This stage is carried out by applying the methodology to three examples: two theoretical cases, intended to expose the concepts of VOI and VOF, and one complex real case, aimed at calculating the VOI using the simplified process. The results indicate that the precision of the VOI calculation depends on the number of GRM, and that the best form of evaluation is the application of the best strategies to all scenarios.
A good approximation of the VOI can be obtained by the procedure of gradually including GRM until the results stabilize. Another possible simplification is to use the GRM also to represent the decision tree in the VOI calculation, but with some loss of precision in the results.
Abstract: The capacity to deal with uncertainties is responsible for the economic viability of petroleum fields. The value of information (VOI) is an economic criterion used in the decision-making process. It involves the quantification of uncertainties and the economic evaluation of various development scenarios. The quantification of the value of information (VOI) and of flexibility (VOF) can be highly complex and time-consuming, mainly in the appraisal and development phases, when detailed modeling of the problem may be necessary. This quantification must take into account the benefits that can be extracted from the process. In these phases, these benefits result from a specific production strategy applied to the possible scenarios after the acquisition of the information. Therefore, the quantification of the VOI and VOF demands that the production strategy be determined for each possible scenario. This is not usually viable because a great effort would be needed; to circumvent this problem, there are alternatives, such as the determination of geologic representative models (GRM) that can represent the uncertainty of the geologic attributes. The objective of this work is to develop and apply a methodology to calculate the value of information during the appraisal and development phases of petroleum fields, applicable to simple and complex cases, considering the number of analyzed parameters. This stage is realized through the application of the methodology to three examples: two theoretical models showing the concept of the value of information, and one real and complex case that demands a detailed analysis of the process. The results show that the quality of the results depends on the number of GRM and that the best quantification technique is to apply the best production strategy to all possible scenarios.
It is shown in this work that a good approximation of the VOI can be obtained by a dynamic procedure that includes new GRM until the results stabilize. The GRM can also be used to represent the decision tree, but with some deterioration of the results.
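The value-of-information logic in this abstract follows the standard decision-analysis definition: VOI is the expected value of deciding after receiving the information minus the expected value of the best up-front decision. The sketch below illustrates that definition for the perfect-information case; the scenario probabilities and NPVs are invented for the example, not taken from the thesis's reservoir models.

```python
# Minimal value-of-information sketch: compare deciding up front against
# deciding per scenario after (perfect) information. All numbers are
# illustrative assumptions.

def expected_value(probs, values):
    return sum(p * v for p, v in zip(probs, values))

# Three geological scenarios with prior probabilities.
probs = [0.3, 0.5, 0.2]

# NPV of each scenario under two candidate production strategies.
npv = {
    "strategy_A": [120.0, 60.0, -30.0],
    "strategy_B": [80.0, 70.0, 10.0],
}

# Without information: commit to the single strategy with the best prior EV.
ev_without = max(expected_value(probs, v) for v in npv.values())

# With perfect information: pick the best strategy for each scenario.
ev_with = expected_value(
    probs, [max(npv[s][i] for s in npv) for i in range(len(probs))])

voi = ev_with - ev_without
print(voi)  # -> 12.0
```

The thesis's contribution lies in making the "value with information" term tractable: instead of optimizing a production strategy for every possible scenario, a small set of representative geological models stands in for the full uncertainty.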
Master's
Master in Petroleum Science and Engineering
APA, Harvard, Vancouver, ISO, and other styles
32

Combier, Robert. "Risk-informed scenario-based technology and manufacturing evaluation of aircraft systems." Diss., Georgia Institute of Technology, 2013. http://hdl.handle.net/1853/49046.

Full text
Abstract:
In the last half century, the aerospace industry has seen a dramatic paradigm shift from a focus on performance-at-any-cost to product economics and value. The steady increase in product requirements, complexity and global competition has driven aircraft manufacturers to seek broad portfolios of advanced technologies. The development costs and cycle times of these technologies vary widely, and the resulting design environment is one where decisions must be made under substantial uncertainty. Modeling and simulation have recently become the standard practice for addressing these issues; detailed simulations and explorations of candidate future states of these systems help reduce a complex design problem into a comprehensible, manageable form where decision factors are prioritized. While there are still fundamental criticisms of using modeling and simulation, the emerging challenge becomes: “How do you best configure uncertainty analyses and the information they produce to address real-world problems?” One such analysis approach was developed in this thesis by structuring the input, models, and output to answer questions about the risk and economic impact of technology decisions in future aircraft programs. Unlike other methods, this method placed emphasis on the uncertainty in the cumulative cashflow space as the integrator of economic viability. From this perspective, it then focused on exploration of the design and technology space to tailor the business case and its associated risk in the cashflow dimension. The methodology is called CASSANDRA and is intended to be executed by a program manager of a manufacturer working on the development of future concepts. The program manager has the ability to control design elements as well as the allocation of new technology on that aircraft. She is also responsible for the elicitation of the uncertainty in the dimensions within program control as well as in the external scenarios (that are out of program control).
The methodology was applied to a future single-aisle, 150-passenger aircraft design. The overall methodology is compared to existing approaches and is shown to identify more economically robust design decisions under a set of at-risk program scenarios. Additionally, a set of metrics in the uncertain cumulative cashflow space was developed to assist the methodology user in the identification, evaluation, and selection of design and technology. These metrics are compared to alternate approaches and are shown to better identify risk-efficient design and technology selections. At the modeling level, an approach is given to estimate the production quantity based on an enhanced Overall Evaluation Criterion method that captures the competitive advantage of the aircraft design. This model was needed because the assumed production quantity is highly influential on the business case risk. Finally, the research explored the capacity to generate risk mitigation strategies in two analysis configurations: when available data and simulation capacity are abundant, and when they are sparse or incomplete. The first configuration leverages structured filtration of Monte Carlo simulation results; the allocation of design and technology risk is then identified on the Pareto frontier. The second configuration identifies the direction of robust risk mitigation based on the available data and limited simulation ability; it leverages a linearized approximation of the cashflow metrics and identifies the direction of allocation using the Jacobian matrix and its inversion.
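The emphasis above on uncertainty in the cumulative cashflow space can be illustrated with a toy Monte Carlo calculation: simulate uncertain yearly cashflows for a program and report a risk metric on the cumulative total. The distributions, cost figures and horizon below are assumptions for illustration, not CASSANDRA's models.

```python
# Toy Monte Carlo view of program risk in the cumulative cashflow space:
# estimate the probability that a program never breaks even within the
# horizon. All figures are illustrative assumptions.

import random

random.seed(0)

def simulate_cumulative_cashflow(n_years=10):
    """One draw of a development program: fixed up-front cost followed
    by uncertain yearly net revenues; returns the cumulative series."""
    cashflows = [-100.0]  # development cost in year 0
    cashflows += [random.gauss(18.0, 8.0) for _ in range(n_years)]
    cumulative, total = [], 0.0
    for c in cashflows:
        total += c
        cumulative.append(total)
    return cumulative

def probability_of_loss(n_trials=10_000):
    """Fraction of simulated programs whose cumulative cashflow is
    still negative at the end of the horizon."""
    losses = sum(simulate_cumulative_cashflow()[-1] < 0
                 for _ in range(n_trials))
    return losses / n_trials

print(probability_of_loss())
```

Metrics of this kind (break-even probability, spread of the cumulative curve, worst-case trough) are what let alternative design and technology portfolios be compared for risk rather than for point-estimate value alone.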
APA, Harvard, Vancouver, ISO, and other styles
33

Ozyurt, Gulizar. "Fuzzy Vulnerability Assessment Model Of Coastal Areas To Sea Level Rise." Phd thesis, METU, 2010. http://etd.lib.metu.edu.tr/upload/12612653/index.pdf.

Full text
Abstract:
Climate change and the anticipated impacts of sea level rise, such as increased coastal erosion, inundation, flooding due to storm surges, and salt water intrusion into freshwater resources, will affect all countries, but mostly small island countries and low-lying lands along coastlines. Turkey, having 8333 km of coastline including physically, ecologically and socio-economically important low-lying deltas, should also prepare for the impacts of sea level rise as well as other impacts of climate change while participating in adaptation and mitigation efforts. Thus, a coastal vulnerability assessment of Turkey to sea level rise is needed, both as part of coastal zone management policies for sustainable development and as a guideline for resource allocation when preparing adaptation options for upcoming problems due to sea level rise. In this study, a fuzzy coastal vulnerability assessment model (FCVI) of a region to sea level rise is developed, using physical and human activity indicators of the impacts of sea level rise that rely on commonly available data. The results enable decision makers to compare and rank different regions according to their vulnerability to sea level rise, to prioritize impacts of sea level rise on a region according to the region's vulnerability to each impact, and to determine the most vulnerable parameters for planning adaptation measures. The sensitivity and uncertainty analysis performed on the results of the model (FCVI) is the first application of a fuzzy uncertainty analysis model to coastal vulnerability assessments. These analyses ensure that decision makers can interpret the results of such vulnerability assessments, which are based primarily on expert perceptions, accurately enough. This in turn would increase the confidence levels of adaptation measures and accelerate the adaptation of coastal areas to climate change.
The developed coastal vulnerability assessment model is applied successfully to determine the vulnerability of the Göksu, Göcek and Amasra regions of Turkey, which have different geological, ecological and socio-economical properties. The results of the site studies show that Göksu has high vulnerability, Göcek has moderate vulnerability and Amasra shows low vulnerability to sea level rise. These results are in accordance with the general literature on impacts of sea level rise at different geomorphological coastal areas; thus the applicability of the fuzzy vulnerability assessment model (FCVI) to coastal areas is validated.
APA, Harvard, Vancouver, ISO, and other styles
34

Pekkinen, L. (Leena). "Information processing view on collaborative risk management practices in project networks." Doctoral thesis, Oulun yliopisto, 2015. http://urn.fi/urn:isbn:9789526210162.

Full text
Abstract:
Abstract: Large engineering projects are executed by a network of heterogeneous organisations. In order to be effective, risk management in large engineering projects needs to take the perspective of the entire project network instead of focusing on the risk management practices of single actors. Contextual factors such as the complexity of the project network and a challenging institutional environment pose additional challenges to risk management. The purpose of this study is to increase the understanding of the sources of risk in engineering project networks and the role of risk sources in determining risk management practices. The perspective of information processing theory is used: the role of equivocality and uncertainty as organisations' rationales for processing information is examined to gain new insights into the selection of appropriate risk management practices. The literature introduces relational contracting as a response to the need for collaboration in project networks. In this study, collaborative risk management practices in the workshop-type meeting and in the project alliance were studied. A qualitative research method was employed to study the nature of risk sources, the role of risk sources in determining risk management practices, and collaborative risk management practices. The results of this study enhance the understanding of the nature of risks in engineering project networks. The current project risk management literature proposes that contextual factors related to technology, the organising of projects and the environment increase uncertainty in projects. This study shows that it is relevant to categorise risk sources based on their contingency factors related to uncertainty (lack of information) and to equivocality (the existence of multiple interpretations). It is shown how risk sources impact the selection of project risk management practices, and the collaborative risk management practices of the workshop-type meeting and the project alliance are depicted.
Project-based companies and organisations executing investment projects can benefit from the results of this study. It can guide managers when developing practices to enhance risk management, and it shows how informal risk management practices should be considered in addition to traditional formal risk management practices, particularly when projects confront situations of equivocality.
Tiivistelmä: Large projects are executed by a project network formed of heterogeneous organisations. Effective risk management in a project network requires the perspective of the whole network rather than that of a single organisation. Contextual factors, such as the complexity of the project network and a challenging project environment, pose additional challenges for risk management. The aim of this dissertation is to increase understanding of the sources of project risks and of how risk sources affect the selection of risk management methods. The dissertation uses the information processing view as its theoretical framework. In particular, it examines the role of equivocality and uncertainty as organisations' rationales for processing information. The literature has proposed relational contracting as a response to project networks' need for collaboration. The dissertation studies workshop-type working and the project alliance as collaborative forms of risk management. Using case studies, the research investigated the sources of project risks, the role of risk sources in determining risk management methods, and collaborative risk management practices. The findings increase understanding of the sources of project risks. The current project risk management literature proposes that contextual factors related to technology, the organisation of projects and the environment increase uncertainty. This study shows that it is important to categorise project risks by contextual factor, according to whether the prevailing factor is uncertainty, i.e. a lack of information, or equivocality, i.e. a situation with much mutually conflicting information. The study shows how risk sources affect the selection of risk management methods in a project network, and describes collaborative risk management methods in projects.
Project-based companies and organisations executing investment projects can benefit from the results of this study. The results guide the adaptation of risk management methods to different contextual factors. This study shows how informal risk management methods should be favoured alongside traditional formal methods, particularly in situations where equivocality is the prevailing contextual factor.
APA, Harvard, Vancouver, ISO, and other styles
35

Jonsson, Fredrik. "Physiologically based pharmacokinetic modeling in risk assessment - Development of Bayesian population methods." Doctoral thesis, Solna : National Institute for Working Life (Arbetslivsinstitutet), 2001. http://publications.uu.se/theses/91-7045-599-6/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Toret, Jean-Baptiste. "Traitement ordinal de l'information d'expertise pour le risque en génie civil : apport des sciences de la décision à la gestion des risques." Thesis, Paris 1, 2014. http://www.theses.fr/2014PA010012/document.

Full text
Abstract:
When systems such as dams are subject to a high degree of uncertainty and expert heuristics play a very important role, the usual risk management tools are not always effective at capturing expert judgement. Decision science offers tools to aid the understanding, and even the elicitation, of expert opinion. In the case of dams, the available operational feedback is still poorly formalised and significant events are few. Moreover, the phenomenological mechanisms at work are poorly understood. It is therefore necessary to call on tools outside the usual practice of risk management. This study proposes a method that allows the expert to better elicit his judgement and to reveal the risks to dams through an ordinal treatment of expert information. We also show that this tool is a maximum likelihood estimator, and therefore promises information of prime importance for a decision maker. To reach this result, we use a method built around bases of logical rules, whose construction is enriched by tools drawn from the theories of voting, cooperative games and databases. In this way, we show that it is possible to manage risks without using tools derived from probabilistic approaches, while taking the experts' heuristics into account.
When facing high-uncertainty systems, such as dams, where expert heuristics become especially important, the usual tools do not adequately capture expert opinion for managing the risks associated with the system. Decision science then brings tools to sharpen our understanding, or even help the elicitation, of what the expert most wants to express. Concerning dams, we have only very little feedback and few to no significant events. In addition to the lack of knowledge of the phenomenological mechanisms, these issues lead us to use unusual tools for risk management. This study brings an innovative tool to help with the elicitation of expert opinion, allowing risk management on dams based on an ordinal approach. Furthermore, we will show that this tool is a maximum likelihood estimator, which is invaluable information for any decision maker. We will show that this result is obtainable through a method using rule-based assignments, developing the rules with tools from voting, game and database theories. In doing so, we will show how it is possible to assess risks without using the usual probabilistic tools, while taking experts' heuristics into account.
APA, Harvard, Vancouver, ISO, and other styles
37

Arnold, Patrick. "Probabilistic modelling of unsaturated slope stability accounting for heterogeneity." Thesis, University of Manchester, 2017. https://www.research.manchester.ac.uk/portal/en/theses/probabilistic-modelling-of-unsaturated-slope-stability-accounting-for-heterogeneity(fb3d214c-8a42-4a2c-81c2-bda45e9ae7af).html.

Full text
Abstract:
The performance and safety assessment of geo-structures is strongly affected by uncertainty; that is, both due to a subjective lack of knowledge and due to objectively present and irreducible unknowns. Because of uncertainty in the non-linear variation of the matric-suction-induced effective stress as a function of the transient soil-atmosphere boundary conditions, the unsaturated state of the subsoil is generally not accounted for in a deterministic slope stability assessment. Probability theory, which accounts for uncertainties quantitatively rather than using "cautious estimates" of loads and resistances, may help to partly bridge the gap between unsaturated soil mechanics and engineering practice. This research investigates the effect of uncertainty in soil property values on the stability of unsaturated soil slopes. Two 2D Finite Element (FE) programs have been developed and implemented into a parallelised Reliability-Based Design (RBD) framework, which allows for the assessment of the failure probability, failure consequence and parameter sensitivity, rather than a deterministic factor of safety. Utilising the Random Finite Element Method (RFEM) within a Monte Carlo framework, multivariate cross-correlated random property fields have been mapped onto the FE mesh to assess the effect of isotropic and anisotropic moderate heterogeneity on the transient slope response, and thus performance. The framework has been applied to a generic slope subjected to different rainfall scenarios. The performance was found to be sensitive to the uncertainty in the effective shear strength parameters, as well as in the parameters governing the unsaturated soil behaviour. The failure probability was found to increase most during prolonged rainfall events with a low precipitation rate. Nevertheless, accounting for the unsaturated state resulted in a higher slope reliability than when suction effects were not considered.
In a heterogeneous deposit, failure is attracted to local zones of low shear strength, which, for an unsaturated soil, are a function of both the spatial variability of soil property values and the soil-water dynamics, leading to a significant increase in the failure probability near the end of the main rainfall event.
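The Monte Carlo reliability idea at the core of the thesis can be shown in miniature. The toy below replaces the full random-field finite-element analysis with a single lognormal friction parameter in an infinite-slope factor of safety; the slope angle, statistics and trial count are all assumptions for illustration.

```python
# Toy Monte Carlo slope reliability: estimate P(FS < 1) for a dry,
# cohesionless infinite slope with an uncertain friction coefficient,
# FS = tan(phi) / tan(beta). All numbers are illustrative assumptions;
# the thesis maps full random fields onto an FE mesh instead.

import math
import random

random.seed(42)

SLOPE_ANGLE = math.radians(30)        # slope inclination beta
TAN_PHI_MEAN, TAN_PHI_COV = 0.7, 0.2  # mean and CoV of tan(phi)

# Lognormal parameters matching the given mean and coefficient of variation.
SIGMA_LN = math.sqrt(math.log(1.0 + TAN_PHI_COV ** 2))
MU_LN = math.log(TAN_PHI_MEAN) - 0.5 * SIGMA_LN ** 2

def failure_probability(n_trials=20_000):
    """Fraction of Monte Carlo draws with factor of safety below 1."""
    tan_beta = math.tan(SLOPE_ANGLE)
    failures = sum(
        random.lognormvariate(MU_LN, SIGMA_LN) / tan_beta < 1.0
        for _ in range(n_trials))
    return failures / n_trials

print(failure_probability())
```

A deterministic check with the mean value would report FS = 0.7 / tan(30°) ≈ 1.21 and call the slope safe; the Monte Carlo view instead reports a non-trivial failure probability, which is exactly the shift from a factor of safety to a reliability measure that the RBD framework above makes.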
APA, Harvard, Vancouver, ISO, and other styles
38

Coelho, Alexandre Avelar. "Um indicador do valor da informação sismica em projetos de exploração de petroleo." [s.n.], 2004. http://repositorio.unicamp.br/jspui/handle/REPOSIP/265524.

Full text
Abstract:
Advisor: Saul Barisnik Suslick
Master's dissertation - Universidade Estadual de Campinas, Faculdade de Engenharia Mecânica e Instituto de Geociências
Made available in DSpace on 2018-08-12T15:51:45Z (GMT). No. of bitstreams: 1 Coelho_AlexandreAvelar_M.pdf: 28812155 bytes, checksum: 29b3bc8596d1657465b5fc0dc96c3f9e (MD5) Previous issue date: 2004
Resumo: The prioritisation of exploratory opportunities is of fundamental importance in the petroleum industry, given the large number of projects and companies' limited budgets. The valuation of each project depends on estimates of hydrocarbon volume and occurrence, and the value assigned will be more precise the better the performance of the seismic technology used to obtain the information. Technological advances have made seismic data an increasingly precise source of information for estimates related to such occurrences. It is therefore necessary that the technology used to make the estimates be considered in the valuation and prioritisation of opportunities. The proposed method establishes a seismic information indicator whose value reflects the reliability of the estimates made. In addition, an approach is proposed for estimating the value of imperfect seismic information for future surveys, incorporating the quantity and quality of the data, the geological model involved, the adequacy and performance of the technology used, and the inherent characteristics of the basin that affect the quality of the information. The purpose of the method is to support project prioritisation, providing information for consistent and less subjective decision making. The case study presented shows that using the indicator can change priorities in the choice of opportunities, valuing the most reliable estimates.
Abstract: The assessment of exploratory opportunities is of fundamental importance in the upstream oil industry due to the high number of projects and companies' limited budgets. The valuation of each project depends on the estimation of oil quantities in a given field, whose accuracy changes with the capacity to measure the reservoir size. In the last decades, technological progress has positioned seismic data as a significant source of information for opportunities. Therefore, the technology used to obtain information should be incorporated into the assessment process. This dissertation presents a methodology using an indicator of seismic information whose value gives a degree of confidence in the seismic technology option used. The methodology also develops an option to estimate the value of imperfect seismic information for new surveys through the inclusion of the amount of data, data quality, the embedded geological model, the adequacy and performance of the technology used, and other characteristics inherent to the basin, such as noise and low-velocity zones, that can influence data quality. The main goal of this methodology is to support the assessment and ranking of exploratory opportunities, giving valuable information to the decision process in a consistent and standard form. A case study shows that the indicator performs well in adjusting the ranking of opportunities, favouring the most reliable outcomes and improving the decision-making process.
Master's
Reservoirs and Management
Master in Petroleum Science and Engineering
APA, Harvard, Vancouver, ISO, and other styles
39

Silva, Fernando César Nimer Moreira da. "Venture capital: valor da informação, riscos e instrumentos para sua mitigação." Universidade de São Paulo, 2014. http://www.teses.usp.br/teses/disponiveis/2/2132/tde-20012015-162731/.

Full text
Abstract:
Venture capital is a type of venture that links two economic agents, entrepreneur and investor, aiming at the development of an innovative idea for later commercialisation on the market. The entrepreneur holds knowledge about the idea and the investor has the resources to develop the project. The business is distinguished from others by the high degree of uncertainty and risk of the venture, and it requires the use of contract types suited to limiting them. The project begins with the contracting stage, in which the parties negotiate the division of the risks and returns of the business, followed by the stage of monitoring the development of the activities. At the end comes the divestment, with the exit of the investor and the sale of the business. From the point of view of Economics, we use Game Theory to present the informational problems, risks and uncertainties of the business, and the incentives for organising cooperation between the parties. From the point of view of Finance, we discuss the decision to finance the business and the alternatives for diversifying the investment risks, that is, the possibility of limiting the risks by adopting containment strategies that increase the interest in contracting the business. From the point of view of Law, we evaluate the ideal contractual structure for organising this type of venture. We analyse the main forms used to organise the business, in particular limited liability companies and closed corporations. We evaluate the applicable normative support, with emphasis on the possibility of limiting the project risks by applying the rules of Corporate Law to these ventures. The main applicable risks are the risks of contracting the business, the risks of allocating decision-making power among the partners, and the risks of premature interruption of the project.
Owing to the nature and characteristics of the venture capital business, we conclude that this type of project is best organised as a plurilateral contract and that there is no ideal contract type to align the interests. Of the existing types, the closed corporation is the most adequate, but it is unable to limit all the risks of the business. The conclusion is partially confirmed by the empirical evidence presented.
Venture capital is a business that links two economic agents, the entrepreneur and the investor, with the aim of developing an innovative idea for later sale on the market. The entrepreneur holds the knowledge about the idea and the investor has the resources to develop the project. The business is distinguished from others by the project's high degree of uncertainty and risk, which requires appropriate contract types to limit them. The project begins with the contracting stage, in which the parties negotiate the division of the risks and returns of the business, followed by the monitoring of the development of the business activities. At the end comes the divestment, in which the investor exits and the business is sold. From the point of view of Economics, we use Game Theory to present the informational problems, business risks and uncertainties, and the incentives for organizing cooperation between the parties. From the standpoint of Finance, we discuss the decision to finance the business and the alternatives for diversifying investment risks, that is, the possibility of limiting risks by adopting containment strategies that increase the interest in contracting. From the point of view of Law, we evaluate the ideal contractual structure for organizing this kind of venture. We analyze the main forms used to organize the business, in particular limited liability companies and closed corporations. We evaluate the applicable normative support, emphasizing the possibility of limiting project risks by applying Corporate Law rules to such ventures. The main risks are those arising in the contracting phase, the risk of misallocating decision-making power among the partners, and the risk of premature termination of the project.
Due to the nature and characteristics of the venture capital business, we conclude that this type of project is best organized as a plurilateral contract and that no contract type is ideal for aligning the interests. Of the existing types, the closed corporation is the most adequate, but it cannot limit all the business risks. This conclusion is partially confirmed by the empirical evidence presented.
APA, Harvard, Vancouver, ISO, and other styles
40

Fayard, Nicolas. "Capability approach inspired tools for aiding policy design." Electronic Thesis or Diss., Université Paris sciences et lettres, 2024. http://www.theses.fr/2024UPSLD043.

Full text
Abstract:
This research explores the application of the Capability Approach (CA) within decision-aiding frameworks, focusing on public policy design. The CA is presented as an alternative to traditional welfare measures, offering a multidimensional framework that accounts for diversity and subjectivity. We propose an improved approach by incorporating systemic factors through mixed-integer linear programming, obtaining capability sets as Pareto frontiers. A proof of concept was developed by applying the CA to assess the mental health capabilities of older adults in the urban context of Paris. Rather than comparing and aggregating individual, possibly multidimensional, solutions, new methodologies are required to compare and aggregate sets of Pareto-efficient solutions. The second part of this manuscript is therefore dedicated to adapting classical aggregation methods to the CA, focusing specifically on aggregating capability sets in the presence of a utility function and on dealing with uncertainty, both with and without subjective probabilities.
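The idea of a capability set as a Pareto frontier can be illustrated with a small sketch. This is a hypothetical toy example, not the thesis's actual model: given candidate option vectors scored on several dimensions (to be maximized), keep only the non-dominated ones.

```python
def pareto_frontier(points):
    """Return the non-dominated points (maximization on every dimension).

    A point p is dominated if some other point q is >= p on every
    dimension and differs from p on at least one.
    """
    frontier = []
    for p in points:
        dominated = any(
            all(qi >= pi for qi, pi in zip(q, p)) and q != p
            for q in points
        )
        if not dominated:
            frontier.append(p)
    return frontier

# Hypothetical capability vectors: (mobility score, social participation score)
options = [(1, 5), (3, 3), (2, 2), (5, 1), (4, 3)]
print(pareto_frontier(options))  # -> [(1, 5), (5, 1), (4, 3)]
```

Here (3, 3) and (2, 2) drop out because (4, 3) is at least as good on both dimensions; the surviving set is the "capability set" in frontier form.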
APA, Harvard, Vancouver, ISO, and other styles
41

Loukoianova, Elena. "Risk, uncertainty, and fiscal institutions." Thesis, University of Cambridge, 2004. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.616105.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Zheng, Esther Zhi Hong. "Gestão de incertezas em projetos complexos: quadro conceitual e estudos de caso." Universidade de São Paulo, 2016. http://www.teses.usp.br/teses/disponiveis/3/3136/tde-07122016-084613/.

Full text
Abstract:
Traditional project management methodologies are considered rigid and suitable only for environments with little uncertainty. Nowadays, however, a growing number of projects are developed in environments of high complexity and uncertainty, requiring different approaches to project management: less rigid and more flexible. Thus, the purpose of this work is to propose a framework for managing uncertainties in complex projects. The methodological approach combines a systematic literature review with case studies. The framework, based on contingency theory, suggests that approaches to project management under uncertainty are, in part, determined by the characteristics of the existing uncertainties. Responses to uncertainty can be oriented toward the cause or the effect of the uncertainty, and they are chosen according to the ability to influence the cause, which is higher for internal uncertainties and lower for external ones. The flexibility of the project management approach, in its turn, is affected by the degree of uncertainty. Six case studies and a pilot case were conducted in complex projects. The cases showed that there is a relation between the source of the uncertainty and the ability to influence it, and between the ability to influence and the orientation of the responses. They also pointed out the importance of flexibility for managing uncertainties, combining anticipation methods, such as instructionism and risk management, with resilience, especially the ability of top management to acknowledge the existence of uncertainties. The cases also indicated that the higher the project uncertainty, the greater the need for flexibility. The dissertation also presents the limitations of the research and suggestions for future work.
APA, Harvard, Vancouver, ISO, and other styles
43

Mahamud, Abdirahman, Abdimajid Khayre, and Paula Bergholm. "Management of project failures in the gaming industry : The normalization approach." Thesis, Internationella Handelshögskolan, Högskolan i Jönköping, IHH, Företagsekonomi, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-44190.

Full text
Abstract:
In creative industries such as the gaming industry, the failure rate is typically higher than in many other industries. This is usually due to the constant need for innovation and the extreme competition in the gaming industry. Firms in this industry take on multiple innovation projects, which inherently have a high rate of failure. Literature has previously stressed the importance of failure and how it can enhance learning that can be a crucial asset for any organization. However, failure brings along negative emotions that can slow down or block the learning process of an individual or an organization at large. In an industry where failure is common, it is important for management to tackle this issue. Therefore, the purpose of this thesis is to explore the approach the management of small gaming firms takes in order to normalize failure. In this study, the data was collected qualitatively, using a thematic analysis to recognize consistent themes and patterns arising from the primary data. By conducting four semi-structured interviews with two different companies (two interviews each), we found that both companies have a similar attitude regarding project failure. Both companies either expect failure to happen or even encourage it. One of our key findings was that both companies emphasize failing fast, which allows them to save time, money and resources, and helps some members of the organization react less emotionally to the termination of a project. The empirical results were then discussed and analyzed by judging whether the actions these companies took can be classified as ways of normalizing failure. We concluded that management employed various methods of action that would eventually lead to the normalization of failure, including the fail-fast attitude, failure-supportive slogans and planning for failure beforehand.
APA, Harvard, Vancouver, ISO, and other styles
44

Clausen, Mork Jonas. "Dealing with uncertainty." Doctoral thesis, KTH, Filosofi, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-72680.

Full text
Abstract:
Uncertainty is, it seems, more or less constantly present in our lives. Even so, grasping the concept philosophically is far from trivial. In this doctoral thesis, uncertainty and its conceptual companion information are studied. Axiomatic analyses are provided and numerical measures suggested. In addition to these basic conceptual analyses, the widespread practice of so-called safety factor use in societal regulation is analyzed along with the interplay between science and policy in European regulation of chemicals and construction.
APA, Harvard, Vancouver, ISO, and other styles
45

Filipsson, Monika. "Uncertainty, variability and environmental risk analysis." Doctoral thesis, Linnéuniversitetet, Institutionen för naturvetenskap, NV, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-11193.

Full text
Abstract:
The negative effects of hazardous substances and possible measures that can be taken are evaluated in the environmental risk analysis process, consisting of risk assessment, risk communication and risk management. Uncertainty due to lack of knowledge and natural variability are always present in this process. The aim of this thesis is to evaluate some tools as well as discuss the management of uncertainty and variability, as it is necessary to treat them both in a reliable and transparent way to gain regulatory acceptance in decision making. The catalytic effects of various metals on the formation of chlorinated aromatic compounds during the heating of fly ash were investigated (paper I). Copper showed a positive catalytic effect, while cobalt, chromium and vanadium showed a catalytic effect for degradation. Knowledge of the catalytic effects may facilitate the choice and design of combustion processes to decrease emissions, but it also provides valuable information to identify and characterize the hazard. Exposure factors of importance in risk assessment (physiological parameters, time use factors and food consumption) were collected and evaluated (paper II). Interindividual variability was characterized by mean, standard deviation, skewness, kurtosis and multiple percentiles, while uncertainty in these parameters was estimated with confidence intervals. How these statistical parameters can be applied was shown in two exposure assessments (papers III and IV). Probability bounds analysis was used as a probabilistic approach, which enables separate propagation of uncertainty and variability even in cases where the availability of data is limited. In paper III it was determined that the exposure cannot be expected to cause any negative health effects for recreational users of a public bathing place. Paper IV concluded that the uncertainty interval in the estimated exposure increased when accounting for possible changes in climate-sensitive model variables. 
Risk managers often need to rely on precaution and an increased uncertainty may therefore have implications for risk management decisions. Paper V focuses on risk management and a questionnaire was sent to employees at all Swedish County Administrative Boards working with contaminated land. It was concluded that the gender, age and work experience of the employees, as well as the funding source of the risk assessment, all have an impact on the reviewing of risk assessments. Gender was the most significant factor, and it also affected the perception of knowledge.
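The probability bounds analysis mentioned in the abstract propagates uncertainty and variability separately. A minimal sketch of the idea, with hypothetical numbers rather than the thesis's data: represent each percentile of a variability distribution (across individuals) with an uncertainty interval, and push the intervals through the exposure model with interval arithmetic.

```python
def interval_mul(a, b):
    """Multiply two intervals (lo, hi); all values assumed non-negative."""
    return (a[0] * b[0], a[1] * b[1])

def interval_div(a, b):
    """Divide interval a by interval b; b assumed strictly positive."""
    return (a[0] / b[1], a[1] / b[0])

# Hypothetical exposure model: dose = concentration * intake / body weight.
# Variability across individuals is captured by percentiles of intake;
# uncertainty about each quantity is captured by a (lo, hi) interval.
concentration = (0.8, 1.2)           # mg per g of soil, uncertain measurement
intake_percentiles = {               # g of soil per day, varies across people
    "p50": (0.05, 0.10),
    "p95": (0.15, 0.30),
}
body_weight = (60.0, 80.0)           # kg, uncertain population parameter

for pct, intake in intake_percentiles.items():
    dose = interval_div(interval_mul(concentration, intake), body_weight)
    print(pct, dose)  # an uncertainty interval per variability percentile
```

The output is not a single distribution but a band of distributions: each variability percentile carries its own uncertainty interval, which is what lets a risk manager see how much of the spread is lack of knowledge (reducible) versus natural variability (not).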
APA, Harvard, Vancouver, ISO, and other styles
46

Johnson, David G. "Representations of uncertainty in risk analysis." Thesis, Loughborough University, 1998. https://dspace.lboro.ac.uk/2134/31941.

Full text
Abstract:
Uncertainty in situations involving risk is frequently modelled by assuming a plausible form of probability distribution for the uncertain quantities involved, and estimating the relevant parameters of that distribution based on the knowledge and judgement of informed experts or decision makers. The distributions assumed are usually uni-modal (and often bell-shaped) around some most likely value, with the Normal, Beta, Gamma and Triangular distributions being popular choices.
APA, Harvard, Vancouver, ISO, and other styles
47

Gallagher, Raymond. "Uncertainty modelling in quantitative risk analysis." Thesis, University of Liverpool, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.367676.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Werner, Jana. "Risk and uncertainty in project management." Thesis, Heriot-Watt University, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.525618.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Hantzsche, Arno. "Fiscal uncertainty and sovereign credit risk." Thesis, University of Nottingham, 2018. http://eprints.nottingham.ac.uk/49976/.

Full text
Abstract:
This doctoral thesis studies sovereign credit risk during periods of uncertainty about the state of a government's fiscal position. A new measure of fiscal uncertainty is introduced, based on the disagreement in official forecasts of the public budget deficit, with forecast revisions used to approximate common uncertainty shocks. It is shown that in the aftermath of the global financial crisis, fiscal uncertainty increased substantially in advanced economies. The effects of fiscal uncertainty are largely unknown, in particular in the context of sovereign credit risk. To estimate the response of sovereign credit ratings to fiscal uncertainty, a new empirical framework is developed for the analysis of rating determinants. Rating transition is modelled as the joint outcome of two processes, which determine the frequency of rating changes and their direction. This thesis finds that rating agencies perceive fiscal uncertainty as a credit risk, so that it increases the probability of a rating downgrade. Fiscal uncertainty also affects the attention paid to sovereign ratings: an event study analysis shows that attention to rating announcements increases as publicly available information about fiscal outcomes becomes noisier.
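The two-process structure described in the abstract — one process governing whether a rating changes at all, another governing the direction conditional on a change — can be sketched numerically. Everything below is an illustrative assumption (logistic link, coefficient values), not the thesis's estimated model; it only shows how higher fiscal uncertainty can raise the downgrade probability through both channels.

```python
import math

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

def transition_probs(fiscal_uncertainty, a=-2.0, b=1.5, c=0.0, d=1.0):
    """Joint two-process model of rating transitions (hypothetical coefficients).

    Process 1: probability that the rating changes at all,
               p_change = logistic(a + b * fiscal_uncertainty).
    Process 2: direction conditional on a change,
               p_down_given_change = logistic(c + d * fiscal_uncertainty).
    """
    p_change = logistic(a + b * fiscal_uncertainty)
    p_down = logistic(c + d * fiscal_uncertainty)
    return {
        "no_change": 1.0 - p_change,
        "downgrade": p_change * p_down,
        "upgrade": p_change * (1.0 - p_down),
    }

calm = transition_probs(fiscal_uncertainty=0.0)
stressed = transition_probs(fiscal_uncertainty=2.0)
print(calm)
print(stressed)
```

With these toy coefficients, higher fiscal uncertainty makes a rating change more frequent and tilts the conditional direction toward downgrades, so the unconditional downgrade probability rises on both margins.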
APA, Harvard, Vancouver, ISO, and other styles
50

Krüger, Niclas. "Infrastructure investment planning under uncertainty /." Örebro : Örebro University, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:oru:diva-6618.

Full text
APA, Harvard, Vancouver, ISO, and other styles