Dissertations / Theses on the topic 'Economics – Simulation methods'

Consult the top 37 dissertations / theses for your research on the topic 'Economics – Simulation methods.'


1

Smith, Gregory Steven. "Applications of quantitative methods in environmental economics : econometrics, simulation modelling and experiments." Thesis, University of East Anglia, 2016. https://ueaeprints.uea.ac.uk/57421/.

Abstract:
In Part I of this thesis we employ novel econometric techniques to explore elicitation anomalies in contingent valuation (CV). According to standard assumptions regarding preferences, changes in the way values are elicited in CV questions should be decision irrelevant. That responses are observed to systematically differ according to elicitation format has, therefore, called the CV method into question. One possible explanation lies in the proposition that respondents are uncertain about their preferences and that their uncertainty precipitates systematically different responses to different question formats. We test this hypothesis using data from a split-sample CV survey. We analyse our data using an innovative application of a semi-parametric estimator more commonly used for duration modelling in the medical sciences but find that uncertainty alone cannot explain away common elicitation anomalies. In Part II we employ simulation modelling and experimental techniques to investigate payment for ecosystem services (PES) schemes that involve multiple buyers. In Chapter 2, we explore opportunities for buyers in a PES scheme to realise Pareto-improving outcomes through spatial coordination in their independent purchases of changes to land-management practices. We develop a simulation environment imitating a heterogeneous agricultural landscape and, using techniques of integer linear programming, solve for outcomes under different institutional arrangements. Our simulations allow us to explore how gains from negotiated or fully-cooperative purchasing differ across different configurations of landscape and buyer objectives. In Chapter 3, we investigate negotiation as a multiple-purchaser ecosystem service procurement mechanism. We design and conduct novel three-person bargaining experiments in which two potential buyers can negotiate not only between each other but also with a seller of ecosystem services.
We find that negotiated deals can be reached that are mutually advantageous to all parties. In all treatment scenarios presented, the vast majority of groups are able to reach agreements; in addition, these agreements are reached relatively quickly.
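The coordination gains explored in Chapter 2 can be illustrated in miniature. The thesis uses integer linear programming over a simulated landscape; the sketch below substitutes brute-force enumeration over a hypothetical three-parcel landscape with invented costs and benefits, so every name and number here is illustrative, not taken from the thesis.

```python
from itertools import combinations

# Hypothetical landscape: each parcel has a cost of changing its management
# and delivers a benefit to each of two buyers (cost, benefit_1, benefit_2).
parcels = {"A": (4, 6, 1), "B": (4, 1, 6), "C": (5, 4, 4)}

def optimise(value):
    """Exhaustively pick the parcel set maximising value(set) - cost(set)."""
    best, best_net = (), 0
    for r in range(1, len(parcels) + 1):
        for combo in combinations(parcels, r):
            net = value(combo) - sum(parcels[p][0] for p in combo)
            if net > best_net:
                best, best_net = combo, net
    return best, best_net

# Independent purchasing: each buyer maximises only its own net benefit.
_, net1 = optimise(lambda c: sum(parcels[p][1] for p in c))
_, net2 = optimise(lambda c: sum(parcels[p][2] for p in c))

# Fully cooperative purchasing: benefits to both buyers are pooled.
joint, net_joint = optimise(lambda c: sum(parcels[p][1] + parcels[p][2] for p in c))

print("independent net benefits:", net1 + net2)
print("cooperative net benefit: ", net_joint, "from parcels", joint)
```

Because pooled purchasing internalises the benefit each parcel delivers to the other buyer, the cooperative optimum is always at least as good as the sum of independently achieved outcomes, which is the Pareto-improvement opportunity the chapter examines.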
2

Frohwerk, Sascha. "Asymmetrien in der Neuen Ökonomischen Geographie : Modelle, Simulationsmethoden und wirtschaftspolitische Diskussion." Phd thesis, Universität Potsdam, 2010. http://opus.kobv.de/ubp/volltexte/2011/4915/.

Abstract:
Die Neue Ökonomische Geographie (NEG) erklärt Agglomerationen aus einem mikroökonomischen Totalmodell heraus. Zur Vereinfachung werden verschiedene Symmetrieannahmen getätigt. So wird davon ausgegangen, dass die betrachteten Regionen die gleiche Größe haben, die Ausgabenanteile für verschiedene Gütergruppen identisch sind und die Transportkosten für alle Industrieprodukte die selben sind. Eine Folge dieser Annahmen ist es, dass zwar erklärt werden kann, unter welchen Bedingungen es zur Agglomerationsbildung kommt, nicht aber wo dies geschieht. In dieser Arbeit werden drei Standardmodelle der NEG um verschiedene Asymmetrien erweitert und die Veränderung der Ergebnisse im Vergleich zum jeweiligen Basismodell dargestellt. Dabei wird neben der Theorie auf die Methoden der Simulation eingegangen, die sich grundsätzlich auf andere Modelle übertragen lassen. Darauf aufbauend wird eine asymmetrische Modellvariante auf die wirtschaftliche Entwicklung Deutschlands angewandt. So lässt sich das Ausbleiben eines flächendeckenden Aufschwungs in den neuen Ländern, die starken Wanderungsbewegungen in die alten Länder und das dauerhafte Lohnsatzgefälle in einem Totalmodell erklären.
The new economic geography explains agglomerations based on a microeconomic general equilibrium model, which is usually assumed to be symmetric in the sense that regions are of the same size and transport costs and expenditure shares are the same. As a result, the models can explain why an agglomeration occurs, but not in which region. This book modifies three of the most influential models of the new economic geography and assumes various asymmetries. It compares the results to the symmetric cases. Not only theoretical aspects but also methods of simulation are discussed in detail. These methods can be applied to a wide variety of models. To show the political implications of the theoretical results, one of the asymmetric models is applied to the economic development of Germany after reunification. The model is able to explain the persistent difference in wages between east and west and the simultaneous incomplete agglomeration in the west.
3

Low, Hamish Wallace. "Simulation methods and economic analysis." Thesis, University College London (University of London), 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.392495.

4

Tiwisina, Johannes [Verfasser]. "Essays on simulation methods in economic dynamics / Johannes Tiwisina." Bielefeld : Universitätsbibliothek Bielefeld, 2016. http://d-nb.info/108148781X/34.

5

Rozario, Jewel Augustine, and Osman Abdelkader Hamid. "A systematic approach to assess the relocation of the business centres to a logistics platform: A case study on DHL Freight AB (Sweden)." Thesis, Högskolan i Gävle, Industriell ekonomi, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:hig:diva-27244.

Abstract:
The relocation of logistics companies from inner city centres to logistics platforms significantly affects both supply chain management and urban sustainable development. Recently the concepts of city logistics and intermodality have received significant attention from both academics and decision makers. City logistics plays a pivotal role in ensuring the liveability of urban areas but, in parallel, urban freight transport also has a significant effect on the quality of life in urban settings. Optimisation of urban freight transportation is therefore an important input to the sustainability and liveability of cities and urban areas, reducing traffic congestion, road accidents, CO2 emissions and noise impacts. The purpose of this thesis is to evaluate the relocation of a case logistics company from the city centre to a suburban area. To do this, a wide range of literature pertaining to the influence of peripheral logistics platforms on city sustainability was reviewed; it appears that there are no well-defined models for a comprehensive, quantitative sustainability assessment of the relocation of business premises. Further investigation was carried out through a case study on DHL and field observation of traffic flow. Based on the information collected from the relevant sources, a mixed-methods research design combining qualitative and quantitative approaches was applied. A systematic approach was then developed, in the context of sustainable development, which can be used as an assessment tool for the key factors that inform decision makers considering the relocation of logistics companies. The approach facilitates the assessment of the factors that impact the relocation decision across all three aspects of sustainability: economic, social and environmental development. These factors include traffic congestion, time and distance of transportation, emissions, cost optimisation and transport mobility.
6

Dillon, Krystal Renee. "A simulation-optimization method for economic efficient design of net zero energy buildings." Thesis, Georgia Institute of Technology, 2014. http://hdl.handle.net/1853/51909.

Abstract:
Buildings have a significant impact on energy usage and the environment. Much of the research in architectural sustainability has centered on economically advanced countries because they consume the most energy and have the most resources. However, sustainable architecture is important in developing countries, where the energy consumption of the building sector is increasing significantly. Currently, developing countries struggle with vaccine storage because vaccines are typically warehoused in old buildings that are poorly designed and wasteful of energy. This thesis created and studied a decision support tool that can be used to aid in the design of economically feasible Net Zero Energy vaccine warehouses for the developing world. The decision support tool used a simulation-optimization approach to combine an optimization technique with two simulation software packages in order to determine the cost-optimal design solution. To test its effectiveness, a new national vaccine storage facility located in Tunis, Tunisia, was used. Nine building parameters were investigated to see which have the most significant effect on the annual energy usage and initial construction cost of the building. First, tests were conducted for two construction techniques, five different climates in the developing world, and three photovoltaic system prices to gain insight on the design space of the optimal solution. The results showed the difference between an economically efficient and economically inefficient Net Zero Energy building, and were used to provide generalized climatic recommendations for all the building parameters studied. The final test showed the benefits of combining two optimization techniques, a design of experiments and a genetic algorithm, to form a two-step process to aid in the building design in the early and final stages of the design process.
The proposed decision support tool can efficiently and effectively aid in the design of an economically feasible Net Zero Energy vaccine warehouse for the developing world.
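The simulation-optimization pairing the abstract describes can be sketched in a loudly simplified form: the `lifecycle_cost` function below stands in for the building energy/cost simulation (its coefficients are invented, not from the thesis), and a small elitist genetic algorithm searches two discrete design parameters for the cost-optimal combination.

```python
import random

random.seed(1)

def lifecycle_cost(insul, pv):
    """Toy stand-in for the building simulation: annual energy falls with
    insulation level and PV size; construction cost rises with both.
    Coefficients are invented; a real study would call an energy engine."""
    energy = max(0.0, 100 - 8 * insul - 6 * pv)   # residual annual energy
    return 2.0 * insul + 3.0 * pv + 1.5 * energy  # construction + energy cost

def genetic_search(pop_size=20, gens=40):
    # Each individual is a (insulation, pv) design with levels 0..9.
    pop = [(random.randrange(10), random.randrange(10)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda d: lifecycle_cost(*d))
        parents = pop[: pop_size // 2]            # elitist truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = [a[0], b[1]]                  # one-point crossover
            for g in (0, 1):                      # per-gene mutation
                if random.random() < 0.3:
                    child[g] = random.randrange(10)
            children.append(tuple(child))
        pop = parents + children
    return min(pop, key=lambda d: lifecycle_cost(*d))

best = genetic_search()
print("best design (insulation, PV):", best, "cost:", lifecycle_cost(*best))
```

In the thesis the design of experiments narrows the search space before the genetic algorithm refines it; here the GA alone suffices because the toy design space has only 100 points.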
7

McFerran, Ethna. "Health economic evaluation of alternatives to current surveillance in colorectal adenoma at risk of colorectal cancer." Thesis, Queen's University Belfast, 2018. https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.766286.

Abstract:
The thesis provides a comprehensive overview of key issues affecting practice, policy and patients in current efforts for colorectal cancer (CRC) disease control. The global burden of CRC is expected to increase by 60% to more than 2.2 million new cases and 1.1 million deaths by 2030. CRC incidence and mortality rates vary up to 10-fold worldwide, which is thought to reflect variation in lifestyles, especially diet. Better primary prevention and more effective early detection, in screening and surveillance, are needed to reduce the number of patients with CRC in the future. The risk factors for CRC development include genetic, behavioural, environmental and socio-economic factors. Changes to surveillance which offer non-invasive testing and provide primary prevention interventions represent promising opportunities to improve outcomes and personalise care in those at risk of CRC. By systematic review of the literature, I highlight the gaps in comparative effectiveness analyses of post-polypectomy surveillance. Using micro-simulation methods, I assess the role of non-invasive faecal immunochemical testing in optimising post-polypectomy surveillance programmes, and in an accompanying sub-study I explore the value of adding an adjunct diet and lifestyle intervention. The acceptability of such revisions is exposed to patient preference evaluation by discrete choice experiment methods. These preferences are accompanied by evidence generated from the prospective evaluation of health literacy, numeracy, sedentary behaviour levels, body mass index (BMI) and information provision about cancer risk factors, to highlight the potential opportunities for personalisation and optimisation of surveillance. Additional analysis examines the optimisation of a screening programme facing colonoscopy constraints, highlighting the attendant potential to reduce costs and save lives within current capacity.
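The micro-simulation approach mentioned above can be illustrated with a deliberately toy individual-level model comparing colonoscopy-for-all surveillance with FIT-based triage (colonoscopy only after a positive faecal immunochemical test). Every probability below is invented for illustration; a real model would be calibrated to clinical data.

```python
import random

random.seed(11)

N = 100_000              # simulated individuals under surveillance
P_ADVANCED = 0.05        # assumed prevalence of advanced neoplasia
FIT_SENS, FIT_SPEC = 0.7, 0.9   # assumed FIT sensitivity / specificity

def strategy(use_fit):
    """Count colonoscopies used and cases detected under one policy."""
    colonoscopies = detected = 0
    for _ in range(N):
        diseased = random.random() < P_ADVANCED
        if use_fit:
            positive = random.random() < (FIT_SENS if diseased else 1 - FIT_SPEC)
            if not positive:
                continue            # negative FIT: no colonoscopy this round
        colonoscopies += 1
        detected += diseased        # colonoscopy assumed fully sensitive
    return colonoscopies, detected

for name, use_fit in [("colonoscopy for all", False), ("FIT triage", True)]:
    scopes, found = strategy(use_fit)
    print(f"{name:20s} colonoscopies={scopes:6d} cases detected={found}")
```

Even this crude version exhibits the trade-off the thesis quantifies properly: triage spends far fewer colonoscopies but misses the cases the FIT fails to flag.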
8

Duder, Sydney. "Cards, dice and lifestyles : gaming a guaranteed annual income." Thesis, McGill University, 1987. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=72093.

Abstract:
A simulation game was designed to examine the impact of a guaranteed annual income (GAI). The sample of 158 player-objects included factory and clerical workers, high school dropouts, single mothers, and CEGEP and university social work students. To establish the validity of the game, the working and spending behaviour of players was compared with results reported for the New Jersey negative income tax experiment, and found to be similar in a number of respects. The game also simulated two features not present in the New Jersey experiment: (a) variable labour-market conditions, and (b) comparison of a partial, time-limited GAI with a permanent, universal plan. For players on a GAI, working hours were significantly lower when fellow players were not on a GAI than when they were. Results suggest that work effort may be related to comparisons with a reference group on visible consumer goods.
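A GAI of the kind gamed here is usually structured as a negative income tax: a guarantee that phases out as earnings rise. A minimal sketch of that standard formula; the guarantee level and claw-back rate below are illustrative, not the game's actual parameters.

```python
# Negative-income-tax form of a guaranteed annual income:
# benefit = max(0, G - t * earnings), so support phases out at rate t
# and disappears entirely at the break-even income G / t.
G = 12_000   # income guarantee for a household with no earnings (illustrative)
t = 0.5      # benefit-reduction ("claw-back") rate (illustrative)

def gai_income(earnings):
    benefit = max(0.0, G - t * earnings)
    return earnings + benefit

print(gai_income(0))        # 12000.0 -> full guarantee
print(gai_income(10_000))   # 17000.0 -> benefit partly clawed back
print(gai_income(24_000))   # 24000.0 -> at the break-even point G / t
```

The claw-back rate t is what generates the work-incentive question the game studies: every extra dollar earned below break-even raises net income by only (1 - t) dollars.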
9

Mrdalo, Zvonimir. "A comparison of the economic efficiency of the petroleum fiscal systems under uncertainty : a Monte Carlo simulation approach." Thesis, University of Aberdeen, 2011. http://digitool.abdn.ac.uk:80/webclient/DeliveryManager?pid=189500.

10

Curtis, J. Jeffrey. "Mixed structural models for decision making under uncertainty using stochastic system simulation and experimental economic methods : application to information security control choice." Thesis, University of Reading, 2016. http://centaur.reading.ac.uk/80444/.

Abstract:
This research is concerned with whether and to what extent information security managers may be biased in their evaluation of and decision making over the quantifiable risks posed by information management systems where the circumstances may be characterized by uncertainty in both the risk inputs (e.g. system threat and vulnerability factors) and outcomes (actual efficacy of the selected security controls and the resulting system performance and associated business impacts). Although ‘quantified security’ and any associated risk management remains problematic from both a theoretical and empirical perspective (Anderson 2001; Verendel 2009; Appari 2010), professional practitioners in the field of information security continue to advocate the consideration of quantitative models for risk analysis and management wherever possible because those models permit a reliable economic determination of optimal operational control decisions (Littlewood, Brocklehurst et al. 1993; Nicol, Sanders et al. 2004; Anderson and Moore 2006; Beautement, Coles et al. 2009; Anderson 2010; Beresnevichiene, Pym et al. 2010; Wolter and Reinecke 2010; Li, Parker et al. 2011). The main contribution of this thesis is to bring current quantitative economic methods and experimental choice models to the field of information security risk management to examine the potential for biased decision making by security practitioners, under conditions where information may be relatively objective or subjective, and to demonstrate the potential for informing decision makers about these biases when making control decisions in a security context. No single quantitative security approach appears to have formally incorporated three key features of the security risk management problem addressed in this research: 1) the inherently stochastic nature of the information system inputs and outputs which contribute directly to decisional uncertainty (Conrad 2005; Wang, Chaudhury et al. 2008; Winkelvos, Rudolph et al.
2011); 2) the endogenous estimation of a decision maker’s risk attitude using models which otherwise typically assume risk neutrality or an inherent degree of risk aversion (Danielsson 2002; Harrison, Johnson et al. 2003); and 3) the application of structural modelling which allows for the possible combination and weighting between multiple latent models of choice (Harrison and Rutström 2009). The identification, decomposition and tractability of these decisional factors is of crucial importance to understanding the economic trade-offs inherent in security control choice under conditions of both risk and uncertainty, particularly where established psychological decisional biases such as ambiguity aversion (Ellsberg 1961) or loss aversion (Kahneman and Tversky 1984) may be assumed to be endemic to, if not magnified by, the institutional setting in which these decisions take place. Minimally, risk averse managers may simply be overspending on controls, overcompensating for anticipated losses that do not actually occur with the frequency or impact they imagine. On the other hand, risk-seeking managers, where they may exist (practitioners call them ‘cowboys’ – they are a familiar player in equally risky financial markets) may be simply gambling against ultimately losing odds, putting the entire firm at risk of potentially catastrophic security losses. Identifying and correcting for these scenarios would seem to be increasingly important for now universally networked business computing infrastructures. From a research design perspective, the field of behavioural economics has made significant and recent contributions to the empirical evaluation of psychological theories of decision making under uncertainty (Andersen, Harrison et al. 
2007) and provides salient examples of lab experiments which can be used to elicit and isolate a range of latent decision-making behaviours for choice under risk and uncertainty within relatively controlled conditions versus those which might be obtainable in the field (Harrison and Rutström 2008). My research builds on recent work in the domain of information security control choice by 1) undertaking a series of lab experiments incorporating a stochastic model of a simulated information management system at risk which supports the generation of observational data derived from a range of security control choice decisions under both risk and uncertainty (Baldwin, Beres et al. 2011); and 2) modeling the resulting decisional biases using structural models of choice under risk and uncertainty (ElGamal and Grether 1995; Harrison and Rutström 2009; Keane 2010). The research contribution consists of the novel integration of a model of stochastic system risk and domain relevant structural utility modeling using a mixed model specification for estimation of the latent decision making behaviour. It is anticipated that the research results can be applied to the real world problem of ‘tuning’ quantitative information security risk management models to the decisional biases and characteristics of the decision maker (Abdellaoui and Munier 1998).
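The structural models cited above typically estimate a coefficient of relative risk aversion and ask how it shifts control choice. A minimal sketch in that spirit, with invented breach probabilities, loss distributions and control costs (not the thesis's estimated model): a risk-neutral and a strongly risk-averse manager can rank the same security controls differently.

```python
import math
import random

random.seed(0)

# Each control has a (cost, breach probability); a breach draws a random loss.
# CRRA utility over final wealth; all parameters are invented for illustration.
WEALTH = 100.0
CONTROLS = {"none": (0.0, 0.30), "basic": (5.0, 0.10), "strong": (12.0, 0.02)}

def crra(w, r):
    """Constant-relative-risk-aversion utility; r = 0 is risk neutrality."""
    return math.log(w) if r == 1 else w ** (1 - r) / (1 - r)

def expected_utility(control, r, sims=20_000):
    cost, p_breach = CONTROLS[control]
    total = 0.0
    for _ in range(sims):
        loss = random.uniform(40, 80) if random.random() < p_breach else 0.0
        total += crra(WEALTH - cost - loss, r)
    return total / sims

for r in (0.0, 2.0):
    best = max(CONTROLS, key=lambda c: expected_utility(c, r))
    print(f"risk aversion r={r}: preferred control = {best}")
```

With these numbers the risk-neutral manager buys the cheap control (it minimises expected loss plus cost), while the risk-averse one pays extra for the strong control to shrink the loss tail, which is exactly the overspending-versus-gambling tension the abstract describes.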
11

Kurth, Andrew Hamilton. "A Stochastic Simulation of the North Dakota Ethanol Production Incentive." Thesis, North Dakota State University, 2009. https://hdl.handle.net/10365/29635.

Abstract:
The objective of this research is to determine the effect the North Dakota Ethanol Production Incentive has on ethanol plant survivability. This thesis uses a stochastic simulation to show the financial performance of an ethanol plant with and without subsidy support. Historical corn and ethanol prices are used to simulate market conditions a typical ethanol plant might face. Using the forecast prices, an ethanol plant balance sheet was created to show how a plant would perform in normal market conditions, as well as how the plant would perform with the Ethanol Production Incentive and with the alternative subsidy structures that were developed. The results showed the Ethanol Production Incentive was the most effective subsidy tested, and it does appear to improve plant balance sheets to a certain extent during a downturn.
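The simulation design described above can be sketched in miniature: draw random corn and ethanol prices, compute a crush margin, and compare loss probabilities with and without a counter-cyclical subsidy. The price distributions, cost figures and subsidy rule below are invented for illustration and are not the North Dakota programme's actual terms.

```python
import random

random.seed(42)

GALLONS = 50_000_000          # assumed annual ethanol output
BUSHELS_PER_GALLON = 1 / 2.8  # corn required per gallon (≈2.8 gal/bushel)
FIXED_COSTS = 20_000_000      # assumed annual fixed costs incl. debt service

def simulate(years=10_000, subsidised=False):
    """Monte Carlo estimate of the probability of an annual loss."""
    failures = 0
    for _ in range(years):
        ethanol = random.gauss(1.80, 0.35)   # $/gal, illustrative distribution
        corn = random.gauss(3.50, 0.80)      # $/bu, illustrative distribution
        margin = GALLONS * (ethanol - corn * BUSHELS_PER_GALLON) - FIXED_COSTS
        if subsidised and margin < 0:
            # stylised incentive: a capped per-gallon supplement in bad years
            margin += min(0.05 * GALLONS, -margin)
        if margin < 0:
            failures += 1
    return failures / years

print("P(loss) without incentive:", simulate())
print("P(loss) with incentive:   ", simulate(subsidised=True))
```

The counter-cyclical cap is what makes the subsidy cheap in good years but supportive in downturns, which mirrors the thesis's finding that the incentive improves the balance sheet "to a certain extent during a downturn".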
12

Rauktytė, Aidana. "VaR METODOLOGIJOS ANALIZĖ IR METODŲ PRAKTINIS TAIKYMAS." Master's thesis, Lithuanian Academic Libraries Network (LABT), 2010. http://vddb.laba.lt/obj/LT-eLABa-0001:E.02~2010~D_20101108_095132-99409.

Abstract:
Magistro darbe nagrinėjamas šiuo metu vienas moderniausių rizikos matų – rizikos vertė (angl.Value-at-risk) Analizuojami trys pagrindiniai VaR rodiklio skaičiavimo metodai: variacijos/kovariacijos, istorinio modeliavimo ir Monte Karlo simuliacijos keliamų prielaidų, sudėtingumo ir adekvatumo požiūriais. Visų trijų metodų pagalba dabartinėmis rinkos sąlygomis atliekami empiriniai tyrimai, siekiant įvertinti rizikos vertes valiutų ir akcijų rinkose, atlikta gautų rizikos verčių palyginamoji analizė bei patikrintas naudotų metodų tikslumas. Autorės suformuluota hipotezė, kad VaR rodiklio skaičiavimo metodai nėra tinkami naudoti pereinamuoju laikotarpiu kuomet ekonominė aplinka ir padėtis nėra stabili iš dalies patvirtinta, nes atliktų tyrimų rezultatai atmetė tik variacijos/kovariacijos bei istorinio modeliavimo metodų tinkamumą.
This master's thesis analyses one of the most modern risk measures: Value-at-Risk (VaR). It examines the three main VaR calculation methods (variance/covariance, historical simulation and Monte Carlo simulation) in terms of their assumptions, complexity and adequacy. Empirical studies using all three methods were carried out under current market conditions to assess risk values in the currency and stock markets; a comparative analysis of the resulting risk values was performed and the accuracy of the methods was verified. The author's hypothesis, that VaR calculation methods are not suitable for use during a transitional period when the economic environment and situation are not stable, was partially confirmed, since the test results rejected only the variance/covariance and historical simulation methods.
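The three methods compared in the thesis can be sketched on synthetic data. Note the deliberate circularity of this toy setup: the returns are drawn from a normal distribution, which is exactly the assumption the variance/covariance method relies on and which real markets, especially unstable ones, often violate. Portfolio size and distribution parameters are illustrative.

```python
import random
import statistics

random.seed(7)

# Synthetic daily portfolio returns (normal here purely for illustration).
returns = [random.gauss(0.0005, 0.012) for _ in range(1000)]
portfolio = 1_000_000
alpha = 0.99
z_99 = 2.326  # one-sided 99% standard-normal quantile

# 1) Variance/covariance: assume normality, use sample mean and st. dev.
mu, sigma = statistics.mean(returns), statistics.stdev(returns)
var_parametric = portfolio * (z_99 * sigma - mu)

# 2) Historical simulation: read the loss quantile off the empirical data.
losses = sorted(-r * portfolio for r in returns)
var_historical = losses[int(alpha * len(losses))]

# 3) Monte Carlo: resimulate returns from a fitted model, take the quantile.
sim = sorted(-portfolio * random.gauss(mu, sigma) for _ in range(100_000))
var_monte_carlo = sim[int(alpha * len(sim))]

for name, v in [("parametric", var_parametric),
                ("historical", var_historical),
                ("Monte Carlo", var_monte_carlo)]:
    print(f"{name:11s} 99% 1-day VaR: {v:,.0f}")
```

On normal data all three estimates roughly agree; the thesis's backtesting question is precisely which of them still passes accuracy checks when the data-generating process shifts.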
13

Marinelli, Marco Antonio. "Modelling and communicating the effects of spatial data uncertainty on spatially based decision-making." Thesis, Curtin University, 2011. http://hdl.handle.net/20.500.11937/1842.

Abstract:
Important economic and environmental decisions are routinely based on spatial/temporal models. This thesis studies the uncertainty in the predictions of three such models caused by uncertainty propagation. This is considered important as it quantifies the sensitivity of a model’s prediction to uncertainty in other components of the model, such as the model’s inputs. Furthermore, many software packages that implement these models do not permit users to easily visualize either the uncertainty in the data inputs, the effects of the model on the magnitude of that uncertainty, or the sensitivity of the uncertainty to individual data layers. In this thesis, emphasis has been placed on demonstrating the methods used to quantify and then, to a lesser extent, visualize the sensitivity of the models. The key questions to be resolved with regard to the source of the uncertainty and the structure of the model are also investigated. For all models investigated, the propagation paths that most influence the uncertainty in the prediction were determined. How the influence of these paths can be minimised, or removed, is also discussed. Two different methods commonly used to analyse uncertainty propagation were investigated. The first is the analytical Taylor series method, which can be applied to models with continuous functions. The second is the Monte Carlo simulation method, which can be used on most types of models. The latter can also be used to investigate how the uncertainty propagation changes when the distribution of model uncertainty is non-Gaussian, which is not possible with the Taylor method. The models tested were two continuous Precision Agriculture models and one ecological niche statistical model. The Precision Agriculture models studied were the nitrogen (N) availability component of the SPLAT model and the Mitscherlich precision agricultural model.
The third, called BIOCLIM, is a probabilistic model that can be used to investigate and predict species distributions for both native and agricultural species. It was generally expected that, for a specific model, the results from the Taylor method and the Monte Carlo method would agree. However, it was found that the structure of the model in fact influences this agreement, especially in the Mitscherlich model, which has more complex non-linear functions. Several non-normal input uncertainty distributions were investigated to see if they could improve the agreement between these methods. The uncertainty and skew of the Monte Carlo results relative to the prediction of the model were also useful in highlighting how the distribution of model inputs, and the model's structure itself, may bias the results. The version of BIOCLIM used in this study uses three basic spatial climatic input layers (monthly maximum and minimum temperature and precipitation layers) and a dataset describing the current spatial distribution of the species of interest. The thesis investigated how uncertainty in the input data propagates through to the estimated spatial distribution for field peas (Pisum sativum) in the agriculturally significant region of south-west Western Australia. The results clearly show the effect of uncertainty in the input layers on the predicted species' distribution map. In places the uncertainty significantly influences the final validity of the result, and the spatial distribution of the validity also varies significantly.
14

Popiolek, Nathalie. "Modèle prospectif de consommation d'énergie dans l'agriculture : le cas de la fertilisation azotée a l'horizon 2010." Grenoble 2, 1993. http://www.theses.fr/1993GRE21027.

Abstract:
La première partie de cette thèse donne une vision critique des méthodes de prospective disponibles. À l'issue de ce panorama est présentée la méthodologie prospective choisie pour la construction du modèle. Cette construction constitue la deuxième étape du travail. Le système "énergie et fertilisation azotée" est défini (liste des variables clés pour son évolution à l'horizon 2010, analyse de leurs relations) puis modélisé en mettant l'accent sur la demande d'engrais azotés. À l'horizon 2010, celle-ci est déterminée à l'aide d'un modèle de programmation linéaire (aropaj) traduisant les choix culturaux des agriculteurs. Ce modèle, destiné au départ au court-moyen terme, est adapté à l'horizon 2010 grâce à la construction de modèles périphériques projetant les facteurs fixes (à court terme) des exploitations et leur mode de production. Ces projections sont effectuées suivant différents scénarios d'évolution. L'élaboration de ces scénarios, à partir d'enquêtes d'experts (delphi), fait l'objet de la troisième partie. Dans le cadre de ces scénarios, des simulations sont effectuées sur le modèle afin de tester notamment les conséquences de la future politique agricole commune ou celles de l'introduction d'innovations technologiques.
The first part of this thesis gives a critical overview of available forecasting methods. After this survey, the methodology chosen for the construction of the model is presented. This construction constitutes the second stage of the work. The "energy and nitrogen fertilization" system is defined (listing the key variables for its evolution towards the year 2010 and analysing their relationships) and then modelled, highlighting the demand for nitrogen fertilizers. This demand is determined for the year 2010 by means of a linear programming model (aropaj) based on farmers' cropping choices. The model, originally designed for the short and medium term, is adapted to the long term (2010) by using peripheral models that project the (short-term) fixed factors of agricultural holdings and their mode of production. These projections follow different evolution scenarios. The elaboration of these scenarios, from expert inquiries (delphi), constitutes the third part of the thesis. Within these scenarios, simulations are run on the linear programme to test, in particular, the impact of the future Common Agricultural Policy and of the introduction of technological innovations.
15

Madeira, Marcelo Gomes. "Comparação de tecnicas de analise de risco aplicadas ao desenvolvimento de campos de petroleo." [s.n.], 2005. http://repositorio.unicamp.br/jspui/handle/REPOSIP/263732.

Abstract:
Orientadores: Denis Jose Schiozer, Eliana L. Ligero
Dissertação (mestrado) - Universidade Estadual de Campinas, Faculdade de Engenharia Mecanica, Instituto de Geociencias
Resumo: Os processos de tomada de decisões em campos de petróleo estão associados a grandes riscos provenientes de incertezas geológicas, econômicas e tecnológicas e altos investimentos. Nas fases de avaliação e desenvolvimento dos campos, torna-se necessário modelar o processo de recuperação com confiabilidade aumentando o esforço computacional. Uma forma de acelerar o processo é através de simplificações sendo algumas discutidas neste trabalho: técnica de quantificação do risco (Monte Carlo, árvore de derivação), redução no número de atributos, tratamento simplificado de atributos e simplificação da modelagem do reservatório. Ênfase especial está sendo dada à (1) comparação entre Monte Carlo e árvore de derivação e (2) desenvolvimento de modelos rápidos através de planejamento de experimentos e superfície de resposta. Trabalhos recentes estão sendo apresentados sobre estas técnicas, mas normalmente mostrando aplicações e não comparação entre alternativas. O objetivo deste trabalho é comparar estas técnicas levando em consideração a confiabilidade, a precisão dos resultados e aceleração do processo. Estas técnicas são aplicadas a um campo marítimo e os resultados mostram que (1) é possível reduzir significativamente o número de simulações do fluxo mantendo a precisão dos resultados e que (2) algumas simplificações podem afetar o processo de decisão
Abstract: Petroleum field decision-making is associated with high risks due to geological, economic and technological uncertainties and high investments, mainly in the appraisal and development phases, where it is necessary to model the recovery process with higher precision, increasing computational time. One way to speed up the process is through simplification; some simplifications are discussed in this work: the technique used to quantify risk (Monte Carlo and derivative tree), reduction of the number of attributes, simplified treatment of attributes, and simplification of the reservoir modelling process. Special emphasis is given to (1) the comparison between the Monte Carlo and derivative-tree techniques and (2) the development of fast models through experimental design and the response surface method. Recent works have presented these techniques, but they normally show applications without comparing alternatives. The objective of this work is to compare these techniques taking into account reliability, precision of the results and speedup of the process. The techniques are applied to an offshore field, and the results show that it is possible to significantly reduce the number of flow simulations while maintaining the precision of the results. It is also shown that some simplifications can yield different results, affecting the decision process.
Master's dissertation
Reservoirs and Management
Master in Petroleum Science and Engineering
APA, Harvard, Vancouver, ISO, and other styles
16

Furlani, Luiz Gustavo Cassilatti. "Flutuações cambiais e política monetária no Brasil : evidências econométricas e de simulação." Biblioteca Digital de Teses e Dissertações da UFRGS, 2008. http://hdl.handle.net/10183/14996.

Full text
Abstract:
The literature on monetary economics has aroused growing interest within macroeconomics. Owing to computational advances, models have become increasingly complex and accurate, allowing detailed study of the relationships between the real and nominal variables of the economy. Using a dynamic stochastic general equilibrium (DSGE) model based on Gali and Monacelli (2005), we propose and estimate a model for the Brazilian economy with Bayesian methods in order to assess whether the Central Bank of Brazil takes exchange-rate fluctuations into account in the conduct of monetary policy. The most important result of this study is that there is no evidence that the Central Bank of Brazil directly changes the interest-rate path in response to exchange-rate movements. A simulation exercise is also carried out: the economy quickly accommodates shocks induced separately on the exchange rate, the terms of trade, the interest rate and world inflation.
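The Bayesian estimation step described here can be illustrated, far below DSGE scale, with a random-walk Metropolis sampler for the exchange-rate coefficient of a single-equation Taylor-type rule. Everything below is a made-up illustration (simulated data, known coefficients and error variance), not the thesis's model; it only shows the mechanics of obtaining a posterior for one policy parameter.

```python
import math
import random

random.seed(1)

# Simulate a toy policy rule i_t = 1.5*pi_t + phi_e*de_t + noise, with the
# true exchange-rate response phi_e set to zero (mirroring the paper's finding).
n = 200
pi = [random.gauss(0, 1) for _ in range(n)]
de = [random.gauss(0, 1) for _ in range(n)]
i_rate = [1.5 * pi[t] + 0.0 * de[t] + random.gauss(0, 0.5) for t in range(n)]

def log_post(phi):
    # Gaussian likelihood (sigma = 0.5 treated as known) plus a N(0,1) prior.
    ll = sum(-((i_rate[t] - 1.5 * pi[t] - phi * de[t]) ** 2) / (2 * 0.25)
             for t in range(n))
    return ll - phi ** 2 / 2.0

def metropolis(n_draws=3000, step=0.05):
    # Random-walk Metropolis: propose, accept with prob min(1, ratio).
    phi, draws = 0.0, []
    lp = log_post(phi)
    for _ in range(n_draws):
        prop = phi + random.gauss(0, step)
        lp_prop = log_post(prop)
        if math.log(random.random()) < lp_prop - lp:
            phi, lp = prop, lp_prop
        draws.append(phi)
    return draws
```

A posterior for phi_e concentrated around zero is the single-equation analogue of the finding that the interest-rate path does not respond directly to the exchange rate.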
APA, Harvard, Vancouver, ISO, and other styles
17

Thiebaut, Sophie. "Maladies chroniques et pertes d'autonomie chez les personnes âgees : évolutions des dépenses de santé et de la prise en charge de la dépendance sous l'effet du vieillissement de la population." Thesis, Aix-Marseille 2, 2011. http://www.theses.fr/2011AIX24026.

Full text
Abstract:
Based on two empirical analyses and a theoretical modelling exercise, this thesis addresses population ageing in France in terms of expenditure on health-care goods and services and of the care of dependent elderly people. The first chapter develops a dynamic microsimulation method to assess how outpatient reimbursable drug expenditures evolve under the combined effects of population ageing and changes in the chronic health status of the elderly. The second chapter examines the ins and outs of a possible reform of the Personal Allowance for Autonomy (APA) that would recover part of the funds paid to dependent elderly people from their estates. We develop a theoretical model of intergenerational transfers that individualises the decisions of the two members of a family: a dependent parent and a child who is a potential informal caregiver. The final chapter empirically evaluates the factors affecting the demand for home care by APA recipients, focusing on price effects in the demand for formal care in order to anticipate possible reforms of the public allowance.
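The dynamic microsimulation idea in the first chapter can be sketched in a few lines: individuals age period by period, transition into chronic illness with an age-dependent probability, and accumulate drug spending by health state. The onset probabilities and costs below are invented illustration values, not the thesis's estimates.

```python
import random

random.seed(42)

def onset_probability(age):
    # Assumed hazard: chronic-disease onset risk rises with age, capped at 25%.
    return min(0.002 * max(age - 40, 0), 0.25)

COST_HEALTHY, COST_CHRONIC = 150.0, 900.0   # hypothetical annual drug spending

def simulate(population, years):
    # population: list of (age, chronic) tuples; returns mean spending per year.
    spending = []
    people = [list(p) for p in population]
    for _ in range(years):
        total = 0.0
        for person in people:
            person[0] += 1                       # ageing
            if not person[1] and random.random() < onset_probability(person[0]):
                person[1] = True                 # chronic-disease onset
            total += COST_CHRONIC if person[1] else COST_HEALTHY
        spending.append(total / len(people))
    return spending

# An elderly cohort with ~30% initial chronic prevalence (assumption).
cohort = [(random.randint(60, 80), random.random() < 0.3) for _ in range(2000)]
path = simulate(cohort, 10)
```

As the cohort ages, chronic prevalence, and with it per-capita drug expenditure, drifts upward; the thesis's microsimulation does the same with empirically estimated transitions.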
APA, Harvard, Vancouver, ISO, and other styles
18

Горкуненко, Андрій Борисович, Андрей Борисович Горкуненко, and A. B. Horkunenko. "Моделювання та методи аналізу і прогнозування циклічних економічних процесів в інформаційних системах підтримки прийняття рішень." Thesis, Тернопільський національний технічний університет ім. Івана Пулюя, 2013. http://elartu.tntu.edu.ua/handle/123456789/2033.

Full text
Abstract:
The thesis was prepared at the Ternopil Ivan Puluj National Technical University of the Ministry of Education and Science of Ukraine. The defence took place in 2013 at a meeting of specialised academic council K58.052.01 at the Ternopil Ivan Puluj National Technical University, at: 46001, Ternopil, 56 Ruska St., building 2, room 79. The dissertation is available at the scientific and technical library of the Ternopil Ivan Puluj National Technical University at: 46001, Ternopil, 56 Ruska St.
The thesis is devoted to the development of mathematical models and methods for the analysis, computer simulation and forecasting of cyclic economic processes. A new mathematical model of a cyclic economic process is developed as the sum of a deterministic polynomial function and a cyclic random process. A further new model describes a set of interrelated cyclic economic processes as the sum of a vector of deterministic polynomial functions and a vector of cyclic, rhythmically related random processes. Methods for the statistical processing of cyclic economic processes based on the new models are substantiated, and a forecasting method is developed that improves forecasting accuracy by 25% compared with a known method at the same confidence level. Methods for the computer simulation of cyclic economic processes are substantiated and applied, and a software system for their modelling, analysis and forecasting, suitable as a component of a decision-support information system, is developed, certified and tested.
In more detail, existing approaches, mathematical models and methods for analysing and forecasting cyclic economic processes are reviewed, their main shortcomings are identified, and a comparative table of the models is compiled. The new single-process model, by simultaneously accounting for trend components, stochasticity, cyclicity and the variability of the rhythm of the cyclic components, increases the accuracy and reliability of modelling, analysis and forecasting in decision-support systems. The model of interrelated processes, by reflecting the commonality and variability of their rhythmic structure, justifies informative methods for their joint statistical analysis. A method for expanding the statistical estimates of cyclic economic processes into one- and two-dimensional Fourier series substantially reduces the dimensionality of the space of informative (diagnostic and prognostic) features in decision-support systems. The computer-simulation methods, owing to an identification procedure for the generation algorithm and the simultaneous accounting of the morphological and rhythm characteristics of the simulated processes, increase the accuracy and reliability of their simulation. The software system for modelling, analysis and forecasting of cyclic economic processes was certified and tested, found to meet its requirements, and copyright certificates were obtained for its components.
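The model class named in the abstract, a deterministic polynomial trend plus a cyclic random component, can be sketched with a linear trend and a noisy fixed-period cycle. All coefficients and the 12-period cycle below are made-up illustration values; the fit uses closed-form OLS for the trend and projection on sine/cosine for the cycle.

```python
import math
import random

random.seed(7)

# Synthetic series: linear trend + 12-period cycle + noise (assumed values).
T, PERIOD = 120, 12
series = [5.0 + 0.3 * t + 4.0 * math.sin(2 * math.pi * t / PERIOD)
          + random.gauss(0, 0.5) for t in range(T)]

def fit_trend(y):
    # Closed-form OLS for y_t = a + b*t.
    n = len(y)
    tbar, ybar = (n - 1) / 2.0, sum(y) / n
    b = sum((t - tbar) * (y[t] - ybar) for t in range(n)) / \
        sum((t - tbar) ** 2 for t in range(n))
    return ybar - b * tbar, b

def fit_cycle(resid, period):
    # Project the detrended series on sin/cos at the (known) period.
    n = len(resid)
    s = 2.0 / n * sum(resid[t] * math.sin(2 * math.pi * t / period) for t in range(n))
    c = 2.0 / n * sum(resid[t] * math.cos(2 * math.pi * t / period) for t in range(n))
    return s, c

a, b = fit_trend(series)
s, c = fit_cycle([series[t] - (a + b * t) for t in range(T)], PERIOD)

def forecast(t):
    # Out-of-sample forecast = fitted trend + fitted cycle.
    return a + b * t + s * math.sin(2 * math.pi * t / PERIOD) \
                     + c * math.cos(2 * math.pi * t / PERIOD)
```

The thesis's contribution lies in letting the rhythm of the cyclic component itself be variable and random; this sketch fixes the period only to keep the decomposition visible.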
APA, Harvard, Vancouver, ISO, and other styles
19

Pérgola, Gabriel Campos. "Seguro contra risco de downside de uma carteira: uma proposta híbrida frequentista-Bayesiana com uso de derivativos." Repositório Institucional do FGV, 2013. http://hdl.handle.net/10438/10468.

Full text
Abstract:
Portfolio insurance allows a manager to limit downside risk while retaining participation in upside markets. This dissertation introduces a framework for portfolio-insurance optimisation based on a hybrid frequentist-Bayesian approach using derivatives. The joint distribution of regular returns is obtained with a frequentist statistical method once outliers have been identified and removed from the sample. The joint distribution of extreme returns, in turn, is modelled by a Bayesian network whose topology reflects the events the manager considers critical to portfolio performance. Linking the regular and extreme return distributions, we simulate future scenarios for the portfolio value; the insurance sub-portfolio is then optimised with the Differential Evolution algorithm. We demonstrate the framework step by step for a long portfolio of stocks in the Bovespa Index (Ibovespa), using market data from 2008 to 2012.
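The optimiser named in the abstract, Differential Evolution, is simple enough to sketch in full. Below is a minimal DE/rand/1/bin implementation, followed by a toy hedging objective (invented scenario distribution, put payoff and 2% premium, not the dissertation's model) that chooses the hedged fraction minimising the average loss of the worst 5% of scenarios.

```python
import random

random.seed(3)

def differential_evolution(obj, bounds, pop_size=20, gens=100, F=0.8, CR=0.9):
    # Minimal DE/rand/1/bin: mutate with a scaled difference of two random
    # members, binomial crossover, greedy selection.
    dim = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    scores = [obj(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
            trial = []
            for d, (lo, hi) in enumerate(bounds):
                if random.random() < CR:
                    v = pop[a][d] + F * (pop[b][d] - pop[c][d])
                else:
                    v = pop[i][d]
                trial.append(min(max(v, lo), hi))   # clip to bounds
            s = obj(trial)
            if s <= scores[i]:
                pop[i], scores[i] = trial, s
    best = min(range(pop_size), key=lambda i: scores[i])
    return pop[best], scores[best]

# Toy insurance use (assumed numbers): pick the hedged fraction h in [0,1];
# the put costs 2% and pays off below a -10% portfolio return.
scenarios = [random.gauss(0.08, 0.2) for _ in range(400)]

def expected_shortfall(x):
    h = x[0]
    pnl = sorted(r + h * (max(-0.10 - r, 0.0) - 0.02) for r in scenarios)
    tail = pnl[:len(pnl) // 20]                     # worst 5% of scenarios
    return -sum(tail) / len(tail)

best_h, es = differential_evolution(expected_shortfall, [(0.0, 1.0)])
```

In the dissertation the scenarios come from the linked regular/extreme return model and the decision variables are the derivative positions themselves; the optimisation loop is the same idea.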
APA, Harvard, Vancouver, ISO, and other styles
20

Bilha, Vitor Meira. "Análise do processo de retificação interna aplicado à fabricação de bicos injetores diesel." Universidade Tecnológica Federal do Paraná, 2015. http://repositorio.utfpr.edu.br/jspui/handle/1/1889.

Full text
Abstract:
More efficient use of natural resources has led industry to increase the efficiency of manufactured products, and Diesel commercial vehicles are part of this scenario. In Diesel engines, an important component of the injection system is the injector nozzle. Recent legislation in Brazil introduced EURO5 emission limits; the new parameters affected the nozzle design, with tolerances being tightened, particularly in the seat area of the nozzle body, changing the functional opening-pressure parameter of the product. The ground conical seat surface therefore has a large impact on the opening pressure and, consequently, on Diesel-engine performance. This study analyses a typical, recurrent surface-topography defect in the internal conical grinding of the nozzle-body seat. A designed experiment was carried out according to the Taguchi method, and the signal-to-noise ratio was determined for two 2D topography parameters; the seat surface was also examined with 3D topographic analysis. The results include the indication of possible causes of the recurrent defect, characterisation of the main elements of the grinding process, characterisation of the ground surface, and optimisation of the grinding-process parameters.
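The Taguchi signal-to-noise ratios used in such designed experiments have standard closed forms. The functions below are the three textbook variants; the roughness readings are invented illustration values, not measurements from the study.

```python
import math

def sn_smaller_is_better(y):
    # e.g. surface-roughness responses, where lower values are better.
    return -10.0 * math.log10(sum(v * v for v in y) / len(y))

def sn_larger_is_better(y):
    return -10.0 * math.log10(sum(1.0 / (v * v) for v in y) / len(y))

def sn_nominal_is_best(y):
    n = len(y)
    mean = sum(y) / n
    var = sum((v - mean) ** 2 for v in y) / (n - 1)
    return 10.0 * math.log10(mean * mean / var)

# Hypothetical Ra readings (micrometres) from two trial conditions.
roughness_trial_a = [0.41, 0.38, 0.44]
roughness_trial_b = [0.52, 0.57, 0.49]
```

In a Taguchi analysis the parameter level with the highest mean S/N ratio across trials is selected; here the lower-roughness condition scores higher under the smaller-is-better ratio, as expected.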
APA, Harvard, Vancouver, ISO, and other styles
21

Puigoriol, Forcada Josep Maria. "Una metodología para caracterización elastoplástica cuasi-estática simplificada de materiales termoplásticos inyectados en proceso industrial para simulación estructural." Doctoral thesis, Universitat Ramon Llull, 2013. http://hdl.handle.net/10803/121469.

Full text
Abstract:
This research proposes a materials-characterisation methodology to feed the classical von Mises elastoplastic constitutive model with isotropic hardening, in order to obtain a better response in static computational simulations (CAE) of parts manufactured from thermoplastics. The characterisation strategy must be simple to apply, so as to facilitate routine use in industry. The constitutive models commonly used in the industrial sector to address static material-nonlinearity problems for this class of polymers, requiring only the tensile test on a universal testing machine, are presented, together with a statement of the variables most influential on the computational response. The methodology proposes a Master Scale Factor that reduces the elastoplastic curve for a family of materials; this reduction factor is also used to define the yield limit and the elastic secant modulus used in the calculation. The mechanical behaviour of four materials, representative of families used in automotive interior systems, was evaluated using samples cut from real components, verifying the decrease in the mechanical properties of these materials in the context of the usual industrial injection process. The new characterisation strategy is fed by the information obtained from this tensile test. The methodology is validated through a hybrid test-simulation exercise on a real injected part, and finally the characterisation strategy is verified, through experimental correlation, on a real Door Panel Module assembly.
APA, Harvard, Vancouver, ISO, and other styles
22

Beisler, Matthias Werner. "Modelling of input data uncertainty based on random set theory for evaluation of the financial feasibility for hydropower projects." Doctoral thesis, Technische Universitaet Bergakademie Freiberg Universitaetsbibliothek "Georgius Agricola", 2011. http://nbn-resolving.de/urn:nbn:de:bsz:105-qucosa-71564.

Full text
Abstract:
The design of hydropower projects requires a comprehensive planning process in order to fully exploit the existing hydropower potential and to maximise the future revenues of the plant. To this end, and to satisfy the approval requirements of a complex hydropower development, the conceptual design must, already at the planning stage, contemplate a wide range of influencing design factors and ensure appropriate consideration of all related aspects. Since most of the technical and economic parameters required for the detailed and final design cannot be determined precisely at early planning stages, crucial design parameters such as design discharge and hydraulic head have to be examined through an extensive optimisation process. A disadvantage inherent in the commonly used deterministic analysis is the lack of objectivity in the selection of input parameters; moreover, it cannot be ensured that the entire existing parameter ranges and all possible parameter combinations are covered. Probabilistic methods use discrete probability distributions or parameter input ranges to capture the full range of uncertainties resulting from the information deficit of the planning phase, and integrate them into the optimisation by means of an alternative calculation method. The investigated method assists in the mathematical assessment and integration of such uncertainties into the rational economic appraisal of complex infrastructure projects. The assessment includes an exemplary verification of the extent to which Random Set Theory can be used to determine the input parameters relevant for the optimisation of hydropower projects, and evaluates possible improvements with respect to the accuracy and significance of the calculated results.
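Random-set propagation of interval-valued inputs can be sketched as follows. Each uncertain input is a set of focal intervals with basic probability masses; joint focal elements are formed by Cartesian products, and lower/upper bounds on the expected outcome are obtained by evaluating the model at the corners of each box (valid here because the toy benefit function is monotone in each argument). The discharge/head intervals, masses and revenue formula are invented illustration values, not figures from the thesis.

```python
import itertools

# Focal intervals with basic probability masses (assumed values).
discharge = [((8.0, 10.0), 0.6), ((10.0, 14.0), 0.4)]    # m^3/s
head      = [((40.0, 45.0), 0.5), ((45.0, 55.0), 0.5)]   # m

def annual_benefit(q, h):
    # Toy revenue model: P = rho*g*q*h*eta, hours per year, price per kWh.
    return 9.81 * q * h * 0.85 * 8760 * 0.05 / 1000.0

def expectation_bounds(f, *inputs):
    # Random-set independence: joint focal elements are Cartesian products of
    # the marginal ones, with multiplied masses.  f monotone in each argument,
    # so its range over a box is bracketed by the corner evaluations.
    lower = upper = 0.0
    for combo in itertools.product(*inputs):
        boxes = [iv for iv, _ in combo]
        mass = 1.0
        for _, m in combo:
            mass *= m
        corners = [f(*pt) for pt in itertools.product(*boxes)]
        lower += mass * min(corners)
        upper += mass * max(corners)
    return lower, upper

lo, hi = expectation_bounds(annual_benefit, discharge, head)
```

The `[lo, hi]` interval plays the role of the belief/plausibility bracket on expected revenue: instead of a single deterministic figure, the appraisal reports the range consistent with the information deficit at planning stage.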
APA, Harvard, Vancouver, ISO, and other styles
23

Forneron, Jean-Jacques Mitchell. "Essays on Simulation-Based Estimation." Thesis, 2018. https://doi.org/10.7916/D8PZ6RXC.

Full text
Abstract:
Complex nonlinear dynamic models with an intractable likelihood or intractable moments are increasingly common in economics. A popular approach to estimating these models is to match informative sample moments with simulated moments from a fully parameterized model using SMM or Indirect Inference. This dissertation consists of three chapters exploring different aspects of such simulation-based estimation methods, presented in the order in which they were written. Chapter 1, written with Serena Ng, provides an overview of existing frequentist and Bayesian simulation-based estimators. These estimators are computationally similar in the sense that they all make use of simulations from the model in order to carry out the estimation. To better understand the relationship between them, the chapter introduces a Reverse Sampler that expresses the Bayesian posterior moments as a weighted average of frequentist estimates, highlighting a deeper connection between the two classes of estimators beyond the simulation aspect. The Reverse Sampler also allows us to compare the higher-order bias properties of these estimators. We find that while all estimators have an automatic bias-correction property (Gourieroux et al., 1993), the Bayesian estimator introduces two additional biases: the first is due to computing a posterior mean rather than the mode; the second is due to the prior, which penalizes the estimates in a particular direction. Chapter 2, also written with Serena Ng, proves that the Reverse Sampler described above targets the desired Approximate Bayesian Computation (ABC) posterior distribution. The idea relies on a change-of-variable argument: the frequentist optimization step implies a nonlinear transformation, so the unweighted draws follow a distribution that depends on the likelihood coming from the simulations and on a Jacobian term arising from the transformation.
Hence, solving the frequentist estimation problem multiple times, with different numerical seeds, yields an optimization-based importance sampler whose weights depend on the prior and on the volume of the Jacobian of the nonlinear transformation. In models where optimization is relatively fast, this Reverse Sampler is shown to compare favourably to existing ABC-MCMC and ABC-SMC sampling methods. Chapter 3 relaxes the parametric assumptions on the distribution of the shocks in simulation-based estimation. It extends the existing SMM literature in which, even though the choice of moments is flexible and potentially nonparametric, the model itself is assumed to be fully parametric. The large-sample theory in this chapter allows for both time series and short panels, the two most common data types found in empirical applications. Using a flexible sieve density reduces the sensitivity of estimates and counterfactuals to an ad hoc choice of distribution such as the Gaussian density. Compared to existing work on sieve estimation, the Sieve-SMM estimator involves dynamically generated data, which implies non-standard bias and dependence properties. First, the dynamics imply an accumulation of bias, resulting in a larger nonparametric approximation error than in static models; to ensure that the bias does not accumulate too much, a set of decay conditions on the data-generating process is given and the resulting bias is derived. Second, by construction, the dependence properties of the simulated data vary with the parameter values, so that standard empirical-process results, which rely on a coupling argument, do not apply in this setting. This non-standard dependent empirical process is handled through an inequality built by adapting results from the existing literature; the results hold for bounded empirical processes under a geometric ergodicity condition. The theory is illustrated with Monte Carlo simulations and two empirical applications.
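The Reverse Sampler's core mechanic, re-solving a frequentist simulation-based estimation problem under many simulation seeds and collecting the resulting estimates, can be sketched on a deliberately trivial SMM problem where the moment is linear in the parameter so the minimiser has a closed form. The model, sample size and flat-prior/unit-Jacobian shortcut below are illustration assumptions, not the dissertation's setup.

```python
import random
import statistics

random.seed(0)

# Toy SMM: the model draws y = theta + e with e ~ N(0,1); the matched moment
# is the mean.  True theta = 1.8 (assumed).
data = [1.8 + random.gauss(0, 1) for _ in range(500)]
sample_moment = sum(data) / len(data)

def smm_estimate(seed, n_sim=500):
    # One frequentist SMM solve under a fixed simulation seed (common random
    # numbers): because the moment is linear in theta, the minimiser simply
    # shifts the simulated mean onto the sample mean.
    rng = random.Random(seed)
    shocks = [rng.gauss(0, 1) for _ in range(n_sim)]
    sim_moment_at_zero = sum(shocks) / n_sim
    return sample_moment - sim_moment_at_zero

# Reverse-Sampler flavour: the seed-by-seed estimates form a cloud of draws;
# with a flat prior and unit Jacobian (true in this linear toy case) their
# unweighted average approximates the ABC posterior mean.
estimates = [smm_estimate(seed) for seed in range(200)]
posterior_mean_like = statistics.mean(estimates)
```

In the dissertation the optimization step is genuinely nonlinear, so each draw also carries an importance weight from the prior and the Jacobian volume; the toy case above is the degenerate version where those weights are constant.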
APA, Harvard, Vancouver, ISO, and other styles
24

Beeler, Michael. "The Use of Simulation Methods to Understand and Control Pandemic Influenza." Thesis, 2012. http://hdl.handle.net/1807/33335.

Full text
Abstract:
This thesis investigates several uses of simulation methods to understand and control pandemic influenza in urban settings. An agent-based simulation, which models pandemic spread in a large metropolitan area, is used for two main purposes: to identify the shape of the distribution of pandemic outcomes, and to test for the presence of complex relationships between public health policy responses and underlying pandemic characteristics. The usefulness of pandemic simulation as a tool for assessing the cost-effectiveness of vaccination programs is critically evaluated through a rigorous comparison of three recent H1N1 vaccine cost-effectiveness studies. The potential for simulation methods to improve vaccine deployment is then demonstrated through a discrete-event simulation study of a mass immunization clinic.
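As a rough illustration of how repeated stochastic simulation exposes the shape of the outcome distribution, here is a minimal stochastic SIR sketch. It is a toy compartmental model with made-up parameters, far simpler than the agent-based simulation the thesis describes:

```python
import numpy as np

rng = np.random.default_rng(1)

def run_outbreak(n=10_000, beta=0.3, gamma=0.1, i0=5, days=300):
    """Stochastic SIR with daily binomial infection and recovery draws.
    Returns the final attack rate (fraction of the population ever infected)."""
    s, i = n - i0, i0
    for _ in range(days):
        if i == 0:
            break
        p_inf = 1.0 - np.exp(-beta * i / n)   # per-susceptible daily infection risk
        new_inf = rng.binomial(s, p_inf)
        new_rec = rng.binomial(i, gamma)
        s, i = s - new_inf, i + new_inf - new_rec
    return (n - s) / n

# Repeating the simulation reveals the shape of the outcome distribution:
# most runs produce a large outbreak, a few fizzle out early.
attack_rates = np.array([run_outbreak() for _ in range(200)])
print(f"median attack rate: {np.median(attack_rates):.2f}")
print(f"runs that fizzled out (<1% infected): {(attack_rates < 0.01).sum()}")
```

The bimodality of such outcome distributions (early extinction versus a large epidemic) is exactly the kind of distributional feature that point forecasts miss and that many-run simulation studies are designed to capture.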
APA, Harvard, Vancouver, ISO, and other styles
25

Mohsen, Fadi. "Internetbasierte Lehr-/Lernmethoden für die wirtschaftswissenschaftliche Hochschulausbildung." Doctoral thesis, 2002. http://hdl.handle.net/11858/00-1735-0000-000D-F265-A.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Parandvash, G. Hossein. "On the incorporation of nonnumeric information into the estimation of economic relationships in the presence of multicollinearity." Thesis, 1987. http://hdl.handle.net/1957/26851.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Oh, Young-Soo. "Optimal fiscal policy for the provision of local public services : some simulation results for the case of elementary education in Korea." Thesis, 1990. http://hdl.handle.net/10125/9625.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Robinson, Robert Howard. "An economic model of an open pit mine." Thesis, 2015. http://hdl.handle.net/10539/18256.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

"A prototype of collaborative virtual geographic environments to facilitate air pollution simulation." Thesis, 2009. http://library.cuhk.edu.hk/record=b6074966.

Full text
Abstract:
Air pollution simulation has several components, including data preparation, atmospheric circulation and air pollution dispersion modelling and computation, visualization of the model computation results, analysis and model evaluation. Of these components, only atmospheric circulation and air pollution dispersion modelling and computation are mature, while the other components are weak to a greater or lesser extent. To address these weaknesses, this thesis proposes to integrate the data, modelling and analysis into a multi-dimensional, virtual, geographically referenced environment. In addition, collaboration is used to address the multi-disciplinary knowledge requirements of air pollution simulation. On this basis, the concept of collaborative virtual geographic environments (CVGE) is proposed.
Air pollution associated with global economic growth is a worldwide problem. Scientists, government officials and the public are focused on better understanding, accurately predicting and efficiently controlling levels of air pollution. Air pollution simulation is one method used to achieve these goals. This research considers computer-supported simulation.
The contributions fall into two areas: CVGE and the practice of air pollution simulation. Regarding CVGE, the thesis (1) develops the conceptual framework of CVGE; (2) designs the architecture of a CVGE prototype to facilitate air pollution simulation; (3) proposes the concept of a "fuzzy boundary volume object" and designs a solution composed of a particle system wrapped in pollution boxes; and (4) examines the levels of geo-collaboration for air pollution simulation. For air pollution simulation, the thesis (1) integrates air pollution sources, geo-data, an atmospheric circulation model, an air pollution dispersion model, geo-visualization and analysis into a collaborative virtual geographic environment, supplying a new research methodology and platform for air pollution simulation; (2) provides a platform that is scalable and frees visualization from operational restrictions, paving the way for further extension; and (3) couples air pollution dispersion models with geo-information, opening up opportunities for cross-disciplinary studies between air pollution and other research areas, such as the economy, public health and urban planning.
The focus of this thesis is two-fold: one part is the development of a conceptual framework and prototype of CVGE from the practice of air pollution simulation; the other is applying this framework to facilitate air pollution simulation. The work of this thesis can be summarized as follows. (1) Defining the concept of CVGE, developing a conceptual framework for CVGE and discussing primary theories of CVGE. (2) Designing the architecture of a CVGE prototype to facilitate air pollution simulation. (3) Integration and computation of a complex atmospheric circulation model and an air pollution dispersion model based on high-performance computation. (4) Geo-visualization of air pollution distribution and dispersion based on calculations using air pollution dispersion models. (5) Geo-collaboration for air pollution simulation. And finally (6) CVGE-prototype-based air pollution simulation.
Future research has two main directions, again concerning CVGE and the practice of air pollution simulation. For CVGE, possibilities include: (1) more detailed research on the CVGE concept, primary theories and methodologies; (2) the efficient, standardized integration and management of heterogeneous geo-models within CVGE; and (3) the efficient rendering of complex structured objects in CVGE. Regarding practice, future research can be conducted into: (1) extending the air pollution dispersion models; and (2) improving the efficiency of air pollutant rendering with a particle system wrapped in pollution boxes.
Xu, Bingli
Adviser: Hui Lin.
Source: Dissertation Abstracts International, Volume: 72-11, Section: A, page: .
Thesis (Ph.D.)--Chinese University of Hong Kong, 2009.
Includes bibliographical references (leaves 251-264).
Abstract also in Chinese.
APA, Harvard, Vancouver, ISO, and other styles
30

Okeke, Tobenna. "Simulation and Economic Screening of Improved Oil Recovery Methods with Emphasis on Injection Profile Control Including Waterflooding, Polymer Flooding and a Thermally Activated Deep Diverting Gel." Thesis, 2012. http://hdl.handle.net/1969.1/ETD-TAMU-2012-05-10765.

Full text
Abstract:
The large volume of water produced during the extraction of oil presents a significant problem due to the high cost of disposing of it in an environmentally friendly manner. On average, an estimated seven barrels of water are produced per barrel of oil in the US alone, and the associated treatment and disposal cost is an estimated $5-10 billion. Besides making oil-water separation more complex, produced water also causes problems such as corrosion in the wellbore, decline in production rate and ultimate hydrocarbon recovery, and premature well or field abandonment. Water production can be especially problematic during waterflooding in a highly heterogeneous reservoir with vertical communication between layers, leading to an uneven flood front, cross-flow between high- and low-permeability layers, and early water breakthrough from high-permeability layers. Technologies to counteract this include reducing the mobility of water or placing a permeability block in the higher-permeability, swept zones. This research was initiated to evaluate the potential effectiveness of the latter method, known as deep diverting gels (DDG), to plug thief zones deep within the reservoir and far from the injection well. To evaluate the performance of DDG, its injection was modeled, sensitivities were run for a range of reservoir characteristics and conditions, and an economic analysis was performed. The performance of DDG was then compared to other recovery methods, specifically waterflooding and polymer flooding, from a technical and economic perspective. A literature review was performed on the background of injection profile control methods and their respective designs and technical capabilities. For the methods selected, Schlumberger's Eclipse software was used to simulate their behavior in a reservoir using realistic and simplified assumptions of reservoir characteristics and fluid properties. 
The simulation results obtained were then used to carry out economic analyses upon which conclusions and recommendations are based. These results show that the factor with the largest impact on the economic success of this method versus a polymer flood was the amount of incremental oil produced. By comparing net present values of the different methods, it was found that the polymer flood was the most successful with the highest NPV for each configuration followed by DDG.
APA, Harvard, Vancouver, ISO, and other styles
31

Gomes, Diogo Filipe Meireles. "A comparative evaluation of VaR models using Monte Carlo simulations." Master's thesis, 2020. http://hdl.handle.net/10071/21164.

Full text
Abstract:
Quantile regression models emerge as an alternative Value-at-Risk (VaR) methodology that does not require any specific distributional assumption. This dissertation describes and tests a recent quantile regression VaR model, proposed by Zheng et al. (2018), that introduces a nontrivial transformation enabling the use of Generalized Autoregressive Conditional Heteroskedasticity (GARCH) volatility models. Their study showed that this approach to conditional quantile estimation for a GARCH(1,1) model provides promising results. We test this new model against a group of benchmarks composed of traditional VaR methods and other quantile regression VaR models. In order to evaluate the performance of the new VaR model, we generate returns by Monte Carlo simulations following a GARCH(1,1) process similar to that of Zheng et al. (2018). We then change the return-process parameters to what we consider more realistic assumptions about daily volatility persistence. We confirm the superior performance of the newly proposed model when the return-generating process is simulated with the Zheng et al. (2018) parameters; however, the same does not happen under the more reasonable parametrization. The new results show that the benchmark group is heavily penalized by the Zheng et al. (2018) parametrization.
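For readers wanting to reproduce the basic setup, a minimal sketch of simulating GARCH(1,1) returns and backtesting a one-day 99% VaR might look as follows. The parameter values are illustrative, not those of Zheng et al. (2018):

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(42)

# GARCH(1,1): sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}.
# High persistence (alpha + beta = 0.98) is typical of daily financial returns.
omega, alpha, beta = 1e-6, 0.08, 0.90
T = 10_000
r, sigma2 = np.empty(T), np.empty(T)
sigma2[0] = omega / (1.0 - alpha - beta)        # unconditional variance
r[0] = np.sqrt(sigma2[0]) * rng.standard_normal()
for t in range(1, T):
    sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    r[t] = np.sqrt(sigma2[t]) * rng.standard_normal()

# One-day 99% VaR under the (here, correctly specified) conditional normal
# model; the realized violation rate should be close to the nominal 1%.
z01 = NormalDist().inv_cdf(0.01)                # lower 1% standard normal quantile
var_99 = -z01 * np.sqrt(sigma2)
violation_rate = (r < -var_99).mean()
print(f"99% VaR violation rate: {violation_rate:.4f} (nominal 0.01)")
```

A backtest of a competing VaR model would replace `var_99` with that model's quantile forecasts and compare the resulting violation rates against the nominal level, which is the essence of the benchmark comparison the dissertation performs.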
APA, Harvard, Vancouver, ISO, and other styles
32

Magagula, Sibusiso Vusi. "Jump-diffusion based-simulated expected shortfall (SES) method of correcting value-at-risk (VaR) under-prediction tendencies in stressed economic climate." Diss., 2014. http://hdl.handle.net/10500/18801.

Full text
Abstract:
The Value-at-Risk (VaR) model fails to predict financial risk accurately, especially during financial crises. This is mainly due to the model's inability to calibrate to new market information and to its poor quantification of tail risk. An alternative approach, comprising the Expected Shortfall measure and the Lognormal Jump-Diffusion (LJD) model, has been developed to address these shortcomings of VaR. This model is called the Simulated Expected Shortfall (SES) model. The Maximum Likelihood Estimation (MLE) approach is used to determine the parameters of the LJD model, since it is more reliable than the nonconventional parameter-estimation approaches mentioned in other studies. These parameters are then plugged into the LJD model, which is simulated multiple times to generate the new loss dataset used in the developed model. The SES model is statistically conservative compared with its peers, meaning it is more reliable in predicting financial risk, especially during a financial crisis.
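The core of such an approach, simulating a lognormal jump-diffusion and reading Expected Shortfall off the simulated losses, can be sketched as follows. The parameter values are illustrative placeholders, not the dissertation's MLE estimates:

```python
import numpy as np

rng = np.random.default_rng(7)

# Daily log-returns from a lognormal (Merton-style) jump-diffusion:
#   r = (mu - sigma^2/2) dt + sigma sqrt(dt) Z + compound-Poisson jump,
# where the number of jumps is Poisson(lam * dt) and each jump is N(mu_j, s_j^2).
mu, sigma = 0.05, 0.20               # annual drift and diffusion volatility
lam, mu_j, s_j = 10.0, -0.02, 0.05   # ~10 jumps/year with a negative mean size
dt, n = 1.0 / 252.0, 100_000

z = rng.standard_normal(n)
n_jumps = rng.poisson(lam * dt, n)
# Sum of N iid N(mu_j, s_j^2) jumps, aggregated exactly: N(N*mu_j, N*s_j^2).
jumps = mu_j * n_jumps + s_j * np.sqrt(n_jumps) * rng.standard_normal(n)
r = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z + jumps

losses = -r
var_99 = np.quantile(losses, 0.99)
es_99 = losses[losses >= var_99].mean()   # simulated expected shortfall
print(f"99% VaR: {var_99:.4f}, 99% ES: {es_99:.4f}")
```

Because Expected Shortfall averages over the whole tail beyond the VaR threshold, it always sits above VaR at the same confidence level, which is why a jump model combined with ES yields the more conservative risk numbers the abstract describes.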
Statistics
M.Sc. (Statistics)
APA, Harvard, Vancouver, ISO, and other styles
33

Sulistyawati, Endah. "An agent-based simulation of land-use in a swidden agricultural landscape of the Kantu' in Kalimantan, Indonesia." Phd thesis, 2001. http://hdl.handle.net/1885/146045.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Nelson, Mark E. "An analysis of calving season strategies." 1986. http://hdl.handle.net/2097/22123.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

(8800949), Patrick N. Maier. "A Bioeconomic Model of Indoor Pacific Whiteleg Shrimp (Litopenaeus Vannamei) Farms With Low-Cost Salt Mixtures." Thesis, 2020.

Find full text
Abstract:
This thesis uses a bioeconomic model and stochastic simulation to assess the economic viability of small-scale, recirculating shrimp farms in the Midwestern U.S. A series of stress tests was implemented on key input variables, including survival rate, selling price, electricity usage, discount rate and the cost of added salt. The key output variable is the net present value of the operation.
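A stress test of this kind might be sketched as a simple Monte Carlo over the key inputs. All figures below are made-up placeholders, not the thesis's calibration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Monte Carlo stress test of net present value under stochastic inputs.
years, discount, capex, n_sims = 10, 0.08, 150_000, 20_000

survival = rng.uniform(0.50, 0.90, n_sims)   # harvest survival rate
price = rng.normal(20.0, 2.0, n_sims)        # selling price, $/kg
opex = rng.normal(60_000, 5_000, n_sims)     # annual operating cost, $
harvest_kg = 6_000 * survival                # annual harvest scales with survival

# Constant annual cash flows discounted over the project horizon.
annuity = (1.0 - (1.0 + discount) ** -years) / discount
cash_flow = harvest_kg * price - opex
npv = -capex + cash_flow * annuity

print(f"mean NPV: ${npv.mean():,.0f}")
print(f"P(NPV < 0): {(npv < 0).mean():.2f}")
```

Reporting the probability of a negative NPV, rather than a single point estimate, is what distinguishes a stochastic stress test from a deterministic appraisal.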


APA, Harvard, Vancouver, ISO, and other styles
36

Correia, Jorge Filipe Almeida. "A decision support tool for retail product assortment." Master's thesis, 2011. http://hdl.handle.net/10071/4293.

Full text
Abstract:
JEL classification: C63 – Computational techniques, simulation. C81 – Methodology for collecting, estimating and organizing microeconomic data; data analysis.
Assortment planning is one of the most important areas for retailers and category managers. Creating a single assortment requires analysing an enormous number of products and variables, hence the need for a software tool to assist managers in building product assortments. To this end, a simulation model based on a mixed-integer nonlinear programming problem was developed which, with the help of the Solver software, produces a proposed product assortment. The model was applied at a large retail chain in Portugal. Its principal variables are sales, profit, and product and store dimensions. Additionally, the model uses customer segmentation to target the assortment at one or more segments, and a demand-substitution parameter to produce a more realistic solution. In general terms, the tool produces a feasible solution within two minutes, and the results, compared with the currently adopted solution, show a significant improvement in category profit while maintaining sales levels. The discrepancy between results grows with the number of possible assortments, which increases as the available shelf space decreases. Since a larger number of assortments means more possibilities for the manager to analyse, the probability that the produced solution is not optimal tends to increase.
APA, Harvard, Vancouver, ISO, and other styles
37

Beisler, Matthias Werner. "Modelling of input data uncertainty based on random set theory for evaluation of the financial feasibility for hydropower projects." Doctoral thesis, 2010. https://tubaf.qucosa.de/id/qucosa%3A22775.

Full text
Abstract:
The design of hydropower projects requires a comprehensive planning process in order to achieve the objective of maximising exploitation of the existing hydropower potential as well as the future revenues of the plant. For this purpose, and to satisfy approval requirements for a complex hydropower development, it is imperative at the planning stage that the conceptual development contemplates a wide range of influencing design factors and ensures appropriate consideration of all related aspects. Since the majority of technical and economic parameters required for detailed and final design cannot be precisely determined at early planning stages, crucial design parameters such as design discharge and hydraulic head have to be examined through an extensive optimisation process. One disadvantage inherent in commonly used deterministic analysis is the lack of objectivity in the selection of input parameters. Moreover, it cannot be ensured that the entire existing parameter ranges and all possible parameter combinations are covered. Probabilistic methods utilise discrete probability distributions or parameter input ranges to cover the entire range of uncertainties resulting from an information deficit during the planning phase and integrate them into the optimisation by means of an alternative calculation method. The investigated method assists with the mathematical assessment and integration of uncertainties into the rational economic appraisal of complex infrastructure projects. The assessment includes an exemplary verification of the extent to which Random Set Theory can be utilised for the determination of input parameters relevant to the optimisation of hydropower projects, and evaluates possible improvements with respect to the accuracy and suitability of the calculated results.
APA, Harvard, Vancouver, ISO, and other styles
