Dissertations / Theses on the topic 'Computational economics'




Consult the top 50 dissertations / theses for your research on the topic 'Computational economics.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Pugh, David. "Essays in computational economics." Thesis, University of Edinburgh, 2014. http://hdl.handle.net/1842/9882.

Full text
Abstract:
The focus of my PhD research has been on the acquisition of computational modeling and simulation methods used in both theoretical and applied economics. My first chapter provides an interactive review of finite-difference methods for solving systems of ordinary differential equations (ODEs) commonly encountered in economic applications using Python. The methods surveyed in this chapter, as well as the accompanying code and IPython notebooks, should be of interest to any researcher interested in applying finite-difference methods for solving ODEs to economic problems. My second chapter is an empirical analysis of the evolution of the distribution of bank size in the U.S. This paper assesses the statistical support for Zipf's Law (i.e., a power law, or Pareto, distribution with a scaling exponent of α = 2) as an appropriate model for the upper tail of the distribution of U.S. banks. Using detailed balance sheet data for all FDIC-regulated banks for the years 1992 through 2011, I find significant departures from Zipf's Law for most measures of bank size in most years. Although Zipf's Law can be statistically rejected, a power law distribution with α of roughly 1.9 statistically outperforms other plausible heavy-tailed alternative distributions. In my final chapter, which is based on joint work with Dr. David Comerford, I apply computational methods to model the relationship between per capita income and city size. A well-known result from the urban economics literature is that a monopolistically competitive market structure combined with internal increasing returns to scale (IRS) can generate log-linear relations between income and population. I extend this theoretical framework to allow for a variable elasticity of substitution between factors of production, in a manner similar to Zhelobodko et al. (2012). Using data on Metropolitan Statistical Areas (MSAs) in the U.S., I find evidence that supports what Zhelobodko et al. (2012) refer to as "increasing relative love for variety (RLV)." Increasing RLV generates pro-competitive effects as market size increases, which means that IRS, whilst important for small to medium-sized cities, are exhausted as cities become large. This has important policy implications, as it suggests that focusing intervention on creating scale for small populations is potentially much more valuable than further investments to increase market size in the largest population centers.
APA, Harvard, Vancouver, ISO, and other styles
2

Jelonek, Piotr Zbigniew. "Essays on computational economics." Thesis, University of Leicester, 2014. http://hdl.handle.net/2381/28644.

Full text
Abstract:
This text consists of two parts. In chapters 2-3, methods are developed that enable the application of tempered stable distributions to measuring and simulating macroeconomic uncertainties. In contrast to the tools used in finance, these results are applicable to low-frequency aggregated data, which typically display moderately heavy tails; they are thus particularly useful in modelling macroeconomic densities. The new methods may be readily employed in Monte Carlo simulations of possibly skewed, moderately heavy-tailed random variates with arbitrary excess kurtosis. In chapter 4, a computational model of endogenous network formation for the inter-bank overnight lending market is proposed. The structure of this market emerges from interactions of heterogeneous agents who are endowed with assets and liabilities and take investment risk into account. As all the banks are large and their trading affects the prices of risky assets, the cost of price slippage breaks the symmetry of the portfolio problem, making inter-bank borrowing and lending more desirable. The model takes into account three channels of contagion: bankruptcy cascades, a common component of risky asset returns, and erosion of liquidity. The network formation algorithm outputs the ensemble of optimal transactions; the outcome of the corresponding link-formation process is pairwise stable. This framework is then employed to investigate the stability of the endogenously generated banking systems.
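The idea of tempering a heavy tail can be illustrated with a toy rejection sampler. The sketch below draws from an exponentially tempered Pareto distribution; it is a simple stand-in for the thesis's tempered stable machinery (whose simulation is considerably more involved), and all parameter values are invented:

```python
import math
import random

def tempered_pareto(alpha, lam, x_min, rng=random):
    """One draw with density proportional to x**(-alpha - 1) * exp(-lam * x)
    for x >= x_min: propose from a plain Pareto tail, accept with probability
    exp(-lam * (x - x_min)) <= 1.  Exponential tempering thins the extreme
    tail, giving 'moderately heavy' tails while keeping all moments finite."""
    while True:
        x = x_min * rng.random() ** (-1.0 / alpha)        # Pareto proposal
        if rng.random() < math.exp(-lam * (x - x_min)):   # tempering step
            return x

random.seed(1)
draws = [tempered_pareto(alpha=1.5, lam=0.1, x_min=1.0) for _ in range(10_000)]
mean = sum(draws) / len(draws)   # finite, and below the untempered mean of 3
```

The untempered Pareto with alpha = 1.5 has mean 3 and infinite variance; tempering at lam = 0.1 pulls the sample mean below 3 and makes every moment finite, which is the qualitative behaviour the abstract attributes to aggregated macroeconomic data.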
3

Grinis, Inna. "Essays in applied computational economics." Thesis, London School of Economics and Political Science (University of London), 2017. http://etheses.lse.ac.uk/3580/.

Full text
Abstract:
This thesis presents four distinct essays that lie at the intersection of economics and computation. The first essay constructs an abstract framework for defining skills gaps, mismatches and shortages geometrically and thinking about these phenomena in a unified, formal way. It then develops a job matching model with imperfect information, in which skills mismatches influence the job application decisions of workers, while skills gaps and shortages shape the competition for workers on the resulting bipartite job applications network. The tools proposed in this chapter could, in future work, be employed as the main ingredients of an agent-based model used to investigate how skills gaps, mismatches and shortages affect equilibrium outcomes. The second chapter designs and tests machine learning algorithms to classify 33 million UK online vacancy postings into STEM and non-STEM jobs, based on keywords collected from the vacancy descriptions and job titles. The goal is to investigate whether jobs in "non-STEM" occupations (e.g. Graphic Designers, Economists) also require and value STEM knowledge and skills (e.g. "Microsoft C#", "Systems Engineering"), thereby contributing to the debate on whether or not the "STEM pipeline leakage" (the fact that less than half of STEM graduates in the UK work in STEM occupations) should be considered highly problematic. Chapter 3 relates to empirical growth. It proposes a programming algorithm, called "iterative Fit and Filter" (iFF), that extracts trend growth as a sequence of medium/long-term average growth rates, and applies it to a sample of over 150 countries. The paper then develops an econometric framework that relates the conditional probabilities of up- and down-shifts in trend growth next year to the country's current characteristics, e.g. the growth environment, level of development, demographics, institutions, etc.
Finally, Chapter 4 studies credit risk spillovers in financial networks by modelling default as a multi-stage disease, with each credit rating corresponding to a new infection phase. The paper derives analytical indicators of systemic importance and vulnerability, proposes simulation-based counterparts, and applies them in the context of the Eurozone sovereign debt crisis.
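A keyword-based STEM/non-STEM classifier of the kind described in the second chapter can be caricatured in a few lines. The keyword set and threshold below are invented placeholders, not the features actually learned from the 33 million postings:

```python
# Hypothetical keyword list; the thesis learns its keywords from millions of
# vacancy descriptions and job titles rather than hard-coding them.
STEM_KEYWORDS = {"c#", "systems engineering", "matlab", "python", "statistics"}

def stem_score(vacancy_text):
    """Fraction of known STEM keywords that appear in a vacancy description."""
    text = vacancy_text.lower()
    return sum(kw in text for kw in STEM_KEYWORDS) / len(STEM_KEYWORDS)

def is_stem(vacancy_text, threshold=0.2):
    """Label a vacancy as STEM when enough STEM keywords are present."""
    return stem_score(vacancy_text) >= threshold

# A 'non-STEM' job title can still demand STEM skills, the point of Chapter 2.
print(is_stem("Graphic Designer with Python and statistics skills"))  # True
print(is_stem("Barista, weekend shifts"))                             # False
```

A production classifier would learn keyword weights from labelled data instead of using a fixed list and uniform threshold, but the scoring structure is the same.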
4

Balikcioglu, Metin. "Essays on Environmental and Computational Economics." NCSU, 2008. http://www.lib.ncsu.edu/theses/available/etd-12032008-210449/.

Full text
Abstract:
The study consists of three separate essays. The first essay reassesses and extends the papers by Pindyck (2000, 2002), which analyze the effects of uncertainty and irreversibility on the timing of emissions reduction policy. It is shown that the proposed solutions for some of the optimal stopping problems introduced in these papers are incorrect. Correct solutions are provided for both the incorrect special cases and the general model using a numerical method, since closed-form solutions do not exist for these problems. In the second essay, a singular control framework is employed in order to allow for gradual emission reduction instead of once-and-for-all policies. The solution for the model is obtained using the numerical method introduced in the last essay. The effects of uncertainty and irreversibility on optimal emission reduction policy are investigated. The model is illustrated for greenhouse gas mitigation in the context of the climate change problem, and some of the model parameters are estimated using a state space model. In the third essay, a unified numerical method is introduced for solving multidimensional singular and impulse control models. The link between regime switching and singular/impulse control problems is established. This link results in a convenient representation of optimality conditions for the numerical method. After solving the optimality conditions at a discrete set of points, an approximate solution can be obtained by solving an extended vertical linear complementarity problem using a variety of techniques. The numerical approach is illustrated with four examples from the economics and finance literature.
5

Schuster, Stephan. "Applications in agent-based computational economics." Thesis, University of Surrey, 2012. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.556466.

Full text
Abstract:
A constituent feature of complex adaptive systems is the presence of non-linear feedback mechanisms between actors. These mechanisms are often difficult to model and analyse. One possibility for modelling them is offered by Agent-based Computational Economics (ACE), which uses computer simulation methods to represent such systems and analyse non-linear processes. The aim of this thesis is to explore ways of modelling adaptive agents in ACE models. Its major contribution is of a methodological nature. Artificial intelligence and machine learning methods are used to represent agents and learning processes in economic domains. In this work, a general reinforcement learning framework is developed and realised in a simulation system. This system is used to implement three models of increasing complexity in two different economic domains. One of these domains is iterated games, in which agents meet repeatedly and interact. In an experimental labour market, it is shown how statistical discrimination can be generated simply by the learning algorithm used. The results resemble actual patterns of observed human behaviour in laboratory settings. The second model treats strategic network formation. The main contribution here is to show how agent-based modelling helps to analyse the non-linearity that is introduced when assumptions of perfect information and full rationality are relaxed. The other domain has a health economics background. The aim here is to provide insights into how the approach might be useful in real-world applications. For this, a general model of primary care is developed, and the implications of different consumer behaviour patterns (based on the learning features introduced before) are analysed.
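A standard way to endow ACE agents with adaptive behaviour is Roth-Erev reinforcement learning, in which action propensities accumulate recency-discounted payoffs. The sketch below is a generic illustration of that rule with an invented two-action environment; it is not the particular framework built in the thesis:

```python
import random

class RothErevAgent:
    """Minimal Roth-Erev reinforcement learner over a discrete action set.
    Propensities accumulate recency-discounted payoffs; an action is chosen
    with probability proportional to its propensity."""

    def __init__(self, actions, recency=0.1, initial=1.0):
        self.propensity = {a: initial for a in actions}
        self.recency = recency

    def choose(self, rng=random):
        r = rng.random() * sum(self.propensity.values())
        for action, q in self.propensity.items():
            r -= q
            if r <= 0:
                break
        return action

    def learn(self, action, payoff):
        for a in self.propensity:              # gradually forget the past
            self.propensity[a] *= 1.0 - self.recency
        self.propensity[action] += payoff      # reinforce the chosen action

# Toy environment: action "b" always pays, "a" never does, so "b" is learned.
random.seed(2)
agent = RothErevAgent(["a", "b"])
for _ in range(500):
    act = agent.choose()
    agent.learn(act, payoff=1.0 if act == "b" else 0.0)
print(agent.propensity["b"] > agent.propensity["a"])  # True
```

Because choice probabilities track accumulated payoffs rather than beliefs, learners like this can reproduce boundedly rational laboratory behaviour, which is the property the thesis exploits.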
6

Chong, Shi Kai. "A computational approach to urban economics." Thesis, Massachusetts Institute of Technology, 2018. https://hdl.handle.net/1721.1/122318.

Full text
Abstract:
Thesis: S.M., Massachusetts Institute of Technology, Computation for Design and Optimization Program, 2018
Cities are home to more than half of the world population today, and urbanization is one of this century's biggest drivers of global economic growth. The dynamics of the urban environment are thus an important subject to investigate. In this thesis, techniques from statistical modeling, machine learning, data mining and econometrics are utilized to study digital traces of people's everyday lives. In particular, we investigated how people influence the economic growth of cities, as well as how the urban environment affects the decisions people make. Focusing on the role of cities as centers of consumption, we found that a gravity model based on the availability of a large and diverse pool of amenities accurately explained human flows observed in credit card records. Investigation of the consumption patterns of individuals in Istanbul, Beijing and various metropolitan areas in the United States revealed a positive relationship between the diversity of urban amenities consumed and a city's economic growth. Taking the perspective of cities as hubs for information exchange, we modeled the interactions between individuals in Beijing and Istanbul using records of their home and work locations, and demonstrated how cities that facilitate the mixing of diverse human capital are crucial to the flow of new ideas across communities and to their productivity. This contributes to the body of evidence supporting the notion that efficient information exchange is the key factor that drives innovation. To investigate how urban environments shape people's decisions, we studied the social influence city dwellers have on each other and showed how face-to-face interaction and information exchange across different residential communities can shape their behavior and increase the similarity of their financial habits and political views in Istanbul.
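The gravity model mentioned above can be sketched in a few lines: flows scale with origin mass and destination attractiveness, damped by distance. The functional form, parameters and numbers below are illustrative placeholders, not the model calibrated on credit card records in the thesis:

```python
def predicted_flow(population_i, amenities_j, distance_ij, decay=2.0, k=1.0):
    """Gravity-style flow from origin i to destination j: origin mass times
    the destination's amenity 'pull', damped by a power of distance."""
    return k * population_i * amenities_j / distance_ij ** decay

# A nearby, amenity-rich district should attract more trips than a distant,
# amenity-poor one, for the same origin population.
near_rich = predicted_flow(population_i=10_000, amenities_j=500, distance_ij=2.0)
far_sparse = predicted_flow(population_i=10_000, amenities_j=50, distance_ij=10.0)
print(near_rich > far_sparse)  # True
```

In practice k and decay would be estimated from observed flows (e.g. by regressing log flows on log masses and log distance), with the amenity pool's size and diversity entering the attractiveness term.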
7

Hull, Isaiah. "Essays in Computational Macroeconomics and Finance." Thesis, Boston College, 2013. http://hdl.handle.net/2345/bc-ir:104376.

Full text
Abstract:
Thesis advisor: Peter N. Ireland
This dissertation examines three topics in computational macroeconomics and finance. The first two chapters are closely linked, and the third chapter covers a separate topic in finance. Throughout the dissertation, I place a strong emphasis on constructing computational tools and modeling devices, and on using them in appropriate applications. The first chapter examines how a central bank's choice of interest rate rule impacts the rate of mortgage default and welfare. In this chapter, a quantitative equilibrium (QE) model is constructed that incorporates incomplete markets, aggregate uncertainty, overlapping generations, and realistic mortgage structure. Through a series of counterfactual simulations, five things are demonstrated: 1) nominal interest rate rules that exhibit cyclical behavior increase the average default rate and lower average welfare; 2) welfare can be substantially improved by adopting a modified Taylor rule that stabilizes house prices; 3) a decrease in the length of the interest rate cycle will tend to increase the average default rate; 4) if the business and housing cycles are not aligned, then aggressive inflation targeting will tend to increase the mortgage default rate; and 5) placing a legal cap on loan-to-value ratios will lower the average default rate and lessen the intensity of extreme events. In addition to these findings, this paper also incorporates an important mechanism for default, which had not previously been included in the QE literature: default spikes happen when income falls and home equity is degraded at the same time. The paper concludes with a policy recommendation for central banks: if they wish to avoid crises where many households default simultaneously, they should either adopt a rule that generates interest rates with slow-moving cycles or use a modified Taylor rule that also targets house price growth.
The second chapter generalizes the solution method used in the first and compares it to more common techniques used in the computational macroeconomics literature, including the parameterized expectations approach (PEA), projection methods, and value function iteration. In particular, this chapter compares the speed and accuracy of the aforementioned methods to an alternative that was introduced separately by Judd (1998), Sutton and Barto (1998), and Van Roy et al. (1997), but was not developed into a general solution method until Powell (2007) introduced it to the operations research literature. This approach involves rewriting the Bellman equation in terms of the post-decision state variables, rather than the pre-decision state variables, as is done in standard dynamic programming applications in economics. I show that this approach yields considerable performance benefits over common global solution methods when the state space is large, and has the added benefit of not forcing modelers to assume a data generating process for shocks. In addition, I construct two new algorithms that take advantage of this approach to solve heterogeneous agent models. Finally, the third chapter imports the SIR model from mathematical epidemiology and uses it to construct a model of financial epidemics. In particular, the paper demonstrates how the SIR model can be microfounded in an economic context to make predictions about financial epidemics, such as the spread of asset-backed securities (ABS) and exchange-traded funds (ETFs), the proliferation of zombie financial institutions, and the expansion of financial bubbles and mean-reverting fads. The paper proceeds by developing the 1-host SIR model for economic and financial contexts, and then moves on to demonstrate how to work with the multi-host version of the model.
In addition to showing how the SIR framework can be used to model economic interactions, it will also: 1) show how it can be simulated; 2) use it to develop and estimate a sufficient statistic for the spread of a financial epidemic; and 3) show how policymakers can impose the financial analog of herd immunity, that is, prevent the spread of a financial epidemic without completely banning the asset or behavior associated with the epidemic. Importantly, the paper will focus on developing a neutral framework to describe financial epidemics that can be either bad or good. That is, the general framework can be applied to epidemics that constitute a mean-reverting fad or an informational bubble, but ultimately yield little value and shrink in importance; or to epidemics that are long-lasting and yield a new financial instrument that generates permanent efficiency gains or previously unrealized hedging opportunities.
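The 1-host SIR dynamics imported in the third chapter can be simulated with plain Euler steps. The sketch below is the textbook epidemiological system with invented rates; the financial relabelling in the comments follows the abstract, not the thesis's actual microfounded model:

```python
def sir_step(s, i, r, beta, gamma, dt=0.01):
    """One Euler step of the classic SIR system in population fractions.
    In the financial reading, 'infection' is adoption of an instrument or
    exposure to a fad, and 'recovery' is abandonment or write-down."""
    new_infections = beta * s * i * dt
    recoveries = gamma * i * dt
    return s - new_infections, i + new_infections - recoveries, r + recoveries

# With R0 = beta / gamma = 3 > 1, the 'financial epidemic' spreads, peaks,
# then dies out, leaving a fraction of the population never exposed.
s, i, r = 0.99, 0.01, 0.0
peak = i
for _ in range(100_000):                      # integrate out to t = 1000
    s, i, r = sir_step(s, i, r, beta=0.3, gamma=0.1)
    peak = max(peak, i)
```

The basic reproduction number beta/gamma plays the role of a sufficient statistic for whether the epidemic takes off, which is the quantity the chapter estimates; "herd immunity" corresponds to pushing the effective reproduction number below one without banning the asset.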
Thesis (PhD) — Boston College, 2013
Submitted to: Boston College. Graduate School of Arts and Sciences
Discipline: Economics
8

Wong, Yiu Kwong. "Application of computational models and qualitative reasoning to economics." Thesis, Heriot-Watt University, 1996. http://hdl.handle.net/10399/688.

Full text
9

Lupi, Paolo. "The evolution of collusion : three essays in computational economics." Thesis, University of York, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.341598.

Full text
10

Gao, Lili. "Applications of Machine Learning and Computational Linguistics in Financial Economics." Research Showcase @ CMU, 2016. http://repository.cmu.edu/dissertations/815.

Full text
Abstract:
In the world of financial economics, we have abundant text data. Articles in the Wall Street Journal and on Bloomberg Terminals, corporate SEC filings, earnings-call transcripts, social media messages, etc. all contain ample information about financial markets and investor behavior. Extracting meaningful signals from unstructured, high-dimensional text data is not an easy task. However, with the development of machine learning and computational linguistic techniques, the tasks of processing and statistically analyzing textual documents can be accomplished, and many applications of statistical text analysis in the social sciences have proven successful.
11

Lee, Myong-hwal. "Computational analysis of optimal macroeconomic policy design." Digital version, 1998. http://wwwlib.umi.com/cr/utexas/main.

Full text
12

Ellison, Sara Fisher. "A nonparametric residual-based specification test : asymptotic, finite-sample, and computational properties." Thesis, Massachusetts Institute of Technology, 1993. http://hdl.handle.net/1721.1/12697.

Full text
13

Lodhi, Aemen Hassaan. "The economics of internet peering interconnections." Diss., Georgia Institute of Technology, 2014. http://hdl.handle.net/1853/53092.

Full text
Abstract:
The Internet at the interdomain level is a complex network of approximately 50,000 Autonomous Systems (ASes). ASes interconnect through two types of links: (a) transit (customer-provider) and (b) peering links. Recent studies have shown that despite being optional for most ASes, a rich and dynamic peering fabric exists among ASes. Peering has also grown into one of the main instruments for handling the asymmetric traffic generated by CDNs, online video, performance requirements, etc. Moreover, peering has been in the spotlight recently because of peering conflicts between major ISPs and Content Providers. Such conflicts have led to calls for intervention by communication regulators and legislation at the highest levels of government. Peering disputes have also sometimes resulted in partitioning of the Internet. Despite the broad interest and intense debate about peering, several fundamental questions remain elusive. The objective of this thesis is to study peering from a techno-economic perspective. We explore the following questions: 1- What are the main sources of complexity in Internet peering that defy the development of an automated approach to assessing peering relationships? 2- What is the current state of the peering ecosystem, e.g., which categories of ASes are more inclined towards peering? What are the most popular peering strategies among ASes in the Internet? 3- What can we say about the economics of contemporary peering practices, e.g., what is the impact of using different peering traffic ratios as a strategy to choose peers? Is the general notion that peering saves network costs always valid? 4- Can we propose novel methods for peering that result in more stable and fair peering interconnections? We have used game-theoretic modeling, large-scale computational agent-based modeling, and analysis of publicly available peering data to answer the above questions.
The main contributions of this thesis include: 1- Identification of fundamental complexities underlying the evaluation of peers and the formation of stable peering links in the interdomain network. 2- An empirical study of the state of the peering ecosystem from August 2010 to August 2013. 3- Development of a large-scale agent-based computational model to study the formation and evolution of Internet peering interconnections. 4- A plausible explanation for the gravitation of Internet transit providers towards Open peering, and a prediction of its future consequences. 5- A proposed variant of the Open peering policy, together with a new policy based on cost-benefit analysis, to replace contemporary simplistic policies.
14

Henkel, Marco. "Darstellung der Agent-based Computational Economics und Anwendung auf ausgewählte Märkte." Münster Verl.-Haus Monsenstein und Vannerdat, 2009. http://d-nb.info/1001177894/04.

Full text
15

Wu, Di. "Three Essays on the Credit Card Debt Puzzle, Income Falsification, and Numerical Approximation." The Ohio State University, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=osu1563316071624495.

Full text
16

Okasha, Ahmed E. "Agent-based computational economics : studying the effect of different levels of rationality on macro-activities for economic systems." Thesis, University of Kent, 2010. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.529398.

Full text
17

Latsch, Wolfram Wilhelm. "Beyond complexity and evolution : on the limits of computability in economics." Thesis, University of Oxford, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.325103.

Full text
18

Schaff, Frederik [Verfasser]. "Pure agent-based computational economics of time, knowledge and procedural rationality with an application to environmental economics / Frederik Schaff." Hagen : Fernuniversität Hagen, 2016. http://d-nb.info/1114292087/34.

Full text
19

Krause, Thilo. "Evaluating congestion management schemes in liberalized electricity markets applying agent-based computational economics /." Zürich : ETH, 2007. http://e-collection.ethbib.ethz.ch/show?type=diss&nr=16928&part=abstracts.

Full text
20

Teglio, Andrea. "From agent-based models to artificial economies." Doctoral thesis, Universitat Jaume I, 2011. http://hdl.handle.net/10803/83303.

Full text
Abstract:
The aim of this thesis is to propose and illustrate an alternative approach to economic modeling and policy design that is grounded in the innovative field of agent-based computational economics (ACE). The recent crisis pointed out the fundamental role played by macroeconomic policy design in preserving social welfare, and the consequent necessity of understanding the effects of coordinated policy measures on the economic system. Classic approaches to macroeconomic modeling, mainly represented by dynamic stochastic general equilibrium models, have recently been criticized for their difficulties in explaining many economic phenomena. The absence of interaction among heterogeneous agents, along with their strong rationality, are two of the main criticisms that have emerged. Indeed, decentralized market economies consist of large numbers of economic agents involved in local interactions, and aggregate macroeconomic trends should be considered the result of these local interactions. The approach of agent-based computational economics consists in designing economic models able to reproduce the complicated dynamics of recurrent chains connecting agent behaviors and interaction networks, and in explaining the global outcomes that emerge from the bottom up. The work presented in this thesis tries to understand the feedback between the microstructure of the economic model and the macrostructure of policy design, investigating the effects of different policy measures on agents' behaviors and interactions. In particular, the attention is focused on modeling the relation between the financial and real sides of the economy, linking the financial markets and the credit sector to the markets for goods and labor. Model complexity increases over the chapters: the agent-based models presented in the first part evolve into a more complex object in the second part, becoming a sort of complete "artificial economy".
The problems tackled in the thesis range from an investigation of the equity premium puzzle, to the effects of classic monetary policy rules (such as the Taylor rule), to the macroeconomic implications of banks' capital requirements and quantitative easing.
21

Dennig, Francis. "On the welfare economics of climate change." Thesis, University of Oxford, 2014. http://ora.ox.ac.uk/objects/uuid:aefca5e4-147e-428b-b7a1-176b7daa0f85.

Full text
Abstract:
The three constituent chapters of this thesis tackle independent, self-contained research questions, all concerning welfare economics in general and its application to climate change policy in particular. Climate change is a policy problem for which the costs and benefits are distributed unequally across space and time, and one involving a high degree of uncertainty. Therefore, cost-benefit analysis of climate policy ought to be based on a welfare function that is sufficiently sophisticated to incorporate the three dimensions of aggregation: time, risk and space. Chapter 1 is an axiomatic treatment of a stylised model in which all three dimensions appear. The main result is a functional representation of the social welfare function for policy assessment in such situations. Chapter 2 is a numerical mitigation policy analysis. I modify William Nordhaus' RICE-2010 model by replacing his social welfare function with one that allows for different degrees of inequality aversion along the regional and inter-temporal dimensions. I find that, holding the inter-temporal coefficient of inequality aversion fixed, performing the optimisation with a greater degree of regional inequality aversion reduces the optimal carbon tax relative to treating the world as a single aggregate consumer. In Chapter 3 I analyse climate policy from the point of view of intergenerational transfers. I propose a system of transfers that allows future generations to compensate the current one for its mitigation effort, and demonstrate the effects in an OLG model. When the marginal benefit to a (possibly distant) future generation is greater than the cost of compensating the current generation for its abatement effort, a Pareto improvement is possible through a combination of mitigation policy and transfer payments. I show that under very general assumptions the business-as-usual outcome is Pareto dominated by such policies, and derive conditions characterising the set of climate policies that are not thus dominated.
22

Cui, Zhuoya. "Understanding social function in psychiatric illnesses through computational modeling and multiplayer games." Diss., Virginia Tech, 2021. http://hdl.handle.net/10919/103528.

Full text
Abstract:
Impaired social functioning in psychiatric illness has been documented extensively in previous literature. However, studies of social abnormalities in psychiatric conditions are often challenged by the difficulty of formalizing dynamic social exchanges and quantifying their neurocognitive underpinnings. Recently, the rapid growth of computational psychiatry as a field, along with the development of multiplayer economic paradigms, has provided powerful tools to parameterize complex interpersonal processes and identify quantitative indicators of social impairment. Utilizing these methodologies, the current set of studies examined social decision making during multiplayer economic games in participants diagnosed with depression (study 1) and combat-related post-traumatic stress disorder (PTSD, study 2), as well as in an online population with elevated symptoms of borderline personality disorder (BPD, study 3). We then quantified and disentangled the impacts of multiple latent decision-making components, mainly social valuation and social learning, on maladaptive social behavior via explanatory modeling. Different underlying alterations were revealed across diagnoses. Atypical social exchange in depression and BPD was attributed to altered social valuation and altered social learning, respectively, whereas both social valuation and social learning contributed to interpersonal dysfunction in PTSD. Additionally, model-derived indices of social abnormality positively correlated with levels of symptom severity (studies 1 and 2) and exhibited a longitudinal association with symptom change (study 1). Our findings provide mechanistic insights into interpersonal difficulties in psychiatric illness, and highlight the importance of a computational understanding of social function, which holds potential clinical implications for differential diagnosis and precise treatment.
Doctor of Philosophy
People with psychiatric conditions often suffer from impaired social relationships due to an inability to engage in everyday social interactions. As different illnesses can sometimes produce the same symptoms, social impairment can also have different causes. For example, individuals who constantly avoid social activities may find them less interesting, or may be trying to avoid potential negative experiences, while those who display elevated aggression may have a strong desire for social dominance or falsely believe that others are also aggressive. However, it is hard to infer what drives these alterations just by observing behavior. To address this question, we enrolled people with three different kinds of psychopathology to play an interactive game with another player and mathematically modeled their latent decision-making processes. By comparing their model parameters to those of the control population, we were able to infer how people with psychopathology made their decisions and which part of the decision-making process went wrong, leading to disrupted social interactions. We found that the altered model parameters differed among people with major depression, post-traumatic stress disorder, and borderline personality disorder, suggesting different causes underlying the impaired social behavior observed in the game; the extent of the alterations also correlated positively with psychiatric symptom severity. Understanding the reasons behind social dysfunction associated with psychiatric illness can help us better differentiate people with different diagnoses and design more effective treatments to restore interpersonal relationships.
APA, Harvard, Vancouver, ISO, and other styles
23

Yan, Chang. "A computational game-theoretic study of reputation." Thesis, University of Oxford, 2014. http://ora.ox.ac.uk/objects/uuid:e6acb250-efb8-410b-86dd-9e3e85b427b6.

Full text
Abstract:
As societies become increasingly connected thanks to advancing technologies, and the Internet in particular, individuals and organizations (hereafter, agents) engage in innumerable interactions and constantly face the possibility thereof. Such unprecedented connectivity offers opportunities through which social and economic benefits are realised and disseminated. Nonetheless, risky and damaging interactions abound. To promote beneficial relationships and to deter adverse outcomes, agents adopt different means and resources. This thesis focuses on reputation as a crucial mechanism for promoting positive interaction, and examines the topic from a game-theoretic perspective using computational methods. First, we investigate the design of reputation systems by incorporating economic incentives into algorithm design. Focusing on ubiquitous user-generated ratings on the Internet, we propose a truthful reputation mechanism that not only enforces honest reporting from individual raters but also takes into account their personal preferences. The mechanism is constructed using a blend of the Bayesian Truth Serum and SimRank algorithms, both specifically adapted for our use case of online ratings. We show that the resulting mechanism is Bayesian incentive compatible and is computable in polynomial time. In addition, the mechanism is shown to be resistant to common manipulations on the Internet such as uniform fake ratings and targeted collusion. Lastly, we discuss detailed considerations for implementing the mechanism in practice. Second, we investigate experimentally the relative importance of reputational and social knowledge in sustaining cooperation in dynamic networks. In our experiments, U.S.-based subjects play a repeated game where, in each round, an endogenous network is formed among a group of 13 players and each player chooses a cooperative or non-cooperative action that applies to all her connections. 
We vary the availability of reputational and social knowledge to subjects in 4 treatments. At the aggregate level, we find that reputational knowledge is of first-order importance for supporting cooperation, while social knowledge plays a complementary role only when reputational knowledge is available. Further community-level analysis reveals that reputational knowledge leads to the emergence of highly cooperative hubs and a dense and clustered network, while social knowledge enhances cooperation by forming a large, dense, and clustered community of cooperators who exclude outsiders through link removals and link refusals. At the individual level, reputational knowledge proves essential for the emergence of network structural characteristics that are associated with cooperative actions. In contrast, in treatments without reputational information, none of the network metrics is predictive of subjects' choices of action. Furthermore, we present UbiquityLab, a pioneering online platform for conducting real-time interactive experiments for game-theoretic studies. UbiquityLab supports both synchronous and asynchronous game models, and allows for complex and customisable interaction between subjects. It offers both back-end and front-end infrastructure with a modularised design to enable rapid development and streamlined operation. For instance, in synchronous mode, all per-stage and inter-stage logic is fully encapsulated by a thin server-side module, while a suite of client-side components eases the creation of the game interface. The platform features a robust messaging protocol, such that player connections and game states are restored automatically upon networking errors and dropped-out subjects are seamlessly substituted by customisable program players. Online experiments enjoy clear advantages over lab equivalents, benefiting from low operating cost, efficient execution, and large and diverse subject pools. 
UbiquityLab aims to promote online experiments as an emerging research methodology in experimental economics by bringing its benefits to other researchers.
APA, Harvard, Vancouver, ISO, and other styles
24

Jessup, Ryan K. "Neural correlates of the behavioral differences between descriptive and experiential choice an examination combining computational modeling with fMRI /." [Bloomington, Ind.] : Indiana University, 2008. http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqdiss&rft_dat=xri:pqdiss:3337258.

Full text
Abstract:
Thesis (Ph.D.)--Indiana University, Dept. of Psychological & Brain Sciences, 2008.
Title from PDF t.p. (viewed on Feb. 17, 2010). Source: Dissertation Abstracts International, Volume: 69-12, Section: B, page: 7830. Advisers: Jerome R. Busemeyer; Peter M. Todd.
APA, Harvard, Vancouver, ISO, and other styles
25

Boto, Joaquim Paulo da Silva. "Desenvolvimento de Modelos Baseados em Agentes: Plataforma Aplicacional." Master's thesis, Instituto Superior de Economia e Gestão, 2010. http://hdl.handle.net/10400.5/2962.

Full text
Abstract:
Master's in Information Systems Management
Although experimentation is the most widely used method in many scientific areas, it is not always possible to apply it in the Social Sciences or in Economics. An alternative is computer simulation, in which a computer program represents a model of the phenomenon under study, applying the abstractions and concepts considered relevant and appropriate; it has become an increasingly useful instrument in scientific research. Over the last years, one of the computer-simulation methods that has seen the greatest development is Agent-Based Modelling: a computational system simulates the actions of the entities involved in the phenomenon under study, and the interactions of those entities among themselves and with the environment in which they are located, in order to test theoretical hypotheses that help explain the phenomenon. The theme proposed for this master's thesis is the creation of an application platform for building Agent-Based Models, covering the general requirements of this type of computer simulation, and the evaluation of the platform's advantages and drawbacks. The dissertation begins by situating computer simulation within the Social Sciences, historically and theoretically, and then identifies the fundamental and specific aspects of Agent-Based Models. Next, the architecture of the application platform is designed, considering the general requirements involved in creating Agent-Based Models, and its implementation is carried out; the platform is then used to build several models, in order to verify its suitability for constructing some of the models most frequently cited in the literature. Finally, possible evolutions and extensions of the platform, aimed at making it more complete in both functionality and versatility, are described, and the advantages and drawbacks of its use are evaluated. 
In general terms, the methodology followed in this work comprises six main steps: acquire and consolidate knowledge on agent-based modelling of social phenomena; specify the main requirements the platform should address and identify the most appropriate architecture; build the platform; verify the platform by using it to build reference Agent-Based Models described in the literature; identify possible evolutions and extensions to obtain additional functionality and greater versatility; and evaluate the advantages and drawbacks of using the platform.
APA, Harvard, Vancouver, ISO, and other styles
26

Strid, Ingvar. "Computational methods for Bayesian inference in macroeconomic models." Doctoral thesis, Handelshögskolan i Stockholm, Ekonomisk Statistik (ES), 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:hhs:diva-1118.

Full text
Abstract:
The New Macroeconometrics may succinctly be described as the application of Bayesian analysis to the class of macroeconomic models called Dynamic Stochastic General Equilibrium (DSGE) models. A prominent local example from this research area is the development and estimation of the RAMSES model, the main macroeconomic model in use at Sveriges Riksbank. Bayesian estimation of DSGE models is often computationally demanding. In this thesis, fast algorithms for Bayesian inference are developed and tested in the context of the state-space model framework implied by DSGE models. The algorithms discussed in the thesis deal with evaluation of the DSGE model likelihood function and sampling from the posterior distribution. Block Kalman filter algorithms are suggested for likelihood evaluation in large linearised DSGE models. Parallel particle filter algorithms are presented for likelihood evaluation in nonlinearly approximated DSGE models. Prefetching random walk Metropolis algorithms and adaptive hybrid sampling algorithms are suggested for posterior sampling. The generality of the algorithms, however, suggests that they should be of interest outside the realm of macroeconometrics as well.
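To illustrate the kind of posterior sampler discussed in this abstract, the following is a minimal random-walk Metropolis sketch (not code from the thesis; the Gaussian toy target, step size, and function names are illustrative assumptions):

```python
import numpy as np

def random_walk_metropolis(log_post, theta0, n_draws=5000, step=0.5, seed=0):
    """Sample from a posterior given its log density via random-walk Metropolis."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    lp = log_post(theta)
    draws = np.empty((n_draws, theta.size))
    accepted = 0
    for i in range(n_draws):
        # Propose a Gaussian step around the current point
        proposal = theta + step * rng.standard_normal(theta.size)
        lp_prop = log_post(proposal)
        # Accept with probability min(1, posterior ratio)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = proposal, lp_prop
            accepted += 1
        draws[i] = theta
    return draws, accepted / n_draws

# Toy example: a standard bivariate normal "posterior"
draws, acc_rate = random_walk_metropolis(lambda t: -0.5 * t @ t, np.zeros(2))
```

In a DSGE application the lambda would be replaced by a likelihood evaluation (e.g., a Kalman or particle filter) plus the log prior, which is exactly where the thesis's fast likelihood algorithms enter.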
APA, Harvard, Vancouver, ISO, and other styles
27

Kiose, Daniil. "The ACEWEM computational laboratory : an integrated agent-based and statistical modelling framework for experimental designs of repeated power auctions." Thesis, London Metropolitan University, 2015. http://repository.londonmet.ac.uk/1257/.

Full text
Abstract:
This research develops a novel framework for experimental designs of liberalised wholesale power markets, namely the Agent-based Computational Economics of Wholesale Electricity Market (ACEWEM) framework. ACEWEM makes it possible to further understand the effect of various market designs on market efficiency and to gain insight into market manipulation by electricity generators. The thesis describes detailed market simulations in which the strategies of power generators emerge from a stochastic profit-optimisation learning algorithm based upon the Generalized Additive Models for Location, Scale and Shape statistical framework. The ACEWEM framework, which integrates the agent-based modelling paradigm with formal statistical methods to better represent real-world decision rules, is designed to be the foundation for large custom-purpose experimental studies inspired by computational learning. It makes a methodological contribution by developing an expert computational laboratory for repeated power auctions with capacity and physical constraints. It further contributes a new computational learning algorithm, which integrates the reinforcement-learning paradigm, engaging past experience in decision making, with flexible statistical models that adjust these decisions based on a view of the future. With regard to policy, this research conducts a simulation study to identify whether high market prices can be ascribed to problems of market design and/or the exercise of market power. Finally, the work presents detailed studies of an abstract wholesale electricity market and of the real UK power market.
APA, Harvard, Vancouver, ISO, and other styles
28

Joseph, Joshua Allen Jr. "Computational Tools for Improved Analysis and Assessment of Groundwater Remediation Sites." Diss., Virginia Tech, 2008. http://hdl.handle.net/10919/28458.

Full text
Abstract:
Remediation of contaminated groundwater remains a high-priority national goal in the United States. Water is essential to life, and new sources of water are needed for an expanding population. Groundwater remediation remains a significant technical challenge despite decades of research into this field. New approaches are needed to address the most severely polluted aquifers, and cost-effective solutions are required to meet remediation objectives that protect human health and the environment. Source reduction combined with Monitored Natural Attenuation (MNA) is a remediation strategy whereby the source of contamination is aggressively treated or removed and the residual groundwater plume depletes due to natural processes in the subsurface. The USEPA requires long-term performance monitoring of groundwater at MNA sites over the remediation timeframe, which often takes decades to complete. Presently, computational tools are lacking to adequately integrate source remediation with economic models. Furthermore, no framework has been developed to highlight the tradeoff between the degree of remediation and the level of benefit within a cost structure. Using the Natural Attenuation Software (NAS) package developed at Virginia Tech, a set of formulae has been developed for calculating the time of remediation (TOR) for petroleum-contaminated aquifers (specifically tracking benzene and MTBE) through statistical techniques. With knowledge of the source-area residual saturation, groundwater velocity, and contaminant plume source length, the time to remediate a site contaminated with either benzene or MTBE can be determined across a range of regulatory maximum contaminant levels. After developing the formulae for TOR, an integrated and interactive decision tool for framing the decision-analysis component of the remediation problem was developed. 
While MNA can be a stand-alone groundwater remediation technology, significant benefits may be realized by layering a more traditional source-zone remedial technique with MNA. Excavation and soil vapor extraction, when applied to the front end of a remedial action plan, can decrease the time to remediation and, while generally more expensive than an MNA-only approach, may accrue long-term economic advantages that would otherwise be foregone. The value of these research components can be realized within the engineering and science communities, as well as in government, business and industry, and communities where groundwater contamination and remediation are at issue. Together, these tools constitute the S·E·E·P·AGE paradigm, founded upon the concept of sound science for an environmental engineering, effectual economics, and public policy agenda. The TOR formulation simplifies the inputs necessary to determine the number of years that an MNA strategy will require before project closure, and thus reduces the specialized skills and training required to perform a numerical analysis that, for one set of conditions, could require many hours of simulation time. The economic decision tool, which uses a life-cycle model to evaluate a set of feasible alternatives, highlights the tradeoffs between time and economics that can be realized over the lifetime of the remedial project.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
29

Faleiro, Jorge. "Supporting large scale collaboration and crowd-based investigation in economics : a computational representation for description and simulation of financial models." Thesis, University of Essex, 2018. http://repository.essex.ac.uk/21782/.

Full text
Abstract:
Finance should be studied as a hard science, where scientific methods apply. When a trading strategy is proposed, the underlying model should be transparent and defined robustly, to allow other researchers to understand and examine it thoroughly. Any report of experimental results must allow other researchers to trace back to the original data and models that produced them. As in any hard science, results must be repeatable, so that researchers can collaborate and build upon each other's results. Large-scale collaboration, when it applies the steps of scientific investigation, is an efficient way to leverage crowd science to accelerate research in finance. Unfortunately, the current reality is far from that. Evidence shows that current methods of investigation in finance in most cases do not allow for reproducible and falsifiable procedures of scientific investigation. As a consequence, the majority of financial decisions at all levels, from personal investment choices to overarching global economic policies, rely on some variation of trial and error and are mostly non-scientific by definition. We lack transparency of procedures and evidence, proper explanation of market events, predictability of effects, and identification of causes. There is no clear demarcation of what is inherently scientific, and as a consequence the line between fake and true is blurred. In this research, we advocate the use of a next-generation investigative approach leveraging the forces of human diversity, micro-specialized crowds, and proper computer-assisted control methods associated with accessibility, reproducibility, communication, and collaboration. This thesis is structured in three distinct parts. The first part defines a set of very specific cognitive and non-cognitive enablers for crowd-based scientific investigation: methods of proof, large-scale collaboration, and a domain-specific computational representation. 
These enablers allow the application of procedures of structured scientific investigation powered by crowds, a "collective brain in which neurons are human collaborators". The second part defines a specialized computational representation to enable proper controls and large-scale collaboration in the field of economics. A computational representation is a role-based representation system based on facets, contributions, and constraints of data, used to define concepts related to a specific domain of knowledge for crowd-based investigation. The third and last part performs an end-to-end investigation of a non-trivial problem in finance, measuring the actual performance of a momentum strategy in technical analysis by applying the formal methods of investigation developed in the first and second parts of this research.
APA, Harvard, Vancouver, ISO, and other styles
30

Kraft, Dennis [Verfasser], Susanne [Akademischer Betreuer] Albers, Martin [Gutachter] Bichler, and Susanne [Gutachter] Albers. "Incentive Design for Present-Biased Agents : A Computational Problem in Behavioral Economics / Dennis Kraft ; Gutachter: Martin Bichler, Susanne Albers ; Betreuer: Susanne Albers." München : Universitätsbibliothek der TU München, 2018. http://d-nb.info/1173898999/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Kraft, Dennis [Verfasser], Susanne [Akademischer Betreuer] Albers, Martin [Gutachter] Bichler, and Susanne [Gutachter] Albers. "Incentive Design for Present-Biased Agents : A Computational Problem in Behavioral Economics / Dennis Kraft ; Gutachter: Martin Bichler, Susanne Albers ; Betreuer: Susanne Albers." München : Universitätsbibliothek der TU München, 2018. http://nbn-resolving.de/urn:nbn:de:bvb:91-diss-20181130-1445718-1-8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

MANCA, MAURIZIO. "Consumption and saving specification: a new perspective." Doctoral thesis, Università Politecnica delle Marche, 2014. http://hdl.handle.net/11566/242765.

Full text
Abstract:
The vast majority of the academic literature on consumption and saving, especially over the last 40 years, derives the optimal saving choice from constrained utility maximization, with utility depending on the consumption of a single homogeneous good that represents the entire bundle demanded by the household. In a dataset of Italian households observed between 1991 and 2010, consumption as a function of the buffer-stock ratio, i.e. the ratio of wealth to current consumption, departs significantly from what traditional economic theory implies: once the data are suitably normalized by household size and by the poverty threshold, consumption exhibits asymmetric responses to liquidity shocks of opposite sign, and non-linearities strong enough to raise doubts about the informative power of the theoretical construct. These apparent irregularities can be reconciled by enriching the information set on which consumption and saving decisions rest and by splitting the range of the buffer-stock ratio into three appropriate categories of consumers. A consumption and saving specification of this kind is discussed and implemented in a heterogeneous-agent computational simulation model.
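The classification by buffer-stock ratio described in this abstract can be sketched as follows (a hypothetical illustration; the cutoff values, field names, and sample figures are assumptions, not taken from the thesis):

```python
import numpy as np
import pandas as pd

# Hypothetical household data: wealth and current consumption
households = pd.DataFrame({
    "wealth":      [1_000.0, 15_000.0, 60_000.0, 4_000.0],
    "consumption": [10_000.0, 12_000.0, 15_000.0, 11_000.0],
})

# Buffer-stock ratio: wealth over current consumption
households["buffer_stock"] = households["wealth"] / households["consumption"]

# Split the ratio's range into three consumer categories
# (illustrative cutoffs -- the thesis estimates its own intervals)
bins = [-np.inf, 0.5, 2.0, np.inf]
labels = ["constrained", "buffer-stock", "wealthy"]
households["category"] = pd.cut(households["buffer_stock"], bins=bins, labels=labels)

print(households[["buffer_stock", "category"]])
```

Each category could then be given its own consumption rule inside the agent-based simulation, which is the kind of heterogeneity the abstract refers to.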
APA, Harvard, Vancouver, ISO, and other styles
33

Bertolai, Jefferson Donizeti Pereira. "Dinâmica monetária eficiente sob encontros aleatórios: uma classe de métodos numéricos que exploram concavidade." reponame:Repositório Institucional do FGV, 2009. http://hdl.handle.net/10438/4277.

Full text
Abstract:
The difficulty of characterizing non-stationary allocations or equilibria is one of the main explanations for the use of concepts and assumptions that trivialize the dynamics of the economy. This difficulty is especially critical in Monetary Theory, in which the dimensionality of the problem is high even for very simple models. In this context, this work reports the computational strategy for implementing the recursive method proposed by Monteiro and Cavalcanti (2006), which allows one to compute the optimal (possibly non-stationary) sequence of distributions of money in an extension of the model proposed by Kiyotaki and Wright (1989). Three aspects of this computation are emphasized: (i) the computational implementation of the planner's problem involves choosing continuous and discrete variables that maximize a nonlinear function and satisfy nonlinear constraints; (ii) the objective function of this problem is not concave and the constraints are not convex; and (iii) the set of admissible choices is not known a priori. The goal is to document the difficulties involved, the solutions proposed, and the methods and resources available for the numerical characterization of efficient monetary dynamics under the assumption of random matching.
APA, Harvard, Vancouver, ISO, and other styles
34

Gräbner, Claudius [Verfasser], Wolfram [Akademischer Betreuer] [Gutachter] Elsner, and Christian [Gutachter] Cordes. "A systemic framework for the computational analysis of complex economies: An evolutionary-institutional perspective on the ontology, epistemology, and methodology of complexity economics / Claudius Gräbner. Betreuer: Wolfram Elsner. Gutachter: Wolfram Elsner ; Christian Cordes." Bremen : Staats- und Universitätsbibliothek Bremen, 2016. http://d-nb.info/1102308889/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Gräbner-Radkowitsch, Claudius [Verfasser], Wolfram [Akademischer Betreuer] [Gutachter] Elsner, and Christian [Gutachter] Cordes. "A systemic framework for the computational analysis of complex economies: An evolutionary-institutional perspective on the ontology, epistemology, and methodology of complexity economics / Claudius Gräbner. Betreuer: Wolfram Elsner. Gutachter: Wolfram Elsner ; Christian Cordes." Bremen : Staats- und Universitätsbibliothek Bremen, 2016. http://nbn-resolving.de/urn:nbn:de:gbv:46-00105216-14.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Thompson, David R. M. "The positronic economist : a computational system for analyzing economic mechanisms." Thesis, University of British Columbia, 2015. http://hdl.handle.net/2429/52868.

Full text
Abstract:
A mechanism is a formal protocol for collective decision making among self-interested agents. Mechanisms model many important social processes from auctions to elections. They are also widely studied in computer science: the participants in real-world mechanisms are often autonomous software systems (e.g., algorithmic bidding and trading agents), and algorithmic problems (e.g., job scheduling) give rise to mechanisms when users have competing interests. Strategic behavior (or "gaming") poses a major obstacle to understanding mechanisms. Although real-world mechanisms are often fairly simple functions (consider, e.g., plurality voting), a mechanism's outcome depends on both this function and on agents' strategic choices. Game theory provides a principled means for analyzing such choices. Unfortunately, game theoretic analysis is a difficult process requiring either human effort or very large amounts of computation. My thesis is that mechanism analysis can be done computationally, due to recent advances in compact representations of games. Compact representation is possible when a game's description has a suitable independence structure. Exploiting this structure can exponentially decrease the space required to represent games, and exponentially decrease the time required to analyze them. The contributions of my thesis revolve around showing that the relevant kinds of structure (specifically, the structure exploited by action-graph games) are present in important mechanisms of interest. Specifically, I studied two major classes of mechanisms, position auctions (as used for internet advertising) and voting rules. In both cases, I was able to provide exponential improvements in the space and time complexity of analysis, and to use those improvements to address important open questions in the literature. I also introduced a new algorithm for analyzing action-graph games, with faster empirical performance and additional benefits over the previous state-of-the-art. 
Finally, I created the Positronic Economist system, which consists of a Python-based descriptive language for mechanism games, with automatic discovery of computationally useful structure.
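The brute-force game analysis that compact representations such as action-graph games are designed to beat can be sketched in a few lines. The example below enumerates pure-strategy Nash equilibria of a two-player bimatrix game; the payoffs are the standard prisoners' dilemma, chosen for illustration, and nothing here is taken from the thesis itself.

```python
# Exhaustive pure-strategy Nash equilibrium search for a 2-player bimatrix
# game. Each cell is an equilibrium iff neither player can gain by deviating
# unilaterally. This naive scan is exponential in larger games, which is the
# cost compact representations aim to reduce.

def pure_nash(payoff_a, payoff_b):
    """Return all (row, col) pure-strategy Nash equilibria."""
    equilibria = []
    n_rows, n_cols = len(payoff_a), len(payoff_a[0])
    for i in range(n_rows):
        for j in range(n_cols):
            best_row = all(payoff_a[i][j] >= payoff_a[k][j] for k in range(n_rows))
            best_col = all(payoff_b[i][j] >= payoff_b[i][l] for l in range(n_cols))
            if best_row and best_col:
                equilibria.append((i, j))
    return equilibria

# Prisoners' dilemma: action 0 = cooperate, 1 = defect.
A = [[3, 0], [5, 1]]   # row player's payoffs
B = [[3, 5], [0, 1]]   # column player's payoffs
print(pure_nash(A, B))  # the unique equilibrium is mutual defection: [(1, 1)]
```

For real mechanisms with many agents and actions, this enumeration blows up, which is exactly where the structure-exploiting algorithms described in the abstract come in.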
Science, Faculty of
Computer Science, Department of
Graduate
APA, Harvard, Vancouver, ISO, and other styles
37

Teixeira, Henrique Oliveira. "Bank networks and firm credit: an agent based model approach." reponame:Repositório Institucional do FGV, 2016. http://hdl.handle.net/10438/15973.

Full text
Abstract:
Starting from the idea that economic systems fall within complexity theory, in which many agents interact without central control and these interactions can change the future behaviour of the agents and of the entire system, much as in a chaotic system, we extend the model of Russo et al. (2014) to carry out three experiments focusing on the interaction between banks and firms in an artificial economy. The first experiment concerns relationship banking, where, according to the literature, interaction over time between banks and firms can produce mutual benefits, mainly through a reduction of the information asymmetry between them. The second experiment relates to information heterogeneity in the credit market: the larger the bank, the higher its visibility, and thus the greater the number of consultations it receives for new loans. Finally, the third experiment examines the effects on the credit market of the heterogeneity of prices that firms face in the goods market.
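The visibility mechanism in the second experiment (larger banks receive more loan consultations) can be sketched with a toy size-proportional draw. The bank sizes and the proportionality rule below are invented for illustration; this is not the Russo et al. (2014) model itself.

```python
# Toy sketch: each of 10,000 firm consultations goes to a bank with
# probability proportional to that bank's size, so large banks attract
# disproportionately many loan applications.
import random

random.seed(42)
bank_sizes = [1.0, 2.0, 4.0, 8.0]          # larger banks are more "visible"
total = sum(bank_sizes)
consultations = [0] * len(bank_sizes)

for _ in range(10_000):                    # each draw is one firm's consultation
    r = random.uniform(0, total)
    cum = 0.0
    for b, size in enumerate(bank_sizes):
        cum += size
        if r <= cum:
            consultations[b] += 1
            break

shares = [c / 10_000 for c in consultations]
print(shares)  # roughly proportional to size, i.e. close to 1/15, 2/15, 4/15, 8/15
```

In the full agent-based model this skewed consultation flow then feeds back into bank balance sheets and credit conditions.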
APA, Harvard, Vancouver, ISO, and other styles
38

Sartzetaki, Maria. "Computational modeling for evaluating the economic impact of airports on regional economies." Thesis, Cranfield University, 2011. http://dspace.lib.cranfield.ac.uk/handle/1826/7219.

Full text
Abstract:
Airports, as fundamental nodes of the air transport network, reflect the economic status of the region they serve and act as major engines of economic development, as stated by ACI (2004). The impact of regional tourist airports on their region is particularly important because of the strong interrelation between airports and tourism. A growing literature on this subject highlights the methods used to calculate the total effect of an airport on a regional economy, and the difficulties entailed in such calculations. The key objective of this research is to develop an econometric assessment model, based on a computational modelling concept, that estimates the economic impact of regional tourist airports on the regional economy. The modelling framework is based on the Input-Output Analysis concept and is in accordance with the theoretical principles of regional and national economics, as well as with the reviewed models that have been developed globally to assess the regional economic significance of airports and transportation projects. The case study of the research is the new airport on the island of Crete in Greece, one of the most attractive tourist destinations in the southeast Mediterranean. A Computational Input-Output Model appropriate for this purpose is presented in order to quantify the total value of the new airport's operation in terms of jobs and income at regional and national level. The economic impact that the model estimates comprises four categories: direct, indirect, induced and catalytic. The model outputs measure these impacts in terms of jobs, total income and total growth of GDP.
The goal is to create a model applicable to comparable regional tourist airports, providing an essential tool to support decisions at the level of strategic planning by estimating the economic impact, and the resulting economic development, of developing a new tourist airport.
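The Input-Output core of such a model is standard Leontief arithmetic: gross output x needed to satisfy final demand f is x = (I - A)^-1 f, and the column sums of the Leontief inverse give simple output multipliers. The three-sector coefficient matrix and demand vector below are invented for illustration, not Crete data.

```python
# Minimal Leontief input-output calculation behind airport impact studies:
# solve x = (I - A)^-1 f, where A holds inter-industry input coefficients
# and f is the final-demand injection (e.g. new airport spending).
import numpy as np

A = np.array([[0.10, 0.05, 0.02],   # illustrative technical coefficients
              [0.20, 0.15, 0.10],
              [0.05, 0.10, 0.08]])
f = np.array([100.0, 50.0, 30.0])   # final demand by sector

L = np.linalg.inv(np.eye(3) - A)    # Leontief inverse
x = L @ f                           # gross output required by sector
multipliers = L.sum(axis=0)         # simple output multipliers per sector

print(x.round(2), multipliers.round(3))
```

Direct, indirect and induced effects are then read off by decomposing these multipliers; the catalytic effects mentioned in the abstract require extensions beyond this basic framework.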
APA, Harvard, Vancouver, ISO, and other styles
39

Pham, Tien Duc, and n/a. "A new approach to regional modelling: an Integrated Regional Equation System (IRES)." Griffith University. School of International Business and Asian Studies, 2004. http://www4.gu.edu.au:8080/adt-root/public/adt-QGU20041022.083520.

Full text
Abstract:
This thesis develops a new structure that explicitly combines two CGE models, a national and a regional one, in an integrated structure that gives the thesis model its name, IRES, short for the Integrated Regional Equation System. The typical features of the integrated structure are the adding-up conditions and the two-way linkages between the national and regional modules facilitated by the interface shifters. The adding-up conditions ensure that the two modules produce consistent results and updated databases. The inclusion of the interface shifters, on the one hand, plays a role in ensuring compatibility of the results of the two modules, i.e. no distortion occurs because technical or taste changes are transferred across modules. On the other hand, the interface shifters assist the operation of IRES in different modes: the model can be used as a top-down model, a bottom-up model or an integrated model in which national and regional shocks can be introduced at the same time. Hence, IRES has more flexibility in its application than a regional or national model alone, as IRES can make use of the availability of data at any level of the economy. IRES has a new labour market in which regional migration is no longer the only factor that settles the labour market, as in the original setting of the MMRF model. Regional unemployment and regional participation rates are modelled to respond to changes in regional employment growth using elasticities estimated econometrically in this thesis. IRES implements historical patterns of regional migration so that the results for regional migration are consistent with observed patterns. Altogether, regional migration, regional unemployment and participation rates determine the equilibrium of the labour market. IRES adopts new approaches to modelling margin demands and indirect taxes. These new approaches are very effective in reducing the size of IRES but do not compromise the use of the model.
These approaches are readily applicable to any other regional CGE models.
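The adding-up conditions described above amount to a consistency constraint: for every commodity, the regional module's quantities must sum to the national module's total. A minimal numeric sketch of such a check, with invented data, looks like this:

```python
# Minimal sketch of an adding-up condition of the kind IRES enforces:
# regional quantities must sum to the national total for every commodity.
# The numbers are invented for illustration.
import numpy as np

# rows: regions, columns: commodities (regional module's database)
regional = np.array([[120.0,  40.0],
                     [ 80.0,  60.0],
                     [ 50.0, 100.0]])
national = np.array([250.0, 200.0])      # national module's database

def adding_up_holds(regional, national, tol=1e-8):
    return np.allclose(regional.sum(axis=0), national, atol=tol)

print(adding_up_holds(regional, national))  # True: the two modules are consistent
```

In the full model these constraints hold equation by equation, so that shocks introduced in either module leave both databases mutually consistent.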
APA, Harvard, Vancouver, ISO, and other styles
40

Pham, Tien Duc. "A new approach to regional modelling: an Integrated Regional Equation System (IRES)." Thesis, Griffith University, 2004. http://hdl.handle.net/10072/366367.

Full text
Abstract:
This thesis develops a new structure that explicitly combines two CGE models, a national and a regional one, in an integrated structure that gives the thesis model its name, IRES, short for the Integrated Regional Equation System. The typical features of the integrated structure are the adding-up conditions and the two-way linkages between the national and regional modules facilitated by the interface shifters. The adding-up conditions ensure that the two modules produce consistent results and updated databases. The inclusion of the interface shifters, on the one hand, plays a role in ensuring compatibility of the results of the two modules, i.e. no distortion occurs because technical or taste changes are transferred across modules. On the other hand, the interface shifters assist the operation of IRES in different modes: the model can be used as a top-down model, a bottom-up model or an integrated model in which national and regional shocks can be introduced at the same time. Hence, IRES has more flexibility in its application than a regional or national model alone, as IRES can make use of the availability of data at any level of the economy. IRES has a new labour market in which regional migration is no longer the only factor that settles the labour market, as in the original setting of the MMRF model. Regional unemployment and regional participation rates are modelled to respond to changes in regional employment growth using elasticities estimated econometrically in this thesis. IRES implements historical patterns of regional migration so that the results for regional migration are consistent with observed patterns. Altogether, regional migration, regional unemployment and participation rates determine the equilibrium of the labour market. IRES adopts new approaches to modelling margin demands and indirect taxes. These new approaches are very effective in reducing the size of IRES but do not compromise the use of the model.
These approaches are readily applicable to any other regional CGE models.
Thesis (PhD Doctorate)
Doctor of Philosophy (PhD)
School of International Business and Asian Studies
Full Text
APA, Harvard, Vancouver, ISO, and other styles
41

Angus, Simon Douglas Economics Australian School of Business UNSW. "Economic networks: communication, cooperation & complexity." Awarded by:University of New South Wales. Economics, 2007. http://handle.unsw.edu.au/1959.4/27005.

Full text
Abstract:
This thesis is concerned with the analysis of economic network formation. There are three novel sections to this thesis (Chapters 5, 6 and 8). In the first, the non-cooperative communication network formation model of Bala and Goyal (2000) (BG) is re-assessed under conditions of no inertia. It is found that the Strict Nash circle (or wheel) structure is still the equilibrium outcome for n = 3 under no inertia. However, a counter-example for n = 4 shows that with no inertia infinite cycles are possible, and hence the system does not converge. In fact, cycles are found to quickly dominate outcomes for n > 4 and further numerical simulations of conditions approximating no inertia (probability of updating > 0.8 to 1) indicate that cycles account for a dramatic slowing of convergence times. These results, together with the experimental evidence of Falk and Kosfeld (2003) (FK) motivate the second contribution of this thesis. A novel artificial agent model is constructed that allows for a vast strategy space (including the Best Response) and permits agents to learn from each other as was indicated by the FK results. After calibration, this model replicates many of the FK experimental results and finds that an externality exploiting ratio of benefits and costs (rather than the difference) combined with a simple altruism score is a good proxy for the human objective function. Furthermore, the inequity aversion results of FK are found to arise as an emergent property of the system. The third novel section of this thesis turns to the nature of network formation in a trust-based context. A modified Iterated Prisoners' Dilemma (IPD) model is developed which enables agents to play an additional and costly network forming action. Initially, canonical analytical results are obtained despite this modification under uniform (non-local) interactions. However, as agent network decisions are 'turned on' persistent cooperation is observed. 
Furthermore, in contrast to the vast majority of non-local or static network models in the literature, it is found that aperiodic, complex dynamics result for the system in the long run. Subsequent analysis of this regime indicates that the network dynamics have fingerprints of self-organized criticality (SOC). Whilst evidence for SOC is found in many physical systems, such dynamics have seldom, if ever, been reported in the strategic interaction literature.
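The non-convergence result above (infinite best-response cycles once inertia is removed) can be illustrated with the simplest possible example. The game below is matching pennies, not the Bala and Goyal network game; it is only meant to show how simultaneous best-response updating, with every agent revising each period, can cycle forever.

```python
# Simultaneous best-response dynamics with no inertia: both players update
# every period. In matching pennies this produces a period-4 cycle that
# never settles on an equilibrium.

def best_response(opponent_action, wants_match):
    # the matcher wants the same action as the opponent, the mismatcher the opposite
    return opponent_action if wants_match else 1 - opponent_action

state = (0, 0)                     # (matcher's action, mismatcher's action)
seen = []
for _ in range(8):
    seen.append(state)
    a, b = state
    state = (best_response(b, True), best_response(a, False))

print(seen)  # (0,0) -> (0,1) -> (1,1) -> (1,0) -> (0,0) -> ... forever
```

With inertia (only some players updating each period), such lockstep cycles can be broken, which is why the probability of updating matters so much for convergence times in the abstract's simulations.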
APA, Harvard, Vancouver, ISO, and other styles
42

Björnfot, Fredrik. "GDP Growth Rate Nowcasting and Forecasting." Thesis, Umeå universitet, Institutionen för fysik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-132951.

Full text
Abstract:
The main purpose of this project was to help Swedbank get a better understanding of how the gross domestic product growth rate develops in the future from a data set of macroeconomic variables. Since GDP values are released long after a quarter has ended, Swedbank would like a model that can predict upcoming GDP from these data sets. This was solved by a combination of growth rate predictions from a dynamic factor model, a vector autoregressive model and two machine learning models. The predictions were combined using a weighting method called the system averaging model, in which the model prediction with the least historical error receives the largest weight in the final future prediction. In previous work a simple moving average model was implemented to achieve this effect; however, a simple moving average model has several flaws. Most of these defects could in theory be avoided by using an exponential weighting scheme instead. This resulted in the use of an exponential weighting method to calculate weights for future predictions. The main conclusions from this project were that some predictions improved when badly performing models that had received too large a weight were removed. Putting too high a weight on a single well-performing model is also not optimal, since the predictions can become very unstable because of varying model performance. The exponential weighting scheme worked well for some predictions; however, when the parameter that controls how the weight is distributed between recent and historical errors became too small, a problem arose: too few values were used to form the final weights for the prediction, and the estimate became unsteady.
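The exponential error-weighting idea in this abstract can be sketched compactly: discount each model's historical errors so recent mistakes count most, then weight forecasts by inverse discounted error. The decay parameter `lam`, the toy error histories and the forecasts below are all invented for illustration; this is not the thesis's calibrated system averaging model.

```python
# Sketch of combining model forecasts with weights that decay exponentially
# in historical error: recent errors count more than old ones, and more
# accurate models receive larger weights.
import numpy as np

def exponential_weights(errors, lam=0.7):
    """errors: (n_models, n_periods) positive absolute errors, oldest first.
    Returns weights favouring models that have been accurate recently."""
    n_periods = errors.shape[1]
    decay = lam ** np.arange(n_periods - 1, -1, -1)   # newest error gets weight 1
    score = (errors * decay).sum(axis=1)              # discounted error per model
    inv = 1.0 / score
    return inv / inv.sum()

errors = np.array([[0.9, 0.8, 0.2],    # model A: improving lately
                   [0.2, 0.5, 0.9]])   # model B: deteriorating lately
forecasts = np.array([2.1, 1.4])       # each model's GDP growth forecast

w = exponential_weights(errors)
combined = w @ forecasts               # the weighted "system" forecast
print(w.round(3), round(float(combined), 3))
```

A small `lam` makes the scheme rely almost entirely on the latest error, which mirrors the instability the abstract reports when the parameter gets too small.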
APA, Harvard, Vancouver, ISO, and other styles
43

Metzig, Cornelia. "A Model for a complex economic system." Thesis, Grenoble, 2013. http://www.theses.fr/2013GRENS038/document.

Full text
Abstract:
The thesis is in the field of complex systems, applied to an economic system. In this thesis, an agent-based model is proposed to model the production cycle. It comprises firms, workers and a bank, and respects stock-flow consistency. Its central assumption is that firms plan their production based on an expected profit margin. A simple scenario of the model, in which the expected profit margin is the same for all firms, has been analyzed in the context of simple stochastic growth models. The results are a firm size distribution close to a power law, a tent-shaped growth rate distribution, and a growth rate variance that scales with firm size. These results are close to empirically found stylized facts. In a more comprehensive version, the model contains additional features: heterogeneous profit margins, as well as interest payments and the possibility of bankruptcy. This relates the model to agent-based macroeconomic models. The extensions are described theoretically with replicator dynamics. New results are the age distribution of active firms, their profit rate distribution, the debt distribution, bankruptcy statistics, and typical life cycles of firms, all of which are qualitatively in agreement with studies of firm databases from various countries. The proposed model yields promising results while respecting the principle that jointly observed results may be generated by the same process, or by several compatible ones.
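The stochastic-growth baseline behind results like these is Gibrat-style multiplicative growth: firms receive random proportional shocks, and a lower reflecting barrier yields a strongly right-skewed, heavy-tailed size distribution. The simulation below is only this textbook baseline with invented parameters, not the thesis model, which adds expected profit margins and stock-flow consistency.

```python
# Minimal multiplicative (Gibrat-style) growth simulation: firms grow by
# random proportional shocks; a reflecting floor at size 1 keeps the
# stationary distribution heavy-tailed (Kesten-type mechanism).
import numpy as np

rng = np.random.default_rng(0)
n_firms, n_periods = 5_000, 200
sizes = np.ones(n_firms)

for _ in range(n_periods):
    shocks = rng.lognormal(mean=0.0, sigma=0.1, size=n_firms)
    sizes *= shocks
    sizes = np.maximum(sizes, 1.0)     # reflecting barrier: no firm shrinks below 1

print(float(np.median(sizes)), float(sizes.max()))
# the mean and maximum far exceed the median: a heavy right tail
```

Growth rates in such models are roughly size-independent in distribution, which is the starting point for the tent-shaped growth-rate and variance-scaling facts the abstract mentions.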
APA, Harvard, Vancouver, ISO, and other styles
44

Davy, Simon Mark. "Decentralised economic resource allocation for computational grids." Thesis, University of Leeds, 2008. http://etheses.whiterose.ac.uk/1369/.

Full text
Abstract:
Grid computing is the concept of harnessing the power of many computational resources in a transparent manner. It is currently an active research area, with significant challenges due to the scale and level of heterogeneity involved. One of the key challenges in implementing grid systems is resource allocation. Currently, centralised approaches are employed that have limited scalability and reliability, which is a key factor in achieving a usable grid system. The field of economics is the study of allocating scarce resources using economic mechanisms. Such systems can be highly scalable, robust and adaptive, and as such are a potential solution to the grid allocation problem. There is also a natural fit of the economic allocation metaphor to grid systems, given the diversity and autonomy of grid resources. We propose that an economic system is a suitable mechanism for grid resource allocation, and we propose a simple market mechanism to explore this idea. Our system is a fully decentralised economic allocation scheme, which aims to achieve a high degree of scalability and reliability, and easily allows resources to retain their autonomy. We implement a simulation of a grid system to analyse this scheme, and explore its performance and scalability, with a comparison to existing systems. We use a network to facilitate communication between participating agents, and we pay particular attention to the topology of the network between participating agents, examining the effects of different topologies on the performance of the system.
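The simplest decentralised price mechanism of the kind this abstract gestures at is tatonnement: a resource owner raises its price when requested load exceeds capacity and lowers it otherwise, with no central allocator. The demand curve and tuning constants below are invented for illustration; the thesis's market mechanism is richer than this.

```python
# A minimal decentralised price-adjustment sketch for a compute resource:
# price rises with excess demand and falls with excess capacity until the
# requested load matches what the resource can supply.

def demand(price):
    # aggregate CPU-hours users request at a given price (toy linear curve)
    return max(0.0, 100.0 - 20.0 * price)

capacity = 40.0
price = 0.5
for _ in range(200):
    excess = demand(price) - capacity
    price += 0.005 * excess            # a small step keeps the adjustment stable

print(round(price, 3), round(demand(price), 1))
# price converges near 3.0, where demand meets the 40 CPU-hour capacity
```

With many resources each running such a local rule, allocation emerges from prices rather than from a central scheduler; the network topology then determines how quickly price information propagates.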
APA, Harvard, Vancouver, ISO, and other styles
45

Bogan, Nathaniel Rockwood. "Economic allocation of computation time with computation markets." Thesis, Massachusetts Institute of Technology, 1994. http://hdl.handle.net/1721.1/32603.

Full text
Abstract:
Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1994.
Includes bibliographical references (leaves 88-91).
by Nathaniel Rockwood Bogan.
M.Eng.
APA, Harvard, Vancouver, ISO, and other styles
46

Merz, Laura. "AUTOMATION-INDUCED RESHORING: An Agent-based Model of the German Manufacturing Industry." Thesis, Uppsala universitet, Institutionen för geovetenskaper, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-394212.

Full text
Abstract:
The concept of ‘Industry 4.0’ signals the rise of innovative manufacturing technologies, including industrial robots. Wider applicability of robotic automation and higher efficiency of production processes shift the profitability analysis of strategic relocation decisions. Beyond technological feasibility, the diffusion of the technology lowers the profitability threshold for robots. Consequently, the competitive labour cost advantages that formerly motivated manufacturing firms to offshore production become less relevant. Robots gain additional importance under shifted global economic realities, such as stricter environmental regulation of global trade and the convergence of the global wage gap. However, the heterogeneous levels of automation among manufacturing firms have not been taken into account when studying the macroeconomic phenomenon of reshoring. This study adds novelty by offering an agent-based perspective, which allows insights into how the behaviour of firms, guided by simple economic rules at the micro level, is dynamically influenced by their complex environment with regard to relocation decision-making hypotheses. In tests of various variables sensitive to initial conditions, increased environmental regulation targeting global trade and upward-shifting wage levels in formerly offshore production locations have been shown to be the driving and inhibiting mechanisms of this socio-technical system. The dynamic therefore demonstrates a shift from the predominantly cited economic reasoning for relocation strategies towards sustainability aspects, which are pressingly changing these realities along environmental and social dimensions. The popular debate is driven by increased environmental awareness and the proclaimed fear of robots killing jobs. With reshoring shaping the political agenda, interest in the phenomenon has recently been fuelled by the rise of populism and protectionism claiming to “bring jobs back home”.
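The relocation calculus underlying the abstract can be made concrete with back-of-the-envelope unit costs: reshore when the automated domestic unit cost falls below the offshore unit cost, which itself rises with wages and trade regulation. All function names and numbers below are invented for illustration; they are not parameters from the thesis model.

```python
# Back-of-the-envelope reshoring decision: compare offshore unit cost
# (wage-driven, plus trade frictions) with automated domestic unit cost
# (robot amortisation plus running cost).

def offshore_unit_cost(wage, trade_penalty):
    labour_minutes = 6.0                       # manual minutes per unit (assumed)
    return wage / 60.0 * labour_minutes + trade_penalty

def domestic_unit_cost(robot_price, units_over_life, energy_per_unit=0.02):
    return robot_price / units_over_life + energy_per_unit

def reshore(wage, trade_penalty, robot_price, units_over_life=1_000_000):
    return domestic_unit_cost(robot_price, units_over_life) < \
           offshore_unit_cost(wage, trade_penalty)

# Falling robot prices and rising offshore wages and trade frictions flip the decision:
print(reshore(wage=3.0, trade_penalty=0.05, robot_price=600_000))  # False
print(reshore(wage=6.0, trade_penalty=0.15, robot_price=150_000))  # True
```

The agent-based model in the thesis embeds this kind of micro rule in heterogeneous firms and lets the aggregate reshoring pattern emerge from their interactions.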
APA, Harvard, Vancouver, ISO, and other styles
47

Mignot, Sylvain. "Négocier ou enchérir, l’influence des mécanismes de vente : le cas du marché aux poissons de Boulogne-sur-Mer." Thesis, Paris 2, 2012. http://www.theses.fr/2012PA020101/document.

Full text
Abstract:
Should I buy or should I bid? The influence of market mechanisms: the case of the Boulogne-sur-Mer fish market. The Boulogne-sur-Mer fish market is organized in a very specific way. Each day, buyers and sellers can choose to use either an auction mechanism, a negotiated market, or even both, in order to sell and buy goods. A striking observed fact is the stable coexistence of these two sub-markets over time, with no convergence of agents toward one of them, each accounting for roughly half of the exchanged quantities. The present thesis aims at discovering the necessary conditions for the emergence and stability of such a coexistence. We begin with an empirical study of the daily transactions that have occurred on this market over several years, using statistical and econometric analysis to extract its main stylized facts, and then study the social networks influencing outcomes. Once those facts are established, we build agent-based computational models able to reproduce the individual behaviours of agents and, through these, the emergence of the market's behaviour itself.
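One simple way coexistence of two trading mechanisms can persist is reinforcement choice with congestion: agents probabilistically favour the mechanism that has paid well recently, while payoffs fall as a mechanism gets crowded. The toy model below, with invented payoff and learning rules, illustrates that logic only; the thesis's calibrated models are far richer.

```python
# Toy reinforcement sketch of the market-choice question: each day a buyer
# picks the auction or the negotiated market with a logit rule over payoff
# "attractions"; congestion keeps either mechanism from taking over.
import math
import random

random.seed(7)
attraction = {"auction": 0.0, "negotiated": 0.0}
counts = {"auction": 0, "negotiated": 0}

def payoff(crowding):
    # a mechanism pays less the larger its share of past trades
    return 1.0 - 0.5 * crowding

for day in range(2_000):
    expa = math.exp(attraction["auction"])
    expn = math.exp(attraction["negotiated"])
    p_auction = expa / (expa + expn)
    choice = "auction" if random.random() < p_auction else "negotiated"
    counts[choice] += 1
    crowding = counts[choice] / (day + 1)
    attraction[choice] = 0.9 * attraction[choice] + 0.1 * payoff(crowding)

shares = {m: c / 2_000 for m, c in counts.items()}
print(shares)  # both mechanisms keep a substantial share of the trades
```

The self-correcting feedback (a crowded mechanism pays less, so attractions rebalance) is what keeps the split interior rather than driving all agents to one mechanism.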
APA, Harvard, Vancouver, ISO, and other styles
48

Ngaruye, Innocent. "Contributions to Small Area Estimation : Using Random Effects Growth Curve Model." Doctoral thesis, Linköpings universitet, Matematisk statistik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-137206.

Full text
Abstract:
This dissertation considers small area estimation with a main focus on estimation and prediction for repeated measures data. Demand for small area statistics concerns both cross-sectional and repeated measures data. For instance, small area estimates for repeated measures data may be useful to public policy makers for purposes such as fund allocation and new educational or health programmes, where decision makers might be interested in the trend of estimates for a specific characteristic of interest for a given category of the target population as a basis for their planning. It has been shown that the multivariate approach to model-based methods in small area estimation may achieve substantial improvement over the usual univariate approach. In this work, we consider repeated surveys taken on the same subjects at different time points. The population from which a sample has been drawn is partitioned into several non-overlapping subpopulations, and within all subpopulations there is the same number of group units. The aim is to propose a model that borrows strength across small areas and over time, with a particular interest in growth profiles over time. The model accounts for repeated surveys, grouped individuals and random effects variations. Firstly, a multivariate linear model for repeated measures data is formulated under small area estimation settings. The estimation of model parameters is discussed within a likelihood-based approach, and the prediction of random effects and of small area means across time points, per group unit and for all time points is obtained. In particular, as an application of the proposed model, an empirical study is conducted to produce district-level estimates of beans in Rwanda for the agricultural seasons of 2014, covering two varieties, bush beans and climbing beans. Secondly, the thesis develops the properties of the proposed estimators and discusses the computation of their first and second moments.
Through a method based on parametric bootstrap, these moments are used to estimate the mean-squared errors for the predicted small area means. Finally, a particular case of incomplete multivariate repeated measures data that follow a monotonic sample pattern for small area estimation is studied. By using a conditional likelihood based approach, the estimators of model parameters are derived. The prediction of random effects and predicted small area means are also produced.
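The parametric bootstrap idea for MSE estimation can be sketched on a deliberately simplified model: shrink small-area sample means toward the overall mean, then repeatedly simulate from the assumed model and track the predictor's squared error. This is a toy random-means setup with invented variances, not the thesis's multivariate growth-curve model.

```python
# Parametric bootstrap MSE for a shrinkage small-area predictor under a
# simple random-means model: true area means ~ N(0, sigma_v^2), and each
# area contributes n_obs noisy observations.
import numpy as np

rng = np.random.default_rng(1)
n_areas, n_obs = 10, 5
sigma_v, sigma_e = 1.0, 2.0                 # area-effect sd and sampling sd

gamma = sigma_v**2 / (sigma_v**2 + sigma_e**2 / n_obs)  # shrinkage factor

def predict(area_means):
    grand = area_means.mean()
    return grand + gamma * (area_means - grand)   # shrink toward the grand mean

B = 500                                      # bootstrap replicates
sq_err = np.zeros(n_areas)
for _ in range(B):
    true_means = rng.normal(0.0, sigma_v, n_areas)
    samples = rng.normal(true_means[:, None], sigma_e, (n_areas, n_obs))
    pred = predict(samples.mean(axis=1))
    sq_err += (pred - true_means) ** 2

mse = sq_err / B
print(mse.round(3))  # bootstrap MSE of the shrunken small-area predictions
```

The shrinkage predictor's bootstrap MSE comes out below the direct estimator's sampling variance (sigma_e^2 / n_obs = 0.8 here), which is the gain borrowing strength is meant to deliver.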
APA, Harvard, Vancouver, ISO, and other styles
49

Schriner, Andrew W. "#Crowdwork4dev:Engineering Increases in Crowd Labor Demand to Increase the Effectiveness of Crowd Work as a Poverty-Reduction Tool." University of Cincinnati / OhioLINK, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1445341861.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Tran, Hieu. "Fragilité financière par l'analyse des réseaux et l'approche comportementale." Thesis, Bordeaux, 2018. http://www.theses.fr/2018BORD0445/document.

Full text
Abstract:
This thesis studies financial fragility, i.e. the sensitivity of the financial system with respect to shocks. The main difficulty concerning financial fragility in the current context is the increased complexity of the financial system. To address this problem, this study draws inspiration from two relatively recent streams of literature: the economics of networks and behavioral economics. The main concepts in use are diffusion, cascades and bounded rationality. Chapter 1 studies how patterns of links, specifically the length of transitive cycles, affect the extent of financial contagion. Chapter 2 proposes a dynamic model in which bank runs arise as cascades of withdrawals; the aim is to better understand how bank runs occur. Chapter 3 studies bank runs in a dynamic and behavioral setting, with herding and heterogeneity of depositors.
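A cascade of withdrawals of the kind Chapter 2 models can be illustrated with a standard threshold contagion on a network: a depositor runs once the fraction of neighbours already withdrawing reaches a threshold, so a small initial panic can spread. The ring network and threshold below are illustrative only, not the thesis's model.

```python
# Minimal withdrawal-cascade sketch: iterate until no further depositor's
# neighbourhood pushes them over the panic threshold.

def run_cascade(n, neighbours, threshold, seeds):
    withdrawing = set(seeds)
    changed = True
    while changed:
        changed = False
        for i in range(n):
            if i in withdrawing:
                continue
            frac = sum(j in withdrawing for j in neighbours[i]) / len(neighbours[i])
            if frac >= threshold:
                withdrawing.add(i)
                changed = True
    return withdrawing

n = 12   # depositors on a ring, each observing their two neighbours
neighbours = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
panicked = run_cascade(n, neighbours, threshold=0.5, seeds={0})
print(len(panicked))  # one panicked depositor triggers a full run: 12
```

Raising the threshold just above 0.5 stops the contagion at the seed, which shows how sensitive such cascades are to both behaviour (thresholds) and link patterns, the two levers the thesis studies.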
APA, Harvard, Vancouver, ISO, and other styles
