Theses on the topic "Econophysics"

Follow this link to see other types of publications on the topic: Econophysics.

Create an accurate citation in APA, MLA, Chicago, Harvard, and other styles

Choose a source type:

Consult the top 50 theses for your research on the topic "Econophysics".

Next to each source in the reference list there is an "Add to bibliography" button. Click it and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the academic publication in PDF format and read its abstract online whenever it is available in the metadata.

Explore theses on a wide variety of disciplines and organize your bibliography correctly.

1

Schinckus, Christophe. "When physics became undisciplined : an essay on econophysics". Thesis, University of Cambridge, 2018. https://www.repository.cam.ac.uk/handle/1810/279683.

Full text
Abstract
In the 1990s, physicists started looking beyond their disciplinary boundaries by using their methods to study various problems usually thrown up by financial economics. This dissertation deals with this extension of physics outside its disciplinary borders. It seeks to determine what sort of discipline econophysics is in relation to physics and to economics, how its emergence was made possible, and what sort of knowledge it produces. Using a variety of evidence, including bibliometric analysis, Chapter 1 explores the field's disciplinary identity as a branch of physics, even though its intellectual heart is better seen as the re-emergence of a 1960s research programme initiated in economics. Chapter 2 is historical: it identifies the key role played by the Santa Fe Institute and its pioneering complexity research in shaping the methodological horizons of econophysics. These are in turn investigated in Chapter 3, which argues that there are in fact three methodological strands: statistical econophysics, bottom-up agent-based econophysics, and top-down agent-based econophysics. Viewed from a Lakatosian perspective, they all share a conceptual hard core but articulate the protective belt in distinctly different ways. The final chapter is devoted to the way econophysicists produce and justify their knowledge. It shows that econophysics operates by proposing empirically adequate analogies between physical and other systems, in exactly the ways emphasised by Pierre Duhem. The contrast between such use of analogy in econophysics and the modelling practices of financial economics explains why econophysics remains so controversial to economists.
APA, Harvard, Vancouver, ISO, and other styles
2

Oltean, Elvis. "Modelling income, wealth, and expenditure data by use of Econophysics". Thesis, Loughborough University, 2016. https://dspace.lboro.ac.uk/2134/20203.

Full text
Abstract
In the present work, we identify several distributions from physics and study their applicability to phenomena such as the distribution of income, wealth, and expenditure. Firstly, we apply the logistic distribution to these data and find that it fits the annual data very well over the entire income interval, including the upper income segment of the population. Secondly, we apply the Fermi-Dirac distribution to these data. We seek to explain possible correlations and analogies between economic systems and statistical thermodynamic systems, and we try to explain their behaviour and properties by correlating physical variables with macroeconomic aggregates and indicators. We then draw some analogies between the parameters of the Fermi-Dirac distribution and macroeconomic variables. Thirdly, as complex systems are modelled using polynomial distributions, we apply polynomials to the annual data sets and find that they also fit the entire income interval very well. Fourthly, we develop a new methodology to approach income, wealth, and expenditure distributions dynamically, in analogy with dynamical complex systems. This methodology was applied to different time intervals of consecutive years, up to 35 years. Finally, we develop a mathematical model based on a Hamiltonian that maximises a utility function applied to the Ramsey model, using Fermi-Dirac and polynomial utility functions, and find some theoretical connections with time preference theory. We apply these distributions to a large pool of data from countries with different levels of development, using different methods for the calculation of income, wealth, and expenditure.
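As a rough illustration of the Fermi-Dirac approach described above, the sketch below fits a Fermi-Dirac complementary CDF to synthetic income data. The logistic sample, grid ranges, and grid-search fit are illustrative assumptions, not the thesis's actual data or estimation method; the key fact used is that a logistic income distribution has a complementary CDF of exactly Fermi-Dirac form.

```python
import numpy as np

def fermi_dirac_ccdf(x, mu, T):
    """Fraction of the population with income above x, assuming a
    Fermi-Dirac form (mu ~ 'chemical potential', T ~ 'temperature')."""
    return 1.0 / (np.exp((x - mu) / T) + 1.0)

# Synthetic income sample drawn from a logistic distribution, whose
# complementary CDF has exactly the Fermi-Dirac form above.
rng = np.random.default_rng(0)
incomes = rng.logistic(loc=30.0, scale=8.0, size=50_000)

# Empirical complementary CDF on a grid of income levels.
grid = np.linspace(0.0, 60.0, 61)
emp_ccdf = np.array([(incomes > x).mean() for x in grid])

# Crude least-squares fit of (mu, T) by grid search (no SciPy needed).
mus = np.linspace(20.0, 40.0, 81)
Ts = np.linspace(2.0, 20.0, 73)
best = min(((np.sum((fermi_dirac_ccdf(grid, m, t) - emp_ccdf) ** 2), m, t)
            for m in mus for t in Ts))
_, mu_hat, T_hat = best
print(mu_hat, T_hat)  # should land near the true (30, 8)
```

The fitted "temperature" would then be compared with macroeconomic indicators, as the abstract suggests.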
APA, Harvard, Vancouver, ISO, and other styles
3

Volpati, Valerio. "Statistical Mechanics approach to the sustainability of economic ecosystems". Doctoral thesis, SISSA, 2016. http://hdl.handle.net/20.500.11767/4924.

Full text
Abstract
This thesis contains some of the main results obtained during my research activity in these years, in the Statistical Physics sector at SISSA and in the Quantitative Life Sciences sector at ICTP. Chapter 1 serves as an introduction and is kept brief, because each of the following chapters has a separate introduction with more details on the problems considered. In Chapter 2, several models of wealth dynamics are discussed, with a focus on their stationary distributions. In particular, we introduce a stochastic growth model whose stationary state is a truncated power-law distribution, and we interpret the mechanism generating this cut-off as a manifestation of shadow banking activity. Chapter 3 is devoted to the issue of wealth inequality, and in particular to its consequences when economic exchanges take place in a system with a power-law wealth distribution. A stylized model of trading dynamics is introduced, in which we show that, as inequality increases, liquid capital concentrates more and more on the wealthiest agents, thereby suppressing the liquidity of the economy. Finally, in Chapter 4, we discuss the issue of complexity and the information sensitivity of financial products. In particular, we introduce a stylized model of binary variables in which financial transparency can be quantified in bits. We quantify how such information losses create sources of systemic risk, and how they should affect the pricing of financial products.
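The multiplicative-growth mechanism behind power-law wealth tails can be sketched with a generic Kesten-type process: multiplicative shocks plus a small additive income yield a power-law stationary tail. This is an illustration under assumed parameters, not the specific truncated model (or its shadow-banking cut-off) introduced in the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)
n_agents, n_steps = 20_000, 400

# Kesten-type wealth dynamics: w <- a*w + b with random multiplicative
# shock a (E[log a] < 0, but a > 1 occurs) and additive income b.
w = np.ones(n_agents)
for _ in range(n_steps):
    a = rng.lognormal(mean=-0.05, sigma=0.3, size=n_agents)
    b = rng.exponential(scale=0.1, size=n_agents)
    w = a * w + b

# Rough tail-exponent estimate via the Hill estimator on the top 1%.
tail = np.sort(w)[-n_agents // 100:]
alpha = 1.0 / np.mean(np.log(tail / tail[0]))
print(round(alpha, 2))
```

For these parameters the stationary tail exponent solves E[a^alpha] = 1, giving a heavy (alpha near 1) tail; a truncation mechanism like the one in the thesis would cut this tail off at large wealth.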
APA, Harvard, Vancouver, ISO, and other styles
4

Tenenbaum, Joel. "Interdisciplinary applications of statistical physics to complex systems: seismic physics, econophysics, and sociophysics". Thesis, Boston University, 2012. https://hdl.handle.net/2144/31617.

Full text
Abstract
Thesis (Ph.D.)--Boston University
This thesis applies statistical physics concepts and methods to quantitatively analyze complex systems. It is separated into four parts: (i) characteristics of earthquake systems, (ii) memory and volatility in data time series, (iii) the application of part (ii) to world financial markets, and (iv) statistical observations on the evolution of word usage. In Part I, we observe statistical patterns in the occurrence of earthquakes. We select a 14-year earthquake catalog covering the archipelago of Japan. We find that regions traditionally thought of as being too distant from one another for causal contact display remarkably high correlations, and the networks that result have a tendency to link highly connected areas with other highly connected areas. In Part II, we introduce and apply the concept of "volatility asymmetry", the primary use of which is in financial data. We explain the relation between memory and "volatility asymmetry" in terms of an asymmetry parameter λ. We define a litmus test for determining whether λ is statistically significant, propose a stochastic model based on this parameter, and use the model to further explain empirical data. In Part III, we expand on volatility asymmetry. Importing the concepts of time dependence and universality from physics, we explore aspects of emerging (or "transition") economies in Eastern Europe as they relate to asymmetry. We find that these emerging markets in some instances behave like developed markets and in other instances do not, and that the distinction is a matter both of country and of time period, with crisis periods showing different asymmetry characteristics than "healthy" periods. In Part IV, we take note of a series of findings in econophysics showing statistical growth similarities between a variety of different areas that have in common the fact of being both (i) competitive and (ii) dynamic. We show that this same growth distribution can be reproduced in the growth rates of the usage of individual words: just as companies compete for sales in a zero-sum marketing game, so do words compete for usage within a limited amount of reader man-hours.
APA, Harvard, Vancouver, ISO, and other styles
5

Cheung, Wing-Keung. "Monte Carlo simulation on 2D random point pattern : Potts model and its application to econophysics /". View abstract or full-text, 2005. http://library.ust.hk/cgi/db/thesis.pl?PHYS%202005%20CHEUNG.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Koh, Jason S. H. "Comparison of the new "econophysics" approach to dealing with problems of financial to traditional econometric methods". Thesis, View thesis, 2008. http://handle.uws.edu.au:8081/1959.7/38828.

Full text
Abstract
We begin by outlining the motivation for this research, as there remain many unanswered research questions about our complex financial and economic systems. The philosophical background and the advances of econometrics and econophysics are discussed to provide an overview of stochastic and non-stochastic modelling, and these disciplines are set as the central theme of the thesis. This thesis investigates the effectiveness of financial econometric models such as Gaussian, ARCH(1), GARCH(1,1) and its extensions, as compared to econophysics models such as the Power Law model, Boltzmann-Gibbs (BG) and Tsallis entropy, as statistical models of volatility in the US S&P500, Dow Jones and NASDAQ stock indices using daily data. The data demonstrate several distinct behavioural characteristics, particularly the increased volatility from 1998 to 2004. Power Laws appear to describe the large fluctuations and other characteristics of stock price changes. Surprisingly, these Power Law models also show significant correlations for different types and sizes of markets and for different periods and sub-periods of markets. The results show the robustness of Power Law analysis, with the Power Law exponent (0.4 to 2.4) staying within the acceptable range of significance (83% to 97%), regardless of the percentage change in the index return. However, the procedure for testing empirical data against a hypothesised power-law distribution using a simple rank-frequency plot of the data, together with the data binning process, can produce spurious results for the distribution. As for stochastic processes such as ARCH(1) and GARCH(1,1), the models are explicitly confined to the conditional behaviour of the data, and the unconditional behaviour has often been described via moments. In reality, it is the unconditional tail behaviour that matters, and hence we have to capture the unconditional tail behaviour and express the models as a two-dimensional stochastic difference equation, following the processes of Starica (Mikosch 2000). The results show that the random walk prediction successfully describes stock movements for small price fluctuations but fails to handle large price fluctuations. The Power Law tests prove superior to the stochastic tests when stock price fluctuations are substantially divergent from the mean. One of the main points of the thesis is that these empirical phenomena are not present in the stochastic process but emerge in the non-parametric process. The main objective of the thesis is to study the relatively new field of econophysics and put its work in perspective relative to the established, if not altogether successful, practice of econometric analysis of stock market volatility. One of the most exciting characteristics of econophysics is that, as a developing field, no model as yet perfectly represents the market, and there is still a lot of fundamental research to be done. Therefore, we explore the application of statistical physics methods, particularly Tsallis entropy, to give new insights into problems traditionally associated with financial markets. The results of Tsallis entropy surpass all expectations, and it therefore proves one of the most robust methods of analysis. However, it is now subject to some challenge from McCauley, Bassler et al., who found that the stochastic dynamic process (sliding interval techniques) used in fat-tail distributions is time dependent.
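The rank-frequency (empirical CCDF) procedure that the abstract cautions about can be sketched as follows, using synthetic Student-t returns whose true tail exponent is 3. Restricting the regression to the far tail is an illustrative choice here, not the thesis's protocol; fitting the whole rank-frequency plot this way is exactly the kind of step the abstract flags as potentially spurious.

```python
import numpy as np

rng = np.random.default_rng(2)
# Student-t returns with 3 degrees of freedom: P(|r| > x) ~ x^-3 in the tail.
r = rng.standard_t(df=3, size=100_000)

# Rank-frequency (empirical CCDF) of absolute returns on a log-log scale.
x = np.sort(np.abs(r))[::-1]        # descending order statistics
rank = np.arange(1, x.size + 1)     # rank k corresponds to CCDF ~ k/n
ccdf = rank / x.size

# Fit the slope only over the far tail (top 0.5% of observations); the
# estimate should sit near the true exponent 3 for this synthetic sample.
k = x.size // 200
slope, _ = np.polyfit(np.log(x[:k]), np.log(ccdf[:k]), 1)
tail_exponent = -slope
print(round(tail_exponent, 2))
```

A maximum-likelihood (Hill-type) estimator is generally preferred over this log-log regression, which is one reason naive rank-frequency fits can mislead.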
APA, Harvard, Vancouver, ISO, and other styles
7

Koh, Jason S. H. "Comparison of the new "econophysics" approach to dealing with problems of financial to traditional econometric methods". View thesis, 2008. http://handle.uws.edu.au:8081/1959.7/38828.

Full text
Abstract
Thesis (Ph.D.)--University of Western Sydney, 2008.
Thesis submitted to fulfil the requirements for the degree of Doctor of Philosophy in the School of Economics and Finance, College of Business, University of Western Sydney. Includes bibliography.
APA, Harvard, Vancouver, ISO, and other styles
8

Kong, Chi-Wah. "Monte-Carlo simulation on a 2-D random point pattern : ising model and its application to econophysics /". View Abstract or Full-Text, 2002. http://library.ust.hk/cgi/db/thesis.pl?PHYS%202002%20KONG.

Full text
Abstract
Thesis (M. Phil.)--Hong Kong University of Science and Technology, 2002.
Includes bibliographical references (leaves 81-82). Also available in electronic version. Access restricted to campus users.
APA, Harvard, Vancouver, ISO, and other styles
9

Zeithamer, Tomáš. "Exaktní metody v obchodě (modelový přístup)". Doctoral thesis, Vysoká škola ekonomická v Praze, 2003. http://www.nusl.cz/ntk/nusl-77119.

Full text
Abstract
This thesis deals with quantum economics: methods of quantum mechanics are applied to the study of economic processes. The scalar abstract economic quantities are constructed as follows: the general abstract economic quantity F, the average abstract economic quantity FA, the marginal abstract economic quantity FM, the marginal average abstract economic quantity FMA, the average marginal abstract economic quantity FAM, and the elasticity of an abstract economic quantity EF. All the abstract economic quantities mentioned above are constructed as mappings. The general theory of abstract economic quantities is used in the construction of the abstract total gross profit TGP. A set of static models of total gross profit TGP is constructed for the case in which the first unit gross profit changes slowly with time while the second unit gross profit changes quickly in comparison with the first.
APA, Harvard, Vancouver, ISO, and other styles
10

Mandere, Edward Ondieki. "Financial Networks and Their Applications to the Stock Market". Bowling Green State University / OhioLINK, 2009. http://rave.ohiolink.edu/etdc/view?acc_num=bgsu1234473233.

Full text
APA, Harvard, Vancouver, ISO, and other styles
11

Favaro, Guilherme Martinatti. "\"Dinâmicas autoregressivas em econofísica\"". Universidade de São Paulo, 2007. http://www.teses.usp.br/teses/disponiveis/76/76131/tde-23032007-101512/.

Full text
Abstract
In this thesis, we give a brief introduction to Econophysics and discuss some important statistical quantities used in the study of a financial asset. These quantities are meticulously studied for the NYSE Composite Index. For its time series, we determine the time autocorrelation and the power spectrum, which show the presence of a short-range correlation. By means of the Hurst exponent, we investigate the kind of autocorrelation present and detect the presence of multifractality. The volatility of the NYSE Index shows a behavior analogous to a Wiener process. On the other hand, the probability density function was adjusted by a symmetric Lévy distribution with alpha = 1.47. We present the variance autoregressive ARCH and GARCH models. More specifically, we focus on the Markovian GARCH(1,1) model, which has three control parameters. We show that, for the NYSE Index, using the time autocorrelation to determine the set of control parameters is not the best choice. Instead, much more reasonable results are obtained with the standardized sixth moment, as can be seen from the fit of the time autocorrelation function. The proposal of the sixth moment is robust and applies to both the Gaussian and the Exponential GARCH models. We developed a series-expansion technique to obtain the standardized sixth moment as a function of the three control parameters, and we found an exact analytic expression for the kurtosis of the Exponential GARCH model. Both the Gaussian and the Exponential versions exhibit an equivalent performance in describing the probability density function and the time autocorrelation function. However, with respect to the time scaling laws (measured by the probability of return to the origin), the Exponential model shows, in a clear and unequivocal way, a better performance than the Gaussian model, since it gives a time-horizon exponent much closer to the real NYSE exponent.
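A minimal Gaussian GARCH(1,1) simulation makes the "three control parameters" concrete and shows two stylized facts the model reproduces (excess kurtosis and volatility clustering). The parameter values below are illustrative, not those fitted to the NYSE index in the thesis.

```python
import numpy as np

def simulate_garch11(omega, alpha, beta, n, seed=0):
    """Gaussian GARCH(1,1): var_t = omega + alpha*r_{t-1}^2 + beta*var_{t-1},
    r_t = sqrt(var_t) * z_t with z_t standard normal."""
    rng = np.random.default_rng(seed)
    r = np.empty(n)
    var = omega / (1.0 - alpha - beta)   # start at the unconditional variance
    for t in range(n):
        r[t] = np.sqrt(var) * rng.standard_normal()
        var = omega + alpha * r[t] ** 2 + beta * var
    return r

# The three control parameters (omega, alpha, beta), with alpha + beta < 1.
returns = simulate_garch11(omega=1e-5, alpha=0.1, beta=0.85, n=100_000)

# Stylized facts: kurtosis above the Gaussian value 3, and positive
# autocorrelation of absolute returns (volatility clustering).
kurtosis = np.mean(returns**4) / np.mean(returns**2) ** 2
acf_abs = np.corrcoef(np.abs(returns[:-1]), np.abs(returns[1:]))[0, 1]
print(kurtosis, acf_abs)
```

Matching higher standardized moments of such simulated series to the data is the flavor of calibration the abstract advocates with its sixth-moment proposal.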
APA, Harvard, Vancouver, ISO, and other styles
12

Šubrt, Jiří. "Analýza finančních dat metodami ekonofyziky". Master's thesis, Vysoká škola ekonomická v Praze, 2012. http://www.nusl.cz/ntk/nusl-113910.

Full text
Abstract
For the financial forecasting of crises, financial experts look for new concepts from disciplines outside economics. The branch of econophysics, which applies theories from the natural sciences, is significant among them. The aim of this work is to present one of the many methods applied to financial data, drawing on the theory of fluid turbulence and deterministic chaos. We provide a parallel analysis of high-frequency financial time series of a stock index and the velocities of a turbulent fluid. This work involves concepts from statistical mathematics, probability theory, and scaling. We find differences between the two studied systems, but the methodologies of the natural sciences can nonetheless be applied to financial data.
APA, Harvard, Vancouver, ISO, and other styles
13

Silva, Luís Eduardo Araripe Gomes da. "Efeitos coletivos em eleições e jogos". reponame:Repositório Institucional da UFC, 2007. http://www.repositorio.ufc.br/handle/riufc/12668.

Full text
Abstract
SILVA, Luís Eduardo Araripe Gomes da. Efeitos coletivos em eleições e jogos. 2007. 112 f. Tese (Doutorado em Física) - Programa de Pós-Graduação em Física, Departamento de Física, Centro de Ciências, Universidade Federal do Ceará, Fortaleza, 2007.
In this work we investigate some collective phenomena emerging from the interaction between individuals in a group or society. The approach is the same one used in statistical mechanics; basically, we use techniques and models that are well known in the study of critical phenomena and out-of-equilibrium statistical mechanics. First, we studied the collective behavior of a population in an electoral process. In particular, we analyzed the Brazilian elections from 1998 to 2006, examining the performance of candidates in a proportional election. We have shown that the vote distributions of the candidates follow a power law over two orders of magnitude, and that this is the same regardless of the year of the election or the region of the country. In order to verify the influence of the parties in the election, we also analyzed the results using an electoral coefficient and compared them to legislative elections in some European countries. We also studied majority elections, using a fragmentation model in an attempt to reproduce the results of mayoral elections in Brazilian cities. The model gives a good indication of strategic voting in this kind of election. This study of how people choose their candidates led us to analyze the emergence of patterns in groups of people playing games with adaptable agents. We therefore investigate the forecasting game, a game in which people make predictions about events. Furthermore, we applied the forecasting game to study ways to improve search tools on the internet. In both cases, we observed a rich scenario in which phase transitions happen depending on the fraction of agents using a given strategy.
APA, Harvard, Vancouver, ISO, and other styles
14

Kauper, Benjamin and Karl-Kuno Kunze. "Modellierung von Aktienkursen im Lichte der Komplexitätsforschung". Universität Potsdam, 2011. http://opus.kobv.de/ubp/volltexte/2011/5228/.

Full text
Abstract
This paper offers empirical evidence on the power of Sornette et al.'s [2001] model of bubbles and crashes for the German stock market between 1960 and 2009. We identify relevant time periods and describe them with the function given by Sornette et al.'s model. Our results show some evidence for predicting crashes from the logarithmic-periodic structures that are hidden in stock price trajectories. It was shown that for the DAX most of the relevant parameters determining the shape of the logarithmic-periodic structures lie in the expected interval reported by Sornette et al. Furthermore, the paper implicitly shows that the timing of former crashes can be predicted with the presented formula. We conclude that the concept of financial time series conceived as purely random objects should be generalised so as to admit complexity.
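The log-periodic power-law (LPPL) form used by Sornette et al. for the log-price before a crash at critical time t_c can be sketched as below. The parameter values are illustrative placeholders, not the DAX fits reported in the paper.

```python
import numpy as np

def lppl(t, tc, m, omega, A, B, C, phi):
    """Sornette-style log-periodic power law for the log-price before a
    crash at tc: ln p(t) = A + B*(tc-t)^m * (1 + C*cos(omega*ln(tc-t) + phi))."""
    dt = tc - t
    return A + B * dt**m * (1.0 + C * np.cos(omega * np.log(dt) + phi))

# Illustrative parameters (B < 0 gives the super-exponential run-up toward tc);
# a real application fits them to pre-crash price data.
t = np.linspace(0.0, 0.95, 200)   # time axis with the critical time at tc = 1
logp = lppl(t, tc=1.0, m=0.33, omega=6.3, A=7.0, B=-0.5, C=0.1, phi=0.0)
print(logp[0], logp[-1])
```

The faster-than-exponential trend decorated with accelerating oscillations near t_c is the signature the paper searches for in the DAX trajectories.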
APA, Harvard, Vancouver, ISO, and other styles
15

Kellermann, Gustavo Adolfo. "Aspectos estatísticos e dinâmicos do jogo do ultimato espacial e não espacial". reponame:Biblioteca Digital de Teses e Dissertações da UFRGS, 2008. http://hdl.handle.net/10183/15531.

Full text
Abstract
We explore the emergent behavior of a heterogeneous population of players negotiating via an ultimatum game: two players are offered a gift; one of them (the proposer) suggests how to divide the offer while the other player (the responder) can either agree or reject the deal. Rejection is detrimental to both players as it results in no earnings. In this context, the payoff and its moments are calculated from simple analytical methods, and several computer simulations corroborate the obtained results. We also analyze statistical fluctuations in the payoff distribution. In addition, we present a simple evolutionary approach that considers changes in strategies based on previous earnings. For this case, we show that the average permanence time (age) in a strategy of a fair population converges to a constant value when t approaches ∞, and that the average cutoff decays as a power law for large times after an initial deterministic slip. We have also observed transitions between high- and low-payoff behaviors. Additionally, we studied a spatial version of this model. For this we consider players interacting with their nearest neighbors in 2D lattices according to two different stochastic dynamics: (1) death and birth with selective sampling (MNAS), (2) Gibbs sampling on the neighborhood (GS). We believe that these results can bring important considerations to the design of simulations in the context of evolutionary game theory, in particular in the simulation of relevant features when modeling large populations.
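A minimal sketch of the ultimatum game with a heterogeneous population, assuming each player carries a fixed offer fraction (as proposer) and acceptance threshold (as responder). The random pairing rule and parameters are illustrative, not the dissertation's evolutionary or spatial (lattice) dynamics.

```python
import numpy as np

rng = np.random.default_rng(3)
n_players, n_rounds, gift = 1000, 200, 1.0

# Heterogeneous strategies: p = fraction offered when proposing,
# q = minimum acceptable fraction when responding.
p = rng.uniform(0, 1, n_players)
q = rng.uniform(0, 1, n_players)
payoff = np.zeros(n_players)

for _ in range(n_rounds):
    prop = rng.permutation(n_players)   # random proposer/responder pairing
    resp = rng.permutation(n_players)
    accepted = p[prop] >= q[resp]       # deal succeeds if offer meets threshold
    # Rejection yields nothing for either side, as in the ultimatum game.
    payoff[prop] += np.where(accepted, (1 - p[prop]) * gift, 0.0)
    payoff[resp] += np.where(accepted, p[prop] * gift, 0.0)

print(payoff.mean())
```

With uniform strategies roughly half the deals close; an evolutionary step (copying strategies by past earnings) on top of this loop is where the dissertation's age and cutoff dynamics would enter.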
APA, Harvard, Vancouver, ISO, and other styles
16

Fagerström, Sixten. "Behavioural Finance : The psychological impact and overconfidence in financial markets". Thesis, University of Skövde, School of Technology and Society, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-1326.

Full text
Abstract

Purpose

The main purpose of this paper is to investigate overconfidence and over-optimism in the market. This leads the reader to the question: are the analysts "right" in their forecasts? The reader will also come to understand the various, sometimes forgotten, factors that affect us human beings in our decision making when it comes to investing and analysing, known as behavioural finance theory.

Conclusion

According to the results of my tests, it seems that analysts of the S&P500 are affected by the problems of overconfidence and over-optimism bias. The analysis part of this study confirms the discussed theories of anchoring and herding: analysts tend to "follow the stream", as shown by evaluating the standard deviations between forecasts and realized outcomes, as well as the indexed analysts' consensus estimates of EPS over twenty-four months.

APA, Harvard, Vancouver, ISO, and other styles
17

Silva, Michel Alexandre da. "Livro de ofertas e dinâmica de preços: evidências a partir de dados da BOVESPA". Universidade de São Paulo, 2013. http://www.teses.usp.br/teses/disponiveis/100/100132/tde-04122013-114154/.

Full text
Abstract
This study has two aims: i) to analyze the stylized facts of the order book of stocks traded on the São Paulo Stock Exchange (BOVESPA), as well as of the returns engendered by the order book dynamics, and ii) to develop an order book agent-based model able to reproduce such stylized facts. Data from June 2006 to January 2009 were used, for a sample composed of the twenty most traded stocks on BOVESPA. The empirical results corroborated some stylized facts observed in stocks of other countries, but refuted others. The agent-based model successfully emulated the stylized facts concerning the returns; however, it was less effective in reproducing the stylized facts of the order book.
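Two of the stylized facts such order-book models are usually asked to reproduce, fat-tailed returns and volatility clustering, can be measured with elementary statistics. A minimal sketch on a GARCH-like toy series rather than BOVESPA data (all parameters illustrative):

```python
import random, math

def excess_kurtosis(xs):
    """Excess kurtosis > 0 indicates fatter-than-Gaussian tails."""
    n = len(xs)
    m = sum(xs) / n
    c = [x - m for x in xs]
    m2 = sum(v * v for v in c) / n
    m4 = sum(v ** 4 for v in c) / n
    return m4 / (m2 * m2) - 3.0

def acf1(xs):
    """Lag-1 autocorrelation; applied to |returns| it detects
    volatility clustering."""
    n = len(xs)
    m = sum(xs) / n
    c = [x - m for x in xs]
    return sum(c[i] * c[i - 1] for i in range(1, n)) / sum(v * v for v in c)

random.seed(1)
# GARCH(1,1)-style toy returns: today's variance feeds on yesterday's shock
n, omega, alpha, beta = 20000, 0.05, 0.1, 0.85
sig2, returns = 1.0, []
for _ in range(n):
    r = math.sqrt(sig2) * random.gauss(0.0, 1.0)
    returns.append(r)
    sig2 = omega + alpha * r * r + beta * sig2

print(excess_kurtosis(returns) > 0.0)          # fat tails
print(acf1([abs(r) for r in returns]) > 0.0)   # volatility clustering
```

An order-book model like the one in the entry above would be judged on whether its simulated returns pass diagnostics of this kind.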
18

Guilherme, Adriano Pereira. "Análise do índice Bovespa pelo método dos gráficos de recorrência". UNIVERSIDADE ESTADUAL DE PONTA GROSSA, 2008. http://tede2.uepg.br/jspui/handle/prefix/875.

Texto completo
Resumen
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior
Recurrence analysis has been extensively used in approaching problems that deal with transitions between regular and chaotic behaviour and with the identification of the structure of dynamical systems, for example frequencies and correlations hard to detect by linear methods. Among the main tools of this analysis are Recurrence Plots (RP) and Recurrence Quantification Analysis (RQA), which are constantly used in the analysis of time series supposedly proceeding from non-linear and even non-stationary dynamical systems. These tools have been applied to a wide range of phenomena, from the study of cardiac arrhythmia to large-scale natural phenomena such as sunspots. Recently, many economic and financial series have been investigated from this perspective, such as exchange rates, financial "crashes" and the behaviour of some stock indices. In this work we employ RP and RQA in the study of a long time series of the returns of the Bovespa Index (Ibovespa). We carefully studied how to obtain the parameters for the phase-space reconstruction of the supposed system that generated the time series, and we analysed the patterns formed in the RP as well as the values of the RQA quantities, comparing the results obtained with the original and randomized series. From these results, we seek to establish whether there is some sort of deterministic component in the studied system, and what its intensity is. Our investigations suggest that the real financial market dynamics is a combination of deterministic chaos and stochastic behaviour.
A análise da recorrência vem sendo muito usada na abordagem de problemas que tratam das transições entre comportamentos regulares e caóticos e na identificação da estrutura de sistemas dinâmicos, como frequências e correlações difíceis de detectar por métodos lineares, por exemplo. Dentre as principais ferramentas desta análise destacam-se os Gráficos de Recorrência (GR) e a Análise Quantitativa de Recorrência (AQR), que são constantemente empregadas na análise de séries temporais supostamente provenientes de sistemas dinâmicos não-lineares e até não-estacionários. Tais ferramentas vêm sendo aplicadas em uma grande gama de fenômenos, desde o estudo da arritmia cardíaca até os maiores fenômenos da natureza, como as manchas solares. Recentemente, muitas séries econômicas e financeiras estão sendo investigadas sob esta ótica, como as taxas de câmbio, os grandes "crashes" financeiros e o comportamento de alguns índices de ações. Neste trabalho nós empregamos os GR e a AQR para o estudo de uma longa série temporal dos retornos do índice Bovespa (Ibovespa), onde estudamos cuidadosamente a obtenção dos parâmetros para a reconstrução do espaço de fase do suposto sistema que gerou a série temporal, analisamos os padrões formados nos GR bem como os valores das quantidades da AQR, comparando os resultados obtidos com as séries originais e embaralhadas. Procuramos, a partir de tais resultados, estabelecer se existe algum tipo de componente determinística no sistema analisado, e qual sua intensidade. Nossas investigações sugeriram que a dinâmica real do mercado financeiro é uma combinação de caos determinístico e comportamento estocástico.
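The basic construction behind RP and RQA, a delay embedding followed by a thresholded distance matrix, is compact enough to sketch. This is a minimal illustration on a toy sine series, not the thesis' own implementation; the threshold and embedding parameters are arbitrary:

```python
import math

def embed(series, dim, delay):
    """Delay-coordinate embedding: state i = (s_i, s_{i+delay}, ...,
    s_{i+(dim-1)*delay})."""
    n = len(series) - (dim - 1) * delay
    return [[series[i + k * delay] for k in range(dim)] for i in range(n)]

def recurrence_matrix(points, eps):
    """R[i][j] = 1 if embedded states i and j lie within eps of each other."""
    def dist(a, b):
        return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))
    n = len(points)
    return [[1 if dist(points[i], points[j]) <= eps else 0 for j in range(n)]
            for i in range(n)]

def recurrence_rate(R):
    """Simplest RQA measure: fraction of recurrent points in the plot."""
    n = len(R)
    return sum(map(sum, R)) / (n * n)

# Toy periodic series: a clean sine wave yields a strongly structured plot
s = [math.sin(0.3 * t) for t in range(200)]
R = recurrence_matrix(embed(s, dim=2, delay=5), eps=0.2)
print(round(recurrence_rate(R), 3))
```

RQA measures such as determinism or laminarity are then computed from the diagonal and vertical line structures of `R`.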
19

Civitarese, Jamil Kehdi Pereira. "We're Chained: an analysis of systemic risk in finance". reponame:Repositório Institucional do FGV, 2015. http://hdl.handle.net/10438/15117.

Texto completo
Resumen
This dissertation presents two papers on how to use simple systemic risk measures to assess portfolio risk characteristics. The first paper deals with the Granger-causation of systemic risk indicators based on correlation matrices of stock returns. Special focus is devoted to the Eigenvalue Entropy, as some previous literature indicated strong results without considering different macroeconomic scenarios; the Index Cohesion Force and the Absorption Ratio are also considered. For the S&P 500, there is no evidence of Granger-causation from the Eigenvalue Entropies or the Index Cohesion Force. The Absorption Ratio Granger-caused both the S&P 500 and the VIX index, being the only simple measure that passed this test. The second paper develops this measure to capture the regimes underlying the American stock market. New indicators are built using filtering and random matrix theory. The returns of the S&P 500 are modelled as a mixture of normal distributions, with the activation of each normal distribution governed by a Markov chain whose transition probabilities are a function of the indicators. The model shows that a Herfindahl-Hirschman Index of the normalized eigenvalues provides the best fit to the returns from 1998 to 2013.
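Of the indicators above, the Absorption Ratio has the simplest definition: the fraction of total return variance absorbed by a fixed number of top eigenvectors of the correlation matrix. A minimal sketch, with the eigenvalues supplied by hand for a two-asset case rather than estimated from data:

```python
def absorption_ratio(eigenvalues, k):
    """Fraction of total variance explained by the k largest eigenvalues
    of a return correlation matrix."""
    lams = sorted(eigenvalues, reverse=True)
    return sum(lams[:k]) / sum(lams)

# For a 2-asset correlation matrix [[1, rho], [rho, 1]] the eigenvalues
# are 1 + rho and 1 - rho, so the ratio can be checked by hand.
rho = 0.8
eigs = [1 + rho, 1 - rho]
print(absorption_ratio(eigs, 1))  # 0.9: one factor absorbs 90% of variance
```

A rising ratio means variance is concentrating in few common factors, which is the tightening-of-the-market signal this literature associates with fragility.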
20

Brand, Rene. "An econophysical investigation : using the Boltzmann distribution to determine market temperature as applied to the JSE all share index". Thesis, Stellenbosch : University of Stellenbosch, 2009. http://hdl.handle.net/10019.1/879.

Texto completo
Resumen
Thesis (MBA (Business Management))--University of Stellenbosch, 2009.
ENGLISH ABSTRACT: Econophysics is a relatively new branch of physics. It entails the application of models from physics to economics. The distributions of financial time series are the aspect most intensely studied by physicists. This study is based on a study by Kleinert and Chen, who applied the Boltzmann distribution to stock exchange data to define a market temperature that may be used by investors as an indicator of an impending stock market crash. Most econophysicists have analysed the tail regions of the distributions, as the tails represent risk in financial data. This study's focus of analysis, on the other hand, is the characterisation of the central portion of the probability distribution. The Boltzmann distribution, a cornerstone of statistical physics, yields an exponential distribution. The objective of this study is to investigate the suitability of a market volatility forecasting method from econophysics, namely the Boltzmann/market-temperature method. As econometric benchmark the ARCH/GARCH method is used. Stock market indices are known to be non-normally (non-Gaussian) distributed. The distribution pattern of a stock market index of reasonably high sampling frequency (typically interday or intraday) is leptokurtic with heavy tails. Mesoscopic (interday) distributions of financial time series have been found to be exponential. If the empirical exponential distribution is interpreted as a Boltzmann distribution, a market temperature can be calculated from it. Empirical data for this study take the form of daily closing values of the Johannesburg Stock Exchange (JSE) All Share Index (ALSI) and the Standard & Poor 500 (S&P 500) index for the period 1995 through 2008. The Kleinert and Chen study made use of intraday data obtained from established markets.
This study differs from the Kleinert and Chen study in that it uses interday data obtained from an emerging market, namely the South African stock market. Neither of these two differences had a significant influence on the results of this study. The JSE ALSI log-return data display non-Gaussian properties, and the Laplace (double exponential) distribution fits the data well. A plot of the market temperature provided a clear indication of when stock market crashes occurred. Results of the econophysical (Boltzmann/market-temperature) method compared well with results of the econometric (ARCH/GARCH) method and, subject to certain improvements, the method can be utilised successfully. A leptokurtic, non-Gaussian nature was established for daily log-returns of the JSE ALSI and the S&P 500 index, and the Laplace (double exponential) distribution fits the annual log-returns of both indices well. As a result of the good Laplace fit, annual market temperatures could be calculated for the JSE ALSI and the S&P 500 index. The market temperature method was effective in identifying market crashes for both indices, but a limitation of the method is that only annual market temperatures can be determined. The availability of intraday stock index data should improve the interval for which market temperature can be determined.
AFRIKAANSE OPSOMMING: Ekonofisika is ‘n relatiewe nuwe studieveld. Dit behels die toepassing van fisiese modelle op finansiële data. Die waarskynlikheidsverdelings van finansiële tydreekse is die aspek wat meeste deur fisici bestudeer word. Hierdie studie is gebaseer op ‘n studie deur Kleinert en Chen. Hulle het die Boltzmann-verspreiding op ‘n aandele-indeks toegepas en ‘n mark-temperatuur bepaal. Hierdie mark-temperatuur kan deur ontleders gebruik word as waarskuwingsmeganisme teen moontlike aandelebeurs-ineenstortings. Die meeste fisici het die uiterste areas van die verspreidingskurwes geanaliseer omdat hierdie uiterste areas risiko in finansiële data verteenwoordig. Die analitiese fokus van hierdie studie, aan die ander kant, is die karakterisering van die sentrale areas van die waarskynlikheidsverdeling. Die Boltzmann-verspreiding, die hoeksteen van statistiese fisika, lewer ‘n eksponensiële waarskynlikheidsverdeling. Die doel van hierdie studie is om ‘n ondersoek te doen na die geskiktheid van die gebruik van ‘n ekonofisiese vooruitskattingsmetode, naamlik die Boltzmann/mark-temperatuur model. As ekonometriese verwysing is die “ARCH/GARCH” metode toegepas. Aandelemark-indekse is bekend vir die nie-Gaussiese verspreiding daarvan. Die verspreidingspatroon van ‘n aandelemark-indeks met ‘n redelike hoë steekproeffrekwensie (in die orde van ‘n dag of minder) is leptokurties met breë stert-dele. Mesoskopiese (interdag) verspreidings van finansiële tydreekse is getipeer as eksponensieel. Indien die empiriese eksponensiële verspreiding as ‘n Boltzmann-verspreiding geïnterpreteer word, kan ‘n mark-temperatuur daarvoor bereken word. Empiriese data vir die gebruik in hierdie studie is in die vorm van daaglikse sluitingswaardes van die Johannesburgse Effektebeurs (JSE) se Alle Aandele Indeks (ALSI) en die Standard en Poor 500 (S & P 500) indeks vir die periode 1995 tot en met 2008.
Die Kleinert en Chen studie het van intradag data vanuit ‘n ontwikkelde mark gebruik gemaak. Hierdie studie verskil egter van die Kleinert en Chen studie deurdat van interdag data vanuit ‘n opkomende mark, naamlik die Suid-Afrikaanse aandelemark, gebruik is. Nie een van die twee voorafgaande verskille het ‘n beduidende invloed op die resultate van hierdie studie gehad nie. Die JSE ALSI se logaritmiese opbrengsdata vertoon nie-Gaussiese eienskappe en die Laplace- (dubbel-eksponensiële) verspreiding beskryf die data goed. ‘n Grafiek van die mark-temperatuur vertoon duidelik wanneer aandelemark-ineenstortings plaasgevind het. Resultate van die ekonofisiese (Boltzmann/mark-temperatuur) metode vergelyk goed met resultate van die ekonometriese (“ARCH/GARCH”) metode en onderhewig aan sekere verbeteringe kan dit met sukses toegepas word. ‘n Leptokurtiese, nie-Gaussiese aard is vir daaglikse opbrengswaardes vir die JSE ALSI en die S & P 500 indeks vasgestel. ‘n Laplace- (dubbel-eksponensiële) verspreiding kan goed op die jaarlikse logaritmiese opbrengste van die JSE ALSI en die S & P 500 indeks toegepas word. As gevolg van die goeie aanwending van die Laplace-verspreiding kan ‘n jaarlikse mark-temperatuur vir die JSE ALSI en die S & P 500 indeks bereken word. Die mark-temperatuur metode is effektief in die identifisering van aandelemark-ineenstortings vir beide indekse, hoewel daar ‘n beperking is op die aantal mark-temperature wat bereken kan word. Die beskikbaarheid van intradag aandele-indekswaardes behoort die interval waarvoor mark-temperature bereken kan word te verbeter.
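The market-temperature construction used in this entry reduces to fitting an exponential (Laplace) law P(r) ∝ exp(-|r - μ|/T) to log-returns and reading off the scale parameter T. A minimal sketch on synthetic Laplace returns with a known temperature (the thesis works with actual JSE ALSI and S&P 500 data):

```python
import random, statistics

def market_temperature(log_returns):
    """If interday log-returns follow a Boltzmann/Laplace law
    P(r) ~ exp(-|r - mu| / T), the maximum-likelihood 'temperature' T
    is the mean absolute deviation around the median."""
    mu = statistics.median(log_returns)
    return sum(abs(r - mu) for r in log_returns) / len(log_returns)

# Synthetic Laplace-distributed returns with known temperature T = 0.01:
random.seed(7)
T_true = 0.01
sample = [random.choice((-1, 1)) * random.expovariate(1.0 / T_true)
          for _ in range(100000)]
T_hat = market_temperature(sample)
print(abs(T_hat - T_true) < 0.001)  # recovered close to the true value
```

In the Kleinert-Chen picture, a sharp rise of T estimated over successive windows is the warning signal associated with an overheating market.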
21

SANTOS, Alan de Andrade. "Simulação computacional e abordagem numérica para um modelo heterogêneo e adaptativo de distribuição de renda". Universidade Federal Rural de Pernambuco, 2016. http://www.tede2.ufrpe.br:8080/tede2/handle/tede2/6096.

Texto completo
Resumen
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - CAPES
A key feature of the study of the income distribution P(m) is to characterize the inequalities implied by microeconomic models based on mechanisms of exchange of goods and services. One way to quantify such inequalities is based on the Gini index 0 ≤ G ≤ 1, a parameter whose extremes correspond to maximum (G = 1) and minimum (G = 0) concentration of resources. Current studies indicate that the income distribution P(m) has two distinct regimes separated by a scale mc: the first, associated with a low-income regime (m ≤ mc), is described by a gamma distribution, and the second, related to a high-income regime (m > mc), is mathematically represented by a power-law function with a Pareto exponent between 1 and 3. In this work we introduce an adaptive heterogeneous model in order to describe quantitatively the relationship between the average expenditure rate of economic agents and the Gini index associated with the income distribution. In this approach a fraction p0 of the N economic agents do not modify their expenditure rates, a fraction p1 modify their consumption rate in a way positively correlated with their income, and a fraction p2 in a way negatively correlated. In order to obtain bounds for the income-distribution parameters we perform a numerical calculation using an entropy-maximization approach. We then investigate the impact of taxation on income inequality through a redistribution rate p. We conclude that the model in which adaptive agents with different expenditure-rate characteristics coexist provides results closer to real data, with Gini indexes and expenditure rates emerging as features of the dynamics. In the instantaneous-adaptation scenario the maximum Gini index [Gmax] is inversely proportional to the taxation rate p. Moreover, we can establish in the parameter space a limited region that corresponds to the real data, taken from the World Bank for 139 countries.
Um dos principais objetivos no estudo da distribuição de renda P(m) é a caracterização das desigualdades associadas aos mecanismos de interação propostos nos modelos microeconômicos. Uma forma de quantificar tais desigualdades é baseada no índice de Gini 0 ≤ G ≤ 1, um parâmetro que indica a máxima (G = 1) e a mínima (G = 0) concentração de recursos. Estudos recentes apontam que P(m) possui dois regimes distintos separados por uma escala mc. O primeiro, associado a pequenos valores de renda (m ≤ mc), é descrito por uma distribuição gama, e um segundo, relacionado ao regime de altas rendas (m > mc), é representado por uma lei de potência com um expoente de Pareto entre 1 e 3. Nesta dissertação introduzimos um modelo heterogêneo adaptativo a fim de descrever quantitativamente a relação entre a taxa de gasto média dos agentes econômicos e o índice de Gini associado à distribuição. Nesta abordagem uma fração p0 de todos os agentes N são incapazes de modificar sua taxa de gasto, uma fração p1 modifica de forma positivamente correlacionada com seu nível de recursos e uma última fração p2 de forma negativamente correlacionada. A fim de obter valores limitantes para os parâmetros associados à distribuição de renda realizamos um cálculo numérico utilizando uma abordagem de maximização da entropia. Em seguida investigamos o impacto da taxação sobre a desigualdade de renda através de uma taxa de redistribuição p. Concluímos que o modelo onde coexistem agentes adaptáveis com diferentes características para a taxa de gasto fornece resultados próximos àqueles observados em dados reais. Num cenário de adaptação instantânea o valor máximo do índice de Gini [Gmax] é inversamente proporcional à probabilidade de redistribuição. Por fim estabelecemos, no espaço de parâmetros, uma região limitada que corresponde aos dados reais extraídos do Banco Mundial para 139 países.
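The Gini index used throughout this entry has a short closed form on a sorted sample. A minimal sketch with toy income vectors:

```python
def gini(incomes):
    """Gini index G in [0, 1): 0 = perfect equality, ->1 = total
    concentration. Computed via the closed form on the sorted sample:
    G = 2 * sum_i i*x_(i) / (n * sum_i x_i) - (n + 1) / n."""
    xs = sorted(incomes)
    n = len(xs)
    total = sum(xs)
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return (2.0 * weighted) / (n * total) - (n + 1.0) / n

print(gini([1, 1, 1, 1]))             # 0.0: everyone equal
print(round(gini([0, 0, 0, 10]), 3))  # 0.75: one agent holds everything
```

In agent-based exchange models like the one above, this statistic is recomputed on the simulated wealth vector after the dynamics have relaxed.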
22

Nascimento, César Moura. "Análise multifractal e seções de Lévy de flutuações heterocedásticas". Universidade Federal de Alagoas, 2008. http://repositorio.ufal.br/handle/riufal/1012.

Texto completo
Resumen
An important problem in Physics concerns the study of stochastic processes and fluctuations away from the mean of dynamical variables. In a wide range of systems, some of the observed variables have a macroscopic quality, in the sense that they represent averages or sums over time or space of "microscopic" quantities. When long-range memory or correlation effects do not play a significant role, then the necessary and sufficient conditions for the Central Limit Theorem to hold can become satisfied. Quite often, the second moments of the studied dynamical variable do not diverge, hence in many important instances, the fluctuations of many systems follow Gaussian statistics. On the other hand, complex systems generate some variabilities that often deviate them from Gaussian statistics. Here, we focus on two properties related to Gaussian fluctuations: (i) monofractality and (ii) homoscedasticity. Specifically, we first address the general question about the nature of the relationship between multifractality and heteroscedasticity. We applied multifractal detrended fluctuation analysis to a nonstationary high frequency financial time series obtained from currency markets. As a second test, we applied the technique to the audio time series of Beethoven's fifth symphony. We obtained results suggesting that heteroscedasticity can cause or increase multifractality. We also investigate in greater detail the convergence to the homoskedastic and monofractal Gaussian regime, using the mathematical formalism of Lévy sections, as previously applied to time series. We report several conclusions related to these questions and discuss the generality of these results in the context of the physics of complex systems.
Conselho Nacional de Desenvolvimento Científico e Tecnológico
Um importante problema em Física está relacionado ao estudo de processos estocásticos e flutuações de variáveis dinâmicas. Em uma variedade de sistemas, algumas das variáveis observadas têm uma qualidade macroscópica, no sentido de que elas representam a média ou a soma sobre o espaço ou tempo de quantidades microscópicas. Quando efeitos de memória de longo alcance ou correlação não desempenharem um papel significativo, então as condições necessárias e suficientes para a validade do Teorema do Limite Central podem ser satisfeitas. Frequentemente o segundo momento da variável em questão não diverge. Consequentemente em muitos exemplos importantes, as flutuações de muitos sistemas seguem uma estatística Gaussiana. Em contraste, sistemas complexos geram flutuações que muitas vezes os desviam da estatística Gaussiana. Aqui, nós focamos em duas propriedades relacionadas a flutuações Gaussianas: (i) monofractalidade e (ii) homocedasticidade. Especificamente, discutimos primeiro a questão geral sobre a natureza da relação entre multifractalidade e heterocedasticidade. Aplicamos a multifractal detrended fluctuation analysis a uma série temporal financeira não estacionária e de alta frequência referente à taxa cambial. Como um segundo teste, aplicamos a mesma técnica de análise para a série de áudio da quinta sinfonia de Beethoven. Obtivemos resultados que indicam que a heterocedasticidade pode causar ou aumentar a multifractalidade. Também investigamos em detalhes a convergência para o regime homocedástico e monofractal Gaussiano usando o método matemático de seções de Lévy, como previamente aplicado a séries temporais. Apresentamos conclusões relacionadas a estes questionamentos e discutimos a generalidade destes resultados no contexto da Física de sistemas complexos.
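The fluctuation-analysis machinery mentioned above can be sketched in its simplest monofractal DFA form (the thesis uses the full multifractal MF-DFA variant); the scales and the white-noise test series below are illustrative only:

```python
import random

def dfa_fluctuation(series, scale):
    """One step of detrended fluctuation analysis: integrate the series,
    split the profile into windows of length `scale`, remove a linear
    trend from each window, and return the RMS residual F(scale)."""
    mean = sum(series) / len(series)
    profile, acc = [], 0.0
    for x in series:
        acc += x - mean
        profile.append(acc)
    n_win = len(profile) // scale
    sq = 0.0
    for w in range(n_win):
        seg = profile[w * scale:(w + 1) * scale]
        # linear least-squares trend within the window
        t = list(range(scale))
        tm = sum(t) / scale
        ym = sum(seg) / scale
        beta = (sum((ti - tm) * (yi - ym) for ti, yi in zip(t, seg))
                / sum((ti - tm) ** 2 for ti in t))
        alpha = ym - beta * tm
        sq += sum((yi - (alpha + beta * ti)) ** 2 for ti, yi in zip(t, seg))
    return (sq / (n_win * scale)) ** 0.5

random.seed(3)
noise = [random.gauss(0, 1) for _ in range(4096)]
# For uncorrelated noise F(s) ~ s^0.5, so quadrupling s roughly doubles F
f1, f2 = dfa_fluctuation(noise, 16), dfa_fluctuation(noise, 64)
print(1.5 < f2 / f1 < 2.7)
```

In MF-DFA the residuals are additionally raised to a range of moments q, and a q-dependent scaling exponent signals multifractality.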
23

CERINA, FEDERICA. "Statistical physics of network communities in economic systems". Doctoral thesis, Università degli Studi di Cagliari, 2015. http://hdl.handle.net/11584/266790.

Texto completo
Resumen
In the last decade, the study of big networked systems has received a great deal of attention thanks to the increased availability of large datasets and the technology to analyze them. To unravel regularities and behaviours from this enormous quantity of data and supply suitable models, we need appropriate tools, one of them being community detection. Finding meaningful communities in a network is still a difficult task, but it is essential to unveil functional relations between the parts. The research presented here focuses on community detection; in particular, we considered cases where the spatial component is relevant or intrinsic. Nowadays, many systems represented as complex networks are affected, more or less naturally, by geographical distance, location and organization. This holds true even for economic events: it has been shown that trade and exchanges between countries are constrained by geographical proximity or impeded by natural obstacles. Still, community detection alone is not sufficient to describe the whole picture, since it gives no information about the internal structure of a community. Therefore we developed a novel core detection method, the natural counterpart of the community detection algorithm and meant to be performed alongside it, which is at the same time simple and powerful. We aim to apply community detection and core detection methodologies to the analysis of the global market and its functioning, in order to understand the origin of economic turmoil and critical events.
In this work we analyze different economic systems from a complex network perspective and find some interesting results: we study patent data in order to measure the internationalization of European countries and assess the effectiveness of EU policies; we examine the dynamics of network effects on the performance of individual countries and trade relationships in the International Trade Network; and we represent World Input-Output data as an interdependent complex network and study its properties, showing evidence of the crisis. Thanks to both community and core detection, we gain deeper insight into the inner workings of community formation: we can identify the leading members of a group and reveal influence basins, unknown otherwise.
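The thesis' own community and core detection methods are not reproduced here; as a generic illustration, community assignments in this literature are commonly scored by Newman-Girvan modularity, which can be computed directly:

```python
def modularity(adj, communities):
    """Newman-Girvan modularity Q for an undirected graph given as an
    adjacency matrix and a community label per node: the fraction of
    edges inside communities minus the fraction expected at random."""
    n = len(adj)
    two_m = sum(sum(row) for row in adj)  # sum of degrees = 2|E|
    deg = [sum(row) for row in adj]
    q = 0.0
    for i in range(n):
        for j in range(n):
            if communities[i] == communities[j]:
                q += adj[i][j] - deg[i] * deg[j] / two_m
    return q / two_m

# Two disconnected triangles split into their natural communities:
tri = [[0, 1, 1, 0, 0, 0], [1, 0, 1, 0, 0, 0], [1, 1, 0, 0, 0, 0],
       [0, 0, 0, 0, 1, 1], [0, 0, 0, 1, 0, 1], [0, 0, 0, 1, 1, 0]]
print(round(modularity(tri, [0, 0, 0, 1, 1, 1]), 6))  # 0.5
```

Community-detection algorithms search over label assignments to maximize Q; core detection then asks which nodes dominate within each discovered block.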
24

Angus, Simon Douglas Economics Australian School of Business UNSW. "Economic networks: communication, cooperation & complexity". Awarded by:University of New South Wales. Economics, 2007. http://handle.unsw.edu.au/1959.4/27005.

Texto completo
Resumen
This thesis is concerned with the analysis of economic network formation. There are three novel sections to this thesis (Chapters 5, 6 and 8). In the first, the non-cooperative communication network formation model of Bala and Goyal (2000) (BG) is re-assessed under conditions of no inertia. It is found that the Strict Nash circle (or wheel) structure is still the equilibrium outcome for n = 3 under no inertia. However, a counter-example for n = 4 shows that with no inertia infinite cycles are possible, and hence the system does not converge. In fact, cycles are found to quickly dominate outcomes for n > 4 and further numerical simulations of conditions approximating no inertia (probability of updating > 0.8 to 1) indicate that cycles account for a dramatic slowing of convergence times. These results, together with the experimental evidence of Falk and Kosfeld (2003) (FK) motivate the second contribution of this thesis. A novel artificial agent model is constructed that allows for a vast strategy space (including the Best Response) and permits agents to learn from each other as was indicated by the FK results. After calibration, this model replicates many of the FK experimental results and finds that an externality exploiting ratio of benefits and costs (rather than the difference) combined with a simple altruism score is a good proxy for the human objective function. Furthermore, the inequity aversion results of FK are found to arise as an emergent property of the system. The third novel section of this thesis turns to the nature of network formation in a trust-based context. A modified Iterated Prisoners' Dilemma (IPD) model is developed which enables agents to play an additional and costly network forming action. Initially, canonical analytical results are obtained despite this modification under uniform (non-local) interactions. However, as agent network decisions are 'turned on' persistent cooperation is observed. 
Furthermore, in contrast to the vast majority of non-local or static network models in the literature, it is found that aperiodic, complex dynamics result for the system in the long run. Subsequent analysis of this regime indicates that the network dynamics have fingerprints of self-organized criticality (SOC). Whilst evidence for SOC is found in many physical systems, such dynamics have seldom, if ever, been reported in the strategic-interaction literature.
25

Gustavsson, Marcus y Daniel Levén. "The Predictability of Speculative Bubbles : An examination of the log-periodic power law model". Thesis, Linköpings universitet, Nationalekonomi, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-120378.

Texto completo
Resumen
In this thesis we examine the ability of the log-periodic power law (LPPL) model to accurately predict the end of speculative bubbles on financial markets through modeling of asset price dynamics on a selection of historical bubbles. The methods we use are based on a nonlinear least squares estimation which yields predictions of when the bubble will change regime. We find evidence which supports the occurrence of LPPL patterns leading up to the change in regime: asset prices during bubble periods seem to oscillate around a faster-than-exponential growth. In most cases the estimation yields accurate predictions, although we conclude that the predictions are quite dependent on the point in time at which the prediction is conducted. We also find that the end of a speculative bubble seems to be influenced by both endogenous speculative growth and exogenous factors. For this reason we propose a new way of interpreting the predictions of the model, where the end dates should be interpreted as the start of a time period in which asset prices are especially sensitive to exogenous events. We propose that negative news during this time period results in a regime shift of the bubble. This study is the first to address both the possibilities and the limitations of the LPPL model, and should therefore be considered a contribution to academia.
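The LPPL specification being fitted can be written down directly. The sketch below only evaluates the function with hypothetical parameter values to show the faster-than-exponential growth decorated by log-periodic oscillations; the thesis estimates the parameters by nonlinear least squares:

```python
import math

def lppl(t, tc, m, omega, A, B, C, phi):
    """Log-periodic power law: expected log-price before a critical time tc.
    ln p(t) = A + B*(tc-t)^m + C*(tc-t)^m * cos(omega*ln(tc-t) + phi)."""
    dt = tc - t
    return A + B * dt ** m + C * dt ** m * math.cos(omega * math.log(dt) + phi)

# Toy bubble: super-exponential growth (B < 0, 0 < m < 1) with
# log-periodic oscillations approaching a hypothetical critical time tc.
params = dict(tc=100.0, m=0.5, omega=8.0, A=5.0, B=-0.5, C=0.05, phi=0.0)
prices = [math.exp(lppl(t, **params)) for t in range(0, 95)]
print(prices[-1] > prices[0])  # True: price accelerates toward tc
```

Fitting recovers tc, the predicted regime change, which the thesis reinterprets as the start of a window of heightened sensitivity to exogenous news.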
26

Barrientos, Jesús Emeterio Navarro. "Adaptive investment strategies for different scenarios". Doctoral thesis, Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftliche Fakultät II, 2010. http://dx.doi.org/10.18452/16168.

Texto completo
Resumen
Die folgende Arbeit befasst sich mit der Untersuchung von Problemen der Optimierung von Ressourcen in Umgebungen mit unvorhersehbarem Verhalten, wo: (i) nicht alle Informationen verfügbar sind, und (ii) die Umgebung unbekannte zeitliche Veränderungen aufweist. Diese Dissertation ist folgendermaßen gegliedert: Teil I stellt das Investitionsmodell vor. Es wird sowohl eine analytische als auch eine numerische Analyse der Dynamik dieses Modells für feste Investitionsstrategien in verschiedenen zufälligen Umgebungen vorgestellt. In diesem Investitionsmodell hängt die Dynamik des Budgets des Agenten x(t) von der Zufälligkeit der exogenen Rendite r(t) ab, wofür verschiedene Annahmen diskutiert wurden. Die Heavy-tailed-Verteilung des Budgets wurde numerisch untersucht und mit theoretischen Vorhersagen verglichen. In Teil II wurde ein Investitionsszenario mit stilisierten exogenen Renditen untersucht, das durch eine periodische Funktion mit verschiedenen Arten und Stärken von Rauschen charakterisiert ist. In diesem Szenario wurden unterschiedliche Strategien, Agenten-Verhalten und Agenten-Fähigkeiten zur Vorhersage der zukünftigen r(t) untersucht. Hier wurden Null-Intelligenz-Agenten, die über technische Analysen verfügen, mit Agenten verglichen, die über genetische Algorithmen verfügen. Umfangreiche Ergebnisse von Computersimulationen wurden präsentiert, in denen nachgewiesen wurde, dass für exogene Renditen mit Periodizität: (i) das wagemutige Verhalten das vorsichtige überbietet, und (ii) die genetischen Algorithmen in der Lage sind, die optimalen Investitionsstrategien zu finden und deshalb die anderen Strategien zu überbieten. Obwohl der Schwerpunkt dieser Dissertation im Zusammenhang mit dem Gebiet der Informatik präsentiert wurde, können die hier vorgestellten Ergebnisse auch in Szenarien angewendet werden, in denen der Agent andere Arten von Ressourcen steuern muss, wie z.B. Energie, Zeitverbrauch, erwartete Lebensdauer usw.
The main goal of this PhD thesis is to investigate some of the problems related to optimization of resources in environments with unpredictable behavior where: (i) not all information is available and (ii) the environment presents unknown temporal changes. The investigations in this PhD thesis are divided in two parts. Part I presents the investment model and some analytical as well as numerical analysis of the dynamics of this model for fixed investment strategies in different random environments. In this investment model, the dynamics of the investor's budget x(t) depend on the stochasticity of the exogenous return on investment r(t), for which different model assumptions are discussed. The fat-tail distribution of the budget is investigated numerically and compared with theoretical predictions. Part II investigates an investment scenario with stylized exogenous returns characterized by a periodic function with different types and levels of noise. In this scenario, different strategies, agents' behaviors and agents' capacities to predict the future r(t) are investigated. Here, 'zero-intelligent' agents using technical analysis (such as moving least squares) are compared with agents using genetic algorithms to predict r(t). Results are presented for extensive computer simulations, which show that for exogenous returns with periodicity: (i) the daring behavior outperforms the cautious behavior and (ii) the genetic algorithm is able to find the optimal investment strategy by itself, thus outperforming the other strategies considered. Finally, the investment model is extended to include the formation of common investment projects between agents. Although the main focus of this PhD thesis is more related to the area of computer science, the results presented here can also be applied to scenarios where the agent has to control other kinds of resources, such as energy, time consumption, expected life time, etc.
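The fixed-strategy budget dynamics described above can be sketched in the general multiplicative form x(t+1) = x(t)(1 + q·r(t)), where q is the fraction of the budget exposed to the stochastic return r(t). The parameters below are illustrative, and the thesis' exact model may include additional terms:

```python
import random

def simulate_budget(x0, q, returns):
    """Budget path for a fixed investment strategy: at each step the agent
    exposes a fraction q of the current budget to the stochastic return.
    (Sketch of the general multiplicative form only.)"""
    x = x0
    path = [x]
    for r in returns:
        x = x * (1.0 + q * r)
        path.append(x)
    return path

random.seed(42)
rets = [random.gauss(0.005, 0.05) for _ in range(1000)]
cautious = simulate_budget(100.0, 0.1, rets)[-1]   # small exposure
daring = simulate_budget(100.0, 0.9, rets)[-1]     # large exposure
print(cautious > 0 and daring > 0)
```

Because the dynamics are multiplicative, repeated simulations of this kind produce the heavy-tailed budget distributions analyzed in Part I.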
Los estilos APA, Harvard, Vancouver, ISO, etc.
27

Filiasi, Mario. "Applications of Large Deviations Theory and Statistical Inference to Financial Time Series". Doctoral thesis, Università degli studi di Trieste, 2015. http://hdl.handle.net/10077/10940.

Texto completo
Resumen
2013/2014
The correct evaluation of financial risk is one of the most active domains of financial research, and has become even more relevant after the latest financial crisis. Recent developments in econophysics show that the dynamics of financial markets can be successfully investigated by means of models borrowed from statistical physics. The fluctuations of stock prices are continuously recorded at very high frequencies (up to 1 ms), and this generates a huge amount of data which can be statistically analysed in order to validate and calibrate theoretical models. The present work moves in this direction, and is the result of a close interaction between the Physics Department of the University of Trieste and List S.p.A., in collaboration with the International Centre for Theoretical Physics (ICTP). In this work we analyse the time series, over the last two years, of the prices of the 20 most traded stocks on the Italian market. We investigate the statistical properties of price returns and verify some stylized facts about stock prices. Price returns are distributed according to a heavy-tailed distribution and therefore, according to Large Deviations Theory, they are frequently subject to extreme events which produce abrupt price jumps. We refer to this phenomenon as the condensation of the large deviations. We investigate condensation phenomena within the framework of statistical physics and show the emergence of a phase transition in heavy-tailed distributions. In addition, we empirically analyse condensation phenomena in stock prices: we show that extreme returns are generated by non-trivial price fluctuations, which reduce the effects of sharp price jumps but amplify the diffusive movements of prices. Moving beyond the statistical analysis of single-stock prices, we investigate the structure of the market as a whole. In the financial literature it is often assumed that price changes are due to exogenous events, e.g. the release of economic and political news. Yet it is reasonable to suppose that stock prices could also be driven by endogenous events, such as the price changes of related financial instruments. The large amount of available data allows us to test this hypothesis and to investigate the structure of the market by means of statistical inference. In this work we propose a market model based on interacting prices: we study an integrate-and-fire model, inspired by the dynamics of neural networks, where each stock price depends on the other stock prices through a threshold-passing mechanism. Using a maximum likelihood algorithm, we apply the model to the empirical data and try to infer the information network that underlies the financial market.
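The "condensation of large deviations" idea, namely that a large sum of heavy-tailed random variables is typically dominated by its single largest term, can be demonstrated with a short simulation (distributions and parameters are illustrative assumptions, not taken from the thesis):

```python
import random

def max_share(samples):
    """Fraction of the sum carried by the single largest term."""
    return max(samples) / sum(samples)

rng = random.Random(0)
n, trials = 200, 300

# Heavy-tailed terms (Pareto, alpha = 1.5): the sum is typically
# dominated ("condensed") by its single largest term.
heavy_share = sum(
    max_share([rng.paretovariate(1.5) for _ in range(n)])
    for _ in range(trials)) / trials

# Light-tailed terms (exponential): the sum is shared democratically
# and the largest term carries only a logarithmically small fraction.
light_share = sum(
    max_share([rng.expovariate(1.0) for _ in range(n)])
    for _ in range(trials)) / trials
```

The heavy-tailed ensemble concentrates a far larger share of the sum in its extreme term, which is the mechanism behind abrupt jumps in heavy-tailed returns.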
XXVII Ciclo
28

Mohti, Wahbeeah. "Essays on frontier markets: financial integration, financial market efficiency, financial contagion". Doctoral thesis, Universidade de Évora, 2019. http://hdl.handle.net/10174/24579.

Texto completo
Resumen
This thesis investigates financial integration, market efficiency, and financial contagion in frontier markets in order to evaluate their potential for portfolio diversification. The first essay evaluates the regional and global integration of Asian frontier and emerging equity markets using Gregory and Hansen cointegration tests and detrended cross-correlation analysis (DCCA). The results suggest that Asian emerging markets show some evidence of integration with both regional and global markets. Among Asian frontier markets, Pakistan is the only one with evidence of integration with both benchmarks. The second essay appraises the weak-form efficiency of frontier markets by investigating global correlation and long-range dependence, applying mutual information and detrended fluctuation analysis (DFA). The results indicate that Slovenia is the only case with evidence compatible with weak-form efficiency. The third essay investigates contagion from the US subprime financial crisis to frontier stock markets, using copula models to examine the dependence structures between the US and frontier stock markets before and during the crisis. The results show that Croatia and Romania are the markets most affected by the US subprime crisis. Subsequently, the fourth essay investigates contagion from both recent crises, the US subprime financial crisis and the European debt crisis, to frontier stock markets, applying DCCA correlation coefficients to the linkage between the stock markets of the crisis-originating countries (US and Greece) and those of frontier markets, to assess whether the correlation coefficients increase significantly with the crises. The results indicate that, for the US subprime crisis, European frontier markets are the most affected, followed by Middle Eastern markets.
For the European debt crisis (which originated in Greece), the findings show that the contagion effect is weaker in frontier markets.
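The DCCA correlation coefficient used in the second and fourth essays can be sketched in pure Python as follows. This is a generic textbook-style implementation on synthetic data, not the author's code; the box size and the construction of the two series are arbitrary choices:

```python
import random

def _profile(x):
    """Cumulative sum of the demeaned series (the 'profile')."""
    mean = sum(x) / len(x)
    out, s = [], 0.0
    for v in x:
        s += v - mean
        out.append(s)
    return out

def _detrend(seg):
    """Residuals of an ordinary least-squares line fitted to seg."""
    n = len(seg)
    t = list(range(n))
    tbar, ybar = sum(t) / n, sum(seg) / n
    num = sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, seg))
    den = sum((ti - tbar) ** 2 for ti in t)
    slope = num / den
    return [yi - (ybar + slope * (ti - tbar)) for ti, yi in zip(t, seg)]

def dcca_coefficient(x, y, box):
    """rho_DCCA = F2_xy / (F_x * F_y) over non-overlapping boxes."""
    px, py = _profile(x), _profile(y)
    fxx = fyy = fxy = 0.0
    for start in range(0, len(px) - box + 1, box):
        rx = _detrend(px[start:start + box])
        ry = _detrend(py[start:start + box])
        fxx += sum(v * v for v in rx)
        fyy += sum(v * v for v in ry)
        fxy += sum(a * b for a, b in zip(rx, ry))
    return fxy / (fxx * fyy) ** 0.5

rng = random.Random(42)
common = [rng.gauss(0, 1) for _ in range(1000)]
x = [c + 0.3 * rng.gauss(0, 1) for c in common]  # two series sharing
y = [c + 0.3 * rng.gauss(0, 1) for c in common]  # a common driver
rho = dcca_coefficient(x, y, box=20)
```

For two series driven by a common factor, rho lands close to their ordinary correlation; values near zero indicate no detrended cross-correlation at that scale.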
29

Hodek, Jakub [Verfasser], Johannes [Akademischer Betreuer] Schneider y Ulrich [Akademischer Betreuer] Küsters. "Wavelets im Rahmen der Econophysik : eine naturwissenschaftlich geprägte Analyse von Finanzmärkten / Jakub Hodek. Betreuer: Johannes Schneider ; Ulrich Küsters". Eichstätt-Ingolstadt : Katholische Universität Eichstätt-Ingolstadt, 2015. http://d-nb.info/1074192613/34.

Texto completo
30

Pereira, Marcelo Alves. "Dilema do prisioneiro contínuo com agentes racionais e classificadores de cooperação". Universidade de São Paulo, 2012. http://www.teses.usp.br/teses/disponiveis/59/59135/tde-08012013-222525/.

Texto completo
Resumen
Prisoner's dilemma (PD) is one of the main games of game theory. In the discrete prisoner's dilemma (DPD), two prisoners have the options to cooperate or to defect. A cooperating player does not betray his accomplice, while a defector does. If one player cooperates and the other defects, the cooperator is jailed for five years and the defector goes free. If both cooperate, they are jailed for one year, and if both defect, they are jailed for three years. When this game is repeated, cooperation may emerge among selfish individuals. We perform an analytical study of the DPD, which produced a formulation for the evolution of the mean cooperation level and for the critical temptation values (temptation values that promote abrupt modifications in the cooperation level). In the continuous prisoner's dilemma (CPD), each player has a level of cooperation that defines his/her degree of cooperation. We used the CPD to study the effect of the players' personality on the emergence of cooperation. For this, we propose new strategies: one based on the players' personality and two others based on the comparison between the player's obtained payoff and the desired one. All strategies present some mechanism that copies the state of the neighbor with the highest payoff in the neighborhood, a mechanism inherited from the Darwinian strategy. The results showed that the CPD increases the average cooperation level of the system when compared to the DPD. However, the different strategies did not increase cooperation compared to the cooperation obtained with the Darwinian strategy. We therefore propose the use of the clustering coefficient, the Gini coefficient, and the Shannon, Tsallis and Kullback-Leibler entropies as classifiers of the cooperation level in systems in which the individuals play the DPD with the Darwinian strategy. As configurational averages were analyzed, these classifiers were not efficient in classifying the systems. This is due to the existence of distributions with extreme values among the results that compose the means. These extreme-value distributions prompted a discussion about the definition of the cooperation state in the prisoner's dilemma. We also discuss the consequences of using only average results in the analysis, ignoring their deviations and distributions.
31

Allez, Romain. "Chaos multiplicatif Gaussien, matrices aléatoires et applications". Phd thesis, Université Paris Dauphine - Paris IX, 2012. http://tel.archives-ouvertes.fr/tel-00780270.

Texto completo
Resumen
In this work, we were interested on the one hand in the theory of Gaussian multiplicative chaos introduced by Kahane in 1985, and on the other hand in random matrix theory, whose pioneers are Wigner, Wishart and Dyson. The first part of this manuscript contains a brief introduction to these two theories together with a short account of the personal contributions of this manuscript. The following parts contain the texts of the published articles [1], [2], [3], [4], [5] and the preprints [6], [7], [8] on these results, in which the reader will find more detailed developments.
32

Pereira, Marcelo Alves. "Dilema do prisioneiro evolucionário Darwiniano e Pavloviano no autômato celular unidimensional: uma nova representação e exploração exaustiva do espaço de parâmetros". Universidade de São Paulo, 2008. http://www.teses.usp.br/teses/disponiveis/59/59135/tde-12052008-122340/.

Texto completo
Resumen
The Prisoner's Dilemma (PD) is the most prominent game in Game Theory due to the emergence of cooperation between selfish players. The behavior of each player depends on his/her strategy and on the payoff, which is determined by the PD parameters (T, R, P and S) and by the number z of neighbors with whom he/she plays. Therefore, the spatial structure of the players does not matter. In our work, we used a one-dimensional cellular automaton where each player can cooperate or defect when interacting, symmetrically, with his/her z nearest neighbors. The proposed system allowed us to carry out an exhaustive exploration of the parameter space for the Darwinian (EED) and Pavlovian (EEP) evolutionary strategies and to compare them. The one-dimensional geometry yields the same results as systems on arbitrary d-dimensional networks and presents several advantages over them: boundary effects are smaller, numerical simulations require less run time, the value of z is easy to vary, and a visual representation of the system's temporal evolution is easy to obtain. Such visualization simplifies the understanding of the interactions between the players, since patterns appear in the clusters of cooperators/defectors, and these patterns belong to the elementary cellular automata classes. The study of these patterns makes the emergence of cooperation or defection in these systems easy to understand. The temporal evolution of the system that adopts the EED yields a very rich phase diagram with cooperative, defective and chaotic phases. For the EEP, on the other hand, we obtained a new analytical result for the phase transitions, which in this case separate quasi-regular and cooperative phases. The exhaustive numerical exploration determines the regions of the parameter space where each phase occurs and the effects of self-interaction, thus validating the theoretical results. The study of the particular case T = 1, traditionally considered trivial, showed that it presents unusual behaviors. Our main contribution to the study of the PD is a new paradigm: the one-dimensional geometry with symmetric neighbor interactions allowed the visualization of the evolution of cooperator and defector patterns, the analytical calculation of Tc for the EEP, and the study of T = 1 in such systems.
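A minimal version of the one-dimensional cellular automaton with the Darwinian update rule might look like the sketch below. The payoff values, the neighborhood size z = 2, and the initial cooperator fraction are illustrative assumptions, not the parameter ranges explored in the thesis:

```python
import random

# Illustrative payoffs with T > R > P = S (a weak-dilemma variant).
T, R, P, S = 1.4, 1.0, 0.0, 0.0

def payoff(me, other):
    """Payoff for `me` against `other` (True = cooperate)."""
    if me and other:
        return R
    if me and not other:
        return S
    if not me and other:
        return T
    return P

def step(states):
    """One synchronous update of the ring with z = 2 neighbors."""
    n = len(states)
    pay = [sum(payoff(states[i], states[(i + d) % n]) for d in (-1, 1))
           for i in range(n)]
    new = []
    for i in range(n):
        # Darwinian rule: copy the state of the highest-earning site
        # in the neighborhood {i-1, i, i+1}.
        best = max([(i - 1) % n, i, (i + 1) % n], key=lambda j: pay[j])
        new.append(states[best])
    return new

rng = random.Random(3)
states = [rng.random() < 0.9 for _ in range(200)]  # 90% initial cooperators
for _ in range(50):
    states = step(states)
coop_level = sum(states) / len(states)
```

Printing the ring at each step (e.g. "C" for True, "D" for False) reproduces the space-time patterns of cooperator/defector clusters that the abstract describes.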
33

余智偉. "Switching dynamics in Econophysics". Thesis, 2012. http://ndltd.ncl.edu.tw/handle/42634693375758440339.

Texto completo
Resumen
Master's thesis
National Kaohsiung Normal University
Department of Physics
100
In 2011, T. Preis, J. Schneider, and H. E. Stanley (Proc. Natl. Acad. Sci. USA 108, 7674-7678 (2011)) proposed a renormalized-time method to analyze the switching dynamics of the German DAX future. In this thesis, we use their method to study the switching dynamics of the Taiwan TX index future, based on 37,252,200 transaction records ranging from 1 January 2007 to 31 December 2010. In studying the switching phenomenon, we analyze the distributions of (1) volatility, (2) volume, and (3) intra-event time interval, and find three power-law scaling relationships. We compare our results with those of Stanley's group. Finally, we analyze the switching dynamics exhibited in the daily historical data of the S&P 500 index, the Taiwan Weighted index, and the Japan Nikkei index.
34

CAI, YU-LONG y 蔡瑜隆. "Minority mechanism for financial markets : an econophysics approach". Thesis, 2017. http://ndltd.ncl.edu.tw/handle/gssgtm.

Texto completo
Resumen
Master's thesis
National Dong Hwa University
Department of Physics
105
We construct an artificial market based on the minority game (MG). We then compute the profit probability distributions of the MG group in an open system and in a closed system. We also compare the profit probability distributions of a day trader and an MG trader in a market composed of MG traders. In an open system, if the price fluctuates and is entirely determined by MG traders, the average wealth of MG traders decreases. However, in a bear or bull market, i.e. when the difference between the first and final price is large, the average wealth of these MG traders increases. Finally, we analyze the profit distribution of a day trader in a steady market composed of MG traders, both in an open system and in a closed system. It is suggested that including time-correlated information in the model would make the interaction between a day trader and the market more realistic.
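A bare-bones minority game of the kind underlying this artificial market can be sketched as follows (agent count, memory length, and scoring are generic textbook choices, not the thesis's configuration):

```python
import random

def minority_game(n_agents=101, memory=3, n_strategies=2,
                  rounds=500, seed=7):
    """Minimal minority game: agents with random strategy tables choose
    a side (0/1) each round; agents on the minority side win."""
    rng = random.Random(seed)
    n_hist = 2 ** memory
    # Each strategy maps every possible history (0 .. 2^m - 1) to an action.
    strategies = [[[rng.randrange(2) for _ in range(n_hist)]
                   for _ in range(n_strategies)] for _ in range(n_agents)]
    scores = [[0] * n_strategies for _ in range(n_agents)]
    history = 0
    attendance = []
    for _ in range(rounds):
        actions = []
        for a in range(n_agents):
            best = max(range(n_strategies), key=lambda s: scores[a][s])
            actions.append(strategies[a][best][history])
        ones = sum(actions)
        minority = 1 if ones < n_agents - ones else 0
        attendance.append(ones)
        # Reward every strategy that would have chosen the minority side.
        for a in range(n_agents):
            for s in range(n_strategies):
                if strategies[a][s][history] == minority:
                    scores[a][s] += 1
        history = ((history << 1) | minority) % n_hist  # keep m bits
    return attendance

att = minority_game()
mean_att = sum(att) / len(att)
```

With an odd number of agents, attendance fluctuates around n_agents / 2; a price process can be built on top of it, e.g. by moving the price with the attendance imbalance, as in the thesis's artificial market.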
35

Liao, Chi-Yo y 廖啟佑. "Anchoring Effect in Econophysics : A Case Study of TAIEX Futures". Thesis, 2016. http://ndltd.ncl.edu.tw/handle/35726162906212410585.

Texto completo
Resumen
Master's thesis
National Kaohsiung Normal University
Department of Physics
104
In this thesis, we study the anchoring effect produced by local maxima and local minima, using data from TAIEX futures. Observing the price fluctuations, we find that local maxima and local minima appear from time to time. These extreme values become reference points that investors can use to predict future price movements. We compile statistics on these extreme values: we calculate the time needed to break through an extreme value and the largest price spread produced during that time. We then treat the breakthrough time and the largest price spread as distributions in time and space, compare them with Brownian motion, and examine their patterns.
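The breakthrough-time statistics described above can be compared with a Brownian benchmark by simulating first-passage times of a symmetric random walk across a fixed barrier playing the role of a previous local maximum. This sketch is a stand-in illustration with arbitrary parameters, not the thesis's procedure:

```python
import random

def breakthrough_time(barrier, rng, max_steps=100000):
    """Steps for a symmetric +/-1 random walk to first exceed `barrier`
    (a stylized 'previous local maximum'); None if not reached."""
    x, t = 0, 0
    while t < max_steps:
        x += 1 if rng.random() < 0.5 else -1
        t += 1
        if x > barrier:
            return t
    return None

rng = random.Random(11)
times = [breakthrough_time(5, rng) for _ in range(300)]
reached = [t for t in times if t is not None]
median_t = sorted(reached)[len(reached) // 2]
mean_t = sum(reached) / len(reached)
```

First-passage times of an unbiased walk have a heavy tail (roughly t^(-3/2)), so the sample mean far exceeds the median; deviations of empirical breakthrough times from this benchmark are what an anchoring analysis looks for.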
36

LAI, YI-ZHEN y 賴怡臻. "Econophysics: Correlation Analysis of Price, Time, and Volume in TAIEX". Thesis, 2016. http://ndltd.ncl.edu.tw/handle/21073259445093160098.

Texto completo
Resumen
Master's thesis
National Kaohsiung Normal University
Department of Physics
104
In this research, we study the relationships among price, time, and volume in uptrends and downtrends, using data from TAIEX futures. We find that price fluctuations sometimes produce local maxima or local minima, and these extreme values are used by investors to predict future trends. We therefore compile statistics on the price difference, the time difference, and the total accumulated volume between pairs of extreme values. We also use the concept of Brownian motion to analyze the relationship between the price difference, the time difference, and the accumulated volume.
37

邱于慎. "Price Dynamics in Econophysics - Correlation Analysis between TAIEX Index and Its Futures". Thesis, 2014. http://ndltd.ncl.edu.tw/handle/q6z45c.

Texto completo
Resumen
Master's thesis
National Kaohsiung Normal University
Department of Physics
102
We study the cross-correlation between the time series of the Taiwan Stock Exchange Index (TAIEX) and its futures (TAIFEX) using tools developed in statistical physics. Limited by the specification of the index settlement, we study the database from 2011 to 2012. First, we analyze the persistence of the correlation between TAIEX and TAIFEX. Our results show that trends are more persistent under positive correlation than under negative correlation. Second, we analyze the correlation between the persistence durations of trends with positive and negative correlation, and find no correlation between these two durations. We also propose a stochastic model whose simulations reproduce the empirical findings.
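The persistence analysis described above can be imitated on synthetic data by computing a rolling correlation between two correlated return series and measuring the lengths of same-sign runs. The window length and the construction of the two series are illustrative assumptions:

```python
import random

def rolling_corr(x, y, window):
    """Pearson correlation of the two return series in each window."""
    out = []
    for i in range(len(x) - window + 1):
        xs, ys = x[i:i + window], y[i:i + window]
        mx, my = sum(xs) / window, sum(ys) / window
        cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
        vx = sum((a - mx) ** 2 for a in xs)
        vy = sum((b - my) ** 2 for b in ys)
        out.append(cov / (vx * vy) ** 0.5)
    return out

def sign_durations(series, positive=True):
    """Lengths of consecutive runs with the requested correlation sign."""
    runs, run = [], 0
    for v in series:
        if (v > 0) == positive:
            run += 1
        elif run:
            runs.append(run)
            run = 0
    if run:
        runs.append(run)
    return runs

rng = random.Random(5)
common = [rng.gauss(0, 1) for _ in range(2000)]
idx = [c + 0.5 * rng.gauss(0, 1) for c in common]  # stylized index returns
fut = [c + 0.5 * rng.gauss(0, 1) for c in common]  # stylized futures returns
corr = rolling_corr(idx, fut, window=20)
pos_runs = sign_durations(corr, positive=True)
```

Comparing the run-length distributions of positive and negative correlation regimes is the kind of persistence comparison the abstract refers to.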
38

Cheah, E.-T. y John Fry. "Speculative bubbles in Bitcoin markets? An empirical investigation into the fundamental value of Bitcoin". 2015. http://hdl.handle.net/10454/18101.

Texto completo
Resumen
Amid its rapidly increasing usage and immense public interest, the subject of Bitcoin has raised profound economic and societal issues. In this paper we undertake economic and econometric modelling of Bitcoin prices. As with many asset classes, we show that Bitcoin exhibits speculative bubbles. Further, we find empirical evidence that the fundamental price of Bitcoin is zero.
39

Avakian, Adam J. "Dynamic modeling of systemic risk in financial networks". Thesis, 2017. https://hdl.handle.net/2144/24072.

Texto completo
Resumen
Modern financial networks are complicated structures that can contain multiple types of nodes and connections between those nodes. Banks, governments and even individual people weave into an intricate network of debt, risk correlations and many other forms of interconnectedness. We explore multiple types of financial network models with a focus on understanding the dynamics and causes of cascading failures in such systems. In particular, we apply real-world data from multiple sources to these models to better understand real-world financial networks. We use the results of the Federal Reserve "Banking Organization Systemic Risk Report" (FR Y-15), which surveys the largest US banks on their level of interconnectedness, to find relationships between various measures of network connectivity and systemic risk in the US financial sector. This network model is then stress-tested under a number of scenarios to determine systemic risks inherent in the various network structures. We also use detailed historical balance sheet data from the Venezuelan banking system to build a bipartite network model and find relationships between the changing network structure over time and the response of the system to various shocks. We find that the relationship between interconnectedness and systemic risk is highly dependent on the system and model but that it is always a significant one. These models are useful tools that add value to regulators in creating new measurements of systemic risk in financial networks. These models could be used as macroprudential tools for monitoring the health of the entire banking system as a whole rather than only of individual banks.
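A toy version of the cascading-failure mechanism studied in this dissertation, where a bank fails once its losses on exposures to already-failed counterparties exceed its capital, can be written in a few lines. The network, exposure amounts, and capital buffers below are invented for illustration and bear no relation to the FR Y-15 or Venezuelan data:

```python
# Toy contagion cascade on an interbank exposure network.
# exposures[i][j] = amount bank i has lent to bank j (lost if j fails).
exposures = {
    "A": {"B": 6.0, "C": 2.0},
    "B": {"C": 5.0},
    "C": {"D": 4.0},
    "D": {},
}
capital = {"A": 5.0, "B": 4.0, "C": 10.0, "D": 1.0}

def cascade(initial_failures):
    """Propagate failures until no further bank's losses exceed capital."""
    failed = set(initial_failures)
    changed = True
    while changed:
        changed = False
        for bank, lent in exposures.items():
            if bank in failed:
                continue
            loss = sum(amt for debtor, amt in lent.items() if debtor in failed)
            if loss > capital[bank]:
                failed.add(bank)
                changed = True
    return failed
```

In this toy network, shocking the well-capitalized "D" stays contained, while shocking "C" wipes out "B" and then "A": the same initial shock size can produce very different systemic outcomes depending on where it lands in the network, which is the kind of structure-dependence the dissertation stress-tests.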
40

Pištěk, Miroslav. "Statistická fyzika frustrovaných evolučních her". Master's thesis, 2010. http://www.nusl.cz/ntk/nusl-282503.

Texto completo
Resumen
Title: Statistical Physics of Frustrated Evolutionary Games Author: Miroslav Pištěk Department: Institute of Theoretical Physics Supervisor: RNDr. František Slanina, CSc. Supervisor's e-mail address: slanina@fzu.cz Abstract: In the last two decades, the effort devoted to interdisciplinary research on bounded resource allocation has been growing, examining complex phenomena such as stock markets or traffic jams. The Minority Game is a multiple-agent model of the inevitable frustration arising in such situations. It is analytically tractable using the replica method, which originated in the statistical physics of spin glasses. We generalised the Minority Game by introducing heterogeneous agents. This heterogeneity causes a considerable decrease in an average agent's frustration. For many configurations, we even achieve a positive-sum game, which is not possible in the original game variant. This result is in accordance with real stock market data. Keywords: frustrated evolutionary games, Minority Game, Replica method
41

王冠霖. "Price Dynamics in Econophysics : Correlation Analysis of TE and TF in Taiwan Futures Exchange". Thesis, 2015. http://ndltd.ncl.edu.tw/handle/cr7ag8.

Texto completo
Resumen
Master's thesis
National Kaohsiung Normal University
Department of Physics
103
In this thesis, we employ auto-correlation, cross-correlation, price-phase analysis, and the Hurst exponent to analyze the price returns of the TE and TF futures on the Taiwan Futures Exchange. First, we find that the auto-correlation functions of both TE and TF futures decrease with increasing delay time. Furthermore, we find that the more often TE and TF move in the same direction, the higher the cross-correlation coefficient between them. Finally, we apply the Hurst exponent to analyze the price paths of TE and TF futures. It is shown that prices exhibit mean reversion over short time ranges and tend toward a random walk on long time scales.
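A rescaled-range (R/S) estimate of the Hurst exponent, one standard way to carry out the analysis mentioned above, can be sketched as follows (a generic implementation on synthetic data; the thesis's estimator and parameters may differ). For uncorrelated increments the estimate comes out near H = 0.5; H < 0.5 indicates mean reversion and H > 0.5 persistence:

```python
import math
import random

def hurst_rs(series, min_chunk=8):
    """Estimate the Hurst exponent as the slope of log(R/S) vs log(n)."""
    n = len(series)
    sizes, rs_vals = [], []
    size = min_chunk
    while size <= n // 2:
        rs_sum, count = 0.0, 0
        for start in range(0, n - size + 1, size):
            chunk = series[start:start + size]
            mean = sum(chunk) / size
            dev, cum, cmin, cmax = 0.0, 0.0, 0.0, 0.0
            for v in chunk:
                cum += v - mean
                cmin, cmax = min(cmin, cum), max(cmax, cum)
                dev += (v - mean) ** 2
            std = (dev / size) ** 0.5
            if std > 0:
                rs_sum += (cmax - cmin) / std  # rescaled range of the chunk
                count += 1
        if count:
            sizes.append(math.log(size))
            rs_vals.append(math.log(rs_sum / count))
        size *= 2
    # Least-squares slope of log(R/S) against log(n).
    k = len(sizes)
    sx, sy = sum(sizes) / k, sum(rs_vals) / k
    num = sum((x - sx) * (y - sy) for x, y in zip(sizes, rs_vals))
    den = sum((x - sx) ** 2 for x in sizes)
    return num / den

rng = random.Random(9)
increments = [rng.gauss(0, 1) for _ in range(4096)]  # i.i.d. benchmark
h = hurst_rs(increments)
```

Note that R/S estimates are biased upward for small samples, so empirical values are usually compared against a simulated random-walk benchmark rather than against 0.5 directly.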
42

"Rigorous Proofs of Old Conjectures and New Results for Stochastic Spatial Models in Econophysics". Doctoral diss., 2019. http://hdl.handle.net/2286/R.I.53531.

Texto completo
Resumen
This dissertation examines six different models in the field of econophysics using interacting particle systems as the basis of exploration. In each model examined, the underlying structure is a graph G = (V, E), where each x ∈ V represents an individual who is characterized by the number of coins in her possession at time t. At each time step t, an edge (x, y) ∈ E is chosen at random, resulting in an exchange of coins between individuals x and y according to the rules of the model. Random variables ξt and ξt(x) keep track of the current configuration and the number of coins individual x has at time t, respectively. Of particular interest is the distribution of coins in the long run. Considered first are the uniform reshuffling model, the immediate exchange model, and the model with saving propensity. For each of these models, the number of coins an individual can have is nonnegative and the total number of coins in the system is conserved for all time. It is shown here that the distribution of coins converges to the exponential distribution, the gamma distribution, and a pseudo-gamma distribution, respectively. The next two models introduce debt; however, the total number of coins again remains fixed. It is shown here that when there is an individual debt limit, the number of coins per individual converges to a shifted exponential distribution. Alternatively, when a collective debt limit is imposed on the whole population, a heuristic argument is given supporting the conjecture that the distribution of coins converges to an asymmetric Laplace distribution. The final model considered focuses on the effect of cooperation on a population. Unlike the previous models discussed here, the total number of coins in the system at any given time is not bounded and the process evolves in continuous time rather than in discrete time. For this model, death of an individual occurs if she runs out of coins. It is shown here that the survival probability for the population is impacted by the level of cooperation along with how productive the population is as a whole.
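As an illustrative aside, the uniform reshuffling dynamics described in this abstract can be sketched in a few lines. The function name, agent count, and run length below are arbitrary choices for the sketch, not the dissertation's setup:

```python
import random

def uniform_reshuffling(n_agents=200, coins_per_agent=10, steps=200_000, seed=1):
    """Uniform reshuffling model: at each step a random pair (x, y) pools
    its coins and splits the pool uniformly at random."""
    rng = random.Random(seed)
    wealth = [coins_per_agent] * n_agents
    for _ in range(steps):
        x, y = rng.sample(range(n_agents), 2)
        pool = wealth[x] + wealth[y]
        wealth[x] = rng.randint(0, pool)   # uniform split of the pooled coins
        wealth[y] = pool - wealth[x]
    return wealth

wealth = uniform_reshuffling()
print(sum(wealth))  # prints 2000: the total number of coins is conserved
```

In equilibrium the empirical wealth distribution is close to exponential, in line with the convergence result stated in the abstract (roughly 63% of agents end up below the mean).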
Dissertation/Thesis
Doctoral Dissertation Mathematics 2019
Citation styles: APA, Harvard, Vancouver, ISO, etc.
43

Lubbers, Nicholas. "A statistical mechanical model of economics". Thesis, 2016. https://hdl.handle.net/2144/19754.

Full text
Abstract
Statistical mechanics pursues low-dimensional descriptions of systems with a very large number of degrees of freedom. I explore this theme in two contexts. The main body of this dissertation explores and extends the Yard Sale Model (YSM) of economic transactions using a combination of simulations and theory. The YSM is a simple interacting model for wealth distributions which has the potential to explain the empirical observation of Pareto distributions of wealth. I develop the link between wealth condensation and the breakdown of ergodicity due to nonlinear diffusion effects which are analogous to the geometric random walk. Using this, I develop a deterministic effective theory of wealth transfer in the YSM that is useful for explaining many quantitative results. I introduce various forms of growth to the model, paying attention to the effect of growth on wealth condensation, inequality, and ergodicity. Arithmetic growth is found to partially break condensation, and geometric growth is found to completely break condensation. Further generalizations of geometric growth with growth inequality show that the system is divided into two phases by a tipping point in the inequality parameter. The tipping point marks the line between systems which are ergodic and systems which exhibit wealth condensation. I explore generalizations of the YSM transaction scheme to arbitrary betting functions to develop notions of universality in YSM-like models. I find that wealth condensation is universal to a large class of models which can be divided into two phases. The first exhibits slow, power-law condensation dynamics, and the second exhibits fast, finite-time condensation dynamics. I find that the YSM, which exhibits exponential dynamics, is the critical, self-similar model which marks the dividing line between the two phases. The final chapter develops a low-dimensional approach to materials microstructure quantification.
Modern materials design harnesses complex microstructure effects to develop high-performance materials, but general microstructure quantification is an unsolved problem. Motivated by statistical physics, I envision microstructure as a low-dimensional manifold, and construct this manifold by leveraging multiple machine learning approaches including transfer learning, dimensionality reduction, and computer vision breakthroughs with convolutional neural networks.
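The wealth-condensation effect this abstract describes is easy to reproduce. Below is a minimal sketch of the Yard Sale Model; the betting fraction `beta`, the run length, and the function name are illustrative choices, not parameters from the dissertation:

```python
import random

def yard_sale(n_agents=100, steps=500_000, beta=0.1, seed=2):
    """Yard Sale Model sketch: each transaction transfers a fraction beta
    of the *poorer* party's wealth to a fair-coin winner.  With no growth
    or redistribution, wealth slowly condenses onto a few agents."""
    rng = random.Random(seed)
    wealth = [1.0] * n_agents
    for _ in range(steps):
        x, y = rng.sample(range(n_agents), 2)
        stake = beta * min(wealth[x], wealth[y])
        if rng.random() < 0.5:
            wealth[x] += stake
            wealth[y] -= stake
        else:
            wealth[x] -= stake
            wealth[y] += stake
    return wealth

wealth = yard_sale()
top_share = max(wealth) / sum(wealth)
print(f"share of total wealth held by the richest agent: {top_share:.2f}")
```

Starting from perfect equality, the richest agent's share grows far beyond the 1/100 an ergodic system would sustain, illustrating the breakdown of ergodicity discussed in the abstract.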
44

Huang, Xuqing. "Network theory and its applications in economic systems". Thesis, 2013. https://hdl.handle.net/2144/13147.

Full text
Abstract
This dissertation covers the two major parts of my Ph.D. research: i) developing the theoretical framework of complex networks; and ii) applying complex network models to quantitatively analyze economic systems. In part I, we focus on developing theories of interdependent networks, which includes two chapters: 1) We develop a mathematical framework to study the percolation of interdependent networks under targeted attack and find that when the highly connected nodes are protected and have a lower probability to fail, in contrast to single scale-free (SF) networks where the percolation threshold pc = 0, coupled SF networks are significantly more vulnerable, with pc significantly larger than zero. 2) We analytically demonstrate that clustering, which quantifies the propensity for two neighbors of the same vertex to also be neighbors of each other, significantly increases the vulnerability of the system. In part II, we apply complex network models to study economic systems, which also includes two chapters: 1) We study the US corporate governance network, in which nodes represent directors and links between two directors represent their service on common company boards, and propose a quantitative measure of information and influence transformation in the network. Thus we are able to identify the most influential directors in the network. 2) We propose a bipartite network model to simulate the risk propagation process among commercial banks during a financial crisis. With empirical bank balance sheet data from 2007 as input to the model, we find that our model efficiently identifies a significant portion of the actual failed banks reported by the Federal Deposit Insurance Corporation during the financial crisis between 2008 and 2011. The results suggest that complex network models could be useful for systemic risk stress testing of financial systems. The model also identifies that commercial rather than residential real estate assets are the major culprits for the failure of over 350 US commercial banks during 2008-2011.
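The bipartite bank-asset contagion mechanism described above can be caricatured in a few lines. Everything below (portfolio sizes, equity buffer, fire-sale impact, the `contagion` helper) is invented for illustration and is not the calibrated model of the thesis:

```python
import random

def contagion(n_banks=50, n_assets=10, shock=0.3, seed=3):
    """Toy bipartite bank-asset cascade: each bank holds a few assets;
    when a bank's average loss exceeds its equity buffer it fails and
    fire-sells, depressing the prices of the assets it holds."""
    rng = random.Random(seed)
    holdings = [rng.sample(range(n_assets), 3) for _ in range(n_banks)]
    equity = [0.08] * n_banks          # loss-absorbing buffer per bank
    price_drop = [0.0] * n_assets
    price_drop[0] = shock              # initial shock to one asset class
    failed = set()
    changed = True
    while changed:                     # iterate until the cascade stops
        changed = False
        for b in range(n_banks):
            if b in failed:
                continue
            loss = sum(price_drop[a] for a in holdings[b]) / 3
            if loss > equity[b]:
                failed.add(b)
                changed = True
                for a in holdings[b]:  # fire sale depresses held assets
                    price_drop[a] = min(1.0, price_drop[a] + 0.02)
    return failed

failed = contagion()
print(f"{len(failed)} of 50 banks fail in the cascade")
```

A shock to a single asset class propagates to banks that never held it, which is the qualitative point of the risk-propagation model in the abstract.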
45

Fry, John. "Booms, busts and heavy-tails: the story of Bitcoin and cryptocurrency markets?" 2018. http://hdl.handle.net/10454/17568.

Full text
Abstract
We develop bespoke rational bubble models for Bitcoin and cryptocurrencies that incorporate both heavy tails and the probability of a complete collapse in asset prices. Empirically, we present robustified evidence of bubbles in Bitcoin and Ethereum. Theoretically, we show that liquidity risks may generate heavy tails in Bitcoin and cryptocurrency markets. Even in the absence of bubbles, dramatic booms and busts can occur. We thus sound a timely note of caution.
The full-text of this article will be released for public view at the end of the publisher embargo on 10 Feb 2020.
46

Koh, Jason S. H., University of Western Sydney, College of Business and School of Economics and Finance. "Comparison of the new "econophysics" approach to dealing with problems of financial to traditional econometric methods". 2008. http://handle.uws.edu.au:8081/1959.7/38828.

Full text
Abstract
We begin by outlining the motivation of this research, as there are still so many unanswered research questions on our complex financial and economic systems. The philosophical background and the advances of econometrics and econophysics are discussed to provide an overview of stochastic and nonstochastic modelling, and these disciplines are set as a central theme for the thesis. This thesis investigates the effectiveness of financial econometrics models such as Gaussian, ARCH(1), GARCH(1,1) and its extensions, as compared to econophysics models such as the Power Law model, Boltzmann-Gibbs (BG) and Tsallis entropy, as statistical models of volatility in the US S&P 500, Dow Jones and NASDAQ stock indices using daily data. The data demonstrate several distinct behavioural characteristics, particularly the increased volatility during 1998 to 2004. Power Laws appear to describe the large fluctuations and other characteristics of stock price changes. Surprisingly, these Power Law models also show significant correlations for different types and sizes of markets and for different periods and sub-periods of markets. The results show the robustness of Power Law analysis, with the Power Law exponent (0.4 to 2.4) staying within the acceptable range of significance (83% to 97%), regardless of the percentage change in the index return. However, the procedure for testing empirical data against a hypothesised power-law distribution using a simple rank-frequency plot of the data and the data binning process can produce a spurious result for the distribution. As for stochastic processes such as ARCH(1) and GARCH(1,1), the models are explicitly confined to the conditional behaviour of the data, and the unconditional behaviour has often been described via moments.
In reality, it is the unconditional tail behaviour that matters, and hence we have to convert the models and express them as a two-dimensional stochastic difference equation using the processes of Starica (Mikosch 2000). The results show the random walk prediction successfully describes the stock movements for small price fluctuations but fails to handle large price fluctuations. The Power Law tests prove superior to the stochastic tests when stock price fluctuations are substantially divergent from the mean. One of the main points of the thesis is that these empirical phenomena are not present in the stochastic process but emerge in the non-parametric process. The main objective of the thesis is to study the relatively new field of econophysics and put its work in perspective relative to the established, if not altogether successful, practice of econometric analysis of stock market volatility. One of the most exciting characteristics of econophysics is that, as a developing field, no models as yet perfectly represent the market and there is still a lot of fundamental research to be done. Therefore, we begin to explore the application of statistical physics methods, particularly Tsallis entropy, to give new insights into problems traditionally associated with financial markets. The results of Tsallis entropy surpass all expectations and it is therefore one of the most robust methods of analysis. However, it is now subject to some challenge from McCauley, Bassler et al., as they found that the stochastic dynamic process (sliding interval techniques) used in fat-tail distributions is time dependent.
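The abstract's caution about rank-frequency plots is commonly addressed with maximum-likelihood tail estimators. A sketch of the standard Hill estimator, tested on synthetic Pareto data (this is the textbook estimator, not necessarily the exact procedure used in the thesis; sample sizes and `k` are illustrative):

```python
import math
import random

def hill_estimator(data, k):
    """Hill estimator of the tail exponent alpha from the k largest
    observations; more reliable than eyeballing a rank-frequency plot."""
    tail = sorted(data, reverse=True)[: k + 1]
    x_k = tail[-1]                      # (k+1)-th largest order statistic
    return k / sum(math.log(x / x_k) for x in tail[:-1])

rng = random.Random(4)
alpha_true = 3.0
# Pareto(alpha) samples via inverse transform: x = u**(-1/alpha)
sample = [rng.random() ** (-1.0 / alpha_true) for _ in range(20_000)]
alpha_hat = hill_estimator(sample, k=500)
print(f"true alpha = {alpha_true}, Hill estimate ≈ {alpha_hat:.2f}")
```

Unlike binned log-log regression, the estimator has a known standard error (roughly alpha/√k), which makes the significance claims testable.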
Doctor of Philosophy (PhD)
47

Becker, Alexander Paul. "Maximum entropy and network approaches to systemic risk and foreign exchange". Thesis, 2018. https://hdl.handle.net/2144/33267.

Full text
Abstract
The global financial system is an intricate network of networks, and recent financial crises have laid bare our insufficient understanding of its complexity. In response, within the five chapters of this thesis we study how interconnectedness, interdependency and mutual influence impact financial markets and systemic risk. In the first part, we investigate the community formation of global equity and currency markets. We find remarkable changes to correlation structure and lead-lag relationships in times of economic turmoil, implying significant risks to diversification based on historical data. The second part focuses on banks as creators of credit. Bank portfolios generally share some overlap, and this may introduce systemic risk. We model this using European stress test data, finding that the system is stable across a broad range of asset liquidity and risk tolerance. However, there exists a phase transition: If banks become sufficiently risk averse, even small shocks may inflict great losses. Failure to address portfolio overlap thus may leave the banking system ill-prepared. Complete knowledge of the financial network is prerequisite to such systemic risk analyses. When lacking this knowledge, maximum entropy methods allow a probabilistic reconstruction. In the third part of this thesis, we consider Japanese firm-bank data and find that reconstruction methods fail to generate a connected network. Deriving an analytical expression for connection probabilities, we show that this is a general problem of sparse graphs with inhomogeneous layers. Our results yield confidence intervals for the connectivity of a reconstruction. The maximum entropy approach also proves useful for studying dependencies in financial markets: On its basis, we develop a new measure for the information content in foreign exchange rates in part four of this thesis and use it to study the impact of macroeconomic variables on the strength of currency co-movements. 
While macroeconomic data and the law of supply and demand drive financial markets, foreign exchange rates are also subject to policy interventions. In part five, we classify the roles of currencies within the market with a clustering algorithm and study changes after political and monetary shocks. This methodology may further provide a quantitative underpinning to existing qualitative classifications.
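The maximum-entropy reconstruction from marginals discussed in this abstract can be illustrated with the classical RAS / iterative proportional fitting scheme; the thesis's point is precisely that such dense reconstructions fail to capture the sparsity of real firm-bank networks. The function name and the totals below are illustrative:

```python
def ipf(row_sums, col_sums, iters=200):
    """Iterative proportional fitting (RAS): starting from a uniform
    matrix, alternately rescale rows and columns until the matrix has
    the prescribed marginals (row and column totals must agree)."""
    n, m = len(row_sums), len(col_sums)
    w = [[1.0] * m for _ in range(n)]
    for _ in range(iters):
        for i in range(n):                        # match row sums
            s = sum(w[i])
            w[i] = [x * row_sums[i] / s for x in w[i]]
        for j in range(m):                        # match column sums
            s = sum(w[i][j] for i in range(n))
            for i in range(n):
                w[i][j] *= col_sums[j] / s
    return w

# Interbank-style example: total lending per bank vs total borrowing.
w = ipf([30.0, 20.0, 10.0], [25.0, 25.0, 10.0])
print([round(sum(row), 6) for row in w])
```

Note that every entry of the result is strictly positive: the maximum-entropy solution spreads exposure over all pairs, which is exactly the connectivity problem for sparse, inhomogeneous layers analysed in part three.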
48

Fry, John and M. Burke. "An options-pricing approach to election prediction". 2020. http://hdl.handle.net/10454/17754.

Full text
Abstract
The link between finance and politics (especially opinion polling) is interesting in both theoretical and empirical terms. Inter alia, the election date corresponds to the effective price of an underlying at a known future date. This renders a derivative pricing approach appropriate and leads, ultimately, to a simplification of the approach suggested by Taleb (2018). Thus, we use an options-pricing approach to predict vote share. Rather than systematic bias in polls, forecasting errors appear chiefly due to the mode of extracting election outcomes from the share of the vote. In the 2016 US election, polling results put the Republicans ahead in the electoral college from July 2016 onwards. In the 2017 UK general election, though the Conservatives were set to be the largest party, a Conservative majority was far from certain.
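One way to see the options-pricing idea, though not necessarily the paper's exact model, is to price the "candidate wins" event as a digital option on the vote share: treat the final share as the current poll number plus Gaussian noise whose variance grows with time to the election. The volatility figure and function name here are assumptions for illustration:

```python
import math

def win_probability(poll_share, days_to_election, daily_vol=0.003):
    """Digital-option view of an election: probability that the final
    vote share ends above 50%, given diffusion-like polling noise."""
    sigma = daily_vol * math.sqrt(days_to_election)
    z = (poll_share - 0.5) / sigma
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# A 52% poll lead is far from a sure win months out...
print(round(win_probability(0.52, days_to_election=120), 3))
# ...but hardens as the election approaches.
print(round(win_probability(0.52, days_to_election=7), 3))
```

This separation between vote share and win probability is exactly where the abstract locates the forecasting error: the polls can be roughly right while the extracted outcome probability is badly wrong.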
The full-text of this article will be released for public view at the end of the publisher embargo on 13 Dec 2021.
49

Diassinas, Christopher Luke. "Application of Expectation Maximisation Algorithm on Mixed Distributions". Thesis, 2019. http://hdl.handle.net/2440/119934.

Full text
Abstract
Mixed distributions are a statistical tool used for modelling a range of phenomena in fields as diverse as marketing, genetics, medicine, artificial intelligence, and finance. A mixture model is capable of describing a quite complex distribution of data, often in situations where a single parametric distribution is unable to provide a satisfactory result. The Expectation Maximisation (EM) algorithm is an iterative maximum likelihood method typically used to estimate parameters in incomplete data problems, such as mixtures. This thesis presents a thorough analysis of mixture modelling and estimation via the EM algorithm for normal, Weibull, exponential, gamma, log-logistic, and uniform component distributions. Full derivations of the relevant EM equations are provided, including censored EM equations for exponential and Weibull component distributions. Goodness-of-fit tests assess how well a hypothesised statistical model fits a set of observations. This thesis considers two goodness-of-fit testing frameworks, the first being formal hypothesis-based testing, and the second model selection via information criteria. It has been empirically justified that critical values for the Kolmogorov-Smirnov, Kuiper, Cramér-von Mises, and Anderson-Darling goodness-of-fit tests do not exhibit the same parameter-independent properties as single distributions. Critical values are in fact parameter dependent, as well as being dependent on sample size, significance level, and truncation level. A comprehensive analysis is also provided of model selection via information criteria, for the Akaike information criterion and the Bayesian information criterion. Goodness-of-fit testing in this manner was found to be more appropriate for mixture modelling. The work culminates with the application of the previously discussed statistical methodology to an analysis of limit-order inter-arrival times and mid-price waiting times on the London Stock Exchange. It is reasoned that censored mixtures which include a Weibull component most appropriately model these data.
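The E- and M-steps analysed in the thesis can be illustrated in the simplest case, an uncensored two-component normal mixture. The initialisation scheme and synthetic data below are illustrative choices, not the thesis's derivations:

```python
import math
import random

def em_two_normals(data, iters=100):
    """EM for a two-component univariate normal mixture (uncensored)."""
    mu1, mu2 = min(data), max(data)            # crude initialisation
    s1 = s2 = (max(data) - min(data)) / 4
    pi = 0.5                                   # weight of component 1
    for _ in range(iters):
        # E-step: responsibility of component 1 for each point
        r = []
        for x in data:
            p1 = pi * math.exp(-0.5 * ((x - mu1) / s1) ** 2) / s1
            p2 = (1 - pi) * math.exp(-0.5 * ((x - mu2) / s2) ** 2) / s2
            r.append(p1 / (p1 + p2))
        # M-step: responsibility-weighted maximum-likelihood updates
        n1 = sum(r)
        n2 = len(data) - n1
        mu1 = sum(ri * x for ri, x in zip(r, data)) / n1
        mu2 = sum((1 - ri) * x for ri, x in zip(r, data)) / n2
        s1 = math.sqrt(sum(ri * (x - mu1) ** 2 for ri, x in zip(r, data)) / n1)
        s2 = math.sqrt(sum((1 - ri) * (x - mu2) ** 2 for ri, x in zip(r, data)) / n2)
        pi = n1 / len(data)
    return mu1, mu2, pi

rng = random.Random(5)
data = [rng.gauss(0.0, 1.0) for _ in range(500)] + \
       [rng.gauss(6.0, 1.0) for _ in range(500)]
mu1, mu2, pi = em_two_normals(data)
print(f"recovered means ≈ {mu1:.1f} and {mu2:.1f}, weight ≈ {pi:.2f}")
```

The censored variants derived in the thesis modify the M-step expectations; the alternating structure above is unchanged.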
Thesis (MPhil) -- University of Adelaide, School of Physical Sciences, 2019
50

Sedlaříková, Jana. "Jsou finanční výnosy a volatilita skutečně multifraktální?" Master's thesis, 2016. http://www.nusl.cz/ntk/nusl-347552.

Full text
Abstract
Over the last decades, multifractality has become a downright stylized fact in financial markets. However, its presence has not been adequately statistically proved. The main aim of this thesis is to contribute to the discussion by an extensive statistical analysis of the problem. We investigate returns and volatility of a collection of four stock indices employing three popular methods: the GHE, the MF-DFA, and the MF-DMA method. By comparing the results for the original series to those for simulated monofractal series, we conclude that stock market returns as well as volatility exhibit a multifractal nature. Additionally, in order to understand the origin of the underlying multifractality, we study various surrogate series. We find that a fat-tailed distribution significantly affects multifractality. On the other hand, we were not able to confirm the impact of time correlations, as the results strongly depend on the applied model.
JEL Classification: F12, G02, G10, C12, C22, C49, C58
Keywords: econophysics, multifractality, financial markets, Hurst exponent
Author's e-mail: jana.sedlarikova@gmail.com
Supervisor's e-mail: kristoufek@ies-prague.org
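The GHE method mentioned in this abstract estimates H(q) from the scaling of q-th order moments of increments, K_q(τ) ~ τ^(qH(q)). A minimal sketch (the τ grid and test series are illustrative); for an uncorrelated random walk it should recover H(2) ≈ 0.5:

```python
import math
import random

def generalized_hurst(series, q=2, taus=(1, 2, 4, 8, 16)):
    """Generalized Hurst exponent H(q) from the scaling of q-th order
    moments of increments: K_q(tau) ~ tau**(q*H(q))."""
    logs_t, logs_k = [], []
    for tau in taus:
        incs = [abs(series[i + tau] - series[i])
                for i in range(len(series) - tau)]
        k_q = sum(x ** q for x in incs) / len(incs)
        logs_t.append(math.log(tau))
        logs_k.append(math.log(k_q))
    # least-squares slope of log K_q against log tau, divided by q
    n = len(taus)
    mt, mk = sum(logs_t) / n, sum(logs_k) / n
    slope = sum((t - mt) * (k - mk) for t, k in zip(logs_t, logs_k)) / \
            sum((t - mt) ** 2 for t in logs_t)
    return slope / q

rng = random.Random(6)
walk = [0.0]
for _ in range(20_000):
    walk.append(walk[-1] + rng.gauss(0, 1))
print(f"H(2) for an uncorrelated random walk ≈ {generalized_hurst(walk):.2f}")
```

A monofractal series yields an H(q) that is flat in q; the thesis's test is whether the variation of H(q) for real returns exceeds what such monofractal benchmarks produce.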