
Dissertations / Theses on the topic 'Econometric models'



Consult the top 50 dissertations / theses for your research on the topic 'Econometric models.'




1

Fahs, Faysal Habib. "Essays in the estimation of systems of limited dependent variables with application to demand systems." Online access for everyone, 2008. http://www.dissertations.wsu.edu/Dissertations/Summer2008/F_Fahs_072508.pdf.

2

Conradie, Tiaan. "The South African economy and internationally fuelled business cycles: an econometric analysis." Thesis, Nelson Mandela Metropolitan University, 2015. http://hdl.handle.net/10948/4354.

Abstract:
The objective of this study is to understand the dynamics of international monetary policy and the relationship that exists between larger, more developed economies and smaller, less developed economies within a policy context. The 2008 financial crisis caused an intense revival of Austrian economics owing to the monetary nature of the recession that followed the stock and housing market collapse of 2007. One factor of the 2008 financial crisis that created intense concern was the extent to which the slowdown in economic activity was transmitted across international borders. The South African economy was not isolated from the financial crisis by any means and experienced a significant slowdown in economic growth. Making use of data collected from the Federal Reserve Bank of St. Louis and appropriate econometric techniques, a model is developed to study the dynamics between United States monetary policy and the South African economy. The Austrian School provides a sound theoretical framework that allows for the specification of testable propositions to verify the validity of an “Austrian” internationally transmitted business cycle. Using United States money supply, South African private consumption, South African gross fixed capital formation and the South African current account, a vector autoregressive model is specified to analyse the dynamics between the United States and South African economies. The results of the empirical tests all confirm the theoretical propositions developed in the literature review: monetary growth in the United States raises consumption and investment and improves the current account balance in the South African economy. This is a novel result for this study, as it confirms that a large central economy has the ability to trigger economic expansions in a peripheral economy. The study further points out the inefficiencies associated with Keynesian-style policy making and advocates a movement towards a more prudent Austrian approach. Keynesian policy making through demand-oriented policies has historically been more concerned with “curing” economic instability than with preventing it. In light of this, the need for economic reform, specifically in the manner in which monetary policy is conducted, is evident. Aggressive monetary policy in the wake of an economic slowdown is no longer effective at creating a sustainable and stable economic environment. A movement away from the monopolization of money and central economic decision making is necessary if the global economy wishes to reach economic permanence.
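The abstract above centres on a four-variable VAR in the United States money supply and three South African aggregates. The sketch below shows, under stated assumptions, how such a system could be estimated and its impulse responses inspected with statsmodels; the series names and synthetic data are placeholders rather than the thesis's Federal Reserve Bank of St. Louis dataset.

```python
# Minimal sketch of a four-variable VAR of the kind described above.
# Assumptions: synthetic placeholder data, hypothetical variable names,
# lag order chosen by AIC; this is not the thesis's specification.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
n = 200  # hypothetical quarterly sample
data = pd.DataFrame({
    "us_money_growth": rng.normal(0.01, 0.005, n),
    "za_consumption_growth": rng.normal(0.008, 0.010, n),
    "za_gfcf_growth": rng.normal(0.005, 0.020, n),
    "za_current_account": rng.normal(-0.02, 0.010, n),
})

res = VAR(data).fit(maxlags=4, ic="aic")   # lag order selected by AIC
print(res.summary())

# Trace how a shock to US money growth propagates to the SA variables
irf = res.irf(12)
irf.plot(impulse="us_money_growth")
```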
3

Vilela, Lucas Pimentel. "Hypothesis testing in econometric models." Repositório Institucional do FGV, 2015. http://hdl.handle.net/10438/18249.

Abstract:
This thesis contains three chapters. The first chapter considers tests of the parameter of an endogenous variable in an instrumental variables regression model. The focus is on one-sided conditional t-tests. Theoretical and numerical work shows that the conditional 2SLS and Fuller t-tests perform well even when instruments are weakly correlated with the endogenous variable. When the population F-statistic is as small as two, the power is reasonably close to the power envelopes for similar and non-similar tests which are invariant to rotation transformations of the instruments. This finding is surprising considering the poor performance of two-sided conditional t-tests found in Andrews, Moreira, and Stock (2007). These tests have bad power because the conditional null distributions of t-statistics are asymmetric when instruments are weak. Taking this asymmetry into account, we propose two-sided tests based on t-statistics. These novel tests are approximately unbiased and can perform as well as the conditional likelihood ratio (CLR) test. The second and third chapters are interested in maxmin and minimax regret tests for broader hypothesis testing problems. In the second chapter, we present maxmin and minimax regret tests satisfying more general restrictions than the alpha-level and the power control over all alternative hypothesis constraints. More general restrictions enable us to eliminate trivial known tests and obtain tests with desirable properties, such as unbiasedness, local unbiasedness and similarity. Next, we prove that both tests always exist and, under sufficient assumptions, they are Bayes tests with priors that are solutions of an optimization problem, the dual problem. In the last part of the second chapter, we consider testing problems that are invariant to some group of transformations. Under invariance of the hypothesis testing problem, the Hunt-Stein Theorem shows that the search for maxmin and minimax regret tests can be restricted to invariant tests. We prove that the Hunt-Stein Theorem still holds under the general constraints proposed. In the last chapter we develop a numerical method to implement the maxmin and minimax regret tests proposed in the second chapter. The parametric space is discretized in order to obtain testing problems with a finite number of restrictions. We prove that, as the discretization becomes finer, the maxmin and minimax regret tests satisfying the finite number of restrictions have the same alternative power as the maxmin and minimax regret tests satisfying the general constraints. Hence, we can numerically implement tests for a finite number of restrictions as an approximation to the tests satisfying the general constraints. The results in the second and third chapters extend and complement the maxmin and minimax regret literature interested in characterizing and implementing both tests.
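For orientation only, the sketch below computes a plain (unconditional) 2SLS estimate and its conventional t-statistic in a small simulated instrumental variables regression. It illustrates the setting of the first chapter under assumed data; the conditional t-tests and the maxmin/minimax regret procedures developed in the thesis are not implemented here.

```python
# Sketch: 2SLS estimate and t-statistic for the coefficient on an endogenous
# regressor, on simulated data with two instruments. Illustrative only.
import numpy as np

rng = np.random.default_rng(1)
n = 500
z = rng.normal(size=(n, 2))                       # instruments
u = rng.normal(size=n)                            # structural error
x = z @ np.array([0.3, 0.2]) + 0.8 * u + rng.normal(size=n)   # endogenous regressor
y = 1.0 * x + u                                   # true beta = 1

Z = np.column_stack([np.ones(n), z])              # instrument matrix (with constant)
X = np.column_stack([np.ones(n), x])              # regressor matrix (with constant)

Pz = Z @ np.linalg.solve(Z.T @ Z, Z.T)            # projection onto the instruments
XtPzX = X.T @ Pz @ X
beta = np.linalg.solve(XtPzX, X.T @ Pz @ y)       # 2SLS estimates

resid = y - X @ beta
sigma2 = resid @ resid / (n - X.shape[1])
se = np.sqrt(sigma2 * np.linalg.inv(XtPzX)[1, 1])
print("2SLS beta:", beta[1], " t-statistic (H0: beta = 0):", beta[1] / se)
```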
4

Castelli, Francesca <1982>. "Econometric models of financial risks." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2012. http://amsdottorato.unibo.it/4274/1/Castelli_Francesca_tesi.pdf.

Abstract:
The goal of this dissertation is to use statistical tools to analyze specific financial risks that have played dominant roles in the US financial crisis of 2008-2009. The first risk relates to the level of aggregate stress in the financial markets. I estimate the impact of financial stress on economic activity and monetary policy using structural VAR analysis. The second set of risks concerns the US housing market. There are in fact two prominent risks associated with a US mortgage, as borrowers can both prepay or default on a mortgage. I test the existence of unobservable heterogeneity in the borrower's decision to default or prepay on his mortgage by estimating a multinomial logit model with borrower-specific random coefficients.
5

Castelli, Francesca <1982>. "Econometric models of financial risks." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2012. http://amsdottorato.unibo.it/4274/.

Abstract:
The goal of this dissertation is to use statistical tools to analyze specific financial risks that have played dominant roles in the US financial crisis of 2008-2009. The first risk relates to the level of aggregate stress in the financial markets. I estimate the impact of financial stress on economic activity and monetary policy using structural VAR analysis. The second set of risks concerns the US housing market. There are in fact two prominent risks associated with a US mortgage, as borrowers can both prepay or default on a mortgage. I test the existence of unobservable heterogeneity in the borrower's decision to default or prepay on his mortgage by estimating a multinomial logit model with borrower-specific random coefficients.
6

Billah, Baki 1965. "Model selection for time series forecasting models." Monash University, Dept. of Econometrics and Business Statistics, 2001. http://arrow.monash.edu.au/hdl/1959.1/8840.

7

Spurway, Kayleigh Fay Nanette. "A study of the Consumption Capital Asset Pricing Model's applicability across four countries." Thesis, Rhodes University, 2014. http://hdl.handle.net/10962/d1013016.

Abstract:
Historically, the Consumption Capital Asset Pricing Model (C-CAPM) has performed poorly: estimated parameters are implausible, model restrictions are often rejected, and inferences appear to be very sensitive to the choice of economic agents' preferences. In this study, we estimate and test the C-CAPM with Constant Relative Risk Aversion (CRRA) using time series data from Germany, South Africa, Britain and America over relatively short time periods with the latest available data sets. Hansen's GMM approach is applied to estimate the parameters arising from this model. In general, estimated parameters fall outside the bounds specified by Lund & Engsted (1996) and Cuthbertson & Nitzsche (2004), even though the models are not rejected by the J-test and are associated with relatively small minimum distances.
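For reference, GMM estimation of the C-CAPM with CRRA preferences is typically built on the standard consumption Euler equation below (a textbook statement, not quoted from the thesis), where delta is the subjective discount factor, gamma the coefficient of relative risk aversion, R_{t+1} the asset return and z_t a vector of instruments:

```latex
E_t\!\left[\delta\left(\frac{C_{t+1}}{C_t}\right)^{-\gamma}(1+R_{t+1})-1\right]=0,
\qquad
E\!\left[\left\{\delta\left(\frac{C_{t+1}}{C_t}\right)^{-\gamma}(1+R_{t+1})-1\right\}z_t\right]=0 .
```

Hansen's GMM minimises a quadratic form in the sample analogues of the second set of moments, and the J-test referred to above checks the resulting over-identifying restrictions.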
8

Paraskevopoulos, Ioannis. "Econometric models applied to production theory." Thesis, Queen Mary, University of London, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.392498.

9

McGarry, Joanne S. "Seasonality in continuous time econometric models." Thesis, University of Essex, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.313064.

10

Gualdani, C. "Econometric analysis of network formation models." Thesis, University College London (University of London), 2017. http://discovery.ucl.ac.uk/1566643/.

Abstract:
This dissertation addresses topics in the econometrics of network formation models. Chapter 1 provides a review of the literature. Statistical models focus on the specification of the probability distribution of the network. Examples include models in which nodes are born sequentially and meet existing vertices through random meetings and network-based meetings. Within this group of models, special attention is devoted to the milestone work by Jackson and Rogers (2007): after discussing and replicating the main results of the paper, an extension of the original model is examined and fitted to a dataset of Google Plus users. Even though statistical models can reproduce the main characteristics of real networks relatively well, they usually lack microfoundations, which are essential for counterfactual analysis. The chapter hence moves to the econometrics of economic models of network formation, where agents form links in order to maximise a payoff function. Within this framework, Chapter 2 studies identification of the parameters governing agents' preferences in a static game of network formation, where links represent asymmetric relations between players. After showing existence of an equilibrium, partial identification arguments are provided without restrictions on equilibrium selection. The usual computational difficulties are attenuated by restricting attention to some local games of the network formation game and giving up on sharpness. Chapter 3 applies the methodology developed in Chapter 2 to empirically investigate which preferences are behind firms' decisions to appoint competitors' directors as executives. Using data on Italian companies, it is found that a firm i prefers its executives to sit on the board of a rival j when executives of other competitors are hosted too, possibly because this enables i to engage with them in "cheap talk" communications, besides having the opportunity to learn about j's decision-making process.
11

Adusumilli, Karun. "Essays on inference in econometric models." Thesis, London School of Economics and Political Science (University of London), 2018. http://etheses.lse.ac.uk/3760/.

Abstract:
This thesis contains three essays on inference in econometric models. Chapter 1 considers the question of bootstrap inference for Propensity Score Matching. Propensity Score Matching, where the propensity scores are estimated in a first step, is widely used for estimating treatment effects. In this context, the naive bootstrap is invalid (Abadie and Imbens, 2008). This chapter proposes a novel bootstrap procedure for this context, and demonstrates its consistency. Simulations and real data examples demonstrate the superior performance of the proposed method relative to using the asymptotic distribution for inference, especially when the degree of overlap in propensity scores is poor. General versions of the procedure can also be applied to other causal effect estimators such as inverse probability weighting and propensity score subclassification, potentially leading to higher order refinements for inference in such contexts. Chapter 2 tackles the question of inference in incomplete econometric models. In many economic and statistical applications, the observed data take the form of sets rather than points. Examples include bracket data in survey analysis, tumor growth and rock grain images in morphology analysis, and noisy measurements on the support function of a convex set in medical imaging and robotic vision. Additionally, nonparametric bounds on treatment effects under imperfect compliance can be expressed by means of random sets. This chapter develops a concept of nonparametric likelihood for random sets and its mean, known as the Aumann expectation, and proposes general inference methods by adapting the theory of empirical likelihood. Chapter 3 considers inference on the cumulative distribution function (CDF) in the classical measurement error model. It proposes both asymptotic and bootstrap based uniform confidence bands for the estimator of the CDF under measurement error. The proposed techniques can also be used to obtain confidence bands for quantiles, and perform various CDF-based tests such as goodness-of-fit tests for parametric models of densities, two sample homogeneity tests, and tests for stochastic dominance; all for the first time under measurement error.
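As context for Chapter 1, the sketch below shows a generic propensity-score-matching estimator of the average treatment effect on the treated (ATT) on simulated data: a logistic first stage followed by one-to-one nearest-neighbour matching on the estimated score. It is an assumed illustration of the estimator being bootstrapped, not the chapter's bootstrap procedure.

```python
# Sketch of a generic propensity-score-matching ATT estimator on simulated data:
# logistic first stage, then one-to-one nearest-neighbour matching on the score.
# Illustrative only; the bootstrap procedure proposed in Chapter 1 is not shown.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 1000
x = rng.normal(size=(n, 3))                       # covariates
p_true = 1 / (1 + np.exp(-x[:, 0]))               # true propensity score
d = rng.binomial(1, p_true)                       # treatment indicator
y = 2.0 * d + x[:, 0] + rng.normal(size=n)        # outcome, true ATT = 2

ps = LogisticRegression().fit(x, d).predict_proba(x)[:, 1]   # estimated scores

treated = np.where(d == 1)[0]
control = np.where(d == 0)[0]
# nearest control (on the estimated score) for each treated unit
matches = control[np.abs(ps[treated][:, None] - ps[control][None, :]).argmin(axis=1)]

att = np.mean(y[treated] - y[matches])
print("Matched ATT estimate:", att)
```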
12

Skvorchevsky, Alexander Evgenievich, and S. V. Larka. "Econometric models robust estimation practical aspects." Thesis, НТУ "ХПІ", 2016. http://repository.kpi.kharkov.ua/handle/KhPI-Press/28251.

13

Klongkratoke, Pittaya. "Econometric models in foreign exchange market." Thesis, University of Glasgow, 2016. http://theses.gla.ac.uk/7333/.

Abstract:
Given the significance of econometric models in the foreign exchange market, the purpose of this research is to examine some important issues in this area more closely. The research covers exchange rate pass-through into import prices, liquidity risk and expected returns in the currency market, and the common risk factors in currency markets. Firstly, given the importance of exchange rate pass-through in financial economics, the first empirical chapter studies the degree of exchange rate pass-through into import prices in emerging economies and developed countries, using panel evidence for comparison over the period 1970-2009. Pooled mean group estimation (PMGE) is used to investigate the short-run coefficients and error variance. In general, the results show that import prices are affected positively, though incompletely, by the exchange rate. Secondly, the following study addresses the question of whether there is a relationship between cross-sectional differences in foreign exchange returns and the sensitivities of the returns to fluctuations in liquidity, known as liquidity betas, by using a unique dataset of weekly order flow. Finally, the last study is in keeping with Lustig, Roussanov and Verdelhan (2011), who show that the large co-movement among exchange rates of different currencies supports a risk-based view of exchange rate determination. The study explores the identification of a slope factor in exchange rate changes. It first constructs monthly portfolios of currencies sorted on the basis of their forward discounts: the lowest interest rate currencies are contained in the first portfolio and the highest interest rate currencies in the last. The results show that portfolios with higher forward discounts tend to contain currencies with higher real interest rates overall, comparing the first and last portfolios, although fluctuations occur.
14

Wellman, David B. "Econometric models of local area agriculture /." free to MU campus, to others for purchase, 2001. http://wwwlib.umi.com/cr/mo/fullcit?p3025660.

15

Lui, Hon-kwong, and 呂漢光. "An econometric model of spouse selection." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1996. http://hub.hku.hk/bib/B30110750.

16

Pitrun, Ivet 1959. "A smoothing spline approach to nonlinear inference for time series." Monash University, Dept. of Econometrics and Business Statistics, 2001. http://arrow.monash.edu.au/hdl/1959.1/8367.

17

Limkriangkrai, Manapon. "An empirical investigation of asset-pricing models in Australia." University of Western Australia. Faculty of Business, 2007. http://theses.library.uwa.edu.au/adt-WU2007.0197.

Abstract:
[Truncated abstract] This thesis examines competing asset-pricing models in Australia with the goal of establishing the model which best explains cross-sectional stock returns. The research employs Australian equity data over the period 1980-2001, with the major analyses covering the more recent period 1990-2001. The study first documents that existing asset-pricing models namely the capital asset pricing model (CAPM) and domestic Fama-French three-factor model fail to meet the widely applied Merton's zero-intercept criterion for a well-specified pricing model. This study instead documents that the US three-factor model provides the best description of Australian stock returns. The three US Fama-French factors are statistically significant for the majority of portfolios consisting of large stocks. However, no significant coefficients are found for portfolios in the smallest size quintile. This result initially suggests that the largest firms in the Australian market are globally integrated with the US market while the smallest firms are not. Therefore, the evidence at this point implies domestic segmentation in the Australian market. This is an unsatisfying outcome, considering that the goal of this research is to establish the pricing model that best describes portfolio returns. Given pervasive evidence that liquidity is strongly related to stock returns, the second part of the major analyses derives and incorporates this potentially priced factor to the specified pricing models ... This study also introduces a methodology for individual security analysis, which implements the portfolio analysis, in this part of analyses. The technique makes use of visual impressions conveyed by the histogram plots of coefficients' p-values. A statistically significant coefficient will have its p-values concentrated at below a 5% level of significance; a histogram of p-values will not have a uniform distribution ... The final stage of this study employs daily return data as an examination of what is indeed the best pricing model as well as to provide a robustness check on monthly return results. The daily result indicates that all three US Fama-French factors, namely the US market, size and book-to-market factors as well as LIQT are statistically significant, while the Australian three-factor model only exhibits one significant market factor. This study has discovered that it is in fact the US three-factor model with LIQT and not the domestic model, which qualifies for the criterion of a well-specified asset-pricing model and that it best describes Australian stock returns.
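The kind of time-series regression behind such tests can be sketched as follows, under assumed names and synthetic data: portfolio excess returns are regressed on the three US Fama-French factors plus a liquidity factor (here called LIQT, following the abstract), and the zero-intercept condition is checked through the t-statistic on the constant.

```python
# Sketch of the kind of time-series factor regression used above, with assumed
# names and synthetic data: portfolio excess returns on the three US factors
# plus a liquidity factor (LIQT), checking the zero-intercept condition.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
T = 144  # months, roughly 1990-2001
factors = pd.DataFrame(rng.normal(0, 0.04, (T, 4)),
                       columns=["MKT_US", "SMB_US", "HML_US", "LIQT"])
excess_ret = 0.9 * factors["MKT_US"] + 0.2 * factors["HML_US"] + rng.normal(0, 0.02, T)

res = sm.OLS(excess_ret, sm.add_constant(factors)).fit()
print(res.params)
print("alpha t-statistic (zero-intercept criterion):", res.tvalues["const"])
```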
18

Tangen, Alyssa. "The Impacts of Expected Structural Changes in Demand for Agricultural Commodities in China and India on World Agriculture." Thesis, North Dakota State University, 2009. https://hdl.handle.net/10365/29866.

Abstract:
The objective of this study is to evaluate the effects of changes in import and export demand in China and India on United States and global agriculture in 2018. A spatial equilibrium model is developed to optimize production and trade in China, India, and other major importing and exporting regions in the world. This research focuses on four primary crops: wheat, corn, rice and soybeans. In the model, China and India are divided into 31 and 14 producing and consuming regions, respectively. The model also includes five exporting countries and ten importing countries/regions. The results indicate that India will be able to stay largely self-sufficient in 2018 and that China will increase its soybean and corn imports to meet rising domestic demand. The research also gives perspectives on production and trade in the United States and other major exporting and importing countries.
19

Dyrberg, Rommer Anne. "Accounting-based credit-scoring models : econometric investigations /." Copenhagen, 2005. http://www.gbv.de/dms/zbw/505621215.pdf.

20

Mavroeidis, Sophocles. "Econometric issues in forward-looking monetary models." Thesis, University of Oxford, 2002. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.273303.

Abstract:
Recently, single-equation approaches for estimating structural models have become popular in the monetary economics literature. In particular, single-equation Generalized Method of Moments estimators have been used for estimating forward-looking models with rational expectations. Two important examples are found in Clarida, Gali, and Gertler (1998) for the estimation of forward-looking Taylor rules and in Gali and Gertler (1999) for the estimation of a forward-looking model for inflation dynamics. In this thesis, we address the issues of identification which have been overlooked due to the incompleteness of the single-equation formulations. We provide extensions to existing results on the properties of GMM estimators and inference under weak identification, pertaining to situations in which only functions of the parameters of interest are identified, and structural residuals exhibit negative autocorrelation. We also characterize the power of the Hansen test to detect mis-specification, and address the issues arising from using too many irrelevant instruments as well as from general corrections for residual autocorrelation, beyond what is implied by the maintained model. In general, we show that the non-modelled variables cannot be weakly exogenous for the parameters of interest, and that they are informative about the identification and mis-specification of the model. Modelling the reduced form helps identify pathological situations in which the structural parameters are weakly identified and the GMM estimators are inconsistent and biased in the direction of OLS. We also find the OLS bias to be increasing in the number of over-identifying instruments, even when the latter are irrelevant, thus demonstrating the dangers of using too many potentially irrelevant instruments. Finally, with regard to the "New Phillips curve", we conclude that, for the US economy, this model is either unidentified or mis-specified, casting doubt on its utility as a model of inflation dynamics.
21

Zeileis, Achim, Friedrich Leisch, Christian Kleiber, and Kurt Hornik. "Monitoring structural change in dynamic econometric models." SFB Adaptive Information Systems and Modelling in Economics and Management Science, WU Vienna University of Economics and Business, 2002. http://epub.wu.ac.at/1296/1/document.pdf.

Abstract:
The classical approach to testing for structural change employs retrospective tests using a historical data set of a given length. Here we consider a wide array of fluctuation-type tests in a monitoring situation: given a history period for which a regression relationship is known to be stable, we test whether incoming data are consistent with the previously established relationship. Procedures based on estimates of the regression coefficients are extended in three directions: we introduce (a) procedures based on OLS residuals, (b) rescaled statistics and (c) alternative asymptotic boundaries. Compared to the existing tests, our extensions offer better power against certain alternatives, improved size in finite samples for dynamic models, and ease of computation, respectively. We apply our methods to two data sets, German M1 money demand and U.S. labor productivity.
Series: Report Series SFB "Adaptive Information Systems and Modelling in Economics and Management Science"
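A toy version of such a monitoring scheme, assuming a placeholder boundary and critical value rather than the paper's, is sketched below: fit OLS on the stable history period, then track the scaled cumulative sum of out-of-sample residuals against a boundary as new observations arrive.

```python
# Illustrative monitoring sketch in the spirit of OLS-residual-based fluctuation
# tests; the boundary shape and critical value are placeholders, not the paper's.
import numpy as np

rng = np.random.default_rng(4)
n_hist, n_mon = 120, 60
x = rng.normal(size=n_hist + n_mon)
beta = np.where(np.arange(n_hist + n_mon) < n_hist + 30, 1.0, 1.8)   # break during monitoring
y = 0.5 + beta * x + rng.normal(scale=0.5, size=n_hist + n_mon)

X_hist = np.column_stack([np.ones(n_hist), x[:n_hist]])
coef, *_ = np.linalg.lstsq(X_hist, y[:n_hist], rcond=None)           # history-period OLS
sigma = np.std(y[:n_hist] - X_hist @ coef, ddof=2)

cusum = 0.0
for k in range(n_mon):
    t = n_hist + k
    resid = y[t] - (coef[0] + coef[1] * x[t])
    cusum += resid / (sigma * np.sqrt(n_hist))
    boundary = 2.0 * (1 + k / n_hist)             # placeholder boundary
    if abs(cusum) > boundary:
        print(f"Structural change signalled at monitoring observation {k + 1}")
        break
```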
22

Weier, Annette 1960. "Demutualisation in the Australian life insurance industry." Monash University, Dept. of Economics, 2000. http://arrow.monash.edu.au/hdl/1959.1/8371.

23

Angelov, Nikolay. "Essays on unit-root testing and on discrete-response modelling of firm mergers /." Uppsala : Department of Economics, Uppsala University, 2006. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-6358.

24

Bokan, Nikola. "On taxes, labour market distortions and product imperfections." Thesis, University of St Andrews, 2010. http://hdl.handle.net/10023/3053.

Abstract:
This thesis aims to provide new and useful insights into the effects that various tax, labour and product market reforms have on overall economic performance. Additionally, it aims to provide insights into optimal monetary and fiscal policy behaviour within an economy characterized by various real labour market frictions. We analyze the benefits of tax reforms and their effectiveness relative to product or other labour market reforms. A general equilibrium model with imperfect competition, wage bargaining and different forms of tax distortions is applied in order to analyze these issues. We find that structural reforms imply short run costs but long run gains; that the long run gains outweigh the short run costs; and that the financing of such reforms will be the main stumbling block. We also find that the effectiveness of various reform instruments depends on the policy maker's ultimate objective. More precisely, tax reforms are more effective for welfare gains, but market liberalization is more valuable for generating employment. In order to advance our understanding of the tax and product market reform processes, we then develop a dynamic stochastic general equilibrium model which incorporates search-matching frictions, costly firing and endogenous job destruction decisions, as well as a distortionary progressive wage tax and a flat payroll tax. We confirm the negative effects of marginal tax distortions on overall economic performance. We also find a positive effect of an increase in wage tax progressivity and product market liberalization on employment, output and consumption. Following a positive technology shock, the volatility of employment, output and consumption turns out to be lower in the reformed economy, whereas the impact effect on inflation is more pronounced. Following a positive government spending shock, the volatility of employment, output and consumption is again lower in the reformed economy, but the inflation response is stronger over the whole adjustment path. We also find detrimental effects on employment and output of a tax reform which keeps the marginal tax wedge unchanged by partially offsetting a decrease in the payroll tax with an increase in the wage tax rate. If this reform is anticipated one period in advance, the negative effects remain over the whole transition path. We investigate the optimal monetary and fiscal policy implications of the New-Keynesian setup enriched with search-matching frictions. We show that the optimal policy features deviation from strict price stability, and that the Ramsey planner uses both inflation and taxes in order to fully exploit the benefits of the productivity increase following a positive productivity shock. We also find that the optimal tax rate and government liabilities inherit the time series properties of the underlying shocks. Moreover, we identify a certain degree of overshooting in inflation and tax rates following a positive productivity shock, and a certain degree of undershooting following a positive government spending shock, as a consequence of the assumed commitment of the policy maker.
25

Li, Ke 1969. "A general equilibrium analysis of the division of labour : violation and enforcement of property rights, impersonal networking decisions and bundling sale." Monash University, School of Asian Languages and Studies, 2001. http://arrow.monash.edu.au/hdl/1959.1/9256.

26

Forchini, Giovanni. "Exact distribution theory for some econometric problems." Thesis, University of Southampton, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.242631.

27

Fok, Dennis. "Advanced econometric marketing models = Geavanceerde econometrische marketing modellen /." Rotterdam : Erasmus Research Institute of Management, 2003. http://aleph.unisg.ch/hsgscan/hm00084593.pdf.

28

Ouyang, Desheng. "Nonparametric estimation of econometric models with categorical variables." Texas A&M University, 2005. http://hdl.handle.net/1969.1/4298.

Abstract:
In this dissertation I investigate several topics in the field of nonparametric econometrics. In chapter II, we consider the problem of estimating a nonparametric regression model with only categorical regressors. We investigate the theoretical properties of least squares cross-validated smoothing parameter selection, establish the rate of convergence (to zero) of the smoothing parameters for relevant regressors, and show that there is a high probability that the smoothing parameters for irrelevant regressors converge to their upper bound values, thereby smoothing out the irrelevant regressors. In chapter III, we consider the problem of estimating a joint distribution defined over a set of discrete variables. We use a smoothing kernel estimator to estimate the joint distribution, allowing for the case in which some of the discrete variables are uniformly distributed, and explicitly address the vector-valued smoothing parameter case due to its practical relevance. We show that the cross-validated smoothing parameters differ in their asymptotic behavior depending on whether a variable is uniformly distributed or not. In chapter IV, we consider k-nearest-neighbour (k-nn) estimation of a regression function with k selected by a cross-validation method. We consider both the local constant and local linear cases. In both cases, the convergence rate of the cross-validated k is established. In chapter V, we consider nonparametric estimation of regression functions with mixed categorical and continuous data. The smoothing parameters in the model are selected by a cross-validation method. The uniform convergence rate of the kernel regression function estimator with weakly dependent data is derived.
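The flavour of the cross-validation problem in chapter II can be seen in a small assumed example with a single unordered categorical regressor: local-constant estimation with the Aitchison-Aitken kernel, with the smoothing parameter chosen by least-squares (leave-one-out) cross-validation. At its upper bound the parameter gives every category equal weight, smoothing the regressor out entirely.

```python
# Sketch (not the dissertation's code): local-constant kernel regression with
# one unordered categorical regressor, Aitchison-Aitken kernel, and leave-one-out
# least-squares cross-validation for the smoothing parameter lambda.
import numpy as np

rng = np.random.default_rng(5)
n, c = 300, 4                                     # observations, categories
x = rng.integers(0, c, n)
y = np.array([0.0, 0.5, 0.5, 2.0])[x] + rng.normal(scale=0.3, size=n)

def aa_kernel(xi, xj, lam):
    # weight 1 - lam on the own category, lam / (c - 1) on the others
    return np.where(xi == xj, 1.0 - lam, lam / (c - 1))

def loo_cv(lam):
    K = aa_kernel(x[:, None], x[None, :], lam)
    np.fill_diagonal(K, 0.0)                      # leave one out
    yhat = (K @ y) / K.sum(axis=1)
    return np.mean((y - yhat) ** 2)

grid = np.linspace(1e-3, (c - 1) / c, 50)         # upper bound (c-1)/c = uniform weights
lam_star = grid[np.argmin([loo_cv(lam) for lam in grid])]
print("Cross-validated smoothing parameter:", lam_star)
```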
29

Silvestrini, Andrea. "Essays on aggregation and cointegration of econometric models." Doctoral thesis, Universite Libre de Bruxelles, 2009. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/210304.

Abstract:
This dissertation can be broadly divided into two independent parts. The first three chapters analyse issues related to temporal and contemporaneous aggregation of econometric models. The fourth chapter contains an application of Bayesian techniques to investigate whether the post transition fiscal policy of Poland is sustainable in the long run and consistent with an intertemporal budget constraint.

Chapter 1 surveys the econometric methodology of temporal aggregation for a wide range of univariate and multivariate time series models.

A unified overview of temporal aggregation techniques for this broad class of processes is presented in the first part of the chapter and the main results are summarized. In each case, assuming that the underlying process at the disaggregate frequency is known, the aim is to find the appropriate model for the aggregated data. Additional topics concerning temporal aggregation of ARIMA-GARCH models (see Drost and Nijman, 1993) are discussed and several examples presented. Systematic sampling schemes are also reviewed.

Multivariate models, which show interesting features under temporal aggregation (Breitung and Swanson, 2002, Marcellino, 1999, Hafner, 2008), are examined in the second part of the chapter. In particular, the focus is on temporal aggregation of VARMA models and on the related concept of spurious instantaneous causality, which is not a time series property invariant to temporal aggregation. On the other hand, as pointed out by Marcellino (1999), other important time series features, such as cointegration and the presence of unit roots, are invariant to temporal aggregation and are not induced by it.

Some empirical applications based on macroeconomic and financial data illustrate all the techniques surveyed and the main results.

Chapter 2 is an attempt to monitor fiscal variables in the Euro area, building an early warning signal indicator for assessing the development of public finances in the short-run and exploiting the existence of monthly budgetary statistics from France, taken as "example country".

The application is conducted focusing on the cash State deficit, looking at components from the revenue and expenditure sides. For each component, monthly ARIMA models are estimated and then temporally aggregated to the annual frequency, as the policy makers are interested in yearly predictions.

The short-run forecasting exercises carried out for years 2002, 2003 and 2004 highlight the fact that the one-step-ahead predictions based on the temporally aggregated models generally outperform those delivered by standard monthly ARIMA modeling, as well as the official forecasts made available by the French government, for each of the eleven components and thus for the whole State deficit. More importantly, by the middle of the year, very accurate predictions for the current year are made available.

The proposed method could be extremely useful, providing policy makers with a valuable indicator when assessing the development of public finances in the short-run (one year horizon or even less).
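As a simple illustration of the idea (and only of the simplest version of it), an annual one-step-ahead prediction can be pieced together from a monthly ARIMA model by forecasting the next twelve months and summing them; the chapter instead works with formally aggregated annual models, which the assumed sketch below does not derive.

```python
# Sketch: annual prediction from a monthly ARIMA model by summing twelve monthly
# forecasts. Synthetic data; not the chapter's model-aggregation procedure.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(6)
monthly = 100 + np.cumsum(rng.normal(0.2, 1.0, 240))   # placeholder monthly budget component

res = ARIMA(monthly, order=(1, 1, 1)).fit()
annual_forecast = res.forecast(steps=12).sum()          # implied prediction for the coming year
print("Annual one-step-ahead forecast:", annual_forecast)
```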

Chapter 3 deals with the issue of forecasting contemporaneous time series aggregates. The performance of "aggregate" and "disaggregate" predictors in forecasting contemporaneously aggregated vector ARMA (VARMA) processes is compared. An aggregate predictor is built by forecasting directly the aggregate process, as it results from contemporaneous aggregation of the data generating vector process. A disaggregate predictor is a predictor obtained from aggregation of univariate forecasts for the individual components of the data generating vector process.

The econometric framework is broadly based on Lütkepohl (1987). The necessary and sufficient condition for the equality of mean squared errors associated with the two competing methods in the bivariate VMA(1) case is provided. It is argued that the condition of equality of predictors as stated in Lütkepohl (1987), although necessary and sufficient for the equality of the predictors, is sufficient (but not necessary) for the equality of mean squared errors.

Furthermore, it is shown that the same forecasting accuracy for the two predictors can be achieved using specific assumptions on the parameters of the VMA(1) structure.

Finally, an empirical application that involves the problem of forecasting the Italian monetary aggregate M1 on the basis of annual time series ranging from 1948 until 1998, prior to the creation of the European Economic and Monetary Union (EMU), is presented to show the relevance of the topic. In the empirical application, the framework is further generalized to deal with heteroskedastic and cross-correlated innovations.

Chapter 4 deals with a cointegration analysis applied to the empirical investigation of fiscal sustainability. The focus is on a particular country: Poland. The choice of Poland is not random. First, the motivation stems from the fact that fiscal sustainability is a central topic for most of the economies of Eastern Europe. Second, this is one of the first countries to start the transition process to a market economy (since 1989), providing a relatively favorable institutional setting within which to study fiscal sustainability (see Green, Holmes and Kowalski, 2001). The emphasis is on the feasibility of a permanent deficit in the long-run, meaning whether a government can continue to operate under its current fiscal policy indefinitely.

The empirical analysis to examine debt stabilization is made up by two steps.

First, a Bayesian methodology is applied to conduct inference about the cointegrating relationship between budget revenues and (inclusive of interest) expenditures and to select the cointegrating rank. This task is complicated by the conceptual difficulty linked to the choice of the prior distributions for the parameters relevant to the economic problem under study (Villani, 2005).

Second, Bayesian inference is applied to the estimation of the normalized cointegrating vector between budget revenues and expenditures. With a single cointegrating equation, some known results concerning the posterior density of the cointegrating vector may be used (see Bauwens, Lubrano and Richard, 1999).

The priors used in the paper lead to straightforward posterior calculations which can be easily performed.

Moreover, the posterior analysis leads to a careful assessment of the magnitude of the cointegrating vector. Finally, it is shown to what extent the likelihood of the data is important in revising the available prior information, relying on numerical integration techniques based on deterministic methods.


Doctorate in Economic and Management Sciences

30

Kapetanios, George. "Essays on the econometric analysis of threshold models." Thesis, University of Cambridge, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.286704.

31

Hall, Stephen George Frederick. "Solving and evaluating large non-linear econometric models." Thesis, Queen Mary, University of London, 1986. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.261290.

32

Stamatogiannis, Michalis P. "Econometric inference in models with nonstationary time series." Thesis, University of Nottingham, 2010. http://eprints.nottingham.ac.uk/11950/.

Abstract:
We investigate the finite sample behaviour of the ordinary least squares (OLS) estimator in vector autoregressive (VAR) models. The data generating process is assumed to be a purely nonstationary first-order VAR. Using Monte Carlo simulation and numerical optimization we derive response surfaces for OLS bias and variance in terms of VAR dimensions both under correct model specification and under several types of over-parameterization: we include a constant, a constant and trend, and introduce excess autoregressive lags. Correction factors are introduced that minimise the mean squared error (MSE) of the OLS estimator. Our analysis improves and extends one of the main finite-sample multivariate analytical bias results of Abadir, Hadri and Tzavalis (1999), generalises the univariate variance and MSE results of Abadir (1995) to a multivariate setting, and complements various asymptotic studies. The distribution of unit root test statistics generally contains nuisance parameters that correspond to the correlation structure of the innovation errors. The presence of such nuisance parameters can lead to serious size distortions. To address this issue, we adopt an approach based on the characterization of the class of asymptotically similar critical regions for the unit root hypothesis and the application of two new optimality criteria for the choice of a test within this class. The correlation structure of the innovation sequence takes the form of a moving average process, the order of which is determined by an appropriate information criterion. Limit distribution theory for the resulting test statistics is developed and simulation evidence suggests that our statistics have substantially reduced size distortions while retaining good power properties. Stock return predictability is a fundamental issue in asset pricing. The conclusions of empirical analyses on the existence of stock return predictability vary according to the time series properties of the economic variables considered as potential predictors. Given the uncertainty about the degree of persistence of these variables, it is important to operate in the most general possible modelling framework. This possibility is provided by the IVX methodology developed by Phillips and Magdalinos (2009) in the context of cointegrated systems with no deterministic components. This method is modified in order to apply to multivariate systems of predictive regressions with an intercept in the model. The resulting modified IVX approach yields chi-squared inference for general linear restrictions on the regression coefficients that is robust to the degree of persistence of the predictor variables. In addition to extending the class of generating mechanisms for predictive regression, the approach extends the range of testable hypotheses, assessing the combined effects of different explanatory variables on stock returns rather than the individual effect of each explanatory variable.
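The flavour of the Monte Carlo exercise in the first chapter can be conveyed with a minimal sketch: simulate a purely nonstationary (random-walk) bivariate VAR(1), estimate it by OLS, and average the estimation error over replications. The design below is an assumption for illustration and does not reproduce the thesis's response surfaces or correction factors.

```python
# Sketch: average OLS bias of the coefficient matrix in a bivariate random-walk
# VAR(1) (true A = I), over Monte Carlo replications. Illustrative only.
import numpy as np

rng = np.random.default_rng(7)
T, k, reps = 100, 2, 2000
bias = np.zeros((k, k))

for _ in range(reps):
    eps = rng.normal(size=(T, k))
    y = np.cumsum(eps, axis=0)                    # y_t = y_{t-1} + eps_t, so A = I
    Y, X = y[1:], y[:-1]
    A_hat = np.linalg.solve(X.T @ X, X.T @ Y).T   # OLS estimate of A
    bias += (A_hat - np.eye(k)) / reps

print("Mean OLS bias of A (true A = I):")
print(bias)
```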
33

Fernanda, P. M. "Instrument Selection in Econometric Models: Consequences and Methods." Thesis, University of Birmingham, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.528389.

34

Li, Yang, and 李杨. "Statistical inference for some econometric time series models." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2014. http://hdl.handle.net/10722/195984.

Abstract:
With increasing economic activity, people have more and more interest in econometric models. Two mainstream econometric models have been very popular in recent decades. One is the quantile autoregressive (QAR) model, which allows varying coefficients in linear time series and greatly broadens the range of regression research. The first topic of this thesis is the modeling of the QAR model. We propose two important measures, quantile correlation (QCOR) and quantile partial correlation (QPCOR). We then apply them to QAR models, and introduce two valuable quantities, the quantile autocorrelation function (QACF) and the quantile partial autocorrelation function (QPACF). This allows us to extend the Box-Jenkins three-stage procedure (model identification, model parameter estimation, and model diagnostic checking) from classical autoregressive models to quantile autoregressive models. Specifically, the QPACF of an observed time series can be employed to identify the autoregressive order, while the QACF of residuals obtained from the model can be used to assess the model adequacy. We not only demonstrate the asymptotic properties of QCOR, QPCOR, QACF and QPACF, but also show the large sample results of the QAR estimates and the quantile version of the Ljung-Box test. Moreover, we obtain bootstrap approximations to the distributions of the parameter estimators and proposed measures. Simulation studies indicate that the proposed methods perform well in finite samples, and an empirical example is presented to illustrate the usefulness of the QAR model. The other important econometric model is the autoregressive conditional duration (ACD) model, which was developed to describe ultra high frequency (UHF) financial time series data. The second topic of this thesis incorporates the ACD model with one of the extreme value distributions, the Fréchet distribution. We apply maximum likelihood estimation (MLE) to Fréchet ACD models and derive their generalized residuals for model adequacy checking. It is noteworthy that simulations show a relatively greater sensitivity of the linear parameters to sampling errors. This phenomenon reflects the skewness of the Fréchet distribution and suggests to practitioners a way of assessing model accuracy. Furthermore, we present the empirical sizes and powers of the Box-Pierce, Ljung-Box and modified Box-Pierce statistics as comparisons for the proposed portmanteau statistic. In addition to the Fréchet ACD, we also systematically analyze the Weibull ACD, where the Weibull distribution is the other nonnegative extreme value distribution. The last topic of the thesis covers the estimation and diagnostic checking of the Weibull ACD model. Investigating the MLE in this model reveals a slight sensitivity of the linear parameters. However, the simulations show an obvious trade-off between the skewness of the Weibull distribution and the sampling error. Moreover, the asymptotic properties are also studied for the generalized residuals, and a goodness-of-fit test is employed to obtain a portmanteau statistic. The simulation results on size and power show that the Weibull ACD is superior to the Fréchet ACD in detecting a wrongly specified model. This is meaningful in practice.
Statistics and Actuarial Science
Doctoral
Doctor of Philosophy
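To fix ideas, a first-order quantile autoregression can be fitted at several quantile levels with off-the-shelf quantile regression, as in the assumed sketch below (synthetic data; the QACF/QPACF identification and diagnostic tools proposed in the thesis are not implemented):

```python
# Sketch: QAR(1) fitted at a few quantile levels via quantile regression of y_t
# on y_{t-1}. Synthetic AR(1) data with heavy-ish tails; illustrative only.
import numpy as np
import statsmodels.api as sm
from statsmodels.regression.quantile_regression import QuantReg

rng = np.random.default_rng(8)
n = 500
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + rng.standard_t(df=5)

Y, X = y[1:], sm.add_constant(y[:-1])
for tau in (0.1, 0.5, 0.9):
    res = QuantReg(Y, X).fit(q=tau)
    print(f"tau = {tau}: intercept = {res.params[0]:.3f}, slope = {res.params[1]:.3f}")
```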
35

Lu, Maozu. "The encompassing principle and evaluation of econometric models." Thesis, University of Southampton, 1991. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.316084.

36

Sherrell, Neill. "The estimation and specification of spatial econometric models." Thesis, University of Bristol, 1990. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.281861.

37

Jenkins, Irene D. (Irene Diane), and Mary Helen Schaeffer. "Econometric models of eleven single family housing markets." Thesis, Massachusetts Institute of Technology, 1989. http://hdl.handle.net/1721.1/67381.

Abstract:
Thesis (M.S.)--Massachusetts Institute of Technology, Dept. of Urban Studies and Planning, 1989.
Includes bibliographical references (leaves 80-84).
by Irene D. Jenkins and Mary Helen Schaeffer.
M.S.
38

Fezzi, Carlo <1980>. "Econometric models for the analysis of electricity markets." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2007. http://amsdottorato.unibo.it/433/1/tesi_dottorato_carlofezzi.pdf.

39

Fezzi, Carlo <1980>. "Econometric models for the analysis of electricity markets." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2007. http://amsdottorato.unibo.it/433/.

40

Ntantamis, Christos. "Identifying hidden boundaries within economic data in the time and space domains." Thesis, McGill University, 2009. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=115616.

Abstract:
This thesis presents methodological contributions to the modeling of regimes in the time or space domain of economic data by introducing a number of algorithms from engineering applications and substantially modifying them so that they can be used in economic applications. The objective is twofold: to estimate the parameters of such models, and to identify the corresponding boundaries between regimes. The models used belong to the class of Finite Mixture Models and their natural extensions for the case of dependent data, Hidden Markov Models (see McLachlan and Peel 2000). Mixture models are extremely useful in the modeling of heterogeneity in a cluster analysis context; the components of the mixtures, or the states, correspond to the different latent groups, e.g. homogeneous regions such as housing submarkets, or regimes in the case of stock market returns.
The thesis discusses alternative estimation algorithms that provide greater model flexibility in capturing the underlying data dynamics, and procedures that allow the selection of the number of regimes in the data.
The first part introduces a model of spatial association for housing markets, which is approached in the context of spatial heterogeneity. A Hedonic Price Index model is considered, i.e. a model where the price of the dwelling is determined by its structural and neighborhood characteristics. Remaining spatial heterogeneity is modeled as a Finite Mixture Model for the residuals of the Hedonic Index. The Finite Mixture Model is estimated using the Figueiredo and Jain (2002) approach. The overall ability of the model to identify spatial heterogeneity is evaluated through a set of simulations. The model was applied to Los Angeles County housing price data for the year 2002. The statistically identified number of submarkets, after taking into account the dwellings' structural characteristics, is found to be considerably smaller than the number imposed by either geographical or administrative boundaries, thus making the model more suitable for mass assessment applications.
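The two-step idea in that first part can be sketched as follows on simulated data: fit a hedonic regression, then fit a finite Gaussian mixture to its residuals and read the components as latent submarkets. The variable names, the synthetic data and the use of an EM-based mixture with BIC selection (rather than the Figueiredo and Jain (2002) estimator) are assumptions for illustration.

```python
# Sketch: hedonic regression followed by a Gaussian finite mixture on the
# residuals, with the number of components (submarkets) chosen by BIC.
# Assumed illustration; not the thesis's Figueiredo-Jain estimator.
import numpy as np
import statsmodels.api as sm
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(9)
n = 1500
sqm = rng.normal(120, 30, n)                      # dwelling size
rooms = rng.integers(2, 7, n)
submkt = rng.integers(0, 3, n)                    # latent submarket
premium = np.array([0.0, 0.3, 0.7])[submkt]       # submarket price premium
log_price = 10 + 0.004 * sqm + 0.05 * rooms + premium + rng.normal(0, 0.1, n)

X = sm.add_constant(np.column_stack([sqm, rooms]))
resid = sm.OLS(log_price, X).fit().resid.reshape(-1, 1)   # hedonic residuals

fits = [GaussianMixture(n_components=k, random_state=0).fit(resid) for k in range(1, 7)]
best = min(fits, key=lambda m: m.bic(resid))
print("Selected number of submarkets (by BIC):", best.n_components)
```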
The second part of the thesis introduces a Duration Hidden Markov Model to represent regime switches in the stock market; the duration of each state of the Markov Chain is explicitly modeled as a random variable that depends on a set of exogenous variables. Therefore, the model not only allows the endogenous determination of the different regimes but also estimates the effect of the explanatory variables on the regimes' durations. The model is estimated on NYSE returns using the short-term interest rate and the interest rate spread as exogenous variables. The estimation results coincide with existing findings in the literature, in terms of regimes' characteristics, and are compatible with basic economic intuition, in terms of the effect of the exogenous variables on regimes' durations.
The final part of the thesis considers a Hidden Markov Model (HMM) approach to the task of detecting structural breaks, which are defined as the data points where the underlying Markov Chain switches from one state to another. A new methodology is proposed in order to estimate all aspects of the model: the number of regimes, the parameters of the model corresponding to each regime, and the locations of regime switches. One of the main advantages of the proposed methodology is that it allows for different model specifications across regimes. The performance of the overall procedure, denoted IMI after the initials of its component algorithms, is validated by two sets of simulations: one in which only the parameters are permitted to differ across regimes, and one that also permits differences in the functional forms. The IMI method performs very well across all specifications in both sets of simulations.
APA, Harvard, Vancouver, ISO, and other styles
41

Naqvi, Farzana. "GE-PAK : a computable general equilibrium model of energy-economy interaction in Pakistan." Phd thesis, Department of Economics, 1995. http://hdl.handle.net/2123/3964.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Moniz, Nuno Miguel Martins. "A procura de produtos do tabaco em Portugal (1986-2003) : estimação de elasticidades." Master's thesis, 2006. http://hdl.handle.net/10400.3/504.

Full text
Abstract:
Master's dissertation in Management (MBA).
ABSTRACT: This work studies the behaviour of tobacco consumers in Portugal over the period 1986-2003. Hypotheses from several models of tobacco consumption are tested using conventional models and addiction models (myopic and rational addiction models), which introduce the dimensions of reinforcement, tolerance and abstinence that characterise addictive goods. The results point to the non-applicability of the addiction models, with the final specification taking a double-log form in a conventional model in which the relevant variables are price, GDP per capita and a time trend. The study estimates a price elasticity of demand of -0.466 and an income elasticity of -0.233. Several multipliers of tobacco excise tax receipts are calculated for changes in the price and in the components of the tax.
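For readers who want to see what the conventional double-log specification amounts to in practice, the sketch below (synthetic data, illustrative variable names, statsmodels assumed available) estimates a log-log demand equation in which the coefficients on log price and log income are read directly as elasticities; it does not reproduce the study's data or exact results.

```python
# Sketch of a conventional double-log tobacco demand equation: with quantity,
# price and income in logs, the slope coefficients are the price and income
# elasticities. Synthetic annual data, purely illustrative.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
T = 18                                     # annual observations, e.g. 1986-2003
log_price = np.log(np.linspace(1.0, 2.5, T)) + rng.normal(0, 0.02, T)
log_gdp_pc = np.log(np.linspace(10.0, 14.0, T)) + rng.normal(0, 0.01, T)
trend = np.arange(T)
# Elasticities used to simulate the data: price -0.5, income -0.2 (illustrative).
log_q = 3.0 - 0.5 * log_price - 0.2 * log_gdp_pc - 0.01 * trend \
        + rng.normal(0, 0.03, T)

X = sm.add_constant(np.column_stack([log_price, log_gdp_pc, trend]))
fit = sm.OLS(log_q, X).fit()
print(f"price elasticity:  {fit.params[1]:.3f}")
print(f"income elasticity: {fit.params[2]:.3f}")
```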
APA, Harvard, Vancouver, ISO, and other styles
43

"Die kombinering van vooruitskattings : 'n toepassing op die vernaamste makro-ekonomiese veranderlikes." Thesis, 2014. http://hdl.handle.net/10210/9439.

Full text
Abstract:
M.Com. (Econometrics)
The main purpose of this study is the combining of forecasts with special reference to major macroeconomic series of South Africa. The study is based on econometric principles and makes use of three macro-economic variables, forecasted with four forecasting techniques. The macroeconomic variables which have been selected are the consumer price index, consumer expenditure on durable and semi-durable products and real M3 money supply. Forecasts of these variables have been generated by applying the Box-Jenkins ARIMA technique, Holt's two parameter exponential smoothing, the regression approach and mUltiplicative decomposition. Subsequently, the results of each individual forecast are combined in order to determine if forecasting errors can be minimized. Traditionally, forecasting involves the identification and application of the best forecasting model. However, in the search for this unique model, it often happens that some important independent information contained in one of the other models, is discarded. To prevent this from happening, researchers have investigated the idea of combining forecasts. A number of researchers used the results from different techniques as inputs into the combination of forecasts. In spite of the differences in their conclusions, three basic principles have been identified in the combination of forecasts, namely: i The considered forecasts should represent the widest range of forecasting techniques possible. Inferior forecasts should be identified. Predictable errors should be modelled and incorporated into a new forecast series. Finally, a method of combining the selected forecasts needs to be chosen. The best way of selecting a m ethod is probably by experimenting to find the best fit over the historical data. Having generated individual forecasts, these are combined by considering the specifications of the three combination methods. The first combination method is the combination of forecasts via weighted averages. The use of weighted averages to combine forecasts allows consideration of the relative accuracy of the individual methods and of the covariances of forecast errors among the methods. Secondly, the combination of exponential smoothing and Box-Jenkins is considered. Past errors of each of the original forecasts are used to determine the weights to attach to the two original forecasts in forming the combined forecasts. Finally, the regression approach is used to combine individual forecasts. Granger en Ramanathan (1984) have shown that weights can be obtained by regressing actual values of the variables of interest on the individual forecasts, without including a constant and with the restriction that weights add up to one. The performance of combination relative to the individual forecasts have been tested, given that the efficiency criterion is the minimization of the mean square errors. The results of both the individual and the combined forecasting methods are acceptable. Although some of the methods prove to be more accurate than others, the conclusion can be made that reliable forecasts are generated by individual and combined forecasting methods. It is up to the researcher to decide whether he wants to use an individual or combined method since the difference, if any, in the root mean square percentage errors (RMSPE) are insignificantly small.
APA, Harvard, Vancouver, ISO, and other styles
44

"Testing and estimating structural change in misspecified linear models." 1997. http://library.cuhk.edu.hk/record=b5889147.

Full text
Abstract:
Leung Wai-Kit.
Thesis (M.Phil.)--Chinese University of Hong Kong, 1997.
Includes bibliographical references (leaves 84-89).
Chapter 1 --- Acknowledgment --- p.6
Chapter I --- Introduction and a Structural Change Model --- p.7
Chapter 2 --- Introduction --- p.7
Chapter 3 --- A Structural Change Model and the Estimated Specification --- p.10
Chapter II --- Behavior of the Model under Stationarity --- p.13
Chapter 4 --- Assumptions for Stationary Regressors and Error --- p.13
Chapter 5 --- Consistency of the Break Point Estimator when Regressors and Error are Stationary and Correlated --- p.14
Chapter 6 --- Limiting Distribution of the Break Point Estimator when Regressors and Error are Stationary and Correlated --- p.19
Chapter 7 --- Sup-Wald Test when Regressors and Error are Stationary and Correlated --- p.21
Chapter III --- Behavior of the Model under Nonstationarity --- p.23
Chapter 8 --- Assumptions for Nonstationary Regressors and I(d) Error --- p.23
Chapter 9 --- Consistency of the Break Point Estimator under Nonstationary Regressors and I(d) Error --- p.26
Chapter 10 --- F Test under Nonstationary Regressors and I(d) Error --- p.31
Chapter IV --- Finite Sample Properties and Conclusion --- p.33
Chapter 11 --- Finite Sample Properties of the Break Point Estimator --- p.33
Chapter 12 --- Conclusion --- p.38
Chapter V --- Appendix and Reference --- p.40
Chapter 13 --- Appendix --- p.40
Chapter 14 --- References --- p.84
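The chapter listing above centres on the least-squares break point estimator and sup-Wald testing. As an illustration of the basic estimator only (not the thesis's treatment of misspecification, stationarity assumptions or I(d) errors), the sketch below searches over trimmed candidate break dates and keeps the one minimising the total sum of squared residuals; the data, model and trimming fraction are invented for the example.

```python
# Sketch: least-squares estimation of a single break point in a linear model.
# For each candidate break date (with 15% trimming), fit the regression on the
# two subsamples and keep the date minimising the total SSR. Illustrative only.
import numpy as np

rng = np.random.default_rng(3)
T, true_break = 200, 120
x = rng.normal(size=T)
beta = np.where(np.arange(T) < true_break, 1.0, 2.5)   # slope shifts at break
y = 0.5 + beta * x + rng.normal(0, 1, T)
X = np.column_stack([np.ones(T), x])

def ssr(Xs, ys):
    b, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
    e = ys - Xs @ b
    return e @ e

trim = int(0.15 * T)
candidates = list(range(trim, T - trim))
total_ssr = [ssr(X[:k], y[:k]) + ssr(X[k:], y[k:]) for k in candidates]
k_hat = candidates[int(np.argmin(total_ssr))]
print("estimated break date:", k_hat, "(true break at", true_break, ")")
```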
APA, Harvard, Vancouver, ISO, and other styles
45

"Die ekonometriese verbetering van die stochastiese vergelykings van 'n ekonometriese model : met spesifieke vermelding van stasionariteit en ko-integrasie." Thesis, 2012. http://hdl.handle.net/10210/6426.

Full text
Abstract:
M.Comm.
The aim of this study is the econometric improvement of the stochastic equations of an econometric model, with specific reference to the explanation and incorporation of stationarity and cointegration testing. The study is based on an existing macroeconometric forecasting model. The focus is not on improving the specification of individual equations per se, but rather on their econometric improvement; changes to the specification of individual equations have therefore only been made where test results strongly recommended it. The RAU model had not previously been subjected to structural stability, stationarity or cointegration testing, and therefore both the explanation and the implementation of these tests have been included in the study. It is, however, important to note that the main purpose of the stationarity and cointegration testing is not to substitute nonstationary data with data proven to be stationary, but rather to identify nonstationary and non-cointegrated data for future improvement and enhancement of the RAU model. Following these tests, parameters have been estimated for the individual equations of the three sectors of the RAU model (the real, balance of payments and monetary sectors). The results have then been evaluated against economic, statistical and econometric evaluation criteria. Where econometric inconsistencies arose from violations of the assumptions underlying the econometric tests, appropriate transformations have been applied in an attempt to resolve the problem. Thereafter, tests have been carried out to determine the forecasting ability of the model and to compare the model results with the a priori results. In general, the aim of the study, to econometrically improve the stochastic equations of the RAU model, has been achieved, on the basis of the overall better regression and evaluation results obtained. Following the completion of the study, a new approach to econometric model building, which makes provision for the inclusion of both stationarity and cointegration testing, is proposed.
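The stationarity and cointegration pre-testing described above can be illustrated with standard tools. The sketch below (synthetic series, statsmodels assumed available) runs an augmented Dickey-Fuller unit-root test and an Engle-Granger cointegration test; this is one common way such checks are implemented, not necessarily the exact procedure applied to the RAU model.

```python
# Sketch of the pre-testing step described above: an ADF unit-root test on a
# series and an Engle-Granger cointegration test on a candidate pair.
# Synthetic data; in practice these would be the model's actual variables.
import numpy as np
from statsmodels.tsa.stattools import adfuller, coint

rng = np.random.default_rng(5)
T = 300
x = np.cumsum(rng.normal(size=T))          # I(1) regressor (random walk)
y = 2.0 + 0.7 * x + rng.normal(size=T)     # cointegrated with x by construction

adf_stat, adf_p, *_ = adfuller(y)          # unit-root test on the level of y
print(f"ADF on y: stat={adf_stat:.2f}, p-value={adf_p:.3f}")

ci_stat, ci_p, _ = coint(y, x)             # Engle-Granger residual-based test
print(f"Engle-Granger y~x: stat={ci_stat:.2f}, p-value={ci_p:.3f}")
```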
APA, Harvard, Vancouver, ISO, and other styles
46

MARCELLINO, Massimiliano. "Essays on econometric modelling." Doctoral thesis, 1996. http://hdl.handle.net/1814/4999.

Full text
Abstract:
Defence date: 20 June 1996
Examining board: Prof. Clive Granger, University of California at San Diego ; Prof. Søren Johansen, University of Copenhagen ; Prof. Marco Lippi, University of Rome ; Prof. Grayham Mizon, EUI, Supervisor ; Prof. Pravin Trivedi, University of Indiana, Bloomington
PDF of thesis uploaded from the Library digitised archive of EUI PhD theses completed between 2013 and 2017
A further comment on econometric policy evaluation -- Temporal aggregation of a VARIMAX process -- Some temporal aggregation issues in empirical analysis -- Temporal disaggregation of time series: a further proposal -- The effects of linear aggregation on common trends and cycles
APA, Harvard, Vancouver, ISO, and other styles
47

Chen, Pu [author]. "Econometric structural models : a model selection approach / Chen Pu." 2002. http://d-nb.info/984651942/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Lin, Hsin-Yi. "GENERAL SPECIFICATION TESTS FOR ECONOMETRIC MODELS." 2004. http://www.cetd.com.tw/ec/thesisdetail.aspx?etdun=U0001-3006200415463300.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Lin, Hsin-Yi, and 林馨怡. "GENERAL SPECIFICATION TESTS FOR ECONOMETRIC MODELS." Thesis, 2004. http://ndltd.ncl.edu.tw/handle/90932030790126059504.

Full text
Abstract:
Ph.D.
National Taiwan University
Graduate Institute of Economics
Academic year 92 (2003/04)
In the economic literature, many non-nested models and conditional moment restrictions are imposed by different economic theories and econometric models. Once these models or restrictions are specified, it is very important to have specification tests of their validity. This dissertation therefore focuses on constructing general model specification tests for non-nested models and conditional moment restrictions. The tests proposed here have two main advantages. First, unlike those in the existing literature, they are applicable to models with possibly non-smooth moment functions. Second, they are consistent and asymptotically pivotal, both important properties for specification tests. The proposed tests therefore have wide applicability and are easy to implement.
In Chapter 1, we propose a generalized encompassing test (GET) that extends existing non-nested tests to models estimated by M-estimation, for which the estimating equation may or may not be differentiable. The idea of the GET is to compare the estimating equation with its pseudo-true value, the pseudo-true estimating equation. The limiting distribution of the GET is derived. We present GET statistics for models estimated by quantile regression (QR), censored QR, smoothed maximum score, symmetrically trimmed least squares and asymmetric least squares methods. An asymptotic Cox test (ACT) that extends the Cox tests to M-estimation is also proposed; the ACT and the GET are asymptotically equivalent. Because the asymptotic variance-covariance matrix of the GET is usually complicated, we also suggest a test based on a centered partial sum process to obtain an asymptotically pivotal statistic.
In Chapter 2, we propose a consistent conditional moment test that is applicable regardless of the differentiability of the moment functions. One approach to constructing a consistent conditional moment test is to check infinitely many unconditional moment conditions that are necessary and sufficient for the conditional moment restrictions. The test is based on this approach and checks unconditional moment conditions with an indicator weight function indexed by a nuisance parameter. By employing centered sequentially marked empirical processes, the estimation effect is eliminated; the test statistic is thus asymptotically pivotal and converges in distribution to a Kiefer process. The test is applicable to many conditional moment models, such as nonlinear regression models, QR models, likelihood models and conditional parametric models.
It is also known that QR can provide a complete description of the conditional behavior of the dependent variable. The large sample properties of the QR estimator have been well studied, and several efforts have been devoted to inference for QR at a specific quantile or across quantiles. Consistent model specification tests for QR at a specific quantile have been suggested by some researchers, but there is no test for the specification of QR across quantiles. In Chapter 3, a model specification test for QR across quantiles is proposed. By checking infinitely many unconditional moment restrictions that are sufficient and necessary for the model specification of QR across quantiles, the proposed test is consistent. The test statistic is asymptotically pivotal and converges in distribution to a Kiefer process. The test does not require estimating the error density function of the model and hence is computationally simpler.
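To give a concrete flavour of the marked-empirical-process idea behind these tests, the sketch below fits a quantile regression and evaluates a Kolmogorov-type supremum over indicator-weighted moment conditions. It deliberately omits the centering step (and the resulting Kiefer-process asymptotics) that makes the dissertation's tests pivotal, so it illustrates the construction only, on synthetic data, assuming statsmodels is available.

```python
# Sketch of the construction behind the consistent specification checks above:
# estimate a quantile regression, then evaluate the marked empirical process
#   sup_x | n^{-1/2} * sum_i (tau - 1{y_i <= x_i'b}) * 1{x_i <= x} |.
# No centering / critical values here, so this is illustration only.
import numpy as np
import statsmodels.api as sm
from statsmodels.regression.quantile_regression import QuantReg

rng = np.random.default_rng(11)
n, tau = 500, 0.5
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.standard_t(df=4, size=n)   # correctly specified median

X = sm.add_constant(x)
beta_hat = QuantReg(y, X).fit(q=tau).params
below = (y <= X @ beta_hat).astype(float)          # indicator of y below fit

grid = np.quantile(x, np.linspace(0.05, 0.95, 50)) # evaluation points for 1{x <= g}
process = [np.sum((tau - below) * (x <= g)) / np.sqrt(n) for g in grid]
print("sup of the marked empirical process:", round(float(np.max(np.abs(process))), 3))
```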
APA, Harvard, Vancouver, ISO, and other styles
50

FORONI, Claudia. "Econometric Models for Mixed-Frequency Data." Doctoral thesis, 2012. http://hdl.handle.net/1814/23750.

Full text
Abstract:
Defence date: 7 September 2012; Examining Board: Professor Massimiliano Marcellino, EUI, Supervisor; Professor Tommaso di Fonzo, Università di Padova; Professor Eric Ghysels, University of North Carolina; Professor Helmut Lütkepohl, Humboldt University Berlin
This thesis addresses different issues related to the use of mixed-frequency data. In the first chapter, I review, discuss and compare the main approaches proposed so far in the literature to deal with mixed-frequency data with ragged edges due to publication delays: aggregation, bridge equations, the mixed-data sampling (MIDAS) approach, mixed-frequency VARs and factor models. The second chapter, a joint work with Massimiliano Marcellino, compares the approaches analyzed in the first chapter in a detailed empirical application. We focus on nowcasting and forecasting the quarterly growth rate of Euro Area GDP and its components, using a very large set of monthly indicators and a wide range of forecasting methods, in a pseudo real-time framework. The results highlight the importance of monthly information, especially during crisis periods. The third chapter, a joint work with Massimiliano Marcellino and Christian Schumacher, studies the performance of a variant of the MIDAS model that does not resort to functional distributed lag polynomials. We call this approach unrestricted MIDAS (U-MIDAS). We discuss the pros and cons of unrestricted lag polynomials in MIDAS regressions. In Monte Carlo experiments and empirical applications, we compare U-MIDAS to MIDAS and show that U-MIDAS performs better than MIDAS for small differences in sampling frequencies. The fourth chapter, a joint work with Massimiliano Marcellino, focuses on issues related to mixed-frequency data in structural models. We show analytically, with simulation experiments and with actual data, that a mismatch between the time scale of a DSGE or structural VAR model and that of the time series data used for its estimation generally creates identification problems, introduces estimation bias and distorts the results of policy analysis. On the constructive side, we prove that the use of mixed-frequency data can alleviate the temporal aggregation bias, mitigate the identification issues and yield more reliable policy conclusions.
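Since U-MIDAS amounts to regressing the low-frequency variable on unrestricted lags of the skip-sampled high-frequency indicator, it can be sketched with plain OLS, as below. The data are synthetic and the setup (one monthly indicator, three within-quarter lags) is an illustrative assumption, not the thesis's Euro Area application.

```python
# Sketch of an unrestricted MIDAS (U-MIDAS) regression: quarterly growth
# regressed on the three monthly observations of an indicator within each
# quarter, with unrestricted lag coefficients estimated by plain OLS.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(9)
n_q = 80                                   # number of quarters
monthly = rng.normal(size=3 * n_q)         # monthly indicator (synthetic)
# Skip-sample the monthly series: one row of three within-quarter lags.
M = monthly.reshape(n_q, 3)                # columns: month 1, 2, 3 of each quarter
gdp_growth = 0.3 + 0.5 * M[:, 2] + 0.3 * M[:, 1] + 0.1 * M[:, 0] \
             + rng.normal(0, 0.2, n_q)

umidas = sm.OLS(gdp_growth, sm.add_constant(M)).fit()
print(umidas.params)                       # intercept plus one coefficient per month
```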
APA, Harvard, Vancouver, ISO, and other styles
