Dissertations / Theses on the topic 'Utility theory Econometric models'


Consult the top 50 dissertations / theses for your research on the topic 'Utility theory Econometric models.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Paraskevopoulos, Ioannis. "Econometric models applied to production theory." Thesis, Queen Mary, University of London, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.392498.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Adusumilli, Karun. "Essays on inference in econometric models." Thesis, London School of Economics and Political Science (University of London), 2018. http://etheses.lse.ac.uk/3760/.

Full text
Abstract:
This thesis contains three essays on inference in econometric models. Chapter 1 considers the question of bootstrap inference for Propensity Score Matching. Propensity Score Matching, where the propensity scores are estimated in a first step, is widely used for estimating treatment effects. In this context, the naive bootstrap is invalid (Abadie and Imbens, 2008). This chapter proposes a novel bootstrap procedure for this context, and demonstrates its consistency. Simulations and real data examples demonstrate the superior performance of the proposed method relative to using the asymptotic distribution for inference, especially when the degree of overlap in propensity scores is poor. General versions of the procedure can also be applied to other causal effect estimators such as inverse probability weighting and propensity score subclassification, potentially leading to higher order refinements for inference in such contexts. Chapter 2 tackles the question of inference in incomplete econometric models. In many economic and statistical applications, the observed data take the form of sets rather than points. Examples include bracket data in survey analysis, tumor growth and rock grain images in morphology analysis, and noisy measurements on the support function of a convex set in medical imaging and robotic vision. Additionally, nonparametric bounds on treatment effects under imperfect compliance can be expressed by means of random sets. This chapter develops a concept of nonparametric likelihood for random sets and its mean, known as the Aumann expectation, and proposes general inference methods by adapting the theory of empirical likelihood. Chapter 3 considers inference on the cumulative distribution function (CDF) in the classical measurement error model. It proposes both asymptotic and bootstrap based uniform confidence bands for the estimator of the CDF under measurement error. 
The proposed techniques can also be used to obtain confidence bands for quantiles, and to perform various CDF-based tests such as goodness-of-fit tests for parametric models of densities, two-sample homogeneity tests, and tests for stochastic dominance; all for the first time under measurement error.
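As a purely illustrative aside (not the bootstrap procedure developed in the thesis), the two-step estimator discussed in Chapter 1, nearest-neighbour matching on a first-step estimated propensity score, can be sketched on simulated data. Everything below (data-generating process, parameter values) is hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
x = rng.normal(size=n)                      # single covariate
p = 1 / (1 + np.exp(-x))                    # true propensity score
d = rng.binomial(1, p)                      # treatment indicator
y = 1.0 * d + x + rng.normal(size=n)        # outcome with true ATT = 1

# First step: estimate the propensity score by logistic regression
# (a few Newton iterations on the logit log-likelihood).
X = np.column_stack([np.ones(n), x])
beta = np.zeros(2)
for _ in range(25):
    ph = 1 / (1 + np.exp(-X @ beta))
    beta += np.linalg.solve((X * (ph * (1 - ph))[:, None]).T @ X,
                            X.T @ (d - ph))
ps = 1 / (1 + np.exp(-X @ beta))

# Second step: match each treated unit to its nearest control on the
# estimated propensity score and average the outcome differences.
treated, control = np.where(d == 1)[0], np.where(d == 0)[0]
matches = control[np.abs(ps[treated][:, None] - ps[control]).argmin(axis=1)]
att = np.mean(y[treated] - y[matches])
print(round(att, 2))   # close to the true ATT of 1
```

The first-step estimation of the propensity score is exactly what invalidates the naive bootstrap here (Abadie and Imbens, 2008), which motivates the chapter's proposed procedure.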
APA, Harvard, Vancouver, ISO, and other styles
3

McGarry, Joanne S. "Seasonality in continuous time econometric models." Thesis, University of Essex, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.313064.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Forchini, Giovanni. "Exact distribution theory for some econometric problems." Thesis, University of Southampton, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.242631.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Kapetanios, George. "Essays on the econometric analysis of threshold models." Thesis, University of Cambridge, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.286704.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Hall, Stephen George Frederick. "Solving and evaluating large non-linear econometric models." Thesis, Queen Mary, University of London, 1986. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.261290.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Lu, Maozu. "The encompassing principle and evaluation of econometric models." Thesis, University of Southampton, 1991. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.316084.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Sherrell, Neill. "The estimation and specification of spatial econometric models." Thesis, University of Bristol, 1990. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.281861.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Chambers, Marcus James. "Durability and consumers' demand : Gaussian estimation and some continuous time models." Thesis, University of Essex, 1989. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.238563.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

McCrorie, James Roderick. "Some topics in the estimation of continuous time econometric models." Thesis, University of Essex, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.388615.

Full text
APA, Harvard, Vancouver, ISO, and other styles
11

Arellano, Gomez Manuel. "Estimation and testing of dynamic econometric models from panel data." Thesis, London School of Economics and Political Science (University of London), 1985. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.261293.

Full text
APA, Harvard, Vancouver, ISO, and other styles
12

Pitrun, Ivet 1959. "A smoothing spline approach to nonlinear inference for time series." Monash University, Dept. of Econometrics and Business Statistics, 2001. http://arrow.monash.edu.au/hdl/1959.1/8367.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

Chen, Donghui 1970. "Median-unbiased estimation in linear autoregressive time series models." Monash University, Dept. of Econometrics and Business Statistics, 2001. http://arrow.monash.edu.au/hdl/1959.1/9044.

Full text
APA, Harvard, Vancouver, ISO, and other styles
14

Indralingam, Maheswaran. "Sequential estimation, parameter variation and predictive power of econometric market response models." Thesis, Lancaster University, 1989. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.255352.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Jeon, Yongil. "Four essays on forecasting evaluation and econometric estimation /." Diss., Connect to a 24 p. preview or request complete full text in PDF format. Access restricted to UC campuses, 1999. http://wwwlib.umi.com/cr/ucsd/fullcit?p9949690.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Ong, Alen Sen Kay. "Asset location decision models in life insurance." Thesis, City University London, 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.336430.

Full text
APA, Harvard, Vancouver, ISO, and other styles
17

Nowman, Khalid. "Gaussian estimation of open higher order continuous time dynamic models with mixed stock and flow and with an application to a United Kingdom macroeconomic model." Thesis, University of Essex, 1992. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.305955.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Silvestrini, Andrea. "Essays on aggregation and cointegration of econometric models." Doctoral thesis, Universite Libre de Bruxelles, 2009. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/210304.

Full text
Abstract:
This dissertation can be broadly divided into two independent parts. The first three chapters analyse issues related to temporal and contemporaneous aggregation of econometric models. The fourth chapter contains an application of Bayesian techniques to investigate whether the post-transition fiscal policy of Poland is sustainable in the long run and consistent with an intertemporal budget constraint.

Chapter 1 surveys the econometric methodology of temporal aggregation for a wide range of univariate and multivariate time series models.

A unified overview of temporal aggregation techniques for this broad class of processes is presented in the first part of the chapter and the main results are summarized. In each case, assuming the underlying process at the disaggregate frequency is known, the aim is to find the appropriate model for the aggregated data. Additional topics concerning temporal aggregation of ARIMA-GARCH models (see Drost and Nijman, 1993) are discussed and several examples are presented. Systematic sampling schemes are also reviewed.

Multivariate models, which show interesting features under temporal aggregation (Breitung and Swanson, 2002; Marcellino, 1999; Hafner, 2008), are examined in the second part of the chapter. In particular, the focus is on temporal aggregation of VARMA models and on the related concept of spurious instantaneous causality, which is not a time series property invariant to temporal aggregation. On the other hand, as pointed out by Marcellino (1999), other important time series features such as cointegration and the presence of unit roots are invariant to temporal aggregation and are not induced by it.

Some empirical applications based on macroeconomic and financial data illustrate all the techniques surveyed and the main results.

Chapter 2 is an attempt to monitor fiscal variables in the Euro area, building an early warning signal indicator for assessing the development of public finances in the short run and exploiting the existence of monthly budgetary statistics from France, taken as an "example country".

The application is conducted focusing on the cash State deficit, looking at components from the revenue and expenditure sides. For each component, monthly ARIMA models are estimated and then temporally aggregated to the annual frequency, as the policy makers are interested in yearly predictions.

The short-run forecasting exercises carried out for years 2002, 2003 and 2004 highlight the fact that the one-step-ahead predictions based on the temporally aggregated models generally outperform those delivered by standard monthly ARIMA modeling, as well as the official forecasts made available by the French government, for each of the eleven components and thus for the whole State deficit. More importantly, by the middle of the year, very accurate predictions for the current year are made available.

The proposed method could be extremely useful, providing policy makers with a valuable indicator when assessing the development of public finances in the short-run (one year horizon or even less).
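The forecasting exercise described above can be illustrated with a minimal sketch: a simulated AR(1) flow stands in for a monthly budget component, a monthly model is fitted, and the 12 monthly forecasts are summed to an annual figure. This simplifies the chapter's approach (which aggregates the models themselves, not just the forecasts), and all numbers are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated monthly flow series (e.g. a budget component), AR(1) around a mean.
T, phi, mu = 240, 0.6, 100.0
y = np.empty(T)
y[0] = mu
for t in range(1, T):
    y[t] = mu + phi * (y[t - 1] - mu) + rng.normal(scale=5.0)

# Estimate the AR(1) coefficient by least squares on demeaned lags.
ybar = y.mean()
phi_hat = ((y[1:] - ybar) @ (y[:-1] - ybar)) / ((y[:-1] - ybar) @ (y[:-1] - ybar))

# Forecast the next 12 months recursively, then aggregate to an annual figure.
f, last = [], y[-1]
for h in range(12):
    last = ybar + phi_hat * (last - ybar)
    f.append(last)
annual_forecast = sum(f)      # temporal aggregation of a flow: sum of months
print(round(annual_forecast)) # about 12 * mu = 1200 for this stationary series
```

For a flow variable (a deficit component) temporal aggregation is the within-year sum; for a stock it would be systematic sampling of the end-of-year value instead.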

Chapter 3 deals with the issue of forecasting contemporaneous time series aggregates. The performance of "aggregate" and "disaggregate" predictors in forecasting contemporaneously aggregated vector ARMA (VARMA) processes is compared. An aggregate predictor is built by forecasting directly the aggregate process, as it results from contemporaneous aggregation of the data generating vector process. A disaggregate predictor is a predictor obtained from aggregation of univariate forecasts for the individual components of the data generating vector process.

The econometric framework is broadly based on Lütkepohl (1987). The necessary and sufficient condition for the equality of mean squared errors associated with the two competing methods in the bivariate VMA(1) case is provided. It is argued that the condition of equality of predictors as stated in Lütkepohl (1987), although necessary and sufficient for the equality of the predictors, is sufficient (but not necessary) for the equality of mean squared errors.

Furthermore, it is shown that the same forecasting accuracy for the two predictors can be achieved using specific assumptions on the parameters of the VMA(1) structure.

Finally, an empirical application that involves the problem of forecasting the Italian monetary aggregate M1 on the basis of annual time series ranging from 1948 until 1998, prior to the creation of the European Economic and Monetary Union (EMU), is presented to show the relevance of the topic. In the empirical application, the framework is further generalized to deal with heteroskedastic and cross-correlated innovations.
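The aggregate-versus-disaggregate comparison can be illustrated with a toy simulation. A VAR(1) is used below in place of the VMA(1) analysed in the chapter, and both predictors are deliberately simple univariate AR(1) fits; the design is entirely hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate a bivariate VAR(1): x_t = A x_{t-1} + e_t (a stand-in for the
# VMA(1) of the chapter; only the aggregation logic matters here).
A = np.array([[0.5, 0.2], [0.1, 0.4]])
T = 5000
x = np.zeros((T, 2))
for t in range(1, T):
    x[t] = A @ x[t - 1] + rng.normal(size=2)

z = x.sum(axis=1)             # contemporaneous aggregate of the components

def ar1_fit(s):
    """Least-squares AR(1) coefficient of a (roughly) zero-mean series."""
    return (s[1:] @ s[:-1]) / (s[:-1] @ s[:-1])

split = 4000
# "Aggregate" predictor: univariate model fitted to the aggregate itself.
rho_z = ar1_fit(z[:split])
# "Disaggregate" predictor: univariate model per component, forecasts summed.
rho1, rho2 = ar1_fit(x[:split, 0]), ar1_fit(x[:split, 1])

# Out-of-sample one-step mean squared errors of the two competing predictors.
mse_agg = np.mean((z[split:] - rho_z * z[split - 1:-1]) ** 2)
mse_dis = np.mean((z[split:] - (rho1 * x[split - 1:-1, 0]
                                + rho2 * x[split - 1:-1, 1])) ** 2)
print(round(mse_agg, 2), round(mse_dis, 2))  # the two MSEs need not coincide
```

Which predictor wins depends on the parameters of the data-generating process, which is precisely the point of the chapter's necessary and sufficient condition for MSE equality.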

Chapter 4 deals with a cointegration analysis applied to the empirical investigation of fiscal sustainability. The focus is on a particular country: Poland. The choice of Poland is not random. First, the motivation stems from the fact that fiscal sustainability is a central topic for most of the economies of Eastern Europe. Second, this is one of the first countries to start the transition process to a market economy (since 1989), providing a relatively favorable institutional setting within which to study fiscal sustainability (see Green, Holmes and Kowalski, 2001). The emphasis is on the feasibility of a permanent deficit in the long-run, meaning whether a government can continue to operate under its current fiscal policy indefinitely.

The empirical analysis used to examine debt stabilization consists of two steps.

First, a Bayesian methodology is applied to conduct inference about the cointegrating relationship between budget revenues and expenditures (inclusive of interest) and to select the cointegrating rank. This task is complicated by the conceptual difficulty of choosing the prior distributions for the parameters relevant to the economic problem under study (Villani, 2005).

Second, Bayesian inference is applied to the estimation of the normalized cointegrating vector between budget revenues and expenditures. With a single cointegrating equation, some known results concerning the posterior density of the cointegrating vector may be used (see Bauwens, Lubrano and Richard, 1999).

The priors used in the paper lead to straightforward posterior calculations which can be easily performed.

Moreover, the posterior analysis leads to a careful assessment of the magnitude of the cointegrating vector. Finally, it is shown to what extent the likelihood of the data is important in revising the available prior information, relying on numerical integration techniques based on deterministic methods.


Doctorate in Economic and Management Sciences

APA, Harvard, Vancouver, ISO, and other styles
19

Lee, Daesik. "Essays on coalition formation under asymmetric information." Diss., Virginia Polytechnic Institute and State University, 1988. http://hdl.handle.net/10919/53567.

Full text
Abstract:
We consider the applicability of the Revelation Principle under the possibility of collusive behavior among players in a Bayesian framework. In doing this, since coalition formation itself suffers from information asymmetry problems, we assume that a coalition is formed if the colluding parties can successfully find some coalitional mechanism whose outcome is a set of messages in the original mechanism. Cremer (1986) recently proposed a coalitional mechanism in the framework of the well-known Vickrey-Clarke-Groves mechanism. We assume that the agents successfully collude if they can find a coalitional mechanism such that (i) the coalitional mechanism is incentive-compatible and (ii) the payoff of this mechanism is strictly Pareto-improving in terms of the agents' expected utility. Our analysis is undertaken in a one-principal/two-agent framework. We first find that the Revelation Principle is still applicable in the pure adverse selection model. We then extend this result to a model with both adverse selection and moral hazard aspects. Finally, we consider a three-tier principal/supervisor/agent hierarchical organization, as in Tirole (1986). We explicitly present the coalitional mechanism as a side-contract between the supervisor and the agent. We apply the previous result on the applicability of the Revelation Principle and characterize the coalition-proof mechanism. We find that the principal can design an optimal collusion-free contract at some additional cost by specifying proper individual and coalitional incentive-compatibility conditions and individual rationality conditions. Moreover, we find that the results of Tirole's (1986) paper hinge on the fact that he considers only "hard," verifiable, information.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
20

Alexandrova, Anna. "Connecting models to the real world: game theory in action." Diss., Connect to a 24 p. preview or request complete full text in PDF format. Access restricted to UC campuses, 2006. http://wwwlib.umi.com/cr/ucsd/fullcit?p3205365.

Full text
Abstract:
Thesis (Ph. D.)--University of California, San Diego, 2006.
Title from first page of PDF file (viewed April 6, 2006). Available via ProQuest Digital Dissertations. Vita. Includes bibliographical references (p. 201-206).
APA, Harvard, Vancouver, ISO, and other styles
21

Lipscomb, Clifford Allen. "Resolving the aggregation problem that plagues the hedonic pricing method." Diss., Available online, Georgia Institute of Technology, 2004:, 2003. http://etd.gatech.edu/theses/available/etd-04082004-180317/unrestricted/lipscomb%5fclifford%5fa%5f200312%5fphd.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Lazim, Mohamad Alias. "Econometric forecasting models and model evaluation : a case study of air passenger traffic flow." Thesis, Lancaster University, 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.296880.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Shami, Roland G. (Roland George) 1960. "Bayesian analysis of a structural model with regime switching." Monash University, Dept. of Econometrics and Business Statistics, 2001. http://arrow.monash.edu.au/hdl/1959.1/9277.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Mukherji, Nivedita. "Essays on the optimum quantity of money." Diss., Virginia Tech, 1992. http://hdl.handle.net/10919/39721.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Kemp, Gordon C. R. "Asymptotic expansion approximations and the distributions of various test statistics in dynamic econometric models." Thesis, University of Warwick, 1987. http://wrap.warwick.ac.uk/99431/.

Full text
Abstract:
In this thesis we examine the derivation of asymptotic expansion approximations to the cumulative distribution functions of asymptotically chi-square test statistics under the null hypothesis being tested, and the use of such approximations in the investigation of the properties of testing procedures. We are particularly concerned with how the structure of various test statistics may simplify the derivation of asymptotic expansion approximations to their cumulative distribution functions, and with how these approximations can be used in conjunction with other small-sample techniques to investigate the properties of testing procedures. In Chapter 1 we briefly review the construction of test statistics based on the Wald testing principle, and in Chapter 2 we review the various approaches to finite sample theory which have been adopted in econometrics, including asymptotic expansion methods. In Chapter 3 we derive asymptotic expansion approximations to the joint cumulative distribution functions of asymptotically chi-square test statistics, making explicit use of certain aspects of the structure of such test statistics. In Chapters 4, 5 and 6 we apply these asymptotic expansion approximations under the null hypothesis, in conjunction with other small-sample techniques, to a number of specific testing problems. The test statistics considered in Chapters 4 and 6 are Wald test statistics and those considered in Chapter 5 are predictive failure test statistics. The asymptotic expansion approximations to the cumulative distribution functions of the test statistics under the null hypothesis are evaluated numerically; the implementation of the algorithm for obtaining these approximations is discussed in an Appendix on Computing. Finally, in Chapter 7 we draw overall conclusions from the earlier chapters of the thesis and briefly discuss directions for possible future research.
APA, Harvard, Vancouver, ISO, and other styles
26

Collado-Vindel, Maria Dolores. "Dynamic econometric models for cohort and panel data : methods and applications to life-cycle consumption." Thesis, London School of Economics and Political Science (University of London), 1994. http://etheses.lse.ac.uk/2829/.

Full text
Abstract:
The purpose of this research is to analyze dynamic models for cohort and panel data, with special emphasis on applications to life-cycle consumption. In the second chapter of the thesis we analyze the estimation of dynamic models from time series of independent cross-sections. The population is divided into groups with fixed membership (cohorts) and the cohort sample means are used as a panel subject to measurement errors. We propose measurement-error-corrected estimators and analyze their asymptotic properties. We also calculate the asymptotic biases of the non-corrected estimators to check to what extent the measurement error correction is needed. Finally, we carry out Monte Carlo simulations to get an idea of the performance of our estimators in finite samples. The purpose of the second part is to test the life-cycle permanent income hypothesis using an unbalanced panel from the Spanish family expenditure survey. The model accounts for aggregate shocks and within-period non-separability in the Euler equation among consumption goods, contrary to most of the literature in this area. The results do not indicate excess sensitivity of consumption growth to income. In the last chapter, we specify a system of nonlinear intertemporal (or Frisch) demands. Our choice of specification is based on seven criteria for such systems, in terms of consistency with the theory, flexibility and econometric tractability. Our specification allows us to estimate a system of exact Euler equations, in contrast to the usual practice in the literature. We then estimate the system on Spanish panel data. This is the first time that a Frisch demand system has been estimated on panel data. We do not reject any of the restrictions derived from theory. Our results suggest strongly that the intertemporal substitution elasticity is well determined.
APA, Harvard, Vancouver, ISO, and other styles
27

方柏榮 and Pak-wing Fong. "Topics in financial time series analysis: theory and applications." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2001. http://hub.hku.hk/bib/B31241669.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Banerjee, Dyuti Sanker. "Essays on bids and offer matching in the labor market." Diss., Virginia Tech, 1994. http://hdl.handle.net/10919/37259.

Full text
Abstract:
This dissertation is a collection of essays on bids and offer matching in a labor market for new entrants to white-collar jobs. The papers compare some of the different institutions for determining wages and conducting the hiring process in this market. The first essay analyzes how a firm announces and commits to a wage prior to deriving specific information about applicants' productivity, and the consequences of following this hiring process. In the model there are two firms and at least as many applicants as firms. All applicants apply simultaneously to both firms in response to the job advertisement, which also mentions a wage. Each firm derives the firm-specific productivity of the applicants from their applications, which is private information to each firm. None of the applicants has any information about the firms' evaluations. There are four pure-strategy Nash equilibria in wage announcements: both firms announce a high wage; both firms announce a low wage; both firms announce a high or a low wage; and one firm announces a high wage while the other announces a low wage. In the latter case there also exists a unique mixed-strategy equilibrium reflecting a firm's uncertainty about the choice of the other firm. In equilibrium one or both firms may not hire, and the equilibrium may not exhibit wage dispersion. The second essay analyzes the question of which is better: to announce and commit to a particular wage prior to deriving specific information about applicants' productivity, or to offer wages privately after deriving the firm-specific productivity. The equilibrium policy, to be followed by the firms in the first place, is determined endogenously by comparing the ex ante expected profits associated with the equilibria under the different policies. Lack of prior information and uncertainty about the possible match make "offer wages privately" always an equilibrium policy.
However, if a low wage is the equilibrium strategy under all the policies, then "any pair of policies" is an equilibrium. This justifies one of the circumstances in which different policies might coexist. In equilibrium a firm's position is always filled, and the equilibrium outcome may not exhibit wage dispersion. The third essay analyzes the question: if "announcing a wage" is the strategy rule to be followed by the firms, what should be the equilibrium timing of the wage announcement, before or after receiving specific information about applicants' productivity? Two policies are compared. Under the first policy a firm announces and commits to a particular wage prior to deriving the match-specific productivity. Under the second policy a firm solicits applications, derives the firm-specific productivity, and then announces and commits to a wage. The equilibrium timing, to be followed by the firms in the first place, is determined endogenously by comparing the ex ante expected profits associated with the equilibrium strategy under the different timings. It turns out that announcing and committing to a particular wage after deriving specific information is always an equilibrium timing because of the informational advantage. However, if a low wage is the equilibrium strategy under all the policies, then any pair of policies is an equilibrium. In equilibrium one of the firms' positions may remain unfilled. The equilibrium outcome may not exhibit wage dispersion.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
29

McCloud, Nadine. "Model misspecification: theory and applications." Diss., Online access via UMI:, 2008.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
30

Vashi, Vidyut H. "The effect of price, advertising, and income on consumer demand : an almost ideal demand system investigation /." Diss., This resource online, 1994. http://scholar.lib.vt.edu/theses/available/etd-06062008-165751/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Staudigel, Matthias [Verfasser]. "Obesity, food demand, and models of rational consumer behaviour : econometric analyses and challenges to theory / Matthias Staudigel." Gießen : Universitätsbibliothek, 2014. http://d-nb.info/1068591528/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Sriananthakumar, Sivagowry 1968. "Contributions to the theory and practice of hypothesis testing." Monash University, Dept. of Econometrics and Business Statistics, 2000. http://arrow.monash.edu.au/hdl/1959.1/8836.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Brito, Hugo Miguel de Jesus. "Econometric study of alternative operators' investment decisions." Master's thesis, Instituto Superior de Economia e Gestão, 2012. http://hdl.handle.net/10400.5/10796.

Full text
Abstract:
Master's in Applied Econometrics and Forecasting
The relation between regulation, alternative operators' investment decisions and the degree of competition in the markets has been an important policy issue over time. The discussions on this matter are mostly related to the possibility of achieving service-based competition in the short run without compromising infrastructure-based competition in the long run. The investment ladder theory argues that both goals are achievable through appropriate regulatory intervention. By using a rich dataset and taking into account flaws pointed out in other studies, the present study finds reasonable evidence that the Portuguese market's data supports theoretical assumptions of the investment ladder theory: (i) creating conditions for alternative operators to enter the market is an important step in creating conditions for investment in infrastructure; (ii) the regulator has the tools to neutralise the opportunity cost for infrastructure investment created by service-based competition profits. The investment in fibre networks by alternative operators is also taken into consideration, with an evaluation of the investment determinants and their effect on the coverage level of alternative operators' fibre networks. Particular attention is given to achieving an appropriate model specification. It is concluded that it is preferable to use a two-part model over a one-part model, which provides evidence that the determinants of the decision to invest in a geographical area are not entirely similar to the determinants of the decision on the coverage level in that area. The present study found that the intrinsic demographic, economic and social characteristics of a given geographical area influence the investment decisions of alternative operators.
APA, Harvard, Vancouver, ISO, and other styles
34

Hwang, Jungbin. "Fixed smoothing asymptotic theory in over-identified econometric models in the presence of time-series and clustered dependence." Thesis, University of California, San Diego, 2016. http://pqdtopen.proquest.com/#viewpdf?dispub=10128431.

Full text
Abstract:

In the widely used over-identified econometric model, the two-step Generalized Method of Moments (GMM) estimator and inference, first suggested by Hansen (1982), require the estimation of an optimal weighting matrix at the initial stage. For time-series data and clustered dependent data, which are our focus here, the optimal weighting matrix is usually referred to as the long-run variance (LRV) of the (scaled) sample moment conditions. To maintain generality and avoid misspecification, nowadays we do not model serial dependence and within-cluster dependence parametrically but use the heteroscedasticity and autocorrelation robust (HAR) variance estimator in standard practice. These estimators are nonparametric in nature with high variation in finite samples, but the conventional increasing-smoothing asymptotics, the so-called small-bandwidth asymptotics, completely ignores the finite-sample variation of the estimated GMM weighting matrix. As a consequence, empirical researchers are often in danger of making unreliable inferences and false assessments of the (efficient) two-step GMM methods. Motivated by this issue, my dissertation consists of three papers which explore the efficiency and approximation issues in the two-step GMM methods by developing new, more accurate, and easy-to-use approximations to the GMM weighting matrix.

The first chapter, "Simple and Trustworthy Cluster-Robust GMM Inference", explores new asymptotic theory for two-step GMM estimation and inference in the presence of clustered dependence. Clustering is a common phenomenon in many cross-sectional and panel data sets in applied economics, where individuals in the same cluster are interdependent while those from different clusters are more likely to be independent. The core of the new approximation scheme is that we treat the number of clusters G as fixed as the sample size increases. Under the new fixed-G asymptotics, the centered two-step GMM estimator and two continuously-updating estimators have the same asymptotic mixed normal distribution. Also, the t statistic, the J statistic, as well as the trinity of two-step GMM statistics (QLR, LM and Wald) are all asymptotically pivotal, and each can be modified to have an asymptotic standard F distribution or t distribution. We also suggest a finite sample variance correction to further improve the accuracy of the F or t approximation. Our proposed asymptotic F and t tests are very appealing to practitioners, as the test statistics are simple modifications of the usual test statistics, and the F or t critical values are readily available from standard statistical tables. We also apply our methods to an empirical study on the causal effect of access to domestic and international markets on household consumption in rural China.
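A minimal sketch of the fixed-G flavour of inference: base the standard error on the G cluster means and use t critical values with G - 1 degrees of freedom. The simulated clustered data and the simple mean estimator below are illustrative assumptions, not the chapter's GMM setting.

```python
import random
import statistics

random.seed(2)
G, n_g = 12, 50                        # few clusters, many observations per cluster
clusters = []
for _ in range(G):
    a_g = random.gauss(0, 1)           # common within-cluster shock
    clusters.append([a_g + random.gauss(0, 1) for _ in range(n_g)])

# Inference from the G cluster means; with G treated as fixed, the t statistic
# is compared against t critical values with G - 1 degrees of freedom.
cluster_means = [statistics.mean(c) for c in clusters]
theta_hat = statistics.mean(cluster_means)
se = statistics.stdev(cluster_means) / G ** 0.5
t_stat = theta_hat / se
print(theta_hat, se, t_stat)
```

The point of the fixed-G approximation is visible here: with only 12 clusters, the variance estimate itself is noisy, and the heavier-tailed t reference distribution accounts for that noise where a normal reference would not.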

The second paper, "Should we go one step further? An Accurate Comparison of One-step and Two-step Procedures in a Generalized Method of Moments Framework" (coauthored with Yixiao Sun), focuses on the GMM procedure in a time-series setting and provides an accurate comparison of one-step and two-step GMM procedures in a fixed-smoothing asymptotics framework. The theory developed in this paper shows that the two-step procedure outperforms the one-step method only when the benefit of using the optimal weighting matrix outweighs the cost of estimating it. We also provide clear guidance on how to choose a more efficient (or powerful) GMM estimator (or test) in practice.

While our fixed-smoothing asymptotic theory accurately describes the sampling distribution of two-step GMM test statistics, the limiting distribution of conventional GMM statistics is non-standard, and its critical values need to be simulated or approximated by standard distributions in practice. In the last chapter, "Asymptotic F and t Tests in an Efficient GMM Setting" (coauthored with Yixiao Sun), we propose a simple and easy-to-implement modification to the trinity (QLR, LM, and Wald) of two-step GMM statistics and show that the modified test statistics are all asymptotically F distributed under the fixed-smoothing asymptotics. The modification is multiplicative and only involves the J statistic for testing over-identifying restrictions. In fact, what we propose can be regarded as a multiplicative variance correction for two-step GMM statistics that takes into account the additional asymptotic variance term under the fixed-smoothing asymptotics. The results in this paper can be immediately generalized to the GMM setting in the presence of clustered dependence.

APA, Harvard, Vancouver, ISO, and other styles
35

Raychaudhuri, Subhashis. "Essays on game theory and its application to social discrimination and segregation." Diss., Virginia Tech, 1994. http://hdl.handle.net/10919/37258.

Full text
Abstract:
This dissertation consists of three chapters on game theory and its application to social segregation and discrimination. In the first chapter, we discuss two interpretations of the Nash equilibrium and connect the remaining two chapters based on such interpretations. The first chapter also provides the motivations and the summary of Chapters 2 and 3. In the second chapter, we consider an extension of an almost strictly competitive game to n-person extensive games by incorporating Selten's subgame perfection. We call this extension a subgame perfect weakly-almost (SPWA) strictly competitive game; in particular, a SPWA strictly competitive game in strategic form is simply called a WA strictly competitive game. We give some general results on the structure of these classes of games. One result gives an easy way to verify the almost strict competitiveness of a given extensive game. We show that a two-person weakly unilaterally competitive extensive game and a finitely repeated WA strictly competitive game are SPWA strictly competitive. In the third chapter, we consider segregations, discriminatory behaviors, and prejudices in a recurrent situation of a game called the festival game with merrymakers. We show that segregation and discriminatory behaviors may occur in Nash equilibria in the sense that players of one ethnic group go to one festival, and, if any member of one ethnic group tries to go to a different festival, he will be treated differently solely because of nominal differences in ethnicity. One of our results states that if a player tries to enter a larger festival from a smaller one, he would be discriminated against by some people in the larger festival, but not necessarily if he goes from a larger one to a smaller one. We use the theory of stable conventions for the considerations of the entire recurrent situation and of the epistemic assumptions for each individual player.
We show that the central parts of the stable conventions are captured by the Nash equilibria. Associating our results with the theory of stable conventions and the cognitive and moral views called subjectivism and retributionism, we discuss the emergence of fallacious views of each player about the utility functions of all the players. One such view explains prejudicial attitudes as a rationalization of discriminatory behaviors.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
36

Nyika, Farai. "An empirical analysis of the Austrian business cycle theory with respect to South Africa." Thesis, Nelson Mandela Metropolitan University, 2012. http://hdl.handle.net/10948/d1020867.

Full text
Abstract:
In 2008, the global economy went into recession. Millions of jobs were lost, confidence in the financial markets fell and billions of dollars were lost by investors. Prior to the onset of the recession, the major economies of the world (the USA and Western Europe) had experienced a period of economic boom and expansion. Austrian Business Cycle Theory proposes that the roots of the current financial crisis, and of recessions in general, are found in the actions of central banks through credit expansion and manipulation of interest rates. Central banks manipulate interest rates, causing them to fall below the natural level and leading to credit expansion and malinvestments. Austrian Business Cycle Theory is based on capital theory. Capital theory incorporates the elements of time and money and allows the setting of a microeconomic foundation. The theory does not treat investment as an aggregate (as Keynesian and Monetarist approaches do). Opposition to empirical testing among Austrian economists has meant that few statistical analyses of Austrian Business Cycle Theory have been carried out. The apprehension toward empirical testing stems from those Austrian economists who argue that human behaviour cannot be captured in statistical terms. Recently, some Austrian economists have begun to do empirical research on Austrian Business Cycle Theory, and the thesis adds to that growing field. The thesis tests empirically for ABCT in South Africa by using a Vector Error Correction Model and Granger causality techniques, and the results are as follows. The Vector Error Correction Model shows that any disequilibrium adjustment in the structural equations influences correction mostly through changes in Manufacturing. The disequilibrium adjustment process for Investment is also found to be statistically significant. The results suggest that Investment in South Africa is not inert.
The Granger causality tests show that credit expansion causes interest rates to be artificially lowered, leading to malinvestments. The main policy recommendation is that business cycles can be prevented by not manipulating interest rates and by not increasing credit availability.
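A Granger-causality test of the kind the abstract describes can be illustrated with a restricted-versus-unrestricted regression F statistic: does lagged credit growth help predict the interest-rate series beyond its own lag? The simulated series and single-lag specification below are assumptions for the sketch, not the thesis's South African data.

```python
import random

random.seed(3)

def ols_ssr(rows, y):
    """Sum of squared residuals from least squares via normal equations."""
    k, n = len(rows[0]), len(y)
    A = [[sum(r[a] * r[b] for r in rows) for b in range(k)] for a in range(k)]
    v = [sum(rows[i][a] * y[i] for i in range(n)) for a in range(k)]
    for col in range(k):                                   # Gaussian elimination
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            v[r] -= f * v[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (v[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return sum((y[i] - sum(rows[i][c] * beta[c] for c in range(k))) ** 2
               for i in range(n))

# Simulated stand-ins: x (credit growth) feeds into y (interest-rate gap).
T = 500
x, y = [0.0], [0.0]
for t in range(1, T):
    x.append(0.5 * x[t - 1] + random.gauss(0, 1))
    y.append(0.4 * y[t - 1] + 0.5 * x[t - 1] + random.gauss(0, 1))

target = y[1:]
unrestricted = [[1.0, y[t - 1], x[t - 1]] for t in range(1, T)]
restricted = [[1.0, y[t - 1]] for t in range(1, T)]
ssr_u, ssr_r = ols_ssr(unrestricted, target), ols_ssr(restricted, target)
F = (ssr_r - ssr_u) / (ssr_u / (T - 1 - 3))   # one restriction
print(F)  # a large F rejects "x does not Granger-cause y"
```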
APA, Harvard, Vancouver, ISO, and other styles
37

Cook, Victoria Tracy 1960. "The effects of temporal uncertainty resolution on the overall utility and suspense of risky monetary and survival gambles /." Thesis, McGill University, 1989. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=75966.

Full text
Abstract:
We extend Kreps and Porteus' (1978, 1979a,b) temporal utility theory to include measures of suspense for gambles that vary in the timing of uncertainty resolution. Our $f^t$-modification (of their theory) defines overall utility and suspense in terms of two functions: a standard utility function and an iterative function whose properties determine attitude towards temporal uncertainty resolution. Suspense, which is increasing with time delay to uncertainty resolution, is defined as the "variance" of the standard utilities of the outcome streams taken about our measure of overall utility (rather than about the standard mean utility). We explore the properties of our measures and their implications for the overall utility and suspense of various key examples. Two preliminary experiments are reported which give some support for our overall utility and suspense measures, and which suggest that risk and suspense are different concepts. Iteration theory is also discussed in some detail.
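A stylized rendering of the suspense measure described above, with plain expected utility standing in for the thesis's overall-utility measure (an assumption for this sketch, since the $f^t$-modification itself is not reproduced here):

```python
# suspense = sum_i p_i * (u(x_i) - U)^2: the "variance" of outcome utilities
# taken about the overall utility U of the gamble (here U = expected utility).
def suspense(outcomes, probs, u, overall_utility):
    return sum(p * (u(x) - overall_utility) ** 2 for x, p in zip(outcomes, probs))

u = lambda x: x ** 0.5                 # standard (risk-averse) utility, assumed
outcomes, probs = [0.0, 100.0], [0.5, 0.5]
eu = sum(p * u(x) for x, p in zip(outcomes, probs))   # 5.0
print(suspense(outcomes, probs, u, eu))               # 25.0
```

Under this measure a degenerate gamble has zero suspense, while spreading utility across outcomes raises it, which is the distinction from ordinary risk the experiments probe.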
APA, Harvard, Vancouver, ISO, and other styles
38

Weng, Weiwei, and 翁韡韡. "Two essays on matching and centralized admissions." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2011. http://hub.hku.hk/bib/B46419974.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Vickers, John. "Patent races and market structure." Thesis, University of Oxford, 1985. http://ora.ox.ac.uk/objects/uuid:9e3df3d2-b58a-48cc-b639-78c7c48bd3cd.

Full text
Abstract:
This thesis is a theoretical study of relationships between patent races and market structure. The outcome of a patent race can be an important determinant of market structure. For example, whether or not a new firm enters a market may depend upon its winning a patent race against an incumbent firm already in that market. Moreover, market structure can be a major influence upon competition in a patent race. In the example, the asymmetry between incumbent and potential entrant has an effect upon their respective incentives in the patent race. Chapter I discusses models of R and D with uncertainty. We show that, as the degree of correlation between the uncertainties facing rival firms increases, R and D efforts increase under some, but not all, conditions, and the number of active competitors falls. Chapter II discusses the approach of representing patent races as bidding games. We examine a model in which several incumbent firms compete with a number of potential entrants in a patent race, and ask whether the incumbents have an incentive to form a joint venture to deter entry. They do so if and only if the patent does not offer a major cost improvement. In Chapter III we examine the strategic interactions between competitors during the course of a race, in an attempt to clarify (for different types of race) the idea that a race degenerates when one player becomes 'far enough ahead' of his rivals, in a sense made precise. In Chapter IV we examine the evolution of market structure in a duopoly model when there is a sequence of patent races. The nature of competition in the product market is shown to determine whether one firm becomes increasingly dominant as industry leader, or whether there is 'action - reaction' between firms.
APA, Harvard, Vancouver, ISO, and other styles
40

Jablonský, Petr. "Performance downside risk models of the post-modern portfolio theory." Doctoral thesis, Vysoká škola ekonomická v Praze, 2008. http://www.nusl.cz/ntk/nusl-161865.

Full text
Abstract:
The thesis provides a comparison of different portfolio models and tests their performance on the financial markets. Our analysis particularly focuses on a comparison of the classical Markowitz modern portfolio theory and the downside risk models of the post-modern portfolio theory. In addition, we consider some alternative portfolio models, for a total of eleven models that we test. To evaluate and compare the performance of different portfolio models correctly, we must use a measure that is unbiased towards any portfolio theory. We suggest solving this issue via a new approach based on utility theory and utility functions. We introduce an unbiased method for evaluating portfolio model performance using the expected utility efficient frontier. We use an asymmetric behavioural utility function to capture the behaviour of real market investors. The Markowitz model is the leading market practice. We investigate whether there are any circumstances in which some other models might provide better performance than the Markowitz model. Our research is unique for three reasons. First, it provides a comprehensive comparison of broad classes of different portfolio models. Second, we focus on the developed markets in the United States and Germany but also on the local emerging markets in the Czech Republic and Poland. These local markets have never been tested to such an extent before. Third, the empirical testing is based on a broad data set from 2003 to 2012, which enables us to test how different portfolio models perform under different macroeconomic conditions.
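The expected-utility ranking idea can be sketched with an asymmetric, loss-averse utility of the prospect-theory type: portfolios are compared by the average utility their returns deliver, so downside outcomes are penalised more heavily than symmetric variance would suggest. The parameter values and return histories below are illustrative assumptions, not the thesis's estimates.

```python
# Asymmetric behavioural utility: concave over gains, steeper over losses.
# loss_aversion and curvature values are common illustrative choices, assumed here.
def behavioural_u(r, loss_aversion=2.25, curvature=0.88):
    return r ** curvature if r >= 0 else -loss_aversion * ((-r) ** curvature)

def expected_utility(returns):
    return sum(behavioural_u(r) for r in returns) / len(returns)

# Two hypothetical return histories with similar spread but different downside
low_downside = [0.02, 0.01, -0.005, 0.015, -0.01]
high_downside = [0.04, 0.03, -0.05, 0.035, -0.04]
print(expected_utility(low_downside), expected_utility(high_downside))
```

Ranking by this criterion, rather than by a model-specific risk measure, is what makes the comparison unbiased across the eleven candidate portfolio models.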
APA, Harvard, Vancouver, ISO, and other styles
41

Howell, John R. "Choice Models with Nonlinear Pricing." The Ohio State University, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=osu1370020683.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Lu, Lei 1975. "Essays on asset pricing with heterogeneous beliefs and bounded rational investor." Thesis, McGill University, 2007. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=103267.

Full text
Abstract:
The thesis includes two essays on asset pricing. In the first essay, "Asset Pricing in a Monetary Economy with Heterogeneous Beliefs", we shed new light on the role of monetary policy in asset pricing by focusing on the case where investors have heterogeneous expectations about future monetary policy. Under heterogeneity in beliefs, investors place bets against each other on the evolution of the money supply, and as a result, the sharing of wealth in the economy evolves stochastically over time, making money non-neutral. Employing a continuous-time, general equilibrium model, we establish that these fluctuations are rich in implications, in that they substantially affect the equilibrium prices of all assets, as well as inflation. In particular, we find that stock market volatility may be significantly increased by the heterogeneity in beliefs, a conclusion supported by our empirical analysis. The second essay is titled "Asset Pricing and Welfare Analysis with Bounded Rational Investors". Motivated by the fact that investors have limited ability and insufficient knowledge to process information, I model investors' bounded-rational behavior in processing information and study its implications for asset pricing. Bounded rational investors perceive "correlated" information (which consists of news that is correlated with fundamentals, but provides no information on them) as "fundamental" information. This generates "bounded rational risk". Asset prices and volatilities of asset returns are derived. Specifically, the equity premium and the stock volatility are raised under some conditions. I also analyze the welfare impact of bounded rationality.
APA, Harvard, Vancouver, ISO, and other styles
43

Iwasawa, Masamune. "Specification Tests in Econometrics and Their Application." Kyoto University, 2016. http://hdl.handle.net/2433/215270.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Devaraj, Srikant. "Specification and estimation of the price responsiveness of alcohol demand| A policy analytic perspective." Thesis, Indiana University - Purdue University Indianapolis, 2016. http://pqdtopen.proquest.com/#viewpdf?dispub=10032406.

Full text
Abstract:

Accurate estimation of alcohol price elasticity is important for policy analysis – e.g., determining optimal taxes and projecting revenues generated from proposed tax changes. Several approaches to specifying and estimating the price elasticity of demand for alcohol can be found in the literature. There are two keys to policy-relevant specification and estimation of alcohol price elasticity. First, the underlying demand model should take account of alcohol consumption decisions at the extensive margin – i.e., individuals' decisions to drink or not – because the price of alcohol may affect the decision to start drinking, and one's decision to drink is likely to be structurally different from the decision of how much to drink (the intensive margin). Secondly, the modeling of alcohol demand elasticity should yield both theoretical and empirical results that are causally interpretable. The elasticity estimates obtained from the existing two-part model take the extensive margin into account, but are not causally interpretable.

The elasticity estimates obtained using aggregate-level models, however, are causally interpretable but do not explicitly take the extensive margin into account. There currently exists no specification and estimation method for alcohol price elasticity that both accommodates the extensive margin and is causally interpretable. I explore additional sources of bias in the extant approaches to elasticity specification and estimation: 1) the use of logged (vs. nominal) alcohol prices; and 2) the implementation of unnecessarily restrictive assumptions underlying the conventional two-part model. I propose a new approach to elasticity specification and estimation that covers the two key requirements for policy relevance and remedies all such biases. I find evidence of substantial divergence between the new and extant methods using both simulated and real data. Such differences are profound when placed in the context of alcohol tax revenue generation.
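The two margins combine multiplicatively, E[y] = P(y > 0) · E[y | y > 0], so the total price elasticity of expected consumption is approximately the sum of the extensive (participation) and intensive (conditional quantity) elasticities. A numerical check with invented elasticity values:

```python
# E[drinks] = P(drinker) * E[drinks | drinker]; in elasticity form
# eps_total ~= eps_extensive + eps_intensive. All numbers are hypothetical.
def total_mean(p_drink, mean_given_drink):
    return p_drink * mean_given_drink

# Baseline, then responses to a 1% price increase (eps_P = -0.2, eps_m = -0.5)
base = total_mean(0.60, 10.0)
after = total_mean(0.60 * (1 - 0.002), 10.0 * (1 - 0.005))

eps_total = (after - base) / base / 0.01
print(eps_total)  # close to -0.2 + -0.5 = -0.7, up to a second-order term
```

Estimating only the intensive margin would report -0.5 here and understate the revenue response, which is why the extensive margin matters for tax projections.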

APA, Harvard, Vancouver, ISO, and other styles
45

Fratus, Brian J. "Rational asset pricing : book-to-market equity as a proxy for risk in utility stocks /." Thesis, This resource online, 1994. http://scholar.lib.vt.edu/theses/available/etd-11242009-020322/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Thompson, Stephanie C. "Rational design theory: a decision-based foundation for studying design methods." Diss., Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/39490.

Full text
Abstract:
While design theories provide a foundation for representing and reasoning about design methods, existing design theories do not explicitly include uncertainty considerations or recognize tradeoffs between the design artifact and the design process. These limitations prevent the existing theories from adequately describing and explaining observed or proposed design methods. In this thesis, Rational Design Theory is introduced as a normative theoretical framework for evaluating prescriptive design methods. This new theory is based on a two-level perspective of design decisions in which the interactions between the artifact and the design process decisions are considered. Rational Design Theory consists of normative decision theory applied to design process decisions, and is complemented by a decision-theory-inspired conceptual model of design. The application of decision analysis to design process decisions provides a structured framework for the qualitative and quantitative evaluation of design methods. The qualitative evaluation capabilities are demonstrated in a review of the systematic design method of Pahl and Beitz. The quantitative evaluation capabilities are demonstrated in two example problems. In these two quantitative examples, Value of Information analysis is investigated as a strategy for deciding when to perform an analysis to gather additional information in support of a choice between two design concepts. Both quantitative examples demonstrate that Value of Information achieves very good results when compared to a more comprehensive decision analysis that allows for a sequence of analyses to be performed.
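The Value of Information logic described above can be sketched as: perform the additional analysis only if the expected utility gain from a better-informed choice between the two concepts exceeds the analysis's cost. The two-concept, two-state numbers below are invented for illustration, not taken from the thesis's examples.

```python
# Value of (perfect) Information for a choice between two design concepts.
# States, probabilities, and utilities are hypothetical.
p_states = [0.5, 0.5]                  # the analysis would reveal which state holds
utility = {                            # utility of each concept in each state
    "concept_A": [10.0, 2.0],
    "concept_B": [6.0, 6.0],
}

# Without information: pick the concept with the best expected utility.
eu = {c: sum(p * u for p, u in zip(p_states, us)) for c, us in utility.items()}
best_without = max(eu.values())                        # A: 6.0, B: 6.0 -> 6.0

# With information: pick the best concept in each revealed state, then average.
eu_with = sum(p * max(utility[c][s] for c in utility)
              for s, p in enumerate(p_states))         # 0.5*10 + 0.5*6 = 8.0

voi = eu_with - best_without
print(voi)  # 2.0: run the analysis only if it costs less than this
```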
APA, Harvard, Vancouver, ISO, and other styles
47

Kotelba, A. (Adrian). "Theory of rational decision-making and its applications to adaptive transmission." Doctoral thesis, Oulun yliopisto, 2013. http://urn.fi/urn:isbn:9789526202044.

Full text
Abstract:
In this thesis, adaptive transmission power control algorithms for reliable communication in channels with state are explored and further developed. In channels with state, strict adherence to Shannon-sense capacity may lead to very conservative system designs. In many practical systems, error-free communication is not required because these systems can cope with decoding errors. These considerations give rise to other information-theoretic notions where the rate of reliable communication is considered a random variable which depends not only on the statistical properties of the channel but also on the adaptive transmission strategy. Numerous studies on adaptive transmission in channels with state have already been conducted using the expected value of the communication rate or the information outage probability as the relevant performance metrics. However, these metrics, although intuitively pleasing, have usually been introduced without rigorous justification. This thesis contributes to the state of the art in a number of ways. These include the development of new conceptual viewpoints on the performance assessment of adaptive communication systems in channels with state as well as a new set of adaptive transmission power control algorithms. In particular, the models and methods of rational decision theory are introduced and systematically used in developing a unified framework for the analysis and optimization of adaptive transmission in channels with state. The proposed framework properly addresses the limitation of finite coding length, takes into account the decision maker's preferences, considers uncertainties relevant in a given decision, and determines the optimal decision by maximizing some numerical index. A central finding of the theoretical studies is that many of the previously proposed performance metrics can be rigorously justified within the newly proposed framework.
In addition, adaptive transmission power control in parallel Gaussian channels is considered with the aim of obtaining new classes of algorithms. The safety-first approach, risk theory, and expected utility theory are applied to derive novel transmission power control algorithms. The performance of the proposed power control algorithms is evaluated by computer simulations and compared against the performance of some other well-known algorithms.
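One way to sketch the expected-utility approach to power control: choose the transmit power that maximizes the expected utility of the achievable Shannon rate over fading states. The fading distribution, the risk-sensitive utility form, and the power cost below are all assumptions for illustration, not the thesis's algorithms.

```python
import math

# Hypothetical fading states: (channel gain, probability)
fading_gains = [(0.2, 0.5), (1.0, 0.4), (3.0, 0.1)]

def rate(p, g):
    """Shannon rate (bits/channel use) at transmit power p and channel gain g."""
    return math.log2(1 + p * g)

def utility(r, cost, p):
    """Assumed risk-averse utility of rate, net of a linear power cost."""
    return math.sqrt(r) - cost * p

def expected_utility(p, cost=0.1):
    return sum(prob * utility(rate(p, g), cost, p) for g, prob in fading_gains)

# Pick the best power from a discrete menu of candidate levels.
best = max([0.5, 1.0, 2.0, 4.0, 8.0], key=expected_utility)
print(best)
```

The concave utility makes the rate a random payoff to be valued, not just averaged, which is the decision-theoretic shift the thesis argues for over raw expected-rate or outage metrics.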
APA, Harvard, Vancouver, ISO, and other styles
48

Heller, Collin M. "A computational model of engineering decision making." Thesis, Georgia Institute of Technology, 2013. http://hdl.handle.net/1853/50272.

Full text
Abstract:
The research objective of this thesis is to formulate and demonstrate a computational framework for modeling the design decisions of engineers. This framework is intended to be descriptive in nature as opposed to prescriptive or normative; the output of the model represents a plausible result of a designer's decision-making process. The framework decomposes the decision into three elements: the problem statement, the designer's beliefs about the alternatives, and the designer's preferences. Multi-attribute utility theory is used to capture designer preferences for multiple objectives under uncertainty. Machine-learning techniques are used to store the designer's knowledge and to make Bayesian inferences regarding the attributes of alternatives. These models are integrated into the framework of a Markov decision process to simulate multiple sequential decisions. The overall framework enables the designer's decision problem to be transformed into an optimization problem statement; the simulated designer selects the alternative with the maximum expected utility. Although utility theory is typically viewed as a normative decision framework, the perspective in this research is that the approach can be used in a descriptive context for modeling rational and non-time-critical decisions by engineering designers. This approach is intended to enable the formalisms of utility theory to be used to design human-subjects experiments involving engineers in design organizations, based on pairwise lotteries and other methods for preference elicitation. The results of these experiments would substantiate the selection of parameters in the model, enabling it to be used to diagnose potential problems in engineering design projects. The purpose of the decision-making framework is to enable the development of a design process simulation of an organization involved in the development of a large-scale complex engineered system such as an aircraft or spacecraft.
The decision model will allow researchers to determine the broader effects of individual engineering decisions on the aggregate dynamics of the design process and the resulting performance of the designed artifact itself. To illustrate the model's applicability in this context, the framework is demonstrated on three example problems: a one-dimensional decision problem, a multidimensional turbojet design problem, and a variable fidelity analysis problem. Individual utility functions are developed for designers in a requirements-driven design problem and then combined into a multi-attribute utility function. Gaussian process models are used to represent the designer's beliefs about the alternatives, and a custom covariance function is formulated to more accurately represent a designer's uncertainty in beliefs about the design attributes.
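The multi-attribute utility step can be sketched as follows. The additive weighting, the attribute names, and the belief distributions are illustrative assumptions; the thesis uses Gaussian process models for the designer's beliefs, for which a simple Monte Carlo draw merely stands in.

```python
import random

random.seed(4)

def mau(thrust_u, weight_u, w=(0.6, 0.4)):
    """Additive multi-attribute utility; attribute names and weights assumed."""
    return w[0] * thrust_u + w[1] * weight_u

def expected_mau(samples):
    return sum(mau(a, b) for a, b in samples) / len(samples)

# Monte Carlo over the designer's (assumed) beliefs about each concept's
# normalized attribute utilities: concept 1 likely strong on thrust,
# concept 2 likely strong on weight.
concept_1 = [(random.betavariate(8, 2), random.betavariate(5, 5)) for _ in range(2000)]
concept_2 = [(random.betavariate(5, 5), random.betavariate(8, 2)) for _ in range(2000)]
print(expected_mau(concept_1), expected_mau(concept_2))
```

The simulated designer then simply selects the concept with the higher expected multi-attribute utility, mirroring the optimization statement described above.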
APA, Harvard, Vancouver, ISO, and other styles
49

Bart-Williams, Claudius Pythias. "On asset pricing and the equity premium puzzle." Thesis, Brunel University, 2000. http://bura.brunel.ac.uk/handle/2438/6371.

Full text
Abstract:
Presented here are consumption- and production-related asset pricing models which seek to explain stock market behaviour through the stock premium over risk-free bonds, and to do so using parameter values consistent with theory. Our results show that there are models capable of explaining stock market behaviour. For the consumption-based model, we avoid many of the suggestions to artificially boost the predicted stock premium, such as modelling consumption as leverage claims; instead we use the notion of surplus consumption. We find that with surplus consumption there are models, including the much-maligned power utility model, capable of yielding theory-consistent estimates of the discount rate, the risk-free rate, and the coefficient of relative risk aversion, γ. Since real business cycle theory assumes a risk aversion coefficient of 1, we conclude that our model, which gives a value close to but not equal to 1, provides an indication of the impact of market imperfections. For production, we present many of the existing models which seek to explain stock market behaviour using production data; we find these to be generally incapable of doing so. We conclude by presenting a profit-based formulation which uses deviations of actual from expected profits and dividends, via stock price reaction parameters, to successfully explain stock market behaviour. We also conclude that the use of a profit-based formulation allows for a link to investment, output and pricing decisions and hence links consumption and production.
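The surplus-consumption mechanism the abstract relies on can be made concrete: with a habit level X, power utility over the surplus C - X implies an effective relative risk aversion of γC/(C - X), so curvature over consumption can be large even when γ itself stays near 1. The numbers below are illustrative, not the thesis's estimates.

```python
# Power utility over surplus: u(C) = (C - X)^(1 - gamma) / (1 - gamma).
# Differentiating with respect to C gives -C u''(C)/u'(C) = gamma * C / (C - X).
def effective_rra(consumption, habit, gamma):
    return gamma * consumption / (consumption - habit)

print(effective_rra(1.0, 0.8, 1.05))  # 5.25: near-unit gamma, curvature over C ~ 5
```

This is why a γ close to 1, as real business cycle theory assumes, can still be consistent with the sizeable measured equity premium once consumption is measured relative to habit.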
APA, Harvard, Vancouver, ISO, and other styles
50

Bauknecht, Klaus Dieter. "A macroeconometric policy model of the South African economy based on weak rational expectations with an application to monetary policy." Thesis, Stellenbosch : Stellenbosch University, 2000. http://hdl.handle.net/10019.1/51575.

Full text
Abstract:
Dissertation (PhD) -- University of Stellenbosch, 2000.
ENGLISH ABSTRACT: The Lucas critique states that if expectations are not explicitly dealt with, conventional econometric models are inappropriate for policy analyses, as their coefficients are not policy invariant. The inclusion of rational expectations in conventional model building has been the most common response to this critique. The concept of rational expectations has received several interpretations. In numerous studies, these expectations are associated with model-consistent expectations in the sense that expectations and model solutions are identical. To derive a solution, these models require unique algorithms and assumptions regarding their terminal state, in particular when forward-looking expectations are present. An alternative that avoids these issues is the concept of weak rational expectations, which emphasises that expectation errors should not be systematic. Expectations are therefore formed on the basis of an underlying structure, but full knowledge of the model is not essential. This type of rational expectations is accommodated by means of an explicit specification of an expectations equation consistent with the macroeconometric model's broad structure. The estimation of coefficients relating to expectations is achieved through an Instrumental Variable approach. In South Africa, monetary policy has been consistent and transparent in line with the recommendations of the De Kock Commission. This allows the modelling of the policy instrument of the South African Reserve Bank, i.e. the Bank rate, by means of a policy reaction function. Given this transparency in monetary policy, the accommodation of expectations of the Bank rate is essential in modelling the full impact of monetary policy and in avoiding the Lucas critique. This is accomplished through weak rational expectations, based on the reaction function of the Reserve Bank.
The accommodation of expectations of a policy instrument also allows the modelling of anticipated and unanticipated policies, as alternative assumptions regarding the expectations process can be made during simulations. Conventional econometric models emphasise the demand side of the economy, with equations focusing on private consumption, investment, exports and imports, and possibly changes in inventories. In this study, particular emphasis in the model specification is also placed on the impact of monetary policy on government debt and debt servicing costs. Other dimensions of the model include the modelling of the money supply and balance of payments, short- and long-term interest rates, domestic prices, the exchange rate, the wage rate and employment, as well as weakly rational expectations of inflation and the Bank rate. The model has been specified and estimated by using concepts such as cointegration and Error Correction modelling. Numerous tests, including the assessment of the Root Mean Square Percentage Error, have been employed to test the adequacy of the model. Similarly, tests are carried out to ensure weak rational expectations. Numerous simulations are carried out with the model and the results are compared to relevant alternative studies. The simulation results show that the reduction of inflation by means of only monetary policy could impose severe costs on the economy in terms of real sector volatility.
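The abstract mentions assessing model adequacy via the Root Mean Square Percentage Error. A minimal sketch of that statistic follows; the function name and sample figures are illustrative assumptions, not values from the study.

```python
import numpy as np

def rmspe(actual, predicted):
    """Root Mean Square Percentage Error, expressed in percent."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    pct_err = (actual - predicted) / actual  # proportional errors
    return float(np.sqrt(np.mean(pct_err ** 2)) * 100)

# Illustrative: two observations with 10% errors in opposite directions.
print(rmspe([100, 200], [110, 180]))  # → 10.0
```

Because the errors are expressed relative to the actual values, the RMSPE is scale-free, which makes it convenient for comparing tracking performance across model equations measured in different units.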
AFRIKAANSE OPSOMMING (translated): The Lucas critique asserts that conventional econometric models cannot be used for policy analysis, since they do not allow for the change in expectations when policy adjustments are made. The inclusion of rational expectations in conventional econometric models is the most common response to the Lucas critique. To facilitate the practical inclusion of rational expectations in econometric model building, this study makes use of so-called "weak rational expectations", which only require that expectation errors not be systematic. The coefficients of the expectations variables are estimated using the Instrumental Variables approach. Monetary policy in South Africa has historically been consistent and transparent, in line with the recommendations of the De Kock Commission. The policy instrument of the South African Reserve Bank, namely the Bank rate, can consequently be modelled by means of a policy reaction function. To accommodate the Lucas critique, however, expectations of the Bank rate must be included when the full impact of monetary policy is modelled. This is achieved through the inclusion of weak rational expectations, based on the reaction function of the Reserve Bank. In this way, the impact of anticipated and unanticipated policy adjustments can be simulated. Conventional econometric models emphasise the demand side of the economy, with equations for consumption, investment, imports, exports and possibly the change in inventories. In this study, emphasis is also placed on the impact of monetary policy on government debt and the cost of government debt. Other aspects that are modelled are the money supply and balance of payments, short- and long-term interest rates, domestic prices, the exchange rate, wage rates and employment, as well as weakly rational expectations of inflation and the Bank rate.
The model was specified and estimated using cointegration and long- and short-run equations. The usual tests were carried out to assess the adequacy of the model. Various simulations were carried out with the model and the results were compared with other relevant studies. The conclusion is that reducing inflation through monetary policy alone could place a heavy burden on the economy in terms of volatility in the real sector.
APA, Harvard, Vancouver, ISO, and other styles