Academic literature on the topic 'Performance standards Econometric models'

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Performance standards Econometric models.'


Journal articles on the topic "Performance standards Econometric models"

1

Kim, Dong-sup, and Seungwoo Shin. "The Economic Explainability of Machine Learning and Standard Econometric Models: An Application to the U.S. Mortgage Default Risk." International Journal of Strategic Property Management 25, no. 5 (July 13, 2021): 396–412. http://dx.doi.org/10.3846/ijspm.2021.15129.

Abstract:
This study aims to bridge the gap between two perspectives of explainability (machine learning and engineering on the one hand, economics and standard econometrics on the other) by applying three marginal measurements. The existing real estate literature has primarily used econometric models to analyze the factors that affect the default risk of mortgage loans. However, in this study, we estimate a default risk model using a machine learning-based approach with the help of a U.S. securitized mortgage loan database. Moreover, we compare the economic explainability of the models by calculating the marginal effect and marginal importance of individual risk factors using both econometric and machine learning approaches. Machine learning-based models are quite effective in terms of predictive power; however, the general perception is that they do not efficiently explain the causal relationships within them. This study utilizes the concepts of marginal effects and marginal importance to compare the explanatory power of individual input variables in various models. This can simultaneously help improve the explainability of machine learning techniques and enhance the performance of standard econometric methods.
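The marginal-effect comparison described in this abstract can be illustrated with a small sketch. Everything below is synthetic (it is not drawn from the authors' mortgage database, and the variable names are invented): a logit default model is fitted by Newton-Raphson, and the average marginal effect of a hypothetical loan-to-value variable is computed as the mean of p(1-p) times its coefficient.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
ltv = rng.uniform(0.4, 1.2, n)        # loan-to-value ratio (hypothetical risk factor)
rate = rng.uniform(0.02, 0.08, n)     # interest rate (hypothetical)
X = np.column_stack([np.ones(n), ltv, rate])
beta_true = np.array([-6.0, 4.0, 30.0])
p_true = 1 / (1 + np.exp(-X @ beta_true))
y = rng.binomial(1, p_true)           # synthetic default indicator

# Fit a logit model by Newton-Raphson.
beta = np.zeros(3)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    W = p * (1 - p)
    grad = X.T @ (y - p)
    hess = X.T @ (X * W[:, None])
    beta += np.linalg.solve(hess, grad)

# Average marginal effect of LTV on default probability: mean of p(1-p) * beta_ltv.
p_hat = 1 / (1 + np.exp(-X @ beta))
ame_ltv = np.mean(p_hat * (1 - p_hat)) * beta[1]
print(round(ame_ltv, 3))
```

The same average-marginal-effect quantity can then be compared against, say, a partial-dependence slope from a machine learning model, which is the spirit of the comparison the paper performs.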
2

Andersen, Torben G. "SIMULATION-BASED ECONOMETRIC METHODS." Econometric Theory 16, no. 1 (February 2000): 131–38. http://dx.doi.org/10.1017/s0266466600001080.

Abstract:
The accessibility of high-performance computing power has always influenced theoretical and applied econometrics. Gouriéroux and Monfort begin their recent offering, Simulation-Based Econometric Methods, with a stylized three-stage classification of the history of statistical econometrics. In the first stage, lasting through the 1960s, models and estimation methods were designed to produce closed-form expressions for the estimators. This spurred thorough investigation of the standard linear model, linear simultaneous equations with the associated instrumental variable techniques, and maximum likelihood estimation within the exponential family. During the 1970s and 1980s the development of powerful numerical optimization routines led to the exploration of procedures without closed-form solutions for the estimators. During this period the general theory of nonlinear statistical inference was developed, and nonlinear micro models such as limited dependent variable models and nonlinear time series models, e.g., ARCH, were explored. The associated estimation principles included maximum likelihood (beyond the exponential family), pseudo-maximum likelihood, nonlinear least squares, and generalized method of moments. Finally, the third stage considers problems without a tractable analytic criterion function. Such problems almost invariably arise from the need to evaluate high-dimensional integrals. The idea is to circumvent the associated numerical problems by a simulation-based approach. The main requirement is therefore that the model may be simulated given the parameters and the exogenous variables. The approach delivers simulated counterparts to standard estimation procedures and has inspired the development of entirely new procedures based on the principle of indirect inference.
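The simulation-based third stage described here can be illustrated with a toy method-of-simulated-moments estimator. This is a synthetic example, not taken from the book under review: the parameter of an easily simulated model is chosen so that a simulated moment matches the corresponding data moment, with common random numbers held fixed across candidate parameter values.

```python
import numpy as np

rng = np.random.default_rng(1)

# "True" data-generating process: y = theta + exponential noise (easy to simulate,
# standing in for a model whose likelihood we pretend is intractable).
theta_true = 2.5
data = theta_true + rng.exponential(1.0, size=2000)

def simulate_moment(theta, n_sim=20000, seed=42):
    # Simulate the model at a candidate theta and return the sample mean.
    r = np.random.default_rng(seed)      # fixed draws: common random numbers
    return np.mean(theta + r.exponential(1.0, size=n_sim))

# Method of simulated moments: pick the theta whose simulated mean
# best matches the data mean.
grid = np.linspace(0, 5, 501)
data_mean = data.mean()
theta_hat = grid[np.argmin([(simulate_moment(t) - data_mean) ** 2 for t in grid])]
print(round(theta_hat, 2))
```

Holding the simulation draws fixed across candidate values is what makes the criterion function smooth in theta, a standard device in this literature.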
3

Ferro, Gustavo, and Carlos A. Romero. "Setting performance standards for regulation of water services: Benchmarking Latin American utilities." Water Policy 13, no. 5 (April 23, 2011): 607–23. http://dx.doi.org/10.2166/wp.2011.042.

Abstract:
The aim of this study is to estimate both stochastic and mathematical programming efficiency cost frontiers for the Latin American water sector, by means of econometric and Data Envelopment Analysis techniques, using the ADERASA database. ADERASA, the Latin American association of water regulators, has carried out systematic data collection, among other initiatives. This study fills a gap in the understanding of relative efficiency in the Latin American water sector, using a consistent database. First, we present a survey of the empirical literature related to cost and production frontiers in the water and sanitation sector. Second, once alternative specifications had been chosen, models were estimated and environmental variables included in an exploratory way. The coefficients have the expected signs and plausible values. Some consistency between methodologies is found. This paper yields two results. Better knowledge of the underlying cost (or production) model is a first step towards using benchmarking as a regulatory tool. The policy implications are relatively straightforward. With benchmarking technology it is possible to coordinate the action of different regulators, each with their own asymmetry of information. The key is setting indicative standards which constitute the basis of further discussion.
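As a rough illustration of frontier-based benchmarking, here is a deterministic corrected-OLS (COLS) cost frontier, a much simpler device than the stochastic frontier and DEA models the study actually estimates, and with invented data rather than the ADERASA database: fit an average cost function, shift it down to the best performer, and score each utility against that frontier.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 60                                    # hypothetical water utilities
log_q = rng.uniform(1, 4, n)              # log output (volume delivered)
ineff = rng.exponential(0.15, n)          # one-sided inefficiency term
log_cost = 0.5 + 0.8 * log_q + ineff + rng.normal(0, 0.02, n)

# Corrected OLS (COLS): fit an average cost function, then shift it down so it
# envelops the data from below, giving a simple deterministic cost frontier.
X = np.column_stack([np.ones(n), log_q])
b = np.linalg.lstsq(X, log_cost, rcond=None)[0]
resid = log_cost - X @ b
frontier = X @ b + resid.min()            # shifted to the best performer
efficiency = np.exp(frontier - log_cost)  # 1.0 = on the frontier

print(round(b[1], 2), round(efficiency.max(), 2))
```

A regulator could use such scores as the "indicative standards" the abstract mentions, though the real exercise would also control for environmental variables.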
4

Garrido, Rodrigo A., and Hani S. Mahmassani. "Forecasting Short-Term Freight Transportation Demand: Poisson STARMA Model." Transportation Research Record: Journal of the Transportation Research Board 1645, no. 1 (January 1998): 8–16. http://dx.doi.org/10.3141/1645-02.

Abstract:
A framework for analyzing, describing, and forecasting freight flows for operational and tactical purposes is presented. A dynamic econometric model is proposed. This model incorporates the spatial and temporal characteristics of freight demand within a stochastic framework. The model was applied in an actual context and its performance was compared with standard time series models (benchmark) for forecasting ability. The proposed model outperformed the benchmark from the econometric viewpoint. Extensive diagnostic checking and sensitivity analysis confirmed the robustness of the modeling methodology for short-term forecasting applications.
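The model-versus-benchmark comparison described above can be illustrated with a far simpler count-data sketch: a plain Poisson regression on synthetic data (not the authors' Poisson STARMA model, which adds spatial and temporal lag structure), compared against a naive mean benchmark on forecast error.

```python
import numpy as np

rng = np.random.default_rng(3)
T = 300
x = rng.normal(size=T)                    # a hypothetical demand driver
lam = np.exp(0.5 + 0.5 * x)
y = rng.poisson(lam)                      # synthetic freight-count series

# Poisson regression by Newton-Raphson (iteratively reweighted least squares).
X = np.column_stack([np.ones(T), x])
b = np.zeros(2)
for _ in range(30):
    mu = np.exp(X @ b)
    b += np.linalg.solve(X.T @ (X * mu[:, None]), X.T @ (y - mu))

# Compare in-sample fitted "forecasts" against a naive mean benchmark.
mae_model = np.mean(np.abs(y - np.exp(X @ b)))
mae_naive = np.mean(np.abs(y - y.mean()))
print(mae_model < mae_naive)
```

The paper's point is the analogous comparison, with its dynamic spatio-temporal model in place of this static regression and standard time series models as the benchmark.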
5

Wu, Kejin, and Sayar Karmakar. "Model-Free Time-Aggregated Predictions for Econometric Datasets." Forecasting 3, no. 4 (December 8, 2021): 920–33. http://dx.doi.org/10.3390/forecast3040055.

Abstract:
Forecasting volatility from econometric datasets is a crucial task in finance. To acquire meaningful volatility predictions, various methods have been built upon GARCH-type models, but these classical techniques suffer from instability when data are short and volatile. Recently, a normalizing and variance-stabilizing (NoVaS) method for predicting squared log-returns of financial data was proposed. This model-free method has been shown to possess more accurate and stable prediction performance than GARCH-type methods. However, whether this method can sustain such high performance for long-term prediction remains in doubt. In this article, we first explore the robustness of the existing NoVaS method for long-term time-aggregated predictions. Then, we develop a more parsimonious variant of the existing method. With systematic justification and extensive data analysis, our new method shows better performance than current NoVaS and standard GARCH(1,1) methods on both short- and long-term time-aggregated predictions. The success of our new method is notable since efficient prediction with short and volatile data always carries great importance. Additionally, this article opens potential avenues where one can design a model-free prediction structure to meet specific needs.
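A minimal sketch of the GARCH(1,1) recursion that serves as the benchmark in this article (the parameter values are invented, and the NoVaS method itself is not reproduced here): simulate a return series and form the one-step-ahead conditional variance forecast.

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulate a GARCH(1,1) return series: sigma2_t = w + a*r_{t-1}^2 + b*sigma2_{t-1}.
w, a, b = 0.1, 0.1, 0.8                   # persistence a + b < 1 (stationary)
T = 1000
r = np.zeros(T)
sigma2 = np.zeros(T)
sigma2[0] = w / (1 - a - b)               # start at the unconditional variance
for t in range(1, T):
    sigma2[t] = w + a * r[t - 1] ** 2 + b * sigma2[t - 1]
    r[t] = np.sqrt(sigma2[t]) * rng.normal()

# One-step-ahead variance forecast from the last observation.
forecast = w + a * r[-1] ** 2 + b * sigma2[-1]
print(forecast > 0)
```

In the paper's comparison, such GARCH forecasts (and their time-aggregated versions) are the baseline that the model-free NoVaS predictions are measured against.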
6

Cadotte, Ernest R., Robert B. Woodruff, and Roger L. Jenkins. "Expectations and Norms in Models of Consumer Satisfaction." Journal of Marketing Research 24, no. 3 (August 1987): 305–14. http://dx.doi.org/10.1177/002224378702400307.

Abstract:
Disconfirmation models of customer satisfaction employing three alternative standards of performance were compared by using causal modeling. Pre- and post-measures were obtained from subjects in three different use situations. The disconfirmation paradigm is supported. The analysis suggests that best brand norm and product norm are additional standards used for evaluating focal brand performance.
7

G. C., Surya Bahadur, and Ravindra Prasad Baral. "The Effect of Corporate Governance and Ownership Structure on Financial Performance of Listed Companies in Nepal." Journal of Nepalese Business Studies 12, no. 1 (December 31, 2019): 1–18. http://dx.doi.org/10.3126/jnbs.v12i1.28148.

Abstract:
The paper attempts to analyze the relationships among corporate governance, ownership structure and firm performance in Nepal. The study comprises a panel data set of 25 firms listed on the Nepal Stock Exchange (NEPSE) covering a period of five years from 2012 to 2016. The econometric methodology for the study consists primarily of the least squares dummy variable (LSDV) model, fixed and random effects panel data models and the two-stage least squares (2SLS) model. The study finds a bi-directional relationship between corporate governance and performance. Among internal corporate governance mechanisms, smaller board size, a higher proportion of independent directors, reduced ownership concentration, improved standards of transparency and disclosure, and appropriately designed director compensation packages are important dimensions that listed firms and regulators in Nepal should focus on. Ownership concentration is found to have a positive effect on performance; however, it affects corporate governance negatively. This study improves understanding of, and provides empirical evidence for, the endogenous relationship between corporate governance and performance, and offers support for a principal-principal agency relationship. The results of this study lead to several practical implications for listed firms as well as policymakers in Nepal in promoting sound corporate governance practices and codes. For listed companies, improved compliance with a code of corporate governance or voluntary adoption of best practices can provide a means of achieving improved performance.
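The 2SLS step mentioned in the abstract can be sketched on synthetic data. The setup below is illustrative only (a "governance" score instrumented by a variable z), not the study's actual specification: an endogenous regressor biases OLS, while replacing it with its first-stage fitted values recovers the causal effect.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 2000

# Endogenous regressor: governance correlated with the structural error u,
# with an instrument z that shifts governance but not performance directly.
z = rng.normal(size=n)
u = rng.normal(size=n)                    # structural error
gov = 0.8 * z + 0.5 * u + rng.normal(size=n)
perf = 1.0 + 0.6 * gov + u                # true causal effect = 0.6

X = np.column_stack([np.ones(n), gov])
Z = np.column_stack([np.ones(n), z])

# OLS is biased upward here; 2SLS replaces gov by its first-stage fitted values.
b_ols = np.linalg.lstsq(X, perf, rcond=None)[0]
X_hat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]   # first stage
b_2sls = np.linalg.lstsq(X_hat, perf, rcond=None)[0]

print(round(b_ols[1], 2), round(b_2sls[1], 2))
```

This is exactly the kind of endogeneity (performance and governance determining each other) that motivates the study's use of 2SLS alongside the panel estimators.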
8

Ahmed, Elsadig Musa. "Modelling green productivity spillover effects on sustainability." World Journal of Science, Technology and Sustainable Development 17, no. 3 (April 1, 2020): 257–67. http://dx.doi.org/10.1108/wjstsd-01-2020-0009.

Abstract:
Purpose: This study aims to explain the integration of innovation and climate with economic growth through the Green Productivity (GP) concept. This is drawn from the integration of two important developmental strategies: productivity improvement and environmental protection. Productivity provides the framework for continuous improvement, while environmental protection provides the foundation for sustainable development. Therefore, GP is a strategy for enhancing productivity and environmental performance for overall socio-economic development.
Design/methodology/approach: Three variations of frameworks and econometric models were developed to measure green total factor productivity, green labour productivity and green capital productivity, and their contributions to green productivity and sustainable development; these were based on extensive and intensive growth theories.
Findings: The sustainability of higher economic growth will likely continue to be productivity driven. This will be through the enhancement of total factor productivity (TFP) as technological progress in nations that combine the three dimensions of sustainable development (economic development, environmental protection and social sustainable development via human capital development). Such an enhancement needs to emphasise the quality of the workforce, demand intensity, economic restructuring, capital structure, technical progress and environmental standards. It should be recalled that green productivity through green TFP demonstrates the sustainable development concept of progressing technologically. It will ensure the rights of future, as well as current, generations to enjoy a better life.
Originality/value: The study fills gaps in growth theories by developing three variations of frameworks and econometric models, and by internalising pollutant emissions as private and unpriced inputs in the three models. Further, the green capital productivity model is the sole contributing model developed in this research; it has not been considered in any previous studies. This study highlights the green productivity that was overlooked by the studies awarded the Nobel Prize in economic sciences in 2018.
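The growth-accounting idea behind green TFP can be shown as a one-line calculation: TFP growth is output growth minus the share-weighted growth of inputs, here with emissions internalised as an additional (unpriced) input, as the abstract describes. All the numbers below are illustrative, not estimates from the study.

```python
import numpy as np

# Growth-accounting sketch: green TFP growth as output growth minus
# share-weighted growth of labour, capital and (unpriced) emissions.
# Shares and growth rates are invented for illustration.
g_output = 0.05
g_labour, g_capital, g_emissions = 0.01, 0.04, -0.02
s_labour, s_capital, s_emissions = 0.55, 0.35, 0.10

g_tfp_green = g_output - (s_labour * g_labour
                          + s_capital * g_capital
                          + s_emissions * g_emissions)
print(round(g_tfp_green, 4))  # -> 0.0325
```

Note that falling emissions with a positive share raise measured green TFP growth relative to conventional TFP, which is the mechanism the framework is built around.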
9

Tse, David K., and Peter C. Wilton. "Models of Consumer Satisfaction Formation: An Extension." Journal of Marketing Research 25, no. 2 (May 1988): 204–12. http://dx.doi.org/10.1177/002224378802500209.

Abstract:
The authors extend consumer satisfaction literature by theoretically and empirically (1) examining the effect of perceived performance using a model first proposed by Churchill and Surprenant, (2) investigating how alternative conceptualizations of comparison standards and disconfirmation capture the satisfaction formation process, and (3) exploring possible multiple comparison processes in satisfaction formation. Results of a laboratory experiment suggest that perceived performance exerts direct significant influence on satisfaction in addition to those influences from expected performance and subjective disconfirmation. Expectation and subjective disconfirmation seem to be the best conceptualizations in capturing satisfaction formation. The results suggest multiple comparison processes in satisfaction formation.
10

Mazreku, Ibish, Fisnik Morina, and Elvis Curraj. "Evaluation of the Financial Performance of Pension Funds. Empirical Evidence: Kosovo, Albania and North Macedonia." European Journal of Sustainable Development 9, no. 1 (February 1, 2020): 161. http://dx.doi.org/10.14207/ejsd.2020.v9n1p161.

Abstract:
Purpose: This research paper aims to analyze the evaluation of the financial performance of pension funds and to find the relationship between contributions, return on investment and net asset value, on the one hand, and pension fund performance on the other. The following research questions were asked in order to realize the purpose of the research: What are the factors affecting the performance of the pension fund? What is the relationship between pension fund performance and contributions, return on investment, and net asset value? Methodology: For the specification of the econometric model of this study, we rely on secondary data published in official World Bank reports and reports of pension funds in Kosovo, Albania and North Macedonia. To measure the empirical results, the following statistical tests are used: standard multiple regression, the fixed effects model, the random effects model, and Hausman-Taylor regression. Findings: Based on the empirical results, we can conclude that increases in gross domestic product, return on investment, contributions and net assets have positively influenced the performance of pension funds for the countries included in the study. The other independent variable, the exchange rate, on the basis of econometric estimations, turned out to be non-significant. Practical implications: The empirical results of this study may recommend that relevant institutions in Kosovo, Albania and North Macedonia undertake reforms towards the creation of efficient pension systems; these reforms are of crucial importance for pension systems, which have an economic and social character in their function as fund accumulators and benefit distributors for the categories in need. Originality: The study is conducted with secondary data, and all the empirical analyses are original, based on the authors' calculations through econometric models. Through the results of this study we aim to provide additional empirical evidence on the performance of pension funds in Kosovo, Albania and North Macedonia, recommending that relevant institutions improve the functioning of the pension system, as it is a very important part of a country's financial system and has an impact on economic growth.
Keywords: financial performance, pension fund, contributions, net assets, return on investment

Dissertations / Theses on the topic "Performance standards Econometric models"

1

Klein, David. "Regional Performance in Knowledge Economies : A Comparison of Performance Indicators and Regional Units across Spatial Econometric Models." Thesis, Umeå universitet, Kulturgeografi, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-136299.

Abstract:
Policy makers, regional planners and the like have long tried in vain to come up with both economically profound and comprehensive regional policies. These policies are extremely important to achieve development goals in the European Union, which is why regulations for economic convergence and increasing competitiveness of regions are critical. Nowadays, technological progress poses new tasks for policy makers, as economic production has shifted from industrial towards knowledge-intensive processes. Therefore, it is widely accepted that knowledge is the new trigger of regional economic performance. Yet, established knowledge assets, such as creativity, human capital and entrepreneurship, are scarcely studied jointly in research practice. This leaves the scientific community with a fragmented understanding of this topic, and can cause considerable confusion among policy makers. The aim of this paper is twofold. First, on the conceptual front, it investigates the role of knowledge assets for regional performance. The major question in this regard is whether the more recent creative class approach outperforms conventional human capital measures. Secondly, the paper aims to clarify both the significance of selecting regional performance indicators and the role of regional hierarchy. Work undertaken in this regard uses various indicators interchangeably and often fails to get to the bottom of what the choice of the indicator means for the approach. By the same token, there are persistent uncertainties about the choice and the relevance of regional units for spatial econometric analysis. Therefore, the analysis tries to study the consequences of choosing specific indicators and regional units. Using a general spatial model, the paper estimates a Cobb-Douglas production function of the economic performance of 290 Swedish municipalities between 2009 and 2014. With this approach, spatial autocorrelation and spatial error disturbances are eliminated, allowing for more comprehensive and spatially robust results. By doing so, multiple variables representing human capital, creativity, entrepreneurship and innovative activities are examined and compared across four models varying in regional scale and output indicators. This approach also controls for a set of industrial and socio-economic features of the regional environment. The study found significant differences for varying regional levels and performance indicators. Moreover, creativity, narrowly defined, seems to be most strongly linked to regional performance, outperforming other variables, including human capital measures.
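A minimal sketch of estimating a log-linearised Cobb-Douglas production function by OLS. The spatial autocorrelation and spatial error terms of the thesis's general spatial model are omitted here for brevity, and the data are synthetic, so this shows only the non-spatial core of the exercise.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 290                                   # e.g., one observation per municipality

# Synthetic Cobb-Douglas economy: Y = A * K^0.3 * H^0.6 * e
log_K = rng.normal(10, 1, n)              # log capital
log_H = rng.normal(8, 1, n)               # log human capital / knowledge asset
log_Y = 1.0 + 0.3 * log_K + 0.6 * log_H + rng.normal(0, 0.1, n)

# Log-linearised production function estimated by OLS; the coefficients
# are the output elasticities of the inputs.
X = np.column_stack([np.ones(n), log_K, log_H])
b = np.linalg.lstsq(X, log_Y, rcond=None)[0]
print(np.round(b[1:], 2))
```

In the thesis, a spatially lagged dependent variable and a spatial error process would be added to this specification, which is what makes the estimates robust to spatial dependence across municipalities.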
2

Sundin, Timmy. "Environmental regulation in the Swedish pulp and paper industry : An econometric analysis of the effectiveness of performance standards." Thesis, Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-64136.

Abstract:
The purpose of this study is to analyze the effectiveness of environmental regulations for water-borne emissions in the Swedish pulp and paper industry. Furthermore, the study intends to analyze whether there are differences in effectiveness before and after the restructuring of the Swedish regulatory procedures in 1999. It also addresses the impact of compliance periods in the regulatory process. The method is econometric and based on the use of a fixed-effects panel data regression model. The data comprise 1,698 unique observations from 21 Swedish pulp and paper mills during the period 1980-2013 regarding emissions, emission standards and production levels. The results show that environmental regulation in the industry has been effective in the sense that emissions have decreased with the implementation of performance standards. Furthermore, the period before 1999 shows a greater reduction of emissions than the period after 1999. Finally, the results indicate that the use of compliance periods appears to have contributed to a greater reduction in emissions compared to cases where no such periods are granted.
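The fixed-effects ("within") estimator described in the abstract can be sketched on a synthetic mill-by-year panel. The dimensions loosely mirror the study's 21 mills over 1980-2013, but the data and the effect size are invented: demeaning by mill removes the unobserved mill-specific effects, and pooled OLS on the demeaned data estimates the effect of a performance standard on emissions.

```python
import numpy as np

rng = np.random.default_rng(7)
n_mills, n_years = 21, 30

# Panel with mill-specific fixed effects: emissions respond to whether a
# performance standard is in force (true effect: -1.5, invented).
alpha = rng.normal(0, 2, n_mills)                  # unobserved mill effects
std = rng.binomial(1, 0.5, (n_mills, n_years))     # standard in force (0/1)
eps = rng.normal(0, 0.5, (n_mills, n_years))
emis = alpha[:, None] - 1.5 * std + eps

# Within (fixed-effects) estimator: demean each mill's series, then pooled OLS.
y = emis - emis.mean(axis=1, keepdims=True)
x = std - std.mean(axis=1, keepdims=True)
beta_fe = (x * y).sum() / (x * x).sum()
print(round(beta_fe, 2))
```

A negative estimate here is the panel-data analogue of the study's finding that emissions fell when performance standards were implemented.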
3

Galagedera, Don U. A. "Investment performance appraisal and asset pricing models." Monash University, Dept. of Econometrics and Business Statistics, 2003. http://arrow.monash.edu.au/hdl/1959.1/5780.

4

Nel, Willem Soon. "The development of optimal composite multiples models for the performance of equity valuations of listed South African companies : an empirical investigation." Thesis, Stellenbosch : Stellenbosch University, 2014. http://hdl.handle.net/10019.1/95808.

Abstract:
Thesis (PhD)--Stellenbosch University, 2014.
ENGLISH ABSTRACT: The practice of combining single-factor multiples (SFMs) into composite multiples models is underpinned by the theory that various SFMs carry incremental information, which, if encapsulated in a superior value estimate, largely eliminates biases and errors in individual estimates. Consequently, the chief objective of this study was to establish whether combining single value estimates into an aggregate estimate will provide a superior value estimate vis-à-vis single value estimates. It is envisaged that this dissertation will provide a South African perspective, as an emerging market, on composite multiples modelling and the multiples-based equity valuation theory on which it is based. To this end, the study included 16 SFMs, based on value drivers representing all of the major value driver categories, namely earnings, assets, dividends, revenue and cash flows. The validation of the research hypothesis hinged on the results obtained from the initial cross-sectional empirical investigation into the factors that complicate the traditional multiples valuation approach. The main findings from the initial analysis, which subsequently directed the construction of the composite multiples models, were the following: Firstly, the evidence suggested that, when constructing multiples, multiples whose peer groups are based on a combination of valuation fundamentals perform more accurate valuations than multiples whose peer groups are based on industry classifications. Secondly, the research results confirmed that equity-based multiples produce more accurate valuations than entity-based multiples. Thirdly, the research findings suggested that multiples models that are constructed on earnings-based value drivers, especially headline earnings (HE), offer higher degrees of valuation accuracy compared to multiples models that are constructed on dividend-, asset-, revenue- or cash flow-based value drivers.
The results from the initial cross-sectional analysis were also subjected to an industry analysis, which both confirmed and contradicted the initial cross-sectional evidence. The industry-based research findings suggested that both the choice of optimal Peer Group Variable (PGV) and the choice of optimal value driver are industry-specific. As with the initial cross-sectional analysis, earnings-based value drivers dominated the top positions in all 28 sectors that were investigated, while HE was again confirmed as the most accurate individual driver. However, the superior valuation performance of multiples whose peer groups are based on a combination of valuation fundamentals, as deduced from the cross-sectional analysis conducted earlier, did not hold when subjected to an industry analysis, suggesting that peer group selection methods are industry-specific. From this evidence, it was possible to construct optimal industry-specific SFMs models, which could then be compared to industry-specific composite models. The evidence suggested that composite-based modelling offered, on average, between 20.21% and 44.59% more accurate valuations annually than optimal SFMs modelling over the period 2001 to 2010. The research results suggest that equity-based composite modelling may offer substantial gains in precision over SFMs modelling. These gains are, however, industry-specific and a carte blanche application thereof is ill-advised. Therefore, since investment practitioners' reports typically include various multiples, it seems prudent to consider the inclusion of composite models as a more accurate alternative.
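The core idea of combining single-factor multiples into a composite estimate can be sketched as follows. All peer multiples and target fundamentals below are invented, and the equal-weighted mean is just one possible combination rule, not the optimal industry-specific models developed in the dissertation.

```python
import numpy as np

# Peer-group multiples for a hypothetical target firm (figures are illustrative).
peer_pe = np.array([8.0, 10.0, 12.0, 9.0])    # price/earnings of peers
peer_pb = np.array([1.2, 1.5, 1.1, 1.4])      # price/book
peer_ps = np.array([0.9, 1.1, 1.3, 1.0])      # price/sales
earnings, book, sales = 50.0, 400.0, 600.0    # target fundamentals

# Single-factor estimates: median peer multiple times the target's value driver.
v_pe = np.median(peer_pe) * earnings
v_pb = np.median(peer_pb) * book
v_ps = np.median(peer_ps) * sales

# Composite estimate: equal-weighted combination of the single-factor values.
v_composite = np.mean([v_pe, v_pb, v_ps])
print(v_pe, round(v_composite, 1))
```

The dissertation's claim is that a well-constructed combination of this kind averages out the idiosyncratic errors of the individual multiples, with the optimal weights and peer-group definitions varying by industry.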
5

Wong, Siu-kei, and 黃紹基. "The performance of property companies in Hong Kong: a style analysis approach." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2003. http://hub.hku.hk/bib/B26720401.

6

Álvarez, Ruano Enrique José. "Effect of time and type of measurement on objective performance trends: a longitudinal analysis of new salespeople." Doctoral thesis, Universitat de Barcelona, 2015. http://hdl.handle.net/10803/386480.

Abstract:
The measurement of sales force performance is an issue of the utmost importance. Research in this area has primarily focused on cross-sectional studies establishing a link between various types of predictors and sales performance at a specific moment in time, despite the well-accepted idea that performance is dynamic over time. Moreover, the most frequent way to measure performance has been through subjective measures. Yet, little is actually known empirically about trends (growth trajectories) of objective performance over time and their determinants. The empirical research study presented in this dissertation is designed to fill this gap. First, we conducted an extensive survey of the literature in order to identify empirical work using objective measures of performance at the individual level in the sales domain, yielding 133 published studies and 148 samples. Then, we analyzed in detail, on the one hand, all studies using two or more objective measures of performance and, on the other, studies conducting longitudinal research. Building on job stages theory, we argue specifically that measurements of objective performance taken at different times are not related when salespeople are involved in changing contexts. Furthermore, we hypothesize that growth trajectories measured with different indicators of objective performance are not related. Random coefficient modeling in the form of Hierarchical Linear Modeling is then used to analyze objective performance over time. The individual performance growth trajectories of 230 salespeople who joined a Spanish direct selling firm were modeled using SPSS and R software. To the best of our knowledge, this thesis represents the first longitudinal study to explicitly analyze and compare the trends (growth trajectories) of various measures of objective performance (sales, units and compensation) of salespeople during their first months at a company.
This analysis yielded three important results at the individual salesperson level. First, time matters when measuring individual objective performance. Our findings confirm that performance is dynamic over time and that there is a rank-order effect when measuring salespeople. Second, different objective measures of performance quantify different things. We found no evidence that the growth trajectories of objective measures of performance taken during the same period are related, thus, building on the idea that objective measures of performance are not interchangeable. Third, these findings help understand the specificities of new salespeople in direct selling, facing a transitional job stage. This thesis, thus, contributes to the longitudinal analysis of sales performance confirming (a) that future esearch studies have to consider the relationship over time of objective performance with any set of predictors, and (b) that objective indicators of sales performance are not interchangeable and have to be chosen carefully by scholars according to the objectives of each investigation. Additionally, it has important implications for practitioners referred to selection, promotion, retention, evaluation, training and compensation of salesforces.
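The random-coefficient growth-trajectory idea described above can be illustrated with a minimal two-stage sketch on simulated data. All numbers below (sample size, growth parameters) are hypothetical, and the per-person least-squares fits stand in for the thesis's Hierarchical Linear Modeling in SPSS and R:

```python
import numpy as np

rng = np.random.default_rng(0)
n_people, n_months = 230, 12

# Simulated monthly sales with person-specific intercepts and slopes:
# the random-coefficient structure in which performance trends differ
# across salespeople (all parameters hypothetical).
intercepts = rng.normal(100.0, 15.0, n_people)
slopes = rng.normal(2.0, 1.5, n_people)          # growth per month
months = np.arange(n_months)
sales = (intercepts[:, None] + slopes[:, None] * months
         + rng.normal(0.0, 5.0, (n_people, n_months)))

# Stage 1 of a two-stage random-coefficient analysis: fit each
# salesperson's linear growth trajectory separately.
fitted = np.array([np.polyfit(months, y, 1) for y in sales])  # cols: slope, intercept

# Stage 2: summarize the distribution of individual trajectories.
mean_slope, slope_sd = fitted[:, 0].mean(), fitted[:, 0].std()
print(f"mean growth: {mean_slope:.2f}, between-person SD: {slope_sd:.2f}")
```

A full HLM analysis would instead estimate the fixed and random effects jointly (e.g. with a mixed-effects routine), but the two-stage version shows the core idea: each salesperson contributes a trajectory, and the question is how those trajectories vary.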
APA, Harvard, Vancouver, ISO, and other styles
7

Lorande, Marcelo Schiller. "Previsão de inflação no Brasil utilizando desagregação por componentes de localidade." reponame:Repositório Institucional do FGV, 2018. http://hdl.handle.net/10438/24759.

Full text
Abstract:
This work proposes disaggregating Brazil's inflation index by locality components to enhance the predictive performance of econometric models. Autoregressive models were implemented with and without explanatory macroeconomic variables, in order to evaluate how disaggregation affects each of them. In addition, two statistical tests were used to compare forecast performance: the Model Confidence Set and the Superior Predictive Ability test. For short horizons of up to 3 months, first-order autoregressive models showed unbeatable performance, whereas for longer horizons, disaggregated macroeconomic models generate statistically superior forecasts.
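The aggregate-versus-disaggregated comparison can be sketched with a toy AR(1) example: forecast the headline index directly, or forecast each locality component and sum the forecasts. The simulated regional series and parameters below are hypothetical and only illustrate the mechanics, not the dissertation's models:

```python
import numpy as np

rng = np.random.default_rng(1)
T, n_regions = 200, 4

# Simulated regional inflation components, each an AR(1) process
# (persistence parameters are illustrative).
phi = np.array([0.6, 0.5, 0.7, 0.4])
x = np.zeros((T, n_regions))
for t in range(1, T):
    x[t] = phi * x[t - 1] + rng.normal(0.0, 0.2, n_regions)
agg = x.sum(axis=1)          # headline index as the sum of components

def ar1_forecast(series):
    """One-step AR(1) forecast, with the coefficient estimated by OLS
    of the series on its own lag (no intercept, for brevity)."""
    y, ylag = series[1:], series[:-1]
    phi_hat = (ylag @ y) / (ylag @ ylag)
    return phi_hat * series[-1]

# Direct forecast of the aggregate vs. bottom-up sum of component forecasts.
direct = ar1_forecast(agg)
bottom_up = sum(ar1_forecast(x[:, i]) for i in range(n_regions))
print(direct, bottom_up)
```

In a proper forecasting exercise the two approaches would be compared out of sample over many periods, which is where tests such as the Model Confidence Set come in.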
APA, Harvard, Vancouver, ISO, and other styles
8

Duong, Lien Thi Hong. "Australian takeover waves : a re-examination of patterns, causes and consequences." UWA Business School, 2009. http://theses.library.uwa.edu.au/adt-WU2009.0201.

Full text
Abstract:
This thesis provides a more precise characterisation of the patterns, causes and consequences of takeover activity in Australia over three decades, spanning 1972 to 2004. The first contribution of the thesis is to characterise the time series behaviour of takeover activity. It is found that linear models do not adequately capture the structure of merger activity; a non-linear two-state Markov switching model works better. A key contribution of the thesis is, therefore, to propose an approach that combines a State-Space model with a Markov switching regime model in describing takeover activity. Experimental results based on our approach show an improvement over other existing approaches. We find four waves, one in the 1980s, two in the 1990s, and one in the 2000s, with an expected duration of each wave state of approximately two years. The second contribution is an investigation of the extent to which financial and macro-economic factors predict takeover activity after controlling for the probability of takeover waves. A main finding is that while stock market boom periods are empirically associated with takeover waves, the underlying driver is the interest rate level. A low interest rate environment is associated with higher aggregate takeover activity. This relationship is consistent with Shleifer and Vishny's (1992) liquidity argument that takeover waves are symptoms of a lower cost of capital. Replicating the analysis for the biggest takeover market in the world, the US, reveals a remarkable consistency of results. In short, the Australian findings are not idiosyncratic. Finally, the implications for target and bidder firm shareholders are explored via an investigation of takeover bid premiums and long-term abnormal returns, separately for the wave and non-wave periods. This represents the third contribution to the literature on takeover waves.
Findings reveal that target shareholders earn abnormally positive returns in takeover bids and that bid premiums are slightly lower in the wave periods. Analysis of the returns to bidding firm shareholders suggests that the lower premiums earned by target shareholders in the wave periods may simply reflect lower total economic gains, at the margin, to takeovers made in the wave periods. It is found that bidding firms earn normal post-takeover returns (relative to a portfolio of firms matched in size and survival) if their bids are made in the non-wave periods. However, bidders who announce their takeover bids during the wave periods exhibit significant under-performance. For mergers that took place within waves, there is no difference in bid premiums, nor is there a difference in the long-run returns of bidders involved in the first half and second half of the waves. We find that none of the theories of merger waves (managerial, mis-valuation and neoclassical) can fully account for the Australian takeover waves and their effects. Instead, our results suggest that a combination of these theories may provide a better explanation. Given that normal returns are observed for acquiring firms taken as a whole, we are more inclined to uphold the neoclassical argument for merger activity. However, the evidence is not entirely consistent with neoclassical rational models: the under-performance effect during the wave states is consistent with herding behaviour by firms.
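The two-state regime idea can be illustrated with a minimal Markov-chain sketch. The transition probabilities below are hypothetical, chosen only so that the expected duration of the wave state matches the roughly two-year figure reported above; the thesis's actual model additionally embeds this chain in a State-Space framework:

```python
import numpy as np

# Two-state Markov regime model of takeover activity:
# state 0 = non-wave, state 1 = wave; p[i, j] = P(next = j | current = i).
# Probabilities are illustrative, set so a wave lasts ~24 months on average.
p = np.array([[0.98, 0.02],
              [1 / 24, 1 - 1 / 24]])

# Expected sojourn time in state i of a Markov chain is 1 / (1 - p[i, i]).
expected_wave_months = 1.0 / (1.0 - p[1, 1])

# Simulate 50 years of monthly regimes.
rng = np.random.default_rng(2)
state, path = 0, []
for _ in range(600):
    path.append(state)
    state = rng.choice(2, p=p[state])
print(expected_wave_months, sum(path))
```

Fitting such a model to data (rather than fixing the probabilities) is done with the Hamilton filter and maximum likelihood, which is the estimation step the thesis builds on.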
APA, Harvard, Vancouver, ISO, and other styles
9

Chaudhry, Ashraf. "Quantitative performance evaluation of benchmarked active funds." Phd thesis, 2005. http://hdl.handle.net/1885/151357.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Henning, Ingrid. "Verband tussen emosionele intelligensie en werkprestasie in 'n gesondheidsorgomgewing." Diss., 2006. http://hdl.handle.net/10500/2144.

Full text
Abstract:
Text in Afrikaans
Health care has typical challenges and stress factors as employees deal with people in a very personal manner. This exclusively client-centred environment requires special skills from employees if they wish to perform well. In addition to excellent interpersonal and stress management skills, they need to be able to handle their own and patients' emotions well. These skills can be referred to as emotional intelligence competencies. Emotional intelligence is a relatively new concept with many possibilities if applied correctly. This study involves 114 employees, their colleagues and supervisors in two health-care organisations. The 360° Emotional Competency Profiler was used together with the internal performance evaluation scores, and statistical techniques such as t-tests and discriminant analysis were utilized. It was found that certain emotional intelligence dimensions and work performance are related to a certain extent. However, other factors also need to be considered if one wants to make predictions in this regard.
Industrial and Organisational Psychology
MCOM (Industrial and Organisational Psychology)
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Performance standards Econometric models"

1

Bauer, Georg. Wertorientierte Steuerung multidivisionaler Unternehmen über Residualgewinne. Frankfurt am Main: Lang, 2009.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Kolpakov, Vasiliy. Economic and mathematical and econometric modeling: Computer workshop. ru: INFRA-M Academic Publishing LLC., 2020. http://dx.doi.org/10.12737/24417.

Full text
Abstract:
The textbook presents mathematical research methods and models of economic objects and processes, designed for analysing and predicting economic factors and for developing control solutions both under deterministic conditions and under uncertainty and dynamics. Each chapter of the book consists of a theoretical framework, several examples discussed in detail, and tasks for independent work. The standard office programs Excel and Mathcad are used as the modeling workbench. The tutorial is oriented towards students' independent completion of individual tasks in the disciplines "Economic-mathematical methods" and "Econometrics". It meets the requirements of the latest generation of the Federal state educational standard of higher education. The publication is intended for students and postgraduate students in economic disciplines; it can also be useful to them when preparing final qualifying works. The book will further be useful to practitioners engaged in analysing the current financial and economic condition and future development of firms and businesses.
APA, Harvard, Vancouver, ISO, and other styles
3

Howell, Syd. Correlation between market timing performance and stock selection performance in the Henriksson-Merton model. Manchester: Manchester Business School, 1992.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

Brown, Kenneth M. Measuring economic performance: A study. Washington: U.S. G.P.O., 1986.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

United States. Congress. Joint Economic Committee., ed. Measuring economic performance: A study. Washington: U.S. G.P.O., 1986.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Coate, Stephen. Reciprocity without commitment: Characterization and performance of informal risk-sharing arrangements. Coventry: University of Warwick, Development Economics Research Centre, 1989.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

Benati, Luca. Evolving post-World war II UK economic performance. London: Bank of England, 2004.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

Benati, Luca. Evolving post-World War II UK economic performance. London: Bank of England, 2004.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Performance of the Mombasa port: An empirical analysis. Nairobi, Kenya: Kenya Institute for Public Policy Research and Analysis, 2009.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
10

Gott, C. Deene. Exploratory models to link job performance to enlistment standards. Brooks Air Force Base, Tex: Air Force Systems Command, Air Force Human Resources Laboratory, 1988.

Find full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Performance standards Econometric models"

1

Bastianin, Andrea, Matteo Manera, Anil Markandya, and Elisa Scarpa. "Evaluating the Empirical Performance of Alternative Econometric Models for Oil Price Forecasting." In The Interrelationship Between Financial and Energy Markets, 157–81. Berlin, Heidelberg: Springer Berlin Heidelberg, 2014. http://dx.doi.org/10.1007/978-3-642-55382-0_7.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Stenner, A. Jackson, and Mark H. Stone. "Does the Reader Comprehend the Text Because the Reader Is Able or Because the Text Is Easy?" In Explanatory Models, Unit Standards, and Personalized Learning in Educational Measurement, 133–52. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-3747-7_11.

Full text
Abstract:
Does the reader comprehend the text because the reader is able or because the text is easy? Localizing the cause of comprehension in either the reader or the text is fraught with contradictions. A proposed solution uses a Rasch equation to model comprehension as the difference between a reader measure and a text measure. Computing such a difference requires that reader and text are measured on a common scale. Thus, the puzzle is solved by positing a single continuum along which texts and readers can be conjointly ordered. A reader's comprehension of a text is a function of the difference between reader ability and text readability. This solution forces recognition that generalizations about reader performance can be text independent (reader ability) or text dependent (comprehension). The article explores how reader ability and text readability can be measured on a single continuum, and the implications that this formulation holds for reading theory, the teaching of reading, and the testing of reading.
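The reader-minus-text formulation can be written down directly. Below is a minimal sketch of the dichotomous Rasch form, in which the expected comprehension rate depends only on the difference between the two measures on a common logit scale; the numerical values are illustrative:

```python
import math

def comprehension_rate(reader_ability, text_readability):
    """Rasch form: success probability is a logistic function of the
    difference between reader measure and text measure (both in logits)."""
    return 1.0 / (1.0 + math.exp(-(reader_ability - text_readability)))

# A reader exactly matched to the text comprehends at 50%; an advantage
# of about 1.1 logits lifts the expected rate to roughly 75%.
print(comprehension_rate(0.0, 0.0))   # -> 0.5 (matched reader and text)
print(comprehension_rate(1.1, 0.0))   # reader 1.1 logits above the text
```

The key property the abstract stresses is visible here: only the difference enters, so shifting both measures by the same amount leaves predicted comprehension unchanged.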
APA, Harvard, Vancouver, ISO, and other styles
3

Fisher, William P., and A. Jackson Stenner. "Metrology for the Social, Behavioral, and Economic Sciences." In Explanatory Models, Unit Standards, and Personalized Learning in Educational Measurement, 217–22. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-3747-7_17.

Full text
Abstract:
A metrological infrastructure for the social, behavioral, and economic sciences has foundational and transformative potentials relating to education, health care, human and natural resource management, organizational performance assessment, and the economy at large. The traceability of universally uniform metrics to reference standard metrics is a taken-for-granted essential component of the infrastructure of the natural sciences and engineering. Advanced measurement methods and models capable of supporting similar metrics, standards, and traceability for intangible forms of capital have been available for decades but have yet to be implemented in ways that take full advantage of their capacities. The economy, education, health care reform, and the environment are all now top national priorities. There is nothing more essential to succeeding in these efforts than the quality of the measures we develop and deploy. Even so, few, if any, of these efforts are taking systematic advantage of longstanding, proven measurement technologies that may be crucial to the scientific and economic successes we seek. Bringing these technologies to the attention of the academic and business communities for use, further testing, and development in new directions is an area of critical national need.
APA, Harvard, Vancouver, ISO, and other styles
4

van der Aalst, Wil M. P. "Process Mining: A 360 Degree Overview." In Lecture Notes in Business Information Processing, 3–34. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-08848-3_1.

Full text
Abstract:
Process mining enables organizations to uncover their actual processes, provide insights, diagnose problems, and automatically trigger corrective actions. Process mining is an emerging scientific discipline positioned at the intersection between process science and data science. The combination of process modeling and analysis with the event data present in today’s information systems provides new means to tackle compliance and performance problems. This chapter provides an overview of the field of process mining, introducing the different types of process mining (e.g., process discovery and conformance checking) and the basic ingredients, i.e., process models and event data. To prepare for later chapters, event logs are introduced in detail (including pointers to standards for event data such as XES and OCEL). Moreover, a brief overview of process mining applications and software is given.
APA, Harvard, Vancouver, ISO, and other styles
5

Fisher, William P. "Measurement Systems, Brilliant Processes, and Exceptional Results in Healthcare: Untapped Potentials of Person-Centered Outcome Metrology for Cultivating Trust." In Springer Series in Measurement Science and Technology, 357–96. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-07465-3_12.

Full text
Abstract:
An historic shift in focus on the quality and person-centeredness of health care has occurred in the last two decades. Accounts of results produced from reinvigorated attention to the measurement, management, and improvement of the outcomes of health care show that much has been learned, and much remains to be done. This article proposes that causes of the failure to replicate in health care the benefits of “lean” methods lie in persistent inattention to measurement fundamentals. These fundamentals must extend beyond mathematical and technical issues to the social, economic, and political processes involved in constituting trustworthy performance measurement systems. Successful “lean” implementations will follow only when duly diligent investments in these fundamentals are undertaken. Absent those investments, average people will not be able to leverage brilliant processes to produce exceptional outcomes, and we will remain stuck with broken processes in which even brilliant people can produce only flawed results. The methodological shift in policy and practice prescribed by the authors of the chapters in this book moves away from prioritizing the objectivity of data in centrally planned and executed statistical modeling, and toward scientific models that prioritize the objectivity of substantive and invariant unit quantities. The chapters in this book describe scientific modeling’s bottom-up, emergent and evolving standards for mass customized comparability. Though the technical aspects of the scientific modeling perspective are well established in health care outcomes measurement, operationalization of the social, economic, and political aspects required for creating new degrees of trust in health care institutions remains at a nascent stage of development. Potentials for extending everyday thinking in new directions offer hope for achieving previously unattained levels of efficacy in health care improvement efforts.
APA, Harvard, Vancouver, ISO, and other styles
6

Mariano, Roberto S., and Bryan W. Brown. "Stochastic-Simulation Tests of Nonlinear Econometric Models." In Comparative Performance of U.S. Econometric Models, 250–59. Oxford University Press, 1991. http://dx.doi.org/10.1093/acprof:oso/9780195057720.003.0009.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Gerard, Adams F., and Joaquin Vial. "Comparisons of Macroeconometric Models of Developing Economies." In Comparative Performance of U.S. Econometric Models, 260–85. Oxford University Press, 1991. http://dx.doi.org/10.1093/acprof:oso/9780195057720.003.0010.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Klein, Lawrence R. "Past, Present, and Possible Future of Macroeconometric Models and Their Uses." In Comparative Performance of U.S. Econometric Models, 2–16. Oxford University Press, 1991. http://dx.doi.org/10.1093/acprof:oso/9780195057720.003.0001.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Adams, F. Gerard, and Lawrence R. Klein. "Performance of Quarterly Econometric Models of the United States: A New Round of Model Comparisons." In Comparative Performance of U.S. Econometric Models, 18–60. Oxford University Press, 1991. http://dx.doi.org/10.1093/acprof:oso/9780195057720.003.0002.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

McNees, Stephen K. "Comparing Macroeconomic Model Forecasts under Common Assumptions." In Comparative Performance of U.S. Econometric Models, 69–84. Oxford University Press, 1991. http://dx.doi.org/10.1093/acprof:oso/9780195057720.003.0003.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Performance standards Econometric models"

1

Takara, Lucas de Azevedo, Viviana Cocco Mariani, and Leandro dos Santos Coelho. "Autoencoder Neural Network Approaches for Anomaly Detection in IBOVESPA Stock Market Index." In Congresso Brasileiro de Inteligência Computacional. SBIC, 2021. http://dx.doi.org/10.21528/cbic2021-37.

Full text
Abstract:
Anomalies are patterns in data that do not conform to a well-defined notion of normal behavior. Anomaly detection has been applied to many problems such as bank fraud, fault detection, and noise reduction, among many others. Some approaches to detecting anomalies include classical statistical econometric methods such as the AutoRegressive Moving Average (ARMA) and AutoRegressive Integrated Moving Average (ARIMA) approaches. More recently, with the progress of artificial intelligence and, more specifically, machine learning, new algorithms such as one-class support vector machines, isolation forests, gradient boosting, and deep neural networks have been applied to such tasks. This paper proposes an anomaly detection framework for the Índice da Bolsa de Valores de São Paulo (IBOVESPA), a major stock market index that tracks the performance of around 50 of the most liquid stocks traded on the São Paulo Stock Exchange in Brazil. Exploring unsupervised autoencoder neural network algorithms, we compare long short-term memory autoencoder, bidirectional long short-term memory autoencoder, and convolutional autoencoder models, aiming to explore the performance of these architectures for anomaly detection. Because autoencoders learn a compressed representation of their input, we train these models on normal data by minimizing the mean absolute error (MAE) loss function and evaluate them with anomalous inputs. We set a reconstruction error threshold, and when the reconstruction error of a test sample exceeds it, an anomaly is detected. Our results show that these models perform quite well and can be applied to real stock market data.
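The reconstruction-error scheme the abstract describes can be sketched without a neural network. The snippet below uses a linear PCA reconstruction as a stand-in for the autoencoder bottleneck (a simplification; the paper's models are nonlinear), with an MAE threshold taken from the training error distribution and all data simulated:

```python
import numpy as np

rng = np.random.default_rng(3)

# "Normal" training windows (think standardized index features).
train = rng.normal(0.0, 1.0, (500, 20))
mean = train.mean(axis=0)

# Linear stand-in for the autoencoder bottleneck: keep 4 principal
# components; encode = project, decode = reconstruct from the projection.
_, _, Vt = np.linalg.svd(train - mean, full_matrices=False)
components = Vt[:4]

def mae_reconstruction_error(x):
    z = (x - mean) @ components.T          # encode
    x_hat = z @ components + mean          # decode
    return np.abs(x - x_hat).mean(axis=-1)

# Threshold from the training error distribution, as in the paper's
# scheme: flag any input whose reconstruction error exceeds it.
threshold = np.quantile(mae_reconstruction_error(train), 0.99)

anomaly = rng.normal(6.0, 1.0, 20)         # clearly out-of-distribution
print(mae_reconstruction_error(anomaly) > threshold)
```

The design choice is the same as in the paper: the model is fitted only on normal data, so anything it cannot reconstruct well is, by construction, unlike what it has seen.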
APA, Harvard, Vancouver, ISO, and other styles
2

Perera, Treshani. "Forecasting the Performance of Commercial Property Market: Beyond the Primary Reliance on Econometric Models." In 25th Annual European Real Estate Society Conference. European Real Estate Society, 2016. http://dx.doi.org/10.15396/eres2016_91.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Bishnu, Abhijeet, and Vimal Bhatia. "On performance analysis of IEEE 802.22 (PHY) for COST-207 channel models." In 2015 IEEE Conference on Standards for Communications and Networking (CSCN). IEEE, 2015. http://dx.doi.org/10.1109/cscn.2015.7390449.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Hunten, Keith A., and Allison Barnard Feeney. "Business Object Models for Industrial Data Standards." In ASME 2011 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2011. http://dx.doi.org/10.1115/detc2011-47965.

Full text
Abstract:
Business object models, as proposed for inclusion in the ISO 10303 family of industrial data standards developed in ISO TC 184/SC4 (SC4), are a layer over the ISO 10303 architecture that is intended to simplify and make the complex standards more accessible to a wider audience, ease implementation, and improve implementation performance. This paper discusses the motivation for developing business object models in SC4, proposes a process for developing business objects, provides example business objects at different levels of complexity, and describes issues facing the two SC4 projects currently developing business object models.
APA, Harvard, Vancouver, ISO, and other styles
5

Klise, Geoffrey T., Roger Hill, Andy Walker, Aron Dobos, and Janine Freeman. "PV system “Availability” as a reliability metric — Improving standards, contract language and performance models." In 2016 IEEE 43rd Photovoltaic Specialists Conference (PVSC). IEEE, 2016. http://dx.doi.org/10.1109/pvsc.2016.7749918.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Myers, Daryl R., Keith Emery, and C. Gueymard. "Revising and Validating Spectral Irradiance Reference Standards for Photovoltaic Performance Evaluation." In ASME Solar 2002: International Solar Energy Conference. ASMEDC, 2002. http://dx.doi.org/10.1115/sed2002-1074.

Full text
Abstract:
In 1982, the American Society for Testing and Materials (ASTM) adopted consensus standard direct-normal and global-tilted solar terrestrial spectra (ASTM E891/E892). These standard spectra were intended to evaluate photovoltaic (PV) device performance and other solar-related applications. The International Standards Organization (ISO) and International Electrotechnical Commission (IEC) adopted these spectra as spectral standards ISO 9845-1 and IEC 60904-3. Additional information and more accurately representative spectra are needed by today’s PV community. Modern terrestrial spectral radiation models, knowledge of atmospheric physics, and measured radiometric quantities are applied to develop new reference spectra for consideration by ASTM.
APA, Harvard, Vancouver, ISO, and other styles
7

Marquis, Brian, Robert Greif, and Erik Curtis. "Effect of Cant Deficiency on Rail Vehicle Performance." In ASME 2005 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2005. http://dx.doi.org/10.1115/detc2005-85101.

Full text
Abstract:
Simplified train models are analyzed to assess the effect of unbalance on carbody acceleration and wheel unloading during steady-state curving motion. In this paper a half-car model appropriate for both power cars and tilting coach cars is theoretically analyzed. Models of this type are useful for examining static lean requirements as well as margins of safety at higher cant deficiencies in the Track Safety Standards of the Federal Railroad Administration (FRA). The suspension systems modeled and analyzed include the following types: rigid, flexible, tilting actuation, and combined flexible and tilting actuation. Simplified formulas are derived which can be used as an analysis tool by railroad designers to assess vehicle performance. Parametric results are presented for vertical wheel unloading and lateral carbody acceleration as a function of cant deficiency. Results show that the incorporation of tilting systems, better suspension designs, and better track quality is necessary in order to provide an equivalent margin of safety for operations at higher cant deficiency. The relationship of these results to limits in the Track Safety Standards is discussed.
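The steady-state curving relationship behind cant deficiency can be written down directly: the unbalanced lateral acceleration is the centripetal term minus the component cancelled by superelevation (cant). The expression below is the standard one; the speed, radius, superelevation, and effective rail spacing values are illustrative, not taken from the paper:

```python
G = 9.81  # gravitational acceleration, m/s^2

def unbalanced_lateral_acceleration(speed_ms, curve_radius_m,
                                    superelevation_m, rail_spacing_m=1.5):
    """Steady-state curving: net lateral acceleration in the track plane
    equals v^2/R minus the gravity component supplied by the banked track
    (superelevation over the effective rail spacing)."""
    return speed_ms**2 / curve_radius_m - G * superelevation_m / rail_spacing_m

# Illustrative case: 45 m/s (~100 mph) through an 800 m curve
# with 150 mm of superelevation.
a = unbalanced_lateral_acceleration(45.0, 800.0, 0.150)
print(f"{a:.2f} m/s^2")   # residual acceleration felt in the track plane
```

Cant deficiency is just this residual expressed as the extra superelevation that would zero it out, which is why limits on it cap both carbody acceleration and wheel unloading.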
APA, Harvard, Vancouver, ISO, and other styles
8

Liu, Jiancheng (Jessie), Claude Daley, Han-Chang Yu, and James Bond. "Comparison of Analytical and Numerical Models of Glancing Ship-Ice Collisions." In SNAME 10th International Conference and Exhibition on Performance of Ships and Structures in Ice. SNAME, 2012. http://dx.doi.org/10.5957/icetech-2012-157.

Full text
Abstract:
The International Association of Classification Societies (IACS) Polar Class (PC) requirements are widely accepted by the industry as the design standards for vessels operating in polar regions. The PC requirements consider the bow shoulder collision with infinite ice as the base scenario. The Popov collision model (Popov, 1976; Daley, 1990), updated with a pressure-area ice pressure model, was adopted for calculating the ice loads on the hull. The Popov collision model treats the ship-ice impact as so brief that the 3-D collision can be modeled as an equivalent 1-D collision. All motions between ship and ice are mapped onto the normal direction to the hull at the collision point. No collision energy is assumed to be dissipated by friction, since sliding velocities are ignored. It is of interest to assess the effects of these simplifications using a numerical method that can simulate the collision mechanism with a high level of sophistication. A numerical study of the Popov collision model of ship-ice interaction is presented in this paper. In the study, ship sliding motions and friction between hull and ice were included in a six-degree-of-freedom ship-ice collision model using LS-DYNA software. The quantities examined include force time histories, maximum force values, ship and ice motions, contact areas and positions, etc. A parametric study was carried out to quantify the effects of varying mesh sizes, ship-ice friction coefficients, ice elastic modulus, and ice buoyancy force in the FE model simulation. The effects of the simplifications in the Popov analytical model were assessed by comparing the results produced by the analytical and numerical models.
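The 1-D reduction described in this abstract can be illustrated as follows (a sketch under assumptions, not the paper's implementation: the masses and closing velocity are hypothetical, and Popov's full mass-reduction coefficients involve hull-angle and inertia terms omitted here):

```python
def equivalent_mass(m_ship_normal: float, m_ice_normal: float) -> float:
    """Reduced mass of the equivalent 1-D central collision, formed from
    ship and ice masses already mapped onto the hull-normal direction."""
    return (m_ship_normal * m_ice_normal) / (m_ship_normal + m_ice_normal)

def normal_collision_energy(m_ship_normal: float, m_ice_normal: float,
                            v_normal: float) -> float:
    """Kinetic energy [J] available for ice crushing along the hull normal.
    Friction is neglected, as in the analytical model."""
    return 0.5 * equivalent_mass(m_ship_normal, m_ice_normal) * v_normal ** 2

# Hypothetical numbers: 9,000 t ship and 30,000 t ice (normal-direction
# effective masses, added mass included), closing at 2 m/s along the normal
energy = normal_collision_energy(9.0e6, 3.0e7, 2.0)   # ~13.8 MJ
```

The numerical LS-DYNA study in the paper relaxes exactly these simplifications by restoring sliding motions and friction in six degrees of freedom.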
APA, Harvard, Vancouver, ISO, and other styles
9

Koppisetty, D. V. Suresh, S. S. Akhil Hawaldar, and Hamid M. Lankarani. "Comparison of Dummy and Human Body Models in Automotive Side Impact Collisions According to the Regulatory Standards." In ASME 2019 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2019. http://dx.doi.org/10.1115/imece2019-10680.

Full text
Abstract:
Side-impact car accidents are the second leading cause of fatalities in the United States. Regulatory standards have been developed for occupant protection in side-impact car accidents using dummies, or Anthropomorphic Test Devices (ATDs). Although the regulations are based on the use of ATDs, there may be differences between actual human crash performance and dummy crash performance. In recent years, technology has improved such that crash scenarios can be modeled in various computational software. Human dynamic responses can be examined using active human body models comprising a combination of rigid bodies, finite elements, and kinematic joints, making them versatile for all crash test scenarios. In this study, nearside occupants are considered as per regulatory standards set by the National Highway Traffic Safety Administration (NHTSA). Vehicle side-impact crash simulations are carried out using LS-DYNA finite element (FE) software, and the occupant response simulations are obtained using MADYMO. Because simulating an entire FE model of a car and occupant is time-consuming and computationally expensive, a prescribed structural motion (PSM) technique has been utilized in this study and applied to the side-door panel, with an occupant positioned in the driver seat of the car in MADYMO. Regular side-impact deformable barrier and pole test simulations are performed with belted and unbelted occupant models for two different target vehicles: a mid-size sedan and a small compact car. Responses from the dummy and human body models are compared in order to quantify differences between the two in side impacts. The results indicate that human body model behavior is generally similar to that of the dummy model in terms of kinematic responses. However, the corresponding injury parameters of the human model are typically higher than those of the dummy model.
APA, Harvard, Vancouver, ISO, and other styles
10

Williams, Lawrence J., and David J. Pethrick. "User Experience in Upgrading Early Models of Aero Derived Gas Turbine Pipeline Compressor Units to Current Standards." In ASME 1990 International Gas Turbine and Aeroengine Congress and Exposition. American Society of Mechanical Engineers, 1990. http://dx.doi.org/10.1115/90-gt-291.

Full text
Abstract:
On July 3, 1964, TransCanada PipeLines Limited commissioned the first Rolls-Royce Industrial Avon in a Cooper Bessemer pipeline compressor set. In 1986, Cooper-Rolls introduced upgraded versions of the Avon (1535E) and power turbine (RT48S). In 1987, TransCanada converted an Avon to the 1535E standard at one location and power turbines to the RT48S at two other locations. Following one year of acceptable operation, four units were converted in 1988 to the "Production" standard, with both Avon and power turbine upgraded. Testing showed the old, modified units achieved the performance guaranteed for new engines. The condition of the lead Avon 1535E at tear-down after 6,000 hours is described, together with a statement on its condition after 15,000 hours. The four units modified in 1988 were torn down for inspection during the summer of 1989, and their condition is discussed. Minor operating difficulties are described together with their solutions. This paper publishes work that was presented without publication at the Toronto conference in June 1989, together with new information that has become available since that time.
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Performance standards Econometric models"

1

Stein, Joshua. Final Project Report: Performance Models and Standards for Bifacial PV Module Technologies. Office of Scientific and Technical Information (OSTI), October 2018. http://dx.doi.org/10.2172/1481544.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Bravo-Ureta, Boris E., Eric Njuki, Ana Claudia Palacios, and Lina Salazar. Agricultural Productivity in El Salvador: A Preliminary Analysis. Inter-American Development Bank, February 2022. http://dx.doi.org/10.18235/0004020.

Full text
Abstract:
The need to enhance food security and reduce poverty, along with the growing threat posed by climate change, makes it imperative to accelerate agricultural productivity growth. This paper estimates micro-level production models to identify the major factors that have contributed to productivity growth in El Salvador, including irrigation, purchased inputs, mechanization, technical assistance, and farm size, among others. The econometric framework adopted in this investigation is grounded in recent panel-data stochastic production frontier methodologies. The results obtained from the estimation of these models are used to calculate Total Factor Productivity (TFP) change and to decompose that change into different components, including technological progress, technical efficiency (TE), and economies of scale. The findings imply that efforts are needed to improve productivity through both technological progress and technical efficiency, the latter being a measure of managerial performance. This in turn indicates that resources should be devoted to promoting the adoption and diffusion of improved technologies while enhancing managerial capabilities through agricultural extension.
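The TFP decomposition this abstract refers to can be sketched in simplified form (an illustrative simplification, assuming log-change growth rates and a single aggregate input-growth term; the paper's panel stochastic-frontier decomposition is richer than this):

```python
def tfp_growth(tech_change: float, te_change: float,
               returns_to_scale: float, input_growth: float) -> float:
    """Decompose TFP growth into technological progress, technical-efficiency
    change, and a scale component (nonzero only away from constant returns)."""
    scale_component = (returns_to_scale - 1.0) * input_growth
    return tech_change + te_change + scale_component

# Hypothetical farm: 2% technical change, 1% TE improvement,
# mildly increasing returns (RTS = 1.1), 3% aggregate input growth
growth = tfp_growth(0.02, 0.01, 1.1, 0.03)   # ~0.033, i.e., ~3.3% TFP growth
```

Under constant returns to scale (RTS = 1) the scale component vanishes and TFP growth is simply technical change plus efficiency change, which is why the abstract highlights both channels.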
APA, Harvard, Vancouver, ISO, and other styles
3

Kramer, Robert. LED Street Lighting Implementation Research, Support, and Testing. Purdue University, 2020. http://dx.doi.org/10.5703/1288284317274.

Full text
Abstract:
This report describes the results of technical analysis, field tests, and laboratory tests that were performed for LED highway lighting options by the Energy Efficiency and Reliability Center (EERC) at Purdue University Northwest for the Indiana Department of Transportation (INDOT). This effort was conducted over the past 3 years to evaluate and test the technology and viability of using modern highway lighting technology to enhance energy efficiency, safety, security, and economic development of communities and roadways. During the testing period there was continuous discussion between INDOT and EERC regarding the laboratory and field testing of INDOT-approved luminaires submitted by vendors, including multiple discussions with INDOT and vendors regarding the individual details and issues for the 29 luminaires that were tested. EERC conducted a comparison study of the various alternatives against currently installed luminaires. EERC and INDOT personnel collected field-test data for the luminaires. Field data were evaluated and compared to lighting models built from vendor-supplied IES data files. Multiple presentations were made at 3 separate Purdue Road Schools regarding the results and procedures of the testing program by EERC in conjunction with INDOT. A total of 22 final reports, considered confidential by INDOT, for individual vendor luminaires were prepared as part of this effort. These reports were submitted sequentially to INDOT as testing was completed. A total of 29 luminaires were tested. Some testing was terminated early due to design issues or vendor requests. All testing was summarized in the INDOT specification sheet attached to each report.
Observations regarding the consistency of each supplied test luminaire with the requirements of Section 7.2 of the INDOT test procedure "Procedure for evaluation and approval list requirements for solid state ballasted luminaires ITM 957-17P" are provided in the Appendix to the report for each luminaire. Details regarding how these tests were performed and the associated evaluation of performance and reliability are provided in the report. This effort included: consideration of published and vendor information; appraisal of products against national industry standards; review of physical design and thermal performance; laboratory testing of photopic performance, reliability, life-cycle data and characteristics, and power characteristics; technical and probabilistic risk studies; and field testing and analysis of LED light sources, including comparison to currently installed conventional light sources. Assistance in preparing INDOT standards for highway lighting was provided on multiple occasions.
APA, Harvard, Vancouver, ISO, and other styles
4

McKenna, Patrick, and Mark Evans. Emergency Relief and complex service delivery: Towards better outcomes. Queensland University of Technology, June 2021. http://dx.doi.org/10.5204/rep.eprints.211133.

Full text
Abstract:
Emergency Relief (ER) is a Department of Social Services (DSS) funded program, delivered by 197 community organisations (ER Providers) across Australia, to assist people facing a financial crisis with financial/material aid and referrals to other support programs. ER has been playing this important role in Australian communities since 1979. Without ER, more people living in Australia who experience a financial crisis might face further harm such as crippling debt or homelessness. The Emergency Relief National Coordination Group (NCG) was established in April 2020 at the start of the COVID-19 pandemic to advise the Minister for Families and Social Services on the implementation of ER. To inform its advice to the Minister, the NCG partnered with the Institute for Governance at the University of Canberra to conduct research to understand the issues and challenges faced by ER Providers and Service Users in local contexts across Australia. The research involved a desktop review of the existing literature on ER service provision, a large survey which all Commonwealth ER Providers were invited to participate in (and 122 responses were received), interviews with a purposive sample of 18 ER Providers, and the development of a program logic and theory of change for the Commonwealth ER program to assess progress. The surveys and interviews focussed on ER Provider perceptions of the strengths, weaknesses, future challenges, and areas of improvement for current ER provision. The trend of increasing case complexity, the effectiveness of ER service delivery models in achieving outcomes for Service Users, and the significance of volunteering in the sector were investigated. Separately, an evaluation of the performance of the NCG was conducted and a summary of the evaluation is provided as an appendix to this report. 
Several themes emerged from the review of the existing literature such as service delivery shortcomings in dealing with case complexity, the effectiveness of case management, and repeat requests for service. Interviews with ER workers and Service Users found that an uplift in workforce capability was required to deal with increasing case complexity, leading to recommendations for more training and service standards. Several service evaluations found that ER delivered with case management led to high Service User satisfaction, played an integral role in transforming the lives of people with complex needs, and lowered repeat requests for service. A large longitudinal quantitative study revealed that more time spent with participants substantially decreased the number of repeat requests for service; and, given that repeat requests for service can be an indicator of entrenched poverty, not accessing further services is likely to suggest improvement. The interviews identified the main strengths of ER to be the rapid response and flexible use of funds to stabilise crisis situations and connect people to other supports through strong local networks. Service Users trusted the system because of these strengths, and ER was often an access point to holistic support. There were three main weaknesses identified. First, funding contracts were too short and did not cover the full costs of the program—in particular, case management for complex cases. Second, many Service Users were dependent on ER which was inconsistent with the definition and intent of the program. Third, there was inconsistency in the level of service received by Service Users in different geographic locations. These weaknesses can be improved upon with a joined-up approach featuring co-design and collaborative governance, leading to the successful commissioning of social services. 
The survey confirmed that volunteers were significant for ER, making up 92% of all workers and 51% of all hours worked in respondent ER programs. Across the 122 respondents, volunteers amounted to 554 full-time equivalents, a contribution valued at $39.4 million. In total, 8,316 volunteers worked in the 122 respondent ER programs. The sector can support and upskill these volunteers (and employees as well) by developing scalable training solutions such as online training modules, updating ER service standards, and engaging in collaborative learning arrangements where large and small ER Providers share resources. More engagement with peak bodies such as Volunteering Australia might also help the sector improve its focus on volunteer engagement. Integrated services achieve better outcomes for complex ER cases: 97% of survey respondents either agreed or strongly agreed this was the case. The research identified the dimensions of service integration most relevant to ER Providers as case management, referrals, the breadth of services offered internally, co-location with interrelated service providers, an established network of support, workforce capability, and Service User engagement. Providers can individually focus on increasing the level of service integration in their ER program to improve their ability to deal with complex cases, which are clearly on the rise. At the system level, a more joined-up approach can also improve service integration across Australia. The key dimensions of this finding are discussed next in more detail. Case management is key to achieving Service User outcomes for complex cases: 89% of survey respondents either agreed or strongly agreed this was the case. Interviewees most frequently said they would provide more case management if they could change their service model.
Case management allows for more time spent with the Service User, follow-up with referral partners, and a higher level of expertise in service delivery to support complex cases. It is, of course, a costly model and is not currently funded for all Service Users through ER. Where case management is not available as part of ER, it might be available through a related service that is part of a network of support. Where possible, ER Providers should facilitate access to case management for Service Users who would benefit. At a system level, ER models with a greater component of case management could be implemented as test cases. Referral systems are also key to achieving Service User outcomes, which is reflected in the ER Program Logic presented on page 31. The survey and interview data show that referrals within an integrated service (internal) or in a service hub (co-located) are most effective. Where this is not possible, warm referrals within a trusted network of support are more effective than cold referrals, leading to higher take-up and beneficial Service User outcomes. However, cold referrals are most common, pointing to a weakness in ER referral systems. This is because ER Providers in many cases do not operate or co-locate with interrelated services, nor do they have the case management capacity to provide warm referrals in many other cases. For mental illness support, which interviewees identified as one of the most difficult issues to deal with, ER Providers offer an integrated service only 23% of the time, warm referrals 34% of the time, and cold referrals 43% of the time. A focus on referral systems at the individual ER Provider level, and at the system level through a joined-up approach, might lead to better outcomes for Service Users. The program logic and theory of change for ER have been documented with input from the research findings and are included in Section 4.3 on page 31.
These show that ER helps people facing a financial crisis to meet their immediate needs, avoid further harm, and access a path to recovery. The research demonstrates that ER is fundamental to supporting vulnerable people in Australia and should therefore continue to be funded by government.
APA, Harvard, Vancouver, ISO, and other styles
