Academic literature on the topic 'Heterogeneity Error'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Heterogeneity Error.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Journal articles on the topic "Heterogeneity Error"

1

Ballinger, T. Parker, and Nathaniel T. Wilcox. "Decisions, Error and Heterogeneity." Economic Journal 107, no. 443 (July 1, 1997): 1090–105. http://dx.doi.org/10.1111/j.1468-0297.1997.tb00009.x.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Darban, Ameneh, Mojtaba Ghaedi, and Jafar Qajar. "Analysis of the impacts of relative permeability and mobility ratio on heterogeneity loss error during upscaling of geological models." Oil & Gas Science and Technology – Revue d’IFP Energies nouvelles 75 (2020): 53. http://dx.doi.org/10.2516/ogst/2020049.

Full text
Abstract:
Detailed geological fine-grid models are upscaled to create coarse simulation models of manageable size that solve the flow equations more efficiently. Any upscaling process results in a loss of accuracy, along with an increase in errors. Numerical dispersion, heterogeneity loss, and connectivity misrepresentation are responsible for the upscaling errors. Recognizing the source of each error, and the behavior of influential factors through the upscaling process, could provide an optimum level of upscaling and an evaluation of the accuracy of upscaling methods. Despite the importance of upscaling error, little attention has been paid to this subject. This paper presents a rigorous analysis of heterogeneity loss behavior associated with the relative permeability contrast and the mobility ratio under a waterflooding process. For this purpose, heterogeneous fine-grid models are constructed by the fractional Brownian motion process. The models are upscaled by three upscaling factors. The resulting models are used in a way that eliminates the impact of numerical error among the upscaling errors, in order to focus strictly on heterogeneity loss. Water–oil displacement simulation is then performed on fine and corresponding refined upscaled models at three different ratios of relative permeabilities and mobility ratios. Next, the relation between flow performance error and heterogeneity loss is investigated via the heterogeneity loss plot. The slope of this plot gives the reservoir engineer insight for evaluating the performance of upscaling methods and the behavior of the factors that influence upscaling errors. Moreover, using the heterogeneity loss plot for each ratio, a limit of coarsening is presented. Based on the results, the heterogeneity loss error is affected more by the mobility ratio contrast than by the relative permeability difference. It is also demonstrated that water-wet reservoirs with light oil are more sensitive to the level of upscaling.
APA, Harvard, Vancouver, ISO, and other styles
3

Shurygin, D. N., V. M. Kalinchenko, and V. V. Shutkova. "INTERPOLATION ERROR ESTIMATION CONSIDERING GEOLOGICAL SPACE HETEROGENEITY." MINING INFORMATIONAL AND ANALYTICAL BULLETIN 5 (2018): 113–21. http://dx.doi.org/10.25018/0236-1493-2018-5-0-113-121.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Fan, Caiyun, Wenbin Lu, and Yong Zhou. "Testing error heterogeneity in censored linear regression." Computational Statistics & Data Analysis 161 (September 2021): 107207. http://dx.doi.org/10.1016/j.csda.2021.107207.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Shugan, Steven M. "Editorial: Errors in the Variables, Unobserved Heterogeneity, and Other Ways of Hiding Statistical Error." Marketing Science 25, no. 3 (May 2006): 203–16. http://dx.doi.org/10.1287/mksc.1060.0215.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Griffith, Daniel A., Robert Haining, and Giuseppe Arbia. "Heterogeneity of Attribute Sampling Error in Spatial Data Sets." Geographical Analysis 26, no. 4 (September 3, 2010): 300–320. http://dx.doi.org/10.1111/j.1538-4632.1994.tb00328.x.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Nichols, Ellert R., Elnaz Shadabi, and Douglas B. Craig. "Effect of alteration of translation error rate on enzyme microheterogeneity as assessed by variation in single molecule electrophoretic mobility and catalytic activity." Biochemistry and Cell Biology 87, no. 3 (June 2009): 517–29. http://dx.doi.org/10.1139/o09-010.

Full text
Abstract:
The role of translation error for Escherichia coli individual β-galactosidase molecule catalytic and electrophoretic heterogeneity was investigated using CE-LIF. An E. coli rpsL mutant with a hyperaccurate translation phenotype produced enzyme molecules that exhibited significantly less catalytic heterogeneity but no reduction of electrophoretic heterogeneity. Enzyme expressed with streptomycin-induced translation error had increased thermolability, lower activity, and no significant change to catalytic or electrophoretic heterogeneity. Modeling of the electrophoretic behaviour of β-galactosidase suggested that variation of the hydrodynamic radius may be the most significant contributor to electrophoretic heterogeneity.
APA, Harvard, Vancouver, ISO, and other styles
8

Campbell, Patrick J., Mira Patel, Jennifer R. Martin, Ana L. Hincapie, David Rhys Axon, Terri L. Warholak, and Marion Slack. "Systematic review and meta-analysis of community pharmacy error rates in the USA: 1993–2015." BMJ Open Quality 7, no. 4 (October 2018): e000193. http://dx.doi.org/10.1136/bmjoq-2017-000193.

Full text
Abstract:
Importance: While much is known about hospital pharmacy error rates in the USA, comparatively little is known about community pharmacy dispensing error rates. Objective: The aim of this study was to determine the rate of community pharmacy dispensing errors in the USA. Methods: English-language, peer-reviewed observational and interventional studies that reported community pharmacy dispensing error rates in the USA from January 1993 to December 2015 were identified in 10 bibliographic databases and topic-relevant grey literature. Studies with a denominator reflecting the total number of prescriptions in the sample were necessary for inclusion in the meta-analysis. A random effects meta-analysis was conducted to estimate an aggregate community pharmacy dispensing error rate. Heterogeneity was assessed using the I2 statistic prior to analysis. Results: The search yielded a total of 8490 records, of which 11 articles were included in the systematic review. Two articles did not have adequate data components to be included in the meta-analysis. Dispensing error rates ranged from 0.00003% (43/1 420 091) to 55% (55/100). The meta-analysis included 1 461 128 prescriptions. The overall community pharmacy dispensing error rate was estimated to be 0.015 (95% CI 0.014 to 0.018); however, significant heterogeneity was observed across studies (I2=99.6). Stratification by study error identification methodology was found to have a significant impact on dispensing error rate (p<0.001). Conclusion and relevance: There are few published articles that describe community pharmacy dispensing error rates in the USA. Thus, there is limited information about the current rate of community pharmacy dispensing errors. A robust investigation is needed to assess dispensing error rates in the USA to assess the nature and magnitude of the problem and establish prevention strategies.
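The random-effects pooling and I² assessment named in this abstract follow a standard recipe (the DerSimonian-Laird method of moments). A minimal sketch, using made-up per-study error proportions and sampling variances rather than the review's actual data:

```python
def dersimonian_laird(effects, variances):
    """Random-effects pooling (DerSimonian-Laird method of moments).

    Returns (pooled_effect, i_squared), where I^2 is the percentage of
    total variability attributable to between-study heterogeneity.
    """
    w = [1.0 / v for v in variances]                             # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c) if c > 0 else 0.0              # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]                 # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    return pooled, i2

# Hypothetical dispensing error proportions, one extreme study included
pooled, i2 = dersimonian_laird([0.012, 0.020, 0.015, 0.55],
                               [1e-6, 4e-6, 2e-6, 2.5e-3])
```

With one extreme study in the mix, I² comes out very high, mirroring the substantial heterogeneity (I2=99.6) the review reports.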
APA, Harvard, Vancouver, ISO, and other styles
9

Byun, Bok S., and Chi‐Yuh Young. "Effects of subsurface characteristics on surface seismic measurements: A simulation study on horizontally layered media." GEOPHYSICS 54, no. 6 (June 1989): 730–36. http://dx.doi.org/10.1190/1.1442700.

Full text
Abstract:
We use seismic modeling of shot records to simulate the effects of acoustic‐wave propagation, P/SV-wave conversion, intrabed multiple reflections, anelastic attenuation, and transverse isotropy in materials representative of a shallow Gulf of Mexico environment. The study focuses on the estimates of three kinematic properties: vertical traveltime, velocity used for depth conversion, and depth to the reflector. The primary tool used for the measurement is velocity spectral analysis along hyperbolic traveltime‐distance curves. Analyses of numerous modeling experiments in the Gulf of Mexico environment lead to the following conclusions: First, the kinematic properties are affected mainly by anelastic attenuation, vertical velocity heterogeneity, and velocity anisotropy. The effects of P/SV-wave mode conversion and intrabed multiples are of secondary significance. Second, anelastic attenuation coupled with heterogeneity contribute up to 3 percent in depth error, mainly in the form of apparent time delays. Transverse isotropy is likely to add up to an additional 2 percent, for a total of 5 percent depth error. Since the depth error magnitude agrees closely with published field observations, we conclude, therefore, that attenuation, heterogeneity, and transverse isotropy account for most of the errors in depth estimation in horizontally layered media.
APA, Harvard, Vancouver, ISO, and other styles
10

Esbensen, Kim H. "Materials Properties: Heterogeneity and Appropriate Sampling Modes." Journal of AOAC INTERNATIONAL 98, no. 2 (March 1, 2015): 269–74. http://dx.doi.org/10.5740/jaoacint.14-234.

Full text
Abstract:
The target audience for this Special Section comprises parties related to the food and feed sectors, e.g., field samplers, academic and industrial scientists, laboratory personnel, companies, organizations, regulatory bodies, and agencies who are responsible for sampling, as well as project leaders, project managers, quality managers, supervisors, and directors. All these entities face heterogeneous materials, and the characteristics of heterogeneous materials need to be competently understood by all of them. Before delivering analytical results for decision-making, one form or other of primary sampling is always necessary, which must counteract the effects of the sampling target heterogeneity. Up to five types of sampling error may arise as a specific sampling process interacts with a heterogeneous material: two sampling errors arise because of the heterogeneity of the sampling target, and three additional sampling errors are produced by the sampling process itself, if it is not properly understood, reduced, and/or eliminated; doing so is the role of the Theory of Sampling. This paper discusses the phenomena and concepts involved in understanding, describing, and managing the adverse effects of heterogeneity in sampling.
APA, Harvard, Vancouver, ISO, and other styles
More sources

Dissertations / Theses on the topic "Heterogeneity Error"

1

Yan, Zheng. "The Econometrics of Piecewise Linear Budget Constraints With Skewed Error Distributons: An Application To Housing Demand In The Presence Of Capital Gains Taxation." Diss., Virginia Tech, 1999. http://hdl.handle.net/10919/28606.

Full text
Abstract:
This paper examines the extent to which thin markets in conjunction with tax induced kinks in the budget constraint cause consumer demand to be skewed. To illustrate the principles I focus on the demand for owner-occupied housing. Housing units are indivisible and heterogeneous while tastes for housing are at least partly idiosyncratic, causing housing markets to be thin. In addition, prior to 1998, capital gains tax provisions introduced a sharp kink in the budget constraint of existing owner-occupiers in search of a new home: previous homeowners under age 55 paid no capital gains tax if they bought up, but were subject to capital gains tax if they bought down. I first characterize the economic conditions under which households err on the up or down side when choosing a home in the presence of a thin market and a kinked budget constraint. I then specify an empirical model that takes such effects into account. Results based on Monte Carlo experiments indicate that failing to allow for skewness in the demand for housing leads to biased estimates of the elasticities of demand when such skewness is actually present. In addition, estimates based on American Housing Survey data suggest that such bias is substantial: controlling for skewness reduces the price elasticity of demand among previous owner-occupiers from 1.6 to 0.3. Moreover, 58% of previous homeowners err on the up while only 42% err on the down side. Thus, housing demand is skewed.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
2

So, Yoon-Sup. "Prediction of cultivar performance and heterogeneity of genotype variance, correlation and error variance in the Iowa Crop Performance Tests-Corn (Zea mays L.)." [Ames, Iowa : Iowa State University], 2009. http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqdiss&rft_dat=xri:pqdiss:3355532.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Liao, Shaojuan. "Three Essays on Economic Growth and Technology Development: Considering the Spillover Effects." Diss., Virginia Tech, 2012. http://hdl.handle.net/10919/37808.

Full text
Abstract:
This dissertation consists of three essays on the empirical analysis of economic growth and technology development. In particular, I consider spillover effects in different frameworks. The first chapter outlines the three topics involved and briefly discusses the motivations, methods, and some conclusions of each of the following chapters. The second chapter considers spillovers in economic growth and convergence. Spillovers are prevalent in today's economy. I formally model the spillover effects as the interdependence of total factor productivity (TFP), and develop a model in which spillover effects of R&D through the channel of international trade make the TFPs correlated among countries. In this sense, I apply the ideas of international trade to the economic growth framework. Empirically, I develop a three-stage generalized method of moments (GMM) to estimate the dynamic panel spatial error autoregressive model. Simulation results show that my estimator is consistent and efficient. Through counterfactual analysis, I find that there are positive spillovers through both geographic connection and trade connection. Such a positive spillover effect, however, slows down the convergence speed. Moreover, there was little spillover in the early 1960s; spillover effects become stronger over time. The third chapter is about the determinants of technology development in China. What makes my paper different from others is that I take full consideration of the spillover effects: provincial spillovers in Science and Technology (S&T) capital as well as S&T personnel, and international spillovers through trade and FDI. The most interesting point in my paper is that I consider the indirect effects of institutions on technology development. Marketization, measured by the share of state-owned enterprises (SOEs) in the economy, affects the production of technology through different channels at different stages.
I use a semiparametric varying-coefficient model to account for the effects. In this paper, I find that provincial spillovers are mainly through the externalities of S&T capital stock while international spillovers occur through trade. Marketization affects the technology development through S&T capital, S&T capital spillovers and trade. Although a certain share of SOEs is necessary for technology production, the marketization process will promote the development of technology in China in the long run. The fourth chapter looks into the provincial technology spillovers from another aspect. Instead of the S&T endowment spillovers from the nearby provinces, I consider the technology transfer from the frontier province to the targeted province as well as the absorptive capacity of the targeted province itself. Two forms of technology transfer are analyzed: the technology distance due to the structural discrepancy in the patent portfolio and the technology gap because of the difference in the patent level. Through the empirical analysis, several factors contributing to patent growth, such as S&T investment, road density, international spillovers from imports and FDI, are identified. Moreover, I find that technology transfer due to the technology distance can stimulate patent growth. However, I fail to find robust evidence of technology transfer due to the technology gap, which implies that the provincial technology convergence may not exist in China.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
4

Hammami, Imen. "Statistical properties of parasite density estimators in malaria and field applications." Phd thesis, Université René Descartes - Paris V, 2013. http://tel.archives-ouvertes.fr/tel-01064071.

Full text
Abstract:
Malaria is a devastating global health problem that affected 219 million people and caused 660,000 deaths in 2010. Inaccurate estimation of the level of infection may have adverse clinical and therapeutic implications for patients, and for epidemiological endpoint measurements. The level of infection, expressed as the parasite density (PD), is classically defined as the number of asexual parasites relative to a microliter of blood. Microscopy of Giemsa-stained thick blood smears (TBSs) is the gold standard for parasite enumeration. Parasites are counted in a predetermined number of high-power fields (HPFs) or against a fixed number of leukocytes. PD estimation methods usually involve threshold values; either the number of leukocytes counted or the number of HPFs read. Most of these methods assume that (1) the distribution of the thickness of the TBS, and hence the distribution of parasites and leukocytes within the TBS, is homogeneous; and that (2) parasites and leukocytes are evenly distributed in TBSs, and thus can be modeled through a Poisson-distribution. The violation of these assumptions commonly results in overdispersion. Firstly, we studied the statistical properties (mean error, coefficient of variation, false negative rates) of PD estimators of commonly used threshold-based counting techniques and assessed the influence of the thresholds on the cost-effectiveness of these methods. Secondly, we constituted and published the first dataset on parasite and leukocyte counts per HPF. Two sources of overdispersion in data were investigated: latent heterogeneity and spatial dependence. We accounted for unobserved heterogeneity in data by considering more flexible models that allow for overdispersion. Of particular interest were the negative binomial model (NB) and mixture models. The dependent structure in data was modeled with hidden Markov models (HMMs). We found evidence that assumptions (1) and (2) are inconsistent with parasite and leukocyte distributions. 
The NB-HMM is the closest model to the unknown distribution that generates the data. Finally, we devised a reduced reading procedure for the PD that aims at better operational optimization and a practical assessment of the heterogeneity in the distribution of parasites and leukocytes in TBSs. A patent application has been filed and a prototype of the counter is under development.
APA, Harvard, Vancouver, ISO, and other styles
5

Villanova, Fernando Lucas dos Santos Peixoto de. "Estudo do erro fundamental de amostragem: uma comparação entre o teste de heterogeneidade e o teste da árvore no quartzo fumê na Mina Lamego (Sabará, MG)." Universidade de São Paulo, 2018. http://www.teses.usp.br/teses/disponiveis/3/3134/tde-04052018-104543/.

Full text
Abstract:
Grade estimation at any stage of a mining project requires reliable results to guarantee sound decisions or the delivery of final products. By characterizing the types of error to which sampling is exposed and defining correct practices to diminish them, Pierre Gy developed the Theory of Sampling. Of the eleven identified errors, the Fundamental Sampling Error (FSE) is the only one that can never be reduced to zero. Estimating the FSE is the subject of many papers, and the most commonly used protocols are the heterogeneity test and the sampling tree test. The selection of the primary sample and the mass, type, and location of collection are some of the factors that directly impact the final result, which can be masked by statistical variation in the data rooted in uncontrolled errors. This study compares FSE estimates obtained by the heterogeneity test, by the sampling tree method, and by applying mineralogical observations from drill cores to Gy's formula. The study analyses the gold-bearing smoky quartz from the Lamego mine, Sabará, Minas Gerais. The methodology for the heterogeneity test was based on the protocol used by Pitard, and the sampling tree method followed François-Bongarçon's protocol, both specialists and consultants in sampling. Applying Gy's formula to the mineralogical observations alone produced a high deviation in the proposed protocol, and this approach was discarded. The heterogeneity test, the only test in which the FSE is isolated, presented a higher deviation than the sampling tree experiment, contradicting the expected result. Gold clusters in the coarse fraction created high variability in the results, and this fraction had to be removed from the calculation of the constitutional heterogeneity constant. Based on the results, it could be concluded that the sampling tree test is the best option for estimating the FSE for the Lamego gold ore.
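Gy's formula, into which the mineralogical observations were fed, relates the relative variance of the FSE to the sample and lot masses and the top particle size. A sketch with generic textbook-style constants; the factors f, g, c, and ell below are illustrative placeholders, not values calibrated for the Lamego smoky quartz:

```python
def gy_fse_rel_variance(mass_sample_g, mass_lot_g, top_size_cm,
                        f=0.5, g=0.25, c=1000.0, ell=0.1):
    """Relative variance of the Fundamental Sampling Error (Gy):

        sigma^2_FSE = f * g * c * ell * d^3 * (1/Ms - 1/ML)

    f: particle shape factor, g: granulometric factor,
    c: mineralogical composition factor (g/cm^3), ell: liberation factor,
    d: top particle size (cm), Ms/ML: sample and lot masses (g).
    """
    ihl = f * g * c * ell * top_size_cm ** 3   # constitutional heterogeneity invariant
    return ihl * (1.0 / mass_sample_g - 1.0 / mass_lot_g)

# A 500 g sample taken from a 50 kg lot crushed to 1 cm top size
var = gy_fse_rel_variance(500.0, 50_000.0, 1.0)
rel_sd = var ** 0.5   # relative standard deviation of the FSE
```

The formula makes the central trade-off explicit: halving the top size d cuts the FSE variance by a factor of eight, while increasing the sample mass only reduces it in proportion to 1/Ms.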
APA, Harvard, Vancouver, ISO, and other styles
6

Olsen, Andrew Nolan. "When Infinity is Too Long to Wait: On the Convergence of Markov Chain Monte Carlo Methods." The Ohio State University, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=osu1433770406.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Garcia, Vera Lucia de Souza. "A segmentação não-convencional na escrita dos alunos do ensino fundamental II: dos erros aos acertos pela reescrita de texto." Universidade Estadual do Oeste do Parana, 2016. http://tede.unioeste.br:8080/tede/handle/tede/934.

Full text
Abstract:
The difficulty students show in learning the conventions of the standard written language has been widely discussed in educational settings. Problems with the conventions of written language are numerous and can be found in the written production of students at all levels of schooling; this signals a lack of mastery of conventional writing and makes text production at school an uncomfortable activity. This observation led us to investigate which linguistic criteria, and which aspects of the subject/language relation, could explain the presence of these events in the writing of 6th- to 9th-grade students (EF-II) participating in the School Newspaper Project at a public school in the northwest region of Paraná, Brazil. We identified, described, and categorized the phenomena of phonological and orthographic origin that do not conform to the standard written norm, finding a high incidence of hypo- and hypersegmentation of words in the students' textual production. We thus posed the research question: what factor (or factors) account(s) for this incidence of hypo- and hypersegmentation in the writing of EF-II students? To answer this question and to understand the phonetic-phonological and morphological nature of unconventional segmentation, we drew on Mattoso Câmara (1985) and Bisol (2001); on Abaurre, Fiad, and Mayrink-Sabinson (1997); on Oliveira (2009) and Cagliari (2009); and, from the perspective of the heterogeneity of language and the speech/orality/writing/literacy relation, on Corrêa (2004), Capristano (2004), Chacon (2004; 2006), and Cunha (2004; 2010). We found that the unconventional segmentation present in our subjects' writing arises both from prosody, specifically the prosodic constituents, and from the heterogeneous constitution of written language, as points of individuation of the writer through circulation in the genesis of language and in the image the writer holds of the institutionalized written code. In light of these findings, and drawing on Abaurre, Salek Fiad, Mayrink-Sabinson, and Geraldi (1995); Leal and Brandão (2006); Salek-Fiad (2009); and Ruiz (2013) on the writing and rewriting of texts, we propose a set of writing and rewriting activities, focused on the students' difficulties, as a strategy for reworking unconventional hypo- and hypersegmentation.
APA, Harvard, Vancouver, ISO, and other styles
8

Tseng, Ming-Chi, and 曾明基. "Measurement Error Affect the Heterogeneity of the Growth Mixture Model: Monte Carlo Simulation and Empirical Data Analysis." Thesis, 2013. http://ndltd.ncl.edu.tw/handle/s7yrz6.

Full text
Abstract:
Ph.D.
National Dong Hwa University
Department of Curriculum Design and Human Potentials Development
Academic year 101 (2012–2013)
The growth mixture model is suited to longitudinal samples with a heterogeneous structure, as it addresses the parameter-estimation bias that sample heterogeneity introduces into longitudinal analysis. However, at each measurement time point the growth mixture model builds the growth trajectory from only a single summed item score, so this measurement approach does not take measurement error into account when modeling heterogeneity. Accordingly, in this study the repeated measurement of multiple items at each time point is used to form a latent growth mixture model, which models the latent constructs while controlling for measurement error. With measurement error estimated, this study attempts to clarify the goodness of fit and the clustering accuracy of the growth mixture model versus the latent growth mixture model. A Monte Carlo simulation compares how measurement error affects the two models. Measurement error affects the goodness of fit and the clustering accuracy of both models: the smaller the measurement error, the better the fit and the more accurate the clustering. In addition, comparing the two models within the same parameter framework shows that, because of its greater degrees of freedom, the latent growth mixture model's AIC, BIC, and ABIC fit indices inflate quickly, but it outperforms the growth mixture model on the LMR and BLRT indicators. Entropy results are not consistent: the latent growth mixture model performs better when the sample size is small, and the growth mixture model performs better when the sample size is large. 
In empirical data on adolescent depression and hostility constructs, the differences between the growth mixture model and the latent growth mixture model are similar to those in the simulation study. Moreover, because the latent growth mixture model can control for measurement error, its explained variance at the different time points is better than that of the growth mixture model.
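The abstract's central point, that measurement error degrades recovery of latent trajectory classes, can be illustrated with a toy Monte Carlo. This is a simplified stand-in for the dissertation's simulation design: the two growth classes, the noise levels, and the slope-threshold classifier below are all invented for illustration:

```python
import random

def misclassification_rate(n, error_sd, seed=1):
    """Simulate two latent growth classes observed at four time points with
    additive measurement error, then recover each class from the observed slope."""
    rng = random.Random(seed)
    wrong = 0
    for _ in range(n):
        steep = rng.random() < 0.5
        intercept, slope = (0.0, 1.0) if steep else (2.0, 0.2)   # class-specific growth
        y = [intercept + slope * t + rng.gauss(0.0, error_sd) for t in range(4)]
        observed_slope = (y[3] - y[0]) / 3.0
        predicted_steep = observed_slope > 0.6    # midway between the class slopes
        wrong += predicted_steep != steep
    return wrong / n

low_error = misclassification_rate(5000, error_sd=0.2)
high_error = misclassification_rate(5000, error_sd=2.0)
# Larger measurement error blurs the latent classes and raises misclassification.
```

The same mechanism explains the abstract's finding: a model that absorbs measurement error into a latent construct leaves sharper class boundaries for the mixture component to recover.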
APA, Harvard, Vancouver, ISO, and other styles
9

Abarin, Taraneh. "Second-order least squares estimation in regression models with application to measurement error problems." 2009. http://hdl.handle.net/1993/3126.

Full text
Abstract:
This thesis studies the Second-order Least Squares (SLS) estimation method in regression models with and without measurement error. Applications of the methodology in general quasi-likelihood and variance function models, censored models, and linear and generalized linear models are examined and strong consistency and asymptotic normality are established. To overcome the numerical difficulties of minimizing an objective function that involves multiple integrals, a simulation-based SLS estimator is used and its asymptotic properties are studied. Finite sample performances of the estimators in all of the studied models are investigated through simulation studies.
February 2009
APA, Harvard, Vancouver, ISO, and other styles
10

"Diphasic growth curves of Hereford females with auto-regressive errors and heterogeneity of variances." Tese, BIBLIOTECA CENTRAL DA UFLA, 2007. http://bibtede.ufla.br/tede//tde_busca/arquivo.php?codArquivo=550.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Heterogeneity Error"

1

Antman, Francisca. Poverty traps and nonlinear income dynamics with measurement error and individual heterogeneity. [Washington, D.C.]: World Bank, 2005.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Antman, Francisca, and David J. McKenzie. Poverty Traps And Nonlinear Income Dynamics With Measurement Error And Individual Heterogeneity. The World Bank, 2005. http://dx.doi.org/10.1596/1813-9450-3764.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Ravallion, Martin, and Shaohua Chen. Benefit Incidence with Incentive Effects, Measurement Errors and Latent Heterogeneity. The World Bank, 2013. http://dx.doi.org/10.1596/1813-9450-6573.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Börsch-Supan, Axel, ed. Health, children, and elderly living arrangements: A multiperiod-multinomial probit model with unobserved heterogeneity and autocorrelated errors. Cambridge, MA: National Bureau of Economic Research, 1990.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Furst, Eric M., and Todd M. Squires. Multiple particle tracking. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780199655205.003.0004.

Full text
Abstract:
The fundamentals and best practices of multiple particle tracking microrheology are discussed, including methods for producing video microscopy data, analyzing data to obtain mean-squared displacements and displacement correlations, and, critically, the accuracy and errors (static and dynamic) associated with particle tracking. Applications presented include two-point microrheology, methods for characterizing heterogeneous material rheology, and shell models of local (non-continuum) heterogeneity. Particle tracking has a long history. The earliest descriptions of Brownian motion relied on precise observations, and later quantitative measurements, using light microscopy.
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Heterogeneity Error"

1

Welham, S. J., R. Thompson, and A. R. Gilmour. "A General Form for Specification of Correlated Error Models, with Allowance for Heterogeneity." In COMPSTAT, 479–84. Heidelberg: Physica-Verlag HD, 1998. http://dx.doi.org/10.1007/978-3-662-01131-7_68.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Pawiro, S. A., Sugiyantari, W. E. Wibowo, K. Y. Cheung, and D. S. Soejoko. "Dosimetric Errors of TPS Calculations without Correction for Heterogeneity - A Study Using CIRS Thorax Phantom." In IFMBE Proceedings, 628–30. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-03474-9_176.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Goeree, Jacob K., Charles A. Holt, and Thomas R. Palfrey. "Heterogeneity." In Quantal Response Equilibrium. Princeton University Press, 2016. http://dx.doi.org/10.23943/princeton/9780691124230.003.0004.

Full text
Abstract:
Players have different skills, which has implications for the degree to which they make errors. Low-skill hitters in baseball often swing at bad pitches, beginning skiers frequently fall for no apparent reason, and children often lose at tic-tac-toe. At the other extreme, there are brilliant chess players, bargainers, and litigators who seem to know exactly what move to make or offer to decline. From a quantal response equilibrium (QRE) perspective, these skill levels can be modeled in terms of variation in error rates or in responsiveness of quantal response functions. This chapter explores issues related to individual heterogeneity with respect to player error rates. It also describes some extensions of QRE that relax the assumption that player expectations about the choice behavior of other players are correct. For example, in games that are played only once, players are not able to learn from others' prior decisions, and expectations must be based on introspection. The chapter develops the implications of noisy introspection embedded in a model of iterated thinking.
APA, Harvard, Vancouver, ISO, and other styles
4

Paccagnella, Omar. "Self-Evaluation, Differential Item Functioning, and Longitudinal Anchoring Vignettes." In Measurement Error in Longitudinal Data, 289–310. Oxford University Press, 2021. http://dx.doi.org/10.1093/oso/9780198859987.003.0012.

Full text
Abstract:
Anchoring vignettes are a powerful instrument to detect systematic differences in the use of self-reported ordinal survey responses. Not taking into account the (non-random) heterogeneity in reporting styles across different respondents may systematically bias the measurement of the variables of interest. The presence of such individual heterogeneity leads respondents to interpret, understand, or use the response categories for the same question differently. This phenomenon is defined as differential item functioning (DIF) in the psychometric literature. A growing amount of cross-sectional studies apply the anchoring vignette approach to tackle this issue but its use is still limited in the longitudinal context. This chapter introduces longitudinal anchoring vignettes for DIF correction, as well as the statistical approaches available when working with such data and how to investigate stability over time of individual response scales.
APA, Harvard, Vancouver, ISO, and other styles
5

Rubin, Yoram. "Quantifying and Accounting for Uncertainty." In Applied Stochastic Hydrogeology. Oxford University Press, 2003. http://dx.doi.org/10.1093/oso/9780195138047.003.0018.

Full text
Abstract:
This chapter deals with a wide range of issues with a common theme: coping with uncertainty. To this end, we look at the sources of uncertainty and the types of errors we need to deal with. We then explore methods for identifying these errors and for incorporating them into our predictions. This chapter extends our discussion on these topics in chapter 1, the discussion in chapter 3 on estimation under conditions of uncertainty, and image simulation using MC techniques. A comprehensive treatment of uncertainty needs to address two different types of errors. The first type is the model error, which arises from incorrect hypotheses and unmodeled processes (Gaganis and Smith, 2001), for example, from poor choice of governing equations, incorrect boundary conditions and zonation geometry, and inappropriate selection of forcing functions (Carrera and Neuman, 1986b). The second type of error is parameter error. The parameters of groundwater models are always in error because of measurement errors, heterogeneity, and scaling issues. Ignoring the effects of model and parameter errors is likely to lead to errors in model selection, in the estimation of prediction uncertainty, and in the assessment of risk. Parameter error is treated extensively in the literature: once a model is defined, it is common practice to quantify the errors associated with estimating its parameters (cf. Kitanidis and Vomvoris, 1983; Carrera and Neuman, 1986a, b; Rubin and Dagan, 1987a,b; McLaughlin and Townley, 1996; Poeter and Hill, 1997). Modeling error is well recognized, but is more difficult to quantify. Let us consider, for example, an aquifer which appears to be of uniform conductivity. Parameter error quantifies the error in estimating this conductivity. Modeling error, on the other hand, includes elusive factors such as missing a meandering channel somewhere in the aquifer. 
This, in essence, is the difficulty in determining modeling error; parameter error can be roughly quantified based on measurements if one assumes that the model is correct, but modeling error is expected to represent all that the measurements and/or the modeler fail to capture. To evaluate model error, the perfect model needs to be known, but this is never possible.
APA, Harvard, Vancouver, ISO, and other styles
6

"Chapter 21 Probabilistic but Incorrect Sampling - Total Error TE." In Sampling of Heterogeneous and Dynamic Material Systems - Theories of Heterogeneity, Sampling and Homogenizing, 393–405. Elsevier, 1992. http://dx.doi.org/10.1016/s0922-3487(08)70098-4.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

"Chapter 14 Discontinuity Component Ie1 of the Integration Error IE." In Sampling of Heterogeneous and Dynamic Material Systems - Theories of Heterogeneity, Sampling and Homogenizing, 326–32. Elsevier, 1992. http://dx.doi.org/10.1016/s0922-3487(08)70090-x.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

"Chapter 19 Definition and Properties of the Fundamental Error Fe." In Sampling of Heterogeneous and Dynamic Material Systems - Theories of Heterogeneity, Sampling and Homogenizing, 374–87. Elsevier, 1992. http://dx.doi.org/10.1016/s0922-3487(08)70096-0.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

"Chapter 15 Continuous Component Ie2 of the Integration Error IE." In Sampling of Heterogeneous and Dynamic Material Systems - Theories of Heterogeneity, Sampling and Homogenizing, 333–39. Elsevier, 1992. http://dx.doi.org/10.1016/s0922-3487(08)70091-1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

"Chapter 16 Periodic Component Ie3 of the Integration Error IE." In Sampling of Heterogeneous and Dynamic Material Systems - Theories of Heterogeneity, Sampling and Homogenizing, 340–49. Elsevier, 1992. http://dx.doi.org/10.1016/s0922-3487(08)70092-3.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Heterogeneity Error"

1

Ankem, Mani Deep, and Swaroop Darbha. "Effect of Heterogeneity in Time Headway on Error Propagation in Vehicular Strings." In 2019 IEEE Intelligent Transportation Systems Conference - ITSC. IEEE, 2019. http://dx.doi.org/10.1109/itsc.2019.8917522.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Handajani, Sri Sulistijowati, Cornelia Ardiana Savita, Hasih Pratiwi, and Yuliana Susanti. "Best Weighted Selection in Handling Error Heterogeneity Problem on Spatial Regression Model." In International Conference on Mathematics and Islam. SCITEPRESS - Science and Technology Publications, 2018. http://dx.doi.org/10.5220/0008521002930299.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Okamoto, Ruth J., Curtis L. Johnson, Yuan Feng, John G. Georgiadis, and Philip V. Bayly. "MRE detection of heterogeneity using quantitative measures of residual error and uncertainty." In SPIE Medical Imaging, edited by Robert C. Molthen and John B. Weaver. SPIE, 2014. http://dx.doi.org/10.1117/12.2044633.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Landrigan, Matthew D., and Ryan K. Roeder. "Systematic Error in the Measure of Microdamage by Modulus Degradation During Four-Point Bending Fatigue." In ASME 2007 Summer Bioengineering Conference. American Society of Mechanical Engineers, 2007. http://dx.doi.org/10.1115/sbc2007-175238.

Full text
Abstract:
The accumulation of fatigue damage in bovine and human cortical bone is conventionally measured by modulus or stiffness degradation. The initial modulus or stiffness of each specimen is typically measured in order to normalize tissue heterogeneity to a prescribed strain [1,2]. Cyclic preloading at 100 N for 20 cycles has been used for this purpose in both uniaxial tension and four-point bending tests [1–3]. In four-point bending, the specimen modulus is often calculated using linear elastic beam theory as E = 3Fl / (4bh²ε), where F is the applied load, l is the outer support span, b is the specimen width, h is the specimen height, and ε is the maximum strain based on the beam deflection [2]. The maximum load and displacement data from preloading are used to determine the initial specimen modulus. The initial modulus and a prescribed maximum initial strain are then used to determine an appropriate load for fatigue testing under load control.
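The beam-theory formula in the abstract is a direct computation; a short sketch, with all specimen dimensions and the strain level chosen purely for illustration (they are not from the paper):

```python
def bending_modulus(F, l, b, h, eps):
    """Apparent modulus from four-point bending: E = 3*F*l / (4*b*h**2*eps).

    F: applied load [N], l: outer support span [m],
    b: specimen width [m], h: specimen height [m],
    eps: maximum strain from beam deflection [-].
    Returns E in pascals.
    """
    return 3.0 * F * l / (4.0 * b * h**2 * eps)

# Hypothetical specimen: 100 N preload, 40 mm outer span,
# 4 mm wide, 2 mm thick, at 1% strain (all values illustrative).
E = bending_modulus(F=100.0, l=0.040, b=0.004, h=0.002, eps=0.01)
E_gpa = E / 1e9  # convert Pa to GPa
```

In the protocol the abstract describes, this initial E from the preload cycles fixes the load used for subsequent load-controlled fatigue cycling at a prescribed strain.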
APA, Harvard, Vancouver, ISO, and other styles
5

Khoda, A. K. M. B., and Bahattin Koc. "Functionally Heterogeneous Porous Scaffold Design for Tissue Engineering." In ASME 2012 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2012. http://dx.doi.org/10.1115/imece2012-86927.

Full text
Abstract:
Most current tissue scaffolds are designed with homogeneous porosity, which does not represent the spatial heterogeneity found in actual tissues. Engineering realistic tissue scaffolds with properly graded properties to mimic the complex elegance of native tissues is therefore critical for successful tissue regeneration. In this work, novel bio-mimetic heterogeneous porous scaffolds have been modeled. First, the geometry of the scaffold is extracted along with its internal regional heterogeneity. Then the model is discretized with planar slices suitable for layer-based fabrication. An optimum filament deposition angle is determined for each slice based on the contour geometry and the internal heterogeneity. The internal region is discretized considering the homogeneity factor along the deposition direction. Finally, an area-weight-based approach is used to generate the spatial porosity function that determines the filament deposition location for the desired bio-mimetic porosity. The proposed methodology has been implemented and illustrative examples are provided. The effective porosity is compared between the proposed design and conventional homogeneous scaffolds. The results show a significant error reduction towards achieving the bio-mimetic porosity in the scaffold design and better control over the desired porosity level. Moreover, sample designed structures have been fabricated with an NC motion-controlled micro-nozzle biomaterial deposition system.
APA, Harvard, Vancouver, ISO, and other styles
6

Acton, Katherine A., Sarah C. Baxter, Bahador Bahmani, Philip L. Clarke, and Reza Abedi. "Mesoscale Models Characterizing Material Property Fields Used As a Basis for Predicting Fracture Patterns in Quasi-Brittle Materials." In ASME 2017 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2017. http://dx.doi.org/10.1115/imece2017-71500.

Full text
Abstract:
To accurately predict fracture patterns in quasi-brittle materials, it is necessary to accurately characterize heterogeneity in the properties of a material microstructure. This heterogeneity influences crack propagation at weaker points. Also, inherent randomness in localized material properties creates variability in crack propagation across a population of nominally identical material samples. In order to account for heterogeneity in the strength properties of a material at a small scale (or "microscale"), a mesoscale model is developed at an intermediate scale, smaller than the size of the overall structure. A central challenge of characterizing material behavior at a scale below the representative volume element (RVE) is that the stress/strain relationship depends upon the boundary conditions imposed. To mitigate error associated with boundary condition effects, statistical volume elements (SVE) are characterized using a Voronoi tessellation based partitioning method. A moving window approach is used in which partitioned Voronoi SVE are analysed using finite element analysis (FEA) to determine a limiting stress criterion for each window. Results are obtained for hydrostatic, pure shear, and simple shear uniform strain conditions. A method is developed to use superposition of these results to approximate SVE behavior under other loading conditions. The results are used to determine a set of strength parameters for mesoscale material property fields. These random fields are then used as a basis for input into a fracture model to predict fracture patterns in quasi-brittle materials.
APA, Harvard, Vancouver, ISO, and other styles
7

Ijiri, Yuji, Yumi Naemura, Kenji Amano, Keisuke Maekawa, Atsushi Sawada, Kunio Ota, and Takanori Kunimaru. "Study on the Estimation Error Caused by Using One-Dimensional Model for the Evaluation of Dipole Tracer Test." In ASME 2010 13th International Conference on Environmental Remediation and Radioactive Waste Management. ASMEDC, 2010. http://dx.doi.org/10.1115/icem2010-40077.

Full text
Abstract:
In-situ tracer tests are a valuable approach to obtain parameters for a performance assessment of nuclear waste repository. A one-dimensional model is simple and is commonly used to identify radionuclide transport parameters by fitting breakthrough curves simulated using the model to those obtained from tracer tests. However, this method can increase uncertainty and introduce errors in the estimated parameters. In particular, such uncertainties and errors will be significant when evaluating parameters for tests conducted in a dipole (two-dimensional) flow field between injection and withdrawal wells. This paper describes a numerical analysis investigation into the effects of various experimental conditions on parameters estimated using a one-dimensional model for cases involving tracer tests in a two-dimensional fracture plane. Results show that longitudinal dispersivity tends to be overestimated by the one-dimensional model analysis. This overestimation is the result of several factors: smaller pumping rate, larger dipole ratio, stronger heterogeneity of the fracture hydraulic conductivity, and greater orthogonally-oriented background groundwater flow. Such information will help us to better plan and design tracer tests at underground research laboratories located in both Mizunami in central Japan and Horonobe in northern Japan. Understanding appropriate experimental conditions will help decrease the uncertainty in the results of tracer tests.
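The fitting step the abstract describes (matching a one-dimensional model to an observed breakthrough curve to estimate transport parameters such as longitudinal dispersivity) can be sketched as follows. This is a simplified illustration, not the paper's method: it uses only the leading erfc term of the Ogata–Banks solution to the 1-D advection-dispersion equation, with D = αL·v, and all numerical values are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erfc

x_obs = 5.0  # distance from injection to observation point [m] (illustrative)

def breakthrough(t, v, alpha_L):
    """Relative concentration C/C0 at x_obs: leading term of Ogata-Banks."""
    D = alpha_L * v  # longitudinal dispersion coefficient
    return 0.5 * erfc((x_obs - v * t) / (2.0 * np.sqrt(D * t)))

# Synthetic "observed" breakthrough curve with known parameters plus noise
t = np.linspace(0.1, 50.0, 200)          # time [d]
rng = np.random.default_rng(1)
c_obs = breakthrough(t, 0.4, 0.8) + rng.normal(scale=0.01, size=t.size)

# Fit velocity v and longitudinal dispersivity alpha_L to the curve
(v_hat, alpha_hat), _ = curve_fit(breakthrough, t, c_obs, p0=[0.2, 0.5],
                                  bounds=([0.01, 0.01], [2.0, 5.0]))
```

When the true flow field is a two-dimensional dipole, forcing the data through a 1-D model like this is exactly what inflates the estimated alpha_L, which is the overestimation the paper quantifies.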
APA, Harvard, Vancouver, ISO, and other styles
8

Hoyle, Christopher, Wei Chen, Nanxin Wang, and Frank S. Koppelman. "Bayesian Hierarchical Choice Modeling Framework for Capturing Heterogeneous Preferences in Engineering System Design." In ASME 2009 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2009. http://dx.doi.org/10.1115/detc2009-87026.

Full text
Abstract:
Choice models play a critical role in enterprise-driven design by providing a link between engineering design attributes and customer preferences. In our previous work, we introduced the hierarchical choice modeling approach to address the special needs of complex engineering system design. The hierarchical choice modeling approach utilizes multiple model levels to create a link between qualitative attributes considered by consumers when selecting a product and quantitative attributes used for engineering design. In this work, the approach is expanded to the Bayesian Hierarchical Choice Modeling (BHCM) framework, estimated using an All-at-Once (AAO) solution procedure. This new framework addresses the shortcomings of the previous method while providing a highly flexible modeling framework for the needs of complex system design. In this framework, both systematic and random consumer heterogeneity is explicitly considered, the ability to combine multiple sources of data for model estimation and updating is significantly expanded, and a method to mitigate error propagated throughout the model hierarchy is developed. In addition to developing the new choice modeling approach, the importance of including a complete representation of consumer heterogeneity in the model framework is demonstrated. The new modeling framework is validated using several metrics and techniques. The benefits of the BHCM method are demonstrated in the design of an automobile occupant package.
APA, Harvard, Vancouver, ISO, and other styles
9

Long, Minhua, and W. Ross Morrow. "Should Optimal Designers Worry About Consideration?" In ASME 2014 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2014. http://dx.doi.org/10.1115/detc2014-34493.

Full text
Abstract:
Consideration set formation using non-compensatory screening rules is a vital component of real purchasing decisions, with decades of experimental validation. Marketers have recently developed statistical methods that can estimate quantitative choice models that include consideration set formation via non-compensatory screening rules. But is capturing consideration within models of choice important for design? This paper reports on a simulation study of vehicle portfolio design, in which households screen over vehicle body style, built to explore the importance of capturing consideration rules for optimal designers. We generate synthetic market share data, fit a variety of discrete choice models to this data, and then optimize design decisions using the estimated models. Model predictive power, design "error", and profitability relative to ideal profits are compared as the amount of market data available increases. We find that even when estimated compensatory models provide relatively good predictive accuracy, they can lead to sub-optimal design decisions when the population uses consideration behavior; that convergence of compensatory models to non-compensatory behavior is likely to require unrealistic amounts of data; and that modeling heterogeneity in non-compensatory screening is more valuable than heterogeneity in compensatory trade-offs. This supports the claim that designers should carefully identify consideration behaviors before optimizing product portfolios. We also find that higher model predictive power does not necessarily imply better design decisions; that is, different model forms can provide "descriptive" rather than "predictive" information that is useful for design.
APA, Harvard, Vancouver, ISO, and other styles
10

Fu, Cheng, Xianpei Han, Le Sun, Bo Chen, Wei Zhang, Suhui Wu, and Hao Kong. "End-to-End Multi-Perspective Matching for Entity Resolution." In Twenty-Eighth International Joint Conference on Artificial Intelligence {IJCAI-19}. California: International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/689.

Full text
Abstract:
Entity resolution (ER) aims to identify data records referring to the same real-world entity. Due to the heterogeneity of entity attributes and the diversity of similarity measures, one main challenge of ER is how to select appropriate similarity measures for different attributes. Previous ER methods usually employ heuristic similarity selection algorithms, which are highly specialized to specific ER problems and are hard to be generalized to other situations. Furthermore, previous studies usually perform similarity learning and similarity selection independently, which often result in error propagation and are hard to be optimized globally. To resolve the above problems, this paper proposes an end-to-end multi-perspective entity matching model, which can adaptively select optimal similarity measures for heterogenous attributes by jointly learning and selecting similarity measures in an end-to-end way. Experiments on two real-world datasets show that our method significantly outperforms previous ER methods.
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Heterogeneity Error"

1

Gollin, Douglas, and Christopher Udry. Heterogeneity, Measurement Error and Misallocation: Evidence from African Agriculture. Cambridge, MA: National Bureau of Economic Research, January 2019. http://dx.doi.org/10.3386/w25440.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Bohorquez-Penuela, Camilo, and Mariana Urbina-Ramirez. Rising Staple Prices and Food Insecurity: The Case of the Mexican Tortilla. Banco de la República de Colombia, November 2020. http://dx.doi.org/10.32468/be.1144.

Full text
Abstract:
We study the relationship between rising prices of tortillas---the Mexican staple par excellence---and household food insecurity between 2008 and 2014, a period in which global food prices experienced dramatic increases. The use of a unique combination of household-level data and official state-level information on prices allows us to exploit significant variation in prices across the Mexican states. Since households cannot be tracked across time, we follow Deaton (1985) by constructing a series of pseudo-panels to control for time-invariant unobserved heterogeneity and measurement error. The regression estimates suggest that increasing tortilla prices affected food insecurity rates in Mexico. More specifically, households with children or those in the second or third income quintile are more likely to be affected.
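The pseudo-panel construction the abstract refers to (Deaton 1985) groups repeated cross-sections into fixed cohorts and tracks cohort-level cell means over survey years, which can then be analyzed like panel data. A minimal sketch with synthetic data; the cohort definition (decade of birth year) and all variable names are illustrative assumptions, not the paper's specification.

```python
import numpy as np
import pandas as pd

# Synthetic repeated cross-sections: different households each survey year,
# so individual households cannot be followed over time.
rng = np.random.default_rng(2)
n = 3000
df = pd.DataFrame({
    "year": rng.choice([2008, 2010, 2012, 2014], size=n),
    "birth_year": rng.integers(1940, 1990, size=n),   # household head
    "food_insecure": rng.integers(0, 2, size=n),      # 0/1 indicator
    "tortilla_price": rng.normal(10.0, 1.5, size=n),  # pesos/kg (made up)
})

# Deaton-style pseudo-panel: fixed birth cohorts tracked as cell means
df["cohort"] = (df["birth_year"] // 10) * 10
panel = (df.groupby(["cohort", "year"])
           .agg(food_insecure=("food_insecure", "mean"),
                tortilla_price=("tortilla_price", "mean"),
                n_obs=("food_insecure", "size"))
           .reset_index())
```

Averaging within cohort-year cells differences out time-invariant cohort heterogeneity and attenuates classical measurement error in the household-level variables, which is why the approach suits this setting.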
APA, Harvard, Vancouver, ISO, and other styles
3

Ravallion, Martin, and Shaohua Chen. Benefit Incidence with Incentive Effects, Measurement Errors and Latent Heterogeneity: A Case Study for China. Cambridge, MA: National Bureau of Economic Research, April 2015. http://dx.doi.org/10.3386/w21111.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Borsch-Supan, Axel, Vassilis Hajivassiliou, Laurence Kotlikoff, and John Morris. Health, Children, and Elderly Living Arrangements: A Multiperiod-Multinomial Probit Model with Unobserved Heterogeneity and Autocorrelated Errors. Cambridge, MA: National Bureau of Economic Research, April 1990. http://dx.doi.org/10.3386/w3343.

Full text
APA, Harvard, Vancouver, ISO, and other styles