Journal articles on the topic 'Instrumental variable (IV) methods'

Consult the top 50 journal articles for your research on the topic 'Instrumental variable (IV) methods.'


1

Chesher, Andrew, and Adam M. Rosen. "What Do Instrumental Variable Models Deliver with Discrete Dependent Variables?" American Economic Review 103, no. 3 (May 1, 2013): 557–62. http://dx.doi.org/10.1257/aer.103.3.557.

Abstract:
We compare nonparametric instrumental variables (IV) models with linear models and 2SLS methods when dependent variables are discrete. A 2SLS method can deliver a consistent estimator of a Local Average Treatment Effect but is not informative about other treatment effect parameters. The IV models set identify a range of interesting structural and treatment effect parameters. We give set identification results for a counterfactual probability and an Average Treatment Effect in an IV binary threshold crossing model. We illustrate using data on female employment and family size (used by Joshua Angrist and William Evans (1998)) and compare with their LATE estimates.
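The 2SLS/LATE logic discussed in this abstract can be sketched in a few lines. The following is a toy simulation of my own construction (not the authors' code or data): a binary instrument Z, a binary treatment D driven partly by an unobserved confounder U, and a homogeneous treatment effect of 2.0. OLS is biased upward by the confounder, while the Wald ratio, which equals 2SLS with a single binary instrument, recovers the effect.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Toy data: U confounds both treatment take-up and the outcome.
Z = rng.integers(0, 2, n)                                  # binary instrument
U = rng.normal(size=n)                                     # unobserved confounder
D = (0.5 * Z + U + rng.normal(size=n) > 0).astype(float)   # binary treatment
Y = 1.0 + 2.0 * D + U + rng.normal(size=n)                 # true effect = 2.0

# OLS slope of Y on D is contaminated by U.
ols = np.cov(Y, D)[0, 1] / np.var(D)

# Wald estimand: ratio of mean differences across instrument arms;
# with a single binary instrument this equals the 2SLS estimate.
wald = (Y[Z == 1].mean() - Y[Z == 0].mean()) / (D[Z == 1].mean() - D[Z == 0].mean())
print(f"OLS: {ols:.2f}, Wald/2SLS: {wald:.2f}")  # OLS well above 2, Wald near 2
```

With heterogeneous effects, the same Wald ratio identifies the LATE for compliers rather than the population average effect, which is the limitation the abstract emphasizes.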
2

Yuan, Junkun, Anpeng Wu, Kun Kuang, Bo Li, Runze Wu, Fei Wu, and Lanfen Lin. "Auto IV: Counterfactual Prediction via Automatic Instrumental Variable Decomposition." ACM Transactions on Knowledge Discovery from Data 16, no. 4 (August 31, 2022): 1–20. http://dx.doi.org/10.1145/3494568.

Abstract:
Instrumental variables (IVs), sources of treatment randomization that are conditionally independent of the outcome, play an important role in causal inference with unobserved confounders. However, existing IV-based counterfactual prediction methods require well-predefined IVs, while finding valid IVs in many real-world settings is an art rather than a science. Moreover, predefined hand-made IVs can be weak or erroneous, violating the conditions for valid IVs. These difficulties hinder the application of IV-based counterfactual prediction methods. In this article, we propose a novel Automatic Instrumental Variable decomposition (AutoIV) algorithm to automatically generate representations serving the role of IVs from observed variables (IV candidates). Specifically, we let the learned IV representations satisfy the relevance condition with the treatment and the exclusion condition with the outcome via mutual information maximization and minimization constraints, respectively. We also learn confounder representations by encouraging them to be relevant to both the treatment and the outcome. The IV and confounder representations compete for information through their constraints in an adversarial game, which allows us to obtain valid IV representations for IV-based counterfactual prediction. Extensive experiments demonstrate that our method generates valid IV representations for accurate IV-based counterfactual prediction.
3

Chalak, Karim. "INSTRUMENTAL VARIABLES METHODS WITH HETEROGENEITY AND MISMEASURED INSTRUMENTS." Econometric Theory 33, no. 1 (February 15, 2016): 69–104. http://dx.doi.org/10.1017/s0266466615000390.

Abstract:
We study the consequences of substituting an error-laden proxy W for an instrument Z on the interpretation of Wald, local instrumental variable (LIV), and instrumental variable (IV) estimands in an ordered discrete choice structural system with heterogeneity. A proxy W need only satisfy an exclusion restriction and the condition that the treatment and outcome are mean independent of W given Z. Unlike Z, W need not satisfy monotonicity and may, under particular specifications, fail exogeneity. For example, W could code Z with error, with missing observations, or coarsely. We show that Wald, LIV, and IV estimands using W identify weighted averages of local or marginal treatment effects (LATEs or MTEs). We study a necessary and sufficient condition for nonnegative weights. Further, we study a condition under which the Wald or LIV estimand using W identifies the same LATE or MTE that would have been recovered had Z been observed. For example, this holds for binary Z, and therefore the Wald estimand using W identifies the same "average causal response," or LATE for binary treatment, that would have been recovered using Z. Also, under this condition, LIV using W can be used to identify MTE and average treatment effects for, e.g., the population, the treated, and the untreated.
4

Banerjee, Souvik, and Anirban Basu. "Estimating Endogenous Treatment Effects Using Latent Factor Models with and without Instrumental Variables." Econometrics 9, no. 1 (March 17, 2021): 14. http://dx.doi.org/10.3390/econometrics9010014.

Abstract:
We provide evidence on the least biased ways to identify causal effects in situations where multiple outcomes all depend on the same endogenous regressor and a reasonable, but potentially contaminated, instrumental variable is available. Simulations provide suggestive evidence on the complementarity of instrumental variable (IV) and latent factor methods and on how this complementarity depends on the number of outcome variables and the degree of contamination in the IV. We apply these causal inference methods to assess the impact of mental illness on work absenteeism and disability, using the National Comorbidity Survey Replication.
5

Betz, Timm, Scott J. Cook, and Florian M. Hollenbach. "Spatial interdependence and instrumental variable models." Political Science Research and Methods 8, no. 4 (January 30, 2019): 646–61. http://dx.doi.org/10.1017/psrm.2018.61.

Abstract:
Instrumental variable (IV) methods are widely used to address endogeneity concerns. Yet, a specific kind of endogeneity – spatial interdependence – is regularly ignored. We show that ignoring spatial interdependence in the outcome results in asymptotically biased estimates even when instruments are randomly assigned. The extent of this bias increases when the instrument is also spatially clustered, as is the case for many widely used instruments: rainfall, natural disasters, economic shocks, and regionally- or globally-weighted averages. Because the biases due to spatial interdependence and predictor endogeneity can offset, addressing only one can increase the bias relative to ordinary least squares. We demonstrate the extent of these biases both analytically and via Monte Carlo simulation. Finally, we discuss a general estimation strategy – S-2SLS – that accounts for both outcome interdependence and predictor endogeneity, thereby recovering consistent estimates of predictor effects.
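The paper's core point can be illustrated with a small Monte Carlo sketch (my own toy setup, not the authors' code): when the outcome has spatial spillovers y = ρWy + βd + e, the naive IV ratio centers near β scaled by the diagonal of the spatial multiplier (I − ρW)⁻¹, not β itself, even with a randomly assigned instrument. The treatment is kept exogenous here purely to isolate the interdependence bias.

```python
import numpy as np

rng = np.random.default_rng(4)
n, rho, beta = 400, 0.5, 1.0

# Row-standardized spatial weights on a ring: each unit's two neighbors.
W = np.zeros((n, n))
idx = np.arange(n)
W[idx, (idx - 1) % n] = 0.5
W[idx, (idx + 1) % n] = 0.5
A_inv = np.linalg.inv(np.eye(n) - rho * W)  # spatial multiplier (I - rho*W)^-1

est = []
for _ in range(200):
    Z = rng.normal(size=n)               # randomly assigned instrument
    D = Z + rng.normal(size=n)           # treatment (kept exogenous for clarity)
    e = rng.normal(size=n)
    Y = A_inv @ (beta * D + e)           # outcome with spatial feedback
    est.append(np.cov(Y, Z)[0, 1] / np.cov(D, Z)[0, 1])  # naive IV ratio

# Centers near beta * mean(diag(A_inv)) (about 1.15 here), not beta = 1.
print(np.mean(est))
```

An S-2SLS-style estimator would additionally instrument the spatial lag WY; the sketch above only shows why ignoring it matters.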
6

Marshall, John. "Coarsening Bias: How Coarse Treatment Measurement Upwardly Biases Instrumental Variable Estimates." Political Analysis 24, no. 2 (2016): 157–71. http://dx.doi.org/10.1093/pan/mpw007.

Abstract:
Political scientists increasingly use instrumental variable (IV) methods, and must often choose between operationalizing their endogenous treatment variable as discrete or continuous. For theoretical and data availability reasons, researchers frequently coarsen treatments with multiple intensities (e.g., treating a continuous treatment as binary). I show how such coarsening can substantially upwardly bias IV estimates by subtly violating the exclusion restriction assumption, and demonstrate that the extent of this bias depends upon the first stage and underlying causal response function. However, standard IV methods using a treatment where multiple intensities are affected by the instrument (even when fine-grained measurement at every intensity is not possible) recover a consistent causal estimate without requiring a stronger exclusion restriction assumption. These analytical insights are illustrated in the context of identifying the long-run effect of high school education on voting Conservative in Great Britain. I demonstrate that coarsening years of schooling into an indicator for completing high school upwardly biases the IV estimate by a factor of three.
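The coarsening mechanism can be seen in a toy simulation (my construction, not Marshall's data): an instrument that shifts a multi-valued schooling variable at every margin yields a consistent per-year IV estimate, while IV with a coarsened binary "completed high school" indicator attributes all of the instrument's effect on the outcome to the single crossing margin, violating the exclusion restriction for the binary treatment. In this particular toy design the inflation is a factor of four.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

Z = rng.integers(0, 2, n)                     # binary instrument
base = rng.integers(0, 4, n)                  # schooling levels 0..3 absent the instrument
S = base + Z                                  # instrument shifts every margin by one year
Y = 0.5 * S + rng.normal(size=n, scale=1.5)   # true effect per year of schooling = 0.5

def wald(y, d, z):
    """Wald/IV ratio for a binary instrument z."""
    return (y[z == 1].mean() - y[z == 0].mean()) / (d[z == 1].mean() - d[z == 0].mean())

iv_full = wald(Y, S, Z)             # multi-valued treatment: near 0.5, consistent
T = (S >= 3).astype(float)          # coarsened binary "completed high school"
iv_coarse = wald(Y, T, Z)           # near 2.0: Z moves Y through margins of S that
                                    # T does not capture, inflating the estimate
print(iv_full, iv_coarse)
```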
7

Bishop, Kelly C., Sehba Husain-Krautter, Jonathan D. Ketcham, Nicolai V. Kuminoff, and Corbett Schimming. "Analyzing Individual-Level Secondary Data with Instrumental Variable Methods Is Useful for Studying the Effects of Air Pollution on Dementia." Journal of Alzheimer's Disease 79, no. 1 (January 5, 2021): 15–23. http://dx.doi.org/10.3233/jad-200497.

Abstract:
We hypothesize that analyzing individual-level secondary data with instrumental variable (IV) methods can advance knowledge of the long-term effects of air pollution on dementia. We discuss issues in measurement using secondary data and how IV estimation can overcome biases due to measurement error and unmeasured variables. We link air-quality data from the Environmental Protection Agency’s monitors with Medicare claims data to illustrate the use of secondary data to document associations. Additionally, we describe results from a previous study that uses an IV for pollution and finds that PM2.5’s effects on dementia are larger than non-causal associations.
8

Faradiba, Faradiba. "Determination of Climate Factors in Flood and Drought Disaster in Indonesia using Instrumental Variable (IV) Methods." JURNAL ILMU FISIKA | UNIVERSITAS ANDALAS 13, no. 1 (February 28, 2021): 54–61. http://dx.doi.org/10.25077/jif.13.1.54-61.2021.

Abstract:
Located in Southeast Asia, Indonesia has rainy and dry seasons. These seasons often cause problems for business sectors and community activities, including floods and droughts, which bring both material and non-material losses. This study uses village-level climate and disaster data to determine the effect of rainfall on disasters, applying the instrumental variable method because the model has endogeneity problems. The study concludes that increased rainfall has a positive effect on flood disasters, with a coefficient of 0.003038; rainfall also affects drought, with a coefficient of -0.000377. The variables in the regression model explain 1.74 percent of the variation in flood disasters and 0.59 percent in drought disasters, indicating that other variables account for most of the variation in flooding and drought. The estimated effect of rainfall on floods and droughts is nonetheless significant, so government and community efforts are needed to anticipate similar disasters.
9

Saini, Vikram, and Lillie Dewan. "Instrument variable method based on nonlinear transformed instruments for Hammerstein system identification." Journal of Vibration and Control 24, no. 13 (February 22, 2017): 2802–14. http://dx.doi.org/10.1177/1077546317694770.

Abstract:
An extended instrumental variable (EIV) method is considered for the stochastic Hammerstein system (ARMAX and general model structures). The EIV method provides consistent parameter estimates by eliminating the noise-induced bias of the least squares (LS) method. To estimate the parameters, the Hammerstein model is formulated using bilinear parameterization. The bilinear model is identified by introducing nonlinear instrumental variables obtained from transformed delayed outputs, using a nonlinear mapping and a polynomial basis of delayed inputs. These instruments are analyzed in full generality by computing bounds on the expected relationship between instruments and noise for a general noise disturbance structure. A specific case with the hyperbolic tangent (tanh) transformation is then considered. Comparative performance analysis of the proposed IV method against the existing IV method, data-filtering-based LS methods, and the extended LS method shows improved statistical properties of the parameter estimates.
10

Mogstad, Magne, and Alexander Torgovitsky. "Identification and Extrapolation of Causal Effects with Instrumental Variables." Annual Review of Economics 10, no. 1 (August 2, 2018): 577–613. http://dx.doi.org/10.1146/annurev-economics-101617-041813.

Abstract:
Instrumental variables (IV) are widely used in economics to address selection on unobservables. Standard IV methods produce estimates of causal effects that are specific to individuals whose behavior can be manipulated by the instrument at hand. In many cases, these individuals are not the same as those who would be induced to treatment by an intervention or policy of interest to the researcher. The average causal effect for the two groups can differ significantly if the effect of the treatment varies systematically with unobserved factors that are correlated with treatment choice. We review the implications of this type of unobserved heterogeneity for the interpretation of standard IV methods and for their relevance to policy evaluation. We argue that making inferences about policy-relevant parameters typically requires extrapolating from the individuals affected by the instrument to the individuals who would be induced to treatment by the policy under consideration. We discuss a variety of alternatives to standard IV methods that can be used to rigorously perform this extrapolation. We show that many of these approaches can be nested as special cases of a general framework that embraces the possibility of partial identification.
11

Ertefaie, Ashkan, Dylan Small, James Flory, and Sean Hennessy. "Selection Bias When Using Instrumental Variable Methods to Compare Two Treatments But More Than Two Treatments Are Available." International Journal of Biostatistics 12, no. 1 (May 1, 2016): 219–32. http://dx.doi.org/10.1515/ijb-2015-0006.

Abstract:
Instrumental variable (IV) methods are widely used to adjust for bias in estimating treatment effects caused by unmeasured confounders in observational studies. Commonly, the comparison of interest involves two treatments, and only subjects receiving one of these two treatments are included in the analysis even though more than two treatments are available. In this paper, we provide empirical and theoretical evidence that IV methods may yield biased treatment effects when applied to a data set in which subjects are preselected based on their received treatments. We frame this as a selection bias problem and propose a procedure that identifies the treatment effect of interest as a function of a vector of sensitivity parameters. We also list assumptions under which analyzing the preselected data does not lead to a biased treatment effect estimate. The performance of the proposed method is examined using simulation studies. We applied our method to The Health Improvement Network (THIN) database to estimate the comparative effect of metformin and sulfonylureas on weight gain among diabetic patients.
12

Alawadi, Zeinab, Uma Phatak, Chung-Yuan Hu, Christina Edwards Bailey, Lillian Kao, Y. Nancy You, and George J. Chang. "Comparative effectiveness of primary tumor resection in metastatic colon cancer: An instrumental variable analysis." Journal of Clinical Oncology 33, no. 3_suppl (January 20, 2015): 674. http://dx.doi.org/10.1200/jco.2015.33.3_suppl.674.

Abstract:
674 Background: Although the safety of chemotherapy without primary tumor resection (PTR) has been established, questions remain regarding potential survival benefit with PTR. The purpose of this study was to compare mortality with and without PTR among patients with unresectable metastatic colon cancer using nationwide hospital-based cancer registry data. Methods: An observational study was conducted of patients with stage 4 colon cancer identified from the National Cancer Data Base (2003-2005). Patients who underwent metastasectomy were excluded. Patient, treatment, and hospital data were analyzed. Multivariate Cox regression stratified by receipt of chemotherapy was performed to compare survival with and without PTR. To account for treatment selection bias, Propensity Score Weighting (PSW) and Instrumental Variable (IV) analyses, using hospital-level PTR rate as the instrument, were performed. To account for the potential bias associated with early deaths due to comorbidity or disease burden (survivor treatment bias), a 1-year landmark analysis was performed. Results: A total of 14,399 patients met inclusion criteria and 6,735 patients were eligible for landmark analysis. PTR was performed in 38.2% of the total cohort and 73.8% of those at landmark. Using multivariate Cox regression analysis, PTR was associated with a significant reduction in mortality (HR 0.39; 95% CI, 0.38-0.41). This effect persisted with PSW (HR 0.4; 95% CI, 0.38-0.43). However, IV analysis showed a much smaller effect (RR 0.88; 95% CI, 0.83-0.93). While a smaller benefit was seen on landmark analysis using multivariate Cox regression (HR 0.6; 95% CI, 0.55-0.64) and PSW (HR 0.59; 95% CI, 0.54-0.64), IV analysis showed no improvement in survival with PTR (RR 0.97; 95% CI, 0.87-1.06). Stratification by chemotherapy did not alter the results.
Conclusions: Among patients with stage IV colon cancer, PTR offered no survival benefit over systemic chemotherapy alone when the IV method was applied at the 1 year landmark. Subject to selection and survivor treatment bias, standard regression analysis may overestimate the benefit of PTR. Future study should focus on identifying patients most likely to benefit from PTR.
13

Andrews, Isaiah, James H. Stock, and Liyang Sun. "Weak Instruments in Instrumental Variables Regression: Theory and Practice." Annual Review of Economics 11, no. 1 (August 2, 2019): 727–53. http://dx.doi.org/10.1146/annurev-economics-080218-025643.

Abstract:
When instruments are weakly correlated with endogenous regressors, conventional methods for instrumental variables (IV) estimation and inference become unreliable. A large literature in econometrics has developed procedures for detecting weak instruments and constructing robust confidence sets, but many of the results in this literature are limited to settings with independent and homoskedastic data, while data encountered in practice frequently violate these assumptions. We review the literature on weak instruments in linear IV regression with an emphasis on results for nonhomoskedastic (heteroskedastic, serially correlated, or clustered) data. To assess the practical importance of weak instruments, we also report tabulations and simulations based on a survey of papers published in the American Economic Review from 2014 to 2018 that use IV. These results suggest that weak instruments remain an important issue for empirical practice, and that there are simple steps that researchers can take to better handle weak instruments in applications.
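A standard diagnostic from this literature is the first-stage F statistic, with the familiar rule of thumb that F below roughly 10 signals a weak-instrument problem (the paper argues for robust versions of such tests under nonhomoskedastic data). A self-contained toy sketch of the homoskedastic case, using my own simulated data:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5_000
U = rng.normal(size=n)     # unobserved confounder
Z = rng.normal(size=n)     # instrument

def first_stage_F(D, Z):
    """F statistic for the single instrument in the first-stage regression D ~ 1 + Z."""
    n = len(D)
    X = np.column_stack([np.ones(n), Z])
    b, *_ = np.linalg.lstsq(X, D, rcond=None)
    resid = D - X @ b
    var_b = (resid @ resid / (n - 2)) * np.linalg.inv(X.T @ X)[1, 1]
    return b[1] ** 2 / var_b   # squared t statistic on the instrument

D_weak = 0.05 * Z + U + rng.normal(size=n)    # weak first stage (coefficient 0.05)
D_strong = 0.50 * Z + U + rng.normal(size=n)  # strong first stage (coefficient 0.50)
print(first_stage_F(D_weak, Z), first_stage_F(D_strong, Z))
# Rule of thumb: F < 10 suggests conventional 2SLS inference is unreliable.
```

With heteroskedastic or clustered data, the analogous diagnostic is an effective (robust) F statistic rather than this homoskedastic version.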
14

Assaad, Ragui, Greta Friedemann-Sánchez, and Deborah Levison. "Impact of Domestic Violence on Children’s Education in Colombia: Methodological Challenges." Violence Against Women 23, no. 12 (August 24, 2016): 1484–512. http://dx.doi.org/10.1177/1077801216661036.

Abstract:
We explore the methodological challenges of estimating the effects of intimate partner violence (IPV) against the mother on the educational outcomes of her children. We tackle the problem of potential endogeneity and non-random selection of children into situations where they are exposed to IPV using non-parametric matching methods and parametric instrumental variable methods. Using Colombia’s 2005 Demographic and Health Survey (DHS), we find that IV and non-IV estimators produce qualitatively similar results at varying degrees of precision, for some educational outcomes. Therefore, exogeneity of IPV to various education outcomes cannot be taken for granted; appropriate methods need to be used to study its causal effects.
15

Van Huellen, Sophie, and Duo Qin. "Compulsory Schooling and Returns to Education: A Re-Examination." Econometrics 7, no. 3 (September 2, 2019): 36. http://dx.doi.org/10.3390/econometrics7030036.

Abstract:
This paper re-examines the instrumental variable (IV) approach to estimating returns to education by use of compulsory school law (CSL) in the US. We show that the IV approach amounts to a change in model specification by changing the causal status of the variable of interest. From this perspective, the IV-versus-OLS (ordinary least squares) choice becomes a model selection issue between non-nested models and is hence testable using cross-validation methods. It also enables us to unravel several logical flaws in the conceptualisation of IV-based models. Using the causal chain model specification approach, we overcome these flaws by carefully distinguishing returns to education from the treatment effect of CSL. We find relatively robust estimates for the first effect, while estimates of the second effect are hindered by measurement errors in the CSL indicators. We find reassurance for our approach in fundamental theories of statistical learning.
16

Palmer, Tom M., Debbie A. Lawlor, Roger M. Harbord, Nuala A. Sheehan, Jon H. Tobias, Nicholas J. Timpson, George Davey Smith, and Jonathan AC Sterne. "Using multiple genetic variants as instrumental variables for modifiable risk factors." Statistical Methods in Medical Research 21, no. 3 (January 7, 2011): 223–42. http://dx.doi.org/10.1177/0962280210394459.

Abstract:
Mendelian randomisation analyses use genetic variants as instrumental variables (IVs) to estimate causal effects of modifiable risk factors on disease outcomes. Genetic variants typically explain a small proportion of the variability in risk factors; hence Mendelian randomisation analyses can require large sample sizes. However, an increasing number of genetic variants have been found to be robustly associated with disease-related outcomes in genome-wide association studies. Use of multiple instruments can improve the precision of IV estimates, and also permit examination of underlying IV assumptions. We discuss the use of multiple genetic variants in Mendelian randomisation analyses with continuous outcome variables where all relationships are assumed to be linear. We describe possible violations of IV assumptions, and how multiple instrument analyses can be used to identify them. We present an example using four adiposity-associated genetic variants as IVs for the causal effect of fat mass on bone density, using data on 5509 children enrolled in the ALSPAC birth cohort study. We also use simulation studies to examine the effect of different sets of IVs on precision and bias. When each instrument independently explains variability in the risk factor, use of multiple instruments increases the precision of IV estimates. However, inclusion of weak instruments could increase finite sample bias. Missing data on multiple genetic variants can diminish the available sample size, compared with single instrument analyses. In simulations with additive genotype-risk factor effects, IV estimates using a weighted allele score had similar properties to estimates using multiple instruments. Under the correct conditions, multiple instrument analyses are a promising approach for Mendelian randomisation studies. Further research is required into multiple imputation methods to address missing data issues in IV estimation.
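The multiple-instrument and weighted-allele-score approaches the authors compare can be sketched with simulated data. This is a toy example of my own (not the ALSPAC analysis), and the score weights are assumed known here, whereas in practice they would come from an external genome-wide association study:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50_000

# Four independent variants coded as 0/1/2 allele counts (toy data)
G = rng.binomial(2, 0.3, size=(n, 4))
gamma = np.array([0.30, 0.25, 0.20, 0.15])    # variant -> risk-factor effects
U = rng.normal(size=n)                         # unmeasured confounder
X = G @ gamma + U + rng.normal(size=n)         # risk factor (e.g. fat mass)
Y = 0.4 * X + U + rng.normal(size=n)           # outcome; true causal effect 0.4

def tsls(y, x, Z):
    """Two-stage least squares with a constant: fit x on instruments, then y on fitted x."""
    Z1 = np.column_stack([np.ones(len(y)), Z])
    xhat = Z1 @ np.linalg.lstsq(Z1, x, rcond=None)[0]
    X1 = np.column_stack([np.ones(len(y)), xhat])
    return np.linalg.lstsq(X1, y, rcond=None)[0][1]

beta_multi = tsls(Y, X, G)                     # all variants as separate instruments
score = (G @ gamma).reshape(-1, 1)             # weighted allele score
beta_score = tsls(Y, X, score)                 # score as a single instrument
print(beta_multi, beta_score)                  # both near the true effect 0.4
```

This mirrors the abstract's finding that, with additive genotype-risk factor effects, the weighted allele score and the multiple-instrument estimator have similar properties.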
17

Earle, Craig C., Jerry S. Tsai, Richard D. Gelber, Milton C. Weinstein, Peter J. Neumann, and Jane C. Weeks. "Effectiveness of Chemotherapy for Advanced Lung Cancer in the Elderly: Instrumental Variable and Propensity Analysis." Journal of Clinical Oncology 19, no. 4 (February 15, 2001): 1064–70. http://dx.doi.org/10.1200/jco.2001.19.4.1064.

Abstract:
PURPOSE: To compare the effectiveness of chemotherapy given to elderly patients in routine practice for stage IV non–small-cell lung cancer (NSCLC) with the efficacy observed in randomized trials. PATIENTS AND METHODS: We used instrumental variable analysis (IVA) and propensity scores (PS) to simulate the conditions of a randomized trial in a retrospective cohort of patients over age 65 from the Survival, Epidemiology, and End Results (SEER) tumor registry. Geographic variation in chemotherapy use served as the instrument for the IVA analysis, and propensity scores were calculated with a logistic model based on patient disease and sociodemographic characteristics. RESULTS: Among 6,232 elderly patients, the instrumental variable estimate indicated an increase in median survival of 33 days and an improvement in 1-year survival of 9% attributable to chemotherapy. In a Cox regression model, chemotherapy administration was associated with a hazard ratio of 0.81 (95% confidence interval, 0.76 to 0.85). When survival was analyzed separately within propensity score quintiles, the hazard ratios were all similar, ranging from 0.78 to 0.85. These results are comparable with those of a large meta-analysis, which found a hazard ratio of 0.87 in the subgroup of patients over age 65. CONCLUSION: Chemotherapy for stage IV NSCLC seems to have effectiveness for elderly patients and those with comorbid conditions that is similar to the efficacy seen in randomized trials containing mostly younger, highly selected patients. All suitable patients should be given the opportunity to consider palliative chemotherapy for metastatic NSCLC.
18

Anglemyer, Andrew, Amy Sturt, and Yvonne Maldonado. "The Effect of Combination Antiretroviral Therapy Use Among HIV Positive Children on the Hazard of AIDS Using Calendar Year as an Instrumental Variable." Current HIV Research 16, no. 2 (August 15, 2018): 151–57. http://dx.doi.org/10.2174/1570162x16666180409150826.

Abstract:
Background: Instrumental variable (IV) analyses are a common causal inference technique used in the absence of randomized data. Combination antiretroviral therapy (cART) was first introduced in 1996, and calendar periods have been used as a proxy for cART use. However, misclassification of cART use can bias IV analyses. Objective: We aim to highlight the differences in the estimated effects of antiretroviral therapy on clinical outcomes between traditional and adapted IV analysis techniques. Methods: This study includes children with perinatal human immunodeficiency virus (HIV-1) infection followed from 1988 to 2009. We describe an application of traditional and adapted IV analysis techniques. Noncompliance adjustments were applied to correct the misclassification of cART use. Weighting by the inverse probability of calendar era given selected covariates was performed to control for variables that may be related to both the IV and the outcome. Results: During 48,380 person-days, 78 HIV-positive children progressed to an initial stage-3-defining diagnosis or death. The intention-to-treat (ITT) rate ratio (RR) of stage-3-defining diagnosis or death comparing the pre-cART and cART eras was estimated at 2.67 (95% confidence interval (CI): 1.47, 4.84). The IV estimator was used to adjust for cART use misclassification, yielding an IV RR of 5.42 (95% CI: 2.99, 9.83). Weighting analyses did not markedly alter the results. Conclusion: cART use decreased progression to stage-3-defining diagnosis or death. The use of noncompliance adjustments for cART misclassification in IV analyses may provide more robust evidence of cART's effectiveness than traditional ITT analysis.
19

Andrews, Rick L., and Peter Ebbes. "Properties of instrumental variables estimation in logit-based demand models." Journal of Modelling in Management 9, no. 3 (November 11, 2014): 261–89. http://dx.doi.org/10.1108/jm2-07-2014-0062.

Abstract:
Purpose – This paper aims to investigate the effects of using poor-quality instruments to remedy endogeneity in logit-based demand models. Endogeneity problems in demand models occur when certain factors, unobserved by the researcher, affect both demand and the values of a marketing mix variable set by managers. For example, unobserved factors such as style, prestige or reputation might result in higher prices for a product and higher demand for that product. If not addressed properly, endogeneity can bias the elasticities of the endogenous variable and subsequent optimization of the marketing mix. In practice, instrumental variables (IV) estimation techniques are often used to remedy an endogeneity problem. It is well-known that, for linear regression models, the use of IV techniques with poor-quality instruments can produce very poor parameter estimates, in some circumstances even worse than those that result from ignoring the endogeneity problem altogether. The literature has not addressed the consequences of using poor-quality instruments to remedy endogeneity problems in non-linear models, such as logit-based demand models. Design/methodology/approach – Using simulation methods, the authors investigate the effects of using poor-quality instruments to remedy endogeneity in logit-based demand models applied to finite-sample data sets. The results show that, even when the conditions for lack of parameter identification due to poor-quality instruments do not hold exactly, estimates of price elasticities can still be quite poor. That being the case, the authors investigate the relative performance of several non-linear IV estimation procedures utilizing readily available instruments in finite samples. 
Findings – The study highlights the attractiveness of the control function approach (Petrin and Train, 2010) and readily available instruments, which together reduce the mean squared elasticity errors substantially for experimental conditions in which the theory-backed instruments are poor in quality. The authors find important effects for sample size, in particular for the number of brands, for which it is shown that endogeneity problems are exacerbated with increases in the number of brands, especially when poor-quality instruments are used. In addition, the number of stores is found to be important for likelihood ratio testing. The results of the simulation are shown to generalize to situations under Nash pricing in oligopolistic markets, to conditions in which cross-sectional preference heterogeneity exists and to nested logit and probit-based demand specifications as well. Based on the results of the simulation, the authors suggest a procedure for managing a potential endogeneity problem in logit-based demand models. Originality/value – The literature on demand modeling has focused on deriving analytical results on the consequences of using poor-quality instruments to remedy endogeneity problems in linear models. Despite the widespread use of non-linear demand models such as logit, this study is the first to address the consequences of using poor-quality instruments in these models and to make practical recommendations on how to avoid poor outcomes.
APA, Harvard, Vancouver, ISO, and other styles
20

Brodeur, Abel, Nikolai Cook, and Anthony Heyes. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics." American Economic Review 110, no. 11 (November 1, 2020): 3634–60. http://dx.doi.org/10.1257/aer.20190687.

Full text
Abstract:
The credibility revolution in economics has promoted causal identification using randomized control trials (RCT), difference-in-differences (DID), instrumental variables (IV) and regression discontinuity design (RDD). Applying multiple approaches to over 21,000 hypothesis tests published in 25 leading economics journals, we find that the extent of p-hacking and publication bias varies greatly by method. IV (and to a lesser extent DID) are particularly problematic. We find no evidence that (i) papers published in the Top 5 journals are different to others; (ii) the journal “revise and resubmit” process mitigates the problem; (iii) things are improving through time. (JEL A14, C12, C52)
21

Ezzalfani, Monia, Raphaël Porcher, Alexia Savignoni, Suzette Delaloge, Thomas Filleron, Mathieu Robain, and David Pérol. "Addressing the issue of bias in observational studies: Using instrumental variables and a quasi-randomization trial in an ESME research project." PLOS ONE 16, no. 9 (September 15, 2021): e0255017. http://dx.doi.org/10.1371/journal.pone.0255017.

Full text
Abstract:
Purpose Observational studies using routinely collected data are faced with a number of potential shortcomings that can bias their results. Many methods rely on controlling for measured and unmeasured confounders. In this work, we investigate the use of instrumental variables (IV) and quasi-trial analysis to control for unmeasured confounders in the context of a study based on the retrospective Epidemiological Strategy and Medical Economics (ESME) database, which compared overall survival (OS) with paclitaxel plus bevacizumab or paclitaxel alone as first-line treatment in patients with HER2-negative metastatic breast cancer (MBC). Patients and methods Causal interpretations and estimates can be made from observational data using IV and quasi-trial analysis. Quasi-trial analysis has the same conceptual basis as IV; however, instead of using an IV in the analysis, a “superficial” or “pseudo” randomized trial is used in a Cox model. For instance, in a multicenter trial, quasi-trial analysis can use the treatment preference in each center rather than the treatment variable itself, and comparisons of results between centers or clinicians can then be informative. Results In the original analysis, the OS adjusted for major factors was significantly longer with paclitaxel and bevacizumab than with paclitaxel alone. Using the center-treatment preference as an instrument yielded concordant results. For the quasi-trial analysis, a Cox model was used, adjusted for all factors initially used. The results consolidate those obtained with a conventional multivariate Cox model. Conclusion Unmeasured confounding is a major concern in observational studies, and IV or quasi-trial analysis can be helpful to complement analyses of studies of this nature.
22

Zlateva, Ivelina Yordanova, and Nikola Nikolov. "Two-step Procedure Based on the Least Squares and Instrumental Variable Methods for Simultaneous Estimation of von Bertalanffy Growth Parameters." International Journal of Agricultural and Environmental Information Systems 10, no. 2 (April 2019): 49–81. http://dx.doi.org/10.4018/ijaeis.2019040103.

Full text
Abstract:
This article advances a two-step procedure based on the least squares (LS) and instrumental variable (IV) methods for simultaneous estimation of the three unknown parameters L∞, K and t0, which describe the individual growth of fish in the von Bertalanffy growth equation. For the purposes of the present analysis, specific MATLAB-based software has been developed and applied to simulated data sets to test the operational workability of the proposed procedure and to pinpoint areas for improvement. The resulting parameter estimates have been analyzed on the basis of consecutive comparison (the initial conditions being the same) between the results delivered by the two-step procedure for simultaneous estimation of L∞, K and t0 and the results obtained via the most commonly employed methods for estimating growth parameters: first, the Gulland-and-Holt method for estimating the asymptotic length L∞ and the curvature parameter K, followed by the von Bertalanffy method for estimation of t0.
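The benchmark workflow the abstract compares against (Gulland-and-Holt for L∞ and K, then a von Bertalanffy step for t0) can be sketched in a few lines. This is an illustrative NumPy translation on noise-free synthetic data, not the article's MATLAB software, and the "true" parameter values are assumptions:

```python
import numpy as np

# Synthetic length-at-age data from L(t) = Linf * (1 - exp(-K * (t - t0)))
Linf_true, K_true, t0_true = 100.0, 0.3, -0.5
t = np.arange(1.0, 11.0)
L = Linf_true * (1 - np.exp(-K_true * (t - t0_true)))

# Gulland-and-Holt: regress annual growth rate on mean length;
# the slope estimates -K and intercept/K estimates Linf
rate = np.diff(L) / np.diff(t)
Lbar = (L[:-1] + L[1:]) / 2
slope, intercept = np.polyfit(Lbar, rate, 1)
K = -slope
Linf = intercept / K

# von Bertalanffy step for t0: invert the growth curve at each age and average
t0 = np.mean(t + np.log(1 - L / Linf) / K)
```

Even on noise-free data the finite-difference step biases K slightly (the regression slope is 2·tanh(K·Δt/2)/Δt rather than K), which is one motivation for estimating all three parameters simultaneously as the article proposes.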
23

Park, Soojin, and Gregory J. Palardy. "Sensitivity Evaluation of Methods for Estimating Complier Average Causal Mediation Effects to Assumptions." Journal of Educational and Behavioral Statistics 45, no. 4 (March 9, 2020): 475–506. http://dx.doi.org/10.3102/1076998620908599.

Full text
Abstract:
Estimating the effects of randomized experiments and, by extension, their mediating mechanisms, is often complicated by treatment noncompliance. Two estimation methods for causal mediation in the presence of noncompliance have recently been proposed, the instrumental variable method (IV-mediate) and maximum likelihood method (ML-mediate). However, little research has examined their performance when certain assumptions are violated and under varying data conditions. This article addresses that gap in the research and compares the performance of the two methods. The results show that the distributional assumption of the compliance behavior plays an important role in estimation. That is, regardless of the estimation method or whether the other assumptions hold, results are biased if the distributional assumption is not met. We also found that the IV-mediate method is more sensitive to exclusion restriction violations, while the ML-mediate method is more sensitive to monotonicity violations. Moreover, estimates depend in part on compliance rate, sample size, and the availability and impact of control covariates. These findings are used to provide guidance on estimator selection.
24

Peeters, Bart, and Guido De Roeck. "Stochastic System Identification for Operational Modal Analysis: A Review." Journal of Dynamic Systems, Measurement, and Control 123, no. 4 (February 7, 2001): 659–67. http://dx.doi.org/10.1115/1.1410370.

Full text
Abstract:
This paper reviews stochastic system identification methods that have been used to estimate the modal parameters of vibrating structures in operational conditions. It is found that many classical input-output methods have an output-only counterpart. For instance, the Complex Mode Indication Function (CMIF) can be applied both to Frequency Response Functions and output power and cross spectra. The Polyreference Time Domain (PTD) method applied to impulse responses is similar to the Instrumental Variable (IV) method applied to output covariances. The Eigensystem Realization Algorithm (ERA) is equivalent to stochastic subspace identification.
25

Huntington-Klein, Nick. "Instruments with Heterogeneous Effects: Bias, Monotonicity, and Localness." Journal of Causal Inference 8, no. 1 (December 19, 2020): 182–208. http://dx.doi.org/10.1515/jci-2020-0011.

Full text
Abstract:
In Instrumental Variables (IV) estimation, the effect of an instrument on an endogenous variable may vary across the sample. In this case, IV produces a local average treatment effect (LATE), and if monotonicity does not hold, then no effect of interest is identified. In this paper, I calculate the weighted average of treatment effects that is identified under general first-stage effect heterogeneity, which is generally not the average treatment effect among those affected by the instrument. I then describe a simple set of data-driven approaches to modeling variation in the effect of the instrument. These approaches identify a Super-Local Average Treatment Effect (SLATE) that weights treatment effects by the corresponding instrument effect more heavily than LATE. Even when first-stage heterogeneity is poorly modeled, these approaches considerably reduce the impact of small-sample bias compared to standard IV and unbiased weak-instrument IV methods, and can also make results more robust to violations of monotonicity. In application to a published study with a strong instrument, the preferred approach reduces error by about 19% in small (N ≈ 1,000) subsamples, and by about 13% in larger (N ≈ 33,000) subsamples.
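The "localness" of LATE discussed here can be made concrete with a binary-instrument simulation. This is a hypothetical setup, not the paper's data: the compliance shares and the complier/never-taker effect sizes are assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200_000
z = rng.integers(0, 2, size=n)            # binary instrument (encouragement)
complier = rng.random(n) < 0.4            # compliers take treatment iff encouraged
d = np.where(complier, z, 0)              # never-takers stay untreated
# Heterogeneous effects: compliers gain 2.0; never-takers would gain 5.0
effect = np.where(complier, 2.0, 5.0)
y = 1.0 + effect * d + rng.normal(size=n)

# Wald / IV estimand: reduced-form difference over first-stage difference
itt_y = y[z == 1].mean() - y[z == 0].mean()
itt_d = d[z == 1].mean() - d[z == 0].mean()
late = itt_y / itt_d
```

The Wald ratio recovers the compliers' effect (2.0), not the never-takers' 5.0 or any population average, which is exactly the localness the paper generalizes to settings with heterogeneous first-stage effects.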
26

Babii, Andrii. "HONEST CONFIDENCE SETS IN NONPARAMETRIC IV REGRESSION AND OTHER ILL-POSED MODELS." Econometric Theory 36, no. 4 (March 5, 2020): 658–706. http://dx.doi.org/10.1017/s0266466619000380.

Full text
Abstract:
This article develops inferential methods for a very general class of ill-posed models in econometrics encompassing the nonparametric instrumental variable regression, various functional regressions, and the density deconvolution. We focus on uniform confidence sets for the parameter of interest estimated with Tikhonov regularization, as in Darolles et al. (2011, Econometrica 79, 1541–1565). Since it is impossible to have inferential methods based on the central limit theorem, we develop two alternative approaches relying on the concentration inequality and bootstrap approximations. We show that expected diameters and coverage properties of resulting sets have uniform validity over a large class of models, that is, constructed confidence sets are honest. Monte Carlo experiments illustrate that introduced confidence sets have reasonable width and coverage properties. Using U.S. data, we provide uniform confidence sets for Engel curves for various commodities.
27

Chen, Nan, and Jay Pan. "The causal effect of delivery volume on severe maternal morbidity: an instrumental variable analysis in Sichuan, China." BMJ Global Health 7, no. 5 (May 2022): e008428. http://dx.doi.org/10.1136/bmjgh-2022-008428.

Full text
Abstract:
Objective Findings regarding the association between delivery volume and maternal health outcomes are mixed, most of which explored their correlation. This study aims to demonstrate the causal effect of delivery volume on severe maternal morbidity (SMM) in China. Methods We analysed all women giving birth in the densely populated Sichuan province with 83 million residents in China, during the fourth quarters of each of 4 years (from 2016 to 2019). The routinely collected discharge data, the health institutional annual report data and road network data were used for analysis. The maternal health outcome was measured by SMM. Instrumental variable (IV) methods were applied for estimation, while the surrounding average number of delivery cases per institution was used as the instrument. Results The study included 4545 institution-years of data from 1456 distinct institutions with delivery services, reflecting 810 049 associated delivery cases. The average SMM rate was approximately 33.08 per 1000 deliveries between 2016 and 2019. More than 86% of delivery services were provided by a third of the institutions with the highest delivery volume (≥143 delivery cases quarterly). In contrast, less than 2% of delivery services were offered by a third of the institutions with the lowest delivery volume (<19 delivery cases quarterly). After adjusting the confounders in the IV-logistic models, the average marginal effect of per 1000 cases in delivery volume was −0.162 (95% CI −0.169 to −0.155), while the adjusted OR of delivery volume was 0.005 (95% CI 0.004 to 0.006). Conclusion Increased delivery volume has great potential to improve maternal health outcomes, while the centralisation of delivery services might facilitate maternal health promotion in China. Our study also provides implications for other developing countries confronted with similar challenges to China.
28

Rotar, Laura. "Evaluating the Effectiveness of an Institutional Training Program in Slovenia: A Comparison of Methods." South East European Journal of Economics and Business 7, no. 1 (April 1, 2012): 43–51. http://dx.doi.org/10.2478/v10033-012-0004-8.

Full text
Abstract:
This paper aims to estimate the effect of an institutional training program on participants' chances of finding a job, using a rich dataset which comes from the official records of the Employment Service of Slovenia and taking into account the potential bias due to the existence of unobserved confounding factors. To deal with these selection biases, three methods are implemented in a comparative perspective: (1) instrumental variable (IV) regression; (2) Heckman's two-stage approach and (3) propensity score matching. This paper underlines important divergences between the results of parametric and non-parametric estimators. Some of the results, however, show the impact of the institutional training program on participants' chances of finding a job, especially in the short run. In the long run, however, the results are not so obvious.
29

Hansen, Peter R., and Asger Lunde. "ESTIMATING THE PERSISTENCE AND THE AUTOCORRELATION FUNCTION OF A TIME SERIES THAT IS MEASURED WITH ERROR." Econometric Theory 30, no. 1 (August 8, 2013): 60–93. http://dx.doi.org/10.1017/s0266466613000121.

Full text
Abstract:
An economic time series can often be viewed as a noisy proxy for an underlying economic variable. Measurement errors will influence the dynamic properties of the observed process and may conceal the persistence of the underlying time series. In this paper we develop instrumental variable (IV) methods for extracting information about the latent process. Our framework can be used to estimate the autocorrelation function of the latent volatility process and a key persistence parameter. Our analysis is motivated by the recent literature on realized volatility measures that are imperfect estimates of actual volatility. In an empirical analysis using realized measures for the Dow Jones industrial average stocks, we find the underlying volatility to be near unit root in all cases. Although standard unit root tests are asymptotically justified, we find them to be misleading in our application despite the large sample. Unit root tests that are based on the IV estimator have better finite sample properties in this context.
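The core identification idea here, instrumenting a noisy lag with a deeper lag whose measurement error is uncorrelated, can be sketched as follows. This is an illustrative AR(1)-plus-noise simulation with assumed parameters, not the authors' realized-volatility application:

```python
import numpy as np

rng = np.random.default_rng(2)
T = 100_000
rho = 0.9                                   # persistence of the latent process
eps = rng.normal(size=T)
x = np.zeros(T)
for t in range(1, T):
    x[t] = rho * x[t - 1] + eps[t]          # latent AR(1)
y = x + rng.normal(size=T)                  # observed noisy proxy

# OLS autoregression of the proxy is attenuated toward zero
rho_ols = np.cov(y[1:], y[:-1])[0, 1] / np.var(y[:-1])

# IV: use y_{t-2} as an instrument for y_{t-1}; since the measurement
# error is serially uncorrelated, the covariance ratio identifies rho
rho_iv = np.cov(y[2:], y[:-2])[0, 1] / np.cov(y[1:-1], y[:-2])[0, 1]
```

The OLS estimate is pulled well below the true persistence of 0.9, while the IV ratio recovers it, mirroring the paper's point that measurement error conceals the persistence of the latent series.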
30

Janczak, Andrzej, and Józef Korbicz. "Two–Stage Instrumental Variables Identification of Polynomial Wiener Systems with Invertible Nonlinearities." International Journal of Applied Mathematics and Computer Science 29, no. 3 (September 1, 2019): 571–80. http://dx.doi.org/10.2478/amcs-2019-0042.

Full text
Abstract:
A new two-stage approach to the identification of polynomial Wiener systems is proposed. It is assumed that the linear dynamic system is described by a transfer function model, the memoryless nonlinear element is invertible and the inverse nonlinear function is a polynomial. Based on these assumptions and by introducing a new extended parametrization, the Wiener model is transformed into a linear-in-parameters form. In Stage I, parameters of the transformed Wiener model are estimated using the least squares (LS) and instrumental variables (IV) methods. Although the obtained parameter estimates are consistent, the number of parameters of the transformed Wiener model is much greater than that of the original one. Moreover, there is no unique relationship between parameters of the inverse nonlinear function and those of the transformed Wiener model. In Stage II, based on the assumption that the linear dynamic model is already known, parameters of the inverse nonlinear function are estimated uniquely using the IV method. In this way, not only is the parameter redundancy removed but also the parameter estimation accuracy is increased. A numerical example is included to demonstrate the practical effectiveness of the proposed approach.
31

Collischon, Matthias. "Methods to Estimate Causal Effects. An Overview on IV, DiD and RDD and a Guide on How to Apply them in Practice." Soziale Welt 73, no. 4 (2022): 713–35. http://dx.doi.org/10.5771/0038-6073-2022-4-713.

Full text
Abstract:
The identification of causal effects has gained increasing attention in the social sciences in recent years, and this trend has also found its way into sociology, albeit on a relatively small scale. This article provides an overview of three methods to identify causal effects that are rarely used in sociology: instrumental variable (IV) regression, difference-in-differences (DiD), and regression discontinuity design (RDD). I provide intuitive introductions to these methods, discuss identifying assumptions, limitations of the methods, and promising extensions, and present an exemplary study for each estimation method that can serve as a benchmark when applying these estimation techniques. Furthermore, the supplemental material to this article contains Stata and R syntax that shows with simulated data how to apply these techniques in practice.
32

Xiao, Yingchao, Jun Zhou, and Bin Zhao. "Attitude dynamics aiding for three-dimensional passive target tracking of strap-down seeker based on instrumental variable Kalman filter." Transactions of the Institute of Measurement and Control 42, no. 14 (June 10, 2020): 2645–59. http://dx.doi.org/10.1177/0142331220923768.

Full text
Abstract:
The accuracy of three-dimensional (3D) passive target tracking under a strap-down system is usually affected by the measurement accuracy of the attitude angular rate and attitude angle. To address this problem, a novel 3D passive target tracking method based on an instrumental variable Kalman filter (IVKF) aided by attitude dynamics is proposed. First, the maneuvering target motion model is established based on the “current” statistical model, and the filtering equation of the MEMS inertial measurement unit (IMU) is also set up. Then, the nonlinear state equations are linearized and the nonlinear measurement equations are replaced with pseudolinear ones. The 3D pseudolinear Kalman filter (PLKF) algorithm is derived according to the linear Kalman filter (KF). To counter the severe bias problems with PLKF, bias compensation and recursive instrumental variable (IV) methods are considered. To enhance the observability of the system, a 3D motion tracking sliding-mode guidance law is deduced. Finally, mathematical simulations were conducted to verify the effectiveness of the proposed method. The simulation results show that the impact of measurement inaccuracy is reduced and the complexity of the algorithm is lowered, which demonstrates the validity of the method.
33

Kitamura, Yuichi, and Peter C. B. Phillips. "Efficient IV Estimation in Nonstationary Regression." Econometric Theory 11, no. 5 (October 1995): 1095–130. http://dx.doi.org/10.1017/s026646660000997x.

Full text
Abstract:
A limit theory for instrumental variables (IV) estimation that allows for possibly nonstationary processes was developed in Kitamura and Phillips (1992, Fully Modified IV, GIVE, and GMM Estimation with Possibly Non-stationary Regressors and Instruments, mimeo, Yale University). This theory covers a case that is important for practitioners, where the nonstationarity of the regressors may not be of full rank, and shows that the fully modified (FM) regression procedure of Phillips and Hansen (1990) is still applicable. FM versions of the generalized method of moments (GMM) estimator and the generalized instrumental variables estimator (GIVE) were also developed, and these estimators (FM-GMM and FM-GIVE) were designed specifically to take advantage of potential stationarity in the regressors (or unknown linear combinations of them). These estimators were shown to deliver efficiency gains over FM-IV in the estimation of the stationary components of a model. This paper provides an overview of the FM-IV, FM-GMM, and FM-GIVE procedures and investigates the small sample properties of these estimation procedures by simulations. We compare the following five estimation methods: ordinary least squares, crude (conventional) IV, FM-IV, FM-GMM, and FM-GIVE. Our findings are as follows: (i) In terms of overall performance in both stationary and nonstationary cases, FM-IV is more concentrated and better centered than OLS and crude IV, though it has a higher root mean square error than crude IV due to occasional outliers. (ii) Among FM-IV, FM-GMM, and FM-GIVE, (a) when applied to the stationary coefficients, FM-GIVE generally outperforms FM-IV and FM-GMM by a wide margin, whereas the difference between the latter two is quite small when the AR roots of the stationary processes are rather large; and (b) when applied to the nonstationary coefficients, the three estimators are numerically very close. The performance of the FM-GIVE estimator is generally very encouraging.
34

Schroeder, Mary C., Cole G. Chapman, Elizabeth A. Chrischilles, June Wilwert, Kathleen M. Schneider, Jennifer G. Robinson, and John M. Brooks. "Generating Practice-Based Evidence in the Use of Guideline-Recommended Combination Therapy for Secondary Prevention of Acute Myocardial Infarction." Pharmacy 10, no. 6 (November 3, 2022): 147. http://dx.doi.org/10.3390/pharmacy10060147.

Full text
Abstract:
Background: Clinical guidelines recommend beta-blockers, angiotensin-converting enzyme inhibitors/angiotensin-receptor blockers, and statins for the secondary prevention of acute myocardial infarction (AMI). It is not clear whether variation in real-world practice reflects poor quality-of-care or a balance of outcome tradeoffs across patients. Methods: The study cohort included Medicare fee-for-service beneficiaries hospitalized 2007–2008 for AMI. Treatment within 30 days post-discharge was grouped into one of eight possible combinations for the three drug classes. Outcomes included one-year overall survival, one-year cardiovascular-event-free survival, and 90-day adverse events. Treatment effects were estimated using an Instrumental Variables (IV) approach with instruments based on measures of local-area practice style. Pre-specified data elements were abstracted from hospital medical records for a stratified, random sample to create “unmeasured confounders” (per claims data) and assess model assumptions. Results: Each drug combination was observed in the final sample (N = 124,695), with 35.7% having all three, and 13.5% having none. Higher rates of guideline-recommended treatment were associated with both better survival and more adverse events. Unmeasured confounders were not associated with instrumental variable values. Conclusions: The results from this study suggest that providers consider both treatment benefits and harms in patients with AMIs. The investigation of estimator assumptions supports the validity of the estimates.
35

Lin, Zhaotong, Isaac Pan, and Wei Pan. "A practical problem with Egger regression in Mendelian randomization." PLOS Genetics 18, no. 5 (May 4, 2022): e1010166. http://dx.doi.org/10.1371/journal.pgen.1010166.

Full text
Abstract:
Mendelian randomization (MR) is an instrumental variable (IV) method using genetic variants such as single nucleotide polymorphisms (SNPs) as IVs to disentangle the causal relationship between an exposure and an outcome. Since any causal conclusion critically depends on the three valid IV assumptions, which will likely be violated in practice, MR methods robust to the IV assumptions are greatly needed. As such a method, Egger regression stands out as one of the most widely used due to its easy use and perceived robustness. Although Egger regression is claimed to be robust to directional pleiotropy under the instrument strength independent of direct effect (InSIDE) assumption, it is known to be dependent on the orientations/coding schemes of SNPs (i.e. which allele of an SNP is selected as the reference group). The current practice, as recommended as the default setting in some popular MR software packages, is to orientate the SNPs to be all positively associated with the exposure, which however, to our knowledge, has not been fully studied to assess its robustness and potential impact. We use both numerical examples (with both real data and simulated data) and analytical results to demonstrate the practical problem of Egger regression with respect to its heavy dependence on the SNP orientations. Under the assumption that InSIDE holds for some specific (and unknown) coding scheme of the SNPs, we analytically show that other coding schemes would in general lead to the violation of InSIDE. Other related MR and IV regression methods may suffer from the same problem. Caution should be taken when applying Egger regression (and related MR and IV regression methods) in practice.
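The orientation dependence the authors demonstrate is easy to reproduce with summary statistics. This is a simulated toy example under assumed effect sizes, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(1)
m = 50
bx = rng.uniform(0.05, 0.3, size=m)      # SNP-exposure effects, all coded positive
alpha, theta = 0.1, 0.5                  # directional pleiotropy; true causal effect
by = theta * bx + alpha + rng.normal(scale=0.01, size=m)

def egger_slope(bx, by):
    # MR-Egger: regress outcome effects on exposure effects with an intercept
    X = np.column_stack([np.ones_like(bx), bx])
    return np.linalg.lstsq(X, by, rcond=None)[0][1]

s_pos = egger_slope(bx, by)              # all-positive orientation

# Re-code half the SNPs (swap the reference allele): both betas flip sign
sgn = np.where(rng.random(m) < 0.5, -1.0, 1.0)
s_mixed = egger_slope(bx * sgn, by * sgn)
```

Under the all-positive orientation the intercept absorbs the constant pleiotropy and the slope sits near the true 0.5; after re-coding, the pleiotropy term becomes ±alpha and is no longer a constant, so the slope moves substantially even though the underlying data are unchanged.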
36

Hansen, Ryan N., An T. Pham, Belinda Lovelace, Stela Balaban, and George J. Wan. "Comparative Analysis of Inpatient Costs for Obstetrics and Gynecology Surgery Patients Treated With IV Acetaminophen and IV Opioids Versus IV Opioid-only Analgesia for Postoperative Pain." Annals of Pharmacotherapy 51, no. 10 (June 13, 2017): 834–39. http://dx.doi.org/10.1177/1060028017715651.

Full text
Abstract:
Background: Recovery from obstetrics and gynecology (OB/GYN) surgery, including hysterectomy and cesarean section delivery, aims to restore function while minimizing hospital length of stay (LOS) and medical expenditures. Objective: Our analyses compare OB/GYN surgery patients who received combination intravenous (IV) acetaminophen and IV opioid analgesia with those who received IV opioid-only analgesia and estimate differences in LOS, hospitalization costs, and opioid consumption. Methods: We performed a retrospective analysis of the Premier Database between January 2009 and June 2015, comparing OB/GYN surgery patients who received postoperative pain management with combination IV acetaminophen and IV opioids with those who received only IV opioids starting on the day of surgery and continuing up to the second postoperative day. We performed instrumental variable 2-stage least-squares regressions controlling for patient and hospital covariates to compare the LOS, hospitalization costs, and daily opioid doses (morphine equivalent dose) of IV acetaminophen recipients with that of opioid-only analgesia patients. Results: We identified 225 142 OB/GYN surgery patients who were eligible for our study of whom 89 568 (40%) had been managed with IV acetaminophen and opioids. Participants averaged 36 years of age and were predominantly non-Hispanic Caucasians (60%). Multivariable regression models estimated statistically significant differences in hospitalization cost and opioid use with IV acetaminophen associated with $484.4 lower total hospitalization costs (95% CI = −$760.4 to −$208.4; P = 0.0006) and 8.2 mg lower daily opioid use (95% CI = −10.0 to −6.4), whereas the difference in LOS was not significant, at −0.09 days (95% CI = −0.19 to 0.01; P = 0.07). Conclusion: Compared with IV opioid-only analgesia, managing post-OB/GYN surgery pain with the addition of IV acetaminophen is associated with decreased hospitalization costs and reduced opioid use.
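The instrumental-variable two-stage least-squares approach used in the study can be sketched generically. This is a simulated linear example with assumed coefficients and an assumed instrument, not the Premier-database analysis:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50_000
z = rng.normal(size=n)                     # instrument
u = rng.normal(size=n)                     # unobserved confounder
x = 0.7 * z + u + rng.normal(size=n)       # endogenous treatment
y = 1.0 + 2.0 * x + 3.0 * u + rng.normal(size=n)   # true effect of x is 2.0

# Stage 1: project the endogenous treatment onto the instrument
Z = np.column_stack([np.ones(n), z])
x_hat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]

# Stage 2: regress the outcome on the fitted treatment
X2 = np.column_stack([np.ones(n), x_hat])
b_2sls = np.linalg.lstsq(X2, y, rcond=None)[0]
```

Replacing the endogenous regressor with its first-stage fitted values purges the correlation with the unobserved confounder, so the second-stage slope lands near the true effect of 2.0. (Production code should also correct the second-stage standard errors for the generated regressor.)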
37

Milla, Joniada. "The Canadian University Selectivity Premium." Review of Economic Analysis 10, no. 4 (October 10, 2018): 313–49. http://dx.doi.org/10.15353/rea.v10i4.1473.

Full text
Abstract:
This paper surveys the recent empirical literature on wage premium to university selectivity, and provides new evidence to this literature on a country such as Canada that has a distinct higher education system from those already analyzed. I estimate the wage premium to university selectivity using Canadian data and two popular methods to correct for non-random selection in universities of different quality: matching methods and instrumental variables (IV). I estimate a wage premium of 7% using the matching estimator, and a premium of 14.8% using the IV estimator for alumni of selective Canadian universities 4--6 years after graduation. My findings are in line with the literature on countries with a moderately differentiated higher education system that has low variation in tuition fees and is well supported by public funds.
38

Hou, Jianyun, Xuexi Huo, and Runsheng Yin. "Does computer usage change farmers’ production and consumption? Evidence from China." China Agricultural Economic Review 11, no. 2 (May 7, 2019): 387–410. http://dx.doi.org/10.1108/caer-09-2016-0149.

Full text
Abstract:
Purpose The purpose of this paper is to explore the impact of using computers to obtain information on farm households' production and consumption, based on a field survey of farm households in northern China. Design/methodology/approach The most important methods applied are the instrumental variable (IV) method and the propensity score matching (PSM) method. Estimators from the IV, PSM and nearest neighborhood matching approaches are considered together to check the robustness of the empirical results. Findings The paper's careful impact evaluation results suggest that computer use not only increases the amount of arable land rented in but also reduces family labor input intensity and the probability of selling agricultural outputs at farm-gate markets. Moreover, it also stimulates transportation, garment, housing and insurance expenditures per capita. Research limitations/implications The database of this research comprises cross-section data, which does not support a cross-time comparison. Practical implications These results imply that it is vital to expand the coverage of computer use in rural areas, and that improving computer access is crucial for stimulating rural consumption. Expanding internet network coverage in western areas is also important. Originality/value First, the authors directly estimate the impacts of computer usage on a broad range of production and consumption indicators, including land-related investments, variable investments, labor input and household expenditure, and provide rigorous impact evaluations of access to computers. Second, the authors use IV and PSM methods to correct self-selection bias, going beyond the single-equation approach of other studies. This enables them to identify the causal relationship between computer usage and farmers' production and consumption decisions.
39

Reitzug, Fabian, Stephen P. Luby, Hemant K. Pullabhotla, and Pascal Geldsetzer. "The Effect of Particulate Matter Exposure During Pregnancy on Pregnancy and Child Health Outcomes in South Asia: Protocol for an Instrumental Variable Analysis." JMIR Research Protocols 11, no. 8 (August 10, 2022): e35249. http://dx.doi.org/10.2196/35249.

Full text
Abstract:
Background Determining the longer-term health effects of air pollution has been difficult owing to the multitude of potential confounding variables in the relationship between air pollution and health. Air pollution in many areas of South Asia is seasonal, with large spikes in particulate matter (PM) concentration occurring in the winter months. This study exploits this seasonal variation in PM concentration through a natural experiment. Objective This project aims to determine the causal effect of PM exposure during pregnancy on pregnancy and child health outcomes. Methods We will use an instrumental variable (IV) design whereby the estimated month of conception is our instrument for exposure to PM with a diameter less than 2.5 μm (PM2.5) during pregnancy. We will assess the plausibility of our assumption that timing of conception is exogenous with regard to our outcomes of interest and will adjust for date of monsoon onset to control for confounding variables related to harvest timing. Our outcomes are 1) birth weight, 2) pregnancy termination resulting in miscarriage, abortion, or still birth, 3) neonatal death, 4) infant death, and 5) child death. We will use data from the Demographic and Health Surveys (DHS) conducted in relevant regions of Bangladesh, India, Nepal, and Pakistan, along with monthly gridded data on PM2.5 concentration (0.1°×0.1° spatial resolution), precipitation data (0.5°×0.5° resolution), temperature data (0.5°×0.5°), and agricultural land use data (0.1°×0.1° resolution). Results Data access to the relevant DHSs was granted on June 6, 2021 for India, Nepal, and Bangladesh; on August 24, 2021 for Pakistan; and on June 19, 2022 for the latest DHS from India. Conclusions If the assumptions for a causal interpretation of our instrumental variable analysis are met, this analysis will provide important causal evidence on the maternal and child health effects of PM2.5 exposure during pregnancy.
This evidence is important to inform personal behavior and interventions, such as the adoption of indoor air filtration during pregnancy, as well as environmental and health policy. International Registered Report Identifier (IRRID) DERR1-10.2196/35249
APA, Harvard, Vancouver, ISO, and other styles
40

Konstantopoulos, Spyros, and Anne Traynor. "Class Size Effects on Reading Achievement Using PIRLS Data: Evidence from Greece." Teachers College Record: The Voice of Scholarship in Education 116, no. 2 (February 2014): 1–29. http://dx.doi.org/10.1177/016146811411600202.

Full text
Abstract:
Background/Context The effects of class size on student achievement have gained considerable attention in education research and policy, especially over the last 30 years. Perhaps the best evidence about the effects of class size thus far has been produced from analyses of Project STAR data, a large-scale experiment where students and teachers were randomly assigned to smaller or larger classes within schools. Researchers have also analyzed observational data to examine the effects of class size, but the results have been mixed. Purpose/Objective/Research Question/Focus of Study It is generally difficult to draw causal inferences about class size effects with observational data because of the omitted variables problem. This shortcoming can be overcome with instrumental variables (IV) methods that are designed to facilitate causal inferences. The present study uses IV methods to examine the effects of class size on reading achievement using data from the 2001 fourth-grade sample of the Progress in International Reading Literacy Study (PIRLS) in Greece. We took advantage of Greece's nationwide rule about maximum class size in elementary schools to construct IV estimates of class size. Population PIRLS was designed to monitor children's achievement levels in fourth grade worldwide. We used reading achievement data from 2001 in Greece. The sample was a national probability sample of fourth graders. The use of appropriate weights helped us make projections to the fourth-grade student population in Greece in 2001. Research Design The research design was secondary analysis. We examined whether class size predicts reading achievement for fourth graders in Greece net of student, teacher/classroom, and school characteristics. We used multilevel models to capture the dependency in the data (i.e., students nested within schools). We also used instrumental variables methods to facilitate causal inferences about class size effects. 
Conclusions We investigated the effects of class size on reading achievement for fourth graders in Greece in 2001 using rich data from PIRLS. The results produced from the multilevel and the IV analyses were overall similar. Generally, the results indicated a positive association between class size and achievement. However, the association was typically statistically insignificant, especially when teacher/classroom and school variables were taken into account.
APA, Harvard, Vancouver, ISO, and other styles
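The class-size study above uses a maximum-class-size rule as an instrument precisely because omitted variables (e.g., unobserved student ability) bias ordinary regression. A minimal simulated sketch of the two-stage least squares (2SLS) logic — all variable names and data here are hypothetical, not from PIRLS:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Unobserved ability confounds the class-size/achievement relationship.
ability = rng.normal(size=n)
# Instrument: an enrollment-rule indicator that shifts class size but is
# assumed unrelated to ability (the exclusion restriction).
z = rng.integers(0, 2, size=n).astype(float)
class_size = 25.0 - 5.0 * z + ability + rng.normal(size=n)
reading = -0.2 * class_size - 2.0 * ability + rng.normal(size=n)

X = np.column_stack([np.ones(n), class_size])
Z = np.column_stack([np.ones(n), z])

# OLS is biased by the omitted variable (ability):
beta_ols = np.linalg.lstsq(X, reading, rcond=None)[0]

# 2SLS: first stage projects class size onto the instrument,
# second stage regresses reading on the fitted values.
X_hat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]
beta_iv = np.linalg.lstsq(X_hat, reading, rcond=None)[0]

print(beta_ols[1], beta_iv[1])  # OLS drifts away from -0.2; 2SLS recovers it
```

In the just-identified case shown here, the 2SLS slope coincides with the classical IV estimator; the design stands or falls on the exclusion restriction, which in the study above rests on the class-size rule being unrelated to achievement except through class size.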
41

Kline, Patrick, and Christopher R. Walters. "On Heckits, LATE, and Numerical Equivalence." Econometrica 87, no. 2 (2019): 677–96. http://dx.doi.org/10.3982/ecta15444.

Full text
Abstract:
Structural econometric methods are often criticized for being sensitive to functional form assumptions. We study parametric estimators of the local average treatment effect (LATE) derived from a widely used class of latent threshold crossing models and show they yield LATE estimates algebraically equivalent to the instrumental variables (IV) estimator. Our leading example is Heckman's (1979) two‐step (“Heckit”) control function estimator which, with two‐sided non‐compliance, can be used to compute estimates of a variety of causal parameters. Equivalence with IV is established for a semiparametric family of control function estimators and shown to hold at interior solutions for a class of maximum likelihood estimators. Our results suggest differences between structural and IV estimates often stem from disagreements about the target parameter rather than from functional form assumptions per se. In cases where equivalence fails, reporting structural estimates of LATE alongside IV provides a simple means of assessing the credibility of structural extrapolation exercises.
APA, Harvard, Vancouver, ISO, and other styles
42

Deaton, Angus. "Instruments, Randomization, and Learning about Development." Journal of Economic Literature 48, no. 2 (June 1, 2010): 424–55. http://dx.doi.org/10.1257/jel.48.2.424.

Full text
Abstract:
There is currently much debate about the effectiveness of foreign aid and about what kind of projects can engender economic development. There is skepticism about the ability of econometric analysis to resolve these issues or of development agencies to learn from their own experience. In response, there is increasing use in development economics of randomized controlled trials (RCTs) to accumulate credible knowledge of what works, without overreliance on questionable theory or statistical methods. When RCTs are not possible, the proponents of these methods advocate quasi-randomization through instrumental variable (IV) techniques or natural experiments. I argue that many of these applications are unlikely to recover quantities that are useful for policy or understanding: two key issues are the misunderstanding of exogeneity and the handling of heterogeneity. I illustrate from the literature on aid and growth. Actual randomization faces similar problems as does quasi-randomization, notwithstanding rhetoric to the contrary. I argue that experiments have no special ability to produce more credible knowledge than other methods, and that actual experiments are frequently subject to practical problems that undermine any claims to statistical or epistemic superiority. I illustrate using prominent experiments in development and elsewhere. As with IV methods, RCT-based evaluation of projects, without guidance from an understanding of underlying mechanisms, is unlikely to lead to scientific progress in the understanding of economic development. I welcome recent trends in development experimentation away from the evaluation of projects and toward the evaluation of theoretical mechanisms. (JEL C21, F35, O19)
APA, Harvard, Vancouver, ISO, and other styles
43

Xu, Ran, Richard P. DeShon, and Christopher R. Dishop. "Challenges and Opportunities in the Estimation of Dynamic Models." Organizational Research Methods 23, no. 4 (May 6, 2019): 595–619. http://dx.doi.org/10.1177/1094428119842638.

Full text
Abstract:
Interest in modeling longitudinal processes is increasing rapidly in organizational science. Organizational scholars often employ multilevel or hierarchical linear models (HLMs) to study such processes given that longitudinal data in organizational science typically consist of observations over a relatively small number of time intervals (T) nested within a relatively large number of units (N; e.g., people, teams, organizations). In this paper, we first distinguish change and dynamics as common research foci when modeling longitudinal processes and then demonstrate that a unique set of inferential hazards exists when investigating change or dynamics using multilevel models. Specifically, multilevel models that include one or more time-lagged values of the dependent variable as predictors often result in substantially biased estimates of the model parameters, inflated Type I error rates, and ultimately inaccurate inference. Using Monte Carlo simulations, we investigate the bias and Type I error rates for the standard centered/uncentered hierarchical linear model (HLM) and compare them with two alternative estimation methods: the Bollen and Brand structural equation modeling (SEM) approach and the Arellano and Bond generalized method of moments using instrumental variables (GMM-IV) approach. We find that the commonly applied hierarchical linear model performs poorly, whereas the SEM and GMM-IV approaches generally perform well, with the SEM approach yielding slightly better performance in small samples with large autoregressive effects. We recommend the Bollen and Brand SEM approach for general use when studying change or dynamics in organizational science.
APA, Harvard, Vancouver, ISO, and other styles
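The dynamic-panel bias described above can be seen in a few lines. This is a hypothetical simulation of the simplest member of the GMM-IV family discussed there (an Anderson–Hsiao-style estimator, which instruments the lagged difference with a lagged level); it is an illustrative sketch, not the authors' exact specification:

```python
import numpy as np

rng = np.random.default_rng(1)
N, T, rho = 2000, 6, 0.5

# Hypothetical panel: unit fixed effects alpha plus an AR(1) outcome.
alpha = rng.normal(size=N)
y = np.empty((N, T))
y[:, 0] = alpha / (1 - rho) + rng.normal(size=N)
for t in range(1, T):
    y[:, t] = alpha + rho * y[:, t - 1] + rng.normal(size=N)

# Pooled OLS of y_t on y_{t-1} ignores the fixed effects and is biased upward.
ylag, ycur = y[:, :-1].ravel(), y[:, 1:].ravel()
rho_pooled = (ylag * ycur).sum() / (ylag * ylag).sum()

# Anderson-Hsiao IV: first-difference out alpha, then instrument the
# endogenous lagged difference with the twice-lagged level y_{t-2}.
dy = y[:, 1:] - y[:, :-1]               # differences for t = 1..T-1
num = (y[:, :T - 2] * dy[:, 1:]).sum()  # sum of y_{t-2} times dy_t
den = (y[:, :T - 2] * dy[:, :-1]).sum() # sum of y_{t-2} times dy_{t-1}
rho_iv = num / den

print(rho_pooled, rho_iv)  # pooled estimate overshoots rho; IV is close to 0.5
```

The instrument y_{t-2} is correlated with the differenced lag but, by construction, uncorrelated with the differenced error, which is what restores consistency for the autoregressive parameter.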
44

Zhang, Wei. "Economic analysis of the environmental sustainability of agriculture: recent studies using quasi-experimental methods." China Agricultural Economic Review 14, no. 2 (January 12, 2022): 259–73. http://dx.doi.org/10.1108/caer-08-2021-0164.

Full text
Abstract:
Purpose The purpose of this review article is to demonstrate how the quasi-experimental approach has been used to study environmental and natural resource issues related to agricultural production. Design/methodology/approach This review article first provides a short introduction to the quasi-experimental approach using the potential outcomes framework and then uses studies on the environmental sustainability of agricultural production to illustrate how quasi-experimental methods have been applied. Papers reviewed consist of studies that estimate the environmental externalities from agricultural production, evaluate agri-environmental and other related policies and programs, and demonstrate issues related to on-farm resource use and climate adaptation. Findings Difference-in-differences (DID) and two-way fixed effects methods that utilize the spatial and temporal variation in panel data are widely used to estimate the causal impact of changes in agricultural production and policy on the environment. Utilizing the discontinuities and limits created by agricultural policies and regulations, local treatment effects on land and other input use are estimated using regression discontinuity (RD) or instrumental variable (IV) methods with cross-sectional data. Originality/value Challenges faced by the food systems have made agricultural sustainability more critical than ever. Over the past three decades, the quasi-experimental approach has become the powerhouse of applied economic research. This review article focuses on quasi-experimental studies on the environmental sustainability of agriculture to provide methodological insights and to highlight gaps in the economics literature of agricultural sustainability.
APA, Harvard, Vancouver, ISO, and other styles
45

Kuersteiner, Guido M. "EFFICIENT IV ESTIMATION FOR AUTOREGRESSIVE MODELS WITH CONDITIONAL HETEROSKEDASTICITY." Econometric Theory 18, no. 3 (May 15, 2002): 547–83. http://dx.doi.org/10.1017/s0266466602183010.

Full text
Abstract:
This paper analyzes autoregressive time series models where the errors are assumed to be martingale difference sequences that satisfy an additional symmetry condition on their fourth-order moments. Under these conditions quasi maximum likelihood estimators of the autoregressive parameters are no longer efficient in the generalized method of moments (GMM) sense. The main result of the paper is the construction of efficient semiparametric instrumental variables estimators for the autoregressive parameters. The optimal instruments are linear functions of the innovation sequence. It is shown that a frequency domain approximation of the optimal instruments leads to an estimator that only depends on the data periodogram and an unknown linear filter. Semiparametric methods to estimate the optimal filter are proposed. The procedure is equivalent to GMM estimators where lagged observations are used as instruments. As a result of the additional symmetry assumption on the fourth moments the number of instruments is allowed to grow at the same rate as the sample. No lag truncation parameters are needed to implement the estimator, which makes it particularly appealing from an applied point of view.
APA, Harvard, Vancouver, ISO, and other styles
46

Xie, Yong, Lin Wu, and Bo Liu. "Rationalism or Intuitionism: How Does Internet Use Affect the Perceptions of Social Fairness among Middle-Aged Groups in China?" International Journal of Environmental Research and Public Health 19, no. 16 (August 10, 2022): 9855. http://dx.doi.org/10.3390/ijerph19169855.

Full text
Abstract:
Background: In the digital age, the Internet has profoundly affected our production and life, which in turn has affected our mental health. However, little research has been conducted on when and how Internet use (IU) affects social fairness perception (SFP). Methods: Using the data of Chinese General Social Survey (CGSS) 2015, this paper identifies the causal effect of IU on Chinese middle-aged people’s SFP through Ordinary Least Square (OLS) regression and the instrumental variable (IV) method, and uses the Sobel and Bootstrap Test for mediation analysis. Results: IU not only directly reduces Chinese people’s SFP by channeling their social emotions, but also indirectly decreases SFP through the inspiration of government trust. However, inconsistent with some previous studies, social comparison mainly has a partial masking effect on the causality between IU and SFP. Conclusions: The significant negative impact of IU on SFP is the result of the combination of rationalism and intuitionism.
APA, Harvard, Vancouver, ISO, and other styles
47

Nelson, Lyndsay A., Andrew J. Spieker, Lindsay S. Mayberry, Candace McNaughton, and Robert A. Greevy. "Estimating the impact of engagement with digital health interventions on patient outcomes in randomized trials." Journal of the American Medical Informatics Association 29, no. 1 (November 22, 2021): 128–36. http://dx.doi.org/10.1093/jamia/ocab254.

Full text
Abstract:
Abstract Objective Guidance is needed on studying engagement and treatment effects in digital health interventions, including levels required for benefit. We evaluated multiple analytic approaches for understanding the association between engagement and clinical outcomes. Materials and Methods We defined engagement as intervention participants’ response rate to interactive text messages, and considered moderation, standard regression, mediation, and a modified instrumental variable (IV) analysis to investigate the relationship between engagement and clinical outcomes. We applied each approach to two randomized controlled trials featuring text message content in the intervention: REACH (Rapid Encouragement/Education and Communications for Health), which targeted diabetes, and VERB (Vanderbilt Emergency Room Bundle), which targeted hypertension. Results In REACH, the treatment effect on hemoglobin A1c was estimated to be −0.73% (95% CI: [−1.29, −0.21]; P = 0.008), and in VERB, the treatment effect on systolic blood pressure was estimated to be −10.1 mmHg (95% CI: [−17.7, −2.8]; P = 0.007). Only the IV analyses suggested an effect of engagement on outcomes; the difference in treatment effects between engagers and non-engagers was −0.29% to −0.51% in the REACH study and −1.08 to −3.25 mmHg in the VERB study. Discussion Standard regression and mediation have less power than a modified IV analysis, but the IV approach requires specification of assumptions. This is the first review of the strengths and limitations of various approaches to evaluating the impact of engagement on outcomes. Conclusions Understanding the role of engagement in digital health interventions can help reveal when and how these interventions achieve desired outcomes.
APA, Harvard, Vancouver, ISO, and other styles
48

Logan, Jean, David Alexoff, and Joanna S. Fowler. "The Use of Alternative Forms of Graphical Analysis to Balance Bias and Precision in PET Images." Journal of Cerebral Blood Flow & Metabolism 31, no. 2 (September 1, 2010): 535–46. http://dx.doi.org/10.1038/jcbfm.2010.123.

Full text
Abstract:
Graphical analysis (GA) is an efficient method for estimating total tissue distribution volume (VT) from positron emission tomography (PET) uptake data. The original GA produces a negative bias in VT in the presence of noise. Estimates of VT using other GA forms have less bias but less precision. Here, we show how the bias terms are related between the GA methods and how using an instrumental variable (IV) can also reduce bias. Results are based on simulations of a two-compartment model with VT's ranging from 10.5 to 64 mL/cm3 and from PET image data with the tracer [11C]DASB ([11C]-3-amino-4-(2-dimethylaminomethyl-phenylsulfanyl) benzonitrile). Four estimates of VT (or distribution volume ratio (DVR) using a reference tissue) can be easily computed from different formulations of GA including the IV. As noise affects the estimates from all four differently, they generally do not provide the same estimates. By taking the median value of the four estimates, we can decrease the bias and reduce the effect of large values contributing to noisy images. The variance of the four estimates can serve as a guide to the reliability of the median estimate. This may provide a general method for the generation of parametric images with little bias and good precision.
APA, Harvard, Vancouver, ISO, and other styles
49

Gajra, Ajeet, William P. Tew, Molly Hardt, Supriya Gupta Mohile, Cynthia Owusu, Heidi D. Klepin, Cary P. Gross, Stuart M. Lichtman, Vani Katheria, and Arti Hurria. "Predictors of early discontinuation of chemotherapy (EDC) in patients (pts) age 65 or older with stage IV non-small cell lung cancer (NSCLC)." Journal of Clinical Oncology 30, no. 15_suppl (May 20, 2012): e18133-e18133. http://dx.doi.org/10.1200/jco.2012.30.15_suppl.e18133.

Full text
Abstract:
e18133 Background: NSCLC accounts for the highest cancer related mortality. Older pts suffer higher chemotherapy (chemo) toxicity (tox). Inability to complete a planned chemo course is associated with poor outcome. We studied factors associated with EDC (defined as chemo duration of ≤ 6 weeks) in pts ≥ age 65 with stage IV NSCLC. Methods: We conducted a subset analysis of pts with stage IV NSCLC (n=100) enrolled in a multi-site prospective study (n=500) which identified predictors of chemo tox in pts age > 65 (Hurria et al, JCO 2011). Bivariate analysis and multivariate logistic regression were utilized to identify factors associated with EDC among pts with stage IV NSCLC including: sociodemographics, chemo factors (1st vs. 2nd + line; single vs doublet), labs (hemoglobin [Hb], liver function, creatinine clearance, albumin) and geriatric assessment (GA) variables (functional status, comorbidity, social support, psychological, cognitive and nutritional status). Results: Of the 100 pts (median age 73, range 65-89) with stage IV NSCLC, 33 had EDC. Of these 67% was for disease progression, 15% tox and 15% for both progression and tox. Majority received 1st line (58%) and doublet chemo (67%). In bivariate analysis (p values <0.01 for all), pts with EDC were more likely to receive single agent (52% vs. 24%) and 2nd + line chemo (70% vs 28%) and have lower mean Hb (11.3 vs 12.7 g/dl) and albumin levels (3.4 vs 4.0 g/dl). GA variables associated with EDC include poorer physical function [lower score on Medical Outcomes Study physical health (MOS physical), MD rated Karnofsky performance status (KPS), self rated KPS, and MOS social activity, and need assistance with instrumental activities of daily living; p<0.01 for each variable]. On multivariate analysis, factors independently associated with EDC include 2nd + line chemo (OR=5.93; 95%CI = 2.25-15.61) and lower MOS physical score of <70 (OR= 4.19; 95% CI= 1.56-11.29).
Conclusions: In older pts with NSCLC, factors associated with chemo duration of < 6 weeks include 2nd+ line chemo and poor physical function best assessed by MOS physical in this cohort. These simple factors may help the practicing oncologist identify older pts with NSCLC at greatest risk for EDC.
APA, Harvard, Vancouver, ISO, and other styles
50

Huang, Tao, Ming Ding, Helle K. M. Bergholdt, Tiange Wang, Yoriko Heianza, Dianjianyi Sun, Alexis C. Frazier-Wood, et al. "Dairy Consumption and Body Mass Index Among Adults: Mendelian Randomization Analysis of 184802 Individuals from 25 Studies." Clinical Chemistry 64, no. 1 (January 1, 2018): 183–91. http://dx.doi.org/10.1373/clinchem.2017.280701.

Full text
Abstract:
Abstract BACKGROUND Associations between dairy intake and body mass index (BMI) have been inconsistently observed in epidemiological studies, and the causal relationship remains ill defined. METHODS We performed Mendelian randomization (MR) analysis using an established dairy intake-associated genetic polymorphism located upstream of the lactase gene (LCT-13910 C/T, rs4988235) as an instrumental variable (IV). Linear regression models were fitted to analyze associations between (a) dairy intake and BMI, (b) rs4988235 and dairy intake, and (c) rs4988235 and BMI in each study. The causal effect of dairy intake on BMI was quantified by IV estimators among 184802 participants from 25 studies. RESULTS Higher dairy intake was associated with higher BMI (β = 0.03 kg/m² per serving/day; 95% CI, 0.00–0.06; P = 0.04), whereas the LCT genotype with 1 or 2 T alleles was significantly associated with 0.20 (95% CI, 0.14–0.25) serving/day higher dairy intake (P = 3.15 × 10⁻¹²) and 0.12 (95% CI, 0.06–0.17) kg/m² higher BMI (P = 2.11 × 10⁻⁵). MR analysis showed that the genetically determined higher dairy intake was significantly associated with higher BMI (β = 0.60 kg/m² per serving/day; 95% CI, 0.27–0.92; P = 3.0 × 10⁻⁴). CONCLUSIONS The present study provides strong evidence to support a causal effect of higher dairy intake on increased BMI among adults.
APA, Harvard, Vancouver, ISO, and other styles
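With a single instrument and no covariates, the IV estimator in a Mendelian randomization analysis reduces to the Wald ratio of the two gene-level associations. Using the point estimates reported in the abstract above (a back-of-the-envelope check, not a reanalysis of the study data):

```python
# Point estimates taken from the abstract above.
beta_gx = 0.20   # dairy servings/day per LCT T allele (gene -> exposure)
beta_gy = 0.12   # kg/m^2 of BMI per LCT T allele (gene -> outcome)

# Wald ratio IV estimator: causal effect of exposure on outcome.
wald = beta_gy / beta_gx
print(wald)  # 0.60 kg/m^2 per serving/day, matching the reported MR estimate
```

The agreement with the reported β = 0.60 kg/m² per serving/day reflects this algebraic identity; the full study additionally pools study-specific estimates and reports a confidence interval, which the ratio alone does not give.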