Academic literature on the topic 'Regression analysis'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Regression analysis.'
Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.
Journal articles on the topic "Regression analysis"
Asakura, Koko, and Toshimitsu Hamasaki. "Regression analysis." Drug Delivery System 31, no. 1 (2016): 72–81. http://dx.doi.org/10.2745/dds.31.72.
Anthony, Denis. "Regression analysis." Nurse Researcher 4, no. 4 (August 1997): 54–62. http://dx.doi.org/10.7748/nr.4.4.54.s6.
Anthony, Denis. "Regression analysis." Nurse Researcher 4, no. 1 (October 1996): 318–26. http://dx.doi.org/10.7748/nr1996.10.4.1.318.c6066.
Munro, Barbara Hazard. "Regression Analysis." Clinical Nurse Specialist 6, no. 2 (1992): 77. http://dx.doi.org/10.1097/00002800-199200620-00006.
Rawles, John, and J. C. Bignall. "Regression Analysis." Lancet 327, no. 8481 (March 1986): 614–15. http://dx.doi.org/10.1016/s0140-6736(86)92832-1.
Bland, J. M., and D. G. Altman. "Regression Analysis." Lancet 327, no. 8486 (April 1986): 908–9. http://dx.doi.org/10.1016/s0140-6736(86)91008-1.
Glaser, Elton. "Regression Analysis." Missouri Review 27, no. 1 (2004): 140–41. http://dx.doi.org/10.1353/mis.2004.0010.
Lewis, S. "Regression analysis." Practical Neurology 7, no. 4 (August 1, 2007): 259–64. http://dx.doi.org/10.1136/jnnp.2007.120055.
Bolshakova, Lyudmila Valentinovna. "Correlation and Regression Analysis of Economic Problems." Revista Gestão Inovação e Tecnologias 11, no. 3 (June 30, 2021): 2077–88. http://dx.doi.org/10.47059/revistageintec.v11i3.2074.
Srivastava, Vaibhav, Saksham Kaushik, and Ujjwal Raj. "Student Placement Package Prediction by Regression Analysis." International Journal of Research Publication and Reviews 5, no. 1 (January 24, 2024): 4724–29. http://dx.doi.org/10.55248/gengpi.5.0124.0345.
Dissertations / Theses on the topic "Regression analysis"
Sullwald, Wichard. "Grain regression analysis." Thesis, Stellenbosch : Stellenbosch University, 2014. http://hdl.handle.net/10019.1/86526.
ABSTRACT: Grain regression analysis forms an essential part of solid rocket motor simulation. In this thesis a numerical grain regression analysis module is developed as an alternative to cumbersome and time-consuming analytical methods. The surface regression is performed by the level-set method, a numerical interface-advancement scheme. A novel approach is proposed for integrating the surface area and volume of a numerical interface, defined implicitly in a level-set framework, by means of Monte Carlo integration. The grain regression module is directly coupled to a quasi-1D internal ballistics solver in an on-line fashion, in order to take into account the effects of spatially varying burn-rate distributions. A multi-timescale approach is proposed for the direct coupling of the two solvers.
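The Monte Carlo integration described in the abstract above can be illustrated in a few lines: sample points uniformly inside a bounding box and count the fraction falling where the level-set function is negative. This is a generic sketch of the technique, not the thesis's implementation; the sphere signed-distance function and all names are invented for the example.

```python
import numpy as np

def monte_carlo_volume(phi, bounds, n_samples=200_000, seed=0):
    """Estimate the volume of the region {x : phi(x) < 0} inside a
    rectangular bounding box by uniform Monte Carlo sampling."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    pts = rng.uniform(lo, hi, size=(n_samples, lo.size))
    inside = phi(pts) < 0.0        # points interior to the implicit surface
    return np.prod(hi - lo) * inside.mean()

# Level-set (signed-distance) function of a sphere of radius 0.5
sphere = lambda p: np.linalg.norm(p, axis=1) - 0.5

vol = monte_carlo_volume(sphere, ([-1, -1, -1], [1, 1, 1]))
# Compare with the analytic volume (4/3) * pi * 0.5**3
```

The estimator converges at the usual O(n^{-1/2}) Monte Carlo rate regardless of how complicated the implicit surface is, which is what makes it attractive for evolving level-set geometries.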
Dai, Elin, and Lara Güleryüz. "Factors that influence condominium pricing in Stockholm: A regression analysis." Thesis, KTH, Matematisk statistik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-254235.
This study investigates which factors matter when predicting the prices of condominiums in Stockholm's inner city. Using multiple linear regression, a transformation of the response variable, and several model-refinement methods, a final out-of-sample-validated model with a 95% confidence interval was obtained. The statistical analysis was carried out in R. The study is limited to the districts of Stockholm's inner city with postal codes in the range 112-118, so the model should only be applied to these areas, since they enter the model as regressors. The final sale prices analysed were from January 2014 to April 2019; currency volatility over this period was not considered as an economic factor. The final model comprises the following variables: floor, living area, monthly fee, year of construction, and district.
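As a rough illustration of the method this thesis describes (multiple linear regression with a transformed response), the sketch below fits ordinary least squares to log-price on synthetic data. All variables and coefficients are invented for the example, and the thesis itself used R rather than Python.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# Hypothetical regressors standing in for the thesis's variables
area = rng.uniform(20, 120, n)        # living area, m^2
fee = rng.uniform(1000, 6000, n)      # monthly fee, SEK
floor = rng.integers(0, 8, n)         # floor number

# Synthetic prices generated from a log-linear model plus noise
log_price = (12 + 0.012 * area - 0.00005 * fee + 0.02 * floor
             + rng.normal(0, 0.05, n))

# Log-transform the response and fit by ordinary least squares
X = np.column_stack([np.ones(n), area, fee, floor])
beta, *_ = np.linalg.lstsq(X, log_price, rcond=None)

resid = log_price - X @ beta
ss_res = resid @ resid
ss_tot = ((log_price - log_price.mean()) ** 2).sum()
r2 = 1 - ss_res / ss_tot              # in-sample fit on the log scale
```

A log-transformed response is the standard way to handle the right-skew typical of housing prices; predictions on the original scale are recovered with `np.exp`.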
Zuo, Yanling. "Monotone regression functions." Thesis, University of British Columbia, 1990. http://hdl.handle.net/2429/29457.
Faculty of Science, Department of Statistics, Graduate.
Ryu, Duchwan. "Regression analysis with longitudinal measurements." Texas A&M University, 2005. http://hdl.handle.net/1969.1/2398.
Full textCampbell, Ian. "The geometry of regression analysis." Thesis, University of Ottawa (Canada), 1989. http://hdl.handle.net/10393/5755.
Full textWiencierz, Andrea. "Regression analysis with imprecise data." Diss., Ludwig-Maximilians-Universität München, 2013. http://nbn-resolving.de/urn:nbn:de:bvb:19-166786.
Methods of statistical data analysis generally assume that the available data are precise and correct observations of the quantities of interest. In practical studies, however, the values of interest can often be observed only incompletely or imprecisely. This thesis addresses the question of how regression analyses can be meaningfully conducted with imprecise data. Various approaches to dealing with imprecisely observed variables are first discussed, before a new likelihood-based methodology for regression with imprecise data is introduced. In this approach, the result of the regression analysis is not a single regression function but the entire set of regression functions that are plausible in the light of the data, which can be interpreted as a confidence region for the relationship under investigation. Within this methodology, a regression method is then developed that is very general with respect to the form of the imprecise observations, the possible distributions of the random variables, and the form of the functional relationship between the variables of interest. In addition, an exact algorithm is developed for the special case of simple linear regression with interval data, and several statistical properties of the method are examined. The proposed regression method turns out to be robust in the sense of a high breakdown point and to yield very reliable results, reflected in a high coverage probability of the result set. A further chapter discusses in detail an alternative approach from the literature based on support vector regression, which is generalized further by embedding it in the methodological framework of the likelihood-based approach introduced earlier. Finally, the regression methods considered are applied to two practical problems.
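The central idea of this thesis, reporting the whole set of regression functions that are plausible given imprecise data, can be illustrated crudely for interval-valued responses: every selection of precise values inside the observation intervals yields a candidate fit. The sketch below approximates the resulting band of lines by random selection; it illustrates the concept only and is not the exact algorithm developed in the thesis.

```python
import numpy as np

rng = np.random.default_rng(2)

# Interval-valued observations: each response is only known to lie in [lo_i, hi_i]
x = np.linspace(0, 10, 25)
y_center = 1.5 + 0.8 * x
lo = y_center - rng.uniform(0.2, 1.0, x.size)
hi = y_center + rng.uniform(0.2, 1.0, x.size)

# Approximate the set of plausible lines: fit OLS to many random
# selections of precise values inside the intervals
slopes = []
for _ in range(2000):
    y = rng.uniform(lo, hi)
    slope, intercept = np.polyfit(x, y, 1)
    slopes.append(slope)

slope_band = (min(slopes), max(slopes))   # range of data-plausible slopes
```

The band of slopes collected this way behaves like the confidence-region interpretation mentioned above: wider observation intervals produce a wider set of plausible regression functions.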
Jeffrey, Stephen Glenn. "Quantile regression and frontier analysis." Thesis, University of Warwick, 2012. http://wrap.warwick.ac.uk/47747/.
Ranganai, Edmore. "Aspects of model development using regression quantiles and elemental regressions." Thesis, Stellenbosch : Stellenbosch University, 2007. http://hdl.handle.net/10019.1/18668.
ABSTRACT: It is well known that ordinary least squares (OLS) procedures are sensitive to deviations from the classical Gaussian assumptions (outliers) as well as data aberrations in the design space. The two major data aberrations in the design space are collinearity and high leverage. Leverage points can also induce or hide collinearity in the design space; such leverage points are referred to as collinearity-influential points. As a consequence, over the years, many diagnostic tools to detect these anomalies, as well as alternative procedures to counter them, were developed. To counter deviations from the classical Gaussian assumptions, many robust procedures have been proposed. One such class of procedures is the Koenker and Bassett (1978) Regression Quantiles (RQs), which are natural extensions of order statistics to the linear model. RQs can be found as solutions to linear programming problems (LPs). The basic optimal solutions to these LPs (which are RQs) correspond to elemental subset (ES) regressions, which consist of subsets of minimum size to estimate the necessary parameters of the model. On the one hand, some ESs correspond to RQs. On the other hand, the literature shows that many OLS statistics (estimators) are related to ES regression statistics (estimators). There is therefore an inherent relationship amongst the three sets of procedures. The relationship between the ES procedure and the RQ one has been noted almost "casually" in the literature, while the latter has been fairly widely explored. Using these existing relationships between the ES procedure and the OLS one, as well as new ones, collinearity, leverage and outlier problems in the RQ scenario were investigated. Also, a lasso procedure was proposed as a variable selection technique in the RQ scenario and some tentative results were given for it. These results are promising. Single-case diagnostics were considered as well as their relationships to multiple-case ones.
In particular, multiple cases of the minimum size to estimate the necessary parameters of the model were considered, corresponding to a RQ (ES). In this way regression diagnostics were developed for both ESs and RQs. The main problems that affect RQs adversely are collinearity and leverage, due to the nature of the computational procedures and the fact that RQs' influence functions are unbounded in the design space but bounded in the response variable. As a consequence, RQs have a high affinity for leverage points and a high exclusion rate of outliers. The influential picture exhibited in the presence of both leverage points and outliers is the net result of these two antagonistic forces. Although RQs are bounded in the response variable (and therefore fairly robust to outliers), outlier diagnostics were also considered in order to have a more holistic picture. The investigations comprised analytic means as well as simulation. Furthermore, applications were made to artificial computer-generated data sets as well as standard data sets from the literature. These revealed that the ES-based statistics can be used to address problems arising in the RQ scenario with some degree of success. However, due to the interdependence between the different aspects, viz. that between leverage and collinearity and that between leverage and outliers, "solutions" are often dependent on the particular situation. In spite of this complexity, the research did produce some fairly general guidelines that can be fruitfully used in practice.
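The abstract notes that regression quantiles (RQs) arise as basic optimal solutions of linear programs. In that formulation, the τ-th RQ minimizes τ·Σuᵢ + (1−τ)·Σvᵢ subject to Xb + u − v = y with u, v ≥ 0. A minimal sketch of that LP using scipy's solver follows; the data and function names are illustrative, not taken from the thesis.

```python
import numpy as np
from scipy.optimize import linprog

def regression_quantile(X, y, tau=0.5):
    """Koenker-Bassett regression quantile via linear programming:
    minimize tau*sum(u) + (1-tau)*sum(v) subject to X b + u - v = y,
    with b split into nonnegative parts b+ and b-."""
    n, p = X.shape
    # Decision vector: [b+ (p), b- (p), u (n), v (n)]
    c = np.concatenate([np.zeros(2 * p),
                        tau * np.ones(n), (1 - tau) * np.ones(n)])
    A_eq = np.hstack([X, -X, np.eye(n), -np.eye(n)])
    res = linprog(c, A_eq=A_eq, b_eq=y,
                  bounds=[(0, None)] * (2 * p + 2 * n), method="highs")
    return res.x[:p] - res.x[p:2 * p]

rng = np.random.default_rng(3)
x = rng.uniform(0, 10, 80)
y = 2.0 + 0.5 * x + rng.standard_t(df=3, size=80)   # heavy-tailed noise
X = np.column_stack([np.ones(80), x])
beta_median = regression_quantile(X, y, tau=0.5)    # intercept, slope
```

With τ = 0.5 this is median (L1) regression, which illustrates the robustness to response outliers discussed above; leverage points in x would still pull the fit, exactly as the abstract warns.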
Lo, Sau Yee. "Measurement error in logistic regression model /." View abstract or full-text, 2004. http://library.ust.hk/cgi/db/thesis.pl?MATH%202004%20LO.
Includes bibliographical references (leaves 82-83). Also available in electronic version. Access restricted to campus users.
Kulich, Michal. "Additive hazards regression with incomplete covariate data /." Thesis, Connect to this title online; UW restricted, 1997. http://hdl.handle.net/1773/9562.
Books on the topic "Regression analysis"
Sen, Ashish, and Muni Srivastava. Regression Analysis. New York, NY: Springer New York, 1990. http://dx.doi.org/10.1007/978-1-4612-4470-7.
Sen, Ashish, and Muni Srivastava. Regression Analysis. Berlin, Heidelberg: Springer Berlin Heidelberg, 1990. http://dx.doi.org/10.1007/978-3-662-25092-1.
Lewis-Beck, Michael S. Regression Analysis. Thousand Oaks, CA: Sage, 1994.
Draper, N. R., and Harry Smith. Applied Regression Analysis. 3rd ed. New York: Wiley, 1998.
Schroeder, Larry, David Sjoquist, and Paula Stephan. Understanding Regression Analysis. Newbury Park, CA: SAGE Publications, Inc., 1986. http://dx.doi.org/10.4135/9781412986410.
Westfall, Peter H., and Andrea L. Arias. Understanding Regression Analysis. Boca Raton, FL: Chapman and Hall/CRC, 2020. http://dx.doi.org/10.1201/9781003025764.
von Rosen, Dietrich. Bilinear Regression Analysis. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-78784-8.
Rawlings, John O., Sastry G. Pantula, and David A. Dickey, eds. Applied Regression Analysis. New York: Springer-Verlag, 1998. http://dx.doi.org/10.1007/b98890.
Thrane, Christer. Applied Regression Analysis. Abingdon, Oxon; New York, NY: Routledge, 2019. http://dx.doi.org/10.4324/9780429443756.
Book chapters on the topic "Regression analysis"
Arkes, Jeremy. "Summarizing thoughts." In Regression Analysis, 352–63. 2nd ed. London: Routledge, 2022. http://dx.doi.org/10.4324/9781003285007-14.
Arkes, Jeremy. "Time-series models." In Regression Analysis, 287–314. 2nd ed. London: Routledge, 2022. http://dx.doi.org/10.4324/9781003285007-10.
Arkes, Jeremy. "Regression analysis basics." In Regression Analysis, 12–50. 2nd ed. London: Routledge, 2022. http://dx.doi.org/10.4324/9781003285007-2.
Arkes, Jeremy. "Methods to address biases." In Regression Analysis, 225–65. 2nd ed. London: Routledge, 2022. http://dx.doi.org/10.4324/9781003285007-8.
Arkes, Jeremy. "Introduction." In Regression Analysis, 1–11. 2nd ed. London: Routledge, 2022. http://dx.doi.org/10.4324/9781003285007-1.
Arkes, Jeremy. "What could go wrong when estimating causal effects?" In Regression Analysis, 132–207. 2nd ed. London: Routledge, 2022. http://dx.doi.org/10.4324/9781003285007-6.
Arkes, Jeremy. "Strategies for other regression objectives." In Regression Analysis, 208–24. 2nd ed. London: Routledge, 2022. http://dx.doi.org/10.4324/9781003285007-7.
Arkes, Jeremy. "What does “holding other factors constant” mean?" In Regression Analysis, 67–89. 2nd ed. London: Routledge, 2022. http://dx.doi.org/10.4324/9781003285007-4.
Arkes, Jeremy. "How to conduct a research project." In Regression Analysis, 331–42. 2nd ed. London: Routledge, 2022. http://dx.doi.org/10.4324/9781003285007-12.
Arkes, Jeremy. "The ethics of regression analysis." In Regression Analysis, 343–51. 2nd ed. London: Routledge, 2022. http://dx.doi.org/10.4324/9781003285007-13.
Conference papers on the topic "Regression analysis"
Saragih, Jason. "Principal regression analysis." In 2011 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2011. http://dx.doi.org/10.1109/cvpr.2011.5995618.
Chivukula, V. N. Aditya Datta, and Sri Keshava Reddy Adupala. "Music Signal Analysis: Regression Analysis." In 2nd International Conference on Machine Learning, IOT and Blockchain (MLIOB 2021). Academy and Industry Research Collaboration Center (AIRCC), 2021. http://dx.doi.org/10.5121/csit.2021.111205.
Khan, Mohiuddeen, and Kanishk Srivastava. "Regression Model for Better Generalization and Regression Analysis." In ICMLSC 2020: The 4th International Conference on Machine Learning and Soft Computing. New York, NY, USA: ACM, 2020. http://dx.doi.org/10.1145/3380688.3380691.
Podgurski, Andy. "Session details: Regression testing." In ISSTA '08: International Symposium on Software Testing and Analysis. New York, NY, USA: ACM, 2008. http://dx.doi.org/10.1145/3260628.
van Erp, N., and P. van Gelder. "Bayesian logistic regression analysis." In Bayesian Inference and Maximum Entropy Methods in Science and Engineering: 32nd International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering. AIP, 2013. http://dx.doi.org/10.1063/1.4819994.
Krasotkina, O., and V. Mottl. "Adaptive nonstationary regression analysis." In 2008 19th International Conference on Pattern Recognition (ICPR). IEEE, 2008. http://dx.doi.org/10.1109/icpr.2008.4761666.
Duller, Christine. "Model selection for logistic regression models." In Numerical Analysis and Applied Mathematics ICNAAM 2012: International Conference of Numerical Analysis and Applied Mathematics. AIP, 2012. http://dx.doi.org/10.1063/1.4756152.
Karim, Rezaul, Md Khorshed Alam, and Md Rezaul Hossain. "Stock Market Analysis Using Linear Regression and Decision Tree Regression." In 2021 1st International Conference on Emerging Smart Technologies and Applications (eSmarTA). IEEE, 2021. http://dx.doi.org/10.1109/esmarta52612.2021.9515762.
Kavitha S, Varuna S, and Ramya R. "A comparative analysis on linear regression and support vector regression." In 2016 Online International Conference on Green Engineering and Technologies (IC-GET). IEEE, 2016. http://dx.doi.org/10.1109/get.2016.7916627.
Araveeporn, Autcha, and Choojai Kuharatanachai. "Comparing Penalized Regression Analysis of Logistic Regression Model with Multicollinearity." In The 2019 2nd International Conference. New York, NY, USA: ACM Press, 2019. http://dx.doi.org/10.1145/3343485.3343487.
Reports on the topic "Regression analysis"
Burke, Kevin. Regression in Analysis. Fort Belvoir, VA: Defense Technical Information Center, November 2008. http://dx.doi.org/10.21236/ada494895.
Crowson, Michael. Regression and Moderation Analysis with PROCESS in SPSS. Instats Inc., 2022. http://dx.doi.org/10.61700/7h5yxd5cms1i1469.
Marchese, Malvina. Regression Analysis: Everything You Need to Know. Instats Inc., 2023. http://dx.doi.org/10.61700/758g2bp367fhc469.
Marchese, Malvina. Regression Analysis: Everything You Need to Know. Instats Inc., 2023. http://dx.doi.org/10.61700/s16tb8g7d585g469.
Landis, Ronald. Regression Analysis for Social and Health Science. Instats Inc., 2023. http://dx.doi.org/10.61700/me4ohse6jsdoy469.
Steed, Chad A., J. Edward Swan II, Patrick J. Fitzpatrick, and T. J. Jankun-Kelly. A Visual Analytics Approach for Correlation, Classification, and Regression Analysis. Office of Scientific and Technical Information (OSTI), February 2012. http://dx.doi.org/10.2172/1035521.
Cerulli, Giovanni. Non-Parametric Regression for Prediction and Scenario Analysis. Instats Inc., 2024. http://dx.doi.org/10.61700/h03w8dvg3h26b767.
Hutny, W. P., and J. T. Price. Analysis and Regression Model of Blast Furnace Coal Injection. Natural Resources Canada/ESS/Scientific and Technical Publishing Services, 1987. http://dx.doi.org/10.4095/304361.
Jacob, Brian, and Lars Lefgren. Remedial Education and Student Achievement: A Regression-Discontinuity Analysis. Cambridge, MA: National Bureau of Economic Research, May 2002. http://dx.doi.org/10.3386/w8918.
Kerr, William, Josh Lerner, and Antoinette Schoar. The Consequences of Entrepreneurial Finance: A Regression Discontinuity Analysis. Cambridge, MA: National Bureau of Economic Research, March 2010. http://dx.doi.org/10.3386/w15831.