Academic literature on the topic 'Regression'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Regression.'
Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.
Journal articles on the topic "Regression"
Sabzekar, Mostafa, and Seyed Mohammad Hossein Hasheminejad. "Robust regression using support vector regressions." Chaos, Solitons & Fractals 144 (March 2021): 110738. http://dx.doi.org/10.1016/j.chaos.2021.110738.
Phillips, Peter C. B. "Partitioned Regression with Rank-Deficient Regressions." Econometric Theory 8, no. 2 (June 1992): 307–9. http://dx.doi.org/10.1017/s0266466600012901.
Plaehn, Dave C., and David S. Lundahl. "Regression with multiple regressor arrays." Journal of Chemometrics 21, no. 12 (2007): 621–34. http://dx.doi.org/10.1002/cem.1092.
Steece, Bert M. "Regressor space outliers in ridge regression." Communications in Statistics - Theory and Methods 15, no. 12 (January 1986): 3599–605. http://dx.doi.org/10.1080/03610928608829333.
Samaniego, Angel. "CAPM-alpha estimation with robust regression vs. linear regression." Análisis Económico 38, no. 97 (January 20, 2023): 27–37. http://dx.doi.org/10.24275/uam/azc/dcsh/ae/2022v38n97/samaniego.
Duarte, Rigoberto Fimia, del Valle LD, Armiñana GR, Hidalgo MET, and Fimia DR. "Perspectives and Potentials of the Objective Regressive Regression Methodology in Terms of One Health." International Journal of Zoology and Animal Biology 7, no. 4 (2024): 1–6. http://dx.doi.org/10.23880/izab-16000603.
Moore, Roger H. "Regression Graphics: Ideas for Studying Regressions Through Graphics." Technometrics 41, no. 4 (November 1999): 368–69. http://dx.doi.org/10.1080/00401706.1999.10485937.
Kubáček, Lubomír. "Multistage regression model." Applications of Mathematics 31, no. 2 (1986): 89–96. http://dx.doi.org/10.21136/am.1986.104189.
Bouchenaki, F., K. Badache, N. Habchi, M. S. Benachour, and S. Bakhti. "Caudal Regression Syndrome." Clinical Research and Clinical Trials 4, no. 3 (September 29, 2021): 1–6. http://dx.doi.org/10.31579/2693-4779/066.
Sandler, J., and A. Sandler. "Regression und Anti-Regression." Zeitschrift für psychoanalytische Theorie und Praxis 22, no. 2 (2007): 147–58. http://dx.doi.org/10.15534/zptp/2007/2/2.
Full textDissertations / Theses on the topic "Regression"
Jacobs, Mary Christine. "Regression Trees Versus Stepwise Regression." UNF Digital Commons, 1992. http://digitalcommons.unf.edu/etd/145.
Ranganai, Edmore. "Aspects of model development using regression quantiles and elemental regressions." Thesis, Stellenbosch: Stellenbosch University, 2007. http://hdl.handle.net/10019.1/18668.
Full textENGLISH ABSTRACT: It is well known that ordinary least squares (OLS) procedures are sensitive to deviations from the classical Gaussian assumptions (outliers) as well as data aberrations in the design space. The two major data aberrations in the design space are collinearity and high leverage. Leverage points can also induce or hide collinearity in the design space. Such leverage points are referred to as collinearity influential points. As a consequence, over the years, many diagnostic tools to detect these anomalies as well as alternative procedures to counter them were developed. To counter deviations from the classical Gaussian assumptions many robust procedures have been proposed. One such class of procedures is the Koenker and Bassett (1978) Regressions Quantiles (RQs), which are natural extensions of order statistics, to the linear model. RQs can be found as solutions to linear programming problems (LPs). The basic optimal solutions to these LPs (which are RQs) correspond to elemental subset (ES) regressions, which consist of subsets of minimum size to estimate the necessary parameters of the model. On the one hand, some ESs correspond to RQs. On the other hand, in the literature it is shown that many OLS statistics (estimators) are related to ES regression statistics (estimators). Therefore there is an inherent relationship amongst the three sets of procedures. The relationship between the ES procedure and the RQ one, has been noted almost “casually” in the literature while the latter has been fairly widely explored. Using these existing relationships between the ES procedure and the OLS one as well as new ones, collinearity, leverage and outlier problems in the RQ scenario were investigated. Also, a lasso procedure was proposed as variable selection technique in the RQ scenario and some tentative results were given for it. These results are promising. Single case diagnostics were considered as well as their relationships to multiple case ones. 
In particular, multiple cases of the minimum size to estimate the necessary parameters of the model were considered, corresponding to an RQ (ES). In this way regression diagnostics were developed for both ESs and RQs. The main problems that affect RQs adversely are collinearity and leverage, due to the nature of the computational procedures and the fact that RQs' influence functions are unbounded in the design space but bounded in the response variable. As a consequence of this, RQs have a high affinity for leverage points and a high exclusion rate of outliers. The influential picture exhibited in the presence of both leverage points and outliers is the net result of these two antagonistic forces. Although RQs are bounded in the response variable (and therefore fairly robust to outliers), outlier diagnostics were also considered in order to obtain a more holistic picture. The investigation comprised analytic methods as well as simulation. Furthermore, applications were made to artificial, computer-generated data sets as well as standard data sets from the literature. These revealed that the ES-based statistics can be used to address problems arising in the RQ scenario with some degree of success. However, due to the interdependence between the different aspects, viz. the one between leverage and collinearity and the one between leverage and outliers, "solutions" are often dependent on the particular situation. In spite of this complexity, the research did produce some fairly general guidelines that can be fruitfully used in practice.
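The abstract above describes regression quantiles (RQs) as basic optimal solutions of linear programming problems, coinciding with elemental subset regressions of minimum size. As a hedged sketch of that idea (not code from the thesis; the synthetic data and tolerances are illustrative assumptions), the Koenker-Bassett check-loss minimisation for a quantile tau can be posed as an LP and solved with `scipy.optimize.linprog`:

```python
import numpy as np
from scipy.optimize import linprog

def regression_quantile(X, y, tau=0.5):
    """Koenker-Bassett regression quantile as a linear program:
    minimise sum(tau*u_plus + (1-tau)*u_minus)
    subject to y = X @ beta + u_plus - u_minus, with u_plus, u_minus >= 0."""
    n, p = X.shape
    c = np.concatenate([np.zeros(p), tau * np.ones(n), (1.0 - tau) * np.ones(n)])
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])  # y = X beta + u+ - u-
    bounds = [(None, None)] * p + [(0.0, None)] * (2 * n)  # beta free, slacks >= 0
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:p]

# Hypothetical synthetic data: y = 2 + 0.5*x + noise; median regression (tau = 0.5)
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 10.0, 50)
X = np.column_stack([np.ones(50), x])
y = 2.0 + 0.5 * x + rng.normal(0.0, 1.0, 50)
beta = regression_quantile(X, y, tau=0.5)
```

A basic optimal solution of this LP interpolates p observations exactly, which is the elemental subset connection the abstract refers to.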
McCubbin, Courtney C. "Regressive Play| An Investigation of Regression in the Analytic Container." Thesis, Pacifica Graduate Institute, 2019. http://pqdtopen.proquest.com/#viewpdf?dispub=13426903.
Full textThis thesis is a heuristic, hermeneutic investigation into regression using the author's experience as a case study. Regressive play and the desire for deeper regression within the analytic container are explored, guided by the question: What is the experience of following one's impulse to regress to more and more primordial states, and what kind of psychological container is needed to facilitate that deepening both inter- and intrapersonally? The author details a history of regression beginning with Sigmund Freud and continuing to psychoanalyst Michael Balint's basic fault, object relations therapist Donald Winnicott's regression to dependence, and Jungian analyst Brian Feldman's psychic skin. The therapeutic role of play is explored. The analyst's response to regression and how it facilitates or hinders the client's ability to regress are presented. This thesis challenges the notion that regression should be discouraged within a psychoanalytic frame, instead suggesting ways the analyst may hold the regression elementally.
Ishikawa, Noemi Ichihara. "Uso de transformações em modelos de regressão logística." Universidade de São Paulo, 2007. http://www.teses.usp.br/teses/disponiveis/45/45133/tde-05062007-202656/.
Binary data models are useful in many practical situations. In regression analysis, transformations can be applied to linearize or simplify the model and to correct deviations from its assumptions. In this dissertation, we show the use of transformations in logistic models for binary data, including models with additional parameters, to obtain more appropriate fits. We also present the cost of estimation when parameters are added to a model, hypothesis tests for the parameters of the Box-Cox logistic regression model and, finally, diagnostic methods to evaluate the influence of individual observations on the estimation of the transformation and covariate parameters, with applications to a real data set.
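The abstract above concerns logistic regression with a Box-Cox-transformed covariate. A minimal sketch of the underlying idea, assuming a single strictly positive covariate and a grid-profiled transformation parameter (this is an illustration under those assumptions, not the dissertation's method or data):

```python
import numpy as np

def boxcox(x, lam):
    # Box-Cox power transform for a strictly positive covariate
    return np.log(x) if abs(lam) < 1e-8 else (x**lam - 1.0) / lam

def logistic_mle(Z, y, iters=30):
    # Newton-Raphson (IRLS) for the logistic regression MLE
    beta = np.zeros(Z.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Z @ beta))
        W = p * (1.0 - p)
        beta += np.linalg.solve(Z.T @ (W[:, None] * Z), Z.T @ (y - p))
    p = 1.0 / (1.0 + np.exp(-Z @ beta))
    return beta, np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

def profile_boxcox_logistic(x, y, lambdas):
    # Profile the likelihood over the transformation parameter lambda
    fits = []
    for lam in lambdas:
        Z = np.column_stack([np.ones_like(x), boxcox(x, lam)])
        beta, ll = logistic_mle(Z, y)
        fits.append((ll, lam, beta))
    return max(fits)  # (best log-likelihood, lambda-hat, coefficients)

# Hypothetical data generated with a log link in the covariate (true lambda = 0)
rng = np.random.default_rng(1)
x = rng.uniform(0.5, 5.0, 2000)
eta = 1.5 * np.log(x) - 0.5
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-eta)))
ll, lam_hat, beta_hat = profile_boxcox_logistic(x, y, np.linspace(-1.0, 1.0, 9))
```

Hypothesis tests on the transformation parameter can then be based on comparing profiled log-likelihoods across candidate lambda values.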
Schwartz, Amanda Jo. "Adaptive Regression Testing Strategies for Cost-Effective Regression Testing." Diss., North Dakota State University, 2013. https://hdl.handle.net/10365/26926.
Full textNational Science Foundation
Williams, Ulyana P. "On Some Ridge Regression Estimators for Logistic Regression Models." FIU Digital Commons, 2018. https://digitalcommons.fiu.edu/etd/3667.
Full textSánchez, Lozano Enrique. "Continuous regression : a functional regression approach to facial landmark tracking." Thesis, University of Nottingham, 2017. http://eprints.nottingham.ac.uk/43300/.
Full textKazemi, Seyed Mehran. "Relational logistic regression." Thesis, University of British Columbia, 2014. http://hdl.handle.net/2429/50091.
Full textScience, Faculty of
Computer Science, Department of
Graduate
Zuo, Yanling. "Monotone regression functions." Thesis, University of British Columbia, 1990. http://hdl.handle.net/2429/29457.
Full textScience, Faculty of
Statistics, Department of
Graduate
Sullwald, Wichard. "Grain regression analysis." Thesis, Stellenbosch : Stellenbosch University, 2014. http://hdl.handle.net/10019.1/86526.
Full textENGLISH ABSTRACT: Grain regression analysis forms an essential part of solid rocket motor simulation. In this thesis a numerical grain regression analysis module is developed as an alternative to cumbersome and time consuming analytical methods. The surface regression is performed by the level-set method, a numerical interface advancement scheme. A novel approach to the integration of the surface area and volume of a numerical interface, as defined implicitly in a level-set framework, by means of Monte-Carlo integration is proposed. The grain regression module is directly coupled to a quasi -1D internal ballistics solver in an on-line fashion, in order to take into account the effects of spatially varying burn rate distributions. A multi-timescale approach is proposed for the direct coupling of the two solvers.
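The abstract above proposes Monte-Carlo integration of the volume of a grain defined implicitly in a level-set framework. A minimal sketch of that idea, assuming a spherical signed-distance function as a hypothetical stand-in for a grain shape (this is not the thesis's solver):

```python
import numpy as np

def mc_volume(phi, lo, hi, n=400_000, seed=0):
    """Monte-Carlo estimate of the volume of the region {x : phi(x) < 0}
    inside the bounding box [lo, hi]^3, where phi is a level-set function."""
    rng = np.random.default_rng(seed)
    pts = rng.uniform(lo, hi, size=(n, 3))          # uniform samples in the box
    box_volume = (hi - lo) ** 3
    return box_volume * np.mean(phi(pts) < 0.0)     # fraction inside * box volume

# Stand-in grain shape: signed distance to a sphere of radius 1
phi = lambda p: np.linalg.norm(p, axis=1) - 1.0
vol = mc_volume(phi, -1.5, 1.5)
# True volume is 4*pi/3
```

The surface area of the interface can be estimated in a similar Monte-Carlo fashion, e.g. via a smoothed delta function of the level-set field, though that is not shown here.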
Books on the topic "Regression"
Fahrmeir, Ludwig, Thomas Kneib, Stefan Lang, and Brian D. Marx. Regression. Berlin, Heidelberg: Springer Berlin Heidelberg, 2021. http://dx.doi.org/10.1007/978-3-662-63882-8.
Fahrmeir, Ludwig, Thomas Kneib, and Stefan Lang. Regression. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-01837-4.
Fahrmeir, Ludwig, Thomas Kneib, Stefan Lang, and Brian Marx. Regression. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-34333-9.
Bingham, N. H., and John M. Fry. Regression. London: Springer London, 2010. http://dx.doi.org/10.1007/978-1-84882-969-5.
Winter, Susan, and Hari Rajagopalan. Regression. Thousand Oaks, CA: SAGE Publications, Inc., 2023. http://dx.doi.org/10.4135/9781071908075.
Hao, Lingxin, and Daniel Naiman. Quantile Regression. Thousand Oaks, CA: SAGE Publications, Inc., 2007. http://dx.doi.org/10.4135/9781412985550.
Fox, John. Regression Diagnostics. Thousand Oaks, CA: SAGE Publications, Inc., 1991. http://dx.doi.org/10.4135/9781412985604.
Breen, Richard. Regression Models. Thousand Oaks, CA: SAGE Publications, Inc., 1996. http://dx.doi.org/10.4135/9781412985611.
Kalisch, Markus, and Lukas Meier. Logistische Regression. Wiesbaden: Springer Fachmedien Wiesbaden, 2021. http://dx.doi.org/10.1007/978-3-658-34225-8.
Groß, Jürgen. Linear Regression. Berlin, Heidelberg: Springer Berlin Heidelberg, 2003. http://dx.doi.org/10.1007/978-3-642-55864-1.
Full textBook chapters on the topic "Regression"
Fahrmeir, Ludwig, Thomas Kneib, Stefan Lang, and Brian Marx. "Introduction." In Regression, 1–19. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-34333-9_1.
Fahrmeir, Ludwig, Thomas Kneib, Stefan Lang, and Brian Marx. "Quantile Regression." In Regression, 597–620. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-34333-9_10.
Fahrmeir, Ludwig, Thomas Kneib, Stefan Lang, and Brian Marx. "Regression Models." In Regression, 21–72. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-34333-9_2.
Fahrmeir, Ludwig, Thomas Kneib, Stefan Lang, and Brian Marx. "The Classical Linear Model." In Regression, 73–175. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-34333-9_3.
Fahrmeir, Ludwig, Thomas Kneib, Stefan Lang, and Brian Marx. "Extensions of the Classical Linear Model." In Regression, 177–267. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-34333-9_4.
Fahrmeir, Ludwig, Thomas Kneib, Stefan Lang, and Brian Marx. "Generalized Linear Models." In Regression, 269–324. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-34333-9_5.
Fahrmeir, Ludwig, Thomas Kneib, Stefan Lang, and Brian Marx. "Categorical Regression Models." In Regression, 325–47. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-34333-9_6.
Fahrmeir, Ludwig, Thomas Kneib, Stefan Lang, and Brian Marx. "Mixed Models." In Regression, 349–412. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-34333-9_7.
Fahrmeir, Ludwig, Thomas Kneib, Stefan Lang, and Brian Marx. "Nonparametric Regression." In Regression, 413–533. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-34333-9_8.
Fahrmeir, Ludwig, Thomas Kneib, Stefan Lang, and Brian Marx. "Structured Additive Regression." In Regression, 535–95. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-34333-9_9.
Full textConference papers on the topic "Regression"
Das, Suddhasvatta. "Agile Regression Testing." In 2024 IEEE Conference on Software Testing, Verification and Validation (ICST), 457–59. IEEE, 2024. http://dx.doi.org/10.1109/icst60714.2024.00054.
Xia, Haifeng, Pu Wang, Toshiaki Koike-Akino, Ye Wang, Philip Orlik, and Zhengming Ding. "Adversarial Bi-Regressor Network for Domain Adaptive Regression." In Thirty-First International Joint Conference on Artificial Intelligence {IJCAI-22}. California: International Joint Conferences on Artificial Intelligence Organization, 2022. http://dx.doi.org/10.24963/ijcai.2022/501.
Mandre, Ananya, Deeksha R. Hebbar, J. Shreya Rao, Ananya Keshav, Shoaib Kamal, and Trupthi Rao. "Early Forest-Fire Detection by Linear Regression, Ridge Regression and Lasso Regression." In 2023 International Conference on Computational Intelligence for Information, Security and Communication Applications (CIISCA). IEEE, 2023. http://dx.doi.org/10.1109/ciisca59740.2023.00060.
García García, Catalina, Roman Salmerón Gómez, Claudia García García, and José García Pérez. "Raise regression mitigating collinearity in moderated regression." In The 5th Virtual International Conference on Advanced Research in Scientific Areas. Publishing Society, 2016. http://dx.doi.org/10.18638/arsa.2016.5.1.839.
Holcomb, Tyler R., and Manfred Morari. "Significance Regression: Robust Regression for Collinear Data." In 1993 American Control Conference. IEEE, 1993. http://dx.doi.org/10.23919/acc.1993.4793203.
Yu, Zhuonan. "Use of logistic regression, nonlinear regression, and linear regression in lung cancer research." In International Conference on Biological Engineering and Medical Science (ICBIOMed2022), edited by Gary Royle and Steven M. Lipkin. SPIE, 2023. http://dx.doi.org/10.1117/12.2669953.
Godlin, Benny, and Ofer Strichman. "Regression verification." In the 46th Annual Design Automation Conference. New York, New York, USA: ACM Press, 2009. http://dx.doi.org/10.1145/1629911.1630034.
Ben-Porat, Omer, and Moshe Tennenholtz. "Regression Equilibrium." In EC '19: ACM Conference on Economics and Computation. New York, NY, USA: ACM, 2019. http://dx.doi.org/10.1145/3328526.3329560.
Grzegorzewski, Przemyslaw. "Granular regression." In 2013 Joint IFSA World Congress and NAFIPS Annual Meeting (IFSA/NAFIPS). IEEE, 2013. http://dx.doi.org/10.1109/ifsa-nafips.2013.6608532.
Cai, Deng, Xiaofei He, and Jiawei Han. "Spectral regression." In the 15th international conference. New York, New York, USA: ACM Press, 2007. http://dx.doi.org/10.1145/1291233.1291329.
Full textReports on the topic "Regression"
Stouli, Sami, and Richard Spady. Dual regression. Institute for Fiscal Studies, January 2016. http://dx.doi.org/10.1920/wp.cem.2016.0516.
Stouli, Sami, and Richard Spady. Dual regression. The IFS, January 2019. http://dx.doi.org/10.1920/wp.cem.2019.01.
Stouli, Sami, and Richard Spady. Dual regression. The IFS, January 2019. http://dx.doi.org/10.1920/wp.cem.2019.0119.
Wallstrom, Timothy Clarke, and David Mitchell Higdon. Hierarchical Linear Regression. Office of Scientific and Technical Information (OSTI), January 2019. http://dx.doi.org/10.2172/1489929.
Burke, Kevin. Regression in Analysis. Fort Belvoir, VA: Defense Technical Information Center, November 2008. http://dx.doi.org/10.21236/ada494895.
Boggs, Paul T., and Janet R. Donaldson. Orthogonal distance regression. Gaithersburg, MD: National Institute of Standards and Technology, 1989. http://dx.doi.org/10.6028/nist.ir.89-4197.
Strichman, Ofer. Software Regression Verification. Fort Belvoir, VA: Defense Technical Information Center, December 2013. http://dx.doi.org/10.21236/ada594501.
Carlier, Guillaume, Alfred Galichon, and Victor Chernozhukov. Vector quantile regression. Institute for Fiscal Studies, December 2014. http://dx.doi.org/10.1920/wp.cem.2014.4814.
Lee, Sokbae (Simon), and Le-Yu Chen. Sparse Quantile Regression. The IFS, June 2020. http://dx.doi.org/10.1920/wp.cem.2020.3020.
Fauntleroy, Julia C., and Edward J. Wegman. Parallelizing Locally-Weighted Regression. Fort Belvoir, VA: Defense Technical Information Center, October 1994. http://dx.doi.org/10.21236/ada288294.