Academic literature on the topic 'Regression'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Regression.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Regression"

1

Sabzekar, Mostafa, and Seyed Mohammad Hossein Hasheminejad. "Robust regression using support vector regressions." Chaos, Solitons & Fractals 144 (March 2021): 110738. http://dx.doi.org/10.1016/j.chaos.2021.110738.

2

Phillips, Peter C. B. "Partitioned Regression with Rank-Deficient Regressions." Econometric Theory 8, no. 2 (June 1992): 307–9. http://dx.doi.org/10.1017/s0266466600012901.

3

Plaehn, Dave C., and David S. Lundahl. "Regression with multiple regressor arrays." Journal of Chemometrics 21, no. 12 (2007): 621–34. http://dx.doi.org/10.1002/cem.1092.

4

Steece, Bert M. "Regressor space outliers in ridge regression." Communications in Statistics - Theory and Methods 15, no. 12 (January 1986): 3599–605. http://dx.doi.org/10.1080/03610928608829333.

5

Samaniego, Angel. "CAPM-alpha estimation with robust regression vs. linear regression." Análisis Económico 38, no. 97 (January 20, 2023): 27–37. http://dx.doi.org/10.24275/uam/azc/dcsh/ae/2022v38n97/samaniego.

6

Duarte, Rigoberto Fimia, del Valle LD, Armiñana GR, Hidalgo MET, and Fimia DR. "Perspectives and Potentials of the Objective Regressive Regression Methodology in Terms of One Health." International Journal of Zoology and Animal Biology 7, no. 4 (2024): 1–6. http://dx.doi.org/10.23880/izab-16000603.

Abstract:
The possibility of having a methodology that allows the modelling and prediction, in the short, medium, and long term, of biological, social, and natural disaster processes and/or phenomena is of great value.
7

Moore, Roger H. "Regression Graphics: Ideas for Studying Regressions Through Graphics." Technometrics 41, no. 4 (November 1999): 368–69. http://dx.doi.org/10.1080/00401706.1999.10485937.

8

Kubáček, Lubomír. "Multistage regression model." Applications of Mathematics 31, no. 2 (1986): 89–96. http://dx.doi.org/10.21136/am.1986.104189.

9

Bouchenaki, F., K. Badache, N. Habchi, M. S. Benachour, and S. Bakhti. "Caudal Regression Syndrome." Clinical Research and Clinical Trials 4, no. 3 (September 29, 2021): 01–06. http://dx.doi.org/10.31579/2693-4779/066.

Abstract:
Caudal Regression Syndrome (CRS) is a rare malformation syndrome associating, to varying degrees, agenesis of the coccygeal or lumbosacral vertebrae. Clinically, this vertebral anomaly can range from simple coccygeal agenesis without any deficit to lumbosacral agenesis accompanied by sphincter disorders, with or without transit disorders, and various deficits involving the lower limbs. The syndrome may be accompanied by other orthopedic malformations, such as shortening of the lower limbs, and/or by gastrointestinal, genitourinary, and cardiovascular abnormalities. Its incidence is 1 to 5 cases per 100,000 births. Its precise cause has not yet been identified, but its relationship to maternal diabetes is well established. We report 5 patients with CRS from different clinics; sphincter disorders were at the forefront in all of them, and outcomes varied according to the presenting clinical picture. MRI made it possible to refine and confirm the diagnosis, highlighting the congenital anomaly and the associated lesions. We obtained 75% good results and 25% clinical stabilization, with no case of worsening or death. The interest lies in suspecting the diagnosis of CRS, documenting it early in the prenatal period, and determining its severity and associated abnormalities in order to present options for patient management, because once the diagnosis is made, surgical treatment becomes imperative due to the formidable neurological sequelae that compromise the functional prognosis.
10

Sandler, J., and A. Sandler. "Regression und Anti-Regression." Zeitschrift für psychoanalytische Theorie und Praxis 22, no. 2 (2007): 147–58. http://dx.doi.org/10.15534/zptp/2007/2/2.


Dissertations / Theses on the topic "Regression"

1

Jacobs, Mary Christine. "Regression Trees Versus Stepwise Regression." UNF Digital Commons, 1992. http://digitalcommons.unf.edu/etd/145.

Abstract:
Many methods have been developed to determine the "appropriate" subset of independent variables in a multiple variable problem. Some of the methods are application specific while others have a wide range of uses. This study compares two such methods, Regression Trees and Stepwise Regression. A simulation using a known distribution is used for the comparison. In 699 out of 742 cases the Regression Tree method gave better predictors than the Stepwise Regression procedure.
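To make the kind of comparison described above concrete, the sketch below (Python, and explicitly not the author's simulation design) fits a regression tree and a simple forward-stepwise linear model on the same simulated data and compares their held-out prediction error; the data-generating model, number of variables, and tuning choices are invented for illustration.

```python
# Minimal sketch (not the thesis's simulation): regression tree vs. a simple
# forward-stepwise linear model, compared by held-out mean squared error.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n, p = 500, 10
X = rng.normal(size=(n, p))
# Only the first three predictors matter; the remaining seven are noise.
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + X[:, 2] + rng.normal(scale=0.5, size=n)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

def forward_stepwise(X, y, max_vars=5):
    """Greedy forward selection by training-set residual sum of squares."""
    selected, remaining = [], list(range(X.shape[1]))
    while remaining and len(selected) < max_vars:
        scores = []
        for j in remaining:
            cols = selected + [j]
            fit = LinearRegression().fit(X[:, cols], y)
            scores.append((np.sum((y - fit.predict(X[:, cols])) ** 2), j))
        _, best_j = min(scores)
        selected.append(best_j)
        remaining.remove(best_j)
    return selected

cols = forward_stepwise(X_tr, y_tr)
stepwise = LinearRegression().fit(X_tr[:, cols], y_tr)
tree = DecisionTreeRegressor(max_depth=4, random_state=0).fit(X_tr, y_tr)

print("stepwise test MSE:", mean_squared_error(y_te, stepwise.predict(X_te[:, cols])))
print("tree test MSE:    ", mean_squared_error(y_te, tree.predict(X_te)))
```

On this deliberately linear toy data the stepwise model will usually come out ahead; the thesis draws its conclusion from a different, known distribution and a much larger set of repetitions.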
2

Ranganai, Edmore. "Aspects of model development using regression quantiles and elemental regressions." Thesis, Stellenbosch : Stellenbosch University, 2007. http://hdl.handle.net/10019.1/18668.

Abstract:
Dissertation (PhD)--University of Stellenbosch, 2007.
ENGLISH ABSTRACT: It is well known that ordinary least squares (OLS) procedures are sensitive to deviations from the classical Gaussian assumptions (outliers) as well as data aberrations in the design space. The two major data aberrations in the design space are collinearity and high leverage. Leverage points can also induce or hide collinearity in the design space. Such leverage points are referred to as collinearity influential points. As a consequence, over the years, many diagnostic tools to detect these anomalies as well as alternative procedures to counter them were developed. To counter deviations from the classical Gaussian assumptions many robust procedures have been proposed. One such class of procedures is the Koenker and Bassett (1978) Regressions Quantiles (RQs), which are natural extensions of order statistics, to the linear model. RQs can be found as solutions to linear programming problems (LPs). The basic optimal solutions to these LPs (which are RQs) correspond to elemental subset (ES) regressions, which consist of subsets of minimum size to estimate the necessary parameters of the model. On the one hand, some ESs correspond to RQs. On the other hand, in the literature it is shown that many OLS statistics (estimators) are related to ES regression statistics (estimators). Therefore there is an inherent relationship amongst the three sets of procedures. The relationship between the ES procedure and the RQ one, has been noted almost “casually” in the literature while the latter has been fairly widely explored. Using these existing relationships between the ES procedure and the OLS one as well as new ones, collinearity, leverage and outlier problems in the RQ scenario were investigated. Also, a lasso procedure was proposed as variable selection technique in the RQ scenario and some tentative results were given for it. These results are promising. Single case diagnostics were considered as well as their relationships to multiple case ones. In particular, multiple cases of the minimum size to estimate the necessary parameters of the model, were considered, corresponding to a RQ (ES). In this way regression diagnostics were developed for both ESs and RQs. The main problems that affect RQs adversely are collinearity and leverage due to the nature of the computational procedures and the fact that RQs’ influence functions are unbounded in the design space but bounded in the response variable. As a consequence of this, RQs have a high affinity for leverage points and a high exclusion rate of outliers. The influential picture exhibited in the presence of both leverage points and outliers is the net result of these two antagonistic forces. Although RQs are bounded in the response variable (and therefore fairly robust to outliers), outlier diagnostics were also considered in order to have a more holistic picture. The investigations used comprised analytic means as well as simulation. Furthermore, applications were made to artificial computer generated data sets as well as standard data sets from the literature. These revealed that the ES based statistics can be used to address problems arising in the RQ scenario to some degree of success. However, due to the interdependence between the different aspects, viz. the one between leverage and collinearity and the one between leverage and outliers, “solutions” are often dependent on the particular situation. In spite of this complexity, the research did produce some fairly general guidelines that can be fruitfully used in practice.
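As background for the regression quantiles (RQs) discussed above, here is a short, generic sketch (unrelated to the thesis's data) that fits two Koenker-Bassett regression quantiles with statsmodels; the heteroscedastic simulated data are assumed purely for illustration.

```python
# Generic illustration of Koenker-Bassett regression quantiles (RQs):
# fit the 0.5 (median) and 0.9 quantile regression lines.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, size=300)
# Heteroscedastic noise, so different quantile lines have different slopes.
y = 1.0 + 0.5 * x + rng.normal(scale=0.2 + 0.3 * x)
df = pd.DataFrame({"x": x, "y": y})

for q in (0.5, 0.9):
    res = smf.quantreg("y ~ x", df).fit(q=q)
    print(f"tau={q}: intercept={res.params['Intercept']:.3f}, "
          f"slope={res.params['x']:.3f}")
```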
3

McCubbin, Courtney C. "Regressive Play: An Investigation of Regression in the Analytic Container." Thesis, Pacifica Graduate Institute, 2019. http://pqdtopen.proquest.com/#viewpdf?dispub=13426903.

Abstract:

This thesis is a heuristic, hermeneutic investigation into regression using the author's experience as a case study. Regressive play and the desire for deeper regression within the analytic container are explored, guided by the question: What is the experience of following one's impulse to regress to more and more primordial states, and what kind of psychological container is needed to facilitate that deepening both inter- and intrapersonally? The author details a history of regression beginning with Sigmund Freud and continuing to psychoanalyst Michael Balint's basic fault, object relations therapist Donald Winnicott's regression to dependence, and Jungian analyst Brian Feldman's psychic skin. The therapeutic role of play is explored. The analyst's response to regression and how it facilitates or hinders the client's ability to regress are presented. This thesis challenges the notion that regression should be discouraged within a psychoanalytic frame, instead suggesting ways the analyst may hold the regression elementally.

4

Ishikawa, Noemi Ichihara. "Uso de transformações em modelos de regressão logística." Universidade de São Paulo, 2007. http://www.teses.usp.br/teses/disponiveis/45/45133/tde-05062007-202656/.

Abstract:
Models for binary data are widely used in practical situations. In regression analysis, transformations can be applied to linearize or simplify the model and to correct deviations from its assumptions. In this dissertation, we describe the use of transformations in logistic regression models for binary data and present models involving additional parameters in order to obtain a more adequate fit. We then analyze the cost of estimation when parameters are added to the models and present hypothesis tests for the parameters of the Box-Cox logistic regression model. Finally, we present some diagnostic methods for assessing the influence of observations on the estimates of the covariate transformation parameters, with an application to a real data set.
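To make the idea of a transformed covariate in a logistic regression concrete, here is a rough sketch rather than the dissertation's estimator: it picks a Box-Cox exponent for a positive covariate by a crude grid search over the logistic log-likelihood. The data-generating process, the grid, and the use of statsmodels are all assumptions made for illustration only.

```python
# Rough illustration (not the dissertation's method): choose a Box-Cox exponent
# for a positive covariate by maximizing the logistic log-likelihood over a grid,
# instead of entering the covariate on its original scale.
import numpy as np
from scipy.special import expit
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 1000
x = rng.lognormal(mean=0.0, sigma=0.7, size=n)   # positive covariate
eta = -1.0 + 1.5 * np.log(x)                     # true model uses log(x)
y = rng.binomial(1, expit(eta))

def boxcox(x, lam):
    """Box-Cox power transformation for positive x."""
    return np.log(x) if abs(lam) < 1e-12 else (x**lam - 1.0) / lam

best = None
for lam in np.linspace(-1.0, 2.0, 13):           # crude grid over lambda
    X = sm.add_constant(boxcox(x, lam))
    fit = sm.Logit(y, X).fit(disp=0)
    if best is None or fit.llf > best[0]:
        best = (fit.llf, lam, fit.params)

print("best lambda on the grid:", best[1])       # should land near 0 (log)
print("coefficients at best lambda:", best[2])
```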
5

Schwartz, Amanda Jo. "Adaptive Regression Testing Strategies for Cost-Effective Regression Testing." Diss., North Dakota State University, 2013. https://hdl.handle.net/10365/26926.

Abstract:
Regression testing is an important but expensive part of the software development life-cycle. Many different techniques have been proposed for reducing the cost of regression testing. To date, much research has been performed comparing regression testing techniques, but very little research has been performed to aid practitioners and researchers in choosing the most cost-effective technique for a particular regression testing session. One recent study investigated this problem and proposed Adaptive Regression Testing (ART) strategies to aid practitioners in choosing the most cost-effective technique for a specific version of a software system. The results of this study showed that the techniques chosen by the ART strategy were more cost-effective than techniques that did not consider system lifetime and testing processes. This work has several limitations, however. First, it only considers one ART strategy. There are many other strategies which could be developed and studied that could be more cost-effective. Second, the ART strategy used the Analytical Hierarchy Process (AHP). The AHP method is subjective to the weights made by the decision maker. Also, the AHP method is very time consuming because it requires many pairwise comparisons. Pairwise comparisons also limit the scalability of the approach and are often found to be inconsistent. This work proposes three new ART strategies to address these limitations. One strategy utilizing the fuzzy AHP method is proposed to address imprecision in the judgment made by the decision maker. A second strategy utilizing a fuzzy expert system is proposed to reduce the time required by the decision maker, eliminate inconsistencies due to pairwise comparisons, and increase scalability. A third strategy utilizing the Weighted Sum Model is proposed to study the performance of a simple, low cost strategy. Then, a series of empirical studies are performed to evaluate the new strategies. The results of the studies show that the strategies proposed in this work are more cost-effective than the strategy presented in the previous study.
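Of the three proposed strategies, the Weighted Sum Model (WSM) is simple enough to sketch. The snippet below is a generic WSM ranking with invented techniques, criteria, scores, and weights; it only shows the kind of per-version scoring such a strategy performs and is not the dissertation's actual criteria set.

```python
# Generic Weighted Sum Model (WSM) sketch with invented criteria and weights,
# showing how candidate regression-testing techniques could be ranked.
import numpy as np

techniques = ["retest-all", "coverage-based selection", "risk-based prioritization"]
criteria   = ["fault detection", "execution cost", "setup effort"]
benefit    = np.array([True, False, False])   # cost-type criteria are minimized

# Hypothetical raw scores: rows = techniques, columns = criteria.
raw = np.array([
    [0.95, 40.0, 2.0],
    [0.80, 12.0, 5.0],
    [0.85, 15.0, 3.0],
])
weights = np.array([0.5, 0.3, 0.2])           # invented decision-maker weights

# Normalize each column to [0, 1] and invert cost criteria so higher is better.
norm = (raw - raw.min(axis=0)) / (raw.max(axis=0) - raw.min(axis=0))
norm[:, ~benefit] = 1.0 - norm[:, ~benefit]

scores = norm @ weights
for name, s in sorted(zip(techniques, scores), key=lambda t: -t[1]):
    print(f"{name}: WSM score = {s:.3f}")
```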
6

Williams, Ulyana P. "On Some Ridge Regression Estimators for Logistic Regression Models." FIU Digital Commons, 2018. https://digitalcommons.fiu.edu/etd/3667.

Abstract:
The purpose of this research is to investigate the performance of some ridge regression estimators for the logistic regression model in the presence of moderate to high correlation among the explanatory variables. As performance criteria, we use the mean square error (MSE), the mean absolute percentage error (MAPE), the magnitude of bias, and the percentage of times the ridge regression estimator produces a higher MSE than the maximum likelihood estimator. A Monte Carlo simulation study has been executed to compare the performance of the ridge regression estimators under different experimental conditions. The degree of correlation, sample size, number of independent variables, and log odds ratio have been varied in the design of the experiment. Simulation results show that under certain conditions, the ridge regression estimators outperform the maximum likelihood estimator. Moreover, an empirical data analysis supports the main findings of this study. This thesis proposes and recommends some good ridge regression estimators of the logistic regression model for practitioners in the health, physical, and social sciences.
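A miniature version of this kind of comparison can be written in a few lines. The sketch below is not the thesis's Monte Carlo design: it simulates equicorrelated predictors and compares the coefficient mean squared error of a ridge-penalized logistic fit with a nearly unpenalized (approximately maximum likelihood) fit; sample size, correlation, and penalty strength are arbitrary choices.

```python
# Toy comparison (not the thesis's design): coefficient MSE of ridge-penalized
# vs. (near-)unpenalized logistic regression under highly correlated predictors.
import numpy as np
from scipy.special import expit
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n, p, rho, reps = 200, 5, 0.95, 200
beta = np.array([1.0, -1.0, 0.5, 0.0, 0.0])
cov = rho * np.ones((p, p)) + (1 - rho) * np.eye(p)   # equicorrelated design

mse_ridge, mse_ml = [], []
for _ in range(reps):
    X = rng.multivariate_normal(np.zeros(p), cov, size=n)
    y = rng.binomial(1, expit(X @ beta))
    ridge = LogisticRegression(penalty="l2", C=1.0, max_iter=1000).fit(X, y)
    # A very large C means almost no penalty, i.e. close to maximum likelihood.
    ml = LogisticRegression(penalty="l2", C=1e8, max_iter=5000).fit(X, y)
    mse_ridge.append(np.mean((ridge.coef_.ravel() - beta) ** 2))
    mse_ml.append(np.mean((ml.coef_.ravel() - beta) ** 2))

print("mean coefficient MSE, ridge:", np.mean(mse_ridge))
print("mean coefficient MSE, ~ML:  ", np.mean(mse_ml))
```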
7

Sánchez, Lozano Enrique. "Continuous regression : a functional regression approach to facial landmark tracking." Thesis, University of Nottingham, 2017. http://eprints.nottingham.ac.uk/43300/.

Abstract:
Facial Landmark Tracking (Face Tracking) is a key step for many Face Analysis systems, such as Face Recognition, Facial Expression Recognition, or Age and Gender Recognition, among others. The goal of Facial Landmark Tracking is to locate a sparse set of points defining a facial shape in a video sequence. These typically include the mouth, the eyes, the contour, or the nose tip. The state of the art method for Face Tracking builds on Cascaded Regression, in which a set of linear regressors are used in a cascaded fashion, each receiving as input the output of the previous one, subsequently reducing the error with respect to the target locations. Despite its impressive results, Cascaded Regression suffers from several drawbacks, which are basically caused by the theoretical and practical implications of using Linear Regression. Under the context of Face Alignment, Linear Regression is used to predict shape displacements from image features through a linear mapping. This linear mapping is learnt through the typical least-squares problem, in which a set of random perturbations is given. This means that, each time a new regressor is to be trained, Cascaded Regression needs to generate perturbations and apply the sampling again. Moreover, existing solutions are not capable of incorporating incremental learning in real time. It is well-known that person-specific models perform better than generic ones, and thus the possibility of personalising generic models whilst tracking is ongoing is a desired property, yet to be addressed. This thesis proposes Continuous Regression, a Functional Regression solution to the least-squares problem, resulting in the first real-time incremental face tracker. Briefly speaking, Continuous Regression approximates the samples by an estimation based on a first-order Taylor expansion yielding a closed-form solution for the infinite set of shape displacements. This way, it is possible to model the space of shape displacements as a continuum, without the need of using complex bases. Further, this thesis introduces a novel measure that allows Continuous Regression to be extended to spaces of correlated variables. This novel solution is incorporated into the Cascaded Regression framework, and its computational benefits for training under different configurations are shown. Then, it presents an approach for incremental learning within Cascaded Regression, and shows its complexity allows for real-time implementation. To the best of my knowledge, this is the first incremental face tracker that is shown to operate in real-time. The tracker is tested in an extensive benchmark, attaining state of the art results, thanks to the incremental learning capabilities.
8

Kazemi, Seyed Mehran. "Relational logistic regression." Thesis, University of British Columbia, 2014. http://hdl.handle.net/2429/50091.

Abstract:
Aggregation is a technique for representing conditional probability distributions as an analytic function of parents. Logistic regression is a commonly used representation for aggregators in Bayesian belief networks when a child has multiple parents. In this thesis, we consider extending logistic regression to directed relational models, where there are objects and relations among them, and we want to model varying populations and interactions among parents. We first examine the representational problems caused by population variation. We show how these problems arise even in simple cases with a single parametrized parent, and propose a linear relational logistic regression which we show can represent arbitrary linear (in population size) decision thresholds, whereas the traditional logistic regression cannot. Then we examine representing interactions among the parents of a child node, and representing non-linear dependency on population size. We propose a multi-parent relational logistic regression which can represent interactions among parents and arbitrary polynomial decision thresholds. We compare our relational logistic regression to Markov logic networks and represent their analogies and differences. Finally, we show how other well-known aggregators can be represented using relational logistic regression.
9

Zuo, Yanling. "Monotone regression functions." Thesis, University of British Columbia, 1990. http://hdl.handle.net/2429/29457.

Abstract:
In some applications, we require a monotone estimate of a regression function. In others, we want to test whether the regression function is monotone. For solving the first problem, Ramsay's, Kelly and Rice's, as well as point-wise monotone regression functions in a spline space are discussed and their properties developed. Three monotone estimates are defined: least-square regression splines, smoothing splines and binomial regression splines. The three estimates depend upon a "smoothing parameter": the number and location of knots in regression splines and the usual [formula omitted] in smoothing splines. Two standard techniques for choosing the smoothing parameter, GCV and AIC, are modified for monotone estimation, for the normal errors case. For answering the second question, a test statistic is proposed and its null distribution conjectured. Simulations are carried out to check the conjecture. These techniques are applied to two data sets.
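The thesis constructs monotone estimates from regression splines; as a related but simpler illustration of a monotone regression estimate, the sketch below uses isotonic regression, which likewise constrains the fitted function to be non-decreasing. The data and noise level are invented.

```python
# Illustration of a monotone (non-decreasing) regression estimate via isotonic
# regression; the thesis itself works with monotone regression splines.
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(4)
x = np.sort(rng.uniform(0, 10, size=200))
y = np.log1p(x) + rng.normal(scale=0.15, size=200)   # true curve is increasing

iso = IsotonicRegression(increasing=True)
y_fit = iso.fit_transform(x, y)

# The fitted values are guaranteed to be non-decreasing in x.
assert np.all(np.diff(y_fit) >= 0)
print("first five fitted values:", np.round(y_fit[:5], 3))
```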
10

Sullwald, Wichard. "Grain regression analysis." Thesis, Stellenbosch : Stellenbosch University, 2014. http://hdl.handle.net/10019.1/86526.

Abstract:
Thesis (MSc)--Stellenbosch University, 2014.
ENGLISH ABSTRACT: Grain regression analysis forms an essential part of solid rocket motor simulation. In this thesis a numerical grain regression analysis module is developed as an alternative to cumbersome and time consuming analytical methods. The surface regression is performed by the level-set method, a numerical interface advancement scheme. A novel approach to the integration of the surface area and volume of a numerical interface, as defined implicitly in a level-set framework, by means of Monte-Carlo integration is proposed. The grain regression module is directly coupled to a quasi -1D internal ballistics solver in an on-line fashion, in order to take into account the effects of spatially varying burn rate distributions. A multi-timescale approach is proposed for the direct coupling of the two solvers.
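The Monte Carlo integration idea mentioned in the abstract can be checked on a geometry with a known answer. The sketch below is a generic illustration rather than the thesis's solver: it estimates the volume enclosed by an implicitly defined level-set surface (here a sphere, so the exact volume is known) by uniform sampling in a bounding box.

```python
# Generic illustration of Monte Carlo integration of the volume enclosed by an
# implicit level-set surface phi(x) = 0; here phi describes a sphere so the
# estimate can be checked against the exact volume.
import numpy as np

rng = np.random.default_rng(5)
r = 0.8
phi = lambda pts: np.linalg.norm(pts, axis=1) - r   # phi < 0 inside the surface

n_samples = 1_000_000
box_lo, box_hi = -1.0, 1.0
pts = rng.uniform(box_lo, box_hi, size=(n_samples, 3))
box_volume = (box_hi - box_lo) ** 3

inside = phi(pts) < 0.0
volume_estimate = box_volume * inside.mean()
exact = 4.0 / 3.0 * np.pi * r**3

print(f"Monte Carlo volume estimate: {volume_estimate:.4f}")
print(f"Exact sphere volume:         {exact:.4f}")
```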

Books on the topic "Regression"

1

Fahrmeir, Ludwig, Thomas Kneib, Stefan Lang, and Brian D. Marx. Regression. Berlin, Heidelberg: Springer Berlin Heidelberg, 2021. http://dx.doi.org/10.1007/978-3-662-63882-8.

2

Fahrmeir, Ludwig, Thomas Kneib, and Stefan Lang. Regression. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-01837-4.

3

Fahrmeir, Ludwig, Thomas Kneib, Stefan Lang, and Brian Marx. Regression. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-34333-9.

4

Bingham, N. H., and John M. Fry. Regression. London: Springer London, 2010. http://dx.doi.org/10.1007/978-1-84882-969-5.

5

Winter, Susan, and Hari Rajagopalan. Regression. Thousand Oaks, CA: SAGE Publications, Inc., 2023. http://dx.doi.org/10.4135/9781071908075.

6

Hao, Lingxin, and Daniel Naiman. Quantile Regression. Thousand Oaks, CA: SAGE Publications, Inc., 2007. http://dx.doi.org/10.4135/9781412985550.

7

Fox, John. Regression Diagnostics. Thousand Oaks, CA: SAGE Publications, Inc., 1991. http://dx.doi.org/10.4135/9781412985604.

8

Breen, Richard. Regression Models. Thousand Oaks, CA: SAGE Publications, Inc., 1996. http://dx.doi.org/10.4135/9781412985611.

9

Kalisch, Markus, and Lukas Meier. Logistische Regression. Wiesbaden: Springer Fachmedien Wiesbaden, 2021. http://dx.doi.org/10.1007/978-3-658-34225-8.

10

Groß, Jürgen. Linear Regression. Berlin, Heidelberg: Springer Berlin Heidelberg, 2003. http://dx.doi.org/10.1007/978-3-642-55864-1.


Book chapters on the topic "Regression"

1

Fahrmeir, Ludwig, Thomas Kneib, Stefan Lang, and Brian Marx. "Introduction." In Regression, 1–19. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-34333-9_1.

2

Fahrmeir, Ludwig, Thomas Kneib, Stefan Lang, and Brian Marx. "Quantile Regression." In Regression, 597–620. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-34333-9_10.

3

Fahrmeir, Ludwig, Thomas Kneib, Stefan Lang, and Brian Marx. "Regression Models." In Regression, 21–72. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-34333-9_2.

4

Fahrmeir, Ludwig, Thomas Kneib, Stefan Lang, and Brian Marx. "The Classical Linear Model." In Regression, 73–175. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-34333-9_3.

5

Fahrmeir, Ludwig, Thomas Kneib, Stefan Lang, and Brian Marx. "Extensions of the Classical Linear Model." In Regression, 177–267. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-34333-9_4.

6

Fahrmeir, Ludwig, Thomas Kneib, Stefan Lang, and Brian Marx. "Generalized Linear Models." In Regression, 269–324. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-34333-9_5.

7

Fahrmeir, Ludwig, Thomas Kneib, Stefan Lang, and Brian Marx. "Categorical Regression Models." In Regression, 325–47. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-34333-9_6.

8

Fahrmeir, Ludwig, Thomas Kneib, Stefan Lang, and Brian Marx. "Mixed Models." In Regression, 349–412. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-34333-9_7.

9

Fahrmeir, Ludwig, Thomas Kneib, Stefan Lang, and Brian Marx. "Nonparametric Regression." In Regression, 413–533. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-34333-9_8.

10

Fahrmeir, Ludwig, Thomas Kneib, Stefan Lang, and Brian Marx. "Structured Additive Regression." In Regression, 535–95. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-34333-9_9.


Conference papers on the topic "Regression"

1

Das, Suddhasvatta. "Agile Regression Testing." In 2024 IEEE Conference on Software Testing, Verification and Validation (ICST), 457–59. IEEE, 2024. http://dx.doi.org/10.1109/icst60714.2024.00054.

2

Xia, Haifeng, Pu Wang, Toshiaki Koike-Akino, Ye Wang, Philip Orlik, and Zhengming Ding. "Adversarial Bi-Regressor Network for Domain Adaptive Regression." In Thirty-First International Joint Conference on Artificial Intelligence (IJCAI-22). California: International Joint Conferences on Artificial Intelligence Organization, 2022. http://dx.doi.org/10.24963/ijcai.2022/501.

Abstract:
Domain adaptation (DA) aims to transfer the knowledge of a well-labeled source domain to facilitate unlabeled target learning. When turning to specific tasks such as indoor (Wi-Fi) localization, it is essential to learn a cross-domain regressor to mitigate the domain shift. This paper proposes a novel method, Adversarial Bi-Regressor Network (ABRNet), to seek a more effective cross-domain regression model. Specifically, a discrepant bi-regressor architecture is developed to maximize the difference between the two regressors and thereby discover uncertain target instances far from the source distribution, and an adversarial training mechanism is then adopted between the feature extractor and the dual regressors to produce domain-invariant representations. To further bridge the large domain gap, a domain-specific augmentation module is designed to synthesize two source-similar and target-similar intermediate domains that gradually eliminate the original domain mismatch. Empirical studies on two cross-domain regression benchmarks illustrate the power of our method in solving the domain adaptive regression (DAR) problem.
3

Mandre, Ananya, Deeksha R. Hebbar, J. Shreya Rao, Ananya Keshav, Shoaib Kamal, and Trupthi Rao. "Early Forest-Fire Detection by Linear Regression, Ridge Regression And Lasso Regression." In 2023 International Conference on Computational Intelligence for Information, Security and Communication Applications (CIISCA). IEEE, 2023. http://dx.doi.org/10.1109/ciisca59740.2023.00060.

4

García García, Catalina, Roman Salmerón Gómez, Claudia García García, and José García Pérez. "Raise regression mitigating collinearity in moderated regression." In The 5th Virtual International Conference on Advanced Research in Scientific Areas. Publishing Society, 2016. http://dx.doi.org/10.18638/arsa.2016.5.1.839.

5

Holcomb, Tyler R., and Manfred Morari. "Significance Regression: Robust Regression for Collinear Data." In 1993 American Control Conference. IEEE, 1993. http://dx.doi.org/10.23919/acc.1993.4793203.

6

Yu, Zhuonan. "Use of logistic regression, nonlinear regression, and linear regression in lung cancer research." In International Conference on Biological Engineering and Medical Science (ICBIOMed2022), edited by Gary Royle and Steven M. Lipkin. SPIE, 2023. http://dx.doi.org/10.1117/12.2669953.

7

Godlin, Benny, and Ofer Strichman. "Regression verification." In the 46th Annual Design Automation Conference. New York, New York, USA: ACM Press, 2009. http://dx.doi.org/10.1145/1629911.1630034.

8

Ben-Porat, Omer, and Moshe Tennenholtz. "Regression Equilibrium." In EC '19: ACM Conference on Economics and Computation. New York, NY, USA: ACM, 2019. http://dx.doi.org/10.1145/3328526.3329560.

9

Grzegorzewski, Przemyslaw. "Granular regression." In 2013 Joint IFSA World Congress and NAFIPS Annual Meeting (IFSA/NAFIPS). IEEE, 2013. http://dx.doi.org/10.1109/ifsa-nafips.2013.6608532.

10

Cai, Deng, Xiaofei He, and Jiawei Han. "Spectral regression." In the 15th international conference. New York, New York, USA: ACM Press, 2007. http://dx.doi.org/10.1145/1291233.1291329.


Reports on the topic "Regression"

1

Stouli, Sami, and Richard Spady. Dual regression. Institute for Fiscal Studies, January 2016. http://dx.doi.org/10.1920/wp.cem.2016.0516.

2

Stouli, Sami, and Richard Spady. Dual regression. The IFS, January 2019. http://dx.doi.org/10.1920/wp.cem.2019.01.

3

Stouli, Sami, and Richard Spady. Dual regression. The IFS, January 2019. http://dx.doi.org/10.1920/wp.cem.2019.0119.

4

Wallstrom, Timothy Clarke, and David Mitchell Higdon. Hierarchical Linear Regression. Office of Scientific and Technical Information (OSTI), January 2019. http://dx.doi.org/10.2172/1489929.

5

Burke, Kevin. Regression in Analysis. Fort Belvoir, VA: Defense Technical Information Center, November 2008. http://dx.doi.org/10.21236/ada494895.

6

Boggs, Paul T., and Janet R. Donaldson. Orthogonal distance regression. Gaithersburg, MD: National Institute of Standards and Technology, 1989. http://dx.doi.org/10.6028/nist.ir.89-4197.

7

Strichman, Ofer. Software Regression Verification. Fort Belvoir, VA: Defense Technical Information Center, December 2013. http://dx.doi.org/10.21236/ada594501.

8

Carlier, Guillaume, Alfred Galichon, and Victor Chernozhukov. Vector quantile regression. Institute for Fiscal Studies, December 2014. http://dx.doi.org/10.1920/wp.cem.2014.4814.

9

Lee, Sokbae (Simon), and Le-Yu Chen. Sparse Quantile Regression. The IFS, June 2020. http://dx.doi.org/10.1920/wp.cem.2020.3020.

10

Fauntleroy, Julia C., and Edward J. Wegman. Parallelizing Locally-Weighted Regression. Fort Belvoir, VA: Defense Technical Information Center, October 1994. http://dx.doi.org/10.21236/ada288294.

We offer discounts on all premium plans for authors whose works are included in thematic literature selections. Contact us to get a unique promo code!
