Academic literature on the topic 'Matrix linear regression'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Matrix linear regression.'
Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Journal articles on the topic "Matrix linear regression"
Lutay, V. N., and N. S. Khusainov. "The selective regularization of a linear regression model." Journal of Physics: Conference Series 2099, no. 1 (November 1, 2021): 012024. http://dx.doi.org/10.1088/1742-6596/2099/1/012024.
Nakonechnyi, Alexander G., Grigoriy I. Kudin, Petr N. Zinko, and Taras P. Zinko. "Perturbation Method in Problems of Linear Matrix Regression." Journal of Automation and Information Sciences 52, no. 1 (2020): 1–12. http://dx.doi.org/10.1615/jautomatinfscien.v52.i1.10.
Zhang, Jiawei, Peng Wang, and Ning Zhang. "Distribution Network Admittance Matrix Estimation With Linear Regression." IEEE Transactions on Power Systems 36, no. 5 (September 2021): 4896–99. http://dx.doi.org/10.1109/tpwrs.2021.3090250.
Ivashnev, L. I. "Methods of linear multiple regression in a matrix form." Izvestiya MGTU MAMI 9, no. 4-4 (August 20, 2015): 35–41. http://dx.doi.org/10.17816/2074-0530-67011.
Mahaboob, B., J. P. Praveen, B. V. A. Rao, Y. Harnath, C. Narayana, and G. B. Prakash. "A Study on Multiple Linear Regression Using Matrix Calculus." Advances in Mathematics: Scientific Journal 9, no. 7 (August 2, 2020): 4863–72. http://dx.doi.org/10.37418/amsj.9.7.52.
Aubin, Elisete da Conceição Q., and Gauss M. Cordeiro. "Bias in linear regression models with unknown covariance matrix." Communications in Statistics - Simulation and Computation 26, no. 3 (January 1997): 813–28. http://dx.doi.org/10.1080/03610919708813413.
Bargiela, Andrzej, and Joanna K. Hartley. "Orthogonal linear regression algorithm based on augmented matrix formulation." Computers & Operations Research 20, no. 8 (October 1993): 829–36. http://dx.doi.org/10.1016/0305-0548(93)90104-q.
Livadiotis, George. "Linear Regression with Optimal Rotation." Stats 2, no. 4 (September 28, 2019): 416–25. http://dx.doi.org/10.3390/stats2040028.
Klen, Kateryna, Vadym Martynyuk, and Mykhailo Yaremenko. "Prediction of the wind speed change function by linear regression method." Computational Problems of Electrical Engineering 9, no. 2 (November 10, 2019): 28–33. http://dx.doi.org/10.23939/jcpee2019.02.028.
Srivastava, A. K. "Estimation of linear regression model with rank deficient observations matrix under linear restrictions." Microelectronics Reliability 36, no. 1 (January 1996): 109–10. http://dx.doi.org/10.1016/0026-2714(95)00018-w.
Full textDissertations / Theses on the topic "Matrix linear regression"
Kuljus, Kristi. "Rank Estimation in Elliptical Models: Estimation of Structured Rank Covariance Matrices and Asymptotics for Heteroscedastic Linear Regression." Doctoral thesis, Uppsala universitet, Matematisk statistik, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-9305.
Shrewsbury, John Stephen. "Calibration of trip distribution by generalised linear models." Thesis, University of Canterbury. Department of Civil and Natural Resources Engineering, 2012. http://hdl.handle.net/10092/7685.
Wang, Shuo. "An Improved Meta-analysis for Analyzing Cylindrical-type Time Series Data with Applications to Forecasting Problem in Environmental Study." Digital WPI, 2015. https://digitalcommons.wpi.edu/etd-theses/386.
Kim, Jingu. "Nonnegative matrix and tensor factorizations, least squares problems, and applications." Diss., Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/42909.
Nasseri, Sahand. "Application of an Improved Transition Probability Matrix Based Crack Rating Prediction Methodology in Florida’s Highway Network." Scholar Commons, 2008. https://scholarcommons.usf.edu/etd/424.
Кір’ян, М. П. "Веб-система загальноосвітноьої школи з використанням алгоритму оцінювання та збору статистики" [Web system for a secondary school using an assessment and statistics-collection algorithm]. Master's thesis, Сумський державний університет, 2019. http://essuir.sumdu.edu.ua/handle/123456789/76750.
Bettache, Nayel. "Matrix-valued Time Series in High Dimension." Electronic Thesis or Diss., Institut polytechnique de Paris, 2024. http://www.theses.fr/2024IPPAG002.
The objective of this thesis is to model matrix-valued time series in a high-dimensional framework; to this end, the entire study is presented in a non-asymptotic setting. We first provide a test procedure capable of distinguishing whether the covariance matrix of centered random vectors with a centered stationary distribution is equal to the identity or has a sparse Toeplitz structure. Secondly, we propose an extension of low-rank matrix linear regression to a regression model with two matrix parameters, which creates correlations between the rows and the columns of the output random matrix. Finally, we introduce and estimate a dynamic topic model in which the expected value of the observations factorizes into a static matrix and a time-dependent matrix that follows a simplex-valued autoregressive process of order one.
Žiupsnys, Giedrius. "Klientų duomenų valdymas bankininkystėje" [Customer data management in banking]. Master's thesis, Lithuanian Academic Libraries Network (LABT), 2011. http://vddb.laba.lt/obj/LT-eLABa-0001:E.02~2010~D_20110709_152442-86545.
This work analyses regularities in banks' historical client credit data. First, bank information repositories are examined in order to understand the data. Data mining algorithms and software are then applied to data sets describing credit repayment history so as to estimate clients' insolvency risk. The first step of the analysis is preprocessing the information for data mining; various classification algorithms are then used to build models that classify the data sets and identify insolvent clients as accurately as possible. Besides classification, regression algorithms are analysed and prediction models are created that estimate how long a client will be late with a payment. Once this research is done, the resulting data marts and data flow schema are presented, together with the classification and regression algorithms and models that gave the best estimation results for the data sets.
NÓBREGA, Caio Santos Bezerra. "Uma estratégia para predição da taxa de aprendizagem do gradiente descendente para aceleração da fatoração de matrizes" [A strategy for predicting the gradient descent learning rate to accelerate matrix factorization]. Universidade Federal de Campina Grande, 2014. http://dspace.sti.ufcg.edu.br:8080/jspui/handle/riufcg/362.
Sugerir os produtos mais apropriados aos diversos tipos de consumidores não é uma tarefa trivial, apesar de ser um fator chave para aumentar satisfação e lealdade destes. Devido a esse fato, sistemas de recomendação têm se tornado uma ferramenta importante para diversas aplicações, tais como, comércio eletrônico, sites personalizados e redes sociais. Recentemente, a fatoração de matrizes se tornou a técnica mais bem sucedida de implementação de sistemas de recomendação. Os parâmetros do modelo de fatoração de matrizes são tipicamente aprendidos por meio de métodos numéricos, tal como o gradiente descendente. O desempenho do gradiente descendente está diretamente relacionado à configuração da taxa de aprendizagem, a qual é tipicamente configurada para valores pequenos, com o objetivo de não perder um mínimo local. Consequentemente, o algoritmo pode levar várias iterações para convergir. Idealmente, é desejada uma taxa de aprendizagem que conduza a um mínimo local nas primeiras iterações, mas isto é muito difícil de ser realizado dada a alta complexidade do espaço de valores a serem pesquisados. Começando com um estudo exploratório em várias bases de dados de sistemas de recomendação, observamos que, para a maioria das bases, há um padrão linear entre a taxa de aprendizagem e o número de iterações necessárias para atingir a convergência. A partir disso, propomos utilizar modelos de regressão lineares simples para predizer, para uma base de dados desconhecida, um bom valor para a taxa de aprendizagem inicial. A ideia é estimar uma taxa de aprendizagem que conduza o gradiente descendente a um mínimo local nas primeiras iterações. Avaliamos nossa técnica em 8 bases de sistemas de recomendação reais e comparamos com o algoritmo padrão, o qual utiliza um valor fixo para a taxa de aprendizagem, e com técnicas que adaptam a taxa de aprendizagem extraídas da literatura. Nós mostramos que conseguimos reduzir o número de iterações até em 40% quando comparados à abordagem padrão.
Suggesting the most suitable products to different types of consumers is not a trivial task, despite being a key factor for increasing their satisfaction and loyalty. Due to this fact, recommender systems have become an important tool for many applications, such as e-commerce, personalized websites and social networks. Recently, matrix factorization has become the most successful technique for implementing recommendation systems. The parameters of this model are typically learned by means of numerical methods, like gradient descent. The performance of gradient descent is directly related to the configuration of the learning rate, which is typically set to small values in order not to miss a local minimum. As a consequence, the algorithm may take several iterations to converge. Ideally, one wants to find a learning rate that will lead to a local minimum in the early iterations, but this is very difficult to achieve given the high complexity of the search space. Starting with an exploratory study on several recommendation system datasets, we observed that there is an overall linear relationship between the learning rate and the number of iterations needed until convergence. From this, we propose to use simple linear regression models to predict, for an unknown dataset, a good value for the initial learning rate. The idea is to estimate a learning rate that drives the gradient descent as close as possible to a local minimum in the first iterations. We evaluate our technique on 8 real-world recommender datasets and compare it with the standard matrix factorization learning algorithm, which uses a fixed value for the learning rate over all iterations, and with techniques from the literature that adapt the learning rate. We show that we can reduce the number of iterations by up to 40% compared to the standard approach.
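A hedged sketch of the core idea above: fit a simple linear model relating learning rate to iterations-to-convergence on previously studied datasets, then choose a rate for a new dataset. All numbers and names are illustrative assumptions, and inverting the fitted line for a target iteration budget is one possible reading, not the dissertation's exact procedure.

```python
# Illustrative sketch only; values are made up, not from the dissertation.
import numpy as np
from sklearn.linear_model import LinearRegression

# (learning rate, iterations until SGD matrix factorization converged)
# measured on previously studied recommender datasets (hypothetical values).
rates = np.array([0.001, 0.002, 0.005, 0.010, 0.020]).reshape(-1, 1)
iters = np.array([310, 240, 150, 90, 45], dtype=float)

model = LinearRegression().fit(rates, iters)   # iterations ~ a * rate + b

# Invert the fitted line to pick a rate expected to converge within a budget.
target_iterations = 30.0
suggested_rate = (target_iterations - model.intercept_) / model.coef_[0]
print(f"suggested initial learning rate: {suggested_rate:.4f}")
```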
Cavalcanti, Alexsandro Bezerra. "Aperfeiçoamento de métodos estatísticos em modelos de regressão da família exponencial" [Improvement of statistical methods in exponential family regression models]. Universidade de São Paulo, 2009. http://www.teses.usp.br/teses/disponiveis/45/45133/tde-05082009-170043/.
In this work, we develop three topics related to exponential family nonlinear regression. First, we obtain the asymptotic covariance matrix of order $n^{-2}$, where $n$ is the sample size, for the maximum likelihood estimators corrected by the bias of order $n^{-1}$ in generalized linear models, considering the precision parameter known. Second, we calculate an asymptotic formula of order $n^{-1/2}$ for the skewness of the distribution of the maximum likelihood estimators of the mean parameters and of the precision and dispersion parameters in exponential family nonlinear models, considering that the dispersion parameter is the same, although unknown, for all observations. Finally, we obtain Bartlett-type correction factors for the score test in exponential family nonlinear models assuming that the precision parameter is modelled by covariates. Monte Carlo simulation studies are carried out to evaluate the results obtained in the three topics.
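For context, the standard second-order relations this abstract alludes to, written in generic notation that is assumed here rather than taken from the thesis:

```latex
% Generic notation; the thesis's own expressions may differ in detail.
% First-order bias correction of the MLE \hat{\beta} in a GLM:
\[
  \tilde{\beta} \;=\; \hat{\beta} \;-\; B(\hat{\beta}),
  \qquad B(\beta) = \mathrm{E}(\hat{\beta}) - \beta = O(n^{-1}),
\]
% so that the corrected estimator satisfies
\[
  \mathrm{E}(\tilde{\beta}) = \beta + O(n^{-2}),
  \qquad
  \operatorname{Cov}(\tilde{\beta}) = K^{-1}(\beta) + O(n^{-2}),
\]
% where K(\beta) is the Fisher information matrix and n the sample size.
```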
Books on the topic "Matrix linear regression"
Puntanen, Simo, George P. H. Styan, and Jarkko Isotalo. Formulas Useful for Linear Regression Analysis and Related Matrix Theory. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-32931-9.
Grafarend, Erik. Linear and Nonlinear Models: Fixed Effects, Random Effects, and Total Least Squares. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012.
Formulas Useful for Linear Regression Analysis and Related Matrix Theory: It's Only Formulas but We Like Them. Springer, 2012.
Formulas Useful for Linear Regression Analysis and Related Matrix Theory: It's Only Formulas but We Like Them. Springer London, Limited, 2013.
Optimization of Objective Functions: Analytics. Numerical Methods. Design of Experiments. Moscow, Russia: Fizmatlit Publisher, 2009.
Sobczyk, Eugeniusz Jacek. Uciążliwość eksploatacji złóż węgla kamiennego wynikająca z warunków geologicznych i górniczych [The difficulty of mining hard coal deposits resulting from geological and mining conditions]. Instytut Gospodarki Surowcami Mineralnymi i Energią PAN, 2022. http://dx.doi.org/10.33223/onermin/0222.
Marques, Marcia Alessandra Arantes, ed. Estudos Avançados em Ciências Agrárias [Advanced Studies in Agricultural Sciences]. Bookerfield Editora, 2022. http://dx.doi.org/10.53268/bkf22040700.
Full textBook chapters on the topic "Matrix linear regression"
Groß, Jürgen. "Matrix Algebra." In Linear Regression, 331–58. Berlin, Heidelberg: Springer Berlin Heidelberg, 2003. http://dx.doi.org/10.1007/978-3-642-55864-1_7.
Groß, Jürgen. "The Covariance Matrix of the Error Vector." In Linear Regression, 259–91. Berlin, Heidelberg: Springer Berlin Heidelberg, 2003. http://dx.doi.org/10.1007/978-3-642-55864-1_5.
von Frese, Ralph R. B. "Matrix Linear Regression." In Basic Environmental Data Analysis for Scientists and Engineers, 127–40. Boca Raton, FL: CRC Press, Taylor & Francis Group, 2019. http://dx.doi.org/10.1201/9780429291210-7.
Brown, Jonathon D. "Simple Linear Regression." In Linear Models in Matrix Form, 39–67. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-11734-8_2.
Brown, Jonathon D. "Polynomial Regression." In Linear Models in Matrix Form, 341–75. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-11734-8_10.
Brown, Jonathon D. "Multiple Regression." In Linear Models in Matrix Form, 105–45. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-11734-8_4.
Lange, Kenneth. "Linear Regression and Matrix Inversion." In Numerical Analysis for Statisticians, 93–111. New York, NY: Springer New York, 2010. http://dx.doi.org/10.1007/978-1-4419-5945-4_7.
Dinov, Ivo D. "Linear Algebra, Matrix Computing, and Regression Modeling." In The Springer Series in Applied Machine Learning, 149–213. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-17483-4_3.
Hines, Benjamin, Yuriy Kuleshov, and Guoqi Qian. "Spatial Modelling of Linear Regression Coefficients for Gauge Measurements Against Satellite Estimates." In 2019-20 MATRIX Annals, 217–34. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-62497-2_11.
Puntanen, Simo, Jarkko Isotalo, and George P. H. Styan. "Formulas Useful for Linear Regression Analysis and Related Matrix Theory." In Formulas Useful for Linear Regression Analysis and Related Matrix Theory, 1–116. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-32931-9_1.
Full textConference papers on the topic "Matrix linear regression"
Chen, Xiaojun, Guowen Yuan, Feiping Nie, and Joshua Zhexue Huang. "Semi-supervised Feature Selection via Rescaled Linear Regression." In Twenty-Sixth International Joint Conference on Artificial Intelligence. California: International Joint Conferences on Artificial Intelligence Organization, 2017. http://dx.doi.org/10.24963/ijcai.2017/211.
Chou, Wu. "Maximum a posteriori linear regression with elliptically symmetric matrix variate priors." In 6th European Conference on Speech Communication and Technology (Eurospeech 1999). ISCA: ISCA, 1999. http://dx.doi.org/10.21437/eurospeech.1999-4.
Dufrenois, F., and J. C. Noyer. "Discriminative Hat Matrix: A new tool for outlier identification and linear regression." In 2011 International Joint Conference on Neural Networks (IJCNN 2011 - San Jose). IEEE, 2011. http://dx.doi.org/10.1109/ijcnn.2011.6033300.
Zhu, Youwen, Zhikuan Wang, Cheng Qian, and Jian Wang. "On efficiently harnessing cloud to securely solve linear regression and other matrix operations." In 2016 IEEE/ACM 24th International Symposium on Quality of Service (IWQoS). IEEE, 2016. http://dx.doi.org/10.1109/iwqos.2016.7590402.
Krishna, Y. Hari, G. V. Arunamayi, K. Ramesh Babu, S. Nanda Kishore, M. Rajaiah, and B. Mahaboob. "Several matrix algebra applications in linear regression analysis, information theory, ODE and geometry." In Fourth International Conference on Advances in Physical Sciences and Materials: ICAPSM 2023. AIP Publishing, 2024. http://dx.doi.org/10.1063/5.0216119.
Jing, Wang, Zhou Huizhi, Liu Dichen, Guo Ke, and Han Xiangyu. "Research on real-time admittance matrix identification based on WAMS and multiple linear regression." In 2014 IEEE PES Asia-Pacific Power and Energy Engineering Conference (APPEEC). IEEE, 2014. http://dx.doi.org/10.1109/appeec.2014.7066186.
Lopez, Oscar, Daniel Dunlavy, and Richard Lehoucq. "Zero-Truncated Poisson Regression for Multiway Count Data." Proposed for presentation at the Conference on Random Matrix Theory and Numerical Linear Algebra, June 20-24, 2022, Seattle, WA. US DOE, 2022. http://dx.doi.org/10.2172/2003556.
Thiyagarajan, A., and K. Anbazhagan. "Confusion matrix analysis of personal loan fraud detection using novel random forest algorithm and linear regression algorithm." In International Conference on Science, Engineering, and Technology 2022: Conference Proceedings. AIP Publishing, 2023. http://dx.doi.org/10.1063/5.0173705.
Wu, Hsiao-Chun, Shih Yu Chang, and Tho Le-Ngoc. "Efficient Rank-Adaptive Least-Square Estimation and Multiple-Parameter Linear Regression Using Novel Dyadically Recursive Hermitian Matrix Inversion." In 2008 International Wireless Communications and Mobile Computing Conference (IWCMC). IEEE, 2008. http://dx.doi.org/10.1109/iwcmc.2008.185.
Thiradathanapattaradecha, Thanapon, Roungsan Chaisricharoen, and Thongchai Yooyativong. "The strategic planning of e-commerce business to deployment with TOWS matrix by using K-mean and linear regression." In 2017 International Conference on Digital Arts, Media and Technology (ICDAMT). IEEE, 2017. http://dx.doi.org/10.1109/icdamt.2017.7905001.
Full textReports on the topic "Matrix linear regression"
Castellano, Mike J., Abraham G. Shaviv, Raphael Linker, and Matt Liebman. Improving nitrogen availability indicators by emphasizing correlations between gross nitrogen mineralization and the quality and quantity of labile soil organic matter fractions. United States Department of Agriculture, January 2012. http://dx.doi.org/10.32747/2012.7597926.bard.
Galili, Naftali, Roger P. Rohrbach, Itzhak Shmulevich, Yoram Fuchs, and Giora Zauberman. Non-Destructive Quality Sensing of High-Value Agricultural Commodities Through Response Analysis. United States Department of Agriculture, October 1994. http://dx.doi.org/10.32747/1994.7570549.bard.