Academic literature on the topic 'LASSO algorithm'
Journal articles on the topic "LASSO algorithm"
Kaneda, Yasuaki, and Yasuharu Irizuki. "Recursive Algorithm for LASSO." IEEJ Transactions on Electronics, Information and Systems 136, no. 7 (2016): 915–22. http://dx.doi.org/10.1541/ieejeiss.136.915.
Jain, Rahi, and Wei Xu. "HDSI: High dimensional selection with interactions algorithm on feature selection and testing." PLOS ONE 16, no. 2 (February 16, 2021): e0246159. http://dx.doi.org/10.1371/journal.pone.0246159.
Yau, Chun Yip, and Tsz Shing Hui. "LARS-type algorithm for group lasso." Statistics and Computing 27, no. 4 (May 23, 2016): 1041–48. http://dx.doi.org/10.1007/s11222-016-9669-7.
Alghamdi, Maryam A., Mohammad Ali Alghamdi, Naseer Shahzad, and Hong-Kun Xu. "Properties and Iterative Methods for the Q-Lasso." Abstract and Applied Analysis 2013 (2013): 1–8. http://dx.doi.org/10.1155/2013/250943.
Wang, Jin-Jia, and Yang Lu. "Coordinate Descent Based Hierarchical Interactive Lasso Penalized Logistic Regression and Its Application to Classification Problems." Mathematical Problems in Engineering 2014 (2014): 1–11. http://dx.doi.org/10.1155/2014/430201.
Liu, Yashu, Jie Wang, and Jieping Ye. "An Efficient Algorithm For Weak Hierarchical Lasso." ACM Transactions on Knowledge Discovery from Data 10, no. 3 (February 24, 2016): 1–24. http://dx.doi.org/10.1145/2791295.
Kim, Jinseog, Yuwon Kim, and Yongdai Kim. "A Gradient-Based Optimization Algorithm for LASSO." Journal of Computational and Graphical Statistics 17, no. 4 (December 2008): 994–1009. http://dx.doi.org/10.1198/106186008x386210.
Wang, Hao. "Coordinate descent algorithm for covariance graphical lasso." Statistics and Computing 24, no. 4 (February 23, 2013): 521–29. http://dx.doi.org/10.1007/s11222-013-9385-5.
Li, Yahui, Yang Li, and Yuanyuan Sun. "Online Static Security Assessment of Power Systems Based on Lasso Algorithm." Applied Sciences 8, no. 9 (August 23, 2018): 1442. http://dx.doi.org/10.3390/app8091442.
Keerthi, S. S., and S. Shevade. "A Fast Tracking Algorithm for Generalized LARS/LASSO." IEEE Transactions on Neural Networks 18, no. 6 (November 2007): 1826–30. http://dx.doi.org/10.1109/tnn.2007.900229.
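Several of the articles above (e.g. Wang 2013; Wang and Lu 2014) build on coordinate descent for Lasso-type problems. For orientation, the algorithmic core they share is the soft-thresholding update applied one coordinate at a time; the following is a minimal illustrative sketch of cyclic coordinate descent for the plain Lasso, not code from any of the cited papers:

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * |.| : shrink z toward zero by t.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Cyclic coordinate descent for min_b 0.5*||y - Xb||^2 + lam*||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)   # per-coordinate curvature ||x_j||^2
    r = y - X @ b                   # residual, maintained incrementally
    for _ in range(n_iter):
        for j in range(p):
            if col_sq[j] == 0.0:
                continue
            r += X[:, j] * b[j]     # remove coordinate j from the fit
            b[j] = soft_threshold(X[:, j] @ r, lam) / col_sq[j]
            r -= X[:, j] * b[j]     # add it back with the updated value
    return b
```

Maintaining the residual incrementally keeps each coordinate update at O(n) cost, which is the design choice that makes coordinate descent competitive in high dimensions.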
Dissertations / Theses on the topic "LASSO algorithm"
Zhang, Han. "Detecting Rare Haplotype-Environmental Interaction and Nonlinear Effects of Rare Haplotypes using Bayesian LASSO on Quantitative Traits." The Ohio State University, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=osu149969433115895.
Asif, Muhammad Salman. "Primal dual pursuit: a homotopy based algorithm for the Dantzig selector." Thesis, Atlanta, Ga.: Georgia Institute of Technology, 2008. http://hdl.handle.net/1853/24693.
Committee Chair: Romberg, Justin; Committee Member: McClellan, James; Committee Member: Mersereau, Russell.
Soret, Perrine. "Régression pénalisée de type Lasso pour l’analyse de données biologiques de grande dimension : application à la charge virale du VIH censurée par une limite de quantification et aux données compositionnelles du microbiote." Thesis, Bordeaux, 2019. http://www.theses.fr/2019BORD0254.
In clinical studies, thanks to technological progress, the amount of information collected for a single patient continues to grow, leading to situations where the number of explanatory variables exceeds the number of individuals. The Lasso method has proved appropriate for circumventing overfitting problems in high-dimensional settings. This thesis is devoted to the application and development of Lasso-penalized regression for clinical data with particular structures. First, in patients with the human immunodeficiency virus, mutations in the virus's genetic structure may be related to the development of drug resistance. Predicting the viral load from (potentially many) mutations helps guide treatment choice. Below a detection threshold the viral load is undetectable, so the data are left-censored. We propose two new Lasso approaches based on the Buckley-James algorithm, which imputes censored values by a conditional expectation. By reversing the response, we obtain a right-censored problem, for which non-parametric estimators of the conditional expectation have been proposed in survival analysis. Finally, we propose a parametric estimator based on a Gaussian hypothesis. Secondly, we are interested in the role of the microbiota in the deterioration of respiratory health. Microbiota data are presented as relative abundances (the proportion of each species per individual, called compositional data) and have a phylogenetic structure. We established a state of the art of statistical methods for analyzing microbiota data. Because these methods are recent, few recommendations exist on their applicability and effectiveness. A simulation study allowed us to compare the selection capacity of penalization methods proposed specifically for this type of data. We then apply this research to the analysis of the association between bacteria/fungi and the decline of pulmonary function in patients with cystic fibrosis from the MucoFong project.
Loth, Manuel. "Algorithmes d'Ensemble Actif pour le LASSO." PhD thesis, Université des Sciences et Technologie de Lille - Lille I, 2011. http://tel.archives-ouvertes.fr/tel-00845441.
Ounaissi, Daoud. "Méthodes quasi-Monte Carlo et Monte Carlo : application aux calculs des estimateurs Lasso et Lasso bayésien." Thesis, Lille 1, 2016. http://www.theses.fr/2016LIL10043/document.
The thesis contains six chapters. The first chapter introduces linear regression and the Lasso and Bayesian Lasso problems. Chapter 2 recalls convex optimization algorithms and presents the FISTA algorithm for computing the Lasso estimator; the convergence properties of this algorithm are also given, using the entropy estimator and the Pitman-Yor estimator. Chapter 3 is devoted to the comparison of Monte Carlo and quasi-Monte Carlo methods in the numerical calculation of the Bayesian Lasso; this comparison shows that Hammersley points give the best results. Chapter 4 gives a geometric interpretation of the partition function of the Bayesian Lasso, expressed in terms of the incomplete Gamma function, which allowed us to give a convergence criterion for the Metropolis-Hastings algorithm. Chapter 5 presents the Bayesian estimator as the limit law of a multivariate stochastic differential equation, which allowed us to compute the Bayesian Lasso using semi-implicit and explicit Euler schemes together with Monte Carlo, multilevel Monte Carlo (MLMC), and Metropolis-Hastings methods. A comparison of computational costs shows that the pair (semi-implicit Euler scheme, MLMC) beats the other (scheme, method) pairs. Finally, in Chapter 6 we establish the convergence rate of the Bayesian Lasso towards the Lasso when the signal-to-noise ratio is constant and the noise tends to 0, which provides a new convergence criterion for the Metropolis-Hastings algorithm.
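The abstract above cites FISTA as the workhorse for computing the Lasso estimator. As background, here is a minimal sketch of the standard FISTA iteration (Beck and Teboulle's accelerated proximal gradient method) applied to the Lasso; it is an illustration of the generic algorithm, not the thesis's own implementation:

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * |.|
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def fista_lasso(X, y, lam, n_iter=500):
    """FISTA for min_b 0.5*||y - Xb||^2 + lam*||b||_1."""
    p = X.shape[1]
    L = np.linalg.norm(X, 2) ** 2   # Lipschitz constant of the smooth gradient
    b = np.zeros(p)                 # current iterate
    z = b.copy()                    # extrapolated (momentum) point
    t = 1.0
    for _ in range(n_iter):
        grad = X.T @ (X @ z - y)
        b_new = soft_threshold(z - grad / L, lam / L)   # proximal gradient step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        z = b_new + ((t - 1.0) / t_new) * (b_new - b)   # Nesterov momentum
        b, t = b_new, t_new
    return b
```

The momentum step is what lifts the O(1/k) rate of plain proximal gradient to O(1/k^2) in objective value.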
Denoyelle, Quentin. "Theoretical and Numerical Analysis of Super-Resolution Without Grid." Thesis, Paris Sciences et Lettres (ComUE), 2018. http://www.theses.fr/2018PSLED030/document.
This thesis studies the noisy sparse spikes super-resolution problem for positive measures using the BLASSO, an infinite-dimensional convex optimization problem generalizing the LASSO to measures. First, we show that the support stability of the BLASSO for N clustered spikes is governed by an object called the (2N-1)-vanishing-derivatives pre-certificate. When it is non-degenerate, solving the BLASSO leads to exact support recovery of the initial measure, in a low-noise regime whose size is controlled by the minimal separation distance of the spikes. In a second part, we propose the Sliding Frank-Wolfe algorithm, which solves the BLASSO by augmenting the Frank-Wolfe algorithm with a step that continuously moves the amplitudes and positions of the spikes. We show that, under mild assumptions, it converges in a finite number of iterations. We apply this algorithm to the 3D fluorescent microscopy problem, comparing three models based on the PALM/STORM techniques.
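The Sliding Frank-Wolfe algorithm described above builds on the classical Frank-Wolfe (conditional gradient) method. For reference, here is a minimal sketch of plain Frank-Wolfe applied to the finite-dimensional l1-ball-constrained least-squares form of the Lasso; it illustrates the base algorithm only, not the BLASSO or the sliding variant from the thesis:

```python
import numpy as np

def frank_wolfe_lasso(X, y, tau, n_iter=500):
    """Frank-Wolfe for min_b 0.5*||y - Xb||^2 subject to ||b||_1 <= tau."""
    p = X.shape[1]
    b = np.zeros(p)
    for k in range(n_iter):
        grad = X.T @ (X @ b - y)
        # Linear minimization oracle over the l1-ball: a signed vertex
        # along the coordinate where the gradient is largest in magnitude.
        j = int(np.argmax(np.abs(grad)))
        s = np.zeros(p)
        s[j] = -tau * np.sign(grad[j])
        gamma = 2.0 / (k + 2.0)          # standard step-size schedule
        b = (1.0 - gamma) * b + gamma * s
    return b
```

Because each iterate is a convex combination of l1-ball vertices, feasibility (||b||_1 <= tau) holds exactly at every step without any projection, which is the appeal of Frank-Wolfe methods.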
Huynh, Bao Tuyen. "Estimation and feature selection in high-dimensional mixtures-of-experts models." Thesis, Normandie, 2019. http://www.theses.fr/2019NORMC237.
This thesis deals with the problem of modeling and estimating high-dimensional MoE models, towards effective density estimation, prediction, and clustering of such heterogeneous and high-dimensional data. We propose new strategies based on regularized maximum-likelihood estimation (MLE) of MoE models to overcome the limitations of standard methods, including MLE with Expectation-Maximization (EM) algorithms, and to simultaneously perform feature selection so that sparse models are encouraged in such a high-dimensional setting. We first introduce a mixture-of-experts parameter estimation and variable selection methodology, based on l1 (lasso) regularization and the EM framework, for regression and clustering suited to high-dimensional contexts. Then, we extend the method to regularized mixture-of-experts models for discrete data, including classification. We develop efficient algorithms to maximize the proposed l1-penalized observed-data log-likelihood function. Our proposed strategies enjoy efficient monotone maximization of the optimized criterion and, unlike previous approaches, do not rely on approximations of the penalty functions, avoid matrix inversion, and exploit the efficiency of the coordinate ascent algorithm, particularly within the proximal Newton-based approach.
Singh, Kevin. "Comparing Variable Selection Algorithms On Logistic Regression – A Simulation." Thesis, Uppsala universitet, Statistiska institutionen, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-446090.
Fang, Zaili. "Some Advanced Model Selection Topics for Nonparametric/Semiparametric Models with High-Dimensional Data." Diss., Virginia Tech, 2012. http://hdl.handle.net/10919/40090.
Sanchez, Merchante Luis Francisco. "Learning algorithms for sparse classification." Phd thesis, Université de Technologie de Compiègne, 2013. http://tel.archives-ouvertes.fr/tel-00868847.
Book chapters on the topic "LASSO algorithm"
Loth, Manuel, and Philippe Preux. "The Iso-regularization Descent Algorithm for the LASSO." In Neural Information Processing. Theory and Algorithms, 454–61. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-17537-4_56.
Walrand, Jean. "Speech Recognition: B." In Probability in Electrical Engineering and Computer Science, 217–42. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-49995-2_12.
Wu, Kai, and Jing Liu. "Learning of Sparse Fuzzy Cognitive Maps Using Evolutionary Algorithm with Lasso Initialization." In Lecture Notes in Computer Science, 385–96. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-68759-9_32.
Chen, Qian, and Lianbing Huang. "Research on Prediction Model of Gas Emission Based on Lasso Penalty Regression Algorithm." In Lecture Notes in Electrical Engineering, 165–72. Singapore: Springer Singapore, 2020. http://dx.doi.org/10.1007/978-981-15-0187-6_19.
Yanagihara, Hirokazu, and Ryoya Oda. "Coordinate Descent Algorithm for Normal-Likelihood-Based Group Lasso in Multivariate Linear Regression." In Intelligent Decision Technologies, 429–39. Singapore: Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-16-2765-1_36.
AlKindy, Bassam, Christophe Guyeux, Jean-François Couchot, Michel Salomon, Christian Parisod, and Jacques M. Bahi. "Hybrid Genetic Algorithm and Lasso Test Approach for Inferring Well Supported Phylogenetic Trees Based on Subsets of Chloroplastic Core Genes." In Algorithms for Computational Biology, 83–96. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-21233-3_7.
Md Shahri, Nur Huda Nabihan, and Susana Conde. "Modelling Multi-dimensional Contingency Tables: LASSO and Stepwise Algorithms." In Proceedings of the Third International Conference on Computing, Mathematics and Statistics (iCMS2017), 563–70. Singapore: Springer Singapore, 2019. http://dx.doi.org/10.1007/978-981-13-7279-7_70.
Pawlak, Mirosław, and Jiaqing Lv. "Analysis of Large Scale Power Systems via LASSO Learning Algorithms." In Artificial Intelligence and Soft Computing, 652–62. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-20912-4_59.
Boulesteix, Anne-Laure, Adrian Richter, and Christoph Bernau. "Complexity Selection with Cross-validation for Lasso and Sparse Partial Least Squares Using High-Dimensional Data." In Algorithms from and for Nature and Life, 261–68. Cham: Springer International Publishing, 2013. http://dx.doi.org/10.1007/978-3-319-00035-0_26.
Gnad, Daniel, Jan Eisenhut, Alberto Lluch Lafuente, and Jörg Hoffmann. "Model Checking ω-Regular Properties with Decoupled Search." In Computer Aided Verification, 411–34. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-81688-9_19.
Conference papers on the topic "LASSO algorithm"
Lin, Daxuan, Zan Yang, Jiuwei Chen, Jiaxin Dong, Wei Nai, and Dan Li. "Lasso Regression with Quantum Whale Optimization Algorithm." In 2020 IEEE 11th International Conference on Software Engineering and Service Science (ICSESS). IEEE, 2020. http://dx.doi.org/10.1109/icsess49938.2020.9237739.
Liu, Yashu, Jie Wang, and Jieping Ye. "An efficient algorithm for weak hierarchical lasso." In KDD '14: The 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York, NY, USA: ACM, 2014. http://dx.doi.org/10.1145/2623330.2623665.
Zhang, Jianming, Mingkun Du, and Keyang Cheng. "Pedestrian detection based on efficient fused lasso algorithm." In 2012 5th International Congress on Image and Signal Processing (CISP). IEEE, 2012. http://dx.doi.org/10.1109/cisp.2012.6469937.
Chen, Kai, and Yang Jin. "An ensemble learning algorithm based on Lasso selection." In 2010 IEEE International Conference on Intelligent Computing and Intelligent Systems (ICIS 2010). IEEE, 2010. http://dx.doi.org/10.1109/icicisys.2010.5658515.
Mo, Weike, Jiaqing Lv, Mirosław Pawlak, U. D. Annakkage, Haoyong Chen, and Yiping Chen. "Power System Online Sensitivity Identification Based on Lasso Algorithm." In 2020 IEEE Power & Energy Society General Meeting (PESGM). IEEE, 2020. http://dx.doi.org/10.1109/pesgm41954.2020.9281724.
Costa, M. A., and A. P. Braga. "Optimization of Neural Networks with Multi-Objective LASSO Algorithm." In The 2006 IEEE International Joint Conference on Neural Network Proceedings. IEEE, 2006. http://dx.doi.org/10.1109/ijcnn.2006.247329.
Alissou, Simplice A., and Ye Zhang. "Hyperspectral data compression using lasso algorithm for spectral decorrelation." In SPIE Sensing Technology + Applications, edited by Bormin Huang, Chein-I. Chang, and José Fco López. SPIE, 2014. http://dx.doi.org/10.1117/12.2053265.
Fujiwara, Yasuhiro, Naoki Marumo, Mathieu Blondel, Koh Takeuchi, Hideaki Kim, Tomoharu Iwata, and Naonori Ueda. "SVD-Based Screening for the Graphical Lasso." In Twenty-Sixth International Joint Conference on Artificial Intelligence. California: International Joint Conferences on Artificial Intelligence Organization, 2017. http://dx.doi.org/10.24963/ijcai.2017/233.
Liu, Jun, Lei Yuan, and Jieping Ye. "An efficient algorithm for a class of fused lasso problems." In the 16th ACM SIGKDD international conference. New York, New York, USA: ACM Press, 2010. http://dx.doi.org/10.1145/1835804.1835847.
Farokhmanesh, Fatemeh, and Mohammad Taghi Sadeghi. "Deep Feature Selection using an Enhanced Sparse Group Lasso Algorithm." In 2019 27th Iranian Conference on Electrical Engineering (ICEE). IEEE, 2019. http://dx.doi.org/10.1109/iraniancee.2019.8786386.