Academic literature on the topic 'LASSO algorithm'
Create an accurate reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'LASSO algorithm.'
Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.
Journal articles on the topic "LASSO algorithm"
Gaines, Brian R., Juhyun Kim, and Hua Zhou. "Algorithms for Fitting the Constrained Lasso." Journal of Computational and Graphical Statistics 27, no. 4 (August 7, 2018): 861–71. http://dx.doi.org/10.1080/10618600.2018.1473777.
Bonnefoy, Antoine, Valentin Emiya, Liva Ralaivola, and Remi Gribonval. "Dynamic Screening: Accelerating First-Order Algorithms for the Lasso and Group-Lasso." IEEE Transactions on Signal Processing 63, no. 19 (October 2015): 5121–32. http://dx.doi.org/10.1109/tsp.2015.2447503.
Zhou, Helper, and Victor Gumbo. "Supervised Machine Learning for Predicting SMME Sales: An Evaluation of Three Algorithms." African Journal of Information and Communication, no. 27 (May 31, 2021): 1–21. http://dx.doi.org/10.23962/10539/31371.
Wu, Tong Tong, and Kenneth Lange. "Coordinate descent algorithms for lasso penalized regression." Annals of Applied Statistics 2, no. 1 (March 2008): 224–44. http://dx.doi.org/10.1214/07-aoas147.
Tsiligkaridis, Theodoros, Alfred O. Hero III, and Shuheng Zhou. "On Convergence of Kronecker Graphical Lasso Algorithms." IEEE Transactions on Signal Processing 61, no. 7 (April 2013): 1743–55. http://dx.doi.org/10.1109/tsp.2013.2240157.
Muchisha, Nadya Dwi, Novian Tamara, Andriansyah Andriansyah, and Agus M. Soleh. "Nowcasting Indonesia's GDP Growth Using Machine Learning Algorithms." Indonesian Journal of Statistics and Its Applications 5, no. 2 (June 30, 2021): 355–68. http://dx.doi.org/10.29244/ijsa.v5i2p355-368.
Jain, Rahi, and Wei Xu. "HDSI: High dimensional selection with interactions algorithm on feature selection and testing." PLOS ONE 16, no. 2 (February 16, 2021): e0246159. http://dx.doi.org/10.1371/journal.pone.0246159.
Qin, Zhiwei, Katya Scheinberg, and Donald Goldfarb. "Efficient block-coordinate descent algorithms for the Group Lasso." Mathematical Programming Computation 5, no. 2 (March 31, 2013): 143–69. http://dx.doi.org/10.1007/s12532-013-0051-x.
Johnson, Karl M., and Thomas P. Monath. "Imported Lassa Fever — Reexamining the Algorithms." New England Journal of Medicine 323, no. 16 (October 18, 1990): 1139–41. http://dx.doi.org/10.1056/nejm199010183231611.
Zhao, Yingdong, and Richard Simon. "Development and Validation of Predictive Indices for a Continuous Outcome Using Gene Expression Profiles." Cancer Informatics 9 (January 2010): CIN.S3805. http://dx.doi.org/10.4137/cin.s3805.
Full textDissertations / Theses on the topic "LASSO algoritmus"
Loth, Manuel. "Algorithmes d'Ensemble Actif pour le LASSO." PhD thesis, Université des Sciences et Technologie de Lille - Lille I, 2011. http://tel.archives-ouvertes.fr/tel-00845441.
Singh, Kevin. "Comparing Variable Selection Algorithms On Logistic Regression – A Simulation." Thesis, Uppsala universitet, Statistiska institutionen, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-446090.
Sanchez Merchante, Luis Francisco. "Learning algorithms for sparse classification." PhD thesis, Université de Technologie de Compiègne, 2013. http://tel.archives-ouvertes.fr/tel-00868847.
Huynh, Bao Tuyen. "Estimation and feature selection in high-dimensional mixtures-of-experts models." Thesis, Normandie, 2019. http://www.theses.fr/2019NORMC237.
Full textThis thesis deals with the problem of modeling and estimation of high-dimensional MoE models, towards effective density estimation, prediction and clustering of such heterogeneous and high-dimensional data. We propose new strategies based on regularized maximum-likelihood estimation (MLE) of MoE models to overcome the limitations of standard methods, including MLE estimation with Expectation-Maximization (EM) algorithms, and to simultaneously perform feature selection so that sparse models are encouraged in such a high-dimensional setting. We first introduce a mixture-of-experts’ parameter estimation and variable selection methodology, based on l1 (lasso) regularizations and the EM framework, for regression and clustering suited to high-dimensional contexts. Then, we extend the method to regularized mixture of experts models for discrete data, including classification. We develop efficient algorithms to maximize the proposed l1 -penalized observed-data log-likelihood function. Our proposed strategies enjoy the efficient monotone maximization of the optimized criterion, and unlike previous approaches, they do not rely on approximations on the penalty functions, avoid matrix inversion, and exploit the efficiency of the coordinate ascent algorithm, particularly within the proximal Newton-based approach
Wang, Bo. "Variable Ranking by Solution-path Algorithms." Thesis, 2012. http://hdl.handle.net/10012/6496.
Noro, Catarina Vieira. "Determinants of households' consumption in Portugal - a machine learning approach." Master's thesis, 2021. http://hdl.handle.net/10362/121884.
He, Zangdong. "Variable selection and structural discovery in joint models of longitudinal and survival data." Thesis, 2014. http://hdl.handle.net/1805/6365.
Full textJoint models of longitudinal and survival outcomes have been used with increasing frequency in clinical investigations. Correct specification of fixed and random effects, as well as their functional forms is essential for practical data analysis. However, no existing methods have been developed to meet this need in a joint model setting. In this dissertation, I describe a penalized likelihood-based method with adaptive least absolute shrinkage and selection operator (ALASSO) penalty functions for model selection. By reparameterizing variance components through a Cholesky decomposition, I introduce a penalty function of group shrinkage; the penalized likelihood is approximated by Gaussian quadrature and optimized by an EM algorithm. The functional forms of the independent effects are determined through a procedure for structural discovery. Specifically, I first construct the model by penalized cubic B-spline and then decompose the B-spline to linear and nonlinear elements by spectral decomposition. The decomposition represents the model in a mixed-effects model format, and I then use the mixed-effects variable selection method to perform structural discovery. Simulation studies show excellent performance. A clinical application is described to illustrate the use of the proposed methods, and the analytical results demonstrate the usefulness of the methods.
Book chapters on the topic "LASSO algorithm"
Loth, Manuel, and Philippe Preux. "The Iso-regularization Descent Algorithm for the LASSO." In Neural Information Processing. Theory and Algorithms, 454–61. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-17537-4_56.
Md Shahri, Nur Huda Nabihan, and Susana Conde. "Modelling Multi-dimensional Contingency Tables: LASSO and Stepwise Algorithms." In Proceedings of the Third International Conference on Computing, Mathematics and Statistics (iCMS2017), 563–70. Singapore: Springer Singapore, 2019. http://dx.doi.org/10.1007/978-981-13-7279-7_70.
Walrand, Jean. "Speech Recognition: B." In Probability in Electrical Engineering and Computer Science, 217–42. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-49995-2_12.
Pawlak, Mirosław, and Jiaqing Lv. "Analysis of Large Scale Power Systems via LASSO Learning Algorithms." In Artificial Intelligence and Soft Computing, 652–62. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-20912-4_59.
AlKindy, Bassam, Christophe Guyeux, Jean-François Couchot, Michel Salomon, Christian Parisod, and Jacques M. Bahi. "Hybrid Genetic Algorithm and Lasso Test Approach for Inferring Well Supported Phylogenetic Trees Based on Subsets of Chloroplastic Core Genes." In Algorithms for Computational Biology, 83–96. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-21233-3_7.
Boulesteix, Anne-Laure, Adrian Richter, and Christoph Bernau. "Complexity Selection with Cross-validation for Lasso and Sparse Partial Least Squares Using High-Dimensional Data." In Algorithms from and for Nature and Life, 261–68. Cham: Springer International Publishing, 2013. http://dx.doi.org/10.1007/978-3-319-00035-0_26.
Yamada, Isao, and Masao Yamagishi. "Hierarchical Convex Optimization by the Hybrid Steepest Descent Method with Proximal Splitting Operators—Enhancements of SVM and Lasso." In Splitting Algorithms, Modern Operator Theory, and Applications, 413–89. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-25939-6_16.
Hao, Yuhan, Gary M. Weiss, and Stuart M. Brown. "Identification of Candidate Genes Responsible for Age-Related Macular Degeneration Using Microarray Data." In Biotechnology, 969–1001. IGI Global, 2019. http://dx.doi.org/10.4018/978-1-5225-8903-7.ch038.
Full textConference papers on the topic "LASSO algoritmus"
Jin, Yuzhe, and Bhaskar D. Rao. "MultiPass lasso algorithms for sparse signal recovery." In 2011 IEEE International Symposium on Information Theory - ISIT. IEEE, 2011. http://dx.doi.org/10.1109/isit.2011.6033773.
Qian, Wang. "A Comparison of Three Numeric Algorithms for Lasso Solution." In 2020 International Conference on Computing and Data Science (CDS). IEEE, 2020. http://dx.doi.org/10.1109/cds49703.2020.00019.
Kong, Deguang, and Chris Ding. "Efficient Algorithms for Selecting Features with Arbitrary Group Constraints via Group Lasso." In 2013 IEEE International Conference on Data Mining (ICDM). IEEE, 2013. http://dx.doi.org/10.1109/icdm.2013.168.
Marins, Matheus, Rafael Chaves, Vinicius Pinho, Rebeca Cunha, and Marcello Campos. "Tackling Fingerprinting Indoor Localization Using the LASSO and the Conjugate Gradient Algorithms." In XXXIV Simpósio Brasileiro de Telecomunicações. Sociedade Brasileira de Telecomunicações, 2016. http://dx.doi.org/10.14209/sbrt.2016.47.
Gu, Bin, Xingwang Ju, Xiang Li, and Guansheng Zheng. "Faster Training Algorithms for Structured Sparsity-Inducing Norm." In Twenty-Seventh International Joint Conference on Artificial Intelligence {IJCAI-18}. California: International Joint Conferences on Artificial Intelligence Organization, 2018. http://dx.doi.org/10.24963/ijcai.2018/299.
Maya, Haroldo C., and Guilherme A. Barreto. "A GA-Based Approach for Building Regularized Sparse Polynomial Models for Wind Turbine Power Curves." In XV Encontro Nacional de Inteligência Artificial e Computacional. Sociedade Brasileira de Computação - SBC, 2018. http://dx.doi.org/10.5753/eniac.2018.4455.
Kato, Masaya, Miho Ohsaki, and Kei Ohnishi. "Genetic Algorithms Using Neural Network Regression and Group Lasso for Dynamic Selection of Crossover Operators." In 2020 Joint 11th International Conference on Soft Computing and Intelligent Systems and 21st International Symposium on Advanced Intelligent Systems (SCIS-ISIS). IEEE, 2020. http://dx.doi.org/10.1109/scisisis50064.2020.9322697.
Idogun, Akpevwe Kelvin, Ruth Oyanu Ujah, and Lesley Anne James. "Surrogate-Based Analysis of Chemical Enhanced Oil Recovery – A Comparative Analysis of Machine Learning Model Performance." In SPE Nigeria Annual International Conference and Exhibition. SPE, 2021. http://dx.doi.org/10.2118/208452-ms.
Ahmadov, Jamal. "Utilizing Data-Driven Models to Predict Brittleness in Tuscaloosa Marine Shale: A Machine Learning Approach." In SPE Annual Technical Conference and Exhibition. SPE, 2021. http://dx.doi.org/10.2118/208628-stu.
Orta Aleman, Dante, and Roland Horne. "Well Interference Detection from Long-Term Pressure Data Using Machine Learning and Multiresolution Analysis." In SPE Annual Technical Conference and Exhibition. SPE, 2021. http://dx.doi.org/10.2118/206354-ms.