Table of contents
A selection of scientific literature on the topic "LASSO algorithm"
Cite a source in APA, MLA, Chicago, Harvard, and other citation styles
Browse lists of current articles, books, dissertations, reports, and other scholarly sources on the topic "LASSO algorithm".
Next to every work in the bibliography there is an "Add to bibliography" option. Use it, and the bibliographic entry for the chosen work is formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).
You can also download the full text of the scientific publication as a PDF and read an online annotation of the work if the relevant parameters are available in the metadata.
Journal articles on the topic "LASSO algorithm"
Kaneda, Yasuaki, and Yasuharu Irizuki. "Recursive Algorithm for LASSO". IEEJ Transactions on Electronics, Information and Systems 136, no. 7 (2016): 915–22. http://dx.doi.org/10.1541/ieejeiss.136.915.
Jain, Rahi, and Wei Xu. "HDSI: High dimensional selection with interactions algorithm on feature selection and testing". PLOS ONE 16, no. 2 (February 16, 2021): e0246159. http://dx.doi.org/10.1371/journal.pone.0246159.
Yau, Chun Yip, and Tsz Shing Hui. "LARS-type algorithm for group lasso". Statistics and Computing 27, no. 4 (May 23, 2016): 1041–48. http://dx.doi.org/10.1007/s11222-016-9669-7.
Alghamdi, Maryam A., Mohammad Ali Alghamdi, Naseer Shahzad, and Hong-Kun Xu. "Properties and Iterative Methods for the Q-Lasso". Abstract and Applied Analysis 2013 (2013): 1–8. http://dx.doi.org/10.1155/2013/250943.
Wang, Jin-Jia, and Yang Lu. "Coordinate Descent Based Hierarchical Interactive Lasso Penalized Logistic Regression and Its Application to Classification Problems". Mathematical Problems in Engineering 2014 (2014): 1–11. http://dx.doi.org/10.1155/2014/430201.
Liu, Yashu, Jie Wang, and Jieping Ye. "An Efficient Algorithm for Weak Hierarchical Lasso". ACM Transactions on Knowledge Discovery from Data 10, no. 3 (February 24, 2016): 1–24. http://dx.doi.org/10.1145/2791295.
Kim, Jinseog, Yuwon Kim, and Yongdai Kim. "A Gradient-Based Optimization Algorithm for LASSO". Journal of Computational and Graphical Statistics 17, no. 4 (December 2008): 994–1009. http://dx.doi.org/10.1198/106186008x386210.
Wang, Hao. "Coordinate descent algorithm for covariance graphical lasso". Statistics and Computing 24, no. 4 (February 23, 2013): 521–29. http://dx.doi.org/10.1007/s11222-013-9385-5.
Li, Yahui, Yang Li, and Yuanyuan Sun. "Online Static Security Assessment of Power Systems Based on Lasso Algorithm". Applied Sciences 8, no. 9 (August 23, 2018): 1442. http://dx.doi.org/10.3390/app8091442.
Keerthi, S. S., and S. Shevade. "A Fast Tracking Algorithm for Generalized LARS/LASSO". IEEE Transactions on Neural Networks 18, no. 6 (November 2007): 1826–30. http://dx.doi.org/10.1109/tnn.2007.900229.
Dissertations on the topic "LASSO algorithm"
Zhang, Han. "Detecting Rare Haplotype-Environmental Interaction and Nonlinear Effects of Rare Haplotypes using Bayesian LASSO on Quantitative Traits". The Ohio State University, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=osu149969433115895.
Asif, Muhammad Salman. "Primal dual pursuit: a homotopy based algorithm for the Dantzig selector". Thesis, Atlanta, Ga.: Georgia Institute of Technology, 2008. http://hdl.handle.net/1853/24693.
Committee Chair: Romberg, Justin; Committee Member: McClellan, James; Committee Member: Mersereau, Russell
Soret, Perrine. "Régression pénalisée de type Lasso pour l’analyse de données biologiques de grande dimension : application à la charge virale du VIH censurée par une limite de quantification et aux données compositionnelles du microbiote". Thesis, Bordeaux, 2019. http://www.theses.fr/2019BORD0254.
In clinical studies, and thanks to technological progress, the amount of information collected on a single patient keeps growing, leading to situations where the number of explanatory variables exceeds the number of individuals. The Lasso method has proved appropriate for avoiding overfitting in high-dimensional settings. This thesis is devoted to the application and development of Lasso-penalized regression for clinical data with particular structures.
First, in patients infected with the human immunodeficiency virus, mutations in the virus's genetic structure may be related to the development of drug resistance. Predicting the viral load from the (potentially many) mutations helps guide treatment choice. Below a detection threshold the viral load is undetectable, so the data are left-censored. We propose two new Lasso approaches based on the Buckley-James algorithm, which imputes censored values by a conditional expectation. By reversing the response we obtain a right-censored problem, for which non-parametric estimators of the conditional expectation have been proposed in survival analysis. We also propose a parametric estimator based on a Gaussian assumption.
Second, we are interested in the role of the microbiota in the deterioration of respiratory health. Microbiota data come as relative abundances (the proportion of each species per individual, called compositional data) and have a phylogenetic structure. We established a state of the art of statistical methods for analyzing microbiota data. Because these methods are recent, few recommendations exist on their applicability and effectiveness. A simulation study allowed us to compare the selection capacity of penalization methods proposed specifically for this type of data. We then apply this work to the analysis of the association between bacteria/fungi and the decline of pulmonary function in patients with cystic fibrosis from the MucoFong project.
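The Gaussian variant of the Buckley-James imputation described in this abstract replaces each left-censored response by its conditional expectation under a normal model. A minimal sketch of that single step, assuming a univariate normal with known parameters (the function name and setting are illustrative, not the thesis's implementation):

```python
import math

def impute_left_censored(c, mu, sigma):
    """Conditional mean E[Y | Y < c] for Y ~ N(mu, sigma^2).

    A left-censored observation (known only to lie below the
    quantification limit c) is replaced by this expectation.
    """
    z = (c - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    # Inverse Mills ratio identity: E[Y | Y < c] = mu - sigma * pdf/cdf
    return mu - sigma * pdf / cdf

# Standard normal censored at its mean: the imputed value is -sqrt(2/pi)
print(impute_left_censored(0.0, 0.0, 1.0))
```

In the full procedure, the imputed responses would then feed into an ordinary Lasso fit, with the model parameters re-estimated iteratively.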
Loth, Manuel. "Algorithmes d'Ensemble Actif pour le LASSO". PhD thesis, Université des Sciences et Technologie de Lille - Lille I, 2011. http://tel.archives-ouvertes.fr/tel-00845441.
Ounaissi, Daoud. "Méthodes quasi-Monte Carlo et Monte Carlo : application aux calculs des estimateurs Lasso et Lasso bayésien". Thesis, Lille 1, 2016. http://www.theses.fr/2016LIL10043/document.
The thesis contains six chapters. The first chapter introduces linear regression, the Lasso, and the Bayesian Lasso problems. Chapter 2 recalls convex optimization algorithms and presents the FISTA algorithm for computing the Lasso estimator; the convergence properties of this algorithm are also given in this chapter using the entropy estimator and the Pitman-Yor estimator. Chapter 3 is devoted to a comparison of Monte Carlo and quasi-Monte Carlo methods for numerical calculation of the Bayesian Lasso; this comparison shows that Hammersley points give the best results. Chapter 4 gives a geometric interpretation of the partition function of the Bayesian Lasso, expressed in terms of the incomplete Gamma function, which allowed us to give a convergence criterion for the Metropolis-Hastings algorithm. Chapter 5 presents the Bayesian estimator as the limit law of a multivariate stochastic differential equation, which allowed us to compute the Bayesian Lasso using semi-implicit and explicit Euler numerical schemes combined with Monte Carlo, multilevel Monte Carlo (MLMC), and Metropolis-Hastings methods. Comparing the computational costs shows that the pair (semi-implicit Euler scheme, MLMC) beats the other (scheme, method) pairs. Finally, in Chapter 6 we derive the rate at which the Bayesian Lasso converges to the Lasso when the signal-to-noise ratio is constant and the noise tends to 0, which provides a new convergence criterion for the Metropolis-Hastings algorithm.
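FISTA, the algorithm named in Chapter 2, combines a gradient step on the smooth least-squares term, elementwise soft-thresholding for the l1 penalty, and a momentum extrapolation. A minimal NumPy sketch for the standard Lasso objective (not the thesis's code; the step size here uses the spectral norm of X as the Lipschitz constant):

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fista_lasso(X, y, lam, n_iter=500):
    """Minimize 0.5 * ||X b - y||^2 + lam * ||b||_1 with FISTA."""
    L = np.linalg.norm(X, 2) ** 2            # Lipschitz constant of the gradient
    b = z = np.zeros(X.shape[1])
    t = 1.0
    for _ in range(n_iter):
        # Gradient step on the smooth part, then proximal (shrinkage) step
        b_new = soft_threshold(z - X.T @ (X @ z - y) / L, lam / L)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        z = b_new + (t - 1.0) / t_new * (b_new - b)   # momentum extrapolation
        b, t = b_new, t_new
    return b
```

With an orthonormal design (e.g. X the identity), the Lasso solution reduces to soft-thresholding y, which makes the sketch easy to sanity-check.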
Denoyelle, Quentin. "Theoretical and Numerical Analysis of Super-Resolution Without Grid". Thesis, Paris Sciences et Lettres (ComUE), 2018. http://www.theses.fr/2018PSLED030/document.
This thesis studies the noisy sparse-spikes super-resolution problem for positive measures using the BLASSO, an infinite-dimensional convex optimization problem generalizing the LASSO to measures. First, we show that the support stability of the BLASSO for N clustered spikes is governed by an object called the (2N-1)-vanishing-derivatives pre-certificate. When it is non-degenerate, solving the BLASSO leads to exact support recovery of the initial measure in a low-noise regime whose size is controlled by the minimal separation distance of the spikes. In a second part, we propose the Sliding Frank-Wolfe algorithm, which solves the BLASSO by augmenting the Frank-Wolfe algorithm with a step that continuously moves the amplitudes and positions of the spikes. We show that, under mild assumptions, it converges in a finite number of iterations. We apply this algorithm to the 3D fluorescence microscopy problem, comparing three models based on the PALM/STORM techniques.
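The Sliding Frank-Wolfe builds on the classical Frank-Wolfe method, which at each iteration moves toward the best extreme point of the constraint set. A minimal sketch of the finite-dimensional analogue only (least squares over an l1-ball, i.e. the constrained Lasso), not the measure-space BLASSO itself:

```python
import numpy as np

def frank_wolfe_lasso(X, y, tau, n_iter=500):
    """Minimize 0.5 * ||X b - y||^2 subject to ||b||_1 <= tau."""
    b = np.zeros(X.shape[1])
    for k in range(n_iter):
        grad = X.T @ (X @ b - y)
        i = np.argmax(np.abs(grad))          # extreme point of the l1-ball
        s = np.zeros_like(b)                 # that best correlates with -grad
        s[i] = -tau * np.sign(grad[i])
        gamma = 2.0 / (k + 2.0)              # standard open-loop step size
        b = (1.0 - gamma) * b + gamma * s    # slide toward the extreme point
    return b
```

The extreme points of the l1-ball are the signed scaled basis vectors, so each linear minimization touches a single coordinate; this is the finite-dimensional counterpart of adding one spike per iteration in the BLASSO setting.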
Huynh, Bao Tuyen. "Estimation and feature selection in high-dimensional mixtures-of-experts models". Thesis, Normandie, 2019. http://www.theses.fr/2019NORMC237.
This thesis deals with the modeling and estimation of high-dimensional mixture-of-experts (MoE) models, towards effective density estimation, prediction, and clustering of heterogeneous, high-dimensional data. We propose new strategies based on regularized maximum-likelihood estimation (MLE) of MoE models to overcome the limitations of standard methods, including MLE with Expectation-Maximization (EM) algorithms, and to simultaneously perform feature selection so that sparse models are encouraged in such high-dimensional settings. We first introduce a mixture-of-experts parameter estimation and variable selection methodology, based on l1 (lasso) regularization and the EM framework, for regression and clustering in high-dimensional contexts. We then extend the method to regularized mixture-of-experts models for discrete data, including classification. We develop efficient algorithms to maximize the proposed l1-penalized observed-data log-likelihood function. Our strategies enjoy efficient monotone maximization of the optimized criterion and, unlike previous approaches, do not rely on approximations of the penalty functions, avoid matrix inversion, and exploit the efficiency of the coordinate ascent algorithm, particularly within the proximal Newton-based approach.
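Coordinate-wise optimization of an l1-penalized criterion, as exploited in the abstract above, reduces in the plain Lasso case to cycling over coordinates and soft-thresholding a univariate least-squares update. An illustrative sketch for the standard Lasso only (a stand-in for the penalized-likelihood updates in the thesis; assumes no column of X is all zeros):

```python
import numpy as np

def lasso_coordinate_descent(X, y, lam, n_sweeps=200):
    """Cyclic coordinate descent for 0.5*||y - X b||^2 + lam*||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    r = y.copy()                      # running residual y - X b
    col_sq = (X ** 2).sum(axis=0)     # per-coordinate curvature
    for _ in range(n_sweeps):
        for j in range(p):
            r += X[:, j] * b[j]       # remove coordinate j from the fit
            rho = X[:, j] @ r         # univariate least-squares direction
            # Exact minimizer in coordinate j is a soft-thresholded update
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            r -= X[:, j] * b[j]       # put coordinate j back
    return b
```

Each coordinate update decreases the objective, which is the same monotone-improvement property the thesis relies on for its penalized log-likelihood.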
Singh, Kevin. "Comparing Variable Selection Algorithms On Logistic Regression – A Simulation". Thesis, Uppsala universitet, Statistiska institutionen, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-446090.
Fang, Zaili. "Some Advanced Model Selection Topics for Nonparametric/Semiparametric Models with High-Dimensional Data". Diss., Virginia Tech, 2012. http://hdl.handle.net/10919/40090.
Sanchez Merchante, Luis Francisco. "Learning algorithms for sparse classification". PhD thesis, Université de Technologie de Compiègne, 2013. http://tel.archives-ouvertes.fr/tel-00868847.
Der volle Inhalt der QuelleBuchteile zum Thema "LASSO algorithm"
Loth, Manuel, and Philippe Preux. "The Iso-regularization Descent Algorithm for the LASSO". In Neural Information Processing. Theory and Algorithms, 454–61. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-17537-4_56.
Walrand, Jean. "Speech Recognition: B". In Probability in Electrical Engineering and Computer Science, 217–42. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-49995-2_12.
Wu, Kai, and Jing Liu. "Learning of Sparse Fuzzy Cognitive Maps Using Evolutionary Algorithm with Lasso Initialization". In Lecture Notes in Computer Science, 385–96. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-68759-9_32.
Chen, Qian, and Lianbing Huang. "Research on Prediction Model of Gas Emission Based on Lasso Penalty Regression Algorithm". In Lecture Notes in Electrical Engineering, 165–72. Singapore: Springer Singapore, 2020. http://dx.doi.org/10.1007/978-981-15-0187-6_19.
Yanagihara, Hirokazu, and Ryoya Oda. "Coordinate Descent Algorithm for Normal-Likelihood-Based Group Lasso in Multivariate Linear Regression". In Intelligent Decision Technologies, 429–39. Singapore: Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-16-2765-1_36.
AlKindy, Bassam, Christophe Guyeux, Jean-François Couchot, Michel Salomon, Christian Parisod, and Jacques M. Bahi. "Hybrid Genetic Algorithm and Lasso Test Approach for Inferring Well Supported Phylogenetic Trees Based on Subsets of Chloroplastic Core Genes". In Algorithms for Computational Biology, 83–96. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-21233-3_7.
Md Shahri, Nur Huda Nabihan, and Susana Conde. "Modelling Multi-dimensional Contingency Tables: LASSO and Stepwise Algorithms". In Proceedings of the Third International Conference on Computing, Mathematics and Statistics (iCMS2017), 563–70. Singapore: Springer Singapore, 2019. http://dx.doi.org/10.1007/978-981-13-7279-7_70.
Pawlak, Mirosław, and Jiaqing Lv. "Analysis of Large Scale Power Systems via LASSO Learning Algorithms". In Artificial Intelligence and Soft Computing, 652–62. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-20912-4_59.
Boulesteix, Anne-Laure, Adrian Richter, and Christoph Bernau. "Complexity Selection with Cross-validation for Lasso and Sparse Partial Least Squares Using High-Dimensional Data". In Algorithms from and for Nature and Life, 261–68. Cham: Springer International Publishing, 2013. http://dx.doi.org/10.1007/978-3-319-00035-0_26.
Gnad, Daniel, Jan Eisenhut, Alberto Lluch Lafuente, and Jörg Hoffmann. "Model Checking ω-Regular Properties with Decoupled Search". In Computer Aided Verification, 411–34. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-81688-9_19.
Der volle Inhalt der QuelleKonferenzberichte zum Thema "LASSO algorithm"
Lin, Daxuan, Zan Yang, Jiuwei Chen, Jiaxin Dong, Wei Nai, and Dan Li. "Lasso Regression with Quantum Whale Optimization Algorithm". In 2020 IEEE 11th International Conference on Software Engineering and Service Science (ICSESS). IEEE, 2020. http://dx.doi.org/10.1109/icsess49938.2020.9237739.
Liu, Yashu, Jie Wang, and Jieping Ye. "An efficient algorithm for weak hierarchical lasso". In KDD '14: The 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York, NY, USA: ACM, 2014. http://dx.doi.org/10.1145/2623330.2623665.
Zhang, Jianming, Mingkun Du, and Keyang Cheng. "Pedestrian detection based on efficient fused lasso algorithm". In 2012 5th International Congress on Image and Signal Processing (CISP). IEEE, 2012. http://dx.doi.org/10.1109/cisp.2012.6469937.
Chen, Kai, and Yang Jin. "An ensemble learning algorithm based on Lasso selection". In 2010 IEEE International Conference on Intelligent Computing and Intelligent Systems (ICIS 2010). IEEE, 2010. http://dx.doi.org/10.1109/icicisys.2010.5658515.
Mo, Weike, Jiaqing Lv, Mirosław Pawlak, U. D. Annakkage, Haoyong Chen, and Yiping Chen. "Power System Online Sensitivity Identification Based on Lasso Algorithm". In 2020 IEEE Power & Energy Society General Meeting (PESGM). IEEE, 2020. http://dx.doi.org/10.1109/pesgm41954.2020.9281724.
Costa, M. A., and A. P. Braga. "Optimization of Neural Networks with Multi-Objective LASSO Algorithm". In The 2006 IEEE International Joint Conference on Neural Network Proceedings. IEEE, 2006. http://dx.doi.org/10.1109/ijcnn.2006.247329.
Alissou, Simplice A., and Ye Zhang. "Hyperspectral data compression using lasso algorithm for spectral decorrelation". In SPIE Sensing Technology + Applications, edited by Bormin Huang, Chein-I. Chang, and José Fco López. SPIE, 2014. http://dx.doi.org/10.1117/12.2053265.
Fujiwara, Yasuhiro, Naoki Marumo, Mathieu Blondel, Koh Takeuchi, Hideaki Kim, Tomoharu Iwata, and Naonori Ueda. "SVD-Based Screening for the Graphical Lasso". In Twenty-Sixth International Joint Conference on Artificial Intelligence. California: International Joint Conferences on Artificial Intelligence Organization, 2017. http://dx.doi.org/10.24963/ijcai.2017/233.
Liu, Jun, Lei Yuan, and Jieping Ye. "An efficient algorithm for a class of fused lasso problems". In The 16th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York, New York, USA: ACM Press, 2010. http://dx.doi.org/10.1145/1835804.1835847.
Farokhmanesh, Fatemeh, and Mohammad Taghi Sadeghi. "Deep Feature Selection using an Enhanced Sparse Group Lasso Algorithm". In 2019 27th Iranian Conference on Electrical Engineering (ICEE). IEEE, 2019. http://dx.doi.org/10.1109/iraniancee.2019.8786386.