Academic literature on the topic 'Sparse Bayesian learning (SBL)'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Sparse Bayesian learning (SBL).'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Journal articles on the topic "Sparse Bayesian learning (SBL)"
Yuan, Cheng, and Mingjun Su. "Seismic spectral sparse reflectivity inversion based on SBL-EM: experimental analysis and application." Journal of Geophysics and Engineering 16, no. 6 (October 18, 2019): 1124–38. http://dx.doi.org/10.1093/jge/gxz082.
Shin, Myoungin, Wooyoung Hong, Keunhwa Lee, and Youngmin Choo. "Frequency Analysis of Acoustic Data Using Multiple-Measurement Sparse Bayesian Learning." Sensors 21, no. 17 (August 30, 2021): 5827. http://dx.doi.org/10.3390/s21175827.
Nyeo, Su-Long, and Rafat R. Ansari. "Early Cataract Detection by Dynamic Light Scattering with Sparse Bayesian Learning." Journal of Innovative Optical Health Sciences 02, no. 03 (July 2009): 303–13. http://dx.doi.org/10.1142/s1793545809000632.
Li, Taiyong, Zhenda Hu, Yanchi Jia, Jiang Wu, and Yingrui Zhou. "Forecasting Crude Oil Prices Using Ensemble Empirical Mode Decomposition and Sparse Bayesian Learning." Energies 11, no. 7 (July 19, 2018): 1882. http://dx.doi.org/10.3390/en11071882.
Liu, Qi, Xianpeng Wang, Mengxing Huang, Xiang Lan, and Lu Sun. "DOA and Range Estimation for FDA-MIMO Radar with Sparse Bayesian Learning." Remote Sensing 13, no. 13 (June 29, 2021): 2553. http://dx.doi.org/10.3390/rs13132553.
Pan, Kaikai, Zheng Qian, and Niya Chen. "Probabilistic Short-Term Wind Power Forecasting Using Sparse Bayesian Learning and NWP." Mathematical Problems in Engineering 2015 (2015): 1–11. http://dx.doi.org/10.1155/2015/785215.
Gerstoft, Peter, Christoph Mecklenbrauker, Santosh Nannuru, and Geert Leus. "DOA Estimation in Heteroscedastic Noise with Sparse Bayesian Learning." Applied Computational Electromagnetics Society 35, no. 11 (February 5, 2021): 1439–40. http://dx.doi.org/10.47037/2020.aces.j.351188.
Wang, Meiyue, and Shizhong Xu. "A coordinate descent approach for sparse Bayesian learning in high dimensional QTL mapping and genome-wide association studies." Bioinformatics 35, no. 21 (April 9, 2019): 4327–35. http://dx.doi.org/10.1093/bioinformatics/btz244.
Shekaramiz, Mohammad, Todd Moon, and Jacob Gunther. "Bayesian Compressive Sensing of Sparse Signals with Unknown Clustering Patterns." Entropy 21, no. 3 (March 5, 2019): 247. http://dx.doi.org/10.3390/e21030247.
Wang, Guo, and Wang. "Exploring the Laplace Prior in Radio Tomographic Imaging with Sparse Bayesian Learning towards the Robustness to Multipath Fading." Sensors 19, no. 23 (November 22, 2019): 5126. http://dx.doi.org/10.3390/s19235126.
Dissertations / Theses on the topic "Sparse Bayesian learning (SBL)"
Chen, Cong. "High-Dimensional Generative Models for 3D Perception." Diss., Virginia Tech, 2021. http://hdl.handle.net/10919/103948.
Doctor of Philosophy
The development of automation systems and robotics has brought the modern world unrivaled affluence and convenience. However, current automated tasks are mainly simple repetitive motions; tasks that require more advanced capabilities, such as visual cognition, remain an unsolved problem for automation. Many high-level cognition-based tasks require accurate visual perception of the environment and of dynamic objects from the data received from optical sensors. 3D perception is the capability to represent, identify, and interpret complex visual data in order to understand the geometric structure of the world. To better tackle existing 3D perception challenges, this dissertation proposes a set of generative learning-based frameworks on sparse tensor data for several high-dimensional robotics perception applications: underwater point cloud filtering, image restoration, deformation detection, and localization. Underwater point cloud data is relevant for many applications, such as environmental monitoring or geological exploration. The data collected with sonar sensors are, however, subject to different types of defects, including holes, noisy measurements, and outliers. In the first chapter, we propose a generative model for point cloud data recovery using Variational Bayesian (VB) sparse tensor factorization methods to tackle these three defects simultaneously. In the second part of the dissertation, we propose an image restoration technique to tackle missing data, which is essential for many perception applications. An efficient generative chaotic RNN framework is introduced for recovering the sparse tensor from a single corrupted image for various types of missing data. In the last chapter, a multi-level CNN for high-dimensional tensor feature extraction for underwater vehicle localization is proposed.
Higson, Edward John. "Bayesian methods and machine learning in astrophysics." Thesis, University of Cambridge, 2019. https://www.repository.cam.ac.uk/handle/1810/289728.
Jin, Junyang. "Novel methods for biological network inference: an application to circadian Ca2+ signaling network." Thesis, University of Cambridge, 2018. https://www.repository.cam.ac.uk/handle/1810/285323.
Subramanian, Harshavardhan. "Combining scientific computing and machine learning techniques to model longitudinal outcomes in clinical trials." Thesis, Linköpings universitet, Institutionen för datavetenskap, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-176427.
Francisco, André Biasin Segalla. "Esparsidade estruturada em reconstrução de fontes de EEG." Universidade de São Paulo, 2018. http://www.teses.usp.br/teses/disponiveis/43/43134/tde-13052018-112615/.
Functional neuroimaging is an area of neuroscience that aims at developing techniques to map the activity of the nervous system; it has been under constant development in recent decades due to its high importance in clinical applications and research. Commonly applied techniques such as functional magnetic resonance imaging (fMRI) and positron emission tomography (PET) have great spatial resolution (~ mm), but limited temporal resolution (~ s), which poses a great challenge to our understanding of the dynamics of higher cognitive functions, whose oscillations can occur on much finer temporal scales (~ ms). This limitation occurs because these techniques rely on measurements of slow biological responses that are correlated in a complicated manner with the actual electrical activity. The two major candidates that overcome this shortcoming are electro- and magnetoencephalography (EEG/MEG), non-invasive techniques that measure the electric and magnetic fields on the scalp, respectively, generated by the electrical brain sources. Both have millisecond temporal resolution, but typically low spatial resolution (~ cm) due to the highly ill-posed nature of the electromagnetic inverse problem. There has been a huge effort in recent decades to improve their spatial resolution by incorporating relevant information into the problem, from other imaging modalities and/or biologically inspired constraints, together with the development of sophisticated mathematical methods and algorithms. In this work we focus on EEG, although all techniques presented here apply equally to MEG because of their identical mathematical form. In particular, we explore sparsity as a useful mathematical constraint in a Bayesian framework called Sparse Bayesian Learning (SBL), which enables meaningful unique solutions to the source reconstruction problem. Moreover, we investigate how to incorporate different structures as degrees of freedom into this framework, which is an application of structured sparsity, and show that it is a promising way to improve the source reconstruction accuracy of electromagnetic imaging methods.
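To make the SBL framework mentioned in the abstract above concrete, here is a minimal illustrative sketch (our own, not code from any work listed here) of the classic evidence-maximization (automatic relevance determination) updates of Tipping-style sparse Bayesian learning for a generic linear model y = Φw + noise. The synthetic data, variable names, and fixed noise variance are all assumptions made for the example.

```python
# Minimal sparse Bayesian learning (ARD) sketch for a linear model y = Phi @ w + noise.
# Illustrative only: synthetic data, fixed noise variance, and MacKay-style updates.
import numpy as np

rng = np.random.default_rng(0)
n, m = 100, 50
Phi = rng.standard_normal((n, m))                 # generic design / forward matrix
w_true = np.zeros(m)
w_true[[3, 17, 31]] = [2.0, -1.5, 1.0]            # a few active coefficients
y = Phi @ w_true + 0.1 * rng.standard_normal(n)

alpha = np.ones(m)                                # per-coefficient prior precisions (ARD)
sigma2 = 0.01                                     # noise variance, kept fixed here

for _ in range(200):
    # Gaussian posterior over w given the current hyperparameters
    Sigma = np.linalg.inv(np.diag(alpha) + Phi.T @ Phi / sigma2)
    mu = Sigma @ Phi.T @ y / sigma2
    # Evidence-maximization (MacKay) update: large alpha_i prunes coefficient i
    gamma = np.clip(1.0 - alpha * np.diag(Sigma), 1e-12, None)
    alpha = np.minimum(gamma / (mu ** 2 + 1e-12), 1e12)

print("estimated support:", np.where(alpha < 1e6)[0])   # should recover {3, 17, 31}
```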
Cherief-Abdellatif, Badr-Eddine. "Contributions to the theoretical study of variational inference and robustness." Electronic Thesis or Diss., Institut polytechnique de Paris, 2020. http://www.theses.fr/2020IPPAG001.
This PhD thesis deals with variational inference and robustness. More precisely, it focuses on the statistical properties of variational approximations and the design of efficient algorithms for computing them in an online fashion, and investigates Maximum Mean Discrepancy based estimators as learning rules that are robust to model misspecification. In recent years, variational inference has been extensively studied from the computational viewpoint, but until very recently little attention had been paid in the literature to the theoretical properties of variational approximations. In this thesis, we investigate the consistency of variational approximations in various statistical models and the conditions that ensure it. In particular, we tackle the special cases of mixture models and deep neural networks. We also justify in theory the use of the ELBO maximization strategy, a model selection criterion that is widely used in the Variational Bayes community and is known to work well in practice. Moreover, Bayesian inference provides an attractive online-learning framework to analyze sequential data, and offers generalization guarantees which hold even under model mismatch and with adversaries. Unfortunately, exact Bayesian inference is rarely feasible in practice and approximation methods are usually employed, but do such methods preserve the generalization properties of Bayesian inference? In this thesis, we show that this is indeed the case for some variational inference algorithms. We propose new online, tempered variational algorithms and derive their generalization bounds. Our theoretical result relies on the convexity of the variational objective, but we argue that it should hold more generally and present empirical evidence in support of this. Our work presents theoretical justifications in favor of online algorithms that rely on approximate Bayesian methods. Another point addressed in this thesis is the design of a universal estimation procedure. This question is of major interest, in particular because it leads to robust estimators, a very active topic in statistics and machine learning. We tackle the problem of universal estimation using a minimum distance estimator based on the Maximum Mean Discrepancy. We show that the estimator is robust both to dependence and to the presence of outliers in the dataset. We also highlight the connections that may exist with minimum distance estimators using the L2-distance. Finally, we provide a theoretical study of the stochastic gradient descent algorithm used to compute the estimator, and we support our findings with numerical simulations. We also propose a Bayesian version of our estimator, which we study from both a theoretical and a computational point of view.
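As a small illustration of the minimum-distance idea described in the abstract above (a toy sketch of ours, not the thesis's procedure or code), one can estimate a Gaussian location parameter by minimizing an empirical Maximum Mean Discrepancy between the data and model samples; unlike the sample mean, the resulting estimate is barely affected by gross outliers. The kernel bandwidth, grid, and data below are arbitrary choices for the example.

```python
# Toy minimum-MMD estimator for a Gaussian location parameter (illustrative sketch only).
# Shows the robustness to outliers that motivates MMD-based estimation.
import numpy as np

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(0.0, 1.0, 190),      # inliers centred at 0
                       rng.normal(20.0, 1.0, 10)])     # 5% gross outliers

def mmd2(x, y, bandwidth=1.0):
    """Biased (V-statistic) estimate of squared MMD with a Gaussian kernel."""
    k = lambda a, b: np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * bandwidth ** 2))
    return k(x, x).mean() + k(y, y).mean() - 2.0 * k(x, y).mean()

# Model N(theta, 1): minimize MMD between the data and shifted model samples over a grid.
model_noise = rng.normal(0.0, 1.0, 500)
grid = np.linspace(-5.0, 25.0, 301)
theta_mmd = grid[int(np.argmin([mmd2(data, theta + model_noise) for theta in grid]))]

print("sample mean (non-robust):", round(float(data.mean()), 2))   # pulled towards ~1.0
print("minimum-MMD estimate    :", round(float(theta_mmd), 2))     # stays close to 0
```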
Le Folgoc, Loïc. "Apprentissage statistique pour la personnalisation de modèles cardiaques à partir de données d'imagerie." Thesis, Nice, 2015. http://www.theses.fr/2015NICE4098/document.
This thesis focuses on the calibration of an electromechanical model of the heart from patient-specific, image-based data, and on the related task of extracting the cardiac motion from 4D images. Long-term perspectives for personalized computer simulation of cardiac function include aiding diagnosis, therapy planning, and risk prevention. To this end, we explore tools and possibilities offered by statistical learning. To personalize cardiac mechanics, we introduce an efficient framework coupling machine learning and an original statistical representation of shape and motion based on 3D+t currents. The method relies on a reduced mapping between the space of mechanical parameters and the space of cardiac motion. The second focus of the thesis is cardiac motion tracking, a key processing step in the calibration pipeline, with an emphasis on quantification of uncertainty. We develop a generic sparse Bayesian model of image registration with three main contributions: an extended image similarity term, the automated tuning of registration parameters, and uncertainty quantification. We propose an approximate inference scheme that is tractable on 4D clinical data. Finally, we evaluate the quality of the uncertainty estimates returned by the approximate inference scheme, comparing its predictions with those of an inference scheme built on reversible jump MCMC. We provide further insight into the theoretical properties of the sparse structured Bayesian model and into the empirical behaviour of both inference schemes.
Dang, Hong-Phuong. "Approches bayésiennes non paramétriques et apprentissage de dictionnaire pour les problèmes inverses en traitement d'image." Thesis, Ecole centrale de Lille, 2016. http://www.theses.fr/2016ECLI0019/document.
Dictionary learning for sparse representation has been widely advocated for solving inverse problems. Optimization methods and parametric approaches to dictionary learning have been particularly explored. These methods meet some limitations, particularly related to the choice of parameters: in general, the dictionary size is fixed in advance, and the sparsity or noise level may also be needed. In this thesis, we show how to perform dictionary and parameter learning jointly, with an emphasis on image processing. We propose and study the Indian Buffet Process for Dictionary Learning (IBP-DL) method, using a Bayesian nonparametric approach. A primer on Bayesian nonparametrics is first presented: Dirichlet and Beta processes and their respective derivatives, the Chinese restaurant and Indian buffet processes, are described. The proposed model for dictionary learning relies on an Indian buffet prior, which permits learning a dictionary of adaptive size. The Monte Carlo method for inference is detailed. Noise and sparsity levels are also inferred, so that in practice no parameter tuning is required. Numerical experiments illustrate the performance of the approach in different settings: image denoising, inpainting, and compressed sensing. Results are compared with state-of-the-art methods. Matlab and C sources are available for the sake of reproducibility.
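The IBP-DL method described above infers the dictionary size itself; a full nonparametric sampler is beyond a short snippet. Purely as a hedged point of reference, the sketch below runs the classical fixed-size dictionary-learning baseline from scikit-learn, i.e. the setting whose need to fix n_components in advance is exactly what IBP-DL removes. The synthetic data and all parameter values are assumptions for the example.

```python
# Classical fixed-size sparse dictionary learning (scikit-learn baseline).
# Shown only as the parametric setting that IBP-DL generalizes: here the dictionary
# size n_components must be chosen in advance, whereas IBP-DL infers it from the data.
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

rng = np.random.default_rng(0)
true_atoms = rng.standard_normal((8, 16))                              # 8 atoms in 16 dimensions
codes = rng.standard_normal((500, 8)) * (rng.random((500, 8)) < 0.2)   # sparse codes
X = codes @ true_atoms + 0.01 * rng.standard_normal((500, 16))         # noisy observations

dico = MiniBatchDictionaryLearning(n_components=8,   # fixed a priori in this baseline
                                   alpha=1.0,        # sparsity penalty
                                   random_state=0)
sparse_codes = dico.fit_transform(X)

print("learned dictionary shape:", dico.components_.shape)        # (8, 16)
print("avg. nonzeros per sample:", (sparse_codes != 0).sum(axis=1).mean())
```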
Gerchinovitz, Sébastien. "Prédiction de suites individuelles et cadre statistique classique : étude de quelques liens autour de la régression parcimonieuse et des techniques d'agrégation." PhD thesis, Université Paris Sud - Paris XI, 2011. http://tel.archives-ouvertes.fr/tel-00653550.
Shi, Minghui. "Bayesian Sparse Learning for High Dimensional Data." Diss., 2011. http://hdl.handle.net/10161/3869.
In this thesis, we develop Bayesian sparse learning methods for high dimensional data analysis. There are two important topics related to the idea of sparse learning: variable selection and factor analysis. We start with the Bayesian variable selection problem in regression models. One challenge in Bayesian variable selection is to search the huge model space adequately, while identifying high posterior probability regions. In the past decades, the main focus has been on the use of Markov chain Monte Carlo (MCMC) algorithms for these purposes. In the first part of this thesis, instead of using MCMC, we propose a new computational approach based on sequential Monte Carlo (SMC), which we refer to as particle stochastic search (PSS). We illustrate PSS through applications to linear regression and probit models.
Besides the Bayesian stochastic search algorithms, there is a rich literature on shrinkage and variable selection methods for high dimensional regression and classification with vector-valued parameters, such as the lasso (Tibshirani, 1996) and the relevance vector machine (Tipping, 2001). Compared with the Bayesian stochastic search algorithms, these methods do not account for model uncertainty but are more computationally efficient. In the second part of this thesis, we generalize this type of idea to matrix-valued parameters and focus on developing an efficient variable selection method for multivariate regression. We propose a Bayesian shrinkage model (BSM) and an efficient algorithm for learning the associated parameters.
In the third part of this thesis, we focus on the topic of factor analysis, which has been widely used in unsupervised learning. One central problem in factor analysis is the determination of the number of latent factors. We propose Bayesian model selection criteria for selecting the number of latent factors based on a graphical factor model. As illustrated in Chapter 4, our proposed method achieves good performance in correctly selecting the number of factors in several different settings. As for applications, we apply the graphical factor model to several different tasks, such as covariance matrix estimation, latent factor regression, and classification.
Dissertation
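The second paragraph of the Shi abstract above contrasts stochastic-search approaches with shrinkage estimators such as the lasso and the relevance vector machine. As a small, hedged illustration (ours, not the thesis's code), scikit-learn ships both an L1 estimator (Lasso) and a sparse-Bayesian-learning regressor (ARDRegression, in the relevance-vector-machine spirit), which can be compared on synthetic sparse data; the problem sizes and thresholds below are arbitrary choices for the example.

```python
# Comparing an L1 shrinkage estimator (lasso) with a sparse Bayesian learning
# regressor (ARD, in the spirit of the relevance vector machine) on synthetic
# sparse data. Illustrative sketch only; not code from the thesis above.
import numpy as np
from sklearn.linear_model import ARDRegression, Lasso

rng = np.random.default_rng(42)
n, p = 200, 100
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[[5, 20, 77]] = [3.0, -2.0, 1.5]                   # three true signals
y = X @ beta + 0.5 * rng.standard_normal(n)

lasso = Lasso(alpha=0.1).fit(X, y)
ard = ARDRegression().fit(X, y)                        # evidence-based ARD pruning

print("lasso nonzero coefficients :", int((lasso.coef_ != 0).sum()))
print("ARD coefficients above 0.1 :", int((np.abs(ard.coef_) > 0.1).sum()))
print("largest ARD coefficients at:", sorted(np.argsort(np.abs(ard.coef_))[-3:].tolist()))
```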
Book chapters on the topic "Sparse Bayesian learning (SBL)"
Chatzis, Sotirios P. "Sparse Bayesian Recurrent Neural Networks." In Machine Learning and Knowledge Discovery in Databases, 359–72. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-23525-7_22.
Huang, Yong, and James L. Beck. "Sparse Bayesian Learning and its Application in Bayesian System Identification." In Bayesian Inverse Problems, 79–111. Boca Raton: CRC Press, 2021. http://dx.doi.org/10.1201/b22018-7.
Zhang, Guanghao, Dongshun Cui, Shangbo Mao, and Guang-Bin Huang. "Sparse Bayesian Learning for Extreme Learning Machine Auto-encoder." In Proceedings in Adaptation, Learning and Optimization, 319–27. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-23307-5_34.
Lei, Yun, Xiaoqing Ding, and Shengjin Wang. "Adaptive Sparse Vector Tracking Via Online Bayesian Learning." In Advances in Machine Vision, Image Processing, and Pattern Analysis, 35–45. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11821045_4.
Michel, Vincent, Evelyn Eger, Christine Keribin, and Bertrand Thirion. "Multi-Class Sparse Bayesian Regression for Neuroimaging Data Analysis." In Machine Learning in Medical Imaging, 50–57. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-15948-0_7.
Wang, Lu, Lifan Zhao, Guoan Bi, and Xin Liu. "Alternative Extended Block Sparse Bayesian Learning for Cluster Structured Sparse Signal Recovery." In Wireless and Satellite Systems, 3–12. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-19153-5_1.
Bo, Liefeng, Ling Wang, and Licheng Jiao. "Sparse Bayesian Learning Based on an Efficient Subset Selection." In Advances in Neural Networks – ISNN 2004, 264–69. Berlin, Heidelberg: Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-540-28647-9_45.
Sun, Shuanzhu, Zhong Han, Xiaolong Qi, Chunlei Zhou, Tiancheng Zhang, Bei Song, and Yang Gao. "An Incremental Approach for Sparse Bayesian Network Structure Learning." In Big Data, 350–65. Singapore: Springer Singapore, 2018. http://dx.doi.org/10.1007/978-981-13-2922-7_24.
Han, Min, and Dayun Mu. "Multi-reservoir Echo State Network with Sparse Bayesian Learning." In Advances in Neural Networks - ISNN 2010, 450–56. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-13278-0_58.
Sabuncu, Mert R. "A Sparse Bayesian Learning Algorithm for Longitudinal Image Data." In Lecture Notes in Computer Science, 411–18. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-24574-4_49.
Conference papers on the topic "Sparse Bayesian learning (SBL)"
Srivastava, Suraj, Ch Suraj Kumar Patro, Aditya K. Jagannatham, and Govind Sharma. "Sparse Bayesian Learning (SBL)-Based Frequency-Selective Channel Estimation for Millimeter Wave Hybrid MIMO Systems." In 2019 National Conference on Communications (NCC). IEEE, 2019. http://dx.doi.org/10.1109/ncc.2019.8732197.
Srivastava, Suraj, and Aditya K. Jagannatham. "Sparse Bayesian Learning-Based Kalman Filtering (SBL-KF) for Group-Sparse Channel Estimation in Doubly Selective mmWave Hybrid MIMO Systems." In 2019 IEEE 20th International Workshop on Signal Processing Advances in Wireless Communications (SPAWC). IEEE, 2019. http://dx.doi.org/10.1109/spawc.2019.8815509.
Chang, Xiao, and Qinghua Zheng. "Sparse Bayesian learning for ranking." In 2009 IEEE International Conference on Granular Computing (GRC). IEEE, 2009. http://dx.doi.org/10.1109/grc.2009.5255164.
Giri, Ritwik, and Bhaskar D. Rao. "Bootstrapped sparse Bayesian learning for sparse signal recovery." In 2014 48th Asilomar Conference on Signals, Systems and Computers. IEEE, 2014. http://dx.doi.org/10.1109/acssc.2014.7094748.
Nannuru, Santosh, Kay L. Gemba, and Peter Gerstoft. "Sparse Bayesian learning with multiple dictionaries." In 2017 IEEE Global Conference on Signal and Information Processing (GlobalSIP). IEEE, 2017. http://dx.doi.org/10.1109/globalsip.2017.8309149.
Liu, Jing, Yacong Ding, and Bhaskar Rao. "Sparse Bayesian Learning for Robust PCA." In ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2019. http://dx.doi.org/10.1109/icassp.2019.8682785.
Pal, Piya, and P. P. Vaidyanathan. "Parameter identifiability in Sparse Bayesian Learning." In ICASSP 2014 - 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2014. http://dx.doi.org/10.1109/icassp.2014.6853919.
K.V., Aiswarya, Gundabattula Naga Rama Mangaiah Naidu, Kalaiarasi K., Kavya D., and Kirthiga S. "Spectrum Sensing using Sparse Bayesian Learning." In 2019 International Conference on Communication and Signal Processing (ICCSP). IEEE, 2019. http://dx.doi.org/10.1109/iccsp.2019.8698108.
Park, Yongsung, Florian Meyer, and Peter Gerstoft. "Sequential Sparse Bayesian Learning for DOA." In 2020 54th Asilomar Conference on Signals, Systems, and Computers. IEEE, 2020. http://dx.doi.org/10.1109/ieeeconf51394.2020.9443444.
Koyama, Shoichi, Atsushi Matsubayashi, Naoki Murata, and Hiroshi Saruwatari. "Sparse sound field decomposition using group sparse Bayesian learning." In 2015 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA). IEEE, 2015. http://dx.doi.org/10.1109/apsipa.2015.7415391.