Academic literature on the topic "Bregman divergences"
Create a correctly formatted reference in APA, MLA, Chicago, Harvard, and many other styles
Consult curated lists of journal articles, books, theses, conference proceedings, and other academic sources on the topic "Bregman divergences".
Next to each source in the reference list you will find an "Add to bibliography" button. Click it, and we will automatically generate a bibliographic reference for the selected source in your preferred citation style: APA, MLA, Harvard, Vancouver, Chicago, and so on.
You can also download the full text of the publication as a PDF and read its abstract online whenever this information is included in the metadata.
Journal articles on the topic "Bregman divergences"
Amari, S., and A. Cichocki. "Information geometry of divergence functions". Bulletin of the Polish Academy of Sciences: Technical Sciences 58, no. 1 (March 1, 2010): 183–95. http://dx.doi.org/10.2478/v10175-010-0019-1.
Abdullah, Amirali, John Moeller, and Suresh Venkatasubramanian. "Approximate Bregman Near Neighbors in Sublinear Time: Beyond the Triangle Inequality". International Journal of Computational Geometry & Applications 23, no. 04n05 (August 2013): 253–301. http://dx.doi.org/10.1142/s0218195913600066.
Nielsen, Frank. "On Voronoi Diagrams on the Information-Geometric Cauchy Manifolds". Entropy 22, no. 7 (June 28, 2020): 713. http://dx.doi.org/10.3390/e22070713.
Li, Wuchen. "Transport information Bregman divergences". Information Geometry 4, no. 2 (November 15, 2021): 435–70. http://dx.doi.org/10.1007/s41884-021-00063-5.
Nielsen, Frank. "On a Generalization of the Jensen–Shannon Divergence and the Jensen–Shannon Centroid". Entropy 22, no. 2 (February 16, 2020): 221. http://dx.doi.org/10.3390/e22020221.
Chen, P., Y. Chen, and M. Rao. "Metrics defined by Bregman Divergences". Communications in Mathematical Sciences 6, no. 4 (2008): 915–26. http://dx.doi.org/10.4310/cms.2008.v6.n4.a6.
Bock, Andreas A., and Martin S. Andersen. "Preconditioner Design via Bregman Divergences". SIAM Journal on Matrix Analysis and Applications 45, no. 2 (June 7, 2024): 1148–82. http://dx.doi.org/10.1137/23m1566637.
Zhang, Jianwen, and Changshui Zhang. "Multitask Bregman Clustering". Proceedings of the AAAI Conference on Artificial Intelligence 24, no. 1 (July 3, 2010): 655–60. http://dx.doi.org/10.1609/aaai.v24i1.7674.
Liang, Xiao. "A Note on Divergences". Neural Computation 28, no. 10 (October 2016): 2045–62. http://dx.doi.org/10.1162/neco_a_00878.
Voronov, Yuri, and Matvej Sviridov. "New Interregional Differences Measurement Tools". Interexpo GEO-Siberia 3, no. 1 (2019): 71–77. http://dx.doi.org/10.33764/2618-981x-2019-3-1-71-77.
Theses on the topic "Bregman divergences"
Wu, Xiaochuan. "Clustering and visualisation with Bregman divergences". Thesis, University of the West of Scotland, 2013. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.730022.
Sun, Jiang. "Extending the metric multidimensional scaling with Bregman divergences". Thesis, University of the West of Scotland, 2010. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.556070.
Godeme, Jean-Jacques. "Phase retrieval with non-Euclidean Bregman based geometry". Electronic Thesis or Diss., Normandie, 2024. http://www.theses.fr/2024NORMC214.
In this work, we investigate the phase retrieval problem for real-valued signals in finite dimension, a challenge encountered across various scientific and engineering disciplines. We explore two complementary approaches: retrieval with and without regularization. In both settings, our work focuses on relaxing the Lipschitz-smoothness assumption generally required by first-order splitting algorithms, which does not hold for phase retrieval cast as a minimization problem. The key idea is to replace the Euclidean geometry by a non-Euclidean Bregman divergence associated with an appropriate kernel. We use a Bregman gradient/mirror descent algorithm with this divergence to solve the unregularized phase retrieval problem, and we show exact recovery (up to a global sign) both in a deterministic setting and with high probability for a sufficient number of random measurements (Gaussian and Coded Diffraction Patterns). Furthermore, we establish the robustness of this approach against small additive noise. Shifting to regularized phase retrieval, we first develop and analyze an inertial Bregman proximal gradient algorithm for minimizing the sum of two functions in finite dimension, one of which is convex and possibly nonsmooth while the second is relatively smooth in the Bregman geometry. We provide both global and local convergence guarantees for this algorithm. Finally, we study noiseless and stable recovery of low-complexity regularized phase retrieval. For this, we formulate the problem as the minimization of an objective functional involving a nonconvex smooth data-fidelity term and a convex regularizer promoting solutions that conform to some notion of low complexity related to their nonsmoothness points. We establish conditions for exact and stable recovery and provide sample-complexity bounds for random measurements ensuring that these conditions hold; these bounds depend on the low complexity of the signals to be recovered. Our new results go far beyond the case of sparse phase retrieval.
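The Bregman gradient (mirror descent) iteration described in this abstract can be sketched in a few lines of NumPy. This is an illustrative sketch, not the thesis's algorithm: the kernel ψ(x) = ¼‖x‖⁴ + ½‖x‖² is the one commonly used in the relative-smoothness literature on phase retrieval, and the step-size bound in the usage note is a conservative guess. The mirror step requires inverting ∇ψ, which reduces to a scalar cubic solved in closed form.

```python
import numpy as np

def grad_psi(x):
    # Gradient of the kernel psi(x) = 1/4 ||x||^4 + 1/2 ||x||^2.
    return (x @ x + 1.0) * x

def grad_psi_star(z):
    # Inverse of grad_psi: x = (t / ||z||) * z, where t >= 0 solves t^3 + t = ||z||.
    r = np.linalg.norm(z)
    if r == 0.0:
        return np.zeros_like(z)
    # Unique real root of t^3 + t - r = 0, via Cardano's formula.
    d = np.sqrt(r**2 / 4.0 + 1.0 / 27.0)
    t = np.cbrt(r / 2.0 + d) + np.cbrt(r / 2.0 - d)
    return (t / r) * z

def grad_f(x, A, y):
    # Gradient of the intensity fit f(x) = 1/(4m) * sum_i ((a_i . x)^2 - y_i)^2.
    Ax = A @ x
    return A.T @ ((Ax**2 - y) * Ax) / len(y)

def bregman_gradient_descent(A, y, x0, step, iters=2000):
    # Mirror step: x_{k+1} = grad_psi_star(grad_psi(x_k) - step * grad_f(x_k)).
    x = x0.copy()
    for _ in range(iters):
        x = grad_psi_star(grad_psi(x) - step * grad_f(x, A, y))
    return x
```

With a step of 1/L, where L bounds the relative smoothness constant of f with respect to ψ (for instance L = 3·mean‖aᵢ‖⁴ + mean|yᵢ|‖aᵢ‖² suffices), the objective decreases monotonically even though f is not Lipschitz-smooth in the Euclidean sense.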
Danilevicz, Ian Meneghel. "Detecting influential observations in spatial models using Bregman divergence". Universidade de São Paulo, 2018. http://www.teses.usp.br/teses/disponiveis/104/104131/tde-13112018-160231/.
How can one assess whether a spatial model is well fitted? How can one choose the best model among many in the conditional autoregressive (CAR) and simultaneous autoregressive (SAR) classes, both homoscedastic and heteroscedastic? To answer these questions within the Bayesian paradigm, we propose new ways of applying the Bregman divergence, together with information criteria that are quite recent in the literature: the widely applicable information criterion (WAIC) and leave-one-out cross-validation (LOO). The Bregman functional is a generalization of the well-known Kullback-Leibler (KL) divergence, and several of its particular cases can be used to identify influential points. All posterior distributions presented in this dissertation were estimated using Hamiltonian Monte Carlo (HMC), an optimized version of the Metropolis-Hastings algorithm. All the ideas presented in this text were evaluated in simulations and applied to real data.
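The Bregman functional mentioned above has a simple generic form, B_φ(x, y) = φ(x) − φ(y) − ⟨∇φ(y), x − y⟩, and its particular cases arise by swapping the convex generator φ. A minimal illustration (the function and generator names here are ours, not the dissertation's):

```python
import numpy as np

def bregman(x, y, phi, grad_phi):
    """Bregman divergence B_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>."""
    return phi(x) - phi(y) - np.dot(grad_phi(y), x - y)

# phi(x) = ||x||^2 yields the squared Euclidean distance ||x - y||^2.
sq = dict(phi=lambda x: np.dot(x, x), grad_phi=lambda x: 2 * x)

# phi(x) = sum x log x yields the generalized KL divergence
# sum x log(x/y) - x + y.
kl = dict(phi=lambda x: np.sum(x * np.log(x)),
          grad_phi=lambda x: np.log(x) + 1)

# phi(x) = -sum log x yields the Itakura-Saito divergence
# sum (x/y - log(x/y) - 1).
is_ = dict(phi=lambda x: -np.sum(np.log(x)),
           grad_phi=lambda x: -1.0 / x)
```

Every choice of φ gives a nonnegative divergence that vanishes only at x = y, which is what makes the family usable as a unified measure of discrepancy between a fitted model and the data.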
Adamcik, Martin. "Collective reasoning under uncertainty and inconsistency". Thesis, University of Manchester, 2014. https://www.research.manchester.ac.uk/portal/en/theses/collective-reasoning-under-uncertainty-and-inconsistency(7fab8021-8beb-45e7-8b45-7cb4fadd70be).html.
Sears, Timothy Dean. "Generalized Maximum Entropy, Convexity and Machine Learning". The Australian National University, Research School of Information Sciences and Engineering, 2008. http://thesis.anu.edu.au./public/adt-ANU20090525.210315.
Silveti-Falls, Antonio. "First-order noneuclidean splitting methods for large-scale optimization: deterministic and stochastic algorithms". Thesis, Normandie, 2021. http://www.theses.fr/2021NORMC204.
In this work we develop and examine two novel first-order splitting algorithms for solving large-scale composite optimization problems in infinite-dimensional spaces. Such problems are ubiquitous in many areas of science and engineering, particularly in data science and imaging sciences. Our work focuses on relaxing the Lipschitz-smoothness assumptions generally required by first-order splitting algorithms by replacing the Euclidean energy with a Bregman divergence. These developments allow one to solve problems with more exotic geometry than that of the usual Euclidean setting. One algorithm is a hybridization of the conditional gradient algorithm, which uses a linear minimization oracle at each iteration, with an augmented Lagrangian algorithm, allowing for affine constraints. The other is a primal-dual splitting algorithm incorporating Bregman divergences for computing the associated proximal operators. For both algorithms, our analysis shows convergence of the Lagrangian values, subsequential weak convergence of the iterates to solutions, and rates of convergence. In addition to these deterministic algorithms, we also introduce and study stochastic extensions of them through a perturbation perspective. Our results in this part include almost-sure convergence of all the same quantities as in the deterministic setting, together with rates. Finally, we tackle new problems that are only accessible through the relaxed assumptions our algorithms allow. We demonstrate numerical efficiency and verify our theoretical results on problems such as low-rank, sparse matrix completion, inverse problems on the simplex, and entropically regularized Wasserstein inverse problems.
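For problems posed on the simplex, as mentioned above, the Bregman machinery is attractive because the proximal step with the negative-entropy kernel has a closed form: a multiplicative update. This is a standard mirror-descent fact sketched under our own naming, not an excerpt of the thesis's algorithms:

```python
import numpy as np

def entropic_mirror_step(x, grad, step):
    # Closed-form solution of argmin_{u in simplex} <grad, u> + (1/step) * KL(u, x):
    # a multiplicative update followed by renormalization.
    w = x * np.exp(-step * grad)
    return w / w.sum()

def minimize_on_simplex(grad_f, x0, step=0.5, iters=200):
    # Plain (unaccelerated) Bregman proximal gradient on the probability simplex.
    x = x0.copy()
    for _ in range(iters):
        x = entropic_mirror_step(x, grad_f(x), step)
    return x
```

The iterates stay strictly inside the simplex by construction, so no Euclidean projection is ever needed; minimizing a linear cost, for example, drives all the mass onto the cheapest coordinate.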
Hasnat, Md Abul. "Unsupervised 3D image clustering and extension to joint color and depth segmentation". Thesis, Saint-Etienne, 2014. http://www.theses.fr/2014STET4013/document.
Access to 3D images at a reasonable frame rate is now widespread, thanks to recent advances in low-cost depth sensors as well as efficient methods to compute 3D from 2D images. As a consequence, there is strong demand for enhancing existing computer vision applications by incorporating 3D information. Indeed, numerous studies have demonstrated that the accuracy of different tasks increases when 3D information is included as an additional feature. However, for the task of indoor scene analysis and segmentation, several important issues remain, such as: (a) how can the 3D information itself be exploited? and (b) what is the best way to fuse color and 3D in an unsupervised manner? In this thesis, we address these issues and propose novel unsupervised methods for 3D image clustering and joint color and depth image segmentation. To this aim, we consider image normals as the prominent feature from the 3D image and cluster them with methods based on finite statistical mixture models. We adopt the Bregman soft clustering method to ensure computationally efficient clustering. Moreover, we exploit several probability distributions from directional statistics, such as the von Mises-Fisher distribution and the Watson distribution. By combining these, we propose novel model-based clustering methods. We empirically validate these methods on synthetic data and then demonstrate their application to 3D/depth image analysis. Afterwards, we extend these methods to segment synchronized 3D and color images, also called RGB-D images. To this aim, we first propose a statistical image-generation model for RGB-D images. We then propose a novel RGB-D segmentation method using a joint color-spatial-axial clustering and a statistical planar region-merging method. Results show that the proposed method is comparable with state-of-the-art methods and requires less computation time. Moreover, it opens interesting perspectives for fusing color and geometry in an unsupervised manner. We believe that the methods proposed in this thesis are equally applicable and extendable to clustering other types of data, such as speech and gene expressions, and that they can be used for complex tasks such as joint image-speech data analysis.
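The Bregman soft clustering method adopted above has a simpler hard-assignment counterpart with exactly the shape of k-means: points go to the centroid with the smallest divergence, and each centroid is updated as the arithmetic mean of its points, which minimizes the summed divergence for every Bregman generator (a result of Banerjee et al.). A minimal sketch with the generalized KL divergence as an example generator; the helper names are ours:

```python
import numpy as np

def kl_div(X, c):
    # Generalized KL divergence of each row of X from centroid c.
    return np.sum(X * np.log(X / c) - X + c, axis=-1)

def bregman_kmeans(X, init_centroids, divergence, iters=50):
    # Hard Bregman clustering: assign each point to the nearest centroid
    # under the divergence, then update each centroid as the arithmetic
    # mean of its assigned points.
    centroids = init_centroids.copy()
    for _ in range(iters):
        dists = np.stack([divergence(X, c) for c in centroids], axis=1)
        labels = dists.argmin(axis=1)
        for j in range(len(centroids)):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids
```

Because the centroid update is the plain mean for any generator, swapping `kl_div` for another Bregman divergence changes only the assignment step, not the update step.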
Acharyya, Sreangsu. "Learning to rank in supervised and unsupervised settings using convexity and monotonicity". 2013. http://hdl.handle.net/2152/21154.
Sprung, Benjamin. "Convergence rates for variational regularization of statistical inverse problems". Doctoral thesis, 2019. http://hdl.handle.net/21.11130/00-1735-0000-0005-1398-A.
Book chapters on the topic "Bregman divergences"
Nielsen, Frank, and Gaëtan Hadjeres. "Quasiconvex Jensen Divergences and Quasiconvex Bregman Divergences". In Springer Proceedings in Mathematics & Statistics, 196–218. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-77957-3_11.
Nielsen, Frank, and Richard Nock. "Bregman Divergences from Comparative Convexity". In Lecture Notes in Computer Science, 639–47. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-68445-1_74.
Lai, Pei Ling, and Colin Fyfe. "Bregman Divergences and Multi-dimensional Scaling". In Advances in Neuro-Information Processing, 935–42. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-03040-6_114.
Wang, Xi, and Colin Fyfe. "Independent Component Analysis Using Bregman Divergences". In Trends in Applied Intelligent Systems, 627–36. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-13025-0_64.
Jang, Eunsong, Colin Fyfe, and Hanseok Ko. "Bregman Divergences and the Self Organising Map". In Lecture Notes in Computer Science, 452–58. Berlin, Heidelberg: Springer Berlin Heidelberg, 2008. http://dx.doi.org/10.1007/978-3-540-88906-9_57.
Santos-Rodríguez, Raúl, Alicia Guerrero-Curieses, Rocío Alaiz-Rodríguez, and Jesús Cid-Sueiro. "Cost-Sensitive Learning Based on Bregman Divergences". In Machine Learning and Knowledge Discovery in Databases, 12. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-04180-8_12.
Sun, Jigang, Malcolm Crowe, and Colin Fyfe. "Extending Metric Multidimensional Scaling with Bregman Divergences". In Trends in Applied Intelligent Systems, 615–26. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-13025-0_63.
Wang, Shaojun, and Dale Schuurmans. "Learning Continuous Latent Variable Models with Bregman Divergences". In Lecture Notes in Computer Science, 190–204. Berlin, Heidelberg: Springer Berlin Heidelberg, 2003. http://dx.doi.org/10.1007/978-3-540-39624-6_16.
Wu, Yan, Liang Du, and Honghong Cheng. "Multi-view K-Means Clustering with Bregman Divergences". In Communications in Computer and Information Science, 26–38. Singapore: Springer Singapore, 2018. http://dx.doi.org/10.1007/978-981-13-2122-1_3.
Stummer, Wolfgang, and Anna-Lena Kißlinger. "Some New Flexibilizations of Bregman Divergences and Their Asymptotics". In Lecture Notes in Computer Science, 514–22. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-68445-1_60.
Conference papers on the topic "Bregman divergences"
Banerjee, Arindam, Srujana Merugu, Inderjit Dhillon, and Joydeep Ghosh. "Clustering with Bregman Divergences". In Proceedings of the 2004 SIAM International Conference on Data Mining. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2004. http://dx.doi.org/10.1137/1.9781611972740.22.
Acharyya, Sreangsu, Arindam Banerjee, and Daniel Boley. "Bregman Divergences and Triangle Inequality". In Proceedings of the 2013 SIAM International Conference on Data Mining. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2013. http://dx.doi.org/10.1137/1.9781611972832.53.
Inan, Huseyin A., Mehmet A. Donmez, and Suleyman S. Kozat. "Adaptive mixture methods using Bregman divergences". In ICASSP 2012 - 2012 IEEE International Conference on Acoustics, Speech and Signal Processing. IEEE, 2012. http://dx.doi.org/10.1109/icassp.2012.6288740.
Cayton, Lawrence. "Fast nearest neighbor retrieval for Bregman divergences". In the 25th international conference. New York, New York, USA: ACM Press, 2008. http://dx.doi.org/10.1145/1390156.1390171.
Harandi, Mehrtash, Mathieu Salzmann, and Fatih Porikli. "Bregman Divergences for Infinite Dimensional Covariance Matrices". In 2014 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2014. http://dx.doi.org/10.1109/cvpr.2014.132.
Ackermann, Marcel R., and Johannes Blömer. "Coresets and Approximate Clustering for Bregman Divergences". In Proceedings of the Twentieth Annual ACM-SIAM Symposium on Discrete Algorithms. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2009. http://dx.doi.org/10.1137/1.9781611973068.118.
Ferreira, Daniela P. L., Eraldo Ribeiro, and Celia A. Z. Barcelos. "Variational non rigid registration with Bregman divergences". In SAC 2017: Symposium on Applied Computing. New York, NY, USA: ACM, 2017. http://dx.doi.org/10.1145/3019612.3019646.
Wang, Shaojun, and D. Schuurmans. "Learning latent variable models with Bregman divergences". In IEEE International Symposium on Information Theory, 2003. Proceedings. IEEE, 2003. http://dx.doi.org/10.1109/isit.2003.1228234.
Escolano, Francisco, Meizhu Liu, and Edwin R. Hancock. "Tensor-based total Bregman divergences between graphs". In 2011 IEEE International Conference on Computer Vision Workshops (ICCV Workshops). IEEE, 2011. http://dx.doi.org/10.1109/iccvw.2011.6130420.
Magron, Paul, Pierre-Hugo Vial, Thomas Oberlin, and Cedric Fevotte. "Phase Recovery with Bregman Divergences for Audio Source Separation". In ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2021. http://dx.doi.org/10.1109/icassp39728.2021.9413717.
Organizational reports on the topic "Bregman divergences"
Li, Xinyao. Block active ADMM to Minimize NMF with Bregman Divergences. Ames, Iowa: Iowa State University, January 2021. http://dx.doi.org/10.31274/cc-20240624-307.
Cherian, Anoop, Suvrit Sra, Arindam Banerjee, and Nikos Papanikolopoulos. Jensen-Bregman LogDet Divergence for Efficient Similarity Computations on Positive Definite Tensors. Fort Belvoir, VA: Defense Technical Information Center, May 2012. http://dx.doi.org/10.21236/ada561322.