A selection of scientific literature on the topic "Divergences de Bregman"
Format your source in APA, MLA, Chicago, Harvard, and other citation styles
Browse lists of relevant articles, books, dissertations, conference papers, and other scholarly sources on the topic "Divergences de Bregman".
Journal articles on the topic "Divergences de Bregman"
Amari, S., and A. Cichocki. "Information geometry of divergence functions." Bulletin of the Polish Academy of Sciences: Technical Sciences 58, no. 1 (March 1, 2010): 183–95. http://dx.doi.org/10.2478/v10175-010-0019-1.
Abdullah, Amirali, John Moeller, and Suresh Venkatasubramanian. "Approximate Bregman Near Neighbors in Sublinear Time: Beyond the Triangle Inequality." International Journal of Computational Geometry & Applications 23, no. 04n05 (August 2013): 253–301. http://dx.doi.org/10.1142/s0218195913600066.
Nielsen, Frank. "On Voronoi Diagrams on the Information-Geometric Cauchy Manifolds." Entropy 22, no. 7 (June 28, 2020): 713. http://dx.doi.org/10.3390/e22070713.
Li, Wuchen. "Transport information Bregman divergences." Information Geometry 4, no. 2 (November 15, 2021): 435–70. http://dx.doi.org/10.1007/s41884-021-00063-5.
Nielsen, Frank. "On a Generalization of the Jensen–Shannon Divergence and the Jensen–Shannon Centroid." Entropy 22, no. 2 (February 16, 2020): 221. http://dx.doi.org/10.3390/e22020221.
Chen, P., Y. Chen, and M. Rao. "Metrics defined by Bregman Divergences." Communications in Mathematical Sciences 6, no. 4 (2008): 915–26. http://dx.doi.org/10.4310/cms.2008.v6.n4.a6.
Bock, Andreas A., and Martin S. Andersen. "Preconditioner Design via Bregman Divergences." SIAM Journal on Matrix Analysis and Applications 45, no. 2 (June 7, 2024): 1148–82. http://dx.doi.org/10.1137/23m1566637.
Zhang, Jianwen, and Changshui Zhang. "Multitask Bregman Clustering." Proceedings of the AAAI Conference on Artificial Intelligence 24, no. 1 (July 3, 2010): 655–60. http://dx.doi.org/10.1609/aaai.v24i1.7674.
Liang, Xiao. "A Note on Divergences." Neural Computation 28, no. 10 (October 2016): 2045–62. http://dx.doi.org/10.1162/neco_a_00878.
Voronov, Yuri, and Matvej Sviridov. "New Interregional Differences Measurement Tools." Interexpo GEO-Siberia 3, no. 1 (2019): 71–77. http://dx.doi.org/10.33764/2618-981x-2019-3-1-71-77.
Повний текст джерелаДисертації з теми "Divergences de Bregman"
Wu, Xiaochuan. "Clustering and visualisation with Bregman divergences." Thesis, University of the West of Scotland, 2013. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.730022.
Sun, Jiang. "Extending the metric multidimensional scaling with Bregman divergences." Thesis, University of the West of Scotland, 2010. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.556070.
Godeme, Jean-Jacques. "Phase retrieval with non-Euclidean Bregman based geometry." Electronic Thesis or Diss., Normandie, 2024. http://www.theses.fr/2024NORMC214.
Повний текст джерелаIn this work, we investigate the phase retrieval problem of real-valued signals in finite dimension, a challenge encountered across various scientific and engineering disciplines. It explores two complementary approaches: retrieval with and without regularization. In both settings, our work is focused on relaxing the Lipschitz-smoothness assumption generally required by first-order splitting algorithms, and which is not valid for phase retrieval cast as a minimization problem. The key idea here is to replace the Euclidean geometry by a non-Euclidean Bregman divergence associated to an appropriate kernel. We use a Bregman gradient/mirror descent algorithm with this divergence to solve thephase retrieval problem without regularization, and we show exact (up to a global sign) recovery both in a deterministic setting and with high probability for a sufficient number of random measurements (Gaussian and Coded Diffraction Patterns). Furthermore, we establish the robustness of this approachagainst small additive noise. Shifting to regularized phase retrieval, we first develop and analyze an Inertial Bregman Proximal Gradient algorithm for minimizing the sum of two functions in finite dimension, one of which is convex and possibly nonsmooth and the second is relatively smooth in the Bregman geometry. We provide both global and local convergence guarantees for this algorithm. Finally, we study noiseless and stable recovery of low complexity regularized phase retrieval. For this, weformulate the problem as the minimization of an objective functional involving a nonconvex smooth data fidelity term and a convex regularizer promoting solutions conforming to some notion of low-complexity related to their nonsmoothness points. We establish conditions for exact and stable recovery and provide sample complexity bounds for random measurements to ensure that these conditions hold. These sample bounds depend on the low complexity of the signals to be recovered. 
Our new results allow to go far beyond the case of sparse phase retrieval
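The Bregman gradient (mirror descent) scheme described in the abstract can be sketched in a few lines. The sketch below is illustrative only, not the thesis's implementation: it assumes the quartic kernel psi(x) = 0.25*||x||^4 + 0.5*||x||^2 that is commonly paired with the phase retrieval loss f(x) = (1/4m) * sum_i ((a_i . x)^2 - y_i)^2, for which the mirror map grad psi can be inverted by solving a scalar cubic; the function names and the bisection-based inversion are our own choices.

```python
import math

def grad_f(x, A, y):
    """Gradient of the phase retrieval loss f(x) = (1/4m) * sum_i ((a_i.x)^2 - y_i)^2."""
    m, n = len(A), len(x)
    g = [0.0] * n
    for a, yi in zip(A, y):
        s = sum(ai * xi for ai, xi in zip(a, x))
        c = (s * s - yi) * s / m
        for j in range(n):
            g[j] += c * a[j]
    return g

def mirror_step(x, g, step):
    """One Bregman/mirror descent step with kernel psi(x) = 0.25||x||^4 + 0.5||x||^2,
    whose mirror map is grad psi(x) = (||x||^2 + 1) x."""
    nx2 = sum(xi * xi for xi in x)
    # dual-space step: v = grad psi(x) - step * g
    v = [(nx2 + 1.0) * xi - step * gi for xi, gi in zip(x, g)]
    r = math.sqrt(sum(vi * vi for vi in v))
    if r == 0.0:
        return [0.0] * len(x)
    # invert grad psi: the new point is s * v/||v|| where s >= 0 solves s^3 + s = r;
    # the left side is strictly increasing, so bisection converges
    lo, hi = 0.0, max(1.0, r)
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if mid ** 3 + mid < r else (lo, mid)
    s = 0.5 * (lo + hi)
    return [s * vi / r for vi in v]
```

Each iteration maps the current point through grad psi, takes a gradient step in the dual space, and maps back; for a step size below the reciprocal of the relative-smoothness constant, this kind of scheme is known to decrease the objective even though f is not Lipschitz-smooth in the Euclidean sense.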
Danilevicz, Ian Meneghel. "Detecting Influential observations in spatial models using Bregman divergence." Universidade de São Paulo, 2018. http://www.teses.usp.br/teses/disponiveis/104/104131/tde-13112018-160231/.
How can one assess whether a spatial model is well fitted? How can one choose the best model among many from the conditional autoregressive (CAR) and simultaneous autoregressive (SAR) classes, both homoscedastic and heteroscedastic? To answer these questions within the Bayesian paradigm, we propose new ways of applying the Bregman divergence, as well as information criteria that are quite recent in the literature: the widely applicable information criterion (WAIC) and leave-one-out cross-validation (LOO). The Bregman functional is a generalization of the famous Kullback-Leibler (KL) divergence. Several of its particular cases can be used to identify influential points. All posterior distributions presented in this dissertation were estimated using Hamiltonian Monte Carlo (HMC), an optimized version of the Metropolis-Hastings algorithm. All ideas presented in this text were tested in simulations and applied to real data.
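The abstract above turns on the fact that the Bregman divergence D_phi(p, q) = phi(p) - phi(q) - <grad phi(q), p - q> generalizes KL. A minimal sketch (the function names are ours, not the dissertation's): plugging in the negative Shannon entropy as generator recovers the KL divergence between probability vectors, while the squared norm recovers the squared Euclidean distance.

```python
import math

def bregman(phi, grad_phi, p, q):
    """Bregman divergence D_phi(p, q) = phi(p) - phi(q) - <grad_phi(q), p - q>."""
    return phi(p) - phi(q) - sum(g * (pi - qi)
                                 for g, pi, qi in zip(grad_phi(q), p, q))

# Negative Shannon entropy as generator: recovers KL on probability vectors.
neg_entropy = lambda x: sum(xi * math.log(xi) for xi in x)
grad_neg_entropy = lambda x: [math.log(xi) + 1.0 for xi in x]

# Squared Euclidean norm as generator: recovers the squared L2 distance.
sq_norm = lambda x: sum(xi * xi for xi in x)
grad_sq_norm = lambda x: [2.0 * xi for xi in x]
```

The choice of generator determines which discrepancy is measured between a fitted and a perturbed posterior quantity, which is what makes the Bregman functional a flexible basis for influence diagnostics.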
Adamcik, Martin. "Collective reasoning under uncertainty and inconsistency." Thesis, University of Manchester, 2014. https://www.research.manchester.ac.uk/portal/en/theses/collective-reasoning-under-uncertainty-and-inconsistency(7fab8021-8beb-45e7-8b45-7cb4fadd70be).html.
Sears, Timothy Dean. "Generalized Maximum Entropy, Convexity and Machine Learning." The Australian National University. Research School of Information Sciences and Engineering, 2008. http://thesis.anu.edu.au./public/adt-ANU20090525.210315.
Повний текст джерелаSilveti, Falls Antonio. "First-order noneuclidean splitting methods for large-scale optimization : deterministic and stochastic algorithms." Thesis, Normandie, 2021. http://www.theses.fr/2021NORMC204.
In this work we develop and examine two novel first-order splitting algorithms for solving large-scale composite optimization problems in infinite-dimensional spaces. Such problems are ubiquitous in many areas of science and engineering, particularly in data science and imaging. Our work focuses on relaxing the Lipschitz-smoothness assumptions generally required by first-order splitting algorithms by replacing the Euclidean energy with a Bregman divergence. These developments allow one to solve problems having more exotic geometry than that of the usual Euclidean setting. One algorithm is a hybridization of the conditional gradient algorithm, making use of a linear minimization oracle at each iteration, with an augmented Lagrangian algorithm, allowing for affine constraints. The other is a primal-dual splitting algorithm incorporating Bregman divergences for computing the associated proximal operators. For both algorithms, our analysis shows convergence of the Lagrangian values, subsequential weak convergence of the iterates to solutions, and rates of convergence. In addition to these novel deterministic algorithms, we also introduce and study stochastic extensions of them through a perturbation perspective. Our results in this part include almost-sure convergence of all the same quantities as in the deterministic setting, with rates as well. Finally, we tackle new problems that are accessible only through the relaxed assumptions our algorithms allow. We demonstrate numerical efficiency and verify our theoretical results on problems such as low-rank, sparse matrix completion, inverse problems on the simplex, and entropically regularized Wasserstein inverse problems.
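A building block behind the abstract above is the Bregman proximal step, i.e. a gradient step measured in a Bregman divergence rather than the Euclidean energy. As an illustrative sketch (not the thesis's algorithms, which handle general composite problems in infinite dimension): with the negative-entropy kernel on the probability simplex, the step has a well-known closed form, the exponentiated-gradient update. The function name is our own.

```python
import math

def entropic_bregman_step(x, grad, step):
    """Bregman proximal-gradient step on the probability simplex with the
    negative-entropy kernel. The minimizer of
        <grad, z> + (1/step) * D_KL(z, x)  over the simplex
    has the closed form x_i * exp(-step * grad_i), renormalized."""
    w = [xi * math.exp(-step * gi) for xi, gi in zip(x, grad)]
    z = sum(w)
    return [wi / z for wi in w]
```

Iterating this step on a linear objective drives all mass toward the smallest-cost coordinate while keeping the iterate exactly on the simplex, which illustrates why entropy-type kernels suit the "inverse problems on the simplex" mentioned in the abstract.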
Hasnat, Md Abul. "Unsupervised 3D image clustering and extension to joint color and depth segmentation." Thesis, Saint-Etienne, 2014. http://www.theses.fr/2014STET4013/document.
Access to 3D images at a reasonable frame rate is now widespread, thanks to recent advances in low-cost depth sensors as well as efficient methods to compute 3D from 2D images. As a consequence, there is strong demand to enhance existing computer vision applications by incorporating 3D information. Indeed, numerous studies have demonstrated that the accuracy of different tasks increases when 3D information is included as an additional feature. However, for the task of indoor scene analysis and segmentation, several important issues remain, such as: (a) how can the 3D information itself be exploited? and (b) what is the best way to fuse color and 3D in an unsupervised manner? In this thesis, we address these issues and propose novel unsupervised methods for 3D image clustering and joint color and depth image segmentation. To this aim, we consider image normals as the prominent feature from the 3D image and cluster them with methods based on finite statistical mixture models. We adopt the Bregman soft clustering method to ensure computationally efficient clustering. Moreover, we exploit several probability distributions from directional statistics, such as the von Mises-Fisher distribution and the Watson distribution. By combining these, we propose novel model-based clustering methods. We empirically validate these methods using synthetic data and then demonstrate their application to 3D/depth image analysis. Afterward, we extend these methods to segment synchronized 3D and color images, also called RGB-D images. To this aim, we first propose a statistical image generation model for RGB-D images. Then, we propose a novel RGB-D segmentation method using a joint color-spatial-axial clustering and a statistical planar region merging method. Results show that the proposed method is comparable with state-of-the-art methods and requires less computation time. Moreover, it opens interesting perspectives for fusing color and geometry in an unsupervised manner. We believe that the methods proposed in this thesis are equally applicable and extendable to clustering other types of data, such as speech and gene expression, and that they can be used for complex tasks such as joint image-speech data analysis.
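Bregman soft clustering, which the thesis above builds on, is an EM-style loop in which responsibilities are computed from exp(-D(x, mu)) and every centroid update is a plain responsibility-weighted mean, whichever Bregman divergence is used (Banerjee et al., 2005). A minimal sketch with the squared Euclidean divergence as the default; the thesis itself pairs the scheme with directional distributions (von Mises-Fisher, Watson), and the function name is our own.

```python
import math

def bregman_soft_cluster(points, centroids, iters=50, divergence=None):
    """Bregman soft clustering: E-step weights exp(-D(x, mu)), M-step takes
    responsibility-weighted arithmetic means, which is the optimal centroid
    update for every Bregman divergence."""
    if divergence is None:  # squared Euclidean distance by default
        divergence = lambda x, m: sum((a - b) ** 2 for a, b in zip(x, m))
    k, dim = len(centroids), len(points[0])
    for _ in range(iters):
        # E-step: soft assignments of each point to each centroid
        R = []
        for x in points:
            w = [math.exp(-divergence(x, m)) for m in centroids]
            z = sum(w) or 1.0  # guard against total underflow
            R.append([wi / z for wi in w])
        # M-step: weighted means (no divergence-specific formula needed)
        for h in range(k):
            tot = sum(r[h] for r in R)
            if tot > 0:
                centroids[h] = [sum(r[h] * x[j] for r, x in zip(R, points)) / tot
                                for j in range(dim)]
    return centroids, R
```

Swapping the `divergence` argument for, say, a (suitably scaled) KL or Itakura-Saito divergence changes the implied exponential-family mixture without touching the M-step, which is the efficiency argument the abstract alludes to.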
Acharyya, Sreangsu. "Learning to rank in supervised and unsupervised settings using convexity and monotonicity." 2013. http://hdl.handle.net/2152/21154.
Sprung, Benjamin. "Convergence rates for variational regularization of statistical inverse problems." Doctoral thesis, 2019. http://hdl.handle.net/21.11130/00-1735-0000-0005-1398-A.
Book chapters on the topic "Divergences de Bregman"
Nielsen, Frank, and Gaëtan Hadjeres. "Quasiconvex Jensen Divergences and Quasiconvex Bregman Divergences." In Springer Proceedings in Mathematics & Statistics, 196–218. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-77957-3_11.
Nielsen, Frank, and Richard Nock. "Bregman Divergences from Comparative Convexity." In Lecture Notes in Computer Science, 639–47. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-68445-1_74.
Lai, Pei Ling, and Colin Fyfe. "Bregman Divergences and Multi-dimensional Scaling." In Advances in Neuro-Information Processing, 935–42. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-03040-6_114.
Wang, Xi, and Colin Fyfe. "Independent Component Analysis Using Bregman Divergences." In Trends in Applied Intelligent Systems, 627–36. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-13025-0_64.
Jang, Eunsong, Colin Fyfe, and Hanseok Ko. "Bregman Divergences and the Self Organising Map." In Lecture Notes in Computer Science, 452–58. Berlin, Heidelberg: Springer Berlin Heidelberg, 2008. http://dx.doi.org/10.1007/978-3-540-88906-9_57.
Santos-Rodríguez, Raúl, Alicia Guerrero-Curieses, Rocío Alaiz-Rodríguez, and Jesús Cid-Sueiro. "Cost-Sensitive Learning Based on Bregman Divergences." In Machine Learning and Knowledge Discovery in Databases, 12. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-04180-8_12.
Sun, Jigang, Malcolm Crowe, and Colin Fyfe. "Extending Metric Multidimensional Scaling with Bregman Divergences." In Trends in Applied Intelligent Systems, 615–26. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-13025-0_63.
Wang, Shaojun, and Dale Schuurmans. "Learning Continuous Latent Variable Models with Bregman Divergences." In Lecture Notes in Computer Science, 190–204. Berlin, Heidelberg: Springer Berlin Heidelberg, 2003. http://dx.doi.org/10.1007/978-3-540-39624-6_16.
Wu, Yan, Liang Du, and Honghong Cheng. "Multi-view K-Means Clustering with Bregman Divergences." In Communications in Computer and Information Science, 26–38. Singapore: Springer Singapore, 2018. http://dx.doi.org/10.1007/978-981-13-2122-1_3.
Stummer, Wolfgang, and Anna-Lena Kißlinger. "Some New Flexibilizations of Bregman Divergences and Their Asymptotics." In Lecture Notes in Computer Science, 514–22. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-68445-1_60.
Conference papers on the topic "Divergences de Bregman"
Banerjee, Arindam, Srujana Merugu, Inderjit Dhillon, and Joydeep Ghosh. "Clustering with Bregman Divergences." In Proceedings of the 2004 SIAM International Conference on Data Mining. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2004. http://dx.doi.org/10.1137/1.9781611972740.22.
Acharyya, Sreangsu, Arindam Banerjee, and Daniel Boley. "Bregman Divergences and Triangle Inequality." In Proceedings of the 2013 SIAM International Conference on Data Mining. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2013. http://dx.doi.org/10.1137/1.9781611972832.53.
Inan, Huseyin A., Mehmet A. Donmez, and Suleyman S. Kozat. "Adaptive mixture methods using Bregman divergences." In ICASSP 2012 - 2012 IEEE International Conference on Acoustics, Speech and Signal Processing. IEEE, 2012. http://dx.doi.org/10.1109/icassp.2012.6288740.
Cayton, Lawrence. "Fast nearest neighbor retrieval for Bregman divergences." In Proceedings of the 25th International Conference on Machine Learning. New York, New York, USA: ACM Press, 2008. http://dx.doi.org/10.1145/1390156.1390171.
Harandi, Mehrtash, Mathieu Salzmann, and Fatih Porikli. "Bregman Divergences for Infinite Dimensional Covariance Matrices." In 2014 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2014. http://dx.doi.org/10.1109/cvpr.2014.132.
Ackermann, Marcel R., and Johannes Blömer. "Coresets and Approximate Clustering for Bregman Divergences." In Proceedings of the Twentieth Annual ACM-SIAM Symposium on Discrete Algorithms. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2009. http://dx.doi.org/10.1137/1.9781611973068.118.
Ferreira, Daniela P. L., Eraldo Ribeiro, and Celia A. Z. Barcelos. "Variational non rigid registration with Bregman divergences." In SAC 2017: Symposium on Applied Computing. New York, NY, USA: ACM, 2017. http://dx.doi.org/10.1145/3019612.3019646.
Wang, Shaojun, and Dale Schuurmans. "Learning latent variable models with Bregman divergences." In IEEE International Symposium on Information Theory, 2003. Proceedings. IEEE, 2003. http://dx.doi.org/10.1109/isit.2003.1228234.
Escolano, Francisco, Meizhu Liu, and Edwin R. Hancock. "Tensor-based total Bregman divergences between graphs." In 2011 IEEE International Conference on Computer Vision Workshops (ICCV Workshops). IEEE, 2011. http://dx.doi.org/10.1109/iccvw.2011.6130420.
Magron, Paul, Pierre-Hugo Vial, Thomas Oberlin, and Cedric Fevotte. "Phase Recovery with Bregman Divergences for Audio Source Separation." In ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2021. http://dx.doi.org/10.1109/icassp39728.2021.9413717.
Organizational reports on the topic "Divergences de Bregman"
Li, Xinyao. Block active ADMM to Minimize NMF with Bregman Divergences. Ames (Iowa): Iowa State University, January 2021. http://dx.doi.org/10.31274/cc-20240624-307.
Cherian, Anoop, Suvrit Sra, Arindam Banerjee, and Nikos Papanikolopoulos. Jensen-Bregman LogDet Divergence for Efficient Similarity Computations on Positive Definite Tensors. Fort Belvoir, VA: Defense Technical Information Center, May 2012. http://dx.doi.org/10.21236/ada561322.