A selection of scientific literature on the topic "Divergences de Bregman"
Cite a source in APA, MLA, Chicago, Harvard, and other citation styles.
Browse the lists of current articles, books, dissertations, reports, and other scholarly sources on the topic "Divergences de Bregman".
Journal articles on the topic "Divergences de Bregman"
Amari, S., and A. Cichocki. "Information geometry of divergence functions". Bulletin of the Polish Academy of Sciences: Technical Sciences 58, no. 1 (March 1, 2010): 183–95. http://dx.doi.org/10.2478/v10175-010-0019-1.
Abdullah, Amirali, John Moeller, and Suresh Venkatasubramanian. "Approximate Bregman Near Neighbors in Sublinear Time: Beyond the Triangle Inequality". International Journal of Computational Geometry & Applications 23, no. 04n05 (August 2013): 253–301. http://dx.doi.org/10.1142/s0218195913600066.
Nielsen, Frank. "On Voronoi Diagrams on the Information-Geometric Cauchy Manifolds". Entropy 22, no. 7 (June 28, 2020): 713. http://dx.doi.org/10.3390/e22070713.
Li, Wuchen. "Transport information Bregman divergences". Information Geometry 4, no. 2 (November 15, 2021): 435–70. http://dx.doi.org/10.1007/s41884-021-00063-5.
Nielsen, Frank. "On a Generalization of the Jensen–Shannon Divergence and the Jensen–Shannon Centroid". Entropy 22, no. 2 (February 16, 2020): 221. http://dx.doi.org/10.3390/e22020221.
Chen, P., Y. Chen, and M. Rao. "Metrics defined by Bregman Divergences". Communications in Mathematical Sciences 6, no. 4 (2008): 915–26. http://dx.doi.org/10.4310/cms.2008.v6.n4.a6.
Bock, Andreas A., and Martin S. Andersen. "Preconditioner Design via Bregman Divergences". SIAM Journal on Matrix Analysis and Applications 45, no. 2 (June 7, 2024): 1148–82. http://dx.doi.org/10.1137/23m1566637.
Zhang, Jianwen, and Changshui Zhang. "Multitask Bregman Clustering". Proceedings of the AAAI Conference on Artificial Intelligence 24, no. 1 (July 3, 2010): 655–60. http://dx.doi.org/10.1609/aaai.v24i1.7674.
Liang, Xiao. "A Note on Divergences". Neural Computation 28, no. 10 (October 2016): 2045–62. http://dx.doi.org/10.1162/neco_a_00878.
Voronov, Yuri, and Matvej Sviridov. "NEW INTERREGIONAL DIFFERENCES MEASUREMENT TOOLS". Interexpo GEO-Siberia 3, no. 1 (2019): 71–77. http://dx.doi.org/10.33764/2618-981x-2019-3-1-71-77.
Dissertations on the topic "Divergences de Bregman"
Wu, Xiaochuan. "Clustering and visualisation with Bregman divergences". Thesis, University of the West of Scotland, 2013. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.730022.
Sun, Jiang. "Extending the metric multidimensional scaling with Bregman divergences". Thesis, University of the West of Scotland, 2010. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.556070.
Godeme, Jean-Jacques. "Phase retrieval with non-Euclidean Bregman based geometry". Electronic Thesis or Diss., Normandie, 2024. http://www.theses.fr/2024NORMC214.
In this work, we investigate the phase retrieval problem for real-valued signals in finite dimension, a challenge encountered across various scientific and engineering disciplines. It explores two complementary approaches: retrieval with and without regularization. In both settings, our work focuses on relaxing the Lipschitz-smoothness assumption generally required by first-order splitting algorithms, which does not hold for phase retrieval cast as a minimization problem. The key idea is to replace the Euclidean geometry with a non-Euclidean Bregman divergence associated with an appropriate kernel. We use a Bregman gradient/mirror descent algorithm with this divergence to solve the phase retrieval problem without regularization, and we show exact recovery (up to a global sign) both in a deterministic setting and with high probability for a sufficient number of random measurements (Gaussian and Coded Diffraction Patterns). Furthermore, we establish the robustness of this approach against small additive noise. Shifting to regularized phase retrieval, we first develop and analyze an Inertial Bregman Proximal Gradient algorithm for minimizing the sum of two functions in finite dimension, one of which is convex and possibly nonsmooth while the second is relatively smooth in the Bregman geometry. We provide both global and local convergence guarantees for this algorithm. Finally, we study noiseless and stable recovery of low-complexity regularized phase retrieval. For this, we formulate the problem as the minimization of an objective functional involving a nonconvex smooth data-fidelity term and a convex regularizer promoting solutions conforming to some notion of low complexity related to their nonsmoothness points. We establish conditions for exact and stable recovery and provide sample-complexity bounds for random measurements ensuring that these conditions hold. These sample bounds depend on the low complexity of the signals to be recovered. Our new results go far beyond the case of sparse phase retrieval.
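The Bregman gradient/mirror descent scheme described in this abstract can be sketched as follows, assuming the quartic phase-retrieval loss and the kernel h(x) = ||x||^4/4 + ||x||^2/2 that is standard in this line of work; the toy problem sizes, the step-size bound, and the random initialization are our simplifications, not the thesis code:

```python
import numpy as np

def grad_h(x):
    # Kernel h(x) = ||x||^4/4 + ||x||^2/2, so grad h(x) = (||x||^2 + 1) x.
    return (x @ x + 1.0) * x

def grad_h_inv(v):
    # Invert grad_h: with r = ||x||, r^3 + r = ||v||; then x = v / (r^2 + 1).
    nv = np.linalg.norm(v)
    r = next(z.real for z in np.roots([1.0, 0.0, 1.0, -nv])
             if abs(z.imag) < 1e-9 and z.real >= 0.0)
    return v / (r**2 + 1.0)

def loss(x, A, y):
    # Quartic objective f(x) = (1/4m) sum_i ((a_i^T x)^2 - y_i)^2.
    return 0.25 * np.mean(((A @ x) ** 2 - y) ** 2)

def mirror_step(x, A, y, step):
    # One Bregman gradient step: grad_h(x+) = grad_h(x) - step * grad_f(x).
    Ax = A @ x
    grad_f = A.T @ ((Ax**2 - y) * Ax) / len(y)
    return grad_h_inv(grad_h(x) - step * grad_f)

# Toy problem: phaseless (squared) measurements of a random signal.
rng = np.random.default_rng(0)
n, m = 5, 60
x_true = rng.standard_normal(n)
A = rng.standard_normal((m, n))
y = (A @ x_true) ** 2

# Step 1/L from the relative-smoothness bound L >= mean(3||a_i||^4 + y_i ||a_i||^2),
# which guarantees monotone decrease of the loss.
row2 = np.sum(A**2, axis=1)
step = 1.0 / np.mean(3.0 * row2**2 + y * row2)

x = 0.1 * rng.standard_normal(n)   # a spectral initialization is used in practice
loss_init = loss(x, A, y)
for _ in range(500):
    x = mirror_step(x, A, y, step)
loss_final = loss(x, A, y)
```

The cubic solve inverts grad_h exactly; relative smoothness with this step size guarantees that the loss never increases, which is the property the sketch checks.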
Danilevicz, Ian Meneghel. "Detecting Influential observations in spatial models using Bregman divergence". Universidade de São Paulo, 2018. http://www.teses.usp.br/teses/disponiveis/104/104131/tde-13112018-160231/.
How can one assess whether a spatial model is well fitted? How can one choose the best model among many from the conditional autoregressive (CAR) and simultaneous autoregressive (SAR) classes, both homoscedastic and heteroscedastic? To answer these questions within the Bayesian paradigm, we propose new ways of applying the Bregman divergence, together with information criteria that are quite recent in the literature, namely the widely applicable information criterion (WAIC) and leave-one-out cross-validation (LOO). The Bregman functional is a generalization of the well-known Kullback-Leibler (KL) divergence, and several of its particular cases can be used to identify influential points. All posterior distributions presented in this dissertation were estimated using Hamiltonian Monte Carlo (HMC), an optimized version of the Metropolis-Hastings algorithm. All ideas presented in this text were tested in simulations and applied to real data.
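Since the dissertation's diagnostics build on the Bregman functional as a generalization of the Kullback-Leibler divergence, a minimal numerical sketch of that relationship may help; the generator functions and names below are our illustrative choices, not the dissertation's code:

```python
import numpy as np

def bregman(phi, grad_phi, p, q):
    # D_phi(p, q) = phi(p) - phi(q) - <grad phi(q), p - q>,
    # the Bregman divergence generated by a convex function phi.
    return phi(p) - phi(q) - np.dot(grad_phi(q), p - q)

# Generator sum_i p_i log p_i recovers the KL divergence on the simplex.
neg_entropy = lambda p: np.sum(p * np.log(p))
grad_neg_entropy = lambda p: np.log(p) + 1.0

# Generator ||x||^2 recovers the squared Euclidean distance.
sq_norm = lambda x: np.dot(x, x)
grad_sq_norm = lambda x: 2.0 * x

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.4, 0.4, 0.2])
kl = bregman(neg_entropy, grad_neg_entropy, p, q)
```

Swapping the generator changes which discrepancies between fitted and case-deleted posteriors are emphasized, which is what makes the family useful for flagging influential observations.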
Adamcik, Martin. "Collective reasoning under uncertainty and inconsistency". Thesis, University of Manchester, 2014. https://www.research.manchester.ac.uk/portal/en/theses/collective-reasoning-under-uncertainty-and-inconsistency(7fab8021-8beb-45e7-8b45-7cb4fadd70be).html.
Sears, Timothy Dean. "Generalized Maximum Entropy, Convexity and Machine Learning". The Australian National University, Research School of Information Sciences and Engineering, 2008. http://thesis.anu.edu.au./public/adt-ANU20090525.210315.
Silveti Falls, Antonio. "First-order noneuclidean splitting methods for large-scale optimization: deterministic and stochastic algorithms". Thesis, Normandie, 2021. http://www.theses.fr/2021NORMC204.
In this work we develop and examine two novel first-order splitting algorithms for solving large-scale composite optimization problems in infinite-dimensional spaces. Such problems are ubiquitous in many areas of science and engineering, particularly in data science and imaging sciences. Our work focuses on relaxing the Lipschitz-smoothness assumptions generally required by first-order splitting algorithms by replacing the Euclidean energy with a Bregman divergence. These developments allow one to solve problems with more exotic geometry than the usual Euclidean setting. One algorithm is a hybridization of the conditional gradient algorithm, which uses a linear minimization oracle at each iteration, with an augmented Lagrangian algorithm, allowing for affine constraints. The other is a primal-dual splitting algorithm incorporating Bregman divergences for computing the associated proximal operators. For both algorithms, our analysis shows convergence of the Lagrangian values, subsequential weak convergence of the iterates to solutions, and rates of convergence. In addition to these novel deterministic algorithms, we also introduce and study stochastic extensions of them through a perturbation perspective. Our results in this part include almost sure convergence of all the same quantities as in the deterministic setting, with rates as well. Finally, we tackle new problems that are only accessible under the relaxed assumptions our algorithms allow. We demonstrate numerical efficiency and verify our theoretical results on problems such as low-rank sparse matrix completion, inverse problems on the simplex, and entropically regularized Wasserstein inverse problems.
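The primal-dual algorithm mentioned in this abstract computes proximal operators in a Bregman geometry. A minimal sketch, assuming the entropic (KL) divergence on the probability simplex, where the Bregman prox of a linear term has a closed multiplicative form; the helper name `entropic_prox` is ours:

```python
import numpy as np

def entropic_prox(g, x0, t):
    # Bregman proximal step for a linear term <g, x> over the simplex:
    #   argmin_x  <g, x> + (1/t) * KL(x, x0)
    # has the closed-form multiplicative (softmax-like) update below,
    # in contrast with the Euclidean prox, which needs a projection.
    w = x0 * np.exp(-t * g)
    return w / w.sum()

x0 = np.full(4, 0.25)                      # current iterate on the simplex
g = np.array([1.0, 0.0, -1.0, 0.5])        # gradient of the smooth part
x1 = entropic_prox(g, x0, t=0.5)
```

The optimality condition is that (1/t) * log(x1/x0) + g is a constant vector (the multiplier of the simplex constraint), which the update satisfies exactly; this closed form is why entropic geometries make simplex-constrained problems cheap per iteration.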
Hasnat, Md Abul. "Unsupervised 3D image clustering and extension to joint color and depth segmentation". Thesis, Saint-Etienne, 2014. http://www.theses.fr/2014STET4013/document.
Access to 3D images at a reasonable frame rate is now widespread, thanks to recent advances in low-cost depth sensors as well as efficient methods to compute 3D from 2D images. As a consequence, there is strong demand for enhancing existing computer vision applications by incorporating 3D information. Indeed, numerous studies have demonstrated that the accuracy of different tasks increases when 3D information is included as an additional feature. However, for the task of indoor scene analysis and segmentation, several important issues remain, such as: (a) how can the 3D information itself be exploited? and (b) what is the best way to fuse color and 3D in an unsupervised manner? In this thesis, we address these issues and propose novel unsupervised methods for 3D image clustering and joint color and depth image segmentation. To this aim, we consider image normals as the prominent feature from the 3D image and cluster them with methods based on finite statistical mixture models. We adopt the Bregman Soft Clustering method to ensure computationally efficient clustering. Moreover, we exploit several probability distributions from directional statistics, such as the von Mises-Fisher distribution and the Watson distribution. By combining these, we propose novel Model Based Clustering methods. We empirically validate these methods using synthetic data and then demonstrate their application to 3D/depth image analysis. Afterward, we extend these methods to segment synchronized 3D and color images, also called RGB-D images. To this aim, we first propose a statistical image-generation model for RGB-D images. Then, we propose a novel RGB-D segmentation method using joint color-spatial-axial clustering and a statistical planar region-merging method. Results show that the proposed method is comparable with state-of-the-art methods and requires less computation time. Moreover, it opens interesting perspectives for fusing color and geometry in an unsupervised manner. We believe that the methods proposed in this thesis are equally applicable and extendable to clustering different types of data, such as speech, gene expressions, etc. Moreover, they can be used for complex tasks such as joint image-speech data analysis.
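As a minimal illustration of the Bregman Soft Clustering scheme mentioned above, the following sketch uses the squared Euclidean divergence, the simplest member of the family, in place of the directional (von Mises-Fisher, Watson) models the thesis actually employs; all names and the deterministic initialization are our illustrative choices:

```python
import numpy as np

def bregman_soft_cluster(X, k, iters=50):
    # Bregman soft clustering with the squared Euclidean divergence.
    # E-step: responsibilities proportional to weight * exp(-divergence).
    # M-step: centroids are responsibility-weighted means, which is the
    # optimal centroid update for *any* Bregman divergence.
    centers = X[:: len(X) // k][:k].copy()   # deterministic init for the sketch
    weights = np.full(k, 1.0 / k)
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        resp = weights * np.exp(-d)
        resp /= resp.sum(axis=1, keepdims=True)
        weights = resp.mean(axis=0)
        centers = (resp.T @ X) / resp.sum(axis=0)[:, None]
    return centers, resp

# Two well-separated synthetic blobs.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, (20, 2)), rng.normal(3.0, 0.1, (20, 2))])
centers, resp = bregman_soft_cluster(X, k=2)
```

Replacing the squared Euclidean divergence with the one induced by an exponential-family model (e.g. von Mises-Fisher for unit normals) changes only the E-step distances, which is what makes the scheme attractive for directional data.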
Acharyya, Sreangsu. "Learning to rank in supervised and unsupervised settings using convexity and monotonicity". 2013. http://hdl.handle.net/2152/21154.
Sprung, Benjamin. "Convergence rates for variational regularization of statistical inverse problems". Doctoral thesis, 2019. http://hdl.handle.net/21.11130/00-1735-0000-0005-1398-A.
Book chapters on the topic "Divergences de Bregman"
Nielsen, Frank, and Gaëtan Hadjeres. "Quasiconvex Jensen Divergences and Quasiconvex Bregman Divergences". In Springer Proceedings in Mathematics & Statistics, 196–218. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-77957-3_11.
Nielsen, Frank, and Richard Nock. "Bregman Divergences from Comparative Convexity". In Lecture Notes in Computer Science, 639–47. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-68445-1_74.
Lai, Pei Ling, and Colin Fyfe. "Bregman Divergences and Multi-dimensional Scaling". In Advances in Neuro-Information Processing, 935–42. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-03040-6_114.
Wang, Xi, and Colin Fyfe. "Independent Component Analysis Using Bregman Divergences". In Trends in Applied Intelligent Systems, 627–36. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-13025-0_64.
Jang, Eunsong, Colin Fyfe, and Hanseok Ko. "Bregman Divergences and the Self Organising Map". In Lecture Notes in Computer Science, 452–58. Berlin, Heidelberg: Springer Berlin Heidelberg, 2008. http://dx.doi.org/10.1007/978-3-540-88906-9_57.
Santos-Rodríguez, Raúl, Alicia Guerrero-Curieses, Rocío Alaiz-Rodríguez, and Jesús Cid-Sueiro. "Cost-Sensitive Learning Based on Bregman Divergences". In Machine Learning and Knowledge Discovery in Databases, 12. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-04180-8_12.
Sun, Jigang, Malcolm Crowe, and Colin Fyfe. "Extending Metric Multidimensional Scaling with Bregman Divergences". In Trends in Applied Intelligent Systems, 615–26. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-13025-0_63.
Wang, Shaojun, and Dale Schuurmans. "Learning Continuous Latent Variable Models with Bregman Divergences". In Lecture Notes in Computer Science, 190–204. Berlin, Heidelberg: Springer Berlin Heidelberg, 2003. http://dx.doi.org/10.1007/978-3-540-39624-6_16.
Wu, Yan, Liang Du, and Honghong Cheng. "Multi-view K-Means Clustering with Bregman Divergences". In Communications in Computer and Information Science, 26–38. Singapore: Springer Singapore, 2018. http://dx.doi.org/10.1007/978-981-13-2122-1_3.
Stummer, Wolfgang, and Anna-Lena Kißlinger. "Some New Flexibilizations of Bregman Divergences and Their Asymptotics". In Lecture Notes in Computer Science, 514–22. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-68445-1_60.
Conference papers on the topic "Divergences de Bregman"
Banerjee, Arindam, Srujana Merugu, Inderjit Dhillon, and Joydeep Ghosh. "Clustering with Bregman Divergences". In Proceedings of the 2004 SIAM International Conference on Data Mining. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2004. http://dx.doi.org/10.1137/1.9781611972740.22.
Acharyya, Sreangsu, Arindam Banerjee, and Daniel Boley. "Bregman Divergences and Triangle Inequality". In Proceedings of the 2013 SIAM International Conference on Data Mining. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2013. http://dx.doi.org/10.1137/1.9781611972832.53.
Inan, Huseyin A., Mehmet A. Donmez, and Suleyman S. Kozat. "Adaptive mixture methods using Bregman divergences". In ICASSP 2012 - 2012 IEEE International Conference on Acoustics, Speech and Signal Processing. IEEE, 2012. http://dx.doi.org/10.1109/icassp.2012.6288740.
Cayton, Lawrence. "Fast nearest neighbor retrieval for Bregman divergences". In Proceedings of the 25th International Conference on Machine Learning. New York, New York, USA: ACM Press, 2008. http://dx.doi.org/10.1145/1390156.1390171.
Harandi, Mehrtash, Mathieu Salzmann, and Fatih Porikli. "Bregman Divergences for Infinite Dimensional Covariance Matrices". In 2014 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2014. http://dx.doi.org/10.1109/cvpr.2014.132.
Ackermann, Marcel R., and Johannes Blömer. "Coresets and Approximate Clustering for Bregman Divergences". In Proceedings of the Twentieth Annual ACM-SIAM Symposium on Discrete Algorithms. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2009. http://dx.doi.org/10.1137/1.9781611973068.118.
Ferreira, Daniela P. L., Eraldo Ribeiro, and Celia A. Z. Barcelos. "Variational non rigid registration with Bregman divergences". In SAC 2017: Symposium on Applied Computing. New York, NY, USA: ACM, 2017. http://dx.doi.org/10.1145/3019612.3019646.
Wang, Shaojun, and Dale Schuurmans. "Learning latent variable models with Bregman divergences". In IEEE International Symposium on Information Theory, 2003. Proceedings. IEEE, 2003. http://dx.doi.org/10.1109/isit.2003.1228234.
Escolano, Francisco, Meizhu Liu, and Edwin R. Hancock. "Tensor-based total Bregman divergences between graphs". In 2011 IEEE International Conference on Computer Vision Workshops (ICCV Workshops). IEEE, 2011. http://dx.doi.org/10.1109/iccvw.2011.6130420.
Magron, Paul, Pierre-Hugo Vial, Thomas Oberlin, and Cedric Fevotte. "Phase Recovery with Bregman Divergences for Audio Source Separation". In ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2021. http://dx.doi.org/10.1109/icassp39728.2021.9413717.
Organizational reports on the topic "Divergences de Bregman"
Li, Xinyao. Block active ADMM to Minimize NMF with Bregman Divergences. Ames (Iowa): Iowa State University, January 2021. http://dx.doi.org/10.31274/cc-20240624-307.
Cherian, Anoop, Suvrit Sra, Arindam Banerjee, and Nikos Papanikolopoulos. Jensen-Bregman LogDet Divergence for Efficient Similarity Computations on Positive Definite Tensors. Fort Belvoir, VA: Defense Technical Information Center, May 2012. http://dx.doi.org/10.21236/ada561322.