Selected scientific literature on the topic "Bregman Divergences"
Create an accurate reference in APA, MLA, Chicago, Harvard, and other styles
Consult the list of current articles, books, theses, conference proceedings, and other scientific sources relevant to the topic "Bregman Divergences".
Next to each source in the reference list there is an "Add to bibliography" button. Click it and we will automatically generate the bibliographic citation of the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the scientific publication in .pdf format and read the abstract of the work online if it is available in the metadata.
Journal articles on the topic "Bregman Divergences"
Amari, S., and A. Cichocki. "Information geometry of divergence functions". Bulletin of the Polish Academy of Sciences: Technical Sciences 58, no. 1 (March 1, 2010): 183–95. http://dx.doi.org/10.2478/v10175-010-0019-1.
Abdullah, Amirali, John Moeller, and Suresh Venkatasubramanian. "Approximate Bregman Near Neighbors in Sublinear Time: Beyond the Triangle Inequality". International Journal of Computational Geometry & Applications 23, no. 04n05 (August 2013): 253–301. http://dx.doi.org/10.1142/s0218195913600066.
Nielsen, Frank. "On Voronoi Diagrams on the Information-Geometric Cauchy Manifolds". Entropy 22, no. 7 (June 28, 2020): 713. http://dx.doi.org/10.3390/e22070713.
Li, Wuchen. "Transport information Bregman divergences". Information Geometry 4, no. 2 (November 15, 2021): 435–70. http://dx.doi.org/10.1007/s41884-021-00063-5.
Nielsen, Frank. "On a Generalization of the Jensen–Shannon Divergence and the Jensen–Shannon Centroid". Entropy 22, no. 2 (February 16, 2020): 221. http://dx.doi.org/10.3390/e22020221.
Chen, P., Y. Chen, and M. Rao. "Metrics defined by Bregman Divergences". Communications in Mathematical Sciences 6, no. 4 (2008): 915–26. http://dx.doi.org/10.4310/cms.2008.v6.n4.a6.
Bock, Andreas A., and Martin S. Andersen. "Preconditioner Design via Bregman Divergences". SIAM Journal on Matrix Analysis and Applications 45, no. 2 (June 7, 2024): 1148–82. http://dx.doi.org/10.1137/23m1566637.
Zhang, Jianwen, and Changshui Zhang. "Multitask Bregman Clustering". Proceedings of the AAAI Conference on Artificial Intelligence 24, no. 1 (July 3, 2010): 655–60. http://dx.doi.org/10.1609/aaai.v24i1.7674.
Liang, Xiao. "A Note on Divergences". Neural Computation 28, no. 10 (October 2016): 2045–62. http://dx.doi.org/10.1162/neco_a_00878.
Voronov, Yuri, and Matvej Sviridov. "New Interregional Differences Measurement Tools". Interexpo GEO-Siberia 3, no. 1 (2019): 71–77. http://dx.doi.org/10.33764/2618-981x-2019-3-1-71-77.
Theses / Dissertations on the topic "Bregman Divergences"
Wu, Xiaochuan. "Clustering and visualisation with Bregman divergences". Thesis, University of the West of Scotland, 2013. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.730022.
Sun, Jiang. "Extending the metric multidimensional scaling with Bregman divergences". Thesis, University of the West of Scotland, 2010. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.556070.
Texto completo da fonteGodeme, Jean-Jacques. "Ρhase retrieval with nοn-Euclidean Bregman based geοmetry". Electronic Thesis or Diss., Normandie, 2024. http://www.theses.fr/2024NORMC214.
Texto completo da fonteIn this work, we investigate the phase retrieval problem of real-valued signals in finite dimension, a challenge encountered across various scientific and engineering disciplines. It explores two complementary approaches: retrieval with and without regularization. In both settings, our work is focused on relaxing the Lipschitz-smoothness assumption generally required by first-order splitting algorithms, and which is not valid for phase retrieval cast as a minimization problem. The key idea here is to replace the Euclidean geometry by a non-Euclidean Bregman divergence associated to an appropriate kernel. We use a Bregman gradient/mirror descent algorithm with this divergence to solve thephase retrieval problem without regularization, and we show exact (up to a global sign) recovery both in a deterministic setting and with high probability for a sufficient number of random measurements (Gaussian and Coded Diffraction Patterns). Furthermore, we establish the robustness of this approachagainst small additive noise. Shifting to regularized phase retrieval, we first develop and analyze an Inertial Bregman Proximal Gradient algorithm for minimizing the sum of two functions in finite dimension, one of which is convex and possibly nonsmooth and the second is relatively smooth in the Bregman geometry. We provide both global and local convergence guarantees for this algorithm. Finally, we study noiseless and stable recovery of low complexity regularized phase retrieval. For this, weformulate the problem as the minimization of an objective functional involving a nonconvex smooth data fidelity term and a convex regularizer promoting solutions conforming to some notion of low-complexity related to their nonsmoothness points. We establish conditions for exact and stable recovery and provide sample complexity bounds for random measurements to ensure that these conditions hold. These sample bounds depend on the low complexity of the signals to be recovered. Our new results allow to go far beyond the case of sparse phase retrieval
Danilevicz, Ian Meneghel. "Detecting Influential observations in spatial models using Bregman divergence". Universidade de São Paulo, 2018. http://www.teses.usp.br/teses/disponiveis/104/104131/tde-13112018-160231/.
Texto completo da fonteComo avaliar se um modelo espacial está bem ajustado? Como escolher o melhor modelo entre muitos da classe autorregressivo condicional (CAR) e autorregressivo simultâneo (SAR), homoscedásticos e heteroscedásticos? Para responder essas perguntas dentro do paradigma bayesiano, propomos novas formas de aplicar a divergência de Bregman, assim como critérios de informação bastante recentes na literatura, são eles o widely applicable information criterion (WAIC) e validação cruzada leave-one-out (LOO). O funcional de Bregman é uma generalização da famosa divergência de Kullback-Leiber (KL). Há diversos casos particulares dela que podem ser usados para identificar pontos influentes. Todas as distribuições a posteriori apresentadas nesta dissertação foram estimadas usando Monte Carlo Hamiltoniano (HMC), uma versão otimizada do algoritmo Metropolis-Hastings. Todas as ideias apresentadas neste texto foram submetidas a simulações e aplicadas em dados reais.
Adamcik, Martin. "Collective reasoning under uncertainty and inconsistency". Thesis, University of Manchester, 2014. https://www.research.manchester.ac.uk/portal/en/theses/collective-reasoning-under-uncertainty-and-inconsistency(7fab8021-8beb-45e7-8b45-7cb4fadd70be).html.
Sears, Timothy Dean. "Generalized Maximum Entropy, Convexity and Machine Learning". The Australian National University, Research School of Information Sciences and Engineering, 2008. http://thesis.anu.edu.au./public/adt-ANU20090525.210315.
Texto completo da fonteSilveti, Falls Antonio. "First-order noneuclidean splitting methods for large-scale optimization : deterministic and stochastic algorithms". Thesis, Normandie, 2021. http://www.theses.fr/2021NORMC204.
Texto completo da fonteIn this work we develop and examine two novel first-order splitting algorithms for solving large-scale composite optimization problems in infinite-dimensional spaces. Such problems are ubiquitous in many areas of science and engineering, particularly in data science and imaging sciences. Our work is focused on relaxing the Lipschitz-smoothness assumptions generally required by first-order splitting algorithms by replacing the Euclidean energy with a Bregman divergence. These developments allow one to solve problems having more exotic geometry than that of the usual Euclidean setting. One algorithm is hybridization of the conditional gradient algorithm, making use of a linear minimization oracle at each iteration, with an augmented Lagrangian algorithm, allowing for affine constraints. The other algorithm is a primal-dual splitting algorithm incorporating Bregman divergences for computing the associated proximal operators. For both of these algorithms, our analysis shows convergence of the Lagrangian values, subsequential weak convergence of the iterates to solutions, and rates of convergence. In addition to these novel deterministic algorithms, we introduce and study also the stochastic extensions of these algorithms through a perturbation perspective. Our results in this part include almost sure convergence results for all the same quantities as in the deterministic setting, with rates as well. Finally, we tackle new problems that are only accessible through the relaxed assumptions our algorithms allow. We demonstrate numerical efficiency and verify our theoretical results on problems like low rank, sparse matrix completion, inverse problems on the simplex, and entropically regularized Wasserstein inverse problems
Hasnat, Md Abul. "Unsupervised 3D image clustering and extension to joint color and depth segmentation". Thesis, Saint-Etienne, 2014. http://www.theses.fr/2014STET4013/document.
Texto completo da fonteAccess to the 3D images at a reasonable frame rate is widespread now, thanks to the recent advances in low cost depth sensors as well as the efficient methods to compute 3D from 2D images. As a consequence, it is highly demanding to enhance the capability of existing computer vision applications by incorporating 3D information. Indeed, it has been demonstrated in numerous researches that the accuracy of different tasks increases by including 3D information as an additional feature. However, for the task of indoor scene analysis and segmentation, it remains several important issues, such as: (a) how the 3D information itself can be exploited? and (b) what is the best way to fuse color and 3D in an unsupervised manner? In this thesis, we address these issues and propose novel unsupervised methods for 3D image clustering and joint color and depth image segmentation. To this aim, we consider image normals as the prominent feature from 3D image and cluster them with methods based on finite statistical mixture models. We consider Bregman Soft Clustering method to ensure computationally efficient clustering. Moreover, we exploit several probability distributions from directional statistics, such as the von Mises-Fisher distribution and the Watson distribution. By combining these, we propose novel Model Based Clustering methods. We empirically validate these methods using synthetic data and then demonstrate their application for 3D/depth image analysis. Afterward, we extend these methods to segment synchronized 3D and color image, also called RGB-D image. To this aim, first we propose a statistical image generation model for RGB-D image. Then, we propose novel RGB-D segmentation method using a joint color-spatial-axial clustering and a statistical planar region merging method. Results show that, the proposed method is comparable with the state of the art methods and requires less computation time. Moreover, it opens interesting perspectives to fuse color and geometry in an unsupervised manner. We believe that the methods proposed in this thesis are equally applicable and extendable for clustering different types of data, such as speech, gene expressions, etc. Moreover, they can be used for complex tasks, such as joint image-speech data analysis
Acharyya, Sreangsu. "Learning to rank in supervised and unsupervised settings using convexity and monotonicity". 2013. http://hdl.handle.net/2152/21154.
Sprung, Benjamin. "Convergence rates for variational regularization of statistical inverse problems". Doctoral thesis, 2019. http://hdl.handle.net/21.11130/00-1735-0000-0005-1398-A.
Book chapters on the topic "Bregman Divergences"
Nielsen, Frank, and Gaëtan Hadjeres. "Quasiconvex Jensen Divergences and Quasiconvex Bregman Divergences". In Springer Proceedings in Mathematics & Statistics, 196–218. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-77957-3_11.
Nielsen, Frank, and Richard Nock. "Bregman Divergences from Comparative Convexity". In Lecture Notes in Computer Science, 639–47. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-68445-1_74.
Lai, Pei Ling, and Colin Fyfe. "Bregman Divergences and Multi-dimensional Scaling". In Advances in Neuro-Information Processing, 935–42. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-03040-6_114.
Wang, Xi, and Colin Fyfe. "Independent Component Analysis Using Bregman Divergences". In Trends in Applied Intelligent Systems, 627–36. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-13025-0_64.
Jang, Eunsong, Colin Fyfe, and Hanseok Ko. "Bregman Divergences and the Self Organising Map". In Lecture Notes in Computer Science, 452–58. Berlin, Heidelberg: Springer Berlin Heidelberg, 2008. http://dx.doi.org/10.1007/978-3-540-88906-9_57.
Santos-Rodríguez, Raúl, Alicia Guerrero-Curieses, Rocío Alaiz-Rodríguez, and Jesús Cid-Sueiro. "Cost-Sensitive Learning Based on Bregman Divergences". In Machine Learning and Knowledge Discovery in Databases, 12. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-04180-8_12.
Sun, Jigang, Malcolm Crowe, and Colin Fyfe. "Extending Metric Multidimensional Scaling with Bregman Divergences". In Trends in Applied Intelligent Systems, 615–26. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-13025-0_63.
Wang, Shaojun, and Dale Schuurmans. "Learning Continuous Latent Variable Models with Bregman Divergences". In Lecture Notes in Computer Science, 190–204. Berlin, Heidelberg: Springer Berlin Heidelberg, 2003. http://dx.doi.org/10.1007/978-3-540-39624-6_16.
Wu, Yan, Liang Du, and Honghong Cheng. "Multi-view K-Means Clustering with Bregman Divergences". In Communications in Computer and Information Science, 26–38. Singapore: Springer Singapore, 2018. http://dx.doi.org/10.1007/978-981-13-2122-1_3.
Stummer, Wolfgang, and Anna-Lena Kißlinger. "Some New Flexibilizations of Bregman Divergences and Their Asymptotics". In Lecture Notes in Computer Science, 514–22. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-68445-1_60.
Conference papers on the topic "Bregman Divergences"
Banerjee, Arindam, Srujana Merugu, Inderjit Dhillon, and Joydeep Ghosh. "Clustering with Bregman Divergences". In Proceedings of the 2004 SIAM International Conference on Data Mining. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2004. http://dx.doi.org/10.1137/1.9781611972740.22.
Acharyya, Sreangsu, Arindam Banerjee, and Daniel Boley. "Bregman Divergences and Triangle Inequality". In Proceedings of the 2013 SIAM International Conference on Data Mining. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2013. http://dx.doi.org/10.1137/1.9781611972832.53.
Inan, Huseyin A., Mehmet A. Donmez, and Suleyman S. Kozat. "Adaptive mixture methods using Bregman divergences". In ICASSP 2012 - 2012 IEEE International Conference on Acoustics, Speech and Signal Processing. IEEE, 2012. http://dx.doi.org/10.1109/icassp.2012.6288740.
Cayton, Lawrence. "Fast nearest neighbor retrieval for bregman divergences". In Proceedings of the 25th International Conference on Machine Learning. New York, New York, USA: ACM Press, 2008. http://dx.doi.org/10.1145/1390156.1390171.
Harandi, Mehrtash, Mathieu Salzmann, and Fatih Porikli. "Bregman Divergences for Infinite Dimensional Covariance Matrices". In 2014 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2014. http://dx.doi.org/10.1109/cvpr.2014.132.
Ackermann, Marcel R., and Johannes Blömer. "Coresets and Approximate Clustering for Bregman Divergences". In Proceedings of the Twentieth Annual ACM-SIAM Symposium on Discrete Algorithms. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2009. http://dx.doi.org/10.1137/1.9781611973068.118.
Ferreira, Daniela P. L., Eraldo Ribeiro, and Celia A. Z. Barcelos. "Variational non rigid registration with bregman divergences". In SAC 2017: Symposium on Applied Computing. New York, NY, USA: ACM, 2017. http://dx.doi.org/10.1145/3019612.3019646.
Wang, Shaojun, and D. Schuurmans. "Learning latent variable models with bregman divergences". In IEEE International Symposium on Information Theory, 2003. Proceedings. IEEE, 2003. http://dx.doi.org/10.1109/isit.2003.1228234.
Escolano, Francisco, Meizhu Liu, and Edwin R. Hancock. "Tensor-based total bregman divergences between graphs". In 2011 IEEE International Conference on Computer Vision Workshops (ICCV Workshops). IEEE, 2011. http://dx.doi.org/10.1109/iccvw.2011.6130420.
Magron, Paul, Pierre-Hugo Vial, Thomas Oberlin, and Cedric Fevotte. "Phase Recovery with Bregman Divergences for Audio Source Separation". In ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2021. http://dx.doi.org/10.1109/icassp39728.2021.9413717.
Reports of organizations on the topic "Bregman Divergences"
Li, Xinyao. Block active ADMM to Minimize NMF with Bregman Divergences. Ames (Iowa): Iowa State University, January 2021. http://dx.doi.org/10.31274/cc-20240624-307.
Cherian, Anoop, Suvrit Sra, Arindam Banerjee, and Nikos Papanikolopoulos. Jensen-Bregman LogDet Divergence for Efficient Similarity Computations on Positive Definite Tensors. Fort Belvoir, VA: Defense Technical Information Center, May 2012. http://dx.doi.org/10.21236/ada561322.