To view other types of publications on this topic, follow the link: Divergences de Bregman.

Journal articles on the topic "Divergences de Bregman"

Browse the top 50 journal articles for your research on the topic "Divergences de Bregman".

Next to each work in the list there is an "Add to bibliography" button. Click it, and we will automatically generate a bibliographic citation for the selected work in your preferred citation style: APA, MLA, Harvard, Chicago, Vancouver, and others.

You can also download the full text of a publication as a .pdf file and read its abstract online, whenever these are available in the metadata.

Browse journal articles across a wide range of disciplines and compile your bibliography correctly.

1

Amari, S., and A. Cichocki. "Information geometry of divergence functions." Bulletin of the Polish Academy of Sciences: Technical Sciences 58, no. 1 (March 1, 2010): 183–95. http://dx.doi.org/10.2478/v10175-010-0019-1.

Abstract:
Measures of divergence between two points play a key role in many engineering problems. One such measure is a distance function, but there are many important measures which do not satisfy the properties of a distance. The Bregman divergence, Kullback-Leibler divergence and f-divergence are such measures. In the present article, we study the differential-geometrical structure of a manifold induced by a divergence function. It consists of a Riemannian metric and a pair of dually coupled affine connections, which are studied in information geometry. The class of Bregman divergences is characterized by a dually flat structure, which originates from the Legendre duality. A dually flat space admits a generalized Pythagorean theorem. The class of f-divergences, defined on a manifold of probability distributions, is characterized by information monotonicity, and the Kullback-Leibler divergence belongs to the intersection of both classes. The f-divergence always gives the α-geometry, which consists of the Fisher information metric and a dual pair of ±α-connections. The α-divergence is a special class of f-divergences. It is unique, sitting at the intersection of the f-divergence and Bregman divergence classes in a manifold of positive measures. The geometry derived from the Tsallis q-entropy and related divergences is also addressed.
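For background (a standard definition, not specific to this paper): the Bregman divergence generated by a strictly convex, differentiable function F is

B_F(x : y) = F(x) - F(y) - \langle x - y, \nabla F(y) \rangle,

which is non-negative, vanishes iff x = y, and is in general asymmetric. Taking F(x) = \sum_i x_i \log x_i on positive measures recovers the (generalized) Kullback-Leibler divergence, the case the abstract places in the intersection of the Bregman and f-divergence classes.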
2

Abdullah, Amirali, John Moeller, and Suresh Venkatasubramanian. "Approximate Bregman Near Neighbors in Sublinear Time: Beyond the Triangle Inequality." International Journal of Computational Geometry & Applications 23, no. 04n05 (August 2013): 253–301. http://dx.doi.org/10.1142/s0218195913600066.

Abstract:
Bregman divergences are important distance measures that are used extensively in data-driven applications such as computer vision, text mining, and speech processing, and are a key focus of interest in machine learning. Answering nearest neighbor (NN) queries under these measures is very important in these applications and has been the subject of extensive study, but is problematic because these distance measures lack metric properties like symmetry and the triangle inequality. In this paper, we present the first provably approximate nearest-neighbor (ANN) algorithms for a broad sub-class of Bregman divergences under some assumptions. Specifically, we examine Bregman divergences which can be decomposed along each dimension, and our bounds also depend on restricting the size of our allowed domain. We obtain bounds for both the regular asymmetric Bregman divergences as well as their symmetrized versions. To do so, we develop two geometric properties vital to our analysis: a reverse triangle inequality (RTI) and a relaxed triangle inequality called μ-defectiveness, where μ is a domain-dependent value. Bregman divergences satisfy the RTI but not μ-defectiveness. However, we show that the square root of a Bregman divergence does satisfy μ-defectiveness. This allows us to then utilize both properties in an efficient search data structure that follows the general two-stage paradigm of a ring-tree decomposition followed by a quad-tree search used in previous near-neighbor algorithms for Euclidean space and spaces of bounded doubling dimension. Our first algorithm resolves a query for a d-dimensional (1 + ε)-ANN in [Formula: see text] time and O(n log^{d-1} n) space and holds for generic μ-defective distance measures satisfying a RTI. Our second algorithm is more specific in analysis to the Bregman divergences and uses a further structural parameter, the maximum ratio of second derivatives over each dimension of our allowed domain (c_0). This allows us to locate a (1 + ε)-ANN in O(log n) time and O(n) space, where there is a further (c_0)^d factor in the big-Oh for the query time.
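A minimal sketch (mine, not the authors' code) of the kind of decomposable Bregman divergence the paper studies, together with its symmetrized version; evaluating it in both directions shows the asymmetry that rules out standard metric-space search structures:

import numpy as np

def bregman_kl(x, y):
    # Decomposable Bregman divergence generated by F(x) = sum_i x_i log x_i:
    # the generalized Kullback-Leibler divergence on the positive orthant.
    return float(np.sum(x * np.log(x / y) - x + y))

def symmetrized(x, y):
    # The symmetrized variant also covered by the paper's analysis.
    return 0.5 * (bregman_kl(x, y) + bregman_kl(y, x))

x = np.array([0.7, 0.2, 0.1])
y = np.array([0.4, 0.4, 0.2])
print(bregman_kl(x, y), bregman_kl(y, x))  # unequal: the divergence is asymmetric
print(symmetrized(x, y))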
3

Nielsen, Frank. "On Voronoi Diagrams on the Information-Geometric Cauchy Manifolds." Entropy 22, no. 7 (June 28, 2020): 713. http://dx.doi.org/10.3390/e22070713.

Abstract:
We study the Voronoi diagrams of a finite set of Cauchy distributions and their dual complexes from the viewpoint of information geometry by considering the Fisher-Rao distance, the Kullback-Leibler divergence, the chi square divergence, and a flat divergence derived from Tsallis entropy related to the conformal flattening of the Fisher-Rao geometry. We prove that the Voronoi diagrams of the Fisher-Rao distance, the chi square divergence, and the Kullback-Leibler divergences all coincide with a hyperbolic Voronoi diagram on the corresponding Cauchy location-scale parameters, and that the dual Cauchy hyperbolic Delaunay complexes are Fisher orthogonal to the Cauchy hyperbolic Voronoi diagrams. The dual Voronoi diagrams with respect to the dual flat divergences amount to dual Bregman Voronoi diagrams, and their dual complexes are regular triangulations. The primal Bregman Voronoi diagram is the Euclidean Voronoi diagram and the dual Bregman Voronoi diagram coincides with the Cauchy hyperbolic Voronoi diagram. In addition, we prove that the square root of the Kullback-Leibler divergence between Cauchy distributions yields a metric distance which is Hilbertian for the Cauchy scale families.
4

Li, Wuchen. "Transport information Bregman divergences." Information Geometry 4, no. 2 (November 15, 2021): 435–70. http://dx.doi.org/10.1007/s41884-021-00063-5.

5

Nielsen, Frank. "On a Generalization of the Jensen–Shannon Divergence and the Jensen–Shannon Centroid." Entropy 22, no. 2 (February 16, 2020): 221. http://dx.doi.org/10.3390/e22020221.

Abstract:
The Jensen–Shannon divergence is a renowned bounded symmetrization of the Kullback–Leibler divergence which does not require probability densities to have matching supports. In this paper, we introduce a vector-skew generalization of the scalar α-Jensen–Bregman divergences and thereby derive the vector-skew α-Jensen–Shannon divergences. We prove that the vector-skew α-Jensen–Shannon divergences are f-divergences and study the properties of these novel divergences. Finally, we report an iterative algorithm to numerically compute the Jensen–Shannon-type centroids for a set of probability densities belonging to a mixture family; this includes the case of the Jensen–Shannon centroid of a set of categorical distributions or normalized histograms.
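For context, the ordinary Jensen–Shannon divergence that the paper generalizes averages two KL terms against the mixture, which keeps it finite and bounded even when supports differ. A minimal NumPy sketch of the standard definition (not the vector-skew generalization):

import numpy as np

def kl(p, q):
    # KL(p || q) over the bins where p > 0; assumes normalized histograms.
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def jensen_shannon(p, q):
    # JS(p, q) = (KL(p || m) + KL(q || m)) / 2 with mixture m = (p + q) / 2.
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = np.array([0.9, 0.1, 0.0])  # supports need not match
q = np.array([0.2, 0.3, 0.5])
print(jensen_shannon(p, q))    # finite and at most log(2)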
6

Chen, P., Y. Chen, and M. Rao. "Metrics defined by Bregman Divergences." Communications in Mathematical Sciences 6, no. 4 (2008): 915–26. http://dx.doi.org/10.4310/cms.2008.v6.n4.a6.

7

Bock, Andreas A., and Martin S. Andersen. "Preconditioner Design via Bregman Divergences." SIAM Journal on Matrix Analysis and Applications 45, no. 2 (June 7, 2024): 1148–82. http://dx.doi.org/10.1137/23m1566637.

8

Zhang, Jianwen, and Changshui Zhang. "Multitask Bregman Clustering." Proceedings of the AAAI Conference on Artificial Intelligence 24, no. 1 (July 3, 2010): 655–60. http://dx.doi.org/10.1609/aaai.v24i1.7674.

Abstract:
Traditional clustering methods deal with a single clustering task on a single data set. However, in some newly emerging applications, multiple similar clustering tasks are involved simultaneously. In this case, we not only desire a partition for each task, but also want to discover the relationship among clusters of different tasks. It is also expected that the learnt relationship among tasks can improve the performance of each single task. In this paper, we propose a general framework for this problem and further suggest a specific approach. In our approach, we alternately update clusters and learn the relationship between clusters of different tasks, and the two phases boost each other. Our approach is based on the general Bregman divergence, hence it is suitable for a large family of assumptions on data distributions and divergences. Empirical results on several benchmark data sets validate the approach.
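Background for the single-task building block the framework rests on (a hedged sketch, not the authors' multitask algorithm): Bregman hard clustering alternates assignments and mean updates exactly like k-means, since for any Bregman divergence the optimal cluster representative is the arithmetic mean, a classical result of Banerjee et al.:

import numpy as np

def bregman_hard_clustering(X, k, divergence, iters=50, seed=0):
    # Lloyd-style alternation: assign points to the nearest center under
    # the (possibly asymmetric) divergence D(x : c), then recompute means.
    # Empty clusters are not handled in this sketch.
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        d = np.array([[divergence(x, c) for c in centers] for x in X])
        labels = d.argmin(axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels, centers

# Squared Euclidean distance is the Bregman divergence of F(x) = ||x||^2.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
labels, centers = bregman_hard_clustering(X, 2, lambda x, c: np.sum((x - c) ** 2))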
9

Liang, Xiao. "A Note on Divergences." Neural Computation 28, no. 10 (October 2016): 2045–62. http://dx.doi.org/10.1162/neco_a_00878.

Abstract:
In many areas of neural computation, like learning, optimization, estimation, and inference, suitable divergences play a key role. In this note, we study the conjecture presented by Amari (2009) and find a counterexample to show that the conjecture does not hold generally. Moreover, we investigate two classes of [Formula: see text]-divergence (Zhang, 2004), weighted f-divergence and weighted [Formula: see text]-divergence, and prove that if a divergence is a weighted f-divergence, as well as a Bregman divergence, then it is a weighted [Formula: see text]-divergence. This result reduces in form to the main theorem established by Amari (2009) when [Formula: see text] [Formula: see text].
10

Voronov, Yuri, and Matvej Sviridov. "New Interregional Differences Measurement Tools." Interexpo GEO-Siberia 3, no. 1 (2019): 71–77. http://dx.doi.org/10.33764/2618-981x-2019-3-1-71-77.

Abstract:
The article proposes new methods for measuring differences in the economic development levels of RF regions. Currently, comparisons rely on indicators such as the mean, median, and mode, and the choice among these three options is made not on formal criteria but on the basis of general reasoning. The authors propose to build a system of comparisons on the basis of Bregman divergences. Biologists are already beginning to use such measures when comparing regions, but they have not yet been applied in economic analysis. The Kullback-Leibler divergence (KL divergence), the primary variant of the Bregman divergence, is investigated. Pairwise comparisons of the level of immigration to the regions are proposed as a criterion for the quality of difference measurement. The main calculations are made on the example of wage levels in the regions of the south of Western Siberia. The conclusion is drawn about the prospects of using existing variants of the Bregman divergence, as well as the possibility of developing new divergence options that best match the comparative attractiveness of the regions for immigrants.
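To make the proposed measure concrete, here is a hedged sketch of the kind of pairwise regional comparison described, computing the KL divergence between two wage distributions binned into histograms; the figures are invented for illustration:

import numpy as np

def kl_divergence(p, q, eps=1e-12):
    # KL(p || q) for histograms; normalizes inputs, eps guards empty bins.
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

# Hypothetical counts per wage bracket for two regions.
region_a = np.array([120.0, 340.0, 510.0, 220.0, 80.0, 30.0])
region_b = np.array([200.0, 420.0, 380.0, 150.0, 60.0, 10.0])

# The asymmetry is deliberate: KL(a || b) != KL(b || a), so the direction
# of comparison matters when ranking regions.
print(kl_divergence(region_a, region_b), kl_divergence(region_b, region_a))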
11

Nielsen, Frank. "Generalizing the Alpha-Divergences and the Oriented Kullback–Leibler Divergences with Quasi-Arithmetic Means." Algorithms 15, no. 11 (November 17, 2022): 435. http://dx.doi.org/10.3390/a15110435.

Abstract:
The family of α-divergences including the oriented forward and reverse Kullback–Leibler divergences is often used in signal processing, pattern recognition, and machine learning, among others. Choosing a suitable α-divergence can either be done beforehand according to some prior knowledge of the application domains or directly learned from data sets. In this work, we generalize the α-divergences using a pair of strictly comparable weighted means. Our generalization allows us to obtain in the limit case α→1 the 1-divergence, which provides a generalization of the forward Kullback–Leibler divergence, and in the limit case α→0, the 0-divergence, which corresponds to a generalization of the reverse Kullback–Leibler divergence. We then analyze the condition for a pair of weighted quasi-arithmetic means to be strictly comparable and describe the family of quasi-arithmetic α-divergences including its subfamily of power homogeneous α-divergences. In particular, we study the generalized quasi-arithmetic 1-divergences and 0-divergences and show that these counterpart generalizations of the oriented Kullback–Leibler divergences can be rewritten as equivalent conformal Bregman divergences using strictly monotone embeddings. Finally, we discuss the applications of these novel divergences to k-means clustering by studying the robustness property of the centroids.
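One common parameterization consistent with the limits described in the abstract (conventions vary across the literature, so treat this as orientation rather than the paper's exact notation):

D_\alpha(p : q) = \frac{1}{\alpha(1-\alpha)} \int \big( \alpha\, p(x) + (1-\alpha)\, q(x) - p(x)^{\alpha} q(x)^{1-\alpha} \big)\, d\mu(x),

with the forward Kullback–Leibler divergence KL(p : q) recovered in the limit \alpha \to 1 and the reverse divergence KL(q : p) in the limit \alpha \to 0.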
12

Nielsen, Frank. "Statistical Divergences between Densities of Truncated Exponential Families with Nested Supports: Duo Bregman and Duo Jensen Divergences." Entropy 24, no. 3 (March 17, 2022): 421. http://dx.doi.org/10.3390/e24030421.

Abstract:
By calculating the Kullback–Leibler divergence between two probability measures belonging to different exponential families dominated by the same measure, we obtain a formula that generalizes the ordinary Fenchel–Young divergence. Inspired by this formula, we define the duo Fenchel–Young divergence and report a majorization condition on its pair of strictly convex generators, which guarantees that this divergence is always non-negative. The duo Fenchel–Young divergence is also equivalent to a duo Bregman divergence. We show how to use these duo divergences by calculating the Kullback–Leibler divergence between densities of truncated exponential families with nested supports, and report a formula for the Kullback–Leibler divergence between truncated normal distributions. Finally, we prove that the skewed Bhattacharyya distances between truncated exponential families amount to equivalent skewed duo Jensen divergences.
13

Nock, R., and F. Nielsen. "Bregman Divergences and Surrogates for Learning." IEEE Transactions on Pattern Analysis and Machine Intelligence 31, no. 11 (November 2009): 2048–59. http://dx.doi.org/10.1109/tpami.2008.225.

14

Fischer, Aurélie. "Quantization and clustering with Bregman divergences." Journal of Multivariate Analysis 101, no. 9 (October 2010): 2207–21. http://dx.doi.org/10.1016/j.jmva.2010.05.008.

15

Santos-Rodriguez, R., and J. Cid-Sueiro. "Cost-Sensitive Sequences of Bregman Divergences." IEEE Transactions on Neural Networks and Learning Systems 23, no. 12 (December 2012): 1896–904. http://dx.doi.org/10.1109/tnnls.2012.2219067.

16

Sun, Jigang, Colin Fyfe, and Malcolm Crowe. "Extending Sammon mapping with Bregman divergences." Information Sciences 187 (March 2012): 72–92. http://dx.doi.org/10.1016/j.ins.2011.10.013.

17

Dhillon, Inderjit S., and Joel A. Tropp. "Matrix Nearness Problems with Bregman Divergences." SIAM Journal on Matrix Analysis and Applications 29, no. 4 (January 2008): 1120–46. http://dx.doi.org/10.1137/060649021.

18

Kokolakis, G., Ph Nanopoulos, and D. Fouskakis. "Bregman divergences in the -partitioning problem." Computational Statistics & Data Analysis 51, no. 2 (November 2006): 668–78. http://dx.doi.org/10.1016/j.csda.2006.02.017.

19

Nielsen, Frank. "Divergences Induced by the Cumulant and Partition Functions of Exponential Families and Their Deformations Induced by Comparative Convexity." Entropy 26, no. 3 (February 23, 2024): 193. http://dx.doi.org/10.3390/e26030193.

Abstract:
Exponential families are statistical models which are the workhorses in statistics, information theory, and machine learning, among others. An exponential family can either be normalized subtractively by its cumulant or free energy function, or equivalently normalized divisively by its partition function. Both the cumulant and partition functions are strictly convex and smooth functions inducing corresponding pairs of Bregman and Jensen divergences. It is well known that skewed Bhattacharyya distances between the probability densities of an exponential family amount to skewed Jensen divergences induced by the cumulant function between their corresponding natural parameters, and that in limit cases the sided Kullback–Leibler divergences amount to reverse-sided Bregman divergences. In this work, we first show that the α-divergences between non-normalized densities of an exponential family amount to scaled α-skewed Jensen divergences induced by the partition function. We then show how comparative convexity with respect to a pair of quasi-arithmetical means allows both convex functions and their arguments to be deformed, thereby defining dually flat spaces with corresponding divergences when ordinary convexity is preserved.
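The identity behind the Bregman side of this correspondence is standard and worth stating: for two members p_{\theta_1}, p_{\theta_2} of the same exponential family with cumulant function F,

KL(p_{\theta_1} : p_{\theta_2}) = B_F(\theta_2 : \theta_1) = F(\theta_2) - F(\theta_1) - \langle \theta_2 - \theta_1, \nabla F(\theta_1) \rangle,

i.e., the sided KL divergences amount to reverse-sided Bregman divergences on the natural parameters, as the abstract recalls.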
20

Nielsen, Frank, and Richard Nock. "Generalizing Skew Jensen Divergences and Bregman Divergences With Comparative Convexity." IEEE Signal Processing Letters 24, no. 8 (August 2017): 1123–27. http://dx.doi.org/10.1109/lsp.2017.2712195.

21

Wada, Tatsuaki. "Generalized log-likelihood functions and Bregman divergences." Journal of Mathematical Physics 50, no. 11 (November 2009): 113301. http://dx.doi.org/10.1063/1.3257917.

22

Sun, Jigang, Malcolm Crowe, and Colin Fyfe. "Extending metric multidimensional scaling with Bregman divergences." Pattern Recognition 44, no. 5 (May 2011): 1137–54. http://dx.doi.org/10.1016/j.patcog.2010.11.013.

23

Venkatesan, R. C., and A. Plastino. "Scaled Bregman divergences in a Tsallis scenario." Physica A: Statistical Mechanics and its Applications 390, no. 15 (August 2011): 2749–58. http://dx.doi.org/10.1016/j.physa.2011.03.012.

24

Chen, P., Y. Chen, and M. Rao. "Metrics defined by Bregman divergences: Part 2." Communications in Mathematical Sciences 6, no. 4 (2008): 927–48. http://dx.doi.org/10.4310/cms.2008.v6.n4.a7.

25

Donmez, Mehmet A., Huseyin A. Inan, and Suleyman S. Kozat. "Adaptive mixture methods based on Bregman divergences." Digital Signal Processing 23, no. 1 (January 2013): 86–97. http://dx.doi.org/10.1016/j.dsp.2012.09.006.

26

Santos-Rodríguez, Raúl, Alicia Guerrero-Curieses, Rocío Alaiz-Rodríguez, and Jesús Cid-Sueiro. "Cost-sensitive learning based on Bregman divergences." Machine Learning 76, no. 2-3 (July 23, 2009): 271–85. http://dx.doi.org/10.1007/s10994-009-5132-8.

27

López-Rubio, Ezequiel, Esteban José Palomo, and Enrique Domínguez. "Bregman Divergences for Growing Hierarchical Self-Organizing Networks." International Journal of Neural Systems 24, no. 04 (April 3, 2014): 1450016. http://dx.doi.org/10.1142/s0129065714500166.

Abstract:
Growing hierarchical self-organizing models are characterized by the flexibility of their structure, which can easily accommodate complex input datasets. However, most proposals use the Euclidean distance as the only error measure. Here we propose a way to introduce Bregman divergences in these models, based on stochastic approximation principles, so that more general distortion measures can be employed. A procedure is derived to compare the performance of networks using different divergences. Moreover, a probabilistic interpretation of the model is provided, which enables its use as a Bayesian classifier. Experimental results are presented for classification and data visualization applications, which show the advantages of these divergences with respect to the classical Euclidean distance.
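A hedged illustration of the core substitution such models build on (my sketch, not the authors' network): replacing Euclidean winner selection in a self-organizing map with an arbitrary Bregman divergence, here Itakura-Saito:

import numpy as np

def itakura_saito(x, w, eps=1e-12):
    # Bregman divergence generated by F(x) = -sum_i log x_i,
    # commonly used for nonnegative data such as spectra.
    r = (x + eps) / (w + eps)
    return float(np.sum(r - np.log(r) - 1.0))

def best_matching_unit(x, weights, divergence=itakura_saito):
    # Winner selection: the unit minimizing D(x : w_j), generalizing
    # the usual Euclidean argmin over the network's weight vectors.
    d = np.array([divergence(x, w) for w in weights])
    return int(d.argmin())

rng = np.random.default_rng(0)
weights = rng.random((16, 8)) + 0.1  # toy positive weight vectors
x = rng.random(8) + 0.1
print(best_matching_unit(x, weights))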
28

Stummer, Wolfgang, and Igor Vajda. "On Bregman Distances and Divergences of Probability Measures." IEEE Transactions on Information Theory 58, no. 3 (March 2012): 1277–88. http://dx.doi.org/10.1109/tit.2011.2178139.

29

Bağcı, G. B., Altuğ Arda, and Ramazan Sever. "On the Problem of Constraints in Nonextensive Formalism: A Quantum Mechanical Treatment." International Journal of Modern Physics B 20, no. 14 (June 10, 2006): 2085–92. http://dx.doi.org/10.1142/s0217979206034510.

Abstract:
Relative entropy (divergence) of the Bregman type, recently proposed by T. D. Frank and Jan Naudts, is considered, and its quantum counterpart is used to calculate the purity of the Werner state in the nonextensive formalism. It has been observed that two different expressions arise due to two different forms of quantum divergences. It is then argued that the difference is due to the fact that the relative entropy of the Bregman type is related to the first-choice thermostatistics, whereas that of the Csiszár type is related to the third-choice thermostatistics. The superiority of the third-choice thermostatistics over the first-choice thermostatistics has been deduced by noticing that the expression obtained by using the Bregman type leads to negative values for q ∈ (0, 1) and fidelity F smaller than 1, whereas the one obtained by using the Csiszár type is free from such anomalies. Moreover, it has been noted that these two measures show different qualitative behavior with respect to F. Contrary to the classical case, the violation of the positive definiteness of the relative entropy immediately results in a choice between the two constraints without any need of the more abstract Shore–Johnson axioms. The possibility of writing a relative entropy of the Bregman type compatible with the third choice has been investigated further. The answer turns out to be negative as far as the usual transformation from ordinary probabilities to escort probabilities is considered.
30

Edelsbrunner, Herbert, Katharina Ölsböck, and Hubert Wagner. "Understanding Higher-Order Interactions in Information Space." Entropy 26, no. 8 (July 27, 2024): 637. http://dx.doi.org/10.3390/e26080637.

Abstract:
Methods used in topological data analysis naturally capture higher-order interactions in point cloud data embedded in a metric space. This methodology was recently extended to data living in an information space, by which we mean a space measured with an information theoretical distance. One such setting is a finite collection of discrete probability distributions embedded in the probability simplex measured with the relative entropy (Kullback–Leibler divergence). More generally, one can work with a Bregman divergence parameterized by a different notion of entropy. While theoretical algorithms exist for this setup, there is a paucity of implementations for exploring and comparing geometric-topological properties of various information spaces. The interest of this work is therefore twofold. First, we propose the first robust algorithms and software for geometric and topological data analysis in information space. Perhaps surprisingly, despite working with Bregman divergences, our design reuses robust libraries for the Euclidean case. Second, using the new software, we take the first steps towards understanding the geometric-topological structure of these spaces. In particular, we compare them with the more familiar spaces equipped with the Euclidean and Fisher metrics.
31

Penev, Spiridon, and Tania Prvan. "Robust estimation in structural equation models using Bregman divergences." ANZIAM Journal 54 (September 24, 2013): 574. http://dx.doi.org/10.21914/anziamj.v54i0.6306.

32

Virosztek, Dániel. "Maps on Quantum States Preserving Bregman and Jensen Divergences." Letters in Mathematical Physics 106, no. 9 (June 28, 2016): 1217–34. http://dx.doi.org/10.1007/s11005-016-0868-0.

33

Li, Xinyao, and Akhilesh Tyagi. "Block-Active ADMM to Minimize NMF with Bregman Divergences." Sensors 23, no. 16 (August 17, 2023): 7229. http://dx.doi.org/10.3390/s23167229.

Abstract:
Over the last ten years, there has been significant interest in employing nonnegative matrix factorization (NMF) to reduce dimensionality and enable more efficient clustering analysis in machine learning. This technique has been applied in various image processing applications within the fields of computer vision and sensor-based systems. Many algorithms exist to solve the NMF problem. Among them, the alternating direction method of multipliers (ADMM) and its variants are among the most popular methods used in practice. In this paper, we propose a block-active ADMM method to minimize the NMF problem with general Bregman divergences. The subproblems in the ADMM are solved iteratively by a block-coordinate-descent-type (BCD-type) method. In particular, each block is chosen directly based on the stationary condition. As a result, we are able to use far fewer auxiliary variables, and the proposed algorithm converges faster than previously proposed algorithms. From the theoretical point of view, the proposed algorithm is proved to converge to a stationary point sublinearly. We also conduct a series of numerical experiments to demonstrate the superiority of the proposed algorithm.
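For orientation (not the paper's block-active ADMM): the objective being minimized is a Bregman divergence D(V || WH) subject to W, H >= 0. A minimal sketch of the same objective for the generalized KL case, solved with the classical Lee-Seung multiplicative updates instead of ADMM:

import numpy as np

def nmf_generalized_kl(V, k, iters=200, seed=0):
    # Minimize the generalized KL divergence D(V || WH), a Bregman
    # divergence, via multiplicative updates; eps avoids division by zero.
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, k)) + 0.1
    H = rng.random((k, n)) + 0.1
    eps = 1e-12
    ones = np.ones_like(V)
    for _ in range(iters):
        WH = W @ H + eps
        H *= (W.T @ (V / WH)) / (W.T @ ones + eps)
        WH = W @ H + eps
        W *= ((V / WH) @ H.T) / (ones @ H.T + eps)
    return W, H

V = np.abs(np.random.default_rng(1).normal(size=(20, 30)))
W, H = nmf_generalized_kl(V, k=5)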
34

Quadeer, Maria, Marco Tomamichel, and Christopher Ferrie. "Minimax quantum state estimation under Bregman divergence." Quantum 3 (March 4, 2019): 126. http://dx.doi.org/10.22331/q-2019-03-04-126.

Abstract:
We investigate minimax estimators for quantum state tomography under general Bregman divergences. First, generalizing the work of Koyama et al. [Entropy 19, 618 (2017)] for relative entropy, we find that given any estimator for a quantum state, there always exists a sequence of Bayes estimators that asymptotically perform at least as well as the given estimator, on any state. Second, we show that there always exists a sequence of priors for which the corresponding sequence of Bayes estimators is asymptotically minimax (i.e. it minimizes the worst-case risk). Third, by re-formulating Holevo's theorem for the covariant state estimation problem in terms of estimators, we find that there exists a covariant measurement that is, in fact, minimax (i.e. it minimizes the worst-case risk). Moreover, we find that a measurement that is covariant only under a unitary 2-design is also minimax. Lastly, in an attempt to understand the problem of finding minimax measurements for general state estimation, we study the qubit case in detail and find that every spherical 2-design is a minimax measurement.
35

Molnár, Lajos, József Pitrik, and Dániel Virosztek. "Maps on positive definite matrices preserving Bregman and Jensen divergences." Linear Algebra and its Applications 495 (April 2016): 174–89. http://dx.doi.org/10.1016/j.laa.2016.01.010.

36

Kanamori, Takafumi, and Takashi Takenouchi. "Graph-based composite local Bregman divergences on discrete sample spaces." Neural Networks 95 (November 2017): 44–56. http://dx.doi.org/10.1016/j.neunet.2017.06.005.

37

Harandi, Mehrtash T., Richard Hartley, Brian Lovell, and Conrad Sanderson. "Sparse Coding on Symmetric Positive Definite Manifolds Using Bregman Divergences." IEEE Transactions on Neural Networks and Learning Systems 27, no. 6 (June 2016): 1294–306. http://dx.doi.org/10.1109/tnnls.2014.2387383.

38

Vial, Pierre-Hugo, Paul Magron, Thomas Oberlin, and Cedric Fevotte. "Phase Retrieval With Bregman Divergences and Application to Audio Signal Recovery." IEEE Journal of Selected Topics in Signal Processing 15, no. 1 (January 2021): 51–64. http://dx.doi.org/10.1109/jstsp.2021.3051870.

39

Reem, Daniel, Simeon Reich, and Alvaro De Pierro. "Re-examination of Bregman functions and new properties of their divergences." Optimization 68, no. 1 (November 19, 2018): 279–348. http://dx.doi.org/10.1080/02331934.2018.1543295.

40

Teng, Yueyang, Nimu Yuan, Yaonan Zhang, Shouliang Qi, and Yan Kang. "Family of Iterative Reconstruction Algorithms for Medical Imaging with Bregman-Divergences." Journal of Medical Imaging and Health Informatics 5, no. 8 (December 1, 2015): 1708–14. http://dx.doi.org/10.1166/jmihi.2015.1633.

41

Liang, Zhizheng, Lei Zhang, Jin Liu, and Yong Zhou. "Adaptively weighted learning for twin support vector machines via Bregman divergences." Neural Computing and Applications 32, no. 8 (November 2, 2018): 3323–36. http://dx.doi.org/10.1007/s00521-018-3843-0.

42

Paul, Grégory, Janick Cardinale, and Ivo F. Sbalzarini. "Coupling Image Restoration and Segmentation: A Generalized Linear Model/Bregman Perspective." International Journal of Computer Vision 104, no. 1 (March 8, 2013): 69–93. http://dx.doi.org/10.1007/s11263-013-0615-2.

Abstract:
We introduce a new class of data-fitting energies that couple image segmentation with image restoration. These functionals model the image intensity using the statistical framework of generalized linear models. By duality, we establish an information-theoretic interpretation using Bregman divergences. We demonstrate how this formulation couples in a principled way image restoration tasks such as denoising, deblurring (deconvolution), and inpainting with segmentation. We present an alternating minimization algorithm to solve the resulting composite photometric/geometric inverse problem. We use Fisher scoring to solve the photometric problem and to provide asymptotic uncertainty estimates. We derive the shape gradient of our data-fitting energy and investigate convex relaxation for the geometric problem. We introduce a new alternating split-Bregman strategy to solve the resulting convex problem and present experiments and comparisons on both synthetic and real-world images.
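The duality invoked here is the classical bijection between exponential families and Bregman divergences (due to Banerjee et al., stated for background, not the paper's derivation): writing the density as p(x | \theta) = h(x) \exp(\langle x, \theta \rangle - \psi(\theta)) and letting \phi = \psi^* be the convex conjugate with mean parameter \mu = \nabla\psi(\theta), the negative log-likelihood is a Bregman divergence up to terms independent of \theta:

-\log p(x | \theta) = B_\phi(x : \mu) - \phi(x) - \log h(x),

which is why fitting a generalized linear model amounts to minimizing a Bregman data-fitting energy.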
43

Mavridis, Christos N., and John S. Baras. "Convergence of Stochastic Vector Quantization and Learning Vector Quantization with Bregman Divergences." IFAC-PapersOnLine 53, no. 2 (2020): 2214–19. http://dx.doi.org/10.1016/j.ifacol.2020.12.006.

44

Pronzato, Luc, Henry P. Wynn, and Anatoly Zhigljavsky. "Bregman divergences based on optimal design criteria and simplicial measures of dispersion." Statistical Papers 60, no. 2 (January 16, 2019): 545–64. http://dx.doi.org/10.1007/s00362-018-01082-8.

45

Sasaki, Hiroaki, Yung-Kyun Noh, Gang Niu, and Masashi Sugiyama. "Direct Density Derivative Estimation." Neural Computation 28, no. 6 (June 2016): 1101–40. http://dx.doi.org/10.1162/neco_a_00835.

Abstract:
Estimating the derivatives of probability density functions is an essential step in statistical data analysis. A naive approach to estimate the derivatives is to first perform density estimation and then compute its derivatives. However, this approach can be unreliable because a good density estimator does not necessarily mean a good density derivative estimator. To cope with this problem, in this letter, we propose a novel method that directly estimates density derivatives without going through density estimation. The proposed method provides computationally efficient estimation for the derivatives of any order on multidimensional data with a hyperparameter tuning method and achieves the optimal parametric convergence rate. We further discuss an extension of the proposed method by applying regularized multitask learning and a general framework for density derivative estimation based on Bregman divergences. Applications of the proposed method to nonparametric Kullback-Leibler divergence approximation and bandwidth matrix selection in kernel density estimation are also explored.
46

Adamčík, Martin. "The Information Geometry of Bregman Divergences and Some Applications in Multi-Expert Reasoning." Entropy 16, no. 12 (December 1, 2014): 6338–81. http://dx.doi.org/10.3390/e16126338.

47

Hughes, Gareth, and Cairistiona Topp. "Probabilistic Forecasts: Scoring Rules and Their Decomposition and Diagrammatic Representation via Bregman Divergences." Entropy 17, no. 12 (July 31, 2015): 5450–71. http://dx.doi.org/10.3390/e17085450.

48

Andrieux, Stéphane. "Bregman divergences for physically informed discrepancy measures for learning and computation in thermomechanics." Comptes Rendus. Mécanique 351, G1 (February 1, 2023): 59–81. http://dx.doi.org/10.5802/crmeca.164.

49

Wada, Tatsuaki. "Erratum: “Generalized log-likelihood functions and Bregman divergences” [J. Math. Phys. 80, 113301 (2009)]." Journal of Mathematical Physics 51, no. 4 (April 2010): 049901. http://dx.doi.org/10.1063/1.3376658.

50

Ferreira, Daniela Portes Leal, Eraldo Ribeiro, and Celia A. Zorzo Barcelos. "A variational approach to non-rigid image registration with Bregman divergences and multiple features." Pattern Recognition 77 (May 2018): 237–47. http://dx.doi.org/10.1016/j.patcog.2017.12.015.
