Follow this link to see other types of publications on the topic: Divergences de Bregman.

Journal articles on the topic "Divergences de Bregman"

Cite a source in APA, MLA, Chicago, Harvard, and many other styles.

See the top 50 journal articles for research on the topic "Divergences de Bregman".

Next to every source in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scientific publication in .pdf format and read the abstract online, if it is present in the metadata.

Browse journal articles from many scientific disciplines and compile a correct bibliography.

1

Amari, S., and A. Cichocki. "Information geometry of divergence functions". Bulletin of the Polish Academy of Sciences: Technical Sciences 58, no. 1 (March 1, 2010): 183–95. http://dx.doi.org/10.2478/v10175-010-0019-1.

Abstract:
Measures of divergence between two points play a key role in many engineering problems. One such measure is a distance function, but there are many important measures which do not satisfy the properties of the distance. The Bregman divergence, Kullback-Leibler divergence and f-divergence are such measures. In the present article, we study the differential-geometrical structure of a manifold induced by a divergence function. It consists of a Riemannian metric and a pair of dually coupled affine connections, which are studied in information geometry. The class of Bregman divergences is characterized by a dually flat structure, which originates from the Legendre duality. A dually flat space admits a generalized Pythagorean theorem. The class of f-divergences, defined on a manifold of probability distributions, is characterized by information monotonicity, and the Kullback-Leibler divergence belongs to the intersection of both classes. The f-divergence always gives the α-geometry, which consists of the Fisher information metric and a dual pair of ±α-connections. The α-divergence is a special class of f-divergences. It is unique, sitting at the intersection of the f-divergence and Bregman divergence classes in a manifold of positive measures. The geometry derived from the Tsallis q-entropy and related divergences is also addressed.
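For readers skimming this list, the object shared by most of the entries below has a one-line definition: for a strictly convex, differentiable generator F, the Bregman divergence is D_F(x, y) = F(x) − F(y) − ⟨∇F(y), x − y⟩. A minimal sketch (the generators and test points are our own choices, not from the article):

```python
import numpy as np

def bregman(F, grad_F, x, y):
    """Bregman divergence D_F(x, y) = F(x) - F(y) - <grad F(y), x - y>."""
    return F(x) - F(y) - np.dot(grad_F(y), x - y)

# Generator ||x||^2 recovers the squared Euclidean distance.
sq = lambda x: np.dot(x, x)
sq_grad = lambda x: 2.0 * x

# Negative Shannon entropy recovers the Kullback-Leibler divergence
# (for normalized probability vectors).
negent = lambda x: np.sum(x * np.log(x))
negent_grad = lambda x: np.log(x) + 1.0

x = np.array([0.2, 0.3, 0.5])
y = np.array([0.4, 0.4, 0.2])
d_sq = bregman(sq, sq_grad, x, y)          # equals sum((x - y)**2)
d_kl = bregman(negent, negent_grad, x, y)  # equals sum(x * log(x / y))
```

Note that a Bregman divergence is asymmetric in general: `bregman(negent, negent_grad, y, x)` differs from `d_kl`.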
2

Abdullah, Amirali, John Moeller, and Suresh Venkatasubramanian. "Approximate Bregman Near Neighbors in Sublinear Time: Beyond the Triangle Inequality". International Journal of Computational Geometry & Applications 23, no. 04n05 (August 2013): 253–301. http://dx.doi.org/10.1142/s0218195913600066.

Abstract:
Bregman divergences are important distance measures that are used extensively in data-driven applications such as computer vision, text mining, and speech processing, and are a key focus of interest in machine learning. Answering nearest neighbor (NN) queries under these measures is very important in these applications and has been the subject of extensive study, but is problematic because these distance measures lack metric properties like symmetry and the triangle inequality. In this paper, we present the first provably approximate nearest-neighbor (ANN) algorithms for a broad sub-class of Bregman divergences under some assumptions. Specifically, we examine Bregman divergences which can be decomposed along each dimension and our bounds also depend on restricting the size of our allowed domain. We obtain bounds for both the regular asymmetric Bregman divergences as well as their symmetrized versions. To do so, we develop two geometric properties vital to our analysis: a reverse triangle inequality (RTI) and a relaxed triangle inequality called μ-defectiveness where μ is a domain-dependent value. Bregman divergences satisfy the RTI but not μ-defectiveness. However, we show that the square root of a Bregman divergence does satisfy μ-defectiveness. This allows us to then utilize both properties in an efficient search data structure that follows the general two-stage paradigm of a ring-tree decomposition followed by a quad tree search used in previous near-neighbor algorithms for Euclidean space and spaces of bounded doubling dimension. Our first algorithm resolves a query for a d-dimensional (1 + ε)-ANN in [Formula: see text] time and O(n log^(d−1) n) space and holds for generic μ-defective distance measures satisfying a RTI. Our second algorithm is more specific in analysis to the Bregman divergences and uses a further structural parameter, the maximum ratio of second derivatives over each dimension of our allowed domain (c0). This allows us to locate a (1 + ε)-ANN in O(log n) time and O(n) space, where there is a further (c0)^d factor in the big-O for the query time.
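The missing triangle inequality, and its repair by taking square roots, can be seen already in one dimension, where the Bregman generator F(x) = x² gives the squared Euclidean distance (the three test points below are our own):

```python
# Squared Euclidean distance is the Bregman divergence of F(x) = x**2.
d = lambda a, b: (a - b) ** 2

a, b, c = 0.0, 1.0, 2.0
# Triangle inequality fails for the divergence itself: d(a, c) = 4 > 1 + 1.
violates = d(a, c) > d(a, b) + d(b, c)
# It holds after taking square roots: 2 <= 1 + 1.
repaired = d(a, c) ** 0.5 <= d(a, b) ** 0.5 + d(b, c) ** 0.5
```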
3

Nielsen, Frank. "On Voronoi Diagrams on the Information-Geometric Cauchy Manifolds". Entropy 22, no. 7 (June 28, 2020): 713. http://dx.doi.org/10.3390/e22070713.

Abstract:
We study the Voronoi diagrams of a finite set of Cauchy distributions and their dual complexes from the viewpoint of information geometry by considering the Fisher-Rao distance, the Kullback-Leibler divergence, the chi square divergence, and a flat divergence derived from Tsallis entropy related to the conformal flattening of the Fisher-Rao geometry. We prove that the Voronoi diagrams of the Fisher-Rao distance, the chi square divergence, and the Kullback-Leibler divergences all coincide with a hyperbolic Voronoi diagram on the corresponding Cauchy location-scale parameters, and that the dual Cauchy hyperbolic Delaunay complexes are Fisher orthogonal to the Cauchy hyperbolic Voronoi diagrams. The dual Voronoi diagrams with respect to the dual flat divergences amount to dual Bregman Voronoi diagrams, and their dual complexes are regular triangulations. The primal Bregman Voronoi diagram is the Euclidean Voronoi diagram and the dual Bregman Voronoi diagram coincides with the Cauchy hyperbolic Voronoi diagram. In addition, we prove that the square root of the Kullback-Leibler divergence between Cauchy distributions yields a metric distance which is Hilbertian for the Cauchy scale families.
4

Li, Wuchen. "Transport information Bregman divergences". Information Geometry 4, no. 2 (November 15, 2021): 435–70. http://dx.doi.org/10.1007/s41884-021-00063-5.

5

Nielsen, Frank. "On a Generalization of the Jensen–Shannon Divergence and the Jensen–Shannon Centroid". Entropy 22, no. 2 (February 16, 2020): 221. http://dx.doi.org/10.3390/e22020221.

Abstract:
The Jensen–Shannon divergence is a renowned bounded symmetrization of the Kullback–Leibler divergence which does not require probability densities to have matching supports. In this paper, we introduce a vector-skew generalization of the scalar α-Jensen–Bregman divergences and derive from it the vector-skew α-Jensen–Shannon divergences. We prove that the vector-skew α-Jensen–Shannon divergences are f-divergences and study the properties of these novel divergences. Finally, we report an iterative algorithm to numerically compute the Jensen–Shannon-type centroids for a set of probability densities belonging to a mixture family: this includes the case of the Jensen–Shannon centroid of a set of categorical distributions or normalized histograms.
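As context for the generalization, the plain Jensen–Shannon divergence averages two KL divergences to the midpoint mixture; it stays finite on mismatched supports and is bounded by log 2. A small sketch (the distributions are our own choices):

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence with the 0 * log 0 = 0 convention."""
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

def jensen_shannon(p, q):
    """JS(p, q) = KL(p, m)/2 + KL(q, m)/2, where m is the midpoint mixture."""
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = np.array([0.9, 0.1, 0.0])
q = np.array([0.0, 0.1, 0.9])   # disjoint mass: KL(p, q) itself would be infinite
js = jensen_shannon(p, q)       # finite, symmetric, and at most log(2)
```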
6

Chen, P., Y. Chen, and M. Rao. "Metrics defined by Bregman Divergences". Communications in Mathematical Sciences 6, no. 4 (2008): 915–26. http://dx.doi.org/10.4310/cms.2008.v6.n4.a6.

7

Bock, Andreas A., and Martin S. Andersen. "Preconditioner Design via Bregman Divergences". SIAM Journal on Matrix Analysis and Applications 45, no. 2 (June 7, 2024): 1148–82. http://dx.doi.org/10.1137/23m1566637.

8

Zhang, Jianwen, and Changshui Zhang. "Multitask Bregman Clustering". Proceedings of the AAAI Conference on Artificial Intelligence 24, no. 1 (July 3, 2010): 655–60. http://dx.doi.org/10.1609/aaai.v24i1.7674.

Abstract:
Traditional clustering methods deal with a single clustering task on a single data set. However, in some newly emerging applications, multiple similar clustering tasks are involved simultaneously. In this case, we not only desire a partition for each task, but also want to discover the relationship among clusters of different tasks. It is also expected that the learned relationship among tasks can improve the performance of each single task. In this paper, we propose a general framework for this problem and further suggest a specific approach. In our approach, we alternately update clusters and learn the relationship between clusters of different tasks, and the two phases boost each other. Our approach is based on the general Bregman divergence, hence it is suitable for a large family of assumptions on data distributions and divergences. Empirical results on several benchmark data sets validate the approach.
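The single-task building block of such frameworks is Bregman hard clustering, where the assignment step uses the chosen divergence and the update step is always the arithmetic mean (which minimizes the total Bregman divergence to a cluster's points). A minimal single-task sketch, with the divergence choice and synthetic data being our own; the paper's multitask coupling is not shown:

```python
import numpy as np

def bregman_kmeans(X, k, divergence, n_iter=20, seed=0):
    """Hard Bregman clustering: assign by the divergence, update by the mean."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Assignment step: nearest center under the (possibly asymmetric) divergence.
        labels = np.array([np.argmin([divergence(x, c) for c in centers]) for x in X])
        # Update step: the arithmetic mean minimizes any Bregman divergence.
        centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels, centers

squared_euclidean = lambda x, c: np.sum((x - c) ** 2)
X = np.vstack([np.random.default_rng(1).normal(0.0, 0.1, (20, 2)),
               np.random.default_rng(2).normal(3.0, 0.1, (20, 2))])
labels, centers = bregman_kmeans(X, k=2, divergence=squared_euclidean)
```

Swapping in a different decomposable divergence (e.g. generalized KL) changes only the assignment step.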
9

Liang, Xiao. "A Note on Divergences". Neural Computation 28, no. 10 (October 2016): 2045–62. http://dx.doi.org/10.1162/neco_a_00878.

Abstract:
In many areas of neural computation, like learning, optimization, estimation, and inference, suitable divergences play a key role. In this note, we study the conjecture presented by Amari (2009) and find a counterexample to show that the conjecture does not hold generally. Moreover, we investigate two classes of [Formula: see text]-divergence (Zhang, 2004), weighted f-divergence and weighted [Formula: see text]-divergence, and prove that if a divergence is a weighted f-divergence, as well as a Bregman divergence, then it is a weighted [Formula: see text]-divergence. This result reduces in form to the main theorem established by Amari (2009) when [Formula: see text] [Formula: see text].
10

Voronov, Yuri, and Matvej Sviridov. "New Interregional Differences Measurement Tools". Interexpo GEO-Siberia 3, no. 1 (2019): 71–77. http://dx.doi.org/10.33764/2618-981x-2019-3-1-71-77.

Abstract:
The article proposes new methods for measuring differences between the economic development levels of RF regions. Currently, indicators such as the mean, median, and mode are used for comparison, and the choice among these three options is made not on formal criteria but on the basis of general reasoning. The authors propose to build a system of comparisons on the basis of Bregman divergences. These are already beginning to be used by biologists when comparing regions, but have not yet been applied in economic analysis. The Kullback-Leibler divergence (KL-divergence), the primary variant of the Bregman divergence, is investigated. Pairwise comparisons of the level of immigration to the regions are proposed as a criterion of the quality of measurement of differences. The main calculations are made on the example of the level of wages in the regions of the south of Western Siberia. The conclusion is drawn about the prospects of using the existing variants of Bregman divergence, as well as about the possibility of developing new divergence options that best match the comparative attractiveness of the regions for immigrants.
11

Nielsen, Frank. "Generalizing the Alpha-Divergences and the Oriented Kullback–Leibler Divergences with Quasi-Arithmetic Means". Algorithms 15, no. 11 (November 17, 2022): 435. http://dx.doi.org/10.3390/a15110435.

Abstract:
The family of α-divergences including the oriented forward and reverse Kullback–Leibler divergences is often used in signal processing, pattern recognition, and machine learning, among others. Choosing a suitable α-divergence can either be done beforehand according to some prior knowledge of the application domains or directly learned from data sets. In this work, we generalize the α-divergences using a pair of strictly comparable weighted means. Our generalization allows us to obtain in the limit case α→1 the 1-divergence, which provides a generalization of the forward Kullback–Leibler divergence, and in the limit case α→0, the 0-divergence, which corresponds to a generalization of the reverse Kullback–Leibler divergence. We then analyze the condition for a pair of weighted quasi-arithmetic means to be strictly comparable and describe the family of quasi-arithmetic α-divergences including its subfamily of power homogeneous α-divergences. In particular, we study the generalized quasi-arithmetic 1-divergences and 0-divergences and show that these counterpart generalizations of the oriented Kullback–Leibler divergences can be rewritten as equivalent conformal Bregman divergences using strictly monotone embeddings. Finally, we discuss the applications of these novel divergences to k-means clustering by studying the robustness property of the centroids.
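For reference, the standard α-divergence family being generalized here interpolates between the reverse KL divergence (α→0) and the forward KL divergence (α→1). A numeric sketch in the positive-measure form (the parameter values are our own):

```python
import numpy as np

def alpha_divergence(p, q, alpha):
    """alpha-divergence between positive measures: alpha -> 1 gives the
    (generalized) KL(p, q), alpha -> 0 gives the reverse KL(q, p)."""
    if np.isclose(alpha, 1.0):
        return np.sum(p * np.log(p / q) - p + q)
    if np.isclose(alpha, 0.0):
        return np.sum(q * np.log(q / p) - q + p)
    return np.sum(alpha * p + (1 - alpha) * q
                  - p**alpha * q**(1 - alpha)) / (alpha * (1 - alpha))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.5, 0.3])
near_kl = alpha_divergence(p, q, 0.999)  # approaches the forward KL divergence
kl = alpha_divergence(p, q, 1.0)
```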
12

Nielsen, Frank. "Statistical Divergences between Densities of Truncated Exponential Families with Nested Supports: Duo Bregman and Duo Jensen Divergences". Entropy 24, no. 3 (March 17, 2022): 421. http://dx.doi.org/10.3390/e24030421.

Abstract:
By calculating the Kullback–Leibler divergence between two probability measures belonging to different exponential families dominated by the same measure, we obtain a formula that generalizes the ordinary Fenchel–Young divergence. Inspired by this formula, we define the duo Fenchel–Young divergence and report a majorization condition on its pair of strictly convex generators, which guarantees that this divergence is always non-negative. The duo Fenchel–Young divergence is also equivalent to a duo Bregman divergence. We show how to use these duo divergences by calculating the Kullback–Leibler divergence between densities of truncated exponential families with nested supports, and report a formula for the Kullback–Leibler divergence between truncated normal distributions. Finally, we prove that the skewed Bhattacharyya distances between truncated exponential families amount to equivalent skewed duo Jensen divergences.
13

Nock, R., and F. Nielsen. "Bregman Divergences and Surrogates for Learning". IEEE Transactions on Pattern Analysis and Machine Intelligence 31, no. 11 (November 2009): 2048–59. http://dx.doi.org/10.1109/tpami.2008.225.

14

Fischer, Aurélie. "Quantization and clustering with Bregman divergences". Journal of Multivariate Analysis 101, no. 9 (October 2010): 2207–21. http://dx.doi.org/10.1016/j.jmva.2010.05.008.

15

Santos-Rodriguez, R., and J. Cid-Sueiro. "Cost-Sensitive Sequences of Bregman Divergences". IEEE Transactions on Neural Networks and Learning Systems 23, no. 12 (December 2012): 1896–904. http://dx.doi.org/10.1109/tnnls.2012.2219067.

16

Sun, Jigang, Colin Fyfe, and Malcolm Crowe. "Extending Sammon mapping with Bregman divergences". Information Sciences 187 (March 2012): 72–92. http://dx.doi.org/10.1016/j.ins.2011.10.013.

17

Dhillon, Inderjit S., and Joel A. Tropp. "Matrix Nearness Problems with Bregman Divergences". SIAM Journal on Matrix Analysis and Applications 29, no. 4 (January 2008): 1120–46. http://dx.doi.org/10.1137/060649021.

18

Kokolakis, G., Ph. Nanopoulos, and D. Fouskakis. "Bregman divergences in the -partitioning problem". Computational Statistics & Data Analysis 51, no. 2 (November 2006): 668–78. http://dx.doi.org/10.1016/j.csda.2006.02.017.

19

Nielsen, Frank. "Divergences Induced by the Cumulant and Partition Functions of Exponential Families and Their Deformations Induced by Comparative Convexity". Entropy 26, no. 3 (February 23, 2024): 193. http://dx.doi.org/10.3390/e26030193.

Abstract:
Exponential families are statistical models which are the workhorses in statistics, information theory, and machine learning, among others. An exponential family can either be normalized subtractively by its cumulant or free energy function, or equivalently normalized divisively by its partition function. Both the cumulant and partition functions are strictly convex and smooth functions inducing corresponding pairs of Bregman and Jensen divergences. It is well known that skewed Bhattacharyya distances between the probability densities of an exponential family amount to skewed Jensen divergences induced by the cumulant function between their corresponding natural parameters, and that in limit cases the sided Kullback–Leibler divergences amount to reverse-sided Bregman divergences. In this work, we first show that the α-divergences between non-normalized densities of an exponential family amount to scaled α-skewed Jensen divergences induced by the partition function. We then show how comparative convexity with respect to a pair of quasi-arithmetical means allows both convex functions and their arguments to be deformed, thereby defining dually flat spaces with corresponding divergences when ordinary convexity is preserved.
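The skewed Jensen divergence mentioned here is simply the convexity gap of a strictly convex generator at an interpolated parameter. A small sketch using the log-sum-exp cumulant function of the categorical family (the parameter values are our own):

```python
import numpy as np

def skew_jensen(F, theta1, theta2, alpha):
    """Skewed Jensen divergence
    J_F^alpha = alpha*F(theta1) + (1-alpha)*F(theta2) - F(alpha*theta1 + (1-alpha)*theta2),
    nonnegative for any strictly convex F by Jensen's inequality."""
    return (alpha * F(theta1) + (1 - alpha) * F(theta2)
            - F(alpha * theta1 + (1 - alpha) * theta2))

# Cumulant (log-partition) function of the categorical exponential family.
logsumexp = lambda t: np.log(np.sum(np.exp(t)))

t1 = np.array([0.1, 0.7, -0.3])
t2 = np.array([1.2, -0.4, 0.5])
j = skew_jensen(logsumexp, t1, t2, alpha=0.3)  # nonnegative; zero iff t1 == t2
```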
20

Nielsen, Frank, and Richard Nock. "Generalizing Skew Jensen Divergences and Bregman Divergences With Comparative Convexity". IEEE Signal Processing Letters 24, no. 8 (August 2017): 1123–27. http://dx.doi.org/10.1109/lsp.2017.2712195.

21

Wada, Tatsuaki. "Generalized log-likelihood functions and Bregman divergences". Journal of Mathematical Physics 50, no. 11 (November 2009): 113301. http://dx.doi.org/10.1063/1.3257917.

22

Sun, Jigang, Malcolm Crowe, and Colin Fyfe. "Extending metric multidimensional scaling with Bregman divergences". Pattern Recognition 44, no. 5 (May 2011): 1137–54. http://dx.doi.org/10.1016/j.patcog.2010.11.013.

23

Venkatesan, R. C., and A. Plastino. "Scaled Bregman divergences in a Tsallis scenario". Physica A: Statistical Mechanics and its Applications 390, no. 15 (August 2011): 2749–58. http://dx.doi.org/10.1016/j.physa.2011.03.012.

24

Chen, P., Y. Chen, and M. Rao. "Metrics defined by Bregman divergences: Part 2". Communications in Mathematical Sciences 6, no. 4 (2008): 927–48. http://dx.doi.org/10.4310/cms.2008.v6.n4.a7.

25

Donmez, Mehmet A., Huseyin A. Inan, and Suleyman S. Kozat. "Adaptive mixture methods based on Bregman divergences". Digital Signal Processing 23, no. 1 (January 2013): 86–97. http://dx.doi.org/10.1016/j.dsp.2012.09.006.

26

Santos-Rodríguez, Raúl, Alicia Guerrero-Curieses, Rocío Alaiz-Rodríguez, and Jesús Cid-Sueiro. "Cost-sensitive learning based on Bregman divergences". Machine Learning 76, no. 2-3 (July 23, 2009): 271–85. http://dx.doi.org/10.1007/s10994-009-5132-8.

27

López-Rubio, Ezequiel, Esteban José Palomo, and Enrique Domínguez. "Bregman Divergences for Growing Hierarchical Self-Organizing Networks". International Journal of Neural Systems 24, no. 04 (April 3, 2014): 1450016. http://dx.doi.org/10.1142/s0129065714500166.

Abstract:
Growing hierarchical self-organizing models are characterized by the flexibility of their structure, which can easily accommodate complex input datasets. However, most proposals use the Euclidean distance as the only error measure. Here we propose a way to introduce Bregman divergences in these models, which is based on stochastic approximation principles, so that more general distortion measures can be employed. A procedure is derived to compare the performance of networks using different divergences. Moreover, a probabilistic interpretation of the model is provided, which enables its use as a Bayesian classifier. Experimental results are presented for classification and data visualization applications, which show the advantages of these divergences with respect to the classical Euclidean distance.
28

Stummer, Wolfgang, and Igor Vajda. "On Bregman Distances and Divergences of Probability Measures". IEEE Transactions on Information Theory 58, no. 3 (March 2012): 1277–88. http://dx.doi.org/10.1109/tit.2011.2178139.

29

Bağcı, G. B., Altuğ Arda, and Ramazan Sever. "On the Problem of Constraints in Nonextensive Formalism: A Quantum Mechanical Treatment". International Journal of Modern Physics B 20, no. 14 (June 10, 2006): 2085–92. http://dx.doi.org/10.1142/s0217979206034510.

Abstract:
Relative entropy (divergence) of Bregman type recently proposed by T. D. Frank and Jan Naudts is considered and its quantum counterpart is used to calculate the purity of the Werner state in the nonextensive formalism. It has been observed that two different expressions arise due to two different forms of quantum divergences. It is then argued that the difference is due to the fact that the relative entropy of Bregman type is related to the first-choice thermostatistics whereas the one of Csiszár type is related to the third-choice thermostatistics. The superiority of the third-choice thermostatistics over the first-choice thermostatistics has been deduced by noticing that the expression obtained by using the Bregman type leads to negative values for q ∈ (0, 1) and fidelity F smaller than 1, whereas the one obtained by using the Csiszár type is free from such anomalies. Moreover, it has been noted that these two measures show different qualitative behavior with respect to F. Contrary to the classical case, the violation of the positive definiteness of the relative entropy immediately results in a choice between the two constraints without any need of the more abstract Shore–Johnson axioms. The possibility of writing a relative entropy of Bregman type compatible with the third choice has been investigated further. The answer turns out to be negative as far as the usual transformation from ordinary probabilities to escort probabilities is considered.
30

Edelsbrunner, Herbert, Katharina Ölsböck, and Hubert Wagner. "Understanding Higher-Order Interactions in Information Space". Entropy 26, no. 8 (July 27, 2024): 637. http://dx.doi.org/10.3390/e26080637.

Abstract:
Methods used in topological data analysis naturally capture higher-order interactions in point cloud data embedded in a metric space. This methodology was recently extended to data living in an information space, by which we mean a space measured with an information theoretical distance. One such setting is a finite collection of discrete probability distributions embedded in the probability simplex measured with the relative entropy (Kullback–Leibler divergence). More generally, one can work with a Bregman divergence parameterized by a different notion of entropy. While theoretical algorithms exist for this setup, there is a paucity of implementations for exploring and comparing geometric-topological properties of various information spaces. The interest of this work is therefore twofold. First, we propose the first robust algorithms and software for geometric and topological data analysis in information space. Perhaps surprisingly, despite working with Bregman divergences, our design reuses robust libraries for the Euclidean case. Second, using the new software, we take the first steps towards understanding the geometric-topological structure of these spaces. In particular, we compare them with the more familiar spaces equipped with the Euclidean and Fisher metrics.
31

Penev, Spiridon, and Tania Prvan. "Robust estimation in structural equation models using Bregman divergences". ANZIAM Journal 54 (September 24, 2013): 574. http://dx.doi.org/10.21914/anziamj.v54i0.6306.

32

Virosztek, Dániel. "Maps on Quantum States Preserving Bregman and Jensen Divergences". Letters in Mathematical Physics 106, no. 9 (June 28, 2016): 1217–34. http://dx.doi.org/10.1007/s11005-016-0868-0.

33

Li, Xinyao, and Akhilesh Tyagi. "Block-Active ADMM to Minimize NMF with Bregman Divergences". Sensors 23, no. 16 (August 17, 2023): 7229. http://dx.doi.org/10.3390/s23167229.

Abstract:
Over the last ten years, there has been a significant interest in employing nonnegative matrix factorization (NMF) to reduce dimensionality to enable a more efficient clustering analysis in machine learning. This technique has been applied in various image processing applications within the fields of computer vision and sensor-based systems. Many algorithms exist to solve the NMF problem. Among these algorithms, the alternating direction method of multipliers (ADMM) and its variants are one of the most popular methods used in practice. In this paper, we propose a block-active ADMM method to minimize the NMF problem with general Bregman divergences. The subproblems in the ADMM are solved iteratively by a block-coordinate-descent-type (BCD-type) method. In particular, each block is chosen directly based on the stationary condition. As a result, we are able to use much fewer auxiliary variables and the proposed algorithm converges faster than the previously proposed algorithms. From the theoretical point of view, the proposed algorithm is proved to converge to a stationary point sublinearly. We also conduct a series of numerical experiments to demonstrate the superiority of the proposed algorithm.
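As a baseline for what NMF solvers compute, here is the classical Lee–Seung multiplicative update for the Frobenius objective (the squared Euclidean loss, itself a Bregman divergence). This is our own minimal sketch, not the block-active ADMM of the paper:

```python
import numpy as np

def nmf_multiplicative(V, r, n_iter=200, seed=0, eps=1e-9):
    """Lee-Seung multiplicative updates minimizing ||V - W @ H||_F^2.
    Updates keep W and H elementwise nonnegative by construction."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

V = np.abs(np.random.default_rng(1).random((6, 5)))  # nonnegative data matrix
W, H = nmf_multiplicative(V, r=2)
residual = np.linalg.norm(V - W @ H)
```

Minimizing other Bregman divergences (e.g. generalized KL) changes the update ratios but keeps the same multiplicative structure.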
34

Quadeer, Maria, Marco Tomamichel, and Christopher Ferrie. "Minimax quantum state estimation under Bregman divergence". Quantum 3 (March 4, 2019): 126. http://dx.doi.org/10.22331/q-2019-03-04-126.

Abstract:
We investigate minimax estimators for quantum state tomography under general Bregman divergences. First, generalizing the work of Koyama et al. [Entropy 19, 618 (2017)] for relative entropy, we find that given any estimator for a quantum state, there always exists a sequence of Bayes estimators that asymptotically perform at least as well as the given estimator, on any state. Second, we show that there always exists a sequence of priors for which the corresponding sequence of Bayes estimators is asymptotically minimax (i.e. it minimizes the worst-case risk). Third, by re-formulating Holevo's theorem for the covariant state estimation problem in terms of estimators, we find that there exists a covariant measurement that is, in fact, minimax (i.e. it minimizes the worst-case risk). Moreover, we find that a measurement that is covariant only under a unitary 2-design is also minimax. Lastly, in an attempt to understand the problem of finding minimax measurements for general state estimation, we study the qubit case in detail and find that every spherical 2-design is a minimax measurement.
35

Molnár, Lajos, József Pitrik, and Dániel Virosztek. "Maps on positive definite matrices preserving Bregman and Jensen divergences". Linear Algebra and its Applications 495 (April 2016): 174–89. http://dx.doi.org/10.1016/j.laa.2016.01.010.

36

Kanamori, Takafumi, and Takashi Takenouchi. "Graph-based composite local Bregman divergences on discrete sample spaces". Neural Networks 95 (November 2017): 44–56. http://dx.doi.org/10.1016/j.neunet.2017.06.005.

37

Harandi, Mehrtash T., Richard Hartley, Brian Lovell, and Conrad Sanderson. "Sparse Coding on Symmetric Positive Definite Manifolds Using Bregman Divergences". IEEE Transactions on Neural Networks and Learning Systems 27, no. 6 (June 2016): 1294–306. http://dx.doi.org/10.1109/tnnls.2014.2387383.

38

Vial, Pierre-Hugo, Paul Magron, Thomas Oberlin, and Cedric Fevotte. "Phase Retrieval With Bregman Divergences and Application to Audio Signal Recovery". IEEE Journal of Selected Topics in Signal Processing 15, no. 1 (January 2021): 51–64. http://dx.doi.org/10.1109/jstsp.2021.3051870.

39

Reem, Daniel, Simeon Reich, and Alvaro De Pierro. "Re-examination of Bregman functions and new properties of their divergences". Optimization 68, no. 1 (November 19, 2018): 279–348. http://dx.doi.org/10.1080/02331934.2018.1543295.

40

Teng, Yueyang, Nimu Yuan, Yaonan Zhang, Shouliang Qi, and Yan Kang. "Family of Iterative Reconstruction Algorithms for Medical Imaging with Bregman-Divergences". Journal of Medical Imaging and Health Informatics 5, no. 8 (December 1, 2015): 1708–14. http://dx.doi.org/10.1166/jmihi.2015.1633.

41

Liang, Zhizheng, Lei Zhang, Jin Liu, and Yong Zhou. "Adaptively weighted learning for twin support vector machines via Bregman divergences". Neural Computing and Applications 32, no. 8 (November 2, 2018): 3323–36. http://dx.doi.org/10.1007/s00521-018-3843-0.

42

Paul, Grégory, Janick Cardinale, and Ivo F. Sbalzarini. "Coupling Image Restoration and Segmentation: A Generalized Linear Model/Bregman Perspective". International Journal of Computer Vision 104, no. 1 (March 8, 2013): 69–93. http://dx.doi.org/10.1007/s11263-013-0615-2.

Abstract:
We introduce a new class of data-fitting energies that couple image segmentation with image restoration. These functionals model the image intensity using the statistical framework of generalized linear models. By duality, we establish an information-theoretic interpretation using Bregman divergences. We demonstrate how this formulation couples in a principled way image restoration tasks such as denoising, deblurring (deconvolution), and inpainting with segmentation. We present an alternating minimization algorithm to solve the resulting composite photometric/geometric inverse problem. We use Fisher scoring to solve the photometric problem and to provide asymptotic uncertainty estimates. We derive the shape gradient of our data-fitting energy and investigate convex relaxation for the geometric problem. We introduce a new alternating split-Bregman strategy to solve the resulting convex problem and present experiments and comparisons on both synthetic and real-world images.
43

Mavridis, Christos N., and John S. Baras. "Convergence of Stochastic Vector Quantization and Learning Vector Quantization with Bregman Divergences". IFAC-PapersOnLine 53, no. 2 (2020): 2214–19. http://dx.doi.org/10.1016/j.ifacol.2020.12.006.

44

Pronzato, Luc, Henry P. Wynn, and Anatoly Zhigljavsky. "Bregman divergences based on optimal design criteria and simplicial measures of dispersion". Statistical Papers 60, no. 2 (January 16, 2019): 545–64. http://dx.doi.org/10.1007/s00362-018-01082-8.

45

Sasaki, Hiroaki, Yung-Kyun Noh, Gang Niu, and Masashi Sugiyama. "Direct Density Derivative Estimation". Neural Computation 28, no. 6 (June 2016): 1101–40. http://dx.doi.org/10.1162/neco_a_00835.

Abstract:
Estimating the derivatives of probability density functions is an essential step in statistical data analysis. A naive approach to estimate the derivatives is to first perform density estimation and then compute its derivatives. However, this approach can be unreliable because a good density estimator does not necessarily mean a good density derivative estimator. To cope with this problem, in this letter, we propose a novel method that directly estimates density derivatives without going through density estimation. The proposed method provides computationally efficient estimation for the derivatives of any order on multidimensional data with a hyperparameter tuning method and achieves the optimal parametric convergence rate. We further discuss an extension of the proposed method by applying regularized multitask learning and a general framework for density derivative estimation based on Bregman divergences. Applications of the proposed method to nonparametric Kullback-Leibler divergence approximation and bandwidth matrix selection in kernel density estimation are also explored.
46

Adamčík, Martin. "The Information Geometry of Bregman Divergences and Some Applications in Multi-Expert Reasoning". Entropy 16, no. 12 (December 1, 2014): 6338–81. http://dx.doi.org/10.3390/e16126338.

47

Hughes, Gareth, and Cairistiona Topp. "Probabilistic Forecasts: Scoring Rules and Their Decomposition and Diagrammatic Representation via Bregman Divergences". Entropy 17, no. 12 (July 31, 2015): 5450–71. http://dx.doi.org/10.3390/e17085450.

48

Andrieux, Stéphane. "Bregman divergences for physically informed discrepancy measures for learning and computation in thermomechanics". Comptes Rendus. Mécanique 351, G1 (February 1, 2023): 59–81. http://dx.doi.org/10.5802/crmeca.164.

49

Wada, Tatsuaki. "Erratum: “Generalized log-likelihood functions and Bregman divergences” [J. Math. Phys. 50, 113301 (2009)]". Journal of Mathematical Physics 51, no. 4 (April 2010): 049901. http://dx.doi.org/10.1063/1.3376658.

50

Ferreira, Daniela Portes Leal, Eraldo Ribeiro, and Celia A. Zorzo Barcelos. "A variational approach to non-rigid image registration with Bregman divergences and multiple features". Pattern Recognition 77 (May 2018): 237–47. http://dx.doi.org/10.1016/j.patcog.2017.12.015.


Go to the bibliography