Journal articles on the topic 'Kullback-Leibler divergence'



Consult the top 50 journal articles for your research on the topic 'Kullback-Leibler divergence.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Nielsen, Frank. "Statistical Divergences between Densities of Truncated Exponential Families with Nested Supports: Duo Bregman and Duo Jensen Divergences." Entropy 24, no. 3 (March 17, 2022): 421. http://dx.doi.org/10.3390/e24030421.

Abstract:
By calculating the Kullback–Leibler divergence between two probability measures belonging to different exponential families dominated by the same measure, we obtain a formula that generalizes the ordinary Fenchel–Young divergence. Inspired by this formula, we define the duo Fenchel–Young divergence and report a majorization condition on its pair of strictly convex generators, which guarantees that this divergence is always non-negative. The duo Fenchel–Young divergence is also equivalent to a duo Bregman divergence. We show how to use these duo divergences by calculating the Kullback–Leibler divergence between densities of truncated exponential families with nested supports, and report a formula for the Kullback–Leibler divergence between truncated normal distributions. Finally, we prove that the skewed Bhattacharyya distances between truncated exponential families amount to equivalent skewed duo Jensen divergences.
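
For context (a standard identity not restated in the abstract, which the duo construction generalizes): when both densities belong to the same exponential family with cumulant function F and natural parameters θ1 and θ2, the Kullback–Leibler divergence reduces to a Bregman divergence on the swapped natural parameters,

$$ \mathrm{KL}(p_{\theta_1} : p_{\theta_2}) \;=\; B_F(\theta_2 : \theta_1) \;=\; F(\theta_2) - F(\theta_1) - \langle \theta_2 - \theta_1,\, \nabla F(\theta_1) \rangle. $$

Allowing the two densities to come from different families with nested supports is what turns this Bregman form into the duo Fenchel–Young/Bregman divergence described above.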
2

Nielsen, Frank. "Generalizing the Alpha-Divergences and the Oriented Kullback–Leibler Divergences with Quasi-Arithmetic Means." Algorithms 15, no. 11 (November 17, 2022): 435. http://dx.doi.org/10.3390/a15110435.

Abstract:
The family of α-divergences including the oriented forward and reverse Kullback–Leibler divergences is often used in signal processing, pattern recognition, and machine learning, among others. Choosing a suitable α-divergence can either be done beforehand according to some prior knowledge of the application domains or directly learned from data sets. In this work, we generalize the α-divergences using a pair of strictly comparable weighted means. Our generalization allows us to obtain in the limit case α→1 the 1-divergence, which provides a generalization of the forward Kullback–Leibler divergence, and in the limit case α→0, the 0-divergence, which corresponds to a generalization of the reverse Kullback–Leibler divergence. We then analyze the condition for a pair of weighted quasi-arithmetic means to be strictly comparable and describe the family of quasi-arithmetic α-divergences including its subfamily of power homogeneous α-divergences. In particular, we study the generalized quasi-arithmetic 1-divergences and 0-divergences and show that these counterpart generalizations of the oriented Kullback–Leibler divergences can be rewritten as equivalent conformal Bregman divergences using strictly monotone embeddings. Finally, we discuss the applications of these novel divergences to k-means clustering by studying the robustness property of the centroids.
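
As background for the limits mentioned in the abstract (using one common parameterization of the α-divergence, which may differ from the paper's normalization):

$$ D_\alpha(p:q) \;=\; \frac{1}{\alpha(1-\alpha)} \Big( 1 - \int p(x)^{\alpha} q(x)^{1-\alpha} \, d\mu(x) \Big), \qquad \lim_{\alpha \to 1} D_\alpha(p:q) = \mathrm{KL}(p:q), \qquad \lim_{\alpha \to 0} D_\alpha(p:q) = \mathrm{KL}(q:p). $$

The generalization described above replaces the weighted arithmetic and geometric means implicit in this construction by a pair of strictly comparable quasi-arithmetic means.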
3

van Erven, Tim, and Peter Harremoës. "Rényi Divergence and Kullback-Leibler Divergence." IEEE Transactions on Information Theory 60, no. 7 (July 2014): 3797–820. http://dx.doi.org/10.1109/tit.2014.2320500.

4

Nielsen, Frank. "On Voronoi Diagrams on the Information-Geometric Cauchy Manifolds." Entropy 22, no. 7 (June 28, 2020): 713. http://dx.doi.org/10.3390/e22070713.

Abstract:
We study the Voronoi diagrams of a finite set of Cauchy distributions and their dual complexes from the viewpoint of information geometry by considering the Fisher-Rao distance, the Kullback-Leibler divergence, the chi square divergence, and a flat divergence derived from Tsallis entropy related to the conformal flattening of the Fisher-Rao geometry. We prove that the Voronoi diagrams of the Fisher-Rao distance, the chi square divergence, and the Kullback-Leibler divergences all coincide with a hyperbolic Voronoi diagram on the corresponding Cauchy location-scale parameters, and that the dual Cauchy hyperbolic Delaunay complexes are Fisher orthogonal to the Cauchy hyperbolic Voronoi diagrams. The dual Voronoi diagrams with respect to the dual flat divergences amount to dual Bregman Voronoi diagrams, and their dual complexes are regular triangulations. The primal Bregman Voronoi diagram is the Euclidean Voronoi diagram and the dual Bregman Voronoi diagram coincides with the Cauchy hyperbolic Voronoi diagram. In addition, we prove that the square root of the Kullback-Leibler divergence between Cauchy distributions yields a metric distance which is Hilbertian for the Cauchy scale families.
5

Nielsen, Frank. "On the Jensen–Shannon Symmetrization of Distances Relying on Abstract Means." Entropy 21, no. 5 (May 11, 2019): 485. http://dx.doi.org/10.3390/e21050485.

Abstract:
The Jensen–Shannon divergence is a renowned bounded symmetrization of the unbounded Kullback–Leibler divergence which measures the total Kullback–Leibler divergence to the average mixture distribution. However, the Jensen–Shannon divergence between Gaussian distributions is not available in closed form. To bypass this problem, we present a generalization of the Jensen–Shannon (JS) divergence using abstract means which yields closed-form expressions when the mean is chosen according to the parametric family of distributions. More generally, we define the JS-symmetrizations of any distance using parameter mixtures derived from abstract means. In particular, we first show that the geometric mean is well-suited for exponential families, and report two closed-form formulas for (i) the geometric Jensen–Shannon divergence between probability densities of the same exponential family; and (ii) the geometric JS-symmetrization of the reverse Kullback–Leibler divergence between probability densities of the same exponential family. As a second illustrating example, we show that the harmonic mean is well-suited for the scale Cauchy distributions, and report a closed-form formula for the harmonic Jensen–Shannon divergence between scale Cauchy distributions. Applications to clustering with respect to these novel Jensen–Shannon divergences are touched upon.
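
For reference, the divergence being generalized here is the standard Jensen–Shannon divergence, which symmetrizes the Kullback–Leibler divergence through the arithmetic mixture m = (p+q)/2:

$$ \mathrm{JS}(p,q) \;=\; \tfrac{1}{2} \, \mathrm{KL}\!\left(p \,\Big\|\, \tfrac{p+q}{2}\right) + \tfrac{1}{2} \, \mathrm{KL}\!\left(q \,\Big\|\, \tfrac{p+q}{2}\right) \;\le\; \log 2. $$

The construction in the abstract replaces this arithmetic mean by other abstract means (for example geometric or harmonic), chosen to match the parametric family at hand so that the symmetrization admits a closed form.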
6

Ba, Amadou Diadie, and Gane Samb Lo. "Divergence Measures Estimation and its Asymptotic Normality Theory in the Discrete Case." European Journal of Pure and Applied Mathematics 12, no. 3 (July 25, 2019): 790–820. http://dx.doi.org/10.29020/nybg.ejpam.v12i3.3437.

Abstract:
In this paper we provide the asymptotic theory of the general class of φ-divergence measures, which includes the most common divergence measures: the Rényi and Tsallis families and the Kullback-Leibler measure. We are interested in divergence measures in the discrete case. One-sided and two-sided statistical tests are derived, as well as symmetrized estimators. Almost-sure rates of convergence and an asymptotic normality theorem are obtained in the general case, and then particularized for the Rényi and Tsallis families and for the Kullback-Leibler measure. Our theoretical results are validated by simulations.
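
As a concrete illustration of the discrete-case plug-in estimation discussed in this abstract, here is a minimal sketch (illustrative only, with hypothetical data and simple additive smoothing; it does not reproduce the paper's tests or asymptotic analysis):

```python
import numpy as np

def empirical_pmf(sample, alphabet):
    """Relative frequencies of `sample` over a fixed finite alphabet."""
    counts = np.array([np.sum(sample == a) for a in alphabet], dtype=float)
    return counts / counts.sum()

def kl_divergence(p, q, eps=1e-12):
    """Plug-in Kullback-Leibler divergence: sum_i p_i log(p_i / q_i)."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    return float(np.sum(p * np.log(p / q)))

def renyi_divergence(p, q, alpha, eps=1e-12):
    """Plug-in Renyi divergence of order alpha (alpha != 1)."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    return float(np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0))

def tsallis_divergence(p, q, alpha, eps=1e-12):
    """Plug-in Tsallis divergence of order alpha (alpha != 1)."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    return float((np.sum(p**alpha * q**(1.0 - alpha)) - 1.0) / (alpha - 1.0))

# Hypothetical example: two i.i.d. samples over the alphabet {0, 1, 2}.
rng = np.random.default_rng(0)
alphabet = np.array([0, 1, 2])
x = rng.choice(alphabet, size=5000, p=[0.5, 0.3, 0.2])
y = rng.choice(alphabet, size=5000, p=[0.4, 0.4, 0.2])
p_hat, q_hat = empirical_pmf(x, alphabet), empirical_pmf(y, alphabet)
print(kl_divergence(p_hat, q_hat), renyi_divergence(p_hat, q_hat, 2.0), tsallis_divergence(p_hat, q_hat, 2.0))
```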
7

Yanagimoto, Hidekazu, and Sigeru Omatu. "Information Filtering Using Kullback-Leibler Divergence." IEEJ Transactions on Electronics, Information and Systems 125, no. 7 (2005): 1147–52. http://dx.doi.org/10.1541/ieejeiss.125.1147.

8

Sunoj, S. M., P. G. Sankaran, and N. Unnikrishnan Nair. "Quantile-based cumulative Kullback–Leibler divergence." Statistics 52, no. 1 (May 22, 2017): 1–17. http://dx.doi.org/10.1080/02331888.2017.1327534.

9

Ponti, Moacir, Josef Kittler, Mateus Riva, Teófilo de Campos, and Cemre Zor. "A decision cognizant Kullback–Leibler divergence." Pattern Recognition 61 (January 2017): 470–78. http://dx.doi.org/10.1016/j.patcog.2016.08.018.

10

Sankaran, P. G., S. M. Sunoj, and N. Unnikrishnan Nair. "Kullback–Leibler divergence: A quantile approach." Statistics & Probability Letters 111 (April 2016): 72–79. http://dx.doi.org/10.1016/j.spl.2016.01.007.

11

Dragalin, Vladimir, Valerii Fedorov, Scott Patterson, and Byron Jones. "Kullback-Leibler divergence for evaluating bioequivalence." Statistics in Medicine 22, no. 6 (2003): 913–30. http://dx.doi.org/10.1002/sim.1451.

12

Amari, S., and A. Cichocki. "Information geometry of divergence functions." Bulletin of the Polish Academy of Sciences: Technical Sciences 58, no. 1 (March 1, 2010): 183–95. http://dx.doi.org/10.2478/v10175-010-0019-1.

Abstract:
Measures of divergence between two points play a key role in many engineering problems. One such measure is a distance function, but there are many important measures which do not satisfy the properties of the distance. The Bregman divergence, Kullback-Leibler divergence and f-divergence are such measures. In the present article, we study the differential-geometrical structure of a manifold induced by a divergence function. It consists of a Riemannian metric, and a pair of dually coupled affine connections, which are studied in information geometry. The class of Bregman divergences is characterized by a dually flat structure, which originates from the Legendre duality. A dually flat space admits a generalized Pythagorean theorem. The class of f-divergences, defined on a manifold of probability distributions, is characterized by information monotonicity, and the Kullback-Leibler divergence belongs to the intersection of both classes. The f-divergence always gives the α-geometry, which consists of the Fisher information metric and a dual pair of ±α-connections. The α-divergence is a special class of f-divergences. This is unique, sitting at the intersection of the f-divergence and Bregman divergence classes in a manifold of positive measures. The geometry derived from the Tsallis q-entropy and related divergences is also addressed.
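
For the reader's convenience, the two divergence classes discussed above can be written, in the discrete case, as

$$ B_F(p:q) \;=\; F(p) - F(q) - \langle p - q,\, \nabla F(q) \rangle, \qquad D_f(p \,\|\, q) \;=\; \sum_i q_i \, f\!\left(\frac{p_i}{q_i}\right), $$

and choosing the negative Shannon entropy generator F(p) = \sum_i p_i \log p_i in the first, or f(t) = t \log t in the second, both recover the Kullback-Leibler divergence \sum_i p_i \log(p_i/q_i), which is the sense in which it lies at the intersection of the two classes.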
13

Kocuk, Burak. "Conic reformulations for Kullback-Leibler divergence constrained distributionally robust optimization and applications." An International Journal of Optimization and Control: Theories & Applications (IJOCTA) 11, no. 2 (April 19, 2021): 139–51. http://dx.doi.org/10.11121/ijocta.01.2021.001001.

Abstract:
In this paper, we consider a Kullback-Leibler divergence constrained distributionally robust optimization model. This model considers an ambiguity set that consists of all distributions whose Kullback-Leibler divergence to an empirical distribution is bounded. Utilizing the fact that this divergence measure has an exponential cone representation, we obtain the robust counterpart of the Kullback-Leibler divergence constrained distributionally robust optimization problem as a dual exponential cone constrained program under mild assumptions on the underlying optimization problem. The resulting conic reformulation of the original optimization problem can be directly solved by a commercial conic programming solver. We specialize our generic formulation to two classical optimization problems, namely, the Newsvendor Problem and the Uncapacitated Facility Location Problem. Our computational study in an out-of-sample analysis shows that the solutions obtained via the distributionally robust optimization approach yield significantly better performance in terms of the dispersion of the cost realizations while the central tendency deteriorates only slightly compared to the solutions obtained by stochastic programming.
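
As a small illustration of the Kullback-Leibler ambiguity set described above, the sketch below solves the inner worst-case-expectation problem over a KL ball with the open-source modeling layer cvxpy; this is not the paper's conic reformulation or its commercial-solver setup, and all numbers are hypothetical:

```python
import cvxpy as cp
import numpy as np

# Hypothetical data: cost of a fixed decision under each of four scenarios,
# and an empirical (nominal) scenario distribution q_hat.
costs = np.array([3.0, 7.0, 1.0, 9.0])
q_hat = np.array([0.4, 0.3, 0.2, 0.1])
rho = 0.05  # radius of the Kullback-Leibler ambiguity ball

# Worst-case expected cost over all distributions p with KL(p || q_hat) <= rho.
p = cp.Variable(len(costs), nonneg=True)
constraints = [
    cp.sum(p) == 1,
    # sum of elementwise p*log(p/q) - p + q equals KL(p || q_hat) when both sum to one
    cp.sum(cp.kl_div(p, q_hat)) <= rho,
]
problem = cp.Problem(cp.Maximize(costs @ p), constraints)
problem.solve()  # needs a solver with exponential-cone support (e.g., Clarabel, SCS, ECOS)
print("worst-case expected cost:", problem.value)
print("worst-case distribution:", p.value)
```

The exponential-cone representability of the KL constraint used here is exactly the structure the paper exploits when dualizing the full distributionally robust problem.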
14

Jain, K., and Ram Saraswat. "A New Information Inequality and Its Application in Establishing Relation Among Various f-Divergence Measures." Journal of Applied Mathematics, Statistics and Informatics 8, no. 1 (May 1, 2012): 17–32. http://dx.doi.org/10.2478/v10294-012-0002-6.

Abstract:
An information inequality is established in terms of Csiszár f-divergence measures using convexity arguments and the Jensen inequality. This inequality is applied to compare particular divergences which play a fundamental role in information theory, such as the Kullback-Leibler distance, Hellinger discrimination, chi-square distance, J-divergences and others.
15

Sfetcu, Răzvan-Cornel, Sorina-Cezarina Sfetcu, and Vasile Preda. "Some Properties of Weighted Tsallis and Kaniadakis Divergences." Entropy 24, no. 11 (November 5, 2022): 1616. http://dx.doi.org/10.3390/e24111616.

Abstract:
We are concerned with the weighted Tsallis and Kaniadakis divergences between two measures. More precisely, we find inequalities between these divergences and the Tsallis and Kaniadakis logarithms, prove that they satisfy bounds similar to those that bound the Kullback–Leibler divergence, and show that they are pseudo-additive.
16

Bulinski, Alexander, and Denis Dimitrov. "Statistical Estimation of the Kullback–Leibler Divergence." Mathematics 9, no. 5 (March 4, 2021): 544. http://dx.doi.org/10.3390/math9050544.

Abstract:
Asymptotic unbiasedness and L2-consistency are established, under mild conditions, for the estimates of the Kullback–Leibler divergence between two probability measures in Rd that are absolutely continuous with respect to (w.r.t.) the Lebesgue measure. These estimates are based on certain k-nearest neighbor statistics for a pair of independent identically distributed (i.i.d.) vector samples. The novelty of the results is also in treating mixture models. In particular, they cover mixtures of nondegenerate Gaussian measures. The mentioned asymptotic properties of related estimators for the Shannon entropy and cross-entropy are strengthened. Some applications are indicated.
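
To make the k-nearest-neighbor construction concrete, here is a compact sketch of the classical Wang–Kulkarni–Verdú form of such an estimator (the paper's exact statistics, bias corrections and conditions differ; the Gaussian example is purely illustrative):

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_kl_estimate(x, y, k=5):
    """k-NN estimate of KL(P || Q) from samples x ~ P (n x d) and y ~ Q (m x d).

    Uses D_hat = (d/n) * sum_i log(nu_k(i) / rho_k(i)) + log(m / (n - 1)), where
    rho_k(i) is the distance from x_i to its k-th nearest neighbor within x
    (excluding x_i itself) and nu_k(i) is its k-th nearest-neighbor distance in y.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n, d = x.shape
    m = y.shape[0]
    rho = cKDTree(x).query(x, k=k + 1)[0][:, k]   # skip the zero self-distance
    nu = cKDTree(y).query(x, k=k)[0][:, k - 1]
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))

# Illustrative check with two univariate Gaussians: KL(N(0,1) || N(1,1)) = 0.5.
rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, size=(4000, 1))
y = rng.normal(1.0, 1.0, size=(4000, 1))
print(knn_kl_estimate(x, y, k=5))
```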
17

Boche, Holger, and Sławomir Stańczak. "The Kullback–Leibler Divergence and Nonnegative Matrices." IEEE Transactions on Information Theory 52, no. 12 (December 2006): 5539–45. http://dx.doi.org/10.1109/tit.2006.885488.

18

COOPER, Vanessa N., Hisham M. HADDAD, and Hossain SHAHRIAR. "Android Malware Detection Using Kullback-Leibler Divergence." ADCAIJ: ADVANCES IN DISTRIBUTED COMPUTING AND ARTIFICIAL INTELLIGENCE JOURNAL 3, no. 2 (December 17, 2014): 17. http://dx.doi.org/10.14201/adcaij2014321725.

19

COOPER, Vanessa N., Hisham M. HADDAD, and Hossain SHAHRIAR. "Android Malware Detection Using Kullback-Leibler Divergence." ADCAIJ: Advances in Distributed Computing and Artificial Intelligence Journal 3, no. 9 (December 17, 2014): 17. http://dx.doi.org/10.14201/adcaij2014391725.

20

Lin, Chungwei, Tim K. Marks, Milutin Pajovic, Shinji Watanabe, and Chih-kuan Tung. "Model parameter learning using Kullback–Leibler divergence." Physica A: Statistical Mechanics and its Applications 491 (February 2018): 549–59. http://dx.doi.org/10.1016/j.physa.2017.09.018.

21

Delgado-Gutiérrez, G., F. Rodríguez-Santos, O. Jiménez-Ramírez, and R. Vázquez-Medina. "Acoustic environment identification by Kullback–Leibler divergence." Forensic Science International 281 (December 2017): 134–40. http://dx.doi.org/10.1016/j.forsciint.2017.10.031.

22

Kárný, Miroslav, and Josef Andrýsek. "Use of Kullback-Leibler divergence for forgetting." International Journal of Adaptive Control and Signal Processing 23, no. 10 (October 2009): 961–75. http://dx.doi.org/10.1002/acs.1080.

23

Simić, Slavko. "On the symmetrized S-divergence." ITM Web of Conferences 29 (2019): 01004. http://dx.doi.org/10.1051/itmconf/20192901004.

Abstract:
In this paper we work with the relative divergence of type s, s ∈ ℝ, which includes the Kullback-Leibler divergence and the Hellinger and χ2 distances as particular cases. We give here a study of the symmetrized divergences in additive and multiplicative forms. Some basic properties such as symmetry, monotonicity and log-convexity are established. An important result from the Convexity Theory is also proved.
24

Simić, Slavko, Sara Salem Alzaid, and Hassen Aydi. "On the symmetrized s-divergence." Open Mathematics 18, no. 1 (May 26, 2020): 378–85. http://dx.doi.org/10.1515/math-2020-0027.

Abstract:
In this study, we work with the relative divergence of type s, s ∈ ℝ, which includes the Kullback-Leibler divergence and the Hellinger and χ2 distances as particular cases. We study the symmetrized divergences in additive and multiplicative forms. Some basic properties such as symmetry, monotonicity and log-convexity are established. An important result from convexity theory is also proved.
25

Giski, Zahra Eslami. "Rényi Entropy and Rényi Divergence in Sequential Effect Algebra." Open Systems & Information Dynamics 27, no. 02 (June 2020): 2050008. http://dx.doi.org/10.1142/s1230161220500080.

Abstract:
The aim of this study is to extend the results concerning the Shannon entropy and Kullback–Leibler divergence in sequential effect algebra to the case of Rényi entropy and Rényi divergence. For this purpose, the Rényi entropy of finite partitions in sequential effect algebra and its conditional version are proposed and the basic properties of these entropy measures are derived. In addition, the notion of Rényi divergence of a partition in sequential effect algebra is introduced and the basic properties of this quantity are studied. In particular, it is proved that the Kullback–Leibler divergence and Shannon’s entropy of partitions in a given sequential effect algebra can be obtained as limits of their Rényi divergence and Rényi entropy respectively. Finally, to illustrate the results, some numerical examples are presented.
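
For orientation, the discrete-case Rényi divergence whose limiting behaviour is extended here is

$$ D_\alpha(p \,\|\, q) \;=\; \frac{1}{\alpha - 1} \, \log \sum_i p_i^{\alpha} q_i^{1-\alpha}, \qquad \lim_{\alpha \to 1} D_\alpha(p \,\|\, q) \;=\; \mathrm{KL}(p \,\|\, q) \;=\; \sum_i p_i \log \frac{p_i}{q_i}, $$

and the abstract establishes the analogous limit statement for partitions in a sequential effect algebra.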
26

Lu, Wanbo, and Wenhui Shi. "Model Averaging Estimation Method by Kullback–Leibler Divergence for Multiplicative Error Model." Complexity 2022 (April 27, 2022): 1–13. http://dx.doi.org/10.1155/2022/7706992.

Abstract:
In this paper, we propose the model averaging estimation method for multiplicative error model and construct the corresponding weight choosing criterion based on the Kullback–Leibler divergence with a hyperparameter to avoid the problem of overfitting. The resulting model average estimator is proved to be asymptotically optimal. It is shown that the Kullback–Leibler model averaging (KLMA) estimator asymptotically minimizes the in-sample Kullback–Leibler divergence and improves the forecast accuracy of out-of-sample even under different loss functions. In simulations, we show that the KLMA estimator compares favorably with smooth-AIC estimator (SAIC), smooth-BIC estimator (SBIC), and Mallows model averaging estimator (MMA), especially when some nonlinear noise is added to the data generation process. The empirical applications in the daily range of S&P500 and price duration of IBM show that the out-of-sample forecasting capacity of the KLMA estimator is better than that of other methods.
27

Wang, Chen-Pin, and Malay Ghosh. "A Kullback-Leibler Divergence for Bayesian Model Diagnostics." Open Journal of Statistics 01, no. 03 (2011): 172–84. http://dx.doi.org/10.4236/ojs.2011.13021.

28

Moral, Serafín, Andrés Cano, and Manuel Gómez-Olmedo. "Computation of Kullback–Leibler Divergence in Bayesian Networks." Entropy 23, no. 9 (August 28, 2021): 1122. http://dx.doi.org/10.3390/e23091122.

Abstract:
Kullback–Leibler divergence KL(p,q) is the standard measure of error when we have a true probability distribution p which is approximated by a probability distribution q. Its efficient computation is essential in many tasks, as in approximate computation or as a measure of error when learning a probability. In high-dimensional probability distributions, such as the ones associated with Bayesian networks, a direct computation can be unfeasible. This paper considers the case of efficiently computing the Kullback–Leibler divergence of two probability distributions, each one of them coming from a different Bayesian network, which might have different structures. The paper is based on an auxiliary deletion algorithm to compute the necessary marginal distributions, but uses a cache of operations with potentials in order to reuse past computations whenever they are necessary. The algorithms are tested with Bayesian networks from the bnlearn repository. Computer code in Python is provided taking as basis pgmpy, a library for working with probabilistic graphical models.
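
To see why efficient computation matters here, the sketch below is a naive brute-force baseline (not the paper's cached deletion algorithm, and not using pgmpy): it computes KL(p||q) between two small hypothetical Bayesian networks over binary variables by enumerating every joint assignment, a cost that grows as 2^n in the number of variables:

```python
import itertools
import numpy as np

def joint_prob(assignment, cpts, parents):
    """Product of P(x_v | x_parents(v)) over all variables for one full assignment."""
    prob = 1.0
    for v, cpt in cpts.items():
        key = tuple(assignment[u] for u in parents[v])
        prob *= cpt[key][assignment[v]]
    return prob

def kl_between_networks(variables, cpts_p, parents_p, cpts_q, parents_q):
    """Brute-force KL(p || q) over all 2**len(variables) binary assignments."""
    kl = 0.0
    for values in itertools.product([0, 1], repeat=len(variables)):
        a = dict(zip(variables, values))
        p = joint_prob(a, cpts_p, parents_p)
        q = joint_prob(a, cpts_q, parents_q)
        if p > 0.0:
            kl += p * np.log(p / q)
    return kl

# Tiny hypothetical example: network p has the edge A -> B, network q treats A and B as independent.
variables = ["A", "B"]
parents_p = {"A": (), "B": ("A",)}
cpts_p = {"A": {(): [0.6, 0.4]}, "B": {(0,): [0.9, 0.1], (1,): [0.2, 0.8]}}
parents_q = {"A": (), "B": ()}
cpts_q = {"A": {(): [0.6, 0.4]}, "B": {(): [0.62, 0.38]}}
print(kl_between_networks(variables, cpts_p, parents_p, cpts_q, parents_q))
```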
29

Rached, Z., F. Alajaji, and L. L. Campbell. "The Kullback–Leibler Divergence Rate Between Markov Sources." IEEE Transactions on Information Theory 50, no. 5 (May 2004): 917–21. http://dx.doi.org/10.1109/tit.2004.826687.

30

Park, Chanseok, and Ayanendranath Basu. "The generalized kullback-leibler divergence and robust inference." Journal of Statistical Computation and Simulation 73, no. 5 (January 2003): 311–32. http://dx.doi.org/10.1080/0094965021000033477.

31

Keyes, Tim, and Martin Levy. "On calibration of kullback-leibler divergence via prediction." Communications in Statistics - Theory and Methods 28, no. 1 (1999): 67–85. http://dx.doi.org/10.1080/03610929908832283.

32

Wang, Yongchang, Kai Liu, Qi Hao, Xianwang Wang, D. L. Lau, and L. G. Hassebrook. "Robust Active Stereo Vision Using Kullback-Leibler Divergence." IEEE Transactions on Pattern Analysis and Machine Intelligence 34, no. 3 (March 2012): 548–63. http://dx.doi.org/10.1109/tpami.2011.162.

33

Qian, Dongyun, and Huifeng Jin. "Distance Metric with Kullback–Leibler Divergence for Classification." International Journal of Signal Processing, Image Processing and Pattern Recognition 10, no. 7 (July 31, 2017): 157–66. http://dx.doi.org/10.14257/ijsip.2017.10.7.14.

34

Choi, Changryoul, and Dongweon Yoon. "Blind Interleaver Parameter Estimation Using Kullback-Leibler Divergence." Journal of Korean Institute of Information Technology 15, no. 12 (December 21, 2017): 109–15. http://dx.doi.org/10.14801/jkiit.2017.15.12.109.

35

Chen, Guanjie, Ao Yuan, Tao Cai, Chuan‐Ming Li, Amy R. Bentley, Jie Zhou, Daniel N. Shriner, Adebowale A. Adeyemo, and Charles N. Rotimi. "Measuring gene–gene interaction using Kullback–Leibler divergence." Annals of Human Genetics 83, no. 6 (June 17, 2019): 405–17. http://dx.doi.org/10.1111/ahg.12324.

36

Yohai, Victor J. "Optimal robust estimates using the Kullback–Leibler divergence." Statistics & Probability Letters 78, no. 13 (September 2008): 1811–16. http://dx.doi.org/10.1016/j.spl.2008.01.042.

37

Zeng, Jiusun, Uwe Kruger, Jaap Geluk, Xun Wang, and Lei Xie. "Detecting abnormal situations using the Kullback–Leibler divergence." Automatica 50, no. 11 (November 2014): 2777–86. http://dx.doi.org/10.1016/j.automatica.2014.09.005.

38

Zhou, XiaoJian, Dennis K. J. Lin, Xuelong Hu, and Ting Jiang. "Robust parameter design based on Kullback-Leibler divergence." Computers & Industrial Engineering 135 (September 2019): 913–21. http://dx.doi.org/10.1016/j.cie.2019.06.053.

39

Belov, Dmitry I. "Detection of Test Collusion via Kullback-Leibler Divergence." Journal of Educational Measurement 50, no. 2 (June 2013): 141–63. http://dx.doi.org/10.1111/jedm.12008.

40

Misztal, Krzysztof, and Jacek Tabor. "Ellipticity and Circularity Measuring via Kullback–Leibler Divergence." Journal of Mathematical Imaging and Vision 55, no. 1 (December 7, 2015): 136–50. http://dx.doi.org/10.1007/s10851-015-0618-4.

41

Shen, Pengcheng, Chunguang Li, and Yiliang Luo. "Distributed Vector Quantization Based on Kullback-Leibler Divergence." Entropy 17, no. 12 (November 30, 2015): 7875–87. http://dx.doi.org/10.3390/e17127851.

42

Tian, Shaohua, Xuefeng Chen, Zhibo Yang, Zhengjia He, and Xingwu Zhang. "Damage detection using the improved Kullback-Leibler divergence." Structural Engineering and Mechanics 48, no. 3 (November 10, 2013): 291–308. http://dx.doi.org/10.12989/sem.2013.48.3.291.

43

Pinelis, Iosif. "An involution inequality for the Kullback-Leibler divergence." Mathematical Inequalities & Applications, no. 1 (2017): 233–35. http://dx.doi.org/10.7153/mia-20-17.

44

Bouhlel, Nizar, and Ali Dziri. "Kullback–Leibler Divergence Between Multivariate Generalized Gaussian Distributions." IEEE Signal Processing Letters 26, no. 7 (July 2019): 1021–25. http://dx.doi.org/10.1109/lsp.2019.2915000.

45

Belov, Dmitry I., and Ronald D. Armstrong. "Distributions of the Kullback-Leibler divergence with applications." British Journal of Mathematical and Statistical Psychology 64, no. 2 (April 15, 2011): 291–309. http://dx.doi.org/10.1348/000711010x522227.

46

Smith, Aaron, Prasad A. Naik, and Chih-Ling Tsai. "Markov-switching model selection using Kullback–Leibler divergence." Journal of Econometrics 134, no. 2 (October 2006): 553–77. http://dx.doi.org/10.1016/j.jeconom.2005.07.005.

47

Lee, Young Kyung, and Byeong U. Park. "Estimation of Kullback–Leibler Divergence by Local Likelihood." Annals of the Institute of Statistical Mathematics 58, no. 2 (June 3, 2006): 327–40. http://dx.doi.org/10.1007/s10463-005-0014-8.

48

Chen, Badong, Jinchun Hu, Yu Zhu, and Zengqi Sun. "Parameter identifiability with Kullback-Leibler information divergence criterion." International Journal of Adaptive Control and Signal Processing 23, no. 10 (October 2009): 940–60. http://dx.doi.org/10.1002/acs.1078.

49

Sbert, Mateu, Min Chen, Jordi Poch, and Anton Bardera. "Some Order Preserving Inequalities for Cross Entropy and Kullback–Leibler Divergence." Entropy 20, no. 12 (December 12, 2018): 959. http://dx.doi.org/10.3390/e20120959.

Abstract:
Cross entropy and Kullback–Leibler (K-L) divergence are fundamental quantities of information theory, and they are widely used in many fields. Since cross entropy is the negated logarithm of likelihood, minimizing cross entropy is equivalent to maximizing likelihood, and thus, cross entropy is applied for optimization in machine learning. K-L divergence also stands independently as a commonly used metric for measuring the difference between two distributions. In this paper, we introduce new inequalities regarding cross entropy and K-L divergence by using the fact that cross entropy is the negated logarithm of the weighted geometric mean. We first apply the well-known rearrangement inequality, followed by a recent theorem on weighted Kolmogorov means, and, finally, we introduce a new theorem that directly applies to inequalities between K-L divergences. To illustrate our results, we show numerical examples of distributions.
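
For reference, the identities the abstract builds on are

$$ H(p, q) \;=\; -\sum_i p_i \log q_i \;=\; -\log \prod_i q_i^{\,p_i} \;=\; H(p) + \mathrm{KL}(p \,\|\, q), $$

i.e., cross entropy is the negated logarithm of the weighted geometric mean of the q_i with weights p_i, and it decomposes into the Shannon entropy of p plus the Kullback–Leibler divergence from p to q.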
50

Szega, Marcin. "Application of the entropy information for the optimization of an additional measurements location in thermal systems." Archives of Thermodynamics 32, no. 3 (December 1, 2011): 215–29. http://dx.doi.org/10.2478/v10173-011-0024-2.

Abstract:
For the optimal location of additional surplus measurements in the design of a redundant measurement system for thermal processes, from the data reconciliation point of view, information entropy has been applied. The relative entropy, i.e., the Kullback-Leibler divergence, has been used. As the criterion for the optimal location of additional surplus measurements in a system of measurement data, the minimum of the information entropy of the reconciled measurement data has been assumed. Hence, the objective function in the described optimization task is the maximum of the relative entropy (Kullback-Leibler divergence) between the sets of raw and reconciled measurement data. Simulation calculations applying the data reconciliation algorithm and the Monte Carlo method have been carried out concerning the influence of installing the additional surplus measurements on the decrease of the information entropy of measurements after data validation. The example calculations concern the cross high-pressure heat regeneration system with cascade flow of condensate installed in a 153 MW power unit equipped with a steam cooler. Calculations for all variants of configurations of the additional surplus measurements in the analyzed thermal system have been done. The usefulness of the proposed Kullback-Leibler divergence as an objective function has been demonstrated.