Academic literature on the topic 'Kullback-Leibler divergence'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Kullback-Leibler divergence.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Journal articles on the topic "Kullback-Leibler divergence"
Nielsen, Frank. "Statistical Divergences between Densities of Truncated Exponential Families with Nested Supports: Duo Bregman and Duo Jensen Divergences." Entropy 24, no. 3 (March 17, 2022): 421. http://dx.doi.org/10.3390/e24030421.
Nielsen, Frank. "Generalizing the Alpha-Divergences and the Oriented Kullback–Leibler Divergences with Quasi-Arithmetic Means." Algorithms 15, no. 11 (November 17, 2022): 435. http://dx.doi.org/10.3390/a15110435.
van Erven, Tim, and Peter Harremoës. "Rényi Divergence and Kullback-Leibler Divergence." IEEE Transactions on Information Theory 60, no. 7 (July 2014): 3797–820. http://dx.doi.org/10.1109/tit.2014.2320500.
Nielsen, Frank. "On Voronoi Diagrams on the Information-Geometric Cauchy Manifolds." Entropy 22, no. 7 (June 28, 2020): 713. http://dx.doi.org/10.3390/e22070713.
Nielsen, Frank. "On the Jensen–Shannon Symmetrization of Distances Relying on Abstract Means." Entropy 21, no. 5 (May 11, 2019): 485. http://dx.doi.org/10.3390/e21050485.
Ba, Amadou Diadie, and Gane Samb Lo. "Divergence Measures Estimation and its Asymptotic Normality Theory in the Discrete Case." European Journal of Pure and Applied Mathematics 12, no. 3 (July 25, 2019): 790–820. http://dx.doi.org/10.29020/nybg.ejpam.v12i3.3437.
Yanagimoto, Hidekazu, and Sigeru Omatu. "Information Filtering Using Kullback-Leibler Divergence." IEEJ Transactions on Electronics, Information and Systems 125, no. 7 (2005): 1147–52. http://dx.doi.org/10.1541/ieejeiss.125.1147.
Sunoj, S. M., P. G. Sankaran, and N. Unnikrishnan Nair. "Quantile-based cumulative Kullback–Leibler divergence." Statistics 52, no. 1 (May 22, 2017): 1–17. http://dx.doi.org/10.1080/02331888.2017.1327534.
Ponti, Moacir, Josef Kittler, Mateus Riva, Teófilo de Campos, and Cemre Zor. "A decision cognizant Kullback–Leibler divergence." Pattern Recognition 61 (January 2017): 470–78. http://dx.doi.org/10.1016/j.patcog.2016.08.018.
Sankaran, P. G., S. M. Sunoj, and N. Unnikrishnan Nair. "Kullback–Leibler divergence: A quantile approach." Statistics & Probability Letters 111 (April 2016): 72–79. http://dx.doi.org/10.1016/j.spl.2016.01.007.
Dissertations / Theses on the topic "Kullback-Leibler divergence"
Mesejo-Leon, Daniel Alejandro. "Approximate Nearest Neighbor Search for the Kullback-Leibler Divergence." Pontifícia Universidade Católica do Rio de Janeiro, 2018. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=33305@1.
In a number of applications, data points can be represented as probability distributions: documents can be represented as topic models, images as histograms, and even music as a probability distribution. In this work, we address the Approximate Nearest Neighbor problem in which the points are probability distributions and the distance function is the Kullback-Leibler (KL) divergence. We show how to accelerate existing data structures such as the Bregman Ball Tree, in theory, by posing the KL divergence as an inner product embedding. On the practical side, we investigate the use of two very popular indexing techniques: the Inverted Index and Locality Sensitive Hashing (LSH). Experiments performed on 6 real-world datasets showed that the Inverted Index performs better than LSH and the Bregman Ball Tree in terms of queries per second and precision.
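As a concrete illustration of the distance function this abstract is built around (a minimal sketch from the standard definition, not code from the thesis), the KL divergence between two discrete distributions can be computed directly:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """D(p || q) for discrete distributions, with 0 * log 0 taken as 0."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p = p / p.sum()
    q = q / q.sum()
    # Only bins where p > 0 contribute; eps guards against division by zero.
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / (q[mask] + eps))))

# KL is asymmetric: D(p || q) != D(q || p) in general, which is why
# nearest-neighbor search under it needs special care.
p = [0.5, 0.5]
q = [0.9, 0.1]
d_pq = kl_divergence(p, q)   # about 0.511 nats
d_qp = kl_divergence(q, p)   # about 0.368 nats
```

The asymmetry shown here is exactly what distinguishes KL-based nearest-neighbor search from search under a metric distance.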
Nounagnon, Jeannette Donan. "Using Kullback-Leibler Divergence to Analyze the Performance of Collaborative Positioning." Diss., Virginia Tech, 2016. http://hdl.handle.net/10919/86593.
Junior, Willian Darwin. "Agrupamento de textos utilizando divergência Kullback-Leibler." Universidade de São Paulo, 2016. http://www.teses.usp.br/teses/disponiveis/18/18153/tde-30032016-160011/.
This work proposes a methodology for grouping texts, for the purposes of textual searching in general but specifically to aid in distributing law processes so as to reduce the time spent resolving judicial conflicts. The methodology applies the Kullback-Leibler divergence to frequency distributions of the word stems occurring in the texts. Several groups of stems are built from their occurrence frequencies across the texts, and the resulting distributions are taken with respect to each of these groups. For each group, divergences are computed against the distribution obtained from a reference text assembled from all sample texts, yielding one value for each text in relation to each group of stems. Finally, these values are taken as attributes of each text in a clustering process driven by a K-means implementation, which provides a grouping of the texts. The methodology is tested on simple toy examples and applied to electrical-failure registers, texts with similar issues, and law texts, and the results are compared with an expert's classification. As byproducts of this research, a graphical development environment for models based on Pattern Recognition and Bayesian Networks, and a study on the possibilities of using parallel processing in Bayesian Network learning, were also produced.
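The pipeline the abstract describes (stem frequency distributions, divergence against a reference text pooled from all samples, one attribute per text) can be sketched as follows. The toy corpus, helper names, and single stem group are illustrative assumptions, not the thesis code, and the final K-means step is omitted:

```python
import numpy as np
from collections import Counter

def stem_distribution(tokens, vocab):
    """Relative frequency distribution of the vocabulary stems in one text."""
    counts = Counter(tokens)
    vec = np.array([counts[w] for w in vocab], dtype=float)
    return vec / vec.sum()

def kl(p, q, eps=1e-12):
    """D(p || q); only bins with p > 0 contribute."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / (q[mask] + eps))))

# Toy corpus: each "text" is a list of already-stemmed tokens (hypothetical).
texts = [["kl", "diverg", "cluster"],
         ["law", "process", "court", "law"],
         ["kl", "diverg", "entropi"]]
vocab = sorted({w for t in texts for w in t})

# Reference distribution from the assembly of all sample texts.
reference = stem_distribution([w for t in texts for w in t], vocab)

# One KL attribute per text; the thesis feeds such attributes to K-means.
features = [kl(stem_distribution(t, vocab), reference) for t in texts]
```

Texts whose stem usage departs most from the pooled reference receive the largest divergence values, which is what makes these numbers usable as clustering attributes.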
Harmouche, Jinane. "Statistical Incipient Fault Detection and Diagnosis with Kullback-Leibler Divergence : from Theory to Applications." Thesis, Supélec, 2014. http://www.theses.fr/2014SUPL0022/document.
This PhD dissertation deals with the detection and diagnosis of incipient faults in engineering and industrial systems using non-parametric statistical approaches. An incipient fault is assumed to provoke an abnormal change in the measurements of the system variables. However, this change is imperceptible and unpredictable because of the large signal-to-fault ratio and the low fault-to-noise ratio that characterize an incipient fault. Detecting and identifying such a global change requires a 'global' approach that takes the total fault signature into account. In this context, the Kullback-Leibler divergence is considered as a 'global' fault indicator, shown to be sensitive to abnormal small variations hidden in noise. A 'global' spectral analysis approach is also proposed for diagnosing faults with a frequency signature. The 'global' statistical approach is demonstrated in two application studies. The first concerns the detection and characterization of minor cracks in conductive structures; the second concerns the diagnosis of bearing faults in electrical rotating machines. In addition, the fault estimation problem is addressed: a theoretical study yields an analytical model of the KL divergence, from which an estimate of the amplitude of the incipient fault is derived.
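Schematically, and only as an illustration on hypothetical Gaussian data (not the author's method or code), using the KL divergence as a 'global' indicator of a small change hidden in noise might look like this, comparing a fitted inspection window against a fault-free reference:

```python
import numpy as np

def kl_gauss(mu0, var0, mu1, var1):
    """Closed-form KL divergence D(N(mu0, var0) || N(mu1, var1))."""
    return 0.5 * (np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

rng = np.random.default_rng(0)
reference = rng.normal(0.0, 1.0, 50_000)    # fault-free measurements
healthy = rng.normal(0.0, 1.0, 50_000)      # new window, still healthy
faulty = rng.normal(0.08, 1.0, 50_000)      # incipient fault: tiny mean shift

def indicator(ref, window):
    """Fit Gaussians to both samples and return their KL divergence."""
    return kl_gauss(window.mean(), window.var(), ref.mean(), ref.var())

baseline = indicator(reference, healthy)    # near zero
alarm = indicator(reference, faulty)        # noticeably larger than baseline
```

Even a mean shift well below the noise standard deviation produces a divergence clearly separated from the healthy baseline once enough samples are pooled, which is the intuition behind using the KL divergence for incipient faults.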
Chhogyal, Kinzang. "Belief Change: A Probabilistic Inquiry." Thesis, Griffith University, 2016. http://hdl.handle.net/10072/366331.
Thesis (PhD Doctorate), Institute for Integrated and Intelligent Systems, Science, Environment, Engineering and Technology.
Zhou, Ruikun. "A Kullback-Leibler Divergence Filter for Anomaly Detection in Non-Destructive Pipeline Inspection." Thesis, Université d'Ottawa / University of Ottawa, 2020. http://hdl.handle.net/10393/40987.
Jung, Daniel. "Diagnosability performance analysis of models and fault detectors." Doctoral thesis, Linköpings universitet, Fordonssystem, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-117058.
White, Staci A. "Quantifying Model Error in Bayesian Parameter Estimation." The Ohio State University, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=osu1433771825.
Adamcik, Martin. "Collective reasoning under uncertainty and inconsistency." Thesis, University of Manchester, 2014. https://www.research.manchester.ac.uk/portal/en/theses/collective-reasoning-under-uncertainty-and-inconsistency(7fab8021-8beb-45e7-8b45-7cb4fadd70be).html.
Macêra, Márcia Aparecida Centanin. "Uso dos métodos clássico e bayesiano para os modelos não-lineares heterocedásticos simétricos." Universidade de São Paulo, 2011. http://www.teses.usp.br/teses/disponiveis/55/55134/tde-14092011-164458/.
Normal regression models have been used for data analysis for many years. Even in cases where normality could not be assumed, some transformation was usually attempted in order to achieve the normality sought. In practice, however, these assumptions of normality and linearity are not always satisfied, and new classes of regression models were developed as alternatives to the classical technique. In this context, we focus on the class of models in which the distribution assumed for the response variable belongs to the class of symmetric distributions. The aim of this work is to model this class in the Bayesian context, in particular the class of symmetric heteroscedastic nonlinear models. The work connects two research lines: statistical inference addressing aspects of asymptotic theory, and Bayesian inference considering aspects of modeling and criteria for model selection based on Markov Chain Monte Carlo (MCMC) simulation methods. A first step is to present the class of symmetric heteroscedastic nonlinear models as well as classical inference for their parameters. Subsequently, we propose a Bayesian approach for these models, with the objective of showing its feasibility and comparing the parameters estimated by Bayesian inference via MCMC methods with the classical estimates obtained with the GAMLSS tool. In addition, we use a Bayesian case-influence analysis based on the Kullback-Leibler divergence to detect influential observations in the data. The computational implementation was developed in the software R, and program details can be obtained from the study's authors.
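As a hedged sketch of the kind of KL-based case-influence diagnostic mentioned at the end of the abstract (using a hypothetical conjugate normal model with known noise variance, not the models or R code of the study), one can compare the posterior with and without each observation:

```python
import numpy as np

def posterior(y, prior_mu=0.0, prior_var=100.0, noise_var=1.0):
    """Conjugate normal posterior (mean, variance) for a normal mean."""
    n = len(y)
    var = 1.0 / (1.0 / prior_var + n / noise_var)
    mu = var * (prior_mu / prior_var + np.sum(y) / noise_var)
    return mu, var

def kl_gauss(mu0, var0, mu1, var1):
    """Closed-form KL divergence D(N(mu0, var0) || N(mu1, var1))."""
    return 0.5 * (np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

rng = np.random.default_rng(1)
y = rng.normal(5.0, 1.0, 30)
y[0] = 15.0                                   # plant one gross outlier

mu_full, var_full = posterior(y)
# Influence of case i: KL divergence between the case-deleted posterior and
# the full-data posterior; larger values flag more influential observations.
influence = [kl_gauss(*posterior(np.delete(y, i)), mu_full, var_full)
             for i in range(len(y))]
```

The planted outlier dominates the influence values, which is the behavior such a diagnostic exploits to flag observations worth inspecting.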
Books on the topic "Kullback-Leibler divergence"
Soloviev, Vladimir M., Andrii O. Bielinskyi, A. V. Matviychuk, and O. A. Serdyuk. Permutation Based Complexity Measures and Crashes. Bratislava–Kharkiv: VŠEM – S. Kuznets KhNUE, 2021. http://dx.doi.org/10.31812/123456789/4397.
Book chapters on the topic "Kullback-Leibler divergence"
Polani, Daniel. "Kullback-Leibler Divergence." In Encyclopedia of Systems Biology, 1087–88. New York, NY: Springer New York, 2013. http://dx.doi.org/10.1007/978-1-4419-9863-7_1551.
Joyce, James M. "Kullback-Leibler Divergence." In International Encyclopedia of Statistical Science, 720–22. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-04898-2_327.
Roldán, Édgar. "Dissipation and Kullback–Leibler Divergence." In Irreversibility and Dissipation in Microscopic Systems, 37–59. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-07079-7_2.
Roldán, Édgar. "Estimating the Kullback–Leibler Divergence." In Irreversibility and Dissipation in Microscopic Systems, 61–85. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-07079-7_3.
Yang, Zhirong, He Zhang, Zhijian Yuan, and Erkki Oja. "Kullback-Leibler Divergence for Nonnegative Matrix Factorization." In Lecture Notes in Computer Science, 250–57. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-21735-7_31.
Agrawal, Rohit, Yi-Hsiu Chen, Thibaut Horel, and Salil Vadhan. "Unifying Computational Entropies via Kullback–Leibler Divergence." In Advances in Cryptology – CRYPTO 2019, 831–58. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-26951-7_28.
Chen, Hongtian, Bin Jiang, Ningyun Lu, and Wen Chen. "PCA and Kullback-Leibler Divergence-Based FDD Methods." In Data-driven Detection and Diagnosis of Faults in Traction Systems of High-speed Trains, 119–35. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-46263-5_7.
Huynh, Hiep Xuan, Cang Anh Phan, Tu Cam Thi Tran, Hai Thanh Nguyen, and Dinh Quoc Truong. "Threshold Text Classification with Kullback–Leibler Divergence Approach." In Machine Learning and Mechanics Based Soft Computing Applications, 1–11. Singapore: Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-19-6450-3_2.
Corduas, Marcella. "Assessing Similarity of Rating Distributions by Kullback-Leibler Divergence." In Classification and Multivariate Analysis for Complex Data Structures, 221–28. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-13312-1_22.
Chirco, Goffredo. "Rényi Relative Entropy from Homogeneous Kullback-Leibler Divergence Lagrangian." In Lecture Notes in Computer Science, 744–51. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-80209-7_80.
Full textConference papers on the topic "Kullback-Leibler divergence"
Raiber, Fiana, and Oren Kurland. "Kullback-Leibler Divergence Revisited." In ICTIR '17: ACM SIGIR International Conference on the Theory of Information Retrieval. New York, NY, USA: ACM, 2017. http://dx.doi.org/10.1145/3121050.3121062.
Li, Xiangfei, Huan Zhao, and Han Ding. "Kullback-Leibler Divergence-Based Visual Servoing." In 2021 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM). IEEE, 2021. http://dx.doi.org/10.1109/aim46487.2021.9517706.
Nomura, Ryo. "Source Resolvability with Kullback-Leibler Divergence." In 2018 IEEE International Symposium on Information Theory (ISIT). IEEE, 2018. http://dx.doi.org/10.1109/isit.2018.8437647.
Ahuja, Kartik. "Estimating Kullback-Leibler Divergence Using Kernel Machines." In 2019 53rd Asilomar Conference on Signals, Systems, and Computers. IEEE, 2019. http://dx.doi.org/10.1109/ieeeconf44664.2019.9049082.
Im, Chaewon, Seongjin Ahn, and Dongweon Yoon. "Modulation Classification Based on Kullback-Leibler Divergence." In 2020 IEEE 15th International Conference on Advanced Trends in Radioelectronics, Telecommunications and Computer Engineering (TCSET). IEEE, 2020. http://dx.doi.org/10.1109/tcset49122.2020.235457.
Sum, John, Chi-sing Leung, and Lipin Hsu. "Fault tolerant learning using Kullback-Leibler divergence." In TENCON 2007 - 2007 IEEE Region 10 Conference. IEEE, 2007. http://dx.doi.org/10.1109/tencon.2007.4429073.
Zeng, Jia, Xiao-Qin Cao, and Hong Yan. "Human Promoter Recognition using Kullback-Leibler Divergence." In 2007 International Conference on Machine Learning and Cybernetics. IEEE, 2007. http://dx.doi.org/10.1109/icmlc.2007.4370721.
Pheng, Hang See, Siti Mariyam Shamsuddin, Wong Yee Leng, and Razana Alwee. "Kullback Leibler divergence for image quantitative evaluation." In ADVANCES IN INDUSTRIAL AND APPLIED MATHEMATICS: Proceedings of 23rd Malaysian National Symposium of Mathematical Sciences (SKSM23). Author(s), 2016. http://dx.doi.org/10.1063/1.4954516.
Perez-Cruz, Fernando. "Kullback-Leibler divergence estimation of continuous distributions." In 2008 IEEE International Symposium on Information Theory - ISIT. IEEE, 2008. http://dx.doi.org/10.1109/isit.2008.4595271.
Mansouri, Majdi, Hazem Nounou, and Mohamed Nounou. "Kullback-Leibler divergence-based improved particle filter." In 2014 11th International Multi-Conference on Systems, Signals & Devices (SSD). IEEE, 2014. http://dx.doi.org/10.1109/ssd.2014.6808793.
Full textReports on the topic "Kullback-Leibler divergence"
Wilson, D., Matthew Kamrath, Caitlin Haedrich, Daniel Breton, and Carl Hart. Urban noise distributions and the influence of geometric spreading on skewness. Engineer Research and Development Center (U.S.), November 2021. http://dx.doi.org/10.21079/11681/42483.