Journal articles on the topic 'INFORMATION THEORETIC MEASURES'

To see the other types of publications on this topic, follow the link: INFORMATION THEORETIC MEASURES.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic 'INFORMATION THEORETIC MEASURES.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Browse journal articles across a wide variety of disciplines and organise your bibliography correctly.

1

Rosso, Osvaldo A., and Fernando Montani. "Information Theoretic Measures and Their Applications." Entropy 22, no. 12 (December 7, 2020): 1382. http://dx.doi.org/10.3390/e22121382.

Abstract:
The concept of entropy, an ever-growing physical magnitude that measures the degree of decay of order in a physical system, was introduced by Rudolf Clausius in 1865 through an elegant formulation of the second law of thermodynamics [...]
2

Raz, Tzvi. "Information theoretic measures of inspection performance." International Journal of Production Research 29, no. 5 (May 1991): 913–26. http://dx.doi.org/10.1080/00207549108930110.

3

Saud, Ibne, and M. Z. Khan. "A note on information theoretic measures." Statistics 20, no. 1 (January 1989): 161–64. http://dx.doi.org/10.1080/02331888908802156.

4

Finn, Seth. "Information-Theoretic Measures of Reader Enjoyment." Written Communication 2, no. 4 (October 1985): 358–76. http://dx.doi.org/10.1177/0741088385002004002.

5

Dehesa, J. S., S. López-Rosa, and R. J. Yáñez. "Information-theoretic measures of hyperspherical harmonics." Journal of Mathematical Physics 48, no. 4 (April 2007): 043503. http://dx.doi.org/10.1063/1.2712913.

6

Warrick, Philip A., and Emily F. Hamilton. "Information theoretic measures of perinatal cardiotocography synchronization." Mathematical Biosciences and Engineering 17, no. 3 (2020): 2179–93. http://dx.doi.org/10.3934/mbe.2020116.

7

Lee, Y. T. "Information-theoretic distortion measures for speech recognition." IEEE Transactions on Signal Processing 39, no. 2 (1991): 330–35. http://dx.doi.org/10.1109/78.80815.

8

Agrawal, Pankaj, Sk Sazim, Indranil Chakrabarty, and Arun K. Pati. "Local, nonlocal quantumness and information theoretic measures." International Journal of Quantum Information 14, no. 06 (September 2016): 1640034. http://dx.doi.org/10.1142/s0219749916400347.

Abstract:
It has been suggested that there may exist quantum correlations that go beyond entanglement. The existence of such correlations can be revealed by information theoretic quantities such as quantum discord, but not by the conventional measures of entanglement. We argue that a state displays quantumness that can be of local or nonlocal origin. Information theoretic measures characterize not only the nonlocal quantumness but also the local quantumness, such as the “local superposition”. This may be why such measures are nonzero even when there is no entanglement. We consider a generalized version of the Werner state to demonstrate the interplay of local quantumness, nonlocal quantumness, and classical mixedness of a state.
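For readers unfamiliar with the quantities involved: quantum discord, the measure this abstract contrasts with entanglement, is usually defined as the gap between two quantum generalizations of mutual information, and the two-qubit Werner state is the standard example in which discord survives without entanglement. The LaTeX sketch below states these textbook forms as background only; it is not the authors' generalized Werner state, which is specific to the paper.

```latex
\[
  I(\rho_{AB}) = S(\rho_A) + S(\rho_B) - S(\rho_{AB}), \qquad
  J_A(\rho_{AB}) = S(\rho_B) - \min_{\{\Pi_k^A\}} \sum_k p_k\, S(\rho_{B|k}),
\]
\[
  D_A(\rho_{AB}) = I(\rho_{AB}) - J_A(\rho_{AB}) \ge 0,
  \qquad
  \rho_W = p\,|\Psi^-\rangle\langle\Psi^-| + \tfrac{1-p}{4}\,\mathbb{I}_4 .
\]
```

The Werner state is separable for p ≤ 1/3 yet has nonzero discord for every p > 0, which is exactly the "quantumness without entanglement" situation the abstract discusses.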
9

Nath, D. "Information theoretic spreading measures of orthogonal functions." Journal of Mathematical Chemistry 51, no. 5 (March 7, 2013): 1446–61. http://dx.doi.org/10.1007/s10910-013-0157-4.

10

Wang, Wei-Ning, Qi Li, and Liang Wang. "Robust Object Tracking via Information Theoretic Measures." International Journal of Automation and Computing 17, no. 5 (May 30, 2020): 652–66. http://dx.doi.org/10.1007/s11633-020-1235-2.

11

van der Hoef, Hanneke, and Matthijs J. Warrens. "Understanding information theoretic measures for comparing clusterings." Behaviormetrika 46, no. 2 (December 4, 2018): 353–70. http://dx.doi.org/10.1007/s41237-018-0075-7.

12

Hu, Bao-Gang, Ran He, and Xiao-Tong Yuan. "Information-theoretic Measures for Objective Evaluation of Classifications." Acta Automatica Sinica 38, no. 7 (2012): 1169. http://dx.doi.org/10.3724/sp.j.1004.2012.01169.

13

Lee, Dong-Hoon, Young-Sik Kim, and Jong-Seon No. "Bit Security Estimation Using Various Information-Theoretic Measures." IEEE Access 9 (2021): 140103–15. http://dx.doi.org/10.1109/access.2021.3119707.

15

Marculescu, D., R. Marculescu, and M. Pedram. "Information theoretic measures for power analysis [logic design]." IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems 15, no. 6 (June 1996): 599–610. http://dx.doi.org/10.1109/43.503930.

16

Aizawa, Akiko. "An information-theoretic perspective of tf–idf measures." Information Processing & Management 39, no. 1 (January 2003): 45–65. http://dx.doi.org/10.1016/s0306-4573(02)00021-3.

17

Batra, Luckshay, and H. C. Taneja. "Evaluating volatile stock markets using information theoretic measures." Physica A: Statistical Mechanics and its Applications 537 (January 2020): 122711. http://dx.doi.org/10.1016/j.physa.2019.122711.

18

Hu, Bao-Gang, Ran He, and Xiao-Tong Yuan. "Information-theoretic Measures for Objective Evaluation of Classifications." Acta Automatica Sinica 38, no. 7 (July 2012): 1169–82. http://dx.doi.org/10.1016/s1874-1029(11)60289-9.

19

Dehesa, J. S., A. Guerrero, and P. Sánchez-Moreno. "Information-Theoretic-Based Spreading Measures of Orthogonal Polynomials." Complex Analysis and Operator Theory 6, no. 3 (February 15, 2011): 585–601. http://dx.doi.org/10.1007/s11785-011-0136-3.

20

Zhu, Ping, and Qiaoyan Wen. "Information-theoretic measures associated with rough set approximations." Information Sciences 212 (December 2012): 33–43. http://dx.doi.org/10.1016/j.ins.2012.05.014.

21

Takahashi, Daniel Y., Luiz A. Baccalá, and Koichi Sameshima. "Information theoretic interpretation of frequency domain connectivity measures." Biological Cybernetics 103, no. 6 (December 2010): 463–69. http://dx.doi.org/10.1007/s00422-010-0410-x.

22

Bonaventura, Xavier, Jianwei Guo, Weiliang Meng, Miquel Feixas, Xiaopeng Zhang, and Mateu Sbert. "3D shape retrieval using viewpoint information-theoretic measures." Computer Animation and Virtual Worlds 26, no. 2 (December 18, 2013): 147–56. http://dx.doi.org/10.1002/cav.1566.

23

Batra, Luckshay, and Harish Chander Taneja. "Comparison between Information Theoretic Measures to Assess Financial Markets." FinTech 1, no. 2 (May 19, 2022): 137–54. http://dx.doi.org/10.3390/fintech1020011.

Abstract:
Information theoretic measures were applied to the study of the randomness associations of different financial time series. We studied the level of similarities between information theoretic measures and the various tools of regression analysis, i.e., between Shannon entropy and the total sum of squares of the dependent variable, relative mutual information and coefficients of correlation, conditional entropy and residual sum of squares, etc. We observed that mutual information and its dynamical extensions provide an alternative approach with some advantages to study the association between several international stock indices. Furthermore, mutual information and conditional entropy are relatively efficient compared to the measures of statistical dependence.
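To make the kind of comparison this abstract describes concrete, here is a minimal, self-contained sketch (not the authors' code) that discretizes two synthetic return series and computes Shannon entropy, mutual information, and conditional entropy from the joint histogram, alongside the Pearson correlation. The variable names, bin count, and simulated data are illustrative assumptions.

```python
import numpy as np

def discretize(x, bins=10):
    """Map a continuous series to equal-width bin indices."""
    edges = np.histogram_bin_edges(x, bins=bins)
    return np.clip(np.digitize(x, edges[1:-1]), 0, bins - 1)

def entropy(labels):
    """Shannon entropy (nats) of a discrete label sequence."""
    p = np.bincount(labels) / len(labels)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def mutual_information(a, b):
    """I(A;B) = H(A) + H(B) - H(A,B), estimated from the joint histogram."""
    joint = a * (b.max() + 1) + b          # encode each (a, b) pair as one label
    return entropy(a) + entropy(b) - entropy(joint)

rng = np.random.default_rng(0)
x = rng.normal(size=2000)                  # synthetic "index 1" returns
y = 0.6 * x + 0.8 * rng.normal(size=2000)  # synthetic "index 2" returns

a, b = discretize(x), discretize(y)
H_y = entropy(b)
I_xy = mutual_information(a, b)
H_y_given_x = H_y - I_xy                   # conditional entropy H(Y|X)

print(f"Pearson correlation: {np.corrcoef(x, y)[0, 1]:.3f}")
print(f"H(Y) = {H_y:.3f} nats, I(X;Y) = {I_xy:.3f} nats, H(Y|X) = {H_y_given_x:.3f} nats")
```

The analogy in the abstract then reads off directly: H(Y) plays the role of the total variation to be explained, I(X;Y) the explained part, and H(Y|X) the residual part.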
24

Munjal, D., K. D. Sen, and V. Prasad. "Shape effect on information theoretic measures of quantum heterostructures." Journal of Physics Communications 2, no. 2 (February 1, 2018): 025002. http://dx.doi.org/10.1088/2399-6528/aaa3ba.

25

Voronov, S. V., and A. G. Tashlinskii. "Efficiency analysis of information theoretic measures in image registration." Pattern Recognition and Image Analysis 26, no. 3 (July 2016): 502–5. http://dx.doi.org/10.1134/s1054661816030226.

26

Dehesa, J. S., A. Martínez-Finkelshtein, and V. N. Sorokin. "Information-theoretic measures for Morse and Pöschl–Teller potentials." Molecular Physics 104, no. 4 (February 20, 2006): 613–22. http://dx.doi.org/10.1080/00268970500493243.

27

Lee, Yi-Teh. "Corrections and supplementary note to 'Information-theoretic distortion measures'." IEEE Transactions on Speech and Audio Processing 2, no. 2 (April 1994): 350. http://dx.doi.org/10.1109/89.279285.

28

Shin, In-Seob, Seung Kee Han, Keon Myung Lee, Seung Bok Lee, and Woo Hyun Jung. "Bi-partitioning Algorithm for Information-theoretic Measures of Aesthetics." New Physics: Sae Mulli 63, no. 6 (June 28, 2013): 655–60. http://dx.doi.org/10.3938/npsm.63.655.

29

Kanwal, Maxinder, Joshua Grochow, and Nihat Ay. "Comparing Information-Theoretic Measures of Complexity in Boltzmann Machines." Entropy 19, no. 7 (July 3, 2017): 310. http://dx.doi.org/10.3390/e19070310.

30

Foster, Peter, Simon Dixon, and Anssi Klapuri. "Identifying Cover Songs Using Information-Theoretic Measures of Similarity." IEEE/ACM Transactions on Audio, Speech, and Language Processing 23, no. 6 (June 2015): 993–1005. http://dx.doi.org/10.1109/taslp.2015.2416655.

31

Boschetti, Fabio, Karine Prunera, Mathew A. Vanderklift, Damian P. Thomson, Russell C. Babcock, Christopher Doropoulos, Anna Cresswell, and Hector Lozano-Montes. "Information-theoretic measures of ecosystem change, sustainability, and resilience." ICES Journal of Marine Science 77, no. 4 (June 19, 2019): 1532–44. http://dx.doi.org/10.1093/icesjms/fsz105.

Abstract:
We introduce five measures describing the system-wide behaviour of complex ecological systems. Within an information-theoretic framework, these measures account for changes in both species diversity and total biomass to describe (i) overall system change, (ii) sustainability to external pressure, (iii) shift from a baseline state and two types of resilience: (iv) ability to recover from local pressures and (v) overall potential to return to a baseline state. We apply these measures to study the behaviour of three computer models: a large 59-functional groups complex ecological model (Ecopath with Ecosim) of north Western Australia undergoing internal dynamics, a smaller 6-group coral reef model subjected to various combinations of single and multiple stressors and a prey–predator model displaying limit cycles. We demonstrate the state-dependency of properties like resilience and sustainability by showing how these measures change in time as a function of internal dynamics and external forcing. Furthermore, we show how our proposed measures can simplify system analysis and monitoring by providing indicators of changes in system behaviour, sustainability, and resilience.
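The paper's five measures are defined within its own framework; purely as a loose illustration of the ingredients the abstract mentions (functional-group composition, total biomass, and departure from a baseline), the sketch below computes Shannon diversity of a biomass composition and a Kullback-Leibler "shift" from a baseline composition. The formulas, group counts, and numbers here are illustrative assumptions, not the paper's definitions.

```python
import numpy as np

def composition(biomass):
    """Normalize functional-group biomasses to a probability composition."""
    b = np.asarray(biomass, dtype=float)
    return b / b.sum()

def shannon_diversity(biomass):
    """H = -sum p_i log p_i over the biomass composition (nats)."""
    p = composition(biomass)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def kl_shift(biomass, baseline):
    """D_KL(current || baseline): illustrative 'shift from baseline' indicator."""
    p, q = composition(biomass), composition(baseline)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

baseline = [40.0, 25.0, 20.0, 10.0, 5.0]   # hypothetical baseline biomasses
stressed = [55.0, 20.0, 15.0, 7.0, 3.0]    # hypothetical perturbed state

print(f"Diversity (baseline): {shannon_diversity(baseline):.3f} nats")
print(f"Diversity (stressed): {shannon_diversity(stressed):.3f} nats")
print(f"KL shift from baseline: {kl_shift(stressed, baseline):.3f} nats")
```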
32

Manzano, D., R. J. Yáñez, and J. S. Dehesa. "Relativistic Klein–Gordon charge effects by information-theoretic measures." New Journal of Physics 12, no. 2 (February 11, 2010): 023014. http://dx.doi.org/10.1088/1367-2630/12/2/023014.

33

Zachary, John, and S. S. Iyengar. "Information theoretic similarity measures for content based image retrieval." Journal of the American Society for Information Science and Technology 52, no. 10 (2001): 856–67. http://dx.doi.org/10.1002/asi.1139.

34

Chowdhury, Anirban Roy, Ashis Saha, and Sunandan Gangopadhyay. "Mixed state information theoretic measures in boosted black brane." Annals of Physics 452 (May 2023): 169270. http://dx.doi.org/10.1016/j.aop.2023.169270.

35

Hertweck, Corinna, and Tim Räz. "Gradual (In)Compatibility of Fairness Criteria." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 11 (June 28, 2022): 11926–34. http://dx.doi.org/10.1609/aaai.v36i11.21450.

Abstract:
Impossibility results show that important fairness measures (independence, separation, sufficiency) cannot be satisfied at the same time under reasonable assumptions. This paper explores whether we can satisfy and/or improve these fairness measures simultaneously to a certain degree. We introduce information-theoretic formulations of the fairness measures and define degrees of fairness based on these formulations. The information-theoretic formulations suggest unexplored theoretical relations between the three fairness measures. In the experimental part, we use the information-theoretic expressions as regularizers to obtain fairness-regularized predictors for three standard datasets. Our experiments show that a) fairness regularization directly increases fairness measures, in line with existing work, and b) some fairness regularizations indirectly increase other fairness measures, as suggested by our theoretical findings. This establishes that it is possible to increase the degree to which some fairness measures are satisfied at the same time -- some fairness measures are gradually compatible.
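The abstract does not spell the formulations out; in the fair-ML literature the three criteria it names are commonly written in terms of mutual information between the predictor's output, the sensitive attribute, and the true label, with the degree of violation measured by how far each quantity is from zero. The LaTeX below is that standard background rendering, not necessarily the paper's exact definitions.

```latex
\[
  \text{independence: } I(\hat{Y}; A) = 0, \qquad
  \text{separation: }   I(\hat{Y}; A \mid Y) = 0, \qquad
  \text{sufficiency: }  I(Y; A \mid \hat{Y}) = 0 .
\]
```

Used as regularizers, estimates of the left-hand sides are added to the training loss, so that minimizing the loss also pushes the corresponding fairness violations toward zero.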
36

Lai, Ryan Ka Yau, and Youngah Do. "Large-sample confidence intervals of information-theoretic measures in linguistics." Journal of Research Design and Statistics in Linguistics and Communication Science 6, no. 1 (November 7, 2020): 19–54. http://dx.doi.org/10.1558/jrds.40134.

Abstract:
This article explores a method of creating confidence bounds for information-theoretic measures in linguistics, such as entropy, Kullback-Leibler Divergence (KLD), and mutual information. We show that a useful measure of uncertainty can be derived from simple statistical principles, namely the asymptotic distribution of the maximum likelihood estimator (MLE) and the delta method. Three case studies from phonology and corpus linguistics are used to demonstrate how to apply it and examine its robustness against common violations of its assumptions in linguistics, such as insufficient sample size and non-independence of data points.
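As a minimal sketch of the general recipe the abstract describes (plug-in maximum likelihood estimate plus the delta method), the code below computes a large-sample confidence interval for Shannon entropy from categorical counts. The variance formula follows from the multinomial covariance of the relative frequencies; the function name and example counts are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.stats import norm

def entropy_ci(counts, level=0.95):
    """Plug-in entropy (nats) with a delta-method large-sample CI.
    Var(H_hat) ~ (sum_i p_i * (ln p_i)^2 - H^2) / n for multinomial counts."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    p = counts[counts > 0] / n
    h = -np.sum(p * np.log(p))
    var = (np.sum(p * np.log(p) ** 2) - h ** 2) / n
    z = norm.ppf(0.5 + level / 2)
    half = z * np.sqrt(max(var, 0.0))
    return h, (h - half, h + half)

# Hypothetical category counts from a small corpus
h, (low, high) = entropy_ci([120, 80, 40, 25, 10, 5])
print(f"H = {h:.3f} nats, 95% CI = ({low:.3f}, {high:.3f})")
```

The same pattern (estimate, gradient, multinomial covariance, normal quantile) extends to KL divergence and mutual information, which is the generality the article exploits.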
37

Zhang, Yimeng, Xiuyi Jia, and Zhenmin Tang. "Information-theoretic measures of uncertainty for interval-set decision tables." Information Sciences 577 (October 2021): 81–104. http://dx.doi.org/10.1016/j.ins.2021.06.092.

38

Luo, Ping, Hui Xiong, Guoxing Zhan, Junjie Wu, and Zhongzhi Shi. "Information-Theoretic Distance Measures for Clustering Validation: Generalization and Normalization." IEEE Transactions on Knowledge and Data Engineering 21, no. 9 (September 2009): 1249–62. http://dx.doi.org/10.1109/tkde.2008.200.

39

Diaz, Mario, Hao Wang, Flavio P. Calmon, and Lalitha Sankar. "On the Robustness of Information-Theoretic Privacy Measures and Mechanisms." IEEE Transactions on Information Theory 66, no. 4 (April 2020): 1949–78. http://dx.doi.org/10.1109/tit.2019.2939472.

40

de Fleurian, Remi, Tim Blackwell, Oded Ben-Tal, and Daniel Müllensiefen. "Information-Theoretic Measures Predict the Human Judgment of Rhythm Complexity." Cognitive Science 41, no. 3 (February 17, 2016): 800–813. http://dx.doi.org/10.1111/cogs.12347.

41

Aviyente, S., L. A. W. Brakel, R. K. Kushwaha, M. Snodgrass, H. Shevrin, and W. J. Williams. "Characterization of Event Related Potentials Using Information Theoretic Distance Measures." IEEE Transactions on Biomedical Engineering 51, no. 5 (May 2004): 737–43. http://dx.doi.org/10.1109/tbme.2004.824133.

42

Robinson, J. W. C., D. E. Asraf, A. R. Bulsara, and M. E. Inchiosa. "Information-Theoretic Distance Measures and a Generalization of Stochastic Resonance." Physical Review Letters 81, no. 14 (October 5, 1998): 2850–53. http://dx.doi.org/10.1103/physrevlett.81.2850.

43

LaFrance, J. T., T. K. M. Beatty, R. D. Pope, and G. K. Agnew. "Information theoretic measures of the income distribution in food demand." Journal of Econometrics 107, no. 1-2 (March 2002): 235–57. http://dx.doi.org/10.1016/s0304-4076(01)00122-1.

44

Cha, Shin, In Sang Chung, and Yong Rae Kwon. "Complexity measures for concurrent programs based on information-theoretic metrics." Information Processing Letters 46, no. 1 (April 1993): 43–50. http://dx.doi.org/10.1016/0020-0190(93)90195-f.

45

Emmert-Streib, Frank, and Matthias Dehmer. "Information theoretic measures of UHG graphs with low computational complexity." Applied Mathematics and Computation 190, no. 2 (July 2007): 1783–94. http://dx.doi.org/10.1016/j.amc.2007.02.095.

46

Gholipour, Ali, Nasser Kehtarnavaz, Siamak Yousefi, Kaundinya Gopinath, and Richard Briggs. "Symmetric deformable image registration via optimization of information theoretic measures." Image and Vision Computing 28, no. 6 (June 2010): 965–75. http://dx.doi.org/10.1016/j.imavis.2009.11.012.

47

Yahya, W. A., K. J. Oyewumi, and K. D. Sen. "Position and momentum information-theoretic measures of the pseudoharmonic potential." International Journal of Quantum Chemistry 115, no. 21 (July 22, 2015): 1543–52. http://dx.doi.org/10.1002/qua.24971.

48

Isonguyo, Cecilia N., Kayode J. Oyewumi, and Opeyemi S. Oyun. "Quantum information-theoretic measures for the static screened Coulomb potential." International Journal of Quantum Chemistry 118, no. 15 (March 9, 2018): e25620. http://dx.doi.org/10.1002/qua.25620.

49

Schamberg, Gabriel, William Chapman, Shang-Ping Xie, and Todd P. Coleman. "Direct and Indirect Effects—An Information Theoretic Perspective." Entropy 22, no. 8 (July 31, 2020): 854. http://dx.doi.org/10.3390/e22080854.

Abstract:
Information theoretic (IT) approaches to quantifying causal influences have experienced some popularity in the literature, in both theoretical and applied (e.g., neuroscience and climate science) domains. While these causal measures are desirable in that they are model agnostic and can capture non-linear interactions, they are fundamentally different from common statistical notions of causal influence in that they (1) compare distributions over the effect rather than values of the effect and (2) are defined with respect to random variables representing a cause rather than specific values of a cause. We here present IT measures of direct, indirect, and total causal effects. The proposed measures are unlike existing IT techniques in that they enable measuring causal effects that are defined with respect to specific values of a cause while still offering the flexibility and general applicability of IT techniques. We provide an identifiability result and demonstrate application of the proposed measures in estimating the causal effect of the El Niño–Southern Oscillation on temperature anomalies in the North American Pacific Northwest.
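The paper's precise definitions of direct, indirect, and total effects are its own; purely as background for how an information-theoretic measure can compare distributions over the effect at specific values of a cause, one simple formulation of a total effect is a divergence between two interventional distributions, for example:

```latex
\[
  \mathrm{TE}(x, x') \;=\; D_{\mathrm{KL}}\!\left(
      P\big(Y \mid \mathrm{do}(X = x)\big)
      \;\big\|\;
      P\big(Y \mid \mathrm{do}(X = x')\big)
  \right).
\]
```

This illustrates points (1) and (2) in the abstract: the comparison is between whole distributions of the effect Y, evaluated at specific values x and x' of the cause.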
50

Tian, Chao, James S. Plank, Brent Hurst, and Ruida Zhou. "Computational Techniques for Investigating Information Theoretic Limits of Information Systems." Information 12, no. 2 (February 16, 2021): 82. http://dx.doi.org/10.3390/info12020082.

Abstract:
Computer-aided methods, based on the entropic linear program framework, have been shown to be effective in assisting the study of information theoretic fundamental limits of information systems. One key element that significantly impacts their computation efficiency and applicability is the reduction of variables, based on problem-specific symmetry and dependence relations. In this work, we propose using the disjoint-set data structure to algorithmically identify the reduction mapping, instead of relying on exhaustive enumeration in the equivalence classification. Based on this reduced linear program, we consider four techniques to investigate the fundamental limits of information systems: (1) computing an outer bound for a given linear combination of information measures and providing the values of information measures at the optimal solution; (2) efficiently computing a polytope tradeoff outer bound between two information quantities; (3) producing a proof (as a weighted sum of known information inequalities) for a computed outer bound; and (4) providing the range for information quantities between which the optimal value does not change, i.e., sensitivity analysis. A toolbox, with an efficient JSON format input frontend, and either Gurobi or Cplex as the linear program solving engine, is implemented and open-sourced.
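The key algorithmic ingredient in this abstract, using a disjoint-set (union-find) structure to group variables that are equivalent under problem symmetry and dependence relations rather than enumerating equivalence classes exhaustively, can be sketched as follows. The symmetry pairs and variable labels are hypothetical placeholders; this is not the toolbox's actual code.

```python
class DisjointSet:
    """Union-find with path compression and union by size."""
    def __init__(self, items):
        self.parent = {x: x for x in items}
        self.size = {x: 1 for x in items}

    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return
        if self.size[ra] < self.size[rb]:
            ra, rb = rb, ra
        self.parent[rb] = ra
        self.size[ra] += self.size[rb]

# Joint-entropy variables H(S), with hypothetical symmetry-induced equivalences;
# each equivalence class keeps a single representative LP variable.
variables = ["H(X1)", "H(X2)", "H(X1,X2)", "H(X1,Y)", "H(X2,Y)"]
equivalent_pairs = [("H(X1)", "H(X2)"), ("H(X1,Y)", "H(X2,Y)")]

ds = DisjointSet(variables)
for a, b in equivalent_pairs:
    ds.union(a, b)

reduction = {v: ds.find(v) for v in variables}   # variable -> representative
print(reduction)
```

The resulting mapping from variables to representatives is the "reduction mapping" fed into the smaller linear program.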
