Journal articles on the topic 'SHANNON ENTROPY'

Consult the top 50 journal articles for your research on the topic 'SHANNON ENTROPY'.


1

Khumaidi, Ali. "Simulasi Entropi Shannon, Entropi Renyi, dan informasi pada kasus Spin Wheel." AKSIOMA : Jurnal Matematika dan Pendidikan Matematika 12, no. 1 (April 20, 2021): 120–28. http://dx.doi.org/10.26877/aks.v12i1.6893.

Full text
Abstract:
Entropy is a quantity that measures the uncertainty of a random variable, and it is a key quantity in information theory. Entropy is a measure of uncertainty. The concept of entropy starts from the notion of information content. Shannon entropy is often regarded as the origin of the information measures used in several applications. This study uses balanced and unbalanced spin wheels with q = 1.000001; the Shannon entropy, Rényi entropy, and information obtained for the balanced spin wheel are 2.079442, 2.079442, and 2.079439, respectively, and for the unbalanced spin wheel 1.936798, 1.936798, and 1.936796, respectively. Thus, the values of the Rényi entropy and information tend to the Shannon entropy as q → 1.
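
The numbers above are easy to reproduce. A minimal sketch, assuming the balanced wheel has eight equally likely sectors (our assumption; ln 8 ≈ 2.079442 matches the reported value). The unbalanced wheel's probabilities are not given in the abstract, so only the balanced case is checked:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in nats: -sum(p_i * ln p_i)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def renyi_entropy(p, q):
    """Renyi entropy of order q (q != 1); tends to the Shannon entropy as q -> 1."""
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p ** q)) / (1.0 - q)

balanced = np.full(8, 1 / 8)               # hypothetical 8-sector balanced wheel
print(shannon_entropy(balanced))           # 2.079442 (= ln 8)
print(renyi_entropy(balanced, 1.000001))   # ~2.079442, approaching Shannon as q -> 1
```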
APA, Harvard, Vancouver, ISO, and other styles
2

Weilenmann, Mirjam, and Roger Colbeck. "Non-Shannon inequalities in the entropy vector approach to causal structures." Quantum 2 (March 14, 2018): 57. http://dx.doi.org/10.22331/q-2018-03-14-57.

Full text
Abstract:
A causal structure is a relationship between observed variables that in general restricts the possible correlations between them. This relationship can be mediated by unobserved systems, modelled by random variables in the classical case or joint quantum systems in the quantum case. One way to differentiate between the correlations realisable by two different causal structures is to use entropy vectors, i.e., vectors whose components correspond to the entropies of each subset of the observed variables. To date, the starting point for deriving entropic constraints within causal structures has been the set of so-called Shannon inequalities (positivity of entropy, conditional entropy and conditional mutual information). In the present work we investigate what happens when non-Shannon entropic inequalities are included as well. We show that in general these lead to tighter outer approximations of the set of realisable entropy vectors and hence enable a sharper distinction of different causal structures. Since non-Shannon inequalities can only be applied amongst classical variables, it might be expected that their use enables an entropic distinction between classical and quantum causal structures. However, this remains an open question. We also introduce techniques for deriving inner approximations to the allowed sets of entropy vectors for a given causal structure. These are useful for proving tightness of outer approximations or for finding interesting regions of entropy space. We illustrate these techniques in several scenarios, including the triangle causal structure.
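
A small illustration of the entropy-vector idea (our sketch, not the authors' code): build the entropy vector of a random joint distribution over three binary variables and check one Shannon inequality, the positivity of conditional mutual information.

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
p = rng.random((2, 2, 2))
p /= p.sum()  # random joint distribution of three binary variables X0, X1, X2

def H(subset):
    """Shannon entropy (bits) of the marginal on the given variable subset."""
    axes = tuple(i for i in range(3) if i not in subset)
    m = p.sum(axis=axes) if axes else p
    m = m[m > 0]
    return -np.sum(m * np.log2(m))

# entropy vector: one component per nonempty subset of the observed variables
vec = {S: H(S) for r in (1, 2, 3) for S in itertools.combinations(range(3), r)}

# a Shannon inequality: I(X0; X1 | X2) = H(02) + H(12) - H(012) - H(2) >= 0
cmi = vec[(0, 2)] + vec[(1, 2)] - vec[(0, 1, 2)] - vec[(2,)]
print(cmi >= -1e-12)  # True for every distribution; non-Shannon inequalities add more
```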
APA, Harvard, Vancouver, ISO, and other styles
3

MATSUMOTO, Takanori. "The pattern of Shannon entropy distribution in the Sea of Okhotsk." Journal of the Visualization Society of Japan 18, Supplement1 (1998): 93–96. http://dx.doi.org/10.3154/jvs.18.supplement1_93.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

ABDEL-ATY, MAHMOUD, ISSA A. AL-KHAYAT, and SHOUKRY S. HASSAN. "SHANNON INFORMATION AND ENTROPY SQUEEZING OF A SINGLE-MODE CAVITY QED OF A RAMAN INTERACTION." International Journal of Quantum Information 04, no. 05 (October 2006): 807–14. http://dx.doi.org/10.1142/s021974990600216x.

Full text
Abstract:
Entropy squeezing is examined in the framework of Shannon information entropy for a degenerate Raman process involving two degenerate Rydberg energy levels of an atom interacting with a single-mode cavity field. Quantum squeezing in entropy is exhibited via the entropic uncertainty relation.
APA, Harvard, Vancouver, ISO, and other styles
5

Baccetti, Valentina, and Matt Visser. "Infinite Shannon entropy." Journal of Statistical Mechanics: Theory and Experiment 2013, no. 04 (April 2, 2013): P04010. http://dx.doi.org/10.1088/1742-5468/2013/04/p04010.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Truffet, L. "Shannon Entropy Reinterpreted." Reports on Mathematical Physics 81, no. 3 (June 2018): 303–19. http://dx.doi.org/10.1016/s0034-4877(18)30050-8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Sarswat, Shruti, Aiswarya R, and Jobin Jose. "Shannon entropy of resonant scattered state in the e–C60 elastic collision." Journal of Physics B: Atomic, Molecular and Optical Physics 55, no. 5 (March 2, 2022): 055003. http://dx.doi.org/10.1088/1361-6455/ac5719.

Full text
Abstract:
Resonance is a remarkable feature in elastic scattering, and the resonant states of e–C60 scattering are benchmarked using Shannon entropy in the present work. The resonant wavefunction, total cross-section, partial cross-sections, and scattering phase shifts are calculated for e–C60 scattering to review the localization properties owing to resonance. Three different model interaction potentials are used to simulate the environment of the C60 shell: annular square well, Gaussian annular square well, and Lorentzian potential. This paper aims to establish a relationship between the Shannon entropy and the resonant properties linked with e + C60 scattering. This work introduces the Shannon entropy as an indicator of resonance in elastic scattering and unveils the susceptibility of entropic properties to the nature of the model potential.
APA, Harvard, Vancouver, ISO, and other styles
8

Bonidia, Robson P., Anderson P. Avila Santos, Breno L. S. de Almeida, Peter F. Stadler, Ulisses Nunes da Rocha, Danilo S. Sanches, and André C. P. L. F. de Carvalho. "Information Theory for Biological Sequence Classification: A Novel Feature Extraction Technique Based on Tsallis Entropy." Entropy 24, no. 10 (October 1, 2022): 1398. http://dx.doi.org/10.3390/e24101398.

Full text
Abstract:
In recent years, there has been exponential growth in sequencing projects due to accelerated technological advances, leading to a significant increase in the amount of data and resulting in new challenges for biological sequence analysis. Consequently, the use of techniques capable of analyzing large amounts of data has been explored, such as machine learning (ML) algorithms. ML algorithms are being used to analyze and classify biological sequences, despite the intrinsic difficulty of extracting representative numerical features from biological sequences that are suitable for them. Extracting numerical features to represent sequences makes it statistically feasible to use universal concepts from information theory, such as the Tsallis and Shannon entropies. In this study, we propose a novel Tsallis entropy-based feature extractor to provide useful information for classifying biological sequences. To assess its relevance, we prepared five case studies: (1) an analysis of the entropic index q; (2) performance testing of the best entropic indices on new datasets; (3) a comparison with Shannon entropy and (4) with generalized entropies; (5) an investigation of the Tsallis entropy in the context of dimensionality reduction. As a result, our proposal proved to be effective, superior to Shannon entropy, robust in terms of generalization, and potentially representative for collecting information in fewer dimensions compared with methods such as Singular Value Decomposition and Uniform Manifold Approximation and Projection.
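
As a rough sketch of the idea (our illustration under assumed details, not the authors' pipeline), a Tsallis entropy feature can be computed from the k-mer frequency distribution of a sequence; the k-mer size k and entropic index q below are placeholders:

```python
from collections import Counter
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy; recovers the Shannon entropy in the limit q -> 1."""
    p = np.asarray(p, dtype=float)
    if q == 1.0:
        p = p[p > 0]
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def kmer_tsallis_feature(seq, k=3, q=2.0):
    """One numerical feature: Tsallis entropy of the k-mer frequencies."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    freqs = np.array(list(counts.values()), dtype=float)
    return tsallis_entropy(freqs / freqs.sum(), q)

print(kmer_tsallis_feature("ACGTACGTGGCATTACG"))  # toy DNA string
```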
APA, Harvard, Vancouver, ISO, and other styles
9

González, Gabriel, and Daniel Salgado-Blanco. "Shannon information entropy for a one-dimensional ionic crystal." Modern Physics Letters A 35, no. 07 (November 22, 2019): 2050032. http://dx.doi.org/10.1142/s0217732320500327.

Full text
Abstract:
In this work, we present the information theoretic lengths given by Shannon and Fisher for the lowest-lying stationary state of a one-dimensional ionic crystal modeled by N equally spaced attractive and repulsive Dirac delta potentials. The entropic uncertainty relations related to position and momentum spaces are studied as a function of the number of ions and the distance between them. Our results show that the stability of the ionic crystal depends on the number of ions and distance between them. In particular, we show that the position Shannon entropy is always positive and increases as the lattice constant between ions grows, in contrast with the momentum Shannon entropy which decreases and becomes negative beyond a particular lattice constant value.
APA, Harvard, Vancouver, ISO, and other styles
10

HALDAR, SUDIP KUMAR, and BARNALI CHAKRABARTI. "DYNAMICAL FEATURES OF SHANNON INFORMATION ENTROPY OF BOSONIC CLOUD IN A TIGHT TRAP." International Journal of Modern Physics B 27, no. 13 (May 15, 2013): 1350048. http://dx.doi.org/10.1142/s0217979213500483.

Full text
Abstract:
We calculate the Shannon information entropy of trapped interacting bosons in both the position and momentum spaces, Sr and Sk, respectively. The total entropy maintains the functional form S = a + b ln N for repulsive bosons. In the noninteracting limit, the lower bound of the entropic uncertainty relation is also satisfied, whereas the diverging behavior of Sr and Sk at the critical point of collapse for the attractive condensate accurately determines the stability factor. Next we study the dynamics of the Shannon information entropy with varying interparticle potential. We numerically solve the time-dependent Gross–Pitaevskii equation and study the influence of increasing nonlinearity on the dynamics of the entropy uncertainty relation (EUR). We observe that for small nonlinearity the dynamics is regular. With increasing nonlinearity, although the Shannon entropy shows large variation in the amplitude of the oscillation, the EUR is maintained at all times in all cases, confirming its generality. We also study the dynamics in a very tight trap, when the condensate becomes highly correlated and strongly inhomogeneous. The time evolution of the total entropy is aperiodic and fluctuating in a very tight trap. We also calculate Landsberg's order parameter for various interaction strengths, which supports the earlier observation that entropy and order are decoupled.
APA, Harvard, Vancouver, ISO, and other styles
11

Guiaşu, Radu Cornel, and Silviu Guiaşu. "Conditional and Weighted Measures of Ecological Diversity." International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 11, no. 03 (June 2003): 283–300. http://dx.doi.org/10.1142/s0218488503002089.

Full text
Abstract:
Shannon's entropy and Simpson's index are the most widely used measures of species diversity. As the Simpson index proves to be just an approximation of the Shannon entropy, conditional Simpson indices of diversity and a global measure of interdependence among species are introduced, similar to those used in the corresponding entropic formalism from information theory. Also, since both the Shannon entropy and the Simpson index depend only on the number and relative abundance of the respective species in a given ecosystem, the paper generalizes these indices of diversity to the case when a numerical weight is attached to each species. Such a weight could reflect supplementary information about the absolute abundance, the economic significance, or the conservation value of the species.
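
The approximation mentioned above is the first-order Taylor expansion ln p ≈ p − 1, which turns −Σ pᵢ ln pᵢ into the Gini–Simpson index 1 − Σ pᵢ². A minimal numerical check, with hypothetical abundances and weights (the weighted form sketches Guiaşu's weighted entropy −Σ wᵢ pᵢ ln pᵢ):

```python
import numpy as np

abundance = np.array([50, 30, 10, 5, 3, 2], dtype=float)  # toy species counts
p = abundance / abundance.sum()

shannon = -np.sum(p * np.log(p))       # Shannon entropy
gini_simpson = 1.0 - np.sum(p ** 2)    # Gini-Simpson index: first-order approximation
print(shannon, gini_simpson)

w = np.array([1.0, 1.0, 2.0, 2.0, 3.0, 3.0])   # hypothetical conservation weights
print(-np.sum(w * p * np.log(p)))              # weighted (Guiasu-style) entropy
```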
APA, Harvard, Vancouver, ISO, and other styles
12

Kreiner, Welf Alfred. "First Digits’ Shannon Entropy." Entropy 24, no. 10 (October 3, 2022): 1413. http://dx.doi.org/10.3390/e24101413.

Full text
Abstract:
Related to the letters of an alphabet, entropy means the average number of binary digits required for the transmission of one character. Checking tables of statistical data, one finds that, in the first position of the numbers, the digits 1 to 9 occur with different frequencies. Correspondingly, from these probabilities, a value for the Shannon entropy H can be determined as well. Although in many cases the Newcomb–Benford Law applies, distributions have been found where the 1 in the first position occurs more than 40 times as frequently as the 9. In this case, the probability of the occurrence of a particular first digit can be derived from a power function with a negative exponent of magnitude p > 1. While the entropy of the first digits following an NB distribution amounts to H = 2.88, for other data distributions (diameters of craters on Venus or the weight of fragments of crushed minerals), entropy values of 2.76 and 2.04 bits per digit have been found.
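
The H = 2.88 figure can be verified directly from the Newcomb–Benford probabilities p(d) = log10(1 + 1/d); a short check, with a hypothetical power-law exponent for comparison:

```python
import numpy as np

d = np.arange(1, 10)
p = np.log10(1 + 1 / d)         # Newcomb-Benford first-digit probabilities
print(-np.sum(p * np.log2(p)))  # ~2.88 bits per first digit, as quoted above

# power-law alternative discussed in the abstract: p(d) ~ d**(-exp), exp > 1
exp = 2.0                       # hypothetical exponent magnitude
q = d ** (-exp) / np.sum(d ** (-exp))
print(-np.sum(q * np.log2(q)))  # lower entropy: the digit 1 dominates more strongly
```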
APA, Harvard, Vancouver, ISO, and other styles
13

Bao, Han, and Shinsaku Sakaue. "Sparse Regularized Optimal Transport with Deformed q-Entropy." Entropy 24, no. 11 (November 10, 2022): 1634. http://dx.doi.org/10.3390/e24111634.

Full text
Abstract:
Optimal transport is a mathematical tool that has been widely used to measure the distance between two probability distributions. To mitigate the cubic computational complexity of the vanilla formulation of the optimal transport problem, regularized optimal transport has received attention in recent years; it is a convex program that minimizes the linear transport cost plus a convex regularizer. Sinkhorn optimal transport is the most prominent example, regularized with negative Shannon entropy and leading to densely supported solutions, which are often undesirable in light of the interpretability of transport plans. In this paper, we report that a deformed entropy designed by q-algebra, a popular generalization of the standard algebra studied in Tsallis statistical mechanics, makes optimal transport solutions sparsely supported. This entropy with a deformation parameter q interpolates between the negative Shannon entropy (q=1) and the squared 2-norm (q=0), and the solution becomes sparser as q tends to zero. Our theoretical analysis reveals that a larger q leads to faster convergence when optimized with the Broyden–Fletcher–Goldfarb–Shanno (BFGS) algorithm. In summary, the deformation induces a trade-off between sparsity and convergence speed.
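
For orientation, the Shannon-regularized baseline that the paper deforms is plain Sinkhorn; a minimal sketch (marginals, cost, and ε are arbitrary here; the q-deformed solver itself follows the paper, not this snippet):

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.1, n_iter=1000):
    """Entropic OT: minimize <P, C> - eps * H(P) subject to marginals a and b."""
    K = np.exp(-C / eps)                 # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]   # transport plan (densely supported)

n = 5
a = b = np.full(n, 1 / n)
C = (np.arange(n)[:, None] - np.arange(n)[None, :]) ** 2.0
print(sinkhorn(a, b, C).round(3))  # dense plan; the deformed q-entropy sparsifies it
```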
APA, Harvard, Vancouver, ISO, and other styles
14

Vyas, Jitendra Kumar, Muthiah Perumal, and Tommaso Moramarco. "Discharge Estimation Using Tsallis and Shannon Entropy Theory in Natural Channels." Water 12, no. 6 (June 23, 2020): 1786. http://dx.doi.org/10.3390/w12061786.

Full text
Abstract:
Streamflow measurement during high floods is a challenge for which the World Meteorological Organization fosters the development of innovative technologies for achieving an accurate estimation of the discharge. The use of non-contact sensors for monitoring surface flow velocities is of interest for turning these observed values into a cross-sectional mean flow velocity and, subsequently, into discharge if the bathymetry is given. In this context, several techniques are available for the estimation of mean flow velocity starting from observed surface velocities. Among them, the entropy-based methodology for river discharge assessment is often applied by leveraging the theoretical entropic principles of Shannon and Tsallis, both of which link the maximum flow velocity, measured at a vertical of the flow area named the y-axis, and the cross-sectional mean flow velocity at a river site. This study investigates the performance of the two entropic approaches in estimating the mean flow velocity starting from the maximum surface flow velocity sampled at the y-axis. A velocity dataset consisting of 70 measurement events collected at two gauged stations with different geometric and hydraulic characteristics on the Po and Tiber Rivers in Italy was used for the analysis. The velocity distribution observed at the y-axis in all 70 measurement events was closely reproduced by both the Shannon and Tsallis entropy approaches. Accurate values of the cross-sectional mean flow velocity and discharge were obtained, with average errors not exceeding 10%, demonstrating that the Shannon and Tsallis entropy concepts are equally efficient for discharge estimation in any flow conditions.
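
In the Shannon case, the link between the two velocities is Chiu's classical entropic relation u_mean = Φ(M)·u_max with Φ(M) = e^M/(e^M − 1) − 1/M; a sketch with made-up numbers (the entropic parameter M is site-specific and must be calibrated):

```python
import numpy as np

def phi(M):
    """Chiu's Shannon-entropy ratio of cross-sectional mean to maximum velocity."""
    return np.exp(M) / (np.exp(M) - 1.0) - 1.0 / M

M = 2.1        # hypothetical entropic parameter for a gauged site
u_max = 2.5    # m/s, maximum surface velocity sampled at the y-axis
A = 120.0      # m^2, hypothetical flow area from bathymetry

u_mean = phi(M) * u_max
print(u_mean, u_mean * A)  # mean velocity (m/s) and discharge Q = u_mean * A (m^3/s)
```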
APA, Harvard, Vancouver, ISO, and other styles
15

Życzkowski, Karol. "Rényi Extrapolation of Shannon Entropy." Open Systems & Information Dynamics 10, no. 03 (September 2003): 297–310. http://dx.doi.org/10.1023/a:1025128024427.

Full text
Abstract:
Relations between the Shannon entropy and Rényi entropies of integer order are discussed. For any N-point discrete probability distribution for which the Rényi entropies of order two and three are known, we provide a lower and an upper bound for the Shannon entropy. The average of the two bounds provides an explicit extrapolation for this quantity. These results imply relations between the von Neumann entropy of a mixed quantum state, its linear entropy, and traces.
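
The paper's explicit bounds are not reproduced here, but the monotonicity underlying them is easy to observe numerically: the Rényi entropy is non-increasing in its order, so the orders 2 and 3 bound the Shannon entropy from below.

```python
import numpy as np

rng = np.random.default_rng(0)
p = rng.dirichlet(np.ones(16))   # random N-point distribution, N = 16

def renyi(p, alpha):
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

H1 = -np.sum(p * np.log(p))           # Shannon entropy (Renyi order-1 limit)
H2, H3 = renyi(p, 2), renyi(p, 3)     # integer-order Renyi entropies
print(H3 <= H2 <= H1)                 # True: monotone in the order
```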
APA, Harvard, Vancouver, ISO, and other styles
16

ABDEL-ATY, MAHMOUD, ABDEL-SHAFY F. OBADA, and M. SEBAWE ABDALLA. "SHANNON INFORMATION AND ENTROPY SQUEEZING OF TWO FIELDS PARAMETRIC FREQUENCY CONVERTER INTERACTING WITH A SINGLE-ATOM." International Journal of Quantum Information 01, no. 03 (September 2003): 359–73. http://dx.doi.org/10.1142/s0219749903000231.

Full text
Abstract:
In terms of a new Hamiltonian we introduce a modified Jaynes–Cummings model to represent the interaction between a two-level atom and two electromagnetic fields injected simultaneously within a perfect cavity. The field–field interaction is assumed to be in the frequency converter form. Exact expressions for the dynamical operators are given by solving the equations of motion in the Heisenberg picture. The entropy squeezing is examined in the framework of Shannon information entropy. Using the entropic uncertainty relation, one can show that there is a new kind of quantum squeezing in the entropy.
APA, Harvard, Vancouver, ISO, and other styles
17

Chakrabarti, C. G., and Indranil Chakrabarty. "Shannon entropy: axiomatic characterization and application." International Journal of Mathematics and Mathematical Sciences 2005, no. 17 (2005): 2847–54. http://dx.doi.org/10.1155/ijmms.2005.2847.

Full text
Abstract:
We have presented a new axiomatic derivation of Shannon entropy for a discrete probability distribution on the basis of the postulates of additivity and concavity of the entropy function. We have then modified Shannon entropy to take account of observational uncertainty. The modified entropy reduces, in the limiting case, to the form of the Shannon differential entropy. As an application, we have derived the expression for the classical entropy of statistical mechanics from the quantized form of the entropy.
APA, Harvard, Vancouver, ISO, and other styles
18

Tempesta, Piergiulio. "A theorem on the existence of trace-form generalized entropies." Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences 471, no. 2183 (November 2015): 20150165. http://dx.doi.org/10.1098/rspa.2015.0165.

Full text
Abstract:
An analytic technique is proposed that allows one to generate many new examples of entropic functionals generalizing the standard Boltzmann–Gibbs entropy. Our approach is based on the existence of a group-theoretical structure, which is intimately related to the notion of entropy, as clarified in recent work of the author. The new entropies proposed satisfy the first three Shannon–Khinchin axioms and are composable (at least in a weak sense). By combining them, multi-parametric examples of entropies can be realized. As a by-product of the theory, entropic functionals related to the Riemann zeta function and other number-theoretical functions are introduced.
APA, Harvard, Vancouver, ISO, and other styles
19

Ascoli, R., and R. Urigu. "Physical information entropy and probability Shannon entropy." International Journal of Theoretical Physics 36, no. 8 (August 1997): 1691–716. http://dx.doi.org/10.1007/bf02435839.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Abbe, Emmanuel, and Sophie Spirkl. "Entropic Matroids and Their Representation." Entropy 21, no. 10 (September 27, 2019): 948. http://dx.doi.org/10.3390/e21100948.

Full text
Abstract:
This paper investigates entropic matroids, that is, matroids whose rank function is given as the Shannon entropy of random variables. In particular, we consider p-entropic matroids, for which the random variables each have support of cardinality p. We draw connections between such entropic matroids and secret-sharing matroids and show that entropic matroids are linear matroids when p = 2, 3 but not when p = 9. Our results leave open the possibility for p-entropic matroids to be linear whenever p is prime, with particular cases proved here. Applications of entropic matroids to coding theory and cryptography are also discussed.
APA, Harvard, Vancouver, ISO, and other styles
21

Hanser, Sean F., Laurance R. Doyle, Brenda McCowan, and Jon M. Jenkins. "Information Theory Applied to Animal Communication Systems and Its Possible Application to SETI." Symposium - International Astronomical Union 213 (2004): 514–18. http://dx.doi.org/10.1017/s0074180900193817.

Full text
Abstract:
Information theory, as first introduced by Claude Shannon (Shannon & Weaver 1949), quantitatively evaluates the organizational complexity of communication systems. At the same time, George Zipf was examining linguistic structure in a way that was mathematically similar to the components of the Shannon first-order entropy (Zipf 1949). Both Shannon's and Zipf's mathematical procedures have been applied to animal communication and have recently been providing insightful results. The Zipf plot is a useful tool for a first estimate of the characterization of a communication system's complexity (which can later be examined for complex structure at deeper levels using Shannon entropic analysis). In this paper we discuss some of the applications and pitfalls of using the Zipf distribution as a preliminary evaluator of the communication complexity of a signaling system.
APA, Harvard, Vancouver, ISO, and other styles
22

Brandenburger, Adam, Pierfrancesco La Mura, and Stuart Zoble. "Rényi Entropy, Signed Probabilities, and the Qubit." Entropy 24, no. 10 (October 3, 2022): 1412. http://dx.doi.org/10.3390/e24101412.

Full text
Abstract:
The states of the qubit, the basic unit of quantum information, are 2×2 positive semi-definite Hermitian matrices with trace 1. We contribute to the program to axiomatize quantum mechanics by characterizing these states in terms of an entropic uncertainty principle formulated on an eight-point phase space. We do this by employing Rényi entropy (a generalization of Shannon entropy) suitably defined for the signed phase-space probability distributions that arise in representing quantum states.
APA, Harvard, Vancouver, ISO, and other styles
23

Yuan, Lin, and H. K. Kesavan. "Bayesian estimation of Shannon entropy." Communications in Statistics - Theory and Methods 26, no. 1 (January 1997): 139–48. http://dx.doi.org/10.1080/03610929708831906.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Aliaga, J., G. Crespo, and A. N. Proto. "Squeezed states and Shannon entropy." Physical Review A 49, no. 6 (June 1, 1994): 5146–48. http://dx.doi.org/10.1103/physreva.49.5146.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Carrasco Millán, Pedro, M. Ángeles García-Ferrero, Felipe J. Llanes-Estrada, Ana Porras Riojano, and Esteban M. Sánchez García. "Shannon entropy and particle decays." Nuclear Physics B 930 (May 2018): 583–96. http://dx.doi.org/10.1016/j.nuclphysb.2018.04.003.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Spivak, David I. "Polynomial Functors and Shannon Entropy." Electronic Proceedings in Theoretical Computer Science 380 (July 28, 2023): 315–27. http://dx.doi.org/10.4204/eptcs.380.19.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

De la Cruz-García, Jazmín S., Juan Bory-Reyes, and Aldo Ramirez-Arellano. "A Two-Parameter Fractional Tsallis Decision Tree." Entropy 24, no. 5 (April 19, 2022): 572. http://dx.doi.org/10.3390/e24050572.

Full text
Abstract:
Decision trees are decision-support data mining tools that create, as the name suggests, a tree-like model. The classical C4.5 decision tree, based on the Shannon entropy, is a simple algorithm that calculates the gain ratio and then splits on attributes according to this entropy measure. Tsallis and Rényi entropies (instead of Shannon) can be employed to generate a decision tree with better results. In practice, the entropic index parameter of these entropies is tuned to outperform the classical decision trees. However, this tuning is carried out by testing a range of values for a given database, which is time-consuming and infeasible for massive data. This paper introduces a decision tree based on a two-parameter fractional Tsallis entropy. We propose a constructionist approach to the representation of databases as complex networks, which enables efficient computation of the parameters of this entropy using the box-covering algorithm and renormalization of the complex network. The experimental results support the conclusion that the two-parameter fractional Tsallis entropy is a more sensitive measure than its parametric Rényi, Tsallis, and Gini-index predecessors for a decision tree classifier.
APA, Harvard, Vancouver, ISO, and other styles
28

Noorizadeh, Siamak, and Ehsan Shakerzadeh. "Shannon entropy as a new measure of aromaticity, Shannon aromaticity." Physical Chemistry Chemical Physics 12, no. 18 (2010): 4742. http://dx.doi.org/10.1039/b916509f.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Mahdy, Mervat, Dina S. Eltelbany, and Hoda Mohammed. "HN-Entropy: A New Measure of Information and Its Properties." Journal of University of Shanghai for Science and Technology 24, no. 1 (January 4, 2022): 105–18. http://dx.doi.org/10.51201/jusst/21/11979.

Full text
Abstract:
Entropy measures the amount of uncertainty and dispersion of an unknown or random quantity. The concept was first introduced by Shannon (1948) and is important for studies in many areas: in information theory, entropy measures the amount of information in each message received; in physics, entropy is the basic concept that measures the disorder of a thermodynamical system; and so on. In this paper, we introduce an alternative measure of entropy, called HN-entropy. Unlike Shannon entropy, this proposed measure of order α and β is more flexible. Then, the cumulative residual HN-entropy, the cumulative HN-entropy, and a weighted version are introduced. Finally, a comparison between Shannon entropy and HN-entropy and numerical results are presented.
APA, Harvard, Vancouver, ISO, and other styles
30

JOHANSSON, FREDRIK, and HIROYUKI TOH. "RELATIVE VON NEUMANN ENTROPY FOR EVALUATING AMINO ACID CONSERVATION." Journal of Bioinformatics and Computational Biology 08, no. 05 (October 2010): 809–23. http://dx.doi.org/10.1142/s021972001000494x.

Full text
Abstract:
The Shannon entropy is a common way of measuring conservation of sites in multiple sequence alignments, and has also been extended with the relative Shannon entropy to account for background frequencies. The von Neumann entropy is another extension of the Shannon entropy, adapted from quantum mechanics in order to account for amino acid similarities. However, there is yet no relative von Neumann entropy defined for sequence analysis. We introduce a new definition of the von Neumann entropy for use in sequence analysis, which we found to perform better than the previous definition. We also introduce the relative von Neumann entropy and a way of parametrizing this in order to obtain the Shannon entropy, the relative Shannon entropy and the von Neumann entropy at special parameter values. We performed an exhaustive search of this parameter space and found better predictions of catalytic sites compared to any of the previously used entropies.
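
To illustrate the quantity involved (a generic sketch, not the paper's alignment-specific construction): the von Neumann entropy of a density matrix is the Shannon entropy of its eigenvalues, so for a diagonal matrix of column frequencies it reduces to the ordinary Shannon conservation score.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return -np.sum(evals * np.log2(evals))

freqs = np.array([0.4, 0.3, 0.2, 0.1])       # toy amino-acid frequencies of a column
print(von_neumann_entropy(np.diag(freqs)))   # equals the Shannon entropy of freqs
# Off-diagonal (similarity) terms change the eigenvalues, which is how the
# von Neumann entropy accounts for amino-acid similarity.
```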
APA, Harvard, Vancouver, ISO, and other styles
31

Sobolev, Sergey L., and Igor V. Kudinov. "Extended Nonequilibrium Variables for 1D Hyperbolic Heat Conduction." Journal of Non-Equilibrium Thermodynamics 45, no. 3 (July 26, 2020): 209–21. http://dx.doi.org/10.1515/jnet-2019-0076.

Full text
Abstract:
We use the Shannon (information) entropy to define an "entropic" temperature for a 1D nonequilibrium system with heat flux. In contrast to the kinetic temperature, which is related to the average kinetic energy, the nonequilibrium entropic temperature is related to changes in entropy and serves as a criterion for thermalization. However, the direction and value of the heat flux are controlled by the gradient of the kinetic temperature, whereas the space-time evolution of the heat flux is governed by the hyperbolic heat conduction equation. The extended nonequilibrium variables, namely entropy, entropic temperature, thermal conductivity, and heat capacity, demonstrate a third-law-like behavior at high deviation from equilibrium, when the heat flux tends to its maximum value, even at a nonzero value of the kinetic temperature. The ratio of the heat flux to its maximum possible value plays the role of an order parameter: it varies from zero in the equilibrium (disordered) state to unity in the nonequilibrium (ordered) state.
APA, Harvard, Vancouver, ISO, and other styles
32

Dehesa, Jesús S. "Entropy-Like Properties and Lq-Norms of Hypergeometric Orthogonal Polynomials: Degree Asymptotics." Symmetry 13, no. 8 (August 3, 2021): 1416. http://dx.doi.org/10.3390/sym13081416.

Full text
Abstract:
In this work, the spread of hypergeometric orthogonal polynomials (HOPs) along their orthogonality interval is examined by means of the main entropy-like measures of their associated Rakhmanov probability density, which go far beyond the standard deviation and its generalizations, the ordinary moments. The Fisher information, the Rényi and Shannon entropies, and their corresponding spreading lengths are analytically expressed in terms of the degree and the parameter(s) of the orthogonality weight function. These entropic quantities are closely related to the gradient functional (Fisher) and the Lq-norms (Rényi, Shannon) of the polynomials. In addition, the degree asymptotics for these entropy-like functionals of the three canonical families of HOPs (i.e., Hermite, Laguerre, and Jacobi polynomials) are given and briefly discussed. Finally, a number of open related issues are identified whose solutions are both physico-mathematically and computationally relevant.
APA, Harvard, Vancouver, ISO, and other styles
33

Fullwood, James, and Arthur J. Parzygnat. "The Information Loss of a Stochastic Map." Entropy 23, no. 8 (August 8, 2021): 1021. http://dx.doi.org/10.3390/e23081021.

Full text
Abstract:
We provide a stochastic extension of the Baez–Fritz–Leinster characterization of the Shannon information loss associated with a measure-preserving function. This recovers the conditional entropy and a closely related information-theoretic measure that we call conditional information loss. Although not functorial, these information measures are semi-functorial, a concept we introduce that is definable in any Markov category. We also introduce the notion of an entropic Bayes’ rule for information measures, and we provide a characterization of conditional entropy in terms of this rule.
APA, Harvard, Vancouver, ISO, and other styles
34

Harremoës, Peter. "Entropy Inequalities for Lattices." Entropy 20, no. 10 (October 12, 2018): 784. http://dx.doi.org/10.3390/e20100784.

Full text
Abstract:
We study entropy inequalities for variables that are related by functional dependencies. Although the powerset on four variables is the smallest Boolean lattice with non-Shannon inequalities, there exist lattices with many more variables where the Shannon inequalities are sufficient. We search for conditions that exclude the existence of non-Shannon inequalities. The existence of non-Shannon inequalities is related to the question of whether a lattice is isomorphic to a lattice of subgroups of a group. In order to formulate and prove the results, one has to bridge lattice theory, group theory, the theory of functional dependencies, and the theory of conditional independence. It is demonstrated that the Shannon inequalities are sufficient for planar modular lattices. The proof applies a gluing technique that uses the fact that if the Shannon inequalities are sufficient for the pieces, then they are also sufficient for the whole lattice. It is conjectured that the Shannon inequalities are sufficient if and only if the lattice does not contain a special lattice as a sub-semilattice.
APA, Harvard, Vancouver, ISO, and other styles
35

de Araujo, O. R., L. S. Marinho, H. A. S. Costa, Marcos Sampaio, and I. G. da Paz. "Shannon entropy and interference in the double-slit experiment with matter waves." Modern Physics Letters A 34, no. 02 (January 20, 2019): 1950017. http://dx.doi.org/10.1142/s0217732319500172.

Full text
Abstract:
The Shannon entropy has been used to study the localization or delocalization of a distribution for different systems. Increased localization is associated with smaller values of the Shannon entropy, and increased delocalization with larger values. In this paper, we show that for the double-slit experiment in the position and momentum spaces, smaller values of the Shannon entropy are associated with a larger number of interference fringes, and larger values of the Shannon entropy with a smaller number of interference fringes.
APA, Harvard, Vancouver, ISO, and other styles
36

Kalogeropoulos, Nikos. "Groups, nonadditive entropy and phase transitions." International Journal of Modern Physics B 28, no. 24 (August 5, 2014): 1450162. http://dx.doi.org/10.1142/s0217979214501628.

Full text
Abstract:
We investigate the possibility of discrete groups furnishing a kinematic framework for systems whose thermodynamic behavior may be described by nonadditive entropies. Relying on the well-known result on the growth rate of balls in nilpotent groups, we see that maintaining extensivity of the entropy of a nilpotent group requires using a non-Boltzmann/Gibbs/Shannon (BGS) entropic form. We use the Tsallis entropy as an indicative alternative. Using basic results from hyperbolic and random groups, we investigate the genericity and possible range of applicability of the BGS entropy in this context. We propose a sufficient condition for phase transitions in the context of (multi-)parameter families of nonadditive entropies.
APA, Harvard, Vancouver, ISO, and other styles
37

Newton, Paul K., and Stephen A. DeSalvo. "The Shannon entropy of Sudoku matrices." Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences 466, no. 2119 (February 10, 2010): 1957–75. http://dx.doi.org/10.1098/rspa.2009.0522.

Full text
Abstract:
We study properties of an ensemble of Sudoku matrices (a special type of doubly stochastic matrix when normalized) using their statistically averaged singular values. The determinants are very nearly Cauchy distributed about the origin. The largest singular value is 45 (every row and column of a Sudoku matrix sums to 45), while the others decrease approximately linearly. The normalized singular values (obtained by dividing each singular value by the sum of all nine singular values) are then used to calculate the average Shannon entropy of the ensemble, a measure of the distribution of 'energy' among the singular modes, interpreted as a measure of the disorder of a typical matrix. We show the Shannon entropy of the ensemble to be 1.7331±0.0002, which is slightly lower than for an ensemble of 9×9 Latin squares, but higher than for a certain collection of 9×9 random matrices used for comparison. Using the notion of relative entropy or Kullback–Leibler divergence, which gives a measure of how one distribution differs from another, we show that the relative entropy between the ensemble of Sudoku matrices and Latin squares is of the order of 10⁻⁵. By contrast, the relative entropy between Sudoku matrices and the collection of random matrices has the much higher value, of the order of 10⁻³, with the Shannon entropy of the Sudoku matrices having better distribution among the modes. We finish by 'reconstituting' the 'average' Sudoku matrix from its averaged singular components.
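
The singular-value entropy is easy to reproduce for a single grid; a sketch using one well-known solved Sudoku (natural logarithms assumed; the paper averages over an ensemble):

```python
import numpy as np

sudoku = np.array([
    [5, 3, 4, 6, 7, 8, 9, 1, 2],
    [6, 7, 2, 1, 9, 5, 3, 4, 8],
    [1, 9, 8, 3, 4, 2, 5, 6, 7],
    [8, 5, 9, 7, 6, 1, 4, 2, 3],
    [4, 2, 6, 8, 5, 3, 7, 9, 1],
    [7, 1, 3, 9, 2, 4, 8, 5, 6],
    [9, 6, 1, 5, 3, 7, 2, 8, 4],
    [2, 8, 7, 4, 1, 9, 6, 3, 5],
    [3, 4, 5, 2, 8, 6, 9, 1, 7],
], dtype=float)

s = np.linalg.svd(sudoku, compute_uv=False)
print(s[0])                    # 45.0: every row and column sums to 45
p = s / s.sum()                # normalized singular values
p = p[p > 1e-12]               # guard against a (near-)singular grid
print(-np.sum(p * np.log(p)))  # Shannon entropy of one grid; ensemble mean is 1.7331
```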
APA, Harvard, Vancouver, ISO, and other styles
38

CHAKRABARTI, C. G., and I. CHAKRABARTY. "BOLTZMANN–SHANNON ENTROPY: GENERALIZATION AND APPLICATION." Modern Physics Letters B 20, no. 23 (October 10, 2006): 1471–79. http://dx.doi.org/10.1142/s0217984906011529.

Full text
Abstract:
The paper deals with the generalization of both Boltzmann entropy and distribution in the light of most-probable interpretation of statistical equilibrium. The statistical analysis of the generalized entropy and distribution leads to some new interesting results of significant physical importance.
APA, Harvard, Vancouver, ISO, and other styles
39

Xing, Yifan, and Jun Wu. "Controlling the Shannon Entropy of Quantum Systems." Scientific World Journal 2013 (2013): 1–13. http://dx.doi.org/10.1155/2013/381219.

Full text
Abstract:
This paper proposes a new quantum control method which controls the Shannon entropy of quantum systems. For both discrete and continuous entropies, controller design methods are proposed based on probability density function control, which can drive the quantum state to any target state. To drive the entropy to any target at any prespecified time, another discretization method is proposed for the discrete entropy case, and the conditions under which the entropy can be increased or decreased are discussed. Simulations are done on both two- and three-dimensional quantum systems, where division and prediction are used to achieve more accurate tracking.
APA, Harvard, Vancouver, ISO, and other styles
40

Sherwin, William B., and Narcis Prat i Fornells. "The Introduction of Entropy and Information Methods to Ecology by Ramon Margalef." Entropy 21, no. 8 (August 15, 2019): 794. http://dx.doi.org/10.3390/e21080794.

Full text
Abstract:
In ecology and evolution, entropic methods are now used widely and increasingly frequently. Their use can be traced back to Ramon Margalef's first attempt 70 years ago to use log-series to quantify ecological diversity, including searching for ecologically meaningful groupings within a large assemblage, which we now call the gamma level. The same year, Shannon and Weaver published a generally accessible form of Shannon's work on information theory, including the measure that we now call Shannon–Wiener entropy. Margalef seized on that measure and soon proposed that ecologists should use the Shannon–Wiener index to evaluate diversity, including assessing local (alpha) diversity and differentiation between localities (beta). He also discussed relating this measure to environmental variables and ecosystem processes such as succession. Over the subsequent decades, he enthusiastically expanded upon his initial suggestions. Finally, 2019 would also have been Margalef's 100th birthday.
APA, Harvard, Vancouver, ISO, and other styles
41

Cholewa, Marcin, and Bartłomiej Płaczek. "Application of Positional Entropy to Fast Shannon Entropy Estimation for Samples of Digital Signals." Entropy 22, no. 10 (October 19, 2020): 1173. http://dx.doi.org/10.3390/e22101173.

Full text
Abstract:
This paper introduces a new method of estimating Shannon entropy. The proposed method can be successfully used for large data samples and enables fast computations to rank the data samples according to their Shannon entropy. Original definitions of positional entropy and integer entropy are discussed in detail to explain the theoretical concepts that underpin the proposed approach. Relations between positional entropy, integer entropy, and Shannon entropy are demonstrated through computational experiments. The usefulness of the introduced method was experimentally verified for various data samples of different types and sizes. The experimental results clearly show that the proposed approach can be successfully used for fast entropy estimation. The analysis also focused on the quality of the entropy estimation. Several possible implementations of the proposed method are discussed. The presented algorithms were compared with existing solutions. It was demonstrated that the algorithms presented in this paper estimate the Shannon entropy faster and more accurately than the state-of-the-art algorithms.
APA, Harvard, Vancouver, ISO, and other styles
42

Fortune, Timothy, and Hailin Sang. "Shannon Entropy Estimation for Linear Processes." Journal of Risk and Financial Management 13, no. 9 (September 9, 2020): 205. http://dx.doi.org/10.3390/jrfm13090205.

Full text
Abstract:
In this paper, we estimate the Shannon entropy S(f) = −E[log f(X)] of a one-sided linear process with probability density function f(x). We employ the integral estimator Sn(f), which utilizes the standard kernel density estimator fn(x) of f(x). We show that Sn(f) converges to S(f) almost surely and in L2 under reasonable conditions.
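
A minimal sketch of such an estimator (our illustration with an i.i.d. Gaussian stand-in for the linear process; bandwidth selection is left to the library default):

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
x = rng.standard_normal(5000)    # stand-in sample; the paper treats linear processes

kde = gaussian_kde(x)            # kernel density estimate f_n of f
grid = np.linspace(x.min() - 1.0, x.max() + 1.0, 2001)
f = np.clip(kde(grid), 1e-300, None)

S_n = -np.trapz(f * np.log(f), grid)        # integral estimator of -E[log f(X)]
print(S_n, 0.5 * np.log(2 * np.pi * np.e))  # vs. exact Gaussian entropy ~1.4189
```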
APA, Harvard, Vancouver, ISO, and other styles
43

Zhuang, Houlong. "Sudoku-inspired high-Shannon-entropy alloys." Acta Materialia 225 (February 2022): 117556. http://dx.doi.org/10.1016/j.actamat.2021.117556.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Nath, Prem, M. M. Kaur, and Ranjeet Singh. "Some derivations of the Shannon entropy." Journal of Discrete Mathematical Sciences and Cryptography 1, no. 1 (January 1998): 85–100. http://dx.doi.org/10.1080/09720529.1998.10697868.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Xi, Bo-Yan, Dan-Dan Gao, Tao Zhang, Bai-Ni Guo, and Feng Qi. "Shannon Type Inequalities for Kapur’s Entropy." Mathematics 7, no. 1 (December 26, 2018): 22. http://dx.doi.org/10.3390/math7010022.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Alizadeh Noughabi, Hadi. "On the Estimation of Shannon Entropy." Journal of Statistical Research of Iran 12, no. 1 (September 1, 2015): 57–70. http://dx.doi.org/10.18869/acadpub.jsri.12.1.57.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Stanisz, P., M. Oettingen, and G. Kępisty. "Shannon entropy of nuclear fuel transmutation." IOP Conference Series: Earth and Environmental Science 214 (January 23, 2019): 012025. http://dx.doi.org/10.1088/1755-1315/214/1/012025.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Bulinski, Alexander, and Alexey Kozhevin. "Statistical estimation of conditional Shannon entropy." ESAIM: Probability and Statistics 23 (2019): 350–86. http://dx.doi.org/10.1051/ps/2018026.

Full text
Abstract:
New estimates of the conditional Shannon entropy are introduced in the framework of a model describing a discrete response variable depending on a vector of d factors having a density w.r.t. the Lebesgue measure in ℝd. Namely, the mixed-pair model (X, Y) is considered, where X and Y take values in ℝd and an arbitrary finite set, respectively. Such models include, for instance, the famous logistic regression. In contrast to the well-known Kozachenko–Leonenko estimates of unconditional entropy, the proposed estimates are constructed by means of certain spatial order statistics (or k-nearest neighbor statistics, where k = kn depends on the number of observations n) and a random number of i.i.d. observations contained in balls of specified random radii. The asymptotic unbiasedness and L2-consistency of the new estimates are established under simple conditions. The obtained results can be applied to the feature selection problem, which is important, e.g., for medical and biological investigations.
APA, Harvard, Vancouver, ISO, and other styles
49

Piera, F. J., and P. Parada. "On convergence properties of Shannon entropy." Problems of Information Transmission 45, no. 2 (June 2009): 75–94. http://dx.doi.org/10.1134/s003294600902001x.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Nath, Prem, and Man Mohan Kaur. "Quasicyclic Symmetry and the Shannon Entropy." Journal of Information and Optimization Sciences 13, no. 1 (January 1992): 165–72. http://dx.doi.org/10.1080/02522667.1992.10699101.

Full text
APA, Harvard, Vancouver, ISO, and other styles