Journal articles on the topic 'Entropy analysis'

Consult the top 50 journal articles for your research on the topic 'Entropy analysis.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Press it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of each academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Ben Abdallah, Naoufel, Hedia Chaker, and Christian Schmeiser. "The High Field Asymptotics for a Fermionic Boltzmann Equation: Entropy Solutions and Kinetic Shock Profiles." Journal of Hyperbolic Differential Equations 4, no. 4 (December 2007): 679–704. http://dx.doi.org/10.1142/s0219891607001318.

Abstract:
The high field approximation of a fermionic Boltzmann equation of semiconductors is performed after the formation of shocks. By employing a new entropy, whose dissipation measures the departure from the high field equilibrium, convergence towards the entropic solution of the limiting conservation law is proven. The entropy is also used to construct kinetic shock profiles for entropic shocks and to prove non-existence of non-entropic shock profiles.
2

Flood, Matthew W., and Bernd Grimm. "EntropyHub: An open-source toolkit for entropic time series analysis." PLOS ONE 16, no. 11 (November 4, 2021): e0259448. http://dx.doi.org/10.1371/journal.pone.0259448.

Abstract:
An increasing number of studies across many research fields, from biomedical engineering to finance, are employing measures of entropy to quantify the regularity, variability or randomness of time series and image data. Entropy, as it relates to information theory and dynamical systems theory, can be estimated in many ways, with newly developed methods being continuously introduced in the scientific literature. Despite the growing interest in entropic time series and image analysis, there is a shortage of validated, open-source software tools that enable researchers to apply these methods. To date, packages for performing entropy analysis are often run using graphical user interfaces, lack the necessary supporting documentation, or do not include functions for more advanced entropy methods, such as cross-entropy, multiscale cross-entropy or bidimensional entropy. In light of this, this paper introduces EntropyHub, an open-source toolkit for performing entropic time series analysis in MATLAB, Python and Julia. EntropyHub (version 0.1) provides an extensive range of more than forty functions for estimating cross-, multiscale, multiscale cross-, and bidimensional entropy, each including a number of keyword arguments that allow the user to specify multiple parameters in the entropy calculation. Instructions for installation, descriptions of function syntax, and examples of use are fully detailed in the supporting documentation, available on the EntropyHub website: www.EntropyHub.xyz. Compatible with Windows, Mac and Linux operating systems, EntropyHub is hosted on GitHub, as well as on the native package repositories for MATLAB, Python and Julia. The goal of EntropyHub is to integrate the many established entropy methods into one complete resource, providing tools that make advanced entropic time series analysis straightforward and reproducible.
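The multiscale methods named in this abstract all rest on a coarse-graining step: the series is averaged over non-overlapping windows of growing length, and an entropy estimate is computed at each scale. The sketch below illustrates that step in plain Python; it is a generic illustration, not EntropyHub's actual API (consult www.EntropyHub.xyz for the real function syntax):

```python
def coarse_grain(signal, scale):
    """Coarse-grain a time series for multiscale entropy analysis:
    average consecutive non-overlapping windows of length `scale`.
    Leftover samples that do not fill a full window are dropped."""
    n = len(signal) // scale
    return [sum(signal[i * scale:(i + 1) * scale]) / scale for i in range(n)]

x = list(range(12))
print(coarse_grain(x, 1))  # the original series, as floats
print(coarse_grain(x, 3))  # [1.0, 4.0, 7.0, 10.0]
```

A multiscale entropy curve is then obtained by applying a single-scale estimator, such as sample entropy, to `coarse_grain(x, s)` for s = 1, 2, 3, and so on.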
3

Yan, Kesong. "Conditional entropy and fiber entropy for amenable group actions." Journal of Differential Equations 259, no. 7 (October 2015): 3004–31. http://dx.doi.org/10.1016/j.jde.2015.04.013.

4

Makhanlall, Deodat, and Peixue Jiang. "Analysis of Entropy Generation in Diffusion HTAC Processes." International Journal of Applied Physics and Mathematics 4, no. 4 (2014): 275–79. http://dx.doi.org/10.7763/ijapm.2014.v4.298.

5

Hanson, Robert M. "Regarding Entropy Analysis." Journal of Chemical Education 82, no. 6 (June 2005): 839. http://dx.doi.org/10.1021/ed082p839.1.

6

Bailey, Kenneth D. "System entropy analysis." Kybernetes 26, no. 6/7 (August 1997): 674–88. http://dx.doi.org/10.1108/03684929710169852.

7

Størmer, Erling. "Entropy of Endomorphisms and Relative Entropy in Finite von Neumann Algebras." Journal of Functional Analysis 171, no. 1 (February 2000): 34–52. http://dx.doi.org/10.1006/jfan.1999.3535.

8

Downarowicz, Tomasz. "Entropy structure." Journal d'Analyse Mathématique 96, no. 1 (December 2005): 57–116. http://dx.doi.org/10.1007/bf02787825.

9

Litvak, A. E., V. D. Milman, A. Pajor, and N. Tomczak-Jaegermann. "Entropy extension." Functional Analysis and Its Applications 40, no. 4 (October 2006): 298–303. http://dx.doi.org/10.1007/s10688-006-0046-8.

10

Castro, Manuel J., Ulrik S. Fjordholm, Siddhartha Mishra, and Carlos Parés. "Entropy Conservative and Entropy Stable Schemes for Nonconservative Hyperbolic Systems." SIAM Journal on Numerical Analysis 51, no. 3 (January 2013): 1371–91. http://dx.doi.org/10.1137/110845379.

11

Fang, S. C., E. L. Peterson, and J. R. Rajasekera. "Minimum cross-entropy analysis with entropy-type constraints." Journal of Computational and Applied Mathematics 39, no. 2 (March 1992): 165–78. http://dx.doi.org/10.1016/0377-0427(92)90127-j.

12

Sparavigna, Amelia Carolina. "Entropy in Image Analysis." Entropy 21, no. 5 (May 17, 2019): 502. http://dx.doi.org/10.3390/e21050502.

13

Tanaka, Yoshifumi. "Entropy and data analysis." Nihon Shuchu Chiryo Igakukai zasshi 15, no. 4 (2008): 469–71. http://dx.doi.org/10.3918/jsicm.15.469.

14

Jenssen, R. "Kernel Entropy Component Analysis." IEEE Transactions on Pattern Analysis and Machine Intelligence 32, no. 5 (May 2010): 847–60. http://dx.doi.org/10.1109/tpami.2009.100.

15

Czyż, Teresa, and Jan Hauke. "Entropy In Regional Analysis." Quaestiones Geographicae 34, no. 4 (December 1, 2015): 69–78. http://dx.doi.org/10.1515/quageo-2015-0037.

Abstract:
Entropy has been proposed as a significant tool for an analysis of spatial differences. Using Semple and Gauthier's (1972) transformation of the Shannon entropy statistic into an entropy measure of inequality and their algorithm, an estimation is made of changes in regional inequality in Poland over the years 2005–2012. The inequality is decomposed into total, inter- and intra-regional types, and an analysis is made of relations holding between them.
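The transformation described above can be illustrated with a short sketch (hypothetical shares, not the Polish regional data): the Shannon entropy of the regional shares, normalized by its maximum log n, yields an index that is 0 under perfect equality and approaches 1 as activity concentrates in a single region.

```python
import math

def inequality_index(values):
    """Entropy-based inequality measure: 1 - H/H_max, where H is the
    Shannon entropy of the regional shares and H_max = log(n) is the
    entropy of a perfectly uniform distribution."""
    total = sum(values)
    shares = [v / total for v in values]
    h = -sum(p * math.log(p) for p in shares if p > 0)
    return 1.0 - h / math.log(len(values))

print(inequality_index([25, 25, 25, 25]))  # ≈ 0.0: perfect equality
print(inequality_index([97, 1, 1, 1]))     # ≈ 0.88: strong concentration
```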
16

Remacle, F., Rameshkumar Arumugam, and R. D. Levine. "Maximal entropy multivariate analysis." Molecular Physics 110, no. 15-16 (August 10, 2012): 1659–68. http://dx.doi.org/10.1080/00268976.2012.665192.

17

Zhao, Haitao, and W. K. Wong. "Regularized discriminant entropy analysis." Pattern Recognition 47, no. 2 (February 2014): 806–19. http://dx.doi.org/10.1016/j.patcog.2013.08.020.

18

Govindan, R. B., J. D. Wilson, H. Eswaran, C. L. Lowery, and H. Preißl. "Revisiting sample entropy analysis." Physica A: Statistical Mechanics and its Applications 376 (March 2007): 158–64. http://dx.doi.org/10.1016/j.physa.2006.10.077.

19

von der Linden, W. "Maximum-entropy data analysis." Applied Physics A: Materials Science & Processing 60, no. 2 (January 1, 1995): 155–65. http://dx.doi.org/10.1007/s003390050086.

20

Bryan, R. K. "Maximum Entropy Data Analysis." Le Journal de Physique Colloques 47, no. C5 (August 1986): C5-43–C5-53. http://dx.doi.org/10.1051/jphyscol:1986506.

21

Ahmed, Mosabber Uddin, and Danilo P. Mandic. "Multivariate Multiscale Entropy Analysis." IEEE Signal Processing Letters 19, no. 2 (February 2012): 91–94. http://dx.doi.org/10.1109/lsp.2011.2180713.

22

von der Linden, W. "Maximum-entropy data analysis." Applied Physics A 60, no. 2 (February 1995): 155–65. http://dx.doi.org/10.1007/bf01538241.

23

Lake, Douglas. "Continuous sample entropy analysis." Journal of Critical Care 25, no. 3 (September 2010): e7-e8. http://dx.doi.org/10.1016/j.jcrc.2010.05.020.

24

Dumitrașcu, Gheorghe, Michel Feidt, and Ştefan Grigorean. "Finite Physical Dimensions Thermodynamics Analysis and Design of Closed Irreversible Cycles." Energies 14, no. 12 (June 9, 2021): 3416. http://dx.doi.org/10.3390/en14123416.

Abstract:
This paper develops simplified entropic models of irreversible closed cycles. The models connect the main external and internal operational parameters through finite physical dimensions. The external parameters are the mean temperatures of the external heat reservoirs, the heat-transfer thermal conductances, and the mean log temperature differences of the heat transfers. The internal parameters are the reference entropy of the cycle and the internal irreversibility number. The cycle design may use any of four possible operational constraints to determine the reference entropy. The internal irreversibility number allows the reversible heat output to be evaluated as a function of the reversible heat input, so the cycle's entropy balance equation can be used to design trigeneration cycles through external operational parameters alone. Designing a trigeneration system requires knowing the requirements of all consumers of the useful energies it delivers. The conclusions emphasize the complexity of designing and/or optimizing irreversible trigeneration systems.
25

Ye, Cheng, Richard C. Wilson, and Edwin R. Hancock. "Network analysis using entropy component analysis." Journal of Complex Networks 6, no. 3 (September 28, 2017): 404–29. http://dx.doi.org/10.1093/comnet/cnx045.

26

Lindenstrauss, Elon. "Lowering topological entropy." Journal d'Analyse Mathématique 67, no. 1 (December 1995): 231–67. http://dx.doi.org/10.1007/bf02787792.

27

Zhu, Li, and Dongkui Ma. "Topological R-entropy and topological entropy of free semigroup actions." Journal of Mathematical Analysis and Applications 470, no. 2 (February 2019): 1056–69. http://dx.doi.org/10.1016/j.jmaa.2018.10.048.

28

Oprocha, Piotr, and Paweł Wilczyński. "Topological entropy for local processes." Journal of Differential Equations 249, no. 8 (October 2010): 1929–67. http://dx.doi.org/10.1016/j.jde.2010.06.022.

29

Downarowicz, Tomasz, and Jacek Serafin. "Universal Systems for Entropy Intervals." Journal of Dynamics and Differential Equations 29, no. 4 (February 1, 2016): 1411–22. http://dx.doi.org/10.1007/s10884-015-9516-0.

30

Bisch, Dietmar. "Entropy of groups and subfactors." Journal of Functional Analysis 103, no. 1 (January 1992): 190–208. http://dx.doi.org/10.1016/0022-1236(92)90141-5.

31

Corcino, Cristina B., and Roberto B. Corcino. "Three-Parameter Logarithm and Entropy." Journal of Function Spaces 2020 (November 20, 2020): 1–10. http://dx.doi.org/10.1155/2020/9791789.

Abstract:
A three-parameter logarithmic function is derived using the notion of q-analogue and an ansatz technique. The derived three-parameter logarithm is shown to be a generalization of the two-parameter logarithmic function of Schwämmle and Tsallis, as the latter is the limiting function of the former as the added parameter goes to 1. The inverse of the three-parameter logarithm and other important properties are also proved. A three-parameter entropic function is then defined and is shown to be analytic and hence Lesche-stable, concave, and convex in some ranges of the parameters.
32

Hiai, F., and D. Petz. "Entropy Densities for Algebraic States." Journal of Functional Analysis 125, no. 1 (October 1994): 287–308. http://dx.doi.org/10.1006/jfan.1994.1125.

33

Jaque, Nelda, and Bernado San Martín. "Topological entropy for discontinuous semiflows." Journal of Differential Equations 266, no. 6 (March 2019): 3580–600. http://dx.doi.org/10.1016/j.jde.2018.09.013.

34

Remacle, F., T. G. Graeber, and R. D. Levine. "Whose Entropy: A Maximal Entropy Analysis of Phosphorylation Signaling." Journal of Statistical Physics 144, no. 2 (May 13, 2011): 429–42. http://dx.doi.org/10.1007/s10955-011-0215-x.

35

Bakhtin, V. I., and A. V. Lebedev. "Entropy statistic theorem and variational principle for t-entropy are equivalent." Journal of Mathematical Analysis and Applications 474, no. 1 (June 2019): 59–71. http://dx.doi.org/10.1016/j.jmaa.2019.01.032.

36

Zhao, Lina, Chengyu Liu, Shoushui Wei, Qin Shen, Fan Zhou, and Jianqing Li. "A New Entropy-Based Atrial Fibrillation Detection Method for Scanning Wearable ECG Recordings." Entropy 20, no. 12 (November 26, 2018): 904. http://dx.doi.org/10.3390/e20120904.

Abstract:
Entropy-based atrial fibrillation (AF) detectors have been applied for short-term electrocardiogram (ECG) analysis. However, existing methods suffer from several limitations. To enhance the performance of entropy-based AF detectors, we have developed a new entropy measure, named EntropyAF, which includes the following improvements: (1) use of a ranged function rather than the Chebyshev function to define vector distance, (2) use of a fuzzy function to determine vector similarity, (3) replacement of the probability estimation with density estimation for entropy calculation, (4) use of a flexible distance threshold parameter, and (5) adjustment of the entropy results for the heart rate effect. EntropyAF was trained using the MIT-BIH Atrial Fibrillation (AF) database and tested on clinical wearable long-term AF recordings. Three previous entropy-based AF detectors were used for comparison: sample entropy (SampEn), fuzzy measure entropy (FuzzyMEn) and coefficient of sample entropy (COSEn). For classifying AF and non-AF rhythms in the MIT-BIH AF database, EntropyAF achieved the highest area under the receiver operating characteristic curve (AUC) value, 98.15%, when using a 30-beat time window, higher than COSEn with an AUC of 91.86%. SampEn and FuzzyMEn resulted in much lower AUCs of 74.68% and 79.24%, respectively. For classifying AF and non-AF rhythms in the clinical wearable AF database, EntropyAF also generated the largest values of the Youden index (77.94%), sensitivity (92.77%), specificity (85.17%), accuracy (87.10%), positive predictivity (68.09%) and negative predictivity (97.18%). COSEn had the second-best accuracy of 78.63%, followed by accuracies of 65.08% for FuzzyMEn and 59.91% for SampEn. EntropyAF also generated the highest classification accuracy when using a 12-beat time window, and the results of a time cost analysis verified its efficiency. This study showed the better discrimination ability of EntropyAF for identifying AF, indicating that it would be useful for practical clinical wearable AF scanning.
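Improvement (2) above, replacing a hard tolerance threshold with a fuzzy similarity function, can be sketched as follows. This is an illustrative membership function in the spirit of fuzzy entropy measures, not the exact definition used in EntropyAF:

```python
import math

def hard_match(d, r):
    """Classic SampEn-style similarity: 1 if within tolerance r, else 0."""
    return 1.0 if d <= r else 0.0

def fuzzy_match(d, r, n=2.0):
    """Fuzzy similarity: the degree of match decays smoothly with distance d."""
    return math.exp(-(d / r) ** n)

# Near the tolerance boundary, the hard rule flips abruptly,
# while the fuzzy degree changes gradually.
for d in (0.05, 0.19, 0.21):
    print(hard_match(d, 0.2), round(fuzzy_match(d, 0.2), 3))
```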
37

Dumitrascu, Gheorghe, Michel Feidt, and Stefan Grigorean. "Closed Irreversible Cycles Analysis Based on Finite Physical Dimensions Thermodynamics." Proceedings 58, no. 1 (September 11, 2020): 37. http://dx.doi.org/10.3390/wef-06905.

Abstract:
The paper develops generalized entropic models of irreversible closed cycles. Mathematical models of irreversible engines (basic, with internal heat regeneration, cogeneration units) and of refrigeration cycles were applied to four possible operating irreversible trigeneration cycles. The models involve the reference entropy, the internal irreversibility number, the thermal conductance inventory, the proper temperatures of the external heat reservoirs (unifying the first law of thermodynamics and the linear heat transfer law), the mean log temperature differences, and four possible operational constraints: constant heat input, constant power, constant energy efficiency, and constant reference entropy. The reference entropy is always the entropy variation rate of the working fluid during the reversible heat input process. The internal irreversibility number allows the evaluation of the heat output via the ratio of the overall internal irreversible entropy generation to the reference entropy. The operational constraints allow the reference entropy to be replaced by a function of the finite physical dimension parameters, i.e., the mean log temperature differences, the thermal conductance inventory, and the proper external heat reservoir temperatures. The paper first presents the internal irreversibility number and the energy efficiency equations for engine and refrigeration cycles; in the limit of endoreversibility, the endoreversible energy efficiency equation is recovered. The second part develops the relations between the imposed operational constraint and the finite physical dimension parameters for the basic irreversible cycle. The third part applies the mathematical models to four possible standalone trigeneration cycles, assuming that there are consumers for all the useful heat delivered by the trigeneration system. The design of a trigeneration system must fix the ratio of refrigeration rate to power, e.g., engine shaft power or useful power delivered directly to power consumers. The final discussion and conclusions emphasize the novelty and complexity of designing and/or optimizing interconnected irreversible trigeneration systems.
38

Kan, Jeff W. T., Zafer Bilda, and John S. Gero. "Comparing entropy measures of idea links in design protocols: Linkography entropy measurement and analysis of differently conditioned design sessions." Artificial Intelligence for Engineering Design, Analysis and Manufacturing 21, no. 4 (September 19, 2007): 367–77. http://dx.doi.org/10.1017/s0890060407000339.

Abstract:
This paper explores using Shannon's entropy of information to measure linkographs of 12 design sessions that involved six architects in two different experimental conditions. The aim is to find a quantitative tool for interpreting the linkographs. This study examines whether the differences in the design processes and the design outcomes are reflected in the entropic interpretations. The results show that the overall entropy of one design condition is slightly higher than the other. Further, there are indications that the change of entropy might reflect design outcomes.
39

Cai, Zhenning, Jingwei Hu, Yang Kuang, and Bo Lin. "An Entropic Method for Discrete Systems with Gibbs Entropy." SIAM Journal on Numerical Analysis 60, no. 4 (August 2022): 2345–71. http://dx.doi.org/10.1137/21m1429023.

40

Dai, F., A. Prymak, A. Shadrin, V. Temlyakov, and S. Tikhonov. "Entropy numbers and Marcinkiewicz-type discretization." Journal of Functional Analysis 281, no. 6 (September 2021): 109090. http://dx.doi.org/10.1016/j.jfa.2021.109090.

41

Ma, Zilu, and Yongjia Zhang. "Perelman's entropy on ancient Ricci flows." Journal of Functional Analysis 281, no. 9 (November 2021): 109195. http://dx.doi.org/10.1016/j.jfa.2021.109195.

42

Pop, Ciprian, and Roger R. Smith. "Crossed products and entropy of automorphisms." Journal of Functional Analysis 206, no. 1 (January 2004): 210–32. http://dx.doi.org/10.1016/s0022-1236(03)00082-x.

43

Liu, Kairan, Yixiao Qiao, and Leiye Xu. "Topological entropy of nonautonomous dynamical systems." Journal of Differential Equations 268, no. 9 (April 2020): 5353–65. http://dx.doi.org/10.1016/j.jde.2019.11.029.

44

Richman, Joshua S., and J. Randall Moorman. "Physiological time-series analysis using approximate entropy and sample entropy." American Journal of Physiology-Heart and Circulatory Physiology 278, no. 6 (June 1, 2000): H2039–H2049. http://dx.doi.org/10.1152/ajpheart.2000.278.6.h2039.

Abstract:
Entropy, as it relates to dynamical systems, is the rate of information production. Methods for estimation of the entropy of a system represented by a time series are not, however, well suited to analysis of the short and noisy data sets encountered in cardiovascular and other biological studies. Pincus introduced approximate entropy (ApEn), a set of measures of system complexity closely related to entropy, which is easily applied to clinical cardiovascular and other time series. ApEn statistics, however, lead to inconsistent results. We have developed a new and related complexity measure, sample entropy (SampEn), and have compared ApEn and SampEn by using them to analyze sets of random numbers with known probabilistic character. We have also evaluated cross-ApEn and cross-SampEn, which use cardiovascular data sets to measure the similarity of two distinct time series. SampEn agreed with theory much more closely than ApEn over a broad range of conditions. The improved accuracy of SampEn statistics should make them useful in the study of experimental clinical cardiovascular and other biological time series.
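A minimal sample entropy estimator can be sketched from the definition in this abstract. Template length m and tolerance r are the usual parameters; in practice r is typically set to a fraction (often 0.2) of the series' standard deviation, while here it is taken directly in the units of the data:

```python
import math

def sampen(x, m=2, r=0.2):
    """Sample entropy: -ln(A/B) = ln(B/A), where B counts pairs of
    length-m templates within Chebyshev tolerance r, and A counts the
    same pairs extended to length m+1. Unlike ApEn, self-matches are
    excluded by construction (only pairs with i < j are counted)."""
    n = len(x)

    def matches(length):
        # Use n - m templates for both lengths, as in Richman & Moorman.
        templates = [x[i:i + length] for i in range(n - m)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    count += 1
        return count

    b, a = matches(m), matches(m + 1)
    return math.log(b / a) if a > 0 and b > 0 else float("inf")

# A strictly periodic series is perfectly predictable: SampEn = 0.
print(sampen([0, 1] * 10))  # 0.0
```

A monotone ramp such as `list(range(20))` has no template matches within a small tolerance, so the estimator returns infinity, illustrating why short or pathological records need care.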
45

Upadhyaya, Prajna, and Tohru Yagi. "Using Entropies for the Analysis of Brain Rhythms." International Journal of Signal Processing Systems 8, no. 3 (September 2020): 54–58. http://dx.doi.org/10.18178/ijsps.8.3.54-58.

Abstract:
Epilepsy is a chronic neurological disorder characterized by seizures. It involves abnormal discharging of neurons that affects either a smaller section of the brain, referred to as partial epilepsy, or a larger section of the brain, resulting in generalized epilepsy. Sometimes these abnormal activities spread from a smaller section to a larger section of the brain, resulting in secondary generalized epilepsy. Hence, it is important to detect and control epileptic seizures at an early stage. In this work, we design a system that classifies interictal (between seizures) and ictal (after seizure onset) signals by extracting subtle information from the EEG rhythms: gamma, beta, alpha, theta and delta. The system also aims to determine the sensitivity of these EEG rhythms to epileptic seizures. In this research, we have used entropy methods, namely Shannon entropy, approximate entropy and sample entropy, to extract the subtle information from the EEG rhythms. Classifiers, namely k-nearest neighbor, support vector machine and linear discriminant analysis, are utilized to distinguish interictal and ictal signals with classification accuracies of 94%, 95.5% and 97.5%, respectively.
46

Lopes, António, and J. Tenreiro Machado. "Entropy Analysis of Soccer Dynamics." Entropy 21, no. 2 (February 16, 2019): 187. http://dx.doi.org/10.3390/e21020187.

Abstract:
This paper adopts the information and fractional calculus tools for studying the dynamics of a national soccer league. A soccer league season is treated as a complex system (CS) with a state observable at discrete time instants, that is, at the time of rounds. The CS state, consisting of the goals scored by the teams, is processed by means of different tools, namely entropy, mutual information and Jensen–Shannon divergence. The CS behavior is visualized in 3-D maps generated by multidimensional scaling. The points on the maps represent rounds and their relative positioning allows for a direct interpretation of the results.
47

Sparavigna, Amelia Carolina. "Entropy in Image Analysis II." Entropy 22, no. 8 (August 15, 2020): 898. http://dx.doi.org/10.3390/e22080898.

48

Li Peng, Liu Cheng-Yu, Li Li-Ping, Ji Li-Zhen, Yu Shou-Yuan, and Liu Chang-Chun. "Multiscale multivariate fuzzy entropy analysis." Acta Physica Sinica 62, no. 12 (2013): 120512. http://dx.doi.org/10.7498/aps.62.120512.

49

Sparavigna, Amelia Carolina. "Entropy in Image Analysis III." Entropy 23, no. 12 (December 8, 2021): 1648. http://dx.doi.org/10.3390/e23121648.

Abstract:
Image analysis basically refers to any extraction of information from images, which can be as simple as QR codes required in logistics and digital certifications or related to large and complex datasets, such as the collections of images used for biometric identification or the sets of satellite surveys employed in the monitoring of Earth’s climate changes [...]
50

Musicus, B., and R. Johnson. "Multichannel relative-entropy spectrum analysis." IEEE Transactions on Acoustics, Speech, and Signal Processing 34, no. 3 (June 1986): 554–64. http://dx.doi.org/10.1109/tassp.1986.1164855.
