Academic literature on the topic 'Entropy source'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Entropy source.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Entropy source"

1

Flood, Matthew W., and Bernd Grimm. "EntropyHub: An open-source toolkit for entropic time series analysis." PLOS ONE 16, no. 11 (2021): e0259448. http://dx.doi.org/10.1371/journal.pone.0259448.

Abstract:
An increasing number of studies across many research fields from biomedical engineering to finance are employing measures of entropy to quantify the regularity, variability or randomness of time series and image data. Entropy, as it relates to information theory and dynamical systems theory, can be estimated in many ways, with newly developed methods being continuously introduced in the scientific literature. Despite the growing interest in entropic time series and image analysis, there is a shortage of validated, open-source software tools that enable researchers to apply these methods. To date, packages for performing entropy analysis are often run using graphical user interfaces, lack the necessary supporting documentation, or do not include functions for more advanced entropy methods, such as cross-entropy, multiscale cross-entropy or bidimensional entropy. In light of this, this paper introduces EntropyHub, an open-source toolkit for performing entropic time series analysis in MATLAB, Python and Julia. EntropyHub (version 0.1) provides an extensive range of more than forty functions for estimating cross-, multiscale, multiscale cross-, and bidimensional entropy, each including a number of keyword arguments that allow the user to specify multiple parameters in the entropy calculation. Instructions for installation, descriptions of function syntax, and examples of use are fully detailed in the supporting documentation, available on the EntropyHub website, www.EntropyHub.xyz. Compatible with Windows, Mac and Linux operating systems, EntropyHub is hosted on GitHub, as well as on the native package repositories for MATLAB, Python and Julia. The goal of EntropyHub is to integrate the many established entropy methods into one complete resource, providing tools that make advanced entropic time series analysis straightforward and reproducible.
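As a taste of what the toolkit's interface looks like, here is a minimal sketch of estimating sample entropy with the Python package; it assumes the SampEn function and keyword arguments documented at www.EntropyHub.xyz behave as described there.

# Minimal sketch: sample entropy of a noisy sinusoid with EntropyHub (Python).
# Assumes the SampEn interface documented at www.EntropyHub.xyz.
import numpy as np
import EntropyHub as EH

rng = np.random.default_rng(0)
signal = np.sin(np.linspace(0, 8 * np.pi, 1000)) + 0.1 * rng.standard_normal(1000)

# SampEn returns entropy estimates for embedding dimensions 0..m, plus the
# template-match counts A and B from which the estimates are formed.
Samp, A, B = EH.SampEn(signal, m=2, r=0.2 * np.std(signal))
print(f"Sample entropy (m=2): {Samp[-1]:.4f}")
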
2

Guyader, A., E. Fabre, C. Guillemot, and M. Robert. "Joint source-channel turbo decoding of entropy-coded sources." IEEE Journal on Selected Areas in Communications 19, no. 9 (2001): 1680–96. http://dx.doi.org/10.1109/49.947033.

3

Chapeau-Blondeau, F., A. Delahaies, and D. Rousseau. "Source coding with Tsallis entropy." Electronics Letters 47, no. 3 (2011): 187. http://dx.doi.org/10.1049/el.2010.2792.

4

Borysenko, Oleksiy. "On Combinatorial Entropy of Sources of Binary Information." Grail of Science, no. 20 (October 6, 2022): 90–96. http://dx.doi.org/10.36074/grail-of-science.30.09.2022.017.

Abstract:
The paper considers the question of converting the entropy of a probabilistic information source generating binary sequences into the combinatorial forms of its two constituent sources. One of them is basic, and the second has a conditional entropy with respect to the first. It is shown that together such sources make it possible to optimally encode information using numbering, at the efficiency level of known methods of optimal coding. In terms of their functions, they are universal and can solve all the problems that are solved by the methods of coding probabilistic sources. Unlike those methods, however, they either do not require statistical tests before optimal coding or require them to a much lesser extent. They also simplify the calculation of the entropy of a probabilistic source of information, which is useful in the many problems where the value of the source entropy must be known, for example, in error-correcting coding.
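To make the idea of coding by numbering concrete, the sketch below illustrates enumerative coding, the classic combinatorial scheme of this kind: a binary block with k ones is encoded by its rank among the C(n, k) blocks of the same weight, costing about log2 C(n, k) ≈ n·H(k/n) bits. This is an illustrative reading, not code from the paper.

# Illustrative sketch of enumerative ("numbering") coding of a binary block:
# rank a length-n sequence with k ones among all C(n, k) such sequences.
from math import comb, log2

def enumerative_rank(bits):
    """Lexicographic rank of `bits` among sequences with the same weight."""
    rank, k = 0, sum(bits)
    for i, b in enumerate(bits):
        n_rest = len(bits) - i - 1
        if b == 1:
            # Count the sequences that put a 0 here and place all k remaining
            # ones in the positions after this one.
            rank += comb(n_rest, k)
            k -= 1
    return rank

bits = [0, 1, 1, 0, 1, 0, 0, 1]
n, k = len(bits), sum(bits)
print(enumerative_rank(bits), "of", comb(n, k), "same-weight sequences")
print(f"index cost ~ {log2(comb(n, k)):.2f} bits for {n} symbols")
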
5

Bercher, J. F. "Comment: Source coding with Tsallis entropy." Electronics Letters 47, no. 10 (2011): 597. http://dx.doi.org/10.1049/el.2011.0611.

6

Răstoceanu, Florin, Răzvan Rughiniș, Ștefan-Dan Ciocîrlan, and Mihai Enache. "Sensor-Based Entropy Source Analysis and Validation for Use in IoT Environments." Electronics 10, no. 10 (2021): 1173. http://dx.doi.org/10.3390/electronics10101173.

Abstract:
The IoT market has grown significantly in recent years, and it is estimated that it will continue to do so. For this reason, the need to identify new solutions to ensure security is vital for the future development of this field. Inadequate sources of entropy are one of the factors that negatively influence security. In this study, inspired by NIST's latest entropy estimation recommendations, we propose a methodology for analyzing and validating a sensor-based entropy source, highlighted by an innovative experiment design. Moreover, the proposed solution is analyzed in terms of resistance to multiple types of attacks. Following an analysis of the influence of sensor characteristics and settings on the entropy rate, we obtain a maximum entropy value of 0.63 per bit, and a throughput of 3.12 Kb/s, even when no motion is applied to the sensors. Our results show that a stable and resistant entropy source can be built based on the data obtained from the sensors. Our assessment of the proposed entropy source also achieves a higher complexity than previous studies, in terms of the variety of situations addressed and the types of experiments performed.
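For context on the NIST recommendations the study builds on, the sketch below implements the "most common value" estimate from NIST SP 800-90B, the simplest of the standard's min-entropy estimators. It illustrates the flavor of such validation, not the authors' specific pipeline.

# Sketch of the NIST SP 800-90B "most common value" min-entropy estimate:
# an upper confidence bound on the most frequent symbol caps the entropy.
from collections import Counter
from math import log2, sqrt
import random

def mcv_min_entropy(samples):
    n = len(samples)
    p_hat = Counter(samples).most_common(1)[0][1] / n
    # 99% upper confidence bound on the most-common-value probability.
    p_u = min(1.0, p_hat + 2.576 * sqrt(p_hat * (1 - p_hat) / (n - 1)))
    return -log2(p_u)

random.seed(1)
bits = [random.getrandbits(1) for _ in range(10000)]
print(f"min-entropy ~ {mcv_min_entropy(bits):.3f} bits per sample")
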
7

Zhou, Li, and Yang Liu. "Optimization of Horizontal Plate Fin Heat Sink in Natural Convection for Electronics Cooling by Simulated Annealing Algorithm." Advanced Materials Research 1022 (August 2014): 91–95. http://dx.doi.org/10.4028/www.scientific.net/amr.1022.91.

Abstract:
In this study, the simulated annealing (SA) algorithm was adopted to optimize the geometry of a horizontal plate fin heat sink by the extreme entransy dissipation principle. The calculation of the entransy dissipation rate is presented in detail. Using the entransy dissipation rate as the objective condition, the geometry optimization of the fin heat sink was conducted. To verify the results, the heat source temperature and the entropy generation rate were also calculated in the procedure. It is found that the entransy dissipation rate, entropy generation and heat source temperature show similar trends. The extreme entransy dissipation principle and minimization of entropy generation play similar roles in the geometry optimization of a plate fin heat sink.
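The objective here is domain-specific, but the optimizer is the standard simulated annealing loop; below is a generic sketch of that loop, with a hypothetical toy objective standing in for the entransy dissipation rate.

# Generic simulated annealing skeleton of the kind the paper applies;
# `obj` is a hypothetical stand-in for the entransy dissipation rate.
import math, random

def anneal(objective, x0, step, t0=1.0, cooling=0.95, iters=2000):
    x, fx = x0, objective(x0)
    best, fbest, t = x, fx, t0
    for _ in range(iters):
        cand = [xi + random.uniform(-step, step) for xi in x]
        fc = objective(cand)
        # Always accept improvements; accept worse moves with Boltzmann probability.
        if fc < fx or random.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling
    return best, fbest

random.seed(0)
# Toy objective: minimum at (fin height, fin spacing) = (3, 1).
obj = lambda v: (v[0] - 3.0) ** 2 + (v[1] - 1.0) ** 2
print(anneal(obj, [0.0, 0.0], step=0.5))
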
8

Contreras Rodríguez, Lianet, Evaristo José Madarro-Capó, Carlos Miguel Legón-Pérez, Omar Rojas, and Guillermo Sosa-Gómez. "Selecting an Effective Entropy Estimator for Short Sequences of Bits and Bytes with Maximum Entropy." Entropy 23, no. 5 (2021): 561. http://dx.doi.org/10.3390/e23050561.

Abstract:
Entropy makes it possible to measure the uncertainty about an information source from the distribution of its output symbols. It is known that the maximum Shannon entropy of a discrete source of information is reached when its symbols follow a uniform distribution. In cryptography, these sources have great applications since they allow for the highest security standards to be reached. In this work, the most effective estimator is selected to estimate entropy in short samples of bytes and bits with maximum entropy. For this, 18 estimators were compared. Results concerning the comparisons published in the literature between these estimators are discussed. The most suitable estimator is determined experimentally, based on its bias and mean square error for short samples of bytes and bits.
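As background for what such a comparison involves, the sketch below contrasts the naive plug-in (maximum likelihood) Shannon entropy estimator with its Miller-Madow bias-corrected variant on a short byte sample. Both are standard estimators of the kind typically included in these comparisons; the paper's own shortlist is not reproduced here.

# Plug-in (maximum likelihood) Shannon entropy estimate vs. the
# Miller-Madow bias correction, on a short sample of bytes.
from collections import Counter
from math import log2, log
import random

def plugin_entropy(samples):
    n = len(samples)
    return -sum(c / n * log2(c / n) for c in Counter(samples).values())

def miller_madow(samples):
    # Adds (m - 1) / (2n) nats, converted to bits, for m observed symbols.
    m, n = len(set(samples)), len(samples)
    return plugin_entropy(samples) + (m - 1) / (2 * n * log(2))

random.seed(0)
short_sample = [random.randrange(256) for _ in range(200)]  # true entropy: 8 bits
print(f"plug-in:      {plugin_entropy(short_sample):.3f} bits")
print(f"Miller-Madow: {miller_madow(short_sample):.3f} bits")
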
9

Lopez-Sauceda, Juan, Philipp von Bülow, Carlos Ortega-Laurel, Francisco Perez-Martinez, Kalina Miranda-Perkins, and José Gerardo Carrillo González. "Entropy as a Geometrical Source of Information in Biological Organizations." Entropy 24, no. 10 (2022): 1390. http://dx.doi.org/10.3390/e24101390.

Abstract:
Considering both biological and non-biological polygonal shape organizations, in this paper we introduce a quantitative method which is able to determine informational entropy as spatial differences between heterogeneity of internal areas from simulation and experimental samples. According to these data (i.e., heterogeneity), we are able to establish levels of informational entropy using statistical insights of spatial orders using discrete and continuous values. Given a particular state of entropy, we establish levels of information as a novel approach which can unveil general principles of biological organization. Thirty-five geometric aggregates are tested (biological, non-biological, and polygonal simulations) in order to obtain the theoretical and experimental results of their spatial heterogeneity. Geometrical aggregates (meshes) include a spectrum of organizations ranging from cell meshes to ecological patterns. Experimental results for discrete entropy using a bin width of 0.5 show that a particular range of informational entropy (0.08 to 0.27 bits) is intrinsically associated with low rates of heterogeneity, which indicates a high degree of uncertainty in finding non-homogeneous configurations. In contrast, differential entropy (continuous) results reflect negative entropy within a particular range (−0.4 to −0.9) for all bin widths. We conclude that the differential entropy of geometrical organizations is an important source of neglected information in biological systems.
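The abstract's contrast between discrete (binned) and differential entropy rests on the standard histogram relation H_diff ≈ H_disc + log2(Δ) for bin width Δ, which is how a continuous entropy can come out negative while the binned one cannot. The following sketch illustrates that relation on synthetic data; it is not the paper's pipeline.

# Histogram-based discrete entropy vs. the differential-entropy estimate
# H_diff ~ H_disc + log2(bin_width); shows how the continuous quantity can
# go negative while the binned entropy stays non-negative.
import numpy as np

def binned_entropies(samples, bin_width=0.05):
    lo, hi = samples.min(), samples.max()
    edges = np.arange(lo, hi + bin_width, bin_width)
    counts, _ = np.histogram(samples, bins=edges)
    p = counts[counts > 0] / counts.sum()
    h_disc = -np.sum(p * np.log2(p))       # discrete entropy, bits
    h_diff = h_disc + np.log2(bin_width)   # differential estimate, bits
    return h_disc, h_diff

rng = np.random.default_rng(0)
data = rng.normal(0.0, 0.15, size=5000)    # narrow spread -> negative H_diff
h_disc, h_diff = binned_entropies(data)
print(f"discrete: {h_disc:.2f} bits, differential: {h_diff:.2f} bits")
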
10

Silva, João B., Suzan S. Vasconcelos, and Valeria C. Barbosa. "Apparent-magnetization mapping using entropic regularization." Geophysics 75, no. 2 (2010): L39–L50. http://dx.doi.org/10.1190/1.3358160.

Abstract:
A new apparent-magnetization mapping method on the horizontal plane combines minimization of the first-order entropy with maximization of the zeroth-order entropy of the estimated magnetization. The interpretation model is a grid of vertical prisms juxtaposed in both horizontal directions. To estimate the magnetization of the prisms, the tops and bottoms of the magnetic sources are assumed to be horizontal. Minimization of the first-order entropy favors solutions with sharp borders, and maximization of the zeroth-order entropy prevents the tendency of the estimated source to become a single prism with large magnetization. Thus, a judicious combination of both constraints can lead to solutions characterized by regions with virtually constant magnetizations separated by sharp discontinuities. The method is applied to synthetic data from simulated intrusive bodies in sediments that have horizontal tops. By comparing the results with those obtained with the common Tikhonov regularization (smoothness constraint) method, it is shown that both methods produce good and equivalent locations of the central positions of the sources. However, entropic regularization delineates the boundaries of the bodies in greater detail. Both the proposed and the smoothness constraints are applied to real anomaly data over a magnetic skarn in Butte Valley, Nevada, U.S.A. Entropic regularization produced an estimated magnetization distribution with sharper boundaries, smaller volume, and higher apparent magnetization compared with results produced by incorporating the smoothness constraint.
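For intuition, one common formulation (a sketch under assumptions, not necessarily the authors' exact definitions) takes the zeroth-order entropy over the normalized parameter values and the first-order entropy over the normalized differences between neighboring parameters, so that minimizing the latter rewards blocky, sharply bounded solutions:

# Sketch of zeroth- and first-order entropy measures as commonly used in
# entropic regularization; not necessarily the authors' exact definitions.
import numpy as np

def shannon_measure(v, eps=1e-12):
    s = np.abs(v) + eps
    p = s / s.sum()
    return -np.sum(p * np.log(p))

def q0(m):
    return shannon_measure(m)           # maximized: spreads the magnetization

def q1(m):
    return shannon_measure(np.diff(m))  # minimized: favors sharp boundaries

# A blocky model concentrates its differences in a few sharp jumps,
# so its first-order entropy is lower than that of a smooth model.
blocky = np.array([0, 0, 0, 5, 5, 5, 0, 0, 0], dtype=float)
smooth = np.array([0, 1, 2, 3, 5, 3, 2, 1, 0], dtype=float)
print(f"Q1 blocky: {q1(blocky):.3f}  Q1 smooth: {q1(smooth):.3f}")
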