
Journal articles on the topic 'Entropy maximization'


Consult the top 50 journal articles for your research on the topic 'Entropy maximization.'


1

Athreya, K. B. "Entropy maximization." Proceedings - Mathematical Sciences 119, no. 4 (September 2009): 531–39. http://dx.doi.org/10.1007/s12044-009-0049-5.

2

Censor, Yair, and Joseph Segman. "On Block-Iterative Entropy Maximization." Journal of Information and Optimization Sciences 8, no. 3 (September 1987): 275–91. http://dx.doi.org/10.1080/02522667.1987.10698894.

3

Martínez, S., F. Nicolás, F. Pennini, and A. Plastino. "Tsallis’ entropy maximization procedure revisited." Physica A: Statistical Mechanics and its Applications 286, no. 3-4 (November 2000): 489–502. http://dx.doi.org/10.1016/s0378-4371(00)00359-9.

4

Janečka, Adam, and Michal Pavelka. "Gradient Dynamics and Entropy Production Maximization." Journal of Non-Equilibrium Thermodynamics 43, no. 1 (January 26, 2018): 1–19. http://dx.doi.org/10.1515/jnet-2017-0005.

Abstract:
We compare two methods for modeling dissipative processes, namely gradient dynamics and entropy production maximization. Both methods require similar physical inputs: how energy (or entropy) is stored and how it is dissipated. Gradient dynamics describes irreversible evolution by means of a dissipation potential and entropy; it automatically satisfies Onsager reciprocal relations as well as their nonlinear generalization (Maxwell–Onsager relations), and it has a statistical interpretation. Entropy production maximization is based on knowledge of free energy (or another thermodynamic potential) and entropy production. It also leads to the linear Onsager reciprocal relations and has proven successful in the thermodynamics of complex materials. Both methods are thermodynamically sound as they ensure approach to equilibrium; we compare them and discuss their advantages and shortcomings. In particular, conditions under which the two approaches coincide and provide the same constitutive relations are identified. Finally, a commonly used but rarely mentioned step in entropy production maximization is pinpointed, and the condition of incompressibility is incorporated into gradient dynamics.
5

Ratnayake, L. L. "Intercity auto trip estimation for Sri Lanka using entropy maximization." Canadian Journal of Civil Engineering 16, no. 2 (April 1, 1989): 200–201. http://dx.doi.org/10.1139/l89-036.

Abstract:
Recently, techniques such as calibration of demand models and entropy maximization have been used to estimate origin–destination (O–D) matrices. To the author's knowledge, there has been no reported work on an O–D estimation for a nationwide network. This paper describes a methodology to estimate intercity auto traffic for Sri Lanka using entropy maximization. Some of these results are then compared with the values obtained from a known demand model and the actual data. Key words: O–D matrices, entropy maximization, Sri Lanka, link volume, intercity.
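Entropy-maximizing O–D estimation of this doubly constrained kind is typically solved by balancing a gravity-type model with iterative proportional fitting. A minimal sketch under assumed data (a hypothetical three-zone example, not the paper's Sri Lanka network; `beta` is an assumed deterrence parameter):

```python
import numpy as np

def entropy_trip_matrix(origins, destinations, cost, beta, iters=100):
    """Doubly constrained entropy-maximizing trip distribution:
    T_ij is proportional to exp(-beta * c_ij), balanced so row sums match
    origin totals and column sums match destination totals (iterative
    proportional fitting).  Origin and destination totals must agree."""
    f = np.exp(-beta * cost)          # deterrence matrix
    u = np.ones_like(origins)
    v = np.ones_like(destinations)
    for _ in range(iters):
        u = origins / (f @ v)         # enforce row (origin) totals
        v = destinations / (u @ f)    # enforce column (destination) totals
    return np.outer(u, v) * f

# hypothetical 3-zone data
O = np.array([200.0, 300.0, 100.0])
D = np.array([150.0, 250.0, 200.0])
c = np.array([[1.0, 2.0, 3.0],
              [2.0, 1.0, 2.0],
              [3.0, 2.0, 1.0]])
T = entropy_trip_matrix(O, D, c, beta=0.5)
```

The balancing factors `u` and `v` correspond to the Lagrange multipliers of the row and column constraints in the entropy-maximization problem.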
6

Ré, Christopher, and D. Suciu. "Understanding cardinality estimation using entropy maximization." ACM Transactions on Database Systems 37, no. 1 (February 2012): 1–31. http://dx.doi.org/10.1145/2109196.2109202.

7

Miller, Gad, and David Horn. "Probability Density Estimation Using Entropy Maximization." Neural Computation 10, no. 7 (October 1, 1998): 1925–38. http://dx.doi.org/10.1162/089976698300017205.

Abstract:
We propose a method for estimating probability density functions and conditional density functions by training on data produced by such distributions. The algorithm employs new stochastic variables that amount to coding of the input, using a principle of entropy maximization. It is shown to be closely related to the maximum likelihood approach. The encoding step of the algorithm provides an estimate of the probability distribution. The decoding step serves as a generative model, producing an ensemble of data with the desired distribution. The algorithm is readily implemented by neural networks, using stochastic gradient ascent to achieve entropy maximization.
8

Haegeman, Bart, and Michel Loreau. "Limitations of entropy maximization in ecology." Oikos 117, no. 11 (October 28, 2008): 1700–1710. http://dx.doi.org/10.1111/j.1600-0706.2008.16539.x.

9

Rouge, Richard, and Nicole El Karoui. "Pricing Via Utility Maximization and Entropy." Mathematical Finance 10, no. 2 (April 2000): 259–76. http://dx.doi.org/10.1111/1467-9965.00093.

10

Zou, Jieping, and Greg Holloway. "Entropy maximization tendency in topographic turbulence." Journal of Fluid Mechanics 263 (March 25, 1994): 361–74. http://dx.doi.org/10.1017/s0022112094004155.

Abstract:
Numerical simulations of geostrophic turbulence above topography are used to compare (a) nonlinear generation of system entropy, S, (b) selective damping of enstrophy and (c) development of vorticity–topography correlation. In the damped cases, S initially increases, approaching a quasi-equilibrium (maximum S subject to the instantaneous, though decaying, energy and enstrophy). When strongly scale-selective damping is applied, onset of the vorticity–topography correlation follows the timescales for enstrophy decay. During the period of decay, it is shown that nonlinear interaction continues to generate S, offsetting in part the loss of S to explicit damping.
11

Davis, Sergio, and Diego González. "Hamiltonian formalism and path entropy maximization." Journal of Physics A: Mathematical and Theoretical 48, no. 42 (September 22, 2015): 425003. http://dx.doi.org/10.1088/1751-8113/48/42/425003.

12

Solé-Casals, Jordi, Karmele López-de-Ipiña Pena, and Cesar F. Caiafa. "Inverting Monotonic Nonlinearities by Entropy Maximization." PLOS ONE 11, no. 10 (October 25, 2016): e0165288. http://dx.doi.org/10.1371/journal.pone.0165288.

13

Fisch, Oscar. "On the Utility of Entropy Maximization." Geographical Analysis 9, no. 1 (September 3, 2010): 79–84. http://dx.doi.org/10.1111/j.1538-4632.1977.tb00562.x.

14

Hewings, Geoffrey J. D., and Esteban Fernandez-Vazquez. "Entropy maximization and input–output analysis." Interdisciplinary Science Reviews 44, no. 3-4 (October 2, 2019): 272–85. http://dx.doi.org/10.1080/03080188.2019.1670429.

15

Eliazar, Iddo. "From entropy-maximization to equality-maximization: Gauss, Laplace, Pareto, and Subbotin." Physica A: Statistical Mechanics and its Applications 415 (December 2014): 479–92. http://dx.doi.org/10.1016/j.physa.2014.08.011.

16

Tanaka, Hisa-Aki, Masaki Nakagawa, and Yasutada Oohama. "A Direct Link between Rényi–Tsallis Entropy and Hölder’s Inequality—Yet Another Proof of Rényi–Tsallis Entropy Maximization." Entropy 21, no. 6 (May 30, 2019): 549. http://dx.doi.org/10.3390/e21060549.

Abstract:
The well-known Hölder’s inequality has been recently utilized as an essential tool for solving several optimization problems. However, such an essential role of Hölder’s inequality does not seem to have been reported in the context of generalized entropy, including Rényi–Tsallis entropy. Here, we identify a direct link between Rényi–Tsallis entropy and Hölder’s inequality. Specifically, we demonstrate yet another elegant proof of the Rényi–Tsallis entropy maximization problem. Especially for the Tsallis entropy maximization problem, only with the equality condition of Hölder’s inequality is the q-Gaussian distribution uniquely specified and also proved to be optimal.
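For reference, the constrained maximization problem whose solution the authors prove optimal can be stated in standard textbook form (not quoted from the article): maximizing the Tsallis entropy subject to normalization and a (generalized) second-moment constraint yields the q-Gaussian,

```latex
S_q[p] = \frac{1 - \int p(x)^q \, dx}{q - 1},
\qquad
p^{*}(x) \propto \bigl[\, 1 - (1-q)\,\beta x^{2} \,\bigr]_{+}^{\frac{1}{1-q}}
\;\xrightarrow{\; q \to 1 \;}\; e^{-\beta x^{2}},
```

so the ordinary Gaussian is recovered in the limit q → 1.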
17

Collins, Douglas M. "Entropy Maximizations on Electron Density." Zeitschrift für Naturforschung A 48, no. 1-2 (February 1, 1993): 68–74. http://dx.doi.org/10.1515/zna-1993-1-218.

Abstract:
Incomplete and imperfect data characterize the problem of constructing electron density representations from experimental information. One fundamental concern is identification of the proper protocol for including new information at any stage of a density reconstruction. An axiomatic approach developed in other fields specifies entropy maximization as the desired protocol. In particular, if new data are used to modify a prior charge density distribution without adding extraneous prejudice, the new distribution must both agree with all the data, new and old, and be a function of maximum relative entropy. The functional form of relative entropy is s = -r ln(r/t), where r and t respectively refer to new and prior distributions normalized to a common scale. Entropy maximization has been used to deal with certain aspects of the phase problem of X-ray diffraction. Varying degrees of success have marked the work, which may be roughly assigned to categories as direct methods, data reduction and analysis, and image enhancement. Much of the work has been expressed in probabilistic language, although image enhancement has been somewhat more physical or geometric in description. Whatever the language, entropy maximization is a specific and deterministic functional manipulation. A recent advance has been the description of an algorithm which, quite deterministically, adjusts a prior positive charge density distribution to agree exactly with a specified subset of structure-factor moduli by a constrained entropy maximization. Entropy on an N-representable one-particle density matrix is well defined. The entropy takes the expected form, and it is a simple function of the one-matrix eigenvalues, which all must be non-negative. Relationships between the entropy functional and certain properties of a one-matrix are discussed, as well as a conjecture concerning the physical interpretation of entropy. Throughout this work reference is made to informational entropy, not the entropy of thermodynamics.
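The maximum-relative-entropy protocol described here has a standard closed form (a general result, not specific to this article): maximizing s subject to normalization and linear data constraints gives an exponentially tilted prior,

```latex
\max_{r}\; -\!\int r \ln\frac{r}{t}
\quad \text{s.t.} \quad
\int r = 1,\;\; \int f_k\, r = F_k
\;\;\Longrightarrow\;\;
r = \frac{t \, \exp\!\bigl(\textstyle\sum_k \lambda_k f_k\bigr)}{Z(\lambda)},
```

with the multipliers λ_k fixed by the data constraints.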
18

Gulko, Les. "The Entropic Market Hypothesis." International Journal of Theoretical and Applied Finance 2, no. 3 (July 1999): 293–329. http://dx.doi.org/10.1142/s0219024999000170.

Abstract:
Information theory teaches that entropy is the fundamental limit for data compression, and electrical engineers routinely use entropy as a criterion for efficient storage and transmission of information. Since modern financial theory teaches that competitive market prices store and transmit information with some efficiency, should financial economists be concerned with entropy? This paper presents a market model in which entropy emerges endogenously as a condition for the operational efficiency of price discovery while entropy maximization emerges as a condition for the informational efficiency of market prices. The maximum-entropy formalism makes the efficient market hypothesis operational and testable. This formalism is used to establish that entropic markets admit no arbitrage and support both the Ross arbitrage pricing theory and the Black–Scholes stock option pricing model.
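The maximum-entropy formalism described here amounts to choosing the least-informative pricing distribution consistent with observed constraints. A toy sketch of that step (the discrete state space and forward price are hypothetical, not the paper's model; the exponential form of the maximizer is the standard result):

```python
import math

def maxent_pricing_density(states, forward):
    """Maximum-entropy probabilities q_i over terminal states s_i subject to
    sum(q) = 1 and E_q[s] = forward.  The maximizer has the Gibbs form
    q_i proportional to exp(lam * s_i); find lam by bisection on the constraint."""
    def mean_for(lam):
        a = [lam * (s - forward) for s in states]   # shift exponents for stability
        m = max(a)
        w = [math.exp(x - m) for x in a]
        return sum(s * wi for s, wi in zip(states, w)) / sum(w)

    lo, hi = -50.0, 50.0   # assumes forward lies strictly inside the state range
    for _ in range(200):   # mean_for is increasing in lam, so bisection works
        mid = 0.5 * (lo + hi)
        if mean_for(mid) < forward:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    a = [lam * (s - forward) for s in states]
    m = max(a)
    w = [math.exp(x - m) for x in a]
    z = sum(w)
    return [wi / z for wi in w]

# hypothetical terminal prices and forward price
q = maxent_pricing_density([80.0, 90.0, 100.0, 110.0, 120.0], forward=103.0)
```

Adding more pricing constraints (e.g., option prices) would add more Lagrange multipliers of the same exponential form.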
19

Di Giorgio, Serena, and Paulo Mateus. "Efficiently Compressible Density Operators via Entropy Maximization." Proceedings 12, no. 1 (August 2, 2019): 39. http://dx.doi.org/10.3390/proceedings2019012039.

Abstract:
We address the problem of efficiently and effectively compressing density operators (DOs) by providing an efficient procedure for learning the most likely DO given a chosen set of partial information. We explore, in the context of quantum information theory, the generalisation of the maximum entropy estimator for DOs when the direct dependencies between the subsystems are provided. As a preliminary analysis, we restrict the problem to tripartite systems when two marginals are known. When the marginals are compatible with the existence of a quantum Markov chain (QMC), we show that there exists a recovery procedure for the maximum entropy estimator and, moreover, that for these states many well-known classical results follow. Furthermore, we notice that, contrary to the classical case, two marginals compatible with some tripartite state might not be compatible with a QMC. Finally, we provide a new characterisation of quantum conditional independence in light of maximum entropy updating. Throughout, all Hilbert spaces are considered finite dimensional.
20

Fernández-Pineda, C., and S. Velasco. "Entropy maximization in the free expansion process." European Journal of Physics 26, no. 4 (May 20, 2005): N13–N16. http://dx.doi.org/10.1088/0143-0807/26/4/n01.

21

Jose, Victor Richmond R., Robert F. Nau, and Robert L. Winkler. "Scoring Rules, Generalized Entropy, and Utility Maximization." Operations Research 56, no. 5 (October 2008): 1146–57. http://dx.doi.org/10.1287/opre.1070.0498.

22

Nakamura, T. K. "Relativistic equilibrium distribution by relative entropy maximization." EPL (Europhysics Letters) 88, no. 4 (November 1, 2009): 40009. http://dx.doi.org/10.1209/0295-5075/88/40009.

23

Siew, Lam Weng, Saiful Hafizah Jaaman, and Lam Weng Hoe. "Enhanced Index Tracking Model with Entropy Maximization." Advances and Applications in Statistics 53, no. 3 (September 18, 2018): 243–58. http://dx.doi.org/10.17654/as053030243.

24

Takagi, Fujio, and Tatsuo Tsukamoto. "Semi-inclusive rapidity distributions from entropy maximization." Physical Review D 38, no. 7 (October 1, 1988): 2288–90. http://dx.doi.org/10.1103/physrevd.38.2288.

25

Balakrishnan, N., and N. Sathyamurthy. "Maximization of entropy during a chemical reaction." Chemical Physics Letters 164, no. 2-3 (December 1989): 267–69. http://dx.doi.org/10.1016/0009-2614(89)85027-4.

26

Illner, Reinhard, and Helmut Neunzert. "Relative entropy maximization and directed diffusion equations." Mathematical Methods in the Applied Sciences 16, no. 8 (August 1993): 545–54. http://dx.doi.org/10.1002/mma.1670160803.

27

Silva, João B., Suzan S. Vasconcelos, and Valeria C. Barbosa. "Apparent-magnetization mapping using entropic regularization." GEOPHYSICS 75, no. 2 (March 2010): L39–L50. http://dx.doi.org/10.1190/1.3358160.

Abstract:
A new apparent-magnetization mapping method on the horizontal plane combines minimization of first-order entropy with maximization of zeroth-order entropy of the estimated magnetization. The interpretation model is a grid of vertical, juxtaposed prisms in both horizontal directions. To estimate the magnetization of the prisms, assume that the top and bottom of the magnetic sources are horizontal. Minimization of the first-order entropy favors solutions with sharp borders, and the maximization of zeroth-order entropy prevents the tendency of the estimated source to become a single prism with large magnetization. Thus, a judicious combination of both constraints can lead to solutions characterized by regions with virtually constant magnetizations separated by sharp discontinuities. This is applied to synthetic data from simulated intrusive bodies in sediments that have horizontal tops. By comparing the results with those obtained with the common Tikhonov regularization (smoothness constraint) method, it is shown that both methods produce good and equivalent locations of the central positions of the sources. However, entropic regularization delineates the boundaries of the bodies with greater detail. Both the proposed and the smoothness constraints are applied to real anomaly data over a magnetic skarn in Butte Valley, Nevada, U.S.A. Entropic regularization produced an estimated magnetization distribution with sharper boundaries, smaller volume, and higher apparent magnetization as compared with results produced by incorporating the smoothness constraint.
28

Yasuda, Makoto. "Deterministic Annealing Approach to Fuzzy C-Means Clustering Based on Entropy Maximization." Advances in Fuzzy Systems 2011 (2011): 1–9. http://dx.doi.org/10.1155/2011/960635.

Abstract:
This paper deals with a fuzzy clustering method that combines the deterministic annealing (DA) approach with an entropy, specifically the Shannon entropy or the Tsallis entropy. By maximizing the Shannon entropy, the fuzzy entropy, or the Tsallis entropy within the framework of the fuzzy c-means (FCM) method, membership functions similar to the statistical mechanical distribution functions are obtained. We examine characteristics of these entropy-based membership functions from the statistical mechanical point of view. Both the Shannon- and Tsallis-entropy-based FCMs are then formulated as DA clustering using the very fast annealing (VFA) method as a cooling schedule. Experimental results indicate that the Tsallis-entropy-based FCM is stable with very fast deterministic annealing and suitable for this annealing process.
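The Shannon-entropy-regularized FCM update can be sketched in a few lines: maximizing entropy at temperature T turns the memberships into Gibbs weights. This is a generic illustration (synthetic data; the geometric cooling schedule here stands in for the paper's very fast annealing, and all parameter values are assumptions):

```python
import numpy as np

def entropy_fcm(X, k, t_high=10.0, t_low=0.1, cool=0.8, iters=20, seed=0):
    """Shannon-entropy-regularized fuzzy c-means with deterministic annealing.
    Memberships take the Gibbs form u_ij proportional to exp(-||x_i - c_j||^2 / T),
    and the temperature T is lowered geometrically."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].copy()
    T = t_high
    while T > t_low:
        for _ in range(iters):
            # squared distances from every point to every center
            d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
            u = np.exp(-(d2 - d2.min(axis=1, keepdims=True)) / T)
            u /= u.sum(axis=1, keepdims=True)      # fuzzy memberships, rows sum to 1
            centers = (u.T @ X) / u.sum(axis=0)[:, None]
        T *= cool                                  # cooling step
    return u, centers

# two well-separated synthetic blobs
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, size=(50, 2)),
               rng.normal(5.0, 0.3, size=(50, 2))])
u, centers = entropy_fcm(X, k=2)
```

At high T the memberships are nearly uniform; as T falls they approach hard assignments, which is the annealing behavior the paper exploits.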
29

Hulle, Marc M. Van. "Joint Entropy Maximization in Kernel-Based Topographic Maps." Neural Computation 14, no. 8 (August 1, 2002): 1887–906. http://dx.doi.org/10.1162/089976602760128054.

Abstract:
A new learning algorithm for kernel-based topographic map formation is introduced. The kernel parameters are adjusted individually so as to maximize the joint entropy of the kernel outputs. This is done by maximizing the differential entropies of the individual kernel outputs, given that the map's output redundancy, due to the kernel overlap, needs to be minimized. The latter is achieved by minimizing the mutual information between the kernel outputs. As a kernel, the (radial) incomplete gamma distribution is taken since, for a gaussian input density, the differential entropy of the kernel output will be maximal. Since the theoretically optimal joint entropy performance can be derived for the case of nonoverlapping gaussian mixture densities, a new clustering algorithm is suggested that uses this optimum as its “null” distribution. Finally, it is shown that the learning algorithm is similar to one that performs stochastic gradient descent on the Kullback-Leibler divergence for a heteroskedastic gaussian mixture density model.
30

Cai, Changxiao, and Sergio Verdú. "Conditional Rényi Divergence Saddlepoint and the Maximization of α-Mutual Information." Entropy 21, no. 10 (October 4, 2019): 969. http://dx.doi.org/10.3390/e21100969.

Abstract:
Rényi-type generalizations of entropy, relative entropy and mutual information have found numerous applications throughout information theory and beyond. While there is consensus that the ways A. Rényi generalized entropy and relative entropy in 1961 are the “right” ones, several candidates have been put forth as possible mutual informations of order α. In this paper we lend further evidence to the notion that a Bayesian measure of statistical distinctness introduced by R. Sibson in 1969 (closely related to Gallager’s E_0 function) is the most natural generalization, lending itself to explicit computation and maximization, as well as closed-form formulas. This paper considers general (not necessarily discrete) alphabets and extends the major analytical results on the saddle-point and saddle-level of the conditional relative entropy to the conditional Rényi divergence. Several examples illustrate the main application of these results, namely, the maximization of α-mutual information with and without constraints.
31

Dottori, Javier A., Gustavo A. Boroni, and Alejandro Clausse. "Downstream-Conditioned Maximum Entropy Method for Exit Boundary Conditions in the Lattice Boltzmann Method." Mathematical Problems in Engineering 2015 (2015): 1–12. http://dx.doi.org/10.1155/2015/159418.

Abstract:
A method for modeling outflow boundary conditions in the lattice Boltzmann method (LBM) based on the maximization of the local entropy is presented. The maximization procedure is constrained by macroscopic values and downstream components. The method is applied to fully developed boundary conditions of the Navier-Stokes equations in rectangular channels. Comparisons are made with other alternative methods. In addition, the new downstream-conditioned entropy is studied and it was found that there is a correlation with the velocity gradient during the flow development.
32

Yasuda, Makoto. "Approximate Determination ofq-Parameter for FCM with Tsallis Entropy Maximization." Journal of Advanced Computational Intelligence and Intelligent Informatics 21, no. 7 (November 20, 2017): 1152–60. http://dx.doi.org/10.20965/jaciii.2017.p1152.

Abstract:
This paper considers a fuzzy c-means (FCM) clustering algorithm in combination with deterministic annealing and the Tsallis entropy maximization. The Tsallis entropy is a q-parameter extension of the Shannon entropy. By maximizing the Tsallis entropy within the framework of FCM, statistical mechanical membership functions can be derived. One of the major considerations when using this method is how to determine appropriate values for q and the highest annealing temperature, T_high, for a given data set. Accordingly, in this paper, a method for determining these values simultaneously without introducing any additional parameters is presented, where the membership function is approximated using a series expansion method. The results of experiments indicate that the proposed method is effective, and both q and T_high can be determined automatically and algebraically from a given data set.
33

Yu, Yang, and Xiang Zhou. "Study on Corrosion Acoustic Emission Separation Based on Blind Source Separation." Advanced Materials Research 503-504 (April 2012): 1597–600. http://dx.doi.org/10.4028/www.scientific.net/amr.503-504.1597.

Abstract:
When corrosion signals from a tank bottom are detected online, it is essential to distinguish the signals of different corrosion pits. Blind source separation offers a new method for this task. Many algorithms for blind source separation have been developed, among which entropy maximization is one of the most mature. The aim of this paper is to separate corrosion signals using the entropy maximization algorithm. The separation of linearly mixed acoustic emission signals is demonstrated through simulation. The results indicate that blind source separation is an effective method for separating corrosion signals.
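The entropy-maximization separation step can be sketched with the standard natural-gradient Infomax update. This is a generic illustration, not the paper's implementation: synthetic Laplacian signals stand in for acoustic emission sources, and the mixing matrix, learning rate, and iteration count are assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5000
# two super-Gaussian (Laplacian) sources, linearly mixed
S = rng.laplace(size=(2, n))
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])          # unknown mixing matrix
X = A @ S                           # observed mixtures

# whiten the mixtures
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Xw = E @ np.diag(d ** -0.5) @ E.T @ X

# natural-gradient Infomax (entropy maximization) iterations;
# tanh is the score function for a super-Gaussian source prior
W = np.eye(2)
for _ in range(300):
    Y = W @ Xw
    W += 0.05 * (np.eye(2) - np.tanh(Y) @ Y.T / n) @ W
Y = W @ Xw                          # recovered sources (up to scale/permutation)
```

At the fixed point E[tanh(y) yᵀ] = I, the outputs are maximally independent under the assumed prior, which is what maximizing the output entropy achieves here.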
34

Censor, Yair, Alvaro R. De-Pierro, Tommy Elfving, Gabor T. Herman, and Alfredo N. Iusem. "On iterative methods for linearly constrained entropy maximization." Banach Center Publications 24, no. 1 (1990): 145–63. http://dx.doi.org/10.4064/-24-1-145-163.

35

Barbuzza, R., P. Lotito, and A. Clausse. "Tomography reconstruction by entropy maximization with smoothing filtering." Inverse Problems in Science and Engineering 18, no. 5 (July 2010): 711–22. http://dx.doi.org/10.1080/17415977.2010.492506.

36

Dubey, Ritesh Kumar, V. J. Menon, M. K. Pandey, and D. N. Tripathi. "Entropy Maximization, Cutoff Distribution, and Finite Stellar Masses." Advances in Astronomy 2008 (2008): 1–14. http://dx.doi.org/10.1155/2008/870804.

Abstract:
Conventional equilibrium statistical mechanics of open gravitational systems is known to be problematical. We first recall that spherical stars/galaxies acquire unbounded radii, become infinitely massive, and evaporate away continuously if one uses the standard Maxwellian distribution f_B (which maximizes the usual Boltzmann-Shannon entropy and hence has a tail extending to infinity). Next, we show that these troubles disappear automatically if we employ the exact most probable distribution f (which maximizes the combinatorial entropy and hence possesses a sharp cutoff tail). Finally, if astronomical observation is carried out on a large galaxy, then the Poisson equation together with the thermal de Broglie wavelength provides useful information about the cutoff radius r_K, cutoff energy ε_K, and the huge quantum number K up to which the cluster exists. Thereby, a refinement over the empirical lowered isothermal King models is achieved. Numerically, we find that the most probable distribution (MPD) prediction fits well the number density profile near the outer edge of globular clusters.
37

Girardin, Valerie. "Entropy Maximization for Markov and Semi-Markov Processes." Methodology and Computing in Applied Probability 6, no. 1 (March 2004): 109–27. http://dx.doi.org/10.1023/b:mcap.0000012418.88825.18.

38

Haegeman, Bart, and Rampal S. Etienne. "Entropy Maximization and the Spatial Distribution of Species." American Naturalist 175, no. 4 (April 2010): E74–E90. http://dx.doi.org/10.1086/650718.

39

Yin, Shibai, Yiming Qian, and Minglun Gong. "Unsupervised hierarchical image segmentation through fuzzy entropy maximization." Pattern Recognition 68 (August 2017): 245–59. http://dx.doi.org/10.1016/j.patcog.2017.03.012.

40

Kekeç, Taygun, David Mimno, and David M. J. Tax. "Boosted negative sampling by quadratically constrained entropy maximization." Pattern Recognition Letters 125 (July 2019): 310–17. http://dx.doi.org/10.1016/j.patrec.2019.04.027.

41

Takizawa, Hiroyuki, and Hiroaki Kobayashi. "Partial distortion entropy maximization for online data clustering." Neural Networks 20, no. 7 (September 2007): 819–31. http://dx.doi.org/10.1016/j.neunet.2007.04.029.

42

Son, S., and Sung Joon Moon. "Entropy maximization and instability in uniformly magnetized plasmas." Physica A: Statistical Mechanics and its Applications 392, no. 12 (June 2013): 2713–17. http://dx.doi.org/10.1016/j.physa.2013.02.005.

43

Suyari, Hiroki. "Maximization of Tsallis entropy in the combinatorial formulation." Journal of Physics: Conference Series 201 (February 15, 2010): 012018. http://dx.doi.org/10.1088/1742-6596/201/1/012018.

44

Boshnakov, Georgi N., and Sophie Lambert-Lacroix. "A periodic Levinson–Durbin algorithm for entropy maximization." Computational Statistics & Data Analysis 56, no. 1 (January 2012): 15–24. http://dx.doi.org/10.1016/j.csda.2011.07.001.

45

Español I Garrigós, Josep. "Initial ensemble densities through the maximization of entropy." Physics Letters A 146, no. 1-2 (May 1990): 21–24. http://dx.doi.org/10.1016/0375-9601(90)90023-h.

46

Li, Hui-Jia, Bingying Xu, Liang Zheng, and Jia Yan. "Integrating attributes of nodes solves the community structure partition effectively." Modern Physics Letters B 28, no. 05 (February 18, 2014): 1450037. http://dx.doi.org/10.1142/s0217984914500377.

Abstract:
Revealing ideal community structure efficiently is very important for scientists in many fields. However, it is difficult to infer an ideal community division from topology information alone, owing to the growth and increasing complexity of social networks. Recent research on community detection has found that performance can be improved by incorporating node attribute information. Along this direction, this paper improves the Blondel–Guillaume–Lambiotte (BGL) method, a fast algorithm based on modularity maximization, by integrating the community attribute entropy. To this end, our algorithm minimizes the community attribute entropy by removing the boundary nodes generated during modularity maximization at each iteration. In this way, the communities detected by our algorithm balance modularity maximization against community attribute entropy minimization. A further merit of our algorithm is that it is free of parameters. Comprehensive experiments have been conducted on both artificial and real networks to compare the proposed community detection algorithm with several state-of-the-art ones. As the experimental results indicate, our algorithm demonstrates superior performance.
47

Yang, Qisong, and Matthijs T. J. Spaan. "CEM: Constrained Entropy Maximization for Task-Agnostic Safe Exploration." Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 9 (June 26, 2023): 10798–806. http://dx.doi.org/10.1609/aaai.v37i9.26281.

Abstract:
In the absence of assigned tasks, a learning agent typically seeks to explore its environment efficiently. However, the pursuit of exploration will bring more safety risks. An under-explored aspect of reinforcement learning is how to achieve safe efficient exploration when the task is unknown. In this paper, we propose a practical Constrained Entropy Maximization (CEM) algorithm to solve task-agnostic safe exploration problems, which naturally require a finite horizon and undiscounted constraints on safety costs. The CEM algorithm aims to learn a policy that maximizes state entropy under the premise of safety. To avoid approximating the state density in complex domains, CEM leverages a k-nearest neighbor entropy estimator to evaluate the efficiency of exploration. In terms of safety, CEM minimizes the safety costs, and adaptively trades off safety and exploration based on the current constraint satisfaction. The empirical analysis shows that CEM enables the acquisition of a safe exploration policy in complex environments, resulting in improved performance in both safety and sample efficiency for target tasks.
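The k-nearest-neighbor entropy estimator mentioned in the abstract is classically the Kozachenko–Leonenko form; a generic sketch (not the paper's implementation; approximating ψ(N) by ln N is an assumed simplification that is accurate for large samples):

```python
import math
import numpy as np

def knn_entropy(X, k=1):
    """Kozachenko-Leonenko k-nearest-neighbor estimate of differential
    entropy in nats.  X is an (N, d) sample array."""
    N, d = X.shape
    # pairwise distances; column k of the sorted matrix is the k-th neighbour
    dist = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2))
    eps = np.sort(dist, axis=1)[:, k]                # column 0 is the point itself
    log_vd = (d / 2) * math.log(math.pi) - math.lgamma(d / 2 + 1)  # ln(unit-ball volume)
    gamma = 0.5772156649015329                       # Euler-Mascheroni constant
    psi_k = -gamma + sum(1.0 / i for i in range(1, k))  # digamma(k)
    return math.log(N) - psi_k + log_vd + d * float(np.mean(np.log(eps)))

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 1))
h1 = knn_entropy(X)        # roughly 0.5*ln(2*pi*e) for a standard normal
h3 = knn_entropy(3.0 * X)  # scaling the sample by 3 adds ln 3 to the estimate
```

The estimator needs no density model, which is exactly why this family of estimators suits entropy objectives in high-dimensional exploration.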
48

Samanta, Bablu, and Kumar Majumder. "Entropy based transportation model: A geometric programming approach." Yugoslav Journal of Operations Research 17, no. 1 (2007): 43–54. http://dx.doi.org/10.2298/yjor0701043s.

Full text
Abstract:
The entropy model has attracted a good deal of attention in transportation analysis, urban and regional planning, as well as in other areas. This paper shows the equivalence of entropy maximization models to geometric programs. To provide a better understanding of this entropy-based transportation model, it is analyzed by geometric programming. Dual mathematical programs and algorithms are also obtained and are supported by an illustrative example.
APA, Harvard, Vancouver, ISO, and other styles
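For context, the standard entropy-maximizing trip distribution model that such papers build on (notation assumed here, not taken from the paper) can be written as:

```latex
\begin{aligned}
\max_{T_{ij}\ge 0}\quad & -\sum_{i,j} T_{ij}\left(\ln T_{ij}-1\right)\\
\text{s.t.}\quad & \sum_{j} T_{ij}=O_i \ \ \forall i,\qquad
                   \sum_{i} T_{ij}=D_j \ \ \forall j,\qquad
                   \sum_{i,j} c_{ij}\,T_{ij}=C,
\end{aligned}
```

where $T_{ij}$ is the flow from origin $i$ to destination $j$, $O_i$ and $D_j$ are trip totals, and $c_{ij}$ is the travel cost. The Lagrangian optimum has the familiar gravity form $T_{ij}=A_i O_i\,B_j D_j\,e^{-\beta c_{ij}}$, and it is this exponential structure that makes the model amenable to geometric programming duality.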
49

ADDABBO, T., M. ALIOTO, A. FORT, S. ROCCHI, and V. VIGNOLI. "A VARIABILITY-TOLERANT FEEDBACK TECHNIQUE FOR THROUGHPUT MAXIMIZATION OF TRBGs WITH PREDEFINED ENTROPY." Journal of Circuits, Systems and Computers 19, no. 04 (June 2010): 879–95. http://dx.doi.org/10.1142/s0218126610006505.

Full text
Abstract:
In this paper a probabilistic feedback technique to maximize the throughput of a generic True Random Bit Generator (TRBG) circuit, under a given constraint on the entropy, is discussed. In the proposed solution, the throughput of the device is dynamically and adaptively varied by an on-line entropy detector, so as to obtain, with an arbitrary confidence level, an entropy greater than a given worst-case value. The approach, which has general validity, introduces a method for making maximum use of the TRBG's random bit generation capabilities, maximizing the generation throughput while preserving its entropy. It differs from the classical "open loop" TRBG design approach, in which circuit parameter variability introduces uncertainty about the actual entropy of the device: with the proposed technique, the TRBG generation speed is varied under a given constraint on the entropy. The method can be applied to all those integrated TRBG circuits proposed in the literature that are based on the uniform sampling of, e.g., random physical processes or chaotic dynamical systems.
APA, Harvard, Vancouver, ISO, and other styles
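The feedback loop described above pairs an on-line entropy estimate with a rate adjustment. A toy sketch of that idea (the window size, bound, and step factor are illustrative assumptions, not values from the paper) is:

```python
from math import log2

def binary_entropy(bits):
    """Shannon entropy (bits/symbol) of a window of 0/1 samples."""
    n = len(bits)
    p1 = sum(bits) / n
    if p1 in (0.0, 1.0):
        return 0.0
    return -(p1 * log2(p1) + (1 - p1) * log2(1 - p1))

def adapt_rate(bits, rate, h_min=0.997, step=1.1):
    """Crude feedback rule: raise the sampling rate while the measured
    entropy stays above the worst-case bound h_min, back off otherwise."""
    return rate * step if binary_entropy(bits) >= h_min else rate / step
```

A real TRBG detector would also account for estimator confidence intervals and higher-order correlations; this sketch only illustrates the closed-loop structure.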
50

Silva, João B. C., Francisco S. Oliveira, Valéria C. F. Barbosa, and Haroldo F. Campos Velho. "Apparent-density mapping using entropic regularization." GEOPHYSICS 72, no. 4 (July 2007): I51—I60. http://dx.doi.org/10.1190/1.2732557.

Full text
Abstract:
We present a new apparent-density mapping method on the horizontal plane that combines the minimization of the first-order entropy with the maximization of the zeroth-order entropy of the estimated density contrasts. The interpretation model consists of a grid of vertical, juxtaposed prisms in both horizontal directions. We assume that the top and the bottom of the gravity sources are flat and horizontal and estimate the prisms’ density contrasts. The minimization of the first-order entropy favors solutions presenting sharp borders, and the maximization of the zeroth-order entropy prevents the tendency of the source estimate to become a single prism. Thus, a judicious combination of both constraints may lead to solutions characterized by regions with virtually constant estimated density contrasts separated by sharp discontinuities. We apply our method to synthetic data from simulated intrusive bodies in sediments that present flat and horizontal tops. By comparing our results with those obtained with the smoothness constraint, we show that both methods produce good and equivalent locations of the sources’ central positions. However, the entropic regularization delineates the boundaries of the bodies with greater resolution, even in the case of 100-m-wide bodies separated by a distance as small as [Formula: see text]. Both the proposed and the global smoothness constraints are applied to real anomalies from the eastern Alps and from the Matsitama intrusive complex, northeastern Botswana. In the first case, the entropic regularization delineates two sources, with a horizontal and nearly flat top being consistent with the known geologic information. In the second case, both constraints produce virtually the same estimate, indicating, in agreement with results of synthetic tests, that the tops of the sources are neither flat nor horizontal.
APA, Harvard, Vancouver, ISO, and other styles
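To illustrate why minimizing first-order entropy favors sharp borders while maximizing zeroth-order entropy spreads the estimate out, consider a hypothetical pair of measures on a 1-D vector of density contrasts (the normalization choices here are assumptions, not the paper's definitions):

```python
import numpy as np

def normalized_entropy(v, eps=1e-12):
    """Shannon entropy of |v| normalized to a probability vector."""
    p = np.abs(v) + eps
    p = p / p.sum()
    return float(-(p * np.log(p)).sum())

def q0(m):
    # zeroth-order: entropy of the parameters themselves
    # (high when the contrast is spread over many prisms)
    return normalized_entropy(m)

def q1(m):
    # first-order: entropy of adjacent differences
    # (low when all variation is concentrated at a few sharp edges)
    return normalized_entropy(np.diff(m))

step = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])  # one sharp border
ramp = np.linspace(0.0, 1.0, 6)                   # smooth transition
```

Here `q1(step) < q1(ramp)`: the step concentrates all its variation in a single difference, so penalizing first-order entropy pushes solutions toward piecewise-constant estimates with sharp discontinuities, as the abstract describes.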