
Journal articles on the topic 'Entropy maximization'


Consult the top 50 journal articles for your research on the topic 'Entropy maximization.'


1

Athreya, K. B. "Entropy maximization." Proceedings - Mathematical Sciences 119, no. 4 (2009): 531–39. http://dx.doi.org/10.1007/s12044-009-0049-5.

2

Censor, Yair, and Joseph Segman. "On Block-Iterative Entropy Maximization." Journal of Information and Optimization Sciences 8, no. 3 (1987): 275–91. http://dx.doi.org/10.1080/02522667.1987.10698894.

3

Martínez, S., F. Nicolás, F. Pennini, and A. Plastino. "Tsallis’ entropy maximization procedure revisited." Physica A: Statistical Mechanics and its Applications 286, no. 3-4 (2000): 489–502. http://dx.doi.org/10.1016/s0378-4371(00)00359-9.

4

Janečka, Adam, and Michal Pavelka. "Gradient Dynamics and Entropy Production Maximization." Journal of Non-Equilibrium Thermodynamics 43, no. 1 (2018): 1–19. http://dx.doi.org/10.1515/jnet-2017-0005.

Abstract:
We compare two methods for modeling dissipative processes, namely gradient dynamics and entropy production maximization. Both methods require similar physical inputs: how energy (or entropy) is stored and how it is dissipated. Gradient dynamics describes irreversible evolution by means of a dissipation potential and entropy; it automatically satisfies the Onsager reciprocal relations as well as their nonlinear generalization (Maxwell–Onsager relations), and it has a statistical interpretation. Entropy production maximization is based on knowledge of free energy (or another thermodynamic potent
5

Ratnayake, L. L. "Intercity auto trip estimation for Sri Lanka using entropy maximization." Canadian Journal of Civil Engineering 16, no. 2 (1989): 200–201. http://dx.doi.org/10.1139/l89-036.

Abstract:
Recently, techniques such as calibration of demand models and entropy maximization have been used to estimate origin–destination (O–D) matrices. To the author's knowledge, there has been no reported work on an O–D estimation for a nationwide network. This paper describes a methodology to estimate intercity auto traffic for Sri Lanka using entropy maximization. Some of these results are then compared with the values obtained from a known demand model and the actual data. Key words: O–D matrices, entropy maximization, Sri Lanka, link volume, intercity.
6

Tanaka, Hisa-Aki, Masaki Nakagawa, and Yasutada Oohama. "A Direct Link between Rényi–Tsallis Entropy and Hölder’s Inequality—Yet Another Proof of Rényi–Tsallis Entropy Maximization." Entropy 21, no. 6 (2019): 549. http://dx.doi.org/10.3390/e21060549.

Abstract:
The well-known Hölder’s inequality has been recently utilized as an essential tool for solving several optimization problems. However, such an essential role of Hölder’s inequality does not seem to have been reported in the context of generalized entropy, including Rényi–Tsallis entropy. Here, we identify a direct link between Rényi–Tsallis entropy and Hölder’s inequality. Specifically, we demonstrate yet another elegant proof of the Rényi–Tsallis entropy maximization problem. Especially for the Tsallis entropy maximization problem, only with the equality condition of Hölder’s inequality is th
7

Ré, Christopher, and Dan Suciu. "Understanding cardinality estimation using entropy maximization." ACM Transactions on Database Systems 37, no. 1 (2012): 1–31. http://dx.doi.org/10.1145/2109196.2109202.

8

Miller, Gad, and David Horn. "Probability Density Estimation Using Entropy Maximization." Neural Computation 10, no. 7 (1998): 1925–38. http://dx.doi.org/10.1162/089976698300017205.

Abstract:
We propose a method for estimating probability density functions and conditional density functions by training on data produced by such distributions. The algorithm employs new stochastic variables that amount to coding of the input, using a principle of entropy maximization. It is shown to be closely related to the maximum likelihood approach. The encoding step of the algorithm provides an estimate of the probability distribution. The decoding step serves as a generative mode, producing an ensemble of data with the desired distribution. The algorithm is readily implemented by neural networks,
9

Haegeman, Bart, and Michel Loreau. "Limitations of entropy maximization in ecology." Oikos 117, no. 11 (2008): 1700–1710. http://dx.doi.org/10.1111/j.1600-0706.2008.16539.x.

10

Rouge, Richard, and Nicole El Karoui. "Pricing Via Utility Maximization and Entropy." Mathematical Finance 10, no. 2 (2000): 259–76. http://dx.doi.org/10.1111/1467-9965.00093.

11

Zou, Jieping, and Greg Holloway. "Entropy maximization tendency in topographic turbulence." Journal of Fluid Mechanics 263 (March 25, 1994): 361–74. http://dx.doi.org/10.1017/s0022112094004155.

Abstract:
Numerical simulations of geostrophic turbulence above topography are used to compare (a) nonlinear generation of system entropy, S, (b) selective damping of enstrophy and (c) development of vorticity–topography correlation. In the damped cases, S initially increases, approaching a quasi-equilibrium (maximum S subject to the instantaneous, though decaying, energy and enstrophy). When strongly scale-selective damping is applied, onset of the vorticity–topography correlation follows the timescales for enstrophy decay. During the period of decay, it is shown that nonlinear interaction continues to
12

Davis, Sergio, and Diego González. "Hamiltonian formalism and path entropy maximization." Journal of Physics A: Mathematical and Theoretical 48, no. 42 (2015): 425003. http://dx.doi.org/10.1088/1751-8113/48/42/425003.

13

Solé-Casals, Jordi, Karmele López-de-Ipiña Pena, and Cesar F. Caiafa. "Inverting Monotonic Nonlinearities by Entropy Maximization." PLOS ONE 11, no. 10 (2016): e0165288. http://dx.doi.org/10.1371/journal.pone.0165288.

14

Fisch, Oscar. "On the Utility of Entropy Maximization." Geographical Analysis 9, no. 1 (1977): 79–84. http://dx.doi.org/10.1111/j.1538-4632.1977.tb00562.x.

15

Hewings, Geoffrey J. D., and Esteban Fernandez-Vazquez. "Entropy maximization and input–output analysis." Interdisciplinary Science Reviews 44, no. 3-4 (2019): 272–85. http://dx.doi.org/10.1080/03080188.2019.1670429.

16

Kiefer, Alex B. "Intrinsic Motivation as Constrained Entropy Maximization." Entropy 27, no. 4 (2025): 372. https://doi.org/10.3390/e27040372.

Abstract:
“Intrinsic motivation” refers to the capacity for intelligent systems to be motivated endogenously, i.e., by features of agential architecture itself rather than by learned associations between action and reward. This paper views active inference, empowerment, and other formal accounts of intrinsic motivation as variations on the theme of constrained maximum entropy inference, providing a general perspective on intrinsic motivation complementary to existing frameworks. The connection between free energy and empowerment noted in previous literature is further explored, and it is argued that the
17

Eliazar, Iddo. "From entropy-maximization to equality-maximization: Gauss, Laplace, Pareto, and Subbotin." Physica A: Statistical Mechanics and its Applications 415 (December 2014): 479–92. http://dx.doi.org/10.1016/j.physa.2014.08.011.

18

Collins, Douglas M. "Entropy Maximizations on Electron Density." Zeitschrift für Naturforschung A 48, no. 1-2 (1993): 68–74. http://dx.doi.org/10.1515/zna-1993-1-218.

Abstract:
Incomplete and imperfect data characterize the problem of constructing electron density representations from experimental information. One fundamental concern is identification of the proper protocol for including new information at any stage of a density reconstruction. An axiomatic approach developed in other fields specifies entropy maximization as the desired protocol. In particular, if new data are used to modify a prior charge density distribution without adding extraneous prejudice, the new distribution must both agree with all the data, new and old, and be a function of maximu
19

Gulko, Les. "The Entropic Market Hypothesis." International Journal of Theoretical and Applied Finance 2, no. 3 (1999): 293–329. http://dx.doi.org/10.1142/s0219024999000170.

Abstract:
Information theory teaches that entropy is the fundamental limit for data compression, and electrical engineers routinely use entropy as a criterion for efficient storage and transmission of information. Since modern financial theory teaches that competitive market prices store and transmit information with some efficiency, should financial economists be concerned with entropy? This paper presents a market model in which entropy emerges endogenously as a condition for the operational efficiency of price discovery while entropy maximization emerges as a condition for the informational efficienc
20

Silva, João B., Suzan S. Vasconcelos, and Valeria C. Barbosa. "Apparent-magnetization mapping using entropic regularization." GEOPHYSICS 75, no. 2 (2010): L39–L50. http://dx.doi.org/10.1190/1.3358160.

Abstract:
A new apparent-magnetization mapping method on the horizontal plane combines minimization of first-order entropy with maximization of zeroth-order entropy of the estimated magnetization. The interpretation model is a grid of vertical, juxtaposed prisms in both horizontal directions. To estimate the magnetization of the prisms, the top and bottom of the magnetic sources are assumed to be horizontal. Minimization of the first-order entropy favors solutions with sharp borders, and the maximization of zeroth-order entropy prevents the tendency of the estimated source to become a single prism with la
21

Giorgio, Serena Di, and Paulo Mateus. "Efficiently Compressible Density Operators via Entropy Maximization." Proceedings 12, no. 1 (2019): 39. http://dx.doi.org/10.3390/proceedings2019012039.

Abstract:
We address the problem of efficiently and effectively compressing density operators (DOs) by providing an efficient procedure for learning the most likely DO, given a chosen set of partial information. We explore, in the context of quantum information theory, the generalisation of the maximum entropy estimator for DOs when the direct dependencies between the subsystems are provided. As a preliminary analysis, we restrict the problem to tripartite systems when two marginals are known. When the marginals are compatible with the existence of a quantum Markov chain (QMC) we show that there exists a
22

Fernández-Pineda, C., and S. Velasco. "Entropy maximization in the free expansion process." European Journal of Physics 26, no. 4 (2005): N13–N16. http://dx.doi.org/10.1088/0143-0807/26/4/n01.

23

Wang, Qianni, Liyang Feng, Jiayang Li, Jun Xie, and Yu (Marco) Nie. "Entropy maximization in multi-class traffic assignment." Transportation Research Part B: Methodological 192 (February 2025): 103136. https://doi.org/10.1016/j.trb.2024.103136.

24

Jose, Victor Richmond R., Robert F. Nau, and Robert L. Winkler. "Scoring Rules, Generalized Entropy, and Utility Maximization." Operations Research 56, no. 5 (2008): 1146–57. http://dx.doi.org/10.1287/opre.1070.0498.

25

Nakamura, T. K. "Relativistic equilibrium distribution by relative entropy maximization." EPL (Europhysics Letters) 88, no. 4 (2009): 40009. http://dx.doi.org/10.1209/0295-5075/88/40009.

26

Siew, Lam Weng, Saiful Hafizah Jaaman, and Lam Weng Hoe. "Enhanced Index Tracking Model with Entropy Maximization." Advances and Applications in Statistics 53, no. 3 (2018): 243–58. http://dx.doi.org/10.17654/as053030243.

27

Takagi, Fujio, and Tatsuo Tsukamoto. "Semi-inclusive rapidity distributions from entropy maximization." Physical Review D 38, no. 7 (1988): 2288–90. http://dx.doi.org/10.1103/physrevd.38.2288.

28

Balakrishnan, N., and N. Sathyamurthy. "Maximization of entropy during a chemical reaction." Chemical Physics Letters 164, no. 2-3 (1989): 267–69. http://dx.doi.org/10.1016/0009-2614(89)85027-4.

29

Illner, Reinhard, and Helmut Neunzert. "Relative entropy maximization and directed diffusion equations." Mathematical Methods in the Applied Sciences 16, no. 8 (1993): 545–54. http://dx.doi.org/10.1002/mma.1670160803.

30

Smith, Jonathan. "Entropy Maximization, Time Emergence, and Phase Transition." Entropy 27, no. 6 (2025): 586. https://doi.org/10.3390/e27060586.

Abstract:
We survey developments in the use of entropy maximization for applying the Gibbs Canonical Ensemble to finite situations. Biological insights are invoked along with physical considerations. In the game-theoretic approach to entropy maximization, the interpretation of the two player roles as predator and prey provides a well-justified and symmetric analysis. The main focus is placed on the Lagrange multiplier approach. Using natural physical units with Planck’s constant set to unity, it is recognized that energy has the dimensions of inverse time. Thus, the conjugate Lagrange multiplier, tradit
31

Yasuda, Makoto. "Deterministic Annealing Approach to Fuzzy C-Means Clustering Based on Entropy Maximization." Advances in Fuzzy Systems 2011 (2011): 1–9. http://dx.doi.org/10.1155/2011/960635.

Abstract:
This paper deals with a fuzzy clustering method that combines the deterministic annealing (DA) approach with an entropy, especially the Shannon entropy and the Tsallis entropy. By maximizing the Shannon entropy, the fuzzy entropy, or the Tsallis entropy within the framework of the fuzzy c-means (FCM) method, membership functions similar to the statistical mechanical distribution functions are obtained. We examine characteristics of these entropy-based membership functions from the statistical mechanical point of view. After that, both the Shannon- and Tsallis-entropy-based FCMs are for
32

Teles, Tarcísio N., Calvin A. F. Farias, Renato Pakter, and Yan Levin. "A Monte Carlo Method for Calculating Lynden-Bell Equilibrium in Self-Gravitating Systems." Entropy 25, no. 10 (2023): 1379. http://dx.doi.org/10.3390/e25101379.

Abstract:
We present a Monte Carlo approach that allows us to easily implement Lynden-Bell (LB) entropy maximization for an arbitrary initial particle distribution. The direct maximization of LB entropy for an arbitrary initial distribution requires an infinite number of Lagrange multipliers to account for the Casimir invariants. This has restricted studies of Lynden-Bell’s violent relaxation theory to only a very small class of initial conditions of a very simple waterbag form, for which the entropy maximization can be performed numerically. In the present approach, an arbitrary initial distribution is
33

Cai, Changxiao, and Sergio Verdú. "Conditional Rényi Divergence Saddlepoint and the Maximization of α-Mutual Information." Entropy 21, no. 10 (2019): 969. http://dx.doi.org/10.3390/e21100969.

Abstract:
Rényi-type generalizations of entropy, relative entropy and mutual information have found numerous applications throughout information theory and beyond. While there is consensus that the ways A. Rényi generalized entropy and relative entropy in 1961 are the “right” ones, several candidates have been put forth as possible mutual informations of order α. In this paper we lend further evidence to the notion that a Bayesian measure of statistical distinctness introduced by R. Sibson in 1969 (closely related to Gallager’s E0 function) is the most natural generalization, lending itself to explici
34

Dottori, Javier A., Gustavo A. Boroni, and Alejandro Clausse. "Downstream-Conditioned Maximum Entropy Method for Exit Boundary Conditions in the Lattice Boltzmann Method." Mathematical Problems in Engineering 2015 (2015): 1–12. http://dx.doi.org/10.1155/2015/159418.

Abstract:
A method for modeling outflow boundary conditions in the lattice Boltzmann method (LBM) based on the maximization of the local entropy is presented. The maximization procedure is constrained by macroscopic values and downstream components. The method is applied to fully developed boundary conditions of the Navier-Stokes equations in rectangular channels. Comparisons are made with other alternative methods. In addition, the new downstream-conditioned entropy is studied and it was found that there is a correlation with the velocity gradient during the flow development.
35

Yu, Yang, and Xiang Zhou. "Study on Corrosion Acoustic Emission Separation Based on Blind Source Separation." Advanced Materials Research 503-504 (April 2012): 1597–600. http://dx.doi.org/10.4028/www.scientific.net/amr.503-504.1597.

Abstract:
When corrosion signals of a tank bottom are detected by an online method, it is essential to identify the corrosion signals of different corrosion spots. This paper presents a new method based on blind source separation. Blind source separation has produced many algorithms, among which entropy maximization is the most mature. The aim of this paper is to separate corrosion signals by using the entropy maximization algorithm. Furthermore, the separation of linearly mixed acoustic emission signals is achieved through simulation. The results indicate that blind source separation is an effective method for the separation of corrosi
36

Li, Hui-Jia, Bingying Xu, Liang Zheng, and Jia Yan. "Integrating attributes of nodes solves the community structure partition effectively." Modern Physics Letters B 28, no. 05 (2014): 1450037. http://dx.doi.org/10.1142/s0217984914500377.

Abstract:
Revealing ideal community structure efficiently is very important for scientists from many fields. However, it is difficult to infer an ideal community division structure by analyzing topology information alone, owing to the growth and increasing complexity of social networks. Recent research on community detection shows that its performance can be improved by incorporating node attribute information. Along this direction, this paper improves the Blondel–Guillaume–Lambiotte (BGL) method, which is a fast algorithm based on modularity maximization, by integrating the community attribute en
37

Yasuda, Makoto. "Approximate Determination of q-Parameter for FCM with Tsallis Entropy Maximization." Journal of Advanced Computational Intelligence and Intelligent Informatics 21, no. 7 (2017): 1152–60. http://dx.doi.org/10.20965/jaciii.2017.p1152.

Abstract:
This paper considers a fuzzy c-means (FCM) clustering algorithm in combination with deterministic annealing and Tsallis entropy maximization. The Tsallis entropy is a q-parameter extension of the Shannon entropy. By maximizing the Tsallis entropy within the framework of FCM, statistical mechanical membership functions can be derived. One of the major considerations when using this method is how to determine appropriate values for q and the highest annealing temperature, T_high, for a given data set. Accordingly, in this paper, a method for determining these values simultaneously without introduc
38

Hulle, Marc M. Van. "Joint Entropy Maximization in Kernel-Based Topographic Maps." Neural Computation 14, no. 8 (2002): 1887–906. http://dx.doi.org/10.1162/089976602760128054.

Abstract:
A new learning algorithm for kernel-based topographic map formation is introduced. The kernel parameters are adjusted individually so as to maximize the joint entropy of the kernel outputs. This is done by maximizing the differential entropies of the individual kernel outputs, given that the map's output redundancy, due to the kernel overlap, needs to be minimized. The latter is achieved by minimizing the mutual information between the kernel outputs. As a kernel, the (radial) incomplete gamma distribution is taken since, for a gaussian input density, the differential entropy of the kernel out
39

Censor, Yair, Alvaro R. De-Pierro, Tommy Elfving, Gabor T. Herman, and Alfredo N. Iusem. "On iterative methods for linearly constrained entropy maximization." Banach Center Publications 24, no. 1 (1990): 145–63. http://dx.doi.org/10.4064/-24-1-145-163.

40

Barbuzza, R., P. Lotito, and A. Clausse. "Tomography reconstruction by entropy maximization with smoothing filtering." Inverse Problems in Science and Engineering 18, no. 5 (2010): 711–22. http://dx.doi.org/10.1080/17415977.2010.492506.

41

Dubey, Ritesh Kumar, V. J. Menon, M. K. Pandey, and D. N. Tripathi. "Entropy Maximization, Cutoff Distribution, and Finite Stellar Masses." Advances in Astronomy 2008 (2008): 1–14. http://dx.doi.org/10.1155/2008/870804.

Abstract:
Conventional equilibrium statistical mechanics of open gravitational systems is known to be problematical. We first recall that spherical stars/galaxies acquire unbounded radii, become infinitely massive, and evaporate away continuously if one uses the standard Maxwellian distribution f_B (which maximizes the usual Boltzmann-Shannon entropy and hence has a tail extending to infinity). Next, we show that these troubles disappear automatically if we employ the exact most probable distribution f (which maximizes the combinatorial entropy and hence possesses a sharp cutoff tail). Finally, if astronomic
42

Girardin, Valerie. "Entropy Maximization for Markov and Semi-Markov Processes." Methodology and Computing in Applied Probability 6, no. 1 (2004): 109–27. http://dx.doi.org/10.1023/b:mcap.0000012418.88825.18.

43

Haegeman, Bart, and Rampal S. Etienne. "Entropy Maximization and the Spatial Distribution of Species." American Naturalist 175, no. 4 (2010): E74–E90. http://dx.doi.org/10.1086/650718.

44

Yin, Shibai, Yiming Qian, and Minglun Gong. "Unsupervised hierarchical image segmentation through fuzzy entropy maximization." Pattern Recognition 68 (August 2017): 245–59. http://dx.doi.org/10.1016/j.patcog.2017.03.012.

45

Kekeç, Taygun, David Mimno, and David M. J. Tax. "Boosted negative sampling by quadratically constrained entropy maximization." Pattern Recognition Letters 125 (July 2019): 310–17. http://dx.doi.org/10.1016/j.patrec.2019.04.027.

46

Takizawa, Hiroyuki, and Hiroaki Kobayashi. "Partial distortion entropy maximization for online data clustering." Neural Networks 20, no. 7 (2007): 819–31. http://dx.doi.org/10.1016/j.neunet.2007.04.029.

47

Son, S., and Sung Joon Moon. "Entropy maximization and instability in uniformly magnetized plasmas." Physica A: Statistical Mechanics and its Applications 392, no. 12 (2013): 2713–17. http://dx.doi.org/10.1016/j.physa.2013.02.005.

48

Suyari, Hiroki. "Maximization of Tsallis entropy in the combinatorial formulation." Journal of Physics: Conference Series 201 (February 15, 2010): 012018. http://dx.doi.org/10.1088/1742-6596/201/1/012018.

49

Boshnakov, Georgi N., and Sophie Lambert-Lacroix. "A periodic Levinson–Durbin algorithm for entropy maximization." Computational Statistics & Data Analysis 56, no. 1 (2012): 15–24. http://dx.doi.org/10.1016/j.csda.2011.07.001.

50

Español i Garrigós, Josep. "Initial ensemble densities through the maximization of entropy." Physics Letters A 146, no. 1-2 (1990): 21–24. http://dx.doi.org/10.1016/0375-9601(90)90023-h.
