Academic literature on the topic 'Entropy analysis'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Entropy analysis.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Entropy analysis"

1

Ben Abdallah, Naoufel, Hedia Chaker, and Christian Schmeiser. "The High Field Asymptotics for a Fermionic Boltzmann Equation: Entropy Solutions and Kinetic Shock Profiles." Journal of Hyperbolic Differential Equations 4, no. 4 (December 2007): 679–704. http://dx.doi.org/10.1142/s0219891607001318.

Full text
Abstract:
The high field approximation of a fermionic Boltzmann equation of semiconductors is performed after the formation of shocks. By employing a new entropy, whose dissipation measures the departure from the high field equilibrium, convergence towards the entropic solution of the limiting conservation law is proven. The entropy is also used to construct kinetic shock profiles for entropic shocks and to prove non-existence of non-entropic shock profiles.
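For context, the "entropic solution" referred to here is the standard admissibility notion for scalar conservation laws: among weak solutions of the limiting equation, one selects those satisfying an entropy inequality. In generic textbook form (not the paper's specific fermionic setting, whose entropy is adapted to the high-field equilibrium):

```latex
% Limiting conservation law and entropy admissibility (generic form):
\partial_t u + \partial_x f(u) = 0,
\qquad
\partial_t \eta(u) + \partial_x q(u) \le 0
\quad \text{in } \mathcal{D}',
\qquad \text{for all convex } \eta \text{ with } q' = \eta' f'.
```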
APA, Harvard, Vancouver, ISO, and other styles
2

Flood, Matthew W., and Bernd Grimm. "EntropyHub: An open-source toolkit for entropic time series analysis." PLOS ONE 16, no. 11 (November 4, 2021): e0259448. http://dx.doi.org/10.1371/journal.pone.0259448.

Abstract:
An increasing number of studies across many research fields from biomedical engineering to finance are employing measures of entropy to quantify the regularity, variability or randomness of time series and image data. Entropy, as it relates to information theory and dynamical systems theory, can be estimated in many ways, with newly developed methods being continuously introduced in the scientific literature. Despite the growing interest in entropic time series and image analysis, there is a shortage of validated, open-source software tools that enable researchers to apply these methods. To date, packages for performing entropy analysis are often run using graphical user interfaces, lack the necessary supporting documentation, or do not include functions for more advanced entropy methods, such as cross-entropy, multiscale cross-entropy or bidimensional entropy. In light of this, this paper introduces EntropyHub, an open-source toolkit for performing entropic time series analysis in MATLAB, Python and Julia. EntropyHub (version 0.1) provides an extensive range of more than forty functions for estimating cross-, multiscale, multiscale cross-, and bidimensional entropy, each including a number of keyword arguments that allow the user to specify multiple parameters in the entropy calculation. Instructions for installation, descriptions of function syntax, and examples of use are fully detailed in the supporting documentation, available on the EntropyHub website: www.EntropyHub.xyz. Compatible with Windows, Mac and Linux operating systems, EntropyHub is hosted on GitHub, as well as on the native package repositories for MATLAB, Python and Julia. The goal of EntropyHub is to integrate the many established entropy methods into one complete resource, providing tools that make advanced entropic time series analysis straightforward and reproducible.
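A taste of what such toolkits compute: below is a minimal, self-contained sample-entropy estimator in plain Python. This is an illustrative sketch only; the function name, defaults and brute-force pair search are ours, not EntropyHub's actual API (which is documented at www.EntropyHub.xyz).

```python
import math
import random

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy SampEn(m, r): -ln(A/B), where B counts pairs of
    length-m templates within Chebyshev tolerance r (self-matches
    excluded) and A counts the same for length m + 1."""
    def count_matches(length):
        templates = [x[i:i + length] for i in range(len(x) - length + 1)]
        hits = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    hits += 1
        return hits

    b = count_matches(m)
    a = count_matches(m + 1)
    if a == 0 or b == 0:
        return float("inf")  # undefined for too-short or too-irregular series
    return -math.log(a / b)

# A strictly alternating series is near-perfectly regular (SampEn close
# to 0 up to finite-size effects); uniform noise scores much higher.
regular = [0.0, 1.0] * 50
random.seed(42)
noisy = [random.random() for _ in range(100)]
print(sample_entropy(regular), sample_entropy(noisy))
```

The O(N²) pair search is fine for short series; production implementations (including EntropyHub's) use optimized routines and expose many more parameters.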
3

Yan, Kesong. "Conditional entropy and fiber entropy for amenable group actions." Journal of Differential Equations 259, no. 7 (October 2015): 3004–31. http://dx.doi.org/10.1016/j.jde.2015.04.013.

4

Makhanlall, Deodat, and Peixue Jiang. "Analysis of Entropy Generation in Diffusion HTAC Processes." International Journal of Applied Physics and Mathematics 4, no. 4 (2014): 275–79. http://dx.doi.org/10.7763/ijapm.2014.v4.298.

5

Hanson, Robert M. "Regarding Entropy Analysis." Journal of Chemical Education 82, no. 6 (June 2005): 839. http://dx.doi.org/10.1021/ed082p839.1.

6

Bailey, Kenneth D. "System entropy analysis." Kybernetes 26, no. 6/7 (August 1997): 674–88. http://dx.doi.org/10.1108/03684929710169852.

7

Størmer, Erling. "Entropy of Endomorphisms and Relative Entropy in Finite von Neumann Algebras." Journal of Functional Analysis 171, no. 1 (February 2000): 34–52. http://dx.doi.org/10.1006/jfan.1999.3535.

8

Downarowicz, Tomasz. "Entropy structure." Journal d'Analyse Mathématique 96, no. 1 (December 2005): 57–116. http://dx.doi.org/10.1007/bf02787825.

9

Litvak, A. E., V. D. Milman, A. Pajor, and N. Tomczak-Jaegermann. "Entropy extension." Functional Analysis and Its Applications 40, no. 4 (October 2006): 298–303. http://dx.doi.org/10.1007/s10688-006-0046-8.

10

Castro, Manuel J., Ulrik S. Fjordholm, Siddhartha Mishra, and Carlos Parés. "Entropy Conservative and Entropy Stable Schemes for Nonconservative Hyperbolic Systems." SIAM Journal on Numerical Analysis 51, no. 3 (January 2013): 1371–91. http://dx.doi.org/10.1137/110845379.


Dissertations / Theses on the topic "Entropy analysis"

1

Patterson, Brett Alexander. "Maximum entropy data analysis." Thesis, University of Cambridge, 1992. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.240969.

2

Purahoo, K. "Maximum entropy data analysis." Thesis, Cranfield University, 1994. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.260038.

3

Karamanos, Konstantinos. "Entropy analysis of nonequilibrium systems." Doctoral thesis, Universite Libre de Bruxelles, 2002. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/211390.

4

Schwill, Stephan. "Entropy analysis of financial time series." Thesis, University of Manchester, 2016. https://www.research.manchester.ac.uk/portal/en/theses/entropy-analysis-of-financial-time-series(7e0c84fe-5d0b-41bc-96c6-5e41ffa5b8fe).html.

Abstract:
This thesis applies entropy as a model-independent measure to address research questions concerning the dynamics of various financial time series. The thesis consists of three main studies, presented in chapters 3, 4 and 5. Chapters 3 and 4 apply an entropy measure to conduct a bivariate analysis of drawdowns and drawups in foreign exchange rates. Chapter 5 investigates the dynamics of investment strategies of hedge funds using the entropy of realised volatility in a conditioning model. In all three studies, methods from information theory are applied in novel ways to financial time series. As information theory and its central concept of entropy are not widely used in the economic sciences, chapter 2 gives a methodological overview of the theoretical background and statistical features of the entropy measures used in the three main studies. In the first two studies the focus is on mutual information and transfer entropy. Both measures are used to identify dependencies between two exchange rates. The chosen measures generalise, in a well-defined manner, correlation and Granger causality. A different entropy measure, the approximate entropy, is used in the third study to analyse the serial structure of S&P realised volatility. The study of drawdowns and drawups has so far concentrated on their univariate characteristics. Encoding the drawdown information of a time series into a time series of discrete values, chapter 3 uses entropy measures to analyse the correlation and cross-correlations of drawdowns and drawups. The method to encode the drawdown information is explained and applied to daily and hourly EUR/USD and GBP/USD exchange rates from 2001 to 2012. For the daily series, we find evidence of dependence among the largest draws (i.e. 5% and 95% quantiles), but it is not as strong as the correlation between the daily returns of the same pair of FX rates.
There is also dependence between lead/lagged values of these draws. Similar but stronger findings were observed in the hourly data. We further use transfer entropy to examine the spillover and lead-lag information flow between drawups/drawdowns of the two exchange rates. Such information flow is indeed detectable in both daily and hourly data. The amount of information transferred is considerably higher for the hourly than for the daily data. Both daily and hourly series show clear evidence of information flowing from EUR/USD to GBP/USD and, slightly stronger, in the reverse direction. Robustness tests, using effective transfer entropy, show that the information measured is not due to noise. Chapter 4 uses state-space models of volatility to investigate volatility spillovers between exchange rates. Our use of entropy-related measures in the investigation of dependencies between two state-space series is novel. A set of five daily exchange rates from emerging and developed economies against the dollar over the period 1999 to 2012 is used. We find that among the currency pairs, the co-movement of EUR/USD and CHF/USD volatility states shows the strongest observed relationship. With the use of transfer entropy, we find evidence of information flows between the volatility state series of AUD, CAD and BRL. Chapter 5 uses the entropy of S&P realised volatility to detect changes of volatility regime in order to re-examine the theme of market volatility timing of hedge funds. A one-factor model is used, conditioned on information about the entropy of market volatility, to measure the dynamics of hedge funds' equity exposure. On a cross-section of around 2500 hedge funds with a focus on the US equity markets, we find that, over the period from 2000 to 2014, hedge funds adjust their exposure dynamically in response to changes in volatility regime.
This adds to the literature on the volatility-timing behaviour of hedge fund managers, using entropy as a model-independent measure of volatility regime. Finally, chapter 6 summarises and concludes with some suggestions for future research.
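The transfer-entropy machinery described in this abstract can be made concrete for symbol-encoded series (such as discretized drawdown/drawup series). The following minimal plug-in estimator of lag-1 transfer entropy for discrete sequences is our illustration, not the thesis's code; real applications need bias corrections such as the effective transfer entropy mentioned above.

```python
import math
import random
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate (in nats) of the lag-1 transfer entropy
    T(X -> Y) = sum p(y1, y0, x0) * log[ p(y1 | y0, x0) / p(y1 | y0) ]
    for two equally long discrete-valued sequences."""
    n = len(y) - 1
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_{t+1}, y_t, x_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))         # (y_t, x_t)
    pairs_yy = Counter(zip(y[1:], y[:-1]))          # (y_{t+1}, y_t)
    singles = Counter(y[:-1])                       # y_t
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_full = c / pairs_yx[(y0, x0)]             # p(y1 | y0, x0)
        p_hist = pairs_yy[(y1, y0)] / singles[y0]   # p(y1 | y0)
        te += (c / n) * math.log(p_full / p_hist)
    return te

# y copies x with a one-step lag, so information flows from X to Y
# (about ln 2 nats for a fair binary source) but not back.
random.seed(1)
x = [random.randint(0, 1) for _ in range(5000)]
y = [0] + x[:-1]
print(transfer_entropy(x, y), transfer_entropy(y, x))
```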
5

Krempa, Peter. "Analysis of Entropy Levels in the Entropy Pool of Random Number Generator." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2013. http://www.nusl.cz/ntk/nusl-236179.

Abstract:
In computer science, entropy is usually understood as a random stream of data. This thesis briefly summarizes methods of generating random data and describes the random number generator contained in the kernel of the Linux operating system. It then determines the bit rate at which this generator produces random data in virtualized environments provided by various hypervisors. The thesis describes the low-performance problems of random data generators in virtual environments and proposes an approach to solving them. The implementation of the proposed approach is then outlined, subjected to tests, and its results are compared with the original system. The entropy distribution system can further improve the amount of entropy in the system kernel by several orders of magnitude when connected to a high-performance random data generator.
6

Robinson, David Richard Terence. "Developments in maximum entropy data analysis." Thesis, University of Cambridge, 1992. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.307063.

7

McLean, Andrew Lister. "Applications of maximum entropy data analysis." Thesis, University of Southampton, 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.319161.

8

Garvey, Jennie Hill. "Independent component analysis by entropy maximization (infomax)." Thesis, Monterey, Calif. : Naval Postgraduate School, 2007. http://bosun.nps.edu/uhtbin/hyperion-image.exe/07Jun%5FGarvey.pdf.

Abstract:
Thesis (M.S. in Electrical Engineering), Naval Postgraduate School, June 2007. Thesis advisor: Frank E. Kragh. Includes bibliographical references (p. 103). Also available in print.
9

Gärtner, Joel. "Analysis of Entropy Usage in Random Number Generators." Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-214567.

Abstract:
Cryptographically secure random number generators usually require an outside seed to be initialized. Other solutions instead use a continuous entropy stream to ensure that the internal state of the generator always remains unpredictable. This thesis analyses four such generators with entropy inputs. Furthermore, different ways to estimate entropy are presented, and a new method useful for the generator analysis is developed. The developed entropy estimator performs well in tests and is used to analyse entropy gathered from the different generators. All the analysed generators exhibit some seemingly unintentional behaviour, but most should still be safe for use.
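A flavor of the kind of entropy estimation discussed here: the simplest standard estimator is the most-common-value (min-entropy) estimate, similar in spirit to the one in NIST SP 800-90B. The sketch below is our illustration, not the new estimator developed in the thesis.

```python
import math
from collections import Counter

def min_entropy_per_symbol(samples):
    """Most-common-value estimate: H_min = -log2(p_max), where p_max is
    the observed frequency of the most common output symbol."""
    counts = Counter(samples)
    p_max = max(counts.values()) / len(samples)
    return -math.log2(p_max)

print(min_entropy_per_symbol([0, 1, 2, 3] * 100))   # → 2.0 bits/symbol
print(min_entropy_per_symbol([0] * 90 + [1] * 10))  # ≈ 0.152 bits/symbol
```

An ideal uniform byte source would score 8 bits per symbol; any bias lowers the estimate, which is why min-entropy is the conservative figure used when rating entropy sources.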
10

Mujumdar, Anusha Pradeep. "Cross entropy-based analysis of spacecraft control systems." Thesis, University of Exeter, 2016. http://hdl.handle.net/10871/28006.

Abstract:
Space missions increasingly require sophisticated guidance, navigation and control algorithms, the development of which is reliant on verification and validation (V&V) techniques to ensure mission safety and success. A crucial element of V&V is the assessment of control system robust performance in the presence of uncertainty. In addition to estimating average performance under uncertainty, it is critical to determine the worst case performance. Industrial V&V approaches typically employ mu-analysis in the early control design stages, and Monte Carlo simulations on high-fidelity full engineering simulators at advanced stages of the design cycle. While highly capable, such techniques present a critical gap between pessimistic worst case estimates found using analytical methods, and the optimistic outlook often presented by Monte Carlo runs. Conservative worst case estimates are problematic because they can demand a controller redesign procedure, which is not justified if the poor performance is unlikely to occur. Gaining insight into the probability associated with the worst case performance is valuable in bridging this gap. It should be noted that due to the complexity of industrial-scale systems, V&V techniques are required to be capable of efficiently analysing non-linear models in the presence of significant uncertainty. They must also be computationally tractable. It is desirable that such techniques demand little engineering effort before each analysis, so that they can be applied widely to industrial systems. Motivated by these factors, this thesis proposes and develops an efficient algorithm, based on the cross entropy simulation method. The proposed algorithm efficiently estimates the probabilities associated with various performance levels, from nominal performance up to degraded performance values, resulting in a curve of probabilities associated with various performance values.
Such a curve is termed the probability profile of performance (PPoP), and is introduced as a tool that offers insight into a control system's performance, principally the probability associated with the worst case performance. The cross entropy-based robust performance analysis is implemented here on various industrial systems in European Space Agency-funded research projects. The implementation on autonomous rendezvous and docking models for the Mars Sample Return mission constitutes the core of the thesis. The proposed technique is implemented on high-fidelity models of the Vega launcher, as well as on a generic long coasting launcher upper stage. In summary, this thesis (a) develops an algorithm based on the cross entropy simulation method to estimate the probability associated with the worst case, (b) proposes the cross entropy-based PPoP tool to gain insight into system performance, (c) presents results of the robust performance analysis of three space industry systems using the proposed technique in conjunction with existing methods, and (d) proposes an integrated template for conducting robust performance analysis of linearised aerospace systems.
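The cross-entropy simulation method underlying this thesis can be sketched on a toy rare-event problem: estimating the tail probability P(X ≥ γ) of a standard normal by adaptively tilting the sampling mean toward the rare region. All names and parameter choices below are ours; this mirrors the adaptive-level idea, not the thesis's control-system implementation.

```python
import math
import random

def ce_tail_probability(gamma, n=2000, rho=0.1, n_final=20000):
    """Cross-entropy method estimate of P(X >= gamma) for X ~ N(0, 1).

    Repeatedly samples from a tilted density N(mu, 1), raises the level
    to the (1 - rho) elite quantile, and updates mu to the elite mean;
    once the level reaches gamma, the probability is estimated by
    importance sampling under N(mu, 1)."""
    mu = 0.0
    while True:
        sample = sorted(random.gauss(mu, 1.0) for _ in range(n))
        level = sample[int((1 - rho) * n)]
        if level >= gamma:
            break
        elite = [s for s in sample if s >= level]
        mu = sum(elite) / len(elite)  # CE update for the Gaussian mean
    # Likelihood ratio phi(x) / phi(x - mu) = exp(-mu*x + mu^2 / 2)
    total = 0.0
    for _ in range(n_final):
        x = random.gauss(mu, 1.0)
        if x >= gamma:
            total += math.exp(-mu * x + mu * mu / 2.0)
    return total / n_final

random.seed(0)
est = ce_tail_probability(3.0)
print(est)  # close to the exact tail probability P(X >= 3) ≈ 1.35e-3
```

A plain Monte Carlo estimate of this probability would need millions of samples for comparable accuracy; the tilting step is what makes rare worst-case levels reachable with modest sample sizes.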

Books on the topic "Entropy analysis"

1

Papadimitriou, Fivos. Spatial Entropy and Landscape Analysis. Wiesbaden: Springer Fachmedien Wiesbaden, 2022. http://dx.doi.org/10.1007/978-3-658-35596-8.

2

Eshima, Nobuoki. Statistical Data Analysis and Entropy. Singapore: Springer Singapore, 2020. http://dx.doi.org/10.1007/978-981-15-2552-0.

3

Entropy analysis: An introduction to chemical thermodynamics. New York: VCH Publishers, 1992.

4

Sims, Robert, and Daniel Ueltschi, eds. Entropy and the Quantum: Arizona School of Analysis with Applications, March 16–20, 2009, University of Arizona. Providence, RI: American Mathematical Society, 2010.

5

Yee, H. C. Entropy splitting and numerical dissipation. Moffett Field, Calif: National Aeronautics and Space Administration, Ames Research Center, 1999.

7

Botbol, Joseph Moses. Multivariate clustering based on entropy. [Washington]: U.S. G.P.O., 1989.

8

Botbol, Joseph Moses. Multivariate clustering based on entropy. Washington, DC: Dept. of the Interior, 1989.

9

Sussman, M. V. Availability (exergy) analysis: A self instruction manual. 3rd ed. Lexington, Mass: Mulliken House, 1985.

10

Data analysis: A Bayesian tutorial. Oxford: Clarendon Press, 1996.


Book chapters on the topic "Entropy analysis"

1

Yentes, Jennifer M. "Entropy." In Nonlinear Analysis for Human Movement Variability, 173–260. Boca Raton: CRC Press, 2018. http://dx.doi.org/10.1201/9781315370651-6.

2

Maurer, Andreas. "Entropy and Concentration." In Harmonic and Applied Analysis, 55–100. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-86664-8_2.

3

Yu, Francis T. S. "Diffraction and Signal Analysis." In Entropy and Information Optics, 17–34. 2nd ed. Boca Raton: CRC Press, 2017. http://dx.doi.org/10.1201/b22443-2.

4

Chakraborty, Soubhik, Guerino Mazzola, Swarima Tewari, and Moujhuri Patra. "Raga Analysis Using Entropy." In Computational Musicology in Hindustani Music, 65–68. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-11472-9_6.

5

Eshima, Nobuoki. "Entropy-Based Path Analysis." In Behaviormetrics: Quantitative Approaches to Human Behavior, 167–97. Singapore: Springer Singapore, 2020. http://dx.doi.org/10.1007/978-981-15-2552-0_6.

6

Pardo-Igúzquiza, Eulogio, and Francisco J. Rodríguez-Tovar. "Maximum Entropy Spectral Analysis." In Encyclopedia of Mathematical Geosciences, 1–8. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-26050-7_197-1.

7

Von Der Linden, W., V. Dose, and A. Ramaswami. "Bayesian Group Analysis." In Maximum Entropy and Bayesian Methods, 87–99. Dordrecht: Springer Netherlands, 1998. http://dx.doi.org/10.1007/978-94-011-5028-6_7.

8

Yu, Francis T. S. "Wideband Signal Analysis with Optics." In Entropy and Information Optics, 155–60. 2nd ed. Boca Raton: CRC Press, 2017. http://dx.doi.org/10.1201/b22443-15.

9

Lukac, Rastislav, Bogdan Smolka, Konstantinos N. Plataniotis, and Anastasios N. Venetsanopoulos. "Entropy Vector Median Filter." In Pattern Recognition and Image Analysis, 1117–25. Berlin, Heidelberg: Springer Berlin Heidelberg, 2003. http://dx.doi.org/10.1007/978-3-540-44871-6_129.

10

Barbakh, Wesam Ashour, Ying Wu, and Colin Fyfe. "Cross Entropy Methods." In Non-Standard Parameter Adaptation for Exploratory Data Analysis, 151–74. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-04005-4_9.


Conference papers on the topic "Entropy analysis"

1

Jones, Bush. "Entropy data analysis." In the 18th conference. New York, New York, USA: ACM Press, 1986. http://dx.doi.org/10.1145/318242.318471.

2

Zhang, Hong, and Sha-sha He. "Analysis and Comparison of Permutation Entropy, Approximate Entropy and Sample Entropy." In 2018 International Symposium on Computer, Consumer and Control (IS3C). IEEE, 2018. http://dx.doi.org/10.1109/is3c.2018.00060.

3

Kalogeropoulos, Nikos. "Tsallis entropy and hyperbolicity." In 11TH INTERNATIONAL CONFERENCE OF NUMERICAL ANALYSIS AND APPLIED MATHEMATICS 2013: ICNAAM 2013. AIP, 2013. http://dx.doi.org/10.1063/1.4825870.

4

Prasad, T. Devi, and Tiku T. Tanyimboh. "Entropy Based Design of 'Anytown' Water Distribution Network." In Water Distribution Systems Analysis 2008. Reston, VA: American Society of Civil Engineers, 2009. http://dx.doi.org/10.1061/41024(340)39.

5

Jalali, Shirin, and H. Vincent Poor. "Minimum entropy pursuit: Noise analysis." In 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2017. http://dx.doi.org/10.1109/icassp.2017.7953328.

6

Hu, Peng, and An-ping Yang. "Indefinite Kernel Entropy Component Analysis." In 2010 International Conference on Multimedia Technology (ICMT). IEEE, 2010. http://dx.doi.org/10.1109/icmult.2010.5631137.

7

Jeong, Guhyeon, Euijin Choo, Joosuk Lee, Munkhbayar Bat-Erdene, and Heejo Lee. "Generic unpacking using entropy analysis." In 2010 5th International Conference on Malicious and Unwanted Software (MALWARE). IEEE, 2010. http://dx.doi.org/10.1109/malware.2010.5665789.

8

Tian, Xiaowei, and Liqiu Wang. "Entropy Analysis of Heat Conduction." In ICHMT International Symposium on Advances in Computational Heat Transfer. Connecticut: Begellhouse, 2017. http://dx.doi.org/10.1615/ichmt.2017.1400.

9

Tian, Xiaowei, and Liqiu Wang. "Entropy Analysis of Heat Conduction." In ICHMT International Symposium on Advances in Computational Heat Transfer. Connecticut: Begellhouse, 2017. http://dx.doi.org/10.1615/ichmt.2017.cht-7.1400.

10

De Giorgi, Luigi, Volfango Bertola, Emilio Cafaro, and Carlo Cima. "Numerical Solutions Control by Entropy Analysis." In 2010 14th International Heat Transfer Conference. ASMEDC, 2010. http://dx.doi.org/10.1115/ihtc14-22903.

Abstract:
The rate of entropy generation is used to estimate the average error of approximate numerical solutions of conductive and convective heat transfer problems with respect to the corresponding exact solutions. This is possible because the entropy analysis of simple problems that have exact analytical solutions shows that the rate of entropy generation behaves similarly to the average error of approximate solutions. Two test cases (transient two-dimensional heat conduction with Dirichlet boundary conditions, and free convection between two plates at different temperatures with an internal heat source) are discussed. The results suggest using entropy analysis as a tool for assessing solution methods and for estimating the error of numerical solutions of thermal-fluid-dynamics problems.

Reports on the topic "Entropy analysis"

1

Drost, M. K., and M. D. White. Local entropy generation analysis. Office of Scientific and Technical Information (OSTI), February 1991. http://dx.doi.org/10.2172/6078657.

2

Miller, Erik G., and John W. Fisher III. Independent Components Analysis by Direct Entropy Minimization. Fort Belvoir, VA: Defense Technical Information Center, January 2003. http://dx.doi.org/10.21236/ada603560.

3

Danylchuk, H., V. Derbentsev, Володимир Миколайович Соловйов, and A. Sharapov. Entropy analysis of dynamic properties of regional stock markets. Society for Cultural and Scientific Progress in Central and Eastern Europe, 2016. http://dx.doi.org/10.31812/0564/1154.

Abstract:
This paper examines the entropy analysis of regional stock markets. We propose and empirically demonstrate the effectiveness of using entropy measures such as sample entropy, wavelet entropy and Tsallis entropy as measures of the uncertainty and instability of complex systems such as regional stock markets. Our results show that these entropy measures can be effectively used as crisis prediction indicators.
4

Soloviev, Vladimir, Andrii Bielinskyi, and Viktoria Solovieva. Entropy Analysis of Crisis Phenomena for DJIA Index. [n.p.], June 2019. http://dx.doi.org/10.31812/123456789/3179.

Abstract:
The Dow Jones Industrial Average (DJIA) index, over its 125-year history (since 1896), has experienced many crises of different nature and, reflecting the dynamics of the world stock market, is an ideal model object for the study of quantitative indicators and precursors of crisis phenomena. In this paper, the classification and periodization of crisis events for the DJIA index have been carried out; crashes and critical events have been highlighted. Based on the modern paradigm of the theory of complexity, a spectrum of entropy indicators and precursors of crisis phenomena has been proposed. The entropy of a complex system is not only a measure of uncertainty (like Shannon's entropy) but also a measure of complexity (like the permutation and Tsallis entropy). The complexity of the system changes significantly in a crisis. This fact can be used as an indicator and, in the case of a proactive change, as a precursor of a crisis. Complex systems also have the property of scale invariance, which can be taken into account by calculating the multiscale entropy. The calculations were carried out within the framework of the sliding window algorithm, with a subsequent comparison of the entropy measures of complexity against the dynamics of the DJIA index itself. It is shown that Shannon's entropy is an indicator, and the permutation and Tsallis entropy are precursors, of crisis phenomena to the same extent for both crashes and critical events.
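Of the indicators named in this abstract, permutation entropy is easy to make concrete. The following is a minimal sketch of the standard Bandt-Pompe estimator (our illustration; an actual crisis analysis would apply it in the sliding-window fashion described above):

```python
import math
import random

def permutation_entropy(x, order=3):
    """Normalized Bandt-Pompe permutation entropy: Shannon entropy of
    the distribution of length-`order` ordinal patterns, divided by its
    maximum value log(order!)."""
    counts = {}
    for i in range(len(x) - order + 1):
        window = x[i:i + order]
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    n = sum(counts.values())
    h = -sum(c / n * math.log(c / n) for c in counts.values())
    return h / math.log(math.factorial(order))

# A monotone series exhibits a single ordinal pattern; white noise
# populates all order! patterns nearly uniformly.
random.seed(3)
noise = [random.random() for _ in range(1000)]
print(permutation_entropy(list(range(100))))  # → 0.0
print(permutation_entropy(noise))             # close to 1
```

The intuition behind its use as a crisis precursor is that the pattern distribution collapses (entropy drops) when market dynamics become more ordered ahead of a crash.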
5

Lozar, Robert, Scott Tweddale, Charles Ehlschlaeger, Carey Baxter, and Jeffrey Burkhalter. Testing maximum entropy analysis to define population distributions. Engineer Research and Development Center (U.S.), September 2018. http://dx.doi.org/10.21079/11681/29352.

6

Dustafson, Donald E. Adaptive Time Series Analysis Using Predictive Inference and Entropy. Fort Belvoir, VA: Defense Technical Information Center, December 1987. http://dx.doi.org/10.21236/ada191858.

7

Karwat, H., and Y. Q. Ruan. Entropy analysis on non-equilibrium two-phase flow models. Office of Scientific and Technical Information (OSTI), September 1995. http://dx.doi.org/10.2172/106993.

8

Soloviev, Vladimir, Oleksandr Serdiuk, Serhiy Semerikov, and Arnold Kiv. Recurrence plot-based analysis of financial-economic crashes. [n.p.], October 2020. http://dx.doi.org/10.31812/123456789/4121.

Abstract:
The article considers the possibility of analyzing the dynamics of time-series characteristics obtained from recurrence plots, and of using the studied indicators to detect critical phenomena in economic systems. Based on the analysis of economic time series of different nature, the suitability of the studied characteristics for identifying critical phenomena is assessed. A description of recurrence plots and of the time-series characteristics that can be derived from them is given. Seven characteristics of time series are analyzed, including the self-similarity coefficient, the predictability coefficient, entropy and laminarity. For entropy, several calculation options are considered, each of which allows one to obtain different information about the state of the economic system. The possibility of using the studied characteristics as precursors of critical phenomena in economic systems is analyzed. We demonstrate that the entropy analysis of financial time series in phase space reveals the characteristic recurrent properties of complex systems. The recurrence entropy methodology has several advantages compared to the traditional recurrence entropy defined in the literature, namely a correct evaluation of the signal's level of chaoticity and a weak dependence on parameters. The characteristics were studied on the basis of daily values of the Dow Jones index from 1990 to 2019 and daily oil prices from 1987 to 2019. The behavior of recurrence entropy during critical phenomena in the stock markets of the USA, Germany and France was studied separately. The study found that the delay-time measure, determinism and laminarity can be used as indicators of critical phenomena. It also turned out that recurrence entropy, unlike other entropy indicators of complexity, is both an indicator and an early precursor of crisis phenomena. Directions for further research are outlined.
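The diagonal-line entropy of a recurrence plot (often called ENTR in recurrence quantification analysis) can be sketched as follows. This is a generic textbook version; the threshold, minimum line length and test signal are our choices, not the authors' recurrence-entropy methodology.

```python
import math

def recurrence_entropy(x, eps, l_min=2):
    """Shannon entropy of the diagonal-line-length distribution of the
    recurrence plot R[i][j] = 1 if |x_i - x_j| < eps (upper triangle
    only, main diagonal excluded; lines shorter than l_min ignored)."""
    n = len(x)
    lengths = []
    for d in range(1, n):              # offset of each diagonal
        run = 0
        for i in range(n - d):
            if abs(x[i] - x[i + d]) < eps:
                run += 1
            else:
                if run >= l_min:
                    lengths.append(run)
                run = 0
        if run >= l_min:
            lengths.append(run)
    if not lengths:
        return 0.0
    counts = {}
    for length in lengths:
        counts[length] = counts.get(length, 0) + 1
    total = len(lengths)
    return -sum(c / total * math.log(c / total) for c in counts.values())

# A period-2 signal of length 10 produces full diagonals of lengths
# 8, 6, 4 and 2, i.e. a uniform distribution over four lengths.
print(recurrence_entropy([0.0, 1.0] * 5, eps=0.5))  # → ln 4 ≈ 1.386
```

For real series, the embedding dimension, delay and threshold eps all influence the result, which is why the abstract emphasizes weak parameter dependence as an advantage of the authors' variant.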
9

Zilberman, Mark. Good and Evil from the Point of View of Physics. Intellectual Archive, December 2022. http://dx.doi.org/10.32370/iaj.2763.

Abstract:
The article analyzes the concepts of "good" and "evil" from the point of view of physics. Although the physical concept of "entropy" as a measure of disorder was the first candidate that could serve as the physical basis of these ethical concepts, it is in fact not suitable for this purpose. However, the "entropic potential of the event" Z(T, A), which describes the impact of an event A occurring at moment T0 in a system R on the entropy of this system at a future moment T (T > T0), is well suited for our analysis. The article describes methods for calculating the "entropic potential of the event" for certain real-life events and discusses several other related ideas, such as the "time factor", "averaging" and "universality".
10

Johnson, Donald, and Todd Schaack. Modeling and Analysis of Global and Regional Hydrologic Processes and Appropriate Conservation of Moist Entropy. Office of Scientific and Technical Information (OSTI), June 2007. http://dx.doi.org/10.2172/908633.
