Academic literature on the topic 'Entropy'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Entropy.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Entropy"

1

Siagian, Ruben Cornelius, Lulut Alfaris, Arip Nurahman, and Eko Pramesti Sumarto. "TERMODINAMIKA LUBANG HITAM: HUKUM PERTAMA DAN KEDUA SERTA PERSAMAAN ENTROPI." Jurnal Kumparan Fisika 6, no. 1 (May 11, 2023): 1–10. http://dx.doi.org/10.33369/jkf.6.1.1-10.

Abstract:
This article examines the thermodynamic concepts that apply to black holes, namely the first and second laws of thermodynamics. The first law of thermodynamics connects changes in mass with changes in entropy and work, allowing black holes to be treated as thermodynamic systems with a temperature and an entropy. The second law of thermodynamics states that the entropy of an isolated system in thermodynamic equilibrium always increases or remains constant, including for black holes. The method used in this article involves a mathematical derivation of the black hole entropy, combining the second law of thermodynamics with black hole thermodynamics, where the entropy can be expressed as a function of the area of the event horizon. The article highlights the importance of the concepts of entropy and black hole thermodynamics for understanding the universe, as well as their applications in various fields of science. Keywords: black hole, thermodynamics, entropy, first law of thermodynamics, second law of thermodynamics.
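For orientation, the horizon-area entropy and the first law referred to in this abstract are usually written in the standard Bekenstein-Hawking form (a textbook statement, not taken from the article itself):

$$ S_{BH} = \frac{k_B c^3 A}{4 G \hbar}, \qquad dM = \frac{\kappa}{8\pi}\,dA + \Omega_H\,dJ + \Phi_H\,dQ \quad (G = c = 1), $$

where $A$ is the event-horizon area and $\kappa$ the surface gravity; the second law then generalizes to $\delta(S_{BH} + S_{\mathrm{outside}}) \ge 0$.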
2

Kang, Jin-Wen, Ke-Ming Shen, and Ben-Wei Zhang. "A Note on the Connection between Non-Additive Entropy and h-Derivative." Entropy 25, no. 6 (June 9, 2023): 918. http://dx.doi.org/10.3390/e25060918.

Abstract:
In order to study as a whole a wide part of entropy measures, we introduce a two-parameter non-extensive entropic form with respect to the h-derivative, which generalizes the conventional Newton–Leibniz calculus. This new entropy, $S_{h,h'}$, is proved to describe the non-extensive systems and recover several types of well-known non-extensive entropic expressions, such as the Tsallis entropy, the Abe entropy, the Shafee entropy, the Kaniadakis entropy and even the classical Boltzmann–Gibbs one. As a generalized entropy, its corresponding properties are also analyzed.
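Two of the limiting cases that the abstract says the new entropy $S_{h,h'}$ recovers are the Tsallis and Boltzmann-Gibbs entropies; their standard forms (with $k_B = 1$) are:

$$ S_q^{\mathrm{Tsallis}} = \frac{1 - \sum_i p_i^{\,q}}{q - 1} \;\xrightarrow{\;q \to 1\;}\; S_{\mathrm{BG}} = -\sum_i p_i \ln p_i . $$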
3

Li, Shu-Nan, and Bing-Yang Cao. "On Entropic Framework Based on Standard and Fractional Phonon Boltzmann Transport Equations." Entropy 21, no. 2 (February 21, 2019): 204. http://dx.doi.org/10.3390/e21020204.

Abstract:
Generalized expressions of the entropy and related concepts in non-Fourier heat conduction have attracted increasing attention in recent years. Based on standard and fractional phonon Boltzmann transport equations (BTEs), we study entropic functionals including entropy density, entropy flux and entropy production rate. Using the relaxation time approximation and power series expansion, macroscopic approximations are derived for these entropic concepts. For the standard BTE, our results can recover the entropic frameworks of classical irreversible thermodynamics (CIT) and extended irreversible thermodynamics (EIT) as if there exists a well-defined effective thermal conductivity. For the fractional BTEs corresponding to the generalized Cattaneo equation (GCE) class, the entropy flux and entropy production rate will deviate from the forms in CIT and EIT. In these cases, the entropy flux and entropy production rate will contain fractional-order operators, which reflect memory effects.
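For context, the entropy flux and entropy production rate of classical irreversible thermodynamics (CIT), which the paper recovers in the standard-BTE limit, take the textbook form (the paper's generalized fractional-order expressions are not reproduced here):

$$ \mathbf{J}_s = \frac{\mathbf{q}}{T}, \qquad \sigma_s = \mathbf{q} \cdot \nabla\!\left(\frac{1}{T}\right) = -\frac{\mathbf{q}\cdot\nabla T}{T^2} \;\ge\; 0, $$

which with Fourier's law $\mathbf{q} = -k\nabla T$ gives $\sigma_s = k\,|\nabla T|^2 / T^2$.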
4

KOSSAKOWSKI, A., M. OHYA, and N. WATANABE. "QUANTUM DYNAMICAL ENTROPY FOR COMPLETELY POSITIVE MAP." Infinite Dimensional Analysis, Quantum Probability and Related Topics 02, no. 02 (June 1999): 267–82. http://dx.doi.org/10.1142/s021902579900014x.

Abstract:
A dynamical entropy not only for shifts but also for completely positive (CP) maps is defined by generalizing the AOW entropy, defined through a quantum Markov chain, and the AF entropy, defined by a finite operational partition. Our dynamical entropy is numerically computed for several models.
5

Huang, Yiqi. "An overview of the development and applications of information entropy." Theoretical and Natural Science 50, no. 1 (August 27, 2024): 52–57. http://dx.doi.org/10.54254/2753-8818/50/20240663.

Abstract:
With information entropy gradually taking the lead in the development of modern information theory, it has come to hold greater influence over multiple research areas as well as technological innovation. This paper aims to clarify people's confusion about the development of entropy theory and provide a brief overview of its origins, including Shannon's original proposal, variants such as relative entropy and conditional entropy, and entropy concepts proposed by other scientists, such as the Rényi and Tsallis entropies. The paper also covers current applications of entropy and research hotspots, and predicts future trends in entropy research. This research paper adds coherence and consistency to the development of information entropy, helping more people to better understand the concept of entropy and its derivation. At the same time, by pointing to hotspots in entropy-related fields of study, this paper hopes to attract more people to devote themselves to studying entropy-related fields and to boost technological development.
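As a minimal numerical illustration of two of the quantities surveyed here, Shannon entropy and relative entropy can be computed as follows (the function names and test values are illustrative, not from the paper):

import numpy as np

def shannon_entropy(p, base=2):
    # H(X) = -sum p(x) log p(x), with 0 log 0 taken as 0
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)) / np.log(base))

def relative_entropy(p, q, base=2):
    # D(p || q) = sum p(x) log(p(x) / q(x)); assumes q > 0 wherever p > 0
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])) / np.log(base))

print(shannon_entropy([0.5, 0.5]))               # 1.0 bit for a fair coin
print(shannon_entropy([0.9, 0.1]))               # about 0.47 bits for a biased coin
print(relative_entropy([0.9, 0.1], [0.5, 0.5]))  # about 0.53 bits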
6

Huang, Yiqi. "An overview of the development and applications of information entropy." Theoretical and Natural Science 42, no. 1 (August 27, 2024): 52–57. http://dx.doi.org/10.54254/2753-8818/42/20240663.

7

Jawad, Abdul, and Ayesha Iqbal. "Modified cosmology through Renyi and logarithmic entropies." International Journal of Geometric Methods in Modern Physics 15, no. 08 (June 22, 2018): 1850130. http://dx.doi.org/10.1142/s021988781850130x.

Abstract:
We consider two different entropic corrections to the Bekenstein entropy, namely the Renyi entropy and the logarithmic-corrected entropy, and develop the entropic force, the heat flow across the horizon and the pressure. We also derive the expressions for Newton's law of gravitation and verify them against the Bekenstein entropy by taking [Formula: see text] in the case of Renyi entropy and [Formula: see text] for logarithmic entropy. The modified Friedmann equations are also developed by using the first law of thermodynamics in both cases. In the presence of these equations, we also analyze the validity of the generalized second law of thermodynamics for both entropy corrections on the apparent horizon. It is found that this law remains valid throughout the region under certain assumptions for a non-flat FRW universe.
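The two corrected entropies named in the abstract are commonly written in the literature in forms such as the following (the paper's own parametrization is hidden behind the '[Formula: see text]' placeholders, so these are generic expressions, not the article's):

$$ S_{R} = \frac{1}{\lambda}\,\ln\!\left(1 + \lambda\, S_{BH}\right), \qquad S_{\log} = S_{BH} + \alpha \ln S_{BH} + \cdots, $$

both of which reduce to the Bekenstein entropy $S_{BH}$ when the correction parameters ($\lambda$, $\alpha$, ...) are sent to zero.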
8

XIAO, CHANGMING, and LIXIN HUANG. "ENTROPIC FORCE IN A CLOSED IDEAL GAS." Modern Physics Letters B 20, no. 09 (April 10, 2006): 495–500. http://dx.doi.org/10.1142/s0217984906010731.

Abstract:
For a closed thermodynamic system of ideal gas, the entropic force is studied in this paper. The results show that the entropic force arises when the entropy is driven away from its equilibrium maximum value by an external force. This entropic force resists further growth of the entropy deviation and will drive the entropy back to its maximum value if the external force disappears.
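The entropic force discussed here is the standard thermodynamic notion: for a system at temperature $T$ whose entropy depends on a constrained coordinate $x$, the force conjugate to $x$ is (textbook definition, not the paper's derivation):

$$ F_{\mathrm{entropic}} = T\,\frac{\partial S}{\partial x}, $$

so any displacement that pushes $S$ below its maximum produces a restoring force driving the system back toward the maximum-entropy state.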
9

Silva, Carlos, and Kalyan Annamalai. "Entropy Generation and Human Aging: Lifespan Entropy and Effect of Physical Activity Level." Entropy 10, no. 2 (June 20, 2008): 100–123. http://dx.doi.org/10.3390/entropy-e10020100.

10

Zhao, Lina, Chengyu Liu, Shoushui Wei, Qin Shen, Fan Zhou, and Jianqing Li. "A New Entropy-Based Atrial Fibrillation Detection Method for Scanning Wearable ECG Recordings." Entropy 20, no. 12 (November 26, 2018): 904. http://dx.doi.org/10.3390/e20120904.

Abstract:
Entropy-based atrial fibrillation (AF) detectors have been applied for short-term electrocardiogram (ECG) analysis. However, existing methods suffer from several limitations. To enhance the performance of entropy-based AF detectors, we have developed a new entropy measure, named EntropyAF, which includes the following improvements: (1) use of a ranged function rather than the Chebyshev function to define vector distance, (2) use of a fuzzy function to determine vector similarity, (3) replacement of the probability estimation with density estimation for entropy calculation, (4) use of a flexible distance threshold parameter, and (5) use of entropy results adjusted for the heart rate effect. EntropyAF was trained using the MIT-BIH Atrial Fibrillation (AF) database, and tested on clinical wearable long-term AF recordings. Three previous entropy-based AF detectors were used for comparison: sample entropy (SampEn), fuzzy measure entropy (FuzzyMEn) and coefficient of sample entropy (COSEn). For classifying AF and non-AF rhythms in the MIT-BIH AF database, EntropyAF achieved the highest area under the receiver operating characteristic curve (AUC) value of 98.15% when using a 30-beat time window, which was higher than COSEn with an AUC of 91.86%. SampEn and FuzzyMEn resulted in much lower AUCs of 74.68% and 79.24% respectively. For classifying AF and non-AF rhythms in the clinical wearable AF database, EntropyAF also generated the largest values of Youden index (77.94%), sensitivity (92.77%), specificity (85.17%), accuracy (87.10%), positive predictivity (68.09%) and negative predictivity (97.18%). COSEn had the second-best accuracy of 78.63%, followed by an accuracy of 65.08% for FuzzyMEn and an accuracy of 59.91% for SampEn. The newly proposed EntropyAF also generated the highest classification accuracy when using a 12-beat time window. In addition, the results of the time cost analysis verified the efficiency of the new EntropyAF. This study showed the better discrimination ability of the EntropyAF method for identifying AF, indicating that it would be useful for practical clinical wearable AF scanning.
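For comparison with the baselines mentioned above, a minimal sketch of the classical sample entropy (SampEn) measure is given below; the parameter choices (m = 2, r = 0.2 SD) follow common practice and the input series is synthetic, not data from the study:

import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    # SampEn(m, r) = -ln(A / B): B counts pairs of length-m templates whose
    # Chebyshev distance is below r, A counts the same for length m + 1;
    # self-matches are excluded by comparing each template only with later ones.
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)
    n = len(x)

    def count_pairs(length):
        templates = np.array([x[i:i + length] for i in range(n - length)])
        count = 0
        for i in range(len(templates) - 1):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(d < r))
        return count

    b = count_pairs(m)
    a = count_pairs(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")

# Synthetic RR-interval-like series (seconds); illustrative values only.
rng = np.random.default_rng(0)
rr = 0.8 + 0.05 * rng.standard_normal(300)
print(sample_entropy(rr))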

Dissertations / Theses on the topic "Entropy"

1

Bernier, Jobe Paul. "Entropy and Architecture: entropic phenomena actuating dynamic space." Thesis, Montana State University, 2008. http://etd.lib.montana.edu/etd/2008/bernier/BernierJ0508.pdf.

2

Sognnæs, Ida Andrea Braathen. "Maximum Entropy and Maximum Entropy Production in Macroecology." Thesis, Norges teknisk-naturvitenskapelige universitet, Institutt for fysikk, 2011. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-12651.

Abstract:
The Maximum Entropy Theory of Ecology (METE), developed by John Harte, presents an entirely new method of making inferences in ecology. The method is based on the established mathematical procedure of Maximum Information Entropy (MaxEnt), developed by Edwin T. Jaynes, and is used to derive a range of important relationships in macroecology. The Maximum Entropy Production (MEP) principle is a more recent theory. This principle was used by Paltridge to successfully predict the climate on Earth in 1975. It has been suggested that this principle can be used for predicting the evolution of ecosystems over time in the framework of METE. This idea is at the very frontier of Harte's theory. This thesis investigates the hypothesis that the information entropy defined in METE is described by the MEP principle. I show that the application of the MEP principle to the information entropy in METE leads to a range of conceptual and mathematical difficulties. I show that the initial hypothesis alone cannot predict the time rate of change, but that it does predict that the number of individual organisms and the total metabolic rate of an ecosystem will continue to grow indefinitely, whereas the number of species will approach one. I also conduct a thorough review of the MEP literature and discuss the possibility of an application of the MEP principle to METE based on analogies. I also study a proof of the MEP principle published by Dewar in 2003 and 2005 in order to investigate the possibility of an application based on first principles. I conclude that the MEP principle has a low probability of success if applied directly to the information entropy in METE. One of the most central relationships derived in METE is the expected number of species in a plot of area $A$. I conduct a numerical simulation in order to study the variance of the actual number of species in a collection of plots. I then suggest two methods to be used for comparison between predictions and observations in METE. I also conduct a numerical study of selected stability properties of Paltridge's climate model and conclude that none of these can explain the observed MEP state in nature.
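The MaxEnt procedure of Jaynes that METE builds on can be stated in its standard textbook form (this is the generic machinery, not the thesis's specific constraints): maximize $H = -\sum_i p_i \ln p_i$ subject to $\sum_i p_i = 1$ and $\sum_i p_i f_k(i) = \langle f_k \rangle$; the Lagrange-multiplier solution is

$$ p_i = \frac{1}{Z(\lambda_1, \ldots, \lambda_K)} \exp\!\Big(-\sum_k \lambda_k f_k(i)\Big), \qquad Z = \sum_i \exp\!\Big(-\sum_k \lambda_k f_k(i)\Big). $$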
3

Asaad-Sultan, Asaad M. Abu. "Entropic vector optimization and simulated entropy : theory and applications." Thesis, University of Liverpool, 1990. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.293838.

4

Cullen, Carley Nicole. "Empathy + entropy." Thesis, University of Iowa, 2019. https://ir.uiowa.edu/etd/6721.

5

Šelinga, Martin. "Software pro hodnocení zdrojů entropie." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2019. http://www.nusl.cz/ntk/nusl-401953.

Abstract:
This thesis is focused on exploring sources of entropy. It includes a description of random number generators and of the tests used to evaluate entropy quality. A random number generator for Windows and Linux was created, together with software for entropy evaluation. Subsequently, entropy measurements were performed on physical workstations and in cloud environments.
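A toy version of the kind of evaluation described here, estimating the per-byte Shannon entropy and the most-common-value min-entropy of a sample, might look like the following (our own sketch for illustration, not the thesis software; serious assessments use test suites such as NIST SP 800-90B):

import collections
import math
import os

def byte_entropies(data: bytes):
    # Plug-in estimates over the empirical byte distribution of `data`:
    # Shannon entropy and min-entropy, both in bits per byte (ideal source: 8.0).
    counts = collections.Counter(data)
    n = len(data)
    probs = [c / n for c in counts.values()]
    shannon = -sum(p * math.log2(p) for p in probs)
    min_entropy = -math.log2(max(probs))  # most-common-value estimate
    return shannon, min_entropy

# The OS randomness source should score close to 8 bits/byte on both measures.
print(byte_entropies(os.urandom(1_000_000)))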
6

Mendes, Ronã Rinston Amaury [UNESP]. "Uma contribuição para a otimização de portfólios de séries heteroscedásticas usando projeto de experimento de misturas: uma abordagem do desirability aplicada a modelos." Universidade Estadual Paulista (UNESP), 2012. http://hdl.handle.net/11449/103053.

Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
This thesis presents a new Design of Experiments (DOE)-based approach to treat multi-objective portfolio optimization, combining Mixture Design of Experiments (MDE) and desirability functions to find an optimal portfolio modeled by an ARMA-GARCH algorithm. In this kind of experimental strategy, the design factors are treated as proportions in a mixture system considered quite adequate for treating portfolios in general. Instead of using traditional MVP mathematical programming, the concept of the desirability function is used here to solve the multiobjective nonlinear optimization problem for the predicted conditional values of return (mean), risk (variance) and entropy, with their respective response surfaces estimated by the MDE. To avoid the portfolio's lack of diversity, the principle of Shannon's maximum entropy is embodied in the optimization model. The computer-aided desirability tuning method proposed in this thesis improves the performance of the desirability algorithm, leading to efficient asset allocation. This approach also allows the inclusion of risk aversion in the optimization routine and encompasses the interaction (nonlinear) effects among the several assets, while reducing the computational effort required to solve the constrained nonlinear optimization problem. To assess the proposal's feasibility, the method is tested with a real data set formed by weekly world crude oil spot prices. The numerical results verify the proposal's adequacy.
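The Shannon-entropy diversification term mentioned in the abstract can be illustrated with a few made-up weight vectors (the mixture-design and ARMA-GARCH parts of the method are not reproduced here):

import numpy as np

def portfolio_entropy(weights):
    # Shannon entropy of the weight vector; it is largest (ln N) for equal
    # weights, so maximizing it alongside return pushes toward diversification.
    w = np.asarray(weights, dtype=float)
    w = w[w > 0]
    return float(-np.sum(w * np.log(w)))

print(portfolio_entropy([0.25, 0.25, 0.25, 0.25]))  # ln 4, about 1.386 (fully diversified)
print(portfolio_entropy([0.97, 0.01, 0.01, 0.01]))  # about 0.168 (concentrated)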
7

Mendes, Ronã Rinston Amaury. "Uma contribuição para a otimização de portfólios de séries heteroscedásticas usando projeto de experimento de misturas: uma abordagem do desirability aplicada a modelos /." Guaratinguetá : [s.n.], 2012. http://hdl.handle.net/11449/103053.

Advisor: Anderson Paulo de Paiva
Co-advisor: Pedro Paulo Balestrassi
Committee member: Marcela Aparecida Guerreira Machado de Freitas
Committee member: Antonio Fernando Branco Costa
Committee member: Rafael Coradi Leme
Committee member: João Roberto Ferreira
Doctorate
8

Pougaza, Doriano-Boris. "Utilisation de la notion de copule en tomographie." Phd thesis, Université Paris Sud - Paris XI, 2011. http://tel.archives-ouvertes.fr/tel-00684637.

Abstract:
This thesis concerns the link between tomography and the notion of copula. X-ray tomography consists of (re)constructing the hidden structure of an object (a matter density, the distribution of a physical quantity, or a joint probability density) from data obtained or measured from the object (projections, radiographs, marginal densities). The link between the measurements and the object is modelled mathematically by the X-ray transform or the Radon transform. For example, in parallel-geometry imaging problems, when only two projections at the angles 0 and π/2 (horizontal and vertical) are available, the problem can be identified with another very important mathematical problem: determining a joint density from its marginals. Restricted to two projections, both problems are ill-posed in the sense of Hadamard. One must therefore add prior information or additional constraints. The main contribution of this thesis is the use of criteria based on several entropies (Rényi, Tsallis, Burg, Shannon) that lead to a regularized solution. The work thus spans several domains: the mathematical aspects of tomography through its fundamental element, the Radon transform, and, in probability, the search for a joint distribution given its marginal distributions, which leads to the notion of 'copula' via Sklar's theorem. With only two projections this problem is extremely difficult, but by identifying the two (normalized) projections with marginal densities and the image to be reconstructed with a probability density, the link is made: the two problems are equivalent and can be transposed into the statistical framework. To characterize all the possible images to be reconstructed, we chose the tool provided by probability theory, namely copulas, and to choose among the copulas (or images) we imposed a prior-information criterion based on different entropies. Entropy is an important scientific quantity because it is used in various domains (thermodynamics, information theory, etc.). Using the Rényi entropy, for example, we discovered new classes of copulas. This thesis brings new contributions to imaging through the interaction between the domains of tomography and probability and statistics.
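Sklar's theorem, which the abstract invokes to connect the two-projection problem with copulas, states (textbook form):

$$ H(x, y) = C\big(F_X(x), F_Y(y)\big), \qquad h(x, y) = c\big(F_X(x), F_Y(y)\big)\, f_X(x)\, f_Y(y), \qquad c(u, v) = \frac{\partial^2 C(u, v)}{\partial u\, \partial v}, $$

so, identifying the two normalized projections with the marginal densities $f_X$ and $f_Y$, every admissible reconstructed image corresponds to a choice of copula density $c$.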
9

Nilsson, Mattias. "Entropy and Speech." Doctoral thesis, Stockholm : Sound and Image Processing Laboratory, School of Electrical Engineering, Royal Institute of Technology, 2006. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-3990.

10

Charter, Mark Keith. "Maximum entropy pharmacokinetics." Thesis, University of Cambridge, 1989. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.316691.


Books on the topic "Entropy"

1

Ivanovici, Andreea Livia. Entropie, volatilitate: Entropy, volatility. Bucureşti: Editura Fundaţiei Arhitext Design, 2014.

2

Shiner, J. S., ed. Entropy and entropy generation: Fundamentals and applications. Dordrecht: Kluwer, 1996.

3

Shiner, J. S., ed. Entropy and Entropy Generation. Dordrecht: Springer Netherlands, 2002. http://dx.doi.org/10.1007/0-306-46932-4.

4

Greven, Andreas, Gerhard Keller, and Gerald Warnecke, eds. Entropy. Princeton, N.J.: Princeton University Press, 2003.

5

Dunbar, Max, ill. Micronauts: Entropy. San Diego, CA: Idea & Design Works, LLC, 2016.

6

Bryant, John. Entropy man. Harpenden, Herts, UK: VOCAT International Ltd, 2015.

7

Rifkin, Jeremy. Entropy: A new world view. London: Paladin, 1985.

8

Rifkin, Jeremy. Entropy: Into the greenhouse world. New York: Bantam Books, 1989.

9

Scherer, Leopoldo García Colín. De la máquina de vapor al cero absoluto: Calor y entropía. 3rd ed. México: SEP, 2003.

10

Karmeshu, ed. Entropy Measures, Maximum Entropy Principle and Emerging Applications. Berlin, Heidelberg: Springer Berlin Heidelberg, 2003. http://dx.doi.org/10.1007/978-3-540-36212-8.


Book chapters on the topic "Entropy"

1

Herwig, Heinz. "Entropie S * (entropy S *)." In Wärmeübertragung A-Z, 43–47. Berlin, Heidelberg: Springer Berlin Heidelberg, 2000. http://dx.doi.org/10.1007/978-3-642-56940-1_10.

2

Landsberg, P. T., A. De Vos, P. Baruch, and J. E. Parrott. "Multiple Source Photovoltaics." In Entropy and Entropy Generation, 175–95. Dordrecht: Springer Netherlands, 1996. http://dx.doi.org/10.1007/0-306-46932-4_12.

3

Gibson, Jerry D. "Differential Entropy, Entropy Rate, and Maximum Entropy." In Synthesis Lectures on Engineering, Science, and Technology, 13–21. Cham: Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-65388-9_3.

4

Jones, Gareth A., and J. Mary Jones. "Entropy." In Springer Undergraduate Mathematics Series, 35–53. London: Springer London, 2000. http://dx.doi.org/10.1007/978-1-4471-0361-5_3.

5

Moses, Carl O. "Entropy." In Encyclopedia of Earth Sciences Series, 1–6. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-39193-9_40-1.

6

Moses, Carl O. "Entropy." In Encyclopedia of Earth Sciences Series, 447–53. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-39312-4_40.

7

Sprackling, Michael. "Entropy." In Heat and Thermodynamics, 61–81. London: Macmillan Education UK, 1993. http://dx.doi.org/10.1007/978-1-349-12690-3_6.

8

Sprackling, Michael. "Entropy." In Thermal physics, 97–116. London: Macmillan Education UK, 1991. http://dx.doi.org/10.1007/978-1-349-21377-1_8.

9

Coudène, Yves. "Entropy." In Universitext, 101–12. London: Springer London, 2016. http://dx.doi.org/10.1007/978-1-4471-7287-1_10.

10

Iordache, Octavian. "Entropy." In Understanding Complex Systems, 125–42. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-17946-4_8.


Conference papers on the topic "Entropy"

1

Ding, Ni, Mohammad Amin Zarrabian, and Parastoo Sadeghi. "A Cross Entropy Interpretation of Renyi Entropy for $\alpha$ -leakage." In 2024 IEEE International Symposium on Information Theory (ISIT), 2760–65. IEEE, 2024. http://dx.doi.org/10.1109/isit57864.2024.10619672.

2

Kocaoglu, Murat, Alexandros G. Dimakis, Sriram Vishwanath, and Babak Hassibi. "Entropic Causality and Greedy Minimum Entropy Coupling." In 2017 IEEE International Symposium on Information Theory (ISIT). IEEE, 2017. http://dx.doi.org/10.1109/isit.2017.8006772.

3

Li, Jiange, Arnaud Marsiglietti, and James Melbourne. "Entropic Central Limit Theorem for Rényi Entropy." In 2019 IEEE International Symposium on Information Theory (ISIT). IEEE, 2019. http://dx.doi.org/10.1109/isit.2019.8849533.

4

Man'ko, Margarita A., Guillaume Adenier, Andrei Yu Khrennikov, Pekka Lahti, Vladimir I. Man'ko, and Theo M. Nieuwenhuizen. "Tomographic Entropy and New Entropic Uncertainty Relations." In Quantum Theory. AIP, 2007. http://dx.doi.org/10.1063/1.2827295.

5

Hermenier, Fabien, Xavier Lorca, Jean-Marc Menaud, Gilles Muller, and Julia Lawall. "Entropy." In the 2009 ACM SIGPLAN/SIGOPS international conference. New York, New York, USA: ACM Press, 2009. http://dx.doi.org/10.1145/1508293.1508300.

6

Stenholm, Stig. "When is an Entropy an Entropy?" In QUANTUM THEORY: Reconsideration of Foundations - 3. AIP, 2006. http://dx.doi.org/10.1063/1.2158728.

7

Arias, Cesar, Felipe Diaz, and Per Sundell. "Gibbons–Hawking entropy as entanglement entropy." In PROCEEDINGS OF THE 23RD INTERNATIONAL SCIENTIFIC CONFERENCE OF YOUNG SCIENTISTS AND SPECIALISTS (AYSS-2019). AIP Publishing, 2019. http://dx.doi.org/10.1063/1.5130124.

8

Zhang, Hong, and Sha-sha He. "Analysis and Comparison of Permutation Entropy, Approximate Entropy and Sample Entropy." In 2018 International Symposium on Computer, Consumer and Control (IS3C). IEEE, 2018. http://dx.doi.org/10.1109/is3c.2018.00060.

9

Gutierrez, Rafael M., Chandrashekhar U. Murade, Jianfeng Guo, and George Shubeita. "Entropy and entropic forces to model biological fluids." In Entropy 2021: The Scientific Tool of the 21st Century. Basel, Switzerland: MDPI, 2021. http://dx.doi.org/10.3390/entropy2021-09781.

10

Xiang, Gang, and Vladik Kreinovich. "Extending maximum entropy techniques to entropy constraints." In NAFIPS 2010 - 2010 Annual Meeting of the North American Fuzzy Information Processing Society. IEEE, 2010. http://dx.doi.org/10.1109/nafips.2010.5548264.


Reports on the topic "Entropy"

1

Cordwell, William, and Mark Torgerson. PUF Entropy. Office of Scientific and Technical Information (OSTI), March 2023. http://dx.doi.org/10.2172/2431723.

2

Xu, X., S. Kini, P. Psenak, C. Filsfils, S. Litkowski, and M. Bocci. Signaling Entropy Label Capability and Entropy Readable Label Depth Using OSPF. RFC Editor, August 2021. http://dx.doi.org/10.17487/rfc9089.

3

Jaegar, Stefan. Entropy, Perception, and Relativity. Fort Belvoir, VA: Defense Technical Information Center, April 2006. http://dx.doi.org/10.21236/ada453569.

4

Drost, M. K., and M. D. White. Local entropy generation analysis. Office of Scientific and Technical Information (OSTI), February 1991. http://dx.doi.org/10.2172/6078657.

5

Vu, Vincent Q., Bin Yu, and Robert E. Kass. Coverage Adjusted Entropy Estimation. Fort Belvoir, VA: Defense Technical Information Center, June 2007. http://dx.doi.org/10.21236/ada472999.

6

Xu, X., S. Kini, P. Psenak, C. Filsfils, S. Litkowski, and M. Bocci. Signaling Entropy Label Capability and Entropy Readable Label Depth Using IS-IS. RFC Editor, August 2021. http://dx.doi.org/10.17487/rfc9088.

7

Balachandran, A. P., L. Chandar, and A. Momen. Edge states and entanglement entropy. Office of Scientific and Technical Information (OSTI), February 1996. http://dx.doi.org/10.2172/212697.

8

Melendez, Eduardo. Steganography Detection Using Entropy Measures. Fort Belvoir, VA: Defense Technical Information Center, August 2012. http://dx.doi.org/10.21236/ada586643.

9

Melendez, Eduardo. Steganography Detection Using Entropy Measures. Fort Belvoir, VA: Defense Technical Information Center, November 2012. http://dx.doi.org/10.21236/ada622733.

10

Linville, Lisa M., Joshua James Michalenko, and Dylan Zachary Anderson. Multimodal Data Fusion via Entropy Minimization. Office of Scientific and Technical Information (OSTI), March 2020. http://dx.doi.org/10.2172/1614682.
