Entropy (journal)
Entropy is a quarterly, peer-reviewed scientific journal that publishes open-access articles in all fields related to entropy[1].
The current editor-in-chief is Peter Harremoës[2].
References
Categories:
- Open-access chemistry journals
- Multidisciplinary journals
- Publications established in 1999
Wikimedia Foundation. 2010.
Content subject to the CC-BY-SA license. Source: the Entropy (journal) article from the French Wikipedia (authors).