

Longman Dictionary of Contemporary English
▪ So how do we calculate the entropy change of the surroundings?
▪ Thus during a chemical reaction there is always an entropy change.
▪ The Gibbs free energy change therefore takes into account the enthalpy change of a reaction system and its entropy change.
▪ But how does entropy help us to predict whether a change will take place or not?
▪ During this spontaneous process, the entropy of the system therefore increases.
▪ Energy is thus dispersed and so we might expect an increase in entropy and not a decrease.
▪ Eventually the limit is reached where no further packing would be possible and where the configurational entropy therefore vanishes.
▪ The entropy of gases is much higher than the entropy of solids.
▪ They saw disorder steadily growing, like a baby, and called this entropy.
▪ They therefore have a higher entropy after mixing.
The Collaborative International Dictionary

Heat \Heat\ (hēt), n. [OE. hete, hǣte, AS. hǣtu, hǣto, fr. hāt hot; akin to OHG. heizi heat, Dan. hede, Sw. hetta. See Hot.]

  1. A force in nature which is recognized in various effects, but especially in the phenomena of fusion and evaporation, and which, as manifested in fire, the sun's rays, mechanical action, chemical combination, etc., becomes directly known to us through the sense of feeling. In its nature heat is a mode of motion, being in general a form of molecular disturbance or vibration. It was formerly supposed to be a subtile, imponderable fluid, to which was given the name caloric.

    Note: As affecting the human body, heat produces different sensations, which are called by different names, as heat or sensible heat, warmth, cold, etc., according to its degree or amount relatively to the normal temperature of the body.

  2. The sensation caused by the force or influence of heat when excessive, or above that which is normal to the human body; the bodily feeling experienced on exposure to fire, the sun's rays, etc.; the reverse of cold.

  3. High temperature, as distinguished from low temperature, or cold; as, the heat of summer and the cold of winter; heat of the skin or body in fever, etc.

    Else how had the world . . . Avoided pinching cold and scorching heat!

  4. Indication of high temperature; appearance, condition, or color of a body, as indicating its temperature; redness; high color; flush; degree of temperature to which something is heated, as indicated by appearance, condition, or otherwise.

    It has raised . . . heats in their faces.

    The heats smiths take of their iron are a blood-red heat, a white-flame heat, and a sparkling or welding heat.

  5. A single complete operation of heating, as at a forge or in a furnace; as, to make a horseshoe in a certain number of heats.

  6. A violent action unintermitted; a single effort; a single course in a race that consists of two or more courses; as, he won two heats out of three.

    Many causes . . . for refreshment betwixt the heats.

    [He] struck off at one heat the matchless tale of ``Tam o' Shanter.''
    --J. C. Shairp.

  7. Utmost violence; rage; vehemence; as, the heat of battle or party. ``The heat of their division.''

  8. Agitation of mind; inflammation or excitement; exasperation. ``The heat and hurry of his rage.''

  9. Animation, as in discourse; ardor; fervency; as, in the heat of argument.

    With all the strength and heat of eloquence.

  10. (Zoöl.) Sexual excitement in animals; readiness for sexual activity; estrus or rut.

  11. Fermentation.

  12. Strong psychological pressure, as in a police investigation; as, when they turned up the heat, he took it on the lam. [slang]

    Animal heat, Blood heat, Capacity for heat, etc. See under Animal, Blood, etc.

    Atomic heat (Chem.), the product obtained by multiplying the atomic weight of any element by its specific heat. The atomic heat of all solid elements is nearly a constant, the mean value being 6.4.

    Dynamical theory of heat, that theory of heat which assumes it to be, not a peculiar kind of matter, but a peculiar motion of the ultimate particles of matter.

    Heat engine, any apparatus by which a heated substance, as a heated fluid, is made to perform work by giving motion to mechanism, as a hot-air engine, or a steam engine.

    Heat producers. (Physiol.) See under Food.

    Heat rays, a term formerly applied to the rays near the red end of the spectrum, whether within or beyond the visible spectrum.

    Heat weight (Mech.), the product of any quantity of heat by the mechanical equivalent of heat divided by the absolute temperature; -- called also thermodynamic function, and entropy.

    Mechanical equivalent of heat. See under Equivalent.

    Specific heat of a substance (at any temperature), the number of units of heat required to raise the temperature of a unit mass of the substance at that temperature one degree.

    Unit of heat, the quantity of heat required to raise, by one degree, the temperature of a unit mass of water, initially at a certain standard temperature. The temperature usually employed is that of 0° Centigrade, or 32° Fahrenheit.

Douglas Harper's Etymology Dictionary

1868, from German Entropie "measure of the disorder of a system," coined 1865 (on analogy of Energie) by German physicist Rudolph Clausius (1822-1888), in his work on the laws of thermodynamics, from Greek entropia "a turning toward," from en "in" (see en- (2)) + trope "a turning, a transformation" (see trope). The notion is supposed to be "transformation contents." Related: Entropic.

It was not until 1865 that Clausius invented the word entropy as a suitable name for what he had been calling "the transformational content of the body." The new word made it possible to state the second law in the brief but portentous form: "The entropy of the universe tends toward a maximum," but Clausius did not view entropy as the basic concept for understanding that law. He preferred to express the physical meaning of the second law in terms of the concept of disgregation, another word that he coined, a concept that never became part of the accepted structure of thermodynamics.

[Martin J. Klein, "The Scientific Style of Josiah Willard Gibbs," in "A Century of Mathematics in America," 1989]


n.

1. (thermodynamics, countable)
   a. Strictly, thermodynamic entropy: a measure of the amount of energy in a physical system that cannot be used to do work.
   b. A measure of the disorder present in a system.
   c. The capacity factor for thermal energy that is hidden with respect to temperature.
   d. The dispersal of energy; how much energy is spread out in a process, or how widely spread out it becomes, at a specific temperature.
2. (statistics, information theory, countable) A measure of the amount of information and noise present in a signal. Originally a tongue-in-cheek coinage; it has fallen into disuse to avoid confusion with thermodynamic entropy.
3. (uncountable) The tendency of a system that is left to itself to descend into chaos.

  1. n. (communication theory) a numerical measure of the uncertainty of an outcome; "the signal contained thousands of bits of information" [syn: information, selective information]

  2. (thermodynamics) a thermodynamic quantity representing the amount of energy in a system that is no longer available for doing mechanical work; "entropy increases as matter and energy in the universe degrade to an ultimate state of inert uniformity" [syn: randomness, S]


In thermodynamics, entropy (usual symbol S) is a measure of the number of microscopic configurations Ω that correspond to a thermodynamic system in a state specified by certain macroscopic variables. Specifically, assuming that each of the microscopic configurations is equally probable, the entropy of the system is the natural logarithm of that number of configurations, multiplied by the Boltzmann constant k (which provides consistency with the original thermodynamic concept of entropy discussed below, and gives entropy the dimension of energy divided by temperature). Formally,

S = k ln Ω.

For example, gas in a container with known volume, pressure, and temperature could have an enormous number of possible configurations of the individual gas molecules, and which configuration the gas is actually in may be regarded as random. Hence, entropy can be understood as a measure of molecular disorder within a macroscopic system. The second law of thermodynamics states that an isolated system's entropy never decreases. Such systems spontaneously evolve towards thermodynamic equilibrium, the state with maximum entropy. Non-isolated systems may lose entropy, provided their environment's entropy increases by at least that decrement. Since entropy is a state function, the change in entropy of a system is determined by its initial and final states. This applies whether the process is reversible or irreversible. However, irreversible processes increase the combined entropy of the system and its environment.
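As an illustrative sketch (not part of the source text), Boltzmann's formula can be evaluated directly once the number of equally probable microstates is counted. The function names below are invented for illustration; the two-level spin system is a standard textbook example:

```python
import math

# Boltzmann constant in J/K (exact SI value since 2019)
K_B = 1.380649e-23

def boltzmann_entropy(omega):
    """S = k * ln(Omega) for a system with Omega equally probable
    microscopic configurations."""
    return K_B * math.log(omega)

def spin_system_entropy(n_particles):
    """Entropy of n independent two-state particles: Omega = 2**n,
    so S = k * n * ln(2) (computed in log form to avoid overflow)."""
    return K_B * n_particles * math.log(2)

# A single accessible microstate means zero entropy (third law).
print(boltzmann_entropy(1))      # 0.0
print(spin_system_entropy(100))  # ~9.57e-22 J/K for 100 spins
```

Working in log form matters in practice: for even a mole of particles, Ω itself is far too large to represent as a number, but n ln 2 is not.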

The change in entropy (ΔS) of a system was originally defined for a thermodynamically reversible process as

$$\Delta S = \int \frac{\delta Q_\text{rev}}T$$

where T is the absolute temperature of the system, dividing the incremental reversible transfer of heat into that system, δQ_rev. (If heat is transferred out, the sign is reversed, giving a decrease in entropy of the system.) The above definition is sometimes called the macroscopic definition of entropy because it can be used without regard to any microscopic description of the contents of a system. The concept of entropy has been found to be generally useful and has several other formulations. Entropy was discovered when it was noticed to be a quantity that behaves as a function of state, as a consequence of the second law of thermodynamics.
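The macroscopic definition lends itself to a short worked example. The sketch below (hypothetical helper name; the specific heat of water is a standard textbook value) integrates δQ_rev/T for heating at constant specific heat, where δQ_rev = m·c·dT gives ΔS = m·c·ln(T2/T1):

```python
import math

def entropy_change_heating(mass_kg, c_j_per_kg_k, t1_k, t2_k):
    """Integrate dS = delta_Q_rev / T for heating at constant
    specific heat c: delta_Q_rev = m*c*dT, so
    Delta S = m*c*ln(T2/T1)."""
    return mass_kg * c_j_per_kg_k * math.log(t2_k / t1_k)

# Heating 1 kg of water (c ~ 4186 J/(kg K)) from 20 °C to 100 °C:
ds = entropy_change_heating(1.0, 4186.0, 293.15, 373.15)
print(round(ds, 1))  # about 1010 J/K
```

Note that the integral uses temperatures in kelvin; using Celsius values would change the logarithm's ratio and give a wrong answer.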

Entropy is an extensive property. It has the dimension of energy divided by temperature, with the unit joules per kelvin (J K⁻¹) in the International System of Units (or kg m² s⁻² K⁻¹ in terms of base units). But the entropy of a pure substance is usually given as an intensive property: either entropy per unit mass (SI unit: J K⁻¹ kg⁻¹) or entropy per unit amount of substance (SI unit: J K⁻¹ mol⁻¹).

The absolute entropy (S rather than ΔS) was defined later, using either statistical mechanics or the third law of thermodynamics: an otherwise arbitrary additive constant is fixed so that the entropy at absolute zero is zero. In statistical mechanics this reflects the fact that the ground state of a system is generally non-degenerate, so that only one microscopic configuration corresponds to it.

In the modern microscopic interpretation of entropy in statistical mechanics, entropy is the amount of additional information needed to specify the exact physical state of a system, given its thermodynamic specification. Understanding the role of thermodynamic entropy in various processes requires an understanding of how and why that information changes as the system evolves from its initial to its final condition. It is often said that entropy is an expression of the disorder, or randomness of a system, or of our lack of information about it. The second law is now often seen as an expression of the fundamental postulate of statistical mechanics through the modern definition of entropy.

Entropy (disambiguation)

Entropy, in thermodynamics, is a state function originally introduced to explain why part of a thermodynamic system's total energy is unavailable to do useful work.

Entropy may also refer to:

Entropy (Buffy the Vampire Slayer)

"Entropy" is the 18th episode of season 6 of the television series Buffy the Vampire Slayer.

Entropy (information theory)

In information theory, systems are modeled by a transmitter, channel, and receiver. The transmitter produces messages that are sent through the channel. The channel modifies the message in some way. The receiver attempts to infer which message was sent. In this context, entropy (more specifically, Shannon entropy) is the expected value (average) of the information contained in each message. 'Messages' can be modeled by any flow of information.

In a more technical sense, there are reasons (explained below) to define information as the negative of the logarithm of the probability distribution. The probability distribution of the events, coupled with the information amount of every event, forms a random variable whose expected value is the average amount of information, or entropy, generated by this distribution. Units of entropy are the shannon, nat, or hartley, depending on the base of the logarithm used to define it, though the shannon is commonly referred to as a bit.

The logarithm of the probability distribution is useful as a measure of entropy because it is additive for independent sources. For instance, the entropy of a single coin toss is 1 shannon, whereas the entropy of m tosses is m shannons. Generally, you need log2(n) bits to represent a variable that can take one of n values if n is a power of 2. If these values are equally probable, the entropy (in shannons) is equal to that number of bits. Equality between the number of bits and shannons holds only while all outcomes are equally probable. If one of the events is more probable than the others, observation of that event is less informative. Conversely, rarer events provide more information when observed. Since observation of less probable events occurs more rarely, the net effect is that the entropy (thought of as average information) received from non-uniformly distributed data is less than log2(n). Entropy is zero when one outcome is certain. Shannon entropy quantifies all these considerations exactly when a probability distribution of the source is known. The meaning of the events observed (the meaning of messages) does not matter in the definition of entropy. Entropy only takes into account the probability of observing a specific event, so the information it encapsulates is information about the underlying probability distribution, not the meaning of the events themselves.
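The coin-toss arithmetic above can be checked in a few lines. The helper below is a hypothetical sketch of the standard formula H = -Σ p·log2(p):

```python
import math

def shannon_entropy(probs):
    """H = -sum(p * log2(p)) in shannons (bits); zero-probability
    outcomes contribute nothing to the sum."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 shannon
print(shannon_entropy([1.0]))       # certain outcome: 0.0
print(shannon_entropy([0.9, 0.1]))  # biased coin: ~0.469, less than 1
```

The biased-coin result illustrates the text's point: a non-uniform distribution carries less average information than the uniform one over the same outcomes.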

Generally, entropy refers to disorder or uncertainty. Shannon entropy was introduced by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication". Shannon entropy provides an absolute limit on the best possible average length of lossless encoding or compression of an information source. Rényi entropy generalizes Shannon entropy.

Entropy (1977 board game)

Entropy is an abstract strategy board game for two players designed by Eric Solomon in 1977. The game is "based on the eternal conflict in the universe between order and chaos [...] One player is Order, the other Chaos. Order is trying to make patterns vertically and horizontally. Chaos is trying to prevent this."

The game originally employed a 5×5 gameboard, but in 2000 a 7×7 board was introduced to allow deeper strategies. Entropy was awarded a rare 6 out of 6 by Games & Puzzles Magazine in 1981. David Pritchard called the game "a modern classic". It is sold commercially under the names Hyle (a 5×5 board) and Hyle7 (a 7×7 board).

Entropy (anonymous data store)

Entropy was a decentralized, peer-to-peer communication network designed to be resistant to censorship, much like Freenet. Entropy was an anonymous data store written in the C programming language. It pooled the contributed bandwidth and storage space of member computers to allow users to anonymously publish or retrieve information of all kinds. The name Entropy was a backronym for "Emerging Network To Reduce Orwellian Potency Yield", referring to George Orwell's novel Nineteen Eighty-Four and its totalitarian thought police enslaving people by controlling their information.

Entropy was designed to be compatible with the similar Freenet system. As such, any Freenet client could be configured to run on the Entropy network. However, Entropy and Freenet data stores are not compatible with each other and therefore do not share data.

Entropy featured a news interface for reading and posting on the latest Frost message boards from within the client.

Entropy (album)

Entropy is a split vinyl album by Anathallo and Javelins. Each band has one song featured on the album, released in 2005 on Potential Getaway Driver. There were two pressings made, 300 in translucent green and later another 500 in clear vinyl.

Entropy (statistical thermodynamics)

In classical statistical mechanics, the entropy function earlier introduced by Rudolf Clausius is interpreted as statistical entropy using probability theory. The statistical entropy perspective was introduced in 1870 with the work of physicist Ludwig Boltzmann.

Entropy (film)

Entropy is a 1999 film directed by Phil Joanou, starring Stephen Dorff and featuring the Irish rock band U2.

Entropy (comics)

Entropy is a Cosmic Entity in the Marvel Comics Universe who possesses nigh-omnipotence. A representation of Eternity formed at the beginning of time, his purpose is to undo creation so that the cycle of creation and destruction will continue forever. Primarily associated with Genis-Vell of Earth, he was first mentioned in Captain Marvel vol. 5 #2 (Oct. 2000) and first seen in Captain Marvel vol. 5 #4 the same year.

Entropy (computing)

In computing, entropy is the randomness collected by an operating system or application for use in cryptography or other uses that require random data. This randomness is often collected from hardware sources, either pre-existing ones such as mouse movements or specially provided randomness generators. A lack of entropy can have a negative impact on performance and security.
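A minimal sketch of consuming such OS-collected entropy from Python's standard library; both calls draw on the operating system's randomness pool, which is seeded from hardware event timings:

```python
import os
import secrets

# 16 raw bytes straight from the OS entropy pool
raw = os.urandom(16)
print(len(raw))          # 16

# The secrets module wraps the same source for cryptographic use:
# 32 random bytes, hex-encoded (2 hex characters per byte)
token = secrets.token_hex(32)
print(len(token))        # 64
```

Applications should prefer `secrets` (or `os.urandom`) over the `random` module for keys and tokens, since `random` is a deterministic pseudo-random generator not intended for security.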

Entropy (classical thermodynamics)

Entropy is a property of thermodynamic systems. The term entropy was introduced by Rudolf Clausius, who named it from the Greek word τροπή, "transformation". He considered transfers of energy as heat and work between bodies of matter, taking temperature into account. Bodies of radiation are also covered by the same kind of reasoning.

More recently, it has been recognized that the quantity 'entropy' can be derived by considering the actually possible thermodynamic processes simply from the point of view of their irreversibility, not relying on temperature for the reasoning.

Ludwig Boltzmann explained the entropy as a measure of the number of possible microscopic configurations of the individual atoms and molecules of the system (microstates) which comply with the macroscopic state (macrostate) of the system. Boltzmann then went on to show that k ln Ω was equal to the thermodynamic entropy. The factor k has since been known as Boltzmann's constant.

Entropy (arrow of time)

Entropy is the only quantity in the physical sciences (apart from certain rare interactions in particle physics; see below) that requires a particular direction for time, sometimes called an arrow of time. As one goes "forward" in time, the second law of thermodynamics says, the entropy of an isolated system can increase, but not decrease. Hence, from one perspective, entropy measurement is a way of distinguishing the past from the future. However, in thermodynamic systems that are not closed, entropy can decrease with time: many systems, including living systems, reduce local entropy at the expense of an environmental increase, resulting in a net increase in entropy. Examples of such systems and phenomena include the formation of typical crystals, the workings of a refrigerator and living organisms.

Entropy, like temperature, is an abstract concept, yet, like temperature, everyone has an intuitive sense of its effects. Watching a movie, it is usually easy to determine whether it is being run forward or in reverse. When run in reverse, broken glasses spontaneously reassemble, smoke goes down a chimney, wood "unburns" (cooling the environment), and ice "unmelts" (warming the environment). No physical laws are broken in the reversed movie except the second law of thermodynamics, which reflects the time-asymmetry of entropy. An intuitive understanding of the irreversibility of certain physical phenomena (and the accompanying creation of entropy) allows one to make this determination.

By contrast, physical processes at the microscopic level, such as the mechanics of individual particles, do not pick out an arrow of time. Going forward in time, an atom might move to the left, whereas going backward in time the same atom might move to the right; the behavior of the atom is not qualitatively different in either case. It would, however, be an astronomically improbable event if a macroscopic amount of gas that originally filled a container evenly spontaneously shrank to occupy only half the container.

Certain subatomic interactions involving the weak nuclear force violate the conservation of parity, but only very rarely. According to the CPT theorem, this means they should also be time irreversible, and so establish an arrow of time. This, however, is neither linked to the thermodynamic arrow of time, nor has anything to do with our daily experience of time irreversibility.

Entropy (energy dispersal)

In physics education, the concept of entropy is traditionally introduced as a quantitative measure of disorder. While acknowledging that this approach is technically sound, some educators argue that entropy and related thermodynamic concepts are easier to understand if entropy is described as a measure of energy dispersal instead. In this alternative approach, entropy is a measure of energy dispersal or distribution at a specific temperature. Changes in entropy can be quantitatively related to the distribution or spreading out of the energy of a thermodynamic system, divided by its temperature.

The energy dispersal approach to teaching entropy was developed to facilitate teaching entropy to students beginning university chemistry and biology. This new approach also avoids ambiguous terms such as disorder and chaos, which have multiple everyday meanings.

Entropy (journal)

Entropy is a peer-reviewed open access scientific journal covering research on all aspects of entropy and information studies. It was established in 1999 and is published by MDPI. The journal regularly publishes special issues compiled by guest editors. The editor-in-chief is Kevin H. Knuth ( University at Albany, SUNY).

Entropy (1994 board game)

Entropy is a board game by Augustine Carreno published in 1994. It is played on a square board divided into 5×5 cells, with seven black and seven white pieces set up as in the Korean board game Five Field Kono.

The object is to be first to go from the initial position, in which all the player's pieces can move, to a position in which none can. A piece is able to move only when it is in contact, orthogonally or diagonally, with at least one other piece of the same type.

Entropy (video game)

Entropy is a space MMORPG video game developed by the Norwegian game studio Artplant. The company is known for creating the MMORPG Battlestar Galactica Online.

Entropy (order and disorder)

In thermodynamics, entropy is commonly associated with the amount of order, disorder, or chaos in a thermodynamic system. This stems from Rudolf Clausius' 1862 assertion that any thermodynamic process always "admits to being reduced to the alteration in some way or another of the arrangement of the constituent parts of the working body" and that internal work associated with these alterations is quantified energetically by a measure of "entropy" change, according to the following differential expression:

$$\int \frac{\delta Q}{T} \ge 0$$
where δQ is an increment of heat transferred to the working body and T is the absolute temperature at which the transfer takes place.

In the years to follow, Ludwig Boltzmann translated these "alterations" into that of a probabilistic view of order and disorder in gas phase molecular systems.

In recent years, chemistry textbooks have shifted away from the terms "order" and "disorder" toward the concept of energy dispersal to describe entropy. In the 2002 encyclopedia Encarta, for example, entropy is defined as a thermodynamic property which serves as a measure of how close a system is to equilibrium, as well as a measure of the disorder in the system. In the context of entropy, "perfect internal disorder" is synonymous with "equilibrium", but since that definition differs so markedly from the meaning implied in normal speech, the use of the term in science has caused a great deal of confusion and misunderstanding.

Locally, the entropy can be lowered by external action. This applies to machines, such as a refrigerator, where the entropy in the cold chamber is being reduced, and to living organisms. This local decrease in entropy is, however, only possible at the expense of an entropy increase in the surroundings.

Entropy (astrophysics)

In astrophysics, what is referred to as "entropy" is actually the adiabatic constant, derived as follows.

Using the first law of thermodynamics for a quasi-static, infinitesimal process for a hydrostatic system

dQ = dU − dW. 

For an ideal gas in this special case, the internal energy, U, is only a function of the temperature T; therefore the partial derivative of U with respect to T is identical to the full derivative, yielding through some manipulation

dQ = C_V dT + P dV.

Further manipulation, using the differential form of the ideal gas law (P dV + V dP = nR dT), the previous equation, and the relation C_P = C_V + nR, yields

dQ = C_P dT − V dP.

For an adiabatic process dQ = 0 and, recalling $\gamma = \frac{C_{P}}{C_{V}}$, one finds

$$V\,dP = C_{P}\,dT, \qquad P\,dV = -C_{V}\,dT,$$

and dividing the first equation by the second gives

$$\frac{dP}{P} = -\gamma\,\frac{dV}{V}.$$

One can solve this simple differential equation to find

PV^γ = constant = K

This equation is known as an expression for the adiabatic constant, K, also called the adiabat. From the ideal gas equation one also knows

$$P=\frac{\rho k_{B}T}{\mu m_{H}},$$
where k_B is Boltzmann's constant. Substituting this into the above equation along with V = [grams]/ρ and γ = 5/3 for an ideal monatomic gas, one finds

$$K = \frac{k_{B}T}{\mu m_{H} \rho^{2/3}},$$
where μ is the mean molecular weight of the gas or plasma, and m_H is the mass of the hydrogen atom, which is extremely close to the mass of the proton, m_p, the quantity more often used in astrophysical theory of galaxy clusters. This is what astrophysicists refer to as "entropy"; it has units of [keV cm²]. This quantity relates to the thermodynamic entropy as

S = k_B ln Ω + S_0,
where Ω, the density of states in statistical theory, takes on the value of K as defined above.
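The invariance that makes K useful can be sanity-checked numerically: K = P/ρ^γ should be unchanged along an adiabat. The snippet below is an illustrative sketch (the function name and the numbers, roughly sea-level air in SI units, are assumptions for the example):

```python
def adiabat_constant(p, rho, gamma=5.0 / 3.0):
    """K = P / rho**gamma, constant along an adiabat; gamma = 5/3
    for an ideal monatomic gas."""
    return p / rho**gamma

p1, rho1 = 101325.0, 1.2            # pressure (Pa), density (kg/m^3)
k1 = adiabat_constant(p1, rho1)

# Adiabatic compression to half the volume doubles the density and
# raises the pressure by a factor of 2**gamma; K stays the same.
p2, rho2 = p1 * 2 ** (5.0 / 3.0), 2 * rho1
k2 = adiabat_constant(p2, rho2)

print(abs(k1 - k2) < 1e-9 * k1)     # True
```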


Usage examples of "entropy".

Even if you clean your cluttered desk, decreasing its entropy, the total entropy, including that of your body and the air in the room, actually increases.

Lord of Entropy was not affected, nor was Earthma, but the latter was so stunned by the turn of events that she forgot her son for long enough that Antaeus was visibly diminished by its attempt to continue the assault.

The Lord of Entropy regularly sent electronic messages suggesting revisions and requesting additions to his Palace of Bones.

Other times, discordant sounds rose, seemingly from the dust and rubble itself, squeals of entropy and when they fell away the silence seemed even deeper.

Suffice it to say that the rearrangement of the interior of the sack would have provided more than ample evidence for Shelyid to have from its study, had he the wits, derived brilliant treatises on heretofore unknown aspects of Brownian motion and entropy.

The second Bioroid was an articulated fortress-bulbous-forearmed, bulbous-legged-one moment, and a superheated gas cloud headed for entropy the next.

In the endless heartbeat before that, he was a silent demon of entropy who waited to unwind the clockspring of Fowler's universe, the better to celebrate its last anechoic tick.

There is a drive toward complexification that is directly opposed to the physical law of entropy.

For human history can be viewed as an attempt to countervene the inevitable chaos of entropy.

Metal crystallizes, dissimilar metals react to each other, corrosion eats on everything… Entropy in a closed system increases over time - that's the second law of thermodynamics.

Since the entropy in your room has certainly decreased, Bekenstein reasoned that the only way to satisfy the second law of thermodynamics would be for the black hole to have entropy, and for this entropy to sufficiently increase as matter is pumped into it to offset the observed entropic decrease outside the black hole's exterior.

The process of decreation is unstoppable and irreversible, much like entropy.

When that sort of economic powerhouse expanded into the vicinity of star systems which could scarcely keep their heads above water, the train of events leading to eventual incorporation extended itself with the inevitability of entropy.

Changes of entropy can be calculated only for a reversible process, and may then be defined as the ratio of the amount of heat taken up to the absolute temperature at which the heat is absorbed.

I found that as the atomic numbers became higher the rate of entropy slowed.