Meaning of ENTROPY
Pronunciation: 'entrupee
WordNet Dictionary

Definition:
- [n] (communication theory) a numerical measure of the uncertainty of an outcome; "the signal contained thousands of bits of information"
- [n] (thermodynamics) a measure of the amount of energy in a system that is no longer available for doing work; entropy increases as matter and energy in the universe degrade to an ultimate state of inert uniformity

Synonyms: information, selective information

Antonyms: ectropy

See Also: conformational entropy, information measure, physical phenomenon

Products Dictionary

Definition: Entropy: Jake is a film director trying to juggle the demands of his career with his personal life when he suddenly falls in love with a runway model. Features U2, with scenes filmed during their "Popmart" tour.

Webster's 1913 Dictionary

Definition: \En"tro*py\, n. [Gr. ἐντροπία a turning in; ἐν in + τροπή a turn, fr. τρέπειν to turn.] (Thermodynamics) A certain property of a body, expressed as a measurable quantity, such that when there is no communication of heat the quantity remains constant, but when heat enters or leaves the body the quantity increases or diminishes. If a small amount, h, of heat enters the body when its temperature is t in the thermodynamic scale, the entropy of the body is increased by h / t. The entropy is regarded as measured from some standard temperature and pressure. Sometimes called the thermodynamic function.

   The entropy of the universe tends towards a maximum. --Clausius.
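
The rule in the Webster definition, that entropy grows by h / t when a small amount of heat h enters a body at absolute temperature t, can be sketched numerically. A minimal illustration in Python (the figures are invented for the example):

```python
def entropy_change(heat_in, temperature):
    """Increase in a body's entropy when a small amount of heat
    `heat_in` enters it at absolute temperature `temperature`,
    per the definition above: dS = h / t."""
    return heat_in / temperature

# Invented figures: 100 J of heat entering a body held at 300 K
# increases its entropy by 100/300, or about 0.3333, J/K.
print(round(entropy_change(100.0, 300.0), 4))  # 0.3333
```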

Computing Dictionary

Definition: A measure of the disorder of a system. Systems tend to go from a state of order (low entropy) to a state of maximum disorder (high entropy). The entropy of a system is related to the amount of information it contains: a highly ordered system can be described using fewer bits of information than a disordered one. For example, a string containing one million "0"s can be described using run-length encoding as [("0", 1000000)], whereas a string of random symbols (e.g. bits, or characters) will be much harder, if not impossible, to compress in this way.

Shannon's formula gives the entropy H(M) of a message M, in bits:

    H(M) = -log2 p(M)

where p(M) is the probability of message M.
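
The compressibility argument above can be sketched in a few lines of Python; a minimal illustration (the function names are my own, not from the entry):

```python
import math
from itertools import groupby

def run_length_encode(s):
    """Collapse runs of repeated symbols into (symbol, count) pairs."""
    return [(ch, len(list(grp))) for ch, grp in groupby(s)]

def self_information_bits(p):
    """Shannon's formula from the entry above, H(M) = -log2 p(M):
    the information content, in bits, of a message with probability p."""
    return -math.log2(p)

# A highly ordered string collapses to a single pair...
print(run_length_encode("0" * 1000000))  # [('0', 1000000)]

# ...while a disordered string gains little or nothing this way.
print(run_length_encode("0110100011"))

# A message occurring with probability 1/1024 carries exactly 10 bits.
print(self_information_bits(1 / 1024))  # 10.0
```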

Biology Dictionary

Definition: The amount of disorder in a system.

Thesaurus Terms

Related Terms: abeyance, aloofness, amorphia, amorphism, amorphousness, anarchy, apathy, bit, blurriness, catalepsy, catatonia, channel, chaos, communication explosion, communication theory, confusion, data retrieval, data storage, deadliness, deathliness, decoding, derangement, diffusion, disarrangement, disarray, disarticulation, discomfiture, discomposure, disconcertedness, discontinuity, discreteness, disharmony, dishevelment, disintegration, disjunction, dislocation, disorder, disorderliness, disorganization, dispersal, dispersion, disproportion, disruption, dissolution, disturbance, dormancy, EDP, electronic data processing, encoding, formlessness, fuzziness, haphazardness, haziness, incoherence, inconsistency, indecisiveness, indefiniteness, indeterminateness, indifference, indiscriminateness, indolence, inertia, inertness, information explosion, information theory, inharmonious harmony, irregularity, languor, latency, lotus-eating, messiness, mistiness, most admired disorder, noise, nonadhesion, noncohesion, nonsymmetry, nonuniformity, obscurity, orderlessness, passiveness, passivity, perturbation, promiscuity, promiscuousness, randomness, redundancy, scattering, separateness, shapelessness, signal, stagnancy, stagnation, stasis, suspense, torpor, turbulence, unadherence, unadhesiveness, unclearness, unsymmetry, untenacity, ununiformity, upset, vagueness, vegetation, vis inertiae