Definition of entropy:
A) Thermodynamics: a measure of the unavailable energy in a closed thermodynamic system, usually also considered a measure of the system’s disorder; it is a property of the system’s state, and it varies directly with any reversible change in heat in the system and inversely with the temperature of the system.
B) Broadly: the degree of disorder or uncertainty in a system.
C) The degradation of the matter and energy in the universe to an ultimate state of inert uniformity.
D) A process of degradation or running down or a trend to disorder.
E) Chaos, disorganization, randomness.
F) Statistical mechanics: a factor or quantity that is a function of the physical state of a mechanical system and is proportional to the logarithm of the probability of occurrence of the particular molecular arrangement in that state.
G) Communication theory: a measure of the efficiency of a system (such as a code or a language) in transmitting information, equal to the logarithm of the number of different messages that can be sent by selecting from the same set of symbols, and thus indicating the degree of initial uncertainty that can be resolved by any one message.
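Sense A has a standard quantitative form, the Clausius relation; a sketch, where \(dS\) is the infinitesimal entropy change, \(\delta Q_{\mathrm{rev}}\) the heat exchanged along a reversible path, and \(T\) the absolute temperature (this is the "directly with reversible heat, inversely with temperature" dependence stated above):

```latex
dS = \frac{\delta Q_{\mathrm{rev}}}{T}
```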
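Sense F corresponds to Boltzmann's formula; a sketch, where \(k_B\) is the Boltzmann constant and \(W\) is the number of microscopic arrangements (microstates) consistent with the system's macroscopic state, so that more probable macrostates, those realizable in more ways, carry higher entropy:

```latex
S = k_B \ln W
```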
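Sense G can be illustrated numerically. A minimal sketch, assuming all messages are equally likely (the Hartley case): with an alphabet of `num_symbols` symbols and messages of length `message_length`, there are `num_symbols ** message_length` distinct messages, and the information resolved by any one message is the logarithm (base 2, for bits) of that count. The function name and parameters here are illustrative, not from the definition itself.

```python
import math

def message_entropy_bits(num_symbols: int, message_length: int) -> float:
    """Information (in bits) carried by one message chosen uniformly
    from all messages of the given length over the given alphabet.
    Equal to log2 of the number of distinct possible messages."""
    num_messages = num_symbols ** message_length
    return math.log2(num_messages)

# An 8-symbol alphabet and messages of length 4 give 8**4 = 4096
# distinct messages, so each message resolves log2(4096) = 12 bits.
print(message_entropy_bits(8, 4))  # → 12.0
```

Note that doubling the alphabet size adds only one bit per symbol, which is the "logarithm of the number of different messages" scaling the definition describes.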