Entropy
I don't get your HEN's matter!
In thermodynamics, entropy is an extensive state function that accounts for the effects of irreversibility in thermodynamic systems, particularly in heat engines during an engine cycle. While the concept of energy is central to the first law of thermodynamics, which deals with the conservation of energy, the concept of entropy is central to the second law of thermodynamics, which deals with physical processes and whether they occur spontaneously. Spontaneous changes occur with an increase in entropy. In simple terms, entropy change is related to a change to a more ordered or disordered state at a microscopic level, which is an early visualisation of the motional energy of molecules, and to the idea of dissipation of energy via intermolecular friction and collisions. In recent years, entropy, from a nonmathematical perspective, has been interpreted in terms of the "dispersal" of energy.
Quantitatively, entropy, symbolized by S, is defined by the differential quantity dS = δQ / T, where δQ is the amount of heat absorbed in a reversible process in which the system goes from one state to another, and T is the absolute temperature.[2] Entropy is one of the factors that determines the free energy of the system.
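To make the definition dS = δQ / T concrete, here is a small sketch (not from the original post; the ice-melting numbers are standard textbook values, assumed for illustration). At constant temperature the differential integrates to ΔS = Q / T:

```python
# Entropy change for a reversible process at constant temperature:
# dS = dQ / T integrates to dS_total = Q / T when T is constant.
# Assumed example: melting 0.1 kg of ice at 0 degrees C.
L_fusion = 334000.0  # latent heat of fusion of water, J/kg
m = 0.1              # mass of ice, kg
T = 273.15           # melting point, K

Q = m * L_fusion     # heat absorbed reversibly, J
dS = Q / T           # entropy change of the melting ice, J/K
print(f"Q  = {Q:.0f} J")
print(f"dS = {dS:.2f} J/K")
```

Note that T must be the absolute temperature in kelvin; using Celsius here would give a meaningless (even negative) entropy.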
When a system's energy is defined as the sum of its "useful" energy (e.g. that used to push a piston) and its "useless" energy, i.e. energy that cannot be used for external work, then entropy may be visualized most concretely as the "scrap" or "useless" energy whose prevalence over the total energy of the system is directly proportional to the absolute temperature of the system, as in the Gibbs free energy and Helmholtz free energy relations.
In terms of statistical mechanics, entropy describes the number of possible microscopic configurations of the system. The statistical definition of entropy is generally thought to be the more fundamental definition, from which all other important properties of entropy follow. Although the concept of entropy was originally a thermodynamic construct, it has been adapted in other fields of study, including information theory, psychodynamics, and thermoeconomics.
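The statistical definition mentioned above is Boltzmann's S = k_B ln Ω, where Ω counts the microscopic configurations. A toy sketch (my own example, not from the post): N two-state particles, where a macrostate is specified by how many are "up". The macrostate with the most microstates has the highest entropy:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(N, n):
    """S = k_B * ln(Omega) for a macrostate of n 'up' particles out of N."""
    omega = math.comb(N, n)       # number of microstates for this macrostate
    return k_B * math.log(omega)

# A perfectly ordered macrostate has only one microstate, so S = 0.
S_ordered = boltzmann_entropy(100, 0)
# The half-and-half macrostate has the most microstates -> maximum entropy.
S_mixed = boltzmann_entropy(100, 50)
print(S_ordered, S_mixed)
```

This is why "more disorder" loosely corresponds to "more entropy": the mixed macrostate can be realized in vastly more ways than the ordered one.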
Thanx!

 Coral
 Posts: 429
 Joined: Tue Aug 30, 2005 10:09 am
 Location: Yogyakarta, indonesia, southeast asia
Because it is isothermal with the egg, no energy will flow into or out of the reservoir (at any one time).
But if something disturbs the isothermal condition, a heat flow will arise, either into the egg or into the reservoir. Since the reservoir is infinite, the only thing that can change is the egg's temperature: if that happens, heat will flow into the egg and restore the isothermal condition.
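The entropy bookkeeping for this egg-and-reservoir picture can be sketched as follows (all numbers are my own assumptions, not from the thread): an egg initially colder than the reservoir absorbs heat until they are isothermal. The egg's entropy change comes from integrating dS = m·c·dT/T; the infinite reservoir stays at its own temperature, so its change is just −Q/T_res. The total comes out positive, as the second law requires for this irreversible process:

```python
import math

# Assumed values: egg heat capacity m*c, initial egg and reservoir temperatures.
m_c = 0.3 * 3200.0   # egg mass (kg) times specific heat (J/kg/K) -> J/K
T_egg = 293.15       # initial egg temperature, K (20 C)
T_res = 373.15       # reservoir temperature, K (100 C)

# Heat that flows from the reservoir into the egg until they are isothermal.
Q = m_c * (T_res - T_egg)

# Egg warms from T_egg to T_res: integrate dS = m*c*dT/T.
dS_egg = m_c * math.log(T_res / T_egg)

# The infinite reservoir stays at T_res, so its entropy change is -Q/T_res.
dS_res = -Q / T_res

dS_total = dS_egg + dS_res
print(f"Q into egg   = {Q:.0f} J")
print(f"dS_egg       = {dS_egg:+.1f} J/K")
print(f"dS_reservoir = {dS_res:+.1f} J/K")
print(f"dS_total     = {dS_total:+.1f} J/K")
```

The egg gains more entropy than the reservoir loses because the egg absorbs the heat at temperatures below T_res, so dS_total > 0.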