The concept of entropy was introduced by the German physicist Rudolf Clausius in 1865. Originally used to describe the "degradation" of energy, it is one of the state parameters of matter and is widely used in thermodynamics.
At the time, however, entropy was only a physical quantity measurable through changes in heat, and its underlying nature remained poorly understood. It was only with the development of statistical physics, information theory, and related fields that the essence of entropy was gradually clarified: entropy measures the degree of internal disorder of a system.
It has important applications in cybernetics, probability theory, number theory, astrophysics, the life sciences, and other fields, with more specific definitions in different disciplines. Viewed mathematically, these specialized definitions are essentially unified with one another, and entropy is a very important parameter in all of these fields.
Enthalpy
In thermodynamics, enthalpy is an important state parameter representing the energy of a material system; it is usually denoted by the symbol H. Physically, enthalpy is the thermodynamic (internal) energy of the system plus the product pV.
Enthalpy has the dimension of energy. When a substance of a given mass changes from one state to another through a reversible constant-pressure process, the increase in its enthalpy equals the heat absorbed in that process.
Enthalpy is defined as H = U + pV, where U is the internal energy of the substance, p is the pressure, and V is the volume.
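To make the definition concrete, here is a minimal Python sketch of H = U + pV. The state values (U1, V1, U2, V2) and the pressure are hypothetical, chosen only to illustrate that in a constant-pressure process the enthalpy increment equals the heat absorbed:

```python
# Minimal sketch of the definition H = U + pV (hypothetical values, SI units).

def enthalpy(U, p, V):
    """Enthalpy in joules: internal energy U [J] plus pressure p [Pa] times volume V [m^3]."""
    return U + p * V

# Two states of the same gas at the same (constant) pressure:
p = 101_325.0            # Pa, standard atmospheric pressure
U1, V1 = 5_000.0, 0.010  # J, m^3  (assumed initial state)
U2, V2 = 5_400.0, 0.012  # J, m^3  (assumed final state)

dH = enthalpy(U2, p, V2) - enthalpy(U1, p, V1)
# At constant pressure the heat absorbed is Q_p = dU + p*dV, which equals dH:
Q_p = (U2 - U1) + p * (V2 - V1)
assert abs(dH - Q_p) < 1e-9
print(f"ΔH = {dH:.2f} J = Q_p")
```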
Further information:
Enthalpy is a physical quantity related to internal energy. Under given conditions, whether a reaction is endothermic or exothermic depends on the enthalpy difference between the products and the reactants, that is, the enthalpy change (ΔH). The energy released or absorbed during a chemical reaction can be expressed as heat; this quantity is called the heat of reaction, also known as the "enthalpy change".
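As an illustration, the following Python sketch computes ΔH = Σ H_f(products) − Σ H_f(reactants) for the combustion of methane, using commonly tabulated standard enthalpies of formation; the reaction_enthalpy helper is hypothetical, written only for this example:

```python
# Sketch: reaction enthalpy from standard enthalpies of formation (kJ/mol).
# The values below are commonly cited textbook figures.

H_f = {
    "CH4(g)": -74.8,
    "O2(g)":    0.0,    # elements in their standard state have H_f = 0
    "CO2(g)": -393.5,
    "H2O(l)": -285.8,
}

def reaction_enthalpy(reactants, products):
    """ΔH = Σ n·H_f(products) − Σ n·H_f(reactants); n are stoichiometric coefficients."""
    total = lambda side: sum(n * H_f[species] for species, n in side.items())
    return total(products) - total(reactants)

# Combustion of methane: CH4 + 2 O2 -> CO2 + 2 H2O
dH = reaction_enthalpy({"CH4(g)": 1, "O2(g)": 2}, {"CO2(g)": 1, "H2O(l)": 2})
print(f"ΔH = {dH:.1f} kJ/mol")   # negative, so the reaction is exothermic
```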
Enthalpy is a state quantity and enthalpy change is a process quantity, just as instantaneous speed is a state quantity and average speed is a process quantity.

In 1877, Boltzmann put forward the statistical-physics interpretation of entropy.
In a series of papers, he showed that the macroscopic physical properties of a system can be regarded as the equal-probability statistical average over all of its possible microscopic states.
For example, consider an ideal gas in a container. A microscopic state can be specified by the position and momentum of each gas atom. All admissible microscopic states must satisfy two conditions: (i) all particles lie within the volume of the container; (ii) the total kinetic energy of the atoms equals the total energy of the gas.
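Boltzmann's relation S = k_B ln W connects this counting of microscopic states to entropy, where W is the number of microstates compatible with a given macrostate. The Python sketch below applies it to a toy model rather than the full position-and-momentum description above: N distinguishable particles that may each sit in the left or right half of a container:

```python
# Sketch of Boltzmann's S = k_B * ln(W) for a toy model: N distinguishable gas
# particles, each independently in the left or right half of a container.
# W counts the microstates compatible with a given macrostate (n particles left).

import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact under the 2019 SI definition)

def entropy(N, n_left):
    """Entropy of the macrostate with n_left of N particles in the left half."""
    W = math.comb(N, n_left)     # number of microstates for this macrostate
    return k_B * math.log(W)

N = 100
for n in (0, 25, 50):
    print(f"n_left = {n:3d}   S = {entropy(N, n):.3e} J/K")
# S is largest at n_left = N/2: the evenly mixed macrostate has the most
# microstates, which is why it corresponds to equilibrium, the state of
# greatest "internal disorder".
```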