Definition of entropy


Entropy (S) is one of the central concepts of thermodynamics. It is a state function that provides a measure of the disorder of a system and is also a measure of the amount of energy dissipated as heat during a spontaneous process. Entropy calculations are important in different fields of knowledge, from physics, chemistry and biology, to social sciences such as economics, finance and sociology.

Having such a wide variety of applications, it is not surprising that there are different concepts or definitions of entropy. Below, the two main concepts of entropy are presented: the thermodynamic concept and the statistical concept.

Entropy of processes versus entropy of a system

Entropy is a property of thermodynamic systems, represented in the literature by the letter S. It is a state function, which means that it is one of the variables that define the state a system is in. It also means that entropy depends only on the particular state of the system, and not on how the system arrived at that state.

This means that, when we talk about the entropy of a system in a certain state, we do so in the same way as we would talk about its temperature or volume. However, it is also common to calculate the entropy change that occurs when a system passes from one state to another. For example, we can calculate the entropy change for the vaporization of a sample of water, or for the chemical reaction between oxygen and iron to give ferric oxide. In any of these cases we speak of process entropies, when in reality we should speak of the entropy changes associated with those processes.

In other words, when we talk about the entropy of a sample of methane gas at 25 °C and 3.0 atmospheres of pressure (in which case we are describing a particular state of said gas), we refer to the entropy of the system, also called absolute entropy or S.

Instead, when we talk about the entropy of burning a sample of gaseous methane at 25 °C and 3.0 atmospheres of pressure in the presence of oxygen to give carbon dioxide and water, we are talking about the entropy of a process that involves a change in the state of the system and therefore a change in the entropy of the system. In other words, in these cases we refer to a change in entropy, or ΔS.

When defining entropy, it is essential to be clear about whether we are talking about S or ΔS, since they are not the same. That being said, there are two basic concepts of entropy: the original thermodynamic concept and the statistical concept. Both are equally important: the first because it established entropy as an indispensable variable for understanding the spontaneity of all natural macroscopic processes in the universe (in the microscopic realm of quantum mechanics things get a bit murkier), and the second because it provides an intuitive interpretation of what the entropy of a system really means.

Thermodynamic definition of entropy (ΔS)

The original concept of entropy is associated with processes of change in a system; in them, a part of the internal energy is dissipated in the form of heat. This is something that happens in every natural or spontaneous process and forms the basis of the second law of thermodynamics, which is arguably one of the most important (and limiting) laws in science.

Consider, for example, the case of releasing a ball and letting it bounce on the ground. When we hold a ball at a certain height, it has a certain amount of potential energy. When we release the ball, it falls, transforming potential energy into kinetic energy until it hits the ground. At that moment, the kinetic energy is stored again as potential energy, this time elastic, which is then released when the ball bounces.

Under ideal conditions all the initial potential energy would be conserved after the bounce, which would mean that the ball should return to its initial height. However, even if we remove the air completely (to eliminate friction), experience tells us that the ball never bounces back to its initial height; instead, it reaches a lower and lower height after each bounce until it comes to rest on the ground.

It is evident that the repeated bouncing of the ball on the ground ends up completely dissipating all the potential energy that the object had at the beginning of our little experiment. This occurs because each time the ball bounces, it transfers part of its energy to the ground in the form of heat, which in turn is randomly dissipated along the ground itself.

In thermodynamics, entropy, or rather the entropy change, is defined as the heat released or absorbed by a system during a reversible transformation divided by the absolute temperature. That is to say:

\[ dS = \frac{\delta q_{\text{rev}}}{T} \]

This expression represents an infinitesimal variation of entropy for a process of any type carried out reversibly, that is, infinitely slowly. To obtain the entropy change of a real, measurable process we must integrate this expression:

\[ \Delta S = \int_{i}^{f} \frac{\delta q_{\text{rev}}}{T} \]

Since entropy is a state function, the previous expression implies that the entropy variation of a system between any initial state and any final state can be found by choosing a reversible path between both states and integrating along it. For the simplest case of an isothermal transformation, the integrated expression becomes:

\[ \Delta S = \frac{q_{\text{rev}}}{T} \]
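As a simple worked illustration (using approximate handbook values, ΔH_vap ≈ 40.7 kJ/mol and T_b = 373.15 K, chosen here only as an example), the vaporization of one mole of water at its normal boiling point is an isothermal process whose reversible heat is the enthalpy of vaporization, so:

\[ \Delta S_{\text{vap}} = \frac{q_{\text{rev}}}{T} = \frac{\Delta H_{\text{vap}}}{T_b} \approx \frac{40\,700\ \text{J/mol}}{373.15\ \text{K}} \approx 109\ \text{J/(mol·K)} \]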

Statistical definition of entropy (S)

Austrian theoretical physicist Ludwig Boltzmann is famous for his countless contributions to science, but mainly for his statistical interpretation of entropy. Boltzmann deduced a relationship between entropy and the way in which molecules are distributed in different energy levels at a given temperature. This distribution, called the Boltzmann distribution, predicts that the population of molecules in a given energy state at a given temperature decreases exponentially with the energy level of the state. Furthermore, at higher temperatures, a greater number of energy states will be accessible.
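Written in symbols (a standard form of the distribution, added here for clarity, with N_i the number of molecules in the state of energy E_i out of N total molecules):

\[ \frac{N_i}{N} = \frac{e^{-E_i/k_B T}}{\sum_j e^{-E_j/k_B T}} \]

The exponential factor makes high-energy states sparsely populated at low temperature, while raising T makes more states appreciably accessible, exactly as described above.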

These and other additional observations are summarized in the equation that today bears his name, that is, the Boltzmann equation:

\[ S = k_B \ln W \]

In this equation, S represents the entropy of the system in a particular state, W represents the number of microstates of that state, and k_B is a proportionality constant called Boltzmann's constant. These microstates are the different ways in which the atoms and molecules that make up the system can be arranged while keeping the total energy of the system constant.
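As a classic textbook illustration of this formula (a hypothetical example added for concreteness), consider a crystal in which each of the N_A molecules of one mole can adopt one of two equivalent orientations, so that W = 2^(N_A). Using R = N_A·k_B ≈ 8.314 J/(mol·K):

\[ S = k_B \ln 2^{N_A} = N_A k_B \ln 2 = R \ln 2 \approx 5.76\ \text{J/(mol·K)} \]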

The number of microstates is traditionally associated with the level of disorder in a system. To understand why, let's consider a drawer where we keep a large number of socks. The color of the socks can be associated with the energy level in which they are found. Thus, the Boltzmann distribution predicts that, at sufficiently low temperatures, practically all socks will be of a single color (the one corresponding to the lowest energy state). In this case, no matter how we arrange the socks, the result will always look the same (since they are all identical), so there will be only one microstate (W = 1).

However, as we increase the temperature, some of these socks will change to a second color. Even if only one pair of socks changes color (moves to the second energy state), the fact that any one of the socks could be the one to change means that many different microstates can exist. As the temperature rises and more states begin to be populated, more and more sock colors appear in the drawer, vastly increasing the number of possible microstates, which in turn makes the drawer look more and more disordered.
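A minimal Python sketch of this counting argument (the drawer size of 20 socks and the number of "promoted" socks are invented purely for illustration) shows how quickly W, and with it S = k_B ln W, grows:

from math import comb, log

K_B = 1.380649e-23  # Boltzmann's constant in J/K

def entropy_from_microstates(w: int) -> float:
    # Boltzmann equation: S = k_B * ln(W)
    return K_B * log(w)

N_SOCKS = 20  # hypothetical number of socks in the drawer

# W = number of ways of choosing which socks have changed to the second color
for promoted in range(5):
    w = comb(N_SOCKS, promoted)
    print(f"{promoted} socks in the second color: W = {w:5d}, "
          f"S = {entropy_from_microstates(w):.3e} J/K")

Even this toy drawer goes from a single microstate (W = 1, S = 0) to thousands of microstates after only a few socks change color.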

Since the above equation predicts that entropy increases as the number of microstates increases, that is, as the system becomes more disordered, the Boltzmann equation defines entropy as a measure of the disorder of a system.

Units of entropy

From either of the two definitions presented, it can be seen that entropy has units of energy divided by temperature. That is to say:

\[ [S] = \frac{\text{energy}}{\text{temperature}} \]

Depending on the system of units in which you work, these units can be:

Unit system | Entropy units
International System (SI) | J/K
Base units of the metric system | m²·kg/(s²·K)
Imperial system | BTU/°R
Calories | cal/K
Other units | kJ/K, kcal/K
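For example (a simple conversion added for illustration, using the standard equivalence 1 cal = 4.184 J), an entropy of 10 cal/K expressed in SI units is:

\[ 10\ \frac{\text{cal}}{\text{K}} \times 4.184\ \frac{\text{J}}{\text{cal}} = 41.84\ \frac{\text{J}}{\text{K}} \]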

