Some remarks and refinements in the definition of entropy

As we have given them, the definitions of the phase space volume and entropy are not correct and can give rise to annoying problems and paradoxes.

The dimension of the phase space volume

To begin with, we can note that the definition of entropy given in the fundamental postulate of statistical mechanics, $S = k_B \ln \Omega$, makes no sense as it stands, since $\Omega$ is dimensional. In fact, $\Omega$ is defined as an integral over the positions $Q$ and momenta $P$ of the $N$ particles:

$$\Omega(E, V, N) = \int_{E \leq \mathcal{H}(Q,P) \leq E + \delta E} dQ \, dP \, , \qquad dQ \, dP := \prod_{i=1}^{N} d^3 q_i \, d^3 p_i$$

so that $[\Omega] = ([q][p])^{3N} = (\mathrm{J \cdot s})^{3N}$, i.e. $\Omega$ has the dimensions of an action to the power $3N$.

Therefore, if we want to take the logarithm of $\Omega$ we must first make it dimensionless; to do so we must introduce a constant $h$ with the dimensions of an action, $[h] = [q][p] = \mathrm{J \cdot s}$, so that we can actually redefine the phase space volume as:

$$\Omega(E, V, N) = \frac{1}{h^{3N}} \int_{E \leq \mathcal{H}(Q,P) \leq E + \delta E} dQ \, dP$$

which is now genuinely dimensionless, so the fundamental postulate makes sense.
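To make the dimensional bookkeeping explicit, here is a minimal sketch (in Python, not part of the original text) that represents a physical dimension as a tuple of exponents of mass, length and time, and checks that dividing by $h^{3N}$ indeed leaves a dimensionless quantity:

```python
# A minimal sketch: represent a physical dimension as a tuple of exponents of
# (mass, length, time) and check that dQ dP / h^(3N) is dimensionless.

def dim_mul(a, b):
    """Product of two quantities: dimension exponents add."""
    return tuple(x + y for x, y in zip(a, b))

def dim_pow(a, n):
    """n-th power of a quantity: dimension exponents are multiplied by n."""
    return tuple(n * x for x in a)

LENGTH   = (0, 1, 0)                  # [q] = m
MOMENTUM = (1, 1, -1)                 # [p] = kg m / s
ACTION   = dim_mul(LENGTH, MOMENTUM)  # [h] = [q][p] = kg m^2 / s = J s

N = 5  # any particle number works the same way
omega  = dim_pow(ACTION, 3 * N)                   # [dQ dP] = ([q][p])^(3N)
scaled = dim_mul(omega, dim_pow(ACTION, -3 * N))  # [dQ dP / h^(3N)]

print(omega)   # (15, 30, -15): dimensional, so its logarithm is ill-defined
print(scaled)  # (0, 0, 0): dimensionless, so the logarithm now makes sense
```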


A legitimate question could now be: what is the value of $h$? Which constant is it? Unfortunately, within classical statistical mechanics it is impossible to establish it; only in quantum statistical mechanics does the constant acquire a precise meaning, and in particular it turns out that $h$ is Planck's constant:

$$h = 6.626 \cdot 10^{-34} \, \mathrm{J \cdot s}$$

Let us finally note that dividing by $h^{3N}$ and integrating over the phase space can also be interpreted as dividing the whole phase space into cells of volume $h^{3N}$, thus "discretizing" the possible states of the system by considering as equivalent all the points inside a cell[1], and summing over these cells[2]: in this sense the entropy gives a measure of the number of such possible states.
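As a rough numerical illustration of this cell-counting picture (a sketch with made-up values, not part of the original text), we can count how many cells of area $h$ fit inside the region $\mathcal{H}(q,p) \leq E$ for a one-dimensional harmonic oscillator; the region is an ellipse of area $2\pi E/\omega$, so the count should approach $E/(\hbar\omega)$, which is precisely the number of quantum states with energy below $E$:

```python
import numpy as np

# Count phase space cells of area h inside H(q, p) <= E for a 1D harmonic
# oscillator, H = p^2/(2m) + m w^2 q^2 / 2. The region is an ellipse of area
# 2*pi*E/w, so the count should approach (2*pi*E/w)/h = E/(hbar*w).
h = 6.626e-34                    # Planck's constant, J s
m, w, E = 1e-26, 1e13, 1e-16     # mass (kg), frequency (rad/s), energy (J); illustrative

q_max = np.sqrt(2 * E / (m * w**2))  # ellipse semi-axis along q
p_max = np.sqrt(2 * m * E)           # ellipse semi-axis along p

dq = np.sqrt(h * q_max / p_max)      # cell sides chosen so that dq * dp = h
dp = h / dq

# Centers of the cells of a grid covering the bounding rectangle:
q = np.arange(-q_max, q_max, dq) + dq / 2
p = np.arange(-p_max, p_max, dp) + dp / 2
Q, P = np.meshgrid(q, p)

n_cells = np.sum(P**2 / (2 * m) + m * w**2 * Q**2 / 2 <= E)
hbar = h / (2 * np.pi)
print(n_cells, E / (hbar * w))   # both ~ 9.5e4, up to discretization error
```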

Extensivity of the entropy and the Gibbs paradox

Even with the introduction of $h$ we still have some problems. In particular, from the expression of $\Omega$ for an ideal gas we have:

$$\ln \Omega(E, V, N) = \ln\left(\frac{3N}{2} \frac{\delta E}{E}\right) + N \ln\left[V \left(\frac{2\pi m E}{h^2}\right)^{3/2}\right] - \ln \Gamma\left(\frac{3N}{2} + 1\right)$$

Since $E$ and $N$ are both extensive, $E/N$ is intensive and thus remains constant in the thermodynamic limit; the first term can be rewritten as $\ln[(3\delta E/2)/(E/N)]$, so it remains of order one while the other terms grow like $N$, and for large $N$ we can neglect it[3]. Using Stirling's approximation $\ln \Gamma(3N/2 + 1) \approx (3N/2)\ln(3N/2) - 3N/2$, in the end:

$$\ln \Omega(E, V, N) = N \ln\left[V \left(\frac{4\pi m E}{3 N h^2}\right)^{3/2}\right] + \frac{3}{2} N$$

and therefore the entropy of the system will be:

$$S(E, V, N) = N k_B \ln\left[V \left(\frac{4\pi m E}{3 N h^2}\right)^{3/2}\right] + \frac{3}{2} N k_B$$
This expression is rather problematic. First of all, we can note that written in this way it is not extensive, since it contains the term $N k_B \ln V$, in which both $N$ and $V$ are extensive; as a consequence $S(\lambda E, \lambda V, \lambda N) \neq \lambda S(E, V, N)$.

Furthermore, it gives rise to a catastrophic paradox known as the Gibbs paradox. In order to understand it, let us consider a system divided by a wall into two subsystems, each made of $N_j$ particles of mass $m_j$, occupying a volume $V_j$ with energy $E_j$ (with $j = 1, 2$). The entropy of the system in this initial state is:

$$S_i = \sum_{j=1}^{2} \left\{ N_j k_B \ln\left[V_j \left(\frac{4\pi m_j E_j}{3 N_j h^2}\right)^{3/2}\right] + \frac{3}{2} N_j k_B \right\}$$
If we now remove the wall the two gases will mix, each expanding into the total volume $V = V_1 + V_2$, and the entropy of this final state of the system will be:

$$S_f = \sum_{j=1}^{2} \left\{ N_j k_B \ln\left[(V_1 + V_2) \left(\frac{4\pi m_j E_j}{3 N_j h^2}\right)^{3/2}\right] + \frac{3}{2} N_j k_B \right\}$$

Therefore, the entropy of the system will have changed during this process by the amount:

$$\Delta S = S_f - S_i = N_1 k_B \ln\frac{V_1 + V_2}{V_1} + N_2 k_B \ln\frac{V_1 + V_2}{V_2}$$

called entropy of mixing, which is always positive. If $m_1 \neq m_2$ this is a correct result, since the mixing of the two gases is an irreversible process. However, since $\Delta S$ doesn't depend on $m_1$ and $m_2$, this result holds also in the case $m_1 = m_2$. But this is a paradox: the mixing of two identical gases is a reversible process (we can recover the initial state by reinserting the wall), so the entropy of the whole system shouldn't increase if we remove the wall.

Furthermore, the fact that $\Delta S > 0$ also when $m_1 = m_2$ is catastrophic, because it means that the entropy of a system depends on the history of the system itself rather than on its state. But this ultimately means that entropy doesn't exist at all: consider a system of energy $E$ made of $N$ particles contained in a volume $V$; we can then think that this system has been obtained from the mixing of $2^k$ pre-existing subsystems, with $k$ arbitrarily large. This means that the entropy of the system has increased an arbitrarily large number of times from its initial value, and therefore the entropy of the system in its final configuration (energy $E$, volume $V$, $N$ particles) is greater than any arbitrary number: in other words, assuming this expression for the entropy of an ideal gas, we would conclude that the entropy of any ideal gas is infinite! There's clearly something wrong with the definition of entropy we have given.

How can we solve this problem? We know that in classical mechanics identical particles are distinguishable; however, in order to solve the Gibbs paradox we must treat them as indistinguishable, just like we would do in quantum mechanics: this way, if we exchange two particles the representative point of the system in phase space won't change. Now, since $N$ particles can be exchanged in $N!$ different ways, there will be $N!$ different configurations of the system relative to the same representative point. This means that we must redefine the phase space volume of the system, reducing it by a factor $N!$ (which is sometimes called the Boltzmann factor):

$$\Omega(E, V, N) = \frac{1}{N! \, h^{3N}} \int_{E \leq \mathcal{H}(Q,P) \leq E + \delta E} dQ \, dP$$
In this way we can solve all the problems we have described. In fact, using this definition of the phase space volume in the case of the ideal gas, and using Stirling's approximation $\ln N! \approx N \ln N - N$, we have:

$$S(E, V, N) = N k_B \ln\left[\frac{V}{N} \left(\frac{4\pi m E}{3 N h^2}\right)^{3/2}\right] + \frac{5}{2} N k_B$$

This expression for the entropy is clearly extensive, since it is proportional to $N$ and the logarithm depends only on the intensive quantities $V/N$ and $E/N$. Furthermore, the computation of the entropy of mixing of two gases (at the same temperature, so that $E_1/N_1 = E_2/N_2 = E/N$) now gives:

$$\Delta S = N_1 k_B \ln\left(\frac{V/N}{V_1/N_1}\right) + N_2 k_B \ln\left(\frac{V/N}{V_2/N_2}\right)$$

where $N = N_1 + N_2$ and $V = V_1 + V_2$.

If the two gases are different, their densities will in general be different, $V_1/N_1 \neq V_2/N_2$, and so $\Delta S > 0$ as it must be (this follows from the concavity of the logarithm, since $V/N$ is the $N_j$-weighted average of $V_1/N_1$ and $V_2/N_2$). However, if the two gases are identical, this time we have $V_1/N_1 = V_2/N_2 = V/N$ and the entropy of mixing vanishes.
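As a quick numerical check, here is a minimal sketch (in Python, with illustrative numbers not taken from the text) comparing the two results for the mixing of two identical gases at equal density: the uncorrected entropy yields a spurious $\Delta S > 0$, while the corrected one yields $\Delta S = 0$:

```python
import numpy as np

k_B = 1.380649e-23  # J/K

# Two identical gases at equal density N/V; only the volume-dependent parts
# of the entropy matter, since the energy terms cancel at equal temperature.
N1, V1 = 1e23, 1e-3   # particles, m^3 (illustrative)
N2, V2 = 2e23, 2e-3   # same density as subsystem 1
N, V = N1 + N2, V1 + V2

# Uncorrected entropy, S ~ N k_B ln V + ...  ->  Gibbs paradox
dS_wrong = k_B * (N1 * np.log(V / V1) + N2 * np.log(V / V2))

# Corrected entropy, S ~ N k_B ln(V/N) + ...  ->  paradox resolved
dS_right = k_B * (N1 * np.log((V / N) / (V1 / N1))
                  + N2 * np.log((V / N) / (V2 / N2)))

print(dS_wrong)  # ~ 2.6 J/K > 0, even though nothing irreversible happened
print(dS_right)  # 0.0
```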


This solution to the Gibbs paradox, however, is rather an ad hoc one. Unfortunately, there is no way to understand where the $N!$ Boltzmann factor really comes from within the framework of classical statistical mechanics. This can be made clearer within quantum statistical mechanics: in that case the $N!$ comes from the fact that identical particles are intrinsically indistinguishable, and it does not "disappear" in the classical limit.

Conclusions

To conclude, the correct definition of the phase space volume, which eliminates all the problems that we have mentioned and is the one we will always use in the future, is:

$$\Omega(E, V, N) = \frac{1}{N! \, h^{3N}} \int_{E \leq \mathcal{H}(Q,P) \leq E + \delta E} dQ \, dP$$

For the sake of simplicity, we now redefine $dQ \, dP$ as $dQ \, dP / (N! \, h^{3N})$ (in other words, we "incorporate" the factor $1/(N! \, h^{3N})$ inside the measure $dQ \, dP$). This way, for the ideal gas we have:

$$S(E, V, N) = N k_B \ln\left[\frac{V}{N} \left(\frac{4\pi m E}{3 N h^2}\right)^{3/2}\right] + \frac{5}{2} N k_B$$
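Extensivity of this final expression is easy to verify numerically; the following minimal sketch (illustrative values, not part of the original text) checks that $S(\lambda E, \lambda V, \lambda N) = \lambda S(E, V, N)$:

```python
import numpy as np

k_B = 1.380649e-23   # J/K
h   = 6.626e-34      # J s

def entropy(E, V, N, m=6.6e-27):
    """Corrected ideal gas entropy (with the 1/N! factor included).
    m defaults to roughly the mass of a helium atom, for illustration."""
    return N * k_B * (np.log(V / N * (4 * np.pi * m * E / (3 * N * h**2))**1.5)
                      + 2.5)

E, V, N = 1.0, 1e-3, 1e23     # J, m^3, particles (illustrative)
for l in (1, 2, 10):
    # Ratio S(l E, l V, l N) / (l S(E, V, N)) should be 1 for every l:
    print(entropy(l * E, l * V, l * N) / (l * entropy(E, V, N)))  # -> 1.0
```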

  1. In other words, we consider all the points inside a cell to represent only a single state. This is just an "approximation" in classical statistical mechanics, but in quantum statistical mechanics it turns out that there really is only one state in each of those cells in phase space.
  2. To be a bit more precise: the error that we make approximating the integral over the whole phase space with a sum over cells of linear dimension of order $\sqrt{h}$ is perfectly negligible. This is due to the fact that $h$ is ridiculously small with respect to the common scales of a macroscopic system.
  3. Note that this means that we can simply drop the factor $\frac{3N}{2}\frac{\delta E}{E}$ (which is what we are going to do in the future) in the expression of $\Omega$, since it changes the phase space volume by a negligible amount. In fact, from the fundamental postulate of statistical mechanics we have that multiplying $\Omega$ by an intensive factor is equivalent to adding a constant to the entropy; however, the entropy of a typical system is so large that adding such a constant doesn't sensibly change its value. In other words, the phase space volume is so large that multiplying it by a constant does not significantly change its value.