As we have given them, the definitions of the phase space volume and entropy are not correct, and can give rise to annoying problems and paradoxes.
## The dimension of the phase space volume
To begin with, we can note that the definition of entropy given in the fundamental postulate of statistical mechanics makes no sense, since $\Omega$ is dimensional. In fact, $\Omega$ is defined as an integral over $\Gamma$, and:

$$d\Gamma=\prod_{i=1}^{N}d\vec{q}_i\,d\vec{p}_i\quad\Rightarrow\quad\left[d\Gamma\right]=\left(\text{m}\cdot\frac{\text{kg m}}{\text{s}}\right)^{3N}=\left[\Omega\right]$$
Therefore, if we want to take the logarithm of $\Omega$ we must first make it dimensionless; to do so we must introduce a constant $h$ with dimensions $[h]=\text{m}\cdot\frac{\text{kg m}}{\text{s}}=\text{J s}$, so that we can actually redefine the phase space volume as:

$$\Omega=\frac{1}{h^{3N}}\int d\Gamma$$

which is now really dimensionless, and the fundamental postulate makes sense.
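Explicitly, the dimensional check now works out: since $[h^{3N}]=[d\Gamma]$,

$$\left[\frac{d\Gamma}{h^{3N}}\right]=\left(\text{m}\cdot\frac{\text{kg m}}{\text{s}}\right)^{3N}\bigg/\left(\text{m}\cdot\frac{\text{kg m}}{\text{s}}\right)^{3N}=1$$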
A legitimate question could now be: what is the value of $h$? Which constant is it? Unfortunately, within classical statistical mechanics it is impossible to establish it; only in quantum statistical mechanics does the constant $h$ acquire a precise meaning, and in particular it turns out that $h$ is Planck's constant:

$$h\simeq 6.626\cdot 10^{-34}\ \text{J s}$$
Let us finally note that dividing $d\Gamma$ by $h^{3N}$ and integrating over the phase space can also be interpreted as dividing the whole phase space into cells of volume $h^{3N}$, thus "discretizing" the possible states of the system, considering as equivalent all the points inside a cell[1], and summing over these cells[2]: in this sense, the entropy gives a measure of the number of such possible states.
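Schematically, this reinterpretation reads:

$$\Omega=\frac{1}{h^{3N}}\int d\Gamma\;\simeq\sum_{\text{cells}}1=\text{number of accessible cells of volume }h^{3N}$$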
## Extensivity of the entropy and the Gibbs paradox
Even with the introduction of $h$ we still have some problems. In particular, from the expression of $\Omega$ for an ideal gas we have:

$$\ln\Omega(E,V,N)=\frac{3}{2}\ln\left(\frac{4\pi m}{3h^2}\frac{E}{N}\right)+N\ln\left[V\left(\frac{4\pi m}{3h^2}\frac{E}{N}\right)^{3/2}\right]+\frac{3}{2}N$$
Since $E$ and $N$ are both extensive, $E/N$ is intensive and thus remains constant in the thermodynamic limit; this means that for large $N$ we can neglect the first term[3], so that in the end:
$$\ln\Omega(E,V,N)=N\ln\left[V\left(\frac{4\pi m}{3h^2}\frac{E}{N}\right)^{3/2}\right]+\frac{3}{2}N$$
and therefore the entropy of the system will be:
$$S(E,V,N)=k_BN\ln\left[V\left(\frac{4\pi m}{3h^2}\frac{E}{N}\right)^{3/2}\right]+\frac{3}{2}k_BN$$
This expression is rather problematic. In fact, first of all we can note that written in this way it is not extensive, since it contains terms like $N\ln V$ and $N\ln E$ (the check below makes this explicit). Furthermore, it gives rise to a catastrophic paradox known as the *Gibbs paradox*.
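As for the lack of extensivity: rescaling $E\to\lambda E$, $V\to\lambda V$, $N\to\lambda N$, an extensive entropy should satisfy $S(\lambda E,\lambda V,\lambda N)=\lambda S(E,V,N)$; from the expression above, instead:

$$S(\lambda E,\lambda V,\lambda N)=\lambda k_BN\ln\left[\lambda V\left(\frac{4\pi m}{3h^2}\frac{E}{N}\right)^{3/2}\right]+\frac{3}{2}\lambda k_BN=\lambda S(E,V,N)+\lambda k_BN\ln\lambda$$

and the extra term $\lambda k_BN\ln\lambda$ spoils the linear scaling.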
In order to understand the paradox, let us consider a system divided by a wall into two subsystems, each with $N_i$ particles of mass $m_i$, volume $V_i$ and energy $E_i$ (with $i=1,2$). The entropy of the system in this initial state is:
$$\begin{aligned}S_i=\frac{3}{2}k_B(N_1+N_2)+\qquad\qquad\qquad\qquad\\+k_B\left\lbrace N_1\ln\left[V_1\left(\frac{4\pi m_1}{3h^2}\frac{E_1}{N_1}\right)^{3/2}\right]+N_2\ln\left[V_2\left(\frac{4\pi m_2}{3h^2}\frac{E_2}{N_2}\right)^{3/2}\right]\right\rbrace\end{aligned}$$
If we now remove the wall, the two gases will mix and the entropy of this final state of the system will be:

$$\begin{aligned}S_f=\frac{3}{2}k_B(N_1+N_2)+\qquad\qquad\qquad\qquad\\+k_B\ln\left[(V_1+V_2)^{N_1+N_2}\left(\frac{4\pi}{3h^2}\right)^{\frac{3}{2}(N_1+N_2)}\left(\frac{m_1E_1}{N_1}\right)^{\frac{3}{2}N_1}\left(\frac{m_2E_2}{N_2}\right)^{\frac{3}{2}N_2}\right]\end{aligned}$$
Therefore, the entropy of the system will have changed during this process by the amount:
$$\begin{aligned}\Delta S=S_f-S_i=k_B\ln\left[\frac{(V_1+V_2)^{N_1+N_2}}{V_1^{N_1}V_2^{N_2}}\right]=\\=k_B\left[N_1\ln\left(\frac{V_1+V_2}{V_1}\right)+N_2\ln\left(\frac{V_1+V_2}{V_2}\right)\right]\end{aligned}$$
called *entropy of mixing*, which is always positive (since $V_1+V_2>V_1$ and $V_1+V_2>V_2$, both logarithms are positive). If $m_1\neq m_2$ this is a correct result, since the mixing of two different gases is an irreversible process.
However, since $\Delta S$ doesn't depend on $m_1$ and $m_2$, this result holds also in the case $m_1=m_2$. But this is a paradox: the mixing of two identical gases is a reversible process (we can recover the initial state by reinserting the wall), so the entropy of the whole system shouldn't increase if we remove the wall. Furthermore, the fact that $\Delta S>0$ also when $m_1=m_2$ is catastrophic, because it means that the entropy of a system depends on the *history* of the system itself, rather than on its *state*. But this ultimately means that entropy *doesn't exist* at all: consider a system of energy $E$ made of $N$ particles contained in a volume $V$; then we can think that this system has been obtained from the mixing of $n$ pre-existing subsystems (i.e. by removing the walls that separated them), with $n$ arbitrarily large. This means that the entropy of the system has increased an arbitrarily large number of times from its initial value, and therefore the entropy of the system in its final configuration (energy $E$, volume $V$, $N$ particles) is greater than any arbitrary number: in other words, assuming

$$S(E,V,N)=k_BN\ln\left[V\left(\frac{4\pi m}{3h^2}\frac{E}{N}\right)^{3/2}\right]+\frac{3}{2}k_BN$$

as the entropy of an ideal gas, we would conclude that the entropy of *any* ideal gas is infinite!
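For instance, from the expression of $\Delta S$ above, removing a wall between two identical halves ($N_1=N_2=N/2$, $V_1=V_2=V/2$) gives:

$$\Delta S=k_B\left[\frac{N}{2}\ln 2+\frac{N}{2}\ln 2\right]=Nk_B\ln 2$$

so each round of such (fictitious) wall removals adds $Nk_B\ln 2$ to the entropy, and imagining $n$ successive rounds adds $nNk_B\ln 2$, which diverges as $n\to\infty$.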
There's clearly something wrong with the definition of entropy we have given. How can we solve this problem?
We know that in classical mechanics identical particles are distinguishable; however, in order to solve the Gibbs paradox we must treat them as indistinguishable, just as we would in quantum mechanics: this way, a configuration obtained by exchanging two particles does not count as a new state of the system. Now, since $N$ particles can be exchanged in $N!$ different ways, there will be $N!$ different configurations of the system corresponding to the same state. This means that we must redefine the phase space volume of the system, reducing it by a factor $N!$ (which is sometimes called *Boltzmann factor*):

$$\Omega(E,V,N)=\frac{1}{N!\,h^{3N}}\int d\Gamma$$
In this way we can solve all the problems we have described. In fact, using this definition of the phase space volume in the case of the ideal gas we have:

$$S(E,V,N)=\frac{5}{2}Nk_B+Nk_B\ln\left[\frac{V}{N}\left(\frac{4\pi m}{3h^2}\frac{E}{N}\right)^{3/2}\right]$$
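The $V/N$ inside the logarithm and the $\frac{5}{2}N$ come precisely from the $N!$: using Stirling's approximation $\ln N!\simeq N\ln N-N$ together with the previous expression for $\ln\Omega$, one finds:

$$\ln\Omega(E,V,N)=N\ln\left[V\left(\frac{4\pi m}{3h^2}\frac{E}{N}\right)^{3/2}\right]+\frac{3}{2}N-\ln N!\simeq N\ln\left[\frac{V}{N}\left(\frac{4\pi m}{3h^2}\frac{E}{N}\right)^{3/2}\right]+\frac{5}{2}N$$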
This expression for the entropy is clearly extensive, since it is proportional to $N$ and the logarithm depends only on the intensive quantities $V/N$ and $E/N$. Furthermore, the computation of the entropy of mixing of two gases now gives:
$$\Delta S=k_B\left[(N_1+N_2)\ln\left(\frac{V_1+V_2}{N_1+N_2}\right)-N_1\ln\frac{V_1}{N_1}-N_2\ln\frac{V_2}{N_2}\right]$$
If the two gases are different, their densities $N_1/V_1$ and $N_2/V_2$ will be different, and so $\Delta S>0$, as it must be. However, if the two gases are identical, this time we have $\frac{N_1}{V_1}=\frac{N_2}{V_2}=\frac{N_1+N_2}{V_1+V_2}$ and the entropy of mixing vanishes.
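As a minimal numerical sanity check, here is a short Python sketch (entropies in units of $k_B$; the particle numbers and volumes are arbitrary illustrative values) comparing the uncorrected and corrected formulas for the entropy of mixing:

```python
from math import log

def mixing_entropy_uncorrected(N1, V1, N2, V2):
    """Entropy of mixing from the uncorrected formula (units of k_B):
    Delta S = N1 ln((V1+V2)/V1) + N2 ln((V1+V2)/V2) -- always positive."""
    return N1 * log((V1 + V2) / V1) + N2 * log((V1 + V2) / V2)

def mixing_entropy_corrected(N1, V1, N2, V2):
    """Entropy of mixing after the 1/N! correction (units of k_B):
    Delta S = (N1+N2) ln((V1+V2)/(N1+N2)) - N1 ln(V1/N1) - N2 ln(V2/N2)."""
    N, V = N1 + N2, V1 + V2
    return N * log(V / N) - N1 * log(V1 / N1) - N2 * log(V2 / N2)

# Two subsystems with equal densities (1 particle per unit volume):
print(mixing_entropy_uncorrected(100, 100.0, 200, 200.0))  # ~ 191: the paradox
print(mixing_entropy_corrected(100, 100.0, 200, 200.0))    # 0.0: paradox resolved

# Different densities: the corrected formula is still positive, as it must be.
print(mixing_entropy_corrected(100, 100.0, 200, 50.0))     # ~ 69
```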
This solution to the Gibbs paradox, however, is rather an ad hoc one. Unfortunately, there's no way to understand where the Boltzmann factor really comes from within the framework of classical statistical mechanics. This can be made clearer within quantum statistical mechanics: in that case the $1/N!$ comes from the fact that identical particles are intrinsically indistinguishable, and it does not "disappear" in the classical limit.
To conclude, the correct definition of the phase space volume, which eliminates all the problems we have mentioned and which we will always use in the future, is:

$$\Omega(E,V,N)=\frac{1}{N!\,h^{3N}}\int d\Gamma$$
For the sake of simplicity, we now redefine $d\Gamma$ as:

$$d\Gamma=\frac{1}{N!\,h^{3N}}\prod_{i=1}^{N}d\vec{q}_i\,d\vec{p}_i$$

(in other words, we "incorporate" the $\frac{1}{N!\,h^{3N}}$ factor inside $d\Gamma$). This way, for the ideal gas we have:

$$S(E,V,N)=\frac{5}{2}Nk_B+Nk_B\ln\left[\frac{V}{N}\left(\frac{4\pi m}{3h^2}\frac{E}{N}\right)^{3/2}\right]$$
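To close, here is a short Python sketch of this final expression (SI units; the helium-4 mass and the one-mole conditions are illustrative choices, not values from the text), which also verifies extensivity by doubling $E$, $V$ and $N$:

```python
from math import log, pi

k_B = 1.380649e-23    # Boltzmann constant, J/K
h   = 6.62607015e-34  # Planck constant, J s

def ideal_gas_entropy(E, V, N, m):
    """S = (5/2) N k_B + N k_B ln[(V/N) (4 pi m E / (3 h^2 N))^(3/2)], in J/K."""
    return N * k_B * (2.5 + log((V / N) * (4 * pi * m * E / (3 * h**2 * N))**1.5))

m = 6.6464731e-27     # kg, mass of a helium-4 atom (illustrative)
N = 6.02214076e23     # one mole of atoms
E = 1.5 * N * k_B * 300.0   # ideal-gas energy at T = 300 K
V = 0.0246            # m^3, roughly one mole at 300 K and atmospheric pressure

S1 = ideal_gas_entropy(E, V, N, m)
S2 = ideal_gas_entropy(2 * E, 2 * V, 2 * N, m)
print(S1)        # ~ 126 J/K, close to the measured molar entropy of helium
print(S2 / S1)   # 2.0: doubling (E, V, N) doubles S, i.e. S is extensive
```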
- ↑ In other words, we consider all the points inside a cell to represent only a single state. This is just an "approximation" in classical statistical mechanics, but in quantum statistical mechanics it turns out that there really is only one state in each of those cells in phase space.
- ↑ To be a bit more precise: the error that we make approximating the integral over the whole phase space with a sum over cells of linear dimension $h$ is perfectly negligible. This is due to the fact that $h$ is ridiculously small with respect to the common scales of a macroscopic system.
- ↑ Note that this means that we can simply drop the factor $\left(\frac{4\pi m}{3h^2}\frac{E}{N}\right)^{3/2}$ in the expression of $\Omega$ (which is what we are going to do in the future), since it changes the phase space volume by a negligible amount. In fact, from the fundamental postulate of statistical mechanics we have that multiplying $\Omega$ by an intensive factor is equivalent to adding a constant to the entropy; however, the entropy of a typical system is so large that adding such a constant doesn't change its value sensibly. In other words, the phase space volume $\Omega$ is so large that multiplying it by a constant does not change its value significantly.