Entropy and the arrow of time

Entropy as defined in Entropy as ignorance: information entropy depends only on the microscopic laws of motion, which as we have seen in Time and entropy are time-reversal invariant: this means that the entropy

$$S = -k_B \int \rho \ln \rho \, d\Gamma$$

is time independent and so, strictly speaking, the entropy of a system should never increase. To see that explicitly let us show that, in general:

$$\frac{d}{dt} \int f(\rho) \, d\Gamma = 0$$

where $f$ is a generic differentiable function such that $f(0) = 0$. In our case $f(\rho) = \rho \ln \rho$, which formally is not well defined for $\rho = 0$; however, since $\lim_{\rho \to 0^+} \rho \ln \rho = 0$ we can extend the function continuously and define $f(0)$ to be zero. Therefore:
$$\frac{d}{dt} \int f(\rho) \, d\Gamma = \int \frac{\partial f(\rho)}{\partial t} \, d\Gamma = \int f'(\rho) \frac{\partial \rho}{\partial t} \, d\Gamma = -\int f'(\rho) \, \vec{\nabla} \cdot (\rho \vec{v}) \, d\Gamma$$

where in the last step we have used the fact that since $\rho$ is a probability density it must satisfy the continuity equation $\partial \rho / \partial t + \vec{\nabla} \cdot (\rho \vec{v}) = 0$ (see also the discussion of Liouville's theorem), and $\vec{v}$ is the velocity in phase space; we now can easily see that $f'(\rho) \, \vec{\nabla} \cdot (\rho \vec{v}) = \vec{\nabla} \cdot (f(\rho) \vec{v})$:

$$\vec{\nabla} \cdot (f(\rho) \vec{v}) = f(\rho) \, \vec{\nabla} \cdot \vec{v} + \vec{v} \cdot \vec{\nabla} f(\rho) = f(\rho) \, \vec{\nabla} \cdot \vec{v} + f'(\rho) \, \vec{v} \cdot \vec{\nabla} \rho$$

where by definition of gradient $\vec{\nabla} f(\rho) = f'(\rho) \vec{\nabla} \rho$. Now, we have that $\vec{\nabla} \cdot \vec{v} = 0$, which can be shown from Hamilton's equations exactly as we have done in the proof of Liouville's theorem; therefore $f'(\rho) \, \vec{\nabla} \cdot (\rho \vec{v}) = f'(\rho) \, \vec{v} \cdot \vec{\nabla} \rho = \vec{\nabla} \cdot (f(\rho) \vec{v})$. Thus, using Gauss's theorem:

$$\frac{d}{dt} \int f(\rho) \, d\Gamma = -\int \vec{\nabla} \cdot (f(\rho) \vec{v}) \, d\Gamma = -\oint_{\Sigma} f(\rho) \, \vec{v} \cdot d\vec{\Sigma}$$

where $\Sigma$ is the surface that encloses the phase space volume[1]; since $\rho$ is a probability density it will vanish on $\Sigma$[2] and so will $f(\rho)$, since we are supposing $f(0) = 0$. Therefore the last integral is null:

$$\frac{d}{dt} \int f(\rho) \, d\Gamma = 0$$

From this we get:

$$\frac{dS}{dt} = -k_B \frac{d}{dt} \int \rho \ln \rho \, d\Gamma = 0$$

and note that, as we have obtained it, this relation is always valid. This means that in principle if we consider a system undergoing an irreversible transformation, like the adiabatic expansion of a gas, its entropy remains constant; however we know that in such cases entropy always increases: where does this discrepancy come from?
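This constancy can be checked numerically in a simple case. Under a linear symplectic map — for instance the exact flow of a harmonic oscillator, which rotates phase space — a Gaussian ensemble stays Gaussian, and its differential entropy, which depends only on the determinant of its covariance matrix, does not change. A minimal sketch; the oscillator, the sampling parameters and the Gaussian entropy formula are our own illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample a Gaussian ensemble in the (q, p) phase plane of a 1D system.
pts = rng.multivariate_normal(mean=[1.0, 0.0],
                              cov=[[0.5, 0.1], [0.1, 0.2]], size=100_000)

def gaussian_entropy(sample):
    """Differential entropy of a 2D Gaussian: (1/2) ln((2*pi*e)^2 det(Sigma))."""
    sigma = np.cov(sample.T)
    return 0.5 * np.log((2 * np.pi * np.e) ** 2 * np.linalg.det(sigma))

# Exact flow of the harmonic oscillator H = (p^2 + q^2)/2 for a time t
# is a rotation of phase space -- a symplectic, volume-preserving map.
t = 0.73
M = np.array([[np.cos(t), np.sin(t)],
              [-np.sin(t), np.cos(t)]])

S0 = gaussian_entropy(pts)          # entropy before the evolution
S1 = gaussian_entropy(pts @ M.T)    # entropy after the evolution
print(S0, S1)
```

Since the map has unit determinant, the covariance determinant — and hence the entropy — is preserved exactly, in line with $dS/dt = 0$.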

What we want to show now is that this discrepancy comes from the fact that in reality entropy is not a property of a given system, but of our knowledge about it. Let us first see this in a rather intuitive way: suppose we are numerically integrating the equations of motion of a system made of many particles closed in a fixed volume, and that we choose very unusual initial conditions, for example setting the initial positions of the particles in only one half of the system (we are thus simulating the adiabatic expansion of a gas). We let the system evolve for some time, then we stop, invert all the velocities of the particles and restart the integration; what we are doing is essentially equivalent to letting the system evolve for some time and then "rewinding" it. We would therefore expect that as the system evolves the particles will come back to their initial conditions, since we are just "rewinding" the process; however this doesn't occur and the gas keeps evolving as a normal gas. This happens because computers have finite precision: the position and momentum of every particle is stored with a fixed number of significant figures, and as time passes we are losing information about the system because the computer discards many significant figures that (mathematically) should be present. In order to actually see the gas go back to its original configuration we would thus need a computer with infinite precision, which would not lose information.
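The numerical experiment above can be sketched in a few lines. A gas of interacting particles would be long to implement, so the sketch below substitutes another chaotic Hamiltonian system (a bounded quartic two-dimensional potential, our own choice for illustration), integrated with velocity Verlet, which is exactly time-reversible in exact arithmetic; the computer's finite precision is exaggerated by rounding the state to five decimal digits after every step. All parameter values are illustrative:

```python
import numpy as np

BETA = 0.01  # V = x^2 y^2 / 2 + BETA (x^4 + y^4) / 4: chaotic and bounded

def force(q):
    """Force -grad V for the quartic potential above."""
    x, y = q
    return -np.array([x * y * y + BETA * x ** 3,
                      x * x * y + BETA * y ** 3])

def evolve(q, p, n_steps, dt=0.01, digits=5):
    """Velocity Verlet; the state is rounded to `digits` decimals after each
    step to mimic a computer that keeps a fixed number of significant figures."""
    for _ in range(n_steps):
        p = p + 0.5 * dt * force(q)
        q = q + dt * p
        p = p + 0.5 * dt * force(q)
        q, p = np.round(q, digits), np.round(p, digits)
    return q, p

def forward_then_rewind(q0, p0, n_steps):
    """Evolve, invert the momenta, evolve again: distance from the start."""
    q, p = evolve(q0, p0, n_steps)
    q, p = evolve(q, -p, n_steps)              # "rewind" the motion
    return np.linalg.norm(q - q0) + np.linalg.norm(p + p0)

q0, p0 = np.array([0.5, 0.0]), np.array([0.6, 0.2])
short = forward_then_rewind(q0, p0, 200)       # short rewind: almost comes back
long_ = forward_then_rewind(q0, p0, 20_000)    # long rewind: information is lost
print(short, long_)
```

In exact arithmetic both runs would return exactly to the initial state; with per-step rounding the short run still comes back almost exactly, while in the long run the chaotic dynamics amplifies the discarded digits until the memory of the initial condition is lost.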

Let us now see this concept in a more formal way. When we consider a system from a microscopic point of view[3] the probability density has complete information on the system, since it will be of the form:

$$\rho(Q, P; t) = \delta\left(Q - Q(t)\right) \delta\left(P - P(t)\right)$$

where $(Q(t), P(t))$ is the trajectory that solves Hamilton's equations.

Consider now a situation where we have less information on the system: suppose for example we have a probability density $\rho(Q; t)$ that carries no information about the momenta of the particles[4]. Then as $\rho$ evolves it will satisfy a diffusion-like equation in configuration space[5]:

$$\frac{\partial \rho}{\partial t} = D \nabla^2 \rho$$
If the entropy of the system is:

$$S = -k_B \int \rho \ln \rho \, dQ$$

then we have:

$$\frac{dS}{dt} = -k_B \int \frac{\partial \rho}{\partial t} \left(\ln \rho + 1\right) dQ = -k_B D \int \ln \rho \, \nabla^2 \rho \, dQ - k_B \int \frac{\partial \rho}{\partial t} \, dQ$$

The second integral is null:

$$\int \frac{\partial \rho}{\partial t} \, dQ = \frac{d}{dt} \int \rho \, dQ = \frac{d}{dt} 1 = 0$$

so integrating by parts the remaining term and using Gauss's theorem:

$$\frac{dS}{dt} = -k_B D \oint_{\Sigma} \ln \rho \, \vec{\nabla} \rho \cdot d\vec{\Sigma} + k_B D \int \frac{(\vec{\nabla} \rho)^2}{\rho} \, dQ$$

where $\Sigma$ is the surface that encloses the system in configuration space. Assuming that $\vec{\nabla} \rho = 0$ on $\Sigma$, then since $(\vec{\nabla} \rho)^2 / \rho \geq 0$ we have:

$$\frac{dS}{dt} = k_B D \int \frac{(\vec{\nabla} \rho)^2}{\rho} \, dQ \geq 0$$

Therefore, we have found that now the entropy of the system really does increase, and this follows only from our lack of knowledge about the momenta of the particles.
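The increase can also be seen numerically: evolve a one-dimensional density with an explicit finite-difference scheme for the diffusion equation, with reflecting walls so that $\vec{\nabla} \rho$ vanishes on $\Sigma$ as assumed above, and track the entropy at each step. The grid, the diffusion constant and the initial Gaussian are illustrative choices ($k_B = 1$):

```python
import numpy as np

# 1D grid on [0, 1]; reflecting walls give zero flux through the boundary.
n = 200
dx = 1.0 / n
x = (np.arange(n) + 0.5) * dx
D = 1.0
dt = 0.4 * dx ** 2 / D        # alpha = D dt / dx^2 = 0.4 <= 1/2: stable scheme

# Initial density: a narrow Gaussian (low entropy), normalized to sum(rho) dx = 1.
rho = np.exp(-((x - 0.3) ** 2) / (2 * 0.02 ** 2))
rho /= rho.sum() * dx

def entropy(rho):
    """Discrete analogue of S = -int rho ln(rho) dQ (up to an additive
    constant), with the convention 0 ln 0 = 0 used in the text."""
    p = rho * dx
    p = p[p > 0]
    return -np.sum(p * np.log(p))

entropies = [entropy(rho)]
for _ in range(2000):
    # Explicit step for d(rho)/dt = D d2(rho)/dx2; edge padding implements
    # the reflecting (grad rho = 0) boundary condition.
    padded = np.pad(rho, 1, mode="edge")
    rho = rho + (D * dt / dx ** 2) * (padded[2:] - 2 * rho + padded[:-2])
    entropies.append(entropy(rho))

print(entropies[0], entropies[-1])   # the entropy has grown
```

With the stability condition $\alpha \leq 1/2$ each update is a convex (doubly stochastic) mixing of neighbouring cells, so the discrete entropy never decreases, mirroring $dS/dt \geq 0$.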

We can thus see how entropy really emerges when we do not have perfect knowledge of the system, or in other words when we start ignoring or excluding some of its degrees of freedom.
  1. Note: in momentum space this is always a surface at infinity, while in configuration space it can also be a finite surface, depending on the properties of the system (obviously, if the particles can occupy only a limited region of space, then $\Sigma$ in configuration space will be finite).
  2. Intuitively, this can be justified from the fact that $\rho$ must be normalized:
    $$\int \rho \, d\Gamma = 1$$
    and this can happen only if $\rho$ tends to zero at infinity.
  3. Which is what we have done in the proof of the fact that $dS/dt = 0$, since we have used Hamilton's equations.
  4. For example, this can be obtained from the previous probability density by integrating over the momenta and then renormalizing.
  5. In fact, as we have seen in The diffusion equation, the diffusion and continuity equations are equivalent if $\vec{J} = -D \vec{\nabla} \rho$: therefore, since $\rho$ satisfies a continuity equation (being a probability density), it will also satisfy a diffusion equation with diffusion constant $D$.