# Statistical mechanics and phase transitions

Now, in all the cases considered in the previous chapters we have never taken into account the possibility for our system to undergo a phase transition. We can therefore wonder whether all the machinery we have built can still be used when a thermodynamic system changes phase. In other, more "philosophical" words, the issue can be reformulated as follows: is statistical mechanics a tool that allows us to derive the whole thermodynamics of a system, phase transitions included, or does it work only when we limit ourselves to single phases? It would be rather unsatisfying if the latter were the case, because after all the Hamiltonian (and thus the partition function) of a system does not change during a phase transition, since the interaction potential between the particles does not depend on the temperature.

As we have seen in Thermodynamics of phase transitions, phase transitions are characterized by singularities (jump discontinuities or divergences) in the derivatives of thermodynamic potentials, so if we want the partition function of a system to account for phase transitions it must exhibit such singularities. However, by definition the partition function is simply a sum of exponentials, so it should itself be an analytic function and thus incapable of reproducing phase transition-like phenomena. This is true as long as we consider finite-sized systems: in that case the partition function is a finite sum of exponentials, so it is inevitably an analytic function. However, infinite sums of analytic functions can be non-analytic, so we can expect the partition function to exhibit singularities (and thus phase transitions) only in the thermodynamic limit. This is indeed what happens: it can be shown[1] that if one allows the fugacity ${\displaystyle e^{\beta \mu }}$ (with ${\displaystyle \mu }$ the chemical potential) to assume complex values, then in the thermodynamic limit at least one of the zeros of the grand partition function approaches the real axis at a phase transition.
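The mechanism can be seen in a toy analogy (purely illustrative, not the physical model of ref. [1]): the partial sums of the geometric series are analytic polynomials for every finite ${\displaystyle N}$, but their limit ${\displaystyle 1/(1-x)}$ is singular at ${\displaystyle x=1}$, and correspondingly the complex zeros of the partial sums close in on that point as ${\displaystyle N}$ grows, in the spirit of the Yang–Lee picture. A minimal numerical sketch, with ${\displaystyle x}$ standing in for the fugacity:

```python
import numpy as np

# Toy "partition function": Z_N(x) = 1 + x + ... + x^N, an analytic
# polynomial for every finite N, whose limit 1/(1 - x) is singular at x = 1.
# Its zeros are the (N+1)-th roots of unity (except x = 1 itself), and the
# zero nearest to x = 1 approaches the positive real axis as N grows.
for N in (5, 50, 500):
    zeros = np.roots(np.ones(N + 1))       # roots of 1 + x + ... + x^N
    closest = np.min(np.abs(zeros - 1.0))  # distance of nearest zero to x = 1
    print(f"N = {N:4d}: nearest zero at distance {closest:.4f} from x = 1")
```

The singularity of the limiting function thus appears only "at infinity", even though every finite-\( N \) sum is perfectly smooth on the real axis.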

## Heuristic energy-entropy argument

We could now ask how we can tell whether a system can undergo phase transitions at all; a simple but useful tool to do so is the so-called energy-entropy argument. At high temperatures the entropy ${\displaystyle S}$ is the dominant term in the free energy ${\displaystyle F=U-TS}$ of the system, and the free energy is thus minimized by maximizing ${\displaystyle S}$; at low temperatures, on the other hand, the internal energy ${\displaystyle U}$ can be the most important contribution to ${\displaystyle F}$, and so the free energy is minimized by minimizing ${\displaystyle U}$. Therefore, if maximizing ${\displaystyle S}$ at high temperatures and minimizing ${\displaystyle U}$ at low ones leads to two different macroscopic equilibrium configurations of the system, we can conclude that there must be at least one phase transition between ${\displaystyle T=0}$ and ${\displaystyle T=\infty }$ (of course, this requires that we know the exact expressions of ${\displaystyle U}$ and ${\displaystyle S}$).
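A classic illustration of this argument (a standard textbook example, not worked out in the text above) is the one-dimensional Ising chain of ${\displaystyle N}$ spins with ferromagnetic coupling ${\displaystyle J>0}$: starting from the perfectly ordered state, inserting a single domain wall costs energy but gains entropy, since the wall can sit on any of the ${\displaystyle N-1}$ bonds:

```latex
\Delta U = 2J, \qquad
\Delta S = k_B \ln(N-1), \qquad
\Delta F = \Delta U - T\,\Delta S = 2J - k_B T \ln(N-1)
```

For any ${\displaystyle T>0}$ we have ${\displaystyle \Delta F<0}$ as ${\displaystyle N\to \infty }$: entropy always wins, so the one-dimensional Ising model has no ordered phase (and hence no phase transition) at finite temperature. In two or more dimensions the energy cost of a domain wall grows with its length, and the competition between ${\displaystyle U}$ and ${\displaystyle S}$ becomes nontrivial.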

1. See C. N. Yang and T. D. Lee, *Statistical Theory of Equations of State and Phase Transitions. I. Theory of Condensation*, Physical Review, vol. 87, no. 3 (1952).