# Statistics and thermodynamics

## The fundamental postulate of statistical mechanics

We still have not established a link between the ensemble formalism that we are developing and the thermodynamics of a system. This is what we are now going to do.

From thermodynamics we know (see Entropy and Thermodynamic potentials) that all the properties of a system can be obtained from its entropy through appropriate differentiations; we would therefore like to define the entropy of a system within the microcanonical ensemble.

However, we have no clue as to how to define it; of course we must link ${\displaystyle S}$ with some property of the microcanonical ensemble of our system, but nothing immediately suggests what we should use. After all, the only really new concept that we have introduced with the ensemble formalism is the phase space volume ${\displaystyle \Omega }$, so we can suppose that this is what must be related to the entropy.

Since it looks like we are at a dead end, we have to "artificially" introduce something, i.e. we must make an assumption in order to proceed. The assumption we make is the fundamental postulate of statistical mechanics:

${\displaystyle S(E,V,N)=k_{B}\ln \Omega (E,V,N)}$
where ${\displaystyle k_{B}}$ is Boltzmann's constant, needed to give the entropy ${\displaystyle S}$ the right dimensions[1]. Unfortunately there is no intuitive way to justify this postulate (in general we could have supposed that ${\displaystyle S}$ was proportional to some power of ${\displaystyle \Omega }$, or to some function other than the logarithm); it is something we have to take as it is. However, this doesn't mean that we must "religiously" believe in it; it is true that, being a postulate, we can't really "prove" ${\textstyle S(E,V,N)=k_{B}\ln \Omega (E,V,N)}$, but we can certainly try to see if it is "reasonable". In other words, what we would like to do now is to verify whether ${\displaystyle S}$ as defined in the fundamental postulate of statistical mechanics really is the entropy of a system; this means that we want to see, even if only qualitatively, whether the consequences of the fundamental postulate of statistical mechanics agree with what we know about the thermodynamics of a system. In particular, we have seen in Entropy that in order to obtain all the thermodynamic information we need about a macroscopic system we just have to take appropriate derivatives of the entropy:
${\displaystyle {\frac {1}{T}}={\frac {\partial S}{\partial E}}_{|V,N}\quad \qquad {\frac {P}{T}}={\frac {\partial S}{\partial V}}_{|E,N}\quad \qquad {\frac {\mu }{T}}=-{\frac {\partial S}{\partial N}}_{|E,V}}$
These relations are equivalent to the ones we have previously seen, which involve the derivatives of the energy ${\displaystyle E}$:
${\displaystyle T={\frac {\partial E}{\partial S}}_{|V,N}\quad \qquad P=-{\frac {\partial E}{\partial V}}_{|S,N}\quad \qquad \mu ={\frac {\partial E}{\partial N}}_{|S,V}}$
In fact, similarly to what we have seen in Response functions we have that:
${\displaystyle {\frac {\partial S}{\partial V}}_{|E,N}{\frac {\partial V}{\partial E}}_{|S,N}{\frac {\partial E}{\partial S}}_{|V,N}=-1\quad \qquad {\frac {\partial S}{\partial N}}_{|E,V}{\frac {\partial N}{\partial E}}_{|S,V}{\frac {\partial E}{\partial S}}_{|N,V}=-1}$
and thus:

{\displaystyle {\begin{aligned}P=T{\frac {\partial S}{\partial V}}=-T{\frac {1}{{\frac {\partial V}{\partial E}}{\frac {\partial E}{\partial S}}}}=-T{\frac {1}{{\frac {\partial V}{\partial E}}T}}=-{\frac {\partial E}{\partial V}}\\{}\\\mu =-T{\frac {\partial S}{\partial N}}=T{\frac {1}{{\frac {\partial N}{\partial E}}{\frac {\partial E}{\partial S}}}}=T{\frac {1}{{\frac {\partial N}{\partial E}}T}}={\frac {\partial E}{\partial N}}\end{aligned}}}

Therefore what we are going to do in the following is to see if the relations we have previously written follow from the fundamental postulate of statistical mechanics.

Before proceeding let us make an observation that will be useful for the computations. Since ${\displaystyle \ln \Omega \propto S}$ and ${\displaystyle S}$ is extensive, the quantity ${\displaystyle \ln \Omega }$ is itself extensive and thus we can always write it as[2]:

${\displaystyle \ln \Omega (E,V,N)=Nf\left({\frac {E}{N}},{\frac {V}{N}}\right)}$
where ${\displaystyle f}$ is a generic function.
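To make this extensivity concrete, here is a quick numerical check using a hypothetical ideal-gas-like form ${\displaystyle \ln \Omega =N\left[{\frac {3}{2}}\ln(E/N)+\ln(V/N)\right]}$ (this specific function is only an assumption chosen for illustration, not derived here); rescaling ${\displaystyle E}$, ${\displaystyle V}$ and ${\displaystyle N}$ by the same factor must rescale ${\displaystyle \ln \Omega }$ by that factor:

```python
import math

# Hypothetical ideal-gas-like phase-space volume (additive constants dropped):
#   ln Omega(E, V, N) = N * [ (3/2) ln(E/N) + ln(V/N) ]
# This form is an assumption used only to illustrate extensivity.
def ln_omega(E, V, N):
    return N * (1.5 * math.log(E / N) + math.log(V / N))

E, V, N = 300.0, 2.0, 100.0
lam = 7.0
lhs = ln_omega(lam * E, lam * V, lam * N)  # ln Omega of the scaled system
rhs = lam * ln_omega(E, V, N)              # lam times ln Omega of the original
print(lhs, rhs)  # the two values agree
```

Only the ratios ${\displaystyle E/N}$ and ${\displaystyle V/N}$ enter the function ${\displaystyle f}$, which is exactly why the scaling works.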

## Temperature and statistical mechanics

Let us now see how the temperature of a system comes into play within the microcanonical ensemble. Consider two systems, which we call 1 and 2, of volume ${\displaystyle V_{i}}$, energy ${\displaystyle E_{i}}$, at temperature ${\displaystyle T_{i}}$ and each composed of ${\displaystyle N_{i}}$ particles, with ${\displaystyle i=1,2}$, separated by a fixed wall that allows only the exchange of energy between the two systems. The situation is as shown in the following figure:

Two systems in thermal contact

We call ${\displaystyle E=E_{1}+E_{2}}$ the total energy and ${\displaystyle N=N_{1}+N_{2}}$ the total number of particles, and we know from thermodynamics that if initially ${\displaystyle T_{1}\neq T_{2}}$ then after some time the two systems reach an equilibrium and have the same temperature ${\displaystyle T}$. Now, it is intuitively clear[3] that the phase space volume of the whole system is:

${\displaystyle \Omega (E)=\int \Omega _{1}(E_{1})\Omega _{2}(E-E_{1})dE_{1}}$
In fact, the interpretation of this expression is the following: for a fixed value of ${\displaystyle E_{1}}$ the total number of microstates of the whole system is ${\displaystyle \Omega _{1}(E_{1})\Omega _{2}(E-E_{1})}$ (remember that ${\displaystyle E_{2}=E-E_{1}}$), since for any of the ${\displaystyle \Omega _{1}(E_{1})}$ possible microstates of system 1, system 2 can be in any of its ${\displaystyle \Omega _{2}(E-E_{1})}$ possible states; therefore the total number of possible states of the whole system is obtained by integrating over all the possible values of ${\displaystyle E_{1}}$. Now, since ${\displaystyle \ln \Omega }$ is extensive and can be written as in ${\textstyle \ln \Omega (E,V,N)=Nf\left({\frac {E}{N}},{\frac {V}{N}}\right)}$, we have (neglecting the dependence of ${\displaystyle f}$ on the volume, since it is fixed):
${\displaystyle \Omega (E)=\int dE_{1}e^{N_{1}f_{1}\left({\frac {E_{1}}{N_{1}}}\right)+N_{2}f_{2}\left({\frac {E-E_{1}}{N-N_{1}}}\right)}}$
and defining ${\displaystyle \varepsilon _{i}=E_{i}/N_{i}}$, ${\displaystyle \varepsilon =E/N}$ and ${\displaystyle n_{i}=N_{i}/N}$:
${\displaystyle \Omega (E)=N_{1}\int d\varepsilon _{1}e^{N\left[n_{1}f_{1}(\varepsilon _{1})+n_{2}f_{2}(\varepsilon -\varepsilon _{1})\right]}}$
Now, this integral can be approximated using the saddle point approximation (see the appendix The saddle point approximation). The result is[4]:
${\displaystyle \Omega (E)\approx N_{1}e^{N\left[n_{1}f_{1}(\varepsilon _{1}^{*})+n_{2}f_{2}(\varepsilon -\varepsilon _{1}^{*})\right]}{\sqrt {\frac {2\pi }{N\left|n_{1}f''_{1}(\varepsilon _{1}^{*})+n_{2}f''_{2}(\varepsilon -\varepsilon _{1}^{*})\right|}}}}$
where ${\displaystyle \varepsilon _{1}^{*}}$ is the value of ${\displaystyle \varepsilon _{1}}$ that maximizes the integrand of ${\displaystyle \Omega (E)}$, i.e. the exponent ${\displaystyle n_{1}f_{1}(\varepsilon _{1})+n_{2}f_{2}(\varepsilon -\varepsilon _{1})}$. We therefore have:
${\displaystyle n_{1}f_{1}'(\varepsilon _{1}^{*})-n_{2}f'_{2}(\varepsilon -\varepsilon _{1}^{*})=0}$
which means (remember the fundamental postulate of statistical mechanics):
${\displaystyle {\frac {\partial }{\partial E_{1}}}\ln \Omega _{1}(E_{1})_{|E_{1}^{*}}={\frac {\partial }{\partial E_{2}}}\ln \Omega _{2}(E_{2})_{|E_{2}^{*}}}$
We have therefore found that from the formulation of the problem in the microcanonical ensemble, at thermal equilibrium the only microscopic configurations that significantly contribute to ${\displaystyle \Omega (E)}$ are those with ${\displaystyle E_{1}=E_{1}^{*}}$ and ${\displaystyle E_{2}=E_{2}^{*}}$, which are such that the last equation holds. From the fundamental postulate of statistical mechanics, this equation becomes:
${\displaystyle {\frac {\partial S}{\partial E_{1}}}_{|E_{1}^{*}}={\frac {\partial S}{\partial E_{2}}}_{|E_{2}^{*}}\quad }$
However from thermodynamics we know that ${\displaystyle \partial S/\partial E=1/T}$, so in the end we have:
${\displaystyle T_{1}=T_{2}}$
Thus, from the microcanonical definition of entropy we have obtained the well known fact that when two bodies are put in thermal contact they reach an equilibrium state where both have the same temperature[5]. Now, in order to see explicitly that the main contribution to ${\textstyle \Omega (E)=\int \Omega _{1}(E_{1})\Omega _{2}(E-E_{1})dE_{1}}$ comes from the configuration where ${\displaystyle E_{1}=E_{1}^{*}}$, let us expand the integrand in a Taylor series around ${\displaystyle E_{1}=E_{1}^{*}}$ (we write explicitly ${\displaystyle f''_{1}(\varepsilon _{1}^{*})=-|f''_{1}(\varepsilon _{1}^{*})|}$ and ${\displaystyle f''_{2}(\varepsilon -\varepsilon _{1}^{*})=-|f''_{2}(\varepsilon -\varepsilon _{1}^{*})|}$ because the second derivatives of ${\displaystyle f_{1}}$ and ${\displaystyle f_{2}}$ are negative at ${\displaystyle \varepsilon _{1}^{*}}$ and ${\displaystyle \varepsilon -\varepsilon _{1}^{*}}$, since the exponent is maximized there):
{\displaystyle {\begin{aligned}\Omega _{1}(E_{1})\Omega _{2}(E-E_{1})=e^{N\left[n_{1}f_{1}(\varepsilon _{1})+n_{2}f_{2}(\varepsilon -\varepsilon _{1})\right]}\approx \\\approx e^{N\left[n_{1}f_{1}(\varepsilon _{1}^{*})+n_{2}f_{2}(\varepsilon -\varepsilon _{1}^{*})\right]}e^{-{\frac {N}{2}}\left[n_{1}|f''_{1}(\varepsilon _{1}^{*})|+n_{2}|f''_{2}(\varepsilon -\varepsilon _{1}^{*})|\right](\varepsilon _{1}-\varepsilon _{1}^{*})^{2}}=\\={\text{const.}}\cdot e^{-{\frac {1}{2}}\left[n_{1}|f''_{1}(\varepsilon _{1}^{*})|+n_{2}|f''_{2}(\varepsilon -\varepsilon _{1}^{*})|\right]{\frac {1}{n_{1}^{2}N}}(E_{1}-E_{1}^{*})^{2}}\end{aligned}}}
where we have used the definitions of ${\displaystyle \varepsilon _{1}}$ and ${\displaystyle \varepsilon }$. This is a Gaussian with variance:
${\displaystyle \sigma _{E_{1}}^{2}:=\left\langle (E_{1}-E_{1}^{*})^{2}\right\rangle \propto N}$
Thus, the relative fluctuation of ${\displaystyle E_{1}}$ with respect to ${\displaystyle E_{1}^{*}}$ is:
${\displaystyle {\frac {\sigma _{E_{1}}}{E_{1}}}\propto {\frac {1}{\sqrt {N}}}}$
Again, these fluctuations are absolutely negligible in the thermodynamic limit, as we expected, and so indeed ${\displaystyle E_{1}^{*}}$ is the only value of ${\displaystyle E_{1}}$ that significantly contributes to ${\displaystyle \Omega (E)}$ (in other words, for large ${\displaystyle N}$ the integrand becomes a very sharply peaked function).
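This sharpness can also be checked numerically. In the following sketch we take ${\displaystyle f_{1}=f_{2}=(3/2)\ln \varepsilon }$ (an assumed ideal-gas-like form, chosen only for illustration) and compute the relative width ${\displaystyle \sigma _{E_{1}}/E_{1}^{*}}$ of the integrand on a grid: quadrupling ${\displaystyle N}$ should halve it, consistently with the ${\displaystyle 1/{\sqrt {N}}}$ scaling.

```python
import math

# Sketch: with the assumed form f1 = f2 = (3/2) ln(eps), evaluate the weight
# exp(N [n1 f1(e1) + n2 f2(eps - e1)]) on a grid of e1 values and measure the
# relative width of the resulting peak; it should scale like 1/sqrt(N).
def peak_width(N, n1=0.5, eps=2.0, grid=20000):
    n2 = 1.0 - n1
    f = lambda x: 1.5 * math.log(x)
    e1s = [eps * (i + 0.5) / grid for i in range(grid)]
    logw = [N * (n1 * f(e1) + n2 * f(eps - e1)) for e1 in e1s]
    m = max(logw)
    w = [math.exp(lw - m) for lw in logw]  # subtract the max to avoid overflow
    Z = sum(w)
    mean = sum(wi * e1 for wi, e1 in zip(w, e1s)) / Z
    var = sum(wi * (e1 - mean) ** 2 for wi, e1 in zip(w, e1s)) / Z
    return math.sqrt(var) / mean  # relative fluctuation sigma / e1*

r100, r400 = peak_width(100), peak_width(400)
print(r100 / r400)  # about 2: quadrupling N halves the relative width
```

With these symmetric choices the peak sits at ${\displaystyle \varepsilon _{1}^{*}=1}$ and the measured width matches the saddle-point prediction ${\displaystyle \sigma =1/{\sqrt {1.5\,N}}}$.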

## Pressure and statistical mechanics

We now want to see, just as we did for the temperature, what role the pressure of a system plays in the microcanonical ensemble. Let us therefore consider a system on which we can act using a piston of cross section ${\displaystyle A}$:

System with variable volume

If the pressure of the gas is ${\displaystyle P}$, in order to maintain the system in equilibrium we must exert a force ${\displaystyle F=PA}$ on the piston. If the gas inside the box is ideal the Hamiltonian of the system can be written as:

${\displaystyle {\mathcal {H}}(\mathbb {Q} ,\mathbb {P} ,X)=E_{\text{kin}}+\sum _{i=1}^{N}u(q_{ix}-X)}$
where ${\displaystyle E_{\text{kin}}}$ is the kinetic energy of the particles and ${\displaystyle u}$ is the potential that models the presence of the piston: it is a function with very small values for ${\displaystyle q_{ix}<X}$ that diverges for ${\displaystyle q_{ix}\to X^{-}}$. Therefore, the ${\displaystyle i}$-th particle will be subjected to the force (which is of course directed along the ${\displaystyle x}$ direction):
${\displaystyle F_{i}=-{\frac {\partial }{\partial q_{ix}}}u(q_{ix}-X)}$
and by Newton's third law the (microscopic and instantaneous) force exerted by the gas on the piston will be ${\displaystyle -\sum _{i}F_{i}}$; at equilibrium its average must balance the external force, so that:
${\displaystyle \left\langle -\sum _{i}F_{i}\right\rangle =PA}$
We also have:
${\displaystyle F_{i}=-{\frac {\partial }{\partial q_{ix}}}u(q_{ix}-X)={\frac {\partial }{\partial X}}u(q_{ix}-X)}$
from which we get:
${\displaystyle \sum _{i}F_{i}=\sum _{i}{\frac {\partial u}{\partial X}}={\frac {\partial {\mathcal {H}}}{\partial X}}}$
Therefore, ${\textstyle \left\langle -\sum _{i}F_{i}\right\rangle =PA}$ becomes:
${\displaystyle PA=\left\langle -{\frac {\partial {\mathcal {H}}}{\partial X}}\right\rangle }$
and since ${\displaystyle V=XA}$ we have:
${\displaystyle P=-\left\langle {\frac {\partial {\mathcal {H}}}{\partial V}}\right\rangle }$

Now, by definition:

${\displaystyle \left\langle {\frac {\partial {\mathcal {H}}}{\partial V}}\right\rangle ={\frac {1}{\Omega (E,V,N)}}\int d\Gamma {\frac {\partial {\mathcal {H}}}{\partial V}}\delta ({\mathcal {H}}-E)\quad \qquad \Omega (E,V,N)=\int d\Gamma \delta ({\mathcal {H}}-E)}$
and therefore:
${\displaystyle {\frac {\partial \Omega }{\partial V}}=\int d\Gamma \delta '({\mathcal {H}}-E){\frac {\partial {\mathcal {H}}}{\partial V}}}$
where with ${\displaystyle \delta '({\mathcal {H}}-E)}$ we mean the derivative of ${\displaystyle \delta ({\mathcal {H}}-E)}$ with respect to ${\displaystyle {\mathcal {H}}-E}$. Thus, ${\displaystyle \delta '({\mathcal {H}}-E)=-\partial \delta ({\mathcal {H}}-E)/\partial E}$ and since ${\displaystyle E}$ is a parameter (it's fixed):
${\displaystyle {\frac {\partial \Omega }{\partial V}}=-{\frac {\partial }{\partial E}}\underbrace {\int d\Gamma {\frac {\partial {\mathcal {H}}}{\partial V}}\delta ({\mathcal {H}}-E)} _{\Omega \left\langle \partial {\mathcal {H}}/\partial V\right\rangle =-\Omega P}={\frac {\partial }{\partial E}}(\Omega P)=\Omega {\frac {\partial P}{\partial E}}+{\frac {\partial \Omega }{\partial E}}P}$
Dividing both sides of the equation by ${\displaystyle \Omega }$ we get:
${\displaystyle {\frac {\partial }{\partial V}}\ln \Omega ={\frac {\partial P}{\partial E}}+P{\frac {\partial }{\partial E}}\ln \Omega }$
and considering the definition of entropy given by the fundamental postulate of statistical mechanics:
${\displaystyle {\frac {\partial S}{\partial V}}=k_{B}{\frac {\partial P}{\partial E}}+P{\frac {\partial S}{\partial E}}=k_{B}{\frac {\partial P}{\partial E}}+{\frac {P}{T}}}$
We know that ${\displaystyle S}$, ${\displaystyle V}$ and ${\displaystyle E}$ are extensive quantities while ${\displaystyle P}$ and ${\displaystyle T}$ are intensive; therefore ${\displaystyle \partial S/\partial V}$ and ${\displaystyle P/T}$ are intensive while ${\displaystyle \partial P/\partial E\propto N^{-1}}$. This means that in the thermodynamic limit ${\displaystyle \partial P/\partial E}$ vanishes, and thus:
${\displaystyle {\frac {P}{T}}={\frac {\partial S}{\partial V}}}$
which is exactly what we were looking for.

## Chemical potential and statistical mechanics

We conclude with the chemical potential. Let us therefore consider a system divided into two subsystems 1 and 2 connected by a hole that allows the exchange of particles, like the one represented here:

Systems with variable number of particles

Its Hamiltonian will be:

${\displaystyle {\mathcal {H}}={\mathcal {H}}_{1}+{\mathcal {H}}_{2}}$
where:
${\displaystyle {\mathcal {H}}_{i}=\sum _{j=1}^{N_{i}}{\frac {{{\vec {p}}_{j}{}^{(i)}}^{2}}{2m}}+{\frac {1}{2}}\sum _{j\neq k}{\mathcal {V}}\left({\vec {q}}_{j}{}^{(i)}-{\vec {q}}_{k}{}^{(i)}\right)\quad \quad i=1,2}$
and ${\displaystyle {\mathcal {V}}}$ is the interaction potential. If we suppose that the hole that connects the two subsystems is long enough they will not interact with each other; calling ${\displaystyle N}$ the total number of particles we will have that:
{\displaystyle {\begin{aligned}\Omega (E,V,N)=\int d\Gamma _{1}d\Gamma _{2}\delta ({\mathcal {H}}_{1}+{\mathcal {H}}_{2}-E)=\\\sum _{N_{1}=0}^{N}\int _{0}^{E}dE_{1}\Omega _{1}(E_{1},N_{1})\Omega _{2}(E-E_{1},N-N_{1})\end{aligned}}}
where we are summing over ${\displaystyle N_{1}}$ because the possible configurations of the system now include the number of particles that are in the two subsystems. If we now divide ${\displaystyle [0,E]}$ in very small intervals of width ${\displaystyle \Delta }$, we can write:
${\displaystyle \int _{0}^{E}dE_{1}f(E_{1})\approx \Delta \sum _{i=1}^{E/\Delta }f(i\Delta )}$
with ${\displaystyle f}$ a generic function, so that we have:
${\displaystyle \Omega (E,N)=\Delta \sum _{N_{1}=0}^{N}\sum _{i=1}^{E/\Delta }\Omega _{1}(i\Delta ,N_{1})\Omega _{2}(E-i\Delta ,N-N_{1})}$
If we now call ${\displaystyle E_{1}^{*}=i^{*}\Delta }$ and ${\displaystyle N_{1}^{*}}$ the values of ${\displaystyle E_{1}}$ and ${\displaystyle N_{1}}$ for which the summand has the largest value, we will surely have:
{\displaystyle {\begin{aligned}\Delta \cdot \Omega _{1}(E_{1}^{*},N_{1}^{*})\Omega _{2}(E-E_{1}^{*},N-N_{1}^{*})<\Omega (E,N)<\qquad \\\qquad \qquad \qquad <(N+1){\frac {E}{\Delta }}\cdot \Delta \cdot \Omega _{1}(E_{1}^{*},N_{1}^{*})\Omega _{2}(E-E_{1}^{*},N-N_{1}^{*})\end{aligned}}}
(namely the largest term is smaller than the whole sum which is in turn smaller than the sum where every term has been substituted with the largest one). Therefore, taking the logarithm and multiplying by ${\displaystyle k_{B}}$:
${\displaystyle k_{B}\ln \Delta +S_{1}^{*}+S_{2}^{*}<S(E,N)<k_{B}\ln \left[(N+1)E\right]+S_{1}^{*}+S_{2}^{*}}$
where we have defined ${\displaystyle S_{1}^{*}=S_{1}(E_{1}^{*},N_{1}^{*})}$ and ${\displaystyle S_{2}^{*}=S_{2}(E-E_{1}^{*},N-N_{1}^{*})}$,
and since the entropy is extensive in the thermodynamic limit we get:
${\displaystyle S(E,N)=S_{1}(E_{1}^{*},N_{1}^{*})+S_{2}(E-E_{1}^{*},N-N_{1}^{*})+O(\ln N)}$
and we can also neglect the last term. Using the microcanonical definition of entropy (given by the fundamental postulate of statistical mechanics), since ${\displaystyle E_{1}^{*}}$ and ${\displaystyle N_{1}^{*}}$ are the values of ${\displaystyle E_{1}}$ and ${\displaystyle N_{1}}$ that maximize ${\displaystyle S}$ we will have:

{\displaystyle {\begin{aligned}{\frac {\partial }{\partial E_{1}}}\left[S_{1}(E_{1},N_{1})+S_{2}(E-E_{1},N-N_{1})\right]_{|E_{1}^{*},N_{1}^{*}}=0\\{\frac {\partial }{\partial N_{1}}}\left[S_{1}(E_{1},N_{1})+S_{2}(E-E_{1},N-N_{1})\right]_{|E_{1}^{*},N_{1}^{*}}=0\end{aligned}}}
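The squeeze argument used above, where the sum was bounded between its largest term and that term times the number of summands, can be checked numerically. In this sketch the terms are a hypothetical concave "entropic" profile (chosen only for illustration); the point is that ${\displaystyle \ln \sum _{i}e^{g_{i}}}$ exceeds ${\displaystyle \max _{i}g_{i}}$ by at most ${\displaystyle \ln M}$, which is negligible compared to ${\displaystyle N}$:

```python
import math

# Toy check of the "largest term dominates the logarithm" argument:
# for M terms a_i = exp(g_i) with g_i of order N,
#   max_i g_i <= ln(sum_i a_i) <= max_i g_i + ln M,
# so the correction to ln(sum) is O(ln M), negligible next to N.
def log_sum(gs):
    m = max(gs)
    return m + math.log(sum(math.exp(g - m) for g in gs))  # stable log-sum-exp

N, M = 10_000, 1000
# hypothetical concave "entropic" terms g_i = N * ln(1 + x(1-x)), x = i/M
gs = [N * math.log(1.0 + (i / M) * (1.0 - i / M)) for i in range(M)]
err = log_sum(gs) - max(gs)
print(err, math.log(M))  # err lies between 0 and ln M, tiny compared to N
```

This is the same reason why, two equations above, the ${\displaystyle O(\ln N)}$ term can be dropped in the thermodynamic limit.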

We have already previously encountered the first equation, which has led us to ${\displaystyle T_{1}=T_{2}}$. Focusing now on the second one, this leads to:

${\displaystyle {\frac {\partial }{\partial N_{1}}}S_{1}(E_{1},N_{1})_{|E_{1}^{*},N_{1}^{*}}={\frac {\partial }{\partial N_{2}}}S_{2}(E_{2},N_{2})_{\left|{\begin{smallmatrix}E_{2}^{*}=E-E_{1}^{*}\\N_{2}^{*}=N-N_{1}^{*}\end{smallmatrix}}\right.}}$
Therefore if two systems can exchange particles they will have not only the same temperature but also the same value of ${\displaystyle \partial S/\partial N}$. From thermodynamics we know that this must be related to the chemical potential ${\displaystyle \mu }$ of the system; in particular since ${\displaystyle [\mu ]={\text{J}}}$ and ${\displaystyle [\partial S/\partial N]={\text{J}}/{\text{K}}}$, we will have:
${\displaystyle {\frac {\partial S}{\partial N}}=-{\frac {\mu }{T}}}$
where however the minus sign cannot be predicted[6].
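As an illustration, we can evaluate ${\displaystyle \partial S/\partial N}$ for an assumed ideal-gas-like entropy ${\displaystyle S=k_{B}N\left[\ln(V/N)+(3/2)\ln(E/N)+c\right]}$ (the constant ${\displaystyle c}$ collects the dropped terms; the form itself is an assumption made only for this example) and confirm the analytic derivative, from which ${\displaystyle \mu =-T\,\partial S/\partial N}$ follows.

```python
import math

# Assumed ideal-gas-like entropy (k_B = 1 units, c collects dropped constants):
#   S(E, V, N) = N [ ln(V/N) + (3/2) ln(E/N) + c ]
# Analytically dS/dN = ln(V/N) + (3/2) ln(E/N) + c - 5/2, and mu = -T dS/dN;
# here we just confirm the derivative numerically.
kB, c = 1.0, 0.0

def S(E, V, N):
    return kB * N * (math.log(V / N) + 1.5 * math.log(E / N) + c)

E, V, N = 1.5e4, 1.0e3, 1.0e4  # illustrative values
h = 1.0
dSdN = (S(E, V, N + h) - S(E, V, N - h)) / (2 * h)  # central finite difference
expected = kB * (math.log(V / N) + 1.5 * math.log(E / N) + c - 2.5)
print(dSdN, expected)  # the two agree
```

Note that ${\displaystyle \partial S/\partial N}$ is intensive (it depends only on ${\displaystyle E/N}$ and ${\displaystyle V/N}$), as it must be for ${\displaystyle \mu }$ to be intensive.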

## Conclusions

Therefore, with these qualitative arguments we have shown that the fundamental postulate of statistical mechanics leads to what we expect from the thermodynamics of a system. We can therefore conclude that it is indeed a reasonable assumption.

1. In general, we should have used a generic constant ${\displaystyle k}$, but at the end of the computations that allow us to rederive the thermodynamics of the system we would find out that ${\displaystyle k}$ is precisely ${\displaystyle k_{B}}$, so we use ${\displaystyle k_{B}}$ from the beginning just for the sake of simplicity.
2. In fact, since ${\displaystyle \ln \Omega }$ is some extensive function ${\displaystyle f}$ then factorizing an ${\displaystyle N}$ we have:
{\displaystyle {\begin{aligned}\ln \Omega (E,V,N)=f(E,V,N)=f\left(N{\frac {E}{N}},N{\frac {V}{N}},N{\frac {N}{N}}\right)=\\Nf\left({\frac {E}{N}},{\frac {V}{N}},1\right):=Nf\left({\frac {E}{N}},{\frac {V}{N}}\right)\end{aligned}}}
3. If this is not the case we can anyway easily obtain this result in a more "formal" way. In fact, we have:
${\displaystyle \Omega (E)=\int d\Gamma \delta ({\mathcal {H}}-E)=\int d\Gamma _{1}d\Gamma _{2}\delta ({\mathcal {H}}_{1}+{\mathcal {H}}_{2}-E)=\int d\Gamma _{1}\Omega _{2}(E-{\mathcal {H}}_{1})}$
where we have integrated over ${\displaystyle d\Gamma _{2}}$ in the last step. Now, using a "trick":
{\displaystyle {\begin{aligned}\Omega (E)=\int dE_{1}\int d\Gamma _{1}\Omega _{2}(E-{\mathcal {H}}_{1})\delta ({\mathcal {H}}_{1}-E_{1})=\\=\int dE_{1}\int d\Gamma _{1}\Omega _{2}(E-E_{1})\delta ({\mathcal {H}}_{1}-E_{1})=\int dE_{1}\Omega _{2}(E-E_{1})\int d\Gamma _{1}\delta ({\mathcal {H}}_{1}-E_{1})=\\=\int dE_{1}\Omega _{2}(E-E_{1})\Omega _{1}(E_{1})\end{aligned}}}
4. The derivatives are intended to be taken with respect to the argument of the function:
${\displaystyle f'_{1}(\varepsilon _{1})={\frac {\partial }{\partial \varepsilon _{1}}}f_{1}(\varepsilon _{1})\quad \qquad f'_{2}(\varepsilon -\varepsilon _{1})={\frac {\partial }{\partial (\varepsilon -\varepsilon _{1})}}f_{2}(\varepsilon -\varepsilon _{1})}$
5. Alternatively we could have used a different approach to come to the same result. In fact from our analysis we have that two systems at thermal equilibrium in the microcanonical ensemble are such that ${\displaystyle \partial S/\partial E}$ has the same value for both, and from thermodynamics we know that two systems at thermal equilibrium share the same temperature; thus ${\displaystyle \partial S/\partial E}$ must be related to the temperature of a system, and since it has the dimensions of the inverse of a temperature we can define the temperature in the microcanonical ensemble as:
${\displaystyle {\frac {1}{T}}={\frac {\partial S}{\partial E}}=k_{B}{\frac {\partial }{\partial E}}\ln \Omega (E,V,N)}$
Also in this case we get to the result ${\displaystyle T_{1}=T_{2}}$.
6. Remember that this, in the end, is only a qualitative way to show that the fundamental postulate of statistical mechanics is reasonable and leads to results compatible to what we know about the thermodynamics of a system.