# Free entropy

A thermodynamic free entropy is an entropic thermodynamic potential analogous to the free energy. It is also known as a Massieu, Planck, or Massieu–Planck potential (or function), or (rarely) as free information. In statistical mechanics, free entropies frequently appear as the logarithm of a partition function. The Onsager reciprocal relations, in particular, are developed in terms of entropic potentials. In mathematics, free entropy means something quite different: it is a generalization of entropy defined in the subject of free probability.

A free entropy is generated by a Legendre transformation of the entropy. The different potentials correspond to different constraints to which the system may be subjected.
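Concretely, each transform trades an extensive variable for its entropic conjugate, the corresponding partial derivative of the entropy. Schematically (a sketch in standard notation, not taken from any particular source),

${\displaystyle \psi =S-\sum _{k}{\frac {\partial S}{\partial X_{k}}}X_{k},}$

where the sum runs over the extensive variables ${\displaystyle X_{k}}$ being traded; taking only ${\displaystyle X_{k}=U}$ with ${\displaystyle \partial S/\partial U=1/T}$ yields the Massieu potential.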

## Examples

The most common examples are:

| Name | Function | Alt. function | Natural variables |
|---|---|---|---|
| Entropy | ${\displaystyle S={\frac {1}{T}}U+{\frac {P}{T}}V-\sum _{i=1}^{s}{\frac {\mu _{i}}{T}}N_{i}}$ | | ${\displaystyle U,V,\{N_{i}\}}$ |
| Massieu potential / Helmholtz free entropy | ${\displaystyle \Phi =S-{\frac {1}{T}}U}$ | ${\displaystyle =-{\frac {A}{T}}}$ | ${\displaystyle {\frac {1}{T}},V,\{N_{i}\}}$ |
| Planck potential / Gibbs free entropy | ${\displaystyle \Xi =\Phi -{\frac {P}{T}}V}$ | ${\displaystyle =-{\frac {G}{T}}}$ | ${\displaystyle {\frac {1}{T}},{\frac {P}{T}},\{N_{i}\}}$ |

where ${\displaystyle U}$ is the internal energy, ${\displaystyle T}$ the temperature, ${\displaystyle P}$ the pressure, ${\displaystyle V}$ the volume, ${\displaystyle \mu _{i}}$ the chemical potential of the ${\displaystyle i}$-th species, ${\displaystyle N_{i}}$ the number of particles of that species, ${\displaystyle s}$ the number of species, ${\displaystyle A}$ the Helmholtz free energy, and ${\displaystyle G}$ the Gibbs free energy.

Note that the use of the terms "Massieu" and "Planck" for explicit Massieu–Planck potentials is somewhat obscure and ambiguous; in particular, "Planck potential" has alternative meanings. The most standard notation for an entropic potential is ${\displaystyle \psi }$, used by both Planck and Schrödinger. (Note that Gibbs used ${\displaystyle \psi }$ to denote the free energy.) Free entropies were invented by the French engineer François Massieu in 1869, and actually predate Gibbs's free energy (1875).

## Dependence of the potentials on the natural variables

### Entropy

${\displaystyle S=S(U,V,\{N_{i}\})}$

By the definition of a total differential,

${\displaystyle dS={\frac {\partial S}{\partial U}}dU+{\frac {\partial S}{\partial V}}dV+\sum _{i=1}^{s}{\frac {\partial S}{\partial N_{i}}}dN_{i}}$.

From the equations of state,

${\displaystyle dS={\frac {1}{T}}dU+{\frac {P}{T}}dV+\sum _{i=1}^{s}(-{\frac {\mu _{i}}{T}})dN_{i}}$.

The differentials in the above equation are all of extensive variables, so they may be integrated to yield

${\displaystyle S={\frac {U}{T}}+{\frac {PV}{T}}+\sum _{i=1}^{s}(-{\frac {\mu _{i}N_{i}}{T}})}$.
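This Euler relation can be checked symbolically for a concrete model. The following is a minimal sketch using SymPy with the monatomic ideal-gas entropy for a single species; the additive constant `c` stands in for the Sackur–Tetrode terms, and all variable names here are illustrative rather than standard:

```python
import sympy as sp

# Extensive variables and constants (single-component monatomic ideal gas)
U, V, N, k, c = sp.symbols('U V N k c', positive=True)

# Ideal-gas entropy up to the additive constant c
S = N * k * (sp.log(V * U**sp.Rational(3, 2) / N**sp.Rational(5, 2)) + c)

# Equations of state: 1/T = dS/dU, P/T = dS/dV, -mu/T = dS/dN
inv_T = sp.diff(S, U)
P_over_T = sp.diff(S, V)
neg_mu_over_T = sp.diff(S, N)

# Euler relation: S = U/T + (P/T)V - (mu/T)N
euler = U * inv_T + V * P_over_T + N * neg_mu_over_T
print(sp.simplify(euler - S))  # 0
```

The relation holds because ${\displaystyle S}$ is first-order homogeneous in its extensive arguments, so any entropy of that form would pass the same check.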

### Massieu potential / Helmholtz free entropy

${\displaystyle \Phi =S-{\frac {U}{T}}}$
${\displaystyle \Phi ={\frac {U}{T}}+{\frac {PV}{T}}+\sum _{i=1}^{s}(-{\frac {\mu _{i}N_{i}}{T}})-{\frac {U}{T}}}$
${\displaystyle \Phi ={\frac {PV}{T}}+\sum _{i=1}^{s}(-{\frac {\mu _{i}N_{i}}{T}})}$

Starting over at the definition of ${\displaystyle \Phi }$ and taking the total differential, we have via a Legendre transform (and the chain rule)

${\displaystyle d\Phi =dS-{\frac {1}{T}}dU-Ud{\frac {1}{T}}}$,
${\displaystyle d\Phi ={\frac {1}{T}}dU+{\frac {P}{T}}dV+\sum _{i=1}^{s}(-{\frac {\mu _{i}}{T}})dN_{i}-{\frac {1}{T}}dU-Ud{\frac {1}{T}}}$,
${\displaystyle d\Phi =-Ud{\frac {1}{T}}+{\frac {P}{T}}dV+\sum _{i=1}^{s}(-{\frac {\mu _{i}}{T}})dN_{i}}$.

The above differentials are not all of extensive variables, so the equation may not be directly integrated. From ${\displaystyle d\Phi }$ we see that

${\displaystyle \Phi =\Phi ({\frac {1}{T}},V,\{N_{i}\})}$.

If reciprocal variables are not desired,[3]:222

${\displaystyle d\Phi =dS-{\frac {TdU-UdT}{T^{2}}}}$,
${\displaystyle d\Phi =dS-{\frac {1}{T}}dU+{\frac {U}{T^{2}}}dT}$,
${\displaystyle d\Phi ={\frac {1}{T}}dU+{\frac {P}{T}}dV+\sum _{i=1}^{s}(-{\frac {\mu _{i}}{T}})dN_{i}-{\frac {1}{T}}dU+{\frac {U}{T^{2}}}dT}$,
${\displaystyle d\Phi ={\frac {U}{T^{2}}}dT+{\frac {P}{T}}dV+\sum _{i=1}^{s}(-{\frac {\mu _{i}}{T}})dN_{i}}$,
${\displaystyle \Phi =\Phi (T,V,\{N_{i}\})}$.
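Both coefficients derived above, ${\displaystyle -U}$ against ${\displaystyle d(1/T)}$ and ${\displaystyle U/T^{2}}$ against ${\displaystyle dT}$, can be verified symbolically. A minimal SymPy sketch, again using the monatomic ideal gas purely as an illustrative model (`beta` denotes ${\displaystyle 1/T}$, `c` an additive constant):

```python
import sympy as sp

beta, T, V, N, k, c = sp.symbols('beta T V N k c', positive=True)
U = sp.Symbol('U', positive=True)

# Monatomic ideal-gas entropy in (U, V, N)
S = N * k * (sp.log(V * U**sp.Rational(3, 2) / N**sp.Rational(5, 2)) + c)

# Invert the equation of state 1/T = dS/dU = (3/2) N k / U to get U(beta)
U_beta = sp.Rational(3, 2) * N * k / beta

# Massieu potential in its natural variables (beta, V, N)
Phi = S.subs(U, U_beta) - beta * U_beta

# Coefficient of d(1/T) in dPhi is -U
assert sp.simplify(sp.diff(Phi, beta) + U_beta) == 0

# In (T, V, N) the coefficient of dT is U/T^2
Phi_T = Phi.subs(beta, 1 / T)
print(sp.simplify(sp.diff(Phi_T, T) - U_beta.subs(beta, 1 / T) / T**2))  # 0
```

The two checks are the same statement in reciprocal and direct variables, related by ${\displaystyle d(1/T)=-dT/T^{2}}$.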

### Planck potential / Gibbs free entropy

${\displaystyle \Xi =\Phi -{\frac {PV}{T}}}$
${\displaystyle \Xi ={\frac {PV}{T}}+\sum _{i=1}^{s}(-{\frac {\mu _{i}N_{i}}{T}})-{\frac {PV}{T}}}$
${\displaystyle \Xi =\sum _{i=1}^{s}(-{\frac {\mu _{i}N_{i}}{T}})}$

Starting over at the definition of ${\displaystyle \Xi }$ and taking the total differential, we have via a Legendre transform (and the chain rule)

${\displaystyle d\Xi =d\Phi -{\frac {P}{T}}dV-Vd{\frac {P}{T}}}$
${\displaystyle d\Xi =-Ud{\frac {1}{T}}+{\frac {P}{T}}dV+\sum _{i=1}^{s}(-{\frac {\mu _{i}}{T}})dN_{i}-{\frac {P}{T}}dV-Vd{\frac {P}{T}}}$
${\displaystyle d\Xi =-Ud{\frac {1}{T}}-Vd{\frac {P}{T}}+\sum _{i=1}^{s}(-{\frac {\mu _{i}}{T}})dN_{i}}$.

The above differentials are not all of extensive variables, so the equation may not be directly integrated. From ${\displaystyle d\Xi }$ we see that

${\displaystyle \Xi =\Xi ({\frac {1}{T}},{\frac {P}{T}},\{N_{i}\})}$.

If reciprocal variables are not desired,[3]:222

${\displaystyle d\Xi =d\Phi -{\frac {T(PdV+VdP)-PVdT}{T^{2}}}}$,
${\displaystyle d\Xi =d\Phi -{\frac {P}{T}}dV-{\frac {V}{T}}dP+{\frac {PV}{T^{2}}}dT}$,
${\displaystyle d\Xi ={\frac {U}{T^{2}}}dT+{\frac {P}{T}}dV+\sum _{i=1}^{s}(-{\frac {\mu _{i}}{T}})dN_{i}-{\frac {P}{T}}dV-{\frac {V}{T}}dP+{\frac {PV}{T^{2}}}dT}$,
${\displaystyle d\Xi ={\frac {U+PV}{T^{2}}}dT-{\frac {V}{T}}dP+\sum _{i=1}^{s}(-{\frac {\mu _{i}}{T}})dN_{i}}$,
${\displaystyle \Xi =\Xi (T,P,\{N_{i}\})}$.
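As with the Massieu potential, the coefficients of ${\displaystyle d\Xi }$ can be checked symbolically. A minimal SymPy sketch with the monatomic ideal gas as an illustrative model (`beta` denotes ${\displaystyle 1/T}$, `gamma` denotes ${\displaystyle P/T}$, `c` an additive constant):

```python
import sympy as sp

beta, gamma, N, k, c = sp.symbols('beta gamma N k c', positive=True)
U, V = sp.symbols('U V', positive=True)

# Monatomic ideal-gas entropy in (U, V, N)
S = N * k * (sp.log(V * U**sp.Rational(3, 2) / N**sp.Rational(5, 2)) + c)

# Natural-variable substitutions from the equations of state:
U_beta = sp.Rational(3, 2) * N * k / beta   # from 1/T = dS/dU = (3/2) N k / U
V_gamma = N * k / gamma                     # from P/T = dS/dV = N k / V

# Massieu potential, then Planck potential Xi = Phi - (P/T) V
Phi = S.subs(U, U_beta) - beta * U_beta
Xi = Phi.subs(V, V_gamma) - gamma * V_gamma

# dXi = -U d(1/T) - V d(P/T) + ... : check both coefficients
print(sp.simplify(sp.diff(Xi, beta) + U_beta))    # 0
print(sp.simplify(sp.diff(Xi, gamma) + V_gamma))  # 0
```

For this one-component model ${\displaystyle \Xi =-\mu N/T}$, consistent with the integrated form above.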

## References

1. Antoni Planes; Eduard Vives (2000-10-24). "Entropic variables and Massieu-Planck functions". Entropic Formulation of Statistical Mechanics. Universitat de Barcelona. Retrieved 2007-09-18.
2. T. Wada; A.M. Scarfone (December 2004). "Connections between Tsallis' formalisms employing the standard linear average energy and ones employing the normalized q-average energy". Physics Letters A. 335 (5–6): 351–362. arXiv:cond-mat/0410527. Bibcode:2005PhLA..335..351W. doi:10.1016/j.physleta.2004.12.054.
3. The Collected Papers of Peter J. W. Debye. New York, New York: Interscience Publishers, Inc. 1954.

## Bibliography

• Massieu, M.F. (1869). Compt. Rend. 69: 858, 1057.
• Callen, Herbert B. (1985). Thermodynamics and an Introduction to Thermostatistics (2nd ed.). New York: John Wiley & Sons. ISBN 0-471-86256-8.