Laws of thermodynamics
The three laws of thermodynamics define physical quantities (temperature, energy, and entropy) that characterize thermodynamic systems at thermodynamic equilibrium. The laws describe how these quantities behave under various circumstances, and preclude the possibility of certain phenomena (such as perpetual motion).
The three laws of thermodynamics are:[1][2][3][4][5]
 First law of thermodynamics: When energy passes, as work, as heat, or with matter, into or out of a system, the system's internal energy changes in accord with the law of conservation of energy. Equivalently, perpetual motion machines of the first kind (machines that produce work with no energy input) are impossible.
 Second law of thermodynamics: In a natural thermodynamic process, the sum of the entropies of the interacting thermodynamic systems increases. Equivalently, perpetual motion machines of the second kind (machines that spontaneously convert thermal energy into mechanical work) are impossible.
 Third law of thermodynamics: The entropy of a system approaches a constant value as the temperature approaches absolute zero.[2] With the exception of non-crystalline solids (glasses), the entropy of a system at absolute zero is typically close to zero.
In addition, there is conventionally added a "zeroth law", which defines thermal equilibrium:
 Zeroth law of thermodynamics: If two systems are each in thermal equilibrium with a third system, they are in thermal equilibrium with each other. This law helps define the concept of temperature.
There have been suggestions of additional laws, but none of them achieve the generality of the four accepted laws, and they are not mentioned in standard textbooks.[1][2][3][4][6][7]
The laws of thermodynamics are fundamental laws of physics, and they are applicable in other natural sciences as well.
Zeroth law
The zeroth law of thermodynamics may be stated in the following form:
If two systems are both in thermal equilibrium with a third system then they are in thermal equilibrium with each other.[8]
The law is intended to allow the existence of an empirical parameter, the temperature, as a property of a system such that systems in thermal equilibrium with each other have the same temperature. The law as stated here is compatible with the use of a particular physical body, for example a mass of gas, to match temperatures of other bodies, but does not justify regarding temperature as a quantity that can be measured on a scale of real numbers.
Though this version of the law is one of the most commonly stated versions, it is only one of a diversity of statements that are labeled as "the zeroth law" by competent writers. Some statements go further, so as to supply the important physical fact that temperature is one-dimensional and that one can conceptually arrange bodies in a real-number sequence from colder to hotter.[9][10][11] Perhaps there exists no unique "best possible statement" of the "zeroth law", because there is in the literature a range of formulations of the principles of thermodynamics, each of which calls for its respectively appropriate version of the law.
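The transitivity asserted by the zeroth law is what allows an empirical temperature to serve as a label for equivalence classes of mutually equilibrated systems. A minimal sketch (a hypothetical toy model, not part of the formal theory) in which thermal equilibrium is modeled as agreement of an empirical temperature parameter:

```python
# Toy model (illustration only): thermal equilibrium is modeled as agreement
# of an empirical temperature parameter; the zeroth law is then the statement
# that the equilibrium relation is transitive.
def in_equilibrium(sys_a, sys_b, tol=1e-9):
    """Two systems are in thermal equilibrium iff their temperatures agree."""
    return abs(sys_a["T"] - sys_b["T"]) < tol

# Three systems: A and C are each in thermal equilibrium with B...
A, B, C = {"T": 300.0}, {"T": 300.0}, {"T": 300.0}
assert in_equilibrium(A, B) and in_equilibrium(C, B)
# ...so, by the zeroth law, A and C are in equilibrium with each other:
assert in_equilibrium(A, C)
```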
Although these concepts of temperature and of thermal equilibrium are fundamental to thermodynamics and were clearly stated in the nineteenth century, the desire to explicitly number the above law was not widely felt until Fowler and Guggenheim did so in the 1930s, long after the first, second, and third laws were already widely understood and recognized. Hence it was numbered the zeroth law. The importance of the law as a foundation to the earlier laws is that it allows the definition of temperature in a non-circular way, without reference to entropy, its conjugate variable. Such a temperature definition is said to be 'empirical'.[12][13][14][15][16][17]
First law
The first law of thermodynamics is a version of the law of conservation of energy, adapted for thermodynamic systems.
The law of conservation of energy states that the total energy of an isolated system is constant; energy can be transformed from one form to another, but can be neither created nor destroyed.
For a thermodynamic process without transfer of matter, the first law is often formulated
ΔU_{system} = Q − W,
where ΔU_{system} denotes the change in the internal energy of a closed system, Q denotes the quantity of energy supplied to the system as heat, and W denotes the amount of thermodynamic work (expressed here with a negative sign) done by the system on its surroundings.
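The sign convention above can be made concrete with a small numerical sketch (the function name is illustrative, not standard terminology):

```python
# Minimal sketch of the first law for a closed system without matter transfer:
# dU = Q - W, where W is the work done BY the system on its surroundings.
def internal_energy_change(q_in, w_by_system):
    """Return the change in internal energy, in the same units as the inputs."""
    return q_in - w_by_system

# A gas absorbs 500 J of heat and does 200 J of work on its surroundings,
# so its internal energy rises by 300 J:
assert internal_energy_change(500.0, 200.0) == 300.0
```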
In the case of a thermodynamic cycle of a closed system, which returns to its original state, the heat Q_{in} supplied to the system in one stage of the cycle, minus the heat Q_{out} removed from it in another stage of the cycle, plus the thermodynamic work added to the system, W_{in}, equals the thermodynamic work that leaves the system W_{out}.
Hence, for a full cycle,
Q_{in} − Q_{out} + W_{in} = W_{out}.
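Because the internal energy returns to its initial value over a complete cycle, the heat and work flows must balance. A sketch with illustrative numbers (not data for any real engine):

```python
# Sketch: over one complete cycle the internal energy returns to its initial
# value (dU = 0), so net heat absorbed equals net work delivered:
# Q_in - Q_out + W_in - W_out = 0.
def cycle_balances(q_in, q_out, w_in, w_out, tol=1e-9):
    """Check the first-law energy balance for a full thermodynamic cycle."""
    return abs((q_in - q_out) + (w_in - w_out)) < tol

# Illustrative numbers: 1000 J of heat in, 600 J rejected,
# 50 J of stirring work in, 450 J of shaft work out.
assert cycle_balances(1000.0, 600.0, 50.0, 450.0)
```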
For the particular case of a thermally isolated (adiabatically isolated) system, the change of internal energy can only be the result of work transferred to the system, because the adiabatic assumption is: Q = 0.
For processes that include transfer of matter, a further statement is needed: 'With due account of the respective fiducial reference states of the systems, when two systems, which may be of different chemical compositions, initially separated only by an impermeable wall, and otherwise isolated, are combined into a new system by the thermodynamic operation of removal of the wall, then
U_{system} = U_{1} + U_{2},
where U_{system} denotes the internal energy of the combined system, and U_{1} and U_{2} denote the internal energies of the respective separated systems.'
The First Law encompasses several principles:
 The law of conservation of energy: energy can be neither created nor destroyed. However, energy can change forms, and energy can flow from one place to another. A particular consequence of the law of conservation of energy is that the total energy of an isolated system does not change.
 The concept of internal energy and its relationship to temperature.
 If a system has a definite temperature, then its total energy has three distinguishable components. If the system is in motion as a whole, it has kinetic energy. If the system as a whole is in an externally imposed force field (e.g. gravity), it has potential energy relative to some reference point in space. Finally, it has internal energy, which is a fundamental quantity of thermodynamics. The establishment of the concept of internal energy distinguishes the first law of thermodynamics from the more general law of conservation of energy.
 The internal energy of a substance can be explained as the sum of the diverse kinetic energies of the erratic microscopic motions of its constituent atoms, and of the potential energy of interactions between them. Those microscopic energy terms are collectively called the substance's internal energy, U, and are accounted for by macroscopic thermodynamic properties. The total of the kinetic energies of microscopic motions of the constituent atoms increases as the system's temperature increases; this assumes that no other interactions at the microscopic level of the system, such as chemical reactions or changes in the potential energy of constituent atoms with respect to each other, are taking place.
 Work is a process of transferring energy to or from a system in ways that can be described by macroscopic mechanical forces exerted by factors in the surroundings, outside the system. Examples are an externally driven shaft agitating a stirrer within the system, or an externally imposed electric field that polarizes the material of the system, or a piston that compresses the system. Unless otherwise stated, it is customary to treat work as occurring without its dissipation to the surroundings. Practically speaking, in all natural processes, some of the work is dissipated by internal friction or viscosity. The work done by the system can come from its overall kinetic energy, from its overall potential energy, or from its internal energy.
 For example, when a machine (not a part of the system) lifts a system upwards, some energy is transferred from the machine to the system. The system's energy increases as work is done on the system; in this particular case, the energy increase of the system is manifested as an increase in the system's gravitational potential energy. Work added to the system increases the potential energy of the system:
W = ΔPE_{system}
 Or in general, the energy added to the system in the form of work can be partitioned into kinetic, potential, or internal energy:
W = ΔKE_{system} + ΔPE_{system} + ΔU_{system}
 When matter is transferred into a system, the internal energy and the potential energy associated with that mass are transferred with it:
ΔU_{system} = u ΔM,
 where u denotes the internal energy per unit mass of the transferred matter, as measured while in the surroundings, and ΔM denotes the amount of transferred mass.
 The flow of heat is a form of energy transfer.
 Heating is a natural process of moving energy to or from a system other than by work or the transfer of matter. Direct passage of heat is only from a hotter to a colder system.
 If the system has rigid walls that are impermeable to matter, and consequently energy cannot be transferred as work into or out from the system, and no external longrange force field affects it that could change its internal energy, then the internal energy can only be changed by the transfer of energy as heat:
ΔU_{system} = Q,
where Q denotes the amount of energy transferred into the system as heat.
Combining these principles leads to one traditional statement of the first law of thermodynamics: it is not possible to construct a machine which will perpetually output work without an equal amount of energy input to that machine. Or more briefly, a perpetual motion machine of the first kind is impossible.
Second law
The second law of thermodynamics indicates the irreversibility of natural processes, and, in many cases, the tendency of natural processes to lead towards spatial homogeneity of matter and energy, and especially of temperature. It can be formulated in a variety of interesting and important ways.
It implies the existence of a quantity called the entropy of a thermodynamic system. In terms of this quantity it implies that
When two initially isolated systems in separate but nearby regions of space, each in thermodynamic equilibrium with itself but not necessarily with each other, are then allowed to interact, they will eventually reach a mutual thermodynamic equilibrium. The sum of the entropies of the initially isolated systems is less than or equal to the total entropy of the final combination. Equality occurs just when the two original systems have all their respective intensive variables (temperature, pressure) equal; then the final system also has the same values.
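The inequality can be illustrated with the classic example of two bodies of equal, constant heat capacity brought into thermal contact (a simplified sketch; real bodies need not have constant heat capacity):

```python
import math

# Sketch under simplifying assumptions: two incompressible bodies, each with
# the same constant heat capacity C (J/K), equilibrate at the mean temperature.
def total_entropy_change(C, t1, t2):
    """Total dS (J/K) when bodies at t1 and t2 (kelvin) reach equilibrium."""
    tf = (t1 + t2) / 2.0  # final common temperature
    # dS of each body is C * ln(T_final / T_initial); sum the two.
    return C * (math.log(tf / t1) + math.log(tf / t2))

# Unequal temperatures: the entropy sum strictly increases (second law).
assert total_entropy_change(100.0, 300.0, 400.0) > 0.0
# Equal temperatures: nothing happens, so the entropy sum is unchanged.
assert abs(total_entropy_change(100.0, 350.0, 350.0)) < 1e-12
```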
The second law is applicable to a wide variety of processes, reversible and irreversible. All natural processes are irreversible. Reversible processes are a useful and convenient theoretical fiction, but do not occur in nature.
A prime example of irreversibility is in the transfer of heat by conduction or radiation. It was known long before the discovery of the notion of entropy that when two bodies initially of different temperatures come into thermal connection, then heat always flows from the hotter body to the colder one.
The second law also tells about kinds of irreversibility other than heat transfer, for example those of friction and viscosity, and those of chemical reactions. The notion of entropy is needed to provide that wider scope of the law.
According to the second law of thermodynamics, in a theoretical and fictive reversible heat transfer, an element of heat transferred, δQ, is the product of the temperature (T), both of the system and of the sources or destination of the heat, with the increment (dS) of the system's conjugate variable, its entropy (S):
δQ = T dS
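For a reversible process the relation δQ = T dS can be integrated; for example, reversibly heating a body of constant heat capacity C from T₁ to T₂ gives ΔS = C ln(T₂/T₁). A sketch (the heat capacity of water used here is an approximate figure):

```python
import math

# Sketch: integrating dS = dQ/T = C dT / T for constant heat capacity C
# from t1 to t2 gives dS = C * ln(t2 / t1).
def reversible_entropy_change(C, t1, t2):
    """Entropy change (J/K) for reversible heating from t1 to t2 kelvin."""
    return C * math.log(t2 / t1)

# Heating 1 kg of liquid water (C ~ 4184 J/K, approximate) from 300 K to 350 K
# increases its entropy by roughly 645 J/K:
ds = reversible_entropy_change(4184.0, 300.0, 350.0)
assert ds > 0.0
```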
Entropy may also be viewed as a physical measure of the lack of physical information about the microscopic details of the motion and configuration of a system, when only the macroscopic states are known. This lack of information is often described as disorder on a microscopic or molecular scale. The law asserts that for two given macroscopically specified states of a system, there is a quantity called the difference of information entropy between them. This information entropy difference defines how much additional microscopic physical information is needed to specify one of the macroscopically specified states, given the macroscopic specification of the other – often a conveniently chosen reference state which may be presupposed to exist rather than explicitly stated. A final condition of a natural process always contains microscopically specifiable effects which are not fully and exactly predictable from the macroscopic specification of the initial condition of the process. This is why entropy increases in natural processes – the increase tells how much extra microscopic information is needed to distinguish the final macroscopically specified state from the initial macroscopically specified state.[18]
Third law
The third law of thermodynamics is sometimes stated as follows:
 The entropy of a perfect crystal of any pure substance approaches zero as the temperature approaches absolute zero.
At zero temperature the system must be in a state with the minimum thermal energy. This statement holds true if the perfect crystal has only one state with minimum energy. Entropy is related to the number of possible microstates according to:
S = k_{B} ln Ω,
where S is the entropy of the system, k_{B} the Boltzmann constant, and Ω the number of microstates (e.g. possible configurations of atoms). At absolute zero there is only one possible microstate (Ω = 1, since for a pure substance all the atoms are identical and a perfect crystal admits only one arrangement), and ln(1) = 0.
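The Boltzmann relation can be evaluated directly; a trivial numerical sketch:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

def boltzmann_entropy(omega):
    """S = k_B * ln(omega) for omega accessible microstates."""
    return K_B * math.log(omega)

# A perfect crystal at absolute zero has a single microstate, so S = 0,
# consistent with the third law:
assert boltzmann_entropy(1) == 0.0
# Any larger number of microstates gives strictly positive entropy:
assert boltzmann_entropy(2) > 0.0
```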
A more general form of the third law applies to systems, such as glasses, that may have more than one minimum microscopically distinct energy state, or that may have a microscopically distinct state "frozen in" which is not strictly a minimum-energy state and not, strictly speaking, a state of thermodynamic equilibrium at absolute zero temperature:
 The entropy of a system approaches a constant value as the temperature approaches zero.
The constant value (not necessarily zero) is called the residual entropy of the system.
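A standard illustration of residual entropy is the carbon monoxide crystal: each CO molecule can freeze in with either of two nearly equivalent orientations, so the theoretical residual entropy per mole is at most R ln 2 (a sketch of that estimate):

```python
import math

R = 8.314462618  # molar gas constant, J/(mol K)

# Sketch: if each of the N_A molecules in a mole independently retains one of
# two orientations at absolute zero, then Omega = 2**N_A and
# S_res = k_B * ln(2**N_A) = N_A * k_B * ln 2 = R * ln 2 per mole.
s_residual = R * math.log(2)  # ~ 5.76 J/(mol K), the theoretical upper estimate
```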
History
Circa 1797, Count Rumford (born Benjamin Thompson) showed that endless mechanical action can generate indefinitely large amounts of heat from a fixed amount of working substance, thus challenging the caloric theory of heat, which held that there would be a finite amount of caloric in a fixed amount of working substance. The first established thermodynamic principle, which eventually became the second law of thermodynamics, was formulated by Sadi Carnot in 1824. By 1860, as formalized in the works of Rudolf Clausius and William Thomson, among others, two established principles of thermodynamics had evolved, the first principle and the second principle, later restated as thermodynamic laws. By 1873, for example, the thermodynamicist Josiah Willard Gibbs, in his memoir Graphical Methods in the Thermodynamics of Fluids, clearly stated the first two absolute laws of thermodynamics. Some textbooks throughout the 20th century numbered the laws differently. In some fields removed from chemistry, the second law was considered to deal with the efficiency of heat engines only, whereas what was called the third law dealt with entropy increases. Directly defining zero points for entropy calculations was not considered to be a law. Gradually, this separation was combined into the second law, and the modern third law was widely adopted.
See also
 Chemical thermodynamics
 Conservation law
 Entropy production
 Ginsberg's theorem
 Heat death of the universe
 H-theorem
 Laws of science
 Onsager reciprocal relations (sometimes described as a fourth law of thermodynamics)
 Statistical mechanics
 Table of thermodynamic equations
References
 Guggenheim, E.A. (1985). Thermodynamics. An Advanced Treatment for Chemists and Physicists, seventh edition, North Holland, Amsterdam, ISBN 0444869514.
 Kittel, C., Kroemer, H. (1980). Thermal Physics, second edition, W.H. Freeman, San Francisco, ISBN 0716710889.
 Adkins, C.J. (1968). Equilibrium Thermodynamics, McGraw-Hill, London, ISBN 0070840571.
 Lebon, G., Jou, D., Casas-Vázquez, J. (2008). Understanding Non-equilibrium Thermodynamics. Foundations, Applications, Frontiers, Springer, Berlin, ISBN 9783540742524.
 Chris Vuille; Serway, Raymond A.; Faughn, Jerry S. (2009). College physics. Belmont, CA: Brooks/Cole, Cengage Learning. p. 355. ISBN 9780495386933.
 De Groot, S.R., Mazur, P. (1962). Non-equilibrium Thermodynamics, North Holland, Amsterdam.
 Glansdorff, P., Prigogine, I. (1971). Thermodynamic Theory of Structure, Stability and Fluctuations, Wiley-Interscience, London, ISBN 0471302805.
 Guggenheim (1985), p. 8.
 Sommerfeld, A. (1951/1955). Thermodynamics and Statistical Mechanics, vol. 5 of Lectures on Theoretical Physics, edited by F. Bopp, J. Meixner, translated by J. Kestin, Academic Press, New York, p. 1.
 Serrin, J. (1978). The concepts of thermodynamics, in Contemporary Developments in Continuum Mechanics and Partial Differential Equations. Proceedings of the International Symposium on Continuum Mechanics and Partial Differential Equations, Rio de Janeiro, August 1977, edited by G.M. de La Penha, L.A.J. Medeiros, North-Holland, Amsterdam, ISBN 0444851666, pp. 411–51.
 Serrin, J. (1986). Chapter 1, 'An Outline of Thermodynamical Structure', pp. 3–32, in New Perspectives in Thermodynamics, edited by J. Serrin, Springer, Berlin, ISBN 3540159312.
 Adkins, C.J. (1968/1983). Equilibrium Thermodynamics, (first edition 1968), third edition 1983, Cambridge University Press, ISBN 0521254450, pp. 18–20.
 Bailyn, M. (1994). A Survey of Thermodynamics, American Institute of Physics Press, New York, ISBN 0883187973, p. 26.
 Buchdahl, H.A. (1966), The Concepts of Classical Thermodynamics, Cambridge University Press, London, pp. 30, 34ff, 46f, 83.
 Münster, A. (1970), Classical Thermodynamics, translated by E.S. Halberstadt, Wiley–Interscience, London, ISBN 0471624306, p. 22.
 Pippard, A.B. (1957/1966). Elements of Classical Thermodynamics for Advanced Students of Physics, original publication 1957, reprint 1966, Cambridge University Press, Cambridge, p. 10.
 Wilson, H.A. (1966). Thermodynamics and Statistical Mechanics, Cambridge University Press, London, pp. 4, 8, 68, 86, 97, 311.
 Ben-Naim, A. (2008). A Farewell to Entropy: Statistical Thermodynamics Based on Information, World Scientific, New Jersey, ISBN 9789812707062.
Further reading
 Atkins, Peter (2007). Four Laws That Drive the Universe. OUP Oxford. ISBN 9780199232369
 Goldstein, Martin & Inge F. (1993). The Refrigerator and the Universe. Harvard Univ. Press. ISBN 9780674753259
External links
 In Our Time: Perpetual Motion, BBC discussion about the Laws, with Ruth Gregory, Frank Close and Steven Bramwell, hosted by Melvyn Bragg, first broadcast 24 September 2015.