Thomas Moore outlined the three main questions of thermodynamics in chapter T1:
1. How exactly do objects store energy internally?
2. Why is temperature linked to energy?
3. Why is heat transfer one-directional and irreversible?
Chapter T2 answers question 1 with the Einstein model, in which the internal energy of a solid is represented as the kinetic and potential energy of its atoms oscillating in a lattice. It then explores what this model implies about which energy distributions are more or less probable when two solids with different amounts of energy are brought into contact, starting to answer questions 2 and 3.
Chapter T3 formalizes the explorations of Chapter T2 into definitions for entropy and temperature, arriving at the second law of thermodynamics.
How exactly do objects store energy internally?
In 1907, Einstein came up with a simple model that accurately described the nature and behavior of monatomic solids above 100K. Constrained application space, sure, but the model allows us to explore thermodynamic principles that generalize beyond Einstein solids.
Einstein's model first considers a solid as equivalent to a lattice of atoms connected to each other by springs, which represent the various electromagnetic and other interactions between them:
But this is still a mathematically complex model; even simpler is to think of each atom as sitting in a rigid box, held to each side by a spring. Now the springs on each atom operate independently of every other atom, and the motion of each atom reduces to three independent 1D oscillations, one along each axis.
Now it's easy to calculate the energy of each atom (the sum across all atoms of which constitutes the solid's internal energy). Classically, it's just the kinetic + spring potential energy in each axis:
$$E = \sum_{i=1}^{3} \left( \frac{1}{2}mv_i^2 + \frac{1}{2}kx_i^2 \right)$$
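Here's a minimal Python sketch of this classical per-atom energy (the mass, spring constant, velocities, and displacements are made-up illustrative values, not anything from the text):

```python
def classical_atom_energy(m, k, v, x):
    """Classical energy of one atom: sum over the 3 axes of kinetic + spring potential energy."""
    return sum(0.5 * m * vi**2 + 0.5 * k * xi**2 for vi, xi in zip(v, x))

# Made-up illustrative values (SI units): mass, spring constant, velocity and displacement per axis
E = classical_atom_energy(m=1.0e-26, k=5.0, v=(10.0, 0.0, -20.0), x=(1e-11, 3e-11, 0.0))
print(E)
```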
Quantum physics tells us that the possible total energies of oscillating particles are in fact quantized, meaning that they must be discrete integer multiples of some value rather than following a continuous distribution. I'll link the relevant physics for this if I ever get around to writing them up, but the quick intuitive explanation is that particles behave like waves in certain ways, and waves can only form standing waves at specific wavelengths given their boundaries. Using quantum energy equations, we can express the energy of each atom as follows:
$$E = \sum_{i=1}^{3} \epsilon \left(n_i + \frac{1}{2}\right)$$
where $\epsilon = \hbar \sqrt{\frac{k}{m}}$ is a constant determined by the spring constant and the atom's mass.
The energy of the entire solid is the same expression summed over the $3N$ oscillators of all $N$ atoms in the solid:
$$E = \sum_{i=1}^{3N} \epsilon \left(n_i + \frac{1}{2}\right) = \sum_{i=1}^{3N} \epsilon n_i + \frac{3}{2}N\epsilon$$
But $\frac{3}{2}N\epsilon$ is energy that can never be transferred, so we don't count it in the solid's thermal energy. Thus we find that the internal thermal energy of an Einstein solid with $N$ atoms is simply:
$$U = \sum_{i=1}^{3N} \epsilon n_i$$
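As a toy illustration, here's a short sketch that computes $U$ for one particular microstate; the spring constant, mass, and quantum numbers are made-up values for illustration:

```python
import math

hbar = 1.054571817e-34             # reduced Planck constant, J*s
k, m = 5.0, 1.0e-26                # made-up spring constant (N/m) and atomic mass (kg)
epsilon = hbar * math.sqrt(k / m)  # energy unit of each oscillator

# Quantum numbers n_i for the 3N oscillators of an N = 2 atom "solid" (made-up microstate)
n = [0, 2, 1, 0, 0, 3]
U = epsilon * sum(n)               # thermal energy of this microstate
print(U)
```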
This model and equation aren't too useful directly. Finding the exact energy state $n$ of every atom in a solid is hardly an efficient way to calculate its internal energy. But we can use this model to probe the nature of heat transfer: what happens when we put two Einstein solids with different amounts of energy together? The properties and implications that emerge from this model will help answer our larger questions.
Why is temperature linked to energy? Why is heat transfer one-directional and irreversible?
There is a fixed number of ways to distribute a total internal energy $U$ among the $N$ atoms in a solid, or more precisely among their $3N$ oscillators.
For example, if you have one atom and $1 \epsilon$ of total internal energy, there are three possible distributions of this energy: that $1 \epsilon$ is contained in either the $x$, $y$, or $z$ oscillators of the atom. We might represent these possible arrangements -- called microstates -- as 100, 010, and 001. If we have $2 \epsilon$ of total internal energy, the possible microstates are 002, 020, 200, 110, 101, and 011. If we have $1 \epsilon$ of energy but two atoms, there are also six possible microstates: 100000, 010000, 001000, 000100, 000010, and 000001.
Larger numbers of atoms and larger amounts of energy make even more microstates possible. Let's call the number of possible microstates the multiplicity, written $\Omega$. It's a simple matter of combinatorics to find multiplicity as a function of internal energy $U$ and number of atoms $N$, with $q = U/\epsilon$:
$$\Omega(U, N) = \frac{(q+3N-1)!}{q!(3N-1)!}$$
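Here's a small Python sketch (not from the text) that evaluates this formula and checks it against a brute-force enumeration of microstates for the tiny examples above:

```python
from math import comb
from itertools import product

def multiplicity(q, N):
    """Omega(U, N) with q = U / epsilon: ways to distribute q energy units among 3N oscillators."""
    return comb(q + 3 * N - 1, 3 * N - 1)

def brute_force_count(q, N):
    """Directly enumerate all assignments of 0..q units to 3N oscillators and count those summing to q."""
    return sum(1 for ns in product(range(q + 1), repeat=3 * N) if sum(ns) == q)

print(multiplicity(1, 1), brute_force_count(1, 1))  # 3 3   (1 unit of energy, 1 atom)
print(multiplicity(2, 1), brute_force_count(2, 1))  # 6 6   (2 units, 1 atom)
print(multiplicity(1, 2), brute_force_count(1, 2))  # 6 6   (1 unit, 2 atoms)
```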
The fundamental assumption of statistical mechanics, which underlies thermodynamics, is that every microstate is equally likely to occur in the long run. In other words, the probability of a system being in any given microstate is $\frac{1}{\Omega_{\text{system}}}$.
Now consider not just one system, but two Einstein solids brought into contact with each other. Let one have $N_A$ atoms and energy $U_{A, \text{init}}$ and the other have $N_B$ atoms and energy $U_{B, \text{init}}$. When they are brought together, the total energy is $U_\text{tot} = U_{A, \text{init}} + U_{B, \text{init}}$. This amount of energy can now redistribute itself between $U_A$ and $U_B$ in any split as long as $U_A + U_B = U_\text{tot}$. For example, if $U_\text{tot}=10$ (in units of $\epsilon$), then $(U_A, U_B)$ can be $(0, 10), (1, 9), (2, 8), ..., (8, 2), (9, 1), (10, 0)$.
One property of multiplicity is that the multiplicity of two systems together (but not interacting) is simply the product of the multiplicity of each (because each is just a count of possible combinations):
$$\Omega_{AB} = \Omega_A \Omega_B$$
Thus, some splits of $U_A$ and $U_B$ have more possible microstates than others, because the product $\Omega_{AB} = \Omega_A \Omega_B = \Omega(U_A, N_A)\Omega(U_B, N_B)$ is different for different $U_A$ and $U_B$.
Following the fundamental assumption, this means that some splits of $U_A$ and $U_B$ -- called macropartitions -- are more likely than others. If you put into contact two Einstein solids with different amounts of energy but the same number of atoms, for example, the most likely state is an equal split of energy between the two solids, because that distribution has more possible microstates than any uneven one. If one solid has fewer atoms than the other, the most likely macropartitions give the larger solid more of the energy, since it has more oscillators for the quantized bits of energy to occupy.
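A quick numerical sketch of this (with made-up toy sizes, not values from the text) tabulates the multiplicity and probability of every macropartition for two small solids sharing 10 units of energy:

```python
from math import comb

def multiplicity(q, N):
    return comb(q + 3 * N - 1, 3 * N - 1)

N_A, N_B, q_tot = 2, 4, 10      # made-up toy solids sharing 10 units of energy

total = multiplicity(q_tot, N_A + N_B)   # total microstates of the combined system
for q_A in range(q_tot + 1):
    omega_AB = multiplicity(q_A, N_A) * multiplicity(q_tot - q_A, N_B)
    print(f"q_A={q_A:2d}  q_B={q_tot - q_A:2d}  Omega_AB={omega_AB:8d}  P={omega_AB / total:.3f}")

# The most probable macropartition gives the larger solid (B) the larger share of the energy,
# roughly in proportion to its share of the oscillators.
```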
Thus the equilibrium point of two objects in contact with each other depends on the number of atoms in each, which is related to their masses. Temperature is equal when two objects are in equilibrium, so now the pieces begin to come together for us to see why the energy-temperature relation observed in the previous chapter is true.
$$dU = mc\, dT$$
This connection begins to answer the second question about the relationship between temperature and energy.
The Einstein model and the fundamental assumption of statistical mechanics also shed light on the irreversibility of heat transfer. It's theoretically possible in this model for two objects at equilibrium to spontaneously jump to wildly different temperatures, but it's incredibly improbable; and if a deviation from equilibrium does occur, the system will quickly return to near equilibrium, because the multiplicity of the near-equilibrium energy distributions is so much higher.
The multiplicity of a macrostate of a system (a macrostate being a state defined by "macro" properties like total internal energy $U$, number of atoms $N$, and potentially other variables like volume $V$; a macrostate can have many possible microstates) may not depend only on $U$ and $N$ the way it does in an Einstein solid. But the general idea that heat transfer tends toward the macrostate of highest multiplicity still holds, and from this we can define entropy and temperature to formalize our answers to questions 2 and 3.
Formalizing answers by defining entropy and temperature
The definition of entropy is straightforward; it simply makes multiplicity values, which can get extravagantly large, easier to work with:
$$S(\text{macrostate}) = k_b \ln(\Omega(\text{macrostate}))$$
where $k_b$ is Boltzmann's constant, $1.38 \times 10^{-23}$ J/K.
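Since $\Omega$ itself quickly overflows anything you'd want to compute with, a common trick (sketched below in Python; not from the text) is to work with $\ln \Omega$ directly via the log-gamma function:

```python
import math

k_B = 1.38e-23  # Boltzmann's constant, J/K

def ln_multiplicity(q, N):
    """ln Omega(U, N) for an Einstein solid, via log-gamma so the huge factorials never materialize."""
    return math.lgamma(q + 3 * N) - math.lgamma(q + 1) - math.lgamma(3 * N)

def entropy(q, N):
    """S = k_B ln(Omega) for the macrostate with q = U / epsilon energy units and N atoms."""
    return k_B * ln_multiplicity(q, N)

# Omega here is astronomically large, but S is a perfectly manageable number (in J/K).
print(entropy(10**5, 10**4))
```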
The formal version of our argument for the irreversibility of heat transfer, based on the multiplicities of the various macropartitions, is the second law of thermodynamics: "an isolated system's entropy never decreases."
Let's move on to a formal definition of temperature. The temperatures of two objects are the same when equilibrium between them is reached; this occurs when the multiplicity of the combined system, and therefore its entropy, is maximized with respect to the split of energy between $U_A$ and $U_B$. In other words:
$$T_A = T_B \quad \text{when} \quad \frac{dS_{AB}}{dU_A} = 0$$
But $\frac{dS_{AB}}{dU_A} = \frac{dS_A}{dU_A} + \frac{dS_B}{dU_A}$ and $\frac{dS_B}{dU_A} = \frac{dS_B}{dU_B} \frac{dU_B}{dU_A} = -\frac{dS_B}{dU_B}$ because $\frac{dU_B}{dU_A} = \frac{d}{dU_A} (U_\text{tot} - U_A) = -1$. So the above equation is equivalent to:
$$\frac{dS_{A}}{dU_A} = \frac{dS_{B}}{dU_B}$$
Thus the temperature of an object must be directly related to the value $\frac{dU}{dS}$ of that object. In other words, $T = f(\frac{dU}{dS})$. Historically, the specific function chosen has been the following:
$$\frac{1}{T} = \frac{dS}{dU}$$
This definition matches up with empirical measurements, for example the relationship $\frac{dU}{dT} = 3Nk_b$ for monatomic solids, though I won't walk through the math here.
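As a rough numerical sanity check (a sketch with made-up values of $\epsilon$, $N$, and $q$, not a calculation from the text), you can estimate $T$ for an Einstein solid by taking a finite difference of $S$ with respect to $U$, and then confirm that $U \approx 3Nk_bT$ when the energy is large:

```python
import math

k_B = 1.38e-23       # Boltzmann's constant, J/K
epsilon = 1.0e-21    # made-up energy unit, J

def ln_multiplicity(q, N):
    return math.lgamma(q + 3 * N) - math.lgamma(q + 1) - math.lgamma(3 * N)

def temperature(q, N):
    """1/T = dS/dU, estimated with a finite difference of one energy unit (dU = epsilon)."""
    dS = k_B * (ln_multiplicity(q + 1, N) - ln_multiplicity(q, N))
    return epsilon / dS

N, q = 100, 100_000                  # made-up solid, deep in the high-energy regime
T = temperature(q, N)
print(q * epsilon, 3 * N * k_B * T)  # U and 3*N*k_B*T should come out close
```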
Moore also provides a nice intuitive analogy comparing energy, entropy, and temperature to money, happiness, and generosity, in that order. An object with a higher temperature has a smaller $\frac{dS}{dU}$, losing relatively little entropy per unit of energy it gives up; analogously, a generous person is relatively happy to give away some money compared to a needy, low-temperature, high-$\frac{dS}{dU}$ one.
Just as you'd expect a given amount of money to matter less the richer someone is and more the poorer someone is, the temperature of most systems increases with the amount of energy they have; but there are systems where it actually decreases, or is even negative.
(Figure T3.5)
For the purposes of Moore's textbook, my class, and my blog posts, the scope is limited to "normal" systems where temperature increases with energy.
Notes for Pomona class with Prof. Whitaker, fall 2021