
University of Fribourg

Department of Chemistry

Statistical Thermodynamics

Condensed script

Juraj Fedor

WS 2011


This script is not a full, independently readable textbook. It is very condensed and summarizes only the main points discussed during the lectures.

The aim of the lecture is to learn how to determine quantities that we can measure macroscopically (e.g., heat capacities, reaction rate constants) from the microscopic properties of molecules. To do this, one uses the fact that reactions usually involve huge numbers of molecules, so statistical methods can be applied. A typical macroscopic sample contains 10^23 particles! Think how large this number is - the use of statistical methods is certainly justified.

We want to start from the microscopic properties of molecules. At the beginning of the 20th century it turned out that atoms and molecules behave very differently from what 'common sense' would predict. The effort to explain several experimental observations (discrete spectra, black-body radiation, heat capacities of crystals...) gave rise to quantum mechanics. At first encounter, quantum mechanics is unintuitive and requires a different way of thinking. For the scope of this lecture, you will just need to accept the fact that molecules cannot possess arbitrary energy - the energy levels are discrete.

In many places I will use the expression 'Imagine a single molecule...'. This is certainly possible to imagine, but chemists often object that real-world experiments happen with much larger amounts (10^23 molecules). However, towards the end of the 20th century, experiments on single atoms or molecules really became possible - either by trapping isolated ions in ion traps or by using the tip of a Scanning Tunneling Microscope (STM) to manipulate individual molecules on a surface. So an 'individual molecule' is no longer a theoretical construct - we can really see it and study it alone!


Contents

1 Quantum mechanical description of a single molecule 6

1.1 Quantum states and energy levels . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6

1.2 Translational motion (in a box) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6

1.3 Vibrational motion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7

1.4 Rotational motion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7

1.5 Electronic states . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7

1.6 Conclusion: Specifying the state of single diatomic molecule . . . . . . . . . . . . . . . . . . . . . 7

2 Quantum mechanical description of several molecules 8

2.1 Microstates and thermodynamic state (macrostate) . . . . . . . . . . . . . . . . . . . . . . . . . . 8

3 Probability and statistics 10

3.1 Feeling of probability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10

3.2 Probability distribution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10

3.3 Mean values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11

4 Microcanonical ensemble 12

4.1 Mathematical window I . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12

4.2 Microcanonical system . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12

4.3 Entropy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13

4.4 Entropy and internal energy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14

4.5 Ensemble . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14

4.6 Equilibrium . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14

5 Two systems in thermal contact. Temperature. 15

5.1 Mathematical window II . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15

5.2 What did Celsius do? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15

5.3 Two systems in thermal contact . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16

5.4 Entropy increase during heat exchange . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18

5.5 Fluctuations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19

5.6 Consistency with classical thermodynamics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19

6 Canonical ensemble and Boltzmann distribution 20

6.1 System with a given temperature . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20

6.2 Probability of microstate in canonical ensemble . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20

6.3 Probability to find a molecule in certain state / level . . . . . . . . . . . . . . . . . . . . . . . . . 21


6.4 Population of energy levels . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22

6.5 Example: vibrations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22

7 Partition function in detail 24

7.1 Power of partition function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24

7.2 Interpretation of partition function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25

7.3 Partition function of a single diatomic molecule . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25

7.4 Vibrational partition function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26

7.5 Rotational partition function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27

7.6 Translational partition function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27

7.7 Electronic partition function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28

7.8 Partition function for a system of N molecules . . . . . . . . . . . . . . . . . . . . . . . . . . 28

8 Internal energy and heat capacity 30

8.1 Mean energy per molecule, internal energy, molar heat capacity . . . . . . . . . . . . . . . . . . . 30

8.2 Electronic motion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30

8.3 Translational motion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31

8.4 Rotational motion (diatomic molecule) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32

8.5 Vibrational motion (diatomic molecule) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32

8.6 Total molar heat capacity of a diatomic ideal gas . . . . . . . . . . . . . . . . . . . . . . . . . . . 33

8.7 Equipartition theorem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33

8.8 Polyatomic molecules . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33

9 Pressure and entropy 35

9.1 Pressure . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35

9.2 Entropy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36

9.3 Translational entropy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36

9.4 Rotational entropy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37

9.5 Vibrational entropy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37

10 Chemical reactions 38

10.1 Equilibrium constant . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38

10.2 Equilibrium constant and Boltzmann distribution . . . . . . . . . . . . . . . . . . . . . . . . . . . 38

10.3 Example: gas phase dissociation of diatomic molecule . . . . . . . . . . . . . . . . . . . . . . . . 40

A Fundamental constants 41


Chapter 1

Quantum mechanical description of a

single molecule

One of the keywords in statistical thermodynamics is the energy that a molecule possesses. Imagine a single molecule. It can 'carry' energy via its translational motion ((1/2)mv^2), vibrations of its individual atoms, and rotation of the molecule. In addition, the electrons within the individual atoms also have a certain energy. The energy of the molecule is the sum of these four contributions. One would expect that this sum can be any number. Surprisingly, it turns out that this is not true - quantum mechanics says that only certain energies are possible.

1.1 Quantum states and energy levels

If we wanted to describe our single molecule in the language of classical mechanics, we would say where its atoms are and how they are moving. That means we would specify their positions and velocities. The language of quantum mechanics is very different - we can only say that the molecule is in a certain state which has a certain energy. So there are only discrete energies that the molecule can have - we say that the molecule occupies a certain energy level. What to imagine under the mysterious word state you will learn in different courses (hint: wave function).

In some cases, several states can have the same energy. In that case we say that the energy level is degenerate. We define the degeneracy g of an energy level as the number of different states with this energy.

Let us speak about diatomic molecules first. There are four ways in which such a molecule can store energy: it moves (translates), vibrates, rotates, or the energy is carried by the electrons (in the form of their kinetic or potential energy).

1.2 Translational motion (in a box)

If a particle of mass m is bound to move only in one dimension of length L, the energy levels that it can have are

E_n = (h^2 / 8mL^2) n^2  (1.1)

Here n is called the translational quantum number and takes the values n = 1, 2, 3, ...; h is Planck's constant, h = 6.626 × 10^-34 J s.

If the particle can move in three dimensions x, y, z, there are three quantum numbers describing its motion, n_x, n_y, n_z, and its energy levels are

E_{n_x n_y n_z} = (h^2 / 8mL^2)(n_x^2 + n_y^2 + n_z^2).  (1.2)

The translational energy levels are non-degenerate, g_{n_x n_y n_z} = 1.
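These formulas are easy to evaluate numerically. A minimal Python sketch (the mass and box length are illustrative choices, roughly a helium atom in a 1 cm box, not values from the script):

```python
H = 6.626e-34   # Planck's constant, J s
M = 6.6e-27     # mass of a helium atom, kg (illustrative)
L = 1e-2        # box length, m (a 1 cm vessel, illustrative)

def E_box_1d(n, m=M, box=L):
    """Eq. (1.1): energy of level n for a particle in a 1-D box."""
    return H**2 * n**2 / (8 * m * box**2)

def E_box_3d(nx, ny, nz, m=M, box=L):
    """Eq. (1.2): energy of level (nx, ny, nz) in a cubic box."""
    return H**2 * (nx**2 + ny**2 + nz**2) / (8 * m * box**2)

# Spacing between the two lowest 1-D levels:
gap = E_box_1d(2) - E_box_1d(1)
print(gap)   # ~2.5e-37 J, far below kT ~ 4e-21 J at room temperature
```

The level spacing comes out around 10^-37 J - vastly smaller than thermal energies, which is why translational motion looks continuous macroscopically.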

1.3 Vibrational motion

The potential energy curve of a diatomic molecule can be reasonably well approximated by that of a harmonic oscillator. The energy levels are

E_v = hcν(v + 1/2)  (1.3)

where h is Planck's constant, c is the speed of light, ν is the vibrational wavenumber and v is the vibrational quantum number with possible values v = 0, 1, 2, ... The states are non-degenerate, g_v = 1.

1.4 Rotational motion

Rotational levels are also quantized, with the possible energies

E_J = hcBJ(J + 1)  (1.4)

with B being the rotational constant (in wavenumber units) and J the rotational quantum number. Each rotational level has a degeneracy of g_J = 2J + 1.

1.5 Electronic states

Energy levels of electrons in atoms and molecules are quantized as well. Each molecule has different electronic energy levels, and one has to solve the Schrödinger equation to obtain them - in almost all cases numerically.

An analytical formula can be given only for the simplest case - the hydrogen atom. Its energy levels are

E_{n_el} = -13.6 eV / n_el^2,  (1.5)

with n_el being the principal electronic quantum number, n_el = 1, 2, 3, ... The degeneracy of the energy levels is g_{n_el} = n_el^2.

1.6 Conclusion: Specifying the state of single diatomic molecule

In order to fully specify the state of a diatomic molecule one needs to know three translational quantum numbers, the vibrational quantum number, the rotational quantum number, and the electronic state. One can designate such a state as (n_x, n_y, n_z, v, J, n_el). The total energy content is then

E = E_{n_x n_y n_z} + E_v + E_J + E_{n_el}  (1.6)

Important: The origin of the energy scale is not absolute (do you know why?). Thus we can shift the E = 0 point. Using vibrational motion as an example: according to Eq. (1.3), the vibrational level v = 0 has energy E_0 = (1/2)hcν. We can set the energy scale so that E_0 = 0, but then the energies of the other levels are E_v = hcνv.

Important: So far we have neglected spin. It will come into play later.
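To get a feeling for the relative sizes of the contributions in Eq. (1.6), here is a small Python sketch for a molecule in its electronic ground state (taken as the electronic energy origin). The default numbers are rough CO-like values for the mass, ν and B in a 1 cm box - illustrative inputs, not constants taken from the script:

```python
h = 6.626e-34     # Planck's constant, J s
c = 2.998e10      # speed of light, cm/s (so wavenumbers are in cm^-1)

def E_total(nx, ny, nz, v, J, m=4.65e-26, L=1e-2, nu=2143.0, B=1.93):
    """Total energy, Eq. (1.6), of a diatomic in its electronic ground
    state (electronic energy set to zero). Defaults are rough CO-like
    numbers: m ~ 28 u, nu ~ 2143 cm^-1, B ~ 1.93 cm^-1."""
    E_tr  = h**2 * (nx**2 + ny**2 + nz**2) / (8 * m * L**2)   # Eq. (1.2)
    E_vib = h * c * nu * (v + 0.5)                            # Eq. (1.3)
    E_rot = h * c * B * J * (J + 1)                           # Eq. (1.4)
    return E_tr + E_vib + E_rot

# Compare one vibrational quantum with the lowest rotational gap:
print(E_total(1, 1, 1, 1, 0) - E_total(1, 1, 1, 0, 0))   # one quantum hc*nu
print(E_total(1, 1, 1, 0, 1) - E_total(1, 1, 1, 0, 0))   # gap 2*hc*B
```

One vibrational quantum (~4 × 10^-20 J) is several hundred times a rotational quantum, which in turn dwarfs the translational level spacing.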


Chapter 2

Quantum mechanical description of

several molecules

Consider N molecules. Each molecule is described by a set of quantum numbers (n_x, n_y, n_z, v, J, n_el). We specify the quantum state of the whole system by specifying the quantum numbers of each of the molecules: we have to say that molecule 1 has quantum numbers (n_x, n_y, n_z, v, J, n_el)_1, molecule 2 has quantum numbers (n_x, n_y, n_z, v, J, n_el)_2, and so on up to molecule N.^1

The total energy of the system is then the sum of the energies of the individual molecules:

E = E_1 + E_2 + ... + E_N  (2.1)

2.1 Microstates and thermodynamic state (macrostate)

If we fully specify the quantum state of the system - i.e., we specify all quantum numbers of all molecules - we say that we have specified a microstate of the system. This is the complete information we can have about the N molecules. And it is a lot of information - usually we have 10^23 molecules and each of them has several quantum numbers. It helps to imagine the microstate as a ticket with all quantum numbers of all molecules written on it.

In many cases, we only know some macroscopic property of the system - for example the total energy. If we specify only the total energy, we say that we specify the macrostate of the system - something that is macroscopically measurable. The word state in this case has nothing to do with a quantum state - it is used for historical reasons and comes from 'thermodynamic state'.

Special case: Ideal monoatomic gas. The state of one atom is characterized by three translational quantum numbers, a microstate of N distinguishable atoms by 3N quantum numbers. The total energy (often called the internal energy of the gas) is

E = (h^2 / 8mL^2) Σ_{i=1}^{N} (n_{x,i}^2 + n_{y,i}^2 + n_{z,i}^2) = (h^2 / 8mV^{2/3}) Σ_{i=1}^{N} (n_{x,i}^2 + n_{y,i}^2 + n_{z,i}^2).  (2.2)

^1 There is one important simplification here: by doing this we assume that the particles are distinguishable. In reality, in quantum mechanical systems we cannot say which particle is which, i.e., the particles are indistinguishable. In that case, one has to specify, for each possible set of quantum numbers, how many particles have this set. However, the principles of statistical thermodynamics are the same for distinguishable and indistinguishable particles - one counts the states and does statistics with them. So, in examples, I will always use distinguishable particles - their statistics is easier.


There are two ways to change the internal energy: either by changing the volume V (work) or by changing the occupation of the quantum numbers (heat).

Special case: Collection of vibrating molecules. Consider N diatomic molecules in their electronic ground state that do not move (translationally) and do not rotate. One such molecule is characterized by its vibrational quantum number; a microstate of N molecules is specified by the N-tuple [v_1, v_2, ..., v_N]. Figure 2.1 demonstrates that one thermodynamic state with a given energy E can be realized by many microstates. We denote the number of microstates that realize a given thermodynamic state as Ω.

Figure 2.1: Collection of 4 vibrating molecules and their microstates that have total energy of 1hcν, 2hcν and 3hcν.
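This kind of counting can be reproduced by brute force. A short Python sketch enumerates all N-tuples [v_1, ..., v_N] (with the energy zero shifted so that E_v = hcνv, as in section 1.6) and compares the count with the combinatorial formula Ω = C(q + N − 1, N − 1) for q quanta shared by N distinguishable oscillators:

```python
from itertools import product
from math import comb

def count_microstates(N, q):
    """Brute-force count of N-tuples (v1, ..., vN) of vibrational quantum
    numbers whose quanta sum to q (so the total energy is q*hc*nu)."""
    return sum(1 for vs in product(range(q + 1), repeat=N) if sum(vs) == q)

for q in (1, 2, 3):
    print(q, count_microstates(4, q), comb(q + 3, 3))  # brute force vs. formula
```

For 4 molecules and total energies of 1hcν, 2hcν and 3hcν this gives Ω = 4, 10 and 20 microstates - already a steep growth with energy.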


Chapter 3

Probability and statistics

3.1 Feeling of probability

Everyone has a feeling for what probability is. Can you try to explain it?

Let's do some 'experiments': let us throw dice. For a single six-sided die, there are six possible outcomes; let's call them A_i: A_1 = 1, A_2 = 2, up to A_6 = 6. Each outcome has a given probability p_i. If the die is fair then p_1 = p_2 = ... = p_6 = 1/6.

Why? If we throw the die n times, the average number of times the value A_i falls will be n_i = p_i n. We can take this as a definition of probability; n just has to be very large.

Important: it does not matter whether I throw one die n times or throw n dice at once - statistically, the results will be the same.

Note: If we sum the probabilities of all possible outcomes we get

Σ_{i=1}^{6} p_i = 1
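This frequency definition of probability is easy to test numerically - a quick Python simulation of n throws of a fair die (the seed is an arbitrary choice, just for reproducibility):

```python
import random

random.seed(0)                         # arbitrary seed, for reproducibility
n = 100_000
counts = [0] * 6
for _ in range(n):
    counts[random.randrange(6)] += 1   # one fair-die throw

# The relative frequencies n_i/n approach p_i = 1/6 as n grows:
freqs = [c / n for c in counts]
print(freqs)
```

With 100 000 throws each frequency is already within about a percent of 1/6.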

3.2 Probability distribution

If we have a situation with N possible outcomes A_1, A_2, ..., A_N, the set of probabilities {p_i}_{i=1}^{N} is called a probability distribution. The probability distribution always has to be normalized:

Σ_{i=1}^{N} p_i = 1  (3.1)

Another example - die with only two possible outcomes. Two sides of a die have the value A_1 = 10, the other four sides have the value A_2 = 1000. The corresponding probabilities are p_1 = 2/6 and p_2 = 4/6. The probability distribution is shown in figure 3.1.

Another example - Boltzmann probability distribution. For any quantum system we can ask what the probability is that it will be in a given quantum state.

Real-world example: electronic states of the Cr3+ ion in an Al2O3 crystal - the ion has four accessible electronic states. The energy level diagram is shown in figure 3.1 - anytime we 'look at' Cr3+ we can find it in one of these four electronic states. We will see later that the probability to find it in the state with energy E_i follows the Boltzmann distribution (assuming non-degenerate levels!):

p_i ∝ e^{-E_i/kT}  (3.2)

(Note the proportionality, not equality. Why can there not be an equality sign?) Figure 3.1 demonstrates this probability distribution graphically.
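Once the proportionality constant in Eq. (3.2) is fixed by normalization, the distribution is a one-liner. A Python sketch with four made-up energy levels (illustrative numbers, not the actual Cr3+ level energies):

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K

def boltzmann(energies, T):
    """Normalized Boltzmann probabilities p_i = exp(-E_i/kT) / Z.
    The normalization Z is exactly why Eq. (3.2) is only a proportionality."""
    w = [math.exp(-E / (k * T)) for E in energies]
    Z = sum(w)
    return [x / Z for x in w]

# Four illustrative level energies in J (NOT the real Cr3+ values):
levels = [0.0, 3e-20, 3.6e-20, 6e-20]
p = boltzmann(levels, 300.0)
print(p, sum(p))   # probabilities decrease with energy and sum to 1
```

Dividing by the sum over all states enforces the normalization (3.1), which a bare exponential cannot satisfy on its own.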


Note: in order to follow this probability distribution, the ions have to be in thermal equilibrium with the lattice. Cr3+:Al2O3 was not chosen randomly: using this crystal, the first laser was constructed 51 years ago (in 1960). During laser operation, the crystal is very far from thermal equilibrium and the probabilities of the energy levels are very different (so-called population inversion).

Figure 3.1: Top row: Probability distributions for a normal six-sided die (left) and a die with the two values described in the text (right). Bottom row: Schematic energy level diagram (left) and probability distribution for the Cr3+ ion in an Al2O3 lattice at some temperature T (right). Ignore the blue vertical arrows in the energy level diagram; they indicate the radiative transitions during ruby laser operation and do not play a role in thermal equilibrium.

3.3 Mean values

If each outcome A_i has some property X_i assigned, then the mean value of X is

X̄ = Σ_{i=1}^{N} X_i p_i  (3.3)

Example: What is the mean value of A after many dice throws?

Ā = Σ_{i=1}^{6} A_i p_i = (1/6)(1 + 2 + 3 + 4 + 5 + 6) = 21/6  (3.4)

Example 2: Let X = A^2. What is the mean value of X after many dice throws?

X̄ = Σ_{i=1}^{6} X_i p_i = (1/6)(1 + 4 + 9 + 16 + 25 + 36) = 91/6  (3.5)
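Both mean values follow directly from Eq. (3.3); in Python:

```python
vals  = [1, 2, 3, 4, 5, 6]   # faces of a fair die
probs = [1 / 6] * 6          # fair-die probabilities

mean_A  = sum(a * p for a, p in zip(vals, probs))       # Eq. (3.4): 21/6
mean_A2 = sum(a**2 * p for a, p in zip(vals, probs))    # Eq. (3.5): 91/6
print(mean_A, mean_A2)   # 3.5 and about 15.17
```

Note that the mean of A^2 (91/6) is not the square of the mean of A ((21/6)^2) - averaging and squaring do not commute.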


Chapter 4

Microcanonical ensemble

4.1 Mathematical window I

• In statistical thermodynamics one normally encounters three types of numbers:

– Normal. Numbers like 3, 16, 18953, 10^6.

– Large. Numbers like 10^23. Usually connected with the number of molecules in a macroscopic system.

– Huge. Numbers like 10^(10^23). Usually connected with the number of microstates in a macroscopic system.

• An important property of the logarithm:

ln(10^a) = a ln(10)  (4.1)

That means the logarithm makes normal numbers from large ones and large numbers from huge ones. This can be handy from time to time.

• For large and huge numbers, Stirling's approximation of N!:

ln N! ≈ N ln N − N  (4.2)

can be used to estimate the factorials of large numbers:

N! ≈ e^{N ln N − N}  (4.3)

For example, let's estimate 10^23!:

10^23! ≈ e^{10^23 ln(10^23) − 10^23} = e^{10^23 (ln(10^23) − 1)} = e^{10^23 (23 ln 10 − 1)} ≈ e^{10^23 × 51.95}.  (4.4)

We see immediately that in our classification of numbers, 10^23! is huge.
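How good is Stirling's approximation (4.2)? A quick Python check against the exact ln N!, computed via the log-gamma function (ln N! = lgamma(N + 1)):

```python
import math

def stirling_ln_factorial(N):
    """Stirling's approximation ln N! ~ N ln N - N, Eq. (4.2)."""
    return N * math.log(N) - N

for N in (10, 100, 1000):
    exact  = math.lgamma(N + 1)            # exact ln N!
    approx = stirling_ln_factorial(N)
    print(N, exact, approx, (exact - approx) / exact)   # relative error
```

The relative error drops from about 14% at N = 10 to below 0.1% at N = 1000; for N ≈ 10^23 it is utterly negligible.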

4.2 Microcanonical system

Consider a gas of N molecules enclosed in a vessel with volume V. Let the vessel have thermally isolating walls - that means no heat exchange with the outside is possible (∆Q = 0). As seen from equation (2.2), if the volume of the vessel is constant (∆V = 0), the total internal energy will be constant. We call a system with constant internal energy a microcanonical system.

There are many possible microstates in which the system can be - all of the Ω microstates that have the resulting energy E. Let us number the microstates with the index i = 1...Ω. Now, as time flows, these microstates are changing very fast. A classical analogy helps: if we imagine for a moment the gas molecules as flying balls, we give complete information about the state of the gas when we specify the position and velocity of each single molecule.^1 Now, any collision of two molecules will change their velocities. This happens very often.^2 Similarly, in the quantum picture, the quantum numbers (i.e., the microstate) will change extremely fast. If we imagine the microstate as a ticket with all the quantum numbers written on it, then at each instant the gas is characterized by one such ticket, but these tickets are changing very fast. Of course, only those microstates will be present which have the total energy E.^3 That means, every time we look at the system, it will be in one of the Ω allowed microstates.

The natural question is: what is the probability p_i that the system will be in a given microstate i? The answer is one of the basic postulates of statistical thermodynamics:

In a microcanonical system, all microstates have equal a priori probabilities.

In other words: p_i = const. The rule that the probability distribution has to be normalized tells us the value of this constant:

1 = Σ_{i=1}^{Ω} p_i = p_1 + p_2 + ... + p_Ω = Ω × const  (4.5)

That means p_i = 1/Ω.

4.3 Entropy

It is time to look at the mysterious quantity Ω in more detail - the number of possible microstates that have a total energy of E. Of course, the exact way to calculate it depends on the system and on the way we specify the quantum numbers, i.e., the microstate (do we have to consider only translations, or also rotations, vibrations, etc.?). The very simple example of four vibrating molecules from figure 2.1 hints that Ω is a very steep function of the internal energy - with a higher internal energy there are more possible combinations of how this energy can be divided among the individual molecules. Also, one can expect that it will be a very steep function of the number of molecules N - the larger the number of molecules, the more possible combinations (i.e., more microstates realizing the given E, i.e., higher Ω). With the tools of combinatorics, one can actually show that Ω is proportional to the factorial of N:

Ω ∝ N!  (4.6)

The constant of proportionality will be some normal number. So for a macroscopic number of molecules, N = 10^23, the number of possible microstates will be of the order of Ω ≈ 10^23!. We have shown in the mathematical window that this is a huge number, very impractical to deal with.

For this reason, one always operates with the logarithm ln Ω, which is a large number. In order to make a normal number from it (and for historical reasons), this logarithm is usually multiplied by the Boltzmann constant k = 1.38 × 10^-23 J/K. We define a new quantity, called the entropy of the system, as

S = k ln Ω.  (4.7)
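Equation (4.7) is practical precisely because one never needs the huge number Ω itself, only ln Ω. A Python sketch for the example above, taking Ω ≈ N! and dropping the 'normal' proportionality constant (an assumption for illustration):

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K

def entropy(ln_omega):
    """S = k ln Omega, Eq. (4.7), taking ln Omega directly so that
    the huge number Omega itself never has to be represented."""
    return k * ln_omega

# Omega ~ N! for N = 1e23 molecules (Eq. 4.6, prefactor ignored);
# use Stirling, Eq. (4.2), for ln N!:
N = 1e23
ln_omega = N * math.log(N) - N
S = entropy(ln_omega)
print(S)   # about 72 J/K - a normal number, as promised
```

A huge Ω thus collapses to a large ln Ω (~5 × 10^24) and finally, via k, to a normal entropy of order 10^2 J/K.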

^1 There are of course no discrete microstates in classical mechanics. The analogy is hidden in the fact that we are providing complete information about the system - either by specifying positions and velocities (CM) or by specifying the quantum numbers of each molecule = microstate (QM).
^2 For a gas at atmospheric pressure and room temperature, one molecule flies without a collision only for about one nanosecond, then it will most probably hit some other molecule.
^3 Classical analogy again: if two molecules collide, their velocities will change (i.e., the analogy of the microstate will change), but the total energy remains conserved. The energy conservation law holds.


There are a few ways one can interpret the entropy:

• A larger entropy means that there are more possible microstates in which the system can be. That means that with larger entropy we have less detailed information about the system - we know only the macrostate (the total internal energy). Entropy is therefore often interpreted as a measure of our detailed information about the system (larger entropy = less detailed information).

• Also, a more organized system always has lower entropy than a less organized system. Imagine the same amount of atoms in the crystal and in the liquid phase. There are certainly more possible configurations (and microstates) that will result in the same energy in the liquid phase. Thus, entropy can be viewed as a measure of the system's disorder.

These interpretations are useful in information theory, where entropy is widely used. For us the most useful 'interpretation' is Eq. (4.7) - entropy is k times the logarithm of the number of microstates that result in a given internal energy.

4.4 Entropy and internal energy

Entropy depends on the internal energy E of the system. As hinted in figure 2.1, Ω is a very steep function of the internal energy. Important: Ω(E) is always an increasing function of E - if there is more available energy, there are more ways to distribute it among the molecules → higher Ω. Thus the entropy S(E) is also an increasing function of E. Consequence:

∂S/∂E > 0.  (4.8)

4.5 Ensemble

We saw that at each moment of time, the system will be in one of the Ω possible microstates, which are changing very quickly. If we imagine each microstate as a card with all quantum numbers on it, the heap of cards that have the resulting total energy E has a special name: ensemble. We do not need to imagine them changing every femtosecond or so - it is enough to imagine these cards all at once and do statistics with them. This is the second basic postulate of statistical thermodynamics:

The observed property of a system over a period of time is the same as the average over all microstates (taken at an instant of time).

We will use this postulate a lot when discussing the canonical ensemble (chapter 6).

4.6 Equilibrium

Very important - both basic postulates from this chapter are valid only for systems that are in equilibrium. We

will talk a lot about equilibrium and non-equilibrium later.


Chapter 5

Two systems in thermal contact.

Temperature.

5.1 Mathematical window II

• Derivative and slope of a function. Let f = f(x).

If df(x)/dx > 0 then f(x) is increasing.
If df(x)/dx < 0 then f(x) is decreasing.
If df(x)/dx = 0 then f(x) has a maximum or a minimum at this point.

• Partial derivative. Let f = f(x, y) be a function of two variables x and y. Then ∂f(x, y)/∂x means that we are taking the derivative only along the x coordinate, with y held constant. To stress this, I will sometimes use the notation ∂f/∂x |_{y=const} or (∂f/∂x)_y.^1

• Product of an increasing and a decreasing function. Let f(x) be a steeply increasing function of x and g(x) a steeply decreasing function of x. Then their product f(x)g(x) will be a peaked function - unless, of course, f(x) and g(x) are in some special relationship which cancels out the effect. Can you give examples?

5.2 What did Celsius do?

So far, the basis of our discussion was the internal energy of the system. We defined the entropy from the number of ways this energy can be realized. However, we have not mentioned temperature yet. Of course, everybody has a feeling for what temperature is. Is there a way to define it?

We know one thanks to Anders Celsius - temperature is what a thermometer shows. In more detail: we stick the thermometer into a sample, wait until they exchange heat and their temperatures equalize. Then the thermometer shows the temperature of the sample.

That is exactly the way to arrive at the notion of temperature: let us bring two systems together and find which physical quantity they equalize through thermal contact. In other words, which condition has to be fulfilled so that there is thermal equilibrium. Thermal equilibrium means that there is no net energy flow between the two systems.

^1 Such notation is rarely used anywhere other than in thermodynamics, but there it is used very often.


5.3 Two systems in thermal contact

Let us consider two isolated systems A and A’. Let they have respective number of particles N and N ′ and

energies E0, E′0. Corresponding numbers of microstates are Ω(E0) and Ω′(E′0) and the entropies are S0 and

S′0. Let us now bring them to thermal contact - let they be separated by one wall that particles cannot pass

through (N and N ′ do not change). But let the wall be heat conducting so the energy can flow between the

two systems through this wall.

The systems exchange certain amount of energy. We define the energy exchanged this way - the energy

transfered from one system to another via their thermal contact - as heat and denote it generally as ∆Q.

After this exchange, the system A will have internal energy E = E0 + ∆Q, system A’ the internal energy

E′ = E0 + ∆Q′. The total energy of the two systems as whole Etot = E + E′ has to be conserved. Since

Etot = E + E′ = E0 + E′0 it immediately follows that

∆Q = −∆Q′. (5.1)

This is logical: the heat that one system gains has to be the same as the heat that the other system loses.

The question is: what will be the amount of transferred energy? Or, equivalently, what will be the final internal energy E of system A? We will call this the energy partitioning. Do not forget: if we know that the internal energy of system A is E, then the energy of system A' is E' = E_tot − E. To find the optimal E, we need to count the microstates which can realize each partitioning of energy between these two systems.

What does a microstate of the total system (the N + N' molecules considered together) look like? We need to specify all quantum numbers for the N molecules of A and all quantum numbers for the N' molecules of A'. Each card from the heap of Ω(E) cards characterizing A can be combined with each card from the heap of Ω'(E') cards characterizing A'. Thus, the number of microstates which realize the situation in which A has internal energy E and A' has internal energy E' is

Ω_tot(E, E') = Ω(E)Ω'(E').  (5.2)

Or, in other words,

Ω_tot(E, E_tot − E) = Ω(E)Ω'(E_tot − E).  (5.3)

It is useful to imagine this as a function of E. Ω(E) is a steeply increasing function of E. Similarly, Ω'(E_tot − E) is a steeply decreasing function of E. The product Ω(E)Ω'(E_tot − E) will be a peaked function - it will be very small everywhere except for a narrow peak, where it has its maximum. See figure 5.2.

Figure 5.1: Schematic illustration of two systems in thermal contact.

What is this graph telling us? The number of microstates, that can realize the given energy partitioning

between A and A’ as a function of the internal energy of A. Now, all microstates have equal probability. So,

the most probable partitioning of the energy will be that in which A will have an energy at which this number

of microstates is maximal. If A will have this energy the two systems will be in thermal equilibrium.

That means we have to look at which E the function Ω(E)Ω′(Etot − E) has its maximum. For practical reasons, we will not be looking for the maximum of Ωtot(E,Etot − E) but rather for the maximum of its logarithm (which has a maximum at the same energy E). Looking for the maximum is simple: take the derivative and set it equal to zero:

0 = ∂/∂E ln [Ωtot(E,Etot − E)] = ∂/∂E ln [Ω(E)Ω′(Etot − E)] = ∂/∂E [ln Ω(E) + ln Ω′(Etot − E)] (5.4)

After a couple of mathematical operations one can see that

∂ ln Ω(E)/∂E − ∂ ln Ω′(E′)/∂E′ = 0. (5.5)

After multiplying by the Boltzmann constant we immediately see that the condition for the most probable energy partitioning is

∂S(E)/∂E = ∂S′(E′)/∂E′. (5.6)

Figure 5.2: Schematic illustration of the product of a steeply increasing and a steeply decreasing function. The lowest panel shows the number of possible microstates realizing the situation when A has internal energy E. Note: it should be a simple peak function; any deviations from the simple peak shape are due to graphical incompetence of the author.


If this condition is satisfied, the net energy flow from one system to the other will be zero - the systems will be in thermal equilibrium.

But this is exactly the quantity we were looking for - some property of A that is equal to the same property of A’ in thermal equilibrium². This allows us to define temperature as

1/T = ∂S(E)/∂E (5.7)

5.4 Entropy increase during heat exchange

Just after we brought the two systems into contact (when the energy partitioning was E0 in A and E′0 in A’, before there was any time for heat exchange), the total number of possible microstates realizing such a macrostate was

Ωtot(E0, E′0) = Ω(E0)Ω′(E′0). (5.8)

Taking the logarithm and multiplying by the Boltzmann constant we see that the entropy is additive:

Stot,0 = S0 + S′0 (5.9)

After the heat exchange sets in, what is the entropy Stot of the composed system? We found that when the temperatures of the two systems are equalized, Ωtot(E,E′) has its maximum - so it will be higher than (or at worst equal to) Ωtot(E0, E′0). Correspondingly

Stot,0 ≤ Stot (5.10)

and the equality sign is valid only if there was no heat transfer, i.e., when the initial partitioning of energy (E0, E′0) was already optimal - that is, when the systems initially had the same temperature. Eq. 5.10 leads us to an important conclusion:

Heat exchange between two systems of different temperature always leads to an increase of the entropy of the combined system.

²Remember our feeling for what temperature is: some characteristic of A which will equalize with the corresponding characteristic of A’ when they are brought into thermal contact.


5.5 Fluctuations

The lowest panel in figure 5.2 shows the probability that the system A will have energy E (or, that the energy partitioning between the two systems will be (E,E′ = Etot − E)). The energy at which the peak has its maximum is the mean energy of the system A at thermal equilibrium. However, the peak has a certain width, so the neighbouring energies also have certain probabilities to occur. Even in equilibrium, as time flows the exact internal energy of the system A will fluctuate around its mean value. The energy-width of the probability peak in figure 5.2 reflects the magnitude of these fluctuations.

5.6 Consistency with classical thermodynamics

In classical thermodynamics, it is assumed that everyone knows what temperature is, and the entropy is defined as a state function in the following way: if a certain small amount of heat ∆Q is added to a system with temperature T, the entropy changes as

∆S = ∆Q/T. (5.11)

This is fully consistent with our statistical relation between temperature and entropy. If the heat ∆Q is added at constant volume, the internal energy will increase by the same amount, ∆E = ∆Q. If the changes are small, we can approximate the partial derivative in our definition of temperature by finite differences:

1/T = ∂S/∂E ≈ ∆S/∆E = ∆S/∆Q (5.12)

which is the same as equation (5.11).


Chapter 6

Canonical ensemble and Boltzmann distribution

6.1 System with a given temperature

So far, we dealt with systems with a well defined internal energy. However, in chemistry one usually encounters situations when not the internal energy but rather the temperature of the sample is given (controlled). In that case, the internal energy of the system under consideration can fluctuate, as shown at the end of the previous chapter.

We have seen that if we bring two systems together, the temperature of the colder one will rise and the temperature of the warmer one will drop. The trick to simulate a constant temperature: one system (A’) has to be much, much bigger than the other system, so that its change of temperature is negligible.

Let us do the trick: we bring our system (A) into thermal contact with a system A’ that is much bigger, both in number of particles (N′ >> N) and internal energy (E′ >> E). We will call such a system A’ a heat reservoir. The two systems can exchange energy, but otherwise they are isolated from the environment. That means the total energy Etot = E + E′ is constant and the two systems together represent a microcanonical ensemble (with all microstates of equal probability).

Let us look at the microstates of system A: one microstate is a card with all quantum numbers of the N molecules written on it. Each such microstate has an energy E which is the sum of the energies of the N molecules. The energies of individual microstates do not have to be equal - there are many different ways to partition Etot between our system A and the heat reservoir A’.

We can still number the microstates of A with an index i; then the energy of the i-th microstate will be Ei (and the corresponding energy of the heat reservoir will be E′ = Etot − Ei). At any moment of time, the system A will be in one of these microstates. Again, we don’t need to imagine these microstates changing very fast with time; we rather imagine a mental collection of the microstates next to each other. If the system A is in contact with the heat reservoir (that means it has a constant temperature), such a collection of microstates is called a canonical ensemble.

6.2 Probability of microstate in canonical ensemble

However, since the microstates of A do not all have the same energy, the probability of their occurrence is not the same for all of them! To find the probability pi that the system A will be in microstate i we use the fact that


the system A together with the heat reservoir represents a microcanonical ensemble. If the system A is in the i-th microstate, the energy of the heat reservoir is Etot − Ei, and the possible number of the reservoir’s microstates is Ω′(Etot − Ei). Since all the combined microstates (system A + heat reservoir A’) have equal probability, the relative importance of the i-th microstate of A is directly proportional to the number of related microstates of A’. In other words, the probability pi is proportional to Ω′(Etot − Ei):

pi ∝ Ω′(Etot − Ei) = e^(S′(Etot−Ei)/k). (6.1)

Now we use the fact that Ei << Etot and that the entropy is a reasonably slowly growing function of the energy¹, so we can do a Taylor expansion of S′:

S′(Etot − Ei) = S′(Etot) − (∂S′/∂E′) Ei = S′(Etot) − Ei/T′. (6.2)

Since in thermal equilibrium T = T′, we get for the probability of the i-th microstate

pi ∝ e^((1/k)[S′(Etot) − Ei/T]) = e^(S′(Etot)/k) e^(−Ei/kT). (6.3)

Since the first term is some constant, we see that the probability to find the i-th microstate, which has an internal energy Ei, is

pi = C e^(−Ei/kT). (6.4)

The constant C can be obtained from the normalization condition

1 = Σ_j pj = Σ_j C e^(−Ej/kT) = C Σ_j e^(−Ej/kT). (6.5)

We have reached our final result: if we have a system with a given temperature T, the probability to find it in a microstate i with an energy Ei is

pi = e^(−Ei/kT) / Σ_j e^(−Ej/kT). (6.6)

The sum in the denominator goes through all possible microstates of the system. This probability distribution is called the Boltzmann probability distribution. We will see later that the sum in the denominator is an important characterization of the system and has a special name: partition function. Its standard notation in physics textbooks is Z (from German ’Zustandssumme’), in chemistry textbooks Q. We will stick to the latter one; thus the partition function is

Q = Σ_j e^(−Ej/kT). (6.7)

Do not confuse it with the heat!

6.3 Probability to find a molecule in a certain state / level

We didn’t make any assumption about the size of the system A. It can even contain one single molecule. In this case, the microstates of the system A are simply the quantum states of this one molecule. Then the probability to find the molecule in the i-th state is

pi = e^(−Ei/kT) / Σ_j e^(−Ej/kT) (6.8)

¹unlike Ω, which is a very steeply growing function of energy; that’s why we expand S and not Ω.


where Ei is the energy of this i-th state and the sum in the denominator goes through all the quantum states of the molecule.

Usually, we are not asking for the probability to find a molecule in a given quantum state but rather in a given energy level Ei (remember, each energy level can have some degeneracy gi). This probability is

p(Ei) = gi e^(−Ei/kT) / Σ_j gj e^(−Ej/kT). (6.9)

Here the sum in the denominator goes through all energy levels of the molecule.

Again, the denominator is called a partition function, but now it is the partition function of one molecule. To distinguish it from the partition function of the whole system, we use a small letter q:

q = Σ_j gj e^(−Ej/kT). (6.10)

6.4 Population of energy levels

If we have a system of N molecules at temperature T, some number of them will occupy a given energy level Ei. Let us denote this number n(Ei) = ni. It is given as

n(Ei) = p(Ei)N. (6.11)

Thus the fraction of molecules that occupy the energy level Ei is

n(Ei)/N = ni/N = gi e^(−Ei/kT) / Σ_j gj e^(−Ej/kT). (6.12)

This is Boltzmann’s formula (and it is perhaps the single most important result of this lecture).

To avoid evaluating the partition function (the sum through all energy levels), one is often interested only in the relative population of an energy level with respect to the ground state:

ni/n0 = (gi/g0) e^(−(Ei−E0)/kT). (6.13)
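A minimal numerical sketch of Eq. (6.13); the level spacing and the degeneracies below are made-up illustrative values, not data from the script:

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

def relative_population(g_i, g_0, delta_E, T):
    """n_i/n_0 = (g_i/g_0) * exp(-(E_i - E_0)/kT), Eq. (6.13)."""
    return (g_i / g_0) * math.exp(-delta_E / (k * T))

# Hypothetical level lying exactly kT above a non-degenerate ground state,
# with degeneracy 3 (illustrative numbers only):
T = 300.0
ratio = relative_population(g_i=3, g_0=1, delta_E=k * T, T=T)
print(ratio)  # 3/e ~ 1.10: the upper level is *more* populated than the
              # ground state, purely because of its threefold degeneracy
```

Note how degeneracy can outweigh the Boltzmann factor: a level above the ground state can still hold more molecules.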

6.5 Example: vibrations

Let us consider a gas of N diatomic molecules and let us for a moment neglect all motion other than their vibrations. The energy levels of a single molecule are

Ev = hcνv (6.14)

and the levels are non-degenerate, gv = 1. At temperature T, a certain number of molecules will have vibrational energy Ev. This number nv can be calculated from Boltzmann’s formula

nv/N = e^(−Ev/kT) / Σ_j e^(−Ej/kT) = e^(−Ev/kT) / qvib. (6.15)

The vibrational partition function of one molecule can be easily calculated:

qvib = Σ_{j=0}^∞ e^(−Ej/kT) = Σ_{j=0}^∞ e^(−hcνj/kT) = Σ_{j=0}^∞ (e^(−hcν/kT))^j = 1/(1 − e^(−hcν/kT)). (6.16)


Thus

nv/N = (1 − e^(−hcν/kT)) e^(−hcνv/kT). (6.17)
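Eq. (6.17) can be checked numerically; writing x = hcν/kT, the populations depend only on this ratio. The value hcν/k ≈ 3374 K for N2 is taken from Table 8.1:

```python
import math

def vib_population(v, x):
    """Fraction of molecules in vibrational level v, Eq. (6.17),
    with x = hc*nu/kT."""
    return (1.0 - math.exp(-x)) * math.exp(-x * v)

# hc*nu/k ~ 3374 K for N2 (Table 8.1), so at room temperature T = 300 K:
x = 3374.0 / 300.0
fractions = [vib_population(v, x) for v in range(50)]
print(fractions[0])    # ~0.99999: virtually all molecules sit in v = 0
print(sum(fractions))  # -> 1.0 (the geometric series of Eq. 6.16)
```

The prefactor (1 − e^(−x)) is exactly 1/qvib, which is why the populations sum to one.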


Chapter 7

Partition function in detail

7.1 Power of partition function

The partition function is a tremendously important characteristic of the system. Let us demonstrate it on a few examples for a system of N identical molecules¹ at temperature T.

It is handy to use the shortcut notation β = 1/kT. Then the probability to find a molecule in an energy level with energy Ei is

pi = gi e^(−βEi) / Σ_j gj e^(−βEj) = gi e^(−βEi) / q, (7.1)

where the sum goes through all energy levels of one molecule and q is the partition function of this single molecule.

If we consider any physical quantity A that can be assigned to each energy level, and that in the i-th energy level has a value Ai, the mean value of this quantity is (Eq. 3.3)

A = Σ_i Ai pi. (7.2)

Let us take as this quantity the energy of the level itself, Ei. We can then calculate the mean energy of one molecule at some temperature T:

E = Σ_i Ei pi = Σ_i Ei gi e^(−βEi)/q = (1/q) Σ_i gi Ei e^(−βEi) = −(1/q) Σ_i gi ∂e^(−βEi)/∂β = −(1/q) ∂q/∂β = −∂ ln q/∂β (7.3)

This is very useful - just by knowing the partition function we can calculate the average energy content of one molecule!

Even more: if the average energy of one molecule is E, the average internal energy of the whole system of N molecules will be NE:

U = −N ∂ ln q/∂β. (7.4)

(In previous chapters we designated the internal energy of the system of molecules as E; this symbol was now stolen by one molecule, so I used the symbol U, which is more common in classical thermodynamics.) Thus, knowing the partition function of one molecule we can calculate the internal energy of the whole macroscopic system at a given temperature!
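Eq. (7.3) is easy to verify numerically. The sketch below uses a hypothetical three-level molecule (the energies and degeneracies are made up for illustration) and compares the direct average Σ_i Ei pi with a finite-difference estimate of −∂ ln q/∂β:

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

# Hypothetical three-level molecule: (energy in J, degeneracy).
levels = [(0.0, 1), (2.0e-21, 2), (5.0e-21, 1)]

def q(beta):
    """Molecular partition function q = sum_i g_i exp(-beta E_i)."""
    return sum(g * math.exp(-beta * E) for E, g in levels)

def mean_E_direct(beta):
    """Mean energy as the weighted average sum_i E_i p_i."""
    return sum(E * g * math.exp(-beta * E) for E, g in levels) / q(beta)

def mean_E_from_lnq(beta, h=1e-6):
    """Mean energy as -d(ln q)/d(beta), via a central finite difference."""
    b1, b2 = beta * (1.0 - h), beta * (1.0 + h)
    return -(math.log(q(b2)) - math.log(q(b1))) / (b2 - b1)

beta = 1.0 / (k * 300.0)
print(mean_E_direct(beta))    # the two numbers agree,
print(mean_E_from_lnq(beta))  # confirming Eq. (7.3)
```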

¹In this whole chapter we assume that the molecules interact only weakly with each other - e.g., that the presence of one molecule does not change the energy levels of another molecule.


An example of a more complicated thermodynamic quantity that can be obtained from the partition function is the heat capacity at constant volume:

cV = ∆Q/∆T = (∂U/∂T)_{V=const}. (7.5)

It turns out that all the thermodynamic properties of the system are hidden in the partition function.

7.2 Interpretation of partition function

Why is that so? What is so special about q that it encodes information about all thermodynamic properties of

the system?

Let us look at its physical meaning. The simplest illustration is on vibrational levels in the harmonic approximation - evenly spaced energy levels with Ev = hcνv and degeneracy gv = 1. For each energy level, its population at temperature T is proportional to e^(−Ev/kT). The partition function is merely the sum of such terms. The higher the temperature, the larger this sum, and thus q, will be. Thus, the value of the partition function reflects how many energy levels are populated at temperature T.²

Some quantitative conclusions can be made using this interpretation. First, at high temperatures more energy levels are populated than at low temperatures; thus the partition function increases with the temperature of the molecule (figure 7.1). Similarly, if we consider two molecules at the same temperature, the one whose energy levels are more closely spaced (higher density of states) will have the higher partition function (figure 7.2).

In general, this is important information that characterizes the system - how many levels are populated, and how this depends on the temperature. No wonder that the thermodynamic behavior can be deduced from it.

7.3 Partition function of a single diatomic molecule

A diatomic molecule can store energy in vibrations, rotations, translational motion, or in its electron cloud. Its energy is then

E = Ev + EJ + Etrans + Eel. (7.6)

Each type of motion contributes to the partition function. Luckily, one can decompose it into individual components. How? Let us first consider only vibrations and rotations simultaneously:

qvib,rot = Σ_{v=0}^∞ (Σ_{J=0}^∞ gv gJ e^(−β(Ev+EJ))) = Σ_{v=0}^∞ gv e^(−βEv) Σ_{J=0}^∞ gJ e^(−βEJ) = qvib × qrot. (7.7)

The partition function is the product of the two partition functions obtained when the types of motion are considered independently. This is kind of logical, because each energy level of one mode of motion (vibration) can be combined with each energy level of the second mode of motion (rotation).

We can generalize this to all types of motion, and the partition function of one molecule can be written as

q = qvib × qrot × qtrans × qel. (7.8)

Let us now look at the individual components.

²This is a crude formulation; of course each energy level will have some probability to be populated, but for the high levels this probability is just very small, so it can be neglected.


Figure 7.1: A system with equally spaced energy levels at different temperatures - at higher temperature more

levels are populated, thus the partition function is larger.

Figure 7.2: Three systems with different level densities at the same temperature.

7.4 Vibrational partition function

We have already evaluated the vibrational partition function of a diatomic molecule (Eq. 6.16):

qvib = Σ_{v=0}^∞ e^(−βhcνv) = 1/(1 − e^(−βhcν)) (7.9)

For sufficiently high temperatures (i.e., small β) one can use the approximation e^(−x) ≈ 1 − x and get the vibrational partition function in the high temperature approximation:

qvib ≈ 1/(βhcν) = kT/(hcν) = T/Θvib. (7.10)

Here we have defined the characteristic vibrational temperature Θvib as

Θvib = hcν/k. (7.11)

It is merely an expression of the vibrational spacing of a molecule in units of Kelvin. The high-temperature approximation is valid only for temperatures T >> Θvib.
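How quickly the high-temperature form becomes usable can be seen with a quick numerical comparison of the exact expression (7.9) and the approximation (7.10), using Θvib = 3374 K (N2, Table 8.1):

```python
import math

def q_vib_exact(T, theta_vib):
    """Eq. (7.9): qvib = 1 / (1 - exp(-Theta_vib / T))."""
    return 1.0 / (1.0 - math.exp(-theta_vib / T))

def q_vib_highT(T, theta_vib):
    """Eq. (7.10): qvib ~ T / Theta_vib, valid for T >> Theta_vib."""
    return T / theta_vib

theta = 3374.0  # N2, Table 8.1
for T in (300.0, 3374.0, 50000.0):
    print(T, q_vib_exact(T, theta), q_vib_highT(T, theta))
# At 300 K the exact q is essentially 1 (only v = 0 populated); the two
# expressions approach each other only once T is well above Theta_vib.
```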


7.5 Rotational partition function

The prescription to calculate the rotational partition function is

qrot = Σ_{J=0}^∞ (2J + 1) e^(−βhcBJ(J+1)). (7.12)

There is no simple closed formula for this sum and one has to calculate it numerically. However, if sufficiently high rotational levels are populated (at high temperature), one can treat J as a continuous variable and replace the summation by an integral:

qrot ≈ ∫_0^∞ (2J + 1) e^(−βhcBJ(J+1)) dJ = 1/(βhcB) = kT/(hcB) = T/Θrot, (7.13)

where Θrot = hcB/k is the characteristic rotational temperature. This high-temperature approximation is again valid only for temperatures T >> Θrot. However, since the spacing of rotational levels is usually much smaller than the spacing of vibrational levels (B < ν), the characteristic rotational temperature is much smaller than the characteristic vibrational temperature. Thus, the high-temperature approximation for qrot becomes valid at lower temperatures than the high-temperature approximation for qvib.
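The quality of the integral approximation (7.13) is easy to test by summing Eq. (7.12) directly; Θrot = 2.88 K for N2 is taken from Table 8.1:

```python
import math

def q_rot_sum(T, theta_rot, J_max=1000):
    """Eq. (7.12) summed term by term (heteronuclear molecule)."""
    return sum((2 * J + 1) * math.exp(-theta_rot * J * (J + 1) / T)
               for J in range(J_max + 1))

def q_rot_highT(T, theta_rot):
    """Eq. (7.13): the integral (high-temperature) approximation T/Theta_rot."""
    return T / theta_rot

theta = 2.88  # N2, Table 8.1
print(q_rot_sum(300.0, theta))    # the exact sum
print(q_rot_highT(300.0, theta))  # ~104.2; the integral is already
                                  # accurate to better than 1 % at 300 K
```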

Complication. What is written above is valid only for heteronuclear diatomic molecules (e.g., CO, NO, HCl ...). For homonuclear molecules (H2, O2, N2, ...) some rotational levels do not occur! In other words, they are not populated and will not contribute to the summation. The reason for this lies in one of the basic quantum mechanical principles - the wavefunction of a homonuclear diatomic molecule must possess a certain symmetry with respect to the interchange of the two identical nuclei in the molecule (Pauli principle). We will not do the detailed analysis here. We just state that for homonuclear diatomic molecules the partition function is one half of that for heteronuclear molecules:

qrot = (1/2) T/Θrot (7.14)

To have one general formula for qrot we define the symmetry number σ so that

qrot = (1/σ) T/Θrot. (7.15)

The symmetry number has a value of σ = 1 for heteronuclear molecules and σ = 2 for homonuclear molecules.

7.6 Translational partition function

If the gas is in a cubic vessel with a side length of L (its volume is V = L³), the molecule’s translational motion is described by three quantum numbers nx, ny, nz; the translational energy for one dimension is Ex = (h²/8mL²) n²x. Again, each energy level of one dimension combines with each energy level of the other dimensions, and if the three lengths are the same the translational partition function is

qtrans = qx qy qz = qx³ (7.16)

The translational energy levels are very closely spaced, so one can calculate the translational partition function in one dimension by replacing the summation over the discrete levels by an integration over continuous levels:

qx = Σ_{n=0}^∞ e^(−βh²n²/8mL²) ≈ ∫_0^∞ e^(−βh²n²/8mL²) dn (7.17)


The integral is a famous Poisson integral that is tricky to calculate but has a simple result³:

qx = √(2πmL²/βh²) (7.18)

The translational partition function is then

qtrans = qx³ = L³ (2πmkT/h²)^(3/2) = V/Λ³ (7.19)

The characteristic length Λ = √(h²/2πmkT) was introduced for the sake of simplicity.

The translational partition function depends not only on the temperature T but also on the volume V of the gas. (Think why - with larger L the energy levels are closer together, so more of them will be populated.) This is closely related to the first law of thermodynamics - one can change the internal energy either by adding some heat or by changing the volume, i.e., doing some work.
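To get a feeling for the magnitudes in Eq. (7.19), a short sketch; the mass below is roughly that of one N2 molecule and the volume an arbitrary 1 litre (both illustrative inputs):

```python
import math

h = 6.62607015e-34  # Planck constant, J s
k = 1.380649e-23    # Boltzmann constant, J/K

def thermal_wavelength(m, T):
    """Lambda = sqrt(h^2 / (2 pi m k T)) from Eq. (7.19)."""
    return math.sqrt(h * h / (2.0 * math.pi * m * k * T))

def q_trans(m, T, V):
    """qtrans = V / Lambda^3."""
    return V / thermal_wavelength(m, T) ** 3

# Illustrative inputs: m roughly one N2 molecule, V = 1 litre.
m, T, V = 4.65e-26, 300.0, 1.0e-3
print(thermal_wavelength(m, T))  # ~2e-11 m, far smaller than the vessel
print(q_trans(m, T, V))          # ~1e29 thermally accessible states
```

The enormous value of qtrans reflects the extremely dense spacing of the translational levels.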

7.7 Electronic partition function

Electronic states of each molecule are different, so one has to make the summation

qel = Σ_i^{el.states} gi e^(−βEi) (7.20)

for each case numerically. However, the energies of excited electronic states are very high compared to vibrational and rotational states. Thus, the Boltzmann factor e^(−βEi) of an excited electronic state is low compared to that of the ground state⁴, and by far most of the molecules are in the ground state at reasonable temperatures. Thus, one can approximate the electronic partition function as

qel ≈ g0, (7.21)

where g0 is the degeneracy of the ground state. In most cases (closed shell molecules) g0 = 1.⁵

7.8 Partition function for a system of N molecules

The function q that we were calculating so far is the partition function of a single molecule. It is very important, since the internal energy and heat capacity of the whole (ideal) gas can be obtained from it (Eq. 7.4, 7.5). However, sometimes we need to know the partition function Q of the whole system of N molecules, defined as

Q = Σ_i e^(−βUi) (7.22)

where the sum goes through all the microstates of the system and Ui is the internal energy of the whole system (the sum of the energies of all molecules) in the i-th microstate.

If we consider two distinguishable molecules, each level of one molecule can be combined with each level of

the second molecule and the partition function is Q = q1q2 (similar reasoning as when separating vibrational

³∫_0^∞ e^(−ax²) dx = √(π/4a).

⁴See exercises!
⁵Do you know cases where this is not so?


and rotational contributions to q of a single molecule). Quite generally, for a system of N distinguishable molecules the partition function is the product of the partition functions of the individual molecules:

QN = q1 × q2 × ... × qN = q^N. (7.23)

However, if we cannot distinguish the particles, some combinations of their states are identical. Thus, we have counted them several times and the expression QN = q^N overestimates the total partition function. Without derivation we state that in this case the correction factor is 1/N! and the resulting partition function for a system of N indistinguishable molecules is

QN = (1/N!) q1 × q2 × ... × qN = (1/N!) q^N (7.24)

Our most common system, a gas of N identical molecules, is of exactly this type - due to their quantum mechanical nature the molecules are inherently indistinguishable.
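The product rule (7.23) for distinguishable molecules can be verified by brute-force enumeration on a toy system (the three single-particle energies below are made up, in units of kT, i.e. β = 1):

```python
import math

# Toy single-molecule levels in units of kT (beta = 1); made-up values.
energies = [0.0, 1.0, 2.5]

q = sum(math.exp(-E) for E in energies)

# Two distinguishable molecules: every state of molecule 1 combines with
# every state of molecule 2, so summing over all pairs must give q**2.
Q2 = sum(math.exp(-(E1 + E2)) for E1 in energies for E2 in energies)

print(Q2, q ** 2)  # identical
```

For indistinguishable molecules, dividing by 2! corrects for counting the pairs (i, j) and (j, i) twice; pairs with i = j were never double-counted, so q²/2! is itself only an approximation - a very good one whenever far more states than molecules are available.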


Chapter 8

Internal energy and heat capacity

8.1 Mean energy per molecule, internal energy, molar heat capacity

We will finally apply the acquired knowledge to practical problems. Knowing the partition function, one can calculate the mean energy content of a molecule

E = −∂ ln q/∂β (8.1)

and the mean internal energy of a gas of N such molecules

U = −N ∂ ln q/∂β. (8.2)

The heat capacity at constant volume CV determines how the internal energy of the system changes when we change its temperature while keeping the volume fixed:

CV = ∂U/∂T. (8.3)

It depends on the amount of gas; one thus defines the molar heat capacity cv, which is the heat capacity of one mole of the gas. To get it we just need to divide CV by the number of moles n:

cv = CV/n. (8.4)

Each type of molecular motion contributes to the internal energy and heat capacity. Since the partition function is multiplicative, q = qvib × qrot × qtrans × qel, the logarithm in eq. (8.1) causes the internal energies and heat capacities to be additive (as instinctively expected). Let us look at one term after another.

8.2 Electronic motion

The electronic partition function differs from one molecule to another depending on the energies of the electronic states. For not extremely high temperatures only the ground state is populated; the mean internal energy per molecule is the energy of this electronic ground state, which is usually set to zero. Under this assumption Uel does not depend on temperature and the contribution of the electrons to the molar heat capacity is zero:

cV,el = 0. (8.5)

Do not forget that for temperatures high enough to electronically excite the molecules this is no longer the case (we are talking about thousands to ten thousands of Kelvins).


Molecule Θrot (Kelvin) Θvib (Kelvin)

O2 2.07 2256

N2 2.88 3374

HCl 15.02 4227

CO 2.77 3103

NO 2.39 2719

H2 85.3 6332

Cl2 0.351 805

I2 0.0537 308

Table 8.1: Characteristic rotational and vibrational temperatures for selected diatomic molecules

8.3 Translational motion

The translational partition function is qtrans = V/Λ³ with the characteristic length Λ = √(h²β/2πm). The mean translational energy per molecule is

Etrans = −∂ ln qtrans/∂β = −∂/∂β [ln V − (3/2) ln(h²/2πm) − (3/2) ln β] = (3/2)(1/β) = (3/2) kT (8.6)

You probably already know this result - it is usually taught as an empirical fact in the kinetic theory of gases. Here we have derived it from the very basic principles.

The corresponding internal energy of the gas is (3/2)NkT. Chemists do not like to specify the number of molecules but rather the number of moles n, which is n = N/NA with NA being Avogadro’s number. Then the mean translational energy is

Utrans = (3/2)NkT = (3/2)nNAkT = (3/2)nRT. (8.7)

R is the gas constant, R = NAk = 8.31 J K⁻¹ mol⁻¹. The translational molar heat capacity is

cV,trans = (3/2)R. (8.8)

For a monoatomic ideal gas at not too high temperatures, this is the only contribution to molar heat capacity.

Figure 8.1: Rotational molar heat capacity of ideal diatomic gas.


8.4 Rotational motion (diatomic molecule)

At low temperatures, the rotational partition function has to be evaluated numerically, and the corresponding mean energies and heat capacities as well. If T >> Θrot, this partition function becomes qrot ≈ T/Θrot = 1/(βhcB).

The corresponding mean rotational energy per molecule is

Erot = −∂ ln qrot/∂β = −∂/∂β [ln(1/hcB) − ln β] = 1/β = kT. (8.9)

The corresponding rotational contribution to the internal energy is

Urot = NkT = nRT (8.10)

and the molar heat capacity

cV,rot = R (8.11)

Table 8.1 shows the characteristic vibrational and rotational temperatures for selected diatomic molecules. The rotational temperatures are very low - that means the high temperature approximation for the rotational heat capacity starts to be valid at low temperatures and is an excellent approximation at normal laboratory conditions. At very low temperatures one has to calculate cV,rot numerically from the exact expression for the rotational partition function. It has to be zero at 0 Kelvin (rotations are not active at very low temperatures) and has to approach R at high temperatures. Figure 8.1 shows the rotational molar heat capacity as a function of temperature.

8.5 Vibrational motion (diatomic molecule)

The partition function:

qvib = 1/(1 − e^(−βhcν)) (8.12)

and the derivative of its logarithm:

Evib = −∂/∂β ln [1/(1 − e^(−βhcν))] = ∂/∂β ln [1 − e^(−βhcν)] = hcν e^(−βhcν)/(1 − e^(−βhcν)) = hcν/(e^(βhcν) − 1). (8.13)

The vibrational internal energy of the gas is

Uvib = N hcν/(e^(βhcν) − 1). (8.14)

The vibrational molar heat capacity is

cV,vib = (1/n) ∂U/∂T = R [βhcν/(e^(βhcν) − 1)]² e^(βhcν). (8.15)

This is a rather complicated but straightforward expression. This function is plotted in figure 8.2.

For temperatures higher than the characteristic vibrational temperature, T >> Θvib, the factor βhcν is small and one can use the approximation e^x ≈ 1 + x. In this high temperature approximation

Evib ≈ kT (8.16)

Uvib ≈ NkT = nRT (8.17)

cV,vib ≈ R. (8.18)

As seen in table 8.1, the high temperature approximation is valid only for very high temperatures. At laboratory conditions, however, most of the molecules of common gases are in the ground vibrational level; vibrations contribute very little to the heat capacity and are often neglected.


Figure 8.2: Vibrational molar heat capacity of ideal diatomic gas.

8.6 Total molar heat capacity of a diatomic ideal gas

The total molar heat capacity of a diatomic gas, at temperatures where the electronic excitation can be neglected and T >> Θrot, is

cV = cV,trans + cV,rot + cV,vib = R [3/2 + 1 + (βhcν/(e^(βhcν) − 1))² e^(βhcν)]. (8.19)
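Eq. (8.19) is easy to evaluate; below is a sketch using the N2 values Θrot = 2.88 K and Θvib = 3374 K from Table 8.1 (the T >> Θrot assumption is well satisfied at both temperatures shown):

```python
import math

R = 8.314  # gas constant, J K^-1 mol^-1

def cv_diatomic(T, theta_vib):
    """Eq. (8.19) with beta*h*c*nu = Theta_vib/T; assumes T >> Theta_rot,
    so translation and rotation contribute the full (3/2 + 1) R."""
    x = theta_vib / T
    vib = (x / (math.exp(x) - 1.0)) ** 2 * math.exp(x)
    return R * (1.5 + 1.0 + vib)

theta_vib = 3374.0  # N2, Table 8.1 (and Theta_rot = 2.88 K << 300 K)
print(cv_diatomic(300.0, theta_vib))   # ~20.8 J/(K mol): vibrations frozen
print(cv_diatomic(3000.0, theta_vib))  # ~28 J/(K mol): vibrations active
```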

8.7 Equipartition theorem

At sufficiently high temperatures, the mean energies per diatomic molecule have a very similar form and depend on the number of terms in the expression for the total energy:

• Translation: Etrans = (3/2)kT. The classical total energy is E = (1/2)mv²x + (1/2)mv²y + (1/2)mv²z.

• Rotation: Erot = kT. The classical total energy is (rotation around two axes) E = (1/2)Iω²x + (1/2)Iω²y.

• Vibration: Evib = kT. The classical total energy is (kinetic + potential) E = (1/2)mv² + (1/2)kx².

This hints towards an important conclusion called the equipartition theorem: at sufficiently high temperature the energy is shared equally among all its various forms - each quadratic term in the expression for the total energy contributes (1/2)kT to the mean energy per molecule.

8.8 Polyatomic molecules

From a chemical point of view it was boring so far: we considered only monoatomic and diatomic gases. However, our results can be easily extended to polyatomic molecules. The translational contributions are exactly the same, and the electronic contributions are usually neglected. What is missing are vibrations and rotations.

Vibrational motion of a polyatomic molecule

Let the molecule have Nat atoms. The vibrational motion can be very complex; however, there are a few special vibrations that create a basis for all possible other motions - every movement can be written down as a linear


combination of these vibrations with suitable amplitudes and phases. These special vibrations are called normal modes. The number of normal modes is 3Nat − 6 if the molecule is nonlinear and 3Nat − 5 if the molecule is linear. Each normal mode behaves as an independent harmonic oscillator; thus for statistical purposes the polyatomic molecule is just a collection of 3Nat − 6 harmonic oscillators¹.

Each normal mode has a certain vibrational spacing and an associated vibrational wavenumber νi, i = 1...(3Nat − 6). Thus, in order to specify the quantum state of a molecule, one has to specify 3Nat − 6 vibrational quantum numbers v1, v2, ..., v3Nat−6. The vibrational energy of such a state is

Evib = Σ_{i=1}^{3Nat−6} vi hcνi. (8.20)

Since the normal vibrations are independent, each level of one mode can be combined with each level of every other mode, and the vibrational partition function is the product of the partition functions of the individual modes:

qvib = qν1 × qν2 × ... × qν(3Nat−6). (8.21)

The mean energy is merely the sum of the mean energies of the individual normal vibrations:

Evib = Σ_{i=1}^{3Nat−6} hcνi/(e^(βhcνi) − 1). (8.22)

And the heat capacity is

cV,vib = R Σ_{i=1}^{3Nat−6} [βhcνi/(e^(βhcνi) − 1)]² e^(βhcνi). (8.23)

At high temperatures these expressions become (in accordance with the equipartition theorem)

Evib ≈ (3Nat − 6)kT, (8.24)

cV,vib ≈ (3Nat − 6)R. (8.25)
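A sketch of Eq. (8.23); the three normal-mode wavenumbers below are approximate literature values for H2O (nonlinear, 3Nat − 6 = 3 modes) and should be treated as rough illustrative inputs:

```python
import math

R = 8.314           # gas constant, J K^-1 mol^-1
HC_OVER_K = 1.4388  # hc/k in cm*K, converts wavenumbers to temperatures

def cv_vib_polyatomic(T, wavenumbers):
    """Eq. (8.23): vibrational molar heat capacity, summed over all modes."""
    total = 0.0
    for nu in wavenumbers:
        x = HC_OVER_K * nu / T  # = beta * h * c * nu_i
        total += (x / (math.exp(x) - 1.0)) ** 2 * math.exp(x)
    return R * total

# Approximate literature wavenumbers (cm^-1) for the three normal modes of H2O:
modes = [1595.0, 3657.0, 3756.0]
print(cv_vib_polyatomic(300.0, modes))    # tiny: all modes frozen at 300 K
print(cv_vib_polyatomic(50000.0, modes))  # -> approaches 3R, Eq. (8.25)
```

Each mode switches on around its own characteristic temperature hcνi/k, so the heat capacity rises in stages as T increases.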

Rotational motion of a polyatomic molecule

Linear polyatomic molecules behave in exactly the same way as diatomic molecules: they can rotate around two mutually perpendicular axes and the two moments of inertia are the same. The rotational motion can be described by one rotational quantum number J and one rotational constant B; the energy levels are Erot = hcBJ(J + 1), the partition function is qrot = (1/σ) T/Θrot, and the molar heat capacity is cV,rot = R.

A nonlinear polyatomic molecule can rotate around three axes and the three moments of inertia can be different. One thus has to define three rotational constants A, B and C. Similarly, one defines three rotational temperatures, Θrot,A = hcA/k etc. We state, without proof, that the rotational partition function in the high-temperature approximation is

qrot = (√π/σ) √(T³/(Θrot,A Θrot,B Θrot,C)). (8.26)

The mean rotational energy per molecule is (do the derivative yourself!)

Erot = (3/2)kT. (8.27)

And the molar heat capacity

cV,rot = (3/2)R. (8.28)

¹I will be writing 3Nat − 6 all the time; keep in mind that if the molecule is linear it is 3Nat − 5.


Chapter 9

Pressure and entropy

Here we will have a look at two bulk characteristics of an ideal gas: pressure and entropy. They are also encoded in the partition function. However, since these are quantities characterizing the whole system of N molecules, they are encoded in the total partition function Q. For a gas of N identical molecules, the partition function is (Eq. 7.24)

Q = (1/N!) q^N, (9.1)

where q is the partition function of one molecule. After decomposing q to individual components one obtains:

Q =1

N !qNtrans × qNvib × qNrot × qNel = Qtrans ×Qvib ×Qrot ×Qel (9.2)

where we have defined the individual components of the total partition function as

$$Q_{trans} = \frac{1}{N!} q_{trans}^N, \tag{9.3}$$

$$Q_{vib} = q_{vib}^N, \tag{9.4}$$

$$Q_{rot} = q_{rot}^N, \tag{9.5}$$

$$Q_{el} = q_{el}^N. \tag{9.6}$$

The factor 1/N! is contained in the translational partition function Q_trans. Its origin lies in the fact that molecules are indistinguishable and the total wavefunction does not change if we swap the places of two molecules. This hints towards assigning the 1/N! factor to the translational motion.

9.1 Pressure

Due to energy conservation, the internal energy of the gas can be changed either by adding heat or by doing work on the gas:

$$dU = dQ - P\,dV. \tag{9.7}$$

If we consider an isolated gas (adiabatic process), the pressure¹ can be defined as

$$P = -\left(\frac{\partial U}{\partial V}\right)_N. \tag{9.8}$$

¹ Pressure is usually denoted p; in this course p has been taken by probability, so we use capital P instead.


The i-th microstate of the gas has an internal energy U_i. The gas pressure in this microstate is P_i = −∂U_i/∂V, and the mean value of the pressure at temperature T is calculated as

$$P = \frac{1}{Q}\sum_i^{\text{microstates}} P_i\, e^{-\beta U_i} = -\frac{1}{Q} \sum_i^{\text{microstates}} \frac{\partial U_i}{\partial V}\, e^{-\beta U_i} = \frac{1}{Q} \sum_i^{\text{microstates}} \frac{1}{\beta} \frac{\partial e^{-\beta U_i}}{\partial V} \tag{9.9}$$

$$= \frac{1}{\beta} \frac{1}{Q} \frac{\partial}{\partial V} \sum_i^{\text{microstates}} e^{-\beta U_i} = \frac{1}{\beta} \frac{1}{Q} \frac{\partial Q}{\partial V} = \frac{1}{\beta} \frac{\partial \ln Q}{\partial V}. \tag{9.10}$$

This is the prescription for obtaining the gas pressure from the partition function. The logarithm of equation (9.2) is

$$\ln Q = \ln\frac{1}{N!} + N \ln q. \tag{9.11}$$

The only component of the partition function that depends on the gas volume is the translational partition function q_trans = V/Λ³. Thus

$$P = kTN \frac{\partial \ln q_{trans}}{\partial V} = kTN \frac{\partial \ln V}{\partial V} = \frac{1}{V} NkT. \tag{9.12}$$

As a result we obtain the well-known gas equation

$$PV = NkT = nRT. \tag{9.13}$$
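The chain of derivatives in Eqs. (9.10)-(9.12) can be verified numerically: since only q_trans depends on V, ln Q = const + N ln V, and a finite-difference derivative reproduces the ideal-gas pressure. A minimal sketch:

```python
import math

k = 1.38e-23  # Boltzmann constant, J K^-1

# Only q_trans = V/Lambda^3 depends on V, so ln Q = const + N ln V and
# Eq. (9.10), P = kT d(ln Q)/dV, can be checked by numerical differentiation.
N, T = 6.022e23, 298.0   # one mole at room temperature
V, dV = 0.0248, 1e-8     # m^3; roughly the molar volume at 1 bar

dlnQ_dV = N * (math.log(V + dV) - math.log(V - dV)) / (2 * dV)
P = k * T * dlnQ_dV
print(P)                 # close to 1e5 Pa, consistent with P V = N k T, Eq. (9.13)
```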

9.2 Entropy

We state without proof that the way to calculate the entropy from the partition function is

$$S = k \ln Q + \frac{U}{T}, \tag{9.14}$$

where U is the mean internal energy of the gas.

9.3 Translational entropy

The logarithm of the translational component of the partition function is

$$\ln Q_{trans} = \ln\left(\frac{1}{N!} q_{trans}^N\right) = -\ln N! + N \ln q_{trans} = -N \ln N + N + N \ln q_{trans}, \tag{9.15}$$

where we have used Stirling's approximation ln N! ≈ N ln N − N. The translational partition function of one molecule is q_trans = V/Λ³ and the mean translational energy of the gas is U = (3/2)NkT. The translational entropy is thus

$$S_{trans} = -kN \ln N + kN + kN \ln\frac{V}{\Lambda^3} + \frac{3}{2} kN = kN\left(\ln\frac{V}{N\Lambda^3} + \frac{5}{2}\right). \tag{9.16}$$

This is the Sackur-Tetrode equation for the entropy of an ideal monoatomic gas.

A useful application: consider an isothermal expansion of the gas from an initial volume V_i to a final volume V_f. The entropy change is

$$\Delta S = S_f - S_i = Nk \ln\frac{V_f}{V_i}. \tag{9.17}$$
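Equations (9.16) and (9.17) are straightforward to evaluate. The sketch below computes the molar translational entropy of argon at 298 K and 1 bar, using the thermal de Broglie wavelength Λ = h/√(2πmkT), and then the entropy change on isothermal doubling of the volume:

```python
import math

h, k, NA = 6.626e-34, 1.38e-23, 6.022e23
amu = 1.66e-27

def S_trans(N, V, T, m):
    """Sackur-Tetrode entropy, Eq. (9.16)."""
    Lam = h / math.sqrt(2 * math.pi * m * k * T)  # thermal de Broglie wavelength
    return N * k * (math.log(V / (N * Lam**3)) + 2.5)

# Molar entropy of argon at 298 K and 1 bar; V from P V = N k T, Eq. (9.13):
m_Ar, T, P = 39.95 * amu, 298.0, 1.0e5
V = NA * k * T / P
print(S_trans(NA, V, T, m_Ar))  # ~155 J K^-1 mol^-1, close to the tabulated value

# Isothermal doubling of the volume, Eq. (9.17): Delta S = N k ln 2
dS = S_trans(NA, 2 * V, T, m_Ar) - S_trans(NA, V, T, m_Ar)
print(dS / (NA * k))            # -> ln 2 = 0.693...
```

The close agreement with the calorimetric entropy of argon is one of the classic successes of the Sackur-Tetrode equation.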


9.4 Rotational entropy

The expression for the rotational entropy:

$$S_{rot} = k \ln Q_{rot} + \frac{U_{rot}}{T} = Nk \ln q_{rot} + \frac{U_{rot}}{T}. \tag{9.18}$$

Diatomic and linear molecules: For not too low temperatures (T ≫ Θ_rot), q_rot = T/(σΘ_rot) and U_rot = NkT. This leads to a simple expression

$$S_{rot} = Nk \ln\frac{T}{\sigma\Theta_{rot}} + Nk = Nk\left(1 + \ln\frac{T}{\sigma\Theta_{rot}}\right). \tag{9.19}$$

Nonlinear molecules: For not too low temperatures, q_rot = (√π/σ)√(T³/(Θ_rot,A Θ_rot,B Θ_rot,C)) and U_rot = (3/2)NkT. The rotational entropy is

$$S_{rot} = Nk\left(\frac{3}{2} + \ln\left[\frac{\sqrt{\pi}}{\sigma}\sqrt{\frac{T^3}{\Theta_{rot,A}\,\Theta_{rot,B}\,\Theta_{rot,C}}}\right]\right). \tag{9.20}$$

9.5 Vibrational entropy

Similarly, in the case of vibrations one can write

$$S_{vib} = k \ln Q_{vib} + \frac{U_{vib}}{T} = Nk \ln q_{vib} + \frac{U_{vib}}{T}. \tag{9.21}$$

Diatomic molecules: Inserting q_vib = [1 − e^{−βhcν}]^{−1} and Eq. (8.14) for the internal vibrational energy, we obtain

$$S_{vib} = \frac{N}{T} \frac{hc\nu}{e^{\beta hc\nu} - 1} - Nk \ln\left(1 - e^{-\beta hc\nu}\right) = Nk\left[\frac{\beta hc\nu}{e^{\beta hc\nu} - 1} - \ln\left(1 - e^{-\beta hc\nu}\right)\right]. \tag{9.22}$$

This is a rather complicated but straightforward expression. In the limit T → 0 (β → ∞) the vibrational entropy goes to zero. Physical interpretation: all molecules are in the ground vibrational state, and that is the only possible configuration.

Polyatomic molecules: The expression for the vibrational entropy is the sum of the contributions from each normal mode. The entropy of each normal mode is calculated using equation (9.22).
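A quick numerical look at Eq. (9.22), for a hypothetical diatomic with ν = 500 cm⁻¹, shows the entropy vanishing at low temperature and growing as more vibrational states become populated:

```python
import math

h, c, k, NA = 6.626e-34, 3.0e10, 1.38e-23, 6.022e23  # c in cm s^-1, nu in cm^-1

def S_vib_molar(nu_cm, T):
    """Molar vibrational entropy of one harmonic mode, Eq. (9.22)."""
    x = h * c * nu_cm / (k * T)  # x = beta*h*c*nu
    return NA * k * (x / (math.exp(x) - 1.0) - math.log(1.0 - math.exp(-x)))

# Hypothetical diatomic with nu = 500 cm^-1: the entropy vanishes as T -> 0
for T in (10.0, 300.0, 3000.0):
    print(T, S_vib_molar(500.0, T))
```

For a polyatomic molecule one would simply sum such contributions over all normal modes.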


Chapter 10

Chemical reactions

10.1 Equilibrium constant

So far we have considered only systems of identical molecules. However, statistical thermodynamics also provides useful information about chemical change. Consider the simplest example: the isomerisation of molecules A and B,

$$\text{A} \rightleftharpoons \text{B}. \tag{10.1}$$

At equilibrium there will be a certain number N_A of molecules A and a certain number N_B of molecules B. Their ratio is expressed by the equilibrium constant K_N,

$$K_N = \frac{N_B}{N_A}. \tag{10.2}$$

For a general reaction

$$a\text{A} + b\text{B} \rightleftharpoons c\text{C} + d\text{D} \tag{10.3}$$

the equilibrium constant is defined as

$$K_N = \frac{N_C^c\, N_D^d}{N_A^a\, N_B^b}. \tag{10.4}$$

We will now calculate the equilibrium constant using one of our most important results so far: the Boltzmann probability distribution.

Note: Chemists are very inventive when it comes to the number of differently defined equilibrium constants. The one we defined here - from the ratio of the numbers of molecules - is used rather rarely. More common is the ratio of concentrations, or the ratio of partial pressures, or even the ratio of partial pressures with respect to the standard pressure. The subject of chemical kinetics deals with these definitions, and it is trivial to obtain all of them from K_N, so we will not discuss them here.

10.2 Equilibrium constant and Boltzmann distribution

Consider reaction (10.1). A and B are just different arrangements of the same atoms. Thus one can write the total partition function for this system of atoms

$$q = \sum_i^{\text{states}} e^{-\beta E_i}, \tag{10.5}$$


Figure 10.1: Hypothetical energy levels of two isomers A and B.

where the sum goes through all quantum states, regardless of which isomer of these atoms the states originate from.

All the quantum states can be divided into two groups: they correspond either to arrangement A or to arrangement B. There will be some quantum states that cannot be assigned to either of these two arrangements - e.g., with the atoms somewhere in between the two isomers. Our assumption is that the number of such transition states is small and their statistical weight is negligible compared to that of the stable arrangements. Thus we neglect these transition states and divide all the states into two groups: A and B. If we assume that the lowest state of B is higher than the lowest state of A by ΔE (the reaction is endothermic), the hypothetical energy scheme is the one drawn in figure 10.1. One can define the partition function of isomer A,

$$q_A = \sum_i^{A\,\text{states}} e^{-\beta E_i}. \tag{10.6}$$

Here the sum goes only through the states originating in the A arrangement of the atoms. Similarly, the partition function q_B is defined as

$$q_B = \sum_i^{B\,\text{states}} e^{-\beta \varepsilon_i}. \tag{10.7}$$

Note that the partition function of B is calculated with respect to its own lowest level; the energy scale starting at this level is designated ε. The relation between the two energy scales is E = ε + ΔE. The number of molecules A is given by the Boltzmann distribution as the sum of the populations of all A states,

$$N_A = \sum_i^{A\,\text{states}} n_i = \sum_i^{A\,\text{states}} \frac{N e^{-\beta E_i}}{q} = \frac{N}{q} \sum_i^{A\,\text{states}} e^{-\beta E_i} = \frac{N}{q}\, q_A. \tag{10.8}$$

Similarly, the number of isomers B is

$$N_B = \sum_i^{B\,\text{states}} n_i = \sum_i^{B\,\text{states}} \frac{N e^{-\beta E_i}}{q} = \frac{N}{q} \sum_i^{B\,\text{states}} e^{-\beta E_i} = \frac{N}{q} \sum_i^{B\,\text{states}} e^{-\beta(\varepsilon_i + \Delta E)} = \frac{N}{q}\, q_B\, e^{-\beta \Delta E}. \tag{10.9}$$

We obtain a straightforward expression for the equilibrium constant,

$$K_N = \frac{N_B}{N_A} = \frac{q_B}{q_A}\, e^{-\beta \Delta E}. \tag{10.10}$$

Notice one thing: if the partition functions (i.e., the densities of states) of A and B are comparable, the exponential factor is the driving force - there will be more of the isomer with the lower energy. However, if q_B ≫ q_A, the ratio of partition functions can overwhelm the effect of the exponential, and the reaction can run even in the direction of increasing energy! It is then the increase in entropy that determines the direction of the reaction.
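This competition between the energy factor and the density of states is easy to see numerically. In the sketch below all numbers are invented for illustration: an endothermic isomerisation with ΔE = 10 kJ/mol, first with comparable partition functions and then with q_B ≫ q_A.

```python
import math

k = 1.38e-23  # Boltzmann constant, J K^-1

def K_N(qA, qB, dE, T):
    """Equilibrium constant of the isomerisation A <-> B, Eq. (10.10)."""
    return (qB / qA) * math.exp(-dE / (k * T))

# Hypothetical endothermic isomerisation: B lies 10 kJ/mol above A.
dE = 10e3 / 6.022e23  # Delta E per molecule, J
T = 300.0
print(K_N(1.0, 1.0, dE, T))  # comparable partition functions: K_N << 1, A dominates
print(K_N(1.0, 1e3, dE, T))  # many more B states: K_N > 1 despite the energy cost
```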

For the general case of reaction (10.3), the expression for the equilibrium constant is

$$K_N = \frac{q_C^c\, q_D^d}{q_A^a\, q_B^b}\, e^{-\beta \Delta E}. \tag{10.11}$$

10.3 Example: gas-phase dissociation of a diatomic molecule

The gas-phase reaction

$$\text{X}_2 \rightleftharpoons \text{X} + \text{X} \tag{10.12}$$

is a special case of reaction (10.3). The equilibrium constant is

$$K_N = \frac{N_X^2}{N_{X_2}} = \frac{q_X^2}{q_{X_2}}\, e^{-\beta \Delta E}. \tag{10.13}$$

The partition function of an X atom is

$$q_X = q_{X,trans} \times q_{X,el} \tag{10.14}$$

and the partition function of the diatomic molecule X₂ is

$$q_{X_2} = q_{X_2,trans} \times q_{X_2,el} \times q_{X_2,vib} \times q_{X_2,rot}. \tag{10.15}$$

All the components of the partition functions can be easily calculated from the spectroscopic data.
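A minimal numerical sketch of Eqs. (10.13)-(10.15) follows. All molecular parameters are invented for illustration (they are not spectroscopic data for any real molecule), and note that K_N, as defined from particle numbers, depends on the volume through the translational partition functions.

```python
import math

h, c, k = 6.626e-34, 3.0e10, 1.38e-23  # c in cm s^-1
amu = 1.66e-27

def q_trans(m, V, T):
    """Translational partition function q = V / Lambda^3."""
    Lam = h / math.sqrt(2 * math.pi * m * k * T)
    return V / Lam**3

def q_rot_linear(theta_rot, sigma, T):
    """High-temperature rotational partition function of a linear molecule."""
    return T / (sigma * theta_rot)

def q_vib(nu_cm, T):
    """Harmonic vibrational partition function (energies from the v = 0 level)."""
    return 1.0 / (1.0 - math.exp(-h * c * nu_cm / (k * T)))

# Hypothetical homonuclear diatomic X2 -- every number below is illustrative:
m_X = 35.0 * amu
theta_rot, sigma, nu = 0.35, 2, 560.0  # K, -, cm^-1
gX, gX2 = 4, 1                         # assumed electronic degeneracies
dE = 4.0e-19                           # dissociation energy per molecule, J
T, V = 1000.0, 1.0                     # K, m^3 (K_N depends on V via q_trans)

qX = gX * q_trans(m_X, V, T)
qX2 = gX2 * q_trans(2 * m_X, V, T) * q_vib(nu, T) * q_rot_linear(theta_rot, sigma, T)
KN = qX**2 / qX2 * math.exp(-dE / (k * T))
print(KN)
```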


Appendix A

Fundamental constants

The following table summarizes the fundamental constants used in this course. The values are rounded to a precision sufficient for the numerical calculations at the exam.

Constant              Value
--------------------  --------------------------------
Planck constant       h = 6.626 × 10⁻³⁴ J s
Speed of light        c = 3 × 10⁸ m s⁻¹
Boltzmann constant    k = 1.38 × 10⁻²³ J K⁻¹
Avogadro's number     N_A = 6.022 × 10²³ mol⁻¹
Gas constant          R = 8.31 J K⁻¹ mol⁻¹
Atomic mass unit      amu = 1.66 × 10⁻²⁷ kg
Useful number         hc/k = 0.0144 K m
