
Page 1: Diversity, Entropy and Thermodynamics

Diversity, Entropy and Thermodynamics

John Baez
http://math.ucr.edu/home/baez/biodiversity/

July 5, 2012
The Mathematics of Biodiversity
CRM

Page 2: Diversity, Entropy and Thermodynamics

The Shannon entropy

  S(p) = -\sum_{i=1}^n p_i \ln(p_i)

appears in thermodynamics and information theory, but it can also be used to measure biodiversity. Is this a coincidence?

No:

In thermodynamics, the entropy of a system is the expected amount of information we gain by learning its precise state. In biodiversity studies, the entropy of an ecosystem is the expected amount of information we gain about an organism by learning its species.

Can we connect biodiversity more deeply to thermodynamics?
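
For concreteness, here is a minimal numerical sketch (not from the slides, using a made-up four-species abundance distribution) of this entropy as a biodiversity measure:

```python
# Shannon entropy of a hypothetical four-species ecosystem, in nats.
import math

p = [0.5, 0.25, 0.15, 0.10]   # made-up relative abundances, summing to 1

S = -sum(pi * math.log(pi) for pi in p)
print(f"S(p) = {S:.3f} nats")                  # about 1.21

# A perfectly even ecosystem with n species would have S = ln(n).
print(f"maximum with 4 species = {math.log(4):.3f} nats")
```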

Page 5: Diversity, Entropy and Thermodynamics

Starting from any probability distribution, we can quickly reach all the main ideas of thermodynamics!

These include:

entropy S
temperature T
energy E
the partition function Z
the free energy F = E − TS

So, all these and much more are available to biodiversity studies.

Page 8: Diversity, Entropy and Thermodynamics

Suppose we have a finite list of probabilities p_i summing to 1.

Fixing a number T_0 > 0, write

  p_i = e^{-E_i/T_0}

for some energies E_i.

This lets us define probabilities depending on the temperature T:

  p_i(T) = \frac{1}{Z(T)} \, e^{-E_i/T}

where Z(T) is called the partition function:

  Z(T) = \sum_i e^{-E_i/T}
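
A small sketch of this construction (my own example with a made-up distribution and T_0 = 1): recover the energies from the probabilities, then build Z(T) and p_i(T).

```python
# From a probability distribution to energies, a partition function, and
# temperature-dependent probabilities.
import math

T0 = 1.0
p = [0.5, 0.25, 0.15, 0.10]               # hypothetical distribution, sums to 1
E = [-T0 * math.log(pi) for pi in p]      # energies defined by p_i = exp(-E_i/T0)

def Z(T):
    """Partition function Z(T) = sum_i exp(-E_i/T)."""
    return sum(math.exp(-Ei / T) for Ei in E)

def p_T(T):
    """Deformed probabilities p_i(T) = exp(-E_i/T) / Z(T)."""
    z = Z(T)
    return [math.exp(-Ei / T) / z for Ei in E]

print(Z(T0))     # 1.0: the original distribution is already normalized
print(p_T(T0))   # recovers the original p_i
```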

Page 11: Diversity, Entropy and Thermodynamics

As we raise the temperature, the probabilities p_i(T) become more evenly distributed:

[Plots of the probabilities p_i(T) at T = 1 and at T = 3]

When something gets hotter, all possible situations become closer to being equally probable.

Page 13: Diversity, Entropy and Thermodynamics

As we lower the temperature, the biggest probabilities increase, while the rest go to zero:

[Plots of the probabilities p_i(T) at T = 1 and at T = 1/3]

When something gets colder, the chance that it's in a low-energy state goes up.
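
Both of these slides can be illustrated with a short numerical sketch (again a made-up distribution of my own): raising T flattens p_i(T), lowering T concentrates it on the most probable, lowest-energy state.

```python
# How p_i(T) changes with temperature, for a hypothetical distribution.
import math

T0 = 1.0
p = [0.5, 0.25, 0.15, 0.10]
E = [-T0 * math.log(pi) for pi in p]

def p_T(T):
    w = [math.exp(-Ei / T) for Ei in E]
    return [wi / sum(w) for wi in w]

for T in (1/3, 1.0, 3.0):
    print(f"T = {T:.2f}:", [round(x, 3) for x in p_T(T)])
# T = 3 gives nearly equal probabilities; T = 1/3 pushes almost all the
# probability onto the state that was already most likely.
```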

Page 15: Diversity, Entropy and Thermodynamics

What is special about the probability distribution p_i(T)?

It minimizes the free energy F, which we can define for any probability distribution r_i:

  free energy = expected energy − temperature times entropy
              = \sum_i r_i E_i + T \sum_i r_i \ln(r_i)

So:

When it gets hotter, p_i(T) tries harder to maximize entropy. When it gets colder, p_i(T) tries harder to minimize energy.
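
Here is a rough numerical check of this minimization, under the assumption of some made-up energies: among many random distributions r, none achieves a lower free energy than the Boltzmann distribution p_i(T).

```python
# Free energy F(r) = sum_i r_i E_i + T sum_i r_i ln(r_i) is minimized by the
# Boltzmann distribution; we test this against random competitors.
import math, random

E = [0.0, 0.7, 1.5, 2.3]     # hypothetical energies
T = 2.0

def free_energy(r):
    return sum(ri * Ei for ri, Ei in zip(r, E)) + T * sum(ri * math.log(ri) for ri in r)

w = [math.exp(-Ei / T) for Ei in E]
boltzmann = [wi / sum(w) for wi in w]
F_min = free_energy(boltzmann)

random.seed(0)
for _ in range(10000):
    x = [random.random() for _ in E]
    r = [xi / sum(x) for xi in x]
    assert free_energy(r) >= F_min - 1e-12

print(f"F at the Boltzmann distribution: {F_min:.4f} (no random competitor did better)")
```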

Page 18: Diversity, Entropy and Thermodynamics

Using the probabilities p_i(T), entropy becomes temperature-dependent:

  S(T) = -\sum_i p_i(T) \ln(p_i(T))

So does the expected value of the energy:

  E(T) = \sum_i p_i(T) \, E_i

Thus, so does the free energy:

  F(T) = E(T) - T S(T)
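
A brief sketch (hypothetical energy levels again) tabulating these three temperature-dependent quantities:

```python
# Temperature-dependent entropy S(T), mean energy E(T), and free energy
# F(T) = E(T) - T*S(T), for made-up energy levels.
import math

E_levels = [0.0, 0.7, 1.5, 2.3]   # hypothetical energies

def thermo(T):
    w = [math.exp(-Ei / T) for Ei in E_levels]
    p = [wi / sum(w) for wi in w]
    S = -sum(pi * math.log(pi) for pi in p)
    E_mean = sum(pi * Ei for pi, Ei in zip(p, E_levels))
    return S, E_mean, E_mean - T * S

for T in (0.5, 1.0, 2.0, 4.0):
    S, E_mean, F = thermo(T)
    print(f"T = {T}: S = {S:.3f}, E = {E_mean:.3f}, F = {F:.3f}")
```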

Page 21: Diversity, Entropy and Thermodynamics

Given any probability distribution p_i, we get a 1-parameter family of entropies S(T). Are these the Rényi or Tsallis entropies?

No. The Rényi entropy is:

  S_q(p) = \frac{1}{1-q} \ln \sum_i p_i^q = \frac{1}{1-q} \ln \sum_i e^{-E_i q/T_0}

If we let q be the 'cooling factor':

  T = T_0/q

this gives

  S_q(p) = \frac{1}{1-q} \ln \sum_i e^{-E_i/T} = \frac{1}{1-q} \ln Z(T)
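
A quick numerical consistency check of this identity (made-up distribution, T_0 = 1): the Rényi entropy computed directly from the p_i agrees with (1/(1−q)) ln Z(T) at T = T_0/q.

```python
# Renyi entropy S_q(p) versus ln Z(T)/(1-q) with T = T0/q.
import math

T0 = 1.0
p = [0.5, 0.25, 0.15, 0.10]
E = [-T0 * math.log(pi) for pi in p]

def renyi(q):
    return math.log(sum(pi ** q for pi in p)) / (1 - q)

def lnZ(T):
    return math.log(sum(math.exp(-Ei / T) for Ei in E))

for q in (0.5, 2.0, 3.0):      # q = 1 is the Shannon limit, excluded here
    T = T0 / q
    print(q, renyi(q), lnZ(T) / (1 - q))   # the last two columns agree
```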

Page 24: Diversity, Entropy and Thermodynamics

What is the meaning of this equation:

  S_q(p) = \frac{1}{1-q} \ln Z(T) ?

We can exponentiate both sides to get the Hill numbers:

  D_q(p) = Z(T)^{\frac{1}{1-q}}

which Lou Jost argues are a better measure of biodiversity.

Challenge: If Hill numbers are fundamental to biodiversity, while the partition function is fundamental to thermodynamics, why this funny relationship?
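
A small sketch of the Hill numbers for a made-up distribution; D_q behaves like an 'effective number of species':

```python
# Hill numbers D_q(p) = (sum_i p_i^q)^(1/(1-q)), with the q -> 1 case taken
# as exp(Shannon entropy).  Computed for a hypothetical distribution.
import math

p = [0.5, 0.25, 0.15, 0.10]

def hill(q):
    if q == 1:
        return math.exp(-sum(pi * math.log(pi) for pi in p))
    return sum(pi ** q for pi in p) ** (1 / (1 - q))

for q in (0, 1, 2):
    print(f"D_{q} = {hill(q):.3f}")
# D_0 = 4 just counts the species; larger q discounts rare species more.
```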

Page 27: Diversity, Entropy and Thermodynamics

Alternatively, we can write

  S_q(p) = \frac{1}{1-q} \ln Z(T) = \frac{1}{1 - T_0/T} \ln Z(T) = \frac{T \ln Z(T)}{T - T_0}

Then, use a wonderful identity relating free energy to the partition function:

  F(T) = -T \ln Z(T)

to get

  S_{T_0/T}(p) = -\frac{F(T)}{T - T_0}

But Z(T_0) = 1, so F(T_0) = 0. Thus

  S_{T_0/T}(p) = -\frac{F(T) - F(T_0)}{T - T_0}
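
The chain of equalities above can be checked numerically; a sketch with a made-up distribution and T_0 = 1:

```python
# Check that S_{T0/T}(p) = -(F(T) - F(T0)) / (T - T0), with F(T) = -T ln Z(T).
import math

T0 = 1.0
p = [0.5, 0.25, 0.15, 0.10]
E = [-T0 * math.log(pi) for pi in p]

def F(T):
    return -T * math.log(sum(math.exp(-Ei / T) for Ei in E))

def renyi(q):
    return math.log(sum(pi ** q for pi in p)) / (1 - q)

for T in (0.5, 2.0, 4.0):
    q = T0 / T
    print(renyi(q), -(F(T) - F(T0)) / (T - T0))   # the two values agree
```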

Page 32: Diversity, Entropy and Thermodynamics

  S_{T_0/T}(p) = -\frac{F(T) - F(T_0)}{T - T_0}

Moral: Rényi entropy is minus the change in free energy divided by the change in temperature.

Taking T → T_0 we recover a famous formula for the Shannon entropy:

  S(p) = -\left. \frac{dF(T)}{dT} \right|_{T = T_0}

Challenge: What do these facts mean for biodiversity?
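
The limiting formula can also be checked numerically with a finite difference; a sketch, using the same kind of made-up distribution as above:

```python
# Shannon entropy as minus the derivative of free energy at T = T0,
# approximated by a symmetric finite difference.
import math

T0 = 1.0
p = [0.5, 0.25, 0.15, 0.10]
E = [-T0 * math.log(pi) for pi in p]

def F(T):
    return -T * math.log(sum(math.exp(-Ei / T) for Ei in E))

h = 1e-5
dF_dT = (F(T0 + h) - F(T0 - h)) / (2 * h)
shannon = -sum(pi * math.log(pi) for pi in p)
print(-dF_dT, shannon)   # both come out the same, about 1.21 here
```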

Page 35: Diversity, Entropy and Thermodynamics

The Second Law of Thermodynamics says that Shannon entropy must increase under certain conditions. Biodiversity does not always increase.

However, Marc Harper has shown that something similar to the Second Law holds when a population approaches an 'evolutionary optimum'!

Page 37: Diversity, Entropy and Thermodynamics

Suppose the vector of populations P = (P_1, \ldots, P_n) evolves with time according to a generalized Lotka–Volterra equation:

  \frac{dP_i}{dt} = f_i(P_1, \ldots, P_n) \, P_i

Let p be the corresponding probability distribution:

  p_i = \frac{P_i}{\sum_j P_j}

Let q be a fixed probability distribution, and let

  I(q, p) = \sum_i q_i \ln\!\left(\frac{q_i}{p_i}\right)

be the relative Shannon information.

Page 40: Diversity, Entropy and Thermodynamics

Then

  \frac{d}{dt} I(q, p) \le 0

if q is an evolutionary optimum:

  p \cdot f(P) \le q \cdot f(P)

for all P: i.e., the mean fitness of a small sample of 'invaders' distributed according to the distribution q exceeds or equals the mean fitness of any population P.

So: the information 'left to learn' never increases as the population's distribution evolves toward an evolutionary optimum. Not biodiversity, but relative biodiversity, matters here!
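
To illustrate Harper's result, here is a toy simulation of my own (not from the talk): I pick the hypothetical fitness functions f_i(P) = q_i − p_i, which make the fixed distribution q an evolutionary optimum in the sense above, and watch I(q, p) decrease along a Lotka–Volterra trajectory.

```python
# Toy Lotka-Volterra dynamics dP_i/dt = f_i(P) * P_i with the hypothetical
# fitness f_i(P) = q_i - p_i (p = P normalized).  Then q is an evolutionary
# optimum, and the relative information I(q, p) never increases.
import math

q = [0.4, 0.3, 0.2, 0.1]       # fixed 'optimal' distribution
P = [1.0, 5.0, 2.0, 8.0]       # arbitrary starting populations

def normalize(P):
    total = sum(P)
    return [Pi / total for Pi in P]

def rel_info(q, p):
    return sum(qi * math.log(qi / pi) for qi, pi in zip(q, p))

dt, last = 0.01, float("inf")
for step in range(5000):
    p = normalize(P)
    I = rel_info(q, p)
    assert I <= last + 1e-12          # the information 'left to learn' only shrinks
    last = I
    f = [qi - pi for qi, pi in zip(q, p)]
    P = [Pi * (1 + dt * fi) for Pi, fi in zip(P, f)]

print(f"final I(q, p) = {last:.6f}")  # close to zero: p has moved toward q
```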