This introductory physical chemistry course examines the connections between molecular properties and the behavior of macroscopic chemical systems.


From the course by the University of Minnesota

Statistical Molecular Thermodynamics



From the lesson

Module 7

This module is relatively light, so if you've fallen a bit behind, you may have the opportunity to catch up. We examine the concept of the standard entropy made possible by the Third Law of Thermodynamics. The measurement of Third-Law entropies from constant-pressure heat capacities is explained and, for gases, compared to values computed directly from molecular partition functions. The additivity of standard entropies is exploited to compute entropy changes for general chemical changes. Homework problems will provide you with the opportunity to demonstrate mastery in applying these concepts.

- Dr. Christopher J. Cramer, Distinguished McKnight and University Teaching Professor of Chemistry and Chemical Physics


Alright. Let's take advantage of the tools we've developed so far this week to do some calculations of standard entropies. So, again, a quantity and an equation we've seen before: third-law entropy values. The entropy at a given temperature is equal to the integral from 0 to T of the constant-pressure heat capacity, divided by temperature, with respect to temperature. And in this case, because I've made my target temperature T the upper limit on the definite integral, I'll use T prime as the integration variable, just to be notationally careful. But I've simplified this a bit compared to some of the earlier videos: I've left out the S at temperature 0, because we have accepted the Third Law, namely that S(0) should be equal to 0 as long as the substance forms a perfect crystal.
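In symbols, the expression being described is:

```latex
S(T) \;=\; S(0) \;+\; \int_0^T \frac{C_P(T')}{T'}\,dT'
     \;=\; \int_0^T \frac{C_P(T')}{T'}\,dT'
\qquad \text{since } S(0)=0 \text{ for a perfect crystal.}
```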

Now, for this expression to hold, there is one last stipulation, and that is that there be no phase transition of the substance between 0 and the target temperature T. If there is a phase transition, for example if you go from a solid to a liquid phase, you melt, then we have to account for the entropy change associated with that. And because phase transitions are reversible processes, the delta S associated with a phase transition is the amount of heat that goes into the process divided by the temperature at which the phase transition takes place. And if it takes place at fixed pressure, then the fixed-pressure heat transfer is equal to the change in enthalpy. So delta S for a transition is equal to delta H for the transition over the temperature at which it occurs.
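In symbols, for a reversible phase transition at constant pressure:

```latex
\Delta S_{\mathrm{trs}} \;=\; \frac{q_{\mathrm{rev}}}{T_{\mathrm{trs}}} \;=\; \frac{\Delta H_{\mathrm{trs}}}{T_{\mathrm{trs}}}
```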

So, if I now think of breaking this integral up into integrals over phases, with different integrals for the different phases, and then also some additional entropy associated with the phase transitions, here would be a very general expression. I start at absolute zero, when my substance is a solid, and integrate the heat capacity up to the temperature of fusion, at which point it begins melting. Then I add the entropy associated with melting. Next, I continue my integral from the fusion temperature to the boiling point, the vaporization temperature, now keeping track of the liquid heat capacity, where before I had the solid heat capacity. Then I need the entropy associated with vaporization. And finally, I continue to integrate from the boiling point up to whatever temperature I'm interested in, if there is one above that, now using the gas heat capacity at constant pressure.
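Assembling the pieces just walked through (solid, fusion, liquid, vaporization, gas) gives the general expression:

```latex
S(T) \;=\; \int_0^{T_{\mathrm{fus}}} \frac{C_P^{\mathrm{solid}}(T')}{T'}\,dT'
\;+\; \frac{\Delta H_{\mathrm{fus}}}{T_{\mathrm{fus}}}
\;+\; \int_{T_{\mathrm{fus}}}^{T_{\mathrm{vap}}} \frac{C_P^{\mathrm{liquid}}(T')}{T'}\,dT'
\;+\; \frac{\Delta H_{\mathrm{vap}}}{T_{\mathrm{vap}}}
\;+\; \int_{T_{\mathrm{vap}}}^{T} \frac{C_P^{\mathrm{gas}}(T')}{T'}\,dT'
```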

Now, it's also helpful to take a more careful look at the low-temperature behavior of the constant-pressure heat capacity. If we have a nonmetallic solid, an insulator that is, the Debye T-cubed law is generally observed to hold. It says that the molar constant-pressure heat capacity is equal to a collection of constants, 12 pi to the fourth power over 5, times the universal gas constant, times the quantity T over capital theta sub D, called the Debye temperature, all cubed. The Debye temperature has units of kelvin, so that ratio becomes a unitless quantity, and R has the same units as heat capacity, so the dimensions all work out. So, as long as the temperature is below some limit, this expression holds. Debye actually derived this particular relationship through an analysis of quantized phonon energy levels in solids; that would be a solid-state physics topic, and it's beyond what we want to do in this course. But suffice it to say there is a first-principles explanation for why this equation is valid, and of course experiment bears it out as well.

as well. And, so if we ask, then, what is the very

lowest temperature contribution to a third law entropy.

We would want to compute the molar entropy.

I'll integrate from 0 to a given temperature T, the molar heat capacity dT

over T. But now I can replace the molar heat

capacity with the Debye T cubed expression.

So I'll do that, I'll pull all the constants out front, and I end up

integrating from 0 to T, T prime squared, because I had a T divide, T cubed divided

by T. So I'm left with T squared dT.

So when I do that integral I'll get T cubed over 3.

And actually if I look at this resulting expression, it's just this the constant

pressure molar heat capacity divided by 3.
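This result is easy to check numerically. In the sketch below, the Debye temperature of 150 K is an arbitrary illustrative value, not a measured constant for any particular solid; the point is only that the integral of the Debye heat capacity over T reproduces Cp(T)/3:

```python
import math

R = 8.314462618   # J/(K*mol), universal gas constant
theta_D = 150.0   # K, hypothetical Debye temperature (illustration only)

def cp_debye(T):
    """Debye T^3 law: Cp = (12*pi^4/5) * R * (T/theta_D)^3."""
    return (12 * math.pi**4 / 5) * R * (T / theta_D)**3

def entropy_low_T(T, n=100_000):
    """Trapezoid-rule estimate of the integral of Cp(T')/T' dT' from 0 to T."""
    dT = T / n
    total = 0.0
    for i in range(1, n + 1):
        t0, t1 = (i - 1) * dT, i * dT
        f0 = cp_debye(t0) / t0 if t0 > 0 else 0.0  # integrand -> 0 as T' -> 0
        f1 = cp_debye(t1) / t1
        total += 0.5 * (f0 + f1) * dT
    return total

T = 5.0
print(entropy_low_T(T))   # ~0.0240 J/(K*mol)
print(cp_debye(T) / 3)    # same value: S(T) = Cp(T)/3 in the Debye regime
```

The agreement holds only as long as the Debye law itself holds, i.e., well below the Debye temperature.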

And that's very useful. It says I don't necessarily need to buy a big, expensive piece of equipment to do, say, magnetic cooling, magnetic refrigeration, all the way down to some ridiculously low temperature. Instead, I only have to get to a reasonably low temperature; maybe I go to five kelvin, and I ought to be able to use liquid helium for that. I measure the constant-pressure heat capacity at that temperature: how much heat does it take to raise my substance by one kelvin? Then I just divide by 3, and that ought to be equal to the molar entropy at that temperature. At that stage I can begin doing additional measurements, in order to keep adding the extra entropy above and beyond that starting point. So I don't really have to start from zero; that's the utility. That just restates what I was commenting on: it's very convenient, because it obviates our need to measure the constant-pressure heat capacity all the way down to absolute zero.

And so let me illustrate a practical demonstration of the measurement, or computation, depending on how you want to think of the arithmetic step, of the total entropy for nitrogen. This is the molar entropy of nitrogen over the temperature range from absolute zero, here at the origin, all the way up to 400 kelvin. If we think of this as occurring in various steps, we start from 0 to 10 kelvin, and then 10 kelvin to 35.61 kelvin. And I don't know why the experimentalists went to exactly, oh, I do know why they went to exactly that temperature. So [INAUDIBLE] let's look at the numbers first. At the very lowest range here, 0 to 10 kelvin, we pick up 2.05 joules per kelvin per mole worth of entropy. (I'll stop saying the unit from now on, but that's the unit we're tabulating.) Over the next 25.61 degrees, we pick up about 25.79 more. At that stage, the solid nitrogen undergoes a phase transition to another solid form. That's not that unusual; solids can often have different phases. They're still solids, but the molecules are just oriented differently in the crystal, for example. And so there is an entropy change associated with that solid-solid phase transition, and it adds 6.43 to the entropy. In some sense, you could think of that as the driving force for the transition: as you increase the temperature, entropy plays more of a role, and the more disordered phase becomes the favored phase. We'll see more about that when we think about other thermodynamic quantities next week.

But in any case, at that stage I'm at 35.61 kelvin. That phase is stable for another 28 degrees or so, and as I continue to raise the temperature, another increase of 23.4, in these units, is observed. And remember, what you're really doing, of course, is measuring the heat capacity degree by degree, and then plugging that into the integral over each of those degree steps: what was the temperature for that step, what was the heat capacity for that step, and adding them all up. So that would be a numeric way to solve that integral, one way to think about it.
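The degree-by-degree bookkeeping just described amounts to a trapezoid sum over measured (T, Cp) pairs. The data points below are made-up placeholder numbers, not real nitrogen measurements; they only illustrate the arithmetic:

```python
# Hypothetical measured heat-capacity data: (temperature in K, Cp in J/(K*mol))
data = [(10.0, 6.1), (11.0, 6.9), (12.0, 7.8), (13.0, 8.7)]

# Trapezoid-rule estimate of the integral of Cp/T dT over the measured range
dS = 0.0
for (T0, cp0), (T1, cp1) in zip(data, data[1:]):
    # average the integrand Cp/T over each one-degree step
    dS += 0.5 * (cp0 / T0 + cp1 / T1) * (T1 - T0)

print(f"entropy gained from {data[0][0]} K to {data[-1][0]} K: {dS:.3f} J/(K*mol)")
# prints: entropy gained from 10.0 K to 13.0 K: 1.917 J/(K*mol)
```

Finer temperature steps in the measurement translate directly into a more accurate numerical integral.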

Okay, at 63.15 kelvin we have a solid-to-liquid phase transition; this is the temperature at which nitrogen melts. As it melts, the entropy increases by 11.2. Notice that that's larger than the solid-solid transition, which seems sensible: there's more disorder in a liquid than there is in a solid, so we get more of a positive increase in entropy. At that stage, the liquid nitrogen is only liquid over a relatively small range, 63 to 77 kelvin, and that increases the entropy by another 11.46. I've just been indicating on this curve, which is the total molar entropy as a function of temperature, that so far we've passed through the two solid phases and are now moving through the liquid phase. The vertical lines shown here are a little too small to label conveniently, but those are the phase transitions: the entropy that goes in to change the phase without changing the temperature. Well, then we get to a really big phase transition, this really tall vertical line, as the liquid goes to a gas. And certainly a gas has vastly more disorder, as the individual molecules of the gas spread out to occupy a much larger volume than is true for the liquid. So we pick up 72 joules per kelvin per mole associated with that transition. After that point we have a long range; I'm showing a curve that goes to 400, but I'm actually stopping at room temperature, 298.15, in the table here. So we pick up another 39 joules per kelvin per mole, and here's the increase, if you like, until we get to about here, room temperature. And finally, in tabulations, one often sees a correction for non-ideality. That simply takes note of the fact that when you do the measurements, you tend to do them at very low gas pressures, so that things are behaving ideally, and then you correct for non-ideality as you go to an actual pressure of, say, one bar: the real gas compared to the ideal gas. In nitrogen's case, at this temperature, that's not a very big correction, 0.02, pretty small compared to all the others. But we add it all together and you get a value of 191.61 joules per kelvin per mole.
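As a sanity check on the arithmetic, here is the running sum of the contributions quoted above. Note that the vaporization and gas-heating entries were quoted in the lecture only to the nearest joule, so the sum lands slightly below the tabulated 191.61:

```python
# Contributions to the standard molar entropy of N2 at 298.15 K, in J/(K*mol),
# as quoted in the lecture (72 and 39 are rounded values).
contributions = [
    ("solid, 0 -> 10 K (Debye extrapolation)",    2.05),
    ("solid, 10 -> 35.61 K",                      25.79),
    ("solid-solid transition at 35.61 K",         6.43),
    ("solid, 35.61 -> 63.15 K",                   23.4),
    ("fusion at 63.15 K",                         11.2),
    ("liquid, 63.15 -> 77 K",                     11.46),
    ("vaporization at ~77 K",                     72.0),
    ("gas, ~77 -> 298.15 K",                      39.0),
    ("non-ideality correction at 1 bar",          0.02),
]

total = sum(s for _, s in contributions)
print(f"S(298.15 K) ~ {total:.2f} J/(K*mol)")
# prints: S(298.15 K) ~ 191.35 J/(K*mol)  (vs. tabulated 191.61; the gap
# comes from the rounding of the two largest entries)
```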

And so that is a standard entropy; you can look it up in tables of collected physical quantities. And I've just mentioned to you that, by convention, if you want to know the real gas, there is a correction for non-ideality at one bar of pressure. Okay, so, having gone through that example, let me give you a chance to address a question related to it, and then we'll come back. All right, well, what we've done to date has really been classical thermodynamics. All the derivations of differentials, and the thinking about how to use heat capacity to measure entropy, those are available from classical thermodynamics. But I do want to pull us back to the real focus of this course, and that is the molecular underpinnings. So, next, we'll look at the relationship between entropy and the partition function as it relates to the Third Law in more detail. [SOUND]
