This introductory physical chemistry course examines the connections between molecular properties and the behavior of macroscopic chemical systems.


From the course by University of Minnesota

Statistical Molecular Thermodynamics

171 ratings


From the lesson

Module 6

This module introduces a new state function, entropy, that is in many respects more conceptually challenging than energy. The relationship of entropy to extent of disorder is established, and its governance by the Second Law of Thermodynamics is described. The role of entropy in dictating spontaneity in isolated systems is explored. The statistical underpinnings of entropy are established, including equations relating it to disorder, degeneracy, and probability. We derive the relationship between entropy and the partition function and establish the nature of the constant β in Boltzmann's famous equation for entropy. Finally, we consider the role of entropy in dictating the maximum efficiency that can be achieved by a heat engine based on consideration of the Carnot cycle. Homework problems will provide you the opportunity to demonstrate mastery in the application of the above concepts.

- Dr. Christopher J. Cramer, Distinguished McKnight and University Teaching Professor of Chemistry and Chemical Physics

Chemistry

Well, let's move on and consider entropy in a little more detail, and in

particular take a look at its properties as a state function.

So, remember what being a state function means. The property of being a state function says that irrespective of what path you take between one state point and another, as long as you end up at the same destination, you should see the same change in the state function. A state point implies the specification of, say, temperature, pressure, and number of particles (you can't specify all of temperature, pressure, and volume; the substance dictates that). Indeed, if you end up taking a circular path, that's what this equation says: the circle integral means you follow a path back to its original point. Since you're back at the original point, you must still have the same value of the state function. That is, it will have changed by 0. So delta S is 0 for a cyclic process.
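In symbols, the cyclic-path statement just described is:

```latex
\oint dS = 0 \qquad \Rightarrow \qquad \Delta S = 0 \ \text{for any cyclic process}
```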

But what I'd like to do then is go back to our ideal gas roadmap, which we used to look at changes in internal energy and observe that U was a state function. So here's our old friend, where we considered different ways to go from an original pressure, volume, and temperature to a second pressure and volume, but still the same temperature. So there is the isothermal expansion; the adiabatic expansion followed by constant volume warming; and the constant pressure expansion followed by constant volume cooling. I'll start with path A versus path B plus path C. What's the change in entropy along these two different paths? If entropy is a state function, it's got to be the same change in entropy. So, let's just do the math and check.

So, remember the reversible heat for the isothermal path. Since it is isothermal, the change in internal energy is zero all along the path. That means that del q is equal to minus del w, and the reversible work comes from the pressure of the ideal gas being the external pressure: nRT1, with T1 the temperature we're operating at, divided by V, times dV. And so if I now equate that with the reversible heat, it's exactly this, nRT1 over V dV. In order to get the actual heat change, I integrate dV over V from V1 to V2, and I get nRT1 log V2 over V1.
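Written out, the path A steps just described are:

```latex
dU = 0 \;\Rightarrow\; \delta q_{\mathrm{rev}} = -\delta w_{\mathrm{rev}} = \frac{nRT_1}{V}\,dV,
\qquad
q_{\mathrm{rev}} = \int_{V_1}^{V_2} \frac{nRT_1}{V}\,dV = nRT_1 \ln\frac{V_2}{V_1}
```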

And we did this in week five; you can go back and review those videos if you want to see those steps again. So, given that, this is just recapitulating that equation: I've got the change in the heat here.

If I instead want to compute the change in entropy, now I integrate not del q, but del q over T from state point one to state point two. That's 1 over T1 times the integral, because T is a constant here; the T1 in nRT1 and that 1 over T1 will cancel. So I'll get the integral from V1 to V2 of nR over V dV, which is nR log V2 over V1. Alright, so that is the change in entropy along path A. I will ask you to notice that since volume 2 is greater than volume 1 (we've moved from left to right on the volume axis), V2 over V1 is a number greater than 1, and that makes the logarithm a positive value; n and R are positive, a number of moles and a constant that's positive. And so the change in entropy is positive. Entropy has increased as I've gone from a lesser volume to a greater volume. And that's consistent with our idea of entropy as a measure of disorder: if I have the same amount of gas in a larger volume, there are sort of more ways to imagine where the gas molecules might be.

Now, let's consider paths B and C. Path B is particularly simple: path B is the adiabatic expansion, and adiabatic means that there is no heat transfer; del q is equal to zero. So in that case, for the entropy change, if I integrate zero divided by T, it doesn't really matter what T is; I'm integrating zero and I get zero. So the change in entropy for an adiabatic expansion is zero.

And then I'll ask you to remember that we worked out (you can look at video 5.4 if you'd like to see the individual steps) that the heat transfer for step C, the constant volume heating, is equal to the internal energy change, since there's no work done. And so the entropy change is the integral from T2, the starting temperature, to T1, the ending temperature, of C_V over T dT. And we showed that that was equal to (this is now in video 5.5 if you want to see the individual steps; I'm just going to recall the answer) nR log V2 over V1. When I sum this result with the path B result, 0 plus nR log V2 over V1 is indeed the same term. And that's just what we found for path A.
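To see that bookkeeping numerically, here is a minimal sketch (assuming a monatomic ideal gas, so C_V = 3/2 nR, and hypothetical values for n, the volumes, and T1) comparing path A against paths B plus C:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def dS_path_A(n, V1, V2):
    """Isothermal reversible expansion: delta S = nR ln(V2/V1)."""
    return n * R * math.log(V2 / V1)

def dS_paths_BC(n, V1, V2, T1, Cv):
    """Adiabatic expansion (delta S = 0) followed by constant-volume heating.
    For the reversible adiabat, T2 = T1 * (V1/V2)**(R/cv_molar)."""
    cv_molar = Cv / n
    T2 = T1 * (V1 / V2) ** (R / cv_molar)
    # Step B contributes zero; step C heats from T2 back to T1 at constant V
    return Cv * math.log(T1 / T2)

# Hypothetical values: 1 mol, volume doubling, starting at 300 K
n, V1, V2, T1 = 1.0, 1.0, 2.0, 300.0
Cv = 1.5 * n * R  # monatomic ideal gas

print(abs(dS_path_A(n, V1, V2) - dS_paths_BC(n, V1, V2, T1, Cv)) < 1e-9)  # True
```

The two paths agree, as the state-function argument demands.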

So entropy is obeying its necessary behavior as a state function: independent of path, when we arrive at the same final point, we have the same net entropy change. So I'm going to pause here for a second, and I'm going to let you consider what the entropy change is for path E, to see if you've appreciated the development so far.

Alright, we've begun to get some experience working with entropy, and hopefully it's becoming a little bit more comfortable and familiar. This is just the definition of dS again. I'll point out one feature, I suppose, that's also worth bearing in mind as a conceptual understanding point. And that is that, if entropy is related to the disorder of a system, and you increase the entropy by adding heat (so del q is positive), notice that the change in entropy depends on the current temperature. So at very low temperatures, this implies, a given quantity of heat will increase the disorder considerably more than adding that same quantity of heat would at very high temperatures.
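A tiny sketch of that point, with hypothetical numbers (the same 100 J of reversible heat delivered at 10 K versus 1000 K):

```python
def dS(q_rev, T):
    """Entropy change for a small reversible heat transfer q_rev at absolute temperature T."""
    return q_rev / T

q = 100.0  # J, hypothetical heat input
print(dS(q, 10.0))    # 10.0 J/K: large disorder increase at low T
print(dS(q, 1000.0))  # 0.1 J/K: much smaller increase at high T
```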

Right? So the same heat delivered at low T increases entropy more. So that's just an appreciation, if you will, of the definition of dS. Well, having looked at the ideal gas expansion paths, we've got some feel for how entropy relates to heat and work along different paths. Next, I want to consider the role of spontaneity more generally in thermodynamics, and indeed express it in terms of the second law.
