This introductory physical chemistry course examines the connections between molecular properties and the behavior of macroscopic chemical systems.


From the course by partner University of Minnesota

Statistical Molecular Thermodynamics

161 ratings


From the lesson

Module 6

This module introduces a new state function, entropy, that is in many respects more conceptually challenging than energy. The relationship of entropy to extent of disorder is established, and its governance by the Second Law of Thermodynamics is described. The role of entropy in dictating spontaneity in isolated systems is explored. The statistical underpinnings of entropy are established, including equations relating it to disorder, degeneracy, and probability. We derive the relationship between entropy and the partition function and establish the nature of the constant β in Boltzmann's famous equation for entropy. Finally, we consider the role of entropy in dictating the maximum efficiency that can be achieved by a heat engine based on consideration of the Carnot cycle. Homework problems will provide you the opportunity to demonstrate mastery in the application of the above concepts.

- Dr. Christopher J. Cramer, Distinguished McKnight and University Teaching Professor of Chemistry and Chemical Physics

Chemistry

Now that we've reached the end of week six, let me try and review what I think

are the most important concepts associated with the second law of

thermodynamics and entropy. First, the definition of entropy: it's a state function defined by dS = δq_rev/T, that is, the infinitesimal change in entropy equals the reversible heat change divided by the temperature. For an isolated system, that is, at constant U (constant internal energy) and constant volume, spontaneous processes occur until the entropy is maximized, after which point the system is at

equilibrium and only reversible processes will continue to occur.

The second law, stated mathematically, is either dS ≥ δq/T, that's the differential, or infinitesimal, form if you like; or, for a macroscopic change, ΔS ≥ ∫ δq/T. The inequality holds if the process is at any stage irreversible; the equality holds if it's reversible along the entire path.

Clausius summarized the first and second laws in a single statement.

The energy of the universe is constant. The entropy is tending to a maximum.

Turning to the statistical mechanical aspects of entropy.

The Boltzmann definition, S = k_B ln W, is maximized when the total number of systems in a microcanonical ensemble is distributed equally among all the degenerate energy states. An alternative definition of statistical entropy is S = k_B ln Ω, where Ω is the degeneracy of the system, or of the ensemble, depending on which you want to refer to. The molar entropy change for the isothermal expansion of an ideal gas, from volume V1 to volume V2, is equal to the universal gas constant R times the natural logarithm of V2 over V1: ΔS̄ = R ln(V2/V1).

So that's true irrespective of whether the change is made reversibly or

irreversibly. The difference between the reversible and

the irreversible processes is in the sum of the entropy changes of the gas and of

the surroundings. So if at any point along the way the process is irreversible, then that sum will be greater than zero.

If it's reversible all the way then the sum will be zero.

The entropy change in the gas will be equal and opposite to the entropy change

in the surroundings for the reversible process.
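As a quick numerical check of the expansion result, here is a minimal Python sketch; the helper name `delta_S_expansion` is my own label, not from the lecture.

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def delta_S_expansion(v1, v2):
    """Molar entropy change for the isothermal expansion of an ideal gas:
    delta_S = R ln(V2/V1)."""
    return R * math.log(v2 / v1)

# Doubling the volume raises the molar entropy by R ln 2, about 5.76 J/(mol K),
# whether the expansion is carried out reversibly or irreversibly.
dS = delta_S_expansion(1.0, 2.0)
print(round(dS, 2))  # 5.76
```

Only the entropy of the gas is given by this formula; as noted above, the entropy of the surroundings depends on how the expansion is actually carried out.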

We also looked at the entropy of mixing, considering a mixture of gases. And while I used an example that was more convenient to draw, of two gases, bromine and nitrogen, actually you can express it more generally: the entropy of mixing is equal to minus R times the sum, over as many gases as there are, of n_i, the number of moles of gas i, times the natural logarithm of the mole fraction of i: ΔS_mix = −R Σ_i n_i ln x_i. The mole fraction x_i is how many moles of gas i there are out of the total number of moles of gas present. (And I see in that equation I seem to have an i subscript on the entropy; that's just a typo here.)

This is the overall entropy of mixing. It's always greater than zero: each mole fraction is always less than one, so its logarithm is negative; all the other quantities are positive, and there's a negative sign out front, so the total is always positive. Mixing is always spontaneous.
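The mixing formula is easy to evaluate directly; a short sketch (the function name `delta_S_mixing` is mine):

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def delta_S_mixing(moles):
    """Entropy of mixing ideal gases: -R * sum_i n_i ln(x_i),
    where x_i = n_i / n_total is the mole fraction of gas i."""
    n_total = sum(moles)
    return -R * sum(n_i * math.log(n_i / n_total) for n_i in moles)

# Mixing 1 mol of each of two gases: each mole fraction is 1/2,
# so dS_mix = -R * (ln(1/2) + ln(1/2)) = 2 R ln 2 > 0.
dS = delta_S_mixing([1.0, 1.0])
print(round(dS, 2))  # 11.53
```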

Yet another form of entropy we looked at was the probability form, which says that the entropy is equal to minus Boltzmann's constant times the sum, over the different states, of the probability of a state times the log of the probability of being in that state: S = −k_B Σ_j p_j ln p_j. That function is maximized when all the

probabilities are equal. And moreover, by invoking probability, it

provides a direct connection to the partition function.
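Before making that connection, the maximization claim itself can be checked numerically; a minimal sketch, where the function name `gibbs_entropy` is my own label:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def gibbs_entropy(probs):
    """Probability form of entropy: -k_B * sum_j p_j ln(p_j),
    with the convention that a term with p = 0 contributes zero."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

equal = gibbs_entropy([0.25, 0.25, 0.25, 0.25])  # uniform over 4 states
skewed = gibbs_entropy([0.7, 0.1, 0.1, 0.1])     # same 4 states, unequal
assert skewed < equal  # the uniform distribution gives the larger entropy

# For equal probabilities over W states this reduces to k_B ln W,
# recovering the Boltzmann form.
assert abs(equal - k_B * math.log(4)) < 1e-30
```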

When we make that connection, we discover that the entropy is equal to Boltzmann's constant times the temperature, times the partial derivative of the log of the partition function with respect to temperature, holding particle number and volume constant, plus, simply, Boltzmann's constant times the log of the partition function: S = k_B T (∂ ln Q/∂T)_{N,V} + k_B ln Q. So that's a new feature of entropy

compared to most of the other state functions we've looked at before.
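As a numerical sanity check, the two-term partition-function expression can be compared against the probability form for a simple hypothetical two-level system; this is only a sketch, and the function names are mine.

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def S_from_Q(energies, T, dT=1e-3):
    """Entropy from the partition function: S = k_B T (dlnQ/dT)_{N,V} + k_B ln Q,
    with the temperature derivative taken numerically (central difference)."""
    lnQ = lambda temp: math.log(sum(math.exp(-e / (k_B * temp)) for e in energies))
    dlnQ_dT = (lnQ(T + dT) - lnQ(T - dT)) / (2 * dT)
    return k_B * T * dlnQ_dT + k_B * lnQ(T)

def S_from_probs(energies, T):
    """Same entropy via the probability form: S = -k_B sum_j p_j ln(p_j),
    with p_j = exp(-E_j / k_B T) / Q."""
    boltz = [math.exp(-e / (k_B * T)) for e in energies]
    Q = sum(boltz)
    return -k_B * sum((w / Q) * math.log(w / Q) for w in boltz)

# Two-level system: ground state at 0, excited state 1e-21 J above it, at 300 K.
levels = [0.0, 1.0e-21]
s1 = S_from_Q(levels, 300.0)
s2 = S_from_probs(levels, 300.0)
assert abs(s1 - s2) / s2 < 1e-6  # the two routes agree
```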

It doesn't just involve a derivative of the log of the partition function.

There's actually the log of the partition function itself appearing in the expression. If we express the differential of the molar entropy in another way, taking advantage of connections to other thermodynamic state functions and quantities, it's equal to the constant-volume molar heat capacity times dT over T, plus the universal gas constant times dV̄ over V̄: dS̄ = C̄_V (dT/T) + R (dV̄/V̄).
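For a constant heat capacity, that differential integrates to ΔS̄ = C̄_V ln(T2/T1) + R ln(V2/V1); a quick numerical sketch, where the helper name `delta_S_molar` is my own:

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def delta_S_molar(T1, T2, V1, V2, Cv):
    """Integrate dS = Cv dT/T + R dV/V for an ideal gas with constant Cv:
    delta_S = Cv ln(T2/T1) + R ln(V2/V1)."""
    return Cv * math.log(T2 / T1) + R * math.log(V2 / V1)

# Monatomic ideal gas (Cv = 3R/2) heated from 300 K to 600 K at fixed volume:
dS = delta_S_molar(300.0, 600.0, 1.0, 1.0, 1.5 * R)
print(round(dS, 2))  # 8.64
```

Note that the earlier isothermal-expansion result is just the special case T1 = T2, where only the volume term survives.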

We also looked at a Carnot engine as an example of a process using an ideal gas to turn heat energy into work. And we discovered that the maximum efficiency of such an engine is 1 − Tc/Th, where Th is the temperature of the hot bath from which heat is extracted in order to do work, and Tc is the temperature of the cold bath into which heat is dumped.
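The efficiency bound is a one-liner to compute; a minimal sketch (the function name is mine):

```python
def carnot_efficiency(T_hot, T_cold):
    """Maximum efficiency of a heat engine operating between two baths:
    1 - Tc/Th, with both temperatures in kelvin."""
    return 1.0 - T_cold / T_hot

# An engine running between a 500 K hot bath and a 300 K cold bath can
# convert at most 40% of the extracted heat into work.
print(carnot_efficiency(500.0, 300.0))  # 0.4
```

Note that the efficiency goes to zero as Tc approaches Th, which is the content of Kelvin's statement below.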

And finally, based on the analysis of such an engine, we looked at Lord

Kelvin's restatement of the Second Law of Thermodynamics, which says that no net

work can be obtained from an isothermal cyclic process.

Well those are some of the key aspects of entropy as it relates to the second law

of thermodynamics. In the next week of the course we're

going to continue studying entropy. But we're going to look at it from the

context of another law, the third law of thermodynamics.

And we'll begin by looking at entropy and other thermodynamic functions.

