Statistical Mechanics Lecture 4

(April 23, 2013) Leonard Susskind completes the derivation of the Boltzmann distribution of the states of a system. This distribution describes a system in equilibrium, with maximum entropy.

Originally presented in the Stanford Continuing Studies Program.

Stanford University:

Continuing Studies Program:

Stanford University Channel on YouTube:
Comments

I struggled so badly with these concepts at Uni. (Not helped by a lecturer with an almost impenetrable German accent.)

He makes it so clear and simple. For the first time in my life I can claim to finally understand Thermodynamics. It's never too late I guess.

qwadratix

I see a lot of comments saying these Theoretical Minimum lectures in general were continuing studies and so aren't really graduate level but those comments must not be from people who actually watch and learn from these magnificent lectures. We are so blessed to get an original presentation of these ideas based on Susskind's own understanding and not a rehash of how they are usually presented. These are to graduate physics what the Feynman Lectures were to undergrad.

qbtc

Just a quick rehash: in lecture 3 he essentially proved that the set of occupation numbers describing the largest number of system states (relative to all other sets of occupation numbers) corresponds exactly, mathematically, to an unconstrained system that has maximized its entropy (the most likely system is the one with maximum entropy). So moving forward, when he maximizes entropy he is actually maximizing the number of states described by the occupation numbers, by varying those occupation numbers. This maximizes the probability that he is describing the system. When he then adds the constraints, he is finding the most likely probability distribution in which to find the system.

zacharythatcher
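
For reference, a sketch of the lecture-3 result referred to above, with k_B set to 1: for N copies of the system with occupation numbers n_i (so P_i = n_i/N), the number of ways to realize a given set of occupation numbers is a multinomial coefficient, and Stirling's approximation turns its logarithm into N times the entropy, so maximizing the count and maximizing the entropy are the same thing:

\[
\Omega = \frac{N!}{\prod_i n_i!}, \qquad
\log \Omega \;\approx\; N \log N - \sum_i n_i \log n_i \;=\; -N \sum_i P_i \log P_i \;=\; N S .
\]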

Occupation number of an ensemble; Lagrange multipliers 7:00; Maximizing entropy under constraints 11:00; Partition function Z 22:00; Entropy comes before temperature 40:00; Temperature is 1/Beta 50:00; Ideal Gas 56:00; Energy of Ideal Gas 1:23:00; Adding Gravity to Partition function 1:32:00

joabrosenberg
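
A compact summary of the chain of results those timestamps walk through, in the notation of the lecture (with k_B = 1): maximize the entropy S = -\sum_i P_i \log P_i subject to \sum_i P_i = 1 and \sum_i P_i E_i = \bar{E}. The Lagrange-multiplier conditions give the Boltzmann distribution and the standard identities

\[
P_i = \frac{e^{-\beta E_i}}{Z}, \qquad
Z(\beta) = \sum_i e^{-\beta E_i}, \qquad
\bar{E} = -\frac{\partial \log Z}{\partial \beta}, \qquad
S = \log Z + \beta \bar{E}, \qquad
\frac{\partial S}{\partial \bar{E}} = \beta = \frac{1}{T}.
\]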

Such an amazing lecturer. His lectures are pure gold. Thank you Prof. Susskind... This lecture is also very important for people doing computer science, optimisation, etc., as a lot of the foundational ideas behind Markov random fields are explained here!

friendlystonepeople

What a great privilege to be able to watch Professor Susskind's lectures. One of the greats

filipecardozo

Landed here because I like this stuff and I want to make connections with Communication/Information Theory. So far, it's really good. He is a great science communicator, which happens to be a tricky job, in my opinion. I also like the way the audience poses questions.

jaimelima

Coming from machine learning, this is really helpful.

anynamecanbeuse

The ideal gas example was so, so elegant. Absolutely enjoyed it.

alimanski

Fairly long derivation in this one, so I recommend that you set aside a block of time, grab a cup of coffee or tea (or whatever) and try to watch this one uninterrupted. If you watch it in bits over several days, it will be harder to follow.

EdSmiley

The division by N! is needed because we deal with N indistinguishable items.

pippintook
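
A sketch of where that factor enters, under the usual assumption of non-interacting, indistinguishable particles: summing over single-particle states independently counts each N-particle configuration N! times, so the corrected partition function is

\[
Z_N = \frac{(Z_1)^N}{N!}, \qquad
\log Z_N \;\approx\; N \log Z_1 - N \log N + N ,
\]

which is what makes the entropy extensive (the Gibbs-paradox fix).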

I think if the system is not at thermal equilibrium, different parts of the system have different temperatures. If you want to give one temperature for the whole system, it would have to be something like "the temperature it will have once it reaches equilibrium as a closed system".

kgrgzafnkg

I can finally do the canonical ensemble!! :D Thank you!! :D

queendaisy

It is relatively intuitive, through understanding Lagrange multipliers, that inverse beta is temperature. What is surprising is that this Lagrange multiplier can actually dictate the value of the constraint (temperature determines average energy). In other words, the rate of change of energy with respect to entropy is the temperature, and it dictates the energy. These are the signs of a highly constrained equation.

zacharythatcher
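
As a concrete illustration of "the multiplier dictates the value of the constraint", take a hypothetical two-state system (not one of the lecture's examples) with energies 0 and \epsilon:

\[
Z = 1 + e^{-\beta\epsilon}, \qquad
\bar{E} = -\frac{\partial \log Z}{\partial \beta} = \frac{\epsilon}{e^{\beta\epsilon} + 1},
\]

so specifying \beta = 1/T alone fixes the average energy.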

To answer that persistent guy's question, which the professor doesn't quite catch: yes, you solve the Lagrangian for a fixed set of points, BUT he generates a set of general equations that hold whenever you maximize the entropy, or whenever you consider only the set of most likely states.

zacharythatcher

At this point I'm now confused about what temperature is. Is temperature only well defined at equilibrium? What if your system is out of equilibrium? Does that simply mean you have multiple temperatures in your system? But if that is the case, then how can you have a non-equilibrium system with a well-defined temperature?

markfleharty

It seems like the Lagrange multiplier alpha doesn't really go away. In fact, d(alpha) = d(log Z) (with a minus sign here, which is arbitrary). So it's pretty much the central focus of statistical mechanics; all the interesting quantities are related to d(log Z).

deanrubine
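
A sketch of why, under one common sign convention: write the maximizing distribution as P_i = e^{-\alpha - \beta E_i}. The normalization constraint then pins down alpha in terms of the partition function,

\[
\sum_i P_i = e^{-\alpha} \sum_i e^{-\beta E_i} = 1
\quad\Rightarrow\quad
\alpha = \log Z ,
\]

so alpha survives as \log Z, and quantities like \bar{E} = -\partial \log Z / \partial \beta and S = \log Z + \beta\bar{E} are indeed built out of it.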

There's actually a small problem with the last derivation (ideal gas). He derives E as 3NT/2. However, this E, as derived earlier, was the energy per particle and not the total energy. E (energy per particle) can't be proportional to N. I think the error is in the way he solved the partition function (specifically: the sum in the exponent should be over E_i, the per-particle energy of state i).

aswinkrishna
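
For comparison, the textbook ideal-gas result (with k_B = 1; constant prefactors drop out of the \beta-derivative):

\[
Z_N \;\propto\; \frac{V^N}{N!}\,\beta^{-3N/2}, \qquad
\bar{E} = -\frac{\partial \log Z_N}{\partial \beta} = \frac{3N}{2\beta} = \frac{3}{2} N T ,
\]

so the E obtained from \log Z_N is the total energy of the N particles, and the per-particle energy is \tfrac{3}{2} T.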

Regarding Lagrange multipliers, aren't you supposed to subtract your constraints from the objective function instead of adding them? I understand that all it will do is change the sign of the Lagrange multipliers.

Eta_Carinae__
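
A quick check of that point, as a sketch: with the constraints subtracted,

\[
\mathcal{L} = -\sum_i P_i \log P_i \;-\; \alpha\Big(\sum_i P_i - 1\Big) \;-\; \beta\Big(\sum_i P_i E_i - \bar{E}\Big),
\qquad
\frac{\partial \mathcal{L}}{\partial P_i} = 0
\;\Rightarrow\;
P_i = e^{-1-\alpha-\beta E_i},
\]

and adding them instead just replaces \alpha, \beta by -\alpha, -\beta; the stationary points are identical.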

I was at this lecture 9 years ago, 7 years ago, 5 years ago, 3 years ago, and again today.

dongyoonkim