The Second Law of Thermodynamics: Heat Flow, Entropy, and Microstates

What the heck is entropy?! You've heard a dozen different explanations. Disorder, microstates, Carnot engines... so many different wordings and interpretations! Really, they are all related, so let's go through a few different formulations of the second law and ways of interpreting it. It's hard to understand, but who doesn't like a challenge?

Check out "Is This Wi-Fi Organic?", my book on disarming pseudoscience!
Comments

Next time my mom asks me to clean my room, I'm just going to tell her that I can't, because I respect the universe, and the universe likes spontaneous increases in entropy!

JamezBnd

Simply stated, entropy relates the temperature of a body to its heat content (more precisely, its thermal energy). Entropy, S, is the heat content, Q, divided by the body's temperature, T (strictly, this holds for heat transferred reversibly at a constant temperature T):
S = Q/T
Stated another way, the heat, Q, stored in an object at temperature T is its entropy, S, multiplied by its temperature, T:
Q = T × S

That is it. The definition of entropy, as originally conceived in classical thermodynamics, had nothing to do with order or disorder. It had everything to do with how much heat energy was stored or trapped in a body at a given temperature. Think of it this way: if you removed all the heat energy possible from an object by cooling it as far as possible (toward absolute zero), and then kept track of the heat you had to put back in to return it to a given state, dividing each small increment of heat by the temperature at which it was added, the running total would be the entropy of that object in that state. Equivalently, the entropy of a system is its heat capacity integrated over absolute temperature: S(T) = ∫ from 0 to T of C(T')/T' dT'.
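
To make that last relation concrete, here is a minimal Python sketch (the T^3 heat-capacity model and the coefficient a are illustrative assumptions, not anything from the video) that accumulates C(T)/T numerically from near absolute zero and checks the result against the closed form:

    import numpy as np

    # S(T) = integral from 0 to T of C(T')/T' dT' -- heat supplied per kelvin,
    # accumulated as the object warms from absolute zero.
    # Illustrative assumption: a Debye-like solid at low temperature, where
    # C(T) = a*T**3, so the integral has the closed form S(T) = a*T**3 / 3.
    a = 1.94e-4                                # J/K^4, made-up coefficient
    T_final = 20.0                             # K

    T = np.linspace(1e-9, T_final, 200_000)    # start just above 0 K
    integrand = a * T**2                       # C(T)/T = a*T**3 / T
    # trapezoidal rule by hand, to keep the demo NumPy-version agnostic
    S_numeric = np.sum((integrand[1:] + integrand[:-1]) / 2 * np.diff(T))
    S_exact = a * T_final**3 / 3

    print(f"S numeric = {S_numeric:.6f} J/K, S exact = {S_exact:.6f} J/K")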

KimberlyRPeacock

Dave, your material is so well organized for classroom work and tests.

ckarp

I'm trying to understand entropy perfectly and man it's not easy

sccm

I just started watching your tutorial and it gives me...
I hope your educational channel does well.

victorakachukwu

I hope your channel will be the best educational channel and will soon have a million subscribers!

malayapaul

You are the best. Thank you for the work you do in creating these lectures!

dantej

This is like the show "Professor Proton" from The Big Bang Theory that Sheldon is fond of. And I'm really, really fond of your work; it's simple, efficient, and quite enough to keep going with the topics. Thank you, Professor Dave 🤩

yssacnton

First time I understood this concept 😄 It was my first lecture on your channel, and the experience was just wow...
Really, you are amazing.
Love & respect from INDIA 🇮🇳

shreya

these thermodynamics videos came just in time :D thanks a lot ❤

cassiopeiax

I was taught entropy is randomness... and I thought that was it. But this man has made me think harder.

mranonymous_

Entropy in a general sense has nothing to do with order or disorder; it simply reflects that energy is more likely to spread from a concentrated or dense state to a less dense state, unless prevented by some barrier. The microstates are all equally probable; the macrostates are not. The Carnot efficiency is the ratio of work output to heat input, W/Q_h = 1 - T_c/T_h, which is always less than 1.
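
The microstate/macrostate point is easy to check by brute force. A small Python sketch, using 10 coin flips as a stand-in for a physical system (my toy example, not from the video):

    from collections import Counter
    from itertools import product

    # 10 coins: every microstate (one exact sequence of heads/tails) has the
    # same probability 1/2**10, but macrostates (total number of heads) do
    # not, because their multiplicities differ.
    N = 10
    multiplicity = Counter(sum(seq) for seq in product((0, 1), repeat=N))

    for heads in range(N + 1):
        w = multiplicity[heads]
        print(f"{heads:2d} heads: {w:4d} microstates, p = {w / 2**N:.4f}")

Running it shows 5 heads occurring in 252 of the 1024 microstates, while 0 heads occurs in only 1, which is exactly why the "mixed" macrostates dominate.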

KimberlyRPeacock

Thanks, I'm doing a physics presentation on entropy, this helped so much!

vedantsridhar

I understand it better thanks to your explanation!

lznekmc

Your video makes me want to learn more about entropy! Wish I could take that course about it LOL...

mahnoorshah

You made thermodynamics bearable... 😊😊 Thanks!

soniaalam

This is the best video I've found on this subject, seriously. Thanks!

Vitoria-bkli

On example 1:
There is a tiny probability that the shuffled deck of cards ends up back in its initial indexed state. Since there is still a possibility for the shuffled deck to come out in the same order as at the beginning, I considered "Probable to increase but no way to tell" as the first answer. Is there any apparent problem with this way of reasoning, or is it actually more accurate?
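
For scale, that tiny probability can be computed directly: under a uniformly random shuffle every ordering is equally likely, so the chance of landing on the one initial ordering is 1/52!. A quick Python check:

    import math

    # Probability that one uniformly random shuffle reproduces one particular
    # ordering (e.g., the factory order) of a 52-card deck.
    arrangements = math.factorial(52)
    print(f"52! = {arrangements:.3e}")    # about 8.066e+67
    print(f"p = {1 / arrangements:.3e}")  # about 1.240e-68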

ejyrdzp

I disagree with the deck-of-cards shuffle as an example of an increase in entropy. There is no change in entropy.

Each card is a distinct entity unique to its peers. In other words, no 2 cards are exactly alike. So, shuffling the deck would not change anything other than the order of the cards.

I do concede that if we are strictly talking about the colors or symbols of the cards, then the shuffling would increase entropy. This is due to having multiple copies of indistinguishable entities in the system.
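
That distinction can be quantified: the number of distinguishable arrangements collapses when you coarse-grain the cards down to, say, color alone. A quick Python comparison (my illustration of the commenter's point):

    import math

    # How the arrangement count depends on what counts as distinguishable.
    n = 52
    all_unique = math.factorial(n)  # every card distinct: 52!
    colors_only = math.comb(n, 26)  # only red vs. black distinct: C(52, 26)

    print(f"unique cards: {all_unique:.4e} distinguishable arrangements")
    print(f"colors only:  {colors_only:.4e} distinguishable arrangements")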

johnathant

Your explanation is so easy to understand 😌 Thank you ❤️

hyraG_