Entropy is NOT Disorder | #VeritasiumContest

This video has three goals:

1. Provide a better definition of entropy (number of ways a state can exist).
2. Explain why it's useful (dE = T dS).
3. Give an intuitive reason for why it tends towards a maximum in an isolated system.
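
In symbols (the standard forms, following Schroeder; the notation below is mine):

```latex
% Goal 1: Boltzmann's definition; \Omega counts the ways a state can exist.
S = k_B \ln \Omega

% Goal 2: the thermodynamic identity tying energy and temperature to entropy
% (at fixed volume and particle number).
dE = T\,dS
\quad\Longleftrightarrow\quad
\frac{1}{T} = \frac{\partial S}{\partial E}
```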

I'm using the Boltzmann definition of entropy in the same manner as Daniel Schroeder in his textbook.

Also, I just want to point out that most people think disorder means disorganization, like atoms scattered about randomly. In that sense, forming a crystal or separating oil and water increases organization at the cost of nearby air molecules moving slightly faster, which seems like a net increase in order. Any response that does not take this common understanding of disorder into account will be much weaker and likely incorrect.

Other videos for the Veritasium Contest about how entropy is not disorder
___________________________________________________________________________

If given the choice between losing the Veritasium contest and having everyone understand that entropy is not disorder, I would lose the contest. For this reason, I am linking to other videos in the Veritasium contest in the hopes of boosting as many entropy videos to the top 100 as possible. Also, I like their explanations.

Links
___________________________________________________________________________

#VeritasiumContest #veritasiumcontest

Transcript
___________________________________________________________________________

Entropy is not disorder. If it were, the formation of highly ordered materials such as crystals or the separation of oil and water would be impossible. Instead, entropy, which is denoted as S, connects two concepts:

First, entropy measures the number of ways a state could exist.

Say we have 100 coins. There is only one way for all coins to be heads, so this is a low entropy state. On the other hand, there are billions of billions of billions of ways for 100 coins to be half heads and half tails, so this is a high entropy state.
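
(Not part of the narration: a quick check of those counts, using Python's standard math.comb.)

```python
import math

n = 100
print(math.comb(n, n))       # all 100 heads: exactly 1 arrangement
print(math.comb(n, n // 2))  # 50 heads, 50 tails: about 1.01e29 arrangements
```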

Second, entropy measures how energy relates to temperature.

Since one number describes both, you can figure out how temperature affects a system by counting the number of ways each state could exist within a system.

So why does entropy tend to a maximum in isolated systems per the Second Law of Thermodynamics? States with higher entropy have more ways to exist, and therefore a higher probability. In other words, the Second Law says "isolated systems tend to their most likely states".
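
A quick numerical sketch of that claim (my own check, not from the video: model the system as n fair coins and use the normal approximation to the binomial distribution):

```python
import math

def p_near_half(n: int, eps: float = 0.01) -> float:
    """P(heads fraction within eps of 1/2) for n fair coins (normal approx.)."""
    # The heads fraction has mean 1/2 and standard deviation 1/(2*sqrt(n)),
    # so P(|F - 1/2| <= eps) is approximately erf(eps * sqrt(2n)).
    return math.erf(eps * math.sqrt(2 * n))

for n in (100, 10_000, 1_000_000):
    print(f"n = {n:>9}: P(within 1% of half heads) = {p_near_half(n):.4f}")
# The output climbs from ~0.16 toward 1.0: "most likely" becomes a near-certainty.
```
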
Comments
___________________________________________________________________________

This is a true banger. Masterpiece. Finally, a video about entropy using its true thermodynamic definition(s). Congrats!

bastienmassion

You explain a very common misconception very well. Your article is also excellent! I'm enjoying it.

argav

This is one of the best explanations of entropy: clear and to the point, with a good, simple example. On top of that, you correct a misstatement that most people pass on second-hand, without trying to fully understand it themselves and communicate the idea in their own words.

mrmotl

Something that took me months in uni to wrap my head around explained perfectly in 1 minute! Super awesome. Keep up the great work!

lennemo

Help stamp out the misuse of disorder as an analogy for entropy today.

reidflemingworldstoughestm

I like this video a lot: great animations, nice narration and smooth soundtrack.

I have some thoughts I wanted to share:
- The sentence you start with, "Entropy is not disorder. If it were, the formation of highly ordered materials such as crystals or the separation of oil and water would be impossible": I don't like it because there isn't a clear definition of what "disorder" is, and in the rest of your explanation you don't return to those examples to show how they relate to the actual concept of entropy.
- The part about the relation dE = T dS: I couldn't understand what you meant in the 5th paragraph of your script. I feel like that part doesn't contribute to the rest of the video.

mchlrmr

Clear and concise explanation!
It really bothers me that, the vast majority of the time, entropy is associated with disorder. I understand that it can serve as an analogy, both in thermodynamics and in information theory, but it is essentially incorrect and can keep people from correctly understanding its true essence.

EzequielSkorepa

When a system is on its way to equilibrium, the energy added to it, say by a reservoir or by work done on the system, makes its temperature increase. Higher temperature means more ability to move, so more degrees of freedom are available to the system and hence more microstates (\Omega) are accessible. These microstates are the different configurations/arrangements of the coins in your representation.
When equilibrium is reached, then by the definition of the Boltzmann entropy, \Omega is at its largest value, and so the entropy at that point is maximized.
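
A minimal sketch of that maximization, in the spirit of Schroeder's two-Einstein-solids example (the model and numbers here are illustrative assumptions, not from the video):

```python
import math

def omega(n_oscillators: int, q_quanta: int) -> int:
    """Count the ways to distribute q quanta among n oscillators (stars and bars)."""
    return math.comb(q_quanta + n_oscillators - 1, q_quanta)

nA = nB = 50   # oscillators in each solid
q_total = 100  # energy quanta shared between the two solids

# Multiplicity of the combined system for each way of splitting the energy.
best_split = max(range(q_total + 1),
                 key=lambda qA: omega(nA, qA) * omega(nB, q_total - qA))
print(best_split)  # 50: the even split has the most microstates
```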

ankidokolo

Cool animations and quite a good explanation of entropy. Good work!

AKfire

I’ve always hated the ‘dirty room’ explanation of entropy and disorder! But I’m not a physicist, just a chef. I think about entropy a lot in the kitchen. I like to imagine the system seeking equilibrium in a bowl of ice water, right after I put the hot hard-boiled eggs into it. The cooks aren’t too interested in physics, though, so I keep examples to myself 🤷‍♂️. Great quick video, you got a sub!

SkywalkerO

Leaving a tab open for a future YouTube rabbit hole. Nice.

FlyNAA

So many conflations of entropy and information theory in cognitive science, neuroscience, theory of mind, psychophysics, etc.

S.G.Wallner

Thanks, Mr. Mellor. How do we explain the separation of oil and water based on entropy? And, as we go along, how do we explain gravity, the inward-pulling, entropy-reducing phenomenon...

aadilansari

Entropy cannot be said to be the number of ways of arranging objects, unless such arrangements alter the way that energy is distributed. This ‘energy connection’ is essential. An untidy room has the same entropy as a tidy room, but the arrangements of carbon monoxide molecules in the solid state do alter the energy distribution, because they alter the opportunities the molecules have to interact with each other.
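
To put rough numbers on the carbon monoxide example (standard textbook figures, not the commenter's; the assumption is that each CO molecule can freeze into either of two orientations, CO or OC):

```latex
% N molecules, two orientations each, randomly frozen in:
\Omega = 2^{N}
\;\Rightarrow\;
S = k_B \ln 2^{N} = N k_B \ln 2,
\qquad
S_{\text{molar}} = R \ln 2 \approx 5.8\ \mathrm{J\,K^{-1}\,mol^{-1}}
```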

sentfrom

That opening mentioning crystal formation and oil-water separation doesn't really make sense to me. The states-based concept of entropy (which I am very much in favor of) has the same issues as 'disorder' in those cases. Those situations cannot be described in terms of entropy alone.

Other than that, good explanation. Higher-entropy configurations simply have a higher probability of existing, all else being equal.

travcollier

2^100 ~ 10^30, which is pretty close to a billion billion billion, I guess; not sure if you did that on purpose :)

laviekolchinsky

Could you make a video to explain if these two definitions are equivalent?

hiclh

Great Video!

Which software did you use for the animation of the particles?

Thanks!

stoicsimulation

When you say ordered things like crystals couldn't happen, doesn't that ignore that, as the crystal forms, heat is given off and disperses? So the 'disorder' of the surroundings increases (as the molecules in the crystal become more ordered). Isn't that why the free energy of a reaction is ΔG = ΔH - TΔS, where the ΔH term captures the entropy change of the surroundings from heat? So in that sense, doesn't the formation of crystals (or the existence of life) lead to more "disorder"?
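
That reasoning can be made explicit (a standard derivation, assuming constant temperature and pressure):

```latex
% Heat released by the system (-\Delta H) raises the entropy of the surroundings:
\Delta S_{\text{surr}} = -\frac{\Delta H}{T}
% Total entropy of system plus surroundings:
\Delta S_{\text{univ}} = \Delta S_{\text{sys}} + \Delta S_{\text{surr}}
                       = \Delta S_{\text{sys}} - \frac{\Delta H}{T}
% Multiplying by -T shows that \Delta G < 0 is the same statement
% as \Delta S_{univ} > 0:
\Delta G = \Delta H - T\,\Delta S_{\text{sys}} = -T\,\Delta S_{\text{univ}}
```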

GCotg

While it is true that entropy can lead to highly ordered things forming, if, statistically, the most likely states that a system can have are disordered/dispersed ones, with the end result being the heat death of the universe, then wouldn't it be correct to say that entropy tends towards disorder and dispersal?

EQuake