Is ENTROPY Really a 'Measure of Disorder'? Physics of Entropy EXPLAINED and MADE EASY

This is how I personally wrapped my head around the idea of entropy! I found the statistical mechanics explanation much easier to grasp than the thermodynamics (original) one.

Hey everyone, I'm back with another video, and this one has been highly requested! I really enjoy making videos about thermodynamics because making them helps me understand the topic more deeply. I'll be hosting another poll to see which area of physics you want me to talk about in a future video.

In this video, we're talking about entropy. More specifically, we're talking about the definition of entropy that deals with systems on the microscopic level - looking at the particles a system is made of, rather than just the pressure, volume, and temperature of the entire system. Believe it or not, these are two different approaches in physics, each with its own merits. While the original (classical thermodynamics) definition of entropy looked at the system as a whole, we didn't get a deeper understanding of entropy until we looked at the small scale and introduced this statistical mechanics definition.

We start by considering an abstract system consisting of some number of particles in a box. These particles can occupy specific energy levels (meaning each particle can carry a specific, discrete amount of energy). Given these restrictions, as well as two measurements we can make on the system - (a) how many particles there are, and (b) what the total energy of the system is - we can work out all the possible ways the particles can be arranged among their energy levels. This is really important, because all the possible arrangements are known as the possible microstates (sometimes written as "micro states") of the system. The entropy of the system depends directly on the number of microstates.
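This counting exercise can be sketched in a few lines of code. The numbers here are illustrative assumptions, not the video's exact example: 3 distinguishable particles, energy levels worth 0 to 3 units each, and a measured total energy of 3 units. The code enumerates every arrangement consistent with those two measurements.

```python
from itertools import product

# Toy system (illustrative assumptions, not the video's exact numbers):
# 3 distinguishable particles, each sitting in an energy level worth
# 0, 1, 2 or 3 units, with the measured total energy fixed at 3 units.
LEVELS = [0, 1, 2, 3]
N_PARTICLES = 3
TOTAL_ENERGY = 3

# A microstate assigns one energy level to each particle; we keep only
# the assignments consistent with the measured total energy.
microstates = [
    state for state in product(LEVELS, repeat=N_PARTICLES)
    if sum(state) == TOTAL_ENERGY
]

print(len(microstates))  # → 10 possible microstates
```

Changing the particle count or the total energy changes this count, which is exactly why those two measurements pin down the set of possible microstates.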

This also brings us to the common description of entropy as a "measure of disorder". If a system has many possible microstates, its particles can be arranged in many ways, and its entropy is larger. Such systems are called "disordered" because of the large number of ways they could be arranged. Systems with fewer possible microstates are more "ordered", and they have smaller values of entropy. Hence, entropy is a "measure of disorder": the more possible microstates, the larger the entropy.
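That direct dependence is Boltzmann's entropy equation, S = k_B ln Ω, where Ω is the number of possible microstates. A minimal sketch of how "more microstates means more entropy" plays out numerically:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin

def boltzmann_entropy(n_microstates: int) -> float:
    """Boltzmann's equation: S = k_B * ln(Omega)."""
    return K_B * math.log(n_microstates)

# One microstate: a perfectly "ordered" system has zero entropy.
print(boltzmann_entropy(1))  # → 0.0

# More microstates -> more "disorder" -> larger entropy.
print(boltzmann_entropy(10) < boltzmann_entropy(1000))  # → True
```

The logarithm also means that entropies of independent systems add, even though their microstate counts multiply.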

Thank you all for watching this video, and please do check out the little document I've written (linked above) and attempt the questions in it. Keep an eye out for the video I'm going to make walking through all the solutions too.

If you want to check out some music I'm making, head over to my second channel, Parth G's Shenanigans. Follow me on Instagram @parthvlogs. I'll see you really soon!
Comments


As always, thanks so much for your support! :)

ParthGChannel

Hello, I'm a physicist (PhD student) myself, and I'm positively surprised to have discovered this small channel. I like the quality and dedication you put into your videos in order to explain physics in an understandable manner. Everything was very neatly explained and illustrated. I'm delighted to see you explaining the energy scale subject AND clarifying that it is not about actual height levels in a box, because I very often see non-scientists confuse those. Thank you for that.
Now for some criticism:
Of course, everything makes perfect sense to me as a physicist, but I would like to see more explanation of why you set up those seemingly arbitrary rules. For example, a normal person might not know why the energy of a system is such an important parameter, or why you construct your box in such a way that it remains constant. Also, explaining what an "isolated" system or "thermal equilibrium" means in this context might help a lot. Finally, I somewhat missed the conclusion for the macroscopic world from this definition of entropy - for example, what a system with many possible microstates actually means. It is also a missed opportunity, in my opinion, not to touch on the subject of microstate transitions, because this is fundamentally the reason for disorder: the system tends to spread out into a variety of microstates, AND it is in general impossible to measure all of those at once - hence the disorder, i.e. lack of information.
But all in all, very well made video, 9/10. You've earned my subscription, and I'm eager to see more good content!

antonk.

Parth: hottest day reaching 35 degrees
me, who has gone through the 50-degree heat in Delhi: u gotta pump those numbers up, those are rookie numbers.
😂😂
LOVED THE VIDEO AS ALWAYS. THANKS FOR CONSTANTLY INCREASING MY LOVE FOR PHYSICS.

harshbhogal

Man, I wish my physics textbooks were like your document. I didn't really have much trouble solving the problems after your explanation; I don't know if our physics textbooks are at fault or our teachers at school.

cosmovate

I've been dealing with this term for 5 years of my studies and nobody has explained it to me like this. As always, I love to learn from you, sir. Thank you

ShubhamSharma-rtsi

As a 3rd year physics student, these documents are brilliant. We learnt some Thermal & Statistical physics last semester, but you have done a brilliant job explaining Boltzmann's entropy equation. If you keep the same format of the LaTeX document for each subject you eventually work on, this could build up into a wonderful learning resource. I would support a Patreon if you started one. In every video I have watched from your channel so far, you explain the topic very clearly.

Zehkari

Just found this - it is so helpful. The 'measure of disorder' definition seemed so vague until you explained what it means. An accessible video that gives real insights.

suemiller

I recently discovered you and I cannot believe you only have 78k subs considering the quality of your videos and the work you put into them.

sozo

Amazing video! Heading straight to the worksheet. Thank you~^^


Edit:
Okay, so here I am to share my thoughts. I thoroughly enjoyed the 8 pages. The questions were quick to solve after watching the video. I was stuck on the equation question, but your solution was handy, so I was able to get around it easily.
Once again thank you so much~

And yes "indistinguishability" is indeed a long word. Lol!

박로이-zx

As soon as anyone says _"entropy is a difficult subject to get your head around"_ I get very suspicious about the motive for saying it, because *it's not* if you just tell the *truth!* Much as I enjoy your clear presentation, Parth, I need to say what is missing here, which is actually the reason why entropy is difficult to understand now, because it didn't use to be.

1) The *microstates* of a system are not only the arrangements of energy levels; they also include all possible *positions* of the particles!
2) Omega is *not* only the total number of microstates of the system; it is usually less, depending on the choice of *macrostate!*
3) The assumption Omega = total number of microstates is required to derive the Clausius equation for entropy change, as is the inclusion of the Boltzmann constant and the use of the natural logarithm.
4) Richard Feynman would teach Boltzmann's equation setting k = 1 and using log (base 10), so he had s = log(Omega), and that is still a valid entropy. But this means entropy actually has *no units* - it's just a number! Which is the truth.

What all this means is that you have been taught entropy from a *naturalistic* point of view, which omits certain facts that end up making entropy confusing. You really need to question things more.

The original Boltzmann equation was easy to interpret as a *measure of disorder* because order is *intuitive*: the assumption that *all microstates are equally likely* actually dictates that the process is *random*, and random processes *always create more disorder*, which is the normal perception. *BUT* doing that makes, as James Jeans put it, *entropy purely subjective*, because the *macrostate is a choice of the observer*, and that makes entropy *subjective.* In fact, just like the term probability, entropy does not exist outside of a *mind* - it is *not of nature!* And that is a big problem for naturalists, who can't accept anything which is not of nature, and minds certainly are not. Sorry, but this must be said.
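Point (4) above can be checked numerically: the conventional S = k_B ln Ω and the dimensionless Feynman-style s = log10 Ω differ only by the constant factor k_B ln 10, so the choice of constant and base is a rescaling, not different physics. A quick sketch (Ω = 1000 is an arbitrary example):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K
omega = 1000        # any number of microstates will do

s_si = K_B * math.log(omega)  # conventional S = k_B ln(Omega), in J/K
s_pure = math.log10(omega)    # Feynman-style s = log10(Omega), unitless

# Same quantity, different scale: S = (k_B * ln 10) * s
print(math.isclose(s_si, s_pure * K_B * math.log(10)))  # → True
```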

mikebellamy

Superb video
Please make a video on the Copenhagen interpretation

lakshthaker

Knows it will be an amazing video without watching it completely.

aryansharma-fggl

The whole purpose of your channel is beautiful; it has given me a lot of different ways to approach topics that I find hard to grasp.
Thank you! ❤️

lysiri

Amazing video, sir. Entropy is in our 11th standard chemistry syllabus, and your explanation gave me an overview before starting the chapter. Thank you, and kindly consider making a video on the special theory of relativity.

anjalibhattacharyya

Entropy was a concept which no one could explain to me in such simple terms for a long time. Your every video has helped me increase my interest and knowledge in physics in a fun way. Waiting for more videos on Quantum Mechanics as well!!

AstronomywithManas

Man, you are a genius. You explain such difficult things in such an easy way. It becomes so clear to me. Thank you!
I'd love to hear more about the classical definition of entropy and how it corresponds with Boltzmann entropy.

AleKaledin

Wow... You're turning out to be my favourite youtube channel...
I really enjoyed the video, and also the document you made. I hope you will keep the upcoming videos and documents free (at least the solutions, as you mentioned in the conclusions), because otherwise I won't be able to access them. However, I understand you want your work to be very valuable. Anyway, do whatever you think is best and keep making these fantastic videos ;)

franadillongalvez

when he said " we have a box." idk why my brain said "we put a cat in it"

harshbhogal

Sounds like Emmy Noether had something to say about entropy in abstract algebra. Your examples remind me of symmetries in abstract algebra. But then again, symmetry is orderly: 6 possibilities for turning a triangle while it still remains symmetric, vs 6 ways of arranging your particles in a 5E box.

Papa_and_son

Before watching this video, I watched what I assumed would be an easier watch, a TED Talk, and your video was by far much, much better. Thank you and good job!

reneecampbell