What Is Entropy, Really?

Entropy is usually defined as "disorder," but that is not quite the right way to think of it. A better and more precise way is to think of it as the number of ways the microscopic components of a system can be arranged without affecting its macroscopic properties. High-entropy systems can be arranged in more ways than low-entropy systems. Often this is indistinguishable from disorder, which is why "disorder" is the simplified definition that usually gets used.
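To make the counting concrete, here is a minimal sketch of a toy coin-flip system (the coins, the value N = 100, and the use of Boltzmann's S = k·ln W are illustrative assumptions, not taken from the video): each macrostate is a total number of heads, and the macrostates reachable in more ways have higher entropy.

```python
import math

# Toy system (illustrative, not from the video): N coins, and a macrostate is
# the total number of heads. The number of microstates W for k heads is C(N, k),
# and the Boltzmann entropy in units of k_B is S = ln(W).

N = 100  # number of coins (made-up toy value)

for heads in (0, 10, 50):
    W = math.comb(N, heads)   # arrangements that give this macrostate
    S = math.log(W)           # entropy in units of k_B
    print(f"{heads:>3} heads: W = {W:.3e}, S/k_B = {S:.2f}")

# The 50-heads macrostate (the most "disordered" one) is reachable in vastly
# more ways than the 0-heads macrostate, so it has the highest entropy.
```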
Comments

Or as the 2nd law of teenagedynamics states: in any closed bedroom, entropy always increases with time.

kernicterus

I was not expecting that end sound effect 😭

Uhhhnvm

Thank you. I’ve searched many videos, and this is the first on entropy I found that I can wrap my head around.

box-botkids

"order" and "disorder" are completely arbitrary terms which don't mean anything outside of human perception, like gas particles in a room that are in a state of equilibrium still look pretty ordered to me atleast. I always find it more helpful to think about how spread the energy in a system is, especially when talking about entropy in terms of the the age of the universe. Like, matter and energy will tend to spread out evenly overtime, that's heat death.

dfsnsdfn

A neuron can have as many as 200,000 afferent (input) synapses, and at most one axon as its output channel to myriad other neurons via its efferent (output) synapses, which are afferent to the latter. That's the case for Purkinje cells in the cerebellum.
Each afferent synapse can carry in a nervous signal that may have nothing to do with the signals the other afferent synapses channel in. Therefore, there may be up to about 2²⁰⁰ ⁰⁰⁰ (roughly 10⁶⁰ ⁰⁰⁰) different neuron-stimulating patterns; given that the number of particles in the universe is of the order of 10⁸⁴, and that the output axon signal has only two states (either an action potential, or spike, or its absence), we can say a neuron does an excellent job of reducing information entropy to a minimum.

wafikiri_
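A quick check of the counting in the comment above (a sketch only; the 200,000-synapse figure and the on/off treatment of each synapse are the commenter's assumptions):

```python
import math

synapses = 200_000   # afferent synapses (figure from the comment)

# Treating each synapse as simply active or inactive gives 2**synapses input patterns.
log10_patterns = synapses * math.log10(2)
print(f"2^{synapses} is about 10^{log10_patterns:,.0f}")   # about 10^60,206

# The axon's output is a single spike or no spike, i.e. at most 1 bit (2 states),
# so the mapping from inputs to output collapses an astronomical state space.
```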

I always found these analogies/definitions of entropy to be too subjective. What feels like "order" to you is a mental preference, not clearly an objective property of the universe. So the simplest explanation I think of is that more spaced apart = more possible states = higher entropy. Less spaced apart = fewer possible states = lower entropy.

extremeheat
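The "more spaced apart = more possible states" intuition in the comment above can be sketched with a toy lattice-gas count (the particle and site numbers are made-up illustrative values):

```python
import math

n = 10                                 # particles (illustrative value)
for sites in (20, 100, 1000):          # available positions, a stand-in for volume
    W = math.comb(sites, n)            # ways to place n particles on the lattice
    print(f"{sites:>4} sites: W = {W:.3e}, S/k_B = {math.log(W):.1f}")

# More available positions means more microstates, hence higher entropy,
# matching "more spaced apart = more possible states".
```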

This is the only video that helped me understand, thank you

ghsbadgerfgb

Life could be a low-entropy but conserved pocket arising in a generally disordered milieu to reinforce more entropy for the overall system. The living organism has to work hard to stay ordered against all forces. In doing so, it extracts and processes energy from its surroundings in a way that the surroundings increase in entropy faster than before.

johnterry

I think this is what makes the abiotic synthesis of all ribonucleic monomers all the more fascinating

sauce

Entropy can also be treated as the amount of missing information in a system: the more information you have about the state of every particle in a closed system, the lower its entropy. Maxwell's demon is a good way to see entropy with this definition. It is widely used in the treatment of waves, like when sending signals with an antenna.

PenguinPotato
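For the information-theoretic reading in the comment above, a small sketch of Shannon entropy (the four-state example is purely illustrative): the less you know about which state the system is in, the higher the entropy.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Complete knowledge: one state is certain -> entropy is 0 bits.
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0
# Complete uncertainty: four equally likely states -> entropy is 2 bits (the maximum).
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```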

Finally, someone who explains what entropy is in a way I can understand. Thank you.

hopaideia

If it's spontaneous, then it's also thermodynamically favorable

TheRevelation_Official

Thanks, this is the only vid that helped me understand entropy better lol

lotobloom

Spontaneous processes tend to move towards higher entropy. But stars are spontaneous, yet the accumulation of all that hydrogen into one spot is an example of decreasing entropy. So mostly, the universe has tended towards decreasing entropy so far. The diffuse hydrogen clouds are turning into little compact dots of hot stars - all the hot particles moving into one corner of the room?

effectingcause

This is how you can stir your tea and not end up back at the beginning of making it.

TurinTuramber

Not exact: it is not possible to modify a system and claim to produce the same configuration it had before. Entropy does not reflect the loss of information in a system. Entropy is a manifestation of the irreversibility of physical phenomena; including information in entropy is a figure of speech. The information contained in even the simplest expressions of physical phenomena is immeasurable. In that sense, information about a state is a metaphor that makes an extremely complex state imaginable. If information were something fundamental to entropy, we would have entropy rates for certain phenomena. But we have no rates or constants inherent to entropy.

lopezarellanojose

Everything tends toward higher entropy, like metal rusting. In the end nothing will be left but empty space.

woodrowtaylor

Which has greater entropy, hot gas or cold gas? Since entropy always increases.

jakubkusmierczak
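One way to look at the question above, sketched with the textbook constant-volume relation ΔS = n·Cv·ln(T_hot/T_cold) for an ideal gas (the temperatures and amount below are made-up illustrative values, not from the video):

```python
import math

R = 8.314                      # gas constant, J/(mol*K)
n = 1.0                        # moles of gas (illustrative)
Cv = 1.5 * R                   # heat capacity of a monatomic ideal gas at constant volume

T_cold, T_hot = 300.0, 600.0   # kelvin (illustrative)
delta_S = n * Cv * math.log(T_hot / T_cold)
print(f"Heating from {T_cold:.0f} K to {T_hot:.0f} K changes S by +{delta_S:.2f} J/K")

# At the same volume and amount, the hotter gas has the higher entropy.
```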

I just see entropy as distribution. Higher entropy = more distributed over space.

nanashipersonne

Why is the universe not random? Or is it?

nsc