Why Information Theory is Important - Computerphile

Zip files & error correction depend on information theory. Tim Muller takes us through how Claude Shannon's early Computer Science work is still essential today!


This video was filmed and edited by Sean Riley.


Comments

"a bit"
"a bit more"
After years of living with the pack of geniuses, he had slowly become one.

mba

EE chiming in: you stopped as soon as you got to the good part! Shannon channel capacity, equalization, error correction, and modulation are my jam. I'd love to see more communications theory on Computerphile!

Ziferten

This is the best unscripted math joke I can remember!

How surprised are you?
>A bit
One bit?

louisnemzer

The way I like to present that conclusion is to describe a population where everyone plays once.
In the case of the coin flip, if a million people play, you need, on average, to give the names of the 500k people who got tails (or heads); otherwise your description is incomplete.
In the case of the lottery, you can just say "no one won", or just give the name of the winner. So you can clearly see how much more information is needed in the first case.

LostTheGame
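
A quick numerical sketch of the comparison in the comment above (my own illustration, not from the video; the helper name surprisal_bits and the one-in-a-million lottery odds are assumptions):

```python
import math

def surprisal_bits(p):
    """Shannon information (surprisal) of an event with probability p, in bits."""
    return -math.log2(p)

# One fair coin flip carries 1 bit, so a million flips carry a million bits:
# the "who got tails" description can't be compressed away.
print(surprisal_bits(0.5))              # 1.0 bit per flip
print(1_000_000 * surprisal_bits(0.5))  # 1,000,000 bits for the whole population

# An assumed one-in-a-million lottery: the near-certain "no one won" outcome is
# almost free, the rare "this person won" outcome is expensive, and the average
# per player is tiny.
p_win = 1e-6
print(surprisal_bits(1 - p_win))        # ~0.0000014 bits: "no one won"
print(surprisal_bits(p_win))            # ~19.9 bits: naming the winner
avg = p_win * surprisal_bits(p_win) + (1 - p_win) * surprisal_bits(1 - p_win)
print(avg * 1_000_000)                  # ~21 bits to describe all million lottery players
```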

Nice. This explanation ties so elegantly to the hierarchy of text compression. I've been told many times that it's mathematically provable that there is no more efficient method, but this relatively simple explanation leaves me feeling like I understand HOW it is mathematically provable.

roninpawn

I'm not gonna lie, I didn't think this video was going to be interesting, but man, it's making me think about other applications. Thank you!

gaptastic

I believe Popper made this connection between probability and information a bit earlier, in his Logik der Forschung (1934; Shannon's "A Mathematical Theory of Communication" appeared in 1948). That's why he says that we ought to search for "bold" theories, that is, theories with low probability and thus more content. Except, at first, he used a simpler formula: Content(H) = 1 - P(H), where H is a scientific hypothesis.

Philosophers' role in the history of logic and computer science is a bit underrated and obscured, imo (see, for example, Russell's type theory).

Btw, excellent explanation. Please bring this guy on more often.

elimgarak

The reason we use the logarithm is that it turns multiplication into addition.
The chance of two independent events X and Y both happening is P(X)*P(Y), so if
entropy(X) = -log(P(X)), then
entropy(X and Y) = -log(P(X)*P(Y)) = -log(P(X)) - log(P(Y)) = entropy(X) + entropy(Y)

Double-Negative
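
A tiny check of the additivity described in the comment above (my own sketch; the fair coin and fair die are arbitrary example events, not from the video):

```python
import math

def surprisal(p):
    """Self-information -log2(p) of an event with probability p, in bits."""
    return -math.log2(p)

p_coin = 0.5   # fair coin comes up heads
p_die = 1 / 6  # fair die rolls a six

# For independent events, P(X and Y) = P(X) * P(Y), and the log turns that
# product into a sum of surprisals.
joint = surprisal(p_coin * p_die)
separate = surprisal(p_coin) + surprisal(p_die)
print(joint, separate)            # both ~3.585 bits
assert math.isclose(joint, separate)
```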

This and the Nyquist-Shannon sampling theorem are two of the building blocks of communication as we know it today. So we can say even this video is brought to us by those two :D

travelthetropics

Been seeing lots of documentary videos about Shannon lately. Thanks for sharing.

CristobalRuiz

The coffee machine right next to the computer speaks louder than any theory in this video.

Jader

Either I missed a note, there's a note upcoming, or there is no note stating that these are log_2 logarithms, not natural or common logarithms.
At 5:08, "upcoming" is the winner, giving me -log_2(1/3) = log_2(3) ≈ 1.585 bits of information.

drskelebone
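
For what it's worth, a one-line check of the figure in the comment above (assuming base-2 logs, as the commenter suggests):

```python
import math
print(-math.log2(1 / 3))  # ≈ 1.585 bits of information for a probability-1/3 event
```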

"million-to-one chances happen nine times out of ten" - Terry Pratchett

scitortubeyou

You really need to cover arithmetic coding, as it makes the relationship between Shannon entropy and compression limits much more obvious. I'm guessing this will be in a follow-up video?

gdclemo
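
The video doesn't go into arithmetic coding, but here is a minimal floating-point sketch of the idea the comment above asks about (my own illustration, not from the video or any particular library; real coders use integer ranges with renormalization, so this toy version only behaves for short messages):

```python
from collections import Counter

def build_intervals(probs):
    """Assign each symbol a sub-interval of [0, 1) proportional to its probability."""
    intervals, low = {}, 0.0
    for sym, p in probs.items():
        intervals[sym] = (low, low + p)
        low += p
    return intervals

def encode(message, probs):
    """Narrow [0, 1) once per symbol; any number inside the final interval encodes the message."""
    intervals = build_intervals(probs)
    low, high = 0.0, 1.0
    for sym in message:
        s_low, s_high = intervals[sym]
        width = high - low
        low, high = low + width * s_low, low + width * s_high
    return (low + high) / 2  # midpoint of the final interval

def decode(code, length, probs):
    """Invert the narrowing: find which symbol's interval contains the code, then rescale."""
    intervals = build_intervals(probs)
    out = []
    for _ in range(length):
        for sym, (s_low, s_high) in intervals.items():
            if s_low <= code < s_high:
                out.append(sym)
                code = (code - s_low) / (s_high - s_low)
                break
    return "".join(out)

message = "ABRACADABRA"
probs = {s: c / len(message) for s, c in Counter(message).items()}
code = encode(message, probs)
print(code)
print(decode(code, len(message), probs))  # round-trips back to the message
```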

Greenbar! Haven’t seen that kind of paper used in years.

DeanHorak

I'm reading Ashby at the moment, and we recently covered entropy. He was very heavy-handed about making sure we understood that the measure of entropy is only applicable when the states are Markovian, i.e. that the state the system is currently in is only influenced by the state immediately preceding it. Does this still hold?

Mark-dcsu

I like how there's almost every source of caffeine on the same computer desk.

Juurus

How much information/entropy is needed to encode the position of an electron in quantum theory (either before or after measurement)? What about the rest of its properties? More generally, how much information is necessary to describe any given object? And what impact does that information have on the rest of the universe?

David-idjw

I listened to it all. I hit the like button.

I did not understand it.

I loved it

adzmarsh

Boolean algebra is awesome!!! Person(Flip(2), Coin(Heads, Tails)) = Event(Choice1, Choice2) == (H+T)^2 == (H+T)(H+T) == H^2 + 2HT + T^2 (notice the coefficient orderings), where the constant coefficient is the frequency of the outcome and the exponent (order) is the number of times that identity appears in the outcome. This preserves many of the algebraic axioms that show up in expansion operations.

If you separate out the objects and states from agents by denominating any one of the elements, you can start to combine relationships and quantities with standard algebra, positional notation (I like abstraction to be used as a second quadrant, the way exponents are used in the first, to resolve differences of range in reduction operations such as derivatives), and polynomial equations to develop rich descriptions of the real world, and thus characterize geometrically the natural paths of systems and their components.

These become extraordinarily useful when you consider quantum states and number generators, which basically describe the probability of events in a world space; that allows one to rationally derive the required relationships among the events or agents involved, starting from a probability based on seemingly disjoint, i.e. coincident, phenomena, and if we employ a sophisticated field ordering, we can look at velocities of gravity to discern what the future will bring. Boolean algebra is awesome! Right up there with the place-value string system using classification of identities.

CarlJohnson-jjic