Information & Entropy

Relation between entropy H(X) and information I(x), with problems based on these concepts.
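The relation in question: self-information is I(x) = -log2 p(x), and entropy is its expectation, H(X) = sum over x of p(x) I(x). A minimal sketch of both quantities; the symbol probabilities below are illustrative, not taken from the video:

```python
import math

def information(p: float) -> float:
    """Self-information I(x) = -log2 p(x), in bits."""
    return -math.log2(p)

def entropy(probs) -> float:
    """Shannon entropy H(X) = sum over x of p(x) * I(x), in bits."""
    return sum(p * information(p) for p in probs if p > 0)

# Illustrative source with symbol probabilities 1/2, 1/4, 1/8, 1/8
probs = [0.5, 0.25, 0.125, 0.125]
print(information(0.25))  # 2.0 bits: a symbol of probability 1/4
print(entropy(probs))     # 1.75 bits: the expected self-information
```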

link to my channel-

link to data structure and algorithm playlist -

link to information theory and coding techniques playlist -

link to compiler design playlist -
Comments

How did you calculate the information of the given message in the first problem?

anush
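Without the video's actual numbers at hand, the standard method is: for a message of independent symbols, the information contents simply add, so I(m) = -log2 P(m). A hedged sketch with made-up probabilities (the names and values are hypothetical, not from the first problem):

```python
import math

# Hypothetical symbol probabilities; the video's actual numbers are not shown here.
symbol_probs = {"a": 0.5, "b": 0.25, "c": 0.25}

def message_information(message: str) -> float:
    """I(m) = -log2 P(m); with independent symbols the bits simply add up."""
    return sum(-math.log2(symbol_probs[s]) for s in message)

print(message_information("abc"))  # 1 + 2 + 2 = 5.0 bits
```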

Perfect.
Sir, will you explain Abel's theorem on power series (complex analysis)?
Please, sir, it's a request.

golmolenahisidhibaatein

Hi sir, heartfelt thanks to you. Please help me by solving my problems (a numeric check of 1) and 3) is sketched below):
1) If $P(X = k) = p(1-p)^{k-1}$, $k = 1, 2, 3, \ldots$, find the entropy of $X$. Also, given that $X > K$ for a positive integer $K$, find the entropy of $X$.
2) Show that for a discrete channel $I(X; Y) \ge 0$.
3) Two sources emit messages $x_1, \ldots, x_n$ with probabilities $p_1, \ldots, p_n$ and $y_1, \ldots, y_n$ with probabilities $q_1, \ldots, q_n$. Prove that $H(X) \le -\sum_{k=1}^{n} p_k \log q_k$.

AVRgamingYT
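A sketch checking two of these numerically, not a full proof. For 1), the geometric distribution has the closed form H(X) = H_b(p)/p, where H_b is the binary entropy function; for 3), the bound is Gibbs' inequality, which holds for any two distributions on the same alphabet. The parameter p = 0.3 and the distributions P, Q are arbitrary choices for the check:

```python
import math

def binary_entropy(p: float) -> float:
    """H_b(p) = -p log2 p - (1 - p) log2 (1 - p)."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# 1) Entropy of P(X = k) = p(1 - p)^(k - 1), k = 1, 2, ...
#    Closed form H(X) = H_b(p) / p; compare with a truncated direct sum.
p = 0.3  # arbitrary parameter for the check
direct = sum(
    -(p * (1 - p) ** (k - 1)) * math.log2(p * (1 - p) ** (k - 1))
    for k in range(1, 2000)
)
print(direct, binary_entropy(p) / p)  # both ~2.9377 bits

# 3) Gibbs' inequality: -sum p_k log2 p_k <= -sum p_k log2 q_k
P = [0.5, 0.25, 0.125, 0.125]
Q = [0.25, 0.25, 0.25, 0.25]  # any other distribution on the same symbols
H = -sum(pk * math.log2(pk) for pk in P)
cross = -sum(pk * math.log2(qk) for pk, qk in zip(P, Q))
print(H, cross, H <= cross)  # 1.75 <= 2.0; equality only when P == Q
```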

Sir, are all the lectures in the same sequence as taught to us at KIIT?

utkarshvikramsingh

Why is the derivative equal to 0, sir?

krishnadasnair
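This likely refers to maximizing entropy: setting the derivative to zero locates the stationary point, which for entropy is the maximum. A sketch for the binary case, assuming that is the function being differentiated:

```python
import math

# Binary entropy H(p) = -p log2 p - (1 - p) log2 (1 - p).
# Its derivative is dH/dp = log2((1 - p) / p), which vanishes at p = 1/2,
# where H attains its maximum of 1 bit.
def dH(p: float) -> float:
    return math.log2((1 - p) / p)

print(dH(0.5))   # 0.0 -> stationary point, the maximum
print(dH(0.25))  # ~1.585 > 0 -> H is still increasing below p = 1/2
```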

Please teach this well in class too.

venu