Calculate Entropy || Information Theory || Communication Systems || Problem


Here is the solution to a problem on entropy from the "Information Theory" chapter of the "Analog Communications" subject.

A discrete source transmits messages x1, x2, and x3 with probabilities 0.3, 0.4, and 0.3, respectively. The source is connected to the channel as shown in the figure. Calculate all the entropies.
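The channel transition probabilities are given only in the figure, which is not reproduced here, so the other entropies (H(Y), H(X,Y), H(X|Y), H(Y|X)) cannot be computed from the text alone. A minimal sketch of the first step, the source entropy H(X) from the stated probabilities, might look like this:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Source probabilities given in the problem statement
p_x = [0.3, 0.4, 0.3]
H_X = entropy(p_x)
print(f"H(X) = {H_X:.4f} bits/symbol")  # ≈ 1.5710 bits/symbol
```

Given the channel matrix from the figure, the same `entropy` helper applied to the joint probabilities p(xi, yj) would yield H(X,Y), and the conditional entropies follow from H(X|Y) = H(X,Y) - H(Y).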
Comments

Thanks a lot, you saved my exam grade :)

andreisir

Your explanation is very good, sir; it is very useful for me. Thank you, sir.

ganeshjami

Excellent way of teaching, sir... it inspires me a lot.

rajeshkumarsv

Thank you very much for the video. Can you suggest a textbook for studying probability matrices?

NassosKranidiotis