Joint Source-Channel Coding and the Separation Theorem - Information Theory Lecture 14

Lecturer: Tsachy Weissman
Professor of Electrical Engineering, Stanford University

This lecture, from Stanford's Information Theory course EE376A, covers joint source-channel coding and the separation theorem.
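As a rough orientation (a hedged sketch of the standard statement, not necessarily the lecture's exact formulation): for a stationary ergodic source with entropy rate H(S) bits per source symbol and a channel of capacity C bits per channel use, with one channel use per source symbol, the separation theorem says that designing the source code and channel code separately loses nothing asymptotically, and reliable communication is governed by the comparison

\[
  H(S) < C \;\Rightarrow\; \text{reliable transmission is achievable},
  \qquad
  H(S) > C \;\Rightarrow\; \text{reliable transmission is not achievable}.
\]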

This course is about how to measure, represent, and communicate information effectively: why bits have become the universal currency for information exchange; how information theory bears on the design and operation of modern-day systems such as smartphones and the Internet; what entropy and mutual information are, and why they are so fundamental to data representation, communication, and inference; practical compression and error correction; and relations and applications to probability, statistics, machine learning, biological and artificial neural networks, genomics, quantum information, and blockchains.
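For reference, the standard definitions behind two of those quantities, written in the usual notation (which may differ slightly from the course's): for a discrete random variable X with distribution p(x), and a pair (X, Y) with joint distribution p(x, y),

\[
  H(X) = -\sum_{x} p(x)\,\log p(x),
  \qquad
  I(X;Y) = \sum_{x,y} p(x,y)\,\log \frac{p(x,y)}{p(x)\,p(y)} = H(X) - H(X \mid Y).
\]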

Lectures will focus on intuition, applications, and ways in which communication and representation of information manifest in various areas. The material will be explored in more depth and rigor via videos of additional lectures (by the course instructors) made available to those interested. Homework and projects will be tailored to students’ backgrounds, interests, and goals. There will also be a fun outreach component.

We encourage everyone, from the techies to the literature majors, to enroll. Guaranteed learning, fun, contribution to social good, and new friendships with people from departments and schools other than your own. We’ll assume you’ve been exposed to basic probability at the level encountered in a first undergraduate course, or that you have the motivation to dedicate the first few weeks of the quarter to acquainting yourself (under our guidance) with this material.
Comments

Marawan: Audio is too low. But otherwise, great lecture.