Normalization Constant for the Normal/Gaussian | Full Derivation with visualizations


The bell-shaped curve of the Normal/Gaussian distribution is the exponential of a negative parabola. Using this expression directly as a probability density function would be invalid, because the area under the curve would not equal 1. It therefore has to be divided by a normalization constant. In this video, we derive this constant and, along the way, see why a pi appears inside it.
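As a quick sanity check of the claim above (a sketch, not from the video; the parameter values are arbitrary), numerical integration shows that the unnormalized bell curve has area sigma * sqrt(2 * pi), so dividing by that constant yields a valid density:

```python
import numpy as np

# A minimal sketch (mu and sigma chosen arbitrarily for illustration):
# numerically integrate the unnormalized bell curve
# exp(-0.5 * ((x - mu) / sigma)**2) over a wide grid and compare the
# area with the normalization constant sigma * sqrt(2 * pi).
mu, sigma = 1.0, 2.0
x = np.linspace(mu - 10 * sigma, mu + 10 * sigma, 200_001)
dx = x[1] - x[0]
unnormalized = np.exp(-0.5 * ((x - mu) / sigma) ** 2)

area = unnormalized.sum() * dx  # Riemann sum; tails beyond 10 sigma are negligible
print(area)                                  # close to sigma * sqrt(2 * pi)
print(area / (sigma * np.sqrt(2 * np.pi)))   # close to 1.0
```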

-------

-------

Timestamps:
00:00 Introduction
00:16 Why do we need the normalization?
01:23 Defining and simplifying the integral
02:30 No antiderivative? - Trick
03:38 2D Representation
06:46 Rotational Symmetry
07:04 Changing to Polar Coordinates
08:54 Finding the antiderivative
10:23 Finishing the integration
11:33 Outro
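For reference, the steps in the timestamps follow the classical polar-coordinate trick, sketched here with mu = 0 (shifting by mu does not change the area):

```latex
I = \int_{-\infty}^{\infty} e^{-\frac{x^2}{2\sigma^2}} \, dx
\qquad\Longrightarrow\qquad
I^2 = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty}
      e^{-\frac{x^2 + y^2}{2\sigma^2}} \, dx \, dy

% Switching to polar coordinates (x = r\cos\theta, y = r\sin\theta),
% with the transformation factor dx\,dy = r\,dr\,d\theta:
I^2 = \int_{0}^{2\pi}\!\!\int_{0}^{\infty}
      e^{-\frac{r^2}{2\sigma^2}} \, r \, dr \, d\theta
    = 2\pi \left[ -\sigma^2 e^{-\frac{r^2}{2\sigma^2}} \right]_{0}^{\infty}
    = 2\pi\sigma^2,
\qquad\text{so}\quad I = \sigma\sqrt{2\pi}.
```

The extra factor of r from the change of variables is exactly what makes the antiderivative tractable; without it, e^{-r^2/(2\sigma^2)} has no elementary antiderivative.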
Comments

Thanks a lot for this explanation. A lot of the material you find online assumes many "basics" or things that should be obvious, like where the "I" variable comes from (obvious in hindsight, but not stated explicitly in e.g. Bishop's PRML book), or the inclusion of the y variable: since the context describes a univariate Gaussian, one assumes y is the y-axis (probability), which makes things super confusing.

Luck_x_Luck

Good explanations, and I love the intro; it's similar to the one in Lost lol

octaveraffault

It would be nice if you could talk about the computation of the normalization coefficient of the generalized Gaussian distribution (again using a change of variables), e.g. (3.56) in Bishop's book.

hengzhou

Shouldn't we have multiplied by -sigma^2/x at the end instead of -sigma^2?

danielgigliottidg

It's interesting how the normalizing constant is derived to ensure that the density integrates to one. Curious, then: why did Gauss choose exp(-0.5 * ((x - u) / s)^2) as the expression to start from when looking for a density function?

orjihvy

The video is almost self-sufficient. Too bad you didn't talk about the "transformation factor" (at 08:46).

EW-mbih