Variance of differences of random variables | Probability and Statistics | Khan Academy


Variance of Differences of Random Variables

Missed the previous lesson?

Probability and statistics on Khan Academy: We dare you to go through a day in which you never consider or use probability. Did you check the weather forecast? Busted! Did you decide to go through the drive-through lane vs. walk in? Busted again! We are constantly creating hypotheses, making predictions, testing, and analyzing. Our lives are full of probabilities! Statistics is related to probability because much of the data we use when determining probable outcomes comes from our understanding of statistics. In these tutorials, we will cover a range of topics, some of which include: independent events, dependent probability, combinatorics, hypothesis testing, descriptive statistics, random variables, probability distributions, regression, and inferential statistics. So buckle up and hop on for a wild ride. We bet you're going to be challenged AND love it!

About Khan Academy: Khan Academy offers practice exercises, instructional videos, and a personalized learning dashboard that empower learners to study at their own pace in and outside of the classroom. We tackle math, science, computer programming, history, art history, economics, and more. Our math missions guide learners from kindergarten to calculus using state-of-the-art, adaptive technology that identifies strengths and learning gaps. We've also partnered with institutions like NASA, The Museum of Modern Art, The California Academy of Sciences, and MIT to offer specialized content.

For free. For everyone. Forever. #YouCanLearnAnything

Subscribe to KhanAcademy’s Probability and Statistics channel:
Comments

Thank you for making videos like these and making them available to us for free; it's really helpful. Thank you loads!

AloysiusPeter-dp

Please introduce the idea of constants with the random variables, and when to square the constant and when not to. This is very important to many CIE A2 stats candidates.

oneinabillion

4:26 Of that fu.. Ey, sir got intrusive thoughts right there haha

jakeaustria

9:41 that's all I wanted to hear. Thanks, Sal and team.

InderjitSingh

The problem with a lot of maths lessons is that they use tons of symbols completely detached from reality. You should use simpler language.

funnyvideosfans

E(X) = sum of all ( x * p(x) ). But what does E((X - mean)^2) mean?
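
As a concrete illustration of both expressions in this question (a minimal sketch; the small probability table below is made up): E(X) weights each value by its probability, and E((X - mean)^2) weights the squared distances from the mean the same way, which is exactly the variance.

# A made-up discrete distribution: possible values and their probabilities.
values = [1, 2, 3, 4]
probs = [0.1, 0.2, 0.3, 0.4]

# E(X) = sum of all ( x * p(x) )
mean = sum(x * p for x, p in zip(values, probs))

# E((X - mean)^2) = sum of all ( (x - mean)^2 * p(x) ), i.e. the variance
variance = sum((x - mean) ** 2 * p for x, p in zip(values, probs))

print(mean)      # 3.0
print(variance)  # 1.0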

subramanyamujamadar

Thank you sir, this site has helped me a lot in the educational phase of my life.

shahzebafroze

How can I get personalized help on a particular problem I'm doing?

phiyahfit

Would be nice if you used a number line as part of the explanation. I'm still a bit confused. I get that we want absolute distance, but if we have a variance of 10 and we want to subtract a variance of 4 from it, I don't understand why we wouldn't get a variance of 6. For instance, I have an experiment which yields a result with a variance of 10. The experiment consists of a variety of different parts. I find that removing part Y from the experiment reduces the variance of the experiment X by 4. So now my variance is 6 for the experiment. How would that work? Because it seems like we can only increase variance and never reduce it?

Duxa_

If the variables are dependent:

Var(X + Y) = E[(X + Y - (µx + µy))^2] = E[((X - µx) + (Y - µy))^2] = E[(X - µx)^2 + 2(X - µx)(Y - µy) + (Y - µy)^2] =
= E[(X - µx)^2] + 2E[(X - µx)(Y - µy)] + E[(Y - µy)^2] = Var(X) + 2Cov(X, Y) + Var(Y)

Var(X - Y) = E[(X - Y - (µx - µy))^2] = E[((X - µx) - (Y - µy))^2] = E[(X - µx)^2 - 2(X - µx)(Y - µy) + (Y - µy)^2] =
= E[(X - µx)^2] - 2E[(X - µx)(Y - µy)] + E[(Y - µy)^2] = Var(X) - 2Cov(X, Y) + Var(Y)

Note that if the variables are independent, the covariance will be 0, in which case Var(X + Y) = Var(X - Y) = Var(X) + Var(Y).
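
A quick numerical check of these identities (a sketch; NumPy, the seed, and the particular correlated pair below are illustrative choices, not from the video):

import numpy as np

rng = np.random.default_rng(0)

# Two deliberately dependent variables: Y is built partly from X.
x = rng.normal(scale=2.0, size=1_000_000)
y = 0.5 * x + rng.normal(scale=1.0, size=1_000_000)

cov_xy = np.cov(x, y)[0, 1]

# Var(X + Y) vs Var(X) + Var(Y) + 2*Cov(X, Y)
print((x + y).var(), x.var() + y.var() + 2 * cov_xy)

# Var(X - Y) vs Var(X) + Var(Y) - 2*Cov(X, Y)
print((x - y).var(), x.var() + y.var() - 2 * cov_xy)

Both pairs of printed numbers should agree to a few decimal places.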

jeppebeppebeppeson

The way I think about this is that var(-y) is nothing but the average of the squared differences between -y and its mean. Since we are squaring, (-y)^2 = y^2, therefore var(-y) is equal to var(y). Is my line of reasoning correct?

venkataramanareddyta

I don't get it. If Z = X + Y, then Y = Z - X. So if Var(Z) = Var(X) + Var(Y), then Var(Z - X) = Var(Z) - Var(X). Why does this logic not work? What am I missing here?
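
One way to see where this reasoning breaks down, sketched numerically below (NumPy and the particular normal distributions are my own illustrative choices): the rule Var(A - B) = Var(A) + Var(B) only holds when A and B are independent, and Z = X + Y is not independent of X, so the covariance term from the earlier comment has to be kept.

import numpy as np

rng = np.random.default_rng(1)

x = rng.normal(scale=1.0, size=1_000_000)    # Var(X) is about 1
y = rng.normal(scale=3.0, size=1_000_000)    # Var(Y) is about 9, independent of X
z = x + y                                    # Var(Z) is about 10

# Z and X are dependent: Cov(Z, X) = Cov(X + Y, X) = Var(X), not 0.
print(np.cov(z, x)[0, 1])                               # about 1

# General formula: Var(Z - X) = Var(Z) + Var(X) - 2*Cov(Z, X),
# which works out to Var(Y) rather than Var(Z) + Var(X).
print((z - x).var())                                    # about 9
print(z.var() + x.var() - 2 * np.cov(z, x)[0, 1])       # about 9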

ЙцукенПетрович

Please help me with the whole chapter of probability.

rimshaaqeel

Is this course, Probability and Statistics, available in PDF, Sal?

Nina-kvvn

The -1 (squared) is the same as 1, since -1 * -1 = 1. He just put it there to show that you can have a minus in front of every variable (here X and Y), and it is still the same thing.

So, for example with Y:
E((Y - E(Y))^2)
 = E((-Y + E(Y))^2)
 = E(((-1)^2) * (Y - E(Y))^2)
 = E((1) * (Y - E(Y))^2)
 = E((Y - E(Y))^2)

Roleren

8:21 where does the minus 1 squared come from?

choppera

This just looks like Chinese to me. I don't understand this concept at all! Makes me wanna cry 😭. Hurts my brain wayyyy too much.

phiyahfit

omg Sal, I would pay to be your student

pepperplume

There's a mistake with the (-1)^2.
If you simplify this: (-1)^2 * (Y + E(-Y))^2
you get: ((-1) * (Y + E(-Y)))^2 = (-Y - E(-Y))^2 = (Y + E(-Y))^2.

johnitravolta