Design Matrix Examples in R, Clearly Explained!!!


For a complete index of all the StatQuest videos, check out:

If you'd like to support StatQuest, please consider...

...or...

...buying one of my books, a study guide, a t-shirt or hoodie, or a song from the StatQuest store...

...or just donating to StatQuest!

Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on twitter:

#statquest #regression
Comments

I really enjoy these lessons. It is incredible how much free information is available online. Thank you for taking the time to make these.

nateandguitar

I believe this man is a god to me... I really am blessed to get your lectures online... far, infinitely better than the junk paid online courses. Keep doing this... I am headed in the right direction with you.

tarun

Thank you so much for making this video! Now the R output makes so much more sense to me!

cindywang

I gotta say, this series of linear regression videos is so unique and understandable. I love them. One question about this specific video: so many p-values are produced, but which one actually reflects the significance of the difference between the two lines and actually helps us reject H0? Thank you, Josh.

余长

Hey Josh! This video really changed my life! Could you make some videos on fixed and random effect models in R (the lme4 package)? You have a natural gift for making such clear explanations!! Thanks again!

Stat - Quest!
The - Best!

YouGio

Can you make a video about SVM? I would be very, very thankful!

pkl

04:40
Based on what I learned, the t-value (5.71) was obtained as (estimate of the coefficient (1.48) - 0) / (standard error of the coefficient (0.26)), and it has its own p-value (0.0023).
In the video, the p-value is obtained by comparing the fancy model (unrestricted model) and the simple model (restricted model) with an F-statistic.
Do they both produce the same p-value?

콘충이
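
A minimal sketch in R (made-up data; the variable names are assumed, not taken from the video) checking the point raised above: when the fancy model adds just one coefficient, the t-test reported by summary() and the F-test comparing the nested models give the same p-value, with F = t^2.

set.seed(1)
Weight <- rnorm(20, mean = 3)
Size   <- 1 + 1.5 * Weight + rnorm(20)

fancy  <- lm(Size ~ Weight)   # unrestricted ("fancy") model
simple <- lm(Size ~ 1)        # restricted, mean-only model

summary(fancy)$coefficients["Weight", ]   # t-value and its p-value
anova(simple, fancy)                      # F-statistic and its p-value

The two p-values match here; when the fancy model adds several coefficients at once, the F-test and the individual t-tests answer different questions.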

Thanks Josh. Great lessons as usual. This helped me a lot.

Is there any chance you can make a StatQuest about "interaction terms" and things like "regressing out" a variable?

Again, thanks so much for these videos!

taotaotan

Hi Dr. Starmer, I really enjoy your videos and the book you wrote! It's the most often read book on my shelf. I have one question about the mice example in this video. The small p-value tells us there is a significant difference between the models with and without weight, mutant status, etc. But can you conclude which model is better? I find this confusing. I look forward to your reply!

Hannal-hnch

3:46 If the slopes for the 2 types of mice are obviously different, how can we specify in the formula that we want 2 separate slopes?

Russet_Mantle
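
A hedged sketch (made-up data; Size, Weight, and Type are assumed to mirror the video's example) of one common way to get two separate slopes: include a Weight:Type interaction, which lm() writes compactly as Weight * Type.

set.seed(2)
Type   <- factor(rep(c("Control", "Mutant"), each = 15))
Weight <- runif(30, min = 1, max = 4)
Size   <- ifelse(Type == "Control", 1 + 1.0 * Weight, 2 + 2.5 * Weight) + rnorm(30, sd = 0.3)

same_slope <- lm(Size ~ Weight + Type)   # two parallel lines
own_slopes <- lm(Size ~ Weight * Type)   # Weight + Type + Weight:Type

coef(own_slopes)                # "Weight" is the Control slope; "Weight:TypeMutant"
                                # is how much the Mutant slope differs from it
anova(same_slope, own_slopes)   # tests whether the separate slopes improve the fit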

Great video! The best stats resource on the net! Thank you a lot, Josh! One short question: it looks like when we remove "TypeMutant" from the model, we are already dealing with the "Lab A mean" rather than the "Lab A CONTROL mean" (e.g. see 7:35 - 7:40), since we then have no difference between groups within a lab. Is that true?

dmitryfalkov

Hey Josh, and thank you for your incredible videos! I was wondering if you are going to make a video explaining mixed models with fixed vs. random effects? For example, how this differs from just including grouping factors as fixed effects. This could perhaps be coupled with an introduction to the lme4 package in R? Keep up the good work, it is much appreciated!

kasperfischer-rasmussen

Explained so well! I am really impressed by how you can break that material down into something so easy to follow. Please keep that up. I have one naive question: I guess for the models it would not matter if, instead of 0 and 1, we put, say, 1 and 2 into the design matrix?

simonbaum
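
A minimal sketch (made-up data; the variable names are assumed) comparing a 0/1 coding with a 1/2 coding of the same two-group variable: the group-difference coefficient and its p-value are unchanged, and only the intercept shifts, because "x = 0" now refers to a different point.

set.seed(42)
size   <- c(rnorm(10, mean = 2), rnorm(10, mean = 3.5))
code01 <- rep(c(0, 1), each = 10)   # e.g. control = 0, mutant = 1
code12 <- code01 + 1                # the same groups coded 1 and 2

summary(lm(size ~ code01))$coefficients
summary(lm(size ~ code12))$coefficients

So the two codings are equivalent for testing the group difference, but under the 1/2 coding the intercept no longer equals the control-group mean.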

Thanks for the explanation. In the first example, what is the difference between using a design matrix and converting control/mutant to 0/1 by one-hot encoding?

yuchenjia
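
A small sketch (toy factor; the names are assumed) showing the design matrix R builds by default for a two-level factor: it is essentially one-hot encoding with the reference level dropped and an intercept column added, so the remaining dummy column estimates the mutant-minus-control difference.

Type <- factor(c("Control", "Control", "Mutant", "Mutant"))
model.matrix(~ Type)
# columns (abridged): (Intercept) = 1, 1, 1, 1 and TypeMutant = 0, 0, 1, 1

Keeping both one-hot columns plus an intercept would make the matrix rank-deficient, which is why one level is treated as the reference.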

Hello Josh, first I want to thank you for putting these videos on YouTube. You are a great teacher. I have a question, though. In your explanation of lm() |> summary(), you skipped what the values in the (Intercept) row mean. I've seen other data scientists just ignore the values in the (Intercept) row, but I don't know what they mean or why we ignore them...

jamesly__vidi
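
A hedged sketch (made-up data; names assumed) of what the (Intercept) row reports under R's default treatment coding: the estimate is the mean of the reference group, and its t-test asks whether that mean equals 0, which is usually not an interesting question and is why the row is often ignored.

set.seed(3)
Type <- factor(rep(c("Control", "Mutant"), each = 10))
Size <- rnorm(20, mean = ifelse(Type == "Control", 2, 3))

fit <- lm(Size ~ Type)
summary(fit)$coefficients
mean(Size[Type == "Control"])   # matches the (Intercept) estimate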

Thanks for the great lessons. One question: in the first model, does it mean that both lines must have the same slope, in other words, be parallel to each other? If so, why?

yuchenjia

It's really helpful... Thanks! I have a question:
What about an interaction term in a DESeq2 design, like ~ condition + sample_type * treatment? What does the * mean biologically?
I really appreciate your help.

nourelislam
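
A hedged sketch of how R expands * in a formula (a DESeq2 design is an ordinary R formula, so the same expansion applies): a * b is shorthand for a + b + a:b, i.e. both main effects plus their interaction. The factors below are toy stand-ins, not the commenter's data.

sample_type <- factor(c("cell_line", "tissue", "cell_line", "tissue"), levels = c("cell_line", "tissue"))
treatment   <- factor(c("untreated", "untreated", "treated", "treated"), levels = c("untreated", "treated"))

colnames(model.matrix(~ sample_type * treatment))
# (Intercept), sample_typetissue, treatmenttreated, sample_typetissue:treatmenttreated

The interaction column lets the treatment effect differ between sample types; biologically, a nonzero interaction coefficient means the response to treatment depends on the sample type.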

I've been trying to figure out how to structure pre-planned contrasts (e.g. contrast matrices) in R rather than relying on post-hoc tests like Tukey, but I have been largely stumped as to how to actually get them to work. Do you have any pointers or advice?

zacharybrecheisen
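
A base-R sketch (made-up data; the group names and the contrast are assumptions) of one way to compute a single pre-planned contrast directly from a cell-means model rather than via post-hoc tests; dedicated packages exist, but this shows the mechanics.

set.seed(7)
group <- factor(rep(c("ctrl", "txA", "txB"), each = 8))
y     <- rnorm(24, mean = c(5, 6, 6.5)[group])

fit <- lm(y ~ 0 + group)     # cell-means coding: one coefficient per group mean
K   <- c(-1, 0.5, 0.5)       # planned contrast: average of the two treatments vs control

est  <- sum(K * coef(fit))
se   <- sqrt(drop(t(K) %*% vcov(fit) %*% K))
tval <- est / se
pval <- 2 * pt(abs(tval), df = df.residual(fit), lower.tail = FALSE)
c(estimate = est, se = se, t = tval, p = pval)

With several planned contrasts you would also want to decide on a multiplicity correction up front.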

Maybe this time I will manage to communicate my misunderstanding with R code. Thanks a lot. Why is valid345 FALSE? (3:45 is the time in the video.)

pvalfancy = 0.00367
notthelastpval = 0.00256
modelnoweight <- lm(Size ~ Type)
pvalnoweight = 0.1964

valid345 =

savvaskefalas

I am very confused by contrast coefficients in ANOVA; it would be great if StatQuest could do a video on this!

jamah