Multiple Regression in Excel - P-Value; R-Square; Beta Weight; ANOVA table (Part 3 of 3)

Check out our new Excel Data Analysis text:

This video illustrates how to perform a multiple regression statistical analysis in Microsoft Excel using the Data Analysis ToolPak.

Multiple Regression
Regression
R-Squared
ANOVA table
Regression Weight
Beta Weight
Predicted Value

Subscribe today!

Video Transcript: And if you recall, if we use an alpha of .05, which is what we typically use and will also use in this example, then if this p-value is less than .05, that indicates the test is significant. So this value is significant, because .0004 is definitely less than .05. This indicates that the R-squared of .50 is significantly greater than zero. In other words, the variables SAT score, social support, and gender, taken as a group, predict a significant amount of variance in college GPA. And we could write that up as follows. We could say the overall regression model was significant, and then we have F(3, 26), and that comes from right here, 3 and 26, = 8.51, which is the F value reported here in the table, p < .001, and I said that because this value is smaller than .001. And I also put the R-squared here: R-squared = .50, and that of course came from right here. So you'll often see results written up like this in a research article or what have you. This is one way to express the results of the ANOVA table. So if you're reading a research article on multiple regression and you see this information, most likely this first part corresponds to the results of the ANOVA table.

OK, so these first two tables, as I said earlier, assess how well our three predictors, taken as a set, did at predicting first-year college GPA. Moving to our last table, this is where we look at the individual predictors: whether SAT score, on its own, social support, on its own, and gender, once again on its own, are significant predictors of college GPA. Now it may be that one of them is significant, two of them are, or all three of them are significant, but that's what this table assesses. As we did before, we'll use an alpha of .05 once again, so we're going to assess each of these p-values against .05. And notice that for SAT score, this p-value definitely is less than .05, so SAT is significant. For social support, this p-value, while fairly close, is also less than .05, so social support is significant as well. But notice gender, .66: that's definitely not less than .05, so gender is not significant. And that's really not that surprising, because males and females don't typically differ significantly in their college GPA, in their first year or in all four years for that matter. But I wanted to include the variable gender in this model so you can see an example of a non-significant result. So once again, this table looks at the predictors individually, and it indicates that SAT score is a significant predictor of college GPA, social support is also a significant predictor of college GPA, but gender is not a significant predictor.

Now, in this table what we're assessing is whether these predictors account for a significant amount of unique variance in college GPA. In other words, SAT score significantly predicts college GPA, so it accounts for a separate, significant part of college GPA from social support; and social support, which is also significant, accounts for a unique part of college GPA that SAT does not account for. So if a test is significant here, that means the variable accounts for a significant amount of variance in college GPA uniquely to itself. That's an important point to note, and one that's frequently confused in multiple regression.
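If it helps to see these numbers reproduced outside Excel, here is a minimal sketch in Python using statsmodels. The data are made up (the video's own dataset is not shown here), and the column names SAT, SocialSupport, Gender, and GPA are assumptions; the point is only which parts of the output correspond to the overall F test, its p-value, and R-squared.

```python
# A minimal sketch (assumed data, not the video's), reproducing what the
# Regression/ANOVA portion of Excel's Data Analysis ToolPak output reports:
# the overall F test, its p-value ("Significance F"), and R-squared.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 30  # degrees of freedom of 3 and 26 in the video imply 30 cases

# Hypothetical predictors and outcome, only so the script runs end to end
data = pd.DataFrame({
    "SAT": rng.normal(1100, 150, n),
    "SocialSupport": rng.normal(50, 10, n),
    "Gender": rng.integers(0, 2, n).astype(float),
})
data["GPA"] = (0.001 * data["SAT"] + 0.02 * data["SocialSupport"]
               + rng.normal(0, 0.3, n))

X = sm.add_constant(data[["SAT", "SocialSupport", "Gender"]])
model = sm.OLS(data["GPA"], X).fit()

# The three numbers the transcript reads off the ANOVA/model-summary tables
print(f"F({int(model.df_model)}, {int(model.df_resid)}) = {model.fvalue:.2f}, "
      f"p = {model.f_pvalue:.4f}, R-squared = {model.rsquared:.2f}")
```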
Now consider a scenario: if two predictors, like SAT score and social support, were completely and perfectly correlated at 1.0, in other words really getting at the exact same thing in college GPA, then neither of them would be significant, because neither would be accounting for any unique information in college GPA whatsoever. They would be totally redundant, and they would both come out non-significant. So if a predictor is significant here, as these both are, that tells us it accounts for a significant amount of unique variance in college GPA.

So to wrap it all up and summarize: our regression overall was significant, as we see in the ANOVA table, and the amount of variance accounted for, when the three predictors were taken as a group, was 50% of the variance, or half of the variance, which was pretty good. When we looked at the predictors individually, SAT score was a significant predictor of college GPA, as was social support, but gender was not significant. This concludes the video on multiple regression in Microsoft Excel. Thanks for watching.
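As a companion to the last table the transcript walks through, the hypothetical sketch above can be extended to show the per-predictor output. Like Excel's coefficient table, statsmodels reports one coefficient, t statistic, and p-value per predictor, and each p-value tests whether that predictor explains unique variance in GPA with the others held constant. This snippet assumes the `model` object fitted in the sketch above.

```python
# Continues the sketch above (uses the fitted `model` object). Each p-value
# here corresponds to a row of Excel's coefficient table and tests whether
# that predictor accounts for unique variance in GPA, holding the others fixed.
for name in ["SAT", "SocialSupport", "Gender"]:
    coef = model.params[name]
    p = model.pvalues[name]
    verdict = "significant" if p < 0.05 else "not significant"
    print(f"{name}: coefficient = {coef:.4f}, p = {p:.4f} ({verdict})")
```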
Comments

This is the best video ever to understand the interpretation of p-values. I have struggled all these years to understand this. God bless you for this video. I just understood this in 5 minutes. Wow! This is awesome. You deserve the salary of my Professor, trust me.

emmanuelhoustonkwashie

Best collection of videos explaining multiple regression I've seen. Thank you very much!

bigjayfitness

I am very grateful for your video. You really helped me. May God Almighty bless you abundantly.

johnmutuku

Wow, thanks. I watched through 3 videos that talked about weighting betas, only for there to be no explanation of weighting the betas.

AUSSIESERB

Hello, can you help me understand how to come up with the ANOVA table?

kennedymumba

Thanks for this useful document. I ran a multiple regression analysis with 15 input variables and 1 output variable. Only 5 came out significant. Should I keep all the variables or just the significant ones when I present the results in my paper?

Djorouh

Good day sir, I just wanted to ask: if an independent variable is not significant and does not have explanatory power in the model, but removing it lowers the adjusted R-squared, what does this imply? So far, the only reason I know of is that its t-statistic is greater than one. With this information, what can we infer?

adylmanulat

Good video. I ran my multiple regression and X1 is not significant, but when I run a univariable regression (X1 vs Y) it is significant. I've checked for collinearity between X1 and X2 and there isn't any. Note alpha = 0.05, Significance F < 0.05. Any thoughts on why this is happening?

michaeljbuckley

Great, really great explanation, compliments. I would like to know what the test hypotheses H0 and Ha were that make p-values less than 0.05 good, so that I can consider the SAT and social support variables significant. Thanks in advance.

sergiodamilano

Dear Prof, I would like to ask what scale you used to determine/prove that 0.50 or 50% is a good performance? Thank you, hoping for your response.

rhiemaamoraranza

Hello. I have a question regarding the coefficients. When we try to make a prediction for the future, we multiply these coefficients by our given values. Here, SAT score has a coefficient equal to zero. Doesn't that mean it won't affect the college GPA? Thank you.

CaliforniumTV
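On the question above: the prediction is indeed the sum of each coefficient times its value (plus the intercept), but a coefficient that is displayed as zero is often just rounded in Excel's output, and a very small coefficient on a large-scale predictor like SAT can still move the prediction noticeably. A hedged toy calculation, with invented numbers rather than the video's actual coefficients:

```python
# Invented coefficients for illustration only; increasing the decimal places
# shown for the Excel cell would reveal the true value behind a "0.00" coefficient.
intercept = 0.50
b_sat = 0.0019          # displays as 0.00 at two decimal places
b_support = 0.020
b_gender = 0.05

sat_score, support, gender = 1200, 55, 1
predicted_gpa = intercept + b_sat * sat_score + b_support * support + b_gender * gender
print(round(predicted_gpa, 2))  # the SAT term alone contributes 0.0019 * 1200 = 2.28
```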

Hello,
Thank you for this video, I just have a question to ask you :) I have Likert data and I'm wondering if I can use multiple regression to analyze it? Thanks in advance.

oumaimadaloussi

Nicely done, thanks.
Would you please explain 'Pearson Correlation' and '2-tailed' for me?

DrEvanAnatomist

Thanks a lot for sharing such a nice video series.
Sir, I am a Petroleum Engineer, working on the application of statistics for the prediction of rock properties.
Can you explain the advantages and limitations of 'Multiple Linear Regression Analysis'?
Is MLRA better than the Neural Network approach?
Kindly share some comments. Thank you.

atifismail

This is a great video! Really well explained and easy to follow. However, when I run my multiple regression all my p-values show as #NUM!, but when I do the variables one by one the p-values are normal. Does anyone know how to fix this? Thanks!

audreypalosse

Better than a post grad University lecture

jruig

Sir, please help me.
My problem is this. Another person explains his model as:
RSEI = 0.098*NDVI + 1.019*WET - 0.025*NDSI - 0.001*LST + 0.007 (R2
1. The WET must increase by 0.098 if the RSEI increases by 0.1. Nevertheless, the increased WET and the decreased LST occur at the same time.
2. Therefore, the RSEI increase will be more than 0.1. If the LST decreases by 0.098 and the WET increases by 0.098, the RSEI will increase by 0.10001.
How is that explained?
My model is:
RSEI = 0.9396*NDVI - 0.0074*WET - 0.22496*NDSI + 0.04665*LST + 0.0664 (R2 = 1)

So, 1. If the RSEI increases by 0.1, then NDVI must increase.
2. The RSEI will increase by 0.10001, then LST decreases by and NDVI increases by
How does this model explain the above two points?
Please answer me and help me.

gardeningtourfamilycelebra

Hello, I am having a problem with my regression model. I am not getting a Significance F equal to or less than 0.0005. I am getting a p-value of 0.6 something. Can you please help me out with my problem?

shuvaliniraja