Link: Supervised learning
Multiple regression is the multi-dimensional version of linear regression: instead of fitting a line to one factor, we add additional factors to the equation and fit a plane (or hyperplane).
How does it work
It's quite similar to linear regression. See the same section in Linear regression for more details.
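As a sketch of the similarity: the fit is still ordinary least squares, just with more columns in the design matrix. The data below is made up for illustration (two factors with known coefficients plus a little noise).

```python
import numpy as np

# Hypothetical data: predict y from two factors x1 and x2.
rng = np.random.default_rng(0)
n = 50
X = rng.normal(size=(n, 2))  # two predictors, one per column
y = 3.0 + 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(scale=0.1, size=n)

# Add an intercept column and solve the least-squares problem,
# exactly as in simple linear regression but with more columns.
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(coef)  # approximately [3.0, 2.0, -1.5]
```

The only difference from the simple case is the number of columns in `A`; the solver and the residual calculation are unchanged.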
R-squared
The method of calculating R-squared is the same as in simple linear regression. The difference is that we now adjust R-squared for the number of parameters in the model (adjusted R-squared), because adding parameters can only make plain R-squared go up, even when the new parameter is useless.
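A minimal sketch of both quantities, using SS(mean) and SS(fit) as in the linear regression note. The toy numbers and the parameter count `p` are made up for illustration.

```python
import numpy as np

def r_squared(y, y_hat):
    ss_mean = np.sum((y - y.mean()) ** 2)  # SS(mean): variation around the mean
    ss_fit = np.sum((y - y_hat) ** 2)      # SS(fit): variation around the fit
    return 1 - ss_fit / ss_mean

def adjusted_r_squared(y, y_hat, p):
    # p = number of parameters in the model, including the intercept.
    # The (n - 1) / (n - p) factor penalizes extra parameters.
    n = len(y)
    r2 = r_squared(y, y_hat)
    return 1 - (1 - r2) * (n - 1) / (n - p)

# Hypothetical toy numbers:
y = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.2])
y_hat = np.array([1.1, 1.9, 3.0, 4.1, 5.0, 6.1])
r2 = r_squared(y, y_hat)
adj = adjusted_r_squared(y, y_hat, p=3)  # e.g. intercept + 2 slopes
print(r2, adj)  # adjusted R-squared is always <= plain R-squared
```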
F-value and convert it to P-value
Calculating the F-value and the P-value is the same, but now p_fit (the number of parameters in the fitted model) will be larger, while p_mean remains 1.
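A sketch of the F-value with the parameter counts made explicit; the toy numbers are made up. Converting F to a P-value needs the F-distribution (e.g. `scipy.stats.f.sf(F, p_fit - p_mean, n - p_fit)`), which is omitted here to keep the example numpy-only.

```python
import numpy as np

def f_value(y, y_hat, p_fit, p_mean=1):
    # Variation explained per extra parameter, divided by
    # leftover variation per remaining degree of freedom.
    ss_mean = np.sum((y - y.mean()) ** 2)  # SS(mean)
    ss_fit = np.sum((y - y_hat) ** 2)      # SS(fit)
    n = len(y)
    return ((ss_mean - ss_fit) / (p_fit - p_mean)) / (ss_fit / (n - p_fit))

# Hypothetical toy numbers:
y = np.array([3.0, 4.5, 6.1, 7.9, 10.2])
y_hat = np.array([3.1, 4.6, 6.0, 8.0, 10.0])
F = f_value(y, y_hat, p_fit=3)  # e.g. intercept + 2 slopes
print(F)  # a large F means the fit explains much more than the mean
```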
Why is it useful: compare linear vs multiple regression
We can compare simple linear regression against multiple regression, which tells us whether the new factor is worth the effort of adding it to the model.
This is done by calculating the F-value as before, but replacing SS(mean) with SS(simple) and SS(fit) with SS(multiple), and using the two models' parameter counts in place of p_mean and p_fit.
Conclusion: if the difference in R-squared between the two regressions is big and the P-value is small, then adding the new parameter is worth the trouble.
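The comparison above can be sketched end to end. The data is made up so that the second factor `x2` genuinely matters, which should produce a large F-value when comparing the two fits.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 40
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 + 1.5 * x2 + rng.normal(scale=0.5, size=n)

def fit_ss(A, y):
    # Fit by least squares and return the sum of squared residuals.
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return np.sum((y - A @ coef) ** 2)

ss_simple = fit_ss(np.column_stack([np.ones(n), x1]), y)        # SS(simple)
ss_multiple = fit_ss(np.column_stack([np.ones(n), x1, x2]), y)  # SS(multiple)

p_simple, p_multiple = 2, 3  # parameter counts of the two models
F = ((ss_simple - ss_multiple) / (p_multiple - p_simple)) / (
    ss_multiple / (n - p_multiple)
)
print(F)  # a large F suggests x2 is worth adding
```

If `x2` were pure noise, SS(multiple) would barely drop below SS(simple) and F would be small, giving a large P-value.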