When a regression model accounts for more of the variance, the data points fall closer to the regression line. In practice, you'll never see a regression model with an R-squared of 100%.
R-squared is the percentage of the dependent variable variation that a linear model explains. R-squared is always between 0 and 100%: 0% represents a model that does not explain any of the variation in the response variable around its mean, meaning the mean of the dependent variable predicts the dependent variable as well as the regression model does, while 100% represents a model that explains all of that variation.
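As a rough sketch of that definition, here is how R-squared can be computed by hand in Python with NumPy; the data are invented purely for illustration:

```python
import numpy as np

# Hypothetical data: one predictor and a response (values invented for illustration).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
y = np.array([2.1, 2.9, 4.2, 4.8, 6.1, 6.9, 8.2, 8.8])

# Fit a simple linear regression by least squares.
slope, intercept = np.polyfit(x, y, deg=1)
fitted = intercept + slope * x

# R-squared = 1 - (residual sum of squares / total sum of squares around the mean).
ss_res = np.sum((y - fitted) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
print(f"R-squared: {r_squared:.1%}")  # close to 100% for these nearly linear toy data
```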
To calculate the fitted value for each observation, the software plugs that observation's values of the independent variables into the regression equation, which contains the estimated coefficients.
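A minimal sketch of that calculation, using an invented two-predictor equation (the coefficients and observations are hypothetical):

```python
import numpy as np

# Hypothetical estimated equation: y_hat = 1.5 + 0.8*x1 + 2.0*x2
intercept = 1.5
coefficients = np.array([0.8, 2.0])

# Values of the independent variables for three observations (one row each).
X = np.array([[1.0, 0.5],
              [2.0, 1.0],
              [3.0, 1.5]])

# Plugging each observation's predictor values into the equation gives its fitted value.
fitted = intercept + X @ coefficients
print(fitted)  # -> [3.3 5.1 6.9]
```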
Statisticians say that a regression model fits the data well if the differences between the observations and the predicted values are small and unbiased. Unbiased in this context means that the fitted values are not systematically too high or too low anywhere in the observation space.
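A small check of "small and unbiased," again with made-up observed and fitted values:

```python
import numpy as np

# Hypothetical observed responses and the fitted values a model produced for them.
observed = np.array([3.0, 4.1, 5.2, 5.9, 7.1, 8.0])
fitted = np.array([3.1, 4.0, 5.0, 6.1, 7.0, 8.1])

# Residual = observation - fitted value. "Small and unbiased" means the residuals
# are close to zero and show no systematic pattern across the fitted values.
residuals = observed - fitted
print(residuals)         # roughly [-0.1  0.1  0.2 -0.2  0.1 -0.1]
print(residuals.mean())  # near zero: no overall over- or under-prediction
```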
An R-squared of 10% indicates that the relationship between your independent variable and dependent variable is weak, but it doesn't tell you the direction of that relationship.
R-squared has limitations. You cannot use R-squared to determine whether the coefficient estimates and predictions are biased, which is why you must assess the residual plots. R-squared also does not indicate whether a regression model provides an adequate fit to your data: a good model can have a low R-squared value, and a biased model can have a high one.
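To make both limitations concrete, here is a sketch with simulated data (the setup is invented for illustration): a straight line fitted to curved data yields a high R-squared even though the fit is biased, while the correct model fitted to very noisy data yields a low R-squared even though the fit is fine. Only the residuals reveal the difference.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 100)

def r_squared(y, fitted):
    return 1 - np.sum((y - fitted) ** 2) / np.sum((y - y.mean()) ** 2)

# Biased fit, high R-squared: a straight line fitted to curved data.
# The residuals are systematically positive at the ends and negative in the
# middle -- a pattern a residual plot shows but R-squared alone does not.
y_curved = x ** 2 + rng.normal(0, 2, x.size)
slope, intercept = np.polyfit(x, y_curved, deg=1)
resid_curved = y_curved - (intercept + slope * x)
print("curved data, straight-line fit:", round(r_squared(y_curved, intercept + slope * x), 2))

# Unbiased fit, low R-squared: the correct straight-line model, but noisy data.
y_noisy = 2 + 0.5 * x + rng.normal(0, 5, x.size)
slope2, intercept2 = np.polyfit(x, y_noisy, deg=1)
print("noisy data, correct model:", round(r_squared(y_noisy, intercept2 + slope2 * x), 2))
```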