1. Res SS (residual sum of squares) is the sum of the squared residuals, also called the sum of squared errors. This statistic measures the sum of the squared differences between the fitted values and the original data.
Reg SS (regression sum of squares) is the sum of the squared differences between the predicted values and the mean of the original data.
Total SS (total sum of squares) is the sum of the squared differences between the original data and their mean, and the three quantities are related by:
Total SS = Reg SS + Res SS
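A minimal sketch of this decomposition, using hypothetical data and an ordinary least-squares line fit (the variable names are illustrative, not from the original):

```python
import numpy as np

# Hypothetical data: y is roughly linear in x, with a little noise.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Ordinary least-squares fit of a line y_hat = a*x + b.
a, b = np.polyfit(x, y, 1)
y_hat = a * x + b

res_ss = np.sum((y - y_hat) ** 2)         # residual sum of squares
reg_ss = np.sum((y_hat - y.mean()) ** 2)  # regression sum of squares
total_ss = np.sum((y - y.mean()) ** 2)    # total sum of squares

# For a least-squares fit with an intercept, Total SS = Reg SS + Res SS.
print(np.isclose(total_ss, reg_ss + res_ss))  # True
```

Note that the decomposition holds exactly only for least-squares fits that include an intercept term.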
2. The F statistic follows an F distribution; in regression analysis it is computed as
F = (Reg SS / k) / [Res SS / (n - k - 1)]
where Reg SS and Res SS are the regression and residual sums of squares, k is the number of independent variables, and n is the number of observations.
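The F formula above can be computed directly from the two sums of squares; this sketch reuses hypothetical data with one predictor (k = 1):

```python
import numpy as np

# Hypothetical data with a single predictor, so k = 1.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
a, b = np.polyfit(x, y, 1)
y_hat = a * x + b

n, k = len(y), 1
reg_ss = np.sum((y_hat - y.mean()) ** 2)
res_ss = np.sum((y - y_hat) ** 2)

# F = (Reg SS / k) / [Res SS / (n - k - 1)]
f_stat = (reg_ss / k) / (res_ss / (n - k - 1))
print(f_stat)
```

A large F value (relative to the F distribution with k and n-k-1 degrees of freedom) indicates that the regression explains far more variation than the residual noise.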
3. R² is the coefficient of determination, also called the goodness of fit. It indicates the extent to which the model explains variation in the dependent variable as being caused by the independent variable, i.e., the ratio of the sum of squares attributable to x to the total sum of squares of y.
R² = Reg SS / Total SS = (Total SS - Res SS) / Total SS = 1 - Res SS / Total SS
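A sketch of this ratio on hypothetical data; for simple linear regression with an intercept, R² also equals the squared Pearson correlation between x and y, which gives a handy cross-check:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
a, b = np.polyfit(x, y, 1)
y_hat = a * x + b

res_ss = np.sum((y - y_hat) ** 2)
total_ss = np.sum((y - y.mean()) ** 2)

# R^2 = 1 - Res SS / Total SS
r2 = 1 - res_ss / total_ss

# Cross-check: for one predictor, R^2 equals the squared
# Pearson correlation coefficient between x and y.
r = np.corrcoef(x, y)[0, 1]
print(np.isclose(r2, r ** 2))  # True
```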
Extended data:
Statistically, the difference between a data point and its corresponding position on the regression line is called a residual, and the sum of the squared residuals is called the residual sum of squares, which reflects the influence of random error. The smaller the residual sum of squares of a set of data, the better the fit.
The coefficient of determination represents the percentage of variation in the dependent variable Y that can be explained by the independent variable X. Its normal range is [0, 1], and the closer it is to 1, the better the explanatory power of the equation.