Consider the linear regression model y_i = \beta_0 + \beta_1 x_{i1} + \cdots + \beta_p x_{ip} + \varepsilon_i

fitted to a data set (x_{i1}, x_{i2}, \ldots, x_{ip}, y_i), i = 1, \ldots, n, of n observations. The \beta_j are the coefficients and \varepsilon_i is the error term.

\bar{y} = \frac{1}{n}\sum_{i=1}^{n} y_i denotes the mean of y; y_i - \bar{y} is the deviation (note: the deviation, not the variance); \hat{y}_i denotes the fitted (predicted) value of y_i.

The total sum of squares (TSS) = the explained sum of squares (ESS) + the residual sum of squares (RSS), where

  TSS = \sum_{i=1}^{n} (y_i - \bar{y})^2

  ESS = \sum_{i=1}^{n} (\hat{y}_i - \bar{y})^2, \qquad RSS = \sum_{i=1}^{n} (y_i - \hat{y}_i)^2
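As a quick numerical illustration of these three sums, here is a minimal sketch that fits a line by OLS and computes TSS, ESS, and RSS; the data values are made up for the example.

```python
import numpy as np

# Hypothetical example data (assumed values, for illustration only)
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 2.0, 3.0, 5.0])

X = np.column_stack([np.ones_like(x), x])         # design matrix with intercept column
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)  # OLS fit
y_hat = X @ beta_hat                              # fitted values

y_bar = y.mean()
TSS = np.sum((y - y_bar) ** 2)      # total sum of squares
ESS = np.sum((y_hat - y_bar) ** 2)  # explained sum of squares
RSS = np.sum((y - y_hat) ** 2)      # residual sum of squares

print(TSS, ESS, RSS)  # TSS equals ESS + RSS because the fit includes an intercept
```

For this data the fitted line is \hat{y} = 0.8 + 1.3x, giving TSS = 8.75, ESS = 8.45, RSS = 0.30, so the decomposition holds exactly.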


Application in ordinary least squares (OLS)

In matrix form, the model is

  y = X\beta + \varepsilon

The estimate of \beta: \hat{\beta} = (X^{\top} X)^{-1} X^{\top} y

The residual vector e = y - \hat{y} = y - X\hat{\beta}, so

  RSS = e^{\top} e = (y - X\hat{\beta})^{\top} (y - X\hat{\beta})
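The closed-form estimate and the residual vector can be sketched as follows; the sample size, coefficients, and noise level are illustrative assumptions.

```python
import numpy as np

# Made-up data: intercept plus p = 2 predictors (all values are assumptions)
rng = np.random.default_rng(0)
n, p = 20, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])  # first column all ones
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.1, size=n)

# beta_hat = (X^T X)^{-1} X^T y; solving the normal equations is
# numerically preferable to forming the inverse explicitly
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

e = y - X @ beta_hat  # residual vector e = y - X beta_hat
RSS = e @ e           # RSS = e^T e
```

A useful sanity check is that X^{\top} e = 0 by the normal equations, which the fitted residuals satisfy up to floating-point error.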

Let \bar{y}\mathbf{1} denote the vector each of whose entries equals the mean of y; then

  TSS = (y - \bar{y}\mathbf{1})^{\top} (y - \bar{y}\mathbf{1}) = \sum_{i=1}^{n} (y_i - \bar{y})^2

Writing y - \bar{y}\mathbf{1} = (y - \hat{y}) + (\hat{y} - \bar{y}\mathbf{1}) and expanding, we get

  TSS = \|y - \hat{y}\|^2 + \|\hat{y} - \bar{y}\mathbf{1}\|^2 + 2\, e^{\top} (\hat{y} - \bar{y}\mathbf{1}) = RSS + ESS + 2\, e^{\top} (\hat{y} - \bar{y}\mathbf{1})

with cross term

  2\, e^{\top} (\hat{y} - \bar{y}\mathbf{1}) = 2\, e^{\top} \hat{y} - 2\bar{y}\, e^{\top} \mathbf{1}

TSS = ESS + RSS holds if and only if this cross term vanishes, i.e. when e^{\top} \hat{y} = 0 and the sum of the residuals e^{\top} \mathbf{1} = \sum_{i=1}^{n} e_i = 0.

Since

  X^{\top} e = X^{\top} (y - X\hat{\beta}) = X^{\top} y - X^{\top} X (X^{\top} X)^{-1} X^{\top} y = 0,

every column of X is orthogonal to e; in particular e^{\top} \hat{y} = e^{\top} X \hat{\beta} = 0.

If the first column of X is all ones, the first element of X^{\top} e is \sum_{i=1}^{n} e_i, which therefore equals 0.

Hence the conditions above hold, and TSS = ESS + RSS.
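The role of the intercept column can be checked empirically. The sketch below fits the same made-up data with and without a column of ones; the data-generating values are assumptions for illustration.

```python
import numpy as np

# Illustrative data (assumed coefficients and noise level)
rng = np.random.default_rng(1)
x = rng.normal(size=30)
y = 3.0 + 2.0 * x + rng.normal(scale=0.5, size=30)

def decompose(X, y):
    """Fit OLS and return TSS, ESS, RSS, and the sum of the residuals."""
    beta = np.linalg.solve(X.T @ X, X.T @ y)
    y_hat = X @ beta
    e = y - y_hat
    TSS = np.sum((y - y.mean()) ** 2)
    ESS = np.sum((y_hat - y.mean()) ** 2)
    RSS = e @ e
    return TSS, ESS, RSS, e.sum()

# With a column of ones the residuals sum to zero, so TSS = ESS + RSS.
T1, E1, R1, s1 = decompose(np.column_stack([np.ones_like(x), x]), y)
# Without it the residual sum is generally nonzero and the identity fails.
T2, E2, R2, s2 = decompose(x[:, None], y)
```

Note that e^{\top} \hat{y} = 0 holds in both cases (it follows from X^{\top} e = 0 alone); it is only the e^{\top} \mathbf{1} = 0 condition that requires the intercept column.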


Mean squared error (MSE)

  MSE = \frac{RSS}{n} = \frac{1}{n} \sum_{i=1}^{n} (y_i - \hat{y}_i)^2

Dividing by the residual degrees of freedom instead gives an unbiased estimate of the error variance:

  \hat{\sigma}^2 = \frac{RSS}{n - p - 1}
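A short sketch of both quantities, using made-up data with one predictor (so p = 1 and the degrees-of-freedom divisor is n - 2); all values are illustrative assumptions.

```python
import numpy as np

# Illustrative data: one predictor, true error std 0.3 (assumed values)
rng = np.random.default_rng(2)
n = 50
x = rng.normal(size=n)
y = 1.0 + 0.5 * x + rng.normal(scale=0.3, size=n)

X = np.column_stack([np.ones(n), x])              # intercept + one predictor (p = 1)
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)  # OLS fit
e = y - X @ beta_hat
RSS = e @ e

mse = RSS / n               # MSE = RSS / n
sigma2_hat = RSS / (n - 2)  # RSS / (n - p - 1): unbiased estimate of the error variance
```

With the true error variance 0.3^2 = 0.09, the corrected estimate sigma2_hat should land near 0.09, while RSS / n is slightly biased downward.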
