
1 Resources

2 Linear regression in matrix form

2.1 Review: covariance and the covariance matrix

Variance: \(Var(X) = E[(X - E[X])(X - E[X])]\)

Covariance: \(Cov(X, Y) = E[(X - E[X])(Y - E[Y])]\)
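The two definitions above can be checked numerically. A minimal sketch (the sample data is hypothetical) estimating the covariance from its definition and comparing against NumPy's built-in covariance matrix:

```python
import numpy as np

# Hypothetical sample data for two variables.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 1.9, 3.2, 4.1, 4.8])

# Cov(X, Y) = E[(X - E[X])(Y - E[Y])], estimated from the sample.
cov_xy = np.mean((x - x.mean()) * (y - y.mean()))

# np.cov returns the full covariance matrix; bias=True uses the 1/n estimator
# so it matches the plain mean above.
cov_matrix = np.cov(x, y, bias=True)
assert np.isclose(cov_matrix[0, 1], cov_xy)
```

The diagonal entries of `cov_matrix` are the variances \(Var(X)\) and \(Var(Y)\); the off-diagonal entries are \(Cov(X, Y)\).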


Population model: \(y = X\beta + \epsilon\)

Assumptions:

  1. \(E(\epsilon) = 0\)
  2. \(Var(\epsilon) = \sigma^2\mathbf{I}\)

Derivation:

  1. The residual vector is \(e = y - X\hat{\beta}\) (distinct from the unobserved error \(\epsilon\))

  2. Minimize \(e^T e = (y - X\hat{\beta})^T (y - X\hat{\beta})\)

  3. \(\hat{\beta} = (X^T X)^{-1}X^Ty\), where \(X^T X\) is a \(K \times K\) matrix
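The closed-form estimator in step 3 can be computed directly with NumPy. A minimal sketch on simulated data (the design matrix and coefficients are hypothetical), cross-checked against NumPy's least-squares solver:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical design matrix with an intercept column (n = 50, K = 3).
n = 50
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.1, size=n)

# beta_hat = (X^T X)^{-1} X^T y; solving the normal equations with
# np.linalg.solve is numerically preferable to forming the explicit inverse.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Cross-check against NumPy's least-squares routine.
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
assert np.allclose(beta_hat, beta_lstsq)
```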


2.2 Gauss-Markov theorem

In the linear regression model, if the errors have zero mean, equal variance, and are mutually uncorrelated (normality and i.i.d. are not required), then the best (minimum-variance) linear unbiased estimator (BLUE, Best Linear Unbiased Estimator) of the regression coefficients is the ordinary least squares estimator.


2.3 How to interpret the covariance matrix of the residuals

Question: for \(n\) observations, where does the \(n \times n\) residual covariance matrix \(M_{n \times n}\) come from?


How do I obtain a variance for a variable with only one observation? The same goes for the off-diagonal elements: how can there be a covariance between two single observations?


3 OLS: Ordinary Least Squares

Homoscedasticity (the variance of the error term is constant)

\(\hat{\beta} = (X^T X)^{-1} X^T y\)

4 WLS: Weighted Least Squares

Heteroscedasticity (the variance of the error term varies across observations)

  1. \(Var(\epsilon) = \sigma^2\mathbf{V}\); \(V_{n \times n}\) is diagonal but with unequal diagonal elements
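With a diagonal \(V\), the WLS estimator weights each observation by the inverse of its error variance. A minimal sketch (the variance pattern \(v_i = x_i^2\) and the coefficients are hypothetical), also showing the equivalent "rescale rows, then run OLS" formulation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical heteroscedastic data: the error variance grows with x.
n = 100
x = rng.uniform(1, 10, size=n)
X = np.column_stack([np.ones(n), x])
v = x**2                      # diagonal of V: unequal variances
y = X @ np.array([2.0, 0.5]) + rng.normal(scale=np.sqrt(v))

# WLS: beta_hat = (X^T V^{-1} X)^{-1} X^T V^{-1} y, with V diagonal.
V_inv = np.diag(1.0 / v)
beta_wls = np.linalg.solve(X.T @ V_inv @ X, X.T @ V_inv @ y)

# Equivalent formulation: rescale each row by 1/sqrt(v_i), then run OLS.
w = 1.0 / np.sqrt(v)
beta_scaled, *_ = np.linalg.lstsq(X * w[:, None], y * w, rcond=None)
assert np.allclose(beta_wls, beta_scaled)
```

The rescaling view makes the connection to OLS explicit: after dividing each row by its error standard deviation, the transformed errors are homoscedastic and the Gauss-Markov conditions hold again.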

5 GLS: Generalized Least Squares

Not only heteroscedasticity: the errors may also be correlated, so \(V\) need not be diagonal

\(\hat{\beta} = (X^T V^{-1} X)^{-1} X^T V^{-1} y\)
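The GLS formula above can be applied with a full, non-diagonal \(V\). A minimal sketch assuming a hypothetical AR(1)-style error covariance \(V_{ij} = \rho^{|i-j|}\) (the coefficients and \(\rho\) are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical correlated errors: AR(1)-style covariance V_ij = rho**|i-j|.
n, rho = 60, 0.6
idx = np.arange(n)
V = rho ** np.abs(idx[:, None] - idx[None, :])

X = np.column_stack([np.ones(n), rng.normal(size=n)])
L = np.linalg.cholesky(V)          # draw errors with covariance V
y = X @ np.array([1.0, 3.0]) + L @ rng.normal(size=n)

# GLS: beta_hat = (X^T V^{-1} X)^{-1} X^T V^{-1} y.
V_inv = np.linalg.inv(V)
beta_gls = np.linalg.solve(X.T @ V_inv @ X, X.T @ V_inv @ y)

# Sanity check: with V = I the GLS formula reduces to plain OLS.
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)
```

WLS is the special case of this formula where \(V\) is diagonal, and OLS is the special case \(V = \mathbf{I}\).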

6 Practice