[컴선설] Lec 04-2 Least Square solution

Three methods of approaching least squares

Partial derivative

  • Define the error function as the total sum of squared residuals:
\[E(a_0, a_1, a_2) = [y_1-(a_0+a_1x_1+a_2x_1^2)]^2+ [y_2-(a_0+a_1x_2+a_2x_2^2)]^2+[y_3-(a_0+a_1x_3+a_2x_3^2)]^2+[y_4-(a_0+a_1x_4+a_2x_4^2)]^2\]
  • At the minimum, each partial derivative must be zero:
\[{\partial E(a_0, a_1, a_2) \over \partial a_i} = 0\]
  • Collect the resulting equations and assemble them into a matrix equation.
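The partial-derivative method can be sketched in numpy: setting each $\partial E / \partial a_i = 0$ for a quadratic fit yields three linear equations whose coefficients are power sums of the $x_i$. The data points below are hypothetical, chosen only for illustration.

```python
import numpy as np

# Hypothetical sample points (x_i, y_i); fit y ≈ a0 + a1*x + a2*x^2.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 2.9, 9.2, 19.1])

# ∂E/∂a_i = 0 gives: sum_k x_k^i (a0 + a1 x_k + a2 x_k^2) = sum_k y_k x_k^i.
# Assemble the 3x3 system of power sums and solve it directly.
S = np.array([[np.sum(x**(i + j)) for j in range(3)] for i in range(3)])
t = np.array([np.sum(y * x**i) for i in range(3)])
a = np.linalg.solve(S, t)  # [a0, a1, a2]
```

The same coefficients come out of any standard polynomial-fitting routine, which confirms the hand-derived system.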

Matrix

  • Construct the design matrix $A$ from the given data; the least-squares solution satisfies the normal equations:

\[A^TA{\bf x} = A^T{\bf b}\]
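As a minimal sketch with the same hypothetical data, the matrix approach builds $A$ row by row and solves the normal equations $A^TA\mathbf{x} = A^T\mathbf{b}$ directly:

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])
b = np.array([1.1, 2.9, 9.2, 19.1])

# Design matrix for a quadratic model: row i is [1, x_i, x_i^2].
A = np.vander(x, 3, increasing=True)

# Solve the normal equations A^T A x = A^T b.
coef = np.linalg.solve(A.T @ A, A.T @ b)
```

In practice `np.linalg.lstsq` (which uses a more numerically stable factorization than forming $A^TA$) returns the same solution.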

Likelihood

  • Assume each model value $f(x_i)$ follows a normal distribution $N(y_i, \sigma^2)$; the likelihood is the product of the densities:
\[L(a_0, a_1) = N(y_1, \sigma^2)\,N(y_2, \sigma^2) = \alpha_1 \exp\left(-{1\over 2}\left({f(x_1)-y_1\over \sigma}\right)^2\right)\alpha_2 \exp\left(-{1\over 2}\left({f(x_2)-y_2\over \sigma}\right)^2\right)\]
  • Maximum likelihood: maximizing this probability is equivalent to minimizing the squared-residual sum in the exponent, i.e. the least-squares error.
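The equivalence can be checked numerically: up to an additive constant, the Gaussian log-likelihood is $-{1\over 2\sigma^2}\sum_i (y_i - f(x_i))^2$, so ranking parameter vectors by likelihood matches ranking them by squared error. A sketch with hypothetical data and an arbitrary $\sigma$:

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 2.9, 9.2, 19.1])
sigma = 0.5  # assumed common standard deviation

def log_likelihood(a):
    # Gaussian log-likelihood of the data under f(x) = a0 + a1 x + a2 x^2.
    f = a[0] + a[1] * x + a[2] * x**2
    return np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
                  - 0.5 * ((y - f) / sigma)**2)

def sse(a):
    # Sum of squared residuals for the same model.
    f = a[0] + a[1] * x + a[2] * x**2
    return np.sum((y - f)**2)

# The only parameter-dependent term in log_likelihood is -sse/(2*sigma^2),
# so smaller SSE always means larger likelihood.
```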

Weighted maximum likelihood approach

  • Each term $f(x_i) \sim N(y_i, \sigma_i^2)$ (each with a different standard deviation $\sigma_i$)
  • Consider a matrix $W$
\[W = \text{diag}(1/\sigma_1^2, 1/\sigma_2^2, \cdots, 1/\sigma_n^2)\] \[A^TWA{\bf x} = A^T W {\bf b}\]
  • which is the same normal-equation form as before, but with every $\sigma_i$ in the weights squared:

$\sigma_i \rightarrow \sigma_i^2$
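A minimal weighted least-squares sketch, again with hypothetical data and made-up per-point standard deviations, solving $A^TWA\mathbf{x} = A^TW\mathbf{b}$:

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])
b = np.array([1.1, 2.9, 9.2, 19.1])
# Hypothetical per-point standard deviations: later points are noisier.
sigma = np.array([0.1, 0.1, 0.5, 1.0])

A = np.vander(x, 3, increasing=True)  # rows [1, x_i, x_i^2]
W = np.diag(1.0 / sigma**2)           # weights w_i = 1/sigma_i^2

# Weighted normal equations: A^T W A x = A^T W b.
coef = np.linalg.solve(A.T @ W @ A, A.T @ W @ b)
```

Equivalently, scaling each row of $A$ and $\mathbf{b}$ by $1/\sigma_i$ and running ordinary least squares gives the same answer, since $W = (W^{1/2})^T W^{1/2}$.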

This post is licensed under CC BY 4.0 by the author.