ML lec 04 - Multi-variable linear regression (linear regression with multiple input features)
Predicting an exam score:
regression using three inputs ($x_1, x_2, x_3$)
Hypothesis
$$H(x) = Wx + b$$
$$H(x_1, x_2, x_3) = w_1x_1 + w_2x_2 + w_3x_3 + b$$
Cost function
$$H(x_1, x_2, x_3) = w_1x_1 + w_2x_2 + w_3x_3 + b$$
$$cost(W,b) = {{1}\over{m}}\sum_{i=1}^m {(H(x_1^{(i)},x_2^{(i)},x_3^{(i)})-y^{(i)})^2}$$
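As a sketch, the hypothesis and cost above can be written directly in NumPy. The data values below are hypothetical, chosen only to illustrate the shapes:

```python
import numpy as np

# Hypothetical data: m = 5 instances, 3 features each
X = np.array([[73., 80., 75.],
              [93., 88., 93.],
              [89., 91., 90.],
              [96., 98., 100.],
              [73., 66., 70.]])
y = np.array([152., 185., 180., 196., 142.])

w = np.array([0.5, 0.5, 0.5])  # placeholder weights w1, w2, w3
b = 0.0

def hypothesis(X, w, b):
    # H(x1, x2, x3) = w1*x1 + w2*x2 + w3*x3 + b, for every instance at once
    return X @ w + b

def cost(X, y, w, b):
    # mean of squared errors over the m training instances
    return np.mean((hypothesis(X, w, b) - y) ** 2)
```

Writing the hypothesis as a matrix-vector product already hints at the matrix formulation developed next.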
Multi-variable
$$H(x_1, x_2, x_3) = w_1x_1 + w_2x_2 + w_3x_3 + b$$
$$H(x_1, x_2, x_3, ..., x_n) = w_1x_1 + w_2x_2 + w_3x_3 + ... + w_nx_n + b$$
Hypothesis using matrix
$$ w_1x_1 + w_2x_2 + w_3x_3$$
$$\begin{pmatrix} x_1 & x_2 & x_3 \end{pmatrix} \centerdot \begin{pmatrix} w_1 \\ w_2 \\ w_3 \end{pmatrix} = \begin{pmatrix} x_1w_1 + x_2w_2 + x_3w_3 \end{pmatrix}$$
$$\begin{pmatrix} x_{11} & x_{12} & x_{13} \\ x_{21} & x_{22} & x_{23} \\ x_{31} & x_{32} & x_{33} \\ x_{41} & x_{42} & x_{43} \\ x_{51} & x_{52} & x_{53} \end{pmatrix} \centerdot \begin{pmatrix} w_1 \\ w_2 \\ w_3 \end{pmatrix} = \begin{pmatrix} x_{11}w_1 + x_{12}w_2 + x_{13}w_3 \\ x_{21}w_1 + x_{22}w_2 + x_{23}w_3 \\ x_{31}w_1 + x_{32}w_2 + x_{33}w_3 \\ x_{41}w_1 + x_{42}w_2 + x_{43}w_3 \\ x_{51}w_1 + x_{52}w_2 + x_{53}w_3 \end{pmatrix}$$
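The product above can be checked with a small NumPy example. The numbers are hypothetical, picked so each row's weighted sum is easy to verify by hand:

```python
import numpy as np

X = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [7., 8., 9.],
              [1., 0., 1.],
              [2., 2., 2.]])      # shape [5, 3]: 5 instances, 3 features
w = np.array([[10.], [20.], [30.]])  # shape [3, 1]: one weight per feature

# Row i of the result is x_i1*w_1 + x_i2*w_2 + x_i3*w_3
H = X @ w
print(H.ravel())  # [140. 320. 500.  40. 120.]
```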
$$[5, 3] \centerdot [3, 1] = [5, 1]$$
$$[5, 3] \centerdot [?, ?] = [5, 1] \Rightarrow [?, ?] = [3, 1]$$
$$[n, 3] \centerdot [3, 1] = [n, 1]$$
Here n is the number of instances, which is not fixed in advance; in TensorFlow it is written as -1 or None.
$$H(X) = XW$$
Using matrices keeps the computation simple, no matter how many features or instances there are.
When each instance has two outputs: $[n, 3] \centerdot [?, ?] = [n, 2] \Rightarrow [?, ?] = [3, 2]$
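These shape rules can be confirmed with NumPy (in TensorFlow the variable batch dimension would be declared as None in a placeholder shape; plain NumPy just accepts any n):

```python
import numpy as np

W = np.random.rand(3, 1)           # one weight per feature, one output
for n in (1, 5, 100):              # the number of instances n is not fixed
    X = np.random.rand(n, 3)       # [n, 3]
    assert (X @ W).shape == (n, 1)  # [n, 3] x [3, 1] -> [n, 1]

W2 = np.random.rand(3, 2)          # two outputs per instance
X = np.random.rand(5, 3)
assert (X @ W2).shape == (5, 2)    # [n, 3] x [3, 2] -> [n, 2]
```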
WX vs XW
Lecture (theory):
- H(x) = Wx + b
Implementation (TensorFlow):
- H(x) = XW
In theory the hypothesis is written as $Wx + b$, but in the TensorFlow implementation it is written as $XW$ so that the data matrix $X$ (one instance per row) can be multiplied directly by the weight column vector.
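A minimal NumPy sketch of training $H(X) = XW + b$ with gradient descent on the mean-squared-error cost; the data and learning rate below are hypothetical, and the lecture's actual implementation uses TensorFlow:

```python
import numpy as np

# Hypothetical training data: 5 instances, 3 features, 1 target each
X = np.array([[73., 80., 75.],
              [93., 88., 93.],
              [89., 91., 90.],
              [96., 98., 100.],
              [73., 66., 70.]])
y = np.array([[152.], [185.], [180.], [196.], [142.]])

m = X.shape[0]
W = np.zeros((3, 1))   # weight column vector, matching H(X) = XW
b = 0.0
lr = 1e-5              # assumed learning rate, small because features are ~100

for step in range(2000):
    H = X @ W + b                  # hypothesis, shape [5, 1]
    err = H - y
    # gradients of cost = mean((H - y)^2) w.r.t. W and b
    dW = (2.0 / m) * (X.T @ err)
    db = (2.0 / m) * err.sum()
    W -= lr * dW
    b -= lr * db
```

The update rules fall straight out of differentiating the cost function defined earlier; TensorFlow would compute the same gradients automatically.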