Gradient Descent For Multiple Variables

Cost function: J(\theta) = \dfrac {1}{2m} \displaystyle \sum_{i=1}^m \left (h_\theta (x^{(i)}) - y^{(i)} \right)^2
J(\theta) = \dfrac {1}{2m} \displaystyle \sum_{i=1}^m \left (\theta^Tx^{(i)} - y^{(i)} \right)^2
J(\theta) = \dfrac {1}{2m} \displaystyle \sum_{i=1}^m \left ( \left( \sum_{j=0}^n \theta_j x_j^{(i)} \right) - y^{(i)} \right)^2
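
As a quick sketch of this cost in code (numpy here; the names X, y, theta and the convention that X carries a leading column of ones, so x_0^{(i)} = 1, are illustrative assumptions, not fixed by the notes above):

    import numpy as np

    def compute_cost(X, y, theta):
        """J(theta) = (1 / 2m) * sum((X @ theta - y)^2).

        X     : (m, n+1) design matrix, first column all ones
        y     : (m,) target values
        theta : (n+1,) parameters theta_0 .. theta_n
        """
        m = len(y)
        residuals = X @ theta - y      # h_theta(x^(i)) - y^(i) for every i
        return (residuals @ residuals) / (2 * m)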

Gradient descent:

    <span class="ql-right-eqno">   </span><span class="ql-left-eqno">   </span><img src="https://teach.sg/wp-content/ql-cache/quicklatex.com-180d38ebaaff2b28806bdd8feee8e5ee_l3.png" height="50" width="632" class="ql-img-displayed-equation quicklatex-auto-format" alt="\begin{align*} & \text{repeat until convergence:} \; \lbrace \newline  \; & \theta_j := \theta_j - \alpha \frac{1}{m} \sum\limits_{i=1}^{m} (h_\theta(x^{(i)}) - y^{(i)}) \cdot x_j^{(i)} \;  & \text{for j := 0..n} \newline \rbrace \end{align*}" title="Rendered by QuickLaTeX.com"/>

which breaks down into one update per parameter:

    <span class="ql-right-eqno">   </span><span class="ql-left-eqno">   </span><img src="https://teach.sg/wp-content/ql-cache/quicklatex.com-ade19e57866347ec87a2a0df45d0c145_l3.png" height="50" width="1189" class="ql-img-displayed-equation quicklatex-auto-format" alt="\begin{align*} & \text{repeat until convergence:} \; \lbrace \newline  \; & \theta_0 := \theta_0 - \alpha \frac{1}{m} \sum\limits_{i=1}^{m} (h_\theta(x^{(i)}) - y^{(i)}) \cdot x_0^{(i)}\newline \; & \theta_1 := \theta_1 - \alpha \frac{1}{m} \sum\limits_{i=1}^{m} (h_\theta(x^{(i)}) - y^{(i)}) \cdot x_1^{(i)} \newline \; & \theta_2 := \theta_2 - \alpha \frac{1}{m} \sum\limits_{i=1}^{m} (h_\theta(x^{(i)}) - y^{(i)}) \cdot x_2^{(i)} \newline & \cdots \newline \rbrace \end{align*}" title="Rendered by QuickLaTeX.com"/>
