Gradient Descent

Let's say we have a function whose minimum we want to find:

$$f(x_1,x_2,...,x_i, ...,x_n)$$

where $(x_1, x_2, ..., x_i, ..., x_n)$ is the input vector $X$.

Starting from a point $X$, we can move toward a (local) minimum by iteratively applying the update:

$$X_m = X - \lambda \nabla{f}$$

where $\lambda > 0$ is the step size (the learning rate), which controls how far each update moves. Written out component by component:

$$\begin{aligned} x_{m1} &= x_1 - \lambda \frac{\partial f}{\partial x_1}\\ x_{m2} &= x_2 - \lambda \frac{\partial f}{\partial x_2}\\ &\ \,\vdots\\ x_{mi} &= x_i - \lambda \frac{\partial f}{\partial x_i}\\ &\ \,\vdots\\ x_{mn} &= x_n - \lambda \frac{\partial f}{\partial x_n} \end{aligned}$$
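To make the update rule concrete, here is a minimal sketch in Python. The example function `f`, the step size, and the iteration count are illustrative choices, not part of the derivation above; the gradient is approximated with central differences so the sketch works for any differentiable $f$.

```python
import numpy as np

def numerical_gradient(f, x, h=1e-6):
    """Approximate the gradient of f at x with central differences."""
    grad = np.zeros_like(x)
    for i in range(len(x)):
        step = np.zeros_like(x)
        step[i] = h
        grad[i] = (f(x + step) - f(x - step)) / (2 * h)
    return grad

def gradient_descent(f, x0, lam=0.1, iterations=100):
    """Iteratively apply X <- X - lambda * grad f(X)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iterations):
        x = x - lam * numerical_gradient(f, x)
    return x

# Example: f(x1, x2) = (x1 - 1)^2 + (x2 + 2)^2 has its minimum at (1, -2).
f = lambda x: (x[0] - 1) ** 2 + (x[1] + 2) ** 2
print(gradient_descent(f, x0=[5.0, 5.0]))  # approximately [1., -2.]
```

Each iteration subtracts $\lambda$ times the gradient from the current point, exactly as in the component-wise formulas above; after enough iterations the point settles near a local minimum.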

Error function

[Interactive figure: a draggable grid showing the error function in 3D space, alongside its partial derivative with respect to $y$ and its partial derivative with respect to $x$.]
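The interactive plot cannot be reproduced in text; as a stand-in, the following sketch samples a hypothetical two-variable error function and its two partial derivatives on a grid, which is the data a figure like the one above would visualize. The choice of $f(x, y) = x^2 + y^2$ is an assumption for illustration only.

```python
import numpy as np

# Hypothetical error function; the exact surface from the interactive
# demo is not specified, so a simple paraboloid stands in here.
def f(x, y):
    return x ** 2 + y ** 2

# Its partial derivatives, computed by hand:
# df/dx = 2x and df/dy = 2y.
def df_dx(x, y):
    return 2 * x

def df_dy(x, y):
    return 2 * y

# Sample the surface and both partial derivatives on a small grid,
# mirroring the three panels of the interactive figure.
xs, ys = np.meshgrid(np.linspace(-2, 2, 5), np.linspace(-2, 2, 5))
print(f(xs, ys), df_dx(xs, ys), df_dy(xs, ys), sep="\n\n")
```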