The Steepest Descent Algorithm, also known as the Gradient Descent Algorithm, is a method for solving general unconstrained optimization problems, i.e., minimizing a general nonlinear function.
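Starting from an initial point x_0, the method repeatedly steps in the direction of the negative gradient:

x_{k+1} = x_k - t_k * grad f(x_k),

where t_k > 0 is the step size at iteration k.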
Here I employ two test functions:

f(x, y) = (a - x)^2 + b(y - x^2)^2 (the Rosenbrock function), where I take a = 1, b = 100

f(x, y) = (x^2 + y - 11)^2 + (x + y^2 - 7)^2 (Himmelblau's function)
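For reference, here is a minimal Python sketch of the two objectives and their analytic gradients. The function names and the hand-coded gradients are my own illustration, not necessarily the report's code:

```python
import numpy as np

# Rosenbrock function with a = 1, b = 100, and its gradient.
def rosenbrock(p, a=1.0, b=100.0):
    x, y = p
    return (a - x)**2 + b * (y - x**2)**2

def rosenbrock_grad(p, a=1.0, b=100.0):
    x, y = p
    return np.array([-2.0 * (a - x) - 4.0 * b * x * (y - x**2),
                     2.0 * b * (y - x**2)])

# Himmelblau's function and its gradient.
def himmelblau(p):
    x, y = p
    return (x**2 + y - 11)**2 + (x + y**2 - 7)**2

def himmelblau_grad(p):
    x, y = p
    return np.array([4.0 * x * (x**2 + y - 11) + 2.0 * (x + y**2 - 7),
                     2.0 * (x**2 + y - 11) + 4.0 * y * (x + y**2 - 7)])
```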
I have considered two starting points:
a. (0, 0)^T
b. (pi + 1, pi - 1)^T
I try to reach their minimizers by employing the gradient descent algorithm. The algorithm requires a step size, and I utilised three of them; a sketch of the basic descent loop is given below.
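The following sketch shows the descent loop itself. The fixed step size, tolerance, and iteration cap below are illustrative defaults, not the three step-size choices actually used in the report; `himmelblau_grad` is the gradient from the sketch above:

```python
import numpy as np

def gradient_descent(grad, x0, step=1e-3, tol=1e-6, max_iter=100_000):
    """Fixed-step gradient descent: x_{k+1} = x_k - step * grad(x_k)."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # stop once the gradient is nearly zero
            break
        x = x - step * g
    return x, k

# Example: minimize Himmelblau's function from the starting point (0, 0)^T.
x_star, iters = gradient_descent(himmelblau_grad, [0.0, 0.0], step=1e-2)
print(x_star, iters)
```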
In the end, I compare the behaviour of the two functions with respect to the starting points and step sizes.