ISSN: 2663-2187

A Revision of Relaxed Steepest Descent, Gradient Descent, Modified Ellipsoid, Davidon-Fletcher-Powell Variable Metric, Newton's, and Fletcher-Reeves Conjugate Methods From the Dynamics on an Invariant Manifold: A Journey of Mathematical Optimization


H. Gautam, S.K. Sahani, D.N. Mandal, M.P. Pauel, and K. Sahani
DOI: 10.33472/AFJBS.6.5.2024.2175-2187

Abstract

The goal of optimization theory is to minimize an objective function subject to a set of constraints. The design, management, operation, and analysis of real-world systems depend heavily on this field, and for many decades the development of effective minimization strategies and numerical algorithms has been a vigorous research focus. The primary goal of this study is to compare the computational performance of the steepest descent method with related techniques. Gradient descent is one of the most popular methods for selecting the model that best fits the training data, i.e. the model that minimizes the loss function, for example the residual sum of squares in linear regression. Stochastic gradient descent is a stochastic, as in probabilistic, variant of gradient descent; it addresses the limitations of the gradient descent method and performs much better on large-scale datasets. In this work, we study and compare all of these approaches, i.e. the modified ellipsoid method, the Davidon-Fletcher-Powell variable metric method, Newton's method, and the Fletcher-Reeves conjugate gradient technique, within optimization theory.
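As a brief illustration of the gradient-based ideas mentioned in the abstract (this sketch is not taken from the paper), the following Python example minimizes the residual sum of squares of a linear regression model with both batch gradient descent and its stochastic variant. The learning rate, iteration counts, and synthetic data are illustrative assumptions, not values from the study.

import numpy as np

def gradient_descent_rss(X, y, lr=0.01, n_iters=1000):
    """Fit weights w by batch gradient descent on RSS(w) = ||X w - y||^2."""
    n_samples, n_features = X.shape
    w = np.zeros(n_features)
    for _ in range(n_iters):
        residual = X @ w - y                      # model error on the full data set
        grad = 2.0 * X.T @ residual / n_samples   # gradient of the mean RSS
        w -= lr * grad                            # steepest-descent step
    return w

def stochastic_gradient_descent_rss(X, y, lr=0.01, n_epochs=50, seed=0):
    """Same objective, but each update uses the gradient of a single sample."""
    rng = np.random.default_rng(seed)
    n_samples, n_features = X.shape
    w = np.zeros(n_features)
    for _ in range(n_epochs):
        for i in rng.permutation(n_samples):
            residual = X[i] @ w - y[i]
            grad = 2.0 * residual * X[i]          # noisy per-sample gradient estimate
            w -= lr * grad
    return w

if __name__ == "__main__":
    # Illustrative synthetic regression problem (assumed, not from the paper).
    rng = np.random.default_rng(42)
    X = rng.normal(size=(200, 3))
    true_w = np.array([1.5, -2.0, 0.5])
    y = X @ true_w + 0.1 * rng.normal(size=200)
    print("GD estimate :", gradient_descent_rss(X, y))
    print("SGD estimate:", stochastic_gradient_descent_rss(X, y))

With these assumed settings, both routines should return weight estimates close to the coefficients used to generate the synthetic data, with the stochastic variant trading some accuracy per step for much cheaper updates on large datasets.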
