Separable nonlinear least squares (SNLLS) problems are important in many research and application fields, such as image restoration, machine learning, and system identification. Solving such problems is challenging because of their nonlinearity. The traditional gradient iterative algorithm often zigzags toward the optimal solution and is sensitive to the initial guesses of the unknown parameters. In this paper, we improve the convergence rate of the traditional gradient method by implementing a...
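To make the baseline concrete, the sketch below applies plain gradient descent to a small SNLLS-type problem: fitting y ≈ a·exp(b·x), where a enters the model linearly and b nonlinearly. This is not the paper's proposed method; the model, synthetic data, step size, and iteration count are illustrative assumptions intended only to show the fixed-step gradient iteration whose slow, initialization-sensitive behavior motivates the improvement.

```python
# Minimal sketch: fixed-step gradient descent on a separable nonlinear
# least squares problem y ≈ a * exp(b * x). All settings are assumptions
# for illustration, not the method proposed in the paper.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
a_true, b_true = 2.0, -1.5
y = a_true * np.exp(b_true * x) + 0.01 * rng.standard_normal(x.size)

def residual(a, b):
    # r_i = a * exp(b * x_i) - y_i
    return a * np.exp(b * x) - y

def gradient(a, b):
    # Gradient of the sum of squared residuals S(a, b) = sum_i r_i^2
    r = residual(a, b)
    e = np.exp(b * x)
    g_a = 2.0 * np.sum(r * e)          # dS/da
    g_b = 2.0 * np.sum(r * a * x * e)  # dS/db
    return g_a, g_b

a, b = 0.5, 0.0   # initial guess; the iteration is sensitive to this choice
step = 1e-3       # fixed step size; too large diverges, too small creeps along
for _ in range(20000):
    g_a, g_b = gradient(a, b)
    a -= step * g_a
    b -= step * g_b

print(f"estimated a = {a:.4f}, b = {b:.4f}")  # compare with a_true = 2.0, b_true = -1.5
```

With this well-conditioned toy problem the iteration eventually reaches the true parameters, but the many iterations needed at a small fixed step, and the dependence on the starting point, illustrate the behavior the proposed approach aims to improve.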