Communications on Applied Mathematics and Computation, 2024, Vol. 6, Issue 2: 1299-1318. DOI: 10.1007/s42967-023-00327-0

• ORIGINAL PAPERS •

Anderson Acceleration of Gradient Methods with Energy for Optimization Problems

Hailiang Liu1, Jia-Hao He2, Xuping Tian1   

  1. Department of Mathematics, Iowa State University, Ames, IA, USA;
    2. Department of Agricultural and Biosystems Engineering, Iowa State University, Ames, IA, USA
  • Received: 2022-11-15 Revised: 2023-09-14 Accepted: 2023-09-22 Online: 2023-12-28 Published: 2023-12-28
  • Contact: Hailiang Liu, E-mail: hliu@iastate.edu
  • Supported by:
    This research was partially supported by the National Science Foundation (Grant No. DMS-1812666).

Abstract: Anderson acceleration (AA) is an extrapolation technique designed to speed up fixed-point iterations. For optimization problems, we propose a novel algorithm that combines AA with the energy adaptive gradient method (AEGD) [arXiv:2010.05109]. Although AEGD is not a fixed-point iteration, the convergence theory for AEGD ensures the feasibility of our algorithm. We provide rigorous convergence rates of AA for gradient descent (GD), quantified by an acceleration factor given by the gain at each implementation of AA-GD. Our experimental results show that the proposed AA-AEGD algorithm requires little hyperparameter tuning and exhibits markedly fast convergence.
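To make the construction concrete, below is a minimal Python sketch of type-II Anderson acceleration applied to the GD fixed-point map g(x) = x - eta*grad(x). It illustrates generic AA-GD only; the names aa_gd, grad, and the window size m are illustrative choices, and the sketch is not the authors' exact AA-AEGD scheme, in which the plain gradient step would be replaced by an AEGD update.

import numpy as np

def aa_gd(grad, x0, eta=0.1, m=5, iters=100):
    # Type-II Anderson acceleration of gradient descent (a sketch,
    # not the paper's AA-AEGD). AA is applied to the GD fixed-point
    # map g(x) = x - eta*grad(x) with memory window m.
    def g(x):
        return x - eta * grad(x)

    xs = [np.asarray(x0, dtype=float)]   # iterate history x_0, x_1, ...
    gs = [g(xs[0])]                      # map history g(x_0), g(x_1), ...
    x = gs[-1]                           # plain GD step to initialize
    for k in range(1, iters):
        xs.append(x)
        gs.append(g(x))
        mk = min(m, k)                   # current window size
        # residuals f_i = g(x_i) - x_i over the window
        F = np.column_stack([gs[i] - xs[i] for i in range(k - mk, k + 1)])
        dF = np.diff(F, axis=1)          # residual differences
        dG = np.diff(np.column_stack(gs[k - mk:k + 1]), axis=1)
        # least-squares coefficients: minimize ||f_k - dF @ gamma||_2
        gamma, *_ = np.linalg.lstsq(dF, F[:, -1], rcond=None)
        x = gs[-1] - dG @ gamma          # extrapolated (AA) iterate
    return x

# Example use: minimize the quadratic f(x) = 0.5 * x^T A x (minimizer 0)
A = np.diag([1.0, 10.0, 100.0])
x_star = aa_gd(lambda x: A @ x, x0=np.ones(3), eta=0.005, m=3, iters=200)

For a quadratic objective, AA-GD behaves much like GMRES on the gradient system, which is why the least-squares extrapolation over the last few residuals can markedly outpace plain GD on ill-conditioned problems.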

Key words: Anderson acceleration (AA), Gradient descent (GD), Energy stability