Communications on Applied Mathematics and Computation ›› 2024, Vol. 6 ›› Issue (2): 790-810. DOI: 10.1007/s42967-022-00239-5

• ORIGINAL PAPERS •

Global Solutions to Nonconvex Problems by Evolution of Hamilton-Jacobi PDEs

Howard Heaton1, Samy Wu Fung2, Stanley Osher3   

  1. Typal Research, Typal LLC, Los Angeles, USA;
    2. Department of Applied Mathematics and Statistics, Colorado School of Mines, Golden, USA;
    3. Department of Mathematics, UCLA, Los Angeles, USA
  • Received: 2022-06-01 Revised: 2022-10-08 Accepted: 2022-11-21 Online: 2023-03-20 Published: 2023-03-20
  • Contact: Samy Wu Fung, E-mail: swufung@mines.edu
  • Supported by:
    This work greatly benefited from feedback by anonymous reviewers. HH, SWF, and SO were partially funded by AFOSR MURI FA9550-18-502, ONR N00014-18-1-2527, N00014-18-20-1-2093, N00014-20-1-2787. HH was also supported by the NSF Graduate Research Fellowship under Grant No. DGE-1650604.


Abstract: Computing tasks may often be posed as optimization problems. The objective functions for real-world scenarios are often nonconvex and/or nondifferentiable. State-of-the-art methods for solving these problems typically only guarantee convergence to local minima. This work presents Hamilton-Jacobi-based Moreau adaptive descent (HJ-MAD), a zero-order algorithm with guaranteed convergence to global minima, assuming continuity of the objective function. The core idea is to compute gradients of the Moreau envelope of the objective (which is "piece-wise convex") with adaptive smoothing parameters. Gradients of the Moreau envelope (i.e., proximal operators) are approximated via the Hopf-Lax formula for the viscous Hamilton-Jacobi equation. Our numerical examples illustrate global convergence.

Key words: Global optimization, Moreau envelope, Hamilton-Jacobi, Hopf-Lax, Cole-Hopf, Proximals, Zero-order optimization

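The abstract's core mechanism can be sketched numerically: by the Cole-Hopf transform, the solution of the viscous Hamilton-Jacobi equation gives a softmin approximation of the proximal operator, so the Moreau-envelope gradient (x − prox(x))/t can be estimated from function values alone. The sketch below is illustrative only and does not reproduce the paper's HJ-MAD algorithm or its adaptive smoothing schedule; the function names `hj_prox` and `hj_mad_step` and all default parameters are assumptions.

```python
import numpy as np

def hj_prox(f, x, t=1.0, delta=0.1, n_samples=50_000, seed=0):
    # Sample y ~ N(x, t*delta*I) and softmin-weight by exp(-f(y)/delta).
    # As delta -> 0, the weighted mean approximates prox_{t f}(x)
    # (Cole-Hopf / Hopf-Lax formula for the viscous HJ equation).
    rng = np.random.default_rng(seed)
    y = x + np.sqrt(t * delta) * rng.standard_normal((n_samples, x.size))
    fy = np.array([f(yi) for yi in y])
    w = np.exp(-(fy - fy.min()) / delta)  # shift by min for numerical stability
    w /= w.sum()
    return w @ y                          # weighted mean of samples

def hj_mad_step(f, x, t=1.0, delta=0.1, step=1.0, **kw):
    # Zero-order gradient step on the Moreau envelope u(x, t):
    # grad u(x, t) = (x - prox_{t f}(x)) / t, so a step of size step*t
    # moves x toward the (approximate) proximal point.
    prox = hj_prox(f, x, t, delta, **kw)
    return x - step * (x - prox)
```

For a sanity check, take f(y) = ½‖y‖², whose exact proximal point at x with t = 1 is x/2; the Monte Carlo estimate recovers this up to sampling error, and one `hj_mad_step` strictly decreases f.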