Beyond Second Order Methods for Nonconvex Optimization
Mathematics Seminar
Title: Beyond Second Order Methods for Nonconvex Optimization
Speaker: Kate Wenqi Zhu (University of Oxford)
Place: Room 1124, Houzhu Building (后主楼)
Time: 10:00–11:00, Friday, August 29, 2025
Inviter: 袁迪凡 (Yuan Difan)
Abstract
Traditionally, first-order gradient-based techniques, such as stochastic gradient descent (SGD), and second-order methods, such as Newton's method, have dominated the field of optimization. In recent years, regularized high-order tensor methods for nonconvex optimization have attracted significant research interest. Compared with SGD, these methods offer faster local convergence rates, improved worst-case evaluation complexity, richer insight into data geometry through higher-order information, and better parallelization.
The most critical challenge in implementing a $p$th-order method ($p \geq 3$) lies in efficiently minimizing the $p$th-order subproblem, which typically consists of a $p$th-degree multivariate Taylor polynomial combined with a $(p+1)$th-order regularization term. In this talk, we address this challenge by characterizing the local and global optimality of the subproblem and investigating its potential NP-hardness. We then introduce and discuss a series of provably convergent and efficient algorithms for minimizing the regularized subproblem both locally and globally, including the Quadratic Quartic Regularization Method (QQR), the Cubic Quartic Regularization Method (CQR), and the Sums-of-Squares Convex Taylor Method (SoS-C). More interestingly, our research adopts an AI-integrated approach, using the mathematical reasoning capabilities of large language models (LLMs) to verify the nonnegativity of multivariate polynomials, a problem closely related to Hilbert's seventeenth problem and to the challenge of globally minimizing such subproblems.
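For readers unfamiliar with the subproblem referred to above, a standard form from the adaptive-regularization (AR$p$) literature is sketched below; the notation ($\sigma_k$ for the regularization weight, $x_k$ for the current iterate) is illustrative and may differ from the conventions used in the talk:

```latex
% At iterate x_k, the pth-order regularized model in s is
% the degree-p Taylor expansion of f plus a (p+1)th-order penalty:
\[
  m_p(s) \;=\; f(x_k)
  \;+\; \sum_{j=1}^{p} \frac{1}{j!}\,\nabla^j f(x_k)[s]^j
  \;+\; \frac{\sigma_k}{p+1}\,\|s\|^{p+1},
\]
% where \nabla^j f(x_k)[s]^j denotes the jth-order derivative tensor
% applied j times to s, and \sigma_k > 0 is adapted across iterations.
```

For $p = 2$ this reduces to the well-known cubic regularization of Newton's method; for $p \geq 3$ the model is a nonconvex multivariate polynomial, which is the source of the difficulty discussed in the abstract.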
About the Speaker
Kate Wenqi Zhu is a fourth-year Ph.D. student in Applied Mathematics at the University of Oxford, supervised by Professor Coralia Cartis and fully funded by the CIMDA–Oxford Studentship. Her research focuses on leveraging higher-order information for efficient nonconvex optimization, with interests spanning computational complexity analysis, tensor approximation, sum-of-squares techniques, implementable high-order subproblem solvers, and adaptive regularization methods. Kate was awarded second prize in the 22nd IMA Leslie Fox Prize competition in 2025. She completed both her undergraduate and first master's degrees in Mathematics at Oxford, followed by an M.Sc. in Mathematical Modelling and Scientific Computing, supervised by Professor Yuji Nakatsukasa.