Shangzhi Zeng

Title: Assistant Professor

Email: zengsz@sustech.edu.cn

Research Interests: optimization theory and methods, bilevel programming


Biography

Shangzhi Zeng is an Assistant Professor at the Shenzhen National Center for Applied Mathematics and the Department of Mathematics, Southern University of Science and Technology. He received his bachelor's degree from Wuhan University in 2015, his master's degree from Hong Kong Baptist University in 2018, and his Ph.D. from the University of Hong Kong in 2021. From 2021 to 2024 he was a postdoctoral researcher at the University of Victoria, Canada, and he joined the Shenzhen National Center for Applied Mathematics in 2024. His research interests include optimization theory and methods, bilevel programming, and optimization algorithms for machine learning. His work has appeared in journals and conferences including Mathematical Programming, SIAM Journal on Numerical Analysis, Journal of Machine Learning Research, IEEE Transactions on Pattern Analysis and Machine Intelligence, as well as ICML, NeurIPS, and ICLR.


Research Areas

Optimization theory and methods

Bilevel programming

Optimization algorithms for machine learning


Selected Publications

1. Xinmin Yang, Wei Yao, Haian Yin, Shangzhi Zeng and Jin Zhang, Gradient-based algorithms for multi-objective bi-level optimization. Science China Mathematics, 67, 1419–1438, 2024.

2. Risheng Liu, Zhu Liu, Wei Yao, Shangzhi Zeng and Jin Zhang, Moreau Envelope for Nonconvex Bi-Level Optimization: A Single-loop and Hessian-free Solution Strategy, ICML 2024.

3. Wei Yao, Chengming Yu, Shangzhi Zeng and Jin Zhang, Constrained bi-level optimization: proximal Lagrangian value function approach and Hessian-free algorithm, ICLR 2024 spotlight.

4. Risheng Liu, Xuan Liu, Shangzhi Zeng, Jin Zhang and Yixuan Zhang, Hierarchical Optimization-Derived Learning, IEEE Transactions on Pattern Analysis and Machine Intelligence, 45(12), 14693–14708, 2023.

5. Risheng Liu, Xuan Liu, Shangzhi Zeng, Jin Zhang and Yixuan Zhang, Value-Function-Based Sequential Minimization for Bi-Level Optimization, IEEE Transactions on Pattern Analysis and Machine Intelligence, 45(12), 15930–15948, 2023.

6. Jane J. Ye, Xiaoming Yuan, Shangzhi Zeng and Jin Zhang, Difference of convex algorithms for bilevel programs with applications in hyperparameter selection, Mathematical Programming, 198(2), 1583–1616, 2023.

7. Boris S. Mordukhovich, Xiaoming Yuan, Shangzhi Zeng and Jin Zhang, A globally convergent proximal Newton-type method in nonsmooth convex optimization, Mathematical Programming, 198(1), 899–936, 2023.

8. Risheng Liu, Yaohua Liu, Wei Yao, Shangzhi Zeng and Jin Zhang, Averaged method of multipliers for bi-level optimization without lower-level strong convexity, ICML 2023.

9. Risheng Liu, Pan Mu, Xiaoming Yuan, Shangzhi Zeng and Jin Zhang, A general descent aggregation framework for gradient-based bi-level optimization, IEEE Transactions on Pattern Analysis and Machine Intelligence, 45(1), 38–57, 2023.

10. Lucy Gao, Jane J. Ye, Haian Yin, Shangzhi Zeng and Jin Zhang, Value Function Based Difference-of-Convex Algorithm for Bilevel Hyperparameter Selection Problems, ICML 2022.

11. Risheng Liu, Xuan Liu, Shangzhi Zeng, Jin Zhang and Yixuan Zhang, Optimization-Derived Learning with Essential Convergence Analysis of Training and Hyper-training, ICML 2022.

12. Jane J. Ye, Xiaoming Yuan, Shangzhi Zeng and Jin Zhang, Variational analysis perspective on linear convergence of some first order methods for nonsmooth convex optimization problems, Set-Valued and Variational Analysis, 29(4), 803–837, 2021.

13. Risheng Liu, Yaohua Liu, Shangzhi Zeng and Jin Zhang, Towards gradient-based bilevel optimization with non-convex followers and beyond, NeurIPS 2021 spotlight.

14. Risheng Liu, Xuan Liu, Xiaoming Yuan, Shangzhi Zeng and Jin Zhang, A Value-Function-based Interior-point Method for Non-convex Bi-level Optimization, ICML 2021.

15. Xiaoming Yuan, Shangzhi Zeng and Jin Zhang, Discerning the linear convergence of ADMM for structured convex optimization through the lens of variational analysis, Journal of Machine Learning Research, 21(83), 1–75, 2020.

16. Risheng Liu, Pan Mu, Xiaoming Yuan, Shangzhi Zeng and Jin Zhang, A generic first-order algorithmic framework for bi-level programming beyond lower-level singleton, ICML 2020.

17. Yongchao Liu, Xiaoming Yuan, Shangzhi Zeng and Jin Zhang, Partial error bound conditions for the linear convergence rate of the alternating direction method of multipliers, SIAM Journal on Numerical Analysis, 56(4), 2095–2123, 2018.