Mathematics Seminar No. 2909: Chebyshev Polynomial Acceleration of the Stochastic Newton Method in Machine Learning

Created: 2025/10/10 by 邵奋芬

Title: Chebyshev polynomial acceleration of stochastic Newton method for machine learning

Speaker: Prof. 潘建瑜 (East China Normal University)

Time: Tuesday, October 14, 2025, 15:00

Place: Room GJ303, Main Campus

Inviter: 刘巧华

Host: Department of Mathematics, College of Sciences

Abstract:

In this talk, we consider accelerating the stochastic Newton method for large-scale optimization problems arising in machine learning. To reduce the cost of computing the Hessian and its inverse, we propose using Chebyshev polynomials to approximate the Hessian inverse. We show that, thanks to the short-term recurrence formula, the Chebyshev polynomial approximation effectively reduces the computational cost. A convergence analysis is given, and experiments on multiple benchmarks illustrate the performance of the proposed acceleration method.
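The abstract leaves the construction implicit, but the "short-term recurrence" it mentions is the hallmark of the classical Chebyshev iteration, which applies a Chebyshev polynomial in a matrix A to a vector using a three-term recurrence, one matrix-vector product per step and no inner products. The sketch below is a minimal, generic version of that iteration, assuming a symmetric positive definite matrix with known spectral bounds; the function name `chebyshev_solve` and all parameters are illustrative and not taken from the talk, whose actual method would embed such a recurrence inside a stochastic Newton step with subsampled Hessians.

```python
import numpy as np

def chebyshev_solve(A, b, lam_min, lam_max, num_iters=50):
    """Approximate A^{-1} b by the classical Chebyshev iteration.

    Assumes A is symmetric positive definite with all eigenvalues in
    [lam_min, lam_max].  The short (three-term) recurrence means each
    step costs one matrix-vector product and no inner products.
    """
    d = (lam_max + lam_min) / 2.0  # center of the spectral interval
    c = (lam_max - lam_min) / 2.0  # half-width of the spectral interval
    x = np.zeros_like(b, dtype=float)
    r = b.astype(float) - A @ x    # initial residual (= b for x0 = 0)
    p = np.zeros_like(x)
    alpha = 0.0
    for k in range(num_iters):
        if k == 0:
            p = r.copy()
            alpha = 1.0 / d
        else:
            # The second step uses beta = (c*alpha)^2 / 2; later steps / 4.
            beta = (c * alpha) ** 2 / (2.0 if k == 1 else 4.0)
            alpha = 1.0 / (d - beta / alpha)
            p = r + beta * p
        x = x + alpha * p
        r = r - alpha * (A @ p)
    return x

# Toy example: a diagonal stand-in "Hessian" with spectrum in [1, 4].
A = np.diag([1.0, 2.0, 4.0])
b = np.ones(3)
x = chebyshev_solve(A, b, 1.0, 4.0, num_iters=60)
```

For spectral bounds a = lam_min, b = lam_max the error contracts roughly by the factor (sqrt(b/a) - 1)/(sqrt(b/a) + 1) per step, so only eigenvalue bounds, not the full spectrum, are needed, which is what makes such polynomial approximations attractive when forming or inverting the Hessian explicitly is too expensive.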

