Department of Mathematics Seminar No. 2065: Smoothing fast iterative hard thresholding algorithm for L0 regularized nonsmooth convex regression problem

Created: 2020/12/16 by 龚惠英

    Department of Mathematics Seminar No. 2065

Topic: Smoothing fast iterative hard thresholding algorithm for L0 regularized nonsmooth convex regression problem

Speaker: Prof. Wei Bian (Harbin Institute of Technology)

Time: Monday, December 21, 2020, 9:30

Venue: G507

Inviter: Zi Xu

Organizer: Department of Mathematics, College of Sciences

Abstract: We first investigate a class of constrained sparse regression problems with cardinality penalty, where the feasible set is a box constraint and the loss function is convex but not necessarily differentiable. We put forward a smoothing fast iterative hard thresholding (SFIHT) algorithm for solving such optimization problems, which combines smoothing approximations, extrapolation techniques, and iterative hard thresholding methods. The extrapolation coefficients in the proposed algorithm are allowed to satisfy a prescribed condition. We establish that any accumulation point of the iterate sequence is a local minimizer of the original cardinality penalty problem. We then consider the case where the loss function is differentiable, and propose the fast iterative hard thresholding (FIHT) algorithm to solve such problems. We prove that the iterates converge to a local minimizer of the problem that satisfies a lower bound property. In particular, we establish the convergence rate of the corresponding objective function value sequence. Finally, we present some numerical examples to illustrate the theoretical results.
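The extrapolated hard-thresholding scheme described in the abstract can be sketched as a simple loop: an extrapolation step, a gradient step on the smooth loss, a projection onto the box, and a hard-thresholding step for the cardinality penalty. The code below is a minimal illustrative version, not the paper's exact SFIHT/FIHT method: it assumes a differentiable least-squares loss, a constant extrapolation coefficient `beta`, and applies the box projection and the threshold sequentially; the function names, step size, and stopping rule are all illustrative choices.

```python
import numpy as np

def hard_threshold(x, lam, t):
    # Prox of t*lam*||.||_0: zero out entries whose magnitude is
    # below sqrt(2*lam*t); keep the remaining entries unchanged.
    y = x.copy()
    y[np.abs(x) < np.sqrt(2.0 * lam * t)] = 0.0
    return y

def fiht(grad_f, x0, lam, t, beta=0.5, iters=200, lo=-1.0, hi=1.0):
    # Illustrative extrapolated iterative hard thresholding loop for
    # min f(x) + lam*||x||_0 subject to lo <= x <= hi (box constraint).
    # beta is a constant extrapolation coefficient (a simplification;
    # the paper imposes its own condition on these coefficients).
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(iters):
        y = x + beta * (x - x_prev)      # extrapolation step
        z = y - t * grad_f(y)            # gradient step on the smooth part
        z = np.clip(z, lo, hi)           # project onto the box constraint
        x_prev, x = x, hard_threshold(z, lam, t)
    return x

# Toy noiseless least-squares example: f(x) = 0.5*||Ax - b||^2
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
x_true = np.zeros(10)
x_true[:2] = 0.8                         # 2-sparse ground truth
b = A @ x_true
t = 1.0 / np.linalg.norm(A, 2) ** 2      # step size 1/L, L = ||A||_2^2
x_hat = fiht(lambda x: A.T @ (A @ x - b), np.zeros(10), lam=0.01, t=t)
```

On this toy instance the loop recovers a sparse solution close to `x_true`; in practice the extrapolation coefficients and the interplay between projection and thresholding would follow the conditions analyzed in the talk.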



Faculty and students are welcome to attend!


