Statistics Seminar Series

Decentralized Learning of Quantile Regression: A Smoothing Approach with Two Bandwidths

Speaker: Zhongyi Zhu (朱仲义)

Venue: Tencent Meeting, ID: 628181580

Time: Friday, December 22, 2023, 13:30–14:30


Abstract:

Distributed estimation has attracted significant attention recently due to its advantages in computational efficiency and data privacy preservation. In this article, we focus on quantile regression over a decentralized network. Without a coordinating central node, a decentralized network improves system stability and increases efficiency by communicating with fewer nodes per round. However, existing work on decentralized quantile regression either has slow (sub-linear) convergence or relies on restrictive modelling assumptions (e.g., homogeneity of errors). We propose a novel method for decentralized quantile regression built upon the smoothed quantile loss. We argue that the smoothed loss proposed in the existing literature, which uses a single smoothing bandwidth parameter, fails to achieve fast convergence and statistical efficiency simultaneously in the decentralized setting; we refer to this as the speed-efficiency dilemma. We propose a novel quadratic approximation of the quantile loss using a large bandwidth for the Hessian and a small bandwidth for the gradient. Our method enjoys a linear convergence rate and attains optimal statistical efficiency. Numerical experiments and real data analysis demonstrate the effectiveness of our method.
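To make the two-bandwidth idea concrete, below is a minimal single-machine sketch of Newton-type iterations on a Gaussian-kernel-smoothed quantile loss, with a small bandwidth for the gradient and a larger one for the Hessian. This is only an illustration of the smoothing construction, not the authors' decentralized algorithm (which additionally involves communication across network nodes); the function name, kernel choice, bandwidth values, and initialization are all assumptions for the sketch, not the paper's tuning rules.

```python
import numpy as np
from scipy.stats import norm

def smoothed_qr_newton(X, y, tau=0.5, h_grad=0.1, h_hess=0.5, n_iter=20):
    """Newton-type iterations for kernel-smoothed quantile regression.

    The check-loss subgradient's indicator 1{r < 0} is smoothed to
    Phi(-r / h_grad) with a small bandwidth, while the Hessian uses a
    larger bandwidth h_hess so the weight matrix stays well-conditioned.
    (Illustrative sketch; bandwidths here are ad hoc choices.)
    """
    n, p = X.shape
    # Least-squares warm start (an arbitrary but convenient choice)
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(n_iter):
        r = y - X @ beta
        # Smoothed gradient: small bandwidth keeps it close to the
        # unsmoothed subgradient, preserving statistical efficiency
        g = X.T @ (norm.cdf(-r / h_grad) - tau) / n
        # Smoothed Hessian: large bandwidth spreads kernel weights
        # over more observations, stabilizing the quadratic model
        w = norm.pdf(r / h_hess) / h_hess
        H = (X * w[:, None]).T @ X / n
        beta = beta - np.linalg.solve(H, g)
    return beta
```

For example, with median regression (tau = 0.5) on simulated data `y = 1 + 2x + eps` with symmetric noise, the iterations recover coefficients close to (1, 2). Using one bandwidth for both terms would force a trade-off: a small bandwidth makes the Hessian noisy and ill-conditioned, while a large one biases the gradient.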


About the Speaker:

Zhongyi Zhu is a professor and doctoral advisor in the Department of Statistics and Data Science at Fudan University. He previously served as vice president of the Chinese Society of Probability and Statistics (8th and 9th councils), as an associate editor of the international journal Statistica Sinica, and on the editorial boards of the Chinese Journal of Applied Probability and Statistics and SCIENCE CHINA Mathematics. He is currently an elected member of the Institute of Mathematical Statistics, a member of the editorial board of Application of Statistics and Management, and an associate editor of JASA, a leading international statistics journal. His research interests include longitudinal (panel) data models, quantile regression models, and machine learning. As principal investigator, he has completed six National Natural Science Foundation of China (NSFC) grants and one National Social Science Fund of China project; as subproject leader, he has completed two NSFC key projects and one subproject of a major project; he currently leads an NSFC general project, a Tianyuan project, and a key project. In recent years he has published over 100 papers, including more than 80 SCI papers in venues such as the four leading international statistics journals and top machine-learning journals. He has received a second prize of the Ministry of Education Natural Science Award.