Statistics Seminar Series

Variance Reduced Median-of-Means Estimator for Byzantine-Robust Distributed Inference

Speaker: Weidong Liu (刘卫东)

Venue: Tencent Meeting (Meeting ID: 139223918, Password: 200804)

Time: Tuesday, August 4, 2020, 15:00-16:00


Abstract:

This paper develops an efficient distributed inference algorithm that is robust against a moderate fraction of Byzantine nodes, namely arbitrary and possibly adversarial machines in a distributed learning system. In robust statistics, the median-of-means (MOM) has been a popular approach to hedge against Byzantine failures owing to its ease of implementation and computational efficiency. However, the MOM estimator suffers a loss of statistical efficiency. The first main contribution of the paper is a variance reduced median-of-means (VRMOM) estimator, which improves the statistical efficiency over the vanilla MOM estimator while remaining as computationally efficient as the MOM. Based on the proposed VRMOM estimator, we develop a general distributed inference algorithm that is robust against Byzantine failures. Theoretically, our distributed algorithm achieves a fast convergence rate with only a constant number of communication rounds. We also establish asymptotic normality for the purpose of statistical inference. To the best of our knowledge, this is the first normality result in the setting of Byzantine-robust distributed learning. Simulation results are presented to illustrate the effectiveness of our method.
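To make the robustness idea concrete, below is a minimal sketch of the vanilla MOM estimator that VRMOM builds on: split the sample into disjoint blocks, average within each block, and take the median of the block means, so that a minority of corrupted blocks (or Byzantine machines, each acting as one block) cannot drag the estimate arbitrarily far. The VRMOM construction and the full distributed algorithm are not spelled out in the abstract and are not reproduced here; the function and parameter names (median_of_means, num_blocks) are illustrative only.

import numpy as np

def median_of_means(x, num_blocks):
    """Vanilla median-of-means (MOM) estimate of the mean of a 1-D sample.

    Split the sample into `num_blocks` disjoint blocks, average within each
    block, and return the median of the block means.
    """
    x = np.asarray(x, dtype=float)
    blocks = np.array_split(x, num_blocks)          # disjoint, nearly equal blocks
    block_means = np.array([b.mean() for b in blocks])
    return np.median(block_means)

# Toy check: gross outliers confined to a few blocks barely move the MOM
# estimate, while the plain sample mean is pulled far from the truth.
rng = np.random.default_rng(0)
sample = rng.normal(loc=1.0, scale=1.0, size=1000)
sample[:30] = 1e6                                    # 3% grossly corrupted points
print("sample mean:", sample.mean())                 # badly corrupted
print("MOM estimate:", median_of_means(sample, 20))  # close to the true mean 1.0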

 

About the Speaker:

Weidong Liu is Vice Dean and Distinguished Professor of the School of Mathematical Sciences at Shanghai Jiao Tong University and a recipient of the National Science Fund for Distinguished Young Scholars. He received his Ph.D. from Zhejiang University in 2008 and held postdoctoral positions at the Hong Kong University of Science and Technology and the Wharton School of the University of Pennsylvania from 2008 to 2011. In 2010 he received the National Excellent Doctoral Dissertation Award and the New World Mathematics Award conferred by the International Congress of Chinese Mathematicians; in 2013 he received the National Science Fund for Excellent Young Scholars; in 2016 he was selected for the National Top Young Talents program; and in 2018 he received the National Science Fund for Distinguished Young Scholars. His research interests include modern statistics and machine learning, and he has published more than 40 papers in the four leading statistics journals (AOS, JASA, JRSSB, Biometrika) and the leading machine learning journal JMLR.