Speaker: Yunzhang Zhu (朱允章)
Venue: Tencent Meeting, ID: 712-743-233
Time: Friday, November 18, 2022, 09:00–10:00
Abstract:
Overparametrized interpolating models have drawn increasing attention in machine learning. Some recent studies suggest that regularized interpolating models can generalize well. This phenomenon seemingly contradicts the conventional wisdom that interpolation tends to overfit the data and perform poorly on test data, and it further appears to defy the bias-variance trade-off. One shortcoming of the existing theory is that the classical notion of model degrees of freedom fails to explain the intrinsic differences among interpolating models, since it focuses on estimating the in-sample prediction error. This motivates an alternative measure of model complexity that can differentiate those interpolating models and take different test points into account. In particular, we propose a measure with a proper adjustment based on the squared covariance between the prediction and the observations. Our analysis of the least squares method reveals some interesting properties of the measure, which can reconcile the “double descent” phenomenon with the classical theory. This opens the door to an extended definition of model degrees of freedom in modern predictive settings.
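As background for the covariance-based notion of complexity the abstract refers to, the classical degrees of freedom of a fitting procedure is defined as df = (1/σ²) Σᵢ Cov(ŷᵢ, yᵢ), which for ordinary least squares equals p, the trace of the hat matrix. The sketch below (an illustrative simulation, not material from the talk itself) verifies this identity by Monte Carlo:

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed design: n observations, p features (classical underparametrized case).
n, p, sigma = 50, 5, 1.0
X = rng.normal(size=(n, p))
beta = rng.normal(size=p)

# Hat matrix H = X (X'X)^{-1} X'; for OLS, df = trace(H) = p.
H = X @ np.linalg.solve(X.T @ X, X.T)
df_trace = np.trace(H)

# Monte Carlo check of the covariance definition:
#   df = (1/sigma^2) * sum_i Cov(yhat_i, y_i)
reps = 20000
Y = X @ beta + sigma * rng.normal(size=(reps, n))  # reps independent draws of y
Yhat = Y @ H.T                                     # OLS fitted values per draw
cov_sum = np.sum(np.mean((Yhat - Yhat.mean(0)) * (Y - Y.mean(0)), axis=0))
df_mc = cov_sum / sigma**2

print(df_trace, df_mc)  # both close to p = 5
```

The classical definition measures in-sample optimism only; the measure proposed in the talk adjusts it (via the squared covariance between prediction and observations) so that different interpolating fits, which all have zero training error, can still be distinguished.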
Speaker Bio:
Yunzhang Zhu is an Associate Professor in the Department of Statistics at The Ohio State University. He received his bachelor's degree from Tsinghua University and his Ph.D. from the University of Minnesota, where he was advised by the renowned statistician Professor Xiaotong Shen; he joined the Department of Statistics at The Ohio State University upon graduation. His research interests include statistical machine learning, graphical models, high-dimensional optimization, and high-dimensional statistical inference, with publications in journals such as JASA, JRSSB, JMLR, and JCGS.