Speaker: Jianfeng Yao
Venue: Tencent Meeting, ID: 613 613 166
Time: Saturday, August 13, 2022, 15:30-16:30
Abstract:
Random Matrix Theory (RMT) helps us understand deep learning by analyzing the spectra of the large weight matrices of a trained deep neural network (DNN). We conduct extensive experiments on such weight matrices under different settings of layers, networks, and data sets. Following the previous work of Martin and Mahoney (2021), the spectra of weight matrices at the terminal stage of training are classified into three main types: Light Tail (LT), Bulk Transition period (BT), and Heavy Tail (HT). A main contribution of the paper is the identification of the difficulty of the classification problem as a driving factor for the appearance of HT in weight-matrix spectra. Moreover, the classification difficulty can be affected either by the signal-to-noise ratio of the data set or by the complexity of the classification problem (complex features, a large number of classes). Leveraging this finding, we further propose a spectral criterion to detect the appearance of HT and use it to stop the training process early, without any testing data. Such early-stopped DNNs avoid overfitting and unnecessary extra training while preserving a comparable generalization ability. The findings of the paper are validated on several networks (LeNet, MiniAlexNet and VGG), using Gaussian synthetic data and real data sets (MNIST and CIFAR10).
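For readers unfamiliar with the spectral analysis the abstract refers to, the following is a minimal illustrative sketch (not the speaker's actual method): it computes the empirical spectral distribution of a layer's weight correlation matrix and a simple Hill-type estimate of the power-law tail exponent, in the spirit of Martin and Mahoney's tail classification. The function names and the choice of cutoff are assumptions made here for illustration only.

```python
import numpy as np

def weight_spectrum(W):
    """Eigenvalues of the correlation matrix X = W^T W / N for an N x M weight matrix W.

    These eigenvalues form the empirical spectral distribution whose tail
    behavior (light vs. heavy) is examined in the RMT analysis of DNNs.
    """
    N, _ = W.shape
    X = W.T @ W / N
    return np.linalg.eigvalsh(X)  # sorted ascending, all nonnegative

def hill_tail_exponent(eigs, xmin=None):
    """Hill-type maximum-likelihood estimate of a power-law tail exponent.

    `xmin` (cutoff for the tail region) defaults to the median eigenvalue;
    this default is an arbitrary illustrative choice, not the paper's.
    """
    eigs = np.sort(np.asarray(eigs))
    if xmin is None:
        xmin = np.median(eigs)
    tail = eigs[eigs >= xmin]
    return 1.0 + len(tail) / np.sum(np.log(tail / xmin))

# Toy example: a random Gaussian weight matrix (an untrained, light-tailed case).
rng = np.random.default_rng(0)
W = rng.normal(size=(256, 128)) / np.sqrt(256)
eigs = weight_spectrum(W)
alpha = hill_tail_exponent(eigs)
```

In this framework one would track such a tail statistic for each layer during training; a pronounced drift toward heavy-tailed spectra is the kind of signal the proposed spectral criterion uses to trigger early stopping without a test set.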
About the speaker:
Professor Yao is a Presidential Chair Professor at the School of Data Science, The Chinese University of Hong Kong (Shenzhen). Earlier he worked at the University of Paris I Panthéon-Sorbonne, the University of Rennes 1, and The University of Hong Kong, and held visiting positions at various institutions in France. He currently also serves as a Special Guest Professor at the School of Mathematics, Shandong University.
Professor Yao is an expert in random matrix theory and high-dimensional statistics.