Abstract
A comparative analysis of optimization and generalization properties of two-layer neural network and random feature models under gradient descent dynamics
Weinan E, Chao Ma & Lei Wu

A fairly comprehensive analysis is presented for the gradient descent dynamics for training two-layer neural network models in the situation when the parameters in both layers are updated. General initialization
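The abstract contrasts a two-layer neural network, in which gradient descent updates the parameters of both layers, with a random feature model, in which the inner-layer weights stay frozen at initialization and only the outer coefficients are trained. Below is a minimal sketch of that contrast, not the paper's code: the toy target, ReLU activation, width, step size, and zero outer-layer initialization are all illustrative assumptions, and the only difference between the two runs is whether the inner weights are updated.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data, for illustration only.
n, d, m = 200, 5, 100               # samples, input dimension, network width
X = rng.standard_normal((n, d))
y = np.sin(X[:, 0])                 # toy target function

def relu(z):
    return np.maximum(z, 0.0)

def train(update_inner, lr=0.1, steps=2000):
    """Full-batch gradient descent on f(x) = (1/m) * sum_k a_k * relu(b_k . x).

    update_inner=True  -> two-layer network (both layers trained)
    update_inner=False -> random feature model (inner weights frozen at init)
    """
    a = np.zeros(m)                       # outer coefficients
    B = rng.standard_normal((m, d))       # inner weights b_k
    for _ in range(steps):
        H = relu(X @ B.T)                 # hidden features, shape (n, m)
        pred = H @ a / m
        err = pred - y                    # residuals, shape (n,)
        # Gradient of the mean squared loss w.r.t. the outer coefficients.
        grad_a = H.T @ err / (m * n)
        a -= lr * grad_a
        if update_inner:
            # Gradient w.r.t. the inner weights (ReLU derivative is 1{z > 0}).
            G = np.outer(err, a / m) * (X @ B.T > 0)   # shape (n, m)
            grad_B = G.T @ X / n
            B -= lr * grad_B
    return np.mean((relu(X @ B.T) @ a / m - y) ** 2)

print("two-layer network train loss :", train(update_inner=True))
print("random feature model train loss:", train(update_inner=False))
```

The toggle makes the comparison concrete: both runs share the same random inner weights and the same gradient descent loop, so any difference in the fitted model comes solely from whether the feature layer is allowed to move during training.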