When Do Neural Networks Outperform Kernel Methods?
Behrooz Ghorbani, Song Mei, Theodor Misiakiewicz, Andrea Montanari
Abstract
For a certain scaling of the initialization of stochastic gradient descent (SGD), wide neural networks (NN) have been shown to be well approximated by reproducing kernel Hilbert space (RKHS) methods. Recent empirical work showed that, for some classification tasks, RKHS methods can replace NNs wit ...
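
As a concrete illustration of the kind of RKHS baseline the abstract refers to, the following is a minimal sketch (not the paper's code) of kernel ridge regression with the neural tangent kernel of a two-layer ReLU network, evaluated on a synthetic task whose target depends only on one coordinate. The data model, dimensions, and the ridge parameter lam are illustrative assumptions.

import numpy as np

def ntk_2layer_relu(X1, X2):
    """NTK of a two-layer ReLU net for unit-norm inputs (up to constant factors)."""
    U = np.clip(X1 @ X2.T, -1.0, 1.0)                              # inner products <x, x'>
    k0 = (np.pi - np.arccos(U)) / np.pi                            # ~ E[relu'(w.x) relu'(w.x')]
    k1 = (U * (np.pi - np.arccos(U)) + np.sqrt(1 - U**2)) / np.pi  # ~ E[relu(w.x) relu(w.x')]
    return U * k0 + k1                                             # contributions of both layers

def kernel_ridge_predict(K_train, y_train, K_test_train, lam=1e-3):
    """Solve (K + lam*I) alpha = y on the training set and predict on test points."""
    n = K_train.shape[0]
    alpha = np.linalg.solve(K_train + lam * np.eye(n), y_train)
    return K_test_train @ alpha

# Toy task (an assumption for illustration): the label depends only on the first
# coordinate, the remaining coordinates act as noise directions.
rng = np.random.default_rng(0)
d, n_train, n_test = 20, 500, 200
X = rng.standard_normal((n_train + n_test, d))
X /= np.linalg.norm(X, axis=1, keepdims=True)   # place inputs on the unit sphere
y = np.sign(X[:, 0])
Xtr, Xte = X[:n_train], X[n_train:]
ytr, yte = y[:n_train], y[n_train:]

K_tr = ntk_2layer_relu(Xtr, Xtr)
K_te = ntk_2layer_relu(Xte, Xtr)
pred = kernel_ridge_predict(K_tr, ytr, K_te)
print("NTK ridge test accuracy:", np.mean(np.sign(pred) == yte))

The kernel formula is the standard closed form for ReLU activations on the sphere; comparing such a predictor against an SGD-trained network on the same data is one simple way to probe the gap the paper studies.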