2 Matching Annotations
  1. Feb 2019
    1. Seven Myths in Machine Learning Research

      Reddit: http://t.cn/EVgQRrv Plenty of negative comments, but it's a pretty good article... //@iPHYSresearch: So technical blog posts can pass review on arXiv now... [confused] And what a free-flowing format... [doge]

    2. Wide Neural Networks of Any Depth Evolve as Linear Models Under Gradient Descent

      This Google Brain paper has a strikingly novel idea: training a network can be viewed as fitting a Taylor expansion! In the infinite-width limit, networks are governed by a linear model obtained from the first-order Taylor expansion of the network around its initial parameters. The Reddit discussion is great!
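
      The linearized model from the paper can be sketched in a few lines of JAX. This is a minimal illustration, not the paper's code: the toy network, widths, and initialization scales below are assumptions chosen for brevity. `f_lin` is the first-order Taylor expansion of the network output around its initial parameters, computed with a Jacobian-vector product.

      ```python
      import jax
      import jax.numpy as jnp

      # Toy one-hidden-layer network (hypothetical; the paper's claim is
      # about the infinite-width limit, here width is just "large").
      def init_params(key, width=512):
          k1, k2 = jax.random.split(key)
          return {
              "w1": jax.random.normal(k1, (1, width)),
              "w2": jax.random.normal(k2, (width, 1)) / jnp.sqrt(width),
          }

      def f(params, x):
          return jnp.tanh(x @ params["w1"]) @ params["w2"]

      def f_lin(params, params0, x):
          # First-order Taylor expansion around params0:
          #   f_lin(params) = f(params0) + J(params0) . (params - params0)
          dparams = jax.tree_util.tree_map(lambda p, p0: p - p0, params, params0)
          y0, jvp_out = jax.jvp(lambda p: f(p, x), (params0,), (dparams,))
          return y0 + jvp_out

      key = jax.random.PRNGKey(0)
      params0 = init_params(key)
      x = jnp.linspace(-1.0, 1.0, 8).reshape(-1, 1)

      # At initialization the network and its linearization agree exactly.
      assert jnp.allclose(f(params0, x), f_lin(params0, params0, x))

      # After a small parameter step they remain close; the paper shows the
      # gap vanishes as width goes to infinity, so SGD on f behaves like
      # SGD on the linear model f_lin.
      params = jax.tree_util.tree_map(lambda p: p + 1e-3, params0)
      gap = jnp.max(jnp.abs(f(params, x) - f_lin(params, params0, x)))
      print(float(gap))
      ```

      Training `f_lin` instead of `f` is an ordinary linear regression in the parameters, which is what makes the infinite-width dynamics analytically tractable.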