
Feature Learning in Infinite-Width Neural Networks

(2020). cite arxiv:2011.14522. Comment: 4th paper in the Tensor Programs series. Appearing in ICML 2021.


Other publications of authors with the same name

Lie Access Neural Turing Machine. CoRR, (2016)

Tensor Programs IVb: Adaptive Optimization in the Infinite-Width Limit. CoRR, (2023)

Efficient Computation of Deep Nonlinear Infinite-Width Neural Networks that Learn Features. ICLR, OpenReview.net, (2022)

High-dimensional Asymptotics of Feature Learning: How One Gradient Step Improves the Representation. NeurIPS, (2022)

Bayesian Deep Convolutional Networks with Many Channels are Gaussian Processes. ICLR (Poster), OpenReview.net, (2019)

Lie-Access Neural Turing Machines. CoRR, (2016)

A Mean Field Theory of Batch Normalization. (2019). cite arxiv:1902.08129. Comment: To appear in ICLR 2019.

Improved Image Wasserstein Attacks and Defenses. CoRR, (2020)

Bayesian Deep Convolutional Networks with Many Channels are Gaussian Processes. (2018). cite arxiv:1810.05148. Comment: Published as a conference paper at ICLR 2019.

Tensor Programs I: Wide Feedforward or Recurrent Neural Networks of Any Architecture are Gaussian Processes. (2019). cite arxiv:1910.12478. Comment: Appearing in NeurIPS 2019; 10 pages of main text; 12 figures, 11 programs; 73 pages total.