
Regularization Matters: Generalization and Optimization of Neural Nets v.s. their Induced Kernel

, , , and . (2018). arXiv:1810.05369. Comment: version 2: title changed from the original "On the Margin Theory of Feedforward Neural Networks"; substantial changes from the old version, including a new lower bound on NTK sample complexity. Version 3: reorganized the NTK lower bound proof.


Other publications by persons with the same name

Shape Matters: Understanding the Implicit Bias of the Noise Covariance. , , , and . CoRR, (2020)
Max-Margin Works while Large Margin Fails: Generalization without Uniform Convergence. , , , and . CoRR, (2022)
Theoretical insights on generalization in supervised and self-supervised deep learning. Stanford University, USA, (2022)
Regularization Matters: Generalization and Optimization of Neural Nets v.s. their Induced Kernel. , , , and . NeurIPS, pp. 9709-9721. (2019)
Learning Imbalanced Datasets with Label-Distribution-Aware Margin Loss. , , , , and . NeurIPS, pp. 1565-1576. (2019)
Provable Guarantees for Self-Supervised Deep Learning with Spectral Contrastive Loss. , , , and . NeurIPS, pp. 5000-5011. (2021)
Theoretical Analysis of Self-Training with Deep Networks on Unlabeled Data. , , , and . ICLR, OpenReview.net, (2021)
The Implicit and Explicit Regularization Effects of Dropout. , , and . ICML, volume 119 of Proceedings of Machine Learning Research, pp. 10181-10192. PMLR, (2020)
Markov Chain Truncation for Doubly-Intractable Inference. , and . AISTATS, volume 54 of Proceedings of Machine Learning Research, pp. 776-784. PMLR, (2017)