Other publications by persons with the same name

- Exponentially vanishing sub-optimal local minima in multilayer neural networks., and . ICLR (Workshop), OpenReview.net, (2018)
- Physics-Aware Downsampling with Deep Learning for Scalable Flood Modeling., , , , and . NeurIPS, pp. 1378-1389. (2021)
- Accurate Post Training Quantization With Small Calibration Sets., , , , and . ICML, volume 139 of Proceedings of Machine Learning Research, pp. 4466-4475. PMLR, (2021)
- Regularization Guarantees Generalization in Bayesian Reinforcement Learning through Algorithmic Stability., , and . AAAI, pp. 8423-8431. AAAI Press, (2022)
- The Implicit Bias of Gradient Descent on Separable Data., , , , and . J. Mach. Learn. Res., (2018)
- How do Minimum-Norm Shallow Denoisers Look in Function Space?, , , , and . CoRR, (2023)
- A Function Space View of Bounded Norm Infinite Width ReLU Nets: The Multivariate Case., , , and . ICLR, OpenReview.net, (2020)
- A Mean Field Theory of Quantized Deep Networks: The Quantization-Depth Trade-Off., , and . NeurIPS, pp. 7036-7046. (2019)
- Train longer, generalize better: closing the generalization gap in large batch training of neural networks., , and . NIPS, pp. 1731-1741. (2017)
- Bayesian Gradient Descent: Online Variational Bayes Learning with Increased Robustness to Catastrophic Forgetting and Weight Pruning, , , and . (2018) cite arxiv:1803.10123