
Generalization Bounds of SGLD for Non-convex Learning: Two Theoretical Viewpoints.

, , , and . COLT, volume 75 of Proceedings of Machine Learning Research, pp. 605-638. PMLR, (2018)

Please choose a person to relate this publication to

To distinguish between persons with the same name, the academic degree and the title of an important publication will be displayed.

 

Other publications by persons with the same name

Gradient Descent Provably Optimizes Over-parameterized Neural Networks., , , and . CoRR, (2018)

Consistency of Interpolation with Laplace Kernels is a High-Dimensional Phenomenon., and . COLT, volume 99 of Proceedings of Machine Learning Research, pp. 2595-2623. PMLR, (2019)

Gradient Descent Provably Optimizes Over-parameterized Neural Networks., , , and . ICLR (Poster), OpenReview.net, (2019)

Consistency of Interpolation with Laplace Kernels is a High-Dimensional Phenomenon., and . CoRR, (2018)

How Many Samples are Needed to Estimate a Convolutional Neural Network?, , , , , and . NeurIPS, pp. 371-381. (2018)

How Many Samples are Needed to Learn a Convolutional Neural Network?, , , , , and . CoRR, (2018)

Generalization Bounds of SGLD for Non-convex Learning: Two Theoretical Viewpoints., , , and . CoRR, (2017)

Gradient Descent Finds Global Minima of Deep Neural Networks., , , , and . CoRR, (2018)

Generalization Bounds of SGLD for Non-convex Learning: Two Theoretical Viewpoints., , , and . COLT, volume 75 of Proceedings of Machine Learning Research, pp. 605-638. PMLR, (2018)

SceneGen: Generative Contextual Scene Augmentation using Scene Graph Priors., , , , , and . CoRR, (2020)