Subdominant Dense Clusters Allow for Simple Learning and High Computational Performance in Neural Networks with Discrete Synapses

(2015). cite arxiv:1509.05753. Comment: 11 pages, 4 figures (main text: 5 pages, 3 figures; Supplemental Material: 6 pages, 1 figure).
DOI: 10.1103/PhysRevLett.115.128101


Other publications by persons with the same name

- Neural networks trained with SGD learn distributions of increasing complexity. ICML, vol. 202 of Proceedings of Machine Learning Research, pp. 28843-28863. PMLR, (2023).
- Network reconstruction from infection cascades. CoRR, (2016).
- Unreasonable Effectiveness of Learning Neural Networks: From Accessible States and Robust Ensembles to Basic Algorithmic Schemes. (2016). cite arxiv:1605.06444. Comment: 31 pages (14 main text, 18 appendix), 12 figures (6 main text, 6 appendix).
- Local entropy as a measure for sampling solutions in Constraint Satisfaction Problems. (2015). cite arxiv:1511.05634. Comment: 46 pages (main text: 22), 7 figures. This is an author-created, un-copyedited version of an article published in Journal of Statistical Mechanics: Theory and Experiment. The Version of Record is available online at http://dx.doi.org/10.1088/1742-5468/2016/02/023301.
- Feature learning in finite-width Bayesian deep linear networks with multiple outputs and convolutional layers. CoRR, (2024).