
Other publications by persons with the same name

Top-KAST: Top-K Always Sparse Training., , , , and . CoRR, (2021)

Fast Parametric Learning with Activation Memorization., , , and . ICML, vol. 80 of Proceedings of Machine Learning Research, pp. 4225-4234. PMLR, (2018)

Meta-Learning Neural Bloom Filters., , and . ICML, vol. 97 of Proceedings of Machine Learning Research, pp. 5271-5280. PMLR, (2019)

Compressive Transformers for Long-Range Sequence Modelling., , , , and . ICLR, OpenReview.net, (2020)

Do Transformers Need Deep Long-Range Memory?, and . ACL, pp. 7524-7529. Association for Computational Linguistics, (2020)

Stabilizing Transformers for Reinforcement Learning., , , , , , , , , and 3 other authors. ICML, vol. 119 of Proceedings of Machine Learning Research, pp. 7487-7498. PMLR, (2020)

Training Compute-Optimal Large Language Models, , , , , , , , , and 12 other authors. (2022)

Improving language models by retrieving from trillions of tokens, , , , , , , , , and 18 other authors. (2021) cite arxiv:2112.04426. Comment: Fix incorrect reported numbers in Table 14.

Improving Language Models by Retrieving from Trillions of Tokens., , , , , , , , , and 18 other authors. ICML, vol. 162 of Proceedings of Machine Learning Research, pp. 2206-2240. PMLR, (2022)

Training Language GANs from Scratch., , , and . NeurIPS, pp. 4302-4313. (2019)