
Other publications by persons with the same name

Training Your Sparse Neural Network Better with Any Mask. ICML, vol. 162 of Proceedings of Machine Learning Research, pp. 9833-9844. PMLR, (2022)
Sparse Cocktail: Every Sparse Pattern Every Sparse Ratio All At Once. ICML, OpenReview.net, (2024)
Decoding Compressed Trust: Scrutinizing the Trustworthiness of Efficient LLMs Under Compression. ICML, OpenReview.net, (2024)
Outline, Then Details: Syntactically Guided Coarse-To-Fine Code Generation. ICML, vol. 202 of Proceedings of Machine Learning Research, pp. 42403-42419. PMLR, (2023)
Sparse MoE as the New Dropout: Scaling Dense and Self-Slimmable Transformers. ICLR, OpenReview.net, (2023)
LLaGA: Large Language and Graph Assistant. ICML, OpenReview.net, (2024)
Compressing LLMs: The Truth is Rarely Pure and Never Simple. ICLR, OpenReview.net, (2024)
Instant Soup: Cheap Pruning Ensembles in A Single Pass Can Draw Lottery Tickets from Large Models. ICML, vol. 202 of Proceedings of Machine Learning Research, pp. 14691-14701. PMLR, (2023)
Graph Ladling: Shockingly Simple Parallel GNN Training without Intermediate Communication. ICML, vol. 202 of Proceedings of Machine Learning Research, pp. 14679-14690. PMLR, (2023)
Sparsity May Cry: Let Us Fail (Current) Sparse Neural Networks Together! ICLR, OpenReview.net, (2023)