Nyströmformer: A Nyström-Based Algorithm for Approximating Self-Attention

Yunyang Xiong, Zhanpeng Zeng, Rudrasis Chakraborty, Mingxing Tan, Glenn Fung, Yin Li, and Vikas Singh. (2021). arXiv:2102.03902. Comment: AAAI 2021; code and supplement available at https://github.com/mlpen/Nystromformer.
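For orientation, below is a minimal NumPy sketch of the landmark-based (Nyström) approximation of softmax attention that the paper proposes. It is not the authors' released implementation: the segment-mean landmarks follow the paper's construction, but the function and parameter names (nystrom_attention, num_landmarks) are illustrative, and np.linalg.pinv stands in for the iterative pseudoinverse approximation used in the released code.

import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def nystrom_attention(Q, K, V, num_landmarks=8):
    # Assumes the sequence length n is divisible by num_landmarks.
    n, d = Q.shape
    # Landmarks: means of contiguous segments of the queries and keys.
    Q_tilde = Q.reshape(num_landmarks, n // num_landmarks, d).mean(axis=1)
    K_tilde = K.reshape(num_landmarks, n // num_landmarks, d).mean(axis=1)
    scale = 1.0 / np.sqrt(d)
    F = softmax(Q @ K_tilde.T * scale)        # n x m
    A = softmax(Q_tilde @ K_tilde.T * scale)  # m x m
    B = softmax(Q_tilde @ K.T * scale)        # m x n
    # softmax(Q K^T / sqrt(d)) V is approximated by F pinv(A) B V,
    # costing O(n) instead of O(n^2) in the sequence length.
    return F @ np.linalg.pinv(A) @ (B @ V)

rng = np.random.default_rng(0)
n, d = 64, 16
Q, K, V = rng.normal(size=(3, n, d))
exact = softmax(Q @ K.T / np.sqrt(d)) @ V
approx = nystrom_attention(Q, K, V, num_landmarks=8)
print(np.abs(exact - approx).mean())  # small average error on this random toy data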


Other publications of authors with the same name

LookupFFN: Making Transformers Compute-lite for CPU inference. CoRR, (2024)
LookupFFN: Making Transformers Compute-lite for CPU inference. ICML, volume 202 of Proceedings of Machine Learning Research, pages 40707-40718. PMLR, (2023)
IM-Unpack: Training and Inference with Arbitrarily Low Precision Integers. CoRR, (2024)
Vcc: Scaling Transformers to 128K Tokens or More by Prioritizing Important Tokens. CoRR, (2023)
FrameQuant: Flexible Low-Bit Quantization for Transformers. CoRR, (2024)
Nyströmformer: A Nyström-based Algorithm for Approximating Self-Attention. AAAI, pages 14138-14148. AAAI Press, (2021)
Multi Resolution Analysis (MRA) for Approximate Self-Attention. ICML, volume 162 of Proceedings of Machine Learning Research, pages 25955-25972. PMLR, (2022)
Nyströmformer: A Nyström-Based Algorithm for Approximating Self-Attention. (2021). arXiv:2102.03902. Comment: AAAI 2021; code and supplement available at https://github.com/mlpen/Nystromformer.
You Only Sample (Almost) Once: Linear Cost Self-Attention Via Bernoulli Sampling. ICML, volume 139 of Proceedings of Machine Learning Research, pages 12321-12332. PMLR, (2021)
Controlled Differential Equations on Long Sequences via Non-standard Wavelets. ICML, volume 202 of Proceedings of Machine Learning Research, pages 26820-26836. PMLR, (2023)