Transformers are RNNs: Fast Autoregressive Transformers with Linear Attention

Angelos Katharopoulos, Apoorv Vyas, Nikolaos Pappas, and François Fleuret. arXiv:2006.16236 (2020). Comment: ICML 2020, project at https://linear-transformers.com/.

Other publications of authors with the same name

Lattice-Free MMI Adaptation of Self-Supervised Pretrained Acoustic Models. , , and . CoRR, (2020)
On-demand compute reduction with stochastic wav2vec 2.0. , , , and . INTERSPEECH, pages 3048-3052. ISCA, (2022)
Out-of-Distribution Detection Using an Ensemble of Self Supervised Leave-Out Classifiers. , , , , , and . ECCV (8), volume 11212 of Lecture Notes in Computer Science, pages 560-574. Springer, (2018)
Commercial Block Detection in Broadcast News Videos. , , , and . ICVGIP, pages 63:1-63:7. ACM, (2014)
Comparing CTC and LFMMI for Out-of-Domain Adaptation of wav2vec 2.0 Acoustic Model. , , and . Interspeech, pages 2861-2865. ISCA, (2021)
Transformers are RNNs: Fast Autoregressive Transformers with Linear Attention. , , , and . ICML, volume 119 of Proceedings of Machine Learning Research, pages 5156-5165. PMLR, (2020)
Power efficient compressive sensing for continuous monitoring of ECG and PPG in a wearable system. , and . WF-IoT, pages 336-341. IEEE Computer Society, (2016)
Analyzing Uncertainties in Speech Recognition Using Dropout. , , , and . ICASSP, pages 6730-6734. IEEE, (2019)
Generative Pre-training for Speech with Flow Matching. , , , , , and . CoRR, (2023)
Lattice-Free MMI Adaptation of Self-Supervised Pretrained Acoustic Models. , , and . ICASSP, pages 6219-6223. IEEE, (2021)