DNABERT: pre-trained Bidirectional Encoder Representations from Transformers model for DNA-language in genome. GitHub repository, jerryji1993/DNABERT.
M. Tambuscio, G. Ruffo, A. Flammini, and F. Menczer. Proceedings of the 24th International Conference on World Wide Web, pages 977–982. New York, NY, USA, ACM, (2015)
D. Nguyen, N. Smith, and C. Rosé. Proceedings of the 5th ACL-HLT Workshop on Language Technology for Cultural Heritage, Social Sciences, and Humanities, pages 115–123. Stroudsburg, PA, USA, Association for Computational Linguistics, (2011)