DNABERT: pre-trained Bidirectional Encoder Representations from Transformers model for DNA-language in genome. GitHub repository: jerryji1993/DNABERT.
A. Hendrickson, J. Wang, and M. Atzmueller. Proc. 24th International Symposium on Methodologies for Intelligent Systems (ISMIS), Heidelberg, Germany, Springer Verlag, (accepted, 2018)
E. Sternberg, and M. Atzmueller. Proc. 24th International Symposium on Methodologies for Intelligent Systems (ISMIS), Heidelberg, Germany, Springer Verlag, (accepted, 2018)
M. Atzmueller. Proc. Annual Machine Learning Conference of the Benelux (Benelearn 2017), Eindhoven, The Netherlands, Eindhoven University of Technology, (2017)
M. Tambuscio, G. Ruffo, A. Flammini, and F. Menczer. Proceedings of the 24th International Conference on World Wide Web, pages 977--982. New York, NY, USA, ACM, (2015)
D. Nguyen, N. Smith, and C. Rosé. Proceedings of the 5th ACL-HLT Workshop on Language Technology for Cultural Heritage, Social Sciences, and Humanities, pages 115--123. Stroudsburg, PA, USA, Association for Computational Linguistics, (2011)