DNABERT: pre-trained Bidirectional Encoder Representations from Transformers model for DNA-language in genome. GitHub repository, jerryji1993/DNABERT.
R. Schouten, M. Bueno, W. Duivesteijn, and M. Pechenizkiy. Data Mining and Knowledge Discovery, 36 (1): 379–413, (January 2022)
P. Xia, S. Wu, and B. Van Durme. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 7516–7533. Association for Computational Linguistics, (November 2020)
F. Lemmerich, M. Becker, P. Singer, D. Helic, A. Hotho, and M. Strohmaier. Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pages 965–974. New York, NY, USA, Association for Computing Machinery, (2016)