Unsupervised Natural Question Answering with a Small Model
M. Andrews and S. Witteveen. (2019). arXiv:1911.08340. Comment: Accepted paper for the FEVER workshop at EMNLP-IJCNLP 2019 (4 pages + references).
DOI: 10.18653/v1/D19-6606
Abstract
The recent (2019-02) demonstration of the power of huge language models such
as GPT-2 to memorise the answers to factoid questions raises questions about
the extent to which knowledge is being embedded directly within these large
models. This short paper describes an architecture through which much smaller
models can also answer such questions - by making use of 'raw' external
knowledge. The contribution of this work is that the methods presented here
rely on unsupervised learning techniques, complementing the unsupervised
training of the Language Model. The goal of this line of research is to be able
to add knowledge explicitly, without extensive training.
Description
[1911.08340] Unsupervised Natural Question Answering with a Small Model
%0 Generic
%1 andrews2019unsupervised
%A Andrews, Martin
%A Witteveen, Sam
%D 2019
%K gpt2 masterthesis qna
%R 10.18653/v1/D19-6606
%T Unsupervised Natural Question Answering with a Small Model
%U http://arxiv.org/abs/1911.08340
%X The recent (2019-02) demonstration of the power of huge language models such
as GPT-2 to memorise the answers to factoid questions raises questions about
the extent to which knowledge is being embedded directly within these large
models. This short paper describes an architecture through which much smaller
models can also answer such questions - by making use of 'raw' external
knowledge. The contribution of this work is that the methods presented here
rely on unsupervised learning techniques, complementing the unsupervised
training of the Language Model. The goal of this line of research is to be able
to add knowledge explicitly, without extensive training.
@misc{andrews2019unsupervised,
abstract = {The recent (2019-02) demonstration of the power of huge language models such
as GPT-2 to memorise the answers to factoid questions raises questions about
the extent to which knowledge is being embedded directly within these large
models. This short paper describes an architecture through which much smaller
models can also answer such questions - by making use of 'raw' external
knowledge. The contribution of this work is that the methods presented here
rely on unsupervised learning techniques, complementing the unsupervised
training of the Language Model. The goal of this line of research is to be able
to add knowledge explicitly, without extensive training.},
added-at = {2020-09-03T16:36:55.000+0200},
author = {Andrews, Martin and Witteveen, Sam},
biburl = {https://www.bibsonomy.org/bibtex/20bced8802fb3ccebfef0737a868db8b8/festplatte},
description = {[1911.08340] Unsupervised Natural Question Answering with a Small Model},
doi = {10.18653/v1/D19-6606},
interhash = {5e39d792a1271a84d5142e13aa5bae63},
intrahash = {0bced8802fb3ccebfef0737a868db8b8},
keywords = {gpt2 masterthesis qna},
note = {arXiv:1911.08340. Comment: Accepted paper for the FEVER workshop at EMNLP-IJCNLP 2019 (4 pages + references)},
timestamp = {2020-09-03T16:36:55.000+0200},
title = {Unsupervised Natural Question Answering with a Small Model},
url = {http://arxiv.org/abs/1911.08340},
year = 2019
}