Article

Relation classification via sequence features and bi-directional LSTMs

Wuhan University Journal of Natural Sciences, 22 (6): 489--497 (Dec 1, 2017)
DOI: 10.1007/s11859-017-1278-6

Abstract

Structure features require complicated pre-processing and are probably domain-dependent. To reduce the time cost of pre-processing, we propose a novel neural network architecture, a bi-directional long short-term memory recurrent neural network (Bi-LSTM-RNN) model based on low-cost sequence features such as words and part-of-speech (POS) tags, to classify the relation between two entities. First, the model performs bi-directional recurrent computation along the tokens of a sentence. Then, the sequence is divided into five parts and standard pooling functions are applied over the token representations of each part. Finally, the pooled representations are concatenated and fed into a softmax layer for relation classification. We evaluate our model on two standard benchmark datasets in different domains, namely SemEval-2010 Task 8 and BioNLP-ST 2016 Task BB3. In SemEval-2010 Task 8, the performance of our model matches that of the state-of-the-art models, achieving 83.0% in F1. In BioNLP-ST 2016 Task BB3, our model obtains an F1 of 51.3%, which is comparable with that of the best system. Moreover, we find that the context between the two target entities plays an important role in relation classification and can serve as a replacement for the shortest dependency path.
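The five-part pooling step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes the sentence is split into (before entity 1, entity 1, between, entity 2, after) segments over per-token Bi-LSTM output vectors, with max pooling as the "standard pooling function"; the function name, the half-open span convention, and the zero-vector fallback for empty segments are all assumptions.

```python
import numpy as np

def five_part_pooling(token_reprs, e1_span, e2_span):
    """Max-pool five segments of a sentence around two target entities.

    token_reprs: (T, d) array of per-token representations (e.g. Bi-LSTM outputs).
    e1_span, e2_span: half-open [start, end) token index ranges of the entities,
    with e1 assumed to precede e2. Returns a (5*d,) concatenated feature vector.
    """
    T, d = token_reprs.shape
    s1, t1 = e1_span
    s2, t2 = e2_span
    segments = [
        token_reprs[:s1],      # context before entity 1
        token_reprs[s1:t1],    # entity 1 tokens
        token_reprs[t1:s2],    # context between the two entities
        token_reprs[s2:t2],    # entity 2 tokens
        token_reprs[t2:],      # context after entity 2
    ]
    pooled = []
    for seg in segments:
        if seg.shape[0] == 0:
            pooled.append(np.zeros(d))      # empty segment -> zero vector (assumption)
        else:
            pooled.append(seg.max(axis=0))  # element-wise max pooling
    return np.concatenate(pooled)
```

The concatenated vector would then be fed into a softmax layer for relation classification. For example, a six-token sentence with 2-dimensional token vectors and single-token entities at positions 1 and 3 yields a 10-dimensional feature vector.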
