End-to-End Relation Extraction using LSTMs on Sequences and Tree Structures

Abstract

We present a novel end-to-end neural model to extract entities and relations between them. Our recurrent neural network based model captures both word sequence and dependency tree substructure information by stacking bidirectional tree-structured LSTM-RNNs on bidirectional sequential LSTM-RNNs. This allows our model to jointly represent both entities and relations with shared parameters in a single model. We further encourage detection of entities during training and use of entity information in relation extraction via entity pretraining and scheduled sampling. Our model improves over the state-of-the-art feature-based model on end-to-end relation extraction, achieving 3.5% and 4.8% relative error reductions in F1-score on ACE2004 and ACE2005, respectively. We also show a 2.5% relative error reduction in F1-score over the state-of-the-art convolutional neural network based model on nominal relation classification (SemEval-2010 Task 8).
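Below is a minimal sketch of the stacking idea the abstract describes: a bidirectional sequential LSTM over the word sequence, whose hidden states feed a child-sum tree-structured LSTM run over a dependency parse, yielding representations usable by entity and relation classifiers. This is not the authors' released implementation; all module names, dimensions, the choice of a child-sum tree-LSTM cell, and the toy dependency tree in the usage example are illustrative assumptions.

```python
# Sketch only: stacked sequence + tree LSTM encoder (assumed names and dimensions).
import torch
import torch.nn as nn


class ChildSumTreeLSTMCell(nn.Module):
    """Child-sum tree-LSTM cell used here as the tree-structured layer (assumption)."""

    def __init__(self, input_dim, hidden_dim):
        super().__init__()
        self.W_iou = nn.Linear(input_dim, 3 * hidden_dim)
        self.U_iou = nn.Linear(hidden_dim, 3 * hidden_dim, bias=False)
        self.W_f = nn.Linear(input_dim, hidden_dim)
        self.U_f = nn.Linear(hidden_dim, hidden_dim, bias=False)

    def forward(self, x, child_h, child_c):
        # x: (input_dim,); child_h, child_c: (num_children, hidden_dim)
        h_sum = child_h.sum(dim=0)
        i, o, u = torch.chunk(self.W_iou(x) + self.U_iou(h_sum), 3)
        i, o, u = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(u)
        f = torch.sigmoid(self.W_f(x).unsqueeze(0) + self.U_f(child_h))  # one forget gate per child
        c = i * u + (f * child_c).sum(dim=0)
        h = o * torch.tanh(c)
        return h, c


class StackedSeqTreeEncoder(nn.Module):
    """Bidirectional sequence LSTM stacked under a dependency-tree LSTM (sketch)."""

    def __init__(self, vocab_size, emb_dim=50, seq_hidden=64, tree_hidden=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.seq_lstm = nn.LSTM(emb_dim, seq_hidden, bidirectional=True, batch_first=True)
        self.tree_cell = ChildSumTreeLSTMCell(2 * seq_hidden, tree_hidden)
        self.tree_hidden = tree_hidden

    def encode_tree(self, seq_states, node, children):
        # Recursively encode the dependency (sub)tree bottom-up over sequence-LSTM states.
        if children[node]:
            child_states = [self.encode_tree(seq_states, c, children) for c in children[node]]
            child_h = torch.stack([h for h, _ in child_states])
            child_c = torch.stack([c for _, c in child_states])
        else:
            child_h = torch.zeros(1, self.tree_hidden)
            child_c = torch.zeros(1, self.tree_hidden)
        return self.tree_cell(seq_states[node], child_h, child_c)

    def forward(self, token_ids, root, children):
        seq_states, _ = self.seq_lstm(self.embed(token_ids).unsqueeze(0))
        seq_states = seq_states.squeeze(0)  # (seq_len, 2 * seq_hidden): per-token entity features
        root_h, _ = self.encode_tree(seq_states, root, children)
        return seq_states, root_h           # inputs for entity / relation classifiers


# Toy usage: a 5-token sentence with a hypothetical dependency tree rooted at token 2.
model = StackedSeqTreeEncoder(vocab_size=100)
tokens = torch.tensor([4, 17, 9, 52, 3])
children = {0: [], 1: [0], 2: [1, 4], 3: [], 4: [3]}
seq_states, relation_repr = model(tokens, root=2, children=children)
print(seq_states.shape, relation_repr.shape)
```

In the paper's setting, the per-token sequence states would drive entity detection while tree-structured states over dependency substructures feed relation classification, with the two tasks sharing the lower layers' parameters; the entity pretraining and scheduled sampling mentioned in the abstract are training-time techniques not shown in this sketch.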
