Knowledge-aware Prompt-tuning with Synergistic Optimization for Relation Extraction

The Web Conference (WWW), 2021
Ningyu Zhang
Shumin Deng
Chuanqi Tan
Fei Huang
Huajun Chen
Abstract

Recently, prompt-tuning has achieved promising results on some few-class classification tasks. The core idea of prompt-tuning is to insert text pieces, i.e., templates, into the input and transform a classification task into a masked language modeling problem. However, for relation extraction, determining an appropriate prompt template requires domain expertise, and verifying the effectiveness of handcrafted or auto-searched label words is cumbersome and time-consuming in non-few-shot scenarios; moreover, single label words fail to leverage the abundant semantic knowledge carried by entities and relation labels. To this end, we focus on incorporating knowledge into prompt-tuning for relation extraction and propose a knowledge-aware prompt-tuning approach with synergistic optimization (KNIGHT). Specifically, we inject entity and relation knowledge into prompt construction with learnable virtual template words and answer words, and jointly optimize their representations with knowledge constraints. Extensive experimental results on 5 datasets under standard and low-resource settings demonstrate the effectiveness of our approach.
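To make the setup concrete, below is a minimal sketch of the vanilla prompt-tuning baseline the abstract describes: a relation extraction input is wrapped in a template with a [MASK] slot, and each relation class is scored via the masked-language-model probability of a handcrafted label word. The template string, the `label_words` mapping, and the example sentence are illustrative assumptions, not the paper's actual prompt design; KNIGHT instead replaces such fixed template and answer words with learnable virtual words optimized jointly under knowledge constraints.

```python
# Minimal sketch of vanilla prompt-tuning for relation extraction,
# assuming a BERT-style masked language model from HuggingFace Transformers.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# Input sentence with its subject and object entities.
sentence = "Bill Gates founded Microsoft in 1975."
subj, obj = "Bill Gates", "Microsoft"

# Wrap the input in a template; the [MASK] slot is where the model
# predicts a label word that maps back to a relation class.
prompt = f"{sentence} {subj} [MASK] {obj}."
inputs = tokenizer(prompt, return_tensors="pt")

# Hypothetical mapping from relation labels to single answer words
# (the handcrafted verbalizer the abstract criticizes).
label_words = {"org:founded_by": "founded", "per:employee_of": "joined"}

with torch.no_grad():
    logits = model(**inputs).logits

# Score each relation by the MLM logit of its answer word at the [MASK] position.
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero()[0, 1]
scores = {
    rel: logits[0, mask_pos, tokenizer.convert_tokens_to_ids(word)].item()
    for rel, word in label_words.items()
}
print(max(scores, key=scores.get))  # predicted relation label
```

In this baseline, both the template text and the relation-to-word verbalizer are fixed discrete choices; the paper's contribution is to make these slots continuous, learnable embeddings initialized from entity and relation knowledge and optimized synergistically.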
