Using dependency parsing for few-shot learning in distributional semantics
Annual Meeting of the Association for Computational Linguistics (ACL), 2022
Abstract
In this work, we explore the novel idea of employing dependency parsing information in the context of few-shot learning, the task of learning the meaning of a rare word from a limited number of context sentences. First, we use dependency-based word embedding models as background spaces for few-shot learning. Second, we introduce two few-shot learning methods which enhance the additive baseline model by using dependencies.
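The additive baseline mentioned above estimates a vector for the rare word by averaging the background-space embeddings of the words surrounding it in the few available context sentences. The sketch below is only a rough illustration of that baseline, not the authors' exact formulation or their dependency-based extensions; the function name, the stopword filtering, and the toy vectors are all assumptions made for the example.

```python
import numpy as np

def additive_baseline(context_sentences, target, embeddings, stopwords=frozenset()):
    """Estimate a vector for `target` by averaging the embeddings of the
    other words that co-occur with it in the few context sentences.
    (Illustrative sketch; the paper's methods additionally use dependencies.)"""
    vectors = []
    for sentence in context_sentences:
        for token in sentence:
            # Skip the target itself, stopwords, and out-of-vocabulary tokens.
            if token == target or token in stopwords or token not in embeddings:
                continue
            vectors.append(embeddings[token])
    if not vectors:
        raise ValueError("no usable context words found")
    return np.mean(vectors, axis=0)

# Toy usage: 3-dimensional vectors stand in for a real background embedding space.
embeddings = {
    "guitar": np.array([0.9, 0.1, 0.0]),
    "plays":  np.array([0.2, 0.8, 0.1]),
    "music":  np.array([0.7, 0.3, 0.2]),
}
sentences = [["she", "plays", "the", "zither"],
             ["zither", "music", "and", "guitar"]]
vec = additive_baseline(sentences, target="zither", embeddings=embeddings,
                        stopwords={"she", "the", "and"})
print(vec)
```

The paper's dependency-based variants would restrict or reweight which context words enter this average (e.g. favouring syntactic neighbours of the target), but the details above are not drawn from the paper itself.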
