

MCML: A Novel Memory-based Contrastive Meta-Learning Method for Few Shot Slot Tagging

26 August 2021
Hongru Wang
Zezhong Wang
Gabriel Pui Cheong Fung
Kam-Fai Wong
arXiv:2108.11635
Abstract

Meta-learning is widely used for few-shot slot tagging. The performance of existing methods, however, is seriously affected by the sample forgetting issue, in which the model forgets historically learned meta-training tasks and relies solely on support sets when adapting to new tasks. To overcome this predicament, we propose the Memory-based Contrastive Meta-Learning (MCML) method, comprising learn-from-the-memory and adaption-from-the-memory modules, which bridge the distribution gaps between training episodes and between training and testing, respectively. Specifically, the former uses an explicit memory bank to keep track of the label representations of previously trained episodes, with a contrastive constraint between the label representations in the current episode and the historical ones stored in the memory. In addition, the adaption-from-the-memory mechanism learns more accurate and robust representations based on the shift between the same labels as embedded in the testing episodes and in the memory. Experimental results show that MCML outperforms several state-of-the-art methods on both the SNIPS and NER datasets, and demonstrates strong scalability, with consistent improvement as the number of shots increases.
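To make the two modules concrete, below is a minimal PyTorch sketch of how a label memory bank, the contrastive constraint, and the test-time shift correction could fit together. It is illustrative only: the names (LabelMemoryBank, contrastive_memory_loss, adapt_from_memory), the exponential-moving-average update, the InfoNCE-style loss, and the linear shift correction are all assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F


class LabelMemoryBank:
    """Hypothetical memory bank keeping one running embedding per slot label.

    Illustrates the learn-from-the-memory idea; the EMA update rule is an
    assumption, not taken from the paper.
    """

    def __init__(self, momentum: float = 0.9):
        self.momentum = momentum
        self.bank: dict[str, torch.Tensor] = {}  # label name -> stored representation

    def update(self, label: str, rep: torch.Tensor) -> None:
        # Track each label's representation across training episodes so that
        # earlier episodes are not forgotten when adapting to new tasks.
        rep = rep.detach()
        if label in self.bank:
            self.bank[label] = (self.momentum * self.bank[label]
                                + (1.0 - self.momentum) * rep)
        else:
            self.bank[label] = rep


def contrastive_memory_loss(episode_reps: dict[str, torch.Tensor],
                            memory: LabelMemoryBank,
                            temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE-style constraint: pull each current label representation toward
    its own memory entry and away from the other labels' entries."""
    labels = [l for l in episode_reps if l in memory.bank]
    if not labels:
        return torch.tensor(0.0)
    cur = torch.stack([F.normalize(episode_reps[l], dim=0) for l in labels])
    mem = torch.stack([F.normalize(memory.bank[l], dim=0) for l in labels])
    logits = cur @ mem.T / temperature   # (n_labels, n_labels) similarity matrix
    targets = torch.arange(len(labels))  # the positive pair is the same label
    return F.cross_entropy(logits, targets)


def adapt_from_memory(test_rep: torch.Tensor, label: str,
                      memory: LabelMemoryBank,
                      alpha: float = 0.5) -> torch.Tensor:
    """Sketch of adaption-from-the-memory: nudge a test-time label
    representation along the shift toward its stored training-time entry."""
    if label not in memory.bank:
        return test_rep
    return test_rep + alpha * (memory.bank[label] - test_rep)
```

In a training loop of this sort, one would call memory.update for each label after every episode and add contrastive_memory_loss to the episode loss; at test time, adapt_from_memory would correct the shift between the label embeddings seen during training and those in the testing episodes.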
