Few-Shot Domain Adaptation for Grammatical Error Correction via Meta-Learning

29 January 2021 · arXiv:2101.12409
Shengsheng Zhang, Yaping Huang, Yun-Nung Chen, Liner Yang, Chencheng Wang, Erhong Yang
Abstract

Most existing sequence-to-sequence Grammatical Error Correction (GEC) methods focus mainly on generating more pseudo data to obtain better performance, and little work addresses few-shot GEC domain adaptation. In this paper, we treat different GEC domains as different GEC tasks and propose to extend meta-learning to few-shot GEC domain adaptation without using any pseudo data. We exploit a set of data-rich source domains to learn an initialization of the model parameters that facilitates fast adaptation to new, resource-poor target domains. Specifically, we adapt the GEC model to the first language (L1) of second language learners. To evaluate the proposed method, we use nine L1s as source domains and five L1s as target domains. Experimental results on the L1 GEC domain adaptation dataset demonstrate that the proposed approach outperforms the multi-task transfer learning baseline by 0.50 $F_{0.5}$ on average and enables effective adaptation to a new L1 domain with only 200 parallel sentences.
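The procedure the abstract describes follows the usual meta-learning (MAML-style) pattern: an inner loop adapts the model to each source-domain task, and an outer loop updates the shared initialization so that this adaptation works well on held-out data from the same task. Below is a minimal first-order sketch in PyTorch, not the paper's exact algorithm: the task interface (sample_support_batch / sample_query_batch), the assumption that the GEC model returns an object with a .loss attribute, and all hyperparameters are hypothetical placeholders.

```python
import copy
import random
import torch

def meta_train_step(model, source_tasks, meta_optimizer,
                    inner_lr=1e-4, inner_steps=1, meta_batch=4):
    """One meta-training iteration over a batch of source-domain GEC tasks."""
    meta_optimizer.zero_grad()
    for task in random.sample(source_tasks, meta_batch):
        # Inner loop: adapt a copy of the shared initialization to this task.
        fast_model = copy.deepcopy(model)
        inner_opt = torch.optim.SGD(fast_model.parameters(), lr=inner_lr)
        for _ in range(inner_steps):
            support_loss = fast_model(**task.sample_support_batch()).loss
            inner_opt.zero_grad()
            support_loss.backward()
            inner_opt.step()

        # Outer loop (first-order approximation): evaluate the adapted copy on
        # the task's query set and accumulate its gradients back into the
        # shared initialization.
        query_loss = fast_model(**task.sample_query_batch()).loss
        query_loss.backward()
        for p, fast_p in zip(model.parameters(), fast_model.parameters()):
            if fast_p.grad is None:
                continue
            p.grad = fast_p.grad.clone() if p.grad is None else p.grad + fast_p.grad

    meta_optimizer.step()  # move the initialization toward fast-adaptable weights
```

At test time, the same inner-loop update would be run on the small support set available for a new target L1 domain (on the order of the 200 parallel sentences mentioned in the abstract), starting from the learned initialization.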
