Total Recall: a Customized Continual Learning Method for Neural Semantic Parsers

11 September 2021
Zhuang Li, Lizhen Qu, Gholamreza Haffari
CLL

Papers citing "Total Recall: a Customized Continual Learning Method for Neural Semantic Parsers"

4 / 4 papers shown
Variational Autoencoder with Disentanglement Priors for Low-Resource Task-Specific Natural Language Generation
Zhuang Li, Lizhen Qu, Qiongkai Xu, Tongtong Wu, Tianyang Zhan, Gholamreza Haffari
CoGe, UD, DRL
27 Feb 2022
Broad-Coverage Semantic Parsing as Transduction
Sheng Zhang, Xutai Ma, Kevin Duh, Benjamin Van Durme
05 Sep 2019
Improving a Neural Semantic Parser by Counterfactual Learning from Human Bandit Feedback
Carolin (Haas) Lawrence, Stefan Riezler
OffRL
03 May 2018
Effective Approaches to Attention-based Neural Machine Translation
Thang Luong, Hieu H. Pham, Christopher D. Manning
17 Aug 2015