ResearchTrend.AI

© 2025 ResearchTrend.AI, All rights reserved.

Parameter Efficient Diverse Paraphrase Generation Using Sequence-Level Knowledge Distillation

arXiv:2404.12596 · 19 April 2024
Lasal Jayawardena, Prasan Yapa
BDL

Papers citing "Parameter Efficient Diverse Paraphrase Generation Using Sequence-Level Knowledge Distillation"

3 / 3 papers shown
LaMini-LM: A Diverse Herd of Distilled Models from Large-Scale Instructions
Minghao Wu, Abdul Waheed, Chiyu Zhang, Muhammad Abdul-Mageed, Alham Fikri Aji
ALM · 27 Apr 2023

Novelty Controlled Paraphrase Generation with Retrieval Augmented Conditional Prompt Tuning
Jishnu Ray Chowdhury, Yong Zhuang, Shuyi Wang
01 Feb 2022

Stanza: A Python Natural Language Processing Toolkit for Many Human Languages
Peng Qi, Yuhao Zhang, Yuhui Zhang, Jason Bolton, Christopher D. Manning
AI4TS · 16 Mar 2020