LightPAFF: A Two-Stage Distillation Framework for Pre-training and Fine-tuning

27 April 2020 · arXiv: 2004.12817
Kaitao Song, Hao Sun, Xu Tan, Tao Qin, Jianfeng Lu, Hongzhi Liu, Tie-Yan Liu

Papers citing "LightPAFF: A Two-Stage Distillation Framework for Pre-training and Fine-tuning"

6 of 6 papers shown

ConCodeEval: Evaluating Large Language Models for Code Constraints in Domain-Specific Languages
Mehant Kammakomati, Sameer Pimparkhede, Srikanth G. Tamilselvam, Prince Kumar, Pushpak Bhattacharyya
ALM · 40 · 0 · 0 · 03 Jul 2024

Direct Preference Knowledge Distillation for Large Language Models
Yixing Li, Yuxian Gu, Li Dong, Dequan Wang, Yu Cheng, Furu Wei
37 · 6 · 0 · 28 Jun 2024

Compression of Generative Pre-trained Language Models via Quantization
Chaofan Tao, Lu Hou, Wei Zhang, Lifeng Shang, Xin Jiang, Qun Liu, Ping Luo, Ngai Wong
MQ · 27 · 103 · 0 · 21 Mar 2022

Compacting Deep Neural Networks for Internet of Things: Methods and Applications
Ke Zhang, Hanbo Ying, Hongning Dai, Lin Li, Yuanyuan Peng, Keyi Guo, Hongfang Yu
16 · 38 · 0 · 20 Mar 2021

Knowledge Distillation by On-the-Fly Native Ensemble
Xu Lan, Xiatian Zhu, S. Gong
192 · 473 · 0 · 12 Jun 2018

On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang
ODL · 278 · 2,888 · 0 · 15 Sep 2016