Distilling Word Embeddings: An Encoding Approach

15 June 2015
Lili Mou, Ran Jia, Yan Xu, Ge Li, Lu Zhang, Zhi Jin
Communities: FedML
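The paper's title describes distilling word embeddings through an encoding step. Below is a minimal sketch of that idea, assuming "encoding" means learning small student embeddings as a trainable projection of large pretrained teacher embeddings, optimized jointly with a downstream task. The dimensions, layer choices, and toy classification task are illustrative assumptions, not the paper's exact setup.

```python
# Minimal sketch: distill large teacher word embeddings into small student
# embeddings via an encoding layer trained with a downstream task loss.
# All sizes and the toy task are assumptions for illustration.
import torch
import torch.nn as nn

class EmbeddingDistiller(nn.Module):
    def __init__(self, teacher_dim=300, student_dim=50, num_classes=2):
        super().__init__()
        # Encoding layer: maps a large teacher embedding to a compact student one.
        self.encoder = nn.Sequential(nn.Linear(teacher_dim, student_dim), nn.Tanh())
        # Simple classifier operating on averaged student embeddings.
        self.classifier = nn.Linear(student_dim, num_classes)

    def forward(self, teacher_vectors):
        # teacher_vectors: (batch, seq_len, teacher_dim) pretrained word vectors.
        student_vectors = self.encoder(teacher_vectors)   # (batch, seq_len, student_dim)
        sentence_repr = student_vectors.mean(dim=1)       # crude sentence representation
        return self.classifier(sentence_repr)

# Toy usage: random tensors stand in for pretrained teacher embeddings.
model = EmbeddingDistiller()
teacher_batch = torch.randn(8, 12, 300)       # 8 sentences, 12 tokens each
labels = torch.randint(0, 2, (8,))
loss = nn.CrossEntropyLoss()(model(teacher_batch), labels)
loss.backward()   # trains both the encoder (the distilled embeddings) and the classifier
```

In this sketch, the trained encoder output `encoder(teacher_vectors)` plays the role of the distilled, lower-dimensional embeddings that a smaller model can use in place of the teacher's.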

Papers citing "Distilling Word Embeddings: An Encoding Approach" (8 of 8 papers shown)

Exploring the Learning Difficulty of Data: Theory and Measure
  Weiyao Zhu, Ou Wu, Fengguang Su, Yingjun Deng
  16 May 2022 · Citations: 5

Knowledge Distillation as Semiparametric Inference
  Tri Dao, G. Kamath, Vasilis Syrgkanis, Lester W. Mackey
  20 Apr 2021 · Citations: 31

Knowledge Distillation: A Survey
  Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
  09 Jun 2020 · Citations: 2,851 · Communities: VLM

Model Compression with Two-stage Multi-teacher Knowledge Distillation for Web Question Answering System
  Ze Yang, Linjun Shou, Ming Gong, Wutao Lin, Daxin Jiang
  18 Oct 2019 · Citations: 92

BAM! Born-Again Multi-Task Networks for Natural Language Understanding
  Kevin Clark, Minh-Thang Luong, Urvashi Khandelwal, Christopher D. Manning, Quoc V. Le
  10 Jul 2019 · Citations: 228

Attention-Guided Answer Distillation for Machine Reading Comprehension
  Minghao Hu, Yuxing Peng, Furu Wei, Zhen Huang, Dongsheng Li, Nan Yang, M. Zhou
  23 Aug 2018 · Citations: 75 · Communities: FaML

Fine-Grained Entity Type Classification by Jointly Learning Representations and Label Embeddings
  A. Abhishek, Ashish Anand, Amit Awekar
  22 Feb 2017 · Citations: 74

Sequence-Level Knowledge Distillation
  Yoon Kim, Alexander M. Rush
  25 Jun 2016 · Citations: 1,099