Structural Knowledge Distillation: Tractably Distilling Information for Structured Predictor

10 October 2020
Xinyu Wang, Yong-jia Jiang, Zhaohui Yan, Zixia Jia, Nguyen Bach, Tao Wang, Zhongqiang Huang, Fei Huang, Kewei Tu

Papers citing "Structural Knowledge Distillation: Tractably Distilling Information for Structured Predictor"

3 papers shown

Language Modelling via Learning to Rank
A. Frydenlund, Gagandeep Singh, Frank Rudzicz
13 Oct 2021

Automated Concatenation of Embeddings for Structured Prediction
Xinyu Wang, Yong-jia Jiang, Nguyen Bach, Tao Wang, Zhongqiang Huang, Fei Huang, Kewei Tu
10 Oct 2020

Design Challenges and Misconceptions in Neural Sequence Labeling
Jie Yang, Shuailong Liang, Yue Zhang
12 Jun 2018