Syntactic Structure Distillation Pretraining For Bidirectional Encoders
arXiv:2005.13482 · 27 May 2020
A. Kuncoro, Lingpeng Kong, Daniel Fried, Dani Yogatama, Laura Rimell, Chris Dyer, Phil Blunsom

Papers citing "Syntactic Structure Distillation Pretraining For Bidirectional Encoders" (3 papers)
We need to talk about random seeds
Steven Bethard · 24 Oct 2022

An Empirical Revisiting of Linguistic Knowledge Fusion in Language Understanding Tasks
Changlong Yu, Tianyi Xiao, Lingpeng Kong, Yangqiu Song, Wilfred Ng · 24 Oct 2022

GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman · 20 Apr 2018