ResearchTrend.AI
ERICA: Improving Entity and Relation Understanding for Pre-trained Language Models via Contrastive Learning (arXiv:2012.15022)
30 December 2020
Yujia Qin, Yankai Lin, Ryuichi Takanobu, Zhiyuan Liu, Peng Li, Heng Ji, Minlie Huang, Maosong Sun, Jie Zhou

Papers citing "ERICA: Improving Entity and Relation Understanding for Pre-trained Language Models via Contrastive Learning"

4 / 4 papers shown
A Mutual Information Maximization Perspective of Language Representation Learning
Lingpeng Kong, Cyprien de Masson d'Autume, Wang Ling, Lei Yu, Zihang Dai, Dani Yogatama
SSL · 18 Oct 2019
Span-based Joint Entity and Relation Extraction with Transformer Pre-training
Markus Eberts, A. Ulges
LRM, ViT · 17 Sep 2019
Knowledge Enhanced Contextual Word Representations
Matthew E. Peters, Mark Neumann, Robert L. Logan IV, Roy Schwartz, Vidur Joshi, Sameer Singh, Noah A. Smith
09 Sep 2019
GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
ELM · 20 Apr 2018