Distilling Relation Embeddings from Pre-trained Language Models
Asahi Ushio, Jose Camacho-Collados, Steven Schockaert
21 September 2021 · arXiv:2110.15705
Papers citing "Distilling Relation Embeddings from Pre-trained Language Models" (7 of 7 papers shown)

| Title | Authors | Tags | Date |
|---|---|---|---|
| SememeLM: A Sememe Knowledge Enhanced Method for Long-tail Relation Representation | Shuyi Li, Shaojuan Wu, Xiaowang Zhang, Zhiyong Feng | | 13 Jun 2024 |
| Visual Grounding Helps Learn Word Meanings in Low-Data Regimes | Chengxu Zhuang, Evelina Fedorenko, Jacob Andreas | | 20 Oct 2023 |
| A RelEntLess Benchmark for Modelling Graded Relations between Named Entities | Asahi Ushio, Jose Camacho-Collados, Steven Schockaert | | 24 May 2023 |
| Making Pre-trained Language Models Better Few-shot Learners | Tianyu Gao, Adam Fisch, Danqi Chen | | 31 Dec 2020 |
| Linguistically-Informed Transformations (LIT): A Method for Automatically Generating Contrast Sets | Chuanrong Li, Lin Shengshuo, Leo Z. Liu, Xinyi Wu, Xuhui Zhou, Shane Steinert-Threlkeld | VLM | 16 Oct 2020 |
| Exploiting Cloze Questions for Few Shot Text Classification and Natural Language Inference | Timo Schick, Hinrich Schütze | | 21 Jan 2020 |
| Language Models as Knowledge Bases? | Fabio Petroni, Tim Rocktäschel, Patrick Lewis, A. Bakhtin, Yuxiang Wu, Alexander H. Miller, Sebastian Riedel | KELM, AI4MH | 03 Sep 2019 |