arXiv: 2109.09237
MirrorWiC: On Eliciting Word-in-Context Representations from Pretrained Language Models
Qianchu Liu, Fangyu Liu, Nigel Collier, Anna Korhonen, Ivan Vulić
19 September 2021
Papers citing
"MirrorWiC: On Eliciting Word-in-Context Representations from Pretrained Language Models"
Let's Play Mono-Poly: BERT Can Reveal Words' Polysemy Level and Partitionability into Senses. Aina Garí Soler, Marianna Apidianaki. 29 Apr 2021.