What do you mean, BERT? Assessing BERT as a Distributional Semantics Model
Timothee Mickus, Denis Paperno, Mathieu Constant, Kees van Deemter
arXiv: 1911.05758 · 13 November 2019
Papers citing "What do you mean, BERT? Assessing BERT as a Distributional Semantics Model" (10 papers)
Agentività e telicità in GilBERTo: implicazioni cognitive (Agentivity and Telicity in GilBERTo: Cognitive Implications)
A. Lombardi, Alessandro Lenci
06 Jul 2023
Do Transformers know symbolic rules, and would we know if they did?
Tommi Gröndahl, Yu-Wen Guo, Nirmal Asokan
19 Feb 2022
What Do They Capture? -- A Structural Analysis of Pre-Trained Language Models for Source Code
Yao Wan, Wei-Ye Zhao, Hongyu Zhang, Yulei Sui, Guandong Xu, Hairong Jin
14 Feb 2022
Using Distributional Principles for the Semantic Study of Contextual Language Models
Olivier Ferret
23 Nov 2021
LMMS Reloaded: Transformer-based Sense Embeddings for Disambiguation and Beyond
Daniel Loureiro, A. Jorge, Jose Camacho-Collados
26 May 2021
A comparative evaluation and analysis of three generations of Distributional Semantic Models
Alessandro Lenci, Magnus Sahlgren, Patrick Jeuniaux, Amaru Cuba Gyllensten, Martina Miliani
20 May 2021
The Devil is in the Details: Evaluating Limitations of Transformer-based Methods for Granular Tasks
Brihi Joshi, Neil Shah, Francesco Barbieri, Leonardo Neves
02 Nov 2020
Dynamic Contextualized Word Embeddings
Valentin Hofmann, J. Pierrehumbert, Hinrich Schütze
23 Oct 2020
BERTology Meets Biology: Interpreting Attention in Protein Language Models
Jesse Vig, Ali Madani, L. Varshney, Caiming Xiong, R. Socher, Nazneen Rajani
26 Jun 2020
GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Jinpeng Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
20 Apr 2018