ResearchTrend.AI

© 2025 ResearchTrend.AI, All rights reserved.

LMMS Reloaded: Transformer-based Sense Embeddings for Disambiguation and Beyond

26 May 2021
Daniel Loureiro, A. Jorge, Jose Camacho-Collados

Papers citing "LMMS Reloaded: Transformer-based Sense Embeddings for Disambiguation and Beyond"

17 papers shown
Linguistic Interpretability of Transformer-based Language Models: a systematic review
Miguel López-Otal, Jorge Gracia, Jordi Bernad, Carlos Bobed, Lucía Pitarch-Ballesteros, Emma Anglés-Herrero
VLM
09 Apr 2025
Multi-Sense Embeddings for Language Models and Knowledge Distillation
Qitong Wang, Mohammed J. Zaki, Georgios Kollias, Vasileios Kalantzis
KELM
08 Apr 2025
What Are Large Language Models Mapping to in the Brain? A Case Against Over-Reliance on Brain Scores
Ebrahim Feghhi, Nima Hadidi, Bryan Song, I. Blank, Jonathan C. Kao
03 Jun 2024
Injecting Wiktionary to improve token-level contextual representations using contrastive learning
Anna Mosolova, Marie Candito, Carlos Ramisch
12 Feb 2024
Construction Grammar and Language Models
Harish Tayyar Madabushi, Laurence Romain, P. Milin, Dagmar Divjak
25 Aug 2023
Combating the Curse of Multilinguality in Cross-Lingual WSD by Aligning Sparse Contextualized Word Representations
Gábor Berend
25 Jul 2023
Together We Make Sense -- Learning Meta-Sense Embeddings from Pretrained Static Sense Embeddings
Haochen Luo, Yi Zhou, Danushka Bollegala
SSL
30 May 2023
Context-Aware Semantic Similarity Measurement for Unsupervised Word Sense Disambiguation
J. Martinez-Gil
05 May 2023
Using Two Losses and Two Datasets Simultaneously to Improve TempoWiC Accuracy
Mohammad Javad Pirhadi, Motahhare Mirzaei, Sauleh Eetemadi
15 Dec 2022
Temporal Word Meaning Disambiguation using TimeLMs
Mihir Godbole, Parth Dandavate, Aditya Kane
15 Oct 2022
Probing Commonsense Knowledge in Pre-trained Language Models with Sense-level Precision and Expanded Vocabulary
Daniel Loureiro, A. Jorge
ReLM, KELM, AI4MH, LRM
12 Oct 2022
TempoWiC: An Evaluation Benchmark for Detecting Meaning Shift in Social Media
Daniel Loureiro, Aminette D'Souza, Areej Muhajab, Isabella A. White, Gabriel Wong, Luis Espinosa Anke, Leonardo Neves, Francesco Barbieri, Jose Camacho-Collados
15 Sep 2022
One Sense per Translation
B. Hauer, Grzegorz Kondrak
10 Jun 2021
Let's Play Mono-Poly: BERT Can Reveal Words' Polysemy Level and Partitionability into Senses
Aina Garí Soler, Marianna Apidianaki
MILM
29 Apr 2021
Knowledge Enhanced Contextual Word Representations
Matthew E. Peters, Mark Neumann, Robert L. Logan IV, Roy Schwartz, Vidur Joshi, Sameer Singh, Noah A. Smith
09 Sep 2019
The Bottom-up Evolution of Representations in the Transformer: A Study with Machine Translation and Language Modeling Objectives
Elena Voita, Rico Sennrich, Ivan Titov
03 Sep 2019
Probabilistic FastText for Multi-Sense Word Embeddings
Ben Athiwaratkun, A. Wilson, Anima Anandkumar
07 Jun 2018