Unsupervised Attention-based Sentence-Level Meta-Embeddings from Contextualised Language Models

16 April 2022
Keigo Takahashi
Danushka Bollegala

Papers citing "Unsupervised Attention-based Sentence-Level Meta-Embeddings from Contextualised Language Models"

5 papers shown

Together We Make Sense -- Learning Meta-Sense Embeddings from Pretrained Static Sense Embeddings
Haochen Luo, Yi Zhou, Danushka Bollegala
30 May 2023

TagCLIP: Improving Discrimination Ability of Open-Vocabulary Semantic Segmentation
Jingyao Li, Pengguang Chen, Shengju Qian, Jiaya Jia
15 Apr 2023

Learning Meta Word Embeddings by Unsupervised Weighted Concatenation of Source Embeddings
Danushka Bollegala
26 Apr 2022

A Survey on Word Meta-Embedding Learning
Danushka Bollegala, James O'Neill
25 Apr 2022

Efficient Estimation of Word Representations in Vector Space
Tomáš Mikolov, Kai Chen, G. Corrado, J. Dean
16 Jan 2013