How to evaluate word embeddings? On importance of data efficiency and simple supervised tasks

7 February 2017
Stanislaw Jastrzebski, Damian Lesniak, Wojciech M. Czarnecki

Papers citing "How to evaluate word embeddings? On importance of data efficiency and simple supervised tasks" (5 of 5 papers shown)

  • Social-Group-Agnostic Word Embedding Debiasing via the Stereotype Content Model
    Ali Omrani, Brendan Kennedy, M. Atari, Morteza Dehghani
    11 Oct 2022

  • Learning Numeral Embeddings
    Chengyue Jiang, Zhonglin Nian, Kaihao Guo, Shanbo Chu, Yinggong Zhao, Libin Shen, Kewei Tu
    28 Dec 2019

  • Learning General Purpose Distributed Sentence Representations via Large Scale Multi-task Learning
    Sandeep Subramanian, Adam Trischler, Yoshua Bengio, C. Pal
    SSL
    30 Mar 2018

  • Improving Negative Sampling for Word Representation using Self-embedded Features
    Long Chen, Fajie Yuan, J. Jose, Weinan Zhang
    SSL
    26 Oct 2017

  • Convolutional Neural Networks for Sentence Classification
    Yoon Kim
    AILaw, VLM
    25 Aug 2014