ResearchTrend.AI

On the accuracy of self-normalized log-linear models (arXiv:1506.04147)

Neural Information Processing Systems (NeurIPS), 2015
12 June 2015
Jacob Andreas
Maxim Rabinovich
Dan Klein
Michael I. Jordan
ArXiv (abs) · PDF · HTML
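As brief context for the paper and the works citing it: a self-normalized log-linear model adds a penalty that pushes the log-partition function log Z(x) toward zero during training, so that at test time the unnormalized score θ·f(x, y) can be read off as an approximate log-probability without summing over the label set. The sketch below illustrates that idea in a minimal form; it is not the paper's implementation, and all names and toy values are hypothetical.

```python
import numpy as np


def nll_with_self_normalization(theta, features, y, alpha=1.0):
    """Negative log-likelihood of a log-linear model plus a
    self-normalization penalty alpha * (log Z)^2.

    When log Z(x) is driven toward 0, the raw score theta . f(x, y)
    approximates log p(y | x), so test-time inference can skip the
    normalizing sum over all labels.
    """
    s = features @ theta                       # unnormalized scores s_y = theta . f(x, y)
    m = s.max()
    log_z = np.log(np.sum(np.exp(s - m))) + m  # numerically stable log-sum-exp
    nll = log_z - s[y]
    return nll + alpha * log_z ** 2


# Toy example (hypothetical values): 3 labels, 4 features per (x, y) pair.
rng = np.random.default_rng(0)
features = rng.normal(size=(3, 4))
theta = np.zeros(4)  # with theta = 0, every score is 0 and log Z = log 3
loss = nll_with_self_normalization(theta, features, y=1)
```

With `theta = 0` the loss reduces to `log 3 + (log 3)^2`, showing how the penalty term charges the model whenever the partition function drifts away from 1.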

Papers citing "On the accuracy of self-normalized log-linear models"

Showing 5 of 5 citing papers
Distributionally Robust Models with Parametric Likelihood Ratios
International Conference on Learning Representations (ICLR), 2022 · 13 Apr 2022
Paul Michel, Tatsunori Hashimoto, Graham Neubig
Topic: OOD
Analyzing and Improving the Optimization Landscape of Noise-Contrastive Estimation
International Conference on Learning Representations (ICLR), 2021 · 21 Oct 2021
Bingbin Liu, Elan Rosenfeld, Pradeep Ravikumar, Andrej Risteski
Von Mises-Fisher Loss for Training Sequence to Sequence Models with Continuous Outputs
10 Dec 2018
Sachin Kumar, Yulia Tsvetkov
Dropout with Expectation-linear Regularization
26 Sep 2016
Xuezhe Ma, Yingkai Gao, Zhiting Hu, Yaoliang Yu, Yuntian Deng, Eduard H. Hovy
Topic: UQCV
BlackOut: Speeding up Recurrent Neural Network Language Models With Very Large Vocabularies
21 Nov 2015
Shihao Ji, S.V.N. Vishwanathan, N. Satish, Michael J. Anderson, Pradeep Dubey