On Sampling-Based Training Criteria for Neural Language Modeling

Interspeech, 2021
21 April 2021
Yingbo Gao
David Thulke
Alexander Gerstenberger
Viet Anh Khoa Tran
Ralf Schlüter
Hermann Ney
arXiv: 2104.10507

Papers citing "On Sampling-Based Training Criteria for Neural Language Modeling"

Self-Normalized Importance Sampling for Neural Language Modeling
Interspeech, 2021
Zijian Yang
Yingbo Gao
Alexander Gerstenberger
Jintao Jiang
Ralf Schlüter
Hermann Ney
11 Nov 2021