ResearchTrend.AI

Scaling Hidden Markov Language Models
Justin T. Chiu, Alexander M. Rush
arXiv: 2011.04640, 9 November 2020
Tags: BDL

Papers citing "Scaling Hidden Markov Language Models" (10 of 10 shown):
  1. Investigating the Impact of Model Complexity in Large Language Models. Jing Luo, Huiyuan Wang, Weiran Huang. 01 Oct 2024.
  2. Information Transfer Rate in BCIs: Towards Tightly Integrated Symbiosis. Suayb S. Arslan, Pawan Sinha. 01 Jan 2023.
  3. Learning Dependencies of Discrete Speech Representations with Neural Hidden Markov Models. Sung-Lin Yeh, Hao Tang. Tags: SSL, BDL. 29 Oct 2022.
  4. Scaling Up Probabilistic Circuits by Latent Variable Distillation. Anji Liu, Honghua Zhang, Guy Van den Broeck. Tags: TPM. 10 Oct 2022.
  5. Dynamic Programming in Rank Space: Scaling Structured Inference with Low-Rank HMMs and PCFGs. Songlin Yang, Wei Liu, Kewei Tu. 01 May 2022.
  6. Low-Rank Constraints for Fast Inference in Structured Models. Justin T. Chiu, Yuntian Deng, Alexander M. Rush. Tags: BDL. 08 Jan 2022.
  7. Scaling Structured Inference with Randomization. Yao Fu, John P. Cunningham, Mirella Lapata. Tags: BDL. 07 Dec 2021.
  8. Sequence-to-Sequence Learning with Latent Neural Grammars. Yoon Kim. 02 Sep 2021.
  9. Why Do Pretrained Language Models Help in Downstream Tasks? An Analysis of Head and Prompt Tuning. Colin Wei, Sang Michael Xie, Tengyu Ma. 17 Jun 2021.
  10. PCFGs Can Do Better: Inducing Probabilistic Context-Free Grammars with Many Symbols. Songlin Yang, Yanpeng Zhao, Kewei Tu. 28 Apr 2021.