ResearchTrend.AI


From Self-Attention to Markov Models: Unveiling the Dynamics of Generative Transformers

arXiv:2402.13512 · 21 February 2024

M. E. Ildiz, Yixiao Huang, Yingcong Li, A. S. Rawat, Samet Oymak

Papers citing "From Self-Attention to Markov Models: Unveiling the Dynamics of Generative Transformers"

5 / 5 papers shown

1. Dual Filter: A Mathematical Framework for Inference using Transformer-like Architectures
   Heng-Sheng Chang, P. Mehta · 01 May 2025

2. When is Task Vector Provably Effective for Model Editing? A Generalization Analysis of Nonlinear Transformers
   Hongkang Li, Yihua Zhang, Shuai Zhang, M. Wang, Sijia Liu, Pin-Yu Chen · 15 Apr 2025

3. State space models, emergence, and ergodicity: How many parameters are needed for stable predictions?
   Ingvar M. Ziemann, Nikolai Matni, George J. Pappas · 20 Sep 2024

4. Implicit Bias and Fast Convergence Rates for Self-attention
   Bhavya Vasudeva, Puneesh Deora, Christos Thrampoulidis · 08 Feb 2024

5. A Theoretical Analysis of the Repetition Problem in Text Generation
   Z. Fu, Wai Lam, Anthony Man-Cho So, Bei Shi · 29 Dec 2020