Universal Approximation Theory: The Basic Theory for Transformer-based Large Language Models

1 July 2024
Wei Wang, Qing Li

Papers citing "Universal Approximation Theory: The Basic Theory for Transformer-based Large Language Models"

Multitask Prompted Training Enables Zero-Shot Task Generalization
Victor Sanh, Albert Webson, Colin Raffel, Stephen H. Bach, Lintang Sutawika, ..., T. Bers, Stella Biderman, Leo Gao, Thomas Wolf, Alexander M. Rush
LRM
15 Oct 2021

Universal Approximation Under Constraints is Possible with Transformers
Anastasis Kratsios, Behnoosh Zamanlooy, Tianlin Liu, Ivan Dokmanić
07 Oct 2021