arXiv: 2407.00935
Look Ahead or Look Around? A Theoretical Comparison Between Autoregressive and Masked Pretraining
1 July 2024
Qi Zhang, Tianqi Du, Haotian Huang, Yifei Wang, Yisen Wang
Papers citing "Look Ahead or Look Around? A Theoretical Comparison Between Autoregressive and Masked Pretraining"
5 / 5 papers shown

1. Elucidating the design space of language models for image generation
   Xuantong Liu, Shaozhe Hao, Xianbiao Qi, Tianyang Hu, Jun Wang, Rong Xiao, Yuan Yao
   VLM · 21 Oct 2024

2. COrAL: Order-Agnostic Language Modeling for Efficient Iterative Refinement
   Yuxi Xie, Anirudh Goyal, Xiaobao Wu, Xunjian Yin, Xiao Xu, Min-Yen Kan, Liangming Pan, William Yang Wang
   LRM · 12 Oct 2024

3. Masked Autoencoders Are Scalable Vision Learners
   Kaiming He, Xinlei Chen, Saining Xie, Yanghao Li, Piotr Dollár, Ross B. Girshick
   ViT, TPM · 11 Nov 2021

4. The Pile: An 800GB Dataset of Diverse Text for Language Modeling
   Leo Gao, Stella Biderman, Sid Black, Laurence Golding, Travis Hoppe, ..., Horace He, Anish Thite, Noa Nabeshima, Shawn Presser, Connor Leahy
   AIMat · 31 Dec 2020

5. GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
   Alex Jinpeng Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
   ELM · 20 Apr 2018