arXiv:2406.11274
Skip-Layer Attention: Bridging Abstract and Detailed Dependencies in Transformers
17 June 2024
Qian Chen, Wen Wang, Qinglin Zhang, Siqi Zheng, Shiliang Zhang, Chong Deng, Hai Yu, Jiaqing Liu, Yukun Ma, Chong Zhang
Papers citing "Skip-Layer Attention: Bridging Abstract and Detailed Dependencies in Transformers"
3 / 3 papers shown
1. A Systematic Study of Cross-Layer KV Sharing for Efficient LLM Inference
   You Wu, Haoyi Wu, Kewei Tu
   18 Oct 2024
2. Training language models to follow instructions with human feedback
   Long Ouyang, Jeff Wu, Xu Jiang, Diogo Almeida, Carroll L. Wainwright, ..., Amanda Askell, Peter Welinder, Paul Christiano, Jan Leike, Ryan J. Lowe
   Tags: OSLM, ALM
   04 Mar 2022
3. Densely Connected Convolutional Networks
   Gao Huang, Zhuang Liu, Laurens van der Maaten, Kilian Q. Weinberger
   Tags: PINN, 3DV
   25 Aug 2016