MulT: An End-to-End Multitask Learning Transformer
arXiv:2205.08303, 17 May 2022
Deblina Bhattacharjee, Tong Zhang, Sabine Süsstrunk, Mathieu Salzmann
Tags: ViT

Papers citing "MulT: An End-to-End Multitask Learning Transformer" (9 papers)
SGW-based Multi-Task Learning in Vision Tasks
Ruiyuan Zhang, Yuyao Chen, Yuchi Huo, Jiaxiang Liu, Dianbing Xi, Jie Liu, Chao Wu
03 Oct 2024

AutoTask: Task Aware Multi-Faceted Single Model for Multi-Task Ads Relevance
Shouchang Guo, Sonam Damani, Keng-hao Chang
09 Jul 2024

4M: Massively Multimodal Masked Modeling
David Mizrahi, Roman Bachmann, Oğuzhan Fatih Kar, Teresa Yeo, Mingfei Gao, Afshin Dehghan, Amir Zamir
Tags: MLLM
11 Dec 2023

PolyMaX: General Dense Prediction with Mask Transformer
Xuan S. Yang, Liangzhe Yuan, Kimberly Wilber, Astuti Sharma, Xiuye Gu, ..., Stephanie Debats, Huisheng Wang, Hartwig Adam, Mikhail Sirotenko, Liang-Chieh Chen
09 Nov 2023

Multi-Similarity Contrastive Learning
Emily Mu, John Guttag, Maggie Makar
Tags: SSL
06 Jul 2023

InvPT++: Inverted Pyramid Multi-Task Transformer for Visual Scene Understanding
Hanrong Ye, Dan Xu
Tags: ViT
08 Jun 2023

MTLSegFormer: Multi-task Learning with Transformers for Semantic Segmentation in Precision Agriculture
D. Gonçalves, J. M. Junior, Pedro Zamboni, H. Pistori, Jonathan Li, Keiller Nogueira, W. Gonçalves
04 May 2023

Are Transformers More Robust Than CNNs?
Yutong Bai, Jieru Mei, Alan Yuille, Cihang Xie
Tags: ViT, AAML
10 Nov 2021

A Decomposable Attention Model for Natural Language Inference
Ankur P. Parikh, Oscar Täckström, Dipanjan Das, Jakob Uszkoreit
06 Jun 2016