Multi-task Active Learning for Pre-trained Transformer-based Models
Guy Rotman, Roi Reichart
arXiv:2208.05379 · 10 August 2022
ArXiv | PDF | HTML
Papers citing "Multi-task Active Learning for Pre-trained Transformer-based Models" (7 / 7 papers shown)

| Title | Authors | Tags | Counts | Date |
| --- | --- | --- | --- | --- |
| Balancing Accuracy, Calibration, and Efficiency in Active Learning with Vision Transformers Under Label Noise | Moseli Motsóehli, Hope Mogale, Kyungim Baek | | 38 / 0 / 0 | 07 May 2025 |
| Assistive Image Annotation Systems with Deep Learning and Natural Language Capabilities: A Review | Moseli Motsóehli | VLM, 3DV | 30 / 0 / 0 | 28 Jun 2024 |
| Semi-Automated Construction of Food Composition Knowledge Base | Jason Youn, Fangzhou Li, I. Tagkopoulos | | 37 / 0 / 0 | 24 Jan 2023 |
| A Survey of Active Learning for Natural Language Processing | Zhisong Zhang, Emma Strubell, Eduard H. Hovy | LM&MA | 25 / 65 / 0 | 18 Oct 2022 |
| Calibration of Pre-trained Transformers | Shrey Desai, Greg Durrett | UQLM | 243 / 289 / 0 | 17 Mar 2020 |
| GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding | Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman | ELM | 297 / 6,950 / 0 | 20 Apr 2018 |
| Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning | Y. Gal, Zoubin Ghahramani | UQCV, BDL | 282 / 9,136 / 0 | 06 Jun 2015 |