Multi-task Active Learning for Pre-trained Transformer-based Models
Guy Rotman, Roi Reichart
arXiv 2208.05379 · 10 August 2022

Papers citing "Multi-task Active Learning for Pre-trained Transformer-based Models"

9 of 9 citing papers shown
Balancing Accuracy, Calibration, and Efficiency in Active Learning with Vision Transformers Under Label Noise
Moseli Motsóehli, Hope Mogale, Kyungim Baek
07 May 2025 · Citations: 0

Assistive Image Annotation Systems with Deep Learning and Natural Language Capabilities: A Review
Moseli Motsóehli
Topics: VLM, 3DV
28 Jun 2024 · Citations: 0

EASE: An Easily-Customized Annotation System Powered by Efficiency Enhancement Mechanisms
Naihao Deng, Yikai Liu, Mingye Chen, Winston Wu, Siyang Liu, Yulong Chen, Yue Zhang, Rada Mihalcea
23 May 2023 · Citations: 0

Active Prompting with Chain-of-Thought for Large Language Models
Shizhe Diao, Pengcheng Wang, Yong Lin, Tong Zhang
Topics: ReLM, KELM, LLMAG, LRM
23 Feb 2023 · Citations: 119

Semi-Automated Construction of Food Composition Knowledge Base
Jason Youn, Fangzhou Li, I. Tagkopoulos
24 Jan 2023 · Citations: 0

A Survey of Active Learning for Natural Language Processing
Zhisong Zhang, Emma Strubell, Eduard H. Hovy
Topics: LM&MA
18 Oct 2022 · Citations: 65

Calibration of Pre-trained Transformers
Shrey Desai, Greg Durrett
Topics: UQLM
17 Mar 2020 · Citations: 289

GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
Topics: ELM
20 Apr 2018 · Citations: 6,956

Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
Y. Gal, Zoubin Ghahramani
Topics: UQCV, BDL
06 Jun 2015 · Citations: 9,136