Active Learning Helps Pretrained Models Learn the Intended Task

18 April 2022
Alex Tamkin, Dat Nguyen, Salil Deshpande, Jesse Mu, Noah D. Goodman

Papers citing "Active Learning Helps Pretrained Models Learn the Intended Task" (13 papers)

Active Learning Principles for In-Context Learning with Large Language Models
Katerina Margatina, Timo Schick, Nikolaos Aletras, Jane Dwivedi-Yu
23 May 2023

Task Ambiguity in Humans and Language Models
Alex Tamkin, Kunal Handa, Ava Shrestha, Noah D. Goodman
Communities: UQLM
20 Dec 2022

cRedAnno+: Annotation Exploitation in Self-Explanatory Lung Nodule Diagnosis
Jiahao Lu, Chong Yin, Kenny Erleben, M. B. Nielsen, S. Darkner
28 Oct 2022

Multi-Domain Active Learning: Literature Review and Comparative Study
Ruidan He, Shengcai Liu, Shan He, Ke Tang
Communities: OOD
25 Jun 2021

Emerging Properties in Self-Supervised Vision Transformers
Mathilde Caron, Hugo Touvron, Ishan Misra, Hervé Jégou, Julien Mairal, Piotr Bojanowski, Armand Joulin
29 Apr 2021

Understanding the Capabilities, Limitations, and Societal Impact of Large Language Models
Alex Tamkin, Miles Brundage, Jack Clark, Deep Ganguli
Communities: AILaw, ELM
04 Feb 2021

Cold-start Active Learning through Self-supervised Language Modeling
Michelle Yuan, Hsuan-Tien Lin, Jordan L. Boyd-Graber
19 Oct 2020

An Investigation of Why Overparameterization Exacerbates Spurious Correlations
Shiori Sagawa, Aditi Raghunathan, Pang Wei Koh, Percy Liang
09 May 2020

Calibration of Pre-trained Transformers
Shrey Desai, Greg Durrett
Communities: UQLM
17 Mar 2020

Scaling Laws for Neural Language Models
Jared Kaplan, Sam McCandlish, T. Henighan, Tom B. Brown, B. Chess, R. Child, Scott Gray, Alec Radford, Jeff Wu, Dario Amodei
23 Jan 2020

Consistency-based Semi-supervised Active Learning: Towards Minimizing Labeling Cost
M. Gao, Zizhao Zhang, Guo-Ding Yu, Sercan Ö. Arik, L. Davis, Tomas Pfister
16 Oct 2019

Probabilistic Model-Agnostic Meta-Learning
Chelsea Finn, Kelvin Xu, Sergey Levine
Communities: BDL
07 Jun 2018

GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Jinpeng Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
Communities: ELM
20 Apr 2018