Is Fine-tuning Needed? Pre-trained Language Models Are Near Perfect for Out-of-Domain Detection
Rheeya Uppaal, Junjie Hu, Yixuan Li
arXiv: 2305.13282 · 22 May 2023 · Community: OODD
Papers citing "Is Fine-tuning Needed? Pre-trained Language Models Are Near Perfect for Out-of-Domain Detection" (10 of 10 papers shown):
1. Shh, don't say that! Domain Certification in LLMs
   Cornelius Emde, Alasdair Paren, Preetham Arvind, Maxime Kayser, Tom Rainforth, Thomas Lukasiewicz, Bernard Ghanem, Philip H. S. Torr, Adel Bibi
   26 Feb 2025 · Citations: 1

2. Multi-Task Model Merging via Adaptive Weight Disentanglement
   Feng Xiong, Runxi Cheng, Wang Chen, Zhanqiu Zhang, Yiwen Guo, Chun Yuan, Ruifeng Xu
   Community: MoMe · 10 Jan 2025 · Citations: 4

3. Process Reward Model with Q-Value Rankings
   W. Li, Yixuan Li
   Community: LRM · 15 Oct 2024 · Citations: 13

4. Your Finetuned Large Language Model is Already a Powerful Out-of-distribution Detector
   Andi Zhang, Tim Z. Xiao, Weiyang Liu, Robert Bamler, Damon J. Wischik
   Community: OODD · 07 Apr 2024 · Citations: 4

5. How Useful is Continued Pre-Training for Generative Unsupervised Domain Adaptation?
   Rheeya Uppaal, Yixuan Li, Junjie Hu
   31 Jan 2024 · Citations: 4

6. Mitigating Neural Network Overconfidence with Logit Normalization
   Hongxin Wei, Renchunzi Xie, Hao-Ran Cheng, Lei Feng, Bo An, Yixuan Li
   Community: OODD · 19 May 2022 · Citations: 258

7. Types of Out-of-Distribution Texts and How to Detect Them
   Udit Arora, William Huang, He He
   Community: OODD · 14 Sep 2021 · Citations: 97

8. Revisiting Mahalanobis Distance for Transformer-Based Out-of-Domain Detection
   Alexander Podolskiy, Dmitry Lipin, A. Bout, Ekaterina Artemova, Irina Piontkovskaya
   Community: OODD · 11 Jan 2021 · Citations: 70

9. Calibration of Pre-trained Transformers
   Shrey Desai, Greg Durrett
   Community: UQLM · 17 Mar 2020 · Citations: 288
10. GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
    Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
    Community: ELM · 20 Apr 2018 · Citations: 6,003