Is Fine-tuning Needed? Pre-trained Language Models Are Near Perfect for Out-of-Domain Detection
Rheeya Uppaal, Junjie Hu, Yixuan Li
arXiv: 2305.13282 · 22 May 2023 · OODD
Papers citing "Is Fine-tuning Needed? Pre-trained Language Models Are Near Perfect for Out-of-Domain Detection" (5 of 5 papers shown)
Mitigating Neural Network Overconfidence with Logit Normalization
Hongxin Wei, Renchunzi Xie, Hao-Ran Cheng, Lei Feng, Bo An, Yixuan Li
OODD · 19 May 2022

Types of Out-of-Distribution Texts and How to Detect Them
Udit Arora, William Huang, He He
OODD · 14 Sep 2021

Revisiting Mahalanobis Distance for Transformer-Based Out-of-Domain Detection
Alexander Podolskiy, Dmitry Lipin, A. Bout, Ekaterina Artemova, Irina Piontkovskaya
OODD · 11 Jan 2021

Calibration of Pre-trained Transformers
Shrey Desai, Greg Durrett
UQLM · 17 Mar 2020

GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
ELM · 20 Apr 2018