Is Fine-tuning Needed? Pre-trained Language Models Are Near Perfect for Out-of-Domain Detection

22 May 2023
Rheeya Uppaal, Junjie Hu, Yixuan Li
OODD

Papers citing "Is Fine-tuning Needed? Pre-trained Language Models Are Near Perfect for Out-of-Domain Detection"

5 / 5 papers shown
Mitigating Neural Network Overconfidence with Logit Normalization
Hongxin Wei, Renchunzi Xie, Hao-Ran Cheng, Lei Feng, Bo An, Yixuan Li
OODD · 19 May 2022
Types of Out-of-Distribution Texts and How to Detect Them
Udit Arora, William Huang, He He
OODD · 14 Sep 2021
Revisiting Mahalanobis Distance for Transformer-Based Out-of-Domain Detection
Alexander Podolskiy, Dmitry Lipin, A. Bout, Ekaterina Artemova, Irina Piontkovskaya
OODD · 11 Jan 2021
Calibration of Pre-trained Transformers
Shrey Desai, Greg Durrett
UQLM · 17 Mar 2020
GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Jinpeng Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
ELM · 20 Apr 2018