Less is Better: Recovering Intended-Feature Subspace to Robustify NLU Models

International Conference on Computational Linguistics (COLING), 2022
16 September 2022
Ting Wu
Tao Gui
ArXiv (abs) | PDF | HTML

Papers citing "Less is Better: Recovering Intended-Feature Subspace to Robustify NLU Models"

3 papers shown
Not Eliminate but Aggregate: Post-Hoc Control over Mixture-of-Experts to Address Shortcut Shifts in Natural Language Understanding
Ukyo Honda
Tatsushi Oka
Peinan Zhang
Masato Mita
17 Jun 2024
Modeling the Q-Diversity in a Min-max Play Game for Robust Optimization
Annual Meeting of the Association for Computational Linguistics (ACL), 2023
Ting Wu
Rui Zheng
Tao Gui
Xuanjing Huang
20 May 2023
Kernel-Whitening: Overcome Dataset Bias with Isotropic Sentence Embedding
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2022
Songyang Gao
Jiajun Sun
Tao Gui
Xuanjing Huang
14 Oct 2022