Introspective Distillation for Robust Question Answering
Yulei Niu, Hanwang Zhang
1 November 2021 · arXiv:2111.01026
Papers citing "Introspective Distillation for Robust Question Answering" (9 papers):
- "Overcoming Language Priors for Visual Question Answering Based on Knowledge Distillation" (10 Jan 2025). Daowan Peng, Wei Wei.
- "Relative Counterfactual Contrastive Learning for Mitigating Pretrained Stance Bias in Stance Detection" (16 May 2024). Jiarui Zhang, Shaojuan Wu, Xiaowang Zhang, Zhiyong Feng.
- "Look, Listen, and Answer: Overcoming Biases for Audio-Visual Question Answering" (18 Apr 2024). Jie Ma, Min Hu, Pinghui Wang, Wangchun Sun, Lingyun Song, Hongbin Pei, Jun Liu, Youtian Du.
- "Think Twice: Measuring the Efficiency of Eliminating Prediction Shortcuts of Question Answering Models" (11 May 2023). Lukáš Mikula, Michal Štefánik, Marek Petrovič, Petr Sojka.
- "Respecting Transfer Gap in Knowledge Distillation" (23 Oct 2022). Yulei Niu, Long Chen, Chan Zhou, Hanwang Zhang.
- "Prompt-aligned Gradient for Prompt Tuning" (30 May 2022). Beier Zhu, Yulei Niu, Yucheng Han, Yuehua Wu, Hanwang Zhang. [VLM]
- "Classification-Then-Grounding: Reformulating Video Scene Graphs as Temporal Bipartite Graphs" (8 Dec 2021). Kaifeng Gao, Long Chen, Yulei Niu, Jian Shao, Jun Xiao.
- "Balanced Knowledge Distillation for Long-tailed Learning" (21 Apr 2021). Shaoyu Zhang, Chen Chen, Xiyuan Hu, Silong Peng.
- "Counterfactual Samples Synthesizing for Robust Visual Question Answering" (14 Mar 2020). Long Chen, Xin Yan, Jun Xiao, Hanwang Zhang, Shiliang Pu, Yueting Zhuang. [OOD, AAML]