VisDA-2021 Competition: Universal Domain Adaptation to Improve Performance on Out-of-Distribution Data

23 July 2021
Authors: D. Bashkirova, Dan Hendrycks, Donghyun Kim, Samarth Mishra, Kate Saenko, Kuniaki Saito, Piotr Teterwak, Ben Usman
Topics: OOD

Papers citing "VisDA-2021 Competition: Universal Domain Adaptation to Improve Performance on Out-of-Distribution Data"

5 / 5 papers shown
Pre-Training Transformers for Domain Adaptation
Burhan Ul Tayyab, Nicholas Chua
Topics: ViT
18 Dec 2021

PixMix: Dreamlike Pictures Comprehensively Improve Safety Measures
Dan Hendrycks, Andy Zou, Mantas Mazeika, Leonard Tang, Bo-wen Li, D. Song, Jacob Steinhardt
Topics: UQCV
09 Dec 2021

Are Transformers More Robust Than CNNs?
Yutong Bai, Jieru Mei, Alan Yuille, Cihang Xie
Topics: ViT, AAML
10 Nov 2021

Zero-Shot Text-to-Image Generation
Aditya A. Ramesh, Mikhail Pavlov, Gabriel Goh, Scott Gray, Chelsea Voss, Alec Radford, Mark Chen, Ilya Sutskever
Topics: VLM
24 Feb 2021

Confidence Regularized Self-Training
Yang Zou, Zhiding Yu, Xiaofeng Liu, B. Kumar, Jinsong Wang
26 Aug 2019