ResearchTrend.AI

arXiv:1910.00164
Predicting with High Correlation Features

1 October 2019
Devansh Arpit
Caiming Xiong
R. Socher
Abstract

It has been shown that, instead of learning actual object features, deep networks tend to exploit non-robust (spurious) discriminative features that are shared between the training and test sets. Therefore, while they achieve state-of-the-art performance on such test sets, they generalize poorly on out-of-distribution (OOD) samples, where the IID (independent and identically distributed) assumption breaks and the distribution of non-robust features shifts. In this paper, we consider distribution shift as a shift, at test time, in the distribution of input features that exhibit low correlation with targets in the training set. Under this definition, we evaluate existing robust feature learning methods and regularization methods and compare them against a baseline designed specifically to capture high correlation features in the training set. As a controlled test-bed, we design a colored MNIST (C-MNIST) dataset and find that existing methods trained on this set fail to generalize well on an OOD version of this dataset, showing that they overfit the low correlation color features. This is avoided by the baseline method trained on the same C-MNIST data, which is designed to learn high correlation features and is able to generalize to the test sets of the vanilla MNIST, MNIST-M, and SVHN datasets. Our code is available at \url{https://github.com/salesforce/corr_based_prediction}.
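The colored-MNIST construction described in the abstract can be sketched as follows. This is an illustrative assumption, not the paper's exact recipe (the repository linked above has the authors' implementation): each grayscale digit is tinted with a color that matches its class label with some probability `p`, so color is a partially predictive (spurious) cue in the training split, while the OOD split breaks that correlation by tinting at random.

```python
import numpy as np

# Hypothetical palette and coloring rule (assumptions for illustration):
# one RGB tint per digit class 0..9.
PALETTE = np.array([
    [1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0],
    [1.0, 1.0, 0.0], [1.0, 0.0, 1.0], [0.0, 1.0, 1.0],
    [1.0, 0.5, 0.0], [0.5, 0.0, 1.0], [0.0, 0.5, 0.5],
    [0.5, 0.5, 0.5],
])

def colorize(images, labels, p=0.9, rng=None):
    """Tint grayscale digits so color correlates with the label.

    images: (N, 28, 28) grayscale in [0, 1]; labels: (N,) ints in 0..9.
    With probability p an image gets its label's tint, otherwise a
    random tint. A training split might use a high p (spurious color
    cue present); an OOD split uses p ~ 0 (correlation broken).
    Returns (N, 28, 28, 3) RGB images.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    n = len(labels)
    random_colors = rng.integers(0, len(PALETTE), size=n)
    use_label = rng.random(n) < p            # which images keep the spurious cue
    idx = np.where(use_label, labels, random_colors)
    # Broadcast (N, 28, 28, 1) * (N, 1, 1, 3) -> (N, 28, 28, 3)
    return images[..., None] * PALETTE[idx][:, None, None, :]
```

A model that latches onto the color channel will score well when evaluated with the same high `p` but degrade on the `p = 0` split, which is the overfitting-to-low-correlation-features failure the abstract describes.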
