DeepOSets: Non-Autoregressive In-Context Learning of Supervised Learning Operators

Abstract

We introduce DeepSets Operator Networks (DeepOSets), an efficient, non-autoregressive neural network architecture for in-context learning of permutation-invariant operators. DeepOSets combines the operator learning capabilities of Deep Operator Networks (DeepONets) with the set learning capabilities of DeepSets. Here, we present the application of DeepOSets to the problem of learning supervised learning algorithms, which are continuous permutation-invariant operators. We show that DeepOSets are universal approximators for this class of operators. In an empirical comparison with a popular autoregressive (transformer-based) model for in-context learning of linear regression, DeepOSets reduced the number of model weights by several orders of magnitude and required a fraction of training and inference time, in addition to significantly outperforming the transformer model in noisy settings. We also demonstrate the multiple operator learning capabilities of DeepOSets with a polynomial regression experiment where the order of the polynomial is learned in-context from the prompt.
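To make the architecture concrete, here is a minimal, illustrative NumPy sketch of the idea described above: a DeepSets-style encoder (a shared network applied to each prompt example followed by pooling) feeds the branch net of a DeepONet, whose output is combined with a trunk net evaluated at the query point. All layer sizes, the mean pooling, and the untrained random weights are assumptions for illustration, not the authors' exact configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_mlp(d_in, d_hidden, d_out):
    # Small two-layer MLP parameters (illustrative initialization).
    return (rng.normal(size=(d_in, d_hidden)) / np.sqrt(d_in),
            np.zeros(d_hidden),
            rng.normal(size=(d_hidden, d_out)) / np.sqrt(d_hidden),
            np.zeros(d_out))

def mlp(params, x):
    W1, b1, W2, b2 = params
    return np.tanh(x @ W1 + b1) @ W2 + b2

d_latent, d_feat = 16, 8
phi = init_mlp(2, 32, d_latent)         # shared per-example encoder (DeepSets)
rho = init_mlp(d_latent, 32, d_feat)    # post-pooling net, acts as the branch net
trunk = init_mlp(1, 32, d_feat)         # trunk net on the query location

def deeposets_predict(prompt_x, prompt_y, query_x):
    # DeepSets encoding: phi on each (x_i, y_i) pair, then mean pooling,
    # which makes the map permutation-invariant in the prompt.
    pairs = np.stack([prompt_x, prompt_y], axis=1)   # (n, 2)
    pooled = mlp(phi, pairs).mean(axis=0)            # (d_latent,)
    branch_out = mlp(rho, pooled)                    # (d_feat,)
    trunk_out = mlp(trunk, query_x.reshape(-1, 1))   # (m, d_feat)
    # DeepONet-style output: inner product of branch and trunk features.
    return trunk_out @ branch_out                    # (m,)

# Shuffling the prompt leaves the prediction unchanged (permutation invariance).
xs = rng.normal(size=5)
ys = 2.0 * xs + 0.1 * rng.normal(size=5)
q = np.array([0.3])
perm = rng.permutation(5)
assert np.allclose(deeposets_predict(xs, ys, q),
                   deeposets_predict(xs[perm], ys[perm], q))
```

Because the pooling step collapses the prompt into a fixed-size summary in a single forward pass, inference does not grow with autoregressive decoding steps, which is the efficiency argument the abstract makes.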

@article{chiu2025_2410.09298,
  title={DeepOSets: Non-Autoregressive In-Context Learning of Supervised Learning Operators},
  author={Shao-Ting Chiu and Junyuan Hong and Ulisses Braga-Neto},
  journal={arXiv preprint arXiv:2410.09298},
  year={2025}
}