
Task Generalization With AutoRegressive Compositional Structure: Can Learning From $D$ Tasks Generalize to $D^{T}$ Tasks?

Main: 9 pages · 6 figures · 3 tables · Bibliography: 2 pages · Appendix: 8 pages
Abstract

Large language models (LLMs) exhibit remarkable task generalization, solving tasks they were never explicitly trained on with only a few demonstrations. This raises a fundamental question: When can learning from a small set of tasks generalize to a large task family? In this paper, we investigate task generalization through the lens of autoregressive compositional structure, where each task is a composition of $T$ operations, and each operation is among a finite family of $D$ subtasks. This yields a total class of size $D^T$. We first show that generalization to all $D^T$ tasks is theoretically achievable by training on only $\widetilde{O}(D)$ tasks. Empirically, we demonstrate that Transformers achieve such exponential task generalization on sparse parity functions via in-context learning (ICL) and chain-of-thought (CoT) reasoning. We further show generalization in arithmetic and translation, beyond parity functions.
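To make the sparse-parity setup concrete, the sketch below builds ICL-style demonstrations for one task in the compositional family: a task is defined by $T$ secret coordinates out of $D$ bits, and its CoT trace reveals the running parity after each coordinate, mirroring the autoregressive composition of subtasks. This is a minimal illustration under standard assumptions about the parity formulation; the function and parameter names (`make_parity_task`, `sample_icl_prompt`, `with_cot`) are hypothetical and not taken from the paper.

```python
import itertools
import random


def make_parity_task(secret_indices):
    """Return a labeling function: the XOR (parity) of the bits at secret_indices."""
    def label(x):
        return sum(x[i] for i in secret_indices) % 2
    return label


def sample_icl_prompt(secret_indices, n_bits, n_demos, with_cot=False, rng=random):
    """Build in-context demonstrations of (input, [chain-of-thought,] label)."""
    demos = []
    for _ in range(n_demos):
        x = [rng.randint(0, 1) for _ in range(n_bits)]
        y = sum(x[i] for i in secret_indices) % 2
        if with_cot:
            # CoT trace: running parity after each secret coordinate,
            # i.e. the T intermediate subtask outputs.
            partials = list(itertools.accumulate(
                (x[i] for i in secret_indices), lambda a, b: (a + b) % 2))
            demos.append((x, partials, y))
        else:
            demos.append((x, y))
    return demos


# Example: D = 10 bits, T = 3 secret coordinates -> roughly D^T distinct tasks.
rng = random.Random(0)
secret = rng.sample(range(10), 3)
for demo in sample_icl_prompt(secret, n_bits=10, n_demos=4, with_cot=True, rng=rng):
    print(demo)
```

Each demonstration triple shows the input bits, the CoT intermediate parities, and the final label; dropping `with_cot` yields the plain ICL format without intermediate steps.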
