We develop and analyze algorithms for distributionally robust optimization (DRO) of convex losses. In particular, we consider group-structured and bounded $f$-divergence uncertainty sets. Our approach relies on an accelerated method that queries a ball optimization oracle, i.e., a subroutine that minimizes the objective within a small ball around the query point. Our main contribution is efficient implementations of this oracle for DRO objectives. For DRO with $N$ non-smooth loss functions, the resulting algorithms find an $\epsilon$-accurate solution with $\widetilde{O}(N\epsilon^{-2/3} + \epsilon^{-2})$ first-order oracle queries to individual loss functions. Compared to existing algorithms for this problem, we improve complexity by up to a factor of $\epsilon^{-4/3}$.
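To make the ball-oracle abstraction concrete, below is a minimal, hypothetical sketch (not the paper's implementation): it approximately minimizes a group-structured DRO objective, the maximum of group-average losses, restricted to a small Euclidean ball around a query point, using plain projected subgradient steps. The radius `r`, step counts, and the toy least-squares losses are illustrative assumptions.

```python
# Conceptual sketch of a "ball optimization oracle" for a group DRO objective.
# Not the accelerated method or oracle implementation from the paper.
import numpy as np


def project_to_ball(x, center, r):
    """Project x onto the Euclidean ball of radius r centered at `center`."""
    d = x - center
    norm = np.linalg.norm(d)
    return center + d * (r / norm) if norm > r else x


def ball_oracle(center, r, group_losses, group_grads, steps=200, lr=0.05):
    """Approximately minimize max_g group_losses[g](x) subject to ||x - center|| <= r."""
    x = center.copy()
    for _ in range(steps):
        values = np.array([loss(x) for loss in group_losses])
        worst = int(np.argmax(values))  # a subgradient of the max is the worst group's gradient
        x = project_to_ball(x - lr * group_grads[worst](x), center, r)
    return x


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy data: two groups, each with a least-squares loss.
    A = [rng.normal(size=(20, 5)) for _ in range(2)]
    b = [rng.normal(size=20) for _ in range(2)]
    losses = [lambda x, A=A[g], b=b[g]: 0.5 * np.mean((A @ x - b) ** 2) for g in range(2)]
    grads = [lambda x, A=A[g], b=b[g]: A.T @ (A @ x - b) / len(b) for g in range(2)]
    x0 = np.zeros(5)
    x_ball = ball_oracle(x0, r=0.5, group_losses=losses, group_grads=grads)
    print("worst-group loss at query point:", max(l(x0) for l in losses))
    print("worst-group loss after ball oracle:", max(l(x_ball) for l in losses))
```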