Inherent limitations of dimensions for characterizing learnability of distribution classes
Lechner, Tosca; Ben-David, Shai
Session: 1B-2 (Distribution Learning), Sunday, Jun 30, 10:45-12:00
Abstract:
We consider the long-standing question of finding a parameter of a class of probability distributions
that characterizes its PAC learnability. While for many learning tasks (such as binary classification
and online learning) there is a notion of dimension whose finiteness is equivalent to learnability
within any level of accuracy, we show, rather surprisingly, that no such parameter exists for
distribution learning.
Concretely, our results apply to several general notions of characterizing learnability and to
several learning tasks. We show that there is no notion of dimension that characterizes the sample
complexity of learning distribution classes. We then consider the weaker requirement of only
characterizing learnability (rather than the quantitative sample complexity function). We propose
some natural requirements for such a characterization and go on to show that no characterization
of learnability satisfying these requirements exists for classes of distributions. Furthermore, we
show that our results hold for various other learning problems. In particular, we show that there is
no notion of dimension characterizing PAC-learnability for either of the following tasks: classification
learning w.r.t. a restricted set of marginal distributions, and learnability of classes of real-valued
functions with continuous losses.
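
For context, the prototypical example of the kind of dimension-based characterization referred to above (background, not a claim of this paper) is the fundamental theorem of statistical learning for binary classification: a hypothesis class $\mathcal{H}$ is agnostically PAC learnable if and only if its VC dimension $d = \mathrm{VCdim}(\mathcal{H})$ is finite, in which case the sample complexity satisfies
$m_{\mathcal{H}}(\epsilon, \delta) = \Theta\!\left(\frac{d + \log(1/\delta)}{\epsilon^{2}}\right)$.
The results summarized above show that no analogous quantity can play this role for classes of distributions.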