Outlier-Robust Learning of Ising Models Under Dobrushin's Condition
Ilias Diakonikolas, Daniel M. Kane, Alistair Stewart, Yuxin Sun
Session: Minimally Supervised Learning (B)
Session Chair: Aryeh Kontorovich
Poster: Poster Session 1
Abstract:
We study the problem of learning Ising models satisfying Dobrushin's condition in the outlier-robust setting, where a constant fraction of the samples are adversarially corrupted. Our main result is the first computationally efficient robust learning algorithm for this problem, achieving near-optimal error guarantees. Our algorithm can be seen as a special case of an algorithm for robustly learning a distribution from a general exponential family. To prove its correctness for Ising models, we establish new anti-concentration results for degree-2 polynomials of Ising models that may be of independent interest.
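As background, a standard formulation of the objects named in the abstract (stated here for the reader's convenience, not quoted from the paper; the paper's exact parameterization and constants may differ) is the following. An Ising model over $\{\pm 1\}^n$ with zero-diagonal interaction matrix $J$ and external field $h$ has density

    % Ising model distribution (standard form)
    p_{J,h}(x) \;=\; \frac{1}{Z(J,h)} \exp\!\Big( \tfrac{1}{2}\, x^{\top} J x + h^{\top} x \Big),
    \qquad x \in \{\pm 1\}^{n}, \quad J_{ii} = 0,

    % Dobrushin's (high-temperature) condition, for some constant \eta \in (0,1)
    \max_{i \in [n]} \sum_{j \neq i} |J_{ij}| \;\le\; 1 - \eta,

and in the $\epsilon$-corruption model an adversary may inspect the i.i.d. samples and replace an arbitrary $\epsilon$-fraction of them before the learner sees the data.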