Reducibility and Statistical-Computational Gaps from Secret Leakage
Matthew S. Brennan, Guy Bresler
Subject areas: Computational complexity, High-dimensional statistics
Presented in: Session 4A, Session 4C
Abstract:
Inference problems with conjectured statistical-computational gaps are ubiquitous throughout modern statistics, computer science, statistical physics and discrete probability. While there has been success in evidencing these gaps from the failure of restricted classes of algorithms, progress towards a more traditional reduction-based approach to computational complexity in statistical inference has been limited. These average-case problems are each tied to a different natural distribution, high-dimensional structure and conjecturally hard parameter regime, making reductions among them technically challenging. Despite a flurry of recent success in developing such techniques, existing reductions have largely been limited to inference problems with similar structure -- primarily mapping among problems representable as a sparse submatrix signal plus a noise matrix, a structure similar to that of the common starting hardness assumption of planted clique ($\textsc{pc}$).

The insight in this work is that a slight generalization of the planted clique conjecture -- secret leakage planted clique ($\textsc{pc}_\rho$), wherein a small amount of information about the hidden clique is revealed -- gives rise to a variety of new average-case reduction techniques, yielding a web of reductions relating statistical problems with very different structure. Based on generalizations of the planted clique conjecture to specific forms of $\textsc{pc}_\rho$, we deduce tight statistical-computational tradeoffs for a diverse range of problems including robust sparse mean estimation, mixtures of sparse linear regressions, robust sparse linear regression, tensor PCA, variants of dense $k$-block stochastic block models, negatively correlated sparse PCA, semirandom planted dense subgraph, detection in hidden partition models and a universality principle for learning sparse mixtures. This gives the first reduction-based evidence for a number of conjectured statistical-computational gaps.

We introduce a number of new average-case reduction techniques that also reveal novel connections to combinatorial designs based on the incidence geometry of $\mathbb{F}_r^t$ and to random matrix theory. In particular, we show a convergence result between Wishart and inverse Wishart matrices that may be of independent interest. The specific hardness conjectures for $\textsc{pc}_\rho$ implying our statistical-computational gaps are all in correspondence with natural graph problems such as $k$-partite, bipartite and hypergraph variants of $\textsc{pc}$. Hardness in a $k$-partite hypergraph variant of $\textsc{pc}$ is the strongest of these conjectures and is sufficient to establish all of our computational lower bounds. We also give evidence for our $\textsc{pc}_\rho$ hardness conjectures from the failure of low-degree polynomials and statistical query algorithms. Our work raises a number of open problems and suggests that previous technical obstacles to average-case reductions may have arisen because planted clique is not the right starting point. An expanded set of hardness assumptions, such as $\textsc{pc}_\rho$, may be a key first step towards a more complete theory of reductions among statistical problems.
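
For concreteness, the detection formulation behind $\textsc{pc}_\rho$ can be sketched as follows; the notation and the example leakage prior here are illustrative, and the paper's precise definitions of the priors $\rho$ should be consulted. Given a graph $G$ on $n$ vertices, one tests

$$H_0 : G \sim \mathcal{G}(n, 1/2) \qquad \text{versus} \qquad H_1 : G \sim \mathcal{G}(n, 1/2, S), \; S \sim \rho,$$

where $\mathcal{G}(n, 1/2, S)$ denotes an Erdős–Rényi graph with a clique planted on the $k$-subset $S$. Ordinary $\textsc{pc}$ corresponds to $\rho$ uniform over all $k$-subsets of vertices, while $\textsc{pc}_\rho$ allows priors that leak partial information about $S$: for instance, in a $k$-partite variant, $S$ contains exactly one vertex from each block of a fixed partition of the vertices into $k$ parts.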
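A minimal sampling sketch of this setup, assuming NumPy, appears below; the helper sample_planted_clique and the parameter parts (encoding a simple $k$-partite leakage prior) are hypothetical names introduced here for illustration and are not from the paper.

    import numpy as np

    def sample_planted_clique(n, k, parts=None, rng=None):
        # Sample an n-vertex graph: G(n, 1/2) background plus a
        # k-clique planted on a support S drawn from a prior.
        rng = np.random.default_rng(rng)
        upper = np.triu(rng.integers(0, 2, size=(n, n)), 1)
        adj = upper + upper.T
        if parts is None:
            # Ordinary PC: S uniform over k-subsets of the vertices.
            support = rng.choice(n, size=k, replace=False)
        else:
            # Illustrative k-partite leakage prior: one clique
            # vertex drawn from each of the k given blocks.
            support = np.array([rng.choice(block) for block in parts])
        adj[np.ix_(support, support)] = 1  # plant the clique on S
        adj[support, support] = 0          # no self-loops
        return adj, support

    # Example usage: k-partite leakage with equal-size blocks.
    n, k = 16, 4
    blocks = np.array_split(np.arange(n), k)
    G, S = sample_planted_clique(n, k, parts=blocks, rng=0)

Under the uniform prior (parts=None) this reduces to the standard planted clique distribution; the structured prior reveals which block each clique vertex lies in, which is the kind of partial information about the hidden clique that $\textsc{pc}_\rho$ formalizes.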