Fast Rates for Online Prediction with Abstention
Gergely Neu, Nikita Zhivotovskiy
Subject areas: Online learning, Classification, Excess risk bounds and generalization error bounds, Loss functions, PAC learning
Presented in: Session 2E, Session 3A
Abstract:
In the setting of sequential prediction of individual 0,1-sequences with expert advice, we show that by allowing the learner to abstain from predicting at a cost marginally smaller than 1/2 (say, 0.49), it is possible to achieve expected regret bounds that are independent of the time horizon T. We exactly characterize the dependence on the abstention cost c and the number of experts N by providing matching upper and lower bounds of order (log N)/(1-2c), which is to be contrasted with the best possible rate of √(T log N) available without the option to abstain. We also discuss several extensions of our model, including a setting where the sequence of abstention costs may change arbitrarily over time; there, under natural assumptions on the cost sequence, we show regret bounds that interpolate between the slow and fast rates mentioned above.
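As a quick numerical illustration of the gap between the two rates stated in the abstract, the following sketch evaluates both bounds up to constant factors (the function names and the chosen values of N, c, and T are ours, not from the paper):

```python
import math

def abstention_bound(N, c):
    """Horizon-independent rate of order log N / (1 - 2c), for abstention cost c < 1/2."""
    return math.log(N) / (1.0 - 2.0 * c)

def standard_bound(N, T):
    """Best possible rate of order sqrt(T log N) without the abstention option."""
    return math.sqrt(T * math.log(N))

# Example: N = 100 experts, abstention cost c = 0.49, horizon T = 10^6.
# The abstention bound does not depend on T, while the standard bound grows with T.
print(abstention_bound(100, 0.49))   # ~ log(100) / 0.02
print(standard_bound(100, 10**6))    # ~ sqrt(10^6 * log(100))
```

Even with the abstention cost pushed close to 1/2, the horizon-independent bound remains a constant in T, whereas the standard rate keeps growing as the horizon lengthens.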