[Submitted on 21 Oct 2022]

Abstract: We prove a new generalization bound showing that, for any class of linear predictors in Gaussian space and any continuous loss $\ell$, the training error and the class's Rademacher complexity control the test error under any Moreau envelope of $\ell$. We use our finite-sample bound to directly recover the 'optimistic rate' of Zhou et al. (2021) for linear regression with the squared loss, which is known to be tight for minimum-$\ell_2$-norm interpolation, but we also handle more general settings where the label is generated by a potentially misspecified multi-index model. The same argument analyzes noisy interpolation of max-margin classifiers through the squared hinge loss and establishes consistency results in spiked-covariance settings. More generally, when the loss is only assumed to be Lipschitz, our bound effectively improves Talagrand's well-known contraction lemma by a factor of two, and we prove uniform convergence of interpolators (Koehler et al. 2021) for all smooth, non-negative losses. Finally, we show that applying our generalization bound with localized Gaussian widths is generally sharp for empirical risk minimizers, establishing a non-asymptotic Moreau envelope theory for generalization that applies outside the proportional scaling regime, handles model misspecification, and complements the existing asymptotic Moreau envelope theory for M-estimation.
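For context, the Moreau envelope referenced throughout is the infimal convolution of the loss with a quadratic. A minimal sketch of the standard definition with smoothing parameter $\lambda > 0$ (the paper's exact parametrization may differ):

\[
\ell_\lambda(u) \;=\; \inf_{v \in \mathbb{R}} \left\{ \ell(v) + \frac{1}{2\lambda}\,(u - v)^2 \right\}.
\]

In particular, $\ell_\lambda \le \ell$ pointwise, $\ell_\lambda$ is $\tfrac{1}{\lambda}$-smooth whenever $\ell$ is convex, and $\ell_\lambda$ increases to $\ell$ as $\lambda \downarrow 0$, which is why bounding test error under every Moreau envelope of the loss is a meaningful surrogate for bounding it under the loss itself.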

Submission history

From: Danica J. Sutherland

[v1]

Fri, 21 Oct 2022 16:16:55 UTC (3,002 KB)


