
A nonasymptotic law of iterated logarithm for general M-estimators

  • ENSAE

Research output: Contribution to journal › Conference article › Peer-reviewed

Abstract

M-estimators are ubiquitous in machine learning and statistical learning theory. They are used both for defining prediction strategies and for evaluating their precision. In this paper, we propose the first nonasymptotic "any-time" deviation bounds for general M-estimators, where "any-time" means that the bound holds with a prescribed probability for every sample size. These bounds are nonasymptotic versions of the law of iterated logarithm. They are established under general assumptions such as Lipschitz continuity of the loss function and (local) curvature of the population risk. These conditions are satisfied for most examples used in machine learning, including those ensuring robustness to outliers and to heavy-tailed distributions. As an example of application, we consider the problem of best arm identification in a stochastic multi-armed bandit setting. We show that the established bound can be converted into a new algorithm with provably optimal theoretical guarantees. Numerical experiments illustrating the validity of the algorithm are reported.
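To make the best-arm-identification application concrete, the following is a minimal sketch of how an any-time (LIL-type) confidence radius can drive a successive-elimination strategy. The constant in `lil_radius` and the elimination rule are illustrative assumptions, not the bound or algorithm from the paper; the point is only that a radius shrinking like sqrt(log log n / n) remains valid simultaneously over all sample sizes, so arms can be eliminated at any time.

```python
import math
import random

def lil_radius(n, delta, c=2.0):
    # Illustrative any-time confidence radius of law-of-iterated-logarithm
    # type: it shrinks like sqrt(log log n / n), so the bound can be made
    # to hold with probability >= 1 - delta uniformly over all n >= 1.
    # The constant c = 2.0 is a placeholder, not the paper's constant.
    return math.sqrt(c * math.log(math.log2(2 * n) / delta) / n)

def best_arm_elimination(pulls, delta=0.05, budget=20000):
    """Successive elimination with any-time confidence bounds.

    `pulls` is a list of zero-argument samplers, one per arm.
    Returns the index of the arm believed to have the largest mean.
    """
    k = len(pulls)
    active = set(range(k))
    sums = [0.0] * k
    counts = [0] * k
    for _ in range(budget):
        if len(active) == 1:
            break
        for a in list(active):  # pull every surviving arm once
            sums[a] += pulls[a]()
            counts[a] += 1
        means = {a: sums[a] / counts[a] for a in active}
        rads = {a: lil_radius(counts[a], delta / k) for a in active}
        # Drop arms whose upper confidence bound falls below the best
        # lower confidence bound; the any-time property makes this
        # check valid at every round, not just a preset sample size.
        best_lcb = max(means[a] - rads[a] for a in active)
        active = {a for a in active if means[a] + rads[a] >= best_lcb}
    return max(active, key=lambda a: sums[a] / counts[a])
```

For example, with two Bernoulli arms of means 0.9 and 0.2, the suboptimal arm is eliminated once the two confidence intervals separate, typically after a few hundred pulls.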

Original language: English
Pages (from-to): 1331-1341
Number of pages: 11
Journal: Proceedings of Machine Learning Research
Volume: 108
Status: Published - 1 Jan 2020
Externally published: Yes
Event: 23rd International Conference on Artificial Intelligence and Statistics, AISTATS 2020 - Virtual, Online
Duration: 26 Aug 2020 - 28 Aug 2020
