Inverse Statistical Learning: minimax theory and adaptation

Speaker
Sébastien Loustau
Speaker's institution
Université d'Angers
Date and time of the talk
Location of the talk
Salle Eole

We consider the problem of statistical learning from a contaminated sample. We establish minimax fast rates of convergence in classification with errors in variables for deconvolution empirical risk minimizers. These rates depend on the ill-posedness, the margin, and the complexity of the problem. The cornerstone of the proof is a bias-variance decomposition of the excess risk. We then investigate adaptation to the unknown smoothness. We introduce a new selection rule called ERC (Empirical Risk Comparison), which yields adaptive fast rates of convergence in noisy clustering. The method is based on Lepski's procedure, where empirical risks associated with different bandwidths are compared. This adaptive rule can be used in many M-estimation problems where the empirical risk depends on a nuisance parameter.
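The comparison idea behind a Lepski-type rule can be sketched in a few lines. The following is a hypothetical toy illustration, not the paper's exact ERC rule: on a grid of bandwidths, select the largest bandwidth whose empirical risk does not exceed that of any smaller bandwidth by more than a penalty term; the `risk` and `penalty` functions below are made-up placeholders.

```python
def lepski_select(bandwidths, risk, penalty):
    """Lepski-type bandwidth selection (illustrative sketch only).

    Pick the largest bandwidth h whose (empirical) risk is within
    penalty(h') of the risk at every smaller bandwidth h'.
    """
    hs = sorted(bandwidths)
    selected = hs[0]
    for i, h in enumerate(hs):
        # compare the risk at h with the risks at all smaller bandwidths
        if all(risk(h) <= risk(hp) + penalty(hp) for hp in hs[:i]):
            selected = h
    return selected

# Toy example: a risk minimized near h = 0.3, with a penalty that is
# larger for small bandwidths (mimicking a variance term).
risk = lambda h: (h - 0.3) ** 2
penalty = lambda h: 0.05 / max(h, 1e-6)
grid = [0.1, 0.2, 0.3, 0.4, 0.5, 0.8]
h_hat = lepski_select(grid, risk, penalty)
```

In this toy setting the rule balances the (decreasing) penalty against the growing risk of oversmoothing, stopping before the bandwidth at which the risk increase exceeds the penalty.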