Preprint / Working Paper. Year: 2023

Uniform Risk Bounds for Learning with Dependent Data Sequences

Fabien Lauer

Abstract

This paper extends standard results from learning theory with independent data to sequences of dependent data. Unlike most of the literature, we do not rely on mixing arguments or sequential measures of complexity; instead, we derive uniform risk bounds with classical proof patterns and capacity measures. In particular, we show that the standard classification risk bounds based on the VC-dimension hold in the exact same form for dependent data, and we further provide Rademacher complexity-based bounds that remain unchanged compared to the standard results for the independently and identically distributed (i.i.d.) case. Finally, we show how to apply these results in the context of scenario-based optimization in order to compute the sample complexity of random programs with dependent constraints.
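For reference, the two classical i.i.d. bounds alluded to above can be sketched as follows; these are the textbook forms for losses taking values in [0, 1], and the constants and exact conditions established in the paper for dependent data sequences may differ.

% Classical i.i.d. uniform risk bounds (illustrative reference forms only;
% the paper's dependent-data statements may use different constants).
% R(f) is the true risk, \widehat{R}_n(f) the empirical risk on n samples,
% \mathfrak{R}_n(\mathcal{F}) the (expected) Rademacher complexity of the class.
% With probability at least 1 - \delta, uniformly over all f in \mathcal{F}:
%
% Rademacher complexity-based bound:
\[
  R(f) \;\le\; \widehat{R}_n(f) \;+\; 2\,\mathfrak{R}_n(\mathcal{F})
  \;+\; \sqrt{\frac{\log(1/\delta)}{2n}} ,
\]
% VC-dimension-based bound for a class of VC-dimension d (up to constants):
\[
  R(f) \;\le\; \widehat{R}_n(f)
  \;+\; O\!\left(\sqrt{\frac{d\,\log(n/d) + \log(1/\delta)}{n}}\right) .
\]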
Main file
Lauer23dependent.pdf (411.73 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04037480 , version 1 (20-03-2023)

Identifiers

HAL Id: hal-04037480

Cite

Fabien Lauer. Uniform Risk Bounds for Learning with Dependent Data Sequences. 2023. ⟨hal-04037480⟩