
Learning with BOT - Bregman and Optimal Transport divergences

  • Additional Information
    • Contributors:
      Cornell University New York; Laboratoire de Mathématiques et de leurs Applications Pau (LMAP); Université de Pau et des Pays de l'Adour (UPPA)-Centre National de la Recherche Scientifique (CNRS)
    • Publication Information:
      CCSD
    • Publication Date:
      2021
    • Collection:
      HAL e2s UPPA (Université de Pau et des Pays de l'Adour)
    • Abstract:
      The introduction of the Kullback-Leibler divergence in PAC-Bayesian theory can be traced back to the work of [1]. It makes it possible to design learning procedures whose generalization errors rest on an optimal trade-off between accuracy on the training set and complexity. This complexity is penalized through the Kullback-Leibler divergence from a prior distribution, which models domain knowledge over the set of candidates or weak learners. In the context of high-dimensional statistics, this gives rise to sparsity oracle inequalities or, more recently, sparsity regret bounds, where the complexity is measured through ℓ0- or ℓ1-norms. In this paper, we propose to extend PAC-Bayesian theory to obtain more generic regret bounds for sequential weighted averages, where (1) the measure of complexity is based on any ad hoc criterion and (2) the prior distribution can be very simple. These results arise by introducing a new measure of divergence from the prior in terms of Bregman divergence or Optimal Transport.
    • Online Access:
      https://hal.science/hal-03262687
      https://hal.science/hal-03262687v2/document
      https://hal.science/hal-03262687v2/file/Learning_with_BOT.pdf
    • Rights:
      info:eu-repo/semantics/OpenAccess
    • Accession Number:
      edsbas.49EB6114
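
The Bregman divergences named in the abstract generalize the Kullback-Leibler divergence: with the negative-entropy generator on the probability simplex, the Bregman divergence reduces exactly to KL. The following minimal NumPy sketch illustrates that relationship; it is not code from the paper, and the function names are purely illustrative.

```python
import numpy as np

def bregman_divergence(phi, grad_phi, p, q):
    """Bregman divergence D_phi(p, q) = phi(p) - phi(q) - <grad phi(q), p - q>."""
    return phi(p) - phi(q) - np.dot(grad_phi(q), p - q)

# Negative-entropy generator: on the probability simplex, its Bregman
# divergence coincides with the Kullback-Leibler divergence KL(p || q).
neg_entropy = lambda p: np.sum(p * np.log(p))
grad_neg_entropy = lambda p: np.log(p) + 1.0

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.4, 0.4, 0.2])

kl = np.sum(p * np.log(p / q))                              # direct KL formula
bd = bregman_divergence(neg_entropy, grad_neg_entropy, p, q)  # Bregman form
assert np.isclose(kl, bd)
```

The `np.dot(grad_phi(q), p - q)` term cancels the `+ 1.0` in the gradient because both distributions sum to one, which is why the two formulas agree on the simplex.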