Learning with differentiable perturbed optimizers

  • Quentin Berthet
  • Mathieu Blondel
  • Olivier Teboul
  • Marco Cuturi
  • Jean-Philippe Vert
  • Francis Bach

Research output: Contribution to journal › Conference article › peer-review

Abstract

Machine learning pipelines often rely on optimization procedures to make discrete decisions (e.g., sorting, picking closest neighbors, or shortest paths). Although these discrete decisions are easily computed, they break back-propagation through computational graphs. In order to expand the scope of learning problems that can be solved in an end-to-end fashion, we propose a systematic method to transform optimizers into operations that are differentiable and never locally constant. Our approach relies on stochastically perturbed optimizers, and can be used readily together with existing solvers. Their derivatives can be evaluated efficiently, and smoothness can be tuned via the chosen noise amplitude. We also show how this framework can be connected to a family of losses developed in structured prediction, and give theoretical guarantees for their use in learning tasks. We demonstrate the performance of our approach experimentally on various tasks.
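As a rough illustration of the idea summarized in the abstract (not the authors' implementation), the sketch below estimates a perturbed argmax over one-hot vectors by Monte Carlo: the discrete solver is run on noisy inputs and its solutions are averaged, yielding an operation that is smooth rather than piecewise constant. The function name, the choice of Gaussian noise, and the sample count are illustrative assumptions.

import numpy as np

def perturbed_argmax(theta, epsilon=1.0, n_samples=1000, rng=None):
    # Monte Carlo estimate of E_Z[argmax_y <y, theta + epsilon * Z>],
    # where y ranges over one-hot vectors and Z is standard Gaussian noise.
    # The result is a smooth, never locally constant relaxation of argmax.
    rng = np.random.default_rng() if rng is None else rng
    d = theta.shape[0]
    # Draw perturbations and solve the (here trivial) discrete problem for each.
    z = rng.standard_normal((n_samples, d))
    winners = np.argmax(theta + epsilon * z, axis=1)
    # Average the one-hot solutions: each coordinate is the empirical win frequency.
    return np.bincount(winners, minlength=d) / n_samples

theta = np.array([1.0, 1.2, 0.3])
print(perturbed_argmax(theta, epsilon=0.5))  # a soft distribution concentrated on index 1

In this toy setting, the noise amplitude epsilon plays the role described in the abstract: larger values give a smoother, more spread-out output, while smaller values approach the original hard argmax.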

Original language: English
Journal: Advances in Neural Information Processing Systems
Volume: 2020-December
Publication status: Published - 1 Jan 2020
Externally published: Yes
Event: 34th Conference on Neural Information Processing Systems, NeurIPS 2020 - Virtual, Online
Duration: 6 Dec 2020 - 12 Dec 2020
