Implicit Diffusion: Efficient optimization through stochastic sampling

Pierre Marion, Anna Korba, Peter Bartlett, Mathieu Blondel, Valentin De Bortoli, Arnaud Doucet, Felipe Llinares-Lopez, Courtney Paquette, Quentin Berthet

Research output: Contribution to journal › Conference article › Peer-reviewed

Abstract

Sampling and automatic differentiation are both ubiquitous in modern machine learning. At their intersection, differentiating through a sampling operation with respect to the parameters of the sampling process is a problem that is both challenging and broadly applicable. We introduce a general framework and a new algorithm for first-order optimization of parameterized stochastic diffusions, performing optimization and sampling steps jointly, in a single loop. This approach is inspired by recent advances in bilevel optimization and automatic implicit differentiation, leveraging the view of sampling as optimization over the space of probability distributions. We provide theoretical and experimental results showcasing the performance of our method.
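To make the single-loop idea concrete, here is a minimal toy sketch (not the authors' implementation) on a hypothetical one-dimensional problem: the diffusion is an unadjusted Langevin sampler targeting the Gibbs measure of a potential V_theta(x) = (x - theta)^2 / 2, and the outer objective is the expected loss E[(x - 3)^2] under that stationary distribution, minimized at theta = 3. Each iteration interleaves one sampling step on a population of particles with one parameter step, using a score-function gradient estimate for Gibbs measures; all names and constants below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setting (illustrative, not the paper's experiments): the diffusion
# targets pi_theta = N(theta, 1), i.e. V_theta(x) = (x - theta)^2 / 2.
# Outer loss: F(theta) = E_{x ~ pi_theta}[(x - target)^2], minimized at
# theta = target.
target = 3.0
theta = -2.0            # initial parameter of the sampling process
n_particles = 2048      # particle approximation of pi_theta
x = rng.standard_normal(n_particles)

eta = 0.1               # inner (Langevin) step size
lr = 0.05               # outer (parameter) step size

for _ in range(2000):
    # One Langevin sampling step: x <- x - eta * grad_x V + sqrt(2*eta) * noise.
    x = x - eta * (x - theta) + np.sqrt(2 * eta) * rng.standard_normal(n_particles)
    # Score-function gradient estimate for Gibbs measures:
    # grad_theta E[loss(x)] = -Cov(loss(x), grad_theta V_theta(x)),
    # estimated here on the current (only approximately stationary) particles.
    loss = (x - target) ** 2
    dV_dtheta = theta - x
    grad = -np.mean((loss - loss.mean()) * (dV_dtheta - dV_dtheta.mean()))
    theta -= lr * grad  # one optimization step, jointly with the sampling step

print(theta)  # should approach 3.0
```

Note that the particles are never exactly distributed according to pi_theta during the loop; tolerating this mismatch between sampling and optimization time scales is precisely the regime the single-loop analysis addresses.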

Original language: English
Pages (from-to): 1999-2007
Number of pages: 9
Journal: Proceedings of Machine Learning Research
Volume: 258
Publication status: Published - 1 Jan 2025
Externally published: Yes
Event: 28th International Conference on Artificial Intelligence and Statistics, AISTATS 2025 - Mai Khao, Thailand
Duration: 3 May 2025 - 5 May 2025
