Abstract
Sampling and automatic differentiation are both ubiquitous in modern machine learning. At their intersection lies the problem of differentiating through a sampling operation with respect to the parameters of the sampling process, a problem that is both challenging and broadly applicable. We introduce a general framework and a new algorithm for first-order optimization of parameterized stochastic diffusions, performing optimization and sampling steps jointly in a single loop. This approach is inspired by recent advances in bilevel optimization and automatic implicit differentiation, and leverages the view of sampling as optimization over the space of probability distributions. We provide theoretical and experimental results showcasing the performance of our method.
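To make the single-loop idea concrete, here is a minimal toy sketch, not the paper's algorithm: a one-dimensional Langevin sampler whose target family, potential, and outer objective are all hypothetical choices made for illustration. The target is pi_theta = N(theta, 1), and we minimize E_{x ~ pi_theta}[(x - 2)^2] over theta, interleaving one sampling step and one parameter step per iteration instead of running the sampler to convergence first.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-loop sampler-optimizer (illustrative assumption, not the
# paper's method). Target family: pi_theta = N(theta, 1), with potential
# U_theta(x) = (x - theta)^2 / 2. Outer objective:
# F(theta) = E_{x ~ pi_theta}[(x - 2)^2], minimized at theta = 2.

n_particles = 1000
x = rng.standard_normal(n_particles)  # particle cloud for the sampler
theta = -3.0                          # initial parameter
eta, lr = 0.1, 0.05                   # Langevin and optimizer step sizes

for _ in range(2000):
    # Sampling step: one unadjusted Langevin update toward pi_theta.
    grad_U = x - theta
    x = x - eta * grad_U + np.sqrt(2 * eta) * rng.standard_normal(n_particles)
    # Optimization step: under pi_theta, x = theta + z with z ~ N(0, 1),
    # so the reparameterization gradient of F is E[2 (x - 2)], estimated
    # here on the current (not yet converged) particle cloud.
    theta = theta - lr * np.mean(2 * (x - 2.0))

print(theta)  # converges toward the optimum theta* = 2
```

The point of the sketch is that the particles and the parameter are updated in the same loop: the gradient estimate uses whatever the sampler currently provides, which is the structure the abstract describes for parameterized stochastic diffusions.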
| Original language | English |
|---|---|
| Pages (from-to) | 1999-2007 |
| Number of pages | 9 |
| Journal | Proceedings of Machine Learning Research |
| Volume | 258 |
| Publication status | Published - 1 Jan 2025 |
| Externally published | Yes |
| Event | 28th International Conference on Artificial Intelligence and Statistics, AISTATS 2025 - Mai Khao, Thailand |
| Duration | 3 May 2025 → 5 May 2025 |