A Mixture-Based Framework for Guiding Diffusion Models

Research output: Contribution to journal › Conference article › peer-review

Abstract

Denoising diffusion models have driven significant progress in the field of Bayesian inverse problems. Recent approaches use pre-trained diffusion models as priors to solve a wide range of such problems, leveraging only inference-time compute and thereby eliminating the need to retrain task-specific models on the same dataset. To approximate the posterior of a Bayesian inverse problem, a diffusion model samples from a sequence of intermediate posterior distributions, each with an intractable likelihood function. This work proposes a novel mixture approximation of these intermediate distributions. Since direct gradient-based sampling of these mixtures is infeasible due to intractable terms, we propose a practical method based on Gibbs sampling. We validate our approach through extensive experiments on image inverse problems, utilizing both pixel- and latent-space diffusion priors, as well as on source separation with an audio diffusion model.
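The abstract's key idea is alternating conditional sampling (Gibbs sampling) to draw from a mixture whose joint density is hard to sample directly. The sketch below is only a generic illustration of that idea on a toy two-component Gaussian mixture; it is not the paper's algorithm, and all names and parameters here are hypothetical:

```python
import numpy as np

# Illustrative sketch only: a generic Gibbs sampler for a fixed
# 2-component Gaussian mixture. It alternates between (1) sampling the
# component indicator z given the current point x and (2) sampling x
# given z. This demonstrates the Gibbs idea the abstract refers to,
# not the paper's method for diffusion posteriors.

rng = np.random.default_rng(0)

w = np.array([0.3, 0.7])     # mixture weights (hypothetical)
mu = np.array([-2.0, 2.0])   # component means, shared unit variance

def gibbs_mixture_sample(n_steps=5000):
    x = 0.0
    samples = []
    for _ in range(n_steps):
        # Sample z | x: categorical with probabilities proportional to
        # w_k * N(x; mu_k, 1), computed stably in log space.
        log_p = np.log(w) - 0.5 * (x - mu) ** 2
        p = np.exp(log_p - log_p.max())
        p /= p.sum()
        z = rng.choice(2, p=p)
        # Sample x | z: draw from the selected Gaussian component.
        x = rng.normal(mu[z], 1.0)
        samples.append(x)
    return np.array(samples)

samples = gibbs_mixture_sample()
# The true mixture mean is w @ mu = 0.8; the sample mean should be close.
print(samples.mean())
```

In the paper's setting the conditionals are of course far more involved (the intermediate posteriors have intractable likelihoods), but the alternating structure is the same: each conditional is tractable even though the joint mixture is not.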

Original language: English
Pages (from-to): 26830-26876
Number of pages: 47
Journal: Proceedings of Machine Learning Research
Volume: 267
Publication status: Published - 1 Jan 2025
Externally published: Yes
Event: 42nd International Conference on Machine Learning, ICML 2025 - Vancouver, Canada
Duration: 13 Jul 2025 - 19 Jul 2025

