
LONG-TIME ASYMPTOTICS OF NOISY SVGD OUTSIDE THE POPULATION LIMIT

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Stein Variational Gradient Descent (SVGD) is a widely used sampling algorithm that has been successfully applied in several areas of Machine Learning. SVGD operates by iteratively moving a set of n interacting particles (which represent the samples) to approximate the target distribution. Despite recent studies on the complexity of SVGD and its variants, their long-time asymptotic behavior (i.e., as the number of iterations k grows) is still not understood in the finite-particle regime. We study the long-time asymptotic behavior of a noisy variant of SVGD. First, we establish that the limit set of noisy SVGD for large k is well defined. We then characterize this limit set, showing that it approaches the target distribution as n increases. In particular, noisy SVGD avoids the variance collapse observed for SVGD. Our approach involves demonstrating that the trajectories of noisy SVGD closely resemble those described by a McKean-Vlasov process.
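The update the abstract describes — n interacting particles moved by a kernelized drift, plus injected noise — can be sketched in a small toy example. The RBF kernel, bandwidth h, step size eps, noise scaling, and the standard-Gaussian target below are all assumptions made for illustration; they are not the paper's exact noisy-SVGD variant.

```python
import numpy as np

rng = np.random.default_rng(0)

def score(X):
    # Score of a standard Gaussian target: grad log p(x) = -x
    return -X

def noisy_svgd_step(X, eps=0.05, h=1.0):
    """One noisy SVGD step: the usual SVGD drift plus Gaussian noise (toy scaling)."""
    n, d = X.shape
    diff = X[:, None, :] - X[None, :, :]                 # diff[j, i] = x_j - x_i
    K = np.exp(-np.sum(diff**2, axis=-1) / (2 * h**2))   # K[j, i] = k(x_j, x_i)
    gradK = -diff / h**2 * K[..., None]                  # grad of k(x_j, x_i) w.r.t. x_j
    # SVGD drift: kernel-weighted attraction along the score, plus repulsion
    phi = (K.T @ score(X) + gradK.sum(axis=0)) / n
    # Injected Gaussian noise; this scaling is an assumption for the sketch
    noise = np.sqrt(2 * eps / n) * rng.standard_normal((n, d))
    return X + eps * phi + noise

X = rng.standard_normal((50, 2)) * 3.0  # n = 50 particles in 2 dimensions
for _ in range(3000):
    X = noisy_svgd_step(X)
# The injected noise keeps the particle spread from collapsing toward zero,
# illustrating the abstract's point that noisy SVGD avoids variance collapse.
```

Without the noise term this reduces to plain SVGD, whose finite-particle runs can exhibit the variance collapse mentioned in the abstract.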

Original language: English
Title of host publication: 13th International Conference on Learning Representations, ICLR 2025
Publisher: International Conference on Learning Representations, ICLR
Pages: 94780-94811
Number of pages: 32
ISBN (Electronic): 9798331320850
Publication status: Published - 1 Jan 2025
Event: 13th International Conference on Learning Representations, ICLR 2025 - Singapore, Singapore
Duration: 24 Apr 2025 - 28 Apr 2025

Publication series

Name: 13th International Conference on Learning Representations, ICLR 2025

Conference

Conference: 13th International Conference on Learning Representations, ICLR 2025
Country/Territory: Singapore
City: Singapore
Period: 24/04/25 - 28/04/25
