A theoretical perspective on mode collapse in variational inference

Research output: Contribution to journal › Article › peer-review

Abstract

While deep learning has expanded the possibilities for highly expressive variational families, the practical benefits of these tools for variational inference (VI) are often limited by the minimization of the traditional Kullback-Leibler objective, which can yield suboptimal solutions. A major challenge in this context is mode collapse: the phenomenon where a model concentrates on a few modes of the target distribution during training, despite being statistically capable of expressing them all. In this work, we carry out a theoretical investigation of mode collapse for the gradient flow on Gaussian mixture models. We identify the key low-dimensional statistics characterizing the flow, and derive a closed set of low-dimensional equations governing their evolution. Leveraging this compact description, we show that mode collapse is present even in statistically favorable scenarios, and identify two key mechanisms driving it: mean alignment and vanishing weight. Our theoretical findings are consistent with the implementation of VI using normalizing flows, a class of popular generative models, thereby offering practical insights.

Original language: English
Article number: 025056
Journal: Machine Learning: Science and Technology
Volume: 6
Issue number: 2
DOIs
Publication status: Published - 30 Jun 2025

Keywords

  • Gaussian mixture models
  • mean field approximation
  • mode collapse
  • statistical mechanics of learning
  • variational inference

