
Entropy and mutual information in models of deep neural networks

  • Marylou Gabrié
  • Andre Manoel
  • Clément Luneau
  • Jean Barbier
  • Nicolas Macris
  • Florent Krzakala
  • Lenka Zdeborová

Research output: Contribution to journal › Conference article › peer-review

Abstract

We examine a class of stochastic deep learning models with a tractable method to compute information-theoretic quantities. Our contributions are three-fold: (i) We show how entropies and mutual informations can be derived from heuristic statistical physics methods, under the assumption that the weight matrices are independent and orthogonally invariant. (ii) We extend the particular cases in which this result is known to be rigorously exact by providing a proof for two-layer networks with Gaussian random weights, using the recently introduced adaptive interpolation method. (iii) We propose an experimental framework with generative models of synthetic datasets, on which we train deep neural networks with a weight constraint designed so that the assumption in (i) holds during learning. We study the behavior of entropies and mutual informations throughout learning and conclude that, in the proposed setting, the relationship between compression and generalization remains elusive.
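The key assumption in contribution (i) is that each weight matrix is drawn from an orthogonally-invariant ensemble, i.e. its distribution is unchanged under left and right multiplication by orthogonal matrices. As a purely illustrative sketch (not code from the paper), one way to sample such a matrix in NumPy is to combine two Haar-distributed orthogonal factors with a chosen spectrum; the function names and the choice of unit singular values below are hypothetical.

    import numpy as np

    def haar_orthogonal(n, rng):
        """Sample an n x n orthogonal matrix from the Haar (uniform) measure."""
        # QR of a Gaussian matrix; sign-correcting the columns of Q
        # with the diagonal of R makes the result exactly Haar-distributed
        Q, R = np.linalg.qr(rng.standard_normal((n, n)))
        return Q * np.sign(np.diag(R))

    def orthogonally_invariant_weights(n, m, singular_values, seed=0):
        """Build W = U @ S @ V.T with Haar-distributed U and V.

        The distribution of W is then invariant under orthogonal
        rotations on either side, which is the assumption under
        which the statistical physics formula of (i) applies.
        """
        rng = np.random.default_rng(seed)
        U = haar_orthogonal(n, rng)
        V = haar_orthogonal(m, rng)
        S = np.zeros((n, m))
        k = min(n, m)
        S[:k, :k] = np.diag(singular_values[:k])
        return U @ S @ V.T

    # Example: a 20 x 50 weight matrix with all singular values equal to 1,
    # i.e. a random row-orthogonal projection
    W = orthogonally_invariant_weights(20, 50, np.ones(20))
    print(np.allclose(W @ W.T, np.eye(20)))  # True: rows are orthonormal

A Gaussian random matrix, as in contribution (ii), is a special case of this ensemble; the sketch above also covers the weight constraint of (iii), where one would re-project onto a fixed spectrum after each training step.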

Original language: English
Pages (from-to): 1821-1831
Number of pages: 11
Journal: Advances in Neural Information Processing Systems
Volume: 2018-December
Publication status: Published - 1 Jan 2018
Externally published: Yes
Event: 32nd Conference on Neural Information Processing Systems, NeurIPS 2018 - Montreal, Canada
Duration: 2 Dec 2018 - 8 Dec 2018
