Maximum Entropy Methods for Texture Synthesis: Theory and Practice

Valentin De Bortoli, Agnès Desolneux, Alain Durmus, Bruno Galerne, Arthur Leclaire

Research output: Contribution to journal › Article › peer-review

Abstract

Recent years have seen the rise of convolutional neural network techniques in exemplar-based image synthesis. These methods often rely on the minimization of some variational formulation on the image space for which the minimizers are assumed to be the solutions of the synthesis problem. In this paper we investigate, both theoretically and experimentally, another framework to deal with this problem using an alternating sampling/minimization scheme. First, we use results from information geometry to show that our method yields a probability measure which has maximum entropy under some constraints in expectation. Then we analyze the convergence of our method and show, using recent results from the Markov chain literature, that its error can be explicitly bounded with constants which depend polynomially on the dimension, even in the nonconvex setting. This includes the case where the constraints are defined via a differentiable neural network. Finally, we present an extensive experimental study of the model, including a comparison with state-of-the-art methods.
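The alternating sampling/minimization scheme mentioned in the abstract can be illustrated on a toy problem. The sketch below is a hedged, minimal illustration, not the paper's exact algorithm: it alternates one unadjusted Langevin step on the state variable with one stochastic gradient step on the dual parameters of a maximum-entropy model p_theta(x) ∝ exp(theta · f(x)). Here the statistics are simply f(x) = (x, -x²), so the maximum-entropy solution is a unit Gaussian; in the paper the constraints instead come from neural-network features of an exemplar texture. All names, step sizes, and iteration counts are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    """Illustrative sufficient statistics; p_theta(x) ∝ exp(theta . f(x))."""
    return np.stack([x, -x**2])

def grad_x_log_p(x, theta):
    # d/dx [theta0*x - theta1*x^2] = theta0 - 2*theta1*x
    return theta[0] - 2.0 * theta[1] * x

target = np.array([0.0, -1.0])   # constraint E[f(X)] = target (unit Gaussian)
theta = np.array([0.3, 0.3])     # initial natural parameters (arbitrary)
x = rng.standard_normal(100)     # a batch of parallel Langevin chains
gamma, delta = 1e-2, 1e-2        # Langevin / dual step sizes (assumed values)
theta_avg, n_avg = np.zeros(2), 0

for n in range(20000):
    # Sampling step: one unadjusted Langevin move targeting p_theta
    x = x + gamma * grad_x_log_p(x, theta) \
          + np.sqrt(2.0 * gamma) * rng.standard_normal(x.shape)
    # Minimization step: stochastic gradient ascent on the dual
    # (pushes the empirical statistics toward the target in expectation)
    theta = theta + delta * (target - f(x).mean(axis=1))
    if n >= 10000:               # average iterates after a burn-in phase
        theta_avg += theta
        n_avg += 1

theta_avg /= n_avg               # ≈ (0, 0.5): the unit-Gaussian parameters
```

With these statistics, matching E[X] = 0 and E[X²] = 1 forces theta toward (0, 0.5), i.e. the standard Gaussian, which is the unique maximum-entropy density under those moment constraints.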

Original language: English
Pages (from-to): 52-82
Number of pages: 31
Journal: SIAM Journal on Mathematics of Data Science
Volume: 3
Issue number: 1
Publication status: Published - 1 Jan 2020
Externally published: Yes

Keywords

  • Langevin algorithm
  • Markov chains
  • convolutional neural network
  • information geometry
  • maximum entropy
  • texture synthesis

