Role of Synaptic Stochasticity in Training Low-Precision Neural Networks

  • Carlo Baldassi
  • Federica Gerace
  • Hilbert J. Kappen
  • Carlo Lucibello
  • Luca Saglietti
  • Enzo Tartaglione
  • Riccardo Zecchina

Research output: Contribution to journal › Article › peer-review

Abstract

Stochasticity and limited precision of synaptic weights in neural network models are key aspects of both biological and hardware modeling of learning processes. Here we show that a neural network model with stochastic binary weights naturally gives prominence to exponentially rare dense regions of solutions with a number of desirable properties such as robustness and good generalization performance, while typical solutions are isolated and hard to find. Binary solutions of the standard perceptron problem are obtained from a simple gradient descent procedure on a set of real values parametrizing a probability distribution over the binary synapses. Both analytical and numerical results are presented. An algorithmic extension that makes it possible to train discrete deep neural networks is also investigated.
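
The core idea in the abstract, gradient descent on real-valued parameters that define a probability distribution over binary synapses, can be illustrated with a small toy sketch. The snippet below is an illustrative NumPy example and not the authors' exact procedure: it assumes each binary weight w_i ∈ {-1, +1} is parametrized by a real field h_i with mean tanh(h_i), descends a hinge-style surrogate loss on those means for random perceptron patterns, and only at the end reads off a binary configuration by taking signs. The variable names, the surrogate loss, and the hyperparameters are all invented for illustration.

```python
# Toy sketch (assumption-laden, not the paper's exact algorithm):
# train a stochastic binary perceptron by gradient descent on real
# fields h that parametrize the mean m_i = tanh(h_i) of each binary weight.
import numpy as np

rng = np.random.default_rng(0)

N, P = 101, 60                                  # input size, number of patterns
X = rng.choice([-1.0, 1.0], size=(P, N))        # random binary patterns
y = rng.choice([-1.0, 1.0], size=P)             # random binary labels

h = np.zeros(N)                                 # real parameters of the weight distribution
lr = 0.1

for step in range(2000):
    m = np.tanh(h)                              # mean of each binary weight
    margins = y * (X @ m) / np.sqrt(N)
    active = margins < 1.0                      # patterns violating the margin
    # Gradient of the mean hinge loss w.r.t. m, then chain rule through m = tanh(h)
    grad_m = -(X[active] * y[active, None]).sum(axis=0) / (np.sqrt(N) * P)
    h -= lr * grad_m * (1.0 - m ** 2)

# Binarize: take the most probable weight configuration (sign of the mean)
w = np.sign(np.tanh(h))
w[w == 0] = 1.0
train_err = np.mean(np.sign(X @ w) != y)
print(f"training error of the binarized perceptron: {train_err:.3f}")
```

The point mirrored here is that the optimization variables are the continuous fields h, never the binary weights themselves; the discrete solution is only extracted at the end, which is what makes the procedure compatible with low-precision synapses.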

Original language: English
Article number: 268103
Journal: Physical Review Letters
Volume: 120
Issue number: 26
DOIs
Publication status: Published - 29 Jun 2018
Externally published: Yes

