
THE RISE OF THE LOTTERY HEROES: WHY ZERO-SHOT PRUNING IS HARD

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Recent advances in deep learning optimization have shown that only a subset of a model's parameters is really necessary to train it successfully. Such a discovery potentially has broad impact, from theory to application; however, finding these trainable sub-networks is typically a costly process. This inhibits practical applications: can the learned sub-graph structures in deep learning models be found at training time? In this work we explore this possibility, observing and explaining why common approaches typically fail in the extreme scenarios of interest, and proposing an approach that potentially enables training with reduced computational effort. Experiments on challenging architectures and datasets suggest that such a computational gain is algorithmically accessible, and in particular that a trade-off emerges between the accuracy achieved and the training complexity deployed.
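The abstract refers to lottery-ticket-style sub-networks obtained by pruning. As a rough illustration of the kind of masking involved (not the paper's proposed algorithm), the following PyTorch sketch performs one-shot global magnitude pruning on a toy model; the model, layer sizes, and the 90% sparsity level are illustrative assumptions.

# A minimal sketch (not the authors' algorithm) of one-shot global magnitude
# pruning: keep the largest-magnitude fraction of weights and zero out the rest.
import torch
import torch.nn as nn

def global_magnitude_masks(model: nn.Module, sparsity: float = 0.9):
    """Return a {parameter name: binary mask} dict keeping the top
    (1 - sparsity) fraction of weights by absolute value across all
    weight tensors (biases are left untouched)."""
    weights = {n: p.detach() for n, p in model.named_parameters() if p.dim() > 1}
    all_scores = torch.cat([w.abs().flatten() for w in weights.values()])
    k = int(sparsity * all_scores.numel())
    threshold = all_scores.kthvalue(k).values if k > 0 else all_scores.min() - 1
    return {n: (w.abs() > threshold).float() for n, w in weights.items()}

def apply_masks(model: nn.Module, masks: dict):
    """Zero out pruned weights in place, leaving a sparse sub-network."""
    with torch.no_grad():
        for n, p in model.named_parameters():
            if n in masks:
                p.mul_(masks[n])

# Illustrative usage on a toy MLP.
model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))
masks = global_magnitude_masks(model, sparsity=0.9)
apply_masks(model, masks)

In lottery-ticket-style experiments such masks are typically recomputed after (possibly repeated) training and the surviving weights rewound to their initial values, which is what makes finding the sub-network costly; zero-shot approaches instead try to obtain the mask before or very early in training.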

Original language: English
Title of host publication: 2022 IEEE International Conference on Image Processing, ICIP 2022 - Proceedings
Publisher: IEEE Computer Society
Pages: 2361-2365
Number of pages: 5
ISBN (Electronic): 9781665496209
DOIs
Publication status: Published - 1 Jan 2022
Event: 29th IEEE International Conference on Image Processing, ICIP 2022 - Bordeaux, France
Duration: 16 Oct 2022 - 19 Oct 2022

Publication series

Name: Proceedings - International Conference on Image Processing, ICIP
ISSN (Print): 1522-4880

Conference

Conference: 29th IEEE International Conference on Image Processing, ICIP 2022
Country/Territory: France
City: Bordeaux
Period: 16/10/22 - 19/10/22

Keywords

  • The lottery ticket hypothesis
  • computational complexity
  • deep learning
  • pruning
