ASkewSGD: An Annealed interval-constrained Optimisation method to train Quantized Neural Networks

Research output: Contribution to journal › Conference article › peer-review

Abstract

In this paper, we develop a new algorithm, Annealed Skewed SGD (ASkewSGD), for training deep neural networks (DNNs) with quantized weights. First, we formulate the training of quantized neural networks (QNNs) as a smoothed sequence of interval-constrained optimization problems. We then propose a new first-order stochastic method, ASkewSGD, to solve each constrained optimization subproblem. Unlike active-set and feasible-direction algorithms, ASkewSGD avoids projections or optimization over the entire feasible set and allows infeasible iterates. The numerical complexity of ASkewSGD is comparable to existing approaches for training QNNs, such as the straight-through gradient estimator used in BinaryConnect, or other state-of-the-art methods (ProxQuant, LUQ). We establish convergence guarantees for ASkewSGD under general assumptions on the objective function. Experimental results show that ASkewSGD performs better than or on par with state-of-the-art methods on classical benchmarks.
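The abstract describes the method only at a high level. Below is a minimal, illustrative NumPy sketch of an annealed interval-constrained first-order update for the binary case: it assumes a per-coordinate constraint |w_i² − 1| ≤ ε, suppresses gradient components that point out of the feasible intervals (replacing them with a pull toward the nearest level ±1) rather than projecting, and anneals ε toward zero. The constraint form, the skewing rule, the ε schedule, and the names `askewsgd_like_step` and `train_binary` are our assumptions for illustration, not the authors' exact update rule.

```python
import numpy as np

def askewsgd_like_step(w, grad, lr, eps):
    # Nearest binary quantization level for each weight (+1 when w_i == 0).
    target = np.where(w >= 0, 1.0, -1.0)
    # Assumed interval constraint per coordinate: |w_i^2 - 1| <= eps.
    infeasible = np.abs(w * w - 1.0) > eps
    # The plain step -lr*grad moves w_i away from its level exactly when
    # grad_i * (w_i - target_i) < 0.
    outward = grad * (w - target) < 0.0
    # Skew the offending components toward the nearest level; keep the
    # ordinary loss gradient everywhere else. No projection is performed,
    # so iterates may remain temporarily infeasible.
    direction = np.where(infeasible & outward, w - target, grad)
    return w - lr * direction

def train_binary(w, loss_grad, steps=200, lr=0.1, eps0=1.0, decay=0.98):
    eps = eps0
    for _ in range(steps):
        w = askewsgd_like_step(w, loss_grad(w), lr, eps)
        eps *= decay  # anneal the interval width toward exact quantization
    return np.where(w >= 0, 1.0, -1.0)  # final hard rounding to {-1, +1}

# Toy usage: quadratic loss pulling the weights toward a non-binary point.
c = np.array([0.3, -0.7, 0.9])
w_q = train_binary(np.zeros(3), lambda w: w - c)
print(w_q)  # each entry lands on -1 or +1
```

Because the skewed direction only activates on constraint-violating, outward-moving coordinates, the loss gradient drives training early (large ε) while the shrinking intervals gradually force the weights onto the quantization levels.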

Original language: English
Pages (from-to): 3644-3663
Number of pages: 20
Journal: Proceedings of Machine Learning Research
Volume: 206
Publication status: Published - 1 Jan 2023
Event: 26th International Conference on Artificial Intelligence and Statistics, AISTATS 2023 - Valencia, Spain
Duration: 25 Apr 2023 - 27 Apr 2023
