
ASkewSGD: An Annealed interval-constrained Optimisation method to train Quantized Neural Networks

Research output: Contribution to journal › Conference article › Peer-reviewed

Abstract

In this paper, we develop a new algorithm, Annealed Skewed SGD (ASkewSGD), for training deep neural networks (DNNs) with quantized weights. First, we formulate the training of quantized neural networks (QNNs) as a smoothed sequence of interval-constrained optimization problems. Then, we propose a new first-order stochastic method, ASkewSGD, to solve each constrained optimization subproblem. Unlike active-set and feasible-direction algorithms, ASkewSGD avoids projections or optimization over the entire feasible set and allows infeasible iterates. The numerical complexity of ASkewSGD is comparable to existing approaches for training QNNs, such as the straight-through gradient estimator used in BinaryConnect or other state-of-the-art methods (ProxQuant, LUQ). We establish convergence guarantees for ASkewSGD under general assumptions on the objective function. Experimental results show that ASkewSGD performs better than or on par with state-of-the-art methods on classical benchmarks.
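The abstract does not spell out the update rule, but the general idea of skewing gradient components against annealed interval constraints can be sketched. Below is a minimal NumPy sketch assuming binary quantization levels {-1, +1} and per-coordinate constraints w_i^2 >= 1 - eps, with eps annealed toward 0. The function name askewsgd_step, the skewing rule, and the annealing schedule are illustrative assumptions, not the paper's exact method.

import numpy as np

def askewsgd_step(w, grad, eps, lr):
    """One illustrative ASkewSGD-style update (hypothetical reconstruction).

    Weights are driven toward the binary levels {-1, +1} through the annealed
    interval constraints w_i**2 >= 1 - eps. Where a constraint is violated and
    the raw gradient step would move the weight further from its nearest level,
    the direction is skewed toward that level instead of being projected.
    """
    # Constraint slack: a coordinate is feasible when slack >= 0.
    slack = w**2 - (1.0 - eps)
    # Direction toward the nearest quantization level (+1 or -1); the tie at
    # w = 0 maps to 0 in this sketch.
    toward_level = np.sign(w)
    # The descent direction -grad is admissible if it increases |w_i|,
    # i.e. it already pushes the coordinate toward its level.
    pushes_inward = (-grad) * toward_level > 0
    # Keep the raw descent direction where the constraint is satisfied or the
    # step is admissible; otherwise pull the coordinate toward its level.
    keep_gradient = (slack >= 0) | pushes_inward
    direction = np.where(keep_gradient, -grad, toward_level - w)
    return w + lr * direction

# Illustrative annealing loop on a toy quadratic objective f(w) = 0.5*||w - t||^2.
rng = np.random.default_rng(0)
w = rng.uniform(-1.0, 1.0, size=4)
target = np.array([0.3, -0.8, 0.9, -0.2])
for epoch in range(200):
    eps = max(0.97**epoch, 1e-3)  # hypothetical annealing schedule
    grad = w - target             # exact gradient of the toy objective
    w = askewsgd_step(w, grad, eps, lr=0.1)
w_quantized = np.sign(w)          # final hard quantization to {-1, +1}

Note the contrast with projected SGD that the abstract emphasizes: iterates are allowed to be infeasible, and the update only skews the search direction rather than projecting onto the feasible set.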

Original language: English
Pages (from-to): 3644-3663
Number of pages: 20
Journal: Proceedings of Machine Learning Research
Volume: 206
Publication status: Published - 1 Jan 2023
Event: 26th International Conference on Artificial Intelligence and Statistics, AISTATS 2023 - Valencia, Spain
Duration: 25 Apr 2023 - 27 Apr 2023
