
Neural Velocity for hyperparameter tuning

  • Gianluca Dalmasso
  • Andrea Bragagnolo
  • Enzo Tartaglione
  • Attilio Fiandrotti
  • Marco Grangetto

Research output: Contribution to journal › Conference article › Peer-reviewed

Abstract

Hyperparameter tuning, such as learning rate decay and defining a stopping criterion, often relies on monitoring the validation loss. This paper presents NeVe, a dynamic training approach that adjusts the learning rate and defines the stopping criterion based on the novel notion of "neural velocity". Neural velocity measures the rate of change of each neuron's transfer function and is an indicator of model convergence: it can be sampled even by forwarding noise through the network, reducing the need for a held-out dataset. Our findings show the potential of neural velocity as a key metric for optimizing neural network training efficiently.
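The abstract does not give NeVe's exact formulation, but the core idea (probe each neuron with a fixed noise batch at successive checkpoints and measure how much its response changed) can be sketched as follows. All names, shapes, and the convergence threshold here are hypothetical illustrations, not the paper's actual definition:

```python
import numpy as np

rng = np.random.default_rng(0)

def neuron_velocity(acts_prev, acts_curr):
    """Per-neuron velocity: average change of each neuron's response to the
    same probe inputs between two training checkpoints.
    acts_* have shape (n_samples, n_neurons)."""
    return np.linalg.norm(acts_curr - acts_prev, axis=0) / acts_prev.shape[0]

# Probe batch: random noise, so no held-out validation data is needed.
noise = rng.standard_normal((64, 16))

# Toy layer weights at two consecutive checkpoints (small update).
W_t = rng.standard_normal((16, 8))
W_next = W_t + 0.01 * rng.standard_normal((16, 8))

acts_t = np.tanh(noise @ W_t)
acts_next = np.tanh(noise @ W_next)

v = neuron_velocity(acts_t, acts_next)
print(v.shape)  # one velocity value per neuron: (8,)

# Hypothetical stopping criterion: stop when average velocity is tiny.
converged = v.mean() < 1e-3
```

A learning-rate schedule in this spirit could, for instance, decay the rate in proportion to the measured velocity; the threshold `1e-3` above is purely illustrative.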

Original language: English
Journal: Proceedings of the International Joint Conference on Neural Networks
State: Published - 1 Jan 2025
Event: 2025 International Joint Conference on Neural Networks, IJCNN 2025 - Rome, Italy
Duration: 30 Jun 2025 to 5 Jul 2025
