
Learning with Fitzpatrick Losses

Research output: Contribution to journal › Conference article › Peer-reviewed

Abstract

Fenchel-Young losses are a family of loss functions, encompassing the squared, logistic and sparsemax losses, among others. They are convex w.r.t. the model output and the target, separately. Each Fenchel-Young loss is implicitly associated with a link function that maps model outputs to predictions. For instance, the logistic loss is associated with the soft argmax link function. Can we build new loss functions associated with the same link functions as Fenchel-Young losses? In this paper, we introduce Fitzpatrick losses, a new family of separately convex loss functions based on the Fitzpatrick function. A well-known theoretical tool in maximal monotone operator theory, the Fitzpatrick function naturally leads to a refined Fenchel-Young inequality, making Fitzpatrick losses tighter than Fenchel-Young losses, while maintaining the same link function for prediction. As an example, we introduce the Fitzpatrick logistic loss and the Fitzpatrick sparsemax loss, counterparts of the logistic and the sparsemax losses. This yields two new tighter losses associated with the soft argmax and the sparse argmax, two of the most ubiquitous output layers used in machine learning. We study in detail the properties of Fitzpatrick losses and, in particular, we show that they can be seen as Fenchel-Young losses using a modified, target-dependent generating function. We demonstrate the effectiveness of Fitzpatrick losses for label proportion estimation.
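As background to the abstract, the Fenchel-Young loss generated by a convex function Ω is L(θ, y) = Ω*(θ) + Ω(y) − ⟨θ, y⟩, and its link function is ∇Ω*. A minimal NumPy sketch of the well-known logistic case (Ω = negative Shannon entropy, link = soft argmax); this illustrates only the standard Fenchel-Young construction, not the paper's new Fitzpatrick losses:

```python
import numpy as np

def soft_argmax(theta):
    # Link function for the logistic loss: the softmax map,
    # i.e. the gradient of logsumexp (= the conjugate of negative entropy).
    e = np.exp(theta - theta.max())
    return e / e.sum()

def logistic_fy_loss(theta, y):
    # Fenchel-Young loss generated by negative Shannon entropy:
    #   L(theta, y) = logsumexp(theta) + <y, log y> - <theta, y>.
    # It is nonnegative and vanishes exactly when y = soft_argmax(theta).
    m = theta.max()
    logsumexp = m + np.log(np.sum(np.exp(theta - m)))  # stable logsumexp
    neg_entropy = np.sum(y[y > 0] * np.log(y[y > 0]))  # 0*log 0 := 0
    return logsumexp + neg_entropy - theta @ y
```

The Fitzpatrick logistic loss introduced in the paper shares the same soft argmax link but provides a tighter lower bound than the Fenchel-Young inequality used above.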

Original language: English
Journal: Advances in Neural Information Processing Systems
Volume: 37
Publication status: Published - 1 Jan 2024
Externally published: Yes
Event: 38th Conference on Neural Information Processing Systems, NeurIPS 2024 - Vancouver, Canada
Duration: 9 Dec 2024 - 15 Dec 2024
