
Linear time Sinkhorn divergences using positive features

Research output: Contribution to journal › Conference article › Peer-reviewed

Abstract

Although Sinkhorn divergences are now routinely used in data sciences to compare probability distributions, the computational effort required to compute them remains expensive, growing in general quadratically in the size n of the support of these distributions. Indeed, solving optimal transport (OT) with an entropic regularization requires computing an n × n kernel matrix (the neg-exponential of an n × n pairwise ground cost matrix) that is repeatedly applied to a vector. We propose to use instead ground costs of the form c(x, y) = −log⟨φ(x), φ(y)⟩, where φ is a map from the ground space onto the positive orthant R^r_+, with r ≪ n. This choice yields, equivalently, a kernel k(x, y) = ⟨φ(x), φ(y)⟩, and ensures that the cost of Sinkhorn iterations scales as O(nr). We show that usual cost functions can be approximated using this form. Additionally, we take advantage of the fact that our approach yields approximations that remain fully differentiable with respect to input distributions, as opposed to previously proposed adaptive low-rank approximations of the kernel matrix, to train a faster variant of OT-GAN [49].
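The key computational point in the abstract is that with a factored kernel k(x, y) = ⟨φ(x), φ(y)⟩, each Sinkhorn iteration can apply the kernel matrix to a vector without ever materializing it. The following is a minimal NumPy sketch of that idea, not the paper's implementation; the feature matrices `phi_x` and `phi_y` are assumed given (in the paper, φ approximates a usual cost, e.g. via positive random features), and all names are illustrative.

```python
import numpy as np

def sinkhorn_positive_features(phi_x, phi_y, a, b, n_iters=200):
    """Sinkhorn iterations with a factored kernel K = phi_x @ phi_y.T.

    phi_x : (n, r) array of positive features of the source points.
    phi_y : (m, r) array of positive features of the target points.
    a, b  : marginal weights of the two distributions (sum to 1).

    Each iteration applies K (and K.T) to a vector as two skinny
    matrix-vector products, so the cost per iteration is O((n + m) r)
    instead of O(n m) with a dense kernel matrix.
    """
    u = np.ones(phi_x.shape[0])
    v = np.ones(phi_y.shape[0])
    for _ in range(n_iters):
        # K @ v computed as phi_x @ (phi_y.T @ v): never forms the n x m matrix.
        u = a / (phi_x @ (phi_y.T @ v))
        # K.T @ u computed the same way.
        v = b / (phi_y @ (phi_x.T @ u))
    return u, v
```

The returned scalings define the transport plan P = diag(u) K diag(v), whose marginals converge to a and b; because features are positive, every entry of K is positive and the iterations are well defined.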

Original language: English
Journal: Advances in Neural Information Processing Systems
Volume: 2020-December
Publication status: Published - 1 Jan. 2020
Externally published: Yes
Event: 34th Conference on Neural Information Processing Systems, NeurIPS 2020 - Virtual, Online
Duration: 6 Dec. 2020 - 12 Dec. 2020
