On graph reconstruction via empirical risk minimization: Fast learning rates and scalability

Guillaume Papa, Stéphan Clémençon, Aurélien Bellet

Research output: Contribution to journal › Conference article › peer-review

Abstract

The problem of predicting connections between a set of data points finds many applications, in systems biology and social network analysis among others. This paper focuses on the graph reconstruction problem, where the prediction rule is obtained by minimizing the average error over all n(n-1)/2 possible pairs of the n nodes of a training graph. Our first contribution is to derive learning rates of order O(log n / n) for this problem, significantly improving upon the slow rates of order O(1/√n) established in the seminal work of Biau and Bleakley (2006). Strikingly, these fast rates are universal, in contrast to similar results known for other statistical learning problems (e.g., classification, density level set estimation, ranking, clustering), which require strong assumptions on the distribution of the data. Motivated by applications to large graphs, our second contribution deals with the computational complexity of graph reconstruction. Specifically, we investigate to what extent the learning rates can be preserved when replacing the empirical reconstruction risk by a computationally cheaper Monte-Carlo version, obtained by sampling with replacement B ≪ n² pairs of nodes. Finally, we illustrate our theoretical results by numerical experiments on synthetic and real graphs.
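The abstract contrasts the full empirical reconstruction risk, averaged over all n(n-1)/2 node pairs, with a Monte-Carlo estimate built from B pairs sampled with replacement. A minimal Python sketch of the two estimators, assuming a 0-1 pairwise loss (function names and the loss choice are illustrative assumptions, not the paper's code):

```python
import itertools
import random

def full_reconstruction_risk(nodes, adjacency, predict):
    """Average 0-1 error of `predict` over all n(n-1)/2 unordered node pairs."""
    pairs = list(itertools.combinations(range(len(nodes)), 2))
    errors = sum(predict(nodes[i], nodes[j]) != adjacency[i][j] for i, j in pairs)
    return errors / len(pairs)

def sampled_reconstruction_risk(nodes, adjacency, predict, B, seed=0):
    """Monte-Carlo estimate of the risk from B pairs drawn with replacement."""
    rng = random.Random(seed)
    n = len(nodes)
    errors = 0
    for _ in range(B):
        # Draw one unordered pair uniformly; successive draws are independent,
        # so pairs are sampled with replacement across the B iterations.
        i, j = rng.sample(range(n), 2)
        errors += predict(nodes[i], nodes[j]) != adjacency[i][j]
    return errors / B
```

The sampled version costs O(B) loss evaluations instead of O(n²), which is the scalability trade-off the paper analyzes.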

Original language: English
Pages (from-to): 694-702
Number of pages: 9
Journal: Advances in Neural Information Processing Systems
Publication status: Published - 1 Jan 2016
Externally published: Yes
Event: 30th Annual Conference on Neural Information Processing Systems, NIPS 2016 - Barcelona, Spain
Duration: 5 Dec 2016 - 10 Dec 2016
