Beating Monte Carlo Integration: a Nonasymptotic Study of Kernel Smoothing Methods

Abstract
Evaluating integrals is a ubiquitous task, and Monte Carlo methods, exploiting advances in random number generation over the last decades, offer a popular and powerful alternative to deterministic integration techniques, which are ill-suited in particular when the domain of integration is complex. This paper is devoted to the study, from a nonasymptotic perspective, of a kernel smoothing based competitor built from a sequence of n ≥ 1 i.i.d. random vectors with arbitrary continuous probability distribution f(x)dx, originally proposed in [7]. We establish a probability bound showing that the method under study, though biased, produces an estimate approximating the target integral ∫_{ℝ^d} φ(x)dx with an error bound of order o(1/√n) uniformly over a class Φ of functions φ, under weak complexity/smoothness assumptions on Φ, thereby outperforming Monte Carlo procedures. This striking result is shown to derive from an appropriate decomposition of the maximal deviation between the target integrals and their estimates, highlighting the remarkable benefit, for statistical accuracy, of averaging strongly dependent terms in this situation. The theoretical analysis rests on sharp probability inequalities for degenerate U-statistics. It is illustrated by numerical results in the context of covariate shift regression, providing empirical evidence of the relevance of the approach.
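As a minimal illustration of the idea behind such kernel smoothing estimators (in the spirit of [7], not the paper's exact procedure), the sketch below compares, in dimension d = 1, the plain Monte Carlo estimate (1/n) Σ φ(X_i)/f(X_i) with a variant where the known density f is replaced by a leave-one-out kernel density estimate. The choice of φ, the Gaussian sampling density, and the bandwidth h = n^(−1/5) are all illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def phi(x):
    # Integrand; its integral over R equals sqrt(pi).
    return np.exp(-x ** 2)

n = 2000
x = rng.standard_normal(n)              # X_1, ..., X_n i.i.d. with density f = N(0, 1)

# Plain Monte Carlo: average phi(X_i) / f(X_i), using the known density f.
f = np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi)
mc = np.mean(phi(x) / f)

# Kernel smoothing variant: replace f by a leave-one-out Gaussian KDE.
h = n ** (-1 / 5)                       # illustrative bandwidth choice
diffs = (x[:, None] - x[None, :]) / h
K = np.exp(-diffs ** 2 / 2) / np.sqrt(2 * np.pi)
np.fill_diagonal(K, 0.0)                # leave observation i out of its own estimate
f_hat = K.sum(axis=1) / ((n - 1) * h)
ks = np.mean(phi(x) / f_hat)

print(mc, ks, np.sqrt(np.pi))           # both estimates should be close to sqrt(pi)
```

Both estimates target ∫ φ(x)dx = √π; the point of the paper's analysis is that, under suitable complexity/smoothness assumptions, the kernel-smoothed version can achieve an error of smaller order than the 1/√n Monte Carlo rate, uniformly over a function class.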
| Original language | English |
|---|---|
| Journal | Proceedings of Machine Learning Research |
| Volume | 84 |
| Publication status | Published - 1 Jan 2018 |
| Externally published | Yes |
| Event | 21st International Conference on Artificial Intelligence and Statistics, AISTATS 2018 - Playa Blanca, Lanzarote, Canary Islands, Spain Duration: 9 Apr 2018 → 11 Apr 2018 |