Abstract
Gradient boosting has been extensively studied in batch learning. Recently, its streaming adaptation, Streaming Gradient Boosted Trees (Sgbt), has surpassed state-of-the-art random subspace and random patches methods for streaming classification under various drift scenarios. However, its application to streaming regression remains unexplored. Vanilla Sgbt with squared loss exhibits high variance when applied to streaming regression problems. To address this, this work uses bagging streaming regressors to create Streaming Gradient Boosted Regression (Sgbr). Bagging streaming regressors are employed in two ways: first, as base learners within the existing Sgbt framework, and second, as an ensemble method that aggregates multiple Sgbts. Our extensive experiments on 11 streaming regression datasets, encompassing multiple drift scenarios, demonstrate that Sgb(Oza), a variant of the first Sgbr category, significantly outperforms current state-of-the-art streaming regression methods in terms of both predictive power and computational cost.
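The Sgb(Oza) variant combines gradient boosting with Oza-style online bagging, in which each arriving example is replayed k ~ Poisson(1) times for every base learner so that each learner effectively sees a different bootstrap of the stream. The sketch below illustrates only the online-bagging component under stated assumptions: a tiny SGD linear model stands in for the paper's streaming tree base learners, and the class names `OzaBaggingRegressor` and `OnlineSGDRegressor` are hypothetical, not the paper's API.

```python
import math
import random

class OnlineSGDRegressor:
    """Tiny online linear regressor trained by SGD.
    A stand-in for a streaming base learner (hypothetical; the paper
    uses streaming trees, not linear models)."""
    def __init__(self, n_features, lr=0.05):
        self.w = [0.0] * n_features
        self.b = 0.0
        self.lr = lr

    def predict(self, x):
        return sum(wi * xi for wi, xi in zip(self.w, x)) + self.b

    def learn(self, x, y):
        err = self.predict(x) - y  # gradient of the squared loss
        for i, xi in enumerate(x):
            self.w[i] -= self.lr * err * xi
        self.b -= self.lr * err

class OzaBaggingRegressor:
    """Oza-style online bagging for regression: each example is replayed
    k ~ Poisson(1) times per base learner; predictions are averaged."""
    def __init__(self, n_models, n_features, seed=42):
        self.models = [OnlineSGDRegressor(n_features) for _ in range(n_models)]
        self.rng = random.Random(seed)

    def _poisson1(self):
        # Knuth's method for sampling Poisson(lambda=1)
        limit = math.exp(-1.0)
        k, p = 0, 1.0
        while True:
            p *= self.rng.random()
            if p <= limit:
                return k
            k += 1

    def predict(self, x):
        return sum(m.predict(x) for m in self.models) / len(self.models)

    def learn(self, x, y):
        for m in self.models:
            for _ in range(self._poisson1()):
                m.learn(x, y)
```

For instance, streaming examples drawn from y = 2x + 1 into the ensemble one at a time drives the averaged prediction toward the true function, while the Poisson replay keeps the base learners diverse, which is the variance-reduction effect the abstract attributes to bagging.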
| Original language | English |
|---|---|
| Article number | 65 |
| Journal | Data Mining and Knowledge Discovery |
| Volume | 39 |
| Issue number | 5 |
| DOIs | |
| Publication status | Published - 1 Sept. 2025 |
| Externally published | Yes |
Fingerprint
Explore the research topics of "Gradient boosted bagging for evolving data stream regression". Together they form a unique fingerprint.