Abstract
Query Plan Representation (QPR) is central to workload modeling, with various deep-learning based architectures proposed in the literature. Our work is motivated by two key observations: (i) the research community still lacks clarity on which model, if any, best suits the QPR problem; and (ii) while transformers have revolutionized many fields, their potential for QPR remains largely underexplored. This study examines the strengths and challenges of Graph Transformers for QPR. We introduce a new taxonomy that unifies deep-learning based QPR techniques along key design axes. Our benchmark analysis of common QPR architectures reveals that Graph Transformer Networks (GTNs) consistently outperform alternatives, but can degrade under limited training data. To address this, we propose novel data augmentation techniques to enhance training diversity and refine GTN architectures by replacing ineffective language-model-inspired components with techniques better suited for query plans. Evaluation on JOB, TPC-H, and TPC-DS benchmarks shows that with sufficient training data, enhanced GTNs outperform existing models for capturing complex queries (JOB Full and TPC-DS) and enable the query embedder trained on TPC-DS to generalize to TPC-H queries out of the box.
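Any graph-based QPR model, including the GTNs studied here, first has to turn a query plan tree into graph-structured input. The sketch below is not the paper's implementation; it is a minimal, hypothetical illustration of that generic encoding step, with an assumed toy operator vocabulary (`OPS`) and an assumed per-node cardinality feature.

```python
import math

# Hypothetical toy plan: each node carries an operator type and an
# optimizer cardinality estimate; "children" holds its sub-plans.
PLAN = {
    "op": "HashJoin", "card": 1000.0,
    "children": [
        {"op": "SeqScan", "card": 5000.0, "children": []},
        {"op": "IndexScan", "card": 200.0, "children": []},
    ],
}

OPS = ["SeqScan", "IndexScan", "HashJoin"]  # assumed operator vocabulary

def plan_to_graph(plan):
    """Flatten a plan tree into (node_features, edges).

    Each node becomes [one-hot operator ..., log(1 + cardinality)];
    each parent->child link becomes a (parent_idx, child_idx) edge,
    ready to feed a graph neural network or graph transformer.
    """
    feats, edges = [], []

    def visit(node):
        idx = len(feats)
        one_hot = [1.0 if node["op"] == op else 0.0 for op in OPS]
        feats.append(one_hot + [math.log1p(node["card"])])
        for child in node["children"]:
            edges.append((idx, visit(child)))
        return idx

    visit(plan)
    return feats, edges
```

For the toy plan above, `plan_to_graph(PLAN)` yields three 4-dimensional feature vectors and the two parent-to-child edges `(0, 1)` and `(0, 2)`; a real system would add richer per-node features (predicates, costs, tables touched), which is exactly where the design axes of the paper's taxonomy differ.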
| Original language | English |
|---|---|
| Pages (from–to) | 5716-5730 |
| Number of pages | 15 |
| Journal | Proceedings of the VLDB Endowment |
| Volume | 18 |
| Issue number | 13 |
| DOIs | |
| Publication status | Published - 1 Jan 2025 |
| Externally published | Yes |
| Event | 52nd International Conference on Very Large Data Bases, VLDB 2026 - Boston, United States. Duration: 31 Aug 2026 → 4 Sep 2026 |
Fingerprint
Explore the research topics of "Graph Transformers for Query Plan Representation: Potentials and Challenges". Together they form a unique fingerprint.