
Graph Transformers for Query Plan Representation: Potentials and Challenges

Research output: Contribution to journal › Conference article › Peer-reviewed

Abstract

Query Plan Representation (QPR) is central to workload modeling, with various deep-learning based architectures proposed in the literature. Our work is motivated by two key observations: (i) the research community still lacks clarity on which model, if any, best suits the QPR problem; and (ii) while transformers have revolutionized many fields, their potential for QPR remains largely underexplored. This study examines the strengths and challenges of Graph Transformers for QPR. We introduce a new taxonomy that unifies deep-learning based QPR techniques along key design axes. Our benchmark analysis of common QPR architectures reveals that Graph Transformer Networks (GTNs) consistently outperform alternatives, but can degrade under limited training data. To address this, we propose novel data augmentation techniques to enhance training diversity and refine GTN architectures by replacing ineffective language-model-inspired components with techniques better suited for query plans. Evaluation on JOB, TPC-H, and TPC-DS benchmarks shows that with sufficient training data, enhanced GTNs outperform existing models for capturing complex queries (JOB Full and TPC-DS) and enable the query embedder trained on TPC-DS to generalize to TPC-H queries out of the box.
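To illustrate the core idea of applying a transformer to a query plan, the following minimal sketch encodes a toy plan tree as a node-feature matrix and runs one self-attention layer masked by the plan's edges, pooling the result into a fixed-size embedding. This is not the paper's architecture: the operator vocabulary, feature layout, dimensions, and neighbour-only attention mask are hypothetical simplifications (full GTNs typically attend globally with structural encodings).

```python
# Minimal sketch (not the paper's architecture): one masked self-attention
# layer over a toy query-plan graph, yielding a fixed-size plan embedding.
import numpy as np

rng = np.random.default_rng(0)

# Toy plan: HashJoin(SeqScan(A), IndexScan(B)) -- nodes 0, 1, 2.
OPS = ["SeqScan", "IndexScan", "HashJoin"]           # hypothetical operator vocab
nodes = [("SeqScan", 1000.0), ("IndexScan", 50.0), ("HashJoin", 800.0)]
edges = [(0, 2), (1, 2)]                             # child -> parent

def featurize(op, card):
    """One-hot operator type + log of estimated cardinality."""
    x = np.zeros(len(OPS) + 1)
    x[OPS.index(op)] = 1.0
    x[-1] = np.log(card)
    return x

X = np.stack([featurize(op, c) for op, c in nodes])  # (n, d_in)
n, d_in = X.shape
d = 8                                                # model width (hypothetical)

# Random projections stand in for learned weights.
W_in = rng.normal(scale=0.3, size=(d_in, d))
Wq, Wk, Wv = (rng.normal(scale=0.3, size=(d, d)) for _ in range(3))

H = X @ W_in                                         # (n, d) node states

# Structural bias: each node attends to itself and its graph neighbours.
mask = np.eye(n, dtype=bool)
for a, b in edges:
    mask[a, b] = mask[b, a] = True

Q, K, V = H @ Wq, H @ Wk, H @ Wv
scores = (Q @ K.T) / np.sqrt(d)
scores[~mask] = -np.inf                              # block non-neighbours
A = np.exp(scores - scores.max(axis=1, keepdims=True))
A /= A.sum(axis=1, keepdims=True)                    # row-wise softmax

H_out = A @ V                                        # updated node states
plan_embedding = H_out.mean(axis=0)                  # (d,) pooled plan vector
print(plan_embedding.round(3))
```

In a trained query embedder, the random weights would be learned end to end and the pooled vector would feed a downstream workload-modeling head (e.g., cost or cardinality prediction).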

Original language: English
Pages (from–to): 5716-5730
Number of pages: 15
Journal: Proceedings of the VLDB Endowment
Volume: 18
Issue number: 13
DOIs
Publication status: Published - 1 Jan 2025
Externally published: Yes
Event: 52nd International Conference on Very Large Data Bases, VLDB 2026 - Boston, United States
Duration: 31 Aug 2026 – 4 Sept 2026

Fingerprint

Explore the research topics of "Graph Transformers for Query Plan Representation: Potentials and Challenges". Together they form a unique fingerprint.

Cite this