
Graph Transformers for Query Plan Representation: Potentials and Challenges

Research output: Contribution to journal › Conference article › peer-review

Abstract

Query Plan Representation (QPR) is central to workload modeling, with various deep-learning based architectures proposed in the literature. Our work is motivated by two key observations: (i) the research community still lacks clarity on which model, if any, best suits the QPR problem; and (ii) while transformers have revolutionized many fields, their potential for QPR remains largely underexplored. This study examines the strengths and challenges of Graph Transformers for QPR. We introduce a new taxonomy that unifies deep-learning based QPR techniques along key design axes. Our benchmark analysis of common QPR architectures reveals that Graph Transformer Networks (GTNs) consistently outperform alternatives, but can degrade under limited training data. To address this, we propose novel data augmentation techniques to enhance training diversity and refine GTN architectures by replacing ineffective language-model-inspired components with techniques better suited for query plans. Evaluation on JOB, TPC-H, and TPC-DS benchmarks shows that with sufficient training data, enhanced GTNs outperform existing models for capturing complex queries (JOB Full and TPC-DS) and enable the query embedder trained on TPC-DS to generalize to TPC-H queries out of the box.
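To make the core idea concrete: a query plan is an operator tree, and a graph-based encoder turns each operator into a feature vector and lets information flow along plan edges via attention. The sketch below is not the paper's architecture; it is a minimal, hypothetical illustration (NumPy, single head, attention masked to plan edges, which is closer to graph attention than to a full Graph Transformer with global attention and structural encodings). All node features, operator names, and weight initializations are invented for illustration.

```python
import numpy as np

def plan_attention_layer(X, A, seed=0):
    """One self-attention layer over plan operators, restricted to
    parent-child edges plus self-loops (toy sketch, random weights)."""
    n, d = X.shape
    rng = np.random.default_rng(seed)
    Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(d)
    mask = A + np.eye(n)                      # attend only along edges and to self
    scores = np.where(mask > 0, scores, -1e9)  # block all other pairs
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)  # row-wise softmax
    return weights @ V

# Toy plan: a Hash Join over two Seq Scans (3 operator nodes, 4 features each;
# one-hot operator-type features stand in for learned encodings).
X = np.array([[1., 0., 0., 0.],   # node 0: Hash Join
              [0., 1., 0., 0.],   # node 1: Seq Scan (left input)
              [0., 0., 1., 0.]])  # node 2: Seq Scan (right input)
A = np.array([[0., 1., 1.],
              [1., 0., 0.],
              [1., 0., 0.]])      # undirected parent-child adjacency

H = plan_attention_layer(X, A)    # updated node states, shape (3, 4)
plan_embedding = H.mean(axis=0)   # mean-pool node states into one plan vector
print(plan_embedding.shape)       # (4,)
```

A real QPR model would stack several such layers, use learned operator/predicate encodings instead of one-hot features, and feed the pooled embedding to a downstream task head (e.g. cost or cardinality prediction).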

Original language: English
Pages (from-to): 5716-5730
Number of pages: 15
Journal: Proceedings of the VLDB Endowment
Volume: 18
Issue number: 13
Publication status: Published - 1 Jan 2025
Externally published: Yes
Event: 52nd International Conference on Very Large Data Bases, VLDB 2026 - Boston, United States
Duration: 31 Aug 2026 - 4 Sept 2026
