Energy-based Self-attentive Learning of Abstractive Communities for Spoken Language Understanding

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Abstractive community detection is an important spoken language understanding task, whose goal is to group utterances in a conversation according to whether they can be jointly summarized by a common abstractive sentence. This paper presents a novel approach to this task. We first introduce a neural contextual utterance encoder featuring three types of self-attention mechanisms. We then train it using the siamese and triplet energy-based meta-architectures. Experiments on the AMI corpus show that our system outperforms multiple energy-based and non-energy-based state-of-the-art baselines. Code and data are publicly available.
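The triplet energy-based training mentioned in the abstract can be sketched as follows. This is a minimal illustration only: the Euclidean energy function, the margin value, and the toy embeddings are assumptions for exposition, not the paper's actual encoder or hyperparameters.

```python
# Hedged sketch of a triplet energy-based objective (illustrative only;
# the energy function, margin, and embeddings are assumptions, not the
# paper's implementation).
import math

def energy(u, v):
    """Energy between two utterance embeddings: Euclidean distance.
    Lower energy means the utterances are more compatible, i.e. more
    likely to belong to the same abstractive community."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def triplet_loss(anchor, positive, negative, margin=1.0):
    """Hinge-style triplet loss: push the anchor-positive energy below
    the anchor-negative energy by at least `margin`."""
    return max(0.0, margin + energy(anchor, positive) - energy(anchor, negative))

# Toy example: anchor and positive embeddings are close (same community),
# the negative is far away (different community), so the loss is zero.
a, p, n = [0.0, 0.0], [0.1, 0.0], [3.0, 4.0]
loss = triplet_loss(a, p, n)
```

In the siamese variant, the same idea applies to pairs rather than triplets: the energy of same-community pairs is pushed down and that of different-community pairs is pushed above a margin, with both utterances encoded by the same shared-weight encoder.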

Original language: English
Title of host publication: Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing, AACL-IJCNLP 2020
Editors: Kam-Fai Wong, Kevin Knight, Hua Wu
Publisher: Association for Computational Linguistics (ACL)
Pages: 313-327
Number of pages: 15
ISBN (Electronic): 9781952148910
DOIs
Publication status: Published - 1 Jan 2020
Externally published: Yes
Event: 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing, AACL-IJCNLP 2020 - Virtual, Online, China
Duration: 4 Dec 2020 - 7 Dec 2020

Publication series

Name: Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing, AACL-IJCNLP 2020

Conference

Conference: 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing, AACL-IJCNLP 2020
Country/Territory: China
City: Virtual, Online
Period: 4/12/20 - 7/12/20
