Knowledge Bases and Language Models: Complementing Forces

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Large language models (LLMs), as a particular instance of generative artificial intelligence, have revolutionized natural language processing. In this invited paper, we argue that LLMs are complementary to structured data repositories such as databases or knowledge bases, which use symbolic knowledge representations. Hence, the two ways of knowledge representation will likely continue to co-exist, at least in the near future. We discuss ways that have been explored to make the two approaches work together, and point out opportunities and challenges for their symbiosis.

Original language: English
Title of host publication: Rules and Reasoning - 7th International Joint Conference, RuleML+RR 2023, Proceedings
Editors: Anna Fensel, Ana Ozaki, Dumitru Roman, Ahmet Soylu
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 3-15
Number of pages: 13
ISBN (Print): 9783031450716
DOIs
Publication status: Published - 1 Jan 2023
Event: 7th International Joint Conference on Rules and Reasoning, RuleML+RR 2023 - Oslo, Norway
Duration: 18 Sept 2023 - 20 Sept 2023

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 14244 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 7th International Joint Conference on Rules and Reasoning, RuleML+RR 2023
Country/Territory: Norway
City: Oslo
Period: 18/09/23 - 20/09/23

Keywords

  • Databases
  • Knowledge Bases
  • Large Language Models
