Conversational question answering over knowledge graphs with transformer and graph attention networks
Publication date
2021-01-01
Document type
Conference paper
Organisational unit
Universität Bonn
Conference
16th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2021), April 19-23, 2021
Book title
The 16th Conference of the European Chapter of the Association for Computational Linguistics - Proceedings of the Conference
Peer-reviewed
✅
Part of the university bibliography
No
Abstract
This paper addresses the task of (complex) conversational question answering over a knowledge graph. For this task, we propose LASAGNE (muLti-task semAntic parSing with trAnsformer and Graph atteNtion nEtworks). It is the first approach to employ a transformer architecture extended with Graph Attention Networks for multi-task neural semantic parsing. LASAGNE uses a transformer model to generate the base logical forms, while the Graph Attention model exploits correlations between (entity) types and predicates to produce node representations. LASAGNE also includes a novel entity recognition module which detects, links, and ranks all relevant entities in the question context. We evaluate LASAGNE on a standard dataset for complex sequential question answering, on which it outperforms existing baseline averages on all question types. Specifically, we show that LASAGNE improves the F1-score on eight out of ten question types; in some cases, the increase in F1-score is more than 20% compared to the state of the art.
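The paper's own model is not reproduced in this record; purely as an illustration of the graph-attention mechanism the abstract refers to, a single-head graph attention layer (in the style of Veličković et al.) can be sketched as follows. All function names, shapes, and parameters here are illustrative, not taken from LASAGNE:

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gat_layer(H, A, W, a, alpha=0.2):
    """Single-head graph attention layer (illustrative sketch).

    H: (N, F) node features, A: (N, N) adjacency (1 = edge, incl. self-loops),
    W: (F, F') projection matrix, a: (2*F',) attention parameter vector.
    Returns (N, F') updated node representations.
    """
    Z = H @ W                              # project node features: (N, F')
    Fp = Z.shape[1]
    # attention logits e_ij = LeakyReLU(a^T [z_i || z_j]), computed via broadcasting
    src = Z @ a[:Fp]                       # (N,) contribution of source node
    dst = Z @ a[Fp:]                       # (N,) contribution of target node
    e = src[:, None] + dst[None, :]        # (N, N) pairwise logits
    e = np.where(e > 0, e, alpha * e)      # LeakyReLU
    e = np.where(A > 0, e, -1e9)           # mask non-edges before normalizing
    att = softmax(e, axis=1)               # attention coefficients per node
    return att @ Z                         # aggregate neighbor representations
```

Masking non-edges with a large negative value before the softmax ensures each node attends only over its neighborhood; in the setting the abstract describes, such node representations would capture correlations between entity types and predicates.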
Version
Not applicable (or unknown)
Access right on openHSU
Metadata only access