Publication:
Conversational question answering over knowledge graphs with transformer and graph attention networks

cris.customurl 15249
cris.virtual.department Data Engineering
cris.virtual.departmentbrowse Data Engineering
cris.virtualsource.department 3a2553bc-4d23-4bae-a22f-5d92c868792c
dc.contributor.author Kacupaj, Endri
dc.contributor.author Plepi, Joan
dc.contributor.author Singh, Kuldeep
dc.contributor.author Thakkar, Harsh
dc.contributor.author Lehmann, Jens
dc.contributor.author Maleshkova, Maria
dc.date.issued 2021-01-01
dc.description.abstract This paper addresses the task of (complex) conversational question answering over a knowledge graph. For this task, we propose LASAGNE (muLti-task semAntic parSing with trAnsformer and Graph atteNtion nEtworks). It is the first approach that employs a transformer architecture extended with Graph Attention Networks for multi-task neural semantic parsing. LASAGNE uses a transformer model to generate the base logical forms, while the Graph Attention model exploits correlations between (entity) types and predicates to produce node representations. LASAGNE also includes a novel entity recognition module which detects, links, and ranks all relevant entities in the question context. We evaluate LASAGNE on a standard dataset for complex sequential question answering, on which it outperforms existing baselines averaged on all question types. Specifically, we show that LASAGNE improves the F1-score on eight out of ten question types; in some cases, the increase in F1-score is more than 20% compared to the state of the art.
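The abstract describes node representations for (entity) types and predicates being produced by a Graph Attention model. The sketch below is not the authors' implementation; it is a minimal, hedged illustration in plain PyTorch of a single graph attention layer of the kind the abstract refers to, with all class names, dimensions, and the toy type/predicate graph being assumptions made for the example.

```python
# Minimal graph attention layer sketch (illustrative only, not LASAGNE's code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class GraphAttentionLayer(nn.Module):
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)   # shared linear transform
        self.a = nn.Linear(2 * out_dim, 1, bias=False)    # attention scoring vector

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h:   (N, in_dim) node features, e.g. embeddings of entity types and predicates
        # adj: (N, N) binary adjacency matrix of the type-predicate graph
        Wh = self.W(h)                                     # (N, out_dim)
        N = Wh.size(0)
        # Pairwise concatenation [Wh_i || Wh_j] to score each candidate edge
        Wh_i = Wh.unsqueeze(1).expand(N, N, -1)
        Wh_j = Wh.unsqueeze(0).expand(N, N, -1)
        e = F.leaky_relu(self.a(torch.cat([Wh_i, Wh_j], dim=-1)).squeeze(-1))
        # Mask non-edges, then normalise attention over each node's neighbourhood
        e = e.masked_fill(adj == 0, float("-inf"))
        alpha = torch.softmax(e, dim=-1)
        return F.elu(alpha @ Wh)                           # (N, out_dim) node representations

# Toy usage: 4 nodes (2 entity types, 2 predicates) with self-loops and one edge each.
h = torch.randn(4, 16)
adj = torch.tensor([[1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1]], dtype=torch.float)
out = GraphAttentionLayer(16, 8)(h, adj)
print(out.shape)  # torch.Size([4, 8])
```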
dc.description.version NA
dc.identifier.doi 10.48550/arXiv.2104.01569
dc.identifier.isbn 9781954085022
dc.identifier.issn 2331-8422
dc.identifier.scopus 2-s2.0-85107293854
dc.identifier.uri https://openhsu.ub.hsu-hh.de/handle/10.24405/15249
dc.language.iso en
dc.publisher Association for Computational Linguistics (ACL)
dc.relation.conference 16th conference of the European Chapter of the Association for Computational Linguistics (EACL 2021), April 19-23, 2021
dc.relation.orgunit Universität Bonn
dc.rights.accessRights metadata only access
dc.title Conversational question answering over knowledge graphs with transformer and graph attention networks
dc.type Conference paper
dcterms.bibliographicCitation.booktitle The 16th Conference of the European Chapter of the Association for Computational Linguistics - proceedings of the conference
dcterms.bibliographicCitation.originalpublisherplace Stroudsburg, PA
dspace.entity.type Publication
hsu.peerReviewed
hsu.uniBibliography No