Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/31346
Title: Improving Low-resource Question Answering by Augmenting Question Information
Authors: Chen, A
Sun, Y
Zhao, X
Galindo Esparza, RP
Chen, K
Xiang, Y
Zhao, T
Zhang, M
Issue Date: 6-Dec-2023
Publisher: Association for Computational Linguistics (ACL)
Citation: Chen, A. et al. (2023) 'Improving Low-resource Question Answering by Augmenting Question Information', Findings of the Association for Computational Linguistics: EMNLP 2023, Singapore / Online, 6-10 December, pp. 10413 - 10420. doi: 10.18653/v1/2023.findings-emnlp.699.
Abstract: In the era of large models, low-resource question-answering tasks lag behind, emphasizing the importance of data augmentation - a key research avenue in natural language processing. The main challenges include leveraging the large model’s internal knowledge for data augmentation, determining which QA data component - the question, passage, or answer - benefits most from augmentation, and maintaining consistency in the augmented content without introducing excessive noise. To tackle these, we introduce PQQ, an innovative approach for question data augmentation consisting of Prompt Answer, Question Generation, and Question Filter. Our experiments reveal that ChatGPT underperforms on the experimental data, yet our PQQ method excels beyond existing augmentation strategies. Further, its universal applicability is validated through successful tests on high-resource QA tasks like SQuAD1.1 and TriviaQA.
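
The record contains no code; the following is a minimal, hypothetical Python sketch of a three-stage question-augmentation loop in the spirit of the PQQ pipeline described above (prompt for an answer, generate candidate questions, filter inconsistent ones). The function names, the round-trip string-match filter, and the dummy model callables are illustrative assumptions, not the authors' implementation.

    # Illustrative sketch only: augment QA training data by generating
    # candidate questions for a given (passage, answer) pair and keeping
    # those that round-trip back to the original answer.
    from dataclasses import dataclass
    from typing import Callable, List

    @dataclass
    class QAExample:
        passage: str
        answer: str
        question: str = ""

    def augment_questions(
        seed: List[QAExample],
        generate_question: Callable[[str, str], List[str]],  # (passage, answer) -> candidate questions
        answer_question: Callable[[str, str], str],          # (passage, question) -> predicted answer
        n_candidates: int = 3,
    ) -> List[QAExample]:
        """Return new (question, passage, answer) triples whose generated
        question is answered with the seed answer (a simple consistency filter)."""
        augmented: List[QAExample] = []
        for ex in seed:
            candidates = generate_question(ex.passage, ex.answer)[:n_candidates]
            for q in candidates:
                predicted = answer_question(ex.passage, q)
                # Keep the candidate only if the QA model recovers the seed answer.
                if predicted.strip().lower() == ex.answer.strip().lower():
                    augmented.append(QAExample(ex.passage, ex.answer, q))
        return augmented

    if __name__ == "__main__":
        # Dummy stand-ins so the sketch runs without any external model.
        gen = lambda passage, answer: [f"Which technique does the passage say expands small training sets?"]
        qa = lambda passage, question: "data augmentation"
        seed = [QAExample("Data augmentation expands small training sets.", "data augmentation")]
        print(augment_questions(seed, gen, qa))

In practice the two callables would wrap a large language model prompted for question generation and an existing QA model used as the filter; the exact prompting and filtering criteria are those of the paper, not this sketch.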
URI: https://bura.brunel.ac.uk/handle/2438/31346
DOI: https://doi.org/10.18653/v1/2023.findings-emnlp.699
ISBN: 979-8-89176-061-5
Other Identifiers: ORCiD: Rosella Paulina Galindo Esparza https://orcid.org/0000-0003-2552-0224
Appears in Collections: Brunel Design School Research Papers

Files in This Item:
File: FullText.pdf
Description: Copyright © 2023 Association for Computational Linguistics. Materials published in or after 2016 are licensed on a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/).
Size: 1.27 MB
Format: Adobe PDF


This item is licensed under a Creative Commons Attribution 4.0 International License.