Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/30635
Title: A review of faithfulness metrics for hallucination assessment in Large Language Models
Authors: Malin, B
Kalganova, T
Boulgouris, N
Keywords: cs.CL;evaluation;fact extraction;faithfulness;hallucination;LLM;machine translation;question-answering;RAG;summarization
Issue Date: 31-Dec-2024
Publisher: Cornell University
Citation: Malin, B., Kalganova, T. and Boulgouris, N. (2024) 'A review of faithfulness metrics for hallucination assessment in Large Language Models', arXiv preprint, arXiv:2501.00269v1 [cs.CL], pp. 1 - 13. doi: 10.48550/arXiv.2501.00269.
Abstract: This review examines the means by which faithfulness has been evaluated across open-ended summarization, question-answering and machine translation tasks. We find that the use of an LLM as a faithfulness evaluator is commonly the metric most highly correlated with human judgement. The means by which other studies have mitigated hallucinations are discussed, with both retrieval augmented generation (RAG) and prompting framework approaches having been linked with superior faithfulness, and further recommendations for mitigation are provided. Research into faithfulness is integral to the continued widespread use of LLMs, as unfaithful responses can pose major risks in many areas where LLMs would otherwise be suitable. Furthermore, evaluating open-ended generation provides a more comprehensive measure of LLM performance than commonly used multiple-choice benchmarking, which can help to advance the trust that can be placed in LLMs.
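
As an illustrative aside on the LLM-as-judge evaluation mentioned in the abstract, the Python sketch below shows one minimal way an evaluator LLM could be asked to rate the faithfulness of a generated summary against its source. The prompt wording, the 1-to-5 scale, and the query_llm callable are assumptions made purely for illustration; they are not the protocol described or reviewed in the paper.

# Minimal sketch of LLM-as-judge faithfulness scoring.
# The prompt wording, the 1-5 scale, and the query_llm() helper are
# illustrative assumptions, not the method evaluated in the paper.

def build_judge_prompt(source: str, summary: str) -> str:
    """Ask an evaluator LLM whether the summary is supported by the source."""
    return (
        "You are evaluating faithfulness.\n"
        f"Source document:\n{source}\n\n"
        f"Candidate summary:\n{summary}\n\n"
        "On a scale of 1 (entirely unsupported) to 5 (fully supported by the source), "
        "rate the faithfulness of the summary. Answer with a single integer."
    )

def faithfulness_score(source: str, summary: str, query_llm) -> int:
    """query_llm is any callable that sends a prompt string to an LLM and returns its text reply."""
    reply = query_llm(build_judge_prompt(source, summary))
    digits = [c for c in reply if c.isdigit()]
    return int(digits[0]) if digits else 0  # fall back to 0 when the reply is unparsable

Scores of this kind are typically averaged over a test set and correlated with human ratings, which is the sense in which the review compares such metrics against human judgement.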
URI: https://bura.brunel.ac.uk/handle/2438/30635
DOI: https://doi.org/10.48550/arXiv.2501.00269
Other Identifiers: ORCiD: Ben Malin https://orcid.org/0009-0006-5791-2555
ORCiD: Tatiana Kalganova https://orcid.org/0000-0003-4859-7152
ORCiD: Nikolaos Boulgouris https://orcid.org/0000-0002-5382-6856
arXiv:2501.00269v1 [cs.CL]
Appears in Collections: Dept of Electronic and Electrical Engineering Research Papers

Files in This Item:
File: Preprint.pdf (307.69 kB, Adobe PDF) - View/Open
Description: Copyright © 2024 The Author(s). This work is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/).


This item is licensed under a Creative Commons License.