Please use this identifier to cite or link to this item:
http://bura.brunel.ac.uk/handle/2438/31832
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Yuan, Y | - |
dc.contributor.author | Tao, L | - |
dc.contributor.author | Lu, H | - |
dc.contributor.author | Khushi, M | - |
dc.contributor.author | Razzak, I | - |
dc.contributor.author | Dras, M | - |
dc.contributor.author | Yang, J | - |
dc.contributor.author | Naseem, U | - |
dc.coverage.spatial | Sydney NSW, Australia | - |
dc.date.accessioned | 2025-08-26T11:41:05Z | - |
dc.date.available | 2025-08-26T11:41:05Z | - |
dc.date.issued | 2025-05-23 | - |
dc.identifier | ORCiD: Matloob Khushi https://orcid.org/0000-0001-7792-2327 | - |
dc.identifier.citation | Yuan, Y. et al. (2025) 'KG-UQ: Knowledge Graph-Based Uncertainty Quantification for Long Text in Large Language Models', WWW '25: Companion Proceedings of the ACM on Web Conference 2025, Sydney NSW, Australia, 28 April-2 May, pp. 2071 - 2077. doi: 10.1145/3701716.3717660. | en_US |
dc.identifier.isbn | 979-8-4007-1331-6 | - |
dc.identifier.uri | https://bura.brunel.ac.uk/handle/2438/31832 | - |
dc.description.abstract | With the commercialization of large language models (LLMs) and their integration into daily life, addressing their susceptibility to hallucinations (non-factual information in generated outputs) has become an urgent priority. Existing uncertainty quantification (UQ) methods often rely on access to LLMs' internal states, which is unavailable for closed-source models such as GPTs, or are primarily designed for short text. Current research on long text typically evaluates sentences individually, overlooking smaller semantic units that better capture the text's complexity. Recognizing the potential of knowledge graphs (KGs) to extract structured relationships from unstructured text, we propose KG-UQ, a UQ method leveraging KGs to address the semantic intricacies of long text. Our approach involves constructing KGs from long-text outputs and utilizing their embeddings to estimate uncertainties. Through our analysis, we demonstrate that knowledge graphs are an effective tool for decomposing long text into fundamental statements. However, we also highlight the increased uncertainty introduced during KG construction, stemming from inherent challenges in accurately capturing all semantic information. | en_US |
dc.description.sponsorship | This research was supported by the Macquarie University Research Acceleration Scheme (MQRAS) and Data Horizon funding. | en_US |
dc.format.extent | 2071 - 2077 | - |
dc.language | English | - |
dc.language.iso | en_US | en_US |
dc.publisher | Association for Computing Machinery (ACM) | en_US |
dc.rights | Creative Commons Attribution 4.0 International | - |
dc.rights.uri | https://creativecommons.org/licenses/by/4.0/ | - |
dc.source | WWW '25: The ACM Web Conference 2025 | - |
dc.subject | large language models | en_US |
dc.subject | hallucinations | en_US |
dc.subject | uncertainty quantification | en_US |
dc.subject | knowledge graphs | en_US |
dc.subject | LLM | en_US |
dc.subject | uncertainty estimation | en_US |
dc.title | KG-UQ: Knowledge Graph-Based Uncertainty Quantification for Long Text in Large Language Models | en_US |
dc.type | Conference Paper | en_US |
dc.date.dateAccepted | 2025-01-27 | - |
dc.identifier.doi | https://doi.org/10.1145/3701716.3717660 | - |
dc.relation.isPartOf | WWW '25: Companion Proceedings of the ACM on Web Conference 2025 | - |
pubs.finish-date | 2025-05-02 | - |
pubs.publication-status | Published | - |
pubs.start-date | 2025-04-28 | - |
dc.rights.license | https://creativecommons.org/licenses/by/4.0/legalcode.en | - |
dcterms.dateAccepted | 2025-01-27 | - |
dc.rights.holder | The owner/author(s) | - |
Appears in Collections: | Dept of Computer Science Research Papers |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
FullText.pdf | Copyright © 2025 held by the owner/author(s). This work is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/). | 1.52 MB | Adobe PDF | View/Open |
This item is licensed under a Creative Commons License