Journal article

Detecting hallucinations in large language models using semantic entropy

Abstract:
Large language model (LLM) systems, such as ChatGPT [1] or Gemini [2], can show impressive reasoning and question-answering capabilities but often ‘hallucinate’ false outputs and unsubstantiated answers [3, 4]. Answering unreliably or without the necessary information prevents adoption in diverse fields, with problems including fabrication of legal precedents [5] or untrue facts in news articles [6], and even posing a risk to human life in medical domains such as radiology [7]. Encouraging truthfulness through supervision or reinforcement has been only partially successful [8]. Researchers need a general method for detecting hallucinations in LLMs that works even with new and unseen questions to which humans might not know the answer. Here we develop new methods grounded in statistics, proposing entropy-based uncertainty estimators for LLMs to detect a subset of hallucinations—confabulations—which are arbitrary and incorrect generations. Our method addresses the fact that one idea can be expressed in many ways by computing uncertainty at the level of meaning rather than specific sequences of words. Our method works across datasets and tasks without a priori knowledge of the task, requires no task-specific data and robustly generalizes to new tasks not seen before. By detecting when a prompt is likely to produce a confabulation, our method helps users understand when they must take extra care with LLMs and opens up new possibilities for using LLMs that are otherwise prevented by their unreliability.
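The abstract's core idea, measuring uncertainty over meanings rather than over specific word sequences, can be sketched in a few lines. This is an illustrative sketch, not the authors' implementation: the paper clusters sampled answers by bidirectional entailment using a natural-language-inference model, whereas here a toy case-insensitive string match (`equiv`) stands in for that semantic-equivalence check, and the function names are hypothetical.

```python
import math

def cluster_by_meaning(answers, equivalent):
    """Greedily group sampled answers into clusters that share one meaning."""
    clusters = []  # each cluster is a list of answers judged equivalent
    for ans in answers:
        for cluster in clusters:
            if equivalent(ans, cluster[0]):
                cluster.append(ans)
                break
        else:
            clusters.append([ans])
    return clusters

def semantic_entropy(answers, equivalent):
    """Entropy (in nats) of the empirical distribution over meaning clusters."""
    clusters = cluster_by_meaning(answers, equivalent)
    n = len(answers)
    return -sum(len(c) / n * math.log(len(c) / n) for c in clusters)

# Toy stand-in for the paper's bidirectional-entailment check (an assumption):
equiv = lambda a, b: a.strip().lower() == b.strip().lower()

# Four samples expressing two distinct meanings: entropy = ln 2.
samples = ["Paris", "paris", "Lyon", "Lyon"]
print(round(semantic_entropy(samples, equiv), 3))  # prints 0.693
```

Low semantic entropy means the model's sampled answers agree in meaning even if their wording differs; high semantic entropy signals arbitrary, likely confabulated generations.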
Publication status:
Published
Peer review status:
Peer reviewed

Publisher copy:
10.1038/s41586-024-07421-0

Authors

Institution: University of Oxford; Role: Author; ORCID: 0000-0002-9185-6415
Institution: University of Oxford; Role: Author
Institution: University of Oxford; Role: Author
Institution: University of Oxford; Role: Author; ORCID: 0000-0002-2733-2078


Publisher:
Nature Research
Journal:
Nature
Volume:
630
Issue:
8017
Pages:
625-630
Publication date:
2024-06-19
Acceptance date:
2024-04-12
DOI:
10.1038/s41586-024-07421-0
EISSN:
1476-4687
ISSN:
0028-0836


Language:
English
Pubs id:
2009320
Local pid:
pubs:2009320
Source identifiers:
2055962
Deposit date:
2024-06-20
This ORA record was generated from metadata provided by an external service. It has not been edited by the ORA Team.
