Related Researcher

Na, Seung-Hoon (나승훈)
Natural Language Processing Lab

Detailed Information

Evaluating Complex Entity Knowledge Propagation for Knowledge Editing in LLMs

Author(s)
Shafqat, Wafa; Na, Seung-Hoon
Issued Date
2024-02
DOI
10.3390/app14041508
URI
https://scholarworks.unist.ac.kr/handle/201301/86770
Citation
APPLIED SCIENCES-BASEL, v.14, no.4, pp.1508
Abstract
In today's world, where information keeps growing rapidly and changing constantly, language models play a crucial role in making our lives easier across different fields. However, it is tough to keep these models updated with all the new data while making sure they stay accurate and relevant. To tackle this challenge, our study proposes an innovative approach to facilitate the propagation of complex entity knowledge within language models through extensive triplet representation. Using a specially curated dataset (CTR-KE) derived from reliable sources like Wikipedia and Wikidata, the research assesses the efficacy of editing methods in handling intricate relationships between entities across multiple tiers of information. By employing a comprehensive triplet representation strategy, the study aims to enrich contextual understanding while mitigating the risks associated with distorting or forgetting critical information. The study evaluates its proposed methodology using various evaluation metrics and four distinct editing methods across three diverse language models (GPT2-XL, GPT-J, and Llama-2-7b). The results indicate the superiority of mass-editing memory in a transformer (MEMIT) and in-context learning for knowledge editing (IKE) in efficiently executing multiple updates within the triplet representation framework. This research signifies a promising pathway for deeper exploration of data representation for knowledge editing within large language models, and improved understanding of contexts to facilitate continual learning.
Publisher
MDPI
ISSN
2076-3417
Keyword (Author)
entity knowledge propagation (EKP); comprehensive knowledge representation; knowledge graph; knowledge editing; large language models (LLMs)

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.