Fine-tuning large language models for knowledge graph completion

Li Zhiyong2
Li Dequan1,2
1. State Key Laboratory of Digital Intelligent Technology for Unmanned Coal Mining, Anhui University of Science & Technology, Huainan Anhui 232001, China
2. School of Artificial Intelligence, Anhui University of Science & Technology, Hefei 231131, China

Abstract

Knowledge graph completion aims to address the prevalent problem of missing information in knowledge graphs. Traditional representation learning methods struggle to effectively leverage external textual information and the high-order structure of the graph, while directly employing large language models faces challenges such as factual hallucination and insufficient use of structural information. To address these limitations, this paper proposes a Dual-Path Fine-Tuning framework (DPFT) for large language models tailored to knowledge graph completion. The framework combines the advantages of generative and contrastive learning: the generative path enhances generalization capability, while the contrastive path strengthens the ability to discriminate factual differences. By designing structured prompt templates that incorporate multi-hop neighborhood information, the framework reformulates the knowledge graph completion task as a context-based reasoning problem. Experimental results show that the proposed method outperforms baseline models across various tasks. On the WN11 and FB13 datasets, triple classification accuracy improves by 0.31% and 0.67%, respectively. For entity prediction, performance increases by 17.93% on WN18RR and 37.89% on YAGO3-10. Additionally, relation prediction performance on YAGO3-10 improves by 3.93%.
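The abstract describes reformulating knowledge graph completion as context-based reasoning via structured prompt templates that carry multi-hop neighborhood information. A minimal sketch of what such a template might look like is given below; the function name, prompt wording, and triple serialization format are all illustrative assumptions, not the paper's actual template.

```python
# Hypothetical sketch: serialize a query triple plus its neighborhood
# triples into a text prompt so an LLM can complete the missing entity.
# Names and formats here are assumptions for illustration only.

def build_kgc_prompt(head, relation, neighbors, max_neighbors=4):
    """Build a context-based reasoning prompt for entity prediction.

    neighbors: list of (head, relation, tail) triples drawn from the
    multi-hop neighborhood of the query entity.
    """
    # Serialize a bounded number of neighborhood triples as context lines.
    context_lines = [f"({h}, {r}, {t})" for (h, r, t) in neighbors[:max_neighbors]]
    context = "\n".join(context_lines)
    return (
        "Given the following facts from a knowledge graph:\n"
        f"{context}\n"
        f"Complete the triple: ({head}, {relation}, ?)"
    )

# Example: predict the tail entity for (Paris, capital_of, ?).
prompt = build_kgc_prompt(
    "Paris", "capital_of",
    neighbors=[("Paris", "located_in", "France"),
               ("France", "part_of", "Europe")],
)
print(prompt)
```

In a fine-tuning setup of this kind, prompts like this would form the input side of training pairs, with the gold tail entity as the target for the generative path.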

Foundation Support

National Key Research and Development Program of China (No. 2023YFC3807500)

Publish Information

DOI: 10.19734/j.issn.1001-3695.2025.11.0469
Published in: Application Research of Computers, Accepted Paper, Vol. 43, No. 7, 2026

Publish History

[2026-03-25] Accepted Paper

Cite This Article

李之勇, 李德权. 基于大语言模型微调的知识图谱补全 [J]. 计算机应用研究, 2026, 43 (7). (2026-03-27). https://doi.org/10.19734/j.issn.1001-3695.2025.11.0469. (Li Zhiyong, Li Dequan. Fine-tuning large language models for knowledge graph completion [J]. Application Research of Computers, 2026, 43 (7). (2026-03-27). https://doi.org/10.19734/j.issn.1001-3695.2025.11.0469. )

About the Journal

  • Application Research of Computers, monthly journal
  • ISSN 1001-3695
  • CN 51-1196/TP

Application Research of Computers, founded in 1984, is an academic journal of computing technology sponsored by Sichuan Institute of Computer Sciences under the Science and Technology Department of Sichuan Province.

Focusing on urgently needed cutting-edge technology in the discipline, Application Research of Computers promptly reflects the mainstream technologies, hot topics, and latest development trends of computer application research at home and abroad. The journal mainly publishes high-level academic papers, the latest research results, and major application results in the discipline. Its columns cover new theories of computer science, fundamental computer theory, algorithm theory, algorithm design and analysis, blockchain technology, system software and software engineering, pattern recognition and artificial intelligence, computer architecture, advanced computing, parallel processing, database technology, computer networks and communication technology, information security technology, and computer image and graphics together with their latest hot applications.

Application Research of Computers has a large base of high-level readers and authors. Its readers are mainly senior and mid-level researchers and engineers in the field of computer science, as well as faculty and students in computer science and related majors at colleges and universities. Over the years, its total citation frequency and web download rate have ranked among the top of similar academic journals in the discipline, and the papers it publishes are popular with readers for their novelty, academic quality, foresight, guidance, and practicality.


Indexed & Evaluation

  • The Second National Periodical Award 100 Key Journals
  • Double Effect Journal of China Journal Formation
  • Core Journal of China (Peking University 2023 Edition)
  • Core Journal for Science and Technology of China
  • Chinese Science Citation Database (CSCD) Source Journals
  • RCCSE Chinese Core Academic Journals
  • Journal of China Computer Federation
  • 2020-2022 The World Journal Clout Index (WJCI) Report of Scientific and Technological Periodicals
  • Full-text Source Journal of China Science and Technology Periodicals Database
  • Source Journal of China Academic Journals Comprehensive Evaluation Database
  • Source Journals of China Academic Journals (CD-ROM Version), China Journal Network
  • 2017-2019 China Outstanding Academic Journals with International Influence (Natural Science and Engineering Technology)
  • Source Journal of Top Academic Papers (F5000) Program of China's Excellent Science and Technology Journals
  • Source Journal of China Engineering Technology Electronic Information Network and Electronic Technology Literature Database
  • Source Journal of Science Abstracts (INSPEC, UK)
  • Japan Science and Technology Agency (JST) Source Journal
  • Source Journal of Abstract Journal (AJ) of VINITI, Russia
  • Full-text Journal of EBSCO, USA
  • Core Journal of Cambridge Scientific Abstracts (Natural Sciences) (CSA (NS))
  • Index Copernicus (IC), Poland
  • Ulrichsweb (USA)