
PMoE: parameter-efficient fine-tuning framework introducing mixture of experts in P-tuning

Wang Hao1a, Wang Jun1a, Hu Haifeng1a, Zhou Feifei2, Gong Rui2, Zhang Suofei1b
1. a. School of Communications & Information Engineering, b. School of Internet of Things, Nanjing University of Posts & Telecommunications, Nanjing 210003, China
2. China Telecom Corporation Limited Jiangsu Branch, Nanjing 210037, China

Abstract

Large language models (LLMs) have significantly improved performance in reasoning and generation tasks. However, existing open-source LLMs still lack sufficient domain-specific knowledge and require fine-tuning for specialized tasks, and traditional fine-tuning methods struggle to balance low cost and high efficiency in multi-task learning. To address this issue, this paper proposed a parameter-efficient fine-tuning framework named PMoE. Built on the P-tuning method, the framework introduced a mixture-of-experts mechanism to enhance multi-task processing while keeping tuning costs low. In each Transformer layer, PMoE constructed trainable expert modules to replace the prompt modules of P-tuning and used a routing mechanism to dynamically allocate tasks according to input task features. In addition, it designed the expert modules to be detachable, enabling model reuse across different task scenarios and further reducing computational cost. Experimental results demonstrate that PMoE achieves a 6.24% performance improvement over P-tuning on a Chinese medical dataset and exhibits superior multi-task processing and transfer learning capabilities, verifying its efficiency and broad applicability.
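The routing idea described in the abstract — per-layer expert prompt modules selected by a router from input task features — can be sketched in a few lines. This is a minimal illustrative sketch, not the paper's actual design: the dimensions, the soft (weighted-mixture) routing, and all names here are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions, not taken from the paper.
d_model, prompt_len, n_experts = 16, 4, 3

# Each expert holds its own trainable prompt matrix (prompt_len x d_model),
# replacing the single shared prompt module of plain P-tuning.
experts = [rng.standard_normal((prompt_len, d_model)) for _ in range(n_experts)]

# Router: a linear map from a pooled input feature to one logit per expert.
W_router = rng.standard_normal((d_model, n_experts))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def route(input_feature):
    """Combine expert prompts with gate weights produced by the router."""
    gate = softmax(input_feature @ W_router)            # shape (n_experts,)
    prompt = sum(g * E for g, E in zip(gate, experts))  # shape (prompt_len, d_model)
    return gate, prompt

feature = rng.standard_normal(d_model)  # pooled task feature for one input
gate, prompt = route(feature)
# gate weights sum to 1; the mixed prompt is prepended to the layer input.
```

Because only the expert prompt matrices and the router are trainable, the backbone stays frozen, which is what makes the experts detachable and reusable across tasks as the abstract claims.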

Funding

Supported by the National Natural Science Foundation of China (62371245)

Publication Information

DOI: 10.19734/j.issn.1001-3695.2024.11.0484
Published in: Application Research of Computers (Print), Vol. 42, No. 7, 2025
Section: Special Topics in Large Models
Pages: 1956-1963
Serial Number: 1001-3695(2025)07-005-1956-08

Publication History

[2025-03-13] Accepted Paper
[2025-07-05] Published in Print

Cite This Article

王浩, 王珺, 胡海峰, 等. PMoE:在P-tuning中引入混合专家的参数高效微调框架 [J]. 计算机应用研究, 2025, 42 (7): 1956-1963. (Wang Hao, Wang Jun, Hu Haifeng, et al. PMoE: parameter-efficient fine-tuning framework introducing mixture of experts in P-tuning [J]. Application Research of Computers, 2025, 42 (7): 1956-1963. )

About the Journal

  • Application Research of Computers (Monthly)
  • ISSN 1001-3695
    CN 51-1196/TP

Application Research of Computers, founded in 1984, is an academic journal of computing technology sponsored by the Sichuan Institute of Computer Sciences under the Science and Technology Department of Sichuan Province.

Focusing on urgently needed cutting-edge technologies in the discipline, Application Research of Computers promptly reflects the mainstream technologies, hot topics, and latest development trends of computer application research at home and abroad. The journal publishes high-level academic papers, the latest research results, and major application results in the field. Its columns cover new theories of computer science, fundamental computer theory, algorithm theory, algorithm design and analysis, blockchain technology, system software and software engineering, pattern recognition and artificial intelligence, computer architecture, advanced computing, parallel processing, database technology, computer networks and communication, information security, and computer graphics and image processing, together with their latest applications.

Application Research of Computers has a large base of high-level readers and authors. Its readers are mainly senior and mid-level researchers and engineers in computer science, as well as faculty and students in computer science and related majors at colleges and universities. Over the years, its total citation frequency and web download rate have ranked among the top of similar academic journals in the discipline, and its papers are popular with readers for their novelty, academic quality, foresight, guidance, and practicality.


Indexing & Evaluation

  • 100 Key Journals of the Second National Periodical Award
  • Double Effect Journal of China Journal Formation
  • Core Journal of China (Peking University, 2023 Edition)
  • Core Journal for Science
  • Chinese Science Citation Database (CSCD) Source Journal
  • RCCSE Chinese Core Academic Journal
  • Journal of China Computer Federation
  • World Journal Clout Index (WJCI) Report of Scientific and Technological Periodicals, 2020-2022
  • Full-text Source Journal of the China Science and Technology Periodicals Database
  • Source Journal of the China Academic Journals Comprehensive Evaluation Database
  • Source Journal of China Academic Journals (CD-ROM Version) and China Journal Network
  • China Outstanding Academic Journals with International Influence (Natural Science and Engineering Technology), 2017-2019
  • Source Journal of the Top Academic Papers (F5000) Program of China's Excellent Science and Technology Journals
  • Source Journal of the China Engineering Technology Electronic Information Network and Electronic Technology Literature Database
  • Inspec (UK) Source Journal
  • Japan Science and Technology Agency (JST) Source Journal
  • VINITI Abstract Journal (AJ, Russia) Source Journal
  • EBSCO (USA) Full-text Journal
  • Cambridge Scientific Abstracts, Natural Sciences (CSA(NS)) Core Journal
  • Index Copernicus (IC, Poland)
  • Ulrichsweb (USA)