
Multimodal dialogue model and applications based on modality-sensitive attention mechanism

Du Wei1
Zhu Xiaoying2
Xu Fangmin3
Zheng Jiansheng4
Zhu Fuxi1
Gong Mingmin1
Li Ziyu1
1. School of Information Engineering, Wuhan College, Wuhan, Hubei Province 430030, China
2. School of Cyberspace Security, Beijing University of Posts & Telecommunications, Beijing 100000, China
3. School of Information & Communication Engineering, Beijing University of Posts & Telecommunications, Beijing 100000, China
4. School of Electronic Information, Wuhan University, Wuhan, Hubei Province 430030, China

Abstract

Multimodal dialogue systems adopt methods such as the Transformer, cross-attention mechanisms and pre-trained models to fuse text, speech and video modalities of different granularities and to extract cross-modal features. However, existing research ignores the differences in how sensitive different modal features are to the classification task, resulting in excessive fusion and information redundancy. Focusing on the influence of the sequential features of multimodal fusion on classification results, this paper proposed the multimodal dialogue model MDM-MSAM (multimodal dialogue model based on a modality-sensitive attention mechanism). The model was divided into three parts: master-slave modality screening, dual-modal cross-modal fusion, and tri-modal cross-modal fusion. By determining the master and slave modalities, extracting cross-dual-modal features, and then re-fusing them with the tri-modal fusion features, the model formed modality-sensitive hierarchical cross-multimodal features. The classification accuracy on the MintRec and CMU-MOSI datasets increased by 3.15% and 3.5% respectively compared with the currently best-performing models. The deployment of the MDM-MSAM model in a flow-engine-based multi-round dialogue system achieved good application results.
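
To make the fusion pipeline described in the abstract concrete, the following is a minimal, illustrative PyTorch sketch (not the authors' implementation) of modality-sensitive hierarchical fusion: a learned sensitivity score selects the master modality, cross-attention fuses the master with each modality stream (dual-modal fusion), and the result is re-fused with a tri-modal fusion of all three streams before classification. All module names, dimensions, pooling choices, and the selection heuristic are assumptions made for illustration only.

# Illustrative sketch of modality-sensitive hierarchical fusion (assumed design).
import torch
import torch.nn as nn


class ModalitySensitiveFusion(nn.Module):
    def __init__(self, dim: int = 256, heads: int = 4, num_classes: int = 20):
        super().__init__()
        # Scores each modality's task sensitivity to pick the master modality.
        self.sensitivity = nn.Linear(dim, 1)
        self.cross_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.tri_fuse = nn.Sequential(nn.Linear(3 * dim, dim), nn.ReLU())
        self.classifier = nn.Linear(2 * dim, num_classes)

    def forward(self, text, audio, video):
        # text / audio / video: (batch, seq_len, dim) sequence features per modality.
        feats = [text, audio, video]
        pooled = torch.stack([f.mean(dim=1) for f in feats], dim=1)   # (B, 3, dim)
        scores = self.sensitivity(pooled).squeeze(-1)                 # (B, 3)
        # argmax selection shown for clarity; a differentiable gate could be trained instead.
        master_idx = scores.argmax(dim=-1)

        batch = torch.arange(text.size(0), device=text.device)
        stacked = torch.stack(feats, dim=1)                           # (B, 3, L, dim)
        master = stacked[batch, master_idx]                           # (B, L, dim)

        # Dual-modal fusion: the master queries each stream via cross-attention.
        # For brevity the master also attends to itself; a stricter master-slave
        # scheme would restrict this to the two non-master modalities.
        dual = []
        for m in range(3):
            slave = stacked[:, m]
            attended, _ = self.cross_attn(master, slave, slave)       # (B, L, dim)
            dual.append(attended.mean(dim=1))
        dual_feat = torch.stack(dual, dim=1).mean(dim=1)              # (B, dim)

        # Tri-modal fusion of all pooled streams, then re-fusion with dual features.
        tri_feat = self.tri_fuse(pooled.flatten(1))                   # (B, dim)
        return self.classifier(torch.cat([dual_feat, tri_feat], dim=-1))


# Usage with dummy modality sequences of equal length and width.
if __name__ == "__main__":
    B, L, D = 2, 10, 256
    model = ModalitySensitiveFusion(dim=D)
    logits = model(torch.randn(B, L, D), torch.randn(B, L, D), torch.randn(B, L, D))
    print(logits.shape)  # torch.Size([2, 20])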

Foundation Support

National Natural Science Foundation of China (42374013)
Beijing Natural Science Foundation (L234080)
Wuhan College Annual Scientific Research Fund Program (JJA202304)
China University Industry-University-Research Innovation Fund, Tencent Science and Technology Innovation Education Special Program (2022TX007)

Publication Information

DOI: 10.19734/j.issn.1001-3695.2025.02.0043
Published in: Application Research of Computers (Accepted Paper), Vol. 42, No. 9, 2025

Publication History

[2025-05-21] Accepted Paper

Cite This Article

Du Wei, Zhu Xiaoying, Xu Fangmin, et al. Multimodal dialogue model and applications based on modality-sensitive attention mechanism [J]. Application Research of Computers, 2025, 42 (9). (2025-05-27). https://doi.org/10.19734/j.issn.1001-3695.2025.02.0043.

About the Journal

  • Application Research of Computers, Monthly Journal
  • Journal ID: ISSN 1001-3695, CN 51-1196/TP

Application Research of Computers, founded in 1984, is an academic journal of computing technology sponsored by Sichuan Institute of Computer Sciences under the Science and Technology Department of Sichuan Province.

Focusing on the cutting-edge technologies urgently needed in the discipline, Application Research of Computers reflects in a timely manner the mainstream technologies, hot topics, and latest development trends of computer application research at home and abroad. The journal mainly publishes high-level academic papers, the latest scientific research results, and major application results in the discipline. Its columns cover new theories of computer science, fundamental computer theory, algorithm theory, algorithm design and analysis, blockchain technology, system software and software engineering, pattern recognition and artificial intelligence, computer architecture, advanced computing, parallel processing, database technology, computer networks and communication, information security, and computer image and graphics technology together with its latest hot applications.

Application Research of Computers has a large body of high-level readers and authors; its readership consists mainly of senior and mid-level researchers and engineers in the computer field, as well as faculty and students in computer science and related majors at colleges and universities. Over the years, its total citation frequency and web download rate have ranked among the highest of comparable academic journals in the discipline, and its papers are well received by readers for their novelty, academic quality, foresight, guidance, and practicality.


Indexed & Evaluation

  • The Second National Periodical Award 100 Key Journals
  • Double Effect Journal of China Journal Formation
  • The Core Journal of China (Peking University, 2023 Edition)
  • The Core Journal for Science
  • Chinese Science Citation Database (CSCD) Source Journals
  • RCCSE Chinese Core Academic Journals
  • Journal of China Computer Federation
  • 2020-2022 The World Journal Clout Index (WJCI) Report of Scientific and Technological Periodicals
  • Full-text Source Journal of China Science and Technology Periodicals Database
  • Source Journal of China Academic Journals Comprehensive Evaluation Database
  • Source Journals of China Academic Journals (CD-ROM Version), China Journal Network
  • 2017-2019 China Outstanding Academic Journals with International Influence (Natural Science and Engineering Technology)
  • Source Journal of Top Academic Papers (F5000) Program of China's Excellent Science and Technology Journals
  • Source Journal of China Engineering Technology Electronic Information Network and Electronic Technology Literature Database
  • Source Journal of British Science Digest (INSPEC)
  • Japan Science and Technology Agency (JST) Source Journal
  • Russian Journal of Abstracts (AJ, VINITI) Source Journals
  • Full-text Journal of EBSCO, USA
  • Cambridge Scientific Abstracts (Natural Sciences) (CSA (NS)) Core Journals
  • Poland Copernicus Index (IC)
  • Ulrichsweb (USA)