DC-LoRA: Domain correlation low-rank adaptation for domain incremental learning

Lin Li*, Shiye Wang, Changsheng Li, Ye Yuan, Guoren Wang

*Corresponding author of this work

Research output: Contribution to journal › Article › peer-review

Abstract

Continual learning, characterized by the sequential acquisition of multiple tasks, has emerged as a prominent challenge in deep learning. During continual learning, deep neural networks suffer from catastrophic forgetting: when training on new tasks, they lose knowledge acquired from previous tasks. Recently, parameter-efficient fine-tuning (PEFT) methods have gained prominence in tackling catastrophic forgetting. However, in domain incremental learning, a characteristic setting of continual learning, there exists an additional inductive bias that existing approaches overlook. In this paper, we propose a novel PEFT method called Domain Correlation Low-Rank Adaptation (DC-LoRA) for domain incremental learning. Our approach puts forward a domain-correlated loss that encourages the LoRA module weights of adjacent tasks to become more similar, thereby leveraging the correlation between different task domains. Furthermore, we consolidate the classifiers of different task domains to improve prediction performance by capitalizing on the knowledge acquired from diverse tasks. To validate the effectiveness of our method, we conduct comparative experiments and ablation studies on publicly available domain incremental learning benchmark datasets. The experimental results demonstrate that our method outperforms state-of-the-art approaches.
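As a rough illustration of the core idea, the sketch below implements one plausible form of the domain-correlated loss: an L2 penalty pulling the current task's LoRA weights toward a snapshot of the previous task's weights. This is a minimal sketch under assumptions, not the authors' implementation; the `LoRALinear` wrapper, the parameter names `lora_A`/`lora_B`, and the coefficient `lam` are all illustrative.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen linear layer plus a trainable low-rank residual: W x + B A x."""
    def __init__(self, base: nn.Linear, rank: int = 8):
        super().__init__()
        self.base = base
        for p in self.base.parameters():      # freeze the pre-trained weights
            p.requires_grad = False
        self.lora_A = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(base.out_features, rank))

    def forward(self, x):
        # low-rank adaptation path added on top of the frozen base projection
        return self.base(x) + x @ self.lora_A.t() @ self.lora_B.t()

def domain_correlated_loss(model: nn.Module, prev_lora: dict, lam: float = 0.1):
    """Hypothetical regularizer: penalize squared distance between the
    current task's LoRA parameters and those saved after the previous task,
    so adjacent domains' LoRA modules stay similar."""
    reg = 0.0
    for name, p in model.named_parameters():
        if "lora_" in name and name in prev_lora:
            reg = reg + (p - prev_lora[name]).pow(2).sum()
    return lam * reg

# Usage sketch: after finishing task t-1, snapshot the LoRA weights ...
#   prev_lora = {n: p.detach().clone()
#                for n, p in model.named_parameters() if "lora_" in n}
# ... then during task t, optimize: task_loss + domain_correlated_loss(model, prev_lora)
```

Adding this regularizer to the per-task objective would encourage the weight similarity between adjacent tasks described in the abstract; the paper's exact loss and its classifier-consolidation step may take a different form.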

Original language: English
Article number: 100270
Journal: High-Confidence Computing
Volume: 5
Issue number: 4
DOI
Publication status: Published - Dec 2025
Published externally
