DiffClass: Diffusion-Based Class Incremental Learning
Zichong Meng, Jie Zhang, Changdi Yang, Zheng Zhan, Pu Zhao*, Yanzhi Wang*
Abstract
"Class Incremental Learning (CIL) is challenging due to catastrophic forgetting. On top of that, exemplar-free CIL is even more challenging due to forbidden access to data of previous tasks. Recent exemplar-free CIL methods attempt to mitigate catastrophic forgetting by synthesizing previous task data. However, they fail to overcome the catastrophic forgetting due to the inability to deal with the significant domain gap between real and synthetic data. To overcome these issues, we propose a novel exemplar-free CIL method. Our method adopts multi-distribution matching (MDM) diffusion models to align quality of synthetic data and bridge domain gaps among all domains of training data. Moreover, our approach integrates selective synthetic image augmentation (SSIA) to expand the distribution of the training data, thereby improving the model’s plasticity and reinforcing the performance of our multi-domain adaptation (MDA) technique. With the proposed integrations, our method then reformulates exemplar-free CIL into a multi-domain adaptation problem to implicitly address the domain gap problem and enhance model stability during incremental training. Extensive experiments on benchmark CIL datasets and settings demonstrate that our method excels previous exemplar-free CIL methods with non-marginal improvements and achieves state-of-the-art performance. Our project page is available at https://cr8br0ze.github.io/DiffClass."