On the Approximation Risk of Few-Shot Class-Incremental Learning

Xuan Wang, Zhong Ji*, Xiyao Liu, Yanwei Pang, Jungong Han

Abstract


"Few-Shot Class-Incremental Learning (FSCIL) aims to learn new concepts with few training samples while preserving previously acquired knowledge. Although promising performance has been achieved, there remains an underexplored aspect regarding the basic statistical principles underlying FSCIL. Therefore, we thoroughly explore the approximation risk of FSCIL, encompassing both transfer and consistency risks. By tightening the upper bounds of these risks, we derive practical guidelines for designing and training FSCIL models. These guidelines include (1) expanding training datasets for base classes, (2) preventing excessive focus on specific features, (3) optimizing classification margin discrepancy, and (4) ensuring unbiased classification across both base and novel classes. Leveraging these insights, we conduct comprehensive experiments to validate our principles, achieving state-of-the-art performance on three FSCIL benchmark datasets. Code is available at https://github.com/xwangrs/Approximation_FSCIL-ECCV2024. git."
